Python - How to concatenate two layers in Keras?

Disclaimer: this page is a Chinese-English parallel translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same license, cite the original URL and author information, and attribute it to the original authors (not me): StackOverflow

Original URL: http://stackoverflow.com/questions/43196636/

Asked by rdo
I have an example of a neural network with two layers. The first layer takes two arguments and has one output. The second should take one argument, the result of the first layer, and one additional argument. It should look like this:
x1  x2  x3
 \  /   /
  y1   /
   \  /
    y2
So I created a model with two layers and tried to merge them, but it returns the error The first layer in a Sequential model must get an "input_shape" or "batch_input_shape" argument. on the line result.add(merged).
Model:
first = Sequential()
first.add(Dense(1, input_shape=(2,), activation='sigmoid'))
second = Sequential()
second.add(Dense(1, input_shape=(1,), activation='sigmoid'))
result = Sequential()
merged = Concatenate([first, second])
ada_grad = Adagrad(lr=0.1, epsilon=1e-08, decay=0.0)
result.add(merged)
result.compile(optimizer=ada_grad, loss=_loss_tensor, metrics=['accuracy'])
Answered by parsethis
You're getting the error because result, defined as Sequential(), is just a container for the model and you have not defined an input for it.
Given what you're trying to build, set result to take the third input x3.
from keras.models import Sequential
from keras.layers import Dense, Concatenate
from keras.optimizers import Adagrad

first = Sequential()
first.add(Dense(1, input_shape=(2,), activation='sigmoid'))

second = Sequential()
second.add(Dense(1, input_shape=(1,), activation='sigmoid'))

third = Sequential()
# of course you must provide the input to result which will be your x3
third.add(Dense(1, input_shape=(1,), activation='sigmoid'))

# let's say you add a few more layers to first and second.
# concatenate them
merged = Concatenate([first, second])

# then concatenate the two outputs
result = Concatenate([merged, third])

ada_grad = Adagrad(lr=0.1, epsilon=1e-08, decay=0.0)
result.compile(optimizer=ada_grad, loss='binary_crossentropy',
               metrics=['accuracy'])
However, my preferred way of building a model that has this type of input structure would be to use the functional API.
Here is an implementation of your requirements to get you started:
from keras.models import Model
from keras.layers import Concatenate, Dense, LSTM, Input, concatenate
from keras.optimizers import Adagrad
first_input = Input(shape=(2, ))
first_dense = Dense(1, )(first_input)
second_input = Input(shape=(2, ))
second_dense = Dense(1, )(second_input)
merge_one = concatenate([first_dense, second_dense])
third_input = Input(shape=(1, ))
merge_two = concatenate([merge_one, third_input])
model = Model(inputs=[first_input, second_input, third_input], outputs=merge_two)
ada_grad = Adagrad(lr=0.1, epsilon=1e-08, decay=0.0)
model.compile(optimizer=ada_grad, loss='binary_crossentropy',
              metrics=['accuracy'])
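Once compiled, the model can be trained by passing one NumPy array per input. Below is a minimal sketch with random placeholder data (the array names and sizes are my own assumptions; note that this model's output is just the raw 3-wide concatenation, so the dummy targets are only there to make fit() run):

import numpy as np

x1 = np.random.random((100, 2))        # fed to first_input
x2 = np.random.random((100, 2))        # fed to second_input
x3 = np.random.random((100, 1))        # fed to third_input
y = np.random.randint(0, 2, (100, 3))  # targets matching the (None, 3) concatenated output

model.fit([x1, x2, x3], y, epochs=5, batch_size=16)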
To answer the question in the comments:
1) How are result and merged connected? Assuming you mean how they are concatenated:
Concatenation works like this:
a b c    g h i      a b c g h i
d e f    j k l      d e f j k l
i.e. the rows are just joined.
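A minimal NumPy sketch of the same row-wise joining (numbers standing in for the letters above):

import numpy as np

a = np.array([[1, 2, 3],
              [4, 5, 6]])      # "a b c / d e f"
b = np.array([[7, 8, 9],
              [10, 11, 12]])   # "g h i / j k l"

print(np.concatenate([a, b], axis=-1))
# [[ 1  2  3  7  8  9]
#  [ 4  5  6 10 11 12]]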
2) Now x1 is input to first, x2 is input to second, and x3 is input to third.
Answered by o0omycomputero0o
You can experiment with model.summary() (notice the concatenate_XX (Concatenate) layer size):
from keras.models import Model
from keras.layers import Input, Dense, concatenate

# merge samples: the two inputs must have exactly the same shape
inp1 = Input(shape=(10, 32))
inp2 = Input(shape=(10, 32))
cc1 = concatenate([inp1, inp2], axis=0)  # concatenate along the batch axis
output = Dense(30, activation='relu')(cc1)
model = Model(inputs=[inp1, inp2], outputs=output)
model.summary()

# merge rows: the inputs must have the same number of columns
inp1 = Input(shape=(20, 10))
inp2 = Input(shape=(32, 10))
cc1 = concatenate([inp1, inp2], axis=1)  # 20 + 32 = 52 rows
output = Dense(30, activation='relu')(cc1)
model = Model(inputs=[inp1, inp2], outputs=output)
model.summary()

# merge columns: the inputs must have the same number of rows
inp1 = Input(shape=(10, 20))
inp2 = Input(shape=(10, 32))
cc1 = concatenate([inp1, inp2], axis=2)  # 20 + 32 = 52 columns
output = Dense(30, activation='relu')(cc1)
model = Model(inputs=[inp1, inp2], outputs=output)
model.summary()
You can view the notebook here for details: https://nbviewer.jupyter.org/github/anhhh11/DeepLearning/blob/master/Concanate_two_layer_keras.ipynb
Answered by Praveen Kulkarni
Adding to the accepted answer above so that it helps those who are using TensorFlow 2.0:
import tensorflow as tf
# some data
c1 = tf.constant([[1, 1, 1], [2, 2, 2]], dtype=tf.float32)
c2 = tf.constant([[2, 2, 2], [3, 3, 3]], dtype=tf.float32)
c3 = tf.constant([[3, 3, 3], [4, 4, 4]], dtype=tf.float32)
# bake layers x1, x2, x3
x1 = tf.keras.layers.Dense(10)(c1)
x2 = tf.keras.layers.Dense(10)(c2)
x3 = tf.keras.layers.Dense(10)(c3)
# merged layer y1
y1 = tf.keras.layers.Concatenate(axis=1)([x1, x2])
# merged layer y2
y2 = tf.keras.layers.Concatenate(axis=1)([y1, x3])
# print info
print("-"*30)
print("x1", x1.shape, "x2", x2.shape, "x3", x3.shape)
print("y1", y1.shape)
print("y2", y2.shape)
print("-"*30)
Result:
------------------------------
x1 (2, 10) x2 (2, 10) x3 (2, 10)
y1 (2, 20)
y2 (2, 30)
------------------------------
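The same y1/y2 wiring can be wrapped in a trainable model by swapping the constants for Input layers. A sketch (the input and layer sizes here are arbitrary assumptions):

import tensorflow as tf

in1 = tf.keras.layers.Input(shape=(3,))
in2 = tf.keras.layers.Input(shape=(3,))
in3 = tf.keras.layers.Input(shape=(3,))

x1 = tf.keras.layers.Dense(10)(in1)
x2 = tf.keras.layers.Dense(10)(in2)
x3 = tf.keras.layers.Dense(10)(in3)

y1 = tf.keras.layers.Concatenate(axis=1)([x1, x2])
y2 = tf.keras.layers.Concatenate(axis=1)([y1, x3])

model = tf.keras.Model(inputs=[in1, in2, in3], outputs=y2)
model.summary()  # the final output y2 has shape (None, 30)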