Python: How to use advanced activation layers in Keras?
Disclaimer: This page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must do so under the same license and attribute it to the original authors (not me): StackOverflow.
Original: http://stackoverflow.com/questions/34717241/
How to use advanced activation layers in Keras?
Asked by pr338
This is my code, which works if I use other activation layers such as tanh:
model = Sequential()
act = keras.layers.advanced_activations.PReLU(init='zero', weights=None)
model.add(Dense(64, input_dim=14, init='uniform'))
model.add(Activation(act))
model.add(Dropout(0.15))
model.add(Dense(64, init='uniform'))
model.add(Activation('softplus'))
model.add(Dropout(0.15))
model.add(Dense(2, init='uniform'))
model.add(Activation('softmax'))
sgd = SGD(lr=0.1, decay=1e-6, momentum=0.9, nesterov=True)
model.compile(loss='binary_crossentropy', optimizer=sgd)
model.fit(X_train, y_train, nb_epoch=20, batch_size=16, show_accuracy=True, validation_split=0.2, verbose = 2)
In this case it doesn't work, failing at the model.compile line with "TypeError: 'PReLU' object is not callable". Why is this the case? All of the non-advanced activation functions work, but none of the advanced activation functions, including this one, do.
Accepted answer by Tarantula
The correct way to use an advanced activation like PReLU is to add it as a layer with the add() method, not to wrap it in the Activation class. Example:
model = Sequential()
# instantiate PReLU once, then add it to the model as its own layer
act = keras.layers.advanced_activations.PReLU(init='zero', weights=None)
model.add(Dense(64, input_dim=14, init='uniform'))
model.add(act)
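In newer versions of Keras (2.x / tf.keras) the module path and argument names above have changed. A minimal sketch of the same fix with the question's architecture and the updated names might look like this (the optimizer settings are simplified for illustration):

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout, PReLU

model = Sequential()
model.add(Dense(64, input_dim=14, kernel_initializer='uniform'))
model.add(PReLU())  # added as its own layer instead of being wrapped in Activation
model.add(Dropout(0.15))
model.add(Dense(64, kernel_initializer='uniform'))
model.add(PReLU())
model.add(Dropout(0.15))
model.add(Dense(2, kernel_initializer='uniform', activation='softmax'))
model.compile(loss='binary_crossentropy', optimizer='sgd')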
Answered by Mattia Paterna
If you are using the Model (functional) API in Keras, you can pass the advanced activation directly as the activation argument of a Keras Layer. Here's an example:
from keras.models import Model
from keras.layers import Dense, Input
# using PReLU as the activation
from keras.layers.advanced_activations import PReLU

# model definition
# encoder
inp = Input(shape=(16,))
lay = Dense(64, kernel_initializer='uniform', activation=PReLU(),
            name='encoder')(inp)
# decoder
out = Dense(2, kernel_initializer='uniform', activation=PReLU(),
            name='decoder')(lay)
# build the model
model = Model(inputs=inp, outputs=out, name='cae')
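This works because a Keras layer instance is itself callable, so PReLU() can be passed wherever an activation function is accepted. A hypothetical compile-and-train call for the model above (X, Y, and the loss choice are illustrative assumptions, not from the original answer) might look like:

# hypothetical usage; X with shape (N, 16) and Y with shape (N, 2) are assumed arrays
model.compile(optimizer='adam', loss='mse')
model.fit(X, Y, epochs=10, batch_size=32)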
Answered by alexander ostrikov
For the Keras functional API, I think the correct way to combine Dense and PReLU (or any other advanced activation) is to apply it as a separate layer, like this:
# focus_lr and enc_bidi_tns are defined earlier in the author's model
focus_tns = focus_lr(enc_bidi_tns)

enc_dense_lr = k.layers.Dense(units=int(hidden_size))
enc_dense_tns = k.layers.PReLU()(enc_dense_lr(focus_tns))

dropout_lr = k.layers.Dropout(0.2)
dropout_tns = dropout_lr(enc_dense_tns)

enc_dense_lr2 = k.layers.Dense(units=int(hidden_size / 4))
enc_dense_tns2 = k.layers.PReLU()(enc_dense_lr2(dropout_tns))
Of course, one should parameterize the layers according to the problem at hand.
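Since the snippet above depends on variables defined elsewhere in the author's model, here is a minimal self-contained sketch of the same Dense-plus-PReLU pattern; the keras alias k, hidden_size, and the input shape are illustrative assumptions:

from tensorflow import keras as k

hidden_size = 128  # illustrative value
inp = k.layers.Input(shape=(32,))  # illustrative input shape
x = k.layers.Dense(units=hidden_size)(inp)
x = k.layers.PReLU()(x)  # the advanced activation applied as a separate layer
x = k.layers.Dropout(0.2)(x)
x = k.layers.Dense(units=hidden_size // 4)(x)
out = k.layers.PReLU()(x)
model = k.Model(inputs=inp, outputs=out)
model.summary()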