How do you use Keras LeakyReLU in Python?

Disclaimer: This is a translation of a popular Stack Overflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same license and attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/48828478/

Tags: python, machine-learning, neural-network, keras, conv-neural-network

Asked by Hyman Trute

I am trying to produce a CNN using Keras, and wrote the following code:

import keras
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense, Activation

batch_size = 64
epochs = 20
num_classes = 5

cnn_model = Sequential()
cnn_model.add(Conv2D(32, kernel_size=(3, 3), activation='linear',
                     input_shape=(380, 380, 1), padding='same'))
cnn_model.add(Activation('relu'))
cnn_model.add(MaxPooling2D((2, 2), padding='same'))
cnn_model.add(Conv2D(64, (3, 3), activation='linear', padding='same'))
cnn_model.add(Activation('relu'))
cnn_model.add(MaxPooling2D(pool_size=(2, 2), padding='same'))
cnn_model.add(Conv2D(128, (3, 3), activation='linear', padding='same'))
cnn_model.add(Activation('relu'))
cnn_model.add(MaxPooling2D(pool_size=(2, 2), padding='same'))
cnn_model.add(Flatten())
cnn_model.add(Dense(128, activation='linear'))
cnn_model.add(Activation('relu'))
cnn_model.add(Dense(num_classes, activation='softmax'))

cnn_model.compile(loss=keras.losses.categorical_crossentropy,
                  optimizer=keras.optimizers.Adam(), metrics=['accuracy'])

I want to use Keras's LeakyReLU activation layer instead of using Activation('relu'). However, I tried using LeakyReLU(alpha=0.1) in its place, but this is an activation layer in Keras, and I get an error about using an activation layer and not an activation function.

How can I use LeakyReLU in this example?

Answered by desertnaut

All advanced activations in Keras, including LeakyReLU, are available as layers, and not as activations; therefore, you should use them as such:

from keras.layers import LeakyReLU

# instead of cnn_model.add(Activation('relu'))
# use
cnn_model.add(LeakyReLU(alpha=0.1))
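
Applied to the model in the question, the first convolutional block would then look like this (a minimal sketch reusing the question's imports; repeat the same substitution for the remaining Activation('relu') layers):

cnn_model = Sequential()
cnn_model.add(Conv2D(32, kernel_size=(3, 3), activation='linear',
                     input_shape=(380, 380, 1), padding='same'))
cnn_model.add(LeakyReLU(alpha=0.1))  # replaces Activation('relu')
cnn_model.add(MaxPooling2D((2, 2), padding='same'))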

Answered by P-Gn

Sometimes you just want a drop-in replacement for a built-in activation layer, without having to add extra activation layers just for this purpose.

For that, you can use the fact that the activation argument can be a callable object.

lrelu = lambda x: tf.keras.activations.relu(x, alpha=0.1)
model.add(Conv2D(..., activation=lrelu, ...))
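
A self-contained sketch of this approach; the layer arguments here are illustrative, not from the original answer:

import tensorflow as tf

# Wrap the built-in relu with a leaky slope of 0.1.
lrelu = lambda x: tf.keras.activations.relu(x, alpha=0.1)

model = tf.keras.Sequential()
model.add(tf.keras.layers.Conv2D(32, (3, 3), activation=lrelu,
                                 input_shape=(380, 380, 1), padding='same'))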

Since a Layer is also a callable object, you could also simply use

model.add(Conv2D(..., activation=tf.keras.layers.LeakyReLU(alpha=0.1), ...))

which now works in TF2. This is a better solution, as it avoids the need to pass custom_objects during loading, as @ChristophorusReyhan mentioned.

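For reference, a hedged sketch of the custom_objects workaround that the function-based approach would otherwise need when reloading a saved model; the named function lrelu, the tiny model, and the file name are illustrative, not from the original answer:

import tensorflow as tf

def lrelu(x):
    # A named function (rather than a lambda) so Keras can record it by name.
    return tf.keras.activations.relu(x, alpha=0.1)

# A tiny model just to demonstrate saving and loading with a custom activation.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, activation=lrelu, input_shape=(4,)),
])
model.save('lrelu_model.h5')  # illustrative file name

# Reloading a model that uses a custom activation function requires mapping
# the saved name back to the callable:
restored = tf.keras.models.load_model('lrelu_model.h5',
                                      custom_objects={'lrelu': lrelu})

# With activation=tf.keras.layers.LeakyReLU(alpha=0.1) instead, a plain
# tf.keras.models.load_model('lrelu_model.h5') would be enough.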