Python: How do you create a custom activation function with Keras?
Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same license, cite the original URL, and attribute it to the original authors (not me): StackOverflow
Original question: http://stackoverflow.com/questions/43915482/
How do you create a custom activation function with Keras?
Asked by Martin Thoma
Sometimes the default standard activations like ReLU, tanh, softmax, ... and the advanced activations like LeakyReLU aren't enough. And the activation you need might not be in keras-contrib either.
How do you create your own activation function?
Answer by Martin Thoma
Credit to this GitHub issue comment by Ritchie Ng.
# Creating a model
from keras.models import Sequential
from keras.layers import Dense
# Custom activation function
from keras.layers import Activation
from keras import backend as K
from keras.utils.generic_utils import get_custom_objects

def custom_activation(x):
    # Scaled sigmoid: maps inputs into the range (-1, 4)
    return (K.sigmoid(x) * 5) - 1

# Register the activation under a string name so layers can refer to it
get_custom_objects().update({'custom_activation': Activation(custom_activation)})

# Usage
model = Sequential()
model.add(Dense(32, input_dim=784))
model.add(Activation(custom_activation, name='SpecialActivation'))
print(model.summary())
Please keep in mind that you have to import this function when you save and restore the model. See this note in keras-contrib.
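For example, restoring such a model could look like the following sketch (the filename 'model.h5' is hypothetical; the key point is passing the registered activation via custom_objects):

from keras.models import load_model

# Save the model as usual
model.save('model.h5')  # hypothetical filename

# custom_activation must be defined/imported in this scope before loading,
# and handed to load_model via custom_objects
model = load_model(
    'model.h5',
    custom_objects={'custom_activation': Activation(custom_activation)}
)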
Answer by Eponymous
Slightly simpler than Martin Thoma's answer: you can just create a custom element-wise backend function and use it as a parameter. You still need to import this function before loading your model; see the loading sketch after the snippet below.
from keras import backend as K

def custom_activation(x):
    # Same scaled sigmoid as above, range (-1, 4)
    return (K.sigmoid(x) * 5) - 1

model.add(Dense(32, activation=custom_activation))
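A minimal loading sketch under the same assumptions as before (hypothetical filename); note that custom_objects here maps the name to the raw function rather than an Activation layer, since the function was passed directly to Dense:

from keras.models import load_model

# custom_activation must be defined/imported here before loading
model = load_model(
    'model.h5',  # hypothetical filename
    custom_objects={'custom_activation': custom_activation}
)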
Answer by Julien Nyambal
Let's say you would like to add swish or gelu to Keras. The previous methods are nice inline insertions, but you could also add them to the set of built-in Keras activation functions, so that you can call your custom function by name just as you would call ReLU. I tested this with Keras 2.2.2 (any v2 should do). Append the definition of your custom function to the file $HOME/anaconda2/lib/python2.7/site-packages/keras/activations.py (the path may differ depending on your Python and Anaconda versions).
In the Keras internals:
$HOME/anaconda2/lib/python2.7/site-packages/keras/activations.py
def swish(x, beta=1.0, alpha=1.0):
    # alpha and beta were left undefined in the original answer; defaults of
    # 1.0 give the standard swish, x * sigmoid(x). K is already imported as
    # the backend inside keras/activations.py.
    return alpha * x * K.sigmoid(beta * x)
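The answer also mentions gelu but never defines it; a common tanh-approximation sketch that could be appended the same way (my assumption, not part of the original answer):

import math  # add near the top of activations.py if not already present

def gelu(x):
    # Tanh approximation of the Gaussian Error Linear Unit:
    # 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
    return 0.5 * x * (1.0 + K.tanh(
        math.sqrt(2.0 / math.pi) * (x + 0.044715 * K.pow(x, 3))))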
Then in your Python file:
$HOME/Documents/neural_nets.py
model = Sequential()
model.add(Activation('swish'))
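Because the function now lives in keras.activations under the name 'swish', the same string should also work anywhere a layer accepts an activation parameter; a small sketch under that assumption (Dense imported as in the earlier answers):

# The registered name can be passed directly to a layer's activation parameter
model.add(Dense(32, activation='swish'))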