Python: How to initialize biases in a Keras model?

Disclaimer: this page is a Chinese/English translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/40708169/

Date: 2020-08-19 23:53:15  Source: igfitidea

How to initialize biases in a Keras model?

Tags: python, neural-network, deep-learning, keras

Asked by Mohammad Amin

I am trying to build a synthetic model in Keras, and I need to assign values for the weights and biases. Assigning the weights is easy, I am using the instructions provided here: https://keras.io/initializations/. However, I could not find any instructions on how to assign the biases. Any ideas?


Answered by StatsSorceress

You can also use bias_initializer like this:


model.add(Dense(64,
                kernel_initializer='random_uniform',
                bias_initializer='zeros'))

This is from https://keras.io/initializers/

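To see what those two initializer names actually produce, here is a minimal numpy sketch, assuming a hypothetical input dimension of 32. Keras' `'random_uniform'` defaults to values drawn uniformly from [-0.05, 0.05], and `'zeros'` gives a bias vector of zeros with one entry per output unit.

```python
import numpy as np

input_dim, units = 32, 64  # input_dim is an assumption for illustration

rng = np.random.default_rng(0)
# 'random_uniform' in Keras defaults to uniform values in [-0.05, 0.05]
W = rng.uniform(-0.05, 0.05, size=(input_dim, units))
# 'zeros' gives a bias vector of zeros, one per output unit
b = np.zeros(units)

x = rng.normal(size=(1, input_dim))  # one dummy input row
y = x @ W + b                        # the dense layer's pre-activation

print(W.shape, b.shape, y.shape)
```

Calling `layer.get_weights()` on the built Keras layer would return exactly this pair of arrays.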

Answered by Hengda Qi

You can find the answer here. https://keras.io/layers/core/


weights: list of Numpy arrays to set as initial weights. The list should have 2 elements, of shape (input_dim, output_dim) and (output_dim,) for weights and biases respectively.


When adding a new layer, you can define the argument "weights", a list that contains the initial w and b with the shapes specified.


model.add(Dense(50, input_dim=X_train.shape[1],
                weights=[np.zeros([692, 50]), np.zeros(50)]))  # 692 must equal X_train.shape[1]

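To start the biases at a custom value rather than zero, you only need to change the second array in that list. A sketch of building such a list, assuming an input dimension of 692 and 50 units as in the example above:

```python
import numpy as np

input_dim, units = 692, 50  # must match the layer's input_dim and output size

# The weights argument expects [kernel, bias]:
# kernel of shape (input_dim, units), bias of shape (units,)
initial_w = np.zeros((input_dim, units))
initial_b = np.full(units, 0.1)  # biases start at 0.1 instead of 0

weights = [initial_w, initial_b]
print(weights[0].shape, weights[1].shape)
```

Passing `weights=weights` to `Dense(...)` then starts the layer from these exact values.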

Answered by daoliker

Initialize biases with a small positive value, such as 0.1.


Since we're using ReLU neurons, it is also good practice to initialize them with a slightly positive initial bias to avoid "dead neurons".

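A quick numpy illustration of why: with a zero bias, a unit whose pre-activation is negative outputs exactly 0 (and passes no gradient back), while a small positive bias can keep it active. The input and weight values below are made up purely for illustration.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

x = np.array([-0.1, -0.05])  # inputs that drive the unit slightly negative
w = np.array([0.3, 0.4])

pre_zero = x @ w + 0.0   # zero bias: pre-activation stays negative
pre_small = x @ w + 0.1  # small positive bias pushes it above zero

print(relu(pre_zero))    # the unit is "dead" for this input
print(relu(pre_small))   # the unit still fires
```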

Answered by Trevor Witter

Weight and bias initialization for each layer can be set via the kernel_initializer and bias_initializer keyword arguments respectively within layers.Dense(). If left undefined by the user, the default settings of kernel_initializer='glorot_uniform' and bias_initializer='zeros' are applied.


For example, if you wanted a layer's weights initialized with random uniform instead of glorot, and its biases initialized to 0.1 instead of 0, you could define the layer as follows:


from keras import layers, initializers

layer = layers.Dense(64,
                     activation='relu',
                     kernel_initializer='random_uniform',
                     bias_initializer=initializers.Constant(0.1))(previous_layer)

See layers/core/ for details on Dense layer keyword arguments, and initializers/ for preset and customizable initializer options.

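For reference, the default 'glorot_uniform' draws weights from a uniform distribution whose limit depends on the layer's fan-in and fan-out. A numpy sketch of the rule, with fan sizes chosen arbitrarily for illustration:

```python
import numpy as np

def glorot_uniform(fan_in, fan_out, rng=None):
    # Glorot/Xavier uniform: U(-limit, limit) with
    # limit = sqrt(6 / (fan_in + fan_out))
    rng = rng or np.random.default_rng(0)
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

W = glorot_uniform(128, 64)
print(W.shape, np.abs(W).max())
```

Scaling the range by the fan sizes keeps activation variance roughly constant across layers, which is why it is a sensible default for the kernel while biases simply start at zero.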