Python: What is the role of the TimeDistributed layer in Keras?

Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must likewise follow the CC BY-SA license and attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/47305618/


What is the role of TimeDistributed layer in Keras?

python, machine-learning, keras, neural-network, deep-learning

Asked by Buomsoo Kim

I am trying to grasp what TimeDistributed wrapper does in Keras.

I get that TimeDistributed "applies a layer to every temporal slice of an input."

But I did some experiments and got results that I cannot understand.

In short, when connected after an LSTM layer, TimeDistributed(Dense) and a plain Dense layer produce the same results.

from keras.models import Sequential
from keras.layers import LSTM, Dense, TimeDistributed

# Dense wrapped in TimeDistributed
model = Sequential()
model.add(LSTM(5, input_shape=(10, 20), return_sequences=True))
model.add(TimeDistributed(Dense(1)))
print(model.output_shape)

# Plain Dense applied to the LSTM's sequence output
model = Sequential()
model.add(LSTM(5, input_shape=(10, 20), return_sequences=True))
model.add(Dense(1))
print(model.output_shape)

For both models, I got an output shape of (None, 10, 1).

Can anyone explain the difference between a TimeDistributed layer and a Dense layer after an RNN layer?

Answered by Marcin Możejko

In keras, while building a sequential model, the second dimension (the one after the sample dimension) is usually related to a time dimension. This means that if, for example, your data is 5-dim with (sample, time, width, length, channel), you could apply a convolutional layer using TimeDistributed (which is applicable to 4-dim with (sample, width, length, channel)) along the time dimension (applying the same layer to each time slice) in order to obtain a 5-dim output.

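A minimal sketch of that idea (not from the original answer; the sizes of 10 time steps, 32×32 images with 3 channels, and 16 filters are arbitrary illustrative choices):

from keras.models import Sequential
from keras.layers import TimeDistributed, Conv2D

model = Sequential()
# Input: (sample, time, width, length, channel) = (None, 10, 32, 32, 3);
# TimeDistributed applies the same Conv2D to each 4-dim time slice.
model.add(TimeDistributed(Conv2D(16, (3, 3), padding='same'),
                          input_shape=(10, 32, 32, 3)))
print(model.output_shape)  # (None, 10, 32, 32, 16) -- output stays 5-dim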

The case with Dense is that in keras from version 2.0, Dense is by default applied to only the last dimension (e.g. if you apply Dense(10) to an input with shape (n, m, o, p), you'll get an output with shape (n, m, o, 10)), so in your case Dense and TimeDistributed(Dense) are equivalent.

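A short sketch of that default behaviour (the shape (10, 8, 20) standing in for (m, o, p) is an arbitrary choice for illustration):

from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
# Input: (n, m, o, p) = (None, 10, 8, 20); Dense(10) acts on the last
# axis only and its weights are shared across all leading axes.
model.add(Dense(10, input_shape=(10, 8, 20)))
print(model.output_shape)  # (None, 10, 8, 10) -- only the last axis changes

This is why, for the (None, 10, 5) sequence coming out of the LSTM above, Dense(1) and TimeDistributed(Dense(1)) both yield (None, 10, 1).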