Python 在 TensorFlow 中展平包含向量的 2D 张量的最佳方法?
声明:本页面是 StackOverFlow 热门问题的中英对照翻译,遵循 CC BY-SA 4.0 协议。如果您需要使用它,必须同样遵循 CC BY-SA 许可,注明原文地址和作者信息,并将其归于原作者(不是我):StackOverFlow
原文地址: http://stackoverflow.com/questions/34194151/
Warning: these are provided under cc-by-sa 4.0 license. You are free to use/share it, But you must attribute it to the original authors (not me):
StackOverFlow
Best way to flatten a 2D tensor containing a vector in TensorFlow?
提问by Andrzej Pronobis
What is the most efficient way to flatten a 2D tensor which is actually a horizontal or vertical vector into a 1D tensor?
将实际上是水平或垂直向量的 2D 张量展平为 1D 张量的最有效方法是什么?
Is there a difference in terms of performance between:
在性能方面是否存在差异:
tf.reshape(w, [-1])
and
和
tf.squeeze(w)
?
?
采纳答案by mrry
Both tf.reshape(w, [-1]) and tf.squeeze(w) are "cheap" in that they operate only on the metadata (i.e. the shape) of the given tensor, and don't modify the data itself. Of the two, tf.reshape() has slightly simpler logic internally, but the performance of the two should be indistinguishable.
tf.reshape(w, [-1]) 和 tf.squeeze(w) 都很"便宜",因为它们只操作给定张量的元数据(即形状),并不修改数据本身。两者之中,tf.reshape() 的内部逻辑稍微简单一些,但两者的性能应该没有区别。
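For the vector case the question asks about, a minimal sketch (assuming TensorFlow 2.x with eager execution) confirms that both calls produce the same 1D result:

针对问题中提到的向量情形,下面是一个最小示例(假设使用 TensorFlow 2.x 即时执行模式),可以确认两种调用得到相同的一维张量:

```python
import tensorflow as tf

# A "vertical vector": a 2D tensor of shape (3, 1)
w = tf.constant([[1], [2], [3]])

flat_reshape = tf.reshape(w, [-1])  # collapses all dimensions into one
flat_squeeze = tf.squeeze(w)        # removes the size-1 dimension

print(flat_reshape.shape)  # (3,)
print(flat_squeeze.shape)  # (3,)
```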
回答by Calimero
For a simple 2D tensor (i.e. a horizontal or vertical vector) the two should function identically, as mentioned by @sv_jan5. However, please note that tf.squeeze(w) only removes dimensions of size 1, whereas tf.reshape(w, [-1]) will flatten the entire tensor regardless of its rank.
对于一个简单的 2D 张量(即水平或垂直向量),两者的功能应该相同,正如 @sv_jan5 所提到的。但是请注意,tf.squeeze(w) 只会移除大小为 1 的维度,而 tf.reshape(w, [-1]) 无论张量的秩是多少,都会将整个张量展平。
For example, let's look at
例如,让我们看看
w = tf.constant([[1, 2], [3, 4]])
now the output of the two functions will no longer be the same. tf.squeeze(w) will output
现在这两个函数的输出将不再相同。tf.squeeze(w)
会输出
<tf.Tensor: shape=(2, 2), dtype=int32, numpy=
array([[1, 2],
[3, 4]], dtype=int32)>
while tf.reshape(w, [-1]) will output
而 tf.reshape(w, [-1]) 会输出
<tf.Tensor: shape=(4,), dtype=int32, numpy=array([1, 2, 3, 4], dtype=int32)>
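The answer's example can be put together as a small runnable sketch (again assuming TensorFlow 2.x eager execution): with no size-1 dimensions to remove, tf.squeeze() leaves the shape unchanged while tf.reshape(w, [-1]) still flattens.

上面的例子可以整理成一个可运行的小示例(同样假设 TensorFlow 2.x 即时执行模式):由于没有大小为 1 的维度可移除,tf.squeeze() 不改变形状,而 tf.reshape(w, [-1]) 仍会展平:

```python
import tensorflow as tf

# A genuinely 2D tensor with no size-1 dimensions
w = tf.constant([[1, 2], [3, 4]])

squeezed = tf.squeeze(w)        # nothing to remove: shape stays (2, 2)
reshaped = tf.reshape(w, [-1])  # always flattens: shape (4,)

print(squeezed.shape)  # (2, 2)
print(reshaped.shape)  # (4,)
```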