Python: How to use tf.while_loop() in TensorFlow
Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must do so under the same license and attribute it to the original authors (not me): StackOverflow
Original URL: http://stackoverflow.com/questions/37441140/
How to use tf.while_loop() in tensorflow
Asked by Hanyu Guo
This is a generic question. I found that in TensorFlow, after we build the graph and feed data into it, the output from the graph is a tensor. But in many cases we need to do some computation based on this output (which is a tensor), which is not allowed in TensorFlow.
For example, I'm trying to implement an RNN that loops a number of times depending on a property of the data itself. That is, I need to use a tensor to judge whether I should stop (I am not using dynamic_rnn since, in my design, the RNN is highly customized). I found that tf.while_loop(cond, body, ...) might be a candidate for my implementation, but the official tutorial is too simple, and I don't know how to add more functionality into the 'body'. Can anyone give me a more complex example?
Also, what about the case where further computation depends on a tensor output (e.g., stopping the RNN based on an output criterion), which is a very common case? Is there an elegant way to do this, or a better approach than a dynamic graph?
Answered by Peter Goldsborough
What is stopping you from adding more functionality to the body? You can build whatever complex computational graph you like in the body and take whatever inputs you like from the enclosing graph. Also, outside of the loop, you can then do whatever you want with whatever outputs you return. As you can see from the amount of 'whatevers', TensorFlow's control flow primitives were built with much generality in mind. Below is another 'simple' example, in case it helps.
import tensorflow as tf
import numpy as np

def body(x):
    # Arbitrary computation is allowed in the body: add a random matrix and a
    # constant matrix to the loop variable, then apply a ReLU.
    a = tf.random_uniform(shape=[2, 2], dtype=tf.int32, maxval=100)
    b = tf.constant(np.array([[1, 2], [3, 4]]), dtype=tf.int32)
    c = a + b
    return tf.nn.relu(x + c)

def condition(x):
    # Keep looping while the sum of all elements is below 100.
    return tf.reduce_sum(x) < 100

x = tf.Variable(tf.constant(0, shape=[2, 2]))

with tf.Session():
    tf.global_variables_initializer().run()
    result = tf.while_loop(condition, body, [x])
    print(result.eval())
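To connect this to the data-dependent stopping criterion asked about in the question, below is a minimal sketch (assuming TensorFlow 1.x) of a loop that keeps iterating until the value it computes satisfies a condition, with a hard cap on the step count. The matmul "cell", the weight matrix, the 1e-3 threshold, and the 100-step cap are made-up illustration values, not part of the original answer.

import tensorflow as tf

def condition(step, state):
    # Continue while the state norm is still above the threshold,
    # but never run more than 100 steps.
    return tf.logical_and(step < 100, tf.norm(state) > 1e-3)

def body(step, state):
    # One "step" of the recurrence: any sub-graph that produces the next
    # state can go here (a custom RNN cell, attention, etc.).
    w = tf.constant([[0.5, 0.1], [0.2, 0.4]])
    return step + 1, tf.matmul(state, w)

steps, final_state = tf.while_loop(condition, body,
                                   [tf.constant(0), tf.ones([1, 2])])

with tf.Session() as sess:
    print(sess.run([steps, final_state]))

Each loop variable must keep the same dtype and, by default, the same shape from one iteration to the next; if a shape does change between steps, tf.while_loop accepts a shape_invariants argument to relax that check.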