At each iteration I want to dynamically specify how many placeholders I need and then feed data to them. Is that possible, and how? I tried creating the whole model (placeholders, loss, optimizer) inside the epoch loop, but that gave an uninitialized-variables error.

At present I have n=5 placeholders, each of shape (1, k), in a list, and I feed data to them. But n needs to be defined dynamically during data feeding inside the epoch loop.
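
For reference, a minimal sketch of the setup described above, assuming the TF 1.x API; the names and the value of k are chosen only for illustration and are not from the original post:

import tensorflow as tf

k = 4  # assumed feature size
n = 5  # currently a fixed number of placeholders

# One placeholder per input element, each of shape (1, k)
placeholders = [tf.placeholder(tf.float32, shape=(1, k)) for _ in range(n)]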


1 Answer

Maybe you misunderstood what a tensor is.

If you think of a tensor as a multi-dimensional list, you can see that having a dynamic number of placeholders, each with shape [1, k], makes no sense.

Instead, you have to use a single tensor.

Thus, define your input placeholder as a tensor with shape [None, 1, k].

placeholder_ = tf.placeholder(tf.float32, [None, 1, k])

With this statement you define a placeholder of type tf.float32 that holds an undefined number of elements (the None part), each with shape [1, k].

In every iteration, you just feed the placeholder with the right values, e.g. by running

result = sess.run(defined_op, feed_dict={
    placeholder_: numpy_ndarray_with_N_elements_with_shape_1_k
})
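
The value fed for placeholder_ must match the shape [None, 1, k], i.e. it should be a NumPy array of shape (N, 1, k). N can be different on every call, which is exactly the dynamic behavior you are after.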

In that way you don't need to define new variables in the computational graph at every iteration (which simply doesn't work); you just feed it with the desired values.
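
To make the pattern concrete, here is a minimal end-to-end sketch, assuming the TF 1.x API used above and a toy loss with random data purely for illustration; none of these op or variable names come from the original post:

import numpy as np
import tensorflow as tf

k = 4  # assumed feature size

# Build the graph once, outside the epoch loop.
placeholder_ = tf.placeholder(tf.float32, [None, 1, k])
weights = tf.Variable(tf.random_normal([k]))

# Toy objective, just so there is something to optimize.
projected = tf.reduce_sum(placeholder_ * tf.reshape(weights, [1, 1, k]), axis=2)
loss = tf.reduce_mean(tf.square(projected - 1.0))
train_op = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for epoch in range(3):
        n = np.random.randint(1, 10)  # n can change at every epoch
        batch = np.random.rand(n, 1, k).astype(np.float32)
        _, loss_value = sess.run([train_op, loss],
                                 feed_dict={placeholder_: batch})
        print("epoch", epoch, "n =", n, "loss =", loss_value)

The graph is built a single time with the None dimension left open, and each epoch feeds however many (1, k) elements it needs.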

