How to save a tensorflow model without defining variables (omitting the label tensor)

Question (votes: 2, answers: 1)

My tensorflow model is defined as follows:

import tensorflow as tf

X = tf.placeholder(tf.float32, [None, training_set.shape[1]], name='X')
Y = tf.placeholder(tf.float32, [None, training_labels.shape[1]], name='Y')
A1 = tf.contrib.layers.fully_connected(X, num_outputs=50, activation_fn=tf.nn.relu)
A1 = tf.nn.dropout(A1, 0.8)
A2 = tf.contrib.layers.fully_connected(A1, num_outputs=2, activation_fn=None)
cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=A2, labels=Y))
global_step = tf.Variable(0, trainable=False)
start_learning_rate = 0.001
learning_rate = tf.train.exponential_decay(start_learning_rate, global_step, 200, 0.1, staircase=True)
# pass global_step so the decayed learning rate is actually stepped during training
optimizer = tf.train.AdamOptimizer(learning_rate=learning_rate).minimize(cost, global_step=global_step)

Now I want to save this model while omitting the tensor Y (Y is the label tensor used for training, X is the actual input). Also, when specifying the output node for freeze_graph.py, should I refer to "A2", or should I save it under a different name?

python machine-learning tensorflow neural-network deep-learning
1 Answer
3 votes

Although you have not defined any variables manually, the code snippet above actually contains 15 saveable variables. You can inspect them with this internal tensorflow function:

from tensorflow.python.ops.variables import _all_saveable_objects
for obj in _all_saveable_objects():
  print(obj)

For the code above, it produces the following list:

<tf.Variable 'fully_connected/weights:0' shape=(100, 50) dtype=float32_ref>
<tf.Variable 'fully_connected/biases:0' shape=(50,) dtype=float32_ref>
<tf.Variable 'fully_connected_1/weights:0' shape=(50, 2) dtype=float32_ref>
<tf.Variable 'fully_connected_1/biases:0' shape=(2,) dtype=float32_ref>
<tf.Variable 'Variable:0' shape=() dtype=int32_ref>
<tf.Variable 'beta1_power:0' shape=() dtype=float32_ref>
<tf.Variable 'beta2_power:0' shape=() dtype=float32_ref>
<tf.Variable 'fully_connected/weights/Adam:0' shape=(100, 50) dtype=float32_ref>
<tf.Variable 'fully_connected/weights/Adam_1:0' shape=(100, 50) dtype=float32_ref>
<tf.Variable 'fully_connected/biases/Adam:0' shape=(50,) dtype=float32_ref>
<tf.Variable 'fully_connected/biases/Adam_1:0' shape=(50,) dtype=float32_ref>
<tf.Variable 'fully_connected_1/weights/Adam:0' shape=(50, 2) dtype=float32_ref>
<tf.Variable 'fully_connected_1/weights/Adam_1:0' shape=(50, 2) dtype=float32_ref>
<tf.Variable 'fully_connected_1/biases/Adam:0' shape=(2,) dtype=float32_ref>
<tf.Variable 'fully_connected_1/biases/Adam_1:0' shape=(2,) dtype=float32_ref>

The variables come from the fully_connected layers, and the remaining ones from the Adam optimizer (see this question). Note that there are no X and Y placeholders in this list, so there is no need to exclude them. Of course, these tensors exist in the metagraph, but they do not hold any values, hence they are not saveable.
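
If you prefer not to rely on the internal helper, the public API exposes the same information; a minimal sketch, assuming the model above has already been built in the default graph (for this graph the global variables collection should contain the same 15 variables, and again no placeholders):

import tensorflow as tf

# Public-API alternative to _all_saveable_objects(): list the global
# variables collection, which a default Saver also covers for this graph.
for var in tf.global_variables():
  print(var)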

The _all_saveable_objects() list is what the tensorflow saver saves by default when variables are not provided explicitly. Hence, the answer to your main question is simple:

saver = tf.train.Saver()  # all saveable objects!
with tf.Session() as sess:
  tf.global_variables_initializer().run()
  saver.save(sess, "...")
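
For completeness, restoring later mirrors this call. Below is a minimal inference sketch, assuming the checkpoint was written to a hypothetical path "model.ckpt" and that only X needs to be fed at inference time (Y never needs a value); the exact tensor name of A2 is an assumption to verify against your graph:

import numpy as np
import tensorflow as tf

with tf.Session() as sess:
  # "model.ckpt" is a hypothetical path -- use whatever you passed to saver.save
  saver = tf.train.import_meta_graph("model.ckpt.meta")
  saver.restore(sess, "model.ckpt")
  graph = tf.get_default_graph()
  X = graph.get_tensor_by_name("X:0")
  # A2 was not named explicitly; its default tensor name is graph-dependent
  # (likely 'fully_connected_1/BiasAdd:0' here) -- check graph.get_operations().
  logits = graph.get_tensor_by_name("fully_connected_1/BiasAdd:0")
  # 100 = training_set.shape[1], inferred from the (100, 50) weight shape above
  print(sess.run(logits, feed_dict={X: np.zeros((1, 100), np.float32)}))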

There is no way to provide a name for the tf.contrib.layers.fully_connected function (as a result, it is saved under fully_connected_1/...), but you are encouraged to switch to tf.layers.dense, which has a name argument. In any case, to see why this is a good idea, take a look at this and this discussion.
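
As a rough sketch of that switch (the layer and node names below are illustrative, not taken from your code), naming the logits explicitly also answers your freeze_graph question: pass that node's name via --output_node_names rather than the Python variable name A2:

import tensorflow as tf

X = tf.placeholder(tf.float32, [None, 100], name='X')  # 100 = training_set.shape[1]
A1 = tf.layers.dense(X, units=50, activation=tf.nn.relu, name='dense_1')
A1 = tf.nn.dropout(A1, 0.8)
A2 = tf.layers.dense(A1, units=2, activation=None, name='dense_2')
# Give the logits a stable, explicit node name (hypothetical 'output');
# this is the name to hand to freeze_graph.py, e.g. --output_node_names=output
output = tf.identity(A2, name='output')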
