with tf.variable_scope('lstm_model') as scope:
    # define LSTM Model
    lstm_model = LSTM_Model(rnn_size, batch_size, learning_rate, training_seq_len, vocab_size)
    scope.reuse_variables()
    test_lstm_model = LSTM_Model(rnn_size, vocab_size, infer=True)
The code above gives me the error:

Variable lstm_model/lstm_vars/W already exists, disallowed. Did you mean to set reuse=True in VarScope?
If I set reuse=True, as in the code block below,
with tf.variable_scope('lstm_model', reuse=True) as scope:
I get a different error:

Variable lstm_model/lstm_model/lstm_vars/W/Adam/ does not exist, or was not created with tf.get_variable(). Did you mean to set reuse=None in VarScope?
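The two errors mirror tf.get_variable's contract: with reuse=False a variable must not already exist, and with reuse=True it must already exist (the W/Adam slot variables are created internally by the optimizer, so a reuse=True scope cannot find them on the first pass). A minimal pure-Python sketch of that contract (the dict-based store and error strings are illustrative, not TensorFlow's actual implementation):

```python
# Illustrative model of tf.get_variable's reuse contract (not real TF internals).
_store = {}

def get_variable(name, reuse=False):
    if reuse:
        # reuse=True: the variable must already exist
        if name not in _store:
            raise ValueError("Variable %s does not exist" % name)
        return _store[name]
    # reuse=False: the variable must NOT already exist
    if name in _store:
        raise ValueError("Variable %s already exists, disallowed" % name)
    _store[name] = object()
    return _store[name]

w1 = get_variable('lstm_vars/W')               # first creation: OK
w2 = get_variable('lstm_vars/W', reuse=True)   # reuse pass: same object back
assert w1 is w2
try:
    get_variable('lstm_vars/W')                # second creation attempt fails
except ValueError as e:
    print(e)  # Variable lstm_vars/W already exists, disallowed
```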
For reference, I have attached the relevant model code below. In the corresponding part of the LSTM model, I have the weights:
with tf.variable_scope('lstm_vars'):
    # softmax output weights
    W = tf.get_variable('W', [self.rnn_size, self.vocab_size], tf.float32,
                        tf.random_normal_initializer())
and the corresponding part for the Adam optimizer:
optimizer = tf.train.AdamOptimizer(self.learning_rate)

Solution

It seems this does not work:
with tf.variable_scope('lstm_model') as scope:
    # define LSTM Model
    lstm_model = LSTM_Model(rnn_size, vocab_size)
    scope.reuse_variables()
    test_lstm_model = LSTM_Model(rnn_size, infer_sample=True)
but this solved the problem:
# define LSTM Model
lstm_model = LSTM_Model(rnn_size, vocab_size)
# Tell TensorFlow we are reusing the scope for the testing
with tf.variable_scope(tf.get_variable_scope(), reuse=True):
    test_lstm_model = LSTM_Model(rnn_size, infer_sample=True)
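The working pattern above can be reduced to a minimal runnable sketch: build the variables once with reuse off, then enter the current scope again with reuse=True for the test model. This sketch uses the tf.compat.v1 API so it also runs under TensorFlow 2; the scope name and shapes are illustrative, not the asker's full model:

```python
import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()  # run in TF1-style graph mode

def build_weights(rnn_size, vocab_size):
    # Same pattern as the model's lstm_vars scope
    with tf.variable_scope('lstm_vars'):
        return tf.get_variable('W', [rnn_size, vocab_size], tf.float32,
                               tf.random_normal_initializer())

w_train = build_weights(4, 10)        # first call creates lstm_vars/W
# Tell TensorFlow we are reusing the scope for the testing graph
with tf.variable_scope(tf.get_variable_scope(), reuse=True):
    w_test = build_weights(4, 10)     # second call returns the SAME variable

assert w_train is w_test
```

The key point is that reuse is only turned on after the variables already exist, so the Adam slot variables created during the first pass are also found instead of triggering the "does not exist" error.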