Without further ado, let's get straight to it.
What is variable renaming good for?
Simple definition: in short, it means assigning parameter_A from model A to parameter_B of model B.
Use case: when you want to reuse already-trained parameters, especially ones trained by someone else, the variable names in that model often differ from the naming scheme in your own code. When loading the checkpoint you therefore need to rename the variables, which also keeps your code cleaner and easier to read.
Implementation:
1) Saving the model
import os
import tensorflow as tf

weights = tf.Variable(initial_value=tf.truncated_normal(shape=[1024, 2], mean=0.0, stddev=0.1),
                      dtype=tf.float32,
                      name="weights")
biases = tf.Variable(initial_value=tf.zeros(shape=[2]),
                     dtype=tf.float32,
                     name="biases")
weights_2 = tf.Variable(initial_value=weights.initialized_value(),
                        dtype=tf.float32,
                        name="weights_2")

# create the checkpoint directory if it does not exist yet
if not os.path.exists("checkpoints"):
    os.makedirs("checkpoints")

saver = tf.train.Saver()

with tf.Session() as sess:
    init_op = [tf.global_variables_initializer()]
    sess.run(init_op)
    # save all variables under their current names: 'weights', 'biases', 'weights_2'
    saver.save(sess=sess, save_path="checkpoints/variable.ckpt")
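Before renaming anything, it is often useful to check which names the checkpoint actually stores. The short sketch below (my own addition, not part of the original post) uses tf.train.list_variables and assumes the checkpoint written by the script above exists under checkpoints/variable.ckpt.

import tensorflow as tf

# Assumed path: the checkpoint prefix written by the saving script above.
ckpt_path = "checkpoints/variable.ckpt"

# tf.train.list_variables returns (name, shape) pairs for every variable in the checkpoint.
for name, shape in tf.train.list_variables(ckpt_path):
    print(name, shape)
# Expected to list: biases [2], weights [1024, 2], weights_2 [1024, 2]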
2) Loading the model (variable names unchanged)
import tensorflow as tf
from matplotlib import pyplot as plt
import os

current_path = os.path.dirname(os.path.abspath(__file__))


def restore_variable(sess):
    # No explicit initialization is needed, but the graph must define
    # the same variables (same names and shapes) as the checkpoint.
    weights = tf.Variable(initial_value=tf.truncated_normal(shape=[1024, 2], mean=0.0, stddev=0.1),
                          dtype=tf.float32,
                          name="weights")
    biases = tf.Variable(initial_value=tf.zeros(shape=[2]),
                         dtype=tf.float32,
                         name="biases")
    weights_2 = tf.Variable(initial_value=weights.initialized_value(),
                            dtype=tf.float32,
                            name="weights_2")

    saver = tf.train.Saver()
    ckpt_path = os.path.join(current_path, "checkpoints", "variable.ckpt")
    saver.restore(sess=sess, save_path=ckpt_path)

    weights_val, weights_2_val = sess.run(
        [
            tf.reshape(weights, shape=[2048]),
            tf.reshape(weights_2, shape=[2048])
        ]
    )

    # visualize the restored values of 'weights' and 'weights_2'
    plt.subplot(1, 2, 1)
    plt.scatter([i for i in range(len(weights_val))], weights_val)
    plt.subplot(1, 2, 2)
    plt.scatter([i for i in range(len(weights_2_val))], weights_2_val)
    plt.show()


if __name__ == '__main__':
    with tf.Session() as sess:
        restore_variable(sess)
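As a side note, tf.train.Saver does not have to cover every variable: passing a plain list restores only that subset, still under the original names. A minimal sketch (my own addition, reusing the shapes and names of the checkpoint above):

import tensorflow as tf

weights = tf.Variable(tf.truncated_normal(shape=[1024, 2], mean=0.0, stddev=0.1),
                      dtype=tf.float32, name="weights")
biases = tf.Variable(tf.zeros(shape=[2]), dtype=tf.float32, name="biases")

# Only 'weights' and 'biases' are restored; 'weights_2' in the checkpoint is simply ignored.
saver = tf.train.Saver([weights, biases])

with tf.Session() as sess:
    saver.restore(sess=sess, save_path="checkpoints/variable.ckpt")
    print(sess.run(biases))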
3) Loading the model (with variable renaming)
import tensorflow as tf
from matplotlib import pyplot as plt
import os

current_path = os.path.dirname(os.path.abspath(__file__))


def restore_variable_renamed(sess):
    conv1_w = tf.Variable(initial_value=tf.truncated_normal(shape=[1024, 2], mean=0.0, stddev=0.1),
                          dtype=tf.float32,
                          name="conv1_w")
    conv1_b = tf.Variable(initial_value=tf.zeros(shape=[2]),
                          dtype=tf.float32,
                          name="conv1_b")
    conv2_w = tf.Variable(initial_value=conv1_w.initialized_value(),
                          dtype=tf.float32,
                          name="conv2_w")

    # variable named 'weights' in the ckpt is assigned to the current variable conv1_w
    # variable named 'biases' in the ckpt is assigned to the current variable conv1_b
    # variable named 'weights_2' in the ckpt is assigned to the current variable conv2_w
    saver = tf.train.Saver({
        "weights": conv1_w,
        "biases": conv1_b,
        "weights_2": conv2_w
    })

    ckpt_path = os.path.join(current_path, "checkpoints", "variable.ckpt")
    saver.restore(sess=sess, save_path=ckpt_path)

    conv1_w_val, conv2_w_val = sess.run(
        [
            tf.reshape(conv1_w, shape=[2048]),
            tf.reshape(conv2_w, shape=[2048])
        ]
    )

    # visualize the restored values of conv1_w and conv2_w
    plt.subplot(1, 2, 1)
    plt.scatter([i for i in range(len(conv1_w_val))], conv1_w_val)
    plt.subplot(1, 2, 2)
    plt.scatter([i for i in range(len(conv2_w_val))], conv2_w_val)
    plt.show()


if __name__ == '__main__':
    with tf.Session() as sess:
        restore_variable_renamed(sess)
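When only a handful of variables are involved, writing the dictionary by hand as above is fine. If many variables differ from the checkpoint only by a name prefix, the mapping can also be built programmatically. A minimal sketch (my own addition; the 'new_model/' prefix is hypothetical):

import tensorflow as tf

def build_rename_dict(prefix="new_model/"):
    # Map each variable in the current graph back to its checkpoint name
    # by stripping the (hypothetical) prefix from the variable's op name.
    rename = {}
    for var in tf.global_variables():
        name_in_ckpt = var.op.name
        if name_in_ckpt.startswith(prefix):
            name_in_ckpt = name_in_ckpt[len(prefix):]
        rename[name_in_ckpt] = var
    return rename

# Usage (sketch):
# saver = tf.train.Saver(build_rename_dict())
# saver.restore(sess, "checkpoints/variable.ckpt")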
Summary:
# the variable named 'weights' in the old model is assigned to the current variable conv1_w
# the variable named 'biases' in the old model is assigned to the current variable conv1_b
# the variable named 'weights_2' in the old model is assigned to the current variable conv2_w
saver = tf.train.Saver({
"weights": conv1_w,
"biases": conv1_b,
"weights_2": conv2_w
})
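To double-check that a renamed variable really received the value stored under the old name, you can compare it against the raw checkpoint value. A short, self-contained sketch (my own addition, assuming the checkpoint written in step 1):

import numpy as np
import tensorflow as tf

# Current-graph variable under the new name.
conv1_w = tf.Variable(tf.truncated_normal(shape=[1024, 2], mean=0.0, stddev=0.1),
                      dtype=tf.float32, name="conv1_w")

# Read the value stored under the old name directly from the checkpoint file.
reader = tf.train.NewCheckpointReader("checkpoints/variable.ckpt")
ckpt_weights = reader.get_tensor("weights")

saver = tf.train.Saver({"weights": conv1_w})

with tf.Session() as sess:
    saver.restore(sess=sess, save_path="checkpoints/variable.ckpt")
    print(np.allclose(ckpt_weights, sess.run(conv1_w)))  # expected: True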
That is all for this example of variable renaming when saving and loading TensorFlow models. I hope it gives you a useful reference, and I hope you will continue to support 服務(wù)器之家.
Original article: https://blog.csdn.net/cxx654/article/details/88927962