How to access an embedding layer's variables in TensorFlow?
Suppose I have an embedding layer `e` like this:
import tensorflow as tf
e = tf.keras.layers.Embedding(5,3)
How can I print its values as a numpy array?

You need to build the embedding layer before you can access the embedding matrix:
import tensorflow as tf

emb = tf.keras.layers.Embedding(5, 3)
emb.build(())  # Embedding.build ignores the input shape, so any shape works
emb.trainable_variables[0].numpy()
# array([[-0.00595363, 0.03049802, 0.01821234],
# [ 0.01515153, -0.01006874, 0.02568189],
# [-0.01845006, 0.02135053, -0.03916124],
# [-0.00822829, 0.00922295, 0.00091892],
# [-0.00727308, -0.03537174, -0.01419405]], dtype=float32)
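As a side note, calling the layer on some indices also builds it implicitly, and `get_weights()` returns plain numpy arrays directly, so no `.numpy()` call is needed. A minimal sketch of that alternative (values will differ, since the matrix is randomly initialized):

```python
import tensorflow as tf

emb = tf.keras.layers.Embedding(5, 3)
# The first call builds the layer and creates the embedding matrix:
_ = emb(tf.constant([0, 1, 2]))
# get_weights() returns a list of numpy arrays; the matrix is the first entry:
weights = emb.get_weights()[0]
print(weights.shape)  # (5, 3)
```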
Thanks to @vald for his answer. I think `e.embeddings` is more Pythonic and possibly more efficient:
import tensorflow as tf
e = tf.keras.layers.Embedding(5,3)
e.build(())  # You must build the layer before using it; otherwise the variable does not exist yet.
print(e.embeddings)
>>>
<tf.Variable 'embeddings:0' shape=(5, 3) dtype=float32, numpy=
array([[ 0.02099125, 0.01865673, 0.03652272],
[ 0.02714007, -0.00316695, -0.00252246],
[-0.02411103, 0.02043924, -0.01297874],
[ 0.00766286, -0.03511617, 0.03460207],
[ 0.00256425, -0.03659264, -0.01796588]], dtype=float32)>