Use TPU in Google Colab
I am currently training a neural network on a TPU. I changed the runtime type and initialized the TPU, but it still doesn't seem any faster. I followed https://www.tensorflow.org/guide/tpu. Am I doing something wrong?
# TPU initialization
import os
import tensorflow as tf

resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu='grpc://' + os.environ['COLAB_TPU_ADDR'])
tf.config.experimental_connect_to_cluster(resolver)
# This is the TPU initialization code that has to be at the beginning.
tf.tpu.experimental.initialize_tpu_system(resolver)
print("All devices: ", tf.config.list_logical_devices('TPU'))
.
.
.
# experimental_steps_per_execution = 50
model.compile(optimizer=Adam(lr=learning_rate),
              loss='binary_crossentropy',
              metrics=['accuracy'],
              experimental_steps_per_execution=50)
My model summary

Is there anything else I need to consider or adjust?
You need to create a TPU strategy:

strategy = tf.distribute.TPUStrategy(resolver)

and then use this strategy correctly:
with strategy.scope():
    model = create_model()
    model.compile(optimizer='adam',
                  loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
                  metrics=['sparse_categorical_accuracy'])
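Putting the initialization and the strategy together, a minimal end-to-end sketch might look like the following. This assumes TensorFlow 2.x on a Colab TPU runtime; `create_model` and `train_dataset` are placeholders for your own model-building function and input pipeline, and `steps_per_execution` is the non-experimental name that newer TF versions use for `experimental_steps_per_execution`:

```python
import os
import tensorflow as tf

# Connect to and initialize the TPU (must run before other TPU work).
resolver = tf.distribute.cluster_resolver.TPUClusterResolver(
    tpu='grpc://' + os.environ['COLAB_TPU_ADDR'])
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)

# Build and compile the model inside the strategy scope, so that
# variables are created and replicated across the TPU cores.
strategy = tf.distribute.TPUStrategy(resolver)
with strategy.scope():
    model = create_model()  # placeholder: your own model-building function
    model.compile(optimizer='adam',
                  loss='binary_crossentropy',
                  metrics=['accuracy'],
                  steps_per_execution=50)  # batches several steps per TPU call

# Without strategy.scope(), the model would silently train on CPU,
# which is one common reason a "TPU" notebook shows no speedup.
model.fit(train_dataset, epochs=10)
```

The TPU guide also recommends feeding data as a `tf.data.Dataset` batched with `drop_remainder=True` and using a fairly large global batch size, since TPUs need fixed shapes and enough work per step to stay busy.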