Hyperparameter tuning with the TensorBoard HParams Dashboard does not work with a custom model

I have a custom Keras model that I want to optimize over its hyperparameters, while keeping good track of what is going on and being able to visualize it. So I want to pass the hparams to the custom model like this:

import tensorflow as tf

class Model_hparams(tf.keras.Model):
    def __init__(self, hparams):
        super(Model_hparams, self).__init__()
        self.hps = hparams
        
    def build(self, inputs_shape):
        self.conv1 = tf.keras.layers.Conv1D(filters=self.hps[HP_NUM_UNITS_1], 
                                            kernel_size=self.hps[HP_LEN_CONV_1], 
                                            activation='relu', 
                                            input_shape=inputs_shape[1:])
        self.pool1 = tf.keras.layers.MaxPool1D(pool_size=2)
        self.bn1 = tf.keras.layers.BatchNormalization()
        self.dense1 = tf.keras.layers.Dense(1)
        # actually, here are even more layers
    def call(self, x, training=True):
        x = self.conv1(x)
        x = self.pool1(x)
        x = self.bn1(x, training=training)
        x = self.dense1(x)
        return x

I followed the guide from TF:

from tensorboard.plugins.hparams import api as hp
HP_NUM_UNITS_1 = hp.HParam('num_units_1', hp.Discrete([16, 32]))

HP_LEN_CONV_1 = hp.HParam('len_conv_1', hp.Discrete([3]))

METRIC = 'mae'

with tf.summary.create_file_writer("../../model_output/hparams").as_default():
    hp.hparams_config(
        hparams=[HP_NUM_UNITS_1, HP_LEN_CONV_1],
        metrics=[hp.Metric(METRIC, display_name='Test_MAE')],
    )

def run(run_dir, hparams):
    with tf.summary.create_file_writer(run_dir).as_default():
        hp.hparams(hparams)  # record the values used in this trial
        test_mae = train_model(hparams)
        tf.summary.scalar('Mean_Average_Error', test_mae, step=1)

Now my training function, which calls the model with my training procedure, looks like this (simplified):

def train_model(hparams):
    model=Model_hparams(hparams)
    
    for batch in dataset:
        #...
        with tf.GradientTape() as tape:
            predictions = model(batch, training=True)
            #...

The actual optimization starts here:

n=0
for num_units_1 in HP_NUM_UNITS_1.domain.values:
    for len_conv_1 in HP_LEN_CONV_1.domain.values:     
        hparams = {HP_NUM_UNITS_1: num_units_1,
                   HP_LEN_CONV_1: len_conv_1}
        run_name = "run-%d" % n

        run("../../model_output/hparams/" + run_name, hparams)
        n += 1

However, if I run this, an error occurs when I try to instantiate my model:

<ipython-input-99-17dd66300f5b> in __init__(self, hparams)
     72     def __init__(self, hparams):
     73         super(Model_hparams, self).__init__()
---> 74         self.hps = hparams
     75 
     76     def build(self, inputs_shape):

c:\users3\anaconda3\envs\python_3_8_env1\lib\site-packages\tensorflow\python\keras\engine\training.py in __setattr__(self, name, value)
    312         isinstance(v, (base_layer.Layer,
    313                        data_structures.TrackableDataStructure)) or
--> 314         base_layer_utils.has_weights(v) for v in nest.flatten(value)):
    315       try:
    316         self._base_model_initialized

c:\users3\anaconda3\envs\python_3_8_env1\lib\site-packages\tensorflow\python\util\nest.py in flatten(structure, expand_composites)
    339     return [None]
    340   expand_composites = bool(expand_composites)
--> 341   return _pywrap_utils.Flatten(structure, expand_composites)
    342 
    343 

TypeError: '<' not supported between instances of 'HParam' and 'HParam'

I am not sure why this happens, and I cannot get it to work. I could not find anything about it in the documentation. Is there something I am missing?
Thanks for the support.

The tf.keras.Model class overrides the __setattr__ function, so you cannot set attributes it does not expect to track. Under the hood it flattens the assigned value with tf.nest, which sorts dictionary keys; since HParam objects do not define an ordering, that sort fails with the TypeError shown above. However, you can bypass this behaviour with the following trick.

object.__setattr__(self, 'hps', hparams)

.. instead of

self.hps = hparams

class Model_hparams(tf.keras.Model):
    def __init__(self, hparams):
        super(Model_hparams, self).__init__()
        object.__setattr__(self, 'hps', hparams)
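
For completeness, here is a minimal sketch of the patched class in use, assuming the HP_NUM_UNITS_1/HP_LEN_CONV_1 definitions from the question and a hypothetical input shape of (batch, 100, 1). It also shows an alternative that sidesteps the __setattr__ workaround entirely: converting the trial's hparams into a plain dict keyed by hparam name before storing it on the model, so Keras never has to flatten a dict with HParam keys (Model_plain is just an illustrative name):

import tensorflow as tf
from tensorboard.plugins.hparams import api as hp

# Same hparam definitions as in the question.
HP_NUM_UNITS_1 = hp.HParam('num_units_1', hp.Discrete([16, 32]))
HP_LEN_CONV_1 = hp.HParam('len_conv_1', hp.Discrete([3]))

hparams = {HP_NUM_UNITS_1: 16, HP_LEN_CONV_1: 3}

# Variant 1: the object.__setattr__ workaround from above.
model = Model_hparams(hparams)
dummy = tf.zeros((4, 100, 1))        # hypothetical batch: 4 sequences, length 100, 1 channel
_ = model(dummy, training=False)     # triggers build() without raising the TypeError

# Variant 2: store plain values keyed by hparam name instead of HParam objects.
plain = {h.name: v for h, v in hparams.items()}   # {'num_units_1': 16, 'len_conv_1': 3}

class Model_plain(tf.keras.Model):
    def __init__(self, hparams):
        super(Model_plain, self).__init__()
        self.hps = hparams           # str -> value dict, safe for Keras to track

    def build(self, inputs_shape):
        self.conv1 = tf.keras.layers.Conv1D(filters=self.hps['num_units_1'],
                                            kernel_size=self.hps['len_conv_1'],
                                            activation='relu')
        self.pool1 = tf.keras.layers.MaxPool1D(pool_size=2)
        self.bn1 = tf.keras.layers.BatchNormalization()
        self.dense1 = tf.keras.layers.Dense(1)

    def call(self, x, training=True):
        x = self.conv1(x)
        x = self.pool1(x)
        x = self.bn1(x, training=training)
        return self.dense1(x)

_ = Model_plain(plain)(dummy, training=False)

Variant 2 has the added benefit that Keras can still track the attribute normally, because the stored dict contains only plain Python keys and values.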