Naming of Keras Tuner Trials directory for TensorBoard

I am using Keras Tuner's BayesianOptimization to search for the best hyperparameters for my model, and I am also using the TensorBoard callback to visualize each model/trial.

However, the trials coming from the tuner are named/labelled strangely (e.g. trial_1dc4838863f2e4e8a84f0e415ee1db33). Is there a way to have the tuner name the trials simply "trial_1", "trial_2", etc., instead of that long string of digits and letters?

I could not find anything in the Keras Tuner documentation about how to do this, or any parameter for it when creating the Tuner instance.

I was able to work around this by subclassing the BayesianOptimization and BayesianOptimizationOracle classes. The custom oracle simply names each trial "0", "1", "2", and so on.

It would be nice if this were a bit more flexible, though, since I may end up doing the same for other tuner methods as well.

from kerastuner.engine import trial as trial_lib
from kerastuner.tuners.bayesian import (
    BayesianOptimization, BayesianOptimizationOracle)


class CustomBayesianOptimizationOracle(BayesianOptimizationOracle):

    def __init__(self,
                 objective,
                 max_trials,
                 num_initial_points=None,
                 alpha=1e-4,
                 beta=2.6,
                 seed=None,
                 hyperparameters=None,
                 allow_new_entries=True,
                 tune_new_entries=True):
        super(CustomBayesianOptimizationOracle, self).__init__(
            objective=objective,
            max_trials=max_trials,
            num_initial_points=num_initial_points,
            alpha=alpha,
            beta=beta,
            seed=seed,
            hyperparameters=hyperparameters,
            tune_new_entries=tune_new_entries,
            allow_new_entries=allow_new_entries)

        # Sequential counter that replaces the random hex trial IDs.
        self.trial_id = '0'

    def create_trial(self, tuner_id):
        """Create a new `Trial` to be run by the `Tuner`.

        A `Trial` corresponds to a unique set of hyperparameters to be run
        by `Tuner.run_trial`.

        Args:
          tuner_id: An ID that identifies the `Tuner` requesting a
          `Trial`. `Tuners` that should run the same trial (for instance,
           when running a multi-worker model) should have the same ID.

        Returns:
          A `Trial` object containing a set of hyperparameter values to run
          in a `Tuner`.
        """
        # Allow for multi-worker DistributionStrategy within a Trial.
        if tuner_id in self.ongoing_trials:
            return self.ongoing_trials[tuner_id]

        if self.max_trials and len(self.trials) >= self.max_trials:
            status = trial_lib.TrialStatus.STOPPED
            values = None
        else:
            response = self._populate_space(self.trial_id)
            status = response['status']
            values = response['values'] if 'values' in response else None

        hyperparameters = self.hyperparameters.copy()
        hyperparameters.values = values or {}
        trial = trial_lib.Trial(
            hyperparameters=hyperparameters,
            trial_id=self.trial_id,
            status=status)

        if status == trial_lib.TrialStatus.RUNNING:
            self.ongoing_trials[tuner_id] = trial
            self.trials[self.trial_id] = trial
            self._save_trial(trial)
            self.save()

        # Advance the sequential counter for the next trial.
        self.trial_id = str(int(self.trial_id) + 1)

        return trial


class CustomBayesianOptimization(BayesianOptimization):

    def __init__(self,
                 hypermodel,
                 objective,
                 max_trials,
                 num_initial_points=2,
                 seed=None,
                 hyperparameters=None,
                 tune_new_entries=True,
                 allow_new_entries=True,
                 **kwargs):
        oracle = CustomBayesianOptimizationOracle(
            objective=objective,
            max_trials=max_trials,
            num_initial_points=num_initial_points,
            seed=seed,
            hyperparameters=hyperparameters,
            tune_new_entries=tune_new_entries,
            allow_new_entries=allow_new_entries)
        # Deliberately skip BayesianOptimization.__init__ (which would build
        # its own oracle) and call the base Tuner with the custom oracle.
        super(BayesianOptimization, self).__init__(
            oracle=oracle,
            hypermodel=hypermodel,
            **kwargs)
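Stripped of the keras-tuner specifics, the idea above is just a sequential counter in place of random hex IDs. Here is a minimal, framework-free sketch of that naming scheme (the class and method names are my own, not part of any library):

```python
import itertools


class SequentialTrialNamer:
    """Yields "trial_0", "trial_1", ... suitable for TensorBoard log dirs."""

    def __init__(self, prefix="trial"):
        self._prefix = prefix
        self._counter = itertools.count()  # 0, 1, 2, ...

    def next_id(self):
        # Each call produces the next sequential trial name.
        return f"{self._prefix}_{next(self._counter)}"


namer = SequentialTrialNamer()
print(namer.next_id())  # trial_0
print(namer.next_id())  # trial_1
```

Inside the custom oracle this counter lives in `self.trial_id` and is advanced at the end of `create_trial`, so every trial directory sorts naturally in TensorBoard.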