Is the self.scale variable defined in the constructor constant during training?

I don't know the inner workings of Lasagne. Consider the following code:

import numpy as np
import lasagne


class WScaleLayer(lasagne.layers.Layer):
    def __init__(self, incoming, **kwargs):
        super(WScaleLayer, self).__init__(incoming, **kwargs)
        # Rescale the incoming layer's weights to unit RMS and store the removed
        # scale as a non-trainable parameter.
        W = incoming.W.get_value()
        scale = np.sqrt(np.mean(W ** 2))
        incoming.W.set_value(W / scale)
        self.scale = self.add_param(scale, (), name='scale', trainable=False)
        # Take over the incoming layer's bias, if it has one.
        self.b = None
        if hasattr(incoming, 'b') and incoming.b is not None:
            b = incoming.b.get_value()
            self.b = self.add_param(b, b.shape, name='b', regularizable=False)
            del incoming.params[incoming.b]
            incoming.b = None
        # Take over the incoming layer's nonlinearity; the incoming layer becomes linear.
        self.nonlinearity = lasagne.nonlinearities.linear
        if hasattr(incoming, 'nonlinearity') and incoming.nonlinearity is not None:
            self.nonlinearity = incoming.nonlinearity
            incoming.nonlinearity = lasagne.nonlinearities.linear

    def get_output_for(self, v, **kwargs):
        # Reapply the stored scale, then the bias and nonlinearity that were
        # taken over from the incoming layer.
        v = v * self.scale
        if self.b is not None:
            pattern = ['x', 0] + ['x'] * (v.ndim - 2)
            v = v + self.b.dimshuffle(*pattern)
        return self.nonlinearity(v)

Can you tell me whether self.scale stays constant during training once it has been initialized?

I'm not a Lasagne expert, but unless you do something unusual, self.scale should not change during training: it is registered with add_param(..., trainable=False), so it is excluded from the parameters returned by lasagne.layers.get_all_params(..., trainable=True), which are the ones the standard update rules modify.
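
One way to check is to look at which parameters Lasagne reports as trainable. Here is a minimal sketch, assuming the WScaleLayer class from the question and a working Theano/Lasagne install; the small DenseLayer network (l_in, l_dense, l_scaled) is just a hypothetical example:

import lasagne

# Hypothetical toy network: an InputLayer feeding a DenseLayer,
# wrapped by the WScaleLayer defined in the question.
l_in = lasagne.layers.InputLayer(shape=(None, 10))
l_dense = lasagne.layers.DenseLayer(l_in, num_units=5)
l_scaled = WScaleLayer(l_dense)

# Parameters that the usual update rules (sgd, adam, ...) would modify.
trainable = lasagne.layers.get_all_params(l_scaled, trainable=True)
print([p.name for p in trainable])    # 'scale' is absent, so it is never updated

# All parameters, including those tagged trainable=False.
all_params = lasagne.layers.get_all_params(l_scaled)
print([p.name for p in all_params])   # 'scale' shows up here

You could of course still change it by hand with self.scale.set_value(...); that is the "unless you do something unusual" part.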

That said, this code looks odd to me: you initialize the scale from the initial values of the incoming layer's weights. Is that really what you want?
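
To make that concrete, continuing the hypothetical toy network from the sketch above: the scale is computed once, at construction time, from whatever incoming.W happens to contain, and it does not track later changes to the weights.

# Continuing the toy example above (l_dense and l_scaled are the same objects).
old_scale = float(l_scaled.scale.get_value())
l_dense.W.set_value(l_dense.W.get_value() * 10)   # change the weights afterwards
new_scale = float(l_scaled.scale.get_value())
print(old_scale == new_scale)                     # True: self.scale did not follow W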