How can I solve the problem that a method is not iterable?

I have this variational autoencoder and I want to use Adam as its optimizer, but I get the error below and I don't know what is going wrong here.

import torch
import torch.nn as nn

class VAE(nn.Module):
    def __init__(self):
        super().__init__()

        #encoder
        self.enc = nn.Sequential(
            nn.Linear(1200, 786),
            nn.ReLU(),
            nn.Flatten()
        )
        self.mean = nn.Linear(1200, 2)
        self.log = nn.Linear(1200, 2)
        #decoder
        self.dec = nn.Sequential(
            nn.Linear(2, 1200),
            nn.ReLU(),
        )

    def param(self, mu, Log):
        eps = torch.randn(2, 1200)
        z = mu + (eps * torch.exp(Log * 0.5))
        return z

    def forward(self, x):
        x = self.enc(x)
        mu , log = self.mean(x), self.log(x)
        z = self.param(mu, log)
        x = self.dec(z)
        return x, mu, log

model = VAE()
optim = torch.optim.Adam(model.param, lr=0.01)
criterion = nn.CrossEntropyLoss()

Here is the error:

Traceback (most recent call last):
 File "C:\Users\khashayar\PycharmProjects\pythonProject2\VAE.py", line 40, in <module>
   optim = torch.optim.Adam(model.param, lr=0.01)
 File "C:\Users\khashayar\anaconda3\envs\deeplearning\lib\site-packages\torch\optim\adam.py", line 48, in __init__
   super(Adam, self).__init__(params, defaults)
 File "C:\Users\khashayar\anaconda3\envs\deeplearning\lib\site-packages\torch\optim\optimizer.py", line 47, in __init__
   param_groups = list(params)
TypeError: 'method' object is not iterable

How can I fix this problem?

The problem is likely in model.param. param is a method, as the error itself says: "'method' object is not iterable". The optimizer should receive the model's parameters, not the param method defined in your class.
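
To see the difference, here is a minimal sketch (assuming the VAE class from the question is already defined; the comments show what each expression evaluates to):

import torch

model = VAE()

# model.param is a bound method -- a single callable object, not something iterable
print(type(model.param))                 # <class 'method'>

# model.parameters() yields the registered nn.Parameter tensors of every submodule
print(type(next(model.parameters())))    # <class 'torch.nn.parameter.Parameter'>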

Try changing optim = torch.optim.Adam(model.param, lr=0.01) to optim = torch.optim.Adam(model.parameters(), lr=0.01). The parameters() method is inherited from nn.Module and returns an iterator over all of the model's trainable tensors, which is exactly what Adam expects.
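
With that change the optimizer constructs without error. A minimal sketch of the corrected setup (layer sizes taken unchanged from the question):

import torch
import torch.nn as nn

model = VAE()

# parameters() returns an iterator over all trainable tensors, which is what Adam expects
optim = torch.optim.Adam(model.parameters(), lr=0.01)
criterion = nn.CrossEntropyLoss()

# each of the four Linear layers contributes a weight and a bias tensor
print(len(list(model.parameters())))     # 8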