How can I solve the linear inverse problem Ax=b using a deep neural network?
I am trying to solve the linear inverse problem Ax=b using a deep neural network, but I am completely new to machine learning, and all the tutorials I can find are about classification. Can someone point me to tutorials (code, videos, papers) on how to solve the Ax=b problem with a deep neural network?
Here is an example from this blog, which treats solving Ax=b as minimizing the residual norm ‖Ax − b‖ by gradient descent:
import torch

dim = 2
A = torch.rand(dim, dim, requires_grad=False)
b = torch.rand(dim, 1, requires_grad=False)
# torch.autograd.Variable is deprecated; create a leaf tensor directly
x = torch.rand(dim, 1, requires_grad=True)
stop_loss = 1e-2
step_size = stop_loss / 3.0

print('Loss before: %s' % (torch.norm(torch.matmul(A, x) - b)))
for i in range(1000 * 1000):
    Δ = torch.matmul(A, x) - b   # residual Ax - b
    L = torch.norm(Δ, p=2)       # L2 norm of the residual
    L.backward()                 # compute dL/dx
    with torch.no_grad():
        x -= step_size * x.grad  # gradient-descent step
        x.grad.zero_()           # clear the gradient for the next iteration
    if i % 10000 == 0:
        print('Loss is %s at iteration %i' % (L, i))
    if L.item() < stop_loss:
        print('It took %s iterations to achieve %s loss.' % (i, stop_loss))
        break
print('Loss after: %s' % (torch.norm(torch.matmul(A, x) - b)))
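
Note that the snippet above optimizes x directly by gradient descent; no neural network is involved. If the goal is to use an actual deep network, one common setup is to train a small network to map right-hand sides b to solutions x, using the residual ‖Ax − b‖ as the training loss. Below is a minimal sketch, assuming PyTorch; the architecture, learning rate, batch size, and step counts are illustrative assumptions, not taken from the blog:

import torch
import torch.nn as nn

dim = 2
A = torch.rand(dim, dim)

# A small MLP that maps a right-hand side b to a candidate solution x.
# The layer sizes here are illustrative choices.
net = nn.Sequential(
    nn.Linear(dim, 64),
    nn.ReLU(),
    nn.Linear(64, 64),
    nn.ReLU(),
    nn.Linear(64, dim),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(5000):
    # Sample random right-hand sides by drawing x and forming b = Ax,
    # so the network learns the inverse map b -> x for this fixed A.
    x_true = torch.randn(256, dim)
    b = x_true @ A.T          # row-vector convention: b_i = A x_i
    x_pred = net(b)
    # Train on the residual ||A x_pred - b||; x_true itself is not needed.
    loss = torch.norm(x_pred @ A.T - b, dim=1).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
    if step % 1000 == 0:
        print('step %d, residual loss %.4f' % (step, loss.item()))

# Solve a new system: predict x for a fresh b.
b_new = torch.rand(1, dim)
x_hat = net(b_new)
print('residual:', torch.norm(x_hat @ A.T - b_new).item())

After training, net(b) gives an approximate solution for any new b with this same A; the network would have to be retrained if A changes.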