Python numpy (einsum) optimization: 1D to ND outer dot products

What is the most efficient way to compute the dot product of two ND outer products of 1D arrays?

"i,j,k,l->ij,kl->" would be a nice einsum alternative, but it fails with:

invalid subscript ',' in einstein sum subscripts string, subscripts must be letters

Below is the naive implementation I am trying to speed up (in practice A and B will be something else). The ideal would be a fast version with the following API, producing the same results as the examples below: nd_outer_from1D(2, A, B), nd_outer_from1D(3, A, B). As you will see, storing the intermediate result and feeding it back into einsum quickly becomes infeasible as |A|, |B| and N grow, because of the number of arguments.
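For what it's worth, the whole contraction can also be expressed in a single valid einsum call by repeating index letters across operands (a sketch, not from the original post):

```python
import numpy as np

A = np.arange(50)
B = np.arange(50)

# Repeating an index letter across operands contracts over it, so
# sum_{i,j} A_i * A_j * B_i * B_j is one valid subscript string:
result = np.einsum('i,j,i,j->', A, A, B, B)

# Same quantity as the naive nested-sum formulation:
naive = (np.outer(A, A) * np.outer(B, B)).sum()
```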

$  python -m timeit 'import numpy as np; A=np.arange(50); B=np.arange(50); sum(sum(np.outer(A,A) * np.outer(B,B)))'
10000 loops, best of 3: 72.1 usec per loop
$  python -m timeit 'import numpy as np; A=np.arange(50); B=np.arange(50); sum(sum(np.einsum("i,j->ij",A,A) * np.einsum("i,j->ij",B,B)  ))'
10000 loops, best of 3: 61.4 usec per loop
$  python -m timeit 'import numpy as np; A=np.arange(50); B=np.arange(50); sum(sum(sum(np.einsum("i,j,k->ijk",A,A,A) * np.einsum("i,j,k->ijk",B,B,B)  )))'
1000 loops, best of 3: 1.78 msec per loop
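The naive approach above can be wrapped behind the requested API by building the subscript string dynamically. nd_outer_from1D here is a hypothetical sketch of that naive generalization, which also shows why it blows up: the intermediate arrays have len(A)**n elements.

```python
import numpy as np
from string import ascii_lowercase

def nd_outer_from1D(n, A, B):
    # Build e.g. 'a,b,c->abc' for n == 3, form both n-way outer
    # products explicitly, and sum their elementwise product.
    # Memory grows as len(A)**n, which is what makes this infeasible.
    idx = ascii_lowercase[:n]
    subs = ','.join(idx) + '->' + idx
    return (np.einsum(subs, *[A] * n) * np.einsum(subs, *[B] * n)).sum()
```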

Edit (example):

>>> A
array([0, 1, 2, 3])
>>> B
array([0.58394169, 0.22495002, 0.08322459, 0.05406281])
>>> sum(sum(np.einsum('i,j->ij',A,A) * np.einsum('i,j->ij', B, B)))
0.3064592592321492

Apparently sum(sum( with the leading einsum is not working the way I expected.
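The example above can be sanity-checked against the collapsed form; note that B is copied from the printed (rounded) values, so the last digits may differ slightly from the original session:

```python
import numpy as np

A = np.array([0, 1, 2, 3])
# B as printed above; the values are rounded to 8 decimals.
B = np.array([0.58394169, 0.22495002, 0.08322459, 0.05406281])

nested = (np.einsum('i,j->ij', A, A) * np.einsum('i,j->ij', B, B)).sum()
direct = A.dot(B) ** 2  # the same contraction, collapsed to one dot product
```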

This is a piece of cake for the experts below; timings for comparison:

$  python -m timeit 'import numpy as np; A=np.arange(50); B=np.arange(50); np.einsum("i,i->",A,B)**3'
100000 loops, best of 3: 6.77 usec per loop
$  python -m timeit 'import numpy as np; A=np.arange(50); B=np.arange(50); np.einsum("i,i->",A,B)**2'
100000 loops, best of 3: 6.63 usec per loop
$  python -m timeit 'import numpy as np; A=np.arange(50); B=np.arange(50); A.dot(B)**3'
100000 loops, best of 3: 3.75 usec per loop
$  python -m timeit 'import numpy as np; A=np.arange(50); B=np.arange(50); A.dot(B)**2'
100000 loops, best of 3: 3.68 usec per loop

Wow, this is faster than I expected:

$  python -m timeit 'import numpy as np; A=np.arange(5000); B=np.arange(5000); A.dot(B)**10'
100000 loops, best of 3: 12.1 usec per loop

This could be optimized with einsum:

np.einsum("i,i->",A,B)**2

or with matrix multiplication:

A.dot(B)**2

It can also be done with a single einsum:
np.einsum('i, j, j, i', A, A, B, B)
Out: 0.30645926408901691
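The identity behind these one-liners generalizes to any N: the sum over all n indices of (A_i1 * ... * A_in) * (B_i1 * ... * B_in) factorizes into (A · B)**n. So the API asked for in the question can be sketched as follows (the function name comes from the question; the implementation is this answer's identity, not code from the post):

```python
import numpy as np

def nd_outer_from1D_fast(n, A, B):
    # sum_{i1..in} (A_i1 * ... * A_in) * (B_i1 * ... * B_in)
    #   = (sum_i A_i * B_i) ** n
    # so the whole N-dimensional contraction is a single dot product.
    return A.dot(B) ** n
```

This reproduces the nested-sum results in O(|A|) time and memory instead of O(|A|**n).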