MPI4PY: Scatter a matrix

I am using MPI4PY to scatter n/p columns of the input data to each of two processes. However, I cannot send the columns the way I want. What changes do I have to make to this code to obtain the desired output shown at the end?

The matrix is:

[1, 2, 3, 4]
[5, 6, 7, 8]
[9, 10, 11, 12]
[13, 14, 15, 16]

So n = 4 and p = 2, and each process should end up with 2 columns.

Here is my code:

# Imports
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
size = comm.Get_size() 
rank = comm.Get_rank()

rows = 4
num_columns = rows // size  # columns per process

data=None

if rank == 0:
  data = np.matrix([[1, 2, 3, 4], [5, 6, 7, 8], [9, 10, 11, 12], [13, 14, 15, 16]])

recvbuf = np.empty((rows, num_columns), dtype='int')
comm.Scatterv(data, recvbuf, root=0)
print('Rank: ',rank, ', recvbuf received:\n ',recvbuf)

I get the following output:

Rank:  0 , recvbuf received:
[[1 2]
[3 4]
[5 6]
[7 8]]
Rank:  1 , recvbuf received:
[[ 9 10]
[11 12]
[13 14]
[15 16]]

But I would like to get this output instead:

Rank:  0 , recvbuf received:
[[1 2]
[5 6]
[9 10]
[13 14]]
Rank:  1 , recvbuf received:
[[ 3 4]
[7 8]
[11 12]
[15 16]]

I think the code below does what you want. The problem here is that Scatterv does not care about the shape of the numpy array at all; it only deals with the linear block of memory holding your values. So the simplest approach is to rearrange the data into the correct order beforehand. Note that send_data ends up as a flat 1-D array, but that does not matter, since Scatterv does not care. On the receiving side, the shape of recvbuf is already defined, and Scatterv simply fills it from the flat input it receives.
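To make the "linear block of memory" point concrete, here is a small standalone sketch (my own illustration, not part of the original answer) of what Scatterv effectively did with the question's matrix:

# Illustration only: Scatterv sees just the flat row-major buffer.
import numpy as np

data = np.array([[1, 2, 3, 4], [5, 6, 7, 8], [9, 10, 11, 12], [13, 14, 15, 16]])
flat = data.ravel()               # row-major order: [ 1  2  3 ... 16]
part0, part1 = np.split(flat, 2)  # half of the flat buffer goes to each rank
print(part0.reshape(4, 2))        # exactly rank 0's (unwanted) recvbuf from the question

With that picture in mind, the corrected version rearranges the columns into the right flat order before scattering: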

# Imports
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
size = comm.Get_size()
rank = comm.Get_rank()

rows = 4
num_cols = rows // size  # columns per process

send_data=None

if rank == 0:
  data = np.matrix([[1, 2, 3, 4],
                    [5, 6, 7, 8],
                    [9, 10, 11, 12],
                    [13, 14, 15, 16]])

  # Split into sub-arrays along required axis
  arrs = np.split(data, size, axis=1)

  # Flatten the sub-arrays
  raveled = [np.ravel(arr) for arr in arrs]

  # Join them back up into a 1D array
  send_data = np.concatenate(raveled)


recvbuf = np.empty((rows, num_cols), dtype='int')
comm.Scatterv(send_data, recvbuf, root=0)

print('Rank: ',rank, ', recvbuf received:\n ',recvbuf)
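
Run it with two processes, e.g. mpiexec -n 2 python <your_script>.py, and you should see the desired output from the question. For completeness, here is a minimal sketch (my addition, not part of the original answer, reusing the variables above) of how rank 0 could gather the column blocks back and reassemble the original matrix:

# Optional round trip (my addition): gather the flattened column
# blocks back on rank 0 and undo the column reordering.
gathered = None
if rank == 0:
    gathered = np.empty(rows * num_cols * size, dtype='int')

comm.Gatherv(np.ravel(recvbuf), gathered, root=0)

if rank == 0:
    # Each rank contributed a flattened (rows x num_cols) block;
    # reshape each block and stack them side by side.
    blocks = [b.reshape(rows, num_cols) for b in np.split(gathered, size)]
    print('Restored:\n', np.hstack(blocks))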