Cholesky with ScaLAPACK

I am trying to do a Cholesky decomposition via pdpotrf() from Intel's MKL, which uses ScaLAPACK. I read the whole matrix on the master node and then distribute it as in this example. Everything works fine when the dimension of the SPD matrix is even, but when it is odd, pdpotrf() reports that the matrix is not positive definite.

Could it be because the submatrices are not SPD? I am using this matrix:

  4   1   2 0.5   2
  1 0.5   0   0   0
  2   0   3   0   0
0.5   0   0 0.625  0
  2   0   0   0  16

and the submatrices are (with 4 processes and blocks of size 2x2):

nrows = 3, ncols = 3
A_loc on node 0
  4   1   2
  1 0.5   0
  2   0  16

nrows = 3, ncols = 2
A_loc on node 1
  2 0.5
  0   0
  0   0

nrows = 2, ncols = 3
A_loc on node 2
  2   0   0
0.5   0   0

nrows = 2, ncols = 2
A_loc on node 3
  3   0
  0 0.625

Here none of the submatrices is SPD, yet the overall matrix is SPD (I checked by running with 1 process). What should I do? Or is there nothing to be done and pdpotrf() simply does not work with matrices of odd size?
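
As a sanity check, the local dimensions shown above can be reproduced with ScaLAPACK's numroc_. Below is a minimal sketch, assuming MKL's Fortran-style numroc_ and C-style BLACS bindings and the same 2x2 process grid with 2x2 blocks (run with mpiexec -n 4):

#include <cstdio>
#include <mpi.h>

// BLACS/ScaLAPACK bindings as shipped by MKL.
extern "C" {
    void Cblacs_pinfo(int *mypnum, int *nprocs);
    void Cblacs_get(int icontxt, int what, int *val);
    void Cblacs_gridinit(int *icontxt, const char *order, int nprow, int npcol);
    void Cblacs_gridinfo(int icontxt, int *nprow, int *npcol, int *myrow, int *mycol);
    void Cblacs_gridexit(int icontxt);
    int numroc_(const int *n, const int *nb, const int *iproc,
                const int *isrcproc, const int *nprocs);
}

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);

    int myid, nprocs, ctxt;
    Cblacs_pinfo(&myid, &nprocs);
    Cblacs_get(0, 0, &ctxt);
    Cblacs_gridinit(&ctxt, "Row-major", 2, 2);     // 2x2 process grid

    int nprow, npcol, myrow, mycol;
    Cblacs_gridinfo(ctxt, &nprow, &npcol, &myrow, &mycol);

    int N = 5, Nb = 2, iZERO = 0;                  // 5x5 matrix, 2x2 blocks
    // Local number of rows/columns owned under the 2D block-cyclic layout.
    int nrows = numroc_(&N, &Nb, &myrow, &iZERO, &nprow);
    int ncols = numroc_(&N, &Nb, &mycol, &iZERO, &npcol);
    printf("node %d: nrows = %d, ncols = %d\n", myid, nrows, ncols);

    Cblacs_gridexit(ctxt);
    MPI_Finalize();
    return 0;
}

For N = 5 this prints nrows, ncols of 3 or 2 per process, exactly as in the listing above.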


Here is how I call the routine:

int iZERO = 0;
int descA[9];
// N, M: dimensions of the matrix; lda = N
// Nb, Mb: dimensions of a block
descinit_(descA, &N, &M, &Nb, &Mb, &iZERO, &iZERO, &ctxt, &lda, &info);
...
pdpotrf((char*)"L", &ord, A_loc, &IA, &JA, descA, &info);
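
For reference, here is a sketch of the Fortran-calling-convention declarations behind these calls (MKL also provides them via mkl_scalapack.h); the argument positions matter for decoding the error further down:

extern "C" {
    // The 9-element descriptor holds { dtype, ctxt, m, n, mb, nb, rsrc, csrc, lld }.
    void descinit_(int *desc, const int *m, const int *n,
                   const int *mb, const int *nb,
                   const int *irsrc, const int *icsrc,
                   const int *ictxt, const int *lld, int *info);

    // uplo, n, a, ia, ja, desca, info; desca is argument number 6.
    void pdpotrf_(const char *uplo, const int *n, double *a,
                  const int *ia, const int *ja, const int *desca, int *info);
}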

I have also tried this:

// nrows/ncols is the number of rows/columns a submatrix has
descinit_(descA, &N, &M, &nrows, &ncols, &iZERO, &iZERO, &ctxt, &lda, &info);

but I get an error:

{ 0, 0}: On entry to { 0, 1}: On entry to PDPOTR{ 1, 0}: On entry to PDPOTRF parameter number 605 had an illegal value { 1, 1}: On entry to PDPOTRF parameter number 605 had an illegal value F parameter number 605 had an illegal value

PDPOTRF parameter number 605 had an illegal value

info < 0: if the i-th argument is an array and the j-th entry had an illegal value, then INFO = -(i*100+j); if the i-th argument is a scalar and had an illegal value, then INFO = -i. Here info = -605, i.e. i = 6 and j = 5: the 5th entry (MB_, the row block size) of the 6th argument (descA) is illegal, presumably because pdpotrf() requires square blocks (MB_A = NB_A) and a descriptor that is consistent across all processes, neither of which holds when the local nrows/ncols are used as block sizes.
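
That decoding can be done mechanically; the following throwaway helper (hypothetical, not part of ScaLAPACK or MKL) just illustrates the INFO convention:

#include <cstdio>

// Hypothetical helper: decode ScaLAPACK's INFO = -(i*100 + j) convention.
void decode_info(int info) {
    if (info >= 0) { printf("info = %d: no illegal argument\n", info); return; }
    int v = -info;
    if (v >= 100)
        printf("argument %d, entry %d had an illegal value\n", v / 100, v % 100);
    else
        printf("argument %d had an illegal value\n", v);
}

// decode_info(-605) prints: argument 6, entry 5 had an illegal value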

The meaning of the function's arguments can be seen from my answer there.


The code is based on this question. Output:

gsamaras@pythagoras:~/konstantis/check_examples$ ../../mpich-install/bin/mpic++ -o test minor.cpp -I../../intel/mkl/include ../../intel/mkl/lib/intel64/libmkl_scalapack_lp64.a       -Wl,--start-group       ../../intel/mkl/lib/intel64/libmkl_intel_lp64.a ../../intel/mkl/lib/intel64/libmkl_core.a  ../../intel/mkl/lib/intel64/libmkl_sequential.a    -Wl,--end-group ../../intel/mkl/lib/intel64/libmkl_blacs_intelmpi_lp64.a -lpthread       -lm     -ldl
gsamaras@pythagoras:~/konstantis/check_examples$ mpiexec -n 4 ./test
Processes grid pattern:
0 1
2 3
nrows = 3, ncols = 3
A_loc on node 0
  4   1   2
  1 0.5   0
  2   0  16

nrows = 3, ncols = 2
A_loc on node 1
  2 0.5
  0   0
  0   0

nrows = 2, ncols = 3
A_loc on node 2
  2   0   0
0.5   0   0

nrows = 2, ncols = 2
A_loc on node 3
  3   0
  0 0.625

Description init sucesss!
matrix is not positive definte
Matrix A result:
  2   1   2 0.5   2
0.5 0.5   0   0   0
  1   0   1   0 -0.25
0.25  -1 -0.5 0.625   0
  1  -1  -2 -0.5  14

The problem may come from:

MPI_Bcast(&lda, 1, MPI_INT, 0, MPI_COMM_WORLD);

Before this line, if the dimension of the matrix is odd, lda is different on each process: two processes handle 2 rows and two processes handle 3 rows. But after the MPI_Bcast(), lda is the same everywhere (3).

The problem is that the lda argument of the DESCINIT subroutine must be the leading dimension of the local array, i.e. either 2 or 3, depending on the process.
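
A minimal sketch of the fix, assuming the grid variables (myrow, nprow) and the numroc_ binding from the sketch earlier: let every process compute its own local leading dimension instead of broadcasting the master's value.

// Each process derives lda from its own local row count; no broadcast needed.
int nrows = numroc_(&N, &Nb, &myrow, &iZERO, &nprow);  // 3 on grid row 0, 2 on grid row 1
int lda   = (nrows > 0 ? nrows : 1);                   // leading dimension must be >= 1
descinit_(descA, &N, &M, &Nb, &Mb, &iZERO, &iZERO, &ctxt, &lda, &info);

descinit_ then receives lda = 3 on the first process row and lda = 2 on the second, which matches the successful run below.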

By commenting out the MPI_Bcast(), I get:

Description init sucesss!
SUCCESS
Matrix A result:
2   1   2 0.5   2 
0.5 0.5   0   0   0 
1  -1   1   0   0 
0.25 -0.25 -0.5 0.5   0 
1  -1  -2  -3   1 

Finally, that would explain why the program ran fine with even dimensions and failed with odd ones!