MatMPIBDiagSetPreallocation

Collective on Mat

Synopsis

int MatMPIBDiagSetPreallocation(Mat B,int nd,int bs,int *diag,PetscScalar **diagv)

Input Parameters

B - the matrix
nd - number of block diagonals (global) (optional)
bs - each element of a diagonal is a bs x bs dense matrix
diag - optional array of block diagonal numbers (length nd). For a matrix element B[i,j], where i=row and j=column, the diagonal number is

          diag = i/bs - j/bs  (integer division)

       Set diag=PETSC_NULL on input for PETSc to dynamically allocate memory as needed (expensive); see the example following this list.
diagv - pointer to actual diagonals (in same order as diag array), if allocated by user. Otherwise, set diagv=PETSC_NULL on input for PETSc to control memory allocation.
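
For example, a block tridiagonal matrix occupies the three block diagonals 0, -1, and 1. A minimal sketch (the matrix B is assumed to have been created already; the values of nd and bs are illustrative):

    int ierr,nd = 3,bs = 4;
    int diag[3] = {0,-1,1};  /* main, first super- (j>i), first sub-diagonal (i>j) */

    /* Let PETSc allocate the diagonal storage by passing diagv = PETSC_NULL */
    ierr = MatMPIBDiagSetPreallocation(B,nd,bs,diag,PETSC_NULL);CHKERRQ(ierr);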

Options Database Keys

-mat_block_size <bs> - Sets the block size
-mat_bdiag_diags <s1,s2,s3,...> - Sets the diagonal numbers
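
For example, to override these settings at run time (the program name and process count here are illustrative):

    mpirun -np 4 ./ex1 -mat_block_size 3 -mat_bdiag_diags 0,1,-1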

Notes

If PETSC_DECIDE or PETSC_DETERMINE is used for a particular argument on one processor, then it must be used on all processors that share the object for that argument.

The parallel matrix is partitioned across the processors by rows, where each local rectangular matrix is stored in the uniprocessor block diagonal format. See the users manual for further details.

The user MUST specify either the local or the global number of rows (possibly both).

The case bs=1 (conventional diagonal storage) is implemented as a special case.
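
The call sequence below is a minimal sketch, assuming the MatCreate() and MatSetType() interfaces contemporaneous with this routine (the communicator, sizes, block size, and diagonal numbers are illustrative):

    #include "petscmat.h"

    /* Sketch: create a parallel block diagonal matrix of global size M x M,
       then preallocate three block diagonals with block size bs, letting
       PETSc manage the diagonal storage (diagv = PETSC_NULL) */
    int CreateBDiag(MPI_Comm comm,int M,Mat *B)
    {
      int ierr,nd = 3,bs = 2;
      int diag[3] = {0,-1,1};

      ierr = MatCreate(comm,PETSC_DECIDE,PETSC_DECIDE,M,M,B);CHKERRQ(ierr);
      ierr = MatSetType(*B,MATMPIBDIAG);CHKERRQ(ierr);
      ierr = MatMPIBDiagSetPreallocation(*B,nd,bs,diag,PETSC_NULL);CHKERRQ(ierr);
      return 0;
    }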

Fortran Notes

Fortran programmers cannot set diagv; this variable is ignored.

Keywords

matrix, block, diagonal, parallel, sparse

See Also

MatCreate(), MatCreateSeqBDiag(), MatSetValues()

Level: intermediate
Location: src/mat/impls/bdiag/mpi/mpibdiag.c