I am trying to create a PETSc matrix by hand, using the size information from the function space (`V.dofmap.index_map.size_global`). When I use it in a matrix-vector multiplication in a parallel run, I get a `Nonconforming object sizes` error. By contrast, a matrix created and assembled from a weak form multiplies without problems. I suspect the two matrices have different row partitionings across the processes. How can I create a PETSc matrix whose partitioning matches the one DOLFINx needs (presumably deducible from the FunctionSpace)?
To illustrate this, append the following code to demo_poisson.py from the Poisson equation — DOLFINx 0.7.3 documentation:
```python
from mpi4py import MPI
from petsc4py import PETSc
from dolfinx import fem
from dolfinx.fem import form
from dolfinx.fem.petsc import assemble_matrix, assemble_vector

# Build matrix A from the bilinear form
A = assemble_matrix(form(a), bcs=[bc])
A.assemble()
x = uh.vector
b = assemble_vector(form(L))
Ax = fem.Function(V)
A.mult(x, Ax.vector)  # matrix-vector multiplication OK

# Hand-made matrix with the same global size
size_global = V.dofmap.index_map.size_global
D = PETSc.Mat().create(MPI.COMM_WORLD)
D.setSizes([size_global, size_global])
D.setUp()
# D.setValues...
D.assemble()
D.mult(x, Ax.vector)  # matrix-vector multiplication fails
```
Here `A` is the matrix assembled from the weak form and `D` is my hand-made matrix. They have the same global size, but `D.mult()` fails in a parallel run. If we add the following print and run with `mpirun -n 2 python ...`:
```python
print(f"rank: {MPI.COMM_WORLD.rank}\n"
      f"A.getSizes():\n{A.getSizes()}\n"
      f"D.getSizes():\n{D.getSizes()}\n")
```
it returns
```
rank: 0
A.getSizes():
((276, 561), (276, 561))
D.getSizes():
((281, 561), (281, 561))

rank: 1
A.getSizes():
((285, 561), (285, 561))
D.getSizes():
((280, 561), (280, 561))
```
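The local sizes of `D` look like PETSc's default row layout (when only the global size is given, my understanding is that the first `n % p` ranks each own one extra row), while `A` follows the mesh partitioning DOLFINx computed. A quick check of that guess in plain Python:

```python
def petsc_default_split(n_global, nprocs):
    """PETSc's default ownership layout (my understanding): the first
    n_global % nprocs ranks each own one extra row."""
    base, extra = divmod(n_global, nprocs)
    return [base + (1 if rank < extra else 0) for rank in range(nprocs)]

print(petsc_default_split(561, 2))  # -> [281, 280], matching D's local sizes above
```

Both partitions sum to the same global size of 561, which is why the sizes only disagree locally and the error appears only in parallel.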