Consider the following, where I assemble a vector using the standard assemble and then cell-by-cell using assemble_local. The form being assembled is defined on the exterior boundary only.
from petsc4py import PETSc
from dolfin import *
# parameters["ghost_mode"] = "shared_facet"
mesh = UnitSquareMesh(1, 1)
V = FunctionSpace(mesh, 'CG', 1)
v = TestFunction(V)
form = Constant(1.0)*v*ds
# -- Standard vector assembly
assemble(form).str(True)
# -- Initialise empty vector
b = PETScVector()
assemble(Constant(0.0)*v*dx,
         tensor=b, finalize_tensor=False)
# -- Iterate over cells and assemble using local element vector
dm = Form(form).function_space(0).dofmap()
for c in cells(mesh):
    be = assemble_local(form, c)
    b.vec().setValuesLocal(dm.cell_dofs(c.index()), be,
                           addv=PETSc.InsertMode.ADD_VALUES)
b.vec().assemble()
b.str(True)
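For completeness, I launch the script on 2 processes in the usual way, something like the following (the file name here is just a placeholder for wherever the code above is saved):

mpirun -np 2 python3 local_boundary_assembly.py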
Running with 2 MPI processes, I notice that with parameters["ghost_mode"] = "none" I get a mismatch:
Vec Object: 2 MPI processes
  type: mpi
Process [0]
1.
Process [1]
1.
1.
1.

Vec Object: 2 MPI processes
  type: mpi
Process [0]
1.
Process [1]
1.
2.41421
2.41421
And with parameters["ghost_mode"] = "shared_facet" I get the expected result:
Vec Object: 2 MPI processes
  type: mpi
Process [0]
1.
Process [1]
1.
1.
1.

Vec Object: 2 MPI processes
  type: mpi
Process [0]
1.
Process [1]
1.
1.
1.
Is there something trivial I'm missing to get a coherent result?
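In case it helps to pin down where the extra contribution comes from, a small diagnostic sketch along these lines could be appended to the script (my own addition, not part of the output above; I am assuming Facet.exterior() reports the global exterior status once the facet-cell connectivity has been built):

mesh.init(1, 2)  # build facet-to-cell connectivity
rank = MPI.rank(mesh.mpi_comm())
for f in facets(mesh):
    # Compare how many cells the facet sees locally with whether it is
    # an exterior facet of the global mesh.
    print("rank", rank, "facet", f.index(),
          "local cells:", f.num_entities(2),
          "globally exterior:", f.exterior())

With ghost_mode = "none" I would expect the process-boundary facet to show a single local cell while not being globally exterior.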