Strange behavior with boundary conditions and PETSc solvers in dolfin 2019.2

I have been getting some strange errors lately which I cannot fully understand. I have reduced them to an MWE: a rectangular 2D mesh with a circular hole in the middle, over which I solve a standard Poisson problem.

The mesh is generated with pygmsh 7.1.5; according to pip freeze, meshio is 4.3.8 and gmsh is 4.7.1.

In my application I want to generate the mesh on one process, write it to an XDMF file and read it back so that it is decomposed among all processes when running in parallel. The example is the following; most of it is just the definition of the mesh and boundary domains:

from dolfin import *
import gmsh
import pygmsh
import meshio

# generate the mesh
gmsh.initialize()
if MPI.rank(MPI.comm_world) == 0:
    with pygmsh.geo.Geometry() as geom:
        ci = geom.add_circle([0., 0.], 0.5, mesh_size=0.05, make_surface=False)
        rec = geom.add_rectangle(xmin=-2., xmax=2., ymin=-2., ymax=2., z=0.0, mesh_size=0.1, holes=[ci.curve_loop])
        msh = geom.generate_mesh(dim=2)
    msh.remove_lower_dimensional_cells()
    msh.prune_z_0()
    meshio.write('test.xdmf', msh)
else:
    pass
MPI.barrier(MPI.comm_world)     # WAIT FOR PROCESS 0 TO FINISH MESHING
# after writing to file, read the mesh back in with dolfin's parallel XDMF interface
mesh = Mesh(MPI.comm_world)
with XDMFFile('test.xdmf') as infile:
    infile.read(mesh)
# define domains
class Circle(SubDomain):
    def inside(self, x, on_boundary):
        RR = (x[0]*x[0]+x[1]*x[1])**0.5
        return (RR < 1.) and on_boundary
class Out(SubDomain):
    def inside(self, x, on_boundary):
        return on_boundary

domains = MeshFunction("size_t", mesh, mesh.topology().dim() - 1)
domains.set_all(0)
Out().mark(domains, 2)
Circle().mark(domains, 1)

# now solve a simple variational problem with PETSc
V = FunctionSpace(mesh, "Lagrange", 1)
bcs = [DirichletBC(V, Constant(1.0), domains, 1), DirichletBC(V, Constant(0.0), domains, 2)]
u = TrialFunction(V)
v = TestFunction(V)
a = inner(grad(u), grad(v))*dx
L = Constant(0.0)*v*dx

sol = PETScKrylovSolver('gmres')
A, bb = assemble_system(a, L, bcs)
# solve
u = Function(V)
sol.solve(A, u.vector(), bb)

with XDMFFile('sol.xdmf') as outfile:
    outfile.write(u)

Now the problem is that if I run it on one process, python3 test.py, I get the following error message:

Traceback (most recent call last):
  File "test.py", line 54, in <module>
    sol.solve(A, u.vector(), bb)
RuntimeError: 

*** -------------------------------------------------------------------------
*** DOLFIN encountered an error. If you are not able to resolve this issue
*** using the information listed below, you can ask for help at
***
***     fenics-support@googlegroups.com
***
*** Remember to include the error message listed below and, if possible,
*** include a *minimal* running example to reproduce the error.
***
*** -------------------------------------------------------------------------
*** Error:   Unable to successfully call PETSc function 'KSPSolve'.
*** Reason:  PETSc error code is: 73 (Object is in wrong state).
*** Where:   This error was encountered inside /build/dolfin-GfTndI/dolfin-2019.2.0~git20201207.b495043/dolfin/la/PETScKrylovSolver.cpp.
*** Process: 0
*** 
*** DOLFIN version: 2019.2.0.dev0
*** Git changeset:  unknown
*** -------------------------------------------------------------------------

If instead I run it in parallel, mpirun -np 2 python3 test.py, the code runs fine and writes the expected solution to file. I guess something is wrong with the mesh reading process, since if I change the code slightly I get another error, this time in parallel. Just by applying the boundary condition explicitly:

V = FunctionSpace(mesh, "Lagrange", 1)
bc1 = DirichletBC(V, Constant(1.0), domains, 1)
u = TrialFunction(V)
v = TestFunction(V)
a = inner(grad(u), grad(v))*dx

A = assemble(a)
bc1.apply(A)

(I repeated only the relevant part; the mesh generation and reading are the same as above.) Now, running in parallel gives the following error:

Traceback (most recent call last):
  File "test.py", line 51, in <module>
    bc1.apply(A)
RuntimeError: 

*** -------------------------------------------------------------------------
*** DOLFIN encountered an error. If you are not able to resolve this issue
*** using the information listed below, you can ask for help at
***
***     fenics-support@googlegroups.com
***
*** Remember to include the error message listed below and, if possible,
*** include a *minimal* running example to reproduce the error.
***
*** -------------------------------------------------------------------------
*** Error:   Unable to apply boundary condition.
*** Reason:  Dimension of function space (2908) too large for application of boundary conditions to linear system (2907 rows).
*** Where:   This error was encountered inside BoundaryCondition.cpp.
*** Process: 0
*** 
*** DOLFIN version: 2019.2.0.dev0
*** Git changeset:  unknown
*** -------------------------------------------------------------------------

while in serial it works. Any suggestions as to what might be going on here?

This is probably because pygmsh adds non-physical points to your mesh (for instance the center point when generating a circle). These unused points should not be part of your mesh. The easiest way to spot them is to use physical tags with pygmsh; the non-physical points should then be at the end of the msh.points array.
See for instance:
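
Roughly, the conversion could look like this (an untested sketch against pygmsh 7.x / meshio 4.x; the 'domain' physical label and the index remapping below are only illustrative):

import numpy as np
import pygmsh
import meshio

with pygmsh.geo.Geometry() as geom:
    ci = geom.add_circle([0., 0.], 0.5, mesh_size=0.05, make_surface=False)
    rec = geom.add_rectangle(xmin=-2., xmax=2., ymin=-2., ymax=2., z=0.0, mesh_size=0.1, holes=[ci.curve_loop])
    geom.add_physical([rec.surface], 'domain')   # physical tag on the surface (label is arbitrary)
    msh = geom.generate_mesh(dim=2)

# keep only the triangle cells and the points they actually reference
cells = msh.get_cells_type('triangle')
used = np.unique(cells.reshape(-1))
points = msh.points[used][:, :2]                 # also drop the z column
# remap the connectivity onto the reduced point array
remap = np.full(len(msh.points), -1, dtype=cells.dtype)
remap[used] = np.arange(len(used), dtype=cells.dtype)
meshio.write('test.xdmf', meshio.Mesh(points=points, cells=[('triangle', remap[cells])]))

Writing only the triangle cells and the points they reference keeps the orphan circle-center node out of the XDMF file, so the function space and matrix dimensions match again.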


Thanks @dokken,

indeed, if I take this snippet from the code in your repository

import numpy as np

cells = msh.get_cells_type('triangle')
num_vertices = len(np.unique(cells.reshape(-1)))
points = msh.points[:num_vertices]

it turns out that points contained one extra point at the end, which should be the center of the circular boundary in my mesh and needs to be removed.
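
For completeness, writing the pruned mesh back to XDMF then looks roughly like this (an untested sketch; as you pointed out, it relies on the unused point sitting at the end of msh.points, and msh is the meshio mesh returned by geom.generate_mesh above):

import numpy as np
import meshio

# msh is the meshio mesh returned by geom.generate_mesh(dim=2)
cells = msh.get_cells_type('triangle')
num_vertices = len(np.unique(cells.reshape(-1)))
points = msh.points[:num_vertices, :2]   # drop the trailing orphan point and the z column
meshio.write('test.xdmf', meshio.Mesh(points=points, cells=[('triangle', cells)]))

This is also consistent with the off-by-one in the parallel error above (function space of dimension 2908 vs. a matrix with 2907 rows), i.e. one extra, unused vertex.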