HDF5File mesh not working in parallel: Unable to access vector of degrees of freedom

Hello, I am having trouble running my code in parallel when the mesh is read from an HDF5 (.h5) file. In serial, e.g. mpirun -n 1 python3 test.py, it runs fine. However, mpirun -n 4 python3 test.py fails with the error:

solver_disp.solve()

RuntimeError:

*** -------------------------------------------------------------------------
*** DOLFIN encountered an error. If you are not able to resolve this issue
*** using the information listed below, you can ask for help at
***
***     fenics-support@googlegroups.com
***
*** Remember to include the error message listed below and, if possible,
*** include a minimal running example to reproduce the error.
*** -------------------------------------------------------------------------
*** Error:   Unable to access vector of degrees of freedom.
*** Reason:  Cannot access a non-const vector from a subfunction.
*** Where:   This error was encountered inside Function.cpp.
*** Process: 3
***
*** DOLFIN version: 2019.2.0.dev0
*** Git changeset:  ubuntu
*** -------------------------------------------------------------------------

The mesh-reading part of the code is:

mesh = Mesh()
hdf = HDF5File(MPI.comm_world, "file.h5", "r")  # or mesh.mpi_comm()
hdf.read(mesh, "/mesh", False)
cd = MeshFunction("size_t", mesh, mesh.topology().dim())
hdf.read(cd, "/cd")
fd = MeshFunction("size_t", mesh, mesh.topology().dim() - 1)
hdf.read(fd, "/fd")

When I add the line:

MeshPartitioning.build_distributed_mesh(mesh)

it does run, but the processes do not seem to be communicating with each other, and the mesh ends up strangely fragmented. I'll post an image of it.

My mesh is a cube with a sphere inside it. The cube region is marked with label 2 and the sphere with label 1.

The code and the files are in GitHub - DeisonPreve/HDF5File-in-parallel

Thanks in advance.

We had a similar issue when reading in a mesh that had some nodes that were not part of any cell.
Could you post the output of

from dolfin import *
import numpy as np

mesh = Mesh(MPI.comm_world)
with XDMFFile("path/to/mesh.xdmf") as f:
    f.read(mesh)

# Global number of vertices in the mesh
print(mesh.num_entities_global(0))

# For P1 Lagrange there is one dof per vertex, so these should match
V = FunctionSpace(mesh, "Lagrange", 1)
uV = Function(V)
print("Function:", V.dofmap().global_dimension(), uV.vector().size())

# Collect every vertex that is referenced by at least one cell
mesh.init(3, 0)
c_to_v = mesh.topology()(3, 0)
vertices = []
for cell in cells(mesh):
    vertices.append(c_to_v(cell.index()))
all_vertices_local = np.unique(np.hstack(vertices))
print(len(all_vertices_local))

in serial?
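(For reference, the vertex check above can be mimicked offline with plain NumPy; the connectivity array below is a made-up toy example, not your mesh:)

```python
import numpy as np

# Hypothetical tetrahedral connectivity: two cells over 6 vertices,
# where vertex 5 is deliberately not referenced by any cell.
num_vertices = 6
cells_array = np.array([[0, 1, 2, 3],
                        [1, 2, 3, 4]])

# Vertices that appear in at least one cell
referenced = np.unique(cells_array)
# Vertices that belong to no cell at all
orphans = np.setdiff1d(np.arange(num_vertices), referenced)

print(len(referenced), "of", num_vertices, "vertices appear in cells")
print("orphan vertices:", orphans)  # -> [5]
```

If the last count printed by the dolfin script is smaller than the global vertex count, the mesh contains such orphan vertices.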

It returned:

11307
Function: 11307 11307
11307
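For a P1 Lagrange space every dof sits on a vertex, so equality of the three counts suggests no vertex is orphaned; as a quick sanity check on the numbers above:

```python
num_global_vertices = 11307   # mesh.num_entities_global(0)
dofmap_dim = 11307            # V.dofmap().global_dimension()
vertices_in_cells = 11307     # vertices reachable through cells

# If some vertex belonged to no cell, the last count would be smaller
# than the other two.
print(num_global_vertices == dofmap_dim == vertices_in_cells)  # True
```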

Hello, have you been able to find anything?
Thanks.