Problem reading a mesh in parallel

Hello,
I get an error when reading a mesh in parallel that I have not been able to fix. Here is the code:

from dolfin import *

mesh = UnitCubeMesh.create(1,1,1,CellType.Type.hexahedron)

# define left face
class Left(SubDomain):
    def inside(self, x, on_boundary):
        return abs(x[0] - 0.0) < DOLFIN_EPS and on_boundary

left = Left()

boundaries = MeshFunction('size_t', mesh, mesh.topology().dim()-1)
subdomains = MeshFunction('size_t', mesh, mesh.topology().dim())

# marking boundary
boundaries.set_all(0)
left.mark(boundaries, 1)

hdf = HDF5File(mesh.mpi_comm(), "file.h5", "w")
hdf.write(mesh, "/mesh")
hdf.write(subdomains, "/subdomains")
hdf.write(boundaries, "/boundaries")

So far so good, but when I try to read it back:

mesh = Mesh()
hdf = HDF5File(mesh.mpi_comm(), "file.h5", "r")
hdf.read(mesh, "/mesh", False)
subdomains = MeshFunction('size_t', mesh, mesh.topology().dim())

hdf.read(subdomains, "/subdomains")
boundaries = MeshFunction('size_t', mesh, mesh.topology().dim()-1)
hdf.read(boundaries, "/boundaries")

I get this error:

*** Error:   Unable to read meshfunction topology.
*** Reason:  Cell dimension mismatch.

The thing is, it only happens with hexahedron elements. For example, it works fine if I use:

CellType.Type.tetrahedron

But I want to use hexahedron elements. Does anybody know how this could be fixed?

Hi,

Most likely there is no hexahedron support in HDF5File. However, you can use XDMFFile instead (although to visualize the boundary data in ParaView you will need to write it to a separate file).
A minimal example using one file is shown below:

from dolfin import *

mesh = UnitCubeMesh.create(1,1,1,CellType.Type.hexahedron)

# define left face
class Left(SubDomain):
    def inside(self, x, on_boundary):
        return abs(x[0] - 0.0) < DOLFIN_EPS and on_boundary

left = Left()

boundaries = MeshFunction('size_t', mesh, mesh.topology().dim()-1)
subdomains = MeshFunction('size_t', mesh, mesh.topology().dim())

# marking boundary
boundaries.set_all(0)
left.mark(boundaries, 1)

xdmf = XDMFFile(mesh.mpi_comm(), "file.xdmf")
xdmf.write(mesh)
xdmf.write(subdomains)
xdmf = XDMFFile(mesh.mpi_comm(), "file.xdmf")
xdmf.write(boundaries)

xdmf.close()

mesh = Mesh()
xdmf = XDMFFile(mesh.mpi_comm(), "file.xdmf")
xdmf.read(mesh)

mvc = MeshValueCollection("size_t", mesh, 2)
with XDMFFile("file.xdmf") as infile:
    infile.read(mvc, "f")
mf = cpp.mesh.MeshFunctionSizet(mesh, mvc)

mvc2 = MeshValueCollection("size_t", mesh, 1)
with XDMFFile("file.xdmf") as infile:
    infile.read(mvc2, "f")
mf2 = cpp.mesh.MeshFunctionSizet(mesh, mvc2)

Thanks for your response. I am not familiar with the XDMFFile class, as I would normally use the HDF5File class, but as you said it does not support hexahedron elements.

The solution you suggested produces two files: file.h5 and file.xdmf.
Now I want to read the mesh in parallel and visualize the marked boundary (e.g. the left face). I did this:

from dolfin import *

mesh = Mesh()
xdmf = XDMFFile(mesh.mpi_comm(), "file.xdmf")
xdmf.read(mesh)

subdomains = MeshFunction('size_t', mesh, mesh.topology().dim())
xdmf.read(subdomains, "/subdomains")

boundaries = MeshFunction('size_t', mesh, mesh.topology().dim()-1)
xdmf.read(boundaries, "/boundaries")

# I want to see the marked face in Paraview
File("marked face.pvd") << boundaries

However, it does not work and returns an error. Do you know how I can fix it?
Thanks!

So XDMF saves data to two files: the .xdmf file and the .h5 file. The .xdmf file can be thought of as the header, describing the large data sets stored in the .h5 file.
As I explained above, you can't use the same syntax for HDF5File and XDMFFile.
Below I've added code saving the boundary data to a separate file. Note that you cannot choose the name of a mesh function; it is saved as f and has to be loaded as f.

from dolfin import *

mesh = UnitCubeMesh.create(1,1,1,CellType.Type.hexahedron)

# define left face
class Left(SubDomain):
    def inside(self, x, on_boundary):
        return abs(x[0] - 0.0) < DOLFIN_EPS and on_boundary

left = Left()

boundaries = MeshFunction('size_t', mesh, mesh.topology().dim()-1)
subdomains = MeshFunction('size_t', mesh, mesh.topology().dim())

# marking boundary
boundaries.set_all(0)
left.mark(boundaries, 1)

xdmf = XDMFFile(mesh.mpi_comm(), "file.xdmf")
xdmf.write(mesh)
xdmf.write(subdomains)
xdmf = XDMFFile(mesh.mpi_comm(), "boundaries.xdmf")
xdmf.write(boundaries)

xdmf.close()

mesh = Mesh()
xdmf = XDMFFile(mesh.mpi_comm(), "file.xdmf")
xdmf.read(mesh)

mvc = MeshValueCollection("size_t", mesh, 2)
with XDMFFile("file.xdmf") as infile:
    infile.read(mvc, "f")
mf = cpp.mesh.MeshFunctionSizet(mesh, mvc)
xdmf.close()

mvc2 = MeshValueCollection("size_t", mesh, 1)
with XDMFFile("boundaries.xdmf") as infile:
    infile.read(mvc2, "f")
mf2 = cpp.mesh.MeshFunctionSizet(mesh, mvc2)
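To make the header/data split concrete, here is a stripped-down, hypothetical sketch of what such an .xdmf header can look like, parsed with the standard library. The XML below is illustrative only (element names follow the XDMF format, but a file written by DOLFIN will differ in detail); the point is that the Attribute holding the mesh function is named f, while the heavy arrays live in the companion .h5 file:

```python
# Illustrative XDMF "header": light XML describing heavy data in an .h5 file.
# Hypothetical contents -- not generated by DOLFIN, paths are made up.
import xml.etree.ElementTree as ET

xdmf_header = """<?xml version="1.0"?>
<Xdmf Version="3.0">
  <Domain>
    <Grid Name="mesh" GridType="Uniform">
      <Topology TopologyType="Hexahedron" NumberOfElements="1">
        <DataItem Format="HDF">boundaries.h5:/Mesh/mesh/topology</DataItem>
      </Topology>
      <Geometry GeometryType="XYZ">
        <DataItem Format="HDF">boundaries.h5:/Mesh/mesh/geometry</DataItem>
      </Geometry>
      <Attribute Name="f" AttributeType="Scalar" Center="Cell">
        <DataItem Format="HDF">boundaries.h5:/MeshFunction/0/values</DataItem>
      </Attribute>
    </Grid>
  </Domain>
</Xdmf>"""

root = ET.fromstring(xdmf_header)
attribute = root.find(".//Attribute")
print(attribute.get("Name"))            # the fixed mesh-function name: f
print(attribute.find("DataItem").text)  # where the values actually live
```

This is why `infile.read(mvc, "f")` always uses the name "f": it is the attribute name in the header, not something you chose when writing.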

Thanks for your explanation. Based on your code, the mesh and marked boundary can be visualized in ParaView by:

File("mesh.pvd") << mf  
File("boundaries.pvd") << mf2

For the same mesh and marked boundary, I tried to run a simple linear elasticity problem in parallel. Here is the code:

from dolfin import *

mu = 1
lmbda = 1.25

mesh = Mesh()
xdmf = XDMFFile(mesh.mpi_comm(), "file.xdmf")
xdmf.read(mesh)

mvc = MeshValueCollection("size_t", mesh, 2)
with XDMFFile("file.xdmf") as infile:
    infile.read(mvc, "f")

subdomains = cpp.mesh.MeshFunctionSizet(mesh, mvc)

mvc2 = MeshValueCollection("size_t", mesh, 1)
with XDMFFile("boundaries.xdmf") as infile:
    infile.read(mvc2, "f")

boundaries = cpp.mesh.MeshFunctionSizet(mesh, mvc2)

V = VectorFunctionSpace(mesh, 'P', 1)

bc = DirichletBC(V, Constant((0, 0, 0)), boundaries, 1)

def epsilon(u):
    return 0.5*(nabla_grad(u) + nabla_grad(u).T)

def sigma(u):
    return lmbda * tr(epsilon(u)) * Identity(d) + 2.0 * mu * epsilon(u)

# Define variational problem
u = TrialFunction(V)
d = u.geometric_dimension()  # space dimension
v = TestFunction(V)
f = Constant((0, 0, 0))

a = inner(sigma(u), epsilon(v))*dx
L = dot(f, v)*dx

u = Function(V)
solve(a == L, u, bc)

When I run the above code in serial it works fine, but when I run it in parallel with:

mpirun -np 4 python3 elasticity.py

It returns this error:

*** Error:   Unable to create mesh entity.
*** Reason:  Mesh entity index 0 out of range [0, 0] for entity of dimension 2.
*** Where:   This error was encountered inside MeshEntity.cpp.
*** Process: 1
*** 
*** DOLFIN version: 2019.1.0

Do you know why this happens?

Well, this is because you cannot partition a mesh of only one element.
You need to use a bigger mesh, for instance a 4x4x4 unit cube.
Also, I spot an error in the code (might have been my mistake).
The mesh value collections should have the following dimensions:


mvc = MeshValueCollection("size_t", mesh, 3)
with XDMFFile("file.xdmf") as infile:
    infile.read(mvc, "f")

subdomains = cpp.mesh.MeshFunctionSizet(mesh, mvc)

mvc2 = MeshValueCollection("size_t", mesh, 2)
with XDMFFile("boundaries.xdmf") as infile:
    infile.read(mvc2, "f")

both when you read and write them: (3 and 2), not (2 and 1).

Thanks. It makes sense now.