Problem with 3D in parallel

Dear all, I convert my mesh to an HDF5 file so that I can run the simulation in parallel with MPI, but the result looks wrong.

My conversion script, HDF5.py:

from dolfin import *
from mpi4py import MPI as mpi

# Read the legacy XML mesh together with its cell and facet markers
mesh = Mesh("New3D_Denticles61000.xml")
markers = MeshFunction("size_t", mesh, "New3D_Denticles61000_physical_region.xml")
boundaries = MeshFunction("size_t", mesh, "New3D_Denticles61000_facet_region.xml")

# Write everything into one HDF5 file so it can be read in parallel
hdf = HDF5File(MPI.comm_world, "2New3D_Denticles61000.h5", "w")
hdf.write(mesh, "/mesh")
hdf.write(markers, "/markers")
hdf.write(boundaries, "/boundaries")
hdf.close()
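
As a quick sanity check (just a suggestion, assuming the filenames above), the converted file can be read back and the cell count printed on each rank:

from dolfin import *

# Read the mesh back from the HDF5 file written above
mesh = Mesh()
hdf = HDF5File(MPI.comm_world, "2New3D_Denticles61000.h5", "r")
hdf.read(mesh, "/mesh", False)
hdf.close()
print("rank", MPI.rank(MPI.comm_world), "owns", mesh.num_cells(), "cells")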

parallel.py (minimal file):

from dolfin import *
from mshr import *
import numpy as np
from mpi4py import MPI as mpi

# k is the element degree, defined elsewhere in the full script
QUAD_DEG = 2*k
dx = dx(metadata={"quadrature_degree": QUAD_DEG})

# Read the mesh and markers back from HDF5 (note: this filename differs from
# the "2New3D_Denticles61000.h5" written by HDF5.py above)
mesh = Mesh()
hdf = HDF5File(MPI.comm_world, "New3D_Denticles61000.h5", "r")
hdf.read(mesh, "/mesh", False)
markers = MeshFunction("size_t", mesh, mesh.topology().dim())
hdf.read(markers, "/markers")
boundaries = MeshFunction("size_t", mesh, mesh.topology().dim()-1)
hdf.read(boundaries, "/boundaries")

# Define boundary conditions (V and inflow_profile come from the full script)
bcu_inflow = DirichletBC(V.sub(0), Expression(inflow_profile, degree=1), boundaries, 1)
bcu_walls = DirichletBC(V.sub(0), Constant((0, 0, 0)), boundaries, 3)
bcu_walls2 = DirichletBC(V.sub(0).sub(2), Constant(0), boundaries, 4)
bcu_denticles = DirichletBC(V.sub(0), Constant((0, 0, 0)), boundaries, 5)
bcp_outflow = DirichletBC(V.sub(1), Constant(0), boundaries, 2)
bcs = [bcu_inflow, bcu_walls, bcu_walls2, bcu_denticles, bcp_outflow]
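
For context, V and inflow_profile are not defined in the minimal file above. The sketch below shows one plausible way they could be set up for a 3D mixed velocity-pressure problem; the element degrees and the inflow values are assumptions, not taken from the original script:

# Hypothetical definitions, only to make the boundary-condition snippet self-contained
k = 2                                                 # assumed velocity degree
V_el = VectorElement("CG", mesh.ufl_cell(), k)        # velocity element
Q_el = FiniteElement("CG", mesh.ufl_cell(), k - 1)    # pressure element
V = FunctionSpace(mesh, MixedElement([V_el, Q_el]))   # mixed (u, p) space

# Assumed inflow profile: uniform flow in x, zero in y and z (placeholder values)
inflow_profile = ("1.0", "0.0", "0.0")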

I run the script with:

mpirun -np 8 python3 file.py

The result (the region inside the red circle looks wrong):

My question is: how can I fix this problem? Thanks for any help!

You need to provide a minimal working example that reproduces the problem. See: Read before posting: How do I get my question answered? - #4 by nate for a list of examples following our guidelines.

To me it looks like you are visualizing the partitioning surfaces (each local part of the mesh has an interior boundary to other processes), which is simply a post-processing issue.
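
For example, one common fix is to write the computed field to an XDMF file and open that in ParaView, rather than plotting each process's data directly. A minimal sketch, assuming u_ is the computed velocity Function from the full script:

from dolfin import *

# Write the (assumed) velocity Function u_ to XDMF for visualization in ParaView
xdmf = XDMFFile(MPI.comm_world, "velocity.xdmf")
xdmf.parameters["flush_output"] = True
xdmf.write(u_, 0.0)  # ParaView then shows one continuous field, not the per-process partitions
xdmf.close()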