Mesh partitioning when running in parallel

The problem in FEniCS: Hi everyone! I generate the mesh file with the Gmsh software. When running with mpirun -np 4, the fluid domain shows visible seams along the mesh partition boundaries (red circle) because of the MPI decomposition. I have tried many things, including adding parameters["ghost_mode"] = "shared_facet" (shown in the sketch after the code below), but I still cannot resolve the problem. How can I fix this? Has anyone encountered this MPI behaviour before? Please provide some Python code and help me understand the problem. How do I avoid these partitioning artefacts when running in parallel? Thank you!

Minimal Python code:

----------
from dolfin import *
from dolfin.cpp.mesh import MeshFunctionSizet

# Read the Gmsh-generated mesh from XDMF
mesh = Mesh()
with XDMFFile("mesh_65000.xdmf") as infile:
    infile.read(mesh)

# Cell (subdomain) markers
mvc = MeshValueCollection("size_t", mesh, 3)
with XDMFFile("mesh_65000.xdmf") as infile:
    infile.read(mvc, "name_to_read")
markers = MeshFunctionSizet(mesh, mvc)

# Facet (boundary) markers
mvc2 = MeshValueCollection("size_t", mesh, 2)
with XDMFFile("boundaries_65000.xdmf") as infile:
    infile.read(mvc2, "name_to_read")
boundaries = MeshFunctionSizet(mesh, mvc2)

# k is the element degree, defined earlier in the full script
QUAD_DEG = 2*k
dx = dx(metadata={"quadrature_degree": QUAD_DEG})
ds = Measure("ds", domain=mesh, subdomain_data=boundaries)
-------------
# Define boundary conditions.
# V is the mixed velocity-pressure function space and inflow_profile the
# inflow velocity expression; both are defined elsewhere in the full script.
bcu_inflow = DirichletBC(V.sub(0), Expression(inflow_profile, degree=1), boundaries, 1)
bcu_walls_top = DirichletBC(V.sub(0).sub(1), Constant(0), boundaries, 3)
bcu_walls_bottom = DirichletBC(V.sub(0).sub(1), Constant(0), boundaries, 4)
bcu_walls2 = DirichletBC(V.sub(0).sub(2), Constant(0), boundaries, 5)
bcu_denticles = DirichletBC(V.sub(0), Constant((0, 0, 0)), boundaries, 6)
bcp_outflow = DirichletBC(V.sub(1), Constant(0), boundaries, 2)
bcs = [bcu_inflow, bcu_walls_top, bcu_walls_bottom, bcu_walls2, bcu_denticles, bcp_outflow]
----------
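
For reference, this is how I applied the ghost-mode setting mentioned above; as far as I understand, it has to be set before the mesh is read, otherwise it has no effect:

----------
from dolfin import *

# Ghost mode must be chosen before the mesh is created/partitioned
parameters["ghost_mode"] = "shared_facet"

mesh = Mesh()
with XDMFFile("mesh_65000.xdmf") as infile:
    infile.read(mesh)
----------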

This just looks like a visualisation issue with the .pvd file type and the VTU backend: in parallel, one .vtu piece is written per process, so you may simply be seeing the exterior boundaries of each process's part of the mesh overlapping. What happens if you save as XDMF instead?
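
Something along these lines; a minimal sketch assuming legacy FEniCS (DOLFIN 2019.1), where u_ stands for your velocity Function and t for the current time (substitute whatever names your script uses):

----------
from dolfin import *

# In parallel, PVD output writes one .vtu piece per process; XDMF writes a
# single .xdmf/.h5 pair, so ParaView shows one continuous field without the
# per-process boundary artefacts.
xdmf_u = XDMFFile(MPI.comm_world, "velocity.xdmf")
xdmf_u.parameters["flush_output"] = True
xdmf_u.parameters["functions_share_mesh"] = True

# Inside the time loop, instead of File("velocity.pvd") << u_:
xdmf_u.write(u_, t)
----------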
