Failed to run Poisson demo case with DOLFINx version 0.7.0

I tried the Poisson demo case https://docs.fenicsproject.org/dolfinx/v0.7.0.post0/python/demos/demo_poisson.html and got the following error:

(0): ERROR: SCOTCH_dgraphInit: Scotch compiled with SCOTCH_PTHREAD and program not launched with MPI_THREAD_MULTIPLE
Traceback (most recent call last):
  File ***** line 82, in <module>
    msh = mesh.create_rectangle(comm=MPI.COMM_WORLD,
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "anaconda3/envs/fenicsx07/lib/python3.11/site-packages/dolfinx/mesh.py", line 542, in create_rectangle
    mesh = _cpp.mesh.create_rectangle_float64(comm, points, n, cell_type, partitioner, diagonal)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
RuntimeError: ParMETIS_V3_PartKway failed. Error code: -4

My DOLFINx version is 0.7.0, installed with

conda install -c conda-forge fenics-dolfinx

The error above occurs when running with multiple processes; a serial run works fine.

How many processes are you trying to run the example with?

I tried 2, 3, and 4 processes, e.g.

mpirun -n 2 python demo_poisson.py

and all of them hit the same error.

I also installed dolfinx 0.7.0 in a conda environment, and when I ran the Cahn-Hilliard equation tutorial (link) in parallel, the same error occurred.
I suspect the ParMETIS linking does not work properly when the mesh is partitioned. :thinking:

I have the same problem when running any code in parallel with dolfinx 0.7.0 installed through conda.

An MWE:

from dolfinx import mesh
from mpi4py import MPI
domain = mesh.create_unit_square(MPI.COMM_WORLD, 20, 20)

This runs fine with one core but leads to the following error when run in parallel with mpirun -np 2 python3 test_parallel.py:

(0): ERROR: SCOTCH_dgraphInit: Scotch compiled with SCOTCH_PTHREAD and program not launched with MPI_THREAD_MULTIPLE
Traceback (most recent call last):
  File "/home/nav/navScripts/fenicsx/vertical-soap/test_parallel.py", line 4, in <module>
    domain = mesh.create_unit_square(MPI.COMM_WORLD, 20, 20)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/nav/miniconda3/envs/fox07/lib/python3.11/site-packages/dolfinx/mesh.py", line 571, in create_unit_square
    return create_rectangle(comm, [np.array([0.0, 0.0]), np.array([1.0, 1.0])],
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/nav/miniconda3/envs/fox07/lib/python3.11/site-packages/dolfinx/mesh.py", line 542, in create_rectangle
    mesh = _cpp.mesh.create_rectangle_float64(comm, points, n, cell_type, partitioner, diagonal)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
RuntimeError: ParMETIS_V3_PartKway failed. Error code: -4

Everything works fine with dolfinx 0.6.0 installed through conda, so I’ll return to using that for now.
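
Since the SCOTCH message complains that the program was not launched with MPI_THREAD_MULTIPLE, one quick diagnostic (a minimal sketch, assuming only mpi4py) is to print the thread support level the MPI runtime actually provides:

# Print the MPI thread support level actually provided by the runtime.
# MPI.Query_thread() returns one of MPI.THREAD_SINGLE, MPI.THREAD_FUNNELED,
# MPI.THREAD_SERIALIZED or MPI.THREAD_MULTIPLE.
from mpi4py import MPI

provided = MPI.Query_thread()
print(f"provided: {provided}, THREAD_MULTIPLE: {MPI.THREAD_MULTIPLE}")

If the printed level is below MPI.THREAD_MULTIPLE, the SCOTCH error above is expected.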

@dokken, has this problem been resolved? :thinking:

Make sure to import mpi4py before importing dolfinx. This ensures that MPI is correctly initialised.
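
Applied to the MWE above, a minimal sketch of the suggested ordering (the mpi4py.rc line additionally requests MPI_THREAD_MULTIPLE at initialisation; whether your MPI build honours that is an assumption, not part of the official advice):

# Import mpi4py before dolfinx so MPI is initialised first.
# Optionally request full thread support, which the SCOTCH error expects
# (assumption: the underlying MPI implementation supports it).
import mpi4py
mpi4py.rc.thread_level = "multiple"
from mpi4py import MPI

from dolfinx import mesh

domain = mesh.create_unit_square(MPI.COMM_WORLD, 20, 20)
print(f"rank {MPI.COMM_WORLD.rank}: owned cells = "
      f"{domain.topology.index_map(2).size_local}")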

This was fixed properly in v0.7.1, which is the latest version available on conda. Release v0.7.1 · FEniCS/dolfinx · GitHub
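
For conda users that means updating in place (a sketch; the explicit version pin is an assumption, a plain update of the environment should also work):

conda install -c conda-forge fenics-dolfinx=0.7.1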
