Can't create mesh in parallel

Hello,

When running the following very simple script in parallel (mpirun -np 2 python3 my_script.py):

from mpi4py import MPI
from dolfinx import mesh

domain = mesh.create_unit_square(MPI.COMM_WORLD, 8, 8, mesh.CellType.quadrilateral)

I get the following error:


[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see https://petsc.org/release/faq/#valgrind and https://petsc.org/release/faq/
[0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
[0]PETSC ERROR: to get more information on the crash.
[0]PETSC ERROR: Run with -malloc_debug to check if memory corruption is causing the crash.
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 59.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------

Could you please help me find out what is going on here?

Best,

Lucas

I cannot reproduce your error with ghcr.io/fenics/dolfinx/dolfinx:v0.7.0 (using Docker).

How did you install DOLFINx and what platform are you on?
Also, what version of DOLFINx are you using, and what MPI (which version of Open MPI) do you have installed?

Thank you for your answer. I installed FEniCS from source on Linux: DOLFINx v0.7, with Open MPI v4.0.5.
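For reference, here is a small snippet to double-check the versions actually picked up at run time (a minimal sketch; it only assumes dolfinx exposes __version__ and uses mpi4py's MPI.get_vendor()):

from mpi4py import MPI
import dolfinx

# Report the DOLFINx version and the MPI implementation mpi4py was built against
print(f"DOLFINx {dolfinx.__version__}")
print("MPI vendor:", MPI.get_vendor())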

Further tests show that the problem comes from the partitioner. I took the partitioner used in the test test_asymmetric_partitioner in python/test/unit/mesh/test_mesh_partitioners.py:

import numpy as np
import dolfinx
from mpi4py import MPI
from dolfinx import mesh

def partitionerUnitTest(comm, n, m, topo):
    # Custom (asymmetric) partitioner: every rank keeps its own cells, and
    # rank 1 additionally sends a copy of each of its cells to rank 0.
    r = comm.Get_rank()
    dests = []
    offsets = [0]
    for i in range(topo.num_nodes):
        dests.append(r)
        if r == 1:
            dests.append(0)
        offsets.append(len(dests))

    dests = np.array(dests, dtype=np.int32)
    offsets = np.array(offsets, dtype=np.int32)
    return dolfinx.cpp.graph.AdjacencyList_int32(dests, offsets)

domain = mesh.create_unit_square(MPI.COMM_WORLD, 8, 8, mesh.CellType.quadrilateral, partitioner=partitionerUnitTest)

And that runs fine!
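As a quick sanity check, one can also print how the cells end up distributed over the ranks by appending the following to the script above (a minimal sketch, assuming the v0.7 index-map API, i.e. size_local and num_ghosts):

# Continuing from the script above: report owned and ghost cells per rank
tdim = domain.topology.dim
cell_map = domain.topology.index_map(tdim)
print(f"rank {MPI.COMM_WORLD.rank}: {cell_map.size_local} owned cells, {cell_map.num_ghosts} ghost cells")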

Moreover, I tried the partitioners provided by dolfinx.graph. If I use partitioner, I get the following error:

  File "/stck/fpascal/Python/Code/FENICS/Create_mesh.py", line 33, in <module>
Traceback (most recent call last):
  File "/stck/fpascal/Python/Code/FENICS/Create_mesh.py", line 33, in <module>
    domain = mesh.create_unit_square(MPI.COMM_WORLD, 8, 8, mesh.CellType.quadrilateral, partitioner=partitioner)
  File "/scratchm/fpascal/Softwares/Dist_dolfinx/lib/python3.9/site-packages/dolfinx/mesh.py", line 569, in create_unit_square
    domain = mesh.create_unit_square(MPI.COMM_WORLD, 8, 8, mesh.CellType.quadrilateral, partitioner=partitioner)
  File "/scratchm/fpascal/Softwares/Dist_dolfinx/lib/python3.9/site-packages/dolfinx/mesh.py", line 569, in create_unit_square
    return create_rectangle(comm, [np.array([0.0, 0.0]), np.array([1.0, 1.0])],
  File "/scratchm/fpascal/Softwares/Dist_dolfinx/lib/python3.9/site-packages/dolfinx/mesh.py", line 540, in create_rectangle
    return create_rectangle(comm, [np.array([0.0, 0.0]), np.array([1.0, 1.0])],
  File "/scratchm/fpascal/Softwares/Dist_dolfinx/lib/python3.9/site-packages/dolfinx/mesh.py", line 540, in create_rectangle
    mesh = _cpp.mesh.create_rectangle_float64(comm, points, n, cell_type, partitioner, diagonal)
TypeError: partitioner(): incompatible function arguments. The following argument types are supported:
    1. () -> Callable[[MPICommWrapper, int, dolfinx.cpp.graph.AdjacencyList_int64, bool], dolfinx.cpp.graph.AdjacencyList_int32]

Invoked with: <mpi4py.MPI.Intracomm object at 0x147de73b6d50>, 2, 2, <AdjacencyList> with 64 nodes
  0: [0 1 9 10 ]
  1: [1 2 10 11 ]
  2: [2 3 11 12 ]
  3: [3 4 12 13 ]
  4: [4 5 13 14 ]
  5: [5 6 14 15 ]
  6: [6 7 15 16 ]
  7: [7 8 16 17 ]
  8: [9 10 18 19 ]
  9: [10 11 19 20 ]
  10: [11 12 20 21 ]
  11: [12 13 21 22 ]
  12: [13 14 22 23 ]
  13: [14 15 23 24 ]
  14: [15 16 24 25 ]
  15: [16 17 25 26 ]
  16: [18 19 27 28 ]
  17: [19 20 28 29 ]
  18: [20 21 29 30 ]
  19: [21 22 30 31 ]
  20: [22 23 31 32 ]
  21: [23 24 32 33 ]
  22: [24 25 33 34 ]
  23: [25 26 34 35 ]
  24: [27 28 36 37 ]
  25: [28 29 37 38 ]
  26: [29 30 38 39 ]
  27: [30 31 39 40 ]
  28: [31 32 40 41 ]
  29: [32 33 41 42 ]
  30: [33 34 42 43 ]
  31: [34 35 43 44 ]
  32: [36 37 45 46 ]
  33: [37 38 46 47 ]
  34: [38 39 47 48 ]
  35: [39 40 48 49 ]
  36: [40 41 49 50 ]
  37: [41 42 50 51 ]
  38: [42 43 51 52 ]
  39: [43 44 52 53 ]
  40: [45 46 54 55 ]
  41: [46 47 55 56 ]
  42: [47 48 56 57 ]
  43: [48 49 57 58 ]
  44: [49 50 58 59 ]
  45: [50 51 59 60 ]
  46: [51 52 60 61 ]
  47: [52 53 61 62 ]
  48: [54 55 63 64 ]
  49: [55 56 64 65 ]
  50: [56 57 65 66 ]
  51: [57 58 66 67 ]
  52: [58 59 67 68 ]
  53: [59 60 68 69 ]
  54: [60 61 69 70 ]
  55: [61 62 70 71 ]
  56: [63 64 72 73 ]
  57: [64 65 73 74 ]
  58: [65 66 74 75 ]
  59: [66 67 75 76 ]
  60: [67 68 76 77 ]
  61: [68 69 77 78 ]
  62: [69 70 78 79 ]
  63: [70 71 79 80 ]

    mesh = _cpp.mesh.create_rectangle_float64(comm, points, n, cell_type, partitioner, diagonal)
TypeError: partitioner(): incompatible function arguments. The following argument types are supported:
    1. () -> Callable[[MPICommWrapper, int, dolfinx.cpp.graph.AdjacencyList_int64, bool], dolfinx.cpp.graph.AdjacencyList_int32]

Invoked with: <mpi4py.MPI.Intracomm object at 0x14fdabbead50>, 2, 2, <AdjacencyList> with 0 nodes

If I use partitioner_scotch the error is the following:

  File "/stck/fpascal/Python/Code/FENICS/Create_mesh.py", line 34, in <module>
Traceback (most recent call last):
  File "/stck/fpascal/Python/Code/FENICS/Create_mesh.py", line 34, in <module>
    domain = mesh.create_unit_square(MPI.COMM_WORLD, 8, 8, mesh.CellType.quadrilateral, partitioner=partitioner_scotch)
  File "/scratchm/fpascal/Softwares/Dist_dolfinx/lib/python3.9/site-packages/dolfinx/mesh.py", line 569, in create_unit_square
    domain = mesh.create_unit_square(MPI.COMM_WORLD, 8, 8, mesh.CellType.quadrilateral, partitioner=partitioner_scotch)
  File "/scratchm/fpascal/Softwares/Dist_dolfinx/lib/python3.9/site-packages/dolfinx/mesh.py", line 569, in create_unit_square
    return create_rectangle(comm, [np.array([0.0, 0.0]), np.array([1.0, 1.0])],
  File "/scratchm/fpascal/Softwares/Dist_dolfinx/lib/python3.9/site-packages/dolfinx/mesh.py", line 540, in create_rectangle
    return create_rectangle(comm, [np.array([0.0, 0.0]), np.array([1.0, 1.0])],
  File "/scratchm/fpascal/Softwares/Dist_dolfinx/lib/python3.9/site-packages/dolfinx/mesh.py", line 540, in create_rectangle
    mesh = _cpp.mesh.create_rectangle_float64(comm, points, n, cell_type, partitioner, diagonal)
TypeError: partitioner_scotch(): incompatible function arguments. The following argument types are supported:
    1. (imbalance: float = 0.025, seed: int = 0) -> Callable[[MPICommWrapper, int, dolfinx.cpp.graph.AdjacencyList_int64, bool], dolfinx.cpp.graph.AdjacencyList_int32]

Invoked with: <mpi4py.MPI.Intracomm object at 0x1519c795dd20>, 2, 2, <AdjacencyList> with 64 nodes
  0: [0 1 9 10 ]
  1: [1 2 10 11 ]
  2: [2 3 11 12 ]
  3: [3 4 12 13 ]
  4: [4 5 13 14 ]
  5: [5 6 14 15 ]
  6: [6 7 15 16 ]
  7: [7 8 16 17 ]
  8: [9 10 18 19 ]
  9: [10 11 19 20 ]
  10: [11 12 20 21 ]
  11: [12 13 21 22 ]
  12: [13 14 22 23 ]
  13: [14 15 23 24 ]
  14: [15 16 24 25 ]
  15: [16 17 25 26 ]
  16: [18 19 27 28 ]
  17: [19 20 28 29 ]
  18: [20 21 29 30 ]
  19: [21 22 30 31 ]
  20: [22 23 31 32 ]
  21: [23 24 32 33 ]
  22: [24 25 33 34 ]
  23: [25 26 34 35 ]
  24: [27 28 36 37 ]
  25: [28 29 37 38 ]
  26: [29 30 38 39 ]
  27: [30 31 39 40 ]
  28: [31 32 40 41 ]
  29: [32 33 41 42 ]
  30: [33 34 42 43 ]
  31: [34 35 43 44 ]
  32: [36 37 45 46 ]
  33: [37 38 46 47 ]
  34: [38 39 47 48 ]
  35: [39 40 48 49 ]
  36: [40 41 49 50 ]
  37: [41 42 50 51 ]
  38: [42 43 51 52 ]
  39: [43 44 52 53 ]
  40: [45 46 54 55 ]
  41: [46 47 55 56 ]
  42: [47 48 56 57 ]
  43: [48 49 57 58 ]
  44: [49 50 58 59 ]
  45: [50 51 59 60 ]
  46: [51 52 60 61 ]
  47: [52 53 61 62 ]
  48: [54 55 63 64 ]
  49: [55 56 64 65 ]
  50: [56 57 65 66 ]
  51: [57 58 66 67 ]
  52: [58 59 67 68 ]
  53: [59 60 68 69 ]
  54: [60 61 69 70 ]
  55: [61 62 70 71 ]
  56: [63 64 72 73 ]
  57: [64 65 73 74 ]
  58: [65 66 74 75 ]
  59: [66 67 75 76 ]
  60: [67 68 76 77 ]
  61: [68 69 77 78 ]
  62: [69 70 78 79 ]
  63: [70 71 79 80 ]

    mesh = _cpp.mesh.create_rectangle_float64(comm, points, n, cell_type, partitioner, diagonal)
TypeError: partitioner_scotch(): incompatible function arguments. The following argument types are supported:
    1. (imbalance: float = 0.025, seed: int = 0) -> Callable[[MPICommWrapper, int, dolfinx.cpp.graph.AdjacencyList_int64, bool], dolfinx.cpp.graph.AdjacencyList_int32]

Invoked with: <mpi4py.MPI.Intracomm object at 0x147ff2240d20>, 2, 2, <AdjacencyList> with 0 nodes

And finally with partitioner_parmetis, the error is:

Traceback (most recent call last):
  File "/stck/fpascal/Python/Code/FENICS/Create_mesh.py", line 35, in <module>
    domain = mesh.create_unit_square(MPI.COMM_WORLD, 8, 8, mesh.CellType.quadrilateral, partitioner=partitioner_parmetis)
  File "/scratchm/fpascal/Softwares/Dist_dolfinx/lib/python3.9/site-packages/dolfinx/mesh.py", line 569, in create_unit_square
    return create_rectangle(comm, [np.array([0.0, 0.0]), np.array([1.0, 1.0])],
  File "/scratchm/fpascal/Softwares/Dist_dolfinx/lib/python3.9/site-packages/dolfinx/mesh.py", line 540, in create_rectangle
    mesh = _cpp.mesh.create_rectangle_float64(comm, points, n, cell_type, partitioner, diagonal)
TypeError: partitioner_parmetis(): incompatible function arguments. The following argument types are supported:
    1. (imbalance: float = 1.02, options: Annotated[List[int], FixedSize(3)] = [1, 0, 5]) -> Callable[[MPICommWrapper, int, dolfinx.cpp.graph.AdjacencyList_int64, bool], dolfinx.cpp.graph.AdjacencyList_int32]

Invoked with: <mpi4py.MPI.Intracomm object at 0x151153b1cd20>, 2, 2, <AdjacencyList> with 64 nodes
  (... same 64-entry adjacency list as above, omitted ...)

(The other rank raises the same TypeError, invoked with an <AdjacencyList> with 0 nodes.)

Could you add the exact code you are running when you try to call create_unit_square with partitioner, partitioner_scotch() and partitioner_parmetis()?

Here is the code I used to test the different partitioners (in particular, here we test partitioner_parmetis):

import numpy as np
import dolfinx
from mpi4py import MPI
from dolfinx import mesh
from dolfinx.graph import partitioner, partitioner_scotch, partitioner_parmetis

def partitionerUnitTest32(comm, n, m, topo):
    r = comm.Get_rank()
    dests = []
    offsets = [0]
    for i in range(topo.num_nodes):
        dests.append(r)
        if r == 1:
            dests.append(0)
        offsets.append(len(dests))

    dests = np.array(dests, dtype=np.int32)
    offsets = np.array(offsets, dtype=np.int32)
    return dolfinx.cpp.graph.AdjacencyList_int32(dests, offsets)
# domain = mesh.create_unit_square(MPI.COMM_WORLD, 8, 8, mesh.CellType.quadrilateral, partitioner=partitioner)
# domain = mesh.create_unit_square(MPI.COMM_WORLD, 8, 8, mesh.CellType.quadrilateral, partitioner=partitioner_scotch)
domain = mesh.create_unit_square(MPI.COMM_WORLD, 8, 8, mesh.CellType.quadrilateral, partitioner=partitioner_parmetis)
# domain = mesh.create_unit_square(MPI.COMM_WORLD, 8, 8, mesh.CellType.quadrilateral, partitioner=partitionerUnitTest32) # works correctly
# domain = mesh.create_unit_square(MPI.COMM_WORLD, 8, 8, mesh.CellType.quadrilateral)

This should be

domain = mesh.create_unit_square(MPI.COMM_WORLD, 8, 8, mesh.CellType.quadrilateral, partitioner=partitioner_parmetis())

and similarly for the other partitioners.

Thanks!

However, the code:

import numpy as np
import dolfinx
from mpi4py import MPI
from dolfinx import mesh
from dolfinx.graph import partitioner, partitioner_scotch, partitioner_parmetis

def partitionerUnitTest32(comm, n, m, topo):
    r = comm.Get_rank()
    dests = []
    offsets = [0]
    for i in range(topo.num_nodes):
        dests.append(r)
        if r == 1:
            dests.append(0)
        offsets.append(len(dests))

    dests = np.array(dests, dtype=np.int32)
    offsets = np.array(offsets, dtype=np.int32)
    return dolfinx.cpp.graph.AdjacencyList_int32(dests, offsets)
# domain = mesh.create_unit_square(MPI.COMM_WORLD, 8, 8, mesh.CellType.quadrilateral, partitioner=partitioner())
# domain = mesh.create_unit_square(MPI.COMM_WORLD, 8, 8, mesh.CellType.quadrilateral, partitioner=partitioner_scotch())
domain = mesh.create_unit_square(MPI.COMM_WORLD, 8, 8, mesh.CellType.quadrilateral, partitioner=partitioner_parmetis())
# domain = mesh.create_unit_square(MPI.COMM_WORLD, 8, 8, mesh.CellType.quadrilateral, partitioner=partitionerUnitTest32) # works correctly
# domain = mesh.create_unit_square(MPI.COMM_WORLD, 8, 8, mesh.CellType.quadrilateral)

still fails with:

Traceback (most recent call last):
  File "/stck/fpascal/Python/Code/FENICS/Create_mesh.py", line 22, in <module>
    domain = mesh.create_unit_square(MPI.COMM_WORLD, 8, 8, mesh.CellType.quadrilateral, partitioner=partitioner_parmetis())
  File "/scratchm/fpascal/Softwares/Dist_dolfinx/lib/python3.9/site-packages/dolfinx/mesh.py", line 569, in create_unit_square
    return create_rectangle(comm, [np.array([0.0, 0.0]), np.array([1.0, 1.0])],
  File "/scratchm/fpascal/Softwares/Dist_dolfinx/lib/python3.9/site-packages/dolfinx/mesh.py", line 540, in create_rectangle
    mesh = _cpp.mesh.create_rectangle_float64(comm, points, n, cell_type, partitioner, diagonal)
TypeError: (): incompatible function arguments. The following argument types are supported:
    1. (arg0: MPICommWrapper, arg1: int, arg2: dolfinx.cpp.graph.AdjacencyList_int64, arg3: bool) -> dolfinx.cpp.graph.AdjacencyList_int32

Invoked with: <mpi4py.MPI.Intracomm object at 0x14c39eb58570>, 2, 2, <AdjacencyList> with 64 nodes
  (... same 64-entry adjacency list as above, omitted ...)

(The other rank raises the same TypeError, invoked with an <AdjacencyList> with 0 nodes.)

--------------------------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------

Have a look at:
https://jsdokken.com/dolfinx_docs/meshes.html
You need to wrap partitioner_parmetis() as:

domain = mesh.create_unit_square(MPI.COMM_WORLD, 8, 8, mesh.CellType.quadrilateral, partitioner=mesh.create_cell_partitioner(partitioner_parmetis()))
domain = mesh.create_unit_square(MPI.COMM_WORLD, 8, 8, mesh.CellType.quadrilateral, partitioner=mesh.create_cell_partitioner(partitioner_scotch()))
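For completeness, a minimal self-contained script for the SCOTCH variant could look like this (a sketch; the final print is just an illustrative check of the resulting distribution):

from mpi4py import MPI
from dolfinx import mesh
from dolfinx.graph import partitioner_scotch

# Wrap the SCOTCH graph partitioner into a cell partitioner and build the mesh
part = mesh.create_cell_partitioner(partitioner_scotch())
domain = mesh.create_unit_square(MPI.COMM_WORLD, 8, 8, mesh.CellType.quadrilateral, partitioner=part)

# Report how many cells each rank owns
cell_map = domain.topology.index_map(domain.topology.dim)
print(f"rank {MPI.COMM_WORLD.rank}: {cell_map.size_local} owned cells")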

Thanks, we’re gaining ground. Please find in the code below comments showing the status of the different lines. To sum up, thanks to create_cell_partitioner the script works correctly with partitioner_scotch. However, with partitioner or partitioner_parmetis I end up with the same PETSc segmentation fault as in my initial post.

import numpy as np
import dolfinx
from mpi4py import MPI
from dolfinx import mesh
from dolfinx.graph import partitioner, partitioner_scotch, partitioner_parmetis

def partitionerUnitTest32(comm, n, m, topo):
    r = comm.Get_rank()
    dests = []
    offsets = [0]
    for i in range(topo.num_nodes):
        dests.append(r)
        if r == 1:
            dests.append(0)
        offsets.append(len(dests))

    dests = np.array(dests, dtype=np.int32)
    offsets = np.array(offsets, dtype=np.int32)
    return dolfinx.cpp.graph.AdjacencyList_int32(dests, offsets)
# domain = mesh.create_unit_square(MPI.COMM_WORLD, 8, 8, mesh.CellType.quadrilateral, partitioner=mesh.create_cell_partitioner(partitioner())) # PETSc segmentation fault error
# domain = mesh.create_unit_square(MPI.COMM_WORLD, 8, 8, mesh.CellType.quadrilateral, partitioner=mesh.create_cell_partitioner(partitioner_scotch())) # works correctly
# domain = mesh.create_unit_square(MPI.COMM_WORLD, 8, 8, mesh.CellType.quadrilateral, partitioner=mesh.create_cell_partitioner(partitioner_parmetis())) # PETSc segmentation fault error
# domain = mesh.create_unit_square(MPI.COMM_WORLD, 8, 8, mesh.CellType.quadrilateral, partitioner=partitionerUnitTest32) # works correctly
# domain = mesh.create_unit_square(MPI.COMM_WORLD, 8, 8, mesh.CellType.quadrilateral) # PETSc segmentation fault error

It seems there is an issue with the ParMETIS partitioner then, as it is the default for partitioner() (see
https://github.com/FEniCS/dolfinx/blob/fdbc308ff9f5a8f71efed2d7fc47055bccb2810e/cpp/dolfinx/graph/partition.cpp#L26), so partitioner() and partitioner_parmetis() should give the same result.
How did you install ParMETIS, and what version are you using?
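It might also help to call the ParMETIS graph partitioner directly on a tiny graph, to separate a partitioner problem from mesh creation. A rough sketch (run with mpirun -np 2; it assumes the returned partitioner can be called from Python with the (comm, nparts, graph, ghosting) signature shown in your tracebacks, and that AdjacencyList_int64 is constructed like the AdjacencyList_int32 in your script):

import numpy as np
from mpi4py import MPI
import dolfinx
from dolfinx.graph import partitioner_parmetis

comm = MPI.COMM_WORLD
# Distributed chain graph 0-1-2-3: rank 0 owns nodes 0 and 1, rank 1 owns nodes 2 and 3
if comm.rank == 0:
    dests = np.array([1, 0, 2], dtype=np.int64)
    offsets = np.array([0, 1, 3], dtype=np.int64)
else:
    dests = np.array([1, 3, 2], dtype=np.int64)
    offsets = np.array([0, 2, 3], dtype=np.int64)
graph = dolfinx.cpp.graph.AdjacencyList_int64(dests, offsets)

# Ask ParMETIS for the destination ranks of the locally owned nodes
part = partitioner_parmetis()
dest_ranks = part(comm, comm.size, graph, False)
print(f"rank {comm.rank}:", dest_ranks)

If this also segfaults, the problem is likely in the ParMETIS build rather than in DOLFINx itself.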

Thanks for your help, you pointed out what was wrong.
I used ParMETIS v4.0.3, installed by some colleagues by means of Spack. I then tried another v4.0.3 installation (also provided by Spack) and now it works. That is really weird, and I don’t understand why it works with one installation but not the other.
Anyway, I’m glad that it now works. Thank you for your help.