Difficulty understanding MultiPointConstraint add_constraint in a particular 3D case

Hello,
with FEniCSx 0.9 and dolfinx_mpc 0.9.1 I am attempting to reconnect nodes in meshes with “hanging nodes” using multi-point constraints: each hanging node is tied to the average of its two edge neighbours, i.e. u_slave = 0.5*(u_master0 + u_master1).
The application I am working on generates and distributes this kind of mesh.
My understanding of add_constraint matches my expectations in 2D, but I run into a strange situation in 3D.
I managed to isolate this behaviour in the following MWE:


from dolfinx import mesh
from dolfinx import fem
from dolfinx import graph
from dolfinx import io
import ufl
import numpy as np
from petsc4py import PETSc
from petsc4py import typing as petsc_typing

from mpi4py import MPI
from dolfinx_mpc import LinearProblem,MultiPointConstraint

DO_3D=True

def residual(A: PETSc.Mat, b: PETSc.Vec, x: PETSc.Vec, nb: petsc_typing.Scalar):
    # Relative residual ||b - A x|| / nb
    R = b.duplicate()
    A.mult(x, R)
    R.aypx(-1., b)  # R <- -R + b = b - A x
    return R.norm() / nb
if DO_3D:
    if MPI.COMM_WORLD.size==2:
        # Ad hoc partitioner for 2 processes: rank 0 holds all 216 cells after
        # reading and assigns the first 108 to rank 0 and the last 108 to rank 1
        def part(comm, nbpart, cell_types, topo):
            if comm.rank==0:
                dests = np.concatenate((np.full(108, 0, dtype=np.int32), np.full(108, 1, dtype=np.int32)))
            else:
                dests = np.empty(0, dtype=np.int32)
            return graph.adjacencylist(dests)
        [dom,_,_]=io.gmshio.read_from_msh("test_mpc3D.msh",MPI.COMM_WORLD,partitioner=part)
    else:
        [dom,_,_]=io.gmshio.read_from_msh("test_mpc3D.msh",MPI.COMM_WORLD)
    assert(dom.topology.dim==3)
    dim=3
else:
    if MPI.COMM_WORLD.size==2:
        # Same ad hoc split for the 40-cell 2D mesh: first 20 cells to rank 0,
        # last 20 to rank 1
        def part(comm, nbpart, cell_types, topo):
            if comm.rank==0:
                dests = np.concatenate((np.full(20, 0, dtype=np.int32), np.full(20, 1, dtype=np.int32)))
            else:
                dests = np.empty(0, dtype=np.int32)
            return graph.adjacencylist(dests)
        [dom,_,_]=io.gmshio.read_from_msh("test_mpc2D.msh",MPI.COMM_WORLD,gdim=2,partitioner=part)
    else:
        [dom,_,_]=io.gmshio.read_from_msh("test_mpc2D.msh",MPI.COMM_WORLD,gdim=2)
    assert(dom.topology.dim==2)
    dim=2
dom.topology.create_connectivity(0,dim)
slaves=np.empty(0, np.int32)
masters=np.empty(0, np.int64)
coeffs=np.empty(0, np.float64)  # coefficients must be floating point (0.5 below)
offsets=np.zeros(1, np.int32)
owners=np.empty(0, np.int32)
space = fem.functionspace(dom, ("Lagrange", 1, (dim,)))
idmp=space.dofmap.index_map
nbl=idmp.size_local
glob_label_loc=idmp.local_to_global(np.arange(0, nbl))
# Global component (non-blocked) index for every local dof: owned blocks first, then ghosts
glob_label=np.concatenate((
    np.array([i*dim + b for i in glob_label_loc for b in range(dim)], dtype=np.int64),
    np.array([i*dim + b for i in idmp.ghosts for b in range(dim)], dtype=np.int64)))

def set_slave_master(b, s, m0, m1, slaves, masters, coeffs, offsets, owners):
    """Constrain component b of the slave node at point s to the average of the
    master nodes at points m0 and m1, appending to the mpc_data arrays.
    Assumes both master dofs are reachable on the process holding the slave."""
    slave=mesh.locate_entities(dom, 0, lambda x: np.logical_and(np.isclose(x[0], s[0]), np.logical_and(np.isclose(x[1], s[1]), np.isclose(x[2], s[2]))))
    master_0=mesh.locate_entities(dom, 0, lambda x: np.logical_and(np.isclose(x[0], m0[0]), np.logical_and(np.isclose(x[1], m0[1]), np.isclose(x[2], m0[2]))))
    master_1=mesh.locate_entities(dom, 0, lambda x: np.logical_and(np.isclose(x[0], m1[0]), np.logical_and(np.isclose(x[1], m1[1]), np.isclose(x[2], m1[2]))))
    dof_slave=fem.locate_dofs_topological(space, 0, slave)
    dof_master_0=fem.locate_dofs_topological(space, 0, master_0)
    dof_master_1=fem.locate_dofs_topological(space, 0, master_1)
    if len(slave) and dof_slave[0] < nbl:  # only register slaves owned by this process
        slaves=np.concatenate((slaves, np.array([dof_slave[0]*dim + b], dtype=np.int32)))
        coeffs=np.concatenate((coeffs, np.full(2, 0.5)))
        offsets=np.concatenate((offsets, np.array([len(coeffs)], dtype=np.int32)))
        for mast in [dof_master_0[0], dof_master_1[0]]:
            # masters are given as global (non-blocked) indices
            masters=np.concatenate((masters, np.array([glob_label[mast*dim + b]])))
            if mast < nbl:
                owners=np.concatenate((owners, np.array([dom.comm.rank], dtype=np.int32)))
            else:
                owners=np.concatenate((owners, np.array([idmp.owners[mast - nbl]], dtype=np.int32)))
    return slaves, masters, coeffs, offsets, owners
if DO_3D:
    [slaves,masters,coeffs,offsets,owners]=set_slave_master(0,[5.,-5.,0.],[5.,-5.,2.5],[5.,-5,-2.5],slaves,masters,coeffs,offsets,owners)
    [slaves,masters,coeffs,offsets,owners]=set_slave_master(1,[5.,-5.,0.],[5.,-5.,2.5],[5.,-5,-2.5],slaves,masters,coeffs,offsets,owners)
    [slaves,masters,coeffs,offsets,owners]=set_slave_master(2,[5.,-5.,0.],[5.,-5.,2.5],[5.,-5,-2.5],slaves,masters,coeffs,offsets,owners)
    [slaves,masters,coeffs,offsets,owners]=set_slave_master(0,[15.,5.,0.],[15.,5.,2.5],[15.,5,-2.5],slaves,masters,coeffs,offsets,owners)
    [slaves,masters,coeffs,offsets,owners]=set_slave_master(1,[15.,5.,0.],[15.,5.,2.5],[15.,5,-2.5],slaves,masters,coeffs,offsets,owners)
    [slaves,masters,coeffs,offsets,owners]=set_slave_master(2,[15.,5.,0.],[15.,5.,2.5],[15.,5,-2.5],slaves,masters,coeffs,offsets,owners)
else:
    [slaves,masters,coeffs,offsets,owners]=set_slave_master(0,[5.,2.5,0.],[5.,5.,0.],[5.,0.,0.],slaves,masters,coeffs,offsets,owners)
    [slaves,masters,coeffs,offsets,owners]=set_slave_master(0,[5.,-2.5,0.],[5.,-5.,0.],[5.,0.,0.],slaves,masters,coeffs,offsets,owners)
    [slaves,masters,coeffs,offsets,owners]=set_slave_master(1,[5.,2.5,0.],[5.,5.,0.],[5.,0.,0.],slaves,masters,coeffs,offsets,owners)
    [slaves,masters,coeffs,offsets,owners]=set_slave_master(0,[15.,-2.5,0.],[15.,-5.,0.],[15.,0.,0.],slaves,masters,coeffs,offsets,owners)
    [slaves,masters,coeffs,offsets,owners]=set_slave_master(0,[15.,2.5,0.],[15.,5.,0.],[15.,0.,0.],slaves,masters,coeffs,offsets,owners)
    [slaves,masters,coeffs,offsets,owners]=set_slave_master(1,[5.,-2.5,0.],[5.,-5.,0.],[5.,0.,0.],slaves,masters,coeffs,offsets,owners)
    [slaves,masters,coeffs,offsets,owners]=set_slave_master(1,[15.,-2.5,0.],[15.,-5.,0.],[15.,0.,0.],slaves,masters,coeffs,offsets,owners)
    [slaves,masters,coeffs,offsets,owners]=set_slave_master(1,[15.,2.5,0.],[15.,5.,0.],[15.,0.,0.],slaves,masters,coeffs,offsets,owners)
print(dom.comm.rank,"slaves",slaves)
print(dom.comm.rank,"masters",masters)
print(dom.comm.rank,"coeffs",coeffs)
print(dom.comm.rank,"offsets",offsets)
print(dom.comm.rank,"owners",owners)
mpc=MultiPointConstraint(space)
mpc.add_constraint(space,slaves,masters,coeffs,owners,offsets)
mpc.finalize()
mpc_space=mpc.function_space

E=5.e+6
nu=0.3
mu=fem.Constant(dom,E / (2.0 * (1.0 + nu)))
lmbda = fem.Constant(dom,E * nu / ((1.0 + nu) * (1.0 - 2.0 * nu)))
def eps(w):
    return 0.5*(ufl.grad(w) + ufl.grad(w).T)
def sigma(strain): 
    return 2.0*mu*strain + lmbda*ufl.tr(strain)*ufl.Identity(dim)
u = ufl.TrialFunction(space)
v = ufl.TestFunction(space)
a_ufl=ufl.inner(sigma(eps(u)), eps(v)) * ufl.dx
f=fem.Function(space,name='load')
f.x.array[:]=0.
b_ufl=ufl.dot(f,v)*ufl.dx
bc_nodes=mesh.locate_entities(dom, 0, lambda x: np.isclose(x[0], 0.))
bdofs=fem.locate_dofs_topological(space,0,bc_nodes)
bcb=fem.dirichletbc(np.zeros(dim),bdofs,space)
imp_nodes=mesh.locate_entities(dom, 0, lambda x: np.isclose(x[0], 20.))
idofs=fem.locate_dofs_topological(space,0,imp_nodes)
bci=fem.dirichletbc(np.arange(0.1,0.3,0.2/dim),idofs,space)
bcs=[bcb,bci]
problem = LinearProblem(a_ufl, b_ufl,mpc,bcs=bcs, petsc_options={"ksp_type": "preonly", "pc_type": "lu"})
disp = problem.solve()
#problem._A.view()
res= residual(problem._A,problem._b,disp.x.petsc_vec,problem.b.norm())
if MPI.COMM_WORLD.rank==0:
    print(f"residual out of solve { res}")
mpc.homogenize(disp)
res= residual(problem._A,problem._b,disp.x.petsc_vec,problem.b.norm())
if MPI.COMM_WORLD.rank==0:
    print(f"residual after homogenize { res}")
mpc.backsubstitution(disp)
res= residual(problem._A,problem._b,disp.x.petsc_vec,problem.b.norm())
if MPI.COMM_WORLD.rank==0:
    print(f"residual after backsubstitution { res}")

This code reads msh files representing the type of mesh I want to treat:

test_mpc3D.msh

$NOD
87
1 15.0 -5.0 -2.5
2 20.0 -5.0 2.5
3 20.0 0.0 2.5
4 15.0 -5.0 2.5
5 15.0 0.0 2.5
6 20.0 -5.0 -2.5
7 15.0 0.0 -2.5
8 20.0 0.0 -2.5
9 20.0 5.0 2.5
10 15.0 5.0 2.5
11 20.0 5.0 -2.5
12 15.0 5.0 -2.5
13 0.0 -5.0 -2.5
14 5.0 -5.0 2.5
15 5.0 0.0 2.5
16 0.0 -5.0 2.5
17 5.0 -5.0 -2.5
18 0.0 0.0 2.5
19 5.0 0.0 -2.5
20 0.0 0.0 -2.5
21 5.0 5.0 2.5
22 0.0 5.0 2.5
23 5.0 5.0 -2.5
24 0.0 5.0 -2.5
25 7.5 -2.5 0.0
26 5.0 -5.0 0.0
27 5.0 -2.5 0.0
28 7.5 -5.0 0.0
29 5.0 -2.5 2.5
30 7.5 -5.0 2.5
31 7.5 -2.5 2.5
32 5.0 -2.5 -2.5
33 10.0 -5.0 2.5
34 7.5 -5.0 -2.5
35 5.0 0.0 0.0
36 7.5 -2.5 -2.5
37 10.0 -5.0 0.0
38 10.0 -5.0 -2.5
39 7.5 0.0 2.5
40 7.5 0.0 0.0
41 10.0 -2.5 2.5
42 10.0 -2.5 0.0
43 10.0 0.0 2.5
44 7.5 2.5 0.0
45 7.5 0.0 -2.5
46 12.5 -2.5 0.0
47 12.5 -5.0 0.0
48 10.0 -2.5 -2.5
49 5.0 2.5 0.0
50 10.0 0.0 -2.5
51 12.5 -5.0 2.5
52 10.0 0.0 0.0
53 7.5 2.5 2.5
54 5.0 2.5 2.5
55 12.5 -2.5 2.5
56 12.5 -5.0 -2.5
57 7.5 2.5 -2.5
58 5.0 2.5 -2.5
59 15.0 -5.0 0.0
60 12.5 -2.5 -2.5
61 10.0 2.5 0.0
62 10.0 2.5 2.5
63 5.0 5.0 0.0
64 15.0 -2.5 2.5
65 15.0 -2.5 0.0
66 12.5 0.0 0.0
67 12.5 0.0 2.5
68 12.5 2.5 0.0
69 10.0 5.0 2.5
70 7.5 5.0 2.5
71 10.0 2.5 -2.5
72 7.5 5.0 0.0
73 15.0 -2.5 -2.5
74 12.5 0.0 -2.5
75 10.0 5.0 -2.5
76 7.5 5.0 -2.5
77 10.0 5.0 0.0
78 12.5 2.5 2.5
79 15.0 0.0 0.0
80 12.5 2.5 -2.5
81 12.5 5.0 0.0
82 12.5 5.0 2.5
83 15.0 2.5 2.5
84 15.0 2.5 0.0
85 12.5 5.0 -2.5
86 15.0 2.5 -2.5
87 15.0 5.0 0.0
$ENDNOD
$ELM
216
1 4 121 26 4 1 2 3 4
2 4 121 26 4 1 5 4 3
3 4 121 26 4 1 6 3 2
4 4 121 26 4 1 7 5 3
5 4 121 26 4 1 6 8 3
6 4 121 26 4 1 8 7 3
7 4 121 26 4 7 3 9 5
8 4 121 26 4 7 8 9 3
9 4 121 26 4 7 10 5 9
10 4 121 26 4 7 8 11 9
11 4 121 26 4 7 12 10 9
12 4 121 26 4 7 11 12 9
13 4 121 26 4 14 26 25 28
14 4 121 26 4 14 30 25 28
15 4 121 26 4 17 26 25 28
16 4 121 26 4 14 31 30 25
17 4 121 26 4 33 30 25 28
18 4 121 26 4 17 28 25 34
19 4 121 26 4 33 31 30 25
20 4 121 26 4 33 37 28 25
21 4 121 26 4 38 28 25 34
22 4 121 26 4 17 25 36 34
23 4 121 26 4 33 31 41 25
24 4 121 26 4 38 37 28 25
25 4 121 26 4 33 37 42 25
26 4 121 26 4 38 25 36 34
27 4 121 26 4 33 37 46 47
28 4 121 26 4 43 31 41 25
29 4 121 26 4 33 41 42 25
30 4 121 26 4 38 37 42 25
31 4 121 26 4 33 46 37 42
32 4 121 26 4 38 48 25 36
33 4 121 26 4 33 51 46 47
34 4 121 26 4 38 37 46 47
35 4 121 26 4 43 41 42 25
36 4 121 26 4 33 41 46 42
37 4 121 26 4 38 42 48 25
38 4 121 26 4 38 46 37 42
39 4 121 26 4 50 48 25 36
40 4 121 26 4 33 55 51 46
41 4 121 26 4 4 51 46 47
42 4 121 26 4 38 47 46 56
43 4 121 26 4 43 52 42 25
44 4 121 26 4 43 41 46 42
45 4 121 26 4 33 55 41 46
46 4 121 26 4 50 42 48 25
47 4 121 26 4 38 46 42 48
48 4 121 26 4 4 55 51 46
49 4 121 26 4 4 59 47 46
50 4 121 26 4 38 46 60 56
51 4 121 26 4 1 47 46 56
52 4 121 26 4 50 52 42 25
53 4 121 26 4 43 52 46 42
54 4 121 26 4 43 55 41 46
55 4 121 26 4 50 46 42 48
56 4 121 26 4 38 46 48 60
57 4 121 26 4 4 55 64 46
58 4 121 26 4 1 59 47 46
59 4 121 26 4 4 59 65 46
60 4 121 26 4 1 46 60 56
61 4 121 26 4 50 52 46 42
62 4 121 26 4 43 66 52 46
63 4 121 26 4 43 55 67 46
64 4 121 26 4 50 46 48 60
65 4 121 26 4 5 55 64 46
66 4 121 26 4 4 64 65 46
67 4 121 26 4 1 59 65 46
68 4 121 26 4 1 73 46 60
69 4 121 26 4 50 66 52 46
70 4 121 26 4 43 67 66 46
71 4 121 26 4 43 52 68 66
72 4 121 26 4 5 55 67 46
73 4 121 26 4 50 74 46 60
74 4 121 26 4 5 64 65 46
75 4 121 26 4 1 65 73 46
76 4 121 26 4 7 73 46 60
77 4 121 26 4 50 66 74 46
78 4 121 26 4 50 52 68 66
79 4 121 26 4 43 67 68 66
80 4 121 26 4 5 67 66 46
81 4 121 26 4 7 74 46 60
82 4 121 26 4 5 79 65 46
83 4 121 26 4 7 65 73 46
84 4 121 26 4 7 66 74 46
85 4 121 26 4 50 66 68 74
86 4 121 26 4 43 78 67 68
87 4 121 26 4 5 67 68 66
88 4 121 26 4 5 66 79 46
89 4 121 26 4 7 79 65 46
90 4 121 26 4 7 66 79 46
91 4 121 26 4 7 66 68 74
92 4 121 26 4 50 68 80 74
93 4 121 26 4 5 78 67 68
94 4 121 26 4 5 79 66 68
95 4 121 26 4 7 79 66 68
96 4 121 26 4 7 68 80 74
97 4 121 26 4 5 78 83 68
98 4 121 26 4 5 79 84 68
99 4 121 26 4 7 79 84 68
100 4 121 26 4 7 86 68 80
101 4 121 26 4 5 83 84 68
102 4 121 26 4 10 78 83 68
103 4 121 26 4 7 84 86 68
104 4 121 26 4 12 86 68 80
105 4 121 26 4 10 83 84 68
106 4 121 26 4 12 84 86 68
107 4 121 26 4 10 87 84 68
108 4 121 26 4 12 87 84 68
109 4 121 26 4 13 14 15 16
110 4 121 26 4 13 17 15 14
111 4 121 26 4 13 18 16 15
112 4 121 26 4 13 17 19 15
113 4 121 26 4 13 20 18 15
114 4 121 26 4 13 19 20 15
115 4 121 26 4 20 15 21 18
116 4 121 26 4 20 19 21 15
117 4 121 26 4 20 22 18 21
118 4 121 26 4 20 19 23 21
119 4 121 26 4 20 24 22 21
120 4 121 26 4 20 23 24 21
121 4 121 26 4 14 25 26 27
122 4 121 26 4 14 29 25 27
123 4 121 26 4 17 25 26 27
124 4 121 26 4 14 31 29 25
125 4 121 26 4 15 29 25 27
126 4 121 26 4 17 25 27 32
127 4 121 26 4 15 31 29 25
128 4 121 26 4 15 35 25 27
129 4 121 26 4 19 25 27 32
130 4 121 26 4 17 25 32 36
131 4 121 26 4 15 31 39 25
132 4 121 26 4 19 35 25 27
133 4 121 26 4 15 40 35 25
134 4 121 26 4 19 25 32 36
135 4 121 26 4 43 31 39 25
136 4 121 26 4 15 39 40 25
137 4 121 26 4 19 40 35 25
138 4 121 26 4 15 35 44 40
139 4 121 26 4 19 45 25 36
140 4 121 26 4 43 39 40 25
141 4 121 26 4 15 39 44 40
142 4 121 26 4 19 40 45 25
143 4 121 26 4 19 35 44 40
144 4 121 26 4 15 44 35 49
145 4 121 26 4 50 45 25 36
146 4 121 26 4 43 40 52 25
147 4 121 26 4 43 39 44 40
148 4 121 26 4 15 53 39 44
149 4 121 26 4 50 40 45 25
150 4 121 26 4 19 40 44 45
151 4 121 26 4 19 44 35 49
152 4 121 26 4 15 54 44 49
153 4 121 26 4 50 40 52 25
154 4 121 26 4 43 52 40 44
155 4 121 26 4 43 53 39 44
156 4 121 26 4 15 53 54 44
157 4 121 26 4 50 40 44 45
158 4 121 26 4 19 44 57 45
159 4 121 26 4 19 44 49 58
160 4 121 26 4 21 54 44 49
161 4 121 26 4 50 52 40 44
162 4 121 26 4 43 52 61 44
163 4 121 26 4 43 53 62 44
164 4 121 26 4 21 53 54 44
165 4 121 26 4 50 44 57 45
166 4 121 26 4 19 44 58 57
167 4 121 26 4 23 44 49 58
168 4 121 26 4 21 63 44 49
169 4 121 26 4 50 52 61 44
170 4 121 26 4 43 62 61 44
171 4 121 26 4 43 68 52 61
172 4 121 26 4 69 53 62 44
173 4 121 26 4 21 53 70 44
174 4 121 26 4 50 71 44 57
175 4 121 26 4 23 44 58 57
176 4 121 26 4 23 63 44 49
177 4 121 26 4 21 72 63 44
178 4 121 26 4 50 61 71 44
179 4 121 26 4 50 68 52 61
180 4 121 26 4 69 62 61 44
181 4 121 26 4 43 62 68 61
182 4 121 26 4 69 53 70 44
183 4 121 26 4 21 70 72 44
184 4 121 26 4 75 71 44 57
185 4 121 26 4 23 76 44 57
186 4 121 26 4 23 72 63 44
187 4 121 26 4 75 61 71 44
188 4 121 26 4 50 68 61 71
189 4 121 26 4 69 77 61 44
190 4 121 26 4 69 62 68 61
191 4 121 26 4 43 78 62 68
192 4 121 26 4 69 70 72 44
193 4 121 26 4 75 76 44 57
194 4 121 26 4 23 72 76 44
195 4 121 26 4 75 77 61 44
196 4 121 26 4 75 68 61 71
197 4 121 26 4 50 68 71 80
198 4 121 26 4 69 72 77 44
199 4 121 26 4 69 77 68 61
200 4 121 26 4 69 78 62 68
201 4 121 26 4 75 72 76 44
202 4 121 26 4 75 72 77 44
203 4 121 26 4 75 77 68 61
204 4 121 26 4 75 68 71 80
205 4 121 26 4 69 81 77 68
206 4 121 26 4 69 78 82 68
207 4 121 26 4 75 81 77 68
208 4 121 26 4 75 85 68 80
209 4 121 26 4 69 82 81 68
210 4 121 26 4 10 78 82 68
211 4 121 26 4 75 81 85 68
212 4 121 26 4 12 85 68 80
213 4 121 26 4 10 82 81 68
214 4 121 26 4 12 81 85 68
215 4 121 26 4 10 81 87 68
216 4 121 26 4 12 81 87 68
$ENDELM

and test_mpc2D.msh


$NOD
31
1 15.0 -5.0 0.0
2 20.0 -5.0 0.0
3 20.0 0.0 0.0
4 15.0 0.0 0.0
5 20.0 5.0 0.0
6 15.0 5.0 0.0
7 0.0 -5.0 0.0
8 5.0 -5.0 0.0
9 5.0 0.0 0.0
10 0.0 0.0 0.0
11 5.0 5.0 0.0
12 0.0 5.0 0.0
13 12.5 -2.5 0.0
14 12.5 -5.0 0.0
15 15.0 -2.5 0.0
16 10.0 -5.0 0.0
17 10.0 -2.5 0.0
18 12.5 0.0 0.0
19 10.0 0.0 0.0
20 7.5 -2.5 0.0
21 12.5 2.5 0.0
22 7.5 -5.0 0.0
23 15.0 2.5 0.0
24 7.5 0.0 0.0
25 10.0 2.5 0.0
26 7.5 2.5 0.0
27 5.0 -2.5 0.0
28 10.0 5.0 0.0
29 12.5 5.0 0.0
30 5.0 2.5 0.0
31 7.5 5.0 0.0
$ENDNOD
$ELM
40
1 2 121 26 3 7 8 9
2 2 121 26 3 7 10 9
3 2 121 26 3 10 9 11
4 2 121 26 3 10 12 11
5 2 121 26 3 20 24 19
6 2 121 26 3 21 25 19
7 2 121 26 3 20 9 24
8 2 121 26 3 26 19 24
9 2 121 26 3 20 27 8
10 2 121 26 3 26 19 25
11 2 121 26 3 21 28 25
12 2 121 26 3 21 29 6
13 2 121 26 3 20 9 27
14 2 121 26 3 26 24 9
15 2 121 26 3 26 25 28
16 2 121 26 3 21 28 29
17 2 121 26 3 26 30 9
18 2 121 26 3 26 31 28
19 2 121 26 3 26 11 30
20 2 121 26 3 26 11 31
21 2 121 26 3 1 2 3
22 2 121 26 3 1 4 3
23 2 121 26 3 4 3 5
24 2 121 26 3 4 6 5
25 2 121 26 3 13 1 14
26 2 121 26 3 13 1 15
27 2 121 26 3 13 14 16
28 2 121 26 3 13 15 4
29 2 121 26 3 13 17 16
30 2 121 26 3 13 18 4
31 2 121 26 3 13 19 17
32 2 121 26 3 20 16 17
33 2 121 26 3 13 19 18
34 2 121 26 3 21 4 18
35 2 121 26 3 20 17 19
36 2 121 26 3 20 16 22
37 2 121 26 3 21 18 19
38 2 121 26 3 21 4 23
39 2 121 26 3 20 22 8
40 2 121 26 3 21 23 6
$ENDELM

Some, but not all, nodes are constrained, to keep the test case simple.
In both dimensions, an ad hoc distribution is applied when run with two processes (the msh files have been tuned for this).

I ran the MWE with 1 up to 10 processes.

2D

In 2D with 1, 2, 3, 4, 6, 7, 9 and 10 processes, the runs (DO_3D=False) provide consistent relative residual values (computed by the residual function above; see the annotated sketch after this list):

  • the residual is incorrect when computed with the displacement returned by LinearProblem.solve, since that vector has already been back-substituted; b would have to be completed with the slave values for this residual to make sense.
  • the residual is correct once the slave displacements are reset to zero by ‘homogenize’.
  • after another ‘backsubstitution’ we get the same residual as the one obtained right after solve.
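
For reference, here is the check sequence condensed from the end of the MWE, with comments spelling out which value is expected to be meaningful (the names r0, r1, r2 are only for this sketch; everything else is defined in the MWE above):

nb = problem.b.norm()
# disp returned by solve() is already back-substituted: the slave entries hold
# the interpolated master values, so r0 is NOT a residual of the reduced system
r0 = residual(problem._A, problem._b, disp.x.petsc_vec, nb)
mpc.homogenize(disp)        # reset the slave entries of disp to zero
r1 = residual(problem._A, problem._b, disp.x.petsc_vec, nb)  # meaningful residual
mpc.backsubstitution(disp)  # refill the slave entries from the masters
r2 = residual(problem._A, problem._b, disp.x.petsc_vec, nb)  # equals r0 again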

For 5 and 8 processes, the requirements of the little set_slave_master function (written only to keep this MWE simple) are no longer fulfilled: the 2 master dofs must be on the same proc as the slave dof.
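
A guard of the following form (hypothetical, not part of the MWE) would make that assumption explicit instead of failing with an IndexError inside set_slave_master:

    if len(slave) and (len(dof_master_0) == 0 or len(dof_master_1) == 0):
        raise RuntimeError("set_slave_master assumes both master dofs are "
                           "reachable (owned or ghosted) on the slave's process")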

3D

In 3D with 1, 3, 4, 5, 6, 7, 8 and 9 processes the residuals are correct. For 10 processes the limitation of set_slave_master again interferes.
But with 2 processes things go wrong (bad residual value). In this case a special partitioning (using the tuned msh file) is used in order to reproduce the situation observed in my application:

The output is:


Info    : Reading 'test_mpc3D.msh'...
Info    : 87 nodes
Info    : 216 elements
Info    : Done reading 'test_mpc3D.msh'
1 slaves [39 40 41]
1 masters [123 132 124 133 125 134]
0 slaves [117 118 119]
0 masters [255 258 256 259 257 260]
1 coeffs [0.5 0.5 0.5 0.5 0.5 0.5]
1 offsets [0 2 4 6]
0 coeffs [0.5 0.5 0.5 0.5 0.5 0.5]
0 offsets [0 2 4 6]
0 owners [1 1 1 1 1 1]
1 owners [1 1 1 1 1 1]
residual out of solve 0.07895617284997351
residual after homogenize 0.3881266125567975
residual after backsubstitution 0.07895617284997351

I don’t think that the use of a special partitioner contributes to the problem, but maybe I am wrong (I also tried a special partitioner in 2D, but that is not really a proof).
Anyway, in the problematic test case the dofs are:

where numbers in parentheses stand for the local numbering and plain numbers for the global numbering, the left part of | stands for the per-component numbering, the right part of | for the block index, and an R prefix means the dof is owned by another process. The outputs above, showing the arguments given to the add_constraint method, are consistent with this numbering.
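
The same convention can be reproduced with a small sketch (my own illustration, only reusing space, dim and np from the MWE; the printed (local)|global pairs follow the figure's notation):

imap = space.dofmap.index_map
nbl = imap.size_local                                    # number of owned blocks
glob_blocks = imap.local_to_global(np.arange(nbl, dtype=np.int32))
all_blocks = np.concatenate((glob_blocks, imap.ghosts))  # owned blocks first, then ghosts
for lb, gb in enumerate(all_blocks):
    ghost = "R " if lb >= nbl else ""                    # R: owned by another process
    for c in range(dim):                                 # per-component (non-blocked) index
        print(f"{ghost}({lb*dim + c})|{gb*dim + c} , block ({lb})|{gb}")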

I observed that in this pathological case, despite the apparently correct creation of the MultiPointConstraint (backsubstitution works fine, and the pyvista outputs look right; see below), the slave dofs are not eliminated from the A matrix, at least. The residual is therefore necessarily wrong.

I am a little confused as to why it doesn’t work the way I expect in this particular case. What did I misunderstand regarding the usage of the MPC and the add_constraint method?

Many thanks for any reaction

As this example is rather involved, it will take me some time to consider it.

Have you considered:

which uses:

Note: You should probably set allow_missing_facets to False.

Hello @dokken ,

thank you for your time.

create_contact_inelastic_condition

I switched to dolfinx_mpc 0.9.3 to test the ‘create_contact_inelastic_condition’ method. After adding some faces with master/slave tags to my .msh file, I get incorrect results for every combination of process count and ‘allow_missing_facets’ value.
Having looked through the code, I think this method is not suitable for the type of mesh I am trying to use. In my case, the coarse and fine mesh blocks are not disconnected: nodes can be connected to coarse elements, fine elements, or a combination of both. Thus, in ‘impl::locate_slave_dofs’ (cpp/ContactConstraint.h), the slave facets will provide dofs that are also masters in the bb_tree. A circular dependency?
In my application, this connectivity is used mainly to correctly identify the support/patch of a node, and for now I don’t want to change that (i.e. disconnect the nodes), as it would, a priori, break things.

V0.10.0

I also tried switching to the 0.10 FEniCSx ecosystem (conda 0.10.0 based on GIT commit d31f333cac3a2e24d8d17093e50a10f2b5a80a9a, plus dolfinx_mpc) to see if my problem is related to something treated differently in this new version. Unfortunately, after a few modifications, the MWE fails as soon as I try to use a custom partitioner.

Error msg
Info    : Reading 'test_mpc3D.msh'...
Info    : 87 nodes
Info    : 216 elements
Info    : Done reading 'test_mpc3D.msh'
Traceback (most recent call last):
  File "xxxxxxxxxxxxxxxxxxxxxxxxxx/check_mpcV10.py", line 29, in <module>
    dom_data=io.gmsh.read_from_msh("test_mpc3D.msh",MPI.COMM_WORLD,partitioner=part)
  File "xxxxxxxxxxxxx/micromamba/envs/fenicsx010/lib/python3.14/site-packages/dolfinx/io/gmsh.py", line 512, in read_from_msh
    return model_to_mesh(gmsh.model, comm, rank, gdim=gdim, partitioner=partitioner)
  File "xxxxxxxxxxxxx/micromamba/envs/fenicsx010/lib/python3.14/site-packages/dolfinx/io/gmsh.py", line 422, in model_to_mesh
    mesh = create_mesh(
        comm, cell_connectivity, ufl_domain, x[:, :gdim].astype(dtype, copy=False), partitioner
    )
  File "xxxxxxxxxxxxx/micromamba/envs/fenicsx010/lib/python3.14/site-packages/dolfinx/mesh.py", line 775, in create_mesh
    msh: _cpp.mesh.Mesh_float32 | _cpp.mesh.Mesh_float64 = _cpp.mesh.create_mesh(
                                                           ~~~~~~~~~~~~~~~~~~~~~^
        comm, cells, cmap._cpp_object, x, partitioner, max_facet_to_cell_links
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
RuntimeError: std::bad_cast

to move forward

Put simply, I just want to know whether I have misunderstood how the information in mpc_data should be set.
In the 3D case with 2 processes, to constrain the slave dofs at location [5.,-5.,0.] by the master dofs at locations [5.,-5.,2.5] and [5.,-5.,-2.5], is it correct to set mpc_data on proc 1 (the proc owning the slaves) as:

slaves  [39 40 41] # local non-blocked numbering, i.e. x, y, z components
masters  [123 132 124 133 125 134] # global non-blocked numbering, i.e. one pair (2 masters) for each of the x, y, z components
coeffs  [0.5 0.5 0.5 0.5 0.5 0.5] # 2 masters x 3 slaves = 6 coefficients, all equal since the slave node lies midway between the master locations
offsets  [0 2 4 6] # size = 3 slaves + 1
owners  [1 1 1 1 1 1] # in this case all masters are owned by proc 1

and nothing on proc 0 (i.e. empty arrays)?
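
Under that interpretation (which I believe matches the add_constraint docstring), each slave's data would be recovered by slicing with offsets, e.g. (reusing the arrays listed above):

for i, s in enumerate(slaves):
    sl = slice(offsets[i], offsets[i + 1])
    print("slave", s, "masters", masters[sl], "coeffs", coeffs[sl], "owners", owners[sl])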

If this is the correct way to use the ‘add_constraint’ method, then maybe there is a bug in dolfinx_mpc 0.9.3. I may then try to track down the problem in the dolfinx_mpc library or, if it works in v0.10, to backport the changes.

Many thanks in advance if you can take some time just to tell me whether, given the figure above (proc 1 dof ids), what I set sounds correct.

My bad

The documentation for the ‘add_constraint’ method of the ‘MultiPointConstraint’ Python class provides the following illustration for the offsets argument:

masters_of_owned_slave[i] = masters[offsets[i]:offsets[i+1]]

Arrrgghhh! Here, ‘owned’ does not mean a slave owned by the process, but a slave present on the process. I should have stuck to the definition of the ‘slaves’ argument, which talks about ‘all’ the slaves on the process.

Thus, in the pathological case described in my last post, adding the MPC definition on process 0 as well (which has a slave it does not own) makes things run smoothly …
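
In terms of the MWE, the fix amounts to dropping the ownership filter in set_slave_master, so that a process holding only a ghost of the slave also registers the constraint (sketch of the one-line change):

    # before: only slaves owned by this process were registered
    #   if len(slave) and dof_slave[0] < nbl:
    # after: register the slave on every process where it is present (owned or ghost)
    if len(slave):
        ...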

I hope I’m on the right track now!