Does the dofmap contain global indices or local indices?

Hello,

I want to ask a question about the dofmap in DOLFINX.

I’m thinking about implementing periodic BCs in DOLFINX myself, because I need some functions that are available in DOLFINX but not in DOLFINX_MPC.

The idea I have is to store the slave dofs as ghost nodes (like in the old FEniCS). Taking a 1D interval as an example, say we have the following dofs in a 1D mesh:
0 1 2 3 4 5 6 7 8 9 10

If I want f10 := f0, then I modify the index map so that the global indices become like this:

Global index:
0 1 2 3 4 5 6 7 8 9 (0)

The last dof is removed from the index map and defined as a ghost, whose global index is the same as that of its master dof.

Then I created the function space with the new index map and the dofmap of the original function space, which is:
0 1 1 2 2 3 3 4 4 5 5 6 6 7 7 8 8 9 9 10

The problem is that when I try to assemble the mass matrix, an error is reported:

New nonzero at (9,0) caused a malloc

But we know that the element (9,0) of the mass matrix should be nonzero when you have a PBC.

Then I modified the dofmap to:
0 1 1 2 2 3 3 4 4 5 5 6 6 7 7 8 8 9 9 0
In this case, I got the correct matrix and there is no error.

But this is quite strange to me, because I think the entries in functionspace()->dofmap()->map() should be local indices, not global indices, so in that case the dofmap shouldn’t need to be changed.
It is also interesting that when I run in parallel, the mass matrix I get is actually correct.
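
For reference, here is a quick sanity check (only a sketch; it assumes IndexMap.local_to_global is available in your version), showing that the dofmap stores process-local indices that only differ from the global numbering in parallel:

import numpy as np
from mpi4py import MPI
from dolfinx import fem, mesh

msh = mesh.create_unit_interval(MPI.COMM_WORLD, 10)
V = fem.functionspace(msh, ("Lagrange", 1))

# Process-local dof indices, one row per cell
local_dofs = V.dofmap.list
imap = V.dofmap.index_map
# Map them to global indices; in serial (no ghosts) the two coincide,
# run with mpirun -n 2 to see a difference
global_dofs = imap.local_to_global(local_dofs.flatten().astype(np.int32))
print(MPI.COMM_WORLD.rank, local_dofs.flatten(), global_dofs)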

Could anyone give me a suggestion on what I am doing? Is it wrong?

Thanks very much.

One would need a minimal reproducible example to help you, as it is not clear to me at what point you change the dofmap, and whether the changes are propagated through the assembly.

Hi, I have included some code to show what I’m doing; could you have a look at it?
In Python I don’t get the malloc error, but the result is not the same as what I got in C++; the mass matrix looks as if the row and column corresponding to the slave dof were simply removed.

from mpi4py import MPI
from petsc4py.PETSc import ScalarType  # type: ignore
from dolfinx.fem.forms import form as _create_form
# +
import numpy as np

import ufl
from dolfinx import fem, io, mesh, plot
from dolfinx import cpp as _cpp
from dolfinx.fem.petsc import LinearProblem
from ufl import ds, dx, grad, inner
import basix

msh = mesh.create_unit_interval(MPI.COMM_WORLD, 10)
V = fem.functionspace(msh, ("Lagrange", 1))

# Remove one local dof because the slave dof is defined as a ghost
new_local_size = V.dofmap.index_map.size_local - 1

ghosts = np.array([0], dtype=np.int64)    # the slave dof is mastered by the 0th dof
owners = np.array([0], dtype=np.int32)    # running on one process, so the owner is rank 0


print(V.tabulate_dof_coordinates())

# Create the new IndexMap
indexmap = _cpp.common.IndexMap(MPI.COMM_WORLD, new_local_size, ghosts, owners)


# Get the dofmap list, but do not change it
dofs = _cpp.graph.AdjacencyList_int32(V.dofmap.list)


# Create the DofMap object
dofmap = _cpp.fem.DofMap(V.dofmap.dof_layout, 
                         indexmap, V.dofmap.index_map_bs, 
                          dofs, V.dofmap.bs)

new_dofmap = fem.DofMap(dofmap)

# Create the function space
cppV = _cpp.fem.FunctionSpace_float64(msh._cpp_object, V.element, dofmap)
e = fem.ElementMetaData("Lagrange", 1)
ufl_e = basix.ufl.element(e.family, msh.basix_cell(), e.degree, shape=e.shape,
                                  symmetry=e.symmetry, gdim=msh.ufl_cell().geometric_dimension())

V_pbc = fem.FunctionSpaceBase(msh, ufl_e, cppV)



V = V_pbc
u = ufl.TrialFunction(V)
v = ufl.TestFunction(V)


x = ufl.SpatialCoordinate(msh)
f = 10 * ufl.sin(x[0])

a = u * v * dx
L = f * v * dx

problem = LinearProblem(a, L, bcs=[], petsc_options={"ksp_type": "preonly", "pc_type": "lu"})
uh = problem.solve()
print(uh.x.array)
problem.A.view()

Now I think the main issue is within the assembler, since I get the same function space in both interfaces, and I believe that part is correct.

When I impose the PBC, what I do is add the entries at the slave dof’s index to those at the master dof’s index (is that correct?), like this:

2 1 0 0 0 0 0 0 0 0 0                             4 1 0 0 0 0 0 0 0 0 1
1 4 1 0 0 0 0 0 0 0 0                             1 4 1 0 0 0 0 0 0 0 0 
0 1 4 1 0 0 0 0 0 0 0                             0 1 4 1 0 0 0 0 0 0 0
0 0 1 4 1 0 0 0 0 0 0                             0 0 1 4 1 0 0 0 0 0 0
0 0 0 1 4 1 0 0 0 0 0                             0 0 0 1 4 1 0 0 0 0 0
0 0 0 0 1 4 1 0 0 0 0         ==>                 0 0 0 0 1 4 1 0 0 0 0
0 0 0 0 0 1 4 1 0 0 0                             0 0 0 0 0 1 4 1 0 0 0 
0 0 0 0 0 0 1 4 1 0 0                             0 0 0 0 0 0 1 4 1 0 0
0 0 0 0 0 0 0 1 4 1 0                             0 0 0 0 0 0 0 1 4 1 0
0 0 0 0 0 0 0 0 1 4 1                             1 0 0 0 0 0 0 0 1 4 1
0 0 0 0 0 0 0 0 0 1 2
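
For concreteness, the folding I have in mind can be written as a small dense NumPy sketch (just my own illustration of the algebra, not DOLFINx code):

import numpy as np

# The 11x11 matrix on the left above
n = 11
A = np.diag(4.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
A[0, 0] = A[n - 1, n - 1] = 2.0

master, slave = 0, n - 1
A[master, :] += A[slave, :]      # add the slave row to the master row
A[:, master] += A[:, slave]      # add the slave column to the master column
A_pbc = np.delete(np.delete(A, slave, axis=0), slave, axis=1)
print(A_pbc)                     # 10x10 matrix with the extra couplings at (0, 9) and (9, 0)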

When I work in the C++ interface, the malloc error happens when the assembler tries to add values at (0,9) or (9,0), while in the Python interface this insertion is also blocked (but with no error message), so the mass matrix looks like this:

2 1 0 0 0 0 0 0 0 0                          
1 4 1 0 0 0 0 0 0 0                         
0 1 4 1 0 0 0 0 0 0             
0 0 1 4 1 0 0 0 0 0         
0 0 0 1 4 1 0 0 0 0             
0 0 0 0 1 4 1 0 0 0     
0 0 0 0 0 1 4 1 0 0       
0 0 0 0 0 0 1 4 1 0         
0 0 0 0 0 0 0 1 4 1                 
0 0 0 0 0 0 0 0 1 4                    

I believe the problem is within the PETSc matrix, because I remember that in DOLFINX_MPC the matrix is created in a different way, using the sparsity pattern given by the MPC.
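
One workaround I could try (only a sketch; it assumes a is the bilinear form from my snippet above, and it simply relaxes PETSc’s preallocation check instead of building an MPC-style sparsity pattern):

from petsc4py import PETSc
import dolfinx.fem.petsc as fem_petsc
from dolfinx import fem

a_form = fem.form(a)                   # compiled bilinear form on the modified space
A = fem_petsc.create_matrix(a_form)    # preallocated from the form's sparsity pattern
# Allow PETSc to allocate entries outside the preallocated pattern, e.g. (0, 9) and (9, 0)
A.setOption(PETSc.Mat.Option.NEW_NONZERO_ALLOCATION_ERR, False)
fem_petsc.assemble_matrix(A, a_form)
A.assemble()
A.view()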

I think you are misunderstanding what should be modified in your code.

Here you are only changing the index map, but not the dofmap.
You need to change both in a consistent way, which is quite some work.
Here is some code that creates a periodic BC for a 1D problem (with a single dof being replaced in the dofmap):

# A DOFMAP reduction code for periodic boundary conditions in 1D
# SPDX-License-Identifier: MIT
# Author: Jørgen S. Dokken

from petsc4py import PETSc
from mpi4py import MPI

import ufl
from dolfinx import fem, mesh, cpp, io
from dolfinx.fem.petsc import LinearProblem
from ufl import dx
import basix
import numpy as np

msh = mesh.create_unit_interval(
    MPI.COMM_WORLD, 10, ghost_mode=mesh.GhostMode.none)
V = fem.functionspace(msh, ("Lagrange", 1))


def left(x):
    return np.isclose(x[0], 0.0)


def right(x):
    return np.isclose(x[0], 1.0)


puppet = fem.locate_dofs_geometrical(V, left)
master = fem.locate_dofs_geometrical(V, right)
assert len(puppet) <= 2
assert len(master) <= 2

# Currently assuming single puppet
V_index_map = V.dofmap.index_map
has_master = np.full(msh.comm.size, -1, dtype=np.int64)
has_puppet = np.zeros(msh.comm.size, dtype=np.int32)
owned_puppet = False
if len(puppet) > 0 and puppet[0] < V_index_map.size_local:
    has_puppet[:] = 1
    owned_puppet = True

# Send the master dof to the slave (puppet) process
owned_master = False
if len(master) > 0 and master[0] < V_index_map.size_local:
    has_master[:] = V_index_map.local_range[0] + master[0]
    owned_master = True
puppet_to_master = np.zeros_like(has_puppet)
master_to_puppet = np.zeros_like(has_master)
msh.comm.Alltoall(has_puppet, puppet_to_master)
msh.comm.Alltoall(has_master, master_to_puppet)

puppet_ranks = np.flatnonzero(puppet_to_master)
master_indicator = (master_to_puppet > -1).astype(np.int32)
master_ranks = np.flatnonzero(master_indicator)
# If this process has a puppet, it should communicate with the master
V_index_map = V.dofmap.index_map

i_to_dest = V_index_map.index_to_dest_ranks()
src = np.unique(i_to_dest.array[:i_to_dest.offsets[V_index_map.size_local]])
dest = np.unique(V_index_map.owners)

local_range = V_index_map.local_range
local_size = V_index_map.size_local

# Map ghosts to a lower index to account for the removed dof
old_ghosts = V.dofmap.index_map.ghosts
new_ghosts = []
new_owners = []
for owner, ghost in zip(V_index_map.owners, old_ghosts):
    new_ghosts.append(ghost - sum(puppet_to_master[:owner]))
    new_owners.append(owner)

if not owned_master:
    assert not owned_master
    if (gd := (master_to_puppet[master_ranks[0]] - sum(puppet_to_master[:master_ranks[0]]))) not in new_ghosts:
        ghost_pos = len(new_ghosts)
        new_ghosts.append(gd)
        new_owners.append(master_ranks[0])
    else:
        ghost_pos = np.where(np.asarray(new_ghosts) == gd)[0][0]


if owned_puppet:
    local_size -= 1
new_index_map = cpp.common.IndexMap(
    MPI.COMM_WORLD, local_size, np.asarray(new_ghosts, dtype=np.int64), np.asarray(new_owners, dtype=np.int32))
old_dofs = V.dofmap.list.copy()
shape = old_dofs.shape
new_dofs = old_dofs.reshape(-1).copy()
pupp = -1
if len(puppet) > 0:
    pupp = puppet[0]

for i, dof in enumerate(old_dofs.reshape(-1)):
    if dof > pupp:
        new_dofs[i] = dof - int(pupp > -1)
    elif dof == pupp:
        if owned_master:
            new_dofs[i] = master[0] - int(master[0] > pupp)*int(pupp > -1)
        else:
            new_dofs[i] = local_size + ghost_pos

dofs = cpp.graph.AdjacencyList_int32(new_dofs.reshape(shape))
new_dofmap = cpp.fem.DofMap(V.dofmap.dof_layout,
                            new_index_map, V.dofmap.index_map_bs,
                            dofs, V.dofmap.bs)
# Create the function space
cppV = cpp.fem.FunctionSpace_float64(
    msh._cpp_object, V.element, new_dofmap, V.value_shape)
e = fem.ElementMetaData("Lagrange", 1)
ufl_e = basix.ufl.element(e.family, msh.basix_cell(), e.degree, shape=e.shape,
                          symmetry=e.symmetry)

V_pbc = fem.FunctionSpace(msh, ufl_e, cppV)


def solve_problem(V):
    u = ufl.TrialFunction(V)
    v = ufl.TestFunction(V)

    x = ufl.SpatialCoordinate(msh)
    f = x[0]*ufl.sin(x[0])

    a = ufl.dot(ufl.grad(u), ufl.grad(v)) * dx
    L = f * v * dx

    problem = LinearProblem(a, L, bcs=[], petsc_options={
                            "ksp_type": "preonly", "pc_type": "lu",
                            "pc_factor_mat_solver_type": "mumps",
                            "mat_mumps_icntl_24": 1,  # Option to support solving a singular matrix
                            "mat_mumps_icntl_25": 0  # Option to support solving a singular matrix
                            })
    nullspace = PETSc.NullSpace().create(constant=True)  # type: ignore
    PETSc.Mat.setNearNullSpace(problem.A, nullspace)

    uh = problem.solve()

    print(problem.A[:, :])
    print(problem.b[:])
    with io.XDMFFile(msh.comm, "test.xdmf", "w") as xdmf:
        xdmf.write_mesh(msh)
        xdmf.write_function(uh, 0.0)


solve_problem(V)
solve_problem(V_pbc)

yielding (non-periodic)

[[ 10. -10.   0.   0.   0.   0.   0.   0.   0.   0.   0.]
 [-10.  20. -10.   0.   0.   0.   0.   0.   0.   0.   0.]
 [  0. -10.  20. -10.   0.   0.   0.   0.   0.   0.   0.]
 [  0.   0. -10.  20. -10.   0.   0.   0.   0.   0.   0.]
 [  0.   0.   0. -10.  20. -10.   0.   0.   0.   0.   0.]
 [  0.   0.   0.   0. -10.  20. -10.   0.   0.   0.   0.]
 [  0.   0.   0.   0.   0. -10.  20. -10.   0.   0.   0.]
 [  0.   0.   0.   0.   0.   0. -10.  20. -10.   0.   0.]
 [  0.   0.   0.   0.   0.   0.   0. -10.  20. -10.   0.]
 [  0.   0.   0.   0.   0.   0.   0.   0. -10.  20. -10.]
 [  0.   0.   0.   0.   0.   0.   0.   0.   0. -10.  10.]]
[8.32777934e-05 1.16322600e-03 4.13331213e-03 9.01733729e-03
 1.57171653e-02 2.40974738e-02 3.39877899e-02 4.51850600e-02
 5.74567199e-02 7.05442245e-02 3.97830923e-02]

Periodic

[[ 20. -10.   0.   0.   0.   0.   0.   0.   0. -10.]
 [-10.  20. -10.   0.   0.   0.   0.   0.   0.   0.]
 [  0. -10.  20. -10.   0.   0.   0.   0.   0.   0.]
 [  0.   0. -10.  20. -10.   0.   0.   0.   0.   0.]
 [  0.   0.   0. -10.  20. -10.   0.   0.   0.   0.]
 [  0.   0.   0.   0. -10.  20. -10.   0.   0.   0.]
 [  0.   0.   0.   0.   0. -10.  20. -10.   0.   0.]
 [  0.   0.   0.   0.   0.   0. -10.  20. -10.   0.]
 [  0.   0.   0.   0.   0.   0.   0. -10.  20. -10.]
 [-10.   0.   0.   0.   0.   0.   0.   0. -10.  20.]]
[0.00116323 0.00413331 0.00901734 0.01571717 0.02409747 0.03398779
 0.04518506 0.05745672 0.07054422 0.03986637]

Thanks very much! I will read through the code!

Which version does this piece of code rely on? It seems an error is reported when I try to run it with v0.7.3 or v0.6.0.

I built this code on the nightly branch (main). There should only be a few things to change for 0.7.x compatibility.

Here is an MWE for v0.7.x:

# A DOFMAP reduction code for periodic boundary conditions in 1D
# SPDX-License-Identifier: MIT
# Author: Jørgen S. Dokken

from petsc4py import PETSc
from mpi4py import MPI

import ufl
from dolfinx import fem, mesh, cpp, io
from dolfinx.fem.petsc import LinearProblem
from ufl import dx
import basix
import numpy as np

msh = mesh.create_unit_interval(
    MPI.COMM_WORLD, 10, ghost_mode=mesh.GhostMode.none)
V = fem.functionspace(msh, ("Lagrange", 1))


def left(x):
    return np.isclose(x[0], 0.0)


def right(x):
    return np.isclose(x[0], 1.0)


puppet = fem.locate_dofs_geometrical(V, left)
master = fem.locate_dofs_geometrical(V, right)
assert len(puppet) <= 2
assert len(master) <= 2

# Currently assuming single puppet
V_index_map = V.dofmap.index_map
has_master = np.full(msh.comm.size, -1, dtype=np.int64)
has_puppet = np.zeros(msh.comm.size, dtype=np.int32)
owned_puppet = False
if len(puppet) > 0 and puppet[0] < V_index_map.size_local:
    has_puppet[:] = 1
    owned_puppet = True

# Send the master dof to the slave (puppet) process
owned_master = False
if len(master) > 0 and master[0] < V_index_map.size_local:
    has_master[:] = V_index_map.local_range[0] + master[0]
    owned_master = True
puppet_to_master = np.zeros_like(has_puppet)
master_to_puppet = np.zeros_like(has_master)
msh.comm.Alltoall(has_puppet, puppet_to_master)
msh.comm.Alltoall(has_master, master_to_puppet)

puppet_ranks = np.flatnonzero(puppet_to_master)
master_indicator = (master_to_puppet > -1).astype(np.int32)
master_ranks = np.flatnonzero(master_indicator)
# If this process has a puppet, it should communicate with the master
V_index_map = V.dofmap.index_map

i_to_dest = V_index_map.index_to_dest_ranks
src = np.unique(i_to_dest.array[:i_to_dest.offsets[V_index_map.size_local]])
dest = np.unique(V_index_map.owners)

local_range = V_index_map.local_range
local_size = V_index_map.size_local

# Map ghosts to a lower index to account for the removed dof
old_ghosts = V.dofmap.index_map.ghosts
new_ghosts = []
new_owners = []
for owner, ghost in zip(V_index_map.owners, old_ghosts):
    new_ghosts.append(ghost - sum(puppet_to_master[:owner]))
    new_owners.append(owner)

if not owned_master:
    assert not owned_master
    if (gd := (master_to_puppet[master_ranks[0]] - sum(puppet_to_master[:master_ranks[0]]))) not in new_ghosts:
        ghost_pos = len(new_ghosts)
        new_ghosts.append(gd)
        new_owners.append(master_ranks[0])
    else:
        ghost_pos = np.where(np.asarray(new_ghosts) == gd)[0][0]


if owned_puppet:
    local_size -= 1
new_index_map = cpp.common.IndexMap(
    MPI.COMM_WORLD, local_size, np.asarray(new_ghosts, dtype=np.int64), np.asarray(new_owners, dtype=np.int32))
old_dofs = V.dofmap.list.copy()
shape = old_dofs.shape
new_dofs = old_dofs.reshape(-1).copy()
pupp = -1
if len(puppet) > 0:
    pupp = puppet[0]

for i, dof in enumerate(old_dofs.reshape(-1)):
    if dof > pupp:
        new_dofs[i] = dof - int(pupp > -1)
    elif dof == pupp:
        if owned_master:
            new_dofs[i] = master[0] - int(master[0] > pupp)*int(pupp > -1)
        else:
            new_dofs[i] = local_size + ghost_pos

dofs = cpp.graph.AdjacencyList_int32(new_dofs.reshape(shape))
new_dofmap = cpp.fem.DofMap(V.dofmap.dof_layout,
                            new_index_map, V.dofmap.index_map_bs,
                            dofs, V.dofmap.bs)
# Create the function space
cppV = cpp.fem.FunctionSpace_float64(
    msh._cpp_object, V.element, new_dofmap)
e = fem.ElementMetaData("Lagrange", 1)
ufl_e = basix.ufl.element(e.family, msh.basix_cell(), e.degree, shape=e.shape,
                          symmetry=e.symmetry, gdim=msh.geometry.dim)

V_pbc = fem.FunctionSpaceBase(msh, ufl_e, cppV)


def solve_problem(V):
    u = ufl.TrialFunction(V)
    v = ufl.TestFunction(V)

    x = ufl.SpatialCoordinate(msh)
    f = x[0]*ufl.sin(x[0])

    a = ufl.dot(ufl.grad(u), ufl.grad(v)) * dx
    L = f * v * dx

    problem = LinearProblem(a, L, bcs=[], petsc_options={
                            "ksp_type": "preonly", "pc_type": "lu",
                            "pc_factor_mat_solver_type": "mumps",
                            "mat_mumps_icntl_24": 1,  # Option to support solving a singular matrix
                            "mat_mumps_icntl_25": 0  # Option to support solving a singular matrix
                            })
    nullspace = PETSc.NullSpace().create(constant=True)  # type: ignore
    PETSc.Mat.setNearNullSpace(problem.A, nullspace)

    uh = problem.solve()

    print(problem.A[:, :])
    print(problem.b[:])
    with io.XDMFFile(msh.comm, "test.xdmf", "w") as xdmf:
        xdmf.write_mesh(msh)
        xdmf.write_function(uh, 0.0)


solve_problem(V)
solve_problem(V_pbc)


Big thanks for your patience!

Now I see your point: in your example the left dof is the puppet, so when you remove it the whole local index map changes, and the dofmap needs to be changed as well. That is indeed something I overlooked before.

But it seems you are dealing with the index map in a different way from me. In the code you provided, the new_ghosts vector is actually empty; should we store the master dof there, or just let it be empty?

New_ghosts will only be non-empty in parallel. Ghosts exist to share information in parallel. See for instance

for a thorough description
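
As a minimal sketch, you can print each rank’s ghosts and their owners; on one process both arrays are empty, while on two or more processes the dofs shared between ranks show up:

from mpi4py import MPI
from dolfinx import fem, mesh

# Run with, e.g., mpirun -n 2 to see non-empty ghost arrays
msh = mesh.create_unit_interval(MPI.COMM_WORLD, 10)
V = fem.functionspace(msh, ("Lagrange", 1))
imap = V.dofmap.index_map
print(f"rank {msh.comm.rank}: owned={imap.size_local}, "
      f"ghosts={imap.ghosts}, owners={imap.owners}")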
