How to obtain dofs of FunctionSpace in dolfinx?

Hello all.

If this question sounds basic, it’s because it is.

I’m looking for the number of degrees of freedom of a given function space V. I expect a very large integer as output. My understanding from this documentation is that there should be a property of V called dim which will give me what I need.

However, with my dolfinx Docker installation in complex mode I only obtain AttributeError: 'FunctionSpace' object has no attribute 'dim'.

This arises even in the simplest test case:

import dolfinx
from mpi4py.MPI import COMM_WORLD
mesh=dolfinx.UnitSquareMesh(COMM_WORLD,2,2)
import ufl
FE=ufl.FiniteElement("Lagrange",mesh.ufl_cell(),1)
V=dolfinx.FunctionSpace(mesh,FE)
V.dim

I suppose there could be a way of hacking around this with locate_dofs_geometrical, but that sounds ugly. Any ideas?

Consider:

import dolfinx
from mpi4py.MPI import COMM_WORLD
mesh=dolfinx.UnitSquareMesh(COMM_WORLD,2,2)
import ufl
FE=ufl.FiniteElement("Lagrange",mesh.ufl_cell(),1)
V=dolfinx.FunctionSpace(mesh,FE)

num_dofs_local = (V.dofmap.index_map.size_local) * V.dofmap.index_map_bs
num_dofs_global = V.dofmap.index_map.size_global * V.dofmap.index_map_bs

print(f"Number of dofs (owned) by rank {COMM_WORLD.rank}: {num_dofs_local}")
if COMM_WORLD.rank == 0:
    print(f"Number of dofs global: {num_dofs_global}")

Yielding:

root@81aa206a058d:~/shared# mpirun -n 1 python3 num_dofs.py 
Number of dofs (owned) by rank 0: 9
Number of dofs global: 9

root@81aa206a058d:~/shared# mpirun -n 2 python3 num_dofs.py 
Number of dofs (owned) by rank 0: 6
Number of dofs global: 9
Number of dofs (owned) by rank 1: 3
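As a sanity check on the numbers above: a degree-1 Lagrange space on an N×N unit-square mesh has one dof per vertex, i.e. (N+1)² scalar dofs, and index_map_bs multiplies this for blocked (e.g. vector) spaces. A minimal pure-Python sketch (the helper name is mine, not a dolfinx API):

```python
def p1_dof_count(n: int, block_size: int = 1) -> int:
    """Expected global dof count for a degree-1 Lagrange space on an
    n x n unit-square mesh: one dof per vertex, times the block size."""
    return (n + 1) ** 2 * block_size

# Matches the output above for UnitSquareMesh(COMM_WORLD, 2, 2):
print(p1_dof_count(2))      # scalar P1 space -> 9
print(p1_dof_count(2, 2))   # a 2D vector P1 space would have twice as many
```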

Thank you kindly @dokken, responsive and helpful as ever!

I still think it is a shame that the documentation (in the top 10 Google results for “dolfinx Function Space degrees of freedom”) has it wrong.

Can you provide a link to the documentation which has the error?

I believe I have.

But for reference: https://fenicsproject.org/olddocs/dolfinx/dev/python/generated/dolfinx.function.html#dolfinx.function.FunctionSpace


Hm, didn’t know we had an olddocs for Dolfinx. @mscroggs will have a look at it.


It would be a good idea to delete the link to the old docs.

I have rewritten the script to work with dolfinx.__version__==0.8.0.

Any idea why print(v.x.array.size) works as I would expect for mpirun -n 1 ... but returns 8 for mpirun -n 2 ...?

from basix.ufl import element

import dolfinx
from dolfinx.fem import Function, functionspace

from mpi4py.MPI import COMM_WORLD
mesh = dolfinx.mesh.create_unit_square(COMM_WORLD, 2, 2)

s_cg1 = element("Lagrange", mesh.topology.cell_name(), 1)
V = functionspace(mesh, s_cg1)

num_dofs_local = (V.dofmap.index_map.size_local) * V.dofmap.index_map_bs
num_dofs_global = V.dofmap.index_map.size_global * V.dofmap.index_map_bs

print(f"Number of dofs (owned) by rank {COMM_WORLD.rank}: {num_dofs_local}")
if COMM_WORLD.rank == 0:
    print(f"Number of dofs global: {num_dofs_global}")

# <-- works as expected for "mpirun -n 1 ..." only
v = Function(V)
print(v.x.array.size)

What do you mean by “works”? What output do you get, and what do you expect?

I would expect the “size” of the function to be the number of dofs (i.e. 9 in this case). With better formatted output,
print(f'rank {COMM_WORLD.rank}: size of function v.x.array: {v.x.array.size}')
here is what I get:

mpirun -n 1 python3 lildolfinx.py  # the call of the script from the command line
Number of dofs (owned) by rank 0: 9
Number of dofs global: 9
rank 0: size of function v.x.array: 9
mpirun -n 2 python3 lildolfinx.py  # the call of the script from the command line
Number of dofs (owned) by rank 0: 4
Number of dofs global: 9
rank 0: size of function v.x.array: 8
Number of dofs (owned) by rank 1: 5
rank 1: size of function v.x.array: 8

When running with MPI, the mesh (and in turn the function space and function) is partitioned, so each process owns and has access to only a subset of the cells and dofs. The array v.x.array holds the locally owned dofs followed by the ghost dofs (dofs owned by a neighbouring process but needed locally), which is why its local size exceeds size_local * index_map_bs.
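To make the bookkeeping concrete, here is a pure-Python sketch using the rank-wise numbers from the output above (the ghost counts are inferred from the printed sizes, since the script does not print index_map.num_ghosts):

```python
# Numbers from the "mpirun -n 2" run above.
size_local = {0: 4, 1: 5}   # dofs owned by each rank
array_size = {0: 8, 1: 8}   # local size of v.x.array on each rank

for rank, owned in size_local.items():
    # v.x.array stores the owned dofs first, then the ghost dofs
    # (dofs owned by another process but read locally), so:
    num_ghosts = array_size[rank] - owned
    print(f"rank {rank}: {owned} owned + {num_ghosts} ghosts = {array_size[rank]}")

# Only the owned dofs add up to the global count; ghosts are duplicates.
assert sum(size_local.values()) == 9
```

If you need only the entries owned by the current rank, slice off the ghosts, e.g. v.x.array[:V.dofmap.index_map.size_local * V.dofmap.index_map_bs].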

This is for instance explained in:
- Section 6.2 of DOLFINx: The next generation FEniCS problem solving environment
- Parallel Computations with Dolfinx using MPI — MPI tutorial
