Problems implementing the linear elasticity tutorial on Google Colab

I’ve been trying to implement the linear elasticity tutorial on Google Colab using a FEM on Colab installation of dolfinx. So far, I have been unable to properly set a constant vector for the Dirichlet boundary condition of the problem.
I’ve been using the exact code from the tutorial, except that I use fem.functionspace instead of fem.VectorFunctionSpace.

This is the code I’ve been using so far:

from dolfinx import mesh, fem, default_scalar_type
from mpi4py import MPI
import ufl
import numpy as np
L = 1
W = 0.2

domain = mesh.create_box(MPI.COMM_WORLD, [np.array([0, 0, 0]), np.array([L, W, W])],
                         [20, 6, 6], cell_type=mesh.CellType.hexahedron)
V = fem.functionspace(domain, ("Lagrange", 1))

def clamped_boundary(x):
    return np.isclose(x[0], 0)

fdim = domain.topology.dim - 1
boundary_facets = mesh.locate_entities_boundary(domain, fdim, clamped_boundary)

u_D = np.array([0, 0, 0], dtype=default_scalar_type)

bc = fem.dirichletbc(u_D, fem.locate_dofs_topological(V, fdim, boundary_facets), V)

However, I always get the following error:

TypeError                                 Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/dolfinx/fem/ in dirichletbc(value, dofs, V)
    174         try:
--> 175             bc = bctype(_value, dofs, V)
    176         except TypeError:

TypeError: __init__(): incompatible function arguments. The following argument types are supported:
    1. __init__(self, g: ndarray[dtype=float64, writable=False, order='C'], dofs: ndarray[dtype=int32, writable=False, shape=(*), order='C'], V: dolfinx.cpp.fem.FunctionSpace_float64) -> None
    2. __init__(self, g: dolfinx.cpp.fem.Constant_float64, dofs: ndarray[dtype=int32, writable=False, shape=(*), order='C'], V: dolfinx.cpp.fem.FunctionSpace_float64) -> None
    3. __init__(self, g: dolfinx.cpp.fem.Function_float64, dofs: ndarray[dtype=int32, writable=False, shape=(*), order='C']) -> None
    4. __init__(self, g: dolfinx.cpp.fem.Function_float64, dofs: list[ndarray[dtype=int32, writable=False, shape=(*), order='C']], V: dolfinx.cpp.fem.FunctionSpace_float64) -> None

Invoked with types: dolfinx.cpp.fem.DirichletBC_float64, ndarray, ndarray, dolfinx.fem.function.FunctionSpace

During handling of the above exception, another exception occurred:

RuntimeError                              Traceback (most recent call last)
1 frames
/usr/local/lib/python3.10/dist-packages/dolfinx/fem/ in dirichletbc(value, dofs, V)
    175             bc = bctype(_value, dofs, V)
    176         except TypeError:
--> 177             bc = bctype(_value, dofs, V._cpp_object)
    178     else:
    179         bc = bctype(_value, dofs)

RuntimeError: Rank mis-match between Constant and function space in DirichletBC

This error can be bypassed by using u_D = default_scalar_type(0). However, that generates issues with the dimension of the boundary condition later on.
Can anybody help me understand what is going wrong with this implementation?

You likely need to provide a shape to your function space. Something like:

V = fem.functionspace(domain, ("Lagrange", 1, (domain.geometry.dim,)))

Thanks for the answer.
I’ve tried the suggested code and it solved my issue.
However, I do not quite understand why the shape is provided in the form (shape,).
Anyways, thank you very much!

In older versions of dolfinx, you had the notion of a vector finite element space, i.e. a space where a single basis function is repeated over multiple dimensions (such as a vector or tensor), with unique coefficients per dimension.

In the 0.8.0 code, UFL has removed the notion of such elements, and we therefore provide a triplet to define the finite element:

  1. The family, here Lagrange
  2. The degree of the element, here 1
  3. The shape, i.e. how the element should be repeated over multiple dimensions. The default is a scalar (no repetition); otherwise you pass in the shape of the appropriate tensor, in your case (3,) for a vector, which can be written generically as (domain.geometry.dim,) using the geometrical dimension of your mesh.
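
The “repeated basis” idea from point 3 can be sketched without dolfinx at all. Below is a minimal, purely illustrative numpy example (the hat-function basis and all names are hypothetical, not dolfinx API): a scalar P1 basis on an interval is evaluated once, and a vector element of shape (3,) simply reuses that same basis for each component, with a separate coefficient array per component.

```python
import numpy as np

# Hypothetical P1 scalar basis on [0, 1] with 3 nodes (hat functions).
nodes = np.array([0.0, 0.5, 1.0])

def scalar_basis(x):
    """Evaluate hat functions: returns array of shape (len(nodes), len(x))."""
    x = np.atleast_1d(x)
    return np.maximum(0.0, 1.0 - np.abs(x[None, :] - nodes[:, None]) / 0.5)

# A *vector* element of shape (3,) repeats the SAME scalar basis for each
# of the 3 components; only the coefficients differ per component.
coeffs = np.array([[0.0, 1.0, 0.0],   # dofs of component 0
                   [0.0, 2.0, 0.0],   # dofs of component 1
                   [0.0, 3.0, 0.0]])  # dofs of component 2

x = np.array([0.25])
u = coeffs @ scalar_basis(x)  # shape (3, 1): one value per vector component
print(u.ravel())              # -> [0.5 1.  1.5]
```

This is also why a scalar-valued space cannot hold a length-3 Dirichlet value: the space only stores one coefficient per basis function, hence the “Rank mis-match” error in the original post.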

Thanks, that makes it a lot clearer to me!