There are certain limitations when passing Constant inputs to the dirichletbc constructor (next time, please include the error message in your post as well).
A workaround is to use a dolfinx.fem.Function instead:
u0 = Function(V)
bc = dirichletbc(u0, dofs_facets)
(This has no impact on performance; the only cost is the memory of storing an array of size num_dofs.)
For later posts, please note that most errors can be reproduced on the built-in meshes, which makes it easier for others to run your code.
Below is an MWE of your problem (with both the working and the failing solution):
from mpi4py import MPI
from petsc4py.PETSc import ScalarType
import numpy as np
from dolfinx.fem import (Function, FunctionSpace, dirichletbc,
                         locate_dofs_topological)
from dolfinx.mesh import create_unit_cube, exterior_facet_indices

mesh = create_unit_cube(MPI.COMM_WORLD, 2, 2, 2)
V = FunctionSpace(mesh, ("N1curl", 1))

# Create facet-to-cell connectivity, required to determine boundary facets
mesh.topology.create_connectivity(mesh.topology.dim - 1, mesh.topology.dim)
facets = exterior_facet_indices(mesh.topology)
dofs_facets = locate_dofs_topological(V, mesh.topology.dim - 1, facets)

# Works: boundary value supplied as a Function
u0 = Function(V)
bc_working = dirichletbc(u0, dofs_facets)

# Fails: boundary value supplied as a plain array (same limitation as with Constant)
u1 = np.array((0,) * mesh.geometry.dim, dtype=ScalarType)
bc_failing = dirichletbc(u1, dofs_facets, V)
Thanks for the quick reply, and sorry about the missing error message; it simply slipped my mind.
I am just starting with FEniCSx, having only looked at it for a couple of days, so I am still on the learning curve. Next time I will try to generate the mesh through the Gmsh Python API, as done in some of the examples, so that it is easy to share. Since I am already used to Gmsh, I am guessing that should be a simple way to share mesh code.