N1curl element type and boundary conditions

Dear all,

I am just starting with the FEniCSx project, working on a 3D electromagnetic example with first-order Nédélec elements to benchmark against some other software.

So far, I have managed to read the mesh from Gmsh format, but I am stuck on imposing the boundary condition on the boundary of the full domain.

The domain consists of a coil (a closed ring, tag 1) and the surrounding air in the shape of a sphere (tag 2), whose external surface is the boundary of the domain (tag 4).

Here is the code thus far:

from dolfinx.io import gmshio
from mpi4py import MPI

from dolfinx.fem import (Expression, Function, FunctionSpace, VectorFunctionSpace)

mesh, cell_tags, facet_tags = gmshio.read_from_msh("benchmark1.msh", MPI.COMM_WORLD, 0, gdim=3)

# Output DOLFINx meshes to file
from dolfinx.io import XDMFFile

with XDMFFile(MPI.COMM_WORLD, "mesh_out.xdmf", "w") as xdmf:
    xdmf.write_mesh(mesh)

V = FunctionSpace(mesh, ("N1curl", 1))

from dolfinx.fem import (dirichletbc, locate_dofs_topological)
# Locate the degrees of freedom on the facets tagged 4 (the outer boundary)
dofs_facets = locate_dofs_topological(V, mesh.topology.dim-1, facet_tags.find(4))

from petsc4py.PETSc import ScalarType
import numpy as np
u0 = np.array((0,)*mesh.geometry.dim, dtype=ScalarType)
bc = dirichletbc(u0, dofs_facets, V)

The *.geo file is found here:

Shared on Google drive



There are certain limitations when passing constant inputs to the dirichletbc constructor (next time, please include the error message in your post as well).

A workaround is using a dolfinx.fem.Function instead:

u0 = Function(V)
bc = dirichletbc(u0, dofs_facets)

(This has no impact on performance, only the memory cost of storing an array of size num_dofs.)

For later posts, please note that most errors can be reproduced on the “built in” meshes, which would make it easier for others to use your code.
Following is an MWE of your problem (with both the working and the failing solution):

from petsc4py.PETSc import ScalarType
import numpy as np
from dolfinx.fem import (dirichletbc, locate_dofs_topological)
from mpi4py import MPI
from dolfinx.mesh import create_unit_cube, exterior_facet_indices
from dolfinx.fem import Function, FunctionSpace

mesh = create_unit_cube(MPI.COMM_WORLD, 2, 2, 2)
V = FunctionSpace(mesh, ("N1curl", 1))

# Create facet to cell connectivity required to determine boundary facets
mesh.topology.create_connectivity(mesh.topology.dim-1, mesh.topology.dim)
facets = exterior_facet_indices(mesh.topology)
dofs_facets = locate_dofs_topological(
    V, mesh.topology.dim-1, facets)

# Working
u0 = Function(V)
bc_working = dirichletbc(u0, dofs_facets)

# Failing
u1 = np.array((0,)*mesh.geometry.dim, dtype=ScalarType)
bc_failing = dirichletbc(u1, dofs_facets, V)

Thanks for the quick reply, and sorry about the missing error message; it simply slipped my mind.

I am just starting with FEniCSx, only a couple of days of looking at it. Next time I will try to generate the mesh through the Gmsh Python API, as done in some of the examples, so it is easier to share. Still on the learning curve, but since I am used to Gmsh, I am guessing that should be a simple way to share mesh code.