BoomerAMG in dolfinx

I’m trying to apply different preconditioners to a 3D indefinite Helmholtz problem in dolfinx; I have tried gamg and boomeramg.

While gamg runs fine (though not really accelerated), boomeramg does not seem to work: I get an iteration number of 0 and an iteration residual of -1.0 from these calls:

print('Iteration number:', solver.getIterationNumber())
print('Iteration residual:', solver.getResidualNorm())

I guess my dolfinx container has no hypre? How can I check whether hypre is installed in dolfinx? (I googled but only found how to do it in dolfin): dolfin/demo_curl-curl.py at f8cd6c0b2722bef82f1d429a9e838a4491b30154 · ryanfreckleton/dolfin · GitHub
And for a 3D time-harmonic EM Helmholtz problem, what is the best preconditioner so far (I know there is no perfect one for indefinite Helmholtz)? Do AMS and ADS in Hypre (they are meant for the definite case) work well?
Thanks.

First of all, you need to explain which container you have been using to run your code. It is also unclear from your post whether you are having trouble executing the code in DOLFINX_MPC or in your Helmholtz code.
If you have issues running the Helmholtz problem, I would strongly suggest you post a minimal example (for instance on a unit cube) that describes your solver setup and the error you are getting. In general the dolfinx/dolfinx container is configured with hypre, see: dolfinx/Dockerfile at main · FEniCS/dolfinx · GitHub (in particular line 260).
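If you want to check at runtime whether your PETSc build includes hypre, one simple sketch is to try to set a PC of type hypre; PETSc raises an error if it was not configured with hypre:

from petsc4py import PETSc

pc = PETSc.PC().create()
try:
    pc.setType("hypre")  # fails if PETSc was built without hypre
    print("hypre is available")
except PETSc.Error:
    print("hypre is not available in this PETSc build")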

Dear @dokken,
The previous MWE was too long, so I have produced a shorter version below. The dolfinx version is 0.3.1.0.
When I set boomeramg = 0 the MWE runs successfully, but boomeramg = 1 fails as described above (0 iterations, residual -1.0). The MWE:

from dolfinx.fem import (Function, FunctionSpace, assemble_scalar, assemble_vector, form,
                        locate_dofs_topological,dirichletbc,Constant)
from dolfinx.fem.petsc import LinearProblem
from dolfinx.io import XDMFFile
from dolfinx import mesh
from mpi4py import MPI
from petsc4py import PETSc
from petsc4py.PETSc import ScalarType
from ufl import (FiniteElement, FacetNormal, TestFunction, TrialFunction, dx, grad, inner, curl)
import numpy as np
import dolfinx

msh = mesh.create_unit_cube(MPI.COMM_WORLD, 8,8,8, mesh.CellType.tetrahedron)
tdim = msh.topology.dim
fdim = tdim - 1
msh.topology.create_connectivity(tdim-1, tdim)
V = FunctionSpace(msh,('N2curl',2))

k0 = 1 # 2*pi/wavelength
def u_D(x):
    values = np.zeros(x.shape, dtype=complex)
    values[0] = x[1]*x[1]
    values[1] = 0
    values[2] = -2*x[1]*x[1]
    return values  

def f_ex(x): #curlcurluD - k**2*uD
    values = np.zeros(x.shape, dtype=complex)
    values[0] = -2 - k0**2*x[1]*x[1]
    values[1] = 0
    values[2] = 4 + k0**2*x[1]*x[1]
    return values  

uD = Function(V)
uD.interpolate(u_D)
f = Function(V)
f.interpolate(f_ex)
    
u = TrialFunction(V)
v = TestFunction(V)
boundary_facets = np.flatnonzero(mesh.compute_boundary_facets(msh.topology))
boundary_dofs = locate_dofs_topological(V, fdim, boundary_facets)
bc = dirichletbc(uD, boundary_dofs)
bcs = [bc]

a = inner(curl(u), curl(v))*dx - k0**2*inner(u,v)*dx
L = inner(f,v)*dx
bilinear_form = dolfinx.fem.form(a)
linear_form = dolfinx.fem.form(L)
A = dolfinx.fem.petsc.assemble_matrix(bilinear_form, bcs=bcs)
A.assemble()
b = dolfinx.fem.petsc.assemble_vector(linear_form)
dolfinx.fem.petsc.apply_lifting(b, [bilinear_form], bcs=[bcs])
b.ghostUpdate(addv=PETSc.InsertMode.ADD_VALUES, mode=PETSc.ScatterMode.REVERSE)
dolfinx.fem.petsc.set_bc(b, bcs)

boomeramg = 0

nullspace = PETSc.NullSpace().create(constant=True)
opts = PETSc.Options()
opts['ksp_monitor_true_residual'] = None
opts['ksp_rtol'] = 1.0e-5
opts['ksp_view'] = None

if boomeramg:
    opts["ksp_type"] = "gmres"
    opts["pc_type"] = "hypre"
    opts['pc_hypre_type'] = 'boomeramg'
    opts["pc_hypre_boomeramg_max_iter"] = 1
    opts["pc_hypre_boomeramg_cycle_type"] = "v"
#     opts["pc_hypre_boomeramg_print_statistics"] = 1
else:
    opts["ksp_type"] = "gmres"
    opts["pc_type"] = "gamg"
    opts["pc_gamg_type"] = "agg"
    opts["pc_gamg_sym_graph"] = True
    opts["mg_levels_ksp_type"] = "richardson"
    opts["mg_levels_pc_type"] = "sor"

solver = PETSc.KSP().create(MPI.COMM_WORLD)
solver.setConvergenceHistory()
PETSc.Mat.setNearNullSpace(A, nullspace)
uh = Function(V) 
solver.setFromOptions()
solver.setOperators(A)
solver.solve(b, uh.vector)
uh.x.scatter_forward()
print('Iteration number:', solver.getIterationNumber())
print('Iteration residual:', solver.getResidualNorm())

If you add
print(f"Convergence reason: {solver.getConvergedReason()}")
to your code, you will see that it prints
Convergence reason: -9
which in PETSc terms (see KSPConvergedReason) means KSP_DIVERGED_NANORINF.
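If you want a readable name rather than the raw integer, you can compare against petsc4py's ConvergedReason constants, e.g.:

reason = solver.getConvergedReason()
# Negative values indicate divergence; -9 corresponds to DIVERGED_NANORINF
if reason == PETSc.KSP.ConvergedReason.DIVERGED_NANORINF:
    print("KSP diverged because a NaN or Inf was produced")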

I think the first issue is that Hypre does not support complex numbers, see: Summary of Sparse Linear Solvers Available In PETSc — PETSc 3.16.5 documentation
In real mode I think your problem executes without issues.
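You can check which scalar type your build uses directly from petsc4py:

import numpy as np
from petsc4py import PETSc

# complex128 in a complex build, float64 in a real build
print("PETSc scalar type:", PETSc.ScalarType)
print("Complex mode:", np.issubdtype(PETSc.ScalarType, np.complexfloating))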

Secondly, to use AMS, you need to supply some extra information:

from dolfinx.cpp.fem.petsc import discrete_gradient, interpolation_matrix
pc = solver.getPC()
V_CG = FunctionSpace(msh, ("CG", 1))._cpp_object
G = discrete_gradient(V_CG, V._cpp_object)
G.assemble()
pc.setHYPREDiscreteGradient(G)

cvec_0 = Function(V)
cvec_0.interpolate(lambda x: np.vstack((np.ones_like(x[0]),
                                        np.zeros_like(x[0]),
                                        np.zeros_like(x[0]))))
cvec_1 = Function(V)
cvec_1.interpolate(lambda x: np.vstack((np.zeros_like(x[0]),
                                        np.ones_like(x[0]),
                                        np.zeros_like(x[0]))))
cvec_2 = Function(V)
cvec_2.interpolate(lambda x: np.vstack((np.zeros_like(x[0]),
                                        np.zeros_like(x[0]),
                                        np.ones_like(x[0]))))
pc.setHYPRESetEdgeConstantVectors(cvec_0.vector,
                                  cvec_1.vector,
                                  cvec_2.vector)


# Attach interpolation operators to the preconditioner
# Only available in PETSc main post: https://gitlab.com/petsc/petsc/-/commit/ee0c4ed34c3ea92bb645a7c3669aa02417f19b28
# from ufl import VectorElement
# Vec_CG = FunctionSpace(msh, VectorElement("CG", msh.ufl_cell(), 2))
# Pi = interpolation_matrix(Vec_CG._cpp_object, V._cpp_object)
# Pi.assemble()
# pc.setHYPRESetInterpolations(msh.geometry.dim, None, None, Pi, None)
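For completeness, AMS itself is selected through the usual PETSc options. Here is a minimal sketch, assuming the opts and solver objects from your MWE (and that the hypre/ams type is set before the attach calls above):

opts["ksp_type"] = "gmres"
opts["pc_type"] = "hypre"
opts["pc_hypre_type"] = "ams"
solver.setFromOptions()
pc = solver.getPC()  # then attach G and the edge constant vectors as shown above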

In DOLFINx real mode this converges in 5 iterations, as opposed to boomeramg's 53 iterations (using N1curl of order 1).

N2curl of order 2 required changing the order of V_CG to 2; boomeramg then converges in 1295 iterations, while AMS takes 43.

This should be even faster if one uses the main branch of PETSc and pc.setHYPRESetInterpolations, as shown in the commented-out block at the end of my snippet.

The code I posted above uses very new DOLFINx functionality (merged into main last week).

Emm… sad to hear Hypre doesn’t support complex numbers, because my application needs an absorbing boundary condition, so complex numbers are unavoidable (and complex support is the main reason I switched from dolfin to dolfinx). Would the computation time increase significantly if I split the real and imaginary parts and rearranged the system into a 2x2 block matrix? The number of DOFs doubles, but the matrix becomes fully real.
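Concretely, I mean rewriting the complex system (A + iB)(x_re + i x_im) = b_re + i b_im as the real block system

$$
\begin{bmatrix} A & -B \\ B & A \end{bmatrix}
\begin{bmatrix} x_{\mathrm{re}} \\ x_{\mathrm{im}} \end{bmatrix}
=
\begin{bmatrix} b_{\mathrm{re}} \\ b_{\mathrm{im}} \end{bmatrix},
$$

where A and B are the real and imaginary parts of the assembled matrix.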

I thought AMS was only for the definite Helmholtz problem; good to know it improves performance for the indefinite problem as well. Thanks!