Set SolverType "superlu" instead of "superlu_dist" for a PETSc matrix of type SEQAIJ

Dear FEniCS community,

I would like to know what API is available for setting the solver type of an assembled PETSc matrix.

https://www.mcs.anl.gov/petsc/petsc4py-current/docs/apiref/petsc4py.PETSc.Mat.SolverType-class.html

Hence, I didn’t put together a MWE.

Context for such a need:

Case 1:
When assembling a matrix with block structure M = \begin{bmatrix} A & - D^{\top}\\ D & C + \tau B \end{bmatrix}, the default solver type is “superlu”, and GMRES with an ILU preconditioner works fine.

Case 2:
When using an implicit Runge–Kutta method, the assembled matrix has block structure N = \begin{bmatrix} \tau Q \otimes A & - \tau Q \otimes D^{\top}\\ I \otimes D & I \otimes C + \tau Q \otimes B \end{bmatrix}, the default solver type becomes “superlu_dist”, and GMRES with ILU throws the following error:

Traceback (most recent call last):
  File "/tmp/mystorage/scilibs/ellhypar/ellhypar/precond_implicit.py", line 327, in <module>
    diagnose()
  File "/tmp/mystorage/scilibs/ellhypar/ellhypar/precond_implicit.py", line 267, in diagnose
    for ind, (u_n, p_n, t) in enumerate(solutions):
  File "/tmp/mystorage/scilibs/ellhypar/ellhypar/ellpar/implicit.py", line 300, in solve_steps
    self.solver.solve(b_vec, dUP.vector)
  File "PETSc/KSP.pyx", line 403, in petsc4py.PETSc.KSP.solve
petsc4py.PETSc.Error: error code 92
[0] KSPSolve() at /usr/local/petsc/src/ksp/ksp/interface/itfunc.c:1078
[0] KSPSolve_Private() at /usr/local/petsc/src/ksp/ksp/interface/itfunc.c:843
[0] KSPSetUp() at /usr/local/petsc/src/ksp/ksp/interface/itfunc.c:407
[0] PCSetUp() at /usr/local/petsc/src/ksp/pc/interface/precon.c:990
[0] PCSetUp_ILU() at /usr/local/petsc/src/ksp/pc/impls/factor/ilu/ilu.c:127
[0] MatGetFactor() at /usr/local/petsc/src/mat/interface/matrix.c:4590
[0] See https://petsc.org/release/overview/linear_solve_table/ for possible LU and Cholesky solvers
[0] MatSolverType superlu_dist does not support factorization type ILU for matrix type seqaij
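For concreteness, the block structure of N can be reproduced with NumPy's kron and block (a standalone sketch; the small random matrices standing in for A, B, C, D and the RK coefficient matrix Q are illustrative stand-ins, not the actual assembled blocks):

```python
import numpy as np

rng = np.random.default_rng(0)
n, s, tau = 3, 2, 0.1  # dofs per block, RK stages, time step (illustrative)

# stand-ins for the assembled blocks and the RK coefficient matrix Q
A, B, C = (rng.standard_normal((n, n)) for _ in range(3))
D = rng.standard_normal((n, n))
Q = rng.standard_normal((s, s))
I = np.eye(s)

# N = [[tau Q (x) A, -tau Q (x) D^T], [I (x) D, I (x) C + tau Q (x) B]]
N = np.block([
    [tau * np.kron(Q, A), -tau * np.kron(Q, D.T)],
    [np.kron(I, D),        np.kron(I, C) + tau * np.kron(Q, B)],
])
```

The resulting matrix is 2·s·n × 2·s·n, so even a modest stage count multiplies the system size and changes the sparsity pattern relative to case 1.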

Thanks a lot.

There are many things that are not specified here, such as which version of FEniCS you are using (legacy or DOLFINx) and how you are currently setting the solver type.

The variational forms are defined using the necessary for-loops, as sketched below.

I have tried the following docker images:

  1. dokken92/dolfinx_custom:15072022
  2. dolfinx/lab:stable

# variational forms
from dolfinx.fem import form
from dolfinx.fem.petsc import assemble_matrix
from petsc4py import PETSc

block = 0.0
rhs = 0.0

for i in range(n_stages):
    for j in range(n_stages):
        block += ...
        rhs += ...

A_form = form(block)
L_form = form(rhs)

A_mat = assemble_matrix(A_form, bcs=bcs)
A_mat.assemble()

# Solver specification

sol_opts = {
    "ksp_type" : "gmres",
    "ksp_rtol" : 1e-6,
    "ksp_atol" : 1e-9,
    "ksp_max_it" : 10000,
    "pc_type" : "ilu"
}


opts = PETSc.Options()
for key, opt in sol_opts.items():
    opts[key] = opt

# solver
solver = PETSc.KSP().create(mesh.comm)
solver.setOperators(A_mat)
solver.setFromOptions()  # pick up the options set above

Nowhere do I specify whether it should be ‘superlu’ or ‘superlu_dist’.
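One way to see which factorization package PETSc picked by default is to pass -ksp_view on the command line (or call solver.view() after a solve); the PC section of the output reports the solver type used for the factorization. A hedged sketch, with a hypothetical script name:

```shell
# prints the full KSP/PC configuration after setup, including which
# factorization package (e.g. superlu or superlu_dist) was selected
python precond_implicit.py -ksp_view
```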

The problem works fine when using a direct solver or when using GMRES with Jacobi preconditioner.

A tremendous thanks to Jorgen Dokken for patiently answering many questions.

See:
https://petsc.org/main/manualpages/Mat/MATSOLVERSUPERLU/

Use -pc_type lu -pc_factor_mat_solver_type superlu to use this direct solver

and:
https://petsc.org/main/manualpages/Mat/MATSOLVERSUPERLU_DIST/

Note that superlu_dist cannot be used for ilu

Adding pc_factor_mat_solver_type to the options resolved it:
sol_opts = {
    "ksp_type" : "gmres",
    "ksp_rtol" : 1e-6,
    "ksp_atol" : 1e-9,
    "ksp_max_it" : 10000,
    "pc_type" : "ilu",
    "pc_factor_mat_solver_type" : "superlu"
}
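Equivalently, the same choice can be made purely from the command line through PETSc's options database, without changing the script (a config fragment; the script name is hypothetical):

```shell
python precond_implicit.py -ksp_type gmres -ksp_rtol 1e-6 -ksp_atol 1e-9 \
    -ksp_max_it 10000 -pc_type ilu -pc_factor_mat_solver_type superlu
```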