Hi! I implemented an iterative solver for my simulation as follows. Running the iterative method on a single core works just fine; however, it fails when I try to do an MPI run:
if SOLVER_CONFIG == "LU":
    # Direct LU solve for the Newton linear systems
    problem = CahnHilliardEquation(a, F)
    solver = NewtonSolver()
    solver.parameters["linear_solver"] = "lu"
    # solver.parameters["linear_solver"] = "gmres"
    # solver.parameters["preconditioner"] = "ilu"
    solver.parameters["convergence_criterion"] = "residual"
    solver.parameters["relative_tolerance"] = 1e-6

elif SOLVER_CONFIG == "KRYLOV":
    # Newton solver with a PETSc Krylov linear solver (GMRES + ILU),
    # configured through PETSc options
    class CustomSolver(NewtonSolver):
        def __init__(self):
            NewtonSolver.__init__(self, mesh.mpi_comm(),
                                  PETScKrylovSolver(), PETScFactory.instance())

        def solver_setup(self, A, P, problem, iteration):
            self.linear_solver().set_operator(A)

            PETScOptions.set("ksp_type", "gmres")
            PETScOptions.set("ksp_monitor")
            PETScOptions.set("pc_type", "ilu")
            PETScOptions.set("ksp_rtol", "1.0e-6")
            PETScOptions.set("ksp_atol", "1.0e-10")

            self.linear_solver().set_from_options()

    problem = CahnHilliardEquation(a, F)
    solver = CustomSolver()
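In case it matters, the solver is then driven in the usual time-stepping fashion. A minimal sketch of that part of foo.py is below (u, u0, T, and dt are the current/previous mixed functions, end time, and time step, defined elsewhere in the script and not shown here):

# Minimal sketch of the time loop that calls the solver configured above
# (u, u0, T, dt come from the rest of foo.py)
t = 0.0
while t < T:
    t += dt
    u0.vector()[:] = u.vector()        # store previous time step
    solver.solve(problem, u.vector())  # Newton solve for the current step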
This is how I execute the MPI run:
mpirun -np 12 python foo.py 2>outputerr.txt
The output I get:
Error: Unable to successfully call PETSc function 'KSPSolve'.
*** Reason: PETSc error code is: 92 (See http://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html for possible LU and Cholesky solvers).
*** Where: This error was encountered inside /home/conda/feedstock_root/build_artifacts/fenics-pkgs_1566991881845/work/dolfin/dolfin/la/PETScKrylovSolver.cpp.
*** Process: 11
***
*** DOLFIN version: 2019.1.0
*** Git changeset: a97cbd7b6bf8089d364d61584f529e6e36d85845
I have this problem with both the FEniCS build from conda and the Ocellaris Singularity container, in case that is useful info.