Error on the cluster: PETScLUSolver.cpp

Hi,

I encountered an error that only appears on the cluster.
The error does not appear when I run the same code on my laptop with mpirun, so I do not know how to fix the problem.

The relevant part of the error message is as follows.

Error:   Unable to solve linear system using PETSc LU solver.
Reason:  No suitable solver for parallel LU found. Consider configuring PETSc with MUMPS or SuperLU_dist.
Where:   This error was encountered inside PETScLUSolver.cpp.
cpp.la.solve(A, function.vector(), b, solver_type, preconditioner_type)
DOLFIN version: 2019.1.0

I tried to make a minimal working example but did not manage to reproduce the error.
What I’m trying to do can be summarized as follows.

from fenics import *

mesh = UnitCubeMesh(10, 10, 10)
h = CellDiameter(mesh)
DG = FunctionSpace(mesh, 'DG', 0)
characteristic_edge_length = project(h, DG)

The above example works fine both on my laptop and on the cluster, but the code that throws the error is doing essentially the same thing.
The error seems to come from project(h, DG).

I also tested the above code with a mesh I created with Gmsh, and that did not cause the error either, so I do not think the problem comes from the mesh.

I appreciate any insight!

Best,

Looks like MUMPS and/or SuperLU_dist are missing on your cluster. Can you provide the output of ldd libdolfin.so or ldd libpetsc.so (provided you know their locations)?

It would also help to know how FEniCS (and particularly PETSc) was installed on your cluster.
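If you don't know where the libraries live, something like this usually finds them (the ldd path below is a placeholder; replace it with whatever the first commands print):

```shell
# Sketch: locate the shared libraries via the Python modules that the
# cluster environment actually loads, then inspect them with ldd.
python3 -c "import dolfin; print(dolfin.__file__)"
python3 -c "import petsc4py; print(petsc4py.__file__)"

# With the install prefix known, check which solver backends PETSc links
# (replace the path with the one printed above):
# ldd /path/to/lib/libpetsc.so | grep -Ei 'mumps|superlu'
```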


Hi @nate ,

Thank you for your quick reply.

To be honest, I have no idea where libdolfin.so or libpetsc.so are located, or how FEniCS was installed either… I need to ask the person who installed FEniCS.

It turned out I was loading the wrong FEniCS installation… The problem is now fixed.
