PETSc error when calling Newton solver

Dear FEniCSx/Dolfinx community,

I’m facing a problem when trying to run FEniCSx examples on a Mac (M1). There is no problem when using the provided Docker containers. However, I also built Dolfinx (along with ufl, basix and ffcx) directly on my system, and although the build itself seemed to succeed, running examples results in the error:

PETSC ERROR: Logging has not been enabled.
You might have forgotten to call PetscInitialize().
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 56.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.

This happens when running, for instance, the provided Poisson and Hyperelasticity examples (Implementation — FEniCSx tutorial & Hyperelasticity — FEniCSx tutorial).

The error appears when calling the line

problem = fem.petsc.LinearProblem(a, L, bcs=[bc], petsc_options={"ksp_type": "preonly", "pc_type": "lu"})

(in poisson.py) and

solver = nls.petsc.NewtonSolver(domain.comm, problem)

(in hyperelasticity.py).

It makes no difference whether I run on a single core or with mpirun.
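For context, the failing part of the Poisson example boils down to roughly the following (just a sketch in the 0.4-style API, not the exact tutorial code); the abort happens at the LinearProblem call:

import numpy as np
from mpi4py import MPI
from petsc4py.PETSc import ScalarType
import ufl
from dolfinx import fem, mesh

# Unit square mesh with a P1 Lagrange space
domain = mesh.create_unit_square(MPI.COMM_WORLD, 8, 8)
V = fem.FunctionSpace(domain, ("CG", 1))

# Homogeneous Dirichlet condition on the whole boundary (located geometrically)
def on_boundary(x):
    return np.isclose(x[0], 0.0) | np.isclose(x[0], 1.0) | np.isclose(x[1], 0.0) | np.isclose(x[1], 1.0)

dofs = fem.locate_dofs_geometrical(V, on_boundary)
bc = fem.dirichletbc(ScalarType(0), dofs, V)

# Variational forms for -Laplace(u) = f with f = 1
u = ufl.TrialFunction(V)
v = ufl.TestFunction(V)
f = fem.Constant(domain, ScalarType(1.0))
a = ufl.inner(ufl.grad(u), ufl.grad(v)) * ufl.dx
L = ufl.inner(f, v) * ufl.dx

# This is the call that aborts with the "Logging has not been enabled" error
problem = fem.petsc.LinearProblem(a, L, bcs=[bc],
                                  petsc_options={"ksp_type": "preonly", "pc_type": "lu"})
uh = problem.solve()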

Any help would be greatly appreciated!

Best,
Christian

Could you try adding:

import petsc4py
petsc4py.init()

at the top of your script?
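If I remember correctly, it only has an effect when it runs before petsc4py.PETSc (and therefore before dolfinx) is imported; you can also forward the command line, roughly like this:

import sys
import petsc4py
petsc4py.init(sys.argv)  # must come before "from petsc4py import PETSc" or any dolfinx import

from dolfinx import fem, mesh, nls  # DOLFINx imports only afterwards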

Thanks for the feedback.

Adding these two lines doesn’t make a difference; I still get the same error.

That is really weird, as this should initialize PETSc.

I’ve had issues with PETSc on my M1 mac which are similar to this, but only when running in parallel.

Are you able to run any of the PETSc examples (assuming you also compiled PETSc from source)?
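As a quick check that is independent of dolfinx, something along these lines (plain petsc4py only, just a sketch) should run without the logging error if the PETSc/petsc4py installation itself is healthy:

import sys
import petsc4py
petsc4py.init(sys.argv)
from petsc4py import PETSc

print("PETSc version:", PETSc.Sys.getVersion())

# Assemble a tiny diagonal matrix and solve A x = b with the default KSP
A = PETSc.Mat().createAIJ([2, 2])
A.setUp()
A.setValue(0, 0, 2.0)
A.setValue(1, 1, 4.0)
A.assemble()

b = A.createVecLeft()
b.set(1.0)
x = A.createVecRight()

ksp = PETSc.KSP().create()
ksp.setOperators(A)
ksp.solve(b, x)
print("solution:", x.getArray())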

I installed PETSc via homebrew and did not compile from source. Is that a potential problem?

Oh, I thought the homebrew formula for petsc didn’t support hdf5-mpi, which is a requirement for dolfinx.

Indeed, there was an issue with Homebrew-PETSc and hdf5-mpi, which is why I installed hdf5-mpi after PETSc. So that might be the problem. I’ll build PETSc from source and report back whether that solves it.


Unfortunately, building PETSc from source does not seem to make a difference; the error remains.

@nate: The PETSc tests are mostly successful; 26 out of 10921 failed. Did you follow any specific procedure on your M1 Mac to get it running? Are you still facing problems with parallel runs?

Update: I completely reinstalled and rebuilt FEniCSx (ufl, basix, ffcx, dolfinx; current version 0.4.1.dev0) and built PETSc from source. The error remains. It is now:

PETSC ERROR: Logging has not been enabled.
You might have forgotten to call PetscInitialize().
The EXACT line numbers in the error traceback are not available.
Instead the line number of the start of the function is given.
[-1] #1 PetscLogGetStageLog() at /Users/chris/Documents/Fenics/petsc/src/sys/logging/utils/stagelog.c:29
[-1] #2 PetscClassIdRegister() at /Users/chris/Documents/Fenics/petsc/src/sys/logging/plog.c:2322
[-1] #3 MatMFFDInitializePackage() at /Users/chris/Documents/Fenics/petsc/src/mat/impls/mffd/mffd.c:42
[-1] #4 MatInitializePackage() at /Users/chris/Documents/Fenics/petsc/src/mat/interface/dlregismat.c:160
[-1] #5 MatCreate() at /Users/chris/Documents/Fenics/petsc/src/mat/utils/gcreate.c:74
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 56.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------

This is the output for elasticity.py and poisson.py.

A similar error message appears for other scripts:

PETSC ERROR: Logging has not been enabled.
You might have forgotten to call PetscInitialize().
The EXACT line numbers in the error traceback are not available.
Instead the line number of the start of the function is given.
[-1] #1 PetscLogGetStageLog() at /Users/chris/Documents/Fenics/petsc/src/sys/logging/utils/stagelog.c:29
[-1] #2 PetscClassIdRegister() at /Users/chris/Documents/Fenics/petsc/src/sys/logging/plog.c:2322
[-1] #3 KSPInitializePackage() at /Users/chris/Documents/Fenics/petsc/src/ksp/ksp/interface/dlregisksp.c:160
[-1] #4 KSPCreate() at /Users/chris/Documents/Fenics/petsc/src/ksp/ksp/interface/itcreate.c:674
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 56.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------

I have no idea whether this is an issue with dolfinx, PETSc, MPI, or a combination of all of them. Any help is appreciated.
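In case it helps with diagnosing this, a quick way to see which builds dolfinx, petsc4py and mpi4py actually pick up at runtime would be something like the following (it only prints versions and paths; sketch):

import mpi4py
import petsc4py
import dolfinx
from mpi4py import MPI
from petsc4py import PETSc

print("dolfinx :", dolfinx.__version__, dolfinx.__file__)
print("petsc4py:", petsc4py.__file__)
print("PETSc   :", PETSc.Sys.getVersion())
print("mpi4py  :", mpi4py.__file__)
print("MPI     :", MPI.Get_library_version())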

Excuse me, I have the same problem. I wonder if you managed to solve it in the meantime.
Thanks in advance.