I’m facing a problem when trying to run example problems on a Mac (M1). There is no issue when running the provided Docker containers. However, I also built DOLFINx (along with UFL, Basix and FFCx) directly on my system, and the build itself seemed to succeed. Running any problem then fails with the error:
PETSC ERROR: Logging has not been enabled.
You might have forgotten to call PetscInitialize().
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 56.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
Indeed there was an issue with Homebrew PETSc and hdf5-mpi, which is why I installed hdf5-mpi after PETSc. So that might be the problem. I’ll build PETSc from source then and will report back whether that solves it.
Unfortunately, building PETSc from source does not seem to make a difference, the error remains.
@nate: The PETSc tests are mostly successful, 26/10921 failed. Did you follow any specific procedure on your M1 Mac to make it work? Are you still facing problems with parallel runs?
Update: I completely reinstalled and rebuilt FEniCS (UFL, Basix, FFCx, DOLFINx; current version 0.4.1.dev0) and built PETSc from source. The error remains.
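Since Homebrew and source-built libraries can easily end up mixed on macOS, one quick stdlib-only check is to ask the dynamic loader which copies it would resolve (a sketch; the library names below are assumptions for this kind of setup, not taken from my actual configuration):

```python
import ctypes.util

# Ask the dynamic loader which shared library it resolves for each name.
# A Homebrew path (e.g. under /opt/homebrew) showing up alongside a
# source-built PETSc would hint that two installations are being mixed.
for name in ("petsc", "hdf5", "mpi"):
    print(f"{name}: {ctypes.util.find_library(name)}")
```

If both a Homebrew and a source-built copy appear, adjusting `DYLD_LIBRARY_PATH` (or uninstalling the Homebrew copy) before rebuilding might be worth trying.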
The output is:
PETSC ERROR: Logging has not been enabled.
You might have forgotten to call PetscInitialize().
The EXACT line numbers in the error traceback are not available.
Instead the line number of the start of the function is given.
[-1] #1 PetscLogGetStageLog() at /Users/chris/Documents/Fenics/petsc/src/sys/logging/utils/stagelog.c:29
[-1] #2 PetscClassIdRegister() at /Users/chris/Documents/Fenics/petsc/src/sys/logging/plog.c:2322
[-1] #3 MatMFFDInitializePackage() at /Users/chris/Documents/Fenics/petsc/src/mat/impls/mffd/mffd.c:42
[-1] #4 MatInitializePackage() at /Users/chris/Documents/Fenics/petsc/src/mat/interface/dlregismat.c:160
[-1] #5 MatCreate() at /Users/chris/Documents/Fenics/petsc/src/mat/utils/gcreate.c:74
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 56.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
for elasticity.py and poisson.py.
A similar error message appears for other scripts:
PETSC ERROR: Logging has not been enabled.
You might have forgotten to call PetscInitialize().
The EXACT line numbers in the error traceback are not available.
Instead the line number of the start of the function is given.
[-1] #1 PetscLogGetStageLog() at /Users/chris/Documents/Fenics/petsc/src/sys/logging/utils/stagelog.c:29
[-1] #2 PetscClassIdRegister() at /Users/chris/Documents/Fenics/petsc/src/sys/logging/plog.c:2322
[-1] #3 KSPInitializePackage() at /Users/chris/Documents/Fenics/petsc/src/ksp/ksp/interface/dlregisksp.c:160
[-1] #4 KSPCreate() at /Users/chris/Documents/Fenics/petsc/src/ksp/ksp/interface/itcreate.c:674
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 56.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
I have no idea whether this is an issue with DOLFINx, PETSc, MPI, or all of them together. Any help is appreciated.
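In case it helps narrow this down: since the traceback fails inside MatInitializePackage()/KSPInitializePackage(), a minimal check of whether PETSc initializes at all outside DOLFINx might isolate the layer at fault (a sketch; it assumes petsc4py is installed and linked against the same PETSc build):

```python
# Sketch: verify that PETSc itself initializes outside DOLFINx.
# Assumes petsc4py is built against the PETSc installation in question.
try:
    from petsc4py import PETSc
except ImportError:
    PETSc = None
    print("petsc4py not available")

if PETSc is not None:
    # Importing petsc4py calls PetscInitialize(); creating a Mat exercises
    # the same MatInitializePackage() path shown in the traceback above.
    print("PETSc version:", PETSc.Sys.getVersion())
    A = PETSc.Mat().createAIJ([4, 4])
    A.setUp()
    print("Mat created without the logging error")
```

If this standalone snippet already aborts with "Logging has not been enabled", the problem sits in the PETSc/petsc4py/MPI stack rather than in DOLFINx itself.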
Hello! I have the same problem. When I run the code with "python xx.py", there is no problem. However, when I use "nohup python xx.py > output.log 2>&1 &", this error occurs.