FEniCS & Singularity - saving data with mpirun

Unless I am missing something obvious, this is likely an issue specific to the cluster where you are trying to run this. I am assuming you are using a job scheduler such as PBS or SLURM to submit jobs on your cluster and that you are loading the appropriate modules (an MPI module in this case), because

mpiexec -n 8 singularity exec <>.simg python3 demo.py

should work out of the box. Given that something like

singularity exec <>.simg python3 demo.py

works fine, the MPI installation on your cluster is the first thing to check. The host MPI needs to be compatible with the MPI inside the container (ideally the same implementation and a matching version), since the host `mpiexec` is what launches the containerized processes. You may also want to read "Singularity and MPI applications" in the Singularity 3.3 documentation, alongside seeking help from someone at your cluster's help desk.
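As a concrete starting point, a SLURM batch script for this could look like the sketch below. The module name, partition, and resource limits are assumptions and will differ on your cluster; `module avail` or your help desk can tell you the right ones.

```shell
#!/bin/bash
#SBATCH --job-name=fenics-mpi
#SBATCH --ntasks=8
#SBATCH --time=00:30:00

# Cluster-specific: load the MPI stack the container was built against
# (hypothetical module name - check `module avail` on your system)
module load openmpi

# Sanity check: confirm which host MPI will be launching the ranks
mpiexec --version

# The host mpiexec starts 8 container instances, one per MPI rank
mpiexec -n 8 singularity exec <>.simg python3 demo.py
```

If the ranks all report rank 0 or the job hangs, that usually points to a mismatch between the host MPI and the MPI inside the image.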