I have tried the change you suggested, but I suspect the MPI wrapper from FEniCS and the MPI from mpi4py are conflicting. I made a small Python file just to see at what point it fails: mesh creation succeeds, but it fails again when creating the function space.
from dolfin import *
from mpi4py import MPI as mpi
comm = mpi.COMM_WORLD
ip = comm.Get_rank()
print("Hello I am process", ip)
mesh = UnitIntervalMesh(MPI.comm_world,10)
V = VectorFunctionSpace(mesh, "CG", 1)
Hello I am process 0
Calling FFC just-in-time (JIT) compiler, this may take some time.
Calling FFC just-in-time (JIT) compiler, this may take some time.
Hello I am process 1
Traceback (most recent call last):
  File "trial_elasticity.py", line 10, in <module>
    V = VectorFunctionSpace(mesh, "CG", 1)
  File "/scinet/niagara/software/2018a/opt/intel-2018.2-intelmpi-2018.2/fenics/2017.2.0/lib/python3.6/site-packages/dolfin/function/functionspace.py", line 222, in VectorFunctionSpace
    return FunctionSpace(mesh, element, constrained_domain=constrained_domain)
  File "/scinet/niagara/software/2018a/opt/intel-2018.2-intelmpi-2018.2/fenics/2017.2.0/lib/python3.6/site-packages/dolfin/function/functionspace.py", line 31, in __init__
    self._init_from_ufl(*args, **kwargs)
  File "/scinet/niagara/software/2018a/opt/intel-2018.2-intelmpi-2018.2/fenics/2017.2.0/lib/python3.6/site-packages/dolfin/function/functionspace.py", line 43, in _init_from_ufl
    mpi_comm=mesh.mpi_comm())
  File "/scinet/niagara/software/2018a/opt/intel-2018.2-intelmpi-2018.2/fenics/2017.2.0/lib/python3.6/site-packages/dolfin/jit/jit.py", line 82, in mpi_jit
    raise RuntimeError(error_msg)
RuntimeError: [Errno 30] Read-only file system: '/home/a/asarkar/sudhipv/.cache/dijitso'
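For what it's worth, the RuntimeError itself reports a read-only file system for the dijitso JIT cache under your home directory, rather than an MPI clash; this is common on clusters whose compute nodes mount `$HOME` read-only. A possible workaround (assuming `DIJITSO_CACHE_DIR` is honoured by your dijitso version, and that `$SCRATCH` is writable on your nodes, as on Niagara) is to redirect the cache before launching:

```shell
# The FFC JIT stage fails because dijitso tries to write its cache to
# ~/.cache/dijitso, which sits on a read-only file system here.
# Point the cache at writable storage instead ($SCRATCH assumed;
# falls back to /tmp if it is unset).
export DIJITSO_CACHE_DIR="${SCRATCH:-/tmp}/.cache/dijitso"
mkdir -p "$DIJITSO_CACHE_DIR"
```

Setting this in your job script before `mpirun` would let every rank write its JIT artifacts somewhere writable.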
Could you try printing the rank too?
To get the rank you would need to call MPI from mpi4py, right?
But I am confused about which MPI to use for the mesh constructor call: the MPI imported from mpi4py, or the MPI wrapper from dolfin?
In my case I almost always use the MPI wrapper in dolfin and haven't run into any issues. You can use the built-in function has_mpi4py() to check whether dolfin was correctly configured with mpi4py, in case you built it from source. For most standard installations this should not be an issue.
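To make that concrete, here is a minimal sketch of the dolfin-only route, with no explicit mpi4py import (assuming the same pybind11-style `MPI.comm_world` API used in the snippet above; both it and mpi4py's `COMM_WORLD` wrap the same underlying `MPI_COMM_WORLD`):

```python
from dolfin import MPI, UnitIntervalMesh, VectorFunctionSpace, has_mpi4py

# Optional sanity check: was dolfin configured with mpi4py support?
print("dolfin has mpi4py:", has_mpi4py())

comm = MPI.comm_world                    # dolfin's wrapper around MPI_COMM_WORLD
print("Hello, I am process", MPI.rank(comm))

mesh = UnitIntervalMesh(comm, 10)        # same communicator for the mesh ...
V = VectorFunctionSpace(mesh, "CG", 1)   # ... and the function space built on it
```

Since the mesh carries its communicator (`mesh.mpi_comm()`), passing dolfin's wrapper once at mesh creation is enough; everything built on the mesh reuses it.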