Error with IO in parallel

I’ve written code inspired by this example on performing a parametric study in parallel, where one uses MPI_COMM_SELF and MPI_COMM_WORLD. My code works with one process but never seems to terminate with two or more. I am sure it’s not a memory issue, because the mesh is quite small. The program does terminate when I take the write and read part out of the first “if” block, so it seems to be FEniCS related. Does anyone know why this does not run (or terminate) with mpirun -np 2 python parallel.py?

Here is the executable code:

# vim: tabstop=8 expandtab shiftwidth=4 softtabstop=4
from __future__ import print_function
from fenics import *
import matplotlib.pyplot as plt
import random
from dolfin import *
import numpy as np
import pickle
import time
# import index_handler
import sys
from mpi4py import MPI

N = 100
deg_fe = 1

mesh = UnitSquareMesh(N, N)
#Define periodic boundary conditions.
class PeriodicBoundary(SubDomain):
    #Left edge and bottom edge are target domain
    def inside(self, x, on_boundary):
        return ((near(x[0], 0) or near(x[1], 0.)) and not (near(x[0], 1.) or near(x[1], 1.))) and on_boundary

    def map(self, x, y):
        #Map top right corner to origin.
        if near(x[0], 1.) and near(x[1], 1.):
            y[0] = x[0] - 1.
            y[1] = x[1] - 1.
        #Map right edge of domain to left edge.
        elif near(x[0], 1.):
            y[0] = x[0] - 1.
            y[1] = x[1]
        #Map top edge of domain to bottom edge.
        elif near(x[1], 1.):
            y[0] = x[0]
            y[1] = x[1] - 1.
        #Points not on the right or top boundary must still be mapped;
        #send them outside the domain so they are not matched.
        else:
            y[0] = -1.
            y[1] = -1.


# Define mesh, function space, and BC
pbc = PeriodicBoundary()
V = FunctionSpace(mesh, 'CG', deg_fe, constrained_domain=pbc) #Function space for FE solution

directorypath = "results/"
comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

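# Rank 0 writes the interpolated field to HDF5 and reads it back;
# this is the part that hangs with 2+ processes.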
if rank == 0:
    f = Expression("10*exp(-(pow(x[0] - 0.5, 2) + pow(x[1] - 0.5, 2)) / 0.02)", degree=1)
    fFile = HDF5File(MPI.COMM_SELF,directorypath+"potential.h5","w")
    fFile.write(interpolate(f, V), "/V")
    fFile.close()

    fFile = HDF5File(MPI.COMM_SELF,directorypath+"potential.h5","r")
    W = Function(V)
    fFile.read(W, "/V")
    fFile.close()
    
comm.barrier()

def print_rank(rank):
    print(rank)
    
print_rank(rank)

Your mesh does not have an MPI comm input, as described in the post you are linking to. Since the mesh (and therefore V) is distributed over MPI_COMM_WORLD, the HDF5 write and read on it are collective operations that every rank must join, so calling them from rank 0 alone deadlocks. That is,

mesh = UnitSquareMesh(N, N)

should be

mesh = UnitSquareMesh(MPI.COMM_SELF, N, N)

Please note that you are shadowing the MPI class from dolfin (whose communicator would be MPI.comm_self), as you are importing mpi4py after the wildcard import.
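To illustrate the shadowing (a minimal sketch, separate from the fix itself):

from dolfin import *      # brings in dolfin's MPI class: MPI.comm_world, MPI.comm_self
from mpi4py import MPI    # rebinds the name MPI to the mpi4py module

import dolfin
print(MPI.COMM_SELF)         # mpi4py communicator (uppercase attributes)
print(dolfin.MPI.comm_self)  # dolfin's communicator (lowercase attributes)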

With this modification, you should be able to interpolate on a single rank.

Hello @dokken, thanks for your reply. I added MPI.COMM_SELF to the mesh initialization (with and without the dolfin MPI name shadowed) and the problem persisted. Do you have other ideas?

How about:



from fenics import *
import matplotlib.pyplot as plt
import random
from dolfin import *
import numpy as np
import pickle
import time
# import index_handler
import sys
from mpi4py import MPI

#Define periodic boundary conditions.
class PeriodicBoundary(SubDomain):
    #Left edge and bottom edge are target domain
    def inside(self, x, on_boundary):
        return ((near(x[0], 0) or near(x[1], 0.)) and not (near(x[0], 1.) or near(x[1], 1.))) and on_boundary

    def map(self, x, y):
        #Map top right corner to origin.
        if near(x[0], 1.) and near(x[1], 1.):
            y[0] = x[0] - 1.
            y[1] = x[1] - 1.
        #Map right edge of domain to left edge.
        elif near(x[0], 1.):
            y[0] = x[0] - 1.
            y[1] = x[1]
        #Map top edge of domain to bottom edge.
        elif near(x[1], 1.):
            y[0] = x[0]
            y[1] = x[1] - 1.
        #Points not on the right or top boundary must still be mapped;
        #send them outside the domain so they are not matched.
        else:
            y[0] = -1.
            y[1] = -1.


# Define mesh, function space, and BC

N = 100
deg_fe = 1

mesh = UnitSquareMesh(MPI.COMM_SELF, N, N)
pbc = PeriodicBoundary()
V = FunctionSpace(mesh, 'CG', deg_fe, constrained_domain=pbc) #Function space for FE solution

directorypath = "results/"
comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

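# Each rank now owns its own COMM_SELF copy of the mesh, so the I/O below
# is collective only over rank 0's local communicator and cannot deadlock.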
if rank == 0:
    f = Expression("10*exp(-(pow(x[0] - 0.5, 2) + pow(x[1] - 0.5, 2)) / 0.02)", mpi_comm=MPI.COMM_SELF, degree=1)
    q = interpolate(f, V)

    with XDMFFile(MPI.COMM_SELF, directorypath + "potential.xdmf") as fFile:
        fFile.write_checkpoint(q, "q", 0.0)


W = Function(V)

if rank == 0:
    with XDMFFile(MPI.COMM_SELF,directorypath+"potential.xdmf") as fFile:
        fFile.read_checkpoint(W, "q", 0)
    
comm.barrier()

def print_rank(rank):
    print(rank)
    
print_rank(rank)

Hello again, this seems to work! Thank you very much. Now I just have to alter my previous programs to work with XDMF files, hehe!
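For reference, the read side in a later script would look roughly like this (a minimal sketch; in a real program you would rebuild the function space exactly as above, including the constrained_domain=pbc periodic constraint):

from dolfin import *
from mpi4py import MPI

N = 100
deg_fe = 1

# Rebuild the same COMM_SELF mesh the checkpoint was written from.
mesh = UnitSquareMesh(MPI.COMM_SELF, N, N)
# Pass constrained_domain=pbc here if the writer used a periodic space.
V = FunctionSpace(mesh, 'CG', deg_fe)

# read_checkpoint restores the stored field into W.
W = Function(V)
with XDMFFile(MPI.COMM_SELF, "results/potential.xdmf") as fFile:
    fFile.read_checkpoint(W, "q", 0)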