I’m using FEniCS 2019.1.0. To test the parallel functionality of legacy FEniCS, I’m running a slightly modified version of the heat equation tutorial program (I know the program is trivial, that’s not the point).
When I run the program serially, XDMF files are indeed written to the directory ./figures as expected. However, if I run the program with mpirun -np N python heat.py (where N is a positive integer), nothing is written to file and the program hangs until I manually halt it.
What’s going on here? The program is pasted below:
from __future__ import print_function
from fenics import *
import time
import os
comm = MPI.comm_world
rank = MPI.rank(comm)
workdir = os.getcwd()
outDirName = os.path.join(workdir, "figures")
os.makedirs(outDirName, exist_ok=True)
T = 2.0 # final time
num_steps = 50 # number of time steps
dt = T / num_steps # time step size
# Create mesh and define function space
nx = ny = 30
mesh = RectangleMesh(comm, Point(-2, -2), Point(2, 2), nx, ny)
V = FunctionSpace(mesh, 'P', 1)
# Define boundary condition
def boundary(x, on_boundary):
    return on_boundary
bc = DirichletBC(V, Constant(0), boundary)
# Define initial value
u_0 = Expression('exp(-a*pow(x[0], 2) - a*pow(x[1], 2))',
                 degree=2, a=5)
u_n = interpolate(u_0, V)
# Define variational problem
u = TrialFunction(V)
v = TestFunction(V)
f = Constant(0)
F = u*v*dx + dt*dot(grad(u), grad(v))*dx - (u_n + dt*f)*v*dx
a, L = lhs(F), rhs(F)
# Create XDMF file for saving solution
solnfile = XDMFFile(os.path.join(outDirName, "soln.xdmf"))
# Time-stepping
u = Function(V)
t = 0
for n in range(num_steps):
    # Update current time
    t += dt
    # Compute solution
    solve(a == L, u, bc)
    # Write to file
    if rank == 0:
        solnfile.write(u_n, t)
    # Update previous solution
    u_n.assign(u)
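For what it's worth, my working guess (which I haven't confirmed) is that XDMFFile.write is a collective operation, so guarding it with `if rank == 0:` would leave the other ranks never reaching the call, and rank 0 blocking forever. The shape of that hang can be simulated in plain Python, with a thread per "rank" and a barrier standing in for the collective call (the rank numbering and the barrier are stand-ins for illustration, not FEniCS or MPI API):

```python
import threading

NUM_RANKS = 4
# A collective call is like a barrier: every rank must reach it.
barrier = threading.Barrier(NUM_RANKS)

results = {}

def rank_task(rank):
    # Mimic guarding the collective call with `if rank == 0:`.
    if rank == 0:
        try:
            # Blocks waiting for the other ranks, which never arrive.
            # A real MPI collective would hang forever; here we time out.
            barrier.wait(timeout=1.0)
            results[rank] = "wrote file"
        except threading.BrokenBarrierError:
            results[rank] = "deadlocked"
    else:
        # The other ranks skip the "write" and run ahead.
        results[rank] = "skipped write"

threads = [threading.Thread(target=rank_task, args=(r,)) for r in range(NUM_RANKS)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(results)  # rank 0 gets "deadlocked"; ranks 1-3 "skipped write"
```

If that guess is right, the hang would come from rank 0 sitting inside solnfile.write while the remaining ranks loop ahead without it, but I'd appreciate confirmation of whether the write really is collective.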