I am trying to do mesh smoothing in parallel with MPI for a fluid-structure interaction solver. A similar issue has been raised here for a single process. However, if I run the following code under MPI (let's say mpirun.mpich -n 8 …), each process smooths its own mesh partition separately, which inevitably breaks the entire mesh. That pretty much makes sense.
from dolfin import *

mesh2 = Mesh()
hdf = HDF5File(mesh2.mpi_comm(), "file.h5_2", "r")
hdf.read(mesh2, "/mesh", False)
p = File("mesh2.pvd")
p << mesh2

# MPI rank
mpi_comm = mpi_comm_world()
MPI.barrier(mpi_comm)
my_rank = MPI.rank(mpi_comm)

mesh2.smooth(20)
ps = File("mesh2_smooth.pvd")
ps << mesh2
But is there a way to smooth the entire mesh as a whole, keeping the partitioning the same before and after smoothing in MPI? Can someone please help me with this? Also, I am curious whether the same mesh smoothing works in 3D, because I know mesh.smooth_boundary() doesn't work in 3D.
Note: to create file.h5_2, run the script below in serial to generate the mesh.
from dolfin import *

mesh2 = RectangleMesh(Point(3., 0.4875), Point(4., 0.5125), 100, 3)
hdf = HDF5File(mesh2.mpi_comm(), "file.h5_2", "w")
hdf.write(mesh2, "/mesh")
hdf.close()
One workaround might be to define “smoothing” in terms of a variational problem, since you can solve those in parallel and in 3D. The following is a quick example of an ad hoc mesh-smoothing formulation:
from dolfin import *
from mshr import *
from ufl import Jacobian
from matplotlib import pyplot as plt
# Unstructured mesh:
mesh = generate_mesh(Rectangle(Point(0,0),Point(1,1)),32)
x = SpatialCoordinate(mesh)
dx_dxi = Jacobian(mesh)
J = abs(det(dx_dxi))
def grad_xi(f):
    # Chain rule: df/dxi = (df/dx)(dx/dxi), i.e., the gradient of f
    # with respect to reference-element coordinates.
    df_dx = grad(f)
    return dot(df_dx, dx_dxi)
# Linear FE space for geometry:
V = VectorFunctionSpace(mesh,"CG",1)
# "Mess up" the mesh:
ALE.move(mesh,
         project(as_vector([Constant(0.2)*sin(pi*x[0])*sin(pi*x[1]),
                            Constant(0)]), V))
# Variational problem for smoothing:
u = TrialFunction(V)
v = TestFunction(V)
x_smoothed = x + u
# Parameter to control smoothing effect;
# larger parameter => stronger smoothing:
smoothing_strength = Constant(1e2)
# Penalize both large changes of position over a single element and deviation
# from the unsmoothed position, with smoothing_strength deciding the relative
# weights.
dx = dx(metadata={"quadrature_degree":2})
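# The 1/J factor weights each element's contribution by the inverse of its
# volume (effectively integrating over reference coordinates), so small
# elements are not under-weighted in the smoothing.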
res = (smoothing_strength*inner(grad_xi(x_smoothed), grad_xi(v))
       + dot(u, v))*(1/J)*dx
# Solve for displacement to deform mesh; this does require a linear solve,
# but it should be efficient to approximate using an iterative solver
# with a loose tolerance.
uh = Function(V)
solve(lhs(res) == rhs(res), uh,
      bcs=[DirichletBC(V, Constant((0, 0)), "on_boundary")],
      solver_parameters={"linear_solver": "cg",
                         "krylov_solver": {"relative_tolerance": 1e-2}})
# Plot messed-up mesh:
plot(mesh)
plt.show()
# Deform by displacement from variational problem and plot smoothed mesh:
ALE.move(mesh,uh)
plot(mesh)
plt.show()
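To run this example in parallel, the serial matplotlib plotting would need to be replaced with file output; a minimal sketch of that change (the output filenames are just placeholders):
# Instead of the plot()/plt.show() calls above, write the mesh to
# ParaView-readable files before and after deforming it:
File("mesh_messed_up.pvd") << mesh
ALE.move(mesh, uh)
File("mesh_smoothed.pvd") << mesh
Since ALE.move only shifts vertex coordinates, the partitioning is the same before and after smoothing.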
Sorry for the late reply @kamensky, I was traveling for a while.
First of all, this looks like an excellent solution to my problem. I should be able to implement it successfully for my FSI problem. I will check and get back soon. Thanks a lot.
Tejas
Hi Kamensky,
Would a similar technique work to obtain ‘smoothing’ of a function’s values if this function is computed in parallel? If not, any suggestion as to how to go about it?
Also, how could the example be adapted to run with dolfinx?
Many thanks for your help!
The most common method of PDE-based smoothing of a given function is to solve a reaction–diffusion-type equation (also called a positive-definite Helmholtz equation), with the original data as a source term. In weak form, this reads: find \tilde{u} such that, for all v,
\int_\Omega \ell^2\nabla \tilde{u}\cdot\nabla v\,d\Omega + \int_\Omega \tilde{u}v\,d\Omega = \int_\Omega uv\,d\Omega\text{ ,}
where \tilde{u} is the (unknown) smoothed version of the original (given) function u, and \ell is a length scale, which controls the strength of smoothing (roughly: how large of a distance features get smoothed out over). Solving this linear PDE would be a pretty standard use-case of DOLFINx.
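As a minimal DOLFINx sketch of that formulation (assuming a v0.7-era DOLFINx API, which has changed between releases; the mesh, the data u, and the length-scale value are placeholders):
from mpi4py import MPI
import numpy as np
import ufl
from dolfinx import default_scalar_type, fem, mesh
from dolfinx.fem.petsc import LinearProblem

# Distributed unit-square mesh and a P1 space:
domain = mesh.create_unit_square(MPI.COMM_WORLD, 64, 64)
V = fem.functionspace(domain, ("Lagrange", 1))

# Original (given) data u; here just an illustrative oscillatory function:
u = fem.Function(V)
u.interpolate(lambda x: np.sin(4*np.pi*x[0])*np.sin(4*np.pi*x[1]))

# Length scale \ell controlling the smoothing strength:
ell = fem.Constant(domain, default_scalar_type(0.05))

# Weak form: \ell^2 (grad u_tilde, grad v) + (u_tilde, v) = (u, v)
u_tilde = ufl.TrialFunction(V)
v = ufl.TestFunction(V)
a = (ell**2*ufl.inner(ufl.grad(u_tilde), ufl.grad(v))
     + ufl.inner(u_tilde, v))*ufl.dx
L = ufl.inner(u, v)*ufl.dx

# The system is symmetric positive-definite, so CG is a natural choice:
problem = LinearProblem(a, L, petsc_options={"ksp_type": "cg",
                                             "pc_type": "jacobi"})
u_smoothed = problem.solve()
Since the linear system is assembled and solved globally, this works the same way in serial and in parallel.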
Thank you very much! This is working well.