I’ve been working on a problem that involves a wave equation and got some unexpected results. As the mesh is refined (the spikes first appear at exactly nx = 27 and ny = 27), the solution develops a bunch of sharp spikes that look like noise. I’ve stripped the code down so that there is no longer a time component, and I believe it has something to do with the Laplacian.
Here is SlaterModel.ufl:
element = FiniteElement("Lagrange", "triangle", 1)
u = TrialFunction(element)
v = TestFunction(element)
p = Coefficient(element)
a = inner(grad(v), grad(u)) * dx
L = (p*v)*dx
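For reference, this is just the standard weak form of the Poisson problem $-\Delta u = p$: find $u$ such that $\int_\Omega \nabla u \cdot \nabla v \,\mathrm{d}x = \int_\Omega p \, v \,\mathrm{d}x$ for all test functions $v$, so for a smooth forcing I’d expect a smooth solution.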
And here is my cpp file:
#include <dolfin.h>
#include "SlaterModel.h"

class Forcing : public dolfin::Expression
{
  void eval(dolfin::Array<double>& values, const dolfin::Array<double>& x) const
  {
    values[0] = std::cos(M_PI * x[0]) * std::cos(M_PI * x[1]);
  }
};

class DirichletBoundary : public dolfin::SubDomain
{
  bool inside(const dolfin::Array<double>& x, bool on_boundary) const
  {
    return on_boundary;
  }
};

int main()
{
  int nx = 100;
  int ny = 100;

  auto mesh = std::make_shared<dolfin::RectangleMesh>(
      dolfin::Point(-1.0/2, -1.0/2, 0), dolfin::Point(1.0/2, 1.0/2, 0), nx, ny);
  auto V = std::make_shared<SlaterModel::FunctionSpace>(mesh);

  SlaterModel::BilinearForm a(V, V);
  SlaterModel::LinearForm L(V);

  dolfin::DirichletBC bcs(V, std::make_shared<dolfin::Constant>(0.0),
                          std::make_shared<DirichletBoundary>());

  auto p = std::make_shared<Forcing>();
  L.p = p;

  dolfin::File file("test.pvd");
  dolfin::Function u(V);
  dolfin::solve(a == L, u, bcs);
  file << u;
}
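For what it’s worth, with $p(x, y) = \cos(\pi x)\cos(\pi y)$ and homogeneous Dirichlet conditions on $\Omega = (-\tfrac{1}{2}, \tfrac{1}{2})^2$, the exact solution should be $u(x, y) = \cos(\pi x)\cos(\pi y)/(2\pi^2)$ (it vanishes on the boundary and $-\Delta$ of it reproduces $p$), so the expected result is perfectly smooth.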
Here is an image of the problem when nx = ny = 100:
I’m at my wits’ end. I have a Python implementation that works fine, but I need to migrate to C++ for reasons. I have a feeling it’s a simple error I’m making, but after three days I think it’s best I ask for a second set of eyes!
This is puzzling. I don’t see anything obviously wrong. How are you compiling and running the C++ version? Against a native build on your machine, or in a Docker container?
If you’ve not tried the latter, it may help to work out whether this is a weird allocation/initialisation issue caused by the compiler you’re using.
Another thing to try is moving the std::make_shared instantiations out of the DirichletBC constructor and up to main’s level, to make sure the memory isn’t getting accidentally freed; see the sketch below. Although I doubt that’s the problem.
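Just to sketch what I mean (zero and boundary are arbitrary names; this would replace the single DirichletBC line inside main):

  // Hold the boundary condition's arguments in named shared_ptrs so their
  // lifetime clearly extends past the solve.
  auto zero = std::make_shared<dolfin::Constant>(0.0);
  auto boundary = std::make_shared<DirichletBoundary>();
  dolfin::DirichletBC bcs(V, zero, boundary);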
I’m compiling and linking against the headers and libraries supplied by a conda environment, the same conda environment in which the Python version works fine. I’ve noticed that the spikes change depending on what optimization level I tell the compiler to use. That is disturbing… possibly some memory issue? I’m going to try stripping away all the other “.cpp” files that get linked into the final executable to see if that changes anything.
And I modified my code to make sure there is no accidental freeing of memory. Still the same issue.