MPI hangs after building 2019.1.0 from source

Dear all,

After updating from 2018.2.0.dev0 to 2019.1.0 I am experiencing problems when running in parallel. When I run the C++ elasticity demo with
mpirun -np 2 demo_elasticity
it does not complete. Stepping back to a single process, I get the correct results only every third try or so. I am running on CentOS 7; PETSc 3.12 is compiled against OpenMPI 3.1, as is dolfin. I already tried limiting OpenMP threading with
export OMP_NUM_THREADS=1
without success.
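
To check whether the MPI installation itself is at fault, independent of DOLFIN and PETSc, a minimal MPI program can be used; here is a sketch (the file name mpi_check.cpp is just an example):

// mpi_check.cpp: minimal sanity check that MPI collectives work,
// independently of DOLFIN/PETSc.
#include <mpi.h>
#include <cstdio>

int main(int argc, char* argv[]) {
    MPI_Init(&argc, &argv);

    int rank = 0, size = 0;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    // If this collective hangs, the MPI build itself is broken,
    // not DOLFIN.
    int local = rank, sum = 0;
    MPI_Allreduce(&local, &sum, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD);

    std::printf("rank %d of %d, allreduce sum = %d\n", rank, size, sum);

    MPI_Finalize();
    return 0;
}

Compile and run with
mpicxx mpi_check.cpp -o mpi_check && mpirun -np 2 ./mpi_check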

Any help would be much appreciated.
Kind regards,
Christopher

I solved this one myself. It seemed to be related to the version of OpenMPI I was using (3.1.3, compiled with gcc 4.8). After building OpenMPI 4 with gcc 8.3.1, everything now runs as expected.
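
In case someone hits the same problem: which MPI and compiler are actually being picked up can be verified with the standard Open MPI tools, for example
mpirun --version
mpicxx --version
ompi_info | grep "C compiler"
ldd demo_elasticity | grep mpi
The last command (assuming the demo binary is in the current directory) shows which libmpi the demo is linked against, which helps catch a stale OpenMPI install still on the library path.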