Out of memory despite free memory

The following minimal example

#!/usr/bin/env python3
  
from fenics import *

degree = 2
Nv = 16
Nz = 32
Lz = 3.0
vmax = 3.0

# 3d phasespace z, vpara, vperp
phasespace = BoxMesh(Point(0., -vmax, 0.), Point(Lz, vmax, vmax), Nz, 2*Nv, Nv)

P2 = FiniteElement('P', 'tetrahedron', degree)
element = MixedElement([P2, P2, P2]) #fe fi Ez

# define function space for phasespace density
V3d = FunctionSpace(phasespace, element)

# define source terms
zero3d = Expression(('0.', '0.', '0.'), degree=degree)

f_n = project(zero3d,V3d)     # at t^n
fe_n, fi_n, Ez_n = split(f_n)

fails with

$ /usr/bin/time ./project_3d.py

UMFPACK V5.7.8 (Nov 9, 2018): ERROR: out of memory

Traceback (most recent call last):
  File "./project_3d.py", line 23, in <module>
    f_n = project(zero3d,V3d)     # at t^n
  File "/usr/lib/python3/dist-packages/dolfin/fem/projection.py", line 138, in project
    cpp.la.solve(A, function.vector(), b, solver_type, preconditioner_type)
RuntimeError: 

*** -------------------------------------------------------------------------
*** DOLFIN encountered an error. If you are not able to resolve this issue
*** using the information listed below, you can ask for help at
***
***     fenics-support@googlegroups.com
***
*** Remember to include the error message listed below and, if possible,
*** include a *minimal* running example to reproduce the error.
***
*** -------------------------------------------------------------------------
*** Error:   Unable to successfully call PETSc function 'KSPSolve'.
*** Reason:  PETSc error code is: 76 (Error in external library).
*** Where:   This error was encountered inside /build/dolfin-i1VjBN/dolfin-2018.1.0.post1/dolfin/la/PETScKrylovSolver.cpp.
*** Process: 0
*** 
*** DOLFIN version: 2018.1.0
*** Git changeset:  unknown
*** -------------------------------------------------------------------------

Command exited with non-zero status 1
57.58user 1.13system 1:09.69elapsed 84%CPU (0avgtext+0avgdata 2868436maxresident)k

despite the process using only 2.9 GB on a system with 45 GB of unused memory. Any hints?

This is related to using UMFPACK, the default direct solver.
See: UMFPACK error: out of memory despite system having free memory - #2 by plugged
and Out of memory error - #4 by dokken
for how to change the solver when using project.
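As a minimal sketch of what those threads suggest (assuming legacy DOLFIN 2018.1, where `project` accepts `solver_type` and `preconditioner_type` keyword arguments), continuing from the example above:

```python
from fenics import *

# Continuing from the minimal example above (zero3d, V3d as defined there):
# project() in legacy DOLFIN takes solver_type / preconditioner_type
# keywords, so the direct solver can be swapped without changing
# the variational problem itself.
f_n = project(zero3d, V3d, solver_type='mumps')  # MUMPS instead of UMFPACK
```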

Except that changing the solver doesn't fix it. Using mumps I get

*** Error:   Unable to solve linear system using PETSc Krylov solver.
*** Reason:  Solution failed to converge in 0 iterations (PETSc reason DIVERGED_PCSETUP_FAILED, residual norm ||r|| = 0.000000e+00).
*** Where:   This error was encountered inside PETScKrylovSolver.cpp.

Using superlu I get the friendly error message "Not enough memory to perform factorization."

Using superlu_dist I get Command terminated by signal 9 after the process grabbed 61GB of memory.

And using default or umfpack I get the previously described error message.

Multifrontal direct solvers are notorious for their memory consumption on 3D meshes: the fill-in during factorisation of a cube uniformly partitioned into tetrahedra is very large, even when the degree-of-freedom count looks naively small. Try an iterative solver with Hypre's AMG as the preconditioner. That should be sufficient for a projection.
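A sketch of that suggestion (assuming legacy DOLFIN 2018.1 and the `zero3d`/`V3d` objects from the original example), using CG preconditioned by Hypre's BoomerAMG instead of a direct solver:

```python
from fenics import *

# Same projection as in the minimal example, but solved iteratively:
# conjugate gradients preconditioned by Hypre's AMG, which avoids the
# direct-solver fill-in entirely. 'cg' and 'hypre_amg' are among the
# names reported by list_krylov_solver_methods() /
# list_krylov_solver_preconditioners().
f_n = project(zero3d, V3d,
              solver_type='cg',
              preconditioner_type='hypre_amg')
```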


Right. Using bicgstab, cg, gmres, minres, richardson or tfqmr it works, all with similar memory consumption of around 10 GB. I will look into the preconditioner you mentioned.
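For finer control than `project` offers, one can also assemble the projection system by hand and attach the Krylov solver and preconditioner explicitly. A sketch, assuming the `zero3d` and `V3d` objects from the original example:

```python
from fenics import *

# Hand-assembled L2 projection (mass matrix system), solved with
# PETSc's CG preconditioned by Hypre AMG. This exposes the solver
# parameters that project() hides.
u, v = TrialFunction(V3d), TestFunction(V3d)
a = inner(u, v) * dx          # mass matrix
L = inner(zero3d, v) * dx     # right-hand side from the source term
A, b = assemble_system(a, L)

solver = PETScKrylovSolver('cg', 'hypre_amg')
solver.parameters['relative_tolerance'] = 1e-10

f_n = Function(V3d)
solver.solve(A, f_n.vector(), b)
fe_n, fi_n, Ez_n = split(f_n)
```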

Btw: I am in no way attached to tetrahedra. If you have a better suggestion for meshing, I am open to trying it.