How to reduce runtime and memory use when executing FEM on a highly refined mesh?

Hello everyone,

The runtime and memory use of my Python script increase significantly when solving an FEM problem on a highly refined mesh. For a mesh with 1.20602e+06 nodes, the script fails with the following error:

UMFPACK V5.7.8 (Nov 9, 2018): ERROR: out of memory

Traceback (most recent call last):
  File "exshearcircle.py", line 72, in <module>
    solve(a == L, u, bc)
  File "/usr/lib/petsc/lib/python3/dist-packages/dolfin/fem/solving.py", line 233, in solve
    _solve_varproblem(*args, **kwargs)
  File "/usr/lib/petsc/lib/python3/dist-packages/dolfin/fem/solving.py", line 273, in _solve_varproblem
    solver.solve()
RuntimeError:

Any idea how to deal with or bypass this issue without using cluster computing?

Use a different solver, for instance mumps, or an iterative solver. See:

UMFPACK V5.7.8 (Nov 9, 2018): ERROR: out of memory with a non variational 3d problem - #2 by dokken
Fast computation of a large system of PDEs - #2 by dokken
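
In short, a minimal sketch of both options in legacy DOLFIN, reusing the forms a and L, the Function u and the boundary condition bc from your script (names taken from your traceback):

# Direct solve with MUMPS instead of the default UMFPACK:
solve(a == L, u, bc, solver_parameters={"linear_solver": "mumps"})

# Or an iterative Krylov solve, which needs far less memory; whether it
# converges quickly depends on your PDE and on the preconditioner:
solve(a == L, u, bc,
      solver_parameters={"linear_solver": "gmres",
                         "preconditioner": "amg"})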

Hi @dokken , I tried the suggestions in the link. Here are the different solvers I tried:

solve(a == L, u, bc, solver_parameters={"newton_solver": {"linear_solver": "mumps"}})
solve(a == L, u, bc, solver_parameters={"F-GMRES": {"linear_solver": "mumps"}})
solve(a == L, u, bc, solver_parameters={"CG": {"linear_solver": "mumps"}})
solve(a == L, u, bc, solver_parameters={"BiCGStab": {"linear_solver": "mumps"}})

And here are the respective errors:

RuntimeError: Invalid parameter: newton_solver
RuntimeError: Invalid parameter: F-GMRES
RuntimeError: Invalid parameter: CG
RuntimeError: Invalid parameter: BiCGStab

Any suggestions?

Did you try:

 solve(a==L, u, bc, solver_parameters={"linear_solver": "mumps"})

as in the posts linked above?
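
For reference, legacy DOLFIN can print the valid names itself, which avoids the invalid-parameter errors above; a small self-contained sketch:

from dolfin import (list_linear_solver_methods,
                    list_krylov_solver_methods,
                    list_krylov_solver_preconditioners)

list_linear_solver_methods()          # valid values for "linear_solver", e.g. mumps, umfpack
list_krylov_solver_methods()          # iterative methods, e.g. gmres, cg, bicgstab
list_krylov_solver_preconditioners()  # preconditioners, e.g. amg, ilu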

Yes, I tried that too. It raises the following error (copied exactly as it appears):

UMFPACK V5.7.8 (Nov 9, 2018): ERROR: out of memory

Traceback (most recent call last):
  File "exshearcircle.py", line 120, in <module>
    C = project(Constant(1),V)
  File "/usr/lib/petsc/lib/python3/dist-packages/dolfin/fem/projection.py", line 138, in project
    cpp.la.solve(A, function.vector(), b, solver_type, preconditioner_type)
RuntimeError:

*** -------------------------------------------------------------------------
*** DOLFIN encountered an error. If you are not able to resolve this issue
*** using the information listed below, you can ask for help at
***
***     fenics-support@googlegroups.com
***
*** Remember to include the error message listed below and, if possible,
*** include a *minimal* running example to reproduce the error.
***
*** -------------------------------------------------------------------------
*** Error:   Unable to successfully call PETSc function 'KSPSolve'.
*** Reason:  PETSc error code is: 76 (Error in external library).
*** Where:   This error was encountered inside /build/dolfin-GfTndI/dolfin-2019.2.0~git20201207.b495043/dolfin/la/PETScKrylovSolver.cpp.
*** Process: 0
***
*** DOLFIN version: 2019.2.0.dev0
*** Git changeset:  unknown

Well, that means you are getting further: the variational solve at line 72 now succeeds, and the failure has moved to the projection at line 120 of your script. You can pass similar solver arguments to the projection. See for instance: Save Solution as variable - #13 by dokken for the syntax.
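
A minimal sketch; the keyword names solver_type and preconditioner_type match the call in projection.py visible in your traceback:

# Direct solve of the projection system with MUMPS:
C = project(Constant(1), V, solver_type="mumps")

# Or an iterative solve to keep memory use low (CG is suitable here,
# since a projection leads to a symmetric positive definite mass matrix):
C = project(Constant(1), V, solver_type="cg", preconditioner_type="amg")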

Thanks @dokken. The line C = project(Constant(1), V, solver_type="mumps") worked.

But what does this code mean? What is implied by solver_type="mumps"?

MUMPS is a parallel sparse direct solver, supplied to DOLFIN through PETSc. Setting solver_type="mumps" makes the projection's linear system be factorized by MUMPS instead of the default UMFPACK, which was running out of memory. See for instance: MATSOLVERMUMPS and MUMPS: A Multifrontal Massively Parallel Solver
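
For illustration, the same choice can be made explicit by assembling and solving the system by hand; a minimal sketch in legacy DOLFIN, again reusing a, L, u and bc from your script:

from dolfin import assemble_system, LUSolver

A, b = assemble_system(a, L, bc)  # assemble matrix and right-hand side with BCs applied
solver = LUSolver(A, "mumps")     # select the MUMPS direct solver, provided via PETSc
solver.solve(u.vector(), b)       # factorize and solve the linear system into u

Because MUMPS is MPI-parallel, the same script can also be launched on several processes (e.g. mpirun -n 4 python3 exshearcircle.py), which distributes both the work and the memory.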
