PETSc ILU preconditioner not working with MPI run

Hi all,

I’m currently working in DOLFIN-X. To solve my linear system I want to use GMRES with ILU preconditioning, so I use the following solve command:

solve(a == L, u, [], petsc_options={"ksp_type": "gmres", "pc_type": "ilu"})

The options aren’t really necessary, since solve appears to use GMRES with ILU preconditioning by default. This works fine in serial, but when I run my code with MPI I get an error. It appears solve falls back to Jacobi preconditioning as the default when run under MPI.

Do you have any ideas what the issue might be? Does PETSc not have a parallel implementation of ILU?

Many thanks,
Sam

Hey Sam,

It is indeed the case that PETSc does not have a parallel implementation of ILU, see https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCILU.html

You could instead try hypre's ILU (as suggested on that page), though I have not tried this myself yet.
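
For reference, a minimal sketch of what that might look like, assuming the same solve() call as in the original post and a PETSc build configured with hypre (untested on my side):

# Untested sketch: select hypre's parallel ILU ("euclid") through PETSc options.
# Assumes PETSc was configured --with-hypre and the same solve() interface as above.
petsc_options = {
    "ksp_type": "gmres",
    "pc_type": "hypre",
    "pc_hypre_type": "euclid",  # hypre's parallel ILU implementation
}
solve(a == L, u, [], petsc_options=petsc_options)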

Thanks. I should have done a bit of googling before posting the question! I’ll give hypre ILU a go.

As an aside, I’ve also tried ASM (additive Schwarz), which seems to give performance similar to ILU for my wave-scattering simulations, and it runs in parallel.
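
A minimal sketch of that setup, again assuming the same solve() call as above (sub_pc_type chooses the solver applied to each rank's local block):

# Sketch: additive Schwarz with ILU(0) on each MPI rank's local block,
# a common parallel stand-in for a global ILU preconditioner.
petsc_options = {
    "ksp_type": "gmres",
    "pc_type": "asm",
    "sub_pc_type": "ilu",
}
solve(a == L, u, [], petsc_options=petsc_options)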

Sam