Table 8 of KSP: Linear System Solvers — PETSc 3.23.3 documentation lists the available solver types. I want to use cuSPARSE as a solver within dolfinx to solve the 3D Poisson equation repeatedly with varying source charge distributions.
Unfortunately, so far I have not been able to get this to work.
The section of my code where the matrix is assembled and the solver type is set is:
from dolfinx.fem.petsc import (apply_lifting, assemble_matrix,
                               assemble_vector, create_vector, set_bc)
from petsc4py import PETSc
# Assemble the system matrix and convert it to the cuSPARSE format
A = assemble_matrix(bilinear_form, bcs=boundary_conditions)
A.assemble()
A.convert(mat_type=PETSc.Mat.Type.AIJCUSPARSE)
b = create_vector(linear_form)
assemble_vector(b, linear_form)
# Define the solver: a direct LU solve, with cuSPARSE as factor package
solver = PETSc.KSP().create(dfx_mesh.comm)
solver.setOperators(A)
solver.setType(PETSc.KSP.Type.PREONLY)
solver.getPC().setType(PETSc.PC.Type.LU)
solver.getPC().setFactorSolverType("cusparse")
# Apply the boundary conditions to the right-hand side
apply_lifting(b, [bilinear_form], [boundary_conditions])
set_bc(b, boundary_conditions)
# Solve the linear problem once up front, so later calls reuse the factorization.
solver.solve(b, PHI.x.petsc_vec)
The variables defined outside this snippet should be self-explanatory; if not, I can also provide a minimal (non-)working example.
My problem is that I get the following error:
File "petsc4py/PETSc/Mat.pyx", line 2077, in petsc4py.PETSc.Mat.convert
petsc4py.PETSc.Error: error code 86
[0] MatConvert() at /usr/local/petsc/src/mat/interface/matrix.c:4453
[0] MatSetType() at /usr/local/petsc/src/mat/interface/matreg.c:144
[0] Unknown type. Check for miss-spelling or missing package: https://petsc.org/release/install/install/#external-packages
[0] Unknown Mat type given: aijcusparse
This surprises me, since my Dockerfile contains the following:
RUN apt-get -y install bison flex \
&& git clone --depth=1 -b v${PETSC_VERSION} https://gitlab.com/petsc/petsc.git ${PETSC_DIR} \
&& cd ${PETSC_DIR} \
&& ./configure \
--COPTFLAGS="-O2" \
--CXXOPTFLAGS="-O2" \
--FOPTFLAGS="-O2" \
--with-64-bit-indices=no \
--with-debugging=no \
--with-fortran-bindings=no \
--with-shared-libraries \
--download-hypre \
--download-metis \
--download-mumps-avoid-mpi-in-place \
--download-mumps \
--download-ptscotch \
--download-scalapack \
--download-spai \
--download-suitesparse \
--download-superlu \
--download-superlu_dist \
--download-cusparse \
--with-scalar-type=real \
--with-precision=double \
&& make ${MAKEFLAGS} all
together with:
RUN python3 -m venv ~/my-venv \
&& . ~/my-venv/bin/activate \
&& pip install --no-cache-dir -r ~/requirements.txt \
# Install petsc4py
&& cd ${PETSC_DIR}/src/binding/petsc4py \
&& pip -v install --no-dependencies --no-cache-dir --no-build-isolation .
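(For completeness, my own suspicion: as far as I can tell from the PETSc installation docs, cuSPARSE ships with the CUDA toolkit and the GPU backend is enabled via --with-cuda rather than via a --download flag, so the --download-cusparse line above may not do what I intended. The following is an untested sketch of the change I have in mind, assuming the CUDA toolkit/nvcc is available in the image:)

```shell
# Untested sketch: enable PETSc's CUDA backend (which should provide the
# aijcusparse matrix type) instead of the --download-cusparse option above.
# Assumes the CUDA toolkit is installed in the image; all other configure
# options would stay as in my Dockerfile.
./configure \
    --with-cuda=1 \
    --with-debugging=no \
    --with-shared-libraries
```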
Could you help me get this running, or give a minimal working example using cuSPARSE? Thank you!
P.S. Since the use case is to solve the Poisson equation repeatedly, what I need in particular is acceleration of
solver.solve(b, PHI.x.petsc_vec)
where presumably the forward/backward substitution is performed. In my opinion, using the GPU should speed up the forward/backward substitution of a sparse linear system tremendously. In principle it should not be too hard to implement a GPU-accelerated forward/backward substitution myself (e.g. with numba-cuda), but I would be glad if this worked out of the box.
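To illustrate the pattern I am after (factor once, then only forward/backward substitution per right-hand side), here is a small CPU-only sketch using SciPy as a stand-in for PETSc's PREONLY + LU. Everything in it is illustrative and independent of my actual dolfinx setup:

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Stand-in for my system: a 1D Poisson (tridiagonal) matrix.
n = 100
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")

# Numeric LU factorization, done exactly once (analogous to the first
# solver.solve(...) call in my snippet above).
lu = spla.splu(A)

# Repeated solves with varying right-hand sides: each call only performs
# the forward/backward substitution, which is the step I want on the GPU.
for k in range(3):
    b = np.random.default_rng(k).random(n)
    x = lu.solve(b)
    assert np.allclose(A @ x, b)
```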