Adjoint of Jacobian with boundary conditions and applying BCs to a PETSc matrix

Hi everyone!
My problem is the following:
\lambda M u = J u, where M is the mass matrix and J is the Jacobian of a nonlinear operator (the Navier-Stokes equations).
The boundary conditions are
u(x, y) = 0 \ \ (x, y) \in \partial \Omega.
Then, my code to solve this is:

from dolfin import *
from petsc4py import PETSc

mesh = UnitSquareMesh(10, 10)
V = VectorFunctionSpace(mesh, "CG", 1)

# Homogeneous Dirichlet condition on the whole boundary
bc = DirichletBC(V, Constant((0.0, 0.0)), "on_boundary")

u = Function(V)
v = TestFunction(V)
a = dot(u, v)*dx        # residual form (stand-in for the Navier-Stokes residual)
solve(a == 0, u, bc)


vm = TestFunction(V)
um = TrialFunction(V)

mass_form = dot(vm, um)*dx
mass_action_form = action(mass_form, Constant((1.0, 1.0)))


# Jacobian of the residual, evaluated at the computed state u
J = derivative(a, u)
JM = PETScMatrix()
Mass = PETScMatrix()

# assemble_system applies the bcs symmetrically to both matrices
dummy = inner(Constant((1.0, 1.0)), v) * dx
assemble_system(J, dummy, bc, A_tensor=JM)
assemble_system(mass_form, dummy, bc, A_tensor=Mass)

eigensolver = SLEPcEigenSolver(JM, Mass)
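
For reference, running the solver and extracting the eigenpairs then looks roughly like this (just the standard SLEPcEigenSolver calls; the number of requested pairs here is arbitrary):

# Solve the generalized problem J x = lambda M x
eigensolver.solve(5)

for i in range(eigensolver.get_number_converged()):
    r, c, rx, cx = eigensolver.get_eigenpair(i)   # real/imag parts of eigenvalue and eigenvector
    print(i, r, c)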

This works for me for the direct problem. I also need to solve the adjoint problem
\lambda^{\dagger} M u^{*} = J^{\dagger} u^{*}
with the same bcs. I tried:

JMAux = JM.copy()
petsc_mat = as_backend_type(JMAux).mat()
petsc_mat.transpose()   # in-place transpose of the underlying PETSc matrix
JMA = PETScMatrix(petsc_mat)
Aeigensolver = SLEPcEigenSolver(JMA, Mass)

The problem is that, when I do petsc_mat.transpose(), the resulting “J^{\dagger}” doesn’t have the boundary conditions.
I tried to take J.T and then apply the bcs during assembly, as in the direct problem, but that doesn’t work (I think J isn’t a rank-2 tensor, but I’m not sure…).
I then tried:

petsc_mat = as_backend_type(JM).mat()
petsc_mat.transpose()

JMA = PETScMatrix(petsc_mat)
bc.apply(JMA)

but I obtain the following error:

 Error:   Unable to set given (local) rows to identity matrix.
 Reason:  some diagonal elements not preallocated (try assembler option keep_diagonal).
 Where:   This error was encountered inside PETScMatrix.cpp.
Process: 0

I would appreciate it if someone could help me, either by fixing this approach or by suggesting another way to do it.
Thank you!!

Hello! I was also having this issue with getting the adjoint Jacobian for a Stokes problem I’m working on.

It turns out that Dolfin catches the PETSc error without looking at the PETSc error message. The actual error message should be “Need to provide local to global mapping to matrix first”. In my case, I’ve fixed it by getting the local-to-global map for the original matrix and setting the transposed matrix to have those mappings switched. This may not work on multiple processes, but it seems to work fine for one.

In your code, it would look something like the following:

# Local-to-global maps of the original (untransposed) matrix
lgmap_rows, lgmap_cols = as_backend_type(JM).mat().getLGMap()

JMAux = JM.copy()
petsc_mat = as_backend_type(JMAux).mat()
petsc_mat.transpose()
# The transposed matrix needs the row/column maps swapped
petsc_mat.setLGMap(lgmap_cols, lgmap_rows)

bc.apply(JMAux)
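
From there, and assuming the rest of your setup is unchanged, JMAux still wraps the transposed PETSc matrix with the bcs applied, so I believe you can pass it straight to SLEPc for the adjoint problem:

# Adjoint eigenvalue problem: J^T x = lambda M x
Aeigensolver = SLEPcEigenSolver(JMAux, Mass)
Aeigensolver.solve(5)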

If you’ve found a different way to handle this, I’m interested since I’m not sure that this is the best solution.
