PETSc matrices and vectors I/O

Hello everyone,

I would like to know the most straightforward way to handle PETSc matrix and vector input/output.
This is the situation: I create and assemble a PETSc matrix, then I want to export it to a file. Later, I want to import the same matrix into a new object, keeping the type, so that I can reuse it in another problem. The same question applies to vectors.

Thank you very much for the support.

Antonio

Consider: I/O from XDMF/HDF5 files in dolfin-x - #23 by hawkspar

Hello dokken,

Here is an MWE that works:

from petsc4py import PETSc
import numpy as np
import PetscBinaryIO  # ships with PETSc in $PETSC_DIR/lib/petsc/bin; that directory may need to be on sys.path
n = 1000

# Tridiagonal sparsity pattern: 3 nonzeros per row, 2 in the first and last rows
nnz = 3 * np.ones(n, dtype=np.int32)
nnz[0] = nnz[-1] = 2

A = PETSc.Mat()
A.createAIJ([n, n], nnz=nnz)


# First set the first row
A.setValue(0, 0, 2)
A.setValue(0, 1, -1)
# Now fill the last row
A.setValue(n - 1, n - 2, -1)
A.setValue(n - 1, n - 1, 2)

# And now everything else
for index in range(1, n - 1):
    A.setValue(index, index - 1, -1)
    A.setValue(index, index, 2)
    A.setValue(index, index + 1, -1)    

A.assemble()

# Right-hand side and solution vectors
b = A.createVecLeft()
b.array[:] = 1

x = A.createVecRight()

ksp = PETSc.KSP().create()
ksp.setOperators(A)
ksp.setType('bcgs')
ksp.setConvergenceHistory()
ksp.getPC().setType('none')
ksp.solve(b, x)

# Write x to a binary file with PetscBinaryIO
vec = x.array.view(PetscBinaryIO.Vec)
io = PetscBinaryIO.PetscBinaryIO()
io.writeBinaryFile('x.dat', [vec])

where I solve for x, given A and b.
I can export x and b, but what’s the syntax for exporting A?

Thank you

You can use:

A.view(PETSc.Viewer("A.mat"))

I am sorry, am I doing something wrong? With the following:

vec = A.view(PETSc.Viewer("A.mat"))
io = PetscBinaryIO.PetscBinaryIO()
io.writeBinaryFile('A.dat',[vec])

I get the error:

Traceback (most recent call last):
  File "/usr/local/petsc/lib/petsc/bin/PetscBinaryIO.py", line 492, in writeBinaryFile
    self.writeMatSciPy(fid, petscobj)
  File "/usr/local/petsc/lib/petsc/bin/PetscBinaryIO.py", line 109, in decorated_f
    result = f(self, *args, **kwargs)
  File "/usr/local/petsc/lib/petsc/bin/PetscBinaryIO.py", line 356, in writeMatSciPy
    assert isinstance(mat, csr_matrix)
AssertionError

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/media/Discone/FEniCS/Docker/4_modal_superposition/prova_petsc.py", line 48, in <module>
    io.writeBinaryFile('x.dat',[vec])
  File "/usr/local/petsc/lib/petsc/bin/PetscBinaryIO.py", line 109, in decorated_f
    result = f(self, *args, **kwargs)
  File "/usr/local/petsc/lib/petsc/bin/PetscBinaryIO.py", line 496, in writeBinaryFile
    raise TypeError('Object %s is not a valid PETSc object'%(petscobj.__repr__()))
TypeError: Object None is not a valid PETSc object

Consider the following:

from petsc4py import PETSc
from mpi4py import MPI

A = PETSc.Mat().create(MPI.COMM_WORLD)
A.setSizes([3, 3])
A.setFromOptions()
A.setUp()
A.setValuesLocal([0], [2], [3])
A.assemble()

viewer = PETSc.Viewer().createBinary("A.mat", mode=PETSc.Viewer.Mode.WRITE, comm=MPI.COMM_WORLD)

viewer(A)

As always, thank you dokken!

Good morning,

I hope you can answer another question. After exporting the matrix, I tried to read it back and got a tuple that looks like this:

(MatSparse: ((10, 10), (array([ 0,  2,  5,  8, 11, 14, 17, 20, 23, 26, 28], dtype=int32), array([0, 1, 0, 1, 2, 1, 2, 3, 2, 3, 4, 3, 4, 5, 4, 5, 6, 5, 6, 7, 6, 7,
       8, 7, 8, 9, 8, 9], dtype=int32), array([ 2.+0.j, -1.+0.j, -1.+0.j,  2.+0.j, -1.+0.j, -1.+0.j,  2.+0.j,
       -1.+0.j, -1.+0.j,  2.+0.j, -1.+0.j, -1.+0.j,  2.+0.j, -1.+0.j,
       -1.+0.j,  2.+0.j, -1.+0.j, -1.+0.j,  2.+0.j, -1.+0.j, -1.+0.j,
        2.+0.j, -1.+0.j, -1.+0.j,  2.+0.j, -1.+0.j, -1.+0.j,  2.+0.j]))),)

My question is: is there an automatic way to load this into another (empty) PETSc matrix? I can figure out a way to write a function that does it (something like the sketch below), but I wonder if one already exists, since this looks like a standard CSR data structure.
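
For reference, a rough sketch of doing it by hand, assuming the tuple unpacks into a shape and CSR arrays (indptr, indices, data) as printed above; the variable names are just illustrative:

from petsc4py import PETSc
import PetscBinaryIO

io = PetscBinaryIO.PetscBinaryIO()

# readBinaryFile returns a tuple of objects; here the file holds a single sparse matrix
(matsparse,) = io.readBinaryFile('A.dat')
(shape, (indptr, indices, data)) = matsparse

# Rebuild an AIJ (CSR) matrix directly from the CSR arrays
A2 = PETSc.Mat().createAIJ(size=shape, csr=(indptr, indices, data))
A2.assemble()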

Thank you again

I figured it out.
In case anyone wants to write and later read back a PETSc matrix/vector:

############ WRITE A and x ##############
viewer_A = PETSc.Viewer().createBinary("A.dat", mode=PETSc.Viewer.Mode.WRITE, comm=MPI.COMM_WORLD)
viewer_x = PETSc.Viewer().createBinary("x.dat", mode=PETSc.Viewer.Mode.WRITE, comm=MPI.COMM_WORLD)
viewer_A(A)
viewer_x(x)


############## READ A and x ################
viewer_A2 = PETSc.Viewer().createBinary("A.dat", mode=PETSc.Viewer.Mode.READ, comm=MPI.COMM_WORLD)
viewer_x2 = PETSc.Viewer().createBinary("x.dat", mode=PETSc.Viewer.Mode.READ, comm=MPI.COMM_WORLD)
A2 = PETSc.Mat().load(viewer_A2)
x2 = PETSc.Vec().load(viewer_x2)
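
A quick sanity check on the round trip (just comparing norms, assuming A and x are still in scope):

print(A.norm(), A2.norm())  # Frobenius norms should match
print(x.norm(), x2.norm())  # 2-norms should match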

I like the elegance, but I can't get it to work in parallel on my end: I keep getting ValueError: Incompatible size errors. It's a pain to synchronise PETSc and dolfinx dof ownership, so I'm going to stick with my solution that @dokken linked, based on one file per processor, even if it's bulkier.
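
For completeness, a minimal sketch of the one-file-per-processor idea (each rank dumps its locally owned entries; the file naming and the use of PetscBinaryIO here are my own choices, not taken from the linked solution):

from mpi4py import MPI
from petsc4py import PETSc
import PetscBinaryIO

rank = MPI.COMM_WORLD.rank
io = PetscBinaryIO.PetscBinaryIO()

# Each process writes only the entries of x it owns to its own file
local = x.getArray().view(PetscBinaryIO.Vec)
io.writeBinaryFile(f"x_rank{rank}.dat", [local])

# Reading back assumes the same number of processes and the same ownership ranges
(local_read,) = io.readBinaryFile(f"x_rank{rank}.dat")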