Creating a generic updatable vector through PETSc

I’m trying to create a generic vector (not tied to a particular function space), but I get an error when I try to update its values. Example code snippet below.

from dolfin import Vector, PETScVector
from petsc4py import PETSc
import numpy as np
x = PETSc.Vec().createSeq(3)
X = Vector(PETScVector(x))
X.get_local()
#array([ 0.,  0.,  0.])
X.set_local(np.array([1,2,3],dtype=np.float64))

The last command produces the following error

*** -------------------------------------------------------------------------
*** DOLFIN encountered an error. If you are not able to resolve this issue
*** using the information listed below, you can ask for help at
***
***     fenics-support@googlegroups.com
***
*** Remember to include the error message listed below and, if possible,
*** include a *minimal* running example to reproduce the error.
***
*** -------------------------------------------------------------------------
*** Error:   Unable to successfully call PETSc function 'VecSetValuesLocal'.
*** Reason:  PETSc error code is: 73 (Object is in wrong state).
*** Where:   This error was encountered inside /tmp/sebo/fenics/dolfin-2017.2.0/dolfin/la/PETScVector.cpp.
*** Process: 0
*** 
*** DOLFIN version: 2017.2.0
*** Git changeset:
*** -------------------------------------------------------------------------

Try instead creating a DOLFIN Vector first and then accessing the underlying PETSc object as necessary:

from dolfin import Vector, PETScVector, MPI, as_backend_type
from petsc4py import PETSc
import numpy as np

# Use DOLFIN constructor directly with MPI.comm_self to
# behave like VecCreateSeq:
X = Vector(MPI.comm_self,3)

print(X.get_local())

X.set_local(np.array([1,2,3],dtype=np.float64))

print(X.get_local())

# To access underlying PETSc Vec:
print(as_backend_type(X).vec())
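
If you then want to manipulate the values through petsc4py, something along these lines should work (just a sketch: setArray writes into the same underlying PETSc Vec, so the change should be visible from the DOLFIN side):

# Sketch: modify the vector through the petsc4py handle and read it
# back through DOLFIN (same underlying PETSc Vec, no copy involved)
p = as_backend_type(X).vec()
p.setArray(np.array([4, 5, 6], dtype=np.float64))
print(X.get_local())  # should now show [4. 5. 6.]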

That definitely looks better. I wasn’t aware that it could be done like that. I will definitely try this. Can I construct a matrix in this fashion too?

So I tried this, and for some reason the MPI object in my version (2017.2.0) does not appear to have this comm_self property (or method)?

This was a change from 2017.2 to 2018.1. In 2017.2, the world and self communicators are mpi_comm_world() and mpi_comm_self().
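
For example, in 2017.2 the equivalent of the snippet above would be roughly the following (a sketch, I haven’t run this against that exact release):

from dolfin import Vector, mpi_comm_self
import numpy as np

# 2017.2-style self communicator: mpi_comm_self() instead of MPI.comm_self
X = Vector(mpi_comm_self(), 3)
X.set_local(np.array([1, 2, 3], dtype=np.float64))
print(X.get_local())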

An additional difference between the two versions (if I recall correctly) is that if you compiled against mpi4py, you will have an mpi4py communicator in 2018.1. In 2017.2 it’s a DOLFIN MPI wrapper which doesn’t offer the same functionality.
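
So in 2018.1 (again, assuming a build against mpi4py) you should be able to pull the communicator off the vector and use mpi4py methods on it directly, e.g.:

from dolfin import Vector, MPI

X = Vector(MPI.comm_self, 3)
comm = X.mpi_comm()  # an mpi4py.MPI.Comm in 2018.1 when built with mpi4py
print(type(comm), comm.Get_size())  # mpi4py methods such as Get_size() are available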