DistArray PyTrilinos communication via the DA protocol

Robert David Grant edited this page Jul 18, 2014 · 1 revision

The overall objective is to support bi-directional sharing of distributed arrays via the distributed array protocol.

Use case 1: roundtrip communication with a simple array

  • User creates a simple block-distributed, one-dimensional distributed array with DistArray. Local arrays are created and initialized on the engines:

```python
from distarray.globalapi import Context, Distribution

c = Context()
dist = Distribution.from_shape(c, (10**6,), ('b',))
dist_arr = c.zeros(dist)
```
  • User defines a local function to be run on the engines, with the local array as its argument:

```python
def handoff(la):
    from PyTrilinos.Domi import from_distarray
    domi_array = from_distarray(la)  # uses the distributed array protocol
    domi_array.something_interesting()
```
  • User applies the handoff function to dist_arr; the modifications made by PyTrilinos are then visible:

```python
c.apply(handoff, (dist_arr,))
total = dist_arr.sum()  # or anything else showing that DistArray can see Domi's changes
```
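The handoff above hinges on the local array's `__distarray__` export. Here is a minimal, self-contained sketch of what a `from_distarray`-style consumer would read, assuming the protocol's `'__version__'`, `'buffer'`, and `'dim_data'` keys; `LocalArrayStub` and `describe` are illustrative stand-ins, not part of DistArray or PyTrilinos:

```python
from array import array

class LocalArrayStub:
    """Stand-in for one engine's local block of a 1-D 'b'-distributed array."""
    def __init__(self, data, global_size, start, grid_size, grid_rank):
        self._data = array('d', data)
        self._dim = {
            'dist_type': 'b',            # block distribution
            'size': global_size,         # global length of this dimension
            'start': start,              # global index of the first local element
            'stop': start + len(data),   # one past the last local element
            'proc_grid_size': grid_size,
            'proc_grid_rank': grid_rank,
        }

    def __distarray__(self):
        # The three top-level keys defined by the distributed array protocol.
        return {
            '__version__': '0.10.0',
            'buffer': memoryview(self._data),
            'dim_data': (self._dim,),
        }

def describe(la):
    """What a from_distarray-style consumer would read from the export."""
    export = la.__distarray__()
    (dim,) = export['dim_data']
    return dim['dist_type'], dim['start'], dim['stop'], len(export['buffer'])

# Pretend we are rank 1 of 4, holding elements 250..499 of a length-1000 array.
la = LocalArrayStub([0.0] * 250, global_size=1000, start=250, grid_size=4, grid_rank=1)
print(describe(la))  # ('b', 250, 500, 250)
```

The consumer never copies the data: the `'buffer'` entry is a view, which is what lets Domi's in-place changes show up on the DistArray side.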

Use case 2: an existing PyTrilinos method accepts a DistArray

What I (Bill) had envisioned (and maybe this is too advanced for our first step -- this is more about what I want the final interface to look like) was that the user would create a DistArray and then pass it to an existing PyTrilinos method that previously required a Python Epetra.Vector. The method would manipulate the data and then the DistArray could be observed to see the changes.

To do this, I have to write a more advanced typemap -- a converter from a general Python object to a C++ Epetra_Vector. Currently, the typemap does a type check and requires a wrapped Epetra_Vector. It can be upgraded to also accept a DistArray, which would then have to be converted.
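The dispatch such a typemap would perform can be sketched in Python. Everything named here is illustrative: `EpetraVectorStub` stands in for a wrapped Epetra_Vector and `as_epetra_vector` for the converter; only the pass-through-or-convert logic mirrors the text above:

```python
class EpetraVectorStub:
    """Stand-in for a wrapped Epetra_Vector."""
    def __init__(self, data):
        self.data = list(data)

def as_epetra_vector(obj):
    # Existing behavior: an already-wrapped Epetra vector passes through.
    if isinstance(obj, EpetraVectorStub):
        return obj
    # Upgraded behavior: anything exporting the distributed array
    # protocol gets converted.
    if hasattr(obj, '__distarray__'):
        export = obj.__distarray__()
        return EpetraVectorStub(export['buffer'])
    raise TypeError("expected an Epetra vector or a distributed-array exporter")

class FakeLocalArray:
    """Minimal protocol exporter used only to exercise the converter."""
    def __distarray__(self):
        return {'__version__': '0.10.0',
                'buffer': [1.0, 2.0, 3.0],
                'dim_data': ({'dist_type': 'b', 'size': 3,
                              'start': 0, 'stop': 3},)}

print(as_epetra_vector(FakeLocalArray()).data)  # [1.0, 2.0, 3.0]
```

The real converter would live in the SWIG typemap and build an actual Epetra_Vector, but the branch structure would be the same.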

The converter would check the dist_type. If set to 'u', then we can convert directly to an Epetra_Map and Epetra_Vector. If set to (one or more) 'b', then we can first convert to a Domi MDMap and MDVector, and then call the getEpetraVectorView() method.
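That dist_type check can be sketched as a small routing function, assuming the protocol's dimension codes ('b' block, 'u' unstructured, 'n' non-distributed); `choose_conversion` and the returned labels are placeholders for the two conversion paths described above:

```python
def choose_conversion(dim_data):
    """Pick a conversion path from the protocol's per-dimension dist_types."""
    dist_types = {d['dist_type'] for d in dim_data}
    if dist_types == {'u'}:
        # Unstructured: build an Epetra_Map and Epetra_Vector directly.
        return 'epetra_direct'
    if dist_types <= {'b', 'n'}:
        # Block (and non-distributed) dims: go through a Domi MDMap and
        # MDVector, then call getEpetraVectorView().
        return 'domi_then_epetra_view'
    raise ValueError("unsupported distribution: %s" % sorted(dist_types))

print(choose_conversion(({'dist_type': 'u'},)))                      # epetra_direct
print(choose_conversion(({'dist_type': 'b'}, {'dist_type': 'b'})))   # domi_then_epetra_view
```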

For this usecase, the Domi classes are used for an intermediate calculation, and might be bypassed entirely if the data is unstructured. For this reason, I had not put a lot of thought towards wrapping Domi directly. That doesn't mean Domi shouldn't be wrapped ... but the only thing it would have that DistArray would not have is the getEpetraVectorView() method (plus similar methods for MultiVectors, Copies, and Tpetra), which could then be used for interfacing to solvers, etc.

So what is your opinion? Is this too advanced to start off with? It can't hurt to wrap Domi, although hopefully users could ultimately ignore it.