WIP: indexing with broadcasting #1473

Closed
wants to merge 128 commits into from

Changes from 7 commits

Commits
128 commits
105bd64
Implemented `_broadcast_indexes` in Variable.py
fujiisoup Jul 7, 2017
23b4fe0
Diagonal indexing for Variable.
fujiisoup Jul 10, 2017
726ba5d
update _broadcast_indexes. update tests.
fujiisoup Jul 12, 2017
df7011f
Support basic boolean indexing.
fujiisoup Jul 15, 2017
f9232cb
tests for dask-based Variable
fujiisoup Jul 16, 2017
17b6465
Explicitly mark xfail flags
fujiisoup Jul 16, 2017
33c51d3
orthogonal indexing for dask.
fujiisoup Jul 16, 2017
03a336f
Refactor DaskArrayAdapter
shoyer Jul 16, 2017
d5af395
Merge pull request #1 from shoyer/indexing_broadcasting
fujiisoup Jul 17, 2017
866de91
Added MissingDimensionsError. Improve DaskIndexingAdapter, Variable._…
fujiisoup Jul 17, 2017
08e7444
use `np.arange(*slice.indices(size))` rather than `np.arange(size)[sl…
fujiisoup Jul 17, 2017
84afc98
Merge branch 'master' into indexing_broadcasting
fujiisoup Jul 17, 2017
7b33269
Add orthogonalize_indexers
fujiisoup Jul 20, 2017
50ea56e
A bug fix.
fujiisoup Jul 20, 2017
bac0089
Working with LazilyIndexedArray
fujiisoup Jul 20, 2017
1206c28
Fix in LazilyIndexedArray.
fujiisoup Jul 20, 2017
c2747be
add @requires_dask in test_variable
fujiisoup Jul 20, 2017
0671f39
rename orthogonalize_indexers -> unbroadcast_indexers
fujiisoup Jul 21, 2017
ffccff1
Wrap LazilyIndexedArray so that it accepts broadcasted-indexers
fujiisoup Jul 21, 2017
becf539
small rename
fujiisoup Jul 21, 2017
1ae4b4c
Another small fix
fujiisoup Jul 21, 2017
1967bf5
Remove unused function.
fujiisoup Jul 21, 2017
c2eeff3
Added _broadcast_indexes_1vector
fujiisoup Jul 23, 2017
df12c04
Minor fix
fujiisoup Jul 23, 2017
5ba367d
Avoid doubly wrapping by LazilyIndexedArray
fujiisoup Jul 23, 2017
d25c1f1
General orthogonal indexing for dask array.
fujiisoup Jul 23, 2017
0115994
Added base class IndexableArrayAdapter
fujiisoup Jul 24, 2017
1b4e854
Deprecate _unbroadcast_indexers and support IndexerTuple classes
fujiisoup Jul 25, 2017
36d052f
removed unintended prints.
fujiisoup Jul 25, 2017
9bd53ca
Some clean up.
fujiisoup Jul 25, 2017
563cafa
Some small fix.
fujiisoup Jul 25, 2017
1712060
Care for boolean array.
fujiisoup Jul 25, 2017
884423a
Always map boolean index to integer array.
fujiisoup Jul 25, 2017
c2e6f42
Takes care of boolean index in test_indexing
fujiisoup Jul 25, 2017
002eafa
replace self.assertTrue by assert
fujiisoup Jul 25, 2017
eedfb3f
Fix based on shoyer's comments.
fujiisoup Jul 29, 2017
bb2e515
Added `to_tuple()` method to IndexerTuple class.
fujiisoup Jul 29, 2017
5983a67
Removed: 'orthogonal_indexer', 'canonicalize_indexer'
fujiisoup Jul 29, 2017
7a5ff79
update IndexVariable.__getitem__
fujiisoup Jul 29, 2017
0b559bc
Made to_tuple function.
fujiisoup Jul 30, 2017
bad828e
BASIC_INDEXING_TYPES
fujiisoup Jul 30, 2017
a821a2b
Removed unused function from tests.
fujiisoup Jul 30, 2017
6550880
assert -> raise
fujiisoup Jul 30, 2017
464e711
Update Dataset.isel
fujiisoup Jul 30, 2017
7dd171d
Use `merge_variables` in checking the consistency.
fujiisoup Jul 30, 2017
e8f006b
Cleanup Dataset.__getitem__
shoyer Jul 30, 2017
a8ec82b
Add comment about why align() is unneeded
shoyer Jul 30, 2017
32749d4
Ensure correct tests are run in test_variable.py
shoyer Jul 31, 2017
31401d4
Support pointwise indexing with dask
shoyer Jul 31, 2017
f42ddfd
Merge pull request #2 from shoyer/indexing_broadcasting
fujiisoup Aug 5, 2017
8d96ad3
Add a vindex routine for np.ndarray
shoyer Aug 6, 2017
19f7204
Add an OrderedSet to xarray.core.utils
shoyer Aug 6, 2017
a8f60ba
Support dask and numpy vindex with one path
shoyer Aug 6, 2017
69f8570
Fix test failures
shoyer Aug 6, 2017
5eb00b7
working with `Dataset.sel`
fujiisoup Jul 30, 2017
d133766
Added more tests
fujiisoup Aug 6, 2017
631f6e9
Changes per review
shoyer Aug 7, 2017
72587de
Merge pull request #3 from shoyer/indexing_broadcasting
fujiisoup Aug 7, 2017
3231445
Restore `isel_points`. Remove automatic tuple conversion for `sel`
fujiisoup Aug 7, 2017
dd325c5
Some clean up
fujiisoup Aug 7, 2017
434a004
Supported indexing by a scalar Variable
fujiisoup Aug 9, 2017
d518f7a
Supported the indexing by DataArray with coordinates.
fujiisoup Aug 9, 2017
ba3cc88
Update DataArray.loc and DataArray.sel to use Dataset.loc and Dataset…
fujiisoup Aug 9, 2017
f63f3d5
Merge remote-tracking branch 'pydata/master' into indexing_broadcasting
fujiisoup Aug 21, 2017
aa10635
Added inhouse normalize_axis_index
fujiisoup Aug 21, 2017
fd73e82
Support an integer key for _advanced_indexer_subspaces
fujiisoup Aug 21, 2017
6202aff
Add warning for coordinate conflict.
fujiisoup Aug 27, 2017
f9746fd
Warning changes DeprecationWarning -> FutureWarning.
fujiisoup Aug 27, 2017
f78c932
fix related to pytest.warns
fujiisoup Aug 27, 2017
1c027cd
Another fix related to warning.
fujiisoup Aug 27, 2017
d11829f
Raise an Error for confusing indexing type
fujiisoup Aug 27, 2017
0777128
Minor fix
fujiisoup Aug 27, 2017
f580c99
Test for indexing by a scalar coordinate.
fujiisoup Aug 27, 2017
4ebe852
Modified test
fujiisoup Aug 29, 2017
20f5cb9
Remove too specialized errorning
fujiisoup Aug 29, 2017
ab08af8
Working with docs
fujiisoup Aug 29, 2017
92dded6
Found a bug in as_variable
fujiisoup Aug 29, 2017
a624424
Working with docs
fujiisoup Aug 29, 2017
a4cd724
Enable indexing IndexVariable by multi-dimensional Variable.
fujiisoup Aug 29, 2017
f66c9b6
Found a bug in indexing np.ndarray
fujiisoup Aug 30, 2017
24309c4
Added a test for boolean-DataArray indexing.
fujiisoup Aug 30, 2017
fd698de
Merge branch 'indexing_broadcasting' into indexing_broadcasting_doc
fujiisoup Aug 30, 2017
9cbaff9
Make sure assignment certainly works.
fujiisoup Aug 30, 2017
a5c7766
Added assignment section
fujiisoup Aug 30, 2017
f242166
pep8
fujiisoup Aug 31, 2017
bff18f0
Remove unused tests.
fujiisoup Aug 31, 2017
21c11c4
Add more docs.
fujiisoup Aug 31, 2017
1975f66
Api.rst changed
fujiisoup Aug 31, 2017
73ad94e
Merge branch 'indexing_broadcasting_doc' into indexing_broadcasting
fujiisoup Aug 31, 2017
b49f813
Add link in whats-new
fujiisoup Aug 31, 2017
1fd6b3a
Small format cleanup
fujiisoup Aug 31, 2017
46dd7c7
allow positional indexing with unsigned integer types
Aug 31, 2017
11f3e4f
Merge branch 'master' into indexing_broadcasting
fujiisoup Sep 1, 2017
1d3eddc
Catch up to the previous merge.
fujiisoup Sep 1, 2017
bcb25f1
Merge remote-tracking branch 'github/fix/1405' into indexing_broadcas…
fujiisoup Sep 1, 2017
7104964
workaround for daskarray with uint indexer.
fujiisoup Sep 1, 2017
173968b
Add a section about assignment, full indexing rules.
fujiisoup Sep 2, 2017
addb91a
Merge branch 'master' into indexing_broadcasting
fujiisoup Sep 3, 2017
7ad7d36
warning added for reindex for DataArray indexers.
fujiisoup Sep 4, 2017
91dd833
Move warning in alignment.reindex_variables.
fujiisoup Sep 4, 2017
118a5d8
+ Change API to attach non-dimensional coordinates.
fujiisoup Sep 5, 2017
dc9f8a6
Some clean up. Fix error in test_reindex_warning
fujiisoup Sep 5, 2017
5726c89
Enable vindex for PandasIndexAdapter.
fujiisoup Sep 5, 2017
523ecaa
Add deprecation warning for isel_points
fujiisoup Sep 6, 2017
a3a83db
Merge branch 'master' into indexing_broadcasting
fujiisoup Sep 6, 2017
765ae45
Add a sanity check for boolean vectorized indexing.
fujiisoup Sep 6, 2017
3deaf5c
Modify tests to take care of the sanity check related to boolean arra…
fujiisoup Sep 6, 2017
c8c8a12
Another follow up
fujiisoup Sep 6, 2017
a16a04b
pep8
fujiisoup Sep 6, 2017
1b34cd4
Clean up sanity checks in broadcast_indexers
fujiisoup Sep 7, 2017
24599a7
Fix unintended rename
fujiisoup Sep 11, 2017
d5d967b
indexing.rst edits
shoyer Sep 11, 2017
d0d6a6f
remove note about conflicts for now
shoyer Sep 17, 2017
4f08e2e
Apply coordinate conflict rule.
fujiisoup Sep 17, 2017
fbbe35c
Python 2 support
fujiisoup Sep 18, 2017
db23c93
Add tests for mindex selection.
fujiisoup Sep 18, 2017
031be9a
Drop coordinate of itself.
fujiisoup Sep 18, 2017
969f9cf
Clean up the coordinate dropping logic.
fujiisoup Sep 18, 2017
8608451
Merge pull request #4 from shoyer/indexing_broadcasting
fujiisoup Sep 22, 2017
b4e5b36
A small bug fix in coordinate dropping logic
fujiisoup Sep 22, 2017
9523039
Merge remote-tracking branch 'pydata/master' into indexing_broadcasting
fujiisoup Sep 27, 2017
dc60348
Fixes based on jhamman's comments.
fujiisoup Sep 27, 2017
8a62ad9
Improve test for warning.
fujiisoup Sep 27, 2017
6b96960
Merge branch 'master' into indexing_broadcasting
fujiisoup Oct 4, 2017
cb84154
Remove unused assert sentence.
fujiisoup Oct 4, 2017
9726531
Simplify rules for indexing conflicts
shoyer Oct 9, 2017
caa79fe
Merge pull request #5 from shoyer/indexing_broadcasting
fujiisoup Oct 9, 2017
170abc5
Better error-message for multiindex vectorized-selection.
fujiisoup Oct 10, 2017
44 changes: 19 additions & 25 deletions xarray/core/indexing.py
@@ -371,7 +371,7 @@ def shape(self):
return tuple(shape)

def __array__(self, dtype=None):
array = orthogonally_indexable(self.array)
array = broadcasted_indexable(self.array)
return np.asarray(array[self.key], dtype=None)

def __getitem__(self, key):
@@ -434,7 +434,7 @@ def __setitem__(self, key, value):
self.array[key] = value


def orthogonally_indexable(array):
def broadcasted_indexable(array):
Review comment (Member Author): maybe xarray_indexable would be a better name here

if isinstance(array, np.ndarray):
return NumpyIndexingAdapter(array)
if isinstance(array, pd.Index):
@@ -445,24 +445,10 @@ def orthogonally_indexable(array):


class NumpyIndexingAdapter(utils.NDArrayMixin):
"""Wrap a NumPy array to use orthogonal indexing (array indexing
accesses different dimensions independently, like netCDF4-python variables)
"""Wrap a NumPy array to use broadcasted indexing
"""
# note: this object is somewhat similar to biggus.NumpyArrayAdapter in that
# it implements orthogonal indexing, except it casts to a numpy array,
# isn't lazy and supports writing values.
def __init__(self, array):
self.array = np.asarray(array)

def __array__(self, dtype=None):
return np.asarray(self.array, dtype=dtype)

def _convert_key(self, key):
key = expanded_indexer(key, self.ndim)
if any(not isinstance(k, integer_types + (slice,)) for k in key):
# key would trigger fancy indexing
key = orthogonal_indexer(key, self.shape)
return key
self.array = array

def _ensure_ndarray(self, value):
# We always want the result of indexing to be a NumPy array. If it's
@@ -474,29 +460,37 @@ def _ensure_ndarray(self, value):
return value

def __getitem__(self, key):
key = self._convert_key(key)
return self._ensure_ndarray(self.array[key])

def __setitem__(self, key, value):
key = self._convert_key(key)
self.array[key] = value


class DaskIndexingAdapter(utils.NDArrayMixin):
"""Wrap a dask array to support orthogonal indexing
"""Wrap a dask array to support broadcasted-indexing.
Review comment (Member Author): maybe "xarray-style" is better than "broadcasted"

"""
def __init__(self, array):
self.array = array

def __getitem__(self, key):
key = expanded_indexer(key, self.ndim)
if any(not isinstance(k, integer_types + (slice,)) for k in key):
""" key: tuple of Variable, slice, integer """
Review comment (Member Author): I think it should only be a NumPy or dask array here, not a Variable.

# basic or orthogonal indexing
if all(isinstance(k, (integer_types, slice)) or k.squeeze().ndim <= 1
Review comment (Member Author): For simple cases where everything is an integer or slice, we want to just use a single indexing call like self.array[key] rather than this loop.

In the hard case when some arguments are arrays, we should try self.array.vindex[key]. If it doesn't work in some cases, we can either add work-arounds or try to fix it upstream in dask.

Review comment (Member Author): After looking at dask.array in a little more detail, I think we need to keep a work-around for "orthogonal" indexing in dask. It looks like vindex only works when each indexer is 1D.

for k in key):
value = self.array
for axis, subkey in reversed(list(enumerate(key))):
if hasattr(subkey, 'squeeze'):
subkey = subkey.squeeze()
if subkey.ndim == 0: # make at least 1-d array
subkey = subkey.flatten()
value = value[(slice(None),) * axis + (subkey,)]
return value
else:
value = self.array[key]
return value
# TODO Dask does not support nd-array indexing.
# flatten() -> .vindex[] -> reshape() should be used
# instead of `.load()`
value = np.asarray(self.array)[key]
return value
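
As a reading aid (not part of the diff), here is a standalone sketch of the dispatch strategy discussed in the DaskIndexingAdapter review comments above: basic keys go through a single indexing call, keys whose array indexers are at most 1-d use the orthogonal work-around (dask's vindex only handles 1-d indexers), and anything else falls back to loading into NumPy, as the TODO notes. The helper name dask_getitem_sketch is made up for illustration.

import numpy as np
import dask.array as da

def dask_getitem_sketch(arr, key):
    # `key` is assumed to be a full-length tuple of ints, slices,
    # and numpy integer arrays
    if all(isinstance(k, (int, slice)) for k in key):
        # basic indexing: a single indexing call is enough
        return arr[key]
    if all(isinstance(k, (int, slice)) or k.ndim <= 1 for k in key):
        # orthogonal work-around: apply array indexers one axis at a time,
        # going from the last axis backwards so axes dropped by integer
        # indexers do not shift the axis numbers still to be indexed
        out = arr
        for axis, k in reversed(list(enumerate(key))):
            out = out[(slice(None),) * axis + (k,)]
        return out
    # n-dimensional array indexers: fall back to loading into memory,
    # mirroring the TODO in the diff
    return np.asarray(arr)[key]

# usage sketch
x = da.arange(24, chunks=6).reshape(4, 6)
print(dask_getitem_sketch(x, (np.array([0, 2]), slice(1, 4))).compute())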


class PandasIndexAdapter(utils.NDArrayMixin):
105 changes: 83 additions & 22 deletions xarray/core/variable.py
@@ -18,7 +18,8 @@
from . import utils
from .pycompat import (basestring, OrderedDict, zip, integer_types,
dask_array_type)
from .indexing import (PandasIndexAdapter, orthogonally_indexable)
from .indexing import (DaskIndexingAdapter, PandasIndexAdapter,
broadcasted_indexable)

import xarray as xr # only for Dataset and DataArray

@@ -297,7 +298,7 @@ def data(self, data):

@property
def _indexable_data(self):
return orthogonally_indexable(self._data)
return broadcasted_indexable(self._data)

def load(self):
"""Manually trigger loading of this variable's data from disk or a
@@ -376,29 +377,89 @@ def _item_key_to_tuple(self, key):
else:
return key

def _broadcast_indexes(self, key):
"""
Parameters
-----------
key: One of
array
a mapping of dimension names to index.

Returns
-------
dims: Tuple of strings.
Dimension of the resultant variable.
indexers: list of integer, array-like, or slice. This is aligned
along self.dims.
"""
key = self._item_key_to_tuple(key) # key is a tuple
# key is a tuple of full size
key = indexing.expanded_indexer(key, self.ndim)
basic_indexing_types = integer_types + (slice,)
Review comment (Member Author): Let's make this a module-level constant instead, since we access it in a few places.

if all([isinstance(k, basic_indexing_types) for k in key]):
Review comment (Member Author): Avoid the unneeded extra [] inside all().

return self._broadcast_indexes_basic(key)
else:
return self._broadcast_indexes_advanced(key)
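
Taken together, the two review comments above amount to something like the following fragment (a sketch, not the diff's code; the constant name BASIC_INDEXING_TYPES matches the later commit of that name, and integer_types is the existing compat tuple imported above):

# module-level constant, next to the other indexing helpers
BASIC_INDEXING_TYPES = integer_types + (slice,)

# inside Variable._broadcast_indexes: a generator expression is enough
if all(isinstance(k, BASIC_INDEXING_TYPES) for k in key):
    return self._broadcast_indexes_basic(key)
else:
    return self._broadcast_indexes_advanced(key)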

def _broadcast_indexes_basic(self, key):
dims = tuple(dim for k, dim in zip(key, self.dims)
if not isinstance(k, integer_types))
return dims, key

def nonzero(self):
""" Equivalent to numpy's nonzero, but returns a tuple of Variables. """
if isinstance(self._data, (np.ndarray, pd.Index, PandasIndexAdapter)):
nonzeros = np.nonzero(self._data)
elif isinstance(self._data, dask_array_type):
# TODO we should replace dask's native nonzero
# after https://github.com/dask/dask/issues/1076 is implemented.
nonzeros = np.nonzero(self.load()._data)
Review comment (Member Author): I would just do nonzeros = np.nonzero(self.data) instead of all these isinstance checks. You can leave the TODO, but I don't think we actually need it for indexing since currently we already load indexers into memory.


return tuple([as_variable(nz, name=dim) for nz, dim
Review comment (Member Author): Construct the variable objects directly here rather than using as_variable: Variable((dim,), nz).

Review comment (Member Author): No need to include [] for the comprehension inside tuple().

in zip(nonzeros, self.dims)])
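
For reference, a minimal sketch (not the PR's final code) of the simplified nonzero that the review comments above suggest, assuming it stays a method of Variable where np and self.data are available as in the rest of this file:

def nonzero(self):
    """ Equivalent to numpy's nonzero but returns a tuple of Variables. """
    # np.nonzero on self.data loads dask-backed data into memory, which is
    # acceptable here because indexers are loaded into memory anyway
    nonzeros = np.nonzero(self.data)
    return tuple(Variable((dim,), nz)
                 for nz, dim in zip(nonzeros, self.dims))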

def _isbool_type(self):
""" Return if the variabe is bool or not """
if isinstance(self._data, (np.ndarray, PandasIndexAdapter, pd.Index)):
return self._data.dtype is np.dtype('bool')
elif isinstance(self._data, dask_array_type):
raise NotImplementedError

def _broadcast_indexes_advanced(self, key):
variables = []

for dim, value in zip(self.dims, key):
if isinstance(value, slice):
value = np.arange(self.sizes[dim])[value]
Review comment (Member Author): Use the slice.indices method here, to construct the desired array without indexing:

In [14]: x = slice(0, 10, 2)

In [15]: np.arange(*x.indices(5))
Out[15]: array([0, 2, 4])
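
Applied to the line in the diff above, that suggestion reads roughly as follows (a sketch; self.sizes[dim] is the axis length, as in the diff):

if isinstance(value, slice):
    value = np.arange(*value.indices(self.sizes[dim]))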


try: # TODO we need our own Exception.
variable = as_variable(value, name=dim)
except ValueError as e:
if "cannot set variable" in str(e):
Review comment (Member): My current implementation for this exception handling is rather bad. I want to change this to something like

except DimensionMismatchError:
    raise IndexError('...')

Review comment (Member Author): I agree. Can you switch this for as_variable?

Inherit from ValueError for backwards compatibility. Something like MissingDimensionsError should be more broadly useful for cases like this where we can't safely guess a dimension name.

raise IndexError("Unlabelled multi-dimensional array "
"cannot be used for indexing.")
else:
raise e
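
A rough sketch of the suggested exception (an assumption about its final form; the name MissingDimensionsError comes from the commit of the same name in this PR, but its exact location and message below are not taken from the diff):

class MissingDimensionsError(ValueError):
    """Error raised when a Variable's dimension names cannot be guessed.

    Inherits from ValueError for backwards compatibility with code that
    catches the error previously raised by as_variable.
    """

# the except clause in _broadcast_indexes_advanced would then become:
#     except MissingDimensionsError:
#         raise IndexError("Unlabelled multi-dimensional array "
#                          "cannot be used for indexing.")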
if variable._isbool_type(): # boolean indexing case
Review comment (Member Author): Can we make this just variable.dtype.kind == 'b'?

variables.extend(list(variable.nonzero()))
Review comment (Member Author): I'm a little hesitant to allow multi-dimensional boolean indexers here. The problem with using extend() here is that method calls like a.isel(x=b) get mapped into something like a[b], so if b is multi-dimensional, the second dimension of b gets matched up with the second dimension of a in an unpredictable way. We would need some way to specify the mapping to multiple dimensions, something like a.isel((x,y)=b) (obviously not valid syntax).

Instead, I would error for boolean indexers with more than one dimension, and then convert with nonzero(), as you've done here.

Multi-dimensional boolean indexers are useful, but I think the main use-case is indexing with a single argument like x[y > 0], so we don't need fancy remapping between dimensions.

Review comment (Member): I agree. I added the sanity check for the boolean array.

Review comment (Member Author): You don't need list() here -- list.extend() can handle any iterable, including the tuple returned by nonzero().

else:
variables.append(variable)
variables = _broadcast_compat_variables(*variables)
dims = variables[0].dims # all variables have the same dims
key = tuple(variable.data for variable in variables)
return dims, key
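
For reference, a condensed sketch (not the diff's code) of the boolean-indexer handling that the discussion above converges on: reject boolean indexers with more than one dimension, convert 1-d boolean indexers with nonzero(), and feed the resulting tuple straight into extend():

# inside _broadcast_indexes_advanced, for each indexer `variable`:
if variable.dtype.kind == 'b':  # boolean indexing case
    if variable.ndim > 1:
        raise IndexError("{}-dimensional boolean indexing is not "
                         "supported".format(variable.ndim))
    variables.extend(variable.nonzero())  # nonzero() returns a tuple
else:
    variables.append(variable)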

def __getitem__(self, key):
"""Return a new Array object whose contents are consistent with
getting the provided key from the underlying data.

NB. __getitem__ and __setitem__ implement "orthogonal indexing" like
netCDF4-python, where the key can only include integers, slices
(including `Ellipsis`) and 1d arrays, each of which are applied
orthogonally along their respective dimensions.
NB. __getitem__ and __setitem__ implement "diagonal indexing" like
Review comment (Member Author): I'm not sure I like the name "diagonal indexing".

np.ndarray.

The difference does not matter in most cases unless you are using
numpy's "fancy indexing," which can otherwise result in data arrays
whose shapes are inconsistent with (or just uninterpretable given) the
variable's dimensions.

If you really want to do indexing like `x[x > 0]`, manipulate the numpy
Review comment (Member Author): I think we still need this around, until we support boolean indexing with multi-dimensional indexers.

array `x.values` directly.
This method will replace __getitem__ after we make sure its stability.
Review comment (Member Author): delete

"""
key = self._item_key_to_tuple(key)
key = indexing.expanded_indexer(key, self.ndim)
dims = tuple(dim for k, dim in zip(key, self.dims)
if not isinstance(k, integer_types))
values = self._indexable_data[key]
# orthogonal indexing should ensure the dimensionality is consistent
dims, index_tuple = self._broadcast_indexes(key)
values = self._indexable_data[index_tuple]
if hasattr(values, 'ndim'):
assert values.ndim == len(dims), (values.ndim, len(dims))
else:
@@ -412,15 +473,15 @@ def __setitem__(self, key, value):

See __getitem__ for more details.
"""
key = self._item_key_to_tuple(key)
dims, index_tuple = self._broadcast_indexes(key)
if isinstance(self._data, dask_array_type):
raise TypeError("this variable's data is stored in a dask array, "
Review comment (Member Author): This is actually no longer true -- dask.array supports assignment in recent versions.

Review comment (Member): @shoyer
Currently, array[0, 1] = 1 gives NotImplementedError: Item assignment with <class 'tuple'> not supported

I couldn't find its API. This one?

'which does not support item assignment. To '
'assign to this variable, you must first load it '
'into memory explicitly using the .load_data() '
'method or accessing its .values attribute.')
data = orthogonally_indexable(self._data)
data[key] = value
data = broadcasted_indexable(self._data)
data[index_tuple] = value
Review comment (Member Author, @shoyer, Jul 16, 2017): value should be broadcast against the indexing key if possible.

If it's an xarray.Variable, I think we can just call value.set_dims(dims) on it. If it's a NumPy/Dask array, I think we can safely ignore it and let NumPy/Dask handle broadcasting.
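
To illustrate that last comment, a hedged sketch of how __setitem__ could broadcast an xarray.Variable value before assignment (dims, index_tuple, and broadcasted_indexable are the names already used in the diff; the use of .data after set_dims is an assumption, not taken from the PR):

# inside Variable.__setitem__, after computing dims and index_tuple:
if isinstance(value, Variable):
    # align the value's dimensions with the dimensions produced by the key
    value = value.set_dims(dims).data
# plain NumPy/dask arrays are left alone; their own broadcasting rules apply
data = broadcasted_indexable(self._data)
data[index_tuple] = value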


@property
def attrs(self):
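
Finally, to see the behavior this PR introduces end to end, a small usage sketch (assuming this branch is installed; not an excerpt from the PR itself):

import numpy as np
import xarray as xr

var = xr.Variable(('x', 'y'), np.arange(12).reshape(3, 4))
ind = xr.Variable(('a',), [0, 2])
# With broadcasting-based ("vectorized") indexing, the two indexers are
# broadcast against each other, so the result has the indexers' dimension 'a'
# and picks the points (0, 0) and (2, 2):
print(var[ind, ind])  # values [0, 10], dims ('a',)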