MNT: Correct spelling errors identified by codespell
Also work around a couple (reasonable) false alarms.
dopplershift committed Jul 1, 2021
1 parent fd72202 commit edb9828
Showing 16 changed files with 36 additions and 36 deletions.
4 changes: 2 additions & 2 deletions .github/workflows/draft-release.yml
@@ -33,13 +33,13 @@ jobs:
const issues = items.data.filter(i => !i.pull_request);
const issue_text = '### Issues Closed\n' +
issues.map(i => `* [Issue ${i.number}](${i.html_url}) - ${i.title}\n`)
- .join('') + `\nIn this release ${issues.length} issues were closed.`;
+ .join('') + `\n${issues.length} issues were closed in this release.`;
// Now do the same for the PRs
const prs = items.data.filter(i => !!i.pull_request);
const pr_text = '### Pull Requests Merged\n' +
prs.map(i => `* [PR ${i.number}](${i.html_url}) - ${i.title}, by @${i.user.login}\n`)
- .join('') + `\nIn this release ${prs.length} pull requests were closed.`;
+ .join('') + `\n${prs.length} pull requests were closed in this release.`;
// Add the list of people who contributed PRs
const contrib_text = '### Contributors\n' +
10 changes: 5 additions & 5 deletions docs/userguide/gempak.rst
@@ -1502,7 +1502,7 @@ blue is uncertain of parity, and white is unevaluated.
</tr>
<tr>
<td>ENS_SWSPRD(input_arg1 & input_arg2)</td>
- <td>Compute the spred, similar to ENS_SSPRD, of a scalar diagnostic field over an ensemble. The spread is weighted and input_arg2 specifies the name of a weight grid to be used in the calculation. NOTE: For uniform weights ens_ssprd (input_arg1) might be expected to give the same result as ens_swsprd(input_arg1 &amp; w1) where w1 is uniform field of 1. This does not happen because of the division by (N-1) in ens_ssprd. The same is also true in comparing ens_vsprd and ens_vwsprd results.</td>
+ <td>Compute the spread, similar to ENS_SSPRD, of a scalar diagnostic field over an ensemble. The spread is weighted and input_arg2 specifies the name of a weight grid to be used in the calculation. NOTE: For uniform weights ens_ssprd (input_arg1) might be expected to give the same result as ens_swsprd(input_arg1 &amp; w1) where w1 is uniform field of 1. This does not happen because of the division by (N-1) in ens_ssprd. The same is also true in comparing ens_vsprd and ens_vwsprd results.</td>
<td></td>
<td></td>
<td></td>
@@ -1526,7 +1526,7 @@ blue is uncertain of parity, and white is unevaluated.
</tr>
<tr>
<td>ENS_VWSPRD(input_arg1 & input_arg2)</td>
- <td>Compute the spred, similar to ENS_VSPRD, of a vector diagnostic field over an ensemble. The spread is weighted and input_arg2 specifies the name of a weight grid to be used in the calculation. Also, see NOTE for function ENS_SWSPRD.</td>
+ <td>Compute the spread, similar to ENS_VSPRD, of a vector diagnostic field over an ensemble. The spread is weighted and input_arg2 specifies the name of a weight grid to be used in the calculation. Also, see NOTE for function ENS_SWSPRD.</td>
<td></td>
<td></td>
<td></td>
@@ -1955,7 +1955,7 @@ blue is uncertain of parity, and white is unevaluated.
</tr>
<tr>
<td>dctrop</td>
- <td>Hurricane/tropical storm reprots</td>
+ <td>Hurricane/tropical storm reports</td>
<td></td>
<td></td>
<td></td>
@@ -4643,7 +4643,7 @@ blue is uncertain of parity, and white is unevaluated.
</tr>
<tr>
<td>SK12</td>
- <td>maximum sustain surface wind spped fcst for 12-hr period</td>
+ <td>maximum sustain surface wind speed fcst for 12-hr period</td>
<td></td>
<td></td>
<td></td>
@@ -7419,7 +7419,7 @@ blue is uncertain of parity, and white is unevaluated.
</tr>
<tr>
<td>RADFRQ</td>
- <td>tThe update frequency for RADAR composites.</td>
+ <td>The update frequency for RADAR composites.</td>
<td></td>
<td></td>
<td></td>
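
As a rough numerical illustration of the NOTE above (plain NumPy, invented member values, and an assumed weight-sum normalization for the weighted spread), uniform weights of 1 differ from ENS_SSPRD only through the (N-1) factor:

import numpy as np

# Five hypothetical ensemble members at one grid point
members = np.array([271.2, 272.8, 270.9, 273.1, 272.0])

# ENS_SSPRD-style spread: sample standard deviation, normalized by (N - 1)
ssprd = members.std(ddof=1)

# A weighted spread with uniform weights of 1 presumably normalizes by the sum of
# the weights (N), so it differs from ENS_SSPRD by a factor of sqrt((N - 1) / N)
weights = np.ones_like(members)
wmean = np.average(members, weights=weights)
swsprd = np.sqrt(np.average((members - wmean) ** 2, weights=weights))

print(ssprd, swsprd, ssprd * np.sqrt((len(members) - 1) / len(members)))
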
2 changes: 1 addition & 1 deletion docs/userguide/media.rst
@@ -3,7 +3,7 @@ Talks and Other Media
---------------------

* `AMS 2020 talk`_ on MetPy being ready for a 1.0 release
- * `AMS 2019 talk`_ on bringing GEMPAK-like syntax to MetPy's declaritive plotting interface
+ * `AMS 2019 talk`_ on bringing GEMPAK-like syntax to MetPy's declarative plotting interface
* `AMS 2019 poster`_ on recent development and community building with MetPy
* `SciPy 2018 poster`_ and `abstract <http://johnrleeman.com/pubs/2018/Leeman_2018_SciPy_Abstract.pdf>`_ on building community by John Leeman
* `SciPy 2018 talk`_ on prototyping MetPy's future declarative plotting interface
2 changes: 1 addition & 1 deletion examples/plots/Station_Plot_with_Layout.py
@@ -154,7 +154,7 @@
# or instead, a custom layout can be used:

# Just winds, temps, and dewpoint, with colors. Dewpoint and temp will be plotted
- # out to Farenheit tenths. Extra data will be ignored
+ # out to Fahrenheit tenths. Extra data will be ignored
custom_layout = StationPlotLayout()
custom_layout.add_barb('eastward_wind', 'northward_wind', units='knots')
custom_layout.add_value('NW', 'air_temperature', fmt='.1f', units='degF', color='darkred')
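
For context, a minimal sketch of driving a custom layout like the one above with a single invented station; the coordinates, values, and figure setup are stand-ins for the example's real surface observations:

import cartopy.crs as ccrs
import matplotlib.pyplot as plt
import numpy as np
from metpy.plots import StationPlot, StationPlotLayout
from metpy.units import units

custom_layout = StationPlotLayout()
custom_layout.add_barb('eastward_wind', 'northward_wind', units='knots')
custom_layout.add_value('NW', 'air_temperature', fmt='.1f', units='degF', color='darkred')
custom_layout.add_value('SW', 'dew_point_temperature', fmt='.1f', units='degF',
                        color='darkgreen')

# One made-up station; any extra keys in the dict are simply ignored by the layout
data = {'eastward_wind': np.array([3.0]) * units('m/s'),
        'northward_wind': np.array([4.0]) * units('m/s'),
        'air_temperature': np.array([21.5]) * units.degC,
        'dew_point_temperature': np.array([12.3]) * units.degC}

fig = plt.figure(figsize=(4, 4))
ax = fig.add_subplot(1, 1, 1, projection=ccrs.PlateCarree())
stationplot = StationPlot(ax, np.array([-105.0]), np.array([40.0]),
                          transform=ccrs.PlateCarree(), fontsize=12)
custom_layout.plot(stationplot, data)
plt.show()
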
14 changes: 7 additions & 7 deletions src/metpy/calc/basic.py
@@ -794,9 +794,9 @@ def smooth_gaussian(scalar_grid, n):
sgma = n / (2 * np.pi)

# Construct sigma sequence so smoothing occurs only in horizontal direction
- nax = len(scalar_grid.shape)
+ num_ax = len(scalar_grid.shape)
# Assume the last two axes represent the horizontal directions
- sgma_seq = [sgma if i > nax - 3 else 0 for i in range(nax)]
+ sgma_seq = [sgma if i > num_ax - 3 else 0 for i in range(num_ax)]

# Compute smoothed field
return gaussian_filter(scalar_grid, sgma_seq, truncate=2 * np.sqrt(2))
@@ -837,7 +837,7 @@ def smooth_window(scalar_grid, window, passes=1, normalize_weights=True):
function will leave an unsmoothed edge of size `(n - 1) / 2` for each `n` in the shape of
`window` around the data). If a masked value or NaN values exists in the array, it will
propagate to any point that uses that particular grid point in the smoothing calculation.
- Applying the smoothing function multiple times will propogate NaNs further throughout the
+ Applying the smoothing function multiple times will propagate NaNs further throughout the
domain.
See Also
@@ -861,7 +861,7 @@ def _trailing_dims(indexer):
# Add ... to the front of an indexer, since we are working with trailing dimensions.
return (Ellipsis,) + tuple(indexer)

- # Verify that shape in all dimensions is odd (need to have a neighboorhood around a
+ # Verify that shape in all dimensions is odd (need to have a neighborhood around a
# central point)
if any((size % 2 == 0) for size in window.shape):
raise ValueError('The shape of the smoothing window must be odd in all dimensions.')
@@ -921,7 +921,7 @@ def smooth_rectangular(scalar_grid, size, passes=1):
function will leave an unsmoothed edge of size `(n - 1) / 2` for each `n` in `size` around
the data). If a masked value or NaN values exists in the array, it will propagate to any
point that uses that particular grid point in the smoothing calculation. Applying the
- smoothing function multiple times will propogate NaNs further throughout the domain.
+ smoothing function multiple times will propagate NaNs further throughout the domain.
See Also
--------
@@ -960,7 +960,7 @@ def smooth_circular(scalar_grid, radius, passes=1):
function will leave an unsmoothed edge of size `radius` around the data). If a masked
value or NaN values exists in the array, it will propagate to any point that uses that
particular grid point in the smoothing calculation. Applying the smoothing function
- multiple times will propogate NaNs further throughout the domain.
+ multiple times will propagate NaNs further throughout the domain.
See Also
--------
@@ -1007,7 +1007,7 @@ def smooth_n_point(scalar_grid, n=5, passes=1):
the end points with their original values (this function will leave an unsmoothed edge of
size 1 around the data). If a masked value or NaN values exists in the array, it will
propagate to any point that uses that particular grid point in the smoothing calculation.
- Applying the smoothing function multiple times will propogate NaNs further throughout the
+ Applying the smoothing function multiple times will propagate NaNs further throughout the
domain.
See Also
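
As a quick illustration of the NaN-propagation behavior these docstrings describe (invented data, not from the MetPy test suite), each additional pass lets a single NaN contaminate more of the grid:

import numpy as np
import metpy.calc as mpcalc

field = np.random.default_rng(42).normal(size=(20, 20))
field[10, 10] = np.nan

one_pass = mpcalc.smooth_n_point(field, n=5, passes=1)
four_passes = mpcalc.smooth_n_point(field, n=5, passes=4)

# The single NaN spreads outward with every pass
print(np.isnan(one_pass).sum(), np.isnan(four_passes).sum())

# smooth_gaussian smooths only along the last two (horizontal) dimensions
smoothed = mpcalc.smooth_gaussian(field, n=8)
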
2 changes: 1 addition & 1 deletion src/metpy/calc/cross_sections.py
@@ -94,7 +94,7 @@ def latitude_from_cross_section(cross):

@exporter.export
def unit_vectors_from_cross_section(cross, index='index'):
r"""Calculate the unit tanget and unit normal vectors from a cross-section.
r"""Calculate the unit tangent and unit normal vectors from a cross-section.
Given a path described parametrically by :math:`\vec{l}(i) = (x(i), y(i))`, we can find
the unit tangent vector by the formula:
4 changes: 2 additions & 2 deletions src/metpy/calc/tools.py
@@ -121,7 +121,7 @@ def find_intersections(x, a, b, direction='all', log_x=False):
Notes
-----
- This function implicity converts `xarray.DataArray` to `pint.Quantity`, with the results
+ This function implicitly converts `xarray.DataArray` to `pint.Quantity`, with the results
given as `pint.Quantity`.
"""
@@ -1308,7 +1308,7 @@ def parse_angle(input_dir):
"""Calculate the meteorological angle from directional text.
Works for abbrieviations or whole words (E -> 90 | South -> 180)
- and also is able to parse 22.5 degreee angles such as ESE/East South East.
+ and also is able to parse 22.5 degree angles such as ESE/East South East.
Parameters
----------
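
A short sketch of parse_angle on the kinds of inputs the docstring mentions; the expected values in the comments follow the E -> 90 / South -> 180 examples above:

import metpy.calc as mpcalc

print(mpcalc.parse_angle('E'))                # 90 degrees
print(mpcalc.parse_angle('South'))            # 180 degrees
print(mpcalc.parse_angle('ESE'))              # 112.5 degrees
print(mpcalc.parse_angle('East South East'))  # 112.5 degrees
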
2 changes: 1 addition & 1 deletion src/metpy/interpolate/grid.py
@@ -259,7 +259,7 @@ def interpolate_to_grid(x, y, z, interp_type='linear', hres=50000,
rbf_func: str
Specifies which function to use for Rbf interpolation.
Options include: 'multiquadric', 'inverse', 'gaussian', 'linear', 'cubic',
- 'quintic', and 'thin_plate'. Defualt 'linear'. See `scipy.interpolate.Rbf` for more
+ 'quintic', and 'thin_plate'. Default 'linear'. See `scipy.interpolate.Rbf` for more
information.
rbf_smooth: float
Smoothing value applied to rbf interpolation. Higher values result in more smoothing.
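
For context, a minimal sketch of interpolate_to_grid with the Rbf options documented above; the observation locations, values, and grid spacing are invented:

import numpy as np
from metpy.interpolate import interpolate_to_grid

# A handful of made-up observations on an x/y plane (meters)
x = np.array([0., 30000., 80000., 50000., 100000.])
y = np.array([0., 90000., 20000., 60000., 50000.])
z = np.array([1.0, 4.1, 2.2, 3.5, 2.9])

grid_x, grid_y, img = interpolate_to_grid(x, y, z, interp_type='rbf',
                                          rbf_func='linear', rbf_smooth=0,
                                          hres=10000)
print(img.shape)
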
2 changes: 1 addition & 1 deletion src/metpy/interpolate/one_dimension.py
@@ -178,7 +178,7 @@ def interpolate_1d(x, xp, *args, axis=0, fill_value=np.nan, return_list_always=F
def log_interpolate_1d(x, xp, *args, axis=0, fill_value=np.nan):
r"""Interpolates data with logarithmic x-scale over a specified axis.
- Interpolation on a logarithmic x-scale for interpolation values in pressure coordintates.
+ Interpolation on a logarithmic x-scale for interpolation values in pressure coordinates.
Parameters
----------
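
A brief sketch of log_interpolate_1d as documented above, interpolating invented heights to a pressure level not present in the profile:

import numpy as np
from metpy.interpolate import log_interpolate_1d
from metpy.units import units

pressure = np.array([1000., 850., 700., 500., 300.]) * units.hPa
height = np.array([110., 1450., 3010., 5570., 9160.]) * units.meter

# Interpolate height to 600 hPa on a logarithmic pressure scale
print(log_interpolate_1d(np.array([600.]) * units.hPa, pressure, height))
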
2 changes: 1 addition & 1 deletion src/metpy/interpolate/points.py
@@ -321,7 +321,7 @@ def interpolate_to_points(points, values, xi, interp_type='linear', minimum_neig
rbf_func: str
Specifies which function to use for Rbf interpolation.
Options include: 'multiquadric', 'inverse', 'gaussian', 'linear', 'cubic',
- 'quintic', and 'thin_plate'. Defualt 'linear'. See `scipy.interpolate.Rbf` for more
+ 'quintic', and 'thin_plate'. Default 'linear'. See `scipy.interpolate.Rbf` for more
information.
rbf_smooth: float
Smoothing value applied to rbf interpolation. Higher values result in more smoothing.
2 changes: 1 addition & 1 deletion src/metpy/pandas.py
@@ -14,7 +14,7 @@ def preprocess_pandas(func):
@functools.wraps(func)
def wrapper(*args, **kwargs):
# not using hasattr(a, values) because it picks up dict.values()
- # and this is more explictly handling pandas
+ # and this is more explicitly handling pandas
args = tuple(a.values if isinstance(a, pd.Series) else a for a in args)
kwargs = {name: (v.values if isinstance(v, pd.Series) else v)
for name, v in kwargs.items()}
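
The decorator in this hunk amounts to the following standalone sketch (renamed here so it is clearly an illustration rather than the MetPy import):

import functools

import numpy as np
import pandas as pd


def preprocess_pandas_sketch(func):
    """Convert pandas Series arguments to plain numpy arrays before calling func."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        args = tuple(a.values if isinstance(a, pd.Series) else a for a in args)
        kwargs = {name: (v.values if isinstance(v, pd.Series) else v)
                  for name, v in kwargs.items()}
        return func(*args, **kwargs)
    return wrapper


@preprocess_pandas_sketch
def mean_of(values):
    return np.mean(values)


print(mean_of(pd.Series([1.0, 2.0, 3.0])))  # 2.0, computed on the underlying ndarray
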
6 changes: 3 additions & 3 deletions src/metpy/plots/declarative.py
@@ -612,7 +612,7 @@ class MapPanel(Panel):
default_value=None)
area.__doc__ = """A tuple or string value that indicates the graphical area of the plot.
- The tuple value coresponds to longitude/latitude box based on the projection of the map
+ The tuple value corresponds to longitude/latitude box based on the projection of the map
with the format (west-most longitude, east-most longitude, south-most latitude,
north-most latitude). This tuple defines a box from the lower-left to the upper-right
corner.
@@ -622,7 +622,7 @@ class MapPanel(Panel):
For a CONUS region, the following strings can be used: 'us', 'spcus', 'ncus', and 'afus'.
For regional plots, US postal state abbreviations can be used, such as 'co', 'ny', 'ca',
et cetera. Providing a '+' or '-' suffix to the string value will zoom in or out,
- respectivley. Providing multiple '+' or '-' characters will zoom in or out further.
+ respectively. Providing multiple '+' or '-' characters will zoom in or out further.
"""

@@ -1277,7 +1277,7 @@ class PlotVector(Plots2D):
earth-relative.
Common gridded meteorological datasets including GFS and NARR output contain wind
- components that are earth-relative. The primary expection is NAM output with wind
+ components that are earth-relative. The primary exception is NAM output with wind
components that are grid-relative. For any grid-relative vectors set this trait to `False`.
"""

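
A small sketch of the area trait described above, using a postal-code region with a '+' zoom suffix; the layer names and figure size are ordinary choices made for illustration:

from metpy.plots import MapPanel, PanelContainer

panel = MapPanel()
panel.area = 'co+'                            # Colorado, zoomed in one step
panel.layers = ['coastline', 'borders', 'states']
panel.title = 'MapPanel area demo'

pc = PanelContainer()
pc.size = (8, 6)
pc.panels = [panel]
pc.show()
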
8 changes: 4 additions & 4 deletions src/metpy/xarray.py
@@ -105,7 +105,7 @@ class MetPyDataArrayAccessor:
This accessor provides several convenient attributes and methods through the `.metpy`
attribute on a DataArray. For example, MetPy can identify the coordinate corresponding
- to a particular axis (given sufficent metadata):
+ to a particular axis (given sufficient metadata):
>>> import xarray as xr
>>> from metpy.units import units
@@ -588,7 +588,7 @@ def assign_latitude_longitude(self, force=False):
Returns
-------
`xarray.DataArray`
- New xarray DataArray with latitude and longtiude auxilary coordinates assigned.
+ New xarray DataArray with latitude and longtiude auxiliary coordinates assigned.
Notes
-----
@@ -679,7 +679,7 @@ def parse_cf(self, varname=None, coordinates=None):
mapping metadata with the ``.assign_crs`` method.
This method operates on individual data variables within the dataset, so do not be
- suprised if information not associated with individual data variables is not
+ surprised if information not associated with individual data variables is not
preserved.
Parameters
@@ -820,7 +820,7 @@ def sel(self, indexers=None, method=None, tolerance=None, drop=False, **indexers
return self._dataset.sel(indexers, method=method, tolerance=tolerance, drop=drop)

def assign_crs(self, cf_attributes=None, **kwargs):
"""Assign a CRS to this Datatset based on CF projection attributes.
"""Assign a CRS to this Dataset based on CF projection attributes.
Specify a coordinate reference system/grid mapping following the Climate and
Forecasting (CF) conventions (see `Appendix F: Grid Mappings
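
For orientation, a rough sketch of the accessor workflow these docstrings describe; the file name and variable name are hypothetical:

import xarray as xr

import metpy.calc  # noqa: F401  -- importing a MetPy module registers the .metpy accessors

ds = xr.open_dataset('gfs_output.nc')  # hypothetical CF-compliant file

# Parse CF grid-mapping metadata for one variable
temperature = ds.metpy.parse_cf('Temperature_isobaric')

# With sufficient metadata, coordinates can be pulled by axis type
print(temperature.metpy.x)

# Add latitude/longitude auxiliary coordinates if only projected x/y exist
temperature = temperature.metpy.assign_latitude_longitude()
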
2 changes: 1 addition & 1 deletion tests/calc/test_cross_sections.py
@@ -148,7 +148,7 @@ def test_distances_from_cross_section_given_xy(test_cross_xy):


def test_distances_from_cross_section_given_bad_coords(test_cross_xy):
"""Ensure an AttributeError is raised when the cross section lacks neeed coordinates."""
"""Ensure an AttributeError is raised when the cross section lacks need coordinates."""
with pytest.raises(AttributeError):
distances_from_cross_section(test_cross_xy['u_wind'].drop_vars('x'))

6 changes: 3 additions & 3 deletions tutorials/declarative_tutorial.py
@@ -104,8 +104,8 @@
#
# - ``PlotBarbs()``
#
- # More complete descrptions of these and other plotting types, as well as the map panel and
- # panel containter classes are at the end of this tutorial.
+ # More complete descriptions of these and other plotting types, as well as the map panel and
+ # panel container classes are at the end of this tutorial.
#
# Let's plot a 300-hPa map with color-filled wind speed, which we calculated and added to
# our Dataset above, and geopotential heights over the CONUS.
@@ -187,7 +187,7 @@
# set of attributes to control plotting a vector quantity.

#########################################################################
- # We start with setting the attributes that we had before for our 300 hPa plot inlcuding,
+ # We start with setting the attributes that we had before for our 300 hPa plot including,
# Geopotential Height contours, and color-filled wind speed.

# Set attributes for contours of Geopotential Heights at 300 hPa
4 changes: 2 additions & 2 deletions tutorials/xarray_tutorial.py
@@ -61,7 +61,7 @@
#
# If you are more interested in learning about xarray's terminology and data structures, see
# the `terminology section <http://xarray.pydata.org/en/stable/terminology.html>`_ of xarray's
- # documenation.
+ # documentation.
#
# Coordinates and Coordinate Reference Systems
# --------------------------------------------
@@ -434,7 +434,7 @@
# **Undefined Unit Error**
#
# If the units attribute on your xarray data is not recognizable by Pint, you will likely
- # recieve an ``UndefinedUnitError``. In this case, you will likely have to update the units
+ # receive an ``UndefinedUnitError``. In this case, you will likely have to update the units
# attribute to one that can be parsed properly by Pint. It is our aim to have all valid
# CF/UDUNITS unit strings be parseable, but this work is ongoing. If many variables in your
# dataset are not parseable, the ``.update_attribute`` method on the MetPy accessor may come
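
A hedged sketch of the units workaround described above; the file name and the unparseable unit string are invented:

import xarray as xr

import metpy.calc  # noqa: F401  -- registers the .metpy accessor

ds = xr.open_dataset('model_output.nc')       # hypothetical file

# Suppose the file stored the unparseable string 'Deg C'; replace it with one
# Pint understands before quantifying the variable
ds['temperature'].attrs['units'] = 'degC'
temperature = ds['temperature'].metpy.quantify()
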
