Merge branch 'develop' into hysplitdev
Alice Crawford committed Oct 9, 2023
2 parents e8ba2e6 + 9a1f798 commit c8f668d
Showing 44 changed files with 543 additions and 260 deletions.
37 changes: 37 additions & 0 deletions .codespell-exclude
@@ -0,0 +1,37 @@
<th>CAf</th>
<td>CAf</td>
"SIZ", "Size distribution"
"ALL", "All of the above retrievals (SIZ to FLUX) in one file"
hda = hysplit.combine_dataset([file1,file2], drange=[d1,d2])
print(hda)
metdata = ish_mod.ISH()
df = metdata.add_data(dates, country=None, box=area, resample=False)
"CNA",
d["CAf"] = add_multiple_lazy(d, newkeys, weights=neww)
d["CAf"] = d["CAf"].assign_attrs(
{"units": r"$\mu g m^{-3}$", "name": "CAf", "long_name": "Fine Mode particulate CA"}
"CAf": ["CAf"],
# value is tuple (filename, metdata)
Scrip file path for unstructured grid output
if "CAf" in var_list:
"PRES": "pres_pa_mid",
CMAQ model data including new CAf calculation
# allvars = Series(["TEMP", "Q", "PRES"])
# "p": dset["PRES"][:].compute().values
var_list.append("pres")
if var == "pres": # Insert special versions.
"SIZ",
df.loc[con, "variable"] = "Caf"
df.loc[con, "variable"] = "Laf"
["AZ", "CO", "ID", "KS", "MT", "NE", "NV", "NM", "ND", "SD", "UT", "WY"], dtype="|S12"
r = array(["AZ", "CO", "ID", "KS", "MT", "NE", "NV", "NM", "ND", "SD", "UT", "WY"])
ser = array(["Southeast" for i in se])
region = concatenate([ser, ner, ncr, scr, rr, pr])
"North Dakota": "ND",
fo = self.fs.open(f)
out = xr.open_dataset(fo, engine="h5netcdf")
"""Read SNPP OMPS Nadir Mapper Total Column Ozone L2 data from NASA GES DISC
a.prod = "SIZ"
df = aeronet.add_data(dates, inv_type="ALM15", product="SIZ")
df = aeronet.add_data(dates, inv_type="HYB15", product="SIZ")
# https://aeronet.gsfc.nasa.gov/cgi-bin/print_web_data_inv_v3?site=Cart_Site&year=2002&month=6&day=1&year2=2003&month2=6&day2=14&product=SIZ&AVG=20&ALM15=1&if_no_html=1
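The `.codespell-exclude` entries above are verbatim source lines (mostly containing intentional spellings like `CAf` and `SIZ`) that codespell should not flag. A rough stdlib sketch of how such verbatim line exclusion behaves — the helper names here are illustrative, not codespell internals:

```python
def load_exclude_lines(path):
    """Read an exclude file into a set of verbatim lines (codespell-style)."""
    with open(path, encoding="utf-8") as f:
        return {line.rstrip("\n") for line in f}


def lines_to_check(source_lines, exclude_lines):
    """Yield only lines that are not exact matches in the exclude set."""
    for line in source_lines:
        if line not in exclude_lines:
            yield line
```

Only lines that match an exclude entry character-for-character are skipped, which is why whole source lines (not just the flagged words) appear in the file above.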
39 changes: 21 additions & 18 deletions .github/workflows/ci.yml
@@ -11,7 +11,7 @@ on:

jobs:
test:
name: Test
name: Test (Py ${{ matrix.python-version }})
runs-on: ubuntu-latest
if: github.repository == 'noaa-oar-arl/monetio'
strategy:
@@ -22,29 +22,35 @@ jobs:
shell: bash -l {0}

steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4

- name: Set up Python (micromamba)
uses: mamba-org/provision-with-micromamba@v15
- name: Set up Python (micromamba) [>3.6]
if: matrix.python-version != '3.6'
uses: mamba-org/setup-micromamba@v1
with:
environment-file: environment-dev.yml
cache-environment: true
create-args: >-
python=${{ matrix.python-version }}
- name: Set up Python (micromamba) [3.6]
if: matrix.python-version == '3.6'
uses: mamba-org/setup-micromamba@v1
with:
environment-file: environment-dev.yml
cache-env: true
extra-specs: |
cache-environment: true
create-args: >-
python=${{ matrix.python-version }}
attrs=22.2.0
- name: Test with pytest
run: pytest -n auto -v -k 'not aqs'
run: pytest -n auto -v

- name: Test with pytspack installed
run: |
pip install https://github.com/noaa-oar-arl/pytspack/archive/master.zip
pytest -n auto -v -k with_pytspack
- name: Downgrade OpenSSL and test AQS
run: |
micromamba install 'openssl <3'
pytest -n auto -v -k aqs
docs:
name: Check docs build
runs-on: ubuntu-latest
@@ -54,16 +60,13 @@ jobs:
shell: bash -l {0}

steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4

- name: Set up Python (micromamba)
uses: mamba-org/provision-with-micromamba@v15
uses: mamba-org/setup-micromamba@v1
with:
environment-file: docs/environment-docs.yml
cache-env: true

- name: Downgrade OpenSSL (for AQS URL linkcheck)
run: micromamba install 'openssl <3'
cache-environment: true

- name: linkcheck
run: sphinx-build -b linkcheck docs docs/_build/linkcheck
15 changes: 13 additions & 2 deletions .github/workflows/lint.yml
@@ -13,7 +13,7 @@ jobs:
runs-on: ubuntu-latest
if: github.repository == 'noaa-oar-arl/monetio'
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- uses: actions/setup-python@v4
with:
python-version: "3.9"
@@ -24,7 +24,18 @@
runs-on: ubuntu-latest
if: github.repository == 'noaa-oar-arl/monetio'
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- uses: citation-file-format/cffconvert-github-action@2.0.0
with:
args: "--validate"

spell:
name: codespell
runs-on: ubuntu-latest
if: github.repository == 'noaa-oar-arl/monetio'
steps:
- uses: actions/checkout@v4
- uses: codespell-project/actions-codespell@v2
with:
exclude_file: .codespell-exclude
skip: "./monetio/data/aqs_qualifiers.csv,./monetio/data/cemsinfo.csv,./monetio/data/stations.tsv"
1 change: 1 addition & 0 deletions .pre-commit-config.yaml
@@ -3,6 +3,7 @@ repos:
rev: "v4.4.0"
hooks:
- id: trailing-whitespace
exclude: tdump\.[0-9]*
- id: end-of-file-fixer
- id: check-docstring-first
- id: check-yaml
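The new `exclude: tdump\.[0-9]*` entry keeps the trailing-whitespace hook away from HYSPLIT `tdump` fixture files. pre-commit treats this as a Python regular expression matched against file paths; a quick check of what it matches (assuming `re.search`-style semantics):

```python
import re

# Pattern added to .pre-commit-config.yaml for the trailing-whitespace hook
pattern = re.compile(r"tdump\.[0-9]*")

files = ["tests/data/tdump.12345", "monetio/models/hysplit.py", "tdump.7"]
excluded = [f for f in files if pattern.search(f)]
```

Note that `[0-9]*` permits zero digits, so any path containing `tdump.` matches whether or not digits follow.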
1 change: 1 addition & 0 deletions docs/conf.py
@@ -60,6 +60,7 @@
"https://doi.org/10.1080/10473289.2005.10464718",
"https://www.camx.com",
]
user_agent = "Mozilla/5.0 (X11; Linux x86_64; rv:25.0) Gecko/20100101 Firefox/25.0"

# -- Extension configuration -------------------------------------------------

2 changes: 1 addition & 1 deletion docs/environment-docs.yml
@@ -3,7 +3,7 @@ channels:
- conda-forge
- nodefaults
dependencies:
- python=3.9
- python=3.10
#
# core
- dask
4 changes: 2 additions & 2 deletions docs/index.rst
@@ -64,8 +64,8 @@ Supported datasets
* `HYSPLIT <https://www.ready.noaa.gov/HYSPLIT.php/>`_
* `CMAQ <https://www.epa.gov/cmaq/>`_
* `CAMx <https://www.camx.com/about/>`_
* FV3-CHEM (comming soon)
* WRF-CHEM (comming soon)
* FV3-CHEM (coming soon)
* WRF-CHEM (coming soon)

**Supported Observations**

4 changes: 2 additions & 2 deletions docs/models.rst
@@ -242,7 +242,7 @@ The source dimension tags which source term the run used.
Attributes:
sample time hours: 1.0
To calcluate mass loading
To calculate mass loading
.. code-block:: python
@@ -260,7 +260,7 @@ To find top heights
massload = hypslit.hysp_hysp_heights(hxr, threshold=0, height_mult=1/1000.0, mult=1e10, mass_load=False)
returns xarray DataArray which gives top height of each level which contains mass loading higher
than the given threshold value. mult is a mutiplicative factor applied before thresholding.
than the given threshold value. mult is a multiplicative factor applied before thresholding.
height_mult is a multiplicative factor used to convert heights from meters to some other unit.
In this example heights are converted to km.
mass_load is a boolean which indicates whether the height should be determined from the mass loading value (True)
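The `docs/models.rst` hunk above describes returning, per column, the top height of the highest level whose scaled mass loading exceeds a threshold, with `mult` applied before thresholding and `height_mult` converting meters to other units (km in the example). A dependency-free sketch of that logic for a single column — an illustration of the described behavior, not monetio's implementation:

```python
def top_height(conc_profile, level_heights, threshold=0.0,
               height_mult=1 / 1000.0, mult=1e10):
    """Top height whose scaled concentration exceeds `threshold`.

    Sketch of the behavior described in docs/models.rst: `mult` scales the
    raw values before thresholding; `height_mult` converts the returned
    height (e.g. meters -> km). Levels are assumed ordered bottom to top.
    Returns None when no level qualifies.
    """
    top = None
    for conc, height in zip(conc_profile, level_heights):
        if conc * mult > threshold:
            top = height * height_mult  # highest qualifying level wins
    return top
```

For a profile `[0, 2e-10, 1e-10, 0]` over level heights `[1000, 2000, 3000, 4000]` m with `threshold=0.5`, the scaled values are `[0, 2, 1, 0]`, so the top qualifying level is 3000 m, returned as 3.0 km.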
26 changes: 13 additions & 13 deletions docs/observations.rst
@@ -23,7 +23,7 @@ AirNow is the near realtime dataset for air composition and meteorology measurem

"The U.S. EPA AirNow program is the national repository of real time air quality data and forecasts for the United States. AirNow is the vehicle for providing timely Air Quality Index (AQI) information to the public, media outlets, other federal agencies and their applications, and to the research community. The system is managed by the U.S. EPA’s Office of Air Quality Planning and Standards Outreach and Information Division, Information Transfer Group in Research Triangle Park (RTP), North Carolina. AirNow is currently hosted and operated at a contractor facility, known as the AirNow Data Management Center (DMC), which currently resides outside of RTP." - https://www.airnow.gov/index.cfm?action=ani.airnowUS

AirNow_ data can be dowloaded from the Amazon S3 server and aggregated using the
AirNow_ data can be downloaded from the Amazon S3 server and aggregated using the
monet.obs.airnow class. For example,lets say that we want to look at data from
2018-05-01 to 2018-05-05.

@@ -46,7 +46,7 @@ This provides a structured :py:class:`~pandas.DataFrame`.
df.head()
Some users may want to keep a local copy of the data and not have to retrive the data
Some users may want to keep a local copy of the data and not have to retrieve the data
each time they want to access the data. There is a simple kwarg that can be used to
download the data, *download=True*. By default, *download* is set to False.

@@ -93,8 +93,8 @@ MONET is able to use the EPA AQS data that is collected and reported on an hourl
prepare reports for Congress as mandated by the Clean Air Act." - https://www.epa.gov/aqs

We will begin by loading hourly ozone concentrations from 2018. The EPA AQS data
is seperated into yearly files and seperate files for hourly and daily data. The
files are also seperated by which variable is measured. For instance, hourly ozone files
is separated into yearly files and separate files for hourly and daily data. The
files are also separated by which variable is measured. For instance, hourly ozone files
for the entire year of 2018 are found in https://aqs.epa.gov/aqsweb/airdata/hourly_44201_2018.zip.
We will first load a single variable and then add multiple later on.

@@ -146,13 +146,13 @@ Let's load variables PM10 and OZONE using hourly data to get an idea of how to g
df = aqs.add_data(dates, param=['OZONE','PM10'])
Loading Specfic Network
^^^^^^^^^^^^^^^^^^^^^^^
Loading Specific Network
^^^^^^^^^^^^^^^^^^^^^^^^

Sometimes you may want to load a specific network that is available in the AQS data
files. For instance, lets load data from the Chemical Speciation Network (CSN;
https://www.epa.gov/amtic/chemical-speciation-network-csn).
As of writting this tutorial we will load the 2017 data as it is complete.
As of writing this tutorial we will load the 2017 data as it is complete.

.. code:: python
@@ -176,7 +176,7 @@ AERONET

"The AERONET (AErosol RObotic NETwork) project is a federation of ground-based
remote sensing aerosol networks established by NASA and PHOTONS (PHOtométrie pour le Traitement Opérationnel de Normalisation Satellitaire; Univ. of Lille 1, CNES, and CNRS-INSU)
and is greatly expanded by networks (e.g., RIMA, AeroSpan, AEROCAN, and CARSNET) and collaborators from national agencies, institutes, universities, individual scientists, and partners. Fo more than 25 years, the project has provided long-term, continuous and readily accessible public domain database of aerosol optical, microphysical and radiative properties for aerosol research and characterization, validation of satellite retrievals, and synergism with other databases. The network imposes standardization of instruments, calibration, processing and distribution.
and is greatly expanded by networks (e.g., RIMA, AeroSpan, AEROCAN, and CARSNET) and collaborators from national agencies, institutes, universities, individual scientists, and partners. For more than 25 years, the project has provided long-term, continuous and readily accessible public domain database of aerosol optical, microphysical and radiative properties for aerosol research and characterization, validation of satellite retrievals, and synergism with other databases. The network imposes standardization of instruments, calibration, processing and distribution.

AERONET collaboration provides globally distributed observations of spectral aerosol optical depth (AOD), inversion products, and precipitable water in diverse aerosol regimes. Version 3 AOD data are computed for three data quality levels: Level 1.0 (unscreened), Level 1.5 (cloud-screened and quality controlled), and Level 2.0 (quality-assured). Inversions, precipitable water, and other AOD-dependent products are derived from these levels and may implement additional quality checks. " -https://aeronet.gsfc.nasa.gov

@@ -206,7 +206,7 @@ Available Measurements
:widths: 20, 20

"SIZ", "Size distribution"
"RIN", "Refractive indicies (real and imaginary)"
"RIN", "Refractive indices (real and imaginary)"
"CAD", "Coincident AOT data with almucantar retrieval"
"VOL", "Volume concentration, volume mean radius, effective radius and standard deviation"
"TAB", "AOT absorption"
@@ -277,7 +277,7 @@ NTN
"The NTN is the only network providing a long-term record of precipitation chemistry across the United States.

Sites predominantly are located away from urban areas and point sources of pollution. Each site has a precipitation
chemistry collector and gage. The automated collector ensures that the sample is exposed only during precipitation (wet-only-sampling)."
chemistry collector and gauge. The automated collector ensures that the sample is exposed only during precipitation (wet-only-sampling)."
- https://nadp.slh.wisc.edu/NTN/

Available Measurements
@@ -296,14 +296,14 @@
MDN
^^^

"The MDN is the only network providing a longterm record of total mercury (Hg) concentration and deposition in precipitation in the United States and Canada. All MDN sites follow standard procedures and have uniform precipitation chemistry collectors and gages. The automated collector has the same basic design as the NTN collector but is modified to preserve mercury. Modifications include a glass funnel, connecting tube, bottle for collecting samples, and an insulated enclosure to house this sampling train. The funnel and connecting tube reduce sample exposure to the open atmosphere and limit loss of dissolved mercury. As an additional sample preservation measure, the collection bottle is charged with 20 mL of a one percent hydrochloric acid solution."
"The MDN is the only network providing a longterm record of total mercury (Hg) concentration and deposition in precipitation in the United States and Canada. All MDN sites follow standard procedures and have uniform precipitation chemistry collectors and gauges. The automated collector has the same basic design as the NTN collector but is modified to preserve mercury. Modifications include a glass funnel, connecting tube, bottle for collecting samples, and an insulated enclosure to house this sampling train. The funnel and connecting tube reduce sample exposure to the open atmosphere and limit loss of dissolved mercury. As an additional sample preservation measure, the collection bottle is charged with 20 mL of a one percent hydrochloric acid solution."
- https://nadp.slh.wisc.edu/MDN/

Available Measurements
======================

* net concentration of methyl mercury in ng/L (conc)
* precipitation amount (in inches) reported by the raingage for the entire sampling period. (raingage)
* precipitation amount (in inches) reported by the rain gauge for the entire sampling period. (rain gauge)
* Mg2+ (mg)
* Na+ (na)
* K+ (k)
@@ -364,7 +364,7 @@ To see what data is in the DataFrame simply output the column header values

.. code:: python
print(df.colums.values)
print(df.columns.values)
Available Measurements
^^^^^^^^^^^^^^^^^^^^^^
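The `docs/observations.rst` hunks above note that AQS pre-generated yearly files follow a fixed naming scheme (e.g. `hourly_44201_2018.zip` for 2018 hourly ozone, parameter code 44201). A small sketch of that URL pattern — the helper name is illustrative, not part of monetio's API:

```python
# Base URL and naming scheme from the AQS docs section above.
AQS_BASE = "https://aqs.epa.gov/aqsweb/airdata"


def aqs_url(param_code, year, daily=False):
    """Build the pre-generated AQS file URL for one parameter and year.

    AQS splits data into separate hourly and daily files per variable,
    which is why the reader downloads one zip per parameter per year.
    """
    freq = "daily" if daily else "hourly"
    return f"{AQS_BASE}/{freq}_{param_code}_{year}.zip"
```

So the hourly ozone file cited in the docs is `aqs_url(44201, 2018)`, and the daily counterpart only swaps the `hourly` prefix for `daily`.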
12 changes: 6 additions & 6 deletions docs/tutorial/CMAQ_hi_volcano.rst
@@ -18,7 +18,7 @@ eruption. First, import MONET and several helper functions for later.
Now the data can be downloaded from the MONET github page in the
MONET/data directory. We will assume you already have this downloaded
and will proceed. Open the simulation. As of right now we still require
that a seperate grdcro2d (grddot2d) file be loaded for the mass points
that a separate grdcro2d (grddot2d) file be loaded for the mass points
(dot points) using the ``grid`` kwarg.

.. code:: python
@@ -138,7 +138,7 @@ aggrigate species in the concentration file.
Notice that this looks like the ncdump of the file except that there are
seperate coordinates including the latitude and longitude and the time
separate coordinates including the latitude and longitude and the time
as numpy.datetime64 objects. Also included is the proj4 string, a pyresample area grid
and default mapping tables to several different observational datasets.

@@ -195,7 +195,7 @@ will add a map using the MONETAccessor and use the ``robust=True`` kwarg.
Better but we can still do much more. There is low concentrations on
most of this map making it hard to notice the extremely high values and
the SO2 data is in ppmv and not ppbv as normally viewed as. Also, a
logscale may be better fo this type of data as it goes from 0-20000 ppbv
logscale may be better for this type of data as it goes from 0-20000 ppbv
rather than a linear scale.

.. code:: python
@@ -208,7 +208,7 @@
.. image:: CMAQ_hi_volcano_files/CMAQ_hi_volcano_11_3.png


Now let’s us view serveral time slices at once. We will average in time
Now let’s us view several time slices at once. We will average in time
(every 8 hours) to give us 6 total subplots.

.. code:: python
@@ -246,7 +246,7 @@ It is often useful to be able to pair model data with observational
data. MONET uses the pyresample library
(http://pyresample.readthedocs.io/en/latest/) to do a nearest neighbor
interpolation. First let us get the airnow data for the dates of the
simulation. We will also rotate it from the raw AirNow long format (stacked variables) to a wide format (each variable is a seperate column)
simulation. We will also rotate it from the raw AirNow long format (stacked variables) to a wide format (each variable is a separate column)


.. code:: python
@@ -288,7 +288,7 @@ new column (model).
Let’s look at the distributions to see if the two overlap to get a
general scence of performance.
general sense of performance.

.. code:: python
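The CMAQ tutorial hunk above rotates AirNow data from long format (one row per site/time/variable) to wide format (one column per variable), which monetio typically does with a pandas pivot. A dependency-free sketch of that reshape:

```python
def long_to_wide(records):
    """Pivot long-format (site, time, variable, value) rows into wide rows
    keyed by (site, time), one entry per variable.

    A sketch of the long-to-wide reshape described in the tutorial, using
    plain dicts instead of a pandas pivot table.
    """
    wide = {}
    for site, time, variable, value in records:
        wide.setdefault((site, time), {})[variable] = value
    return wide
```

Each `(site, time)` key then holds every stacked variable side by side, which is the shape needed before pairing model output with observations column-wise.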
2 changes: 1 addition & 1 deletion docs/tutorial/NESDIS_VIIRS_AOD.rst
@@ -100,7 +100,7 @@ Notice that the dimensions changed from 1800x3600 to 720x1440.
Open Multiple Days
~~~~~~~~~~~~~~~~~~

If you want to open multiple days in a sinlge call you could use the
If you want to open multiple days in a single call you could use the
open\_mfdataset. Lets grab the first nine days of July 2018.

.. code-block:: python
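The VIIRS tutorial hunk above grabs the first nine days of July 2018 in a single `open_mfdataset` call, which needs one path per day. A stdlib sketch for generating those dates — the filename template below is hypothetical, as the real product naming may differ:

```python
from datetime import date, timedelta


def first_n_days(year, month, n):
    """Dates for the first n days of a month, e.g. to build the file list
    fed to open_mfdataset."""
    start = date(year, month, 1)
    return [start + timedelta(days=i) for i in range(n)]


dates = first_n_days(2018, 7, 9)
# Hypothetical daily-file naming for illustration only:
files = [f"viirs_aod_gridded_{d:%Y%m%d}.nc" for d in dates]
```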
4 changes: 2 additions & 2 deletions docs/tutorial/_models.rst
@@ -103,7 +103,7 @@ distinguish aerosols which these readers do not process well.

fv3grib2nc4.py like nemsio2nc4.py tool is a command line tool created to
convert the grib2 aerosol data to netcdf files. fv3grib2nc4.py will
create seperate files for each of the three layer types; '1 hybrid
create separate files for each of the three layer types; '1 hybrid
layer', 'entire atmosphere', and 'surface'. These are the three layers
that currently hold aerosol data. The tool is available at
https://github.com/bbakernoaa/fv3grib2nc4
@@ -150,7 +150,7 @@ MONETIO and FV3CHEM
Using MONET with FV3-Chem is much like using MONET with other model
outputs. It tries to recognize where the files came from (nemsio, grib2,
etc....) and then processes the data, renaming coordinates (lat lon to
latitude and longitude) and processing varaibles like geopotential
latitude and longitude) and processing variables like geopotential
height and pressure if available. First lets import ``monet`` and
``fv3chem`` from MONET

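The FV3-Chem hunk above describes renaming coordinates (`lat`/`lon` to `latitude`/`longitude`) so downstream tools see consistent names; with xarray this would be a `rename` call, sketched here with plain dicts:

```python
# MONET-style standard coordinate names; the mapping shown is from the
# renaming described in docs/tutorial/_models.rst.
RENAME = {"lat": "latitude", "lon": "longitude"}


def rename_coords(coords):
    """Return coords with the standard coordinate names applied,
    leaving unrecognized keys untouched."""
    return {RENAME.get(k, k): v for k, v in coords.items()}
```

This is why, after opening a file with the fv3chem reader, plots and pairing routines can rely on `latitude`/`longitude` existing regardless of the source format (nemsio, grib2, etc.).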
2 changes: 1 addition & 1 deletion docs/tutorial/aqs_pams.rst
@@ -28,7 +28,7 @@ Now we have all the imports we could need lets load some data. Most of
the PAMS data is on daily data so lets add the kwarg daily=True to the
call. We will also create this for the year 2015 and 2016. Some
variables that may be valuable are the VOCS, ozone, NO2, NOX,
temperature. For all of the measurments available please see
temperature. For all of the measurements available please see
https://aqs.epa.gov/aqsweb/airdata/download_files.html

.. code-block:: python