ENH: adding sphinx documentation for the dandi-cli #712

Merged · 10 commits · Jul 16, 2021

Changes from 6 commits
2 changes: 2 additions & 0 deletions .gitignore
@@ -3,12 +3,14 @@
.*.swp
.coverage
.coverage.*
.docker/
.eggs
.idea
.tox/
__pycache__/
build/
dist/
docs/**/generated/
pip-wheel-metadata/
sandbox/
venv/
20 changes: 20 additions & 0 deletions docs/Makefile
@@ -0,0 +1,20 @@
# Minimal makefile for Sphinx documentation
#

# You can set these variables from the command line, and also
# from the environment for the first two.
SPHINXOPTS ?=
SPHINXBUILD ?= sphinx-build
SOURCEDIR = source
BUILDDIR = build

# Put it first so that "make" without argument is like "make help".
help:
	@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)

.PHONY: help Makefile

# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
	@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
File renamed without changes.
35 changes: 35 additions & 0 deletions docs/make.bat
@@ -0,0 +1,35 @@
@ECHO OFF

pushd %~dp0

REM Command file for Sphinx documentation

if "%SPHINXBUILD%" == "" (
set SPHINXBUILD=sphinx-build
)
set SOURCEDIR=source
set BUILDDIR=build

if "%1" == "" goto help

%SPHINXBUILD% >NUL 2>NUL
if errorlevel 9009 (
echo.
echo.The 'sphinx-build' command was not found. Make sure you have Sphinx
echo.installed, then set the SPHINXBUILD environment variable to point
echo.to the full path of the 'sphinx-build' executable. Alternatively you
echo.may add the Sphinx directory to PATH.
echo.
echo.If you don't have Sphinx installed, grab it from
echo.http://sphinx-doc.org/
exit /b 1
)

%SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%
goto end

:help
%SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%

:end
popd
2 changes: 2 additions & 0 deletions docs/requirements.txt
@@ -0,0 +1,2 @@
alabaster
Sphinx
16 changes: 16 additions & 0 deletions docs/source/cmdline/delete.rst
@@ -0,0 +1,16 @@
:program:`dandi delete`
=======================

::

dandi [<global options>] delete [<options>] [<paths> ...]

Delete dandisets and assets from the server.

PATH could be a local path or a URL to an asset, directory, or an entire
dandiset.

Options
-------

.. option:: --skip-missing
Member Author: oh -- you just manually produced them? Great for now, but we should look into making them generated automagically... maybe something like https://github.com/click-contrib/sphinx-click could be used?

Member: I don't really like the way the sphinx-click output looks.

Member Author: Maybe there is a way to improve that? I just fear that manually produced docs would quickly become out of sync with the actual CLI.
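
For reference, a minimal sketch of how sphinx-click could be wired in, assuming the dandi click group is importable as ``dandi.cli.command:main`` (the extension name and module path are assumptions to verify)::

    # docs/source/conf.py -- register sphinx-click alongside the existing extensions
    extensions = ["sphinx.ext.autodoc", "sphinx.ext.autosummary", "sphinx_click"]

    # A hand-written page such as delete.rst could then shrink to a stub like:
    #
    #   .. click:: dandi.cli.command:main
    #      :prog: dandi
    #      :commands: delete
    #      :nested: full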

15 changes: 15 additions & 0 deletions docs/source/cmdline/digest.rst
@@ -0,0 +1,15 @@
:program:`dandi digest`
=======================

::

dandi [<global options>] digest [<options>] [<path> ...]

Calculate file digests

Options
-------

.. option:: -d, --digest [dandi-etag|md5|sha1|sha256|sha512]

Digest algorithm to use [default: dandi-etag]
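
As an illustration only (not the command's implementation), the plain hash choices correspond to standard algorithms; a rough Python equivalent of ``dandi digest -d sha256 <path>`` for a single file could look like this (``dandi-etag`` is a DANDI-specific multi-part digest and is not reproduced here)::

    import hashlib

    def sha256_digest(path, chunk_size=1 << 20):
        """Stream a file through hashlib and return its hex digest."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                h.update(chunk)
        return h.hexdigest()

    print(sha256_digest("sub-0001_ecephys.nwb"))  # hypothetical file name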
36 changes: 36 additions & 0 deletions docs/source/cmdline/download.rst
@@ -0,0 +1,36 @@
:program:`dandi download`
=========================

::

dandi [<global options>] download [<options>] [<url> ...]

Options
-------

.. option:: -o, --output-dir <dir>

Directory where to download to (directory must exist). Files will be
downloaded with paths relative to that directory.

.. option:: -e, --existing [error|skip|overwrite|overwrite-different|refresh]

What to do if a file is found to already exist locally. 'refresh': verify
that, according to size and mtime, it is the same file; if not, download and
overwrite.

.. option:: -f, --format [pyout|debug]

Choose the format/frontend for output.

.. option:: -J, --jobs INT

Number of parallel download jobs.

.. option:: --download [dandiset.yaml,assets,all]

Comma-separated list of elements to download

.. option:: --sync

Delete local assets that do not exist on the server
8 changes: 8 additions & 0 deletions docs/source/cmdline/index.rst
@@ -0,0 +1,8 @@
**********************
Command-Line Interface
**********************

.. toctree::
:glob:

*
50 changes: 50 additions & 0 deletions docs/source/cmdline/ls.rst
@@ -0,0 +1,50 @@
:program:`dandi ls`
===================

::

dandi [<global options>] ls [<options>] [<path|url> ...]

List .nwb files and dandiset metadata.

Patterns for known setups:

- ``DANDI:<dandiset id>``
- ``https://dandiarchive.org/...``
- ``https://identifiers.org/DANDI:<dandiset id>``
- ``https://<server>[/api]/[#/]dandiset/<dandiset id>[/<version>][/files[?location=<path>]]``
- ``https://*dandiarchive-org.netlify.app/...``
- ``https://<server>[/api]/dandisets/<dandiset id>[/versions[/<version>]]``
- ``https://<server>[/api]/dandisets/<dandiset id>/versions/<version>/assets/<asset id>[/download]``
- ``https://<server>[/api]/dandisets/<dandiset id>/versions/<version>/assets/?path=<path>``
- ``dandi://<instance name>/<dandiset id>[@<version>][/<path>]``
- ``https://<server>/...``
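
These patterns can also be resolved programmatically; a minimal sketch, assuming the parser is exposed as ``dandi.dandiarchive.parse_dandi_url`` (import path and return type should be verified against the installed version)::

    from dandi.dandiarchive import parse_dandi_url  # assumed import path

    # Resolve a supported URL into its server / dandiset / version parts
    parsed = parse_dandi_url("https://dandiarchive.org/dandiset/000027")  # example dandiset id
    print(parsed)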


Options
-------

.. option:: -F, --fields <fields>

Comma-separated list of fields to display. Pass an empty value to print a
list of the available fields.

.. option:: -f, --format [auto|pyout|json|json_pp|json_lines|yaml]

Choose the format/frontend for output. If 'auto' (default), 'pyout' will be
used in case of multiple files, and 'yaml' for a single file.

.. option:: -r, --recursive

Recurse into content of dandisets/directories. Only .nwb files will be
considered.

.. option:: -J, --jobs <int>

Number of parallel download jobs.

.. option:: --metadata [api|all|assets]

.. option:: --schema <version>

Convert metadata to new schema version
58 changes: 58 additions & 0 deletions docs/source/cmdline/organize.rst
@@ -0,0 +1,58 @@
:program:`dandi organize`
=========================

::

dandi [<global options>] organize [<options>] [<path> ...]

(Re)organize files according to the metadata.

The purpose of this command is to take advantage of metadata contained in the
.nwb files to provide datasets with consistently named files, so their naming
reflects data they contain.

.nwb files are organized into a hierarchy of subfolders, one per "subject",
e.g. sub-0001 if the .nwb file contains a Subject group with subject_id=0001.
Each file in a subject-specific subfolder follows the convention::

sub-<subject_id>[_key-<value>][_mod1+mod2+...].nwb

where the following keys are considered if present in the data::

ses -- session_id
tis -- tissue_sample_id
slice -- slice_id
cell -- cell_id

and ``modX`` are "modalities" as identified based on detected neural data types
(such as "ecephys", "icephys") per extensions found in nwb-schema definitions:
https://github.com/NeurodataWithoutBorders/nwb-schema/tree/dev/core

In addition, an "obj" key with a value corresponding to the crc32 checksum of
the "object_id" is added if the aforementioned keys and the list of modalities
are not sufficient to disambiguate different files.

You can visit https://dandiarchive.org for a growing collection of
(re)organized dandisets.
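
As an illustration of the naming convention only (not of how ``dandi organize`` is implemented), a target filename could be assembled from the extracted metadata roughly like this::

    def organized_name(subject_id, modalities=(), **keys):
        """Build sub-<subject_id>[_key-<value>]...[_mod1+mod2+...].nwb

        ``keys`` may contain any of: ses, tis, slice, cell. The real command
        additionally appends an "obj" key when needed for disambiguation.
        """
        parts = [f"sub-{subject_id}"]
        for key in ("ses", "tis", "slice", "cell"):
            if keys.get(key):
                parts.append(f"{key}-{keys[key]}")
        name = "_".join(parts)
        if modalities:
            name += "_" + "+".join(sorted(modalities))
        return name + ".nwb"

    print(organized_name("0001", ["ecephys"], ses="20210101"))
    # -> sub-0001_ses-20210101_ecephys.nwb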

Options
-------

.. option:: -d, --dandiset-path <dir>

The top directory (local) of the dandiset under which to organize files. If
not specified, the dandiset that the current directory is under is assumed.
For 'simulate' mode, the target dandiset/directory must not exist.

.. option:: --invalid [fail|warn]

What to do if files without sufficient metadata are encountered.

.. option:: -f, --files-mode [dry|simulate|copy|move|hardlink|symlink|auto]

If 'dry' - no action is performed and suggested renames are printed. If
'simulate' - a hierarchy of empty files at --local-top-path is created. Note
that the previous layout should be removed prior to this operation. If 'auto'
(default) - whichever of symlink, hardlink, or copy is allowed by the system.
The other modes (copy, move, symlink, hardlink) define how data files should
be made available.
24 changes: 24 additions & 0 deletions docs/source/cmdline/shell-completion.rst
@@ -0,0 +1,24 @@
:program:`dandi shell-completion`
=================================

::

dandi [<global options>] shell-completion [<options>]

Emit shell script for enabling command completion.

The output of this command should be "sourced" by bash or zsh to enable command
completion.

Example::

$ source <(dandi shell-completion)
$ dandi --<PRESS TAB to display available option>

Options
-------

.. option:: -s, --shell [bash|zsh|fish|auto]

The shell for which to generate completion code; `auto` (default) attempts
autodetection
42 changes: 42 additions & 0 deletions docs/source/cmdline/upload.rst
@@ -0,0 +1,42 @@
:program:`dandi upload`
=======================

::

dandi [<global options>] upload [<options>] [<path> ...]

Upload dandiset (files) to DANDI archive.

The target dandiset must already be registered in the archive, and a local
:file:`dandiset.yaml` should exist under :option:`--dandiset-path`.

The local dandiset should pass validation. For that, it should first be
organized using the ``dandi organize`` command.

By default, all files in the dandiset (excluding directories starting with a
period) will be considered for upload. You can also point to specific files
you would like to validate and have uploaded.
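
A minimal sketch of scripting that workflow by shelling out to the CLI (the directory layout below is a placeholder; only options documented on this page and the organize page are used)::

    import subprocess

    dandiset_dir = "/path/to/dandiset"   # placeholder: registered local dandiset
    raw_dir = "/path/to/raw/nwb/files"   # placeholder: unorganized .nwb files

    # Organize first so validation can pass, then upload, refreshing files
    # whose local modification time is ahead of the remote (the default).
    subprocess.run(["dandi", "organize", "-d", dandiset_dir, raw_dir], check=True)
    subprocess.run(["dandi", "upload", "--existing", "refresh"],
                   cwd=dandiset_dir, check=True)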

Options
-------

.. option:: -e, --existing [error|skip|force|overwrite|refresh]

What to do if a file is found to already exist on the server. 'skip' skips
the file; 'force' forces reupload; 'overwrite' forces upload if either size
or modification time differs; 'refresh' (default) uploads only if the local
modification time is ahead of the remote.

.. option:: -J, --jobs N[:M]

Number of files to upload in parallel and, optionally, number of upload
threads per file

.. option:: --sync

Delete assets on the server that do not exist locally

.. option:: --validation [require|skip|ignore]

Whether data must pass validation before the upload. Using this option to
skip or ignore validation is highly discouraged.
10 changes: 10 additions & 0 deletions docs/source/cmdline/validate.rst
@@ -0,0 +1,10 @@
:program:`dandi validate`
=========================

::

dandi [<global options>] validate [<path> ...]

Validate files for NWB (and DANDI) compliance.

Exits with a non-zero exit code if any file is not compliant.
56 changes: 56 additions & 0 deletions docs/source/conf.py
@@ -0,0 +1,56 @@
# Configuration file for the Sphinx documentation builder.
#
# This file only contains a selection of the most common options. For a full
# list see the documentation:
# https://www.sphinx-doc.org/en/master/usage/configuration.html

# -- Path setup --------------------------------------------------------------

# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
#
# import os
# import sys
# sys.path.insert(0, os.path.abspath('.'))


# -- Project information -----------------------------------------------------

project = "DANDI Archive CLI and Python API library"
copyright = "2021, DANDI Team"
author = "DANDI Team"

import dandi

# The full version, including alpha/beta/rc tags
release = dandi.__version__


# -- General configuration ---------------------------------------------------

# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
extensions = ["sphinx.ext.autodoc", "sphinx.ext.autosummary"]

# Add any paths that contain templates here, relative to this directory.
templates_path = ["_templates"]

# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
# This pattern also affects html_static_path and html_extra_path.
exclude_patterns = []


# -- Options for HTML output -------------------------------------------------

# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
#
html_theme = "alabaster"

# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
html_static_path = ["_static"]