This repository has been archived by the owner on Jan 15, 2024. It is now read-only.

[DOC] Update Readme with correct mxnet version #635

Merged: 6 commits, Mar 19, 2019
7 changes: 5 additions & 2 deletions README.rst
@@ -30,9 +30,12 @@ Language Processing (NLP) research.
News
====

- - GluonNLP is featured in:
+ - Tutorial proposal for GluonNLP is accepted at `EMNLP 2019 <https://www.emnlp-ijcnlp2019.org>`__, Hong Kong.
+
+ - GluonNLP was featured in:
+
- **AWS re:invent 2018 in Las Vegas, 2018-11-28**! Checkout `details <https://www.portal.reinvent.awsevents.com/connect/sessionDetail.ww?SESSION_ID=88736>`_.
- **PyData 2018 NYC, 2018-10-18**! Checkout the `awesome talk <https://pydata.org/nyc2018/schedule/presentation/76/>`__ by Sneha Jha.
- **KDD 2018 London, 2018-08-21, Apache MXNet Gluon tutorial**! Check out **https://kdd18.mxnet.io**.

Installation
@@ -48,7 +51,7 @@ In particular, if you want to install the most recent ``MXNet`` release:

::

- pip install --upgrade mxnet>=1.3.0
+ pip install --upgrade mxnet>=1.4.0

Else, if you want to install the most recent ``MXNet`` nightly build:

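One caveat for readers copying either version of the command above: in POSIX shells an unquoted ``>`` is parsed as output redirection, so the specifier should be quoted (``pip install --upgrade 'mxnet>=1.4.0'``). A small sketch of the pitfall, with ``echo`` standing in for ``pip`` so nothing is actually installed:

```python
import subprocess
import tempfile
import pathlib

# Run the unquoted command through a real shell in a scratch directory.
tmp = tempfile.mkdtemp()
subprocess.run("echo pip install --upgrade mxnet>=1.4.0", shell=True, cwd=tmp)

# The shell treated '>' as redirection: a stray file named '=1.4.0' appeared,
# and only 'mxnet' (no version constraint) reached the stand-in command.
stray = pathlib.Path(tmp, "=1.4.0")
print(stray.read_text().strip())  # pip install --upgrade mxnet

# Quoting the requirement keeps the full specifier string intact.
done = subprocess.run("echo pip install --upgrade 'mxnet>=1.4.0'",
                      shell=True, cwd=tmp, capture_output=True, text=True)
print(done.stdout.strip())  # pip install --upgrade mxnet>=1.4.0
```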
2 changes: 2 additions & 0 deletions docs/api/modules/data.rst
@@ -24,6 +24,8 @@ The dataset is available under the Creative Commons Attribution-ShareAlike Licen

WikiText2
WikiText103
+ WikiText2Raw
+ WikiText103Raw

Language modeling: Google 1 Billion Words
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
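For context on the two new entries: the existing WikiText2/WikiText103 datasets yield pre-tokenized text, while the Raw variants expose the untokenized source text (e.g. for character- or byte-level models). A toy, gluonnlp-free sketch of the distinction, using a sentence loosely based on the corpus's first article:

```python
# The same source line viewed two ways: as a tokenized corpus (WikiText2-style,
# where e.g. hyphens appear as '@-@' tokens) versus as raw text (the *Raw variants).
raw_line = "Senjō no Valkyria 3 is a tactical role @-@ playing video game."

# Tokenized view: a sequence of whitespace-separated tokens.
tokens = raw_line.split()

# Raw view: the untouched string, usable at character granularity.
chars = list(raw_line)

print(tokens[:4])  # first few tokens
print(chars[:6])   # first few characters
```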
2 changes: 1 addition & 1 deletion docs/index.rst
@@ -51,7 +51,7 @@ command installs the latest version of MXNet.

.. code-block:: console

- pip install --upgrade mxnet>=1.3.0
+ pip install --upgrade mxnet>=1.4.0

.. note::

2 changes: 2 additions & 0 deletions scripts/bert/index.rst
@@ -5,6 +5,8 @@ Bidirectional Encoder Representations from Transformers

Reference: Devlin, Jacob, et al. "`Bert: Pre-training of deep bidirectional transformers for language understanding. <https://arxiv.org/abs/1810.04805>`_" arXiv preprint arXiv:1810.04805 (2018).

+ Note: BERT model requires `nightly version of MXNet <https://mxnet.incubator.apache.org/versions/master/install/index.html?version=master&platform=Linux&language=Python&processor=CPU>`__.

The following pre-trained BERT models are available from the **gluonnlp.model.get_model** API:

+-----------------------------+----------------+-----------------+
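The ``gluonnlp.model.get_model`` API mentioned above resolves a pre-trained model by name. As a dependency-free sketch of that lookup-by-name registry pattern (the identifiers mirror gluonnlp's BERT naming, but the classes here are stand-ins, not the real implementation):

```python
# Minimal registry pattern behind a get_model(name, ...) style API.
# The names mimic gluonnlp's BERT identifiers; the "models" are stubs.

class BERTStub:
    def __init__(self, layers, units, heads, dataset_name):
        self.layers, self.units, self.heads = layers, units, heads
        self.dataset_name = dataset_name

_REGISTRY = {
    "bert_12_768_12": lambda ds: BERTStub(12, 768, 12, ds),
    "bert_24_1024_16": lambda ds: BERTStub(24, 1024, 16, ds),
}

def get_model(name, dataset_name="book_corpus_wiki_en_uncased"):
    """Look up a model constructor by name and instantiate it."""
    try:
        return _REGISTRY[name](dataset_name)
    except KeyError:
        raise ValueError(f"unknown model {name!r}; choose from {sorted(_REGISTRY)}")

model = get_model("bert_12_768_12")
print(model.layers, model.units, model.heads)
```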
4 changes: 2 additions & 2 deletions src/gluonnlp/data/corpora/wikitext.py
@@ -79,7 +79,7 @@ class WikiText2(_WikiText):
WikiText2 is implemented as CorpusDataset with the default flatten=True.

From
- https://einstein.ai/research/the-wikitext-long-term-dependency-language-modeling-dataset
+ https://www.salesforce.com/products/einstein/ai-research/the-wikitext-dependency-language-modeling-dataset/

License: Creative Commons Attribution-ShareAlike

@@ -166,7 +166,7 @@ class WikiText103(_WikiText):
WikiText103 is implemented as CorpusDataset with the default flatten=True.

From
- https://einstein.ai/research/the-wikitext-long-term-dependency-language-modeling-dataset
+ https://www.salesforce.com/products/einstein/ai-research/the-wikitext-dependency-language-modeling-dataset/

License: Creative Commons Attribution-ShareAlike

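Both docstrings note that these classes are implemented as CorpusDataset with the default ``flatten=True``. A rough stand-in for what flattening means (plain Python, not gluonnlp's implementation; the ``<eos>`` marker here is an assumption for illustration):

```python
# Toy version of a corpus dataset's flatten behavior: per-line token
# lists versus one continuous token stream with end-of-sentence markers.

lines = ["the cat sat", "on the mat"]

def corpus(lines, flatten=True, eos="<eos>"):
    tokenized = [line.split() + [eos] for line in lines]
    if flatten:
        # One flat stream: language models read across line boundaries.
        return [tok for sent in tokenized for tok in sent]
    return tokenized

print(corpus(lines))                 # flat token stream
print(corpus(lines, flatten=False))  # list of per-line token lists
```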