Releases: lmjohns3/theanets

v0.6.1

10 Jul 14:14

Version 0.6.1 of theanets is now live!

pip install -U theanets

http://pypi.python.org/pypi/theanets
http://theanets.readthedocs.org
http://github.com/lmjohns3/theanets

The biggest change in this release series is a Network/Layer refactor that preserves the existing API but permits much more flexible network layouts if desired. Layers can now output multiple values; by default most layer types generate an "out" (the traditional layer output) as well as a "pre" (the layer's pre-activation value). Other notable changes include:

  • The semantics of the "rect:min" and "rect:max" activations have been reversed -- rect:min now gives g(z) = min(1, z) and rect:max now gives g(z) = max(0, z). The "relu" activation still means g(z) = max(0, z). (A short illustration follows this list.)
  • Theanets now uses Travis CI and Coveralls.io to run builds and compute test coverage automatically -- see https://travis-ci.org/lmjohns3/theanets and https://coveralls.io/r/lmjohns3/theanets. Test coverage increased from 76% to 91%.
  • The documentation has been expanded and, hopefully, made clearer. There's always more room for improvement here!
  • Activation functions are now first-class objects. New activation functions include Prelu, LGrelu, and Maxout.
  • Loading and saving now use the standard pickle module.
  • Almost all of the trainers have moved to a new package; see http://downhill.readthedocs.org.
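
To make the activation change concrete, here is a tiny sketch using plain numpy; rect_min and rect_max are local helper names for illustration, not theanets identifiers:

    import numpy as np

    # The reversed semantics from the first bullet, written out directly.
    def rect_min(z):
        return np.minimum(1, z)   # "rect:min": g(z) = min(1, z)

    def rect_max(z):
        return np.maximum(0, z)   # "rect:max": g(z) = max(0, z), i.e. "relu"

    z = np.array([-2.0, 0.5, 3.0])
    print(rect_min(z))  # [-2.   0.5  1. ]
    print(rect_max(z))  # [ 0.   0.5  3. ]

Saving and loading likewise reduce to standard pickle calls. A minimal round trip, assuming the Experiment-based construction shown in the docs:

    import pickle
    import theanets

    # Build a small placeholder model, then round-trip it through pickle.
    exp = theanets.Experiment(theanets.Regressor, layers=(10, 20, 3))
    with open('model.pkl', 'wb') as handle:
        pickle.dump(exp.network, handle)
    with open('model.pkl', 'rb') as handle:
        net = pickle.load(handle)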

As a reminder, the 0.7.x release series will incorporate several big changes; most importantly, recurrent models will reorder the axes of input/output data -- see goo.gl/kXB4Db for details.

As always, I hope the library will be really useful! Please file bugs, post on the mailing list, etc. as you run into questions or issues.

Version 0.5.0

10 Feb 22:10
Pre-release

Version 0.5.0 of theanets is now live!

pip install -U theanets

http://pypi.python.org/pypi/theanets
http://theanets.readthedocs.org
http://github.com/lmjohns3/theanets

Some great new features have been incorporated into this release; the
biggest is that Layers have been refactored into first-class citizens.
It's now much easier to specify model layers, and many different types
of recurrent layers are available.
http://theanets.readthedocs.org/en/stable/creating.html#specifying-layers
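
As a quick illustration of the new layer specifications, here is a
sketch based on the documentation linked above; the sizes and the
activation names are placeholder choices:

    import theanets

    # Layers can be given as plain sizes, (size, activation) tuples, or
    # dicts of layer attributes -- dicts can also select one of the new
    # recurrent layer forms.
    exp = theanets.Experiment(
        theanets.Regressor,
        layers=(10, (20, 'tanh'), dict(size=20, activation='relu'), 3),
    )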

I've also tried to improve the speed of the models and trainers, and I
have more ideas in this area that I'll incorporate into future
releases.

I've tried to get the documentation into better shape. It still needs
some work, but it's a bit better than it has been.

This release also includes code from 4 first-time contributors!

Please note that this version makes several backwards-incompatible
changes that I think will be a net improvement, at the cost of
potentially breaking some of your existing training scripts. Most
notably:

  • The code relies on a new release of the climate package. Be sure to
    install using "pip install -U theanets" to get the most recent
    dependencies.
  • The Experiment.itertrain() method now generates two dictionaries
    of monitor values per iteration: one for training, and one for
    validation (see the sketch after this list).
    http://theanets.readthedocs.org/en/stable/training.html#training-as-iteration
  • A Network now has a find() method for retrieving shared variables
    (e.g., weight matrices); the get_weights() method has been removed.
  • Trainer.train() has been renamed Trainer.itertrain(), and the
    SGD-based trainers have been refactored, so there is no longer an
    SGD.train_minibatch() method.
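
To put the itertrain() and find() changes in context, here is a rough
sketch; the toy data, the 'loss' monitor key, and the 'hid1'/'w' names
are assumptions for illustration, not guaranteed API details:

    import numpy as np
    import theanets

    # Toy (inputs, targets) pairs just to make the sketch self-contained;
    # shapes match the layer sizes below.
    train_set = [np.random.randn(100, 10).astype('f'),
                 np.random.randn(100, 3).astype('f')]
    valid_set = [np.random.randn(20, 10).astype('f'),
                 np.random.randn(20, 3).astype('f')]

    exp = theanets.Experiment(theanets.Regressor, layers=(10, 20, 3))

    # itertrain() now yields a pair of monitor dictionaries per
    # iteration: one for the training set, one for the validation set.
    for train_monitors, valid_monitors in exp.itertrain(train_set, valid_set):
        print(train_monitors['loss'], valid_monitors['loss'])
        if valid_monitors['loss'] < 0.1:  # stop however you see fit
            break

    # find() replaces get_weights(): fetch a shared variable by layer
    # and parameter name.
    weights = exp.network.find('hid1', 'w')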

I hope the library will be really useful! Please file bugs, post on
the mailing list, etc. as you run into questions or issues.