
Releases: tensorly/torch

0.5.0

09 Jun 17:34
e602edf

Quality of life improvements.

Release 0.4.0

08 Mar 17:27

What's Changed

New Contributors

Full Changelog: 0.3.0...0.4.0

TensorLy-Torch version 0.3.0

08 Nov 13:31
0a32b76


TensorLy-Torch just got even easier to use for tensorized deep learning, with indexable factorized tensors, seamless compatibility with torch functions, tensorized embedding layers, and more!

New features

Faster general_1D_conv, which speeds up CP convolutions
Indexable TensorizedTensors (#7): factorized tensors can now be indexed just like regular tensors. The result is still a factorized tensor whenever possible, and a dense tensor otherwise.

>>> import tltorch

>>> cp_tensor = tltorch.FactorizedTensor.new((3, 4, 2), rank=0.9, factorization='cp')

# Initialise the tensor with random values
>>> cp_tensor.normal_(0, 0.02)

>>> print(cp_tensor)
CPTensor(shape=(3, 4, 2), rank=2)

>>> cp_tensor[:2, :2]
CPTensor(shape=(2, 2, 2), rank=2)

>>> cp_tensor[2, 3, 1]
tensor(0.0250, grad_fn=<SumBackward0>)

# Note how, above, indexing tracks gradients as well!

New BlockTT factorization, generalizes tt-matrices

>>> ftt = tltorch.TensorizedTensor.new((5, (2, 2, 2), (3, 3, 3)), rank=0.5, factorization='BlockTT')
>>> ftt
BlockTT(shape=[5, 8, 27], tensorized_shape=(5, (2, 2, 2), (3, 3, 3)), rank=[1, 20, 20, 1])
>>> ftt[2]
BlockTT(shape=[8, 27], tensorized_shape=[(2, 2, 2), (3, 3, 3)], rank=[1, 20, 20, 1])
>>> ftt[0, :2, :2]
tensor([[-0.0009,  0.0004],
        [ 0.0007,  0.0003]], grad_fn=<SqueezeBackward0>)

get_tensorized_shape: linear layers can now be automatically tensorized to a convenient shape
Tensorized embeddings: add factorized embedding layer and tests (#10), thanks to @colehawkins

Initialise factorized tensors directly with PyTorch, for initialisations based on the normal distribution:

from torch.nn import init
import tltorch

cp_tensor = tltorch.FactorizedTensor.new((3, 4, 2), rank=0.9, factorization='cp')
# Use the in-place initialiser (kaiming_normal without the trailing
# underscore is deprecated in PyTorch)
init.kaiming_normal_(cp_tensor)

Improvements

TuckerTensor: unsqueezed_modes option
TRL: added init_from_linear
FactorizedConvolutions now have a reset_parameters method and are initialised by default when created from random values
Layers and factorized tensors now accept a device and dtype as parameters
Tensor dropout now accepts min_dim and min_values

Bug fixes

Fixed TT rank bugs in init_from_tensor, transduction, and tensor creation.
Bug fix when creating a factorized conv from a factorization.
Linear layer class methods now preserve context
Fixed contiguity issue in TuckerTensor (#9), thanks to @colehawkins
Fixed tensor dropout for p=1
Weights are now initialised when creating a new random layer

Release 0.2.0

14 Apr 17:16
779fb7c

A full rewrite of TensorLy-Torch!