An All-MLP Sequence Modeling Architecture That Excels at Copying (Causal Relation Networks). https://arxiv.org/abs/2406.16168

kerner-lab/causal-relation-networks


Causal Relation Networks (CausalRNs)

PyTorch implementation of "An All-MLP Sequence Modeling Architecture That Excels at Copying"

Paper accepted at the ICML 2024 Workshop: Next Generation of Sequence Modeling Architectures.

Abstract

Recent work demonstrated Transformers' ability to efficiently copy strings of exponential sizes, distinguishing them from other architectures. We present the Causal Relation Network (CausalRN), an all-MLP sequence modeling architecture that can match Transformers on the copying task. Extending Relation Networks (RNs), we implemented key innovations to support autoregressive sequence modeling while maintaining computational feasibility. We discovered that exponentially-activated RNs are reducible to linear time complexity, and pre-activation normalization induces an infinitely growing memory pool, similar to a KV cache. In our ablation study, we found both exponential activation and pre-activation normalization are indispensable for Transformer-level copying. Our findings provide new insights into what actually constitutes strong in-context retrieval.
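To build intuition for why an exponential activation can make a causal pairwise (Relation Network-style) sum reducible to linear time, here is a minimal NumPy sketch. It is a toy illustration, not the paper's exact architecture: it assumes the relation between positions i and j takes an additive form f(x_i) + g(x_j), so the exponential factorizes and a running prefix sum replaces the quadratic loop over pairs. All weight matrices and feature maps here are hypothetical stand-ins for small MLPs.

```python
import numpy as np

# Toy setup: if the pairwise relation is exp(f(x_i) + g(x_j)), then
#   y_i = sum_{j<=i} exp(f(x_i) + g(x_j)) * v_j
#       = exp(f(x_i)) * sum_{j<=i} exp(g(x_j)) * v_j,
# so a single running prefix sum computes all outputs in O(T) time.
rng = np.random.default_rng(0)
T, d = 8, 4                          # sequence length, feature dimension
X = rng.standard_normal((T, d))

# Hypothetical linear feature maps (stand-ins for learned MLPs).
Wf = rng.standard_normal((d, d)) * 0.1
Wg = rng.standard_normal((d, d)) * 0.1
Wv = rng.standard_normal((d, d)) * 0.1
F, G, V = X @ Wf, X @ Wg, X @ Wv

# Quadratic-time reference: explicit loop over all causal pairs (i, j<=i).
y_quad = np.zeros((T, d))
for i in range(T):
    for j in range(i + 1):
        y_quad[i] += np.exp(F[i] + G[j]) * V[j]

# Linear-time form: maintain a prefix sum of exp(G[j]) * V[j].
state = np.zeros(d)
y_lin = np.zeros((T, d))
for i in range(T):
    state += np.exp(G[i]) * V[i]     # the ever-growing summary state
    y_lin[i] = np.exp(F[i]) * state

assert np.allclose(y_quad, y_lin)
```

The running `state` only ever accumulates, which loosely mirrors the abstract's point about an "infinitely growing memory pool" analogous to a KV cache; for the actual CausalRN formulation (including pre-activation normalization), see the paper.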

Citation

@misc{cui2024allmlpsequencemodelingarchitecture,
      title={An All-MLP Sequence Modeling Architecture That Excels at Copying}, 
      author={Chenwei Cui and Zehao Yan and Gedeon Muhawenayo and Hannah Kerner},
      year={2024},
      eprint={2406.16168},
      archivePrefix={arXiv},
      primaryClass={cs.LG},
      url={https://arxiv.org/abs/2406.16168}, 
}

License

Apache License 2.0
