New Optimizer: Implement Adam-Mini optimizer #1720

Open · 5 tasks done
SicariusSicariiStuff opened this issue Jun 30, 2024 · 0 comments

Labels
enhancement New feature or request
⚠️ Please check that this feature request hasn't been suggested before.

  • I searched previous Ideas in Discussions and didn't find any similar feature requests.
  • I searched previous Issues and didn't find any similar feature requests.

🔖 Feature description

Paper:
https://arxiv.org/abs/2406.16793

TL;DR
Adam-mini cuts optimizer memory by replacing AdamW's per-element second moment with a single learning rate per parameter block, which should make training models on home hardware easier and faster.
In theory, implementing it shouldn't be overly complicated, since its update rule is very similar to AdamW's; see the sketch below.
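
To illustrate how close it is to AdamW, here is a minimal PyTorch sketch of the core Adam-mini idea, not the paper's full algorithm: each parameter tensor is treated as one block with a single scalar second moment, whereas the paper partitions attention weights per head. The class name `AdamMiniSketch` is hypothetical.

```python
import torch

class AdamMiniSketch(torch.optim.Optimizer):
    """Simplified Adam-mini-style optimizer: one second-moment scalar per block."""

    def __init__(self, params, lr=1e-3, betas=(0.9, 0.999),
                 eps=1e-8, weight_decay=0.01):
        defaults = dict(lr=lr, betas=betas, eps=eps, weight_decay=weight_decay)
        super().__init__(params, defaults)

    @torch.no_grad()
    def step(self):
        for group in self.param_groups:
            beta1, beta2 = group["betas"]
            for p in group["params"]:
                if p.grad is None:
                    continue
                state = self.state[p]
                if not state:
                    state["step"] = 0
                    state["m"] = torch.zeros_like(p)  # per-element momentum, as in AdamW
                    state["v"] = torch.zeros((), device=p.device)  # ONE scalar per block (here: per tensor)
                state["step"] += 1
                m, v = state["m"], state["v"]
                # Decoupled weight decay, as in AdamW.
                p.mul_(1 - group["lr"] * group["weight_decay"])
                m.mul_(beta1).add_(p.grad, alpha=1 - beta1)
                # Block-wise second moment: mean of squared gradients over the block.
                v.mul_(beta2).add_(p.grad.pow(2).mean(), alpha=1 - beta2)
                m_hat = m / (1 - beta1 ** state["step"])
                v_hat = v / (1 - beta2 ** state["step"])
                p.add_(m_hat / (v_hat.sqrt() + group["eps"]), alpha=-group["lr"])
```

Swapping it in would just be `opt = AdamMiniSketch(model.parameters())` in place of AdamW; a real implementation would need the paper's head-wise partitioning for attention layers to match its reported quality.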

✔️ Solution

Implement Adam-Mini in Axolotl.

❓ Alternatives

Keep using AdamW.

📝 Additional Context

Adam-mini should be roughly compatible with DeepSpeed out of the box, and its smaller optimizer state would reduce the memory footprint and could also increase training throughput by cutting optimizer-state communication.
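
For a rough sense of the savings, here is a back-of-envelope calculation assuming fp32 optimizer state and an illustrative 7B-parameter model (not measured numbers); it matches the paper's claimed ~45-50% cut to optimizer memory.

```python
# Back-of-envelope optimizer-state memory, assuming fp32 state tensors
# and an illustrative 7B-parameter model (not benchmark results).
n_params = 7e9
bytes_per_float = 4
adamw_gb = 2 * n_params * bytes_per_float / 1e9      # per-element m and v
adam_mini_gb = 1 * n_params * bytes_per_float / 1e9  # per-element m; v is ~one scalar per block
print(f"AdamW optimizer state: ~{adamw_gb:.0f} GB")      # ~56 GB
print(f"Adam-mini optimizer state: ~{adam_mini_gb:.0f} GB")  # ~28 GB
```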

Acknowledgements

  • My issue title is concise, descriptive, and in title casing.
  • I have searched the existing issues to make sure this feature has not been requested yet.
  • I have provided enough information for the maintainers to understand and evaluate this request.