@milocress released this 08 May 01:36

🚀 LLM Foundry v0.8.0

New Features

Megablocks support (#1102)

Support for training optimized MoE models at large scale.

Check out the MegaBlocks documentation for more information on building state-of-the-art MoE models.
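
Below is a minimal, hedged sketch of what enabling a MegaBlocks MoE FFN can look like through the MPT model config in Python. The `mb_dmoe` FFN type and the `moe_num_experts` / `moe_top_k` fields follow the MoE config style but are illustrative assumptions here; check the MegaBlocks documentation and the LLM Foundry docs for the exact keys.

```python
# Sketch: configuring an MPT model with a MegaBlocks dropless-MoE FFN.
# Requires megablocks to be installed; the field names inside ffn_config
# (ffn_type, moe_num_experts, moe_top_k) are assumptions to verify against
# the current docs.
from llmfoundry.models.mpt import MPTConfig, MPTForCausalLM

cfg = MPTConfig(
    d_model=1024,
    n_heads=16,
    n_layers=12,
    expansion_ratio=4,
    max_seq_len=2048,
    vocab_size=50368,
    ffn_config={
        'ffn_type': 'mb_dmoe',  # MegaBlocks dropless MoE instead of a dense MLP
        'moe_num_experts': 8,   # total experts per FFN block
        'moe_top_k': 2,         # experts activated per token
    },
)

model = MPTForCausalLM(cfg)  # builds the MoE model once megablocks is available
```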

Expanded Registries (#1080, #1093, #1094, #1095, #1096, #1165)

We've expanded support for registries to include dataloaders, FFN layers, attention layers, norms, and parameter initialization functions.

Check out the README for detailed instructions and code examples!
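
As a hedged sketch of the pattern (the README is the authoritative reference), registering a custom norm might look like the following; the `norms` registry lives in `llmfoundry.registry`, while the decorator form and the `rms_norm_plus_one` key are illustrative assumptions.

```python
# Sketch: plugging a custom norm into LLM Foundry via the norms registry.
# The decorator-style registration and the 'rms_norm_plus_one' key are
# assumptions; see the README for the exact registration API.
import torch
from llmfoundry.registry import norms


@norms.register('rms_norm_plus_one')
class RMSNormPlusOne(torch.nn.Module):
    """RMSNorm variant whose scale is parameterized as (1 + weight)."""

    def __init__(self, normalized_shape: int, eps: float = 1e-5, device=None):
        super().__init__()
        self.eps = eps
        self.weight = torch.nn.Parameter(torch.zeros(normalized_shape, device=device))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        rms = x.pow(2).mean(dim=-1, keepdim=True).add(self.eps).rsqrt()
        return x * rms * (1.0 + self.weight)
```

Once registered, the component can be selected from a model config by its registered name (for example via the model's norm setting), per the README's instructions.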

Support for ShareGPT chat format (#1098)

We now support the ShareGPT format for finetuning.
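
For reference, a ShareGPT-style training record is a list of chat turns with `from`/`value` keys; the snippet below writes one such example to a JSONL file. The dataset wiring is not taken from the release notes, so treat it as an illustration.

```python
# Sketch: one finetuning example in the ShareGPT chat format, written as JSONL.
# The 'conversations' / 'from' / 'value' keys follow the standard ShareGPT
# convention; consult the finetuning docs for how to point a dataloader at it.
import json

example = {
    'conversations': [
        {'from': 'human', 'value': 'What does MoE stand for?'},
        {'from': 'gpt', 'value': 'Mixture of Experts.'},
    ],
}

with open('sharegpt_train.jsonl', 'w') as f:
    f.write(json.dumps(example) + '\n')
```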

Breaking Changes and Deprecations

We have updated the minimum supported PyTorch version to torch 2.3 (#1152).

In-Context Learning Code Evaluation (#1181)

We've removed the code_evaluation task from the allowed in-context learning task types and deleted the InContextLearningCodeEvaluationDataset and InContextLearningCodeEvalAccuracy classes.

Question-Answering

We've removed the question_answering task type. Please use the generation_task_with_answers task instead.
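
A hedged migration sketch, expressed as a Python dict rather than YAML: only the `icl_task_type` value changes, while the surrounding fields (label, dataset path, few-shot settings) are placeholders rather than values from the release notes.

```python
# Sketch: migrating an ICL eval task entry from the removed task type to
# generation_task_with_answers. Field names mirror common ICL task configs;
# the dataset path and label are placeholders.
old_task = {
    'label': 'triviaqa',
    'dataset_uri': 'eval/local_data/triviaqa.jsonl',  # placeholder path
    'num_fewshot': [0],
    'icl_task_type': 'question_answering',            # removed in v0.8.0
}

new_task = dict(old_task, icl_task_type='generation_task_with_answers')
```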

What's Changed

New Contributors

Full Changelog: v0.7.0...v0.8.0