finetune_gptj_8_bit

This repo contains source code to fine-tune GPT-J using 8-bit quantization, so that the model fits on a single GPU and consumes around 14 GB of memory.
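To give a feel for the technique, here is a minimal sketch of loading GPT-J with 8-bit weights and attaching small trainable adapters. This is not the repo's exact code: it assumes the Hugging Face transformers, bitsandbytes, and peft libraries, and the model name and LoRA hyperparameters are illustrative.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

model_name = "EleutherAI/gpt-j-6B"  # assumed checkpoint; ~24 GB in fp16

tokenizer = AutoTokenizer.from_pretrained(model_name)

# Quantize the frozen base weights to int8 so the 6B-parameter model
# fits on a single GPU in roughly 14 GB of memory.
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    quantization_config=BitsAndBytesConfig(load_in_8bit=True),
    device_map="auto",
)

# The 8-bit base weights stay frozen; fine-tune low-rank adapters instead.
model = prepare_model_for_kbit_training(model)
model = get_peft_model(
    model,
    LoraConfig(
        r=8,                                  # adapter rank (illustrative)
        lora_alpha=32,
        target_modules=["q_proj", "v_proj"],  # GPT-J attention projections
        lora_dropout=0.05,
        task_type="CAUSAL_LM",
    ),
)
model.print_trainable_parameters()  # only the adapter weights are trainable
```

The resulting model can be passed to a standard transformers Trainer; gradients flow only through the adapters, which is what keeps the memory footprint near the quoted 14 GB.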
