# Token Routing Analysis of Mixture of Experts LLMs

## Install

```sh
# Install this repo's dependencies
pip install -r requirements.txt

# Clone ColossalAI next to this repo and install it
cd ..
git clone https://github.com/hpcaitech/ColossalAI
pip install -U ./ColossalAI

# Install the OpenMoE example's dependencies
cd ColossalAI/examples/language/openmoe
pip install -r requirements.txt
```

## Run OpenMoE Inference on RedPajama

```sh
./scripts/token-routing.sh
```
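
The script records the router's expert assignments for each token during inference. As a rough illustration of the kind of routing decision being logged (a minimal sketch, not OpenMoE's actual implementation; the shapes and `top_k` value are assumptions), a MoE layer typically sends each token to its top-k experts according to softmaxed router logits:

```python
# Minimal sketch of top-k token routing in a MoE layer (illustrative only;
# dimensions and top_k are assumptions, not OpenMoE's real configuration).
import torch

def route_tokens(hidden_states, router_weight, top_k=2):
    """Return the expert indices and gate weights chosen for each token.

    hidden_states: (num_tokens, hidden_dim)
    router_weight: (hidden_dim, num_experts)
    """
    logits = hidden_states @ router_weight          # (num_tokens, num_experts)
    probs = torch.softmax(logits, dim=-1)           # routing probabilities
    gate_weights, expert_ids = probs.topk(top_k, dim=-1)
    return expert_ids, gate_weights

# Toy example: 4 tokens, hidden size 8, 16 experts.
tokens = torch.randn(4, 8)
router = torch.randn(8, 16)
expert_ids, gates = route_tokens(tokens, router)
print(expert_ids)  # which experts each token was routed to
```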

## Analyse token routing data

See the EDA notebook.
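
A typical first question in the EDA is how evenly tokens are spread across experts. The sketch below shows one way to compute per-layer expert load from the dumped routing data; the file name and column names (`token_id`, `layer`, `expert_id`) are assumptions about the output format, not the repository's actual schema:

```python
# Sketch of a basic routing analysis. The file name and columns
# ("layer", "expert_id") are hypothetical, not the repo's actual schema.
import pandas as pd

routing = pd.read_csv("token_routing.csv")  # hypothetical output of token-routing.sh

# Fraction of tokens routed to each expert, per layer.
load = routing.groupby(["layer", "expert_id"]).size().rename("num_tokens")
load_share = load / load.groupby("layer").transform("sum")
print(load_share.unstack("expert_id").round(3))
```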

## TODO

- Support Mixtral
- Support DeepSeek