Knowledge Distillation

Knowledge distillation (a.k.a. the Teacher-Student model) uses a small model (the Student) to learn the knowledge contained in a large model (the Teacher). The goal is for the small model to retain as much of the large model's performance as possible, thereby reducing the parameter count at deployment, speeding up inference, and lowering compute cost. A minimal sketch of the classic soft-label distillation objective is shown below.
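The following PyTorch snippet illustrates one common way to train the Student: a KL-divergence loss against the Teacher's temperature-softened outputs, mixed with the usual cross-entropy on the hard labels (as in Hinton et al., 2015). The framework choice and the `temperature` / `alpha` parameters are illustrative assumptions, not taken from this repository's code.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """Mix the soft-target loss from the Teacher with hard-label cross-entropy."""
    # Soften both output distributions with the temperature, then measure KL divergence.
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    soft_loss = F.kl_div(soft_student, soft_teacher,
                         reduction="batchmean") * temperature ** 2
    # Standard cross-entropy against the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss
```

In practice the Teacher is run in evaluation mode to produce `teacher_logits`, and only the Student's parameters are updated with this combined loss.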

Directory Structure
