Compressed MoE ASR Model Based on Knowledge Distillation and Quantization
INTERSPEECH 2023(2023)
Keywords
speech recognition, mixture of experts, knowledge distillation, model quantization, extreme compression