LLM Pruning and Distillation in Practice: The Minitron Approach • Paper • arXiv:2408.11796
Compact Language Models via Pruning and Knowledge Distillation • Paper • arXiv:2407.14679 • Published Jul 19
RADIO • Collection • Foundation vision models that combine multiple models (CLIP, DINOv2, SAM, etc.) • 3 items
Minitron • Collection • A family of compressed models obtained via pruning and knowledge distillation • 7 items