fdaudens posted an update Jul 16
Small models, BIG impact: SmolLM is here! 🚀🔬

We're launching a series of small but mighty language models:
๐ŸŽ๏ธ Super fast - runs on laptops, phones, you name it!
📏 3 sizes: 135M, 360M, and 1.7B parameters
🥇 Outperforms same-size models from Meta, Microsoft, and Qwen
🔓 Fully open-source: datasets, training code, models

Key features
- Trained on FineWeb-Edu and Cosmopedia v2 (largest synthetic pre-training dataset)
- No cloud needed - run locally for privacy and energy efficiency (minimal sketch after this list)
- Everything is public, from data curation to training steps
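
If you want to try this on your own machine, here is a minimal local-inference sketch using the Transformers library. The checkpoint name HuggingFaceTB/SmolLM-360M-Instruct and the generation settings are assumptions based on the collection linked below, not details from this post.

```python
# Minimal local-inference sketch (checkpoint name is an assumption; pick any
# size from the SmolLM collection linked below).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "HuggingFaceTB/SmolLM-360M-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)  # small enough for laptop RAM

# Instruct checkpoints expect chat-formatted prompts, so apply the chat template.
messages = [{"role": "user", "content": "Explain gravity to a five-year-old."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)

# Everything runs on-device: no API keys, no data leaving the machine.
output_ids = model.generate(input_ids, max_new_tokens=100, do_sample=True, temperature=0.7)
print(tokenizer.decode(output_ids[0][input_ids.shape[1]:], skip_special_tokens=True))
```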

๐๐จ๐ญ๐ž๐ง๐ญ๐ข๐š๐ฅ ๐ฎ๐ฌ๐ž ๐œ๐š๐ฌ๐ž๐ฌ
- On-device autocomplete
- Local request parsing
- Custom fine-tuning for specific tasks without expensive GPUs (rough sketch after this list)
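
To illustrate that last point, here is a rough fine-tuning sketch with the Transformers Trainer on a tiny slice of a public text corpus. The base checkpoint name, the placeholder dataset, and the hyperparameters are illustrative assumptions, not prescribed by the post.

```python
# Rough fine-tuning sketch on a small compute budget; dataset and hyperparameters
# are placeholders, and the base checkpoint name is an assumption.
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

model_id = "HuggingFaceTB/SmolLM-135M"
tokenizer = AutoTokenizer.from_pretrained(model_id)
tokenizer.pad_token = tokenizer.eos_token  # causal LMs often ship without a pad token
model = AutoModelForCausalLM.from_pretrained(model_id)

# Any small domain-specific corpus works; a 1% slice of wikitext-2 stands in here.
dataset = load_dataset("wikitext", "wikitext-2-raw-v1", split="train[:1%]")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=dataset.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="smollm-finetuned",
        per_device_train_batch_size=2,
        num_train_epochs=1,
        learning_rate=2e-5,
        logging_steps=50,
    ),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),  # causal LM objective
)
trainer.train()
```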

๐†๐จ ๐๐ž๐ž๐ฉ๐ž๐ซ
👉 Check it out: https://huggingface.co/collections/HuggingFaceTB/smollm-models-6695016cad7167254ce15966
👉 Run the 360M model in your browser, 100% private: HuggingFaceTB/SmolLM-360M-Instruct-WebGPU
👉 Read the blog explaining everything in detail: huggingface.co/blog/smollm

Kudos to the stellar team who worked on this project: @loubnabnl @anton-l @eliebak @lvwerra