A Story of SLMs: Roadmap of the Phi Model Family
Phi-4 Family: Moving forward and more on reasoning
- [05/19/2025] Phi-4-mini, a small language model that excels at text-based tasks and delivers high accuracy in a compact form, has been built into the Edge browser to power the Prompt API and Writing Assistance APIs. Blog → Simplified access to AI in Microsoft Edge
- [05/01/2025] Celebrating the one-year milestone of Phi with the new Phi-4 reasoning releases! Blog → One year of Phi: Small language models making big leaps in AI and the technical report.
- [02/26/2025] Try the latest members of the Phi family: 🤗Phi-4-mini and 🤗Phi-4-multimodal! Read more → Empowering innovation: The next generation of the Phi family and our technical report.
Phi-4 & Reasoning Series
| | Phi-4 Models | Phi-4-mini Models | Phi-4-multimodal Model |
|---|---|---|---|
| Base | 🤗Phi-4 | 🤗Phi-4-mini | 🤗Phi-4-multimodal |
| Reasoning | 🤗Phi-4-reasoning, 🤗Phi-4-reasoning-plus | 🤗Phi-4-mini-reasoning | |
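As a quick way to try the text-based checkpoints listed above, here is a minimal text-generation sketch using Hugging Face transformers. The repo id `microsoft/Phi-4-mini-instruct` and the generation settings are assumptions for illustration, not official sample code; adjust them to the model card of the checkpoint you pick.

```python
# Minimal sketch: chat-style generation with a Phi-4 family checkpoint via Hugging Face transformers.
# The repo id and generation settings below are illustrative assumptions, not official sample code.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-4-mini-instruct"  # assumed repo id; swap in any model from the table above

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # use torch.float32 on CPU-only machines
    device_map="auto",
)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Explain what a small language model is in two sentences."},
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```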
Phi-Silica: On-device SLM on Windows 11 Copilot+ PCs
- [04/25/2025] Building on the breakthrough efficiencies of Phi Silica, vision-based multimodal capabilities have been added, unlocking new possibilities for local SLMs on Windows. Blog → Enabling multimodal functionality for Phi Silica
- [12/06/2024] Based on a Cyber-EO-compliant derivative of Phi-3.5-mini, Phi Silica is developed specifically for Windows 11, with multilingual support and on-device rewrite and summarize features in Word and Outlook. Notably, as one of its post-training steps for alignment on safety and responsible AI, Phi Silica has undergone a five-stage ‘break-fix’ methodology similar to the one outlined in our technical report Phi-3 Safety Post-Training: Aligning Language Models with a “Break-Fix” Cycle. Blog → Phi Silica, small but mighty on-device SLM.
Phi-3 Family: Various sizes and context lengths, improved Phi-3.5 with MoE and vision
- [08/22/2024] Welcome 🤗Phi-3.5-mini, Phi-3.5-vision, and Phi-3.5-MoE, the latest releases in the Phi-3.5 series of SLMs! Read more → Discover the New Multi-Lingual, High-Quality Phi-3.5 SLMs and our latest technical report.
Phi-3.5 Series
| Base | MoE | Vision |
|---|---|---|
| 🤗Phi-3.5-mini | 🤗Phi-3.5-MoE | 🤗Phi-3.5-vision |
- [04/23/2024] Today we’re launching the first publicly available small language models from our Phi-3 family of open models, coming in three sizes (mini, small, and medium) with 4k/8k and 128k context lengths. Start with the most popular (most-downloaded) models, Phi-3-mini-4k-instruct and Phi-3-mini-128k-instruct! Read more → Tiny but mighty: The Phi-3 small language models with big potential
Phi-3 Series
| Size | Model (4k/8k Context) | Model (128k Context) |
|---|---|---|
| Mini (3.8B) | 🤗Phi-3-mini-4k-instruct | 🤗Phi-3-mini-128k-instruct |
| Small (7B) | 🤗Phi-3-small-8k-instruct | 🤗Phi-3-small-128k-instruct |
| Medium (14B) | 🤗Phi-3-medium-4k-instruct | 🤗Phi-3-medium-128k-instruct |
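For the Phi-3 series, the short sketch below uses the transformers pipeline API and illustrates choosing between the short-context and 128k variants. The repo ids and options are assumptions based on common Hugging Face usage rather than official sample code.

```python
# Minimal sketch: running a Phi-3 instruct checkpoint through the transformers pipeline API.
# The repo id and options are illustrative assumptions; pick the 128k variant for long inputs.
from transformers import pipeline

pipe = pipeline(
    "text-generation",
    model="microsoft/Phi-3-mini-4k-instruct",  # or "microsoft/Phi-3-mini-128k-instruct" for long documents
    device_map="auto",
    trust_remote_code=True,  # older transformers releases need the model's custom code
)

messages = [{"role": "user", "content": "Summarize the Phi-3 model sizes in one sentence."}]
result = pipe(messages, max_new_tokens=64, do_sample=False)
print(result[0]["generated_text"][-1]["content"])  # last message in the returned chat is the model reply
```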
Phi-1 Family: Era of small language models, MVP for high-quality textbook data
- [10/02/2023] An updated version of Phi-1.5 was released, trained on data augmented with a new source consisting of various synthetic NLP texts. Read the technical report here: Textbooks Are All You Need
- [06/20/2023] The first Phi model, 🤗Phi-1 with 1.3B parameters, was released.