
Phi-3.5-MoE-instruct

Overview

Phi-3.5-MoE-instruct is a mixture-of-experts (MoE) model with roughly 42B total parameters, of which about 6.6B are active per token, and a 128K-token context window. It performs strongly at reasoning, math, coding, and multilingual tasks, outperforming many larger dense models on common benchmarks. The model went through a safety post-training process combining supervised fine-tuning (SFT) and direct preference optimization (DPO), and is licensed under MIT. It is well suited to scenarios that demand both efficiency and high performance, particularly multilingual or reasoning-intensive workloads.
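
Since both a total and an active parameter count are quoted, here is a quick back-of-the-envelope view of the model's sparsity. The expert count and top-2 routing below are assumptions based on commonly reported architecture details for this model, not figures stated on this page.

```python
# Back-of-the-envelope view of Phi-3.5-MoE's sparsity, using the totals
# quoted in the overview. NUM_EXPERTS and ACTIVE_EXPERTS are assumptions
# (commonly reported architecture details), not figures from this page.
NUM_EXPERTS = 16        # assumed number of experts per MoE layer
ACTIVE_EXPERTS = 2      # assumed top-2 routing: 2 experts fire per token
TOTAL_PARAMS_B = 42.0   # total parameters, in billions (from the overview)
ACTIVE_PARAMS_B = 6.6   # active parameters per token, in billions

active_fraction = ACTIVE_PARAMS_B / TOTAL_PARAMS_B
print(f"Experts routed per token: {ACTIVE_EXPERTS} of {NUM_EXPERTS}")
print(f"Fraction of weights active per token: {active_fraction:.1%}")  # ~15.7%
```

This is the efficiency argument in miniature: each token pays the compute cost of a ~6.6B-parameter model while drawing on a ~42B-parameter pool of experts.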

Phi-3.5-MoE-instruct was released on August 23, 2024.
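
The weights are published on Hugging Face under the repo ID microsoft/Phi-3.5-MoE-instruct. The minimal sketch below shows one way to load and query the model with the transformers chat-template API; it assumes a recent transformers release and enough GPU memory for the full ~42B weights (bfloat16 needs roughly 84 GB, typically spread across devices).

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-3.5-MoE-instruct"

# Older transformers versions may require trust_remote_code=True for the
# PhiMoE architecture; newer releases include it natively.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
    trust_remote_code=True,
)
tokenizer = AutoTokenizer.from_pretrained(model_id)

messages = [{"role": "user", "content": "Explain top-2 expert routing in one paragraph."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=200, do_sample=False)
# Decode only the newly generated tokens, not the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```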

Performance

Timeline

Released: August 23, 2024
Knowledge Cutoff: Unknown

Specifications

Parameters: ~42B total (6.6B active)
License: MIT
Training Data: Unknown
Tags: tuning:instruct

Benchmarks

[Chart: Phi-3.5-MoE-instruct performance across benchmark datasets. Scores are sourced from the model's scorecard, paper, or official blog posts.]

Pricing

Pricing, performance, and capabilities for Phi-3.5-MoE-instruct across different providers:

No pricing information is currently available for this model.

API Access

API Access Coming Soon

API access for Phi-3.5-MoE-instruct will be available soon through our gateway.
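
There is no documented gateway endpoint yet; purely as a hypothetical sketch, an OpenAI-compatible chat-completions call might look like the following. The base URL, model slug, and credential are placeholders I have invented for illustration, not a published API.

```python
import requests

BASE_URL = "https://example-gateway.invalid/v1"  # placeholder: real gateway URL not yet published
API_KEY = "YOUR_API_KEY"                         # placeholder credential

resp = requests.post(
    f"{BASE_URL}/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "phi-3.5-moe-instruct",  # assumed model slug
        "messages": [
            {"role": "user", "content": "Summarize MoE routing in one sentence."}
        ],
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```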

FAQ

Common questions about Phi-3.5-MoE-instruct

Q: When was Phi-3.5-MoE-instruct released?
A: Phi-3.5-MoE-instruct was released on August 23, 2024.

Q: How many parameters does Phi-3.5-MoE-instruct have?
A: Phi-3.5-MoE-instruct has ~42B total parameters, with about 6.6B active per token.