Phi-3.5-mini-instruct

Microsoft

Overview

Phi-3.5-mini-instruct is a 3.8B-parameter model that supports a context window of up to 128K tokens and offers improved multilingual capabilities across more than 20 languages. It underwent additional training and safety post-training to strengthen instruction following, reasoning, math, and code generation. Well suited to memory- and latency-constrained environments, it is released under the MIT license.

Phi-3.5-mini-instruct was released on August 23, 2024. API access is available through Azure.
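
For reference, a minimal sketch of calling a Phi-3.5-mini-instruct deployment over the Azure AI model inference API using the azure-ai-inference Python SDK. The endpoint URL, API key, and deployment name below are placeholders that depend on your own Azure resource, and the max_tokens and temperature values are illustrative defaults, not recommendations from Microsoft.

import os

from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import SystemMessage, UserMessage
from azure.core.credentials import AzureKeyCredential

# Endpoint and key come from your own Azure deployment (placeholders here).
client = ChatCompletionsClient(
    endpoint=os.environ["AZURE_INFERENCE_ENDPOINT"],
    credential=AzureKeyCredential(os.environ["AZURE_INFERENCE_KEY"]),
)

response = client.complete(
    model="Phi-3.5-mini-instruct",  # deployment/model name; adjust to your resource
    messages=[
        SystemMessage(content="You are a concise assistant."),
        UserMessage(content="Explain what an instruction-tuned model is in two sentences."),
    ],
    max_tokens=256,
    temperature=0.2,
)

print(response.choices[0].message.content)

Because the weights are also published under the MIT license, the model can alternatively be run locally (for example via Hugging Face Transformers) when latency or data-residency constraints rule out a hosted API.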

Performance

Timeline

Release Date: August 23, 2024
Knowledge Cutoff: Unknown

Other Details

Parameters: 3.8B
License: MIT
Training Data: Unknown
Tags: tuning:instruct

Related Models

Compare Phi-3.5-mini-instruct to other models by quality (GPQA score) vs cost. Higher scores and lower costs represent better value.

Benchmarks

Phi-3.5-mini-instruct Performance Across Datasets

Scores are sourced from the model's scorecard, paper, or official blog posts.

Pricing

Pricing, performance, and capabilities for Phi-3.5-mini-instruct across different providers:

Provider: Azure
Input price: $0.10 per 1M tokens
Output price: $0.10 per 1M tokens
Max input: 128.0K tokens
Max output: 128.0K tokens
Latency: 0.52 s
Throughput: 23.0 tok/s
Quantization: Unknown
Input modalities: Text
Output modalities: Text
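
As a rough illustration of the Azure rates above ($0.10 per million tokens for both input and output), a short sketch of the per-request cost arithmetic; the token counts in the example are hypothetical.

# Rough cost estimate for Phi-3.5-mini-instruct on Azure at $0.10 per million
# tokens for both input and output (rates taken from the pricing details above).
INPUT_PRICE_PER_M = 0.10   # USD per 1M input tokens
OUTPUT_PRICE_PER_M = 0.10  # USD per 1M output tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost of a single request."""
    return (input_tokens / 1_000_000) * INPUT_PRICE_PER_M \
         + (output_tokens / 1_000_000) * OUTPUT_PRICE_PER_M

# Example: a 2,000-token prompt with a 500-token completion
# costs (2000/1e6)*0.10 + (500/1e6)*0.10 = $0.00025.
print(f"${request_cost(2_000, 500):.5f}")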

API Access

API access for Phi-3.5-mini-instruct will be available soon through our gateway. In the meantime, the model can be accessed through Azure (see Pricing above).

FAQ

Common questions about Phi-3.5-mini-instruct

When was Phi-3.5-mini-instruct released?
Phi-3.5-mini-instruct was released on August 23, 2024.

How many parameters does Phi-3.5-mini-instruct have?
Phi-3.5-mini-instruct has 3.8 billion parameters.