Phi-3.5-mini-instruct
Microsoft

Overview
Phi-3.5-mini-instruct is a 3.8B-parameter model that supports a context window of up to 128K tokens, with improved multilingual capabilities across more than 20 languages. It underwent additional training and safety post-training to strengthen instruction following, reasoning, math, and code generation. It is well suited to memory- and latency-constrained environments and is released under the MIT license.
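Because the weights are MIT-licensed and relatively small, the model can be run locally. The sketch below is a minimal, hedged example of loading it with Hugging Face transformers; it assumes the `transformers`, `torch`, and `accelerate` packages are installed and references the public model id `microsoft/Phi-3.5-mini-instruct`.

```python
# Minimal sketch: running Phi-3.5-mini-instruct locally with Hugging Face transformers.
# Assumes transformers, torch, and accelerate are installed; adjust dtype/device for your hardware.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-3.5-mini-instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision keeps the 3.8B weights small in memory
    device_map="auto",           # requires the accelerate package
)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Write a Python function that reverses a string."},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256, do_sample=False)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```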
Phi-3.5-mini-instruct was released on August 23, 2024. API access is available through Azure.
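For hosted access, the sketch below shows one way to call an Azure AI serverless endpoint with the `azure-ai-inference` package. The endpoint URL and key environment variables are placeholders you would supply from your own Azure deployment, not values from this page.

```python
# Hedged sketch: querying a Phi-3.5-mini-instruct deployment via the azure-ai-inference SDK.
# AZURE_INFERENCE_ENDPOINT and AZURE_INFERENCE_KEY are placeholder names for your own deployment's
# endpoint URL and API key.
import os
from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import SystemMessage, UserMessage
from azure.core.credentials import AzureKeyCredential

client = ChatCompletionsClient(
    endpoint=os.environ["AZURE_INFERENCE_ENDPOINT"],
    credential=AzureKeyCredential(os.environ["AZURE_INFERENCE_KEY"]),
)

response = client.complete(
    messages=[
        SystemMessage(content="You are a helpful assistant."),
        UserMessage(content="Summarize the key features of Phi-3.5-mini-instruct."),
    ],
    max_tokens=256,
    temperature=0.2,
)
print(response.choices[0].message.content)
```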
Performance
Compare Phi-3.5-mini-instruct to other models by quality (GPQA score) vs cost. Higher scores and lower costs represent better value.
Benchmarks
Phi-3.5-mini-instruct Performance Across Datasets
Scores sourced from the model's scorecard, paper, or official blog posts
Pricing
Pricing, performance, and capabilities for Phi-3.5-mini-instruct across different providers:
| Provider | Input ($/M) | Output ($/M) | Max Input | Max Output | Latency (s) | Throughput (tok/s) | Quantization | Input Modalities | Output Modalities |
|---|---|---|---|---|---|---|---|---|---|
| Azure | $0.10 | $0.10 | 128K | 128K | 0.52 | 23.0 | — | Text | Text |
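As a rough illustration of the per-token pricing above, the snippet below is a small hypothetical helper (not part of any official SDK) that converts token counts into a dollar cost using Azure's $0.10 per million tokens for both input and output.

```python
# Hypothetical cost helper based on the pricing table above:
# $0.10 per million input tokens and $0.10 per million output tokens on Azure.
def request_cost(input_tokens: int, output_tokens: int,
                 input_price_per_m: float = 0.10,
                 output_price_per_m: float = 0.10) -> float:
    """Return the USD cost of a single request."""
    return (input_tokens * input_price_per_m
            + output_tokens * output_price_per_m) / 1_000_000

# Example: a 2,000-token prompt with a 500-token completion costs $0.00025.
print(f"${request_cost(2_000, 500):.5f}")
```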
API Access
API Access Coming Soon
API access for Phi-3.5-mini-instruct will be available soon through our gateway.
FAQ
Common questions about Phi-3.5-mini-instruct
