Phi-4-mini-instruct
microsoft/Phi-4-mini-instruct
Overview
Phi-4-mini-instruct is a lightweight open model built on synthetic data and filtered publicly available data, with a focus on high-quality, reasoning-dense content. It has 3.8 billion parameters and a 128K-token context length, using a dense decoder-only Transformer architecture with grouped-query attention and an expanded multilingual vocabulary for efficiency. The instruction-tuned model is suited to memory- and compute-constrained environments and latency-bound scenarios while retaining strong reasoning capabilities, particularly in math and logic.
Tags
CentML Optimized
Chat
Dedicated
LLM
Serverless
API
cURL
curl -X POST "https://api.centml.com/openai/v1/chat/completions" \
  -H "Authorization: Bearer *******************************************" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "microsoft/Phi-4-mini-instruct",
    "messages": [{ "role": "system", "content": "You are a helpful assistant." }],
    "stream": false
  }'
Python
from openai import OpenAI

client = OpenAI(
    api_key="*******************************************",
    base_url="https://api.centml.com/openai/v1",
)

completion = client.chat.completions.create(
    model="microsoft/Phi-4-mini-instruct",
    messages=[{"role": "system", "content": "You are a helpful assistant."}],
    stream=False,
)

print(completion.choices[0].message)
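The snippets here set streaming off. For token-by-token output, the same Python call can be made with stream=True; the following is a minimal sketch assuming the endpoint follows the standard OpenAI streaming protocol, where each chunk carries an incremental delta:

from openai import OpenAI

client = OpenAI(
    api_key="*******************************************",
    base_url="https://api.centml.com/openai/v1",
)

# stream=True makes the SDK return an iterator of chunks instead of a
# single completion object; each chunk carries an incremental text delta.
stream = client.chat.completions.create(
    model="microsoft/Phi-4-mini-instruct",
    messages=[{"role": "system", "content": "You are a helpful assistant."}],
    stream=True,
)

for chunk in stream:
    # Some chunks (for example the final one) may carry no content delta.
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
print()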
JavaScript
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: "*******************************************",
  baseURL: "https://api.centml.com/openai/v1",
});

async function main() {
  const completion = await client.chat.completions.create({
    model: "microsoft/Phi-4-mini-instruct",
    messages: [{ role: "system", content: "You are a helpful assistant." }],
    stream: false,
  });
  console.log(completion.choices[0]);
}

main();
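Each example above sends only a system message. A more typical request adds a user turn and sampling parameters; the Python sketch below uses an illustrative math prompt and illustrative values for max_tokens and temperature, which are not CentML defaults:

from openai import OpenAI

client = OpenAI(
    api_key="*******************************************",
    base_url="https://api.centml.com/openai/v1",
)

# The system message sets behaviour; the user message carries the actual
# question. max_tokens and temperature below are illustrative values.
completion = client.chat.completions.create(
    model="microsoft/Phi-4-mini-instruct",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "A train travels 120 km in 1.5 hours. What is its average speed in km/h?"},
    ],
    max_tokens=256,
    temperature=0.2,
)

print(completion.choices[0].message.content)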