Managed Inference and Agents API with Claude 3.0 Haiku
Last updated May 13, 2025
Claude 3.0 Haiku is a text-to-text large language model (LLM) in Anthropic’s Claude 3 family, optimized for cost-efficiency and solid performance at a lower price point than Claude 3.5 Sonnet. It supports conversational chat and tool calling.
- Model ID: claude-3-haiku
- Region: eu
When to Use This Model
Claude 3.0 Haiku is ideal for straightforward chat interactions, lightweight code generation, and simpler workflows.
Usage
Claude 3.0 Haiku follows our v1/chat/completions API schema.
To provision access to the model, attach claude-3-haiku to your app:
heroku ai:models:create -a example-app claude-3-haiku
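Attaching the model provisions config variables on your app; the examples below use INFERENCE_KEY, INFERENCE_MODEL_ID, and INFERENCE_URL. You can confirm they were set with:

heroku config -a example-app | grep INFERENCE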
Using config variables, you can invoke claude-3-haiku in a variety of ways:
- Heroku CLI ai plugin (heroku ai:models:call; see the example after this list)
- curl
- Python
- Ruby
- JavaScript
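For a quick test from the terminal, the ai plugin can call the model directly. The invocation below is a sketch; flag names can vary by plugin version, so run heroku ai:models:call --help to confirm the exact options:

heroku ai:models:call claude-3-haiku -a example-app --prompt "Hello!"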
Example curl Request
Get started quickly with an example request:
export INFERENCE_MODEL_ID=$(heroku config:get -a example-app INFERENCE_MODEL_ID)
export INFERENCE_KEY=$(heroku config:get -a example-app INFERENCE_KEY)
export INFERENCE_URL=$(heroku config:get -a example-app INFERENCE_URL)
curl $INFERENCE_URL/v1/chat/completions \
  -H "Authorization: Bearer $INFERENCE_KEY" \
  -H "Content-Type: application/json" \
  -d @- <<EOF
{
  "model": "$INFERENCE_MODEL_ID",
  "messages": [
    { "role": "user", "content": "Hello!" },
    { "role": "assistant", "content": "Hi there! How can I assist you today?" },
    { "role": "user", "content": "What's the weather like in Portland, Oregon right now?" }
  ],
  "temperature": 0.5,
  "max_tokens": 100,
  "stream": false,
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "get_weather",
        "description": "Fetches the current weather for a given city.",
        "parameters": {
          "type": "object",
          "properties": {
            "city": {
              "type": "string",
              "description": "The name of the city to get weather for."
            }
          },
          "required": ["city"]
        }
      }
    }
  ],
  "tool_choice": "auto",
  "top_p": 0.9
}
EOF
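If the model chooses to call get_weather, the response contains a tool call instead of a final answer. The follow-up request below is a sketch of how you might return the tool's result so the model can finish its reply. It assumes the response follows the standard chat completions tool-calling shape (an assistant message with a tool_calls array, answered by a role "tool" message); the tool call ID toolu_01 and the weather string are placeholders, and your own code is responsible for actually fetching the weather:

curl $INFERENCE_URL/v1/chat/completions \
  -H "Authorization: Bearer $INFERENCE_KEY" \
  -H "Content-Type: application/json" \
  -d @- <<EOF
{
  "model": "$INFERENCE_MODEL_ID",
  "messages": [
    { "role": "user", "content": "What's the weather like in Portland, Oregon right now?" },
    {
      "role": "assistant",
      "content": null,
      "tool_calls": [
        {
          "id": "toolu_01",
          "type": "function",
          "function": { "name": "get_weather", "arguments": "{\"city\": \"Portland\"}" }
        }
      ]
    },
    { "role": "tool", "tool_call_id": "toolu_01", "content": "55°F and raining" }
  ],
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "get_weather",
        "description": "Fetches the current weather for a given city.",
        "parameters": {
          "type": "object",
          "properties": {
            "city": { "type": "string", "description": "The name of the city to get weather for." }
          },
          "required": ["city"]
        }
      }
    }
  ],
  "max_tokens": 100
}
EOF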