This page lists all supported model sources for the Evaluations API. You can use serverless models, dedicated endpoints, or external models from providers like OpenAI, Anthropic, and Google.
Serverless models
Set `model_source = "serverless"` to use Together's serverless inference.
Any Together serverless model that supports structured outputs can be used.
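A minimal sketch of the relevant request fields, assuming the Evaluations API accepts a JSON body with `model_source` and `model`; the model name below is an illustrative placeholder, and the remaining required fields (evaluation type, data files, judge configuration) are omitted:

```python
# Hedged sketch of the model-source fields in an Evaluations API request.
# Consult the full API reference for the other required fields.
payload = {
    "model_source": "serverless",
    # Any Together serverless model that supports structured outputs;
    # this name is an illustrative example, not a requirement.
    "model": "meta-llama/Llama-3.3-70B-Instruct-Turbo",
}
print(payload["model_source"])
```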
Dedicated models
Set `model_source = "dedicated"` to use your own dedicated endpoint.
A user-launched dedicated endpoint must be created before running evaluations. After launching an endpoint, paste the endpoint ID into the `model` field.
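A hedged sketch of the same request fields for a dedicated endpoint; the endpoint ID shown is a hypothetical placeholder, so substitute the real ID copied from your launched endpoint:

```python
# Hedged sketch: model_source "dedicated" with the endpoint ID in the
# model field. The ID below is a hypothetical placeholder.
payload = {
    "model_source": "dedicated",
    "model": "endpoint-1234abcd",  # hypothetical endpoint ID
}
print(payload["model"])
```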
External models
Set `model_source = "external"` to use models from external providers.
Supported shortcuts
Use these shortcuts in the `model` field; the API base URL will be determined automatically:
| Provider | Model Name | Model String for API |
|---|---|---|
| OpenAI | GPT-5 | openai/gpt-5 |
| OpenAI | GPT-5 Mini | openai/gpt-5-mini |
| OpenAI | GPT-5 Nano | openai/gpt-5-nano |
| OpenAI | GPT-5.2 | openai/gpt-5.2 |
| OpenAI | GPT-5.2 Pro | openai/gpt-5.2-pro |
| OpenAI | GPT-5.2 Chat Latest | openai/gpt-5.2-chat-latest |
| OpenAI | GPT-4 | openai/gpt-4 |
| OpenAI | GPT-4.1 | openai/gpt-4.1 |
| OpenAI | GPT-4o Mini | openai/gpt-4o-mini |
| OpenAI | GPT-4o | openai/gpt-4o |
| Anthropic | Claude Sonnet 4.5 | anthropic/claude-sonnet-4-5 |
| Anthropic | Claude Haiku 4.5 | anthropic/claude-haiku-4-5 |
| Anthropic | Claude Sonnet 4.0 | anthropic/claude-sonnet-4-0 |
| Anthropic | Claude Opus 4.5 | anthropic/claude-opus-4-5 |
| Anthropic | Claude Opus 4.1 | anthropic/claude-opus-4-1 |
| Anthropic | Claude Opus 4.0 | anthropic/claude-opus-4-0 |
| Google | Gemini 2.5 Pro | google/gemini-2.5-pro |
| Google | Gemini 2.5 Flash | google/gemini-2.5-flash |
| Google | Gemini 2.5 Flash Lite | google/gemini-2.5-flash-lite |
| Google | Gemini 3 Pro Preview | google/gemini-3-pro-preview |
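A hedged sketch using one of the provider shortcuts from the table above; any `provider/model` string listed there can be dropped into the `model` field, and the base URL is resolved automatically:

```python
# Hedged sketch: model_source "external" with a provider shortcut.
# The base URL for the provider is determined automatically.
payload = {
    "model_source": "external",
    "model": "openai/gpt-4o",  # provider/model shortcut from the table
}
print(payload["model"])
```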
Custom base URL
You can also use any OpenAI `chat/completions`-compatible API by specifying a custom `external_base_url`:
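A hedged sketch of pointing an evaluation at a custom server; both the base URL and the model name below are hypothetical placeholders for whatever your OpenAI-compatible server exposes:

```python
# Hedged sketch: model_source "external" with a custom base URL.
# The URL and model name are hypothetical placeholders.
payload = {
    "model_source": "external",
    "model": "my-local-model",                          # hypothetical
    "external_base_url": "https://api.example.com/v1",  # hypothetical
}
print(payload["external_base_url"])
```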
The external API must be OpenAI `chat/completions`-compatible.