Quickstart

Get up to speed with our API in one minute.

Together AI makes it easy to run leading open-source models using only a few lines of code.

1. Register for an account

First, register for an account to get an API key. New accounts come with $1 to get started.

Once you've registered, set your account's API key to an environment variable named TOGETHER_API_KEY:

```shell
export TOGETHER_API_KEY=xxxxx
```
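The official SDKs pick this variable up automatically. If you're wiring the key through your own code, a small fail-fast helper can surface a missing key early; this is a sketch, and the function name is illustrative rather than part of any SDK:

```python
import os

def get_together_api_key() -> str:
    """Read the API key from the environment, failing fast with a clear error."""
    key = os.environ.get("TOGETHER_API_KEY")
    if not key:
        raise RuntimeError(
            "TOGETHER_API_KEY is not set; export it before running the examples"
        )
    return key
```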

2. Install your preferred library

Together provides official libraries for Python and TypeScript, or you can call our HTTP API from any language you want:

Python:

```shell
pip install together
```

TypeScript:

```shell
npm install together-ai
```

3. Run your first query against a model

Choose a model to query. In this example, we'll do a chat completion on Llama 3.1 8B with streaming:

Python:

```python
from together import Together

client = Together()

stream = client.chat.completions.create(
    model="meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo",
    messages=[{"role": "user", "content": "What are the top 3 things to do in New York?"}],
    stream=True,
)

for chunk in stream:
    # Each chunk carries an incremental piece of the reply in delta.content.
    print(chunk.choices[0].delta.content or "", end="", flush=True)
```
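If you want the full reply as a single string rather than printing it as it arrives, you can accumulate the deltas. The sketch below uses stand-in chunk objects so it runs without a network call; in real use, the stream is the object returned by `client.chat.completions.create(...)`:

```python
from types import SimpleNamespace

# Stand-in chunks mimicking the stream's shape (chunk.choices[0].delta.content);
# the final chunk's content may be None, which the `or ""` guards against.
fake_stream = [
    SimpleNamespace(choices=[SimpleNamespace(delta=SimpleNamespace(content=part))])
    for part in ["Visit ", "Central Park", None]
]

def collect_text(stream) -> str:
    """Accumulate a stream's incremental delta contents into one reply string."""
    return "".join(chunk.choices[0].delta.content or "" for chunk in stream)
```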
TypeScript:

```typescript
import Together from 'together-ai';

const together = new Together();

const stream = await together.chat.completions.create({
  model: 'meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo',
  messages: [
    { role: 'user', content: 'What are the top 3 things to do in New York?' },
  ],
  stream: true,
});

for await (const chunk of stream) {
  // Use process.stdout.write instead of console.log to avoid extra newlines.
  process.stdout.write(chunk.choices[0]?.delta?.content || '');
}
```
cURL:

```shell
curl -X POST "https://api.together.xyz/v1/chat/completions" \
  -H "Authorization: Bearer $TOGETHER_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo",
    "messages": [
      {"role": "user", "content": "What are the top 3 things to do in New York?"}
    ]
  }'
```
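The endpoint returns JSON in the OpenAI-compatible chat-completions shape, with the reply at `choices[0].message.content`. A minimal parsing sketch; the sample payload here is illustrative, not real output:

```python
import json

# Illustrative, truncated sample of the response shape (values are made up).
sample = json.loads("""
{
  "choices": [
    {"message": {"role": "assistant", "content": "1. Central Park ..."}}
  ]
}
""")

# Extract the assistant's reply from the first choice.
reply = sample["choices"][0]["message"]["content"]
```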

Congratulations – you've just made your first query to Together AI!

Next steps

Resources