OpenAI compatibility

The Together API is compatible with the OpenAI API standard, making it easy to integrate into existing applications that use the OpenAI SDKs.

Python SDK

To switch to the Together API, set the api_key to your Together API key, set base_url to https://api.together.xyz/v1, and set model to one of our chat models.

import os
import openai

system_content = "You are a travel agent. Be descriptive and helpful."
user_content = "Tell me about San Francisco"

client = openai.OpenAI(
    api_key=os.environ.get("TOGETHER_API_KEY"),
    base_url="https://api.together.xyz/v1",
)
chat_completion = client.chat.completions.create(
    model="mistralai/Mixtral-8x7B-Instruct-v0.1",
    messages=[
        {"role": "system", "content": system_content},
        {"role": "user", "content": user_content},
    ],
    temperature=0.7,
    max_tokens=1024,
)
response = chat_completion.choices[0].message.content
print("Together response:\n", response)

Streaming

This example demonstrates how you can use the OpenAI Python SDK to stream tokens:

import os
import openai

system_content = "You are a travel agent. Be descriptive and helpful."
user_content = "Tell me about San Francisco"

client = openai.OpenAI(
    api_key=os.environ.get("TOGETHER_API_KEY"),
    base_url="https://api.together.xyz/v1",
)

stream = client.chat.completions.create(
    model="mistralai/Mixtral-8x7B-Instruct-v0.1",
    messages=[
        {"role": "system", "content": system_content},
        {"role": "user", "content": user_content},
    ],
    stream=True,
    max_tokens=1024,
    stop=['</s>']
)
for chunk in stream:
    print(chunk.choices[0].delta.content or "", end="", flush=True)
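If you need the complete reply in addition to printing tokens live, you can accumulate the deltas as they arrive. A minimal sketch of that pattern (the mock_stream generator below is a stand-in for the stream object the SDK returns; note that delta.content can be None on the final chunk, hence the `or ""`):

```python
# Accumulate streamed deltas into the full reply while printing them.
# mock_stream() imitates the chunk objects the SDK yields; in real code,
# iterate over the `stream` created above instead.
from types import SimpleNamespace

def mock_stream():
    for text in ["San ", "Francisco ", "is foggy.", None]:  # last delta is None
        yield SimpleNamespace(
            choices=[SimpleNamespace(delta=SimpleNamespace(content=text))]
        )

parts = []
for chunk in mock_stream():
    delta = chunk.choices[0].delta.content or ""
    print(delta, end="", flush=True)
    parts.append(delta)

full_reply = "".join(parts)
```

Joining the collected parts gives the same string a non-streaming call would have returned in choices[0].message.content.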

Node.js

This example shows how you can use the OpenAI Node.js SDK.

const OpenAI = require("openai");

const openai = new OpenAI({
    apiKey: process.env.TOGETHER_API_KEY,
    baseURL: "https://api.together.xyz/v1",
});

async function run() {
  const chatCompletion = await openai.chat.completions.create({
    messages: [
      {role: "system", content: "You are an AI assistant"},
      {role: "user", content: "Who won the world series in 2020?"},
    ],
    model: "mistralai/Mixtral-8x7B-Instruct-v0.1",
    max_tokens: 1024,
  });
  console.log(chatCompletion.choices[0].message.content);
}

run();

Streaming in Node.js

const OpenAI = require("openai");

const openai = new OpenAI({
    apiKey: process.env.TOGETHER_API_KEY,
    baseURL: "https://api.together.xyz/v1",
});

async function run() {
    const stream = await openai.chat.completions.create({
        messages: [
                {role: "system", content: "You are an AI assistant"},
                {role: "user", content: "Who won the world series in 2020?"},
        ],
        model: "mistralai/Mixtral-8x7B-Instruct-v0.1",
        max_tokens: 1024,
        stream: true,
    });
    for await (const chunk of stream) {
        process.stdout.write(chunk.choices[0]?.delta?.content || "");
    }
}
    
run();

Response structure

An example Chat Completions API response looks as follows:

{
  "id": "83489fc92fe4faf8-SJC",
  "choices": [
    {
      "message": {
        "role": "assistant",
        "content": "The Los Angeles Dodgers won the World Series in 2020."
      }
    }
  ],
  "created": 1702411968,
  "model": "mistralai/Mixtral-8x7B-Instruct-v0.1",
  "object": "chat.completion"
}

The assistant's reply can be extracted from the raw JSON with:

response['choices'][0]['message']['content']

When using the Python SDK, the response is an object rather than a dict, so use attribute access instead: response.choices[0].message.content.
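To see that extraction in action, you can parse the sample response above with the standard json module:

```python
import json

# The sample Chat Completions response from above, as a JSON string.
raw = """
{
  "id": "83489fc92fe4faf8-SJC",
  "choices": [
    {
      "message": {
        "role": "assistant",
        "content": "The Los Angeles Dodgers won the World Series in 2020."
      }
    }
  ],
  "created": 1702411968,
  "model": "mistralai/Mixtral-8x7B-Instruct-v0.1",
  "object": "chat.completion"
}
"""

response = json.loads(raw)
content = response["choices"][0]["message"]["content"]
print(content)  # The Los Angeles Dodgers won the World Series in 2020.
```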

Community Libraries

The Together API also works with most community-built OpenAI libraries. For example, the following snippet uses the OpenAI Python SDK to create embeddings with a Together embedding model:

import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ.get("TOGETHER_API_KEY"),
    base_url="https://api.together.xyz/v1",
)

def get_embeddings(texts, model="togethercomputer/m2-bert-80M-32k-retrieval"):
    texts = [text.replace("\n", " ") for text in texts]
    outputs = client.embeddings.create(input=texts, model=model)
    return [outputs.data[i].embedding for i in range(len(texts))]

embeddings = get_embeddings(["Our solar system orbits the Milky Way galaxy."])
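A common next step with embeddings is ranking documents against a query by cosine similarity. A minimal sketch using only the standard library (the short toy vectors and document names below are illustrative stand-ins for real embedding output):

```python
import math

def cosine_similarity(a, b):
    # Dot product divided by the product of vector magnitudes.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy vectors standing in for real get_embeddings() output.
query_vec = [0.1, 0.3, 0.5]
doc_vecs = {
    "doc_a": [0.1, 0.3, 0.5],   # same direction as the query
    "doc_b": [0.5, -0.3, 0.1],  # mostly orthogonal to the query
}

scores = {name: cosine_similarity(query_vec, vec) for name, vec in doc_vecs.items()}
best = max(scores, key=scores.get)
print(best)  # doc_a
```

With real embeddings you would embed the query and the documents with the same model, then rank documents by their similarity score to the query.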