New: Dedicated Container Inference
Deploy your own Dockerized workloads on Together’s managed GPU infrastructure. You bring the container — Together handles provisioning, autoscaling, and observability.
Welcome to Together AI's docs! Together makes it easy to run, fine-tune, and train open-source AI models with transparency and privacy.
For example, querying a hosted model with the Python SDK:

```python
from together import Together

# Reads your API key from the TOGETHER_API_KEY environment variable by default
client = Together()

completion = client.chat.completions.create(
    model="openai/gpt-oss-20b",
    messages=[{"role": "user", "content": "What are the top 3 things to do in New York?"}],
)

print(completion.choices[0].message.content)
```
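Before running the example above, install the SDK and configure your API key. This is a minimal setup sketch, assuming the `together` package on PyPI and the `TOGETHER_API_KEY` environment variable that the client reads by default; substitute your own key.

```shell
# Install the Together Python SDK
pip install together

# The client picks this up automatically; "your-api-key" is a placeholder
export TOGETHER_API_KEY="your-api-key"
```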