JSON mode constrains the LLM to output JSON conforming to a provided schema. To activate JSON mode, set the response_format parameter of the Chat Completions API to {"type": "json_object"}. The JSON Schema itself can be supplied via the schema property of response_format.
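Concretely, response_format is just a dictionary. As a sketch, the payload below hand-writes the JSON Schema that the pydantic model in the Python example further down would generate (the field names mirror that example; any library can produce the schema as long as the result is valid JSON Schema):

```python
# A minimal response_format payload for JSON mode.
# The schema here is a hand-written JSON Schema describing a user object,
# equivalent to what the pydantic User model in the Python example emits.
response_format = {
    "type": "json_object",
    "schema": {
        "type": "object",
        "properties": {
            "name": {"type": "string", "description": "user name"},
            "address": {"type": "string", "description": "address"},
        },
        "required": ["name", "address"],
    },
}
```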

Supported models

  • mistralai/Mixtral-8x7B-Instruct-v0.1
  • mistralai/Mistral-7B-Instruct-v0.1
  • togethercomputer/CodeLlama-34b-Instruct

Example in Python

With JSON mode, you can specify a schema for the output of the LLM. Here's an example of JSON mode with Python using our Mixtral model.

import os
import json
import openai
from pydantic import BaseModel, Field

# Create client
client = openai.OpenAI(
    base_url="https://api.together.xyz/v1",
    api_key=os.environ["TOGETHER_API_KEY"],
)

# Define the schema for the output.
class User(BaseModel):
    name: str = Field(description="user name")
    address: str = Field(description="address")

# Call the LLM with the JSON schema
chat_completion = client.chat.completions.create(
    model="mistralai/Mixtral-8x7B-Instruct-v0.1",
    response_format={"type": "json_object", "schema": User.model_json_schema()},
    messages=[
        {
            "role": "system",
            "content": "You are a helpful assistant that answers in JSON.",
        },
        {
            "role": "user",
            "content": "Create a user named Alice, who lives in 42, Wonderland Avenue.",
        },
    ],
)

created_user = json.loads(chat_completion.choices[0].message.content)
print(json.dumps(created_user, indent=2))

{
  "address": "42, Wonderland Avenue",
  "name": "Alice"
}
Example in Node.js

The following example demonstrates how to use JSON mode in TypeScript to extract a summary and action items from a meeting transcript. We'll use the zod library to define a schema, zod-to-json-schema to convert it to JSON Schema, then pass that into our Together LLM call to return both a summary and a list of action items.

import OpenAI from 'openai';
import { z } from 'zod';
import { zodToJsonSchema } from 'zod-to-json-schema';

// Defining the Together.ai client
const togetherai = new OpenAI({
  apiKey: process.env.TOGETHER_API_KEY,
  baseURL: 'https://api.together.xyz/v1',
});

// Defining the schema we want our data in
const actionItemsSchema = z.object({
  summary: z.string().describe('A summary of the voice note'),
  actionItems: z
    .array(z.string())
    .describe('A list of action items from the voice note'),
});
const jsonSchema = zodToJsonSchema(actionItemsSchema, 'mySchema');

async function main() {
  const transcript = 'I need to go pack my bags, hit the gym, and go on a run.';
  const extract = await togetherai.chat.completions.create({
    messages: [
      {
        role: 'system',
        content:
          'The following is a voice message transcript. Extract the action items from it and answer in JSON',
      },
      {
        role: 'user',
        content: transcript,
      },
    ],
    model: 'mistralai/Mistral-7B-Instruct-v0.1',
    // @ts-ignore – Together.ai supports schema while OpenAI does not
    response_format: { type: 'json_object', schema: jsonSchema },
  });

  const output = JSON.parse(extract.choices[0].message.content!);
  console.log({ output });
  return output;
}

main();


{
  output: {
    actionItems: [ 'Go pack my bags', 'Hit the gym', 'Go on a run' ],
    summary: 'Pack bags, hit the gym, and go on a run'
  }
}

Simple cURL example

curl https://api.together.xyz/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $TOGETHER_API_KEY" \
  -d '{
    "model": "mistralai/Mistral-7B-Instruct-v0.1",
    "messages": [
      {"role": "system", "content": "You are a helpful assistant that provides responses in JSON."},
      {"role": "user", "content": "Who won the 2022 FIFA World Cup?"}
    ],
    "response_format": {
      "type": "json_object",
      "schema": {
        "type": "object",
        "properties": {"team_name": {"type": "string"}},
        "required": ["team_name"]
      }
    },
    "temperature": 0.7
  }'

Example cURL output

{
  "choices": [
    {
      "finish_reason": "stop",
      "message": {
        "role": "assistant",
        "content": "{\n  \"team_name\": \"Argentina\"\n}"
      },
      "index": 0
    }
  ],
  "model": "mistralai/Mistral-7B-Instruct-v0.1",
  "object": "chat.completion"
}
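Note that the assistant's JSON answer is nested as a string inside the API response, so it takes a second decode to reach the structured data. A sketch of that double decode (the hard-coded dict below mirrors the sample response above; in practice it would come from parsing the cURL response body):

```python
import json

# Trimmed API response, mirroring the sample cURL output above.
api_response = {
    "choices": [
        {
            "finish_reason": "stop",
            "message": {
                "role": "assistant",
                "content": "{\n  \"team_name\": \"Argentina\"\n}",
            },
            "index": 0,
        }
    ],
}

# The message content is itself a JSON string, so decode it again.
answer = json.loads(api_response["choices"][0]["message"]["content"])
print(answer["team_name"])  # Argentina
```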