GET /batches/{id}
from together import Together
import os

client = Together(
    api_key=os.environ.get("TOGETHER_API_KEY"),
)

# Retrieve a batch job by its ID
batch = client.batches.get_batch("batch_id")

print(batch)
{
  "id": "01234567-8901-2345-6789-012345678901",
  "user_id": "user_789xyz012",
  "input_file_id": "file-input123abc456def",
  "file_size_bytes": 1048576,
  "status": "IN_PROGRESS",
  "job_deadline": "2024-01-15T15:30:00Z",
  "created_at": "2024-01-15T14:30:00Z",
  "endpoint": "/v1/chat/completions",
  "progress": 75,
  "model_id": "meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo",
  "output_file_id": "file-output789xyz012ghi",
  "error_file_id": "file-errors456def789jkl",
  "error": "<string>",
  "completed_at": "2024-01-15T15:45:30Z"
}

Authorizations

Authorization
string
header
default:default
required

Bearer authentication header of the form Bearer <token>, where <token> is your auth token.
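As a sketch of the raw request behind the SDK call, the helper below builds the URL and the Bearer header described above. The helper name is hypothetical and the base URL is an assumption; verify both against your API reference.

```python
# Assumed base URL for the Together REST API; verify before use.
BASE_URL = "https://api.together.xyz/v1"

def build_get_batch_request(batch_id: str, token: str):
    """Build the URL and headers for GET /batches/{id}.

    The Authorization header is of the form `Bearer <token>`,
    exactly as documented above.
    """
    url = f"{BASE_URL}/batches/{batch_id}"
    headers = {"Authorization": f"Bearer {token}"}
    return url, headers

# Hypothetical usage (requires the `requests` package and a real key):
# url, headers = build_get_batch_request("batch_id", os.environ["TOGETHER_API_KEY"])
# resp = requests.get(url, headers=headers)
# print(resp.json())
```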

Path Parameters

id
string
required

ID of the batch job to retrieve

Response

OK

id
string<uuid>
Example:

"01234567-8901-2345-6789-012345678901"

user_id
string
Example:

"user_789xyz012"

input_file_id
string
Example:

"file-input123abc456def"

file_size_bytes
integer

Size of input file in bytes

Example:

1048576

status
enum<string>

Current status of the batch job

Available options:
VALIDATING,
IN_PROGRESS,
COMPLETED,
FAILED,
EXPIRED,
CANCELLED
Example:

"IN_PROGRESS"
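Of the statuses above, only VALIDATING and IN_PROGRESS indicate a job that is still running, so a common pattern is to poll this endpoint until a terminal status appears. A minimal sketch, with the helper name and polling defaults being assumptions rather than part of the SDK:

```python
import time

# Statuses after which the job will not change further (from the enum above).
TERMINAL_STATUSES = {"COMPLETED", "FAILED", "EXPIRED", "CANCELLED"}

def wait_for_batch(get_batch, batch_id, interval=10.0, max_polls=1000):
    """Poll `get_batch(batch_id)` until a terminal status is returned.

    `get_batch` is any callable returning a dict or object with a
    `status` field, e.g. `client.batches.get_batch` from the SDK.
    """
    for _ in range(max_polls):
        batch = get_batch(batch_id)
        status = batch["status"] if isinstance(batch, dict) else batch.status
        if status in TERMINAL_STATUSES:
            return batch
        time.sleep(interval)
    raise TimeoutError(f"batch {batch_id} still running after {max_polls} polls")
```

With the SDK this would be called as `wait_for_batch(client.batches.get_batch, "batch_id")`.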

job_deadline
string<date-time>
Example:

"2024-01-15T15:30:00Z"

created_at
string<date-time>
Example:

"2024-01-15T14:30:00Z"

endpoint
string
Example:

"/v1/chat/completions"

progress
number

Completion progress (0 to 100)

Example:

75

model_id
string

Model used for processing requests

Example:

"meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo"

output_file_id
string
Example:

"file-output789xyz012ghi"

error_file_id
string
Example:

"file-errors456def789jkl"

error
string

completed_at
string<date-time>
Example:

"2024-01-15T15:45:30Z"
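Once the job reaches a terminal status, output_file_id identifies the results and error_file_id identifies any failed requests. A small hypothetical helper to pick which file to fetch from a response shaped like the example above:

```python
def result_file_id(batch: dict):
    """Pick the file to download from a finished batch job.

    Returns the output file on success, the error file otherwise.
    `batch` is a dict shaped like the response documented above.
    """
    if batch["status"] == "COMPLETED":
        return batch.get("output_file_id")
    return batch.get("error_file_id")

# The actual download goes through the Files API; the call below is an
# assumption about its shape -- check the Files reference for the real method:
# content = client.files.retrieve_content(result_file_id(batch))
```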