Function calling (also called tool calling) lets LLMs respond with structured function names and arguments that you can execute in your application. It enables models to interact with external systems, retrieve real-time data, and power agentic AI workflows. Pass function descriptions to the tools parameter, and the model returns tool_calls when it determines a function should be used. You then execute these functions and optionally pass the results back to the model for further processing.
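A minimal sketch of this loop, with the model's side simulated: the `tools` schema below follows the OpenAI-compatible shape the docs describe, while the function name, the stubbed weather data, and the hard-coded `tool_call` payload are illustrative assumptions rather than a real API response.

```python
import json

# Hypothetical function the model can call (stubbed; a real app
# would query a weather service here).
def get_weather(location: str, unit: str = "celsius") -> dict:
    return {"location": location, "temperature": 21, "unit": unit}

# JSON-schema description passed via the `tools` parameter.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a location.",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string"},
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["location"],
        },
    },
}]

# Shape of one entry in the model's `tool_calls` (simulated here,
# not fetched from the API).
tool_call = {
    "id": "call_1",
    "function": {"name": "get_weather", "arguments": '{"location": "Paris"}'},
}

# Dispatch: look up the named function and run it with parsed arguments.
available = {"get_weather": get_weather}
fn = available[tool_call["function"]["name"]]
result = fn(**json.loads(tool_call["function"]["arguments"]))
# `result` can now be sent back to the model as a role="tool" message.
```

Note that `arguments` arrives as a JSON string, not a dict, so it must be parsed before the call.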

Patterns

Function calling fits a handful of common shapes. Pick the one that matches what you’re building, then follow the link for runnable Python, TypeScript, and cURL examples.
| Pattern | Description | Use cases | Page |
| --- | --- | --- | --- |
| Simple | One function, one call | Basic utilities, simple queries | Call functions |
| Multiple | Choose from many functions | Many tools, model has to choose | Call functions |
| Parallel | Same function, multiple calls in one turn | Complex prompts, batched lookups | Parallel calls |
| Parallel multiple | Multiple functions, parallel calls | Single requests that need many tools | Parallel calls |
| Multi-step | Sequential function calling in one turn | Data-processing workflows | Agentic patterns |
| Multi-turn | Conversational context plus functions | Agents with humans in the loop | Agentic patterns |
| Vision | Tool use with image inputs | Extract structured data from images | Vision-language function calling |
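The parallel patterns above hinge on one mechanic: a single assistant message can carry several entries in `tool_calls`, and each must be answered with a `role="tool"` message echoing its `tool_call_id`. A sketch of that loop, with the assistant message simulated and the tool names and stub outputs assumed for illustration:

```python
import json

# Hypothetical tool implementations keyed by name (stubbed data).
def get_weather(location: str) -> str:
    return f"18C and cloudy in {location}"

def get_time(location: str) -> str:
    return f"14:00 in {location}"

available = {"get_weather": get_weather, "get_time": get_time}

# A parallel-call response carries several entries in `tool_calls`
# (shape simulated here rather than returned by the API).
assistant_message = {
    "role": "assistant",
    "tool_calls": [
        {"id": "call_1", "function": {"name": "get_weather",
                                      "arguments": '{"location": "Tokyo"}'}},
        {"id": "call_2", "function": {"name": "get_time",
                                      "arguments": '{"location": "Tokyo"}'}},
    ],
}

# Execute every call and answer each with a role="tool" message
# whose tool_call_id matches the call it answers.
messages = [{"role": "user", "content": "Weather and time in Tokyo?"},
            assistant_message]
for call in assistant_message["tool_calls"]:
    fn = available[call["function"]["name"]]
    output = fn(**json.loads(call["function"]["arguments"]))
    messages.append({"role": "tool",
                     "tool_call_id": call["id"],
                     "content": output})

# `messages` is now ready to send back to the model for its final answer.
```

The same loop covers the single-call case (a one-element `tool_calls` list), so one dispatcher serves both patterns.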

Supported models

For the current list of models that support function calling, see the serverless and dedicated endpoint model catalogs.

Next steps