Anthropic
batchling is compatible with Anthropic through any supported framework.
Anthropic makes the following endpoint batch-compatible:
/v1/messages
Check model support and batch pricing
Before sending batches, review the provider's official pricing page for supported models and batch pricing details.
The Batch API docs for Anthropic can be found at the following URL:
https://docs.anthropic.com/en/docs/build-with-claude/batch-processing
Example Usage
API key required
Set ANTHROPIC_API_KEY in your .env file, or ensure it is already present in your environment, before running batches.
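A minimal `.env` file for the example below might look like this (the key value is a placeholder; substitute your own):

```shell
# .env — loaded by python-dotenv when the script calls load_dotenv()
ANTHROPIC_API_KEY=sk-ant-your-key-here  # placeholder value
```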
Here's an example showing how to use batchling with Anthropic:
anthropic_example.py
```python
import asyncio
import os

from anthropic import AsyncAnthropic
from dotenv import load_dotenv

from batchling import batchify

load_dotenv()


async def build_tasks() -> list:
    """Build Anthropic requests."""
    client = AsyncAnthropic(api_key=os.getenv("ANTHROPIC_API_KEY"))
    questions = [
        "Who is the best French painter? Answer in one short sentence.",
        "What is the capital of France?",
    ]
    # Each call returns an awaitable; they are gathered later in `main`.
    return [
        client.messages.create(
            max_tokens=1024,
            messages=[
                {
                    "role": "user",
                    "content": question,
                }
            ],
            model="claude-haiku-4-5",
        )
        for question in questions
    ]


async def main() -> None:
    """Run the Anthropic example."""
    tasks = await build_tasks()
    responses = await asyncio.gather(*tasks)
    for response in responses:
        print(f"{response.model} answer:\n{response.content[0].text}\n")


async def run_with_batchify() -> None:
    """Run `main` inside `batchify` for direct script execution."""
    async with batchify():
        await main()


if __name__ == "__main__":
    asyncio.run(run_with_batchify())
```
Output: