# batchify
The `batchify` function is the single entrypoint for the batchling library; everything starts from there.
It can be used either through the Python SDK or the CLI, depending on your needs.
It accepts several parameters that let you customize how batching is performed:
```python
batchify(
    batch_size=50,
    batch_window_seconds=2.0,
    batch_poll_interval_seconds=10.0,
    dry_run=False,
    cache=True,
)
```
Context manager that activates batching for a scoped block.
Requests are accumulated within this context and are batched and sent to the provider when the batch size or time window is reached.
Batches are accumulated in different queues based on the provider/endpoint/model triplet.
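To make the queueing semantics concrete, here is a minimal pure-Python sketch of the behavior described above. This is an illustration of the triplet-keyed queues and the size/window triggers, not the batchling implementation; all names (`enqueue`, `flush`, `tick`) and the example providers/endpoints are invented for this sketch.

```python
import time
from collections import defaultdict

# Stand-ins for the batchify parameters (not the library defaults).
BATCH_SIZE = 3
BATCH_WINDOW = 2.0

queues = defaultdict(list)  # (provider, endpoint, model) -> queued requests
first_enqueued = {}         # triplet -> time the first request was queued
submitted = []              # flushed batches, collected for illustration

def flush(triplet):
    """Submit a queued batch and reset its window timer."""
    submitted.append((triplet, queues.pop(triplet)))
    first_enqueued.pop(triplet, None)

def enqueue(provider, endpoint, model, request):
    """Queue a request; flush immediately if the size trigger fires."""
    triplet = (provider, endpoint, model)
    if triplet not in first_enqueued:
        first_enqueued[triplet] = time.monotonic()
    queues[triplet].append(request)
    if len(queues[triplet]) >= BATCH_SIZE:
        flush(triplet)  # size trigger

def tick():
    """Window trigger: flush queues whose first request is too old.
    In a real system this would run on a periodic background loop."""
    now = time.monotonic()
    for triplet in [t for t, t0 in first_enqueued.items()
                    if now - t0 >= BATCH_WINDOW]:
        flush(triplet)

# Requests for different provider/endpoint/model triplets land in
# different queues, so they are batched independently.
enqueue("openai", "/v1/chat/completions", "gpt-4o", {"id": 1})
enqueue("openai", "/v1/chat/completions", "gpt-4o", {"id": 2})
enqueue("openai", "/v1/embeddings", "text-embedding-3-small", {"id": 3})
enqueue("openai", "/v1/chat/completions", "gpt-4o", {"id": 4})  # size trigger fires

print(len(submitted))  # one batch was flushed by the size trigger
print(len(queues))     # the embeddings queue is still waiting on its window
```

The embeddings request stays queued until `tick()` observes that its window has elapsed, which is why a small `batch_window_seconds` bounds the latency of sparse traffic even when `batch_size` is never reached.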
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| `batch_size` | `int` | Submit a batch when this many requests are queued for a given provider/endpoint/model triplet. | `50` |
| `batch_window_seconds` | `float` | Submit a provider batch this many seconds after the first request is queued, even if the batch size has not been reached. | `2.0` |
| `batch_poll_interval_seconds` | `float` | Poll active batches for results every this many seconds. | `10.0` |
| `dry_run` | `bool` | If | `False` |
| `cache` | `bool` | If | `True` |
Returns:
| Type | Description |
|---|---|
| `BatchingContext` | Context manager that yields |
Source code in `src/batchling/api.py`
Next steps
Now that you know more about the `batchify` function, you can:

- Learn to use it from the Python SDK or the CLI
- See which Frameworks & Providers it can be used with
- Learn more about advanced features