Batch trigger improvements

Support for larger payloads, streaming ingestion, and fair processing with per-environment concurrency limits.

Eric Allam

CTO, Trigger.dev

We've significantly improved our batch trigger system with support for larger payloads, streaming ingestion, and increased batch sizes.

NOTE: Upgrade to SDK 4.3.1+ to use batch trigger v2.

Updated limits

With SDK 4.3.1+, batch triggers now support:

| Limit                  | New         | Previous           |
| ---------------------- | ----------- | ------------------ |
| Maximum batch size     | 1,000 items | 500 items          |
| Batch payload per item | 3MB each    | 1MB total combined |

Payloads that exceed 512KB are automatically offloaded to object storage; no changes are needed in your code.
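Since offloading is based on serialized payload size, you can estimate ahead of time whether a given item will be offloaded. The helper below is a minimal sketch using the 512KB threshold from this changelog; the function names are ours, not part of the SDK.

```typescript
// Sketch: estimate whether a payload will be offloaded to object storage.
// The 512KB threshold comes from the changelog; these helpers are
// illustrative, not part of the Trigger.dev SDK.
const OFFLOAD_THRESHOLD_BYTES = 512 * 1024;

function payloadByteSize(payload: unknown): number {
  // Measure the UTF-8 byte length of the JSON-serialized payload.
  return Buffer.byteLength(JSON.stringify(payload), "utf8");
}

function willOffload(payload: unknown): boolean {
  return payloadByteSize(payload) > OFFLOAD_THRESHOLD_BYTES;
}
```

Either way the SDK handles it transparently; this is only useful if you want visibility into which items cross the threshold.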

Batch trigger rate limits

Batch triggering now uses a token bucket algorithm to rate limit the number of runs you can trigger per environment:

| Pricing tier | Bucket size | Refill rate           |
| ------------ | ----------- | --------------------- |
| Free         | 1,200 runs  | 100 runs every 10 sec |
| Hobby        | 5,000 runs  | 500 runs every 5 sec  |
| Pro          | 5,000 runs  | 500 runs every 5 sec  |

You can burst up to your bucket size, then tokens refill at the specified rate. For example, a Free user can trigger 1,200 runs immediately, then must wait for tokens to refill.
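The token bucket behavior described above can be sketched as follows. This is an illustrative model of the algorithm, not the server implementation; the class name and clock handling are ours.

```typescript
// Illustrative token bucket: burst up to the bucket size, then tokens
// refill at a fixed rate. Not the actual Trigger.dev implementation.
class TokenBucket {
  private tokens: number;
  private lastRefill: number;

  constructor(
    private capacity: number,
    private refillAmount: number,
    private refillIntervalMs: number,
    now: number = Date.now()
  ) {
    this.tokens = capacity; // starts full: you can burst to the bucket size
    this.lastRefill = now;
  }

  // Try to take `count` tokens (one per run triggered); returns whether
  // the trigger request is allowed right now.
  tryTake(count: number, now: number = Date.now()): boolean {
    this.refill(now);
    if (count > this.tokens) return false;
    this.tokens -= count;
    return true;
  }

  private refill(now: number): void {
    const intervals = Math.floor((now - this.lastRefill) / this.refillIntervalMs);
    if (intervals > 0) {
      this.tokens = Math.min(this.capacity, this.tokens + intervals * this.refillAmount);
      this.lastRefill += intervals * this.refillIntervalMs;
    }
  }
}
```

With the Free-tier numbers (`new TokenBucket(1_200, 100, 10_000)`), 1,200 runs succeed immediately, then 100 more become available every 10 seconds.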

Batch processing concurrency

The number of batches that can be processed concurrently per environment:

| Pricing tier | Limit                 |
| ------------ | --------------------- |
| Free         | 1 concurrent batch    |
| Hobby        | 10 concurrent batches |
| Pro          | 10 concurrent batches |
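A per-environment concurrency limit like this behaves like a semaphore: up to the limit, batches run at once; beyond it, they queue. The sketch below models that behavior under our own naming; it is not the actual scheduler.

```typescript
// Illustrative per-environment concurrency limit: at most `limit` batches
// are processed at once; further batches wait their turn. Not the actual
// Trigger.dev scheduler.
class BatchSemaphore {
  private active = 0;
  private waiters: (() => void)[] = [];

  constructor(private limit: number) {}

  async run<T>(batch: () => Promise<T>): Promise<T> {
    if (this.active >= this.limit) {
      // Wait until a running batch finishes and releases a slot.
      await new Promise<void>((resolve) => this.waiters.push(resolve));
    }
    this.active++;
    try {
      return await batch();
    } finally {
      this.active--;
      // Wake exactly one waiting batch, if any.
      this.waiters.shift()?.();
    }
  }
}
```

A Free-tier environment corresponds to `new BatchSemaphore(1)`: a second batch submitted while one is processing waits until the first completes.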

What's new

Larger batch sizes

You can now trigger batches with up to 1,000 items in a single call (doubled from 500). The system handles large batches efficiently through streaming ingestion, processing items as they arrive rather than waiting for the entire batch.
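If you have more than 1,000 items, you still need to split them across multiple calls. A minimal chunking helper (the name and constant are ours, not part of the SDK) might look like:

```typescript
// Sketch: split a list of items into chunks that fit the 1,000-item batch
// limit, so each chunk can be passed to a separate batchTrigger call.
// Helper name is illustrative, not part of the Trigger.dev SDK.
const MAX_BATCH_SIZE = 1_000;

function chunkItems<T>(items: T[], size: number = MAX_BATCH_SIZE): T[][] {
  const chunks: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    chunks.push(items.slice(i, i + size));
  }
  return chunks;
}
```

Each chunk can then be passed to `myTask.batchTrigger(chunk)` in turn; the rate limits above govern how quickly the chunks are accepted.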

Fair processing

We've implemented a new fair queueing system that ensures multi-tenant fairness with per-environment concurrency limits. This means:

  • Your batch triggers won't be starved by other workloads
  • Processing is distributed fairly across all environments in your organization
  • Rate limits use a token bucket algorithm for predictable burst capacity
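The fairness property described above is the behavior of round-robin draining across per-environment queues: each environment gets a turn before any environment gets a second one. This sketch models that behavior only; it is not the actual queueing implementation.

```typescript
// Illustrative round-robin drain across per-environment queues: one busy
// environment cannot starve the others, because each queue yields at most
// one item per pass. Not the actual Trigger.dev implementation.
function drainRoundRobin<T>(queues: Map<string, T[]>): T[] {
  const order: T[] = [];
  let tookItem = true;
  while (tookItem) {
    tookItem = false;
    for (const [, queue] of queues) {
      const item = queue.shift();
      if (item !== undefined) {
        order.push(item);
        tookItem = true;
      }
    }
  }
  return order;
}
```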

Streaming ingestion

Large batch payloads are now streamed and processed incrementally, reducing memory pressure and improving reliability for high-volume batch operations.
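Conceptually, streaming ingestion means consuming items from a stream one at a time instead of buffering the whole batch. A minimal sketch of that pattern, under our own naming:

```typescript
// Illustrative sketch of streaming ingestion: items are consumed from an
// (async) iterable and handled one at a time, so the full batch never has
// to sit in memory at once. Not the actual server code.
async function ingestStream<T>(
  items: AsyncIterable<T> | Iterable<T>,
  handle: (item: T) => Promise<void> | void
): Promise<number> {
  let count = 0;
  for await (const item of items) {
    await handle(item); // process each item as it arrives
    count++;
  }
  return count;
}
```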

Usage

Batch triggering works the same way as before:


```ts
import { myTask } from "./trigger/myTask";

// Trigger a batch of tasks - now supports up to 1,000 items
const runs = await myTask.batchTrigger([
  { payload: { userId: "user-1" } },
  { payload: { userId: "user-2" } },
  { payload: { userId: "user-3" } },
  // ...
]);
```

You can also use batchTriggerAndWait to wait for all runs to complete:


```ts
const { runs } = await myTask.batchTriggerAndWait([
  { payload: { userId: "user-1" } },
  { payload: { userId: "user-2" } },
]);

for (const run of runs) {
  if (run.ok) {
    console.log("Result:", run.output);
  }
}
```

Read the full docs for more details.
