Realtime streams
Stream data in realtime from inside your tasks
The world is going realtime, and so should your tasks. With the Streams API, you can stream data from your tasks to the outside world in realtime. This is useful for a variety of use cases, including streaming AI model output to your users as it's generated.
How it works
The Streams API lets you send data from your tasks to the outside world in realtime, built on top of the metadata system. You can stream any kind of data, but the most common use case is forwarding streaming output from LLM providers, like OpenAI.
Usage
To use the Streams API, you need to register a stream with a specific key using `metadata.stream`. The following example uses the OpenAI SDK with `stream: true` to stream the output of the model in realtime:
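A minimal sketch of this pattern, assuming the `openai` npm package, a `gpt-4o` model, and a task id of `openai-streaming` (all placeholder choices). The `STREAMS` type maps each stream key to its chunk type, which is useful when subscribing later:

```typescript
import { task, metadata } from "@trigger.dev/sdk/v3";
import OpenAI from "openai";

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

// Maps each registered stream key to the type of its chunks
export type STREAMS = {
  openai: OpenAI.ChatCompletionChunk;
};

export const openaiStreaming = task({
  id: "openai-streaming",
  run: async (payload: { prompt: string }) => {
    const completion = await openai.chat.completions.create({
      model: "gpt-4o",
      messages: [{ role: "user", content: payload.prompt }],
      stream: true, // returns an AsyncIterable of chunks
    });

    // Register the stream under the key "openai" so it's
    // forwarded to subscribers in realtime
    const stream = await metadata.stream("openai", completion);

    // Consume the stream inside the task and collect the text
    let text = "";
    for await (const chunk of stream) {
      text += chunk.choices[0]?.delta?.content ?? "";
    }

    return { text };
  },
});
```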
You can then subscribe to the stream using the `runs.subscribeToRun` method:
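A sketch of subscribing from your backend, assuming the task and `STREAMS` type from the previous example. The part shape (`type` discriminating between `"run"` updates and stream keys, with stream parts carrying a `chunk`) follows the Realtime API; treat the exact field names as assumptions:

```typescript
import { runs } from "@trigger.dev/sdk/v3";
import type { openaiStreaming, STREAMS } from "./trigger/openai-streaming";

async function subscribe(runId: string) {
  // withStreams() interleaves run updates with stream chunks
  for await (const part of runs
    .subscribeToRun<typeof openaiStreaming>(runId)
    .withStreams<STREAMS>()) {
    switch (part.type) {
      case "run":
        console.log("run updated:", part.run.status);
        break;
      case "openai":
        // A single chunk from the registered "openai" stream
        console.log("chunk:", part.chunk.choices[0]?.delta?.content);
        break;
    }
  }
}
```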
`runs.subscribeToRun` should be used from your backend or another task. To subscribe to a run from your frontend, you can use our React hooks.
You can register and subscribe to multiple streams in the same task. Let’s add a stream from the response body of a fetch request:
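A sketch of registering a second stream, piping a fetch response body through a `TextDecoderStream` so subscribers receive text chunks. The URL is a placeholder; note that neither stream is consumed inside the task here:

```typescript
import { task, metadata } from "@trigger.dev/sdk/v3";
import OpenAI from "openai";

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

export type STREAMS = {
  openai: OpenAI.ChatCompletionChunk;
  fetch: string; // the fetch stream below emits decoded text chunks
};

export const openaiStreaming = task({
  id: "openai-streaming",
  run: async (payload: { prompt: string }) => {
    const completion = await openai.chat.completions.create({
      model: "gpt-4o",
      messages: [{ role: "user", content: payload.prompt }],
      stream: true,
    });

    // Register the LLM output stream under the key "openai"
    await metadata.stream("openai", completion);

    // Placeholder URL: any endpoint with a streaming response body works
    const response = await fetch("https://example.com");

    if (response.body) {
      // Decode the raw byte stream to text before registering it
      await metadata.stream(
        "fetch",
        response.body.pipeThrough(new TextDecoderStream())
      );
    }
  },
});
```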
You may notice above that we aren't consuming either of the streams in the task. In the background, we'll wait until all streams are consumed before the task is considered complete (with a max timeout of 60 seconds). If you have a longer-running stream, make sure to consume it in the task.
And then subscribing to the streams:
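A sketch of handling both stream keys in one subscription, under the same assumptions about the part shape as the earlier subscriber example:

```typescript
import { runs } from "@trigger.dev/sdk/v3";
import type { openaiStreaming, STREAMS } from "./trigger/openai-streaming";

async function subscribe(runId: string) {
  for await (const part of runs
    .subscribeToRun<typeof openaiStreaming>(runId)
    .withStreams<STREAMS>()) {
    switch (part.type) {
      case "run":
        console.log("run updated:", part.run.status);
        break;
      case "openai":
        console.log("openai chunk:", part.chunk.choices[0]?.delta?.content);
        break;
      case "fetch":
        // Text chunks decoded by the TextDecoderStream in the task
        console.log("fetch chunk:", part.chunk);
        break;
    }
  }
}
```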
React hooks
If you’re building a frontend application, you can use our React hooks to subscribe to streams. Here’s an example of how you can use the `useRealtimeRunWithStreams` hook to subscribe to a stream:
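A sketch of a client component, assuming the task and `STREAMS` type from the earlier examples and that `runId` and a public access token are passed in as props. The hook exposes received chunks as arrays keyed by stream name (an assumption about its return shape):

```tsx
"use client";

import { useRealtimeRunWithStreams } from "@trigger.dev/react-hooks";
import type { openaiStreaming, STREAMS } from "@/trigger/openai-streaming";

export function StreamOutput({
  runId,
  publicAccessToken,
}: {
  runId: string;
  publicAccessToken: string;
}) {
  const { run, streams } = useRealtimeRunWithStreams<
    typeof openaiStreaming,
    STREAMS
  >(runId, { accessToken: publicAccessToken });

  // Join all text deltas received so far into a single string
  const text = (streams.openai ?? [])
    .map((chunk) => chunk.choices[0]?.delta?.content ?? "")
    .join("");

  return (
    <div>
      <div>Status: {run?.status}</div>
      <p>{text}</p>
    </div>
  );
}
```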
Read more about using the React hooks in the React hooks documentation.
Usage with the `ai` SDK
The `ai` SDK provides a higher-level API for working with AI models. You can use it with the Streams API via the `streamText` function:
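A sketch assuming the `ai` and `@ai-sdk/openai` packages (v4-style API; older versions may require awaiting `streamText`). Here the registered stream emits plain strings, since `textStream` yields text chunks:

```typescript
import { task, metadata } from "@trigger.dev/sdk/v3";
import { streamText } from "ai";
import { openai } from "@ai-sdk/openai";

export type STREAMS = {
  openai: string; // textStream emits plain text chunks
};

export const aiStreaming = task({
  id: "ai-streaming",
  run: async ({ prompt }: { prompt: string }) => {
    const result = streamText({
      model: openai("gpt-4o"),
      prompt,
    });

    // Register the ai SDK's textStream so it's forwarded in realtime
    const stream = await metadata.stream("openai", result.textStream);

    let text = "";
    for await (const chunk of stream) {
      text += chunk;
    }

    return { text };
  },
});
```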
And then render the stream in your frontend:
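Because the chunks are plain strings here, rendering is a simple join. A sketch under the same assumptions as the earlier React example:

```tsx
"use client";

import { useRealtimeRunWithStreams } from "@trigger.dev/react-hooks";
import type { aiStreaming, STREAMS } from "@/trigger/ai-streaming";

export function AIOutput({
  runId,
  publicAccessToken,
}: {
  runId: string;
  publicAccessToken: string;
}) {
  const { streams } = useRealtimeRunWithStreams<typeof aiStreaming, STREAMS>(
    runId,
    { accessToken: publicAccessToken }
  );

  // Each received chunk is a string; join them into the full text so far
  const text = (streams.openai ?? []).join("");

  return <p>{text}</p>;
}
```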
Using tools and `fullStream`
When calling `streamText`, you can provide a `tools` object that allows the LLM to call additional tools. You can then access the tool calls and results via the `fullStream` property of the result:
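A sketch with a hypothetical `getWeather` tool (the tool name, parameters, and hardcoded result are all placeholders). Field names follow the ai SDK v4 (`parameters`, `TextStreamPart`); newer SDK versions rename some of these. `fullStream` interleaves text deltas with tool-call and tool-result parts:

```typescript
import { task, metadata } from "@trigger.dev/sdk/v3";
import { streamText, tool, type TextStreamPart } from "ai";
import { openai } from "@ai-sdk/openai";
import { z } from "zod";

const getWeather = tool({
  description: "Get the weather for a location",
  parameters: z.object({ location: z.string() }),
  execute: async ({ location }) => {
    // Hypothetical implementation: return a fixed result
    return { location, temperature: 20 };
  },
});

export type TOOLS = { getWeather: typeof getWeather };

// fullStream parts are typed against the tools object
export type STREAMS = { openai: TextStreamPart<TOOLS> };

export const aiStreamingWithTools = task({
  id: "ai-streaming-with-tools",
  run: async ({ prompt }: { prompt: string }) => {
    const result = streamText({
      model: openai("gpt-4o"),
      prompt,
      tools: { getWeather },
    });

    // Register fullStream: includes text deltas, tool calls, and tool results
    const stream = await metadata.stream("openai", result.fullStream);

    // Consume the stream so the task doesn't wait on it at completion
    for await (const part of stream) {
      if (part.type === "tool-result") {
        console.log("tool result:", part.result);
      }
    }
  },
});
```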
Now you can get access to the tool call and results in your frontend:
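A sketch of a client component that splits the `fullStream` parts by type, assuming the task and `STREAMS` type from the previous example and ai SDK v4 part shapes (`text-delta` parts carry `textDelta`):

```tsx
"use client";

import { useRealtimeRunWithStreams } from "@trigger.dev/react-hooks";
import type {
  aiStreamingWithTools,
  STREAMS,
} from "@/trigger/ai-streaming-with-tools";

export function WeatherOutput({
  runId,
  publicAccessToken,
}: {
  runId: string;
  publicAccessToken: string;
}) {
  const { streams } = useRealtimeRunWithStreams<
    typeof aiStreamingWithTools,
    STREAMS
  >(runId, { accessToken: publicAccessToken });

  const parts = streams.openai ?? [];

  // Collect the text deltas into the response text so far
  const text = parts
    .flatMap((part) => (part.type === "text-delta" ? [part.textDelta] : []))
    .join("");

  // Surface any tool results received so far
  const toolResults = parts.filter((part) => part.type === "tool-result");

  return (
    <div>
      <p>{text}</p>
      <pre>{JSON.stringify(toolResults, null, 2)}</pre>
    </div>
  );
}
```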
Using `toolTask`
As you can see above, we defined a tool which will be used in the `aiStreamingWithTools` task. You can also define a Trigger.dev task that can be used as a tool, and will automatically be invoked with `triggerAndWait` when the tool is called. This is done using the `toolTask` function:
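A sketch of the same hypothetical weather tool as a `toolTask`, exposed to `streamText` via its `.tool` property (the task ids and the fixed result are placeholders):

```typescript
import { task, toolTask, metadata } from "@trigger.dev/sdk/v3";
import { streamText } from "ai";
import { openai } from "@ai-sdk/openai";
import { z } from "zod";

export const getWeatherTask = toolTask({
  id: "get-weather",
  description: "Get the weather for a location",
  parameters: z.object({ location: z.string() }),
  run: async ({ location }) => {
    // Hypothetical implementation: return a fixed result
    return { location, temperature: 20 };
  },
});

export const aiStreamingWithTools = task({
  id: "ai-streaming-with-tools",
  run: async ({ prompt }: { prompt: string }) => {
    const result = streamText({
      model: openai("gpt-4o"),
      prompt,
      tools: {
        // The task is triggered with triggerAndWait when the LLM calls the tool
        getWeather: getWeatherTask.tool,
      },
    });

    // Register and consume the fullStream as in the previous example
    const stream = await metadata.stream("openai", result.fullStream);
    for await (const part of stream) {
      // consuming the stream lets the task complete promptly
    }
  },
});
```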