Hugging Face

huggingface.co

Integrate advanced NLP models for text analysis and generation.

Using the Hugging Face API with Trigger.dev

You can use Trigger.dev with any existing Node SDK, or even just fetch. Using io.runTask makes your Hugging Face background job resumable and visible in our dashboard.

  • Use io.runTask() and the official SDK or fetch (a fetch-based sketch follows the example code below).

  • Use our HTTP endpoint to subscribe to webhooks (a sketch follows the example code below).

  • Example code using Hugging Face

    Below are some working code examples of how you can use Hugging Face with Trigger.dev. These samples are open source and maintained by the community; you can copy and paste them into your own projects.

    import { HfInference } from "@huggingface/inference";
    import { TriggerClient, eventTrigger } from "@trigger.dev/sdk";
    import { z } from "zod";

    // Create a new Hugging Face inference client
    // Get started with Hugging Face: https://huggingface.co/docs/api-inference/quicktour
    // SDK: https://www.npmjs.com/package/@huggingface/inference
    const hf = new HfInference(process.env.HUGGING_FACE_API_KEY);

    // Create a Trigger.dev client (the id below is a placeholder; use your own project id and API key)
    const client = new TriggerClient({
      id: "hugging-face-examples",
      apiKey: process.env.TRIGGER_API_KEY,
    });

    client.defineJob({
      id: "hugging-face-inference",
      name: "Hugging Face inference",
      version: "1.0.0",
      trigger: eventTrigger({
        name: "hugging-face-inference",
        schema: z.object({
          // Hugging Face model name or ID.
          // Example: "distilbert-base-uncased-finetuned-sst-2-english"
          // More models: https://huggingface.co/models?pipeline_tag=text-classification
          model: z.string(),
          // Text to input to the model.
          // Example: "Such nice weather outside!"
          inputs: z.string(),
        }),
      }),
      run: async (payload, io, ctx) => {
        // Use io.runTask to make the SDK call resumable and log-friendly
        await io.runTask(
          "Hugging Face inference",
          async () => {
            // Call the Hugging Face API
            return await hf.textClassification(payload);
          },
          // Add metadata to improve how the task displays in the logs
          { name: "Hugging Face inference", icon: "hugging-face" }
        );
      },
    });
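
    To run this job, send an event whose name matches the eventTrigger above. A minimal sketch, assuming the client defined in the example and a payload that satisfies the job's zod schema:

    // Somewhere in your app (e.g. an API route), trigger the job by event name.
    await client.sendEvent({
      name: "hugging-face-inference",
      payload: {
        model: "distilbert-base-uncased-finetuned-sst-2-english",
        inputs: "Such nice weather outside!",
      },
    });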
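
    If you would rather not use the @huggingface/inference SDK, the same call can be made with plain fetch inside io.runTask. A minimal sketch, assuming the public Inference API endpoint at https://api-inference.huggingface.co/models/<model> and the same HUGGING_FACE_API_KEY environment variable; this would replace the task body inside the job's run function:

    // Inside run(payload, io, ctx): call the Inference API directly with fetch.
    const result = await io.runTask("hugging-face-fetch", async () => {
      const response = await fetch(
        `https://api-inference.huggingface.co/models/${payload.model}`,
        {
          method: "POST",
          headers: {
            Authorization: `Bearer ${process.env.HUGGING_FACE_API_KEY}`,
            "Content-Type": "application/json",
          },
          body: JSON.stringify({ inputs: payload.inputs }),
        }
      );
      return await response.json();
    });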
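
    To subscribe to Hugging Face webhooks instead of sending your own events, you can use a Trigger.dev HTTP endpoint. A hedged sketch, assuming the defineHttpEndpoint API and that Hugging Face delivers the secret you configure in the X-Webhook-Secret header; the ids and the HUGGING_FACE_WEBHOOK_SECRET variable are placeholders:

    // Define an HTTP endpoint that receives Hugging Face webhooks.
    const huggingFace = client.defineHttpEndpoint({
      id: "hugging-face",
      source: "huggingface.co",
      icon: "hugging-face",
      verify: async (request) => {
        // Hugging Face webhooks send the configured secret in this header (assumption).
        const secret = request.headers.get("X-Webhook-Secret");
        if (secret === process.env.HUGGING_FACE_WEBHOOK_SECRET) {
          return { success: true };
        }
        return { success: false, reason: "Invalid webhook secret" };
      },
    });

    client.defineJob({
      id: "hugging-face-webhook",
      name: "Hugging Face webhook",
      version: "1.0.0",
      trigger: huggingFace.onRequest(),
      run: async (request, io, ctx) => {
        // Log the webhook payload; replace this with your own handling logic.
        const body = await request.json();
        await io.logger.info("Received Hugging Face webhook", { body });
      },
    });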