---
title: "Next.js Batch LLM Evaluator"
sidebarTitle: "Batch LLM Evaluator"
description: "This example Next.js project evaluates multiple LLM models using the Vercel AI SDK and streams updates to the frontend using Trigger.dev Realtime."
---

import RealtimeLearnMore from "/snippets/realtime-learn-more.mdx";

## Overview

This demo is a full-stack example that uses the following:

- A Next.js app with Prisma for the database.
- Trigger.dev Realtime to stream updates to the frontend.
- The Vercel AI SDK to work with multiple LLM models (OpenAI, Anthropic, xAI).
- The new `batch.triggerByTaskAndWait` method to distribute work across multiple tasks.

## GitHub repo

<Card
  title="View the Batch LLM Evaluator repo"
  icon="GitHub"
  href="https://github.com/triggerdotdev/examples/tree/main/batch-llm-evaluator"
>
  Click here to view the full code for this project in our examples repository on GitHub. You can fork it and use it as a starting point for your own project.
</Card>

## Video

<video controls className="w-full aspect-video" src="https://content.trigger.dev/batch-llm-evaluator.mp4" />

## Relevant code

This example uses the older `useRealtimeRunWithStreams` hook. For new projects, consider using the new [`useRealtimeStream`](/realtime/react-hooks/streams#userealtimestream-recommended) hook (SDK 4.1.0+) for a simpler API and better type safety with defined streams.