AI Integration

Text and structured output generation with Vercel AI SDK

Overview

Use aiGenerateText() and aiGenerateObject() from @/lib/ai for all AI operations. The utilities handle logging, error handling, and model selection automatically.

What it is

Wrapper utilities around Vercel AI SDK that provide consistent patterns for text generation, structured output, and streaming.

Why we use it

Centralized AI access with automatic logging, type-safe structured output via Zod, and easy model switching through AI Gateway.

When to use

Content generation, data extraction, document parsing, chat interfaces, or any task requiring LLM capabilities.

Key Features

  • aiGenerateText() for text generation with logging
  • aiGenerateObject() for structured output with Zod
  • aiStreamText() for real-time chat interfaces
  • parseDocumentWithAI() for PDF/DOCX analysis (sketched below)
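
parseDocumentWithAI() has no example elsewhere on this page. The sketch below shows the intended shape of a call; the parameter names (file, schema) and the return shape are assumptions, so check @/lib/ai for the actual signature.

// Document parsing sketch: parameter names and return shape are assumptions
import { parseDocumentWithAI } from '@/lib/ai';
import { z } from 'zod';

const ContractSchema = z.object({
  parties: z.array(z.string()),
  effectiveDate: z.string(),
});

async function extractContract(file: File) {
  const parsed = await parseDocumentWithAI({
    file,                   // assumed input: an uploaded PDF/DOCX from File Storage
    schema: ContractSchema, // assumed option for structured extraction
  });
  return parsed.object;     // assumed to mirror aiGenerateObject's result shape
}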

Quick Start

Basic AI Operations

Text generation and structured output examples.

// Basic text generation
import { aiGenerateText } from '@/lib/ai';

const result = await aiGenerateText({
  prompt: 'Summarize this feedback in 2 sentences...',
});
// result.text contains the AI response

// Structured output with Zod
import { aiGenerateObject } from '@/lib/ai';
import { z } from 'zod';

const InsightSchema = z.object({
  category: z.enum(['bug', 'feature', 'improvement']),
  priority: z.enum(['low', 'medium', 'high']),
  summary: z.string(),
});

const insight = await aiGenerateObject({
  prompt: 'Extract insight from this feedback...',
  schema: InsightSchema,
});
// insight.object is fully typed!

Patterns

Model Selection

Available models and when to use each.

// Available models via AI Gateway
import { AI_MODELS } from '@/lib/ai';

// DEFAULT / FAST: Quick tasks, chat responses
AI_MODELS.DEFAULT  // gpt-5.2 (recommended)
AI_MODELS.FAST     // gpt-5-nano (cheapest)

// POWERFUL: Complex reasoning, analysis
AI_MODELS.POWERFUL // gpt-5.2-pro

// REASONING: Chain-of-thought, problem solving
AI_MODELS.REASONING // gpt-5.1-thinking
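
Passing one of these constants to a generate call is all model selection takes (the Watch Out section below shows the same pattern). The prompts here are illustrative:

// Match the model to the task
import { aiGenerateText, AI_MODELS } from '@/lib/ai';

// Cheap and fast: short classification or a quick reply
const quickCheck = await aiGenerateText({
  prompt: 'Answer yes or no: is this feedback about pricing? ...',
  model: AI_MODELS.FAST,
});

// Heavier analysis: multi-step reasoning over longer input
const analysis = await aiGenerateText({
  prompt: 'Compare these feedback threads and identify the common root cause...',
  model: AI_MODELS.POWERFUL,
});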

Structured Output

Type-safe data extraction with Zod schemas.

// Structured output with schema validation
import { aiGenerateObject, AI_MODELS } from '@/lib/ai';
import { z } from 'zod';

const AnalysisSchema = z.object({
  sentiment: z.enum(['positive', 'neutral', 'negative']),
  topics: z.array(z.string()),
  actionItems: z.array(z.object({
    task: z.string(),
    priority: z.enum(['low', 'medium', 'high']),
  })),
  confidence: z.number().min(0).max(1),
});

const result = await aiGenerateObject({
  prompt: `Analyze this customer feedback: ${feedback}`,
  schema: AnalysisSchema,
  schemaName: 'FeedbackAnalysis',
  schemaDescription: 'Structured analysis of customer feedback',
  model: AI_MODELS.POWERFUL, // Optional; omit to use the default model
});

// result.object is fully typed as z.infer<typeof AnalysisSchema>
const { sentiment, topics, actionItems } = result.object;

Tip: Always use aiGenerateObject when you need structured data. The Zod schema validates the output automatically.

Streaming Responses

Real-time text generation for chat UIs.

// Streaming for chat interfaces
import { aiStreamText } from '@/lib/ai';

export async function POST(request: Request) {
  const { messages } = await request.json();

  const result = await aiStreamText({
    messages,
    system: 'You are a helpful assistant.',
  });

  // Return as streaming response
  return result.toDataStreamResponse();
}

// Or consume the stream manually
const result = await aiStreamText({ prompt: '...' });
for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
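
On the client, the streamed route above is typically consumed with a chat hook. This is a minimal sketch assuming a Next.js client component, the AI SDK v4-style useChat hook from @ai-sdk/react, and a /api/chat route path; adjust all three to match the actual setup.

'use client';

// Minimal chat UI sketch (assumes the v4-style useChat API and a /api/chat route)
import { useChat } from '@ai-sdk/react';

export function ChatPanel() {
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: '/api/chat', // hypothetical path to the streaming route above
  });

  return (
    <form onSubmit={handleSubmit}>
      {messages.map((m) => (
        <p key={m.id}>
          {m.role}: {m.content}
        </p>
      ))}
      <input value={input} onChange={handleInputChange} placeholder="Ask something..." />
    </form>
  );
}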

Feature Prompts (Recommended)

Admin-editable prompts, model selection, and output schemas. Use for fixed platform features.

// Feature Prompts: Admin-editable prompts with dynamic schemas
import { compileFeaturePrompt, executeFeaturePrompt, FIXED_FEATURE_KEYS } from '@/features/feature-prompts';
import { aiGenerateObject, jsonSchemaToZod } from '@/lib/ai';

// 1. Compile: loads template, contexts, model, schema from database
const compiled = await compileFeaturePrompt(FIXED_FEATURE_KEYS.INSIGHT_SUMMARY);

// 2. Execute: apply variable interpolation ([[variable]] syntax)
const prompt = executeFeaturePrompt(compiled, {
  title: insight.title,
  status: insight.status,
});

// 3. Convert JSON Schema to Zod (admin can edit in UI)
const schema = compiled.outputSchema
  ? jsonSchemaToZod(compiled.outputSchema)
  : FALLBACK_SCHEMA; // locally defined Zod fallback (see the sketch below)

// 4. Generate with dynamic model selection
const result = await aiGenerateObject({
  prompt,
  schema,
  schemaName: 'InsightAISummary',
  ...(compiled.modelId && { model: compiled.modelId }),
});

When to use Feature Prompts:
  • Fixed platform features (Designer, Classifier, Insight Summary)
  • Prompts that super admins should be able to edit without deploys
  • AI features that need structured output with admin-defined schemas
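
FALLBACK_SCHEMA in the example above is assumed to be a Zod schema defined next to the feature code and used when no admin-edited output schema exists in the database. A hypothetical definition:

// Hypothetical fallback, used when compiled.outputSchema is not set
import { z } from 'zod';

const FALLBACK_SCHEMA = z.object({
  summary: z.string(),
  keyPoints: z.array(z.string()),
});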

Watch Out

Hardcoding model IDs instead of using constants

Don't

// Hardcoding model IDs and bypassing the @/lib/ai wrappers
import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';

const result = await generateText({
  model: openai('gpt-4'),
  prompt: '...',
});

Do

// Use AI_MODELS constants
import { aiGenerateText, AI_MODELS } from '@/lib/ai';

const result = await aiGenerateText({
  prompt: '...',
  model: AI_MODELS.POWERFUL, // For complex tasks
});

Other pitfalls

  • Missing error handling for AI failures (see the sketch below)
  • Using text generation when structured output is needed
  • Using expensive models for simple tasks
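
AI calls can fail for transient provider reasons or because the model output does not match the schema. A minimal error-handling sketch, assuming the wrappers throw on failure (check @/lib/ai for any custom error types):

// Error-handling sketch: assumes aiGenerateObject throws on provider or schema failures
import { aiGenerateObject } from '@/lib/ai';
import { z } from 'zod';

const TagSchema = z.object({ tags: z.array(z.string()) });

async function tagFeedback(feedback: string) {
  try {
    const result = await aiGenerateObject({
      prompt: `Suggest tags for this feedback: ${feedback}`,
      schema: TagSchema,
    });
    return result.object.tags;
  } catch (error) {
    // Degrade gracefully instead of failing the whole request
    console.error('AI tagging failed', error);
    return [];
  }
}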

Related

  • Background Jobs: Long-running AI tasks
  • Validation: Zod schemas for AI output
  • File Storage: Document uploads for parsing

External Documentation

  • Vercel AI SDK
  • Vercel AI Gateway