# Environment: ucbepic/docetl Frontend Node Environment
| Knowledge Sources | |
|---|---|
| Domains | Infrastructure, Frontend |
| Last Updated | 2026-02-08 01:00 GMT |
## Overview
Node.js 20 environment with Next.js 14, React 18, and Tailwind CSS for the DocWrangler playground frontend.
## Description
The DocWrangler frontend is a Next.js 14 application that provides an interactive IDE for building, running, and debugging DocETL pipelines. It includes a Monaco code editor, a drag-and-drop pipeline builder, real-time WebSocket output streaming, and data table visualization. The UI is built from Radix UI primitives styled with Tailwind CSS and the shadcn/ui component library.
Key frontend integrations:
- Vercel AI SDK for streaming chat with LLM assistants
- WebSocket connection to the FastAPI backend for pipeline execution
- Monaco Editor for YAML pipeline editing
- Supabase for optional hosted version authentication
## Usage
Use this environment when developing or deploying the DocWrangler playground frontend. Required for the interactive pipeline development workflow. Not needed for CLI-only or Python API usage.
## System Requirements
| Category | Requirement | Notes |
|---|---|---|
| OS | Any Node.js-supported OS | Docker builds run on Linux (`node:20-alpine`) |
| Node.js | 20.x | Matches the `node:20-alpine` Docker base image |
| Disk | 500MB+ | For `node_modules` and `.next` build output |
## Dependencies
### Core Framework
- `next` = 14.2.32
- `react` ^18
- `react-dom` ^18
- `typescript` ^5
### UI Components
- `@radix-ui/react-*` (accordion, dialog, popover, select, tabs, etc.)
- `lucide-react` >= 0.441.0 (icons)
- `tailwindcss` ^3.4.1
- `framer-motion` ^11.5.4 (animations)
### Data & Editor
- `@monaco-editor/react` ^4.6.0 (code editor)
- `@tanstack/react-table` ^8.20.5 (data tables)
- `@tanstack/react-query` ^5.59.15 (data fetching)
- `recharts` ^2.10.3 (charts)
- `vega` ^5.33.0 + `vega-lite` ^5.23.0 (visualization)
### LLM Integration
- `@ai-sdk/openai` ^0.0.70
- `@ai-sdk/azure` ^1.0.13
- `ai` ^3.4.29 (Vercel AI SDK)
## Credentials
The following environment variables must be set in `website/.env.local`:
- `OPENAI_API_KEY`: For the AI chat assistant in the playground
- `NEXT_PUBLIC_BACKEND_HOST`: Backend hostname (default: `localhost`)
- `NEXT_PUBLIC_BACKEND_PORT`: Backend port (default: `8000`)
Optional:
- `NEXT_PUBLIC_BACKEND_HTTPS`: Set to `true` for wss:// WebSocket connections
- `MODEL_NAME`: LLM model for UI assistant (default: `gpt-4o-mini`)
- `OPENAI_API_BASE`: Custom OpenAI-compatible API endpoint
- `ANTHROPIC_API_KEY`: For Claude-based UI assistant
- `GEMINI_API_KEY`: For Gemini-based UI assistant
- `NEXT_PUBLIC_HOSTED_DOCWRANGLER`: Set to `true` for hosted deployment at docetl.org
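Putting the required and common optional variables together, a starting `website/.env.local` might look like this (all values are placeholders to replace with your own):

```bash
# Required
OPENAI_API_KEY=sk-your-key-here
NEXT_PUBLIC_BACKEND_HOST=localhost
NEXT_PUBLIC_BACKEND_PORT=8000

# Optional
NEXT_PUBLIC_BACKEND_HTTPS=false
MODEL_NAME=gpt-4o-mini
```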
## Quick Install
```bash
cd website

# Copy environment template, then edit .env.local to add your API keys
cp .env.local.example .env.local

# Install dependencies
npm install

# Development server
npm run dev

# Production build
npm run build && npm start
```
## Code Evidence
WebSocket protocol selection from `website/src/contexts/WebSocketContext.tsx:53`:
```typescript
const isHttps = process.env.NEXT_PUBLIC_BACKEND_HTTPS === "true";
const wsProtocol = isHttps ? "wss" : "ws";
```
Backend API config from `website/src/lib/api-config.ts`:
```typescript
export function getBackendUrl() {
  const host = process.env.NEXT_PUBLIC_BACKEND_HOST || "localhost";
  const port = process.env.NEXT_PUBLIC_BACKEND_PORT || "8000";
  return `http://${host}:${port}`;
}
```
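The two snippets above suggest a combined helper for building WebSocket endpoints. The sketch below is illustrative, not code from the repository: the function name and the `path` parameter are assumptions.

```typescript
// Illustrative sketch (not from the repo): build a WebSocket URL from the
// same NEXT_PUBLIC_* variables that getBackendUrl() reads.
function getWebsocketUrl(path: string = ""): string {
  const host = process.env.NEXT_PUBLIC_BACKEND_HOST || "localhost";
  const port = process.env.NEXT_PUBLIC_BACKEND_PORT || "8000";
  // NEXT_PUBLIC_BACKEND_HTTPS=true switches ws:// to wss://
  const wsProtocol =
    process.env.NEXT_PUBLIC_BACKEND_HTTPS === "true" ? "wss" : "ws";
  return `${wsProtocol}://${host}:${port}${path}`;
}
```

With no environment variables set, this yields `ws://localhost:8000`, matching the defaults listed under Credentials.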
## Common Errors
| Error Message | Cause | Solution |
|---|---|---|
| `WebSocket connection failed` | Backend not running or wrong port | Start backend server; check `NEXT_PUBLIC_BACKEND_HOST/PORT` |
| `CORS error` | Frontend origin not in backend allow list | Set `BACKEND_ALLOW_ORIGINS` on the backend |
| `npm install` failures | Node.js version mismatch | Use Node.js 20.x |
## Compatibility Notes
- Vercel Deployment: The `website/vercel.json` configures the build for Vercel hosting. Set environment variables in Vercel dashboard.
- Docker: When using Docker, the frontend is built with `node:20-alpine` and served via `npm start` alongside the Python backend.
- Development: `npm run dev` enables hot reload for frontend development.
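The Docker note above can be sketched as a minimal frontend Dockerfile. This is an illustrative sketch assuming the standard Next.js build flow; the repository's actual Docker setup may differ (for example, a multi-stage build shared with the Python backend):

```dockerfile
# Sketch: build and serve the DocWrangler frontend on node:20-alpine
FROM node:20-alpine
WORKDIR /app
COPY website/package*.json ./
RUN npm ci
COPY website/ ./
RUN npm run build
EXPOSE 3000
CMD ["npm", "start"]
```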