
System Architecture

Component overview and data flow

Component summary

| Component | Role |
| --- | --- |
| CX Console | Request intake, clarification prompts, confirmation screen, live progress view, final summary and recovery actions |
| Interpreter Action | Convex action (Node.js runtime) that calls OpenRouter to classify the request as a bulk operation or a policy question, grounds answers in KB articles via vector search, and returns structured intent or a Q&A response |
| Job Mutations | `createDraft` resolves target cards, runs the policy engine, and writes a preview job; `confirmJob` fans out per-item execution tasks via the Convex scheduler; `cancelJob` and `retryFailed` handle lifecycle transitions |
| Policy Rules Engine | Deterministic TypeScript function — enforces limit caps (SGD 5 000 max), excludes frozen/cancelled cards, gates >25-card ops behind approval, hard-blocks >200 items. No LLM involved |
| Executor Actions | Convex `internalAction`s that call the (simulated) card API per item, mark outcomes, and re-schedule retries with exponential backoff via the Convex scheduler |
| Convex Scheduler | Built-in durable scheduler — replaces Redis + Celery. Handles staggered fan-out on confirm and exponential backoff on transient failures. All scheduled function IDs are stored so queued items can be cancelled |
| Live Queries | Convex reactive queries push job and item state to the UI in real time — no polling or SSE plumbing required |
| Convex Database | Managed transactional store for jobs, job_items, mock_cards, KB articles (with 1 536-dim vector index), feedback, and metrics events |
| KB / RAG | Reap Help Center articles embedded with OpenAI text-embedding-3-small (via OpenRouter), stored in Convex's built-in vector index. Semantic search runs at request time to ground policy Q&A answers |
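The Policy Rules Engine above is the one component whose behaviour is fully specified by the rules in the table, so it can be sketched as a pure function. This is an illustrative sketch only — the names (`checkPolicy`, `PolicyInput`) are hypothetical and do not reflect the actual `src/lib/policy.ts` API; the thresholds come from the table.

```typescript
// Hypothetical sketch of the deterministic policy checks described above.
// Constants mirror the documented rules; names are illustrative, not the real API.

type CardStatus = "active" | "frozen" | "cancelled";

interface PolicyInput {
  newLimitSgd?: number; // requested spend limit, if the op sets one
  cardStatuses: CardStatus[]; // statuses of all resolved target cards
}

interface PolicyResult {
  allowedCardCount: number; // cards that survive status exclusion
  needsApproval: boolean; // >25-card ops are gated behind approval
  blocked: string | null; // non-null means a hard block: no job is drafted
}

const LIMIT_CAP_SGD = 5000;
const APPROVAL_THRESHOLD = 25;
const HARD_MAX_ITEMS = 200;

function checkPolicy(input: PolicyInput): PolicyResult {
  // Hard block: requested limit above the SGD 5 000 cap
  if (input.newLimitSgd !== undefined && input.newLimitSgd > LIMIT_CAP_SGD) {
    return {
      allowedCardCount: 0,
      needsApproval: false,
      blocked: `limit exceeds SGD ${LIMIT_CAP_SGD} cap`,
    };
  }
  // Exclude frozen/cancelled cards from the target set
  const allowed = input.cardStatuses.filter((s) => s === "active").length;
  // Hard block: more than 200 items
  if (allowed > HARD_MAX_ITEMS) {
    return {
      allowedCardCount: allowed,
      needsApproval: false,
      blocked: `${allowed} items exceeds hard max of ${HARD_MAX_ITEMS}`,
    };
  }
  // >25 surviving cards requires approval before confirm
  return { allowedCardCount: allowed, needsApproval: allowed > APPROVAL_THRESHOLD, blocked: null };
}
```

Because the function is deterministic and side-effect free, the same rules can run identically in `createDraft` (to build the preview) and in unit tests, with no LLM in the loop.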

Synchronous vs Asynchronous boundary

| Synchronous (user waits) | Asynchronous (background, live-push) |
| --- | --- |
| Request acknowledgment | Per-card update execution |
| Intent extraction + KB search | Retry logic with exponential backoff |
| Entity resolution + policy check | Job count aggregation after each item |
| Plan generation + confirmation screen | Job status transitions (in_progress → completed) |
| Job receipt (returned immediately on confirm) | Progress updates (Convex live subscriptions, no polling) |
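The boundary works because confirm only computes a fan-out schedule and returns; the per-item work runs later. A minimal sketch of the staggered schedule, assuming a 250 ms stagger interval (the actual interval is not stated in this document):

```typescript
// Illustrative stagger schedule for fan-out on confirm.
// STAGGER_MS is an assumed value, not the prototype's real constant.
const STAGGER_MS = 250;

function staggerDelays(itemCount: number, staggerMs = STAGGER_MS): number[] {
  // Item i runs staggerMs * i in the future, so execution ramps up
  // instead of hitting the card API for every item at once.
  return Array.from({ length: itemCount }, (_, i) => i * staggerMs);
}

// In the confirm mutation, each delay would feed ctx.scheduler.runAfter(...),
// and the returned scheduled-function ID would be stored on the job item
// so still-queued items can be cancelled later. The mutation then returns
// the job receipt immediately; progress arrives via live queries.
```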

Actual tech stack — built in Dispatch

The stack below is what actually shipped in the Dispatch prototype built for this assessment, not a hypothetical recommendation.
| Layer | What we built with | Why |
| --- | --- | --- |
| Frontend | Next.js / React · TypeScript · Tailwind CSS v4 | App Router + Server Components for fast internal tooling; TypeScript strict mode throughout |
| Unified backend | Convex (mutations · actions · scheduler · live queries · vector search) | Replaces FastAPI + Redis + Celery + Postgres in one layer — real-time subscriptions, durable background tasks, and a transactional database with zero extra infrastructure. Job status updates arrive automatically via reactive queries |
| AI pipeline | OpenRouter → openai/gpt-5.4-mini (intent + Q&A) · text-embedding-3-small (KB embeddings) | Single API key for both chat and embeddings; fast and cheap for structured JSON extraction; temperature 0 for deterministic intent output |
| Policy enforcement | Deterministic TypeScript — `src/lib/policy.ts` | Hard rules (limit caps, excluded statuses, approval thresholds, item-count maximums) require no LLM — deterministic code is cheaper, faster, and auditable |
| Async execution | Convex Scheduler (`ctx.scheduler.runAfter`) | Durable fan-out on job confirm; exponential backoff on transient failures; scheduled function IDs stored on each item so queued items can be cancelled cleanly |
| Storage | Convex managed database · Convex vector index (1 536-dim) | Transactional writes for jobs and items; vector index for KB semantic search — no separate Postgres or pgvector instance needed |
| Testing | Vitest (unit + integration) · Playwright (E2E) | Unit tests cover policy logic and executor retry behaviour; E2E tests cover the confirmation and progress screens end-to-end |
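The exponential-backoff delay used by the executor retries can be sketched as a pure function, which is also what makes the retry behaviour unit-testable in Vitest. The base delay, cap, and attempt limit below are assumptions for illustration; the prototype's actual constants are not given in this document.

```typescript
// Hedged sketch of an exponential-backoff retry delay.
// BASE_DELAY_MS, MAX_DELAY_MS, and MAX_ATTEMPTS are assumed values.
const BASE_DELAY_MS = 1000;
const MAX_DELAY_MS = 30000;
const MAX_ATTEMPTS = 5;

function retryDelayMs(attempt: number): number | null {
  // attempt is 1-based: the first retry waits ~1 s, then 2 s, 4 s, ...
  // capped at MAX_DELAY_MS. A null return means give up and mark the
  // item failed so the user can hit "retry failed" from the summary.
  if (attempt > MAX_ATTEMPTS) return null;
  return Math.min(BASE_DELAY_MS * 2 ** (attempt - 1), MAX_DELAY_MS);
}

// An executor action would pass this delay to ctx.scheduler.runAfter(...)
// when the (simulated) card API returns a transient failure.
```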