Case study

Clotho: AI procurement intake for Fortune 500 Source-to-Pay teams

A live demo of what an AI-augmented intake layer looks like: paste an unstructured RFA email, Claude Sonnet 4.6 extracts the structured procurement fields, and the buyer reviews and submits to the system of record in one click.

Live demo: clotho-pi.vercel.app
Enterprise AI pilot

The problem

Fortune 500 procurement teams lose hours every week manually transcribing requests from unstructured sources. RFA emails. "Recommended for Award" memos. PDF approval packets. SharePoint comment threads. The same supplier shows up under five different names in the vendor master because buyers can't find the existing record and create a new one. Single-source justifications get buried in email threads. Capital project codes and cost centers have to be looked up by hand on every request.

This is the kind of work that exists in every Fortune 500 procurement shop. It's also exactly the kind of work that doesn't belong on a human's plate in 2026.

The solution

1. AI intake form backed by Claude extraction

Paste an RFA email or a "Recommended for Award" memo. Claude Sonnet 4.6 extracts the structured procurement fields and populates a review form: capital project code, budget reference, cost center, supplier name, value, sourcing method, single-source justification, submitter. The buyer edits anything that looks wrong and submits. The ticket lands in Supabase with a generated request ID and a full audit trail entry.

On first page load the demo auto-runs against a pre-loaded sample email, so a visitor sees extraction fire in under two seconds. No click required.
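Under the hood this is one constrained call per paste. Here is a minimal sketch using the Anthropic Python SDK; the field list mirrors the form described above, while the model ID string, prompt wording, and fence-tolerant parser are assumptions for illustration:

```python
import json

# Assumed model ID for Claude Sonnet 4.6; confirm against Anthropic's model list.
MODEL = "claude-sonnet-4-6"

FIELDS = [
    "capital_project_code", "budget_reference", "cost_center",
    "supplier_name", "value", "sourcing_method",
    "single_source_justification", "submitter",
]

def build_prompt(raw_email: str) -> str:
    """Ask for strict JSON limited to the intake fields; null when absent."""
    return (
        "Extract these procurement fields from the email below. Return only "
        "a JSON object with exactly these keys, using null for anything not "
        f"present: {', '.join(FIELDS)}\n\n{raw_email}"
    )

def parse_extraction(text: str) -> dict:
    """Tolerate an optional fenced json block around the model's reply."""
    body = text.strip()
    if body.startswith("```"):
        body = body.split("```")[1].removeprefix("json").strip()
    return json.loads(body)

def extract_fields(raw_email: str) -> dict:
    from anthropic import Anthropic  # deferred so the helpers above need no SDK
    client = Anthropic()  # reads ANTHROPIC_API_KEY from the environment
    msg = client.messages.create(
        model=MODEL,
        max_tokens=1024,
        messages=[{"role": "user", "content": build_prompt(raw_email)}],
    )
    return parse_extraction(msg.content[0].text)
```

Keeping the model pinned to exactly one job per call (extract these keys, nothing else) is what makes the output reviewable in a form rather than free text.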

2. Smart supplier search with trigram pre-filter plus Claude rerank

Type a vendor name like acme valve. Postgres pg_trgm similarity pre-filters the supplier master to the top candidates in milliseconds, then Claude reranks and scores each one by the likelihood it's the same company. The rerank accounts for spelling, punctuation, suffixes (Inc, LLC, Co, Corp), abbreviations, and common variants.

Result: the existing "Acme Valve Co" record surfaces even when the buyer types "acme valves llc." The fourth duplicate never enters the vendor master. The audit trail stays clean.
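Why the pre-filter tolerates those variants is easiest to see in a simplified Python model of trigram similarity. This is illustrative, not the exact pg_trgm algorithm, but it follows the same shape: lowercase, split into words, pad, compare 3-character windows.

```python
import re

def trigrams(s: str) -> set[str]:
    """Rough model of pg_trgm: lowercase, keep alphanumeric words,
    pad each word with two leading spaces and one trailing space,
    then collect every 3-character window."""
    grams: set[str] = set()
    for word in re.findall(r"[a-z0-9]+", s.lower()):
        padded = f"  {word} "
        grams.update(padded[i:i + 3] for i in range(len(padded) - 2))
    return grams

def similarity(a: str, b: str) -> float:
    """Shared trigrams over the union, mirroring pg_trgm's similarity()."""
    ta, tb = trigrams(a), trigrams(b)
    if not ta or not tb:
        return 0.0
    return len(ta & tb) / len(ta | tb)
```

"Acme Valve Co" and "acme valves llc" share most of their trigrams, so they score far above unrelated names; that is what puts the right record in front of Claude before any tokens are spent.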

3. Tracker dashboard

All submitted requests in one sortable, filterable view. Replaces the manual "open SharePoint, scroll through the list, update status" routine that eats the end of every buyer's day.

Architecture

A single Vercel project serves both the React frontend and the FastAPI Python backend. One deploy, one URL, no CORS configuration, no separate backend hosting.

Browser
  |
  v
Vercel (single project)
  |-- React 19 + Vite + Tailwind (static SPA)
  |-- FastAPI via api/index.py (Python 3.12 serverless)
  |
  v
Supabase PostgreSQL          Anthropic Claude Sonnet 4.6
  - clotho schema              - Field extraction
  - pg_trgm index              - Supplier rerank
  - Row-Level Security
  - audit_log table

The frontend builds to static assets under dist/. api/index.py mounts the FastAPI app as a Vercel Python serverless function. Same origin, same deployment pipeline.
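The wiring might look roughly like this in vercel.json; the exact fields are an assumption based on Vercel's conventions, where api/index.py is auto-detected as a Python function and the Vite preset serves the static build:

```json
{
  "rewrites": [
    { "source": "/api/(.*)", "destination": "/api/index" }
  ]
}
```

Requests under /api/ reach the FastAPI app; everything else falls through to the static SPA.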

Engineering decisions that mattered

Trigram pre-filter plus LLM rerank, not "send the whole vendor master to Claude"

A real supplier master has thousands of rows. Sending them all to an LLM on every keystroke is slow and expensive. Postgres pg_trgm similarity pre-filters to the top ~10 candidates in milliseconds, then Claude reranks those ten with reasoning. Fast dumb filter plus slow smart reasoner gives sub-second intelligent search at a fraction of the token cost.
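Sketched in Python, the two stages compose like this. The RPC name comes from the schema section; its parameter names, the prompt wording, and the score format are assumptions, and the network calls stay behind function boundaries:

```python
TOP_K = 10  # pre-filter budget: only these rows ever cost tokens

def prefilter(supabase, query: str) -> list[dict]:
    """Stage 1: cheap pg_trgm similarity inside Postgres. Parameter
    names here are assumed; match them to the real RPC signature."""
    resp = supabase.rpc(
        "search_suppliers_fuzzy", {"q": query, "max_rows": TOP_K}
    ).execute()
    return resp.data

def build_rerank_prompt(query: str, candidates: list[dict]) -> str:
    """Stage 2 input: only the surviving candidates reach Claude."""
    lines = "\n".join(f"- {c['display_name']}" for c in candidates[:TOP_K])
    return (
        f'A buyer typed the supplier name "{query}". For each candidate '
        "below, score from 0 to 100 the likelihood it is the same company, "
        "accounting for spelling, punctuation, suffixes (Inc, LLC, Co, "
        "Corp), and abbreviations. Return a JSON array of objects with "
        '"name" and "score" keys.\n' + lines
    )
```

The prompt never carries more than ten rows, which is what keeps per-keystroke cost flat no matter how large the vendor master grows.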

Single Vercel project for React plus FastAPI

Configured through vercel.json, Vite builds the static frontend and api/index.py exposes the FastAPI app via Vercel's Python runtime. One project, one URL, one deploy, zero CORS. The tradeoff is cold-start latency on the Python function: acceptable for a demo, mitigated in production with a warmer tier or a lightweight keepalive ping.

Isolated Supabase schema per project, not a fresh Supabase instance

The same Supabase instance hosts multiple portfolio demos. Each lives in its own schema with its own Row-Level Security policies. Keeps costs down and makes cross-project data access impossible by construction rather than by convention.

Service-role key scoped to the backend; frontend only touches the anon key

The SPA never sees the service-role key. A compromised frontend or a malicious browser extension cannot bypass Row-Level Security. Administrative writes go through FastAPI, which authenticates the request and then calls Supabase with the service role.

Claude Sonnet 4.6 over Haiku for extraction

Procurement intake needs structured output with high accuracy on domain-specific fields: capital project codes, cost centers, sourcing method enums, single-source justification types. Haiku is faster and cheaper but loses accuracy on the long tail. Sonnet 4.6 is the right cost-versus-accuracy tradeoff for a task where a wrong extraction wastes a buyer's time and a right one saves it.

Schema

The clotho schema enforces Row-Level Security on every table.

suppliers: Vendor master with a trigram index on display name; seed data includes intentional name variants for the fuzzy-match demo
requests: Submitted procurement intakes with generated IDs, value, sourcing method, urgency, and full audit fields
audit_log: Append-only change log capturing every request mutation
document_links: Pointers to the source documents (emails, PDFs) that produced each request
roi_events: Event stream reserved for future ROI analytics

A next_request_id() function generates human-readable IDs for new submissions. The fuzzy-search RPC search_suppliers_fuzzy lives in supabase/migrations/0002_search_functions.sql.
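The exact ID format isn't shown here; as a hypothetical illustration, assuming something like REQ-2026-00042, the generator reduces to a per-year sequence:

```python
from itertools import count

def make_request_id_factory(year: int, start: int = 1):
    """Hypothetical REQ-YYYY-NNNNN format. The real next_request_id()
    lives in Postgres and draws from a sequence, so two concurrent
    serverless instances can never mint the same ID."""
    seq = count(start)
    def next_request_id() -> str:
        return f"REQ-{year}-{next(seq):05d}"
    return next_request_id
```

Keeping this in the database rather than application code is the point: serverless functions share no memory, so an in-process counter like the one above only works inside a single process.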

Stack

Frontend: React 19, TypeScript 5.7, Vite 6, Tailwind CSS 3
Backend: Python 3.12, FastAPI, Pydantic, httpx
Database: Supabase (PostgreSQL), pg_trgm, Row-Level Security
AI: Anthropic Claude Sonnet 4.6 via the official Python SDK
Deploy: single Vercel project serving the React SPA and the Python FastAPI serverless function from the same origin
Auth: Supabase-managed JWT for API authorization
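Assuming Supabase's HS256 shared-secret tokens, the backend's authorization check reduces to an HMAC verification. A standard-library-only sketch of that idea; a real deployment would use a maintained JWT library and also validate exp, aud, and iss:

```python
import base64
import hashlib
import hmac
import json

def b64url_decode(part: str) -> bytes:
    """Base64url with the padding that JWTs strip off restored."""
    return base64.urlsafe_b64decode(part + "=" * (-len(part) % 4))

def verify_hs256(token: str, secret: str) -> dict:
    """Check the signature and return the claims; raises on tampering."""
    header_b64, payload_b64, sig_b64 = token.split(".")
    signing_input = f"{header_b64}.{payload_b64}".encode()
    expected = hmac.new(secret.encode(), signing_input, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, b64url_decode(sig_b64)):
        raise ValueError("bad signature")
    return json.loads(b64url_decode(payload_b64))
```

The constant-time compare_digest matters: a naive == comparison on signatures leaks timing information.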

Why this matters for enterprise buyers

Clotho is built the way a production procurement-intake layer should be built. Not as a demo that can't scale. Not as an AI widget bolted onto a spreadsheet. As a real application with a typed frontend, a typed backend, a typed database schema, Row-Level Security at every table, an audit trail on every mutation, and an LLM that's constrained to do exactly one job per surface rather than being given free rein to hallucinate.

If you're a procurement, finance, or manufacturing team thinking about adding AI intake, supplier deduplication, spec document parsing, or variance commentary to a system that already runs real workloads, this is the shape of pilot I ship.

Want something like this for your team?

Fixed fee, scope in writing, and the deliverable is working software with a handoff call. Four to eight weeks. NDAs welcome.