Contractor OS

An AI-powered ERP system built for security camera installation contractors. One system replaces quoting, invoicing, scheduling, inventory, and client management — with a conversational AI agent at its core.

One contractor operates an entire business through this system — doing the work that would normally require a team of 5 people.

Next.js · FastAPI · Ollama · PostgreSQL

The Problem

Security camera installation contractors run lean operations, but administrative overhead does not scale linearly with crew size. Quoting, invoicing, scheduling, inventory tracking, and client management each demand their own workflow. The standard solution — hiring administrative staff or stitching together 5+ SaaS products — is capital-inefficient and brittle.

What It Does

Contractor OS is a full ERP with a conversational AI agent (Neural Core) as the primary interface. The contractor talks to the system in natural language; the system routes to the appropriate backend operation.

The AI agent recognizes 30+ intents across the full business domain. The backend exposes 79 API endpoints spanning quoting, invoicing, scheduling, inventory, client management, financial reporting, and administration. The data layer is a 23-table PostgreSQL schema modeling the complete business domain.
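The routing step can be sketched as a small intent registry. This is a minimal illustration, not the actual Contractor OS implementation: the intent names, keywords, and handler bodies are all assumptions.

```python
import re

# Hypothetical intent registry: natural-language text is matched to a
# named intent, which dispatches to a registered backend handler.
HANDLERS = {}

def intent(name):
    """Register a function as the handler for a recognized intent."""
    def register(fn):
        HANDLERS[name] = fn
        return fn
    return register

@intent("create_quote")
def create_quote(text):
    # In the real system this would call the quoting endpoints.
    return "draft quote prepared (pending human approval)"

@intent("check_inventory")
def check_inventory(text):
    return "inventory levels retrieved"

# Toy keyword table standing in for the LLM-based intent classifier.
KEYWORDS = {
    "quote": "create_quote",
    "stock": "check_inventory",
    "inventory": "check_inventory",
}

def route(text):
    """Map a natural-language request to a registered intent handler."""
    for word, name in KEYWORDS.items():
        if re.search(rf"\b{word}\b", text.lower()):
            return HANDLERS[name](text)
    return "unrecognized intent"

print(route("Put together a quote for 8 cameras"))
# → draft quote prepared (pending human approval)
```

In the real system the keyword table would be replaced by LLM classification, but the dispatch shape — one intent name per backend operation — stays the same.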

The system enforces a hard constraint: the AI can read any financial data, but every write operation requires human approval. The system will surface a quote, draft an invoice, or flag a discrepancy — but it never autonomously commits a financial transaction.
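That read/write asymmetry can be expressed as a simple gate at the dispatch layer. A sketch under assumptions — the operation names and return shapes below are illustrative, not the real API:

```python
# Reads execute immediately; financial writes are queued until a
# human explicitly approves them. Names here are hypothetical.
READ_OPS = {"get_invoice", "get_quote", "financial_report"}
WRITE_OPS = {"create_invoice", "send_invoice", "record_payment"}

pending_approvals = []

def execute(op, payload, approved_by=None):
    if op in READ_OPS:
        # The AI may read any financial data without restriction.
        return {"status": "ok", "op": op}
    if op in WRITE_OPS:
        if approved_by is None:
            # No human sign-off yet: queue it, never auto-commit.
            pending_approvals.append((op, payload))
            return {"status": "pending_approval", "op": op}
        return {"status": "committed", "op": op, "by": approved_by}
    raise ValueError(f"unknown operation: {op}")
```

The key property is that the write path has no branch that commits without `approved_by` — the constraint is structural, not a prompt-level instruction.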

By the Numbers

  • 30+ recognized intents across the conversational AI agent
  • 79 API endpoints across the full backend
  • 23 tables in the PostgreSQL schema
  • 0 financial write operations without human approval
  • 1 person built the entire system

Key Design Decisions

AI assists, never decides. Every financial write goes through a human. This is not a technical limitation — it is a deliberate constraint on where autonomous systems should have authority.

Local inference, no API dependency. LLM inference runs entirely local via Ollama. No token costs, no data leaving the system.
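Calling a local Ollama daemon looks roughly like the sketch below, assuming the default port (11434) and an illustrative model name; only the standard library is used, and no request leaves the machine.

```python
import json
import urllib.request

# Ollama's default local endpoint for one-shot generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(prompt, model="llama3"):
    """Build the JSON payload for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(prompt, model="llama3"):
    """Run inference against the local daemon; nothing leaves localhost."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_request(prompt, model)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Because the endpoint is a loopback address, there are no per-token costs and no third-party data-processing agreement to worry about — the trade-off is that latency and quality depend on local hardware and the chosen model.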

One system, not five integrations. The entire operational surface lives in a single coherent data model. Every entity — client, job, invoice, inventory item — is a first-class citizen with full relational context.
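The payoff of a single relational model is that any entity can be reached from any other with a join rather than a cross-SaaS sync. A minimal sketch using an in-memory SQLite database — the table and column names are illustrative, a tiny slice of the actual 23-table schema:

```python
import sqlite3

db = sqlite3.connect(":memory:")
# Three of the hypothetical tables: client -> job -> invoice.
db.executescript("""
CREATE TABLE clients  (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE jobs     (id INTEGER PRIMARY KEY,
                       client_id INTEGER REFERENCES clients(id),
                       site TEXT);
CREATE TABLE invoices (id INTEGER PRIMARY KEY,
                       job_id INTEGER REFERENCES jobs(id),
                       total REAL);
""")
db.execute("INSERT INTO clients VALUES (1, 'Acme Retail')")
db.execute("INSERT INTO jobs VALUES (1, 1, 'Warehouse 4')")
db.execute("INSERT INTO invoices VALUES (1, 1, 2400.00)")

# One query spans the whole chain: invoice -> job -> client.
row = db.execute("""
    SELECT c.name, j.site, i.total
    FROM invoices i
    JOIN jobs j ON j.id = i.job_id
    JOIN clients c ON c.id = j.client_id
""").fetchone()
print(row)  # → ('Acme Retail', 'Warehouse 4', 2400.0)
```

With five separate SaaS products, answering "which client does this invoice belong to, and for which site?" would require API calls across systems and ID mapping; here it is a single join.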