Construction & ESPC | AI-Native Procurement Platform

Problem:

Municipal procurement teams were managing intake, classification, and approval workflows manually: disconnected spreadsheets, email threads, and no consolidated view of spend data.

What I built:

An AI-native procure-to-pay platform, built from zero as the sole engineer: a multi-tenant Django application with a semantic-similarity spend classification pipeline, a multi-agent execution layer for autonomous procurement task handling, and Azure cloud deployment. Full ownership of architecture, security, compliance, and AI pipeline development.

Result:

Over 70% reduction in manual spend classification effort, validated against live municipal data. The platform is in active development for deployment across multiple municipalities.

Stack: Python, Django, Azure, semantic similarity models, multi-agent pipelines
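The multi-agent execution layer can be pictured as a dispatcher that routes typed tasks to registered handler agents. This is a minimal sketch of that pattern, not the production design; the agent names, task shapes, and registry API here are illustrative.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Task:
    kind: str        # task type used for routing, e.g. "classify_spend"
    payload: dict    # task-specific inputs

class AgentRegistry:
    """Routes tasks to whichever agent registered for their kind."""

    def __init__(self):
        self._agents: dict[str, Callable] = {}

    def register(self, kind: str):
        def wrap(fn):
            self._agents[kind] = fn
            return fn
        return wrap

    def execute(self, task: Task) -> dict:
        agent = self._agents.get(task.kind)
        if agent is None:
            raise ValueError(f"no agent for task kind {task.kind!r}")
        return agent(task)

registry = AgentRegistry()

@registry.register("classify_spend")
def classify_agent(task: Task) -> dict:
    # Stand-in for the real semantic classification pipeline.
    return {"status": "classified", "item": task.payload["description"]}

result = registry.execute(Task("classify_spend", {"description": "office chairs"}))
```

Routing by task kind keeps each agent independently testable and lets new task types be added without touching the dispatch logic.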

Private Equity | Technology Due Diligence

Problem:

IT due diligence on acquisition targets was taking six to eight weeks, creating timeline risk on deals and inconsistent outputs across engagements.

What I built:

A retrieval-augmented generation system that ingested vendor documentation, infrastructure inventories, and security posture materials to accelerate analysis and standardize assessment outputs.
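The core retrieve-then-generate pattern behind such a system can be sketched in a few lines. This is illustrative only: a keyword-overlap scorer stands in for embedding-based vector search, and the document snippets and function names are invented for the example.

```python
# Tiny corpus standing in for ingested diligence materials.
DOCUMENTS = [
    ("vendor_contracts.txt", "Hosting contract with vendor renews annually; 99.9% uptime SLA."),
    ("infra_inventory.txt", "Inventory: 14 VMs on Azure, two PostgreSQL databases, no DR site."),
    ("security_posture.txt", "MFA enforced for admins; no SOC 2 report available."),
]

def tokens(text: str) -> set:
    # Crude tokenizer; a real pipeline would embed text instead.
    return set(text.lower().replace(";", " ").replace(".", " ").split())

def retrieve(query: str, k: int = 2):
    # Rank documents by term overlap with the query and keep the top k.
    q = tokens(query)
    ranked = sorted(DOCUMENTS, key=lambda d: len(q & tokens(d[1])), reverse=True)
    return ranked[:k]

def build_prompt(query: str) -> str:
    # Assemble retrieved context plus the question for the generation step.
    context = "\n".join(f"[{name}] {text}" for name, text in retrieve(query))
    return f"Answer using only the sources below.\n{context}\n\nQuestion: {query}"

prompt = build_prompt("is there a disaster recovery DR site")
```

Grounding generation in retrieved source passages is what standardizes outputs across engagements: every assessment answer traces back to a named document.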

Result:

Assessment cycle reduced from approximately two months to two weeks. Used across roughly half a dozen PE-backed acquisition engagements.

Professional Services | Document Intelligence

Problem:

An AEC services firm was spending significant time on proposal and contract generation — experienced staff rebuilding similar documents from scratch on every engagement.

What I built:

An NLP pipeline to automate proposal and contract generation, drawing from structured inputs and a corpus of prior documents to produce first drafts automatically.

Result:

Reduced time-to-draft on standard proposal and contract workflows. Freed senior staff from document assembly work.
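The draft-generation step can be sketched as template filling plus nearest-match retrieval from a prior-document corpus. This is a toy illustration under assumed inputs; the scope text, field names, and matching logic are invented, and a real pipeline would use semantic similarity rather than word overlap.

```python
from string import Template

# Tiny stand-in for a corpus of prior scopes of work.
PRIOR_SCOPES = {
    "bridge inspection": "Scope: visual inspection, load rating review, and final report.",
    "site survey": "Scope: topographic survey, boundary staking, and CAD deliverables.",
}

PROPOSAL = Template(
    "Proposal for $client\nProject: $project\n$scope\nEstimated fee: $fee"
)

def nearest_scope(project: str) -> str:
    # Pick the prior scope whose key shares the most words with the project name.
    words = set(project.lower().split())
    best = max(PRIOR_SCOPES, key=lambda k: len(words & set(k.split())))
    return PRIOR_SCOPES[best]

def draft(client: str, project: str, fee: str) -> str:
    # Merge structured inputs with the retrieved prior-document scope.
    return PROPOSAL.substitute(
        client=client, project=project, fee=fee, scope=nearest_scope(project)
    )

text = draft("Acme County", "Route 9 bridge inspection", "$18,500")
```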

Healthcare Operations | Automation & Data Engineering

Problem:

A PE-backed healthcare operator running multiple clinic locations had no consolidated view of claims data. Financial reporting required manual extraction from each location individually.

What I built:

RPA pipelines to extract and consolidate claims data across clinic systems into a unified data layer, as the sole pipeline developer on a four-person engagement.

Result:

Enabled centralized reporting across locations for the first time, eliminating manual per-location extraction.
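The consolidation step reduces to mapping each clinic system's export schema onto one unified claims schema. A minimal sketch, with clinic names, column names, and sample rows all invented for illustration:

```python
# Per-source column mappings: {source column -> unified column}.
CLINIC_SCHEMAS = {
    "clinic_a": {"claim_no": "claim_id", "amt": "amount", "dos": "service_date"},
    "clinic_b": {"ClaimID": "claim_id", "Billed": "amount", "ServiceDate": "service_date"},
}

def normalize(clinic: str, row: dict) -> dict:
    # Rename source columns to the unified schema and tag the origin clinic.
    mapping = CLINIC_SCHEMAS[clinic]
    out = {unified: row[src] for src, unified in mapping.items()}
    out["source_clinic"] = clinic
    return out

def consolidate(batches: dict) -> list:
    # Flatten all per-clinic batches into one unified claims table.
    return [normalize(clinic, row) for clinic, rows in batches.items() for row in rows]

claims = consolidate({
    "clinic_a": [{"claim_no": "A-1", "amt": 120.0, "dos": "2024-01-05"}],
    "clinic_b": [{"ClaimID": "B-9", "Billed": 75.5, "ServiceDate": "2024-01-06"}],
})
```

Keeping the schema maps as data rather than code is what lets new clinic systems be onboarded without rewriting the pipeline.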

Technology Consulting | Practice Development

Problem:

A growing consulting firm had no AI service offering, no delivery methodology, and no go-to-market materials — despite increasing client demand.

What I built:

End-to-end AI practice from scratch: service design, delivery methodology, pricing frameworks, proposal templates, and go-to-market materials.

Result:

Became the foundation for the firm's AI consulting work. Generated over $500K in presales within the first year.

Open Source

Libraries and tools I've built and published publicly.

Python | NLP / Taxonomy Classification | Active

semtax

Zero-shot hierarchical taxonomy classification using semantic embeddings. Classifies free-text descriptions against the UNSPSC taxonomy — segment, family, class, and commodity — with confidence scoring, ambiguity detection, and batch processing support. No training data, API keys, or labeled examples required. Runs fully local.

Spend classification is one of the most common bottlenecks in procurement automation — and most solutions require training data, external APIs, or expensive vendors. semtax exists to make it instant, local, and free.

Stack: Python, sentence-transformers, semantic embeddings
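The core idea can be shown in miniature: embed the free-text description and each taxonomy label, score by cosine similarity, and flag close calls as ambiguous. This is a toy sketch, not semtax's implementation; a bag-of-words vector stands in for the sentence-transformer embedding, and the codes and labels are a tiny UNSPSC-like slice.

```python
import math

def embed(text: str) -> dict:
    # Normalized bag-of-words vector; semtax uses sentence-transformer
    # embeddings here instead.
    vec: dict = {}
    for w in text.lower().split():
        vec[w] = vec.get(w, 0.0) + 1.0
    norm = math.sqrt(sum(v * v for v in vec.values()))
    return {w: v / norm for w, v in vec.items()}

def cosine(a: dict, b: dict) -> float:
    return sum(v * b.get(w, 0.0) for w, v in a.items())

# Illustrative segment-level slice of a UNSPSC-like hierarchy.
SEGMENTS = {
    "43000000": "information technology computer equipment",
    "42000000": "medical equipment supplies",
    "56000000": "furniture furnishings",
}

def classify(description: str, margin: float = 0.1):
    # Score every label, return best code, its confidence, and an
    # ambiguity flag when the runner-up is within `margin`.
    q = embed(description)
    scores = sorted(
        ((cosine(q, embed(label)), code) for code, label in SEGMENTS.items()),
        reverse=True,
    )
    (best_score, best_code), (runner_up, _) = scores[0], scores[1]
    return best_code, best_score, (best_score - runner_up) < margin

code, confidence, ambiguous = classify("desktop computer and monitor")
```

semtax applies the same score-and-threshold idea at each level of the hierarchy (segment, then family, class, and commodity), which is what makes it zero-shot: only the label text is needed, never training examples.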

Python | MCP Infrastructure | Archived

mcp-tool-kit

A Python toolkit for building Model Context Protocol servers, letting AI agents execute actions and access external resources through multiple tools hosted on a single server. Built and open-sourced before MCP became a mainstream standard. Reached 100+ GitHub stars and had active users before being archived.