HECCO — AI Companion for Wellness & Fitness

Healthcare / Voice AI


500K+ lines. 42 feature domains. Zero hallucination on medical data. Both app stores. Azure AI chat engine with 20+ healthcare tools, Vertex AI document intelligence, real-time voice agents. Evaluation pipeline processing 10,000+ interactions daily.


The Problem

A preventive healthcare startup needed a voice-first AI companion for the Indian market. Users talk to "Dr. Tess" like they would talk to a doctor. The AI listens, pulls real health data from wearables and medical reports, and responds with personalized guidance. Not a chatbot. A voice-powered health intelligence system with real-time bidirectional audio, medical document analysis, wearable sync across platforms, and subscription billing. We co-founded the product and led all technical architecture and AI systems.

What Made It Hard

The core challenge was voice. Dr. Tess uses Azure Realtime API with WebRTC for bidirectional audio streaming. Sub-second intent detection across 36 different intent types while simultaneously pulling context from wearables, medical documents, and user health goals. A single voice query like "how's my heart doing?" touches heart rate data from Apple HealthKit, blood pressure from a scanned lab report, and cardiac risk scores from the body systems classifier. All in real-time, all in one context window.
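A minimal sketch of that context assembly step, with hypothetical fetcher names and stub data standing in for the HealthKit sync, the document store, and the body systems classifier:

```typescript
// Sketch: parallel context assembly for a detected voice intent.
// Fetcher names and values are illustrative, not the production API.
type ContextSource = () => Promise<Record<string, unknown>>;

const sources: Record<string, ContextSource[]> = {
  heart_health: [
    async () => ({ restingHeartRate: 62 }),     // wearable data (HealthKit)
    async () => ({ bloodPressure: "118/76" }),  // latest lab-report extraction
    async () => ({ cardiacRiskScore: 0.12 }),   // body systems classifier
  ],
};

// Fetch every source for the intent concurrently and merge the results
// into one object that gets injected into the session's context window.
async function buildContext(intent: string): Promise<Record<string, unknown>> {
  const fetchers = sources[intent] ?? [];
  const parts = await Promise.all(fetchers.map((f) => f()));
  return Object.assign({}, ...parts);
}
```

Running the fetchers with `Promise.all` rather than sequentially is what keeps the context lookup inside a sub-second voice turn.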

Medical document processing was the second hard problem. Lab reports don't follow a standard format. A blood panel mentions cardiac markers in a table, liver enzymes in a footnote, and kidney function in a reference range column. Flat extraction catches the table and misses the rest. We built a two-phase extraction pipeline that classifies documents into 15 types, then maps findings across 15 body systems with parameter-level interpretation (abnormal/normal/borderline, severity scoring, patient-friendly descriptions).
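The second phase's parameter-level interpretation can be sketched like this; the reference-range table is a hypothetical two-entry slice of the real 15-system map, and in production the values come from the LLM extraction rather than hard-coded input:

```typescript
// Sketch: parameter-level interpretation after extraction.
type Interpretation = "normal" | "borderline" | "abnormal";

interface Finding {
  parameter: string;
  value: number;
  bodySystem: string;
  interpretation: Interpretation;
}

// Illustrative reference ranges mapped to body systems.
const referenceRanges: Record<string, { low: number; high: number; system: string }> = {
  ALT: { low: 7, high: 56, system: "hepatic" },
  creatinine: { low: 0.7, high: 1.3, system: "renal" },
};

function interpret(parameter: string, value: number): Finding {
  const ref = referenceRanges[parameter];
  if (!ref) throw new Error(`no reference range for ${parameter}`);
  // Values within 10% of either bound are flagged borderline.
  const margin = (ref.high - ref.low) * 0.1;
  let interpretation: Interpretation = "normal";
  if (value < ref.low || value > ref.high) interpretation = "abnormal";
  else if (value < ref.low + margin || value > ref.high - margin) interpretation = "borderline";
  return { parameter, value, bodySystem: ref.system, interpretation };
}
```

Severity scoring and the patient-friendly descriptions layer on top of the same per-parameter structure.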

The third: healthcare compliance. FHIR R4 for health data interoperability. ABHA integration for India's national health system. Wearable data flowing from Fitbit, Apple Health, and Google Fit through a BigQuery analytics pipeline with 1-minute latency from device to dashboard.
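What FHIR R4 interoperability looks like in practice: each wearable sample is wrapped as a standard Observation resource. A sketch for a heart-rate reading (LOINC code 8867-4, UCUM `/min`), with illustrative values:

```typescript
// Sketch: wrapping a wearable heart-rate sample as a FHIR R4 Observation.
function toFhirHeartRate(patientId: string, bpm: number, at: string) {
  return {
    resourceType: "Observation",
    status: "final",
    category: [{
      coding: [{
        system: "http://terminology.hl7.org/CodeSystem/observation-category",
        code: "vital-signs",
      }],
    }],
    // LOINC 8867-4 is the standard code for heart rate.
    code: { coding: [{ system: "http://loinc.org", code: "8867-4", display: "Heart rate" }] },
    subject: { reference: `Patient/${patientId}` },
    effectiveDateTime: at,
    valueQuantity: {
      value: bpm,
      unit: "beats/minute",
      system: "http://unitsofmeasure.org",
      code: "/min", // UCUM code
    },
  };
}
```

Because every reading is a plain Observation, the same records flow unchanged into the ABHA-linked health record and the BigQuery analytics pipeline.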

The Architecture

Backend: 49 NestJS modules, 172 services, 44 REST controllers. PostgreSQL with 23+ tables via Drizzle ORM. Redis-backed Bull queues processing 10,000+ jobs daily.

Voice & AI: The voice agent (Dr. Tess) runs on Azure Realtime API with WebRTC for real-time bidirectional audio. Intent detection uses a hybrid rule-based + LLM pipeline with parallel processing. Specialized healthcare agents handle different domains (nutrition, fitness, general medicine, dermatology, Ayurveda). Document analysis uses GPT-4o-mini for classification and Gemini 2.5 Pro for extraction. Right model for the right task.
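The hybrid intent pipeline, sketched: a rule table catches the unambiguous queries with no model round-trip, and only the misses fall through to the LLM. The two rules and the `classifyWithLLM` stub are illustrative, not the production 36-intent set:

```typescript
// Sketch: hybrid rule-based + LLM intent detection.
const rules: Array<{ pattern: RegExp; intent: string }> = [
  { pattern: /\b(heart|pulse|bpm)\b/i, intent: "heart_health" },
  { pattern: /\b(sleep|slept)\b/i, intent: "sleep_analysis" },
];

// Placeholder for the LLM classification call used on ambiguous queries.
async function classifyWithLLM(query: string): Promise<string> {
  return "general_medicine";
}

async function detectIntent(query: string): Promise<string> {
  const hit = rules.find((r) => r.pattern.test(query));
  if (hit) return hit.intent;      // fast path: deterministic, zero latency cost
  return classifyWithLLM(query);   // slow path: model decides
}
```

The fast path is what makes sub-second detection feasible at voice latency; the LLM only pays its round-trip cost on the queries rules can't resolve.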

Mobile: Expo SDK 53 with React Native. 12 React Context providers. Custom native module for Apple HealthKit + Google Health Connect. iOS Live Activities (Dynamic Island) for voice sessions and run tracking. Territory-based fitness gamification using H3 hexagonal geospatial indexing.
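A sketch of the territory mechanic: every GPS point along a run resolves to an H3 cell, and claiming a run means claiming its cells. In production the cell id comes from h3-js (`latLngToCell(lat, lng, resolution)`); here `toCell` is a self-contained stand-in so the example runs without the library:

```typescript
// Hypothetical stand-in for h3-js latLngToCell.
function toCell(lat: number, lng: number, res: number): string {
  return `h3:${res}:${lat.toFixed(3)}:${lng.toFixed(3)}`;
}

const territory = new Map<string, string>(); // cell id -> owning user id

// Claim every cell along a run's GPS trace; takeovers overwrite the
// previous owner. Returns the number of cells newly gained this run.
function claimRun(userId: string, trace: Array<[number, number]>, res = 9): number {
  let claimed = 0;
  for (const [lat, lng] of trace) {
    const cell = toCell(lat, lng, res);
    if (territory.get(cell) !== userId) {
      territory.set(cell, userId);
      claimed++;
    }
  }
  return claimed;
}
```

Keying territory by cell id is the point of the H3 choice: overlap checks and leaderboard aggregation become map lookups instead of polygon intersection math.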

Monetization: Freemium with Razorpay billing. Usage metering on voice minutes, chat count, document uploads. Feature limits enforced per tier.
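The enforcement pattern, sketched with in-memory counters and illustrative limits (not the production tiers, which live in the database and meter against billing periods):

```typescript
// Sketch: per-tier feature limits with usage metering.
interface TierLimits { voiceMinutes: number; chats: number; documentUploads: number }

const tiers: Record<"free" | "premium", TierLimits> = {
  free: { voiceMinutes: 10, chats: 50, documentUploads: 3 },      // illustrative caps
  premium: { voiceMinutes: 300, chats: 2000, documentUploads: 100 },
};

const usage = new Map<string, Partial<Record<keyof TierLimits, number>>>();

// Record the usage and return true if the user is under their tier's cap;
// return false (without recording) when the limit is hit.
function tryConsume(
  userId: string,
  tier: keyof typeof tiers,
  feature: keyof TierLimits,
  amount = 1,
): boolean {
  const u = usage.get(userId) ?? {};
  const used = u[feature] ?? 0;
  if (used + amount > tiers[tier][feature]) return false; // cap hit: surface upgrade prompt
  u[feature] = used + amount;
  usage.set(userId, u);
  return true;
}
```

Every metered feature (a voice minute, a chat, a document upload) goes through the same gate, so adding a new limit is one row in the tier table rather than new enforcement code.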

The Outcome

Both app stores. 5,000+ users. 500K+ lines of production code. Shipped in 10 months. Users talk to Dr. Tess about their health, and the system pulls real data from wearables, scanned medical reports, and health history to give responses grounded in actual patient data. Voice-first, not text-first with voice bolted on.

Tech Stack

React Native · NestJS · Azure AI Projects · Vertex AI · BigQuery · Health Connect · HealthKit · FHIR R4 · Razorpay