India's healthcare system has a geographic paradox: 70% of doctors practice in cities while 65% of the population lives in rural areas. Many Primary Health Centers (PHCs) have zero internet connectivity, yet doctors make critical diagnostic decisions in complete isolation.
We realized: quantized LLMs are now small enough to run completely offline on any laptop. Why shouldn't rural doctors have access to the same diagnostic AI as specialists in metro hospitals?
ClinicAI was born from this simple insight: Medical AI that works offline, with complete privacy, on any hardware.
ClinicAI is an offline-first medical AI assistant for healthcare professionals in remote areas. It:
✓ Runs 100% offline on any desktop with zero internet requirement
✓ Provides medical consultation using quantized LLMs (3-8B parameters)
✓ Includes emergency risk detection and triage capability
✓ Supports English + Hindi (easy to add more languages)
✓ Stores everything locally in SQLite (complete privacy)
✓ Works on 4GB+ RAM laptops (typical clinic hardware)
Real scenario: A rural doctor encounters a patient with chest pain at 2 AM, with no internet. They open ClinicAI, describe the symptoms, get a differential diagnosis with an emergency flag, and make a confident referral decision. The patient reaches a specialist in time.
- Frontend: React 18 + TypeScript (modern UX, type safety)
- Desktop: Tauri v2 (Rust-powered, 25MB bundle, cross-platform)
- Backend: Rust (memory safety, performance for AI tasks)
- AI: llama.cpp (CPU-optimized LLM inference, quantization support)
- Database: SQLite (embedded, offline-first, no server needed)
- Styling: Tailwind CSS (healthcare-appropriate, responsive)
Architecture: IPC-based (React frontend ↔ Tauri IPC ↔ Rust backend ↔ llama.cpp inference ↔ SQLite storage)
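As a rough sketch, the Rust side of that IPC contract could look like the following. The command name, payload fields, and stubbed logic are illustrative assumptions, not ClinicAI's actual API:

```rust
use serde::{Deserialize, Serialize};

/// Payload sent from the React frontend over Tauri IPC.
/// Field names here are illustrative, not the real ClinicAI contract.
#[derive(Deserialize)]
pub struct ConsultRequest {
    pub symptoms: String,
    pub language: String, // "en" or "hi"
}

/// Response returned to the frontend for rendering.
#[derive(Serialize)]
pub struct ConsultResponse {
    pub diagnosis: String,
    pub emergency: bool,
}

/// Exposed to JavaScript via Tauri's IPC bridge; the frontend calls it with
/// `invoke("consult", { req })` and the command is registered through
/// `tauri::generate_handler![consult]`.
#[tauri::command]
pub fn consult(req: ConsultRequest) -> Result<ConsultResponse, String> {
    // Stubs standing in for the real llama.cpp inference and risk detector.
    let diagnosis = format!(
        "Differential diagnosis for: {} (language: {})",
        req.symptoms, req.language
    );
    let emergency = req.symptoms.to_lowercase().contains("chest pain");
    Ok(ConsultResponse { diagnosis, emergency })
}
```

Keeping the contract this small is what lets the frontend stay a thin, type-safe view over the Rust backend.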
Everything runs locally. No network calls, no cloud dependency.
Model Size vs Performance: Solved through GGUF quantization (4-5 bit). Llama 3.1 8B now runs in 4.7GB.
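For context, a quantized GGUF model can be exercised entirely on CPU by shelling out to llama.cpp's bundled CLI. The sketch below assumes a recent llama.cpp build where the binary is named `llama-cli` (older builds shipped it as `main`); the model path and prompt are illustrative:

```rust
use std::process::Command;

fn main() -> std::io::Result<()> {
    // Run a 4-bit quantized Llama 3.1 8B model fully offline on CPU.
    let output = Command::new("llama-cli")
        .args([
            "-m", "models/llama-3.1-8b-instruct-q4_k_m.gguf", // ~4.7 GB at 4-5 bit
            "-p", "Patient presents with chest pain radiating to the left arm.",
            "-n", "256",     // max tokens to generate
            "--temp", "0.2", // low temperature for more consistent clinical output
        ])
        .output()?;

    println!("{}", String::from_utf8_lossy(&output.stdout));
    Ok(())
}
```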
Rust + React Learning Curve: A clear Tauri IPC contract made it smooth.
Medical Accuracy & Liability: Built emergency detection, audit trails, and liability disclaimers into the core.
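As a sketch of how such red-flag detection might work (the keyword list and function name are illustrative assumptions, not ClinicAI's actual triage logic, which would need clinician review):

```rust
/// Red-flag symptoms that should trigger an emergency banner and referral
/// prompt. Illustrative list only.
const RED_FLAGS: &[&str] = &[
    "chest pain",
    "shortness of breath",
    "loss of consciousness",
    "severe bleeding",
    "slurred speech",
];

/// Returns the matched red flags so the UI can show *why* a case was
/// escalated, which also feeds the audit trail.
fn detect_emergency(symptoms: &str) -> Vec<&'static str> {
    let text = symptoms.to_lowercase();
    RED_FLAGS
        .iter()
        .copied()
        .filter(|flag| text.contains(flag))
        .collect()
}

fn main() {
    let flags = detect_emergency("Sudden chest pain and shortness of breath at rest");
    if !flags.is_empty() {
        println!("EMERGENCY: matched red flags: {:?}", flags);
    }
}
```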
Hindi NLP Support: Modern multilingual LLMs handle this with simple prompt engineering.
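For instance, that prompt engineering can be as simple as injecting a language instruction into the system prompt. The template below is an assumed sketch, not ClinicAI's actual prompt:

```rust
/// Builds a system prompt for the multilingual model. The "en"/"hi" codes
/// and wording are assumptions for illustration.
fn build_prompt(symptoms: &str, language: &str) -> String {
    let lang_instruction = match language {
        "hi" => "Respond in Hindi (Devanagari script).",
        _ => "Respond in English.",
    };
    format!(
        "You are a clinical decision-support assistant for rural primary care. \
         Provide a differential diagnosis and flag emergencies. {}\n\nPatient symptoms: {}",
        lang_instruction, symptoms
    )
}

fn main() {
    println!("{}", build_prompt("बुखार और खांसी तीन दिन से", "hi"));
}
```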
Cross-Platform: Tauri handled platform differences automatically.
Database Persistence: SQLite + Rust FFI made local search/storage easy.
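A minimal sketch of that storage layer using the `rusqlite` crate (the crate choice, schema, and field names are assumptions; ClinicAI's actual schema may differ):

```rust
use rusqlite::{params, Connection, Result};

fn main() -> Result<()> {
    // Single local database file; no server, works fully offline.
    let conn = Connection::open("clinicai.db")?;

    conn.execute(
        "CREATE TABLE IF NOT EXISTS consultations (
            id         INTEGER PRIMARY KEY,
            created_at TEXT NOT NULL DEFAULT (datetime('now')),
            symptoms   TEXT NOT NULL,
            diagnosis  TEXT NOT NULL,
            emergency  INTEGER NOT NULL
        )",
        [],
    )?;

    conn.execute(
        "INSERT INTO consultations (symptoms, diagnosis, emergency) VALUES (?1, ?2, ?3)",
        params!["chest pain, sweating", "possible acute coronary syndrome", 1],
    )?;

    // Simple local search for the history / audit-trail view.
    let mut stmt = conn.prepare(
        "SELECT created_at, diagnosis FROM consultations WHERE symptoms LIKE ?1",
    )?;
    let rows = stmt.query_map(params!["%chest%"], |row| {
        Ok((row.get::<_, String>(0)?, row.get::<_, String>(1)?))
    })?;
    for row in rows {
        let (ts, dx) = row?;
        println!("{ts}: {dx}");
    }
    Ok(())
}
```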
Biggest learning: Architecture matters more than individual tech choices.
✅ Truly offline-first architecture (zero cloud dependency)
✅ Production-quality code (type-safe Rust + TypeScript)
✅ Real medical AI (not mock - uses quantized LLMs)
✅ Multilingual support (English + Hindi)
✅ Risk detection & emergency flagging
✅ Privacy-first by design (HIPAA-aligned)
✅ Cross-platform (Win/Mac/Linux single codebase)
✅ Minimal requirements (4GB RAM, CPU-only, 25MB bundle)
Most proud of: Building something that could actually serve 50,000+ rural clinics in India, not just impress judges.
Built With
- Frontend: React 18, TypeScript, Vite, Tailwind CSS
- Desktop: Tauri 2.0
- Backend: Rust 1.75+ (src-tauri)
- AI/LLM: llama.cpp, GGUF quantized models
- Database: SQLite 3
- Platforms: Windows (MSI), Linux, macOS (DMG)