Inspiration

Medical residents juggle the impossible: 80+ hour weeks, nonstop learning, and high-stakes decisions, all while battling fatigue. Our team was inspired by family and friends in residency who shared stories of working 28-hour shifts only to then attend rounds, study for exams, and prep for procedures with barely a moment to breathe.

AI assistants like ChatGPT and Claude offer powerful tools, but sending sensitive medical data to the cloud is a non-starter in healthcare. Residents need real support without compromising HIPAA compliance or patient privacy.

That’s why we built Meredith: an AI-powered agent that connects to your Apple Watch, processing everything locally so clinical notes, schedules, and patient data never leave your computer. No cloud uploads. No privacy concerns. Just an intelligent, hands-free partner to help lighten the load of residency, right from your wrist.

With Meredith, you’re not just surviving residency. You’re mastering it.

What it does

Meredith serves as an intelligent companion for medical residents, accessible through voice commands on their Apple Watch. It streamlines daily tasks by managing schedules, facilitating communication, retrieving patient information, and researching internet sources, all while maintaining strict medical privacy standards. Core functionality includes smart scheduling, automated documentation in standard medical formats (SBAR, admission notes, procedure notes, discharge summaries), instant access to up-to-date medical research, and intelligent task management that adapts to the dynamic nature of hospital work.

How we built it

Unlike traditional AI assistants that rely on cloud processing, we developed Meredith on an Edge AI architecture: all language model inference runs locally on the resident's own machine. This ensures complete patient data privacy and HIPAA compliance, since no sensitive information ever leaves the device. The system is powered by TinyAgent-7B, an existing open-source fine-tuned model, served on-device through a llama.cpp HTTP server (see the inference sketch after the list below), and integrates with essential APIs:

  • Perplexity Sonar for real-time medical research
  • Native iOS APIs (Contacts, SMS, Email, Calendar, Maps, File Management)
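
To make the privacy claim concrete, here is a minimal sketch of the local inference call, assuming a llama.cpp server (llama-server) on its default port 8080; the prompt shown is illustrative, not our exact agent prompt:

```python
import requests

# All inference happens against localhost: nothing leaves the machine.
LLAMA_SERVER_URL = "http://127.0.0.1:8080/completion"

def local_inference(prompt: str, max_tokens: int = 256) -> str:
    """Run one completion on the local llama.cpp server hosting TinyAgent-7B."""
    response = requests.post(
        LLAMA_SERVER_URL,
        json={"prompt": prompt, "n_predict": max_tokens, "temperature": 0.0},
        timeout=120,
    )
    response.raise_for_status()
    return response.json()["content"]

if __name__ == "__main__":
    print(local_inference("Draft an SBAR note from this dictation: ..."))
```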

To drive the actual agent, our system builds a DAG (Directed Acyclic Graph) for task planning and execution. Given an instruction, Meredith determines the optimal task ordering and builds a function-calling DAG in which each node represents an API call (like sending an email or creating a calendar event) and each directed edge represents a dependency between tasks (for example, needing to fetch email addresses before sending a meeting invite). This DAG-based planning ensures tasks execute in the correct sequence while maximizing parallel execution where possible (see the sketch below). Our agentic architecture also enables flexible integration of new tools and APIs through in-context learning and prompt engineering.
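
A minimal sketch of the execution side, using hypothetical tool names; in the real system the planner derives `graph` from the model's function-calling output:

```python
from concurrent.futures import ThreadPoolExecutor, wait, FIRST_COMPLETED

# Each key is a tool call (DAG node); its value lists the nodes it depends on.
graph = {
    "get_email_addresses": [],
    "create_calendar_event": [],
    "send_meeting_invite": ["get_email_addresses", "create_calendar_event"],
}

def run_tool(name: str) -> None:
    print(f"executing {name}")  # stand-in for the actual API call

def execute_dag(graph: dict[str, list[str]]) -> None:
    """Run independent nodes in parallel; start a node once its deps finish."""
    done: set[str] = set()
    pending: dict = {}  # Future -> node name
    with ThreadPoolExecutor() as pool:
        while len(done) < len(graph):
            # Launch every node whose dependencies are all satisfied.
            for node, deps in graph.items():
                ready = node not in done and node not in pending.values()
                if ready and all(dep in done for dep in deps):
                    pending[pool.submit(run_tool, node)] = node
            finished, _ = wait(pending, return_when=FIRST_COMPLETED)
            for future in finished:
                done.add(pending.pop(future))

execute_dag(graph)
```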

On the full-stack side, we built our web app with Next.js and a Python Flask backend. Our Apple Watch app was built in Xcode using Swift.
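
As a rough illustration of how the pieces connect, a Flask endpoint along these lines receives transcribed voice commands from the watch app and hands them to the agent; the route name and payload shape here are hypothetical:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

def run_agent(transcript: str) -> str:
    return f"(agent response to: {transcript})"  # stand-in for the real pipeline

# The watch app POSTs a transcript; the agent plans and executes locally.
@app.post("/api/command")
def handle_command():
    transcript = request.get_json(force=True).get("transcript", "")
    return jsonify({"reply": run_agent(transcript)})

if __name__ == "__main__":
    app.run(host="127.0.0.1", port=5000)
```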

Challenges we ran into

Adding custom tooling to the TinyAgent model was our biggest hurdle. The model was originally fine-tuned to pick between 16 specific tools, which made adding new functionality impossible without retraining the model's last layer. We solved this with prompt engineering and worked examples, letting the LLM use its in-context learning (ICL) capabilities to choose the most relevant tools (see the sketch below). Guiding the agent through more complex workflows was also difficult, as the TinyAgent model would often get stuck when a single API call returned an error.
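
Here is a rough sketch of the approach, with hypothetical tool names: each new tool is described in the system prompt alongside one worked example, so the model can select it without any retraining:

```python
# New tools the base model was never fine-tuned on, described for ICL.
NEW_TOOLS = [
    {
        "name": "create_sbar_note",
        "description": "Draft an SBAR note from a dictated patient summary.",
        "example": 'create_sbar_note(summary="62M, chest pain, troponin pending")',
    },
    {
        "name": "search_medical_literature",
        "description": "Query Perplexity Sonar for recent clinical research.",
        "example": 'search_medical_literature(query="current sepsis bundle guidelines")',
    },
]

def build_tool_prompt(tools: list[dict]) -> str:
    """Render tool descriptions plus few-shot examples into the system prompt."""
    lines = ["You may also call these additional tools:"]
    for tool in tools:
        lines.append(f"- {tool['name']}: {tool['description']}")
        lines.append(f"  Example: {tool['example']}")
    return "\n".join(lines)

print(build_tool_prompt(NEW_TOOLS))
```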

Accomplishments that we're proud of

  • Successfully implementing a task-planning agentic system powered by on-device edge AI!
  • Building a full-stack application using TypeScript, Next.js, and Flask (we're all backend engineers)
  • Spinning up the Apple Watch task-queue app and enabling it to convert voice to text

What's next for Meredith

Meredith's potential extends far beyond its current capabilities. Paramedics operate in high-stress environments where every second counts and their hands are often occupied with life-saving procedures. A voice-first application would allow them to maintain situational awareness while documenting critical information and coordinating care. Imagine a paramedic arriving at a trauma scene: they could verbally log their arrival time, document initial patient vitals, and alert the receiving hospital about incoming trauma cases, all while physically attending to the patient. Meredith could transcribe their verbal notes into structured medical documentation, set up automated alerts to the emergency department, and maintain a real-time log of interventions and medications administered.

Additionally, many of Meredith's potential use cases sit in high-stakes medical scenarios, where errors could impact patient care decisions or critical diagnoses. We want Meredith to be as useful as possible, but building robust safety guardrails will be a prerequisite for its adoption by medical residents.

Built With

Flask · llama.cpp · Next.js · Perplexity Sonar · Python · Swift · TypeScript