Inspiration

When we looked at the Howard Student Health Center, one thing stood out immediately: the Google reviews. Students repeatedly complained about long hold times, missed calls, and general frustration when trying to book appointments or get basic information. We saw an opportunity to help, not by building another chatbot, but by creating a voice-first AI-powered helpdesk that could actually talk to people, triage issues, check calendars, and assist in real time.

What it does

ClinicFlow AI is a fully automated, AI-powered helpdesk built for student health clinics, starting with Howard. It picks up calls instantly, understands natural speech, classifies the patient’s need (triage, appointment, waitlist, etc.), and routes the request to one of our five smart agents.

Whether it’s checking doctor availability via Google Calendar, logging details to MongoDB, emailing follow-ups, or escalating serious symptoms to staff, ClinicFlow handles it. It is designed to reduce staff overload, eliminate hold times, and make healthcare access smoother for students.

How we built it

We built a modular, multi-agent system managed through n8n, with five specialized AI agents (the routing logic is sketched just after the list):

  1. TriageAgent - understands symptoms and urgency
  2. AppointmentAgent - books and reschedules via Google Calendar
  3. RouterAgent - decides which sub-agent to activate
  4. EmailAgent - sends follow-up confirmations and reminders
  5. MongoDBAgent - logs and retrieves info from MongoDB
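
Conceptually, the RouterAgent boils down to a dispatch table keyed by the caller's classified intent, with the other four agents hanging off it. The TypeScript sketch below is only a model of that logic, since in ClinicFlow the routing actually lives in n8n workflow nodes; every name here (Intent, AgentRequest, the handler bodies) is illustrative rather than copied from our implementation.

```typescript
// Illustrative model of the RouterAgent's dispatch logic. In the real
// system this is an n8n workflow, not hand-written code; all names and
// handler bodies below are hypothetical.

type Intent = "triage" | "appointment" | "email" | "log";

interface AgentRequest {
  callerId: string;   // Twilio caller identifier
  transcript: string; // what the patient has said so far
  intent: Intent;     // classified upstream by the RouterAgent's LLM step
}

interface AgentResponse {
  spokenReply: string; // text to be voiced back to the caller
  escalate: boolean;   // true when clinic staff should be notified
}

// Each sub-agent is modeled as an async handler keyed by intent.
const agents: Record<Intent, (req: AgentRequest) => Promise<AgentResponse>> = {
  triage: async (req) => ({
    spokenReply: "Let me ask a few quick questions about your symptoms.",
    escalate: /chest pain|trouble breathing/i.test(req.transcript),
  }),
  appointment: async () => ({
    spokenReply: "One moment while I check the doctor's calendar.",
    escalate: false,
  }),
  email: async () => ({
    spokenReply: "I will send you a confirmation email shortly.",
    escalate: false,
  }),
  log: async () => ({
    spokenReply: "I have noted that in your record.",
    escalate: false,
  }),
};

// The RouterAgent simply picks the handler for the classified intent.
async function route(req: AgentRequest): Promise<AgentResponse> {
  return agents[req.intent](req);
}
```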

All communication runs through the RouterAgent, which chooses the appropriate sub-agent based on the conversation. For speech, we use OpenAI's speech-to-speech API, which is faster and more natural-sounding than chaining separate speech-to-text and text-to-speech steps, paired with Twilio for phone call handling. Everything is wired together through HTTP requests, WebSockets, and JSON payloads inside n8n.
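
To make the phone leg concrete, here is a stripped-down sketch of the kind of WebSocket bridge that sits between a Twilio call and the speech model. It assumes Twilio Media Streams' JSON frames and Node's ws package; the port, the relayToSpeechModel helper, and everything inside it are placeholders, because in ClinicFlow this hop is handled by n8n rather than a standalone server.

```typescript
// Sketch of a Twilio <-> speech-model audio bridge. Assumes Twilio
// Media Streams (JSON frames over a WebSocket) and the Node "ws" package.
// relayToSpeechModel is a placeholder; in ClinicFlow this hop runs
// inside n8n and talks to OpenAI's speech-to-speech API.
import { WebSocketServer, WebSocket } from "ws";

const wss = new WebSocketServer({ port: 8080 });

wss.on("connection", (twilioSocket: WebSocket) => {
  let streamSid: string | undefined;

  twilioSocket.on("message", (raw) => {
    const frame = JSON.parse(raw.toString());

    switch (frame.event) {
      case "start":
        // Twilio announces the stream; keep its id so reply audio
        // can be addressed back to the same call.
        streamSid = frame.start.streamSid;
        break;
      case "media":
        // frame.media.payload carries base64-encoded caller audio.
        relayToSpeechModel(frame.media.payload, streamSid);
        break;
      case "stop":
        twilioSocket.close();
        break;
    }
  });
});

// Placeholder for the hop that forwards caller audio to the speech
// model and streams synthesized speech back to Twilio.
function relayToSpeechModel(base64Audio: string, sid?: string): void {
  // ... send audio to the model, pipe the reply back over the socket ...
}
```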

Challenges we ran into

  • Getting speech-to-speech working smoothly inside a live call. We had to tune latency and response timing extensively.
  • Managing long-form conversations in n8n. It is not designed for voice flows, so we had to get creative with memory handling and agent routing.
  • Google Calendar integration was trickier than expected, especially around overlapping events and recurring appointments; a simplified version of our availability check is sketched below.
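
On that last point, a booking is only valid if the requested slot overlaps none of the busy intervals already on the doctor's calendar, and recurring events have to be expanded into concrete instances before that check. The snippet below is a simplified, self-contained version of the availability test; the event shape is illustrative and much leaner than what the Google Calendar API actually returns.

```typescript
// Simplified availability check. The BusyInterval shape is illustrative;
// real Google Calendar events carry far more fields, and recurring
// events must first be expanded into individual instances.
interface BusyInterval {
  start: Date;
  end: Date;
}

// A requested slot is free only if it overlaps no existing busy interval.
function slotIsFree(slotStart: Date, slotEnd: Date, busy: BusyInterval[]): boolean {
  return busy.every(
    (b) => slotEnd.getTime() <= b.start.getTime() || slotStart.getTime() >= b.end.getTime()
  );
}

// Example: a 30-minute request against two booked appointments.
const booked: BusyInterval[] = [
  { start: new Date("2025-04-07T14:00:00Z"), end: new Date("2025-04-07T14:30:00Z") },
  { start: new Date("2025-04-07T15:00:00Z"), end: new Date("2025-04-07T15:30:00Z") },
];

console.log(
  slotIsFree(new Date("2025-04-07T14:30:00Z"), new Date("2025-04-07T15:00:00Z"), booked)
); // true: the requested slot sits exactly between the two bookings
```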

Accomplishments that we're proud of

  • We built a fully working AI receptionist from scratch: not just a prototype, but a full stack that can pick up real calls, talk to patients, and take action.
  • We managed to route between five different AI agents in real time, all orchestrated through a custom-built agent router in n8n.
  • And most importantly, we turned student feedback into a real, usable tool that could actually fix a campus pain point.

What we learned

  • Voice-first AI is hard, but the tech is finally here to make it usable. Tools like OpenAI's speech API, Twilio, and n8n are game changers when used creatively.
  • Multi-agent AI systems need careful planning. We spent a lot of time designing how agents talk to each other, how memory is passed, and how context is maintained.
  • Real-world use cases like clinic triage expose so many edge cases that don’t show up in a lab setting. We learned to think like users, not just developers.

What's next for ClinicFlow AI

  • We are planning to add a voice memory module, so patients can follow up and the system remembers past calls.
  • We want to make the platform plug-and-play for other student health centers across the country. This problem is not unique to Howard; it exists at UMD as well.
  • We are also planning to layer in HIPAA-compliant data handling so this can be used in real-world deployments.

ClinicFlow AI started as a fix for one school’s issue, but we think it can scale far beyond that.

Built With

  • n8n
  • OpenAI speech-to-speech API
  • Twilio
  • Google Calendar
  • MongoDB
