Inspiration

During the 2023 Maui wildfires and 2024 Los Angeles fires, we watched news reports of elderly and disabled residents who died in their homes, not because they couldn't evacuate, but because they never received a warning call or had no way to coordinate accessible transportation. Over 40% of climate disaster fatalities are among vulnerable populations who are systematically excluded from modern emergency response systems that assume everyone has smartphones, speaks English, and can drive.

We realized that while the world builds increasingly sophisticated emergency apps, we're leaving behind the very people who need help most. Climate disasters are increasing by 14% annually, yet our emergency infrastructure hasn't evolved to serve the 61 million disabled Americans, millions of elderly residents, and non-English speaking communities. Safe Path AI was born from a simple question: What if the emergency system called you first, instead of waiting for you to call 911?

What it does

Safe Path is a proactive, voice-first disaster evacuation system that automatically calls vulnerable people during climate emergencies and guides them to safety with AI-powered, personalized instructions.

Here's the complete user experience:

  1. Registration (60 seconds): Users sign up on our web platform with phone-based authentication and provide their phone number, GPS location (captured automatically), mobility constraints (wheelchair user, limited mobility, etc.), and language preference.

  2. 24/7 Monitoring: Our Google Cloud Function monitors the GDACS (Global Disaster Alert and Coordination System) API, tracking seven disaster types: wildfires, floods, earthquakes, hurricanes, tsunamis, volcanoes, and droughts.

  3. Intelligent Detection: When a new disaster is detected, our system calculates a dynamic alert radius based on disaster type and severity (e.g., a Category 5 hurricane = 200 km radius, a wildfire = 50-100 km radius), then queries our Supabase database to find all registered users within the danger zone. A sketch of the radius logic follows this list.

  4. Proactive Outbound Calls: For each affected user, the system automatically places an outbound Twilio call connected to our ElevenLabs voice agent, passing rich context: the user's location coordinates, the disaster's coordinates, distance and direction to the danger, mobility constraints, and language preference.

  5. AI-Powered Guidance: When the user answers, they're connected to EVA (our ElevenLabs Conversational AI agent) who speaks in their language and says: "[Name], this is EVA from Safe Path. A wildfire is 2.3 kilometers from your location. You need to evacuate immediately. I know you use a wheelchair. Here's your evacuation route..."

  6. Step-by-Step Navigation: EVA guides them through the entire evacuation process, accounting for their specific mobility needs, providing wheelchair-accessible routes, coordinating transportation if they don't have a vehicle, and directing them to the nearest accessible shelter with real-time capacity information.
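
To make step 3 concrete, here is a minimal sketch of the radius logic. The base values and multipliers are illustrative rather than our tuned production table, but a Category 5 hurricane lands near the 200 km figure above:

```python
# Sketch of the dynamic alert-radius calculation from step 3.
# Base radii and multipliers are illustrative, not the tuned production values.
ALERT_RADII_KM = {
    # disaster_type: (base_radius_km, per-severity-step multiplier)
    "wildfire":   (50, 1.5),
    "flood":      (30, 1.4),
    "earthquake": (40, 1.6),
    "hurricane":  (80, 1.25),  # severity 5 -> ~200 km, matching the example above
    "tsunami":    (60, 1.5),
    "volcano":    (30, 1.5),
    "drought":    (100, 1.0),
}

def alert_radius_km(disaster_type: str, severity: int) -> float:
    """Alert radius in kilometers, scaled by severity on a 1-5 scale."""
    base, factor = ALERT_RADII_KM.get(disaster_type, (50, 1.2))
    severity = max(1, min(severity, 5))
    return base * factor ** (severity - 1)
```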

Key Features:

  • Multilingual via ElevenLabs' multilingual voice AI
  • Accessibility-First routing for wheelchair users and those with limited mobility
  • No App Required: works on any phone, including landlines; an optional companion app keeps the user's location updated automatically
  • Sub-5-Second Response Time from disaster detection to the first call
  • Real-Time Disaster Data from GDACS (Global Disaster Alert and Coordination System)
  • Personalized Routes based on individual constraints and live traffic/road-closure data

How we built it

Architecture:

Frontend and Backend (Lovable):

  • Built a modern, accessible landing page with phone authentication via Supabase Auth
  • Phone number registration flow with GPS coordinate capture using Google's geolocation API
  • Mobile-responsive design
  • RESTful API server handling call orchestration between disaster detection and Twilio
  • Receives affected user data from the Cloud Function
  • Connects calls to ElevenLabs Voice AI

Disaster Monitoring (Google Cloud Function - Python):

  • Scheduled function triggered by Cloud Scheduler every 5 minutes
  • Fetches latest disasters from GDACS API (JSON format)
  • Calculates dynamic alert radii based on disaster type and severity
  • Calls Supabase Edge Function to query users within calculated radius
  • Triggers batch evacuation calls via backend API with full context payloads
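
Condensed, the function's control flow looks like the sketch below. The GDACS endpoint, JSON field names, and environment-variable wiring are assumptions for illustration, and error handling is omitted; alert_radius_km is the helper sketched under "What it does", and is_processed/mark_processed are the Supabase-backed state helpers sketched under challenge 4 below.

```python
# main.py - scheduled Google Cloud Function (condensed sketch, not verbatim
# production code; the GDACS endpoint and JSON field names are assumptions).
import os
import requests

GDACS_FEED = "https://www.gdacs.org/gdacsapi/api/events/geteventlist/SEARCH"  # assumed endpoint
FIND_USERS_FN = os.environ["SUPABASE_EDGE_FN_URL"]  # wraps find_users_near_disaster()
CALL_API = os.environ["BACKEND_CALL_API_URL"]       # backend call-orchestration endpoint

def check_disasters(request):
    """Entry point, invoked by Cloud Scheduler every 5 minutes."""
    events = requests.get(GDACS_FEED, timeout=30).json().get("features", [])
    for event in events:
        props = event["properties"]
        lon, lat = event["geometry"]["coordinates"][:2]  # GeoJSON order is lon, lat
        if is_processed(props["eventid"], props.get("episodeid")):
            continue  # this episode was already handled (see challenge 4)
        # GDACS type codes (e.g. "WF") are mapped to our disaster names in production
        radius = alert_radius_km(props["eventtype"], props.get("alertscore", 1))
        users = requests.post(
            FIND_USERS_FN,
            json={"lat": lat, "lon": lon, "radius_km": radius},
            timeout=30,
        ).json()
        for user in users:  # batch evacuation calls with full context payloads
            requests.post(CALL_API, json={"user": user, "disaster": props}, timeout=30)
        mark_processed(props["eventid"], props.get("episodeid"))
    return "ok"
```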

Database (Supabase/PostgreSQL):

  • users table storing phone number, GPS location, mobility constraints, and language preference
  • user_calls table tracking all evacuation calls with disaster context and outcomes
  • disaster_state table for persistent Cloud Function state management
  • Custom function find_users_near_disaster() for radius queries against registered user locations
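
Conceptually, find_users_near_disaster() is a great-circle distance filter over registered users. The deployed version is SQL inside Supabase; the equivalent logic, shown in Python for readability, is:

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two (lat, lon) points."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))  # mean Earth radius ~6371 km

def find_users_near_disaster(users, lat, lon, radius_km):
    """Return registered users whose stored GPS location is inside the alert radius."""
    return [u for u in users if haversine_km(u["lat"], u["lon"], lat, lon) <= radius_km]
```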

AI Voice Agent (ElevenLabs Voice AI) and Twilio Telephony:

  • Custom agent configured with a 2,000+ character system prompt optimized for emergency scenarios
  • Multilingual support with automatic language detection
  • Calm, authoritative voice profile using ElevenLabs' "professional assistant" voice model
  • Dynamic response generation accounting for user feedback (obstacles, injuries, panic)
  • Hazard-specific guidance protocols for each disaster type
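
The outbound leg of this is only a few lines with twilio-python. This is a sketch of the orchestration step, assuming a TwiML bridge endpoint (placeholder URL here) that hands the call audio and dynamic variables to the ElevenLabs agent; the context field names are illustrative:

```python
# Sketch of the proactive outbound call (twilio-python). The TwiML bridge URL
# and context field names are placeholders, not production values.
import os
from urllib.parse import urlencode
from twilio.rest import Client

client = Client(os.environ["TWILIO_ACCOUNT_SID"], os.environ["TWILIO_AUTH_TOKEN"])

def trigger_evacuation_call(user: dict, disaster: dict):
    """Place the call, passing the context EVA needs before she speaks."""
    context = urlencode({
        "name": user["name"],
        "language": user["language"],
        "mobility": user["mobility"],
        "disaster_type": disaster["type"],
        "distance_km": disaster["distance_km"],
        "direction": disaster["direction"],
    })
    return client.calls.create(
        to=user["phone"],
        from_=os.environ["TWILIO_FROM_NUMBER"],
        url=f"https://example.com/twiml/eva?{context}",  # placeholder bridge URL
    )
```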

Tech Stack Summary:

  • Frontend: React, TypeScript, Lovable, Tailwind CSS
  • Backend: Lovable Cloud, ElevenLabs API
  • Cloud: Google Cloud Functions (Python), Cloud Scheduler
  • Database: Supabase (PostgreSQL), Supabase Edge Functions
  • AI/Voice/Telephony: ElevenLabs Voice AI Agent, Twilio
  • CI/CD: shell scripts

Challenges we ran into

1. Real-Time Disaster Data Integration: The biggest technical challenge was finding and integrating reliable, real-time global disaster data. After looking at agency-specific feeds, we discovered GDACS, which aggregates alerts across disaster types into a single API.

2. Video Editing and Rendering for the Product Presentation: Since everyone on our team is a software engineer, picking up video editing skills along the way was a challenge of its own.

3. Twilio + ElevenLabs Integration: The biggest issue here was working within the restrictions of Twilio's trial account during testing and finding the right ElevenLabs API for our custom agent.

!!! NOTE: The app flow is: Lovable site (or Flutter app) where the user enters a mobile phone number and location -> (if a disaster is detected nearby) -> Twilio places a call via the ElevenLabs agent. Because we did not want to spend real money, we used trial Twilio credits, and Twilio trial accounts cannot call unverified numbers. So even if testers enter a location near a real-life disaster event, they will not receive a call until their number is verified in our Twilio Console.

4. State Management for Disaster Detection: Our Cloud Function runs every 5 minutes but needs to remember which disasters it has already processed to avoid calling the same users repeatedly, and Google Cloud Functions are stateless. Solution: we built a Supabase-backed state management system that persists processed disaster IDs with timestamps. Users are still notified if the same disaster recurs as a new episode (for example, a second round of the same hurricane or earthquake).
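
A minimal sketch of that state layer with supabase-py (table name matches our schema description; column names are assumptions, and the episode check is the fix described above):

```python
# Supabase-backed state that makes the stateless Cloud Function idempotent
# per (event, episode); a new episode of the same event still triggers calls.
import os
from datetime import datetime, timezone
from supabase import create_client

supabase = create_client(os.environ["SUPABASE_URL"], os.environ["SUPABASE_SERVICE_KEY"])

def is_processed(event_id: str, episode_id: str | None) -> bool:
    """True only if this exact (event, episode) pair was already handled."""
    rows = (supabase.table("disaster_state")
            .select("event_id")
            .eq("event_id", event_id)
            .eq("episode_id", episode_id or "")
            .execute())
    return len(rows.data) > 0

def mark_processed(event_id: str, episode_id: str | None) -> None:
    supabase.table("disaster_state").upsert({
        "event_id": event_id,
        "episode_id": episode_id or "",
        "processed_at": datetime.now(timezone.utc).isoformat(),
    }).execute()
```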

5. Context Passing to AI Agent: The ElevenLabs AI needed to know the user's location, disaster details, and mobility constraints before speaking. We managed this by calling the agent directly via its API and injecting dynamic variables (coordinates, distances, constraints).

6. Accessibility Routing Logic: Finding wheelchair-accessible routes isn't as simple as computing the shortest path: we need to avoid stairs, steep inclines, and narrow passages.
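
One way to frame this, sketched here under the assumption of a street graph annotated with stairs and grade data: treat accessibility needs as hard edge constraints in an ordinary shortest-path search, so a wheelchair user's graph simply never contains stairs or steep segments (the deployed router also folds in live traffic and road-closure data):

```python
import heapq

def accessible_route(graph, start, goal, max_grade=0.05, allow_stairs=False):
    """Dijkstra over a street graph, skipping edges the user cannot traverse.

    graph: {node: [(neighbor, meters, {"stairs": bool, "grade": float}), ...]}
    """
    dist, prev = {start: 0.0}, {}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:  # reconstruct the path back to start
            path = [node]
            while node in prev:
                node = prev[node]
                path.append(node)
            return path[::-1]
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        for nbr, meters, attrs in graph.get(node, []):
            if attrs.get("stairs") and not allow_stairs:
                continue  # hard constraint: no stairs
            if attrs.get("grade", 0.0) > max_grade:
                continue  # hard constraint: incline too steep
            nd = d + meters
            if nd < dist.get(nbr, float("inf")):
                dist[nbr], prev[nbr] = nd, node
                heapq.heappush(heap, (nd, nbr))
    return None  # no accessible route; fall back to coordinating transport
```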

7. Handling User Panic in Voice Interactions: During testing, we realized panicked users interrupt the AI, speak incoherently, or freeze up entirely, and standard conversational AI struggles with this. Solution: we rewrote the agent prompt with explicit rules for panicked conversations.
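
An illustrative, paraphrased excerpt of those rules (not the verbatim 2,000+ character production prompt):

```python
# Paraphrased excerpt of EVA's panic-handling rules, stored as part of the
# agent's system prompt. Wording is illustrative, not the production prompt.
PANIC_RULES = """
If the caller interrupts, stop speaking immediately and listen.
If speech is incoherent, ask one yes/no question at a time.
If the caller goes silent, reassure first ("I'm staying on the line with you"),
then repeat the last single instruction.
Never give more than one instruction per turn.
"""
```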

8. Testing Without Real Disasters: We couldn't exactly start a wildfire to test our system, so we built a test harness that injects simulated disaster events into the same pipeline the real GDACS feed uses (more on this under "Integration Testing Saves the Day" below).
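
A trimmed example of a simulated event from that harness; the field names mirror the GDACS GeoJSON shape we consume, while the IDs and coordinates are made up:

```python
# Synthetic GDACS-style event used by the test harness (illustrative values).
FAKE_WILDFIRE = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [-122.29, 38.58]},  # lon, lat
    "properties": {
        "eventid": "TEST-0001",
        "episodeid": "1",
        "eventtype": "WF",  # wildfire, per GDACS type codes
        "alertscore": 3,
        "name": "Simulated Napa wildfire",
    },
}

def run_drill():
    """Push the fake event through the exact production path: radius calc,
    user lookup, and (against verified test numbers) a real outbound call.
    inject_events() is a hypothetical harness entry point."""
    inject_events([FAKE_WILDFIRE])
```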

Accomplishments that we're proud of

Built a Genuinely Life-Saving System: We can detect a real wildfire in California, find users within a 50-200 km radius, and call them with personalized evacuation guidance within seconds.

Solved Real Accessibility Barriers: We didn't just add "wheelchair accessible" as a checkbox; we fundamentally redesigned emergency response to be accessibility-first. From phone-based interaction (no app required) to mobility-aware routing to multilingual support, every decision centered vulnerable populations.

Sub-5-Second AI Response Time: From disaster detection to the first outbound call, the full pipeline runs in under five seconds.

Proactive vs. Reactive Paradigm Shift: We didn't build a better emergency app; we built a fundamentally different system. Instead of "download our app and remember to check it during disasters," we say "register once, we'll call you."

Natural Language Emergency AI: Our ElevenLabs agent doesn't sound robotic: it's calm, empathetic, and adapts to user stress levels.

Integration of 5 Complex APIs: We successfully integrated Twilio (telephony), ElevenLabs (AI voice), GDACS (disaster data), Supabase (database + auth), and Google Maps API, each with different authentication, rate limits, and data formats.

Solved a Problem Judges Haven't Seen: Most hackathon projects are "X but with AI" or "Uber for Y." We identified a genuine gap in emergency infrastructure that hurts 1,200+ Americans annually and built a novel solution. Our problem statement is unique and urgent.

Inclusive by Design, Not an Afterthought: Constraints-based registration, voice-first interaction, and multilingual support were part of the core flow from the start, not bolted on.

Production-Quality Code in 24 Hours: A typed frontend, an idempotent Cloud Function, and a disaster-simulation test harness, all shipped within the hackathon window.

What we learned

Voice AI is Ready for Critical Applications: With careful prompting, a voice agent can stay calm, coherent, and useful even with panicked callers.

Real-Time Matters in Emergencies: The minutes between detection and the first call are where evacuations succeed or fail, which is why we optimized for a sub-5-second pipeline.

Global Disaster Data is Fragmented: There's no single "disaster API." NASA has fires, NOAA has floods, USGS has earthquakes, each with different formats, update frequencies, and access methods. Aggregators like GDACS help but aren't perfect. Emergency response needs better data infrastructure.

Voice-First is Different: Designing for voice requires completely different UX thinking than visual interfaces. You can't "show" options—you must guide users linearly. We learned to: 1) Give one instruction at a time, 2) Confirm understanding, 3) Provide landmarks not coordinates, 4) Build in pauses for response.

Accessibility is About Constraints, Not Disabilities: Instead of asking "Are you disabled?" we ask "Can you use stairs? Do you have transportation?" This constraints-based model is more actionable and less stigmatizing.
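
In code, that shift is just a schema change: store routable facts instead of a diagnosis. An illustrative model (field names are ours for this sketch, not the exact schema):

```python
from dataclasses import dataclass

@dataclass
class EvacuationConstraints:
    """Actionable facts for routing and calling, not medical labels."""
    can_use_stairs: bool
    has_vehicle: bool
    max_walk_meters: int
    needs_wheelchair_route: bool
    language: str
```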

Localization ≠ Translation: Supporting multiple languages isn't just running text through Google Translate; it's understanding cultural norms (directness vs. politeness), measurement systems (km vs. miles), and emergency protocols that vary by country.

Empathy Can Be Engineered: Through prompt engineering, we gave our AI agent explicit empathy protocols: acknowledging fear, offering reassurance, and adapting pace to stress levels.

Prioritization Wins Hackathons: With 24 hours, we couldn't build everything. We chose the minimum viable life-saving system: detect disaster, call user, guide to safety. Features like SMS backup, route recalculation, and community alerts were cut. The core works flawlessly because we focused.

Integration Testing Saves the Day: We spent multiple hours building a test harness that simulated disasters end to end, which let us exercise the whole pipeline (detection, user lookup, outbound call) without waiting for a real event.

Domain Expertise Matters: We researched emergency management protocols, disability rights guidelines, and disaster response best practices.

Climate Tech is Humanitarian Tech: We entered thinking "this is a climate change project." We learned it's actually a humanitarian crisis response project.

Small Teams Can Build Big Solutions: Small team in under 24 hours built a system that could deploy globally.

What's next for Safe Path

Immediate Next Steps (Next 30 Days, Ideally):

Pilot Program Deployment: Partner with 1-2 counties to deploy Safe Path for 500-1,000 registered vulnerable residents. Track evacuation success rates, response times, and user feedback during the next wildfire season.

Production Hardening:

  • Implement redundant disaster data sources (add NASA FIRMS, NOAA NWS, USGS feeds as backups)
  • Add SMS backup notifications when a call goes unanswered after 3 attempts
  • Build route recalculation engine that updates guidance if users report obstacles
  • Deploy proper monitoring (Datadog/New Relic) with SLA tracking and alerting
  • Implement end-to-end encryption for sensitive user data (location, health info)

Enhanced Features:

  • Video call support for ASL interpretation and visual guidance (ElevenLabs)
  • Integration with local emergency services (automatic 911 alert if user reports injury)
  • Community coordination (match neighbors in same building for group evacuation)
  • Pre-evacuation drills ("test your evacuation route" feature users can trigger)

Short-Term Goals (3-6 Months, Ideally):

Government Partnerships:

  • Pursue Innovation Lab partnership for national pilot program
  • Apply for grants focused on disability inclusion in emergency management

Funding & Business Model:

  • Raise $150K seed round to support:
    • 2 full-time engineers (backend + AI)
    • 1 disaster response specialist
    • Legal/compliance for HIPAA, disability rights regulations
    • Infrastructure costs for national deployment
  • B2G pricing: $2-5 per registered citizen annually (sustainable for counties)
  • B2B pricing: $50-200 per resident annually (hospitals, nursing homes, schools)

Geographic Expansion:

  • Expand beyond US to high-risk international regions:
    • Japan (earthquakes, tsunamis)
    • Philippines (typhoons)
    • Australia (bushfires)
    • Bosnia and Herzegovina, Montenegro and Croatia (wildfires)
  • Partner with international disaster response organizations (Red Cross, UNICEF)

Data & Research:

  • Publish peer-reviewed research on evacuation success rates
  • Partner with disability rights organizations for UX research

Built With

ElevenLabs, Google Cloud, Lovable, Python, React, Supabase, Tailwind CSS, Twilio, TypeScript
