Inspiration
Living in Hermosillo, Sonora, we have recently witnessed an alarming increase in wildfires and environmental emergencies. Seeing smoke cover the city, hills burning near communities, and families unsure of where to go made us realize something crucial: the danger is not only the fire itself, but how slowly information reaches people. Whether it is forest fires, earthquakes, or hurricanes, response time is the difference between life and death.

Today, we rely on smartphones and cloud-based systems to warn us and coordinate rescue, but in real disasters, cell towers fail, signals disappear, and apps become useless icons on a screen. This is the same reality faced by firefighters, park rangers, and emergency teams working in canyons, remote mountains, and fire zones where LTE coverage is weak or completely destroyed. It is also what happened during Hurricane Katrina, when thousands of people in New Orleans were left without communication. At the same time, we were inspired by the growing number of wildfires in places like Los Angeles and by the global climate crisis that is making disasters more frequent and severe. Most existing prediction systems depend on the cloud, which becomes unreachable exactly when it is needed most.

This led us to ask one question: what if science could still protect people, even when the internet is gone? That question became Project AEGIS (Adaptive Environmental Global Intelligence System), a platform that combines AI wildfire prediction, real-time disaster visualization, and an offline Bluetooth emergency network. By uniting data, science, and human movement, Project AEGIS ensures that warnings, SOS signals, and life-saving information can still flow, even when everything else fails.
What it does
AEGIS is a scientific, end-to-end disaster intelligence and emergency response system that solves a critical real-world problem: when disasters strike, prediction tools, emergency dashboards, and communication systems fail to work together, especially once internet connectivity is lost. AEGIS closes this gap by integrating AI-based risk prediction, real-time disaster visualization, and offline emergency communication into a single, functional ecosystem.

How AEGIS works: it operates in three connected layers.
- AI Risk Prediction: AEGIS ingests environmental and spectral satellite data from CSV files and runs a trained XGBoost wildfire prediction model to identify high-risk fire zones, analyze vegetation health, generate statistical summaries, and answer questions through an AI assistant, such as "Which zones are at highest risk?" or "What happens if the wind shifts north?" This lets responders and researchers anticipate disasters instead of reacting too late.
- Real-Time Disaster Intelligence: AEGIS aggregates historical and live disaster data, social media distress signals, and environmental indicators, then visualizes them in an interactive dashboard that predicts future disaster events, shows safe routes and shelter paths, helps communities and agencies make data-driven decisions, and promotes long-term climate awareness.
- Offline Emergency Network: when the internet fails, AEGIS activates a Bluetooth Low Energy mesh network. Trapped users send an SOS from their phones; the message hops from phone to phone, with each device storing and rebroadcasting the signal. Once any phone regains connectivity, the SOS is uploaded to the rescue dashboard. Users mark their status: 🔴 Red (critical), 🟡 Yellow (injured), 🟢 Green (safe). Even people not in danger become mobile rescue relays just by carrying the app.
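To make the SOS relay concrete, here is a minimal Python sketch of a compact SOS packet. The exact field layout, widths, and the 1e7 coordinate scaling are our illustrative assumptions; the design constraint from the project is only that GPS data compresses to 8 bytes and the whole payload fits in a single BLE advertisement.

```python
import struct
import time

# Hypothetical packet layout (assumed, not the exact AEGIS format):
#   lat, lon  : 2 x int32, degrees scaled by 1e7  -> 8 bytes
#   user_id   : uint32                            -> 4 bytes
#   status    : uint8 (0=green, 1=yellow, 2=red)  -> 1 byte
#   seq       : uint16 sequence counter           -> 2 bytes
#   timestamp : uint32 Unix seconds               -> 4 bytes
# Total: 19 bytes, within BLE's 31-byte advertising payload.
PACKET_FMT = ">iiIBHI"

def encode_sos(lat, lon, user_id, status, seq, ts):
    """Pack an SOS into a compact binary packet."""
    return struct.pack(PACKET_FMT, round(lat * 1e7), round(lon * 1e7),
                       user_id, status, seq, ts)

def decode_sos(packet):
    """Unpack a packet back into readable fields."""
    lat_i, lon_i, user_id, status, seq, ts = struct.unpack(PACKET_FMT, packet)
    return lat_i / 1e7, lon_i / 1e7, user_id, status, seq, ts

pkt = encode_sos(29.0729, -110.9559, 42, 2, 1, int(time.time()))
assert len(pkt) <= 31  # fits in a single BLE advertisement
```

Scaling degrees by 1e7 keeps roughly 1 cm of precision while replacing two 8-byte floats (or a much longer text form) with two 4-byte integers.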
How we built it
AEGIS was designed with one guiding principle: a disaster system is only valuable if it works when infrastructure fails. To achieve this, we built AEGIS as a modular, offline-first, AI-driven architecture that unifies communication, prediction, and visualization into a single pipeline.

System Architecture

AEGIS integrates three fully functional subsystems, each solving a critical part of the disaster-response chain.

1) Offline Emergency Communication

We built the mobile application using Flutter and Dart, allowing a single codebase for Android and iOS. Core engineering features:

- Bluetooth Low Energy mesh network: custom BLE advertising and scanning send compressed SOS packets containing encoded GPS coordinates, a user ID, an emergency status, and a timestamp.
- Data compression: GPS data is compressed from ~40 bytes to 8 bytes using integer encoding, fitting within Bluetooth's 31-byte advertising limit.
- Epidemic routing protocol: a store-and-forward system rebroadcasts messages phone-to-phone using a local SQLite cache, deleting records only after server confirmation.
- Background execution: Android WorkManager and iOS Background Fetch plus silent pushes keep the app functioning even when the phone is locked.
- Loop prevention and reliability: sequence counters and timestamp expiration prevent infinite rebroadcasts.
- Responder dashboard: a web console displays SOS alerts on a live map for rescue coordination.

This ensures AEGIS remains usable, stable, and functional in no-signal environments.

2) Real-Time Disaster Intelligence

Built with Next.js and React, with Firestore for real-time updates and authentication:

- Google Maps API for live geolocation and evacuation routing
- Gemini Vision to analyze satellite images for environmental anomalies
- Gemini 2.0 Flash (chat + agent) for continuous monitoring of weather, news, and social data, plus interactive queries and safety recommendations

Data sources include NASA DAYMET_V4, NOAA CFSV2, Copernicus Atmosphere Monitoring, and historical disaster datasets. All services are deployed on Google Cloud, ensuring scalability and low-latency updates.

3) AI Wildfire Risk Prediction

- Data collection: Google Earth Engine + MODIS fire records (NASA FIRMS)
- Preprocessing: Python pipeline to clean, normalize, and merge features
- Modeling: XGBoost classifier trained on vegetation, climate, and terrain indicators
- Web interface: Flask dashboard for analytics and risk visualization
- GenAI support: integrated GPT4All for offline summarization and decision support
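The epidemic routing layer described above is a store-and-forward cache: each phone persists packets it hears, keeps rebroadcasting them, and deletes a record only once the server confirms receipt. A minimal Python sketch with SQLite follows; the schema, TTL value, and function names are illustrative, not the actual AEGIS implementation.

```python
import sqlite3

# Minimal sketch of a store-and-forward cache (schema and names are
# assumptions, not the actual AEGIS code).
def open_cache(path=":memory:"):
    db = sqlite3.connect(path)
    db.execute("""CREATE TABLE IF NOT EXISTS sos_cache (
                    msg_id     TEXT PRIMARY KEY,  -- e.g. user_id + sequence number
                    packet     BLOB NOT NULL,     -- raw BLE payload
                    expires_at INTEGER NOT NULL)""")
    return db

def store(db, msg_id, packet, now, ttl=3600):
    # Duplicate hops are ignored, so each SOS is cached once per device.
    db.execute("INSERT OR IGNORE INTO sos_cache VALUES (?, ?, ?)",
               (msg_id, packet, now + ttl))

def pending(db, now):
    # Packets still waiting to be rebroadcast; expired ones are dropped.
    db.execute("DELETE FROM sos_cache WHERE expires_at <= ?", (now,))
    return [row[0] for row in db.execute("SELECT packet FROM sos_cache")]

def confirm(db, msg_id):
    # Server acknowledged receipt: stop relaying this message.
    db.execute("DELETE FROM sos_cache WHERE msg_id = ?", (msg_id,))

db = open_cache()
store(db, "42-1", b"\x01\x02", now=1000)
store(db, "42-1", b"\x01\x02", now=1001)   # duplicate hop, ignored
assert pending(db, now=1002) == [b"\x01\x02"]
confirm(db, "42-1")
assert pending(db, now=1002) == []
```

Persisting to SQLite (rather than holding packets in memory) is what lets a relay phone survive app restarts and OS kills without losing queued SOS messages.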
Challenges we ran into
Building AEGIS meant solving problems that exist not only in code, but in real-world emergency conditions, where systems must remain functional under extreme technical and environmental constraints.

Technical & System Challenges

- Cross-platform Bluetooth limitations: Android and iOS handle Bluetooth very differently, especially in the background. iOS hides BLE UUIDs in an overflow area when apps are locked, making them invisible to standard scans. We had to redesign the scanning logic so Android devices could detect these hidden iOS signals, keeping the mesh network functional across platforms.
- Background execution restrictions: modern mobile operating systems aggressively limit background processes to preserve battery. Keeping AEGIS alive while phones are locked required platform-specific background strategies and user guidance, without draining power or breaking system stability.
- Battery optimization conflicts: power-saving modes can silently kill emergency services. We implemented detection systems to warn users and designed AEGIS to activate mesh mode only during verified disaster events, balancing reliability with energy efficiency.
- Preventing network flooding: simultaneous rebroadcasting can cause radio collisions. We introduced randomized transmission delays and intelligent rebroadcast logic to maintain signal flow without overwhelming the network.
- Dynamic location tracking: as people move, outdated SOS coordinates become dangerous. We implemented sequence counters and timestamp validation so only the most recent data is shared.

Data & AI Challenges

- Real-time data fusion: AEGIS processes satellite imagery, weather data, social signals, and historical disaster records. Merging these streams while keeping predictions fast and reliable was a major challenge.
- Model accuracy vs. performance: we had to balance scientific precision with real-time usability, ensuring predictions were meaningful without slowing down the system.
- Responsible AI integration: designing transparent, unbiased prediction logic required testing, validation, and ethical safeguards, critical for life-impacting decisions.
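The loop-prevention and flood-control ideas above (sequence counters, timestamp expiration, randomized transmission delays) can be sketched in a few lines of Python. The function names, the one-hour expiry, and the 0 to 0.5 second jitter window are our assumptions, not the exact AEGIS values.

```python
import random

MAX_AGE = 3600  # assumed: drop packets older than an hour

def should_rebroadcast(seen, user_id, seq, sent_at, now):
    """Relay a packet only if it is fresh and newer than anything
    we have already relayed for this user."""
    if now - sent_at > MAX_AGE:
        return False                 # stale: timestamp expired
    if seen.get(user_id, -1) >= seq:
        return False                 # duplicate or outdated sequence number
    seen[user_id] = seq              # remember the newest packet per user
    return True

def tx_delay(rng=random):
    """Randomized delay before rebroadcast to avoid radio collisions."""
    return rng.uniform(0.0, 0.5)     # seconds of jitter (assumed window)

seen = {}
assert should_rebroadcast(seen, 42, 1, sent_at=100, now=110)       # fresh
assert not should_rebroadcast(seen, 42, 1, sent_at=100, now=111)   # duplicate
assert should_rebroadcast(seen, 42, 2, sent_at=120, now=125)       # newer seq
assert not should_rebroadcast(seen, 42, 3, sent_at=0, now=100000)  # expired
```

Tracking only the highest sequence number per user also handles the moving-person problem: an older coordinate packet arriving late can never overwrite a newer one.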
Accomplishments that we're proud of
AEGIS demonstrates that science, communication, and AI can work together even when infrastructure fails. What we achieved goes beyond a prototype: it is a functional emergency ecosystem.

Technical & Scientific Achievements

- True offline emergency communication: AEGIS works without cell towers, WiFi, or internet, using only Bluetooth mesh networking. This directly solves the communication breakdown that occurs in real disasters.
- Ultra-efficient protocol design: we engineered a compressed emergency packet that transmits location, identity, status, and timestamps in just 20 bytes, fitting within Bluetooth's strict limits.
- Background relay network: phones can rebroadcast SOS signals even while locked and in pockets, turning bystanders into a distributed rescue infrastructure.
- Smart emergency activation: the system switches to emergency mode automatically by monitoring disaster APIs, ensuring the network activates only when truly needed.
- Battery-optimized mesh operation: through intelligent scan/sleep cycling, AEGIS can run 24+ hours in emergency mode while maintaining connectivity.

🤖 AI & Data Intelligence

- End-to-end machine learning pipeline built on real satellite and environmental data
- 98% wildfire prediction accuracy using an optimized XGBoost classifier
- Real-time scientific dashboard capable of local predictions, risk visualization, and route analysis
- GenAI-powered disaster assistant with spatial reasoning and summarization for actionable insights
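The scan/sleep duty-cycling mentioned above trades detection latency for battery life. A back-of-the-envelope sketch of that trade-off follows; every number here (battery capacity, scan and idle power draw, cycle lengths) is an illustrative assumption, not a measurement from AEGIS.

```python
# Back-of-the-envelope duty-cycle estimate for BLE scan/sleep cycling.
# All figures below are illustrative assumptions, not measured values.
BATTERY_MWH = 4000 * 3.7          # ~4000 mAh phone battery at 3.7 V -> mWh
SCAN_MW, SLEEP_MW = 300.0, 5.0    # assumed draw while scanning vs idle

def runtime_hours(scan_s, sleep_s):
    """Hours of emergency-mode operation for a given scan/sleep cycle."""
    duty = scan_s / (scan_s + sleep_s)          # fraction of time scanning
    avg_mw = duty * SCAN_MW + (1 - duty) * SLEEP_MW
    return BATTERY_MWH / avg_mw

# Scanning 2 s out of every 30 s keeps average draw far below
# continuous scanning, stretching runtime well past a day.
assert runtime_hours(scan_s=2, sleep_s=28) > 24
```

Under these assumed numbers, cycling beats continuous scanning by an order of magnitude, which is the intuition behind the 24+ hour emergency-mode claim.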
What we learned
Building AEGIS changed how we understand both technology and responsibility. What began as a technical challenge became a lesson in designing for real human emergencies.

Engineering & Systems

We expected Bluetooth range and battery drain to be our hardest problems, but those were solvable. What truly challenged us were operating system restrictions created to protect users, yet unintentionally blocking emergency communication. It showed us that systems are not always designed for crisis scenarios, and that innovation often means working around the rules. We also learned that:

- Cross-platform does not mean identical: Android and iOS implement BLE so differently that we essentially built two systems in one. "Standards" behave very differently in real environments.
- Concepts from distributed systems, such as loop prevention, network partitioning, and eventual consistency, are not abstract. In disasters, they decide whether a message lives or dies.

Data, AI & Science

Through Earth-Lens and FireSight, we learned how to:

- work with remote sensing data for real-world ML
- merge geospatial sources (MODIS, Sentinel-2, ERA5)
- build deployable web apps with Flask and GenAI agents
- understand the importance of spatial labeling and preprocessing quality
- balance model accuracy with real-time performance
- apply responsible AI principles for transparency and fairness

Human-Centered Design

The most powerful lesson was this: design for someone's worst moment. A person trapped after an earthquake cannot debug an app. They need one tap, zero confusion, and instant help. That principle reshaped our interface, logic, and architecture, and made AEGIS not just functional, but meaningful.
What's next for AEGIS
We plan to transform AEGIS from a hackathon prototype into a real-world disaster response platform:

- Integrate with fire departments and emergency agencies so AEGIS complements existing response systems.
- Deploy wildfire prediction on Raspberry Pi and edge devices for offline use in remote areas.
- Expand Earth data sources and improve AI accuracy for localized alerts.
- Build dedicated low-power relay hardware to strengthen the mesh network.
- Grow a community emergency mesh, where every phone becomes a potential rescue node.
- Add voice-based tools and accessibility features for field responders.

AEGIS's next step is simple: from science to survival, at scale.