🌟 Aura: Accessible Urban Route Assistant
Inspiration
Ever since Rohit tried to navigate his grandma's wheelchair through downtown Chicago and ended up in three construction zones, a fountain, and somehow inside a Subway (the sandwich kind, not the transit), we knew something had to change. Meanwhile, Skanda was watching his friend with a walker get directions from Google Maps that included "just hop over this small fence" and "take the stairs for a shortcut." This is unacceptable.
We realized that current navigation apps treat accessibility like that friend who says they're "totally cool with whatever" for dinner but then complains about every restaurant option. They slap a wheelchair icon on the map and call it a day. That’s like putting a “vegan-friendly” sticker on a steakhouse because they have lettuce on their burgers.
Since we can't personally escort everyone through the urban jungle, we built the next best thing: an AI that actually understands that stairs are basically kryptonite to wheelchairs, and that "just squeeze through" or “climb two flights of stairs” aren't valid navigation instructions for someone with a mobility aid.
What it does / Project Description
Your phone becomes an accessibility oracle — but instead of cryptic prophecies, it gives you practical, reliable directions.
Aura plans obstacle-aware routes for wheelchair, walker, and cane users: it knows where the curb cuts, ramps, stairs, and construction zones are, scores every path for accessibility, and reroutes around barriers as new reports come in. No more "proceed to the route" when you're already lost, and no more shortcuts through someone's hedge.
How AI Powers Aura
At the heart of Aura is AI — not the kind that promises to “revolutionize humanity” and then generates blurry pictures of cats with 7 legs, but the kind that actually makes life easier.
We use a combination of machine learning models and OpenAI’s API to interpret messy, inconsistent accessibility data and turn it into useful routing guidance. Cities don’t exactly hand us a neat spreadsheet of every curb cut, ramp slope, and construction barrier. Instead, data comes in as scattered reports, government PDFs, and community submissions written in everything from perfect English to “my cousin says there’s a ramp by the taco place.”
Here’s how Aura makes sense of it:
- Natural Language Processing (NLP): OpenAI’s language models parse community reports and city data, extracting structured information like “stairs present,” “curb cut missing,” or “temporary ramp installed.” (A minimal sketch of this step follows the list.)
- Accessibility Scoring AI: Our custom ML model evaluates the severity of obstacles and assigns a risk level. A pothole might lower a score slightly, while a staircase with no ramp is an automatic route blocker.
- Route Optimization with AI Heuristics: Instead of just calculating the shortest path, Aura applies accessibility-aware heuristics. The AI learns which barriers are show-stoppers versus inconveniences and reroutes accordingly. (See the routing sketch below.)
- Contextual Adaptation: By analyzing patterns in reports, Aura can predict likely accessibility issues (e.g., “if there’s construction on Main St, the adjacent alley probably has a blocked sidewalk too”).
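For the curious, here’s a minimal sketch of what the NLP extraction step can look like, assuming the `openai` Python client. The model name, prompt wording, and example report are illustrative placeholders, not Aura’s production values:

```python
# Minimal sketch of the NLP extraction step (illustrative only).
# The model choice, prompt, and example report are placeholders,
# not Aura's actual production configuration.
import json
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

EXTRACTION_PROMPT = (
    "Extract accessibility obstacles from this report as JSON with keys "
    "'obstacle' (e.g. 'stairs', 'missing_curb_cut', 'temporary_ramp'), "
    "'location', and 'severity' (one of 'low', 'medium', 'blocker'). "
    "Report: {report}"
)

def extract_obstacles(report: str) -> dict:
    """Turn a free-text community report into structured obstacle data."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[{"role": "user",
                   "content": EXTRACTION_PROMPT.format(report=report)}],
        response_format={"type": "json_object"},  # force parseable JSON
    )
    return json.loads(response.choices[0].message.content)

print(extract_obstacles("my cousin says there's a ramp by the taco place"))
```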
In short: AI is Aura’s translator, detective, and problem-solver — the layer that transforms chaotic urban data into directions people can trust.
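And to make the scoring and routing bullets concrete, here’s a hedged sketch of the idea using NetworkX’s A*. The toy graph, severity penalties, and coordinates are invented for illustration; Aura’s real edge weights come from the scoring model described above:

```python
# Sketch of accessibility-aware A* routing with NetworkX.
# The graph, penalty values, and coordinates are invented for
# illustration; they are not Aura's real data or weights.
import networkx as nx

# Severity -> extra cost in meters; "blocker" makes an edge unusable.
PENALTY = {"low": 25.0, "medium": 100.0, "blocker": float("inf")}

G = nx.Graph()
G.add_node("a", pos=(0.0, 0.0))
G.add_node("b", pos=(0.0, 1.0))
G.add_node("c", pos=(1.0, 1.0))
G.add_edge("a", "b", length=120.0, severity="blocker")  # stairs, no ramp
G.add_edge("a", "c", length=150.0, severity="low")      # rough pavement
G.add_edge("c", "b", length=90.0, severity="medium")    # narrow sidewalk

def accessible_weight(u, v, data):
    """Edge cost = physical length + accessibility penalty.

    Returning None tells NetworkX to treat the edge as hidden,
    which is how blockers become hard route exclusions.
    """
    penalty = PENALTY[data["severity"]]
    return None if penalty == float("inf") else data["length"] + penalty

def straight_line(u, v):
    """Admissible A* heuristic: straight-line distance between nodes."""
    (x1, y1), (x2, y2) = G.nodes[u]["pos"], G.nodes[v]["pos"]
    return ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5

# The direct a->b edge is blocked, so A* returns ['a', 'c', 'b'].
print(nx.astar_path(G, "a", "b", heuristic=straight_line,
                    weight=accessible_weight))
```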
Why current navigation apps fall short
They Think Everyone is a Ninja Warrior Contestant
Current apps assume you can scale walls, leap tall curbs, and phase through construction barriers. They treat sidewalks like suggestions and stairs like minor inconveniences.
- **Their approach:** "Turn left at the intersection."
- **Reality:** That intersection has no curb cuts, three stairs, and a sidewalk that ends in someone's hedge.
- **Their accessibility mode:** Add 5 minutes to the route and hope for the best.
- **Our approach:** Actually know where the curb cuts are.
Fun fact (by our thoroughly unscientific count): 73% of "accessible" routes on major navigation apps include at least one insurmountable obstacle. The other 27% just route you in a circle back to where you started.
Independence
At its core, Aura isn’t just about directions — it’s about independence.
For someone using a wheelchair, walker, or cane, a single wrong turn can turn a simple coffee run into a rescue mission. Aura restores confidence by giving people the ability to move through their cities without constantly relying on others to scout ahead or “just carry them up the stairs.”
Independence means being able to choose your own path, make spontaneous decisions, and explore without fear of being stranded at an inaccessible curb. With Aura, mobility aid users don’t just get from point A to point B — they regain the freedom to travel on their own terms.
Technologies Used
| Component | Tech Stack |
|---|---|
| Backend API | Python, FastAPI, Uvicorn |
| Frontend UI | Next.js, React, TypeScript, Tailwind CSS |
| Map Integration | Leaflet, Mapbox GL JS, OpenStreetMap |
| Routing Algorithms | NetworkX, Road Network Graph Analysis, A* with accessibility heuristics |
| Real-time Data | Obstacle detection, Community reporting, Live construction updates |
| Database | SQLite |
| Geocoding | Nominatim |
| Accessibility Scoring | Custom ML model |
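As a rough illustration of how these pieces connect (not Aura’s actual code), a FastAPI endpoint might geocode the endpoints with Nominatim via the `geopy` client and hand off to the router. The endpoint shape, parameter names, and response fields below are all hypothetical:

```python
# Hypothetical sketch of how the backend pieces fit together; the
# endpoint shape, names, and response fields are illustrative only.
from fastapi import FastAPI, HTTPException
from geopy.geocoders import Nominatim

app = FastAPI()
geocoder = Nominatim(user_agent="aura-demo")  # Nominatim requires a user agent

@app.get("/route")
def route(start: str, end: str):
    """Geocode two addresses and return an accessibility-aware route."""
    a, b = geocoder.geocode(start), geocoder.geocode(end)
    if a is None or b is None:
        raise HTTPException(status_code=404, detail="Address not found")
    return {
        "start": (a.latitude, a.longitude),
        "end": (b.latitude, b.longitude),
        # A real implementation would call the A* search sketched above:
        # "path": find_accessible_path(start_coords, end_coords),
    }
```

Assuming the file is saved as `main.py`, this runs locally with `uvicorn main:app --reload`.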
Challenges we ran into
- Ran into Mapbox’s free tier limits earlier than expected after extensive routing tests.
- Discovered inconsistencies between APIs in defining “wheelchair accessible,” which created edge cases in routing.
- Spent significant time troubleshooting Leaflet issues, which ultimately traced back to a repeated typo in “longitude.”
- Found that our early accessibility scoring algorithm incorrectly rated unsafe routes highly, including one that went through hazardous terrain.
Accomplishments that we’re proud of
- Developed three routing engines: Mapbox, OpenStreetMap, and a custom-built solution for added flexibility.
- Designed a smooth, responsive route display with optimized rendering performance.
- Implemented real-time obstacle reporting without introducing unnecessary complexity.
- Refined an accessibility scoring system so that it reflects actual navigational safety and usability.
- Built a reliable notification system that balances timeliness with user experience.
- Successfully made government building data relevant and functional for accessibility routing.
What we learned
Working on Aura taught us that accessible routing is inherently complex due to inconsistent infrastructure data, unpredictable urban layouts, and varying accessibility standards.
We also deepened our understanding of handling asynchronous operations, using TypeScript for maintainability, and mitigating common JavaScript pitfalls. Debugging required a mix of logging, code review, and validation testing — and we learned that “it works on my machine” is never a guarantee in production.
What’s next for Aura
- Crowd-sourced obstacle validation to improve accuracy over time.
- Weather-aware routing to account for environmental hazards.
- Public transit integration to offer multi-modal accessible travel.
- Voice navigation to improve usability for people with visual impairments.
- AR navigation to preview obstacles in real time.
- Predictive construction detection using machine learning to flag likely disruptions.
- Integration with city services for more efficient issue reporting.
We’ll also be continuing to improve our development tools and data handling — and maybe, just maybe, get a better coffee machine.
Built With
- a*
- css
- fastapi
- leaflet.js
- mapbox
- networkx
- next.js
- openstreetmap
- python
- react
- sqlite
- tailwind
- typescript
- uvicorn