Inspiration
We've all been there — twenty tasks on the list, no idea where to start, and somehow forty minutes have passed on Instagram. For neurodivergent people, this kind of task paralysis isn't just procrastination. It's a nervous system response, and it's incredibly hard to break out of from the inside.
System Interrupt was built from that lived experience. I wanted something that required almost zero activation energy to use — because when you're stuck, even opening an app feels like too much. The idea of a single shake as the trigger came from that: meet people exactly where they are.
What it does
System Interrupt is a mobile app that breaks the paralysis cycle in three steps:
- Trigger — shake your phone or tap one button to activate
- Ground — a guided breathing animation, paced with gentle haptic vibrations, engages your parasympathetic nervous system and brings you back into your body
- Coach — an AI powered by Claude looks at your task list and gives you one small, concrete next action to take right now
Just one next step towards breaking out of task paralysis.
How I built it
Built with React Native and Expo for cross-platform mobile support. Navigation is handled by React Navigation, and shake detection uses expo-sensors — listening to accelerometer data and triggering the interrupt when the force threshold is crossed.
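The shake trigger boils down to one pure decision: does an accelerometer sample exceed a force threshold? A minimal sketch of that logic, with the expo-sensors wiring shown in comments (the helper names `isShake` and `triggerInterrupt` are my own, not from the project):

```javascript
// Decide whether an accelerometer sample counts as a shake.
// expo-sensors reports acceleration in g-units, so a phone at rest
// has a vector magnitude of roughly 1 (gravity alone).
function isShake({ x, y, z }, threshold = 1.8) {
  const magnitude = Math.sqrt(x * x + y * y + z * z);
  return magnitude > threshold;
}

// Wiring sketch — Accelerometer.addListener and setUpdateInterval
// are the real expo-sensors API; triggerInterrupt is hypothetical:
//
// import { Accelerometer } from 'expo-sensors';
// Accelerometer.setUpdateInterval(100); // sample every 100 ms
// const sub = Accelerometer.addListener((sample) => {
//   if (isShake(sample)) triggerInterrupt();
// });
// // on unmount: sub.remove();
```

Keeping the threshold as a parameter is what makes the "customizable shake sensitivity" item in the roadmap a one-line change.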
The AI coaching screen calls the Anthropic Claude API, sending the user's task dump and a carefully designed prompt that asks Claude to respond with a single gentle, encouraging next step. The prompt is tuned specifically to avoid overwhelming language and to meet the user with warmth.
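The request itself is mostly prompt construction. A sketch of how the task dump might be folded into a single short prompt (the `buildCoachPrompt` helper and the exact wording are my own assumptions, not the project's tuned prompt):

```javascript
// Hypothetical prompt builder: turn the user's task dump into one
// request that asks for a single gentle next step, nothing more.
function buildCoachPrompt(tasks) {
  return [
    'Here is everything on my plate right now:',
    ...tasks.map((t) => `- ${t}`),
    '',
    'Reply with ONE small, concrete action I can take in the next',
    'five minutes. Be warm and brief. No lists, no pressure.',
  ].join('\n');
}

// Call sketch using the official @anthropic-ai/sdk
// (client.messages.create is the real API; the model name and
// token budget here are assumptions):
//
// import Anthropic from '@anthropic-ai/sdk';
// const client = new Anthropic();
// const msg = await client.messages.create({
//   model: 'claude-3-5-haiku-latest', // small model for low latency
//   max_tokens: 150,                  // short reply = fast response
//   messages: [{ role: 'user', content: buildCoachPrompt(tasks) }],
// });
```

A small `max_tokens` budget doubles as a guardrail: it keeps latency down and makes it structurally impossible for the coach to answer with an overwhelming wall of text.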
Challenges I ran into
- Getting the shake detection threshold right — too sensitive and it triggers by accident, too high and it never fires
- Wiring `ShakeWatcher` inside the `NavigationContainer` so it had access to the navigation context
- Tunnel connectivity with Expo Go during the hackathon
- Keeping the Claude API prompt short enough to get a fast response while still getting warm, useful output
Accomplishments that I'm proud of
- The full three-screen flow works end to end — shake to breathe to AI coach
- The Claude integration actually gives genuinely helpful, non-overwhelming responses
- The app is usable in the exact moment it's designed for: low friction, fast, calm
What I learned
- How to use `expo-sensors` for gesture-based triggers beyond touch
- How to prompt an LLM for emotional tone, not just information
- That the hardest UX problems aren't technical — they're about reducing friction at exactly the right moment
What's next for System Interrupt
- Calming generative visuals and optional ambient sound on the breathing screen
- Gentle yoga or movement prompts as an optional step between breathing and coaching
- Persistent task list so users don't have to re-enter tasks each session
- Customizable shake sensitivity
- Offline mode so it works without a connection
Built With
- Claude API (Anthropic)
- Expo
- JavaScript
- React Native
- React Navigation
- expo-sensors