Inspiration
We wanted to combine our current passions. We had all found a newfound interest in running, and by intertwining it with our interest in AI, we came up with an app that gives you real-time coaching during your runs. Sometimes you just need that little bit of extra motivation, or that little push to give an extra 1% and make your training more rewarding. StrideSide is both your workout buddy and your coach, and it'll help you break records.
What it does
StrideSide is here to accelerate your progression as a runner. Whether you're training for a marathon or just keeping your health in check, StrideSide is your coach for any goal. All you need to do is start your session, and using the metrics your smartwatch tracks, StrideSide will give you instant feedback based on your heart rate, pace, cadence, distance, and goals.
How we built it
In Apple's Xcode IDE, we used Swift and SwiftUI to build this project from the ground up.
Two components work in parallel: an app on Apple Watch and an app on iPhone. The watch app collects biometrics and run data (e.g. heart rate, cadence, distance, pace) and streams them in real time to the iPhone app; a simplified sketch of this streaming appears at the end of this section. The watch app also has a button for instant AI feedback that the runner can press at any time. The iPhone app has two main features:
- Using the real-time data pushed from the watch app, it runs rule-based matching against a set of event scenarios (e.g. cadence drops sharply, or heart rate rises above 85% of max). When a rule is matched, it calls Gemini for concise feedback tailored to that scenario (e.g. “HR is skyrocketing above Lactate Threshold, let’s bring it down. Remember, your goal for today’s run is a long slow distance run”), sends the feedback sentence to the ElevenLabs API to get voice audio, and plays it back to the user. A minimal sketch of this pipeline appears after this list.
- When the user taps the “Instant” button for live AI feedback, the app calls the Gemini API right away for a concise analysis of the run so far, based on the goal set by the user, then sends the returned feedback to ElevenLabs voice AI and plays the resulting audio back to the user.
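To make the rule-based trigger concrete, here is a minimal Swift sketch of the kind of matching logic described in the first bullet. All of the names (RunSnapshot, FeedbackRule, RuleEngine, coachingLine, speak) are illustrative placeholders rather than our actual code, and the Gemini and ElevenLabs calls are stubbed out.

```swift
import Foundation

// One sample of the metrics streamed from the watch (illustrative shape).
struct RunSnapshot {
    let heartRate: Double   // beats per minute
    let cadence: Double     // steps per minute
    let distance: Double    // metres covered so far
}

// A rule pairs a trigger condition with a short scenario label that is
// handed to the LLM as context when the rule fires.
struct FeedbackRule {
    let scenario: String
    let matches: (RunSnapshot, RunSnapshot) -> Bool   // (previous, current)
}

// Example rules mirroring the scenarios described above.
func makeRules(maxHeartRate: Double) -> [FeedbackRule] {
    [
        FeedbackRule(scenario: "Heart rate above 85% of max",
                     matches: { _, now in now.heartRate > 0.85 * maxHeartRate }),
        FeedbackRule(scenario: "Cadence dropped sharply",
                     matches: { previous, now in
                         previous.cadence > 0 && now.cadence < 0.85 * previous.cadence
                     }),
    ]
}

// Placeholder hooks: in the app, this is where Gemini would be asked for a
// coaching sentence and ElevenLabs would turn it into spoken audio.
func coachingLine(for scenario: String, snapshot: RunSnapshot) async -> String {
    "\(scenario) at \(Int(snapshot.heartRate)) bpm. Remember today's goal."
}

func speak(_ line: String) async {
    print("🔊 \(line)")   // stand-in for playing the returned audio
}

// Evaluates the rules whenever a new snapshot arrives from the watch, with a
// simple cooldown so the coach doesn't interrupt the runner constantly.
actor RuleEngine {
    private let rules: [FeedbackRule]
    private var previousSnapshot: RunSnapshot?
    private var lastSpoke = Date.distantPast
    private let cooldown: TimeInterval = 60

    init(rules: [FeedbackRule]) { self.rules = rules }

    func handle(_ snapshot: RunSnapshot) async {
        defer { previousSnapshot = snapshot }
        guard let previous = previousSnapshot,
              Date().timeIntervalSince(lastSpoke) > cooldown,
              let rule = rules.first(where: { $0.matches(previous, snapshot) })
        else { return }
        lastSpoke = Date()
        let line = await coachingLine(for: rule.scenario, snapshot: snapshot)
        await speak(line)
    }
}
```

A cooldown like this is one simple way to keep the coach from talking over itself when several rules fire back to back.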
The app also lets the user set run goals (distance, run type, etc.) and choose the AI feedback tone.
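For the watch side described at the start of this section, the HealthKit live workout and WatchConnectivity wiring boils down to something like the heavily simplified sketch below. The MetricsStreamer class, its structure, and the message keys are ours for illustration only; HealthKit authorization, error handling, and the UI are omitted.

```swift
import HealthKit
import WatchConnectivity

// Simplified watch-side streamer: reads live workout statistics from
// HealthKit and forwards a small dictionary to the paired iPhone.
final class MetricsStreamer: NSObject, HKLiveWorkoutBuilderDelegate, WCSessionDelegate {
    private let healthStore = HKHealthStore()
    private var workoutSession: HKWorkoutSession?
    private var builder: HKLiveWorkoutBuilder?

    func start() throws {
        // Activate WatchConnectivity so samples can be pushed to the phone.
        WCSession.default.delegate = self
        WCSession.default.activate()

        // Configure and start an outdoor running workout.
        let config = HKWorkoutConfiguration()
        config.activityType = .running
        config.locationType = .outdoor

        let session = try HKWorkoutSession(healthStore: healthStore, configuration: config)
        let builder = session.associatedWorkoutBuilder()
        builder.dataSource = HKLiveWorkoutDataSource(healthStore: healthStore,
                                                     workoutConfiguration: config)
        builder.delegate = self

        session.startActivity(with: Date())
        builder.beginCollection(withStart: Date()) { _, _ in }
        self.workoutSession = session
        self.builder = builder
    }

    // Called by HealthKit whenever new live data arrives.
    func workoutBuilder(_ workoutBuilder: HKLiveWorkoutBuilder,
                        didCollectDataOf collectedTypes: Set<HKSampleType>) {
        let bpm = HKUnit.count().unitDivided(by: .minute())
        guard let heartRateType = HKQuantityType.quantityType(forIdentifier: .heartRate),
              let distanceType = HKQuantityType.quantityType(forIdentifier: .distanceWalkingRunning),
              let heartRate = workoutBuilder.statistics(for: heartRateType)?
                  .mostRecentQuantity()?
                  .doubleValue(for: bpm)
        else { return }

        let distance = workoutBuilder.statistics(for: distanceType)?
            .sumQuantity()?
            .doubleValue(for: .meter()) ?? 0

        // Best-effort, low-latency push to the iPhone app.
        WCSession.default.sendMessage(["heartRate": heartRate, "distance": distance],
                                      replyHandler: nil,
                                      errorHandler: nil)
    }

    func workoutBuilderDidCollectEvent(_ workoutBuilder: HKLiveWorkoutBuilder) {}

    // Minimal WCSessionDelegate conformance (watchOS only requires this method).
    func session(_ session: WCSession,
                 activationDidCompleteWith activationState: WCSessionActivationState,
                 error: Error?) {}
}
```

On the phone, the counterpart WCSessionDelegate method session(_:didReceiveMessage:) receives these dictionaries and can feed them into rule matching like the sketch above.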
Challenges we ran into
We faced major challenges with Apple Watch debugging, device pairing, code signing, and OS/Xcode compatibility. Handling real-time data flow, concurrency, and ensuring reliable watch-to-phone communication also gave us a lot of headaches.
Accomplishments that we're proud of
The entire project creation process was incredibly rewarding for us, from ideation to designing the project's architecture, coding the logic, and connecting all the moving parts. We were able to bring StrideSide to life in just 24 hours, and that, in and of itself, is a huge accomplishment in our eyes.
What we learned
We deepened our understanding of development within the Apple ecosystem, using Xcode, Swift, and SwiftUI, and gained hands-on experience with watchOS development, HealthKit live workouts, WatchConnectivity, Swift concurrency, and Apple’s signing and debugging tooling. We also learned how to design AI feedback that is useful, concise, and non-disruptive during physical activity.
What's next for StrideSide
There are endless features that we could add to StrideSide: not just API-based LLM feedback, but a locally hosted LLM to reduce latency and rate-limiting issues. In addition, we'd like to add options that let users personalize the app to their artistic tastes. Lastly, as a performance enhancer, we'd like to include generated music, for example via ElevenLabs, that matches your heart rate or cadence as you run to help you reach a flow state.
