🐾 Inspiration
Every pet owner knows the fear of not understanding their furry friend’s pain—and every vet knows the challenge of managing time, communication, and documentation. We wanted to bridge that emotional and operational gap with TailMate: a full-stack, AI-powered veterinary assistant that listens, understands, visualizes, and even speaks on behalf of pet owners—bringing futuristic pet care into the now.
🧠 What it does
TailMate is an intelligent, voice-first assistant that transforms veterinary experiences with:
- Secure login and role-based dashboards
- NFC-like proximity-based transcription between doctor and owner
- Chat with AI (voice/text) powered by GPT & Claude, contextualized with vector memory (Weaviate); see the sketch after this list
- Emergency alerting to doctors via email if critical symptoms are detected
- Smart graph visualizations to answer questions visually
- Image-based injury detection using computer vision
- Live appointment scheduling calls handled by real-time, two-way conversational agents
- To-do updates and profile metadata generated from doctor-patient conversations
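To make the vector-memory chat concrete, here is a minimal sketch of how a pet-specific answer can be assembled. This is not our exact production code: the `PetMemory` collection, its `pet_id`/`text` properties, and the model name are illustrative, and `near_text` assumes a vectorizer module is configured on the collection.

```python
import weaviate
from weaviate.classes.query import Filter
from openai import OpenAI

oai = OpenAI()

def answer_with_memory(pet_id: str, question: str) -> str:
    """Answer a question using semantic memories stored for this pet (sketch)."""
    with weaviate.connect_to_local() as client:
        memories = client.collections.get("PetMemory")  # illustrative collection name
        # Semantic search over past visits/conversations, scoped to this pet
        results = memories.query.near_text(
            query=question,
            limit=3,
            filters=Filter.by_property("pet_id").equal(pet_id),
        )
        context = "\n".join(obj.properties["text"] for obj in results.objects)

    reply = oai.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[
            {"role": "system",
             "content": f"You are a veterinary assistant. Known context for pet {pet_id}:\n{context}"},
            {"role": "user", "content": question},
        ],
    )
    return reply.choices[0].message.content
```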
🛠️ How we built it
- Frontend: React JS + Ant Design Mobile for a polished, iOS-style experience
- Backend: FastAPI, WebSocket, OpenAI, Claude, and Weaviate vector DB
- Speech: Web Speech API for live recognition & synthesis
- CV: Pre-trained ML model to detect pet injuries from images
- Real-time AI calls: WebSocket-based voice conversation agent that acts like a human on the phone (see the sketch after this list)
- Database: MongoDB for metadata, Weaviate for semantic search
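As a flavor of how the call agent loop works, here is a stripped-down sketch: the browser sends Web Speech API transcripts over a WebSocket and speaks the returned text, while the server keeps the conversation history and asks the LLM for the next utterance. The route name, model, and prompt are illustrative assumptions, not the exact production code.

```python
from fastapi import FastAPI, WebSocket
from openai import AsyncOpenAI

app = FastAPI()
oai = AsyncOpenAI()

@app.websocket("/ws/call-agent")  # illustrative route
async def call_agent(ws: WebSocket):
    await ws.accept()
    history = [{"role": "system",
                "content": "You are a friendly agent booking a vet appointment by phone."}]
    while True:
        user_turn = await ws.receive_text()   # one recognized-speech chunk per turn
        history.append({"role": "user", "content": user_turn})
        reply = await oai.chat.completions.create(model="gpt-4o-mini", messages=history)
        text = reply.choices[0].message.content
        history.append({"role": "assistant", "content": text})
        await ws.send_text(text)              # client synthesizes this with speechSynthesis
```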
🚧 Challenges we ran into
- Making real-time voice interaction feel natural and responsive
- Ensuring contextual accuracy across GPT and Claude using vector search (see the sketch after this list)
- Handling audio stream-to-response timing in AI-human calls
- Balancing mobile UI polish with complex flows like proximity-triggered features
- Integrating multiple AI modalities: chat, CV, voice, vector DBs, and alerts into one seamless product
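Roughly, the shared-context approach: every transcript chunk is written once to a single Weaviate collection, and prompts for both GPT and Claude are assembled from the same retrieved results, so the two models see identical context. A minimal ingest sketch follows; the collection/property names and the vectorizer choice are assumptions, not the exact production schema.

```python
import weaviate
from weaviate.classes.config import Configure, Property, DataType

with weaviate.connect_to_local() as client:
    # Create the shared memory collection once (illustrative schema)
    if not client.collections.exists("PetMemory"):
        client.collections.create(
            "PetMemory",
            vectorizer_config=Configure.Vectorizer.text2vec_openai(),
            properties=[
                Property(name="pet_id", data_type=DataType.TEXT),
                Property(name="text", data_type=DataType.TEXT),
            ],
        )
    memories = client.collections.get("PetMemory")
    # Each doctor-owner transcript chunk becomes one retrievable memory
    memories.data.insert({
        "pet_id": "pet-42",
        "text": "Owner reports limping on the left hind leg since Tuesday.",
    })
```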
🏆 Accomplishments that we're proud of
- Built a clinic-level AI experience that fits in your pocket
- Successfully orchestrated GPT + Claude + Weaviate + CV in real-time
- Simulated phone calls with AI agents that actually talk like humans
- Created a live transcription system that updates todos and alerts dynamically
- Pulled off a multi-agent AI architecture within one weekend!
📚 What we learned
- How to leverage multi-modal AI (voice, vision, text) for real-world use cases
- The power of vector databases in personalizing LLM responses
- How to orchestrate agentic workflows using real-time user feedback
- Importance of thoughtful UX when blending AI with human experiences
🚀 What's next for TailMate
- Integrate live video diagnostics for remote checkups
- Fine-tune our injury detection model on diverse breeds
- Build a progress tracker for treatment plans
- Add multi-pet management and doctor-patient chat history
- Launch on the App Store as the first AI-native vet companion
Things we emphasized:
- Code quality - even with a product this large, we kept a consistent focus on clean, well-structured code
- Technical complexity and depth - the product covers nearly everything pet health care needs, and the breadth of the stack (LLMs, vector search, CV, real-time voice) reflects the depth of the project
- Demo - we are keen to showcase the full power of TailMate; the demo runs long, but the size of the product warrants it
With TailMate, the future of pet care isn’t just smarter. It’s already here. 🐶🐱💡
Built With
- ant
- claude
- fastapi
- mongodb
- openai
- python
- react
- weaviate
- websockets