MedAR: An Immersive AR Surgical Experience

Inspiration

As of 2025, medical students are expected to take part in around 80 surgeries over the course of medical school. Once they enter residency, the American College of Surgeons (ACS) mandates a minimum of 850 operative procedures over five years, with at least 200 completed as chief resident. Yet observation alone doesn't build the muscle memory or spatial awareness needed for confident surgical performance. MedAR was created for this very reason: to bridge theoretical learning and real-time immersive surgical experience through AR and AI technology.

What It Does

MedAR is an interactive augmented reality (AR) application designed to:

  • Convert traditional MRI medical scans into detailed 3D immersive models
  • Provide a hands-on surgical experience, using AR to perform complex procedures like brain tumor resection
  • Offer anatomical walkthroughs of complex organs like the brain, exploring the function and importance of each part

How We Built It

  • Built the application in SwiftUI and integrated ARKit to harness the LiDAR capabilities of iOS devices
  • Created 3D models in Blender from sample MRI brain scans
  • Exported the models with applied textures (which required workarounds due to Blender's export limitations)
  • Embedded the models in the SwiftUI environment and rendered them in AR
  • Enabled users to scale, rotate, and explore the models directly from their devices
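The embedding step above can be sketched with RealityKit, ARKit's companion framework for rendering in SwiftUI apps. This is a minimal sketch, not the project's actual code; the asset name "BrainModel" is a hypothetical stand-in for a Blender-exported USDZ file bundled with the app.

```swift
import SwiftUI
import RealityKit

// Minimal sketch: load a textured model exported from Blender and place it
// in an AR scene with built-in scale and rotation gestures.
struct BrainARView: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)

        // "BrainModel" is a hypothetical USDZ asset name.
        if let model = try? Entity.loadModel(named: "BrainModel") {
            // Anchor the model on a detected horizontal surface
            // (plane detection is LiDAR-assisted on supported devices).
            let anchor = AnchorEntity(plane: .horizontal)
            anchor.addChild(model)
            arView.scene.addAnchor(anchor)

            // Collision shapes are required before gestures can hit-test the model.
            model.generateCollisionShapes(recursive: true)
            arView.installGestures([.scale, .rotation], for: model)
        }
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {}
}
```

A view like this can then be dropped into a SwiftUI hierarchy like any other view, which is what makes the SwiftUI + ARKit pairing convenient for rapid prototyping.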

Challenges We Faced

  • Understanding and troubleshooting SwiftUI and ARKit for real-time interaction
  • Adapting MRI scan data into Apple-compatible 3D models
  • Managing Blender export limitations, especially around texture mapping
  • Maintaining high fidelity under strict real-time AR rendering constraints

Accomplishments We’re Proud Of

  • Successfully interacting with AR models in real time using iOS LiDAR
  • Reconstructing and isolating tumors from brain anatomy in 3D space
  • Seamlessly merging multiple tools: Blender, SwiftUI, and ARKit
  • Rendering and exporting textured 3D models from Blender for use on iOS

What We Learned

  • SwiftUI and ARKit integration for real-world AR interaction
  • Fundamentals of 3D modeling in Blender, including export strategies
  • Techniques in medical image processing and segmentation
  • Effective collaboration under time pressure and iterative problem solving

What’s Next for MedAR

  • Integrate real-time 2D-to-3D reconstruction of DICOM files
  • Expand the library to support other surgeries and organ systems
  • Add tools like gesture-based dissection, annotations, and layer toggling
  • Introduce multi-user collaborative AR environments
  • Integrate AI for surgical path recommendations and model segmentation

Built With

SwiftUI, ARKit, Blender