Inspiration

Stinger-V was inspired by the renewed global interest in returning to the Moon and the upcoming Artemis missions. As humanity prepares to establish a permanent lunar presence, the ability to identify, mine, and retrieve local resources becomes critical. We wanted to simulate the engineering challenges faced by real lunar rovers: navigating hazardous craters, identifying valuable regolith, and operating remotely with limited onboard processing power. Stinger-V serves as a prototype for a lightweight "scout class" rover, designed to venture into lava tubes and craters where heavier, more expensive rovers cannot go.

What it does

Stinger-V is a dual-microcontroller, WiFi-controlled mining robot that performs four key functions:

- Remote Teleoperation: An operator ("Mission Control") can drive the rover from a laptop or phone using a custom web dashboard hosted directly on the robot.
- Live FPV Exploration: It broadcasts a low-latency video feed from its front-mounted camera, allowing the operator to navigate out-of-sight obstacles as if they were sitting in the cockpit.
- Resource Extraction: Using a servo-controlled robotic arm and claw, Stinger-V can manipulate objects and retrieve samples from the environment.
- Autonomous Safety: In "Auto Mode," the rover uses ultrasonic sensors to detect crater walls or obstacles and automatically halts or redirects to prevent collisions (a minimal sketch of this check is shown after this list).
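To make the "Auto Mode" behavior concrete, here is a minimal Arduino-style sketch of the obstacle-halt check. The pin numbers, the 20 cm threshold, and the stopMotors()/applyLastDriveCommand() helpers are illustrative assumptions rather than the actual firmware.

```cpp
// Hypothetical sketch of the "Auto Mode" safety loop (pins, threshold, and helpers are assumptions).
const int TRIG_PIN = 13;              // HC-SR04 trigger pin (assumed wiring)
const int ECHO_PIN = 12;              // HC-SR04 echo pin (assumed wiring)
const float STOP_DISTANCE_CM = 20.0;  // halt if an obstacle is closer than this

float readDistanceCm() {
  // Fire a 10 us trigger pulse and time the echo to estimate range.
  digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long durationUs = pulseIn(ECHO_PIN, HIGH, 30000UL);  // 30 ms timeout (~5 m)
  return durationUs * 0.0343f / 2.0f;                  // microseconds -> cm, round trip halved
}

void stopMotors()            { /* drop the L298N enable pins (not shown) */ }
void applyLastDriveCommand() { /* re-apply the operator's most recent drive command (not shown) */ }

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
}

void loop() {
  float distance = readDistanceCm();
  if (distance > 0 && distance < STOP_DISTANCE_CM) {
    stopMotors();              // obstacle too close: halt before impact
  } else {
    applyLastDriveCommand();   // path clear: keep following Mission Control's input
  }
  delay(50);  // poll at roughly 20 Hz
}
```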

How we built it

To mimic the constraints of space travel, where every gram matters, we ditched heavy Single-Board Computers (like the Raspberry Pi) in favor of a highly efficient Dual-Microcontroller Array:

- The Brain (ESP32 Driver): We used a standard ESP32 Development Board as the central flight computer. It handles propulsion via an L298N driver, operates the robotic mining arm, and processes telemetry from ultrasonic sensors. It also hosts the asynchronous web server that powers our control dashboard (a sketch of such a drive endpoint follows this list).
- The Eyes (ESP32-CAM): We integrated a dedicated ESP32-CAM module solely for visual navigation. This unit acts as a wireless IP camera, broadcasting a low-latency MJPEG stream over the local network.
- The Network: We established a dedicated local hotspot (ashaybot) to serve as our "Deep Space Network," linking the rover's two independent brains to the ground station laptop for seamless telemetry and control without needing the internet.
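To illustrate how a dashboard command reaches the wheels, here is a minimal sketch of a drive endpoint on the main ESP32. It assumes the ESPAsyncWebServer library, a /drive route, and made-up L298N pin assignments; the real firmware's routes, pins, and credentials may differ.

```cpp
#include <WiFi.h>
#include <ESPAsyncWebServer.h>  // assumed library choice for the asynchronous web server

// Hypothetical L298N wiring: two direction pins and one PWM enable pin per side.
const int LEFT_IN1 = 26, LEFT_IN2 = 27, LEFT_EN = 25;
const int RIGHT_IN1 = 32, RIGHT_IN2 = 33, RIGHT_EN = 14;
const int MOTOR_PINS[] = {LEFT_IN1, LEFT_IN2, LEFT_EN, RIGHT_IN1, RIGHT_IN2, RIGHT_EN};

AsyncWebServer server(80);

void setMotor(int in1, int in2, int en, int speed) {
  // speed in [-255, 255]; sign selects direction, magnitude drives the PWM enable pin.
  digitalWrite(in1, speed > 0);
  digitalWrite(in2, speed < 0);
  analogWrite(en, abs(speed));  // analogWrite is available on recent arduino-esp32 cores
}

void setup() {
  for (int pin : MOTOR_PINS) pinMode(pin, OUTPUT);

  WiFi.begin("ashaybot", "********");  // join the dedicated 2.4GHz hotspot (password redacted)
  while (WiFi.status() != WL_CONNECTED) delay(100);

  // Example route: GET /drive?left=200&right=-200 spins the rover in place.
  server.on("/drive", HTTP_GET, [](AsyncWebServerRequest *request) {
    int left  = request->getParam("left")  ? request->getParam("left")->value().toInt()  : 0;
    int right = request->getParam("right") ? request->getParam("right")->value().toInt() : 0;
    setMotor(LEFT_IN1, LEFT_IN2, LEFT_EN, left);
    setMotor(RIGHT_IN1, RIGHT_IN2, RIGHT_EN, right);
    request->send(200, "text/plain", "ok");
  });
  server.begin();
}

void loop() {}  // everything is event-driven by the async server
```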

Challenges we ran into

- The Frequency Spectrum Barrier: Much like deep-space communications rely on specific bands, our ESP32 hardware could not communicate on modern 5GHz frequencies. We faced total signal loss until we reconfigured our ground station (hotspot) to operate on the legacy 2.4GHz band, restoring our link to the rover.
- Telemetry Latency: Transmitting live video from the "lunar surface" (the rover) to Earth (the laptop) initially introduced significant lag. We had to engineer a custom HTTP chunk-streaming method to keep the video feed real-time enough for precision driving (the general pattern is sketched after this list).
- Resource Scarcity: The ESP32-CAM is powerful but pin-limited. We initially tried to run the entire rover on a single chip, but motor noise and pin conflicts crashed the arm and claw servos. We learned to decouple the systems, isolating the "Vision" from the "Muscle" to create a fault-tolerant system.
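For reference, low-latency MJPEG streaming of this kind boils down to holding one HTTP connection open and pushing each JPEG frame as a part of a multipart/x-mixed-replace response. The sketch below shows that general pattern using the esp32-camera driver and a plain WiFiServer; our actual boundary string, port, and frame pacing may differ, and the camera initialization is omitted.

```cpp
#include <WiFi.h>
#include "esp_camera.h"  // esp32-camera driver bundled with the ESP32 Arduino core

WiFiServer streamServer(81);  // assumed port for the video stream

void serveMjpeg(WiFiClient &client) {
  // One long-lived response; each JPEG frame is sent as a new multipart chunk.
  client.println("HTTP/1.1 200 OK");
  client.println("Content-Type: multipart/x-mixed-replace; boundary=frame");
  client.println();

  while (client.connected()) {
    camera_fb_t *fb = esp_camera_fb_get();  // grab the latest JPEG frame
    if (!fb) break;
    client.println("--frame");
    client.println("Content-Type: image/jpeg");
    client.printf("Content-Length: %u\r\n\r\n", fb->len);
    client.write(fb->buf, fb->len);          // raw JPEG bytes
    client.println();
    esp_camera_fb_return(fb);                // hand the frame buffer back to the driver
  }
}

void setup() {
  // esp_camera_init() with the board's pin map is assumed to have been called here (omitted).
  WiFi.begin("ashaybot", "********");
  while (WiFi.status() != WL_CONNECTED) delay(100);
  streamServer.begin();
}

void loop() {
  WiFiClient client = streamServer.available();
  if (client) {
    serveMjpeg(client);  // blocks while the operator's browser stays connected
    client.stop();
  }
}
```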

Accomplishments that we're proud of

- Successfully getting two separate microcontrollers to act as a single cohesive unit over WiFi was a major win.
- Achieving near-instantaneous motor response times alongside the video feed, making the driving experience feel snappy and responsive.
- Building a control interface that works natively in a web browser means our rover can be piloted from an iPhone, Android, PC, or Mac without installing any custom apps.

What we learned

- We gained firsthand experience in the difficulty of driving a vehicle solely through a camera feed, emphasizing the need for robust sensor-assist modes.
- We learned how to handle raw MJPEG streams and embed them into custom HTML interfaces.
- Managing voltage drops when high-torque servos and DC motors engage simultaneously taught us the importance of isolating power rails.

What's next for Stinger V

- AI-Assisted Mining: Integrate computer vision (OpenCV) on the ground station to automatically detect "Gold" (yellow objects) and draw bounding boxes on the operator's screen.
- Low-Light Navigation: Enable the ESP32-CAM's onboard flash to allow navigation in dark environments (simulating lunar shadow regions).
- Telemetry Upgrades: Add battery voltage monitoring and WiFi signal strength indicators to the dashboard to warn Mission Control before the connection is lost (a rough firmware-side sketch follows this list).
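To give a sense of how small the telemetry upgrade is on the firmware side, here is a rough sketch of reporting battery voltage and WiFi signal strength from the ESP32. The ADC pin, the divider ratio, and the /telemetry route are assumptions for illustration, not implemented features.

```cpp
#include <WiFi.h>
#include <ESPAsyncWebServer.h>

AsyncWebServer server(80);

// Hypothetical wiring: battery sensed through a 1:3 resistor divider into GPIO 34 (ADC1).
const int VBAT_PIN = 34;
const float DIVIDER_RATIO = 3.0f;

float readBatteryVolts() {
  // analogReadMilliVolts applies the ESP32's factory ADC calibration (arduino-esp32 2.x+).
  return analogReadMilliVolts(VBAT_PIN) / 1000.0f * DIVIDER_RATIO;
}

void setup() {
  WiFi.begin("ashaybot", "********");
  while (WiFi.status() != WL_CONNECTED) delay(100);

  // GET /telemetry -> e.g. {"battery_v":7.42,"rssi_dbm":-58}
  server.on("/telemetry", HTTP_GET, [](AsyncWebServerRequest *request) {
    char payload[64];
    snprintf(payload, sizeof(payload), "{\"battery_v\":%.2f,\"rssi_dbm\":%d}",
             readBatteryVolts(), (int)WiFi.RSSI());
    request->send(200, "application/json", payload);
  });
  server.begin();
}

void loop() {}
```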

Built With

esp32, esp32-cam, l298n, html, wifi
