We started with: what if your stuff at home could just talk to you about how it’s feeling?

Our MVP is a plant friend. Her brain is GPT-4.1-nano, orchestrated by an NVIDIA Jetson Orin Nano connected to an Arduino MKR WiFi 1010 that reads soil moisture from a capacitive sensor. The Jetson reads that data over serial, turns it into structured JSON, and feeds it to the model, which speaks through a mic and speaker via the OpenAI Realtime API.

We wanted to explore what embedded intelligence looks like: not a chatbot in a browser tab, but a physical object that can sense its own state, reason about it, and speak.

Hardware setup:

  • Capacitive soil moisture sensor → analog out to Arduino MKR WiFi 1010
  • Arduino converts each reading to a JSON object and streams it over serial
  • Jetson Orin Nano reads the serial stream with PySerial and runs the GPT-4.1-nano pipeline (sketched after this list)
  • Mic + speaker handle voice I/O through the OpenAI Realtime API
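
Roughly what the Jetson side of that serial link looks like. A minimal sketch: the port name, baud rate, and JSON shape are placeholders for whatever your Arduino sketch actually emits.

```python
import json
from typing import Optional

import serial  # PySerial

PORT = "/dev/ttyACM0"  # placeholder: check /dev/tty* after plugging in the MKR
BAUD = 9600            # must match Serial.begin() in the Arduino sketch

def read_moisture(ser: serial.Serial) -> Optional[dict]:
    """Read one newline-terminated JSON object from the Arduino."""
    line = ser.readline().decode("utf-8", errors="ignore").strip()
    if not line:
        return None  # timed out with no data
    try:
        return json.loads(line)  # e.g. {"moisture_pct": 41.2}
    except json.JSONDecodeError:
        return None  # partial line right after opening the port; just skip it

if __name__ == "__main__":
    with serial.Serial(PORT, BAUD, timeout=2) as ser:
        while True:
            reading = read_moisture(ser)
            if reading:
                print(reading)
```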

Software loop:

  • Arduino measures voltage → maps it to a moisture percentage (mapping sketched below)
  • Jetson parses the serial input
  • The reading is passed into a small runtime that adds it to the model context (see the runtime sketch below)
  • Model generates a natural response conditioned on the data
  • Text is sent to the Realtime voice API → the plant literally speaks (see the last sketch below)

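The voltage → percentage step is just a linear rescale between two calibration readings. Here it is mirrored in Python; RAW_DRY and RAW_WET are hypothetical values you have to measure with your own sensor (probe in dry air vs. in a glass of water).

```python
# Capacitive sensors read HIGHER when dry and lower when wet, so the scale is inverted.
# The MKR WiFi 1010's ADC is 10-bit by default (0-1023).
RAW_DRY = 860  # hypothetical: raw reading with the probe in air
RAW_WET = 430  # hypothetical: raw reading with the probe in water

def to_moisture_pct(raw: int) -> float:
    """Linearly map a raw sensor reading onto 0-100% moisture."""
    pct = (RAW_DRY - raw) / (RAW_DRY - RAW_WET) * 100.0
    return max(0.0, min(100.0, pct))  # clamp: real readings drift past calibration
```
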
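The runtime really is small: drop the latest reading into the model context and ask for a reaction. A sketch using the OpenAI Python SDK; the persona prompt and function name are ours, purely illustrative.

```python
import json
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Illustrative persona prompt; tune to taste.
SYSTEM_PROMPT = (
    "You are a houseplant. You receive your own soil moisture readings as JSON "
    "and comment on how you're feeling in one or two short sentences."
)

def plant_reply(reading: dict) -> str:
    """Condition the model on the latest sensor reading and get a response."""
    response = client.chat.completions.create(
        model="gpt-4.1-nano",
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": json.dumps(reading)},
        ],
    )
    return response.choices[0].message.content
```

Keeping the sensor data as raw JSON in the user turn means the model sees exactly what the Arduino reported, with no lossy paraphrasing in between.
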
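Last hop: text → voice. A rough sketch of the Realtime API over a websocket, not a drop-in implementation: the header keyword name varies across `websockets` versions, and `play_pcm16` is a hypothetical stand-in for whatever drives your speaker (stubbed here to dump raw PCM to a file).

```python
import asyncio
import base64
import json
import os

import websockets  # pip install websockets

URL = "wss://api.openai.com/v1/realtime?model=gpt-4o-realtime-preview"

def play_pcm16(pcm: bytes) -> None:
    """Hypothetical speaker helper: here it just dumps raw 24 kHz PCM16 to a file."""
    with open("reply.pcm", "ab") as f:
        f.write(pcm)

async def speak(text: str) -> None:
    """Ask the Realtime API to voice already-generated text and play the audio."""
    headers = {
        "Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}",
        "OpenAI-Beta": "realtime=v1",
    }
    async with websockets.connect(URL, extra_headers=headers) as ws:
        # Request a spoken response carrying our pre-generated text.
        await ws.send(json.dumps({
            "type": "response.create",
            "response": {
                "modalities": ["audio", "text"],
                "instructions": f"Say exactly the following, in character: {text}",
            },
        }))
        async for message in ws:
            event = json.loads(message)
            if event["type"] == "response.audio.delta":
                play_pcm16(base64.b64decode(event["delta"]))
            elif event["type"] == "response.done":
                break

# asyncio.run(speak(plant_reply(reading))) ties the whole loop together.
```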