- Final team project for EECE490G: Robot Learning @POSTECH
- See our paper for more details!
When robots can't see, they can still feel and listen.
TAPS is a vision-denied robotic perception system that recognizes objects through active touch and sound. By combining contact-based geometric exploration with impact-sound-based material classification, TAPS enables reliable object recognition without cameras or LiDAR, making it suitable for low-visibility environments such as smoke, darkness, or occlusion.
Demo video: `vid.mp4`
- Set up the environment:

      conda create -n TAPS python=3.11
      conda activate TAPS
      pip install -r requirements.txt
- Download the pre-trained models:
  - Google Drive (link to be added)
  - Place the downloaded model files in the `models/` directory.
- Run the main control script:

      python move/ver3.0/continuous_move.py
- Keyboard Controls (a minimal dispatch sketch follows this list):
  - `W`, `A`, `S`, `D`: Move the robot base (manual control)
  - `G`: X-axis (width) exploration; finds and memorizes the object's center
  - `Z`: Z-axis (height) exploration; measures the object's height at the memorized center
  - `Q`: Vertical collision; records the impact sound and classifies the object
  - `X`: Emergency stop / exit
  - `R`: Open gripper
  - `T`: Close gripper
  - `E`, `C`, `U`, `J`, `I`, `K`: Fine joint controls (see code for details)
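The sketch below shows, in very rough form, how such a key-to-action dispatch can be structured. The handler names and the blocking `input()` loop are hypothetical; the actual implementation lives in `move/ver3.0/continuous_move.py` and reads keys in real time.

```python
# Hypothetical key-dispatch sketch; handler names and the blocking input()
# loop are illustrative, not the repository's actual code.
import sys

def explore_x():
    print("X-axis search: locate and memorize the object's center")

def explore_z():
    print("Z-axis search: measure object height at the memorized center")

def strike_and_classify():
    print("strike the object, record the impact sound, classify the material")

HANDLERS = {
    "g": explore_x,
    "z": explore_z,
    "q": strike_and_classify,
    "r": lambda: print("open gripper"),
    "t": lambda: print("close gripper"),
}

while True:
    key = input("key> ").strip().lower()
    if key == "x":            # emergency stop / exit
        sys.exit(0)
    handler = HANDLERS.get(key)
    if handler is not None:
        handler()
```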
- Workflow Example:
  - Press `G` to perform the X-axis search and memorize the object's center.
  - Press `Z` to perform the Z-axis search at the memorized center and measure the object's height.
  - Press `Q` to strike the object, record the impact sound, and classify the object (see the sketch after this list).
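As a rough picture of the `Q` step, the sketch below records a short clip and assigns a material label. It is a simplified, hypothetical stand-in: the reference values and the spectral-centroid heuristic are illustrative only, whereas the actual system classifies the impact sound with a PANNs-based model.

```python
# Simplified illustration of the "strike -> record -> classify" step.
# Not the repository's PANNs pipeline; the centroid references are made up.
import numpy as np
import sounddevice as sd

SAMPLE_RATE = 44100
DURATION_S = 0.5  # impact sounds are short

# Hypothetical per-material spectral-centroid references (Hz), for illustration.
MATERIAL_CENTROIDS = {"plastic": 2500.0, "aluminum": 5500.0, "paper": 1200.0}

def record_impact() -> np.ndarray:
    """Record a short mono clip right after the strike command is issued."""
    clip = sd.rec(int(DURATION_S * SAMPLE_RATE), samplerate=SAMPLE_RATE, channels=1)
    sd.wait()
    return clip[:, 0]

def spectral_centroid(signal: np.ndarray) -> float:
    """Frequency-weighted average of the magnitude spectrum."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / SAMPLE_RATE)
    return float(np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-9))

def classify_material(signal: np.ndarray) -> str:
    """Pick the material whose reference centroid is closest to the clip's."""
    centroid = spectral_centroid(signal)
    return min(MATERIAL_CENTROIDS, key=lambda m: abs(MATERIAL_CENTROIDS[m] - centroid))

if __name__ == "__main__":
    audio = record_impact()
    print("predicted material:", classify_material(audio))
```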
To add your own objects for recognition, edit the `object_database.yaml` file as follows:

- Open `move/ver3.0/object_database.yaml` in a text editor.
- Under the `objects:` section, add a new entry with the following fields:
  - `id`: A unique identifier for your object (e.g., "obj_07")
  - `name`: The object's name (e.g., "Water Bottle")
  - `material`: The object's material (e.g., "plastic", "aluminum", "paper")
  - `radius`: The object's radius in centimeters (cm)
  - `height`: The object's height in centimeters (cm)
- Save the file. The system will now recognize your custom object from its estimated size and material, matching the measurement to the nearest entry in the database (see the example and sketch below).
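For reference, a new entry might look like the following. The values are made up, and the exact layout may differ slightly, so follow the existing entries in the file:

```yaml
# Illustrative entry for move/ver3.0/object_database.yaml (example values only)
objects:
  - id: "obj_07"
    name: "Water Bottle"
    material: "plastic"
    radius: 3.5     # cm
    height: 21.0    # cm
```

Below is a minimal sketch of the nearest-entry lookup described above, assuming a list-of-entries layout and a simple squared-distance metric on radius and height; the repository's actual matching code may differ:

```python
# Hypothetical nearest-entry lookup over the object database.
import yaml

def match_object(radius_cm, height_cm, material,
                 db_path="move/ver3.0/object_database.yaml"):
    with open(db_path) as f:
        objects = yaml.safe_load(f)["objects"]
    # Prefer entries whose material agrees with the sound-based prediction.
    candidates = [o for o in objects if o["material"] == material] or objects
    # Pick the entry closest to the measured size.
    return min(
        candidates,
        key=lambda o: (o["radius"] - radius_cm) ** 2 + (o["height"] - height_cm) ** 2,
    )

if __name__ == "__main__":
    print(match_object(3.4, 20.5, "plastic")["name"])
```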
This codebase is built upon Low-Cost Robot Arm and PANNs.
