This project combined autonomous robotics and musical interaction through a color-recognition system. Using a Jetson Nano-powered robotic car with onboard computer vision, we developed a system that follows a line track while detecting specific colors, each of which triggers a corresponding musical note from the C major scale. The platform used ROS2 for system integration, combining a line-following algorithm with real-time color detection to create an interactive musical experience. By integrating computer vision, autonomous navigation, and audio output, we demonstrated how robotics can create novel interfaces between physical movement and musical expression.
Key Features:
- Computer vision-based line following using a custom modified Lane_Detection node
- Real-time color recognition system mapped to musical notes (C4, D4, E4)
- Autonomous navigation with integrated audio feedback
- Custom hardware integration, including:
  - NVIDIA Jetson Nano for processing
  - OAK-D camera for visual input
  - Custom-designed mounting systems and chassis components
  - Integrated speaker system
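The color-to-note mapping can be illustrated with a small sketch. The hue thresholds, the choice of which color triggers which note, and the function names below are assumptions for illustration; only the notes themselves (C4, D4, E4) come from the project description.

```python
# Hypothetical sketch of the color-to-note mapping. The hue bins and
# the color/note pairing are illustrative assumptions, not the
# project's published values.

# Standard equal-temperament frequencies for the three notes used.
NOTE_FREQS = {"C4": 261.63, "D4": 293.66, "E4": 329.63}

# OpenCV-style hue range is 0-179; bin edges chosen for illustration.
COLOR_BINS = [
    ((0, 10), "red", "C4"),
    ((50, 70), "green", "D4"),
    ((100, 130), "blue", "E4"),
]

def classify_hue(hue):
    """Return (color, note, frequency_hz) for a hue value, or None
    when the hue falls outside every tracked color bin."""
    for (lo, hi), color, note in COLOR_BINS:
        if lo <= hue <= hi:
            return color, note, NOTE_FREQS[note]
    return None

print(classify_hue(60))   # a green patch -> ('green', 'D4', 293.66)
```

In practice the hue would come from the dominant color of a thresholded camera region; binning hues like this keeps the detector robust to lighting-driven brightness changes, which is the usual reason to classify in HSV rather than RGB.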
Technical Implementation:
- Built on UCSD's Robocar framework using ROS2 and Ubuntu 20.04
- DonkeyCar AI implementation for initial testing and simulation
- Custom modifications to enable simultaneous line following and color detection
- Comprehensive electrical system integration managing power, processing, and sensor inputs
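The line-following half of the pipeline can be sketched as a simple proportional controller on the line's horizontal offset in the camera frame. This is a minimal illustration of the idea, not the project's Lane_Detection node: the gain, sign convention, and function name are assumptions.

```python
import numpy as np

KP = 0.005  # proportional gain; illustrative value, not tuned for real hardware

def steering_command(line_mask):
    """Proportional steering from the x-offset of the line centroid.

    line_mask: 2D boolean array marking line pixels (e.g. from a
    color-thresholded camera frame). Returns a steering value where
    0.0 is straight ahead; the sign convention here is an assumption.
    """
    ys, xs = np.nonzero(line_mask)
    if xs.size == 0:
        return 0.0  # no line detected: hold course
    error = xs.mean() - line_mask.shape[1] / 2.0  # pixels off-center
    return -KP * error

# A roughly centered line yields near-zero steering:
mask = np.zeros((10, 100), dtype=bool)
mask[:, 48:52] = True
print(steering_command(mask))  # near zero for a centered line
```

In the integrated system, the same camera callback that produces this mask would also run the color classifier, so one frame drives both steering and note playback without a second vision pipeline.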