DJ Reactor: Making Reachy Dance
New app idea: DJ Reactor, a music-reactive app where Reachy responds to beats in real time with synchronized movements.
The Vision
Play any song, and Reachy:

- Bobs its head to the beat
- Wiggles its antennas on drops
- Changes expressions based on energy level
- Maybe tracks specific instruments
Technical Approach
Using librosa for audio analysis:
```python
import librosa

# Load audio
y, sr = librosa.load('track.mp3')

# Beat tracking
tempo, beats = librosa.beat.beat_track(y=y, sr=sr)

# Energy over time
rms = librosa.feature.rms(y=y)
```
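Continuing from that snippet: `beats` comes back as frame indices and `rms` as a `(1, n_frames)` array, so the next step is putting both on the time axis the movement loop will run against. `librosa.frames_to_time` and `librosa.times_like` are real librosa calls; the 0-to-1 normalization is just my choice for driving movement amplitudes.

```python
# Beat frames -> timestamps in seconds
beat_times = librosa.frames_to_time(beats, sr=sr)

# RMS energy -> a rough 0-to-1 envelope, one timestamp per frame
energy = rms[0]
energy = (energy - energy.min()) / (energy.max() - energy.min() + 1e-9)
energy_times = librosa.times_like(rms, sr=sr)
```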
Then I map these features to robot movements:

- Beat → head bob (quick up/down on Z)
- Energy → antenna spread (more energy = wider)
- Onset detection → eye reactions
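To make that mapping concrete, here's a minimal sketch assuming the `beat_times`, `energy_times`, and `energy` arrays from above. It only computes targets; whatever sends poses to Reachy would consume them, and `bob_depth` and `antenna_max` are made-up tuning values, not anything from the Reachy SDK.

```python
import numpy as np

def movement_targets(t, beat_times, energy_times, energy,
                     bob_depth=0.02, antenna_max=45.0):
    """Compute motion targets for playback time t (in seconds).

    bob_depth (meters) and antenna_max (degrees) are hypothetical
    tuning values; real joint limits would come from the robot.
    """
    # Head bob: a short downward pulse that decays after each beat
    past = beat_times[beat_times <= t]
    since_beat = t - past[-1] if len(past) else 1.0
    head_z_offset = -bob_depth * np.exp(-since_beat / 0.08)  # ~80 ms decay

    # Antenna spread follows the interpolated energy envelope
    antenna_spread = antenna_max * np.interp(t, energy_times, energy)

    return head_z_offset, antenna_spread
```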
Challenges
- Latency: Audio analysis needs to be fast enough for real-time response
- Smoothing: Raw beat detection is jittery; movements need smoothing (see the sketch after this list)
- Not looking stupid: Easy to make the robot look like it's having a seizure
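For the smoothing problem, one simple option (my usual go-to, not something the prototype does yet) is exponential smoothing on each movement target, run at the control-loop rate:

```python
class Smoother:
    """One-pole low-pass: value += alpha * (target - value)."""

    def __init__(self, alpha=0.2, value=0.0):
        self.alpha = alpha  # lower = smoother but laggier
        self.value = value

    def update(self, target):
        self.value += self.alpha * (target - self.value)
        return self.value

# One smoother per degree of freedom, e.g.
head_z = Smoother(alpha=0.3)
antennas = Smoother(alpha=0.15)
```

Tuning `alpha` per joint trades jitter against lag, which feeds straight back into the latency concern above.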
First Prototype
Got basic beat detection working. Reachy bobs on every detected beat. It's... okay. Needs work on timing and amplitude.
What's Next
- Add energy-based modulation
- Implement anticipation (move slightly before the beat, not after; see the sketch below)
- Build a simple Gradio UI for track selection
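On anticipation: since the track is analyzed up front, `beat_times` are known in advance, so each bob can be triggered a fixed lead time early. A rough sketch, where `lead` is a guessed motion lead time and `trigger_bob` stands in for whatever actually starts the head-bob:

```python
import time

def schedule_bobs(beat_times, trigger_bob, lead=0.12):
    """Call trigger_bob() slightly before each beat.

    lead (seconds) is a guess at how long the bob takes to hit its
    lowest point; trigger_bob is a placeholder, not a real SDK call.
    """
    start = time.monotonic()  # call this right as playback starts
    for bt in beat_times:
        delay = (bt - lead) - (time.monotonic() - start)
        if delay > 0:
            time.sleep(delay)
        trigger_bob()
```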
This one's going to be fun.