INFO4320 Physical Computing and Rapid Prototyping Final Project

Musical Dancing Robot

Overview

We designed a gesture-controlled tabletop robot that lets users select music with a hand gesture and then watch the robot dance expressively, with coordinated body motion and facial expressions.
The goal was to explore how robots can communicate affect through synchronized movement, sound, and expression, translating music’s emotional tone into embodied performance.

Watch Our Demo

Concept

The project began with the idea of a “musical companion”—a small robot that reacts playfully to its environment.
Our final prototype enables users to:

  1. Select music by moving a hand over the robot’s arm sensor (a rough firmware sketch of this gesture handling appears after this list).

  2. Observe a personalized dance as the robot interprets the rhythm and mood of the chosen track.

  3. Read emotion from the robot’s face via an LCD display that mirrors musical energy (e.g., happy, calm, excited).
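As a rough illustration of the gesture handling, the sketch below shows how a flex-sensor reading on the ESP32 could cycle through tracks. The pin number, threshold, and the playSong() call are placeholder assumptions, not our exact firmware.

    // Minimal sketch (assumptions: flex sensor on ADC pin 34 via a voltage
    // divider, three songs, a hypothetical playSong() defined elsewhere).
    const int FLEX_PIN = 34;         // ESP32 ADC1 pin wired to the flex sensor divider
    const int BEND_THRESHOLD = 2600; // raw ADC value treated as "hand over the arm"
    int currentSong = 0;
    bool wasBent = false;

    void setup() {
      Serial.begin(115200);
    }

    void loop() {
      int reading = analogRead(FLEX_PIN);      // 0-4095 on the ESP32 ADC
      bool isBent = reading > BEND_THRESHOLD;  // simple threshold gesture

      // Advance to the next track on each new bend (rising edge only)
      if (isBent && !wasBent) {
        currentSong = (currentSong + 1) % 3;
        Serial.printf("Selected song %d\n", currentSong);
        // playSong(currentSong);  // hypothetical playback + choreography trigger
      }
      wasBent = isBent;
      delay(20);  // light debounce / sampling interval
    }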

Hardware & Interaction Design

  • Core: ESP32 microcontroller

  • Sensors: Flex sensor embedded in the robot’s hand for gesture recognition

  • Actuators: Servo motors driving head, arms, and body movement

  • Display: LCD screen showing real-time facial expressions synchronized with music

  • Audio: Mini speaker module for on-board music playback
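A minimal bring-up of this hardware stack might look like the sketch below. The servo pins, the 16x2 I2C character LCD standing in for the face display, and the ESP32Servo and LiquidCrystal_I2C libraries are assumptions for illustration, not the exact parts we used.

    // Hardware bring-up sketch (assumed pins and libraries).
    #include <ESP32Servo.h>
    #include <LiquidCrystal_I2C.h>

    Servo headServo, armServo, bodyServo;
    LiquidCrystal_I2C lcd(0x27, 16, 2);  // assumed I2C address and size

    void setup() {
      headServo.attach(25);   // head pan
      armServo.attach(26);    // arm sweep (the arm also carries the flex sensor)
      bodyServo.attach(27);   // body sway
      lcd.init();
      lcd.backlight();
      lcd.print(":)");        // neutral face until a song is selected
    }

    void loop() {
      // main interaction loop goes here
    }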

The design prioritizes simplicity and clarity of expression. Each song is tied to a unique choreography pattern whose tempo and movement amplitude reflect the track’s emotional tone.
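The song-to-choreography mapping can be captured in a small table of tempo and amplitude values, as in this illustrative sketch. The numbers, servo pin, and showFace() helper are placeholders rather than our real patterns.

    // Choreography sketch (assumptions: one body servo, a per-song table of
    // tempo/amplitude, and a hypothetical showFace() for the LCD expressions).
    #include <ESP32Servo.h>

    struct Choreography {
      int bpm;         // dance tempo matched to the track
      int amplitude;   // servo sweep in degrees around center (90)
      const char* mood;
    };

    // One pattern per song; values here are illustrative placeholders.
    Choreography patterns[3] = {
      {120, 45, "excited"},
      { 90, 30, "happy"},
      { 60, 15, "calm"},
    };

    Servo bodyServo;

    void danceStep(const Choreography& c) {
      int halfBeatMs = 60000 / c.bpm / 2;   // half a beat per swing
      bodyServo.write(90 + c.amplitude);    // swing one way...
      delay(halfBeatMs);
      bodyServo.write(90 - c.amplitude);    // ...and back
      delay(halfBeatMs);
      // showFace(c.mood);  // hypothetical LCD expression update
    }

    void setup() {
      bodyServo.attach(27);  // assumed body servo pin
    }

    void loop() {
      danceStep(patterns[0]);  // loop the first pattern for demonstration
    }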
