The HaptiDrum project introduces a novel wearable air-drumming system enhanced with haptic feedback delivered through both linear resonant actuator (LRA) motors and electrical muscle stimulation (EMS). The system uses XIAO ESP32S3 microcontrollers and BNO085 inertial measurement units (IMUs) for high-frequency motion tracking and hit detection. By fusing Google MediaPipe-based vision tracking with IMU inertial data, HaptiDrum achieves precise 3D hand localization.
The haptic modules are mounted on the forearms yet deliver tactile feedback timed to hand impacts, creating a natural drumming sensation. The system consists of four wireless peripheral units: two worn on the arms to simulate drumstick strikes and two worn on the feet to simulate hi-hat and bass drum actions. All devices are interconnected wirelessly, and the architecture is designed to scale, allowing additional peripherals and instruments to be added as needed.
HaptiDrum aims to deliver an immersive, expressive, and portable drumming experience without requiring physical drumsticks or drums.
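To make the sensor-fusion idea above concrete, the sketch below shows one minimal way the two data streams could be combined: high-rate IMU linear acceleration is integrated between camera frames, and each MediaPipe hand fix pulls the estimate back toward an absolute position (a complementary filter). This is only an illustrative sketch with assumed names and constants (predictWithImu, correctWithVision, alpha), not HaptiDrum's actual localization algorithm.

```cpp
// Illustrative complementary-filter fusion of MediaPipe hand fixes (slow,
// absolute) with BNO085 linear acceleration (fast, drifting). All names and
// constants are hypothetical; the real HaptiDrum pipeline may differ.
#include <cstdio>

struct Vec3 { float x, y, z; };

struct HandEstimate {
    Vec3 pos{0, 0, 0};   // fused hand position (m)
    Vec3 vel{0, 0, 0};   // integrated velocity (m/s)
};

// Dead-reckon between camera frames by integrating linear acceleration.
void predictWithImu(HandEstimate& est, const Vec3& accel, float dt) {
    est.vel.x += accel.x * dt;   est.vel.y += accel.y * dt;   est.vel.z += accel.z * dt;
    est.pos.x += est.vel.x * dt; est.pos.y += est.vel.y * dt; est.pos.z += est.vel.z * dt;
}

// When a MediaPipe fix arrives (~30 Hz), pull the estimate toward it.
// alpha near 1 trusts the camera, slowly resetting the drifting IMU integral.
void correctWithVision(HandEstimate& est, const Vec3& visionPos, float alpha = 0.8f) {
    est.pos.x = alpha * visionPos.x + (1 - alpha) * est.pos.x;
    est.pos.y = alpha * visionPos.y + (1 - alpha) * est.pos.y;
    est.pos.z = alpha * visionPos.z + (1 - alpha) * est.pos.z;
}

int main() {
    HandEstimate hand;
    // Fake data: 100 Hz IMU samples with a vision fix every 10th sample.
    for (int i = 0; i < 50; ++i) {
        predictWithImu(hand, {0.0f, 0.0f, -0.5f}, 0.01f);       // 10 ms IMU step
        if (i % 10 == 9) correctWithVision(hand, {0.0f, 0.0f, -0.02f * i});
    }
    std::printf("fused z = %.3f m\n", hand.pos.z);
    return 0;
}
```

In this arrangement the camera bounds long-term drift while the IMU preserves the high-frequency motion needed for hit detection.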
Functional haptic feedback system for forearm and foot
Functional hand spatial localization system
Sensor calibration algorithms
Mobile application for visualization and configuration
High-frequency IMU-based inertial tracking (see the hit-detection sketch after this list)
Google MediaPipe integration for vision-based hand tracking
Combined 3D hand localization system
Real-time haptic feedback through LRA motors and EMS modules
Compact, wearable forearm and foot-mounted devices
Bluetooth Low Energy (BLE) wireless communication with mobile device
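As referenced in the feature list above, per-hit behavior on a forearm peripheral reduces to thresholding linear-acceleration magnitude with a short refractory window and then firing a brief haptic pulse. The Arduino-style sketch below is a hedged illustration of that loop: readLinearAccelG() and pulseHaptics() are hypothetical stand-ins for the actual BNO085 and LRA/EMS driver calls, and the threshold values are not tuned project parameters.

```cpp
// Hypothetical hit-detection loop for a forearm peripheral: flag a hit when
// linear acceleration exceeds a threshold, debounce with a refractory window,
// and fire a short haptic pulse. Sensor and actuator calls are stubs.
#include <Arduino.h>

const float    HIT_THRESHOLD_G = 2.5f;  // illustrative swing-deceleration threshold
const uint32_t REFRACTORY_MS   = 80;    // ignore ringing right after a hit

uint32_t lastHitMs = 0;

float readLinearAccelG() { return 0.0f; }    // stub: replace with BNO085 read
void  pulseHaptics(uint16_t durationMs) {}   // stub: replace with LRA/EMS driver

void setup() {
    Serial.begin(115200);
}

void loop() {
    float accelG = readLinearAccelG();        // sampled at >= 100 Hz
    uint32_t now = millis();
    if (accelG > HIT_THRESHOLD_G && now - lastHitMs > REFRACTORY_MS) {
        lastHitMs = now;
        pulseHaptics(30);                     // ~30 ms tactile "strike"
        Serial.println("hit");                // hit event would also go out over BLE
    }
    delay(5);                                 // ~200 Hz polling loop
}
```

Whether the pulse is driven locally on the peripheral or triggered by the phone after a BLE notification is a latency trade-off against the <50 ms feedback target below.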
IMU data update rate: ≥100 Hz
Hand localization accuracy: within 5 cm
Action (hit) recognition accuracy: ≥90%
Wireless communication latency: <30 ms
Battery life: ≥3 hours of continuous operation
Haptic feedback response delay: <50 ms
Week 2: Finalize hardware system design and part procurement
Week 4: Assemble basic IMU tracking system and prototype mounting structure
Week 6: Implement initial haptic (LRA and EMS) system integration
Week 7: Integrate BLE communication and basic mobile app interface
Week 8: Develop and test hand localization algorithms (MediaPipe + IMU fusion)
Week 9: Full system integration and calibration procedure implementation
Week 10: Final performance evaluation and demonstration preparation
Enjia Shi: IMU debugging, hardware design, and prototype development
Yu-Chu Hsieh: IMU debugging and project website management
Yuxuan Miao: PCB development and system testing
Hongrui Wu: Mobile application development and hand spatial localization algorithm development
IMU BNO085 (4 units): $75.96
XIAO ESP32S3 (without presoldered headers) (2 units): $53.98
XIAO ESP32S3 (presoldered) (1 unit): $17.99
Battery 3.7V 400mAh (1 unit): $14.99
Vibration Motors (12 pack): $13.99
LRA Motors (6 units): $20.18
Switch (100 pack): $7.99
Hook and Loop Straps (1 unit): $7.99
TENS device for EMS testing (1 unit): $25.99
Total cost of listed parts: $239.06; overall project budget: $400
Risk 1: EMS feedback is ineffective or uncomfortable for users.
Mitigation: The system will switch to LRA-only haptic feedback, ensuring reliable and consistent tactile sensation.
Risk 2: IMU drift leads to inaccurate hand localization.
Mitigation: Localization will rely solely on MediaPipe-based vision tracking whenever drift exceeds acceptable limits.
Risk 3: MediaPipe tracking fails under poor lighting conditions.
Mitigation: The system will fall back to an IMU-only tracking mode and display a warning notification to the user.
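Taken together, the mitigations for Risks 2 and 3 amount to a tracking-mode selector that weighs estimated IMU drift against MediaPipe confidence. A minimal sketch of such a selector follows; the drift estimate, confidence score, thresholds, and names are hypothetical placeholders rather than the project's actual logic.

```cpp
// Hypothetical tracking-mode selector covering the Risk 2 and Risk 3
// fallbacks: switch to vision-only when IMU drift grows, to IMU-only when
// MediaPipe confidence collapses (e.g. poor lighting), and warn the user.
#include <cstdio>

enum class TrackingMode { Fused, VisionOnly, ImuOnly };

const float MAX_DRIFT_M     = 0.05f;  // 5 cm localization budget
const float MIN_VISION_CONF = 0.5f;   // below this, treat vision as lost

TrackingMode selectMode(float imuDriftM, float visionConfidence, bool& warnUser) {
    warnUser = false;
    if (visionConfidence < MIN_VISION_CONF) {
        warnUser = true;                  // Risk 3: notify about poor lighting
        return TrackingMode::ImuOnly;
    }
    if (imuDriftM > MAX_DRIFT_M) {
        return TrackingMode::VisionOnly;  // Risk 2: drift exceeded the budget
    }
    return TrackingMode::Fused;
}

int main() {
    bool warn = false;
    TrackingMode m = selectMode(0.08f, 0.9f, warn);   // drifted IMU, good vision
    std::printf("mode=%d warn=%d\n", static_cast<int>(m), warn);
    return 0;
}
```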
Risk 4: The wireless connection drops between peripherals and the mobile device.
Mitigation: Automatic reconnection and resynchronization routines will minimize disruption during operation.
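On the peripheral side, much of this mitigation can be delegated to the BLE stack: restarting advertising inside the disconnect callback lets the mobile app reattach and resynchronize without user action. The sketch below assumes the BLE server library bundled with Arduino-ESP32 and uses placeholder UUIDs and device names; it illustrates the idea rather than the project's actual firmware.

```cpp
// Hypothetical peripheral-side auto-reconnect: when the phone disconnects,
// immediately restart advertising so it can reattach on its own.
// UUIDs and names are placeholders.
#include <Arduino.h>
#include <BLEDevice.h>
#include <BLEServer.h>

static const char* SERVICE_UUID = "0000ffe0-0000-1000-8000-00805f9b34fb"; // placeholder
static const char* CHAR_UUID    = "0000ffe1-0000-1000-8000-00805f9b34fb"; // placeholder

class ReconnectCallbacks : public BLEServerCallbacks {
    void onConnect(BLEServer*) override { Serial.println("phone connected"); }
    void onDisconnect(BLEServer* server) override {
        Serial.println("link dropped, advertising again");
        server->getAdvertising()->start();   // key line: resume advertising
    }
};

void setup() {
    Serial.begin(115200);
    BLEDevice::init("HaptiDrum-Arm-L");                    // placeholder name
    BLEServer* server = BLEDevice::createServer();
    server->setCallbacks(new ReconnectCallbacks());
    BLEService* service = server->createService(SERVICE_UUID);
    service->createCharacteristic(CHAR_UUID, BLECharacteristic::PROPERTY_NOTIFY);
    service->start();
    server->getAdvertising()->start();
}

void loop() {
    delay(1000);   // hit events would be notified here in the real firmware
}
```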
1. DigiDrum: A Haptic-Based Virtual Reality Musical Instrument and a Case Study
This paper introduces DigiDrum, a hybrid VR musical instrument that merges a physical drum with a virtual drum through synchronized auditory and haptic feedback, controlled by a physically modeled sound engine. The key innovation lies in allowing real-time modulation of virtual membrane properties, such as tension and damping, producing different tactile and auditory impressions without altering the physical instrument. This approach is highly relevant to our project's design, as it demonstrates how a single physical interface can simulate multiple virtual instruments by leveraging programmable haptic-audio feedback. Its user study revealed that perceived stiffness is influenced more by damping than by tension, which guides how expressive variation can be achieved.
2. Feeling the Beat Where It Counts: Fostering Multi-Limb Rhythm Skills with the Haptic Drum Kit
This paper presents the Haptic Drum Kit, a system that delivers rhythmic cues through vibrotactile actuators worn on the wrists and ankles, supporting the development of multi-limb coordination and an internalized understanding of rhythm. Grounded in theories of entrainment, embodied cognition, and sensorimotor contingency, the study showed that beginners could learn complex polyrhythms from haptic cues alone. This research directly informs the training and educational features of our project: it emphasizes the importance of limb-specific haptic cues for guiding motor learning and rhythm memorization, which we plan to incorporate into HaptiDrum's game and tutorial modes. The wearable aspect also inspires potential modular haptic extensions for rhythm training in portable setups.