What it does
Tactile Sound is a portable haptic device that helps deaf users connect with music through touch. It converts drum, bass, and melody into vibration patterns via six haptic motors on the fingers and palm, letting users feel the musical flow without hearing.
Your inspiration
What does it mean for a deaf person to experience music? This question took shape after we watched a YouTube video in which a deaf person described music as “a personal imagination felt through vibration.” It made us realize that music is not only heard but can also be felt through touch, and our research confirmed that many deaf individuals already experience music this way. This led us to envision a haptic language that translates rhythm and melody into tactile patterns—beyond simple vibration. As a result, we developed a handheld haptic device that allows deaf users to feel and interpret music without hearing.
How it works
Tactile Sound is a haptic feedback system that takes a music file and drives vibration motors from pre-analyzed beat timing and intensity data. In the audio analysis stage, we use audio processing libraries to extract timing and intensity information for four musical elements: drums, bass, melody, and vocals. The device includes six vibration motors, each assigned to a specific part of the hand: two motors on the palm correspond to drums (kick and hi-hat/snare), the index finger to vocals, the middle finger to melody, and the ring and pinky fingers to bass. An ESP32 microcontroller reads this stored analysis data and controls each motor via PWM, so users can physically perceive the rhythm and structure of music through different parts of the hand. Additionally, a built-in potentiometer lets users adjust the overall vibration intensity according to their tactile sensitivity.
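The playback logic above can be sketched in a few lines of Python. This is a minimal illustration under stated assumptions, not the actual firmware: the channel names, the motor-to-hand layout in `MOTOR_MAP`, the 8-bit PWM duty range, and the representation of pre-analyzed events as (onset time, intensity) tuples are all illustrative choices.

```python
# Sketch of the playback stage: pre-analyzed (onset_time, intensity)
# events per musical element become 8-bit PWM duty values, scaled by
# a user-set potentiometer reading. Names and ranges are assumptions.

# Assumed motor layout (channel -> hand position), per the description:
# two palm motors for drums, fingers for vocals, melody, and bass.
MOTOR_MAP = {
    "kick":   "palm_left",
    "snare":  "palm_right",   # hi-hat/snare share the second palm motor
    "vocals": "index",
    "melody": "middle",
    "bass_1": "ring",
    "bass_2": "pinky",
}

def to_pwm_duty(intensity: float, pot_scale: float) -> int:
    """Map a 0..1 event intensity to an 8-bit PWM duty value,
    scaled by the potentiometer reading (0..1)."""
    intensity = max(0.0, min(1.0, intensity))
    pot_scale = max(0.0, min(1.0, pot_scale))
    return round(intensity * pot_scale * 255)

def duties_at(events, t, pulse_len=0.05, pot_scale=1.0):
    """Return {channel: duty} for events active at playback time t.
    `events` maps channel name -> list of (onset_time, intensity)."""
    out = {ch: 0 for ch in MOTOR_MAP}
    for ch, evs in events.items():
        for onset, intensity in evs:
            # An event vibrates its motor for pulse_len seconds.
            if onset <= t < onset + pulse_len:
                out[ch] = max(out[ch], to_pwm_duty(intensity, pot_scale))
    return out
```

For example, a kick event of intensity 0.8 with the potentiometer at half sensitivity yields a duty of `round(0.8 * 0.5 * 255) = 102`, while channels with no active event stay at 0. On the device itself, each duty value would be written to the corresponding motor's PWM channel every control tick.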
Design process
Tactile Sound began with a clear goal: to make music accessible to people who are deaf or hard of hearing. We studied academic research and news articles, and refined our direction through Q&A sessions with relevant stakeholders. We recognized the need for a portable, personalized device and identified the fingertips as optimal sites for tactile feedback. Research also showed that layered vibration patterns convey musical elements more effectively than single pulses. During form development, we explored grip structures with clay and sketches, then refined ergonomic shapes through iterative 3D printing. Technically, we tested various audio analysis methods and hardware setups, and, working from an ESP32-based build, repeatedly adjusted wiring and layout to ensure stable operation. After syncing vibration responses with audio signals, we built a working prototype. We designed a UI/UX scenario for personalized feedback and envisioned a community feature for shared experiences. The current prototype supports core functions such as play/pause, reset, and vibration intensity control, and was developed based on feedback from deaf users. Next, we aim to enhance real-time audio conversion and integrate the battery and control buttons.
How it is different
Most conventional haptic devices are limited to functional assistance such as alerts, notifications, or simple rhythm cues, and commercial services like Apple Music provide only basic vibration feedback in sync with the beat. In contrast, Tactile Sound focuses on translating music itself into a tactile language. By converting core musical components into distinct vibration patterns, it allows users to perceive the flow and structure of music through touch alone, without relying on hearing. It functions not only as a tool for delivering information, but as a sensory interface that gives shape to personal imagination. The device adopts a handheld form factor with Bluetooth connectivity, offering greater portability and everyday adaptability than traditional wearables. Additionally, it enables personalized sensory experiences through adjustable vibration intensity and genre-based haptic feedback settings.
Future plans
We plan to refine our prototype into a mass-producible form suitable for real-life use. Additionally, we aim to develop a real-time music analysis feature to allow users to quickly experience the music they choose. Beyond vibration motors, we will explore various forms of tactile stimulation—such as temperature changes, directional pressure, and force—to deliver a richer musical experience. To improve usability, we also plan to integrate with existing music streaming platforms. Furthermore, by collaborating with organizations for the deaf and user communities, we aim to incorporate meaningful feedback and expand the product’s social impact.