Unheard - Haptic Driver Assistance

Using Haptic Interaction to optimize the driving experience for deaf drivers

Industry: Automotive
Timeline: Sep 2022 - Jan 2023 (5 months)
Role: UX Researcher, UX Designer
Team Size: Solo project

Overview

Not everyone experiences the world in the same way, and that includes behind the wheel. In this project, developed during my Master's in Interaction Design, I explored how sensory substitution can make in-car experiences more intuitive and inclusive for deaf and hard-of-hearing drivers through a multimodal, non-visual interface.

Challenge

Design an accessible driving assistant that addresses the unique cognitive demands of deaf users by translating critical auditory cues into intuitive haptic feedback, reframing how in-car interactions support diverse abilities.

Fast research for an MVP

I conducted fast, focused secondary research, reviewing topics like Calm Technology, Assistive Tech, Sensory Substitution, and Car UX. I also looked for examples of in-car interfaces for deaf drivers—while rare, they helped shape the direction of my concept.
Key takeaways:
  • Accessibility in car tech is still limited;
  • Feedback must be easy to perceive without adding cognitive load;
  • The interface should stay non-intrusive so others can use the car normally.

Wireframes and the respective GUI prototype
Interaction model based on the concept of "Implicit Interactions"
Prototype of the physical module, built by programming an Arduino with multiple vibrating motors sewn into a steering wheel and a seat cover

Diving into phygital prototyping

To demonstrate my concept reliably, I developed a physical prototype that allowed stakeholders to experience a realistic simulation of what users would feel when using Unheard.
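
To make the setup more concrete, below is a minimal, illustrative Arduino-style sketch of how vibrating motors could be pulsed to render distinct haptic patterns for different auditory events. The pin numbers, intensities, timings, and event names are assumptions for this example, not the exact values used in the prototype.

// Illustrative sketch: two coin vibration motors on PWM pins render
// distinct haptic patterns. Pins, intensities, and timings are assumed.
const int WHEEL_MOTOR_PIN = 5;   // motor sewn into the steering wheel cover (assumed pin)
const int SEAT_MOTOR_PIN  = 6;   // motor sewn into the seat cover (assumed pin)

void setup() {
  pinMode(WHEEL_MOTOR_PIN, OUTPUT);
  pinMode(SEAT_MOTOR_PIN, OUTPUT);
}

// Two short, strong pulses on the wheel, e.g. for a horn from another vehicle.
void hornPattern() {
  for (int i = 0; i < 2; i++) {
    analogWrite(WHEEL_MOTOR_PIN, 200);
    delay(150);
    analogWrite(WHEEL_MOTOR_PIN, 0);
    delay(100);
  }
}

// A slow intensity ramp on the seat, e.g. for an approaching emergency siren.
void sirenPattern() {
  for (int level = 80; level <= 255; level += 25) {
    analogWrite(SEAT_MOTOR_PIN, level);
    delay(120);
  }
  analogWrite(SEAT_MOTOR_PIN, 0);
}

void loop() {
  // In a demo like the one described, patterns would be triggered in sync
  // with driving events; here they simply loop on a timer for illustration.
  hornPattern();
  delay(3000);
  sirenPattern();
  delay(3000);
}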

Connecting HMI and Accessibility

The result of this project was a proposal for a multimodal interface comprising a haptic module, embedded in the steering wheel and seat, and a supporting graphical HMI.

Although it wasn't possible to test this interface with deaf users, I believe that proposals such as this one can help both impaired and unimpaired drivers have safer and more comfortable driving experiences.

Simulating a haptic HMI on a budget

To simulate the experience of driving with the Unheard interface, I built a custom rig comprising a makeshift driver's seat and steering wheel equipped with vibrating motors connected to the Arduino, a tablet positioned vertically to display the GUI prototype, and a laptop playing a POV video from inside the car.
The prototype was presented and tested with a live audience of colleagues and professors.
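
As a sketch of how such a rig could be driven during a live demo, the snippet below shows one simple way to fire haptic cues manually from the Arduino Serial Monitor while the POV video plays. The single-character commands and wiring are assumptions for illustration, not the project's actual control scheme.

// Illustrative only: fire haptic cues by typing single characters in the
// Serial Monitor, so a facilitator can keep vibrations in sync with the
// point-of-view video. Commands and pins are assumed for this example.
const int WHEEL_MOTOR_PIN = 5;
const int SEAT_MOTOR_PIN  = 6;

void setup() {
  Serial.begin(9600);
  pinMode(WHEEL_MOTOR_PIN, OUTPUT);
  pinMode(SEAT_MOTOR_PIN, OUTPUT);
}

// Run one motor at a given PWM strength for a given duration, then stop it.
void pulse(int pin, int strength, int durationMs) {
  analogWrite(pin, strength);
  delay(durationMs);
  analogWrite(pin, 0);
}

void loop() {
  if (Serial.available() > 0) {
    char command = Serial.read();
    if (command == 'h') pulse(WHEEL_MOTOR_PIN, 200, 300);  // 'h': horn cue on the wheel
    if (command == 's') pulse(SEAT_MOTOR_PIN, 255, 800);   // 's': siren cue on the seat
  }
}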

Presenting the concept

Finally, to present the case study to stakeholders, I created mock-ups and a communication video explaining the relevance of accessible ADAS (advanced driver-assistance systems) that take into account the needs of drivers with different capabilities.