I led end-to-end R&D for a dynamic haptic interface that enhances transit accessibility.
My roles
UX Researcher
UI Designer
Full-Stack Eng.

My tools
Ethnographic Field UXR
Arduino Microcontroller Circuitry
My impact

I led 0-to-1 product development, from UXR and ideation to prototype engineering and user testing. What started as a one-man project blossomed into a 30-person program with a patent-pending interface design.

Problem Statement

Imagine trying to use an elevator lobby without eyes or ears.

Our deafblind users struggle with this on a daily basis: they can't see which door is about to open, and they can't hear the arrival "ding" to orient toward it. I was asked to lead the R&D for an interface to solve this problem.

Preliminary User Research

1:1 interviews and ethnographic pain-point research gave me detailed insight into interface desires and information-architecture requirements. The qualitative data synthesized into a clear concept: a device that tells the user which elevator is arriving.

Pain Points - Deafblind Users

"I'll get to the elevators and press the call button, but there's nothing that tells me which one of the four is arriving. I'm totally lost."

“If there’s nobody else at the station when I’m trying to get on the elevator, I’ll just hold my cane against one of the doors and hope that’s the one that comes.”

“I don’t want to be dependent on someone to help me figure out which elevator is arriving. I can do everything else on my own, except for this.”

Lo-Fi Prototype Engineering

Constrained by strict elevator safety standards and spatial data-conveyance heuristics, the first interface had to address three key requirements: durability and reliability (minimal moving parts or vandalism opportunities), a shallow learning curve (for accelerated demographic adoption and usability), and a handheld tactile form factor (to accommodate canes and other cases where one hand is in use).

This preliminary scope drove the design of the first prototype, the "MK.I". Haptic feedback heuristics shaped the core functionality, which I integrated into a custom 3D-printed enclosure I designed in AutoCAD. Arduino microcontroller circuitry and C++ did the rest.
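The MK.I's core cue logic can be sketched in plain C++. This is a minimal simulation of the idea, not the actual firmware; the `HapticBank` struct, its names, and the one-cue-at-a-time behavior are my assumptions for illustration.

```cpp
#include <array>

// Four vibration motors, one per finger hold. Element i drives the motor
// under finger i (0 = index ... 3 = pinky), matching elevator i
// (0 = leftmost ... 3 = rightmost) in the bank.
struct HapticBank {
    std::array<bool, 4> motorOn{};  // true = motor currently vibrating

    // Pulse only the motor that corresponds to the arriving elevator.
    void signalArrival(int elevator) {
        for (auto& m : motorOn) m = false;  // assumed: one cue at a time
        if (elevator >= 0 && elevator < 4)
            motorOn[static_cast<std::size_t>(elevator)] = true;
    }
};
```

On the real hardware, setting an element of `motorOn` would translate to driving a motor pin from the Arduino.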

Design Logic

Our elevator banks place four elevators in a row along the same wall, letting four finger holds emulate the four elevators. This combination of spatial understanding and physical correspondence proved a huge success and set the design foundation for future prototypes.

Let's say you're using the MK.I with your right hand: palm away, thumb tucked. A vibration sent to your index finger means the leftmost elevator is arriving; a vibration to the pinky means the rightmost elevator is arriving. The ergonomic, ambidextrous design means either hand can be used with the interface, with no change to vibration directionality or meaning.

User Testing 1

Core functionality was nailed down, but I was still unsure of detailed ergonomic preferences. I produced eight finger-mould versions, mounted in interchangeable drop-in rigs for rapid configuration testing, for the user-testing trials.

Careful coordination with our support staff, space planning, and demographic community leaders gave us two key takeaways.

1) Users didn't like using individual fingers - testers weren't keen on sticking their fingers into an unfamiliar negative space, especially given a long history of our stations being vandalized and littered with heroin needles. Testers loved the tactile haptics approach but were dubious of the finger-driven system.

2) Users needed elevator-arrival "repositioning time" - just as sighted or hearing users use arrival light indicators to reposition themselves before the elevator arrives, deafblind testers needed the same preparatory time. This meant a backend system overhaul, tapping into the SCADA datastream provided by the elevator CS.
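The kind of ETA derivation the overhaul required can be sketched from a simplified SCADA snapshot. The `CarStatus` struct, its field names, and the constant floor-to-floor travel time are my assumptions; the real feed and timing model are certainly richer.

```cpp
#include <cstdlib>

// Simplified snapshot of one elevator car from the SCADA feed (assumed shape).
struct CarStatus {
    int currentFloor;
    int destinationFloor;
    double secondsPerFloor;  // assumed-constant travel time per floor
};

// Estimate seconds until the car reaches its destination, giving users
// time to reposition before the doors open.
double secondsToArrival(const CarStatus& car) {
    int floorsRemaining = std::abs(car.destinationFloor - car.currentFloor);
    return floorsRemaining * car.secondsPerFloor;
}
```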

Interface Redesign

Testers floated the concept of a "whole hand" device, and it was met with enthusiasm. This new interface direction prompted key design changes:

Backend Overhaul

Whole-hand haptic prototypes couldn't effectively isolate vibrotactile data transfer. The finger-based method had worked because haptic reverberations were confined to individual digits; across a whole palm, the increased contact surface area let vibrations travel freely and blur together.

Revised Heuristic Cues

I redesigned the frontend to operate as high-dexterity dynamic braille. Each of the four elevator shafts would be represented by a column of "rods" counting down the seconds to arrival. This revised interface met tester standards for a shallow learning curve.
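The rod countdown can be sketched as a simple clamped mapping from ETA to raised rods. The rod count per column and the clamping behavior here are illustrative assumptions, not the patent-pending production logic.

```cpp
// Each shaft's column has a fixed number of raised rods; one rod retracts
// per second as the car approaches, reaching zero at arrival.
constexpr int kRodsPerShaft = 8;  // assumed rod count per column

int raisedRods(int secondsToArrival) {
    if (secondsToArrival < 0) secondsToArrival = 0;          // arrived
    return secondsToArrival < kRodsPerShaft ? secondsToArrival
                                            : kRodsPerShaft; // clamp to max
}
```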

Final Design Tweaks & Launch

Intentionally blurred technical schematics

Due to patent concerns, I'm unable to disclose details of the MK.III we've put into production. It's not far off from the MK.II photos shown above, but my legal department has asked me to keep it under wraps for the time being.

If you live in Seattle, or plan on visiting sometime soon, you'll start to see a pilot stage MK.III installation in a number of test stations around April 2024. It's exciting being on the cutting edge of ADA technology, and I'm extremely proud of the work my team and I have accomplished.