Case Study 2 - Wisebro Innovation Lab

Typing without a screen on-the-go

Background

This research project, funded by Google and the NSF, aimed to develop an assistive technology to enable typing for the Blind and Visually Impaired (BVI) population. Conducted under the guidance of Dr. Davide Bolchini at the USER lab of the School of Informatics and Computing at Indiana University, Indianapolis, the study involved testing the usability of an interactive assistive-technology prototype. The prototype was designed to facilitate screen-less typing for BVI users by employing interactions in the aural (audio) space.

The research team leveraged Human-Computer Interaction (HCI) methods to develop and evaluate the prototype. The results of the study could have significant implications for improving the accessibility and usability of assistive technologies for the BVI population.

Problem

We aimed to address the challenges faced by visually impaired individuals when using on-screen keyboards, which require them to search for visual affordances they cannot see.

Our goal was to explore re-imagining text entry without a reference screen by introducing screen-less keyboards as aural flows controllable by hand gestures. We aimed to model navigation strategies for entirely auditory keyboards and assess their limitations and benefits for daily access, examining the trade-offs in performance and experience when blind users exchange typing speed for the benefit of being untethered from the screen.

Solution

To address the challenge of screen-less text entry, we present Keyflows, a system that enables text entry through rapid auditory streams of Text-To-Speech (TTS) characters controllable by hand gestures. Our prototype is built on TapStrap, an off-the-shelf haptic wearable device. To generate and customize the audio feedback, we integrated the system with an off-the-shelf TTS engine, tailoring its output to the hand gesture used.

We conducted a study with 12 blind and visually impaired participants to evaluate the effectiveness of Keyflows compared to traditional on-screen keyboards.

Here's a demo prototype of our solution using a simple A-to-Z KeyFlow:
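To make the interaction model concrete, here is a minimal sketch of an A-to-Z keyflow: characters stream past the user one at a time as spoken audio, and a hand gesture selects whichever character is currently being announced. The class name, gesture vocabulary, and `announce` callback below are illustrative assumptions, not the study's actual API; the real prototype used the TapStrap wearable and a TTS engine.

```python
import string

class KeyFlow:
    """Illustrative sketch of a screen-less aural keyboard.

    The flow streams characters aurally in sequence; gestures
    (hypothetical names here) map to typing actions.
    """

    def __init__(self, chars=string.ascii_uppercase, announce=print):
        self.chars = chars
        self.pos = 0              # index of the character currently "speaking"
        self.text = []            # characters typed so far
        self.announce = announce  # stand-in for a real TTS engine

    def tick(self):
        """Advance the flow to the next character and announce it."""
        self.pos = (self.pos + 1) % len(self.chars)
        self.announce(self.chars[self.pos])

    def on_gesture(self, gesture):
        """Map a (hypothetical) hand gesture to a typing action."""
        if gesture == "select":        # e.g. a single tap
            self.text.append(self.chars[self.pos])
        elif gesture == "delete":      # e.g. a double tap
            if self.text:
                self.text.pop()
        elif gesture == "restart":     # e.g. a swipe: jump back to 'A'
            self.pos = 0
            self.announce(self.chars[self.pos])

    def typed(self):
        return "".join(self.text)
```

In use, a timer would call `tick()` at the flow's streaming rate while the wearable's gesture events drive `on_gesture()`; no screen is involved at any point, which is the core idea behind Keyflows.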

Result

User study results show that while Keyflows is slower than the on-screen keyboard, participants perceived it as a more pleasant and engaging experience and preferred it for quick, context-independent tasks. We also discuss the limitations of Keyflows and highlight directions for future work.

Our work demonstrates the potential for screen-less keyboards to provide an accessible and effective text entry method for individuals who are blind or visually impaired, and highlights the importance of considering alternative interaction modalities for users with diverse abilities.
