Tactus tactile touchscreen prototype

Touchscreens can be particularly challenging to use for people who cannot see. All they feel is a featureless surface. Accessibility can be provided through speech using a number of different strategies, but finding onscreen buttons through audio is slower than finding tactile buttons.


An edge view of a tactile layer that could be used as part of a touchscreen. (From Tactus Technologies.)

Tactus Technologies has been developing a touchscreen with areas that can be raised and lowered from the surface. Sumi Das for CNET posted a video and hands-on impressions a couple of days ago. The technology uses microfluidics: for the buttons to rise, liquid in the system is put under pressure, and the higher the pressure, the taller and firmer the keys.

Currently, the technology is limited in that it is a single fixed array. You would not be able to use the Tactus keyboard in both portrait and landscape mode, for example. But the goal is to make the third generation of the product dynamic. “The vision that we had was not just to have a keyboard or a button technology, but really to make a fully dynamic surface,” says cofounder Micah Yairi. “So you can envision the entire surface being able to raise and lower depending on what the application is that’s driving it.”


The current generation offers only a single array of raised buttons that works in one orientation. That would help users find the keyboard tactilely, for example, but it offers no support for other applications: the user cannot tactilely find buttons or other controls elsewhere on a smartphone or tablet.

Future versions may fix this limitation. Ideally, the microfluidic tactile “pixels” will be small enough that various tactile shapes can be formed. Seamless shapes would require tactile pixels without significant gaps between them, which may be technically difficult. Even with gaps between the tactile pixels, the device could still be useful for tactile exploration, but the layout would likely be constrained to a grid of bumps. An onscreen button might then have a single bump associated with it (multiple separate bumps should not be used to designate a single large key, because it would feel like several separate controls). Such bumps would allow people to locate controls, but without distinct shapes it would be harder to identify controls by touch alone.
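A grid-of-bumps layout like the one just described could be driven by snapping each onscreen control to its nearest tactile pixel. The following is a minimal sketch of that idea, assuming a uniform bump pitch and hypothetical button coordinates; it does not reflect any actual Tactus API.

```python
# Illustrative sketch (not Tactus's API): snap onscreen buttons to a fixed
# grid of tactile "pixels" so each control is marked by a single raised bump.
# All coordinates and the 5 mm bump pitch below are hypothetical values.

def nearest_bump(center, pitch):
    """Return the (row, col) grid index of the bump closest to a button center."""
    x, y = center
    return (round(y / pitch), round(x / pitch))

def assign_bumps(buttons, pitch):
    """Map each button name to exactly one bump; even a large key gets a
    single bump so it does not feel like several separate controls."""
    return {name: nearest_bump(center, pitch) for name, center in buttons.items()}

# Hypothetical layout: button centers in millimeters, 5 mm bump pitch.
buttons = {"play": (12.0, 40.0), "stop": (27.0, 40.0), "menu": (12.0, 55.0)}
print(assign_bumps(buttons, pitch=5.0))
# → {'play': (8, 2), 'stop': (8, 5), 'menu': (11, 2)}
```

Note the limitation the text points out: "play" and "stop" land on distinct bumps, but both feel identical, so the bumps support locating controls rather than identifying them.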

From the video, it also appears that the current technology is not well suited to braille. Spots small enough for braille would not be sharp or tall enough for effective reading. (For more information, Tiresias.org has a table of braille dimensions under different standards.)

About J. Bern Jordan

Bern is a Ph.D. candidate and researcher in accessibility, usability, and user interface technology, interested in extending usability to all people, including people with disabilities and those who are aging. He currently works at the Trace R&D Center in Biomedical Engineering at the University of Wisconsin-Madison.
