Publication: Loop interface

Published today is a defensive disclosure of an invention of mine, which I call the “Loop Interface.” Such an interface may be particularly useful to people who are blind or who have low vision and need speech output in order to use technology. The abstract from the disclosure:

Disclosed is a user interface method that allows for quick, efficient exploration and usage of user interfaces by people who cannot see. The interface method is a circular direct-selection interface where many user interface elements are arranged around the edges of a touch panel or touch screen. This makes a circular list of elements that users can touch to get speech output and then activate when they find the desired element. This arrangement can be explored easily but also allows users to efficiently and directly select elements on familiar interfaces.

The problem is that current methods of using screen readers are inadequate. People may use swiping gestures or keystrokes to navigate from element to element, which can be inefficient. Alternatively, on some touchscreen devices, a screen reader might allow a person to tap or drag a finger around the screen to explore it via text-to-speech. This exploration can be challenging because onscreen elements may be placed anywhere on the screen, in any arrangement, and can easily be missed as a finger is dragged across it.

A visual depiction of the Loop Interface in use.
Using the Loop Interface by tracing the edges of the screen with a finger and listening to the text-to-speech feedback.

With the Loop Interface, a person can simply trace the edges of the screen with a finger and reach every element on it. Once they become familiar with a particular application, they can directly touch the desired element without having to navigate through all the intervening options. More details about the Loop Interface are available in the disclosure [PDF]. In the future, I plan to do user testing on a basic version of the Loop Interface.
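To give a sense of how such a mapping could work, here is a rough sketch in Python. It assumes elements are spaced evenly around the screen's border, walked clockwise from the top-left corner; that spacing, the function name, and the example values are just illustrative choices for this post, not details from the disclosure.

# Minimal sketch of a perimeter-to-element mapping for a loop-style
# interface. The even spacing of elements along the border and all names
# here are assumptions for illustration, not the published design.

def perimeter_index(x, y, width, height, num_elements):
    """Map a touch near the screen edge to the index of the element whose
    slot covers that point, walking the border clockwise from the top-left."""
    perimeter = 2 * (width + height)

    # Snap the touch to the nearest edge and measure how far along the
    # clockwise walk (top -> right -> bottom -> left) that point lies.
    edge_distances = {"top": y, "right": width - x, "bottom": height - y, "left": x}
    nearest = min(edge_distances, key=edge_distances.get)
    if nearest == "top":
        distance = x
    elif nearest == "right":
        distance = width + y
    elif nearest == "bottom":
        distance = width + height + (width - x)
    else:  # left edge
        distance = 2 * width + height + (height - y)

    slot = perimeter / num_elements  # each element gets an equal stretch of border
    return int(distance / slot) % num_elements


# Example: a 10-element loop on a 1080x1920 screen; a touch near the
# top-right corner lands in one of the first few slots.
labels = [f"Item {i}" for i in range(10)]
idx = perimeter_index(1050, 30, 1080, 1920, len(labels))
print(labels[idx])  # a real implementation would send this label to text-to-speech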

The defensive disclosure was published today in the IP.com prior art database (IP.com database entry, requires subscription for full content). The publication was reviewed and sponsored by the Linux Defenders program. It will soon be published in freely available form on the Publications page of the Defensive Publications web site.

Publication: Virtual jog wheel

People who use screen readers frequently need to navigate between different elements on a screen. As the user navigates to an element, information about that element is provided in speech output. On touchscreen devices, screen reading software typically uses swiping/flicking gestures for navigation. For example, when using VoiceOver on Apple's iOS, one swipes to the right to go to the next element and to the left to go to the previous element (see more details about iOS VoiceOver). The problem is that these gestures are relatively slow and fatiguing. Using buttons to control navigation may be easier, but it is still relatively slow.

I recently invented and described a software technique for quickly and easily navigating a user interface on a touchscreen device. The user activates a virtual jog wheel mode and then makes arc or circular gestures on the wheel. With a single circular gesture, the user can navigate through a number of elements to reach the one they want.

Virtual jog wheel in use.
The virtual jog wheel in use. The person makes a generally circular gesture on the virtual jog wheel to navigate through the elements on the screen.
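
For those curious about the mechanics, the following Python sketch shows one way the gesture could be interpreted: it tracks the angle of the finger around the wheel's centre, accumulates the change, and fires a next or previous navigation event for each fixed step of rotation. The 30-degree step, the centre coordinates, and the callback names are assumptions made up for this example, not parameters from the disclosure.

# A rough sketch of turning arc gestures on a virtual jog wheel into
# next/previous navigation events. Step size, centre point, and callback
# names are illustrative assumptions, not taken from the disclosure.
import math

class JogWheel:
    def __init__(self, center_x, center_y, degrees_per_step=30,
                 on_next=lambda: None, on_previous=lambda: None):
        self.cx, self.cy = center_x, center_y
        self.step = math.radians(degrees_per_step)  # rotation needed per element
        self.on_next, self.on_previous = on_next, on_previous
        self.last_angle = None
        self.accumulated = 0.0

    def _angle(self, x, y):
        return math.atan2(y - self.cy, x - self.cx)

    def touch_down(self, x, y):
        self.last_angle = self._angle(x, y)
        self.accumulated = 0.0

    def touch_move(self, x, y):
        angle = self._angle(x, y)
        delta = angle - self.last_angle
        # Unwrap across the -pi/+pi boundary so a continuous circle keeps counting.
        if delta > math.pi:
            delta -= 2 * math.pi
        elif delta < -math.pi:
            delta += 2 * math.pi
        self.last_angle = angle
        self.accumulated += delta
        # Each full step of rotation moves one element forward or back.
        while self.accumulated >= self.step:
            self.accumulated -= self.step
            self.on_next()
        while self.accumulated <= -self.step:
            self.accumulated += self.step
            self.on_previous()


# Example: dragging a little more than a quarter circle clockwise around
# (200, 200) advances three elements when each step is 30 degrees.
wheel = JogWheel(200, 200, on_next=lambda: print("next element"),
                 on_previous=lambda: print("previous element"))
wheel.touch_down(300, 200)
for deg in range(0, 100, 5):
    a = math.radians(deg)
    wheel.touch_move(200 + 100 * math.cos(a), 200 + 100 * math.sin(a))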

More details are available in the virtual jog wheel disclosure [PDF]. This invention is free for anyone to use, but those using it will still need to ensure that the technology does not infringe any other patents.

The defensive disclosure was published today in the IP.com prior art database (IP.com database entry, requires subscription for full content). The publication was reviewed and sponsored by the Linux Defenders program through the Defensive Publications web site.