Assistive technology makes life easier for the visually impaired

Modern phones are powerful computers that enable a variety of assistive technologies. The Supersense application, for example, utilizes machine vision to describe what the phone’s camera sees.

Most mobile use cases don’t rely on super senses, however. The visually impaired need to be able to call their moms and receive SMS delivery notifications too.

The button phone of the pre-smartphone era was handy for the visually impaired: a physical keypad made it possible to dial numbers blind, at least in theory. Even though the good old button phone is now a rare sight, the idea lives on: you can still connect a keyboard to your phone via Bluetooth. But that is not an adequate solution for visually impaired phone users, who need a screen reader, voice assistant and other assistive technologies to handle everyday tasks.

A screen reader is a program that does exactly what it says on the tin: it reads out loud what’s on your screen. You might be surprised to learn that every iPhone comes with a built-in screen reader, and one is available for free on Android too. On Apple devices, the function is called VoiceOver, while on Android it is called TalkBack.

A shocking UX

Using a screen reader makes a fundamental difference to how you use your phone. When a sighted digital native switches it on by accident, the first reaction is a panicked “how do I switch this off!!!???”

Sighted users are used to browsing through masses of text and images, scanning them for interesting stuff and reacting to it in various ways almost instantly. This core behavior pattern can’t be transferred to a screen reader. You can’t make important buttons stand out by highlighting them in red.

https://twitter.com/Kristy_Viers/status/1287189581926981634

For most people, listening to a radio transmission or an audio book would probably be the most familiar analogy for the screen reader experience. Everything is described in meticulous sequence. You can’t scroll the screen back and forth frantically, but have to know where to jump to continue listening. That’s why audio books are divided into chapters and you can add bookmarks to them, and radio programs follow a schedule.

This same principle holds for using apps with a screen reader. The developers have to program the app so that the screen reader identifies the “chapter titles” and other important bits and can read them out loud.
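To make that concrete, here is a minimal SwiftUI sketch (the view and the strings are illustrative, not taken from the article) of how a developer can mark section titles as headings so that VoiceOver treats them as the “chapter titles” a user can jump between:

```swift
import SwiftUI

// Minimal sketch: section titles exposed as headings so the screen reader
// can treat them like chapter titles. Names and strings are illustrative.
struct ArticleView: View {
    var body: some View {
        ScrollView {
            VStack(alignment: .leading, spacing: 16) {
                Text("Navigation by landmarks")
                    // Announced as a heading; VoiceOver users can jump here
                    // with the rotor instead of listening to everything in order.
                    .accessibilityAddTraits(.isHeader)

                Text("Body text of the first section…")

                Text("Wait, there’s more!")
                    .accessibilityAddTraits(.isHeader)

                Text("Body text of the second section…")
            }
            .padding()
        }
    }
}
```

The same idea exists on Android: a view can be marked as an accessibility heading (for example with ViewCompat.setAccessibilityHeading), so TalkBack users can navigate by headings in the same way.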

Navigation by landmarks

Kaisa Penttilä, an expert working for the Finnish Federation of the Visually Impaired, says users navigate with screen readers by jumping between elements serving as “landmarks”. International studies have also confirmed that this is how the blind and visually impaired use mobile devices.

The elements used for navigation are headers, links and buttons.

These navigation elements are usually in their right places in web browsers, but can be all over the place in apps. At worst, the elements can be completely invisible and inaudible to screen reader users.
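As a hypothetical illustration of how an element can go missing for screen reader users: an icon-only, gesture-driven control with no accessibility label gives the screen reader nothing meaningful to announce, and it isn’t presented as a button the user can jump to. A short SwiftUI sketch (names invented for the example):

```swift
import SwiftUI

// Hypothetical icon-only control. Without the explicit label and the
// .isButton trait below, VoiceOver has nothing useful to say about it
// and does not offer it as a button "landmark" to navigate to.
struct PayButton: View {
    var action: () -> Void

    var body: some View {
        Image(systemName: "creditcard")
            .padding()
            .onTapGesture { action() }
            .accessibilityLabel("Pay invoice")   // what the screen reader reads aloud
            .accessibilityAddTraits(.isButton)   // announced and navigable as a button
    }
}
```

Using a standard Button in the first place gives the button trait for free; the explicit modifiers matter mostly for custom, gesture-driven views like this one.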

It helps a little that the user can direct the screen reader to objects on the screen with their finger. This navigation method is called touch exploration, but Penttilä fondly refers to it as “drifting”.

When you drift, the screen reader reads whatever your finger happens to touch, which can help a little if the application doesn’t present things in logical order by itself. It’s a good trick to know, especially with poorly accessible applications that have functionalities you couldn’t use otherwise.

Wait, there’s more!

Operating systems are equipped with other assistive technologies in addition to screen readers. Functions like zooming, increasing font size or inverting colors are primarily intended for visually impaired users. Device manufacturers have also designed other assistive solutions for people with motor disabilities.

iOS devices have the most extensive array of assistive features. Of the Android manufacturers, Samsung offers a broad selection of accessibility functions. Other Android vendors don’t offer quite the same level of support.

Can you get by with a screen reader?

Once you get past the initial shock, using your mobile with a screen reader is by no means an impossible proposition, especially for textual content. In certain scenarios, even sighted users can get added value from listening to their phones while focusing their eyes on something else entirely (say, when navigating). You can also manage many everyday tasks, such as payments and online shopping, with a combination of a screen reader and keyboard.

So, to answer the question in the header, you can get by with a screen reader, just don’t expect the same UX as with a graphical user interface. It’s a completely different experience, which serves to remind designers that you can’t build a good auditory interface by following graphical design guidelines.

According to Qvik’s ongoing research, life is not easy if you need to rely on a screen reader. Services that require strong user authentication, so common in our information society today, are especially challenging.

Qvik can help with improving accessibility

Qvik can help if you are unsure about your application’s accessibility or have already committed to improving it, but lack the expertise in your own organization. We offer accessibility services from baseline analysis and assessments to projects for improving accessibility with both immediate and long-term measures, from auditing and corrections to development team training.

We build solutions that are accessible by design and by default, so that you don’t have to make slow, costly and less effective modifications to finished solutions.

Illustration: Joel Pöllänen
