When a Florida federal judge ruled in 2017 that Winn-Dixie must make its website accessible to the blind, it put corporations across the U.S. on notice: making tech accessible to people with disabilities is not only the right thing to do, it’s also the law.
The lawsuit was the first of its kind to come to trial, but not the last. In 2018 the number of web accessibility lawsuits nearly tripled compared to 2017. As the web has become integral to the way we shop, communicate, and do business, making the web, applications, platforms, and other software solutions accessible to people with disabilities has become essential. This is especially true as web technologies spread to other mediums through hybrid mobile apps built with tools like Ionic, Flutter, and PhoneGap, and desktop applications built with tools like Electron.
I’ve always had a strong passion for accessibility in the physical and digital world. This passion blossomed in 2001 when my family started working with the Travis Roy Foundation, an organization dedicated to research to find a cure for paralysis and helping people with spinal cord injuries to become more independent.
We hold WIFFLE ball tournaments in the quarter-sized replica of Fenway Park we built on our Vermont property (yes, both engineering and sports fanaticism run in my family), and all the proceeds go to the foundation. This year’s annual tournament, pictured below, happened on August 9th and raised more than $750,000 for the foundation over one weekend.
Baking Accessibility into Software Development
Alongside my family’s work, I’ve had the opportunity to absorb and practice a wide range of techniques for creating accessible products, including applications to assist special education students in public schools and a wholly accessible app for a major mobile carrier — more on that later.
I’m proud to now work at Bresslergroup, where we believe in people-centric design and know that making tech accessible makes it better for everyone. And although the legal battlefield is currently focused on the web, we believe that mobile, voice, and “interfaceless” applications should consider accessibility before the first line of code is written or the first wireframe is constructed.
That’s why I wrote this post: to talk about the following solutions for making tech accessible, and how software engineers and designers can use them to bake accessibility into their products:
- Standard Visual Effects
- Visual Design: Affordances, Color Contrast Ratio, Font Sizes, Screen Readers, and Alternative Text
- Information Architecture: Content Flow and Reading Levels
- Third-Party Software
- Switch Controls
We’ve all had the experience of walking into a room and realizing we’ve forgotten the item for which we came. Now imagine if this happened to you all the time: every time you walked into a different room, clicked on a link, or navigated deeper into an app, you had to figure out why you were there.
This is what life is like for people with short-term memory loss, a cognitive impairment that’s often forgotten when it comes to building accessible applications.
Fortunately, we can design for it. For applications with a deep information architecture, we can use breadcrumbs. For shallower applications, we can fall back to our menu, a title, and making sure the navigation from where we came is clearly indicated. For multi-step processes such as a payment flow, where the application is collecting order details, multiple addresses, and payment details, we can show the process along with an indicator demonstrating what has been completed and where we are in the flow.
Standard Visual Effects
Another example of digital accessibility and making a product better for everyone is designing to avoid seizures. Strobing, flickering, or flashing animations or transitions can sometimes trigger epileptic seizures in susceptible people. Putting design rules in place that restrict those dangerous visual effects can make the experience better for everyone.
This sounds simple, but it can also apply to user-generated content. Applications that we design and build should take this into account to prevent over-customization of styles to the point that it impacts accessibility. An example of user-generated content undermining accessibility is the MySpace of the early 2000s (example pictured below). Numerous people were set on making their pages glow brighter than Times Square at night.
In addition to cognitive impairment and epilepsy, we believe design and development of applications should accommodate blindness or limited vision, colorblindness, deafness, and mobility or dexterity limitations. I’ll talk a little bit about each below, in the context of a real-life case study for a product I worked on a few years ago.
Software Accessibility Applied
In 2016, a prepaid wireless carrier’s account management mobile app came under internal scrutiny for its lack of support for accessibility features.
The app was an off-the-shelf, hybrid mobile app provided by the carrier’s billing service provider. The additional scrutiny stemmed from an acquisition by a major wireless carrier a few years prior and was driven by accessibility guidelines published by the Federal Communications Commission (FCC). The added regulation led the company to scrap the existing hybrid app and build native iOS and Android apps to support each operating system’s accessibility features.
Visual Design: Affordances, Color Contrast Ratio, Font Sizes, Screen Readers, and Alternative Text
Digital accessibility starts with design, and we baked it into our wireframing and visual design process.
Hyperlinks within the app surfaced early on; we had links to legal documents hosted on the carrier’s website, we had links to FAQs and support pages on the carrier’s website, and we had links that launched in-app modals with additional help content provided in context.
Often a link in text is differentiated only by a color that’s different from the regular text. But someone who is colorblind or has a vision impairment could miss colored text. We were required to provide a second affordance, and underlining the links was the solution in most cases. We also selectively implemented some links as buttons based on the context in which they were used. Both methods provide an affordance in addition to color.
Net: Do not rely on color as the only means of differentiating actions.
In addition to color and a second affordance, links and other interactive elements must be properly labeled in the underlying software. On the web, this takes the form of Alternative Text (alt text) for images and accessible names for controls. iOS uses VoiceOver as its screen reader and provides additional context by aurally reading the accessibilityLabel and accessibilityHint properties. In similar fashion, Android’s TalkBack reads the contentDescription and hint to provide additional accessibility descriptions for UI elements. The text of each link and button also needed to imply an action, such as “Pay my bill” or “View usage.”
Checking color contrast ratios is another key activity in the design of software applications. Font size, text color, and background color together determine the contrast ratio and inform designers and developers whether the text is legible. There are numerous tools and ratio standards for testing contrast.
At Bresslergroup, we recommend using the Web Content Accessibility Guidelines as a starting place to determine accessibility. The WCAG 2 level AA requires a contrast ratio of at least 4.5:1 for normal text and 3:1 for large text, and a contrast ratio of at least 3:1 for graphics and user interface components (such as form input borders).
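The AA thresholds above can be checked programmatically. Below is a minimal Python sketch of the WCAG 2 relative-luminance and contrast-ratio formulas; the function names are our own, but the math follows the WCAG definition:

```python
def _channel_to_linear(c: int) -> float:
    # sRGB channel (0-255) to linear light, per the WCAG 2 definition.
    s = c / 255
    return s / 12.92 if s <= 0.03928 else ((s + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb: tuple[int, int, int]) -> float:
    r, g, b = (_channel_to_linear(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: tuple[int, int, int], bg: tuple[int, int, int]) -> float:
    # Ratio of the lighter luminance to the darker, each offset by 0.05.
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black on white is the maximum possible contrast:
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
# Medium gray (#767676) on white just clears the 4.5:1 AA bar for normal text:
print(contrast_ratio((118, 118, 118), (255, 255, 255)) >= 4.5)  # True
```

Running a check like this in a design-review or CI step catches contrast regressions before they reach users.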
Information Architecture: Content Flow and Reading Levels
We also streamlined the information architecture to make it easier for the cognitively impaired (and everyone else) to navigate. Each screen contained bite-sized content that was easily digestible and we were required to stay below a 6.9 reading level on the Flesch-Kincaid scale. There are plenty of tools available to test Flesch-Kincaid and it’s even built into Microsoft Word. You can turn on Flesch-Kincaid in Word by enabling Show readability statistics in the settings and running Spelling & Grammar under the Review tab.
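As a rough illustration of how such tools work, here is a Python sketch of the Flesch-Kincaid grade formula (0.39 × words-per-sentence + 11.8 × syllables-per-word − 15.59). The syllable counter is a simple vowel-group heuristic of our own; real tools count syllables more carefully, so treat the output as approximate:

```python
import re

def count_syllables(word: str) -> int:
    # Heuristic: count groups of consecutive vowels, treating a trailing
    # silent "e" as non-syllabic. Real tools use dictionary lookups.
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and not word.endswith(("le", "ee")) and count > 1:
        count -= 1
    return max(count, 1)

def flesch_kincaid_grade(text: str) -> float:
    # Grade = 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59
    sentences = max(len(re.findall(r"[.!?]+", text)), 1)
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / sentences) + 11.8 * (syllables / len(words)) - 15.59

# Short words in short sentences keep the grade well below 6.9:
text = "The cat sat on the mat. We like short words."
print(flesch_kincaid_grade(text) < 6.9)  # True
```

The formula makes the levers obvious: shorter sentences and shorter words both pull the grade down.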
Because we broke content up across screens, we also had to make sure that each screen was clearly labeled with an appropriate title. This oriented users who may have a cognitive impairment as to where they were in a process flow (e.g., the payment flow).
When it comes to flow, we also ran into some challenges around the customer support chat screen. Close your eyes and imagine trying to interact with a chat application using only voice commands. We had to manipulate the order in which the screen reader read items on the screen. We had status messages such as “Agent is typing,” “Connection is lost,” and “Chat has ended.” Those statuses were important to read, and when they changed they were put into a queue of messages to be read.
New messages also had to be read as they arrived. The messages were put into that same queue to be read at the conclusion of the prior message being read. Messages from the agent needed to be prefixed with “Message from agent” so that it was clear from whom the message had been sent. All of this was in addition to the user being able to scroll the screen reader back through the chat history.
Our client’s accessibility team included a member who was blind and would periodically review our app and screens. Working with our client’s accessibility team, we were able to gain key insights into how the blind prefer to interact with screens such as the chat screen.
Third-Party Software
Open-source software is great: it accelerates development, it lets a community troubleshoot issues, and it democratizes software development. However, accessibility is typically an afterthought in open-source solutions.
Libraries, plugins, and other open-source software may not support screen readers, maintain correct color contrast ratios, or expose proper screen-reader ordering. Development teams should watch for these issues when selecting software, to avoid building inaccessible products.
Switch Controls
Once a screen reader is verbalizing each screen in the correct order, an additional step is needed: testing with switch controls. A switch control is an assistive technology that allows people with limited dexterity to interact with every feature of an application using a single method of input. This could be a click, a head movement, or even a full keyboard paired over Bluetooth. To try out the switch control on iOS, navigate to Settings > General > Accessibility > Switch Control. I encourage everyone who is not familiar with switch controls to spend some time using apps on their phone with this technology.
The switch (most often a pad that can be tapped, like the one pictured below, or a camera that can distinguish small head movements) highlights every option on the screen in turn, in a repetitive cycle. Major sections of the screen or logical sets of components are grouped together to reduce the number of items in each cycle. When the desired option is highlighted, the user taps the pad to choose it. While it may sound cumbersome, the cycling speed is adjustable, and people experienced with switches can navigate through a properly designed site amazingly quickly.
While building our mobile app for the wireless carrier, we noticed that the switch had a sorting problem: it didn’t cycle through the choices in the proper order. We quickly realized it had to do with the way the iOS component interacted with Switch Control, and there was nothing we could do to fix it ourselves. We filed a ticket with Apple so that their development team would be aware of the issue, and we provided that documentation to our client in case a suit was ever filed.
Even after testing and releasing an app or device into production, it’s important to stay ahead of the accessibility curve: regularly review the product to make sure it’s still accessible.
What Kind of World?
In an ideal world, both the digital and physical realms would be designed for people of all degrees of mobility, dexterity, sight, hearing, and cognitive capability.
As engineers and designers, we have a great deal of power to make this happen with the way we go about shaping everyday experiences. You’ve probably heard the saying, “With great power comes great responsibility.” For me, accessibility is an essential part of that responsibility.
If you liked this, you might like How Design for Accessibility Drives Innovation for All.