Man Versus Machine – AI Can’t Work Alone

More and more, we are developing artificial intelligence tools to automate large, complex processes and save us time and money. This includes accessibility testing. But there are still limits to what artificial intelligence can do for us.

There are hundreds of AI-driven automated tools on the market that promise to optimize your website for accessibility: some are free browser plugins that anyone can use, others are full-scale project management platforms that cost tens of thousands of dollars. But despite these tools, the human component cannot be removed from the testing process. Accessibility testing should be done with a combination of manual (human/user) testing and automated web testing tools.

How automated testing tools are useful

Web accessibility testing tools can identify between 20% and 30% of accessibility barriers programmatically. They can sweep through thousands of URLs and identify issues on all of them, dramatically speeding up an audit and providing a valuable mechanism for fixing recurring elements and the “low-hanging fruit” of accessibility barriers. But that still leaves 70–80% of issues to be addressed.
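
As a rough sketch of what such a sweep looks like (the URLs below are placeholders, and a single rule stands in for the hundreds of rules real tools run per page):

    # Minimal sketch of an automated accessibility sweep: fetch each page
    # and run one programmatic rule (images missing an alt attribute).
    import requests
    from bs4 import BeautifulSoup

    urls = ["https://example.com/", "https://example.com/products"]  # placeholder URLs

    for url in urls:
        html = requests.get(url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")
        for img in soup.find_all("img"):
            if img.get("alt") is None:  # alt="" is valid for decorative images
                print(f"{url}: image {img.get('src')} has no alt attribute")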

How automated testing tools fail

None of these tools are perfect.  Here are some classic examples of automation failures:

Automated tools can check whether an image has alt text, but they cannot check whether that alt text is correct – even the best picture recognition software makes some pretty funny mistakes. One photo recognition program decided that an image was of a cat on a bicycle (it was actually a cat standing next to an electric fan).
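
The gap is easy to see in code: a tool can verify that alt text exists, and perhaps apply a crude heuristic for obviously bad values, but it cannot confirm that the text actually describes the image. A sketch (the filename heuristic here is our own illustration, not a standard rule):

    import re
    from bs4 import BeautifulSoup

    html = ('<img src="p1.jpg" alt="IMG_4521.JPG">'
            '<img src="p2.jpg" alt="A cat on a bicycle">')
    soup = BeautifulSoup(html, "html.parser")

    for img in soup.find_all("img"):
        alt = img.get("alt", "")
        # Machine-checkable: alt text that is just a raw filename.
        if re.fullmatch(r"(?i)[\w-]+\.(jpe?g|png|gif|webp)", alt):
            print(f"Suspicious alt text: {alt!r}")
    # Not machine-checkable: the second alt passes every automated rule,
    # even though the photo actually shows a cat standing next to a fan.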

They can check whether or not a page has headings and whether the heading structure is hierarchical, but cannot check if the correct content is in those headings.
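
The hierarchy check itself is mechanical; a sketch of the kind of rule a tool applies, flagging skipped heading levels (the sample markup is ours):

    from bs4 import BeautifulSoup

    html = "<h1>Products</h1><h3>Shipping</h3><h4>Rates</h4>"  # sample markup
    soup = BeautifulSoup(html, "html.parser")

    prev = 0
    for h in soup.find_all(["h1", "h2", "h3", "h4", "h5", "h6"]):
        level = int(h.name[1])  # "h3" -> 3
        if level > prev + 1:
            print(f"Skipped level: <{h.name}> {h.get_text()!r} follows h{prev}")
        prev = level
    # Whether "Shipping" is the right heading for the content beneath it
    # is a judgment no tool can make.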

Automated web testing tools can also fall short when it comes to color. If your website instructs the user to click the red button, an automated tool may be able to check whether the button is red, but it cannot relate that information to the instructions and recognize that red is being used to convey meaning in this situation. So the tool cannot tell you that this button is unusable for people who are red/green colorblind.
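
What a tool can measure about color is pure arithmetic. The WCAG contrast ratio, for instance, is fully machine-computable from the spec’s relative-luminance formula (WCAG AA requires at least 4.5:1 for normal-size text); a sketch with arbitrary sample colors:

    # WCAG 2.x contrast ratio between two sRGB colors, per the spec's
    # relative-luminance formula. The sample colors below are arbitrary.
    def linearize(c: int) -> float:
        c /= 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

    def luminance(rgb: tuple) -> float:
        r, g, b = (linearize(c) for c in rgb)
        return 0.2126 * r + 0.7152 * g + 0.0722 * b

    def contrast(fg: tuple, bg: tuple) -> float:
        hi, lo = sorted((luminance(fg), luminance(bg)), reverse=True)
        return (hi + 0.05) / (lo + 0.05)

    print(f"{contrast((204, 0, 0), (255, 255, 255)):.2f}:1")  # measurable...
    # ...but whether that red is carrying meaning is not measurable.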

Automated tools cannot tell you whether a complex or unusual page has the correct reading order; “logical reading order” is defined by content structure, not by simplistic, machine-parsable ideas of “top left to bottom right” or “two clear columns.” Look at any listing page on eBay or Amazon and you will understand how reading order can be a very complex mechanism.

Additionally, many do not work on password-protected pages.  Some are, in fact, “better” than others, but better is often subjective and “more thorough” risks more false positives.

False positives and false negatives require manual review

Many items could theoretically be flagged by a tool, but would still need to be verified manually.

Tools that flag every HTML bold element (<p><b>bold</b></p>), strong element (<p><strong>strong</strong></p>), or italicized element (<p><i>italic</i></p>) as a potential issue may catch many items that are visually presented as headings or region dividers but are not coded as headings. However, the same tool will also flag every instance where that styling is used simply for emphasis within the text. The only way to resolve this is to manually review the flagged items, confirm which ones are actually meant to be headings, and ensure they are arranged in the correct hierarchy.
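
A simplified version of that kind of rule shows why the noise is unavoidable: the same pattern matches both cases (the sample markup is ours):

    from bs4 import BeautifulSoup

    html = """
    <p><b>Shipping Policy</b></p>
    <p>Delivery takes <b>3 to 5</b> business days.</p>
    """
    soup = BeautifulSoup(html, "html.parser")

    for p in soup.find_all("p"):
        if p.find(["b", "strong", "i"]) is not None:
            # The first <p> is a visual heading (a real issue); the second
            # is ordinary emphasis (a false positive). The rule cannot
            # tell them apart, so both are flagged for human review.
            print(f"Possible uncoded heading: {p.get_text(strip=True)!r}")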

If a website is using color to convey meaning, some tools can detect changes in the font color used, but they cannot understand the context well enough to determine whether green text in one section is an indication of meaning or purely decorative. Only a manual review can determine this.

Some tools will flag issues for images containing identical alt text, links with identical names, or buttons with the same label. This helps identify placeholder labels and alt text, but what if you intentionally use the same image twice on a page? Or have a help button at both the top and the bottom of the page? These kinds of items still require manual checking to distinguish legitimate duplicates from genuine errors.
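
A sketch of the duplicate check (the sample markup is ours); deciding which repeats are placeholders and which are intentional is the part the tool cannot do:

    from collections import Counter
    from bs4 import BeautifulSoup

    html = """
    <a href="/help">Help</a> <a href="/help">Help</a>
    <img src="a.jpg" alt="logo"> <img src="b.jpg" alt="logo">
    """
    soup = BeautifulSoup(html, "html.parser")

    links = Counter(a.get_text(strip=True) for a in soup.find_all("a"))
    alts = Counter(img.get("alt", "") for img in soup.find_all("img"))

    for kind, counts in (("link text", links), ("alt text", alts)):
        for text, n in counts.items():
            if n > 1:
                # Could be a copy-pasted placeholder, or an intentional
                # repeat (a Help button at the top and bottom of a page).
                print(f"Repeated {kind} ({n}x): {text!r}")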

An automatic flag for videos without captions or transcripts can be highly beneficial for a video presentation, but it will create hundreds of false positives if, say, a college music department uploads instrumental music files, which a manual tester would easily determine need no captioning.
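
The flag itself is one line of logic against HTML5 markup; the rule simply has no way to know that the audio is instrumental (the file names below are hypothetical):

    from bs4 import BeautifulSoup

    html = ('<video src="lecture.mp4"></video>'
            '<video src="etude.mp4"></video>')  # sample markup
    soup = BeautifulSoup(html, "html.parser")

    for video in soup.find_all("video"):
        if video.find("track", kind="captions") is None:
            print(f"No captions track: {video.get('src')}")
    # Both videos are flagged identically; only a human reviewer knows
    # that etude.mp4 is instrumental music and needs no captions.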

Additionally, sitewide sweeps can generate lists of thousands of individual issues and bugs, all of which will have to be reviewed, and many of which are not errors.

The UK government created a deliberately “broken” test website to evaluate various automated testing tools, and published a complete study of the results.

Education is your best tool for accessibility

The best way to ensure your web pages are accessible is to make sure your content creators and web designers understand accessibility and how to avoid accessibility issues. Being proactive and building a culture of accessibility into your organization will go a long way toward preventing these barriers, as well as the complaints and litigation that can occur when your online material is not in compliance.

Creators should learn when and how to use headings and how to create workable heading hierarchies. They should know best practices for adding alt text to images and for using links on the page, understand what constitutes adequate color contrast, and know how to properly label various elements for best usability. The good news is that making your website accessible makes it more usable for your entire audience. Captioning helps people who are watching videos in a busy environment or who want privacy. Proper contrast helps when people are viewing a screen outdoors where there is a lot of glare. Most text is more usable when it auto-sizes to fit the screen, particularly on a tablet or smartphone. And good keyboard controls save everyone time: does anyone reading this still right-click to copy now that CTRL+C is virtually omnipresent?

Artificial intelligence web testing tools can be a good start for accessibility checking. They should be followed by thorough manual testing by a human who can pinpoint the 70–80% of errors that automated tools will miss, and who can correctly interpret the items those tools flag as needing attention.

Want to learn how to do a better job testing your website? Let us help

Ryan Pugh

Ryan Pugh | Director of Accessibility | Equidox

Prior to joining Equidox, Ryan Pugh served as an Access Technology Analyst for the National Federation of the Blind (NFB) in Baltimore, where he was the NFB's focal point for accessibility and usability testing. He conducted intensive web accessibility audits for compliance with Web Content Accessibility Guidelines (WCAG) 2.0 AA for numerous Fortune 500 companies, including some of the world’s largest online retailers, notable colleges and universities, government agencies at the federal, state, and local levels, and other non-profit institutions. He also delivered accessibility training workshops and managed the NFB’s document remediation program, specializing in PDF accessibility.