The best game plan for accessibility testing is to start with an automated tool and check for form label errors, ARIA landmarks, color contrast, and alt attributes on images. After that you'll want to do some manual testing for keyboard navigation and assistive technologies.
Types of Tests
Automated tools can be used for high-level testing to identify problems with page structure, ARIA roles, and semantics. They are a great first line of defense, but they are not a substitute for manual testing, especially testing with assistive technologies, because they frequently give inconclusive results and sometimes false positives.
We recommend the following tools for automated testing:
- aXe for Chrome and aXe for Firefox
- Accessibility Developer Tools for Chrome. Use the Audit tab in Developer Tools to analyze a site.
- AInspector Sidebar for Firefox
- Functional Accessibility Evaluator 2.0: Testing. With an account, up to 200 pages can be checked at a time.
- Khan Academy Chrome Plugin. Not good for everything, but quick for color contrast, alt text, and page structure.
The following automated testing tools can run on a local development server, in the browser, as part of functional or unit tests.
For sighted keyboard users, you will want to tab through an interface to make sure that form elements, buttons, and links have a focus state to indicate where a user is on the page. Check that buttons and links open with the appropriate keystrokes (Enter for links, Space or Enter for buttons, Space for form selections). Modals and panels should receive keyboard focus after they are opened and should have a keyboard method for closing them. Zoom the page to 200%: can you still navigate? Is the text still legible?
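The keystroke conventions above can be captured in a small helper. This is a hypothetical sketch, not part of any library: the `shouldActivate` function and `Role` type are our own names, and the key values follow the standard DOM `KeyboardEvent.key` strings (`"Enter"`, `" "` for Space).

```typescript
// Hypothetical helper mapping the keyboard conventions from the text:
// Enter activates links, Enter or Space activates buttons,
// and Space toggles form selections such as checkboxes.
type Role = "link" | "button" | "checkbox";

function shouldActivate(role: Role, key: string): boolean {
  switch (role) {
    case "link":
      return key === "Enter";
    case "button":
      return key === "Enter" || key === " ";
    case "checkbox":
      return key === " ";
    default:
      return false;
  }
}

console.log(shouldActivate("link", "Enter")); // true
console.log(shouldActivate("link", " "));     // false: Space scrolls the page on a link
console.log(shouldActivate("button", " "));   // true
```

A custom widget built from `<div>`s gets none of this behavior for free, which is why manual keyboard testing catches problems that automated tools miss.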
For non-sighted keyboard users, or those using assistive technologies such as screen readers, you will want to navigate through a page checking a sampling of page elements such as navigation, tables, forms, modals, and multimedia. Navigation should be clear to someone listening to the links. Tables should mark up header cells with <th> tags. Form fields should indicate verbally if they are required, and form errors should appear as inline text and be announced in some manner if a form fails. Modals should receive focus (if they don't, the screen reader will read the whole page underneath first). Images should have alt attributes or labels. Video controls should be accessible and provide subtitles.
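To make a required field and its inline error announce correctly, screen readers rely on a few ARIA attributes: `aria-required`, `aria-invalid`, and `aria-describedby` (which links the error text to the field). The attributes are standard ARIA; the helper function and field shape below are hypothetical names used for illustration.

```typescript
// Hypothetical sketch: the ARIA attributes a form field should carry so
// screen readers announce that it is required and read its inline error.
interface FieldState {
  required: boolean;
  errorId?: string; // id of the inline error element, if the field is invalid
}

function ariaAttributes(state: FieldState): Record<string, string> {
  const attrs: Record<string, string> = {};
  if (state.required) attrs["aria-required"] = "true";
  if (state.errorId) {
    attrs["aria-invalid"] = "true";
    // aria-describedby ties the error text to the field, so the screen
    // reader speaks the error along with the field's label.
    attrs["aria-describedby"] = state.errorId;
  }
  return attrs;
}

console.log(ariaAttributes({ required: true, errorId: "email-error" }));
```

During manual testing, listen for all three pieces when you focus the field: its label, that it is required, and the error text if validation failed.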
Manual testing should cover the following areas:
- Keyboard-only users can navigate completely through the site, and the tab order is logical.
- It is apparent which element of the site currently has focus, and the focus state is clear and visible compared to elements that don't have focus.
- All interactive elements work with just the keyboard. For example, users should be able to navigate to a data table and read table headers and data cells with a screen reader.
- Screen readers announce the labels of form controls, error messages, and alternative text for images.
Some good checklists to use are:
According to the WCAG 2.0 success criterion on minimum contrast, non-decorative text sized at least 18 point, or at least 14 point if bold, should have a color contrast ratio of at least 3:1. Text smaller than that should meet a contrast ratio of at least 4.5:1.
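The ratio itself comes from the WCAG 2.0 relative-luminance formula: each sRGB channel is linearized, the channels are weighted (0.2126 red, 0.7152 green, 0.0722 blue), and the ratio is (L1 + 0.05) / (L2 + 0.05) with L1 the lighter color. The formula is from the spec; the function names below are our own, and hex parsing is simplified to six-digit colors.

```typescript
// WCAG 2.0 contrast-ratio computation. The math follows the spec's
// relative-luminance definition; helper names are illustrative only.
function channel(c: number): number {
  const s = c / 255;
  // Linearize the sRGB channel value.
  return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
}

function luminance(hex: string): number {
  const n = parseInt(hex.replace("#", ""), 16); // assumes "#rrggbb"
  const r = channel((n >> 16) & 0xff);
  const g = channel((n >> 8) & 0xff);
  const b = channel(n & 0xff);
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

function contrastRatio(a: string, b: string): number {
  const [lighter, darker] = [luminance(a), luminance(b)].sort((x, y) => y - x);
  return (lighter + 0.05) / (darker + 0.05);
}

console.log(contrastRatio("#000000", "#ffffff")); // 21, the maximum possible ratio
```

Ratios range from 1:1 (identical colors) to 21:1 (black on white), which is where the 3:1 and 4.5:1 thresholds above sit on the scale.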
There are a number of online tools to check contrast, including:
Automated testing tools will report color contrast violations and some tools will suggest color value pairs to match the WCAG AA or AAA recommendation.
Screen readers are audio interfaces that convert text on the screen into synthesized speech so that users can listen to the content. Without screen readers, people who are blind would need to rely on other individuals to read the content out loud to them.
Common screen readers include:
- JAWS, screen reader for Windows, demo version available
- NVDA, free and open source screen reader for Windows
- VoiceOver, screen reader built into Apple's Mac OS X, iOS, tvOS, and watchOS.
- ChromeVox, screen reader for Chrome and Chrome OS.