Automated vs. Manual Accessibility Testing

The best option for testing accessibility is to combine both automated and manual testing.

We receive many questions about automated accessibility testing. As the official Section 508 guidance states:

“Automated testing and evaluation tools are not sophisticated enough to tell you, on their own, if your site is accessible, or even compliant. You must always conduct manual testing to ensure full compliance with the Revised 508 Standards.”

We find automated testing tools beneficial during our review process. However, because we know what it takes to make a website accessible, we also know that machine-generated accessibility reports cannot test everything and often include false positives and false negatives.

The lists below show some of the testing you can perform with each option.

Manual Testing

  • Distinguishable links
  • Accurate alternative text
  • Actual color contrast
  • Use of color
  • Keyboard accessibility
  • Accurate form labels
  • Form error messages
  • Consistent navigation
  • Text resize
  • Timing
  • Use of sensory characteristics

Automated Testing

  • Empty links
  • Presence of alternative text
  • Basic color contrast
  • Presence of page title
  • Presence of document language
  • Presence of form labels

Because evaluation tools can produce false or misleading results, you will need to know how to interpret the report once you perform automated testing. Below are some common false positives, including why they are not accurately reported.

Non-Distinguishable Links

Non-distinguishable links are one of the most common false positives we see. WCAG success criterion 2.4.4, Link Purpose (In Context), requires that links be distinguishable by either the link text alone or the link text together with its immediate surrounding content.

If your web page includes the same link text in multiple places, an automated tool will mark it as an error whenever the URLs are not exactly the same. Often, however, those links are redirected and actually bring the user to the same place.

To test manually, visit each link that shares the same text and confirm that they all lead to the same destination.
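The automated side of this check amounts to a small grouping pass: collect every link's visible text and URL, then flag any text that points to more than one distinct URL. Everything the pass flags still needs a manual visit, since redirects may make the URLs equivalent. A sketch (the function name is illustrative):

```python
def find_ambiguous_links(links):
    """Given (text, href) pairs, report link text that points to more
    than one distinct URL -- candidates for manual review only, since
    different URLs may redirect to the same destination."""
    by_text = {}
    for text, href in links:
        by_text.setdefault(text.strip().lower(), set()).add(href)
    return {text: sorted(hrefs)
            for text, hrefs in by_text.items() if len(hrefs) > 1}
```

For example, `find_ambiguous_links([("Read more", "/a"), ("Read more", "/b"), ("Home", "/")])` flags only "read more", because it points at two different URLs.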


Color Contrast

Automated testing does not account for graphics with background colors because the tool cannot see the colors inside the graphic. If you receive a color contrast error, you will need to test the foreground (text) color against the actual background color you see, not just the background color named in the website styles.

Additionally, if a graphic contains text, you will also need to test the color contrast ratio manually, since automated testing cannot read text inside images.
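The contrast math itself is defined by WCAG 2.x, so once you have read the actual colors off the screen (for example, with a color picker), scripting the check is straightforward. A minimal implementation of the relative luminance and contrast ratio formulas:

```python
def relative_luminance(rgb):
    """WCAG 2.x relative luminance of an sRGB color given as 0-255 channels."""
    def channel(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio (lighter + 0.05) / (darker + 0.05); ranges 1 to 21.
    Normal-size text needs at least 4.5:1 to meet WCAG 2.x Level AA."""
    lighter, darker = sorted((relative_luminance(fg),
                              relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)
```

Black on white yields the maximum ratio of 21:1, while the common gray #767676 on white sits just above the 4.5:1 threshold for normal text.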

Alternative Text

Automated testing is great at finding missing text alternatives. However, automated tools cannot determine whether the alternative provided is accurate.

To determine whether the alternative text is correct, review it and confirm that it accurately describes the non-text content.
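While no script can judge accuracy, a small heuristic can triage alt text that is almost certainly a placeholder (a filename, the word "image", and so on) so a human reviews those first. A sketch, with illustrative patterns:

```python
import re

# Illustrative patterns; a real review checklist would be longer.
SUSPICIOUS_PATTERNS = [
    r"\.(png|jpe?g|gif|svg|webp)$",            # looks like a filename
    r"^(image|picture|photo|graphic)( of)?$",  # placeholder wording
    r"^img[_-]?\d*$",                          # e.g. "img_1234"
]

def flag_suspicious_alt(alt: str) -> bool:
    """True if the alt text matches a pattern that usually indicates a
    placeholder rather than a real description. A triage aid only --
    a human must still judge whether the description is accurate."""
    text = alt.strip().lower()
    if not text:
        return False  # empty alt is valid for purely decorative images
    return any(re.search(p, text) for p in SUSPICIOUS_PATTERNS)
```

Anything this flags gets reviewed first; anything it passes still needs the human check described above.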

Use of Presentational Attributes

Sighted users perceive the structure of a web page and the relationships between its content using various visual cues. For example, headings are often larger, and list items may include bullets. In HTML (Hypertext Markup Language), we use elements such as h1, h2, and h3 to distinguish text as headings.

Since an automated scan cannot understand the purpose of content, it also cannot determine whether a web page uses the proper structural markup.

If you receive this error, you will need to review the page and determine whether the markup, such as a heading level, accurately reflects the text it applies to.
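One structural check that can be scripted is heading order: given the heading levels in document order, flag any jump that skips a level (for example, h1 straight to h3). Whether each heading's text actually matches its content still requires a human. A minimal sketch:

```python
def check_heading_order(levels):
    """Given heading levels in document order, e.g. [1, 2, 3, 2],
    report any heading that skips a level. Moving back up (h3 to h2)
    is fine; jumping down more than one step (h1 to h3) is flagged."""
    problems = []
    prev = 0
    for i, level in enumerate(levels):
        if level > prev + 1:
            problems.append(f"heading {i + 1}: jump to h{level} skips a level")
        prev = level
    return problems
```

For example, `check_heading_order([1, 2, 3, 2, 3])` reports nothing, while `check_heading_order([1, 3])` flags the second heading.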

As you can see, an automated tool cannot guarantee a compliant website. Be sure your webmaster team is fully trained in accessibility and able to conduct both automated and manual testing. Of course, if your team is not already trained, we can do it for you. From training to testing, we can do it all! Contact us today to find out more.