Testing is an important part of the journey to accessibility. It makes sure that a website is as usable as possible for everyone. But there are different types of audits you can run on your website or app, so it can be hard to know where to start. Whichever methods you choose, it’s best to factor testing into a project from the start. Leaving it until later can mean spending more money and time fixing complex issues.

But what are the main differences between these audits? And who wins the argument of ‘user testing vs automated testing: which is better?’

User testing for accessibility

We use Shaw Trust Accessibility Service’s disabled website testers to go through the website at each stage of the build, from the initial designs to the final development phase.

These audits check that everyone can access the site using screen readers, keyboard tabbing and voice commands, to name a few. User testing is the less used of the two methods. According to a survey by PractiTest of over 1,600 self-described ‘web testers’, only 30% claimed they carried out co-ordinated user testing. Just 5% more (35%) claimed they used user simulations.

The best part of this method is the human element it brings to testing. A piece of software is great for spotting technical errors, but it’s no match for user testing with a group of disabled people who have a variety of impairments. Only disabled people, with their lived experience, can tell you where there’s a problem in the user interface and design.

The downside is, of course, the time frame. More than 47% of the 1,600 ‘web testers’ said they found time frames ‘extremely challenging’ when using this method. A further 31% described them as ‘challenging’.

You can also do specific user-journey testing with this method. This means a tester can run through a set of frequently used pages in the order users typically visit them. Maybe it’s searching for and buying a specific product on the website, or navigating to the contact page and filling out a form.

This allows the interface, the experience and the technicalities to be tested effectively. With automated testing, a computer checks for certain failures pulled from the Web Content Accessibility Guidelines (WCAG). Those checks are limited, and this is why testing with real-life users is important. Humans will experience things that automated tests simply cannot pick up.

Automated testing for accessibility

Automated testing uses an online or downloadable piece of software that scans the code of the website and determines how technically sound it is, based on the code and the standards set out in the W3C’s WCAG.

Automated testing is the most popular method: nearly 60% of ‘web testers’ use it to find faults. These pre-scripted tests compare the actual results against the expected results of an accessible website. In doing so, they can fish out bugs, such as blank headings or links, that are buried in the code, as the sketch below shows.
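To make that concrete, here is a minimal sketch of this kind of pre-scripted check. It is written in TypeScript with the jsdom library (our own illustrative choice, not a tool named in this article) and flags the three kinds of bug mentioned above:

```typescript
// A sketch of an automated accessibility check (assumed setup: Node.js
// with the jsdom package installed). It parses a page and flags three
// common WCAG failures buried in the markup.
import { JSDOM } from "jsdom";

function auditHtml(html: string): string[] {
  const doc = new JSDOM(html).window.document;
  const issues: string[] = [];

  // WCAG 1.1.1 (Non-text Content): every <img> needs an alt attribute.
  doc.querySelectorAll("img:not([alt])").forEach(() =>
    issues.push("Image is missing an alt attribute")
  );

  // WCAG 2.4.6 (Headings and Labels): headings should not be blank.
  doc.querySelectorAll("h1, h2, h3, h4, h5, h6").forEach((h) => {
    if (!h.textContent?.trim()) issues.push(`Blank heading <${h.tagName.toLowerCase()}>`);
  });

  // WCAG 2.4.4 (Link Purpose): links need discernible text.
  doc.querySelectorAll("a").forEach((a) => {
    if (!a.textContent?.trim() && !a.getAttribute("aria-label")) {
      issues.push(`Blank link to ${a.getAttribute("href") ?? "(no href)"}`);
    }
  });

  return issues;
}

// The image and heading fail; the link passes because it has text.
console.log(auditHtml('<img src="logo.png"><h2> </h2><a href="/contact">Contact us</a>'));
```

A real automated tool runs hundreds of rules like these across every page of a site in seconds.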

Automated testing is great for checking that specific website functions are accessible to the user. It also helps with quality assurance, confirming that bugs picked up in earlier testing have actually been fixed.

When user testers see a website for an extended period of time, there is a chance they have memorised certain user journeys and how to navigate them, which means issues can easily be missed. Automated testing removes that human-error element and alerts developers to faults that may have been overlooked.

Check out our article ‘6 quick tools web developers can use to test for accessibility’.

Combine both methods for best results

Between user testing and automated testing, which one is best? One is cheaper and frees up time for other tasks, while the other provides real-life insight into accessibility from the people it affects the most.

Automated tests are useful, but they are not the same as testing with real people. When you combine the two, it can be a recipe for a better and more usable website. eMarketer quizzed marketers and webmasters alike and found that 86% of them considered testing a good practice that benefits the websites they run.

User testing with a range of disabled people covers both usability and function. There is no other way of gaining insight into the customer experience in relation to accessibility.

As much as automated programmes can highlight things like missing alt text on images or links with no description, they can only measure a colour contrast ratio against the WCAG thresholds; they can never accurately tell you whether a page is actually readable for a particular low vision user.
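For illustration, here is roughly what that contrast check looks like. The formula is the relative luminance and contrast ratio calculation defined in WCAG 2.x; the hex colours are arbitrary examples of ours:

```typescript
// The contrast ratio calculation from WCAG 2.x. An automated tool can
// compute this number precisely, but it cannot judge how the page feels
// to an individual low vision user.
function relativeLuminance(hex: string): number {
  // Assumes a "#rrggbb" colour string.
  const [r, g, b] = [1, 3, 5].map((i) => {
    const c = parseInt(hex.slice(i, i + 2), 16) / 255;
    // Linearise each sRGB channel, per the WCAG definition.
    return c <= 0.03928 ? c / 12.92 : ((c + 0.055) / 1.055) ** 2.4;
  });
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

function contrastRatio(fg: string, bg: string): number {
  const [lighter, darker] = [relativeLuminance(fg), relativeLuminance(bg)].sort((a, b) => b - a);
  return (lighter + 0.05) / (darker + 0.05);
}

// Mid-grey text on a white background: ratio ≈ 2.85, well below the
// 4.5:1 minimum WCAG requires for normal-size text at level AA.
console.log(contrastRatio("#999999", "#ffffff").toFixed(2));
```

The maths gives a clean pass or fail against the 4.5:1 threshold, but only a human can tell you whether the page is comfortable to read.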

Our approach

We use three different testing methods to be confident that a site is truly accessible:

  • An automated test to check for errors on the site.
  • A technical manual review, where one of our expert team members goes through the site, checking the code and styling elements that an automated test would miss.
  • Disabled user-testers, who fully run through the website to ensure that it is accessible and usable.

If you’d like to find out more about Hex, their accessible websites or user testing, you can visit their website.