There is a great deal of confusion in the marketplace about automated website accessibility testing: what it does, and what it does not do. This article attempts to explain some of the issues.

Firstly, SiteMorse only tests sites against already established, recognized standards; we have not created any of our own.

Automated testing cannot, and never will be able to, test a web site for complete accessibility compliance. Some of the issues involved require judgements that a computer simply cannot make.

However, at the same time, automated testing is an extremely useful tool as part of a programme for achieving and maintaining accessibility. Some accessibility guidelines (for example, the requirement that HTML page source be valid according to published standards) are practically impossible to verify manually. Others (such as the requirement that all non-text elements, such as images, have textual equivalents) cannot be automatically verified as correct, but many actual or likely problem areas can be identified automatically.
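To make that distinction concrete, here is a minimal sketch (in Python, using only the standard library, and not SiteMorse's own code) of the kind of check a machine can perform. The MissingAltChecker class and the sample markup are invented for this illustration: a tool can mechanically flag an image that has no textual equivalent at all, but it cannot judge whether the text supplied for another image is actually meaningful.

from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing = []  # (line, column) of each <img> with no alt attribute

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs; an image with no alt
        # attribute at all can be flagged mechanically
        if tag == "img" and "alt" not in dict(attrs):
            self.missing.append(self.getpos())

checker = MissingAltChecker()
checker.feed('<p><img src="logo.png"><img src="chart.png" alt="x"></p>')
print(checker.missing)  # [(1, 3)]: the first image has no alt attribute
# The second image passes this check, yet alt="x" is useless to a
# screen-reader user: exactly the judgement a machine cannot make.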

Put simply, if an automated tool correctly identifies problem areas on a site, then that site cannot claim to comply with the accessibility guidelines. Conversely, if an automated tool identifies no problems, that does not necessarily mean the site is fully compliant with all guidelines. Neither manual nor automated testing is sufficient by itself.

SiteMorse users are reminded that automated testing is not a panacea - it is an important, and in some respects vital, tool for ensuring a healthy website, but it must be backed up by good manual procedures and quality assurance to achieve an enjoyable and successful site experience for all users.

SiteMorse "Overall MorseMark" ratings are intended to indicate an overall usability score for a site. This usability test incorporates accessibility as an important part - but only a part. Other factors such as site functionality ("does the site work correctly?") and performance ("does the site work quickly?") also play major roles.

Following our recent report on the 'expert organisations', a statement was issued about the potential failings of automated testing, supported by the DRC and the RNIB. Its message was that the focus should be on achieving a usable site, not just one that passes automated testing. We fully agree: as we say in every report we produce, automated testing is not the whole solution, but it is an important element in ensuring compliance is achieved and maintained.

Following their remarks, we looked in a little more detail at the results of a couple of tests, focusing on overall usability (a usable site should have good accessibility and well-constructed code, and should function correctly with a good level of performance) rather than just automated accessibility checks; the results were staggering, to say the least.

The breakdown for one of the sites:
9,750 site errors affecting function (links, emails, etc.)
195,040 HTML code errors
546,163 Accessibility AA faults
146 Accessibility A 'missing alt attribute' faults
Failed all speed tests

Hardly what would be considered a usable site, as accessibility for all users is severely limited.

For further details or comment, please contact Nicholas Le Seelleur / 0870 759 3377 or email pr@sitemorse.com

Published on: 12:00AM on 11th August 2005