This has been stewing for a while, just simmering under the surface: a brewing anger toward companies that do not understand accessibility, or the commitment required to be accessible, but will give it a light treatment simply as a sales tactic.
Not testing your website with actual users?
Basically, I’ve had it. I’m mad as hell and I am not going to take it anymore.

I’ve worked on too many projects where a vendor has sold a program, content management system, or software application as part of the overall project and claimed that it was “technically compliant.” “Sure, it is 508 compliant,” they say, not understanding the implications of such a statement.

Invariably, the application is exposed for what it really is: a basic treatment of accessibility veiled in sales gibberish. The charade lasts until it is actually placed under the scrutiny of those who need accessibility features. When asked to produce evidence of 508 compliance or some sort of accessibility certification, there is rarely any documentation other than a simple automated test.

So, what is “technically compliant”?
I would describe “technical compliance” as a label companies use when they go through the motions of compliance without truly understanding the reasoning and methods of accessibility. Simply running a page or an application through an automated accessibility checker is NOT an approval for the “accessible” label.
Accessibility is much more than the “strict” side of the technical checklist, and it is about much more than screen readers. Accessibility is about understanding the people who use a website, and recognizing that making a website accessible actually makes it easier to use for everyone, not just the single, small group of people who need these features.

To better understand the “technically accessible” label that people like to use, I like to examine some of the checkpoints from the checklist for the Web Content Accessibility Guidelines 1.0, developed by the W3C. These are checkpoints that cannot be tested by automated software, only by actual human testing.

Where Automation Fails
Multimedia
Checkpoint 1.3 Until user agents can automatically read aloud the text equivalent of a visual track, provide an auditory description of the important information of the visual track of a multimedia presentation.

This just makes sense. Search engines can’t read images, video, podcasts, or other multimedia. Instead, they rely on tags, descriptions, and transcripts. For accessibility, these text equivalents also provide information to anybody, regardless of access device, technology, browser, or assistive technology.
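Even the simplest automated check in this area only confirms that a text equivalent exists, not that it says anything useful. As a minimal sketch of that gap, here is a standard-library Python snippet (the sample markup is hypothetical) that flags images with no alt attribute at all; it can never judge whether the alt text that is present actually describes the image:

```python
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Collects <img> tags that are missing an alt attribute entirely."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "alt" not in attrs:
            self.missing.append(attrs.get("src", "(no src)"))

# Hypothetical page fragment: one image lacks alt, one has it.
page = '<p><img src="chart.png"><img src="logo.png" alt="Company logo"></p>'
checker = AltChecker()
checker.feed(page)
print(checker.missing)  # ['chart.png']
```

A tool like this passes `alt="image123.jpg"` just as happily as a genuine description, which is exactly the “technically compliant” trap.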

Color Contrast
Checkpoint 2.1 Ensure that all information conveyed by color is also available without color.

Checkpoint 2.2 Ensure that foreground and background color combinations provide sufficient contrast when viewed by someone having color deficits or when viewed on a black and white screen.

There are numerous contrast tools available online, some much better than others at identifying contrast issues on a web page. I prefer aDesigner from IBM, as it highlights specific areas on the page and identifies them as problem areas. The main issue is that running a website’s code through a validation test does NOT identify problem contrast areas. This is a visual test, and it must be performed by humans in order to find the problems.
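The numeric side of contrast can be automated. The later WCAG 2.0 guidelines define a contrast ratio based on relative luminance, and a sketch of that calculation in Python is below. But the number alone cannot tell you which areas of a rendered page are actually hard to read, which is why the visual, human test still matters:

```python
def srgb_to_linear(channel):
    """Convert one 0-255 sRGB channel to linear light (WCAG 2.0 formula)."""
    c = channel / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    """Relative luminance of an (r, g, b) tuple, 0.0 (black) to 1.0 (white)."""
    r, g, b = (srgb_to_linear(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio between two colors, from 1:1 up to 21:1."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))      # 21.0, the maximum
print(round(contrast_ratio((119, 119, 119), (255, 255, 255)), 1))  # mid gray on white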

Contrast is also a key element in design. By using contrast, designers can influence the path of the eye as it follows the information and is drawn to specific calls to action. High-contrast areas on a page get much more attention and are easier to see. Misusing or misunderstanding contrast results in a very poor user experience.

Markup
Checkpoint 3.1 When an appropriate markup language exists, use markup rather than images to convey information.

Checkpoint 4.2 Specify the expansion of each abbreviation or acronym in a document where it first occurs.

Another thing an automated procedure will miss is spotting when an image is not clear and markup would do a better job of conveying the information. In addition, there are many times when a different graphic might be even more explicit. Again, only human testing will surface these issues, and no amount of automated testing will provide a correction.

Abbreviations and acronyms are to be defined in the markup, which enables users to simply see the purpose of the letters. It provides clear context for abbreviations and for the confusing world of acronyms. This is especially helpful when the acronym is also a word, which can be confusing.
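A rough automated pass can at least list acronyms that never receive an expansion anywhere on a page. The Python sketch below does this with crude regular expressions (a real tool would use a proper HTML parser, and the sample markup is hypothetical). Deciding whether an expansion is correct, or whether a capitalized word is an acronym at all, remains a human judgment:

```python
import re

def undefined_acronyms(html):
    """Flag acronym-like tokens (3+ capitals/digits) that never appear
    wrapped in an <abbr> or <acronym> element with a title attribute."""
    defined = set(re.findall(
        r'<(?:abbr|acronym)[^>]*title=[^>]*>([A-Z][A-Z0-9]{2,})</(?:abbr|acronym)>',
        html))
    text = re.sub(r'<[^>]+>', ' ', html)  # crude tag strip
    used = set(re.findall(r'\b[A-Z][A-Z0-9]{2,}\b', text))
    return sorted(used - defined)

# Hypothetical fragment: W3C is expanded, WCAG and ARIA are not.
page = ('<p><abbr title="World Wide Web Consortium">W3C</abbr> '
        'publishes WCAG and ARIA guidance.</p>')
print(undefined_acronyms(page))  # ['ARIA', 'WCAG']
```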

A related issue is the surging popularity of tag clouds, where large numbers of words are rendered on the page and their popularity is shown by text size. For users accessing the site through a screen reader, there is no way to perceive the difference in size of these tags. The tags are simply read aloud with no context, order, or indication of their purpose.

Programming
Checkpoint 7.3 Until user agents allow users to freeze moving content, avoid movement in pages.

Checkpoint 6.1 Organize documents so they may be read without style sheets.

Checkpoint 9.4 Create a logical tab order through links, form controls, and objects.

Movement on pages is just like the aggravation of watching cable news networks. The news ticker competes for our attention while we watch the talking head, listen to the news, and look at the news video: too many things compete visually for your attention. When the same principle is applied to a web page, the same result applies. When too many moving elements compete for the viewer’s attention, there is no clear place for the user to focus, and the page’s ability to communicate a specific idea or purpose is diminished.

Tab order is critical, especially in administrative screens, eCommerce sites, interactive technologies, and other form-intensive applications. It comes into play in everything from a content management system to setting up a YouTube account. Tab order allows keyboard-only users to tab through forms and options. If the order is not logical, the cursor focus can easily be lost.

Tab order is not something that can be tested with automated software or web validation. It requires rigorous human testing, especially across different operating systems, browsers, computers, and assistive technologies. Different combinations of these technologies can produce very different results, and human testing is the only way to find these issues.
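The purely mechanical part of this check can be sketched: collect the explicit tabindex values in source order and see whether they ever jump backwards. The standard-library Python snippet below (with hypothetical form markup) catches the numeric symptom, but whether the resulting focus order matches the visual layout still requires a person with a keyboard:

```python
from html.parser import HTMLParser

class TabOrder(HTMLParser):
    """Records explicit tabindex values in document (source) order."""
    def __init__(self):
        super().__init__()
        self.indexes = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if "tabindex" in attrs:
            self.indexes.append(int(attrs["tabindex"]))

# Hypothetical form: the tab order jumps backwards at the third field.
form = ('<input name="city" tabindex="1">'
        '<input name="zip" tabindex="3">'
        '<input name="state" tabindex="2">')
t = TabOrder()
t.feed(form)
print(t.indexes == sorted(t.indexes))  # False: keyboard focus will skip around
```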

Readability
Checkpoint 12.3 Divide large blocks of information into more manageable groups where natural and appropriate.

Checkpoint 12.4 Associate labels explicitly with their controls.

Checkpoint 13.8 Place distinguishing information at the beginning of headings, paragraphs, lists, etc.

Checkpoint 14.1 Use the clearest and simplest language appropriate for the site’s content.

Checkpoint 14.3 Create a style of presentation that is consistent across pages.

None of these points can be tested in any type of automated environment. One of the most difficult problems for websites is readability. It is estimated that 40% of the population has lower literacy skills. Add to that low-vision users, senior citizens, and new adopters of the internet, and there is a significant learning curve that keeps people from easily accessing the information they need.

Clear and simple language, consistent presentation, making text readable by arrangement, mark-up and headings – all of these are techniques that make content more accessible. They also rely on testing with target audiences rather than an automated button-push.
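Readability formulas do exist, and a score such as Flesch Reading Ease can be computed mechanically, as in the rough Python sketch below (the syllable count is only a vowel-group heuristic, and the two sample sentences are invented). But a score cannot tell you whether the language is appropriate for your audience; only testing with that audience can:

```python
import re

def flesch_reading_ease(text):
    """Flesch Reading Ease: 206.835 - 1.015*(words/sentences)
    - 84.6*(syllables/words). Higher scores mean easier reading.
    Syllables are estimated by counting vowel groups, a crude heuristic."""
    sentences = max(1, len(re.findall(r'[.!?]+', text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(max(1, len(re.findall(r'[aeiouy]+', w.lower())))
                    for w in words)
    return (206.835 - 1.015 * (len(words) / sentences)
            - 84.6 * (syllables / len(words)))

# Invented examples: plain instructions vs. bureaucratic prose.
plain = "We ship your order today. It arrives in two days."
dense = ("Fulfillment of the aforementioned requisition will be "
         "expedited contingent upon verification of eligibility.")
print(flesch_reading_ease(plain) > flesch_reading_ease(dense))  # True
```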

Navigation
Checkpoint 13.1 Clearly identify the target of each link.

Checkpoint 13.4 Use navigation mechanisms in a consistent manner.

Checkpoint 13.5 Provide navigation bars to highlight and give access to the navigation mechanism.

Users need to have a sense of location when they are on a page – Where do I go if this is not the right page? Where is there related information? How do I get there? These are all questions that can be answered quickly and easily by a good navigational structure with visual indicators, highlights and clear labels and targets.

Users need a sense of location, clear content relationships, common-sense navigation, and a call to a specific destination in order to reduce their frustration. Deny any of these mechanisms, and your site is a whole lot harder to use. Automated tests, again, fail to properly identify issues of clarity, purpose, consistency, readability, and understandability.
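One small part of link clarity can be screened automatically: anchor text such as “click here” identifies no target at all. The standard-library Python sketch below (hypothetical markup, and an assumed short list of vague labels) flags those, yet it cannot decide whether a more descriptive label actually matches its destination:

```python
from html.parser import HTMLParser

# Assumed list of generic labels that say nothing about the link target.
VAGUE = {"click here", "here", "more", "read more", "link"}

class LinkText(HTMLParser):
    """Collects the text of each anchor so vague labels can be flagged."""
    def __init__(self):
        super().__init__()
        self.in_a, self.links, self.buf = False, [], []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_a, self.buf = True, []

    def handle_data(self, data):
        if self.in_a:
            self.buf.append(data)

    def handle_endtag(self, tag):
        if tag == "a":
            self.in_a = False
            self.links.append("".join(self.buf).strip())

# Hypothetical fragment: one vague link, one descriptive link.
page = ('<a href="/report.pdf">click here</a> '
        '<a href="/pricing">2024 pricing guide</a>')
p = LinkText()
p.feed(page)
vague = [t for t in p.links if t.lower() in VAGUE]
print(vague)  # ['click here']
```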

Automated Testing Fails True Accessibility
These are issues that no amount of automated testing will grasp. When a company claims “technical accessibility,” it is because they have not actually tested their software or content management system with people, much less with those who rely on assistive technology. Choosing which checkpoints matter and adhering only to those sets a dangerous precedent. Satisfying the strictly technical issues while ignoring the grammatical, layout, contrast, navigation, and readability issues shuts out a significantly large portion of the population that could benefit from these improvements.

No amount of automated testing will explain to you that your instructions are unclear and visually hard to find. Only testing performed by people who are familiar with assistive technology, accessibility, and the multitude of technology combinations involved can ensure that a site is truly accessible.

Developing a new website?
If you are a project manager or web manager tasked with purchasing or writing a specification for a website, application, or content management system, I recommend that you demand third-party verification of accessibility. Relying on the manufacturer’s word and accepting the “technically accessible” line can come back to haunt you when a user discovers the truth.

As an example, a well-known software manufacturer claimed that their software was “technically accessible” and that it met Section 508 requirements. Interestingly, the reports the software generated were accessible, but the process required to generate those reports was not even close to being accessible.

Do the Work, Reap the Rewards
In their haste to be technically correct, programmers and development companies have forgotten (if they ever knew) that the readability of the content, the clarity of instructions, and the calls to action are just as critical to accessibility as alternate navigation, alt attributes, and graceful degradation.

Related Articles:
Observing Accessibility
The Importance of Context in Content
Content v Creative – Where Does the Customer Count?