In 1995, Dr. Cynthia Waddell published a web design accessibility standard for the City of San Jose’s Office of Equality Assurance. It included a comprehensive and concise list of specifications for designers of the city’s website to strictly adhere to. The list included, among many other things, requirements that all image tags be accompanied by alt text and that all video and audio elements be paired with text transcriptions, as well as an explicit cap of two columns per HTML table, to limit the damage that tables used for layout did in a poorly supported browser and screen reader landscape.
If some of these seem commonplace and obvious these days, it is only because of this groundbreaking work by Waddell in the earliest days of the web. Other rules on the list, however, may seem a bit more unfamiliar, abandoned as best practices somewhere along the line, occasionally at the expense of users. But each rule is essential for making the web open and accessible to all.
The standard was years ahead of its time, predating the official Web Content Accessibility Guidelines (WCAG) of the W3C by almost half a decade. Browsers were still in the days of early HTML with only the most basic of tags, Netscape and IE were locked in a browser war for market control, and styling the web in any way was still a few years off. It was the first of many efforts to formalize a standard for equal and open access on the web, a principle at the core of the World Wide Web’s ideals, if not always perfectly executed in practice. Waddell’s work soon influenced that of accessibility experts everywhere in the industry.
Several years later, Waddell was serving as an Executive Director at the International Center for Disability Resources on the Internet (ICDRI), a non-profit imbued with the dual purpose of advocating policy changes at all levels of government and promoting accessible design best practices to the larger web community. While there, her work led her to aid in the development of a new tool by the software company HiSoftware that enabled automated testing of websites against popular accessibility standards. When it came time to name the tool, there really was no question. It was called Cynthia Says.
Cynthia Says started with a simple webform. Developers could visit the Cynthia Says site, input the URL of their own website, choose from a number of accessibility standards (Section 508, WCAG AAA, etc.) to compare against, and click “Submit”. After crawling the HTML of the URL, the site would give its users an accessibility report, a baseline overview of accessibility performance mixed with comprehensive details about where they went wrong (and right). One by one, site developers could verify their own design against the criteria of their chosen standard. Each requirement was flagged as either a “pass” or a “fail”. If, for any reason, a requirement failed, users were given tangible next steps for getting the issue resolved.
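The pass/fail model described above can be sketched in miniature. The snippet below is a hypothetical illustration of one such automated check, not anything from Cynthia Says itself: it uses Python’s standard-library `html.parser` to flag `<img>` tags that lack an `alt` attribute, one of the oldest and most common accessibility requirements.

```python
from html.parser import HTMLParser


class AltTextChecker(HTMLParser):
    """Sketch of a single automated accessibility check:
    every <img> must carry an alt attribute (an empty alt=""
    is allowed, as it marks a purely decorative image)."""

    def __init__(self):
        super().__init__()
        self.failures = []  # src values of images missing alt

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            if "alt" not in attr_map:
                self.failures.append(attr_map.get("src", "(unknown source)"))


def check_page(html: str):
    """Return ("pass", []) or ("fail", [offending srcs])."""
    checker = AltTextChecker()
    checker.feed(html)
    return ("fail", checker.failures) if checker.failures else ("pass", [])


page = '<p><img src="logo.png" alt="Company logo"><img src="chart.png"></p>'
status, missing = check_page(page)
print(status, missing)  # -> fail ['chart.png']
```

A real validator would run dozens of checks like this, one per criterion in the chosen standard, and attach remediation advice to each failure.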
Like the work of Waddell herself, the goal of Cynthia Says was both to provide developers with a list of best practices and to educate them on the nature of the issues. Without understanding and context, it was all too likely that site designers would fall into the same bad habits again and again. The information provided in the reports was far from trivial. If you were, for instance, checking your site against Section 508 compliance, language would be pulled directly from the legalese of the legislation, with a fleshed-out description under each requirement. In most cases, multiple solutions would be provided.
At a time when the web was struggling to understand accessibility, Cynthia Says offered salvation, a concrete path to success. And yet, it wasn’t the only tool out there doing that.
Bobby was released as early as 1997, in the wake of legislation that updated Section 508 to bring stricter rules to the practices of public web design projects. And though it predated Cynthia Says by a few years, Bobby similarly offered users a way to crawl their sites for accessibility issues. And like Cynthia Says, users were then presented a report about errors or warnings related to accessibility across that site.
Upon release, the tool’s list of potential errors and warnings was little more than a handful of rules cobbled together from best practices promoted by experts in the industry. As legislation and standards evolved, Bobby grew with them, eventually letting users test their sites against a number of different rulesets.
These reports were centered almost entirely on the fixes required, diverging slightly from the more educational route of Cynthia Says. Still, you had to hand two things to Bobby. The first was that users could upload a custom HTML file, and starting in 2001 even purchase a version of the software that could be run locally against sites in development.
The second was the Bobby badge.
In general, the rules Bobby checked against were labeled as Priority One, Priority Two, and Priority Three, with the first priority marking fixes that were essential and required, and the third marking issues that needed a manual fix. If your site had any Priority One issues, it would be marked on the report with this icon (British police officers are occasionally referred to as “bobbies”, hence the pun):
If your site passed without any Priority One issues, you’d be presented with the Bobby badge. The badge looked like this:
If your site was “Bobby Approved,” it meant that its content was well-structured and well-intentioned. Sticking the badge on your site was meant both as a personal boast and as a slight nudge to peer-pressure other web developers into doing the same. And to some extent, that strategy worked. There was a time on the web when every respectable developer made sure to include the Bobby badge in their footer’s design.
Over the years, Bobby continued to evolve into a dynamic tool that could be used in and out of the browser, and could test across different runtimes and environments. It remained incredibly popular until the software was officially sunset in 2005.
Then there’s WAVE, or, as it was originally called, the WAVE. Among these other tools, WAVE gets a mention for its sheer longevity. Never as popular as some of the alternatives in the earliest days, WAVE has, over time, managed to outlast them all.
The WAVE began as a research project by Dr. Len Kasday at Temple University in early 2000. After a couple of years of development, WebAIM took over the project and continued to make improvements.
WAVE offered a similar experience to other tools on the market. It validated sites against popular specifications and listed out known issues for developers to assess. WAVE set itself apart by getting into the browser extensions game just about as early as such a thing was possible.
Even the first versions of WAVE came bundled as extensions that could be installed on browsers and run from inside of any page with a click of a button. Once clicked, a report would fly out from the left side with plenty of details. Rather than making you come to it, WAVE would follow you all around the web. It was a simple idea, but opened up whole new possibilities.
That portability remained central to the development of WAVE. It has continued to evolve into more than just a browser extension, with extensibility always core to its mission: a programmatic API that can be accessed from anywhere, and more advanced in-browser tools. That portability seems to have given WAVE the edge; it is still in active development today.
We don’t often give much thought to our tools (that is, unless they’re broken). That makes it easy to forget that they are forged through experience, and shaped by community. Accessibility is hard. We have our tools, and the expertise of the people behind them, to thank for making it a little bit easier.