Article: A Practical Starter Guide on Developing Accessible Websites

Citation: Ng, C., & Schofield, M. (2017). A Practical Starter Guide on Developing Accessible Websites. Code4Lib Journal, (37). (link)

This article from Code4Lib’s July 2017 issue provides readers with just what its title suggests: practical advice for building accessibility into website development.

The authors focus on ARIA (Accessible Rich Internet Applications), scripting, and semantic markup rather than typical accessibility how-to standbys like alt text, and provide not only a basic overview of each of their chosen topics but also a brief discussion of why each matters in accessible design.

Another thing that makes this article stand out is that it looks at the actual processes web browsers use to render websites. As the authors point out, understanding when and how browsers pass information to assistive technologies is important for anyone in the process of developing a website.

One example of this is the authors’ discussion of the accessibility tree, the structure the browser builds from the DOM to determine which elements are exposed to assistive technologies.
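
To make the accessibility tree more concrete, here is a minimal sketch (my own, not code from the article) showing how markup and ARIA choices determine what the browser exposes to assistive technologies; the element names and labels are invented for illustration.

```typescript
// A minimal sketch (mine, not the article's) of how markup and ARIA choices
// shape what the browser exposes in the accessibility tree.

// A decorative icon: aria-hidden="true" removes it (and its subtree)
// from the accessibility tree entirely, so screen readers skip it.
const icon = document.createElement("span");
icon.className = "icon-search";
icon.setAttribute("aria-hidden", "true");

// A native <button> is exposed with the role "button" automatically;
// its accessible name comes from its text content or an aria-label.
const searchButton = document.createElement("button");
searchButton.setAttribute("aria-label", "Search the catalogue");
searchButton.append(icon);
document.body.append(searchButton);

// A <div> styled to look like a button is not exposed as one unless the
// role, accessible name, focusability, and keyboard handling are added by hand.
const fakeButton = document.createElement("div");
fakeButton.textContent = "Search";
fakeButton.setAttribute("role", "button");
fakeButton.tabIndex = 0; // make it reachable with the keyboard
fakeButton.addEventListener("keydown", (event) => {
  if (event.key === "Enter" || event.key === " ") {
    fakeButton.click(); // mirror native button activation
  }
});
document.body.append(fakeButton);
```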

The authors also discuss the problems of cross-browser compatibility, the growing omnipresence of JavaScript, and other issues, and provide a series of best practices with code and other details on how to carry them out.

With its focus on the specifics of how browsers make websites accessible and its easily applied best practices, this article is a welcome change from the typical rehash of accessibility how-to guides.

Article: Evaluating the accessibility of online university education

Citation: Pendergast, M.O. (2017). Evaluating the accessibility of online university education. International Journal of Online Pedagogy and Course Design, 7(1). (link)

This article presents a relatively recent study of university websites, evaluating them against the WAI’s WCAG 2.0 guidelines at the AA level.

The first six pages of the article are taken up with an overview of web accessibility, including its history; relevant laws and guidelines in Australia, Canada, the United Kingdom, and the United States; a quick summary of the WCAG 2.0 guidelines and tools for checking accessibility; and brief commentary on implementation in universities.

The author pulled home pages from 24 accredited universities, a mix of public, private, large, and small, and tested them with AChecker to determine how many problems each was likely to have passing WCAG 2.0 at the AA level. Although the tool detected no known or likely problems for several universities, most had between five and thirty known problems, and a few had significantly more.
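
For readers unfamiliar with what an automated checker actually looks for, here is a small, illustrative sketch of the kind of machine-checkable WCAG 2.0 failures such a tool flags. It is not AChecker itself and covers only a few success criteria; it can be pasted into a browser console on the page under test.

```typescript
// A few hand-rolled checks illustrating the sort of failures an automated
// accessibility checker reports. Not AChecker, and far from exhaustive.

interface Finding {
  element: Element;
  problem: string;
}

function basicWcagChecks(doc: Document): Finding[] {
  const findings: Finding[] = [];

  // WCAG 1.1.1: images need a text alternative.
  doc.querySelectorAll("img:not([alt])").forEach((img) => {
    findings.push({ element: img, problem: "image is missing an alt attribute" });
  });

  // WCAG 1.3.1 / 4.1.2: form controls need an accessible name.
  doc.querySelectorAll("input:not([type=hidden]), select, textarea").forEach((field) => {
    const id = field.getAttribute("id");
    const labelled =
      (id !== null && doc.querySelector(`label[for="${id}"]`) !== null) ||
      field.closest("label") !== null ||
      field.hasAttribute("aria-label") ||
      field.hasAttribute("aria-labelledby");
    if (!labelled) {
      findings.push({ element: field, problem: "form control has no label" });
    }
  });

  // WCAG 3.1.1: the page needs a declared language.
  if (!doc.documentElement.hasAttribute("lang")) {
    findings.push({ element: doc.documentElement, problem: "<html> is missing a lang attribute" });
  }

  return findings;
}

console.table(basicWcagChecks(document).map((finding) => finding.problem));
```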

The author then tested a demo course set up at Florida Gulf Coast University (his home institution), starting at the university’s home page and going through to the school’s learning management system’s login page. Here, too, there were known errors on most of the pages, and the author notes finding it “particularly disconcerting” that the login page would have been totally inaccessible to anyone with a visual impairment.

The author’s conclusion is that webmasters, administrators, faculty, and staff all need to work harder to make sure course content is accessible. Recommendations include checking each HTML page for compliance before it is uploaded and rechecking it routinely, training faculty to use web authoring tools with built-in compliance checkers, making sure that all off-site content from textbook publishers or other vendors is accessible, and being wary of devices and apps (p. 11).

While this conclusion has its heart in the right place, it is a little simplistic, especially given the lack of specific, detailed advice on solving the complex problems that are likely to come up in planning for, implementing, and maintaining accessible web pages.

Overall, the study is too basic to be of broad, practical use. The information it describes may be useful to newcomers, but anyone familiar with web accessibility likely knows all this already. The author also seems to conflate disability with visual disability, although that may just be an artefact of the specific problems found by AChecker, which deal with lack of contrast and other vision-related errors.

Regardless, given these problems and the simplicity of the study, the article hardly presents the evaluation of “online university education” its title proclaims.

Article: Social4all: Definition of specific adaptations in web applications to improve accessibility.

Citation: Crespo, R.G., Espada, J.P., & Burgos, D. (2016). Social4all: Definition of specific adaptations in web applications to improve accessibility. Computer Standards & Interfaces 48: 1-9. (link)

Most attempts to make the web accessible to a wider range of users rely on designing accessible websites. This paper describes an approach that works more like screen-reading software: it adds a layer of accessibility on top of existing websites using a tool created by the authors called Social4all.

The tool works by allowing anyone to input the URL of a website and create an “adaptation profile,” which is then stored in a central repository and can be accessed by other users who need to visit that website (p. 3).

To create a profile, a user analyzes the website’s HTML, CSS, and JavaScript with jQuery scripts that test the site’s code against algorithms based on the WCAG guidelines (p. 4). The big benefit of this (apart from what is essentially crowdsourcing accessibility issues to interested users) is that each profile may approach accessibility problems slightly differently depending on the individual’s preferences (p. 4).
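
The article describes the adaptations at a conceptual level; the sketch below is my own guess at what a profile-plus-overlay might look like in code, with invented names and a simplified profile format (the authors’ actual implementation is built on jQuery scripts).

```typescript
// A hypothetical Social4all-style "adaptation profile" applied as an overlay
// on an existing page. Profile format, selectors, and names are invented.

interface Adaptation {
  selector: string;                        // elements the adaptation targets
  setAttributes?: Record<string, string>;  // e.g. add missing alt text or ARIA labels
  styles?: Partial<CSSStyleDeclaration>;   // e.g. raise contrast or enlarge text
}

interface AdaptationProfile {
  siteUrl: string;
  adaptations: Adaptation[];
}

// Example profile: label an unlabelled search button and boost body contrast.
const exampleProfile: AdaptationProfile = {
  siteUrl: "https://example.org/",
  adaptations: [
    { selector: "#search-btn", setAttributes: { "aria-label": "Search" } },
    { selector: "body", styles: { color: "#000000", backgroundColor: "#ffffff" } },
  ],
};

function applyProfile(profile: AdaptationProfile): void {
  for (const adaptation of profile.adaptations) {
    document.querySelectorAll<HTMLElement>(adaptation.selector).forEach((element) => {
      for (const [name, value] of Object.entries(adaptation.setAttributes ?? {})) {
        element.setAttribute(name, value);
      }
      Object.assign(element.style, adaptation.styles ?? {});
    });
  }
}

applyProfile(exampleProfile);
```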

The authors tested their system with occupational therapy students, asking them to assess its ease of use in creating new adaptation profiles. The test users rated the system as easy to understand (p. 7), although it seems the system would have to be tested with users with disabilities before it could be fully assessed as effective.

Overall, the concept is an interesting one to consider, although the language of the article is unclear in places, which makes for difficult reading at times.

Generally speaking, though, a tool like Social4All should not be seen as a replacement for actual accessible web design. Rather, it might serve as a useful tool for users stuck dealing with otherwise inaccessible websites. Since this is only a preliminary study into the concept, it will be interesting to see what further developments are made.

Spanish speakers can find a lengthier, more detailed report on the methodology at the research blog of Universidad Internacional de La Rioja, the primary author’s institution.

Article: Implementing Recommendations From Web Accessibility Guidelines.

Citation: Schmutz, S., Sonderegger, A., & Sauer, J. (2016). Implementing Recommendations From Web Accessibility Guidelines. Human Factors: The Journal of Human Factors and Ergonomics Society, 58(4), 611-629. DOI 10.1177/0018720816640962

This article explores whether implementing web accessibility guideline recommendations can make a website less usable or less pleasant for non-disabled users (spoiler: it doesn’t).

Noting a lack of research into the effects of accessibility guidelines on non-disabled users (no longer true, incidentally, since I’ve reviewed at least one other recent article on the topic and have read several others), the authors carried out a study with sixty-one “nondisabled” university students. Participants were asked to use three websites: one conforming to WCAG 2.0 at the AA level, one at the A level, and one that did not conform at all (pp. 614-615).

The study measured the amount of time it took the students to complete a number of tasks, and additionally asked them to rate each of the three web sites in the areas of usability, aesthetics, trustworthiness, and perceived workload. Findings were, perhaps unsurprisingly, that “the AA Web site showed advantages over the two other Web sites with regard to performance and subjective evaluations” (p. 620). Sites that just passed the A level of WCAG seemed to confer no benefit, however.

The authors conclude that what makes the AA-level website more usable and effective for users with no obvious disability is “a combined effect” of the various criteria at that level, such as structure, text alignment, clear labeling of forms, and so on.

My only complaint is that, presumably because the article focuses on the experiences of “nondisabled” users, the authors sometimes appear to reduce the purpose of WCAG and other accessibility guidelines to making sites accessible to those with vision problems and nothing else. There are also a few places where it sounds as though the authors are saying that WCAG’s recommendations on tabbing order are irrelevant to all users, when they are really only talking about users with no disabilities.

Despite this minor flaw, this article is yet another nail in the coffin of the old “accessible websites are ugly and unusable for the rest of us” myth. As the authors put it, their study helps show that WCAG can be “a helpful tool for designing more usable web sites” regardless of the ability level of the end user (p. 623).

Article: Exploring the relationship between web accessibility and user experience.

Citation: Aizpurua, A., Harper, S., & Vigo, M. (2016). Exploring the relationship between web accessibility and user experience. International Journal of Human-Computer Studies, 19: 13-23 (Note: page numbers below are from the preprint version of the article)

This article argues that web accessibility and user experience are closely related, although it does not find a significant correlation between adherence to WCAG 2.0 and many desirable elements of a website’s user experience.

The bulk of the article describes the results of a study of eleven blind web users’ experiences with local restaurant websites. Pages from these sites were grouped as highly accessible or poorly accessible, based on the Barrier Walkthrough method and the AA conformance level of WCAG 2.0 (p. 8). The participants were then given three tasks to complete on each website and rated those tasks with words from the “emotion word prompt list”: annoyed, bored, confident, confused, disappointed, frustrated, happy, interested, hopeful, pleased, and unsure (p. 10). Participants also rated the websites themselves with the AttrakDiff tool, which “consists of a set of 23 word pairs reflecting opposite adjectives that can be rated on a 7-point scale,” e.g. “complicated/simple” (p. 11).

Findings of the study included that accessible sites were more likely to be rated with positive emotions and positive adjectives by the participants, and that inaccessible sites were more likely to be rated with negative ones. The authors further suggest that it may be possible to reverse-engineer the process and gauge something of a site’s accessibility from its perceived usability, although they do not go into detail about this, and it certainly shouldn’t be considered a viable alternative to accessibility testing.

Article: Exploring Usefulness and Usability in the Evaluation of Open Access Digital Libraries

Citation: Tsakonas, G., & Papatheodorou, C. (2008). Exploring usefulness and usability in the evaluation of open access digital libraries. Information Processing and Management, 44: 1234-1250. [url]

This article explores the usability of OA digital libraries (DLs). Unlike many articles evaluating scholarly websites, which tend to rely on accessibility-related tools, the focus here is on usability, specifically via the Interaction Triptych Framework (ITF). The ITF is an evaluation model that treats a system as a set of interactions, along the axes of usability, usefulness, and performance, between the system’s elements, which in this case are the DL, its content, and the user (p. 1237).

Figure: the Interaction Triptych Framework for digital libraries, showing interactions along the axes of usability, usefulness, and performance.

Like most articles that examine accessibility and usability, this one focuses on a single digital library: E-LIS, a library science repository running on the EPrints system. Unlike accessibility-focused studies, however, the site was evaluated not with automated tools or testing but by means of a questionnaire filled out by the DL’s users. A regression analysis was then carried out on each category to gauge the overall success of each axis.

The authors give some general conclusions about users’ expectations with regard to the usability and usefulness of E-LIS content, but offer no specific insights into how the results might be applied more broadly to designing usable, useful DLs that perform well.

Given that the article is focused on usability, not accessibility as such, its conclusions may be of limited use to those interested in web accessibility. However, the difference of approach and the exploration of the ITF may be of use to researchers looking to apply different paradigms to accessibility evaluation.

Article: Measuring Altruistic Impact – A Model for Understanding the Social Justice of Open Access

Citation: Heller, M., & Gaede, F. (2016). Measuring Altruistic Impact: A Model for Understanding the Social Justice of Open Access. Journal of Librarianship and Scholarly Communication, 4, eP2132. DOI: http://doi.org/10.7710/2162-3309.2132

In this paper from August 2016, the authors argue for assessing the impact of repositories on two levels: in pragmatic terms and in terms of social justice (p. 2). To this end, they have created a “social justice impact metric” based on the number of social-justice-related items accessed and the total amount of international usage from “less-resourced” countries (p. 3).

After establishing an overview of social justice as it pertains to Open Access (OA), the authors argue that because OA is “a social and public good” (p. 3), traditional measures of impact such as citation or download counts are insufficient. Instead, they suggest measuring “social justice impact,” which shows how an OA repository or publication is likely to affect those otherwise without access to information that has become “vital to success in our information economy” (p. 5).

To create their Social Justice Impact metric, the authors measured how often content is accessed via search engines and how often it is accessed from “lower-resourced countries” (p. 8). Data were gathered with Google Analytics, looking at both search engine keywords related to social justice and geographic usage (p. 9). Keywords were pulled from a corpus created by the authors (p. 10), included as an appendix to the report.
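
As a rough illustration of the kind of tallying involved (not the authors’ actual formula, keyword corpus, or country classification), a sketch like the following could compute the two components from analytics-style access records:

```typescript
// Illustrative only: tally repository accesses that involve social-justice-related
// search keywords or originate in lower-resourced countries. The record shape,
// keyword list, and country list are placeholders, not the authors' methodology.

interface AccessRecord {
  itemTitle: string;
  searchKeywords: string[]; // e.g. from a Google Analytics keyword report
  countryCode: string;      // ISO 3166-1 alpha-2 country code
}

const socialJusticeKeywords = ["poverty", "human rights", "equity"]; // placeholder corpus
const lowerResourcedCountries = new Set(["BD", "KE", "HT"]);         // placeholder list

function socialJusticeSummary(records: AccessRecord[]) {
  const socialJusticeHits = records.filter((record) =>
    record.searchKeywords.some((keyword) =>
      socialJusticeKeywords.some((term) => keyword.toLowerCase().includes(term))
    )
  ).length;

  const lowerResourcedHits = records.filter((record) =>
    lowerResourcedCountries.has(record.countryCode)
  ).length;

  return {
    totalAccesses: records.length,
    socialJusticeShare: socialJusticeHits / records.length,
    lowerResourcedShare: lowerResourcedHits / records.length,
  };
}

// Example: one access matches a social justice keyword, one comes from a listed country.
console.log(
  socialJusticeSummary([
    { itemTitle: "OA and equity", searchKeywords: ["open access equity"], countryCode: "US" },
    { itemTitle: "Metadata basics", searchKeywords: ["dublin core"], countryCode: "KE" },
    { itemTitle: "Cataloguing", searchKeywords: ["marc records"], countryCode: "CA" },
  ])
);
```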

Anyone looking for a single number like those given by altmetrics or the journal impact factor will be disappointed by the results of the authors’ analysis, which amounts more to a method for measuring how often international users access repository content related to social justice, along with suggestions for how readers might most successfully increase access to social-justice-related content at their own institution.

All the same, the argument that providing access to information to those who would not otherwise have it should be a core part of measuring the success of OA repositories is a compelling one. As the authors note, we all too often focus solely on academic impact, and should not forget that broader social good comes out of OA work as well.

Article: Balancing pedagogy, student readiness and accessibility: A case study in collaborative online course development

Citation: van Rooij, S.W., & Zirkle, K. (2016). Balancing pedagogy, student readiness and accessibility: A case study in collaborative online course development. Internet and Higher Education 28: 1-7. DOI 10.1016/j.iheduc.2015.08.001

This article presents a case study of an online course developed at George Mason University in Virginia to teach students how to learn online. The authors (who were the leads on the project) were tasked with making sure the course was accessible as well as pedagogically sound.

Much of the study describes the setting of the university and the course; the main finding as far as accessibility is concerned is that course creators are better off determining accessibility needs before creating content (p. 4). As the authors put it, content creators “need to integrate accessibility services into the process early on and to continue those services throughout the design, development and implementation of the online course” (p. 6).

While the study’s other findings are certainly useful to creators of online courses, its conclusions about accessible course creation are hardly new.

Article: Exploring perceptions of web accessibility: A survey approach

Citation: Yesilada, Y., Brajnik, G., Vigo, M., & Harper, S. (2015). Exploring perceptions of web accessibility: A survey approach. Behaviour & Information Technology, 34(2): 119-134. http://dx.doi.org/10.1080/0144929X.2013.848238

Yesilada et al. note that definitions of accessibility vary due to the “constantly evolving” nature of the field and its various sub-fields (p. 119). As the authors found in a previous study, “misunderstanding [of] accessibility definitions, language, and terms might cause tension between different groups,” leading to difficulties. This study, consisting of a survey of over 300 people “with an interest in accessibility” (p. 121), is the authors’ way of addressing the issue in the hope of enabling more useful communication within the field.

The study was carried out via a survey distributed to several accessibility-related mailing lists, which asked participants to provide demographic information about themselves; to rank five different definitions of accessibility and/or write their own definition; and to agree or disagree with statements regarding web accessibility (p. 121).

The definitions used in the study are excerpted here from the web survey used, which is still available online:

  1. Web accessibility means that people with disabilities can use the Web. More specifically, Web accessibility means that people with disabilities can perceive, understand, navigate, and interact with the Web, and that they can contribute to the Web.
  2. Technology is accessible if it can be used as effectively by people with disabilities as by those without.
  3. The extent to which a product/website can be used by specified users with specified disabilities to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use.
  4. A website is accessible if it is effective, efficient and satisfactory for more people in more situations.
  5. The removal of all technical barriers to effective interaction.

(source)

The bulk of the survey, however, was given over to the rating of statements about the purpose of accessibility, its drivers, and how to enact it. These statements can be categorised as follows:

  • Usability vs accessibility – Does accessibility relate to usability? (p. 122)
  • Audience of accessibility – Is accessibility concerned mostly with people who have disabilities, or a broader audience? (p. 122)
  • Legislature vs revenue – Is accessibility primarily driven by laws or effect on revenue? (p. 123)
  • Evaluation of accessibility – How can accessibility best be assessed? (pp. 123-124)
  • Dynamic and contextual – How is accessibility affected by “pages that change and the context in which a page is experienced”? (p. 124)
  • Standard definition – Is one important? (p. 124)
  • Accessibility and user experience – What is the relationship between accessibility and the user experience? (pp. 124-125)

The authors analyzed responses to these statements not only in aggregate but also by correlating responses with respondents’ self-reported demographics. Expertise (defined by the authors in terms of respondents’ time spent working on accessibility and their years in the field, p. 126), technical background, work sector, area of specialisation, and whether or not participants were “in the trenches” all played a role in how respondents rated statements in the various areas.

Given the sometimes extreme variation in responses across respondent demographics, as well as the overall pattern of responses to the statements, the authors argue that more needs to be done at the educational level to teach accessibility as “interrelated” with usability and user experience (p. 131).

One particularly interesting note is that those who work in the government sector or who work on accessibility issues regularly are more likely to support statements about accessibility benefiting a broader group of people. As the authors note, there is sufficient evidence “showing how accessibility is also about those living in the developing world” or who are otherwise socially disadvantaged (p. 132), suggesting that more studies are needed to make this clearer to those who are not practitioners.

Ultimately, the authors conclude that accessibility’s breadth and continuing evolution make communication challenging, and that more studies like their own—as well as an approach to education which takes the broader context of accessibility into account—are needed to fully address the problem.

Article: The challenges of Web accessibility: The technical and social aspects of a truly universal Web

Citation: Brown, J. & Hollier, S. (2015). The challenges of Web accessibility: The technical and social aspects of a truly universal Web. First Monday, 20(9).

Brown & Hollier provide a high-level overview of various accessibility challenges as of late 2015, and argue that although there are still technical difficulties with creating accessible web content, the larger challenge is building awareness of accessibility problems in the first place.

The basic discussion of accessibility at the beginning of the article will contain no surprises for most web designers and accessibility researchers. However, the authors’ review of accessibility policies like Section 508 and WCAG, alongside technological developments ranging from mobile devices to more traditional adaptive technologies like screen readers, serves to illustrate a point: technologies for end users and tools for designers have both grown more complex since the early days of the web.

The latter half of the article discusses assessment of accessibility and conformance testing. The authors summarize several points of consensus amongst researchers in this field:

  • Automated tools cannot substitute entirely for manual checking of accessibility issues
  • Assessment of accessibility can be as complicated as accessible design

The authors also outline the W3C’s suggested accessibility conformance evaluation methodology: “defin[e] the evaluation scope, explor[e] the target Web site, [select] a representative example of pages, [audit] those pages and [report] the findings.”

The authors also note that particular areas of concern for disabled web users are government websites—which can provide crucial services, and which tend to have some accessibility issues despite greater attention and assessment than corporate sites—and social media—where the variety and type of content and communication can make accessibility challenges even more difficult for end-users to overcome.

One especially interesting technology the authors review is “cloud accessibility,” wherein user preferences can be stored in the cloud and then accessed by individual machines so that the “working environment would adapt to the context of the user and their specialised requirements.” The authors do note, however, that as more and more services move to app-based environments, cloud-based services may be “superseded before they even move beyond the concept stage.”
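
To give a sense of how that might work in practice, here is a purely speculative sketch; the article describes the idea only at a conceptual level, so the service endpoint, profile fields, and function names below are all invented.

```typescript
// A speculative sketch of "cloud accessibility": a user's preferences live in a
// central store, and each machine fetches and applies them so the working
// environment adapts to that user. Endpoint and profile shape are hypothetical.

interface AccessibilityPreferences {
  fontScale: number;     // e.g. 1.5 = 150% base text size
  highContrast: boolean;
  reduceMotion: boolean;
}

// Hypothetical preferences endpoint; no such service is named in the article.
async function loadPreferences(userId: string): Promise<AccessibilityPreferences> {
  const response = await fetch(`https://prefs.example.org/users/${encodeURIComponent(userId)}`);
  return response.json();
}

function applyPreferences(prefs: AccessibilityPreferences): void {
  const root = document.documentElement;
  root.style.fontSize = `${prefs.fontScale * 100}%`;
  root.classList.toggle("high-contrast", prefs.highContrast);
  root.classList.toggle("reduce-motion", prefs.reduceMotion);
}

loadPreferences("user-123").then(applyPreferences);
```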

Finally, the authors investigate awareness issues surrounding accessibility, suggesting that it—more than the development of specific technologies—”will have a greater impact on the uptake of accessible design.”

Despite its summary nature, this paper serves as a useful reference point for researchers in accessibility, since many papers that discuss the challenges of creating accessible content are older and the information they contain about technology is no longer as relevant. Additionally, the authors’ argument that building awareness, and with it the skill sets required to build a more broadly accessible web, is the more effective way forward is useful in guiding future research and accessibility initiatives.