Comparing accessibility evaluation plug-ins

Tânia Frazão, Carlos Duarte

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

19 Scopus citations

Abstract

This article reports the results of a study comparing accessibility evaluation plug-in extensions for the Chrome web browser. Eight of the tools best known among developers were chosen. All tools are free or available under an open-source license, and work with the Chrome browser. The tools were compared on their feature set, their usability, and their evaluation results for ten of the Alexa top websites. We found that individual tools still provide limited coverage of the success criteria, and that coverage varies considerably from one evaluation engine to another; we also identify the most and least covered success criteria in automated evaluations. After analysing the results, we strongly recommend using more than one tool (with different engines) and complementing automated evaluation with manual checking.

Original language: English (US)
Title of host publication: Proceedings of the 17th International Web for All Conference, W4A 2020
Publisher: Association for Computing Machinery, Inc
ISBN (Electronic): 9781450370561
DOIs
State: Published - Apr 20 2020
Event: 17th International Web for All Conference, W4A 2020 - Taipei, Taiwan, Province of China
Duration: Apr 20 2020 - Apr 21 2020

Publication series

Name: Proceedings of the 17th International Web for All Conference, W4A 2020

Conference

Conference: 17th International Web for All Conference, W4A 2020
Country/Territory: Taiwan, Province of China
City: Taipei
Period: 04/20/20 - 04/21/20

Bibliographical note

Publisher Copyright:
© 2020 Owner/Author.

Keywords

  • accessibility
  • automatic
  • evaluation
  • tools

ASJC Scopus subject areas

  • Computer Networks and Communications
