CSUN Presentation, On-Demand: Overcoming Obstacles to Manual Accessibility Testing
Presenter: Larry Lewis
The practice of functionally testing with screen readers has played a significant role in the manual testing processes of developers, QA testers, and accessibility specialists for the better part of two decades, providing feedback that cannot be gleaned from automated testing services alone. This session presents a solution to the challenges faced by sighted developers and quality assurance testers tasked with manually testing digital content with a screen reader in a desktop environment. When time is of the essence and deadlines must be met, this necessary step for ensuring that content follows accessibility best practices is often performed hastily, and sometimes overlooked altogether.
Sighted individuals presented with the option of using a screen reader may have had little to no training in how to use one effectively for the basic tasks essential to manual, functional testing. Even when such training has been provided, a steady stream of auditory information can be off-putting to the untrained ear of a sighted tester if the skill is not exercised consistently and routinely.
Testers may improperly test content by tracking programmatic focus visually with speech muted, or with the screen reader disabled altogether, simply because they find synthetic speech uncomfortable, disruptive, and at times overwhelming. While this presentation neither excuses nor eliminates the need to use a screen reader for specific manual tests in a desktop environment, it showcases an inclusive approach that couples a screen reader with complementary tools to perform accessibility testing, primarily for web and PDF content.
In this presentation, the presenter will begin by covering key strategies for using a screen reader to conduct a “usability” test that reveals how assistive technology behaves when accessing web content. Attention will be given to methodologies for determining:
- The logical layout of a webpage
- The process of navigating web content with a keyboard
- Listening to and understanding what the screen reader speaks
- Identifying and determining the operability of form controls
- Determining the efficiency and functionality with which key tasks can be performed
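One item in the methodology above, judging the logical layout of a webpage, has a partially automatable proxy: verifying that heading levels never skip (for example, an h2 followed directly by an h4, which disorients screen reader users navigating by heading). The sketch below is a hypothetical illustration using only the Python standard library; it is not a substitute for the screen reader walkthrough the session describes.

```python
# Hypothetical sketch: flag heading-level skips (e.g. h2 -> h4) as one
# automatable signal about a page's logical layout. Standard library only;
# a real audit still requires listening with the screen reader itself.
from html.parser import HTMLParser

class HeadingOrderChecker(HTMLParser):
    """Collects the numeric levels of every h1-h6 tag in document order."""
    def __init__(self):
        super().__init__()
        self.levels = []

    def handle_starttag(self, tag, attrs):
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            self.levels.append(int(tag[1]))

def heading_skips(html_text):
    """Return (previous, current) pairs wherever a heading level jumps by >1."""
    checker = HeadingOrderChecker()
    checker.feed(html_text)
    return [
        (prev, cur)
        for prev, cur in zip(checker.levels, checker.levels[1:])
        if cur > prev + 1
    ]

page = "<h1>Store</h1><h2>Shoes</h2><h4>Sale</h4>"
print(heading_skips(page))  # [(2, 4)]
```

A clean result here only means the heading outline is plausible; whether the spoken order actually makes sense is still a judgment call for the listening tester.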
The presenter will focus on the process of identifying a perceived defect and capturing it for tracking purposes. The presentation will then address the difficulties of sharing these defects across workstreams so that they can be remediated in a timely manner and the corrected content reintroduced for an improved user experience.
The presentation will illustrate the JAWS Inspect utility as a complementary testing tool that visualizes the auditory screen reader experience of a vision-impaired user, providing an inclusive bridge between sighted testers and vision-impaired users. The presenter will show how this tool can vastly simplify the collection of usability defects by generating a variety of visual, text-based reports that offer a seamless means of identifying, logging, and tracking remediation. These reports pair the speech output rendered by the screen reader with screenshots and the corresponding code that dictates what the screen reader speaks. The session will engage the audience in selecting websites for which the following types of reports will be generated in real time:
- “Full-page report”: strategically groups all of the elements of a webpage, tethered to the speech output reported by JAWS when each element is encountered by the user.
- “Element Under Mouse report”: provides a targeted alternative, letting testers generate a report for specific areas of the website.
- “Say All report”: offers a linear alternative designed to test the reading order of a website.
- “ARIA Live report”: verifies that real-time updates announced via ARIA Live are properly spoken by the screen reader.
The presenter will demonstrate effective strategies for quickly identifying defects and logging them within spreadsheets, either by exporting the entire contents of a report or by manually selecting individual usability defects to export to a spreadsheet and then transfer to a shared bug-tracking system. Lastly, JAWS Inspect is not designed solely for website testing: its “Speech Viewer” functionality serves as an accessibility catalyst for recording transcripts of JAWS speech output for PDF documents.
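The spreadsheet hand-off described above can be pictured as rows pairing each element's speech output with the offending code, so a developer can locate and fix it. The sketch below is a hypothetical illustration of that shape; the field names and defect entries are invented for the example and are not the actual JAWS Inspect export schema.

```python
# Hypothetical sketch of exporting logged defects to CSV for a shared
# bug-tracking system. Field names and sample rows are illustrative only,
# not the real JAWS Inspect report format.
import csv
import io

defects = [
    {"element": "button#submit", "speech": "button",
     "issue": "No accessible name",
     "code": '<button id="submit"><img src="go.png"></button>'},
    {"element": "input#email", "speech": "edit",
     "issue": "Missing label",
     "code": '<input id="email" type="text">'},
]

def export_defects(rows):
    """Render defect rows as CSV text ready for import into a bug tracker."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["element", "speech", "issue", "code"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

print(export_defects(defects))
```

Keeping the speech output and the source code side by side in each row is the point: the tester documents what was heard, and the developer gets the markup responsible.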
The presenter will demonstrate this Speech Viewer functionality as a means of determining how the JAWS screen reader reads a particular document or application window, offering a path forward for either engaging the content developer to remediate the document or application, or developing scripts to accommodate the needs of the end user.
A screen reader like JAWS serves as a comprehensive vehicle for measuring the usability and functionality of a website. JAWS Inspect removes the complexity of learning extensive screen reader commands to perform a functional test, and it minimizes the amount of auditory feedback the sighted tester must process. While JAWS Inspect was designed for the sighted developer or tester, the utility is also fully accessible to vision-impaired professionals working across varying workstreams, and it promotes an inclusive, efficient means of communication for all parties who want an accessible result for digital content. The session will conclude with a forum for brainstorming how best to incorporate the JAWS screen reader, coupled with the JAWS Inspect utility, into the audience's accessibility game plan.