Courseware Accessibility Study - 2000: Methods

Methods

The study required participants to access six different courseware packages as students, in most cases using assistive technology. Trained observers recorded details of each participant's experience accessing the course, and each session was recorded and transcribed.

Subjects

Eight adult participants were recruited from different disability groups to take part in a six-week online workshop. The criteria for selection were that the individuals might be expected to take a Web-based course at the post-secondary level, and that they had a disability that might influence their participation in an online course. All participants had completed high school, and all had taken at least one post-secondary level course. Influences on ability to participate included possible access barriers due to the need to use assistive technology, or the need for more time than might be expected of a person without a disability to access and complete the online content.

The participants were:

  1. One person with quadriplegia who used infrared mouse technology to access a computer.
  2. One person with moderate vision loss who used screen enhancement technology to enlarge the computer screen.
  3. Two people with severe vision loss who used a screen reader to aid their use of a computer.
  4. One person who had no residual vision who relied on a screen reader to access a computer.
  5. Two people with moderate reading disabilities who used text-to-speech technology to aid comprehension of large blocks of content.
  6. One person with a severe learning disability who chose not to use assistive technology to access a computer.

Materials & Apparatus

An online workshop entitled Advocacy for Persons with Disabilities was developed for this study. The workshop was designed so that the participants' experience would closely replicate that of taking an actual Web-based, post-secondary course. The workshop content was chosen because it would likely be useful and of interest to the participants, and would therefore keep them engaged throughout the study. Beyond participant engagement, the content is not relevant to the outcome of the study and will not be outlined further.

Three of the courseware packages used were installed on remote Web servers operated by each package's developer. The details of the server setup for these packages were not known. These packages included:

  1. Blackboard CourseInfo v 4.0 (http://www.blackboard.com/)
  2. Web Course in a Box (v ?) (http://www.wcbcourses.com)
  3. Mallard (v 2000b) (http://www.cen.uiuc.edu/Mallard/)


The other three courseware packages were installed on local servers. These included:

  1. WebCT v 2.1 (http://www.webct.com/) installed on a Solaris server
  2. Virtual-U v 2.5 (http://www.vlei.com/) installed on a Linux server
  3. TopClass v 3.1.2 (http://www.wbtsystems.com/) installed on a Linux server

Two additional Pentium II-class PCs were required for the study and were connected to the Internet through a 100 Mb T1 line. All participants used Microsoft Internet Explorer 5.0 to access each of the courseware products and the workshop content they contained. Internet Explorer was chosen for the study because of its popularity and its accessibility features.

A number of assistive technologies were set up on the PCs prior to the study. These included:

  1. JAWS 3.5 - screen reader (Henter-Joyce) (http://www.hj.com)
  2. Read & Write 4.0 - text-to-speech (textHELP) (http://www.texthelp.com)
  3. ZoomText Xtra - screen enhancement (Ai Squared) (http://www.aisquared.com)
  4. WiViK - onscreen keyboard (Prentke Romich) (http://www.prentrom.com)
  5. HeadMaster Plus - infrared pointing device (Prentke Romich)

Training

Prior to the start of the workshop, participants who required it received training on the assistive technology they would be using, to reduce the likelihood that any access barriers encountered were the result of inexperience with the technology. Only two participants required initial training, both to use JAWS. The others reported being sufficiently comfortable with their technology to begin the workshop without initial training. Prior to the first session, participants were also given some background on the Internet and on participating in online courses: features of the Web such as links, Web sites, search engines, and communication tools were explained, and information on what to expect from the courseware tools was provided.

Each of the two observers completed three training sessions prior to the study, in which they themselves were required to access three courseware packages and complete the respective access exercises, relying on audio output alone, without the aid of a computer monitor. While one observer completed the exercise, the other recorded the access barriers the first encountered. This method allowed the observers to experience the barriers that might be faced by people with disabilities, and to gain some experience with the assistive technology the participants would be using. Observers were trained using JAWS because it is the most complex and difficult to learn of the assistive technologies used, and because a screen reader user is more likely to encounter barriers than users of the other assistive technologies listed above.

Data Collection

Two measurement tools were developed for this study. These included an intake questionnaire and an observation protocol. The Web-based intake questionnaire was an adaptation of the Distance Education Survey (DES), developed by the Evnet Group (http://socserv2.mcmaster.ca/srnet/evnet.htm). This instrument was reformatted to ensure that it would be accessible to participants who used assistive technology to access it, and that it would produce data that could be easily imported into common database applications and statistical analysis programs (i.e. CSV format). One additional section was added to the survey that gathered information on participants' knowledge and experience with accessibility issues. In all, the survey collected the following information about the participants:

  1. Learning Preferences
  2. Computer Access
  3. Computer Attitudes
  4. Employment and Education
  5. Disability Awareness
  6. Personal Information

(See Appendix A for Intake Questionnaire)
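To illustrate why the CSV export format was chosen, survey responses in that form can be loaded directly for analysis. The sketch below is a minimal Python example; the column names and values shown are hypothetical placeholders, since the actual questionnaire fields are those given in Appendix A:

```python
import csv
import io

# Hypothetical CSV export from the intake questionnaire; the real field
# names correspond to the survey sections listed above (see Appendix A).
raw = io.StringIO(
    "participant_id,learning_preference,computer_attitude_score\n"
    "P1,audio,4\n"
    "P2,text,5\n"
)

# Each row becomes a plain dict keyed by column name, ready for import
# into a database application or statistical analysis program.
rows = list(csv.DictReader(raw))
```

The same file could equally be opened by a spreadsheet or statistics package, which is the portability the CSV reformatting was intended to provide.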

An observation protocol was developed for each of the six packages studied. During the setup of the software and the development of the workshop content, the features of each courseware package were identified and used as the basis for an "Access Exercise." The exercises contained from 15 to 40 items that together required participants to access all of a package's features. For example, participants might be asked to access the home page and read the introduction, to access the bulletin board and read and post a message, or to take an online quiz. The observation protocol matched the access exercise, providing a space below each item for the observer to record any access difficulties participants experienced. In addition to recording observations, the observer ranked the participant's ability to access each component of the courseware package on a 5-point scale, as follows:

  1. Was unable to access; required the observer to intervene physically (inaccessible).
  2. Was able to access with instruction, but with no physical help (instructional "how to").
  3. Was able to access with a hint (e.g., "try tabbing", "use the down arrow").
  4. Was able to access, but asked for clarification/affirmation (e.g., "Am I in a text area?").
  5. Was able to access without any instruction or physical help from the observer (accessible).

Recorded observations included any aspect of the session that might provide information about the participant's ability to access courseware components, including the details behind each rank assigned. For example, the details accompanying a rank of "1" might read, "John could not find the button to access the site map because no ALT text or text link was provided that could be interpreted by his screen reader. I [the observer] had to click on the image link for John." Other observations noted any changes in the participant's mood or attitude; for example, "John became frustrated after tabbing through the links on the page and being unable to find the navigation links." Participants were also encouraged to suggest how the content and courseware tools could be made more accessible to them, and these comments were likewise recorded. Each session was recorded and transcribed, including the speech of the observer, the participant, and the output from the assistive technology being used (where applicable).
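The rank data described above lend themselves to straightforward aggregation. The following is a hedged sketch, not the study's actual analysis code, of computing a mean rank per exercise item; the item names and rank values are hypothetical:

```python
from collections import defaultdict

# Hypothetical ranks (1 = inaccessible ... 5 = accessible) recorded on
# the observation protocol, one per (exercise item, participant) pair.
ranks = [
    ("access bulletin board", 1),
    ("access bulletin board", 3),
    ("take online quiz", 5),
    ("take online quiz", 4),
]

# Group the ranks by exercise item.
by_item = defaultdict(list)
for item, rank in ranks:
    by_item[item].append(rank)

# Mean rank per exercise item, one possible summary of how accessible
# each courseware component proved to be across participants.
mean_rank = {item: sum(rs) / len(rs) for item, rs in by_item.items()}
```

A mean near 5 would indicate a component participants accessed unaided, while a mean near 1 would flag a component requiring physical intervention.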

A Typical Session

Prior to the first session each week, the observer accessed the courseware being evaluated and completed the access assignment to become familiar with the layout of the content and the courseware, and to identify any possible barriers that might be found in that package. Sessions were conducted approximately once per week and ranged from one to three hours in length, depending on the assistive technology being used and the complexity of the courseware being evaluated.

At the beginning of each session the participant was informed that they were being recorded, and the tape recorder was turned on. Participants were asked to log in to the course by entering a username and password assigned to them, or in some instances by creating a username and password and then logging in. While the participant was engaged in each task, the observer noted on the observation protocol any access difficulties witnessed. After logging in, participants were typically asked to familiarize themselves with the home page and then read the access exercise. To shorten sessions for those using a screen reader, after this initial reading the observer recited each subsequent item in the exercise so the participant would not have to revisit the exercise page. The location of each tool was stated along with the exercise item so participants would not spend too much time searching for it. For example, an access exercise item might state, "Read through the course notes, linked from the bottom of the home page." The participant would then access the course notes and any other content-related items, such as announcements or events. Following that, participants would access any communication tools, reading messages from, and posting messages to, each of them. Communication tools generally included a bulletin board, an internal email system, and a chat facility. After trying the communication tools, participants were asked to access any other tools available in the package, such as a work area, glossary, student marks, note taking, or whiteboard. Finally, participants completed a number of quiz items related to the content presented that week. Quizzes generally included a short-answer, a multiple-choice, a true/false, and a longer-answer question; a quiz item was created for each question type a courseware package had to offer.

Following completion of the access exercise, participants were debriefed: revisiting any access barriers they had encountered, discussing any emotions that had arisen, discussing any suggestions they might have for improving the accessibility of the product being evaluated, and making comparisons with the other packages. For the last of these, participants might express a preference or dislike for a particular feature in a particular package, for example preferring a bulletin board that does not use frames over one that does, or vice versa. These observations were also recorded.