
Tag Archives: h810

How far can we push the boundaries of e-learning? Is there a future for extreme-learning?

Fig.1. Using a brain scanner, a doctor communicates with Scott Routley, who has been in a coma for over ten years (BBC, 2012)

Vegetative patient Scott Routley says ‘I’m not in pain’


Invaluable to this patient, but it also opens all kinds of possibilities for responding to a stimulus through thought alone.

If this patient ‘learns’ how to communicate further, then surely this is technology-enhanced learning at the very outer fringes of the extreme. In practice an engineer might describe this as ‘testing to destruction’ – lessons are learnt from such cases.

See more on Panorama http://www.bbc.co.uk/programmes/b01ny377

Asked to think about the future of learning, and for disabled students in particular, I couldn’t help but consider the most extreme forms of e-learning with severely disabled patients – those beyond our reach, the ‘brain dead’, while those in a vegetative state come within reach. Is this the state we go into under general anaesthetic, one from which a person does occasionally recover?

Fig.2. A new brain scanner helps completely paralysed people to spell words

I don’t want to be a guinea-pig in such a set-up, but what if, having been kept ‘alive’ say after a car accident, I tell those who have stirred me to communicate that I wish I had died on the roadside all those years ago? Do they remove the technology and wheel me into a side room until I die of natural causes decades later? (This was the scenario in a black and white anti-war movie of the 1930s … I think. I recall the detail of the film and would love its name if you know it.)

I don’t mean to be flippant, but could this technology be used to talk with animals … or give us the sense that we are? If attached to such devices in our sleep, might dream actions be turned into real ones?

Fig.3. Real-life Jedi: Pushing the limits of mind control (BBC 2012) Last accessed 10 Dec 2012

http://www.bbc.co.uk/news/technology-15200386

And who gets their hands on this extremely expensive kit? Anyone willing to be a guinea-pig? The children of billionaires? Or in time – everyone with a need.

If in your 90s you were reduced to this state, could you or would you want to extend life if it could be enriched in this way? Pushing humans into a state that is more than just one foot in the grave – you are, in every sense, living as if buried alive. And what if this could realistically be sustained for decades?

Depends on the person I suppose.


BLOGS ON ACCESSIBILITY

Disability in business

Jonathan, who has a degenerative spinal condition which means he uses a wheelchair and has carers to assist him, has first-hand experience of the challenges faced by people living with disabilities – especially in the business world. “I used to run multi-million pound companies and I’d go with some of my staff into meetings with corporate bank managers and they’d say to my staff, ‘it’s really good of you to bring a service user along’, and I’d say, ‘hang on, I’m the MD – it’s my money!’”

Disability Marketing

Michael Janger has a passionate interest in products and technologies that enable people with disabilities to enjoy a better quality of life, and works with businesses to effectively market and sell these products to the disability market.

Think Inclusive

I think there are two basic assumptions that you need in order to support inclusion (in any context):

  1. All human beings are created equal (you know the American way) and deserve to be treated as such.
  2. All human beings have a desire to belong in a community and live, thrive and have a sense of purpose.

The important takeaway: when you assume people want to belong, then it is our duty as educators, parents, and advocates to figure out how we can make that happen.

Institute for Community Inclusion

For over 40 years, the Institute for Community Inclusion (ICI) has worked to ensure that people with disabilities have the same opportunity to dream big, and make their dreams a fully included, integrated, and welcomed reality. ICI strives to create a world where all people with disabilities are welcome and fully included in valued roles wherever they go, whether a school, workplace, volunteer group, home, or any other part of the community. All of ICI’s efforts stem from one core value: that people with disabilities are more of an expert than anyone else. Therefore, people with disabilities should have the same rights and controls and maintain lives based on their individual preferences, choices, and dreams.

Cerebral Palsy Career Builders

How to deal with the following:

  1. Bias
  2. Presumption
  3. Myth
  4. Skepticism
  5. Prejudice
  6. Discrimination

The importance of having alternative formats to provide access to resources by students with disabilities

Read this web page and consider to what extent the six challenges mentioned are addressed in your context:

Mis-Adventures in Alt Format (Stewart, 2007)
http://www.altformat.org/index.asp?id=119&pid=222&ipname=GB

Pick one challenge and write a paragraph in your tutor group wiki explaining how it is relevant to your context.
____________

Developing a total picture of how Alt Format fits into the broader discussion of curricular reform and modernization will help insure that we do not continue to live on the margins of the educational mainstream. (Stewart, 2007)

‘Universal Design for Learning’

Challenges in relation to Alternative Formats:

  1. How does the provision of Alt Format fit into other emerging models for data management and delivery?
  2. How do we build systemic capacity to meet the projected needs for Alt Format and Accessible Curricular Materials?
  3. How do we align the divergent Alt Format efforts occurring on an international basis so that they minimize redundancies and duplicative efforts?
  4. How do we move beyond the current focus on Blind and Visual disabilities to a more holistic model of access for the gamut of print disabilities?
  5. How do we develop the level of technological literacy in students with print disabilities that will be necessary for them to benefit from the technological evolutions that are occurring in curricular access?
  6. How do we involve all of the curricular decision makers in the process of providing fully accessible materials?

In my context

1) How does the provision of Alt Format fit into other emerging models for data management and delivery?

With the digitization of everything, a further step to ensure content is also accessible should be taken at the time of conversion or creation. I’m not aware of an agency where this ever occurs; when there is a client request the response is a simple one – Word or PDF formats, or look to the browser or platform where the content will sit.

2) How do we build systemic capacity to meet the projected needs for Alt Format and Accessible Curricular Materials?

Is there a more appropriate agent to handle the conversion and delivery of electronic content on a given campus or system of campuses? I’d probably consider the Open University itself, or the Business School where I worked for a while. I know the disability officer, but his role was more to do with access for personnel and visitors to the building than meeting student needs – which I presume comes under Student Services.

3) How do we align the divergent Alt Format efforts occurring on an international basis so that they minimize redundancies and duplicative efforts?

Whilst efforts can and have to be made to improve access universally, might the fine detail be left to addressing group issues by working with representatives of associations for, for example, the blind, dyslexia, cerebral palsy and other groups? Learning from and then improving such practices, and tackling access for people from these groups for specific subjects and specific levels on a strategic basis, knowing that complete coverage is the goal?

‘A plan for the development and incorporation of emerging technologies in a holistic and self-sustaining model is incumbent. These emerging systems must be based on flexibility and economies of scale if we are ever going to get in front of the issues of materials access.’ (Stewart, 2007)

4) How do we move beyond the current focus on Blind and Visual disabilities to a more holistic model of access for the gamut of print disabilities?

A focus on blind and visual disabilities doesn’t cover everyone who would benefit; a more holistic model would also benefit other groups, such as non-native language populations, remedial groups, and any user who may prefer or benefit from a text record.

5) How do we develop the level of technological literacy in students with print disabilities that will be necessary for them to benefit from the technological evolutions that are occurring in curricular access?

In many anecdotal reports, less than 10% of the incoming students to higher education have ever had any realistic exposure to the access technologies they will need to be successful in adult education and in the world of work. (Stewart, 2007)

Current studies suggest the opposite: that students with disabilities, who gain so much from having a computer to access resources, are digitally literate. There are always people who, for all kinds of reasons, have had less exposure to or are less familiar with the technology – whether or not they also have a disability.

6) How do we involve all of the curricular decision makers in the process of providing fully accessible materials?

The original authors never have a say or make a contribution to the reversioning of content for use by disabled students.

This method of access often times results in the retrofit of existing materials, or the creation of alternative access methods that are not as efficient or well received in the general classroom environment. (Stewart, 2007)

For a truly effective model to be developed the original curriculum decisions should be made in a context of understanding the needs of all learners, and in particular those learners who do not have visual orientation to the teaching and learning process. (Stewart, 2007)

———————————————————————————————————–

Interview analysis revealed five personal factors that appeared to influence students’ decisions about technology use:

  1. a desire to keep things simple,
  2. a lack of DSA awareness,
  3. self-reliance,
  4. IT skills and digital literacy,
  5. a reluctance to make a fuss.

The three most talked-about factors were the desire to keep things simple, IT skills and digital literacy. Seale and Draffan (2010:455)

‘There are many ways of making and communicating meaning in the world today.’ Conole (2007:169)

The kinds of problems students with disabilities now face are different – less about whether content has been made available in a digital format, and more about how good the tools and services are for accessing this content.

  • accessibility of websites and course/learning management systems (CMS)
  • accessibility of digital audio and video
  • inflexible time limits built into online exams
  • PowerPoint/data projection during lectures
  • course materials in PDF
  • lack of needed adaptive technologies.

Students also mentioned technical difficulties using e-learning and connecting to websites and CMS, problems downloading and opening files, web pages that would not load, video clips taking too long to download, poor use of e-learning by professors and their own lack of knowledge working with e-learning.

For most groups of students, solving e-learning problems by using non e-learning solutions was also popular.

During the last decade there has been tremendous development and interest in e-learning on campus. While our research shows the many benefits of e-learning, such as the availability of online course notes, there are also problems. Chief among these are problems related to inaccessibility of websites and course management systems. (Fichten et al. 2009:253)

Digital Agility

Results suggest that an important personal resource that disabled students in the study drew on when using technologies to support their studies was their ‘digital agility’. Seale and Draffan (2010:449)

Use of assistive technologies

Many students with disabilities have, since 2007, developed strategies for the use of both specialist assistive technologies (e.g. IrisPro, quill mouse, Kurzweil, Inspiration or Dragon Dictate) as well as more generic technologies (e.g. mobile phone, DS40 digital recorder, Google). Seale and Draffan (2010:450)

Seale and Draffan (2010:451) therefore suggest that disabled students have the kind of ‘sophisticated awareness’ that Creanor et al. (2006) described when they talked about effective learners being prepared to adapt activities, environments and technologies to suit their own circumstances. This contradicts somewhat the arguments of Stewart, who argues that disabled students are behind other students in terms of developing digital literacies.

The digital agility of the students, identified in the study, is significant in terms of encouraging practitioners not to view all disabled students as helpless victims of exclusion. Digital inclusion does not always have to be understood through the dual lenses of deficits and barriers. Seale and Draffan (2010:458)

REFERENCE

Conole, G. and Oliver, M. (eds) (2007) Contemporary Perspectives in E-Learning Research: Themes, Methods and Impact on Practice.

Fichten, C. S., Ferraro, V., Asuncion, J. V., Chwojka, C., Barile, M., Nguyen, M. N., … Wolforth, J. (2009) Disabilities and e-Learning Problems and Solutions: An Exploratory Study. Journal of Educational Technology & Society, 12(4), 241–256.

Seale, J. and Draffan, E. A. (2010) Digital agility and digital decision-making: conceptualising digital inclusion in the context of disabled learners in higher education. Studies in Higher Education, 35(4), 445–461.

Stewart, R. (2007) Mis-Adventures in Alt Format. http://www.altformat.org/index.asp?id=119&pid=222&ipname=GB

The purpose of education …

XIV Paralympic Games (Photo credit: Wikipedia)

 

“The purpose of education is not to make information accessible, but rather to teach learners how to transform accessible information into useable knowledge. Decades of cognitive science research have demonstrated that the capability to transform accessible information into useable knowledge is not a passive process but an active one”. CAST (2011)

 

Constructing useable knowledge, knowledge that is accessible for future decision-making, depends not upon merely perceiving information, but upon active “information processing skills” like selective attending, integrating new information with prior knowledge, strategic categorization, and active memorization. Individuals differ greatly in their skills in information processing and in their access to prior knowledge through which they can assimilate new information. CAST (2011)

 

Proper design and presentation of information – the responsibility of any curriculum or instructional methodology – can provide the scaffolds necessary to ensure that all learners have access to knowledge. CAST (2011)

 

I recommend the last link above in its entirety, over most that I have reviewed. It is a resource. It is succinct. It is practical. It respects the fact that all students come to this kind of learning with a set of experiences and skills – and tactics and tools that work for them. Why make someone play the tuba when they play the harp perfectly well? A metaphor worth developing, I wonder, in relation to learning to play an instrument, reading music, passing theory tests, performing solo or in an ensemble, sight reading etc.:

 

Do you recall the paraorchestra performing with Coldplay at the closing ceremony of the London 2012 Paralympics, which represented the widest range and degree of disability? http://www.guardian.co.uk/society/2012/sep/01/orchestra-disabled-people-play-paralympics

 

Guidelines

 

  • Provide options for perception
  • Provide options for language, mathematical expressions, and symbols
  • Provide options for comprehension

 

Checkpoints

 

  • Offer ways of customizing the display of information
  • Offer alternatives for auditory information
  • Offer alternatives for visual information
  • Support decoding text, mathematical notation, and symbols
  • Clarify vocabulary and symbols
  • Clarify syntax and structure
  • Promote understanding across languages
  • Illustrate through multiple media

 

REFERENCE

 

CAST (2011). Universal Design for Learning Guidelines version 2.0. Wakefield, MA: Author.

 

http://www.udlcenter.org/aboutudl/udlguidelines/principle1#principle1_g3

 

National Center On Universal Design for Learning

 

Guideline 3: Provide options for comprehension

 

http://www.udlcenter.org/aboutudl/udlguidelines/principle1#principle1_g3

 

NATIONAL CENTER ON UNIVERSAL DESIGN FOR LEARNING, AT CAST
40 HARVARD MILLS SQUARE, SUITE 3, WAKEFIELD, MA 01880-3233
TEL (781) 245-2212, EMAIL UDLCENTER@UDLCENTER.ORG

 

 

A Case Study in User-Centred Design Methods

Redesign of the Monash University Web Site

What is user-centred design?

User-centred design (UCD) is a development process often defined as having three core principles (Gould & Lewis, 1985: 300-11).

1) The first is the focus on users and their tasks.

UCD involves a structured approach to the collection of data from and about users and their tasks. The involvement of users begins early in the development life-cycle and continues throughout it.

2) The second core principle is the empirical measurement of usage of the system.

The typical approach is to measure aspects of ease of use on even the earliest system concepts and prototypes, and to continue this measurement throughout development.

3) The third basic principle of UCD is iterative design.

The development process involves repeated cycles of design, test, redesign and retest until the system meets its usability goals and is ready for release.

The goal of user-centred design is to make a system more usable.

Usability is defined in ISO 9241 as “the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use” (cited Bevan 1999).
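
As a rough illustration of how those three strands might be measured in practice (my own sketch, not part of ISO 9241 or the Monash case study), effectiveness can be read as task completion rate, efficiency as time on task, and satisfaction as a post-test rating.

```typescript
// Illustrative only: one way of summarising effectiveness, efficiency and
// satisfaction from usability test sessions. Field names are hypothetical.

interface TestSession {
  completedTask: boolean;      // did the specified user achieve the specified goal?
  timeOnTaskSeconds: number;
  satisfactionRating: number;  // e.g. 1 (very dissatisfied) to 5 (very satisfied)
}

function summarise(sessions: TestSession[]) {
  const n = sessions.length;
  const effectiveness = sessions.filter(s => s.completedTask).length / n;              // completion rate
  const efficiency = sessions.reduce((sum, s) => sum + s.timeOnTaskSeconds, 0) / n;    // mean time on task
  const satisfaction = sessions.reduce((sum, s) => sum + s.satisfactionRating, 0) / n; // mean rating
  return { effectiveness, efficiency, satisfaction };
}

console.log(summarise([
  { completedTask: true, timeOnTaskSeconds: 95, satisfactionRating: 4 },
  { completedTask: false, timeOnTaskSeconds: 240, satisfactionRating: 2 },
  { completedTask: true, timeOnTaskSeconds: 120, satisfactionRating: 5 },
]));
```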

REFERENCE

Bevan, N. (1999, in press) Quality in Use: Meeting User Needs for Quality. Serco Usability Services.

Gould, J. D. and Lewis, C. (1985) “Designing for Usability: Key Principles and What Designers Think”, Communications of the ACM, 28(3), March, pp. 300–311.

Daly-Jones, O., Thomas, C. and Bevan, N. (1997) Handbook of User Centred Design. National Physical Laboratory, Teddington, Middlesex, UK.

 

Approaches to designing for accessible e-learning

H810 Activities : 24:1 – 24:4

There are three main approaches to designing for accessibility, and for accessible e-learning in particular (Seale):

1) Universal design
2) Usability
3) User-centred design.

Form follows Function

‘Software architecture research investigates methods for determining how best to partition a system, how components identify and communicate with each other, how information is communicated, how elements of a system can evolve independently, and how all of the above can be described using formal and informal notations’. Fielding (2000:xvi)

‘All design decisions at the architectural level should be made within the context of the functional, behavioral, and social requirements of the system being designed, which is a principle that applies equally to both software architecture and the traditional field of building architecture. The guideline that “form follows function” comes from hundreds of years of experience with failed building projects, but is often ignored by software practitioners’. Fielding (2000:1)

Form follows function is a principle associated with modern architecture and industrial design in the 20th century.

The principle is that the shape of a building or object should be primarily based upon its intended function or purpose.’ Wikipedia http://en.wikipedia.org/wiki/Form_follows_function

The American architect Louis Sullivan, Greenough’s much younger compatriot, who admired rationalist thinkers like Greenough, Thoreau, Emerson, Whitman and Melville, coined the phrase in his article The Tall Office Building Artistically Considered in 1896. Sullivan (1896)

In 1908 the Austrian architect Adolf Loos famously proclaimed that architectural ornament was criminal, and his essay on that topic would become foundational to Modernism and eventually trigger the careers of Le Corbusier, Walter Gropius, Alvar Aalto, Mies van der Rohe and Gerrit Rietveld. The Modernists adopted both of these equations—form follows function, ornament is a crime—as moral principles, and they celebrated industrial artifacts like steel water towers as brilliant and beautiful examples of plain, simple design integrity’. Wikipedia http://en.wikipedia.org/wiki/Form_follows_function

Which rather suggests we ought to be constrained, take a Puritan stance and behave as if we are designing on a shoestring (rather like Cable & Wireless with their LMS, 2012)

Form follows function, ornament is crime

These two principles—form follows function, ornament is crime—are often invoked on the same occasions for the same reasons, but they do not mean the same thing.

If ornament on a building may have social usefulness like aiding wayfinding, announcing the identity of the building, signaling scale, or attracting new customers inside, then ornament can be seen as functional, which puts those two articles of dogma at odds with each other. Wikipedia http://en.wikipedia.org/wiki/Form_follows_function

In relation to e-learning this could be a prerequisite both for a company commissioning work and for their audience doing it.

‘We don’t do boring’ WB

Even the likelihood that the learning will be taken in, understood and remembered depends on the form. If the form needs to satisfy various accessibility functions then it will be a different object than had it not. It will be a different form too if these components are an add-on rather than part of the original brief and build. (Car, mobility car – add wing mirrors, headlights, bumpers, wipers and a horn. Does a car satisfy Section 5 when, like the Google self-drive car, independent road travel becomes accessible to the visually and mobility impaired?)

Sustainable Futures Graphic
http://www.forumforthefuture.org/project/framework-sustainable-economy/overview

All of these approaches have their origins outside of e-learning, in disciplines such as:

  • Human-computer interaction
  • Assistive technology
  • Art and design.

Seale’s perspective on design is topsy-turvy (IMHO).

  • First of all we have universally, ‘Design’ – in all things including art, but also engineering, architecture, publishing, fashion, advertising and promotion …
  • Then, secondly – human computer interaction to embrace software, hardware and web–design,
  • And finally, most narrowly and with many caveats, ‘assistive technology’.

Two other approaches to design also have an influence on designing for accessibility:

  • Designing for adaptability JFV – repurposing, enhancing, simplifying …
  • Designing with an awareness of assistive technologies. JFV – adjusting (forewarned is forearmed)

Each of these approaches will be discussed in turn.

Universal design

The universal design approach to designing for accessibility is also known as

  • design for all
  • barrier free design
  • inclusive design.

The underpinning principle of universal design is that in designing with disability in mind a better product will be developed that also better serves the needs of all users, including those who are not disabled.

‘Better’ is a value judgement that is impossible to measure. We can be sure that the product will be different if designed with disability in mind; might it, however, be compromised, even rendered less desirable?

‘Universal design is the process of creating products (devices, environments, systems, and processes), which are usable by people with the widest possible range of abilities, operating within the widest possible range of situations (environments, conditions, and circumstances)’ (Vanderheiden 1996).

User–centered design is ...

Barnum (2002) cites three principles that are central to understanding user–centered design:

(1) “early focus on users and tasks,”

(2) “empirical measurement of product usage,”

(3) “iterative design” (p. 7).

Elaborating, Barnum also suggests that user–centered design encourages product developers to

• gather information from users before product development begins,
• identify tasks that users will want to perform before product development begins,
• include users in the product development process,
• use an iterative product development life cycle.

I’m not convinced that designers or educators, unless they are required to do so, put the needs of the disabled first. The problem has to be tackled bearing in mind a myriad of parameters, controls and history. Take the advance in wheelchairs for Paralympic sport … there is however a constant need to return to first principles, the basic needs, drawing afresh upon current best practice, experience, tools and knowledge. Design is an iterative process, where solutions may be stumbled upon through serendipity.

Things have moved on considerably in the last few years (2007 >).

‘The greatest potential of these and other on-demand services is their ability to allow users to call up assistance for only those minutes or even seconds that they need them. This provides maximum independence with minimum cost, yet allows the assistance to be invoked as needed to address the situations that arise’. Vanderheiden (2007)

e.g. Philips Centre for Industrial Design

‘Thus the thermostat can be very large and easy to operate when it is needed and invisible the rest of the time. It can also be large for one user in the house while other users, who do not need a large thermostat, can have a more modest size thermostat projected for them. Someone in a wheelchair could have theirs projected (and operated) low on the wall while a tall person could have it appear at a convenient height for them’. Vanderheiden (2007:151)

‘Finally, research on direct brain control is demonstrating that even individuals who have no physical movement may soon be able to use thought to control devices in their environment directly’. Vanderheiden (2007:152)

BBC 4 reports that a man in a vegetative state for 10 years has communicated that he is in no pain (BBC, 13 Nov 2012). See also Taylor et al (2002).

Mr Routley suffered a severe brain injury in a car accident 12 years ago.

None of his physical assessments since then have shown any sign of awareness, or ability to communicate.

But the British neuroscientist Prof Adrian Owen – who led the team at the Brain and Mind Institute, University of Western Ontario – said Mr Routley was clearly not vegetative.

“Scott has been able to show he has a conscious, thinking mind. We have scanned him several times and his pattern of brain activity shows he is clearly choosing to answer our questions. We believe he knows who and where he is.”

http://www.bbc.co.uk/news/health-20268044

‘As we move to such things as the incorporation of alternate interface capabilities and natural language control of mainstream products, we begin to blur the definitions that we have traditionally used to describe devices, strategies and even people. Those of us who wear glasses are not seen to have a disability (even if we cannot read without them), because they are common and everybody uses them. In fact, the definition of legal blindness is based on one’s vision after correction. However, not all types of disability are defined by people’s ability “after they have been outfitted with all appropriate assistive technologies”’. Vanderheiden (2007:154)

‘Whereas remote control of devices by voice would be an assistive technology for someone with a spinal cord injury today, in the future it may just be the way the standard products work. They still are “assistive technologies” in that they allow the individual to do things that they would not be able to do otherwise, but they are not “special” assistive technologies’. Vanderheiden (2007:155)

‘Apple computer, for example, has announced that their next operating system will have a screen reader for individuals who are blind built directly into the operating system’. Vanderheiden (2007:156)

‘We should be careful that our lack of imagination of what is possible does not translate into limitations that we place on the expectations of those we serve, and the preparation we give them for their future’. Vanderheiden (2007:157)

Thompson (2005) offers a number of examples that illustrate how universal web design can benefit a range of users.

For example, text alternatives for visual content (e.g. providing ALT tags for images) benefit anyone who doesn’t have immediate access to graphics. While this group includes people with blindness, it also includes sighted computer users who surf the web using text-based browsers, users with slow Internet connections who may have disabled the display of graphics, users of handheld computing devices, and users of voice web and web portal systems including car-based systems.

Central to the universal design approach is a commitment that products should not have to be modified or adapted.

Design once.

Responsive design sees content created once and adjusted for multiple devices – if these devices could include assistive technology, taking in hardware, software and websites, then from this single starting point the content would be reversioned for the user’s tastes and needs.
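
A minimal sketch of that ‘create once, adapt at the point of use’ idea, assuming a browser context: the standard matchMedia API detects the user’s declared preferences and the page adjusts itself, rather than being reversioned by hand. The CSS class names here are invented for illustration.

```typescript
// Sketch: one piece of content, adapted at the point of delivery to the
// user's own settings. Uses the standard matchMedia API; the CSS class
// names applied here are hypothetical.

function adaptToUserPreferences(root: HTMLElement): void {
  if (window.matchMedia("(prefers-reduced-motion: reduce)").matches) {
    root.classList.add("no-animation");   // switch off decorative animation
  }
  if (window.matchMedia("(prefers-contrast: more)").matches) {
    root.classList.add("high-contrast");  // swap to a higher-contrast palette
  }
  if (window.matchMedia("(max-width: 480px)").matches) {
    root.classList.add("single-column");  // small screens and some magnifier users
  }
}

adaptToUserPreferences(document.body);
```

The same preferences could equally be read from a stored user profile rather than the operating system or browser.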

Providing a clear, simple design, including a consistent and intuitive navigational mechanism, benefits a variety of users with disabilities, but the result of doing so is a website where users can easily and efficiently find the information they’re looking for. Clearly, this is a benefit to all users. Thompson (2005)

They should be accessible through easily imposed modifications that are ‘right out of the box’ (Jacko and Hanson 2002: 1).

Products should also be compatible with users’ assistive technologies:

‘This practice, when applied to the web, results in web content that is accessible to the broadest possible audience, including people with a wide range of abilities and disabilities who access the web using a wide variety of input and output technologies’ (Thompson 2005).

Whilst some purists argue that universal design is about designing for everyone, the majority of proponents agree that designing for the majority of people is a more realistic approach (Witt and McDermott 2004; Bohman 2003a).

But it is no longer legally permissible to take this stance on access in either the UK or the US, and presumably the same is true across Europe?

‘This exclusion can affect not only older and disabled people but also socially disadvantaged people affected by financial, educational, geographic and other factors’. Bühler C, Engelen J, Velasco C, et al. (2012)

Vanderheiden (1996), meanwhile, argues that ‘there are NO universal designs; there are NO universally designed products. Universal design is a process, which yields products (devices, environments, systems, and processes), which are usable by and useful to the widest possible range of people. It is not possible, however, to create a product, which is usable by everyone or under all circumstances’.

There are some who feel uncomfortable with the principles of universal design because they appear to relieve educators of the responsibility of addressing individual student needs.

Rather, as the Khan Academy have shown, it frees up teachers to provide the individual touch.

‘Digital inclusion policies are drawing attention to the ‘missing 30%’ who are currently excluded and the need to ensure the rights of all citizens to participate in and benefit from these new opportunities. This missing 30% embraces social, economic and political disadvantage as well as issues of ageing and disability. The established principles and practices of Design for All offers an opportunity to ensure that designers and developers working in ICT related fields are able to respond to the challenge of creating new systems, services and technologies that meet these broad demands of digital inclusion’. Bühler C, Engelen J, Velasco C, et al. (2012)

For example Kelly et al. (2004) argue that since accessibility is primarily about people and not about technologies it is inappropriate to seek a universal solution and that rather than aiming to provide an e-learning resource which is accessible to everyone there can be advantages in providing resources which are tailored to the student’s particular needs.

This thinking is becoming redundant, or at least the scale of inclusion in such a category is decreasing – the less severely impaired (visual, hearing, mobility, cognitive) can use the mainstream.

There are also some who warn that no single design is likely to satisfy all different learner needs.

The classic example given to support this argument is the perceived conflict between the needs of those who are blind and those who have cognitive disabilities. For example, the dyslexic’s desire for effective imagery and short text would appear to contradict the blind user’s desire for strong textual narrative with little imagery. However Bohman (2003b) provides a counterbalance to this argument stating that while the visual elements may be unnecessary for those who are blind, they are not harmful to them. As long as alternative text is provided for these visual elements, there is no conflict.

METAPHOR

The core is like a Christmas tree; the components that create accessibility – always to a degree and never comprehensive – are the lights and decorations, as are the design and richness added for branding and to inspire.

Those with cognitive disabilities will be able to view the visual elements, and those who are blind will be able to access the alternative text.

Brajnik (2000) defines usability as the:

‘effectiveness, efficiency and satisfaction with which specified users achieve specified goals in particular environments’.

Effectiveness, efficiency and satisfaction can be achieved by following four main design principles:

  • make the site’s purpose clear
  • help users find what they need
  • reveal site content
  • use visual design to enhance, not define, interaction design (Nielsen 2002).

Many practitioners talk of usability and accessibility in the same breath, as if they were synonymous. For example, Jeffels and Marston (2003) write:

‘Using legislation as the stick will hopefully cease to be as important as it currently is; the carrot of usability will hopefully be the real driving force in the production of accessible online materials one day soon.’

There may be good reason for assuming synonymy between accessibility and usability.

For example, Arch (2002) points out that many of the checkpoints in the Web Content Accessibility Guidelines (WCAG) are actually general usability requirements and a number of others are equally applicable to sections of the community that are not disabled, such as those who live in rural areas or the technologically disadvantaged.

Many proponents also argue that a usability design approach is essential in accessibility because in many cases current accessibility standards do not address making web pages directly usable by people, but rather address making web pages usable by technologies (browsers, assistive devices). However, usability and accessibility are different, as Powlik and Karshmer (2002: 218) forcefully point out: ‘To assume accessibility equates to usability is the equivalent of saying that broadcasting equates to effective communication.’

What is the difference between usability and accessibility?

One simple way to understand the difference between usability and accessibility is to see usability as focusing on making applications and websites easy for people to use, whilst accessibility focuses on making them equally easy for everyone to use. Seale C7

Others see the difference as one of objectivity and subjectivity, where accessibility has a specific definition and is about technical guidelines and compliance with official norms.

Usability, however, has a broader definition, is resistant to attempts at specification and is about user experience and satisfaction (Iwarsson and Stahl 2003; Craven 2003b; Richards and Hanson 2004).

Whilst usability and accessibility are different, it is clear that they do complement one another.

However, it does not follow that if a product is accessible, it is also usable.

This is because accessibility is necessary, but not sufficient, for usability; whilst usability is not necessary for accessibility, it is desirable. In a compelling indictment of the use of automatic accessibility checking tools Sloan and Stratford (2004) argue that ‘Accessible remains worthless without usable – which depending on the aims of the resource, could imply “valued”, “trustworthy”, “worthwhile”, “enjoyable”, “fun”, or many other desirable adjectives.’

  • Statement 1: I’m keeping it simple, I don’t want to fall foul of the legislation
  • Statement 2: I’d like to have done something better, but I had to make it accessible. Shame really.
  • Statement 3: This resource is fully accessible – Bobby says so.

‘It’s essential to remember that there are two levels to multimedia accessibility. On the one hand multimedia developers should strive to use techniques of best practice, standards, guidelines, and features of multimedia technology to maximise the accessibility of a particular piece of multimedia content.

But on the wider scale, designers should never forget that the use of multimedia itself is an enabler, a way of making information and concepts more accessible to people.

By supplementing textual content with rich media such as animation, video or audio clips, information flow and the learning process can be greatly enhanced. So where it’s just not possible to make a specific piece of multimedia accessible to some people, think in terms of ‘how can I provide an alternative?’, rather than thinking ‘I’d better not use it’. Sloan and Stratford (2004)

Before designing a multimedia resource ask yourself:

  • What are the aims of this piece of multimedia?
  • What are the pedagogic goals?
  • How does the resource fit in with the rest of the learning environment?
  • Is it absolutely necessary that all students use it?
  • What are the potential barriers to using these multimedia? What level of sensory or motor abilities is required?
  • How might specific learning difficulties or other cognitive impairments affect the ability to use the resource?
  • What alternatives already exist and what alternatives can be reasonably created? How was the subject or topic previously taught?
  • What is the best way that the information can be presented or learning objectives reached such that:

a) the majority can still get the most out of the multimedia resource?
b) those affected by accessibility barriers get the same information and experience in a way best suited to them? Sloan and Stratford (2004)

Thatcher (2004) provides a pertinent example of a website of one US Federal Agency to demonstrate how accessibility does not equal usability.

The site passed most automated accessibility checks, but failed on a usability check. Thatcher notes that the designers used the techniques that they were supposed to use, such as using alt-text on images. However, he concluded that they must have blindly used those techniques without understanding why. Otherwise they would not have provided alt-text on invisible images that have been used purely for spacing, requiring a screen reader to read the text over and over (a total 17 times), when in fact it was information a user did not need to know.
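
The distinction Thatcher is pointing at is easy to sketch (this is illustrative code, not his example): informative images get meaningful alt text, while purely decorative spacer images get an empty alt attribute so a screen reader skips them instead of announcing filler text again and again.

```typescript
// Sketch of the distinction Thatcher describes: alt text should carry meaning
// for informative images and be empty for purely decorative ones, so a screen
// reader skips the decoration rather than reading filler text repeatedly.

function img(src: string, alt: string): string {
  return `<img src="${src}" alt="${alt}">`;
}

// Informative image: the alt text conveys the content.
const chart = img("budget-2004.png", "Bar chart of the agency's 2004 budget by programme");

// Decorative spacer: empty alt (alt="") tells assistive technology to ignore it.
// Giving it alt text such as "spacer" would be read aloud on every occurrence.
const spacer = img("spacer.gif", "");

console.log(chart, spacer);
```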

Usability design methods

Usability is essentially about trying to see things from the user’s perspective and designing accordingly. Seale, C7 (2006)

Seeing things from the user’s perspective can be difficult, as Regan (2004a) acknowledges when discussing Flash design and screen readers. In order to see things from a user’s perspective more easily, usability designers may employ user profiles, investigate user preferences or conduct user evaluations (testing).

Personas at the OU; also see the examples that Clive Shepherd uses in The New Learning Architect.

For example, Kunzinger et al. (2005) outline how IBM has employed user profiles (personas of users with disabilities) in an attempt to make their products more usable. The personas are described as composite descriptions of users of a product that embody their use of the product, their characteristics, needs and requirements.

According to Kunzinger et al. (2005) these personas engage the empathy of developers for their users and provide a basis for making initial design decisions.

Gappa et al. (2004) report how an EU project called IRIS investigated the preferences of people with disabilities with regard to interface design of web applications as well as user requirements of online help and search engines. The results showed that user preferences for information presentation vary a great deal and that solutions would require comprehensive user profiling.

Bryant (2005) documents the results of a usability study in which four participants from each of the seven target audiences for the website were asked to undertake tasks from one of seven ‘scenarios’, one for each target audience. Participants were observed and videotaped while undertaking these tasks.

This sounds similar to the IET Computer Labs, where there are frequent detailed studies of web usability and accessibility with a generous supply of assistive technology, video, eye-tracking and two-way mirrors.

http://www8.open.ac.uk/iet/main/research-scholarship

Paciello (2005) argues that engaging users and applying usability inspection methods are the cornerstones for ensuring universal accessibility.

Enhancing accessibility through usability inspections and usability testing

This paper proposes usability methods including heuristic inspections and controlled task user testing as formulas for ensuring accessible user experience.

http://www.csun.edu/cod/conf/2005/proceedings/2509.htm

‘Unfortunately, the presence of automated accessibility evaluation software often provides limited, unreliable, and difficult to interpret feedback. By its very nature, automated testing is incapable of emulating true user experience’. (2005:01) Michael G. Paciello of The Paciello Group.

Time after time, vendors present “508 compliant” software only to find out that federal employees and citizens cannot use the application with their assistive technologies.

We suggest that the problem involves incomplete design and testing methodologies. We further suggest that the best solution to ensuring accessible user experience (AUE) involves a combination of automated testing tools, expert usability inspections, and controlled task user testing.

In their book, “Designing Web Sites that Work – Usability for the Web”, Brinck, Gergle, and Wood note the following:

“Many typical guidelines for accessibility stress the use of standard technologies or recommend providing standardized fallbacks when using nonstandard technologies. In particular, most advise following HTML standards strictly. Unfortunately, the real world is rather messy, and most web browsers do not follow HTML standards exactly, making strict adherence to the standards less accessible in practice. The only certain way to ensure accessibility is to follow the recommendations as well as possible and then test the site with users with disabilities. ”

TESTING METHODOLOGIES:

The ideal then is to broaden the scope of user inclusiveness by engaging users with disabilities through a variety of usability testing methods and iterative test cycles, including:

• Participatory design
• Rapid prototyping
• Contextual inquiry
• Heuristic inspections
• Critical incident reporting
• Remote evaluation
• Controlled task user testing

Essentially, there are four basic evaluation methods:

• Automatic (using software tools)
• Formal (exact models/formulas to calculate measures)
• Informal (experienced evaluators conducting usability inspections)
• Empirical (usability testing involving users)
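
To give a flavour of the ‘automatic’ end of that spectrum, here is a toy checker of my own (not a real tool such as Bobby or any product named above): it can flag mechanically detectable problems such as missing alt attributes or unlabelled form fields, but it cannot judge whether the alt text is meaningful or the page is actually usable, which is exactly Paciello’s point.

```typescript
// Toy automated check, illustrative only. It flags mechanically detectable
// issues; it cannot judge whether alt text is meaningful or the page is usable.

interface Finding {
  element: string;
  problem: string;
}

function automatedCheck(doc: Document): Finding[] {
  const findings: Finding[] = [];

  // Images with no alt attribute at all.
  doc.querySelectorAll("img").forEach(img => {
    if (!img.hasAttribute("alt")) {
      findings.push({ element: img.outerHTML, problem: "Image has no alt attribute" });
    }
  });

  // Form fields with neither an associated <label> nor an aria-label.
  doc.querySelectorAll("input, select, textarea").forEach(field => {
    const id = field.getAttribute("id");
    const hasLabel = id !== null && doc.querySelector(`label[for="${id}"]`) !== null;
    if (!hasLabel && !field.hasAttribute("aria-label")) {
      findings.push({ element: field.outerHTML, problem: "Form field has no label" });
    }
  });

  return findings;
}

console.log(automatedCheck(document));
```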

There are a number of different usability inspection methods. Each has its own strength and purpose.

These methods include (but are not limited to) the following:

• Heuristic evaluations (expert reviews)
• Guideline & Standards reviews (conformance review)
• Walkthroughs (user/developer scenarios)
• Consistency (family of products review)
• Formal usability inspections (moderated & focused inspection)
• Feature inspections (function, utility review)

During the heuristic inspection, the evaluator is looking for several user interface areas including:

• Ease of Use
• Error Seriousness
• Efficiency
• UI “pleasantness”
• Error Frequency

The resulting recommendation report should feature the following:

• Identify problems and suggest redesign solutions
• Prioritize list of usability problems based on rate of occurrence, severity level, & market/user impact
• Where possible, estimate implementation cost & cost-benefit analysis
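
One possible shape for such a recommendation report, sketched as a data structure; the field names, scales and the priority formula are my own assumptions, not Paciello’s.

```typescript
// Hypothetical structure for a heuristic inspection report entry.
// Field names and the priority formula are illustrative, not a standard.

type Severity = 1 | 2 | 3 | 4; // 1 = cosmetic ... 4 = blocks task completion

interface UsabilityProblem {
  description: string;
  suggestedRedesign: string;
  severity: Severity;
  occurrenceRate: number;        // proportion of pages/screens affected, 0..1
  userImpact: "low" | "medium" | "high";
  estimatedCostDays?: number;    // optional implementation estimate
}

function priority(p: UsabilityProblem): number {
  const impactWeight = { low: 1, medium: 2, high: 3 }[p.userImpact];
  return p.severity * impactWeight * (0.5 + p.occurrenceRate); // simple ranking score
}

const report: UsabilityProblem[] = [
  {
    description: "Decorative spacer images have verbose alt text",
    suggestedRedesign: "Use empty alt attributes on purely decorative images",
    severity: 3,
    occurrenceRate: 0.8,
    userImpact: "high",
    estimatedCostDays: 1,
  },
];

report.sort((a, b) => priority(b) - priority(a)); // most urgent problems first
console.log(report);
```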

To conduct an expert heuristic inspection, we recommend the following process:

• Form a team of 3-5 specialists to perform individual inspections against list of heuristics
• Studies show that use of 4-5 evaluators results in 80% identification of accessibility problems. More is better!
• Evaluators require guidelines, standards, usability guidelines, and UI design experience
• Perform a comparative analysis of collected data, formulate report
• Implement formal inspection process
• Appoint a moderator to manage the team
• Plan the inspection, conduct kick-off meeting, develop preparation phase for individual reviews, conduct main inspection, end with follow-up phase to assess effectiveness of the inspection
• Plan iterative user testing

To ensure that the inspection results in qualitative accessibility results, the following accessibility heuristics should be reviewed within any given user interface:

• Simple & natural dialogue
• Familiarity (speak the user’s language)
• Minimize the user’s memory load
• Consistency
• Feedback
• Clearly marked exits
• Shortcuts
• Good error messages
• Prevent errors
• Help & Documentation

Conducting Usability Testing

Why conduct iterative user tests?

What advantages do they have over other methods of ensuring accessibility to software or web sites? The fundamental premise for testing users is to validate their experience with a given interface.

We know that automated testing rarely results in capturing true user experience and heuristic evaluations cannot address all usability issues. Expert evaluations cannot fully emulate user experience (unless the inspector has a disability) and task analysis is generally not part of a heuristic evaluation.

On the other hand, the benefits of user testing are many:

• No preconceived notions, expectations, or application prejudices
• Validates user requirements
• Confirms heuristic evaluations
• Builds client confidence.
• Assures AUE (accessible user experience)

Setting up to conduct user testing with individuals with disabilities is not a difficult task. We suggest the following process:

Setup:

• Establish an accessibility center to perform user testing (Hint: Collaborate with disability organizations)
• Recruit sample users and ensure:
• Cross-disability sampling
• Multi-AT configurations
• Variable user experience level (novice, average, expert)

Involve client developers and engineers as observers. Keep in mind that this is not beta testing. You are not looking for programmatic “bugs”. You are looking for user interface design flaws that prevent a user with a disability from using a software application in combination with their assistive technologies.

If permissible, video record sessions. Moderators or test facilitators should also note the following:

• Do not “lead” participants
• Perform iterative studies
• Pay participants and thank them!

Summary

To summarize, the key to accessible user interfaces is to ensure accessible user experience (AUE)

AUE is quantified by combining and conducting two usability inspection methodologies:

  • Heuristic Evaluations
  • User Testing

Engaging users and applying usability inspection methods are the cornerstones for ensuring universal accessibility.

Whilst most of us are now aware of the existence of accessibility statements and logos on websites, Shneiderman and Hochheiser (2001: 367) propose that web designers should also place usability statements on their sites in order to ‘inform users and thereby reduce frustration and confusion’.

User centred design

Alexander (2003b) defines user-centred design (UCD) as a development process, which has three core principles:

1) The first is the focus on users and their tasks.

UCD involves a structured approach to the collection of data from and about users and their tasks. The involvement of users begins early in the development lifecycle and continues throughout it.

2) The second core principle is the empirical measurement of usage of the system.

The typical approach is to measure aspects of ease of use on even the earliest system concepts and prototypes, and to continue this measurement throughout development.

3) The third basic principle of UCD is iterative design.

The development process involves repeated cycles of design, test, redesign and retest until the system meets its usability goals and is ready for release.

The goal of user-centred design is to make a system more usable.

Jacko and Hanson (2002: 1) argue that user profiling, or understanding the specific characteristics and interaction needs of a particular user or group of users with unique needs and abilities, is necessary to truly understand the different interaction requirements of disparate user populations. This understanding of diverse users’ needs will help to increase accessibility.
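
A user profile of the kind Jacko and Hanson describe, or the IBM personas mentioned earlier, might be captured in a record along these lines. The fields are my own guess at what such a profile needs, rather than any published schema.

```typescript
// Hypothetical user profile / persona record for accessibility-aware design.
// Fields are illustrative; a real profiling exercise would be far richer.

interface UserProfile {
  personaName: string;
  disabilities: string[];              // e.g. "low vision", "dyslexia"
  assistiveTechnologies: string[];     // e.g. "screen reader", "screen magnifier"
  inputMethods: string[];              // e.g. "keyboard only", "switch access"
  preferences: {
    fontScale: number;                 // 1 = default size
    highContrast: boolean;
    captionsRequired: boolean;
  };
  typicalTasks: string[];              // the goals the design must support
}

const persona: UserProfile = {
  personaName: "Mature distance learner with low vision",
  disabilities: ["low vision"],
  assistiveTechnologies: ["screen magnifier"],
  inputMethods: ["keyboard and mouse"],
  preferences: { fontScale: 1.5, highContrast: true, captionsRequired: false },
  typicalTasks: ["download weekly course notes", "post to the tutor group forum"],
};

console.log(persona);
```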

User centred design in the field of disability and accessibility is attractive because it places people with disabilities at the centre of the design process, a place in which they have not traditionally been. As Luke (2002) notes: ‘When developers consider technical and pedagogical accessibility, people with physical and/or learning disabilities are encouraged to become producers of information, not just passive consumers.’

A range of studies report being user- or learner-centred in their design approach and in doing so have employed a range of techniques:

  • User testing (Smith 2002; Alexander 2003b; Gay and Harrison 2001; Theofanos and Redish 2003);
  • User profiling (Keller et al. 2001)
  • User evaluations (Pearson and Koppi 2001).

‘Networks such as the Web, Intranets or dedicated broadband networks are being used to teach, to conduct research, to hold tutorials, to submit assignments and to act as libraries. The primary users of Web based instruction include universities, professional upgrading, employment training and lifelong learning. Those that can benefit most from this trend are learners with disabilities. Web based instruction is easily adapted to varying learning styles, rates, and communication formats. Issues of distance, transportation and physical access are reduced. Electronic text, unlike printed text, can be read by individuals who are blind, vision impaired, dyslexic and by individuals who cannot hold a book or turn pages.’ Gay and Harrison (2001)

User Training:

The eight participants received, in addition to basic Internet skills, training from ATRC staff on their respective assistive technologies, and training from the CAT staff on basic web based instructional design. Training occurred over a period of five to eight 3-hour sessions depending on the severity and type of disability, the type of assistive technology being used, and the participants experience with computers and the Internet. It was expected that most of the participants would be novice users requiring instruction in technologies and the Internet. Gay and Harrison (2001)

Instrument development:

The instrument used to assess the accessibility of courseware tools was based on the WAI Accessibility Guidelines, which outlines recommendations for creating accessible Web based documents (Treviranus et al , 1999).

Observer Training: an independent observer was trained to identify and distinguish between difficulties associated with the use of assistive technologies, and those associated with access to courseware tools.    Gay and Harrison (2001)

User Testing:

Each of the participants, after being trained on their respective assistive technologies, were observed to record their ease or difficulty in accessing each of the tools outlined by Landon (1998). In five to ten three hour sessions (depending on the rate at which the participant progresses), the trained observer guided each participant through participation in a Web based course module using each of the courseware tools being assessed. A module is synonymous with a weekly class meeting in the traditional educational setting, providing course notes, discussions, resources, exercises, and testing. Gay and Harrison (2001)

However, Bilotta (2005) warns that if accessibility guidelines are translated too literally, there is a risk that unsuccessful sites will be created that ignore the intersection of UCD and the needs and tasks of people with disabilities.

Summary

This timely presentation proposes “Case-Study” based approaches to best practices for creating highly usable accessible web environments and a model for evolution of standards creation. Bilotta (2005)

Introduction

Section 508 of the Rehabilitation Act along with an abundance of other accessibility laws, policies, standards and guidelines are rapidly heightening public awareness of the possibility of an accessible web. These laws and guidelines, coupled with an increased clarification of potential market share as well as encroaching litigation have led individuals, private companies, industry giants, government agencies and educational institutions alike to voraciously pursue accessibility practices for their online content. Bilotta (2005)

The Interpretation Problem

Though awareness of access laws is at an all time high due to mainstream publicity, success in implementing usable and useful accessible web content still remains low. Section 508 and the W3C’s Web Content Accessibility Guidelines (WCAG) are free and readily available for anyone with time to read them, yet the interpretation and implementation process of these laws and guidelines often produces further access barriers. Industry and academia alike, mandated by either the Federal Government or a Business Executive, are attempting to comply with some sort of accessibility standard. Bilotta (2005)

The (Mis)Informed Decision

Which level to choose? What laws does my product fall under? Which guidelines will be most beneficial? Will any law conflict with our business model? These are some of the questions that decision-makers face. Surprisingly, the decision is often based only on surface investigation of the information available. Often, the person implementing a company-wide access is someone who knows little or nothing about accessibility policy or the technology solutions necessary to comply with Federal Law. Bilotta (2005)

The (Over) Implementation

Once the decision has been made as to which standards, laws or guidelines to conform to, the entire course of action is too often transferred to the development team. The implementation of the accessibility standards are left up to a group of developers being told that they must do this work, but not how and certainly not “why.” Developers faced with the challenge of generating accessible web content, attempt to translate these standards into solutions without clearly understanding the goal of their efforts: Improving the everyday experiences of people with disabilities through combination of accessible products, services, and Assistive Technology (AT). It remains a rare occurrence that a database developer has any experience or knowledge of an Operating System Reader such as JAWS or other alternate web browsing strategies. Bilotta (2005)

The Letter of the Law

It is at this point in the process when the well-intended distributions of laws and guidelines becomes one of the largest access barriers for web content. In push to conform absolutely to the technical standards of accessibility, web developers too often create unsuccessful sites by ignoring the intersection of User Centered Design and the needs and tasks of people with disabilities. The User has now been replaced by a set of suggestions from a strict and faceless organization. It is now up to the developer to act as a policy interpreter, which in almost every case, means following each and every guideline precisely as applied to each and every instance possible. The translation of the guidelines can be extremely literal, there is no attempt to consider which standard is the most usable, useful and inclusive for my target audience. Following the WCAG is seen as a bullet-proof accessibility solution. The logic is: “If I comply with the guidelines, My web site is accessible.”  Bilotta (2005)

The relationship between universal, usable and user-centred design approaches.

For many, there are clear overlaps between the three design approaches.

For example Bilotta (2005) claims that accessible web design is nothing more than the logical extension of the principles of both user centered design and universal design. The three approaches are connected in that all three embrace users’ needs and preferences.

For example, in discussing universal design, Burzagli et al. (2004: 240) place an importance on user needs: ‘Design for All can constitute a good approach, as in accordance with its principles, it embraces user needs and preferences, devices and contexts of use in a common operative platform.’

As does Smith (2002: 52), when discussing a user-centred design approach to developing a VLE interface for dyslexic students: ‘This design approach places an emphasis on understanding the needs of the user and it seems to have produced an interface that is appropriate for both dyslexic and non-dyslexic users.’

Perhaps the easiest way to see the connection between the three approaches is to view usability and universal design as specific implementations of user-centred design, where:

  • a traditional usability approach employs a quite specific interpretation of who the end-users are;
  • a universal design approach employs a broader interpretation of who the end-users are, an interpretation that focuses on how the user maps onto the target population in terms of functional capability as well as skill and experience (Keates and Clarkson 2003).

Designing for adaptability

Some research studies that claim to be designing with accessibility in mind have focused their attention on designing for adaptability.

This can mean one of two things:

  • allowing the user to configure the application to meet their needs (Owens and Keller 2000; Arditi 2004);
  • enabling the application to make adaptations to transform seamlessly in order to meet the needs of the user (Stephanidis et al. 1998; Cooper et al. 2000; Hanson and Richards 2004; Alexandraki et al. 2004).

People have different needs when seeking to use a computer to facilitate any activity.

Factors here extend beyond their physical and sensory capabilities and include needs arising from:

  • preferred learning/working styles (activist / reflective)
  • left or right handedness
  • gender
  • the environment they are working in (e.g. office, train, while driving)
  • other cognitive loads they have to deal with simultaneously

Cooper et al. 2000

Increasingly the WWW is being used to host complex applications in fields as diverse as scientific enquiry, education, e-commerce and entertainment. For the purposes of this paper we can define a complex application as one where a user has to receive multiple sources of information from a web server or several servers and is able to interact with the application in a variety of ways. This would normally take the form of an interactive multimedia application similar to that which has been traditionally produced for off-line delivery on CD-ROM. Now the very complexity and multi-modality of such applications presents a particular challenge in seeking to design user interfaces for them that are accessible to people with different disabilities. Cooper et al. 2000

Both approaches are reliant on detailed user profiles and in that sense there is similarity with usability and user-centred design approaches.

Designing applications that make automatic adaptations appears to be a more prevalent design approach at the moment, however. For example, Alexandraki et al. (2004) describe the ‘eAccessibility Engine’, a tool which employs adaptation techniques to automatically render web pages accessible to users with different types of disabilities. Specifically, the ‘eAccessibility Engine’ is capable of automatically transforming web pages to attain conformance to Section 508 standards and ‘AAA’ conformance to the Web Content Accessibility Guidelines.

The proposed tool is intended for use as a web-based service and can be applied to any existing website. The researchers explain that users are not necessarily aware of the presence of the eAccessibility Engine: if users do not wish to modify their initial selection of an ‘accessibility profile’, they need to interact directly with the tool only once. Paciello (2000: 21) describes the move towards such personalization as the ‘Third Wave’, arguing that: ‘This is the essence of true personalization – Web design that ensures accessibility for every user by adapting to the user’s preferences’.
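The cited papers do not include the eAccessibility Engine’s code, but the general idea of profile-driven adaptation can be sketched briefly. The snippet below is a minimal illustration in Python using BeautifulSoup; the profile keys and the particular transformations are my own assumptions, not the tool’s actual rules.

```python
# Minimal sketch of profile-driven page adaptation in the spirit of the
# 'eAccessibility Engine'. The profile keys ("low_vision", "screen_reader") and the
# transformations applied are illustrative assumptions, not the tool's actual rules.
from bs4 import BeautifulSoup

def adapt_page(html: str, profile: dict) -> str:
    soup = BeautifulSoup(html, "html.parser")

    if profile.get("low_vision"):
        # Inject a style override that enlarges text and forces a high-contrast palette.
        style = soup.new_tag("style")
        style.string = "body { font-size: 150%; background: #000; color: #ff0; }"
        (soup.head or soup).append(style)

    if profile.get("screen_reader"):
        # Make sure every image is at least announced, even if only with a placeholder.
        for img in soup.find_all("img"):
            if not img.get("alt"):
                img["alt"] = "[no description provided]"

    return str(soup)

print(adapt_page("<html><head></head><body><img src='logo.png'></body></html>",
                 {"low_vision": True, "screen_reader": True}))
```

A real service would sit between the browser and the original site and apply a much richer rule set, but the principle is the same: the page is transformed once the user’s ‘accessibility profile’ is known.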

(The above expressed in Simple Mind Map 1)

Designing with an awareness of assistive technology

Some definitions of accessibility stress access by assistive technologies (Caldwell et al. 2004).

WCAG 2.0 Layers of Guidance

The individuals and organizations that use WCAG vary widely and include Web designers and developers, policy makers, purchasing agents, teachers, and students. In order to meet the varying needs of this audience, several layers of guidance are provided including overall principles, general guidelines, testable success criteria and a rich collection of sufficient techniques, advisory techniques, and documented common failures with examples, resource links and code.

Principles

At the top are four principles that provide the foundation for Web accessibility:

  1. perceivable
  2. operable
  3. understandable
  4. robust.

See also Understanding the Four Principles of Accessibility.

Guidelines

Under the principles are guidelines. The 12 guidelines provide the basic goals that authors should work toward in order to make content more accessible to users with different disabilities. The guidelines are not testable, but provide the framework and overall objectives to help authors understand the success criteria and better implement the techniques.

Success Criteria

For each guideline, testable success criteria are provided to allow WCAG 2.0 to be used where requirements and conformance testing are necessary such as in design specification, purchasing, regulation, and contractual agreements. In order to meet the needs of different groups and different situations, three levels of conformance are defined: A (lowest), AA, and AAA (highest). Additional information on WCAG levels can be found in Understanding Levels of Conformance.

Sufficient and Advisory Techniques

For each of the guidelines and success criteria in the WCAG 2.0 document itself, the working group has also documented a wide variety of techniques.

The techniques are informative and fall into two categories:

1) those that are sufficient for meeting the success criteria

2) those that are advisory.

The advisory techniques go beyond what is required by the individual success criteria and allow authors to better address the guidelines. Some advisory techniques address accessibility barriers that are not covered by the testable success criteria. Where common failures are known, these are also documented. See also Sufficient and Advisory Techniques in Understanding WCAG 2.0.

Caldwell et al. 2004
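One way to see how these layers fit together is to model them as nested data. The sketch below uses class names of my own choosing; the single example entry (success criterion 1.1.1 ‘Non-text Content’, Level A, under guideline 1.1 ‘Text Alternatives’) is taken from the final WCAG 2.0 Recommendation rather than the 2004 working draft quoted above.

```python
# A sketch of WCAG 2.0's layered structure as a data model. The class names are mine;
# conformance testing happens at the success-criterion layer, so a checker would walk
# principles -> guidelines -> criteria and report pass/fail per criterion.
from dataclasses import dataclass, field

@dataclass
class SuccessCriterion:
    number: str
    name: str
    level: str          # "A", "AA" or "AAA"

@dataclass
class Guideline:
    number: str
    name: str
    criteria: list = field(default_factory=list)

@dataclass
class Principle:
    name: str
    guidelines: list = field(default_factory=list)

perceivable = Principle(
    name="Perceivable",
    guidelines=[Guideline("1.1", "Text Alternatives",
                          [SuccessCriterion("1.1.1", "Non-text Content", "A")])],
)

for g in perceivable.guidelines:
    for sc in g.criteria:
        print(f"{sc.number} {sc.name} (Level {sc.level})")
```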

Accessibility design approaches such as universal design also stress access by assistive technologies (Thompson 2005).

Furthermore, accessibility guidelines and standards such as the WCAG-1 stress access by assistive technologies. For example, guideline nine advises designers to use features that enable activation of page elements via a variety of input devices.

There is a strong case therefore for learning technologists needing to have an awareness of assistive technology in order to avoid situations where users of such technologies cannot ‘conduct Web transactions because the Web environment does not support access functionality’ (Yu 2003: 12).

Colwell et al. (2002: 75) argue that some guidelines are difficult for developers to apply if they have ‘little knowledge of assistive technology or [of] people with disabilities’.

Similarly, Bilotta (2005) notes that developers faced with the challenge of generating accessible web content attempt to translate these standards into solutions without clearly understanding the goal of their efforts … It remains a rare occurrence that a database developer has any experience or knowledge of an Operating System Reader such as JAWS or other alternate web browsing strategies.

(The above expressed in the H810 Evaluating Accessibility for e-learning SimpleMind)

Accessibility tools

Four types of accessibility tool exist at the moment:

  1. filter and transformation tools;
  2. design and authoring tools;
  3. evaluation tools;
  4. evaluation and repair tools.

Filter and transformation tools assist web users more than developers and either modify a page or supplement an assistive technology or browser.

Examples include the BBC Education Text to Speech Internet Enhancer (BETSIE), which creates automatic text-only versions of a website, and the Kit for the Accessibility of the Internet (KAI). Gonzalez et al. (2003) describe how KAI can offer a global indicator of accessibility to end users (particularly blind people) at the moment of entering a site, as well as filter, repair and restructure the site according to their needs.
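BETSIE itself is not needed to see the principle. The sketch below illustrates the filter-and-transform idea in Python: strip scripts, styles and images and emit a linearised, text-only rendering of the page. It is an illustration of the approach only, not BETSIE’s actual algorithm.

```python
# A minimal sketch of the filter/transformation idea behind tools such as BETSIE:
# strip scripts, styles and images and return a linearised, text-only rendering.
from bs4 import BeautifulSoup

def text_only(html: str) -> str:
    soup = BeautifulSoup(html, "html.parser")

    # Remove purely presentational or non-textual content.
    for tag in soup(["script", "style", "object", "iframe"]):
        tag.decompose()

    # Replace images with their text alternative (or a marker if none is given).
    for img in soup.find_all("img"):
        img.replace_with(img.get("alt") or "[image]")

    # Keep link text followed by the target URL so the page remains navigable as text.
    for a in soup.find_all("a", href=True):
        a.replace_with(f"{a.get_text(strip=True)} <{a['href']}>")

    return "\n".join(line for line in soup.get_text("\n").splitlines() if line.strip())

print(text_only("<p>See the <a href='/news'>news page</a> <img src='x.gif'></p>"))
```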

The BBC have moved away from BETSIE and at the time of writing (May 2010) they are developing a new approach. This assists people to set their browser options and BBC website preferences and these are saved between sessions.

My Web My Way

The BBC were trialling a new tool called ‘MyDisplay’, which allows the user to choose from a range of pre-set themes and allows them to save the chosen theme in a BBC account so that it can be accessed from any computer.

Design and authoring tools that currently exist appear to fall into two categories:

1) Multimedia design tools (Linder and Gunderson 2003; Regan 2005a)

The Power Point HTML Accessibility Plug-in allows Microsoft PowerPoint presentations to be published in an accessible HTML format for WWW. The tool automatically generates text equivalents for common PPT objects and supports the instructor in generating text alternatives for images that are used in educational settings like pie, bar and scatter point charts. The HTML markup generated exceeds current Section 508 requirements and meets W3C Web Content Accessibility Requirements Double-A conformance. (Linder and Gunderson 2003)

Web Accessibility is as much a set of core practices as it is an abstract concept. Understanding the techniques does not necessarily help those that build websites to arrive at a comprehensive understanding of the issue of accessibility. For this reason, it is important that the tools used by professional web designers and web developers take web accessibility into account at every available opportunity. This helps individuals with limited understanding of accessibility to make good decisions at the appropriate points in the design process.

Accessibility Preferences Options

Macromedia Dreamweaver MX 2004 allows you to set preferences that prompt you to provide accessibility information as you’re building the page. By activating options in the Preferences dialog box, you’ll be prompted to provide accessibility-related information for form objects, frames, media, images, and tables as each element is inserted in a page. (Regan 2005a)

2) Tools that simulate disability (Tagaki et al. 2004; Saito et al. 2005).

This session introduces a disability simulator “aDesigner”, which helps Web designers ensure that the Web content they develop is accessible and usable by visually impaired users. This presentation is designed to show the audience the functions of aDesigner and how easily and effectively the Web designers can analyze Web pages and fix the problems to improve accessibility and usability. Tagaki et al.2004

Web accessibility means access to the World Wide Web for everyone, including people with disabilities and senior citizens. Ensuring Web accessibility improves the quality of the lives of such people by removing barriers that prevent them from taking part in many important life activities. Tagaki et al.2004

The disability simulator aDesigner helps you make sure that the Web content you develop is accessible and usable by visually impaired users and also checks that it complies with accessibility guidelines. By simulating or visualizing Web pages, aDesigner helps you understand how low vision or blind users experience content on a given Web page, and it also performs a checking function for two types of problems: (a) problems such as redundant text or the appropriateness of alternate text and (b) lack of compliance with accessibility guidelines. Tagaki et al.2004

The tool aDesigner overcomes the limitations of existing accessibility tools by helping Web authors (many of whom are young and have good vision) to learn about the real accessibility issues. It simulates two types of disabilities, low vision and blindness, and automatically detects accessibility and usability problems on a page. You can specify the guidelines that you want aDesigner to apply in its review (Section 508, IBM Corporate Instruction 162, JIS (Japanese Industrial Standards) X 8341-3, and three priority levels of the Web Content Accessibility Guidelines). Tagaki et al.2004
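aDesigner works on rendered pages inside its own environment, but the ‘low vision’ simulation idea can be approximated on a screenshot of a page. The sketch below uses Pillow; the blur radius and the contrast and colour factors are arbitrary illustrative values, not aDesigner’s calibrated model of eyesight.

```python
# A sketch of the "low vision" simulation idea used by tools such as aDesigner,
# applied to a page screenshot. The parameter values are illustrative only.
from PIL import Image, ImageEnhance, ImageFilter

def simulate_low_vision(screenshot_path: str, out_path: str) -> None:
    img = Image.open(screenshot_path).convert("RGB")
    img = img.filter(ImageFilter.GaussianBlur(radius=3))   # reduced acuity
    img = ImageEnhance.Contrast(img).enhance(0.6)           # reduced contrast sensitivity
    img = ImageEnhance.Color(img).enhance(0.4)              # reduced colour perception
    img.save(out_path)

# simulate_low_vision("page.png", "page_low_vision.png")
```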

The number of design tools that exist, however, is tiny compared to the array of evaluation, validation and repair tools that have been developed.

Evaluation tools conduct a static analysis of web pages and return a report or rating, whilst repair tools identify problems and recommend improvements (Chisholm and Kasday 2003).

Transitioning Web accessibility policies to take advantage of WCAG 2.0

Many governments and organizations around the world have taken measures to ensure that their Web sites are accessible to people with disabilities [3]. In the majority of cases they have either directly adopted or referenced W3C’s Web Content Accessibility Guidelines 1.0 (WCAG 1.0). In some cases they have also referenced the Authoring Tool Accessibility Guidelines 1.0 (ATAG 1.0) and the User Agent Accessibility Guidelines (UAAG 1.0) in order to bring about a more comprehensive Web accessibility solution. Other governments and organizations have developed local standards, regulations, or guidelines based indirectly on WCAG 1.0. (Chisholm and Kasday 2003).

See also

“Web Content Accessibility Guidelines 2.0,” B. Caldwell, W. Chisholm, G. Vanderheiden, and J.White, eds., W3C Working Draft, 30 July 2004, at http://www.w3.org/TR/WCAG20/

Validation tools such as the W3C HTML Validation Service, which check HTML and Cascading Style Sheets (CSS), are often included as evaluation tools because validating to a published grammar is considered by some to be the first step towards accessibility (Smith and Bohman 2004).

Evaluation and repair tools

Evaluation tools can be categorized as either general or focused: general evaluation tools perform tests for a variety of accessibility issues, whilst focused tools test for a single or a limited set of accessibility issues.

Evaluation and repair tools generally focus on evaluating content, although some do evaluate user agents (Gunderson and May 2005). Most tools see accessibility as an absolute, but some look at the degree of accessibility (Hackett et al. 2004). Most tools use WCAG–1 or Section 508 as their benchmark, but as guidelines are continually changing, one or two are designed to check against the most recent guidelines (Abascal et al. 2004).

Finally, most of the tools are proprietary.

Standard referenced evaluation tool

There is a range of tools that identify accessibility problems, mostly by following the guidelines of Section 508 and WCAG–1.

Examples include

  • AccessEnable
  • LIFT
  • Bobby

Bobby is the most well-known automatic accessibility checking tool. It was originally developed by the Center for Applied Special Technology but is now distributed through Watchfire. Bobby tests online documents against either the WCAG-1 or Section 508 standards. A report is generated that highlights accessibility checks triggered by the document mark-up. Some checks can be identified directly as an error; others are flagged as requiring manual attention. If tested against WCAG-1, the report lists priority one, two and three accessibility user checks in separate sections. Bobby provides links to detailed information as to how to rectify accessibility checkpoints and the reason for each checkpoint, and provides accessibility guideline references for both WAI and Section 508 guidelines as appropriate. If all priority one checkpoints raised in the report are dealt with, the report for the revised page should indicate that the page has been granted ‘Bobby approved’ status.

Whilst Bobby is the most well-known and perhaps most used evaluation tool, it has not been without its problems.

For example, some experts have pointed out that Bobby and the Bobby logo can be used inappropriately to indicate that a site is accessible, when in some people’s perception it is not (Witt and McDermott 2002).

While Bobby will detect a missing text description for an image, it is the developer who is responsible for annotating this image with meaningful text. Frequently, an image has a meaningless or misleading text description though the validation tool output states that the page is accessible.

(Witt and McDermott 2002: 48)
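This limitation is easy to reproduce. The sketch below shows a checker that can detect a missing alt attribute automatically but can only flag suspicious text (here, anything that looks like a filename) for manual review; deciding whether the text is meaningful remains a human judgement. The heuristic is my own, not Bobby’s rule set.

```python
# A sketch of the limitation Witt and McDermott describe: an automated check can spot
# a missing alt attribute, but it cannot judge whether the supplied text is meaningful.
import re
from bs4 import BeautifulSoup

def check_images(html: str):
    soup = BeautifulSoup(html, "html.parser")
    errors, manual_checks = [], []
    for img in soup.find_all("img"):
        alt = img.get("alt")
        if alt is None:
            errors.append(f"Missing alt attribute: {img.get('src')}")
        elif re.fullmatch(r"[\w-]+\.(gif|jpg|jpeg|png)", alt, re.I):
            # Looks like a filename, so probably meaningless, but only a human can decide.
            manual_checks.append(f"Check alt text '{alt}' on {img.get('src')}")
        else:
            manual_checks.append(f"Confirm alt text '{alt}' describes {img.get('src')}")
    return errors, manual_checks

errors, manual = check_images('<img src="menu.gif" alt="menu.gif"><img src="home.gif">')
print(errors)   # automated failure: missing alt on home.gif
print(manual)   # human judgement still needed for menu.gif
```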

Watchfire Bobby is no longer available. A web search will find current automated tools, some of which are add-ons to particular web browsers. One tool, WAVE (WebAIM, 2010), works in a similar way to Bobby. Other tools mentioned in this chapter may also have been replaced, but their replacements raise similar principles and problems.

Tools that identify and prioritize problems that need repairing

Many automatic evaluation tools not only indicate where the accessibility problem is, but also give designers directions for conforming to violated guidelines. Although they can also be called repair tools, almost none perform completely automatic corrections. For this reason, it is probably more appropriate to identify them as tools that assist the author in identifying and prioritizing problems. Although the tools mentioned in the previous section offer designers some help in correcting problems, there are tools that fulfil this goal more efficiently. Examples include A-Prompt and STEP 508.

A-Prompt (Accessibility Prompt) was developed by the Adaptive Technology Resource Centre in Canada.

A-Prompt first evaluates an HTML web page to identify accessibility problems; it then provides the web author with a ‘fast and easy way’ to make the necessary repairs. The tool’s evaluation and repair checklist is based on WCAG-1. A-Prompt allows the author to select a file for validation and repair, or select a single HTML element within a file. The tool may be customized to check for different conformance levels. If an accessibility problem is detected, A-Prompt displays the necessary dialogs and guides the user to fix the problem. Many repetitive tasks are automated, such as the addition of ALT-text or the replacement of server-side image maps with client-side image maps. When all potential problems have been resolved, the repaired HTML code is inserted into the document and a new version of the file may be saved to the author’s hard drive. After a web page has been checked and repaired by A-Prompt, it will be given a WAI Conformance ranking.
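The workflow described above, evaluate, prompt the author, then write the repaired file, can be sketched as follows. This illustrates the general evaluate-and-prompt loop only; it is not how A-Prompt itself was implemented.

```python
# A sketch of the evaluate-then-prompt repair loop described for A-Prompt: detect a
# problem, ask the author to supply the fix, then write the repaired document.
from bs4 import BeautifulSoup

def repair_missing_alt(in_path: str, out_path: str) -> None:
    with open(in_path, encoding="utf-8") as f:
        soup = BeautifulSoup(f.read(), "html.parser")

    for img in soup.find_all("img"):
        if img.get("alt") is None:
            # The tool automates the mechanics; the author still supplies the meaning.
            img["alt"] = input(f"Alt text for {img.get('src')} (empty if decorative): ")

    with open(out_path, "w", encoding="utf-8") as f:
        f.write(str(soup))

# repair_missing_alt("page.html", "page_repaired.html")
```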

Theofanos et al. (2004) and Kirkpatrick and Thatcher (2005) describe a free tool developed by the National Center for Accessible Media (NCAM), called STEP 508.

A free tool developed by the National Center for Accessible Media (NCAM) at WGBH in Boston, called STEP508 was developed in 2003 to address this need. STEP508 is capable of handling data from Watchfire’s Bobby, Parasoft’s WebKing, and UsableNet’s LiftOnline, but is limited to prioritizations of evaluations where Section 508 standards were the basis for testing.

In 2005 NCAM has developed a new version of this tool with funding from the General Services Administration with additional importing, exporting, and reporting capabilities, and with the ability to test against the W3C’s Web Content Accessibility Guidelines. Like its predecessor, the new version allows site administrators to prioritize accessibility errors and quickly generate reports that provide several views of the errors. Errors are prioritized by several factors including ease of repair, page “importance” (determined by page traffic and other criteria), and impact on users. With a sorted list in hand, repair efforts are focused on pages and errors that are the best targets for repair first.

Kirkpatrick and Thatcher (2005)

STEP 508 is a tool for analysing evaluation data from popular web accessibility evaluation tools, such as Watchfire’s Bobby and UsableNet’s LIFT. STEP does not perform the accessibility evaluation, but examines the results of an evaluation and creates a report, which prioritizes the results based on the severity and repairability of the errors found, as well as on the importance of the pages where errors were identified. When it was originally designed in 2003, it was limited to prioritizations of evaluations where Section 508 standards were the basis for testing. However, in 2005 NCAM developed a new version of this tool with additional importing, exporting and reporting capabilities, and with the ability to test against the WCAG–1.

Like its predecessor, the new version allows site administrators to prioritize accessibility errors and quickly generate reports that provide several views of the errors. Errors are prioritized by several factors including ease of repair, page ‘importance’ (determined by page traffic and other criteria) and impact on users. With a sorted list in hand, repair efforts are focused first on pages and errors that are the best targets for repair. In addition to prioritizing errors, STEP allows for easy comparisons between accessibility evaluation tools, to help site administrators understand how the outputs from the various tools differ.
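The prioritisation idea can be sketched as a simple scoring function: rank each reported error by user impact and page importance, discounted by how hard it is to repair. The field names, weights and example data below are illustrative assumptions, not STEP 508’s actual model.

```python
# A sketch of STEP-style prioritisation: rank reported errors by user impact and page
# importance, favouring fixes that are also easy to make. Values are invented.
errors = [
    {"page": "/",        "issue": "missing alt on nav buttons", "impact": 3, "traffic": 9000, "repair_effort": 1},
    {"page": "/archive", "issue": "missing alt on spacer gifs", "impact": 1, "traffic": 120,  "repair_effort": 1},
    {"page": "/search",  "issue": "form fields unlabelled",     "impact": 3, "traffic": 4000, "repair_effort": 2},
]

def priority(err: dict) -> float:
    # Higher impact and busier pages raise the score; harder repairs lower it.
    return err["impact"] * err["traffic"] / err["repair_effort"]

for err in sorted(errors, key=priority, reverse=True):
    print(f"{priority(err):>8.0f}  {err['page']:<10} {err['issue']}")
```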

Validity and reliability of evaluation and repair tools

With so many tools to choose from, accessibility designers need to be able to assess their validity and reliability. Brajnik (2001) argues that the testing tools have to be validated in order to be truly useful. A crucial property that needs to be assessed is the validity of the rules that the tools apply. A valid rule is correct (never identifies false positives) and complete (never yields false negatives). Brajnik (2001) argues that rule completeness is extremely difficult to assess, however, and considers that assessing the correctness of a rule is more viable and can be achieved through one of three methods:

  1. comparative experiments;
  2. rule inspection and testing;
  3. page tracking.

Page tracking entails the repeated use of an automatic tool. If a site is analysed two or more times, each evaluation generates a set of problems. If, on subsequent evaluations, a problem disappears (due to a change in the underlying web page/website) it is assumed that this maintenance action on the part of the webmaster was prompted by the rule showing the problem. Brajnik argues that this method provides an indirect way to determine the utility of the rule, a property that is closely related to rule correctness. A rule that is useful (i.e. its advice is followed by a webmaster) is, according to Brajnik, also likely to be correct.
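Brajnik’s correctness and completeness can be thought of as precision and recall against an expert’s manual findings. The sketch below makes that comparison explicit; the example findings are invented for illustration.

```python
# A sketch of Brajnik's correctness/completeness idea: compare the problems a tool
# reports against those an expert finds by hand. Correctness falls with false
# positives, completeness with false negatives.
tool_findings   = {"img-no-alt:home", "table-no-headers:results", "blink-used:home"}
expert_findings = {"img-no-alt:home", "table-no-headers:results", "no-keyboard-access:menu"}

true_positives  = tool_findings & expert_findings
false_positives = tool_findings - expert_findings
false_negatives = expert_findings - tool_findings

correctness  = len(true_positives) / len(tool_findings)     # 1.0 means no false positives
completeness = len(true_positives) / len(expert_findings)   # 1.0 means no false negatives

print(f"correctness={correctness:.2f} completeness={completeness:.2f}")
print("false positives:", false_positives)
print("false negatives:", false_negatives)
```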

Interestingly, in another study that examined developers’ responses to the results of automated tools, Ivory et al. (2003) found that developers who relied on their own expertise to modify sites produced sites that yielded some performance improvements.

This was not the case for sites produced with evaluation tools. Even though the automated tools identified significantly more potential problems than the designers had identified by themselves, designers made more design changes when they did not use an automated tool.

A common method of testing the reliability of tools is to conduct tests where the results of different tools are compared against each other and sometimes also against a manual evaluation.

Studies reveal mixed results. For example in a comparison between Bobby and LIFT in which tool completeness, correctness and specificity were analysed, LIFT was clearly the better performer (Brajnik 2004). Brajnik was able to conclude that:

  • LIFT generated fewer false positives than Bobby;
  • LIFT generated fewer false negatives than Bobby.

For the six most populated checkpoints, LIFT had a number of (automatic and manual) tests that was equal to or greater than the corresponding number for Bobby.

Other comparative studies have found it harder to come out with a clear winner:

  • Faulkner and Arch (2003) reviewed AccVerify 4.9, Bobby 4.0, InFocus 4.2 and PageScreamer 4.1;
  • Lazar et al. (2003) evaluated the reliability of InFocus, A-Prompt and Bobby;
  • Diaper and Worman (2003) report the results of comparing Bobby and A-Prompt;
  • Lazar et al. concluded from their study that accessibility testing tools are ‘flawed and inconsistent’;
  • whilst Diaper and Worman (2003) warn that any overhead saved using automatic tools will be spent in interpreting their results.

There must be a strong temptation for organisations to rely more on accessibility assessment tools than they should, because the tools are supposed to encapsulate and apply knowledge about the WCAGs and checkpoints. Accessibility tools have their own knowledge overheads concerning how to use the tools and, vitally, how to interpret their outputs. Indeed, it may well be that at present the tools’ related knowledge is additional to a sound understanding of the WCAG by the tools’ users, i.e. you need to know more to use the tools, not less.

Choosing an evaluation or repair tool

Decisions about which evaluation or repair tool to choose will depend on a number of factors, including whether:

  • the designer wants to focus on general or specific accessibility issues;
  • the designer wants to test against WCAG–1, Section 508 or both;
  • the tools are considered to be operating valid rules;
  • the tools are considered to be reliable;
  • the websites to be evaluated are small or large (O’Grady and Harrison 2003);
  • the designer is working as an individual or as part of a larger corporate organisation (Brajnik 2001);
  • the designer wants usability testing to be an integral part of the test (Brajnik 2000).

O’Grady and Harrison (2003) reviewed A-Prompt, Bobby, InFocus and AccVerify.

The review process examined functionality, platform availability, demo version availability and cost. The functional issues examined included the installation process, availability of tech support, standards the product validates and the software’s response to a violation of the checkpoint items. Each software tool was compared on a series of practical and functional indicators. O’Grady and Harrison concluded that the best tools for small to medium sites were A-Prompt and Bobby, whilst for large or multiple sites, InFocus and AccVerify were better choices.

However, Brajnik (2001) argues that comparing these tools can be unfair given their different scope, flexibility, power and price. LIFT is targeted at an enterprise-level quality assurance team and costs from $6,000 upwards, whilst Bobby is available for free and is targeted at a single individual wanting to test a relatively limited number of pages (Watchfire initially did make charges when it took over Bobby, but it has recently announced that it will transition the Bobby online service to WebXACT, a free online tool).

Problems with evaluation and repair tools

In addition to problems of reliability, many researchers and practitioners have identified further potential problems with the use of evaluation and repair tools including:

  • difficulties with judging the severity of the error (Thompson et al. 2003);
  • dangers of inducing a false sense of security (Witt and McDermott 2004);
  • dangers of encouraging over-reliance (SciVisum 2004);
  • difficulties understanding the results and recommendations (Faulkner and Arch 2003; Wattenberg 2004).

These reasons have led many to conclude that some element of manual testing or human judgement is still needed.

For example, Thompson et al. (2003) argue that one shortcoming of the automated tools is their inability to take into account the ‘severity’ of an identified accessibility error. If Site A is missing ALT tags on its spacer images, and Site B is missing ALT tags on its menu buttons, both sites are rated identically, when according to Thompson et al., Site B clearly has the more serious accessibility problem.

They conclude that ‘Some of the web-content accessibility checkpoints cannot be checked successfully by software algorithms alone. There will still be a dependence on the user’s ability to exercise human judgment to determine conformance to the guidelines.’
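A checker could at least weight its findings by the likely function of the image, for example treating an undescribed linked menu button as more serious than an undescribed spacer. The heuristics and severity levels in the sketch below are illustrative only and, as Thompson et al. point out, human judgement is still needed to confirm them.

```python
# A sketch of severity weighting along the lines Thompson et al. describe: a missing
# alt on a functional (linked) image matters more than one on a layout spacer.
from bs4 import BeautifulSoup

def severity_report(html: str):
    soup = BeautifulSoup(html, "html.parser")
    report = []
    for img in soup.find_all("img"):
        if img.get("alt") is not None:
            continue
        src = (img.get("src") or "").lower()
        if img.find_parent("a") is not None:
            report.append(("HIGH", src, "linked image with no alt; navigation is lost"))
        elif "spacer" in src or "blank" in src:
            report.append(("LOW", src, "probable layout spacer with no alt"))
        else:
            report.append(("MEDIUM", src, "content image with no alt"))
    return report

html = '<a href="/courses"><img src="menu_courses.gif"></a><img src="spacer.gif">'
for level, src, note in severity_report(html):
    print(level, src, note)
```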

What do you check and why? What assumptions do you make about the purpose behind a webpage?

The following websites were consistently among the most requested sites for the sample universities. It can be reasonably assumed that these would be sites that would be frequently requested at most universities. The perceived function of each website is included in parentheses.

Perceived function plays an important role in our research method as noted in the Methods section of this paper.

  1. University home page (Read and navigate the entire page without missing important information.);
  2. Search page (Search for any term and read the results.);
  3. List of university colleges, departments, and/or degree programs (Read and navigate the entire page without missing important information.);
  4. Campus directory of faculty, staff and/or students (Look up a name and read the results.);
  5. Admissions home page (Read and navigate the entire page without missing important information.);
  6. Course listings (Find a particular course listing and read all information about that course.);
  7. Academic calendar (Read and navigate the entire calendar without missing important information.);
  8. Employment home page (Read and navigate the entire page without missing important information.);
  9. Job listings (Read and navigate the entire page without missing important information.);
  10. Campus map (Locate a particular location on campus and identify how to get there.);
  11. Library home page (Read and navigate the entire page without missing important information.).

(Thompson et al. 2003)

Witt and McDermott (2004: 49) argue that while the tools are useful for issues such as locating missing text alternatives to graphics and checking for untidy HTML coding, they cannot evaluate a site for layout consistency, ease of navigation, provision of contextual and orientation information, and use of clear and easy-to-understand language.

Therefore:

developers must also undertake their own audit of the web page and interpret the advice given in order to satisfy themselves that compliance has been achieved.
Witt and McDermott (2004: 49)

That said, these tools can give web developers a false sense of security. If presented with a Bobby or A-Prompt generated checklist showing that all issues have been addressed, developers may believe they have met the accessibility guidelines. Yet in reality, they may have only adhered to a set of rules and produced an inaccessible website.

SciVisum (2004) tested 111 UK websites that publicly claimed to be compliant with WCAG–1 by displaying the WAI compliance logo on their website. They found that 40 per cent of these failed to meet the checkpoints for which they were claiming compliance.

The report concludes:

The SciVisum study indicates that either self-regulation is not working in practice or that organisations are relying too heavily on automated web-site testing.

Semi-automated accessibility testing on Web sites by experienced engineers needs to become a standard practice for UK Web site owners. Only manual tests can help identify areas of improvement, which are impossible to identify with automated checks alone. Unless sites are tested site-wide in this way failure rates amongst those that are claiming to be compliant will continue.

(SciVisum 2004)

Faulkner and Arch (2003) argue that the interpretation of the results from the automated tools requires assessors trained in accessibility techniques with an understanding of the technical and usability issues facing people with disabilities. A thorough understanding of accessibility is also required in order to competently assess the checkpoints that the automated tools check such as consistent navigation, and appropriate writing and presentation style.

Wattenberg (2004) acknowledges that while these tools are beneficial, developers do not always have the time or the motivation to understand the complex and often lengthy recommendations that the validation tools produce.

All of the tools tested have advantages over free online testing tools [Steven Faulkner], the main advantage being their ability to check a whole site rather than a limited number of pages. Also, files do not have to be published to the web before they can be tested, the user has greater control of what rules are applied when testing and of the style and formatting of reports, and in some cases the software will automatically correct problems found.

None of the tools evaluated, as expected, were able to identify all the HTML files and associated multimedia files that needed to be tested for accessibility on the site.

All the software tools evaluated will assist the accessibility quality assurance process, however none of them will replace evaluation by informed humans. The user needs to be aware of their limitations and needs a strong understanding of accessibility issues and the implications for people with disabilities in order to interpret the reports and the accessibility issues or potential issues flagged by the software tools. Faulkner and Arch (2003)

See Appendix 2

Combining automatic and manual judgement of accessibility

The first step is to evaluate the current site with a combination of automated tools and focused human testing. Such a combination is necessary because neither approach, by itself, can be considered accurate or reliable enough to catch every type of error.
(Bohman 2003c)

With the growing acceptance that accessibility evaluation cannot rely on automated methods and tools alone, a number of studies have used a combination of different methods to evaluate accessibility

(Hinn 1999; McCord et al. 2002; Sloan et al. 2002; Borchert and Conkas 2003).

Hinn (1999) reports on how she used Bobby combined with computer-facilitated focus groups and focused individual interviews to evaluate a virtual learning environment for students with disabilities. In an evaluation of selected web-based health information resources, McCord et al. (2002) also used Bobby, but combined this with a manual evaluation of how the resources interacted with screen reader and speech recognition software.

In thinking about the practicalities of using a combined approach a number of issues arise: time and resources required; the number of people who should be involved in manual evaluation; and the level of expertise required of those people involved in manual evaluation.

Number of people

In contrast with the team approach adopted by Sloan et al. (2002), many evaluation studies are conducted by just one or two people. However, the Web Accessibility Initiative (Brewer 2004) appear to favour the use of review teams, arguing that it is the optimum approach for evaluating accessibility of websites because of the advantages that the different perspectives on a review team can bring.

They consider that review teams should have expertise in the following areas:

  • web mark-up languages and validation tools;
  • web content accessibility guidelines and techniques;
  • conformance evaluation process for evaluating websites for accessibility;
  • use of a variety of evaluation tools for website accessibility;
  • use of computer-based assistive technologies and adaptive strategies;
  • web design and development.

(See Appendix 3)

Level of expertise

Many evaluation studies adopt a ‘have-a-go’ response to accessibility evaluation. For example, Byerley and Chambers (2002) documented the problems encountered in using screen readers with two web-based abstracting and indexing services. In the initial phase of the study, the researchers (two sighted librarians) tested the databases using JAWS and Window-Eyes. They performed simple keyword searches, accessed help screens and manipulated search results by marking citations, printing and emailing full text articles. Others adopt the view that because evaluating accessibility and understanding the guidelines and tools is so difficult, the services of an expert accessibility auditor should be used (Sloan 2000).

Two studies that report the involvement of experts are those of Sloan et al. (2002) and Thompson et al. (2003). Sloan et al. carried out an accessibility study of 11 websites in the UK Higher Education sector in 1999. They describe the methodology used by an evaluation team of research experts to carry out the audits, which included: drawing initial impressions; testing with automatic validation tools; manual evaluation with accessibility guidelines; general inspection; viewing with browsers and assistive technologies; and usability evaluation. Thompson et al. (2003) describe how two web accessibility ‘experts’ manually evaluated the key web pages of 102 public research universities using a five-point rating scale that focused on each site’s ‘functional accessibility,’ i.e., whether all users can accomplish the perceived function of the site. The evaluators’ combined results were positively correlated with those obtained by using Bobby on the same sample.

There appears to be an interesting paradox at work here.

For some, there is a perception that tools will help to ease the burden and take a ‘great load’ off the shoulders of designers, developers and authors (Kraithman and Bennett 2005). The reality appears to be different, however, in that many tools place a great deal of responsibility on the shoulders of designers, developers and authors in terms of needing to interpret the results that tools produce. This has led some to argue that the use of tools should be accompanied by human judgement, in particular expert human judgement.

Thus, it would appear that those considered not to be experts may not have a role in judging the accessibility of online material at all. This feels like a retrograde step in terms of helping novice learning technologists to feel that they have an important responsibility and contribution to make to the accessibility of online material. In order to avoid disempowering learning technologists, we perhaps need to look at developing better tools and/or effective ways of helping learning technologists to gain the skills required to conduct manual judgements of the accessibility of online material.

Conclusions

In order to understand and make the best use of accessibility guidelines, standards and tools learning technologists need to understand and critique the design approaches that underpin accessibility guidelines and standards and appreciate and critique the strengths and weaknesses of using automatic accessibility tools. By increasing their understanding of these issues learning technologists will be in a stronger position to:

  1. choose which guidelines, standards and tools they will adopt and justify their choice;
  2. adopt or adapt the processes and methodologies that have been used to design and evaluate online material.

Furthermore, in exploring in more detail the different approaches to accessible design, learning technologists should start to get a clearer sense of alignment between:

  • definitions of accessibility;
  • approaches to accessible design;
  • accessibility guidelines and standards;
  • accessibility evaluation tools and methods.

For example, definitions of accessibility that incorporate or stress ‘ease of use’ (e.g. Disability Rights Commission, 2004) may be operationalized through design approaches that focus on usability. The products of these approaches may then be judged against standards that emphasize usability (e.g. WCAG–1) and evaluated using tools that use these standards as benchmarks.

There are signs that some areas are attempting to move towards a standardization of methods and tools.

For example, the EuroAccessibility Consortium launched an initiative in 2003 with W3C to foster European co-operation towards a harmonized methodology for evaluating the accessibility of websites.

Whilst moves towards standardization may be helpful, evidence in this chapter suggests that the science of standardization will need to be counterbalanced by the art of judgement and experience.

Witt and McDermott (2002: 49) argue that ‘as a result of the range of standards and guidelines, the levels of interpretation and the subjective judgements required in negotiating these, the creation of an accessible solution is very much an art’.

A major advantage of viewing accessibility as both an art and a science is that learning technologists will be encouraged to take all dimensions and variables into account and not to focus solely on the technology (Stratford and Sloan 2005).

Accessibility needs to be qualified by questions such as:

  • Accessible to whom?
  • In what context?
  • For what purpose?

The notion of reasonable adjustment is also discussed in the legislation; but to what extent is it reasonable or even desirable to add certain accessibility features?

What effects might adding or withholding features have on other aspects of the curriculum, or on other students?

In addition, if, for whatever reasons, the desired outcome cannot be achieved by one route, what other routes might be appropriate?
(Stratford and Sloan 2005).

References

Abascal, J., Arrue, M., Fajardo, I., Garay, N. and Tomas, J. (2004) The use of guidelines to automatically verify web accessibility. Universal Access in the Information Society, 3, 71–79.

Alexander, D. (2003b) Redesign of the Monash University web Site: a case study in user-centred design methods. Paper presented at AusWeb03. Online. Available HTTP: <http://ausweb.scu.edu.au/ aw03/ papers/ alexander/ paper.html> (last accessed 17 Nov 2012).

Alexandraki, C., Paramythis, A., Maou, N. and Stephanidis, C. (2004) web accessibility through adaptation. In K. Klaus, K. Miesenberger, W. Zagler and D. Burger (eds) Computers Helping People with Special Needs. Proceedings of 9th International Conference. Berlin Heidelberg: Springer-Verlag, pp. 302–309.

Arch, A. (2002) Dispelling the myths – web accessibility is for all. Paper presented at AusWeb ’02. Online. Available HTTP: <http://ausweb.scu.edu.au/ aw02/ papers/ refereed/ arch/ paper.html> (last accessed 23 May 2012).

Arditi, A. (2004) Adjustable typography: an approach to enhancing low vision text accessibility. Ergonomics, 47, 5, 469–482.

BBC (2012) Fergus Walsh presents The Mind Reader: Unlocking My Voice – a Panorama Special. BBC One, Tuesday 13 November 2012, 22:35 GMT.

Bilotta, J. A. (2005) Over-done: when web accessibility techniques replace true inclusive user centred design. Paper presented at CSUN ’05, Los Angeles, 17–19 March 2005. Online. Available HTTP: <http://www.csun.edu/ cod/ conf/ 2005/ proceedings/ 2283.htm> (last accessed 17 Nov 2012).

Bohman, P. (2003a) Introduction to web accessibility. Online. Available HTTP: http://www.webaim.org/ intro/ (last accessed 16 November 2012).

Brajnik, G. (2000) Automatic web usability evaluation: what needs to be done? Paper presented at the 6th conference on Human Factors and the Web. Online. Available HTTP: http://users.dimi.uniud.it/~giorgio.brajnik/papers/hfweb00.html (last accessed 16 November 2012)

Brajnik, G. (2004) Comparing accessibility evaluation tools: a method for tool effectiveness. Universal Access to the Information Society, 3, 252–263.

Brewer, J. (2004) Review Teams for Evaluating web Site Accessibility. Online. Available HTTP: <http://www.w3.org/WAI/eval/reviewteams.html>. (Now available at http://www.w3.org/ WAI/ EO/ Drafts/ review/ reviewteams.html, last accessed 18 Nov 2012.)

Burzagli, L., Emiliani, P. L. and Graziani, P. (2004) Accessibility in the field of education. In C. Stary and C. Stephanidis (eds) Proceedings of UI4All 2004, Lecture Notes in Computer Science, Volume 3196. Berlin Heidelberg: Springer-Verlag, pp. 235–241.

Byerley, S. L. and Chambers, M. B. (2002) Accessibility and usability of web-based library databases for non-visual user. Library Hi Tech, 20, 2, 169–178.

Caldwell, B., Chisholm, W., Vanderheiden, G. and White, J. (2004) web Content Accessibility Guidelines 2.0: W3C Working Draft, 19 November 2004. Online. Available HTTP: <http://www.w3.org/ TR/ WCAG20/> (last accessed 17 Nov 2012).

Chisholm, W. and Brewer, J. (2005) Web Content Accessibility Guidelines 2.0: Transitioning your website. Paper presented at CSUN ’05, Los Angeles, 17–19 March 2005. Online. Available HTTP: <http://www.csun.edu/ cod/ conf/ 2005/ proceedings/ 2355.htm> (last accessed 18 November 2012).

Colwell, C., Scanlon, E. and Cooper, M. (2002) Using remote laboratories to extend access to science and engineering. Computers and Education, 38, 65–76.

Cooper, M., Valencia, L. P. S., Donnelly, A. and Sergeant, P. (2000) User interface approaches for accessibility in complex World-Wide Web applications – an example approach from the PEARL project. Paper presented at the 6th ERCIM Workshop, ‘User Interfaces for All’. Online. Available HTTP: <http://ui4all.ics.forth.gr/UI4ALL-2000/files/Position_Papers/Cooper.pdf>. (Now available at http://ui4all.ics.forth.gr/ UI4ALL-2000/ files/ Position_Papers/ Cooper.pdf, last accessed 17 Nov 2012.)

Craven, J. (2003a) Access to electronic resources by visually impaired people. Information Research, 8, 4. Online. Available HTTP: <http://informationr.net/ ir/ 8-4/ paper156.html> (last accessed 23 May 2012).

Disability Rights Commission (DRC) (2004) The web: access and inclusion for disabled people. Online. Available HTTP: <http://www.drc-gb.org/publicationsandreports/2.pdf> (accessed 5 October 2005 but no longer available).

Faulkner, S. and Arch, A. (2003) Accessibility testing software compared. Paper presented to AusWeb ’03. Online. Available HTTP: <http://ausweb.scu.edu.au/ aw03/ papers/ arch/ paper.html> (last accessed 18 Nov 2012).

Fielding, R.T (2000) Architectural Styles and the Design of Network-based Software Architectures.

Gappa, H., Nordbrook, Mohamad, Y. and Velasco, C. A. (2004) Preferences of people with disabilities to improve information presentation and information retrieval inside Internet Services – results of a user study. In K. Klaus, K. Miesenberger, W. Zagler and D. Burger (eds)Computers Helping People with Special Needs. Proceedings of 9th International Conference. Berlin Heidelberg: Springer-Verlag, pp. 296–301.

Gay, G. and Harrison, L. (2001) Inclusion in an electronic classroom – accessible courseware study. Paper presented at CSUN ’01. Online. Available HTTP: <http://www.csun.edu/ cod/ conf/ 2001/ proceedings/ 0188gay.htm> (last accessed 17 Nov 12).

Gonzales, J., Macias, M., Rodriguez, R. and Sanchez, F. (2003) Accessibility metrics of web pages for blind end-users. In J. M. Cueva Lovelle (ed.) Proceedings of ICWE 2003, Lecture Notes in Computer Science, Volume 2722. Berlin Heidelberg: Springer-Verlag, pp. 374–383.

Gunderson, J. and May, M. (2005) W3C user agent accessibility guidelines test suite version 2.0 and implementation report. Paper presented at CSUN ’05, Los Angeles, 17–19 March 2005. Online. Available HTTP: <http://www.csun.edu/ cod/ conf/ 2005/ proceedings/2363.htm> (last accessed 18 Nov 2012).

Jacko, J. A. and Hanson, V. L. (2002) Universal access and inclusion in design. Universal Access to the Information Society, 2, 1–2.

Keates, S. and Clarkson, P. J. (2003) Countering design exclusion: bridging the gap between usability and accessibility. Universal Access to the Information Society, 2, 215–225.

Kraithman, D. and Bennett, S. (2004) Case study: blending chalk, talk and accessibility in an introductory economics module. Best Practice, 4, 2, 12–15.

Hackett, S., Parmanto, B. and Zeng, X. (2004) Accessibility of Internet Websites through time. Paper presented at ASSETS ’04, Atlanta, Georgia, 18–20 October.

Hanson, V. L. and Richards, J. T. (2004) A web accessibility service: update and findings. Paper presented at ASSETS ’04, 18–20 October, Atlanta, Georgia. Online. Available HTTP: <http://www.research.ibm.com/ people/ v/ vlh/ HansonASSETS04.pdf>
(last accessed 17 Nov 2012).

Ivory, M. Y., Mankoff, J. and Le, A. (2003) Using automated tools to improve website usage by users with diverse abilities. IT and Society, 1, 3, 195–236.

Iwarsson, S. and Stahl, A. (2003) Accessibility, usability and universal design – positioning and definition of concepts describing person–environment relationships. Disability and Rehabilitation, 25, 2, 57–66

Jeffels, P. and Martson, P. (2003) Accessibility of online learning materials. Online. Available HTTP: <http://www.scrolla.hw.ac.uk/ papers/ jeffelsmarston.html> (last accessed 16 November  2012)

Keller, S., Braithwaite, R., Owens, J. and Smith, K. (2001) Towards universal accessibility: including users with a disability in the design process. Paper presented at the 12th Australasian Conference on Information Systems, Southern Cross University, New South Wales.

Kirkpatrick, A. and Thatcher, J. (2005) Web accessibility error prioritisation. Paper presented at CSUN ’05, Los Angeles, 17–19 March 2005. Online. Available HTTP: <http://www.csun.edu/ cod/ conf/ 2005/ proceedings/ 2341.htm> (last accessed 18 November 2012).

Kunzinger, E., Snow-Weaver, A., Bernstein, H., Collins, C., Keates, S., Kwit, P., Laws, C., Querner, K., Murphy, A., Sacco, J. and Soderston, C. (2005) Ease of access – a cultural change in the IT product design process. Paper presented at CSUN ’05, Los Angeles, 17–19 March 2005. Online. Available HTTP: <http://www.csun.edu/ cod/ conf/
2005/ proceedings/ 2512.htm> (last accessed 17 November 2012).

Lazar, J., Beere, P., Greenidge, K. D. and Nagappa, Y. (2003) Web accessibility in the Mid-Atlantic States: a study of 50 homepages.Universal Access in the Information Society, 2, 331–341.

Linder, D. and Gunderson, J. (2003) Publishing Accessible WWW presentations from PowerPoint slide presentations. Paper presented at CSUN ’03. Online. Available HTTP: <http://www.csun.edu/ cod/ conf/ 2003/ proceedings/ 174.htm> (last accessed 18 November 2012).

Luke, R. (2002) AccessAbility: Enabling technology for lifelong learning inclusion in an electronic classroom-2000. Educational Technology and Society, 5, 1, 148–152.

Nielsen, J. (2000) Designing Web Usability. 1st ed. Indianapolis, IN: New Riders Publishing.

Owens, J. and Keller, S. (2000) MultiWeb: Australian contribution to Web accessibility. Paper presented at the 11th Australasian Conference on Information Systems, Brisbane, Australia. Online. Available HTTP: <http://www.deakin.edu.au/infosys/docs/workingpapers/archive/Working_Papers_2000/2000_18_Owens.pdf> (accessed by Jane Seale 5 October 2005 but no longer available).

Paciello, M. G. (2005) Enhancing accessibility through usability inspections and usability testing. Paper presented at CSUN ’05, Los Angeles, 17–19 March 2005. Online. Available HTTP: <http://www.csun.edu/ cod/ conf/ 2005/ proceedings/ 2509.htm> (last accessed 17 Nov 2012).

Powlik, J. J. and Karshmer, A. I. (2002) When accessibility meets usability. Universal Access in the Information Society, 1, 217–222.

Reece, G. A. (2002) Accessibility Meets Usability: A Plea for a Paramount and Concurrent User–centered Design Approach to Electronic and Information Technology Accessibility for All.

Regan, B. (2005a) Accessible web authoring with Macromedia Dreamweaver MX 2004. Paper presented at CSUN ’05, Los Angeles, 17–19 March 2005. Online. Available HTTP: <http://www.csun.edu/ cod/ conf/ 2005/ proceedings/ 2405.htm> (last accessed 18 November  2012).

$2-3 Digitized Speech and Speech Recognition Chips – Sensory, Inc. Retrieved August 15, 2005, from http://www.sensoryinc.com.

Saito, S., Fukuda, K., Takagi, H. and Asawaka, C. (2005) aDesigner: visualizing blind usability and simulating low vision visibility. Paper presented at CSUN ’05, Los Angeles, 17–19 March 2005. Online. Available HTTP: <http://www.csun.edu/ cod/ conf/ 2005/ proceedings/2631.htm> (last accessed18 Nov 2012).

SciVisum (2004) Web Accessibility Report. Kent, UK: SciVisum Ltd.

Sloan, D., Rowan, M., Booth, P. and Gregor, P. (2000) Ensuring the provision of accessible digital resources. Journal of Educational Media, 25, 3, 203–216.

Sloan, D., Gregor, P., Booth, P. and Gibson, L. (2002) Auditing accessibility of UK higher education web sites. Interacting With Computers, 14, 313–325.

Sloan, D. and Stratford, J. (2004) Producing high quality materials on accessible multimedia. Paper presented at the ILTHE Disability Forum, Anglia Polytechnic University, 29 January. Online. Available HTTP:
http://www.heacademy.ac.uk/assets/documents/resources/database/id418_producing_high_quality_materials.pdf (accessed 16 November 2012).

Smith, S. (2002) Dyslexia and Virtual Learning Environment Interfaces. In L. Phipps., A. Sutherland and J. Seale (eds) Access All Areas: Disability, Technology and Learning. Oxford: ALT/Jisc/TechDis, pp. 50–53.

Smith, J. and Bohman, P. (2004) Evaluating website accessibility: a seven step process. Paper presented at CSUN ’04. Online. Available HTTP: <http://www.csun.edu/ cod/ conf/ 2004/ proceedings/ 203.htm> (last accessed 18 Nov 2012).

Stephanidis, C., Paramythis, A., Akoumianakis, D. and Sfyrakis, M. (1998). Self-adapting web-based systems: towards universal accessibility. In C. Stephanidis and A. Waern (eds) Proceedings of the 4th ERCIM Workshop ‘User Interfaces for All’ Stockholm: European Research Consortium for Informatics and Mathematics, pp. 19–21.

Stephanidis et al. (2011) Twenty five years of training and education in ICT Design for All and Assistive Technology.

Stratford, J. and Sloan, D. (2005) Skills for Access: putting the world to rights for accessibility and multimedia. ALT Online Newsletter, 15 July. Online. Available HTTP: <http://newsletter.alt.ac.uk/e_article000428038.cfm?x=b11,0,w>. (Now available at http://newsweaver.co.uk/ alt/e_article000428038.cfm, last accessed 18 Nov 2012.)

Sullivan, L.H (1896) http://www.scribd.com/doc/104764188/Louis-Sullivan-The-Tall-Office-Building-Artistically-Considered

Tagaki, H., Asakawa, C., Fukuda, K. and Maeda, J. (2004) Accessibility designer: visualizing usability for the blind. Paper presented at ASSETS ’04, 18–20 October 2004, Atlanta, Georgia.

Taylor, D. M., Tillery, S. I. H. & Schwartz, A. B. (2002). Direct cortical control of 3D neuroprosthetic devices. Science, 296(5574), 1829-32

Thatcher, J. (2004) web accessibility – what not to do. Paper presented at CSUN ’04. Online. Available HTTP: <http://www.csun.edu/ cod/ conf/ 2004/ proceedings/ 80.htm> (last accessed 16 November 2012).

Theofanos, M., Kirkpatrick, A. and Thatcher, J. (2004) Prioritizing web accessibility evaluation data. Paper presented at CSUN '04. Online. Available HTTP: <http://www.csun.edu/cod/conf/2004/proceedings/145.htm> (last accessed 18 November 2012).

Thompson, T., Burgstahler, S. and Comden, D. (2003) Research on Web accessibility in higher education. Information Technology and Disabilities, 9, 2. Online. Originally available HTTP: <http://www.rit.edu/~easi/itd/itdv09n2/thompson.htm>; now available at <http://people.rit.edu/easi/itd/itdv09n2/thompson.htm> (last accessed 18 November 2012).

Thompson, T. (2005) Universal design and web accessibility: unexpected beneficiaries. Paper presented at CSUN '05, Los Angeles, 17–19 March 2005. Online. Available HTTP: <http://www.csun.edu/cod/conf/2005/proceedings/2392.htm> (last accessed 23 May 2012).

Witt, N. A. J. and McDermott, A. P. (2002) Achieving SENDA-compliance for Web Sites in further and higher education: an art or a science? In L. Phipps, A. Sutherland and J. Seale (eds). Access All Areas: Disability, Technology and Learning. Oxford: ALT/TechDis, pp. 42–49.

Witt, N. and McDermott, A. (2004) Web site accessibility: what logo will we use today? British Journal of Educational Technology, 35, 1, 45–56.

Vanderheiden, G. C., Chisholm, W. A. and Ewers, N. (1997) Making screen readers work more effectively on the web. Trace Center, 18 November 1997. Online. Available HTTP: <http://www.trace.wisc.edu/text/guideIns/browser/screen.html>.

Vanderheiden, G. C. (2007) Redefining Assistive Technology, Accessibility and Disability Based on Recent Technical Advances. Journal of Technology in Human Services, 25, 1–2, pp. 147–158.

Yu, H. (2003) Web Accessibility and the law: issues in implementation. In M. Hricko (ed.) Design and Implementation of Web-Enabled Teaching Tools. Ohio: Kent State University.

WebAIM (2010) WAVE Online Web Accessibility Evaluation Tool [online], http://wave.webaim.org/ (last accessed 18 November 2012).

APPENDIX 1

The Seven Step Process Smith and Bohman 2004

1. Validate your HTML

This first step happens to be one of the most difficult, especially for people who are not very comfortable with HTML. It is, however, a very important step toward Web accessibility. Proper, standards-based HTML lends itself toward accessibility. Assistive technologies rely on proper HTML more so than most Web browsers. Valid HTML has many other benefits as well, including a decreased likelihood for cross-browser differences or incompatibilities and better support for emerging technologies.

To validate your page's HTML, use the W3C's HTML validator at http://validator.w3.org/. The results are often overwhelming at first. Most people are not even aware of the intricate rules and standards for HTML use. Even if you are using professional Web development software programs, it is likely that your page will not validate as proper HTML on the first attempt. Be sure to read the explanations as to why the page does not validate as proper HTML. Learn about the common HTML mistakes and do what you can to fix them.

Creating proper HTML is a challenge, but one that every Web developer should take upon him or herself. When you understand the rules of HTML, you are much more likely to design more usable and accessible Web content.
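
Validation can also be scripted. The snippet below is a minimal sketch of my own (not part of Smith and Bohman's process) that posts a page's markup to the W3C Nu HTML Checker and prints the messages it returns; the endpoint URL and its out=json parameter are assumptions to verify against the validator's current documentation.

```python
# Hedged sketch: post markup to the W3C Nu HTML Checker and list reported issues.
# The endpoint and ?out=json parameter are assumptions to check before relying on this.
import requests

VALIDATOR_URL = "https://validator.w3.org/nu/?out=json"  # assumed endpoint

def validate_html(html: str) -> list:
    """Return the list of messages the validator reports for this markup."""
    response = requests.post(
        VALIDATOR_URL,
        data=html.encode("utf-8"),
        headers={"Content-Type": "text/html; charset=utf-8"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json().get("messages", [])

if __name__ == "__main__":
    sample = "<!DOCTYPE html><html><body><img src='logo.png'></body></html>"
    for message in validate_html(sample):
        print(message.get("type"), message.get("message"))
```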

2. Validate for accessibility

Once you have created proper HTML within your page, many of your accessibility issues will be gone, because proper HTML requires many accessibility techniques, such as alt text. The next step is to find other accessibility issues that may be present in your page. This is where automated accessibility tools come into play.

The Wave accessibility tool is a great place to start – http://www.wave.webaim.org/. The Wave provides useful information about accessibility features and errors within your page. It is designed to facilitate the design process by adding icons to a version of your page. The icons represent structural, content, usability, or accessibility features or problems within your content. You can easily see the exact location within your page where an error is present.

The Wave (or any other software-based validator) cannot check all accessibility issues, but it checks nearly everything that can possibly be checked in an automated process. As soon as you have fixed the errors and applicable warnings from the Wave, you may want to validate your page using other accessibility validators, such as Bobby (http://bobby.watchfire.com/), to get another perspective on the accessibility feedback. If you need to validate or audit the accessibility of an entire site, there are many evaluation tools you can use, including HiSoftware's line of products (http://www.hisoftware.com/) or InFocus (http://www.ssbtechnologies.com/).
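
As a complement to tools like the Wave, a few of the machine-checkable issues can be caught with a handful of lines of scripting. The sketch below is illustrative only (it assumes the BeautifulSoup library and is nothing like a full checker): it flags images with no alt attribute and form fields that no label points at.

```python
# Illustrative mini-checker (not the WAVE tool): flags missing alt text and unlabelled fields.
from bs4 import BeautifulSoup

def quick_accessibility_scan(html: str) -> list:
    soup = BeautifulSoup(html, "html.parser")
    problems = []

    # Images must carry an alt attribute (even an empty one for decorative images).
    for img in soup.find_all("img"):
        if not img.has_attr("alt"):
            problems.append(f"<img src='{img.get('src')}'> has no alt attribute")

    # Every visible form field should be referenced by a <label for="..."> element.
    labelled_ids = {label.get("for") for label in soup.find_all("label")}
    for field in soup.find_all(["input", "select", "textarea"]):
        if field.get("type") in ("hidden", "submit", "button"):
            continue
        if field.get("id") not in labelled_ids:
            problems.append(f"form field '{field.get('name')}' has no associated label")

    return problems
```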

3. Check for keyboard accessibility

This step is very easy. Just access the page and make sure that you can navigate through the entire Web page using the tab key. Ensure that every link and form item is accessible and that all forms can be filled out and submitted via the keyboard. If any content is displayed based upon mouse actions, make sure the content is also available using the keyboard.
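
If you want to automate part of this check, a browser-automation library can walk the tab order for you. The sketch below assumes Selenium with Firefox and a placeholder URL; a manual keyboard test remains the more reliable judge of whether the focus order actually makes sense.

```python
# Hedged sketch: walk the page's tab order with Selenium and record what receives focus.
from selenium import webdriver
from selenium.webdriver.common.keys import Keys

driver = webdriver.Firefox()
driver.get("https://example.org/")  # placeholder page to test

focus_order = []
for _ in range(50):  # walk up to the first 50 tab stops
    driver.switch_to.active_element.send_keys(Keys.TAB)
    element = driver.switch_to.active_element
    focus_order.append((element.tag_name, element.get_attribute("id") or element.text[:40]))

driver.quit()

# Review the list: every link and form control should be reachable,
# and the sequence should follow the page's logical reading order.
for tag, label in focus_order:
    print(tag, label)
```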

4. Test in a screen reader

It is a great idea to have a copy of a screen reader available for testing. I suggest IBM HomePage Reader because it is easy to use and learn, whereas other full-featured screen readers are more expensive and have a very steep learning curve. First listen to the entire page without stopping. Did everything make sense? Did the screen reader access all of the content? Was the alternative text for images appropriate and equivalent enough to convey the content and meaning of the image? Was the reading order of the content logical?

Now try navigating the page with the screen reader. Are link labels descriptive? Were forms accessible via the keyboard? Were form labels included? If the page includes data tables, were data cells associated with headers? Did the navigation structure make sense? Was there an option to navigate within lengthy pages of content? Was content structure, such as headings and lists, correctly implemented? Was any multimedia accessible (i.e., did video have captions, audio have transcripts, Flash have an alternative, etc.)?

5. Check your pages for WCAG compliance

Become familiar with the Web Content Accessibility Guidelines – http://www.w3.org/TR/WAI-WEBCONTENT/. The WCAG are a very complete set of accessibility guidelines. They are so complete that sometimes it is nearly impossible to comply with them. Though some of the guidelines are rather vague and some are outdated, they will give you a good idea of what it takes to be truly accessible. Manually evaluate your page against these guidelines. The WCAG guidelines are broken into three priority levels, based upon level of importance or impact on accessibility. At the very least, try to meet the Priority I and Priority II guidelines. The Priority III guidelines are more of a wish list for accessibility, but contain some very important items that should be included whenever possible. The Wave (http://www.wave.webaim.org) can alert you to non-compliance with many of the WCAG guidelines.

6. Conduct user testing

One of the best ways to determine the accessibility of your pages is to get feedback from individuals with disabilities. Most are very willing to give you feedback if it will help increase the accessibility of your content to the disability community. Sometimes features of the site that you believed would increase accessibility end up being very confusing or inaccessible. Be willing to make changes based on user testing. Especially seek feedback on your navigation structure and use of language. These two things can pose huge accessibility barriers to a large group of individuals. As soon as their recommendations for changes have been made, have them test again and see if things are better. Encourage feedback from all of your site visitors.

7. Repeat this process

Web accessibility is a continual process and one that should be evaluated often. Each time you update or change content, quickly run through the previous 6 steps. You will quickly get very good at them and eventually you will understand how to both evaluate and create accessible Web content.

Accessibility Testing Software Compared

APPENDIX 2

How is conformance ascertained?

What is accessibility?

An accessible web is available to people with disabilities, including those with:

  • Vision impairment (e.g. low vision or colour blindness) or vision loss affecting their ability to discern or see the screen
  • Physical impairment affecting their ability to use a mouse or keyboard
  • Hearing impairment or loss affecting their ability to discern or hear online audio
  • Cognitive impairments (e.g. dyslexia, ADD, learning difficulties, memory impairment) affecting their ability to comprehend or understand your site
  • Literacy impairments (e.g. low reading skills or English is not their first language) possibly affecting their ability to fully understand your site and its messages

Beneficiaries from an accessible web, however, are a much wider group than just people with disabilities and also include:

  • People with poor communications infrastructure, especially rural Australians
  • Older people and new users, often computer illiterate
  • People with old equipment (not capable of running the latest software)
  • People with 'non-standard' equipment (e.g. WAP phones and PDAs)
  • People with restricted access environments (e.g. locked-down corporate desktops)
  • People with temporary impairments or who are coping with environmental distractions

How do we check for an accessible web site?

The Web Accessibility Initiative (WAI) outlines approaches for preliminary and conformance reviews of web sites [Brewer & Letourneau]. Both approaches recommend the use of ‘accessibility evaluation tools’ to identify some of the issues that occur on a web site. The WAI web site includes a large list of software tools to assist with conformance evaluations [Chisholm & Kasday]. These tools range from automated spidering tools such as the infamous Bobby [ Watchfire, 2003], to tools to assist manual evaluation such as The WAVE [WebAIM], to tools to assist assessment of specific issues such as colour blindness. Some of the automated accessibility assessment software tools also have options for HTML repair.

What role can automated tools play in assessing the accessibility of a web site?

WCAG 1.0 comprises 65 Checkpoints. Some of these are qualified with “Until user agents …” and with the advances in browsers and assistive technology since 1999, some of these are no longer applicable – leaving us with 61 Checkpoints. Of these only 13 are clearly capable of being tested definitively, with another 27 that can be tested for the presence of the solution or potential problem, but not whether it has definitively been resolved satisfactorily. With intelligent algorithms many of the tools can narrow down the instances of potential issues that need manual checking, e.g. the use of “spacer” as the alt text for spacer.gif used to position elements on the page.
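
The 'spacer' example above is typical of the pattern-matching these tools do. A toy version of such a heuristic, assuming BeautifulSoup and with example patterns only (not an authoritative list), might look like this:

```python
# Illustrative heuristic: flag alt text that is probably a placeholder needing manual review.
import re
from bs4 import BeautifulSoup

SUSPECT_ALT = re.compile(r"^(spacer|image|img|photo|picture|graphic|\s*)$", re.I)

def flag_suspect_alt_text(html: str) -> list:
    soup = BeautifulSoup(html, "html.parser")
    flagged = []
    for img in soup.find_all("img"):
        alt = img.get("alt")
        if alt is None:
            flagged.append((img.get("src"), "missing alt attribute"))
        elif SUSPECT_ALT.match(alt) or alt.lower().endswith((".gif", ".jpg", ".png")):
            # Empty alt may be legitimate for decorative images; a human must confirm.
            flagged.append((img.get("src"), f"suspect alt text: '{alt}'"))
    return flagged
```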

These automated tools are very good at identifying pages and lines of code that need to be manually checked for accessibility. Unfortunately, many people misuse these tools and place a "passed" (e.g. XYZ Approved) graphic on their site when the tool cannot identify any specific accessibility issues, even though the site has not been competently manually assessed for issues that are not software-checkable.

So, automated software tools can:

  • check the syntax of the site’s code
  • identify some actual accessibility problems
  • identify some potential problems
  • identify pages containing elements that may cause problems
  • search for known patterns that humans have listed

However, automated software tools cannot:

  • check for appropriate meaning
  • check for appropriate rendering (auditory, variety of visual)

The interpretation of the results from the automated tools requires assessors trained in accessibility techniques with an understanding of the technical and usability issues facing people with disabilities. A thorough understanding of accessibility is also required in order to competently assess the checkpoints that the automated tools cannot check such as consistent navigation, and appropriate writing and presentation style.

APPENDIX 3

Review Teams for Evaluating Web Site Accessibility

Introduction

This document describes the optimum composition, training, and operation of teams of reviewers evaluating accessibility of Web sites, based on experience from individuals and organizations who have been evaluating accessibility of Web sites for a number of years. The combination of perspectives afforded by a team rather than individual approach to Web site evaluation can improve the quality of evaluation. It provides links to potential training resources related to types of expertise needed in Web site evaluation, and suggests practices for effective coordination and communication during the review process.

The description of Web accessibility review teams in this document is informative, and not associated with any certification of review teams. Operation of Web accessibility review teams according to the description in this document does not guarantee evaluation results consistent with any given law or regulation pertaining to Web accessibility.

This document does not address repair of inaccessible Web sites. The recommended process for evaluation of Web site accessibility, and selection of software for evaluation, is addressed in a separate document, Evaluating Web Sites for Accessibility.

Composition of Review Teams

A review team, rather than individuals, is the optimum approach for evaluating accessibility of Web sites because of the advantages that the different perspectives on a review team can bring. It is possible for individuals to evaluate Web site accessibility effectively, particularly if they have good training and experience in a variety of areas, including expertise with the increasing number of semi-automated evaluation tools for Web accessibility. However, it is less likely that one individual will have all the training and experience that a team approach can bring.

Affiliations among members of review teams may vary. For instance, a team might be a group of colleagues working in the same physical office space, or they might be a team of volunteers communicating consistently by email but located in different countries and working for different organizations.

Likewise the nature of review teams as organizations may vary. For instance, the review team might operate as a for-profit business enterprise; or as an assigned monitoring and oversight group within a corporation or government ministry; or strictly on a voluntary basis.

Expertise of Review Teams

Review teams should have expertise in the following areas. Links to potential training resources are provided, although in areas such as use of assistive technologies and adaptive strategies, the most effective review approach can be inclusion of users with different disabilities.

  • Web mark-up languages and validation tools
    • W3C Technical Reports
    • [point to which online training resources, or describe generically?]
  • Web Content Accessibility Guidelines 1.0 (WCAG 1.0) and Techniques for WCAG 1.0
    • [link to these docs, to curriculum, and to training resource suite?]
  • Conformance evaluation process from Evaluating Web Sites for Accessibility
    • [link to conf eval section; anything to link to on eval curriculum? slide sets from hands-on workshops?]
  • Use of a variety of evaluation tools for Web site accessibility
    • [any training resources available for evaluation tools?]
  • Use of computer-based assistive technologies and adaptive strategies
    • [link to How People with Disabilities Use the Web sections]
    • [write the piece on cautions about sighted people evaluating pages with screen readers, etc.]
  • Web design and development.
    • [describe generic online resources?]

Operation of Review Teams

Communicate review process and expectations in advance

Depending upon the type of review, the review process may start with advance communication about how the review will be conducted, whether the results will be public or private, how the results will be presented, how much follow-up guidance will be given, etc. In instances where review teams are fulfilling a monitoring, oversight, or advocacy function, this advance notice may not be necessary.

Coordinate review process and communication of results

Most review teams will maintain a private e-mail discussion list for internal exchange of preliminary results during the process of reviewing a site, and for coordinating on one summary of the results from the review team.

Reference specific checkpoints when explaining results

Reports from review teams are most effective when they cite and link to specific WCAG 1.0 checkpoints which are not conformed to.

Compare evaluation results with other review teams

Review teams can benefit from periodically comparing summaries of Web site reviews with other teams as a quality check.

Provide feedback on guidelines and implementation support resources

Feedback on implementation support resources contributes to improving the quality of review tools and processes for all. Where possible, provide feedback on accessibility guidelines and implementation support resources, including the following:

  • Evaluating Web Sites for Accessibility
  • Review Teams for Evaluating Web Site Accessibility (this document)
  • Web Content Accessibility Guidelines 2.0 (Working Draft)
  • Accuracy of evaluation tools (provide feedback directly to tool developers as well as to WAI)

Xerte notes and Presentation

Presentation of Xerte by Alistair McNaught

Xerte Project AGM
18 October, 22:18

Visible to anyone in the world (from the Open University Student Blog Platform)

Xerte is an open source tool that can be used to create e-learning content that can be delivered through virtual learning environments such as Moodle.

This blog post is a summary of a meeting entitled Xerte Project AGM that was held at the East Midlands Conference Centre at the University of Nottingham on 10 October 2012.

The purpose of the day was to share information about the current release of the Xerte tool, to offer an opportunity for different users to talk to each other and also to allow delegates to gain some understanding about where the development of the tool is heading.

One of my main motivations for posting a summary of the event is to share some information about the project with my H810 Accessible online learning: supporting disabled students tutor group.

Xerte is a tool that is considered capable of creating accessible learning material – this means that the materials that are presented through (or using) Xerte may be consumed by people who have different impairments.

One of the activities that H810 students have to do is to create digital educational materials with a view to understanding what accessibility means and what challenges students may face when they begin to interact with digital materials. Xerte is one tool that could be used to create digital materials for some audiences.

This blog will contain some further description of accessibility (what it is and what it isn’t); a subject that was mentioned during the day.  I’ll also say something about other approaches that can be used to create digital materials.

Xerte isn’t the beginning and end of accessibility – no single tool can solve the challenge of creating educational materials that are functionally and practically accessible to learners.  Xerte is one of many tools that can be used to contribute towards the creation of accessible resources, which is something different and separate to accessible pedagogy.

Introductions

The day was introduced by Wyn Morgan, director of teaching and learning at Nottingham.  Wyn immediately touched upon some of the objectives of the tool and the project – to allow the simple creation of attractive e-learning materials.

Wyn’s introduction was followed by a brief presentation by Amber Thomas, who I understand is the manager for the JISC Rapid Innovation programme.  Amber mentioned the importance of a connected project called Xenith, but more of this later.

Project Overview

Julian Tenney presented an overview of the Xerte project and also described its history. As a computer scientist myself, I found that Julian's unexpected but very relevant introduction resonated strongly with me.

He mentioned two important and interesting books:

Hackers, by Steven Levy
The Cathedral and the Bazaar by Eric S Raymond.

Julian introduced us to the importance of open source software and described the benefit and strength of having a community of interested developers who work together to create something (in this case, a software tool) for the common good.

I made a note of a number of interesting quotes that can be connected to both books.  These are:

‘always yield to the hands on’ (which means, just get on and build stuff),
‘hackers should be judged by their hacking’,
‘the world is full of interesting problems to be solved’, and ‘boredom and drudgery are evil’.

When it comes to the challenge of creating digital educational resources that can be delivered on-line, developers can be quickly faced with similar challenges time and time again.  The interesting and difficult problems lie with how best to overcome the drudgery of familiar problems.

I learnt that the first version of Xerte was released in 2006.  Julian mentioned other tools that can be used to create materials and touched upon the issue of both their cost and their complexity.  Continued development moved from a desktop based application to a set of on-line tools that can be hosted on an institutional web server (as far as I understand things).

An important point from Julian’s introductory presentation that I paraphrase is that one of the constants of working with technology is continual change.  During the time between the launch of the original version of Xerte and the date of this blog post, we have seen the emergence of tablet based devices and the increased use of mobile devices, such as smartphones.  The standalone version of Xerte currently delivers content using a technology called Flash (wikipedia), which is a product by Adobe.

According to the Wikipedia article that was just referenced, Adobe has no intention to support Flash for mobile devices.  Instead, Adobe has announced that they wish to develop products for more open standards such as HTML 5.

This brief excursion into the domain of software technology deftly took us onto the point of the day where the delegates were encouraged to celebrate the release of the new versions of the Xerte software and toolkits.

New Features and Page Types

Ron Mitchell introduced a number of new features and touched upon some topics that were addressed later during the day.  Topics that were mentioned included internationalisation, accessibility and the subject of Flash support.

Other subjects that were less familiar to me included how to support authentication through LDAP (lightweight directory access protocol) when using the Xerte Online Toolkit (as opposed to the standalone version), some hints about how to integrate some aspects of the Xerte software with the Moodle VLE, and how a tool such as Jmol (a Java viewer for molecular structures) could be added to content that is authored through Xerte.

One of the challenges with authoring tools is how to embed either non-standard material or materials that were derived from third party sources.  I seem to remember being told about something called an Embed code which (as far as I understand things) enables HTML code to be embedded directly within content authored through Xerte.  The advantage of this is that you can potentially make use of rich third party websites to create interactive activities.

Internationalisation

I understand internationalisation as one of those words that is very similar to the term software localisation; it's all about making sure that your software system can be used by people in other countries. One of the challenges with any software localisation initiative is to create (or harness) a mechanism to replace hardcoded phrases and terms with labels, and have them dynamically changed depending on the locale in which a system is deployed. Luckily, this is exactly the kind of thing that the developers have been working on: a part of the project called XerteTrans.
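
To illustrate the general idea – this is a toy sketch of label substitution, not the actual XerteTrans implementation – interface strings are looked up by key in a per-locale table rather than being hard-coded, so switching locale changes every label without touching the templates.

```python
# Toy sketch of locale-keyed label lookup (illustrative only, not XerteTrans code).
LABELS = {
    "en": {"next": "Next page", "prev": "Previous page", "menu": "Contents"},
    "nl": {"next": "Volgende pagina", "prev": "Vorige pagina", "menu": "Inhoud"},
}

def label(key: str, locale: str = "en") -> str:
    """Return the translated label, falling back to English if missing."""
    return LABELS.get(locale, {}).get(key) or LABELS["en"][key]

print(label("next", "nl"))  # -> "Volgende pagina"
```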

Connector Templates

When I found myself working in industry I created a number of e-learning objects that were simply ‘page turners’.  What I mean is that you had a learning object that had a pretty boring (but simple) structure – a learning object that was just one page after another.  At the time there wasn’t any (easy) way to create a network of pages to take a user through a series of different paths.  It turns out that the new connector templates (which contains something called a connector page), allows you to do just this.

The way that things work is through a page ID.  Pages can have IDs if you want to add links between them. Apparently there are a couple of different types of connector pages: linear, non-linear and some others (I can’t quite make out my handwriting at this point!) The principle of a connector template may well be something that is very useful.  It is a concept that seems significantly easier to understand than other e-learning standards and tools that have tried to offer similar functionality.
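
Conceptually – and this is my own illustration, not Xerte's internal data model – a set of connector pages behaves like a small graph keyed by page ID, which also makes it easy to check that no page has accidentally been left unreachable.

```python
# Conceptual sketch: pages linked by ID form a graph a learner can walk non-linearly.
PAGES = {
    "intro":   {"title": "Introduction",      "links": ["theory", "quiz"]},
    "theory":  {"title": "Background theory", "links": ["quiz"]},
    "quiz":    {"title": "Self-test",         "links": ["intro", "summary"]},
    "summary": {"title": "Summary",           "links": []},
}

def reachable(start: str) -> set:
    """Which pages can a learner reach from a given starting page?"""
    seen, stack = set(), [start]
    while stack:
        page = stack.pop()
        if page not in seen:
            seen.add(page)
            stack.extend(PAGES[page]["links"])
    return seen

print(sorted(reachable("intro")))  # check no page has been orphaned
```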

A final reflection on this subject is that it is possible to connect sets of pages (or slides) together using PowerPoint, a very different tool that has been designed for a very different audience and purpose.

Xenith and HTML 5

Returning to earlier subjects, Julian Tenney and Fay Cross described a JISC funded project called Xenith. The aim of Xenith is to create a system to allow content that has been authored using Xerte to be presented using HTML 5 (Wikipedia).

The motivation behind this work is to ensure that e-learning materials can be delivered on a wide variety of platforms.  When HTML 5 is used with toolkits such as jQuery, there is less of an argument for making use of Adobe Flash.

There are two problems with continuing to use Flash.

The first is that, due to a historic falling out between Apple and Adobe, Flash cannot be used on iOS (iPhone, iPad and iPod) devices.

Secondly, Flash has not been considered to be a technology that has been historically very accessible.

Apparently, a Flash interface will remain in the client version of Xerte for the foreseeable future, but to help uncover accessibility challenges the Xenith developers have been working with JISC TechDis.  It was during this final part of the presentation that the NVDA screen reader was mentioned (which is freely available for download).

Accessibility

Alistair McNaught from TechDis gave a very interesting presentation about some of the general principles of technical and pedagogic accessibility.  Alistair emphasised the point that accessibility isn’t just about whether or not something is generally accessible; the term ‘accessibility’ can be viewed as a label.  I also remember the point that the application of different types of accessibility standards and guidelines don’t necessarily guarantee a good or accessible learning experience.

I made a note of the following words.

Accessibility is about:

  1. forethought,
  2. respect,
  3. pragmatism,
  4. testing,
  5. communication.

Forethought relates to the simple fact that people can become disabled.  There is also the point that higher educational institutions should be anticipatory.

Respect is about admitting that something may be accessible for some people but not for others.  A description of a diagram prepared for a learner who has a visual impairment may not be appropriate if it contains an inordinate amount of description, some of which may be superfluous to an underlying learning objective or pedagogic aim.

Pragmatism relates to making decisions that work for the individual and for the institution.  Testing of both content and services is necessary to understand the challenges that learners face.  Even though educational content may be accessible in a legislative sense, learners may face their own practical challenges.  My understanding is that all these points can be addressed through communication and negotiation.

It was mentioned that Xerte is accessible, but there are some important caveats.  

Firstly, it makes use of Flash; secondly, the templates impose some restrictions; and thirdly, access depends on differences between screen readers and browsers. It is the issue of the browser that reminds us that technical accessibility is a complex issue. It is also dependent upon the design of the learning materials that we create.

To conclude, Alistair mentioned a couple of links that may be useful.  The first is the TechDis Xerte page.  The second is the Voices page, which relates to a funded project to create an ‘English’ synthetic voice that can be used with screen reading software.

For those students who are studying H810, I especially recommend Alistair’s presentation which can be viewed on-line by visiting the AGM website.

Alistair’s presentation starts at about the 88 minute mark.

Closing Discussions and Comments

The final part of the day gave way to discussions, facilitated by Inge Donkervoort, about how to develop the Xerte community site. Delegates were then asked whether they would like an opportunity to attend a similar event next year.

Reflections

One of the things I helped to develop when I worked in industry was a standards compliant (I use this term with a degree of hand waving) 'mini-VLE'. It didn't take off for a whole host of reasons, but I thought it was pretty cool! It had a simple navigation facility and users could create a repository of learning objects. During my time on the project (which predated the release of Xerte), I kept a relatively close eye on which tools I could use to author learning materials. Two tools that I used were a Microsoft Word-based add-in (originally called CourseGenie), which allowed authors to create a series of separate pages that were then all packaged together into a single zip file, and an old tool called Reload. I also had a look at some commercial tools.

One of the challenges that I came across was that, in some cases, it wasn’t easy to determine what content should be created and managed by the VLE and what content was created and managed by an authoring tool.  An administrator of a VLE can define titles and make available on-line communication tools such as forums and wikis and then choose to provide learners with sets of pages (which may or may not be interactive) that have been created using tools like Xerte.

Relating back to accessibility, even though content may be notionally accessible it is also important to consider the route in which end users gain access to content.  Accessible content is pointless if the environment which is used to deliver the content is either inaccessible or is too difficult to navigate.

Reflecting on this issue, there is a 'line' that exists between the internal world of the VLE and the external world of a tool that generates material that can be delivered through (or by) a VLE. In some respects, I feel that this notional line is never going to be pinned down due to differences between the ways in which systems operate and the environments in which they are used. Standards can play an important role in trying to define such issues and helping to make things work together, but different standards will undoubtedly place the line at different points.

During my time as a developer I also asked myself the obvious question: 'why don't we make other digital resources, such as documents and PowerPoint files, available to learners?' Or, to take the opposite view of this question, 'why should I have to use authoring tools at all?' I have no (personal) objections to using authoring tools to create digital materials. The benefit of tools such as Xerte is that the output can be simple, direct and clear to understand. The choice of the mechanisms used to create materials for delivery to students should be dictated primarily by the pedagogic objectives of a module or course of study.

And finally…

One thought did plague me towards the end of the day, and it was this: the emphasis of the day was primarily on technology; there was very little (if anything) about learning and pedagogy. This can be viewed from two sides – firstly, understanding more about the situations in which a particular tool (in combination with other tools) can best be used, and secondly, how users (educators or learning technologists) can best begin to learn about the tool and how it can be applied. Some e-learning tools work better in some situations than others. Also, educators need to know how to help learners work with the tools (and the results that they generate).

All in all, I had an enjoyable day.  I even recognised a fellow Open University tutor!  It was a good opportunity to chat about the challenges of using and working with technology and to become informed about what interesting developments were on the horizon and how the Xerte tool was being used.  It was also great to learn that a community of users was being established.

Finally, it was great to see how the developers were directly tackling the challenge of constant changes in technology, such as the emergence of tablet computers and new HTML standards.

Tackling such an issue head on whilst at the same time trying to establish a community of active open source developers can certainly help to establish a sustainable long-term project.

Fig. 1. A model for professional development of e-learning (JISC, 2010)

It is well known that the average quality of websites is poor, “lack of navigability” being the #1 cause of user dissatisfaction [Fleming, 1998; Nielsen, 1999].

Should a reference that gives dated commentary such as this be linked from a contemporary piece of e-learning on accessibility?

My frustrations may be leading to enlightenment but when a subject such as e-learning is so fast moving it is laughable to find yourself being referred to commentary published over a decade ago, and so potentially first written down 13 years ago.

At times I wonder why the OU doesn’t have a model that can be repeatedly refreshed, at least with every presentation, rather than every decade when the stuff is replaced wholesale. They need a leaner machine – or at least the Institution of Educational Technology does.

I did H807 Innovations in e-learning in 2010 – it has now been replaced by H817 – and at times H807 told me LESS about innovations in e-learning than I had picked up myself working in the industry creating innovative online learning and development in 2000/2001, while the tutor struggled with the online tools.

Here we go again, not from the resource, but from someone cited in it :

In 1999, in anticipation of Special Educational Needs and Disability Rights in Education Bill (SENDA), funding was obtained to employ a researcher for 2 days per week over a 6 month period to produce a concise usable guide to the factors which must be taken into account in order to produce accessible online learning materials.

I don’t want to know or need to know – all of this should be filtered out.

There needs to be a new model for publishing academic papers – quicker and perishable, with a sell-by-date.

In fairness, in this instance, I am quoting from a reference of a 2006 publication that is a key resource for H810 Accessible Online Learning. But I have now found several specialists cited in Seale’s publication on accessibility who say very different things in 2007 and 2011 respectively compared to how they are referenced in papers these two wrote in 1996 and 2001.

For example, compare these two:

Vanderheiden, G. C., Chisholm, W. A., & Ewers, N. (1997, November 18). Making screen readers work more effectively on the web (1st)

Vanderheiden, G. C. (2007) Redefining Assistive Technology, Accessibility and Disability Based on Recent Technical Advances. Journal of Technology in Human Services, 25, 1–2, pp. 147–158.

The beauty of our WWW in 2012 is that a few clicks and a reference can be checked and the latest views of the author considered, yet the module’s design doesn’t instigate or expect this kind of necessary refreshing.

The other one to look at is:

Stephanidis et al. (2011) Twenty five years of training and education in ICT Design for All and Assistive Technology.

Activity designed to provide an insight into scripting content that is image rich for the visually impaired

Breaststroke 2 (Photo credit: Wikipedia)

Fig.1. Kindle by the pool. Taking swim sessions armed with a Kindle edition of 'The Swimming Drills Book'. A perfect aide-memoire for the coach. A tool that grabs the attention of swimmers (I use this with 5 year olds to 75 year olds).

As part of the Masters in Open and Distance Education (MAODE) module H810 Accessible Online Learning : supporting disabled students we are looking at how best to describe visual content for the visually impaired. This fascinating exercise sees me refreshing my ideas on scriptwriting.

Asked to find an example of an online learning resource from my own context, I decided to turn to swimming.

Teaching breaststroke : symmetrical whip kick and glide, arms in front of the shoulders during the pull, head still looking no further than in front of your hands.

Coach Marlins – my swim teaching and coaching blog.

A personal resource, reflection on swimming (masters) and coaching for Mid Sussex Marlins Swimming Club. A first step towards creating a mobile resource. Below is an excerpt from a typical morning teaching four groups – three grade groups (grades 4, 5 and 7, typically 7–11 year olds) and a disability swimming group of children and adults.

See ‘The Swim Drills Books’

Grade 7 are technically superior and have more stamina and may be a little older. The ones I watch out for are the 7 year olds in with 10 and 11 year olds as they need a different approach, TLC and play.

WARM UP

  • 3 x 50m warm up of front crawl and backstroke

Always giving a tip before starting them off (and accommodating the odd swimmer who is invariably late), say ‘smooth swimming’ or ‘long legs’. i.e. reducing splashing and creating a more efficient swimmer.

  1. Make sure too that there is a 5m gap between each swimmer.
  2. 25m of Breaststroke to see what I’ve got and potentially adjust accordingly.

LEGS

Fig. 2. Breaststroke kicking drill from ‘The Swimming Drill Book’

  • Kick on front with a kicker float.
  • Taking tips from ‘The Swim Drill Book’
  • I remember to put as much emphasis on keeping the chin in.

The glide is key – this is where to put the emphasis.

  • May start the ‘Kick, Pull, Glide’ or better ‘Kick, Pull, Slide’ mantra to get it into their heads.

ARMS

Fig. 3. Breaststroke arms from ‘The Swimming Drill Book’

Standing demo of the arm stroke, from Guzman, forming an equilateral triangle and keeping the fingers pointing away.

  • Will ‘describe’ the triangle poolside then ask what it is and what kind of triangle.
  • Anything to get them to think about it a little.

Fig. 4. Breaststroke arms from ‘The Swimming Drill Book’

  • I show this as a single action.
  • Other things I might say include 'heart shaped' (upside down).
  • And making a sound effect ‘Bu-dooosh’ as I push my arms out.

Fig. 5. Breaststroke arms from ‘The Swimming Drill Book’

Repeat the need for a pronounced glide, even asking for a 2 second count (one Mississippi, two Mississippi).

I support this by showing images from 'The Swimming Drill Book' on an iPhone or the Kindle.

Leading into the turn we do in sequence (from the shallow end):

    • Push and glide for count of 5 seconds
    • Same, then add the underwater stroke and see how far you can go.

Legs Only Drill (Advanced)

Arms outstretched above the head. No kicker float

  • The whole BR transition counting 3,2,1.

Which visual content needs describing?

  • The objects that need describing might be photos, diagrams, models, animations and so on.

In the resources I was impressed by the clear, logical, analytical description of some of the complex bar charts, flow charts, pie charts and others. This is how all descriptions should be. In 2010 or 2011 the BBC reviewed how weather forecasts were delivered. It was determined that they were far too flowery. A plainer, clearer approach was wanted – give an overview, identify the region, give the immediate and forecast weather, then move on. Something much more like 'The Shipping Forecast' worked better. No more 'weather-caster personalities' then. It isn't entertainment, it is information.

What kind of description is needed?

‘Before beginning to write a description, establish what the image is showing and what the most important aspects are’. UKAAF

‘Consider what is important about the photograph in the context of how the image is going to be used, and how much detail is essential’. UKAAF

In swimming, any description of these visuals should emphasise the purpose of the action, the key action in relation to the physics and physiology of the pull, the action in relation to the rules of competitive swimming.

  • Keep it simple
  • Get to the point
  • Choose the right words

Kick without a float. Arm pull practice standing in water or on the side of the pool.

If you can, ask someone who has not seen these visual objects to read your descriptions. Then show them the object and the context. What was their reaction? (If you have online tools to share visual resources, ask another student in your tutor group to do this activity with you.)

Which aspects of this task were straightforward or difficult?

  • Knowing that gender is irrelevant. Putting it in context.
  • Take care not to use terms or metaphors that the swimmer may not be familiar with if they have never seen them.

Reading text on a diagram and wanting to shut my eyes so that I can hear the description without the image.

  • To get this close to right I need to use a screen reader or record and play back.
  • Working with someone who is visually impaired is of course the best approach.

‘Remember that blind or partially sighted people cannot skim read, so let them know how long the description is likely to be’. UKAAF

Knowing what to leave out, being confident to leave something out then knowing how to handle it.

‘It is important that information provided for sighted people is also made available to blind and partially sighted people, even if the way the information is given is different’. RNIB (2009)

An author should write with a single reader in mind – in this instance, while visual impairment frames the task, the reader is first of all a swimmer or swim teacher/assistant – so the description must be given with this in mind, which in turn defines the writing/editing process of what to put in and what to leave out.

What might have helped improve my descriptions?

  • Physically moving the student athlete's arms and legs through the positions. With their consent, allowing a visually impaired swimmer to lay their hands on the arms and then the legs of someone as they go through the movement.
  • An artist's manikin, a jointed doll, or a male or female action figure.
  • Braille embossed outline.

‘However converting a visual graphic to an appropriate tactile graphic is not simply a matter of taking a visual image and making some kind of “tactile photocopy”. The tactile sense is considerably less sensitive than the visual sense, and touch works in a more serial manner than vision. Therefore the visual graphic needs to be re-designed to make sense in a tactile form for blind and partial sighted readers’. RNIB (2009)

In some subjects, interpreting an image or diagram could be a key skill that students are expected to learn.

Drill-down organization

Descriptions should follow a drill-down organization, e.g., a brief summary followed by extended description and/or specific data. Drill-down organization allows the reader to either continue reading for more information or stop when they have read all they want.

Keeping this logic rather than imaging the sighted eye skipping about the page, so I imagine I am not allowed to lift the stylus from the screen … it has to be in a continuous, logical flow. Constructing a narrative would add some logic to it as well.
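
A rough sketch of how one of the breaststroke figures above might be layered in this drill-down way follows; the wording and structure are mine, for illustration only, not taken from the module materials.

```python
# Illustrative drill-down description: a short summary first, deeper layers on request.
description = {
    "summary": "Diagram of the breaststroke arm pull seen from the front.",
    "extended": (
        "The swimmer's arms sweep out from an extended position to form a "
        "triangle in front of the shoulders, then squeeze back in under the chin."
    ),
    "detail": "The hands stay in front of the shoulders throughout and the elbows stay high.",
}

def read_description(level: int = 1) -> str:
    """Return the description down to the requested level (1 = summary only)."""
    layers = [description["summary"], description["extended"], description["detail"]]
    return " ".join(layers[:level])

print(read_description(1))  # a screen-reader user can stop here...
print(read_description(3))  # ...or drill down for the full detail
```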

Can descriptions be done in such a way that you are not giving students the answers?

This was an interesting and relevant point regarding humorous cartoons: 'Cartoons and comic strips need to be described if necessary. Set the scene of the cartoon without giving away the joke. Provide a brief overview of the image.'

The same therefore applies to 'giving the answer' – treat it as the punch line but leave it out, and, like a quiz book, say 'answers on page x'.

What do you think your strategy would be if you can’t find a way to give a description without compromising the learning outcomes?

Script differently – this is after all a different audience – and all students are ultimately an audience of one. Perhaps all resources will become highly personalised in future?

12) How can providing descriptions be included in the workflow process of delivering an online module? (This was touched on in the discussion for Activity 17.3.)

  • I liked this quotation:

“When organisations send me information in formats that I can read myself it allows me to be independent, feel informed and appreciated – just like every other customer.” End-user UKAAF

From Describing images 2: Charts and graphs

  • Definition of print disability
  • A print-disabled person is anyone for whom a visual, cognitive, or physical disability hinders the ability to read print. This includes all visual impairments, dyslexia, and any physical disabilities that prevent the handling of a physical copy of a print publication.

REFERENCE

RNIB Tactile Images : http://www.rnib.org.uk/professionals/accessibleinformation/accessibleformats/accessibleimages/Pages/accessible_images.aspx

RNIB Image Descriptions : http://www.rnib.org.uk/professionals/accessibleinformation/accessibleformats/accessibleimages/imagedescriptions/Pages/image_descriptions.aspx

Gould, B., O'Connell, T. and Freed, G. (2008) Effective Practices for Description of Science Content within Digital Talking Books [online], National Center for Accessible Media (NCAM), http://ncam.wgbh.org/experience_learn/educational_media/stemdx (last accessed 10 November 2012).

Guzman, R. (2007) The Swimming Drills Book. Human Kinetics Publishers. ISBN 9780736062510.

UK Association for Accessible Formats (UKAAF) (undated) Formats and Guidance: Accessible Images [online], http://www.ukaaf.org/formats-and-guidance#accessible (last accessed 10 November 2012).

University of Aberdeen (undated) Keep It Simple [online], http://www.abdn.ac.uk/eLearning/accessibility/checklist/keep-it-simple/ (last accessed 10 November 2012).

Fig. 1. The Open University’s Masters in Open and Distance Education (MAODE).

Expressed as a Wordle. A personal collection of key influencers based on those tagged in this blog. Includes my own reading and indulgences.

On Friday, at midday, my OU student blog reached a significant milestone.

I’ve been at it for 33 months. I’ve blogged the best part of FIVE modules now – most of which required or invited some use of the blog platform (or another). I required little encouragement – I used to keep a diary and have found since 1999 that in their digital form they are an extraordinarily versatile way to gather, consider, share and develop ideas.

The investment in time: on average, an hour a day in addition to – though sometimes instead of – coursework, over 1,000+ days.

(This excludes 8 months I spent on the Masters in Open and Distance Learning in 2001)

To mark this event, and as I need to go through this online diary, this e-journal, this ‘web-log’ (as they were also once momentarily called) ahead of some exciting meetings coming up next week I thought a simple task might be to click through the tags to identify who have been the key influencers in my reading and thinking over the last two and a half years.

Fig.2. Another way of looking at it.

Beetham, Conole and Weller are key MAODE authors from the Open University. John Seely Brown is a vital undercurrent, Engestrom one of several enthusiasms, like Vygotsky, while Gagne, in second-hand hardback, needs to be on your desk for frequent reference.

What I thought would take an hour has taken nearly 40 hours.

Clicking on a tag opens a corner of my head; the notes take me back to that day, that week, that assignment or task. It also takes me back to the discussions, resources and papers. And when I find an error the proof-reader in me has to fix it. Aptly, as we approach November 5th – living in Lewes, where there are marches and fireworks from late October for a couple of weeks, peaking of course all evening on the 5th – my head feels as if someone has accidentally set light to a box of assorted fireworks.

Just as well. Meetings these days are like a viva voce with eager ears and probing questions – they want the content of my mind and whatever else I bring to the subject after thirty years in corporate training and communications.

Fig. 3. Wordle allows you to say how many words you want to include in the mix.

To create weight I had to repeat the names I consider most important twice, three or four times in the list. I also removed first names as Wordle would have scattered these into the mix independently like peppercorns in a pan of vegetable stock.
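
For what it is worth, the repetition can be scripted rather than done by hand. A small sketch follows; the weights are made up for illustration, not a ranking from the module.

```python
# Illustrative sketch: repeat each surname in proportion to the weight Wordle should give it.
weights = {"Gagne": 4, "Beetham": 4, "Conole": 4, "Weller": 3, "Engestrom": 2, "Vygotsky": 1}

word_list = []
for name, weight in weights.items():
    word_list.extend([name] * weight)  # Wordle sizes words by frequency

print(" ".join(word_list))  # paste this output into Wordle's text box
```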

The Task

  • List all authors who have been part of my learning and thinking over the last couple of years.
  • Include authors that my antennae have picked up that are relevant to my interest in learning, design, the moving image and the English language.
  • Visualise this and draw some conclusions.

Fig.4. This even makes three of the key protagonists look like an advertising agency: Gagne, Beetham and Conole.

The Outcome

I can never finish. Take this morning. I stumble upon my notes on three case studies on the use of e-portfolios from H807 which I covered from February 2010-September 2010. To begin with I feel compelled to correct the referencing in order to understand the value, pertinence and good manners (let alone the legal duty) to cite things correctly. (Even though this post was locked – a ‘private’ dump of grabs and my thoughts).

Then I add an image or two.

These days I feel a post requires a visual expression of its contents to open, and benefits from whatever other diagrams, charts or images you can conjure from your mind or a Google search – 'the word' + images + creative commons is how I play it.

Fig. 5. From David Ogilvy's book 'Ogilvy on Advertising' – a simple suggestion: a striking image, a pertinent headline, and always caption the picture. Then write your body copy.

A background in advertising has something to do with this and the influence of David Ogilvy.

I spend over two hours on the first of three case studies in just one single post. At the time I rubbished e-portfolios. The notes and references are there. Tapped back in I can now make something of it. A second time round the terms, the ideas – even some of the authors are familiar. It makes for an easier and relevant read. What is more, it is current and pertinent. A blog can be a portfolio – indeed this is what I’d recommend.

From time to time I will have to emerge from this tramp through the jungle of my MAODE mind.

Not least to work, to sleep, to cook and play.

Fig. 6. In a word

USEFUL LINKS

Wordle

Date duration calculator

REFERENCE

Gagne, R. M. (1965) The Conditions of Learning. New York: Holt, Rinehart and Winston.
