Michael Gaigg: Über UI/UX Design

19 Dec 2008

Web Content Accessibility Guidelines (WCAG) 2.0: Overview and Structure

Posted by Michael Gaigg

Overview

Last week the W3C announced the publication of the Web Content Accessibility Guidelines (WCAG) 2.0 as a final Web standard, a "W3C Recommendation". This is good news for many reasons:

  • Guidelines are more specific, e.g. specifying a minimum contrast ratio or the timing of time-based actions in seconds.
  • Success Criteria are written in a technology neutral fashion.
  • Success Criteria are written as testable statements.
  • Past killer arguments like "JavaScript is forbidden" are gone; JavaScript is now included as a technique to enhance accessibility.
  • Gathering 'implementation experience' is now part of the W3C Process.
  • Guidelines include requirements related to informing users of data entry errors.

WCAG 2.0 Overview showing Principles, Guidelines, and Success Criteria (Level A, Level AA, Level AAA).

But what I personally like best is the revamped structure, called layers of guidance:

Structure

The four principles of Web accessibility: perceivable, operable, understandable, and robust.

WCAG 2.0 defines a logical hierarchy of accessibility guidance called layers of guidance. All of these layers work together to provide guidance on how to make content more accessible.

Principles

The foundation is built on four principles that are essential for anyone to be able to access and use Web content, i.e. all Web content must be:

  1. Perceivable
  2. Operable
  3. Understandable
  4. Robust

These principles are the four pillars of Web accessibility and describe at a high level what can be done to assist users with varying needs to successfully access your content.

Guidelines

The 12 WCAG 2.0 Guidelines provide basic goals for creating accessible content.

The 12 guidelines are basic goals that authors of Web content should work toward in order to create accessible content. The guidelines themselves are not testable; they are meant as a framework of overall objectives. The guidelines are (a short markup sketch illustrating a few of them follows the list):

  • 1.1 Provide text alternatives for any non-text content so that it can be changed into other forms people need, such as large print, braille, speech, symbols or simpler language.
  • 1.2 Provide alternatives for time-based media.
  • 1.3 Create content that can be presented in different ways (for example simpler layout) without losing information or structure.
  • 1.4 Make it easier for users to see and hear content including separating foreground from background.
  • 2.1 Make all functionality available from a keyboard.
  • 2.2 Provide users enough time to read and use content.
  • 2.3 Do not design content in a way that is known to cause seizures.
  • 2.4 Provide ways to help users navigate, find content, and determine where they are.
  • 3.1 Make text content readable and understandable.
  • 3.2 Make Web pages appear and operate in predictable ways.
  • 3.3 Help users avoid and correct mistakes.
  • 4.1 Maximize compatibility with current and future user agents, including assistive technologies.
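
To make these goals a bit more concrete, here is a minimal HTML sketch of my own (not taken from the WCAG documents; file names, URLs and text are made up) that touches a few of the guidelines above: a text alternative for an image (1.1), a descriptive page title (2.4), a labeled form field that helps users avoid mistakes (3.3), a native keyboard-operable button (2.1), and a declared document language (3.1).

  <!-- Hypothetical sketch, not from the WCAG documents; file names and content are made up -->
  <!DOCTYPE html>
  <html lang="en"> <!-- 3.1: declaring the language helps make text readable and understandable -->
  <head>
    <meta charset="utf-8">
    <title>Order form - Example Shop</title> <!-- 2.4: a descriptive title helps users determine where they are -->
  </head>
  <body>
    <!-- 1.1: a text alternative lets the image be rendered as speech, braille or large print -->
    <img src="sales-chart.png" alt="Monthly sales, rising from 120 units in January to 310 in June">

    <form action="/order" method="post">
      <!-- 3.3: an explicit label helps users avoid and correct mistakes -->
      <label for="email">E-mail address</label>
      <input type="text" id="email" name="email">

      <!-- 2.1: a native button is operable from the keyboard without any extra scripting -->
      <button type="submit">Place order</button>
    </form>
  </body>
  </html>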

Success Criteria

WCAG 2.0 Success Criteria shown in three columns: column 1 (red) is Level A, column 2 (yellow) is Level AA, column 3 (green) is Level AAA.

Now, the Success Criteria are where the meat is. For each guideline, testable Success Criteria are provided. Any Web content, whether a single complete Web page or a set of connected pages, can be evaluated against these criteria, and each criterion receives a true/false (pass or fail) result; a small pass/fail example follows below.
The Success Criteria are further divided into three levels of conformance, where conformance means satisfying all the requirements of a given standard, guideline or specification:

  • Level A (lowest; minimum level of conformance)
  • Level AA
  • Level AAA (highest)

The notion of conformance is so important that I will discuss it in a separate blog entry.
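
To show what "testable" means in practice, here is a small pass/fail illustration of my own for Success Criterion 1.1.1 (Non-text Content, Level A), which essentially asks whether non-text content has a text alternative; an evaluator can answer that with a plain true or false for each image. The exact wording lives in the WCAG 2.0 document; the markup below is made up.

  <!-- Hypothetical pass/fail illustration for SC 1.1.1 (Non-text Content, Level A) -->

  <!-- Fails: the image carries information but offers no text alternative -->
  <img src="directions-map.png">

  <!-- Passes: a text alternative conveys the equivalent information -->
  <img src="directions-map.png" alt="Map: from the station, walk two blocks north to the office at 5 Main Street">

  <!-- Passes: a purely decorative image is marked so assistive technology can ignore it -->
  <img src="divider.png" alt="">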

Sufficient and Advisory Techniques

Up until now, all the principles, guidelines, and Success Criteria have been written in a technology-neutral fashion. That's great, but what now? The Working Group has identified and published HTML implementations that serve as examples and tutorials; they are kept in the living document called Techniques for WCAG 2.0. For each Success Criterion, this document explains a variety of techniques for implementing the corresponding guideline. The list is not complete and will be expanded as new techniques are discovered.

The techniques fall into two categories:

  • Sufficient techniques: considered sufficient to meet a Success Criterion.
  • Advisory techniques: enhance accessibility, but do not qualify as sufficient techniques.

Most Success Criteria have multiple sufficient techniques listed, and any of them can be used to meet the Success Criterion. There may also be other techniques, not documented by the Working Group, that meet a Success Criterion; this is especially true for content that is not HTML.
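
To give the distinction some flavor, here is a tiny markup illustration of my own; the authoritative mapping of techniques to criteria is in the Techniques for WCAG 2.0 document itself, so treat the labels below as illustrative rather than official.

  <!-- Illustrative only; the official technique lists live in Techniques for WCAG 2.0 -->

  <!-- Sufficient-style technique: the alt attribute alone satisfies the text-alternative criterion -->
  <img src="quarterly-results.png" alt="Bar chart of quarterly results, summarized in the paragraph below">

  <!-- Advisory-style enhancement: a nearby plain-text summary goes beyond the minimum and helps more users -->
  <p>In short: revenue grew every quarter, from 1.2 million in Q1 to 2.1 million in Q4.</p>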

Summary & Criticism

I'm really excited about WCAG 2.0, its clear structure and its promising, almost marketing-like wording. I also like the amount of effort that went into documenting examples, techniques and common failures.
What I miss is the programmer's perspective: an outline of each element with its associated Success Criteria and code samples. For example, how do I make tables accessible? What about links, CAPTCHAs, maps, etc.? I think this work is up to us, and I will continue to tackle it by grouping, summarizing and compiling elements so I can publish them on this blog.
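
As a first small step in that direction, here is a minimal sketch of an accessible data table, assuming the familiar approach of tying header cells to data cells with th and scope; it's my own illustration, not an official WCAG code sample.

  <!-- Minimal data table sketch; my own illustration, not an official WCAG sample -->
  <table>
    <caption>Opening hours</caption>        <!-- names the table for screen reader users -->
    <tr>
      <th scope="col">Day</th>              <!-- scope="col" ties data cells to their column header -->
      <th scope="col">Open</th>
      <th scope="col">Close</th>
    </tr>
    <tr>
      <th scope="row">Monday</th>           <!-- scope="row" does the same for row headers -->
      <td>9:00</td>
      <td>17:00</td>
    </tr>
    <tr>
      <th scope="row">Saturday</th>
      <td>10:00</td>
      <td>14:00</td>
    </tr>
  </table>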

What are your opinions on WCAG 2.0?

9 Nov 2008

User-Centered Design (UCD) Methods: Comparison and Overview

Posted by Michael Gaigg

This is the first in a series of blog posts describing user-centered design methods. My goal is to summarize my experience, insights and findings from across the literature and compile them into quick, easy-to-digest pieces for you. I want to encourage you to comment with your own experiences and give me feedback on why your company applies certain methods differently, not at all, or does something else altogether.

I personally don't like the term Usability very much; it's an empty buzzword. It means SOMETHING to everybody but isn't scientific enough to be taken seriously, and it's often interpreted wrongly and plainly misunderstood. It's a bit like Psychology: we know it is important to understand fundamental human behavior, its problems and remedies, but I wouldn't pay a dime to go to a Psychologist. Who knows, though: just as Psychology gained its scientific relevance and acknowledgment, partly perhaps through the 'invention' of the IQ, hopefully Usability will rise to similar levels (Jeff Sauro offers interesting metrics via SUM, the Single Usability Metric).

That's why I like the term User-centered Design. It works wonders with Project Managers and the like, probably because Design is such an important term in their daily work. And when asked about Usability testing, I can conveniently point out that it is only one tool of many in my UCD toolbox. But the really important sales trick is knowing which UCD method is best used at which point in the project management cycle.

The following chart compares the most common user-centered design methods, outlines their cost and shows when to use them:

Comparison of User-centered Design (UCD) Methods
Method                      | Cost   | Output            | Sample Size | When to Use
Competitive Studies         | Medium | Stat. & Non-Stat. | 5           | Requirements
Focus Groups                | High   | Non-Statistical   | 6-9         | Requirements
Field Studies               | High   | Non-Statistical   | 2-3         | Requirements
Heuristic Evaluation        | Low    | Statistical       | 2-3         | Design
Paper Prototyping           | Medium | Stat. & Non-Stat. | 5           | Design
Card Sorting                | High   | Statistical       | 15-20       | Design
Participatory Design        | Low    | Non-Statistical   | n/a         | Design
User Testing                | Medium | Stat. & Non-Stat. | 5           | Design & Evaluation
Surveys                     | Low    | Statistical       | 20+         | Requirements & Evaluation
Interviews                  | High   | Non-Statistical   | 3-5         | Requirements & Evaluation
Server Traffic Log Analysis | Low    | Statistical       | n/a         | Evaluation
Search Log Analysis         | Low    | Statistical       | n/a         | Evaluation

Not long ago, after completing a full project management cycle (requirements, design, implementation and evaluation), a PM proudly announced that he would run a Focus Group with his stakeholders. Showing them the finished application, he thought, would surely impress them and lead to valuable feedback for the next milestone. This impulse isn't uncommon, but it has to be fought before it becomes reality. Does he really want to produce MORE and EXPENSIVE requirements? Because that is the output of Focus Groups. Wouldn't he be better off running two iterations of User Testing to reveal usability issues, or a Survey to receive input from outside the development environment?

Recommendations

  • Requirements:
    • Competitive Studies
    • Interviews
    • Field Studies
  • Design:
    • Heuristic Evaluation
    • Paper Prototyping
    • User Testing
  • Evaluation:
    • Surveys
    • Server Traffic Log Analysis
    • Search Log Analysis
    • User Testing
  • The Usability effort is NOT proportional to the size of the project: bigger projects spend a smaller percentage of their budget on UCD for the same effort. Regardless, as a rule of thumb, assign 10% of the project's budget to UCD.
  • Faster iterations of prototype design require fewer testers.

References