Michael Gaigg: Über UI/UX Design


Cheatsheet: Preparation for User Testing

Posted by Michael Gaigg

I find the following list really helpful when planning and conducting user testing. I collect and refine it constantly and would greatly appreciate comments or additions for anything I have missed (and I'm sure I missed something).

Setup:

  • set up the web meeting
  • ask the secretary not to delete the account and associated recordings
  • test the connection, equipment and recording capabilities
  • set up a schedule for participants
  • send connection info to stakeholders
  • remind everybody to mute their phones (or whatever else is necessary)
  • prepare the necessary data and files

Test machine:

  • hide the Windows taskbar
  • close the mail program

Meeting:

  • enable full screen for all users
  • show host cursors to all attendees
  • allow access to observers
  • share desktop

Session:

  • clear user generated content from previous user
  • reset application
  • remove cookies
  • start blank application (if that's part of the test)
  • take a break/breather for yourself
  • prepare your personal note-taking material
  • get acquainted with the name and capabilities of the next participant
  • provide water for participant
  • start recording
  • greet participant and get going

Post-test:

  • clarify the timeline for test results (findings & analysis)
  • send thank-you emails to participants
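For teams that like to automate their process, a checklist like the one above can also be tracked as data. A minimal sketch (the phase names, items and `outstanding` helper are illustrative, not a real tool):

```python
# Hypothetical sketch: tracking the user-testing checklist as data.
# Items are abbreviated from the list above.
CHECKLIST = {
    "setup": [
        "set up web meeting",
        "test connection, equipment and recording",
    ],
    "session": [
        "clear previous user's content",
        "start recording",
        "greet participant",
    ],
    "post-test": [
        "send thank-you emails",
    ],
}

def outstanding(done):
    """Return the items from each phase that are not yet marked done."""
    return {
        phase: [item for item in items if item not in done]
        for phase, items in CHECKLIST.items()
    }

# Example: only the web meeting has been set up so far.
todo = outstanding({"set up web meeting"})
```

Running `outstanding` before each session makes it easy to spot a forgotten step (like starting the recording) before the participant joins.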

Remote User Testing – A Comprehensive Guide

Posted by Michael Gaigg

No doubt, user testing increases the usability and acceptance of your website, and it can, and should, be done as early as possible, preferably during prototyping.

The following blog entry discusses the advantages & disadvantages of remote user testing, gives time estimates & costs, and explains what a session looks like using TechSmith's UserVue.

General Discussion

Advantages

  • Reduced (or no) travel time and expenses.
  • Higher exposure through easy screen sharing (managers can sneak in easily).
  • Actual user environment, familiar and comfortable.
  • Possibly fewer drop-outs.

Disadvantages

  • Facilitator not physically present (degree of separation can be challenging).
  • Can't see facial expressions or non-verbal cues.
  • Difficult to build rapport and trust.
  • Difficult to control environment.
  • Possibly technical difficulties (firewall, etc.).
  • Setup and use of software or usability lab might be challenging and requires a liaison.

Time estimates & Costs

Task Breakdown

The following time estimates should be taken with a grain of salt; they can vary significantly up or down depending on the project size, the experience of the team and the infrastructure.

  • Preparation (18 hours)
    • User screening: 8 hours
    • Task creation: 8 hours
    • Environment: 2 hours
  • User testing (10 hours)
    • 2 hours per user
    • 5 users per round
  • Post-test (32 hours)
    • Test report: 16 hours
    • Implementation: 12 hours
    • Communication: 4 hours

Summary: Our Time

1 Round = 60 hours
2 Rounds = 110 hours
3 Rounds = 160 hours

+ Participant compensation
+ Time for additional observers
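The totals above can be reproduced with a small calculation. A minimal sketch, assuming the first round costs the full 18 + 10 + 32 = 60 hours and that each additional round adds 50 hours, which is what the 110- and 160-hour totals imply (most, but not all, of the preparation is repeated):

```python
# Hour figures copied from the task breakdown above.
PREPARATION = 18        # user screening (8) + task creation (8) + environment (2)
TESTING_PER_ROUND = 10  # 2 hours per user x 5 users
POST_TEST = 32          # report (16) + implementation (12) + communication (4)

FIRST_ROUND = PREPARATION + TESTING_PER_ROUND + POST_TEST  # 60 hours

# Implied by the post's 110- and 160-hour totals for 2 and 3 rounds.
ADDITIONAL_ROUND = 50

def total_hours(rounds):
    """Total effort in hours for a given number of testing rounds."""
    return FIRST_ROUND + ADDITIONAL_ROUND * (rounds - 1)
```

Participant compensation and observer time come on top, as noted above.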

How it works

Software

UserVue

UserVue is remote user testing software that enables a Facilitator to remotely observe a Participant, with a phone line used for communication. Multiple Observers can passively join the Session and share their observations with the Facilitator.

Morae Manager

Morae Manager uses the collected data (observation markers and notes, video, keyboard and mouse input) to analyze and calculate task times, error rates and other common measurements.
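Morae Manager computes these measurements from its own recordings; purely as an illustration (the session data below is made up, not Morae output), the basic measurements reduce to simple averages:

```python
# Illustrative only: hand-made sample data for one task across three sessions.
sessions = [
    {"task": "find product", "seconds": 95,  "errors": 1, "completed": True},
    {"task": "find product", "seconds": 140, "errors": 3, "completed": False},
    {"task": "find product", "seconds": 80,  "errors": 0, "completed": True},
]

n = len(sessions)
# Mean time on task, in seconds.
mean_task_time = sum(s["seconds"] for s in sessions) / n
# Average number of errors per session.
error_rate = sum(s["errors"] for s in sessions) / n
# Share of sessions where the task was completed.
completion_rate = sum(s["completed"] for s in sessions) / n
```

These are the kinds of per-task numbers (time, errors, completion) that feed into a test report.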

Session

A Session is initiated by the Facilitator. Invitation emails are sent to the Participant and Observer(s).

At the announced time, all involved parties download a small software bundle that allows them to connect to the UserVue software. The Facilitator then calls the Participant and gives instructions on how to start the Session.

After the session has ended, the installed software bundle is removed from the computers of all involved parties.

Participant Requirements

Operating System

Microsoft Windows 2000, Windows XP, or a later version of Windows.

Web Browser

Internet Explorer 5.0 or later, Firefox 1.0 or later.

JavaScript

Enabling JavaScript is recommended.

Security?

All communication with the UserVue Web site is performed over an encrypted Secure Sockets Layer transport mechanism (HTTPS). All session data is encrypted with a 128-bit Blowfish cipher as it is sent over the network.

Can anyone "eavesdrop" on my session?

All session data (including audio, video, chat data, etc.) is encrypted with a 128-bit Blowfish cipher as it is sent over the network. This makes it exceedingly difficult for anyone to intercept and observe session data.

Are copies of my session data stored anywhere?

No copies of session data are stored on any server. The only recording happens directly on the facilitator's computer. Session data may pass through TechSmith's servers to facilitate firewall and NAT traversal. However, this data is never stored. Also, this data is undecipherable as it is in an encrypted form as it passes through TechSmith's servers.

What are your experiences with Remote User Testing?


Integrating Prototyping Into Your Projects

Posted by Michael Gaigg

This article was inspired by Integrating Prototyping Into Your Design Process - Using appropriate fidelity for the situation by Fred Beecher, which I extend with the following:

Prototyping needs to be iterative throughout the project!

Goal of Prototyping

Prototyping is not only a design tool but a research and communication tool as well.

  • It should assist in optimizing the main task (top tasks) and validating its/their efficiency.
  • Furthermore this should not add cost to the project but reduce project expenses while increasing ROI.

So the goal is to use different levels of prototype fidelity to incrementally identify and enhance the user's task(s).

Ideally this happens linearly (visual fidelity increases as you add functional fidelity), but typically the curve is bent to one side (see Figure 1), where more emphasis on

  • visual fidelity can be beneficial for marketing purposes, or
  • functional fidelity can enable earlier user feedback through user testing.

Figure 1: Prototyping in the context of your project.

Integration into your project

Regardless of the project approach you take, it will boil down to the fundamental project management phases of Requirements, Design and Implementation (and possibly others). Prototyping should not be perceived solely as a method useful during Design; it is essential during all three (or more) phases, starting as early as the Requirements phase.

I suggest the following approach:

  1. Low-fidelity prototyping (paper / digital sketch)

    1. Create paper prototypes or digital sketches
    2. Design the navigation architecture (workflow)
      1. Review with client
      2. User testing (optional)
      3. Iterate (until happy)
      4. Revise into step 2
  2. Medium-fidelity prototyping (simple HTML)

    1. Build a simple HTML prototype (maybe even black and white)
    2. Prove the basic workflow and important interactions
      1. Review with client
      2. Iterate
      3. Revise into step 3
  3. High-fidelity prototyping (enhanced HTML)
    1. Enhance the HTML prototype (links and basic functionality)
    2. Settle on a design (including corporate design and basic artwork)
      1. Review with client
      2. Iterate
      3. Revise into step 4
  4. Start the 'real' implementation

Implementation Effort

Each prototype (digital sketch, simple HTML, advanced HTML built on the simple one) should not take more than 40 hours of pure development (not counting initial meetings, communication and possible variations based on project size), plus 80 hours for reviews and iterations with the client. Sounds impossible? Think twice. It is so much easier to modify a sketch than to program HTML. The 'real' implementation will then be built upon a solid code foundation with a mature design already in place - voilà!

Can I skip a prototype?

Yes, obviously you can. But it comes at a cost later on, because you miss crucial information from the earlier phase, and modifications become more expensive to implement.

Technical considerations

The argument I hear most often is that prototypes are wasted time and money because they get trashed anyway. This is absolutely not true! Identifying problems early almost always saves money later on; you don't find anything out until you start showing your work to people; enhancing the quality of the product will help money flow into your pocket once it is deployed; and, most importantly, prototypes don't necessarily need to be, and should not be, trashed.

Low-fidelity prototypes can be more than just 'paper'; they could be digital wireframes that look like sketches. For example, Microsoft offers software that ties sketches (SketchFlow) directly into UI design (Expression Blend) and subsequently into development (Visual Studio) - check out the WebsiteSpark Program for almost free licenses.

Don't bend too much!

Danger! Don't bend the curve from Figure 1 too much, otherwise you end up with

  • a highly functional 'prototype' without design, i.e. without visual cues as to whether your client/users will like it (buy it) and without validation that you got your information architecture right, OR
  • a highly visual 'prototype' that looks sharp, sexy and slick but cannot be used and lacks usability ("we just installed the app and now our users complain they [...]" - substitute the appropriate phrase for yourself ;)

Proof-of-concept

Creating medium- to high-fidelity prototypes can be considered proof-of-concept and can be beneficial to, or sometimes even required by, your project. Looking at Figure 1, that would mean moving their respective dots from Design/Implementation to an earlier phase.

What are your experiences?

Do you use / re-use multiple prototypes within your projects? Do your project structures support prototyping? To which extent?


User-Centered Design (UCD) Methods: Comparison and Overview

Posted by Michael Gaigg

This is the first in a series of blog posts describing user-centered design methods. My goal is to summarize my experience, insights and findings from the literature and compile them into easy and quick-to-digest pieces for you to consume. I encourage you to comment with your own experiences and give me feedback on why your company applies certain methods differently, not at all, or does something else altogether.

I personally don't like the term Usability too much; it's an empty buzzword. It means SOMETHING to everybody but isn't scientific enough to be taken seriously. It's often interpreted wrongly and plainly misunderstood by most. It's kind of like Psychology: we know it is important to understand fundamental human behavior, its problems and remedies, but I wouldn't pay a dime to go to a psychologist. But who knows - just as Psychology gained its scientific relevance and acknowledgment, partly perhaps through the 'invention' of the IQ, hopefully Usability rises to similar levels (Jeff Sauro offers interesting metrics via SUM, the Single Usability Metric).

That's why I like the term User-centered Design. It works wonders with project managers and the like, probably because Design is such an important term in their daily work. And when asked about usability testing, I can conveniently point out that it is only one tool of many in my UCD toolbox. But the really important sales trick is to know which UCD method is best used at what time in the project management cycle.

The following chart compares the most common user-centered design methods, outlines their cost and shows when to use them:

Overview of user-centered design methods

Comparison of User-centered Design (UCD) Methods

Method                      | Cost   | Output            | Sample Size | When to Use
----------------------------|--------|-------------------|-------------|---------------------------
Competitive Studies         | Medium | Stat. & Non-Stat. | 5           | Requirements
Focus Groups                | High   | Non-Statistical   | 6-9         | Requirements
Field Studies               | High   | Non-Statistical   | 2-3         | Requirements
Heuristic Evaluation        | Low    | Statistical       | 2-3         | Design
Paper Prototyping           | Medium | Stat. & Non-Stat. | 5           | Design
Card Sorting                | High   | Statistical       | 15-20       | Design
Participatory Design        | Low    | Non-Statistical   | n/a         | Design
User Testing                | Medium | Stat. & Non-Stat. | 5           | Design & Evaluation
Surveys                     | Low    | Statistical       | 20+         | Requirements & Evaluation
Interviews                  | High   | Non-Statistical   | 3-5         | Requirements & Evaluation
Server Traffic Log Analysis | Low    | Statistical       | n/a         | Evaluation
Search Log Analysis         | Low    | Statistical       | n/a         | Evaluation
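To make the comparison actionable, the table can be encoded as data and filtered by phase and cost. A sketch with values copied from the table above (the `methods_for` helper is my own, not from any library):

```python
# (method, cost, applicable phases) - values taken from the comparison table.
METHODS = [
    ("Competitive Studies",          "Medium", ["Requirements"]),
    ("Focus Groups",                 "High",   ["Requirements"]),
    ("Field Studies",                "High",   ["Requirements"]),
    ("Heuristic Evaluation",         "Low",    ["Design"]),
    ("Paper Prototyping",            "Medium", ["Design"]),
    ("Card Sorting",                 "High",   ["Design"]),
    ("Participatory Design",         "Low",    ["Design"]),
    ("User Testing",                 "Medium", ["Design", "Evaluation"]),
    ("Surveys",                      "Low",    ["Requirements", "Evaluation"]),
    ("Interviews",                   "High",   ["Requirements", "Evaluation"]),
    ("Server Traffic Log Analysis",  "Low",    ["Evaluation"]),
    ("Search Log Analysis",          "Low",    ["Evaluation"]),
]

def methods_for(phase, max_cost=None):
    """List methods suited to a project phase, optionally capped by cost."""
    order = {"Low": 0, "Medium": 1, "High": 2}
    return [name for name, cost, phases in METHODS
            if phase in phases
            and (max_cost is None or order[cost] <= order[max_cost])]

# Example: cheap methods for the Evaluation phase.
cheap_eval = methods_for("Evaluation", max_cost="Low")
```

Asking "which low-cost methods fit the phase I am in?" is exactly the question the table is meant to answer for a project manager.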

Not long ago, after completing a full project management cycle (requirements, design, implementation and evaluation), a PM proudly announced that he would run a Focus Group with his stakeholders. Showing the finished application, he thought, would surely impress them and lead to valuable feedback for the next milestone. This impulse isn't uncommon, but it has to be fought before it becomes reality. Does he really want to produce MORE and EXPENSIVE requirements? Because that's the output of focus groups. Wouldn't he be better off running two iterations of User Testing to reveal usability issues, or a Survey to receive input from outside the development environment?

Recommendations

  • Requirements:
    • Competitive Studies
    • Interviews
    • Field Studies
  • Design:
    • Heuristic Evaluation
    • Paper Prototyping
    • User Testing
  • Evaluation:
    • Surveys
    • Server Traffic Log Analysis
    • Search Log Analysis
    • User Testing
  • The usability effort is NOT proportional to the size of the project: bigger projects spend a smaller percentage of their budget on UCD for the same effort. Regardless, as a rule of thumb, assign 10% of the project's budget to UCD.
  • Faster iterations of prototype design require fewer testers per round
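The last point can be grounded in Nielsen and Landauer's problem-discovery model: if each tester finds a given usability problem with probability L (commonly estimated around 0.31), then n testers are expected to uncover a share of 1 - (1 - L)^n of all problems. A small sketch:

```python
def problems_found(n, L=0.31):
    """Expected share of usability problems found by n testers,
    each finding a given problem with probability L (Nielsen & Landauer)."""
    return 1 - (1 - L) ** n

# With L = 0.31, five users already uncover roughly 85% of problems,
# which is why several small rounds beat one big round of testing.
share_with_five = problems_found(5)
```

The curve flattens quickly: going from 5 to 15 testers in a single round buys little, whereas spending those testers on a second and third iteration catches the problems introduced by each redesign.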
