Designing user interfaces isn't about sexy graphics, shiny buttons or slick navigation (alone).
It's about taking care of the influential factors that make or break the success of a web application or website.
It's a delicate balance of user needs and business requirements, deeply understood and carefully melded into a design that is loved by all stakeholders (the end-user included).
The sum of all design influencers is the set of constraints that will box in your design decisions. That's not a limitation, it's liberation!
The Design Influencers are:
User Need
Whatever it is that you are planning to build, it needs to be useful to somebody and has to solve a real-world problem. This end-user need is the design's reason for existence; it is its meaning of life.
Context
In which context will your users access the site? Is it through mobile devices on the road? Then a shopping cart will be less important than driving directions or store hours, and screen elements need to be more prominent.
Do users typically enter your site through search? Then your landing pages need to convey who you are and what you do because users won't have seen your fancy homepage (and probably never will).
Culture
Even though cultural differences across the globe become really important if you build an international site, I rather mean business or sociological culture here. For example, if you plan on building an intranet site but the company's culture doesn't encourage reporting failure or spending time helping other employees, then a forum probably isn't the right choice to offer.
Client Needs
While your client is ideally well informed about their end-users' needs, they also have to run a business, satisfy stakeholders, fulfill legal mandates, etc. And that's when compromising on perfect usability is sometimes necessary and important.
Technology
What's the available technology? Very often the vendor or the client's platform of choice dictates the choice of technology, e.g. a Microsoft shop will prefer .NET and Silverlight (it's been a long time since I mentioned Silverlight, so I mention it again) over Flex.
If something isn't viable or possible today, that doesn't mean it won't be in a year. So think ahead and design your site accordingly, i.e. extensible, modular, maintainable.
What I've found is that sometimes it's worth including an "upsell" item in your mockups, something that the client hasn't explicitly asked for but that may open their eyes and hopefully their wallets. Mostly you may defer these items to a later phase, but it gives everybody a long-term vision and, as a side effect, supports designs that are extensible.
Budget
It's been said that anything can be done if you only have enough time and money, but the real world doesn't spin like that. Your design is constrained by a budget, and that's a good thing because it forces you to stay realistic and find the right balance between innovation and familiarity.
Sponsors
If the main sponsor is Esri (my current employer), I'd better make sure that there is a map in the interface. What sounds like a designer's nightmare is the name of the game.
Lifespan
How long will your design need to stand the test of time? Is it 1 year or 10? A demo doesn't need to be as polished or thought-out as a content-management system that will take over the client's communication platform. It is the classical "let's get it done" versus "let's think about this one last time". I've written a more detailed article about Lifespan as an important Design Decision.
Law
Accessibility is mandated by law and therefore cannot be removed from the equation. Your fancy design elements might just not be compliant with the law (or be too expensive to make compliant). Acquire knowledge about accessibility laws (e.g. Section 508 in the US) and their implementation specifics, and know how that translates into your design.
No doubt, user testing increases the usability and acceptance of your website, and it can and should be done as early as possible, preferably during Prototyping.
The following blog entry discusses the advantages and disadvantages of remote user testing, describes time estimates and costs, and explains what a session looks like using TechSmith's UserVue.
Advantages
- Reduced (or no) travel time and expenses.
- Higher exposure through easy screen sharing (managers can sneak in easily).
- Actual user environment, familiar and comfortable.
- Possibly fewer drop-outs.
Disadvantages
- Facilitator not physically present (the degree of separation can be challenging).
- Can't see facial expressions or non-verbal cues.
- Difficult to build rapport and trust.
- Difficult to control environment.
- Possibly technical difficulties (firewall, etc.).
- Setup and use of software or usability lab might be challenging and requires a liaison.
Time estimates & Costs
The following time estimates should be taken with a grain of salt; they can change significantly up or down depending on project size, team experience, and infrastructure.
- Preparation (18 hours)
- User screening: 8 hours
- Task creation: 8 hours
- Environment: 2 hours
- User testing (10 hours)
- 2 hours per user
- 5 users per round
- Post-test (32 hours)
- Test report: 16 hours
- Implementation: 12 hours
- Communication: 4 hours
Summary: our time
- 1 round = 60 hours
- 2 rounds = 110 hours
- 3 rounds = 160 hours
Add user compensation and time for additional observers on top of that.
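The totals above are consistent with a first round costing the full 60 hours and each follow-up round adding about 50; the assumption that roughly 10 hours of preparation are one-time is mine, inferred from the totals. A minimal sketch:

```python
# Rough cost model for remote user-testing rounds, based on the
# estimates above. The 60/+50 split for additional rounds is an
# assumption inferred from the published totals, not a hard rule.
FIRST_ROUND_HOURS = 18 + 10 + 32   # preparation + testing + post-test = 60
ADDITIONAL_ROUND_HOURS = 50        # later rounds reuse ~10h of one-time prep

def total_hours(rounds: int) -> int:
    """Total effort in hours for the given number of testing rounds."""
    if rounds < 1:
        return 0
    return FIRST_ROUND_HOURS + (rounds - 1) * ADDITIONAL_ROUND_HOURS

for n in (1, 2, 3):
    print(f"{n} round(s) = {total_hours(n)} hours")
```

Tweak the two constants to match your own team's preparation and reporting overhead.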
How it works
UserVue is remote user-testing software that enables a Facilitator to remotely observe a Participant, using a phone line for communication. Multiple Observers can passively join the Session and share their observations with the Facilitator.
Morae Manager uses the collected data (observation markers and notes, video, keyboard and mouse input) to analyze and calculate task times, error rates, and other common measurements.
A Session is initiated by the Facilitator. Invitation emails are sent to the Participant and Observer(s).
At the announced time all the involved parties need to download a small software bundle that allows them to connect to the UserVue software. The Facilitator then calls the Participant and gives instructions on how to start the Session.
After the session has ended, the installed software bundle is removed from the involved parties' computers.
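The session flow described above can be modeled as a simple state sequence. This sketch is purely illustrative: UserVue exposes no such API, and the states merely encode the steps from the text.

```python
# Illustrative model of the UserVue session lifecycle described above.
# UserVue has no public API like this; the states simply encode the
# steps from the text (initiate, invite, connect, run, end, clean up).
from enum import Enum, auto

class SessionState(Enum):
    INITIATED = auto()    # Facilitator starts the session
    INVITED = auto()      # invitation emails sent to Participant and Observers
    CONNECTED = auto()    # parties download the client bundle and connect
    RUNNING = auto()      # Facilitator calls the Participant; recording begins
    ENDED = auto()        # session over
    CLEANED_UP = auto()   # client bundle removed from all machines

# The lifecycle is strictly linear:
FLOW = [SessionState.INITIATED, SessionState.INVITED, SessionState.CONNECTED,
        SessionState.RUNNING, SessionState.ENDED, SessionState.CLEANED_UP]

def next_state(state: SessionState) -> SessionState:
    """Advance the session to its next lifecycle step."""
    i = FLOW.index(state)
    if i == len(FLOW) - 1:
        raise ValueError("session already cleaned up")
    return FLOW[i + 1]
```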
System requirements:
- Microsoft Windows 2000, Windows XP, or a later version of Windows.
- Internet Explorer 5.0 or later, or Firefox 1.0 or later.
All communication with the UserVue Web site is performed over an encrypted Secure Sockets Layer transport mechanism (HTTPS). All session data is encrypted with a 128-bit Blowfish cipher as it is sent over the network.
Can anyone "eavesdrop" on my session?
All session data (including audio, video, chat data, etc.) is encrypted with a 128-bit Blowfish cipher as it is sent over the network. This makes it exceedingly difficult for anyone to intercept and observe session data.
Are copies of my session data stored anywhere?
No copies of session data are stored on any server. The only recording happens directly on the facilitator's computer. Session data may pass through TechSmith's servers to facilitate firewall and NAT traversal. However, this data is never stored. Also, this data is undecipherable as it is in an encrypted form as it passes through TechSmith's servers.
What are your experiences with Remote User Testing?
- Bolt, N.; Guide to Remote Usability Testing; http://www.ok-cancel.com/archives/article/2006/07/guide-to-remote-usability-testing.html
- Gough, D., Phillips, H.; Remote Online Usability Testing: Why, How, and When to Use It; http://www.boxesandarrows.com/view/remote_online_usability_testing_why_how_and_when_to_use_it
What it is
A focus group is a moderated discussion that lasts about two hours and covers a range of pre-selected topics.
In traditional focus groups, a screened (qualified target audience) group of respondents gathers in the same room. A moderator guides the group through a loosely structured discussion that probes attitudes about a client's proposed products or services. The moderator is typically given a list of objectives or an anticipated outline. Additional questions might serve to initiate open-ended discussions.
When to use
Pros / Gains
- Discover what users want, desire, and believe
- Observe group dynamics and organizational issues
- Reveal users' spontaneous reactions and ideas
Cons / Disadvantages
- Don't trust what people say or claim to do; stated behavior often differs from actual behavior
- Possible bias through specialized groups
How to perform
- Select representative participants.
- Identify problem area (what you want to learn).
- Prepare a script for the moderator to follow.
- Hire a skilled moderator (facilitator).
- Allow flexibility during the test to keep the discussion flowing.
- Tape and/or observe the test.
- Take thorough notes during the test.
This is the first in a series of blogs describing User-centered Design Methods. My goal is to summarize my experience, insights, and findings across the literature and compile them into easy-to-digest pieces for you to consume. I want to encourage you to share your own experiences in the comments and give me feedback on why your company applies certain methods differently, not at all, or does something else altogether.
I personally don't like the term Usability too much; it's an empty buzzword. It means SOMETHING to everybody but isn't scientific enough to be taken seriously. It's often interpreted wrongly and plainly misunderstood by most. It's kind of like Psychology: we know it is important to understand fundamental human behavior, its problems and remedies, but I wouldn't pay a dime to go to a Psychologist. But who knows: just as Psychology gained its scientific relevance and acknowledgment, partly maybe through the 'invention' of the IQ, hopefully Usability rises to similar levels (Jeff Sauro offers interesting metrics via SUM, the Single Usability Metric).
That's why I like the term User-centered Design. It works wonders with Project Managers and the like, probably because Design is such an important term in their daily work. And when asked about Usability testing, I can conveniently point out that this is only one tool of many in my UCD toolbox. But the really important sales trick is knowing which UCD method is best used at what point in the project management cycle.
The following chart compares the most common user-centered design methods, outlines their cost and shows when to use them:
Overview of user-centered design methods
| Method | Cost | Output | Sample Size | When to Use |
| --- | --- | --- | --- | --- |
| Competitive Studies | Medium | Stat. & Non-Stat. | 5 | Requirements |
| Paper Prototyping | Medium | Stat. & Non-Stat. | 5 | Design |
| User Testing | Medium | Stat. & Non-Stat. | 5 | Design & Evaluation |
| Surveys | Low | Statistical | 20+ | Requirements & Evaluation |
| Interviews | High | Non-Statistical | 3-5 | Requirements & Evaluation |
| Server Traffic Log Analysis | Low | Statistical | n/a | Evaluation |
| Search Log Analysis | Low | Statistical | n/a | Evaluation |
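The comparison table can also be expressed as a small lookup structure, which makes the "which method fits which phase" question mechanical. The method names and attributes come from the table; the data layout and helper function are my own illustration, not any real library's API.

```python
# The UCD methods comparison table as a lookup structure. Method names,
# costs, and phases are taken from the table above; the field names and
# helper function are illustrative only.
UCD_METHODS = [
    {"method": "Competitive Studies",         "cost": "Medium", "phases": {"Requirements"}},
    {"method": "Paper Prototyping",           "cost": "Medium", "phases": {"Design"}},
    {"method": "User Testing",                "cost": "Medium", "phases": {"Design", "Evaluation"}},
    {"method": "Surveys",                     "cost": "Low",    "phases": {"Requirements", "Evaluation"}},
    {"method": "Interviews",                  "cost": "High",   "phases": {"Requirements", "Evaluation"}},
    {"method": "Server Traffic Log Analysis", "cost": "Low",    "phases": {"Evaluation"}},
    {"method": "Search Log Analysis",         "cost": "Low",    "phases": {"Evaluation"}},
]

def methods_for_phase(phase: str) -> list[str]:
    """Return the names of methods applicable in a given project phase."""
    return [m["method"] for m in UCD_METHODS if phase in m["phases"]]

print(methods_for_phase("Evaluation"))
```

A PM asking "what can we still do now that the application is built?" is effectively calling `methods_for_phase("Evaluation")`, which notably does not include Focus Groups.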
Not long ago, after having completed a full project management cycle (requirements, design, implementation, and evaluation), the PM proudly announced that he would run a Focus Group with his stakeholders. Showing them the finished application, he thought, would surely impress them and lead to valuable feedback for the next milestone. This impulse isn't uncommon but has to be fought before it becomes reality. Does he really want to produce MORE and EXPENSIVE requirements? Because that's the output of Focus Groups. Wouldn't he be better off running two iterations of User Testing to reveal usability issues, or a Survey to receive input from outside the development environment?
- Competitive Studies
- Field Studies
- Heuristic Evaluation
- Paper Prototyping
- User Testing
- Server Traffic Log Analysis
- Search Log Analysis
- User Testing
- Usability effort is NOT proportional to the size of the project: bigger projects spend a smaller percentage of their budget on UCD for the same effect. Regardless, as a rule of thumb, assign 10% of the project's budget to UCD.
- Faster iterations of prototype design require fewer testers
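The 10% rule of thumb above is easy to put into numbers. The project figures in this sketch are invented for illustration:

```python
# Rule-of-thumb UCD budgeting from the note above: assign ~10% of the
# project budget to UCD, regardless of project size. The example
# budget figure is made up for illustration.
UCD_SHARE = 0.10

def ucd_budget(project_budget: float, share: float = UCD_SHARE) -> float:
    """Portion of the project budget reserved for UCD work."""
    return project_budget * share

print(ucd_budget(200_000))  # budget for UCD on a hypothetical $200k project
```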
- Fidgeon, T.; User-centered design (UCD) - 6 methods; Nov. 2005; http://www.webcredible.co.uk/user-friendly-resources/web-usability/user-centered-design.shtml
- IBM; User-Centered Design Principles; https://www-01.ibm.com/software/ucd/
- Nielsen, J.; Field Studies Done Right: Fast and Observational; 01/20/2002; http://www.useit.com/alertbox/20020120.html
- Nielsen, J.; How Big is the Difference Between Websites; 01/19/2004; http://www.useit.com/alertbox/20040119.html
- Nielsen, J. et al.; Usability in Practice: Three-Day Intensive Camp; April 2006; Proceedings, Usability Week 2006