What makes a Map App successful?
It sounds so easy and obvious. It's the basics, the 101 of analysis: Input-Analysis-Output. Usually I skip over the introductions of books, especially when I know the subject matter, like GIS. But for some reason I started reading "The Esri Guide to GIS Analysis, Volume 1" (by Andy Mitchell, Esri Press), and it struck me like lightning: this is exactly what we should be doing:
You start an analysis by figuring out what information you need. This is often in the form of a question. Where were most of the burglaries last month? How much forest is in each watershed? Which parcels are within 500 feet of this liquor store? Being as specific as possible about the question you're trying to answer will help you decide how to approach the analysis, which method to use, and how to present the results.
Other factors that influence the analysis are how it will be used and who will use it. You might simply be exploring the data on your own to get a better understanding of how a place developed or how things behave; or you may need to present results to policy makers or the public for discussion, for scientific review, or in a courtroom setting. In the latter cases, your methods need to be more rigorous, and the results more focused.
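A question like "Which parcels are within 500 feet of this liquor store?" maps directly onto a spatial query. Here is a minimal sketch in plain Python, using hypothetical parcel IDs and centroids in a local planar coordinate system measured in feet; real work would use a GIS library and proper projections:

```python
from math import hypot

# Hypothetical parcel centroids in a local planar coordinate system (feet).
parcels = {
    "APN-001": (120.0, 340.0),
    "APN-002": (480.0, 90.0),
    "APN-003": (900.0, 900.0),
}
liquor_store = (100.0, 300.0)
radius_ft = 500.0

# A parcel answers the question if its centroid falls within the 500-foot buffer.
within = [
    apn for apn, (x, y) in parcels.items()
    if hypot(x - liquor_store[0], y - liquor_store[1]) <= radius_ft
]
print(within)  # ['APN-001', 'APN-002']
```

The point is the framing: once the question is this specific, the analysis method (a buffer plus a containment test) and the output (a list of parcels) follow almost automatically.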
Frame the Question
Framing the question correctly will tell you:
- The problem you are trying to solve
- The approach of the analysis you want to use
- Which methods to use
- How to present the results
Who & How
Other factors that influence the analysis are:
- Who will use it?
- How will they use it?
- How are the results being used?
All this will impact your design: what you should focus on and how to lay out the elements on the page. Consider:
- Get the user to the location they are interested in, quickly
- Create clear calls to action that allow users to get answers to their questions
- Simplify the steps needed to run an analysis
- Provide means to use or export the results
No doubt, user testing increases the usability and acceptance of your website, and it can (and should) be done as early as possible, preferably during prototyping.
The following blog entry discusses the advantages and disadvantages of remote user testing, gives time and cost estimates, and explains what a session looks like using TechSmith's UserVue.
Advantages
- Reduced (or no) travel time and expenses.
- Higher exposure through easy screen sharing (managers can sneak in easily).
- Actual user environment, familiar and comfortable.
- Possibly fewer drop-outs.
Disadvantages
- Facilitator is not physically present (the degree of separation can be challenging).
- Can't see facial expressions or non-verbal cues.
- Difficult to build rapport and trust.
- Difficult to control environment.
- Possibly technical difficulties (firewall, etc.).
- Setup and use of software or usability lab might be challenging and requires a liaison.
Time estimates & Costs
The following time estimates should be taken with a grain of salt; they can change significantly up or down depending on the project size, the experience of the team, and the infrastructure.
- Preparation (18 hours)
- User screening: 8 hours
- Task creation: 8 hours
- Environment: 2 hours
- User testing (10 hours)
- 2 hours per user
- 5 users per round
- Post-test (32 hours)
- Test report: 16 hours
- Implementation: 12 hours
- Communication: 4 hours
Summary: Our Time
1 Round = 60 hours
2 Rounds = 110 hours
3 Rounds = 160 hours
+ User compensation
+ Time for additional observers
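The summary figures above can be reproduced with a little arithmetic. A sketch, assuming that after the first round only user screening is repeated while task creation and environment setup are reused (which is what makes each additional round cost about 50 hours):

```python
# The article's time estimates (hours). Assumption: after round 1,
# task creation (8 h) and environment setup (2 h) are reused, which
# reproduces the 60/110/160-hour summary figures.
PREPARATION = {"user_screening": 8, "task_creation": 8, "environment": 2}
TESTING = 2 * 5          # 2 hours per user, 5 users per round
POST_TEST = {"test_report": 16, "implementation": 12, "communication": 4}

first_round = sum(PREPARATION.values()) + TESTING + sum(POST_TEST.values())
extra_round = PREPARATION["user_screening"] + TESTING + sum(POST_TEST.values())

totals = {n: first_round + (n - 1) * extra_round for n in (1, 2, 3)}
print(totals)  # {1: 60, 2: 110, 3: 160}
```

User compensation and observer time sit outside these totals, as the summary notes.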
How it works
UserVue is remote user-testing software that enables a Facilitator to remotely observe a Participant, using a phone line for communication. Multiple Observers can passively join the Session and share their observations with the Facilitator.
Morae Manager uses the collected data (observation markers and notes, video, keyboard and mouse inputs) to analyze and calculate task times, error rates, and other common measurements.
A Session is initiated by the Facilitator. Invitation emails are sent to the Participant and Observer(s).
At the announced time, all involved parties download a small software bundle that allows them to connect to the UserVue service. The Facilitator then calls the Participant and gives instructions on how to start the Session.
After the Session has ended, the installed software bundle is removed from the computers of the involved parties.
System Requirements
Microsoft Windows 2000, Windows XP, or a later version of Windows.
Internet Explorer 5.0 or later; Firefox 1.0 or later.
Security
All communication with the UserVue Web site is performed over an encrypted Secure Sockets Layer (SSL) transport mechanism (HTTPS).
Can anyone "eavesdrop" on my session?
All session data (including audio, video, chat data, etc.) is encrypted with a 128-bit Blowfish cipher as it is sent over the network. This makes it exceedingly difficult for anyone to intercept and observe session data.
Are copies of my session data stored anywhere?
No copies of session data are stored on any server. The only recording happens directly on the facilitator's computer. Session data may pass through TechSmith's servers to facilitate firewall and NAT traversal. However, this data is never stored. Also, this data is undecipherable as it is in an encrypted form as it passes through TechSmith's servers.
What are your experiences with Remote User Testing?
- Bolt, N.; Guide to Remote Usability Testing; http://www.ok-cancel.com/archives/article/2006/07/guide-to-remote-usability-testing.html
- Gough, D., Phillips, H.; Remote Online Usability Testing: Why, How, and When to Use It; http://www.boxesandarrows.com/view/remote_online_usability_testing_why_how_and_when_to_use_it
What it is
A focus group is a moderated discussion that lasts about two hours and covers a range of pre-selected topics.
In traditional focus groups, a screened group of respondents (a qualified target audience) gathers in the same room. A moderator guides the group through a loosely structured discussion that probes attitudes about a client's proposed products or services. The moderator is typically given a list of objectives or an anticipated outline; additional questions might serve to initiate open-ended discussions.
When to use
Pros / Gains
- Discover what users want, desire, and believe
- Observe group dynamics and organizational issues
- Reveal users' spontaneous reactions and ideas
Cons / Disadvantages
- Don't trust what people say or claim to do
- Possible bias through specialized groups
How to perform
- Select representative participants.
- Identify problem area (what you want to learn).
- Prepare a script for the moderator to follow.
- Hire a skilled moderator (facilitator).
- Allow flexibility during the test to keep the discussion flowing.
- Tape and/or observe the test.
- Take good notes during the test.
This is the first in a series of blogs describing User-centered Design Methods. My goal is to summarize my experience, insights, and findings from across the literature and compile them into easy-to-digest pieces for you to consume. I want to encourage you to comment with your own experiences and tell me why your company applies certain methods differently, not at all, or does something else altogether.
I personally don't like the term Usability too much; it's an empty buzzword. It means SOMETHING to everybody but isn't scientific enough to be taken seriously, and it's often misinterpreted and plainly misunderstood. It's kind of like Psychology: we know it is important to understand fundamental human behavior, its problems and remedies, but I wouldn't pay a dime to go to a psychologist. But who knows - just as Psychology gained scientific relevance and acknowledgment, partly maybe through the 'invention' of the IQ, hopefully Usability rises to similar levels (Jeff Sauro offers interesting metrics via SUM, the Single Usability Metric).
That's why I like the term User-centered Design. It works wonders with Project Managers and the like, probably because Design is such an important term in their daily work. And when asked about Usability testing, I can conveniently point out that it is only one tool of many in my UCD toolbox. But the really important sales trick is knowing which UCD method is best used at which point in the project management cycle.
The following chart compares the most common user-centered design methods, outlines their cost and shows when to use them:
Overview of user-centered design methods
| Method | Cost | Output | Sample Size | When to Use |
|---|---|---|---|---|
| Competitive Studies | Medium | Stat. & Non-Stat. | 5 | Requirements |
| Paper Prototyping | Medium | Stat. & Non-Stat. | 5 | Design |
| User Testing | Medium | Stat. & Non-Stat. | 5 | Design & Evaluation |
| Surveys | Low | Statistical | 20+ | Requirements & Evaluation |
| Interviews | High | Non-Statistical | 3-5 | Requirements & Evaluation |
| Server Traffic Log Analysis | Low | Statistical | n/a | Evaluation |
| Search Log Analysis | Low | Statistical | n/a | Evaluation |
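For quick lookups, the chart can also be expressed as a small data structure. The values below are copied from the table; `methods_for` is a hypothetical helper, not part of any existing library:

```python
# The comparison chart above as data: which UCD method fits which project phase.
methods = [
    {"method": "Competitive Studies",         "cost": "Medium", "when": {"Requirements"}},
    {"method": "Paper Prototyping",           "cost": "Medium", "when": {"Design"}},
    {"method": "User Testing",                "cost": "Medium", "when": {"Design", "Evaluation"}},
    {"method": "Surveys",                     "cost": "Low",    "when": {"Requirements", "Evaluation"}},
    {"method": "Interviews",                  "cost": "High",   "when": {"Requirements", "Evaluation"}},
    {"method": "Server Traffic Log Analysis", "cost": "Low",    "when": {"Evaluation"}},
    {"method": "Search Log Analysis",         "cost": "Low",    "when": {"Evaluation"}},
]

def methods_for(phase):
    """Return the names of methods applicable to a given project phase."""
    return [m["method"] for m in methods if phase in m["when"]]

print(methods_for("Evaluation"))
```

Asking `methods_for("Evaluation")` instead of reaching for a Focus Group is exactly the discipline the next anecdote is about.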
Not long ago, after completing a full project management cycle (requirements, design, implementation, and evaluation), the PM proudly announced he would run a Focus Group with his stakeholders. Showing the finished application, he thought, would surely impress them and lead to valuable feedback for the next milestone. This impulse isn't uncommon, but it has to be fought before it becomes reality. Does he really want to produce MORE and EXPENSIVE requirements? Because that's the output of Focus Groups. Wouldn't he be better off running two iterations of User Testing to reveal usability issues, or a Survey to receive input from outside the development environment?
- Competitive Studies
- Field Studies
- Heuristic Evaluation
- Paper Prototyping
- User Testing
- Server Traffic Log Analysis
- Search Log Analysis
- The Usability effort is NOT proportional to the size of the project: bigger projects spend a smaller percentage of their budget on UCD for the same effort. Regardless, as a rule of thumb, assign 10% of the project's budget to UCD.
- Faster iterations of prototype design require fewer testers
- Fidgeon, T.; User-centered design (UCD) - 6 methods; Nov. 2005; http://www.webcredible.co.uk/user-friendly-resources/web-usability/user-centered-design.shtml
- IBM; User-Centered Design Principles; https://www-01.ibm.com/software/ucd/
- Nielsen, J.; Field Studies Done Right: Fast and Observational; 01/20/2002; http://www.useit.com/alertbox/20020120.html
- Nielsen, J.; How Big is the Difference Between Websites; 01/19/2004; http://www.useit.com/alertbox/20040119.html
- Nielsen, J. et al.; Usability in Practice: Three-Day Intensive Camp; April 2006; Proceedings, Usability Week 2006