These are my slides from the tech session held at the Esri DevSummit 2013 in Palm Springs, CA.
The session teaches participants best practices for reviewing, conceptualizing, designing and building user-centered mapping applications in a competitive business environment. Methods, techniques and tools for improving the user experience and designing useful and appealing front-end interfaces will be discussed.
What makes a Map App successful?
It sounds so easy and obvious. It's the basics, the 101 of analysis: Input-Analysis-Output. I usually skip over the introductions of books, especially when I know the subject matter, like GIS, but for some reason I started reading "The Esri Guide to GIS Analysis, Volume 1" (by Andy Mitchell, Esri Press), and it struck me like lightning: this is exactly what we should be doing:
You start an analysis by figuring out what information you need. This is often in the form of a question. Where were most of the burglaries last month? How much forest is in each watershed? Which parcels are within 500 feet of this liquor store? Being as specific as possible about the question you're trying to answer will help you decide how to approach the analysis, which method to use, and how to present the results.
Other factors that influence the analysis are how it will be used and who will use it. You might simply be exploring the data on your own to get a better understanding of how a place developed or how things behave; or you may need to present results to policy makers or the public for discussion, for scientific review, or in a courtroom setting. In the latter cases, your methods need to be more rigorous, and the results more focused.
Frame the Question
Framing the question correctly will tell you:
- The problem you are trying to solve
- The approach of the analysis you want to use
- Which methods to use
- How to present the results
Who & How
Other factors that influence the analysis are:
- Who will use it?
- How will they use it?
- How are the results being used?
All this will impact your design: what you should focus on and how to lay out the elements on the page. Consider:
- Get the user to the location they are interested in quickly
- Create a clear call to action that allows users to get answers to their questions
- Simplify the analysis workflow
- Provide means to use or export the results
This is the first in a series of blog posts describing User-centered Design methods. My goal is to summarize my experience, insights, and findings from the literature and compile them into easy-to-digest pieces for you to consume. I encourage you to comment with your own experiences and give me feedback on why your company applies certain methods differently, not at all, or does something else altogether.
I personally don't like the term Usability too much; it's an empty buzzword. It means SOMETHING to everybody but isn't scientific enough to be taken seriously, and it's often interpreted wrongly and plainly misunderstood. It's kinda like Psychology: we know it is important to understand fundamental human behavior, its problems and remedies, but I wouldn't pay a dime to go to a psychologist. But who knows; just as Psychology gained scientific relevance and acknowledgment, partly perhaps through the 'invention' of the IQ, hopefully Usability rises to similar levels (Jeff Sauro offers interesting metrics via SUM, the Single Usability Metric).
That's why I like the term User-centered Design. It works wonders with Project Managers and the like, probably because Design is such an important term in their daily work. And when asked about Usability testing, I can conveniently point out that it is only one tool of many in my UCD toolbox. The really important sales trick, though, is knowing which UCD method is best used at what point in the project management cycle.
The following chart compares the most common user-centered design methods, outlines their cost and shows when to use them:
Overview of user-centered design methods
| Method | Cost | Output | Sample Size | When to Use |
| --- | --- | --- | --- | --- |
| Competitive Studies | Medium | Statistical & non-statistical | 5 | Requirements |
| Paper Prototyping | Medium | Statistical & non-statistical | 5 | Design |
| User Testing | Medium | Statistical & non-statistical | 5 | Design & Evaluation |
| Surveys | Low | Statistical | 20+ | Requirements & Evaluation |
| Interviews | High | Non-statistical | 3-5 | Requirements & Evaluation |
| Server Traffic Log Analysis | Low | Statistical | n/a | Evaluation |
| Search Log Analysis | Low | Statistical | n/a | Evaluation |
Not long ago, after completing a full project management cycle (requirements, design, implementation, and evaluation), a PM proudly announced he would run a Focus Group with his stakeholders. Showing them the finished application, he thought, would surely impress them and lead to valuable feedback for the next milestone. This impulse isn't uncommon, but it has to be fought before it becomes reality. Does he really want to produce MORE, and EXPENSIVE, requirements? Because that's the output of Focus Groups. Wouldn't he be better off running two iterations of User Testing to reveal usability issues, or a Survey to receive input from outside the development environment?
- Competitive Studies
- Field Studies
- Heuristic Evaluation
- Paper Prototyping
- User Testing
- Server Traffic Log Analysis
- Search Log Analysis
- The Usability effort is NOT proportional to the size of the project: bigger projects spend a smaller percentage of their budget on UCD for the same effort. Regardless, as a rule of thumb, assign 10% of the project's budget to UCD.
- Faster iterations of prototype design require fewer testers
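The small sample sizes above (around 5 testers per iteration) line up with Nielsen and Landauer's well-known problem-discovery model, which is my addition here, not part of the original slides. It estimates the share of usability problems found by n testers as 1 - (1 - L)^n, where L is the average proportion of problems a single tester uncovers (Nielsen's studies put it around 31%). A minimal sketch:

```python
def problems_found(n, L=0.31):
    """Expected fraction of usability problems uncovered by n testers,
    per the Nielsen/Landauer discovery model: 1 - (1 - L)^n.
    L is the average share of problems one tester reveals."""
    return 1 - (1 - L) ** n

# With the commonly cited L = 0.31, diminishing returns set in quickly:
for n in (1, 3, 5, 10):
    print(f"{n:2d} testers -> {problems_found(n):.0%} of problems found")
```

Under these assumptions, 5 testers already uncover roughly 85% of the problems, which is why several short test rounds with few users beat one large round: each iteration fixes what it found, and the next round starts hunting fresh problems.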
- Fidgeon, T.; User-centered design (UCD) - 6 methods; Nov. 2005; http://www.webcredible.co.uk/user-friendly-resources/web-usability/user-centered-design.shtml
- IBM; User-Centered Design Principles; https://www-01.ibm.com/software/ucd/
- Nielsen, J.; Field Studies Done Right: Fast and Observational; 01/20/2002; http://www.useit.com/alertbox/20020120.html
- Nielsen, J.; How Big is the Difference Between Websites; 01/19/2004; http://www.useit.com/alertbox/20040119.html
- Nielsen, J. et al.; Usability in Practice: Three-Day Intensive Camp; April 2006; Proceedings, Usability Week 2006