Have you always wanted to do more UX research but thought it might cost too much, or take too much time? Learn how two UXers, Jodi Bollaert and Megan Schwarz, at Team Detroit (an advertising agency) in Michigan, have used several fast & cheap web-based tools & methodologies to glean valuable user insights for digital automotive projects.
3. UX Research Tools
AB Tests
Accessify
App Cooker
Ask Your Target Market
Axure
Axureland
Bad Usability Calendar
Balsamiq
BUXmarks
Cacoo
Chalkmark
ClickHeat
ClickTale
Clixpy
Concept Feedback
Crazy Egg
Creately
DeviantArt - Android 2.2 GUI
DeviantArt - WebKit Interface Layout Pack
Drawar
Ethnio
Feedback Army
Feng GUI
Five Second Test
Flowella
Fore UI
Get Backboard
Gliffy
Hotgloo
iPlotz
Just in Mind
Just Proto
Kampyle
Keynotopia - Keynote Wireframe Templates
KISSmetrics
Loop 11
Lovely Charts
Lucid Chart
Lumzy
Mechanical Turk
Mockflow
Mockingbird
Mocksup
Mockup Builder
Morae
Mouse Trace
Napkee
Naview
Nav Flow
Omnigraffle
*UXPond.com
4. UX Research Tools (cont’d)
Opengazer
Open Eyes
Open Hallway
Optimal Sort
Paper Browser
Pencil Project
Hibbitts Design - Wireframe Stencils for Microsoft PowerPoint
PlainFrame
Power Mockup
Press 9 For More Options
Protonotes
Protoshare
Readability
Reinvigorate
SessionCam
Silverback
Simple Card Sort
Simple Mouse Tracking
Sketchflow
The Click Test
Tiggr
Total Wireframe
Treejack
Try My UI
UI Sketcher for iPad
Usability Testing Suite
Use it Better
Userfeel
Userfly
Userlytics
Userzoom
User Interface Design Framework
UX Basis
UX Pin
UX Quotes
Quplo
Visual Attention Service -
3M
Webnographer
Website Grader
Websort
What Users Do
Wireframe Sketcher
Yuseo
6. “It Depends”
What do you want to learn?
Who’s your target audience?
What’s your timing & budget?
What will you be testing?
7. UX Research at Team Detroit
2007-2008: Lab-based moderated testing in 1-2 markets; travel required
2009: Remote moderated testing with a nationwide market (WebEx)
2010-2012: Remote moderated testing with a nationwide market recruited via intercept (Ethnio, WorldApp)
*Partner with third-party research companies
8. Challenges We Face Today
• There’s 3X as much to test! (desktop, mobile & tablet)
• Project lifecycles are often short; traditional UX research takes 4-6 weeks to plan, execute & report
• Traditional UX research requires a substantial investment
• Perception that UX research may slow a project down
Opportunity
Web-based tools enable teams to conduct research at a radically lower cost, in less time and with fewer resources.
11. • Observe and hear users as they experience a site or prototype
• Test desktop, tablet or mobile experiences
• What you get:
• About 15 min. of video per test
• A written summary of likes, dislikes & improvement suggestions
14. 3. Share and analyze results.
How It Works – 3 Steps
15. When is Usertesting.com Appropriate?
• You need findings quickly (e.g., within hours)
• Test can be completed in 15 min.
• Asking the exact same questions in exactly the same way is important
• Your audience can be easily recruited online
• Site or prototype is accessible via URL
• Resources are available for test planning, analysis and reporting
• You have a low budget
19. Sample Project: Audi vs. Cadillac
• Team Detroit’s mobile team wanted to learn more about the Audi vs. Cadillac mobile website experience
• What were shoppers’ first impressions?
• Was one navigation style more intuitive/efficient than the other?
• Which experience did they prefer overall?
• Insights were considered in the redesign of a Lincoln mobile site
21. Analysis & Reporting
• Shared videos with the team within hours of launching the test
• Developed a UT report template
• Kept it short; focused on actionable findings
• Created video highlights to underscore key themes
• Delivered the report in person; some team members had watched the videos
• Empowered the team to come up with their own solutions (supported by UXA)
22. Different Project: Team-Based Analysis
• Watched videos together
• Practiced “active observation” (laptops closed; phones off)
• Provided caffeine & chocolate
• Each viewer documented key insights on sticky notes; one per note
• Posted stickies on a wall; worked together to sort them into groups
• Labeled each group to identify key themes
• Brainstormed solutions (later)
23. Usertesting.com Research Ideas
• Observe how users naturally browse a site (e.g., first impressions, where they click, when and where they exit)
• Observe how users complete 1-2 big tasks or 2-3 smaller tasks (e.g., learn about the MKZ, find a different Lincoln vehicle, locate a dealer)
• Observe users experiencing and comparing two competitive sites (e.g., Audi vs. Cadillac)
• Test the desktop vs. tablet vs. mobile experiences with different user groups
• Begin your study with a search engine. How do users begin looking for information like yours? Do they find it or do they get side-tracked?
24. Usertesting.com: Constraints/Lessons Learned
• Reinforce where the participant should be early in the study
• RISKY: “What are your first impressions of this website?”
• BETTER: “What are your first impressions of the FiestaMovement.com website?”
• Avoid leading questions
• LEADING: “Was that awkward?”
• NON-LEADING: “What did you think of that experience?”
• Consider exploratory tasks first, then directive ones
• EXPLORATORY: “Find information about a vehicle that interests you. What did you think of that experience?”
• DIRECTIVE: “Now configure a vehicle with your desired options and features. What did you think of that experience?”
25. Usertesting.com: Constraints/Lessons Learned
• Run a pilot test with one participant before launching the full study
• Check that the duration is about 15 min.
• Ensure that your directions and questions are understood
• Participant no good? Swap them out for a new one!
27. 5 Second Test (Usability Hub)
• Find out what users recall about your design
• FREE with “Karma Points” or monthly subscription pricing
• Easy set-up
• Upload screenshots
• Enter brief instructions
• Use default questions or customize
28. 5 Second Test Example
• Does the Team Detroit home page communicate what we do?
• Asked users, “What is the purpose of this page?”
Results:
• About cars in Detroit
• About Detroit
• Don’t know
30. Click Test (Usability Hub)
• Find out where users would click on your site to get information
• Upload a screenshot, write up a task and specify the number of clicks allowed
• Paid accounts allow for multiple tasks
31. Click Test Example
• Completed a user test of a prototype for a Lincoln Mobile project
• Found mixed expectations for where engine information would be found – Specs or Options
• Needed justification for whether further testing was needed
32. Click Test Example: Methodology
• Couldn’t use the prototype for proprietary reasons, but its labels were reflected on the desktop site
• Wrote up the task and requested 25 responses
Task: “Where would you click to find information about the MKZ engine?”
33. Click Test Example: Analysis
• Users clicked everywhere!
• A closer look at the sub-navigation showed Specs chosen over Options
• Concluded that this test did not confirm an immediate need for additional testing, but future testing may be useful
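The tallying step above can be automated: map each click coordinate onto a labeled region of the screenshot, then count hits per region. A minimal sketch in Python; the regions and coordinates here are hypothetical, not the actual MKZ test data:

```python
from collections import Counter

# Hypothetical hit regions on the screenshot: label -> (x1, y1, x2, y2)
REGIONS = {
    "Specs":   (100, 50, 200, 80),
    "Options": (210, 50, 310, 80),
}

def classify(x, y):
    """Return the label of the region containing the click, or 'Other'."""
    for label, (x1, y1, x2, y2) in REGIONS.items():
        if x1 <= x <= x2 and y1 <= y <= y2:
            return label
    return "Other"

# Hypothetical click coordinates from participants
clicks = [(120, 60), (150, 70), (250, 65), (400, 300), (130, 55)]
tally = Counter(classify(x, y) for x, y in clicks)
print(tally)  # e.g. Counter({'Specs': 3, 'Options': 1, 'Other': 1})
```

A large "Other" count is itself a finding (the "users clicked everywhere" result above), since it shows no region dominated.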
34. Click Test Example: Constraints/Lessons Learned
• Cannot recruit users that fit the target market
• Cannot test in a mobile environment
• Too many options on the desktop site, and not all were relevant for mobile
• Options should have been limited to those included on the prototype
35. Card Sorting
• A method for organizing content on a site or a section of a site
• Users are asked to physically sort content written on separate note cards
• Gives insight into what patterns users see within your site content
36. Card Sorting Example
• A large number of videos and photos were sorted randomly into one gallery on the Ford Fusion YouTube page
• Needed content sorted in an intuitive way
• Goal was to increase user engagement through increased understanding of the content offering
• No budget and no time
37. Card Sorting Example: Methodology
• Cut and pasted content onto large index cards
• Asked the project manager to find 15 users within the organization that fit the target
• Scheduled three 30-minute back-to-back sessions with five users each
• Wrote a script for the facilitator to read to each group
• Users were asked to group content and then write labels once all cards were grouped
• One facilitator and one note taker
• Great article for reference: http://boxesandarrows.com/card-sorting-a-definitive-guide/
38. Card Sorting Example: Analysis
• Recorded results in an Excel spreadsheet template
• Looked for consistencies and inconsistencies between the three groups’ results
• Referenced notes to understand rationale
• Finalized results and shared with the team
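The consistency check above can also be scripted: count how often each pair of cards was grouped together across sessions (a co-occurrence tally). Pairs grouped together by every session are strong candidates for the same category. A minimal sketch, assuming each session's results are recorded as lists of grouped card labels; the card names and groupings below are illustrative, not the actual study data:

```python
from itertools import combinations
from collections import Counter

def co_occurrence(sessions):
    """Count how often each pair of cards was grouped together across sessions."""
    pairs = Counter()
    for groups in sessions:          # one session = a list of groups
        for group in groups:         # one group = a list of card labels
            for a, b in combinations(sorted(group), 2):
                pairs[(a, b)] += 1
    return pairs

# Illustrative data: three sessions sorting four hypothetical cards
sessions = [
    [["Engine video", "Specs video"], ["Interior tour", "Test drive"]],
    [["Engine video", "Specs video", "Test drive"], ["Interior tour"]],
    [["Engine video", "Specs video"], ["Interior tour", "Test drive"]],
]

pairs = co_occurrence(sessions)
print(pairs[("Engine video", "Specs video")])  # 3 -> grouped together by all three sessions
print(pairs[("Interior tour", "Test drive")])  # 2 -> grouped together by two of three
```

Sorting each group before pairing keeps the pair keys canonical, so ("A", "B") and ("B", "A") are counted as the same pair.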
40. Card Sorting Example: Constraints/Lessons Learned
• Constrained to target users within a limited pool
• Informal means of recruiting users may reduce trust in the results within the project team
• Using groups over individuals risks the loudest person in the room influencing the group
• Multiple groups make analysis more difficult due to the potential for inconsistency between groups
42. User Interviews
• A qualitative research method for better understanding users
• Consists of one-on-one conversations with users
• Questions centered on:
• Understanding how and why users have interacted, or might interact, with your site or similar sites
• Understanding the experiences that your site supports
• Identifying what your users’ needs are and why
• Uncovering why your users’ needs are or are not satisfied
43. User Interviews Example: Really Awesome Project
• Doing initial concepting for an upcoming project
• Had budget and time for quantitative research, but not qualitative
• Quantitative results would not be back before work needed to begin
• Needed some directional information about target users to get started
44. User Interviews Example: Methodology
• Asked the project manager to find 5 users within the organization that fit the target
• Met with the project team to develop a list of learning goals for the interviews
• Wrote up interview questions and sent them to the team for approval
• Scheduled half-hour one-on-one sessions with users
• One facilitator and one note taker
• Entire project team welcome to attend sessions
45. User Interviews Example: Analysis
• Input interview answers into an Excel spreadsheet
• Looked for common themes across the individual interviews
• Reported back to the team for feedback and discussion
• Used results to create a rough-draft experience map
http://www.adaptivepath.com/ideas/the-anatomy-of-an-experience-map
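The theme-finding step can be sketched as a simple tally: tag each participant's answers with theme codes while reading through the spreadsheet, then count how many participants mentioned each theme. The theme codes and participant data below are hypothetical, not from the actual project:

```python
from collections import Counter

# Hypothetical coded data: participant -> set of theme codes assigned
# while reading their interview answers in the spreadsheet.
coded = {
    "P1": {"price-sensitivity", "research-online-first"},
    "P2": {"research-online-first", "dealer-distrust"},
    "P3": {"price-sensitivity", "research-online-first"},
    "P4": {"dealer-distrust"},
    "P5": {"research-online-first"},
}

theme_counts = Counter()
for themes in coded.values():
    theme_counts.update(themes)

# Themes mentioned by a majority of participants are candidates for the
# experience map's key stages.
for theme, n in theme_counts.most_common():
    print(f"{theme}: {n}/{len(coded)} participants")
```

Using a set per participant means each theme is counted once per person, so the tally reads as "how many participants raised this" rather than "how many times it was said".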
46. User Interviews Example: Constraints/Lessons Learned
• Constrained to target users within a limited pool
• Interviews were only used to form a hypothesis, with the understanding that additional research is necessary
• Quantitative results will allow us to more confidently finalize the findings
48. Address Real Problems
• Talk to project teams. What are their burning questions or concerns?
• Involve them in research planning and observation
• Acknowledge the constraints of the tool
• Share results as soon as you get them
• Document findings & facilitate next steps
• Be careful what you wish for!
What do you want to learn? Who is your target audience? What’s your timing & budget? What’s the format of the product you’re testing?
Results include videos of participants using the test site and written answers to four customizable follow-up questions.
Or about $5,000 per participant
Or $49 per participant ($39 if you have the basic Enterprise Plan, which is $12,000 per year for 20 tests per month)
In this next example, you’re going to see a woman interacting with the Audi mobile site. Our mobile team at Team Detroit was interested in learning more about what navigation styles work best. I set up a test for them that compared the Audi site to another manufacturer’s site. You’ll hear this participant thinking out loud as she explores the Audi models page.
More ownership of next steps. More likely to be feasible (considering resources, timing, expense).
This is the template I used for analyzing the results. I input all of the results into the template and assigned identifiers so that similar labels mapped to the same identifier. This allowed me to easily see where the patterns existed, which helped me make my final recommendation.