
Navigating Help - Testing Information Architecture with Treejack



Patty Gale, Principal Learning Content Developer, Autodesk, Inc.

April 15, 2015


Can people find what they're looking for?

Is the taxonomy and navigation labeling appropriate?

Is the help content hierarchy well structured?

Goal: In the help system…

Presenter Notes
We know that many folks rely exclusively on Search. However, this assumes that you are familiar enough with the software to know what keywords to use in your search. New users, or those looking for features or functionality with which they are not yet familiar, often use a combination of Search, TOC, and related links to find information. Several users have described an iterative approach: they search, then look at the search results to see if that keyword gives them the results they are looking for. If not, they check the TOC of a likely search result to see what terms are used nearby, and refine their search using those labels. They might also use a related link on a search result page to get closer to the information they need, following the scent of information.

[CLICK] So the labels we use in the TOC and in topic titles are very important to helping users be successful.

[CLICK] We also want to make sure that we are matching the user's mental model: Do the large groupings of topics reflect their workflow, or the way they think about their work?

The Revit Learning Content Development team has recently completed a project to improve search results by refining topic titles, creating short descriptions, and adding important keywords to every topic. In this testing, we wanted to make sure that the TOC uses the correct labels, and that its structure is easily understood and navigated by our users.

While the testing I'm about to describe is specific to the online help, you can also use this method to test the information architecture of a web site, or perhaps even the organization of a ribbon-based or menu-based user interface.


Online user study: recruited users

Tool: Treejack by Optimal Workshop

Method

Presenter Notes
I worked with our user research staff to recruit users to participate. For this test, 123 users ultimately participated, which gave us plenty of data to work with. For the tool, I chose Treejack by Optimal Workshop. This tree testing tool helps you test your information architecture without visual distractions from the user interface.


Write text for welcome, instructions, and thank you

Design pre- and post-survey questions

Get the site map/help hierarchy into Treejack

Define tasks for users: what info do you want them to look for?

Identify correct answers for each task

Run a pilot to identify and iron out wrinkles

Create the study (similar to SurveyMonkey®)

Presenter Notes
[ALLOW THE SLIDE TO BUILD OUT AUTOMATICALLY] The process for using Treejack is very similar to using SurveyMonkey. It's a web-based application, and lots of intelligent defaults make it easy to implement.

For Treejack, you need to input the site map, either manually or by importing it as a spreadsheet. You also need to define the tasks that you want the users to perform, or tell them the information you want them to find. Think carefully about the areas of your IA that you are perhaps worried about, and be sure to create tasks designed to test those areas.

Treejack suggests that 9 tasks are all that are required to get a sense of how well your IA is performing. We have such a large site map that we decided to go with a total of 20 tasks. Each user, however, would respond to only 12 of those tasks, randomly selected from the bank of 20 (see the sketch below). In our pilot study with internal users, they experienced survey fatigue when we used more tasks than that: the tasks started to feel repetitive, and users began to learn their way around the help hierarchy, perhaps making the results less valuable.

I'll show you more of the process of creating the study in the demo, if we get that far.
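Those two mechanical steps lend themselves to a short sketch. This is not from the deck: it assumes Python, a hypothetical nested-dict help hierarchy, and an indented-column CSV layout for the tree import (check Optimal Workshop's documentation for the exact format its importer accepts; Treejack also does the random task selection for you, so that line only illustrates the idea).

import csv
import random

# Hypothetical fragment of the help hierarchy (not the real Revit TOC).
help_tree = {
    "Document and Present the Design": {
        "Print": {"Print Views and Sheets": {}},
    },
    "Model the Design": {},
}

def tree_to_rows(node, depth=0):
    # One row per topic; each level of the hierarchy shifts one column right.
    for label, children in node.items():
        yield [""] * depth + [label]
        yield from tree_to_rows(children, depth + 1)

with open("help_tree.csv", "w", newline="") as f:
    csv.writer(f).writerows(tree_to_rows(help_tree))

# Each participant answers 12 tasks drawn at random from the bank of 20.
task_bank = [f"Task {n}" for n in range(1, 21)]
tasks_for_participant = random.sample(task_bank, 12)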


Send the link to participants

Run the study

Participants perform the study

You can look at survey results any time


A sample task

Presenter Notes
In the study, participants are presented with a task, along with the help hierarchy. They navigate the hierarchy to locate the topic title where they expect to find the desired information. Participants can navigate up and down the hierarchy as needed; every click and choice is recorded for later analysis. When they locate the topic they believe is correct, they click it and nominate it as their answer. By presenting the help hierarchy without its usual user interface, Treejack lets users focus on the labels, the words used in the topic titles, without other distractions.


Each participant responded to 12 tasks out of 20

123 users participated

Pre- and post-survey questions provided insights about these users

In the Treejack study…


Summary of results: Overall Success

Optimal Workshop says that an overall success score of "68% is at the high end of average for websites in general," so we did pretty well, but there is room for improvement.


Summary of results: Overall Directness

Optimal Workshop says a Directness value of 70% is pretty good. However, the overall score for each task "is a weighted average of the squares of success and directness, favoring success over directness at a ratio of 3:1, scaled to be a value out of 10." In other words, Success (finding the information at all) is more important than Directness (finding it on the first try).
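Read literally, with success s and directness d as fractions between 0 and 1, that description works out to the small function below. This is my reading of the quote, not a formula published by Optimal Workshop, so treat it as a sketch.

def overall_score(success, directness):
    # 3:1 weighted average of the squares, scaled from [0, 1] to [0, 10].
    return 10 * (3 * success**2 + directness**2) / 4

# With the study-wide averages quoted above:
print(round(overall_score(0.68, 0.70), 1))  # -> 4.7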


Let's examine results for one task. The overall score is on a range from 1 (low) to 10 (high); the legend below explains the color segments (a small classification sketch follows the legend).

Red = lots of users did not find the correct topic

Gray = a few users skipped this task

Dark green = some users went directly to the correct topic

Light green = some users wandered before finding it
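To make that legend concrete: each participant's attempt at a task falls into exactly one of the four segments. A minimal, hypothetical classifier (not from the deck), assuming per-attempt records of whether the task was skipped, succeeded, and was reached directly:

from collections import Counter

def segment(attempt):
    # Map one recorded attempt to its color in the results bar.
    if attempt["skipped"]:
        return "gray"        # skipped the task
    if not attempt["succeeded"]:
        return "red"         # nominated an incorrect topic
    if attempt["direct"]:
        return "dark green"  # went straight to the correct topic
    return "light green"     # wandered, but found it

# Hypothetical attempts for one task:
attempts = [
    {"skipped": False, "succeeded": True, "direct": True},
    {"skipped": False, "succeeded": True, "direct": False},
    {"skipped": False, "succeeded": False, "direct": False},
]
print(Counter(segment(a) for a in attempts))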


What’s a pietree, you ask?

Presenter Notes
[CLICK] Without trying to read the labels, just look at the overall effect of the pietree. It provides a visual representation of the different areas users visited while trying to locate information for the task. Larger dots or pies indicate that more people visited a particular topic. Thicker lines indicate that more users followed a particular path. Green lines indicate a path to a correct topic.

[CLICK] Just by glancing at this pietree, you can see that people are scattered all over, looking for the information. Some found the right path, but many did not. They're confused and lost. This pietree tells us that, while a certain number of users followed the correct path (green) and selected the correct topic (the yellow circle with a green outline), many other users went in completely wrong directions, exploring different parts of the help system before nominating a topic as the correct answer.

So, taking those results to heart, I worked to improve the information architecture. Then I ran another test to see if the results improved. Would users be able to find the information more easily?
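The data behind a pietree is just the set of recorded click paths. Here is a hypothetical sketch (not Optimal Workshop's code) of the aggregation: node size tracks how many users visited a topic, and edge thickness tracks how many users followed a given step.

from collections import Counter

# Hypothetical click paths, one list of topics per participant.
paths = [
    ["Home", "Document and Present the Design", "Print"],
    ["Home", "Model the Design"],
    ["Home", "Document and Present the Design", "Print"],
]

node_visits = Counter(topic for path in paths for topic in path)
edge_counts = Counter((a, b) for path in paths for a, b in zip(path, path[1:]))

print(node_visits.most_common(3))  # bigger pies
print(edge_counts.most_common(3))  # thicker lines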


Same task, different information architecture (Before vs. After)

Presenter Notes
Now compare the previous results for the same task with these results for the improved help hierarchy. [CLICK] In the current edition of the help system, this task gets a much improved score of 7. Success is up to 81%, and 71% of those users got to a correct topic directly, without wandering. The red slice of the pie chart is much smaller than for the earlier study. Lots more dark green = more users went directly to the correct topic. Smaller red area = fewer users nominated an incorrect topic.


Pietrees show a clear difference (Before vs. After)

Presenter Notes
When compared to the earlier pietree, this one shows that users were more confident about their choices. [CLICK] Fewer users wandered about, looking in the wrong places. More users followed a direct path to the correct topic. Notice that, this time around, there are 3 potential correct answers: the target topic, plus 2 other topics where users can get to the desired information. However, only a small portion of users nominated the alternative topics. Most found their way directly to the main topic, "Print Views and Sheets." This side-by-side comparison shows the value of the visual representation: at a glance, you can see that our users had more success with this task in the improved help hierarchy.


Click-through rates tell the story

Before vs. After

Presenter Notes
The title Share the Design is vague; people weren't sure what it meant. Two-thirds of all clicks went to Document the Project, but only one-third of those clicks went on to Share the Design, indicating again that users do not associate Share with Print. Users lose the scent of information here, and start searching other areas of the help for the desired information.

Compare that result with click-through rates for the more recent study: 93% clicked Document and Present the Design, and of those, 91% clicked Print. Users were more confident in their choices here. Completion times reflect this confidence, too: in the earlier study, users took 25.6 seconds on average to complete the task; in the current study, users needed only 17.1 seconds on average to navigate to the topic. (The compounding arithmetic is sketched below.)
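Click-through rates compound along the path, which is why one vague label early on hurts so much. A quick sketch of the arithmetic; the step rates come from the notes above, and treating click shares as user shares is my simplification:

def funnel(step_rates):
    # Fraction of users who survive every step of a click path.
    result = 1.0
    for rate in step_rates:
        result *= rate
    return result

before = funnel([2 / 3, 1 / 3])   # ~22% reached Share the Design
after = funnel([0.93, 0.91])      # ~85% reached Print
print(f"before: {before:.0%}, after: {after:.0%}")  # before: 22%, after: 85%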


We removed a layer in the help hierarchy.
Before: Document the Model > Share the Design > Print
After: Document and Present the Design > Print

“Share the Design” was too vague.

Users lost the scent of information. It didn’t mean “print” to them.

What changed to improve the results?

Presenter Notes
Stronger scent of information = greater confidence, higher success rate


Get labels right. Don't use vague terms that cause users to lose the scent of information.

Match the user's mental model.

Lessons learned


“I like the current system which is based on the design process.” – Jason B.


Demo with Q&A

Presenter Notes
Without a subscription, you can create a Treejack study that contains just 3 tasks and a small site map, and get results from up to 10 users. For a larger study, you can purchase a monthly subscription ($109), a per-survey subscription ($149), or an annual subscription ($990). (Prices as of Feb 2015)

Click the icon to go to the Treejack page for Optimal Workshop, and create a study. Or log into your Optimal Workshop account before the presentation, and have the sample study open in the browser. Open the existing study or create a new one.

Review the tabs:
Settings
Tree (add another item to the tree)
Tasks (add another task, specify a correct answer)
Messages
Questionnaire
Appearance

Click Preview, and go through the study to show how it appears to a user.


Treejack: www.optimalworkshop.com

Information Foraging by Jakob Nielsen http://www.nngroup.com/articles/information-scent/

Search is Not Enough: Synergy Between Navigation and Search by Raluca Budiu http://www.nngroup.com/articles/search-not-enough/

Resources


Icons from TheNounProject.com (CC BY 3.0):
Users by Vittorio Maria Vecchi
Monitor by useiconic.com
Share by Joshua Stearns
Browser by Max Miner
Analytics by Christopher Holm-Hansen

Creative Commons licensed images:
Talent #3 (dog) by Robert Terrell (CC BY-NC-ND 2.0)
Confused by Guudmorning! (CC BY 2.0)

Attributions for images used in this presentation

Autodesk, the Autodesk logo, and Revit are registered trademarks or trademarks of Autodesk, Inc., and/or its subsidiaries and/or affiliates in the USA and/or other countries. All other brand names, product names, or trademarks belong to their respective holders. Autodesk reserves the right to alter product and services offerings, and specifications and pricing at any time without notice, and is not responsible for typographical or graphical errors that may appear in this document. © 2015 Autodesk, Inc. All rights reserved.