Usability Testing 2011
This page summarizes usability testing performed by Cory Chapman in 2011 and 2012.
- Background analysis and task design: 8 weeks (Dec 2011 - Jan 2012)
- Analyze existing data on Dryad usage and the Personas that have been developed for common types of Dryad users. -- complete
- Develop a set of scenarios that represent the most common uses of the Dryad user interface. -- complete
- Based on the scenarios, develop a set of tasks that can be used in a usability test. (See Future Usability Test Questions#User Workflows) Each task should include:
- Description of the functionality that will be tested.
- Text to be given/spoken to a user.
- List of the types of feedback that should be recorded during the task (e.g., written notes, log file information, screen capture).
- Develop a cardsorting task that users can perform to indicate their preferences for Dryad's menu structure.
- Obtain UNC's IRB approval for usability tests. -- complete
- Recruiting participants: 2 weeks -- complete
- Usability Testing: 3 weeks -- complete
- Analysis/Reporting: 3+ weeks (got sick!) -- in progress
- Full results expected soon!
Cory's other work
Cardsort distance table in comma-separated format: File:Dryad-Cardsorting-Distance-Table.txt
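A distance table like the one linked above can be explored with simple agglomerative clustering to suggest candidate menu groupings. The sketch below is illustrative only: the assumed file layout (a header row of card labels followed by a square, comma-separated distance matrix) and the sample card names are assumptions, not the actual contents of Dryad-Cardsorting-Distance-Table.txt.

```python
# Minimal sketch: derive card groupings from a pairwise distance table.
# The file layout and card names below are hypothetical stand-ins for the
# real cardsort data (0 = always sorted together, 1 = never sorted together).
import csv
from io import StringIO

SAMPLE = """\
card,Depositing Data,Using Data,Journal Policies,Funding
Depositing Data,0.0,0.2,0.8,0.9
Using Data,0.2,0.0,0.7,0.9
Journal Policies,0.8,0.7,0.0,0.3
Funding,0.9,0.9,0.3,0.0
"""

def load_distances(fh):
    """Parse a square, labeled, comma-separated distance matrix."""
    rows = list(csv.reader(fh))
    labels = rows[0][1:]
    dist = {(r[0], lab): float(v) for r in rows[1:] for lab, v in zip(labels, r[1:])}
    return labels, dist

def single_linkage(labels, dist, cutoff):
    """Repeatedly merge the two closest clusters until the smallest
    inter-cluster distance exceeds the cutoff."""
    clusters = [{l} for l in labels]
    while len(clusters) > 1:
        best = min(
            ((a, b) for i, a in enumerate(clusters) for b in clusters[i + 1:]),
            key=lambda ab: min(dist[x, y] for x in ab[0] for y in ab[1]),
        )
        if min(dist[x, y] for x in best[0] for y in best[1]) > cutoff:
            break
        clusters.remove(best[0])
        clusters.remove(best[1])
        clusters.append(best[0] | best[1])
    return clusters

labels, dist = load_distances(StringIO(SAMPLE))
groups = single_linkage(labels, dist, cutoff=0.5)
print(sorted(sorted(g) for g in groups))
```

With the synthetic data above, the two "data handling" cards cluster apart from the two "policy/organization" cards; on the real table the cutoff would need to be tuned against the subjects' verbal explanations.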
A usability test was conducted in February 2012 as a first step in understanding users' preferences and interactions with the Dryad web interface. These interactions can be reduced to three distinct workflows: (1) Learn about Dryad (2) Find and download data (3) Upload data. This paper will describe these three workflows, explain the methods of testing each workflow in the current web interface, and make recommendations based on the results of those tests.
Learn About Dryad
Users arrive at the Dryad web interface in many ways (Fig. 1). Depending on their familiarity with the service, users may try to find more information about Dryad such as its journal affiliations, tutorials on uploading data, funding sources, etc. To find more information, users may read the initial Dryad page at which they arrive, click to find more information on the page, and/or search elsewhere for information about Dryad.
Find and Download Data
Some users who arrive at Dryad's web interface may want to find and download data stored in the Dryad repository. Users may also wish to examine associated metadata in order to determine whether a data package meets their needs before downloading it.
Upload Data
Users may want to contribute data to the Dryad repository. Approaching the Dryad web interface with this goal in mind requires a unique set of learned skills and assumptions about data security, documentation, and the feedback mechanisms of the Dryad web interface.
The procedures were designed to understand how users interact with the current Dryad web interface. Ten subjects were tested. Each test lasted less than 90 minutes.
First, subjects signed a waiver granting the researcher permission to use the subjects' responses as part of this research. Subjects were then presented with a brief pre-survey to understand their preexisting familiarity with the Dryad web interface and with biological data generally.
To test the three workflows listed above, users performed three different tasks: (1) Sort cards representing sections of the "Information" portion of the Dryad website into meaningful groups. (2) Find a specific data package and download it. (3) Upload a provided data package.
Finally, subjects were presented with a post-test to understand their overall opinion of the interface and their feelings on the testing procedure itself.
Each test was conducted individually, and the subjects' faces, voices, and computer screens were recorded. Testing was performed in compliance with the University of North Carolina at Chapel Hill's Institutional Review Board's regulations.
The full text of the pre-test is included in the Appendix of this paper.
Learn About Dryad
The purpose of this test is to better understand where users expect certain pieces of information to be located within the "Information" portion of the website. Based on the results of this test, new information architectures can be designed to better accommodate subjects' expectations. The test was in two parts:
Part 1: Subjects performed an open card sorting exercise in which they were presented with 35 cards, each of which represented a section of the "Information" portion of the Dryad website. Subjects were asked to create any number of categories and to place each card in one of those categories. Once the categorization was completed, subjects verbally explained their reasoning to the researcher.
Part 2: Subjects were asked to find specific items of information within the current "Information" portion of the website. The specific items were not section titles, and they were phrased differently than the answers appeared on the website.
Find and Download Data
The purpose of this test is to better understand how users attempt to find data through the Dryad interface. Based on the results of this test, possible data discovery mechanisms can be made more prominent and intuitive for future users.
For this test, subjects were given the following citation and asked to find the link that would allow them to download the associated data:
Blackman BK, Michaels SD, Rieseberg LH (2011) Connecting the sun to flowering in sunflower adaptation. Molecular Ecology 20(17): 3503-3512. doi:10.1111/j.1365-294X.2011.05166.x
Next, subjects were asked to return to the Dryad home page and to find the article again using any method of their choice that was different from their initial method. Currently, the web interface facilitates data discovery via a browsing method (by author name or journal title), and a search method (with searchable fields including title, author, subject, and publication date). The goal of asking the subjects to find an article twice was to correct for any primacy bias if the users recognized one discovery method before the other.
Upload Data
The purpose of this test is to better understand what users expect from and how they react to the current data submission process. The current process is a paginated wizard-style workflow that allows users to upload and describe multiple files per data packet.
Subjects were given a citation, a DOI, a list of authors, an abstract, keywords, two data files, and descriptions of those files, all associated with the following paper, and they were asked to upload and describe the two data files:
Morran LT, Schmidt OG, Gelarden IA, Parrish II RC, Lively CM (2011) Running with the Red Queen: host-parasite coevolution selects for biparental sex. Science 333(6039): 216-218. doi:10.1126/science.1206360
The full text of the post-test is included in the Appendix of this paper.
The trends and recurring themes regarding subjects' interactions with each of the three workflows are described in this section. Some of the data are cross-referenced with Google Analytics data collected March 2011 – March 2012.
Learn About Dryad
Part 1: The results of the card sorting exercise can be seen in Figure 2 in the Appendix.
Part 2: Phrasing was the most important determinant of whether a subject found the appropriate information. A follow-up study should be done to determine the best way to indicate to users what information is behind each link. Position, size, and color were also important to subjects' recognition of information items. More information would be needed to ascertain exactly which items should be set apart and how. Until such a study can be conducted, put the information that is most viewed (as determined by Google Analytics) towards the top, and in a larger font.
Based on the Google Analytics data, the "Depositing Data" page is the most viewed information page by a large margin. Out of every unique URL on the website, the Depositing Data page is the 4th most popular starting page (the page users see first), and the 5th most popular page to be clicked from the Home Page.
Find and Download Data
Subjects were very proficient at finding and downloading data. 100% of the subjects successfully found and downloaded the data packet in two different ways. Most subjects searched first and browsed upon the researcher's prompting. Subjects did not correctly interpret the search mechanism when multiple searches were performed in succession. Users expected each use of the search box to be independent of the previous search, but the search box instead searched only within the subset of results returned by the previous search.
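The mismatch subjects hit can be stated precisely: they expected each query to run against the full repository, while the interface scoped each query to the previous result set. A minimal sketch of the two behaviors (function names and sample records are hypothetical, not Dryad's actual code or data):

```python
# Illustrative only: contrast the search behavior subjects expected with
# the scoped behavior they observed. Names and records are hypothetical.
PACKAGES = [
    "sunflower flowering time data",
    "sunflower genome assembly",
    "host-parasite coevolution data",
]

def search_independent(corpus, term):
    """What subjects expected: every query runs against the full corpus."""
    return [p for p in corpus if term in p]

def search_within_results(previous_results, term):
    """What the interface did: each query narrows the previous result set."""
    return [p for p in previous_results if term in p]

first = search_independent(PACKAGES, "sunflower")       # 2 hits
expected = search_independent(PACKAGES, "coevolution")  # 1 hit
observed = search_within_results(first, "coevolution")  # 0 hits: scoped to `first`
print(len(first), len(expected), len(observed))
```

The second query silently returning nothing, rather than searching the whole repository, is what subjects misread as "no such data exists."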
Based on the Google Analytics data, 56% of users arrive at Dryad from search engines, and 43% of users land directly on a data package page without ever seeing the Home Page. (See the Visitor Flow diagram in the Appendix.) Because of this, many users never see any information about Dryad itself that would add credibility to the repository.
Subjects disliked the prominence of the article citation. They agreed that the citation was important, but not as important as the links to access the data, which are at the bottom of the page. This, along with the Google Analytics data concerning link-click frequency, supports the importance of vertical position on the page.
Upload Data
All subjects successfully uploaded a data package, though the quality of their metadata varied. Subjects were confused by a lack of instructions on each page, and their assumptions varied quite significantly about what to put into each field. Less than half of the subjects noticed the hover-over instructions for each metadata field. Once subjects were informed about the hover-over instructions, they found them very useful.
Some subjects attempted to exit the data submission process partway through, while others completed the process without prompting. In both cases, the subjects were confused by a lack of a "confirmation" dialogue that informed them of the successful completion or the successful deletion of their work.
This is a list of generalized recommendations followed by specific examples of their use on the current Dryad web interface.
1. All buttons that perform the same action (e.g. link to XYZ page, or submit the entered data) should have the same label.
In the submission process, the "Save & Exit" buttons should all say the same thing, since they all perform the same action. Their companion buttons that take you to the next step should all say exactly what they will do. (e.g. "Go to Step #2: Upload and Describe Your Data Files") As a corollary, any two buttons that do opposite things (e.g. "Save" and "Cancel") should look different. Ideally, the button that progresses the user to the next step should be to the right, and it should be larger. The button that cancels the step should be to the left of the other button, and it should be a link, not a "button".
2. Any external link should be obviously marked as an external link.
The "Dryad Documentation" link on the main navigation should be marked as an external link. Wikipedia's external-link icon is becoming the de facto standard.
3. Any input field should be clearly associated with its label and/or submission button by location, color, style, etc.
The "Login" and "Logout" buttons are too close to the search box. Users expect to input their login info in that box, and they are confused when it searches.
4. Always give examples for input fields. Make all examples easy to find. Users should always be able to see the example when they see the input field.
Every input field in the submission process needs an example to its right. The hover boxes are unnoticed and/or unread by most users.
5. Vertical screen real estate is more precious than horizontal screen real estate.
Most users' screens are in widescreen format. The big green box at the top should be vertically smaller. The yellow left-hand navigation box should be horizontally smaller.
6. Any two (or more) buttons that perform opposite tasks should be easily distinguishable by shape, size, location, and/or color, etc.
a. When presented with multiple options in a workflow, the most frequently selected choice should be first, largest, and/or brightest, etc.
In the submission process, the "Save & Exit" button and its companion at the bottom of each page should be different. Save & Exit should be a link (not a button) to the right of its companion button.
7. All pages and workflows that contain a term that may be unfamiliar to non-biologists should explicitly state the definition of that term.
"CC0" and "Creative Commons Zero waiver" are unfamiliar to some biologists. These terms should be defined in every workflow where they occur.
8. A symbol/icon should not be used when a word would not harm the aesthetic.
In the submission workflow, the asterisk as a symbol for "required" is not noticed by most users.
9. Any guided, step-by-step process should ask for confirmation before exiting or completing the process.
a. Any guided, step-by-step process should conclude in a confirmation page when complete.
b. Any guided, step-by-step process should ask for re-confirmation before exiting the process or deleting one's progress.
In the submission workflow, there is no straightforward confirmation that the submission has succeeded. Simply sending the user to the list of his/her successful submissions and making him/her find the one that just succeeded is too difficult. Instead, before showing the user his/her list of successful submissions, show a confirmation page that indicates that the current submission completed successfully.
10. Important information should be larger, in a uniquely colored box, near the top of the page, and/or physically separated from other information.
The "Contents" section under "Depositing Data" does not appear to be any different from the information below it. The "Contents" section should be larger, and possibly in a separate box. Consider using right-justified boxes for the most frequently asked questions on each page.
11. Use as few colors as necessary to clearly indicate different sections of the website.
In the "Information" portion of the website, links are in green, and other text is in red or black. Because the links were green, users were unable to recognize them as links. (They expected blue links, the default in their browsers.) Red text is difficult to read on a white background.
12. Make the default text larger or provide a ubiquitous "Increase text size" button on every page.
100% of the subjects reported that the text was too small. Some of them knew how to change the font size in their browser, but most did not. Consider providing a button that will allow them to change the font size. Also make sure that the page renders well with multiple font sizes and zoom levels.