U-M Library Search Catalog: Usability Evaluation

A usability evaluation of the library's catalog search webpage, aimed at identifying primary user pain points.
UX Research / Usability Analysis           
4 Months
University of Michigan Libraries
How usable is the University of Michigan (U-M) Library's Catalog Search web tool?
A team of three MSI students and I worked with the U-M Library's Design and Innovation team as UX Research consultants, supporting their initiative to redesign and improve the library's overall website. As consultants, our goal was to evaluate the usability of the website's most well-known and most-used feature: the library's catalog search.
We found major areas for improvement in search result relevance, assistive search features, and design consistency.
Our usability evaluation yielded a set of high-impact improvements the team could make to their catalog platform that addressed the major pain points indicated by our research.

Our deliverable included various documents outlining our research as well as a final video detailing our recommendations.
Our Research & Process
Building an interaction map helped us explore the current landscape of the catalog.
The map starts at the library's website (as of Winter 2019), shows the journey to the catalog search, and then outlines the various paths users can take to find resources and references at the library.
Right off the bat, my team found several potential problem areas:

1) Each record provides different options depending on the record type, which can be confusing (e.g., "get this" vs. "available online").

2) There’s no indication of what is being searched for, especially if the search box data is accidentally cleared.

3) Advanced search completely overwrites the normal search, which can be unclear to a beginner.

4) Descriptions and help sections for each category are hard to find.

5) Clicking on records can lead to pages either within the library website or on external sites, which can be confusing (especially if the external page is unfamiliar to the user).
Initial user interviews helped us understand how people experience the tool as well as their major pain points.

The interviews allowed us to build research-based Personas and Scenarios.
Those interviews also resulted in several findings that could inform our future recommendations.
General, Advanced, and “Other” Search
1) General search: often utilized for "general" and "known item" searches
2) Advanced search: used for specific searches
3) Other platforms (like Google Scholar) are often used in conjunction with using the U-M library catalog

Choosing Resources
Users are often unsure what resources to choose from or which will be the most relevant to their needs.

Results and the Boolean Search
Search results were usually easy to understand, but users would have preferred action items to be more obvious (e.g., "View article," "Cite source"). Users also mentioned that resources found in a search were not always relevant to the search itself or did not show what they expected to see.

Overall Interface
Overall, users found the new interface to be helpful and easy to use, but they had frustrations with Boolean search and wanted more functionality.

A comparative evaluation gave us more insight into industry standards.
We analyzed our comparative platforms on the basis of usability factors, information presentation, acclaim of the returned resources, included filters, search relevance, search efficiency, cost, and ease of search.

Platforms we compared: Google Scholar, Ann Arbor Library website, Amazon, WorldCat, JSTOR, etc.

We also looked at their interfaces to understand what would and would not work for the U-M Library Search website, summarizing the analysis in a functionality analysis table and a 2x2 matrix.

Key initial recommendations from the evaluation:

1) Integrate the library catalog with ILLiad (the interlibrary loan system).

2) Offer a “text analyzer” where users can submit a script and find other relevant articles.

3) Use call numbers for resource identification across all library resources.

4) Allow users to rate and review resources and to create and share booklists.

5) Provide concise records with relevant information such as a short summary, citation count, ratings and reviews, and links to similar or related articles and books.

6) Add an image of the resource's cover or front page to search records.
A survey followed to gain insights from a larger sample.
The survey was sent to GSIs, PhD students, and faculty to investigate the following key questions:

1) How do users with different roles use the search feature?

2) What is the usefulness and relevance of searches?

3) What is the efficiency and usability of the advanced search feature?
The survey validated many recommendations from the comparative evaluation as well as the following key insight:

Integrate the library search with Google search and emphasize relevance in search results.
A heuristic evaluation followed.
We used Nielsen's heuristics as the basis for our evaluation.
Our initial goals were to: “Identify barriers, pain points, obvious usability and/or interface issues, and review the site’s overall flow -- all through a user’s typical scenario when using the site.”

Our evaluation looked at a common task -- searching for an unknown item -- and evaluated the task based on the predefined heuristics.

Nielsen's Heuristics:

1) Visibility of system status
2) Match between the system and the real world
3) User control and freedom
4) Consistency and standards
5) Error prevention
6) Recognition rather than recall
7) Flexibility and efficiency of use
8) Aesthetic and minimalist design
9) Help users recognize, diagnose, and recover from errors
10) Help and documentation

Finding 1: The system is not catered to everyone's needs.
Recommendations: The system is a black box to users. Add some form of indication so users understand how search results are tailored to them.
Finding 2: Users do not know where to look for help.
Recommendations: Sections like ‘how to use search’ and ‘FAQ’ are currently very difficult for users to find and must be relocated to a more visible space. Important links should also be added to the "Ask a Librarian" chat box.
Finding 3: The platform is visually attractive but lacks important features that are standard on competing websites.
Recommendations: Users would find a library mobile application useful; the platform should also include social features like reviews, ratings, and popularity indicators to aid search.
Finding 4: The system suggests potential next steps that can be taken.
Recommendations: Integrate features like autocorrect and tips for narrowing a search into the search bar, along with a system that recommends the best resource for a given search.
Finding 5: Overall, the interface looks neat and clutter-free.
Recommendations: The interfaces of the library page and the search page need to be consistent, so we recommend updating the library page to include the UI features already present in the search catalog.
See the Heuristic Evaluation Report
Usability Tests
Usability tests allowed us to see users' pain points in action.
Methodology

5 Participants

Faculty, GSIs, and PhD Students

The tests were conducted in person with a facilitator and ran in this order: introduction, pre-questionnaire, 8 tasks, debrief, and post-questionnaire.

The Key Findings and Recommendations

Finding 1:
The Favorites function, although interesting and useful for participants, had some issues when participants attempted to favorite resources without logging in.

Recommendation: Make sure a user can favorite items without logging in by saving them in a "cart" feature, so that favorites added before logging in carry over after the user logs in. This is not necessary if the user leaves the site after favoriting items, but it is a nice feature if the items persist through a login.

Finding 2: Participants are still having trouble with the library's results after performing a search with keywords.

Recommendation: Instead of offering "how to search" only as an article (or perhaps in addition to that resource), use a "hints" feature that pops up to let users know when they should perform a specific action and to show new users how to search.

Finding 3: Most participants could not figure out how to generate or export a citation for a specific resource. This may be because people are not aware of the utility of tools like Zotero, EndNote, and Export RIS.

Recommendation: It could be useful to have a brief description of Zotero, EndNote, and RIS appear when one hovers over these tools. A simpler citation feature akin to Google Scholar's could also be created, and all three citation tools could be grouped under a "Cite" button.

Finding 4: The icon and labeling of the "Favorites" feature are inconsistent between search pages and the account page. Users often clicked "My Account" and "My Favorites" interchangeably, possibly because both action items begin with the word "My."

Recommendation: Maintain consistent header bands with icons and labels across search and account pages. Remove “My” from “My Favorites”.

Finding 5: Participants took different approaches to finding the latest publication described in task 18. A few participants used the publication date filter on the left side of the search page, while others used the "Sort by Creation Date (Newest first)" option to find the latest publication.

Recommendation: Add start and end date inputs to the Publication Date filter on the left-hand side of the search page, which currently does not offer them.

Finding 6: One participant struggled to return to the previous search results page after navigating to the Favorites section. After completing the navigation tasks, the participant appeared confused about how to get back to the page of results, and ended up clicking the library search icon and returning to the website's main page.

Recommendation: The library should give users an option to view their search history. This could take the form of a button placed next to the My Account, My Favorites, and Log In items in the website's menu bar. It is an important feature that lets users return to their queries and the results they found.
Reporting Our Final Results
Our final deliverable included a video synopsis of our entire usability analysis.
This video was shared with the entire Library team for their review and use going forward.
Reflecting on The Experience
Conclusion
After the video was delivered, our research concluded. The team was very thankful for our research and happy to receive such a comprehensive review.

If the project had continued, we would have been involved in implementing many of our suggestions.