 

 

Tennessee Libraries 

Volume 59 Number 2
 

 2009

 

Engaging the Users Through Usability Testing

by

Fagdeba "Bako" Bakoyema

Christy Groves

Middle Tennessee State University 

 

Presented at the 2009 TLA conference

Conference Program Abstract: This presentation shows how a usability lab with Morae, an innovative software-based solution for usability testing, was implemented to engage Walker Library website users by capturing, recording, remotely observing, and analyzing data from interactions. The final goal: redesign a user-centered web presence with maximized usability.

PowerPoint Presentation (pdf)


Welcome and Introduction

Good morning and welcome. My name is Christy Groves, and I am the User Services Coordinator at Middle Tennessee State University’s James E. Walker Library. I oversee the operations of the public services units at the Library. I’d like to introduce Fagdéba Bakoyéma (Bako), MTSU’s Web Services Librarian. We are here to present on the importance of usability testing when designing a library website. We will describe usability testing and share with you how we set up a usability lab at Walker Library, chose software for this testing, and conducted usability testing with a number of our users. We will conclude today’s presentation with a summary of the impact this testing will have on our website’s redesign.

Background

Due to the organizational structure at MTSU, Bako is part of the User Services team and has many other significant responsibilities in addition to his web development work. These include working at the Reference Desk several hours per week, teaching several library instruction sessions each semester, and serving as subject liaison to the Foreign Languages and Literatures department.

Bako began working at MTSU in the spring of 2007. Prior to MTSU, Bako completed his MLS and MIS degrees at Indiana University, Bloomington, where he also worked with usability testing at the IU Libraries. Bako’s numerous responsibilities have given him a unique perspective on both sides of web development: he possesses the technical expertise to develop high-quality web pages, and his regular interaction with library users has made him keenly aware of the importance of logical website architecture.

MTSU has acquired a new content management system, Site Studio, for the entire campus. Individual departments on campus are working to integrate into this new system by redesigning their pages. Bako communicates regularly with the MTSU Information Technology Department, but for the most part, the actual development and implementation into the new content management system is up to individual departments.

Bako began gearing up for the migration of the library website and approached me last summer with his idea of creating a usability testing lab at the Library. “What a great idea!” I thought. Firmly rooted in customer service, I believe that usability testing gets at the heart of quality web design and will maximize our patrons’ online experience with Walker Library’s services and resources.

As you all know, academic libraries are inextricably tied to the quality of each student’s university learning experience. At MTSU, Walker Library serves as the central research hub of the institution. However, as the internet provides widespread access to online classes, resources, research materials, and social networking tools, the library as “place,” traditionally defined by a physical building, has expanded into a virtual environment. A web presence has never been more important for libraries, as patrons use it to seek information for class projects, faculty research, and more.

Users now have a myriad of choices for information access, and all too frequently they bypass their library’s website entirely because they find it too difficult to navigate. While students acknowledge that online library databases are preferable to Google search results for quality of content, they overwhelmingly report frustration with the usability of library websites and resources.

What is Usability Testing?

So what exactly is usability testing? Simply put, it is a way to capture users’ interactions with a particular online application, such as a software program, a database, or a website. Website usability testing provides web designers valuable clues regarding how a user understands and navigates through the layout of a site. It can also provide insight into a user’s understanding of terminology used to describe particular website applications and access points.

Until very recently, usability testing was not something libraries typically did when constructing, designing, or redesigning their websites. It is considered time-consuming, largely because it used to be quite complicated: earlier testing required microphones, video cameras, and other expensive and cumbersome audio/video equipment. However, the usability testing software Morae, by TechSmith, has made such testing much simpler. This is the software we use at MTSU to conduct our usability testing; Bako will describe how Morae works shortly.

Why Conduct Usability Testing?

According to Jeffrey Rubin, author of the Handbook of Usability Testing, usability goals and objectives are typically defined in measurable terms of one or more of the following: 1) usefulness, 2) ease of use, 3) learnability, and 4) likability. The way to capture these attributes is through usability testing. Research demonstrates that huge numbers of testers (we have 24,000+ students at MTSU) are not actually necessary; only 10-20 testers are needed to be representative of a much larger audience.
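
One way to see why such a small panel can suffice is the problem-discovery model associated with Nielsen and Landauer. That model is not part of this presentation; it is offered here only as an illustrative assumption: if each tester independently uncovers a given usability problem with probability p, then n testers uncover it with probability 1 - (1 - p)^n. The short Python sketch below uses an assumed p of 0.31, a figure often quoted in that literature rather than anything measured at Walker Library.

    # Illustrative sketch of the Nielsen-Landauer problem-discovery model.
    # Assumption: each tester independently finds a given usability problem
    # with probability p; p = 0.31 is a commonly quoted average, not a
    # value measured at Walker Library.
    def proportion_found(n_testers, p=0.31):
        """Expected share of usability problems uncovered by n testers."""
        return 1 - (1 - p) ** n_testers

    for n in (5, 10, 15, 20):
        print(f"{n:2d} testers -> {proportion_found(n):.0%} of problems found")

Under these assumptions, ten testers are already expected to surface roughly 98% of the problems, which is consistent with the small-panel recommendation above.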

Bako researched our options and identified the TechSmith Morae software to conduct the usability testing. We wrote a proposal addressed to our library dean to obtain this software, attend beneficial training, and set up a usability testing lab in our Library.

The following flowchart describes the process through which usability testing can be conducted to result in a well-designed website that serves the needs of users. It should also be noted that usability testing is an ongoing process, because users’ needs and online applications are constantly evolving. We must continue to conduct usability testing to stay abreast of and anticipate our users’ changing needs.

Bako, who designed, implemented, and conducted our Library’s usability testing, is now going to share with you how Morae works, its main software components, the importance of designing a well-thought-out usability test, and the results of our usability testing at MTSU.

The Process of Usability Testing

Usability testing requires authentic participants. In the academic setting, participants include students, faculty, staff, and even community members who use the library website. Recruiting participants, or “human subjects” as they are generally referred to, requires the approval of the Institutional Review Board (IRB). To ensure that participants understand the purpose of the study and agree to take part voluntarily, each tester signs a consent form.

We received an IRB approval letter on February 27, 2008, allowing us to have 200 participants. The letter states that any change to the approved proposal must be submitted to the IRB before the change is implemented. An end-of-project report must be submitted, and if the project is not finished within the one (1) year period, a project progress report must be submitted along with a continuation request prior to the expiration date. Once we received IRB approval, we began testing the current Walker Library website (see below).

The drop-down menus appear only when you hover over certain main navigation tabs; once you move the cursor elsewhere on the site, the drop-down menus disappear.

We are seeking feedback from authentic users on a number of aspects of the website, including the architecture, labeling, and functionality, as well as suggestions on how to improve and enhance the design of the website.

The space inside the library where we conduct testing is called “The Patron Information Experience Laboratory” (PIE Lab). Morae, a robust, cutting-edge software package, was acquired to facilitate the collection and analysis of data. Morae has three components: Recorder, Observer, and Manager.

The Recorder uses a camera built into the monitor to track participants’ eye movements, records all clicks and all web page changes during the usability test, and uses a microphone to capture the participant’s thinking process (via the think-aloud protocol).

Patrons’ information experience during task completion is observed live from a distance using the Observer. An observer, referred to as a task logger, can use the IP address of the computer on which the recording is taking place to connect directly to that computer. The task logger can watch the participant and hear comments and observations as tasks are being completed, gaining valuable insight into the participant’s thinking process without being seen by the participant or disrupting the testing.

The Manager is the third and most powerful component of the software. All recordings from the Recorder and all task logs from the Observer are downloaded and analyzed in the Manager. The analysis consists of tabulating the number of web page changes, the number of clicks, and the amount of time spent by a participant during the completion of each task. Other functions of the Manager include the ability to create video clips that highlight the patron’s experience, as well as charts and PowerPoint presentations to share design issues with stakeholders and policy makers.
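
For readers who want a feel for what that tabulation step involves, here is a minimal sketch of the same kind of per-task summary done outside Manager. It assumes a hypothetical CSV export with one row per participant per task and columns named participant, task, clicks, page_changes, and seconds; these column names are illustrative and are not Morae's actual export schema.

    # Hypothetical sketch: average clicks, page changes, and time per task
    # from a CSV export. Column names are assumed, not Morae's real schema.
    import csv
    from collections import defaultdict

    def summarize(path):
        totals = defaultdict(lambda: {"clicks": 0, "page_changes": 0, "seconds": 0.0, "n": 0})
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                t = totals[row["task"]]
                t["clicks"] += int(row["clicks"])
                t["page_changes"] += int(row["page_changes"])
                t["seconds"] += float(row["seconds"])
                t["n"] += 1
        # Average each metric over the participants who attempted the task.
        return {task: {k: t[k] / t["n"] for k in ("clicks", "page_changes", "seconds")}
                for task, t in totals.items()}

    for task, averages in summarize("usability_export.csv").items():
        print(task, averages)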

The next slides feature a Morae software overview and system requirements.

Credit goes to the Walker Library Systems Department for providing two used computers and managing the set-up of the PIE Lab. Christy and I built on a day-long Morae training at the ASIS&T conference to begin the testing process in November 2008. We developed tasks and pre-tested them with participants from our target population: faculty, students, and staff. A few changes were made to clear up confusion in the wording of some tasks. Prior to beginning the real testing, we requested the assistance of the Library Development Office, which provided T-shirts, souvenirs, and a few $10 gas cards to thank test participants. The Development Office also assisted in recruiting participants by posting ads throughout the library.

The screenshot below shows the Recorder displaying the beginning of a testing session.

After careful selection of the participants, the test begins in the PIE Lab with the test administrator welcoming the participant and thanking him or her for volunteering. The participant is then asked to fill out and sign a consent form approved by the IRB. The test administrator assures the participant that the testing has nothing to do with his or her individual ability; instead, it has everything to do with the website’s architecture and labeling, and whether or not the website is intuitive. There are no instructions or tasks to hand out to the participant, as they are set up on the Recorder. The tasks are preceded by a participant demographics survey. Our participants included faculty, students, and even fellow librarians.

Usability Test Results and Analysis

Though we are still analyzing the quantitative and qualitative data we collected during the usability testing, our initial response to these data is the development of a wireframe: a schematic of an improved, new James E. Walker Library website. Iterative testing, including focus group discussions, will guide the design of the new website from start to finish, and the testing will continue. The constant change in web technologies, student demographics, and information-seeking processes requires website designers to keep improving the site to keep pace with new technologies and new modes of online information access and retrieval.

See wireframe below.

We'll now take a few questions, and we ask you to check our new website sometime in 2010. Thank you for coming to our presentation. We hope you have learned something that you can share with your library.

 

 

