In the current service environment of academic libraries, how users perceive and value what is available to them is central to effective service design and delivery. These factors, combined with shrinking budgets and growing demand for newer, more accessible materials, have made it more important than ever that libraries develop effective methods of collecting user feedback to help understand and anticipate their users' needs.
The proof is in the research: a recent search of Library, Information Science & Technology Abstracts returned hundreds of academic articles examining various aspects of library user feedback. Many of these articles dealt with web- and paper-based survey methods, particularly the benefits of one method over the other; for examples, see Perkins and Yuan (2001), Perkins (2004), and Martins (2010).
Whatever method is ultimately chosen, academic libraries all over the country are investing a great deal of staff time and library funds with the clear intent of understanding and evaluating how the library is being used.
Staff at the Paul Meek Library (PML) at the University of Tennessee at Martin in Martin, Tennessee, have generated both types of surveys in an effort to gauge how the library is being used. For example, a web-based survey in a question-a-week format was added to the PML homepage for 15 consecutive weeks. More recently, a Google Forms library satisfaction survey was distributed to students via the library's home page, social media, and in the classroom. Many paper-based surveys have been developed over the years by PML staff and have been distributed in the library as well as in classrooms around campus.
These various types of surveys provided PML with important user feedback that was used to improve library services. However, over the years, many of PML's surveys have yielded very low response rates relative to the significant staff time and resources dedicated to developing and promoting each survey. And even when a survey yielded a high response rate (as the Google Forms library satisfaction survey did), there were times when PML needed quick student feedback on a topic and a formal survey, with its time-consuming development, promotion, and collection, simply was not ideal. PML reference staff were acutely aware of this.
Dropping gate counts coupled with the perception that students were not using the library for traditional purposes led PML reference staff to begin thinking of potential options for collecting quick and easy student feedback.
At a meeting in early January 2016, PML reference staff discussed these concerns. A Google Forms survey that would likely shed some light on the issue was in the works, but it was under review by library staff and would not go live until late winter or early spring. Reference staff decided to try an instant-feedback model, with the goal of understanding what students like or dislike about the library. With instant feedback, the library could feasibly address some of the students' concerns before the more formal survey was even released. The only question was what method to use for collecting instant feedback from students.
Like most libraries, PML has an active social media presence (Facebook and Twitter), which was naturally raised as a possible option for collecting instant feedback. However, students at UT Martin are not actively engaged with the library's social media accounts, and while we are working on building a stronger following, doing so will take time. Other options were also discussed, such as creating a short, informal web- or paper-based survey, but the same problems of development and promotion would apply as with longer, more formal surveys, along with the same potential for low response rates. Finally, the library's graduate assistant suggested placing a whiteboard in the library's lobby and having students write responses to questions with dry erase markers. It was the simplest, most low-tech idea of the meeting, but we all agreed it was the best and began almost immediately to put together a plan for its installation.
Locating a viable whiteboard from the library's collection, deciding where to put it and what questions to pose to students, developing a strategy for collecting user responses, and agreeing on an overall time frame for the project took less than one day. The largest available whiteboard was placed in the middle of the library's lobby (the only public entrance to the building) so that students would notice it upon entering and leaving the library. This spot was the most visible and offered the best chance of student participation. The location itself served as promotion, so no additional promotional effort was required (see image 1 below).
To further attract attention to the board, two banners made from laminated Ellison die-cut letters were placed at the top of either side of the board. One side posed the question, "What do you like about the Library?" while the other posed its opposite, "What don't you like about the Library?" Beyond their eye-catching quality, the library opted for banners rather than simply writing the questions on the board to prevent intended or unintended erasure.
The graduate assistant was assigned to collect and record the responses twice per week. Ideally, this schedule would give students time to fill the board with unique responses or to respond to others' ideas. To record the data, the graduate assistant first took pictures before erasing comments, then entered the comments into a spreadsheet designed to keep a tally of student responses; the spreadsheet allowed for easy identification of similar responses. The staff and graduate assistant all agreed that leaving the whiteboard in the lobby for one month would give interested students ample time to respond to the questions.
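The tallying step described above can be sketched in a few lines of code. This is a minimal illustration, not the library's actual procedure: it assumes each whiteboard comment has been transcribed as a string, and the normalization rule (lowercasing and stripping punctuation so near-identical responses group together) is a hypothetical example of how "similar responses" might be identified.

```python
# A minimal sketch of tallying transcribed whiteboard comments.
# The normalization rule below is an assumption for illustration,
# not the actual procedure used at PML.
from collections import Counter
import string

def normalize(comment: str) -> str:
    """Lowercase a comment and strip punctuation and whitespace so that
    near-identical responses (e.g. 'More markers!' / 'more markers')
    are counted together."""
    cleaned = comment.lower().strip()
    return cleaned.translate(str.maketrans("", "", string.punctuation)).strip()

def tally(comments: list[str]) -> Counter:
    """Count normalized comments, making similar responses easy to spot."""
    return Counter(normalize(c) for c in comments)

# Hypothetical transcribed responses:
responses = ["More markers!", "more markers", "Quieter study rooms", "MORE MARKERS"]
counts = tally(responses)
print(counts.most_common(1))  # the most frequent response and its count
```

In practice a spreadsheet serves the same purpose; the point is simply that once handwritten comments are transcribed, grouping and counting them is mechanical.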
With preparation complete, a strategy for collecting and keeping responses to follow, and a clearly defined time frame in place, the library placed the whiteboard in the designated area of the main lobby. By the next morning, the whiteboard was covered with student responses (see image 2 below).
Image 2. The whiteboard filled with responses and student feedback.
Overall, the board was well received by students. In fact, many students commented to library staff that the board indicated the library was genuinely interested and invested in meeting students' needs. Library staff routinely responded directly to some of the comments left by students, and students, in turn, would respond back. Students also responded to or "liked" other students' responses, which created an interactive, participatory feel for the project.
Encouraged by this interaction, library staff decided to extend the whiteboard's time frame by an additional two weeks. The extra time allowed staff to pose follow-up questions seeking more specific information based on the questions and comments students had written on the whiteboard. For example, the question "What materials do you want to see more of in the library (textbooks, novels, and graphic novels)?" resulted from comments left by students requesting a "better selection of books with specific titles in the graphic arts discipline."
After approximately one and a half months, the whiteboard was removed.
In spite of the absence of the board, students continued to bring their ideas and concerns to the front desk, usually referring to the whiteboard as the motivating factor for coming to the desk. In all, just under 200 responses were received.
Library staff were very pleased with the results of the simple-to-set-up whiteboard project and genuinely enjoyed addressing many of the comments presented by library users. In fact, many of the less complex issues raised by students were dealt with proactively. For example, the library has whiteboards scattered around the study areas of the building and in all of the study rooms, and the main desk checks out a box of two markers and an eraser to students. Several library users commented that they wanted more markers and more color options per box. The library was able to add two additional colors per box at a cost of less than $100. Though a simple example, it clearly showed students that we were paying attention to what they wrote on the whiteboard.
Another comment left by library users informed the library that campus tour leaders were unintentionally misrepresenting the library's group study rooms as soundproof. PML staff uncovered the origin of this problem by asking direct follow-up questions via the board as well as by having conversations with Student Ambassadors. In response, the library posted signs letting students using the group rooms know that they are not soundproof, and the UTM Student Ambassador President agreed to raise the misrepresentation with the appropriate campus personnel.
More complex issues were addressed and posted on the library's Facebook page under the heading "Let's Continue the Discussion." The library user's concern was posted (anonymously) followed by the library's response. While some of these responses may not always have appeased the library user, they did explain that many of the issues were outside of the library's control, generally due to staffing or budgetary constraints. Several students have stopped by the reference desk and expressed their appreciation that the library took the time to address topics of concern individually. That appears to be the key component: providing an easy method through which students can express themselves and then letting them know someone paid attention.
One of the more interesting aspects of the project is that while the library provides an online suggestion box and maintains a physical one in the building, both are rarely used. The whiteboard, by contrast, generated just under 200 responses. While this form of feedback may not be as scientific as an organized survey, it proved a far more effective suggestion format, and one to which students responded well. Furthermore, while most methods of collecting library user feedback are largely static, the whiteboard project provided a livelier, more fluid, and more interactive venue for library users to express their opinions.
To be clear, the intention here is not to suggest that collecting feedback using this or similar methods is better than more traditional surveying. However, library staff at PML have concluded that it offers several benefits. It allows a quick response from library users to either a broad, general question or a narrow, specific one. Students were also able to actively participate and respond to one another's comments (something not possible in a typical web- or paper-based format) by adding symbols such as a "+" or brief comments such as "yes!" or "I agree" beside another's comment to indicate support. Additionally, small groups of students who walked in together would stop and generate an idea or two that they felt needed to be expressed.
Of course, this method is not without its flaws. The most obvious is that the library only received responses from people walking into the building; feedback from non-users explaining why they don't come to the library is often just as valuable. Additionally, there is the chance that a few students could purposely post the same topics or issues over and over and easily skew the responses. Furthermore, the whiteboard approach relies on the graduate assistant to properly categorize and interpret the handwritten comments on the board, and it requires manual data entry, something that most web-based surveys do automatically.
The library staff is already planning something similar for the Fall 2016 term. There is a discussion about adding an additional whiteboard in the University Center to see how those responses compare to those in the library. Overall, the staff found that this quick, inexpensive method of garnering student input was highly effective and provided valuable information that can be quickly put to good use to the benefit of library patrons.
Martins, N. (2010). Measurement model equivalence in web- and paper-based surveys. Southern African Business Review, 14(3), 77-107. Retrieved from http://www.unisa.ac.za/contents/faculties/service_dept/docs/Measurement14_3_chap4.pdf
Perkins, G. H. (2004). Will libraries’ web-based survey methods replace existing non-electronic survey methods? Information Technology and Libraries, 23(3), 123-126.
Perkins, G. H., & Yuan, H. (2001). A comparison of web-based and paper-and-pencil library satisfaction survey results. College & Research Libraries, 62(4), 369-377. Retrieved from http://crl.acrl.org/content/62/4/369.full.pdf+html
Adam Clemons, Government Documents Librarian at Paul Meek Library, University of Tennessee at Martin, can be reached at email@example.com.
Jim Nance, Reference Librarian at Paul Meek Library, University of Tennessee at Martin, can be reached at firstname.lastname@example.org.
Drew Ballinger, Graduate Assistant at Paul Meek Library, University of Tennessee at Martin, can be reached at email@example.com.