

Tennessee Libraries

Volume 56 Number 2  



Using a Student Response System to Obtain Real-Time Assessment of Bibliographic Instruction Sessions


Karen Dearing
Instruction Librarian

Sue Alexander
User Services Librarian

Sharon Parente
Reference and Instruction Team Leader

Amy York
Distance Education and Outreach Librarian

Middle Tennessee State University

Conference Abstract: Student response systems allow teachers to ask groups of students multiple choice questions to which they reply individually via hand-held wireless transmitters.  MTSU librarians illustrate how they use this system in the bibliographic instruction program.

Project S.T.A.R.T. (Student Teaching and Assessment Response Tool) consists of a combination of tools: an audience response system, a live librarian, and an interactive presentation based on the ACRL Information Literacy Guidelines. What follows is an explanation of how we came to use this system and how it is being implemented at the James E. Walker Library at Middle Tennessee State University.

The Problem: bored students and librarians


A little more than a year ago, the Reference and Instruction Team at MTSU’s Walker Library concluded that a change was desperately needed in the library instruction curriculum. Getting through introductory classes was becoming a dreaded chore, as we observed pervasive boredom and inattention among students. We realized that our traditional lecture-based style of teaching was not accommodating all student learning styles. We also noticed students from our classes asking confused questions at the reference desk about material we had just covered. We decided that, in addition to making our classes more fun and informative, we needed to be able to perform real-time assessment of learning so that we could immediately re-teach material that students didn’t understand.

We began our discovery process with an idea of the outcome that we wanted: better student learning. In order to come up with a solution, we began with a few questions. First, who are our introductory students? What are their ages and other demographic characteristics? How do these students prefer to learn? We were aware of the preference of this generation for hands-on multimedia tools, including text messaging devices and portable mp3 players, because we found ourselves constantly competing with these gadgets for student attention in our classes. We turned to the professional literature for further elaboration on student learning preferences and how we could use them to provide a better library instruction experience. Unfortunately, the library science literature was less than useful, but we did find some interesting ideas in the education literature.

A Solution: interactivity and instant assessment


One idea that particularly interested us was using an audience reply system to engage students and immediately test their learning. An audience reply system consists of special software, a receiver, and handheld input devices. As an instructor delivers a lesson, he or she asks students to respond to questions presented on a screen. After all students input their answers, the software can graphically display the results. This is useful for immediately assessing student learning or opinions. It may also be helpful to ask demographic questions (age, gender, classification) to be used in conjunction with subject questions for later analysis. We were drawn to this idea because of its interactivity and also because it had not been previously applied to a library setting; most of the uses of the audience reply system had been in the physical sciences. The opportunity to use the system for information literacy instruction and present our findings to the library community was very appealing to us.
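The tally-and-display cycle at the heart of such a system can be sketched in a few lines of Python. This sketch is purely our illustration of the logic, not TurningPoint’s actual (proprietary) software, and the responses shown are hypothetical.

```python
from collections import Counter

def tally_responses(responses):
    """Count how many students chose each answer option."""
    return Counter(responses)

def display_histogram(counts, options):
    """Print a text bar chart of results, one bar per answer option."""
    total = sum(counts.values()) or 1  # avoid division by zero
    for option in options:
        n = counts.get(option, 0)
        pct = 100 * n / total
        print(f"{option:<4} {'#' * n} {pct:.0f}%")

# Hypothetical keypad input from a ten-student class
responses = ["A", "C", "A", "A", "B", "A", "C", "A", "A", "D"]
display_histogram(tally_responses(responses), ["A", "B", "C", "D"])
```

In the real system, the receiver collects each keypad’s signal and the software renders the equivalent of this histogram as a graph on the projection screen as soon as polling closes.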

Investigation: choosing the system


Over the next few months, Karen Dearing led the team in our investigation of the types of audience reply systems available and their costs. We created a chart of systems that met our cost criteria, comparing features such as signal type (radio-frequency vs. infrared), input devices (keypads vs. keyboards), and software (compatibility, flexibility, requirements, ease of use). Ultimately, we chose the TurningPoint radio-frequency system. We wrote a purchase proposal and presented it to the library dean. A TurningPoint representative, Kevin Owens, came to MTSU to demonstrate the system to the dean and all interested librarians. The dean and almost all of the librarians were very impressed with the audience reply system and its possible uses in the library, and the dean approved the purchase. In December of 2005, Karen Dearing wrote a sole source purchase justification for outfitting our small classroom (30 stations), and we received the system in January 2006.

Implementation: creating a curriculum and testing it out


The instruction team decided that, at first, we would use the system only in the courses for new freshmen (University 1010) and new transfer students (University 2020). While putting together our interactive program in January and February of 2006, we researched the professional literature and consulted the ACRL guidelines for elements that should be included in information literacy curricula. We also discussed the problems we had observed our students having, and decided to hit hard on “application” and “analysis” of both information and research tools. The final product consisted of 14 informational slides, 9 interactive questions, 2 opinion questions about the presentation content and the response system format, and 6 handouts to supplement informational content.

Informational Slides


  • Advantages of using the library compared to the web
  • Five steps of the library research process
    • Choose your topic
    • Identify your information needs
    • Locate your information using the library
    • Analyze and evaluate your information
    • Properly cite your information

Interactive Questions


  • Where do you go for help? – icebreaker
  • Demographic question – for later analysis
  • Location question about library layout and contents
  • Online catalog search for specific author
  • Online catalog search for magazine title and location
  • Examine periodicals, determine if scholarly or popular
  • Search database for journal article on specific subject
  • Search database for newspaper article on specific subject
  • Evaluate specific website and determine credibility



Handouts


  • Evaluating different types of information to fit different types of needs
  • Identifying three classes of information: primary, secondary, and tertiary
  • Distinguishing between scholarly journal articles and popular magazine articles
  • Evaluating both print and online information for credibility
  • Identifying a web hoax
  • Citing sources from library database (MLA style)



Because of the time needed to develop the curriculum, we were not able to use our interactive program with our regularly scheduled bibliographic instruction (BI) classes in the Spring 2006 semester. Luckily, though, we were able to find a class willing to participate in a test drive of the system. Overall, the demonstration went very well, though there were a few surprises. First of all, we ran out of time and were unable to complete the entire lesson in the 55-minute session. In classes that run longer, this shouldn’t be a problem, but we learned to take this factor into consideration. Because of the sequential nature of the presentation, it was not as easy to cut content as it is in a traditional BI class.

We were also surprised by some of the student answers to the interactive questions. Following are graphics of some of the questions and answers, along with our interpretations.


Figure 1.

This was the first question we asked. Its purpose was to test keypad functioning and break the ice. Although there had been very little instruction up to this point, it should have been clear from the context that “Reference desk and ask a librarian” was the correct answer. The student who answered “Google” was either a jokester or determined not to learn anything new in the class. Later questions would clarify this.

Figure 2.

We asked this question after giving very clear instruction on where the various periodicals can be found and giving students hands-on practice with the online catalog. Still, as the graph shows, we did not get the message across to some of the students. Having this kind of real-time assessment gives us the opportunity to review and retest right away. It is helpful to be able to do this while we still have the students in the classroom, because once they are dismissed, we have lost them.
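The review-and-retest decision described above amounts to a simple threshold check on the response tallies. The sketch below is our own illustration of that logic, not a feature of the TurningPoint software; the 80% pass threshold and the answer counts are assumed values.

```python
def needs_reteach(counts, correct_option, threshold=0.8):
    """Return True if fewer than `threshold` of students answered correctly."""
    total = sum(counts.values())
    if total == 0:
        return True  # no responses collected: re-teach to be safe
    return counts.get(correct_option, 0) / total < threshold

# Hypothetical results for the periodicals-location question
counts = {"2nd floor": 18, "4th floor": 7, "Reference": 5}
if needs_reteach(counts, "2nd floor"):
    print("Review the material and re-ask the question.")
else:
    print("Move on to the next slide.")
```

In practice, the instructor applies this rule by eye: if the on-screen graph shows too many wrong answers, review and poll again before the class is dismissed.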


Figure 3.  

This was one of our last questions, and it shows a success! The students really liked this type of presentation. We had expected this because throughout the class, the students were lovingly holding their keypads, eager to answer another question.


Figure 4.

It appears that the Google lover from the first slide is back again. It is realistic to expect one or two holdouts in each class, but since most of the participants found the information helpful, we feel confident that we can reach the majority of our students using this system. And that’s our main goal!



We look forward to using Project S.T.A.R.T. in more introductory classes in upcoming semesters. Though our experience with it so far has been limited, we feel confident that the interactive nature of the program and its ability to assess learning in real time make it an ideal way to teach information literacy skills. We also look forward to customizing the presentation for subject-specific classes. We will post updates on our progress and additional information about using an audience response system in a classroom setting at

