

Tennessee Libraries

Volume 56 Number 2



Another Look at Information Literacy:

The Information and Communication Technology

Literacy Initiative 


Betsy Park
Head, Reference Department

Stephanie Cage
Library Assistant

Shirlene Moore
Library Assistant

University of Memphis Libraries

Conference Abstract: How many of your students have the skills to participate effectively in the 21st century? What are these skills and how do we measure them? This presentation introduces an assessment tool that tests students' proficiency in using "digital technology, communication tools, and/or networks" to "function in an information society."

Technology has affected all our lives in ways that we could not have imagined 20 years ago. Statistics indicate that 72% of all adults use the Internet; that on a typical day, 40 million users go online just for fun; and that 87% of all teenagers and 84% of all Gen Yers go online to play games, instant message, do research for school, download music, get health information, and the like (Fox and Madden 2006). Half of American adults report researching health topics online; 75 million Americans used the Internet for political information during the 2004 campaign; and nine million report consulting the Internet for a major financial or investment decision. About 30 million people say they know someone who has been in a long-term relationship with, or married, someone they met online (Cornfield 2005; Fox 2005; Madden and Lenhart 2006). For many teens, the Internet has replaced the library as the source of information. Ninety-four percent of teens use the Internet for school work; 71% report using the Internet as a major source (maybe the only source) for their most recent research project; and 18% report using the Internet to cheat ("Pew Internet" 2001; Madden and Lenhart 2006).

Given the above information, what skills do our students need to be effective in the 21st century? Participants at the TLA/SELA Conference in Memphis, TN, in April 2006 suggested the following:

  • the ability to adequately formulate a research question;
  • the ability to effectively search resources to find information;
  • the ability to evaluate the information found;
  • the ability to understand the information found; and,
  • the ability to know when they had found enough information.

What do we really know about how our students find information and, even more importantly, what do we know about their abilities? Many universities have developed their own assessment instruments. Readers may want to look at the instruments from James Madison University or the Health Literacy Studies from the Harvard School of Public Health. There are also assessment instruments that extend beyond single institutions, such as the Bay Area Community College Information Competency Assessment Project (ICAP), a collaborative project of faculty librarians in the San Francisco Bay Area; Project SAILS (Standardized Assessment of Information Literacy Skills), developed through a partnership of Kent State University and the Association of Research Libraries (ARL); and the Information and Communication Technology (ICT) Literacy test developed by the Educational Testing Service (ETS) in collaboration with several libraries. Both SAILS and the ETS test are based on the ACRL Information Literacy Standards (ALA Association of College & Research Libraries 2006).

The University of Memphis became involved with the ETS ICT Literacy Assessment in Fall 2005 when the University Libraries volunteered to participate in beta field testing. Later we participated in the Spring 2005 large-scale assessment and in the individual assessments in Spring 2006. We are currently a member of the ETS ICT Literacy National Advisory Committee. ETS defines ICT literacy as “the ability to use digital technology, communication tools, and/or networks appropriately to solve information problems in order to function in an information society. This includes the ability to use technology as a tool to research, organize, evaluate, and communicate information, and the possession of a fundamental understanding of the ethical/legal issues surrounding the access and use of information” (Educational Testing Service 2006).

The test purports to assess seven proficiencies:

  1. Define--the ability to use ICT tools to identify an information need;
  2. Access--the ability to collect or retrieve the required information to meet the need; 
  3. Manage--the ability to organize information for later retrieval; 
  4. Evaluate--the degree to which an individual can determine whether the information satisfies the needs of the task, including recognizing authority, bias, or prejudice; 
  5. Synthesize--the degree to which an individual can accurately summarize and compare information from multiple sources; 
  6. Create--the degree to which an individual can adapt, apply, design, or invent information; 
  7. Communicate--the degree to which an individual can communicate information in its proper context with regard to audience and venue (Educational Testing Service 2006).

The test is scenario-based and is scheduled to take 75 minutes, plus 20 minutes for a pre-test questionnaire. During the test, test takers perform tasks demonstrating their abilities. For example, test takers may be asked to search the web to demonstrate their ability to access information or use a spreadsheet to demonstrate their ability to synthesize information.

The following sample task demonstrates how the ability to synthesize information might be assessed:

In this scenario the test taker works in a law office with many left-handed people. He receives a request from the office manager to find a good source of products and gifts for left-handed people. The office manager specifies that he is looking for a vendor with a wide range of merchandise, product guarantees, an online catalog, and online ordering; discounts would be a plus. The test taker receives emails from three prospective sources: one email is text only, one includes a link to a web page, and the third has an attachment.


Figure 1. The test taker opens an email.

The test taker must access each of these emails and then complete a table that includes a number of specifications (e.g. product guarantees, online catalog, customer service number, return policy). Not all of these specifications match the office manager's requirements. Therefore, the test taker must be able to identify the requirements that match the information need. Once the table is completed, the test taker must interpret the information to correctly rank the three sources.
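At its core, this ranking step is a simple matching exercise. The following sketch (in Python, using entirely invented vendor data; none of it comes from the actual ETS instrument) illustrates the logic a test taker must apply: score each source by how many of the office manager's required specifications it meets, ignore the irrelevant ones, and rank the sources by that score.

```python
# Hypothetical illustration of the ranking step in the sample task.
# The vendor names and feature sets are invented for this sketch.

REQUIRED = {"wide range", "product guarantee", "online catalog", "online ordering"}

# Each source offers some features; extras like "800 number" or
# "return policy" do not match the stated requirements.
sources = {
    "Vendor A": {"wide range", "product guarantee", "online catalog", "800 number"},
    "Vendor B": {"online catalog", "online ordering", "return policy"},
    "Vendor C": {"wide range", "product guarantee", "online catalog", "online ordering"},
}

def score(features):
    """Count how many required specifications a source satisfies."""
    return len(REQUIRED & features)

ranking = sorted(sources, key=lambda name: score(sources[name]), reverse=True)
print(ranking)  # Vendor C meets all four requirements, so it ranks first
```

Students who selected the wrong requirements (the 800 number, the return policy) were, in effect, scoring against the wrong set, which is why they ranked the sources incorrectly.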

Figure 2. The test taker completes the table.

Some students completed this task with ease, but many encountered difficulties, both technical and cognitive. Some had difficulty with the email, particularly opening the attachment. Others selected requirements that were not relevant to the task (e.g. an 800 number or return policy), while still others did not correctly interpret the information obtained. This task requires both technical skills and the ability to analyze and synthesize information.

Institutions receive aggregate scores and individuals receive individual scores. The scores report the level of proficiency (high, medium, low) overall and for each of the seven proficiencies. ETS is currently calibrating a core assessment aimed at high school seniors and community college students and an advanced assessment aimed at rising juniors in universities. Other types of ICT assessments are being considered.

Shirlene Moore spoke about taking the test as a student. Ms. Moore works in the University Libraries and is also a student at the university. She took the test during the large-scale assessment in Spring 2005. At that time the version was longer than it is currently and took approximately 2.5 hours. She was paid $25.00 to participate but did not receive an individual score. She described the test as “different” and “challenging,” saying it was not like any test she had taken before. Some tasks required her to open emails; for others she used a spreadsheet; still others required the use of different databases. None of the tasks used commercially available software; all occurred in a simulated environment. Several questions asked that she complete more than one task. For example, she had to access several websites, gather information, and incorporate it into a spreadsheet; compare and contrast the information gathered; and finally summarize it into a recommendation sent via email. One participant asked Ms. Moore whether the test could be used to assess students in the health sciences. She replied that although it could be used in that field, the test is not geared toward a specific discipline. Ms. Moore stressed that critical thinking skills were important for success on this test.

Stephanie Cage spent a good deal of her time proctoring the test at the University of Memphis. In her opinion, some advantages of the ETS test are its centralized content and scoring, its delivery over the Internet (making it easier to update), and its more flexible score delivery and reporting. Some disadvantages are that each test taker requires a computer, each testing workstation requires Internet connectivity, and there may be security concerns regarding global access to Internet-based assessments. During the large-scale testing in the spring, the test lasted 2.5 hours, was held in the library, and the proctors did a great deal of troubleshooting. Students were paid $25 to participate in the testing, which took place during their free time. Students were pre-assigned a code to access the test, and the approval process was very time-consuming. For these reasons, Ms. Cage spent a good deal of time scheduling and administering the test.

During the second testing (this fall), the test lasted 75 minutes and required much less troubleshooting. The test was taken during class time and, when possible, was held in the students' own computer lab/classroom. The students logged in using their university I.D. and password (rather than a pre-assigned code), and the approval process was automated, making it easier and faster.

The presenters all agreed that participating in the ETS testing has been interesting, albeit time-consuming. Although we are not spokespersons for ETS, the test appears to have potential. The ETS test can be used for many purposes, including as a requirement for entering a university, for acceptance into a major or professional program, or as a graduation requirement. It may also be used to assess students' strengths and weaknesses, leading to the development of workshops, new classes, or the revision of programs and curricula. It is our experience that too often universities put money into technology, creating new computer labs on campus, without knowing what happens in these labs or how effectively students navigate a technology-based information environment. At the present time, the initiative for this test at the University of Memphis resides within the University Libraries. It is important that this initiative move outside the library. Although the Libraries should continue to be involved, this testing should be housed in an office such as the Office of Academic Assessment that can reach more students.


For more information, see:


ALA Association of College & Research Libraries. 2006. Information literacy competency standards for higher education. [cited April 14, 2006].

Cornfield, Michael. 2005. Pew Internet and American Life: The Internet and campaign 2004. [cited April 14, 2006].

Educational Testing Service. 2006. ICT literacy assessment overview. [cited April 12, 2006].

Fox, Susannah. 2005. Pew Internet and American Life Project: Health information online. [cited April 14, 2006].

Fox, Susannah, and Madden, Mary. 2006. Pew Internet and American Life Project: Generations online project. [cited April 11, 2006].

Madden, Mary, and Lenhart, Amanda. 2006. Pew Internet and American Life Project: Online dating. [cited April 14, 2006].

Pew Internet & American Life Project. 2001. Teenage life online. [cited April 11, 2006].
