Teaching Information Evaluation with the Five Ws: An Elementary Method, an Instructional Scaffold, and the Effect on Student Recall and Application
University of Tennessee, Knoxville
TRACE: Tennessee Research and Creative Exchange
UT Libraries Faculty: Peer-Reviewed Publications
University Libraries

Summer 2014

Teaching Information Evaluation with the Five Ws: An Elementary Method, an Instructional Scaffold, and the Effect on Student Recall and Application

Rachel Radom, University of Tennessee - Knoxville, [email protected]
Rachel W. Gammons

Follow this and additional works at: https://trace.tennessee.edu/utk_libpub
Part of the Information Literacy Commons

Recommended Citation
Radom, Rachel, and Rachel W. Gammons. "Teaching Information Evaluation with the 5 Ws: An Elementary Method, an Instructional Scaffold, and the Effect on Student Recall and Application." Reference & User Services Quarterly 53, no. 4 (Summer 2014): 334-347.

This Article is brought to you for free and open access by the University Libraries at TRACE: Tennessee Research and Creative Exchange. It has been accepted for inclusion in UT Libraries Faculty: Peer-Reviewed Publications by an authorized administrator of TRACE: Tennessee Research and Creative Exchange. For more information, please contact [email protected].

FEATURE

Teaching Information Evaluation with the Five Ws: An Elementary Method, an Instructional Scaffold, and the Effect on Student Recall and Application

Rachel Radom and Rachel W. Gammons

Rachel Radom is Instructional Services Librarian for Undergraduate Programs, University of Tennessee Libraries, Knoxville, Tennessee. Rachel W. Gammons is Learning Design Librarian, McNairy Library and Learning Forum, Millersville University, Millersville, Pennsylvania.

Reference & User Services Quarterly, vol. 53, no. 3, pp. 334–71. © 2014 American Library Association. All rights reserved. Permission granted to reproduce for nonprofit, educational use.

Researchers developed an information evaluation activity used in one-shot library instruction for English composition classes. The activity guided students through evaluation using the "Five Ws" method of inquiry (who, what, when, etc.). A summative assessment determined student recall and application of the method. Findings, consistent over two semesters, include that 66.0 percent of students applied or recalled at least one of the Five Ws, and 20.8 percent of students applied or recalled more than one of its six criteria. Instructors were also surveyed, with 100 percent finding value in the method and 83.3 percent using or planning to use it in their own teaching.

Undergraduate instruction librarians face the common challenge of addressing a wide variety of information literacy competencies in sessions that follow short, one-shot, guest lecturer formats. Of these competencies, one of the most complicated and time-consuming to teach is the evaluation of information sources. It can also be one of the most difficult competencies for students to effectively learn.1 In this study, the researchers aimed to find or develop a framework that would efficiently assist students in the acquisition and application of information evaluation skills. The desired framework would be memorable, familiar to students, scalable (used in face-to-face sessions or asynchronous, online instruction), and valuable to course instructors.

The following study introduces an information evaluation method based on a well-known framework of inquiry—the "Five Ws," or who, what, when, where, why, and how. Researchers modified the Five Ws to create a formative assessment that introduced evaluation skills to students and piloted it in fall 2011 during one-shot library instruction sessions for English composition classes. Full implementation followed in fall 2012. In both the pilot and formal study, a summative assessment was sent to students an average of three weeks after the library session to assess recall and application of the evaluation method. Composition instructors were also surveyed to assess their responses to the Five Ws evaluation method and determine whether they had added, or would consider adding, the method to their own instruction. The findings of these assessments may be relevant to instruction librarians and composition instructors, as well as those interested in the connections between information literacy competencies and student learning outcomes in general education.

Literature Review

In 2000, the Association of College and Research Libraries (ACRL) published the "Information Literacy Competency Standards for Higher Education."2 Intended to facilitate the development of lifelong learners, the standards outline the skills needed for students to identify an information need and then locate, evaluate, and utilize resources to fulfill that need.3 For more than a decade, the ACRL guidelines have directed the library profession's approach to instruction, shaping the ways that librarians conceptualize, design, provide, and assess library instruction. Corresponding to the widespread adoption of these standards, there has been an increase in research investigating students' skills (or lack thereof) in critical thinking and, more specifically, information evaluation. The majority of these research studies, however, are based on the evaluation of web and print sources as separate materials. As the numbers of online and open access publications increase and the boundaries between formats of information recede, the depiction of print and electronic resources as existing in distinct and separate categories does not accurately reflect the modern search experience.4 It is also misleading to students who are used to accessing a variety of media and information sources in multiple formats.

Student confusion about the format and quality of information sources is substantiated by recent research. In a 2009 report for the United Kingdom's Joint Information Systems Committee (JISC), researchers identified a dissonance between college and university students' expectations of published research and the realities of those bodies of work.5 When asked what types of information a student would recognize as "research," an overwhelming majority (97 percent) identified traditional formats such as books and articles. When confronted with less well-known formats, such as posters or dissertations, the number of students willing to identify the documents as "research" greatly decreased.6 Additional qualitative results describing student confusion were obtained in small focus group sessions. While the majority of students "distrusted" the Internet, they widely accepted "all published materials" as appropriate for academic use.7 This inaccurate distinction between the credibility of print and electronic resources was also reported in research by Biddix et al., who […] distinctions between information sources less discrete. As the information landscape undergoes radical shifts, librarians' approaches to teaching information literacy and information evaluation have remained relatively static.

Approximately ten years ago, two information evaluation methods associated with different mnemonic devices were shared in the library literature and were subsequently incorporated into many library instruction sessions. In 2004, Blakeslee described the motivation behind designing California State University Chico's CRAAP Test as a desire to create a memorable acronym because of its "associative powers."9 Intended to guide users through evaluating the Currency, Relevance, Authority, Accuracy, and Purpose of a document, the method's accompanying checklist and questions can be applied to both print and online resources; however, its emphasis on the evaluation of electronic materials has resulted in a loose categorization of the method as a website evaluation tool.10 In contrast, the CRITIC method was incorporated into library instruction as a tool to be utilized in the evaluation of print resources.11 In a presentation on the method at a 2004 conference, Matthies and Helmke describe CRITIC as a "practical system of applied critical thought"; repurposing the steps of the scientific method, it encourages users to approach evaluation as an iterative process and to interrogate the Claim, Role of the Claimant, Testing, Independent Verification, and Conclusion of a given document.12

Both the CRAAP Test and CRITIC method attempt to simplify the evaluation process by breaking down complex ideas into a set of accessible criteria, but little research has been conducted on the effectiveness of the methods themselves. However, one recently published study on the advantages of formative assessment in information literacy instruction includes a series of anecdotal observations that may provide insight into the effectiveness of the CRAAP Test.13 Following an instruction workshop in which the test was taught, many students self-reported a persisting difficulty with "determining the quality of different sources."14 The authors found that some students continued to have trouble "distinguishing between popular magazines and scholarly journals" and "finding authoritative websites" even after follow-up consultations.15 Their findings suggest that the CRAAP Test may not effectively bridge the gap between determining easily identifiable qualities, such as date of publication, and those that require a greater level of independent judgment and critical thinking, such as authority, especially if used in only a single instruction session.

Meola contends that it is problematic to use models such as CRAAP and CRITIC to teach