
Assessment for One-Shot Library Instruction: A Conceptual Approach

Rui Wang

Abstract: The purpose of this study is to explore a conceptual approach to assessment for one-shot library instruction. This study develops a new assessment instrument based on Carol Kuhlthau’s information search process (ISP) model. The new instrument focuses on measuring and identifying changes in student readiness to do research along three dimensions—cognitive thoughts, affective feelings, and research actions—before and after the students receive one-shot library instruction at the beginning of their research. The development of the new instrument includes a validation process supported by statistical analyses. The results demonstrate that the students improved their research readiness after receiving one-shot library instruction.

Introduction

portal: Libraries and the Academy, Vol. 16, No. 3 (2016), pp. 619–648. Copyright © 2016 by Johns Hopkins University Press, Baltimore, MD 21218.

Course-specific single class visits, so-called one-shot library instruction, are a long-established form of library instruction. One-shot library instruction can be traced back to the late nineteenth century. In 1880, Otis H. Robinson, a professor of mathematics at the University of Rochester in New York, pioneered course-related library instruction. Robinson succeeded in “getting at least half his faculty, a large part of the students, and sometimes even the president, into the library to help students use the collections effectively.”1 Robinson’s philosophy was that successful library instruction depended on librarians “with high standards of scholarship who were able to command respect in their separate academic communities.”2 Since then, academic librarians have made great efforts to develop course-specific, one-shot library instruction, which “has become mainstream, occupying an accepted, respected, and expected place in librarianship” since the 1980s.3 Recently, two sequential reports by the Primary Research Group, which publishes research reports and benchmarking studies for businesses, educational institutions, and other organizations, revealed a dramatic increase in the number of instruction sessions: a 20 percent rise between the fall semester of 2006 and the fall semester of 2007, and a 23 percent growth from 2012 to 2013.4 Although academic librarians have invented or reinvented various types of library instruction, including noncredit or credit-bearing, disciplinary-specific library courses5 and programmatic approach instruction,6 the most popular form is still one-shot library instruction. Many librarians consider one-shot instruction to be a quintessential component of library instruction.7 Students receive the instruction at a point of need or a teachable moment, and Janice Jaguszewski and Karen Williams noted, “The one-shot model is familiar and easy for faculty to incorporate into the syllabus.”8 As Julie Rabine and Catherine Cardwell observed, “Librarians devote substantial time and energy to these sessions.”9

However, the longevity of one-shot library instruction does not make its assessment easy. Few publications mention an assessment process within the framework of one-shot instruction,10 although assessment, in general, has surged and become widespread in academic libraries. Many librarians have asserted that evaluation of one-shot library instruction is “too complex and too time consuming.”11 The reason for the complexity is that most one-shot sessions “are designed to help students in specific courses with specific assignments.”12 Because of disciplinary disparities, the design, contents, and pedagogies of individual faculty members vary.
The results of a longitudinal survey sent to teaching faculty four times over a long period reveal a large variation even within a single discipline, inasmuch as individual faculty members changed their assignments from year to year.13 Consequently, librarians must constantly adjust to changes for different course objectives covering different library resources and skills for one-shot library instruction. Determining what to teach in a one-shot session can be daunting.14 Moreover, as Rabine and Cardwell say, “The librarians conducting these sessions function in the role of guest lecturer, performing a service for the teaching faculty member. The librarian has little control over the assignment created by the instructor.”15 In addition, assessment for one-shot library instruction requires collaboration from faculty and students. It can be challenging to get faculty and students to fully cooperate and participate. Librarians’ assessment for one-shot library instruction is never the top priority for faculty and students. Researchers have frequently reported difficulties in collecting data about one-shot library instruction. Students’ unmotivated behaviors create additional challenges not only for the process of data-collecting but also for maintaining assessment as an ongoing practice.

The largest challenge comes from the assessment tools for one-shot library instruction. These tools focus on assessing prepackaged library skills that are mostly irrelevant to the students’ research process for a specific course assignment.
They fit with the Association of College and Research Libraries (ACRL) Information Literacy Competency Standards for Higher Education (the Standards), “which provides a limited, almost formulaic approach” and valorizes the “‘information literate student’ as a construct of imagined accomplishment.”16 Meanwhile, librarians have repeatedly questioned one-shot library instruction and concluded it to have little value or effectiveness, which raises “more troubling questions regarding the level of engagement possible in one-shot library instruction.”17 Developing effective and practical instruments that are relevant to the student research process and that demonstrate the value and impact of one-shot library instruction in a meaningful way is urgent for the whole community of academic libraries.

Therefore, the purpose of this study is to develop a new assessment instrument for one-shot library instruction. Rather than focusing on testing prepackaged library skills, the new instrument aims to identify how students become better prepared to conduct research for their course assignments after receiving one-shot library instruction. The new instrument is referred to as research readiness-focused assessment (RRFA), as opposed to the more traditional approach, which might be called library skills-focused assessment. The purpose of the RRFA is to measure changes in students’ cognitive thoughts, affective feelings, and behavioral actions from their responses before and after one-shot library instruction. This study hypothesizes that:

Hypothesis 1: After one-shot library instruction, students’ cognitive thinking would improve related to conducting research for their course assignment. Specifically, students would understand more about their course assignment, and they would be clearer about their research topic, where to look for sources to develop their topic, and how to conduct library research.
Hypothesis 2: After one-shot library instruction, students would feel that the course assignment was less challenging, and they would have more confidence about completing course assignments successfully.

Hypothesis 3: After one-shot library instruction, students would be more willing to ask the librarian for help over their social networks, would plan to start their research earlier, and would appreciate the one-shot library session more.

Literature Review

The librarypeer literature has some reports of assessment for one-shot library instruction, althoughis the number of such reports is not as significant as those for other types of library instruction. A popular method is the pretest and posttest to measure changes in mss.students’ learning the use of library resources after receiving one-shot library instruction. For example, Donald Barclay employed that method in the 1990s to evaluate the ability This of freshman writing students to find books and articles before and after a one-shot ses- sion. He created essentially the same two questions for his pretest and posttest, along with two satisfaction-survey questions. The results showed that the students learned the two library skills and were satisfied with the one-shot session.18 His assessment was reviewed as a realistic effort to evaluate library instruction in the real world “in terms 622 Library Instruction: A Conceptual Approach

of time and resource limitations.”19 Chris Portmann and Adrienne Roush reported that they administered pre- and posttests to a convenience sample of 38 students enrolled in a 200-level sociology course to measure the influence of one-shot library sessions. Their data analysis indicated that the library instruction had a statistically significant positive influence on library use but not on library skills. Later, the pretest/posttest instrument evolved into large-scaled tools. For example, two librarians at Indiana University in Bloomington administered their pretest and posttest with 26 questions to 404 students in a freshman composition course. They concluded that there was no increase in scores 16.3. from the pretest to the posttest regarding students’ learning of library skills and services.20 The pretest/posttest assessment tool for one-shot library instruction has also been linked to the ACRL Standards. In 2010, two librarians at Rider University in Lawrenceportal Township, New Jersey, and at State University of New York (SUNY) College at Oneonta published their report on how they followed the Standards to develop 15 questions to measure changes in learning outcomes for information literacy concepts after a one-shot session. The 15 questions included asking students how to find books, read citations, access full text articles from databases, use interlibrary loan, and employpublication, search strate - gies such as Boolean operators and truncation. The results showedfor “a positive” but not “dramatic” impact on these defined learning outcomes. The authors claimed, “The pre- and posttest instrument was able to show specific strengths and weaknesses in the students’ comprehension of IL concepts.”21 Kevin Walker and Michael Pearce reported another example of the Standards approach in 2014.accepted Their pre- and posttest instrument, which included 24 items, was modeled on the Standards and designed to meet the goals of the first-year writing program. 
Their results showed significant improvement from pretest to posttest but no statistically significant difference between two instructional methods. They concluded, “One-shot instructional sessions likely do not fulfill the information literacy needs of students.”22

Some experts criticized the pretest/posttest method as merely measuring short-term recall on a prescribed set of skills.23 A reference team at the Hong Kong University of Science and Technology (HKUST) in China determined that the test approach would not serve the purpose of program assessment, because the library instruction program at their institution had different teaching objectives and covered different library skills.24 Therefore, HKUST chose to use a perception survey to collect 466 student responses from 23 class-specific library instruction sessions to measure the lasting value of one-shot library instruction. To trace any enduring impact of library instruction, they administered the survey four to eight weeks after the one-shot session. The survey contained 14 questions which included asking the students what they learned from the one-shot session, what library skills they acquired, whether they continued to use these skills, and how confident they felt to do library research. The survey then asked the students to rate the library session and the library instructor. The results showed that most attendees remained positive about the usefulness of the one-shot sessions and claimed that they retained the skills learned.

Another method that differs from the pretest/posttest method is the one-minute paper. This technique collects students’ instant feedback during the last few minutes of class time. Elizabeth Choinski and Michelle Emanuel reported their use of the one-minute paper method to assess one-shot library instruction for a biology class and a Spanish class

at the University of Mississippi Libraries in Oxford.25 The one-minute paper consisted of four reflection questions: (1) What is the difference between an indexing and abstracting database like BasicBIOSIS—or the MLA (Modern Language Association) International Bibliography or Health and Psychosocial Instruments (HaPI)—and a full-text database like EBSCOHost? (2) Which kind of database was more useful to your research and why? (3) How can you tell if an article is from a scholarly source or from a popular magazine? and (4) How do you know if a website is suitable for academic work? Forty one-minute papers were selected from 307 papers and graded by two other librarians. The authors concluded that the one-minute paper method was an objective, quantifiable, and flexible assessment tool for one-shot library instruction.

Evaluating student papers or bibliographies is another alternative to the pretest/posttest method. Researchers have reported on this method to assess one-shot library instruction since the 1980s. Three experimental studies used student papers and bibliographies to identify effects of instruction and to rate different instructional methods of one-shot library instruction for sociology, writing, and engineering classes.26 Librarians or readers graded the student work based on criteria developed in each study. One of the studies found that the group who received one-shot instruction scored statistically higher than the group who were not exposed to one-shot library instruction. Another study, which aimed to identify different effects of two instructional methods, found that the cognitive strategy (identifying tools appropriate for a research topic) increased the quality of student bibliographies, compared to the traditional tool-specific approach.
The third study found little difference between traditional lectures and “computer-enhanced instruction.” These reports all revealed that only a small portion of student work and one-minute papers could be processed and analyzed, because to maintain consistency among different graders and to convert texts into scores were time-consuming and tedious. Another problem is that the evaluation cannot separate the combined influence of one-shot library instruction, professors’ assistance, and student self-learning, because student work is an end product of research.

This literature review also shows that assessments for one-shot library instruction by testing student library use and skills were mostly conducted in large freshman writing or general education classes because of the uniformity of class assignments and research activities. Little assessment was done at upper undergraduate or graduate levels because instruction for these classes is often unique for discipline-specific assignments. For example, a psychology professor and a librarian’s pilot project evaluated a “super-size bibliographic instruction” (two class periods) for a psychology research class,27 and two librarians measured social work students’ information literacy skills after a Social Welfare Research Seminar.28 Because these discipline-specific assessments demonstrate deeply embedded instruction for specific course assignments, it is hard to use these unique measurements to replicate these assessments. Owing to the dynamic nature of one-shot library instruction and the limitations of various instruments, almost all assessments of one-shot instruction end up as a one-shot assessment. As Portmann and Roush pointed out, “Without continual repeat applications, these one-study approaches do little if anything in the way of establishing an enduring assessment program for library instruction.”29

Methods

The Research Readiness-Focused Assessment (RRFA) Instrument

The literature review reveals some flaws in assessment for one-shot library instruction. Various evaluation tools rely heavily on measuring general library skills. Such prepackaged tools test student library skills as learning outcomes but have little relevance to the early progress of student research for a specific course assignment. The evaluation of student papers or bibliographies may reveal how students utilize library skills in their coursework at the end of their research, but the process is tedious and can only assess a small amount of student work. Another limitation is that it is hard to single out the impact of one-shot library instruction on student work from influences of nonlibrary instruction. As librarians Christopher Bober, Sonia Poulin, and Luigina Vileno described two decades ago,

Basic library skills testing does not measure the students’ actual learning. Correct answers can often be given through short term recall. Although students may have mastered basic library skills, such as how to read a call number, they may not have acquired the conceptual knowledge necessary to adequately conduct their own research.30

Although the three librarians admitted that higher-order cognitive skills were difficult to evaluate, they advocated a conceptual approach as opposed to a library skills-based approach. However, they did not specifically define what a conceptual approach was. Based on the information search process (ISP) model developed by Kuhlthau in the 1980s, the new RRFA instrument created in this present study is a conceptual approach to assessment for one-shot library instruction. The concept is research readiness.
The approach aims to capture changes in student research readiness before and after receiving one-shot library instruction. The RRFA instrument consists of a pre-survey and a post-survey, both containing essentially the same 11 question items (see Appendix A and B). For practical purposes, the survey instrument must be short. Barclay called his pretest and posttests “brutally simple”: they only had two free-response questions to test how students find books and periodicals by subject.31 However, a “brutally simple” instrument does not necessarily yield meaningful results. A meaningful assessment depends on the validity and reliability of the instrument.

Validity

Validity is the extent to which an instrument measures what it is intended to measure. “Validity has long been regarded as the touchstone of educational and psychological measurement and has therefore been defined repeatedly with varying nuances in the assessment literature.”32 However, to date, validity of assessment for one-shot library instruction has not been defined in the professional research literature. This study attempts to define validity of assessment for one-shot library instruction by referencing the established practice in the social sciences for the validation process. As Edward Carmines and Richard Zeller stressed, “One validates not the measuring instrument itself but the measuring instrument in relation to the purpose for which it is being used.”33 Many psychological testing specialists have argued that all validity is essentially construct validity.34 As Carmines and Zeller described it:

Construct validity is woven into the theoretical fabric of the social sciences, and is thus central to the measurement of abstract theoretical concepts. Indeed, as we will see, construct validation must be conceived of within a theoretical context. Fundamentally, construct validity is concerned with the extent to which a particular measure relates to other measures consistent with theoretically derived hypotheses concerning the concepts (or constructs) that are being measured.35

The validation in developing the RRFA instrument for this study included defining the domain of the assessment for one-shot library instruction, constructing a theoretical framework for the process of developing the instrument, elaborating the theoretically derived hypotheses, and applying the statistical techniques of reliability tests and factor analysis to determine the validity of the results.

The first step in developing an instrument is to define what is being measured, which is called domain definition.36 The domain definition must be traced back to the purpose of one-shot library instruction. Evan Farber, a guru of bibliographic instruction in the 1980s, had a simple and insightful comment about the purpose of library instruction: “We must remember what most students are interested in is not learning how to use the library, but only in learning how to use the library to help them with their classes.”37 Practically speaking, one-shot library instruction is generated by a faculty member for a particular course assignment. The library session is just one episode in the whole semester of a class.
Usually, the professor uses one-shot library instruction to initiate student research for a course assignment. In a single class period, the librarian is expected to lead students into disciplinary resources relevant to their course assignment, demonstrate searching techniques, retrieve results relevant to their topics, boost students’ confidence, and take students directly into the beginning of the research process. Hence, “Instruction in bibliographic resources is useless unless wedded to a course project in which students are simultaneously acquiring subject knowledge and direction from the professor and bibliographic skills from the librarian.”38 The course assignment is the origin of the faculty member’s expectation and the students’ motivation to learn library use and skills. The purpose of one-shot instruction is to help students become better prepared to adequately conduct research for course assignments. In this sense, student research readiness is the domain definition of the assessment for one-shot library instruction.

The literature review for this study revealed that most assessments for one-shot library instruction focus on testing library skills. An argument for library skills-focused assessment is that it is natural to measure what has been taught because one-shot instruction covers use of library services and skills. However, this argument does not address the purpose of one-shot library instruction. As a reference librarian explained three decades ago, library skills are not equivalent to research in a disciplinary field:

One could teach any man-on-the-street about Library of Congress subject headings, catalog cards, the nature and structure of indexing/abstracting systems and other bibliographies, the mechanics of reading citations, and so on. That same man-on-the-street could then pass a library skills test with flying colors. But he would not then be qualified to do research in anthropology.39

Validity is also inseparable from theory. As Carmines and Zeller say, “Construct validation must be conceived of within a theoretical context.”40 The RRFA instrument to measure research readiness in this study was constructed based on Kuhlthau’s ISP model.41 The model incorporates the three dimensions of cognitive (thoughts), affective (feelings), and physical (research) actions and divides the information search process into six stages: (1) task initiation, (2) topic selection, (3) prefocus exploration, (4) focus formulation, (5) information collection, and (6) search closure and presentation. In the first three stages, information seekers’ actions frequently involve beginning the task, investigating general information and identifying topics, and experiencing feelings of uncertainty, apprehension, anxiety, and confusion. The focus formulation stage is a turning point of the search process. After the topic becomes more focused and personalized, there is increased confidence and a sense of clarity. Feelings of relief with a sense of satisfaction eventually appear with a successful search process and completion of the task. Later, information science researchers simplified Kuhlthau’s six stages into three categories: (1) prefocus, (2) focus, and (3) post-focus.42 Prefocus is the most critical stage for students to initiate their engagement with the course assignment before arriving at the focused stage.
Because students have the least self-assurance at the beginning about their topics and information needs, they are vulnerable to developing false focus and “the fallacy of the perfect thirty-item online search.”43 According to Kuhlthau’s observation in one of her series of studies, 50 percent of the subjects showed no evidence of ever achieving focus.44 One-shot library instruction takes place exactly at the critical prefocus stage.

The most common reason for faculty to request one-shot library instruction for their classes is for a course assignment. The assignment initiates a research task, triggers users’ emotions, and generates research actions. In the ISP model, the task stands at the center of all three dimensions. Information seekers’ thoughts, feelings, and actions change depending on how much of the task is completed. Because all needs and motivations

converge at the task, it is predictable that if one-shot library instruction is wedded with the course assignment, it will help students be better prepared for completing the assignment. In other words, a successful library session will equip students with research skills and resources for their cognitive learning of course assignments, will reduce their emotional barriers, and will mobilize their research actions. Thus, encompassing the course assignments, research readiness is a concrete concept aligned with the three dimensions of the ISP.

To increase student research readiness at the prefocus stage, three components of one-shot library instruction derived from the ISP model are needed. First, students need to know how to access disciplinary research literature, use appropriate search strategies, and effectively find and retrieve relevant and significant sources at the beginning of the research. Ideally, the instruction should “get students into the primary literature as quickly as possible, for it is here that subject knowledge and scholarly guidance will be found.”45 Second, students need to be informed that the research process is nonlinear and might take them in a number of different directions.46 Students also need to be advised about the common problems they will encounter, and strategies and resources to handle these problems. Third, a one-shot session is not an isolated or stand-alone episode but a floating event to transfer the students’ previous library experience and skills to their present needs and to escort them into the next research stage. As Kuhlthau explained, “The active process of forming meaning from information is the task of the user in the ISP model.”47 The ISP model regards user experience as a personalized process.
The more a personalized topic is developed, the more successful the search process becomes. Kuhlthau’s longitudinal study exhibits that, after the students became more experienced, they showed a sense of ownership by referring to the research process as “my process” and describing their research strategy as “this is the way I do it.”48 A one-shot session is an opportunity for a librarian to establish a connection to help students personalize their research in the later research stage. Therefore, the design of the RRFA instrument is to assemble the cognitive, affective, and behavioral dimensions, integrating students’ previous experience and post-research activities into the survey questionnaire.

In the pre- and post-surveys, the cognitive dimension is measured by four items on how clearly students understand (1) their course assignment, (2) their research topics, (3) library research, and (4) where to look for sources for their assignment. The cognitive dimension was not stretched to specific course subjects or research topics in this study because they differ between individuals and classes. In fact, most one-shot library instruction sessions are discipline-specific, and they often come from disciplines in the social sciences. Psychology, sociology, and education are commonly listed as the top academic departments that request the most library instructional sessions, in addition to English (composition classes).49 Empirical evidence showed that undergraduate social science majors engaged in more information-seeking behaviors and will more likely request assistance from reference librarians compared to physical science majors.50 The disciplinary nature of the social sciences is dynamic. According to the psychologist Anthony Biglan, “The physical sciences are characterized by the existence of paradigms that specify the appropriate problems for study and the appropriate methods to be used. It appears that the social sciences and nonscience areas such as history do not have such clearly delineated paradigms.”51 As librarians observed in the 1970s,

The social sciences are less stable, lack clear boundaries, and use imprecise terminology . . . New fields, such as futuristics or social policy, continually emerge in the social sciences, and disciplines overlap in such a way as to make the subject matter dependent for classification more on the author’s area of expertise than on the content.52

This is the main reason that the four items constructed for the cognitive dimension were not designated for specific subjects or research topics in the short instrument.

Figure 1. Student readiness defined in the research readiness-focused assessment (RRFA) instrument.

The affective dimension is defined by two items asking how greatly students feel challenged by their course assignment and how much confidence they have about completing the assignment successfully. The behavioral dimension includes five items asking when students will start their research and requesting that they rate the usefulness of the library session. The behavioral dimension also includes how students seek research help. In one of Kuhlthau’s series of studies, 80 percent of the 26 students in the study responded that they “almost always” or “often” talk to others about their topic. However, Kuhlthau’s research found that most responses about student perceptions of getting help from librarians or teachers fell into the categories of “sometimes,” “seldom,” or “almost never.”53 Asking for help can be a sociological dimension in the information search process, although the sociological dimension is not defined in the ISP model. For this reason, the instrument places “asking for help” into the behavioral dimension.
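The pre/post comparison at the heart of the RRFA can be sketched in a few lines. The following is only an illustration, not the study’s actual analysis code: it assumes hypothetical 5-point Likert responses for the two affective items (with the “challenge” item reverse-scored so that higher always means greater readiness) and compares dimension scores before and after instruction with a paired-samples t-test.

```python
import numpy as np
from scipy import stats

# Hypothetical 5-point Likert responses (rows = students) for the two
# affective items (QI-6 challenge, reverse-scored; QI-7 confidence),
# collected before and after a one-shot session. Data are invented.
pre  = np.array([[2, 2], [3, 2], [2, 3], [3, 3], [2, 2], [3, 4], [2, 3], [3, 2]])
post = np.array([[3, 4], [4, 3], [3, 4], [4, 4], [3, 3], [4, 5], [3, 4], [4, 3]])

# Dimension score = mean of the items belonging to that dimension.
pre_score = pre.mean(axis=1)
post_score = post.mean(axis=1)

# Paired-samples t-test: did the affective dimension change after instruction?
t_stat, p_value = stats.ttest_rel(post_score, pre_score)
print(f"mean change = {np.mean(post_score - pre_score):.2f}, "
      f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

The same scoring-and-comparison step would be repeated for the cognitive and behavioral dimensions, one paired test per dimension.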
Table 1 displays how the coded items in the RRFA instrument define each dimension. As Table 1 shows, two questions from the pre-survey investigate students’ previous experience and preparations for their topics and research. The pre-survey not only functions as a baseline for comparing changes in student research readiness but also provides librarians “the opportunity to design relevant and authentic activities.”54 Such an investigation should be an important part of one-shot library instruction, as Andrea Brooks noted:

Approaching an information literacy session without pre-test is similar to walking into a classroom on the first day of the semester. The students and the instructor are strangers

Table 1. The three dimensions of the information search process (ISP) model defined by question items

Affective feelings
- How challenging is your class assignment to you? (scaled QI–6)
- How confident are you in completing your class assignment successfully? (scaled QI–7)

Cognitive thoughts
- How clearly do you understand your class assignment? (scaled QI–2)
- How clear is your research topic in your mind? (scaled QI–3)
- How clearly do you know where you should look for information for your research? (scaled QI–4)
- How clearly do you understand the library research for your class assignment? (scaled QI–5)

Physical actions
- When searching databases, you can utilize search techniques to maximize relevant results without losing a research focus. Please indicate which of the following techniques can be used to narrow or expand the numbers of search results. (multiple choice)
- If you run into a problem with your research, who are you most likely to ask for help? (scaled QI–8, 9, 10)
- When are you going to start your research for your class assignment? (multiple choice, scaled QI–11)
- How useful was the library session for you? (scaled QI–1)
- Why do you think the library session was or was not useful to you? (text)

Previous experience/preparation
- Did you attend any library session/course to learn how to conduct library research? (category)
- What research topic do you have for your class assignment now? (text)
- What resources are you most likely to use for research of your course assignment? (text)

Post-connection
- Who are you most likely to ask for help? (scaled QI–8, 9, 10)
- When are you going to start your research for your class assignment? (multiple choice/scaled QI–11)

*QI means question item, followed by the numbers of variables.

to each other. In a semester-long class, an instructor will gain knowledge about his or her students and adopt lesson plans and approaches to fit the class needs. For librarians teaching a one-shot session, this is not an option. However, pre-testing helps make a connection with the class ahead of time.55

A common challenge of one-shot library instruction is redundancy.56 An observation by Amalia Monroe-Gulick and Julie Petr illustrates such a problem. In their study, a student described typical repetitiveness:

I have had one, two, three, four. Four or five and I really think that one was more than enough because they would bring us in, and I mean it’s very important to know how to use the library, to know how to use the resources that are available to you. And not to offend anyone, it is an important time in the sun for librarians because they are so unappreciated in academia that, whenever they’re given an opportunity to—this has been my experience—to speak, to give the incredible training that they usually have had to do their jobs, they get very excited. And sometimes encourage professors to do it lots of times and once was very informative. The second time cleared up some questions that I had. Three and four made me again want to toss myself out the nearest window (Interview 4, 83).57

A pre-survey can help librarians adjust their instruction to avoid redundancy. The post-survey contains essentially the same questions as the pre-survey. One unique open-ended question in the post-survey collects students’ comments about the one-shot session.58 The open-ended question and the question asking students to indicate their strategies or techniques and disciplinary sources are free-text answers or multiple choices (question 3 in Appendix A and question 4 in Appendix B). The rest of the items use a five-point Likert scale, asking respondents to express how much they agree or disagree with particular statements.
Research readiness is hence constructed as a quantifiable concept to measure changes in students’ cognitive thoughts, affective feelings, and intended research actions for the course assignment after one-shot library instruction.

Reliability

Reliability is generally accepted as a necessary precondition for validity.59 Cronbach’s alpha coefficient is a measure of internal consistency, which means the degree to which all the items in the instrument measure the same construct. The alpha coefficient for the 11 items in the post-survey was 0.69, nearly at the acceptable level of 0.70 for internal consistency in the social sciences. Two items, “Challenge [of] assignment” and “Network,” decreased the alpha value. First, the reverse coding of some items might have caused artifactual (nonsubstantive) problems with unidimensionality, the extent to which the instrument measured a single attribute at a time, because negatively worded items require people to reframe the statement. For example, “I dislike my job” and “I don’t like my job” might be answered slightly differently. In this case, because “Challenge [of] assignment” was next to “Confidence,” the reverse scale of “Challenge [of] assignment” might have affected some respondents. Second, “Network” proved more extraneous than

Table 2. Reliability statistics: Cronbach’s alpha* if item deleted

Items                       Cronbach’s alpha if item deleted
Usefulness                  0.663
Clarity assignment          0.634
Clarity topics              0.644
Clarity sources             0.643
Clarity library research    0.638
Challenge assignment        0.716
Confidence                  0.661
Ask professor               0.683
Ask librarian               0.663
Network                     0.724
When                        0.679
Total items                 0.690

* Cronbach’s alpha is a measure of internal consistency, the degree to which all the items in the instrument measure the same construct.

relevant to the internal consistency of the RRFA instrument. If “Network” is deleted, the alpha value increases to 0.724 (see Table 2). The author also separately calculated the alpha coefficients for the items of the cognitive, affective, and behavioral dimensions (see Table 3), even though this study was not intended to test or confirm the ISP model, and the scale was developed to measure the single attribute of research readiness. The Cronbach’s alpha coefficient of the four items for the cognitive dimension was 0.803, suggesting that the items had high internal consistency. The alpha of the two items for the affective dimension was 0.259, and the alpha of the five items for the behavioral dimension was 0.418, suggesting that the items for these two dimensions had low internal consistency. A way to improve the alpha value is to add more items to measure the same dimension. As indicated in Jum Nunnally’s Psychometric Theory (1978), “The primary way to make tests more reliable is to make them longer.”60 Future research might develop a longer three-factor RRFA instrument to measure the three dimensions of the ISP accurately and consistently. Nonetheless, two main strengths of the instrument in the present study are its parsimony and unidimensionality.

Table 3. Reliability statistics: all items and items of each dimension of the research readiness-focused assessment (RRFA) instrument

                       Cronbach’s alpha*    Number of items
Total items            0.690                11
Cognitive dimension    0.803                4
Affective dimension    0.259                2
Physical dimension     0.418                5

* Cronbach’s alpha is a measure of internal consistency, the degree to which all the items in the instrument measure the same construct.
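The “alpha if item deleted” diagnostic reported in Table 2 can be reproduced directly from raw survey responses. The sketch below is a minimal illustration using NumPy, not the author’s SPSS procedure, and the demo matrix is hypothetical Likert data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items matrix of Likert scores."""
    k = items.shape[1]                          # number of items
    item_vars = items.var(axis=0, ddof=1)       # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def alpha_if_deleted(items: np.ndarray) -> list:
    """Recompute alpha with each item dropped in turn (Table 2's diagnostic)."""
    return [cronbach_alpha(np.delete(items, j, axis=1))
            for j in range(items.shape[1])]

# Hypothetical responses: 8 students x 4 items on a 1-5 scale
demo = np.array([[4, 4, 5, 2], [3, 3, 4, 5], [5, 5, 5, 1], [2, 3, 2, 4],
                 [4, 5, 4, 3], [3, 2, 3, 5], [5, 4, 5, 2], [2, 2, 1, 4]],
                dtype=float)
print(round(cronbach_alpha(demo), 3))
print([round(a, 3) for a in alpha_if_deleted(demo)])
```

An item whose removal raises alpha, as “Network” does in Table 2, contributes noise rather than shared signal to the scale.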

Figure 2. The research readiness-focused assessment (RRFA) factor scree plot, showing how much variation in the data each factor can explain.

Table 4. Pattern matrix

Item                        Factor 1    Factor 2    Factor 3
Clarity library research    0.847
Clarity sources             0.830
Clarity assignment          0.744
Clarity topics              0.557
Confidence                  0.420
Ask librarian                           0.844
Usefulness                              0.409
Network
Challenge assignment                                0.523
When
Ask professor

Extraction method: principal axis factoring. Rotation method: oblimin with Kaiser normalization. Rotation converged in eight iterations.

To judge the validity of the instrument, the author used a set of statistical procedures called factor analysis. Factor analysis can be useful for determining the validity of empirical measures, as Nunnally explained in 1978.61 Specifically, factor analysis can show that each of the components that informed the original scale development (in this case, the cognitive, affective, and behavioral dimensions) was reflected in the current measure. The purpose of this exploratory factor analysis was to demonstrate adequate coverage of the construct and to provide evidence for content validity, how well the test measured what it was intended to measure.62 The author used principal axis factoring to assess the dimensionality of the 11 items of the RRFA. Three factors (dimensions) were extracted, explaining 54.61 percent of the variance. This was decided based on eigenvalues that were greater than 1, cumulative variance, and inspection of the scree plot (see Figure 2). The pattern matrix (see Table 4) presents the three loading factors. The first factor includes five items. Four of the five items—clarity library research, clarity sources, clarity assignment, and clarity topics—were designed for the cognitive dimension. Although the factor analysis did not extract all three factors for the RRFA instrument, at least it partially captured one of the three dimensions—cognitive thoughts—demonstrating potential for the new instrument to be improved in the future. Therefore, this study used all 11 items to gather the data, analyze, and interpret the results instead of focusing on each dimension.
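The retention decision described above (eigenvalues greater than 1, cumulative variance, and inspection of the scree plot) can be sketched in a few lines. This is an illustration of the Kaiser criterion on hypothetical data, not a reconstruction of the study’s principal axis factoring.

```python
import numpy as np

def extraction_summary(responses: np.ndarray):
    """Eigenvalues of the item correlation matrix, the count retained under
    Kaiser's rule (eigenvalue > 1), and cumulative variance explained."""
    corr = np.corrcoef(responses, rowvar=False)         # items x items
    eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]   # descending order
    n_keep = int((eigvals > 1).sum())                   # Kaiser criterion
    cum_var = np.cumsum(eigvals) / eigvals.sum()
    return eigvals, n_keep, cum_var

# Hypothetical data: two noisy underlying traits behind six items
rng = np.random.default_rng(42)
traits = rng.normal(size=(200, 2))
pattern = np.array([[1, 1, 1, 0, 0, 0], [0, 0, 0, 1, 1, 1]], dtype=float)
responses = traits @ pattern + 0.5 * rng.normal(size=(200, 6))
eigvals, n_keep, cum_var = extraction_summary(responses)
print(n_keep)  # factors retained under Kaiser's rule
```

Plotting `eigvals` against their rank gives the scree plot shown in Figure 2; the “elbow” and the eigenvalue-greater-than-1 cutoff usually point to the same number of factors.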

Participants

The participants were the 240 students who attended the 10 one-shot sessions in the spring semester of 2014. There were 184 responses to the pre-survey, for an overall response rate of 77 percent. The post-survey had 129 responses for a rate of 54 percent, with relatively high variations of response rates for individual questions. The 10 sessions were given to classes at the undergraduate level, including four social work classes, two sociology classes, two psychology classes, one anthropology class, and one English composition class. The instruction librarian normally conducts library training for English composition courses. In this case, however, the English composition class asked the social sciences librarian (the author of this present study) to conduct the one-shot session because the students needed to learn to search the iPOLL Databank, a database of public opinion surveys, to incorporate statistical information in writing their papers. The course assignments for the 10 classes varied but were all related to writing papers within the disciplines. The author of this study conducted all 10 one-shot sessions. She designed the instruction for each session differently depending on the course assignments and on professors’ expectations and requirements, but the teaching was always based on the three components mentioned in the previous section of this report on “Validity.”

Procedure

The Institutional Review Board (IRB) granted an exemption for this study. The consent letter and the Web links for surveys were e-mailed to the faculty members who administered the surveys to the students.
The pre-survey was administered in the class before the one-shot session, and the post-survey was given to the students in the next class after the one-shot session. Timing is important to collect data efficiently. If the pre-survey comes too early, the faculty and the students might not be ready to pay attention to the upcoming library session. If the post-survey is administered too late, faculty and students might have moved on to their next course topic, and their memories from the one-shot session might have faded. Two selling points encouraged faculty and student participation: each survey is short and takes only about five minutes to complete, and information collected from the pre-surveys can also benefit faculty. Faculty members were interested in knowing how their students prepared for conducting research for their course assignment. No incentive (such as extra credit or gifts) was given to students for taking the surveys. Data were collected from each course separately in the spring semester of 2014. Because the RRFA instrument focused on measuring the perceived changes of student cognitive thoughts, affective feelings, and research actions, and was not designed for a

particular discipline or course, this study merged all data files from the 10 sessions into two related large groups: the student responses before and after the library instruction. Data were entered in SPSS, a software package used for statistical analysis. The author used the paired sample t-test to determine whether the changes in student research readiness were statistically significant. This statistical method is appropriate for pretest/posttest situations because it repeatedly measures the same subjects over the course of a treatment or time, and it examines the mean of individual differences of the paired samples. Paired samples mean that each individual’s responses in the pre-survey should have been matched to the same individual’s responses in the post-survey. However, this study could not pair individual responses due to a lack of respondent identifications. Librarians Kate Zoellner, Sue Samson, and Samantha Hines reported a similar problem in their article “Continuing Assessment of Library Instruction to Undergraduates” in 2008.63 Zoellner, Samson, and Hines went ahead with their test because they believed that “the sample does meet the assumption of normality, enabling generalizations across the population.” Although the sample size (pre-survey N = 184, post-survey N = 129) of the present study was not as large as that reported in 2008 (pretest N = 214, posttest N = 212), it seemed large enough. Because of the similarity to the three librarians’ study in 2008, this present study continued to run the paired sample t-test for analyzing the data collected in the spring of 2014. This study also used the sign test to compare the results of the paired sample t-test. The sign test is an alternative to the paired sample t-test for checking for a difference between two populations; it requires few assumptions of normal distribution and can be used when the variables are measured on ordinal scales.64

Results

After the data were merged into two large groups from the pre-survey and post-survey, the author retrieved and coded 11 variables (see Table 1). These variables were all measured on a five-point Likert scale, except QI–11, which had five multiple choices: (1) “two days before the assignment is due,” (2) “one week before the assignment is due,” (3) “two weeks before the assignment is due,” (4) “right after the library session,” and (5) “I have started it.” The text responses were manually scaled from the latest date to the earliest date as 1 to 5. Table 5 provides the results of the paired sample t-test and the sign test, which include the means of the pre- and posttests; z scores, test statistics that measure the difference between an observed statistic and its hypothesized population parameter; and p values, probabilities that measure the evidence against the null hypothesis.65 With the confidence interval at 95 percent, this study used 0.05 as the threshold of statistical significance. All 11 variables showed positive changes in the expected directions. Eight items in the paired sample t-test and seven items in the sign test demonstrated statistically significant differences from the pre-survey to the post-survey. The two tests yielded almost identical results, except one item, QI–6, “Challenge [of] assignment.”
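The near-agreement between the two tests reflects what each one measures: the paired t statistic uses the mean and spread of the individual differences, while the exact sign test keeps only the direction of each change. A stdlib-only sketch with hypothetical ratings (not the SPSS routines used in the study) illustrates both:

```python
from math import comb, sqrt

def paired_t_statistic(pre, post):
    """t statistic for paired samples: mean difference over its standard error."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)
    return mean / sqrt(var / n)

def sign_test_p(pre, post):
    """Exact two-sided sign test for paired ordinal data (ties dropped)."""
    signs = [b - a for a, b in zip(pre, post) if b != a]
    n = len(signs)
    k = sum(d > 0 for d in signs)
    tail = min(k, n - k)
    # Two-sided p-value under Binomial(n, 1/2)
    p = sum(comb(n, i) for i in range(tail + 1)) / 2 ** (n - 1)
    return min(p, 1.0)

# Hypothetical pre/post Likert ratings for one item from ten students
pre = [3, 2, 4, 3, 2, 3, 4, 2, 3, 3]
post = [4, 3, 4, 4, 3, 4, 5, 3, 4, 4]
print(paired_t_statistic(pre, post))  # 9.0 for these data
print(sign_test_p(pre, post))         # 1/256, about 0.0039
```

Because the sign test discards magnitudes, it has less power than the t-test on the same data, which is consistent with the one borderline item (“Challenge [of] assignment”) reaching significance only under the t-test in Table 5.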

Table 5. Results of paired sample t-test and sign test

                                     Paired sample t-test                         Sign test
Variables                            Pre-survey   Post-survey   p value           Z        Asymp. Sig.
                                     (mean)       (mean)        (2-tailed)                 (2-tailed)
Usefulness of instruction            3.67         4.15          0.000*            –2.854   0.004*
Clarity of course assignment         2.98         3.67          0.000*            –4.1     0.000*
Clarity of research topic            2.79         3.33          0.000*            –3.98    0.000*
Clarity for finding sources          2.94         4.01          0.000*            –5.536   0.000*
Clarity of library research          2.90         3.90          0.000*            –6.109   0.000*
Challenge of course assignment       3.11         2.86          0.049*            –1.735   0.083
Confidence to complete assignment    4.06         4.11          0.609             –1.575   0.115
Ask professor for help               4.24         4.29          0.671             –0.434   0.664
Ask librarian for help               3.16         3.95          0.000*            –3.733   0.000*
Ask network for help                 3.17         2.98          0.254             –1.547   0.112
When start to research               3.30         3.92          0.000*            –5.414   0.000*

*p value < 0.05.

Research hypothesisreviewed, 1—“After one-shot library instruction, students’ cognitive think- ing would improve related to conducting research for their course assignment”—was supportedpeer because the means and medians of the four variables that measured cognitive thoughtsis showed statistically significant increases from the pre-survey to the post-survey. The mean and median of student feelings about the challenge of their course assignments dropped significantly in the post-survey. The mean and median of students’ confidence mss.in the post-survey increased, but the increase was not large enough to be statistically significant to retain research hypothesis 2—“After one-shot library instruction, students This would feel that the course assignment was less challenging, and they would have more confidence about completing course assignments successfully.” Similarly, the increases of the means and medians for QI–9, “Ask librarian,” and QI–11, “When,” were statisti- cally significant, but the increase of QI–8, “Ask professor,” and the decrease of QI–10, Rui Wang 637

“Network,” were not large enough to support research hypothesis 3—“After one-shot library instruction, students would be more willing to ask the librarian for help over their social networks, would plan to start their research earlier, and would appreciate the one-shot library session more.”

Discussion

By using the RRFA instrument, this study, in general, demonstrated improvement in students’ research readiness after receiving one-shot library instruction at the beginning of their research. The improvements in the cognitive dimension were statistically significant. After one-shot library instruction, the students understood more about their assignments, and they were clearer about their research topics, about where to look for information to develop their topics, and about library research. The students also changed psychologically: they considered the course assignments less challenging, and they felt more confidence about their ability to complete the assignments successfully. After one-shot instruction, the students showed statistically significant changes in their willingness to ask the librarian for help over their social networks, a likelihood to start their research earlier, and increased appreciation for the usefulness of the library session. The largest increases of the means and medians among the 11 variables, “Clarity for finding sources” (from pre-survey 2.94 to post-survey 4.01) and “Clarity of library research” (from pre-survey 2.90 to post-survey 3.90), indicate that the students learned the most about finding sources and conducting library research for their assignments in the one-shot sessions. Even though the changes in the three variables “Network” (decreased), “Ask professor” (increased), and “Confidence” (increased) were not statistically significant, these moderate changes still imply improvements in the students’ research readiness. How should we explain these moderate changes?
“Network” was defined as social networks including classmates, friends, parents, and social media, and was used to compare the changes in “Ask librarian.” Apparently, the students felt more comfortable and thought it more convenient to ask people around them, even though there was a large increase in the likelihood that they would ask the librarian for help after one-shot library instruction. The two variables “Confidence” and “Ask professor” had the highest mean scores in the pre-survey (4.06 and 4.24 out of 5), but they both had the smallest gains (0.05) in the post-survey. Seemingly, compared with other sources of help, the students remained most likely to work with their professors for their course assignments. The insignificant gain in student confidence echoes the findings in previous studies about a tendency for student overconfidence. Two studies, reported by Amy Gustavson and H. Clark Nall in 2011 and by Melissa Gross and Don Latham in 2012, observed that first-year college students tended to rate their confidence levels higher than their information literacy skills.66 This present study also showed such a tendency in undergraduate students

across all levels; they were inclined to rate their confidence in successfully completing their course assignments high prior to starting their research. Such overconfidence could result from false optimism because the students did not know what they did not know. In other words, they did not anticipate difficulties they would encounter until they actually began their research. False optimism could worsen students’ psychological experience, leading to apprehension, anxiety, confusion, and frustration, which commonly occur at the prefocus stage. For this study, the students’ false optimism not only explains the smaller gain in confidence but also indicates that one-shot library instruction can help students be more realistically prepared to deal with potential research difficulties. Although the RRFA instrument has flaws, this study shows that it measures and identifies the instant improvement of student research readiness after receiving one-shot instruction in the critical prefocus research stage. The RRFA’s success comes mostly from the defined assessment domain, research readiness, which is theoretically supported by the ISP model and by the validation process in the present study, so that research readiness is quantifiable and interpretable. The time and process needed for analyzing and interpreting data are shortened and simplified. However, the investigator could not quantify all items in the RRFA instrument. Two items were originally designed to identify the learning outcomes of using search strategies and disciplinary sources for the behavioral dimension of the ISP. These two items were:

1. When searching databases, utilizing search techniques can maximize relevant results without losing a research focus. Please indicate which of the following techniques can be used to narrow or expand numbers of search results.
2. List a database that indexes [disciplinary] journal articles.
Because these items were multiple choices and texted responses, they were difficult to quantify and so were omitted. Without including the learning outcomes of using search strategies and disciplinary resources, the ability to measure the behavioral dimension of the RRFA was weakened. The effort to make the surveys short paid off. Most students took about five to ten minutes to complete each questionnaire. The shortened instrument evidently attracted more participants and contributed to higher response rates (77 percent in the pre-survey and 54 percent in the post-survey). With limited items, however, it is not possible to fully measure all three dimensions of the RRFA. Nevertheless, the short instrument can make assessment for one-shot library instruction sustainable. In this study, the immediate impact of one-shot library instruction on student research readiness was largely isolated from the possible outside influences of professors’ teaching and students’ outside self-learning by controlling the timing of the surveys. The pre-survey was administered in the class prior to the library session, while the post-survey was given in the next class soon after the one-shot session. To administer surveys twice to the same group within a short time is challenging because assessment is “the last part of the teaching cycle and the one that no one has time for.”67 The timing of the surveys in two class sessions depended on the faculty’s support. Collecting data in a timely fashion also imposed a challenge with the students. According to the faculty, some students felt they were taking the same survey twice because of the matching items in the two questionnaires. One way to minimize the influence of the pre-survey

on the post-survey is to change the order and wording of question items. The number of respondents in the post-survey (N = 129) was smaller than that in the pre-survey (N = 184). The challenges from professors’ willingness to help administer two surveys in a short period and the similarity of the surveys could be the major causes of the dropped respondents in the post-survey. Although the RRFA used timing to eliminate the impact of professors’ assistance and student self-learning, other factors might jointly influence or intervene in students’ learning in the one-shot session—for example, the students’ previous library experiences. As mentioned previously, the pre-survey included this question: “Did you attend any library session/course to learn how to conduct library research?” The purpose of this question was, to some extent, to avoid redundancies by knowing how many students had had library instruction previously. A total of 193 students answered this question, of whom 112 (58 percent) had previous library training in a range of courses, including a credit-bearing library course, English composition, social sciences, sciences, and humanities. Eighty-one students (42 percent) had no library exposure. It is thus possible to identify the relationship between students’ previous library experience and their research readiness in the future. The results of this study, however, suggest that one-shot library instruction can improve students’ research readiness across both populations. There are various ways to refine the RRFA instrument. For example, after accumulating and benchmarking data over time, the RRFA instrument could be reduced to a post-only survey.
The post-only survey could add more open-ended questions to collect student texted responses. The texts would provide richer data to define and analyze research readiness qualitatively. Furthermore, the RRFA items could be changed depending on different course assignments, and new variables could be developed that better measure the three dimensions. Assessment is an important part of instruction because it both guides and advocates instruction. The newly developed RRFA instrument is not a perfect tool, but it opens unlimited possibilities for applying the conceptual approach to assessment of one-shot library instruction. The concept of research readiness seems to fit well with the threshold concept described in the Framework for Information Literacy for Higher Education (Framework). The fit is not as good with the older Standards, which ignore “the vital aspect of attitudes, emotion.”68 As Megan Oakleaf observed, “At first glance, [Jan] Meyer and [Ray] Land [originators of the threshold concept] do not appear to support pedagogy or assessment based on learning outcomes . . . because, they say, it’s impossible to adequately describe a learning goal to students who haven’t yet achieved that goal.”69 Oakleaf also pointed out that, nevertheless, Meyer and Land recognized a need for assessment. In fact, they anticipated what they call

new modes of mapping, representing and forming estimations of students’ conceptual formation . . . a rich feedback environment offered at the point of conceptual difficulty (“stuckness,” the liminal state) as well as in the pre-, post- and subliminal states . . . a more nuanced discourse to clarify variation and experience and achievement through the various stages of the liminal journey . . . the possibility of an ontological (as well as conceptual) dimension of assessment.70

Oakleaf thus concluded:

In fact, threshold concepts are very well suited to learning outcomes assessment, as long as the assessments permit the use of authentic assessment approaches, provide useful feedback to students to help them over the “stuck places,” emphasize individual variation in the journey that students travel to achieve them, recognize that learners may redefine their sense of self.71

In this sense, the RRFA, which measures student research readiness in the liminal or transitional state of their nonlinear research process, can be considered a departure from the Standards and a move in the direction of the Framework in the assessment of one-shot library instruction.

Acknowledgments

The author offers deepest thanks to Kim O’Brien, professor of psychology at Central Michigan University in Mount Pleasant, for her guidance and review of the statistical methods of this report; to Priscilla Seaman, reference and instruction librarian at the University of Tennessee Chattanooga; and to Katherine Mason, digital resources librarian, and Elizabeth Macleod, professor emerita and music librarian, both at Central Michigan University, for their thorough reviews and proofreading.

Rui Wang is a social sciences librarian and associate professor at Central Michigan University Libraries in Mount Pleasant; she may be reached by e-mail at: [email protected].


Appendix 1


Appendix 2


Notes

1. Deborah L. Lockwood, Library Instruction: A Bibliography (Westport, CN: Greenwood, 1979), 8.
2. Ibid., 7.
3. Lynne M. Martin and Trudi E. Jacobson, “Reflections on Maturity,” introduction to “Library Instruction Revisited: Bibliographic Instruction Comes of Age,” Reference Librarian 24, 51–52 (September 1995): 5–13.
4. Primary Research Group, College Information Literacy Efforts Benchmarks (New York: Primary Research Group, 2014), 38; Information Literacy Efforts Benchmarks, 2013 Edition (New York: Primary Research Group, 2014), 30.
5. Sarah Polkinghorne and Shauna Wilton, “Research Is a Verb: Exploring a New Information Literacy-Embedded Undergraduate Research Methods Course,” Canadian Journal of Information and Library Science 34, 4 (December 2010): 457–73.
6. Joyce Lindstrom and Diana D. Shonrock, “Faculty-Librarian Collaboration to Achieve Integration of Information Literacy,” Reference & User Services Quarterly 46, 1 (Fall 2006): 18–23.
7. Megan Oakleaf, Steven Hoover, Beth S. Woodard, Jennifer Corbin, Randy Hensley, Diana K. Wakimoto, Christopher V. Hollister, Debra Gilchrist, Michelle Millet, and Patricia A. Iannuzzi, “Notes from the Field: 10 Short Lessons on One-Shot Instruction,” Communications in Information Literacy 6, 1 (2012): 5–23.
8. Janice M. Jaguszewski and Karen Williams, New Roles for New Times: Transforming Liaison Roles in Research Libraries (Washington, DC: Association of Research Libraries, 2013), 7.
9. Julie Rabine and Catherine Cardwell, “Start Making Sense: Practical Approaches to Outcomes Assessment for Libraries,” Research Strategies 17, 4 (2000): 319–35.
10. Jacalyn E. Bryan and Elana Karshmer, “Assessment in the One-Shot Session: Using Pre- and Post-Tests to Measure Innovative Instructional Strategies among First-Year Students,” College & Research Libraries 74, 6 (November 2013): 574–86.
11. Donald A. Barclay, “Evaluating Library Instruction: Doing the Best You Can with What You Have,” RQ 33, 2 (Winter 1993): 195–202; Rabine and Cardwell, “Start Making Sense.”
12. Rabine and Cardwell, “Start Making Sense.”
13. Rachel Applegate, “Faculty Information Assignments: A Longitudinal Examination of Variations in Survey Results,” Journal of Academic Librarianship 32, 4 (July 2006): 355–63.
14. Bonnie J. M. Swoger, “Closing the Assessment Loop Using Pre- and Post-Assessment,” Reference Services Review 39, 2 (May 2011): 244–59.
15. Rabine and Cardwell, “Start Making Sense.”
16. Association of College and Research Libraries (ACRL), Framework for Information Literacy for Higher Education, draft 1, part 1 (February 2014), http://acrl.ala.org/ilstandards/wp-content/uploads/2014/02/Framework-for-IL-for-HE-Draft-1-Part-1.pdf.
17. Nancy Wootton Colborn and Rosanne M. Cordell, “Moving from Subjective to Objective Assessment of Your Instruction Program,” Reference Services Review 26, 3–4 (1998): 125–37; Chris A. Portmann and Adrienne Julius Roush, “Assessing the Effects of Library Instruction,” Journal of Academic Librarianship 30, 6 (November 2004): 461–65; Kevin W. Walker and Michael Pearce, “Student Engagement in One-Shot Library Instruction,” Journal of Academic Librarianship 40, 3–4 (May 2014): 281–90.
18. Barclay, “Evaluating Library Instruction.”
19. Mary Reichel, “Library Literacy,” RQ 33, 2 (Winter 1993): 195.
20. Colborn and Cordell, “Moving from Subjective to Objective Assessment of Your Instruction Program.”
21. Ma Lei Hsieh and Hugh A. Holden, “The Effectiveness of a University’s Single-Session Information Literacy Instruction,” Reference Services Review 38, 3 (2010): 458–73.
22. Walker and Pearce, “Student Engagement in One-Shot Library Instruction.”
23. Richard Hume Werking, “Evaluating Bibliographic Education: A Review and Critique,” Library Trends 29, 1 (1980): 153–72.
24. Gabrielle Wong, Diana Chan, and Sam Chu, “Assessing the Enduring Impact of Library Instruction Programs,” Journal of Academic Librarianship 32, 4 (July 2006): 384–95.
25. Elizabeth Choinski and Michelle Emanuel, “The One-Minute Paper and the One-Hour Class: Outcomes Assessment for One-Shot Library Instruction,” Reference Services Review 34, 1 (February 2006): 148–55.
26. Amy Dykeman and Barbara King, “Term Paper Analysis: A Proposal for Evaluating Bibliographic Instruction,” Research Strategies 1, 1 (1983): 14–21; David F. Kohl and Lizabeth A. Wilson, “Effectiveness of Course-Integrated Bibliographic Instruction in Improving Coursework,” RQ 26, 2 (January 1986): 206–11; Linda G. Ackerson and Virginia E. Young, “Evaluating the Impact of Library Instruction Methods on the Quality of Student Research,” Research Strategies 12, 3 (1994): 132–44.
27. Alison Paglia and Annie Donahue, “Collaboration Works: Integrating Information Competencies into the Psychology Curricula,” Reference Services Review 31, 4 (2003): 320–28.
28. Mary Jane Brustman and Deborah Bernnard, “Information Literacy for Social Workers: University at Albany Libraries Prepare MSW Students for Research and Practice,” Communications in Information Literacy 1, 2 (2007): 89–101.
29. Portmann and Roush, “Assessing the Effects of Library Instruction.”
30. Christopher Bober, Sonia Poulin, and Luigina Vileno, “Evaluating Library Instruction in Academic Libraries: A Critical Review of the Literature, 1980–1993,” Reference Librarian 24, 51–52 (1995): 53–71.
31. Barclay, “Evaluating Library Instruction.”
32. David B. Sawyer, Fundamental Aspects of Interpreter Education: Curriculum and Assessment (Amsterdam, Neth.: John Benjamins, 2004), 95.
33. Edward G. Carmines and Richard A. Zeller, Reliability and Validity Assessment (Beverly Hills, CA: Sage, 1979).
34. Stephen G. Sireci and Tia Sukin, “Test Validity,” vol. 1, Test Theory and Testing and Assessment in Industrial and Organizational Psychology, ed. Kurt F. Geisinger, Bruce A. Bracken, Janet F. Carlson, Jo-Ida C. Hansen, Nathan R. Kuncel, Steven P. Reise, and Michael C. Rodriguez (Washington, DC: American Psychological Association, 2013), 64.
35. Carmines and Zeller, Reliability and Validity Assessment, 23.
36. Ibid., 65.
37. Larry L. Hardesty, Jamie Hastreiter, David Henderson, and Evan Ira Farber, Bibliographic Instruction in Practice: A Tribute to the Legacy of Evan Ira Farber (Ann Arbor, MI: Pierian, 1993), 5.
38. Stephen K. Stoan, “Research and Library Skills: An Analysis and Interpretation,” College & Research Libraries 45, 2 (1984): 99–109.
39. Ibid.
40. Carmines and Zeller, Reliability and Validity Assessment, 23.
41. Carol Collier Kuhlthau, “Inside the Search Process: Information Seeking from the User’s Perspective,” Journal of the American Society for Information Science 42, 5 (June 1991): 361–71.
42. Lynn Kennedy, Charles Cole, and Susan Carter, “The False Focus in Online Searching: The Particular Case of Undergraduates Seeking Information for Course Assignments in the Humanities and Social Sciences,” Reference & User Services Quarterly 38, 3 (April 1999): 267–73.
43. Ibid. False focus is a situation in which a user settles on a focus too soon, in the prefocus stage. As Kennedy, Cole, and Carter described, if the student rushes to zero in on a topic to avoid information overload, or if a librarian pushes the student toward a topic inappropriately or too soon, “the student might try to cut out the prefocus phase of the search process entirely.” See also Marcia J. Bates, “The Fallacy of the Perfect Thirty-Item Online Search,” RQ 24, 1 (1984): 43–50. As Bates analyzed, the fallacy occurs when one assumes that he or she can produce “a high-quality thirty-item output.” The searcher may inappropriately modify the search to correct a “wrong”-size output or stop immediately upon obtaining the “right”-size output.
44. Kuhlthau, “Inside the Search Process.”
45. Stoan, “Research and Library Skills.”
46. A number of scholars have observed the nonlinear research process. For example, Stoan, in “Research and Library Skills,” 102, summarized what University of Bath investigators in the United Kingdom discovered: “The research process is an extremely complex and personal one that cannot easily be defined or fit into a mechanistic search strategy.” Stoan cited the finding of Maurice Line, the investigator at the University of Bath: “The chronological order of each stage cannot be predetermined, for they vary with the individual researcher’s preference for organizing his work. Research is a process that does not allow for too formal organization . . . Serendipity plays an important role in research, and information that a researcher comes across merely by chance may cause him to channel his work along new lines.” Stoan also referred to the behavioral science philosopher Abraham Kaplan’s analysis to describe researchers’ nonlinear research process: “A new idea generated from one source, an original insight springing from another, may alter the direction of the quest and the kind of material being sought. What is needed next will be dictated by the intellectual evolution of the researcher up to that point. The final product of a research project may even be very different from what the investigator envisioned at the outset. In these circumstances, there can be no pat number of predetermined sources that the researcher will consult.”
47. Kuhlthau, “Inside the Search Process.”
48. Ibid.
49. Primary Research Group, Information Literacy Efforts Benchmarks, 39.
50. Ethelene Whitmire, “Disciplinary Differences and Undergraduates’ Information-Seeking Behavior,” Journal of the American Society for Information Science and Technology 53, 8 (June 2002): 631–38.
51. Anthony Biglan, “The Characteristics of Subject Matter in Different Academic Areas,” Journal of Applied Psychology 57, 3 (1973): 195–203, doi:http://dx.doi.org/10.1037/h0034701.
52. Patricia Stenstrom and Ruth B. McBride, “Serial Use by Social Science Faculty: A Survey,” College and Research Libraries 40, 5 (1979): 426–31.
53. Carol Collier Kuhlthau, “Perceptions of the Information Search Process in Libraries: A Study of Changes from High School through College,” Information Processing & Management 24, 4 (January 1988): 419–27.
54. Andrea Brooks, “Maximizing One-Shot Impact: Using Pre-Test Responses in the Information Literacy Classroom,” Southeastern Librarian 61, 1 (Spring 2013): 41–43.
55. Ibid.
56. Evan Farber, “Bibliographic Instruction at Earlham College,” in Hardesty, Hastreiter, Henderson, and Farber, Bibliographic Instruction in Practice, 6.
57. Amalia Monroe-Gulick and Julie Petr, “Incoming Graduate Students in the Social Sciences: How Much Do They Really Know about Library Research?” portal: Libraries and the Academy 12, 3 (July 2012): 315–35.
58. The open-ended question is not included in Appendix A or B because it was not used in the research.
59. Kurt F. Geisinger, “Reliability,” in APA Handbook of Testing and Assessment in Psychology, ed. Geisinger, Bracken, Carlson, Hansen, Kuncel, Reise, and Rodriguez, 21–42.
60. Jum C. Nunnally, Psychometric Theory, 2d ed., McGraw-Hill Series in Psychology (New York: McGraw-Hill, 1978), 243.
61. Ibid., 62–63.
62. Stephen N. Haynes, David C. S. Richard, and Edward S. Kubany, “Content Validity in Psychological Assessment: A Functional Approach to Concepts and Methods,” Psychological Assessment 7, 3 (1995): 238–47.
63. Kate Zoellner, Sue Samson, and Samantha Hines, “Continuing Assessment of Library Instruction to Undergraduates: A General Education Course Survey Research Project,” College & Research Libraries 69, 4 (2008): 370–83.
64. Yadolah Dodge, The Concise Encyclopedia of Statistics (New York: Springer, 2008), 376–77, 571–74.
65. Minitab.com, Minitab 17 Support, http://support.minitab.com/en-us/minitab/17/.
66. Amy Gustavson and H. Clark Nall, “Freshman Overconfidence and Library Research Skills: A Troubling Relationship?” College & Undergraduate Libraries 18, 4 (2011): 291–306; Melissa Gross and Don Latham, “What’s Skill Got to Do with It? Information Literacy Skills and Self-Views of Ability among First-Year College Students,” Journal of the American Society for Information Science and Technology 63, 3 (2012): 574–83.
67. Oakleaf, Hoover, Woodard, Corbin, Hensley, Wakimoto, Hollister, Gilchrist, Millet, and Iannuzzi, “Notes from the Field.”
68. ACRL, Framework for Information Literacy for Higher Education, draft 1, part 1, 3.
69. Megan Oakleaf, “A Roadmap for Assessing Student Learning Using the New Framework for Information Literacy for Higher Education,” Journal of Academic Librarianship 40, 5 (2014): 510–14, http://meganoakleaf.info/framework.pdf.
70. Ray Land and Jan H. F. Meyer, “Threshold Concepts and Troublesome Knowledge (5): Dynamics of Assessment,” in Threshold Concepts and Transformational Learning, ed. Jan H. F. Meyer, Ray Land, and Caroline Baillie (Rotterdam, Neth.: Sense, 2010), 76–77.
71. Oakleaf, “A Roadmap for Assessing Student Learning Using the New Framework for Information Literacy for Higher Education.”
