
77 Heritage Language Journal, 9(1) https://doi.org/10.46538/hlj.9.1.5 Spring, 2012

Language Placement and Beyond: Guidelines for the Design and Implementation of a Computerized Spanish Heritage Language Exam

Sara Beaudrie University of Arizona

Cynthia Ducar Bowling Green State University

Abstract This paper outlines the design, implementation, and analysis of a computerized Spanish heritage language (SHL) placement exam. The exam created by the authors exemplifies how to design a simple yet effective placement exam with limited resources. It is suggested that an SHL placement exam should be developed in-house due not only to the diversity of student populations but also to the unique structure and content of distinct SHL programs. The paper contends that SHL placement should be a fluid process that allows for the input of students, instructors, and administrators. The paper concludes by reminding readers that an essential component of the SHL placement examination process is a strong and well-thought-out recruitment and promotion program that both precedes and follows placement.

Background Hispanics continue to be the fastest growing segment of the U.S. population; according to 2010 census data, the Hispanic/Latino population in the United States reached 50.5 million, more than double the 1990 population (U.S. Census Bureau, 2011). By 2050, the U.S. Hispanic population is expected to more than double again, reaching 132.8 million people, or 30% of the total U.S. population. In the fall of 2008, 12% of college students and 20% of K-12 students were Hispanic (U.S. Census Bureau, 2010). These data alone warrant continued intense investigation of the pedagogical needs of Hispanic students throughout the educational system and particularly in institutions of higher education. Community colleges, small liberal arts colleges, and large universities alike are experiencing growing Hispanic enrollment. As Beaudrie (2011) notes, “Data from NCES (2007) shows that Hispanic student enrollment in degree-granting institutions increased from 353,000 to 1,667,000 from 1976 to 2004, a 372% rate of growth” (p. 1). More recently, between 2009 and 2010 there was a 24% increase in Hispanic student enrollment at these institutions (Tavernise, 2011). The desire of this growing population to maintain and develop their Spanish language abilities has fueled a growing interest in Spanish as a heritage language (SHL) classes at the university level. In fact, a recent survey by Beaudrie found that 40% of responding institutions now offer an SHL program, “a remarkable increase from the previous percentages of 18% in 1990…” (forthcoming). The growth in SHL programs has led to a simultaneous boom in research on SHL in the United States (see Beaudrie and Fairclough, forthcoming), yet there continues to be a dearth of research on how best to place these students into SHL classes (but see Fairclough, 2006; Fairclough, Belpoliti, and Bermejo, 2010; Fairclough and Ramírez-Vera, 2009; and this issue of the HLJ).
As Schwartz (2001) states, “The issue of assessment is particularly difficult for the heritage languages profession and an area in which there has not been much work, either in the design of new instruments or in the validation of existing instruments” (p. 242). The present study addresses that issue by (a) providing concrete data on the reliability and validity of an SHL placement exam recently created at the University of Arizona, and (b) providing a step-by-step guide for the design and implementation of that same SHL exam. Perhaps most importantly, this paper also addresses the issue of what should be done after the exam to ensure both the retention and success of SHL students.

Defining the SHL Learner Before addressing the myriad of issues complicating placement testing of this population, we must first define who precisely we consider to be an SHL learner, specifically in the University of Arizona context. At the University of Arizona, given the breadth of SHL program offerings (see Appendix), we include as many students as possible under the definition of SHL learner. We concur with Duisberg that a definition of the group should do “nothing to limit the diversity of students (thus) classified” (sic, 2001, p. 24). Thus, we define SHL students as “all individuals that have experienced a relatively extended period of exposure to the language, typically during childhood, through contact with family members or other individuals, resulting in the development of either receptive and/or productive abilities in the language, and varying degrees of bilingualism” (Beaudrie & Ducar, 2005, p. 13). Importantly, this definition allows for the inclusion of students who may not be Hispanic but may “feel” Hispanic (this group often includes students who were raised by a Spanish-speaking caretaker, students who married and/or are dating a Spanish-speaking individual, etc., to give but two examples). In this respect, we concur with Lynch (2008), that “the most decisive factors leading to the linguistic similarities…are of a social nature” (p. 270); in other words, students’ social networks play a decisive role in the formation of their linguistic abilities. Thus defined, SHL learners exhibit “…linguistic processes and social factors attributed to both second language acquisition and to situations of language contact” (Lynch, 2003, p. 31).

We recognize that our definition does not include students who feel a connection with the heritage language but have not been exposed to this language and therefore lack even receptive skills in the heritage language (i.e., what van Deusen-Scholl (2003) terms “learners with a heritage motivation”). This distinction is important, and is not meant to be exclusionary, for although we too consider these students to be heritage learners, their linguistic needs most parallel those of L2 students and therefore they will not be best served in an SHL class (see also Wiley, 2001).

SHL classes at the University of Arizona thus include a range of individuals, from receptive bilinguals to students who have been formally schooled in their home countries in the heritage language. As Beaudrie (2009) found, even receptive bilinguals demonstrate greater listening comprehension abilities than their beginning level L2 peers. Lynch (2003) further notes that the typical SHL student tends to have better developed basic interpersonal communication skills (BICS). Krashen (2000) notes that L2 students often outperform their SHL peers on grammar tests; clearly both groups have distinct linguistic strengths and weaknesses, not to mention substantially different affective needs. These generalizations still lack much empirical research, however, and the precise linguistic differences between the two groups have not yet been clearly delineated. However, researchers do agree that there are indeed differences; in short, SHL students appear to outperform L2 peers in oral production and aural perception of the language (i.e., phonology); however, differences between the two groups are less clear at the morphosyntactic level (see Montrul (2010) for an overview).

SHL Placement Exams: Goals and Realities Despite the growth in SHL programs, only half of those responding to Beaudrie’s survey reported using a placement exam for student placement (forthcoming 2). As González-Pino and Pino (2000) note, most SHL programs in U.S. universities utilize questionnaires, self-placement, interviews, and/or locally designed paper and pencil examinations. As can be seen, there is little agreement about what constitutes a placement exam in the SHL context.

The inherent diversity of SHL students discussed previously adds a second complicating factor, for not only do SHL students encompass a wide range of linguistic competencies, but due to the diverse origins of the U.S. Hispanic population, they also have a wide range of cultural backgrounds, from those who can trace their Spanish heritage back to the sixteenth century to recently arrived immigrants from numerous Latin American nations. It is safe to say that all the Spanish-speaking nations of the world are represented in today’s U.S. Spanish-speaking population, although a majority (66%) continues to be of Mexican origin (U.S. Census Bureau, 2010). This complicates the issue of diversity by adding not only the obvious cultural element but also a linguistic element. Real Academia Española (n.d.) recognizes 22 separate academies of language, and thus 22 “standard” dialects of Spanish. Add to that the fact that some individuals have not been formally schooled in Spanish, and the linguistic heterogeneity of the population is multiplied yet again.

Ideally, all universities would be able to offer a range of courses to accommodate this heterogeneous group of students. However, as Beaudrie (in press) documents, “The vast majority of programs offer one (72, or 43%) or two (64, or 38%) courses, whereas programs with four or more courses are practically nonexistent.” The fact that a majority of universities (81%) offer only one or two SHL courses presents an additional set of placement issues. Who qualifies as an SHL student and what competence levels are necessary to qualify for the SHL course(s) clearly depend on the nature of the courses offered. Even more problematic is the reality that many programs call upon students to self-identify as heritage learners, yet the linguistic insecurity prevalent in this group of learners frequently leads them to underestimate their abilities (see Beaudrie & Ducar, 2005; Beaudrie, 2009). A well-designed placement exam must address all of these issues. Additionally, an SHL placement exam that is practical in one context may be entirely impractical in another. As a case in point, the University of Arizona’s current exam was devised for an institution offering six levels of SHL courses, but Beaudrie (forthcoming 2) found that only 2.4% of responding programs offer such a range of courses. In summary, a useful placement exam must first clearly define who is considered an SHL student at the particular institution by taking into consideration the needs of the student population and the program’s ability to accommodate those needs in its course sequence. Thus, the definition followed in the current paper is specific both to the University of Arizona’s SHL program and to its SHL student population.


In spite of these difficulties, creating a user-friendly, easy-to-implement SHL placement exam is not as daunting as it might seem at first glance. Importantly, a plethora of literature on language testing is available to guide the design of such an exam, and a limited amount of research on existing SHL exams can guide the design and implementation phases (Fairclough, 2006; Fairclough & Ramírez Vera, 2009; Fairclough et al., 2010; Otheguy & Toro, 2000; Potowski & Parada, 2010; Teschner, 1990). Whereas these previous studies have focused on data analysis and on how these individuals and institutions developed their SHL placement exams, the present article goes further by providing a step-by-step process that practitioners can follow to design and implement their own exams (see also Fairclough et al., 2010) while simultaneously offering suggestions for how to retain SHL students after initial testing occurs. Our hope is that by sharing the process we went through, we can make the development of a useful exam less daunting, thereby increasing the number of institutions using such exams and reducing the number of SHL students who are misidentified and all too often misplaced.

An additional concern for many institutions is that comparatively few Hispanic1 students choose to enroll in SHL courses. Fairclough et al. (2010) report that although 20% of the University of Houston’s 2008 student population was Hispanic, less than 5% of those students even took the SHL placement exam. The situation is similar at the University of Arizona though, notably, overall enrollment in the SHL program has increased in recent years, growing 65%, from 550 students in the 2006-07 academic year to 905 students in the 2010-11 academic year. In spite of this growth, not all first year Hispanic students take the SHL placement exam, for a host of reasons.2 In fact, though 19% (1,306) of entering freshmen in the fall of 2009 were Hispanic, only 569 (44%) of them took the SHL placement exam. And not everyone who takes the test goes on to enroll in SHL classes, due to a variety of factors: scheduling conflicts, poor advisement, improper placement, the perception that Spanish as a second language (L2)3 classes are easier than other language classes, or simply a lack of knowledge of what SHL courses offer (see Potowski, 2002). Though the placement exam discussed here cannot rectify this situation, we do offer strategies that have successfully increased student recruitment at the University of Arizona in the hope that they may be effective at other institutions as well.

Previous Research on SHL Placement Exams Teschner (1990) documents an initial attempt to develop an SHL placement exam. All students intending to study Spanish at a large public university took this 140-item test. Students who gave three or more correct responses on ten native-speaker identifier items were placed in the native speaker track.4 Importantly, these identifiers were colloquial expressions that were “…characteristic of Mexican-origin Spanish and that seldom if ever make their way into Spanish-as-foreign or -second-language textbooks” (p. 817). As mentioned earlier, SHL students often have an edge when it comes to BICS and vocabulary; thus, it is not surprising that both Teschner and others have found that lexical items serve as robust identifiers of SHL students.

At the University of Houston, Fairclough (2006) analyzed data from two years' worth of SHL placement exams. The exam had two focal areas: a cloze test of verbal morphology and an essay section. Though Fairclough recognizes that an inherent flaw in the verbal morphology section resulted from the conflation of spelling and subject-verb agreement errors, the detailed analysis she provides is helpful in sketching a picture of the competencies of SHL students at this particular university.5 Additionally, her careful description of specific items and student difficulties at each level provides a useful guide for anyone attempting to devise a test item bank. Though it is preferable to develop a test using data from the population at the educational institution where the test will be used (as both we and Fairclough did), when such data are not readily available, Fairclough's article presents a practical alternative. Particularly, Fairclough’s delineation of the hierarchy of difficulty in the Spanish verbal system for SHL students may prove useful in the initial stages of test item development (though one cannot be certain that the students that Fairclough tested are representative of all SHL students).

Fairclough and Ramírez Vera (2009) presented data on a placement test based on a lexical decision-making task, created on the assumption that a large passive vocabulary may indicate substantial contact with Spanish and, therefore, a greater knowledge of the language in general. The test contained a list of 200 words: 120 words randomly selected from Davies' (2006) corpus of the 5,000 most frequent words in Spanish, plus 80 pseudowords added as distracters. Students were asked to read the list quickly and check off the words they were familiar with. Interestingly, students' accuracy on the lexical decision task correlated with their ability on a morphology-based cloze test. Although the test effectively differentiated L2 students from SHL students, it did not reliably distinguish among levels of SHL students for placement purposes. The authors propose that including lower frequency words (i.e., less common words) would better differentiate among SHL students.

Fairclough et al. (2010) document the steps the University of Houston SHL program undertook to develop a computer-based SHL placement exam. The test questions were selected based on extensive analysis of the previously used paper-and-pencil exam; additionally, SHL textbooks and current research on U.S. Spanish were consulted in an attempt to make the items curriculum-neutral. Students completed a five-question demographic survey that served to distinguish SHL from L2 students and thus send each group into the appropriate testing scenario. The actual SHL test contains four sections: (a) a lexical recognition task (see Fairclough & Ramírez Vera, 2009); (b) a section including translation, dictation, grammar, and verb conjugation tasks; (c) an oral exam, and (d) a section devoted to reading and writing activities, each intended to test student abilities at specific levels offered within the University of Houston’s SHL program. As such, the test branches so that only students with the required knowledge move on to the next section. Students who score well may receive course credits. Unlike the other exams mentioned in this section, this one clearly differentiates between the purposes of separating SHL students from L2 students while also differentiating levels of SHL students based on the specific outcomes desired at each level within the University of Houston’s program. This exam represents the most comprehensive branching, computer-adaptive placement and credit-by-exam option in the SHL field. Until a more complete theory of SHL assessment is available, however, locally designed exams remain, in the authors' opinion, the best option to meet both student and institutional needs.

Preliminary Considerations for Test Design


As Bachman and Palmer (1996) remind us, before creating any exam, one must first define its purpose, identify the linguistic abilities to be tested, and consider logistical issues of length and time. One must then develop a bank of test items based on the aforementioned premises and test them on a representative group, one comparable to the students for whom the exam is intended and sizable enough for statistical analysis to guide test modifications (Davidson, 2000; Davidson & Lynch, 2002; McNamara, 1996; Meunier, 1994; Shohamy, 2001). Importantly, “one person alone should not undertake the development of a placement test. Rather, this type of project should be the result of a collaborative effort between several individuals, including test developers, language instructors, and assistants” (Zabaleta, 2007, p. 689). A test is always a work in progress; given the ever-changing nature of the student population in general and the SHL student population in particular, it should come as no surprise that a final essential element to designing a valid and reliable test is continuous review and modification. The need for ongoing revision renders computerized tests particularly useful because of their adaptability. Computerized language placement exams offer one option for a “fast, reliable, and easy to administer Spanish placement test” (Zabaleta, 2007, p. 675). Among the advantages of computerized exams are (a) immediate feedback (for both the student and the examiner), and (b) their flexibility to adjust to an individual test-taker's knowledge (Zabaleta, 2007).

The University of Arizona SHL Program and Spanish Language Placement The University of Arizona’s Spanish and Portuguese Department offers two tracks in its Spanish language program from first-year (100 level) to third-year (300 level) courses: the L2 track, with eight courses, and the SHL track, with six courses (see Table 1 for the course sequence in the two tracks, and Appendix for descriptions of the SHL courses). The two tracks meet at Spanish 350. Although most SHL courses have an academic equivalent course in the L2 track (e.g., Spanish 251 for L2 students is considered the same as Spanish 253 for SHL students), the content and pedagogies of the courses in the two tracks differ significantly, because they are specifically tailored to the learning needs of the distinct student populations.6 In the fall of 2011 the SHL track enrolled 580 students and the L2 track 4,358 students.

Table 1
Sequence of L2 and SHL Courses up to Spanish 350

Basic language courses (most majors require either 2nd- or 4th-semester proficiency):
  Spanish 101 (L2) / Spanish 103 (HL) – 1st-semester proficiency upon completion
  Spanish 102 (L2) – 2nd-semester proficiency
  Spanish 201 (L2) / Spanish 203 (HL) – 3rd-semester proficiency
  Spanish 202 (L2) – 4th-semester proficiency
Intermediate and advanced language courses (for minor and major requirements):
  Spanish 251 (L2) / Spanish 253 (HL) – 5th-semester proficiency
  Spanish 325 (L2) / Spanish 323 (HL) – 6th-semester proficiency (minor or major requirement)
  Spanish 330 (L2) / Spanish 333 (HL) – 7th-semester proficiency (minor or major requirement)
Content courses (for minor and major requirements):
  Spanish 340 (L2) / Spanish 343 (HL) – Phonetics and phonology (major requirement)
  Spanish 350 (both tracks) – Introduction to literary genres (minor or major requirement)

The Spanish-language exam is administered every summer during a two-day orientation for incoming freshmen and transfer students. The exam is required of all students planning to register for Spanish courses unless they have earned prior course credit or can certify that they have passed an accredited language exam. Taking this test does not entitle students to earn course credit. Its sole function is to place students in the appropriate Spanish-language course and, possibly, to waive university language requirements if students score well enough to meet second- or fourth-semester language proficiency requirements. The time allotted for the Spanish-language testing session is 45 minutes, and the whole process needs to be completed in that time frame so that students can attend their next orientation session.

Test Purpose and Rationale The creation of this computerized Spanish language placement exam was stimulated by the Spanish and Portuguese Department’s need to improve its ability to differentiate among the levels of SHL students recognized by the University of Arizona. An additional concern specific to this program was to identify receptive bilingual students and accurately distinguish them from beginning and intermediate L2 students of Spanish (see Beaudrie, 2009; Beaudrie & Ducar, 2005; Fairclough et al., 2010). Finally, university administrators required that the test-taking experience of SHL students and their L2 peers be similar, largely because of concerns with the way the previous placement exam had been conducted. Originally, all students in each freshman orientation cohort met in a classroom to take a paper-and-pencil demographic survey. The classroom proctor then asked those who responded "yes" to three or more questions to leave with a separate proctor. While all the L2 students took the placement test in one of two computer labs, the SHL students were directed to a separate classroom to take a paper-and-pencil exam (locally developed at the university). The two exams were clearly different, not only in terms of format but also time, because the written SHL exam often took longer to complete than the S-CAPE (the Spanish Computerized Adaptive Placement Exam used for L2 placement). This situation was potentially demoralizing and anxiety-producing for students who had to raise their hands in front of the group, then be led out to a separate testing area to take a different test for reasons they did not understand. In light of these issues, the new computerized SHL placement exam was designed to look similar to the S-CAPE on the screen (i.e., in multiple-choice format) and also to be completed in approximately the same amount of time (an average of 20 minutes).

Test Design and Implementation The new computerized SHL placement test consists of two parts: a survey taken by all test-takers, and the SHL computerized language test, taken only by those who qualify based on their survey responses.7 The main purpose of the survey is to identify SHL learners from the cohort of L2 and SHL students who take the Spanish-language placement exam. The computerized language placement test, in contrast, is intended to place students into one of the first three levels of the SHL program (Spanish 103, 203, or 253; see Appendix for course descriptions).

Survey The survey used with the previous paper-and-pencil SHL exam was modified to produce 14 yes/no statements. Ten items targeted students’ childhood and current contact with Spanish through family, community, and/or friends outside of the classroom context. It should be pointed out that the questionnaire did not ask students about their ethnic background since this is not an essential characteristic for students to be eligible for the SHL program. These statements were carefully crafted to eliminate any items that would be applicable to the type of contact with Spanish that an L2 learner might have through the media, Internet, or through limited contact with Spanish-speaking neighbors and friends. The survey included four distracters asking students about their contact with English. These were intended both to disguise the purpose of the survey and to avoid demoralizing the L2 students who would respond negatively to the Spanish-related statements. The items were distributed as follows in random order:

a. Four items related to listening contact with Spanish (e.g., As a child, I heard Spanish frequently in the home)
b. Five items related to speaking contact with Spanish (e.g., As a child, I spoke Spanish frequently outside the home)
c. One item related to living in a Spanish-speaking country (e.g., As a child, I lived in a Spanish-speaking country for 2 years or longer)
d. One item related to receptive language proficiency in Spanish (e.g., I can understand the main idea when somebody talks to me in Spanish)
e. Four items about contact with English (e.g., As a child, I watched at least some TV in English)

Through empirical experience with the survey, it was determined that students who responded affirmatively to three or more of the 10 Spanish-related statements had sufficient contact with Spanish to take the SHL placement exam, although they might eventually be placed in the L2 track, depending on their performance, and that the other students should take the S-CAPE to determine L2 placement.8 Computerizing the survey provided an easy and unobtrusive way of identifying SHL learners. All students began their test-taking experience by taking the survey and were then launched directly into either the SHL or the S-CAPE. This process created a much more symmetrical experience for both groups of test-takers, eliminating the negative affective and logistical issues while continuing to identify SHL learners efficiently.
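The routing rule just described is simple enough to sketch in code. The snippet below is an illustration only: the item labels, the dictionary response format, and the function name are our inventions, not the actual survey implementation; only the "three or more of the ten Spanish-related items" threshold comes from the text.

```python
# Sketch of the survey routing rule: three or more "yes" answers to the
# ten Spanish-contact items send a student to the SHL exam; everyone
# else is launched into the S-CAPE. Item names are invented placeholders.

SPANISH_ITEMS = [f"spanish_{i}" for i in range(1, 11)]  # 10 Spanish-contact items
ENGLISH_ITEMS = [f"english_{i}" for i in range(1, 5)]   # 4 English distracters

def route_student(responses):
    """Return the exam a student is routed to, given yes/no survey answers."""
    yes_count = sum(1 for item in SPANISH_ITEMS if responses.get(item, False))
    return "SHL exam" if yes_count >= 3 else "S-CAPE"

# Example: a student answering "yes" to four Spanish-contact items
responses = {item: False for item in SPANISH_ITEMS + ENGLISH_ITEMS}
for item in SPANISH_ITEMS[:4]:
    responses[item] = True
print(route_student(responses))  # SHL exam
```

Note that the English distracter items play no role in routing; they exist only to balance the survey experience for L2 students.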

In order to see whether the demographic survey alone could serve as a placement tool, the students’ survey responses were analyzed to determine if they predicted student placement in the first three levels of SHL courses. Thompson (2005) analyzed the paper-and-pencil survey on which the computerized survey was based, and argued that the number of affirmative responses could predict placement in two of the courses (Spanish 103 and 253) with a fairly high degree of accuracy; however, placement into Spanish 203 was more problematic. In the 2010 analysis of survey responses, the number of "yes" responses clearly correlated with course level: students who placed into Spanish 103 had a mean of 4.7 (SD: 1.6) affirmative responses, those in Spanish 203 had a mean of 5.9 (SD: 2.0), and those in Spanish 253 had a mean of 8.3 (SD: 1.4).

Table 2
Survey Results per Number of Responses and Course Placement

No. of "yes"   Directed to   Placed in     Placed in     Placed in     Total
responses      L2 exam       Spanish 103   Spanish 203   Spanish 253   test-takers
3              90 (59%)      40 (26%)      20 (13%)      3 (2%)        153
4              25 (31%)      23 (29%)      29 (36%)      3 (4%)        80
5              0             13 (28%)      23 (49%)      11 (23%)      47
6              1 (1.5%)      24 (35%)      20 (29%)      23 (34%)      68
7              2 (2%)        9 (10%)       23 (26%)      55 (62%)      89
8              0             5 (6%)        22 (24%)      64 (70%)      91
9              1 (.7%)       1 (.7%)       18 (12%)      131 (87%)     151
10             0             0             3 (4%)        71 (96%)      74
Total          119           115           158           361           753

Table 2 presents the number of students placed in the different courses as a function of the number of their affirmative responses regarding contact with Spanish. Clearly, students with 3 or 4 "yes” responses are not likely to place into the highest level course (Spanish 253), whereas students with 7 or more positive responses are very likely to be placed into that course. Students with 5 or 6 responses are, however, equally likely to be placed into any of the three courses. Despite the general correlation, therefore, a separate placement exam was deemed necessary to direct students into the class that would best suit their needs.
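The pattern in Table 2 can be checked directly from the raw counts. The snippet below transcribes the table's rows and recomputes each row's total and the share of test-takers placed into Spanish 253; it is a verification aid, not part of the placement system itself.

```python
# Rows of Table 2 as (directed to L2 exam, placed in Spanish 103, 203, 253),
# keyed by number of "yes" responses; percentages are recomputed from counts.
table2 = {
    3: (90, 40, 20, 3),
    4: (25, 23, 29, 3),
    5: (0, 13, 23, 11),
    6: (1, 24, 20, 23),
    7: (2, 9, 23, 55),
    8: (0, 5, 22, 64),
    9: (1, 1, 18, 131),
    10: (0, 0, 3, 71),
}

for yes_count, row in table2.items():
    total = sum(row)
    pct_253 = round(100 * row[3] / total)
    print(f"{yes_count} yes responses: {total} test-takers, {pct_253}% in Spanish 253")
```

Running this reproduces the table's figures, e.g., 2% of students with 3 "yes" responses but 96% of those with 10 "yes" responses placing into Spanish 253.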

The computerized SHL language placement exam The test development process began with a close examination of each course’s goals and prerequisite language requirements, in order to establish high content validity for the test. Table 3 displays goals and requirements for each course, based on the course syllabus.

Table 3
Goals and Requirements for SHL Courses

Spanish 103: To develop students’ oral competence in Spanish. It reviews all basic grammatical structures needed for communication, including present, preterit, and imperfect tenses. Students are expected to understand an informal conversation in Spanish, but no specific grammatical knowledge is required.

Spanish 203: To expand students’ written and oral abilities. It continues review of intermediate grammar, such as the preterit, imperfect, and subjunctive, and introduces basic spelling conventions of Spanish. Students are expected to be able to carry on an informal conversation in Spanish but have gaps in their grammar and spelling.

Spanish 253: To introduce students to writing and speaking in Spanish for academic purposes. At this level, students should have mastered the basics of grammar and spelling.

With the general course goals and requirements in mind, the second step was to craft the item pool. This process began in the spring of 2004 with a careful review of the written samples students produced for the pencil-and-paper SHL exam, to discover students’ strengths and weaknesses in their language production at each level. Key to the success of this process was the development of a list of discrete points that students at one level had not mastered but students at the next level had. Spanish 103 targets receptive bilinguals, so items for that level aimed only at recognition of familiar idiomatic expressions common in Spanish conversations. For Spanish 203 versus 253, distinctions revolved around errors in orthography, morphology, grammar, and vocabulary/idioms.9 For example, whereas students placed in Spanish 203 had difficulty spelling irregular preterit forms, students in 253 had mastered these forms. As a result, two items were designed to target the spelling of irregular preterits. Once contextualized multiple choice questions and their correct answers were created, the distracter choices were designed based on actual language errors of the previous year’s students. This ensured that the distracters were semantically plausible and realistic. The test had a total of 25 items, 8 items targeting Spanish 103, 8 targeting Spanish 203, and 9 targeting Spanish 253 (see Appendix for sample items).

With the items created, the next step was the computerization of the test, which was carried out by the College of Humanities instructional computing staff. For test designers lacking access to technology experts, a number of test-authoring programs are available (e.g., SunRav Test Maker (2009), Test Pilot I (n.d.), Exam Manager (n.d.), EasyTestMaker (n.d.), FastTEST Web (n.d.), QuestionMark Perception (n.d.)) that can help to translate the test into a computerized or online format, depending on the institution's needs.

Once the preliminary test was ready, the next step was piloting (trial run) and norming (establishment of cutoff points for each level). This process was carried out with students enrolled in one pair of equivalent L2 and SHL courses for each level: 103, 203, 253 (HL track; N = 56), and 102, 201, 251 (L2 track; N = 48). These two groups were necessary to ensure that the test was able to discriminate the skills of SHL learners from those of L2 learners.10 Based on an analysis of the descriptive statistics for each question, a total of four items were eliminated from the pool of 25 due to their failure to discriminate between the two groups of learners.
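The elimination of non-discriminating items can be sketched as a simple screening pass over the pilot responses of the two groups. The response data and the `min_gap` threshold below are invented for illustration; the authors report relying on descriptive statistics rather than a fixed cutoff.

```python
def proportion_correct(responses):
    """Proportion of 1s (correct answers) in a list of 0/1 responses."""
    return sum(responses) / len(responses)

def discriminating_items(hl_responses, l2_responses, min_gap=0.15):
    """Keep item indices where SHL learners outperform L2 learners by at
    least `min_gap` (an illustrative threshold, not the authors' criterion).

    Each argument maps item index -> list of 0/1 responses for that group.
    """
    keep = []
    for item in hl_responses:
        gap = (proportion_correct(hl_responses[item])
               - proportion_correct(l2_responses[item]))
        if gap >= min_gap:
            keep.append(item)
    return keep
```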

The means for each group were then submitted to an unpaired-samples t-test, which showed significant differences between the two groups [t(14) = 5.68, p < 0.001] and further validated that the test accurately discriminated between the two populations of students. The next step was to determine the cutoffs for each level, with the number of questions targeting each level as the differentiating factor. Means and standard deviations for each level were also analyzed, and both were taken into account in determining the upper and lower limits (see Table 4). Because students took the pilot test at the end of the semester, the scores for Spanish 103, for example, were considered relevant for the next level (Spanish 203). To be placed into the first level, Spanish 103, students needed to obtain a score of at least 4. Any student who obtained fewer than 4 correct responses was redirected to the S-CAPE exam for L2 learners.11 With the test piloting and norming complete, the test was ready to launch in the summer of 2005.
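The unpaired-samples t-test reported above can be reproduced with a generic pooled-variance implementation (a standard textbook formula, not the authors' original statistical software):

```python
import math
from statistics import mean, variance

def independent_t(sample_a, sample_b):
    """Pooled-variance independent-samples t statistic and degrees of freedom.

    `variance` uses the n-1 (sample) denominator, which is what the pooled
    estimate requires.
    """
    n1, n2 = len(sample_a), len(sample_b)
    pooled = ((n1 - 1) * variance(sample_a)
              + (n2 - 1) * variance(sample_b)) / (n1 + n2 - 2)
    se = math.sqrt(pooled * (1 / n1 + 1 / n2))
    t = (mean(sample_a) - mean(sample_b)) / se
    return t, n1 + n2 - 2
```

A large absolute t value, with a correspondingly small p-value from a t distribution with the returned degrees of freedom, indicates that the two group means differ reliably, as in the [t(14) = 5.68, p < 0.001] result reported above.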

Table 4

Means and Standard Deviations

Mean and SD                                        Cutoff scores            Placement level
No data                                            4–9 questions correct    Spanish 103
Mean for Spanish 203 = 13; SD = 5 (some outliers)  10–15 questions correct  Spanish 203
Mean for Spanish 253 = 19; SD = 3.2                16+ questions correct    Spanish 253
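The cutoffs in Table 4, together with the S-CAPE redirection rule, amount to a simple score-to-level mapping, sketched here (the function name is our own):

```python
def place_student(correct_answers):
    """Map a raw score on the SHL placement test to a course level.

    Cutoffs follow Table 4: fewer than 4 correct answers redirects the
    student to the S-CAPE exam for L2 learners.
    """
    if correct_answers < 4:
        return "S-CAPE (L2 exam)"
    if correct_answers <= 9:
        return "Spanish 103"
    if correct_answers <= 15:
        return "Spanish 203"
    return "Spanish 253"
```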

Test administration and analysis The computerized SHL placement exam was administered for the first time in the summer of 2005 and was modified in 2006, 2007, and 2008; since then it has remained essentially unchanged, apart from minor adjustments. The number of test-takers for each year is presented in Table 5.

Table 5

Number of Students Taking the SHL Placement Test and Assigned Course Levels, 2004–2010

Year   Students qualified   Students placed   Students placed   Students placed   Students placed
       to take SHL exam     in Spanish 103    in Spanish 203    in Spanish 253    in L2 courses¹
2004   448                  163 (27%)         132 (29%)         153 (34%)         NA
2005   351                  55 (16%)          106 (30%)         190 (54%)         NA
2006   564                  132 (23%)         130 (23%)         216 (38%)         86 (15%)
2007   530                  110 (21%)         90 (20%)          198 (37%)         132 (25%)
2008   508                  132 (26%)         86 (17%)          203 (40%)         87 (17%)
2009   569                  99 (17%)          125 (22%)         262 (46%)         83 (15%)
2010   753                  115 (15%)         158 (21%)         361 (48%)         119 (16%)


Note. Students who do not obtain a minimum of 4 on the test are redirected to the L2 S-CAPE exam.

The next step was to calculate test reliability, or the degree of consistency of test results, in SPSS using Cronbach's alpha as the measure of internal consistency. This commonly used estimate of test reliability requires a complete dataset for each item. Results were as follows: 2005 test = .881, 2006 test = .934, 2007 test = .921, 2008 test = .914, 2009 test = .924, 2010 test = .921, 2011 test = .936. The high Cronbach’s alpha values (all above .8) indicate that the overall test is highly reliable.
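Cronbach's alpha can also be computed directly from a respondents-by-items score matrix, using the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). The generic implementation below is ours, not the authors' SPSS procedure:

```python
from statistics import pvariance

def cronbach_alpha(responses):
    """Cronbach's alpha for a respondents x items matrix of 0/1 scores.

    Uses population variances: item variances are summed and compared
    against the variance of respondents' total scores.
    """
    k = len(responses[0])  # number of items
    item_vars = [pvariance([row[i] for row in responses]) for i in range(k)]
    total_var = pvariance([sum(row) for row in responses])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)
```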

Based on the results of an item-total correlation and item-difficulty analysis of each test item, 12 new items were piloted in the summer of 2006;12 a total of 8 new items were implemented in the 2007 test: 2 replaced items discarded from the 2006 test and 6 were entirely new. The item-total correlation, or item discrimination, indicates the strength of the relationship between students’ responses to an item and their total scores. Ideally, a good item should have a positive correlation above .25 (Osterlind, 1998). In turn, item difficulty, or the p-value, was examined to eliminate questions that were too difficult or too easy, especially those aimed at Spanish 203 and 253. Item difficulty is the percentage of test-takers who answer a question correctly. These values (expressed as a proportion from 0 to 1, where 0 means no test-taker answered the item correctly and 1 means all did) should ideally fall between .40 and .80 (Osterlind, 1998). Q2, for example, has an item difficulty of .95 and is therefore too easy for test-takers. Q12 has a difficulty of .51 (2005) and .55 (2007), which indicates moderate difficulty, with about half of test-takers answering it correctly. The Appendix displays the results for the administration of the first test (2005) and the test with the added questions (2007).
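Both indices can be computed from the same response matrix. The sketch below uses a corrected item-total correlation (each item correlated with the total of the remaining items, so the item does not inflate its own correlation); the text does not specify whether the authors applied this correction, so treat it as one reasonable variant:

```python
from statistics import mean

def item_difficulty(responses, item):
    """Proportion of test-takers answering `item` correctly (0-1 scale)."""
    return mean(row[item] for row in responses)

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def corrected_item_total(responses, item):
    """Correlate an item with the total score of all *other* items."""
    scores = [row[item] for row in responses]
    rest = [sum(row) - row[item] for row in responses]
    return pearson(scores, rest)
```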

The item-total correlation for the 2005 exam revealed that items 2, 4, and 7 had low values of .23, .05, and .11, respectively, indicating that students’ responses to these questions did not correlate well with their overall test scores. These items were analyzed in terms of the quality of the questions themselves as well as that of their distracters. Item 4 was discarded, but items 2 and 7 were kept after the wording of a distracter was changed to improve their quality. All 27 items in the 2007 exam had item-total correlations that fell within expected levels: four items had values between .20 and .39, indicating good discrimination, and 23 items had values of .40 or more, indicating very good discrimination. The item-difficulty analysis revealed that 20 items had values within the ideal .20–.80 range. A total of 6 items had values above .80, meaning they were easy for students. These items, however, target receptive bilinguals (i.e., those students typically placed in Spanish 103); thus, they are expected to be easy for all SHL students placing into the higher levels, and their high values are acceptable.

Even though the preceding five steps completed the basic test design, implementation, and analysis, the test development process is in fact ongoing, and every year small changes are introduced, both to address students' needs and to improve the exam. The next sections provide recommendations that go beyond the placement exam itself.

SHL Language Placement after the Test


Placement in SHL courses at the University of Arizona is a fluid process that allows for the input of students, instructors, and the SHL program director. All parties collaborate to ensure that students are placed in both the right track and the most appropriate course. To this effect, the placement process does not end with the administration of the computerized placement exam. When classes begin, instructors in the L2 track, who have previously attended a 30-minute workshop on identifying SHL learners, try to identify any SHL learners in their classes using the SHL placement survey and follow-up interviews. All students identified are strongly encouraged to switch to the SHL track, particularly when the L2 and parallel SHL courses are offered in the same time slot. This practice is extremely useful since not all potential SHL learners take the placement exam and many mistakenly enroll in L2 courses.

SHL instructors also have an important role in placement. During the first week of classes, instructors in all courses give students a three-page background survey (see Webb & Miller, 2000, pp. 47-54, for an example), a written diagnostic test, and an oral interview during office hours. Students in Spanish 103 take a listening test to ensure that they have a minimum level of receptive ability. Although the main purpose of all these assessments is to learn about students’ language background, strengths, and instructional needs, they also serve as validity evidence of the placement test’s effectiveness.

During the first two years of the placement exam, approximately 20% of students were misplaced, according to an informal survey of instructors. After the 2006 modifications to the exam, only approximately 8% of students per course, on average, needed to change courses. Overall, instructors are satisfied with student placement.

Another group that is essential to the success of the SHL placement process is academic advisors and administrators. Advisors are an important recruitment resource and can also help students by explaining placement results to them. Advisors can likewise identify misplaced or not-yet-placed students and refer them for SHL advising and placement. Every year a number of students do not take the placement exam for various reasons, so the responsibility for guiding students into the right course often falls on advisors. Unfortunately, ill-informed advisors sometimes direct students to the wrong class or track. Because advisors come from numerous disciplines, they may not be knowledgeable about language placement issues, so it is very important to educate them through workshops, informational brochures, frequent e-mails, and other forms of communication. Lastly, advisor turnover can be problematic, requiring constant training.

One of the main challenges the SHL Program still faces is that students who place into the lowest level course, Spanish 103, are not convinced that they know enough Spanish to enroll in the SHL track, so many register for Spanish 102, the L2 equivalent. Ideally, an advisor could meet with these students and explain the differences between the two tracks, to reassure them that they belong in the SHL course (see Beaudrie, 2009). It is our experience that once students begin the class, they feel that the level and content fit their needs perfectly. The challenge is to convince those students who do not enroll, and thus never attend this class, that it is the best option for them. To address this problem, students receive a handout that explains their placement, followed by e-mail communications further explaining the SHL course and encouraging them to enroll.

The second remaining challenge is that the highest level students can place into is the third-level course. Some students, especially those who have completed secondary education in Latin American countries, would likely benefit from taking one of the two more advanced classes. Although such students are identified during the diagnostic period, most of the sections of the highest level courses are completely full and enrolling new students is not always possible. To address this problem, all L2 and SHL syllabi describe the expected student population for a course and who should not enroll in it. This empowers students themselves to recognize if they have been misplaced and lets them know who they should talk to in order to enroll in a more appropriate course.

Recommendations Our main recommendation for SHL language placement is to involve all interested parties in the advising and placement process. Collaboration will ensure that students understand the benefits of an SHL track and can serve to clarify misconceptions about who is and who is not considered an SHL student at a particular institution. For example, some students lack confidence and decide not to enroll in the SHL track, erroneously believing that those courses will be too difficult for them; other, overconfident students believe that an L2 course will yield an easy “A” and so avoid the SHL track (see Potowski, 2002). Due to limited resources, many institutions may not be able to serve all students who could potentially be considered SHL learners according to research-based definitions. This discrepancy needs to be clarified for all those involved, as issues of identity are also at stake in language placement (see Potowski, forthcoming). It is crucial for staff to collaborate to increase awareness of the two tracks and their differences and to guide students into the most appropriate course.

An essential component of the SHL placement process is student recruitment and SHL program promotion. A strong and well-thought-out recruitment program that operates at multiple levels for a variety of audiences needs to both precede and follow placement. At the student level, a personalized approach that seeks direct opportunities for contact with students is an important step. Strategies such as contacting students to congratulate them on their placement, and holding information sessions on the reasons to begin Spanish studies at the university and on what to do with a major or minor in Spanish, ensure that students become aware of and feel welcomed into the SHL program. Once students are in the program, activities are needed to ensure and document student satisfaction and to create a sense of community within the program. All courses require students to complete mid-semester and end-of-semester satisfaction surveys, and students complete an exit survey during the last course in the SHL program. The survey results inform the course modifications introduced with students’, instructors’, and the director’s feedback. At the departmental level, increasing the visibility of the program through events, brochures, and bulletin boards helps to create a space for the SHL program. At the university level, recruitment can be pursued at key university events, in freshman courses that typically enroll high numbers of potential SHL learners, and through the university newspaper or newsletter (see Fountain, 2001). SHL program events, such as beginning-of-the-semester parties or cultural and social gatherings, attract many potential students. Other units, such as the Admissions and Student Affairs offices, can be of great help in recruitment efforts, as can any student organization with a large number of Hispanic students. Reaching out to the community and to high schools can also establish links between the university SHL program and future students.

Finally, language placement is a continuous process that needs constant revision, analysis, and adjustments (Zabaleta, 2007), although future revisions take considerably less work than the initial test design and implementation. It is imperative that exam revisions continue as new student populations and new course requirements are put into place.

Conclusion This paper outlines the design and implementation process of a computerized SHL placement exam. This outcome demonstrates that it is possible to design a simple yet effective placement exam with limited resources. It is feasible for all institutions to create their own test by following a five-step design and implementation protocol: (a) obtain samples from students in current SHL courses at the beginning of the semester; (b) create a computerized placement exam drawing on key discriminatory features identified in students’ writing; (c) pilot and norm the test; (d) implement the test; and (e) run statistics to check the test’s quality.

We argue that an SHL placement exam should be developed in-house, due not only to the diversity of student populations but also to the unique structure and content of each SHL program. Research in the field of Spanish sociolinguistics in the United States consistently arrives at the same conclusion: there is no single “U.S. Spanish.” The exam discussed here is not intended to be suitable for all U.S. colleges and universities; quite the contrary. What it does offer, however, is a framework for implementation that is versatile enough to be useful across diverse populations. Our considerations in creating the placement exam for the University of Arizona should be kept in mind wherever a similar exam is being created: “measurements need to predict and describe language skills in fair and valid ways that are helpful for tracking, placement, and for the completion of requirements of those institutions” (Campbell & Christian, 2003, p. 5). Potowski and Carreira (2004) remind us that the traditional Spanish L2 hierarchies long used to establish placement cutoffs for SFL students “break down precisely in the SNS classroom” (p. 2). Thus, we too call for the continued formalization of the design and implementation of placement exams.

We have also argued that universities need to move beyond the test and engage in outreach to promote the SHL program. Although many strategies contribute to student recruitment, having a valid and reliable SHL placement test has been the most successful recruitment tool at the University of Arizona. The test has allowed potential students to be properly placed into an SHL course and has provided the tools to promote the SHL program to them. It is only by sharing our own experiences that we can learn from one another and begin to answer Fairclough’s (forthcoming) call to create a more solid definition of what assessment means in the SHL context.


Acknowledgement Our sincere thanks to the members of the COH Instructional Computing team, especially Hale Thomas and Justin Lebreck for their help with the technical aspects of designing the computerized version of the test and the data collection process.

References

Bachman, L., & Palmer, A. (1996). Language testing in practice. Oxford: Oxford University Press.

Beaudrie, S. (2009). Spanish receptive bilinguals: Understanding the cultural and linguistic profile of learners from three different generations. Spanish in Context, 6(1), 85-104.

Beaudrie, S. (2011). Spanish heritage language programs: A snapshot of current programs in the southwestern United States. Foreign Language Annals, 44(2), 321-337.

Beaudrie, S. (in press). Research on university-based Spanish heritage language programs in the United States: The current state of affairs. In S. Beaudrie & M. Fairclough (Eds.), Spanish as a heritage language in the United States: The state of the science. Washington, DC: Georgetown University Press.

Beaudrie, S., & Ducar, C. (2005). Beginning level university heritage programs: Creating a space for all heritage language learners. Heritage Language Journal, 3(1). Available from http://www.heritagelanguages.org

Beaudrie, S., & Fairclough, M. (Eds.). (in press). Spanish as a heritage language in the United States: The state of the science. Washington, DC: Georgetown University Press.

Campbell, R., & Christian, D. (2003). Directions in research: Intergenerational transmission of heritage languages. Heritage Language Journal, 1(1). Available from http://www.heritagelanguages.org

Davidson, F. (2000). The language tester’s statistical toolbox. System, 28(4), 1-13.

Davidson, F., & Lynch, B. (2002). Testcraft: A teacher’s guide to writing and using language tests. New Haven, CT: Yale University Press.

Davies, M. (2006). A frequency dictionary of Spanish: Core vocabulary for learners. London: Routledge.

Duisberg, S. (2001). High school heritage learners of Spanish: An investigation of language attitudes (Unpublished doctoral dissertation). University of Arizona, Tucson, AZ.

Fairclough, M. (2006). Language placement exams for heritage speakers of Spanish: Learning from students’ mistakes. Foreign Language Annals, 39(4), 595-604.
Fairclough, M., Belpoliti, F., & Bermejo, E. (2010). Developing an electronic placement examination for heritage learners of Spanish: Challenges and payoffs. Hispania, 93(2), 270-289.

Fairclough, M. (in press). Language assessment: Key theoretical considerations in the academic placement of Spanish heritage language learners. In S. Beaudrie & M. Fairclough (Eds.), Spanish as a heritage language in the United States: The state of the science. Washington, DC: Georgetown University Press.


Fairclough, M., & Ramírez Vera, C. (2009). La prueba de decisión léxica como herramienta para ubicar al estudiante de español en los programas universitarios [The lexical decision test as a tool for placing students of Spanish in university programs]. Íkala, Revista de Lenguaje y Cultura, 14(21), 85-99.

Fountain, A. (2001). Developing a program for Spanish heritage learners in a small college setting. ADFL Bulletin, 32(2), 29-32.

González Pino, B., & Pino, F. (2000). Serving the heritage speaker across a five-year program. ADFL Bulletin, 32(1), 27-35.

Krashen, S. (2000). Bilingual education, the acquisition of English, and the retention and loss of Spanish. In A. Roca (Ed.), Research on Spanish in the United States: Linguistic issues and challenges (pp. 432-444). Somerville, MA: Cascadilla.

Lynch, A. (2003). The relationship between second and heritage language acquisition: Notes on research and theory building. Heritage Language Journal, 1(1). Available from http://www.heritagelanguages.org

Lynch, A. (2008). The linguistic similarities of Spanish heritage and second language learners. Foreign Language Annals, 41(2), 252-381.

McNamara, T. (1996). Measuring second language performance. London: Longman.

Meunier, L. (1994). Computer adaptive language tests (CALT) offer a great potential for functional testing: Yet, why don’t they? CALICO Journal, 11(4), 23-39.

Montrul, S. (2010). Current issues in heritage language acquisition. Annual Review of Applied Linguistics, 30, 3-23.

Osterlind, S. J. (1998). Constructing test items: Multiple-choice, constructed-response, performance, and other formats (2nd ed.). Boston: Kluwer Academic Publishers.

Otheguy, R., & Toro, J. (2000). Tests for Spanish-for-native-speaker classes. In American Association of Teachers of Spanish and Portuguese (Ed.), Professional development series handbook for teachers K-16: Vol. 1. Spanish for native speakers (pp. 91-98). Fort Worth, TX: Harcourt College.

Potowski, K. (2002). Experiences of Spanish heritage speakers in university foreign language courses and implications for teacher training. ADFL Bulletin, 33(3), 35-42.

Potowski, K., & Carreira, M. (2004). Teacher development and national standards for Spanish as a heritage language. Foreign Language Annals, 37(3), 427-438.

Potowski, K., & Parada, M. (2010, February). An online placement exam for Spanish heritage speakers and L2 students. Paper presented at the 1st International Conference on Heritage/Community Languages, UCLA, Los Angeles, CA.

QuestionMark. (n.d.). Perception [Computerized testing software]. Norwalk, CT, & London, UK: Author. Available from https://www.questionmark.com/us/Pages/default.aspx

Real Academia Española. (n.d.). Asociación de Academias de la Lengua Española [Association of Spanish Language Academies] [Website]. Madrid, Spain: Author. Retrieved from http://asale.org/ASALE/asale.html

Schwartz, A. (2001). Preparing teachers to work with heritage language learners. In J. K. Peyton, D. A. Ranard, & S. McGinnis (Eds.), Heritage languages in America: Preserving a national resource (pp. 229-252). McHenry, IL, & Washington, DC: Delta Systems/Center for Applied Linguistics.

Shohamy, E. (2001). The power of tests: A critical perspective on the uses of language tests. London: Pearson Education.


Tavernise, S. (2011, August 26). Young Hispanics’ college enrollment rose 24% in a year, study says. The New York Times, p. A12. Retrieved from http://www.nytimes.com/2011/08/26/us/26census.html

Teschner, R. (1990). Spanish speakers semi- and residually native: After the placement test is over. Hispania, 73, 816-822.

Thompson, G. (2005). Placement exams and the heritage language learner. Paper presented at the 21st Spanish in the US/6th Spanish in Contact Conference, Arlington, MD.

U.S. Census Bureau. (2010). Hispanic Heritage Month: Sept. 15 - Oct. 15, 2010 (Profile America Facts for Features). Washington, DC: Author. Retrieved from http://www.census.gov/newsroom/releases/archives/facts_forfeatures_special_editions/cb10ff17.html

Humes, K. R., Jones, N. A., & Ramirez, R. R. (2011). Overview of race and Hispanic origin (2010 Census Briefs). Washington, DC: U.S. Census Bureau. Retrieved from http://www.census.gov/prod/cen2010/briefs/c2010br-02.pdf

Van Deusen-Scholl, N. (2003). Toward a definition of heritage language: Sociopolitical and pedagogical considerations. Journal of Language, Identity and Education, 2(3), 211-230.

Webb, J., & Miller, B. (Eds.). (2000). Teaching heritage language learners: Voices from the classroom. New York: American Council on the Teaching of Foreign Languages.

Wiley, T. (2001). On defining heritage languages and their speakers. In Heritage languages in America: Blueprint for the future. Washington, DC: Center for Applied Linguistics/Delta Systems.

Zabaleta, F. (2007). Developing a multimedia, computer-based Spanish placement test. CALICO Journal, 24(3), 675-692.


Appendix A

SHL Courses

Spanish 103: For students who have receptive skills in Spanish, but who encounter difficulty in speaking. It focuses on developing fluent conversation and listening skills in a positive and culturally rich environment. It also covers basic grammar structures.

Spanish 203: Focuses mainly on written and oral development but reading and listening skills are also practiced in a dynamic cultural context. Grammar and spelling issues problematic to students are also covered.

Spanish 253: For students who understand and speak Spanish fluently. Focuses on differences between formal and informal uses of Spanish and develops both registers in the areas of writing, reading, speaking, listening, grammar, and vocabulary in a dynamic cultural context centered on Hispanics in the U.S.

Spanish 323: Focuses on expanding the learners’ oral and written academic proficiency and promotes their critical thinking skills in a dynamic cultural context centered on Latin America. Reviews advanced grammar and spelling issues to strengthen students’ writing.

Spanish 333: Develops oral and written Spanish for academic and professional contexts. Emphasizes different genres of writing, focusing on rhetorical strategies for persuasive and argumentative writing and culminates in a research paper.

Spanish 343: Introduces learners to Spanish phonology and phonetics. Students learn about the differences between spoken and written language as the basis for advancing their Spanish proficiency. It also exposes students to the different varieties of the Spanish-speaking world.


Appendix B

Sample Test Items

Q7: ¿Dónde pusiste las llaves del carro? 1. En casa 2. A mi mamá 3. Ford 4. De la puerta

Q11: El verano pasado ______un viaje a Rocky Point. 1. Hicimos 2. Hacemos 3. Hacimos 4. Hicemos

Q16: Ojalá que nuestro equipo ______el próximo partido de básquetbol. 1. Gane 2. Ganaría 3. Va a ganar 4. Gana

Q17: Le recomendé a Susana que ______a un gimnasio. 1. Fuera 2. Fue 3. Fueron 4. Va


Appendix C

Percentages of Correct Responses per Level, Item Difficulty, and Item-total Correlation for Each Test Item, 2005 (N = 351) and 2007 (N = 398) Exams

                 ——————————————— 2005 ———————————————   ——————————————— 2007 ———————————————
Items   Span.103  Span.203  Span.253  Item     Item-tot.   Span.103  Span.203  Span.253  Item     Item-tot.
        students  students  students  diffic.  correl.     students  students  students  diffic.  correl.
Q1      80%       83%       100%      .89      .40         70.0%     95.6%     99.5%     .90      .42
Q2      55%       93%       100%      .95      .23         Replaced by Q2B
Q2B     New item (2007)                                    78.2%     96.7%     99.0%     .93      .32
Q3      45%       76%       79%       .91      .44         23.6%     60.0%     90.4%     .65      .58
Q4      73%       91%       99%       .67      .05         Discarded
Q4B     New item (2007)                                    80.9%     97.8%     100%      .94      .31
Q5      29%       75%       92%       .92      .32         70.0%     94.4%     98.0%     .89      .38
Q6      77%       90%       92%       .77      .45         68.2%     92.2%     92.9%     .86      .27
Q7      68%       92%       99%       .89      .11         Replaced by Q7B
Q7B     New item (2007)                                    69.1%     97.8%     100%      .91      .42
Q8      27%       71%       98%       .92      .34         51.8%     83.3%     100%      .83      .50
Q9      14%       58%       100%      .79      .58         42.7%     82.2%     99.0%     .80      .55
Q10     25%       66%       92%       .74      .60         45.5%     76.7%     98.0%     .79      .51
Q11     27%       45%       88%       .74      .52         44.5%     53.3%     90.9%     .70      .41
Q12     4%        18%       83%       .66      .49         40.0%     75.6%     97.0%     .76      .52
Q13     36%       58%       96%       .51      .67         13.6%     30.0%     88.9%     .55      .66   (moved to 15th place in 2007)
Q14     14%       37%       86%       .75      .54         32.7%     53.3%     90.4%     .66      .49
Q15     16%       32%       89%       .60      .56         47.3%     75.6%     98.0%     .79      .47   (moved to 13th place in 2007)
Q16     3%        10%       48%       .60      .61         25.5%     40.0%     84.8%     .58      .52
Q17     3%        15%       50%       .68      .59         15.5%     34.4%     93.9%     .59      .69
Q18     1%        8%        45%       .54      .62         20.0%     75.6%     99.0%     .72      .71
Q19     7%        18%       51%       .77      .45         13.6%     36.7%     90.4%     .57      .66
Q20     2%        14%       45%       .69      .53         50.0%     65.6%     100%      .78      .50
Q21     0%        4%        47%       .51      .73         20.9%     44.4%     85.9%     .59      .55
Q22     New item (2007)                                    4.5%      25.6%     92.9%     .53      .77
Q23     New item (2007)                                    17.3%     64.4%     95.5%     .67      .69
Q24     New item (2007)                                    48.2%     73.3%     98.0%     .79      .48
Q25     New item (2007)                                    46.4%     81.1%     99.5%     .81      .54
Q26     New item (2007)                                    30.9%     90.0%     100%      .79      .66
Q27     New item (2007)                                    21.8%     60.0%     84.3%     .62      .51


Notes

1. Again, we note that not all SHL students are, by our definition, necessarily Hispanic, nor will all Hispanic students necessarily qualify as SHL students in a given program. Both of these considerations are important to bear in mind.

2. Though delineating these reasons is beyond the scope of this paper, suffice it to say that the reasons may be as diverse as the population itself, ranging from a lack of interest in studying Spanish to considering themselves already proficient, having different career goals, scheduling conflicts, or linguistic insecurities; the possibilities are endless and merit further research.

3. We use the phrase "second language learners of Spanish" rather than "foreign language learners of Spanish" because the United States is one of the largest Spanish-speaking nations in the world and Spanish, therefore, is not truly a foreign language.

4. We follow Teschner in using the term "native speaker" here, though such terminology has potential pitfalls (see Beaudrie, 2009).

5. Though recall that Montrul (2010) cautions against using morphosyntactic data to differentiate SHL students from L2 students; while there is no rationale specifically stated for the focus on verbal morphology, the author cites examples that commonly cause difficulties for SHL students.

6. Recall that SHL students exhibit both L2 and language in contact processes in their linguistic productions; thus, the SHL track addresses both of these issues in addition to SHL students’ affective needs (identity, attitudes, and motivations that are often distinct from their L2 peers).

7. The test was developed in 2004 by the authors of this article, who are active researchers and practitioners in the SHL field. Both had served as raters of the paper-and-pencil test for two summers, had taught courses in the SHL program, and were involved in the initial design of test items. In 2006, only Sara Beaudrie continued with the design and further analysis of the test results, as director of the SHL Program.

8. Linguistically based items were not considered suitable for identifying this population of SHL learners. Learners’ lexical and idiomatic knowledge is diverse and depends on the type and extent of their contact with Spanish. It is difficult to find expressions that even receptive bilinguals can recognize but that L2 learners in the U.S. Southwest, many of whom have Spanish-speaking friends or neighbors, cannot.

9. It should be noted that cultural knowledge items were piloted but not included because they failed to discriminate among the three levels in the written test format.


10. Although the test was intended solely for SHL students, this step was necessary to ensure exam validity.

11. Students who reported a certain amount of exposure to Spanish on the demographic survey, but whose responses did not sufficiently resemble those of the receptive bilingual population targeted at the University of Arizona, were redirected into the L2 track on the basis of their linguistic skills and needs.

12. In deleting items, we analyzed the item difficulty and checked for items that would increase the internal consistency of the test if deleted.
