
Lia Smith, ESL Instructor

Sabbatical Report

(Spring 2015 / Fall 2016)

December 31, 2016

Contents

Section 1 Is there anything new in research on reading? page 2

Section 2 Survey of City College Instructors page 8

Section 3 Computerized Reading Programs page 34

Section 4 Survey of tools we use to determine student reading levels/competency page 49

Section 5 Recommendations page 52

Appendix 1 (Bernhardt: Critique of body of research on second language reading) page 63

Appendix 2 (My Reading Load/Assessment Survey questions w/rationales) page 63

Appendix 3 (Degrees of Reading Power details for answer type) page 67

Appendix 4 (Types of answers and word counts for ESL Reading Tests for Common Final) page 70

Appendix 5 (List of books purchased Fall 2016 by CCSF Rosenberg Library) page 71

Appendix 6 (On-line ESL reading resources) page 73

Appendix 7 (Directions for on-line browsing Read Collection in Rosenberg Library) page 74

This report is organized in sections. Section 1 introduces the idea that there is no “general reading skill” and presents reasonable expectations for how well a literate, educated second language reader can handle advanced disciplinary texts. Section 2 contains the results of my survey “My Course Reading Load / Assessment,” conducted here at City College of San Francisco (CCSF). Two hundred and thirty instructors across the disciplines participated and wrote voluminous comments, revealing that they are deeply engaged with their students, are doing superlative work, and support student reading. Four themes emerged: reading loads are significant; completion and comprehension of reading assignments are inextricably tied to success in courses; prior knowledge supports reading; and instructors change reading loads as they evolve as teachers.

In Section 3, I describe and analyze two computerized reading improvement programs, Degrees of Reading Power, and Reading Plus, both used at City College. My research and analyses show that the creators of these programs have not drawn sufficiently from reading experts, that they have built their programs on the questionable notion that there is a general reading skill, and that the programs themselves may be requiring time that students might better use to read and/or receive support for comprehending course assignments.

Section 4 is a survey of tools used at City College to determine student reading levels/competency. In this section, I include information from my survey and from the literature on reading in a second language.

In Section 5, I make recommendations.

Section 1

Is there anything new in research on reading?

Students in general, and ESL students in particular, struggle with reading, but it is a fact of college study that instructors across the disciplines (science, technology, engineering, math, art, communications, etc.) use text to transmit information. Students who are poor readers, or are perceived as poor readers, may find it difficult to succeed in their college courses. When trying to find solutions to help struggling students, instructors may complain that students lack preparedness. If a student’s first language is not English, the English department may fault the ESL department: “Your job is to teach them the English.” Similarly, when faced with students who seem to be poor readers, content instructors may fault the English department: “Your job is to teach them to read.”

The assumptions behind both these charges are misguided. Although second language learners can gain a certain level of mastery over English in advanced ESL courses, there is no “body” of English that students can master before attempting content courses (e.g., Asian American Studies, Computer Science, Economics). Second language research shows that second language readers may comprehend at higher levels than tests necessarily show and, with explicit instruction, can learn to use their first language reading skills to good effect. Reading researchers also suggest that there is no such thing as a general reading skill, and that all readers must be explicitly taught to read advanced disciplinary texts.

Second language readers

What is called the Compensatory Model of Reading is a helpful framework to discuss reasonable expectations for second language (L2) readers. In the Compensatory theory, skill in second-language reading breaks down as follows (Bernhardt, 2011):

i. Literacy in L1 accounts for 20%

ii. Language knowledge for 30% (grammar + vocabulary, L1/L2 distance, etc.)

iii. Other factors account for the remaining 50% (unknown)

This means that it is supposed that a student literate in her first language can be predictably trained to a reading competency of 50% in another language. After that, what helps a second language reader develop her reading skills is unknown. To get at this mystery, one place researchers have looked is vocabulary. And, indeed, research in both first language (L1) and second language (L2) reading consistently shows that building vocabulary helps students advance in reading skills.
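Stated as a simple decomposition (my own shorthand for Bernhardt’s 20/30/50 figures above, not her notation):

\[
\text{L2 reading ability} \approx 0.20\,(\text{L1 literacy}) + 0.30\,(\text{L2 language knowledge}) + 0.50\,(\text{unexplained variance})
\]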

Since the 1940s, it has been maintained that for a reader to understand a text, she must know 95% of the words. Corpus linguistic studies have helped researchers determine that up to 50% of the words found in English texts are high frequency words, the body of which is sometimes cited as numbering 2,000 for reading of general texts, and up to perhaps 2,570 when an Academic Word List, providing “a coverage of about 90% in academic texts,” is added (Cobb, 2009). To illustrate, the word “the” accounts for 6-7% of the words in any given English text. Knowledge of this single function word moves a reader from 0% to 6-7% coverage of an English text. In an academic discipline, such as science, direct instruction could help students acquire and potentially master the vocabulary on an Academic Word List created for that domain, adding further coverage. It might seem that what are called “the first 2,000 words” can be learned in a systematic and predictable way by an average language learner; however, studies have shown that ESL students with upwards of 4,800 known words in English typically know only about half of the 2,000 most frequent word families (ibid., pp. 150-151). Experience indicates that ESL students will not “naturally” learn the first 2,000 words simply by reading available materials or reading according to interest. Promisingly, second language reading researchers and the experience of classroom instructors suggest that it’s possible to bring a literate, educated ESL student up to an 80% comprehension level in a relatively short period of time—several years of intensive immersion study—through direct vocabulary instruction.
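As a rough illustration of how coverage figures like these are computed, the sketch below counts what fraction of the running words in a passage appears in a known-word list. It is a simplification of what corpus researchers actually do (real studies count word families rather than raw word forms, and draw on published lists such as the General Service List or the Academic Word List); the tiny word list and passage here are invented purely for the example.

import re

def lexical_coverage(text, known_words):
    """Return the fraction of running words in `text` found in `known_words`."""
    tokens = re.findall(r"[a-z]+(?:'[a-z]+)?", text.lower())
    if not tokens:
        return 0.0
    return sum(1 for t in tokens if t in known_words) / len(tokens)

# Illustrative only: a tiny "high-frequency" list and a short passage.
high_frequency = {"the", "of", "and", "a", "to", "in", "is", "that", "it", "was"}
passage = "The results of the survey show that reading is central to success in college."
print(f"Coverage: {lexical_coverage(passage, high_frequency):.0%}")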

However, every gain in comprehension after that is very slow and quite incremental. In order to increase her vocabulary, a reader must be repeatedly exposed to an unknown word. If one assumes she can do this by reading new texts that are at her level, the fly in the ointment is that corpus studies show that the percentage of different words “that turned up only once” in authentic texts is approximately 54% (Adams, 2009). For learning new words, “the overall determination seems to be that at minimum, ten occurrences [of a new vocabulary item] are needed in most cases just to establish a basic representation in memory” (Cobb citing Horst, 2009). That is, if the foregoing is true, reading a text containing unknown words will not automatically expose a reader to new word items often enough for her to learn them.

The points above strongly suggest there are no obvious shortcuts for a second language reader to acquire the vocabulary it takes a native speaker twenty years to learn.

In fact, there is no way to teach an ESL learner “the English” in an ESL program that will prepare her for reading advanced disciplinary texts in content courses. In the U.S., it’s been estimated that an 8th grader must know 600,000 words to do well in school and that, through direct instruction, this same 8th grader might gain approximately 8,640 words over the course of 12 years. To address this confounding mismatch, first language reading researchers have proposed that average and ordinary readers build vocabulary through extensive, independent reading (Adams, 2009). With vocabulary gains for foreign language students studying abroad (in the target language environment) similarly estimated at 550 words per year (Milton & Meara, 1995), it seems unreasonable to expect a college-age student whose native tongue is not English to learn, or be taught, “the English” before she enters a mainstream English class or begins taking general education courses. Indeed, like the native speaker, to make the vocabulary gains necessary for academic success, a second language reader must read, a lot, and have access to materials in which she already knows 95-98% of the words. It’s important to note that the kinds of texts that would help both an ESL student who struggles with reading to achieve automaticity (the ability to read fluently) and an ESL student who reads competently at 80% mastery to increase vocabulary through independent reading are in short supply. We also need to keep in mind that, even if materials for both struggling and masterful second language readers were in full supply (materials at the 95-98% known-word level and on subjects and topics pertinent to the student’s area of study and/or interests), adult second language learners don’t often have the kind of free time required to read the amount of material that would translate into rapid vocabulary gain. For the second language reader, the barriers between what can be learned and what must be learned are far more numerous than for the first language 8th grader.

All readers

Setting aside the issues of vocabulary and the availability of texts for a moment, the second long-held idea, that there is a “general reading skill” that can be taught in an English class and used on all manner of texts, appears to be wishful thinking, for two reasons. First, readers need to be trained to handle lexical density in order to read advanced disciplinary texts. The kind of reading one does in elementary school is very different from what is soon required for academic success as schooling proceeds. We begin reading largely by reading narrative texts. These texts can be vocabulary demanding (and, for second language readers, demanding because of cultural referents and a preponderance of idiomatic language), but they are typically not lexically dense, even when written for adults. According to Halliday and Martin (1993), everyday spoken language has a lexical density of 2 to 3, compared to written texts at an index of 4 to 6, with science texts going up to 10 or higher. Moving from comprehending texts read in elementary school to texts in middle school, high school, and college does not happen automatically. Responding to statistics from the Nation’s Report Card: Reading 2005, which showed that “strong early reading skills do not automatically develop into more complex skills that enable students to deal with the specialized and sophisticated reading of literature, science, history and mathematics,” Timothy and Cynthia Shanahan worked with experts in several disciplines (history, science, mathematics) and concluded, “Most students need explicit teaching of sophisticated genres, specialized language conventions, disciplinary norms of precision and accuracy, and higher level interpretive processes.”
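For reference, the measure behind these index numbers can be written roughly as follows (a simplified statement of Halliday’s clause-based measure; exactly what counts as a lexical item varies by analyst):

\[
\text{lexical density} = \frac{\text{number of lexical (content) words}}{\text{number of clauses}}
\]

As an invented example, a spoken-style sentence such as “The engine failed because the line that carries the fuel had cracked” spreads roughly six content words over three clauses (density about 2), while a written, nominalized version such as “Engine failure resulted from fuel-line cracking” packs roughly five content words into a single clause (density about 5); this kind of compression is part of what makes disciplinary prose harder to read.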

Further, from discipline to discipline, texts are not read in the same manner. In 2001, Len Unsworth at the University of New England, Australia, proposed, “…there is a need to delineate ‘curriculum literacies,’ specifying the interface between a specific curriculum and its literacies rather than imagining there is a singular literacy that could be spread homogenously across the curriculum” (Unsworth, 2001), and the work of the Shanahans and others supports this idea. Through direct work with content experts in math, chemistry, history, and literature, the Shanahans learned that disciplinary experts read texts in their respective fields quite differently (Shanahan & Shanahan, 2008).

Reading the work of Zhihui Fang (2012) gave me some of the “how” of what Shanahan and Shanahan outlined in 2008. Although the Shanahans focused their studies on adolescents, much of what they’ve uncovered has strong implications for community college instructors. Certainly, if a college instructor views herself as a mentor to her student apprentices, it follows that she will work to find ways to explicitly teach her students some of the building blocks they will need to approach and read the texts from her discipline in the same manner she does.

In his paper The Challenges of Reading Disciplinary Texts, Zhihui Fang provides analyses of secondary schooling texts from science, math, and history. In order to show the leap that elementary school readers must make when they begin reading secondary education texts, he describes how certain grammatical features of English are exploited in these texts to convey complex and advanced thought.

Fang explains that in science texts, readers are faced with technical vocabulary, long noun phrases, nominalizations, and metaphorical realizations of logical reasoning. For math, a reader must grapple with technical vocabulary, semi-technical terms, long noun phrases, symbolism and visual display. And, history texts are full of generic nouns, nominalizations, and abstractions that suggest causality. To approach a history textbook critically, a reader must be able to do several things, including being able to discern how a writer sandwiches a fact or set of facts between his own prediction and conclusion, which Fang calls “texture.”

Historical discourse…is often constructed in abstract language that infuses the historian’s ideological perspectives. The abstraction is realized through 1) the use of… generic nouns…2) nominalizations, which turn a series of events, an action, or time sequences into “things”…3) evaluative vocabulary to indicate affect, judgment and valuation…4) texture…5) within-clause realizations of causal relations that sometimes conflate causality and temporality (Fang, 61-63).

Fang’s work seems to clarify some of the difficulties that second language readers face when accessing advanced texts in high school and college. “The difficulties of disciplinary texts lie not just in vocabulary, but more broadly in the discourse grammar, or language patterns… [and these features serve] quite different functions across the subjects” (ibid, 63). To illustrate, Fang enumerates the following uses of nominalization:

• In science, nominalization helps accumulate meanings so that a technical term can be defined or an explanation sequence synthesized for further discussion.

• In history, nominalization is often used to portray events as things so that historians can develop a chain of reasoning that simultaneously embeds judgment and interpretation.

• In mathematics, nominalization helps create semitechnical terms that are then quantified, reified as mathematical concepts, or put into new relationships with other concepts.

Especially in a college environment, where it is not uncommon for English instructors to coax students away from over-use of nominalization in their writing, teaching students how nominalization is used effectively in advanced disciplinary texts is an important key to helping students improve their reading comprehension.

Limits of transfer of first language (L1) reading skills to second language (L2) reading

Second language reading research strongly shows that L1 literacy has a markedly positive bearing on how well someone learns to read in a second language. Just as poor L1 readers tend to become worse readers as they advance in their schooling and good L1 readers tend to become better readers over time (keeping in mind that learning to read advanced disciplinary texts must be explicitly taught), good L1 readers have a clear advantage over poor L1 readers in gaining high literacy in a second language. However, the assumption that a literate subject can transfer her reading skills in her first language (L1) to a second language (L2) has been largely based on the idea that there is some set of measurable “universal reading skills” that readers can employ across languages. This theory has not been proven and, according to Bernhardt, studies to date have been deeply flawed (see Appendix 1). I have already mentioned that there is a growing body of evidence that there is no such thing as “a universal reading skillset,” or even what could be called a “general reading ability” that can be used with any and all texts in a reader’s L1. Another wrong assumption is that, just as a bilingual speaker of English and French changes her “code” when she leaves San Francisco (stops speaking English) and enters Paris (starts speaking French), this same individual will “turn off” her English when she reads French. On the contrary, both brain research on reading processes and L2 reading research show that the first language is always present and, paradoxically, can provide both interference and assistance.

As a means of organizing this discussion, I will use the hypothetical model Barbara Birch (2002) proposed in her book, English L2 reading: Getting to the bottom. She suggests that L2 reading has two parallel processing components:

Processing Strategies | Knowledge Base

Cognitive Processing Strategies (inference, predicting, problem solving, constructing meaning) | World Knowledge (people, places, events, activities)

Learning Processing Strategies (chunking into phrases, accessing word meaning, word identification, letter recognition) | Language Knowledge (sentences, phrases, words, letters, sounds)


I will start with the two bottom components, which are essential for reading. (The top two are necessary but not sufficient.) First, looking at “Language Knowledge,” a reader must be able to hear and differentiate sounds. This is the basis of spoken language, which humans began to write down when they invented writing. (This is not at all to say that deaf people cannot learn to read, but that is another discussion.) If the written language is alphabetic, the reader must next be able to determine what a letter is, and this must be taught. If you are unfamiliar with Arabic script, you might think that text is one continuous line. Untrained eyes (or untrained readers of Braille) cannot tell where one letter leaves off and the next begins. Readers must be able to tell the boundaries of letters, of words, of phrases, and of sentences. At all of these junctures, L1 can interfere with or slow down L2 processing.

For a reader of Chinese, a logographic language, where native speakers memorize characters as part of learning to read, trying to use the same “accessing word meaning” strategy—memorization—to learn to read in English impedes progress; since English uses an alphabetic script, beyond a foundation of “sight” words (words that must be memorized), it is far more efficient to read by sound. As a reader of English, when I read “qin” in Pinyin, a representation of modern Chinese using the Roman alphabet, my English reading brain has to ignore the excitability of the letter ‘q’ in order to process the Chinese word. That is, I can be slowed down as my English reading brain notes that the expected ‘u + vowel’ is not there (quick; brusque; quadrant), or when my brain miscues meaning because the pronunciation of “qin” is associated in my mind with the English word chin [亲切 (qīnqiè) = cordial].

What about cognates? Do they help or hinder? Studies have shown that although cognates (words that are similar or the same from one language to another) can be helpful for an English speaker learning Spanish, false cognates (embarazada in Spanish means pregnant) and cognates that are limited to a single meaning (conveniente in Spanish typically refers to any cost/benefit advantage, whereas convenient in English is most often related to location) mislead readers, and the promise of cognates keeps some learners from acquiring vocabulary: “Well, in English it’s ordinary and in Spanish it is ordinario, so I’ll just add ‘o’ to every word and I’ll be speaking Spanish.”

If the reader reads from right to left in her L1, as in Arabic and Hebrew, adjustments have to be made to process letters with the opposite sweep. Syntactical differences can also slow down reading. In Japanese, the verb comes at the end of sentences, so a native speaker of English will have to make an adjustment if she constructs meaning through subject-verb pairs appearing at the beginning of sentences. All of these “interferences” are extremely slight, many happening in milliseconds or faster, but they nevertheless can slow down reading and/or add to reader fatigue in L2.

Moving to the other two components, which “are necessary but not sufficient for competent reading” (Lems, 2010), L1 can also interfere with or slow down processing. For example, in Chinese, concession often comes at the end of a sentence, or paragraph, or even the entire text. This is quite different from texts typically read in American schools where the concession comes at the beginning of a sentence, paragraph, or the entire text. As long as these differences in how texts are organized are pointed out, mastery over L2 reading can proceed smoothly.

More problematic, however, is the final category: World Knowledge. Incomplete or missing world knowledge can lead readers astray. “[My student had read a passage in German and] answered all my content questions about the passage accurately… [and then remarked] that she didn’t know Martin Luther King, Jr. knew German” (Bernhardt, p. 103). If the student had been aware of Martin Luther’s 95 Theses, might she have comprehended the passage more fully? World knowledge encompasses more than just historical facts; it also includes educational development. “… [We] memorized everything…and then reproduced what the teacher knew. That was what was considered literacy—being able to read and write what the teacher knew….

[It wasn’t until we reached higher levels of education] that we began to understand that literacy…was understanding what is around us” (Ball quoting Gafumbe, 2006). Mieko, an Asian American, reported not reading from junior high school through her first years of university. It wasn’t until she was assigned to keep a reading journal that she was “forced to think about the material…rather than simply absorb it, perhaps to be forgotten later” (Ball, 2006, p. 89). That is, we learn facts about the world—Martin Luther brought about reform in Germany in the 1500s; Martin Luther King, Jr. brought about reform in the United States in the 1950s—and we also grow as learners and thinkers, depending on where and how we are educated. If we are taught to memorize everything we read, we will be very different readers from those who are taught to consider any number of subtexts, for example the biases a writer brings to her writing.

Research studies (albeit imperfect) show that L2 readers sometimes use L1 reading strategies and world knowledge when reading in L2, but not always. One theory is that language learners will only start to use L1 reading strategies once they gain proficiency in L2, but researchers have not been able to discern any rhyme or reason for when and why L2 readers use L1 reading strategies and their world knowledge.

Are we using the best reading materials to teach language?

Historically, it has been assumed that a reader accesses language through the reading of literary texts. To this day, despite trends towards using communicative language methodology, upper level foreign language students largely study “classical” literature of the target language. There have been many reasons for this, not least of which was the availability of texts. For example, it has been fairly easy for a K-12 teacher in the U.S. to get an inexpensive class set of Mark Twain’s The Adventures of Huckleberry Finn, or for an ESL instructor at CCSF to determine that a young adult fiction title will be accessible to his/her students.

However, despite the low cost of this approach, educators have found that using “classical” texts is problematic. In the United States, for example, so-called “classics” are often irrelevant to the lives of diverse students who come from a multitude of language and ethnic-origin backgrounds. Literary “classics” can also be narrow in focus. For example, in a lively discussion of vocabulary acquisition, Marlise Horst of Concordia University in Montreal points out that readers will easily learn the word “moor” if they read an abridged version of Wuthering Heights (Horst, Section 2 in Han & Anderson, 2009, pp. 46-47). However, the noun moor is not on any Academic Word List and is not in the word families that comprise 80% of the vocabulary one needs to read English with some ease. Finally, literary “classics” are often taught so that students comprehend the text “…at a literal level…and appreciate [the piece] as a work of literature…situated in a particular historical time frame” when what students need are opportunities to “…openly critique [a text] by comparing it to other classic and contemporary texts…to question [the themes, such as injustice], and to produce media” (Hicks & Steffel, 2012).

Still, finding a properly leveled set of non-fiction texts, without copyright restrictions, on current affairs is very difficult. The advent of the Internet brings a potential for change, and language educators have been considering and experimenting with using non-literary texts to help move students towards being able to handle “upper register texts.”

Section 2 – Survey of City College Instructors

I wanted to find out how much reading CCSF instructors were assigning and to survey the types of reading support they offered, as well as the assessments they used to determine whether students were successfully completing reading assignments. My assumption was that a number of students were possibly being assigned more reading than they could handle and/or were not, for any number of reasons, completing reading assignments. I also suspected that reading loads had increased because of technology.

Part of my interest in this direction of inquiry came from a student-produced video, “Reading Between the Lives,” made in 2009 by a group of Chabot Community College students and posted to YouTube in 2015. In this video documentary, students relate their experiences regarding “reading.” Tellingly, one states that he only buys the textbook if he judges his instructor to be a poor lecturer. The students in the video seem to understand text responsibility, and state that pragmatics, for the most part, determine how much and what kind of reading they do (Chabot, 2009).

Through my survey, I hoped to find out how the instructional practices of instructors here at City College of San Francisco, also a community college, might be encouraging or dissuading students from reading.

Please note that a somewhat similar survey was conducted at CCSF in the spring of 2013 by a Basic Skills Committee (Andrew King, Lisa King, Tore Langmo, Linda Legaspi, Caroline Minkowski, Cindy Slates, and Elizabeth Zarubin). This group wanted to investigate what CCSF instructors were doing to engage students with text. Ninety-seven instructors across twenty-one disciplines participated in the 2013 survey with the highest number of participants in ESL (21), and the next highest in English (9) and Biological Sciences (9). The project was put on hold when Basic Skills changed direction at the college, and no report was generated.

Overall, my survey revealed four main themes: that reading loads are significant, that completion and comprehension of reading assignments are inextricably tied to success in courses, that prior knowledge supports reading, and that instructors change reading loads as they evolve as teachers.

Reading Loads

Surveyed instructors are typically assigning at least 2 hours of reading per week, meaning that students who carry a full load (3-4 classes) may carry a 6-8 hour reading load per week. One instructor did mention considering how the reading load might affect a student’s overall study time: “Many students have two to three classes with me per semester, so I try to manage their time… they will also have a written assignment with the reading” (#209, Health Care Technology). Instructors may be underestimating the time it takes students to complete reading assignments, particularly for ESL students, who tend to spend three to four times as long on reading assignments as native speakers of English. A conservative estimate of the hours needed for reading class texts for an average full-time ESL student may be upwards of 20 hours per week.

Elizabeth Barre, Assistant Director for the Center for Teaching Excellence at Rice University, recently created an on-line workload estimator to help college instructors understand the factors to take into consideration when assigning reading. These factors include the number of words per page (typically 750 per page in a college textbook), the difficulty (no new concepts, some new concepts, many new concepts), and the purpose of the reading (survey, understanding, engagement). For the calculator, see: [http://cte.rice.edu/blogarchive/2016/07/11/workload].
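To make the arithmetic concrete, the sketch below estimates weekly reading hours from pages assigned, words per page, and an assumed reading rate. This is not Barre’s calculator or her published rates; the words-per-minute figures are illustrative placeholders only.

# Illustrative reading-rate assumptions (words per minute), NOT published values.
READING_RATES_WPM = {
    ("no new concepts", "survey"): 450,
    ("some new concepts", "understand"): 250,
    ("many new concepts", "engage"): 130,
}

def weekly_reading_hours(pages_per_week, words_per_page=750,
                         difficulty="some new concepts", purpose="understand"):
    """Estimate hours per week of reading for one course."""
    words_per_minute = READING_RATES_WPM[(difficulty, purpose)]
    total_words = pages_per_week * words_per_page
    return total_words / words_per_minute / 60  # minutes -> hours

# Example: 20 textbook pages per week of moderately difficult reading.
print(f"{weekly_reading_hours(20):.1f} hours per week")
# A full load of three or four such courses would be roughly three to four times that figure.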

Ancillary Reading Adds to Reading Loads

In the survey, ancillary reading was defined in the question prompt to mean syllabus, assignment sheets, handouts, peer review, instructions for using Insight and other computer applications, Internet forums, etc. Because the survey question was flawed, instructors interpreted “ancillary reading” in a variety of ways, and it was unclear from the data whether instructors considered ancillary reading part of the reading load. For example, a few instructors interpreted “ancillary” to mean optional independent reading.

Instructors reported assigning the following ancillary readings:

• peer review

• instructions

• class handouts (some instructors go over these during class)

• on-line resources (optional)

• support resources (optional)

• ad hoc newspaper articles (current events)

• lab assignments

English and ESL instructors report providing the largest number of handouts and detailed assignment instructions. Instructors might consider how much ancillary reading adds to reading loads.

Direct Reading Instruction

Question 10 asked instructors what direct instruction they gave their students during class to help them complete their reading assignments. Impressively, responses show that instructors work hard to engage students in reading and, more importantly, that assigned readings serve to help students master course material. Most encouraging, in light of current research on reading, was that 42% of respondents report that they instruct students to read texts “in the manner that reading in my discipline requires.” Cynthia and Timothy Shanahan, joined by other recognized experts in reading research, have questioned the idea of a “general reading skill” and argue that students must be explicitly trained by experts in a specific discipline to read advanced disciplinary texts in that particular discipline. In The Challenges of Reading Disciplinary Texts, Zhihui Fang quotes Michael Came, a math teacher, describing how he teaches his students to read word problems:

I summarize the goal of the problem and determine what is being asked before I do anything else…The goal of this particular problem is to find out the total area of the top surface of five round stools. The word round implies that the surface is a circle…I know that thinking about circles can involve many different concepts such as radius, diameter, and circumference…I also know that the formula for the area of a circle is A = πr²…So, now I know that I need the radius to find the area. Other

formulas might be useful in case the radius isn’t given. These formulas are C = πd and d = 2r…. I also get rid of information that is not important by crossing it out… [127 words excerpted from 400+] (Fang, 2012).

One surveyed CCSF math instructor stated, “I try to model careful reading of word problems and critical thinking when I solve problems for the class at the board” (#8, Math), an example of awareness that content instructors must explicitly teach students how to read according to the target discipline. Respondents also reported discussing genre and purpose with their students. Familiarity with genre is a good predictor of whether a reader will comprehend a text well enough to pass a standardized reading comprehension test, and reading for a purpose can guide readers to focus their reading. Considering genre is a method for activating prior knowledge, and identifying purpose is a method for determining importance. Both activating prior knowledge and determining importance are highly effective reading strategies.

Reading Comprehension Tied to Prior Knowledge

The theme of reading being tied to prior knowledge came up in the question about reading assessments. A good number of CCSF instructors use assessments of their students’ reading comprehension to guide students to learning resources and generally support student learning. Notably, however, the majority of surveyed math instructors commented that without reading comprehension, a student would be unable to complete the coursework. “Reading comprehension is not explicitly part of my students' grades. Nevertheless, they must comprehend word problems in order to solve them correctly” (#8, Math). This need for reading comprehension was echoed by several instructors in other disciplines. “Their comprehension does affect their grade, in that their ability to succeed at the essay task is influenced by their reading comprehension” (#58, English). “Mostly in capstone project students are assessed on how thoroughly they have accomplished research and topics” (#136, Environmental Horticulture & Floristry). “I don't have a category for reading comprehension in the grade breakdown, but one can't pass an English class if the readings are not understood” (#199, English). “Reading is needed to complete in-class work, presentations, and papers” (#203, Child Development).

Conversely, in some courses, instructors state that prior knowledge or general competence is more important than unassessed reading comprehension. “Exams assess reading comprehension [of each student]. However, it is difficult to say whether the grade reflects their reading comprehension, or lecture comprehension, or their previous science knowledge. With science courses, their reading comprehension is highly affected by their previous course work!” (#86, Biological Sciences). “If they grasp the ideas in a classroom, college and in employment, then I am not too concerned with reading comprehension” (#174, Library Information Technology). These astute comments are directly tied to current reading research that shows how deeply knowledge influences reading. “Words are not just words. They are the nexus—the interface—between communication and thought” (Adams, 2009). To give an example, it is much easier for a cook who has worked in a kitchen for a number of years to understand an advanced text on how to prepare a dish, even if it contains words or grammatical and text structures she has never seen before, than for a lay person to comprehend the same text. The cook has the knowledge to interpret the text, whereas the lay person cannot visualize what the text expresses and so cannot “fill in the blanks” when she encounters unknown words or sentences with multiple clauses.

Changes in Reading Loads Over the Past Ten Years

The question of whether instructors had changed their reading loads in the last ten years elicited by far the most information in terms of what is required to use college reading as an effective teaching tool. One of my unstated hypotheses was that reading loads have increased, particularly ancillary loads, due to technology. The survey doesn’t show this, but the comments do. Any instructor who adds an on-line forum to her class (e.g., Insight or Canvas) has increased the reading load. Curiously, some instructors don’t consider computer use as reading. In an extreme example, one instructor commented that reading loads were not applicable since the course was offered on-line. I was also surprised that a Library Information Technology instructor did not see advanced reading skills as a must for successful completion of his/her course work. Certain comments made me wonder if computer use has become so ubiquitous as to go unnoticed as a reading activity. Even if one appears only to be using a computer for accessing videos and other visual media, some reading is still required.

Factors Affecting Reading Loads

Although one instructor offered, “My [weekly] guideline has always been 30-50 pages for lower division undergrad, 100-200 pages for upper division, and 200-500 pages at the grad level” (#93, Political Science), it’s important to consider that while City College’s curriculum committee always checks that a credit course assigns enough homework for the Carnegie units, including the reading load, this course design (often far removed from what an instructor does in the classroom) is really the only standardization for how much text an instructor should assign. Some instructors may begin their careers assigning too much reading (unrealistic) or too little (inadequate for students to learn what is required). For the most part, instructors who offered comments revealed that they have matured over time in their teaching practices and in how they assign reading loads.

Several instructors report that increases in ancillary loads are due to new state requirements. “This is probably due to the fact that instructors are now being required to document so much of what they do in the classroom. The length of my first day handout has doubled!” (#105, ESL). “It is due to the increased amount of paper work that our school is required to complete” (#161, Disabled Students Programs and Services). But, by far, the majority of instructors report having increased reading loads due to the better availability of resources to encourage learning. “I am hoping that smaller or shorter ancillary readings that pertain to students' lives will get them to read more” (#109, ESL).

Indeed, instructors report having made changes to reading loads because they have gained experience and expertise. Experience gives instructors tools to better understand student needs, select reading materials judiciously, and create effective handouts.

One selection instructors make is textbooks, and better books can increase, decrease, or rebalance textbook and ancillary reading loads. “The new textbook includes enough [so I eliminated] ancillary text. Cost factored into this” (#171, Political Science). “I got rid of the text… and now assign on-line articles and websites…I've customized many handouts to provide specific information in a simple, concise format. I have found that students learn better if the information is direct, well organized and not over burdening” (#196, Child Development). “I have a better textbook, and so I don't need to use as many handouts” (#200, English). Instructors also comment that they want to protect students from the high cost of textbooks whenever possible. “The main reason for a slight decrease is that I choose shorter, less expensive books”

(#169, Anthropology). “Text books are outrageously priced…so I provide materials for the students to learn the information if they are not able to purchase the book” (#229, Photography).

Instructors who have increased their reading assignments express excitement about being able to offer more to students. “There is more I want students to learn. I have more resources” (#5, Culinary Arts). “[I’ve added] more short introductions to readings to help them contextualize the reading, more handouts to check for reading comprehension/analysis” (#131, History). “Reading is a great way to learn a language! I have been inspired by my own progress in foreign language, hugely improved through continual, daily reading” (#39, ESL). “I've diversified the sources, to provide a more diverse range of voices and approaches to the material” (#220, Visual Media Design).

Many who have pro-actively reduced the number of reading assignments cite having gained a better ability to focus students. “I have found that students…are more likely to complete the reading if there are fewer assignments, so I have focused on a few specific examples of what I most want them to learn” (#131, History). “Students with weak English were plowing through too much optional material…I decided to focus more on [basic concepts] that confused students” (#156, Economics). “As students have read less or been less able to read (due to time or comprehension), I have reduced the reading load so we can spend significant time on each [text]” (#91, Speech Communication). “Texts have changed… [and] require less work to understand… [I’ve also reduced ancillary reading assignments.] It has been hard enough to get them to read the text” (#113, Child Development). “I found students overwhelmed by the reading; they learn more, unencumbered…learning by doing” (#163, Broadcast Electronic Media Arts).

Others have done so purely for pragmatic reasons. “Too many students would fail the class if I required more of them” (#50, Psychology). “I feel that there is pressure from students and other faculty to reduce the out-of-class workload on students” (#134, Broadcast Electronic Media Arts).

Although one English instructor reports having reduced assigned readings due to “curriculum constraints,” all other instructors reporting that they have increased reading loads cite curriculum changes, pedagogical insights, and departmental shifts as reasons. “The course outline now specifically includes reading skills” (#13, Mathematics). “The text reading does not engage the type of learning found in Bloom's taxonomy of cognitive learning” (#78, Biological Sciences). “The English Department standard has shifted [in the last ten years] to include more authentic readings in English 91, and frequently to include a complete book” (#68, English). “Higher expectations from students and teachers” (#101, ESL).

Technological advances have had a great impact on reading loads. While information remains fairly stable in some fields, advances in certain disciplines and in methods of instructional delivery are significantly adding to reading loads. In Fire Science Technology, for example, patient assessment protocols have not “changed radically” (#205, Fire Science Technology), but “the Student study manual has been updated…for today’s demands in the life safety discussion, [and is more detailed]” (#210, Fire Science Technology). “Due to increased aircraft technical complexity, reading loads have increased. Students are required to obtain more technical information” (#217, Aircraft Maintenance). “Due to the use of newer software for post-production, more handouts and ancillary reading is given to students to give them more current information that the book might not cover” (#229, Photography). “Health care and technology changes happen daily. Textbooks are printed every 3-4 years. To keep the curriculum current, we assign additional reading to make sure students are prepared to enter the current workforce” (#209, Health Care Technology).

A number of instructors report an increase in reading loads because they have been able to improve instruction by using the Internet. “The Internet makes it possible to research specific cases and we use that” (#166, Broadcast Electronic Media Arts). “Using websites to add online sources is now available” (#74, Biological Sciences). “I found better texts that come with a CD or have a video on YouTube” (#66, ESL). “Since I began using Insight five years ago, I have been assigning more reading” (#23, ESL).

At the same time, technology is also allowing instructors to replace reading assignments. “Most of my students have expressed more interest in required viewings such as TED talks. This change [allows me to respond to] differences in learning styles” (#122, Biological Sciences).

One group of instructors express a certain resignation about the state of reading. “The students don't read; that's the truth” (#7, Culinary Arts). “They don't read, and lots of books at the bookstore tell students to stay away. I just can't handle the terrible writing anymore” (#168, Anthropology). “I find that students are increasingly reluctant to read. Many seem capable, but find it boring or old fashioned. They say they don't have time to do the reading or just didn't get to it in time” (#169, Anthropology). “I've found that students are less and less inclined to read lengthy material. I've cut way back on the reading assignments and rely more on in-class instruction” (#196, Child Development).

These comments about reading attitudes somewhat echo the comment of Mark Sadoski, author of Why? The Goals of Teaching Reading (2004), who states, “In fact, aliteracy [can read but don’t] may be one of our most pressing literacy issues because with reading, as with other abilities, its value lies not in its possession but its use.”

A caution for instructors is to remember that they themselves, and not the students, are changing. It is easy for experienced instructors to fall into the fallacy that somehow today’s students are not the same as students in the past. While it is true that “[it is not] sufficient to have motivated readers who read with little skill or read only in a restricted domain such as simple fiction” (Sadoski, 2004), the following comments may reflect the maturation and expanded knowledge base of the instructor more so than the students s/he encounters. “I expect that students who do not understand vocabulary will look things up but students have come to class expecting that they would be told how to connect the dots.” “[There has been a loss in] the ability to recognize abstract terminology & application of metaphor and analogy & [there has been a] lessening of a common cultural language and knowledge.” “Students preparedness for college has definitely declined, and as well, the number of students who seem to have natural ability to succeed in entry level classes has gone down.” Instructors who have these feelings might consider Alfred Binet’s core belief: intelligence increases with age. When France made public schooling a right for every child, Binet’s “intelligence test” was designed as a tool to help teachers serve students who came to school with less experience than the youngsters the private schools were accustomed to serving. Students who can succeed in entry level classes have sufficient experience to do so.

In the age before the Internet, a high school reading level was considered more than adequate for nearly anyone to succeed in the work world (Adams, 1994; Chall, 1996). Today, because of technological advances and the Internet, citizens need much higher reading levels. Changes to the City College of San Francisco curriculum reflect this. “The writing assignments have become more sophisticated and plentiful in English 91. Students are doing more reading before they begin writing an essay than in the past. All essays are text-based and require using information from multiple sources” (#185, English). Regardless of whether English classes in high schools are preparing students as well as in the past to read literature, a claim that research tends to support, the same research shows that science texts are growing increasingly difficult and that high school textbooks do not necessarily ask for the kind of advanced reading that college students and those in the workforce must do (Adams, 2009). It is reassuring that CCSF instructors are attuned to both the reading demands in their disciplines and the needs of their students, and that they are making thoughtful changes to their reading assignments as they evolve in their teaching practices.

Survey Results

All CCSF instructors were invited to participate through listservs, department chair requests, the EFF, and personal requests. Non-credit ESL instructors also participated. For the survey questions and rationales, see Appendix 2.

Q1. What is your discipline?

Two hundred twenty-seven CCSF instructors from forty-three of seventy-seven CCSF disciplines (56%) participated, representing a broad range across the college curriculum and over 15% of the 1,482 total faculty. Those who took the survey included instructors from Aircraft Maintenance, Art, Culinary Arts, Fire Science Technology, Mathematics, Photography, and Visual Media Design.

That 40% of respondents represent the English and English as a Second Language departments is not surprising given their large sizes. Despite the small sizes of their departments, instructors of Administration of Science, Biological Sciences, Broadcast Electronic Media Arts, Child Development, Culinary Arts, Disabled Students Programs & Services, Fire Science Technology, Health Care Technology, Library Information Technology, Nursing – Licensed Vocational, Photography, and Visual Media Design participated at high rates. Art, Foreign Languages, and Math instructors participated at significant rates, which is notable given that these disciplines are not viewed as “text” demanding.

Q2. Are you full or part-time?

Full-time: 158 (70%) Part-time: 69 (30%)

Q3. What division do you teach in?

Credit: 174 (76%) Non-Credit: 31 (14%) Both: 21 (9%) Other: 1 (0.4%)

(The one “other” respondent reported on courses taught at another institution.)

Q4. Which of the following reading skills do your students need to handle the texts you assign? [Responses summarized below.]

(Rounded numbers on the right represent the number who chose the item as “most important.” The percentages are courtesy of Survey Monkey. “Writing an essay” is my error. Please ignore.)

Recalling main ideas: 69 (38%)
Analyzing ideas in the readings: 24 (13%)
Summarizing ideas: 13 (7%)
Using information from the readings to solve problems: 27 (15%)
Synthesizing ideas drawn from different sources (including non-text sources): 7 (4%)
Explaining readings in one’s “own words”: 9 (5%)
Reading critically and expressing one’s opinions: 13 (7%)
Writing an essay: 13 (7%)

To interpret the data for this question, it is best to keep in mind that 40% of polled instructors are English or ESL instructors. The fourth choice, Using information from the readings to solve problems, describes the reading students do in math and the sciences. Synthesizing ideas drawn from different sources (including non-text sources) similarly describes reading processes for chemistry and math, where the reader simultaneously uses descriptive text, symbolism (formulas), and visuals (e.g., graphs or diagrams) to understand concepts.

Q5. Choose one core course to answer questions #6 and #7.

The courses listed for this question were extremely diverse, with very little duplication except in courses offered in multiple sections (e.g., English 95, English 96, ESL 150, ESL 160). This perhaps reflects the diversity among the faculty who took the survey. This could also partly be a reflection of specialization in certain fields (e.g., Fire Science Technology; Visual Media Design) where a particular instructor regularly teaches a particular course.

Q6. How many pages of core reading do you typically assign your students per semester for the selected course? How many words per page, on average?

The number of pages students are assigned per week was evenly spread across respondents, with 26.8% assigning less than five pages per week, 24.4% assigning five to ten pages, 26.2% assigning ten to twenty pages, and 22.6% assigning more than twenty pages a week.

These numbers are reflective of the discipline (e.g., less in Chemistry, more in Anthropology; low number of assigned pages for non-credit, high number of assigned pages in credit).

For the most part, the “less than five pages” per week category reflects the instructors who teach non-credit ESL. English, credit ESL, and other credit instructors fall into the 10-20 and more-than-20 pages per week categories. Overall, instructors had only a general idea, or none at all, of the number of words per page in the reading texts they assigned. Some remarked that they had never given it much thought. Most gave general estimates for the number of pages they assigned. Instructors of General Education classes (e.g., Nutrition 12) assign 20-30 pages per week, so a student with a full load might have up to 150 assigned pages of reading per week.

Science instructors stated they used texts that contain lots of pictures and graphs. This does not necessarily lighten the reading load but may instead require more time as students grapple to understand concepts through the integration of information from text, symbolism, and visuals. One instructor noted that students with previous science backgrounds might spend less time (1-2 hours) on assigned readings than students new to the discipline. “Others can spend 4-5 hours and would have to re-read the chapters again!” (#86, Biological Sciences).

Math instructors emphasized that being able to comprehend word problems was critical. “… Reading comprehension in word problems is EXTREMELY important, so I feel a need to respond to this survey” (#12, Math). Although the text itself may not be lengthy, students may need to read the passage repeatedly in order to understand both what is being asked and how to go about finding the solution to the problem.

Some instructors noted that they assumed students only skimmed. “I don’t think they read in any great depth.” Others reported voicing expectations to students, with a number of instructors reporting that they expect students to spend up to six hours per week preparing for each class meeting. A Culinary Arts instructor advised students to read the “notes/summary/examples to get the main ideas, then if time allows, read entire chapter” (#2, Culinary Arts / Hospitality).

Q7. How many hours per week do you estimate it takes students to read these core reading assignments?

Instructors estimate that it takes their students about two hours a week to complete the reading assignments, with 18.5% reporting less than 1 hour and 7% reporting more than five hours per week. Several instructors mentioned using the Carnegie unit—a minimum of two hours of out-of-class work for each class unit. And, for accelerated classes, instructors stated that reading loads were doubled.

Responses show that instructors are aware that factors such as level of education, or whether one is a native speaker of English, affect how much time students need to complete reading assignments. “[Four to five hours] for an average student who has completed a non-remedial based high-school education. About 1/3 of my students have Baccalaureate degrees already, so possibly less for them. And I also have many students from under-served populations who may find the reading takes them more than 5 hours per week for full comprehension” (#92, Broadcast Electronic Media Arts). “I think reading, understanding and synthesizing can be done in 1-2 hours per week but non-native English students often struggle with readings. They often translate into their language and then translate back, a strategy which doesn't work well for common usage words that have a different meaning in our discipline” (#190, Library Information Technology). “Reading assignments includes annotating, outlining, or summarizing - This could take 4-5 hours [compared to 2-3] if student are not strong readers” (#110, ESL).

Generally, ESL instructors gave higher estimates for reading time, 4-5 hours a week compared to the 1-2 hour estimates given by content instructors.

Responses also show that content instructors are aware of the type of reading required by their disciplines. “Reading a biology text (studying it) is slow work. Notes need to be taken, vocabulary retained…I [also] assign Connect [I couldn’t find out what this is], almost an hour per chapter, to encourage reading of the chapter” (#80, Biological Sciences). “To read the chapters doesn't take much time. However, to solve homework problems takes a long time. The book is useful as a reference for solving problems” (#124 Physics). “Besides reading, the student is required to remember the reading content and to apply the medical concepts during hands on practice session with manikins” (#205, Fire Science Technology). “In Econ there are frequently graphs or math problems in the text related to the reading. If they read slowly and carefully, which they should do, [a single reading] will take longer, from 1-2 hours. They could skim and just read the text, ignoring the graphs and examples, in under an hour” (#156, Economics).

Q8. How many hours of ancillary reading do you require students to complete? (syllabus, assignment sheets, handouts, peer review, instructions for using Insight and other computer applications, Internet forums, etc.)

Ancillary reading loads were largely either less than an hour (44%) or 1-2 hours per week (39%). Twenty-one instructors (13%) assigned 2-3 hours of ancillary reading per week.

Q9. How are reading assignments distributed across the semester?

- About the same number of pages per week: 64 (38%)
- Some weeks, the number of pages is heavier… depending on other assignments (writing/research/presentation/preparing for a test/etc.): 75 (45%)
- The reading load is heaviest at the beginning of the semester: 9 (5%)
- The reading load is heaviest as we approach midterm: 1 (0.6%)
- The reading load is heavy in the weeks leading up to a writing assignment and this happens ______ a semester. (Use comment box.): 6 (3.6%)
- The reading load is heaviest towards the end of the semester: 6 (3.6%)
- Other (Please use comment box.): 6 (3.6%)

Responses to this question show that the majority of instructors either assign regularly spaced readings or assign reading loads according to activity.

Most instructors have a clear idea of how reading assignments are distributed across the semester. In English classes, when students are reading full-length works, the same number of pages is assigned each week. ESL instructors report that texts don’t change in length, but the difficulty of concepts and vocabulary steadily increases. Instructors across the disciplines remark that the reading difficulty increases as the semester unfolds. “[Through reading] the student is required to build on previous concepts of the course…[this] supports the student learning outcome for the course of effective patient assessment for both medical and injury related issues” (#205, Fire Science Technology). One instructor reported making sure to balance reading loads with other assignments.

Instructor comments predominantly show that reading assignments help students meet course objectives. “We start a unit with an inductive opener, which is not text-based, but soon after that we move into a heavy reading period as we are digesting new ideas” (#58, English). “[Reading load increases whenever] I choose to bring in additional information on the topic or provide further explanation of how to perform or understand a particular technique” (#156, Economics).

In some cases, assigned readings decrease as students work on semester projects. “Reading is front-loaded, and tapers off as we focus more on project-based activities” (#220, Visual Media Design). Some semester projects increase reading loads. “Students have a capstone project that requires additional reading for research” (#136, Environmental Horticulture and Floristry). “Reading is typically heavier in weeks between writing assignments. Towards the end of the semester textbook reading is reduced to a minimum but, hopefully, is compensated by students researching their topics of interest” (#166, Broadcast Electronic Media Arts).

Q10. During class, what direct instruction do you give your students to help them complete their reading assignments? (Check all that apply.)

a. I specifically teach them how to read texts according to the manner that reading in my discipline requires. (42%)

b. I explain the genre of the reading and spend class time giving explicit instruction for approaching the text. (38%)

c. I instruct them to consider the genre and/or the purpose of the reading before they begin reading. (40%)

d. We do a close reading of each text. (21%)

e. I set aside class time for think-alouds. (28%)

f. I interpret the text for the students. (36%)

g. I hold a class discussion about the students’ reading habits. (24%)

h. How a student approaches reading the texts is up to the individual student. (33%)

i. Other: (19%)

Fifty-seven instructors across the disciplines added detailed comments for this question, showing deep involvement in student learning and describing how they specifically teach students to use and hone reading skills as tools for gaining knowledge.

Two instructors begin their courses with an emphasis on reading skills. “I put a long, specific section in the syllabus about how to do the reading and use the textbook…. [and] talk about taking the time to read through explanations of graphs” (#156, Economics). “I start every semester with a lesson in critical, active reading and support it throughout the semester” (#76, Biological Sciences).

English and ESL instructors reported offering on-going, in-class direct instruction on the use of “reading strategies,” as well as facilitating pair, group, and whole class discussions. Some instructors use student generated questions for discussions, while other instructors invite students to ask questions during class. A number of instructors report that they regularly check comprehension through various means (e.g., requiring students to annotate; analyzing quotes). “Reading is a part of the lab experiment in class, given as homework… [and]…students need to read, comprehend and analyze to pass this class” (#77, Biology). “We conduct debates to test out different claims in the reading” (#91, Speech Communication). “Students have to quote one main point and then write at least one paragraph in their own words on what they learned” (#121, Child Development).

Several content instructors reported that the readings are covered through lecture. Instructors also reported creating study guides for their students. “[Notes in the form of power point guide students to form questions] about the material before [they read] the text” (#80, Biology). “… Written study guides…indicate what questions students should be able to answer after doing the reading… [and] I have quizzes…” (#169, Anthropology). “I give some general questions that will direct the students [to consider]…key issues raised in [the] text…I ask that they come to [some conclusions before they begin reading]” (#191, English).

Instructors reported modeling critical thinking. “Often, I point out challenging thoughts or idioms and we clarify them together” (#56, English). “[Together] we analyze a wide variety of broadcast scripts, and discuss the genre and the format used” (#166, Broadcast Electronic Media Arts). “We do assessments by following rubrics for particular assignments and making sure descriptions are adequate and make sense in regards to the topics we have covered” (#14, Math).

Instructors also report providing resources. "Students take a VARK test [Visual – Aural – Read – Kinesthetic questionnaire that seeks to identify learning preferences] and a variety of resources are available to them on the course website with additional access to iTutors” (#81, Biological Sciences). “I don't have time to teach how to read the text, but I provide resources that students can choose to read if they need help” (#84, Psychology). “The reading material is linked and repeatedly reiterated in the tech-enhanced material… Sometimes, I refer BACK to a reading from a previous week…and ask that they reflect again with their new knowledge” (#92, Broadcast Electronic Media Arts).

A number of instructors motivate their students to complete readings in various ways. “I say, Read the required text this week…it may show up on a quiz” (#157, Environmental Horticulture and Floristry). “I explain the purpose of the reading and highlight what that reading is about to capture students' attention” (#161, Disabled Students Programs & Services). Some report giving highly personalized help. “[I] encourage students who are having [reading] difficulties…to come in during office hours… [and] show them how to break down each chapter to learn the main concepts” (#205, Fire Science Technology).

And one instructor came up with a creative solution when met with student resistance to reading. “I gave up getting students to read and created video tutorials to [mirror] all the assigned readings. If students only watch the videos, they can do fine. The better students do both: reading and watching video tutorials” (#140, Earth Sciences).

Depending on the course, instructors either check comprehension (sciences) or encourage students to develop critical reading skills. “I occasionally must interpret, but it's more important for me that students engage and possibly come up short than for me to digest text for them” (#191, English). “I sometimes interpret particularly difficult sections for students to help them relate them to the main ideas of the text, but primarily I encourage and challenge them to interpret the texts themselves, and I give time and space for them to practice this” (#181, English).

Although one instructor admitted that assignments did not require text responsibility, “Some manage to skate through this requirement by skimming, BS'ing, etc.” (#89, Business), all other responses showed that the reading and comprehending of assigned texts is essential to successful completion of the course. Rather than giving up on assigning readings, the Earth Sciences instructor added a non-text redundancy (video tutorials) to deliver the same content.

Q11. What reading guidance do you offer outside of class? Check all that apply.

a. The students complete a questionnaire about their reading history/habits and I offer individualized feedback. (4%)

b. I provide guiding questions to help them focus their reading. (56%)

c. I provide the topic question for the paper they will write. (42%)

d. If students are struggling with the readings, I suggest they read the chapter questions first in order to focus their reading. (22%)

e. I require my students to annotate. (24%)

f. I ask my students to turn in outlines for each reading assignment. (6%)

g. I ask my students to turn in summaries for each reading assignment. (19%)

h. Students fill out worksheets. (Please attach a sample worksheet if possible.) (19%)

i. How a student approaches the texts is up to the individual student. (16%)

To interpret the data for this question, it’s best to keep in mind that 40% of polled instructors are English or ESL instructors. The fourth choice, “If students are struggling with the readings, I suggest they read the chapter questions first in order to focus their reading,” is a typical strategy for approaching text chapters in the sciences.

One political science instructor found this question flawed but gave no explanation. Some answers show confusion about the meaning of my question.

Instructors who reported offering guidance outside the classroom describe a wide variety of practices to support student reading. One instructor screens students. “Students complete a form at the beginning of the semester asking what level of English/ESL they have completed…If it’s ESL 150 or below, I ask students to meet with me and try to tell through these meetings, and also from talking to or calling on students in class, which students have difficulty comprehending either spoken or written English. Usually those with low reading levels lack sufficient basic vocabulary to do well in this class” (#156, Economics).

Instructors report giving individual feedback by reviewing annotation work (a number of instructors teach annotation explicitly, and in depth), journal assignments, and other writing tasks connected to reading assignments. “They write a weekly short paper on something from the reading that intrigues them, tying it back to a real-life example such as an ad they have seen that uses a principle from the text” (#89, Business). “I ask students to hand in annotated assigned readings the first 4 weeks of school. As the semester goes on, checking of annotations and summaries becomes random. [I post] "Credit or no credit" on Insight on the same day it is checked” (#115, Child Development).

Some instructors remarked that they only had time to offer selective feedback. “While I do teach outlining and require this of some texts, and the same of summaries, I do not do this for each text. I'd be overwhelmed” (#199, English). “I don't require outlines and summaries and worksheets for EACH assignment. I vary it a bit and, as the semester progresses, I move towards less structured guidance” (#180, English).

Instructors report using a variety of modes to give students reading support outside the classroom, including offering guiding questions, asking students to participate in on-line forums, offering post-reading activities/assignments, requiring students to use Reading Plus (an on-line reading program for English students), and discussing the nature of reading. “We do talk about reading habits: where they read, how long at a time they read, what sorts of things throw them off the track—I ask that they make note of when they give up or drift away and become conscious of that moment of disengagement—or, conversely, what sorts of readings don't throw them off. It is not that we should expect constant interest but that we must recognize when we are thrown off and devise a strategy for reintegration with the work” (#191, English).

Instructors also encourage students to use college resources (e.g., tutoring services). “We have virtual tutors and the students have an opportunity to submit their FIRST assignment for comments with the option to re-submit revisions” (#81, Biology).

Two content instructors reported teaching students to outline during office hours (Psychology; Fire Sciences). Although no other instructors explicitly stated that they work with students during office hours, providing office help was mentioned often for Q16 (see below).

Q12. How often do you assess whether students understand the texts you’ve assigned?

Instructors report using on-going informal (class discussions; oral quizzes) and a variety of formal assessments, including written quizzes, tests, mid-terms, finals, and written assignments. “I also assess their understanding in weekly writing assignments…where they apply concepts learned in class… [and] from the textbook” (#122, Biological Sciences).

ESL instructors report assessing daily and English instructors report assessing for each text. “Most of the semester is spent in me assessing how well the students are completing and understanding the readings I assign” (#181, English).

Content instructors more commonly reported using a set number of quizzes (typically 4-5 throughout the semester), the midterm, and the final to assess. Several instructors are using technology (e.g., iClickers; on-line forum discussions). Several content instructors report assessing specific outcomes through direct observation. “[Students show they understand readings by] demonstrating correct hands-on assessments of the medical condition presented” (#205, Fire Science Technology). “Assessment is done through application of the content (designing the study guide layout), rather than written tests” (#220, Visual Media Design). “Students are assigned…to lead discussions on popular science articles” (#75, Biology).

Other instructors report relying on subjective measures. “We talk about the reading in class and I am in a good position to judge who gets it or not” (#174, Library Information Technology). “… Through discussion and hands-on work in class. I don't formally assess, but if they student is lost it becomes clear and they work to get caught up” (#89, Business). “From the main points or questions students answer each week, I can see if they understand some of the concepts. Some are still having challenges writing—using their own words. This is just after the midterm” (#121, Child Development).

Some instructors identified problems with assessment. “I assess the class "as a whole" probably weekly, but of course quieter or weaker students may get overlooked” (#48, ESL). “Not enough.... I need more tools for assessing basic comprehension…When I conference with them, I ask students to show me examples of how they are marking the text and use that as a springboard to talk about the reading and informally assess their comprehension. Also, their use of textual evidence to support their essay argument provides another window into their comprehension of the issues” (#68, English). “Students do online homework problems and exam problems that test their understanding of the techniques and formulas in the book and lecture. This may or not correlate to their understanding of the text” (#124, Physics). “I do not exactly understand this question. I check in every class whether students understand the concepts and terminology under discussion. But often they don't do the reading until after I present the material—except last year, when I used specific assignments on the reading before class discussion—” (#156, Economics).

Q13. How do you use the assessment you do of your students’ reading comprehension?

- Assessments on reading comprehension are part of a student’s grade: 64 (42%)
- I use reading assessments to monitor my students’ reading comprehension and offer additional instruction as necessary: 86 (56%)

Sixty-four instructors (42%) count assessments of reading (quizzes, mid-terms, finals) as part of a student’s grade, but most reported that quiz, test, and course grades only indirectly reflected reading comprehension. Those that graded reading comprehension commented that the weight was low. “Not a large part of the grade” (#75, Biological Sciences). “If done at all, 10% of the course grade” (#80, Biology). “There are no grades, but I do use this information in determining whether or not a student has met the course’s Student Learning Outcomes (SLOs)” (#39, ESL).

Similar to comments instructors made about direct reading instruction, instructors again revealed that while reading is not being assessed directly, reading comprehension is strongly linked to both assignment and course success. “I assess student projects to ascertain how much of the reading material they were able to use in their work” (#163 Broadcast Electronic Media Arts). “Tests are based on the reading, so how well they comprehended the reading will be reflected in the test scores” (#209 Health Care Technology). “Study Guide layouts are a part of the grade” (#220 Visual Media Design). “[I use assessment] to see how well they are able to incorporate what they learn via class/textbook and [how well they can] critically evaluate research and proposed solutions to environmental problems” (#122 Biological Sciences).

Eighty six instructors (56%) reported using reading assessment to monitor student comprehension and offer additional instruction. “Sometimes I must repeat everything or go over certain subjects several times” (#7, Culinary Arts/ Hospitality Management). “I go back over things, often” (#102, ESL). Many used reading assessments to adjust teaching. “I use them primarily to inform my own teaching and evaluation of the students' writing” (#133, ESL). “Assessment of reading also informs the type of essay assignments I create” (#59, English). “I use evidence of struggle to refine how I teach the readings in the future” (#185, English). And many use the assessments to encourage individual students. “I use them mainly to inform the students of their challenges to encourage them to spend time on task. Many students severely underestimate the value of this” (#53, English). “I speak individually to those falling through cracks” (#62, English). “My [students] vary in demographics and previous educational experiences. I must assess each student independently, spending most of my time with the remedial students, who are taking their first course, with technical and multiple reading assignments each week. It is a subject matter that they are drawn to and gives them the impetus to take more courses in math and electronics to understand the world of audio engineering” (#92 Broadcast Electronic Media Arts). “Students who appear to be struggling, I meet with and discuss course requirements and assignments and try to determine issues affecting student performance” (#210 Fire Science Technology). “[The] class is designed for students who have serious intellectual disabilities, to help them find work. Reading and writing levels are very important for what sort of jobs they can reasonably manage. I assess literacy levels to help them apply for the appropriate jobs and find the assistance and services they need” (#214, Disabled Students Programs and Services). “I comment on their forum posts. If I feel they misunderstood the article, I will discuss it with them” (#134, Broadcast Electronic Media Arts).

One instructor states that assessments are guides for students. “Formative and Summative assessment. Students use results as indicators of their own understanding” (#140, Earth Sciences).

Of the 153 instructors who answered this question, as many as 37 instructors chose “other” as a second choice. Some reported they do not assess reading skills or do not count reading skill assessment in the grading. Quite a few instructors commented that they were not concerned whether students mastered skills and content through reading. “I assume they comprehend the reading. I'm more concerned about students applying the reading to key situations” (#89, Business). “Reading is only a component and I assess overall learning on the course topics” (#93, Political Science). “I don't assess reading comprehension directly” (#124, Physics). “This sounds like an English class question. I assess understanding of the CONTENT. Since I present most of the material in class as well as having it in the reading, I am not specifically assessing reading comprehension. For example, I test understanding of the term "capital" as used in economics. It is defined in the text and discussed with examples repeatedly in class” (#156, Economics).

Q14. How do you determine whether your students have completed and comprehended the readings you’ve assigned? (Check all that apply.)

a. I give a quiz for every reading. 34 (21%)

b. Questions about the readings are on tests and/or the final exam. 82 (52%)

c. Students write papers/complete homework about what they’ve read. 94 (59%)

d. If students successfully complete the homework and pass quizzes and tests, I assume they have completed the readings. 56 (35%)

e. Students work in groups and turn in worksheets I give them. 56 (35%)

f. I give students points for participating in class about the reading assignments. 48 (30%)

g. Other: ______

Keep in mind that since 40% of those surveyed are English and ESL instructors, it’s not surprising that the data show “Students write papers/complete homework about what they’ve read” (59%) to be the most common assessment measure. “Questions about the readings are on tests and/or the final exam” follows close behind at 52%. This choice, along with “If students successfully complete the homework and pass quizzes and tests, I assume they have completed the readings,” describes assessments used in disciplines other than English and ESL.

For this question, comments start to echo what instructors have previously noted. Instructors across the disciplines report checking comprehension frequently and consistently. One instructor “…checks in with all students by the fifth week” (#115, Child Development). Checks are informal and formal, graded and ungraded. Regarding informal, in-class checks, one instructor gave allowance for differing abilities among students, “Of course, those who don't participate [in class discussions] may have indeed read the work but perhaps have… [processed] it more slowly or uncertainly” (#191, English).

Written work, regular quizzes, and answers to oral questions from the instructor, from the book, or from instructor-generated question sets during pair, group, and class discussions all help instructors to monitor their students’ reading comprehension. “My weekly quizzes all have a reading comprehension question in the quiz – I give them a short written biography of an oceanographer, and they answer a question about it” (#140, Earth Sciences).

Instructors describe how readings are linked one to the other and the methods they use to encourage students to retain and apply what they’ve read. “Students write reading responses in their sketchbooks” (#155, Art). “I do not quiz students on every reading, but we do have spot checks... I rely on these earlier in the semester to focus students on completing the reading and showing mastery of the material. Later in the semester, their mastery of the reading is manifested in their papers” (#56, English). “Reading passages have accompanying comprehension questions…I give quizzes that contain similar reading passages with questions” (#36, ESL). “Since I use the text on 3 consecutive days, I always ask students to summarize the story on the following day” (#66, ESL).

Students are required to demonstrate reading comprehension in a variety of ways, depending on the discipline. “Each week in lab, students demonstrate what they have learned from their reading assignment” (#138, Dental Assisting). “[Students give] presentations on readings in each class” (#2, Culinary Arts). “They use the readings for a variety of oral and written activities in class” (#58, English). “Students are required to utilize key concepts and examples from the reading in their debates. I think of this as an oral demonstration of [what they’ve gained from the reading]” (#91, Speech Communication).

Instructors are also creative about motivating students to read. “Students get points for extra reading. This is part of the reading contest” (#39, ESL). “I do a random drawing every class and have students share their annotations (and show them, using the document camera) in order to "spot check" active reading and basic homework completion for reading assignments” (#188, English). “If I suspect quite a lot of the class is not doing the reading, I do give occasional reading quizzes for a class or two. Not for points per se, but [to recognize those who are keeping up with the reading]” (#191, English).

Overall, responses show there is a wide range of how an instructor expects students to show reading comprehension. “They design Study Guides for each reading” (#220 Visual Media Design). “There is also a departmental final that has a reading component” (#36, ESL). “[I have] different requirements for different assignments. No assessment for text except for exams—which are take home and I ask the students to use information in the text in answers” (#75, Biological Sciences). “Students write a 15-minute short essay about what they have read and what they think about something the author said for most readings. They need to read the questions carefully in order to do this” (#103, ESL). “Students write (when possible) and discuss [what they’ve read]…as individuals and in groups, they act out situations and practice skills in class” (#214, Disabled Students and Services).

No matter what the discipline, the main theme is that students must be able to read to pass the courses. “Most of the readings contribute to the quality of writing that the students do in their assignments. If the writing is flawed, it is usually because the reading was not done or not understood” (#166, Broadcast Electronic Media Arts). “All work is based only on the readings and nothing else” (#53, English). “…Students use the readings in their essays, so comprehension of the readings becomes paramount—and [whether they comprehend or not] quite clear!—” (#200, English).

Q15. How important is reading your assigned texts for a student to successfully complete your course (a grade of C- or higher)?

Over half of the instructors report that reading skills are essential for passing the course. When technologies, for example on-line platforms such as Canvas, are part of the course, reading skills become crucially important. English as a second language instructors in both credit and non-credit are the most insistent about the importance of reading skills and report that many assignments require text-responsibility. “The writing assignments specifically require the students to read and synthesize information from specific texts” (#33, ESL). “I stress the importance of reading almost daily” (#39, ESL). “In general a student with below level reading skills does not move on to the next level” (#36, Non-credit ESL). “ESL students tend to think the most important skill is speaking, but I tell them…[they] need to read instructions carefully, letters from banks, messages from doctors, and signs that are everywhere” (#66, ESL).

English instructors are similarly firm about the importance of reading skills—“The students must incorporate relevant textual evidence in their essays to earn passing grades” (#68, English)—but content instructors also make strong statements about the need for reading skills. “It's [hypothetically] possible to pass the class by just pushing symbols around without much understanding, but it's unlikely” (#8, Math). “It is very hard for students to do well in my course without the assigned text. We conduct many close readings, and each speaking assignment draws directly from the text” (#91, Speech Communication). “This might seem contradictory, given some of my other responses, but the core of this class are written instructions for class exercises” (#223, Visual Media Design). “Failing to memorize text and lecture details leaves one with a test score of ~30%” (#80, Biological Sciences). “[Understanding our textbooks is] very important. Lecture and discussion…only cover main points…textbooks provide case studies and critical thinking scenarios that enhance the learning process” (#209, Health Care Technology).

In some content courses, the grades of students who don’t read can be adversely affected. “Some exam questions require that students read the text…most students can earn a C- if they study their lecture notes and don't read the text, but students who do well (A or B) clearly have studied the text” (#50, Psychology).

But content instructors also state that whether or not reading affects a student’s success in a course depends on the student. “Students of BCST 144 (a digital video editing course) or similar courses would probably do well without reading assigned text if they: 1. had substantial prior experience 2. watched extensive video material available (i.e., on YouTube)” (#163 Broadcast Electronic Media Arts). “Students with excellent English reading/listening, critical thinking skills, and study skills who have a good basic knowledge of the U.S. political system have a huge head start and can probably pass the class with less focused reading…usually less than 20% of my students” (#156, Economics). “Since it is a studio class, reading assignments help students to understand assignments and concepts in the course but are not critical to a student's success in the course” (#155, Art). “Lectures cover much of the same material, so it is possible that if a student is diligent in lecture notes but does not do the readings they could still pass with a B or an A” (#157, Environmental Horticulture and Floristry).

Again, instructors report supporting students through other modalities. “I make PowerPoints of each chapter or topic. These are as or more important [than the readings]” (#7, Culinary Arts). “Lecture and demonstration are the primary sources of course-related information” (#170, Photography).

Q16. If a student in your class doesn’t complete the reading assignments, or has trouble with reading comprehension, what can s/he do to pass your course?

Several instructors reiterated that reading was a fundamental skill for passing the course. A chemistry instructor explains, “Reading supports lectures. It’s doubtful that a student who doesn’t read can pass the class…practice problems comprise ten out of one hundred problems per chapter. [I advise slow readers… or those who find readings] dull or difficult…to attempt the problems—and then go back to the reading. My tests require them to apply information from both my lectures and the texts. And my tests are difficult. For example, if the textbook asks them to solve a problem forward, in the test they have to solve it in the reverse” (Hurt, 2015).

Overall, responses show that instructors across the disciplines support students with reading through one-on-one tutoring during office hours, and encourage students to access various CCSF resources including the English lab, CLAD (the ESL lab), the Study Lab for Culinary Arts, and the Learning Assistance Center. Some students have access to tutors such as Project Shine Volunteers, an iBest ESL instructor, or other tutoring services. “We have bilingual tutors, the weblinks are usually available in multiple languages, and our course website offers the opportunity to form study/buddy groups and interact within a private student forum” (#81, Biological Sciences). “I provide a tutor who meets with students individually during, or in small groups, before office hours to work on patient assessment approaches” (#205, Fire Science Technology).

Instructors encourage students to use additional resources such as YouTube videos, books on tape, on-line student forums, captioned video tutorials, and Google translations. “[Students can] watch captioned video tutorials (with scripts)” (#140, Earth Sciences).

Some instructors offer extra credit for these and other activities. Many instructors commented that students regularly get help by working in groups or from peers. “A lot of the in-class work can be group work, so just participating produces points” (#80, Biological Sciences).

Instructors believe that regular attendance and asking questions will help students succeed. One instructor mentioned making Disabled Students Programs and Services referrals, if appropriate. Another takes student learning styles into account. “If the student advises me there is a learning issue for her/him, then I prepare accommodation for more test time, or…they can ask questions while taking the test; that puts test question wording in a way they better understand” (#210, Fire Science Technology).

Comments reveal that instructors have a good idea about the reading level required for their courses. “Students who are not yet at ESL 150 level usually aren’t ready for this course” (#156, Economics). “Students in the Medical Assistant program must have a basic reading level to enroll” (#209, Health Care Technology). “The content of the course is the language itself. Ability to read at level is necessary to pass to the next level” (#36, ESL). “If a student doesn't understand the bulk of the reading assignments, they won't pass. This is because one major course outcomes is demonstrating reading proficiency” (#195, ESL).

Thoughts on students using translation vary. One instructor reports discouraging students from reading in translation. “I encourage students to not read in their native languages at this level (1A is a University-level transfer class). It is assumed that a student who has reached this level has sufficient English language skills to comprehend college-level texts, and if he or she does not at this point he or she needs more time and language instruction (outside of my Composition class) to do so” (#181, English).

However, more instructors see translation as a sound strategy. “I am not ruling out translations, in this day and age delivered by Google. Translations may miss the nuance and idiom, but I suspect they get the point across” (#174, Library Information Technology). “I teach in Spanish and readings are in Spanish” (#118, Child Development). “[I suggest students] look at websites explaining the same material in different ways, often visually” (#156, Economics). Instructors in this group make certain allowances. “In fact they usually understand enough superficially to be able to cite the more obvious/salient points in their writing, and that's enough to pass if they also write well. However I notice that they miss a lot of the more subtle points, and the stronger essays are usually the ones that have grasped these” (#33, ESL). “Each student's reading comprehension barriers are somewhat different. If the issue is English language acquisition, I can counsel the student to read several targeted sections carefully, rather than the whole book. It is sufficient to make three points in a paper for a writing course, about a book, so I usually suggest that they concentrate on several points—and only read until they have assembled sufficient evidence to elaborate on those several points” (#191, English).

And two instructors state that access to translations is desirable. “I wish I had a translation of the text in other languages - I don't know how to provide that to students” (#75, Biological Sciences). “I'm not sure if the text is available in other languages. This gives me the idea to find out!” (#84, Psychology).

Q17. I have been teaching for ______ years. (This question is linked to the following question.)

Q18. In the last ten years, the general amount of ancillary reading I require students to complete...

Section 3 - Computerized Reading Programs

In the United States, there has long been a push to offer reading skill building and remedial reading instruction through some type of “program.” This has been true in K-12 and in higher education, and the advent of the computer has only intensified this search. Here at CCSF, the English department has experimented with various computer and on-line reading programs. This Section reviews two computer-based reading tutorial programs that the CCSF English department has used from 2000-2016. The ESL department has considered purchasing an on-line reading program; one candidate is Reading Plus.

Background: Degrees of Reading Power (DRP)

Since the mid-1980s, Degrees of Reading Power, a systematized reading instruction program, has been widely used throughout the United States in K-12. Originally, Degrees of Reading Power (DRP) was a test created in response to a request from the New York State Education Department (NYSED) for accountability in education. The test was first used in 1968 in New York City and soon became the state test in “Connecticut, Virginia and New York and 1,000 school districts elsewhere” (Berger, 1992). In DRP tests, to assess a reader’s skills, transitional sentences between paragraphs in reading passages have been removed and the test-taker must select the missing sentence from a choice of five. The majority of the readings are non-fiction texts, typical of the types of readings found on reading tests: a broad sampling of history, science (biology, astronomy) and social science, with some literary passages (the evolution of lighthouses, the Crystal Palace, cosmic dust, the life cycle of fish, inventors, technological innovations; a passage by Jack London). The student is assessed once at the beginning of the semester and once at semester’s end, and a comparison of the scores rates a student’s progress.

In a July 21, 1992 New York Times article, “Fired Up With a Love of the Feared Reading Tests; Koslin's Specialized Exam Wields a Powerful Influence on Millions of Students,” the author, Joseph Berger, points out that Bertram L. Koslin, the creator of the DRP, had no credentials in reading theory, reading instruction, or reading education. Berger does not hide his contempt for Koslin’s exam or later-created reading program, which Berger indirectly describes as money-makers for a savvy entrepreneur. Berger reports that test items are written by DRP staff writers who choose topics from the Encyclopedia Britannica, cull information from popular magazines, and follow a manual for form: a 300-word passage, leaving seven word blanks, and inventing five options for each blank.

In 2013, the DRP (now owned by Questar), originally sold with pencil-and-paper diagnostic tests, was re-designed as an on-line product. On-line information about the DRP is limited to sales pitches. Apart from a brief “history” of the inception of the DRP, claims are not drawn from any references or factual research studies. There is no opportunity to preview any of the materials, and I wonder how recently the tests and readings have been updated.

Sometime after 2006, an English faculty member recommended adoption of the DRP reading test to gauge a student’s reading ability. He donated free copies of the tests and CCSF’s English department has used the Degrees of Reading Power (DRP) test for approximately 10 years.

In the recent past, City College English students have been assigned to go as individuals to the English Lab to take the DRP diagnostic test. Based on the design of the DRP, students in the beginning of the English sequence—English 91, 92/93 and 93—take the T-4, and students in levels English 96, 96/1A, 1A, 1B, or 1C take the T-2. Students’ reading skills are assessed through these two untimed reading tests, each made up of a series of 8 readings, each with 4-5 test items. One of the selling points of the DRP is that there is no time limit for the assessment and thus no bias against a student’s reading rate. Students have been known to use up to three hours to complete the initial assessment, with the average test-taking time being about 45 minutes (personal interview).

Both the T-4 and T-2 contain 8 readings of a standard length, typically 3-4 paragraphs. The test items are missing sentences from either the beginning or end of a paragraph. There are three items for each reading, for a total raw score of 24.
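As a quick check of the scoring arithmetic described in this paragraph (assuming the three-items-per-reading count given here):

\[
8 \text{ readings} \times 3 \text{ items per reading} = 24 \text{ raw-score points}
\]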

At semester’s end, students take a second diagnostic test, the U-4 or U-2 respectively, which, before I analyzed them, seemed to be more or less mirror instruments. The exit test is meant to show whether a student has made progress through use of the reading materials owned by the English lab (not DRP materials), typically for one hour a week over the course of a 15-17 week semester.

Analysis of the DRP

On the surface, the DRP tests seem “valid” in that their makers present the assessment and training tools as being calibrated from “easy” to “difficult.” I took four assessment tests (entry and exit for both levels) under a certain amount of self-imposed duress and, while taking the tests, noted:

a. my affective reactions

b. whether I felt the readings were becoming increasingly difficult or not from entry to exit, and from level to level

c. what reading task(s) the correct answers were asking the students to do

d. the quality of the distractors

e. whether, through the act of taking the test, I was learning “how the test worked” and therefore “showing progress”

With test taking, the affective filter is always a consideration, and it is worth keeping in mind the wide range of test-taking abilities in our diverse student population. As a test taker, I engaged in the following behaviors:

- nervously skipping around, re-reading and getting confused about where I was
- slowing down to learn what I was reading
- zoning out
- ignoring the readings and choosing answers based on the distractors not making sense
- skimming for answers
- analyzing what the test was asking for to double check my answers

This mix of skilled reading and test-survival behaviors resulted in some wrong answers, an inordinate amount of time spent finishing the assessment, and some loss of confidence in my test-taking abilities, all outcomes that do not describe the highly competent reader I am.

Analysis of the DRP Through Its Tests

Some of the test answers (missing sentences) seem capricious, but when this is so, the reader can use process of elimination since the incorrect answers are rarely distracting and are instead plainly wrong. I ascribed the seeming capriciousness of certain answers to be by-products of formulaic, and sometimes quite poor, writing. I coded all the answers and the following are the answer types:

a. a concluding remark, bringing the reader back to the main focus

b. a summary of the point just made, or of the points presented

c. a vocabulary item linking to the main point of the reading

d. a contrast to drive a point home

e. pronoun reference

f. a solution which refers to the previous paragraph

g. a topic sentence coming at the end of a paragraph

h. a topic sentence for the subsequent paragraph

i. a synthesis of three points

j. understanding an accumulation of supporting details to drive home the main point

k. a detail confirmed by a subsequent sentence

l. a transition with a particular grammatical structure

m. linking vocabulary (word form; synonym; reference)

n. elaboration of a point

o. a restatement

p. process of elimination (no other choice makes contextual sense)

The following chart shows a breakdown of the answer types I found across the four tests. For the descriptors I created to categorize the answers of each test, see Appendix 3.

Analysis, ordered from low- to high-level thinking skills (Lia Smith, 2016). Tests: T-4 and U-4 (English 91, entry and exit); T-2 and U-2 (English 96, entry and exit). Tallies by answer type:

restatement (one U-4 item w/synthesis): 1, 7/(8), 3
summary of point(s) just made: 1, 3, 8
summary of points presented: 1, 2, 1, 6
topic sentence, end of paragraph: 1, 3
topic sentence for next paragraph: 1
concluding sentence for paragraph: 1
concluding remark tied to main focus: 2, 1
elaboration of a point: 1
additive (summary, adding up points): 1
accumulation of supporting details to drive home main point: 1, 1
refers to main thrust of reading: 3, 2
transition w/particular grammatical structure: 4
contrast to drive point home: 1, 1, 1
contrast: 1
pronoun reference / reference: 2, 1, 3
linking vocabulary: 3, 1, 1
vocab linking to main point: 2
presents problem or problem/solution: 3
reason to find solution to problem: 1
synthesis of 2-3 points: 1, 1, 3
concession: 1
analogy: 1
correcting previous belief: 1
*[problem answer]: 1, 6, 3
*[no data]: 1

My analysis suggests that the items were created in a random fashion without learning objectives or test design considerations in mind. To begin with, entry and exit skills do not match. For example, the T-4 entry test requires the test taker to understand four transitions based on four particular grammar patterns, but the U-4 exit test does not test this skill. The U-2 exit test poses 14 questions that ask for 5 skills that were not tested at entry. Further, exit skills on both the U-4 and U-2 are less difficult and very general compared to the entry tests (T-4 and T-2). For example, in the U-4 exit test, the test taker chooses 8 answers (33%) that are restatements. In a second example, 14 items in the U-2 exit test (over half) for the higher English 1A reader ask the test-taker to find the sentence that summarizes either a point or the points of the reading. The T-4 and U-4 ask for 23 higher order thinking skills, whereas the T-2 and U-2, supposedly assessing higher level readers, only ask for 19. The contrast between the T-4 entry and the U-4 exit is sharper than the contrast between the T-2 and U-2: test takers are asked for a higher number of higher-order thinking skills at entry, skills they are not later tested on in either the U-4 exit test or the next-level T-2 entry test.
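The restatement percentage cited above follows directly from the 24-point raw score described earlier:

\[
\frac{8 \text{ restatement items}}{24 \text{ items}} \approx 33\%
\]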

Since the English lab does not have DRP reading practice materials, I was not able to review them. However, based on Berger’s article and my own experiences, I felt confident in assuming that the materials are equally arbitrary. Indeed, if a test taker does not improve her reading skills at all, the ease of the exit test might reassure her that the time she’s spent reading DRP materials—or any other reading materials, as in the case of the English lab—has led to the improvement of her reading skills. I wonder if the creators of the DRP naively used Jeanne Chall’s axiom “With time, good readers become better readers…” as their guiding methodology, assuming that reading in and of itself, no matter what the text or the text responsibility, can advance a reader’s skills, and failed to take into account the second part of the axiom, that “[… given the same amount of time] poor readers become poorer readers” (Chall, 1983, 1996 & Stanovich, 1986). “If texts become overwhelming, the struggle associated with reading makes it become unpleasant, and then a vicious cycle develops. In it, the student avoids reading and begins to fall further and further behind” (Lems, 2010).

Reading Plus at City College

In approximately 2006, the English department purchased Reading Plus, a web-based reading program designed to increase reading comprehension and what its makers call “silent reading fluency.” The first version was only available at the Ocean campus, but in 2008 it was made available to the entire CCSF network (all centers). In 2014, the English department purchased and began exclusively using the on-line version of Reading Plus (accessible from any computer with Internet access). At this time, Reading Plus has mostly replaced the Department’s free use of Degrees of Reading Power (DRP).

Project and Inquiry Method

In fall 2014, I experimented with Reading Plus with one section of ESL 110. ESL 110 is our introductory academic course for ESL students, the first in a six-level sequence that leads students to English 93 and English 96, the two sequential courses below English 1A.

It was immediately clear that the average ESL 110 student was just below the lowest level of Reading Plus. There were students who could read at much faster speeds and handle heavier vocabulary loads and more complex content, but they were at the top of the class.

The students were not required to use Reading Plus, but were encouraged, and most students applied themselves. One student in particular, a very poor reader, found Reading Plus to be very helpful. She worked a graveyard shift and would return home in the early morning, when her children were asleep, and log onto Reading Plus. She misunderstood the idea that the readings should be easy for her and that she should progress at an unhurried rate. She reported “pushing herself,” and met with frustration when she tried to move to the next reading level. Despite this, the time on task seemed to make a difference. Anecdotal evidence suggested that she was gaining knowledge of American genre and that this was helping her with reading comprehension.

In the spring of 2015, I enrolled as two “students” of Reading Plus and began the program both as a student at the lowest diagnostic level and as a student who had already bridged into the upper half of the reading program.

The first analysis I present here is my critique of the features of Reading Plus. Although there are positives, the negatives are serious. My discussion follows the chart below.

Critique of Reading Plus features by Lia Smith, 2016

Feature: “American” values
Positives: Exposes students to American cultural values
Notes: e.g., independence; women in the workforce; be kind to animals; protect the environment; children have feelings and good ideas

Feature: Pictures
Positives: Help students with definitions of unknown vocabulary; up-to-date; visually arresting; beautiful; diversity sensitive

Feature: Simple sentences
Positives: Easy to comprehend
Negatives: Staccato; unnatural; coherence problems

Feature: Simple concepts
Negatives: Incoherent narrative; absurd ideas
Notes: e.g., frog and turtle throwing popcorn; would a mother pop popcorn for breakfast?

Feature: Sentence fragments
Positives: Natural, real-world text
Negatives: Misleads students about grammar and what constitutes “academic style”

Feature: Irrelevant questions
Negatives: Mislead the student; frustrate the student

Feature: Subjective answers
Negatives: Mislead/frustrate the student

Feature: Guided help (e.g., explanations for missed answers)
Positives: Sometimes useful
Negatives: Sometimes misleading

Feature: Text-based questions
Positives: Make the reader text responsible
Negatives: The student is merely hunting for the answer

Feature: Text-associated questions
Positives: Foster the forming of text-to-world connections
Negatives: Frustrating; not a true test of reading of the target text

Feature: Word definitions
Positives: Help train students to attend to parts of speech
Negatives: Distractors (incorrect answer choices) are often ridiculous

Feature: Scan mode (feeds students text so that they must read without stopping)
Positives: Forces students to remember everything rather than strategizing what to focus on; prevents ESL readers from looking up every word; keeps the reader text focused; prevents the reader from hunting for answers
Negatives: Forces students to remember everything rather than strategizing what to focus on; prevents students from learning new, unknown vocabulary; frustrates students

Feature: No audio
Negatives: The missing “phonological loop” is an invaluable key to second language acquisition.
Notes: The absence of audio is sold as a positive for forcing students to read silently, but ESL students need aural input.

To use the web-based Reading Plus program, each student is assessed and, based on this assessment, the computer program “allows” each student into the program at an Assigned Reading Level (ARL) matched to the student’s assessed reading proficiency. The program has three components:

1. Perceptual Accuracy/Visual Efficiency (PAVE), a trademarked component.

a. Scan - Students are asked to identify when a “target element” appears. The target element asks students to attend to the “details” of “script,” that is, to determine whether an “O” is open or closed, or to identify a number.

b. Flash - Students must recall a group of numbers or letters that are flashed on the screen, either by choosing from three possible matches or by typing in what they have seen.

c. PAVE claims a capacity to build readers to a skill level of accurate scanning of up to 120 lines per minute.

2. Guided Reading - Depending on the student’s assessed words-per-minute rate and rate of “improvement,” the student reads texts that are “delivered” first word by word and then, if she shows improvement, line by line, section by section, and finally in whole. When she advances to this last mode, the student can somewhat read at the rate she chooses, with certain restrictions aimed at preventing her from reducing her assessed reading speed or gaining speed too quickly (considered cheating). At the end of the reading, the student answers ten multiple-choice comprehension questions. She cannot refer back to the reading, but if she misses an item, the program gives her a chance to re-read the germane passage. Whether or not she chooses correctly this second time, she does not receive credit for second answer choices. Students can also proactively select a limited number of “re-reads.” During a re-read, the passage is again “fed” to the reader in the same mode she has been assessed to receive at that point in her progress (i.e., word by word; line by line; section by section; or in whole). The number of optional re-reads can be adjusted by the instructor.

3. Reading Around Words - The student completes fill-in-the-blank vocabulary exercises. The words are target words for the readings within that level. A pictorial feature helps students if they choose an incorrect answer.

Comments / Criticism

Tracking (Perceptual Accuracy/Visual Efficiency / PAVE)

The claim of the makers of Reading Plus is that the scanning exercises train students to track from left to right and attend to the “details” of a script, for example an open or closed circle, or page 42 of 75 number pairs such as 1 and 7, or 3 and 8, thus increasing their “efficacy” as readers. As Mark Taylor, the CEO of Reading Plus, states in a 2013 EdTech Times video interview, “We look at efficiency—rates, stamina, visual coordination with which a student can actually take in text and make sense of text—eye stops, fixations, re-reads...we give them a guided window...[which] structures what an efficient reader does...we are training them to do what efficient readers do.” On the surface, this exercise might seem to be based on the research that shows that accurate letter recognition is the highest predictor of good readers. But numbers are not letters, and even skilled readers find it difficult to differentiate between 1 and 7, or 3 and 8 at high delivery speeds. The trademark symbol on “PAVE” notwithstanding, training students to attend to details of an open or closed “O,” numbers, or arbitrary letter strings is misguided. If a student confuses “t” with “e” (a common error seen in beginning ESL students whose native language is Chinese), the student needs practice differentiating “t” from “e” in her own production of words both when reading aloud and when writing. To refine this differentiation, she needs an increase of input so that she can take advantage of the “excitability” of predictable patterns in her brain. Reading is not recognizing random letters but recognizing how letters behave in text. Stanislas Dehaene, author of Reading in the Brain, uses brain research to show that reading is done through parallel processing by a “commitee” in the brain as the eye rapidly fixates a narrow field. This “committee” furthers its skills through extensive reading and the reader learns that certain letter combinations in the target language are not possible (e.g., in English, boat not baot; high not hihg), and that letter combinations can be predicted based on frequency (e.g. in English, as a bigraph, the letter i generally comes before e). Skilled readers are not skilled because they are reading at high speeds, but because they are drawing from information stored in their brains, information they have put there themselves. Indeed, one Florida seventh grader, in her highly detailed critical review of Reading Plus, pointed out that she was being exposed to new vocabulary that she would have liked to learn but given the “forced feeding” of the text, she could not add them to her lexicon (EN, 2015). This is not surprising since Reading Plus does not allow a reader time to notice how new words are spelled, pronounce them in her head (much less aloud), consider the words around the new item, or attend to how a word works in the context. Further, there is no impelling reason to do so since once the text is read and the computerized test taken and passed, the text is discarded. The Reading Plus program claims to build readers to a skill level of accurate scanning up to 120 lines per minute. It is important that readers have enough reading speed to process meaning in context, but speed in and of itself is meaningless. 
“An overemphasis on rate can...interfere with construction of meaning...it seems important that readers become flexible readers rather than simply fast readers.” (Rasinski, et al., 2012). Different texts require different reading approaches, and skilled readers vary their speed depending on the reading task. Most readers will attest that haste in reading leads to comprehension errors and poor or compromised recall of what has been read. To help students make speed gains, the program pushes students to increase by one level per week, a mechanical increase that seems arbitrary and self-serving for the makers of Reading Plus.

For ESL learners, practice in scanning from left to right might, at best, be minimally helpful for readers of Arabic, who read from right to left. Here, it’s important to note that the scanning exercises seem largely based on a claim made by for-profit companies that sell reading remediation programs: that poor readers suffer from vision problems. This idea has been soundly refuted by the scientific community:

Most experts believe that dyslexia is a language-based disorder. Vision problems can interfere with the process of learning; however, vision problems are not the cause of primary dyslexia or learning disabilities. Scientific evidence does not support the efficacy of eye exercises, behavioral vision therapy, or special tinted filters or lenses for improving the long-term educational performance in these complex pediatric neurocognitive conditions. Diagnostic and treatment approaches that lack scientific evidence of efficacy, including eye exercises, behavioral vision therapy, or special tinted filters or lenses, are not endorsed and should not be recommended (Pediatrics, 2009).

Further, training students to track and recall arbitrary symbols is specious. Word recognition is not based on the recall of arbitrary letter strings but rather on specific letter strings that carry inter-letter associations, as well as phonemic and morphemic information (Adams, 1990; Chall, 1983; Dehaene, 2009; Wolf, 2007). This watered-down interpretation of what struggling readers need, sold by Reading Plus and other for-profit programs on the market, is a waste of time and could possibly be causing eye strain, since it requires students, who are not being monitored for proper posture, distance from the screen, screen brightness, etc., to keep their eyes fixated on a computer screen for scores of minutes at a time. Revealingly, the beginning level ESL students in the pilot semester complained about the Scan “wasting their time,” and many asked to be excused (and were) from this component of the program.

ESL 110 Students’ Assessment of Guided Reading

Students found the “guided” part of the reading irritating. They wanted to be left alone with the text. Granted, one benefit of this mechanical feeding of text was that ESL students could not spend their time looking up unknown (and known) words in their dictionaries. Preventing ESL students from over-using dictionaries and translation has benefit. However, a skilled reader does not read like a machine, taking in one word at a time in a linear fashion. Rather, skilled readers anticipate the next letter, the next word, the next idea, and then selectively back-track at any of these points, sometimes in concert, at fraction-of-a-second speeds, to check meaning or to correct what they feel may be an error. Our “committee” is always working on a number of levels simultaneously, processing letters (orthographic processing), processing sounds (phonological processing), processing meaning, and using context to confirm meaning. Good readers also take full advantage of the text in front of them, re-reading selectively, skipping what they deem non-essential, and returning to previous pages or skipping ahead to aid their comprehension of a text.

While engaged in Guided Reading, in order to pass the upcoming reading test, I found myself focusing on remembering what I was reading and processing the information for future recall rather than strategizing what to focus on. This seemed a positive benefit, mirroring the kind of attentive reading that readers must do to succeed at learning content in academic settings, when the content is not particularly interesting to them. However, I also noted that I used completely different strategies with the two modes, Guided Reading and Independent Reading (this latter mode being a reward I earned from the program for showing “gains”), and over the long term, I better retained what I read while reading independently. That is, with Guided Reading, my strategy was to recall as much as I could so I could successfully pass the “test.” Passing the test had no value in the world except to pass to the next level in Reading Plus (for students, the goal is also simply to be finished). In contrast, when I read independently, I switched to my “read to learn” mode and learned quite a lot. It’s important to note that I am a highly skilled reader. I am dubious that less trained readers can necessarily, and adeptly, shift strategies. Reading research has shown that readers must be formally trained both in the beginning “break-the-code” stages and, later, trained to read texts according to genre, purpose, and discipline. In machine-driven reading programs, students are being trained to pass assessments created by the makers of the programs. This is not reflective of how teachers and instructors use text to present content, or of the multitudes of assessments they use to test a student’s mastery of material. Vocabulary is important but, in and of itself, is not a guarantor of mastering content. Research would have to be conducted to find out what benefit, if any, this controlled feeding of text has for ESL learners. I learned that a student can access re-reads without penalty, simply by pausing too long or by interrupting the program. This interruption feature is designed to give students flexibility about when they complete their assignments, but it can be used to access “free” re-reads (not taken into account by the computer program). Using previous knowledge of the subject matter (e.g., Nelson Mandela), and/or knowledge of genre, I was also able to pass comprehension tests at 100% without reading the texts.

Critique of Reading Around Words

The vocabulary selections seem only marginally related to the reading selections for Guided Reading in each level. For ESL students, this limited contextualized vocabulary study could be beneficial, especially since the program offers vivid photographs that show the words’ meanings when students give incorrect answers. However, distractors (answer choices that are incorrect) are often nonsensical and even ridiculous. The exercises do train students to attend to parts of speech. For some of the items, students choose from a word family (verb and verb variants / noun and noun variants / adjective). For ESL students this is a benefit.

Critique of Content and Form

The content of the lower level readings was often nonsensical and even absurd (for example, one reading is about what goes wrong as Mom makes the breakfast meal of popcorn). Although simple sentences were employed, they did not read smoothly and lent a certain, sometimes marked, incoherence to the meaning of the text. Although the sentences in the beginning levels were simple (subject + verb + object / complement), they contained high level structures such as passive voice, the perfect tenses, adjectival phrases, and varied use of modals. For example, “would” is used four different ways: to refer to a past habitual action, as a polite form, to state reported future tense, and to express the conditional. From the lowest levels, the reading texts regularly contained adjective and reduced adjective clauses, and employed analogy (“…it looks like a coin…”). Even when the words were short in length (3-5 letters), the vocabulary loads were what would be considered “high level” and/or specialized for beginning credit ESL students (e.g., in a reading about a fish: spine, mouth, dig, deep, blood, a back bone, thick skin, humps, tubes, grab, a tail, fins; in a reading about sinkholes: patch, form, the cause of, leak, erode, ground, loose, landslide, process, broken pipe, a mess, quicksand, get sucked in, needless to say). As the reading level climbs, so does the length of words: satellites, information, delusion, measure, distance, purpose. The readings contain positive images of women. For example, to the question “Where is mom?” the answer was “At work,” and there was a photo of a woman in a hard hat and suit looking at architectural plans. The readings contain contemporary idiomatic expressions (e.g., to hang around; And they’re loving it; the clown wig and make up are up to you). Some of the comprehension questions were subjective, and I disagreed with a number of answers. This caused me to wonder how students felt when they disagreed with the “correct” answer.

Possible Benefits to ESL students

1) Exposure to a variety of authentic and formula texts

Narratives all contain a mix of tenses. Since beginning level academic textbooks contain “controlled” passages where there may be one grammar feature (for example, simple past tense) and very few “artful” touches (e.g., using fragments to add emphasis), Reading Plus could be helpful because it exposes ESL students to a variety of short reading passages of both authentic and formula texts. These texts follow the grammatical conventions of certain genres, including narrative and factual reporting (science and history texts).

2) Building of world knowledge

Reading research has shown that poor readers read poorly because they have limited vocabularies, world knowledge, and life experiences. ESL students, summarily cut off from information, need to continue to build knowledge. This can be done simultaneously in their native tongues and with texts in English that are both relevant and accessible.

3) Time on task

Reading Plus offers engaging readings that might encourage poor, struggling, or developing readers to read. This “time on task” will improve reading skills, since the large body of research on reading shows that skilled readers read more and more and therefore continue to grow as readers, whereas poor readers read less and less as reading challenges increase.

4) Independent reading

Once students leave school settings, they are free to become very selective about what they read. Participating in Reading Plus may suggest to reluctant readers that reading texts beyond those required by work, essential to survival (forms and legal documents), or tied to interest areas may have certain benefits. The hope of educational systems is to inculcate habits of knowledge seeking in learners.

Disadvantages

1) Minimal Vocabulary Building Benefit

On the surface, Reading Plus promises to build a user’s vocabulary through direct vocabulary instruction and exposure to a wide variety of topics through readings which the user selects out of personal interest. While the value of direct vocabulary instruction is supported by both first and second-language reading research, this direct vocabulary instruction is done in the classroom for targeted classroom texts, for example teaching the concept-rich vocabulary items density, to distinguish, ratio, mass, weight, and volume before a chapter lesson on density for a textbook unit on earth formation. That is, vocabulary acquisition is best facilitated through knowledge building. In her paper, The Challenge of Advanced Texts: The Interdependence of Reading and Learning, Marilyn Jager Adams points out that direct vocabulary instruction increases vocabulary at a very slow rate, much slower than what is desirable to advance a weak reader, and that “using word-frequency information from large corpora to sequence vocabulary instruction...will not work...[because...to read a little bit of this and a little bit of that, and to retain the words encountered independently of their contexts] will not result in the reader learning “the sorts of words that distinguish advanced literacy status” (Adams, 2009).

2) Building Speed Has Minimal Value

The claims made by the makers of Reading Plus are not backed by any research on reading. And, since the readings are not tied to any purpose other than for a student to advance in the program, whether or not meaningful learning will take place is questionable. Although I had advanced to a high reading level with a high reading speed, at the given reading speed, a reading on the Aurora Borealis was far too difficult (I had no background knowledge) and, because there was no opportunity to re-read, passing the comprehension section was impossible. This was a frustrating reading experience. The newness of the subject matter and the complexity of the reading required me “to read to learn” rather than “to read to be rewarded for reading competency,” but the program does not allow for these shifts in reading strategy. Prevented from engaging in a meaningful reading experience with the text about the Aurora Borealis, rather than follow my interests, I subsequently felt tempted to choose readings that I could easily handle in order to pass the tests and do well. Students who travel the path of least resistance will not likely make the gains they need.

Added to this focus on gaining speed, there is no listening component for Reading Plus. In fact, the makers of Reading Plus use the absence of sound recordings as a major selling point, stressing the importance of “silent reading.” However, ESL students need input in all modalities. Both experience and research show that listening to text while reading aids ESL students in numerous ways. Language learners all over the world report using subtitles in film and on television to assist their learning. Listening to the printed text in front of them read aloud is invaluable in helping ESL students absorb prosody, the chunking of language, and the intonation and stress patterns that mark meaning at the sentence and word levels. Lems, et al., list the “striking similarities between the comprehension processes involved in listening and reading” and, in support of encouraging students to achieve oral fluency, go on to state, “Oracy acts as a bridge between a natural language process, which is listening, and an unnatural process, which is reading” (Lems, Miller & Soro, 2010). That is, speaking, reading, and hearing the word all contribute to constructing meaning from text. In addition, research into the listening processes of English language learners “has shown that they are more likely to hone in on content words and miss some of the shorter function words” (Field, 2008). Listening and reading simultaneously will help raise awareness of function words, “[words which are]...so common and so short, can be the pivots for comprehension” (Lems, Miller & Soro, 2010). Finally, and perhaps most importantly, readers need phonological representations of words. “Second language acquisition researcher Koda calls phonological decoding ‘the most indispensable competence for reading acquisition in all languages’ because even for proficient readers, having a good phonological representation of a word helps us retrieve it from working memory. Phonological decoding is a core literacy skill” (Lems quoting Koda, 2012).

3) Cost is high, and delivery does not match our purposes.

In 2014, the cost of Reading Plus was $68,000 for a three-year contract for 1,000 seats. The more seats, the lower the per-seat cost. So, if the ESL department joined with the English department, it might be possible to bring the cost down to roughly $30 per student. Still, cost aside, Reading Plus does not really serve our purposes. What we’re looking for is not a program that will monitor student progress and push them to read faster. Rather, we are looking for an open-ended program that allows ESL students access to a wide variety of current high-interest readings for adult readers that meet the 94-98% reading level for any given student. Reading Plus readings do offer progressive levels of difficulty (lexile levels) and recursive vocabulary, which are very attractive features. But the lowest level is not low enough for the students at the beginning of our present credit ESL sequence, students who need this program the most. If the cost were lower and users could access the readings at will, without being registered users, the program would be more attractive.

Researched Critiques of Reading Plus

Except for bitter complaints from students and the proliferation of answer keys to Reading Plus on the Internet (some set to lyrical classical scores), I was unable to find any critiques of Reading Plus. Searches on the web typically led back to the corporation or to promotions of the program. Of interest is a 2015 dissertation by Joel M. County, a doctoral student at East Carolina University. The 140-page dissertation struck me as incoherent; the following points struck me as noteworthy:

1. Mr. County gives much focus to the claim that the Reading Plus program helps guide and train readers’ eyes to fixate on text, a claim Reading Plus backs with scientific evidence, beginning with the 1933 work begun by Miles Tinker on the relationship of eye movements to reading, and moving through the subsequent innovations in machinery for measuring and training these movements (County, p. 27). There are no references to the work of Stanislas Dehaene, author of Reading in the Brain, whose extensive research on reading processes debunks this notion (Dehaene, 2009). Nor does County introduce the very strong position of numerous experts from the diverse fields of ophthalmology, pediatrics, education, disability, and reading research on the specious nature of reading remediation based on assumed visual deficits (quoted earlier in this paper on page 46: Pediatrics, 2009). Although Jeanne Chall (1967, 1983) is glancingly referenced, her successor Marilyn Jager Adams (1990, 2009) is not mentioned. This gross omission suggests that County feels Jeanne Chall’s name alone adds authority to his claims.

2. County’s references to the work of Matthew J. Traxler, Professor of Psychology and member of the Center for Mind and Brain at the University of California, Davis, and an eminent expert in eye movement, are simplistic and take Traxler’s work out of context to serve the Reading Plus claim that reading speed is the key factor for skilled reading. “In Traxler et al.’s (2012) study, results showed that reading speed impacts the reader’s progress more than working-memory capacity (Traxler et al., 2012)” [sic] (ibid, page 28). On the contrary, Traxler’s work is highly technical, and papers posted on the web where Traxler’s work is cited are on topics such as whether readers slow down (for hundredths of seconds) for sentences with embedded clauses, considerations regarding the fairness of timed reading tests, and whether above-average visual acuity gives strong readers an edge with reading speed. That is, Traxler and those who cite Traxler are not arguing that poor reading is simply reading too slowly.

3. The “history” of Reading Plus is all referenced from the “history” pages of the Reading Plus web site (ibid, page 38).

4. The schools used in the study were given free Reading Plus subscriptions (ibid, page 53).

5. “For the purpose of this study, achievement gaps will not be examined” (ibid, page 56).

6. The measures used to assess the effectiveness of Reading Plus were:

a. the Scholastic Reading Inventory (SRI), a corporate product

b. Reading Plus assessments themselves

c. the grade the student received from the teacher (ibid, page 69)

Note:

The Scholastic Reading Inventory is a lexile system, rating texts in a sequence of difficulty according to the length of individual words and the length of sentences. It’s not out of the realm of possibility that the creators of Reading Plus used the Scholastic Reading Inventory to select and/or create texts. Sometimes explained as a means to measure readability, lexile systems have limits. For example, the following text snippets have a readability rating for 9-11 year olds:

The aqua blue quilt is a prize for the quiz. Will you toss your hat in the ring? [Grade 4, 9-10 year olds = Charlotte’s Web]

Did that groan come from your groin? The ewe drank the dew. The ox sang an ode. There were tens of yams. The gnu ate a few. The gnu wore a fez. What did you do? A gnu is a large dark antelope with a long head, a beard and mane, and a sloping back. [Grade 5, 10-11 year olds = comic books]

Critics of lexile systems argue that readers are forced to read things they might not be interested in; that no one agrees how lexile matches grade level; that lexile systems often rate children’s books as more difficult than popular fiction books for adults; and that makers of lexile systems copyright them and charge for their use.
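To illustrate how much such ratings lean on word length and sentence length, the sketch below scores the first snippet with the Flesch-Kincaid grade-level formula, a public analogue of readability measures (the actual lexile formula is proprietary, so this is not the computation SRI performs). The syllable counter is a rough vowel-group heuristic, so the output is an approximation only.

import re

def rough_syllables(word):
    """Very rough syllable estimate: count groups of vowels."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade(text):
    """Flesch-Kincaid grade level: 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(rough_syllables(w) for w in words)
    return 0.39 * (len(words) / len(sentences)) + 11.8 * (syllables / len(words)) - 15.59

snippet = "The aqua blue quilt is a prize for the quiz. Will you toss your hat in the ring?"
print(round(fk_grade(snippet), 1))  # short words and short sentences yield a low grade estimate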

7. Throughout the study, the students received extrinsic rewards:

a. certificates (Reading Plus awards these)

b. free time on computers

c. candy

“The intent was for the extrinsic rewards to turn into intrinsic rewards as the students became more motivated to read” (ibid, page 106).

8. Teachers who participated in the pilot study reported that they saw no gains in phonics or phonemic awareness (ibid, page 109).

9. Aaron Spence, Superintendent of Schools in the Moore School District, where the pilot study took place, was invited to serve on the dissertation committee of the research team (ibid, Appendix E, page 139).

Conclusions

Over the years, there have been numerous publications on effective test creation. Mary Piontek, at the Center for Research on Learning and Teaching at the University of Michigan, published research findings in Best Practices for Designing and Grading Exams (Piontek, 2008). I wonder if the makers of either the DRP or Reading Plus have availed themselves of these types of resources.

I can only conclude that the reason schools continue to use these programs is that the U.S. doesn’t devote adequate money to research into effective reading instruction. That is, when faced with the prospect of trying to help students of all ages, economic statuses, educational backgrounds, and learning abilities, whose first language is not always English, and who cannot read well enough to succeed in academic environments, schools settle for these standardized reading programs as “better than nothing.”

Section 4 Survey of tools we use to determine student reading levels/competency

I wanted to survey the tools used at CCSF to determine ESL student reading levels/competency. For a test to be useful, it must be both valid (measure what it claims to measure) and reliable (measure consistently within itself and across time). If either of these characteristics is not present, a test is highly problematic. Further, tests should ideally have what is called inter-rater reliability: different raters or observers should give similar estimates or judgments of the same phenomenon. Joel M. County, who wrote the doctoral dissertation on Reading Plus mentioned earlier, assessed Reading Plus against itself to show that the Reading Plus program accurately measures student reading achievement. Assessments of this type are not a sound basis for purchasing an educational tool or for grading a student.
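To illustrate what inter-rater reliability looks like in practice, the sketch below computes simple percent agreement and Cohen’s kappa for two hypothetical raters scoring the same ten student readings. The ratings are invented for demonstration; kappa is shown because, unlike raw agreement, it discounts the agreement expected by chance.

from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    chance = sum(counts_a[c] * counts_b[c] for c in set(rater_a) | set(rater_b)) / (n * n)
    return (observed - chance) / (1 - chance)

# Hypothetical ratings of ten readings by two instructors (pass / borderline / fail).
rater_a = ["pass", "pass", "fail", "borderline", "pass", "fail", "pass", "borderline", "pass", "fail"]
rater_b = ["pass", "pass", "fail", "pass",       "pass", "fail", "pass", "borderline", "fail", "fail"]

agreement = sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)
print(f"Percent agreement: {agreement:.2f}")            # 0.80
print(f"Cohen's kappa:     {cohens_kappa(rater_a, rater_b):.2f}")  # lower, because some agreement is chance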

At CCSF, there are a number of tools being used to assess reading. They vary widely on a number of levels: their age; authorship (purchased vs. in-house); function; and whether or not they can be challenged. The following list is only partial and excludes, for example, the reading assessments used in computer programs such as Reading Plus.

1. Non-credit ESL placement reading test (ESLN)

2. Non-credit ESL promotion reading tests (ESLN)

a. Level 2

b. Level 4

c. Level 6

3. Credit ESL placement reading test (ESL)

4. Credit ESL Common Reading Tests (ESL)

a. 110

b. 120

c. 130

d. 140

e. 150

f. 160

5. Individual instructors use various instruments (e.g., RFU-3; textbook publisher generated tests). Some instructors report that they assess reading competency by checking for response to ideas and incorporation of summarized information in writing assignments.

#1 Non-Credit ESL Placement Test, Reading Section

To receive a placement in Non-Credit ESL, incoming students first take a listening test. The score on this test determines which of two reading forms the student is given (Scholnick & Hammer, 2002). There is also an oral interview. Both reading forms were created by the non-credit division of the City College of San Francisco’s ESL department and have been tested for reliability and validity.

#2 Non-Credit ESL Promotion Test Battery, Reading Sections

In 1996, a Test Developers Working Group, made up of instructors from the non-credit division of the City College of San Francisco’s ESL department, developed Reading and Listening tests to be administered at levels 2, 4, and 6 in order to measure non-credit ESL students’ proficiency with these skills. In 2002, three separate instruments for measuring reading skills (for promotion testing at levels 2, 4, and 6) were created and introduced (Table 1).

Table 1

Non-Credit Level Number of Items Time Allowed for Test

ESLN Level 2 31 items 40 minutes

ESLN Level 4 35 items 60 minutes

ESLN Level 6 30 items 45 minutes

Working with CCSF’s Office of Research, Nadia Scholnick, the Assessment Resource Instructor for the non-credit division of the ESL department, tested the Reading tests for reliability and validity. Reading test cut scores are only one of multiple measures that determine whether a student is promoted from one non-credit level to the next. For example, level 4 testing includes a writing test and oral interview in addition to the reading and listening tests, and, for all levels, teachers use test scores as guides for promoting students.

#3 ESL Credit Placement Reading Test

Prior to 1987, City College was using a reading test for placement. Research conducted on its item quality, validity, reliability, and predictive validity revealed it was a poor instrument, and it was abandoned in the early nineties. From 1987 to 2004, there was no reading portion for the credit ESL placement test. However, in 2004, during a period when the ESL curriculum was being revised, a reading test was developed by Laura Walsh and Kitty Moriwaki. Research in the ESL field suggesting that reading is the most important of the four language skills for academic success (reading, writing, speaking, listening) supported the curriculum revisions, and changes included an increased emphasis on reading and the integration of skills. This new reading test, which has been in place since 2004, contains four readings with five questions for each reading, for a raw score total of 20. As explained in the “Upcoming Changes in Placement Testing” section below, this test will be retired and replaced with the state tests being generated under the Common Assessment Initiative.

#4 Credit Common ESL Final Exam, Reading Sections

Preceding the re-introduction of a reading section for the credit ESL placement test, in 2004 six separate reading tests were developed for the ESL Department’s common final exam. Prior to this revision, the common final exam for students taking ESL courses was a composition test. Since the ESL department was updating the ESL curriculum to combine courses and integrate skills, the common final exam was revised to contain three sections: reading, writing, and grammar. Each semester, Laura Walsh, Credit ESL Assessment Coordinator, uses multiple measures to check the reading tests for item quality and test reliability. She also solicits feedback from her ESL colleagues. If she sees clues that something is “off,” for example a low Kuder-Richardson coefficient for any of the tests, she will recruit test writers to work on revisions. Readings that are not working to differentiate between high-achieving and satisfactorily achieving students, items that are poorly worded, and distractors that are not working effectively are replaced.
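For readers unfamiliar with the Kuder-Richardson coefficient mentioned above, the sketch below computes KR-20 from a small, invented matrix of right/wrong item responses. The data are hypothetical and far smaller than a real test administration; the point is only to show what the coefficient summarizes, namely how consistently the items sort the same students.

def kr20(responses):
    """KR-20 = (k/(k-1)) * (1 - sum(p*q) / variance of total scores), for 0/1 item data."""
    n_students = len(responses)
    k = len(responses[0])                            # number of items
    totals = [sum(row) for row in responses]
    mean_total = sum(totals) / n_students
    var_total = sum((t - mean_total) ** 2 for t in totals) / (n_students - 1)
    pq = 0.0
    for item in range(k):
        p = sum(row[item] for row in responses) / n_students   # proportion answering item correctly
        pq += p * (1 - p)
    return (k / (k - 1)) * (1 - pq / var_total)

# Hypothetical data: six students, five items, 1 = correct, 0 = incorrect.
responses = [
    [1, 1, 1, 1, 0],
    [1, 1, 1, 0, 0],
    [1, 1, 0, 0, 0],
    [1, 0, 1, 0, 0],
    [1, 1, 1, 1, 1],
    [0, 0, 0, 0, 0],
]
print(round(kr20(responses), 2))  # a value near 1.0 indicates internally consistent items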

To date, no research has been done on our own tests to look into how stem length (the number of words in an item question) correlates with test difficulty. Research has shown, however, that the length of the question stem correlates strongly with whether an item is answered correctly: items with longer stems present difficulty for all test takers (Spurling, 2014). A question stem may have as few as 3 or up to 20+ words and require a single word answer, chosen from a multiple choice list of 5. Interestingly, by both word count and syntax complexity, the ESL 140 Reading and Grammar tests require students to read more than the respective ESL 150 exams. Further, answers for the 130 reading test require the heaviest reading load (8 is the median number of words for answers, compared to 6 for the ESL 140, 5 for the ESL 150, and 7 for the ESL 160 tests). Overall, the ESL 150 reading test appears simpler than the others, requiring single-word or phrase answers compared to sentences. See Appendix 4 for details.

Upcoming Changes in Placement Testing (including placement for reading)

As of 2015, City College of San Francisco remains one of a minority of California colleges that have created their own placement tests for English and ESL (the math department uses a third-party test, Accuplacer). The majority of California colleges purchase placement instruments from private companies, and said tests are reputed to be not only expensive but also not very good at determining accurate placement in some areas, for example beginning ESL (Walsh, 2015). According to the Common Assessment Initiative website, a 2008 report showed that approximately 30 different assessment instruments were in use at that time, and, because assessment results were accepted at low rates (24% for math, 10% for English, and 8.97% for ESL), reassessment was common. Since tests, placements, and levels vary from college to college, any student who wishes to move from one college to another is often re-tested, which can lead to loss of course credits and/or, the CAI website suggests, poor placement. “Students will benefit from a standardized assessment that clearly defines collegiate academic standards; will no longer be required to navigate multiple assessment processes when moving between colleges; will be more appropriately placed in courses; and, will benefit from a reduced need for remediation and increased academic success as a result of these efforts” (CAI website, 2015). Driven by the Student Success Act of 2012 (itself an update of the 1986 Seymour-Campbell Matriculation Act, which charged colleges with the responsibility of “[assisting] students in defining educational goals and [providing] necessary support services to ensure successful academic outcomes”), the Common Assessment Initiative (CAI) is a move to standardize assessment instruments used for placement. “[Multiple studies show]…that when appropriate assessments are used in conjunction with course placement efforts, course completion and graduation rates experience a direct increase” (Common Assessment Initiative, 2015). One stated benefit will be the “portability” of assessment, allowing students to move from one college to another with greater ease and less confusion than at present. The tests, paid for with taxpayer money, are state-owned and will save colleges money.

Due to the CAI, current CCSF placement tests will soon disappear. In the fall of 2017, ten to twelve colleges, selected for broad representation of students (varying size, urban vs. rural, geographical region), will pilot a new state-owned set of tests for English, ESL, and math placement. For the English and ESL placement tests, there will be an option for administering a writing sample, which can be locally scored or machine scored by the state in Sacramento.

Each college will match the CAI competencies to its respective curriculum. One concern is that local placements will encourage what is termed “college shopping”: a student may take her/his test scores to the college that gives her/him the best placement for completing course sequences. However, the benefits hold the promise of outweighing the disadvantages. At CCSF, it is possible that ESL placement will inform students of parallel placement for our non-credit and credit ESL sequences. This will streamline “pathways” for students and facilitate transfer from ESL non-credit to ESL credit. It is also hoped that we can identify parallel placements for ESL and English, placements that will help students choose pathways that promise the most success. It will be interesting to see what effects this common reading test will have on student reading competency, reading instruction, and our abilities to “level” texts (leveling texts has been resistant to standardization; book publishers are typically the creators of “reading levels”).

Are tests accurate reflections of reading comprehension?

At best, our ESL credit final reading tests sort students. They are less helpful at revealing levels of reading comprehension. Research has shown that students with strong test-taking skills tend to have an advantage over students who are inexperienced with standardized testing and/or suffer from test anxiety. Research also shows that, depending on a student’s native language, reading speeds differ significantly. Further, age is a factor in how quickly readers access text: the older we get, the less speedily we handle timed tests, and we also become more distractible.

Section 5 - Recommendations

Reading load recommendations

1. Students take any number of classes, and it might be helpful if instructors used their syllabi to calendar all reading assignments for the semester so that students can anticipate homework loads and plan ahead.

2. Instructors should take ancillary reading (syllabus; handouts; directions) into consideration when assigning reading loads.

3. Instructors should realize that activities on the computer require reading.

Instructional recommendations

1. As instructors mature as readers, they must remember that they were once like their students: new to reading as a lifelong activity and discipline.

2. Explicitly teach students how to read in your discipline.

While helpful, it is not enough to suggest that students consider genre and purpose. Research has shown that students need to be explicitly taught reading strategies, for example, to apply knowledge of genre as a method of activating prior knowledge, and need to be given opportunities to demonstrate they are using this strategy (Jetton & Lee, 2012). Using the information about the features of advanced disciplinary texts, ESL instructors could introduce specific grammatical and discourse features, giving students ample practice in recognizing and producing them, and content instructors could reinforce the importance of attending to these features by providing direct instruction to their students about how to approach their respective assigned texts.

For example, regarding “metaphorical realizations of logical reasoning,” both ESL and content instructors can ask students to “translate” metaphorical realizations (examples in the chart below).

Original sentence: A strong updraft will tilt the wind shear and produce rotation inside the thunderstorm.
Causal “translation”:
- A strong updraft tilts the wind shear.
- A strong updraft causes rotation inside the thunderstorm.
- Because it tilts the wind shear, a strong updraft causes rotation inside the thunderstorm.

Original sentence: The incredible genetic adaptability of bacteria is one reason the world faces a potentially serious rise in the incidence of some infectious bacteria diseases once controlled by antibiotics.
Causal “translation”: Because bacteria genes adapt incredibly well, we see more cases of infectious bacteria diseases.

Original sentence: The result of these factors acting together is that every major disease-causing bacterium now has strains that resist at least one of the roughly 160 antibiotics used to treat bacteria [sic] infections.
Causal “translation”: Because these factors act together, every major disease-causing bacterium now has strains that resist antibiotics.

Original sentence: The psychiatric profession still viewed mental disorders as environmental, most likely due to overbearing mothers or severe trauma.
Causal “translation”: Psychiatrists believed that environmental factors, like overbearing mothers or severe trauma, caused mental disorders.

Original sentence: The arbitrary sixteen bed restriction had been adopted as part of the government’s campaign to shut down large state mental hospitals through deinstitutionalization.
Causal “translation”: Because the government wanted to get rid of large state mental hospitals, the new law said halfway houses could only offer 16 beds.

Original sentence: The police initially suspected murder, but when they found out Jeffrey Turner was a schizophrenic, they ruled it a suicide.
Causal “translation”: Because Jeffrey Turner had a mental illness, the police decided he had killed himself.

Original sentence: What makes Carbon-14 and Nitrogen-14 a very good isotope pair is that most substances that contain carbon in a structure (such as shells made of CaCO3) do not also have nitrogen in them.
Causal “translation”: Carbon-14 and Nitrogen-14 make a good isotope pair because most things that contain carbon in a structure (such as shells made of CaCO3) do not contain nitrogen.

Original sentence: If we are trying to use the Carbon-14/Nitrogen-14 radioactive decay pair to date a rock that’s 100 million years old, there likely will not be enough parent left to measure, and that would not be a good choice!
Causal “translation”: The Carbon-14/Nitrogen-14 radioactive decay pair is not a good choice to date a rock that’s 100 million years old, because there probably won’t be enough parent left to measure.

Original sentence: …very much like a current of electrons moving in a loop, which you’ll learn in a basic physics class is one way to produce a magnetic field.
Causal “translation”: When a current of electrons moves in a loop, it creates a magnetic field.

Instruction along these lines helps students to understand that there are different ways to express causality. As readers advance in their skills, their “bag of tricks” expands, and this allows them not only to comprehend at faster rates, but also to avoid plagiarism when they paraphrase and summarize in order to demonstrate understanding.

Both ESL and content instructors can teach students that nominalization means “to make into a noun.” In science classes, students can be taught that nominalizations are a grammar resource that allows scientists to give complete and accurate descriptions of scientific phenomena.

• The physical separation of transcription and translation by the nuclear envelope gives eukaryotes more opportunities to regulate gene expression. (from Modern Biology, 2006, p. 424)

• A hurricane is a large, swirling, low-pressure system that forms over tropical islands.

• Caution must be taken against a fake smallpox attack that can “infect” hundreds of people working in the health care profession. [In simple language: We must take caution. There is an attack of smallpox. The attack is not real. The attack can “infect” hundreds of people. These people work in the health care profession.] (Fang, 2012)

Students can be taught that nominalization is used in history textbooks to present “a single story,” to bring the author’s or publisher’s hidden opinion to the forefront and obscure the historical characters. (opinions are in italics)

A wave of panic ensued, leading to business failures and slowdowns that threw thousands out of work. The major cause of the panic was a sharp, but temporary downturn in agricultural exports to Britain, and recovery was well under way by early 1859. [from a school text on the Panic of 1857]. (ibid)

Reminding students about nominalizations in math (nouns underlined) will help students focus their learning:

• Identify the point at which the parabola intersects the axis of symmetry.

• Find the value of a in a quadratic function y = ax^2 + bx + c.

• When a secant segment and a tangent segment are drawn to a circle from an external point, the product of the secant segment and its external segment is equal to the square of the tangent segment. (ibid)

Other ideas for helping ESL (and other) students read advanced texts

1. Use curriculum, textbook design, and textbooks built on corpus-based word lists. These can guide beginning ESL students to learn and master the identified 2,000 high-frequency word families that account for up to 50-80% of typical texts in English. This should be a target goal early in a student’s formal learning. Along these lines, self-assessments could be created for students to use in lab settings, and those who have extensive vocabularies (5,000 words or more) but who have gaps in the identified 2,000 high-frequency word families could use the self-assessments to guide self-study (see the coverage sketch after this list).

2. Encourage ESL students to continue reading in their native tongues and native speakers to continue reading outside of school. Teach them that building their knowledge will help them in all aspects of their lives.

3. Teach ESL students to use their L1 literacy and knowledge to access L2 texts.

4. Take cues from cognitive science (Roberts & Kreuz, 2015).

a. Encourage ESL students, and others, who try to memorize to instead develop linguistic competence. Give them ample opportunities to apply what they have read or learned.

b. Teach ESL students, and others, to use “elaborative rehearsal” strategies rather than “maintenance rehearsal strategies” and to elaborate on what they know.

c. Re-learning is faster and more lasting than learning. The work of the 1880s German researcher Hermann Ebbinghaus has been repeatedly replicated: within 20 minutes you have forgotten 60 percent of what you have been exposed to, and this drop-off continues until it levels off, at six days, to about 25 percent; this last bit is fairly enduring.

5. Encourage students to experiment. In one interesting research experiment, second language learners were asked to write down, in their L1, what they had understood from a reading in L2. Results revealed that they had understood significantly more than the standardized test showed or that they could express in L2. This may be why some second language learners whose ESL and English instructors feel they are struggling with reading can be successful in a content course where their reading is assessed not by a standardized test, but by a demonstration that they can apply what has ostensibly been taught through a text.
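As noted in recommendation 1 above, a lab self-assessment could check how much of a passage is covered by a high-frequency word list. Here is a minimal sketch of that kind of coverage check; the tiny word list and passage are stand-ins, and a real check would load an established list of roughly 2,000 headword families and map inflected forms to their headwords.

import re

def coverage(text, headwords):
    """Share of running words in `text` that appear on the high-frequency list."""
    tokens = re.findall(r"[a-z']+", text.lower())
    on_list = sum(1 for t in tokens if t in headwords)
    return on_list / len(tokens), len(tokens)

# Stand-in list; a real check would load ~2,000 headword families from a corpus-based list.
high_frequency = {"the", "a", "of", "and", "to", "in", "is", "it", "was", "for",
                  "on", "that", "with", "as", "at", "be", "this", "have", "from", "or"}

passage = ("The results of the survey show that reading loads vary widely "
           "and that completion of the reading is tied to success in a course.")

share, total = coverage(passage, high_frequency)
print(f"{share:.0%} of {total} running words are on the list")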

Here are several intriguing recommendations for helping second language learners handle advanced disciplinary texts:

1) Build world knowledge in L1 before attempting the L2 reading assignment.

a. For lessons, focus students on concepts, for example, taxation; climate change; raising a toddler. Describe the assigned text. For example, “We will be reading an analysis of how a measure to offer city subsidized housing to people who are homeless and also suffering from mental disorders will affect the cost of city government.” Ask the student to find a parallel text in his/her native language. This is not a translation but a text on the same topic or concept.

b. The text in the student’s native language should be one she can handle comfortably.

c. Ask the student to identify 10 key concepts/vocabulary items she feels are key to understanding the topic.

d. Ask the student to find and learn the English vocabulary items she feels are equivalent.

e. Ask the student to read the English text.

f. The final step is to ask students to analyze whether or not the 10 key concepts/vocabulary items they chose from the text in their native tongues helped them to grasp the material/lesson in English well enough to complete assignments/assessments in English.

2) Build genre knowledge in L2 by analyzing L1 texts.

a. Assign the student an L2 reading.

b. Ask the student to explain how the L2 text is different from or similar to the same kind of text the student has encountered in her L1. For example, if the assigned L2 text is an editorial opinion piece from a newspaper or magazine, ask the student how the same writing task would be accomplished in her L1 culture.

Results of these techniques have shown promise.

Testing

Because we are on the eve of changing our placement testing system, I don’t have specific recommendations. However, when we create tests, it’s good to keep in mind the adage: All paper and pencil tests are reading tests. Instructors should confirm that any tests they create are valid (measure what they claim to measure), reliable (measure consistently within themselves and across time), and ideally, demonstrate inter-rater reliability (instructors in the same discipline / teaching the same course come up with the same ratings).

There are already instructors who see reading as just one modality among several and who know from experience that the importance of possessing reading skills depends on the task at hand.

Taking into account language distance (e.g., a native reader of Italian having a speed advantage over a native reader of Chinese when both are reading English), age (younger people read faster than older people), and personality (some people are slow readers and some people are fast readers), if we want to know the individual ability of a reader (vs. her skills measured against the skills of a group of her peers), it’s advisable to allow students ample time on reading tests.

Last Words

Jeanne Chall, considered by some to be the Grande Dame of first language reading research, wrote:

“[Reading] researchers have been influenced by the philosophical assumptions and social problems of their time, both in selecting problems and, more particularly, in drawing conclusions and making recommendations. Such influence is inevitable since no one can escape the time in which he lives” (Chall, 1967).

What are my times? We are in the eye of the storm of the technological revolution. Chall did not live to see the day she yearned for, a day when books could “talk” to struggling readers. We are now building machines to help us read, to help us learn to read, to help us teach others to read, to help us communicate in other modes than with printed text. Yet, what reading is, why we should teach reading, how we should teach reading, whether we should read, and what place reading has in the world continue to be perplexing questions. Indeed, in today’s global and digital reality, where worlds can be explored in a multitude of ways, and educators are gently reminded that “…being literate also means interpreting multiple message streams, managing a great deal of information, and producing compositions that include textual, audio, and visual components” (Hicks & Steffel, 2012), dedicating time and attention to the teaching of reading skills is in fierce competition with a multitude of equally pressing demands.

Still, it is not hard to argue that reading is a unique and invaluable mode of accessing information, and that no other communication mode, not listening, or visual imagery, or moving pictures, can match the speed of a skilled reader processing and comprehending text. In his book, On Writing, Stephen King points out that texts allow us to explore imaginary or otherwise inaccessible worlds, to read minds, and to time travel. Unlike listening to or watching public media, reading is always a private, solitary act. When we read, not only can we “freeze” the text for repeated study, we select what to focus on, and we are the final judges of what is true and what is not. What is reading? Reading is thinking. And, just as we all think differently, we all read differently.

“It should be clear that… [hypotheses about how to effectively teach reading] concern groups, not individuals. Obviously, every method produces ranges of attainment, and every method has its failures. And it may very well be that certain individuals find one or another method particularly suitable—or impossible” (Chall, 1967).

As educators, perhaps our best tool for helping our students is to remind them that writing is a gift from our ancestors. And that reading is one of the ways we put that precious gift to use.

References (* denotes secondary source):

Academic Word List; accessed 11/09/16. http://www.oxfordlearnersdictionaries.com/us/wordlist/english/academic/AcademicWordList_sublist_1/?page=2

Adams, M. J. (1990). Beginning to read: Thinking and learning about print. Cambridge, Massachusetts: MIT Press.

Adams, M. J. (2009). The challenge of advanced texts: The interdependence of reading and learning. http://www.childrenofthecode.org/library/MJA-ChallengeofAdvancedTexts.pdf

*Alderson, J. C. (1984). Reading in a foreign language: A reading problem or a language problem? In J.C. Alderson & A.H. Urquhart (Eds.), Reading in a foreign language (pp. 1-24). London: Longman.

* Anderson, R. C. & Pearson, P. D. (1984). A schema-theoretic view of reading comprehension. In P.D. Pearson (Ed.), Handbook of reading research (pp. 255-291). New York: Longman.

* Anderson, R. C. & Davison, A. (1998). Conceptual and empirical bases of readability formulas. In A. Davison & G. M. Green (Eds.), Linguistic complexity and text comprehension (pp 23-54). Hillsdale, New Jersey: Erlbaum.

* Alessi, S. and Dwyer, A. (2008). Vocabulary assistance before and during reading. Reading in a Foreign Language, 20(2), 246-263.

Altwerger, B., Jordan, N., Rankie Shelton, N. (2007). Rereading fluency: Process, practice, and policy. New Hampshire: Heinemann.

* Baker, J. and Westrup, H. (2000). The English language teacher’s handbook. London: Continuum.

Ball, A. (2006). Multicultural strategies for education and social change: Carriers of the torch in the United States and South Africa. New York: Teachers College Press, Columbia University.

Barre, Elizabeth. (2011). How much should we assign? Estimating out of class workload. http://cte.rice.edu/blogarchive/2016/07/11/workload.

* Bell, T. (2001). Extensive reading: Speed and comprehension. The Reading Matrix, 1(1), 3-15.

* Bernhardt, E. B. (2005). Progress & procrastination in second language reading. In M.E. McGroarty (Ed.), A survey of applied linguistics (Vol 25.1., pp. 133-150). West Nyack, NY: Cambridge University Press.

Bernhardt, E. B. (2011). Understanding advanced second-language reading. New York: Routledge.

* Birch, B. M. (2002). English L2 reading: Getting to the bottom. New Jersey: Erlbaum.

Blachowicz, C., Lems, K., & Rasinski, T. (2012). Fluency instruction: Research-based best practices, 2nd edition. New York: Guilford Press.

Common Assessment Initiative. (2015). Retrieved June 22, 2015 from http://cccassess.org/

* Calkins, L. (2001). The art of teaching reading. New York: Addison Wesley Longman.

* Capellini, M. (2005). Balancing reading & language learning. Portland, Maine: Stenhouse Publishers.

* Carter, R. and Nunan, D., editors. (2001). Teaching English to speakers of other languages. United Kingdom: Cambridge University Press.

Cobb, T. (2009). Necessary or nice? Computers in second language reading. In Han, Z. & Anderson, N. J. (Eds.), Second language reading research and instruction. 144-172.

Chabot Community College. (2009). Reading between the lives. Video. Also posted on YouTube (2015): https://www.youtube.com/watch?v=QP6JuaFX8iE.

Chall, J. S. (1967). Learning to read: The great debate. Columbus, Ohio: McGraw Hill.

Chall, J. S. (1983). Stages of Reading Development. Columbus, Ohio: McGraw-Hill.

* Davis, J. N. (1992). Reading literature in the foreign language: The comprehension/response connection. The French Review, 65(3), 359-370.

* Davison, A., Wilson, P. & Hermon, G. (1985). Effects of syntactic connectives and organizing cues on text comprehension. Champaign, IL: Center for the Study of Reading.

Dehaene, Stanislas (2009). Reading in the brain. New York: Viking.

* Dooling, D. J., & Mullet, R.L. (1973). Locus of thematic effects in retention of prose. Journal of experimental psychology, 97, 404-406.

* Dornyei, Z. and Ushioda, E. (2010). Teaching and researching motivation (2nd edition). New York: Longman Pearson.

Dudley-Marling, C. and Michaels, S., editors. (2012). High-expectation curricula: Helping all students engage in powerful learning. New York: Teachers College Press, Columbia University.

Fang, Zhihui. (2012). The challenges of reading disciplinary texts. In Jetton, T. L. & Shanahan, C. (Eds.), Adolescent literacy in the academic disciplines: General principles and practical strategies. New York: Guilford Press. 34-68.

Fang, Z., & Schleppegrell, M. J. (2008). Reading in secondary content areas: A language-based pedagogy. Ann Arbor, MI: University of Michigan Press.

Fang, Z., & Schleppegrell, M. J. (2010). Disciplinary literacies across content areas: Supporting secondary reading through functional language analysis. Journal of Adolescent and Adult Literacy, 53(7), 587– 597.

* Field, J. (2008). Bricks or mortar: Which parts of the input does a second language listener rely on? TESOL Quarterly, 42(3), 411-432.

Grabe, W. & Stoller, F. L. (2011). Teaching and Researching Reading, 2nd edition. New York: Longman Pearson.

*Halliday, M. A. K., & Martin, J. R. (1993). Writing science: Literacy and discursive power. Pittsburgh, PA: University of Pittsburgh Press.

Han, Z. & Anderson, N. J., Eds. (2009). Second language reading research and instruction. Ann Arbor: University of Michigan Press.

* Hedgcock, J. & Ferris, D.R. (2008). Teaching readers of English. New York: Routledge.

Hicks, T. & Steffel, S. (2012). Learning with text in English language arts. In Jetton, T. L. & Shanahan, C. (Eds.), Adolescent literacy in the academic disciplines: General principles and practical strategies. New York: Guilford Press. Chapter 5.

Hunt, Mai. (2015). City College of San Francisco Chemistry instructor. Personal interview.

* Irwin, J.W. & Pulver, C.J. (1984). Effects of explicitness, clause order and reversibility on children’s comprehension of causal relationships. Journal of educational psychology, 76, 399-407.

* Juffs, A. (1998). Main verb versus reduced relative clauses ambiguity resolution in L2 sentence processing. Language learning, 48, 107-114.

* Kamil, M., Mosenthal, P.B., Pearson, P.D., Barr, R., editors. (2000). Handbook of reading research, volume III. New Jersey: Lawrence Erlbaum Associates, Inc.

Koda, K. (2005). Insights into second language reading: A cross-linguistic approach. Cambridge, England: Cambridge University Press.

Marsh, J. (2011). Class dismissed: Why we cannot teach or learn our way out of inequality. New York: Monthly Review Press.

*Milton, J., & Meara, P. (1995). How periods abroad affect vocabulary growth in a foreign language. ITL Review of Applied Linguistics, 107/108, 17-34.

*Moje, E. B. (2008). Foregrounding the disciplines in secondary literacy teaching and learning: A call for change. Journal of Adolescent and Adult Literacy, 52(2), 96-107.

* Nagy, W.E. & Anderson, R.C. (1984). How many words are there in printed school English? Reading Research Quarterly, 19, 304-330.

Pediatrics. (2009, August). Learning disabilities, dyslexia, and vision. Pediatrics, 124(2). http://pediatrics.aappublications.org/content/124/2/837. Accessed 12/15/2016.

* Perie, M., Grigg, W., & Donahue, P. (2005). The nation’s report card: Reading 2005 (NCES-2006-451). Washington, D.C.: U.S. Department of Education, National Center for Education Statistics.

* Ratych, J. (1985). Zwei Jahrzehnte literarische Lehrbücher [Two decades of literary textbooks]. In M. Heid (Ed.), New Yorker Werkstattgespräch 1984: Literarische Texte im Fremdsprachenunterricht [New York workshop discussion 1984: Literary texts in foreign language teaching] (pp. 68-84). Munich: Kemmler & Hoch.

Roberts, R. & Kreuz, R. (2015). Becoming fluent: How cognitive science can help adults learn a foreign language. Cambridge, Massachusetts: MIT Press.

Rose, M. (1989). The language of exclusion: Writing instruction at the university. College English, 47, 341-359.

Rosenthal, N. (1995). Speaking of reading. New Hampshire: Heinemann.

* Rueda, R. Velasco, A. and Lim, H. J. (2008). Comprehension instruction: Research-based best practices (2nd ed.) (pp. 294-305). New York: Guilford Press.

Sadoski, M. (2004). Why? The goals of teaching reading. In Conceptual Foundations of Teaching Reading. New York: Guilford Press. (p. 55)

Scholnick, N. (2002). Non-credit ESL placement test procedure manual.

* Schumann, J. H. (1976). Social distance as a factor in 2nd language acquisition. Language learning 26 (1), 135-143.

Seymour, S. (1997). Reading Challenges for ESL Students in Mainstream Courses at City College of San Francisco. Sabbatical Report.

* Shaughnessy, M. (2010). Reading in 2010: A comprehensive review of a changing field. New York: Nova Science Publishers, Inc.

Shanahan, T. & Shanahan, C. (2008). Teaching disciplinary literacy to adolescents: Rethinking content-area literacy. Harvard Educational Review, 78(1), 40-59.

* Sternberg, R. J. (1987). Most vocabulary is learned from context. In M. G. McKeown & M. E. Curtis (Eds.), The nature of vocabulary acquisition (pp. 89-103). Hillsdale, New Jersey: Erlbaum.

* Takase, A. (2007). Japanese high school student motivation for extensive L2 reading. Reading in a foreign language, 19, 9-18.

Unsworth, L. (1999). Developing critical understanding of the specialized language of school science and history texts: A functional grammatical perspective. Journal of Adolescent and Adult Literacy, 42(7), 508-521.

Walsh, Laura. (2015). Personal communication.

Walvoord, B.E., & McCarthy, L.B. (1990). Thinking and writing in college: A naturalistic study of students in four disciplines. Urbana, IL: National Council of Teachers of English.

Wolf, Maryanne. (2007). Proust and the squid. New York: Harper.

Zamel, V. (1995, December). Strangers in academia: The experiences of faculty and ESL students across the curriculum. College composition and communication, 46(4), 506-521. Accessed February 12, 2015, from http://www.jstor.org/stable/358325?seq=1#page_scan_tab_contents

Zamel, V. & Spack, R. (Eds.). (2004). Crossing the curriculum: Multilingual learners in college classrooms. Mahwah, New Jersey: Lawrence Erlbaum Associates.

Appendix 1

Bernhardt believes second language reading research should focus on how literate readers of second-language texts “learn to comprehend at greater levels of sophistication, and whether that ability can be enhanced by instruction.” She suggests that, for further research, a “base line standard” criterion be established (Bernhardt, ix):

a. Use a single text (or set of texts) for data collection.

b. Differentiate research subjects according to native language background.

c. Establish first language literacy levels of L2 research subjects.

d. Establish amount of L2 grammatical knowledge of L2 research subjects.

e. “Second language” cannot mean “English”; studies must be done with a variety of native language readers who are learning, respectively, a variety of second languages (e.g., Spanish speakers learning Korean, Arabic speakers learning Chinese, Dutch speakers learning Italian, etc.).

Bernhardt states that much of the research into effective instructional methods has gone into repeating “experiments.” She asserts that design flaws need to be removed in future research and suggests using the following criteria (ibid, p. 22):

a) Specification of first language literacy level

b) Measurement of second language grammatical level

c) Delineation of first-language backgrounds of subject populations

d) Explanation of linguistic relationship of cognizant first and second languages

e) At least one member of the research team able to use the cognizant first and second languages

f) Subjects’ comprehension assessed in their dominant language

g) Multiple texts employed

h) Multiple measures employed

Appendix 2 - My Course Reading Load / Assessment Survey

1. What is your discipline? [to compare reading instruction across the disciplines]

2. I am [to get a sense of how professional development might be offered if a need is indicated and also to find out if reading load/reading assessment is tied to whether an instructor is full-time or part-time]

a. a full-time instructor
b. a part-time instructor

3. I teach [to gather data that may help us improve our ability to help students bridge from non-credit to credit]

a. in the credit division
b. in the non-credit division
c. in both the credit and non-credit divisions
d. other ______

4. The next questions will be about the reading loads you assign. Please choose one G.E., certificate, or pre-requisite course that you regularly teach. In the box below, please write the course name (e.g., Art 102).

[to get a sense of the reading load demands on students at particular points in their educational trajectory; for example, a beginning ESL student might only have a reading load from her ESL courses but an advanced ESL student will have a reading load in her non-ESL courses]

5. How many pages of core reading do you typically assign your students per semester for the course you listed in number 4? [to get a sense of the reading loads on the average student for courses that satisfy GE requirements]

6. How many hours of reading per week do you estimate it takes students to complete the reading assignments for the course you listed in number 4? [to get a sense of the time instructors believe students take to complete reading assignments]

7. How many hours of ancillary reading do you require students to complete for the course you listed in number 4? (syllabus, assignment sheets, handouts, peer review, instructions for using Insight and other computer applications, Internet forums, etc.) [to get a sense of the reading loads on the average student]

8. How are reading assignments distributed across the semester? [to get a sense of the reading loads on the average student]

a. about the same number of pages per week
b. some weeks, the number of pages is heavier than other weeks depending on other assignments (writing/research/presentation/preparing for a test/etc.)
c. the reading load is heaviest at the beginning of the semester
d. the reading load is heaviest as we approach midterm
e. the reading load is heavy in the weeks leading up to a writing assignment and this happens ______ a semester
f. the reading load is heaviest towards the end of the semester
g. other: ______

9. During class, what direct instruction do you give your students to help them complete their reading assignments? Check all that apply. [to get a sense of what sorts of reading instruction are being used across the disciplines in the classroom]

a. I specifically teach them how to read texts according to the manner that reading in my discipline requires.
b. I explain the genre of the reading & give explicit instruction for approaching the text.
c. I instruct them to consider the genre and/or the purpose of the reading assignment before they begin reading.
d. We do a close reading of each text.
e. I set aside class time for “think alouds.”
f. I interpret the text for the students.
g. I set aside class time for a class discussion of the students’ reading habits.
h. How a student approaches reading the texts is up to the individual student.
i. Other: ______

10. For completion outside of class, what direct instruction do you give your students to help them complete their reading assignments? Check all that apply. [to get a sense of what sorts of reading instruction are being used across the disciplines for homework]

a. The students complete a questionnaire about their reading history/habits and I offer individualized feedback.
b. I provide guiding questions to help them focus their reading.
c. If students are struggling with the readings, I suggest they read the chapter questions first in order to focus their reading.
d. Students turn in outlines for each reading assignment.
e. Students turn in summaries for each reading assignment.
f. I provide the topic question for the paper they will write.
g. I provide a list of the essay questions that will be on the unit test and/or final exam.
h. Students fill out worksheets. (Please attach a sample worksheet if possible.)
i. Other: ______

11. How do you determine whether students have completed and comprehended the readings you’ve assigned? (Check all that apply.) [to get a sense of what sorts of reading assessment instruments students are facing]

h. I give a quiz for every reading.
i. Questions about the readings are on tests and/or the final exam.
j. Students write papers/complete homework assignments about what they’ve read.
k. If students successfully complete the homework and pass quizzes and tests, I assume they have completed the readings.
l. Students work in groups and turn in worksheets I give them.
m. I give students points for participating in class discussions about the reading assignments.
n. Other: ______

12. How often do you assess whether students understand the texts you’ve assigned? [to get a sense of how often students are assessed for reading]

a. more than once a week
b. weekly
c. once or twice a month
d. monthly
e. several times during the course of the semester
f. only with the midterm and final exam
g. only with the final exam

13. How do you use the assessments you do of your students’ reading comprehension? [to get a sense of what coursework reading assessments are used for]

a. Assessments on reading comprehension are part of a student’s grade.
b. I use reading assessments to monitor my students’ reading comprehension and offer additional instruction as necessary.
c. Other: ______

14. How important is reading your assigned texts for a student to successfully complete your course (a grade of C- or higher)? [to get a sense of how important reading is in order to succeed in each given course / across courses]

a. of the highest importance
b. very important
c. somewhat important
d. not very important
e. not important at all

15. If a student in your class doesn’t complete the reading assignments, what can s/he do to pass your course? [to get a sense of how important reading skills are for the average student at CCSF]

• attend lectures, take good notes and pass quizzes and exams
• join a study group
• work with a tutor
• read the text in translation to his/her native language
• other: ______
• Students who don’t complete and comprehend the reading assignments will be unable to pass the course.

16. I have been teaching for ______years. [to get a sense of whether reading loads have increased/decreased in the past ten years]

17. In the last ten years, the number of pages that I assign students to read for ______ [course title] [to get a sense of whether reading loads have increased/decreased in the past ten years…if the respondent has taught for less than ten years, she can indicate this in the comment box]

a. …has stayed about the same
b. …has increased
c. …has decreased
d. I don’t assign core reading anymore. Please explain: [comment box]

18. In the last ten years, the general amount of ancillary reading I require students to complete [to get a sense of whether reading loads have increased/decreased in the past ten years; I am guessing it has increased because of the computer and the Internet]

a. …has stayed about the same
b. …has increased
c. …has decreased
d. I don’t assign ancillary reading anymore. The reason for this is: [comment box]

19. Do you have any comments or questions? If you wish to be anonymous, please use the box below.

20. Would you be willing to be interviewed for this sabbatical project? If so, please contact:

Lia Smith, ESL Department (415) 452-5184 / [email protected] Batmale 212, Box L 308

Appendix 3 – Descriptors for the answers of Degrees of Reading Power T4, U4, T2, U2

T4

1. contrast to drive point home

2. topic sentence at end of paragraph

3. concluding remark

4. transition w/grammar pattern (the x are that y)

5. vocabulary; so x that y

6. concluding remark

7. summary sentence

8. topic sentence for following paragraph

9. synthesis of 3 points

10. linking vocabulary (rest = recover; journey implied)

11. all = plural / next sentence confirms choice: behave = behavior

12. summary of points presented

13. contrast

14. pronoun reference / text redundancy

15. accumulation of supporting details to drive home main point

16. concluding sentence

17. internal transition; mention of a first solution refers to previous (first) paragraph

18. internal transition; mention of a 2nd solution again refers to first paragraph

19. restatement

20. elaboration of a point

21. pronoun reference (following thread of reading)

22. vocabulary (balance = equilibrium)

23. noun reference and transition

24. vocabulary (use of a synonym) and restatement

T2

1. restatement & topic sentence

2. refers back to main thrust of reading

3. concluding (mentions 3 points presented)

4. presents problem

5. problem/solution

6. additional problems (causal chain)

7. summary & topic sentence

8. [tacked on sentence; none of the other distractors come close]

9. returns to main point and synthesizes point just made

10. reason to find solution to problem that has been introduced

11. additive (summary of new point added to previous point)

12. contrastive paragraph (showing group that didn’t require innovation)

13. repeated terms; restatement & topic sentence

14. synthesis of 2 paragraphs

15. restatement & topic sentence

16. [bad writing]

17. [bad writing]

18. [bad writing]

19. [I spaced out]

20. draws paragraph 1 and 2 together

21. refers back to main point

22. repeats main point

23. [answer by default]

24. [answer by default]

U4

1. summarizing comment

2. topic sentence conclusion

3. restatement

4. summarizing sentence, topic sentence

5. restatement of main point

6. summary of 3 points made

7. understanding the progression of an idea + vocabulary

8. restates main point after a problem/solution is given

9. ties back to the beginning of the reading

10. restatement of point

11. summary of several points presented

12. rounds out paragraph and refers to main thrust of article

13. moves away from “hook” and brings reader back to main point

14. restatement to show “stall” in process

15. rounds out paragraph and returns to idea in introduction

16. summary

17. summary

18. pronoun reference

19. restatement and synthesis

20. synthesis

21. [bad writing]

22. shows a limit with a restatement

23. shows a limit with a restatement

24. gives the main point – contrasting two types of narrators

U2

1. pronoun reference

2. summary after presentation of 2 points [distractors are ludicrous]

3. [answer nonsensical but process of elimination points to it]

4. summary of problem/solution

5. [bad writing; I got this answer wrong]

6. summary of problem/solution

7. summary of main point

8. summary of main point

9. summary of main point

10. reference

11. summary of new point made

12. summary of new point made

13. previous knowledge and concession

14. summary of new point made

15. summary of new point made

16. language of analogy

17. [bad writing; correct answer arrived at through process of elimination]

18. pronoun reference

19. word forms (acidic condition = acidity)

20. summary of point just presented

21. summary comment (mentioned in previous 2nd paragraph)

22. correction of what was previously believed

23. summary of point just presented

24. summary of point just presented

Appendix 4 - Types of answers for ESL final Reading Tests:

Level (Reading Answers)   110 (25)   120 (25)   130 (30)   140 (30)   150 (25)   160 (30)
noun                         8          6          6          8         14          5
noun phrase                  6          4          6          8          2          7
sentence                     6         12         13          9          8         12
fact                         2         NA         --         --         --         --
dep. clause                 --         --          1          1         NA          1
verb                         1          3          2          3          1          2
adjective                    1         NA          1          1         NA          2

In addition, inf. purpose (1) appears on two of the tests.

Word Count ESL Final Tests: Reading & Grammar

                      110      120      130      140      150      160
Reading
  Whole Test          1397     1830     2562     2819     2573*    3391
  Stems (range)       4-13     2-15     3-14     3-32     4-16     3-25
  Stems (median)      8        7        8        9        8        8
  Answers (range)     1-11     1-13     1-21     1-12     1-20     1-24
  Answers (median)    3        6        8        6        5        7

Grammar
  Whole Test          1223     1299     1434     1536     1518*    1644
  Stems (range)       5-13     4-51     4-51     2-25     3-27     4-45
  Stems (median)      10       12       11       13       11       19
  Answers (range)     5-10^    1-4      1-49     1-50     1-27     1-7
  Answers (median)    5        2        2        3        3        1

Three of the grammar tests include a sentence section, one includes six punctuation-focus items, and one includes a find-the-error section.

Notes:

110 Grammar
a. There are 3 direction boxes.
b. There are three sections (answer question; choose correct sentence; fill in blank).
c. The first test section has sentences up to 12 words long, with words as long as 10 letters (e.g., demonstration; homework; Sacramento).
d. Items 1-10 refer to a given sentence.
e. ^ Items 11-19, students must choose a correct sentence from 4 choices. The median number of words is 10; students read an average of 30 words per item to get at the answer.

Word counts for ESL grammar tests: 110—1223; 120—1299; 130—1434; 140—1536; 150—1518*; 160—1644

Word counts for ESL Reading tests: 110—1397; 120—1830; 130—2562; 140—2819; 150—2573*; 160—3391

* The 150 grammar and reading tests have lower word counts than the parallel level 140 tests.

Appendix 5

List of books for extensive reading at levels 110-130, targeted at males and students who “hate” reading.

1. Around the World. Phelan, Matt. 2014. Massachusetts: Candlewick Press. ISBN: 9780763669256 [Around the world on bicycles version of Around the World in 80 Days]

2. Heroes for Civil Rights, by David A. Adler; illustrated by Bill Farnsworth; Holiday House, NY (2008); ISBN: 978-0-8234-2008-7

30 pages [Ralph Abernathy; Medgar Evers; Andrew Goodman, James Earl Chaney & Michael Henry Schwerner; the Greensboro Four; Fannie Lou Hamer; Lyndon Baines Johnson; Martin Luther King, Jr.; the Little Rock Nine; Thurgood Marshall; James Meredith; Rosa Parks; Fred Shuttlesworth; Earl Warren]

3. Malcolm X: A Fire Burning Brightly by Walter Dean Myers; Paperback, 40 pages; 2003 by Amistad (first published 2000); ISBN13: 9780060562014

4. Down to the Last Out: The Journal of the Biddy Owens Negro Leagues 1948. Myers, Walter Dean. 2001. New York: Scholastic. ISBN: 9780439095037

5. Sugar by Jewell Parker Rhodes. 2013. New York: Little, Brown. ISBN: 9780316043052

6. Incredible Stories of Courage in Sports, by Brad Herzog, Free Spirit Publishing, 2014. ISBN: 978-1-57542-478-1. True stories highlighting themes in sports (e.g., generosity, teamwork, courage, perseverance).

7. Captured Sports History, Capstone Publishing. What a Kick: How a Clutch World Cup Win Propelled Women’s Soccer by Emma Carlson Berne. ISBN: 9780756552930.

8. Vision of Beauty: The Story of Sarah Breedlove Walker. By Kathryn Lasky; illustrated by Nneka Bennett. Candlewick Press (MA). $17.99 (48p). ISBN 978-0-7636-0253-6

9. Morris and Buddy, by Becky Hall; Albert Whitman & Company, 2007;

ISBN: 978-0-8075-52841 (hardback)

10. Cyberbullying: Online Safety by Tracy Brown; [$26.81]; Series: Helpline: Teen Issues And Answers; Catalog Number: #71812; Rosen Publishing Group, 2014; ISBN 13: 978-1-448-89450-5

11. Talkin' About Bessie: The Story of Aviator Elizabeth Coleman by Nikki Grimes, illustrated by E.B. Lewis; Orchard Books/Scholastic, 2002. ISBN: 978-0-439-35243-7

12. Trash Talk: What You Throw Away by Amy Tilmont; Hardcover, 48 pages; Norwood House Press, 2011; ISBN13: 9781599534596

13. Awesome Stories of Generosity in Sports. Herzog, Brad. Free Spirit Publishing. 978-1-57542-477-4

14. Sarah Winnemucca. By Natalie Rosinsky; Compass Point Books, 2006; ISBN 9780756510039

15. Voice of Freedom: Fannie Lou Hamer: Spirit of the Civil Rights Movement by Carole Boston Weatherford; Candlewick Press (2015); ISBN 978-0-7637-6531-9

16. Oprah Winfrey (We Both Read Series) by Lisa Maria and Sindy McKay, Illustrated by Marc Scott; ISBN: (978-1-60115-241-1); hardcover; $9.95

17. Black Lives Matter by Sue Bradford Edwards ; Hardcover, 112 pages; Essential Library, 2015; ISBN 13: 9781624038983

18. Mo Willems by Sheila Griffin Llanas, hardcover, Checkerboard Books, 2012, ISBN13: 9781617832499

19. Scholastic: A True Book Series – 16 Biographies

a. Hillary Clinton; Item #: NTS618116; ISBN 13: 9780531238776
b. Malala Yousafzai; Item #: NTS637191; ISBN 13: 9780531211915; Library Binding
c. Mark Zuckerberg; Item #: NTS641648; ISBN 13: 9780531215944; Library Binding
d. Michelle Obama; Item #: NTS637194; ISBN 13: 9780531211922; Library Binding

20. Rape and Sexual Assault: Healing and Recovery by Rebecca T. Klein; Hardcover, 80 pages; 2013 by Rosen Classroom; ISBN13: 9781448894499 [$26.81]

21. Body Piercing and Tattooing: Making Smart Choices by Robert Z. Cohen; [$26.81]; Series: Helpline: Teen Issues And Answers; Catalog Number: #71811; Publisher: Rosen Publishing Group, 2014; ISBN 13: 978-1-448-89451-2

22. Children of War: Voices of Iraqi Refugees by Deborah Ellis; Hardcover, 144 pages; Groundwood Books, 2009; ISBN13: 9780888999078

23. Random House Kids. ISBN: 978-0-385-38636-4 Magic Tree House Fact Tracker series by Mary Pope Osborne, #31, China: Land of the Emperor’s Great Wall

24. Random House Kids ISBN: 978-0-375-87027-9 #28, Heroes for All Times (Florence Nightingale, Martin Luther King, Jr., Gandhi, Harriet Tubman, Susan B. Anthony, and John Muir)

25. Random House Kids. ISBN: 978-0-385-38633-3, #30, Ninjas and Samurai

26. Madam C. J. Walker: The Inspiring Life Story of the Hair Care Entrepreneur. By Darlene R. Stille; Compass Point Books, 2017, $26.99, hardcover; ISBN: 978-0-7565-5165-0

27. Three Wishes: Palestinian and Israeli Children Speak by Deborah Ellis; Paperback, 144 pages; 2006 by Groundwood Books (first published 2004); ISBN13: 9780888996459

Appendix 6

On-line resources

1) Esl-bits.net An extensive collection of short and long texts read aloud in two speed modes (“normal rate” / “slow rate”) by a variety of English speakers (e.g., American English, British English, etc.). Shorter, non-fiction pieces range from 5 minutes to 45 minutes and offer a wide variety of contemporary subjects (Why I wear a headscarf; What kind of parent are you?; Do you believe in ghosts?; What’s wrong with what we eat?; Hey, Google, please forget me!). Audio books include both full-length novels and short pieces in a wide selection:

• Classic literature (e.g., Pride and Prejudice by Jane Austen; The Great Gatsby by F. Scott Fitzgerald)

• Contemporary serious fiction (e.g., The Handmaid’s Tale by Margaret Atwood; A Long Way Down by Nick Hornby; The House on Mango Street by Sandra Cisneros)

• Mystery (e.g., And Then There Were None by Agatha Christie; The Wailing Wind by Tony Hillerman)

• Young Adult (e.g., Diary of a Wimpy Kid by Jeff Kinney; Stardust by Neil Gaiman)

Appendix 7 - Directions to guide ESL students to CCSF library resources.

1) Browse the Read Collection (at all CCSF library locations) in person or online.

It’s a little tricky getting to the Read Collection. Here are two ways:

Method #1 (by location), in three steps:

1. Go through Library Guides: http://guides.ccsf.edu/esl
2. Click on the Read Collections tab (red tabs; the very last one to the right).
3. Choose your location on the next page.

Method #2 (specific to Rosenberg at Ocean Campus), six steps, or paste in the URL: http://diego.ccsf.edu/search~S1/X?SEARCH=(*)&SORT=D&b=re

1. Go to the CCSF web page (www.ccsf.edu).
2. Go to the Library page (http://www.ccsf.edu/en/library.html).
3. Click on “Search CityCat.” This is a gray box next to the search box. (http://diego.ccsf.edu/search/?searchtype=X&searcharg=&submit=Search+CityCat)
4. Under the first category, “Search Terms,” in the “any field” box, put *
5. Under the second category, “Add Limits,” use the drop-down box for “any location.” Choose Rosenberg / Read. (http://diego.ccsf.edu/search~S1/X?SEARCH=(*)&SORT=D&b=re)

2) Books with CDs: Read and listen to accompanying word-for-word CD recordings of the text. We have a collection of these books in the CCSF Rosenberg Library, in the Read Collection. You can browse them in person by going to the 5th floor of Rosenberg. When you come up the stairs, you will be facing the Read Collection. At other locations, ask the librarian for help.

You can also browse these books on the Internet by going to the CCSF Library home page and clicking on “More Search Options.” On that page, over to the right, you will see the word “Browse” in black letters. Underneath, there are red letters. Click on “Reading Lists.” A list will appear. Number 6 on the list is “Books with CDs in the Read Collection.”

3) Autobiographies for Intermediate to Advanced ESL

You can browse these books in person or on the Internet by going to the CCSF Library home page and clicking on “More Search Options.” On that page, over to the right, you will see the word “Browse” in black letters. Underneath, there are red letters. Click on “Reading Lists.” A list will appear. Number 4 on the list is “Autobiographies for Intermediate to Advanced ESL.”

4) Biographies for Intermediate ESL Students

You can browse these books in person or on the Internet by going to the CCSF Library home page and clicking on “More Search Options.” On that page, over to the right, you will see the word “Browse” in black letters. Underneath, there are red letters. Click on “Reading Lists.” A list will appear. Number 5 on the list is “Biographies for Intermediate ESL Students.”