Title: Differences and Similarities of Chilean Secondary Schools That Add and Do Not Add Value in Language


Título: Diferencias y similitudes entre establecimientos chilenos de educación media que agregan y no agregan valor en lenguaje

Author: Bernardita Muñoz-Chereau

Institution: University of Bristol

Research funding source: This research was carried out in order to obtain the degree of PhD. It was financed by CONICYT, Programa Formación de Capital Humano Avanzado.

Contact information: Mail: 13 Osborne Villas, Ground floor, Kingsdown, Bristol, BS28BP, England. Email: [email protected]

Keywords: Value-Added, Educational Effectiveness Research, Chilean Case Study, School Processes, Mixed Methods.

Abstract

This article reports the qualitative phase of a Sequential Explanatory Mixed-Methods Study designed to examine the school processes and/or dimensions that might explain observed differences in schools’ value-added (VA) scores. A longitudinal sample of a cohort of 176,896 students nested within 2,283 schools that took the SIMCE tests in grades 8 and 10 in 2004 and 2006, together with their family questionnaires, was analysed through the statistical technique of multilevel modelling (Goldstein, 1995). Schools’ VA scores in language (a sector not only understudied in Chile but also more closely related to overall effectiveness than math; Sammons, Thomas & Mortimore, 1997) were then used as a screening instrument to choose 2 schools in which an instrumental case study (ICS) (Stake, 1995) was carried out.

Complementing an analytical framework drawn from the literature (Scheerens, Glas & Thomas, 2003), the study identified 11 dimensions that differed and 5 that were similar in the ICS schools. This evidence provides a means to locate the features of more and less effective Chilean secondary schools within the broader literature on Educational Effectiveness Research (EER). The study seeks to contribute to the new assessment framework in place in Chile, with the potential to inform and enhance educational policy and practice.

Abstract (Spanish)

Este artículo reporta la fase cualitativa de un estudio Secuencial Explicativo de Métodos Mixtos. Esta fase examinó los procesos escolares y/o dimensiones que pueden explicar las diferencias observadas en las puntuaciones de valor agregado (VA) de 2 establecimientos en el sector de lenguaje, sector poco estudiado en Chile y más ligado a la eficacia general que matemáticas (Sammons, Thomas y Mortimore, 1997). Tras el análisis estadístico multinivel (Goldstein, 1995) de una muestra longitudinal de una cohorte de 176.896 estudiantes anidados en 2.283 establecimientos que rindieron las pruebas SIMCE en los niveles 8° básico y 2° medio en 2004 y 2006, y sus respectivos cuestionarios familiares, las puntuaciones de VA de los establecimientos se utilizaron para seleccionar 2 escuelas (una que agregó valor y otra que no en lenguaje), para llevar a cabo un estudio de caso instrumental (Stake, 1995). Complementando un marco analítico-conceptual preexistente (Scheerens, Glas y Thomas, 2003), el estudio identificó 11 dimensiones diferentes y 5 similares entre los establecimientos estudiados. Finalmente, el estudio inscribe las características de los establecimientos secundarios chilenos dentro de la literatura más amplia sobre Efectividad Educacional, buscando aportar al nuevo marco de evaluación vigente en Chile con el potencial de informar y mejorar las políticas y prácticas educativas.

Differences and Similarities of Chilean Secondary Schools That Add and Do Not Add Value in Language

In the last decade Chile has shown outstanding progress, improving by 40 points in PISA and positioning the country at the top of the reading performance ranking among the Latin American countries (OECD, 2011). However, most of the previous EER studies carried out internationally, and particularly in the Chilean context (Ramirez, 2007; Valenzuela, Bellei, Osses & Sevilla, 2009; Manzi, San Martín & Van Bellegem, 2011; Carrasco & San Martín, 2011; Thieme, Tortosa-Ausina, Prior & Gempp, 2012), have focused exclusively on math. Perhaps this has been the case because it is well known that family background and home environment play a more significant role in determining attainment in literacy than in math or science, imposing an extra challenge when modelling data concerning language (Steele, Vignoles & Jenkin, 2007). Given that previous research conducted at the secondary level has stressed the need to analyse the impact of the department explicitly (Ainley, 1994; Luyten, 1998; Harris, Jamieson & Russ, 1995; Sammons et al., 1997; Smyth, 1999) and has suggested that school effects in language are more closely related to overall effectiveness than in math (Sammons et al., 1997), exploring school effectiveness in language in Chile clearly merits further research.

Literature review

Processes and/or dimensions of effective and ineffective schools

Effectiveness and ineffectiveness are opposed, relative concepts that can be defined from many approaches (Stoll & Myers, 1998). This study works with a particular definition of effectiveness, students’ progress further than might be expected from a consideration of their intake (Mortimore, 1991), which locates this study within the School Effectiveness and School Improvement field. More precisely, this study will explore processes, defined as aspects of the school environment associated with student achievement (D’Haenens, Van Damme & Onghena, 2010), or school organizational and instructional variables resulting from an effort to open the "black box" of the school (Scheerens, 2000). Describing how well those processes help to explain differences in school effects is central to EER internationally, as well as in the context of Chile, where little research has been conducted.

In the last 30 years, a number of researchers have been devoted to trying to identify common features concerning the processes and characteristics of more and less effective schools in different contexts. Studies using different methodological approaches (from statistical analyses to case studies of outlier schools and a combination of other methods) have identified key characteristics of effective and ineffective schools, working as underlying explanatory mechanisms that have helped to illuminate understandings of what seems to make the difference at the school level. However, focusing on the SESI academic field developed mainly in the last 40 years, what makes an effective school and how schools improve still seems to be an elusive issue. Even though the question of which are the crucial factors in education that promote students’ academic achievement has been debated for several decades, a consensual answer has not been generated. Indeed, the claim that research in these disciplines could elucidate clear recipes or guarantee a path for improving the performance of schools is no more than an illusion or a chimera.

Although there are many lessons that can be learned from the efforts of studies carried out in industrialised countries, it is also necessary to acknowledge the tension between the generalizability and the context specificity of the EER knowledge base. It is not enough to be aware of how effective and ineffective schools have been operationalised in different countries, nor to choose one approach and replicate it in Chile. It is also necessary to challenge the generalizability of these approaches in order to search for dimensions and/or characteristics associated with students’ outcomes that are particularly relevant in Chile, as the increasing importance of context has been recognised in this field.

Processes and/or dimensions of effective schools

Given the overwhelming amount of literature devoted to describing different process factors linked with effective schools, reviewing this work in detail is beyond the scope of this article and has been done elsewhere (Hattie, 2008). Moreover, given that there seems to be no generally agreed theory of educational effectiveness, but rather common features of effective schools and effective teaching in a range of countries (Sammons, 2007; Klieme, 2012), a complete study could be conducted reviewing differences and similarities among different approaches, especially because consensus hides divergent conceptualisations (Scheerens, 2000).

In reference to Chile, the most well-known and influential case study was carried out by Raczynski and Muñoz-Stuardo (2005) on 14 effective schools located in disadvantaged areas. However, the problem is that they selected effective schools based on SIMCE’s contextualised attainment results, without controlling for students’ previous attainment. As this approach has been consistently described as a poor statistic for judging the effectiveness of a school (Willms & Raudenbush, 1989; Raudenbush & Willms, 1995; Stoll & Mortimore, 1997; Sammons et al., 1997; Slee, Weider & Tomlinson, 1998; Sammons, 1999; Ladd & Zelli, 2002; Hoyle & Robinson, 2003; OECD, 2008; Thomas, Salim, Muñoz-Chereau & Peng, 2011), the question of whether the case study schools were really effective remained unanswered.

Nevertheless, as a brief summary, Table 1 compares the effectiveness-enhancing conditions of schooling identified in ten studies conducted in different country contexts.

[insert Table 1]

After reviewing ten studies aimed at outlining processes of effective schools, it is clear that school processes or factors are not a unitary body of knowledge. They seem rather complex, multidimensional, and difficult to measure. However, at least half of the studies included in the current review identified (1) Educational leadership, (2) School climate, (3) High expectations/achievement orientation, (4) Parental involvement, (5) Evaluative potential, (6) Consensus and cohesion among staff, (7) Reinforcement and feedback and (8) Structured instruction as key processes present in effective schools.

Processes and/or dimensions of ineffective schools

Although it has been stressed in the literature that ineffective schools are not merely lacking the features of those that are more effective, there is no doubt that ineffective schools have been comparatively less researched (Stoll & Myers, 1998; Barber & Dann, in Maden, 2001). Four studies in this area are those reported by Sammons, Thomas, Mortimore, Cairns, Bausor & Walker (1995, in Thomas & Mortimore, 1996), Stringfield (1998), Van de Grift & Houtveen (2007) and Sammons (2007).

Sammons et al. (in Thomas & Mortimore, 1996) described the relationship between negative VA results and the quality of teaching and learning. The authors highlighted disrupted teaching, low expectations, lack of appropriate feedback, inappropriate grading of school/homework, low departmental morale and lack of examination preparation.

Stringfield tried to unfold what he called the "anatomy of ineffectiveness" when describing common characteristics found in the Louisiana School Effectiveness Study (LSES), implemented in the United States in the 1990s (Teddlie & Stringfield, in Stringfield, 1998).

They carried out 16 case studies in 8 pairs of demographically matched outlier schools claimed to serve identical communities. Although this study shares a similar methodological limitation with Raczynski and Muñoz-Stuardo (2005), as the schools were selected based on their students’ contextualised attainment results, they focused on Year 3 classrooms (9-year-olds) and reported qualitative differences at 3 levels. At the school level they highlighted 7 dimensions: lack of academic focus, academic periods starting late and ending early, resources often working at cross-purposes, bureaucratic leadership, head-teachers passive in teacher recruitment, lack of teacher assessment and lack of public rewards for students’ academic excellence. At the classroom level they found a leisurely pace, minimal planning, a preponderance of unengaging tasks, low rates of interactive teaching, teachers working in isolation, parts of the mandated curriculum not covered in teaching (less opportunity to learn) and a lack of any sense of academic push. Finally, at the student level, they found low or uneven time on task and classes experienced as “intellectual anarchy” (lack of structure and sense-making).

The study conducted by Van de Grift and Houtveen (2007) in 284 underperforming elementary schools in the Netherlands reported 6 prominent weaknesses: (1) insufficient learning material offered at school to achieve core targets, (2) insufficient time allowed for achieving the minimum objectives of the curriculum, (3) poor instructional quality, (4) insufficient insight into students’ performance levels, (5) insufficient or inappropriate special measures for struggling learners, and (6) dysfunctional organisation of the school, prolonged over time.

Sammons (2007) also reviewed studies concerning the characteristics of ineffective schools and highlighted 4 main aspects: lack of vision, unfocussed leadership, dysfunctional staff relationships, and ineffective classroom practices.

This brief review supports the fact that when specific features or issues that these schools are claimed to share have been outlined, they inevitably compose a "deficit discourse". Given that this unhealthy emphasis (unhealthy in the sense that it is not clear that being aware of what is missing is enough to rectify the situation) is so predominant in the literature concerning ineffective schools, and that some EER authors have claimed that these theories are capable of explaining effectiveness as well as ineffectiveness (Scheerens, 2012), this study decided to explore the same dimensions identified in the effective school when studying the less effective one. Special attention was paid to describing how each factor was operating, not just to highlighting its deficits.

Methods

Field approach for data gathering: ICS

The impetus for carrying out an ICS focusing on 2 schools was to generate a deeper understanding of how and why the differences in performance between these schools occurred. This field approach to inquiry was useful to gather further school-level qualitative information concerning the processes that took place in each school.

School A and school B performance information

School A and school B were chosen using the results obtained in a VA model (M2) as a screening instrument. However, school A was consistently classified as good performance and school B as poor performance regardless of the model adopted.
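The broad shape of the kind of two-level model used for this screening can be sketched as follows (following Goldstein, 1995; the covariates shown here are illustrative assumptions, not the exact specification of M2):

```latex
% Sketch of a two-level value-added model: students i nested in schools j.
% Covariates are illustrative, not the study's exact M2 specification.
y_{ij} = \beta_0 + \beta_1\,\mathrm{prior}_{ij}
       + \boldsymbol{\beta}^{\top}\mathbf{x}_{ij} + u_j + e_{ij},
\qquad u_j \sim N(0, \sigma^2_u), \quad e_{ij} \sim N(0, \sigma^2_e)
```

Here $y_{ij}$ would be the grade 10 SIMCE language score, $\mathrm{prior}_{ij}$ the grade 8 score, $\mathbf{x}_{ij}$ intake covariates drawn from the family questionnaires, and the estimated school-level residual $u_j$ the school’s VA score used for screening.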

[insert Table 2]

School A and B descriptive characteristics

[insert Table 3]

According to the official information provided by the Ministry of Education (SIMCE, 2010), school A and school B are classified in the "middle" SES group, which means that on average parents have 12-13 years of education, their monthly income lies between $325 and $550 and between 25% and 42% of the students are described as vulnerable.

Although school A is a public school that provides education from year 7 to year 12 and school B is a subsidized private school that provides both primary and secondary education, both enrol girls and boys, are middle-sized and are located in 2 urban municipalities within the Metropolitan region, in Santiago city. This descriptive information supports the claim that the ICS is comparing "like with like".

Participants

[insert Table 4]

Three different key stakeholders were included in the ICS carried out in schools A and B, in order to integrate different perspectives: 2 Head-Teachers (HT), 2 Technical-Pedagogic teachers (UTP) and 2 language teachers within each school. The reason for including UTPs (the pedagogical heads who provide instructional support and control) is that they share the leadership with the HT. While HTs play a predominantly administrative or managerial role, UTPs play a pedagogical one.

Although this dual leadership at the school level has recently been recognised as a salient problematic feature of the Chilean school system (Anderson, Marfán Sánchez & Horn, 2011), the effects of this division of labour on school effectiveness have received little attention. Therefore the ICS included both participants. The decision to include different participants was also intended to avoid the well-known tendency within EER of over-relying on the perception of the HT as the main data provider (Fertig & Schmidt, 2002; Scheerens et al., 2003). Finally, the qualitative research literature has stressed the importance of including at least 3 perspectives in order to triangulate and generalise data (Glaser & Strauss, 1967).

The ICS focused on language in general and on Year 10 specifically. This was the case because, even though the quantitative data referred to a previous cohort of students, it was the same year group. So the ICS was carried out in 2010 with a special focus on Year 10.

Instruments

The 14 effectiveness-enhancing theoretical and empirical SE factors outlined by Scheerens et al. (2003) provided the framework for the semi-structured interviews and observation schedules. Although this reduction of complexity led to a partial synthesis of the school practices (because the researcher was selectively attentive), it helped to narrow down the focus of the ICS.

[insert Table 5]

Even though only 8 processes or dimensions seem to have been regularly identified in the literature review on effective schools, the chosen framework provided a good approach. Given that different conceptual frameworks could have been used, the decision to work with this one was based on finding the 14 enhancing factors clear, manageable, easy to communicate, descriptive and concise enough to guide the ICS. They were found condensed and genuine, in the sense that they emerged from prior research carried out in the field in different country contexts. The hypothesis was that these "generic" factors would also be relevant for Chile, and that using them would allow "specific" factors relevant to the Chilean context to emerge.

However, the same authors have pointed out that there is no simple combination of factors that produces an effective school and have warned that the factors are only hypothetically associated with student achievement. In this line, Opdenakker and Van Damme (2000) have stressed the country specificity of some of these factors. At the same time, some of them, such as Differentiation, were not frequently identified in the review and are highly contested issues. For example, some authors have stressed that pupils tend to perform better when grouped according to their level of ability (i.e. heterogeneity of the characteristics of the pupils is considered harmful) (Klieme, 2012), whereas others have stressed that ability grouping has proved to be favourable particularly for high achievers, but not for low achievers (Duru-Bellat & Mingat, 1998): the better the class they attend, the better students tend to perform, but the more homogeneous the class, the worse it is for low achievers. For these reasons, the 14 enhancing factors were explored in the ICS carried out in Chile in order to draw firmer conclusions.

Analysis

The method of analysis chosen for this study was a hybrid approach to qualitative thematic analysis (Fereday & Muir-Cochrane, 2006). It first incorporated a deductive approach based on a-priori codes, developed before examining the current data and derived from previous literature or from existing frameworks used by other researchers when exploring similar research questions (Johnson, 2006; Au, 2007). The analysis then followed a data-driven inductive approach, allowing themes to emerge directly from the data. In this way the researcher also developed new codes by directly examining the data.
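The hybrid coding logic can be illustrated with a minimal sketch. To be clear about the assumptions: the code names, keywords and keyword-matching rule below are invented for illustration only; in the study itself segmentation and coding were interpretive acts by the researcher, not automated matching.

```python
# Illustrative sketch of hybrid (deductive-then-inductive) coding:
# a-priori codes are tried first; segments matching none are set aside
# as candidates for inductive, data-driven coding. The codes and
# keywords are hypothetical examples, not the study's actual scheme.
A_PRIORI_CODES = {
    "achievement orientation": ["results", "expectations", "attainment"],
    "educational leadership": ["head", "leadership", "empower"],
    "school climate": ["discipline", "relationships", "order"],
}

def code_segment(segment: str) -> list[str]:
    """Return the a-priori codes whose keywords appear in the segment."""
    text = segment.lower()
    return [code for code, kws in A_PRIORI_CODES.items()
            if any(kw in text for kw in kws)]

def hybrid_coding(segments: list[str]) -> tuple[dict, list]:
    """Split segments into theory-driven matches and inductive candidates."""
    coded, inductive = {}, []
    for seg in segments:
        matches = code_segment(seg)
        if matches:
            coded[seg] = matches          # deductive, a-priori phase
        else:
            inductive.append(seg)         # reviewed later for emergent themes
    return coded, inductive

coded, inductive = hybrid_coding([
    "The HT asks how my results are going.",
    "We open every bag when we get into the classroom.",
])
```

The second segment matches no a-priori code and so falls through to the inductive pile, mirroring how the two extra data-driven categories in this study emerged from material the existing framework did not cover.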

The analysis was carried out step by step. First, the data collected was transcribed into a word-processing document and translated from Spanish to English. Then the researcher segmented the data into meaningful analytical units using the theory-driven a-priori codes. This procedure was carried out by reading, listening to and summarizing the raw data. The data-driven or inductive codes were then additionally developed.

a-priori categories: 14 effectiveness-enhancing factors.

This choice of methods seemed appropriate to the research because previous studies have pointed out the need to look at overall school factors as well as to maintain a focus on the department level in the process of understanding the link between school outcomes and school processes (Sammons et al., 1997). At the same time, previous studies (such as the one conducted by D’Haenens, Van Damme and Onghena, 2010) have also used the 14 enhancing SE factors as a tool for examining the validity of school process variables in different contexts. So instead of coming up with a new list of effective school dimensions or factors, this study used the 14 enhancing SE factors previously described as a framework for developing the instruments for data collection.

This typology was also overlaid on top of the data in order to look for ways in which the data fitted or did not fit these categories. Reporting disconfirming evidence was also implemented as a way to improve the accuracy of the data analysis, "because in real life, we expect the evidence for themes to diverge and include more than just positive information" (Creswell & Plano Clark, 2011, p. 212). In this way confirming and disconfirming evidence was divided into the 14 meaningful analytical units.

Bearing in mind that these factors are descriptive and only partially objective, a question posed by this research was whether this framework is suitable for describing the most and least effective schools in the Chilean context in terms of VA, as well as how the schools looked from this perspective. In this way, the ICS can also be seen as an empirical test of whether this particular educational effectiveness framework is suitable for describing school effectiveness in a different context, or, as crucially put by Gray, Jesson and Sime (1990), "What factors (and especially factors under schools' own control) can be identified which contribute to schools' effectiveness?" (p. 139).

data-driven categories.

In order to make sense of the data that was not covered by the a-priori codes, the researcher used memos, that is, short phrases written in the margins of transcripts about what she found puzzling, insightful and interesting in the data, taking note of potential new themes identified in the text. In this way, 2 new categories needed to be created in order to describe the local circumstances.

Results

How does a Chilean secondary school that adds value differ from a school that does not add value?

Eleven dimensions were found to differ in the practices or processes of school A when compared with school B. These dimensions were (1) Achievement orientation, (2) Educational leadership, (3) Consensus and cohesion among staff, (4) School climate in terms of effectiveness orientation and good internal relationships, (5) Evaluative potential, (6) Classroom climate, (7) Effective learning time, (8) Differentiation, (9) School climate as orderly atmosphere, (10) Agency and (11) Trust. In order to present the results, the processes or dimensions will be described for each school, providing some exemplary quotes from the participants.

(1) achievement orientation.

School A: Achievement orientation and high expectations of students and teachers, with a clear focus on raising and maintaining high attainment. ‘I feel that my challenge is to get good results with my students…now and then the HT asks me ‘How are YOUR results going?’ He doesn’t ask me ‘How are THE results going?’, but MINE, and I do feel that obtaining high academic results is a personal challenge…’ (Interview, Teacher).

School B: Low expectations of students; the academic orientation was not on achievement but on enacting moral values. ‘Even though we do give importance to the academic aspect, we focus on providing a moral education…an education oriented to helping the children become good persons…because when someone asks us where the school is located, and we answer…they say, ‘Ah! In front of that slum’…they…they inevitably raise an eyebrow as if saying ‘You can’t be a good school if you are working in that area’…if you know what I mean’ (Interview, HT).

(2) educational leadership.

School A: Leadership was a mix of human and educational styles, which fitted well with the challenging school context. ‘A strength we need as Heads is the ability to manage the uncertainty we face every day in the school…when I talk to my teachers and they complain that they are tired and worried, I reply that that is a good sign…if you worry, you will try to find solutions and work harder!…So we end up laughing…I don’t know…it seems so obvious to me that they [teachers] need to be empowered…give them room, give them the opportunity to play this game in their own way, with their own resources…they need to realise that the challenge of getting good results is their own…but in order for that to happen, I exempted them from administrative tasks and bureaucracy…’ (Interview, HT).

School B: Leadership was mainly symbolic and technical, focused on maintaining discipline and control, which generated fear, distrust and isolation among staff. ‘She [the HT] is concerned with administrative, not educational tasks…peeping into a classroom and generating fear is not the same as knowing and understanding the work that is taking place…’ (Interview, UTP).

(3) consensus among staff.

School A: Consensus was found to be mainly related to the academic process. ‘When I realised that the children needed to read more primary sources, I worked with my colleagues from Science and History. We chose texts according to the relevant content that they were working on, and then I taught them strategies to improve their comprehension’ (Interview, Teacher).

School B: Despite the area of consensus among staff concerning the importance of maintaining discipline, they lacked cohesion related to the academic process. ‘I feel like an island…I try my best with my students, but I don’t feel supported in any way by the rest of the staff’ (Interview, Teacher).

(4) school climate as good internal relationships.

School A: Good internal relationships were seen as an aim in itself that characterised the school climate. ‘In our mission we declared ourselves pluralists in 3 dimensions: first one, religion: we cannot bring in our approaches to religion, because if I am Catholic and you are Mormon, and we stress our differences, we are not going to enjoy working together…the same applies for our political and sexual orientation…these issues cannot divide us, so I stress that no matter your personal preferences, if you are going to work in this school, you have to be loyal to the institutional mission and vision in terms of embracing pluralism in order to work together, not letting our differences be more important than our shared views’ (Interview, HT).

School B: The school climate placed an overemphasis on order, discipline and control, to the detriment of internal relationships. ‘…So we get into the classroom at random and open every bag, I even look into their pockets and the students cannot refuse, because we make this rule explicit when they register in the school, so if they don’t like it, they have to go to a different school…’ (Interview, HT).

(5) evaluative potential.

School A: The evaluative culture was oriented to meeting external accountability requirements. ‘We are continuously evaluated from outside…apart from the ones developed by the teachers. For example, yesterday the students took a test from DEMRE, University of Chile, and before that we took a test from Andres Bello, Cepech…we take advantage of any institution dealing with evaluation, like the Catholic University, that can give us ‘free pictures’ of our school […]’ (Interview, HT).

School B: Evaluative practices were implemented mainly as a gatekeeper: to fail or pass students. ‘It seems obvious that after a certain period of time we should audit our own assessment system, but we don’t…in the last couple of years we improved our internal assessment rules, because they had lots of deficiencies…and we do have at the end of each semester an assessment meeting in order to discuss student attainment and other issues, but then we don’t translate that into priority areas…so one thing is to have an assessment procedure, and a different one is to transform it into a tool’ (Interview, UTP).

(6) classroom climate.

School A: The classroom climate was characterised by a work attitude in most of the students, and the teacher made fun of himself, enhancing in this way the fun factor. ‘Now you might think I am mad about the PSU [the national test to get into university]…well…you are right! (Students laugh).’ Later on, when explaining a poetic type of text, he declaimed: ‘poet, don’t talk about a rose, make the rose bloom in the poem…ah, now I turned into a poet!’ (Classroom Observation, Year 10).

School B: In the classroom the work attitude was interrupted by conflicts, there were ‘invisible’ children and jokes were made at the expense of the pupils. ‘The teacher is giving an example related to non-verbal communication and says that just because 2 students are sitting together, like Tom and Mary [2 actual students], it doesn’t mean they are having a more serious relationship. She gives this example pointing to the pupils who are sitting together. The boy blushes and the girl also seems embarrassed. The teacher expands the example by saying that a boy wearing a pink bracelet [showing to all the class a boy’s actual bracelet] does not mean that he is gay. All the class laughs, except for the boy’ (Classroom Observation, Year 10).

(7) effective learning time.

School A: They gave priority to effective learning time, which was expressed in an extended school day, homework, buses to carry students from remote areas to and from the school every day and an efficient use of time on task inside the classroom.

School B: The little importance given to effective learning time was expressed in the fact that they had a short school day, did not set homework for secondary students and spent less time on task during lessons, due to frequent interruptions as well as the inclusion of non-academic activities, such as daily religious celebrations.

(8) differentiation.

School A: It was an implicit policy implemented through tracking students in ability groups organised in parallel classes. ‘If you look at the SIMCE results from last year, it was good as an average but, in terms of effectiveness, we have managed to support the progress of the lower group – in general terms 10% is in the lower group, 30% in the middle and 60% in the advanced – and I think this year the results will be similar’ (Interview, Teacher).

School B: There was no differentiation as a school policy or practice. In school B most of the teachers were confronted with the challenge of teaching a wide range of student abilities. There was only special monitoring after school hours, dedicated exclusively to pupils at risk of repeating the year, but there was no strategy oriented to setting or tracking students during the school day.

(9) orderly atmosphere.

School A: It was seen as a means to academic work, with a set of explicit rules that were known by everybody. Each student signed, at the beginning of the school year, ‘12 non-negotiable rules and regulations’ that stressed that they must ‘Come to school every day and on time. There is a punishment if you are late/Respect, through your actions and attitudes, each member of the school/Come to school wearing the uniform/Be on time in the classroom in an orderly way, sit in your assigned place and start working immediately/Come prepared to work every day. Bring your notebooks, books and any material needed to learn/Do your homework every evening. There is a punishment if you don’t, and support at the school if you need to stay working after school hours/Throw papers and other waste in the bin/Don’t bring cards, radios, mp3 players, games, GPS, DS, mobiles, lasers or any other element not related to the learning process. They will be confiscated/Do not get involved in physical, psychological or verbal violence. Learn how to disagree without upsetting others. Do not fight/Respect the building. Do not make graffiti or write in any place. Keep our building clean/Say your name and class to any adult that asks you inside the building/Always maintain an impeccable personal appearance’ (Documents).

School B: Although order and discipline were defined as one of the school’s main features, rules seemed to be unknown and made up along the way. Rather than being a means to academic work, order was treated as an end in itself, to the detriment of academic work. ‘When the lesson is finishing, the PE teacher came inside the classroom and said in a loud voice “Those that are not wearing a pony tail can’t join the lesson”, generating a worrying atmosphere among the girls’ (Classroom observation, Year 10).

(10) agency.

School A: Although school A did not consider the immediate local community a valuable key player, staff felt that they worked in a toxic, challenging environment that they could manage in order to bring about change. ‘We manage and cope daily with toxic environments and uncertainty…for example the teacher gets to the classroom and a girl might be crying because something happened [at home]…they carry to school problems and worries…so how does the teacher engage them with the learning process? That is the main challenge!’ (Interview, HT).

School B: They passively accepted that the social context determined their work: ‘How to ask a child to do more than what he or she possibly can? … I tell the parents that we do have some tools, but not all of them, and if the parents don’t support their child’s education, there is nothing we can do […]’ (Interview, HT).

(11) trust.

School A: The interaction between staff and students was based on trust: they believed in the honesty, goodness, skill and safety of the school members, and acted accordingly, especially regarding teachers. ‘Our case is showing that young teachers, just finishing university, have the skills and competencies to generate high quality learning…if you don’t oppress them or stand in their way with pointless administrative tasks’ (Interview, HT).

School B: Mistrust was the overwhelming belief, expressed either as a general conception of students (‘… the child is always trying to find a strategy to do what is forbidden’, Interview, HT) or as a justification supporting awkward practices against students’ privacy.

How is a Chilean secondary school that adds value similar to a school that does not add value?

Five dimensions were found to be similarly problematic or absent in both ICS schools: (12) Curriculum quality/opportunity to learn, (13) Parental involvement, (14) Structured instruction, (15) Independent learning and (16) Reinforcement and feedback.

(12) curriculum quality/opportunity to learn.

School A: Teaching to the test undermined opportunity to learn. ‘We have increased the time for teaching and learning…but neglecting contents and sectors that are not included in the tests’ (Interview, Teacher). They called this ‘modifications’: during the last 2 months of the school year, ‘the levels that are taking the national tests focus exclusively on language, maths and science’ (Interview, HT). These ‘tactic’ strategies (Grey & Wilcox, 1985), or sub-optimal activities (OECD, 2008), dubious in nature, reflected the fact that the school was heavily teaching to the test.

School B: Lack of planning undermined opportunity to learn. ‘I am always a bit worried that I won’t be able to cover the whole programme’ (Interview, Teacher). There was an intention to cover the curriculum, but the actual percentage covered by the end of the school year was unclear.

(13) parental involvement.

School A: Parents were narrowly defined as resource providers and customers, undermining their involvement. ‘We work with parents through our shared financing system…every month each parent pays to the school 3,000 pesos and with this money the teacher pays for the materials provided, as well as 1,500 per class for maintaining the garden and the building…in this way parents see how we invest their money’ (Interview, HT).

School B: Though parental involvement was not denied, the school did not implement actions to support families in getting involved in the learning process. ‘I find it really difficult dealing with parents…I am quite happy with them up to year 4, but in higher levels just a few come to parents’ meetings, to complain or talk about superficial things, like the school party at the end of the year…I can see that students are left alone in the sense that they don’t spend time together, mainly because mothers are working…so there is no control, they don’t sign their children’s homework and if you ask them if their child studied at home, they don’t have a clue.’ (Interview, HT).

(14) structured instruction.

Lessons observed in both schools followed a similar structure that excluded independent practice and can be summarised in 4 parts: (a) looking back, (b) presenting new subject matter, (c) guided practice and (d) giving feedback and/or correction. Although both schools showed successful class-management strategies, none of the 10 observed lessons incorporated independent practice, revealing a type of teaching heavily based on direct instruction, with the teacher conceived as a content provider and content mediator.

(15) independent learning.

Lessons observed in both schools showed that the teaching style was centred on the teacher without independent learning: ‘Students are not made responsible for their own work, let alone choosing their own assignments’ (Classroom observation, School A).

Students did not have enough opportunities to develop these skills. The main difference between the case-study schools on this dimension was the amount of time given to each part of the lesson.

School A: Much more importance was given to guided practice. ‘Learning together means adopting a pedagogic method with special emphasis on the materials we develop to mediate students’ learning of new content’ (Interview, HT).

School B: Priority was given to looking back: ‘I spend a great deal of time revising the main contents, because otherwise they forget…’ (Interview, Teacher).

(16) reinforcement and feedback.

In both schools, the way feedback was provided fostered competition rather than supporting students’ learning, and maintained the status quo.

School A: ‘The teacher sticks to the wall the results for each student in the previous tests. A student rushes to have a look at the ranking, checking his own results, but also the results of his classmates’ (Classroom Observation, Year 10).

School B: ‘The teacher started the session giving back to the students their marked tests. The tests were ordered from the upper to the lower grade, so even though she did not make it explicit, one could figure out the relative position of each student’ (Classroom Observation, Year 10).

Discussion

Although EER focused on processes linked with outcomes has accumulated a large amount of evidence about the school and classroom practices that distinguish effective from ineffective schools in Western countries, little empirical research has been conducted in Latin America, let alone Chile. In this respect, the ICS helps to fill this gap in the international EER knowledge base.

Applying a generic model of school effectiveness was useful for describing the similarities and differences in the processes and practices of the ICS schools. However, schools A and B could not be effective without a supportive professional context: it was not educational leadership or the climate alone that seemed to make the difference. More precisely, what emerged from the ICS was that 2 new dimensions, both playing a key role, needed to be included: ‘Agency’, defined as the ability or determination to pursue the school goals, and ‘Trust’, understood as the belief or confidence in the honesty, goodness, skill or safety of a person, organisation or thing. Concepts coined within social cognitive theory are linked with these dimensions. In particular,

collective efficacy (Bandura, 1977), which refers to teachers’ judgment of their capacity for organising and executing the courses of action required to have a positive effect on their students, highlights that the key point in teacher judgements is not a characteristic per se, nor even an accurate assessment of this capability, but a belief influenced by feedback on performance, vicarious experiences, verbal persuasion and physiological states (Goddard, Hoy & Woolfolk-Hoy, 2004). Despite the fact that both schools worked in similar contexts, school A showed strong Agency and Trust, which implied playing a transformational and progressive role, in the sense that its stakeholders believed they could successfully challenge the limitations imposed by the students’ background. School B found it much harder to cope and not despair, precisely because it reproduced, in a more reactionary way, what the students brought to the school.

In line with previous research that has recognised the usefulness of identifying factors at the school level associated with positive VA results in order to assist all schools in developing plans and targets for improvement (Thomas & Mortimore, 1996), this new 16-enhancing-factor approach could be used in the future as an evaluation checklist to identify areas of strength and weakness, or as a coding scheme to capture the diversity of dimensions operating in schools, but not as a final blueprint or recipe for school effectiveness, which is obviously beyond the scope of this study.

Five of the dimensions explored in the ICS blurred rather than sharpened the differences between the 2 schools, showing that not everything was unproblematic in the more effective school. After using VA measures as a screening instrument to carry out the ICS, it was possible to conclude that being effective in one academic subject was by no means everything that mattered. This research points to the need for a broader approach to assessment, distributing incentives to schools more evenly in order to widen their focus across more areas and so increase performance within Chilean education.

A matter of concern that emerged when carrying out the ICS was its actual implications for the schools. Regarding school B in particular, the question was the extent to which the school had the capacity to improve. Given that the school processes described in the ICS can be seen as barriers to improvement, the complexity of implementing change within this school seemed very high, and overcoming these negative organisational barriers is essential to implementing sound improvement initiatives effectively. Though previous studies of low-performing schools have consistently found that such schools lack the capacity to improve on their own (O’Day, 2005; Isaksen, in Rosenkvist, 2010), it is necessary to be cautious and not generalise, because nothing was said about other subject areas or about different groups of pupils. The fact that school B has been described as weak on the analysed dimensions does not mean that it could not be strong in other respects: school B was praised as a safe and orderly institution, and this could have been serving purposes not captured in the ICS.

Another matter of concern was that school A taught to the test to an extent that undermined students’ opportunities to learn. This provided evidence that the assessment emphasis operating in Chile places such a strong burden upon schools that it produces unintended consequences or perverse incentives. This conclusion resonates with many of the issues and limitations that SER has been dealing with in recent decades (Angus, 1993; Ball, 1995; Elliot, 1996; Fielding, 1997; Slee, Weiner & Tomlinson, 1998; Wrigley, 2004; Scherrer, 2011). The question that emerged from the analysis is ‘effective for what?’, which resonates with previous studies that have found that when student test results were used for accountability and improvement purposes in OECD countries, 2 findings were prominent: ‘(1) accountability based on student test results can be a powerful tool for changing teacher and school behavior, but it often creates unintended strategic behavior, and (2) no test can be a perfect indicator of student performance’ (Rosenkvist, 2010, p.3).

The ICS results need to be interpreted as an example of how 16 effectiveness-enhancing dimensions combined to tackle challenging school contexts in 2 Chilean secondary schools. The possibility that a different combination of factors could be operating in other schools needs to be considered, as the literature in the field has shown, for example, how ineffective schools differ from one another (Janssens, in Van de Grift & Houtveen, 2007). In other words, the ICS showed how 16 factors were relevant in 2 different schools with a middle social-class composition. In this way, the ICS provided verifiable examples of the practices operating in a school that added value and a school that did not add value in language, when educating middle-socioeconomic-status children.

But, as the OECD (2008) warns researchers, one has to be especially cautious when interpreting differences in estimated school effects as differences in school effectiveness, because doing so assumes that the chosen model has taken account of all relevant factors that explain school effectiveness. In reality, we cannot observe or control for all these factors; therefore a causal interpretation of the findings provided by this research is indefensible.

Although there is ongoing tension in the international literature over how context-specific the EER knowledge base is, with some authors arguing against the usefulness of applying an ‘objective’ checklist derived from research carried out in different cultural contexts, and years earlier (Brown & Riddell, 1995; Fertig & Schmidt, 2002), and other authors arguing that ‘what works’ in one educational system might work in others (Kovacs, 1998; Scheerens, 2012), a clear strength of the ICS was its exploration of organisational structures and leadership activities, as well as its examination of the instructional activities in place at the school and classroom levels.

Finally, by the end of 2013 the Chilean Quality Agency (QA) will not only classify schools into 4 performance groups; high-stakes consequences (from organisational interventions and monetary sanctions up to school closure) will also follow for those schools that do not meet the standards (i.e. are classified as poor performance).

Given that the law states that the procedure for sorting schools could eventually include VA models, depending on the availability of data (LGE, 2010), this study is timely.

The effort within the new assessment framework to be implemented by the QA to move away from the current system based on raw SIMCE results is promising, but the imposition of high stakes seems problematic, especially because effective policies are not only those which are clear to the stakeholders, but also those that take into account their ability to implement them (Creemers, 2012). Negative consequences will inevitably follow for those schools classified as poor performance. Apart from the well-known consequences of failure (i.e. labelling effects and school stigma), making this classification public could be counterproductive, especially because, as the ICS showed, what seemed to make a difference between school A and school B were not just the well-researched dimensions within EER, but also the far less known Agency and Trust: 2 dimensions that helped school A to overcome obstacles and to sustain its commitment, resilience and effectiveness over time. This key sense of efficacy is precisely the belief that could be seriously damaged by the way the information is going to be used. It is dubious whether a failure discourse can help schools recover or improve their performance. What seems clearer is that their improvement will depend to a great extent on the support provided. In this line, this study helped to identify 16 enhancing dimensions that can support improvement at the school level.

References

Agencia Calidad de la Educación (2011) at URL: http://www.mineduc.cl/contenido_int.php?id_contenido=22276&id_portal=1&id_seccion=4220 (viewed 1/02/13).
Ainley, P. (1994) Degrees of difference: Higher education in the 1990s. London: Lawrence & Wishart.
Anderson, S., Marfán, J. and Horn, A. (2011) Leadership Efficacy, Job Satisfaction and Educational Quality in Chilean Elementary Schools. 25th International Congress for School Effectiveness and School Improvement. at URL: www.icsei.net/icsei2011/Full%20Papers/0095.pdf (viewed 3/02/13).
Angus, L. (1993) The sociology of school effectiveness. British Journal of Sociology of Education, 14(3), pp. 333-45.
Au, W. (2007) High-Stakes Testing and Curricular Control: A Qualitative Metasynthesis. Educational Researcher, 36, pp. 258-267.
Ball, S. (1995) Intellectuals or technicians? The urgent role of theory in educational studies. British Journal of Educational Studies, 43(3), pp. 255-71.
Bandura, A. (1977) Self-efficacy: Toward a unifying theory of behavioural change. Psychological Review, 84(2), pp. 191-215.
Brown, S. and Riddell, S. (1995) School effectiveness research: the policy makers’ tool for school improvement? European Educational Research Journal, March, pp. 6-15.
Carrasco, A. and San Martín, E. (2011) Are Quasi-markets in Education Meeting their Policy Purposes in Chile? Re-examining Empirical Hypotheses from Value-Added Models. at URL: mideuc.cl/wp-content/uploads/2011/09/it1104.pdf (viewed 12/10/12).
Creemers, B. (2012) Developing, testing and using theoretical models for promoting quality and equity in education. Keynote at the 3rd meeting EARLI SIG. at URL: http://www.earli.uzh.ch/programme/keynote-speeches (viewed 4/02/13).
Creswell, J. and Plano Clark, V. (2011) Designing and Conducting Mixed Methods Research. 2nd ed. USA: Sage.
D’Haenens, E., Van Damme, J. and Onghena, P. (2010) Constructing measures for school process variables: the potential of multilevel confirmatory factor analysis. Quality & Quantity, 46, pp. 155-188.
Elliot, J. (1996) School effectiveness research and its critiques: alternative visions of schooling. Cambridge Journal of Education, 26, pp. 199-223.
Fereday, J. and Muir-Cochrane, E. (2006) Demonstrating Rigor Using Thematic Analysis: A Hybrid Approach of Inductive and Deductive Coding and Theme Development. International Journal of Qualitative Methods, 5(1), pp. 80-92.
Fertig, M. and Schmidt, Ch. (2002) The Role of Background Factors for Reading Literacy: Straight National Scores in the PISA 2000 Study. IZA Discussion Paper No. 545. at URL: http://ssrn.com/abstract=323599 (viewed 23/02/11).
Fielding, M. (1997) Beyond school effectiveness and school improvement: lighting the slow fuse of possibility. The Curriculum Journal, 8(1), pp. 7-27.
Glaser, B. and Strauss, A. (1967) The Discovery of Grounded Theory: Strategies for Qualitative Research. London: Sociology Press.
Goddard, R. D., Hoy, W. K. and Woolfolk Hoy, A. (2004) Collective efficacy beliefs: Theoretical developments, empirical evidence and future directions. Educational Researcher, 33(3), pp. 3-13.
Goldstein, H. (1995) Multilevel Statistical Models. London: Edward Arnold.
Gray, J., Jesson, D. and Simes, N. (1990) Estimating Differences in the Examination Performance of Secondary Schools in Six LEAs: A multilevel approach to school effectiveness. Oxford Review of Education, 16(2), pp. 137-56.
Harris, A., Jamieson, M. and Russ, J. (1995) A study of ‘effective’ departments in secondary schools. School Organisation, 15(3), pp. 283-299.
Hattie, J. (2008) Visible Learning: A synthesis of over 800 meta-analyses relating to achievement. London: Routledge.
Hoyle, R. and Robinson, J. (2003) League tables and school effectiveness: a mathematical model. Proceedings of the Royal Society of London B, 270, pp. 113-199.
Johnson (2006) Qualitative Data Analysis. Lecture, South Alabama University. at URL: http://www.southalabama.edu/coe/bset/johnson/lectures/lec17.htm (viewed 15/09/12).
Juwah, Ch., Macfarlane-Dick, D., Matthew, B., Nicol, D., Ross, D. and Smith, B. (2004) Enhancing Student Learning through Effective Formative Feedback. York: Higher Education Academy.
Klime, E. (2012) Qualities and Effects of Teaching: Integrating findings across subjects and cultures. Keynote at the 3rd meeting EARLI SIG. at URL: http://www.earli.uzh.ch/programme/keynote-speeches (viewed 19/03/13).
Kovacs, K. (1998) Combating Failure at School: An International Perspective. In L. Stoll and K. Myers (Eds.) No Quick Fixes: Perspectives on Schools in Difficulty. London: Falmer Press, pp. 222-241.
Ladd, H. and Zelli, A. (2002) School-Based Accountability in North Carolina: The Responses of School Principals. Educational Administration Quarterly, 38, pp. 494-529.
Luyten, H. (1998) School effectiveness and student achievement, consistent across subjects? Evidence from Dutch elementary and secondary education. Educational Research and Evaluation, 4(4), pp. 281-306.
Maden, M. (2001) Success Against the Odds: Five Years On. Revisiting effective schools in disadvantaged areas. London: Routledge.
Manzi, J., San Martín, E. and Van Bellegem, S. (2011) School System Evaluation by Value-Added Analysis under Endogeneity. at URL: http://www.ecore.be/DPs/dp_1284546165.pdf, 2010/90 (viewed 12/04/12).
MINEDUC (2008) Indicadores de la educación en Chile 2007-2008. at URL: http://w3app.mineduc.cl/mineduc/ded/documentos/indicadores_2007-2008.pdf (viewed 11/03/13).
Mortimore, P. (1991) School Effectiveness Research: Which Way at the Crossroads? School Effectiveness and School Improvement, 2(3), pp. 213-229.
OECD (2008) Measuring Improvements in Learning Outcomes: Best practices to assess the value-added of schools. at URL: www.oecd.org/publishing/corrigenda (viewed 12/04/11). Spanish edition: La Medición del Aprendizaje de los Alumnos: Mejores prácticas para evaluar el valor agregado de las escuelas. at URL: http://books.google.co.uk/books?id=vSl2uztDVVYC&pg=PA189&dq=H%C3%A6geland,+2006+school+performance+indicators&hl=es&sa=X&ei=_-M5UYaJJ-XY7AbKhYHgDw&ved=0CDEQ6AEwAA#v=onepage&q&f=false (viewed 8/03/13).
OECD (2011) Society at a Glance: OECD Social Indicators, Key Findings: Chile. at URL: http://www.oecd.org/dataoecd/39/23/47572883.pdf (viewed 12/06/12).
Opdenakker, M. and Van Damme, J. (2000) Effects of Schools, Teaching Staff and Classes on Achievement and Well-Being in Secondary Education: Similarities and Differences Between School Outcomes. School Effectiveness and School Improvement, 11(2), pp. 165-196.
Raczynski, D. and Muñoz-Stuardo, G. (2005) Efectividad Escolar y Cambio Educativo en Condiciones de Pobreza en Chile / School Effectiveness and Education Changes under Conditions of Poverty in Chile. Chile: Ministerio de Educación, División de Educación General.
Ramírez, M. J. (2007) Diferencias dentro de las salas de clases. Estudios Públicos, 106, pp. 5-22.
Raudenbush, S. and Willms, J. (1995) The Estimation of School Effects. Journal of Educational and Behavioural Statistics, 20(4), pp. 307-335.
Rosenkvist, M. A. (2010) Using Student Test Results for Accountability and Improvement: A Literature Review. OECD Education Working Papers, No. 54. OECD Publishing. at URL: http://dx.doi.org/10.1787/5km4htwzbv30-en (viewed 19/05/11).
Sammons, P. (1999) School Effectiveness: Coming of Age in the Twenty-first Century. The Netherlands: Swets and Zeitlinger.
Sammons, P. (2007) School Effectiveness and Equity: Making Connections. A review of school effectiveness and improvement research and its implications for practitioners and policy makers.
Sammons, P., Thomas, S. and Mortimore, P. (1997) Forging Links: Effective schools and effective departments. London: Paul Chapman Publishing.
Scheerens, J. (2000) Improving School Effectiveness. Fundamentals of Educational Planning No. 68. Paris: UNESCO, International Institute for Educational Planning.
Scheerens, J. (2012) Theories of educational effectiveness and ineffectiveness. Keynote at the 3rd meeting EARLI SIG. at URL: http://www.earli.uzh.ch/programme/keynote-speeches (viewed 4/02/13).
Scheerens, J., Glas, C. and Thomas, S. (2003) Educational Evaluation, Assessment and Monitoring: A systemic approach. The Netherlands: Swets and Zeitlinger Publishers.
Scherrer, J. (2011) Measuring Teaching Using Value-Added Modelling: The Imperfect Panacea. NASSP Bulletin, 95(2), pp. 122-140.
SIMCE (2010) Informe de Resultados Segundo Medio. at URL: http://www.simce.cl (viewed 31/01/13).
Slee, R., Weiner, G. and Tomlinson, S. (1998) School Effectiveness for Whom? Challenges to the school effectiveness and school improvement movements. London: Falmer Press.
Smyth, E. (1999) Pupil performance, absenteeism and school drop-out: a multidimensional analysis. School Effectiveness and School Improvement, 10, pp. 480-502.
Stake, R. (1995) The Art of Case Study Research. London: Sage.
Steele, F., Vignoles, A. and Jenkin, A. (2007) The effect of school resources on pupil attainment: a multilevel simultaneous equation modelling approach. Journal of the Royal Statistical Society, 170, Part 3, pp. 801-824.
Stoll, L. and Mortimore, P. (1997) School effectiveness and school improvement. In J. White and M. Barber (Eds.) Perspectives on School Effectiveness and School Improvement. London: Institute of Education.
Stoll, L. and Myers, K. (1998) No Quick Fixes: Perspectives on Schools in Difficulty. London: Falmer Press.
Stringfield, S. (1998) Anatomy of Ineffectiveness. In L. Stoll and K. Myers (Eds.) Perspectives on Schools in Difficulty, Ch. 15. London: Falmer Press, pp. 209-221.
Thieme, C., Tortosa-Ausina, E., Prior, D. and Gempp, R. (2012) Valor Agregado Multinivel y factores contextuales en educación: una comparación no paramétrica robusta. at URL: https://editorialexpress.com/cgibin/conference/download.cgi?db_name=xveep&paper_id=53 (viewed 04/02/13).
Thomas, S. and Mortimore, P. (1996) Comparison of Value Added Models for Secondary School Effectiveness. Research Papers in Education, 11(1), pp. 5-33.
Thomas, S., Salim, M., Muñoz-Chereau, B. and Peng, W-J. (2011) Educational Quality, Effectiveness and Evaluation: Perspectives from China, South America and Africa. In Ch. Chapman, D. Muijs, A. Harris, D. Reynolds and P. Sammons (Eds.) Challenging the Orthodoxy of School Effectiveness and School Improvement: Towards New Theoretical Perspectives. London: Routledge, pp. 125-145.
Valenzuela, J., Bellei, C., Osses, A. and Sevilla, A. (2009) Causas que explican el mejoramiento de los resultados obtenidos por los estudiantes chilenos en PISA 2006 respecto a PISA 2001. Aprendizajes y Políticas. Available at www.fonide.cl (viewed 21/07/10).
Van de Grift, W. and Houtveen, T. (2007) Weaknesses in Underperforming Schools. Journal of Education for Students Placed at Risk (JESPAR), 12(4), pp. 383-403.
Willms, D. and Raudenbush, S. (1989) A Longitudinal Hierarchical Linear Model for Estimating School Effects and Their Stability. Journal of Educational Measurement, 26(3), pp. 209-232.
Wrigley, T. (2004) School Effectiveness: The Problem of Reductionism. British Educational Research Journal, 30(2), pp. 227-244.

Appendices

Table 1: Effectiveness-enhancing processes or factors of schooling in different studies

The table sets the 14 effectiveness-enhancing factors of Scheerens, Glas and Thomas (2003) alongside the factor lists of nine other studies: 10 features of school success (National Commission on Education, in Maden, 2001); 10 factors of effective schools (Teddlie and Reynolds, 2000); 9 processes of effective schools (Sammons, Thomas and Mortimore, 1997); 9 processes (Raczynski and Muñoz-Stuardo, 2007); 8 general factors (Scheerens and Bosker, 1997); 11 factors of effectiveness (Sammons et al., 1995); 4 key characteristics of secondary schools (Smith and Tomlinson, 1990); the 5 factor model (Edmonds, 1979); and 8 school processes (Rutter et al., 1979).

The 14 factors (rows) are: educational leadership; school climate as orderly atmosphere; school climate in terms of effectiveness orientation and good internal relationships; high expectations/achievement orientation; parental involvement; evaluative potential; consensus and cohesion among staff; reinforcement and feedback; structured instruction; independent learning; curriculum quality/opportunity to learn; effective learning time; differentiation; and classroom climate. Factors found in other studies but not included in this model are: participation by pupils in the life of the school; extra-curricular activities which broaden pupils’ interests and experiences, expand their opportunities to succeed and help to build good relationships with the school; efficient use of human resources; developing staff skills at the school site; practice-oriented staff development; good working conditions for staff and students; and use of external support.

Table 2: Cohort 2004/2006, year groups 8/10: School A and B language residuals (standard errors in brackets)

Model       M0               M1               M2               M3               M4               M5
School A    32.585 [7.45]    37.506 [7.99]    22.314 [5.22]    21.183 [5.60]    18.091 [5.49]    17.472 [5.49]
School B    -42.376 [10.09]  -43.679 [10.07]  -43.218 [6.933]  -41.943 [6.89]   -41.723 [6.68]   -41.34 [8.95]
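The school residuals reported in Table 2 are the school-level random effects from multilevel models of the kind described in Goldstein (1995). As a minimal sketch, assuming a standard two-level value-added specification (the covariate vector z is illustrative, standing for whatever student and family background controls distinguish models M1 to M5; the paper's exact covariates are not listed here), the model can be written as:

```latex
% Two-level value-added model (sketch; covariates are illustrative assumptions)
% y_{ij}: grade-10 SIMCE language score of student i in school j
% x_{ij}: prior attainment (grade-8 SIMCE score)
% z_{ij}: student/family background controls (added progressively in M1-M5)
y_{ij} = \beta_0 + \beta_1 x_{ij} + \boldsymbol{\beta}_2' \mathbf{z}_{ij} + u_j + e_{ij},
\qquad u_j \sim N(0,\sigma_u^2), \quad e_{ij} \sim N(0,\sigma_e^2)
```

Under this reading, the estimated school residual \(\hat{u}_j\) is the school's value-added score, and a school whose interval \(\hat{u}_j \pm 1.96\,SE\) excludes zero is adding (or subtracting) significantly more value than expected given its intake; the brackets in Table 2 give the standard errors.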

Table 3: School A and B descriptive characteristics

Characteristic         School A                School B
SES (MINEDUC, 2010)    Middle                  Middle
Type                   Public                  Private-subsidised
Level                  Secondary (Y7-Y12)      Primary and Secondary
Gender                 Girls and boys          Girls and boys
Size                   Middle (800 students)   Middle (960 students)
Location               Urban, Colina           Urban, Puente Alto

Table 4: Data collection techniques and aims

| Participants | Data collection techniques | Aim |
|--------------|----------------------------|-----|
| 2 head-teachers (1 per school) | Semi-structured interviews (2 hours each, 4 hours in total) | Explore organisational structures and leadership activities |
| 2 curriculum developers ("Jefe UTP", 1 per school) | Semi-structured interviews (2 hours each, 4 hours in total) and field notes | Explore organisational structures and instructional strategies |
| 2 Year 10 language teachers (1 per school) | Semi-structured interviews (3 hours each, 6 hours in total), field notes and 10 Year 10 classroom observations (5 per school) | Explore classroom climate and instructional strategies |

Table 5: The 14 effectiveness-enhancing factors and their components

1. Achievement orientation, high expectations: clear focus on the mastering of basic subjects
   - High expectations (school level)
   - High expectations (teacher level)
   - Records on pupils' achievement
2. Educational leadership: indirect control and influence on the school's primary process
   - General leadership skills
   - School leader as information provider
   - Orchestrator of participative decision-making
   - School leader as coordinator
   - Meta-controller of classroom processes
   - Time for educational/administrative leadership
   - Counsellor and quality controller of classroom teachers
   - Initiator and facilitator of staff professionalization
3. Consensus and cohesion among staff: coherence, consistency and continuity
   - Types and frequency of meetings and consultations
   - Contents of cooperation
   - Satisfaction about cooperation
   - Importance attributed to cooperation
   - Indicators of successful cooperation
4. Curriculum quality/opportunity to learn: the degree to which the implemented curriculum matches the achieved curriculum
   - The way curricular priorities are set
   - Choice of methods and textbooks
   - Application of methods and textbooks
   - Opportunity to learn
   - Satisfaction with the curriculum
5. School climate as an orderly atmosphere: school culture oriented to maintaining orderliness
   - Importance given to an orderly climate
   - Rules and regulations
   - Punishments and rewards
   - Absenteeism and drop-out
   - Good conduct and behaviour of pupils
   - Satisfaction with orderly school climate
6. School climate in terms of effectiveness orientation and good internal relationships
   - Priorities in an effectiveness-enhancing school climate
   - Perceptions of effectiveness-enhancing conditions
   - Relationships between pupils
   - Relationships between teachers
7. Evaluative potential: aspirations and possibilities of a school to use evaluation as a basis for learning and feedback
   - Evaluation emphasis
   - Monitoring pupils' progress
   - Use of pupil monitoring systems
   - School process evaluation
   - Use of evaluation results
   - Keeping records on pupils' performance
   - Satisfaction with evaluation activities
8. Parental involvement: relationship between the actual degree of involvement of parents and the effort displayed by the school to engage them
   - Emphasis on parental involvement in school policy
   - Contact with parents
   - Satisfaction with parental involvement
9. Classroom climate: good, ordered relationships, work attitude and satisfaction with the classroom interactions
   - Relationships within the classroom
   - Order
   - Work attitude
   - Satisfaction
10. Effective learning time: importance and implementation of effective time on task at the class and school level
    - Importance of effective learning time
    - Monitoring of absenteeism
    - Time at school
    - Time at classroom level
    - Classroom management
    - Homework
11. Structured instruction: well-organised and closely monitored lessons
    - Importance of structured instruction
    - Structure of lessons
    - Preparation of lessons
    - Direct instruction
    - Monitoring
12. Independent learning: extent to which pupils are responsible for and make decisions concerning their learning according to their own interests
    - No sub-components
13. Differentiation: how the school deals with differences between pupils and takes care of pupils with learning and behavioural problems
    - General orientation
    - Special attention for pupils at risk
14. Reinforcement and feedback: basic requirement for learning that deals with the rapport with pupils in connection with their achievement
    - No sub-components
