Conference Proceedings

Contents

Foreword v

Keynote papers

Professor John Gardner 1 Assessment for teaching: The half-way house.

Dr Margaret Forster 5 Informative Assessment: Understanding and guiding learning.

Professor Helen Wildy 9 Making local meaning from national assessment data: NAPNuLit.

Professor Patrik Scheinin 12 Using student assessment to improve teaching and educational policy.

Concurrent papers

Prue Anderson 15 What makes a difference? How measuring the non-academic outcomes of schooling can help guide school practice.

Peter Titmanis 20 Reflections on the validity of using results from large scale assessments at the school level.

Professor Helen Timperley 21 Using Assessment Data for improving teaching practice.

Juliette Mendelovits and Dara Searle 26 PISA for teachers: Interpreting and using information from an international reading assessment in the classroom.

Katrina Spencer and Daniel Balacco 31 Next Practice: What we are learning about teaching from student data.

Professor Val Klenowski and Thelma Gertz 36 Culture-fair assessment leading to culturally responsive pedagogy with Indigenous students.

Jocelyn Cook 44 An Even Start: Innovative resources to support teachers to better monitor and better support students measured below benchmark.

David Wasson 47 Large Cohort Testing – How can we use assessment data to effect school and system improvement?

Dr Stephen Humphry and Dr Sandra Heldsinger 57 Do rubrics help to inform and direct teaching practices?

Poster presentations 63

Conference program 65

Perth Convention and Exhibition Centre floorplan 67

Conference delegates 69

Research Conference 2009 Planning Committee

Professor Geoff Masters CEO, Conference Convenor, ACER

Dr John Ainley Deputy CEO and Research Director, National and International Surveys, ACER

Ms Kerry-Anne Hoad Manager, Centre for Professional Learning, ACER

Ms Marion Meiers Senior Research Fellow, ACER

Dr Margaret Forster Research Director, ACER

Copyright © 2009 Australian Council for Educational Research
19 Prospect Hill Road, Camberwell VIC 3124
www.acer.edu.au

ISBN 978-0-86431-821-3

Design and layout by Stacey Zass of Page 12 and ACER Project Publishing
Editing by Maureen O’Keefe, Elisa Webb and Kerry-Anne Hoad
Printed by Print Impressions

Foreword

Geoff Masters
Australian Council for Educational Research

Professor Geoff Masters has been Chief Executive Officer of the Australian Council for Educational Research (ACER) since 1998. Prior to joining ACER, he was a member of a university Faculty of Education. Prof Masters is also Chair of the Education Network of the Australian National Commission for UNESCO, a member of the International Baccalaureate Research Committee, a Past President of the Australian College of Educators, Founding President of the Asia-Pacific Educational Research Association and a member of the Business Council of Australia Education, Skills and Innovations Taskforce. He has a PhD in educational measurement from the University of Chicago and has published several books and numerous journal articles in the field of educational assessment. For more than 25 years, Prof Masters has been an international leader in developing better measures of educational outcomes. He has led work on the practical implementation of modern measurement theory in large-scale testing programs and international achievement surveys. Prof Masters recently investigated options for an Australian Certificate of Education on behalf of the Australian Government and was the author of a recent paper released by the Business Council of Australia, Restoring our Edge in Education: Making Australia’s Education System its Next Competitive Advantage (2007).

Research Conference 2009 is the fourteenth national Research Conference. Through our research conferences, ACER provides significant opportunities at the national level for reviewing current research-based knowledge in key areas of educational policy and practice. A primary goal of these conferences is to inform educational policy and practice.

Research Conference 2009 brings together key researchers, policy makers and teachers from a broad range of educational contexts from around Australia and overseas. It addresses the important theme of assessment and student learning. The conference will explore the information that can be gained from quality classroom and system-wide assessment, and how effective teachers use that information to guide their teaching.

We are sure that the papers and discussions from this research conference will make a major contribution to the national and international literature and debate on key issues related to the effective use of data to support teachers to identify starting points for teaching, diagnose errors and misunderstandings, provide feedback to guide student action, evaluate the effectiveness of their teaching, and monitor individual progress over time.

We welcome you to Research Conference 2009, and encourage you to engage in conversation with other participants, and to reflect on the research and its connections to policy and practice.

Professor Geoff N Masters
Chief Executive Officer, ACER

Keynote papers

Assessment for teaching: The half-way house

John Gardner
Queen’s University Belfast, Ireland

John Gardner is a Professor of Education in the School of Education at Queen’s University, Belfast. His main research areas include policy and practice in education, particularly in relation to assessment. His recent publications include Assessment and Learning (Sage, 2006) and his recent research activities have included studies on assessment and social justice, assessment by teachers (ARIA), and consulting pupils on assessment (CPAL). He has over 100 peer-reviewed publications and has managed research projects exceeding £2.3 million in total since 1990. He is President Elect of the British Educational Research Association, a fellow of the Chartered Institute of Educational Assessors, a fellow of the British Computer Society and a member of the UK Council of the Academy of Social Sciences.

Abstract

This presentation considers the merits and challenges associated with using assessment information to inform and improve teaching. The concept of assessment information is briefly unpacked and considered in terms of a series of questions relating, for example, to the need to develop teachers’ assessment competence; their understanding of error, reliability and validity; and their appreciation of the difference between assessments used for formative purposes and summative judgements. Examples from a variety of international contexts will be drawn upon to illustrate major issues. As the title suggests there is a hint of a reservation in an otherwise strong endorsement of the focus of the conference. Using assessment data appropriately can make for a more effective teacher but it cannot guarantee more effective learning. The argument is made that students must be enabled to play a full part in using assessment information to support and improve their learning. Using assessment information should explicitly be a joint enterprise between teachers and their students.

Introduction

Evidence based teaching is the conscientious, explicit and judicious use of best evidence in making decisions about the education of individual students.

I hope David Sackett and his colleagues (1996, p. 71) will forgive my tweaking of their definition of evidence-based medicine to serve our focus. For me it sums up the manner in which assessment information should be used to inform teaching. It should be done with care and attention, made transparent to all concerned (and most importantly the students), and it should be used appropriately to inform decisions.

The basic premise of this conference, that effective teachers are those who use assessment to guide their teaching, might therefore seem entirely reasonable. As the blurb says, this teacher effectiveness may be associated with such processes as the identification of starting points for teaching, error diagnosis, feedback generation, progress monitoring and evaluating the teaching itself. Who wouldn’t be happy with that? Well me for one, at least not entirely. It’s not that I would object to any of these activities; indeed I would hope that I am committed to them in my own teaching. However, it is only part of the story for me. But more of that later.

Let me begin with the central theme – using assessment information to inform teaching. Not meaning to be overly pernickety, I would nevertheless like to unpack the concept of ‘assessment information’. Arguably all assessment information comes from one of two types of data. The first is score-type data that is objectively generated, for example: 50 per cent for choosing correct answers to half of the items in a multiple choice question (MCQ) test. The sense of ‘objectivity’ in this case is that there is no judgement1 or interpretation in the scoring process. The answers are definitive and the scoring can proceed on a purely secretarial basis, most frequently by optical mark reading machines.

Such tests, judging by commercial sales pitches, are not only ‘highly reliable’, they might even appear to be the predominant form of assessment in education. Of course they are not, constituting, as they do, only a minor proportion of all assessments. The vast majority of assessments of student learning are based on judgement and interpretation. There is, for example, the myriad of evaluations happening by the minute in classrooms. Even numeric scores in non-fixed response assessments (anything from structured questions to essays) are misleadingly ‘objective’, since the individual and aggregate scores, such as 75 per cent or A, will always be subject to the assessor’s judgement to some extent. This lure of objectivity and absolute scores, however tenuous it might be, exerts a strong grip on our society and indeed individual assessors, though the latter will occasionally waver when someone asks how a given score of 59 per cent differs from a grade boundary score of 60 per cent.

The process of making meaning from assessment data – that is, the process of turning it into information that informs appropriate actions – is for the most part then one of expert or at least informed interpretation and judgement. The data may be at different levels, system-wide (e.g. UK-wide: Yelis and PIPS, Durham University; US: NAEP; Scotland: SSA – see key at end of paper) and local authority and schools (e.g. England and Wales: Fischer Family Trust; Scotland: Fyfe Ltd). The information users may be policy-makers looking at trends in standards over time, education authority officials comparing schools or evaluating teachers, school management teams engaged in whole-school evaluations or teachers seeking to benchmark their students’ performance. Among these, the ‘accountability’ usage of assessment data to monitor school performance remains highly problematic (cf. NCLB in the US, national tests – the so-called Sats – in England).

When most people think of assessment data, they probably have in mind the data sets arising from formal assessments as diverse as MCQ test scores and portfolio grades (with their accompanying narratives). The data may be generated and processed from school-based assessments or they may come from assessments that have been developed, administered, scored and reported through an external agency. Teachers may be required by the relevant authorities to use the assessment information, that is the results these various forms of assessment offer, to judge and report student progress and achievement, and make decisions based on this judgement. Alternatively they may simply wish to inform their judgements with the best information available. Regardless of the reasons for teachers’ use of such data, however, there is an important question to be answered. Can we be sure that teachers have sufficient understanding of assessment information and its problematics to enable them to make appropriate and dependable judgements?

I am not sure that we can. For example, it would not be an uncommon experience to find teachers who accept scores and grades on externally administered tests as indisputable and conclusive, trusting to the generally systematic and scrupulous administration of the examinations agency concerned. Among those who know better (about test scores, not test agencies!), some, such as Murphy (2004), have tried to raise the teaching profession’s awareness of the dangers of arriving at inappropriate decisions by not appreciating the approximations and trade-offs in public examination results.

In the assessment community itself, Newton (2009) has also detailed the considerable difficulties even for experts in unpacking these reliability issues. He agrees there is a significant potential for grade misclassification of students in national assessments, but it is difficult to determine its degree. Ultimately, he argues, there is a long overdue need for an open debate on error in assessments and particularly on how much error is too much for any purpose to which the assessment information is to be put. This debate is certainly not one I have witnessed among teachers, or indeed among the wider education-related community. Meanwhile, most teachers continue to use classroom and school-based assessment data to diagnose weaknesses, give feedback and adjust teaching, largely untrammelled by concerns of reliability and validity.

As we ponder the possibility that judgements might be misinformed by inappropriate interpretation of assessment data, several other questions come into focus. For example:

• What level of understanding is there among teachers about the meaning and importance of the two concepts: reliability and validity?

• Is there sufficient understanding of the many purposes assessment can serve (e.g. Newton, 2007 has outlined at least 22) and the caveats that govern the use of any particular type of assessment information?

• Is there sufficient understanding of the difference between assessment information used for summative and formative purposes? Or, and perhaps this is mischievous on my part, is formative assessment merely a handy off-the-shelf set of mini-summative tests turning over US$500 million for publishers every year (à la Education Week, 2008)?

• When the assessments are carried out by teachers, do they have the appropriate skills to undertake them and, perhaps crucially, do they have sufficient time in their classroom schedules?

• Is there sufficient structured training and ongoing professional support (moderation, exemplar banks etc.) for teachers undertaking assessments of their students? The vital importance of these factors is pressed home by a variety of researchers the world over, for example the Assessment Reform Group (ARG) (2006) in the UK and Klenowski (2007) in Australia.

Let me now turn for a moment to the assessments for formative purposes that teachers are making on an ongoing basis in most lessons being taught in most schools. Some may argue that effective questioning, good feedback, shared learning intentions and criteria for assessing them are more pedagogical than assessment-oriented. However many will see the two, pedagogy and assessment, as inextricably linked. Whatever the perception, the recent history of assessment in the UK has not been entirely glorious. Arguably, the growth of interest in the UK in assessment for learning, AfL, has been partly fuelled by the de-skilling of teachers in classroom assessment since the launch of the national curricula in 1988–89 (in England, Wales and Northern Ireland).

Before 1988, curricula were largely school-specific up to age 16 or so, with the only formal curriculum entities being the ‘syllabuses’ of the main externally examined subject areas (e.g. geography, mathematics etc.). The national curricula consigned these to history and the new specifications included some 900+ ‘statements of attainment’ across ‘programmes of study’. These were essentially quasi-progressive criteria against which teachers would assess each student for mastery, judging the ‘level’ of attainment using a best-fit process on a ten-level scale. The abiding image is of a teacher standing over a child with a clipboard and pen, ticking mastery criteria boxes. The detriment to teaching and learning, as these assessments preoccupied the teachers, was so obvious and predictable that it precipitated the creation of the Assessment Reform Group in 1989.

Stobart (2008, p. 156) criticises the reductionist approach that underpins this type of assessment system as being ‘… increasingly mechanistic, as learners are encouraged to master small, detailed chunks of curriculum. This explicitness, intended to make the learning clear to the learner, may actually reduce autonomy rather than encourage it’. He quotes the notion of ‘criteria compliance’ (Torrance et al., 2005) which leads to the danger of the assessment becoming the learning (in the sense of actually displacing learning). If teachers do not have a sufficient grasp of the purposes to which assessment information can be validly put, we cannot be confident that it is any less likely to distort teaching actions and learning intentions. And it is instructive to recall that it was not the distorting effect on teaching and learning that finally brought down the tick-box regime; it was workload.

That said, the new teacher assessment-based approach in England – Assessing Pupil Progress, APP (National Strategies, 2009) – has multi-criteria and clipboard-like features. However, it has considerable merit in focusing more appropriately on engaging the students actively in their learning and assessment activities. The APP’s teacher–learner classroom assessment interactions have a dual purpose. The first is to support learning through the immediacy of an assessment for learning manner. The second is to contribute to summative judgements for reporting etc., through consideration of the wealth of assessment data gathered over a significant period.

Conclusion

And that brings me to the last questions for now, and my return to the incomplete story I alluded to earlier. The questions are:

• Should teachers be concentrating on using assessment data to support, adjust and improve their teaching? Or should they be focusing instead on the use of assessment data to improve their students’ learning?

For some this may be a subtlety too far but for me it is fundamental. I accept, of course, that improved teaching should lead to improved learning, but improved teaching is the ‘half-way house’ in my title. If we conceptualise our efforts always in the context of improving teaching we won’t do any harm. That’s for sure. But if we use all available assessment data formatively and successfully in support of learning, recognising that it is only the learners who can learn – good teachers or improved teaching cannot do it for them – then and only then do we come all the way home.

1 I accept that judgement can be argued to be largely objective on the basis that it is generally evidence-informed and not the result of the subjective feelings, attitudes and beliefs of the assessor. However, I do wish to distinguish between a balanced interpretation of the available evidence leading to a judgement and the prescribed attribution of correctness or incorrectness to fixed response items, resulting in an indisputable score.

References

ARG (2006). The role of teachers in the assessment of learning. Assessment Reform Group. Retrieved June 25, 2009, from http://www.assessment-reform-group.org/ASF%20booklet%20English.pdf

Education Week (2008). Test industry split over formative assessment. Education Week, September 16th. Retrieved June 25, 2009, from http://www.edweek.org/ew/articles/2008/09/17/04formative_ep.h28.html

Klenowski, V. (2007). Evaluation of the effectiveness of the consensus-based standards validation process. Department of Education, Training and the Arts.

Murphy, R. (2004). Grades of uncertainty: Reviewing the uses and misuses of examination results. London: Association of Teachers and Lecturers. Retrieved June 25, 2009, from http://www.atl.org.uk/Images/Grades%20of%20uncertainy.pdf

National Strategies (2009). Assessing pupils’ progress. The National Strategies, London: Department for Children, Schools and Families. Retrieved June 25, 2009, from http://nationalstrategies.standards.dcsf.gov.uk/primary/assessment/assessingpupilsprogressapp

Newton, P. E. (2007). Clarifying the purposes of educational assessment. Assessment in Education: Principles, Policy & Practice, 14(2), 149–170.

Newton, P. E. (2009). The reliability of results from national curriculum testing in England. Educational Research, 51(2), 181–212.

Sackett, D. L., Rosenberg, W. M., Gray, J. A., Haynes, R. B., & Richardson, W. S. (1996). Evidence based medicine: what it is and what it isn’t. British Medical Journal, 312, 71–72.

Stobart, G. (2008). Testing Times: The uses and abuses of assessment. Abingdon: Routledge.

Torrance, H., Colley, H., Garratt, D., Jarvis, J., Piper, H., Ecclestone, K., & James, D. (2005). The impact of different modes of assessment on achievement and progress in the learning and skills sector. London: Learning and Skills Research Centre. http://www.itslifejimbutnotasweknowit.org.uk/files/AssessmentModesImpact.pdf

Key: NAEP: National Assessment of Educational Progress (US); NCLB: No Child Left Behind (US); PIPS: Performance Indicators in Primary Schools (UK); ‘Sats’: Standard assessment tasks (England); SSA: Scottish Survey of Achievement; Yelis: Year 11 Information System (UK)

Informative Assessment: Understanding and guiding learning

Margaret Forster
Australian Council for Educational Research

Margaret Forster is the Research Director of the Assessment and Reporting Research Program at the Australian Council for Educational Research (ACER). Dr Forster has extensive experience in the area of assessment and reporting and works as a consultant nationally and internationally. She has direct experience in the development of support materials for teachers and policy makers. She conceptualised and co-authored the first Developmental Assessment Resource for Teachers (DART English Upper Primary), and is co-author of the ACER Attitudes and Values Questionnaire, and the Assessment Resource Kit (ARK) materials. She wrote the introductory overview to the Discovering Democracy Assessment Resources, and prepared the introductory materials for the Curriculum and Standards Framework II information kit that was distributed to all schools in Victoria (Progress Maps: A Teacher’s Handbook). Dr Forster has a particular research interest in the collection and use of achievement data to improve learning. She co-directed the National School English Literacy Survey (NSELS) and co-authored the NSELS report. She has written a number of general research-based publications on the reporting of student achievement, including A Policy Maker’s Guide to International Achievement Studies, and A Policy Maker’s Guide to Systemwide Assessment Programs. Recent national consultancies on the revision and implementation of assessment and reporting frameworks include work with the Western Australian Curriculum Council and the Victorian Curriculum and Assessment Authority. Recent international consultancies include work for The World Bank in India; the Peruvian Ministry of Education; AusAID in Papua New Guinea; UNICEF; the Scottish Executive, and the Hong Kong Curriculum Development Institute.

Abstract

In the last decade a good deal of attention has focused on distinguishing between assessment purposes—in particular between summative assessments (assessments of learning) and formative assessments (assessment for learning). This presentation explores informative assessment. Informative assessment does not make a distinction between the contexts of assessment or their stated primary purposes. Rather, it focuses on how teachers and students make use of assessment information to both understand and improve learning. Informative assessment brings together research underpinning ‘assessment for learning’ with research on high performing school systems, highly effective teachers and how students learn. Two perspectives on informative assessment are explored: the teaching perspective and the learning perspective. Research evidence is detailed and challenges highlighted.

Introduction

There are many different contexts for the assessment of student learning, from teachers’ informal classroom observations to high-stakes entrance tests and certification examinations. Within these contexts, much has been written about distinctions between assessment purposes. In particular, attention has focused on the distinction between summative assessments (assessments of learning) for reporting students’ levels of achievement, and formative assessments (assessment for learning) where achievement data are used intentionally to feed into the teaching cycle.

As the National Numeracy Review Report (HCWG, 2008) noted, many educators see a clear dichotomy between these two roles and argue, for example, that system-wide tests have no diagnostic role resulting in the improvement of student outcomes (e.g. Shepard, 2000). Others, such as Masters et al. (2006), see the roles as complementary, and argue that what matters is the quality of the data and how data from assessments are used.

This presentation explores informative assessment. Informative assessment does not make a distinction between the contexts of assessment or their stated primary purposes. Informative assessment focuses on how teachers and students make use of assessment information to understand and improve learning. Informative assessment brings together research underpinning ‘assessment for learning’ with research on high performing school systems, on how students learn and highly effective teachers. Two perspectives on informative assessment are explored: the teaching perspective and the learning perspective.

The teaching perspective

Research studies confirm highly effective teachers’ skills are underpinned by a deep understanding of how students learn and how they progress. Highly effective teachers are aware of common student misunderstandings and errors; they are familiar with learning difficulties and appropriate interventions; and they ensure that all students are appropriately engaged, challenged and extended, whatever their level of achievement (Barber & Mourshead, 2007). What does research tell us about how effective teachers use assessment to inform their practice?

Effective teachers recognise that learning is most likely to occur when a student is presented with challenges just beyond their current level of attainment, in what Vygotsky (1978) referred to as the ‘zone of proximal development’. This is the region of ‘just manageable difficulties’, where students can succeed with support. Effective teachers understand, therefore, the importance of first determining students’ current levels of attainment. As Ausubel wrote in 1968, the single most important factor influencing learning is what the learner already knows. If educators can ascertain this, they can teach accordingly.

Effective teachers administer assessments that reveal how students think rather than what they know, the quantity of work, or the presentation. They are interested in eliciting students’ pre-existing, sometimes incomplete understandings, and their misconceptions in order to identify appropriate starting points for personalised teaching and learning. This intention demands sophisticated assessment techniques that are able to establish, for example, the mental models that students have developed and how well they understand when a principle applies and when it does not.

In essence, effective teachers focus on delivering appropriate learning opportunities to individuals rather than to the group of learners to which the individual belongs (Bransford, Brown & Cocking, 2000). This use of assessment to guide the teaching of individuals contrasts with the more common focus on establishing how much of what teachers have taught has been learned (Fullan, Hill & Crévola, 2006).

The learning perspective

Research studies confirm that learners learn best when they understand what they are trying to learn, and what is expected of them; and when they are given regular feedback about the quality of their work and what they can do to make it better (Black & Wiliam, 1998). Meta-analytic studies show that timely and useable feedback is one of the most powerful ways of improving student achievement (Walberg, 1984; Hattie, 2003) and that feedback is most useful if it supports the development of deeper understandings (Bransford, et al., 2000).

What does research tell us about how students respond to assessment information? Assessment has a profound influence on students’ motivation and self-esteem, both of which are crucial influences on learning. A strong emphasis on marking, grading and comparing students with each other can demoralise less successful learners.

Research is clear that if the feedback is to be effective, it must be focused on what the individual student needs to do to improve (i.e. it must be task-involving) rather than on the learner and her or his self-esteem (i.e. ego-involving) (Wiliam, 1998). If students are provided with a score or a grade on an individual piece of work, they will attend to that, even if they are provided with descriptive feedback as well. If we want students to attend to the feedback teachers provide, the feedback should include written comments and not be based solely on a score or grade.

Research confirms that effective learners see themselves as owners of their learning; they understand learning intentions and criteria for success. In essence, they have a confident view of themselves as ongoing learners who are capable of making progress (Wiliam & Thompson, 2007).

Bringing perspectives together: Underlying understandings

Most teachers and students attend schools that are structured according to a factory assembly line model based on the assumption that a sequenced set of procedures will be implemented as a child moves along the conveyor belt from Year 1 to Year 12 (Darling-Hammond, 2004).

This model assumes that, although there is some variability in students’ learning in any one year level, this variability can be accommodated within a one-size-fits-all, age-based curriculum. However, research tells us that children begin school with very different levels of developmental and school readiness. By Year 5, the top 10 per cent of children in reading are at least five years ahead of the bottom 10 per cent of readers (Masters & Forster, 1997a). By the end of schooling in the UK, the highest achieving students in mathematics are approximately six years ahead of the lowest achievers (Harlen, 1997).

How do teachers and students marry this reality with the evidence? We know that learning is enhanced when teachers identify and work from individuals’ current knowledge, skills and beliefs rather than working from what we expect them to know and understand given their age or year level; and that learning is enhanced when students have the opportunity to learn at a level appropriate to their development needs. How do teachers determine and monitor where students have come from and where they are going to?

Fundamental to high quality teaching, assessment and learning is an understanding of what it means to progress in an area of learning—the progress or development of learning across the years of school. Indeed, the term ‘development’ is critical to understanding the changes in students’ conceptual growth. As Bransford writes, ‘cognitive changes do not result from mere accretion of information, but are due to processes involved in conceptual reorganisation’ (Bransford, et al., 2000, p. 234).

Effective teachers and learners have a shared understanding of what it means to progress, including an understanding of what is valued (e.g. the learning intentions and the criteria for success).

Since the 1990s, these shared understandings have been facilitated by well-constructed learning continua, ‘progress’ maps (Masters & Forster, 1997b) or ‘learning progressions’, that are of increasing interest outside of Australia (e.g. National Research Council, 2001; Forster, in press).

Maps of this kind describe and illustrate the nature of development in an area of learning, illustrating for teachers and students the typical path of learning and providing a frame of reference for monitoring individual progress. Quality maps are constructed from empirical observations of how learning typically advances, and incorporate research-based pedagogical content knowledge accompanied by information about the kinds of difficulties and misconceptions commonly found among learners at various stages in their learning. They support teachers to establish where students are in their learning, where they are going and how to get there; and to decide appropriate instruction based on the individual student’s needs. Examples of progress maps include the developmental continua of the First Steps program (Annandale et al., 2003).

In summary

Research indicates that teachers’ and students’ capacity to improve learning through assessment depends on a few key factors for teachers:

• identifying and working from individuals’ current knowledge, skills and beliefs despite the age-grade structure of schooling

• assessing not just specific content that has been learned but the quality of students’ thinking, including the depth of conceptual understanding—and using a range of sophisticated assessment techniques to do so

• adjusting teaching to take account of the results of assessment

• providing effective feedback to pupils; that is, feedback that assists students to recognise their next steps in learning and how to take them, and that assists them to become involved in their own learning.

The key factor for teachers and students is having a shared understanding of development across the years of schooling, supported in part by the use of progress maps.

References

Annandale, K., Bindon, R., Handley, K., Johnston, A., Lockett, L., & Lynch, P. (2003). First Steps: Linking assessment, teaching and learning: Addressing current literacy challenges. Port Melbourne, VIC: Rigby Heinemann.

Ausubel, D. P. (1968). Educational psychology: A cognitive view. New York: Holt, Rinehart & Winston.

Barber, M., & Mourshead, M. (2007). How the world’s best-performing school systems come out on top. London: McKinsey & Company. Retrieved June 26, 2008, from http://www.mckinsey.com/clientservice/socialsector/ourpractices/philanthropy.asp

Black, P. J., & Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education, 5(1), 7–74.

Bransford, J. D., Brown, A. L., & Cocking, R. R. (2000). How People Learn: Brain, mind, experience and school. Washington, DC: National Research Council.

Darling-Hammond, L. (2004). Standards, accountability and school reform. Teachers College Record, 106(6), 1047–1085.

Forster, M. (in press). Progression and Assessment: Developmental assessment. In B. McGaw, P. Peterson & E. Baker (Eds.), The International Encyclopedia of Education (3rd ed.). Chatswood, NSW: Elsevier.

Fullan, M., Hill, P. W., & Crévola, C. (2006). Breakthrough. Thousand Oaks, CA: Corwin Press.

Harlen, W. (1997). Making Sense of the Research on Ability Grouping. Edinburgh: The Scottish Council for Research in Education.

Hattie, J. (2003, October). Teachers Make a Difference: What is the research evidence? Paper presented at the ACER Research Conference ‘Building Teacher Quality: What does the research tell us?’, Melbourne, VIC.

Human Capital Working Group, Council of Australian Governments. (2008). National Numeracy Review Report. Commonwealth of Australia.

Masters, G. N., & Forster, M. (1997a). Mapping Literacy Achievement: Results of the 1996 National School English Literacy Survey. Canberra: Department of Employment, Education, Training and Youth Affairs.

Masters, G. N., & Forster, M. (1997b). ARK Progress Maps. Camberwell: Australian Council for Educational Research.

Masters, G. N., Forster, M., Matters, G., & Tognolini, J. (2006). Australian Certificate of Education: Exploring a way forward. Canberra: Commonwealth of Australia, Department of Education and Training.

National Research Council. (2001). Knowing what students know: The science and design of educational assessment. J. Pellegrino, N. Chudowsky, & R. Glaser (Eds.), Committee on the Foundations of Assessment, Board on Testing and Assessment, Center for Education, Division of Behavioral and Social Sciences and Education. Washington, DC: National Academy Press.

Shepard, L. A. (2000). The role of assessment in a learning culture. Educational Researcher, 29(7), 4–14.

Vygotsky, L. (1978). Mind in society: The development of higher psychological processes (M. Cole, V. John-Steiner, S. Scribner & E. Souberman, Eds. & Trans.). Cambridge, MA: Harvard University Press.

Walberg, H. J. (1984). Improving the productivity of America’s schools. Educational Leadership, 41(8), 19–27.

Wiliam, D. (1998, September). Enculturating learners into communities of practice: Raising achievement through classroom assessment. Paper presented at the European Conference on Educational Research, Ljubljana, Slovenia.

Wiliam, D., & Thompson, M. (2007). Integrating assessment with instruction: What will it take to make it work? In C. Dwyer (Ed.), The future of assessment: Shaping teaching and learning. Mahwah, NJ: Lawrence Erlbaum Associates.

Making local meaning from national assessment data: NAPNuLit

Helen Wildy
University of Western Australia

Helen Wildy is Professor and Dean of the Faculty of Education at The University of Western Australia. Formerly a mathematics teacher, she taught in government and independent schools in Western Australia and Victoria. She currently conducts research and supervises doctoral and masters students in a range of leadership and school improvement topics. She has been chief investigator or co-chief investigator in research projects worth more than $4 million since 2000. She has published widely in refereed national and international journals. Since 2000, she has worked with school sectors in Western Australia on projects to present national assessment data in formats that are accessible to school leaders and teachers. She is Director of Performance Indicators for Primary Schools (PIPS) Australia, a literacy and numeracy assessment program for students entering school, used by over 800 schools throughout Australia.

Abstract

The first part of this paper provides a background to the research, starting in 2000 with the DEST funding for what has become known as the Data Club for the Western Australian Department of Education and Training, through to the current activity funded by the Western Australian Catholic Education Office and the Association of Independent Schools of WA. Each project’s brief, design and the scales used are outlined. The second part of this paper demonstrates the representations of NAPLAN data used in 2008 and also the ways in which the 2001–2007 WALNA data were displayed. Finally, this paper deals with uses made by classroom teachers, curriculum leaders, school principals, and education systems for both accountability and school improvement. It concludes by raising some questions about applications of these kinds of analyses for collaborative reporting on national partnerships.

Introduction

As early as 1999, it was clear that schools in Western Australia, at least government schools, were not the slightest bit interested in national assessment data. At that time, Bill Louden and I had begun what became known as the Data Club. Bill had negotiated with the Department of Education, Training and Youth Affairs (DETYA) and the WA Department of Education to fund a project titled ‘Developing schools’ capacity to make performance judgements’. Located at Edith Cowan University in Western Australia, this collaboration was set up as a pilot project which aimed to:

• advise on ‘value added’ and ‘like school performance’ measures suitable for schools,

• develop data displays and self-evaluation strategies,

• test the effectiveness of these strategies with school communities,

• trial these strategies with individual schools to build their capacity to interpret and use benchmark performance data, and

• report on best practice in the use of benchmarking data in school self-assessment.

If this sounds ambitious, there is more! The project was based on the assumption that schools would use the 1998 and 1999 benchmark data to make a series of performance judgements: between 1998 and 1999 cohorts within the school; between the 1998 and 1999 cohorts; between school cohorts and all students; and between schools. It was assumed that by 2000 each school would be in a position to demonstrate growth in student performance between Year 3 and Year 5, and compare this growth with the growth of student performance in other schools, throughout the state. Furthermore, the initial project promised to not only work with schools but also to meet with schools, school staffs and school communities to explain the analyses. We undertook to improve the skills of school leaders, teachers and communities to interpret benchmark data. We have come a long way since 1999 and we have learnt a great deal. We might even have learnt some lessons that are applicable to the expectations of gain, improvement and growth in student performance under the current National Partnership funding arrangements.

We invited each school to share its 1998 and 1999 benchmark data with us, and to send two school leaders to participate in a half-day workshop, on the understanding that a sample of about 20 schools would respond. We would select for our trial those Districts with the largest representation of schools. In the event, 200 schools responded, including two Districts with 100 per cent response rates. Having decided to expand the trial to take all applicants we then started to collect their data. ‘What data?’ was the most common response. Although the data had been sent to each school in hard copy, few schools could locate theirs but happily paid for reprints. Our first lesson was that the data had little meaning and even less value to those 200 schools keen to join our pilot. The second lesson for us was that the data quality was uneven. It was clear that schools had not taken the tests seriously – large gaps in cohorts; patches of extremely low scores suggesting students were poorly supervised during the tests or given too little time to complete many items; and some sets of outrageously high scores suggesting rather too much teacher ‘support’ during the tests. However, the third lesson is one that I continue to learn now, a decade later – the variable capacity of school personnel to engage with the data in a thoughtful way.

From 2000 to 2003, the Data Club was funded by DETYA/DEST and the WA Department of Education and run from Edith Cowan University by Louden and Wildy, with technical support from Jessica Elderfield. Over these three years, the number of schools registered grew to 510, representing over 80 per cent of schools with primary-aged students in the government sector. The materials, initially paper-based, became disk-based, and later web-based. Each year, workshops were run in metropolitan and regional centres, as well as via satellite broadcasts and interactive video conferences. The workshops were conducted by Louden and Wildy, and held in March, April and May. A key design element was that schools only received their analysed Western Australian Literacy and Numeracy Assessment (WALNA) data when they participated in the workshops. Confidentiality was another key element: schools voluntarily joined the Data Club and submitted their data for inclusion in the analyses. Schools were coded and no materials carried identifying names.

In November 2001 an evaluation of the impact of the Data Club was conducted by Jane Figgis and Anne Butorac of AAAJ Consulting Group. Using telephone interviews with principals from a random sample of 30 of the participating schools, Figgis and Butorac examined why principals signed up for the Data Club; the use to which the WALNA data was put; the professional development provided by the Data Club; and related issues such as confidence in the assessment regime. Amongst the findings of this evaluation were these points: principals joined because they wanted to compare their school with like schools, and to track their students over time; they wanted to make use of the WALNA data but did not know what the data meant; and the workshops gave them time to devote to reflecting on the data. Many principals spoke of how data were used and the collaborative processes they were developing in schools to share their understandings. Others spoke of looking at the data ‘squarely in the eye’ and accepting that there was something relevant to them and their school. Figgis and Butorac reported on the participants’ appreciation of the workshops as professional development, concluding that: ‘There was not a single principal who felt that he or she did not learn what was intended for them to learn. The outcome was that they wanted more – more for themselves and for their teachers.’ The reviewers ended their report with: ‘The Data Club has begun very well, but its role has only just begun. Schools recognise that there will be much more for them to learn about using the data over the next few years. And they will want reliable help from independent experts. The Data Club has provided those services to everyone’s satisfaction – indeed, it seems to have exceeded expectations.’ I have quoted heavily from this report because of its bearing on what was to follow.

At the end of 2002, I was appointed to the staff of Murdoch University’s School of Education. More importantly, the WA Department of Education resolved that henceforth the Data Club would operate from within its ranks. One last round of analysis was carried out by the original team. The following year, in 2003, the Department’s internal team developed some disks and offered them to all schools without the requirement of attending workshops, which were run by District office personnel. In the first year of using this system (2004), it was reported that even greater numbers of principals participated in workshops than previously. I believe that, since that time, Data Club analyses have been carried out by DET staff and disks distributed without workshops, and this has been supplemented with a First Cut analysis focused on the achievement of targets.

Although my involvement with the government sector ended by mid-2003, I then started a new venture with the Catholic Education Office of Western Australia (CEOWA) at the invitation of Gerry O’Keefe. With the guidance of Professor David Andrich, I assembled the NuLit team, comprising Dr Barry Sheridan, programmer, and Dr Annette Mercer, project manager and data analyst, which has continued to the present. For each of the five years, 2004–2008, NuLitData has run from Murdoch University for the CEOWA. For the four years, 2005–2008, we have run a parallel project for the Association of Independent Schools of WA (AISWA). NuLitData CEOWA involved all 159 schools in that sector and NuLitData AISWA involved nearly all 158 schools. The NuLitData model was similar to the Data Club although the programming was vastly more sophisticated than that used in the Data Club. Throughout this period, Monitoring Standards in Education at Year 9 (MSE9) assessment data were added to the Years 3, 5, and 7 WALNA data, so secondary school principals and curriculum leaders joined the workshops. (Until recently, Year 7 was the final year of primary schooling in WA.) Linking Year 7 students’ data with their later performance as Year 9 students was challenging, both because we could not access data across sectors and also because of the difficulty of creating a ‘virtual’ Year 7 for each secondary school from the numerous (as many as 43) feeder schools. Workshops were conducted by Wildy and Mercer, during February, March and April each year.

By 2009, I had moved to The University of Western Australia (UWA) and all materials for this year’s distributions were to be re-badged and the operation relocated. However, more than that was to change. For the first time, we were to deal with NAPLAN data and we wondered whether to attempt to continue to present the longitudinal 2001–2007 WALNA and MSE data. In the event, we decided that we would do both. We set up new displays for the 2008 NAPLAN data in a program we called NAPNuLit, building on the concept of bands and incorporating subgroup data (Indigenous, LBOTE, Sex) as we had for all the NuLit displays. However, we introduced new box-plot displays to make use of the percentile data available nationally. In deciding that 2008 data would be the beginning of the new disks, we realised, in collaboration with our CEOWA and AISWA partners, that one year’s data did not make much of a story, even though the new concepts were to be used. So we continued the NuLit analyses, and added 2008 NAPLAN Reading and Numeracy data adjusted back to link with the WALMSE scale we used for the WALNA and MSE data. Now using data from 2001 to 2008, we displayed on a single graph the means from eight years of Reading, and then of Numeracy, for Years 3, 5, 7 and 9. For the first time each school could examine its long-term performance throughout the school for a given test. This most powerful overview of school performance allowed principals and other leaders to interrogate the performance of year groups over time – noticing the extent of their natural fluctuations, looking for signs of upward movement, and all the while questioning the impact of interventions and the effects of organisational and cultural changes.

Throughout the five years of working with the CEOWA, we designed workshops linking NuLitData and NAPNuLitData with school improvement processes. For the first couple of years, the focus was entirely on understanding the data displays. Each year, participants examined their school’s data in terms of overall means compared with the state and with like schools, then shapes of distributions through box and whisker plots – from subgroups to individuals, then to individual student change over time, and then to value added measures. Participants learnt how to interpret standardised residuals plotted around a mean of zero, with expected performances lying between +1 and –1. They noticed that, over the eight-year period, most of them performed as expected and that wild deviation was usually accounted for by very small numbers or early aberrant data. They understood that, while the school as a whole might be ticking along nicely, they could identify the impact of interventions on subgroups (for example, low performing students) and also on individuals. Participants also learned how to construct conversations they could pursue back at school with groups of teachers to explore and extend others’ interpretation of the data. More recently, all these learnings were linked specifically to school goals and strategies. Now the challenge is to develop the skills to marshal sets of data to back up arguments and to write coherently for different audiences. These were our goals in our 2009 workshops with CEOWA and AISWA schools.

Conclusion

In conclusion, I refer back to the words of Figgis and Butorac in their 2001 report on the impact of the Data Club and apply these to our subsequent work with the national assessment data. I believe that this ‘has begun very well, but its role has only just begun. Schools recognise that there will be much more for them to learn about using the data over the next few years.’ It is a decade since we started this work and our efforts have been focused on school leaders. We have not even begun to work with teachers or school communities. That, I believe, is now in the hands of the school leaders.

Using student assessment to improve teaching and educational policy

Abstract

International and national assessment results, as well as the case of the Finnish comprehensive school, are used to discuss strategic questions of educational policy, teacher education and teaching.

Patrik Scheinin
University of Helsinki, Finland

Patrik Scheinin is a Professor of Education and the Dean and former Vice Dean of the Faculty of Behavioural Sciences at the University of Helsinki. Among other administrative tasks he is Vice Chair of the Board of the Open University of Helsinki, a board member of the Helsinki Area Summer University, Vice Chair of the Central Campus Library Committee of the University of Helsinki, and a former Vice Director of the Department of Education. He received the University of Helsinki Scholarship for Especially Talented Young Researchers for two three-year periods, first in 1986 and again in 1989.

Patrik is Vice Director and a founding member of the Centre for Educational Assessment, a member of the steering group of the Finnish PISA Project, and a former board member of the Helsinki University Collegium for Advanced Studies. He is an expert and referee for several scientific journals, foundations and conferences, and has been a scientific expert for projects of the Finnish National Board of Education as well as of the Swedish National Agency for Education. He is a member of several national and international research associations. His research interests are cognitive abilities and thinking skills, self-concept and self-esteem and their structure and development, educational interventions promoting the development of cognitive abilities and personality, educational assessment and evaluation research, and teaching and learning in higher education.

Introduction

Are students prepared to meet the challenges of the future? Do they have the knowledge and skills that are essential for full participation in society? These questions are central from the viewpoint of educational policy. The Programme for International Student Assessment (PISA) is an internationally standardised assessment, jointly developed by participating OECD countries and administered to 15-year-olds in schools. The domains of PISA are mathematical literacy, reading literacy, scientific literacy and, since 2003, problem solving. Students have to understand key concepts, master certain processes, and apply knowledge and skills in different situations, rather than show how well they have mastered a specific school curriculum. This makes comparisons between countries possible and fruitful.

The PISA data show that, at the country level, the correlation between performance in reading, mathematical and scientific literacy is very high. We should, therefore, look for general rather than country-specific or subject-specific explanations for why some countries do better than others. First, money does not seem to be the answer: countries with top results make relatively average investments in education. The influence of socioeconomic factors, especially parental education, is also relatively small. In other words, the students' abilities are what count. The results also show that the average yearly number of hours spent in school correlates negatively with PISA results at the country level. This indicates that time spent in school is less important than the quality of the instruction. (A small numerical illustration of such country-level correlations appears at the end of this paper.)

Much has also been made of students' attitudes towards school. A closer analysis reveals that no country has managed to create a school system that produces excellent results combined with a very positive school climate. Maybe we should not be so concerned with maximum happiness for everybody, all of the time: a serious but positive school atmosphere seems to be more appropriate for learning.

There are two types of school systems with excellent or good results: many of the Asian and central European systems, with large between-school differences, selection, testing and tracking, on the one hand, and the typically Scandinavian model of comprehensive schools, with small between-school differences, on the other. The countries with the best PISA results do, however, all manage to keep the between-student variation relatively low. In other words, the weaker students are not left behind. What makes the Finnish school system interesting from the perspective of educational policy is that it is the only comprehensive school system with top PISA results.

The success of Finnish students in PISA has transformed our understanding of the quality of the work done in our comprehensive schools. The performance of Finnish students in PISA seems to be attributable to several factors. Firstly, the role of schooling in Finnish history and cultural heritage is remarkable. Education of the people was used as a strategy in creating the nation; thus, teaching has been and still is a highly regarded profession. Secondly, although Finland is a poor country as far as natural resources go, the educational system has been built to achieve a high general level and quality of education. Thirdly, a nationally coordinated curriculum is the basis of teacher training and tends to make work at school more systematic. It makes the knowledge and skills required for secondary education and adult life in Finland explicit, and it helps writers of textbooks match their content and approach to the curriculum and the teaching methods used in the comprehensive school. Fourthly, research-based teacher education at the masters level ensures a high standard of applicants for teacher training, which in turn enables a demanding standard to be set within teacher training itself. Finally, education is generally seen as a road to social advancement – and the comprehensive school makes it a quite realistic option for most students, regardless of their background. The students and their parents appreciate this. It also means the opportunity of further education is extended to the brightest potential students of the nation. These are key elements in the social stability and economic success of a democratic society like Finland. On the other hand, the choices made concerning schooling and career are still far too stereotypical and adhere closely to the example set by the parents, which is not optimal from the vantage point of national educational policy.
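The country-level relationships described above – strong positive correlations between the literacy domains, and a negative correlation between yearly instruction hours and results – can be made concrete with a short calculation. The sketch below is illustrative only: the country figures are invented placeholders, not actual PISA data, and the function is simply the standard Pearson correlation.

```python
# Illustrative sketch: country-level correlations of the kind discussed above.
# The figures are invented placeholders, not real PISA results.
from math import sqrt

def pearson(xs, ys):
    """Standard Pearson correlation between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# One row per country: (reading mean, mathematics mean, yearly instruction hours)
countries = [
    (546, 544, 760),  # placeholder values only
    (528, 533, 810),
    (507, 510, 900),
    (495, 498, 950),
    (480, 474, 980),
]
reading = [c[0] for c in countries]
maths = [c[1] for c in countries]
hours = [c[2] for c in countries]

print("reading vs mathematics:", round(pearson(reading, maths), 2))  # strongly positive
print("hours vs reading:", round(pearson(hours, reading), 2))        # negative
```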

Concurrent papers

What makes a difference? How measuring the non-academic outcomes of schooling can help guide school practice

Abstract

The paper uses examples from the Melbourne Declaration on Educational Goals for Young Australians (2008) as well as from work completed by ACER to reflect on, explore and make recommendations to mediate the challenges of measuring and improving the non-academic outcomes of schooling. The paper outlines common difficulties encountered when defining non-academic outcomes and establishing mechanisms to measure outcomes within schools, and explores some of the misconceptions that are commonly associated with attempts to improve non-academic outcomes of schooling. In each case the challenges and misconceptions are accompanied by recommendations for approaches and strategies that can be used to address them.

Prue Anderson
Australian Council for Educational Research

Prue Anderson is a Senior Research Fellow in the Assessment and Reporting Research Program at the Australian Council for Educational Research. Prue has over ten years of experience in the development of school-based assessments in the areas of English literacy, interpersonal development and personal learning through her work for ACER. She has developed assessments and managed large system-level school assessment projects for Australian state education departments and overseas organisations, including work in Papua New Guinea, the Philippines, Brunei and the USA. She has developed assessments of the social outcomes of schooling for the Western Australian Department of Education and Training, and assessments of interpersonal learning, personal learning and thinking for the Victorian Curriculum and Assessment Authority. She has also developed interpersonal and personal enterprise assessments for an outdoor education camp program. Prue is currently the director of the International Schools Assessment (ISA), a test of reading literacy, mathematical literacy and writing administered to almost 50,000 students from over 230 international schools. Prue was a primary teacher for nine years and lectured in primary teacher education for four years.

Julian Fraillon, Australian Council for Educational Research, was co-author of this paper.

Introduction

This paper reflects on what the process of measuring the non-academic outcomes of schooling has taught us about the ways in which these essential outcomes of schooling can be conceptualised and managed in schools. The paper includes both reflections on the challenges that we face when attempting to define, measure and improve non-academic outcomes, and recommendations for actions (in very general terms) that schools can take to better measure and influence the non-academic outcomes of schooling.

The challenges

The overarching challenge: Is there such a thing as a 'non-academic outcome'?

A consistent challenge in measuring the non-academic outcomes of schooling has been the step of deciding what they are and whether they are actually 'non-academic' at all. This is both a grossly simple and a deeply complex problem that can, for example, be illustrated by considering aspects of the most recent formal statement on Australia's national goals of schooling, the Melbourne Declaration on Educational Goals for Young Australians (2008).1

Goal 2 of the Melbourne Declaration states that:

All young Australians become successful learners, confident and creative individuals, and active and informed citizens.

Included in the articulation of this goal are, for example, the development of:

successful learners who
• are able to plan activities independently, collaborate, work in teams and communicate ideas

confident and creative individuals who
• have a sense of optimism about their lives and the future

and active and informed citizens who
• act with moral and ethical integrity.
(Melbourne Declaration, 2008)

1 Note that any vision or explicit document on schooling could have been used for this example, but the Melbourne Declaration has been selected because of its currency and national profile.

Each of these examples is arguably a combination of both academic and non-academic outcomes of schooling. Planning, collaboration, teamwork and communication all require the motivation to engage, some cognitive understanding of the tasks, and the cognitive and emotional capacity to self-monitor and regulate behaviour. Similarly, optimism and moral and ethical actions can require a complex combination of cognitive interpretation of context as well as motivation to think and act positively. Optimism, for example, is a desirable attribute, but not if it is so high as to restrict students' capacity to critically appraise information in context – we may hope our students feel positively about the future of Australia as a country united in diversity, but we would also hope that students would understand that such positive outcomes are not naturally inevitable and may require effort and commitment. One lesson we have learned over time is that the development of non-academic outcomes requires thought and self-reflection (both of which have an academic element) and that this has an important influence on the way schools should conceptualise their approach to them. This will be further discussed under Challenge 3.

Challenge 1: Defining the outcome

The academic outcomes of schooling are typically defined and explicated in curriculum documents and supporting materials at both system and school level. By contrast, the non-academic outcomes of schooling are typically less well described and, even in cases where they are extremely well defined, such as by the Values Education for Australian Schooling project or in some State and Territory curriculum documents, there is still the expectation that schools will largely be responsible for refining and operationalising the defined concepts in their own local settings. The second example from the Melbourne Declaration, 'confident and creative individuals', can be used to illustrate some of the challenges schools have in measuring non-academic outcomes. Firstly it is necessary to define what it means to be confident and creative. In doing this, questions may arise such as: Should confidence and creativity be considered separately? (Most likely, of course, the answer is 'yes'.) How can we define confidence and creativity? Is there a continuum of confidence and creativity? If there is:

• what does low confidence (or creativity) look like?
• what does high confidence (or creativity) look like? and
• is the continuum age-related? (i.e. how do confidence or creativity develop or change with age?)

Academic continua reasonably assume that increasing proficiency is a good thing. Reading proficiency, for example, is an academic outcome of schooling, and increasing proficiency reflects increasing skills, insight and depth of understanding, all of which are clearly desirable. But is more necessarily better on the continuum of a non-academic outcome? Ever-increasing levels of confidence may suggest overwhelming self-interest or self-aggrandisement. Extreme creativity without reference to context (such as time, resources or the needs of others) can be counter-productive.

Unlike academic outcomes, the notional model of what is desirable is moderated by a sense of context-related balance across the different outcomes. The model of a see-saw, comprised of multiple planks splayed out in different directions, with the fulcrum or point of balance of each plank being the optimum position, may better apply to our conception of non-academic outcomes. In this model, each plank represents the continuum for a substantive non-academic outcome. It is still important to understand the scope of the continuum from low to high in order to decide where the optimum point of balance is. It is also important to consider the planks in relation to each other: useful creativity, for example, is also about intense self-discipline. Conceptualising what you are measuring in a non-academic outcome, and what improvement looks like, is critical because this drives your teaching. If a multi-dimensional balance model fits your school, then the approach you take to educating your students about non-academic outcomes may be more about raising their awareness of the breadth of possibilities along each of the non-academic continua, how the continua interact, and how to make good choices to achieve an overall sense of balance, rather than about raising the bar of expectations from one year to the next. (A toy sketch of this balance model follows the Challenge 1 recommendation below.)

The type of discourse suggested above is routine in the measurement of academic outcomes of schooling but regrettably lacking in the consideration of many non-academic outcomes. Unfortunately, time, resources and the desire to move forward to address the more immediate concerns of measuring these non-academic outcomes frequently prevent them from being properly defined in the first place.

Challenge 1: The recommendation

Before devoting time and energy to measuring the non-academic outcomes of schooling, it is essential that the outcomes are clearly defined in a way that makes sense to all those who use them. Commonly used terms such as 'wellbeing' and 'resilience' are often poorly defined, which can significantly diminish their usefulness in schools. At an individual school level, it is important that all members of the community can share a common understanding of the way the non-academic outcomes of schooling are defined and conceptualised. Similarly, time and energy should be devoted to consideration of what 'model' of non-academic outcomes fits within a given school context. What profiles of student non-academic outcomes are seen as desirable, and why? Only when these decisions have been clearly articulated can the tasks of measuring and addressing the outcomes begin to be properly addressed within a school.
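As a concrete, if simplified, illustration of the balance model described under Challenge 1, the sketch below represents a non-academic outcome as a continuum with an optimum point of balance rather than a 'more is better' scale. It is a minimal sketch: the scale, the optimum value and the single numeric 'imbalance' score are all invented for illustration, not drawn from any ACER instrument.

```python
# Toy illustration of the multi-plank 'balance' model: each outcome is a
# continuum with a context-dependent optimum, so a score reflects distance
# from balance rather than sheer level. All values are invented.
from dataclasses import dataclass

@dataclass
class OutcomeContinuum:
    name: str
    low: float       # scale minimum (e.g. very low confidence)
    high: float      # scale maximum (e.g. overwhelming self-aggrandisement)
    optimum: float   # the point of balance for this school context

    def imbalance(self, observed: float) -> float:
        """0.0 at the point of balance, 1.0 at the nearer extreme."""
        if observed >= self.optimum:
            return (observed - self.optimum) / (self.high - self.optimum)
        return (self.optimum - observed) / (self.optimum - self.low)

confidence = OutcomeContinuum("confidence", low=0.0, high=10.0, optimum=6.5)
print(confidence.imbalance(6.5))   # 0.0  - at the point of balance
print(confidence.imbalance(10.0))  # 1.0  - extreme high end
print(confidence.imbalance(3.0))   # ~0.54 - somewhat under-confident
```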

Challenge 2: Measuring the outcome

The three examples from the Melbourne Declaration provide some insights into the challenges of measuring the non-academic outcomes of schooling. Firstly, it is clear that the three examples lend themselves naturally to different types of assessment. In each case, the non-academic outcomes suggest both some form of external (e.g. teacher-centred) assessment, such as through observation of student responses to or behaviours during outcome-related activities, and some form of internal (i.e. student-centred) assessment such as student self-reflection. A critical difference between measuring the non-academic outcomes and academic outcomes of schooling is the role of self-reflection in the outcome measure. In the non-academic outcomes both the external and internal reflections on student development are clearly intrinsic to the outcome itself. In the academic outcomes, student self-reflection may (or may not) give an accurate sense of student learning achievement; however the process of self-reflection is typically used as a pedagogical tool to support student learning growth and understanding of the learning area. In mathematics, for example, students may be asked to identify areas of strength or weakness on a given topic for the purpose of helping them develop better understanding of the topic. The metacognitive process itself is a teaching and learning tool rather than a discipline-based outcome, whereas in considering collaboration, confidence, creativity or moral integrity, the metacognitive processes are both intrinsic to the outcomes as well as supporting student learning of them. That is, it may be possible to improve your academic skills without a well-developed capacity to self-reflect, but the capacity to self-reflect with increasing sophistication is integral to the development of non-academic outcomes.

There are four main issues in meeting the challenge of measuring non-academic outcomes. The first is to determine the most feasible, valid and reliable ways of collecting information about student outcomes as you have defined them. This provides both conceptual and practical challenges for schools. The conceptual challenge is to confirm that what is being measured is actually the non-academic outcome that you have defined. The practical challenge is of course to devise modes of assessment that can be used seamlessly and with relative ease by teachers. Ideally they are embedded within the students' normal school experiences.

A second issue is to provide sufficiently challenging and appropriate opportunities for students to demonstrate what they can do when you are measuring non-academic outcomes. Our experience suggests that while learning to work in groups is a challenging experience for many students in lower primary, by upper primary most students have learned how to get through group tasks with a minimum of fuss so that they can focus on the academic content of the task. By this level, the academic content of tasks is typically the focus for teachers and students; in classes where group work does not function well, teachers typically avoid it because learning the academic content is implicitly or explicitly more valued than mastering the challenge of group work. Low-key demands of group tasks do not provide much scope for insight or understanding. The group task reflection sheets that students often complete usually generate a list of predictable, shallow comments that are much the same from Year 5 to Year 10. 'We worked quite well together, but we could have tried harder.' 'We took a long time to work out what to do, but we got everything finished in the end.' Unfortunately our experience suggests to us that, where non-academic outcomes such as working in groups are measured in schools, the bar is typically set very low and, where the choice exists, the ultimate focus of the activities is typically the academic outcome, with little opportunity for students to formally consider, meaningfully reflect on and build on the skills and dispositions relating to the non-academic outcomes. This approach to measuring non-academic outcomes is like continually measuring students jumping over 20 centimetre heights – it tells us only the minimum they can achieve and does not challenge them to develop.

Digging deeper into the dynamics of group work in a classroom setting is most likely inappropriate, given the broader context that requires students to manage their day-to-day lives together throughout their time at school. The perceptive ones will realise that they cannot afford to be honest about the shortcomings or failings of their peers but must learn to accommodate or resist problematic behaviour while maintaining a veneer of calm and, above all, remaining task focused. Measuring group work skills may require two approaches: an assessment of minimum competency administered in a standard classroom group work context, and an assessment of proficiency administered in a context created for the purpose. The minimum competency assessment would be a screening test to identify students who need support to learn how to function in low-key classroom group tasks. The purposeful context would be an occasional situation where students might be given a challenging task that focuses on team skills, rather than the task, with students they do not know well (such as those from another school), so that students are free to reflect more honestly on the team dynamics. For this to be a useful measurement opportunity, students would need to be prepared so that they were aware of the issues about teamwork that they needed to monitor, and so they had the language to describe and reflect on their experience. This is further discussed under Challenge 3.

The third issue is collecting evidence of student performance that can be used to support teaching. Useful measures identify gaps in student learning and understanding. If most students have achieved what you asked them to do then there is no point in measuring this any more. If you know students need to learn more but the instruments that you are using mainly elicit superficial responses, you will need to reconsider what you are doing. Is the context provided too simple and lacking in challenge? Is the context you have given so broad that it is only suited to vague generalisations? Is the context too sensitive? Or possibly your instrument lacks depth? It may not be useful to apply the same measurement instrument to everyone. If you have defined minimum standards of behaviour, then only students falling below them are usefully measured on a regular basis in relation to these minimum standards. The measurement of more proficient students is better reserved for specialised contexts that provide them with sufficient challenge and opportunity to demonstrate higher-level outcomes. If the measures that you take during and after such an event are useful then they should clearly suggest where more teaching needs to be done to help these students to grow.

A fourth issue in measuring the non-academic outcomes of schooling arises from the high level of context dependency in students' demonstration of the outcomes. It is not sufficient to assume that if a student can demonstrate proficiency in an outcome in a given context they will naturally transfer this capacity to different contexts. Returning to the examples from the Melbourne Declaration, it should be obvious that working in groups, for example, can require students to demonstrate a broad range of different skills depending on the context of their group work (such as how well they know or get on with the other members of their group; how complex the task is; how large the group is). Similarly, the challenges of acting with moral integrity depend greatly on the context: how much risk or reward there is and the degree of complexity of the moral issues involved. In order to develop good understandings of students' development of the non-academic outcomes of schooling it is necessary to challenge students to demonstrate them across a range of contexts.

Challenge 2: The recommendation

Challenge 1 recommended that schools invest serious effort in defining a manageable selection of non-academic outcomes, paying particular attention to defining development along a continuum and describing desirable outcomes or points of balance. As staff and students discuss these ideas, the first measurement instruments should arise naturally, such as a series of probing questions developed to help teachers to define the outcomes and to find out what students already know and understand and what they need to learn next. As long as the initial measures that your school has developed reveal large gaps in students' understanding, they will provide useful evidence to inform learning and measure progress. Rubrics can then be developed which describe development along a continuum, for use in the evaluation of observations of student behaviour and student self-reflections. At this point it is worth looking at other rubrics to see if they suit or can be adapted (a toy sketch of one possible rubric structure follows this recommendation). It might be helpful to identify minimum standards of behaviour that require regular classroom attention, as distinct from desirable outcomes that may be the focus of occasional specialised activities.

Ultimately, good measures of the non-academic outcomes of schooling will be valid (i.e. provide the type of information you need about the outcomes as you have defined them), as well as easy to use and able to be used and interpreted in similar ways by different teachers in a school. Just as it is often useful to cross-mark (moderate) student academic work, it is essential that schools find ways of developing consistent understandings of student non-academic outcome achievement across the school. The way in which information is being collected must be considered with respect to how the information can inform teaching. If you are not collecting information that teachers can use to inform their teaching, then you need to reconsider what you are doing.
Challenge 3: Improving outcomes

Our work at ACER has consistently demonstrated one essential flaw in the way many schools and systems attempt to improve some non-academic outcomes of schooling. This flaw is the assumption that simply providing students with the opportunity to demonstrate the outcomes will be enough for the students to develop them.

Self-motivation is an example of an outcome that is highly valued in schools (and in the real world) and is most commonly addressed by students being given projects and school work to complete in their own time, as well as opportunities to participate in interschool sports teams, and school drama or music productions outside school hours. Students are generally praised for showing motivation and their attitude is deplored when it is lacking, but they tend to be left to work out for themselves why they are or are not motivated, or how they might influence their own motivation. The assumption is that providing students with opportunities to demonstrate motivation is sufficient for this to develop.

What is frequently lacking in this and equivalent school experiences regarding other non-academic outcomes is the opportunity for students to formally consider the discipline and skills that underpin the outcome. Self-motivation relies on a set of social, emotional and cognitive skills that can be formally considered. Too often in our research we have seen students' reflections on their own achievement of non-academic outcomes simply in terms of qualitative judgements akin to 'well enough' or 'not well enough', without any elaboration or explanation with respect to the skills or dispositions that may underpin the outcome. Students need the language, a clearly defined construct and knowledge of a range of relevant strategies, to be able to reflect on and learn from their experiences.

If schools are implementing specialist activities such as a school camp, with the intention of focusing on interpersonal development, autonomy and independence, it is essential that students understand this focus beforehand: what opportunities are being provided, what is expected of them beyond mere participation and superficial reflections, and what kind of strategies they might use to help them to grow. Students will need support from teachers during the camp to help them to monitor their experiences, reflect more thoughtfully on the strategies they are using and to try different approaches. The collection of vast quantities of shallow comments at the end of the camp is more likely to reflect the shallow nature of what is being practised rather than limitations in the capacity to measure non-academic skills with meaning.

The opportunity to consider the skills and dispositions underpinning non-academic outcomes, together with the opportunity to self-reflect in context and to speculate about other contexts, can lead to better internalisation of the outcomes themselves. Brookes (2003a, 2003b, 2003c) argues that the transferability of learning experiences in outdoor adventure programs to everyday life can only take place if students are given explicit support to understand the experience and to reflect on how it may relate to other aspects of their lives. The notion of the context dependence of student demonstrations of achievement of non-academic outcomes extends to the role schools can play in fostering the outcomes. Students need to be provided with the tools to internalise the outcomes as well as the opportunities to use them and generalise beyond local contexts, in order for lasting and transferable change to take place.

Challenge 3: The recommendation

The message from Challenge 3 is simple. Do not assume that non-academic outcomes necessarily require less formal teaching of content, skills and applications across contexts than is typically devoted to the teaching of academic outcomes. There is no question that experiential learning plays a key role in students developing many non-academic outcomes, but without a solid foundation of knowledge and skills, and the opportunity to make informed self-reflections, it is likely that the experience in itself may not be sufficient to facilitate lasting change in many students.

References

Brookes, A. (2003a). Character building: Why it doesn't happen, why it can't be made to happen, and why the myth of character building is hurting the field of outdoor education. Keynote presentation. In S. Polley (Ed.), 13th National Outdoor Education Conference (pp. 19–24). Marion, South Australia: Outdoor Educators Association of South Australia.

Brookes, A. (2003b). A critique of neo-Hahnian outdoor education theory. Part one: Challenges to the concept of 'character building'. Journal of Adventure Education and Outdoor Learning, 3(1), 49–62.

Brookes, A. (2003c). A critique of neo-Hahnian outdoor education theory. Part two: 'The fundamental attribution error' in contemporary outdoor education discourse. Journal of Adventure Education and Outdoor Learning, 3(2), 119–132.

Reflections on the validity of using results from large-scale assessments at the school level

Abstract

The recent decision by the Council of Australian Governments to develop a schools' reform plan that targets disadvantaged school communities includes an agenda for greater accountability, transparency and an outcomes focus.

Peter Titmanis
Performance Measurement and Reporting Taskforce

Peter Titmanis is the Director of the Performance Measurement and Reporting Taskforce's Benchmarking and Educational Measurement Unit. The unit supports the Taskforce in developing and implementing methodologies for nationally comparable reporting of student performances in literacy, numeracy, science, civics and citizenship, and information and communication technologies, including NAPLAN 2010. Peter has been closely involved with the development and implementation of Australia's national assessment strategy since 1997. Previously Peter was a senior officer in the Western Australian Department of Education and Training, where his roles included managing the state-wide testing program and the Department's Evaluation Unit, and working with a team to develop new accountability policies for school self-assessment and reporting. He has also worked as a strategic planner in education as part of an Australian overseas aid program. Prior to these appointments, Peter was a teacher of physics, chemistry and science in secondary schools and taught in teacher education programs at university.

The agenda for greater accountability and transparency has been part of the educational landscape in Australia and internationally for some time. It utilises, at least in part, performance indicators based on test scores for accountability measures at the school and system levels, as well as for measures of student outcomes. With governments and education authorities around the world working to identify programs that are effective in assisting school communities improve standards, and to better direct the limited resources available to these programs, there is increased utilisation of the information from testing programs. This presentation considers some of the ways that results from large-scale testing programs may be used at the school and classroom levels – for example, school comparisons, school averages, value-added and growth measures – and considers the validity of the inferences that may be drawn from the information.
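As an illustration of the kinds of school-level measures just mentioned, the sketch below contrasts a raw school average with a simple cohort growth measure and a crude 'value added' figure. It is a minimal sketch with invented scores: this is not the Taskforce's methodology, and real value-added models adjust for intake characteristics, measurement error and cohort differences, among other factors.

```python
# Minimal sketch with invented scores: a school average (status), a growth
# measure, and a crude 'value added' figure. Not an official methodology.
from statistics import mean

year7 = {"s1": 480, "s2": 510, "s3": 455, "s4": 530}  # placeholder scale scores
year9 = {"s1": 540, "s2": 585, "s3": 500, "s4": 570}

school_average = mean(year9.values())                  # a status measure
growth = mean(year9[s] - year7[s] for s in year7)      # mean per-student gain

EXPECTED_GROWTH = 52   # invented system-wide expectation for the same period
value_added = growth - EXPECTED_GROWTH                 # gain beyond expectation

print("average:", school_average, "growth:", growth, "value added:", value_added)
```

Each of these numbers invites a different inference about the school, which is precisely why the validity question raised in this abstract matters: the same data support some inferences far better than others.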

Using assessment data for improving teaching practice

Abstract

Fundamental to teachers becoming responsive to student learning needs is the availability of detailed information about what students know and can do. High-quality assessment data can provide that information, but much more is needed to improve teaching practice in ways that have a substantive impact on student learning. A set of conditions that result in such an impact are identified, based on a synthesis of the international literature on professional development that has demonstrated a positive impact on student outcomes, and on a professional development program in over 300 New Zealand primary schools. This professional development program focused on the interpretation and use of assessment information, building relevant pedagogical content knowledge in literacy and developing leadership for the change management process. These developments occurred within systematic inquiry and knowledge-building cycles based on assessment data for both teachers and leaders. Student achievement gains in reading and writing have accelerated at a rate averaging more than twice that expected, with even greater gains for the lowest-performing students. Both projects have led to the identification of a set of conditions necessary for assessment data to result in improved teaching practice.

Helen Timperley
University of Auckland, New Zealand

Helen Timperley is Professor of Education at The University of Auckland in New Zealand. Her early career involved teaching in the early childhood, primary and secondary education sectors, including special education. This experience formed the basis of a research career focused on making a difference to the students this system serves. A particular research emphasis has been on promoting leadership, organisational and professional learning in ways that improve the educational experience of students currently under-achieving in our education systems. She has recently completed a best evidence synthesis iteration on professional learning and development that has received major international attention. She has published widely in international academic journals such as Review of Educational Research, Journal of Educational Change, Leadership and Policy in Schools and the Journal of Curriculum Studies. She has written four books focusing on the professional practice implications of her research in her specialty areas.

Introduction

For a long time we have known more about the potential for using assessment data to improve teaching practice and student learning than about how to do it. Ten years ago we did not have the right assessment tools, we did not know enough about their use to make a substantive difference to teaching practice, and we did not know what else teachers and their leaders needed to know and do to improve teaching practice in ways that benefited students. Many of us reflected on the difference between the hope and the reality. This situation has now changed. We have now identified a number of conditions required for the use of assessment data to have the impact we hoped for:

• The data needs to provide teachers with curriculum-relevant information.
• That information needs to be seen by teachers as something that informs teaching and learning, rather than as a reflection of the capability of individual students to be used for sorting, labelling and credentialing.
• Teachers need sufficient knowledge of the meaning of the assessment data to make appropriate adjustments to practice.
• School leaders need to be able to have the conversations with teachers that unpack this meaning.
• Teachers need improved pedagogical content knowledge to make relevant adjustments to classroom practice in response to the assessment information.
• School leaders need to know how to lead the kinds of change in thinking and practice that are required for teachers to use the data in improved teaching practice.
• All within the school need to be able to engage in systematic evidence-informed cycles of inquiry that build the relevant knowledge and skills identified above.

These tasks are not easily accomplished. However, examples of how they can be achieved have been identified in a systematic review of the international evidence on the kinds of professional learning and development experiences that have resulted in improved student outcomes (Timperley, Wilson, Barrar & Fung, 2008), and also in the outcomes of a professional development project in New Zealand involving 300 schools, which has been built around this evidence (Timperley & Parr, 2007; in press). In this professional development project, student achievement gains have occurred at a rate beyond that expected over the two years of the schools' involvement in the project, particularly for the lowest-performing students. The average effect size gain for all schools that focused on writing was 1.20, and for reading it was 0.92. The rate of gain was greater for the students who were in the bottom 20 per cent of the distribution at Time 1 (2.25 in writing; 1.90 in reading). Expected average annual effect size gains, calculated using national normative cross-sectional sample data, are 0.20 in writing and 0.26 in reading.
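To make the effect size figures above concrete: the paper does not give the exact formula used in the project, but a standard standardised-mean-difference calculation of the kind such projects typically report might look like the sketch below. The score data are invented placeholders.

```python
# Illustrative only: a standardised effect size gain (change in mean score
# divided by the baseline spread). The project's exact formula is not given
# in the paper, and these scores are invented placeholders.
from statistics import mean, stdev

def effect_size_gain(time1_scores, time2_scores):
    """Mean Time 2 - Time 1 difference, in units of Time 1 standard deviation."""
    return (mean(time2_scores) - mean(time1_scores)) / stdev(time1_scores)

time1 = [420, 455, 470, 500, 515, 540, 560]   # placeholder writing scores
time2 = [510, 530, 555, 570, 590, 600, 625]

print(round(effect_size_gain(time1, time2), 2))
# On this definition, a value of 1.20 means the cohort gained 1.2 standard
# deviations - six times the expected annual gain of 0.20 reported for writing.
```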

Teacher inquiry and knowledge-building cycles

The final bullet point above identifies the need for engagement in systematic evidence-informed cycles of inquiry that build the relevant professional knowledge, skills and dispositions. The process for this inquiry is illustrated in Figure 1. The cycle begins by identifying the knowledge and skills students need to close the gaps between what they already know and can do and what they need to know and do to satisfy the requirements of the curriculum or other outcomes valued by the relevant community. Curriculum-related assessment information is required for a detailed analysis of students' learning needs. These kinds of data are more useful for the purposes of diagnosing students' learning needs than assessments focused more on identifying normative achievement but not related to the curriculum. Within the Literacy Professional Development Project, for which the outcomes above are described, the Assessment Tools for Teaching and Learning (asTTle, Ministry of Education, 2001)1 are used because they are mapped to the New Zealand curriculum and also provide normative data about expected rates of student progress in each curriculum area.

1 These tools are part of Project asTTle (Assessment Tools for Teaching and Learning), which provides detailed assessment against curriculum objectives in reading, writing and mathematics for Years 4 to 12. (A full description of this project, along with technical reports and publications, is available at http://www.tki.org.nz/r/asttle/.) It is an electronic assessment suite that gives teachers choice in the design and timing of assessments and access to a range of reporting formats, including comparisons to norms.

[Figure 1: Teacher inquiry and knowledge-building cycle to promote valued student outcomes. The cycle links five stages: What knowledge and skills do our students need? – What knowledge and skills do we as teachers need? – Deepen professional knowledge and refine skills – Engage students in new learning experiences – What has been the impact of our changed actions?]

Previous assumptions were that once teachers had this kind of information, they would be able to act on it in ways that enhanced student learning. Many teachers' previous training and approaches to teaching practice did not require them to interpret and use these kinds of data, because assessment information was about labelling and categorising students, and not for guiding and directing teaching practice. The interpretation and use of assessment data for guiding and directing teaching requires a mind shift towards professional learning from data, and a new set of skills.

For this reason, the second part of the cycle in Figure 1 requires teachers to ask, with the help of relevant experts, what knowledge and skills they need in order to address students' identified needs. More detailed questions ask:

How have we contributed to existing student outcomes?
What do we already know that we can use to promote improved outcomes for students?
What do we need to learn to do to promote these outcomes?
What sources of evidence or knowledge can we utilise?

In this way, teachers begin a formative assessment cycle that should mirror that of students, which has long been recognised as effective in promoting student learning (Black & Wiliam, 1998).

It is also effective in promoting the learning of teachers. Answering the questions above requires further use of assessment data. Considering teachers' contribution to existing student outcomes, for example, requires teachers to unpack student profiles within the data and relate them to emphases and approaches in their teaching practices. Student profiles of reading comprehension on different assessment tasks can help teachers to identify what they teach well and what requires a different or new emphasis. Most important is that co-constructing the evidence to answer the questions, with relevant experts, assists teachers to identify what it is they need to know and do to improve outcomes for students.

Deepening professional knowledge and refining skills

The next part of the cycle in Figure 1 requires teachers to deepen their professional knowledge and refine their skills. In the synthesis of the evidence on the kinds of teacher learning that are associated with changes in teaching practice that impact on student outcomes, three principles were identified in terms of the content of the professional learning, in addition to using assessment information for professional inquiry (Timperley, 2008). The first was a requirement to focus on the links between particular teaching activities, how different groups of students respond to those activities, and what their students actually learn. Without such a focus, changes in teaching practice are not necessarily related to positive impacts on student learning (e.g. Stallings & Krasavage, 1986; Van der Sijde, 1989). It should be clear to participating teachers that the reason for their engaging in professional learning experiences is to improve student outcomes. Similarly, success is judged on improvement in student outcomes.

The second principle is that the knowledge and skills developed are integrated into coherent practice. Knowledge of the curriculum and how to teach it effectively must accompany greater knowledge of the interpretation and use of assessment information. Identifying students' learning needs through assessment information is unlikely to lead to changes in teaching practice unless teachers have the discipline, curriculum and pedagogical knowledge to make the relevant changes to practice. Understanding the theories underpinning assessment information, those underpinning the curriculum and those underpinning effective teaching allows teachers to use these understandings as the basis for making ongoing, principled decisions about practice. A skills-only focus does not develop the deep understandings teachers need if they are to change teaching practice in ways that flexibly meet the complex demands of everyday teaching and to link the assessment data to the requirements of new teaching approaches. In fact, without a thorough understanding of the theory, teachers are apt to believe they are teaching in ways consistent with the assessment information, or that they have promoted change in practice, when those relationships are typically superficial (Hammerness et al., 2005).

The third principle is providing multiple opportunities to learn and apply new information and to understand its implications for teaching practices. Interpreting assessment information, understanding the implications for practice and learning how to teach in different ways in response to that information is a complex undertaking. It typically takes one to two years, depending on the starting point, for the professional learning to deepen sufficiently to make a difference to student outcomes. In the literacy professional development project described above, substantive gains were made in one year, but it took two years for the change to become an embedded part of practice.

Part of the reason for the length of time for change is that using assessment data for the purposes of improving teaching and learning requires changing prior assumptions about the purposes of assessment information. If teachers' prior theories are not engaged, it is quite possible they will dismiss the new uses as unrealistic and inappropriate for their particular practice context, or reject the new information as irrelevant (Coburn, 2001). Engaging teachers' existing ideas means discussing how those ideas differ from the ideas being promoted and assessing the impact that the new approaches might have on their students. If they cannot be persuaded that a new approach is valuable and be certain of support if they implement it, teachers are unlikely to adopt it – at least, not without strong accountability pressures to do so.

Assessing impact of changed actions

The final part of the cycle in Figure 1 also involves knowledge about and use of assessment information. Given the varied contexts in which teachers work, there can be no guarantee that any specific activity will have the anticipated result, because impact depends on the context in which those changes occur. The Best Evidence Synthesis of Professional Learning and Development (Timperley et al., 2008) identified that the effectiveness of particular changes depends on the knowledge and skills of the students, their teachers and their leaders. Judging impact requires the use of assessment information on a daily, term-by-term and annual basis. Thus, to be effective, teachers need a range of ways to assess their students informally and formally.

Leading change

Recent research analyses demonstrating that it is teachers who have the greatest system influence on student outcomes (Bransford, Darling-Hammond & LePage, 2005; Nye, Konstantopoulos & Hedges, 2004; Scheerens, Vermeulen & Pelgrum, 1989) have led to an increasing focus on what happens in classrooms and how to promote teacher professional learning. Teachers, however, cannot achieve these changes alone, but require the kinds of organisational conditions in which learning from data becomes an integral part of their practice. A recent meta-analysis by Robinson, Lloyd and Rowe (2008) has identified that the greatest influence of school leaders on improving student outcomes is their promotion of and participation in teacher professional learning. Creating the kinds of conditions in schools in which teachers systematically use data to inform their practice for the benefit of students requires that they teach in contexts in which such practice becomes part of the organisational routines.

Conclusions

Research on teacher change has shown that previous assumptions about teachers' use of assessment data were unreasonably optimistic. It is difficult to change from traditional ideas, where assessment data was considered to be reflective of students' abilities about which little can be done, to one where assessment data is considered to be information to guide reflection about the effectiveness of teaching and what needs to happen next. Making such changes is complex. Not only are changes in professional knowledge and skills in the use of assessment data required, but teachers also need deeper pedagogical content knowledge so that they are able to respond constructively to what the data are telling them about changes needed to their practice. To undertake this change teachers need opportunities to develop this knowledge as they delve into the assessment information, to find out what it means for their own learning, and to engage in multiple opportunities to acquire the new knowledge and skills. Changing teaching practice in ways that benefit students means constant checking that such changes are having the desired impact. Effectiveness is context-dependent, so the knowledge and skills to check the impact must become part of the cycle of inquiry. When teachers are provided with opportunities to use and interpret assessment data in order to become more responsive to their students' learning needs, the impact is substantive. Teachers, however, cannot do this alone, but require system conditions that provide and support these learning opportunities in ways that are just as responsive to how teachers learn as they are to how students learn.

References

Black, P., & Wiliam, D. (1998). Inside the black box: Raising standards through classroom assessment. London: King's College School of Education.

Bransford, J., Darling-Hammond, L., & LePage, P. (2005). 'Introduction'. In L. Darling-Hammond & J. Bransford (Eds.), Preparing teachers for a changing world: What teachers should learn and be able to do (pp. 1–39). San Francisco: Jossey-Bass.

Coburn, C. E. (2001). 'Collective sensemaking about reading: How teachers mediate reading policy in their professional communities'. Educational Evaluation and Policy Analysis, 23(2), 145–170.

Hammerness, K., Darling-Hammond, L., Bransford, J., Berliner, D., Cochran-Smith, M., McDonald, M., & Zeichner, K. (2005). 'How teachers learn and develop'. In L. Darling-Hammond (Ed.), Preparing teachers for a changing world: What teachers should learn and be able to do (pp. 358–389). San Francisco: John Wiley & Sons.

Ministry of Education (2001). asTTle: Assessment Tools for Teaching and Learning: He Pūnaha Aromatawai mō te Whakaako me te Ako. Auckland, New Zealand: University of Auckland School of Education. See www.tki.org.nz/r/asttle/

Nye, B., Konstantopoulos, S., & Hedges, L. V. (2004). 'How large are teacher effects?' Educational Evaluation and Policy Analysis, 26(3), 237–257.

Robinson, V., Lloyd, C., & Rowe, K. (2008). 'The impact of leadership on student outcomes: An analysis of the differential effects of leadership types'. Educational Administration Quarterly, 44(5).

Scheerens, J., Vermeulen, C., & Pelgrum, W. J. (1989). 'Generalizability of instructional and school effectiveness indicators across nations'. International Journal of Educational Research, 13(7), 789–799.

Stallings, J., & Krasavage, E. M. (1986). 'Program implementation and student achievement in a four-year Madeline Hunter follow-through project'. The Elementary School Journal, 87(2), 117–137.

Timperley, H., & Parr, J. (in press). 'Chain of influence from policy to practice in the New Zealand literacy strategy'. Research Papers in Education.

Timperley, H. S., & Parr, J. M. (2007). 'Closing the achievement gap through evidence-based inquiry at multiple levels'. Journal of Advanced Academics, 19(1), 99–115.

Timperley, H. S., Wilson, A., Barrar, H., & Fung, I. (2008). Teacher professional learning and development: Best evidence synthesis iteration. Wellington, New Zealand: Ministry of Education.

Timperley, H. S. (2008). Teacher professional learning and development. The Netherlands: International Academy of Education / International Bureau of Education.

Van der Sijde, P. (1989). 'The effect of a brief teacher training on student achievement'. Teaching and Teacher Education, 5(4), 303–314.

What information from PISA is useful for teachers? How can PISA help our students to become more proficient?

Abstract

A frequent objection to large-scale testing programs, both national and international, is that they are used as an instrument of control, rather than as a means of providing information to effect change. Moreover, concerns about large-scale testing often take the form of objection to the specific characteristics of the assessments as being prescriptive and proscriptive, leading to a narrowing of the curriculum and the spectre of 'teaching to the test' to the exclusion of more important educational content. Taking PISA reading literacy as its focus, this presentation proposes, on the contrary, that a coherent assessment system is valuable in so far as it makes 'teaching to the test' a virtue. With framework, instrument and interpretation transparently connected into a coherent assessment system, the test itself represents something that recognisably ought to be taught, and its framework and the interpretation of its results are tools that can be used to improve the teaching of reading.

Juliette Mendelovits
Australian Council for Educational Research

Juliette Mendelovits is a Principal Research Fellow in the Assessment and Reporting research program at ACER. Juliette has been with ACER since 1991, working mainly in literacy and humanities test development. She has directed a number of projects including the Victorian General Achievement Test (GAT), the Western Australian Education Department's Monitoring Standards in Education program (English), and the International Schools' Assessment. She is currently responsible for the management of humanities and literacy test development within the Assessment and Reporting research program. Juliette is domain leader for reading literacy for the Programme for International Student Assessment (PISA), and a co-author of Reading for change: Performance and engagement across countries (OECD, 2002).

Dara Searle
Australian Council for Educational Research

Dara Searle is a Research Fellow in the Assessment and Reporting research program at ACER. Dara commenced working for ACER in 2004 as the inaugural employee of ACER's graduate recruitment program. Since then she has worked mainly in literacy and humanities test development on a number of literacy-based projects for primary and secondary school students, including the Western Australian Literacy and Numeracy Assessment (WALNA) and the National Assessment Program – Literacy and Numeracy (NAPLAN). Since 2007 Dara has been engaged in test development for the reading literacy component of the OECD's Programme for International Student Assessment (PISA).

Dr Tom Lumley, Australian Council for Educational Research, was co-author of this paper.

Introduction

Collecting, interpreting and using assessment data to inform teaching – the theme for this conference – is not the immediate goal of international achievement surveys like the Organisation for Economic Co-operation and Development's Programme for International Student Assessment (OECD PISA). PISA's primary audience is policy makers, who use its data and interpretation to make wide-reaching decisions about national education that can seem remote from and even irrelevant to day-to-day classroom practice. Moreover, if large-scale tests can provide anything to an Australian classroom teacher, that provision is surely going to be satisfied by our own national assessment program.

NAPLAN provides an annual snapshot of student achievement at four year levels, the highest of which, Year 9, is close to the target age of the PISA sample (15-year-olds). A teacher might ask, then, what can PISA tell me that I can't learn from NAPLAN?

If PISA is to be useful to teachers, any information it provides must be additional or different to that provided by the national study. One obvious addition and difference is international comparisons of achievement. A second is the opportunity to compare frameworks: to ask whether those that Australia has adopted are adequate, or better, or in some respect deficient in relation to the PISA framework. The third is to monitor any new areas that PISA is including in its survey of student proficiencies. In this paper we will explore what PISA has to offer from these three perspectives, with a focus on reading and on teaching in Australia.

International comparisons

What has PISA told us about Australian 15-year-olds' reading proficiency? The survey has been administered three times so far, with the fourth administration being conducted in Australia right now (July to September 2009). PISA's methodology is to assess each of its three major domains, reading, mathematics and science, as the 'major domain' once every nine years, with the other two sampled as 'minor domains' in the intervening three-yearly surveys. Thus in 2000, reading was the major domain, with about 135 reading items included and the results reported overall (the combined reading scale) and in five subscales based on reading processes and text formats, to give a deep and broad picture of the domain (OECD, 2001, 2002). It was only lightly surveyed in 2003 and 2006, when mathematics and science respectively had major domain status. So while reading has been assessed three times in PISA, the most detailed information on reading dates back to the reports from the 2000 administration.

The picture of Australian 15-year-olds' reading in 2000 was rather encouraging. Only one country – Finland – performed significantly better than Australia on the combined reading scale. Australia was in a group of eight second-ranking countries including Canada, New Zealand and the UK. Generally speaking, the spread of Australian students' results was about the same as the OECD (developed countries) average. The top-performing 5 per cent of Australian 15-year-olds performed as well as any other country's top 5 per cent of students (except New Zealand's) on the combined reading literacy scale. The gender balance was also typical: as in every other country, girls performed better than boys in reading. The difference for Australian girls and boys was close to the OECD average (OECD, 2001).

One not-so-favourable story that appeared in the PISA national report was that Australia performed worse than expected on some types of reading: namely, narrative reading (Lokan, Greenwood & Cresswell, 2001). Australia's performance on the reflecting and evaluating aspect of reading was also weak when compared with that of several other English-testing countries: Canada, the United States, Ireland and – marginally – New Zealand (Mendelovits, 2002; OECD, 2001). This is one story that could and, we believe, should be noticed by Australian teachers, especially when we look at what has happened to Australia's performance since the year 2000.

In 2003, the second survey of PISA, when reading was a minor domain, the results for reading were very similar to those for PISA 2000. In PISA 2006, however, Australia's average reading proficiency fell significantly (OECD, 2007). While there was some variation in degree, the fall happened universally across all states and territories (Thomson & De Bortoli, 2008). Results declined for both girls and boys. Because it was a minor sampling of the reading domain, information about performance on the process and text format subscales is not available. We do know, however, that the decline in performance was most marked in the top one-quarter of the population. The potential comfort of attributing this apparent deterioration to a difference in the sample evaporates when we consider that the results for PISA mathematics from 2003 to 2006, administered to the same sample of students as the reading assessment, showed no such significant decline. Moreover, the tasks administered for reading in 2003 and 2006 were identical.

At this point, we need to look more closely at the PISA reading framework. This is related to what we identified earlier as the second way in which an international study might be informative for teachers: comparing the national and the international frameworks to see how close their alignment is and, taking that into account, ascertaining whether what the international study is telling us about our students' proficiency is relevant.

The PISA reading framework and student proficiency

Assessment and Student Learning: Collecting, interpreting and using data to inform teaching 27 environment (authored and message- Thus, while reflect and evaluate and in the Australian curriculum that could based);1 text format (continuous, non- contextual understanding are in the explain our relatively poor performance continuous, mixed and multiple); text same conceptual area, the former is a in these parts of PISA. type (description, narration, exposition, broader construct. Putting all these elements together, two argumentation and instruction); and The other area of notable deficit in hypotheses could explain the relatively aspect (access and retrieve, integrate relation to expected performance weak performance of Australian and interpret, and reflect and evaluate).2 in 2006, given the overall strong 15-year-olds. One is that less weight Within the aspect variable, while both performance of Australian students is given to reflective and evaluative access and retrieve and integrate and in PISA 2000, was in tasks based on reading, and to narrative, in Australian interpret items draw on content from narrative texts. Narrative texts are classrooms than the official curriculum within the text, reflect and evaluate defined in the PISA framework as texts and assessment would lead one to items draw primarily on outside in which ‘the information refers to suppose. Another is that the particular knowledge, and ask readers to relate properties of objects in time. Narration approach taken to these elements is this to the text they are reading. typically answers when, or in what different to that represented in PISA. The reflect and evaluate aspect is sequence, questions’ (OECD, 2006). Teachers could explore the second of of particular interest for Australia, Typical narrative texts are novels, short these hypotheses efficiently by studying we have argued, since our national stories or plays, but this text type could examples of PISA’s reflecting and performance on this reading process also include, for example, newspaper evaluating items from narrative texts.3 in 2000 was below expectations. If reports or biographies. The parallel one compares the PISA framework text type in the Australian frameworks If it is judged that the reading construct to the English curriculum profile for is imaginative texts, described in the described and instantiated in PISA Australian schools, the closest match to Australian Statements of Learning for is one that Australian education reflect and evaluate is the sub-strand English as ‘texts that involve the use of subscribes to, teachers might think contextual understanding. This strand language to represent, recreate, shape about the following in their classroom is defined as ‘the understanding about and explore human experiences in real practice: sociocultural and situational contexts and imagined worlds. They include, • Reconsidering approaches to that the student brings to bear when for example, fairytales, anecdotes, reflective and evaluative reading composing and comprehending texts’ novels, plays, poetry, personal letters (Curriculum Corporation, 1994). Both and illustrated books’ (Curriculum • Changing the emphasis of what is the reflect and evaluate aspect and the Corporation, 2005). Again, it would done with narrative texts contextual understanding sub-strand deal appear that there is a substantial • Making particular efforts to with the way in which the social and intersection between PISA’s narrative challenge the most able students. 
cultural conditions of both the writer text and the Australian imaginative text and the reader may influence the way type. the text is written and read. The reflect These suggestions have at least an In the NAPLAN Year 9 reading and evaluate aspect is also addressed in apparent synergy. The higher order assessment 20–30 per cent of the items items that ask readers to consult their thinking that is typically involved in address the contextual understanding personal experience or knowledge and responding to reflect and evaluate sub-strand, while in PISA about 25 draw on those elements to compare, questions in PISA could usefully be per cent of the items require students contrast or hypothesise about the text. studied and modelled by teachers in all to reflect on and evaluate the text. In addition, some reflect and evaluate learning areas, but perhaps particularly NAPLAN reading allocates 30 to 40 items also include those that ask by English teachers in secondary per cent of the instrument at each readers to make a judgement drawing schools. And narrative, imaginative texts year level to imaginative texts, a much on external standards, relating to either can present the most complex and larger proportion that that assigned to the form or the content of the text. challenging types of reading and thinking narration in PISA, which accounts for only 15 per cent of the instrument. Insofar as the weighting of text types 1 Note that the environment classification only within an assessment reflects their applies to texts in the electronic medium. emphasis, it does not appear that there 3 See reflect and evaluate items in the units ‘The 2 Detailed definitions of each of these task Gift’, ‘Amanda and the Duchess’ and ‘A Just characteristics are given in the PISA framework is a lack of attention to either ‘reflection Judge’ in Take the test: Sample questions from publications (OECD, 2000, 2006). and evaluation’ or to ‘narrative’ texts OECD’s PISA assessments (OECD, 2009)

The third way in which PISA might play a useful role for Australian teachers lies in its potential to throw new light on elements of reading. A case in point is the expansion of the reading framework, in PISA 2009, to include electronic reading.

Electronic reading assessment

The PISA electronic reading assessment (ERA) is being administered in 20 countries in 2009, including Australia. The new reading framework for PISA (in press) now includes electronic reading as an integral part of the reading construct. While skills in reading electronic texts are increasingly called upon in many school and non-school activities, PISA ERA represents the first attempt in a large-scale international survey to assess the skills and knowledge required to read in the digital medium.

The way in which electronic reading is defined in PISA recognises that electronic reading is not just reading print text on a computer screen. Three major differences between print and electronic texts are outlined below, each followed by a short description of the way the new PISA reading framework and the ERA instrument have addressed the differences, and a suggestion about how both the framework and items might inform teaching and learning.

1. When compared with print reading, electronic reading is more likely to traverse different kinds of texts from different sources. The PISA electronic reading framework sketches a classification of text forms found in the digital medium, and represents this diversity in the ERA instrument with mixed and multiple texts that require readers to integrate information across several sites or pages presenting information in different forms. Teachers could refer to the classification to check that the range of text forms described matches the variety of forms that students are exposed to in classroom activities.

2. There is a greater onus on the reader to evaluate the text. This is because electronic texts have not typically undergone the scrutiny that is involved in the publication of a print-based text. The mass of information available has major implications for readers' ability to reflect on and evaluate what they read. Readers need to swiftly evaluate the credibility of information; critical thinking therefore gains even more importance in this medium. PISA ERA reflect and evaluate items have a strong focus on the probity, relevance and credibility of the stimulus material. Teachers could refer to the framework descriptions and the items that reflect them as models of critical reading in the electronic medium.

3. There is a greater onus on the reader to select and construct the text. In print-based texts, the physical status of the printed text encourages the reader to approach the content of the text in a particular sequence. In contrast, electronic texts have navigation tools and features that make it possible, and indeed even require, that the reader create their own reading sequence. The PISA framework has extended the definition of the 'access and retrieve' aspect to acknowledge that the vast amount of information available in the electronic medium changes the nature of tasks requiring the retrieval of information. Readers more than ever need to be able to skim and search, and to navigate across oceans of information in a deliberate way. The ERA items provide examples of tasks that require construction of the reading text using both textual clues and navigation tools such as drop-down menus, embedded links and tabs. Teachers could inspect this range of tasks to help construct a sequence of lessons on classifying and mastering different navigation techniques – both ensuring that students are familiar with relevant technical functions, and developing their skills in the more elusive areas of inference and analysis to predict the most likely pathway to the information that is sought.

The PISA 2009 framework recognises that both navigation and text processing skills are required for electronic reading, though the proportion of each will vary according to the task at hand. The ERA instrument comprises tasks that systematically vary the weighting of these two skills. Teachers may find this conceptualisation of the demands of electronic reading tasks useful, in predicting the difficulty of digital reading tasks that they require their students to complete, and in diagnosing challenges that students encounter when they engage with electronic texts.

Conclusion

In this paper we have discussed some of the implications for teaching from both previous PISA results and those that are to come. While the results for PISA 2009 will not be available until the end of 2010, and the Australian national analyses probably some time after that, the new framework for PISA reading, with sample items for both print and electronic reading, will be published later this year. PISA 2009 will, we believe, contribute to educators' understanding of both print and electronic reading, and continue to give indicators to Australian teachers of some ways in which we can help our students to develop as critical, reflective and astute readers.

References

Curriculum Corporation. (1994). English – A curriculum profile for Australian schools. Carlton: Curriculum Corporation.

Curriculum Corporation. (2005). Statements of Learning for English. Carlton South: Curriculum Corporation.

Lokan, J., Greenwood, L., & Cresswell, J. (2001). 15-up and counting, reading, writing, reasoning: How literate are Australia's students? Camberwell: Australian Council for Educational Research.

Mendelovits, J. (2002). Retrieving information, interpreting, reflecting, and then ... Using the results of PISA reading literacy. Paper presented at the ACER Research Conference.

OECD. (2000). Measuring student knowledge and skills: The PISA 2000 assessment of reading, mathematical and scientific literacy. Paris: OECD.

OECD. (2001). Knowledge and skills for life: First results from the OECD Programme for International Student Assessment (PISA) 2000. Paris: OECD.

OECD. (2002). Reading for change – Performance and engagement across countries. Paris: OECD.

OECD. (2006). Assessing scientific, reading and mathematical literacy: A framework for PISA 2006. Paris: OECD.

OECD. (2007). PISA 2006: Science competencies for tomorrow's world, Volume 1: Analysis. Paris: OECD.

OECD. (in press). Take the test: Sample questions from OECD's PISA assessments. Paris: OECD.

Thomson, S., & De Bortoli, L. (2008). Exploring scientific literacy: How Australia measures up: The PISA 2006 survey of students' scientific, reading and mathematical literacy skills. Camberwell: ACER.

Next Practice: What we are learning about teaching from student data

Abstract

This paper presents emergent learning from two South Australia Department of Education and Children's Services (DECS) initiatives. The Supporting Improved Literacy Achievement (SILA) pilot and the DECS Classroom Inquiry Projects are allowing DECS to explore next practices in relation to informing teaching through analysis of student data at the system, school, class and learner levels.

The SILA pilot is currently being implemented in 32 DECS low-SES schools to provide new approaches to improve literacy outcomes in disadvantaged schools. The pilot is successfully developing practical understandings in the use of a range of data to achieve focussed whole school literacy approaches, build teacher and leader capacity and strengthen home–school partnerships.

The DECS Classroom Inquiry Project was implemented in a high-SES primary school to investigate how to help experienced teachers gather and use data to drive decisions about learners and pedagogy. The inquiry enabled teachers to connect school improvement priorities to their classroom practices through the effective use of data. Teachers involved found that using student achievement, perception and observational data provided valuable information about learners and helped direct pedagogy. These case studies highlight the important role of student data in supporting meaningful reform at all levels in education.

Katrina Spencer and Daniel Balacco
Department of Education and Children's Services, South Australia

Katrina Spencer is the Assistant Director of the Quality, Improvement and Effectiveness Unit in South Australia's Department of Education and Children's Services (DECS). Katrina was a school principal, teacher and counsellor prior to working in central office positions in support of school improvement and effectiveness research programs. She was a key developer of the DECS High Performing Organisations program, which supported school and regional leaders to develop effective change theories and conduct continuous improvement approaches in their work. Katrina has an extensive background in school review and evaluation processes and is currently leading a team to review and support improved literacy outcomes in disadvantaged school settings as part of the Federal Government's literacy and numeracy pilot projects. The Quality, Improvement and Effectiveness Unit leads the DECS Improvement and Accountability Framework; this presentation will focus on innovative projects being undertaken with school staff to inform the further development of the DECS framework.

Daniel Balacco is a Program Manager in the Quality, Improvement and Effectiveness Unit of DECS. Over the last 10 years, Daniel has worked extensively on the design, development and practical application of data to drive continuous improvement and accountability processes within schools, regions and the Department. He has led and facilitated workshops and conferences for preschool, school and regional leaders across South Australia that focus on the use of data in education contexts, providing educators with the opportunity to collaboratively 'make data count' and supporting them in undertaking rigorous data-driven self-review and improvement planning processes. Before working in DECS, Daniel worked for the Australian Bureau of Statistics as a researcher and statistical consultant.
Introduction

The South Australian Department of Education and Children's Services (DECS) has a strong tradition of working from an inquiry perspective so that the work of practitioners in the field supports and informs policy development at the regional and central levels. In a DECS Occasional Paper on inquiry, Reid (2004) describes it as 'a process of systematic, rigorous and critical reflection about professional practice, and the contexts in which it occurs, in ways that question taken-for-granted assumptions'. The DECS Improvement and Accountability Framework (DIAf) (DECS, 2007) is a critical policy that has evolved, and is continuing to do so, based on inquiry and trialling in the field to inform the policy in practice. This paper seeks to outline two projects that are impacting on the development of DIAf through working closely with teachers and learners to understand successful strategies to better intervene and support learning: the Supporting Improved Literacy Achievement (SILA) pilot and the DECS Classroom Inquiry Projects.

Parallel to the focus of the DECS inquiry is the UK Department for Education and Skills Innovation Unit's Next Practice in Education program (2008), which uses Leadbeater's (2006) description of next practice as 'emergent innovations that could open up new ways of working [and] are much more likely to come from thoughtful, experienced, self-confident practitioners trying to find new and more effective solutions to intractable problems'. The Next Practice program seeks to help schools build on good practices and successful innovation through three phases of development – stimulate, incubate and accelerate – while supporting practices that foster improvement. SILA and the DECS Classroom Inquiry Projects are supporting DECS to learn in practice from practitioners to stimulate the use of student data, to incubate better ways to inform and direct teaching practices, and to accelerate the spread of successful learning across the system.

Supporting Improved Literacy Achievement (SILA) Pilot

The Supporting Improved Literacy Achievement (SILA) pilot project aims to deliver improved literacy outcomes for learners in low-SES schools. DIAf describes the importance of acceptable standards of student achievement being achieved for all learners and the need for appropriate intervention and support to be provided when standards are not achieved. To this end, DECS is investigating complex, disadvantaged schools with low achievement in national literacy assessments, to identify critical improvement issues and work with school leaders and teachers to find new ways to help learners achieve successful standards. This work is being supported by funding from the Australian Government Department of Education, Employment and Workplace Relations as part of the Education Revolution: Improving our Schools – National Action Plan for Literacy and Numeracy.

The data that is informing this work is provided through:

• a survey, conducted in each school, of teacher perceptions of literacy knowledge and confidence in programs delivering outcomes
• classroom observations of student learning, explicit teaching, feedback, assessment and monitoring practices
• analyses of national literacy achievement data over time and at the question item level
• teacher, parent and student perception data gathered through interviews and focus groups.

This data, collected through a diagnostic review, is synthesised and compiled into a diagnostic review report detailing key recommendations for improvement.

Emerging findings are showing a significant 'knowing–doing' gap at the teacher, school and systems levels in supporting, monitoring and planning for learning. This is characterised by limited differentiation and personalisation of learning provided for the broad range of student abilities, few connections in learning programs between classrooms and year levels, and a limited range of pedagogies in use, with low levels of student engagement in learning. This data is informing the system, school leaders and teachers about directions and gaps to be addressed to effectively monitor and intervene at the learner level and to use in-process data to drive programming decisions.

From this clear understanding of the learning needs at the classroom and school levels, the SILA project is working with schools through the appointment of coaching teams to:

• develop whole school, student-centred approaches to literacy teaching and learning
• improve teacher and leader capacity to support literacy teaching and learning
• effectively use data to analyse and monitor learning at the class and school levels
• build new connections with parents and community services to support learning
• achieve significant improvement in literacy performance for all students.

The SILA project as an inquiry-based model is providing direct information from learners, teachers and classroom programs to identify a range of issues that exist across schools which may have implications for existing and future system policies and practices. From this data about learning gathered in-process, DECS is working alongside teachers and school leaders to implement effective change models.

The Department is undertaking an action learning model to stimulate new ways of working on complex issues of school improvement, incubate and trial successful practices in schools, and accelerate their spread and adoption as effective 'next' practices to improve student learning outcomes and the quality of teaching.

DECS Classroom Inquiry Project

Since the development of the DIAf in 2005, a range of improvement and accountability resources, initiatives and support programs have been implemented at the school, district and system levels within DECS. Information gathered from these initiatives, combined with current research, indicated the need to connect school improvement to classroom actions, and in particular to investigate the effective use of data in classrooms to improve student outcomes. Hattie (2005) suggests that data needs to be located in the classroom for teachers to better understand learner success and needs. The DECS Classroom Inquiry Project stimulates experienced classroom practitioners to work with central staff in gathering and using data to drive decisions about learners and pedagogy.

The inquiry project employed a multiple case study design that included semi-structured teacher interviews, classroom observations and selected documentation review. Two experienced classroom teachers in a high-SES primary school were invited to trial ideas in their classrooms. The school had implemented DIAf processes at the leadership and whole-school levels in previous years and these provided the basis for further investigations of connections at the classroom level. The participants chosen were 'experienced and thoughtful practitioners', reflecting the notion of seeking new approaches as outlined in the Next Practice program. The classroom teachers were supported by the school's Deputy Principal and the central office Program Manager responsible for the development and implementation of the project.

The Program Manager synthesised the data from teacher interviews, classroom observations, student feedback and document analysis to map the data collected and used at the school, class and student levels. This was mapped against Bernhardt's (2002) multiple data measures (achievement, demographic, process and perception) that when used together 'give schools the information they need to improve teaching and learning and to get positive results'.

Analysis of school and classroom data indicated the following:

• An extensive range of student achievement data is consistently gathered across the school in all classes.
• Observations, question responses, rubrics, checklists, test scores and student work are the key data sets used by both teachers.
• Differences between classroom data collections include the use of standardised assessment and learning style identification instruments.
• Learners and teachers routinely engage in collecting a comprehensive range of data, including student achievement and demographic data.
• Perception and process data are not effectively gathered or used at the class level.
• There are limited explicit connections at the classroom level to school directions and goals.

These findings helped to incubate the next stage of the inquiry. In collaboration with the teachers, it was agreed to develop classroom targets aligned with the school plan to improve student achievement in maths. Teachers were supported to design classroom targets using the SMART format (Marino, 2006): specific, measurable, achievable, results oriented and time bound.

Class teachers engaged collaboratively to plan and design a common assessment task to measure progress against their class target. A pre-test assessment was administered and results provided to students individually and to the whole class. The pre-assessment data enabled teachers to work with students to set the class target to be achieved. Targets were documented and displayed on a bar chart (see Figure 1).

Figure 1: Classroom maths targets

Once the unit of work was completed, a post-assessment was administered and the results of student progress were reflected on the class chart.
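To make the arithmetic behind such a class chart concrete, the short sketch below summarises pre- and post-assessment results against a SMART-style target. It is our own illustration, not the project's method: the scores, the 70 per cent cut-off and the 80 per cent target are all hypothetical.

```python
# A minimal sketch, assuming a hypothetical SMART target of the form
# "at least 80% of the class will score 70% or better on the common task".
# All scores below are invented for illustration; they are not DECS data.

def percent_meeting_target(scores, cut_off=70):
    """Return the percentage of students scoring at or above the cut-off."""
    at_or_above = sum(1 for score in scores if score >= cut_off)
    return 100 * at_or_above / len(scores)

pre_scores = [45, 52, 60, 71, 38, 66, 74, 80, 55, 62]   # hypothetical pre-test
post_scores = [68, 70, 75, 85, 58, 79, 88, 92, 72, 77]  # hypothetical post-test

print(f"Pre-test:  {percent_meeting_target(pre_scores):.0f}% of students at target")
print(f"Post-test: {percent_meeting_target(post_scores):.0f}% of students at target")
# Plotting these two percentages side by side gives a chart of the kind shown
# in Figure 1, updated as the unit of work progresses.
```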

The implementation of this approach provided useful learning about teaching and motivation for students. In reflecting on the process, teachers identified that:

• the use of pre-assessment data provided them with effective starting points for learning at the individual and class levels. This enabled teachers to provide greater differentiation for individual needs and identify areas for explicit teaching.
• the data provided by the class chart improved monitoring of student learning during lessons. The teachers found that they provided greater differentiation as they were more aware of individual students at risk of not achieving the targets.
• collaboratively establishing learning targets connected students to their own learning and enhanced student and group ownership, responsibility and motivation.

Teachers found that establishing specific targets with learners helped to raise the bar for students, and aided teachers in reducing the gaps between students.

During this process, perception data was collected via a structured survey to provide feedback from students and staff regarding teaching and learning in numeracy. The data confirmed the importance of engaging students in pedagogies that provide challenge, interest and support for learning tasks. Students reported positive levels of interest (81% agreement) and engagement (73% agreement) when the learning was connected to real world applications. The perception data also confirmed the effectiveness of the electronic learning resources provided to students (96% agreement), which is a key priority in the school improvement plan. This data provided teachers with feedback on their enactment of the school's quality teaching and learning principles and identified directions for them to further develop these principles in their practices.

Process data was collected via teacher observations of student involvement in various class learning activities, recorded on a digital camera. This evidence was analysed using a DECS (2008) scaled scoring instrument from the Reflect, Respect, Relate assessment resource. This data provided authentic feedback to teachers about which learning tasks best engaged students and supported learning. For example, one teacher reported that working with process data enabled her to identify a student consistently not engaged in learning and to implement a successful peer mentoring strategy. The teachers indicated they would continue to collect process data as it provided cues to trial alternative teaching methods and extend successful strategies in order to meet individual and group needs.

To accelerate the learning from this project, a number of key considerations require further development. These include:

• developing strategic whole of school approaches to data gathering and analysis that are incorporated within classroom teaching practices and connected to school plans, with explicit classroom expectations through the use of survey instruments and class targets
• developing teacher expertise to deeply analyse and interpret data within a professional culture that enables teachers to collaboratively share data through structures like school improvement teams, professional learning communities and collegial planning processes
• developing effective processes to support teachers to readily collect process data via observations and use perception data on a routine basis within their practices
• extending the involvement of students in this process to set targets, gather data and conduct observations to inform their own learning
• ensuring the efficient storage of learner data so that it is easily accessed by staff, students and parents, and supports learning transitions from year to year.

This inquiry is being used to support DECS to refine school improvement planning processes so that they better connect to classroom teaching and learning practices.

Summary

What we have learnt about teaching through the use of student data in the SILA and DECS Classroom Inquiry projects has been the importance of presenting and analysing data at the classroom level to influence change and drive improvement. These projects have provided DECS with the opportunity to trial new approaches and examine their implications for teaching and learning. This learning has the potential to support students and teachers to become engaged and effective data users in an era where education systems and schools are held increasingly accountable for improving student outcomes. The successful practices highlighted within these case studies provide possibilities to stimulate, incubate and accelerate learning across the system and to use student data to encourage meaningful reform.

References

Bernhardt, V. (2004). Data analysis for continuous school improvement. Larchmont, NY: Eye on Education.

Department of Education and Children's Services. (2007). DECS Improvement and Accountability Framework. Adelaide: Government of South Australia. Retrieved January 23, 2009 from http://www.decs.sa.gov.au/quality/

Department of Education and Children's Services. (2008). Reflect, Respect, Relate: Assessing for learning and development in the early years using observation scales. Adelaide: Government of South Australia.

Department for Education and Skills UK, Innovation Unit. (2008). The Next Practice in Education Programme: Learning, insights and policy recommendations. Retrieved April 9, 2009 from http://www.innovation-unit.co.uk/images/stories/files/pdf/np_education_finalreport.pdf

Hattie, J. (2005). What is the nature of evidence that makes a difference to learning? Research Conference paper. Melbourne: Australian Council for Educational Research. Retrieved March 16, 2009 from http://www.acer.edu.au/documents/RC2005_Hattie.pdf

Leadbeater, C. (2006). The Innovation Forum: Beyond Excellence. IDeA. Retrieved July 7, 2009 from http://www.innovation-unit.co.uk/next-practice/what-is-next-pratice.html

Marino, J. (2006). 'How SMART are your goals?' [blog post] on Quality in Education: Using quality tools to increase academic achievement and help your bottom line. Milwaukee, WI: American Society for Quality. Retrieved March 16, 2009 from http://www4.asq.org/blogs/edu/2006/04/how_smart_are_your_goals.html

Reid, A. (2004). Towards a Culture of Inquiry in DECS. Occasional Paper #1. Adelaide: Department of Education and Children's Services. Retrieved March 18, 2009 from http://www.decs.sa.gov.au/corporate/files/links/OP_01.pdf

Culture-fair assessment: Addressing equity issues in the context of Primary Mathematics Teaching and Learning

Abstract

This presentation provides the background and context to the important issue of assessment and equity in relation to Indigenous students in Australia. It draws on the research from an ARC Linkage project that is examining questions about the validity and fairness of assessment. Ways forward are suggested by attending to assessment questions in relation to equity and culture-fair assessment. Patterns of under-achievement by Indigenous students are reflected in national benchmark data (NAPLAN) and international testing programs like the Trends in International Mathematics and Science Study (TIMSS 2003) and the Program for International Student Assessment (PISA). The approaches developed view equity, in relation to assessment, as more of a sociocultural issue than a technical matter. They highlight how teachers need to distinguish the 'funds of knowledge' that Indigenous students draw on, and how teachers need to adopt culturally responsive pedagogy to open up curriculum and assessment practice to allow for different ways of knowing and being.

Val Klenowski
Queensland University of Technology

Val Klenowski is a Professor of Education at the Queensland University of Technology in Brisbane, Australia. She currently co-ordinates the Professional Doctoral Program and is engaged in research in assessment. The Australian Research Council Linkage projects for which she is Chief Investigator include research in the use of moderation for achieving consistency in teacher judgment, culture-fair assessment and the use of digital portfolios. She has worked in Hong Kong at the Hong Kong Institute of Education and in the UK at the Institute of Education, University of London. Val has research interests in curriculum, evaluation and assessment.

Thelma Gertz
Catholic Education Office of Townsville, Queensland

Thelma Gertz is an Aboriginal woman from the Wulluwurru Wakamunna and Kalkadoon tribes, born in Cloncurry; she lived the majority of her earlier life in western Queensland. Being raised in a remote and isolated area in an extended family situation provided her with the knowledge and skills of her family's traditional Aboriginal culture. In the last twenty years she has lived and worked in the urban centres of Darwin, Cairns and Townsville, where she has held leadership positions in Indigenous education. Thelma is currently the Senior Education Officer for Indigenous Education at the Catholic Education Office, Townsville. Her many roles include providing Indigenous leadership to Principals and taking a leading role in developing policies and practices that will lead to improved outcomes for Indigenous young people.

Introduction

This paper is based on an Australian Research Council (ARC) Linkage research project examining equity issues as they relate to the validity and fairness of assessment practices. The aims are to provide greater understanding about how to build more equitable assessment practices to address the major problem of underperforming Aboriginal and Torres Strait Islander (ATSI) students in regional and remote Australia, and to identify ways forward by attending to culture-fair assessment (Berlack, 2001).

The research adopts a sociocultural perspective on learning, which views learning as occurring as part of our everyday experience as we participate in the world. This theory of learning does not view a separation between contexts where learning occurs and contexts for everyday life; rather, these are seen as different opportunities for learning (Murphy et al., 2008). It is important to underscore this shift in view to the participants, the activities that they engage in, and the actions that they undertake using the resources and tools available. It moves away from the view that sees the individual as the sole determinant of learning and allows for consideration of the impact of different contexts. As Murphy and colleagues (2008, p. 7) stress when they cite McDermott (1996), '… we can only learn what is around to learn.'

Rationale for the study

Patterns of under-achievement by Indigenous students are reflected in national benchmark data and international testing programs like the Trends in International Mathematics and Science Study (TIMSS, 2003) and the Program for International Student Assessment (PISA). Inequity in Australian education has occurred in the relationship between social background, achievement, and participation in post-compulsory schooling (McGaw, 2007). A trend of underperformance has continued over the past six years, as evident from the comparative analyses of the PISA results, first administered in 2000, again in 2003, and in 2006. There is consistent data across all levels – school, state, national and international – to conclude that Australian schools are not addressing equity issues effectively (Sullivan, Tobias & McDonough, 2006), with Indigenous children scoring significantly lower than non-Indigenous children (Lokan, Ford & Greenwood, 1997).

Research focus

This research is particularly timely and necessary against the background of Australia's under-achievement in terms of equity for Indigenous students and the lack of an informed strategy in the education sector to counter this trend. The key research questions are:

• What are the properties of teacher-constructed mathematics assessment tasks that are culture-fair?
• What are the culture-relevant assessment practices, as enacted in classrooms using these mathematics tasks with a significant number of ATSI students?
• Does the use of culture-fair mathematics assessment tasks lead to improved learning for ATSI students as measured by the National Statements for Learning, the national Numeracy Benchmarks and Years 3 and 5 numeracy testing?
• In a standards-referenced context, how can teachers develop their assessment capacity so that more appropriate support and assistance is given to Indigenous students to improve their learning?

Research design

This project is using National Assessment Program – Literacy and Numeracy (NAPLAN) numeracy data for ATSI students in Years 3 and 5 to analyse current teaching and assessment practices. The case study uses eight Catholic and Independent schools from Northern Queensland which have a relatively higher proportion of ATSI students than schools in the south. The focus is on primary Year 4 and middle school Year 6 classes. The numeracy data for each school is being used to identify exemplary teaching and learning practices and the areas requiring support. The extent to which these teaching and assessment practices are effective in promoting achievement for ATSI students is being analysed and interpreted using qualitative and quantitative data analysis. National numeracy data is also being used to measure success and is supplemented by additional measures of achievement from the assessment and learning tasks developed, moderated and reported. The project is a 'design experiment' (Kelly, 2003) that involves several cycles of design and development of assessment tasks and eight case studies to identify theoretical principles and design implications for the application of culture-fair assessment practice, both within and beyond the immediate study.

In this first phase of the study there are three schools: two teachers from each of two schools (a Year 4 and a Year 6 teacher from each; one of the latter has a Year 6/7 class) and four teachers from the third school (two Year 4 and two Year 6 teachers). The eight teachers were asked to select students (preferably Indigenous) on whom to focus. The total number of Indigenous students is 22 (fourteen Year 4 students and eight Year 6 students).

Phases of the project

The first phase is focused on establishing and developing the culture-fair assessment tasks and culture-relevant pedagogic practices with these initial three schools. This process requires the iterative development of culture-fair assessment tasks, the culture-fair learning and assessment task development resources, and professional development of the teachers and the community. The intent is to develop principles by:

• comprehensive review and synthesis of relevant literature

• analysis and design of the assessment tasks through collaboration with the teacher sample, the mathematics specialists (professional developers) and the Indigenous colleagues
• quality assurance of assessment tasks in collaboration with the teachers and assessment specialists
• documentation of the implementation of the assessment tasks
• collection of artefacts and student work
• analysis of online teacher exchanges
• student and teacher interviews.

The second phase of the research project involves the extension of the development and implementation of the culture-fair assessment tasks and culture-relevant pedagogic practices to include a further five schools. The final phase, in year three, involves an evaluation of the implementation of the culture-fair assessment tasks, the culture-relevant pedagogic practices and the learning outcomes.

Data collection

In this first phase, the collection and analysis of data focuses on the effectiveness of the development program in building teachers' capacity to use and develop assessment tasks that are more culture-fair. Data is being collected and analysed from the following sources: semi-structured telephone interviews of teachers; achievement data (2008 NAPLAN results); ethnographic observations; focus group interviews of students; collection of artefacts; and recordings of conversations of students and teachers via a software package.

NAPLAN data analysis

The analysis of 2008 NAPLAN Numeracy Test data focused on the results of Years 3 and 5 across the three schools. In Year 3 there were 83 students who sat the test: 13 per cent of these students (11 students) identified as being Indigenous. The results in Table 1 indicate that eight of the 11 Indigenous students (73 per cent) who sat the test received scores that placed them in Bands 2–3. That is, they were at or above the national minimum standard (Band 2), with four students in Band 2 and four in Band 3. Three of the 11 students (27 per cent) were in Band 1, below the national minimum standard.

There were 72 non-Indigenous students who sat the test. In the non-Indigenous cohort there were three students (4 per cent) who achieved scores at Band 1, and 96 per cent at Bands 2–6, with the majority at Band 3 (35 per cent), followed by Band 4 (26 per cent). This represents a significant difference between Indigenous and non-Indigenous students' results across Year 3.

In Year 5 there were 80 students who sat the NAPLAN Numeracy Test in 2008. Six, or 7.5 per cent, of these students were Indigenous and each achieved the national minimum standard (Band 4). Fifty per cent of these students were in Band 5 and 33.3 per cent were in Band 6. It should be noted that the two students who were placed in Band 6 achieved scores of 28 and 26 respectively, and the highest score achieved by any student in the cohort was 36. These results raise interesting questions for the research that are yet to be explored.

There were 74 non-Indigenous Year 5 students who sat the NAPLAN Numeracy Test last year. Six of these students (8 per cent) achieved at Band 3, below the national minimum standard. The remaining 92 per cent achieved at Bands 4–8, with 80 per cent achieving at Bands 4–6: 31 per cent at Band 5, 26 per cent at Band 4 and 23 per cent at Band 6.

The Indigenous students performed slightly better than the non-Indigenous students when the Year 5 results for Indigenous and non-Indigenous students were compared. This is in contrast to Year 3, where the Indigenous students' results were significantly lower than the non-Indigenous students' results.

Table 1: Year 3 Indigenous students – NAPLAN numeracy bands (Band 2 = national minimum standard; raw scores out of 35)

School 1: 5 students; ages 7 years 7 months to 8 years 6 months; raw scores 12–18; Band 2 (2 students), Band 3 (3 students)
School 2: 5 students; ages 7 years 9.5 months to 8 years 9 months; raw scores 9–15; Band 1 (2 students), Band 2 (2 students), Band 3 (1 student)
School 3: 1 student; age 8 years 8.5 months; raw score 9; Band 1 (1 student)

Table 2: Year 5 Indigenous students – NAPLAN numeracy bands (Band 4 = national minimum standard; raw scores out of 40)

School 2: 4 students; ages 9 years 6 months to 10 years 7 months; raw scores 18–26; Band 5 (3 students), Band 6 (1 student)
School 3: 2 students; ages 9 years 8 months to 10 years 4.5 months; raw scores 15–28; Band 4 (1 student), Band 6 (1 student)
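The band summaries reported above are simple proportions of a small cohort. As a check on the arithmetic, the sketch below recomputes the Year 3 Indigenous percentages from the band counts in Table 1; only the counts come from the table, and the code itself is our own illustration, not part of the project's analysis.

```python
# A minimal sketch recomputing the Year 3 Indigenous band percentages from
# the counts in Table 1 (Band 1: 3 students, Band 2: 4, Band 3: 4).
from collections import Counter

bands = [1, 1, 1, 2, 2, 2, 2, 3, 3, 3, 3]  # one entry per student, from Table 1
counts = Counter(bands)
n = len(bands)

for band in sorted(counts):
    print(f"Band {band}: {counts[band]} students ({100 * counts[band] / n:.0f}%)")

# Band 2 is the national minimum standard, so only Band 1 falls below it.
print(f"At or above national minimum standard: {100 * (n - counts[1]) / n:.0f}%")
```

Run as written, this reproduces the figures in the text: 27 per cent of the Year 3 Indigenous students in Band 1 and 73 per cent at or above the national minimum standard.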

On average, Indigenous students were 25 percentage points behind the Queensland averages in the proportion of students who correctly answered each type of maths question (Table 3). When analysing performance in terms of the Numeracy strand, both Indigenous and non-Indigenous students performed best in space questions, followed by number, and lastly measurement, data and chance questions.

Table 3: Year 3 Indigenous students' results (percentage of students who answered questions correctly)

Space: Indigenous 38.8%; Queensland 67.4%; gap 28.6%
Number: Indigenous 35%; Queensland 63%; gap 28%
Measurement, Data & Chance: Indigenous 34.65%; Queensland 54.45%; gap 19.8%

Interestingly, the gap column reverses this order and shows that the smallest gap exists between the performance of Indigenous and non-Indigenous students in measurement, data and chance questions. Indigenous students outperformed the Queensland average by 8 per cent in one measurement, data and chance question (Question 29). The next smallest gap existed in number questions (28 per cent), followed by space questions (28.6 per cent).

Table 4: Year 5 Indigenous students' results (percentage of students who answered questions correctly; * denotes better performance by Indigenous students)

Space: Indigenous 58.3%; Queensland 54.17%; gap 4.13%*
Measurement, Data & Chance: Indigenous 54.2%; Queensland 53.2%; gap 1.00%*
Number: Indigenous 41.75%; Queensland 47.5%; gap 5.75%

Interestingly, in the Year 5 results the data indicates that on average the Indigenous students outperformed the Queensland averages in space and in measurement, data and chance questions. They were 5.75 percentage points behind the Queensland average in number questions. Both Indigenous and non-Indigenous students performed best in space questions, followed by measurement, data and chance, and lastly number questions.

Although the sample size is small, these results highlight some interesting differences to those reported in the literature and other studies, where the performance of Indigenous students is reported to be lower than what this analysis of last year's NAPLAN data has revealed. At this stage of the project the research team will collect ethnographic data in relation to the individual Indigenous students to investigate more deeply their performance, particularly in relation to those students who have performed really well.

The research team is presently negotiating with the three schools to organise for students who were in Years 3 and 5 in 2008 to resit the NAPLAN test. From analysing the results of the resitting it will be possible to determine how many students may have improved, how many may have flat-lined and how many may have regressed. Also from these results, the research team will identify, together with the teachers, the practices and properties of assessment tasks that have been implemented to effect change.
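Returning to Tables 3 and 4: the gap columns are straightforward differences in percentages correct. As an illustration, the sketch below recomputes the Year 3 gaps and their average; the percentages are copied from Table 3, and the code is our own sketch, not the project's analysis.

```python
# A minimal sketch recomputing the 'gap' column of Table 3 (Year 3).
# Figures are the reported percentages correct; everything else is illustrative.
year3 = {
    "Space": (38.8, 67.4),                         # (% Indigenous, % Queensland)
    "Number": (35.0, 63.0),
    "Measurement, Data & Chance": (34.65, 54.45),
}

for strand, (indigenous, queensland) in year3.items():
    gap = queensland - indigenous  # positive = Indigenous students behind
    print(f"{strand}: gap of {gap:.2f} percentage points")

average_gap = sum(q - i for i, q in year3.values()) / len(year3)
print(f"Average Year 3 gap: {average_gap:.1f} percentage points")  # about 25
```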

The professional development program

A series of professional development sessions has been organised for the teachers. The principal investigators developed the program based on identified needs (Warren & de Vries, 2007). The focus of each session aligns with the Numeracy strands: number; chance and data; and patterns and algebra.

Teachers also participate in workshops (every six weeks) designed to develop their skills in the use of a software package developed by HeuLab, entitled Fun With Construction™ (FWC). This is an interactive digital web-board that enables students and teachers to use virtual construction tools such as compasses or protractors. It is a teaching tool and includes the facility to record students' and/or teachers' conversations as they are using the program. This will provide invaluable data on the students' learning processes and problem-solving strategies. The technical consultant has established a wiki on the website for this project, and each teacher has access to this site and to files and resources developed specifically for this project (http://arc1.wikispaces.com/).

Indigenous protocols and procedures

At the first of the mathematics professional development sessions, the Indigenous Senior Education Officer led a discussion designed to raise teachers' awareness of Indigenous culture and, in particular, the cultural protocols and practices they need to be aware of when interacting with Indigenous students and families.

In the articulation of teachers' understanding of the cultural protocols and practices, the primacy of family relationships, and the need for teachers to build relationships with the families of their Indigenous students, were emphasised. This led to a discussion of the 'whole school approach' that involves two-way interaction between the school and community: that is, the school venturing out to participate in the community, and members of the community participating in the life of the school. Indigenous protocols, practices and the whole school approach were presented as pillars that support the school's curriculum.

Table 5: Indigenous cultural protocols and practices aligned to Catholic education policy in Northern Queensland

Equal opportunity – each child is given the opportunity to become an effective learner
Include the community – invite the Indigenous community to conduct welcome to country or acknowledgement of country at school functions; build relationships by sharing personal stories
Acknowledge different perspectives in communication – includes languages, knowledge and ways of working
Acknowledge Indigenous culture – traditional, lore (values and beliefs), intellectual and moral property and cultural rights
Maintain connections with Indigenous communities – engage traditional owners and elders; collaborate with Indigenous staff members as a resource
Honour cultural dates and events – no segregation of rituals and family relationships; respect community celebrations such as NAIDOC
Acknowledge cultural dates and events – celebrate history, NAIDOC; use Indigenous resources, ATSI flags; invite Indigenous storytelling

When asked to explain how these cultural protocols and practices were enacted in practice, teachers were able to provide clear examples. Some of these included:

• Maintain interconnections, such as acknowledging the close community between school family and home family
• Be culturally aware: an example given was to ensure that after funerals there is no reference to the names of the people who have died, to honour the mourning process, and to acknowledge that the older brother takes the role of protector of the younger
• Include community through community projects, such as the class café where Indigenous family members visit the school
• Recognise cultural differences in terms of the language used at home, and adopt different modes of communication such as email, letters and oral language
• Be aware of particular behaviours, such as in welcoming, eye contact and body stance.

Structure of the program

The teacher development program involves regular visits to the project schools by visiting mathematics specialists and members of the research team. In February 2009, the principal investigator from the Association of Independent Schools Queensland (AISQ) led the first maths session, on effective strategies for teaching the topic of number to Year 4 and Year 6 students; it included a focus on pedagogical strategies for Indigenous students. Given space limitations, only this session will be discussed here in detail.

The importance of changing pedagogy to incorporate hands-on games and activities, to make use of eye contact and to increase the use of oral language to engage ATSI students in the learning of maths – rather than simply teaching didactically from the textbook – was emphasised. It was acknowledged that students (especially in the early years) need to see and hear the words and feel the sound of the language, and their parents need to be aware that this helps them to learn.

Particular focus was given to the NAPLAN Numeracy Test and the development of teaching strategies to effectively prepare all students for this test. Awareness was raised about how NAPLAN test writers work within a framework that must include written literacy and numeracy incorporating reading, comprehension, oracy (such as discussion), numeracy (such as calculation, graphics), or visual literacy or numeracy (such as diagrams, graphics, etc.). The language used in NAPLAN tests can be difficult for students to decode and understand, and examples were given, such as test items that are often phrased in the negative rather than the positive that is used in the classroom and in textbooks. It was suggested that teachers teach using the language used in NAPLAN tests. The issue of cultural inclusivity in relation to the NAPLAN tests was also addressed, as currently they are not inclusive and this impacts on Indigenous and LOTE students' performance.

Difficulty in understanding test language and interpreting the graphics results in poor performance for all students. The graphical representations that appear routinely in numeracy testing have been analysed by Diezmann et al. (2009) and include the following:

• Axis language – vertical or horizontal axes: number lines, temperature gauges, number tracks
• Opposed position language – vertical and horizontal axes: grids, calendars, graphs
• Retinal list language – rotated shapes: marks not related to position
• Connection language: tree diagrams, network diagrams (e.g. flow charts)
• Map language: road maps, topographic maps, scale in maps (Year 7 students often have difficulty with scales in maps)
• Miscellaneous language: Venn diagrams (often tested), circle graphs (e.g. clocks).

A study of graphical representations in the mathematics tests in Years 3 and 5 over the past 11 years identified that opposed position language was used in 67 per cent of tests and axis language was used in 11 per cent of tests (de Vries, 2009). These statistics highlight the necessity for students to learn how to read and interpret these graphical representations so that they can successfully access the literacy (and/or literacies) demands of the test items. Teachers also need to show students the many different ways in which graphics can be used to represent opposed position language, such as those used in calendars or temperature gauges.

Implications for pedagogic practice

A number of pedagogical strategies were recommended to the teachers, and some were identified as being more culturally appropriate. Recommendations included:

• Read the questions aloud
• Instruct students to highlight keywords
• Teach students how to adopt a process of elimination with multiple-choice questions
• Engage Indigenous students in more interactive, 'hands-on' activities
• Encourage students to attempt every question or activity
• Encourage students to deconstruct the question or item
• Discuss the process or strategy used in completing questions
• Be creative in the use of textbooks by opening up discussion about certain questions
• Give more open-ended questions for problem solving
• Encourage peer learning
• Use whole class or small group activities
• Encourage individual problem solving.

The following inclusive practices were advocated:

• Commence with an activity where all children experience success
• Develop sequential steps to build on number facts introduced, and to gain confidence in answering questions and solving problems

• Implement strategies by the use of posters of different question stems, and have students indicate when a particular question stem has been used
• Incorporate into daily and weekly teaching activities practices such as the use of the discourse of testing, and the deconstruction of test items, to develop student familiarity with the language of testing and the types of test questions or mini-investigations
• Use number games to be completed for homework, so that parents or caregivers can engage and encourage the enjoyment of mathematics learning both at home and at school.

Teachers' implementation

The telephone interviews of the teachers sought to investigate:

• The extent to which the teachers had implemented these activities and strategies in their classroom practice
• Their views on how effective the strategies had been in assisting students' learning of the maths concepts.

All eight teachers interviewed were very positive about the number strategies and activities, and 25–50 per cent indicated that they had used particular strategies or activities. Given the focus on culture-fair assessment, the teachers were also asked the extent to which the professional development sessions had raised their awareness either in terms of culture-fair assessment or culture-responsive pedagogy. At this stage of the project, more ethnographic and qualitative data needs to be collected to identify any changes to pedagogic practice and any development of culture-fair assessment.

It is also difficult to make a fair assessment of the value of the software program: at one school the software had just been loaded onto the computers, and in the other two schools the software had been loaded onto the teachers' laptops but not onto the classroom computers. Consequently, only four of the teachers were positive about the potential for the use of this program in their classrooms. The teachers indicated that they had not had much opportunity either to learn the software themselves or to use it with their classes.

Conclusion

These are early days for this project; however, the anticipated outcomes from the assessment and pedagogic approaches under development will advance knowledge to include more 'culture-fair' assessment practices. There is much data to be collected and to be more theoretically analysed. The view of equity that underpins this assessment project is more of a sociocultural issue than a technical matter. Equity involves much more than a consideration of the specific design of the tests or tasks. As can be seen from the initial data collection and analysis, whether all students have access to learning is fundamental; equally important considerations are how the curriculum is defined and taught and how achievement is interpreted. The opportunity to participate in learning (access issues) and the opportunity to demonstrate learning (validity and fairness in assessment) are deemed fundamental factors in developing culture-fair assessment (Klenowski, 2009).

The differential performance of students from different cultures may not be due to bias in the choice of test content or design alone, but may be attributable to real differences in performance because of these students' differing access to learning, their different social and cultural contexts, or real differences in their attainment in the topic under consideration due to their experiences and sociocultural background. As is apparent from the professional development program organised for this design experiment, the content and mode of the NAPLAN assessment tests are outside these students' experiences, and they limit their engagement with the tests as the students position themselves as not knowledgeable in this particular assessment context.

The intention of culture-fair assessment is to design assessments so that no particular culture has an advantage over another. The purpose of culture-fair assessment is to eliminate the privileging of particular groups over others. It is, however, difficult to claim that assessments can be completely culturally unbiased. To achieve culture-fair assessment there is a need to address issues in language, cultural content, developmental sequence, framing, content and interpretation, and reporting. The sampling of the content for assessment, for instance, needs to offer opportunities for all of the different groups of students who will be taking the test. Interpretations of students' performance need to be contextualised so that what is, or is not, being valued is made explicit, as well as the constructs being assessed and the criteria for assessment. To achieve culture-fair assessment, the values and perspectives of assessment designers need to be made more public. Further, understanding how culture-fair assessment practice is developed and attained requires careful study of how the learning experience is modified by teachers for particular students to achieve engagement, participation and improvement in learning. This is now the focus of this project.

References

Berlack, H. (2001). Race and the achievement gap. Rethinking Schools Online, 15(4). Retrieved October 31, 2008, from http://www.rethinkingschools.org/archive/15_04/Race154.shtml

de Vries, E. (2009). PowerPoint presentation, Cloncurry, Queensland.

Diezmann, C., Lowrie, T., Sugars, L., & Logan, T. (2009). The visual side to numeracy: Students' sensemaking with graphics. Australian Primary Mathematics Classroom, 14(1), 16–20.

Kelly, A. E. (2003). The role of design in research. Educational Researcher, 32(1), 3–4.

Klenowski, V. (2009). Australian Indigenous students: Addressing equity issues in assessment. Teaching Education, 20(1), 77–93.

Lokan, J., Ford, P., & Greenwood, L. (1997). Maths & science on the line: Australian middle primary students' performance in the Third International Mathematics and Science Study. Camberwell, Vic.: ACER.

McGaw, B. (2007, August 1). Resourced for a world of difference. The Australian, p. 25.

Murphy, P., Hall, K., McCormick, R., & Drury, R. (2008). Curriculum, learning and society: Investigating practice. Study guide, Masters in Education. Maidenhall, UK: Open University.

Sullivan, P., Tobias, S., & McDonough, A. (2006). Perhaps the decision of some students not to engage in learning mathematics in school is deliberate. Educational Studies in Mathematics, 62(1), 81–99.

Warren, E., & de Vries, E. (2007). Australian Indigenous students: The role of oral language and representations in the negotiation of mathematical understanding. In J. Watson & K. Beswick (Eds.), Proceedings of the 30th annual conference of the Mathematics Education Research Group of Australasia. Australia: MERGA Inc.

An Even Start: Innovative resources to support teachers to better monitor and better support students measured below benchmark

Jocelyn Cook
Australian Council for Educational Research

Jocelyn Cook is a Principal Research Fellow and Manager of the Perth Office at the Australian Council for Educational Research. With a background in teaching English and Literature, she has extensive experience in system assessment and reporting. As the Manager of the Monitoring Standards in Education (MSE) team in the Department of Education and Training (DET) in Western Australia, she was responsible for all system assessment programs, overseeing a number of highly innovative and successful random sample projects that included Technology and Enterprise, Social Outcomes of Schooling, the Arts, and Languages Other than English.

From 1998 until 2006 she oversaw the introduction and transformation of the Western Australian Literacy and Numeracy Assessment (WALNA) program. Under Jocelyn's management, Western Australia was at the forefront of innovation and development in population testing in Years 3, 5, 7 and 9, as well as research into the system-level assessment of writing.

Since joining ACER in 2007, she has directed the development of An Even Start Assessments, a suite of computer-based assessments designed specifically for use with students experiencing difficulty in acquiring literacy or numeracy skills. She has also undertaken consultancies on behalf of the World Bank providing policy advice to the governments of India and Bangladesh; conducted and reported on a review of the introduction of on-screen marking for the Curriculum Council in Western Australia; and managed item development for the National Assessment Program – Literacy and Numeracy (NAPLAN).

Abstract

Literacy and numeracy are foundation skills for a successful education and a productive life. Improved literacy and numeracy outcomes encourage higher school retention rates, lead to improved employment opportunities, enhance engagement in society and support future economic prosperity. Conversely, a range of research indicates that poor basic skills are associated with a life trajectory of disadvantage. Enhancing teachers' capacity to recognise the specific needs of those students with the poorest skills and to provide remedial help is at the heart of breaking the cycle of disadvantage.

An Even Start is a Commonwealth government initiative aimed at addressing the learning needs of students who require additional assistance to acquire satisfactory literacy or numeracy skills. The resource developed for DEEWR by ACER focused on: accurate diagnosis of specific needs; provision of useful advice to teachers and tutors on remediation of specific difficulties; and more precise and accurate measures of progress.

This paper traces the conceptualisation of An Even Start Assessments, illustrating how the instruments and support materials draw together the requirements of good measurement, useful diagnostic information, and accessible and relevant teaching strategies.

Background

In the 2007–08 Budget, the Australian Government announced funding of $457.4 million over four years to provide $700 in personalised tuition (a minimum of 12 hours) to students in Years 3, 5, 7 and 9 who do not meet national benchmarks in literacy or numeracy. The tuition assistance was provided through An Even Start – National Tuition Program. The Australian Government's desired outcome for the program was a measurable improvement in the literacy or numeracy levels of eligible students. An Even Start – National Tuition Program was a large and complex program, completed in a changing political environment, weathering a total of three federal Ministers as well as a change of government.

A key feature of the program in its initial conception was the provision of one-to-one private tuition managed by a state-based Tuition Coordinator. This was broadened in its second year of operation (2008) to include school-based tuition that could be delivered either one-to-one, to groups of up to five students, or online. A requirement of the An Even Start program was to give accurate diagnostic information and resources to tutors to support them in providing appropriate instruction for eligible students. The Australian Government also required from An Even Start Assessments the capacity to evaluate the success of the initiative.

A need for better targeting

It is widely acknowledged that state and territory tests, conducted until 2007, as well as the NAPLAN instruments, provide measures of ability with large measurement errors for students at the extreme ends of the scale. This is not a criticism of the quality of these tests; rather, it is an observation on the measurement properties of any test designed to measure a wide range of abilities. This means that, for students at the lower extreme of the scale (typically with only one or two questions correctly answered), there is very little information on which to base the estimate of ability. Tests targeted for the general population consequently provide quite limited diagnostic information for very weak students.
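To make the targeting argument concrete: under a Rasch-type item response model (assumed here purely for illustration; the paper does not name the calibration model), the standard error of an ability estimate is the inverse square root of the test information, which collapses at the extremes of the scale. The sketch below, with invented item difficulties, shows how much larger the error is for a student answering only one or two questions correctly than for an average student.

```python
import math

def rasch_prob(ability, difficulty):
    """Probability of a correct response under a dichotomous Rasch model (logits)."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

def standard_error(ability, difficulties):
    """SE of the ability estimate = 1 / sqrt(test information);
    item information under the Rasch model is p * (1 - p)."""
    info = sum(p * (1 - p) for p in (rasch_prob(ability, d) for d in difficulties))
    return 1.0 / math.sqrt(info)

# A 40-item test targeted at the general population (difficulties spread around 0 logits).
items = [-2.0 + 4.0 * i / 39 for i in range(40)]

# An average student versus a very weak one (roughly one or two items correct).
for ability in (0.0, -4.0):
    expected_correct = sum(rasch_prob(ability, d) for d in items)
    print(f"ability {ability:+.1f}: about {expected_correct:.0f} correct, "
          f"SE = {standard_error(ability, items):.2f} logits")
```

The weak student's estimate carries well over twice the error of the average student's, which is the sense in which general-population tests provide limited diagnostic information for very weak students.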

The assessment instruments developed by ACER for An Even Start have therefore been carefully targeted to provide the finer-grained information about the strengths and weaknesses of those students reported below benchmark. The instruments have specific features designed both to support the student and to facilitate accurate analysis of strengths and weaknesses.

Assessment features and design

Firstly, in all An Even Start assessment materials, particular care has been taken to ensure the assessment topics and activities are interesting and based on contexts that are not perceived as trivial or simplistic, even though the activities themselves are contained and directed. Secondly, the assessment tasks are designed not only to help the tutor gain insight into the particular needs of the student and provide multiple and independent observations on aspects of student achievement, but also to help the tutor establish a structured, purposeful and productive interaction with the student.

All numeracy instruments, for example, allow a level of teacher/tutor support (reading questions aloud where reading is required) in their administration. The numeracy assessments also utilise a scripted 'interview' that allows the student to explain their mathematical thinking. These strategies are designed to limit the interference of reading skills with the diagnosis of numeracy difficulties. The numeracy items have also been constructed specifically to tap into common misconceptions that may be present in a student's mathematical thinking.

It is recognised that students who are struggling to develop adequate reading and writing skills are not well served by conventional paper and pen tests. In some instances they may not have established a strong awareness of sound/symbol correspondence and therefore are unable to effectively attempt a conventional reading or writing assessment. Older students who have experienced failure in reading and writing are often extremely reluctant to engage with assessment tasks and may exhibit passive or antagonistic behaviours. In the assessment of writing, students judged to be below the benchmark tend to produce very short texts, which provide extremely limited evidence on which to base decisions about attainment and intervention.

Innovations

An Even Start assessments of literacy directly address this issue by the inclusion of 'investigations': a series of short, highly focused activities designed to give the tutor some specific insight into the particular difficulty a student may be experiencing. The program contains two sets of investigations. The first is the Components of Reading Investigation (CoRI), a series of small investigations to be conducted one-on-one with students in order to provide specific insight into the areas of difficulty experienced by those students who are not independent readers and are deemed to be below the Year 3 benchmark. The CoRI allows the teacher to focus on the student's phonemic awareness, understanding of phonics, vocabulary, and fluency. It is essentially diagnostic in purpose.

The second set of investigations is the Investigations into the Components of Writing (ICW), and these too are essentially diagnostic. They are specifically designed to give teachers more insight into the specific areas of difficulty for students struggling to develop writing skills. The areas for investigation in writing are sentence knowledge and control; punctuation; sequencing and cohesion; spelling/word knowledge; vocabulary; and ideas.

Software

An Even Start assessment instruments and support materials are provided on-screen through a purpose-built software package. The An Even Start assessment package contains materials targeted for use with students reported below the Years 3, 5, 7 and 9 benchmarks. The package has two key components: calibrated pre- and post-assessment tests for each year level that allow progress to be monitored; and links to resources or teaching strategies relevant to the particular point of need or weaknesses identified in the pre-assessment test. The post-assessments mirror the skills assessed in the pre-assessments, although the post-assessments are a little harder overall so that progress can be measured. Reading and numeracy pre- and post-assessments include both constructed response and multiple choice questions. Multiple choice question results are automatically uploaded when the assessment is done on-screen. Constructed response questions are scored using the marking guide, available from the system documents' Main Page. Once student scores have been entered into the software, a detailed diagnostic report on the student's performance is generated. These reports show which questions the student answered correctly or incorrectly and which misconceptions may exist. Tutor resources, linked to each question or group of questions, are provided as website links in the software. These resources are as specific as possible. This means that if a student demonstrates difficulty, for example, with questions requiring control of place value, then the links are to resources that deal directly with supporting the development of that skill.
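The question-to-resource linking described under 'Software' can be pictured as a simple lookup from incorrectly answered items, via the skills they assess, to teaching resources. The sketch below is hypothetical: the item codes, skill labels and URLs are invented for illustration and are not the actual An Even Start data.

```python
# Hypothetical illustration of linking incorrect responses to tutor resources.
# Item codes, skill labels and URLs are invented; the real package ships its
# own calibrated item bank and resource links.
ITEM_SKILLS = {"N05": "place value", "N11": "place value", "N17": "fractions"}
SKILL_RESOURCES = {
    "place value": "https://example.org/resources/place-value",
    "fractions": "https://example.org/resources/fractions",
}

def diagnostic_links(responses):
    """Map each skill behind an incorrectly answered item to a teaching resource."""
    weaknesses = {ITEM_SKILLS[item] for item, correct in responses.items() if not correct}
    return {skill: SKILL_RESOURCES[skill] for skill in sorted(weaknesses)}

print(diagnostic_links({"N05": False, "N11": False, "N17": True}))
# {'place value': 'https://example.org/resources/place-value'}
```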

Measuring growth

The pre- and post-assessment items at each year level were calibrated to a common scale. These items were used to build a set of progress maps. These maps display skills typically associated with students working close to the benchmark level. The progress map contains sufficient detail to show the skills that need to be developed to make progress. To achieve pre- and post-test calibration, two sets of equating studies were conducted in 2008 in three states and across all sectors to establish year-specific and domain-specific scales. The scope and timeline of the original contract did not provide for equating to the national scale, since the national scales were not constructed at the time An Even Start material was being developed. However, the instruments for An Even Start are designed to facilitate common-person equating when national data are made available. This would allow for national benchmark locations to be applied to An Even Start scales.

Similarly, the scope and timeline of the original contract did not allow for the construction of two single scales, for numeracy and for literacy. Again, the instruments were designed with items common to adjacent years, to facilitate development of single literacy and numeracy scales that will allow progress within and between year levels to be described.

Development potential

Although the tutorial system for which this suite of assessment materials was initially designed will not continue, it is hoped that these materials will be made readily available for use with target students. As indicated, there is capacity to build on the An Even Start assessment tools in ways that would enhance the usefulness of these materials. Should funding be available, the following work is recommended:

1. equating the material to the national scales for reading and numeracy
2. calibration of the CoRI and ICW so that they become measures of the components of writing, rather than guides to early development
3. continued supplementation of the support material links, using jurisdictions' material when it becomes accessible.

Evaluation

The Department of Education, Employment and Workplace Relations (DEEWR) has commissioned an evaluation of An Even Start by the independent social research consultant, Urbis. The overall aim of the evaluation is to assess the success of the program in terms of its appropriateness, effectiveness and efficiency in achieving the objective of lifting the literacy (reading and writing) and numeracy performance of students who did not meet the national Year 3, 5 or 7 assessment benchmarks in 2007. As part of the evaluation, in March 2009, online surveys of a random sample of tutors and school coordinators involved in An Even Start were conducted. The final evaluation report is due for submission in the middle of 2009.
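The common-person equating foreshadowed under 'Measuring growth' can be illustrated simply: once the same students hold ability estimates on both the An Even Start scale and the national scale, the two scales can be aligned, to a first approximation, by the mean difference of the paired estimates. The numbers below are invented; the actual study design is as described above.

```python
# Minimal sketch of common-person equating between two calibrated scales.
# Ability estimates (logits) for the same students on each scale are invented.
even_start = {"s1": -1.2, "s2": -0.4, "s3": 0.3, "s4": 1.1}
national   = {"s1": -0.7, "s2":  0.1, "s3": 0.9, "s4": 1.6}

common = sorted(set(even_start) & set(national))
# The linking constant is the mean difference between the paired estimates.
shift = sum(national[s] - even_start[s] for s in common) / len(common)

# A benchmark located on the national scale can then be expressed on the
# An Even Start scale (and vice versa).
national_benchmark = 0.5
print(f"shift = {shift:.2f} logits; "
      f"benchmark on An Even Start scale = {national_benchmark - shift:.2f}")
```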

Large cohort testing – How can we use assessment data to effect school and system improvement?

Dave Wasson 1
Department of Education and Training, New South Wales

Dave Wasson began his teaching career in 1975, working as a classroom teacher in a number of secondary schools before becoming a regional consultant. He later held executive positions in schools, including the role of deputy principal at Whalan High School in the Western Region. In 1994 he was seconded to the Quality Assurance Directorate. He subsequently held various Chief Education Officer (CEO) positions related to quality assurance, corporate performance, school effectiveness indicators, and school reporting, before becoming the Principal of Erina High School. He was then appointed School Improvement Officer, Parramatta District, and later became CEO, School Reviews.

Since 2004 he has held the position of Director of the Educational Measurement and School Accountability Directorate (EMSAD), with responsibility for supporting school improvement initiatives across NSW and ensuring school accountability in the areas of: school reviews; annual school reporting; implementation of statewide and, more recently, national testing programs; and supervising student selection and placement in selective high schools and opportunity classes.

Abstract

This paper discusses the introduction and use of data from large cohort testing into teaching and learning in schools. It highlights the conditions that existed towards the end of the 1990s when a number of influences and initiatives coalesced to enable large cohort testing to impact positively on student outcomes. It then considers how some of these lessons might be employed to enhance the impact of the new era of national testing heralded by the introduction in 2008 of the National Assessment Program – Literacy and Numeracy (NAPLAN).

In order to contain the scope of the discussion, this paper begins with an examination of the NSW experience with the use of the data from the Basic Skills Test in literacy and numeracy in Years 3 and 5 from 1996 to 2007. To assess the impact on teaching and learning, this paper also looks at a range of school effectiveness indicators used in NSW to drive school and system improvement, including the notions of measuring growth, value added and relative effectiveness. In addition, it traces the development of the Like School Group structure employed in NSW to more meaningfully compare the performance of schools. It also evaluates the utility of various tools in supporting the analysis and interpretation of these indicators at both a school and system level.

Finally, the paper highlights the merits of a transition from current pencil-and-paper testing to an online environment to enable the assessment of a greater range of syllabus outcomes and to provide more timely feedback to teachers, students and parents.

Introduction

Large cohort testing in literacy and numeracy in Australia is a relatively new activity. The jurisdiction with the longest history is New South Wales, which began full cohort testing of students in Year 6 with the introduction of the Basic Skills Test (BST) in 1989. Its introduction was vehemently opposed by the NSW Teachers Federation and by a number of members of the Primary Principals' Association (PPA).2 More recently, the outcomes of large cohort testing and associated resources in NSW have largely been welcomed by teachers and principals across both primary and secondary schools. But there are still a number of pivotal questions: How did this culture of acceptance of the outcomes of large cohort testing develop? And can large cohort testing improve school and system performance? If so, how?

1 The opinions expressed in this paper are those of the author and not necessarily those of the NSW Department of Education and Training. I wish to acknowledge the significant contribution to this paper made by many former and current members of various iterations of the Educational Measurement and School Accountability Directorate over many years.
2 Chadwick, V., NSW Legislative Council Hansard, 28 April 1992.

Cizek (2005) argues that high stakes (accountability) tests are incapable of providing high-quality information for instructional purposes and doubts if relative group performances have anything meaningful to contribute at the school level. The NSW experience supports the contrary view: that testing and assessment programs can effectively serve two purposes at once, if the design of the tests is appropriate and there are mechanisms in place to convey the critical diagnostic and performance-related messages to the right people in a flexible and timely manner.

The NSW Department of Education and Training has addressed these issues by:

• Providing a relevant curriculum framework in the form of a high-quality syllabus upon which the tests are based

• Ensuring that the statewide testing programs reflected what teachers were teaching
• Providing to teachers sophisticated, relevant and accessible diagnostic information relating to the performance of their students
• Ensuring that teachers can access relevant resources and support to address areas of identified need.

The sophisticated analysis of student performance and the capacity to access high-quality resources electronically are features that teachers and principals can access through the highly valued and supported School Measurement, Assessment and Reporting Toolkit (SMART) software.

This paper will provide a historical overview of the development of large cohort testing in NSW, highlighting some critical developments. It will then discuss current developments, including the support provided to schools for the current National Assessment Program – Literacy and Numeracy (NAPLAN) tests. Finally, the paper will pose some future challenges in relation to large cohort testing to ensure its utility and effectiveness in promoting school and system improvement.

Historical overview of large cohort testing in NSW

The Greiner Liberal Coalition Government introduced a Basic Skills Test for all Year 6 students in NSW in 1989, providing outcome information in literacy and numeracy. In 1990 the decision was taken to expand the test to include Year 3 students. At this stage the tests were not developed on a common scale and the notion of measuring growth between testing points was not considered.

In 1994 the decision was made to move the test from Year 6 (at the end of primary schooling in NSW) to Year 5. This was an acknowledgement of the concerns from primary principals that the information from Year 6 testing came too late for teachers to meaningfully address any identified issues from the data. As a subsequent Minister for Education observed: 'The previous Government changed the Basic Skills Test from Year 6 to Year 5 after finally realising what nonsense it was to hold basic skills tests in Year 6 when it was not possible to diagnose the results'.3 4

3 Aquilina, J., Legislative Assembly Hansard, 9 April 1997.
4 Lind, P., Interview by Dave Wasson, 2 July 2009. Peter Lind is a Senior Data Analyst with the Educational Measurement and School Accountability Directorate, NSW Department of Education and Training.

In 1996 and until the end of the BST in 2007, the Year 3 and 5 tests were developed on a common scale for literacy, and a separate common scale for numeracy. The reason for this was to provide an accurate and reliable comparison of the performance of students across the two year levels. The reports could now reflect an individual student's development from Year 3 to Year 5. The reporting language was still the same, but now it also had the same meaning in Year 3 and Year 5.

The method by which this was done was to link the tests by having common questions in both. Extensive trialling identified suitable questions to act as link items.

The BST was originally developed by the Australian Council for Educational Research (ACER) using the Rasch measurement scale. Analysis by ACER showed that the scale underpinning the BST satisfied the requirements of the Rasch model (local independence, unidimensionality, specific objectivity).

Each year extensive trialling of items took place and only items that fitted the Rasch model were considered for the final test. A combination of common person equating and, since 1996, common item equating was used to place new tests on the historical literacy and numeracy scales. In the equating process, items in the equating test that showed significant misfit were not used.

As a result of these processes, stable and reliable estimates of student and cohort achievement on common literacy and numeracy scales were obtained. It is thus valid to compare individual student scores over time and also to examine cohort trends to see whether improvements have occurred.5

5 Lind, P., Interview by Dave Wasson, 2 July 2009.

The use of a common scale for both Years 3 and 5 allowed for the first time the depiction of growth between testing points. In a large and diverse jurisdiction such as NSW, this was a critical development in ensuring greater acceptance of the utility and accuracy of the data provided to principals from the administration of large cohort testing. They had argued, rightly, that comparisons based on the raw performance of student cohorts in schools were flawed and indefensible, as schools serve communities with diverse demographics.
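The Rasch model referred to above has a simple closed form, and the 'specific objectivity' property the analysis tested for can be verified directly from it: the comparison between two students' log-odds of success is independent of which item is used. A small sketch with illustrative numbers:

```python
import math

def p_correct(theta, delta):
    """Dichotomous Rasch model: ability theta and item difficulty delta, in logits."""
    return 1.0 / (1.0 + math.exp(-(theta - delta)))

def log_odds(p):
    return math.log(p / (1 - p))

# Specific objectivity: the difference in log-odds between two persons
# equals theta1 - theta2 for any item difficulty whatsoever.
theta1, theta2 = 1.5, 0.2
for delta in (-1.0, 0.0, 2.0):
    diff = log_odds(p_correct(theta1, delta)) - log_odds(p_correct(theta2, delta))
    print(f"item difficulty {delta:+.1f}: person comparison = {diff:.2f} logits")
# Every line prints 1.30: the comparison does not depend on the item used.
```

It is this item-free comparison of persons (and, symmetrically, person-free comparison of items) that makes the common-question linking described above defensible.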

An internal review of the BST undertaken in 1995 (Mamouney, 1995) made a number of recommendations, including:

• Provide BST results on computer disk with appropriate software to enable schools to analyse the data on site for school-specific purposes
• Improve analysis of the BST data to look for patterns of performance which could inform the use of data for the benefit of individual students, schools and the system
• Provide better ways of supporting school use of BST data through training programs.

In 1996 there was pressure from the NSW Primary Principals' Association to provide the information from the BST electronically and, in 1997, the first iteration of what was to become known as the School Measurement, Assessment and Reporting Toolkit (SMART) was released.

In 1996 it was also apparent that the percentage of students in the lowest band in the BST for Year 3 (Band 1), and the lowest two bands in Year 5 (Bands 1 and 2), was unacceptably high. There was a need for a new approach to the teaching of both literacy and numeracy in NSW schools. In 1997 the State Literacy Strategy was launched. This was accompanied by a new syllabus (K–6 English Syllabus, 1998), an unprecedented level of professional learning for teachers and a large bank of practical teaching resources, as well as enhanced central and regional consultancy support.

According to the Director at that time of the Curriculum K–12 Directorate, the State Literacy Strategy:

… drew fragmented philosophical strands together and focused on explicit and systematic teaching and buried the prevalence of learning by osmosis. The Strategy provided a secure foundation for literacy learning and revolutionised the way teachers and educators in NSW talked about learning. It provided the confidence that NSW was moving in the right direction regarding literacy teaching and largely neutralised the debate between the whole language and phonics camps.6

6 Wasson, L.J., Interview by Dave Wasson, 19 September 2007.

The new K–6 English Syllabus was released in 1998 and the State Literacy Strategy evolved into the State Literacy Plan in 1999. The Plan provided for an increased concentration of resources in terms of personnel, support materials and professional learning for teachers. It was accompanied by the comprehensive assessment of student literacy skills via the Basic Skills Tests and the provision of sophisticated electronic analysis of individual, group and school performance via SMART.

The BST for primary schools was subsequently complemented by a new literacy assessment for secondary students in 1998, the English Language and Literacy Assessment (ELLA), followed by the Secondary Numeracy Assessment Program (SNAP) in 2001.

An extensive evaluation of the State Literacy Plan was undertaken in 2003 by the Educational Measurement and School Accountability Directorate (EMSAD). The evaluation (NSW Department of Education and Training, 2004) confirmed that the Plan was highly successful and that teaching practice had indeed changed. The evaluation also indicated that the resources developed were focused and valued and that teachers were now better equipped to identify areas of student need.

The following table illustrates the trends from 1996 to 2007 for students placed in the bottom and top bands in BST literacy. While the outcomes from a large cohort testing program such as the BST are subject to volatility from year to year, there is a noticeable improvement trend, with a reduction in the percentage of students in Band 1 from about 17 per cent in 1996 to about 11 per cent.

It is important to note that the underlying scale for the development of the BST in NSW did not change over this period. This indicates a level of genuine improvement of student outcomes from 1998, when the percentage of students in Year 3, Band 1, for example, was reduced from 15.4 per cent in 1998 to 10.7 per cent in 1999.

So, in 1998 there was a convergence of initiatives that conspired to positively impact on the learning outcomes of students in NSW: student outcomes data from large cohort testing; the implementation of a high-quality syllabus and a statewide training and development program; and the provision of sophisticated diagnostic information on student performance for teacher use via SMART.

Table 1: Literacy percentages in Bands

Band          1996  1997  1998  1999  2000  2001  2002  2003  2004  2005  2006  2007
Y3 Band 1     17.0  16.0  15.4  10.7  11.1  11.8  10.7  12.2  10.8  11.5  10.6  11.1
Y3 Band 5     16.7  17.3  13.2  13.9  15.1  19.8  18.1  17.7  16.6  20.4  19.4  19.5
Y5 Band 1&2    9.0   8.2   8.5   5.9   7.5   6.2   5.4   6.0   6.9   7.1   6.9   6.8
Y5 Band 6     19.2  23.7  20.2  19.6  19.5  23.0  24.9  25.6  27.8  23.8  25.0  26.7

Table 2: Numeracy percentages in Bands

Band          1996  1997  1998  1999  2000  2001  2002  2003  2004  2005  2006  2007
Y3 Band 1     10.8  10.8  13.8  10.3  14.7  10.6   9.3   8.1  10.1   9.2   9.1   8.5
Y3 Band 5     23.6  17.7  21.0  16.0  15.3  15.4  17.9  17.1  15.1  21.8  21.8  19.3
Y5 Band 1&2    7.5   5.9   5.7   7.4   8.0   6.4   6.3   6.0   6.4   6.6   5.4   6.5
Y5 Band 6     20.9  20.9  23.3  23.2  19.6  23.2  25.1  23.1  24.9  23.9  29.6  32.6

It is also apparent that this percentage has stabilised and that further reduction, including reduction of students at this level in NAPLAN, will require a new approach.

Over the same period for numeracy the improvement is not as pronounced, perhaps reflecting a greater emphasis on literacy in NSW at both policy and operational levels.

An evaluation of assessment and reporting processes and outcomes in NSW was undertaken in 2003 by Eltis and Crump. At this time, Eltis and Crump detected a major shift in attitude to the outcomes of large cohort testing. They observed that there was 'overt support for testing programs' and that there was a marked increase in the 'quality of information available to schools as a result of statewide testing programs.'

They further noted that 'statewide tests have come to be valued by teachers and parents for their perceived diagnostic assistance for each student …' (Eltis & Crump, 2003). In addition, Eltis and Crump commented on the quality and support for an earlier iteration of the SMART software which '… allows schools to analyse their results by viewing achievement levels, student results and questions, and question details. Results and graphs can be printed (and the software) provides hyperlinks to resource materials.'

Key developments

The developments described above were pivotal in gaining support for the outcomes of large cohort testing in NSW. In addition to these, over the last decade a number of initiatives relating to the provision of more sophisticated school performance information have been implemented that have provided additional levels of analysis to teachers, principals and their supervisors. While some of this additional information was welcomed in schools, the data presented school performance in new and challenging ways that meant even some high-performing schools in purely raw terms were not performing as expected when their school intake characteristics were taken into account.

Growth

A most important type of additional information presented was the measurement of growth. The depiction of growth between testing points, where the underlying measurement scale was common, was possible with the implementation of a common scale across Years 3 to 5 from 1996. The notion of growth between testing points levelled the playing field, when the two variables that have the greatest impact on the quality of student outcomes in NSW are taken into account: socioeconomic status and geographic location. This initiative was relatively quickly understood by principals and largely embraced. It was depicted in SMART in a way that allowed the growth of individual students to be identified, and for that information to be aggregated for a custom group of students or for the entire cohort. (See Figure 1.)

Value added and relative effectiveness indicators for secondary schools

A second significant type of additional information presented was the measurement of value added. Work on value added and relative effectiveness indicators was undertaken from 1995 (NSW Department of School Education, 1997a) and the models stabilised in 1998 (Smith, 2005). The notion of value added, as distinct from growth, is to use performance on one measurement scale at a particular point in time to predict subsequent performance on a different measurement scale. For example, using student performance in the Basic Skills Test to predict and measure subsequent performance in the NSW School Certificate.

These additional levels of analysis for school and system use, such as growth, value added and relative effectiveness, enabled school, regional and central personnel to grapple in sophisticated ways with school effectiveness issues. Principals could see that a system performance analysis was based on more than just raw scores.
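The distinction drawn between growth and value added lends itself to a small worked sketch: value added treats the part of later performance not predicted from the earlier measure as the indicator of interest. The scores below are invented, and the NSW models are considerably more elaborate, but the logic of predict-then-take-the-residual is the same.

```python
# Minimal value-added sketch: regress the later measure (e.g. School
# Certificate) on the earlier one (e.g. BST Year 5); the residual is the
# value-added indicator. Scores are invented for illustration.
bst = [42.0, 48.0, 51.0, 55.0, 60.0]   # earlier scale
sc  = [58.0, 66.0, 64.0, 73.0, 79.0]   # later, different scale

n = len(bst)
mx, my = sum(bst) / n, sum(sc) / n
slope = sum((x - mx) * (y - my) for x, y in zip(bst, sc)) / sum((x - mx) ** 2 for x in bst)
intercept = my - slope * mx

for x, y in zip(bst, sc):
    predicted = intercept + slope * x
    print(f"BST {x:.0f}: predicted SC {predicted:.1f}, value added {y - predicted:+.1f}")
```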

Table 3: Correlations between BST Year 5 predictor scores and School Certificate student course scores for 2008

Course            Correlation
English           0.75
Mathematics       0.78
Science           0.73
History           0.65
Geography         0.67
Computer Skills   0.72

Figure 1: Reading Growth – BST Yr 3 2006 to NAPLAN Yr 5 2008 (each arrow runs from a student's Yr 3 Reading score to the Yr 5 score; average growth for the school is shown in comparison with the State)

An additional value added indicator for secondary schools is the use of the Year 10 School Certificate aggregate measure as a predictor of subsequent Higher School Certificate performance (correlation = 0.790). An example of the depiction of value added in SMART is shown in Figure 2. This was a difficult notion for some principals and teachers to accept and understand, and required a lot of professional learning before it was accepted as legitimate and became valued in schools.

Figure 2: Value added between Year 10 School Certificate and Year 12 Higher School Certificate (average value added for the school is shown in comparison with its Like School Group; schools can compare their performance with like schools or with a group of their choice)

Curriculum links (Teaching strategies)

Arguably the most important development in securing support of large cohort testing in NSW and the subsequent use of the information to drive school and system improvement was the linking of test items with high-quality teaching strategies.

In 1999 the decision was taken to better support teachers with high-quality support for statewide tests, and in the same year hard copy teaching strategies linked to the skills underpinning a number of the test items were developed for the first time.

Within the SMART software there was a page reference provided to direct teachers to the relevant hard copy page in the Curriculum Link document.7 From 2005 the Curriculum Links were made available electronically within SMART. This process began with the BST and subsequently included ELLA and SNAP.

In 2008, the quality and scope of the teaching strategies was significantly increased to coincide with the implementation of the first NAPLAN test. The strategies were delivered as HTML documents via the Web – as had been the case in 2007 – but every test item in NAPLAN in literacy and numeracy, and for all Years 3, 5, 7 and 9, was linked to the NSW curriculum, and the skills underpinning the items were addressed with highly effective and classroom-ready teaching strategies. For 2008, in excess of 800 electronic pages of teaching strategies were developed to better support teachers, many with hyperlinks to relevant sites on the Web.8

In addition, the strategies were developed within the NSW Quality Teaching Framework9 and in many cases included a range of strategies for the one skill area for students at different ages and at different levels of ability: strategies for students who require modelled teaching, guided teaching or independent teaching.

The guiding principles for the development of the NAPLAN teaching strategies were:

1. The NSW Quality Teaching Framework (QTF)
2. The Modelled, Guided and Independent teaching cycle
3. The National Statements of Learning for English (SOL)
4. Strategies, and activities to support those strategies
5. Critical aspects of literacy development K–10 continuum (NSW Department of Education and Training, 2008).

Figure 3: Teaching strategy links in SMART for NAPLAN 2008 (part of the sitemap for literacy teaching strategies; clicking a link accesses the teaching strategies)

This focus on student diagnostics and supporting teachers has been particularly successful in gaining support for large cohort testing across the NSW educational community. For 2009, the NAPLAN teaching strategies will be further developed to address skill areas that were tested for the first time in 2009, or where existing strategies require enhancement or redevelopment.10

School and regional performance graphs

In 2005, EMSAD undertook further development work on school and regional performance indicators based on assessment data from large cohort tests. These data were presented on XY scatter plots, using variables that research undertaken by Dr Geoff Barnes (from EMSAD) indicated had the greatest influence on student learning outcomes. These variables were IRSED (Indicators of Relative Socio-Economic Disadvantage), ARIA (Accessibility/Remoteness Indicators for Areas), student attendance and teacher attendance. They were used for the 2006 and subsequent tests. It is important to note that the research undertaken by Barnes indicates that there is no correlation in NSW between teacher attendance and the quality of student outcomes.

The two performance measures analysed in relation to these variables were raw performance, for example average Years 3 and 5 mean scores for 2008 NAPLAN, and value added measures for junior and senior secondary schools. As of 2010, growth will be included between Years 3 and 5, Years 5 and 7, and between Years 7 and 9.

7 Cordaiy, R., Interview by Dave Wasson, 30 June 2009. Robert Cordaiy is the Manager, School and System Measurement and Analysis, Educational Measurement and School Accountability Directorate, NSW Department of Education and Training.
8 O'Donnell, K., Interview by Dave Wasson, 3 November 2008. Kate O'Donnell is R/Assistant Director for the Educational Measurement and School Accountability Directorate, NSW Department of Education and Training. Ms O'Donnell is the NSW DET representative on the NAPLAN Project Reference Group.
9 Further information about the Quality Teaching Framework can be found at: http://www.curriculumsupport.education.nsw.gov.au/qualityteach/index.htm
10 O'Donnell, K., Interview by Dave Wasson, 3 July 2009.

The kind of performance information depicted in Figure 4 below has been used extensively to identify and share best practice, and to identify schools at a regional level for closer monitoring and specific support through the 'Focus Support School' model which is having a demonstrable impact on a number of schools.

Figure 4: School performance relative to SES (each point is a school plotted against SES, with a predicted performance line and upper and lower boundary lines; schools above the upper boundary show high performance relative to SES, schools below the lower boundary show low performance relative to SES)

At the same time, a Like School Group (LSG) methodology was developed to meaningfully compare schools. This was welcomed by principals, especially when their school was remote; in a low socioeconomic status area; had a high proportion of Indigenous students; or more especially if all three factors were present. These principals maintained it was indefensible to compare their performance with that of the state average, for example. Comparisons with a LSG to a certain extent levelled the playing field and were largely supported (more than 60 per cent of NSW government schools voluntarily report their outcomes against their relevant LSG in mandatory annual school reports). A LSG structure was developed that is reflected in Figure 5 below.

Figure 5: NSW Like School Group structure – 2005–2008

While this model represented a significant step forward in terms of interpreting school performance within the context of the two community factors that explain the greatest amount of variation of performance in NSW (SES and remoteness), the relatively arbitrary cut-points for the various groupings created disquiet amongst some principals. For example, there were 239 primary schools in the Metro C group. This meant that while there may have been some justification for comparison with the mean performance of schools in Metro C, no one could argue that a school at the cut-point with Metro D was similar to a school at the cut-point with Metro B. A more defensible and more equitable model was required.

School Community Education Advantage (SCEA)

The pathway to develop a new form of LSG model came from the work undertaken by ACER (Masters et al., 2008) and commissioned by the Department of Education, Employment and Workplace Relations (DEEWR). Masters et al. advocate a 'statistical neighbour' approach, such as that which is used in Ontario, that allows schools to compare performance with schools that are most like them on various measures.

To undertake this analysis, the three main community influences on school aggregated outcomes were used: socioeconomic status (as measured by the ABS Index of Education and Occupation); remoteness (as measured by ARIA); and percentage of Aboriginal enrolments.

The table below shows the correlations between these measures and the school performance measures.

Table 4: Correlations between community variables and school performance

                             Primary   Junior secondary
SES                          .772      .653
%Aboriginal                  .555      .428
ARIA (rural schools only)    .293      .274

Notes:
1. SES correlations are based on the ABS IEO (Index of Education and Occupation) SEIFA measure.
2. Correlations based on analyses of NSW DET data.

Schools are ranked according to their values on the SCEA scale. The graph below plots the SCEA values for all NSW government schools against overall performance measures, and demonstrates the process for generating like school group comparison data. Each point on the graph represents a school. The position of the school on the horizontal axis is determined by its SCEA value. The comparison group for a given school comprises the 20 schools to the left and the 20 schools to the right of that school. For example, the vertical lines either side of School 1 and School 2 encompass the schools that would form their respective comparison groups. Note that the performances of the comparison group schools can vary considerably because of in-school factors. The average outcomes for the comparison group schools become the like school comparison data for that school (Barnes, 2009).

Figure 6: School performance relative to SCEA (each point is a school; vertical lines either side of School 1 and School 2 mark their comparison groups; SCEA and performance scores are expressed in standardised z-score units)

The significant advantage of this model over the previous NSW LSG model is that at each point along the SCEA scale the comparison group of schools changes. In this way, apart from the two extremes at either end of the SCEA scale, with about 1600 primary schools in NSW, there are potentially 1520 different, or 'floating', LSGs. Discussions with executive members of both the Primary Principals' Association and the Secondary Principals' Council in NSW indicate strong support for this revised form of LSG comparison model.
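The 'floating' comparison groups are straightforward to express computationally: rank schools by their standardised SCEA values and take the 20 neighbours on either side. The sketch below uses invented data and handles the two extremes of the scale only crudely, by truncating the group.

```python
# Sketch of the 'statistical neighbour' / floating Like School Group idea:
# each school is compared with the 20 schools either side of it on the SCEA
# scale. Data are invented; about 1600 schools, SCEA expressed as z-scores.
import random

random.seed(1)
scea = {f"school_{i:04d}": random.gauss(0.0, 1.0) for i in range(1600)}
ranked = sorted(scea, key=scea.get)

def comparison_group(name, width=20):
    """The 20 schools to the left and the 20 to the right on the SCEA scale."""
    i = ranked.index(name)
    lo, hi = max(0, i - width), min(len(ranked), i + width + 1)
    return [s for s in ranked[lo:hi] if s != name]

print(len(comparison_group("school_0007")))  # 40, except near the extremes
```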

Future challenges

EMSAD is working towards implementing online testing for large cohorts, which potentially has numerous advantages over current pencil-and-paper approaches. These include, primarily, the capacity to assess a greater range and depth of syllabus outcomes and the provision of more timely diagnostic feedback to teachers, parents and students. With the current four-month lag between testing and reporting in NAPLAN, for example, the relevance and utility of the diagnostic information provided is sometimes questioned.

The Essential Secondary Science Assessment (ESSA) will extend earlier online developmental work undertaken with the previous Computer Skills Assessment for Year 6 (CSA6), and transition to a fully online science test for Year 8 students in 2011. There is already an online element to ESSA – the Online Practical Component (OPC). This is an innovative approach to the assessment of science as it creates the elements of a science laboratory online (see Figure 7).

Figure 7: Replicating a science laboratory online in the Year 8 science test – ESSA

Figure 8 presents various forms of testing in NSW on a matrix, in terms of efficiency and immediacy of feedback on the vertical axis, and capacity to measure a range of syllabus outcomes on the horizontal axis. The limitations of standard pencil-and-paper large cohort tests, represented by 'A' in the matrix, are arguably that they are inefficient, they do not provide diagnostic information back to teachers and the system in a timely manner, they are limited in their capacity to assess a range of syllabus outcomes, they are expensive and they are environmentally unfriendly.

The technological capacity currently exists to transition from pencil-and-paper tests to an online environment, where it is possible for the instant scoring of student responses, online assessment of written responses and the possible assessment of a greater range of syllabus outcomes. The challenge remains to implement the change.

Figure 8: Dimensions of testing – Efficiency versus range of syllabus outcomes (vertical axis: efficiency in testing and marking; horizontal axis: range and depth of assessment; possible locations: A – standard pencil & paper; B – CSA or other simple computer-skills tests; C – Assessment Item Databank (AID) concept, or other e-learning systems; D – current HSC; E – ESSA Online Practical Component)

Conclusion: Lessons from the NSW experience

Large cohort testing can have a positive impact on school and system outcomes, particularly and most importantly in the area of improved student outcomes, when:

• Driven by a rigorous, relevant and pedagogically sound curriculum framework
• Supported by extensive and relevant professional opportunities for teachers
• Assisted by sophisticated diagnostic tools for the analysis of individual, group, school and system performance
• Accompanied by central and local consultancy support and high-quality support materials.

References

Barnes, G. (2009). Proposed model for constructing like school data. Sydney: Educational Measurement and School Accountability Directorate, NSW Department of Education and Training.

Chadwick, V., NSW Legislative Council Hansard, 28 April 1992.

Cizek, G. (2005). High-stakes testing: Contexts, characteristics, critiques, and consequences. In R. P. Phelps (Ed.), Defending standardized testing. Mahwah, New Jersey: Lawrence Erlbaum Associates.

Eltis, K., & Crump, S. (2003). Time to teach, time to learn: Report on the evaluation of outcomes assessment and reporting in NSW government schools. Sydney: NSW Department of Education and Training.

Mamouney, R. (1995). 1995 review of the Basic Skills Testing Program. Sydney: NSW Department of School Education.

Masters, G., Lokan, J., Doig, B., Khoo, S.-T., Lindsey, J., Robinson, L., & Zammit, S. (1990). Profiles of learning: The Basic Skills Testing Program in New South Wales 1989. Melbourne: ACER.

Masters, G. N., Rowley, G., Ainley, J., & Khoo, S.-T. (2008). Reporting and comparing school performances. Paper prepared for the MCEETYA Expert Working Group to provide advice on national schools data collection and reporting for school evaluation, accountability and resource allocation [electronic resource]. MCEETYA. Retrieved [month] [day], [year], from http://www.appa.asn.au/images/news/mceetyareporting20090420.pdf

NSW Department of Education and Training (2004). State Literacy Strategy: Evaluation 1997–2003. Sydney: NSW Department of Education and Training.

NSW Department of Education and Training (2008). Welcome to linking NAPLAN 2008 to the curriculum [electronic resource]. Retrieved [month] [day], [year], from https://www.det.nsw.edu.au/directorates/schoimpro/EMD/naplan/pubs/Naplan08CL/index.htm

NSW Department of School Education (1997a). Understanding value added. Sydney: NSW Department of School Education, Corporate Performance Directorate.

Smith, M. (2005). Getting SMART with data in schools: Lessons from NSW. Research Conference paper. Melbourne: Australian Council for Educational Research.

Interviews

Cordaiy, R., Interview by Dave Wasson, 30 June 2009.
Lind, P., Interview by Dave Wasson, 2 July 2009.
O'Donnell, K., Interview by Dave Wasson, 3 November 2008.
Wasson, L.J., Interview by Dave Wasson, 19 September 2007.

Do rubrics help to inform and direct teaching practice?

Stephen Humphry
University of Western Australia

Stephen Humphry is an Associate Professor with the Graduate School of Education at the University of Western Australia. He teaches masters units in Educational Assessment, Measurement and Evaluation and is involved in a number of research projects. He currently holds an Australian Research Council grant entitled Maintaining a Precise Invariant Unit in State, National and International Assessment with Prof David Andrich of UWA. He is a member of the Curriculum Council's Expert Measurement and Assessment Advisory Group and is involved in research on assessment and measurement more broadly in Western Australia and Australia. He has presented at international conferences and has visited and worked with international organisations and institutions, including MetaMetrics and the Oxford University Centre for Educational Assessment.

Dr Humphry completed his PhD under Professor David Andrich, with a focus on maintaining a common unit in measurement in the social sciences. His doctoral research involved advancements in item response theory as well as applied work to demonstrate that the advancements lead to improved test equating. Prior to 2006, he worked for a number of years in industry as the Senior Psychometrician for the Department of Education Western Australia. During that time, he was responsible for the measurement and statistical analysis of data obtained in large-scale State testing programs. He designed and coordinated research and development projects associated with the assessment program, as well as projects focusing on the use of student data for monitoring and evaluating student performance.

Dr Humphry has several lines of active research, the most central being work on developing a general framework for defining and realizing units in the social sciences. His work in education has included research on: test equating; rubrics; applications of the process of pairwise comparison; and teacher effectiveness. He is also pursuing research on parallels between biological and cognitive growth that mirror parallels between methods of data analysis used by Sir Julian Huxley and Georg Rasch.

Sandra Heldsinger
University of Western Australia

Sandy Heldsinger has worked at the University of Cambridge Local Examination Syndicate (UCLES) as a research officer responsible for establishing programs of trialing and pre-testing, as project coordinator for the Australian National Benchmarking Equating Study and as an associate lecturer at Murdoch University in educational assessment. She worked as Senior Educational Measurement Officer, Population Testing in the Department of Education, WA for over seven years, and her work included coordination of random sample assessment programs of student achievement in the social outcomes of schooling and the society and environment learning area, and the coordination of the annual, full cohort WA assessment program.

In her work with the Western Australia Department of Education, Dr Heldsinger conceptualised and led the development of a suite of publications that assist teachers to interpret the data from system level assessment programs and to understand the frameworks that guide teaching and assessment. Dr Heldsinger commenced as a Lecturer, UWA in 2006, where she teaches in assessment and educational measurement.

Background

Assessment in learning domains that require an extended performance of some kind (for example, an essay or work of art) has been considerably more vexed than for domains where closed response items, such as multiple-choice items or short answer items, are valid. Different countries have grappled with the issues related to performance assessment in slightly different ways depending on the dominant assessment regime, but the underlying issues remain very similar. In the United Kingdom (UK), for example, the assessment of a single composition in a fixed-time examination, marked by a detailed marking scheme, is seen as the archetypal assessment that has influenced practice in the current assessment regime (Wilkinson et al., 1980). In the 1930s, dissatisfaction with this way of marking led to a debate about analytical marking as opposed to impressionistic marking, where analytic marking consisted of a series of headings or criteria and an allocation of marks available for each criterion (Wilkinson et al., 1980). Concerns that this way of marking did not result in the best essay obtaining the top mark led to an exploration of impression marking, where the markers were provided with a small number of criteria to consider when marking; but rather than being provided with a mark for each criterion, they arrived at a judgment of an overall mark.

In the 1980s there was a renewed interest in performance assessment. In part, this renewed interest resulted from the imposition in some countries, principally the United States of America (USA), of system-level standardised assessments where the predominant question format was multiple choice or short answer. Performance assessments were considered to be an integral aspect of educational reform because of their capability of measuring learning that could not be assessed through the more closed response formats, and because of their value for curricular and instructional changes (Lane & Stone, 2006).

It appears that the renewed interest in performance assessment coincided with educational reform that was happening in a number of countries. This reform saw a move away from syllabus documents, which provided details of what teachers needed to teach, to frameworks that described progression in student learning. In the UK, this framework took the form of the National Curriculum; in Australia, National Profiles were developed, and these in turn were reworked by each State educational authority. In Western Australia, the framework was referred to as the Outcomes and Standards Framework. In 1995, Spady (cited in Dimmock, 2000) outlined the features of Outcome-Based Education, two of which were:

• Schools define and communicate to students and parents the performance criteria and standards that represent the intended learning outcomes

• Assessment is matched to the criteria and every student is eligible for high marks.

Outcome-based education has the same intentions as rubrics: to capture the essence of student performance or development at various levels. When the difficulties experienced in assessing performances are considered in relation to the move towards defining performance criteria and standards, it is not surprising that rubrics have become so popular. But are they, as Popham (1997) suggests, 'instructionally fraudulent'? Do rubrics help to inform and direct teaching practice?

To explore these questions further, this presentation firstly considers the typical rubric structure. It then provides an overview of a series of extensive empirical studies of the assessment of students' narrative writing. This presentation focuses on the qualitative research; the quantitative research undertaken is reported separately (Humphry & Heldsinger, 2009). Finally, the implications of the findings from these studies for the use of rubrics as instructional tools are discussed.

Overview of rubrics

A scoring rubric typically has three parts: (1) performance criteria, (2) performance levels and (3) a description of the features evident in the performance at each level. The performance criteria are related to the task; for example, if a teacher was assessing his or her students' skills in devising an advertising brochure, one of the criteria could be the visual appeal of the brochure. The performance levels may be indicated by labels such as weak, good, very good and outstanding, or by numbers that indicate increasing levels of achievement. The descriptions that accompany each of the performance levels summarise in some way the expected features of the performance at that level.

The predominant format of rubrics is that each criterion has the same number of performance levels, and most commercially available rubrics have four performance levels for each criterion. We will now focus on a specific example to examine these features of rubrics and the implications for using rubrics to inform and direct teaching practice.
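To make this three-part structure concrete, the sketch below (ours, not the authors'; the criterion names, level labels and descriptions are invented for the hypothetical advertising-brochure example above) models a rubric as a simple data structure, together with a scoring function:

```python
# A minimal sketch of the three-part rubric structure described above:
# (1) performance criteria, (2) performance levels, and (3) a description
# of the features evident at each level. All names are invented for the
# hypothetical advertising-brochure example; nothing here is from the paper.

LEVELS = ["weak", "good", "very good", "outstanding"]

RUBRIC = {
    "visual appeal": [
        "Layout distracts from the message.",               # weak
        "Layout is tidy but plain.",                        # good
        "Layout is attractive and supports the message.",   # very good
        "Layout is striking and strengthens the message.",  # outstanding
    ],
    "clarity of text": [
        "Key information is missing or unclear.",
        "Key information is present but hard to find.",
        "Key information is clear and easy to find.",
        "Key information is clear, concise and persuasive.",
    ],
}

def total_score(judgments: dict) -> int:
    """Convert one level judgment per criterion into a numeric total,
    using the position of the level label (0-3) as the criterion score."""
    return sum(LEVELS.index(level) for level in judgments.values())

# A marker judges one brochure on both criteria.
print(total_score({"visual appeal": "good", "clarity of text": "very good"}))  # 3
```

On this representation, the 'predominant format' noted above – the same number of levels for every criterion – corresponds to every list of descriptions having the same length; the revised rubric described later in this paper deliberately abandons that assumption.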
Rubric for the assessment of narrative writing

The rubric discussed here was devised to assess narrative writing in the full-cohort testing program in Western Australia. The rubric was extracted from the Western Australian Outcomes and Standards Framework (OSF). The OSF describes the typical progress students make in each of eight learning areas. Learning in these areas is described in terms of eight stages, referred to as eight levels.

This rubric consisted of nine criteria. Markers were required to make an on-balance judgment as to the level (1–8) of each student's performance overall, and then they were required to assess each performance in terms of spelling, vocabulary, punctuation, sentence control, narrative form of writing, text organisation, subject matter, and purpose and audience.

The category descriptions within each criterion were derived directly from the OSF. That is, the description used to determine a score of 2 in spelling was taken directly from the description of the level 2 performance in the OSF; the description for a score of 3 was taken directly from the level 3 description in the OSF, and so on. The number of categories for each criterion is shown in Table 1.

Several interrelated issues with the psychometric properties of the data obtained from this assessment were identified, the most tangible being the distribution of student raw scores.

Figure 1 shows the raw score distribution of Years 3, 5 and 7 students in 2001, 2003 and 2004. It can be seen, firstly, that the distributions remained relatively stable over the period (2001–2004). This stability was achieved through the training of markers, and in particular through the use of exemplar scripts, rather than by applying post-hoc statistical procedures. Secondly, and most importantly, the graph shows that although there is a large range of possible score points (1–61), the distribution clusters on a relatively small subset of these (in particular, around scores 18, 27 and 36 – the totals a student would obtain if scored 2, 3 or 4 respectively on each of the nine criteria).

Table 1: Original classification scheme for the assessment of writing

Aspect                        Score range
On-balance judgment (OBJ)     0–8
Spelling (Sp)                 0–5
Vocabulary (V)                0–7
Sentence Control (SC)         0–7
Punctuation (P)               0–6
Form of Writing (F)           0–7
Subject Matter (SM)           0–7
Text Organisation (TO)        0–7
Purpose and Audience (PA)     0–7
Total score range             0–61

Examination of logical and semantic overlap in the rubric

A close analysis of the rubric revealed logical and semantic overlap in some of the performance criteria and levels. Table 2 shows an extract taken from the rubric, and it can be seen that a student who writes a story with a beginning and a complication would be scored 2 for the criterion form of writing. This student will necessarily have demonstrated some internal consistency of ideas (category 2, subject matter). Similarly, if a student has provided a beginning and a complication, he or she has most probably provided a narrative that contains two or more related, connected ideas (category 2, text organisation).

Based on this work, the marking rubric was refined by removing all semantic overlap. The results from this second series of studies showed that the semantic overlap did to some extent cause artificial consistency in the marking.

[Figure 1 is a column chart. Legend: WALNA 2001, WALNA 2003, WALNA 2004. Vertical axis: frequency (0–8000); horizontal axis: raw score (1–58).]

Figure 1: The raw score distribution of Years 3, 5 and 7 students’ narrative writing as assessed through the Western Australian Literacy and Numeracy Assessment in 2001, 2003 and 2004

Table 2: Extract from the narrative rubric, showing semantic overlap of criteria

Form of writing
Category 1: Demonstrates a beginning sense of story structure; for example, opening may establish a sense of narrative.
Category 2: Writes a story with a beginning and a complication. Two or more events in sequence. May attempt an ending.

Subject matter
Category 1: Includes few ideas on conventional subject matter, which may lack internal consistency. Ideas are few, may be disjointed and are not elaborated.
Category 2: Has some internal consistency of ideas. Narrative is predictable.

Text organisation
Category 1: Attempts sequencing, although inconsistencies are apparent. For longer texts, overall coherence is not observable.
Category 2: Writes a text with two or more connected ideas.

Relative crudeness of performance levels

As previously explained, the marking rubric was derived directly from the levels of performance described in the OSF. The explanation that accompanied the introduction of the OSF was that the average student would take approximately 18 months to progress through a level. The levels therefore do not describe, and are not expected to describe, fine changes in student development.

Over and above the issues related to the halo effect and the semantic overlap, the marking rubric did not capture the fine changes that can be observed in student writing development. Although there were qualitative differences between the students' written performances, the markers could classify the students only into three or four relatively crude groupings.

Although the marking rubric contained many criteria, and therefore many score points, it provided only relatively few thresholds, or points of discrimination. Essentially, all the information about student performance was obtained from the overall judgment – that is, the on-balance judgment of the student's level. All other judgments were replications of that judgment.

The statistical analysis of the data provides the opportunity to examine the relationship between levels (as depicted in the marking rubric) and student ability. Figure 2 is taken from the analysis of the writing data and shows that, within a wide ability range, a student would have a high probability of being scored similarly on each criterion. For example, students within the ability range of -3 to +1 logits would have a high probability of scoring all 3s, whereas students in the ability range of +1 to +6 logits would have a high probability of scoring all 4s. Based on the mean scores of students of different age levels, these ability ranges equate to approximately two years of schooling.

Devising a rubric that provides greater precision of student development in narrative writing

Based on an analysis of our findings, it was hypothesised that the general level of description in the framework of how student learning develops did not provide the level of detail we needed for a marking rubric of students' narrative writing. The framework makes no mention of character and setting, for example, nor does it articulate in fine detail how students' sentence punctuation, or punctuation within sentences, develops.

This hypothesis was tested by developing a rubric that captured finer gradations in performance. The new rubric emerged from a close scrutiny of approximately 100 exemplars. We compared the exemplars, trying to determine whether or not there were qualitative differences between them and trying to articulate the differences that we observed.

[Figure 2 is a threshold map with one row per criterion (WOBJ, WSP, WV, WSC, WP, WF, WSM, WTO, WPA), showing the score thresholds for each criterion plotted against ability on a logit scale running from approximately -9 to 12. ** = reversed thresholds.]

Figure 2: Threshold map showing the relationship between ability and the probability of a score for each criterion.
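Threshold maps such as Figure 2 are characteristic of polytomous Rasch-family analyses. As a hedged sketch only (the paper does not state the exact model used, although the logit metric and the reversed-thresholds annotation are typical of such analyses), the probability that a student of ability $\theta$ logits receives score $x$ on a criterion with thresholds $\tau_1, \dots, \tau_m$ can be written in partial credit form:

$$P(X = x \mid \theta) = \frac{\exp\left(\sum_{k=1}^{x} (\theta - \tau_k)\right)}{\sum_{j=0}^{m} \exp\left(\sum_{k=1}^{j} (\theta - \tau_k)\right)}, \qquad x = 0, 1, \dots, m,$$

where the empty sum for $j = 0$ is taken as zero. Under a model of this kind, a student whose ability lies well inside the interval between two adjacent thresholds has a high probability of the corresponding score, which is why the wide, similarly placed threshold intervals across criteria in Figure 2 imply near-identical scores on every criterion.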

Table 3: Revised classification scheme for the assessment of writing

Aspect                          Score range
On-balance judgment             0–6
Spelling                        0–9
Vocabulary                      0–6
Sentence structure              0–6
Punctuation of sentences        0–2
Punctuation within sentences    0–3
Narrative form                  0–4
Paragraphing                    0–2
Character and setting           0–3
Ideas                           0–5
Total score range               0–46
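As a quick arithmetic check on the two classification schemes (our addition; it follows directly from Tables 1 and 3), each maximum total score is the sum of the criterion maxima:

$$8+5+7+7+6+7+7+7+7 = 61 \qquad \text{and} \qquad 6+9+6+6+2+3+4+2+3+5 = 46.$$

The revised scheme thus has a slightly smaller total score range, but, as Figure 3 shows, its thresholds are spread more informatively across the ability range.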

[Figure 3 is a person-item threshold distribution. Upper panel: persons (grouping set to interval length of 0.25, making 84 groups; N = 72218, mean = -1.158, SD = 2.571), showing frequency against location (logits, -12 to 9). Lower panel: item threshold frequencies on the same logit scale.]

Figure 3: Distribution of students in relation to the thresholds provided in the new rubric

We had no preconceived notion of how many qualitative differences there would be for each criterion, or that there would necessarily be the same number of qualitative differences for all criteria. Thus the number of categories for each criterion varied depending on the number of qualitative differences we could discern.

For example, in vocabulary and sentence structure there are seven categories because, in a representative range of student performances from Years 3 to 7, seven qualitative differences could be distinguished and described. In paragraphing, however, only three qualitative differences could be distinguished, so there are only three categories. Table 3 shows this revised classification scheme.

The person-item distribution (Figure 3) generated from marking with the new rubric shows the greater precision with which student development in narrative writing is now measured.

Conclusion

Do rubrics help guide and inform teaching practice? Based on this research, the answer on one level is that it depends on the nature of the rubric. In the presentation, the criteria in the original rubric will be compared with the criteria in the new rubric to illustrate this point. On another level, however, this comparison raises questions about the relationship between assessment and teaching, and whether rubrics are sufficient for informing teaching practice.

Assessment and Student Learning: Collecting, interpreting and using data to inform teaching 61 References Dimmock, C (2000) Designing the Learning Centred School: A Cross-cultural Perspective. Falmer Press, London/ New York. Heldsinger, S.A. (2009). (In press). Using a measurement paradigm to guide classroom assessment practices in Webber, C.F., & Lupart, J. (Eds.), Leading student assessment: Trends and opportunities. Dordrecht, The Netherlands: Springer.

Humphry, S.M., & Heldsinger, S.A. (2009). Experimental elimination of the halo effect in a performance assessment. Submitted for publication.

Taggart, G. L., & Wood, M. (1998). Rubrics: A cross-curricular approach to assessment. In G. L. Taggart, S. J. Phifer, J. A. Nixon, & M. Wood (Eds.), Rubrics: A handbook for construction and use (pp. 57–74). Lancaster, Pennsylvania: Technomic Publishing Co., Inc.

Popham, W.J. (1997). What’s wrong – and what’s right – with rubrics. Educational Leadership, 55, 72–75.

Wilkinson, A., Barnsley, G., Hana, P. & Swan, M. (1980). Assessing language development. Oxford: Oxford University Press.

Poster presentations

1 Elisapesi Latu, University of New South Wales

'Effectiveness of Feedback in Mathematics Learning'

Several reviews on the effects of teacher feedback to students claim that feedback facilitates learning and performance. This study investigated the effects of feedback by comparing three types of feedback on mathematics learning. One group of participants received norm-based feedback, a second group received standards-based feedback and the third group received worked examples-based feedback. All participants were tested on an algebra topic following learning (pre-test) and feedback (post-test). Although there was no significant difference between groups in overall learning, an effect was found for a transfer problem: participants in the worked examples-based group performed better than participants in the other two groups. Furthermore, the worked examples-based group invested more mental effort in learning and found the post-task easier, as well as adopting a different cognitive strategy.

2 Dr Trish Corrie, Department of Education & Early Childhood Development, Vic.

'On Track'

The Victorian Department of Education and Early Childhood Development's On Track survey collects data on the post-school education, training and employment destinations of Victorian Year 10–12 students, the year after they leave school, and the factors contributing to their decisions. The survey has occurred annually since 2003 and aims to support policy-making and program development to improve Year 12 completion rates and youth transitions. In 2009, 36,019 Year 12 completers (71% of the total 2009 cohort) and 4676 early leavers (56% of those who consented to participate) were surveyed. The destination data can be analysed by school, sector, gender, curriculum strand, provider, location, achievement and socio-economic status quartile. Student sub-cohorts who are the focus of specific improvement targets – students from indigenous and culturally and linguistically diverse backgrounds, and students with a disability – are included in On Track.

3 Anthony Harkness, Brisbane Catholic Education

'Using Internal School Review Data at School and System Level to Inform Improvements in Student Learning – An Online Web Based Application'

'Sparrow' (Strategic Planning and Reporting – Renewal on the Web) is an online web-based application developed by Brisbane Catholic Education and used by Archdiocesan schools to record, monitor and report on internal school review and strategies for the improvement of student learning. The data can be analysed at school and system level to inform school- and system-led professional learning, resource deployment, and policy and program development.

4 Richard McGuiness, St Andrew's School, Marayong, NSW

'How Assessment Affects Children's Learning'

From day 1 we assess each child's readiness for learning. This poster presentation demonstrates how continued assessment has been used to positively direct teaching and impact on student learning.

5 Doreen Conroy, Department of Education and Training, NSW

'A Sporting Chance for Aboriginal Students in Western NSW'

Girri Girri Sports Academy provides 180 Aboriginal students across 9 secondary sites with an opportunity to be engaged in a positive youth development program. The poster session will outline the results achieved after two years of the intervention. The research has involved the measurement of the impact that the intervention has had on Aboriginal students' psycho-social drivers, educational outcomes and school attendance.

6 Prof. Peter Cuttance, Research Australia Development and Innovation Institute

'Prolearning: A Real-Time Educational Performance Information System for Schools'

Professor Cuttance is working with schools to develop a next generation system for monitoring and improving school performance. The objective is to provide schools with real-time feedback via key indicators that monitor current performance and provide diagnostic information about areas that may become a focus for strategic improvement.

The methodology being developed includes an approach to classroom formative assessment that tracks the learning of each student and identifies outcomes that require additional focus by the teacher. In addition, it provides real-time feedback from students, teachers and parents through a fully automated online data gathering and reporting system. Real-time feedback provides daily information on key indicators and weekly information reports that are generated automatically from a web-based application.

The information from the formative assessment system is integrated with real-time information from rotating surveys of parents, students and teachers using an integrated system of multi-modal technologies (web, handheld devices/smart phones, SMS, hardcopy, and interactive voice response).

A purpose-designed survey system and library of surveys has been built for capturing data from parents, teachers and students of 'intelligence' relevant to school performance. The Hands-On Research Map for Effective Schools (HERMES) online Survey Kiosk System for Schools provides a user-friendly interface that enables school administrators and teachers to access high quality surveys covering over 500 topics. Each school can easily assemble and deploy a survey tailored to its needs in less than 20 minutes. The Survey Report is delivered to the school within hours of the close of the survey.

Schools can choose to benchmark themselves against any selected group of other schools – the benchmarks ensure that identification of individual schools is not possible, as they are based on data pooled across a required minimum number of schools selected by the user, or pre-defined clusters of schools, such as 'Catholic primary schools in communities with a population of less than 10,000'.

Conference program

Sunday 16 August

6.00–7.30  Cocktails with the Presenters. River View Rooms 4 and 5, Level 2, Perth Convention and Exhibition Centre. Entertainment by Neo Trio

Monday 17 August

8.00  Conference Registration. Level 2, Perth Convention and Exhibition Centre. Entertainment by Wadumbah
8.50  Welcome to Country. James Webb, Wadumbah
9.15  Keynote Address 1: Assessment for Teaching. Professor Geoff Masters, Chief Executive Officer, ACER. Riverside Theatre. Chair: Dr John Ainley, ACER
10.30  Morning Tea and Poster Presentations
11.00  Concurrent Sessions 1

Session A: What makes a difference? How measuring the non-academic outcomes of schooling can help guide school practice. Ms Prue Anderson, ACER. Riverside Theatre. Chair: Suzanne Mellor, ACER
Session B: Reflections on the validity of using results from large scale assessments at the school level. Mr Peter Titmanis, Performance Measurement and Reporting Taskforce. River View Room 4. Chair: Kerry-Anne Hoad, ACER
Session C: Using Assessment Data for improving teaching practice. Professor Helen Timperley, University of Auckland, NZ. Room M1. Chair: Prof. Stephen Dinham, ACER
Session D: Conversations with a Keynote. Professor Patrik Scheinin, University of Helsinki, Finland. Room M12

12.15  Lunch and Poster Presentations
12.45  Lunchtime Talkback: NAPLAN – issues and directions. Riverside Theatre, led by Chris Freeman, ACER
1.15  Keynote Address 2: Informative Assessment – understanding and guiding learning. Dr Margaret Forster, Research Director, Assessment and Reporting Program, ACER. Riverside Theatre. Chair: Dr John Ainley, ACER
2.30  Afternoon Tea and Poster Presentations
3.00  Concurrent Sessions 2

Session E: PISA for teachers: Interpreting and using information from an international reading assessment in the classroom. Ms Juliette Mendelovits and Ms Dara Searle, ACER. Riverside Theatre. Chair: Marion Meiers, ACER
Session F: Next Practice: What we are learning about teaching from student data. Ms Katrina Spencer and Mr Daniel Balacco, DECS SA. River View Room 4. Chair: Deirdre Jackson, ACER
Session G: Culture-fair assessment leading to culturally responsive pedagogy with indigenous students. Professor Val Klenowski, QUT, and Ms Thelma Gertz, CEO, QLD. Room M1. Chair: Kerry-Anne Hoad, ACER
Session H: Conversations with a Keynote. Professor Helen Wildy, University of Western Australia. Room M12

4.15  Close of Day 1
6.45  Pre-dinner Drinks. Ballroom 2, Perth Convention and Exhibition Centre. Entertainment by Sartory Strings
7.00  Conference Dinner. Ballroom 2, Perth Convention and Exhibition Centre. Entertainment by Tetrafide

Tuesday 18 August

8.30  Conference Registration. Level 2, Perth Convention and Exhibition Centre. Entertainment by Angel Strings
9.15  Keynote Address 3: Making local meaning from national assessment data: NAPNuLit. Professor Helen Wildy, University of Western Australia. Riverside Theatre. Chair: Dr John Ainley, ACER
10.30  Morning Tea and Poster Presentations
11.00  Concurrent Sessions 3

Session I: An Even Start: Innovative resources to support teachers to better monitor and better support students measured below benchmark. Ms Jocelyn Cook, ACER. Riverside Theatre. Chair: Marion Meiers, ACER
Session J: Large Cohort Testing – How can we use assessment data to effect school and system improvement? Mr David Wasson, DET NSW. River View Room 4. Chair: Ralph Saubern, ACER
Session K: Do rubrics help to inform and direct teaching practices? Dr Stephen Humphry and Dr Sandra Heldsinger, University of Western Australia. Room M1. Chair: Lance Deveson, ACER
Session L: Conversations with a Keynote. Dr Margaret Forster, ACER. Room M12

12.15  Lunch and Poster Presentations
12.45  Lunchtime Talkback: Fair assessment? Riverside Theatre, led by Dr John Ainley, ACER
1.15  Keynote Address 4: Using student assessment to improve teaching and educational policy. Professor Patrik Scheinin, University of Helsinki, Finland. Riverside Theatre. Chair: Dr John Ainley, ACER
2.30  Closing Address. Professor Geoff Masters, Chief Executive Officer, ACER. Riverside Theatre

Perth Convention and Exhibition Centre floorplan

[Floorplan: Perth Convention and Exhibition Centre]

[Floorplan: Perth Convention and Exhibition Centre – Ballroom 2]

Conference delegates

Dinner table no.  Delegate name  Delegate organisation  Position

Ms Suzieleez Abdul Rahim University of WA Mrs Gayle Abdy MacGregor State High School, QLD Deputy Principal 15 Mr Simon Abernethy Magdalene Catholic High School, NSW Assistant Principal Mrs Helen Adams Christ Church Grammar School, WA Mrs Lorraine Adams St Agnes Primary School, NSW Principal Mrs Anne Addicoat Catholic Education Office, NSW Secondary Adviser 6 Mrs Carmel Agius St Margaret Mary’s School, NSW Principal Mr Christopher Agnew Catholic Education Office, NSW Assistant Principal 2 Dr John Ainley ACER, VIC Deputy CEO Research Mr Stephen Aitken MacKillop Catholic College Principal Ms Maria Alice Catholic Education Office, NSW Adviser: Primary Numeracy Ms Anne Anderson St Ursula’s College, NSW Principal Ms Michelle Anderson ACER, VIC Senior Research Fellow 3 Ms Prue Anderson ACER, VIC Senior Research Fellow Mr Mathew Anderton Courtenay Gardens Primary School, VIC Assessment & Reporting Coordinator Mrs Rosemary Andre OLA Pagewood, NSW Principal 13 Mrs Mary Asikas Seaford 6-12 School, SA Principal 3 Mr Mark Askew Catholic Schools Office, NSW Head of Educational Services Ms Julia Audova St Mark’s Catholic College, NSW Leader of Learning Ms Maxine Augustson Mt Lockyer Primary School, WA Principal Mr Brian Aulsebrook Sacred Heart Primary School, NSW Principal Mrs Margaret Austin St Joseph’s Moorebank, NSW Literacy Coordinator Ms Vivienne Awad , NSW Deputy Principal 3 Mr David Axworthy DET, WA Executive Director 19 Mr Cameron Bacholer The Peninsula School, VIC Director of Curriulum 13 Ms Viginie Bajut Seaford 6-12 School, SA Program Manager


17 Mr Chris Bakon Lindisfarne Anglican Grammar, NSW Assistant Principal 3 Mr Daniel Balacco DECS, SA Program Manager Mr John Ballagh Parkwood Secondary College, VIC Acting Principal Mrs Lyn Barnes St George Christian School, NSW Head of Junior School Mr Philip Barrington Sacred Heart Primary School, NSW Principal 11 Mr Travis Bartlett DECS, SA PARC Mrs Vanja Basell St Helena’s Catholic Primary, WA Assistant Principal 5 Ms Gabrielle Bastow BEMU - DET, WA Principal Consultant 8 Mr Andrew Bawden Overnewton Anglican Comm. College, VIC Miss Brooke Baxter St Patrick’s Primary School, NSW Mrs Donella Beare St Stephen’s School, WA Head of Secondary Ms Lindy Beeley Florey Primary School, ACT Principal 4 Mrs Anna Bennett AIS, VIC Education Consultant Mrs Kathryn Bereny Salisbury High School, SA Science Coordinator Ms Miriam Berlage Rosebank College, NSW Assistant Principal 19 Mrs Christine Bessant Thomas Hassall Ang. College, NSW Head of Junior School 22 Mr Robert Blackley St Joseph’s College, VIC Director of Curriculum Ms Robyn Blair Lakeland Senior High School, WA Head of Humanities Mr Edgar Bliss Catholic Education Office, SA Senior Education Advisor 13 Mrs Merran Blockey Cairns School of Distance Educ, QLD HOD - Junior School Mrs Marlene Blundell St Augustine’s College, QLD Assistant Principal 12 Mr Peter Blundell Guardian Angels School, QLD Principal 17 Mr Terry Boland The Knox School, VIC Director of Curriculum Mr Leon Bolding St Joseph’s Catholic Primary, WA Assistant Principal Mrs Ann Booth Conifer Grove School, NZ Assistant Principal 12 Mr Nick Booth Overnewton Anglican Comm. College, VIC Mrs Denise Jane Bowley Australian Intl. School,


12 Mr Simon Bowyer Overnewton Anglican Comm. College, VIC Dr Sydney Boydell Scotch College, VIC Director of Curriculum Mr Tony Brennan Guilford Young College, TAS Deputy Principal Miss Phillis Broadhurst Victoria Park Primary School, WA Deputy Principal Mr Peter Brogan St Agnes Catholic High School, NSW 7 Dr Sharon Broughton DETA, QLD Principal Policy Officer 15 Dr Philip Brown Avondale College, NSW Vice-President (Learning & Teaching) Ms Raelene Brown Bullsbrook District High School, WA Deputy Principal 4 Mr Nicholas Browne Trinity Grammar School, VIC Director of Curr. & Prof. Learning 12 Dr Deborah Brownson Charters Towers State High School, QLD Deputy Principal 5 Mr Peter Bruce BEMU - DET, WA Principal Consultant Mrs Deborah Buchanan St Brigid’s College, NSW Principal Ms Nicole Bugeia DET, NSW Secondary Curriculum Consultant Ms Jane Buhagiar Catholic Education, SA Education Consultant Dr Brigitte Burg Guildford Grammar School, WA Mr Alan Burgin Urrbrae Agricultural High School, SA ICT Coordinator Ms Kate Burrett Corpus Christi College, NSW Mr Ben Businovski DET, WA A/Project Officer Mrs Chris Butterworth Catholic Education Office, TAS Manager Equity 10 Ms Jan Calder SACE Board of South Aust, SA Senior Moderation Coordinator 23 Mr Peter Callaghan Gnowangerup D.H.S., WA Deputy Principal 22 Mrs Mary Camilleri Marymount College, SA Mr Leon Capra St Augustine’s College, QLD Principal 22 Mr Jeffrey Capuano Ivanhoe Grammar School, VIC Mrs Anne Carberry St Joseph’s College, QLD Assistant Principal Mr Gerald Carey Gleeson College, SA Professional Learning Coordinator 15 Mrs Linda Carstensen St Laurences College, QLD Dean of Studies 6 Mrs Liana Cartledge Gippsland Grammar School, VIC Deputy Principal (Academic)


Mrs Sue Cartwright Rosehill Intermediate, NZ Deputy Principal Mrs Colleen Catford Catholic Education Office, NSW 15 Miss Danielle Cavanagh Sacred Heart Primary School, NT Curriculum Coordinator 5 Ms Christine Cawsey NSW Secondary Principals Council Deputy President 6 Ms Lea Chapuis Wanniassa Hills Primary School, ACT Deputy Principal 11 Mr Mathew Charleston DECS, SA P.A.R.C. Mr Glenn Chinen St Stephens School, WA Mrs Dawn Clements St Stephens School, WA Miss Kellie Cockram Carey Baptist College, WA Assistant Principal 11 Ms Pauline Coghlan DET, WA Director of Schools Review Mr Devlyn Coleman Maranatha Christian College, WA Mrs Fiona Colley St Simon Peter CPS, WA 24 Mrs Janet Colusso St Gertrude’s, NSW Ms Amanda Connor Holy Cross College, WA Principal Mrs Jillian Conole Gleeson College, SA Deputy Principal Ms Doreen Conroy DET, NSW Teaching & Learning Coordinator 3 Ms Joceyln Cook ACER, WA Principal Research Fellow & Manager Ms Bianco Cooke Good Shepherd Primary School, NSW 23 Mrs Anne Corrigan Mary MacKillop Primary, NSW Principal Mr John Couani Catholic Education Office, NSW Regional Director Mr Garry Coyte St Bede’s College, VIC Principal 8 Mrs Anne-Maree Creenaune Catholic Education Office, NSW Ms Kerri Cresswell PLC, WA Mrs Maria Criaris Our Lady of the Sacred Heart College, SA Assistant to the Principal Mr Pedro Cruz Emmanuel Christian Comm. School, WA Principal Mrs Anne Cullender Catholic Education Office, WA Principal Schools Advisor 21 Mrs Deborah Curkpatrick PLC, Armidale, NSW Director 10 Ms Catherine Cushing Brisbane Catholic Education, QLD EO English Mr Peter Cuttance radii.org, VIC Director


Ms Maria D’Agostino All Saints Catholic Senior College, NSW Administration Coordinator Dr Raymond Dallin Assoc. for Christian Educ, WA CEO Mrs Deborah Dalwood AISSA, SA Assistant Director 16 Ms Lucy D’Angelo Penola Catholic College, VIC Deputy Principal Ms Stephanie Dann Pannawonica Primary School, WA Principal 6 Mrs Jane Danvers , SA Principal 8 Ms Andrea Dart Overnewton Anglican Comm. College, VIC Mr Colin Davies MacGregor State High School, QLD Head of Department Dr Alison Davis Vision Education, NZ Director Mr Brian Davis Kojonup District High School, WA Principal Mrs Linda Davis Hanover Welfare Services, VIC Project Worker Ms Kerry de Boer Victoria Park Primary School, WA Mrs Jennifer de Ruiter Mukinbudin DHS, WA 15 Mr Barry Dean Brisbane Boys College, QLD Head of Teaching & Learning Mrs Shirley Dearlove TAFE, SA Lecturer 8 Dr John DeCourcy Parramatta Catholic Educ. Office, NSW Head Ms Anne Denicolo Catholic Education Office, SA E-Learning Consultant Ms Susan Dennett DEECD, VIC Group Manager 17 Ms Nicola Dennis Kincoppal-Rose Bay School, NSW Director 5 Mr Lance Deveson ACER, VIC Library and Information Manager Mr Alessandro Di Felice Rangeway Primary School, WA Deputy Principal Mrs Elizabeth Dimmer Holy Rosary School, WA Assistant Principal 1 Prof. Stephen Dinham ACER, VIC Research Director 3 Mr Alan Dodson DET, WA Director 5 Mrs Michelle Donn Education Queensland, QLD Project Officer Mrs Jeannie Donsworth St George Christian School, NSW Head of Middle School


Mrs Sharon Doohan North Albany SHS, WA Principal Mr Orlando Dos Santos Mandurah Baptist College, WA Deputy Principal Ms Susan Douglas Borderfields Consulting, NZ Ms Leonie Dowd Catholic Education Office, NSW Assistant Principal Mr Alan Dowsett DET, WA Principal Consultant 11 Ms Meron Drummond VCAA, VIC Project Manager 6 Mrs Maria D’Souza Wanniassa Hills Primary School, ACT Ms Maureen Duddy Hampton Senior High School, WA Deputy Principal 10 Mr Dean Dudley Charles Stuart University, NSW Academic Ms Learne Dunne DET, NT Director Mrs Sharon Duong Catholic Education, SA Senior Education Adviser Miss Nicky Durling Ministry of Education, NZ Mrs Natalie Ede DET, NT Literacy Project Officer 18 Miss Hannah Ekers Our Lady of Grace School, WA Mrs Karen Eldridge St Joseph the Worker, NSW Principal Mrs Brigid Eljed Clare Catholic High School, NSW Principal Mr Peter Ellerton QASMT, QLD Director Mr Bradley Elliott Nambour Christian College, QLD Head of Senior School Mr Greg Elliott St Mary , NSW Acting Principal Mrs Brigitte Ellis CEC, NSW Education Officer Mrs Cheryle Elphick Lesmurdie Primary School, ACT Principal Mrs Jenny Elphick St Simon Peter CPS, WA Mr Lee Elvy St Teresa’s Catholic College, QLD Head of Middle School Curriculum Mr Andrew Emanuel Chisholm Catholic Primary, NSW Assistant Principal Ms Jillian English Heathmont College, VIC Assistant Principal Mrs Sandra Erickson Glen Waverley Sec. College, VIC Assistant Principal Mrs Raelene Ernst Canberra Girls’ Grammar, ACT Enrichment Teacher


19 Mrs Theresa Eveans Canterbury College, QLD Asst Dean of Middle Years Mr James Fanning Terra Sancta College, NSW Co-operating Principal 9 Ms Mary Farah Catholic Ladies College, VIC Deputy Principal 6 Mrs Jan Farrall Wilderness School, SA Head of Learning & Teaching 5 Mrs Sophie Fenton Ballarat Grammar School, VIC Head of Humanities Mr Geoffrey Ferguson MacGregor State High School, QLD Head of Department Ms Tracey Filmer Rasmussen State School, QLD Mr Michael Flaherty Emmanuel College, VIC Leader of Arts Mrs Clare Fletcher MacKillop Catholic College, ACT Coordinator Ms Margaret Foldes St Anthony’s Picton, NSW Assistant Principal Ms Corinne Follett Urambi Primary School, ACT Mr Lance Follett DET, NT Manager - Policy 1 Dr Margaret Forster ACER, VIC Research Director, Assessment & Reporting Ms Athina Fotopoulos Catholic Education, SA Numeracy Consultant 3 Ms Kathryn Fox Catholic Schools Office, NSW Head of Training & Learning Services Mr Jon Franzin St Paul’s College, SA Deputy Principal 11 Mr Andrew Fraser Catholic Schools Office, NSW Education Officer 2 Mr Chris Freeman ACER, NSW Research Director and General Manager 21 Mrs Janet Frost St Andrew’s College, NSW Assistant Principal 20 Mr Peter Gaiter Australian Technical College, NSW Principal 10 Ms Jeanine Gallagher Brisbane Catholic Education Centre, QLD Ms Maree Garrigan Schools North, NT General Manager, PARCS 14 Ms Dale Gathercole Salisbury High School, SA Principal Ms Linda Gelati Catholic Education, SA Numeracy Consultant 2 Ms Thelma Gertz Catholic Education Office, QLD Indigenous Education Coordinator Ms Annette Gilbert Glen Waverley Sec. College, VIC Head of Curriculum


Dr Kevin Gillan DET, NT Acting Chief Executive 11 Mrs Helena Gist VCAA, VIC Manager - SAP 12 Mr Craig Glass Haileybury, VIC Vice Principal 18 Mr Anthony Gleeson St Leo’s Catholic School, NSW Assistant Principal Mr Stephen Gniel Macgregor Primary School, ACT Principal 6 Ms Liana Gooch Toorak College, VIC Mr Clyde Graham Cannington Community College, WA Principal 5 Ms Tracey Gralton BEMU - DET, WA Principal Consultant Mr Alan Green DET, NT Acting Executive Director Mr John Green QLD Professional Educator Mr Phil Green DET, WA Principal Education Consultant Mrs Sheila Greenaway Cecil Andrews SHS, WA Mrs Adrienne Gregory Salisbury High School, SA Assistant Principal Ms Robyn Grey Denmark Primary School, WA Deputy Principal Ms Suzette Griffiths DECS, SA Dr Michael Gruis Melton Secondary College, VIC Leading Teacher 18 Ms Judith Hamilton Chapman Primary School, ACT Deputy Principal Mrs Lisa Harbrow-Grove Catholic Education Office, NSW Assistant Principal Ms Sophie Hardwick Emmanuel College, VIC Leader of Humanity 23 Miss Marina Hardy Mary MacKillop Primary, NSW Assisant Principal Mr Anthony Harkness Brisbane Catholic Educ Office, QLD Principal Education Officer Mrs Joanna Harmer Serpentine Jarrahdale Grammar School, WA Year 9 Coordinator 20 Mrs Leisa Harper , QLD Head of Department Mr Tony Harrison Katherine High School, NT Assistant Principal Mr Brendan Hart Millen Primary School, WA Principal Ms Karyn Hart MacGregor State High School, QLD Principal


Mr Graeme Hartley Guildford Grammar School, WA Deputy Headmaster Mr Robert Hartmann Brauer College, VIC Deputy Principal Mr Robert Hassell Lake Joondalup Baptist College, WA Dean of Curriculum Mr Michael Hayes MLC School, NSW Director of Studies 17 Mr Stuart Hayward Newton Moore SHS, WA Dr Anne Hazell DECS, SA Senior Policy Adviser Mrs Sue Healy DET, NT Acting General Manager, Schools Mrs Judy Hearne Catholic Education Office, WA Principal Schools Advisor 1 Dr Sandra Heldsinger The University of Western Australia, WA Lecturer Mrs Suzanne Henden proEMA, QLD Ms Irene Henderson QASMT, QLD Dean Mr Neil Hendry Aberfoyle Park High School, SA Assistant Principal 10 Ms Angela Hennessey Charles Sturt University, NSW Ms Dot Henwood Heathmont College, VIC Principal Mrs Ellen Herden DET, NT Mrs Margaret Heslin Catholic Education Office, NSW Regional Consultant Mrs Janice Heyworth Catholic Education Office, NSW Head, Religious Education & Learning Services Mrs Jodie Higgins MacKillop Catholic College, ACT Coordinator Mr Brett Hillman QASMT, QLD Teacher 5 Mrs Sue Hinchliffe Ballarat Grammar School, VIC Head of English Mr Ian Hislop Boddington District High School, WA Deputy Principal 3 Ms Kerry-Anne Hoad ACER, VIC Manager, Centre for Prof. Learning 10 Mrs Giannina Hoffman SACE Board of South Aust, SA Assessor Trainer 12 Miss Shirley Holcombe Charters Towers State High School, QLD Head of Department 11 Mr Peter Holcz DET, WA Director of Schools Review Mr Jaimie Holland Pembroke School, SA Dean of Student Welfare


8 Ms Jillian Holmes-Smith SREAMS, VIC Director 8 Mr Philip Holmes-Smith SREAMS, VIC Director 16 Mr Craig Homer Mundingburra State School, QLD Deputy Principal Mr Alan Honeyman Curriculum Council, WA Senior Consultant 14 Ms Ann Hooper Lauriston Girls’ School, VIC Head of Junior School 1 Dr Stephen Humphry The University of Western Australia, WA Associate Professor 3 Ms Deirdre Jackson ACER, VIC Director, Assessment Services Division 21 Mr Eric Jamieson Plumpton High School, NSW Principal Mr Mark Jeffery Lakeland Senior High School, WA Deputy Principal 23 Mrs Susan Jenkins Marist College Eastwood, NSW 21 Mr Gary Johnson Cherrybrook Technology High School, NSW Principal Mr Matthew Jolly Catholic Education Office, SA Learning Tech. Consultant Mrs Bev Jones DECS, SA Curriculum Manager Mr Kevin Jones Bede Polding College, NSW Principal Mr Lee Jones Hogg Boddington District High School, WA Principal 16 Mr Ian Jordan John XXIII CPS, NSW 1 Mr Vineet Joshi Central Board of Secondary Education, Secretary and Chairman, Government of India 14 Mr Peter Kadar , VIC Director of Learning and Teaching Mr Simon Kanakis Aranmore Catholic College, WA Deputy Principal Mr Alec Kanganas Highgate Primary Schooll, WA Deputy Principal Mr Chris Kay Donvale Christian College, VIC Head of Secondary Ms Janine Kenney All Saints Catholic Senior College, NSW Assistant Principal Mr Stephen Ker Canning District Educ. Office, WA Principal Consultant 22 Mr John Keyworth Braemar College, VIC Deputy Head, Middle School Mr Honan Khreish Hillcrest Christian College, VIC Head of Middle School 1 Mr Subhash Khuntia Ministry of Human Resource and Development, Joint Secretary Government of India


Mr Ross King Iona College, QLD Dean of Studies Mrs Kerrie Kingston-Gains Pakenham Lakeside Primary, VIC Assistant Principal 22 Ms Pamela Kinsman Marymount College, SA Leader of Learning Mrs Helen Kirkman UNSW Global - EAA, NSW Marketing Assistant 19 Mr William Kitchen Canterbury College, QLD Dean of Senior Years Mr John Klauz Dudley Park Primary School, WA Deputy Principal 2 Prof. Val Klenowski Queensland University of Technology, QLD Professor of Education 14 Mr Tony Kolb Mater Christi College, VIC Director of Curriculum Ms Mary Kondekakis All Saints Catholic Girls College, NSW Curriculum Coordinator Mr Michael Krawec Catholic Education Office, NSW Regional Consultant Mr Robert Laidler Loyola Senior High School, NSW Principal Mrs Karen Lamarre Catholic Education Office, NSW Primary Adviser 13 Mrs Julie Lambert Seaford 6-12 School, SA Program Manager Mrs Kari Lamond Mukinbudin DHS, WA Principal Mrs Mary-Lynn Lane St Thomas, NSW Assisant Principal Mr Rory Lane Lakeland Senior High School, WA Head of Quantitative Science Mrs Jo-Anne Large DET, WA Deputy Principal Ms Elisapesi Latu University of New South Wales, NSW Mrs Vicki Lavorato Catholic Education Office, NSW Regional Consultant 19 Miss Adele Leask Kiwirrkurra Remote Community School, NT 21 Mrs Mary Leask Nagle College, NSW Principal 7 Mrs Gail Ledger The University of Auckland, NZ Facilitator Mrs Elizabeth Lee St Augustine’s College, QLD Head of Primary 21 Ms Rebecca Leech ACER, VIC Journalist 6 Mr Thierry Lehembre Kojonup District High School, WA Deputy Principal Ms Elizabeth Lenders Carey Grammar School, VIC Deputy Principal


16 Mr Attila Lendvai John XXIII Catholic Primary School, NSW Assistant Principal Ms Meng-Yin Leong QASMT, QLD Teacher 16 Mrs Kerry Lestal St Patrick’s Primary School, NSW Assistant Principal 4 Mrs Estelle Lewis Assoc. of Independent Schools, NSW Director, Teacher Accreditation Mr Philip Lewis Gleeson College, SA Principal Mr Geoffrey Lienert Tatachilla Lutheran College, SA Head of Senior School Ms Stephanie Livings Armadale Christian College, WA Head of Curriculum Mrs Julie Loader Onslow Primary School, WA Principal Mr Richard Lobb DET, WA Manager Mr Steven Lockwood Nollamara Primary School, WA Principal Mr Stephen Loggie QASMT, QLD Principal Mrs Kerry Long DET, NSW Manager, SBAR Ms Kay Longden Tranby College, WA Coordinator - Learning Centre Mr Brian Loughland Brigidine College, NSW Assistant Principal Mr John Low Catholic Education Office, SA Integrated Learning Coordinator 11 Mr Rod Lowther DET, WA Director Schools Review Mr Damian Luscombe Frankland Primary School, WA Principal 22 Ms Kim Lynch Braemar College, VIC Head of Middle School 18 Ms Pamela Lynch Benowa State High, QLD Director of Studies Mr Brian MacCarthy DET, WA Data Manager 14 Mrs Nene MacWhirter Lauriston Girls’ School, VIC Head of Senior School Mr Tim Maddern St Paul’s College, SA Head of Curriculum Mrs Danuta Maka OLQP Primary School, NSW Principal Ms Leanne Marie Catholic Education Office, VIC Senior Project Officer Mrs Michelle Marks MacKillop Catholic College, ACT Deputy Principal


Mr Andrew Marr Innovation Tech. Support & Training, WA Consultant Mrs Anne Maree Marrins Our Lady of Mt Carmel Primary, NSW Principal Mrs Jorga Marrum St John Bosco College, NSW Curriculum Coordinator 8 Mr Scott Marsh William Clarke College, NSW Deputy Headmaster Mr Michael Martin Saint Ignatius College, SA Mrs Susan Martin Brigidine College, NSW Assistant Principal 19 Mr Rodney Mason Patrick Rd State School, QLD Principal Mr Clayton Massey Guildford Grammar School, WA 1 Prof. Geoff Masters ACER, VIC CEO 16 Mr Paul Matton Penola Catholic College, VIC Literacy Leader 14 Mr Lukas Matysek Cedars Christian College, NSW Deputy Principal Mrs Carmel McAdam St Simon Peter CPS, WA Mrs Natius McAdam St Michael’s Primary School, NSW Principal 18 Mrs Karen McAsey Penleigh & Essendon Grammar, VIC Special Education Teacher 24 Mrs Jan McClure Ballarat Clarendon College, VIC Deputy Principal Ms Kim McCue Catholic Education Office, NSW Assistant Principal Ms Kath McCuigan Catholic Education, SA Dr Helen McDonald St Margaret’s School, VIC Principal 22 Mrs Samantha McGarrity OLHC, NSW 24 Mrs Jennifer McGie Ballarat Clarendon College, VIC Head of Middle School 25 Ms Elizabeth McGlynn Malabar Public School, NSW Assistant Principal 21 Mrs Moya McGuiness Sacred Heart Primary School, NSW Principal 20 Mr Richard McGuiness St Andrews Primary School, NSW Principal 7 Mrs Gayle McIlraith The University of Auckland, NZ Manager Ms Anne McInerney QASMT, QLD Teacher 10 Mr Mark McKechnie Brisbane Catholic Education Centre, QLD Mr Rodney McKinlay Plenty Valley Christian College, VIC Head of Primary School 25 Ms Julie McLaren DET, ACT


9 Mr Matthew McMahon Catholic Education Office, NSW Education Officer 15 Mr Alan McManus Magdalene Catholic High School, NSW Principal 7 Mr Gerard McPhillips SACE Board of South Aust, SA Assessor Trainer Ms Elizabeth McQuade-Jones Catholic Education Office, VIC Manager Ms Robyn Meddows St Mark’s Catholic College, NSW Principal 6 Ms Marion Meiers ACER, VIC Senior Research Fellow 7 Ms Suzanne Mellor ACER, VIC Senior Research Fellow Mrs Meg Melville Penrhos College, WA 2 Ms Juliette Mendelovits ACER, VIC Principal Research Fellow Mrs Andrea Millar St Helena’s Catholic Primary, WA Assistant Principal Ms Jill Millar Arnhem Education Office, NT Consultant Principal 23 Mrs Denise Miller Taylor Primary School, ACT Principal Ms Karen Miller Bullsbrook District High School, WA 11 Ms Janine Milton DET, WA Director Schools Review Ms Perette Minciullo Moerlina School, WA Principal 7 Miss Elaine Miranda SACE Board of SA Info Analysis & Reporting Mrs Anna Mirasgentis Mary MacKillop College, SA Director of Learning & Innovation Mrs Jane Misek OLQP Primary School, NSW Assistant Principal 20 Mrs Kaylene Mladenovic Townsville State High School, QLD Deputy Principal Mrs Lynda Moir South Ballajura Primary School, WA Principal Mrs Pauline Mongan Whitford Catholic Primary School, WA Assistant Principal 23 Mr Anthony Morgan Brigidine College Randwick, NSW Curriculum Coordinator 17 Mr Brenden Morris The Knox School, VIC Mr Gavin Morris DET, WA Manager, School Performance Ms Helen Morris Education Queensland, QLD Curriculum Adviser 12 Mrs Dawn Morrissey Guardian Angels School, QLD Assistant Principal


Mrs Samantha Mudgway Bolgart Primary School, WA Principal 25 Dr Waris Mughal Department of Education, ACT Project Officer 24 Mr Steve Muller St Dominics College Kingswood, NSW Director of Pastoral Care Ms Glenys Mulvany Corrective Services, WA Deputy Principal Mrs Maureen Munro Guthrie Street Primary School, VIC Assistant Principal 15 Mr Don Murtas Camooweal State School, QLD Principal Ms Jennifer Nash Palmerston High School, NT Principal Mr Jeff Natt QASMT, QLD Teacher Ms Kylie Newcomb QASMT, QLD Teacher Mr Mark Newhouse Assoc. of Independent Schools, WA Manager of Curriculum Ms Jan Nicoll VCAA, VIC Project Manager Mrs Lynne Nixon Rehoboth Christian College, WA Acting Principal Mr Gary Norbury Pakenham Lakeside Primary, VIC Principal 16 Ms Christine Norton Cloncurry State School, QLD Principal Ms Rosemary Nott CEC, NSW Assistant Director Mrs Sue Nott Yarralumla Primary School, ACT Principal 19 Mr Tony Nutt Canterbury College, QLD Mr John O’Brien Townsville Cath. Education Office, QLD Consultant 15 Mr Matt O’Brien Brisbane Boys College, QLD Mrs Elizabeth O’Carrigan Catholic Education Office, NSW Regional Consultant Mrs Margaret O’Connell Catholic Education Office, VIC Senior Project Officer 2 Ms Kate O’Donnell DET, NSW R/Assistant Director Mrs Maria O’Donnell MacKillop Catholic College, ACT Coordinator 17 Mr Bruce Oerman Annesley College, SA Director Mr James O’Grady Catholic Education Office, NSW Director Mr Steve O’Halloran Whitford Catholic Primary School, WA Principal


Ms Geraldine O’Keefe Assumption College Primary, WA Principal Mr Ivan Ortiz Ministerio de Educacion, Chile 11 Mr William Owens Catholic Schools Office, NSW Education Officer - VET Ms Jo Padgham Ainslie School, ACT Principal 7 Mrs Jill Parfitt The University of Auckland, NZ Manager Mr Santo Passarello Patrician Brothers’ College, NSW Principal 19 Ms Sally Paterson Urrbrae Agricultural High School, SA Deputy Principal Mr Mark Paynter DET, WA Manager Mr Lindsay Pearse Hampton Senior High School, WA Principal 16 Mrs Lesley Pecchiar Mundingburra State School, QLD Principal Ms Julie-Ann Pegden Curtin University of Tech., WA Project Officer eVALUate Mrs Janelle Pepperdene Rasmussen State School, QLD Head of Curriculum Ms Kay Petherbridge Pembroke School, SA Dean of Studies 9 Mr Gregory Petherick DECS, SA Assistant Regional Director 5 Ms Lea Peut Education Queensland, QLD Principal Advisor 22 Ms Meredith Pierson Our Lady of the Sacred Heart College, NT Teaching & Learning Coordinator Mrs Alexandra Piggott Pembroke School, SA Head of Humanities 4 Mr Phillip Pinnell Braemar College, VIC Vice Principal Ms Kim Platts Good Shepherd Primary School, NSW Mr Peter Platts Cranebrook High School, NSW Deputy Principal Ms Christelle Plummet Catholic Education, SA Senior Education Adviser Dr Irene Poh Chisholm Catholic College, QLD Assistant Principal Ms Nicki Polding Collie Senior High School, WA Deputy Principal Mrs Antonella Poncini Catholic Education Office, WA R.E. Consultant Professor Paige Porter ACER Chair of Board 12 Ms Beverley Powell Hawkesbury High School, NSW Principal


Mrs Judy Powell Redeemer Lutheran College, QLD Head of Middle School Mr Robert Prest Woodcroft College, SA Director of Curriculum Mr Ken Provis Hillcrest Christian College, VIC Head of Junior School 22 Mr Adrian Puckering Mount St Joseph Girls’ College, VIC Deputy Principal Mrs Nedra Purnell Northside Christian College, QLD Head of Junior School 2 Dr Agung Purwadi Ministry of National Education, Indonesia Director 4 Mr Brendan Pye ACER, VIC Project Officer, CPL 17 Mr Greg Quinn , QLD Director of Staff Services 17 Mrs Julie Quinn St Joseph’s College, QLD Dean of Studies Miss Sarah Quirk St Patrick’s Primary School, NSW Miss Alison Ramm DET, WA Principal Consultant 20 Mr Frank Ranaldo Rostrevor College, SA Director of Curriculum Ms Judith Ratican , SA Project Officer Mr Bradley Raynor Kununurra District High School, WA Assistant Principal Ms Karen Read Canning District Educ. Office, WA Principal Consultant 21 Ms Dympna Reavey Nagle College, NSW Curriculum Coordinator 18 Mr Marc Reicher St Leo’s Catholic School, NSW Director of Senior School Mrs Diane Reid East Maddington Primary School, WA Specialist Teacher 14 Ms Hayley Reid Salisbury High School, SA HPD Coordinator Ms Elizabeth Renton PLC, WA Ms Gail Reus Catholic Education Office, NSW Assistant Principal Ms Louise Reynolds ACER, VIC Corporate Publicity & Communications Manager Mrs Andrea Richards Queen of Peace Primary School, VIC Director of Learning 7 Ms Michelle Richards SACE Board of South Aust, SA Assessor Trainer 18 Mr Mark Rickard Benowa State High, QLD Principal 14 Mrs Catherine Rimell Cedars Christian College, NSW Dean


Mr Brett Roberts St Stephen’s School, WA Deputy Head of Secondary Mr James Roberts St Peter’s College, VIC Director of Learning Mrs Janette Robertson Conifer Grove School, NZ Principal Mrs Stephanie Robinson St Cecilia’s Catholic School, NSW 9 Ms Lisa Rodgers Ministry of Education, NZ Manager Ms Michele Rose Corrective Services, WA Principal Juvenile Education Services 15 Mr Ray Rose Our Lady of Fatima School, WA Assistant Principal Mrs Lisa Rosenthal DET, WA Ed. Measurement 4 Ms Lynda Rosman ACER, VIC Education Consultant 17 Mr Ian Rossotti Trinity College, SA Director of Studies Mr Allister Rouse Waverley Christian College, VIC Director of Teaching and Learning Ms Mandy Rubinich DET, WA Manager 13 Mr Andrew Russell Seaford 6-12 School, SA Program Manager Dr Erica Ryan Catholic Schools Office, NSW Education Officer 9 Mrs Sophie Ryan Catholic Education Office, NSW Head of School Services Mrs Lisa Samojlowicz Chisholm Catholic Primary, NSW Mr Graeme Sassella-Otley Leeming Primary School, WA Principal 5 Mr Ralph Saubern ACER Press, VIC General Manager Mr Roger Saulsman St Helena’s Catholic Primary, WA Principal 1 Prof. Patrik Scheinin University of Helsinki, FINLAND Dean Ms Karin Schrader Chisholm Catholic Primary, NSW Coordinator Mr Steven Schrama Bridgetown High School, WA Miss Renee Schuster DEECD, VIC Senior Data Analyst 12 Mr Derek Scott Haileybury, VIC Principal Sr Margaret Scroope Catholic Education, NSW Education Consultant 7 Ms Dara Searle ACER, VIC Research Fellow


8 Mr Paul Sedunary Catholic Education Office, VIC Manager 22 Mrs Sally Selmes OLHC, NSW 20 Mr Jim Sheedy St Mary’s Primary School, VIC Principal 12 Ms Debra Sheehan Overnewton Anglican Comm. College, VIC 24 Mr David Sheil St Dominics College Kingswood, NSW Director of Teaching & Learning 24 Mr David Shepherd Ballarat Clarendon College, VIC Principal Mr Robert Shepherd Le Fevre High School, SA Principal 24 Ms Christine Shorter Monte Sant’ Angelo Mercy, NSW Mrs Rebecca Sidorenko Chisholm Catholic Primary, NSW Coordinator Mrs Heather Simon East Maddington Primary School, WA Ms Julile Simon Ballajura Community College, WA Vice Principal 18 Mrs Anne Simpson Chapman Primary School, ACT 7 Ms Annemarie Simpson The University of Auckland, NZ Mr Andrew Sinfield DET, WA Educ. Measurement Officer Mr Paul Sjogren St Andrew’s Anglican College, QLD Deputy Principal Ms Christine Slattery Catholic Education, SA Consultant Mrs Joan Slattery Curriculum Council, WA Dr Michael Slattery Catholic Schools Office, NSW Schools Consultant Ms Karen Sloan Catholic Education Office, SA Professional Learning Coordinator 9 Mr Andrew Smith BEMU - DET, WA Assistant Director 4 Mrs Barbara Smith ACER Press, VIC Sales Manager 24 Mr Reid Smith Ballarat Clarendon College, VIC Head of Middle School Curriculum 23 Mr Simon Smith Taylor Primary School, ACT Deputy Principal Ms Suzette Smith Chisholm Catholic Primary, NSW 4 Mr Vaughan Smith Caulfield Grammar School, VIC Head of Research 10 Mr Mark Snartt Brisbane Catholic Education Centre, QLD Senior Education Officer 8 Mr Barry Soraghan ACER, VIC Senior Project Director Mr Neil Spence East Victoria Park Primary, WA Ms Anne Spencer Catholic Education Office, SA Curriculum Software Consultant


3 Ms Katrina Spencer DECS, SA Assistant Director 17 Ms Lesley Ann Stace Newton Moore SHS, WA 13 Mr Harry Stassinopoulos Seaford 6-12 School, SA Deputy Principal Ms Adelheid Stelter Cecil Andrews SHS, WA Literacy Coordinator Ms Rebecca Stephen WA University, WA Lecturer Mrs Heather Stephens East Maddington Primary School, WA Principal Mr Alfred Stewart DET, WA Principal Consultant Mr Phillip Stewart St Columba’s High School, NSW Assistant Principal 20 Mr Scott Stewart Townsville State High School, QLD 23 Mr Tim Stinson St Joseph’s School, QLD Principal Ms Kelly Summers Harvey Senior High School, WA Deputy Principal Mrs Michele Sunnucks Our Lady of Mt Carmel, MT Pritchard, NSW Assistant Principal Mrs Amanda Swandey Ruyton Girls’ School, VIC Director of Learning Mr Gary Swayn DETA, QLD Principal Education Officer Ms Loretta Swayn Rasmussen State School, QLD Principal Mrs Helen Tachas Carey Grammar School, VIC Head of Learning Science 15 Mr John Tannous Magdalene Catholic High School, NSW Curriculum Coordinator 9 Ms Carmel Tapley Catholic Schools Office, NSW Education Officer Dr Pina Tarricone CPA Australia, VIC Manager of Assessment Mrs Bernadette Taylor St Cecilia’s Catholic School, NSW 4 Ms Margaret Taylor ACER, VIC Administrative Officer, CPL Mr Rob Taylor Penrhos College, WA 4 Dr Gillian Thomas Maths Technology Ltd, NZ 8 Mrs Patricia Thompson William Clarke College, NSW Dean of Junior School Mrs Shane Thompson ACER, WA Education Sales Consultant Mrs Marian Thomson DET, WA Educ. Officer 1 Prof. Helen Timperley University of Auckland, NZ Faculty of Education

Mr Bruce Titlestad St Stephens School, WA Head of Secondary School
2 Mr Peter Titmanis Education Department of Western Australia, WA Director
13 Ms Kathryn Todd Cairns School of Distance Educ, QLD Deputy Principal
Ms Diane Tomlinson Victoria Park Primary School, WA Principal
Mrs Margaret Tomov St Agatha’s School, QLD
20 Mrs Tanya Travers St Mary’s Primary School, VIC Maths Coordinator
Mrs Leonie Trueman St Michael’s College, QLD
Mrs Beatrice Tucker Curtin University of Tech., WA Manager, Evaluation
20 Mrs Gail Tull St Mary’s Primary School, VIC
Mr Mark Turkington Catholic Education Office, NSW Director
14 Ms Debra Turley Salisbury High School, SA Assistant Principal
Mr Ken Turnbull, QLD Director of Teaching & Learning
Ms Lis Turner Waggrakine Primary School, WA Principal
Mr Sean Tyndall Onslow Primary School, WA Deputy Principal
18 Miss Katrina Tyza Our Lady of Grace School, WA
6 Mrs Maree Uren Wanniassa Hills Primary School, ACT Principal
21 Mr Jan Van Doorn Rooty Hill High School, NSW Deputy Principal
13 Ms Liz Vandenbrink Seaford 6-12 School, SA Program Manager
Mr Geoffrey Vander Vliet Nambour Christian College, QLD Deputy Principal
24 Ms Kim Vandervelde Monte Sant’ Angelo Mercy, NSW
Mrs Allana Vedder Galilee Catholic School, NSW Principal
Ms Rosemary Vellar CEO Sydney, NSW Manager
Mrs Sonia Venour Our Lady of the Sacred Heart College, SA Assistant to the Principal
21 Mr Nicholas Vidot St Andrew’s College, NSW Principal
Mr Vin Virtue Norwood Secondary College, VIC Principal
Mr Nigel Wakefield DET, WA Principal Consultant
Ms Madeline Walker Ravenswood School for Girls, NSW Coordinator Operations

14 Mrs Sue Walker Cedars Christian College, NSW
Mrs Cheryl Walsh St Bernadette’s Primary School, NSW Principal
2 Mr David Wasson DET, NSW Director, Educ. Measurement & School Accountability
Mr Craig Wattam James Sheahan Catholic High School, NSW Principal
9 Mr John Watters ParraSip, NSW Manager
20 Ms Evelyn Waygood St Vincent’s College, NSW Director of Studies
16 Mrs Jennifer Webb St Patrick’s Primary School, NSW Principal
Mrs Pauline Webster Catholic Education Office, SA Consultant
Ms Judith Weir Emmanuel College, VIC Deputy Principal
Mrs Gabrielle West DET, NT Project Manager
10 Ms Cheryl-Anne White Overnewton Anglican Comm. College, VIC
Miss Maria White Rosehill Intermediate, NZ Principal
19 Mrs Wendy White Thomas Hassall Ang. College, NSW Director of Operations
24 Mr Alan Whiticker St Gertrude’s, NSW Assistant Principal
Mr Malcolm Widdicombe Goulburn Valley Grammar School, VIC Head of Senior School
1 Prof. Helen Wildy The University of Western Australia, WA Dean of Faculty of Education
23 Ms Alison Williams Taylor Primary School, ACT
13 Mrs Jennifer Williams Beaconhills College, VIC Head of Campus
Mr Peter Williams Curriculum Council, WA Manager, Assessment
Ms Wendy Williams DETA, QLD Senior Guidance Officer
9 Ms Elizabeth Wilson DECS, SA Manager
13 Mrs Jill Wilson Beaconhills College, VIC Head of Campus
Mrs Sharni Wilson Katherine South Primary School, NT Principal
2 Ms Yendri Wirda Ministry of National Education, Indonesia Head
Mr David Wood Curriculum Council, WA Chief Executive Officer
23 Mr Mark Woolford Marist College Eastwood, NSW

Mr Paul Wright Immanuel College, SA Head of Middle School
Ms Frances Yeo Urambi Primary School, ACT Principal
18 Dr Alison Young Anglican Church Grammar School, QLD Subject Head
9 Mr Brian Young BEMU - DET, WA Principal Consultant
Ms Karen Young John Therry Catholic High School, NSW Principal
19 Mr Roger Young Thomas Hassall Ang. College, NSW Head of Senior School
10 Mrs Mirella Zalakos Overnewton Anglican Comm. College, VIC
