
Educational Assessment of Students


EIGHTH EDITION

Susan M. Brookhart, Professor Emerita, Duquesne University
Anthony J. Nitko, Professor Emeritus, University of Pittsburgh

330 Hudson Street, New York, NY 10013

Director and Publisher: Kevin M. Davis
Content Producer: Janelle Rogers
Media Producer: Lauren Carlson
Portfolio Management Assistant: Casey Coriell
Executive Field Marketing Manager: Krista Clark
Executive Product Marketing Manager: Christopher Barry
Procurement Specialist: Carol Melville
Full Service Project Management: Katie Ostler, Cenveo® Publisher Services
Cover Designer: Cenveo® Publisher Services
Cover Image: Paradoxe/offset.com
Composition: Cenveo® Publisher Services
Printer/Binder: LSC Communications
Cover Printer: Phoenix Color/Hagerstown
Text Font: 11/13 Palatino LT Pro

Copyright © 2019, 2015, 2011 by Pearson Education, Inc. or its affiliates. All Rights Reserved. Printed in the United States of America. This publication is protected by copyright, and permission should be obtained from the publisher prior to any prohibited reproduction, storage in a retrieval system, or transmission in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise. To obtain permission(s) to use material from this work, please visit http://www.pearsoned.com/permissions/

Acknowledgments of third-party content appear on the appropriate page within the text, which constitute an extension of this copyright page.

Unless otherwise indicated herein, any third-party trademarks that may appear in this work are the property of their respective owners and any references to third-party trademarks, logos or other trade dress are for demonstrative or descriptive purposes only. Such references are not intended to imply any sponsorship, endorsement, authorization, or promotion of Pearson’s products by the owners of such marks, or any relationship between the owner and Pearson Education, Inc. or its affiliates, authors, licensees or distributors.

Library of Congress Cataloging-in-Publication Data is on file with the Library of Congress.

10 9 8 7 6 5 4 3 2 1

ISBN-10: 0-13-480707-3 ISBN-13: 978-0-13-480707-2

About the Authors

Susan M. Brookhart is an independent consultant in educational assessment and Professor Emerita and former Chairperson of the Department of Educational Foundations and Leadership in the School of Education at Duquesne University. She has served on several state assessment technical advisory committees. Previous to her higher education experience, she taught both elementary and middle school. Her research interests include the role of both formative and summative classroom assessment in student motivation and achievement, the connection between classroom assessment and large-scale assessment, and grading.

Professor Brookhart was the 2007–2009 editor of Educational Measurement: Issues and Practice. She has served as the education columnist for National Forum, the journal of Phi Kappa Phi. She is a past president of the American Educational Research Association’s Special Interest Group on Classroom Assessment. She was named the 2014 Jason Millman Scholar by the Consortium for Research on Educational Assessment and Teaching Effectiveness (CREATE) and is the recipient of the 2015 Samuel J. Messick Memorial Lecture Award from ETS/TOEFL.

In all, Professor Brookhart is author or coauthor of 18 books and over 70 articles and book chapters on classroom assessment, teacher professional development, and grading. With Anthony J. Nitko, she is the coauthor of Assessment and Grading in Classrooms. With the late Norman E. Gronlund, she is the coauthor of Gronlund’s Writing Instructional Objectives (8th ed.). Some of the journals in which her research has appeared are Applied Measurement in Education, Assessment in Education: Principles, Policy, & Practice, Educational Measurement: Issues and Practice, Journal of Educational Measurement, Journal of Educational Research, Oxford Review of Education, Review of Educational Research, and Teachers College Record. She also serves on the editorial boards of Applied Measurement in Education, Assessment in Education: Principles, Policy, & Practice, Educational Assessment, and Teachers College Record.

Professor Brookhart’s assessment books for practitioners include How to Give Effective Feedback to Your Students, Formative Classroom Walkthroughs: How Principals and Teachers Collaborate to Raise Student Achievement (with Connie M. Moss), How to Assess Higher-Order Thinking Skills in Your Classroom, How to Use Grading to Support Learning, Learning Targets: Helping Students Aim for Understanding in Today’s Lesson (with Connie M. Moss), and How to Create and Use Rubrics for Formative Assessment and Grading.

Anthony J. Nitko is a private consultant in educational measurement and Professor Emeritus and former Chairperson of the Department of Psychology in Education at the University of Pittsburgh. His research interests include curriculum-based criterion-referenced testing, integrating testing and instruction, classroom assessment, and the assessment of higher-order thinking skills.

Professor Nitko is author of the chapter “Designing Tests That Are Integrated with Instruction” in the Third Edition of Educational Measurement and coauthor (with Susan Brookhart) of Assessment and Grading in Classrooms. He coauthored (with Susan Brookhart) the chapter “Strategies for Constructing Assessments of Higher-Order Thinking Skills” (2011). He also coauthored (with C. M. Lindvall) Measuring Pupil Achievement and Aptitude, (with T-C Hsu) Pitt Educational Testing Aids (PETA) (a package of computer programs for classroom teachers), and (with R. Glaser) the chapter “Measurement in Learning and Instruction” in the Second Edition of Educational Measurement.

Professor Nitko has been Editor of the journal Educational Measurement: Issues and Practice, and later served as the International News Editor of this journal. He was also Editor of d’News, the AERA Division D newsletter. Some of the journals in which his research has appeared include American Educational Research Journal, Applied Measurement in Education, Assessment in Education: Principles, Policy, & Practice, Educational Evaluation and Policy Analysis, Educational Measurement: Issues and Practice, Journal of Educational Measurement, and Research in Developmental Disabilities.

Professor Nitko is a member of several professional organizations, was elected as Fellow to the American Psychological Association, served on several committees of the American Educational Research Association, was elected Secretary of AERA Division D, served on committees of the National Council on Measurement in Education, and was elected to the Board of Directors and as President of the latter.

Professor Nitko received Fulbright awards to Malawi and to Barbados. He has served as a consultant to various government and private agencies in Bangladesh, Barbados, Botswana, Egypt, Ethiopia, Indonesia, Jamaica, Jordan, Liberia, Malawi, Maldives, Namibia, Oman, Saudi Arabia, Singapore, United States, Viet Nam, and Yemen.

Brief Contents

Part I The Bases for Assessment
1 Classroom Decision Making and Using Assessment 1
2 Describing the Goals of Instruction 18
3 Validity of Assessment Results 37
4 Reliability of Assessment Results 66
5 Professional Responsibilities, Ethical Behavior, and Legal Requirements in Educational Assessments 86

Part II Crafting and Using Classroom Assessments
6 Planning for Integrating Assessment and Instruction 107
7 Diagnostic and Formative Assessments 132
8 Providing Formative Feedback 152
9 Fill-in-the-Blank and True-False Items 166
10 Multiple-Choice and Matching Exercises 181
11 Higher-Order Thinking, Problem Solving, and Critical Thinking 218
12 Essay Assessment Tasks 240
13 Performance and Portfolio Assessments 260
14 Preparing Your Students to Be Assessed and Using Students’ Results to Improve Your Assessments 299
15 Evaluating and Grading Student Achievement 322

Part III Interpreting and Using Standardized Tests
16 Standardized Achievement Tests 354
17 Interpreting Norm-Referenced Scores 377
18 Finding and Evaluating Published Assessments 411
19 Scholastic Aptitude, Career Interests, Attitudes, and Personality Tests 425

Appendixes
A Educational Assessment Knowledge and Skills for Teachers 447
B Code of Fair Testing Practices in Education (Revised) 448
C Code of Professional Responsibilities in Educational Measurement 452
D Summaries of Taxonomies of Educational Objectives: Cognitive, Affective, and Psychomotor Domains 457
E Implementing the Principles of Universal Design via Technology-Based Testing 464
F Basic Statistical Concepts 466
G Computational Procedures for Various Reliability Coefficients 479
H A Limited List of Published Tests 484
I List of Test Publishers and Their Websites 486
J Answers to Even-Numbered Exercises 487

Glossary 491
References 511
Name Index 523
Subject Index 526

Contents

Part I The Bases for Assessment

1 Classroom Decision Making and Using Assessment 1
What Is Assessment? 2
Assessment and Classroom Decisions 7
Assessment and Educational Decisions About Students 7
High-Stakes Assessment and Accountability 14
Assessment Literacy 16
Conclusion 17
Exercises 17

2 Describing the Goals of Instruction 18
Importance of Specifying Learning Outcomes 19
Educational Goals, State Standards, and Learning Objectives 20
Evaluating the Learning Objectives of a Course or Unit 24
How to Write Specific Learning Objectives 24
Aligning Assessment Tasks with Learning Objectives 28
Sources for Locating Learning Objectives 30
Taxonomies of Learning Objectives 30
Cognitive Domain Taxonomies 31
Conclusion 36
Exercises 36

3 Validity of Assessment Results 37
General Nature of Validity 38
Four Principles for Validation 38
Validity of Teacher-Made Classroom Assessment Results 40
Validity of Large-Scale Assessment Results 45
Validity Issues When Accommodating Students with Disabilities 62
Conclusion 64
Exercises 64

4 Reliability of Assessment Results 66
General Nature of Reliability 67
Causes of Measurement Error or Inconsistency 68
Reliability of Classroom Assessments 68
Reliability of Large-Scale Assessments 71
Obtained Scores, True Scores, and Error Scores 78
Standard Error of Measurement 78
Reliability of Mastery and Pass-Fail Decisions 81
Factors Affecting Reliability and SEM and How to Improve Reliability 83
Conclusion 85
Exercises 85

5 Professional Responsibilities, Ethical Behavior, and Legal Requirements in Educational Assessments 86
A Teacher’s Professional Responsibilities in Assessment 87
Six Categories of Responsibility for Teachers 88
Students’ Rights and Responsibilities as Test-Takers 96
Secrecy, Access, Privacy, Confidentiality, and the Teacher 98


Testing Challenged in Court 100
Bias in Educational Assessment 102
Conclusion 105
Exercises 105

Part II Crafting and Using Classroom Assessments

6 Planning for Integrating Assessment and Instruction 107
Assessment Planning for a Marking Period 108
Assessment Planning for One Unit of Instruction 110
Preassessment to Plan Your Teaching 112
Planning for One Summative Assessment 113
Improving the Validity of Assessment Plans 116
What Range of Assessment Options Is Available? 119
Differentiating Instruction 127
Assessment Planning for Response to Intervention 127
Using Technology as an Aid in Assessment 128
Conclusion 131
Exercises 131

7 Diagnostic and Formative Assessments 132
Diagnostic Assessment 133
Formative Assessment 140
Learning Progressions 149
A Coherent Assessment System 150
Systematic Record Keeping 150
Conclusion 151
Exercises 151

8 Providing Formative Feedback 152
Types and Characteristics of Feedback 153
Helping Students Use Feedback 157
Differentiating Feedback 159
Peer Feedback 162
Feedback from Technology 163
Conclusion 164
Exercises 165

9 Fill-in-the-Blank and True-False Items 166
Three Fundamental Principles for Crafting Assessments 167
Fill-in-the-Blank Items 167
True-False Items 172
Conclusion 180
Exercises 180

10 Multiple-Choice and Matching Exercises 181
Multiple-Choice Items 182
Creating Alternative Varieties of Multiple-Choice Items 199
Matching Exercises 206
Creating Basic Matching Exercises 208
Creating Alternative Varieties of Matching Exercises 211
Conclusion 216
Exercises 216

11 Higher-Order Thinking, Problem Solving, and Critical Thinking 218
Assessing Higher-Order Thinking 219
Concept Learning 221
Assessing Whether Students’ Thinking Uses Rules 224
Problem Solving 226
Critical Thinking 229
Reading Skills 235
Conclusion 238
Exercises 238

12 Essay Assessment Tasks 240
Formats for Essay Items 241
Usefulness of Essay Assessments 243
Constructing Essays Assessing Subject-Matter Learning 245
Optional Questions 249
Constructing Prompts for Assessing Writing Achievement 249
Scoring Essay Assessments 253
Essay Assessment and Technology 256
Conclusion 258
Exercises 258

13 Performance and Portfolio Assessments 260
Performance Assessment 261
Designing Performance Assessments 272
Portfolios 291
Conclusion 298
Exercises 298

14 Preparing Your Students to Be Assessed and Using Students’ Results to Improve Your Assessments 299
Preparing Students for Assessment 300
Testwiseness 302


Test Anxiety 303
Assessment Format and Appearance 305
Correction for Guessing 306
Item Analysis for Classroom Assessments 308
Item Difficulty Index 314
Item Discrimination Index 315
Improving Multiple-Choice Item Quality 316
Selecting Test Items 318
Conclusion 320
Exercises 321

15 Evaluating and Grading Student Achievement 322
The Meanings and Purposes of Grades 323
Reporting Methods 326
Choosing a Grading Model 334
Grading Practices 337
Techniques for Combining Grades to Summarize Achievement 343
Conclusion 352
Exercises 352

Part III Interpreting and Using Standardized Tests

16 Standardized Achievement Tests 354
Overview of Standardized Tests 355
Varieties of Standardized Tests 357
Commercially Published Achievement Tests 358
Federally Mandated State Assessments 364
Commercially Produced Interim and Benchmark Assessments and Services 366
Other Commercially Available Tests 368
Appropriate Uses of Standardized Test Results 369
Inappropriate Uses of Standardized Test Results 371
How to Administer Standardized Tests 372
Ethical and Unethical Student Practice for Standardized Tests 372
Conclusion 374
Exercises 374

17 Interpreting Norm-Referenced Scores 377
Three Referencing Frameworks 378
Using Norms 381
Types of Norm Groups 382
Norm-Referenced Scores 385
Percentile Ranks 385
Linear Standard Scores 387
Normal Distributions 389
Normalized Standard Scores 392
Developmental and Educational Growth Scales 396
Extended Normalized Standard Score Scales 396
Grade-Equivalent Scores 397
General Guidelines for Score Interpretation 405
Conclusion 409
Exercises 409

18 Finding and Evaluating Published Assessments 411
Locating a Published Test 412
Locating Reviews of Published Tests 415
Locating Computerized Testing Materials 417
Locating Unpublished Test Materials 418
Restrictions on Purchasing and Using Tests 418
Evaluating and Selecting a Test 419
Locating Information About Your State Tests 422
Conclusion 424
Exercises 424

19 Scholastic Aptitude, Career Interests, Attitudes, and Personality Tests 425
Aptitudes for Learning 426
Group Tests of Scholastic Aptitudes 428
Group Tests of Specific Aptitudes 433
Individually Administered Tests of General Scholastic Aptitudes 436
Assessing Adaptive Behavior 440
Assessing Vocational and Career Interests 441
Assessing Attitudes 444
Assessing Personality Dimensions 444
Conclusion 445
Exercises 445

Appendixes
A Educational Assessment Knowledge and Skills for Teachers 447
B Code of Fair Testing Practices in Education (Revised) 448
C Code of Professional Responsibilities in Educational Measurement 452
D Summaries of Taxonomies of Educational Objectives: Cognitive, Affective, and Psychomotor Domains 457
E Implementing the Principles of Universal Design via Technology-Based Testing 464


F Basic Statistical Concepts 466
G Computational Procedures for Various Reliability Coefficients 479
H A Limited List of Published Tests 484
I List of Test Publishers and Their Internet Websites 486
J Answers to Even-Numbered Exercises 487

Glossary 491
References 511
Name Index 523
Subject Index 526

NOTE: Every effort has been made to provide accurate and current information in this book. However, the Internet and information posted on it are constantly changing, so it is inevitable that some of the Internet addresses listed in this textbook will change.

Preface

As for the previous editions, the goal of Educational Assessment of Students, Eighth Edition, is to help teachers and those in training to teach to improve their skills through better assessment of students. It focuses directly on the professional practices of elementary and secondary-school teachers. This edition features:

■■ A continued strong emphasis on classroom assessment, both formative and summative.
■■ Complete coverage of the basics as well as advanced topics and topics of contemporary interest.
■■ Practical advice and examples of how good and poor classroom assessments affect students’ learning.
■■ A revised chapter on standardized testing to reflect recent changes in the assessment landscape.

Educational Assessment of Students is a core text written for a first course in educational testing and constructing classroom assessments, and it serves equally as the textbook for an undergraduate course or a first graduate course in educational assessment. No formal coursework in statistics or college mathematics is necessary to understand the text.

The book provides complete coverage of educational assessment, including developing plans that integrate teaching and assessment; using formative assessment strategies and providing effective feedback to students; crafting objective, performance, and portfolio assessments; evaluating students and discussing evaluations with parents; and interpreting state-mandated tests and standardized achievement tests.

It is important in a first course that students receive a balanced treatment of the topics. Because the book is a comprehensive treatment of traditional and alternative assessments, we give examples, discuss the pros and cons, and give guidance for crafting every assessment technique that we introduce. Research is cited that supports or refutes assessment and teaching practices.

The text prepares teachers and those in training to teach as professionals. We recognize that teachers’ experiences and judgments are necessary for proper and valid use of educational assessment. We do not hesitate to point out teachers’ and school administrators’ erroneous judgments and assessment abuses, however, where good lessons can be learned from them.

NEW AND REVISED CONTENT

In preparing this edition, we made a special effort to make it easy for the reader to apply the material to classroom practice through improved explanations, improved practical examples and illustrations, checklists, and step-by-step, how-to instructions. As with previous editions, we have written the text from the viewpoint that assessment is part of good teaching practice that helps the teacher improve students’ learning. Material new to the eighth edition includes:

1. Updated information that reflects the Elementary and Secondary Education Act of 2015 and the current assessment landscape.



2. A change in the order of chapters to put the chapter on higher-order thinking before the chapter on essay questions.
3. Up-to-date discussion of published achievement tests in Chapter 16.
4. Update of websites related to assessment, including a discussion of how to access information about state testing programs on the Internet, and update of references.

MyLab EDUCATION

One of the most visible changes in the new edition, also one of the most significant, is the expansion of the digital learning and assessment resources embedded in the etext and the inclusion of MyLab in the text. MyLab for Education is an online homework, tutorial, and assessment program designed to work with the text to engage learners and to improve learning. Within its structured environment, learners practice what they learn, test their understanding, and receive feedback to guide their learning and to ensure their mastery of key learning outcomes. The MyLab portion of the new edition of Educational Assessment of Students is designed to bring learners more directly into the world of K-12 classrooms and to help them see the very real impact that the assessment concepts covered in the book have on learners. The materials in MyLab Education with Educational Assessment of Students include three types of resources.

■■ Application Exercises allow readers to practice assessment tasks like writing different types of assessment items, clearly communicating learning targets to students, interpreting standardized assessment reports, and grading.
■■ Video Examples illustrate classroom assessment in action, helping students better understand course content.
■■ Self-Check Quizzes help students assess how well they have mastered chapter learning outcomes. The multiple-choice, automatically graded quizzes provide rationales for both correct and incorrect answers.

SPECIAL FEATURES

The following special features highlight the practicality of this text:

1. Examples of how to craft classroom assessments and what they typically look like.
2. Checklists with succinct tips for evaluating the quality of each type of assessment taught in the book.
3. Strategies for assessing higher-order thinking that serve as models and descriptions for developing problem-solving and critical-thinking assessments.
4. Key concepts that serve to introduce each chapter, coupled with online MyLab exercises and videos.
5. Important terms and concepts listed at the beginning of the chapter and defined in both the chapter’s text and in a glossary.
6. End-of-chapter exercises that let students apply their learning to practical situations and an appendix with answers to even-numbered exercises.
7. Appendixes of statistical concepts with spreadsheet applications and tutorials for calculating reliability coefficients for instructors and students interested in a more quantitative approach than the text provides.

ACKNOWLEDGMENTS

A project of this magnitude requires the help of many persons. We are very much indebted to the reviewers whose critical reading contributed greatly to the technical accuracy, readability, and usefulness of the eighth edition: Kathryn Anderson Alvestad, University of Maryland; Mary K. Boudreaux, University of Memphis; Kristin L. Koskey, The University of Akron; Connie M. Moss, Duquesne University. Special thanks go to Steve Ferrara, Measured Progress, and to Michael J. Young, Pearson Assessment, for helpful reviews and suggestions for improvement and updating.

We would also like to thank the reviewers for the second, third, fourth, fifth, sixth, and seventh editions: Peter W. Airasian, Boston College; Lawrence M. Aleamoni, University of Arizona; Kathryn Anderson Alvestad, University of Maryland, College Park; Carol E. Baker, University of Pittsburgh; W. L. Bashaw, University of Georgia; Gary Bingham, Georgia State University; Pamela Broadston, University of Arkansas at Little Rock; Deborah Brown, West Chester University; Marcia Burell, SUNY Oswego; Heidi Legg Burross, University of Arizona; Alice Corkill, University of Nevada at Las Vegas; Lee Doebler, University of Montevallo; Leonard S. Feldt, University of Iowa; Terry Fogg, Minnesota State University; Betty E. Gridley, Ball State University; Gretchen Guiton, University of Southern California; Anthony E. Kelly, George Mason University; Jin-Ah Kim, Illinois State University; Thomas M. Haladyna, Arizona State University; Charles Hughes, Pennsylvania State University; Louise F. Jernigan, Eastern Michigan University; Suzanne Lane, University of Pittsburgh; Robert Lange, University of Central Florida; Robert W. Lissitz, University of Maryland; Nancy Martin, University of Texas–San Antonio; Craig Mertler, Bowling Green State University; William P. Moore, University of Kansas; Pamela A. Moss, University of Michigan; Robert Paugh, University of Central Florida; Susan E. Phillips, Michigan State University; Bruce Rogers, University of Northern Iowa; Marianne Robin Russo, Florida Atlantic University; John Shimkanin, California University of Pennsylvania; William M. Stallings, Georgia State University; Hoi K. Suen, Pennsylvania State University; James S. Terwilliger, University of Minnesota; Charles L. Thomas, George Mason University; Michael S. Trevisan, Washington State University; Anthony Truog, University of Wisconsin–Whitewater; Tary L. Wallace, University of South Florida, Sarasota-Manatee; Kinnard White, University of North Carolina; Richard Wolf, Teachers College, Columbia University; and David R. Young, State University of New York–Cortland.

We thank our students at the School of Education, University of Pittsburgh; the School of Education, Duquesne University; the College of Education, University of Arizona; the Curriculum Development and Evaluation Centre, Botswana Ministry of Education; teachers working with the Jamaica Ministry of Education; teachers and assessors at the Examination Development Center, Indonesia Ministry of Education and Culture; and trainers with the Integrated Language Project in Egypt, who used the second, third, and fourth editions. They provided insightful feedback and corrections of errors that have greatly improved the usefulness of the text. Francis Amedahe helped classify chapter learning targets and write test items for the third edition. Sarah Bonner contributed test items, practical examples for classroom activities, and many elements of the Instructor’s Manual for the fourth edition. To all of these persons, and others we have failed to mention, we offer our most sincere thanks and appreciation.

We are grateful for permission to use checklists and examples that Anthony Nitko originally published with colleagues Harry Hsu and Maury Lindvall. Specifically, the checklists for evaluating the quality of a test blueprint (Chapter 6), multiple-choice items (Chapter 10), matching exercises (Chapter 10), and essay items (Chapter 12) and the example in Figure 6.4 originally appeared in A. J. Nitko and T-C. Hsu, Teacher’s Guide to Better Classroom Testing: A Judgmental Approach, 1987, Pittsburgh, PA: Institute for Practice and Research in Education, School of Education, University of Pittsburgh. The examples in Figures 13.4, 17.4, and 17.12 originally appeared in C. M. Lindvall and A. J. Nitko, Measuring Student Achievement and Aptitude (Second Edition), 1975, New York: Harcourt Brace Jovanovich.

Special thanks to Veronica Nitko and Frank Brookhart, whose support and encouragement were invaluable throughout the work on this text and its previous editions.

