
COMPREHENSIVE STANDARDS-BASED ASSESSMENT FRAMEWORK

Definition of terms

Type and Purpose, User: Category of assessment, the function the assessment serves within a comprehensive system of standards-based curriculum, instruction, and assessment, and who uses the assessment type and results.

Frequency and Relationship to Instruction: How often and when to assess students in relation to instructional goals, to inform uses and actions (see the Uses/Actions column).

Methods: Strategies for obtaining evidence of learning.

Information: Types of evidence or information gained from assessment.

Uses/Actions: Actions that educators and students might take in relation to assessment information.

CLASSROOM FORMATIVE: EMBEDDED IN ONGOING TEACHING AND LEARNING
Users: Student, Teacher

Purpose:
• Assist teaching and learning
• Track learning
• Signal important learning goals

Frequency and Relationship to Instruction:
• Minute-by-minute, daily, weekly
• During teaching and learning
• Short-term goals

Methods (teacher placed strategically throughout the lesson):
• Observation of classroom discourse
• Observation of students engaged in instructional tasks
• Teacher and student interaction (e.g., teacher-student conferences)
• Analysis of student work/representations
• Student self-reflection (e.g., quick write, response to questions)
• Student oral and written work products
(Opportunities to gather evidence of learning during ongoing instruction are intentionally planned by the teacher and may also occur spontaneously.)

Information:
• Emerging or partially-formed ideas, full understanding relative to lesson goals
• Students' current learning status relative to lesson learning goals (e.g., have students met the goal(s); are they nearly there?)
• Difficulties and misunderstandings/misconceptions

Uses/Actions:
• Continue with planned instruction (S, T)
• Stop and find out more
• Provide specific feedback to class or individual students (oral or written) (S, T)
• Reflect on next steps (student self-assessment) (S)
• Adjust instructional moves in relation to student learning status (e.g., on "teachable moments") in this or the next lesson (S, T)
• Refine and improve assessment (S, T)

User key: S=Student, T=Teacher, SSS=Student Support Staff, SA=School Administrator, F=Family, D=District Administrator, ST=State

CLASSROOM FORMATIVE: FORMAL CHECKPOINTS ON LEARNING PROGRESS
Users: Student, Teacher

Purpose:
• Assist/evaluate teaching and learning
• Signal important learning goals
• Monitor progress with respect to specifically targeted intervention goals

Frequency and Relationship to Instruction:
• Weekly, or as fits with the instructional plan or schedule
• Short-term goals

Methods (teacher planned and placed strategically in relationship to the instructional plan):
• Checklists (e.g., developmental, observational)
• Curriculum-embedded assessments and/or completed student work products
• Other external assessments, not developed by the teacher, that provide instructionally tractable information (e.g., READ Act assessments)

Information:
• Emerging or partially-formed ideas, full understanding
• Students' current learning status relative to lesson learning goals (e.g., have students met the goal(s); are they nearly there?)
• Difficulties and misunderstandings/misconceptions

Uses/Actions:
• Plan instruction for the start of a new week (T, SSS)
• Provide feedback to class or individual students (oral or written) (S, T)
• Reflect on effectiveness of planning and instruction (T, SSS)
• Reflect on next steps (student self-assessment) (S)
• Reflect on effectiveness of curriculum and instruction in real time (T, SSS)
• Refine and improve assessment (T, SSS)
• May be used as a portion of a comprehensive educator evaluation system (T, SSS)

INTERVENTION ASSESSMENTS: PROGRESS MONITORING WITH RESPECT TO SPECIFICALLY TARGETED INTERVENTION

Frequency and Relationship to Instruction:
• Short- to medium-term goals

Methods:
• Progress monitoring measures (e.g., curriculum-based measurement; embedded Dynamic Learning Maps Alternate Assessment Program for qualified students)
• Program (intervention)-based assessments

Information:
• Student achievement of target learning goal(s) for a specific intervention

Uses/Actions:
• Implement, continue, revise, or conclude intervention (T, SA, SSS)


CLASSROOM SUMMATIVE
Users: Student, Teacher, Student Support Staff, School Administrator, Family, District Administrator

Purpose:
• Signal important learning goals
• Evaluate attainment of important learning goals

Frequency and Relationship to Instruction:
• After a more extended period of teaching and learning (e.g., after a unit is completed and before another unit begins)
• Medium-term goals

Methods:
• Student work products and performances (e.g., portfolio), with associated rubric(s)
• Student self-reflection (e.g., short survey)
• Classroom summative assessments designed/selected by teacher(s)

Information:
• Status of student learning relative to longer-term goals (e.g., unit learning goals)

Uses/Actions:
• Reflect on subsequent next steps moving forward (S, T, SSS)
• Reflect on effectiveness of planning and instruction (T, SSS)
• Report to administrators and families (T, SSS, F)
• Discuss student progress as a basis for instructional planning of subsequent units during teacher grade-level/departmental meetings (T, SSS, SA)
• Family involvement based on results (F)
• Refine and improve assessment (T, SSS, SA)
• May be used as a portion of a comprehensive educator evaluation system (T, SSS)


INTERIM SUMMATIVE
Users: Student, Teacher, Student Support Staff, School Administrator, Family, District Administrator

Purpose:
• Signal important learning goals
• Track student achievement based on learning goals
• Inform improvement strategies for teachers, schools, and districts

Frequency and Relationship to Instruction:
• At the end of a semester; 3x per year or more
• Across instructional units/calendar periods
• Medium-term goals

Methods:
• Teacher designed/selected curriculum-embedded measures
• Student work products and performances (e.g., portfolio), with associated rubric(s)
• School/district standardized standards-based, grade-level achievement tests

Information:
• Status of achievement of intermediate goals toward meeting standards
• Prediction of end-of-year proficiency
• Standardized results aggregated and disaggregated:
  - By grade level, school, and/or teacher
  - By student subgroup
  - By sub-skill
  - Trends/patterns in student performance
• Student data dashboard/graphic representation of understanding

Uses/Actions:
• Reflect on effectiveness of planning and instruction (T, SSS)
• Reflect on effectiveness of school/district structures, programs, and curricula (SSS, SA, D)
• Make within-year decisions about instructional approaches or programs (T, SSS)
• Make within-year adjustments to curriculum/programs (T, SSS, SA)
• Reporting (including communication with families and district personnel) (T, SSS, SA, F, D)
• Family involvement based on results (F)
• Identify students for supplemental intervention (T, SSS, SA)
• Readjust professional learning priorities and resource decisions (T, SSS, SA, D)
• Continue or readjust improvement strategies (T, SSS, SA, D)
• Identify students in need of additional support or interventions (T, SSS, SA, D)
• Identify potential promising practices (SSS, SA, D)
• Refine and improve assessment (T, SSS, SA, D)
• Understand student performance at the school/district level for monitoring and improvement planning, local accreditation, or the Request to Reconsider process (SA, D, ST)
• Support improvement planning (e.g., UIP) (SA, D)
• Educator evaluation (T, SSS, SA, D)

INTERIM: PROGRESS MONITORING WITH RESPECT TO SPECIFICALLY TARGETED INTERVENTION

Frequency and Relationship to Instruction:
• Medium-term goals

Methods:
• Progress monitoring measures (e.g., curriculum-based measurement; embedded Dynamic Learning Maps Alternate Assessment Program for qualified students)
• Program (intervention)-based assessments
• Observation inventories

Information:
• Student achievement of target learning goal(s) for a specific intervention

Uses/Actions:
• Implement, continue, revise, or conclude intervention (T, SSS, SA)


SUMMATIVE: STATE, DISTRICT, SCHOOL, OTHER EXTERNAL MANDATED
Users: Student, Teacher, Student Support Staff, School Administrator, Family, District Administrator, State

Purpose:
• Accountability:
  - Gauge student achievement of standards
  - Establish benchmark or floor for school/district
  - Gauge school/district progress relative to student test achievement and growth
• Inform improvement strategies: teacher, school, district, state
• Signal important learning goals
• Align curriculum

Frequency and Relationship to Instruction:
• After a year's or a course's worth of instruction and learning
• Long-term goals

Methods:
• State end-of-year assessments:
  - Colorado Measures of Academic Success (CMAS): PARCC (English language arts, math), Science and Social Studies
  - Colorado Alternate Assessment (CoAlt): Dynamic Learning Maps Year-End (English language arts, math); Science and Social Studies
  - English language proficiency test (WIDA ACCESS for ELs)
  - College readiness/entrance exam
• Large-scale end-of-course assessments:
  - Advanced Placement
  - International Baccalaureate
• District/school created/selected end-of-course/year assessments
• Teacher end-of-course or final assessments, including standardized and performance assessments and other curriculum-embedded measures

Information:
• Status of student achievement with respect to standards
• May be able to provide relative growth information for students and schools
• Results aggregated and disaggregated:
  - Trends/patterns in student performance
  - Relative improvement of cohorts, subgroups, grade levels, subject areas
  - Relative performance of teachers, schools, districts
  - Progress in closing the achievement gap

Uses/Actions:
• Report on the status and progress of student achievement (T, SSS, SA, D, ST)
• Make judgments about student learning relative to standards (S, T, SSS, SA, F, D, ST)
• Gauge student, school, district, and state year-to-year progress (SA, D, ST)
• Improvement planning (e.g., UIP; prioritize professional learning and resource decisions; curriculum program realignment; reflect on effectiveness of school initiatives) (S, T, SSS, SA, F, D, ST)
• Educator evaluations (T, SA, D)
• Certification/accreditation (S, D)
• Family or student action based on results (S, F)
• Refine and improve assessment (T, SSS, SA, D, ST)
• Describe student performance at the school/district level for state and federal accountability ratings (SA, D, ST)

DIAGNOSTIC & SCREENING
Users: Teacher, Student Support Staff, School Administrator, Family, District Administrator

Purpose:
• Screening for special program placement or intervention
• Identifying underlying causes of breakdown in learning

Frequency and Relationship to Instruction:
• According to school, district, or state testing calendars and/or referral policies and practices
• As needed, based on information from other types of assessment
• Short-, medium-, and long-term goals

Methods:
• Norm-referenced standardized cognitive tests
• Observation inventories
• English language placement test (WIDA ACCESS for ELs)
• Home language survey
• State and interim assessment results
• Demonstrated behavior and/or performance
• Often one-on-one testing

Information:
• Identification of students who are at risk
• Identification of gifted students
• Identification of EL status
• Identification of additional areas of support for at-risk students

Uses/Actions:
• Provide targeted interventions for at-risk and gifted students (T, SSS, SA)
• Conclude intervention (T, SSS, SA)
• Program placement (S, T, SSS, SA, F, D, ST)
• Align instruction to specific areas of need (T, SSS)
• Provide targeted interventions for students (T, SSS, SA, D)
• Refine and improve assessment (T, SSS, SA, D, ST)
• Provide baseline information to assist educators in setting learning goals for students that inform their evaluation criteria (T, SSS)


NATIONAL & INTERNATIONAL ASSESSMENTS
Users: State, Public

Purpose: Inform the public about:
• The achievement of elementary and secondary students, and progress at the national and state level
• Elementary and/or secondary students' achievement in comparison with other countries

Frequency and Relationship to Instruction:
• NAEP: administered every 2 years in reading and math; other subjects (writing, science, arts, geography, etc.) are tested periodically, but with less frequency
• PISA: administered every 3 years
• PIRLS/TIMSS: administered every 4 years, each based on its own framework
• Long-term goals

Methods:
• NAEP
• PISA
• PIRLS/TIMSS
(All three assessments use matrix-sampled test forms with representative samples of students at select grade levels; for NAEP, grades 4, 8, and 12.)
• NAEP is based on the NAEP framework and not Colorado standards, so it has no direct relationship to curriculum; no individual, school-, or district-level results are available
• PIRLS/TIMSS likewise have no direct relationship to curriculum, and no individual, school-, or district-level results are available

Information:
• How Colorado students compare nationally and internationally
• What percent of Colorado students are considered proficient based on NAEP versus a state assessment

Uses/Actions:
• Get an independent indicator of how Colorado students and major subgroups are doing and whether performance is improving (ST)
• Establish the state proficiency standards by comparing to NAEP's national barometer (ST)
• Learn from promising practices and school and student characteristics that are related to higher performance (ST)
• Refine and improve assessment (ST)

