
“IT’S NOT FOR A GRADE”: THE REWARDS AND RISKS OF LOW-RISK ASSESSMENT IN THE HIGH-STAKES LAW SCHOOL CLASSROOM

OLYMPIA DUHART*

I. INTRODUCTION

Even before they walk into their first classroom, law students are convinced that law school is an ultra-competitive game. And in many significant ways—class ranking, grading curves, teaching methods—the competitive nature of law school is reinforced and confirmed. Today, the escalating cost of legal education1 and shrinking job opportunities nationwide2 have only fueled the notion of law school as a zero-sum game with very high stakes. Law schools are even ranked based on student competitiveness.3 After all, law school has never been an endeavor for the faint of heart.4

* Professor of Law and Director of the Legal Research and Writing Program at Nova Southeastern University, Shepard Broad College of Law. Many thanks to Frederick J. Pye III and Joseph Morgese for their research assistance with this article. I also appreciate the important feedback from Cai Adia. Finally, I am grateful for the very helpful comments from Professors Andrea Curcio, Kim Chanbonpin, and Patricia Propheter on early drafts. 1 See Debra Cassens Weiss, Legal Education Cost Is Even Higher Than First Estimated, Transparency Group Says, A.B.A. J. (May 7, 2012, 2:37 PM), http://www.abajournal.com/news/article/legal_education_cost_is_even_higher_than_first_estimated_transparency_group/; see also Jackie Gardina & Ngai Pindell, What Is the Progressive Response to Law School Costs?, SALTLAW BLOG (May 11, 2013), http://www.blog.saltlaw.org/progressive-response-to-rising-costs-of-legal-education/. 2 Dimitra Kessenides, Jobs Are Still Scarce for Law School Grads, BLOOMBERG BUSINESSWEEK (June 20, 2014), http://www.bloomberg.com/bw/articles/2014-06-20/the-employment-rate-falls-again-for-recent-law-school-graduates. 3 Staci Zaretsky, Which Law School Has the Most Competitive Students?, ABOVE L. (Oct. 10, 2013, 1:25 PM), http://abovethelaw.com/2013/10/which-law-school-has-the-most-competitive-studentsand-which-law-school-offers-the-best-quality-of-life/. 4 Indeed, popular culture has long traded on the extremely competitive nature of law school. See, e.g., SCOTT TUROW, ONE L: THE TURBULENT TRUE STORY OF A FIRST YEAR AT HARVARD LAW SCHOOL (1977) (The author offers his now-classic memoir about his



But the high-stakes law school culture collides in many ways with effective teaching methods. The law school culture is especially at odds with assessment tools that should be designed to maximize opportunities for feedback and growth, particularly those opportunities with little risk of failure or penalty. Feedback is most effective as formative assessment when it is “conducted during learning to promote, not merely judge or grade, student success.”5 Formative assessments, which are generally focused on improvement,6 should occur frequently in law school settings. The more practice students have, the better off they will be. However, allowing students to participate in multiple assessment opportunities will come at a high price if each assessment is bound up with great exposure to failure. Often, students are so focused on the grade that they cannot approach the experience as an opportunity to learn. And who can blame them? Who would ever want to learn to play a new instrument if every turn at the piano were in front of a packed audience at Carnegie Hall?

As law schools move to implement changes to standards that call for formative assessment,7 educators must consider the efficacy of various formative assessment models. The wider impact of adopting formative assessment models should also inform curriculum and course design. Instructors should consider whether their assessment tools would exacerbate the problem of the high-stakes, high-pressure law school environment. In one recent study, for example, about forty percent of law school students were clinically depressed by graduation.8

law school experience.) See also THE PAPER CHASE (20th Century Fox 1973). The film about the struggles of a first-year student has been revived through memes memorializing the trials of law school. Most recently, network television has promoted the hypercompetitive nature of law school through shows such as How to Get Away with Murder, where students would literally kill to be at the top of the class. See How to Get Away with Murder (ABC television broadcast Sept. 2014). 5 Rick Stiggins, From Formative Assessment to Assessment FOR Learning: A Path to Success in Standards-Based Schools, PHI DELTA KAPPAN, Dec. 2, 2005, at 324, 326. 6 Roberto L. Corrada, Formative Assessment in Doctrinal Classes: Rethinking Grade Appeals, 63 J. LEGAL EDUC. 317, 319 n.5 (2013). 7 The American Bar Association recently adopted new standards that call for formative assessment. See Transition to and Implementation of the New Standards and Rules of Procedure for Approval of Law Schools, A.B.A. 2 (Aug. 13, 2014), http://www.americanbar.org/content/dam/aba/administrative/legal_education_and_admissions_to_the_bar/governancedocuments/2014_august_transition_and_implementation_of_new_aba_standards_and_rules.authcheckdam.pdf [hereinafter Transition to New Standards]. See infra Part II.B and accompanying notes.

Moreover, law professors wading into the new world of formative assessments9 should be mindful not to import the overemphasis on grades from the familiar world of summative assessment or final exams. Instead, professors should more expansively incorporate low-risk formative assessment in the law school classroom. The term “low-risk formative assessment” refers to assessments that do not run the risk of having a high impact on a student’s public performance or grades. They are low-stakes assignments that carry no or very low grade impact. Most importantly, these low-risk assessments focus on feedback over scores and performance. The goal is to provide students an opportunity to practice—and even “fail”10—with very little risk. The possibilities for these assessments are almost limitless; they could include in-class practice quizzes that are collected or self-graded, group assignments, games, mock debates, or out-of-class projects crafted by students.11 Critics argue that multiple low-stakes practice opportunities are stripped of their validity when they do not have a significant impact on grades. However, new thinking about low-risk assessment suggests that low-risk, ungraded assessments offer multiple benefits to

8 Debra Cassens Weiss, ‘You Are Not Alone’: Law Prof Who Considered Suicide Tells His Story, A.B.A. J. (Apr. 8, 2014, 10:50 AM), http://www.abajournal.com/news/article/you_are_not_alone_law_prof_who_considered_suicide_tells_his_story. In the same study, the students were no more depressed than the general population prior to law school, where about eight percent report depression. Id. The reasons for law student depression are well documented. Marjorie Silver, a professor at Touro Law Center in Central Islip, N.Y., who speaks to students about her own struggle with depression, notes several contributing factors. First, law students come into the profession expecting success—and then [ninety] percent are disappointed when they don’t rank in the top [ten] percent. Further, Silver says, students are thrust into an unfamiliar learning environment in which the predominant Socratic teaching method undermines self-esteem. Hollee Schwartz Temple, Speaking Up: Helping Law Students Break Through the Silence of Depression, ABA J. (Feb. 1, 2012, 8:50 AM), http://www.abajournal.com/magazine/article/speaking_up_helping_law_students_break_through_the_silence_of_depression/. 9 The distinctions between formative and summative assessments will be addressed in greater detail below. See infra Part II.A and accompanying notes. 10 “Formative work is low stakes when taking a risk to learn something new. Failure at first is expected, but equally expected is a rise from it to find success. If an athlete doesn’t do the work to improve . . . they are not going to perform when it is game time.” Kathy Dyer, Accurately Defining Formative Assessment, NORTHWEST EVALUATION ASS’N (Oct. 30, 2013), https://www.nwea.org/blog/2013/accurately-defining-formative-assessment/. 11 Examples of low-risk formative assessment assignments are provided in Appendices A through F.

law students and professors. First, low-risk assessments can help minimize performance-inhibiting anxieties in law students.12 Eventually, these performance anxieties can become significant obstacles to learning and development.13 Although grades have been called the “pedagogical whip”14 in teaching, frequent opportunities to practice that have little or no impact on grades can also be valuable. They can alleviate student anxiety and help prepare students for a more effective performance.15 Low-risk assessments can also allow instructors to collect a more accurate reflection of what their students know and do not know. These stronger snapshots of student understanding are an essential component for professors who need to make appropriate teaching adjustments. Low-risk formative assessments also put the focus properly on learning and understanding. These types of assessments can optimize student performance and strengthen the feedback loop. All of these practices support greater student success rates. Through freedom from the norms of law school assessment and explicit communication with students, law professors can develop effective, ungraded, low-risk assessment tools.

Despite the persistence of the high-stakes law school culture, law schools should strive to incorporate low-risk formative assessment tools that are valid measures of student understanding and teaching effectiveness. Such low-risk assessments are only fair and valid, however, if they mirror the final exam. For example, low-risk formative assessments that are presented as non-threatening, practice multiple-choice questions would be unfair if the final exam consisted solely of essay questions. Changes in the formative assessment culture, therefore, might also lead to changes in summative assessment tools.

This article will explore the rewards and limitations of promoting low-risk formative assessments in the law school classroom. After the introduction, Part II distinguishes formative from summative assessment and highlights the new changes in the regulatory standards that call for increased formative assessment methods. Part III examines the ways in which the most commonly used assessment

12 Barbara F. Cherem, Using Online Formative Assessments for Improved Learning, CURRENTS IN TEACHING & LEARNING, Spring 2011, at 42, 45–46. 13 Id. at 45. 14 Scott Warnock, Frequent, Low-Stakes Grading: Assessments for Communication, Confidence, FAC. FOCUS (Apr. 18, 2013), http://www.facultyfocus.com/articles/educational-assessment/frequent-low-stakes-grading-assessment-for-communication-confidence/. 15 Id.

models—particularly, overreliance on high-stakes finals at the end of the course—exacerbate the problems associated with the high-stakes law school culture. It addresses the ways overemphasis on summative assessment both frustrates the feedback loop and adversely impacts student well-being. Part IV raises common criticisms of low-risk formative assessment models and responds with ways to minimize the potentially negative impact. Part V critiques the overuse of traditional teaching and assessment practices in law school. In Part VI, this article provides some guidelines for creating a classroom culture that supports low-risk formative assessment tools. This part promotes self-reflective and collaborative learning experiences that are efficient for law professors. It also addresses strategies that can be used to minimize the burdens on faculty members who want to expand the variety of low-risk formative assessment in their classes. These strategies range from completion points to grading rubrics to collaborative exercises. The Appendix contains samples of low-risk formative assessments that can be used in various first-year and upper-level courses.

II. ONCE IS NEVER ENOUGH

Imagine being told in January that you will be performing in a summer music concert in front of a huge crowd. The first thing you are likely to do is establish a rigorous practice schedule that will ensure that you are ready for your big day. Very few people—even the most accomplished musicians—would wait until the start of the concert to play the featured song for the very first time. Even thinking very hard about the concert or listening to recordings of other artists will not prepare you well for your own performance. To figure out where you need work, you would have to sit down and play the song yourself. And you would probably have to practice several times to get it right. Once would never be enough.16

But overreliance on a single, huge, graded summative assessment at the end of a course is just like waiting until the day of the concert to try to play that new song. It is very stressful, does not offer any opportunity for improvement, and carries the risk of great failure. So why do so

16 Malcolm Gladwell stresses this point as he writes about the “ten-thousand-hour rule.” Malcolm Gladwell, Complexity and the Ten-Thousand-Hour Rule, NEW YORKER (Aug. 21, 2013), http://www.newyorker.com/news/sporting-scene/complexity-and-the-ten-thousand-hour-rule. Gladwell argues that, in complex fields, it takes a lot of practice; one estimate is that it takes at least 10,000 hours of practice to become an expert at something. Id. “In cognitively demanding fields, there are no naturals.” Id.; see also MALCOLM GLADWELL, OUTLIERS: THE STORY OF SUCCESS (2011).

many law school classrooms still rely on the big end-of-the-semester final? Quite simply, it is the way things have always been done. And because most law school faculty do not have formal training in education, we replicate our own law school experiences—for better or worse. It is also deemed by many to be a “necessary evil” because it is perceived to be less time-consuming for law school faculty who sometimes have more than 150 students a semester.

The term “apprenticeship of observation”17 refers to the reality that each law professor “approaches [his or her] teaching responsibilities having spent thousands of hours as students over the course of [his or her] lifetime.”18 The experience serves as a de facto “apprenticeship” that restricts the instructor’s ability to situate his or her own students within a pedagogically oriented scheme.19 Since the nineteenth century, American law schools have relied on the high-stakes final exam as the only assessment most students experience in a doctrinal course.20 The Langdellian tradition21 of case method and high-stakes final has been the norm for most law professors. Being overwhelmingly practiced in one model of law school classroom instruction, professors are slow to move out of their comfort zones into new models of teaching and learning.22 Their apprenticeship of observation has, for the most part, cemented a commitment to a single three- or four-hour exam at the end of the course.

A. Summative Assessment Versus Formative Assessment

Since many instructors are doing things the way they have always done them, it is no surprise that there is an overreliance on summative assessment in today’s law school classroom. Summative assessment refers to assessments that assign grades or “otherwise indicate the extent to which students have achieved the course goals.”23

17 David M. Moss, at the Crossroads, in REFORMING LEGAL EDUCATION: LAW SCHOOLS AT THE CROSSROADS 1, 4 (David M. Moss & Debra Moss Curtis eds., 2012) (citing DAN C. LORTIE, SCHOOLTEACHER: A SOCIOLOGICAL STUDY (1975)). 18 Id. 19 Id. 20 Rogelio A. Lasso, Is Our Students Learning? Using Assessments to Measure and Improve Law School Learning and Performance, 15 BARRY L. REV. 73, 79 (2010). 21 See id. at 80 (discussing the introduction of end-of-the-term exams around 1870 at Harvard Law School to complement Dean Christopher Columbus Langdell’s new case method of instruction); see also Steven I. Friedland, Outcomes and the Ownership Conception of Law School Courses, 38 WM. MITCHELL L. REV. 947, 949 (2012). 22 Moss, supra note 17, at 4. 23 Lasso, supra note 20, at 77.


Summative assessment focuses on evaluation.24 Professor Roberto Corrada calls summative assessment a “snapshot” intended only to determine what someone has learned up to a certain point.25 It literally “sums up” what students have learned.26 While all summative assessment measures student learning, some summative assessment also arguably offers feedback.27 Indeed, the terms “formative” and “summative” apply not to the actual assessments but rather to the functions they serve.28 The feedback from summative assessment can offer students the chance to develop their learning skills.29 Unfortunately, that feedback usually provides only a minimal chance for immediate improvement because the grade has already been assigned, and in many cases, the course is over.30 And the reality is that most students rarely go back to examine old exams in an effort to learn from them.

Rather than an “event” to be experienced at the end of a course or unit, the best assessment has properly been described as a “continuing cyclical process.”31

At the student level, the most common goal is to measure how much of a subject the student learned over the course of the semester. Traditionally, this has been done through a final exam, or what is known as summative assessment, where it is the sum of the learning that is assessed. As law school curricula move toward more skills courses and experiential

24 MICHAEL HUNTER SCHWARTZ ET AL., TEACHING LAW BY DESIGN 154 (2009). 25 Corrada, supra note 6, at 319 n.5. 26 Ruth Mitchell, A Guide to Standardized Testing: The Nature of Assessment, CENTER FOR PUB. EDUC. (Feb. 15, 2006), http://www.centerforpubliceducation.org/Main-Menu/Evaluating-performance/A-guide-to-standardized-testing-The-nature-of-assessment. 27 David Thomson, When the ABA Comes Calling, Let’s Speak the Same Language of Assessment, 23 PERS.: TEACHING LEGAL RES. & WRITING 68, 69 (2014), available at http://info.legalsolutions.thomsonreuters.com/pdf/perspec/2014-fall/2014-fall-9.pdf. 28 Paul Black & Dylan Wiliam, “In Praise of Educational Research”: Formative Assessment, 29 BRIT. EDUC. RES. J. 623, 623 (2003). 29 Lasso, supra note 20, at 78. 30 Though there is clearly a high value in advancing students’ understanding of their learning strengths and weaknesses, summative assessments that come in the form of all-or-nothing final exams are often not reviewed for meaningful feedback by students, or they come too late to impact student grades. In my own experience, only a handful of students ever come back to review their final exams. Even summative assessments administered as midterms usually offer only minimal opportunity to improve performance. Some professors do not re-test on material already covered in the midterm, and very few would ever invite a challenge to a midterm grade. One notable exception is Professor Corrada, who encourages midterm exam appeals as a method of encouraging his students to learn formatively from his exams. See Corrada, supra note 6, at 319, 321. 31 Thomson, supra note 27, at 69.


learning, the preferred method of assessment is formative in nature—that is, the assessment helps to form the student’s learning and to help them improve over the course of the semester.32

In contrast to summative assessment, which is aimed at assigning grades or measuring the “extent to which students have achieved the course goals,”33 formative assessment is intended to offer feedback to both students and faculty.34 The central purpose of formative assessment is learning.35 A hallmark of formative assessment is prompt communication to students about learning so students can improve.36 As one commentator has noted: “It helps to form learning.”37 In addition to helping students understand their learning strengths and deficiencies, formative assessment can also help professors learn what is and is not working about their teaching. Nevertheless, the single summative assessment remains the dominant assessment tool because it is easier to implement, professors experienced that model of assessment themselves, and tradition is hard to buck.

B. Change on the Horizon

However, professors will soon be forced to do things differently. The American Bar Association (ABA), the accreditation body for American law schools, has just approved revisions to the standard that explicitly calls for formative and summative assessment methods.38 Among the numerous changes that have been approved over the past cycle are standards impacting assessment of student learning. The new standards explicitly state that law schools will need to feature both summative and formative assessments in their curricula. These changes, slated to go into effect in the fall of 2016, are as follows:

32 Id. at 71. 33 Lasso, supra note 20, at 77. 34 Id. 35 Id. 36 Memorandum from Richard K. Neumann, Jr., Professor of Law, Maurice A. Deane Sch. of Law at Hofstra Univ., to Council of the ABA Section of Legal Educ. & Admissions to the Bar (Jan. 31, 2014), available at http://www.alwd.org/wp-content/uploads/2014/02/Chapter-3-Neumann.pdf. Professor Neumann has weighed in against the adoption of Standard 314; he has stressed that formative assessment must be detailed and individualized to be meaningful. Id. 37 Id. 38 See Transition to New Standards, supra note 7.


Standard 314. ASSESSMENT OF STUDENT LEARNING

A law school shall utilize both formative and summative assessment methods in its curriculum to measure and improve student learning and provide meaningful feedback to students.

Interpretation 314-1

Formative assessment methods are measurements at different points during a particular course or at different points over the span of a student’s education that provide meaningful feedback to improve student learning. Summative assessment methods are measurements at the culmination of a particular course or at the culmination of any part of a student’s legal education that measure the degree of student learning.

Interpretation 314-2

A law school need not apply multiple assessment methods in any particular course. Assessment methods are likely to be different from school to school. Law schools are not required by Standard 314 to use any particular assessment method.39

Standard 315. EVALUATION OF PROGRAM OF LEGAL EDUCATION, LEARNING OUTCOMES, AND ASSESSMENT METHODS

The dean and the faculty of a law school shall conduct ongoing evaluation of the law school’s program of legal education, learning outcomes, and assessment methods; and shall use the results of this evaluation to determine the degree of student attainment of competency in the learning outcomes and to make appropriate changes to improve the curriculum.

Interpretation 315-1

Examples of methods that may be used to measure the degree to which students have attained competency in the school’s student learning outcomes include review of the records the law school maintains to measure individual student achievement pursuant to Standard 314; evaluation of student learning portfolios; student evaluation of the sufficiency of their education; student performance in capstone courses or other courses that appropriately assess a variety of skills and knowledge; bar exam passage rates; placement rates; surveys of attorneys, judges, and alumni; and assessment of student performance by judges, attorneys, or law professors from other schools. The methods used to measure the degree of student achievement of learning outcomes are likely to differ from school to school and law schools are not required by this standard to use any particular methods.40

39 ABA REV. STAND. FOR APPROV. L. SCH. § 314, available at http://www.americanbar.org/content/dam/aba/administrative/legal_education_and_admissions_to_the_bar/council_reports_and_resolutions/201406_revised_standards_clean_copy.authcheckdam.pdf (emphasis in original). 40 Id. § 315 (emphasis in original).


Taken together, the revised standards on assessment make clear that law schools must more robustly incorporate both formative and summative assessment models (as the old standards did not mention formative assessments at all). Further, the language in the Interpretation of Standard 315 regarding the various methods available to determine competency in learning outcomes demonstrates an appetite for assessments beyond the “do-or-die” final exams.41

However, the ABA standards do not clearly address all of the questions law professors have about the new assessment requirement. For instance, there is already discussion about what the standards actually mean for law schools.42 It is not yet settled what law schools must do to be in compliance with the new standards.43 Specifically, it is not clear whether the ABA is promoting assessments at the school level, the program level, the course level, or all three.44 At the very least, however, the adoption of the revisions reflects a movement toward more valid assessment models that better promote learning and teaching. Assessments should no longer be simply equated with a final exam—or even a high-stakes midterm exam, for that matter.45 Without a thoughtful consideration and understanding of the value of formative assessment, the danger is that law professors will merely take the high-stakes grading culture of the final exam and break it in half with a midterm and a final exam to satisfy the “formative” assessment standard. Such a step would undermine the importance of formative assessment and exacerbate the stressful law school culture.

III. PRACTICE MAKES PERFECT

“No research supports the idea of determining grades based solely on one exam given at the end of a semester.”46 Yet law schools persist in taking the “wait until the day of the concert” approach to assessment. This section of the article asserts that the overreliance on the most common assessment model—the dreaded tortures of the damned known as “finals”—contributes significantly to the problems associated with the high-stakes law school culture. Not only does overreliance on this model frustrate the feedback loop that is paramount in

41 Lasso, supra note 20, at 79. 42 Thomson, supra note 27, at 69. 43 Id. 44 Id. 45 Id. 46 SCHWARTZ ET AL., supra note 24, at 155.

effective assessment, but it also has a negative impact on student well-being.

First, the single summative assessment model challenges everything known about learning and valid assessment tools. The assessment cycle is illustrated in the loop below:47

The cycle reproduced above suggests that the assessment process relies on constant evaluation of (1) ways to improve student learning and (2) ways to improve teaching.48 Law school assessment, however, is hyper-focused on a third function of assessment: evaluating students to assign grades (in both summative and formative assessments). The model defeats the feedback loop. “The feedback students receive is limited and often far removed from the ending of the course.”49 By offering only limited feedback opportunities at the end—and usually to those in dire straits or those who handily earned the top grade—the high-stakes, graded final exam restricts professors’ ability to gather feedback in a meaningful time frame to improve instruction for the class in front of them.

In addition, the emphasis on high-risk graded assessments makes an already difficult emotional journey in law school50 more difficult for

47 Id. at 136. 48 Id. at 136–37. 49 Linda S. Anderson, Incorporating Adult Learning Theory into Law School Classrooms: Small Steps Leading to Large Results, 5 APPALACHIAN J.L. 127, 135 (2006). 50 “To begin with, it is well recognized that law school is among the most stressful of all educational environments, including medical school.” James B. Levy, As a Last Resort,

students. Some reports indicate that test anxiety seriously affects about twenty percent of the school-going population.51 An additional eighteen percent may be moderately affected by test anxiety.52 By graduation, about forty percent of law students are suffering from clinical depression, one recent study reported.53

Even for those students not suffering from acute test anxiety, there is a relationship between high-stakes testing and student mental health.54 For even young children, stress about high-stakes tests manifests itself in physical and psychological ways.55 One Florida pediatrician reported that there is an uptick in patients during standardized testing season, with patients reporting “some level of test-related anxiety, with symptoms ranging from stomach aches [sic] to attacks.”56 In a 2009 study of Michigan students, eleven percent of students surveyed reported severe psychological and physiological responses to testing.57 When there are already underlying mental health issues, the consequences can be even more serious for students.58 One study, for example, traced the relationship between a

Ask the Students: What They Say Makes Someone an Effective Law Teacher, 58 ME. L. REV. 49, 61 (2006). The high emotional toll of law school has been long documented. See B.A. Glesner, Fear and Loathing in Law School, 23 CONN. L. REV. 627, 627 (1991) (noting that students blame the legal education process for their stress levels); see also Lawrence S. Krieger, Institutional Denial About the Dark Side of Law School, and Fresh Empirical Guidance for Constructively Breaking the Silence, 52 J. LEGAL EDUC. 112, 113 (2002). 51 See Valerie Strauss, Test Anxiety: Why It Is Increasing and [Three] Ways to Curb It, WASH. POST (Feb. 10, 2013), http://www.washingtonpost.com/blogs/answer-sheet/wp/2013/02/10/test-anxiety-why-it-is-increasing-and-3-ways-to-curb-it/; see also Mark Chapell et al., Test Anxiety and Academic Performance in Undergraduate and Graduate Students, 97 J. EDUC. PSYCHOL. 268, 268 (2005) (“Test anxiety is . . . ‘the set of phenomenological, psychological, and behavioral responses that accompany concern about possible negative consequences or failure on an exam or similar evaluative situations.’”); see also Bethany L. Rosado, The Effects of Deep Muscle Relaxation and Study Skills Training on Test Anxiety and Academic Performance 6 (2013) (unpublished honors project, East Texas Baptist University), available at https://www.etbu.edu/files/5713/8608/9755/Bethany_Rosado_Honors_Project_2013.pdf. 52 Strauss, supra note 51. 53 Weiss, supra note 8. 54 Rhema Thompson, Too Much Stress? Parents, Experts Discuss High-Stakes Standardized Test Anxiety, WJCT NEWS (Apr. 23, 2014, 4:47 AM), http://news.wjct.org/post/too-much-test-stress-parents-experts-discuss-high-stakes-standarized-test-anxiety. 55 Id. 56 Id. 57 Id. 58 See, e.g., Robin Nusslock et al., A Goal-Striving Life Event and the Onset of Hypomanic and Depressive Episodes and Symptoms: Perspective from the Behavioral Approach System (BAS) Deregulation System, 116 J. ABNORMAL PSYCHOL. 105, 105 (2007) (discussing the interplay

goal-striving life event and the onset of hypomanic and depressive episodes in college students.59

Further, many students typically view testing situations as “personally threatening.”60 Law school grades have a negative impact on students’ self-esteem.61 Test-related stress presents in two distinct areas: worry in anticipation of the test and anxiety following test completion.62 While testing and grading cannot be avoided, the negative impact of constant high-stakes grading can be minimized through formative assessment tools. Therefore, assessment designers need to be thoughtful in considering the impact that a testing and assessment process will have on the emotional and mental well-being of their students. “On campuses, where discussions of mental health are often left to student affairs, defining psychosocial well-being means ascertaining the applicability of this concept within a larger discussion . . . in relationship to student learning.”63

In addition, the relationship between testing and anxiety is a two-way street. High-stakes testing does not just negatively impact student well-being. The high emotional toll of the testing can negatively

between goal-striving events, such as examinations, and manic episodes in bipolar individuals). 59 Id. at 105–12. In an ongoing longitudinal investigation of bipolar spectrum disorders, researchers confirmed the relationship between preparing for and taking final exams and the onset of hypomanic symptoms among college students with bipolar spectrum disorders. Id. Hypomanic episodes were defined as “abnormally and persistently elevated, expansive, or irritable mood.” Id. at 109. The researchers did not record a relationship between final exams and DSM-IV major depressive episodes among bipolar spectrum participants. Id. at 112. Among the bipolar spectrum subjects who took final exams, forty-two percent had an “exam-specific hypomanic episode” compared to only four percent of those bipolar spectrum individuals who were not tested. Id. at 110. 60 Rosado, supra note 51, at 6. 61 See Andrea Curcio, Assessing Differently and Using Empirical Studies to See If It Makes a Difference: Can Law Schools Do It Better?, 27 QUINNIPIAC L. REV. 899, 901 (2009) (“[L]aw school grades also impact students’ sense of confidence and self-worth and often cause students to disengage.”); see generally Grant Morris, Preparing Law Students for Disappointing Exam Results: Lessons from Casey at the Bat, 45 SAN DIEGO L. REV. 441 (2008). 62 M. Gail Jones et al., The Impact of High-Stakes Testing on Teachers and Students in North Carolina, PHI DELTA KAPPAN, Nov. 1999, at 199, 201. 63 Ashley Finley, Connecting the Dots: A Methodological Approach for Assessing Students’ Civic Engagement and Psychological Well-Being, ASS’N OF AM. COLLS. & UNIVS., https://www.aacu.org/publications-research/periodicals/connecting-dots-methodological-approach-assessing-students%E2%80%99-civic (last visited May 1, 2015) (connecting the relationship between psychological well-being and civic engagement or, more broadly, learning on the college campus).

impact student testing performance as well.64 While emotion triggers our attention and alerts the brain to be prepared for something important, it also plays a key role in self-efficacy.65 This, in turn, promotes a student’s ability to succeed on the task at hand.66 Conditions that lower student self-efficacy, such as an atmosphere of high-stakes assessment, can lead to negative neurobiological effects that “actually impede learning.”67 As research in cognitive psychology gives us a better understanding of the relationship between anxiety and performance,68 it makes sense to develop less stressful assessment tools to support both student well-being and performance levels.

IV. IS THIS FOR A GRADE?

Anyone who has taught for even one day knows the subtext of a student receiving an assignment from the teacher and then raising his or her hand to ask the following question: “Is this for a grade?” Wrapped up in that simple prompt are often value judgments about the importance of the assignment, how much effort the student will invest in the work, and what the teacher can expect as a final work product. No grade often translates to “I am not putting much effort into this.” Such a decision essentially destroys the validity of the assessment and renders it ineffective.

While high-stakes formative assessment comes replete with problems, low-stakes formative assessment has its own set of obstacles. Although there are several legitimate critiques of low-risk formative assessments, the rewards of such assessment models outweigh the risks.

A. “They Won’t Take It Seriously If It Doesn’t Count”

As noted above, there is an inherent challenge in getting students to make the best use of many common low-risk formative assessment tools. On one hand, some students do not take ungraded assignments or self-assessments seriously (ditto for many efforts to bolster self-regulated

64 See Levy, supra note 50, at 56 (addressing the relationship between the emotional climate of a classroom and academic success). 65 Id. at 56, 58. 66 Id. at 58. 67 Id. 68 Strauss, supra note 51.

learning strategies).69 Often, if students do not feel there is much at risk, they do not do their best on formative assessments. In other words, students sometimes make a decision to focus only on the final exam or product worth the largest percentage of the final grade.70 Student motivation is an integral part of teaching.71 Indeed, motivation intersects with active learning techniques to maximize student engagement.72 Engagement is a product, rather than a sum, of the two; therefore, the absence of motivation makes the active learning efforts futile.73 Student motivation for completion of formative assessment is just as important to the validity of such tools.74

On the other hand, law professors must respond to the challenges of teaching students who are often paralyzed by the pressures of the competitive law school environment. Because of the competitive law school culture, formative assessments can be paralyzing and can trigger fear of failure for many students. Law students are afraid of “failing” a formative assessment and afraid of the implications of such failure.75 That fear is compounded when so-called formative assessments conform to the summative, high-stakes graded demands that are more appealing to most law professors.

B. “This Is Law School. I’m Not Teaching Babies.”

Another common criticism that plagues low-risk formative assessment is that it infantilizes law students, who are adult learners in graduate school training to be attorneys. Some law professors, who properly see themselves as gatekeepers of the profession, have philosophical objections to giving students multiple “safe” places to practice.

69 See John Misak, Why I Started Grading First Drafts and Why I Stopped, CHRON. VITAE (Mar. 10, 2015), https://chroniclevitae.com/news/936-why-i-started-grading-first-drafts-and-why-i-stopped (discussing a composition instructor’s experience hearing complaints from fellow faculty members that “no one takes an ungraded [assignment] seriously”); see also O’Farrell, Enhancing Student Learning Through Assessment: A Toolkit Approach 9 (unpublished manuscript) (discussing techniques for overcoming the challenge of some students not taking formative assessment seriously), available at http://www.tcd.ie/teaching-learning/academic-development/assets/pdf/250309_assessment_toolkit.pdf. 70 See O’Farrell, supra note 69, at 9 (“[S]tudents are generally most motivated by what is going to contribute to their final mark.”). 71 ELIZABETH F. BARKLEY, STUDENT ENGAGEMENT TECHNIQUES 4–7 (2010). 72 Id. at 4–5. 73 Id. at 5–7. 74 Id. 75 Cherem, supra note 12, at 45.


The argument, in a nutshell, is that students should be self-sufficient enough as young adults headed for the bar exam to take a disciplined approach to their own studies. Further, some law professors see value in training students to think on their feet, respond well under pressure, and function in a high-stakes environment. All these skills, after all, will help make someone a strong advocate for a client one day.

But the focus on high-stakes, graded tests that force students into assessments carrying great risk ignores the realities of the mental strain that impacts law students. Furthermore, by using high-stakes, graded assessment as the dominant model, law school assessments are marked by a common defect: they fail to “emphasize the skills, knowledge and attitudes regarded as most important, not just those that are easy to assess.”76 Other skills such as collaboration, interpersonal communication, and metacognition may not lend themselves easily to grades, but they are essential lawyering skills.77 These skills should be developed and supported explicitly in the classroom. Low-risk formative assessment makes this possible.

C. “I Need Every Point I Can Get”

Another risk of low-risk formative assessment involves the relentlessly competitive nature of law students. Even when low-risk assessments count for a very small portion of the grade, or only involve participation points, students can become obsessed with their performance rather than their opportunities for feedback and growth. Because the bulk of law school delays grades until the very end, professors who offer some feedback early in the course must deal with the overemphasis that students assign to low-stakes or no-stakes assignments. Indeed, the student who is prone to anxiety or stress will even obsess over a small participation point. The law school culture is hard to change, even among students.

By being explicit about teaching methods and assessments from day one, the instructor should be able to minimize the student emphasis on grades and competition. Professors can also consider sharing

76 Black & Wiliam, supra note 28, at 627. 77 See Curcio, supra note 61, at 909–10 (citing Shultz and Zedeck’s twenty-six factors for effective lawyering as set forth in MARJORIE M. SHULTZ AND SHELDON ZEDECK, FINAL REPORT: IDENTIFICATION, DEVELOPMENT AND VALIDATION OF PREDICTORS FOR SUCCESSFUL LAWYERING (2008)); see also Anthony Niedwiecki, Teaching for Lifelong Learning: Improving the Metacognitive Skills of Law Students Through More Effective Formative Assessment Techniques, 40 CAP. U. L. REV. 149, 155–56 (2012).

short essays about the value of engaged learning and low-risk assessment. By being transparent78 about all aspects of the course and being consistent in efforts to build a learning community, professors should be able to quell student concerns about the “value” of low-risk assessments.

V. “SOCRATES DIDN’T KNOW EVERYTHING”

Though still the preferred teaching method in most American law schools, the Socratic method has been widely denounced. “However employed, the Socratic method is often criticized. Ralph Nader has called it ‘the game only one can play,’ and there have been generations of students who . . . have wished curses on Dean Langdell.”79

The Socratic method has been defended for its ability to help students learn how to “think quickly, respond under stress, and teach themselves; all essential to most types of law practice.”80 The value of the Socratic method, coupled with the reinforcement from the apprenticeship of observation, makes the teaching method hard to abandon. The corollary to the Socratic method, the high-stakes final exam, has been equally persistent. But there is space for multiple teaching and assessment methods. Further, as commentators have stressed, the single exam is “invalid, unreliable, and even ‘anti-educational.’”81

Formative assessments are designed to enhance student learning and performance by providing increased feedback.82 The most useful feedback identifies student mistakes and offers timely corrections.83 But these objectives can be elusive within the bounds of the competing tensions of the high-stakes law school environment entrenched in the Socratic method and high-stakes finals.84 A single final exam challenges the well-being of students and does not offer a timely opportunity

78 In my classes, I tell the students on day one that we will do things differently. I also share with them the analogy of the music concert and practice schedule I have used throughout this article. The value of setting clear goals with regard to assessment is also outlined in detail in Chapter Seven of Roy Stuckey’s Best Practices for Legal Education: A Vision and a Road Map. ROY STUCKEY ET AL., BEST PRACTICES FOR LEGAL EDUCATION: A VISION AND A ROAD MAP 235 (2007). 79 TUROW, supra note 4, at 41. 80 Anderson, supra note 49, at 135. 81 Steven Friedland, A Critical Inquiry into the Traditional Uses of Law School Evaluation, 23 PACE L. REV. 147, 177 (2002). 82 Carol Springer Sargent & Andrea A. Curcio, Empirical Evidence That Formative Assessments Improve Final Exams, 61 J. LEGAL EDUC. 379, 379 (2012). 83 Niedwiecki, supra note 77, at 155–56. 84 Id. at 176–77.

to correct learning or teaching gaps. Yet professors are correctly concerned about the efficiency of offering varied low-risk assessments with a large volume of students. Some low-risk formative assessments—such as the review of practice questions in class—also expose the law professor to the demands of transparency. Students are able to critique questions and answers, which does not usually happen on the final exam. For law professors, the threat of being challenged openly in class may be daunting. But the benefit is that the questioning from students can help the professor become a better exam-writer. It may also expose flaws in student understanding of material while there is still time to make adjustments.85

There are burdens and benefits in creating low-risk formative assessment tools that help measure teaching effectiveness and improve self-regulated learning. Formative and summative work must be aligned within a complete system “so that teachers’ formative work [is] not undermined by summative pressures” within the system.86 The goals of each assessment tool should remain separate and distinct; nevertheless, the formative assessment should create a proper training ground for the demands of the summative assessment.

One of the appeals of low-risk formative assessment is its ability to contribute positively to the learning culture in the classroom. Instead of leveraging fear as a motivator for students who are afraid to fall on their faces in front of the classroom during a Socratic grill, some of the more accessible low-risk assessment models build the atmosphere of a learning community.87 The models with low- or no-risk assessments support a less threatening environment where students are “more willing to test their understanding in public.”88 In fact, research suggests that formative assessment contributes positively to both student motivation and student achievement.89 By collaborating with researchers at Washington University in St. Louis, for example, a group of teachers successfully incorporated “retrieval practice” into their science and social studies classes.90 Students were given a quiz on what

85 See generally Sargent & Curcio, supra note 82. 86 Black & Wiliam, supra note 28, at 623–24. 87 Anderson, supra note 49, at 133. 88 Id. 89 Kathleen M. Cauley & James H. McMillan, Formative Assessment Techniques to Support Student Motivation and Achievement, 83 CLEARING HOUSE: J. EDUC. STRATEGIES, ISSUES & IDEAS 1, 1 (2010). 90 Annie Murphy Paul, In Defense of School Testing, TIME (June 6, 2012), http://ideas.time.com/2012/06/06/in-defense-of-school-testing/.

they were just taught at the end of each class.91 The quiz was decidedly low-stakes.92 In fact, it was no-stakes; it was not graded.93 The assessment was offered solely to promote retention.94 The simple exercise, which resulted in no risk of failure for the students and very little extra work for the teachers, had a positive impact on the students’ recall of the material.95 Low-stakes assessment opportunities are not just rooted in more human teaching and good policy. They are rooted in good cognitive science as well. “Every time we pull up a memory, we make it stronger and more lasting—so that testing doesn’t just measure what students know, it changes what they know.”96

VI. MUCH MORE THAN MULTIPLE CHOICE

There is more to life than multiple-choice quizzes.97 In fact, there are numerous formative assessment tools available that can help students and professors navigate the competing tensions within the law school environment. The interest in providing multiple opportunities for feedback is at odds with the pressure for efficiency in the law school

91 Id. 92 Id. 93 Id. 94 Id. 95 Paul, supra note 90. 96 Id. 97 Multiple-choice questions, which appear in bulk on the bar exam, have earned their fair share of criticism and defense: Anyone who has attended a U.S. school in the last half century is familiar with the bubble tests. Students face a question with four possible answers and respond by filling in a blank “bubble” with a number two pencil. Why a number two pencil? Because the lead in the pencil is a conductor of electricity so that the answer sheets can be scored by machine. Tests that ask for bubbled answers are called multiple-choice, although sometimes controlled choice or selected response. All the terms refer to the fact that possible answers are given and the student has to choose rather than provide an individual response. A few years ago it was justifiable to criticize multiple-choice testing because it seemed reductive. Critics charged that the questions focused on memorized facts and did not encourage thinking. However, test designers took up the challenge to make more sophisticated multiple-choice tests. In many cases multiple-choice tests now require considerable thought, even notes and calculations, before choosing a bubble. Nonetheless, it remains true that multiple-choice tests “are clearly limited in the kinds of achievement they can measure.” These tests do not ask students to produce anything, but only to recognize (even after some thought) the “right” answer. In doing so, multiple-choice tests foster a mindset that expects a right answer even though further experience in both school and life tends to frustrate that expectation. Mitchell, supra note 26 (internal citations omitted).

classroom.98 This part will specifically promote self-reflective and collaborative learning experiences that respond to the time constraints facing most law professors.

Before a professor can make the leap into low-risk formative assessment, he or she must set the stage in the classroom. These suggestions offer guidance—in both doctrinal and skills courses—for creating a culture that supports low-risk formative assessment. Below is a list of recommendations professors can use to help their students succeed with low-stakes formative assessments in the high-stakes law school setting. The challenges and opportunities in creating these tools are also briefly addressed below. The Appendices provide specific examples of low-risk formative assessments from Constitutional Law, First Amendment, and Legal Research and Writing classes.99

• Create student ownership in the course. This can be accomplished in many ways. First, students can be polled on their learning objectives,100 and the class can agree to goals that can be incorporated into the syllabus. Students can also maintain a “portfolio” that measures both their substantive mastery and their process throughout the semester. Students can also share what they hope to accomplish throughout the year and identify concrete steps for how they will meet their goals. The goals and steps could then be compiled into one document, distributed to the class, and used as a “roadmap” throughout the year.

• Determine student entry levels. On day one, students can be surveyed about their own understanding of their strengths and weaknesses. The professor could then address the results in a subsequent class meeting and talk about the classroom profile.

98 There are time constraints forced by students who correctly want immediate feedback on the assessments so they can close their learning gaps. But there are also serious demands for efficiency by instructors who must also juggle the other competing aspects of their job: teaching, scholarship, and service. 99 These assessments can be adapted and implemented across the board in doctrinal classes, skills courses, clinics, workshops, and seminars. I am limiting myself to the listed subjects because of my own teaching experience. Furthermore, legal research and writing and clinics are steeped in formative assessment—this talk is new to many of my doctrinal colleagues—so I am focusing the sample assessments on Constitutional Law and the First Amendment courses. 100 Friedland, supra note 21, at 975 (“The shared property conception can improve both the efficacy and ethos of American legal education, orienting it to meet the numerous challenges of a challenging lawyering marketplace.”).


This is an important reality check for many, especially as several students (through a combination of their K-12 education, Millennial birth, and protection by Baby Boomer parents) are struggling with being both academically underprepared and overconfident.101

• Cultivate a culture of understanding in the classroom. One advantage of compiling the student entry levels is that students recognize that they are truly one community. More importantly, students may understand that they are typically anxious about the same issues at the start of the class. This recognition helps promote a culture of understanding and respect in the classroom. Professors may also consider attempting to connect with students by hosting “brown bag” lunch sessions with small groups where any topic except class can be discussed. This can help professors to see their students fully and gives students an opportunity to share their life experiences with their peers in a supportive environment.

• Make it plain. Experts consistently stress the need to be explicit with students. The importance of explicit communication in the classroom should shape everything from class management policies to teaching methods to assessments. Experts in legal education have rightly emphasized the importance of being “transparent with your students—to be open in revealing the structure of your course, identifying key points to be retained from a given lesson, situating the topic you’re covering in its larger doctrinal context, and flagging important transitions as you move through the semester.”102

• Help students become practiced in metacognition and self-reflection. Because lawyers are lifelong learners, law students need to practice the skills of self-reflection and self-assessment. Self-regulated learning, or expert learning, involves “planfulness, control and reflection” that is self-directed and goal-oriented.103 Tools that bolster the practice of self-regulated learning

101 For an interesting article about this phenomenon, see Ruth Vance & Susan Stuart, Of Moby Dick and Tartar Sauce: The Academically Underprepared Law Student and the Curse of Overconfidence, 53 DUQ. L. REV. 133 (2015).
102 HOWARD KATZ & KEVIN O'NEILL, STRATEGIES AND TECHNIQUES OF LAW SCHOOL TEACHING: A PRIMER FOR NEW (AND NOT SO NEW) PROFESSORS 31 (2009).
103 Michael Hunter Schwartz, Teaching Law Students to Be Self-Regulated Learners, 2003 MICH. ST. DCL L. REV. 447, 453.


Tools that bolster the practice of self-regulated learning include cognitive protocols104 and journals. Checklists are also invaluable tools in helping students regulate their learning.105 A professor might consider starting by giving students checklists to approach their writing assignments, for example, but ending by asking students to develop their own individualized checklists for their performance. (To help students appreciate the value of checklists, I also share with them a brief excerpt from The Checklist Manifesto, which explores the value of checklists in promoting efficiency and accuracy in other environments.106) Because there is no "wrong" answer when creating one's own checklist, students are generally able to approach the assignment without much anxiety.107

• Foster an environment where students are contributors to the collective knowledge. Group assignments are an excellent way to cultivate a community of learners in the classroom. However, students are sometimes reluctant to work with others. Again, the goal is to help students overcome the high-stakes mindset that is so pervasive in law school. A few specific strategies that may be useful when making group assignments include: 1) using group activities as an opportunity to teach about the collaborative nature of law practice; 2) sharing with students essays on collaboration that address both the risks and rewards; 3) discussing in class the different ways people can collaborate; and 4) requiring the completion of group-assessment and self-assessment worksheets at the end of the collaborative exercise.

• Take advantage of technology. Rather than fight against the wave of technology that characterizes today's student culture, professors can leverage technology to help them with assessment.

104 Many thanks to Professor Sophie Sparrow, University of New Hampshire School of Law, for introducing me to this term and method at the start of my law school teaching career. Others may refer to the same format as "reflective papers" or "journals."
105 See Enrique Rebolloso Pacheco et al., Quality Criteria for Self-Evaluation in , 6 J. MULTIDISC. EVAL. 16 (2009) (addressing the value of metaevaluation to help students create checklists).
106 See ATUL GAWANDE, THE CHECKLIST MANIFESTO: HOW TO GET THINGS RIGHT (2009).
107 I also offer a small, good faith completion point to help keep students motivated to make an honest effort on the assignment. The better incentive, however, is making explicit for them the connection between the checklist and the avoidance of certain mistakes in their work. In fact, one of my best students helped me recognize the value of the individual checklist—he arrived at a writing conference with his own handwritten checklist for his trial brief. After seeing how helpful that exercise was for him, I implemented an individual checklist requirement for subsequent classes.


Email-based formative assessments (EFAs)108 are low-risk. With an EFA, the professor can post a question, and students can respond privately by email. This format allows students to get feedback without the stress of completing a "paper." Professors are also able to respond quickly if the prompts are short and confined to discrete topics or hypos. In addition, professors can incorporate technology in the classroom by encouraging students to post blogs on course topics, asking students to do a quick Internet search of images associated with a new topic, and breaking students into groups to create slide presentations on a topic.109 Students should be encouraged to take an active role in the use of technology; instead of merely watching presentations, students can create them. Other low-risk formative assessment tools include the use of clicker technology or quizzes on electronic classroom platforms. Such active engagement with technology improves the efficacy of the assessment because it makes it more likely that students will be able to identify gaps in their learning and understanding of new material.

• Use all available resources. Without a doubt, the biggest obstacle to the integration of more formative assessment in the law school classroom is the time crunch. Because professors are already juggling teaching, service, and scholarship obligations, many cannot imagine taking on additional demands on their time. One way professors can preserve their most valuable commodity—time—is through the skillful use of all available resources. First, I have been lucky at my institution to enjoy the support and camaraderie of my colleagues. We frequently share resources, such as quizzes, practice assignments, and hypos, to minimize the time required to create materials from scratch. Our Legal Research and Writing team also utilizes a shared network drive so people can quickly access materials without going through direct requests. I have also implemented a required practice IRAC-format essay in Constitutional Law with the help of my teaching assistants.

108 See generally Carl Anthony Doige, E-mail-Based Formative Assessment: A Chronicle of Research-Inspired Practice, J. C. SCI. TEACHING, July–Aug. 2012, at 32.
109 Professor William Araiza shared this idea with me. I have had a lot of luck with it in my Constitutional Law class. With enough guidance, students can create slideshow presentations that are effective teaching tools for both themselves and their peers.


To help students stay motivated without too much anxiety about performance, I give five points toward the final exam score for a "good faith" effort on a practice IRAC. By using teaching assistants, I can return papers faster and with fewer demands on my time. To keep things as uniform as possible, I create a detailed rubric that the teaching assistants can use to evaluate papers. Further, I "calibrate" our grading by assigning two random papers to the assistants to evaluate; I also evaluate the same two papers. Next, I meet with my assistants to review our findings. We discuss the strengths and weaknesses of each paper and rank them. I offer any needed corrective guidance and divide the remaining papers between the teaching assistants. Students appreciate the chance to review a rubric of what I expect from them on an IRAC, enjoy a quick turnaround on their efforts, and face little penalty for not performing well. I also get help where I need it.

• Be creative. Usually, the term "formative assessment" conjures up images of quizzes and more papers to grade. Neither prospect is appealing to students or professors. However, the integration of alternative assessments110—assessments that creatively engage the students and give them a chance to present material in a unique format—can help professors incorporate formative assessment tools that are both novel and effective. And because the assessments are out of the ordinary, students often approach the assignments with enthusiasm and excitement.111 As other legal scholars have noted:

110 I was first introduced to this idea by Professor Steve Friedland, who used a similar concept in his classes more than a dozen years ago. Students are invited to demonstrate their understanding of a discrete subject through a creative medium. Students are encouraged to use visual arts, photography, music, and even food. The language from my assignment sheet reads, in part: "Projects should reflect a creative approach to Constitutional Law. They will be evaluated both for creativity and instructional value. The aim is to create an effective teaching tool that showcases your knowledge of a Constitutional Law concept." See infra Appendix F. I have been very excited to see their interpretations of key Constitutional Law concepts this way, and I use many as teaching props for classes. In keeping with my commitment to making assessment risks low for students, this is an optional assignment that offers students the chance to earn "bonus points" toward their final grades.
111 Though most students are excited about the chance to record music or paint for class, some students will occasionally gripe about even the suggestion of doing something beyond the IRAC and Socratic method they expect from a law school class. As a result, I have worked harder recently to make explicit the connection between active learning and success.


"Play not only offers alternative ways to learn and approach topics. From what we've seen, asking students to place concepts in a non-traditional format like a song, poem, or sculpture requires a grappling with the materials that results in a deeper understanding of it."112 I have made alternative assessments optional and allowed students to work together if they choose to do so. All work is done out of class. In class, I have tried to think expansively about formative assessments by incorporating oral arguments and games or matches between students.113

• Seek institutional support. With the addition of formative assessment to the American Bar Association standards, school administrators will be more attentive to the deliberate inclusion of such assessments in the curriculum. Make the most of institutional interest in formative assessment by calling for concrete institutional support. For instance, institutions can support low-risk formative assessment by offering research stipends for professors who are willing to spend a summer creating a bank of assessment tools that can be used by different instructors. In an age of austerity, low-cost support should also be explored. Recently, my school hosted a very productive brown bag session, led by our curriculum committee, where faculty members came together to discuss formative assessment. Institutions could also offer course release for professors who commit to expanding their offering of low-risk formative assessment tools.

The deliberate inclusion of low-risk formative assessment will absolutely require additional work and reorganization by law professors. It also requires the cultivation, from day one, of a classroom culture that allows students to "buy in" to the experience. But the variety of assessment tools, the boost to student performance, and the improvement of the student experience all support the inclusion of genuinely low-risk assessments in the law school classroom.

112 Bryan Adamson et al., Can the Professor Come Out and Play?—Scholarship, Teaching, and Theories of Play, 58 J. LEGAL EDUC. 481, 498 (2008).
113 There is a rise in the "gamification" of teaching assessments in the college and law school environment. Jacquelyn Bengfort, Games Grow Up: Colleges Recognize the Power of Gamification, ED TECH (Mar. 28, 2013), http://www.edtechmagazine.com/higher/article/2013/03/games-grow-colleges-recognize-power-gamification. Although there is debate about the merits of the term "game" to refer to formative assessment tools that teach substantive law or skills, the use of games, competitions, or matches with little or no grade exposure can be effective in class.


VII. CONCLUSION

Law school will always be difficult. And it should be challenging. But the high-stakes law school culture should not infect the assessment process. Law professors must incentivize good faith efforts on low-risk formative assessments that will help provide valuable information on learning outcomes. Professors should embrace the new standards as an opportunity to challenge themselves to create robust, valid, low-risk formative assessments. Such efforts can advance the feedback loop; more importantly, they can improve the student experience. Creative approaches to low-risk formative assessment can contribute positively to mental health and enhance student performance.

As the American Bar Association establishes more explicit guidelines concerning the use of formative assessment in law schools, professors must be careful not to import the grading emphasis of summative assessment into their curriculum. The high-stakes culture of law school can be alleviated—rather than exacerbated—by increased formative assessments that properly keep the emphasis on learning rather than grading. Although these low-risk formative assessments require some time and thought up front, most are easily adaptable and can be replicated in different semesters. Beyond time constraints, the other common challenge to implementing them in class is garnering student and institutional support for activities that do not comport with traditional law school expectations. Some students may also need to be weaned from the culture of high-stakes competition. Professors can make explicit to students the connection between teaching methods and learning. Professors can also engage colleagues and administrators by making these same connections clear. It is also essential that students be able to track their understanding of the material as more feedback opportunities become available. Despite the additional work required for the initial implementation of these ideas, the benefits are extensive.

The integration of low-stakes formative assessment in the typically competitive law school environment can clarify teaching objectives, help instructors make teaching adjustments, increase feedback opportunities, allow students to practice self-regulated learning, improve assessment reliability, and positively impact the student experience. And it gives students the chance to practice before the big concert.


APPENDIX A114

Constitutional Law

Equal Protection Pyramid

1. Complete this triangle:

2. Complete this chart:

             Means          Ends          Burden of Proof          Deference to Government

 SS

 IS

 RB

114 This is an ungraded, in-class exercise used in Constitutional Law I. It takes about five minutes for students to complete. We then review it together in class, which takes another ten minutes. Students can also complete the work in pairs.


APPENDIX B115

Constitutional Law

School Desegregation

Based on your understanding of the Little Rock Nine, please work with a team (three or four people) to create a PowerPoint slideshow highlighting the events. Also identify at least two Constitutional Law issues raised by the events. You are encouraged to use images, text from relevant case law or Constitutional provisions, and your imagination.

The top three slideshow presentations will be posted on my faculty webpage. I will also choose material from the top slideshow to include on the Final Exam. This assignment is due at the end of the class. Make sure your slideshow presentation includes at least one slide that features the names of all of your team members.

While you are not required to include text on all slides, a slideshow presentation that is made up exclusively of images will not be acceptable.

Minimum number of slides: six, including the slide featuring the team members' names.

115 This is an ungraded assignment for Constitutional Law. It is an example of an easy, in-class "game" or contest the students can complete in a group. It supports collaboration, highlights key concepts, and makes a positive use of technology. It takes about twenty-five minutes of class time.


APPENDIX C116

Writer's Response: Goals Sheet
Memo #1
Legal Research and Writing
Prof. Duhart

You received several comments on your first formal writing assignment. The grade is important, but I stress again that the goal is to make improvements based on identified strengths and weaknesses. Please complete the following, using your edited (graded) CREAC assignment as a guide.

1. Read the comments written in the margins and borders of your paper.
2. Please ask me for help if you do not understand anything.
3. Take a moment to record below the comments listed on your paper.

This assignment is worth five points in the miscellaneous category and must be COMPLETED to receive any credit. This sheet will go into your writing portfolio, which will be maintained in my office. This goals sheet will also be used to help facilitate your individual writing conference. The completed goals sheet will serve as your required "admission ticket" to your mandatory conference next week. No goals sheet means no conference. No exceptions.

According to the comments on Memo #1, what are your primary strengths?

What, based on the comments, are your writing weaknesses? (Include organization, content & grammar.)

116 This assignment is used in Legal Research & Writing after the first graded memo is returned to students. It helps students bridge the gap between the first memo and the second. It is also intended to boost metacognitive skills. It counts as a few points in the miscellaneous category (which includes homework, small quizzes, and other exercises).


What, specifically, will you need to improve as you prepare for Memo #2? Be specific!


APPENDIX D117

Constitutional Law II

Stolen Valor Opinion

Congratulations! You have just been appointed (and confirmed) to the United States Supreme Court. As one of your first official duties, you must draft an opinion in the United States v. Alvarez case. Given your understanding of Content-Based Restrictions, please draft an opinion that includes references to the relevant portions of the Constitution, the rule you would apply, and at least one case covered in class. (Do not refer back to the Alvarez case here.)

You must use Times New Roman, 12-point font, and the opinion should be double-spaced. A one-inch margin is required on all sides. This assignment cannot exceed two pages. You are not required to Bluebook. Please state your name at the start of the opinion. Good news: you may write a concurring or dissenting opinion. Indicate the position you are advancing. Base your opinion on sound legal analysis and precedent. Also include at least one public policy argument. Please underline your policy argument.

This assignment must be completed alone; no collaboration is permitted.

117 This is used in Constitutional Law II. It is completed outside of class and returned with feedback and a rubric. I also spend about twenty minutes reviewing common mistakes in a follow-up class about a week later. Because it is not graded, I can limit comments to the rubric. Teaching assistants can also be utilized to help manage the papers. I also post a "model" answer for students after the feedback session.


It is due in my faculty inbox, located behind my assistant's desk, no later than Tuesday, Oct. 14, 2014, at 5 p.m. Late work will be subject to sanctions. Failure to complete this assignment will result in a five-point reduction in your final exam grade for this class.


APPENDIX E118

Grading Rubric: Commerce Clause Opinion

Name:

Format Requirements
□ Did student adhere to two-page limit?
□ Did student use Times New Roman font?
□ Did student double-space throughout?
□ Did student leave a one-inch margin on all sides?

Issue
□ Did student properly frame the issue before the Court?
□ Was it clear whether the opinion was dissenting, concurring, or both?

Rule

Power
□ Was the relevant rule used?
□ Did student cite the relevant portion of the Constitution?
□ Did student cite the Lopez test?
□ Did student give a complete statement of the rule?
□ Was the rule amplified through the inclusion of subparts for the Substantial Effects test?
□ Did student distinguish between plenary power over Interstate Commerce and the rational basis standard for Substantial Effects?

Limit
□ Did the student refer to the prohibition under the Tenth Amendment against Congress reaching completely internal, non-economic activities?

Application
□ Did the student apply the facts in the Gonzales case to the Commerce Clause rule?
□ Did the student initially dispose of medical use of marijuana as NOT implicating channels or instrumentalities of interstate commerce?
□ Did the student walk through the Substantial Effects test?
□ Did the student address legislative findings?

118 This is an example of a rubric used to evaluate practice IRACs that are assessed for a few points, not a huge grade. Rubrics are key to making formative assessment more efficient for professors grading many papers. They can also be used for peer edits, self-edits, or by teaching assistants.


□ Did the student address jurisdictional element?
□ Did the student address substantial economic effect?
□ Did student indicate that sub-factors were not dispositive?
□ Was application well reasoned and sophisticated?
□ Did student avoid making conclusory statements?
□ Did the student raise counterarguments? ("On the other hand. . .")

Conclusion
□ Did student clearly indicate where he or she landed in the conclusion?

Writing Guidelines
□ Did student effectively employ any text or policy arguments?
□ Did student cite at least two Commerce Clause cases discussed in class?
□ Was writing grammatically correct?
□ Did writing flow smoothly?
□ Were transitions used as needed?
□ Did submission make sense without a "live interpretation"?


APPENDIX F119

Constitutional Law

Alternative Assessment Project Option

You may create an optional alternative assessment to reflect a discrete area of Constitutional Law. This assignment is entirely optional but may generate up to five points toward your score on the final examination. This grade enhancement will not necessarily advance you to the next available grade.

Projects should reflect a creative approach to Constitutional Law. They will be evaluated both for creativity and instructional value. The aim is to create an effective teaching tool that showcases your knowledge of a Constitutional Law concept.

If you elect to participate in this project, you must meet the following guidelines:

✓ All work must be submitted anonymously. Except for general questions, do not discuss the particulars or details of your project with me. You must preserve anonymity. Only after your final grade for Constitutional Law has been posted may you discuss your project with me. Use your exam number.

119 This is an ungraded, optional assignment. It has produced many creative projects, including hand-made jewelry, model houses, and elaborate trading cards featuring "Constitutional Law Characters." Students continue to surprise me with their ability to make the law appealing and accessible. The allure of bonus points is enough to get at least half of the class to participate in any given semester. I encourage students to choose a challenging topic in an effort to help them better unpack and understand more complex material.


✓ Each project must reflect a detailed portion of Constitutional Law; general images or projects related to Americana will not earn points.
✓ I reserve the right to grant partial or no credit to submitted projects.
✓ Group projects will be considered; all group members will share in point assessment.
✓ All projects must be submitted directly to my assistant. You are not permitted to give any work directly to me. You are not permitted to submit any work to student affairs.