Annual Conference on Teaching and Learning Assessment CONFERENCE PROGRAM

SEPTEMBER 13–15, 2017

PHILADELPHIA, PA
DREXEL.EDU/ACONF

MESSAGE FROM JOHN FRY
PRESIDENT, DREXEL UNIVERSITY

I hope you will join us at Drexel for Facilitating Conversations that Matter.

I commend our Provost, Brian Blake, and his team for spearheading this event. It's important that we share best practices across higher education. Colleges and universities face great challenges, and we must work together as colleagues to find solutions. Effective assessment will be critical to that process.

If you're from out of town, we look forward to hosting you in Philadelphia. I believe Greater Philadelphia is the hub for higher ed in the mid-Atlantic region, based on a high concentration of exceptional institutions and a long tradition of educational leadership. Philadelphia is also a great place to be inspired by our nation's history, and to enjoy yourself at our amazing cultural destinations and great restaurants.

I am pleased that Drexel's Conference on Teaching and Learning Assessment has become an annual national and international event, and I look forward to seeing you here.

MESSAGE FROM BRIAN BLAKE
PROVOST, DREXEL UNIVERSITY

The expectations placed on higher education to foster and document students' active and deep learning have never been higher. We live in a time of economic uncertainty, global interdependence, and urgent challenges. If our students are to be equipped with the skills to succeed in such a future, we must reject any claims of quality learning that do not include as their focus students' active learning and understanding and our ability to assess such claims.

At Drexel, our assessment activities are based on institutional values that aim to produce relevant and functional data for aligning curricular design, course content, and pedagogical approaches with Drexel's mission and values. In all assessment activities, the faculty and staff endeavor to take full consideration of the different educational and cultural backgrounds of our increasingly diverse student population. The primary objective of our assessment program is to establish a practice of action research that informs planning and results in tangible improvements for our students.

In attending Facilitating Conversations that Matter, you will enjoy three days of thought-provoking speakers, workshops, and invaluable networking on Drexel's beautiful campus, just minutes from the heart of historic Philadelphia and the birthplace of our nation. Come join us as we work together to ensure that all students have continuous opportunities to apply their learning to the significant, real-world challenges which, no doubt, lie ahead for them.


CONNECT WITH US

View the online version of the conference schedule at drexel.edu/aconf/program/schedule. Here you will find all of the conference materials and session descriptions you may need.

WiFi for the conference is sponsored by [sponsor logo].

username » aconf2017
password » drexel17

WiFi Instructions:
1. Choose the Drexel Guest network from the available wireless networks.
2. Open a browser and attempt to access a website; you should be directed to the Drexel Guest login page.
3. Click on "Sponsored User" instead of "Visitor."
4. Enter the username and password above.


CONFERENCE LOCATIONS

[Campus map showing the conference buildings, the shuttle stop, the parking garage, and an area closed for construction]

• LeBow Hall, 3220 Market Street
• Pearlstein Business Learning Center, 3218 Market Street
• Main Building, 3141 Chestnut Street
• Papadakis Integrated Sciences Building, 3245 Chestnut Street
• Creese Student Center (Behrakis Grand Hall), 3200 Chestnut Street

LEONARD PEARLSTEIN BUSINESS LEARNING CENTER
The Pearlstein Business Learning Center is a four-story, 40,000 square-foot facility containing numerous executive classrooms and technology, such as video blackboards and document cameras, for video conferencing with students, corporate executives and instructors at remote locations.

GERRI C. LEBOW HALL (LEBOW HALL)
The 12-story, 177,500 square-foot home for Drexel University's Bennett S. LeBow College of Business features an innovative array of classrooms and collaborative academic spaces as well as an environmentally friendly design underscored by a dramatic five-story central atrium.

CONSTANTINE N. PAPADAKIS INTEGRATED SCIENCES BUILDING (PISB)
The 150,000 square-foot building houses 44 research and teaching laboratories for biology, chemistry and biomedical engineering and a six-story atrium containing a 22-foot wide, 80-foot tall biowall, North America's largest living biofilter and the only such structure installed at a U.S. university.

JAMES CREESE STUDENT CENTER (BEHRAKIS GRAND HALL, NORTH & SOUTH)
Behrakis Grand Hall is the Creese Student Center's ballroom, located adjacent to the Main Lounge and left of the lobby of Mandell Theater. Behrakis Grand Hall is frequently utilized for banquets, lectures, meetings and conferences, as it can accommodate up to 1,200 people.


SCHEDULE AT-A-GLANCE

WEDNESDAY, SEPTEMBER 13

9:00 – 12:00 PRE-CONFERENCE WORKSHOPS

An Administrator's Guide to Fostering a Faculty-Led Assessment Process (PEARL 302)
Jacob Amidon & Debora Ortloff, Finger Lakes Community College

Creating & Assessing Campus Climates that Encourage Civic Learning & Engagement (PEARL 303)
Robert D. Reason, Iowa State University

Ready, Set, Go: The New Middle States Standards and Your Assessment Practice (PEARL 307)
Jodi Levine Laufgraben, Temple University

Assessment Toolbox: Supercharge the Direct Assessment of Student Services (PEARL 101)
Michael C. Sachs, John Jay College

Leading Change: Tackling Institution, Program, and Individual Challenges that Derail Assessment Initiatives (PEARL 102)
Catherine Datte, Gannon University; Ruth Newberry, Blackboard Inc.

1:00 – 2:00 WELCOME & OPENING PLENARY (MANDELL THEATER)
Welcome: M. Brian Blake, Provost, Drexel University
Opening Message: Creating a College Culture Where Assessment is a Pathway to Student Success
Sylvia Jenkins, President, Moraine Valley Community College

2:00 – 2:15 BREAK

2:15 – 3:15 CONCURRENT SESSION 1

Building Faculty Support for a Quantitative Reasoning Requirement: Holistic Assessment of Curriculum and Learning (PISB 104)
J Bret Bennington, Frank Gaughan, Terri Shapiro and S. Stavros Valenti, Hofstra University

From First to Final Draft: Developing a Faculty-Centered Ethical Reasoning Rubric (PISB 106)
Genevieve Amaral, Dana Dawson and John Dern, Temple University

Students Leading the Way: Student-Driven Assessment (PISB 108)
Timothy Burrows, Virginia Military Institute

Closing the Loop on Data Collection and Program Improvement (PEARL 101)
Chadia Abras and Janet Simon Schreck, Johns Hopkins University

Criterion Met. Now Time to Reflect (PEARL 102)
Kathryn Strang, Rowan College at Burlington County

Implementing Assessment in Student Conduct: Understanding a Balancing Act of Challenge, Support, Accountability, and Growth (LBOW 109)
Jeff Kegolis, The University of Scranton

Faculty as Networked Improvement Community: Alignment of EdD Program Learning Objectives, Standards, and Measurable Outcomes (LBOW 209)
Joy Phillips, Kathy Geller and Ken Mawritz, Drexel University

Building a Culture of Assessment and Embracing Technology: A Communication Studies Program Success (LBOW 108)
Patricia Sokolski, Jaimie Riccio and Poppy Slocum, LaGuardia Community College

3:15 – 3:30 BREAK

3:30 – 4:30 CONCURRENT SESSION 2

Self-Esteem is Doomed: A Paradigm Shift to Self-Compassion Allows Everyone to Thrive in Higher Education (PISB 104)
Laura Vearrier, Drexel University

Snapshot Sessions: 5-Minute Mini-Sessions (PISB 106)
• Does Class Size Matter in the University Setting? (Ethan Ake and Dana Dawson, Temple University)
• I See What You Mean: Using Infographics and Data Visualizations to Communicate Your Assessment Story (Tracey Amey, Pennsylvania College of Technology)
• The Impact of the 3R2V Strategy on Assessment Questions in the Science Classroom (Deshanna Brown, Barry University and Broward County Public Schools)
• Assessing and Addressing the Digital Literacy Skills of First-Generation College Students (Nicole Buzzetto-Hollywood and Magdi Elobeid, University of Maryland Eastern Shore)
• Utilization of External Reviewers for Student Learning Assessment (Anthony DelConte, Saint Joseph's University)
• Core Curriculum Outcomes: Reflections, Reactions, Results, and Other Assessment Tales (Seth Matthew Fishman, Villanova University)
• Developing an Exceptional Academic Advising Program Using Student Satisfaction Survey Data (Debra Frank, Drexel University)


• Faculty Centered Assessment: Getting the Right People to the Right Place at the Right Time (Brooke Kruemmling, Salus University)
• Showing Educators How to Teach Traumatized Students (Jonathan Wisneski and Anne Hensel, Upper Darby School District and Drexel University)

Assessing Critical Reflection: Learning in Faculty-led Short Term Study Abroad Programs: Students in Developed Countries (PISB 108)
Akosa Wambalaba, United States International University

Both a Science and an Art: Designing, Developing, and Implementing Academic Program Evaluations that Work (PEARL 101)
Erica Barone Pricci and Alicia Burns, Lackawanna College

Lost with ILO Assessment? No Worries, We Bet You Are Heading in the Right Direction (PEARL 102)
Jacqueline Snyder, SUNY Fulton-Montgomery Community College; Mary Ann Carroll, SUNY Herkimer County Community College

Our QuEST for Improving Learning: Year Two Analysis of Wellness Course Revisions (LBOW 109)
Mindy Smith and Susan Donat, Messiah College

Assessors of the Galaxy: Using Technology Integration to Shift a Culture (LBOW 209)
Ryan Clancy, Mark Green and Nina Multak, Drexel University

30-Minute Split Sessions (LBOW 108)
• Assessing Student Learning in Student Affairs: There's Just Not Enough Time! (Debbie Kell, Deborah E. H. Kell, LLC)
• Assessment Tools for Experiential Learning and Other Highly Impactful Practices (Melissa Krieger, Bergen Community College)

4:45 – 5:30 ICE CREAM SOCIAL (PISB ATRIUM)

6:00 – 10:00 PHILLIES GAME (CITIZENS BANK PARK)

THURSDAY, SEPTEMBER 14

7:30 – 8:30 CONTINENTAL BREAKFAST

8:45 – 9:45 MORNING PLENARY (MANDELL THEATER)
Reclaiming Assessment: Unpacking the Dialogues of Our Work
Natasha Jankowski, National Institute for Learning Outcomes Assessment (NILOA)

10:00 – 11:00 CONCURRENT SESSION 3

Background, Methods, and Results of a 7-Year Longitudinal Assessment of Undergraduate Business Writing (PISB 104)
Scott Warnock, Drexel University

The Wizards of Assessment: Peel Back the Curtain and Experience the Art and Science of the Assessor (PISB 106)
Mark Green and Ray Lum, Drexel University

Learner-Focused Assessment for the Creative Mind: Cultivating Growth for All Learners (PISB 108)
Amanda Newman-Godfrey and Lynn Palewicz, Moore College of Art and Design

Working Hand-in-Hand: Programmatic Assessments and Institutional Outcomes (PEARL 101)
Frederick Burrack and Chris Urban, Kansas State University

Assessing Engagement in Active Learning Classrooms (PEARL 102)
Dawn Sinnot, Susan Hauck and Courtney Raeford, Community College of Philadelphia

Collecting Meaningful Assessment Data: An Accreditation Strategy (LBOW 109)
Jane Marie Souza, University of Rochester

Peer-to-Peer Blueprints: Leveraging Hierarchical Learning Outcomes and Peer Consultants to Foster Faculty Discussions of Assessment (LBOW 209)
Michael Wick and Anne Marie Brady, St. Mary's College of Maryland

We Need More: Novel Metrics for Classroom Assessment and Proposed Standards in Nonformal Learning (LBOW 108)
Caitlin Augustin, John Harnisher and Kristen Murner, Kaplan Test Prep

11:00 – 11:15 BREAK

11:15 – 12:15 CONCURRENT SESSION 4

All About that 'Base: Database Design as Part of Your Assessment Toolkit (PISB 104)
Krishna Dunston, Delaware County Community College

Task-Based Assessment: A Step-by-Step Guideline (PISB 106)
Ramy Shabara, The American University in Cairo, Egypt

Cracking the Code of Creative "Capital": Assessing Student Creativity in Science, Engineering and Technology Courses (PISB 108)
Jen Katz-Buonincontro, Drexel University

Comparative Program Assessment to Increase Student Access, Retention, and Completion (PEARL 101)
Catherine Carsley and Lianne Hartmann, Montgomery County Community College

Acting on Data: Lessons about the Use of Student Engagement Results to Improve Student Learning (PEARL 102)
Jillian Kinzie, Indiana University

Critical Thinking: It's Not What You Think! (LBOW 109)
Janet Thiel, Georgian Court University


Text Analysis as Assessment for Ethical and Diagnostic Purposes (LBOW 209)
Fredrik deBoer, Brooklyn College

Assessment as Research: Using Compelling Questions to Inspire Thoughtful Assessment Practices (LBOW 108)
Javarro Russell, Educational Testing Service (ETS)

12:30 – 1:45 LUNCHEON PLENARY: An Accreditation Roundtable Discussion
Elizabeth Sibolski, President, Middle States Commission on Higher Education (MSCHE)
Belle Wheelan, President, Southern Association of Colleges and Schools Commission on Colleges (SACSCOC)
Patricia O'Brien, Senior Vice President, Commission on Institutions of Higher Education of the New England Association of Schools and Colleges (NEASC)

2:00 – 3:00 CONCURRENT SESSION 5

Knowing More About Our Students in Foundational Math and Writing Reform: Building Multi-Faceted Assessment on the Front End (PISB 104)
Fiona Glade, University of Baltimore

Snapshot Sessions (PISB 106)

• Assessing Our Assessment: A Process for Reviewing Annual Assessment Reports (Gina Calzaferri, Temple University)
• Turning 120 Annual Reports Into a Searchable Online Planning/Reporting Database Linked to Strategic Plans (Wenjun Chi, Saint Joseph's University)
• Rubrics: Facets That Matter (Diane DePew, Drexel University)
• The Impact of Co-Curricular Activities as an Assessment Tool on the University Students (Muhammad Farooq and Gehan El Enain, Abu Dhabi University)
• Learning from the Assessment Process: HBCU Faculty Perspectives on Classroom and Program Review (Pamela Felder and Michael Reed, University of Maryland Eastern Shore)
• Enhancing Online Course Design by Developing Faculty-Administration Collaborations and Using Quality Matters Rubrics (Moe Folk and Doug Scott, Kutztown University)
• Collaboratively Assessing Collaboration: Self, Peer and Program-Level Assessment of Collaborative Skills (Janet McNellis)
• Developing an On-Line Simulation Activity: Assessing the Need and Implementing Action! (Margaret Rateau)
• Assessment in Science Using the I-LEARN Model (Hamideh Talafian, Drexel University)

Implementing a Student Assessment Scholar Program: Students Engaging in Continuous Improvement (PISB 108)
Nicholas Truncale, Elizabeth Chalk, Jesse Kemmerling and Caitlin Pelligrino, University of Scranton

Organizing Program Assessment as Collaborative Problem Solving (PEARL 101)
Barbara Masi, Penn State University

Educational Development and Assessment: Simultaneously Promoting Conversations that Matter (PEARL 102)
Phyllis Blumberg, University of the Sciences

Listening for Learning: Using Focus Groups to Assess Students' Knowledge (LBOW 109)
Corinne Dalelio and Christina Anderson, Coastal Carolina University; Gina Baker, Liberty University

Rebooting Work-Based Assessment for the 21st Century: Shifting to Digital Technologies for Student Nurses (LBOW 209)
Sian Shaw and Anne Devlin, Anglia Ruskin University (UK)

Drexel Outcomes Transcript & Competency Portfolio: Empowering Students & Faculty with Evidence of Learning Using Effective Assessment (LBOW 108)
Mustafa Sualp, AEFIS; Stephen DiPietro and Donald McEachron, Drexel University

3:00 – 3:15 BREAK

3:15 – 4:15 CONCURRENT SESSION 6

Assessing Our Assessment: Findings and Lessons Learned Three Years Later (PISB 104)
Victoria Ferrara, Mercy College

Everything I Ever Wanted to Know About Assessment I Learned from Reality Cooking Shows (LBOW 108)
Krishna Dunston, Delaware County Community College

Grit in the Classroom (PISB 108)
Rebecca Friedman, Johns Hopkins University

Decoupling and Recoupling: The Important Distinctions between Program Assessment and Course Assessment (PEARL 101)
Nazia Naeem, Lesley Emtage, Debbie Rowe and Xiaodan Zhang, York College

Encouraging Meaningful Assessment by Celebrating It! (PEARL 102)
Letitia Basford, Hamline University

Application of the Collaborative Active Learning Model (CALM) Simulation: An Experiential Service Learning Approach (LBOW 109)
Francis Wambalaba and Peter Kiriri, United States International University

Process-Based Assessment and Concept Exploration for Personalized Feedback and Course Analytics in Freshman Calculus (LBOW 209)
Mansoor Siddiqui, Project One; Kristen Betts, Drexel University

5:30 – 7:30 RECEPTION: THE PYRAMID CLUB & AQUA STRING BAND (THE PYRAMID CLUB)


FRIDAY, SEPTEMBER 15

7:30 – 8:30 CONTINENTAL BREAKFAST

8:45 – 9:45 CONCURRENT SESSION 7

It's Prime Time to Shift IE to EE Planning and Assessment – but How? (PISB 104)
Mary Ann Carroll, SUNY Herkimer County Community College; Jacqueline Snyder, SUNY Fulton-Montgomery Community College

Curriculum Maps: Who Starts a Trip Without a Map? (PISB 106)
Alaina Walton and Anita Rudman, Rowan College at Burlington County

Using Course Evaluations to Better Understand What Your Academic Program is Messaging to Your Students (PISB 108)
Beverly Schneller and Larry Wacholtz, Belmont University

Snapshots: Mission Impossible and Other Assessment Tales (PEARL 101)
Joanna Campbell, Maureen Ellis-Davis, Gail Fernandez, Ilene Kleinman, Melissa Krieger, Amarjit Kaur and Jill Rivera, Bergen Community College

Developing Sustainable General Education Assessment: The Example of Oral Communication Assessment at St. Lawrence (PEARL 102)
Valerie Lehr, Christine Zimmerman and Kirk Fuoss, St. Lawrence University

Beyond the Classroom: A Collaborative Pilot to Unify Learning Assessment Across Six Academic Support Units (LBOW 109)
Jocelyn Manigo and Janet Long

Many Questions, Multiple Methods: Assessing Technological Literacy and Course Design in a Modular Team-Taught Course (LBOW 209)
Dana Dawson, Temple University

Best Practices in Assessment: A Story of Online Course Design and Evaluation (LBOW 108)
Gulbin Ozcan-Deniz, Philadelphia University

9:45 – 10:00 BREAK

10:00 – 11:00 CONCURRENT SESSION 8

Expanding and Developing Assessment Practices to Include Administrative, Educational, and Student Support (AES) Units (PISB 104)
Christopher Shults, Erika Carlson and Marjorie Dorime-Williams, Borough of Manhattan Community College

Creating a General Education Capstone: Assessing Institutional Outcomes through General Education (PISB 106)
Jenai Grigg and Gina MacKenzie, Holy Family University

Assessing Information Literacy for Community College Students: Faculty and Librarian Collaboration Leads to Student Improvement (PISB 108)
Janis Wilson Seeley and Graceann Platukus, Luzerne County Community College

Community-Building through Assessment Design: Reframing Disciplinary Student Outcomes as Inquiry-Based (PEARL 101)
Brad Knight, American University

Data-Driven Conversations to Make a Difference in Campus-Wide General Education (PEARL 102)
Mindi Miller, Molly Hupcey Marnella and Bob Heckrote, Bloomsburg University of Pennsylvania

Should You Take Student Surveys Seriously? (LBOW 109)
Zvi Goldman, Jeremi Bauer, Susan Lapine and Chris Szpryngel, Post University

Skipping Stones or Making Splashes: Embedding Effective Assessment Practice into Faculty Repertoire (LBOW 209)
Dana Scott, Philadelphia University

11:15 – 12:00 CLOSING REMARKS (PISB 120)


BUILDING FLOOR PLANS

CONSTANTINE PAPADAKIS INTEGRATED SCIENCES BUILDING (PISB)
[1st floor plan: rooms 104, 106, 108 and 120; atrium; registration desk; two entrances]

Please pardon our campus construction this year. Thank you.


LEONARD PEARLSTEIN BUSINESS LEARNING CENTER
[1st floor plan: rooms 101 and 102; entrance from Market Street; connection to LeBow Hall; dragon statue and walkway to PISB at 33rd Street. 3rd floor plan (floor plan only; orientation does not reflect actual street location): rooms 301, 302, 303, 307 and 308]


GERRI C. LEBOW HALL
[1st floor plan: room 109; entrance from Market Street. 2nd floor plan: room 209]



JAMES CREESE STUDENT CENTER
[Floor plan: Behrakis Grand Hall; Mandell Theater; Handschumacher Dining Center (glass enclosure stairway); patios; Joe Coffee; Shake Shack; entrances from Chestnut Street, to the James Creese Student Center, and to Mandell Theater]


CONFERENCE SCHEDULE » WEDNESDAY

WORKSHOP #1: PEARLSTEIN 302
An Administrator's Guide to Fostering a Faculty-Led Assessment Process
Jacob Amidon & Debora Ortloff, Finger Lakes Community College

The conundrum for those of us tasked with overseeing an assessment process at a college is that, in order for the process to be effective, sustainable and meaningful, it must be faculty led; but faculty will not, on their own, embrace the assessment process. In this workshop we will explore several techniques and tools that can be used to foster a faculty-led assessment environment. These include how to reframe the act of assessment, building the capacity of the faculty to engage in assessment, creating efficient processes around assessment and managing up to resource and protect the faculty-led process. Participants will work through several hands-on exercises around these core concepts so they can begin to create their own guide to apply within their own campus context.
At the conclusion of this workshop participants will be able to:
• Develop ideas for framing assessment on their campus.
• Create an initial targeted professional development plan to support faculty leadership in assessment.
• Map out efficiency improvement ideas to support high quality assessment.

WORKSHOP #2: PEARLSTEIN 303
Creating & Assessing Campus Climates that Encourage Civic Learning & Engagement
Robert D. Reason, Iowa State University

After a brief discussion about the connections between campus climates and students' civic learning and engagement, this session will focus on specific ways institutional leaders can create and assess those campus climates that encourage civic learning and engagement. Although the emphasis of the workshop will be on participants' campus contexts, we will use data from the Personal and Social Responsibility Inventory (PSRI), an ongoing climate assessment project at over 40 institutions, to examine what we know about these relationships broadly.
At the conclusion of this workshop participants will be able to:
• Articulate an understanding of how climate shapes learning on college campuses.
• Draw connections between current (and future) campus programs and climates that encourage civic learning and engagement.
• Develop a plan that incorporates campus climate, institutional policies and programs, and student engagement activities to comprehensively assess the development of civic learning outcomes.

WORKSHOP #3: PEARLSTEIN 307
Ready, Set, Go: The New Middle States Standards and Your Assessment Practice
Jodi Levine Laufgraben, Temple University

Implementation of the new Middle States standards provides an ideal opportunity to reengage your campus in conversations about assessment. How do your current practices align with the new standards? Where might you improve? In this workshop we will discuss strategies for using the new standards to renew faculty commitment to the assessment of student learning and reenergize the campus commitment to assessing institutional effectiveness.
At the conclusion of this workshop participants will be able to:
• Outline how their campus's strengths and weaknesses align with the new standards.
• Plan one or more ways to use the new standards to renew campus commitment to assessment.

WORKSHOP #4: PEARLSTEIN 101
Assessment Toolbox: Supercharge the Direct Assessment of Student Services
Michael C. Sachs, John Jay College

The Middle States Commission on Higher Education's publication Student Learning Assessment: Options and Resources, Second Edition, states that "the characteristics of good evidence of student learning include considerations of direct and indirect methods for gathering evidence of student learning." Creating direct student learning assessment tools within student support services can be challenging for student service professionals. Often many student service programs rely solely on indirect assessment techniques such as focus groups, evaluations, satisfaction surveys, NSSE results, etc.
This workshop will explore the direct student learning assessment tools available to Offices of Student Affairs and other services offices on campus. These techniques and tools are both qualitative and quantitative in intention and design. This workshop will also enable participants to develop program goals, rubrics, and direct student learning outcomes for their student service areas, linked, of course, to their college's mission and/or strategic plan. Participants should bring copies of their institutional strategic goals and mission.
At the conclusion of this workshop participants will be able to:
• Explain the importance of direct assessment for planning, resource allocation and student learning.
• Recognize and understand the differences between direct and indirect assessment in student services.
• Create direct assessment of Student Learning Outcomes for their individual areas/programs that can be incorporated into assessment plans.


1:00 – 2:00 P.M.

WELCOME & OPENING PLENARY (MANDELL THEATER)
sponsored by [sponsor logo]

Greetings and welcoming remarks will be issued by Dr. Brian Blake, Provost and Executive Vice President for Academic Affairs, Drexel University.

Creating a College Culture Where Assessment is a Pathway to Student Success
Sylvia Jenkins

Dr. Sylvia Jenkins was appointed the fifth president of Moraine Valley Community College on July 1, 2012. Moraine Valley Community College is the second largest community college in Illinois. Dr. Jenkins previously served as vice president, Academic Affairs; dean, Academic Development and Learning Resources; assistant dean, Center for Teaching & Learning/Director of the Library; and public services librarian at Moraine Valley Community College in Palos Hills, Illinois, and as director, Library Services and public services librarian at Virginia Union University, Richmond, Virginia. Her educational background includes a Ph.D. in Education and Human Resource Studies with a specialization in Community College Leadership from Colorado State University, an MLS/Master of Library Science from the State University of New York at Albany, and a B.S. in English Education from Grambling State University. She serves on several boards, including the League for Innovation in the Community College, Community Colleges for International Development (CCID), Moraine Area Career Systems CEO Council, Chicago Regional College Program, and the South Metropolitan Higher Education Consortium Presidents' Council. In 2016, she was elected to the American Association of Community Colleges' (AACC) Board of Directors and also serves on the AACC's Commission on Global Education. In 2016, she was elected to serve a two-year term as an Associate Member Regional Representative of the Hispanic Association of Colleges and Universities (HACU). She is a member of the Illinois Green Economy Network (IGEN) President's Steering Committee. She serves on the Cook County Workforce Investment Board and the Forest Preserve District of Cook County's Conservation and Policy Council. She previously served on the Forest Preserve District of Cook County's Next Century Conservation Plan as well as a state-wide Northeastern Illinois Public Transit Task Force.

WORKSHOP #5: PEARLSTEIN 102
Leading Change: Tackling Institution, Program, and Individual Challenges that Derail Assessment Initiatives
Catherine Datte, Gannon University
Ruth Newberry, Blackboard Inc.

In keeping with the theme Facilitating Conversations that Matter, this interactive workshop engages participants in conversations focused on successful change initiatives related to assessment. Participants will learn to implement the Kotter change model, prioritize initiatives, solicit support, and develop an implementation plan to move a change initiative toward success. Success involves a thoughtful, realistic project plan, driven by a coalition and supported by a "volunteer army" that can serve as spokespersons, role models, and leaders to move the effort forward. Participants will also learn from one another successful strategies to overcome barriers and resistance that limit forward movement. Attendees will document their SWOCh, gaps, and vision with the assistance of the co-presenters Catherine Datte and Ruth Newberry using the Change Leadership Workbook. In a combined approach of information gathering and self-appraisal, attendees will begin to develop their unique implementation plans and receive guidance regarding specific nuances and challenges related to their institution. Throughout the workshop, Catherine and Ruth will award books related to the specific challenges that are often associated with assessment planning, change leadership, and team building.
At the conclusion of this workshop participants will be able to:
• Identify and prioritize critical actions associated with best practices in program or institution assessment along with documenting practical action steps.
• Learn strategies from peers and share challenges and successes.
• Create individualized action steps that drive their assessment process.

2:00 – 2:15 P.M.
BREAK
Refreshments Available

2:15 – 3:15 P.M.
CONCURRENT SESSION 1

2:15 – 3:15 PISB 104
Building Faculty Support for a Quantitative Reasoning Requirement: Holistic Assessment of Curriculum and Learning
J Bret Bennington, S. Stavros Valenti, Frank Gaughan and Terri Shapiro, Hofstra University

We will present a holistic model of outcomes assessment that addresses the "fit" between learning goals and learning opportunities in the curriculum while also collecting data on student learning. To illustrate our model, we will present data and analyses from a recent assessment of Quantitative Reasoning. If done well, analyses of goal-curriculum fit can be powerful motivators for faculty and administration to cooperate on curricular innovation. This approach led to a broadly supported improvement in the general education curriculum at Hofstra, a quantitative reasoning requirement, that was adopted less than two years after first being proposed. This session will provide attendees with a blueprint for holistic assessment, combining curriculum analysis with student learning assessment, as well as a sustainable method for collecting data using online survey tools that could be scaled up to large numbers of participants with little added effort.
LEARNING OUTCOMES:
1. Participants will learn how to collect and analyze data on learning opportunities and engagement within the curriculum (i.e., goal-curriculum fit).
2. Participants will learn a sustainable, scalable method for measuring student learning outcomes.
Audience: Intermediate


2:15 – 3:15 PISB 106
From First to Final Draft: Developing a Faculty-Centered Ethical Reasoning Rubric
Genevieve Amaral, John Dern and Dana Dawson, Temple University

In this session, we will address how faculty and administrators implemented a faculty-centered rubric development process. Over the course of one academic year, the team developed, refined and deployed a rubric for the assessment of ethical reasoning in a core text program at a large, urban, state-related institution. What began as an institutionally mandated process ultimately shed light on unstated, but inherent, program goals, and created opportunities to raise awareness of how ethical reasoning informs text selection and learning activities. Presenters will review the program's history, challenges and lessons learned during rubric development, and plans for implementation. Participants will gain insight into the creation of organic assessment tools that contribute meaningfully to day-to-day teaching and curriculum development, and the process of building rubrics to address skills such as ethical reasoning, which can be ambiguous and value-laden.
LEARNING OUTCOMES:
1. Participants will better understand the stages of crafting an assessment rubric, and strategies for involving faculty in all aspects of the process.
2. Participants will better understand how to validate a rubric and employ it to carry out a direct assessment of student learning.
Audience: Beginner

2:15 – 3:15 PISB 108
Students Leading the Way: Student-Driven Assessment
Timothy Burrows, Virginia Military Institute

This session details the development of a student-driven assessment of leadership outcomes that support the Virginia Military Institute's mission of developing confidence "in the functions and attitudes of leadership." Often students are not familiar with the role of assessment in higher education and lack a general understanding of processes in place to help an institution improve. Including students helped to create a high level of buy-in and a sense of ownership (Kuh, Ikenberry, Jankowski, Cain, Ewell, Hutchings, & Kinzie, 2015). This session is relevant because it provides a positive example of participant evaluation and assessment (Fitzpatrick, Sanders, & Worthen, 2011) in a holistic and natural setting. This process highlights how the relationship between several academic-support units and students can foster stakeholder buy-in and ownership.
LEARNING OUTCOMES:
1. Participants will be able to develop new possibilities for student-driven assessment practices at their home institution.
2. Participants will be able to debate the benefits, pitfalls, and challenges facing the implementation and use of student-driven assessments.
Audience: Intermediate

2:15 – 3:15 PEARLSTEIN 101
Closing the Loop on Data Collection and Program Improvement
Chadia Abras and Janet Simon Schreck, Johns Hopkins University

This session aims to present how to collect effective data from course assessments using descriptive rubrics. The session will also present how data collected can be analyzed and utilized to close the loop on course and program improvements. Creative and effective ways to derive meaningful inferences from assessment data sets will be explored. In light of an assessment-driven culture at most institutions of higher education and compliance with accrediting agencies, data-driven decision making is key to maintaining successful and effective programs. Data analysis is key to assess effectiveness of student learning and curricular relevance. Closing the loop on data collection is key in making smart decisions in program design, improvement, and delivery.
LEARNING OUTCOMES:
1. Participants will be guided in using effective strategies on creating descriptive assessment rubrics.
2. Participants will be exposed to strategies on how to analyze data for course, program, and unit level improvements. They will understand how to triangulate multiple measures in order to drive decisions for curriculum effectiveness.
Audience: Intermediate

2:15 – 3:15 PEARLSTEIN 102
Criterion Met. Now Time to Reflect
Kathryn Strang, Rowan College at Burlington County

Rowan College at Burlington County's assessment process serves as a systematic mechanism to measure the strengths and weaknesses of the college's academic offerings on a continuous basis. Through the implementation of self-reflection summaries, the Assessment Chairs use this tool to highlight what they have learned by conducting the assessments, whether the criterion was met or not. Often these summaries involve very detailed and specific adjustments to the curriculum and instructional delivery. Kathryn will lead a PowerPoint presentation followed by a learning activity and a Q&A session designed for professionals with experience in assessment, teaching and learning who seek to develop strategies for a continuous improvement process of assessments. In this session she will outline RCBC's academic assessment process and the tools and strategies used to establish a strong continuous improvement cycle. Kathryn will take participants through the process of generating assessment results, interpreting these results, and analyzing their implications through the use of a reflection summary instrument. At the end of the session, participants will be able to understand how outcomes can be used to create an environment of continuous improvement.
LEARNING OUTCOMES:
1. Participants will be able to employ a culture of continuous improvement by learning how to implement change based upon assessment outcomes from various well-defined performance indicators.
2. Participants will be able to employ a culture of continuous improvement by learning how to design a reflective summary tool to use at their college.
Audience: Intermediate

2:15 – 3:15 GERRI C. LEBOW HALL, 109
Implementing Assessment in Student Conduct: Understanding a Balancing Act of Challenge, Support, Accountability, and Growth
Jeff Kegolis, The University of Scranton

When considering assessment within a Division of Student Affairs, historically Student Conduct is a particular functional area that creates challenges for administrators and educators. Although learning may take place over the course of time in one's student experience with the conduct process, it may be difficult to understand how a student believes they are growing through their circumstances and/or the competencies they have improved upon through reflection and processing of their situation. Ultimately, this session focuses upon how assessment of the conduct process was implemented, specifically related to conduct meetings and the results students identified related to their experience.


Depending on the size of one's institution or one's Student Affairs division, attendees who attend this session will be able to engage in dialogue related to best implementing assessment to understand competency measurement and the importance of connecting assessment to one's office mission statement, division's priorities, and university's strategic plan. Additionally, Student Conduct is a programmatic area that may be difficult to assess due to the nature of the process and a student's lack of willingness to be held accountable. Therefore, through this session, attendees will engage in conversation around their individual department's implementation of assessment and complete a SWOT analysis of how their assessment is implemented.
LEARNING OUTCOMES:
1. Participants will discuss best practices related to assessment within one's functional area, and future direction of their assessment based on lessons learned from previous assessment utilized.
2. Participants will acquire skills to help develop assessment of specific competencies in relation to their programmatic areas.
Audience: Intermediate

2:15 – 3:15 GERRI C. LEBOW HALL, 209
Faculty as Networked Improvement Community: Alignment of EdD Program Learning Objectives, Standards, and Measurable Outcomes
Joy Phillips, Kathy Geller and Ken Mawritz, Drexel University

This session describes how Drexel University School of Education faculty have aligned EdD program principles with Carnegie Project on the Education Doctorate (CPED) design principles, Council for the Accreditation of Educator Preparation (CAEP) Advanced Program Standards, Drexel Student Learning Priorities (DSLPs), and the Drexel School of Education Program Themes. It provides an example of a participatory, bottom-up process to align outcomes and assessment activities, including data and evidence, with program-level learning priorities. Faculty in the EdD program aligned Program Learning Outcomes with national and institutional standards. Participants can use this process as a model for developing a cycle of continuous program improvement. Faculty will share a multi-step process of identifying program learning outcomes (PLOs) beginning with individual course learning outcomes. This interactive session provides participants with templates as examples and as working documents to enable participants to engage in such assessment work at their own institutions.
LEARNING OUTCOMES:
1. Reflections from discussion of faculty working as a networked improvement community to align EdD program learning objectives with national standards and measurable student learning outcomes.
2. Examples in the form of templates for conducting program-level alignment of program learning objectives, standards, and student learning outcomes.
Audience: Intermediate

2:15 – 3:15 GERRI C. LEBOW HALL, 108
Building a Culture of Assessment and Embracing Technology: A Communication Studies Program Success
Patricia Sokolski, Jaimie Riccio and Poppy Slocum, LaGuardia Community College

This session will tell the successful story of a community college communication studies program faced with the challenge of implementing and assessing new general education competencies. Revising objectives and learning outcomes, creating new assignments, and implementing assessment mechanisms while wrestling with technical limitations resulted in a stronger and better articulated program. Building a culture of assessment is the best answer to the current skepticism over the value of a college education. The systematic inquiry into teaching and student learning provides a way for higher education institutions to demonstrate accountability. Our presentation will show the applicability of our college's assessment model. We will explain how we gained faculty participation, how we implemented a loop system of assessment, how we overcame technological limitations, and what we learned in the process. This should help attendees who have to revise curriculum and develop relevant methods of assessment, especially for a digital ability.
LEARNING OUTCOMES:
1. Participants will formulate a proposal for programmatic assessment.
2. Participants will design exercises that assess digital ability effectively.
Audience: Beginner

3:15 – 3:30 P.M.
BREAK
Refreshments Available

3:30 – 4:30 P.M.
CONCURRENT SESSION 2

3:30 – 4:30 PISB 104
Self-Esteem is Doomed: A Paradigm Shift to Self-Compassion Allows Everyone to Thrive in Higher Education
Laura Vearrier, Drexel University

The goal of this session is to teach educators about the elements of self-compassion (self-kindness, shared humanity, and mindfulness) and how this construct is more productive than self-esteem. Self-esteem involves the need to feel above average and special in comparison to others and will inevitably wane in higher education settings. In a society where being average is unacceptable but the norm, most assessments will be perceived as failures and be unpleasant for the educator and the learner. Self-compassion involves transitioning from self-judgement to self-kindness, isolation to common humanity, and disconnection to mindfulness. This construct allows for a more positive experience. Self-compassion is for the attendee to learn about for personal well-being. They can then guide assessments with the principles of self-compassion for a more fulfilling, productive process for themselves as well as the learner.
LEARNING OUTCOMES:
1. Participants will be able to understand the components of self-compassion and how it differs from self-esteem.
2. Participants will be able to apply self-compassion for oneself and then use the construct to guide productive self-reflection in learners.
Audience: Advanced


3:30 – 4:30 PISB 106
Snapshot Sessions (A Collection of Mini-Sessions)

SS1: Does Class Size Matter in the University Setting?
Ethan Ake and Dana Dawson, Temple University

This snapshot session presents the findings of a multilevel model examining the relationship between class size and Temple University General Education course grades for 172,928 grades nested in 7,704 sections across ten semesters in a five-year period (Fall 2011-Spring 2016). The study is currently under review for journal publication. The findings, which indicate that class size is NOT a statistically significant variable in predicting General Education course grades, have implications in terms of teaching and learning, classroom dynamics and policy changes. Class size is a particularly important issue today given budgetary pressures IHEs face (e.g. declining state aid, RCM). As per the conference theme, Facilitating Conversations that Matter, this session aims to stimulate conversation among stakeholders about the role of class size. The findings also encourage stakeholders to weigh the complex relationship between class size and student achievement and consider class size in relation to pedagogy and assessment.
LEARNING OUTCOMES:
1. Participants will be able to understand the relationship between class size and student achievement (1) for all students in all course types, (2) in the ten General Education program domains and (3) for the five student racial groups.
2. Participants will be able to critically consider how a statistically non-significant result is influenced by pedagogy, assessment and student-instructor interaction.
Audience: Intermediate

SS2: I See What You Mean: Using Infographics and Data Visualizations to Communicate Your Assessment Story
Tracey Amey, Pennsylvania College of Technology

Effective data visualizations and infographics are highly useful tools for conveying messages and complex information. This session will demonstrate easy-to-use and readily accessible tools, from PowerPoint to free online programs, that can be used to communicate the story your data tells in an engaging and approachable manner. Representing assessment data visually can be an effective and approachable way to gain a quick, yet profound, understanding of complicated data. This session will provide simple, yet effective, tools for the user to create data visualizations. Assessment data can be overwhelming, not only to those working with it, but also to those who are trying to understand it. Data visualization provides an alternative way for the user and the audience to approach data and can be done with common tools, like PowerPoint.
LEARNING OUTCOMES:
1. Participants will gain a new awareness of the effectiveness of data visualization for complicated data sets.
2. Participants will learn about common and easy-to-use tools that can create data visualizations.
Audience: Beginner

SS3: The Impact of the 3R2V Strategy on Assessment Questions in the Science Classroom
Deshanna Brown, Barry University and Broward County Public Schools

High-stakes tests are assessments with consequences, positive or negative, such as student retention or promotion (Vacca and Vacca, 2009, p. 164). To help my students succeed on science assessments, a reading strategy was developed to assist students to decode test items and unlock the meaning of test questions. Good readers interact with the text and decode, read fluently, activate their vocabulary knowledge, and use multiple text comprehension strategies. Science assessments contain an assortment of information such as graphic organizers, graphs, photos, drawings, and other graphic features. The 3R2V strategy assists struggling readers to strive towards such tasks. Attendees will explore the potentials of the struggling reader and how the 3R2V strategy could be the bridge to their achievement in science.
LEARNING OUTCOMES:
1. Participants will explore the components of the 3R2V Strategy and its potential benefit to struggling readers.
2. Participants will examine the implementation process of the 3R2V Strategy in the classroom.
Audience: Beginner

SS4: Assessing and Addressing the Digital Literacy Skills of First-Generation College Students
Nicole Buzzetto-Hollywood and Magdi Elobeid, University of Maryland Eastern Shore

The assessment of the digital literacy skills of first-generation students attending a historically Black university through the use of the IC³ certification program with incoming freshmen will be discussed, as well as an evaluation of a core course offered as part of the institution's general education curriculum. There is a common, and growing, misconception that students enter higher education with the digital literacy competencies necessary for success; however, the research shows major skill deficiencies among students. This topic is relevant as institutions strive to meet the needs of students with varying levels of technological readiness. Attendees will be provided with relevant questions to help them evaluate the effectiveness of their current plan for addressing the digital literacy competencies of students.
LEARNING OUTCOMES:
1. Participants will be able to critically discuss the relevance of assessing and addressing the digital literacy skills of students, with a particular emphasis on students who may be from underserved populations.
2. Participants will be able to use relevant questions to evaluate the effectiveness of their current plan for assessing and addressing the digital literacy competencies of students.
Audience: Advanced

SS5: Utilization of External Reviewers for Student Learning Assessment
Anthony DelConte, Saint Joseph's University

The session will discuss use of external assessors of interpersonal and oral communication skills in real-life role-play scenarios. Students must convince a reviewer to part with a resource using the skills and techniques practiced in the course involving selling skills. Many of our interactions involve a series of communications where we persuade or influence thinking or behavior. The ability to communicate that we have something of value to offer in exchange for something else that we value is essential for those in the sciences, humanities, and in business. The ability to "sell" is critical to success in the world. According to Pink, only one out of nine workers has a job title that includes sales. As educators, we are selling nearly every day. Evaluating this skill set can be accomplished in role-play scenarios using external reviewers.


conference-program-2017.crw2.indd 17 9/1/17 4:50 PM LEARNING OUTCOMES: SS8: Faculty Centered Assessment: Getting the Right People to 1. Participants will be able to determine innovative ways to evaluate interpersonal and oral communication skills. the Right Place at the Right Time 2. Participants will be able to utilize external assessors to validate Brooke Kruemmling, Salus University student learning outcomes This session will discuss the importance of selecting members for a Audience: Beginner faculty university‐wide assessment committee who are positioned most appropriately to contribute to both programmatic and institutional SS6: Core Curriculum Outcomes: Reflections, Reactions, Results, assessment goals. This presentation will identify the challenges with

WEDNESDAY selecting such individual and the strategies used to develop a cohesive and and Other Assessment Tales functional committee. In the current landscape of educational outcomes Seth Matthew Fishman, Villanova University assessment, accrediting bodies are focusing on engagement at all levels The presenters will candidly convey the challenges faced with their current of the program. Creating an effective assessment committee builds upon strategy from our core curriculum’s first large‐scale assessment project institutional tools to promote sound assessment practices. Given the using an ePortfolio. We focus on our Foundation courses, the shared standards relative to assessment of most accrediting bodies, it is critical to intellectual experience of interrelated courses all undergraduate Arts establish college or program level processes to ensure that faculty have the & Sciences students take. We will discuss the lessons learned from our ability to remain engaged in their own assessment practices. pilot experience in 2016 and current process which reviews a sample of LEARNING OUTCOMES: ePortfolios, which involved 15 evaluators from four departments. Attendees 1. Participants will recognize the importance of engaging the will benefit from learning about the process of utilizing ePortfolio and appropriate faculty in the key assessment committee roles. working with multiple departments and academic disciplines, along with 2. Participants will understand the logistical challenges with using evaluator recruitment, training, and tech issues and costs. time effectively, and selecting the right individuals to participate LEARNING OUTCOMES: in a university assessment committee. 1. Participants will gain at least one strategy to assess a general Audience: Intermediate education/core curriculum. 2. Participants will identify challenges faced when utilizing SS9: Showing Educators How to Teach Traumatized Students technology coupled with evaluative review teams. Jonathan Wisneski and Anne Hensel, Upper Darby School Audience: Intermediate District and Drexel University Increasingly, schools are working to educate and test children who are SS7: Developing an Exceptional Academic Advising Program living with trauma. It is important that educators understand how Using Student Satisfaction Survey Data trauma impacts their students so that they can respond appropriately to effectively educated these students. This content matters because students Debra Frank, Drexel University who are impacted by traumatic events learn and behave differently. This In this session, I will show how we used student feedback on their presentation will summarize current research about trauma and its effects experiences with their academic advisors to improve advisor performance on brain development. We will discuss best practices for working with and achieve an overall satisfaction rate of over 92% to 97% on advisor children living with trauma and share practices that we have implemented efficacy, advisor characteristics, and advising satisfaction from baselines of in our building to build a trauma‐ informed staff and school environment. 72% to 84%. I will share the Qualtrics survey that was developed in‐house LEARNING OUTCOMES: and reviewed by NACADA academic advising experts. There is a direct 1. Participants will learn skills and strategies to support students connection between student retention and effective academic advising. dealing with trauma. Without clear student feedback systematically collected from advisees, 2. 
Participants will learn how to teach other adults the action steps advisors have tendencies to overrate their own efficacy and consider any to deal with students who have suffered from trauma. negative student feedback as anomalous. By providing clear feedback Audience: Beginner and benchmarking performance against the average of high performing advisors, advisors motivation to participate in professional development activities and to accept mentoring was increased. As a result, lower 3:30 – 4:30 PISB 108 performing advisors worked to improve their performance to meet the Assessing Critical Reflection: Learning in Faculty-led Short Term standards set by the advising manager and the high performing advisors. Advisors who are able to establish positive relationships with their advisees Study Abroad Programs: Students in Developed Countries and then get positive feedback about their work feel appreciated and this Akosa Wambalaba, United States International University increases job satisfaction and productivity. Critical reflection is a transformative learning outcome of the LEARNING OUTCOMES: embedded faculty led short term study abroad program (Gaia, 2015; 1. Participants will be able to explain the connection between clear Russell and Reina,2014) Windows to the World-France at United feedback and staff performance. States International University. We look at short term study abroad 2. Participants will be able to identify the elements of an effective assessment activities and assess evidence of critical reflection skill student satisfaction survey. learning by developing country students in a developed country, with Audience: Intermediate implications on assessment objectives and methods. The focus is on assessing process oriented transformative learning (Mezirow,1998) through multiple channels: engaging field discussions, informal interviews (novel), videos, home campus presentations and thematic research. Faculty in short term study abroad programs must continuously provide evidence of the competencies students acquire,


3:30 – 4:30 GERRI C. LEBOW HALL, 109
Our QuEST for Improving Learning: Year Two Analysis of Wellness Course Revisions
Mindy Smith and Susan Donat, Messiah College

We revised our general education learning objectives for our student wellness program in 2015-2016. We continued improving our curriculum by revising course materials in 2016-2017, changing our required readings to include multimedia content. Assessment findings revealed significant improvement in several areas of student-reported learning, with implications for other content areas. Higher education professionals face the challenge of engaging students in relevant content with learning experiences that will inspire critical thinking and deeper application. Our session details the process of moving from an instructional focus on wellness research articles to evidence-based wellness tools such as organizational websites, infographics, and TED talks. Improving student learning is an ongoing process, where one change can open additional avenues for improvement. Participants will reflect on the learning activities and required readings in their own courses and explore other creative tools that they could integrate based upon the findings from our assessment work.
LEARNING OUTCOMES:
1. Participants will interact with the revised general education wellness instructional resources at Messiah College and review the assessment findings regarding improvement in identified student outcomes.
2. Participants will raise questions and share innovative recommendations for engaging learners in relevant instructional content using creative and evidence-based instructional resources.
Audience: Intermediate

3:30 – 4:30 PEARLSTEIN 101
Both a Science and an Art: Designing, Developing, and Implementing Academic Program Evaluations that Work
Erica Barone Pricci and Alicia Burns, Lackawanna College
This session is about building and implementing a useful, sustainable, improvement-focused academic program evaluation model that meets the needs of multiple stakeholders, including faculty teaching within the program, faculty teaching outside the program, administration, students, accreditors, and the industries that ultimately seek to employ our graduates. Systematic, robust academic program evaluation is not only necessary to remain in compliance with accreditation requirements, but also informs decisions about the critical issues of resource allocation; faculty development; program growth, adaptation, or elimination; curricular revisions; and appropriateness of academic policies. Program evaluation results support and empower data-based decision making. This session will provide attendees with information about how to launch, re-energize, or maintain a program evaluation model on their campuses. It will share some of the potential challenges, as well as solutions, and will focus on strategies to promote sustainability and faculty and staff engagement in the process.
LEARNING OUTCOMES:
1. Participants will learn specific action steps to implement a successful program evaluation model from conceptualization to actual delivery.
2. Participants will learn tools to assist attendees with using the knowledge gained in the session, including an evaluation model and a template for sharing program evaluation results.
Audience: Intermediate

3:30 – 4:30 PEARLSTEIN 102
Presenters will share the rationale, challenges and success of the Lost with ILO Assessment? No Worries, We Bet you are Heading assessment transformation. The development of an assessment framework in the Right Direction will be explored by attendees through the presentation and an experiential Jacqueline Snyder, SUNY Fulton Montgomery Community College learning activity. Use of current technology allows for more effective Mary Ann Carroll, SUNY Herkimer County Community College assessment development and implementation, student evaluation as well Does your Institutional Learning Outcomes assessment feel like you are as program self assessment and improvement as required by accrediting randomly winding down a long path toward a goal, trying to get from point organizations. Utilization of these evaluation methods of learners allows A to point B? Do you have feelings of being lost and never reaching the for early student intervention and remediation prior to their taking destination? ILO assessment can appear as a never‐ending, foggy path... high stakes certification exams. Attendees will leave with an example with many twists and turns, leaving you guessing if a final documented of how the program used a change in technology as an opportunity to systematic process will ever be achieved. Two State University of New develop evidence‐based practices and shift a culture towards continual York community colleges will share their experiences of feeling lost, yet improvement. In addition, the learning activity will promote a simple and eventually landing in a place to document ILO systems and processes effective way to start a discussion on exam blueprint mapping. that meet Education Effectiveness Assessment... Standard V. Assessment LEARNING OUTCOMES: leaders from SUNY Fulton‐Montgomery CC and SUNY Herkimer CCC, 1. Participants will be able to create a blueprint map and label exam both small, rural institutions will provide an interactive session that focuses questions.


3:30 – 4:30 GERRI C. LEBOW HALL, 108

30 Minute Split Sessions

S1: Assessing Student Learning in Student Affairs: There's Just Not Enough Time!
Debbie Kell, Deborah E. H. Kell, LLC
Student Affairs divisions are challenged when asked to assess student learning. Unlike faculty, who see students for a full semester, Student Affairs staff often see students in large groups and/or in single, shorter, tightly defined sessions. This presentation highlights some tangible techniques that can be used to study the student learning take-away. Student Affairs and Academic Affairs should rightfully work in partnership in delivering on and assessing student learning outcomes, especially general education. But the assessment findings for each area need to be in place for meaningful conversations to yield intentional responses. Institutions wish to capture more of the totality of the student experience, especially relating to general education. As a "student-facing" service, student affairs divisions are a critical piece of the puzzle. Despite the challenges, it is possible to move beyond indirect measures such as office visits, participation rates, and surveys.
LEARNING OUTCOMES:
1. Participants will be able to articulate and study samples of measurable student affairs student learning outcomes and determine alignment with common general education focus areas.
2. Participants will be able to determine easily deliverable assessment tools suitable for the typical student affairs environment.
Audience: Intermediate

S2: Assessment Tools for Experiential Learning and Other Highly Impactful Practices
Melissa Krieger, Bergen Community College
Experiential learning activities can be assessed through thoughtfully developed rubrics directly aligned with the learning outcomes instructors wish students to master. The presenter will share the assessment results of a student service learning project her students completed as an example of effective assessment of a High Impact Practice. Attendees will participate in an interactive visual and verbal presentation to examine High Impact Practices for teaching and learning and discuss possible assessment tools for gathering data on student learning. Participants will have the opportunity to brainstorm learning outcomes that may be challenging to assess via experiential learning activities. Thoughtful development of fair assessments is imperative, though challenging, for the assessment of learning outcomes in experiential activities and assignments. As mere student participation does not suffice as evidence that learning outcomes were achieved, attendees will have the opportunity to explore worthwhile assessments for engaging and impactful instructional practices.
LEARNING OUTCOMES:
1. Participants will participate in an interactive visual and verbal presentation to examine experiential learning activities and possible assessment tools for these practices.
2. Participants will explore grading criteria that could be used to assess identified learning outcomes within experiential learning activities or other HIPs.
Audience: Beginner

4:45 – 5:30 P.M.
ICE CREAM SOCIAL
Come join your colleagues for ice cream and conversation during Network 101 Hour in the Papadakis Integrated Sciences Building Atrium.
SPONSORED BY

NOW FOR SOME FUN.
SEPTEMBER IS ALWAYS A BEAUTIFUL TIME TO ENJOY A BALL GAME.
PHILLIES VS. MARLINS / 7PM
Stroll through Ashburn Alley and try some of the best food offered at any baseball park in America. The conference will provide a shuttle to pick you up and take you back to the Drexel campus or the Sonesta. Drexel was able to procure a limited number of tickets at a reduced price, so the cost will be only $9 per ticket, which includes the shuttle cost.


CONFERENCE SCHEDULE » THURSDAY

7:30 – 8:30 A.M. CONTINENTAL BREAKFAST Drexel University

8:45 – 9:45 A.M. MORNING PLENARY

8:45 – 9:45 MANDELL THEATER
Reclaiming Assessment: Unpacking the Dialogues of our Work
Natasha Jankowski
Dr. Natasha Jankowski is Director of the National Institute for Learning Outcomes Assessment (NILOA) and research assistant professor in the Department of Education Policy, Organization and Leadership at the University of Illinois Urbana-Champaign. She is co-author, along with her NILOA colleagues, of the book Using Evidence of Student Learning to Improve Higher Education, as well as co-author of the recently released book Degrees That Matter: Moving Higher Education to a Learning Systems Paradigm. Her main research interests include assessment, organizational evidence use, and evidence-based storytelling. She holds a PhD in higher education from the University of Illinois and an MA in higher education administration from Kent State University, and she worked with the Office of Community College Research and Leadership studying community colleges and public policy.

10 – 11 A.M.
CONCURRENT SESSION 3

10:00 – 11:00 PISB 104
Background, Methods, and Results of a 7-Year Longitudinal Assessment of Undergraduate Business Writing
Scott Warnock, Drexel University
I will present a 7-year assessment of undergraduate business major writing, including results of more than 3,700 assessments of 2,000 documents and comparisons between English and business assessors. I'll discuss two curricular interventions. Overall, the student writing was assessed as good, but we found differences between English and business assessors. This presentation is based on a unique, published research project that takes a longitudinal approach to writing assessment. I also describe specific issues with writing assessment and how this study attempted to overcome them, largely through a situated assessment approach. The way we conducted this assessment represents a potentially reproducible approach that departments and institutions could use for a variety of purposes, including satisfying accreditation requirements.
LEARNING OUTCOMES:
1. Participants will be able to create a large, longitudinal writing assessment.
2. Participants will learn how situated assessment can help their campus assess the writing of its students.
Audience: Intermediate

10:00 – 11:00 PISB 106
The Wizards of Assessment: Peel Back the Curtain and Experience the Art and Science of the Assessor
Mark Green and Ray Lum, Drexel University
During this hands-on session, conference attendees will be invited to gather in a light-hearted but rigorous process of creating assessment tools. Whether the subject is complex or simple, evidence-based assessment techniques will be the foundation of the process. Be surprised as we demystify assessment by using the most unassuming subjects. Participants will work collaboratively with one another to develop assessment tools that are reliable and measurable. A panel of distinguished evaluators will determine the efficacy and validity of the tools. Alternatively, conference attendees may observe the quick-witted panel as participants gain insightful feedback and quips regarding their assessment tools. They will witness an array of techniques used. In addition, attendees will identify themes of best practice and tips for improvement. While networking during the session is prize enough for some, the top assessment tools presented will receive additional recognition and, of course, bragging rights for the year. This presentation will focus on making assessment relatable to all audience members and promote networking opportunities for attendees. The presentation will give an example of a way to engage colleagues in a conversation about assessment that is fun at the same time.
LEARNING OUTCOMES:
1. Participants will have the opportunity to network and engage in meaningful dialogue with other conference attendees.
2. Participants will gain feedback on their ability to develop rubrics and apply them in grading situations.
Audience: Beginner


10:00 – 11:00 PISB 108
Learner-Focused Assessment for the Creative Mind: Cultivating Growth for All Learners
Amanda Newman-Godfrey and Lynn Palewicz, Moore College of Arts and Design
This session shares the process and results of learner-focused assessment, specifically rubric designs that emphasize growth. Attendees will participate in an activity demonstrating the instrumentation of a learning-focused visual arts activity and its subsequent assessment. Group discussion will question perceptions of ability typically seen as inherent to an individual. Visual arts education designed to support and fairly assess creative and diverse learners levels the performance playing field. Pedagogical practices focused on helping learners develop a mindset for growth, rather than relying solely on performance-based outcomes, allow for enhanced teacher-student dialogue, authentic assessment practices, and sustained engagement in coursework. This session will share how a belief in growth mindset (Dweck, 2006) instead of fixed mindset enhances not only how we see learning in our students, but how we engage in personal and professional development. Additionally, this session will offer strategies on reaching struggling students and diverse learners.
LEARNING OUTCOMES:
1. Participants will gain the ability to use rubrics emphasizing growth mindset to assess students' project-specific performance as well as their learning readiness and potential.
2. Participants will receive hands-on experience using a student-centered, growth-mindset-oriented rubric to assess their performance on a visual arts lesson and measure it against a traditional teacher-directed didactic.
Audience: Beginner

10:00 – 11:00 PEARLSTEIN 101
Working Hand-in-Hand: Programmatic Assessments and Institutional Outcomes
Frederick Burrack and Chris Urban, Kansas State University
This presentation will focus on managing programmatic and institutional assessment in ways that promote authentic assessment within disciplinary contexts and enable the collection and reporting of institutional data. The presenters will demonstrate examples of processes and technologies to integrate data from a variety of sources into a seamless continuum. Programmatic student learning is integral to the expectations of higher education in liberal education as well as occupational preparation. Managing assessment results that equally inform institutional and programmatic constituents is a challenge. The processes shared will be of interest to assessment facilitators, administrative leaders, and programmatic faculty. Automating assessment data collection and reporting allows programs and administrative areas the opportunity to focus on the meaning behind data and guide resulting decisions toward program and institutional improvement. Data visualizations encourage enhanced consideration of student learning and lead to considerations not easily found empirically.
LEARNING OUTCOMES:
1. Participants will learn how automation of data collection and reporting can contribute to advanced consideration through analysis of student learning data.
2. Participants will recognize how programmatic student learning data can contribute to institutional assessment expectations.
Audience: Intermediate

10:00 – 11:00 PEARLSTEIN 102
Assessing Engagement in Active Learning Classrooms
Dawn Sinnot, Susan Hauck and Courtney Raeford, Community College of Philadelphia
In this session, we will share a research study that evaluates student perception of engagement in parallel ALC and traditional classrooms. Significant findings include increased classroom problem solving, student collaboration and student cooperation. Participants will gain perspective on both design and assessment decisions they might face in implementing similar facilities. As technology is implemented to facilitate learning, it is important to assess the effect to ensure continuous improvement. This session outlines a study done to measure the effect of technology on student engagement in an ALC environment using parallel control groups of the same course and same instructor. This study provides context for developing innovative assessment strategies. This case study will engage the audience in considering what counts as evidence of the results of technology used in the classroom. Can tangible effects of integrating technology be measured, and how can these results be used to improve teaching and learning?
LEARNING OUTCOMES:
1. Faculty will discover the purpose and practice of assessment and evaluation in Active Learning Classrooms.
2. Faculty will gain insight into how the Active Learning Classroom encourages problem-solving, collaboration and cooperation.
Audience: Intermediate

10:00 – 11:00 GERRI C. LEBOW HALL, 109
Collecting Meaningful Assessment Data: An Accreditation Strategy
Jane Marie Souza, University of Rochester
The session concerns creating a non-threatening strategy for collecting assessment data from all operational units within an institution to inform accreditation reporting. The word "accreditation" often elicits a grimace when mentioned on campuses. It can be associated with countless hours spent retrieving information that has been forgotten or was stored in files that have been forgotten. It is not unusual for institutions to reflect on all the required standards only when an accreditation event is imminent. However, there is another way. The session provides a framework to support a culture of continuous data collection such that use of assessment data becomes systemic, expected, and even celebrated each year through an internal report on progress made across the standards.
LEARNING OUTCOMES:
1. Participants will be able to create a plan for routinely collecting information important to the institution that is aligned with accreditation standards.
2. Participants will be able to implement the plan in such a way that it can be embraced by the campus community and become sustainable.
Audience: Intermediate


10:00 – 11:00 GERRI C. LEBOW HALL, 209
Peer-to-Peer Blueprints: Leveraging Hierarchical Learning Outcomes and Peer Consultants to Foster Faculty Discussions of Assessment
Michael Wick and Anne Marie Brady, St. Mary's College of Maryland
We will describe a toolkit used to enable a group of faculty leaders to propose, analyze, and refine student learning outcomes for courses and programs from disciplines disparate from their own, but within a consistent framework of institutional outcomes. The toolkit proved effective in elevating faculty engagement with assessment. The toolkit developed, peer-to-peer blueprints, is a highly effective means of quickly and painlessly moving faculty up the learning curve of student learning assessment. The toolkit fosters a community approach to assessment that provides administrative support to faculty who are already overburdened with work and too often underappreciated by external stakeholders. Attendees will gain access to concrete, fully implemented Excel templates in support of the peer-to-peer blueprint approach. The templates can be applied immediately in on-the-ground assessment processes.
LEARNING OUTCOMES:
1. Participants will understand the peer-to-peer blueprint toolkit for student learning assessment as demonstrated by summarizing the blueprint's operation.
2. Participants will be able to apply the peer-to-peer blueprint toolkit to student learning outcomes from their home institutions.
Audience: Beginner

10:00 – 11:00 GERRI C. LEBOW HALL, 108
We Need More: Novel Metrics for Classroom Assessment and Proposed Standards in Nonformal Learning
Caitlin Augustin, John Harnisher and Kristen Murner, Kaplan Test Prep
Non-formal education is a growing field that is a complement and corollary to formal education. However, most programs are evaluated by post-hoc surveys that rely on a student's memory rather than standardized assessments. This session focuses on three arms of non-formal education management: defining the space, discussing assessment, and proposing industry standards. Both students and professionals often lack the expert judgment to evaluate an investment in non-formal education, leaving them vulnerable to "gain claims" and other ambiguous statements of impact. However, non-formal education has been globally recognized as a tool for meeting educational needs. In a digestible format, we present results of both primary experiments and secondary analysis of success measurement in the non-formal space. In most non-formal learning arrangements there exists a digital learning component. These systems (LMS) allow evaluators to observe how a student truly engages with a non-formal course. However, expected measures of engagement and effectiveness of non-formal courses have not kept pace with available data. We discuss methods of both capitalizing on, and setting best practices for, measurement and assessment that make the most of the LMS.
LEARNING OUTCOMES:
1. Participants will discuss multiple methods of performance measurement for non-formal education programs.
2. Participants will be able to identify the value of standards of practice in non-formal education and propose standards for evaluations.
Audience: Beginner

11:00 – 11:15 A.M.
BREAK
Refreshments Available

11:15 – 12:15 P.M.
CONCURRENT SESSION 4

11:15 – 12:15 PISB 104
All About that 'Base: Database Design as Part of Your Assessment Toolkit
Krishna Dunston, Delaware County Community College
Webinars purporting to present a customized assessment database sometimes deliver "how I explained to IT what I wanted." This how-to presentation will showcase the design process of an Access web app. Participants will receive a copy of the presenter's Access app design as a starting point for building their own online assessment entry form and reporting site. Database design is rarely considered part of the skill set of the assessment professional. Yet the essential tasks of making alignment clear, collecting assessment data, articulating connections between learning outcomes, and creating reports for stakeholders are all tasks which can be made more manageable with a database. A good relationship with the offices managing your institution's data is key, but relying on your own skills to manage ad hoc or creative approaches can become an immensely satisfying part of the job. And building your own solutions can be more effective than forcing data into an assessment management system designed to standardize.
LEARNING OUTCOMES:
1. Participants will examine the design components of an Access web app: tables, queries and views.
2. Participants will apply the process of building a custom entry form to solve an assessment challenge.
Audience: Advanced
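The session demonstrates an Access web app, which is not reproduced here. As a rough illustration of the same design ideas (alignment, data entry, and reporting), here is a minimal sketch using Python's built-in sqlite3 module; every table and column name is hypothetical rather than taken from the presenter's design.

    # A toy assessment database: alignment, data entry, and a stakeholder
    # report. Schema and sample data are hypothetical illustrations only.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE outcome (id INTEGER PRIMARY KEY, description TEXT);
        CREATE TABLE course  (id INTEGER PRIMARY KEY, title TEXT);
        -- alignment: which courses address which learning outcomes
        CREATE TABLE alignment (course_id INTEGER, outcome_id INTEGER);
        -- one row per scored student artifact (what an entry form would add)
        CREATE TABLE result (course_id INTEGER, outcome_id INTEGER,
                             term TEXT, score REAL);
    """)
    conn.execute("INSERT INTO outcome VALUES (1, 'Written communication')")
    conn.execute("INSERT INTO course VALUES (1, 'ENG 101')")
    conn.execute("INSERT INTO alignment VALUES (1, 1)")
    conn.executemany("INSERT INTO result VALUES (?, ?, ?, ?)",
                     [(1, 1, 'FA17', 3.0), (1, 1, 'FA17', 2.5)])

    # Stakeholder report: average rubric score per outcome per term
    query = """SELECT o.description, r.term, ROUND(AVG(r.score), 2)
               FROM result r JOIN outcome o ON o.id = r.outcome_id
               GROUP BY o.description, r.term"""
    for row in conn.execute(query):
        print(row)  # ('Written communication', 'FA17', 2.75)

Even a schema this small makes the alignment between courses and outcomes explicit and queryable, which is the point the abstract makes about reporting for stakeholders.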
11:15 – 12:15 PISB 106
Task-Based Assessment: A Step-by-Step Guideline
Ramy Shabara, The American University in Cairo, Egypt
This session provides attendees with the opportunity to learn about the techniques and procedures used to develop appropriate CEFR-aligned classroom task-based assessments through a step-by-step guideline, and to develop multi-stage rubrics to collect and document evidence about learners' language skills. Task-based assessment is a key indicator of the quality of the language acquired by learners; however, little is known by many teachers about its characteristics, components, task selection and development, as well as rubric design. By attending this presentation, attendees will improve their classroom assessment competencies and practices; they will be able to design valid task-based assessments and multi-stage rubrics with which they can collect and document evidence about learners' language skills.
LEARNING OUTCOMES:
1. Participants will be able to develop valid task-based assessments whereby students' language skills can be assessed.
2. Participants will be able to design multi-stage task-based rubrics whereby students' language skills can be accurately assessed.
Audience: Intermediate


11:15 – 12:15 PISB 108
Cracking the Code of Creative "Capital:" Assessing Student Creativity in Science, Engineering and Technology Courses
Jen Katz-Buonincontro, Amanda Barany, Jessica Cellitti, Tamara Galoyan, Rasheda Likely, Magdalene Moy, Elaine Perignat and Hamideh Talafian, Drexel University
This three-part session addresses the gap in assessment knowledge and practices in the area of STEM creativity. First, presenters will clarify the key operational definitions of creative thinking and creative problem solving (5 minutes). In the second part, seven Ph.D. in Education students who completed a Creativity and Innovation in STEM Education course in the School of Education at Drexel during spring 2017 will discuss their assessment plans in four projects (30 minutes). Attendees will glean relevant assessment ideas during the presentations and then discuss and jump-start their own assessment plans in the final, third part of the session (25 minutes). A Checklist for Assessing Student Creativity, including helpful planning tips and resources, will be provided to attendees. Although "critical and creative thinking" is one of eleven Drexel student learning priorities and is frequently mentioned as a 21st-century competency in national learning standards, faculty members, researchers, and teachers are underprepared to plan for and to assess students' creativity.
LEARNING OUTCOMES:
1. Participants will gain a concrete understanding of four new creative thinking and problem solving rubrics in science, engineering, and technology.
2. Participants will engage in hands-on activities for adapting the rubrics to meet specific learning needs.
Audience: Intermediate

11:15 – 12:15 PEARLSTEIN 101
Comparative Program Assessment to Increase Student Access, Retention, and Completion
Catherine Carsley and Lianne Hartmann, Montgomery County Community College
This presentation outlines a unique assessment process called PEER (Program Excellence and Effectiveness Review) that provided Montgomery County Community College with increased insight into access, retention, and completion issues across programs. The process supplements historical IPEDS data with current student data to give the College a snapshot of what is happening right now in our programs while ensuring findings are actionable. Supplementing IPEDS data with current enrollment data allowed the College to make specific observations across programs. Some programs enrolled students who were disproportionately female and low-income (based on Pell percentages), and some programs attracted students who were relatively more male and wealthy. Faculty and staff can now craft up-to-date and actionable interventions based on PEER data. Coordinators of programs that are like each other in some way (regardless of enrollment or division) might benefit from discussions with other coordinators whose programs share similar issues; conversely, program coordinators who observe a specific program weakness could be matched with coordinators who were successful in those areas for help.
LEARNING OUTCOMES:
1. Attendees will be able to craft actionable interventions based on up-to-date rather than historical data.
2. Attendees will be able to compare programs across disciplines via new data elements and visualization tools.
Audience: Intermediate

11:15 – 12:15 PEARLSTEIN 102
Acting on Data: Lessons about the Use of Student Engagement Results to Improve Student Learning
Jillian Kinzie, Indiana University
The ultimate goal of assessment projects, including the National Survey of Student Engagement (NSSE), is not to gather data, but to catalyze improvement. This session presents lessons about acting on data drawn from institutional stories of NSSE use, including approaches to sharing results, triangulating data, and involving students in interpreting evidence. The content for this session draws from more than 50 accounts of NSSE data use from campuses that have effectively used results to facilitate conversations and take action to improve undergraduate education. Instructive, field-tested lessons are important for illustrating foundational principles of assessment and furthering best practice. Attendees will gain a handful of field-tested lessons about what works, reflect on their own data use and consider what stimulates and impedes action, and be encouraged to adopt a new strategy to take action on assessment results.
LEARNING OUTCOMES:
1. Participants will reflect on their assessment data use and consider what stimulates and impedes action.
2. Participants will be able to adopt a new strategy to take action on assessment results.
Audience: Intermediate

11:15 – 12:15 GERRI C. LEBOW HALL, 109
Critical Thinking: It's Not What You Think!
Sr. Janet Thiel, Georgian Court University
This session will examine the academic quality of various intellectual skills currently classified as critical thinking. Participants will consider the various nuances of critical thinking and its assessment. The definition of critical thinking will be teased out as problem-solving, reflective, self-aware, metacognitive, creative, and critique thinking. Appropriate teaching methods and ways to assess these intellectual skills will be presented. Participants will consider how critical thinking is defined and assessed on their own campus and within its various programs, with learning both inside and outside the classroom. Critical thinking has often been relegated to tests of inferential reading skills, assessed through logic-intensive courses, or through quantitative reasoning. This presentation considers alternate ways to define critical thinking that are applicable not only to academic learning but can also be assessed within student life experiences.
LEARNING OUTCOMES:
1. Participants will analyze critical thinking beyond the testing parameters of inferential reading ability.
2. Participants will consider appropriate assessment of various intellectual skills classified as critical thinking.
Audience: Intermediate

11:15 – 12:15 GERRI C. LEBOW HALL, 209
Text Analysis as Assessment for Ethical and Diagnostic Purposes
Fredrik deBoer, Brooklyn College
The purpose of this session is to demonstrate, practically and theoretically, how text analysis with software can help administrators and instructors make more ethical and pragmatic decisions regarding student writing ability. Text analysis software allows us to look at large sets of student data very quickly, allowing us to mine that data for information that is relevant to how we assess student writing ability. This in turn allows us to make better decisions about placement and the use of scarce institutional resources. I'll show attendees how to access free-to-use software to analyze student writing samples and how that analysis can make our writing programs more fair, efficient, and effective.
LEARNING OUTCOMES:
1. Participants will be able to identify text analysis software that is useful for making decisions in writing program placement and evaluation.
2. Participants will be able to name several types of analysis that can be undertaken using such software.
Audience: Intermediate
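The abstract does not name the software the presenter will demonstrate, so the sketch below only illustrates the kind of surface metrics such tools compute over batches of student writing; the metric choices here are ours, not the presenter's.

    # Illustrative only: a few surface metrics of the kind text analysis
    # tools report when screening large sets of student writing samples.
    import re

    def surface_metrics(text: str) -> dict:
        words = re.findall(r"[A-Za-z']+", text.lower())
        # naive sentence split on terminal punctuation
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        return {
            "words": len(words),
            "avg_sentence_len": len(words) / max(len(sentences), 1),
            "type_token_ratio": len(set(words)) / max(len(words), 1),
        }

    sample = "Assessment informs placement. Placement shapes student success."
    print(surface_metrics(sample))
    # words: 7, avg_sentence_len: 3.5, type_token_ratio: ~0.86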


12:30 – 1:45 P.M.
LUNCHEON PLENARY: A PANEL ON THE CURRENT STATE OF ACCREDITATION

The leadership of the Southern, Middle States and New England regional accreditation associations will address the commonly held view that accreditation is being transformed from a valued private-sector process, over which the federal government historically has exercised limited control, to a process that is subject to more and more federal involvement. What are the implications of this shift for faculty? For administration? For students? Are institutions experiencing a loss of appropriate authority and responsibility for key academic decisions, that is, judgments about curriculum, academic standards, and general education? Accreditation is a creation of colleges and universities that dates back more than a century. Its fundamental purposes are quality assurance and quality improvement in higher education. As a process of self-regulation through peer and professional review, it is the oldest such system in the world. But are the core academic values on which accreditation is built, and in which faculty members invest, at risk as the federal government's role expands?

Elizabeth Sibolski, PhD
Dr. Sibolski joined MSCHE in 2000 as a Vice President and was named Executive Vice President in 2007. Prior to joining MSCHE, she was Director of University Planning and Research at American University in Washington, D.C. She is a past president of the Society for College and University Planning and has served as a trustee of the Mortar Board National Foundation. Sibolski holds the B.A. in political science and the M.P.A. and Ph.D. in public administration from American University, where she remains active as an alumna of the University's School of Public Affairs. Dr. Sibolski is co-author of Integrating Higher Education Planning and Assessment: A Practical Guide (SCUP, 2006). More recently, she has authored several articles that have appeared in Planning for Higher Education.

Belle Wheelan, PhD
Dr. Wheelan currently serves as President of the Southern Association of Colleges and Schools Commission on Colleges and is the first African American and the first woman to serve in this capacity. Her career spans over 40 years and includes the roles of faculty member, chief student services officer, campus provost, college president and Secretary of Education. In several of those roles she was the first African American and/or woman to serve in those capacities. Dr. Wheelan received her Bachelor's degree from Trinity University in Texas (1972) with a double major in Psychology and Sociology; her Master's from Louisiana State University (1974) in Developmental Educational Psychology; and her Doctorate from the University of Texas at Austin (1984) in Educational Administration with a special concentration in community college leadership.

Patricia M. O'Brien
Patricia M. O'Brien, SND, is Senior Vice President of the Commission on Institutions of Higher Education of the New England Association of Schools and Colleges (NEASC), the regional accrediting association for the six New England states. NEASC accredits over 240 colleges and universities. Prior to coming to NEASC, Dr. O'Brien worked for seven years at Bridgewater State College (now University) in Massachusetts, first as Director of Institutional Research and Assessment, and later as Associate Vice President for Planning and Assessment. She has also been an adjunct faculty member at Harvard University and at Emmanuel College in Boston. Dr. O'Brien has presented at a number of regional, national, and international conferences, and she is editor of Accreditation: Assuring and Enhancing Quality, published as part of the New Directions for Higher Education series in Spring 2009. Dr. O'Brien earned her baccalaureate degree in Psychology at Wellesley College and her graduate degrees in Education at Harvard University.

11:15 – 12:15 GERRI C. LEBOW HALL, 108
Assessment as Research: Using Compelling Questions to Inspire Thoughtful Assessment Practices
Javarro Russell, Educational Testing Service (ETS)
This session asks participants to reconsider their assessment process. By brainstorming the compelling questions an institution could ask, and by engaging in targeted data analysis activities, participants in this session will gain insight into how to develop assessment processes that address compelling questions about student learning and success. Far too often, institutions begin to ask meaningful questions about student success and student learning after they have already started or completed their assessment process. Ultimately, they realize that they did not collect enough data, nor the right kind of data, to help answer their questions. Our assessment colleagues are often discouraged by the lack of data use in their assessment work. This presentation will offer some ideas for collecting the kind of data that could compel their colleagues to action.
LEARNING OUTCOMES:
1. Participants will be able to identify compelling questions about student success and student learning that encourage the closing of the assessment loop.
2. Participants will be able to identify analyses for uncovering the performance of students on learning outcomes.
Audience: Beginner

2 – 3 P.M.
CONCURRENT SESSION 5

2:00 – 3:00 PISB 104
Knowing More about our Students in Foundational Math and Writing Reform: Building Multi-Faceted Assessment on the Front End
Fiona Glade, University of Baltimore
In 2014, the University of Baltimore redesigned developmental math and writing with a focus on student self-efficacy to improve retention. After describing the revision process and the impact of the new structure, which includes Directed Self-Placement using multiple measures, the presenter will share multiple-measures materials and show participants how to adapt them to their own contexts.
LEARNING OUTCOMES:
1. Participants will learn how Directed Self-Placement works based on multiple measures of writing assessment, including guided self-assessment.
2. Participants will learn how to use their own institutional context to create Directed Self-Placement appropriate to supporting developmental writing students.
Audience: Advanced


2:00 – 3:00 PISB 106
Snapshot Sessions (A Collection of Mini Sessions)

SS1: Assessing our Assessment: A Process for Reviewing Annual Assessment Reports
Gina Calzaferri, Temple University
All the annual assessment reports are submitted; now what? Based on our assessment model, Temple University developed a system for reporting assessment activity across programs. This session details Temple's process for reviewing these annual assessment reports, including the use of a rubric for report review and conferencing with stakeholders. Accreditors are looking for evidence that institutions are assessing their processes. This session will demonstrate how one large, urban research institution assesses its assessment process, a design which can be replicated regardless of discipline and size of college or university. Participants will be able to adapt this process to suit their institution's assessment practices and will have access to the rubric and other materials Temple has developed.
LEARNING OUTCOMES:
1. Participants will acquire strategies for assessing annual assessment reports.
2. Participants will be able to adapt this process to suit their institution's assessment practices.
Audience: Intermediate

SS2: Turning 120 Annual Reports Into a Searchable Online Planning/Reporting Database Linked to Strategic Plans
Wenjun Chi, Saint Joseph's University
This session provides an overview of the structure of an automated online annual planning/reporting platform which encourages strategic goal setting and assessment in all units, captures yearly progress toward institutional strategic plans, and provides data to decision makers for resource allocation planning and the advancement of institutional priorities. This integrated online database simplifies and enhances communication, data analysis, and documentation of our campus planning and budgeting process and provides searchable evidence of progress toward institutional strategic priorities. Attendees will learn strategies for information integration that can be applied to their own institutional processes. Attendees will understand the challenges posed in developing an online platform, the support that is necessary from stakeholders, and the benefits of an integrated system for planning, reporting, and decision making.
LEARNING OUTCOMES:
1. Participants will learn strategies for information integration that can be applied to their own institutional processes.
2. Participants will understand the challenges posed in developing an online planning platform, the support that is necessary from stakeholders, the importance of a well-chosen implementation team, and the benefits of an integrated system for planning, reporting, and decision making.
Audience: Intermediate

SS3: Rubrics: Facets That Matter
Diane DePew, Drexel University
Rubrics can be analytic or holistic. With either type, the facets of a rubric need to address the assessment requirements as aligned to the learning objectives (Dawson, 2017; Popham, 1997). This conversation will explore those facets as they relate to validity and reliability measures in the development of rubrics. Rubrics have the potential to increase student achievement and improve instruction. Addressing the facets of rubrics brings attention to the details of developing rubrics, and these details lead to improved rubric validity and reliability. Participants will be stimulated to incorporate methods in developing rubrics that enhance validity and reliability. This will promote student learning and improve instruction in their courses.
LEARNING OUTCOMES:
1. Attendees will evaluate the facets of rubrics that enhance student achievement and improve instruction.
2. Attendees will explore methods to ensure validity and reliability of assessment rubrics.
Audience: Intermediate

SS4: The Impact of Co-Curricular Activities as an Assessment Tool on University Students
Muhammad Farooq and Gehan El Enain, Abu Dhabi University
Our objective is to examine whether a co-curricular activity incorporated with the assessment methods might affect students' competence. The purpose of this presentation is to highlight the importance of co-curricular activities in the general education of adults. All modern educationists believe that education is not merely memorization; it must be reflected in the personalities of students in the form of knowledge, confidence, competence and moral values. A society needs human beings, not machines, for its survival. Particularly in developing countries, education systems are designed according to the needs of the job market instead of the needs of society. These kinds of education systems produce healthy bodies without souls; that is why intolerance, impatience, injustice and social evils are common in those societies. An ideal education system prepares people who can compete in the job market and also contribute to the development of society.
LEARNING OUTCOMES:
1. Participants will be able to understand the importance of co-curricular activities within education.
2. Participants will be able to incorporate co-curricular activities into their curricula.
Audience: Intermediate

SS5: Learning from the Assessment Process: HBCU Faculty Perspectives on Classroom and Program Review
Pamela Felder and Michael Reed, University of Maryland Eastern Shore
Assessment is essential to the foundation and mission of our nation's colleges and universities. What is known about this process within the context of historically Black colleges and universities (HBCUs) is underrepresented in higher education research. This represents a gap in the scholarship shaping future development at HBCUs. Understanding the HBCU faculty experience as it relates to classroom and program effectiveness can facilitate more meaningful review of assessment standards. Learning more about faculty peer review and interaction can serve to illuminate what strategies support and/or hinder institutional mission. HBCUs are an important part of the American higher education landscape and continue to serve the needs of our nation's marginalized student populations. Faculty perspectives shared during the presentation will inform attendees who have an interest in serving marginalized students.
LEARNING OUTCOMES:
1. Attendees will learn about the challenges related to classroom and program review at HBCUs.
2. Attendees will learn about the HBCU faculty experience with classroom and program review.
Audience: Intermediate


SS6: Enhancing Online Course Design by Developing Faculty-Administration Collaborations and Using Quality Matters Rubrics
Moe Folk and Doug Scott, Kutztown University
This session covers how to use the Quality Matters rubric to improve online course design and assessment. The session also covers how to develop beneficial faculty-administration collaboration with regard to planning and assessing online courses. Faculty are often thrown into teaching online courses, which makes assessing such courses difficult. This session will help audience members improve how they build online courses with regard to assessment. In addition, the session will inspire audience members to question their current online course design and assessment process in order to improve faculty-administration collaboration on designing and assessing online courses at their institutions. This session will help attendees in creating and assessing online courses. As such, it will improve the efficiency of both their own course design process and the actual process of coordinating with administration to build and assess online courses.
LEARNING OUTCOMES:
1. Participants will understand how the Quality Matters rubric allows for assessment-minded online course design.
2. Participants will explain why improved faculty-administration collaboration is necessary for building and assessing online courses.
Audience: Intermediate

SS7: Collaboratively Assessing Collaboration: Self, Peer and Program-Level Assessment of Collaborative Skills
Janet McNellis, Holy Family University
This session provides an argument for the importance of including the development of collaboration skills as a program-level outcome for academic programs. It describes how focusing on the building of collaboration skills maximizes student learning, describes how these skills can be assessed, and explains how this assessment aids students' academic learning. Many recent research studies highlight the importance collaboration skills have for job performance and satisfaction in fields as varied as healthcare, business leadership, engineering, and service industries. However, many degree programs do not include collaboration skills as program-level outcomes because they are difficult to assess directly. This session will provide attendees with theoretical and practical knowledge of how to enhance and directly assess students' collaboration skills. Attendees will leave with a self- and peer-assessment grading rubric and a program learning objective rubric that they can modify and use in their own institutions.
LEARNING OUTCOMES:
1. Participants will gain an understanding of why collaboration skills should be included in program-level goals and assessed.
2. Participants will gain an understanding of an effective tool that can be used in self and peer evaluation of students' collaboration skills.
Audience: Intermediate

SS8: Developing an On-Line Simulation Activity: Assessing the Need and Implementing Action!
Margaret Rateau, Robert Morris University
This presentation will focus on the assessment, development, and evaluation of an on-line simulation activity for RN students. Information received from student practice survey assessments revealed concern for the maintenance of patient safety in practice. The development of the simulation activity and the application of simulation for other educators will be discussed. On-line programs of learning are becoming increasingly popular and are attractive alternatives for students, especially for the adult, non-traditional learner. Teaching effectiveness can be challenged in the on-line environment; therefore, the use of creative, interactive teaching-learning methods can increase student engagement and enhance student learning. This presentation will afford attendees the opportunity to discuss the development of creative teaching/learning strategies such as simulation in an online environment. The expansion of teaching strategies contributes to a broadened educational pedagogy and can enhance student learning assessment and evaluation methods for faculty members.
LEARNING OUTCOMES:
1. Participants will discuss elements of the assessment, development, and implementation of an on-line simulation learning activity.
2. Participants will identify other topic areas where a simulation learning activity could be used as an effective teaching-learning strategy.
Audience: Intermediate

SS9: Assessment in Science Using the I-LEARN Model
Hamideh Talafian, Drexel University
This session is about an information literacy model, the I-LEARN model, which focuses on a problem-solving cycle in a more detailed way. The model is a six-step model (Identify, Locate, Evaluate, Apply, Reflect and Know) designed to prepare students for 21st-century learning skills. Science, as one of the components of STEM, requires more preparation on the part of students for 21st-century learning skills, and this session focuses on redesigning and assessing a course based on this model. This session gives attendees a very specific rubric for teaching and assessing many science subjects in K-12 and helps them gain a better understanding of each step, based on the ongoing projects being done with this model.
LEARNING OUTCOMES:
1. Participants will learn how to assess students' coursework based on an information literacy model.
2. Participants will learn how to teach effectively through an information literacy model and problem-solving criteria.
Audience: Intermediate

2:00 – 3:00 PISB 108
Implementing a Student Assessment Scholar Program: Students Engaging in Continuous Improvement
Nicholas Truncale, Elizabeth Chalk, Jesse Kemmerling and Caitlin Pelligrino, University of Scranton
This session covers the implementation of a student assessment scholars program. Student scholars aid in continuous improvement by collecting viewpoints via student-led focus groups and tendering suggestions to stakeholders within the university community. Training students to design projects and how to administer the program will be some of the information discussed. One goal of assessment is to improve the student learning experience, so it makes sense to involve students in the assessment process. Student scholars are able to collect evidence from their peers via focus groups that may not be accessible by regular means of data collection (i.e., surveys, course evaluations, etc.). Assessment of a general education program and institutional learning outcomes can be a burden for faculty at institutions that may not have the support structure necessary to accomplish these tasks. Having a dedicated group of students invested in their university to collect evidence from other students can help immensely.
LEARNING OUTCOMES:
1. Participants will review an example of a student assessment scholar program by interpreting a program blueprint given by the presenters.
2. Attendees will discuss and simulate the administration of the scholar program by considering a sample student assessment project at their own institution.
Audience: Beginner


2:00 – 3:00 PEARLSTEIN 101
Organizing Program Assessment as Collaborative Problem Solving
Gina Baker, Liberty University
Barbara Masi, Penn State University
In this session, participants will learn how to structure program assessment as collaborative problem solving across the institution. The steps needed to support faculty and administration in reframing student learning assessment processes as a coherent system of student, course, program and institution information and resource elements will be presented. In this session, faculty and administrators will learn how to connect formal program assessment processes with institution-wide curricular innovation processes. By doing so, they will be able to more effectively target scarce institution resources toward the curricular innovations that will have the greatest impact on students. All institutions strive for efficient and effective processes to improve student learning; however, at many institutions those processes are often hampered by information and resource flows from the course to the program to the institution level. The organizational framework and templates presented simplify this process. The result is a streamlined process for improving student learning.
LEARNING OUTCOMES:
1. Attendees will be able to define and explain the elements of a streamlined institution student learning assessment framework.
2. Attendees will be able to use the streamlined institution student learning assessment framework to improve processes at their institution.
Audience: Intermediate

2:00 – 3:00 PEARLSTEIN 102
Educational Development and Assessment: Simultaneously Promoting Conversations that Matter
Phyllis Blumberg, University of the Sciences
To promote conversation, this session will report on research that served mutually complementary purposes: to determine a) the impact of faculty development efforts and b) whether colleges met their strategic goals of implementing learning-centered teaching. Participants will discuss how Suskie's assessment cycle is useful for educational development purposes and assessment. This session illustrates why educational developers should become involved with ongoing assessment projects. When developers broaden their scope from working with individuals to collaborating with departments or colleges on programmatic assessment, they are poised to improve the overall educational quality of programs through significant conversations. Educational developers are often trusted by the faculty and the administration; therefore, they can be change agents who promote institutional development through assessment efforts. When engaged in assessment efforts, developers provide formative feedback for the purpose of improvement and make useful information available to institutions for accreditation reporting.
LEARNING OUTCOMES:
1. Participants will be able to articulate why they should be involved with programmatic assessment and how they can promote overall educational quality improvement.
2. Participants will be able to use Suskie's assessment cycle in significant conversations that lead to improvement efforts, pragmatic assessment, and data for accreditation reports.
Audience: Beginner

2:00 – 3:00 GERRI C. LEBOW HALL, 109
Listening for Learning: Using Focus Groups to Assess Students' Knowledge
Corinne Dalelio and Christina Anderson, Coastal Carolina University
We will present experiences and lessons learned using focus groups as an innovative, collaborative, and discussion-based method to assess higher-order learning outcomes. Specifically, we will demonstrate a design used to assess students' ability to evaluate communication processes and messages; think critically about human interaction; and analyze principles of communication. Those in charge of assessing student learning may wish to use qualitative approaches to be able to observe how their students actually apply their learning in a classroom environment. This session will showcase how focus groups can be used to assess student learning emerging through group discussion. We will introduce not only an innovative assessment method, but also a mindset that, in pursuing assessment, we should think about ways of effectively capturing student learning, rather than just relying on the default methods or devices that may be the most frequently used or made most easily accessible (Angelo, 1999).
LEARNING OUTCOMES:
1. Participants will have a broadened understanding of the techniques and methods that can be used for assessing student learning.
2. Participants will be able to apply a focus group method to assess students' higher-order thinking skills and embedded knowledge in their disciplines.
Audience: Intermediate

2:00 – 3:00 GERRI C. LEBOW HALL, 209
Rebooting Work-Based Assessment for the 21st Century: Shifting to Digital Technologies for Student Nurses
Sian Shaw and Anne Devlin, Anglia Ruskin University (UK)
This session presents our case study research into the introduction of digital practice assessment for 500 second-year student nurses. It outlines practical examples of how blockades to introduction within the community of practice were overcome and how learning analytics are used to enable absolute governance of work-based assessment. Within the United Kingdom, as in the USA, there is a crisis: a national shortage of nurses. In the UK, this has led to new routes of entry, including work-based pathways, partly driven by a Government employer levy to fund apprenticeships. This creates a need for innovative solutions for work-based assessment. To reduce attrition, it is essential to be able to track students' progress in practice, ensure that assessments are completed on time, and provide timely support. Leading digital change is hard. This session will outline how putting users first and technology at the heart can create better, more sustainable assessment.
LEARNING OUTCOMES:
1. Participants will learn strategies to overcome obstacles in leading the adoption of digital assessment in a work-based community of practice.
2. Participants will use the tablet- and web-based assessment used by Anglia Ruskin University for student nurses so that they are able to critically discuss how learning analytics gathered from digital work-based assessment can be used to enhance work-based assessment.
Audience: Intermediate


2:00 – 3:00 GERRI C. LEBOW HALL, 108
Drexel Outcomes Transcript and Competency Portfolio: Empowering Students and Faculty with Evidence of Learning Using Effective Assessment
Mustafa Sualp, AEFIS
Stephen DiPietro and Donald McEachron, Drexel University
In higher education, courses and instructors are often functionally siloed, and students fail to see the connections between curricular elements. Outcomes-based design and assessment should address this problem but often does not, due to a significant disconnect between what students and faculty understand about the significance of student learning outcomes. In an effort to address these issues, a complete assessment management solution approach and software are being designed and implemented to create "learning outcomes transcripts" which transcend individual courses and educational experiences. By providing developmentally relevant feedback to students in real time, these transcripts may promote significant student ownership of learning outcomes, creating a stronger sense of purpose and curricular continuity. That, in turn, should promote more effective student learning and academic performance.
LEARNING OUTCOMES:
1. Participants will be able to explain what an outcomes-based transcript is.
2. Participants will be able to understand how a software system can automate assessment to create an outcomes-based transcript.
Audience: Advanced

3:00 – 3:15 P.M.
BREAK
Refreshments Available

3:15 – 4:15 P.M.
CONCURRENT SESSION 6

3:15 – 4:15 PISB 104
Assessing Our Assessment: Findings and Lessons Learned Three Years Later
Victoria Ferrara, Mercy College
Mercy College has learned quite a bit about the process used to measure educational effectiveness and the quality of its assessment work. This session will engage participants in a discussion about how the institution used assessment findings to improve faculty development opportunities, faculty assessment activities, and the institution's assessment process. Many institutions struggle to implement formal processes to measure the effectiveness of educational assessment activities. Engaging in systematic self-assessment is critical to an institution's ability to identify faculty development needs, provide support to help faculty determine whether students are meeting expectations, and ensure the effectiveness of the institution's assessment process. This session will engage attendees in meaningful conversation about how to develop a process to assess their assessment efforts on their campuses. Time will be spent discussing findings, lessons learned, and the evolution of the rubric used to assess the work Mercy's academic programs engage in to measure student learning.
LEARNING OUTCOMES:
1. Participants will be able to develop a process for evaluating academic assessment efforts.
2. Participants will be able to use assessment results to improve the assessment of student learning.
Audience: Advanced

3:15 – 4:15 PISB 106
Everything I Ever Wanted to Know About Assessment I Learned from Reality Cooking Shows
Krishna Dunston, Delaware County Community College
Reality cooking shows, like assessment plans, need to balance the evaluation of discrete technique, project-based application, and synthesis of critical thinking skills and knowledge. In this interactive workshop, participants will deconstruct an episode of Top Chef to identify and align program goals, program objectives, student learning outcomes and criteria for evaluative rubrics. For those new to assessment (or those newly tasked with documenting or collaborating on assessment projects), getting the "lingo" down can be daunting. This exercise allows for productive dialogue as to what each assessment term means and how it is used to improve our understanding of student learning. This workshop allows the beginning practitioner or facilitator a low-stakes method of opening dialogues with colleagues and getting beyond what we mean to what is meaningful.
LEARNING OUTCOMES:
1. Participants will be able to identify examples of program goals, objectives, student learning outcomes and criteria for evaluative rubrics, and discuss how each of these distinct elements works cohesively in a balanced assessment plan.
2. Participants will discuss how to use the cooking show genre as a low-stress way of engaging campus discussion on a variety of assessment topics.
Audience: Beginner

3:15 – 4:15 PISB 108
Grit in the Classroom
Rebecca Friedman, Johns Hopkins University
All teachers praise students, but are we thinking critically about how we praise? What words are we using, and what impact could they have? This workshop explores praising effort over intelligence. Participants will brainstorm how they can encourage students to develop grit. Incorporation of the growth mindset makes a big difference in how children perform academically. Even children who consider themselves "gifted" often avoid challenge, for fear they might lose status if they fail. But when we teach youth that intelligence is malleable, they tend to persist through difficulties and experience intellectual growth (Blackwell, Trzesniewski, & Dweck, 2007). When struggling students learn how to "drive their brains" through the use of cognitive strategies, such as the growth mindset, they're more likely to be able to learn and think at higher levels. Teachers often say they need strategies for helping students learn how to increase their attention.
LEARNING OUTCOMES:
1. Attendees will gain a deep understanding of how growth mindset and grit are related.
2. Attendees will learn specific ways we can encourage students to develop a growth mindset.
Audience: Beginner


3:15 – 4:15 PEARLSTEIN 101
Decoupling and Recoupling: The Important Distinctions between Program Assessment and Course Assessment
Nazia Naeem, Lesley Emtage, Xiodan Zhang and Debbie Rowe, York College
The panel discussion focuses on one of the key issues in program assessment: the differences between program assessment and course assessment. The panelists discuss the importance of the distinctions and connections between these two kinds of assessment, including how they serve different purposes, how assignments and rubrics are designed differently, and how they are connected from a program self-study point of view. Mistaking course assessment for program assessment is common in faculty practice, even though faculty may conceptually know the differences between the two. We maintain that decoupling program assessment from course assessment and then recoupling the two would enable faculty not only to examine students' performance in any specific areas of knowledge and skill but also to adopt a broader view over program goals and curriculum beyond an individual course. The panel discussion helps attendees take away important strategies on program self-study. These strategies include rubric and assignment designs created specifically for program self-study, and interpretation of data on student learning related to the evaluation of the program curriculum: the sequence of courses at different levels, coverage of major knowledge and skills across frequently-offered courses, etc.
LEARNING OUTCOMES:
1. Participants will learn how to turn an existing course rubric that a faculty member commonly uses only for giving students a holistic grade into a rubric that examines important learning outcomes a program expects its students to achieve.
2. Participants will learn how to distinguish between formative and summative assessment, and then understand how to design an assignment to collect student work for the purpose of improving teaching and learning in specific areas of knowledge and skill.
Audience: Beginner

3:15 – 4:15 PEARLSTEIN 102
Encouraging Meaningful Assessment By Celebrating It!
Letitia Basford, Hamline University
The importance of learning outcomes assessment is now clearly recognized at most universities across the country. Increased accountability and reporting demands have supported the momentum. But a greater focus on authentic assessment that facilitates student learning is not the result of succumbing to pressure. In order to build ongoing assessment routines that improve student learning, Hamline University has focused on celebrating the efforts and outcomes of all who engage in this work. This session will highlight the different ways our university celebrates assessment efforts, and will provide participants with an opportunity to share their efforts and ideas. We have defined our learning outcomes, collected student work, and had conversations about what we are learning from this work. But this effort takes time and demands ongoing encouragement to maintain momentum. Hamline University has taken part in several efforts to celebrate how programs are using assessment data to create meaningful change in their programs.
LEARNING OUTCOMES:
1. Participants will analyze the celebratory strategies at one university that encourage ongoing and meaningful assessment work.
2. Participants will share, collaborate and design practices for their own campuses that serve to reward meaningful assessment work.
Audience: Beginner

3:15 – 4:15 GERRI C. LEBOW HALL, 109
Application of the Collaborative Active Learning Model (CALM) Simulation: An Experiential Service Learning Approach
Francis Wambalaba and Peter Kiriri, United States International University
At the United States International University, my colleague and I have been experimenting with the CALM approach over the last three years in a graduate research methods course. The model is premised on three key activities: cooperative learning (teams); experiential participatory learning (project); and service learning (learning outcomes quality assessment). Studies have focused on active learning (Paulson and Foust, 1998; Mascarenhas, 1991; Aison, 2010; and Carlson and Winquist, 2011); effective learning (Li, 2012; Shen, Wu, Achhpiliya, Bieber and Hiltz, 2004); constructivist learning from experience to knowledge (Cooperstein and Kocevar-Weidinger, 2003); and forms for effective learning (Prince, 2004). Why the CALM approach? Educators are always searching for effective ways of teaching and adapting to the changing student environment. This session engages participants in a simulation used as an introductory overview to research methods, followed by a reflective discussion. Finally, an overview of the service learning experiential research project will be presented.
LEARNING OUTCOMES:
1. Participants will be able to identify key features in their courses that can be reorganized to include a CALM approach.
2. Participants will be able to identify courses in their university that could be appropriate for using a service learning approach and hopefully work towards reorganizing them for that purpose.
Audience: Intermediate

3:15 – 4:15 GERRI C. LEBOW HALL, 209
Process-Based Assessment and Concept Exploration for Personalized Feedback and Course Analytics in Freshman Calculus
Mansoor Siddiqui, Project One, and Kristen Betts, Drexel University
Synapse, a STEM assessment platform, emphasizes grading based on the process rather than the final answer, rewarding students for demonstrating their thinking throughout a problem based on a rubric. Students can explore mistakes, generating data which is used to give personalized feedback and course efficacy analytics for professors and administrators. Online assessment has long been inefficient in capturing meaningful data about where students are making mistakes and why. In freshman-level STEM courses, grading assignments by hand is the only option, but it is too time-consuming. This platform saves teachers time, gives students better feedback, and generates meaningful course efficacy analytics. For teachers, grading time can be cut down tremendously, and deep insight into their students' strengths and weaknesses can be measured after every assignment. Administrators have a method of measuring student learning outcomes and course efficacy at the student, section, and program level.
LEARNING OUTCOMES:
1. Attendees will learn how improved user experiences can make grading more efficient in large, intro-level STEM courses.
2. Attendees will learn how machine learning on data gathered from students' exploration of their mistakes can lead to greater insight.
Audience: Intermediate


TOP OF THE TOWN. JOIN US ATOP ONE OF PHILLY'S ICONIC SKYSCRAPERS, THE MELLON BANK CENTER. The Thursday evening reception will be at the Pyramid Club on the 52nd floor of the Mellon Bank Building in downtown Philadelphia. Enjoy a panoramic view of the nation's first capital and the Philadelphia waterfront while enjoying entertainment by the Aqua String Band of the famous Philadelphia Mummers, as well as delicious food and beverages with old friends and newly acquired ones. PYRAMID CLUB 5:30-7:30PM

CONFERENCE SCHEDULE » FRIDAY

8:45 – 9:45 A.M.
CONCURRENT SESSION 7

8:45 – 9:45 PISB 104
It's Prime Time to Shift IE to EE Planning and Assessment – but How?
Mary Ann Carroll, SUNY Herkimer County Community College
Jacqueline Snyder, SUNY Fulton Montgomery Community College
An entertaining, practical look at how your institution uses current practices and processes to tell your accreditation story, particularly Standards V and VI. Explore paradigm shifting from Institutional to Educational Effectiveness in terms of planning and documentation for all college units. The interactive session uses TV storytelling to help participants connect essential elements of IE/EE planning toward meeting students' needs and expected outcomes. Presenters suggest viewing existing structures through a close-up lens, then zooming out to the big picture to re-brand your current IE Plan. Whether your IE story unfolds like a dreaded sit-com, docudrama, or baffling mystery, an assessment leader can direct the institution's characters toward a new adventure in plot development. Take the leading role to script your effectiveness story from the student's POV. Adjust planning to the plot twists faced in new accreditation standards, using practical constructs and tools for achieving the epilogue your campus is targeting.
LEARNING OUTCOMES:
1. Participants will be able to identify and fill gaps in institutional processes and practices by connecting the essential elements of IE to EE planning.
2. Participants will be able to use practical examples and models to help create integrated planning and practices that meet standards for Educational Effectiveness.
Audience: Advanced

8:45 – 9:45 PISB 106
Curriculum Maps: Who Starts a Trip Without a Map?
Alaina Walton and Anita Rudman, Rowan College at Burlington County
Our session will present curriculum maps as the foundation for successful institution-wide assessment, including linking outcomes to general education requirements, to divisional budgets, and to the strategic plan. In addition, attendees will have the opportunity to practice creating and revising program curriculum maps. It is imperative that institutions have a solid foundation as they attempt to organize and link the massive amount of assessment data that is collected on an annual basis. Sound curriculum maps offer this foundation and move the institution forward in the cycle of continuous improvement. The opportunity to actually build a program curriculum map and see how that map can link to so many areas of the institution will prove invaluable as the attendees return to their own institutions to implement this foundational change in assessment planning.
LEARNING OUTCOMES:
1. Attendees will be able to recognize the importance of creating and maintaining curriculum maps to further institution-wide assessment.
2. Attendees will be able to design a program curriculum map that links program outcomes to various institutional outcomes.
Audience: Beginner

8:45 – 9:45 PISB 108
Using Course Evaluations to Better Understand What Your Academic Program is Messaging to Your Students
Beverly Schneller and Larry Wacholtz, Belmont University
The session will concentrate on how we adopted the Blue course evaluation software from eXplorance and what we did to tailor it to provide information on student learning outcomes that could be used for course and program assessment purposes. We made it into a learning tool for faculty and colleges. The content will show faculty and assessment leaders how to create a narrative about what your program is really telling students, based on how the students respond to the course evaluation prompts and how they score the questions. Using Blue's top five strengths and top five weaknesses creates an opportunity to dialogue within the department and also to inform professional development activities that will enable faculty members to connect more readily with the learning needs of their students. Using course evaluations is systemic to the university. Thinking of them as an appreciative assessment tool may move them beyond the Rate My Professor mentality to a broader use as a learning tool for the campus as a whole.
LEARNING OUTCOMES:
1. Participants will learn about a model for starting a conversation about what our course evaluations are telling students we value in their learning.
2. Participants will be able to consider their own needs in answering the question of how they might better use information collected and stored about teaching and learning outside of the tenure and promotion process.
Audience: Intermediate


8:45 – 9:45 PEARLSTEIN 101
Mission Impossible and Other Assessment Tales: Snapshots
Joanna Campbell, Maureen Ellis-Davis, Gail Fernandez, Ilene Kleinman, Melissa Krieger, Amarjit Kaur and Jill Rivera, Bergen Community College
This session will focus on a collaborative endeavor between faculty, staff and administration to create and sustain a culture of continuous improvement. Through "assessment tales," components of institution-wide assessment will be examined. Practical strategies for sharing assessment data with stakeholders and for facilitating faculty buy-in will also be presented. Effective assessment is complex but need not be complicated. Institution-wide assessment can be broken up into manageable components. This presentation will highlight the essential components of effective assessment design and target strategies to overcome obstacles that arise as the assessment process unfolds. Attendees will explore program assessment, from coursework design to the development of assessment tools. Information provided will assist attendees in creating assessment processes at their own institutions. Challenges such as addressing faculty resistance, demonstrating accountability to stakeholders, and faculty-administration collaboration will also be discussed. As a group of seven presenters, we have been inspired to create seven five-minute snapshots, and are requesting our own snapshot presentation session.
LEARNING OUTCOMES:
1. Attendees will be able to identify the components of an effective, sustainable assessment process.
2. Attendees will be able to align stakeholder expectations to an institution's priorities.
Audience: Beginner/Intermediate

8:45 – 9:45 PEARLSTEIN 102
Developing Sustainable General Education Assessment: The Example of Oral Communication Assessment at St. Lawrence
Valerie Lehr, Christine Zimmerman and Kirk Fuoss, St. Lawrence University
Using the example of oral communication, we will highlight a method for conducting general education assessment that: a) is sustainable; b) is applicable to learning goals satisfied in multiple courses across the curriculum; c) minimizes workload by being course-embedded; and d) enables us to look at student growth and "close the loop." Many colleges have learning goals addressed in multiple courses at all levels. Assessing learning across the curriculum, however, can be difficult because assignments must be meaningful and comparative, faculty must be engaged in assessment of general education goals, and, in areas where students are not growing, there needs to be expertise to address the gaps. To the extent that conducting meaningful education with a feedback loop is difficult in relation to some general education goals, our session will help those charged with developing models to do so, as well as provide an opportunity for others doing this work successfully to discuss their approaches.
LEARNING OUTCOMES:
1. Participants will be able to think through the components of a successful general education assessment project and how to develop such an approach in a way that is efficient as well as effective.
2. Participants will gain examples of how one might "close the loop" with a large project that touches many departments.
Audience: Intermediate

8:45 – 9:45 GERRI C. LEBOW HALL, 109
Beyond the Classroom: A Collaborative Pilot to Unify Learning Assessment Across Six Academic Support Units
Jocelyn Manigo and Janet Long, Widener University
This session explores a pilot effort to unify assessment practices for six administrative units within the department of Academic Support Services: Career Services, Counseling, Disabilities Services, Exploratory Studies, Student Success and Retention, and Tutoring. Presenters will highlight a year-long, collaborative journey to design, implement and analyze the assessment of student learning outcomes around the theme of critical thinking. Given the highly varied nature of administrative units in higher education, the task of creating a unified assessment process can seem daunting and overly complex. Educational leaders may require a starting point as well as strategies to develop a common language and to align thinking. Through interactive conversation and activities, attendees will gain practical strategies and materials to initiate a similar assessment model within their institution. This session will especially benefit those seeking a more uniform framework to assess outcomes across disparate administrative units in higher education.
LEARNING OUTCOMES:
1. Attendees will gain straightforward strategies and process steps to initiate, align, and implement an assessment program for co-curricular units that support student learning and development outside of the classroom.
2. Attendees will gain strategies to apply common themes and consistent language to learning outcomes among administrative departments with diverse missions and functions.
Audience: Beginner

8:45 – 9:45 GERRI C. LEBOW HALL, 209
Many Questions, Multiple Methods: Assessing Technological Literacy and Course Design in a Modular Team-Taught Course
Dana Dawson, Temple University
Assessing technological literacy is challenging, as it addresses both technical and conceptual competencies. Even more challenging is assessing technological literacy in the context of piloting a modular, team-taught course. This presentation will review models for thinking about technological literacy from both a course design and an assessment perspective. Technological advances deeply influence our students' experiences, expectations, and career trajectories. It is crucial that higher education institutions grapple with what it means to be technologically literate. The information shared in this presentation stems from a multi-year project to develop, pilot and assess courses that would directly address technological literacy. In this presentation, attendees will be exposed to different models for addressing technological literacy in course design, including the two models we piloted over the past year. We will discuss how the learning goals that informed those courses were used to guide a direct assessment of student learning.
LEARNING OUTCOMES:
1. Participants will increase understanding of key dimensions of technological literacy.
2. Participants will increase understanding of the relationship between learning outcomes for a course and tools to assess student learning.
Audience: Intermediate


8:45 – 9:45 GERRI C. LEBOW HALL, 108
Best Practices in Assessment: A Story of Online Course Design and Evaluation
Gulbin Ozcan-Deniz, Philadelphia University
This session will demonstrate how online assessment can successfully be achieved by the use of traditional and not-so-traditional course activities. The impact of course design and the use of Blackboard as the Learning Management System (LMS) will be discussed. The challenge in assessing online student learning is obvious, and this presentation will focus on best practices of online assessment to help future online educators. Formative and summative assessment techniques will be shared with the audience through specific examples. The application of Bloom's Taxonomy will be discussed to set best practices in online assessment. Upon completion of this session, attendees will get the basics of online course assessment and will be equipped to identify their own best practices in online assessment by using an LMS. This will help the attendees to improve their teaching-assessment-feedback loop in online courses with a current technology (i.e., Blackboard).
LEARNING OUTCOMES:
1. Participants will gain a working knowledge of online course assessment.
2. Participants will be able to identify best practices in online assessment for their own courses.
Audience: Beginner

9:45 – 10:00 A.M.
BREAK
Refreshments Available

10 – 11 A.M.
CONCURRENT SESSION 8

10:00 – 11:00 PISB 104
Expanding and Developing Assessment Practices to Include Administrative, Educational, and Student Support (AES) Units
Christopher Shults, Erika Carlson and Marjorie Dorime-Williams, Borough of Manhattan Community College
The Middle States Commission on Higher Education now requires that student learning and the environment that supports student learning are assessed. Many institutions struggle to develop appropriate assessment methodologies for AES units. We will focus on the development of mission, goals, student learning, and support outcomes for AES units. The session will highlight the importance of assessing AES units, how to choose appropriate assessment methods, and how to make use of results. The audience will be able to leave with a framework for AES assessment that explains how it all fits together. Participants will be able to develop an understanding of the role of student learning and support outcomes in AES units; create student learning outcomes, support outcomes, a mission, and goals for AES units; and finally, evaluate current assessment practices in AES units and implement valid assessment methodologies as an alternative.
LEARNING OUTCOMES:
1. Participants will develop an understanding of the role of student learning outcomes and support outcomes in AES units.
2. Participants will evaluate current assessment practices in AES units and implement valid assessment methodologies as an alternative.
Audience: Advanced

10:00 – 11:00 PISB 106
Creating a General Education Capstone: Assessing Institutional Outcomes through General Education
Jenai Grigg and Gina MacKenzie, Holy Family University
This session will focus on the development, implementation and assessment of a General Education Capstone Course that serves as a place for institutional outcomes assessment and summative engagement with our university's mission. This content focuses on the conference's sub-theme, Institutional Change and Assessment, to show how bold change can reinvigorate an institution's approach to the nature of liberal arts education and assessment. This content will serve as an example for other institutions struggling to find mechanics for institutional assessment, engagement with general education and/or mission focus.
LEARNING OUTCOMES:
1. Participants will analyze current institutional outcomes and their relationship to programs and courses.
2. Participants will develop mechanisms for assessment of institutional outcomes within general education.
Audience: Intermediate

10:00 – 11:00 PISB 108
Assessing Information Literacy for Community College Students: Faculty and Librarian Collaboration Leads to Student Improvement
Janis Wilson Seeley and Graceann Platukus, Luzerne County Community College
Through faculty and librarian collaboration, the college piloted a program that embedded information literacy into general education as a core competency. The pilot program consisted of pre- and post-testing students' competency while they used ProQuest's Research Companion, an information literacy learning tool, throughout the semester. Information literacy, the ability to find, evaluate, and use information effectively, is the foundation of critical thinking. Buzzwords such as "fake news" and "alternative facts" have brought information literacy to the forefront of American politics, conversations about control and dissemination of information, and how its misuse can be detrimental. Our study shows that Research Companion is an effective tool for improving information literacy. Attendees will learn about the project's evolution, data collection, assessment, and ultimately, how it improved students' competency. Attendees will find insight on how to develop collaborations of their own to assess these critical skills.
LEARNING OUTCOMES:
1. Attendees will be able to recognize ACRL's information literacy framework and explain its importance as a means of student assessment.
2. Attendees will be able to implement similar collaboration efforts at their own institutions.
Audience: Beginner


10:00 – 11:00 PEARLSTEIN 101
Community-building through Assessment Design: Reframing Disciplinary Student Outcomes as Inquiry-based
Brad Knight, American University
American University recently approved a Core Curriculum emphasizing metacognition and habits of mind, replacing a traditional distribution requirement. This session describes the iterative stages of learning outcome development undertaken by faculty and provides insights from lessons learned as we strove to shift from content-centered outcomes to modes of inquiry. A ground-up, faculty-driven process over many months allowed us to reach consensus on 3-4 student learning outcomes that wouldn't be about disciplinary instruction, the way courses for a major function in a curriculum. Instead, content would be used as the vehicle to deliver the instruction in the mode of inquiry. During this session, we will share how we borrowed from the tools of design thinking to engage more than 75 faculty in our working groups. A course proposal process that grapples with the competing interests of openness and selectivity will also be described, and sample rubrics discussed.
LEARNING OUTCOMES:
1. Attendees will be able to adapt, to their specific context, the steps involved in one approach to developing faculty-driven student learning outcomes.
2. Attendees will be able to select productive prompts for use with faculty when developing inquiry-based learning outcomes.
Audience: Beginner

10:00 – 11:00 PEARLSTEIN 102
Data-driven Conversations to Make a Difference in Campus-wide General Education
Mindi Miller, Molly Hupcey Marnella and Bob Heckrote, Bloomsburg University of Pennsylvania
Assessing General Education (GE) can be challenging when analyzing evidence across disciplines. Presentation examples explain the use of rubric guidelines to review aggregate data and learning outcomes. Campus-wide targets for student achievement (like information literacy, communication, and healthy living) can focus on GE outcomes beyond a specialized topic or major. The session activity uses sample outcomes to discuss possible institutional recommendations and improvements. Accreditation standards and the goals of higher education require an evaluation process for obtaining evidence of mission and value attainment. With practical assessment guidelines, data from interdisciplinary GE courses help to document and provide evidence of student achievements. Academic and co-curricular GE results may be analyzed with the same benchmark-to-capstone GE criteria. This data alone, however, is insufficient without department, division, and institutional discussions. In addition, comparisons of GE trends with other outcome data are needed, such as post-graduation surveys and conversations with alumni and employers.
LEARNING OUTCOMES:
1. Participants will identify the connection between rubrics, aggregate data, outcomes, and targets.
2. Participants will analyze campus-wide GE improvements and further recommendations based on GE assessment data and post-graduation evaluation.
Audience: Beginner

10:00 – 11:00 GERRI C. LEBOW HALL, 109
Should You Take Student Surveys Seriously?
Zvi Goldman, Jeremi Bauer, Susan Lapine and Chris Szpryngel, Post University
Internal, end-of-class student surveys at Post University provide useful student feedback on their perceptions of course content, instructor performance, success factors and overall student satisfaction. The immediate benefit of these surveys, providing instructors and course developers with the necessary feedback to continuously improve the student academic experience, is obvious. However, there are much deeper applications and value to the student survey once student responses are trended over time and modeled into performance indicators. This presentation will share Post's survey methodology, integrated approach to result validation, and some specific higher-value applications that have been derived from the survey outcomes.
LEARNING OUTCOMES:
1. Participants will learn about Post's survey methodology and integrated approach to results validation.
2. Participants will learn about some specific higher-value applications that have been derived from the survey outcomes.
Audience: Intermediate

10:00 – 11:00 GERRI C. LEBOW HALL, 209
Skipping Stones or Making Splashes: Embedding Effective Assessment Practice into Faculty Repertoire
Dana Scott, Philadelphia University
This hands-on session will present how a community-based model of assessment can be effective on multiple levels of faculty development, from large-scale, university-wide presentations, to smaller group workshops, down to one-on-one sessions with assessment advocates. This model will demonstrate how to give faculty a big-picture overview of best practices, how to hone in on areas that need improvement, and how to use one-on-one training and feedback to improve specific assessment needs. The session will cover faculty development from a macro to a micro level. Participants will learn to use strategic design tools to identify key issues and insights in their own assessment procedures. They can continue to use these tools to help generate ideas for impactful faculty development.
LEARNING OUTCOMES:
1. Participants will learn a process for creating a holistic system of faculty development, including strategies to organize larger presentations and systems for creating a community of assessment for implementation on multiple levels.
2. Participants will examine key issues and insights in their assessment procedures, categorizing areas of need, potential, and accomplishment to generate ideas for impactful faculty development.
Audience: Intermediate

11:15 A.M. – 12:00 P.M.
CLOSING REMARKS


Acknowledgment

In its 2012 decennial visit and subsequent report, the Middle States Commission on Higher Education stated that Drexel University "...is a remarkable institution that has proven itself by meeting the multifaceted challenges of the past, truly challenging decade. Drexel is now poised not merely to succeed, but to lead. The moment is now Drexel's to seize." Against the backdrop of the MSCHE report, and the last three years of conference successes, the charge given to the planning committee for "Facilitating Conversations that Matter" was to produce another remarkable conference that would be a definitive, affirmative and authentic response to that challenge. Mindful of that charge, the committee sought to create a fourth conference that would be truly unique, restorative and beneficial to all who attend. We hope that we have succeeded in doing that and that our attendees will enjoy an enriching and thought-provoking professional development experience. Moreover, the conference has extended its global footprint, having now hosted both attendees and presenters from over 16 foreign countries since its inception. No undertaking such as this can be accomplished without combining the talents and gifts of many into a single and effective product. In that spirit, we wish to thank each of the members of the planning committee listed below for their ideas, input, enthusiasm and tireless efforts toward making Drexel's fourth venture as host of a regional conference a success. Their dedication and commitment have always been in evidence, and their contributions toward helping us meet this goal have been abundant and productive. Thank you.

Drexel University's Annual Regional Assessment Conference
Drexel University Planning Committee

N. John DiNardo Christopher Gibson Ray Lum

Stephen DiPietro Rodrigo Gibson Danuta Nitecki

Lawrence Duke Mark Green Rob Rasberry

William Ezell Teresa Harrison Michael Scheuermann

Jenell Fritz Joseph Hawk Gaeson Taylor

Lora Furman Kristin Imhoff Christopher Weir

Shuttle Schedule

Wednesday, September 13
Wednesday Morning Shuttle Service (Sonesta pick up » PISB drop off)
Pre-Conference Workshops start at 9:00am; conference begins at 12:45pm in PISB.
8:00 » 8:30 | 8:30 » 9:00 | 11:00 » 11:30 | 11:30 » 12:00 | 12:00 » 12:30 | 12:30 » 1:00 | 1:00 » 1:30
Wednesday Afternoon Shuttle Service (PISB pick up » Sonesta drop off)
Last session ends at 4:30pm; Ice Cream Social 4:45pm to 5:30pm.
4:00PM-5:30PM Continuous Loop
Wednesday Night (PISB pick up » Citizens Bank Park drop off)
Night out at the Phillies game: game starts at 7:05.
5:30PM-7:00PM Continuous Loop
2 pick-ups at Citizens Bank Park to Sonesta/PISB: 9:00 » 9:30 and 10:00 » 10:30

Thursday, September 14
Thursday Morning Shuttle Service (Sonesta pick up » PISB drop off)
Breakfast will be from 7:30 - 8:30am; conference begins at 8:45 AM in PISB.
7:00 » 7:30 | 7:30 » 8:00 | 8:00 » 8:30 | 8:30 » 9:00 | 9:00 » 9:30
Thursday Afternoon Shuttle Service (PISB pick up » Sonesta and Pyramid Club drop off)
Last session ends at 4:15pm; shuttle goes to the Sonesta and the Pyramid Club for the 5:30pm reception start.
4:30PM-6:00PM Continuous Loop
Thursday Evening Shuttle Service (Pyramid Club pick up » PISB drop off)
Reception ends at 7:30pm; shuttle goes to Drexel.
7:30 » 8:00 | 8:00 » 8:30

Friday, September 15
Friday Morning Shuttle Service (Sonesta pick up » PISB drop off)
Continental Breakfast 7:30 - 8:30am; conference begins at 8:45 AM.
7:00 » 7:30 | 7:30 » 8:00 | 8:00 » 8:30 | 8:30 » 9:00
Friday Afternoon Shuttle Service (PISB pick up » Sonesta drop off)
Conference ends at 12:00pm.
11:00 » 11:30 | 11:30 » 12:00 | 12:00 » 12:30 | 12:20 » 1:00

What’s the construction all about? The Korman Quad renovation is to include a new entryway for the Korman Center and redesign of the park space outside to include permeable pavers to recharge ground water, structural soils to reduce compaction, 35 new shade trees, 44 understory trees, 200 shrubs, perennial flowers and lawn space—all designed to create a stronger connection to nature.

Drexel University's Fourth Annual Assessment Conference
September 13-15, 2017
Academic Quality: Driving Assessment and Accreditation
Drexel University is pleased to acknowledge the generous contribution of our sponsors, who were instrumental in bringing together many of the most knowledgeable university professionals of exceptional scholarship from overseas, across the nation and throughout the region to take a fresh look at the myriad of ways in which academic quality drives assessment and accreditation, and how the conversations that we have around that issue matter. This conference will be three days of pre-conference workshops, 56 interactive sessions, snapshot sessions and plenaries by accomplished speakers of national and international reputation. In addition to our sponsors making it possible for our participants to explore cutting-edge practices and issues related to teaching and learning, we are also grateful to them for helping to provide countless opportunities for networking and socializing with colleagues.

Assessment Improved
AEFIS
AEFIS offers the complete solution for the assessment of learning and continuous improvement on your campus. Its innovative platform enables easy automation of individual student assessment data, facilitates curriculum review, streamlines campus-wide accreditation processes, and helps to achieve your strategic educational goals.
Caitlin Meehan, Vice President, Operations
1429 Walnut Street, 10th Floor, Philadelphia, PA 19102
877.674.3122 x2032
[email protected]
aefis.com
Our Presentation: September 14, 9:45 - 10:15 AM, PISB 105

CAMPUS LABS
Campus Labs is a leading provider of campus-wide assessment technology for higher education. Our products give colleges and universities the tools they need to maximize institutional effectiveness and student success, empowering them to collect and report on data for learning outcomes assessment, strategic planning, and accreditation.
Michael J Weisman, Vice President, Campus Labs
716.270.000 x7537
[email protected]
campuslabs.com

EXPLORANCE
What is our approach to installing a Culture of Continuous Improvement? Having the right technology and a validated process are key parts of a lasting solution. At eXplorance®, we believe that people come first and that human ingenuity and collaboration drive technology. This belief is the foundation of our approach and solutions, which bring people, processes, and technology together. Using a continuous improvement cycle to accelerate progress, we work closely with our clients, adding value at every stage.
1470 Rue Peel, Suite 500, Montreal, QC, Canada H3A 1T1
1.403.651.0412
explorance.com
Our Presentation: September 14, 11:15 - 11:45 AM, PISB 105

EDUCATIONAL TESTING SERVICE
At nonprofit ETS, we advance quality and equity in education for people worldwide by creating high-quality assessments based on rigorous research. Institutions of higher education rely on ETS to help them demonstrate student learning outcomes and promote student success and institutional effectiveness. Visit us at www.ets.org/highered.
Christine MacKrell
660 Rosedale Road, Princeton, NJ 08541
[email protected]
ets.org
Our Presentation: September 14, 3:00 - 3:30 PM, PISB 105

conference-program-2017.crw2.indd 37 9/1/17 4:50 PM GEIGER IDEA Geiger is the largest family-owned and managed distributor of IDEA is a national nonprofit organization dedicated to improving promotional products in the United States. Promotional products are teaching, learning, and leadership at colleges and universities. AEFIS proven to help you connect with your customers, prospects, and Through innovative, research-based assessment instruments and employees. Geiger takes a consultative approach to your current professional development tools, IDEA offers comprehensive faculty marketing or promotion challenge. No matter the need, a Geiger and leadership feedback and provides customized insight to help Promotional Consultant will work with you to create and implement guide personal and programmatic reflection and growth. Our high-impact programs to get your desired results - cost effectively unparalleled support of practitioners engaged in the advancement and responsibly. At Geiger, we combine ideas with creativity to of teaching and learning in higher education is reflected through the create Brandspiration! Find out how promotional products can help publication of peer-reviewed articles and original research relevant your institution/organization find solutions. to the field, as well as through our IDEA Impact grant making program. Committed to continuous improvement since 1975, IDEA Michael Marolla leads the way in providing solutions designed to help our partners Geiger/Allsbrook assess and improve learning holistically. 105 East 4th Avenue Ken Ryalls Conshohocken, Pa 19428 President Our 610.825.0880 Presentation 301 South Fourth Street, Suite 200 [email protected] September 14 Manhattan, KS 66502 10:30 - 11:00 AM geiger.com 800.255.2757 PISB 105 [email protected] ideaedu.org

STYLUS
Stylus Publishing is a leading publisher of books on higher education, and markets and distributes throughout the Americas the lists of a number of independent publishers and leading NGOs and research institutions, bringing you the latest work in the fields of agriculture, business, economic development, education, electronic engineering, health, human rights, the natural sciences, and sustainability.
Stylus Publishing, LLC
22883 Quicksilver Drive, Sterling, VA 20166
703.661.1504
[email protected]
styluspub.presswarehouse.com

VERIFICIENT
Verificient Technologies is a SaaS firm specializing in continuous identity verification through its patented cloud-based technologies. Serving over 1.6 million assessments, the Proctortrack application uses biometrics, computer visioning, and machine learning to offer the world's premier automated remote online proctoring solution. Proctortrack has achieved a seamless integration into all major LMS platforms, including Canvas, Blackboard, Moodle, Desire2Learn, eCollege, and Sakai. Verificient works with institutions in Higher Education and K-12, as well as corporate clients, who rely on Proctortrack to ensure the integrity of their online credentials for their high-stakes assessments. For more information visit www.verificient.com and www.proctortrack.com. For a demonstration or to discuss how Proctortrack can fit your institutional needs, please contact us using the information below.
Rahul Siddharth, Verificient Technologies Inc.
245 W 29th St, Suite 300, New York, NY 10001
212.285.3111
[email protected]
verificient.com
Our Presentation: September 14, 2:00 - 2:30 PM, PISB 105

TEMPLE UNIVERSITY
Temple University is a public, four-year research university in Philadelphia and a global leader in education, research and healthcare.
Gina Calzaferri
1801 N. Broad Street, Philadelphia, PA 19122
215.201.7000
[email protected]
temple.edu

Assessment Improved.

AEFIS partners with colleges and universities who want to improve! Our integrated assessment platform provides solutions that work together to enhance curriculum while engaging students and faculty.

Course Curriculum Mapping + Syllabus Management + Outcomes Alignment

Course Evaluation Outcomes Assessment + Feedback + Evidence Collection

Faculty Activity Outcomes Transcript + Curriculum Vitae + Competency Portfolio

Strategic Planning Self Study + Data Collection + Accreditation Reporting
Power Solutions for Assessment Success

AEFIS is the cloud-based, integrated assessment platform and all our solutions are included in your subscription.

AEFIS strongly believes in interoperability through standards and plays nicely with other systems on your campus, including your LMS (Canvas, Blackboard, Brightspace) and SIS (Ellucian | Banner, Oracle | Peoplesoft).

Join our growing list of partners, including UW-Madison, Drexel University, UC-Irvine, University of Rochester, UT-Dallas, Rutgers University, University of Iowa, UPenn and Temple University.

How can AEFIS help you? Let's Discuss @ www.aefis.com

Assessment, Evaluation, Feedback and Intervention System (AEFIS)

Copyright © 2017 AEFIS, LLC. All rights reserved. v.20.20170811

UNSTOPPABLE MOMENTUM

R1: HIGHEST RESEARCH ACTIVITY (Carnegie Classification of Institutions of Higher Education)

#18: LEADER IN GOOGLE CITATIONS (Ranked 18th worldwide by Google Scholar)

#1: BEST ONLINE MBA PROGRAM (U.S. News & World Report)

#2: TRIAL ADVOCACY (U.S. News & World Report's Best Law Schools)

TEMPLE.EDU

Accreditations & credentials are MEANINGLESS without verification. How will your institution pass accreditation standards for verifying Distance Learners? Are you prepared for potential Higher Education Act (HEA), 20 U.S.C. §1099c, Title IV program reviews by the DOE?

Proctortrack delivers on online identity verification & DOE Title IV Compliance.

The Office of Inspector General (OIG) of the US Department of Education estimates that between 2009 and 2012 student aid fraud increased by 82%. 85,000 students may have illegally received more than $197 million in federal student aid. Distance Learning is the area in which large scale fraud is most common.

In February 2014, the OIG issued a report: Title IV of the Higher Education Act Programs: Additional Safeguards Needed To Help Mitigate The Risks That Are Unique To The Distance Education Environment.

In order for higher educational institutions to continue to gain access to Federal Pell Grants for their students, particularly in online education programs:
1. Institutions must verify students' identity.
2. Institutions must determine students' academic attendance related to an activity.
3. Institutions must maintain sufficient evidence of students' academic attendance.

The OIG conducted a random audit of 10 schools and highlighted violations including:
A. Students receiving Pell Grants that did not attend the classes.
B. Students receiving Pell Grants after the institutions disbursed funds without evidence of the students' academic attendance before the disbursements.
C. Students indicated a change in enrollment status based on the students' lack of academic attendance in 1 or more classes.

Proctortrack* is Proctorless™ and is the only automated remote proctoring solution that continuously verifies the identity of online test takers and delivers on Title IV compliance.
1. Proctortrack verifies students' identity by multi-form factor identity verification.
2. Proctortrack verifies students' academic attendance beyond login credentials.
3. Proctortrack maintains full evidence of academic attendance through record keeping of students' online participation.
* US Patent No. 8,926,335, "System and Method for Remote Test Administration and Monitoring"
Let us show you how Proctortrack with Sakai can help your institution satisfy your college accreditation requirements.
Contact Us
245 West 29th Street, Suite 300, New York, NY 10001
Office 212-285-3111
Email [email protected]
Website www.verificient.com

Improve Learning Outcomes and Institutional Performance

COMING SOON
The Analytics Revolution in Higher Education
Edited by Jonathan S. Gagliardi, Amelia Parnell, and Julia Carpenter-Hubin
Paper, $35.00 | eBook, $27.99
Coming March 2018

NEW
Degrees That Matter: Moving Higher Education to a Learning Systems Paradigm
Natasha A. Jankowski and David W. Marshall
Paper, $35.00 | eBook, $23.99

Meaningful and Manageable Program Assessment: A How-To Guide for Higher Education Faculty
Laura J. Massa and Margaret Kasimatis
Paper, $29.95 | eBook, $23.99
Coming October 2017

Enhancing Assessment in Higher Education: Putting Psychometrics to Work
Edited by Tammie Cumming and M. David Miller; Foreword by Michael J. Kolen
Hardcover, $35.00 | eBook, $27.99

High-Impact ePortfolio Practice: A Catalyst for Student, Faculty, and Institutional Learning
Bret Eynon and Laura M. Gambino; Foreword by George D. Kuh
Paper, $35.00 | eBook, $27.99

Real-Time Student Assessment: Meeting the Imperative for Improved Time to Degree, Closing the Opportunity Gap, and Assuring Student Competencies for 21st-Century Needs
Peggy L. Maki; Foreword by George D. Kuh
Paper, $29.95 | eBook, $23.99

Connecting the Dots: Developing Student Learning Outcomes and Outcomes-Based Assessment, Second Edition
Ronald S. Carriveau
Paper, $24.95 | eBook, $19.99
Workbook and Scoring Instructions: Paper, $15.00 | eBook, $19.99
Facilitator's Materials: eBook, $19.99

Excellence in Higher Education Guide: A Framework for the Design, Assessment, and Continuing Improvement of Institutions, Departments, and Programs, Eighth Edition
Brent D. Ruben
Paper, $39.95 | eBook, $31.99

20% OFF for all Drexel Assessment attendees. Use code DREX17 at www.styluspub.com. Offer expires October 15, 2017.

Connect with Stylus Online! @StylusPub

Compare Before You Buy—Ordering Direct May Save You Money and Supports Independent Publishing TO ORDER: CALL 1-800-232-0223 FAX 703-661-1501 E-MAIL [email protected] WEBSITE www.Styluspub.com

Visit us in the PISB Atrium! Booth #3. Or view our presentation on Bluepulse at 11:15am on Thursday, Sept 14th, in the Exhibitor Forum Area.

End-Of-Term Course Evaluations

Blue, our course evaluation solution, is comprehensive and intuitive, and is leveraged by Boston College, University of Louisville, Northwestern University, Johnston Community College, Belmont University and 300+ more institutions worldwide.

Blue is built for growth and institutional situations like malleable faculty reports, cross-listed courses, question branching, team taught courses, mobile form fill-out, and interpreting open-ended feedback with Blue Text Analytics.

Visit www.explorance.com/course-evaluations for more information!

Live Formative Feedback

Bluepulse enables institutions to easily gather student feedback about their learning experience, implement peer reviews and competency assessments for faculty development, and measure student satisfaction on service areas.

Engage students, instructors and other key institutional stakeholders by instilling a culture of continuous improvement through feedback.

Visit www.explorance.com/bluepulse for more information!

Interested in learning more?

Visit www.explorance.com to sign up for a software walkthrough or contact our technical team at +1.877.938.2111 or [email protected]. We look forward to hearing from you!

www.explorance.com

NOTES
