
ANNUAL ASSESSMENT REPORT

2002-2003

SEPTEMBER 2003

DANVILLE AREA COMMUNITY COLLEGE

Prepared by

Randall P. Fletcher, Dean of Institutional Effectiveness & Academic Support Services

Section I. Overview
  I. a. Defining Assessment

Section II. Implementing Assessment
  II. a. Creation of a Culture of Assessment at DACC
  II. b. Assessment Planning
  II. c. Major Assessment Activities

Section III. Educational Improvements Made/Actions Taken
  III. a. Classroom Level
  III. b. Program Level

Section IV. Conclusion

Appendices
  1. Course & Program Level Assessment Schedule 2003-2004
  2. Examples of course-level assessment plans
  3. Common Institutional Indicators of Effectiveness & FY 2003 Results
  4. Mission-Specific Indicators of Effectiveness & FY 2003 Results
  5. FY 2003 Graduate Exit Survey results
  6. Examples of Program Level Assessment Plans

SECTION I. OVERVIEW

I. a. Defining Assessment

Since the 1999 North Central Association (NCA) site visit, Danville Area Community College has been committed to establishing a culture of assessment within all departments and divisions of the institution. This focused effort not only pursues its primary goal of developing an infrastructure for student learning assessment, but also provides a foundation for measuring the overall effectiveness of the college. The assessment initiative at DACC has been supported at all levels of the college, from the participation of faculty and staff on the newly created Assessment Committee to the monetary support of the Board of Trustees.

In order to fully understand the importance and overall impact assessment has had on Danville Area Community College over the past five years, the term “assessment” must be conceptualized, or defined, as it applies to an institution of higher learning. While assessment may be defined in many ways, the official definition comes from the American Association for Higher Education (AAHE) and is posited by Tom Angelo:

“Assessment is an ongoing process aimed at understanding and improving student learning. It involves:

• Making our expectations explicit;
• Setting appropriate criteria and high standards for learning;
• Systematically gathering, analyzing and interpreting evidence to determine how well performance matches those expectations and standards;
• Using the resulting information to document, explain, and improve performance.” – Tom Angelo, AAHE Bulletin, May 1999.

Assessment in an educational setting can be further understood in terms of tangible action statements. This notion of assessment as an agent of action or change is best represented in the Committee on Undergraduate Program Review definition:

“Assessment is . . . systematically exploring and gathering evidence of:

• What we are trying to do, and why? (Or: What is my program supposed to accomplish?)
• How well are we doing it?
• How do we know?
• How do we use the information to improve or celebrate successes?
• Do the improvements we make work?” – Adapted from CUPR Guidelines

By offering explicit definitions for assessment, the college and all of its stakeholder groups can strive not only to collect data and information, but also to use the data in ways that directly inform decision-making processes and ultimately move the college forward.

SECTION II. IMPLEMENTING ASSESSMENT

II. a. Creating a “Culture of Assessment” at DACC

Assessment Committee formation – As a result of the development and subsequent organization of the NCA Response, the DACC Assessment Committee was established in the fall of 2001. Implemented within the framework of the College’s Continuous Quality Improvement (CQI) system, the Assessment Committee became a subcommittee of the CQI Academic Affairs team. The Assessment Committee is made up of the following DACC personnel:

• two faculty members from each of the academic divisions,
• one nursing faculty member,
• two members of Student Services,
• two members of the Adult Education and Basic Skills divisions,
• one faculty member from the library,
• three academic deans,
• two members of the Administrative Council, and
• two students.

The Assessment Committee is chaired by the Dean of Institutional Effectiveness & Academic Support Services. Membership on the Assessment Committee by full-time faculty is a two-year commitment; each faculty member is selected by his or her division each spring when a term expires.

Assessment Committee 2003-2004

Kathie Armstrong – Student, Phi Kappa Theta
Linda Berg – Instructor, Nursing
Glenda Boling – Instructor, Speech
Jane Brown – Professor, Information Systems
Belinda Dalton – Dean, Student Services
Chris Denton – Student Trustee
Viv Dudley – Professor, Marketing
Lori Garrett – Instructor, Biology
Jeff Hutton – Instructor, Technology
Dave Kietzmann – Vice President, Instruction & Student Services
Ruth Lindemann – Instructional Services Librarian
Penny McConnell – Coordinator, Student Support Services
Bruce Rape – Dean, Business & Technology
Eric Rayburn – Instructor, Mathematics
Janet Redenbaugh – Dean, Mathematics & Science
Eric Simonson – Instructor, Music
Lily Siu – Dean, Liberal Arts
Tom Szott – Director, Adult Education
Marie Vanada – Professor, Developmental Education

Randy Fletcher (Chairman) – Dean, Institutional Effectiveness & Academic Support Services

The role of the Assessment Committee – The Assessment Committee has been charged with facilitating the continuous improvement of the outcomes assessment initiative at Danville Area Community College. The primary reason for establishing the committee was to create a comprehensive outcomes assessment system for tracking and documenting student learning and achievement. The committee has begun to study the components of all of the College’s assessment measures (e.g., WorkKeys, CAAP, the Exit Writing Exam) and of classroom- and program-level assessment activities, to design training to help faculty learn more about classroom and program assessment, and to monitor the progress of all program assessment plans that faculty have drafted.

The Committee will also monitor and facilitate the development of all annual assessment activities. Subcommittees have been formed to address: 1) the creation of an Assessment Website that will serve as a resource and information portal for all DACC faculty and staff relative to the DACC Assessment Initiative; 2) the establishment of a series of methods (a multi-method approach) for assessing General Education Outcomes; and 3) key issues of institutional assessment and effectiveness. This third subcommittee will focus its activities around the common institutional indicators and mission-specific core indicators of effectiveness established by the college and submitted annually to the Illinois Board of Higher Education and the Illinois Community College Board.

II. b. Assessment Planning

In terms of assessment planning, the Assessment Committee has identified three primary phases of the Assessment Initiative at DACC.

PHASE I – FY 2003

• Assessment Committee formation
• Course-Level Assessment Plans (full-time faculty)
• Initial Education of Faculty/Staff on Importance of Assessment

PHASE II – FY 2004 (See Appendix 1 for the FY 2004 Assessment Timeline)

• Incorporate Core Indicators of Effectiveness into the Institutional Assessment plan
• Program-Level Assessment Plans (all academic divisions)
• Course-Level Assessment Plans (full-time & part-time faculty)
• Establish General Education Outcomes for all academic programs

PHASE III – FY 2005

• Establish a systematic Institutional Effectiveness cycle that generates data from all divisions and departments of the college.
• Data from this system will be used to develop a report on student learning outcomes, student service functions, and administrative service functions, and ultimately to determine the levels of effectiveness of the college.

II. c. Major Assessment Activities

General Education

In FY 2003, the Assessment Committee outlined a set of rubrics that can be used to measure communication (oral and written), critical thinking, reading comprehension and problem-solving skills in all of the general education courses of the college. The results of the assessment would directly influence placement score assignments as well as modifications made to the existing curriculum. This assessment instrument will also be incorporated into the alternatively delivered general education courses. The establishment of these rubrics will serve as the foundation, or model, for establishing general education assessments for all academic programs (an FY 2004 goal).

Transfer & Career and Technical Programs

In April of 2003, all full-time faculty members were required to develop a course-level assessment plan for one course in the spring 2003 semester. The instructors and professors who make up the transfer and Career and Technical Education divisions of the college submitted plans that identified the following:

• Learning Outcomes – what students were expected to learn from enrolling in the particular course;
• Assessment Tool Design – the primary objective for the assessment plans at the course level is to collect student data that can be analyzed and used to improve learning. Instructors were given the option to develop tools that collected quantitative or qualitative data;
• Results & Conclusion;
• Actions Based on Findings – it was the intent of the Assessment Committee, when the course-level assessment plans were devised, to use the results/findings of the assessments to directly impact classrooms, student learning and, most importantly, the DACC campus community’s understanding of assessment. It is still inconclusive what impact the data from the Spring 2003 course-level assessment plans will have on classes and students in the Fall 2003 semester.

(See Appendix 2 for examples of course-level assessments)

Annual Results Report – Report on Core Indicators

As directed by the Illinois Board of Higher Education (IBHE) in cooperation with the Illinois Community College Board (ICCB), Danville Area Community College must address and provide performance data on six statewide goals of effectiveness on an annual basis. This report is known as the IBHE Annual Results Report and is submitted in August of every year.

After much review, the Assessment Committee has agreed to use the Annual Results Report as the framework from which to create the DACC Institutional Effectiveness system and as the guiding document for producing tangible data to assess institutional performance, outcomes and overall effectiveness. In FY 2004, the Assessment Committee will determine whether new goals will be added to the list of statewide goals of institutional effectiveness. The statewide goals are as follows:

1) Higher education will help Illinois business and industry sustain strong economic growth.
2) Higher education will join elementary and secondary education to improve teaching and learning at all levels.
3) No Illinois citizen will be denied an opportunity for a college education because of financial need.
4) Illinois will increase the number and diversity of citizens completing training and education programs.
5) Illinois colleges and universities will hold students to even higher expectations for learning and will be accountable for the quality of academic programs and the assessment of learning.
6) Illinois colleges and universities will continually improve productivity, cost-effectiveness, and accountability.

With the Annual Results Report, colleges not only address the six statewide goals in relation to activities and initiatives taking place on their respective campuses; community colleges are also asked to show annual progress or performance in relation to six common institutional core indicators of effectiveness and a select number of college-generated, mission-specific core performance indicators. These core indicators, along with benchmark data from the FY 2003 submission, are listed in Appendices 3 & 4. The data generated from the overall Annual Results Report and from the common and mission-specific indicators will be incorporated into the college’s strategic planning processes, budget development process, departmental planning initiatives and the overall decision-making system of the college.

Satisfaction Inventories

Non-returning student survey

DACC is very committed to understanding why students fail to complete programs of study. The non-returning student survey was developed not only to determine student intent upon enrollment but also to uncover the reasons why students stop out of or drop out of the college. The surveys were sent to all students who initially enrolled at the college in FY 1999 and have yet to complete a degree. Results will be a part of the FY 2005 Annual Assessment Report.

DACC Graduate Exit Survey

The thoughts, perceptions and attitudes of DACC graduates are important data that the college uses on a daily basis as new programs are developed, revisions to existing curricula are made, and innovations that will directly impact student learning and student success are explored. The DACC graduate follow-up survey is not a new instrument and has been used in one form or another for many years at the college. In recent years, the survey has been expanded beyond occupational graduates and now reaches all DACC graduates. The survey generates data on student satisfaction with the courses that made up their programs, the services of the college, student intent for attending DACC, transfer institution information and employment status. Results from the FY 2003 graduate survey are contained in Appendix 5.

Faculty/Staff Professional Development

Dr. Nancy McCoy and Donna Martin will represent Danville Area Community College at the Teaching and Learning Excellence Conference, hosted by the Illinois Community College Board in October 2003. McCoy and Martin will offer a presentation on the English 101 Writing Exit Assessment and the overall program assessment plan of the communications department.

In November 2002, twelve members of the Assessment Committee attended the annual Assessment Institute at Indiana University–Purdue University Indianapolis (IUPUI) to receive training from national leaders in outcomes assessment. Members of the committee will attend the 2003 Institute at IUPUI this fall.

Dr. Ruth Lindemann, assessment committee member and instructional services librarian, developed a resource website on assessment issues for the entire campus community.

SECTION III. EDUCATIONAL IMPROVEMENTS MADE/ACTIONS TAKEN

III. a. Classroom Level

In April of 2003, all full-time faculty members were required to develop a course-level assessment plan for two courses in the spring 2003 semester. The instructors and professors who make up the transfer and Career and Technical Education divisions of the college submitted plans that identified the following:

• Learning Outcomes – what students were expected to learn from enrolling in the particular course;
• Assessment Tool Design – the primary objective for the assessment plans at the course level is to collect student data that can be analyzed and used to improve learning. Instructors were given the option to develop tools that collected quantitative or qualitative data;
• Results & Conclusion;
• Actions Based on Findings – it was the intent of the Assessment Committee, when the course-level assessment plans were devised, to use the results/findings of the assessments to directly impact classrooms, student learning and, most importantly, the DACC campus community’s understanding of assessment. It is still inconclusive what impact the data from the Spring 2003 course-level assessment plans will have on classes and students in the Fall 2003 semester.

Examples of Educational Improvements Made/Actions Taken

• ELEC 104 – Industrial Safety: Students are expected to be able to develop and write an OSHA-required safety program. The texts developed by students are then compared against OSHA models. The instructor assesses the quality of the written safety standard using an objective/subjective scoring rubric. Conclusions from the Spring 2003 assessment were that students lacked the research and investigation skills needed to draft appropriate OSHA safety programs. This prompted changes in how instruction is delivered relative to research and writing topics.

• INFO 141 – Microsoft Windows Server 2000: Students were expected to show proficiency in eight specific learning areas of MS Windows Server 2000 applications. Throughout the semester, students were given tests to determine their skill level relative to each learning outcome. Actions that took place as a result of the assessment included developing “hybrid-type” tests that use the MS model tests but incorporate more applied lab time into the exam. The instructor also revised the course outline by adding two additional lab assignments per learning outcome, and plans to refocus the course to become more “hands-on” rather than “text-based” as originally designed.

• BIOL 140 – Microbiology: Students enter this course with varying knowledge of chemistry and biology. The instructor designed an assessment tool to gauge the skill level of the students on the general topic of microbiology. The assessment revealed that the class was made up of students with various backgrounds and interests; students’ skill levels ranged from previous high school biology instruction to several students who had been out of school for an extended time and were not familiar with the subject matter. The actions taken were to include more basic chemistry, cell biology and metabolism in the lectures to increase understanding of the foundational concepts of the course.

• MATH 118 – Introduction to Mathematics: Interpreting basic math language (i.e., symbols) was a primary learning outcome of this introductory mathematics course. The instructor surveyed the students on three occasions throughout the semester to assess proficiency in math language in the areas of math statements, algebra and set theory. Actions based on findings included more repetition with the material after it is presented. It was identified that students were having difficulty later in the semester recalling early-semester material, so a new review process was established. Homework, assignments and review worksheets will be more comprehensive when dealing with math language.

• MUSI 115 – Music Appreciation: Students were polled using a self-assessment survey to determine how learning objectives and student expectations of the course were being met. The survey was administered at the midpoint of the semester. The instructor used the feedback data from the students to alter the course in the following ways: 1) to address the problem of the class not being lively enough, the instructor will add student presentations, with students offering peer reviews of the presentations; 2) note taking (what to write down and when): the instructor will establish new guidelines, and some class time will be devoted to instructing students how to take college-level notes; 3) incorporation of more live performances: the instructor will make it a requirement for students to attend one live performance of classical music and give a class report.

• HIST 152 – U.S. History Since 1865: The instructor used a pretest in the first week of the semester to measure students’ understanding of certain historical events in a specific time frame. Students were then given the same test again as a part of their final exam, to again measure their knowledge and understanding of the events. The action based on the findings was that more supplemental material is needed to reinforce textbook information. The course will also add more web-enhanced supplemental material.

• BMGT 291 – Advanced Marketing: The students were given a reflective paper assessment after they concluded their primary projects for the class. It gathered student feedback both on the project assignments and on the overall class. The instructor used the findings from the reflective feedback paper to develop a timeline for the class projects, in which progress deadlines will be established to ensure that students stay on task with the project assignments. The project assignments will also now have a more detailed overview handout that spells out every detail and expectation of the assignment set.

III. b. Program Level

ENGL 101 – Rhetoric Composition I – Exit Writing Exam

In the establishment of a program-level assessment system, the writing department devised a written exit assessment during the fall 1998 semester. The assessment has evolved over the years, and in FY 2003 it serves as a model for program-level assessments campus-wide. The exit exam and its accompanying questionnaires assess the tangible learning outcomes expected of students who successfully complete English 101 and 121, as clearly outlined in the respective course syllabi.

Instructional changes that have occurred as a result of the program assessment include writing instructors requiring early writing assignments that pinpoint a student’s need for additional assistance (Writer’s Room, one-on-one tutoring, etc.). The primary learning aid that has resulted from the assessment is the development of a self-directed grammar study module. This intensive support service was piloted in the Spring 2003 semester. Students have the option of electronic, Internet, or paper-and-pencil grammar reviews that are administered through the Writer’s Room.

In addition to preparatory services, the Writing Committee that oversees the English assessments has shifted the focus of the Rhetoric 101 course outcomes to persuasion, because the assessment revealed that students had already achieved mastery in “engaging with discourse through summary and commentary.” This change has allowed instruction in this program area to focus on the rhetorical goals outlined in the course syllabi. The Writing Committee contends that this change will give students a better educational foundation for passing the exit exam.

Ongoing discussions of the courses and their respective syllabi will continue at all Writing Committee meetings, as will discussions of appropriate placement scores for students entering both ENGL 101 and ENGL 121.

SPCH 101 – Speech Communications 101

As a way to address general and specific student learning in the Speech Communication Department, the full-time faculty developed a program level assessment in FY 2003. The faculty created two different assessment instruments. The first one was a general learning outcomes assessment that was given to students in the Fall 2002 semester. The second assessment targeted student preparedness for the course’s first exam.

For the Fall 2002 assessment, a survey was given to students at the beginning of the semester and it listed specific skills of the course. Students were asked to rate their ability in relation to these communication skills (e.g. assertiveness, conflict management, listening, etc.). The same survey was given to the class at the end of the semester.

The initial survey yielded 250 respondents, with the semester-end survey netting 189 respondents. The results indicated that students perceived that they improved in every aspect of the course except for listening skills.

The actions taken as a result of this program assessment:

1. Listening skills will be taught during other units of interpersonal communication.
2. More exercises on paraphrasing will be included in the listening unit.
3. More exercises on asking questions and providing feedback will be incorporated into the curriculum.

For the Spring 2003 semester, three separate Speech 101 classes (with different instructors) were given an assessment that sought to measure students’ preparation levels for the first exam of the course. Students were surveyed on grade expectations (for the exam), number of hours of preparation, reading preparation, and note preparation. In addition, students were asked to identify which classroom strategies best prepared them for the exam: group work, videos, lectures, class notes, textbooks, and Internet research and resources.

Findings

1. Students have a difficult time answering application-type questions;
2. Students do not read the textbook, and those who do comprehend little of what is presented;
3. Most of the students spend less than two hours studying for exams;
4. Many students would like the instructor to give them the test questions and answers in advance of the test;
5. Since multiple-choice questions seem difficult for students, instructors need to address this problem by developing different testing formats;
6. Students need help in learning how to take notes.

Actions

1. Short answer/essay questions will be reviewed for ambiguity;
2. Short answer/essay questions will be reviewed to verify that directions are clear;
3. Detailed worksheets and outlines will be given to students for every chapter; students will be encouraged to fill in the worksheets during class discussion;
4. Study sessions will be held for interested students;
5. More repetition of material will take place;
6. Reading, classroom activities, and extra materials will be connected to real-life examples.

See Appendix 6 for complete Program Assessment Plans

SECTION IV. CONCLUSION

With the NCA visit in March of 1999 came the realization for Danville Area Community College that, though data was being collected, it had little meaning when it came to carrying out the duties of teaching and preparing students for the world of work. NCA spotted the deficiency and mandated that the college change its thinking on the utility of assessment. In the less than five years since the NCA visit, DACC not only knows what assessment is and how it can be used to improve student learning and institutional effectiveness; it embraces the concept and has actively incorporated the assessment system into classrooms, programs and institutional departments.

DACC still has much ground to cover in terms of assessment, but with the continued support of the faculty, staff, administration and board of trustees, the systems and processes that are being incorporated will continue to strengthen. Assessment is about setting high standards and expectations and then sitting back and letting the talent and expertise of an institution meet and exceed those standards and expectations. DACC is up to this challenge.

