Seminole State College
Presented: April 6, 2009 Adopted: April 6, 2009
Seminole State College Expanded Statement of Institutional Purpose
EVALUATION OF GENERAL EDUCATION (General Education Outcomes 1-4) 2009
Seminole State College Mission Statement
Seminole State College is maintained as a two-year public college authorized by the Oklahoma State Regents for Higher Education to offer courses, provide programs, and confer associate degrees. Seminole State has the primary responsibility of providing post-secondary educational programs to residents of Hughes, Lincoln, Okfuskee, Seminole, and Pottawatomie counties in east central Oklahoma. The College exists to enhance the capabilities of individuals to achieve their goals for personal development by providing quality learning experiences and services that respond to diverse individual and community needs in a changing global society. Seminole State College prepares students to continue their education beyond the two-year level, trains students for careers and other educational opportunities, and makes available resources and services designed to benefit students and the community at large.
General Education Outcomes
The Seminole State College General Education Outcomes are: 1. Students will demonstrate college level communication skills; 2. Students will demonstrate an understanding and application of scientific principles; 3. Students will demonstrate knowledge related to functioning in society; 4. Students will demonstrate an understanding of the roles of history, culture, and the arts within civilization.
Assessment Methods and Criteria for General Education
DIRECT AND INDIRECT INDICATORS General Education Assessment is the evaluation of student achievement of the goals of the General Education curriculum. Principal direct indicators used are course-embedded assessment, the ACT Collegiate Assessment of Academic Proficiency (CAAP) Test, and transfer reports from four-year institutions. Principal indirect indicators used to triangulate the data are found in SSC student satisfaction surveys and the ACT Faces of the Future Survey, which is conducted on a biennial basis. The annual SSC Institutional Statistics Report provides a framework for understanding the student body.
Form Created: April 2003 Revised: November 2008, February 2009
Course-Embedded Assessment

Assessment Methods and Criteria
The most critical component of the College’s assessment methods and criteria is the course-embedded assessment process. In this process, each course is defined by course outcomes measured by learning objectives. In addition, each course measures one or more Degree Program Outcomes and one or more General Education Outcomes, all of which are tied to the College mission.
The General Education Outcomes Matrix (February 2006; Updated November 2008) comprehensively demonstrates how each course is linked to one or more of the four General Education Outcomes.
In each course, instructors, in conjunction with the appropriate division chair, select one of nine assessment options or a combination of options to assess student learning. (The original list of options was revised and adopted by the Assessment of Student Learning Committee on May 2, 2007.)
A new reporting procedure was adopted and partially implemented in fall 2005; it was fully implemented in fall 2006. The procedure requires all classes not taught in the fall semester to be assessed in subsequent semesters. In addition, it requires:
- Collection of assessment data for one or more Degree Program Outcomes.
- Collection of assessment data for one or more General Education Outcomes.
- Reporting of data in the annual fall report from all classes taught during the academic year.

Assessment Results
For fall 2008, the percentage of faculty (who submitted reports) using each assessment option is as follows:
A: Pre- and Post-Tests—80.00%
B: Pre- and Post-Writing—3.53%
C: Pre- and Post-Performance—8.24%
D: Observations—0.00%
E: Rubrics—0.00%
F: Projects and Portfolios—0.00%
G: Classroom Response System—0.00%
H: Creative Assessment—12.94%
I: Any Combination of A-H—23.53%
For fall 2008:
- 4,470 student assessments were reported.
- 85 instructors (95.51% of all faculty) submitted reports.
- Two divisions reported on Degree Program Outcomes. Each reported aggregate percentage increases for the outcomes assessed: percentage increases ranged from 21.3% to 48.7%, and successful post-assessment percentages ranged from 46.4% to 79.8%. Since outcomes vary from division to division, no summary data is provided.

The General Education Outcomes are:
- Outcome 1: Demonstrate college level communication skills.
- Outcome 2: Demonstrate an understanding and application of scientific principles.
- Outcome 3: Demonstrate knowledge related to functioning in society.
- Outcome 4: Demonstrate an understanding of the roles of history, culture, and the arts within civilization.

Not all divisions assessed each General Education Outcome. All divisions reported aggregate percentage increases from pre-assessment to post-assessment for the outcomes assessed. The table below shows the range of percentage increases for each outcome.
Pre- to Post-Assessment Percentage Increase Range

             Divisions
Outcome      Assessing    Low       High
Outcome 1    4            22.2%     57.7%
Outcome 2    3            4.8%      32.2%
Outcome 3    3            24.6%     52.9%
Outcome 4    4            22.2%     30.6%
A summary of division average aggregate results is shown in the following table.
             Number
Outcome      Assessed    Pre       Post      Diff
Outcome 1    25,051      21.4%     64.3%     42.8%
Outcome 2    10,728      31.9%     63.4%     31.5%
Outcome 3    15,025      12.9%     64.5%     51.7%
Outcome 4    3,608       38.6%     63.2%     24.7%
NOTE: The numbers in the Number Assessed column are large because one division counts each question related to the outcome.
- Post-assessment results range from 63.2% to 64.5%.
- Percentage increases range from 24.7% to 51.7%.
Analysis

Aggregate learning increases for each General Education Outcome were reported across the curriculum throughout the institution. Percentage amounts of increase reported by divisions (from the beginning to the end of fall 2008 semester course instruction) ranged as follows: 22.2 to 57.7 for Outcome 1; 4.8 to 32.2 for Outcome 2; 24.6 to 52.9 for Outcome 3; and 22.2 to 30.6 for Outcome 4. The average aggregate percentage increases are: 42.8 for Outcome 1; 31.5 for Outcome 2; 51.7 for Outcome 3; and 24.7 for Outcome 4.

For Outcomes 1 and 3, fall 2008 results are higher than those for fall 2007. The increases from fall 2007 to fall 2008 were 0.6% for Outcome 1 and 2.1% for Outcome 3. Fall 2008 results were lower than fall 2007 results for Outcome 2 (5.3%) and Outcome 4 (8.3%). The increases for Outcomes 1 and 3 may be because of increased assessment of those outcomes, a better understanding of how to relate assessment questions to the outcomes, and/or a more efficient reporting procedure. However, the decreases for Outcomes 2 and 4 are significant and troubling. One possibility for such decreases is that a shift in the emphasis has occurred. If that is the case, it seems reasonable to take a closer look at the assessment questions and techniques used for those outcomes.

Action Plan: Next Steps to Improve Student Learning

Division chairs will continue to stress the need for all faculty to participate in the course-embedded assessment process and to identify assessment data related to each of the General Education Outcomes. They will continue to provide suggestions to the Assessment of Student Learning Coordinator in regard to the reporting format. These will be reviewed and revised as appropriate and then announced to the faculty.

The Assessment of Student Learning Committee will continue to monitor both the format and the reporting procedure. In particular, the Committee will consider adding to the tables that provide percentage data for pre- and post-assessment results. The addition will show the actual number of students who were successful on both assessments.

The Assessment of Student Learning Committee will continue its review concerning the establishment of thresholds for each outcome assessed. It is expected that, much like the thresholds for the ACT CAAP Test, these thresholds will provide a benchmark for each outcome, which will make it possible to analyze student progress toward the threshold.
While results are mixed, it has been noted that percentage increases are not always an accurate reflection of how much students are learning. (For instance, a 30% increase is good if the pre-assessment is 55%, but not good if it is 13%.) However, a review of the post-assessment percentages provides a clearer understanding of how much students have learned from the start of the semester to the end.
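The point about raw gains can be made concrete with a short sketch. The pre- and post-assessment figures below are the illustrative ones from the paragraph above, not actual course data:

```python
# Two hypothetical courses with identical 30-point pre-to-post gains
# but very different end-of-semester mastery levels.

def gain_and_post(pre, post):
    """Return (points gained, final post-assessment percentage)."""
    return post - pre, post

course_a = gain_and_post(55, 85)  # gain of 30, finishing at 85%: strong result
course_b = gain_and_post(13, 43)  # gain of 30, finishing at 43%: weak result
print(course_a, course_b)
```

The identical first element and differing second element illustrate why the report reads post-assessment percentages alongside raw increases.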
On the whole, the post-assessment results seem satisfactory with a range of 63.2% to 64.5% and substantiate that student learning occurred in all General Education Outcomes assessed. However, current discussions by the Assessment of Student Learning Committee have centered on 70% as a goal and these scores are far below that percentage. Thus, it would seem appropriate for the committee to continue those discussions and develop a long-range plan to address the situation.
Summary: Course-Embedded Assessment
This is the fourth time the revised General Education Outcomes have been used as part of an evaluation report, and the fourth time the new reporting procedure has been used. The reporting procedure provides a more thorough approach to the overall process and allows for a better analysis of results. Additional years of data should make it possible to refine the analysis and help with future general education curriculum planning.
ACT Collegiate Assessment of Academic Proficiency (CAAP) Test
Assessment Methods and Criteria This nationally recognized test, first used during fall 2006, is designed to assist in assessing the outcomes of general education programs by focusing on the academic skills developed through general education courses. The CAAP Test does this by testing the following areas: writing skills, mathematics, reading, critical thinking, and science reasoning.
Assessment Results
For the third administration of this test, 162 students each took two randomly selected test modules from the following: Writing Skills, Mathematics, Reading, Critical Thinking, and Science. The table below compares the number of SSC participants for each module and the number of students in the national database.

Comparison of Number of Participants, SSC vs. National

CAAP Test Module     SSC    National
Writing Skills       65     28,236
Mathematics          64     29,274
Reading              65     28,667
Critical Thinking    65     24,069
Science              65     18,647

The table and chart that follow compare Seminole State College mean scores for each CAAP test module with those for students in the national database.

Comparison of Mean Scores, SSC vs. National

                     SSC     National   Over/Under   SSC    National   SSC   National
CAAP Test Module     Mean    Mean       National     S.D.   S.D.       N     N
Writing Skills       61.6    62.0       -0.4         4.1    4.8        65    28,236
Mathematics          55.1    56.2       -1.1         4.2    3.6        64    29,274
Reading              59.8    60.4       -0.6         5.1    5.3        65    28,667
Critical Thinking    59.5    60.8       -1.3         4.6    5.4        65    24,069
Science              59.9    59.2       +0.7         3.8    4.1        65    18,647
[Chart: ACT CAAP Mean Test Scores, Fall 2008 — SSC vs. national mean for each module (Writing Skills, Mathematics, Reading, Critical Thinking, Science); values match the table above.]
It is clear from the table that the SSC mean scores for four of the five modules are lower than the corresponding national scores; the differences range from -1.3 for Critical Thinking to -0.4 for Writing Skills. The SSC mean score for Science, however, is 0.7 higher than the national mean score for that module.
Data has now been collected for three years, but the data set is still limited since each student takes only two of the five test modules. However, even limited data allows for some comparison between years and with the national data. The following table shows the number of participants and the SSC mean score for each test module for fall 2006, fall 2007, and fall 2008.
Comparison of SSC Mean Scores, Fall 2006, Fall 2007, and Fall 2008

                     Fall 2006        Fall 2007        Fall 2008
Module               No.    Mean      No.    Mean      No.    Mean
Writing Skills       52     62.1      87     62.3      65     61.6
Mathematics          52     55.6      87     55.4      64     55.1
Reading              64     61.5      88     59.7      65     59.8
Critical Thinking    52     59.7      86     60.6      65     59.5
Science              52     58.9      86     59.9      65     59.9
ACT provided Certificates of Achievement to students who scored at or above the national mean on a test module. Ninety-three students—57.41% of all participants—received at least one certificate. In fact, 46 received one certificate and 47 received two certificates.
The tables below show the number of certificates as well as the percentage of participants for each test module for the three years the test has been administered.
Certificates of Achievement
Writing Skills       No.    % Tested    % Certificates
Fall 2006            25     18.38%      20.83%
Fall 2007            41     18.89%      19.90%
Fall 2008            34     20.99%      24.29%

Mathematics          No.    % Tested    % Certificates
Fall 2006            22     16.18%      18.33%
Fall 2007            33     15.21%      16.02%
Fall 2008            23     14.20%      16.43%

Reading              No.    % Tested    % Certificates
Fall 2006            31     22.79%      25.83%
Fall 2007            41     18.89%      19.90%
Fall 2008            25     15.43%      17.86%

Critical Thinking    No.    % Tested    % Certificates
Fall 2006            21     15.44%      17.50%
Fall 2007            47     21.66%      22.82%
Fall 2008            24     14.81%      17.14%

Science              No.    % Tested    % Certificates
Fall 2006            21     15.44%      17.50%
Fall 2007            44     20.28%      21.36%
Fall 2008            34     20.99%      24.29%
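One reading of the two percentage columns that is consistent with the fall 2008 numbers: "% Tested" is a module's certificate count divided by all 162 participants, and "% Certificates" is that module's share of all certificates awarded (46 students with one certificate plus 47 with two gives 46 + 2 × 47 = 140). This interpretation is an inference, not stated in the report; a quick check of it:

```python
# Verify the two percentage columns for Writing Skills, fall 2008, under the
# assumed reading above: certificates over participants, and certificates
# over total certificates awarded, respectively.

def pct(part, whole):
    return round(100 * part / whole, 2)

participants_2008 = 162           # students tested in fall 2008
certificates_2008 = 46 + 2 * 47   # 140 certificates awarded in total
ws_certificates = 34              # Writing Skills certificates, fall 2008

print(pct(ws_certificates, participants_2008))   # matches the 20.99% column
print(pct(ws_certificates, certificates_2008))   # matches the 24.29% column
```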
Analysis

SSC’s mean scores for students with 45+ hours are comparable to the mean scores for sophomore students tested across the nation. By test module, SSC’s mean scores are: Writing Skills—61.6 (0.4 below the national mean); Mathematics—55.1 (1.1 below the national mean); Reading—59.8 (0.6 below the national mean); Critical Thinking—59.5 (1.3 below the national mean); and Science—59.9 (0.7 above the national mean).

A comparison of mean scores for 2006, 2007, and 2008 reveals the following: Writing Skills—0.2 increase from 2006 to 2007, 0.7 decrease from 2007 to 2008; Mathematics—0.2 decrease from 2006 to 2007, 0.3 decrease from 2007 to 2008; Reading—1.8 decrease from 2006 to 2007, 0.1 increase from 2007 to 2008; Critical Thinking—0.9 increase from 2006 to 2007, 1.1 decrease from 2007 to 2008; Science—1.0 increase from 2006 to 2007, no change from 2007 to 2008.

The Assessment of Student Learning Committee established the following assessment threshold ranges, effective with the fall 2007 test: SSC mean test scores will fall within ±0.5 points of the national mean scores established after the 2006 test. When 2008 scores are compared to the ranges, the following is observed: Writing Skills—0.1 below the threshold range; Mathematics—0.5 below the threshold range; Reading—0.2 below the threshold range; Critical Thinking—0.9 below the threshold range; Science—0.3 above the threshold range.

Action Plan: Next Steps to Improve Student Learning

The Assessment of Student Learning Committee plans to administer the CAAP Test during fall 2009. Plans call for at least 200 students to be tested using the same five test modules.

The Committee will continue with efforts to inform students and faculty of the importance of this assessment instrument. In addition, it will continue to work with the MIS Department and faculty to make certain students are notified of their selection to participate in the test and to arrange for make-up testing.

The Committee will continue to review both SSC and national mean scores each spring semester. Results of the fall 2009 test will be compared with previous results and assessment thresholds. In accordance with the Committee’s decision, it will prepare a summary review and propose, as appropriate, revisions to the process as well as the threshold range.
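The threshold arithmetic can be sketched as follows. Note that the actual ranges are anchored to the 2006 national means, which this report does not reproduce, so the example values below are hypothetical:

```python
# Signed distance of a score from a threshold band of center ± half_width.
# Returns 0.0 when the score falls inside the band.

def distance_from_band(score, center, half_width=0.5):
    lo, hi = center - half_width, center + half_width
    if score < lo:
        return round(score - lo, 2)   # negative: below the band
    if score > hi:
        return round(score - hi, 2)   # positive: above the band
    return 0.0

# A hypothetical module mean of 55.0 against a hypothetical national mean
# of 56.0 falls 0.5 below the band [55.5, 56.5]:
print(distance_from_band(55.0, 56.0))
# A mean inside the band reports 0.0:
print(distance_from_band(60.0, 59.8))
```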
Results from the first administration of the CAAP Test provided the College with benchmarks for each of the areas assessed—a goal of the Assessment of Student Learning Committee. Results of the second and third administrations not only add to the College’s database, but also provide additional confidence in those benchmarks. Taken together, results of the three administrations indicate that Seminole State College students are performing at levels consistent with students in the national database.
Transfer Reports from Four-Year Institutions
Assessment Methods and Criteria Transfer reports from the primary receiving institutions—East Central University, the University of Central Oklahoma, Oklahoma University, and Oklahoma State University—provide GPAs of students who transfer from Seminole State College. In addition, Northeastern State University has provided reports. We expect our students to maintain similar GPAs upon transfer as they attained at SSC.
Assessment Results
In the past, reports from ECU, UCO, OU, and OSU were received at least once each academic year. Unfortunately, since spring 2004 they have not been received on a regular basis. However, in 2003-2004, Northeastern State University began to provide information for students declaring majors, with the latest data for spring 2007.
For the last two years, the Assessment of Student Learning Coordinator has contacted the OSRHE Director of Research and Analysis, who in turn has provided an expanded list of GPA data. This data includes GPA reports for all state institutions to which SSC graduates transferred. Unfortunately, the data requested for this report has not been received. (This may be due to a series of position changes at the OSRHE.) Therefore, the table below is the same as last year’s.
In the past, slight drops were recorded in the GPAs of students who transferred to primary receiving baccalaureate degree granting institutions. The tables below indicate that this is still the situation for most of the transfers with exceptions for NSU and OU.
Data Provided by OSRHE

              Average     Average    Increase/
Institution   SSC GPA     CGPA       Decrease
ECU           3.12        3.09       -0.03
NSU           2.88        2.99       +0.11
OSU           3.63        3.32       -0.31
OU            3.16        3.17       +0.01
Rogers        3.69        3.60       -0.09
UCO           3.22        3.13       -0.09

Recently, Oklahoma University sent a summary of data for fall 2002 through fall 2006. It reported GPAs for three categories—Freshmen Transfers, Sophomore Transfers, and Jr/Sr Transfers—and compared them to OU returning students. The tables below show the data.

Jr/Sr Transfers

Transfer Fall   No.   GPA    Spring OU GPA   OU Ret. GPA
2002            18    2.98   2.13            2.41
2003            18    3.08   2.22            2.08
2004            27    3.16   2.21            2.26
2005            11    3.38   2.28            2.34
2006            6     3.31   1.86            2.35

Sophomore Transfers

Transfer Fall   No.   GPA    Spring OU GPA   OU Ret. GPA
2002            5     3.38   3.17            2.41
2003            3     3.13   2.23            2.08
2004            7     3.38   2.61            2.26
2005            2     3.57   3.10            2.34
2006            7     3.49   2.38            2.35

Freshmen Transfers

Transfer Fall   No.   GPA    Spring OU GPA   OU Ret. GPA
2002            1     3.50   1.50            2.41
2003            1     3.56   1.58            2.08
2004            1     3.33   1.80            2.26
2005            1     3.20   1.59            2.34
2006            1     4.00   3.76            2.35
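The Increase/Decrease column in the OSRHE table is simply the signed change from the average SSC GPA to the average cumulative GPA after transfer; a minimal recomputation using the values shown in the table:

```python
# Recompute the Increase/Decrease column from the OSRHE table values.

def gpa_change(ssc_gpa, cumulative_gpa):
    """Signed change from the SSC GPA to the post-transfer cumulative GPA."""
    return round(cumulative_gpa - ssc_gpa, 2)

rows = {
    "ECU": (3.12, 3.09), "NSU": (2.88, 2.99), "OSU": (3.63, 3.32),
    "OU": (3.16, 3.17), "Rogers": (3.69, 3.60), "UCO": (3.22, 3.13),
}
for institution, (ssc, cumulative) in rows.items():
    print(f"{institution}: {gpa_change(ssc, cumulative):+.2f}")
```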
Analysis

Mean grade point averages for students who transfer from SSC to the primary receiving institutions are slightly lower when compared to non-transfer students at those reporting institutions.

According to the range of results in this measure, SSC students are competitive, earning GPAs within one-half of a grade point of non-transfer students when they move into baccalaureate programs.

SSC students demonstrate competence in their academic preparation, particularly when taking into consideration a variety of other factors which might be expected to negatively impact the grades of SSC students who transfer to baccalaureate granting institutions. Those factors include cultural differences, potentially larger class sizes and more impersonal interactions, adjustments to new situations and settings, and increasing financial burdens, among others.

From the data provided by OU, it is evident that SSC transfer student GPAs drop. Since there are a greater number of transfer student records, data for Jr/Sr Transfers provides more useful information than the other categories.

Spring OU GPAs for Sophomore Transfers are higher than those for returning OU students; the difference ranges from 0.03 to 0.76. However, the reverse is true for four of the five years when Jr/Sr Transfers are considered; there the difference ranges from 0.05 to 0.49.

Action Plan: Next Steps to Improve Student Learning

The Assessment of Student Learning Coordinator will continue to request updated transfer GPA data from the OSRHE Director of Research and Analysis during fall 2009. Program specific data will be collected when possible and distributed to faculty in programs scheduled for evaluation.
Survey Data
Assessment Methods and Criteria Survey data provides indirect indicators for analyzing the effectiveness of the educational experience students are receiving at Seminole State College.
SSC Graduate Opinion Survey

Assessment Methods and Criteria
The primary survey tool used for this evaluation has been the annual SSC Graduate Opinion Survey, which is sent to recent graduates (Fall, Spring, May, Summer); the results are compiled on campus.

Assessment Results
Even using an online version of the survey, the number of graduates responding has continued to decline over the past few years. This was again the situation for the 2007-2008 survey. However, the results continue to be consistent with past survey results.

For fall 2008, the following questions related to student satisfaction levels were addressed. Responses ranged from Very Satisfied to Very Dissatisfied. A Likert scale was used to calculate a response average for each question, with Very Satisfied = 1 and Very Dissatisfied = 5. The following table shows each question and its response average.
Question                                                      Response Average
Quality of instruction in your major area of study            1.22
Attitude of faculty toward students                           1.35
Attitude of non-teaching personnel toward students            1.99
Concern shown for you as an individual by SSC personnel       2.01
Preparation you received for future occupation or education   1.98
General condition of buildings and grounds                    1.70
Quality of instructional equipment                            1.51
Variety of courses available                                  1.93
Out-of-class availability of your instructors                 1.69
Quality of course you wanted at times you wanted              1.93
Quality of library materials and support services             1.71
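The response averages above are weighted means over the five-point scale. A minimal sketch, using hypothetical response counts since the raw survey tallies are not reproduced in this report:

```python
# Likert response average: Very Satisfied = 1 ... Very Dissatisfied = 5,
# so lower averages indicate higher satisfaction.
# The counts below are hypothetical, for illustration only.

def response_average(counts):
    """counts[i] = number of respondents choosing scale point i + 1."""
    total = sum(counts)
    weighted = sum((point + 1) * n for point, n in enumerate(counts))
    return round(weighted / total, 2)

# e.g. 70 Very Satisfied, 40 Satisfied, 8 Neutral, 2 Dissatisfied, 0 Very Dissatisfied
print(response_average([70, 40, 8, 2, 0]))  # 1.52
```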
Two questions asked the students to give an overall academic rating to SSC and to rate their overall experience at SSC. Responses ranged from Excellent to Poor, with Excellent = 1 and Poor = 5. Again, a response average was calculated for each question. The following table presents the questions and the response averages.
Question                                              Response Average
What overall academic rating would you give SSC?      1.48
How would you rate your overall experience at SSC?    1.43
A final question asked the students whether, if they were starting over, they would choose to attend SSC. The responses were: Definitely Yes—87.2%; Probably Yes—8.7%; Definitely No—4.1%.
Analysis

Data clearly indicates that students are satisfied with SSC. In particular, 90.7% of the respondents indicated they were Very Satisfied or Satisfied with the College. Even with limited responses, results are consistent with those from past SSC Student Opinion Surveys.

On the other hand, the Graduate Exit Survey, which is given to students at graduation, has proven to be a useful tool to gather some of the data the Graduate Opinion Survey addresses. This survey is discussed below.

Action Plan: Next Steps to Improve Student Learning

While the number of responses continues to be low, the Committee will continue to use the online survey. The Assessment of Student Learning Coordinator will pursue new ways to contact recent graduates to encourage them to participate in that online survey. In addition, the Committee will review the survey to determine if it still meets the needs of the College.
SSC Graduate Exit Survey
The survey seeks specific information concerning how graduates plan to use their SSC degree and their future educational and occupational plans. Prior to 2007, the VPAA’s office distributed this survey to graduates while they were in line for commencement exercises. However, since May 2007 the survey has been conducted online.
Assessment Results
Some of the applicable survey questions and results are as follows. (Note: Not all graduates responded to all questions.)

How will you use your degree from Seminole State College? 81.2% plan to transfer to a baccalaureate degree program; 9.4% will seek specialized employment; 4.3% will use their degree to enhance existing employment opportunities; 5.1% marked “Other.”

To which college/university will you transfer? ECU—36.3%; UCO—28.2%; OU—20.4%; Other—12.8%.

What will be your major? Business related—39.4%; Education—33.1%; Psychology related—11.2%; Nursing—10.7%; Other—5.6%.

What is the highest degree you expect to obtain? Bachelor’s—18.7%; Master’s—64.8%; Doctoral—9.2%; MD—4.1%.

What are your long term EDUCATIONAL goals? Responses ranged from seeking immediate employment to pursuing additional education.

What are your long term CAREER goals? Responses ranged from finding a secretarial position to teaching in elementary school to obtaining a medical doctorate.

Additional Comments: There was a wide range of comments; the following are representative. “The limited choices of courses offered at night makes it difficult to complete a degree.” “The faculty show concern for all students and are willing to help when needed.” “It was a great school. I will recommend SSC to my friends.”

Analysis

The results of this survey are consistent with those from previous surveys. The majority of graduates plan to transfer to a four-year college in Oklahoma. Business related and education related majors account for nearly three-fourths of those surveyed, with Nursing and Psychology related majors accounting for about one-fifth. Over 80% of graduates plan to pursue degrees beyond the bachelor’s level, with 13% planning to pursue a doctoral degree. Long term educational and career goals seem to indicate that students feel confident after their educational experience at SSC.

Action Plan: Next Steps to Improve Student Learning

The Assessment of Student Learning Committee will review the results and determine if it will continue with this survey.

ACT Faces of the Future Survey

Assessment Methods and Criteria
This nationally recognized survey, which is conducted on a biennial basis, collects data similar to that collected by the SSC Graduate Opinion Survey, but it is administered to students who are enrolled during the fall semester.

Assessment Results
There are no results for this evaluation since this survey was not given during fall 2008. In accordance with SSC procedures, the survey was conducted in fall 2007 and will be conducted in fall 2009.

Analysis and Action Plan: Next Steps to Improve Student Learning

Analysis and action are not applicable; see the note under Assessment Results. However, the Executive Summary of the Fall 2007 report can be found on the Assessment web page, and the complete report may be obtained from any Assessment of Student Learning Committee member or division chair.
Institutional Statistics Report
Assessment Methods and Criteria The Office of the Vice President for Academic Affairs compiles this report each fall and spring semester. It provides demographic and statistical data useful to profile learners seeking instruction at SSC.
Assessment Results
Final fall 2008 enrollment statistics indicated an enrollment of 2,016, with 664 (32.9%) male and 1,352 (67.1%) female students. The average age of the student body was 27.1. In addition, 554 (27%) were sophomores, 38% were part-time, 62% were full-time, 93% were day students, and 7% were night students. Approximately 75% of the enrollment was from 10 communities in the five-county service area of the College. Over 50% of the students entering SSC are inadequately prepared for college-level courses and require remediation in English, reading, and mathematics.
Analysis

The statistics are consistent with past reports and provide an overview of the SSC student body. Percentages for enrollment in the day schedule have gradually increased from 86% in fall 2002 to 93% in fall 2008. In addition, percentages for full-time students have increased from 53% in fall 2002 to 62% in fall 2008.

Action Plan: Next Steps to Improve Student Learning

The committee will continue to review future reports.