STUDENT LEARNING OUTCOMES COORDINATORS IN THE

CALIFORNIA COMMUNITY COLLEGE SYSTEM: FOCUS ON

IMPLEMENTING STUDENT LEARNING OUTCOMES

PROCESSES AND ACCREDITATION

A Dissertation Presented to the Faculty of California State University, Stanislaus

In Partial Fulfillment of the Requirements for the Degree of Doctor of Education in Educational Leadership

By Regina Coletto

May 2014

CERTIFICATION OF APPROVAL

STUDENT LEARNING OUTCOMES COORDINATORS IN THE

CALIFORNIA COMMUNITY COLLEGE SYSTEM: FOCUS ON

IMPLEMENTING STUDENT LEARNING OUTCOMES

PROCESSES AND ACCREDITATION

by Regina Coletto

Signed Certification of Approval Page is on file with the University Library

Dr. Jim Riggs Date Professor of Advanced Studies in Education

Dr. Shawna Young Date Professor of Kinesiology

Dr. George Railey Date Vice Chancellor, SCCCD

© 2014

Regina Coletto

ALL RIGHTS RESERVED

DEDICATION

I would like to dedicate this dissertation to Joey and Katie, who have sacrificed our time together. While I dedicated time to school, they provided me with patience and understanding so I could complete my program. Also, to my father,

Henry Joseph Coletto, who reminded me daily to keep going, who took me fishing to clear my head when I needed it (even though I often had to study while there), and who always had faith in me to get this done. This is for you Dad!


ACKNOWLEDGMENTS

My journey to complete this dissertation has been over a decade in the making. Completing my dissertation, on my second attempt, signifies much more than holding a doctoral degree in my hand. It is a testament to knowing and believing in myself; to perseverance, to sacrifice, to dedication, and to how much love and support those around me have provided.

To my dissertation chair, Dr. Jim Riggs … You never let me down. Your dedication to my education and to me as an individual has been unwavering and I cannot imagine going through this process without you.

To my committee members … Dr. Shawna Young, you have always challenged my thinking, you are incredible, and I hope many generations of students to come have the same transformative experience with you that I was given.

Dr. George Railey, thank you for your guidance and expertise in assessment in the

CCC—I respect your knowledge and leadership.

To my best friend, Rachelle, who has been there through both doctoral programs—I am done! Thank you for all your love, support, and understanding the past few years. Without it this would never have been completed.

To my family and friends, I take your strength and love every day and try to use it to do something good for my own life and the lives of others. This is just one example of how your support has led to something incredible. Thank you!


To my Cohort 4 classmates, this was truly the adventure of a lifetime. Not only did I learn so much about education, caring for others, and being a leader, but you each in your own way taught me something about myself. This was even more worth it to have gone through it with each of you.

Every conversation, every stressful moment quieted, every piece of motivation given, and every day over the past 3 years spent smiling or crying, knowing this would one day end as an achievement to be proud of, has transformed who I was, who I am, and who I will become in future years.


TABLE OF CONTENTS

PAGE

Dedication...... iv

Acknowledgments...... v

List of Tables ...... x

Table of Acronyms ...... xi

Abstract...... xii

CHAPTER I. Introduction ...... 1

Problem Statement ...... 6
Purpose of Study ...... 8
Nature of the Study ...... 9
Research Questions ...... 9
Operational Definitions ...... 10
Assumptions ...... 11
Limitations ...... 13
Theoretical Framework ...... 13
Significance of the Study ...... 15
Conclusion ...... 17

II. Review of the Literature...... 19

ACCJC Standards ...... 20
Standard I: Institutional Mission and Effectiveness ...... 21
Standard II: Student Learning Programs and Services ...... 23
Standard III: Resources ...... 27
Standard IV: Leadership and Governance ...... 30
ACCJC Rubric for Student Learning Outcomes ...... 31
Accreditation and Student Learning Outcomes ...... 34
Student Learning Outcomes ...... 35
Assessing Student Learning Outcomes ...... 38
Student Learning Outcomes Leadership ...... 39
Concerns-Based Adoption Model (CBAM) ...... 42
Conclusion ...... 46



III. Methodology...... 48

Research Questions ...... 49
Role of the Researcher ...... 49
Setting ...... 50
Population and Sample ...... 51
Ethical Considerations ...... 52
Methodology ...... 53
System-wide Survey ...... 53
Data Collection ...... 55
Data Analysis ...... 55
Conclusion ...... 56

IV. Findings ...... 57

Research Sequence ...... 59
Data Collection ...... 62
Population and Sample ...... 62
SLO Coordinator Characteristics ...... 67
Strategies for Implementing SLOs ...... 70
Barriers to Implementing SLOs and How to Overcome Them ...... 74
Campus Cultural Characteristics ...... 77
Effect on Accreditation Status ...... 83
Summary of Findings ...... 86
Quality of the Research ...... 89
Summary ...... 90

V. Summary, Implications, Conclusions, and Recommendations...... 92

Summary of Previous Chapters ...... 93
Summary of Research Findings ...... 94
Comparison Between the Current Study and the ASCCC 2007 Study ...... 98
Research Question 1: What Strategies Have Institutions in the California Community College System Utilized to Implement Student Learning Outcomes Processes? ...... 103
Research Question 2: What Barriers Exist in the Implementation of Student Learning Outcomes Processes? ...... 106



Research Question 3: What Strategies Have Been Successful in Eliminating or Minimizing the Barriers to Implementing Student Learning Outcomes Processes? ...... 109
Research Question 4: What Institutional Cultural Characteristics Affect the Successful Implementation of Student Learning Outcomes Processes? ...... 111
Research Question 5: In What Ways Has the Institution’s Accreditation Status Impacted the Implementation of Student Learning Outcomes Processes? ...... 113
Discussion and Evaluation ...... 115
Limitations of the Study ...... 118
Recommendations for Practice ...... 119
Recommendations for Further Study ...... 123
Research Process Reflections ...... 124
Conclusion ...... 125

References...... 128

Appendices

A. Rubric for Evaluating Institutional Effectiveness ...... 135
B. Sample Survey Questions for System-wide Survey ...... 136
C. ASCCC SLO Coordinator Study Survey ...... 140
D. Protocol for Pilot Testing Survey Items ...... 144
E. Survey Consent Form ...... 149
F. General Membership of RP Group Listservs ...... 150


LIST OF TABLES

TABLE PAGE

1. Participant Positions Held on Campus (Q7) ...... 64

2. Discipline Taught by Full-time Faculty Participants (Q7) ...... 65

3. Other Positions Held on Campus (Q7) ...... 66

4. Length of Time Participants Have Been Involved with SLOs (Q8)...... 66

5. Number of Years the SLO Coordinator Contract Lasts (Q9)...... 67

6. Amount of Reassigned Time the SLO Position Provides (Q10)...... 68

7. Method of Selection for the SLO Coordinator (Q11)...... 69

8. Indication of a Formal Job Description for the SLO Coordinator Position (Q12)...... 70

9. Support Activities that Would Help Campus SLO Efforts (Q15)...... 71

10. Training Opportunities that Would Help Campus SLO Efforts (Q16)...... 72

11. Evaluation of SLO Components Based on ACCJC Benchmarks (Q14)...... 78

12. Overall Level of Concern by Faculty about Sustaining SLO Processes (Q23) ...... 79

13. Overall Level of Use by Faculty of Assessment Data to Improve Student Learning (Q24)...... 80

14. Campus Cultural Impact on the Ability to Reduce or Eliminate Barriers (Q22) ...... 81

15. Accreditation Status as of July 2013 (Q6) ...... 83

16. Accreditation Status Impacted SLO Implementation (Q20)...... 84


TABLE OF ACRONYMS

ASCCC – Academic Senate for the California Community Colleges

ALO – Accreditation Liaison Officer

ACCJC – Accrediting Commission for Community and Junior Colleges

AACC – American Association of Community Colleges

CCC – California Community College

CCCCO – California Community College Chancellor’s Office

CIO – Chief Instructional Officer

CBAM – Concerns Based Adoption Model

CHEA – Council for Higher Education Accreditation

FLEX – Flexible Calendar Program

PR – Program Review

RP Group – Research and Planning Group

SME – Science, Mathematics, and Engineering

SLO – Student Learning Outcome

SCQI – Sustainable Continuous Quality Improvement

USDE – United States Department of Education


ABSTRACT

This descriptive explanatory research study drew upon a system-wide survey of the California Community College system’s Student Learning Outcomes (SLO) Coordinators and other SLO leaders to explore how California Community Colleges utilize the Student Learning Outcomes Coordinator positions on their campuses, what barriers campuses face when trying to implement SLOs, how campuses have effectively addressed these barriers, and what role accreditation status plays in implementing SLOs. This study found four primary differences between the Academic Senate for California Community Colleges (ASCCC) study conducted in 2007 and the current study: (1) an increase in the amount of reassigned time allocated to SLO Coordinator positions; (2) more formal roles; (3) more colleges using a cooperative model for position selection; and (4) continued demand for support and training programs. This study also found that colleges are utilizing a multitude of strategies to both implement SLOs and reduce barriers to implementation. Additionally, a hierarchical system of SLO culture was found to exist on campuses, consisting of resisters, compliers, and believers. Understanding the dynamics of SLO Coordinator positions across the system and how colleges are working to implement SLO processes on their campuses will add to the existing limited body of knowledge about how CCCs can successfully implement SLO processes and avoid accreditation sanctions related to SLOs and the use of the data resulting from SLO assessment.


CHAPTER I

INTRODUCTION

As recently as January 2013, the Accrediting Commission for Community and Junior Colleges (ACCJC) took action to sanction California community colleges based on their use of assessment results as well as their progress toward meeting the proficiency standard for student learning outcomes. Prior to this Commission meeting, these two categories for sanction had not existed. As of the June 2013 Commission meeting, approximately one quarter of California community colleges were facing a sanction from the ACCJC. The number of colleges on sanction does not seem to change much over time, but the reasons for being on sanction have evolved and continue to do so (ACCJC, 2013a).

The federal government authorizes the ACCJC to accredit institutions of higher education; the Commission is itself reviewed every 5 years by the United States Department of Education (USDE) and every 10 years by the Council for Higher Education Accreditation (CHEA). The USDE review is based on compliance with federal regulations that are part of the Reauthorization of the Higher Education Act passed by Congress. These regulations are changed regularly, especially since 2009, through a process called negotiated rulemaking. The CHEA review is based on a peer-review process, in which member institutions from across the country are charged with creating standards and best practices for accreditors (ACCJC, 2013b).


According to the ACCJC, there are five major national discussion topics about higher education quality and accreditation occurring among federal regulators, regional accreditors, the business sector, and institutions of higher education. They are (1) money and public investment, (2) the completion agenda, (3) quality of graduates, (4) global competitiveness related to the quality of graduates and the completion agenda, and (5) the achievement gap (ACCJC, 2013b).

The discussion surrounding money and the public investment in higher education is focused on the amount of federal funds that are put into colleges. Federal monies are put into the higher education system either directly through programs and grants or through federal financial aid programs to an institution’s students. The federal government wants to ensure the quality of the “product” (student outcomes) that is received in return for the investment it is making. Although those working in higher education can argue about what it truly means to have a college education, in the end, the federal government and the general public are becoming increasingly concerned about the product of that education, not the experience itself. Products include the number of graduates, the amount of loan debt graduates carry, the employability of graduates in the work world (having the necessary skills for the job), and the ability of graduates to be fully employed (not unemployed or underemployed) (ACCJC, 2013b). Although focused on for-profit colleges, the Harkin report released in 2012 points to many of these concerns, including the amount of money spent for instructional purposes versus marketing, staffing, and other nonacademic purposes. The report states that “Congress has failed to counterbalance investor demands for increased financial returns with requirements that hold companies accountable to taxpayers for providing quality education, support, and outcomes” (Fain, 2012, p. 2). These discussions about accountability and outcomes are evident in every facet of the accreditation process and will continue to be a major criterion related to obtaining or maintaining an institution’s accreditation.

The completion agenda was introduced into the public spotlight through its formalization by the American Association of Community Colleges (AACC) and President Obama. In 2009, President Obama charged community colleges with “increasing the number of community college students completing a degree or other credential by 50%—to five million students by the year 2020” (AACC, 2013, p. 1). In response to President Obama’s charge, in 2010 an agreement entitled Democracy’s Colleges: A Call to Action was signed by members (the American Association of Community Colleges, the Association of Community College Trustees, the Center for Community College Student Engagement, the League for Innovation, the National Institute for Staff and Organizational Development, and Phi Theta Kappa) committing them to raising the number of students completing community college programs by 50% by 2020. To explain why the completion agenda is important, the AACC offers a four-point rationale: (1) college completion rates today are dismally low; (2) nationally, too few students are completing college; (3) the numbers are even worse for minority students; and (4) even students enrolled in structured programs can struggle to finish. The completion rate in 2011 for 2-year colleges in the nation was about 1 in 3 (34%), and according to The College Completion Challenge Fact Sheet, the United States ranks sixth in the world in postsecondary degree attainment. Minority students have even lower attainment rates, with 19.2% of Latinos having earned an associate’s degree or higher. The numbers of students who earn a certificate or degree in an occupational program are no better and show that even those programs that are more structured than the more open-ended general education pathway have a completion rate of less than 60%. There are varying reasons why this may be occurring, including under-prepared students needing remedial education, lack of knowledge about the higher education system, increases in college costs (AACC, 2013), and changing financial aid policies (AACC, 2013; Shugart, 2013).

The conversation about the quality of graduates stems from the business community expressing concerns that students graduating from college are not coming to them prepared with the same basic academic (reading, writing, math) or critical-thinking skills that they once did. Sectors such as career technical education fields have long been interwoven with the business community, but now more portions of this community are seeking to be involved in the higher education process and are asking more questions about what students are learning and what courses they take to earn their degrees. The business community is not interested in the rhetoric about what content is listed on the syllabus; they are interested in how colleges prove that their students learn the information included on the syllabus. The business community is asking some of the same questions that colleges are being asked to answer for accrediting bodies (ACCJC, 2013b).

The United States is currently ranked sixth globally for number of college graduates, far behind the numbers produced in China, India, and many other countries in the world. As was discussed earlier, this issue relates both to the completion agenda (not enough graduates) and the quality of graduates (what they know). It is of great concern that “the United States maintain its position in the global political economy” (ACCJC, 2013b, p. 4), and to do so, we must graduate more students who come to the world of work prepared to help advance the country, especially in the areas of technology and economic growth (ACCJC, 2013b).

Finally, the achievement gap has been a persistent and pervasive issue for years in both K–12 and higher education. Caucasian and Asian students are outperforming Black and Hispanic students in almost all conventional measures of achievement in education, such as testing, dropout rates, college attendance, college retention, and college completion. These same performance discrepancies are shown between socioeconomic classes and are becoming even more disparate over time.

Although many initiatives (e.g., No Child Left Behind, TRIO programs, EOPS) have been introduced to address these issues, none have proven to solve the problem of the inequalities leading to the disparity in performance outcomes. This is an issue that the federal government and accrediting agencies will require colleges to examine more closely through the requirement to disaggregate data when reporting it and to actively implement student equity plans (ACCJC, 2013b).


Problem Statement

In 2012, more California community colleges (CCCs) were sanctioned by their regional accreditor, the Accrediting Commission for Community and Junior Colleges (ACCJC), than in any of the past 4 years, with 28 of the 112 (25%) on sanction. These sanctions make it difficult for many colleges to make changes to programs or start new ones, as they restrict what changes colleges can make. On the other hand, sanctions can elicit prompt and often needed changes on campuses to address systemic problems. In the ACCJC’s 2007 accreditor reauthorization by the USDE, the Commission was cited for not imposing the 2-year rule against member institutions (such as California community colleges) and, in response, has taken a stricter approach to enforcing the standards. “The ‘two-year rule’ is a federally imposed mandate that requires accrediting agencies to place a two-year deadline on correction of all recommendations that relate to deficiencies” (ASCCC, 2008, p. 1). The federal government, with its monetary investment in higher education, is motivated to assure students attending institutions deemed deficient in meeting the standards that the colleges in which they are enrolled will come into compliance in a timely manner; hence the 2-year rule is applied (ASCCC, 2008; Freedberg, 2012).

Jamienne Studley, Executive Director of Public Advocates, a leading public interest law firm in California, stated, “If accrediting commissions don’t hold colleges more accountable for student outcomes, the federal government may look for other ways to determine a college’s eligibility for student financial aid” (Freedberg, 2012, p. 4). The message is being heard by the ACCJC, which agreed to toughen its enforcement of the 2-year rule. This change is being felt by sanctioned colleges, which are asked to make major systemic changes within a 2-year timeframe.

In 2009, 21% of CCCs were on sanction for deficiencies in the following areas: Planning (92%), Program Review (71%), Financial Stability or Management (54%), Internal Governance (46%), and Board (46%). By 2012, 25% of the CCCs were on sanction; however, the reasons were different: Planning (71%), Board (71%), Financial Stability or Management (50%), Program Review (21%), and Internal Governance (18%). Although the percentage of colleges on sanction for Planning, Program Review, Financial Stability or Management, and Internal Governance (four of the five deficiency areas) has decreased since 2009, the number of colleges on sanction has increased. In addition, the percentage of board deficiencies has increased over time to 71% (the highest) from 46% in 2009 (the lowest). It appears that colleges are floundering year after year in their attempts to interpret and implement the proper policies and procedures to meet accreditation standards. To make matters worse, in the fall of 2012, all 112 CCCs needed to prove that they were at the Proficiency level for their student learning outcomes. Some colleges were on track to be in compliance, but it is believed among those in the CCC system that many more colleges will face sanctions for not achieving the required student learning outcomes (SLO) standards.

Seven activities are listed in the ACCJC rubric (see Appendix A) for SLO proficiency. The first is that student learning outcomes and authentic assessments are in place for courses, programs, support services, certificates, and degrees. The second is that there is widespread institutional dialogue about the results of assessment and identification of gaps. The third and fourth are that decision making includes dialogue about the results of assessment and is purposefully directed toward aligning institution-wide practices to support and improve student learning, and that appropriate resources continue to be allocated and fine-tuned. The fifth is that comprehensive assessment reports exist and are completed and updated on a regular basis. The sixth is that course student learning outcomes are aligned with degree student learning outcomes. The final activity is that students demonstrate awareness of the goals and purposes of the courses and programs in which they are enrolled (ACCJC, 2012b).

Purpose of Study

Given the changing nature of accreditation practices and policies, many of which are centered around the assessment of student learning outcomes and the use of this assessment data to improve student learning, more needs to be discovered about how best to help institutions sustain effective practices for student learning outcomes and the use of the resulting data. All campuses have either an individual or a group (committee) assigned to oversee the student learning outcomes processes on campus. This oversight includes creating processes, implementing them, assessing student learning outcomes, collecting data into useable formats, interpreting data, writing reports, and making improvements based on these data. Therefore, this researcher seeks to explore how California Community Colleges utilize the Student Learning Outcomes Coordinator positions on their campuses, what barriers campuses face when trying to implement SLOs, how campuses have effectively addressed these barriers, and what role accreditation status plays in implementing SLOs. With this information, SLO coordinators should be better able to assist their colleges in moving toward student learning outcomes sustainability, thus decreasing their chances of being sanctioned by the ACCJC for this reason.

Nature of the Study

This author administered an online survey to the student learning outcomes coordinators (or others involved in SLO assessment) at each of the 113 California community colleges. Data from the surveys were collected early in the Fall 2013 semester and were used to establish the norms throughout the California Community College system regarding SLO processes and the SLO coordinator position. Upon completion of the survey, this researcher analyzed all of the data collected in an effort to answer the research questions posed for this study.

Research Questions

The following questions guided this study:

1. What strategies have institutions in the California Community College system utilized to implement student learning outcomes processes?

2. What barriers exist in the implementation of student learning outcomes processes?

3. What strategies have been successful in eliminating or minimizing the barriers to implementing student learning outcomes processes?

4. What institutional cultural characteristics affect the successful implementation of student learning outcomes processes?

5. In what ways has the institution’s accreditation status impacted the implementation of student learning outcomes processes?

Operational Definitions

Accountability. The obligation of institutions of higher education to verify that they are measuring student learning and providing a high-quality education, and to share this information with the public.

Accreditation. A voluntary system of self-regulation to evaluate quality and institutional effectiveness in higher education.

Accreditation standards. A level of educational excellence established by a regional accreditation commission, such as the ACCJC, to ensure that institutions meet an approved level of quality.

Closing the loop. The process of developing student learning outcomes, conducting research, analyzing the data, using the results to make changes to a course or program to improve student learning, then retesting the outcome to determine if student learning improved, followed by reflecting on the changes.

Institutionalization. A sustainable, ongoing assessment process, which includes evidence of completing more than one cycle of student learning outcomes assessment and using the assessment data to make program improvements or inform resource allocation decisions.

Student learning outcomes. “Student learning outcomes are properly defined in terms of the knowledge, skills, and abilities that a student has attained at the end (or as a result) of his or her engagement in a particular set of higher education experiences” (Eaton, 2012, p. 5).

Student Learning Outcomes Coordinator. This term will be used to denote the individual or individuals on a California community college (CCC) campus who are primarily responsible for the oversight of the student learning outcomes processes. At each CCC, this individual may have a different title, but the purpose of his or her role would reflect the above statement.

Student Learning Outcomes Processes. This term denotes the full cycle of assessing SLOs and includes developing an SLO, assessing the SLO, analyzing the SLO, discussing the results of the SLO assessment to determine what improvements should be made, implementing the improvements, and then reassessing the SLO to determine if the improvements were effective.

Assumptions

The following assumptions were made by the researcher:

1. Faculty who serve as SLO coordinators will be more successful in moving the institution toward sustainability for SLOs than coordinators who are nonfaculty members (administrators, managers, classified staff). Just as the accreditation process is a peer-reviewed process, it is more effective to have a peer-reviewed process for campus-level accountability for SLOs. Often faculty will resist a change if the administration demands it of them; however, they are more likely to accept it if they have at least some control over the process.

2. SLO coordinators who have received training outside of the institution will be better able to help faculty and staff overcome barriers related to SLOs. While the ACCJC offers online training for anyone interested in learning more about accreditation, there are no specific rules about how to develop and implement SLO processes on campuses. Many campuses use faculty who have some kind of experience with assessment to be in charge of the campus processes. Outside training through conferences can assist SLO coordinators in validating their approach to SLO processes on their own campuses and also in gaining valuable information from others in the field who are facing similar challenges.

3. Institutions that have not faced an accreditation sanction will be more likely to be closer to the final stage of the Concerns Based Adoption Model (CBAM), refocusing, than those institutions that have been sanctioned. In order to reach the final stage of the CBAM, an institution must have accepted the change (implementation of SLOs), must have integrated it into common daily practice, and must have a forward-looking mentality about how the use of the change can be modified to enhance the process. If a college has done all of these things, it is likely not on sanction for SLOs or use of assessment data, because it would have inadvertently met the proficiency standards listed in the ACCJC SLO rubric. This does not mean, however, that the institution would be able to avoid any type of accreditation sanction, but only those related to the use of student learning outcomes and/or assessment data.

4. Institutions in which effective SLO processes have been developed, implemented, and sustained over time will fare better in the accreditation process and will be placed on sanction less often than those that have struggled to establish effective SLO processes.

Limitations

This study focused on the entire California Community College (CCC) system, but the data analyzed and used to draw conclusions were limited to those institutions that responded to the online survey and agreed to be interviewed. Also, because the study focused on the CCC system, the results may not be generalizable to 4-year colleges or other community colleges throughout the nation. Another limitation is that only SLO coordinators and those directly involved in SLO assessment at each college were surveyed, and their perceptions may differ from those of other individuals on their campuses.

Theoretical Framework

The Concerns Based Adoption Model (CBAM) was chosen as the framework for this study because of its extensive history in implementing changes in education.

Senge’s change model was explored as an option but is geared more toward the business sector and based on the insights of “countless corporate practitioners, consultants, and academic researchers” (Senge et al., 1999, p. 31). The CBAM was developed in the 1970s to be used when implementing changes in education.

CBAM is founded based on several assumptions that are (a) change is a process, not an event; (b) change is accomplished by individuals; (c) change is a highly personal experience; (d) change involves developmental growth in feelings and skills; and (e) change can be facilitated by interventions directed toward the individuals, innovations and contexts involved. (Afshari, Bakar, Luan, Samah, & Fooi, 2009, pp. 1–2)

The model identifies seven stages of concern that individuals progress through when experiencing change.

The first stage, Stage Zero (awareness), is when an individual may be aware that a change is being implemented but is not invested in taking part in it. Stage One, informational, occurs once an individual is interested in the change and is seeking more information about the change and how it will be implemented. Stage Two, personal concerns, tends to elicit the strongest emotions, as individuals and groups struggle to understand how the change will impact them and whether they actually have the ability to implement it. Stage Three, management, occurs when individuals start to implement the change and become concerned with the logistics involved. Stage Four, consequence, is the time when individuals begin to question whether or not the change will truly impact their students’ learning: “Is it worth it?” During Stage Five, collaboration, individuals seek to work with others to ensure the change is as effective as possible. The final stage, Stage Six, refocusing, is concerned with changing the innovation to further enhance student learning (Loucks-Horsley, 1996).

For the purposes of this study, the CBAM was used to offer a framework for the survey questions. This model was also used to discuss the results and to offer suggestions for SLO coordinators to assess what concerns their faculty and staff had and how they could best use this information to move their colleges toward Sustained Continuous Quality Improvement (SCQI), as defined by the ACCJC SLO rubric (see Appendix A).

Significance of the Study

President Kennedy once stated,

Just because we cannot see clearly the end of the road, that is no reason for not setting out on the essential journey. On the contrary, great change dominates the world and unless we move with change we will become its victims. (Ringel, 2000, p. 1)

Change elicits emotional, physiological, and professional anxieties within individuals, especially when the change is coupled with uncertainty (Lane, 2007).

Leaders in the higher education community need not only to embrace change but also to guide others through the never-ending changes from lawmakers, constituents, and accrediting agencies. Institutions may encounter resistance to proposed changes and must react appropriately to address the fears creating the resistance and to create a culture of dialogue and support (Lane, 2007).

In recent years, the challenges that California community colleges have faced with regard to accreditation have been immense. In the past decade, more than half (62 of 112) of the California community colleges have faced a sanction from their accrediting agency, the Accrediting Commission for Community and Junior Colleges (ACCJC). In a recent study, Tharp (2012) found that schools that had been placed on sanction five or more times displayed different cultural practices than did those that had never been placed on sanction. These practices are important to understand if we aim to change them on our own campuses.

According to Tharp (2012), schools that had been placed on sanction five or more times expressed the following characteristics in comparison to schools that had never been placed on sanction:

• Campus-wide role definition was not clear.

• Conflict was longer lasting and more intense.

• Progress was made despite conflict.

• Accreditation was not deemed universally important.

• Motivation was external.

• Enforcement was not as consistent.

• The degree of contact with accreditation varied.

• Processes had less integrity.

• Participants were not as interconnected.

• Resources were not available.

The schools that had never been placed on sanction exhibited the opposite practices: roles were clearly defined, conflict occurred less often, conflict-resolution procedures were in place, accreditation was universally important, motivation was internal, enforcement was consistent, the degree of contact with accreditation was consistent, process integrity was maintained, participants were more interconnected, and resources were made available (Tharp, 2012). In order to move California community colleges from practices deeply rooted in the system (thinking that SLOs are not needed) to an alternative way of thinking and doing (seeing SLOs as a necessary tool to help improve instruction and student learning), SLO coordinators and other campus leaders must introduce systemic change. One of the most well-known models for introducing educational innovations is the Concerns Based Adoption Model (CBAM).

This author investigated how student learning outcomes coordinators on California community college campuses have assisted staff and faculty in overcoming barriers to successful implementation of SLO processes and have successfully avoided accreditation sanctions. The researcher sought to provide student learning outcomes coordinators with more knowledge about each stage of the SLO process, as well as about how to negotiate change when implementing an innovation.

Conclusion

This dissertation will be presented in five parts, beginning with Chapter I, which introduced the descriptive study, stated the problem to be explored and the purpose of the study, provided a theoretical framework for the study, and described the limitations. Chapter II will provide an overview of the relevant literature associated with accreditation and student learning outcomes. Chapter III will detail the research design and methods included in the study. Chapter IV will describe the findings of the data. Finally, Chapter V will provide an analysis of the findings and provide recommendations for action and future research about student learning outcomes and accreditation in the California Community College system.

CHAPTER II

REVIEW OF THE LITERATURE

Although the student learning outcomes movement has a long history in higher education in the United States, it did not begin in the California Community College system until 2002, when its regional accrediting agency, the Accrediting Commission for Community and Junior Colleges (ACCJC), implemented standards for student learning outcomes that all California community colleges (CCCs) must follow. The goal of student learning outcomes should be focused on improving student learning; however, it is often difficult for colleges to separate this more intrinsic motivation from the need to ensure that the college is meeting the requirements set forth by the ACCJC (an external force) (ACCJC, 2011).

While many colleges see the ACCJC as an external governmental force that intervenes in the business of higher education, it is important to remember that regional accreditors in the United States are a system of voluntary members who have agreed to undergo a peer-review process. Members pay a fee to join the ACCJC and agree to complete a comprehensive review every 6 years. This process consists of four phases: (1) Internal Evaluation—the college evaluates itself against the established standards, sets action plans for improvement, and submits a report detailing the evaluation and the plans to the Commission; (2) External Evaluation—the Commission sends a team of the college’s peers (other CCC members) to evaluate the college and prepare a report; (3) Commission Action—the Commission reviews the internal and external evaluations and decides on the college’s accreditation status; and (4) Self-Improvement—the college works to improve, using the Commission’s recommendations as a guide (ACCJC, 2011).

When the Commission takes action on a college, it has four main options. The first is to reaffirm the accreditation of the college. This can be done for a maximum of 6 years but may be for a shorter time (3 years, for example) if the Commission wants to reexamine an aspect of the college prior to the full 6-year cycle. The other three actions are considered sanctions, and, under federal law, if a college is placed on sanction for any issue, it has only 2 years to come back into compliance, or it could face termination of its accreditation. The first level of sanction is called “Warning,” the second is “Probation,” and the third is “Show Cause.” Each of these sanctions comes with a list of recommendations that colleges must address to come back into compliance with the standards within the required timeline. Therefore, it is important for colleges to understand not only what the accreditation standards are but also how the individual college is employing them in its practices and policies (ACCJC, 2011).

ACCJC Standards

The purpose of accreditation is to evaluate the general educational quality and the effectiveness of institutions. Standards are established for member colleges to assure students, employers, and other institutions of the value of degrees and certificates awarded. These standards allow the general public and educational community to have confidence that institutions have well-defined goals and are actively upholding expectations that are reasonably achieved. Peer-reviewed self-studies and assessments are the main improvement strategies used as guidelines for institutions (ACCJC, 2011).

Standard I: Institutional Mission and Effectiveness

According to the ACCJC, the overview of Standard I is as follows:

The institution demonstrates strong commitment to a mission that emphasizes achievement of student learning and to communicating the mission internally and externally. The institution uses analyses of quantitative and qualitative data and analysis in an ongoing and systematic cycle of evaluation, integrated planning, implementation, and re-evaluation to verify and improve the effectiveness by which the mission is accomplished. (ACCJC, 2012a, p. 2)

After this synopsis of Standard I is outlined, the standard is divided into two parts: Section A, listed as Mission, and Section B, listed as Improving Institutional Effectiveness.

Four items are listed under Section A of Standard I. These items are intended to provide directives regarding the institution’s student population, its learning programs and services, the involvement of the governing board, the decision–making process, and the inclusion of the mission in institutional planning (ACCJC, 2012a).

This standard also includes allocating necessary resources and reports that may aid leaders in improving college performance.

Section B of Standard I, Improving Institutional Effectiveness, gives guidelines about how institutions should be reasonable in creating and supporting student learning, assessments, and appropriate changes (ACCJC, 2012a). To aid in institutional improvement and effectiveness, consistent evaluations and revisions are used to measure the quality of services. The seven subsections of Section B deal with the process of improving institutional effectiveness. A synopsis of these subsections is listed below (ACCJC, 2012a):

1. Continuous communication regarding the institution’s progress on student learning and processes.

2. The stated purpose must be consistent with the institution’s plans to improve. These goals must be measurable in order to promote collaboration toward their achievement.

3. A cyclic process of evaluation, planning, resource allocation, and implementation should be assessed frequently.

4. Institutions should prove that their planning process is broad based, that appropriate stakeholders provide input, and that appropriate resources are available for the improvement of institutional effectiveness.

5. Measures to improve quality control should be documented and communicated to all necessary constituents.

6. The college’s ongoing planning and resource-allocation efforts are kept effective through a system of review, research, and adjustment.

7. The institution should assess its evaluative processes by reviewing their effectiveness in improving student services, instructional programs, libraries, and other learning services.


Standard II: Student Learning Programs and Services

Standard II of the ACCJC accreditation standards has three core sections relating to Instructional Programs, Student Support Services, and Library and Learning Support Services. The overview of this standard states,

Under the first core section, Instructional Programs, there are eight substandards. The first substandard (Standard IIA1) requires that the instructional programs demonstrate that they meet the mission of the college. In addition, this standard addresses instructional programs that are not necessarily at the main college site but may be delivered through different modalities, such as distance education and correspondence education. These programs are to be held to the same standards as those on the main college campus. The institution accomplishes this by meeting three additional substandards (Standard IIA1a–c). These three standards discuss meeting the educational needs of students by administering programs that meet community needs and by using data on student learning needs to measure student learning outcomes. Standard IIA1 also addresses the need to have, assess, and use student learning outcomes for all courses, programs, certificates, and degrees (ACCJC, 2012a).


Standard IIA2 aims to address the quality and improvement of the courses and programs offered by the institution. It does so by addressing a series (a–f) of substandards. These substandards address student learning outcomes by discussing the need for established procedures for these outcomes and the faculty’s role in improving courses and programs. It also refers to the need to assess course relevance, appropriateness, achievement of learning outcomes, currency, and future needs in a systematic way. Substandards a–f describe the requirement for institutions to make results of assessments available to those vested in the campus community. Finally, it states the requirements for awarding course credit and degrees to students based on stated student learning outcomes achievement (ACCJC, 2012a).

The third standard (IIA3) states the requirements and provides a definition of general education assessment. There are three definitions, by way of student learning outcomes, of what students should be able to do or know if they complete their general education at an institution.

The fourth standard states that “All degree programs include focused study in at least one area of inquiry or in an established interdisciplinary core” (ACCJC, 2012a, p. 6). Most institutions meet this standard when they go through the program approval process at the CCC Chancellor’s Office.

The fifth substandard (IIA5) outlines the need for vocational or occupational programs to assess competencies of their students. This assessment would occur after students’ completion of a program, and prior to students’ preparation to take external licensure or certification exams.


Substandard IIA6 addresses communicating information about courses, programs, and transfer policies to students. Prescribed methods of this communication are through course syllabi, published policies and catalogs, and the institution’s website.

The seventh substandard discusses the requirements for developing, using, and making public policies about academic freedom, responsibility, student academic honesty, and codes of conduct. This standard also outlines the consequences for not complying with these public policies.

The eighth, and final, substandard of Standard IIA is to ensure that, if an institution offers any courses internationally, it holds those courses to the same standards as courses offered on the main campus site. This standard reflects intentions and goals similar to those of Standard IIA1 (ACCJC, 2012a).

The second core standard is Standard IIB: Student Support Services, which contains four substandards. The first substandard is similar to that of the Instructional Programs standard in that it requires student support programs to support the mission of the institution (ACCJC, 2012a).

Standard IIB2a–d lists the required components for an institution’s college catalog. These components include general information, requirements, major policies affecting students, and the locations or publications where other policies may be found (ACCJC, 2012a).

The third standard (Standard IIB3a–f) addresses how institutions should identify student support needs and develop programs to meet those needs. It accomplishes this through equitable access to services, encouraging personal and civic responsibility, evaluating services, enhancing the understanding of diversity, evaluating placement exams, and maintaining student records. The final standard discusses evaluating these services to ensure that they equitably meet student needs, providing assistance for students in achieving student learning outcomes, and using the results for program improvement (ACCJC, 2012a).

Library and Learning Support Services is the third core standard under Standard II and contains two substandards. The services that must be assessed through student learning outcomes under this standard are library services, tutoring, computer laboratories, and learning technology development and training. The first of the two substandards in Standard IIC is to ensure that the institution provides services that are “sufficient in quantity, currency, depth, and variety to facilitate educational offerings” (ACCJC, 2012a, p. 9). This is accomplished by selecting equipment and materials to support student learning, providing access to the library and training about library services, providing maintenance and security of the library, and regularly evaluating the agreements it holds with other institutions for resources.


Standard III: Resources

Standard III of the ACCJC accreditation standards has four main sections relating to resources. These four sections, which are covered under the umbrella of resources, are human resources, physical resources, technology resources, and financial resources. The overview of this standard includes the following statement:

The institution effectively uses its human, physical, technology, and financial resources to achieve broad educational purposes, including stated student learning outcomes, and to improve institutional effectiveness. Accredited colleges in multi-college systems may be organized such that responsibility for resources, allocation of resources and planning rests with the system. In such cases, the system is responsible for meeting standards on behalf of the accredited college. (ACCJC, 2012a, p. 10)

The first substandard, Standard IIIA, considers the area of human resources. This substandard is designed to emphasize the importance, for institutions seeking accreditation, of hiring qualified personnel. The general idea of Substandard IIIA is that an institution will hire qualified staff to provide educational support, with the ultimate goal of improving institutional effectiveness. Additionally, the institution will provide equitable treatment and professional development and will evaluate its employees regularly (ACCJC, 2012a).

Under this substandard of human resources are six additional sections that outline the role institutions play in meeting the overarching accreditation standard of resources. The first of these sections discusses the importance of assuring integrity and quality in the hiring of personnel at the institution (ACCJC, 2012a).


The second section under the substandard of human resources describes the importance of institutions maintaining a sufficient number of qualified staff and faculty. In order for institutions to meet their mission and achieve their goals, they must provide a sufficient number of personnel to support this.

The third and fourth sections under the substandard of human resources describe the importance of developing equitable policies and procedures for institution personnel. The third section focuses a great deal on developing policies and procedures that provide fairness in the hiring process and provide security of personnel records. The fourth section of the human resources substandard recommends that policies and procedures display an appropriate understanding of equity and diversity.

The fifth and sixth sections of the human resources substandard recommend that institutions provide support to employees for professional development and that they plan for the institution to move forward (ACCJC, 2012a).

The second substandard of the ACCJC accreditation standard for resources is the area of physical resources (Standard IIIB). This substandard is designed to make the physical environment of institutions a resource for the continued improvement of the institutions’ effectiveness. The overview of this substandard of physical resources states that institutions will provide “facilities, land, and other assets, support student learning programs and services and improve institutional effectiveness. Physical resource planning is integrated with institutional planning” (ACCJC, 2012a, p. 12).


The third substandard for resources (Standard IIIC) is the area of technology resources. This substandard is designed to make the available technology at the institutions a resource for continued improvement of the institutions’ effectiveness and for student learning programs and services. The overview of this substandard of technology resources encompasses two sections.

The two sections covered under the substandard of technology resources emphasize the current use of technology and the future planning of technology. The first section recommends that any use of technology should directly support the needs of “learners, teaching, college-wide communication, research, and operational systems” (ACCJC, 2012a, p. 13).

The last of the four substandards of resources is financial resources. The overview of this substandard states that

financial resources are sufficient to support student learning programs and services and to improve institutional effectiveness. The distribution of resources supports the development, maintenance, and enhancement of programs and services. The institution plans and manages its financial affairs with integrity and in a manner that ensures financial stability. The level of financial resources provides a reasonable expectation of both short-term and long-term financial solvency. Financial resources planning is integrated with institutional planning at both college and district/system levels in multi- college systems. (ACCJC, 2012a, p. 13)

Under the substandard of financial resources are four sections outlining the role institutions play in meeting the overarching accreditation standard of resources and the substandard of financial resources. The four sections cover the role finance plays in meeting the institutions’ missions, maintaining institutional integrity, planning and procedures, and institutional planning and assessment (ACCJC, 2012a).


Standard IV: Leadership and Governance

Section A of Standard IV discusses the decision-making process and the importance of ethical and effective leadership throughout the organization. According to Standard IV, A, 1, “Institutional leaders create an environment for empowerment, innovation, and institutional excellence” (ACCJC, 2012a, p. 16). Leaders should be encouraging and should help everyone feel that they are a part of the success of the institution through a systematic participative process (ACCJC, 2012a).

Standard IV, A, 2, discusses the establishment and implementation of written policies for faculty, staff, administration, and students to participate in the decision-making process. This section also states that institutions depend on faculty, the academic senate and other faculty structures, the curriculum committee, and academic administrators for guidance on student learning programs and services (ACCJC, 2012a).

Section 3 under Standard IV, A, recommends that all members of the institution work together for the common good of the college. This is accomplished through established governance structures, processes, and practices. Good communication is key to the discussion and transferal of ideas, and the board is responsible for facilitating this communication (ACCJC, 2012a).

Standard IV, A, sections four and five, deal with the honesty and integrity of the institution. This honesty and integrity is maintained by complying with Accrediting Commission standards and expeditiously responding to any recommendations made by the Commission. Leadership is regularly evaluated, according to Standard IV, A, 5, to assure this honesty and integrity, and the results of the evaluations are used for improvement (ACCJC, 2012a).

ACCJC Rubric for Student Learning Outcomes

Seven different activities are listed in the ACCJC rubric for SLO proficiency. The first is that student learning outcomes and authentic assessments are in place for courses, programs, support services, certificates, and degrees. This has been one of the most difficult characteristics of student learning outcomes implementation for institutions to achieve, as it implies that all courses, programs, student services, administrative services, and other campus programs must measure their impact on student learning (ACCJC, 2012b).

The next activity listed is widespread institutional dialogue about the results of assessment and identification of gaps. Once the process for assessment is established and the campus is assessing its SLOs, the results that are collected need to be discussed across campus. This dialogue should be substantial and should include information about what the results mean to the department or course, how they can be used to improve the department or the course, and what resources might be needed to implement the changes (ACCJC, 2012b).

The third and fourth activities are that decision making includes dialogue concerning the results of assessment and is purposefully directed toward aligning institution-wide practices to support and improve student learning, and that appropriate resources continue to be allocated and fine-tuned. These two characteristics are highly related, as they both deal with making decisions about program and course improvement and allocating the resources necessary to make changes to improve student learning. On most campuses, these are interpreted to mean that program and course improvement should be based on data collected through the assessment of student learning outcomes. Resources should then be allocated to requests based on assessment data and reassessed for continued allocation based on this process (ACCJC, 2012b).

The fifth activity, that comprehensive assessment reports exist and are completed and updated on a regular basis, implies that departments have collected data through assessment, that they have compiled and discussed it, and that they have reported their findings in a report format. These reports should be kept by the institution in a central location, allowing easy access for the entire campus and, potentially, all off-campus persons, and they need to be updated on a regular basis. Although campuses vary across the state in how often they produce comprehensive reports, almost all campuses have agreed that annual updates are conducted between their comprehensive report cycles. This allows for annual data collection, discussions, and resource requests (ACCJC, 2012b).

The next activity is that course student learning outcomes are aligned with degree student learning outcomes. This characteristic is fairly simple to achieve, and most colleges do not struggle with this requirement. Through the curriculum process, course student learning outcomes are directly tied to both program and institutional learning outcomes. This allows for an easy and direct link showing how individual courses are helping the colleges achieve their stated learning outcomes and, therefore, their mission (ACCJC, 2012b).

The final activity is that students demonstrate awareness of the goals and purposes of the courses and programs in which they are enrolled. If colleges were to merely create and assess student learning outcomes without ever engaging students in the process, the effort would seem to lack completeness and meaning. An easy way to accomplish this would be to place all student learning outcomes directly on the course syllabi and have the instructors review the desired outcomes on the first day and throughout the course (ACCJC, 2012b).

All CCCs are also supposed to take action, according to the ACCJC, to move toward Sustained Continuous Quality Improvement (SCQI) for student learning outcomes. To do so, they would be engaged in the following activities:

• student learning outcomes and assessment are ongoing, systematic, and used for continuous quality improvement;

• dialogue about student learning is ongoing, pervasive, and robust;

• student learning outcomes processes are evaluated;

• organizational structures to support student learning are evaluated and fine-tuned on an ongoing basis;

• student learning improvement is a visible priority in all practices and structures across the college; and

• learning outcomes are specifically linked to program reviews (ACCJC, 2012b).

Accreditation and Student Learning Outcomes

The simple fact is that all California community colleges have voluntarily chosen to be accredited by the ACCJC. In fact, each college pays a significant fee to seek this accreditation. The ACCJC, like all other accrediting agencies, requires colleges to assess student learning in order to maintain their accreditation status (Beno, 2004; Serban, 2004; Suskie, 2010). Accreditors came to understand that quality-review processes that include a focus on student learning draw the accreditation process itself closer to its true purpose of assessing the quality of education offered by an institution of higher education (Beno, 2004, p. 66). The accreditors are concerned about the assessment that institutions are making of the quality of education being provided and are requiring colleges to assess student learning as a central focus of that quality-review process. Institutions must comply with all of the requirements in order to receive accreditation.

In higher education, it is a rarity to discuss improving pedagogy; however, professors are responsible for passing along knowledge to students. They should be meeting the needs of their students, and they do this by ensuring that they are effectively conveying their knowledge. Assessing student learning ensures that professors are contributing to the improvement of student learning. Colleges need to try to understand their students better to help them gain as much as possible from their education so they can meet their educational goals and improve their lives.


Student Learning Outcomes

“Student outcomes assessment is the act of assembling and analyzing both qualitative and quantitative teaching and learning outcomes evidence in order to examine their congruence with an institution’s stated purposes and educational objectives” (Serban, 2004, p. 17). This quote exemplifies the mission-based accreditation standards that guide the assessment of student learning outcomes on California community college campuses. Institutional student learning outcomes expand this notion to assess how the entire institution is doing in meeting its overarching goals by evaluating its individual programs and courses.

Serban (2004) reported that there are three recognized ways to assess general education: through courses, through the general education themes, and through other noncourse means. Course-level assessment is done by linking individual courses to at least one general education area and then making an assumption about whether that general education area has met its goals based on whether the course outcomes were met. In order to assess the general education themes, faculty from various courses are asked to provide information about their course-level outcomes, and an assumption is made based on the grouping of courses. Finally, noncourse outcomes are looked at from noninstructional areas, such as student services on campus that are linked to a general education area.

Seven areas of assessment expertise and skill are needed to ensure assessment of the institutional learning outcomes. “One of the major challenges in building, sustaining, and effectively utilizing student learning outcomes assessment is having the needed expertise and skills on campus” (Serban, 2004, p. 23). Vision implies a clear understanding of the college’s goals in order to assess whether outcomes are being met. Individuals with expertise should also have an understanding of the college, how it functions, and its barriers and strengths. Experts with a functional knowledge of measurement concepts, research-design methods, and statistical analysis should be sought, and institutions must locate individuals who understand the instructional components being assessed. These experts must also possess strong communication skills and have the academic qualifications and training needed to conduct such work (Serban, 2004).

Beno (2004), who is the current Executive Director of the ACCJC, discussed the role of accrediting agencies in assessing institutional effectiveness and quality. When an institution has accreditation from the ACCJC, it is deemed to be providing a high-quality educational experience for students and the community. This, in turn, creates a sense of accountability to the college’s constituents. Beno identified four practical guidelines for institutions: document expected student learning outcomes, document institutional assessment of learning, document student learning outcomes, and document the use of assessment results for institutional improvements.

Beno (2004) states that the process of assessing student learning will inevitably cause faculty to explore a variety of pedagogical as well as assessment strategies. Deciding on the most effective strategies for teaching and for assessing learning requires experimentation, careful research, analyses, and time (Beno, 2004, p. 67). Suskie (2010) and Banta (2002) agree that research-informed pedagogies to improve teaching are needed. Therefore, allowing faculty the freedom to experiment and explore the assessment of student learning will lead to more informed pedagogical practices and increased student learning.

The assessment of student learning outcomes allows faculty members to know how to help students who have academic deficits or are battling other significant issues in life that may be creating barriers to their learning. Student learning should never be a secondary concern for a faculty member. Faculty may often disagree when discussing SLOs, but it is these intelligent discussions between highly educated content experts that can help improve teaching and learning.

Suskie (2010) and McClenney (1998) point to the increased pressure from stakeholders and investors to prove that students are learning, what they are learning, how colleges are helping those students who are struggling, and how colleges are spending the money they receive. Banta, Black, Kahn, and Jackson (2004), as well as Banta (2002), discuss how stakeholders want to know that education is worth investing in. Therefore, if colleges want to position themselves to keep and gain funds for their programs, they must find ways through the assessment of student learning to prove that they are being successful. The most effective means to assess student learning is through faculty members’ use of authentic assessment of their own students. This will garner much more meaningful information about what students are learning and how colleges can improve that learning than a standardized assessment from a test publisher. Colleges need to ensure that, when people ask why they should put money into instruction, they can answer with confidence that faculty members are ensuring that when students leave a course or a college, they have learned something.

Assessing Student Learning Outcomes

“What seems to be missing in many debates about higher education … is engagement between those who understand and those who act” (Goldstein & Young, 1992, p. 39). Loacker (1988) stated that faculty members need to be at the core of the assessment process on campuses in order for the college to successfully affect student learning.

Although faculty members need to be intimately involved in assessment processes on campus, many barriers prevent them from doing so effectively.

Goldstein and Young (1992) state that the invasion of administration and political action into the academic venue is problematic, and, oftentimes, adequate support is not provided for faculty to be involved at the level needed. Walvoord and Pool (1998) and Palomba and Banta (1999) discuss the notion that faculty members may see top-down approaches as an invasion because they have experienced a culture of valued autonomy. When their autonomy is threatened by policies or administration, they quickly become suspicious.

The researchers go on to list four additional barriers: collegiality, intrinsic rewards, time, and contact with students. Collegiality deals with faculty members’ intense dedication to their department and their discipline being put before their concern for the institution. In addition, it refers to the departmental culture of faculty-to-faculty conflict (Walvoord & Pool, 1998).


On a more systemic level, faculty-to-faculty conflicts can transcend department lines and become conflicts between departments or between schools in a university. The net results of all these faculty conflicts are faculty and staff stress, a loss of productivity, an inability to effectively meet students’ needs, and a drain on the university’s administrative resources. (Leal, 1995, p. 21)

Time is one of faculty members’ most valuable possessions, and it can be a major barrier to successful implementation of assessment on campus if the faculty members do not see the results being used or they believe that their time is being “wasted” by conducting assessments for outside entities and not for internal purposes (Palomba & Banta, 1999; Walvoord & Pool, 1998).

Student Learning Outcomes Leadership

Student learning outcomes coordinators (SLO coordinators) are often put in the precarious position of leading the entire campus in a unified direction toward fulfilling accreditation standards for student learning outcomes. The SLO coordinator is oftentimes either a faculty member or an administrator but must hold some authority over all areas of the campus concerning SLO matters. The success of SLO coordinators is highly dependent on their personality and their ability to recognize and resolve barriers that are present in the college’s current state, as well as to predict and prepare for barriers that may arise once the student learning outcomes and program review processes begin (ASCCC, 2007).

In order for an SLO coordinator to effectively lead a campus through the development and institutionalization of the SLO processes, a strong leadership team needs to be formed. This team should be composed of all the campus leaders and SLO or program-review coordinators. Their purpose should be to serve as an educated resource, not as the decision-making body in the accreditation process. This team would be used to gain initial support and involvement from faculty and staff, to educate others about the processes, and to be in touch with what is happening across campus. As Walvoord and Pool (1998) stated, “You cannot lead faculty members to water, much less make them drink. You have to figure out what would make them want to head toward the pond” (p. 36). Assessment needs to be faculty driven, but that may take the leadership of others on campus, especially in the beginning stages.

Once the campus community is engaged in the process, other issues besides development and the beginning stages of use are bound to become more pressing. In order to move this effort forward, it needs to move from a new idea (innovation) to an institutionalized activity. The campus leadership must regularly show that assessment and the use of its results are vital to the functioning of the institution (Gray, 1997). This can best be done by tying assessment to campus decision making and budgetary decisions (Magruder, McManis, & Young, 1997). In addition, finding an individual’s intrinsic motivation can help gain involvement and commitment (Gray, 1997).

According to Palomba and Banta (1999), when a campus experiences faculty resistance, such as a lack of commitment, a refusal to participate, or an attempt to create conflict, the leadership needs to address why this is occurring. Usually, faculty members resist the change because they do not see the connection between the assessment they are being asked to conduct and their classroom teaching and learning process, they do not feel appreciated for the time they devote to the process, they believe that the process will somehow threaten their jobs and their academic freedom, or they believe that the information they collect will sit on the shelf unused (Magruder, McManis, & Young, 1997; Palomba & Banta, 1999).

In order to counteract resistance, campus leadership should utilize the three Rs of faculty involvement discussed by Palomba and Banta (1999): responsibility, resources, and rewards. Responsibility refers to campus leadership providing clearly defined roles and expectations for all persons involved. Faculty members need to accept responsibility for the assessment processes, but they must also be given the responsibility to control most aspects of the process.

Resources must be provided to faculty members to learn about and understand the assessment and the accreditation processes. The institutions should consider providing formal written documents that can guide the work of faculty, providing books about assessing student learning outcomes, sending individuals to conferences to learn and to network with others, providing opportunities on campus for individuals to gather together and discuss assessment, and providing human support in the form of clerical and data-entry personnel and coordinators to assist with the process.

Finally, rewards must be used to provide incentives for making positive contributions to the processes. This can be done intrinsically (motivating faculty members to improve their teaching), using more explicit methods (release time, ties to tenure or promotion, verbal and written recognition, and stipends), or through other forms of recognition (on-campus presentations or the ability to publish articles about their research). Campus leadership must find ways to maximize and maintain faculty involvement in the SLO processes through direct involvement, using local instruments (which encourage involvement and provide faculty members a sense of control), encouraging teamwork, and ensuring that the results collected are utilized. Finally, administration needs to be flexible, not top-down (Palomba & Banta, 1999).

Concerns-Based Adoption Model (CBAM)

President Kennedy stated,

Just because we cannot see clearly the end of the road, that is no reason for not setting out on the essential journey. On the contrary, great change dominates the world and unless we move with change we will become its victims. (as cited in Ringel, 2000, p. 1)

Change elicits emotional, physiological, and professional anxieties within individuals, especially when the change is coupled with uncertainty (Lane, 2007).

Leaders in the higher education community need to not only embrace change but also guide others through the never-ending changes from lawmakers, constituents, and accrediting agencies. Institutions may encounter resistance to proposed changes and must react appropriately to address the fears creating the resistance and to create a culture of dialogue and support (Lane, 2007).

In recent years, the challenges that California community colleges have faced regarding accreditation have been immense. Over the past decade, more than half (62 of 112) of California community colleges have faced sanctions from their accrediting agency, the Accrediting Commission for Community and Junior Colleges (ACCJC). A recent study by Tharp (2012) indicated that schools that had been placed on sanction five or more times displayed different cultural practices than did those that had never been placed on sanction. These practices are important to understand if we aim to change them on our own campuses.

According to Tharp (2012), schools placed on sanction five or more times expressed that

• campus-wide role definition was not clear

• conflict was longer lasting and more intense

• progress was made despite conflict

• accreditation was not deemed to be universally important

• motivation was external

• enforcement was not as consistent

• the degree of contact with accreditation varied

• processes had less integrity

• participants were not as interconnected

• resources were not available

Schools never placed on sanction exhibited the opposite practices: clearly defined roles, less conflict, conflict resolution in place, accreditation viewed as universally important, internal motivation, consistent enforcement, a consistent degree of contact with accreditation, process integrity maintained, participants more interconnected, and resources made available (Tharp, 2012). In order to move California community colleges from practices that are deeply rooted in the system to an alternative way of thinking and doing, we must introduce a systemic change. One of the most well-known models for introducing educational innovations is the Concerns-Based Adoption Model (CBAM).

This model was developed in the 1970s to be used when implementing changes in education.

CBAM is founded based on several assumptions that are (a) change is a process, not an event; (b) change is accomplished by individuals; (c) change is a highly personal experience; (d) change involves developmental growth in feelings and skills; and (e) change can be facilitated by interventions directed toward the individuals, innovations and contexts involved. (Afshari, Bakar, Luan, Samah, & Fooi, 2009, pp. 1–2)

The model identifies several stages of concern that individuals progress through when experiencing change.

The model has seven stages of concern. The first stage, Stage Zero, is when an individual may be aware that a change is being implemented but is not committed to the idea enough to take part in it. At this stage, it is vital to get people on board with the change. Providing staff with the ability to be involved will help gain their interest in the change (Afshari et al., 2009; Loucks-Horsley, 1996).

Stage One, informational, occurs once an individual is interested in the change and is seeking more information about the change and how it will be implemented. To address this concern, consistent information through multiple modalities is necessary; providing presentations to groups, distributing handouts, sending email messages, and having informal conversations with individuals are all important at this stage (Afshari et al., 2009; Loucks-Horsley, 1996).

The third stage, Stage Two (personal concerns), tends to elicit the strongest emotions, as individuals and groups struggle to understand how the change will impact them and whether they actually have the ability to implement it. At this stage, interacting in smaller groups or one to one will gain the most progression toward the next stage. Individuals need to be reassured that they can implement the change (Afshari et al., 2009; Loucks-Horsley, 1996).

The fourth stage, Stage Three (management), occurs when individuals start to implement the change and become concerned with its logistics. At this time, individuals are concerned about the amount of time needed to devote to the change, the impact of time taken from other tasks, and preparing materials for the change. Individuals at this stage need specific, practical resolutions regarding the “how to” of implementing the change (Afshari et al., 2009; Loucks-Horsley, 1996).

The fifth stage, Stage Four (consequence), is the time when individuals begin to question whether or not the change will truly impact their students’ learning: “Is it worth it?” Those implementing the change may also want to alter the innovation in order for faculty members to believe that it is having a bigger impact on their students. Allowing for professional-development opportunities during this time (site visits, conferences, other sharing opportunities) will help to validate what the individuals are doing (Afshari et al., 2009; Loucks-Horsley, 1996).


During the sixth stage, Stage Five (collaboration), individuals are seeking to work with others to ensure that the change is as effective as possible. At this time, providing the chance for collaboration within the site to occur is important (Afshari et al., 2009; Loucks-Horsley, 1996).

The final stage, Stage Six (refocusing), concerns changing the innovation to further enhance student learning. It is in the best interest of the institution to support these efforts by giving individuals an outlet to bring forward alternatives and providing opportunities for feedback and discussion (Afshari et al., 2009; Loucks-Horsley, 1996).

Three main notions need to be kept in mind when utilizing CBAM to address the change process on campus: know where people are, plan for implementation over a span of years, and stimulate interest. In order to utilize this model to its fullest, one must assess which stage of concern different groups of people are at and address their concerns at that level. Overwhelming individuals with too much information or stifling their creative energy to make change can have a negative effect. Change is a process that takes time; it is estimated that most innovations take about 3 years to implement, and even then, more issues will emerge. Finally, stimulating interest in the change is vital if the change is to be long standing and gain the support of the greatest number of individuals (Loucks-Horsley, 1996).

Conclusion

“When members of an organization enjoy a fair amount of autonomy, such as enjoyed by faculty, decisions related to implementing and institutionalizing innovations cannot be made unilaterally and be expected to go uncontested” (Gray, 1997, p. 8). It is the responsibility of the campus leadership to involve all members of the campus community, especially faculty, in the assessment decisions made, and efforts must be made to ensure that this group of highly influential campus constituents is validated and respected for the work they do on behalf of improving student learning.

This review of literature discussed the emergence of the student learning outcomes movement in California community colleges. Moreover, this chapter provided an overview of the ACCJC and a look at the work that has been done with student learning outcomes. It also documented some of the barriers to be faced when trying to implement and sustain student learning outcomes processes. The next chapter will outline the methodology of the study. Chapter IV will then discuss the findings of the study, and Chapter V will present the interpretations and recommendations of the study with a connection to the literature.

CHAPTER III

METHODOLOGY

The purpose of this study was to explore how California community colleges utilize the Student Learning Outcomes Coordinator positions on their campuses, what barriers campuses face when trying to implement SLOs, how campuses have effectively addressed these barriers, and what role accreditation status plays in implementing SLOs. This was accomplished through an examination of the ways in which California community colleges sustain effective practices for student learning outcomes and their use of the data derived from SLOs. The more knowledge those in charge of the SLO processes on campuses have with regard to implementing innovations, the better they will be able to assist their campuses to accept and use the SLO processes and not become sanctioned by the ACCJC for SLOs or the use of assessment data.

The quantitative (descriptive) survey analysis revealed common characteristics of colleges on and off sanction from the ACCJC for SLOs or the use of assessment data and their ability to implement SLO processes on their campuses. In addition, it revealed barriers that exist on campuses to implementing SLOs and how those barriers have been successfully overcome.

Using the framework of the Concerns-Based Adoption Model (CBAM), this researcher identified which stages of concern and levels of use are problematic for institutions to overcome and identified activities or methods employed by others to overcome them and move to institutionalize the SLO processes on campus. This information can assist campus leaders when trying to make decisions about how to best implement SLO processes, gain acceptance from faculty and staff, and ultimately improve student learning.

Research Questions

The following questions guided this study:

• RQ1: What strategies have institutions in the California Community College system utilized to implement student learning outcomes processes?

• RQ2: What barriers exist in the implementation of student learning outcomes processes?

• RQ3: What strategies have been successful in eliminating or minimizing the barriers to implementing student learning outcomes processes?

• RQ4: What institutional cultural characteristics affect the successful implementation of student learning outcomes processes?

• RQ5: In what ways has the institutions’ accreditation status impacted their implementation of student learning outcomes processes?

Role of the Researcher

The researcher is a full-time manager at a California community college in the Central Valley. In addition to her management duties, she also serves as the Student Services Program Review (PR) and Student Learning Outcomes (SLO) Coordinator. In this Coordinator role, she works closely with Instructional SLO and PR Coordinators through committee work, collaboration activities, trainings, and accreditation report writing.

The researcher solicited the participation of the California Community College (CCC) SLO coordinators by way of direct email contact through campus-provided email addresses and various listservs. The survey was sent directly from the researcher. In order to obtain individual campus email addresses, a list was created that contained the name of each of the 112 CCCs in the system and the email contact information for the campus President, Vice President of Instruction, Academic Senate President, Accreditation Liaison Officer (ALO), and SLO Coordinator.

The researcher was fully responsible for the tracking of respondents, follow up with respondents, and the handling of any technical difficulties that arose from the electronic survey. In addition, the researcher coded the open-ended survey data.

Setting

The California Community College system served 2,292,258 students in the 2012–2013 academic year, making it the largest community college system in the United States. The student population is highly diverse, with 53.04% being female and 45.78% being male. The ethnic breakdown comprises 877,802 (38.29%) Hispanic students, 692,529 (30.20%) Caucasian students, 263,232 (11.48%) Asian students, 166,143 (7.25%) African American students, and 292,822 (12.78%) students with another ethnicity or unknown ethnicity (CCCCO, 2014).

The California Community College system was chosen for this study due to its size and its emerging issues with its regional accreditor, the Accrediting Commission for Community and Junior Colleges (ACCJC). Many California community colleges have been placed on sanction by the ACCJC, at a rate not comparable to the rates of regional accreditor sanctions of community colleges throughout the United States: while ACCJC sanction rates for CCCs typically exceed 60%, other regional accreditors have sanction rates closer to 2% (Cohn, 2013).

Population and Sample

The population for this study consisted of the student learning outcomes coordinators or other SLO leaders on all 113 California community college campuses. Some colleges may have more than one SLO coordinator; therefore, more than 113 individuals were invited to participate in the online survey. According to Gall, Borg, and Gall (1996), purposeful sampling is used to “select cases that are likely to be ‘information-rich’ with respect to the purposes of the study” (p. 218). For this study, purposeful sampling was used since the study targeted a specific group of individuals.

For the system-wide survey, all SLO coordinators and other leaders in SLO assessment on campuses from each of the California community colleges were sent an email message through various listservs with a link to an online survey. The survey contained 25 questions regarding SLO processes, challenges, strategies to overcome barriers, and successes, as well as questions tailored to assess where colleges are with regard to the Concerns-Based Adoption Model stages of concern and levels of use.


Following the completion of the survey, the results were analyzed. The researcher coded the qualitative data and analyzed the quantitative data to draw conclusions from the online surveys collected.

Ethical Considerations

In order to protect the confidentiality and anonymity of the participants and the participating colleges, all data were coded so that only the researcher knows the names of the individuals and colleges participating. Individuals were noted by their position and their respective college code, for example, SLO Coordinator from College A. The data are stored on the researcher’s personal computer, which is password protected, and on a password-protected thumb drive as a backup. Surveys were all collected online and were accessible by the researcher only through her personal Qualtrics account, hosted on the California State University, Stanislaus, website, which requires a specific username and password.

Any presentations or other public distribution of this dissertation or any part of it or its findings will not reveal the identity of the participants or their colleges.

The CSU, Stanislaus, Institutional Review Board reviewed and approved all of the research protocols involving human subjects to ensure compliance with the University’s regulations and all applicable laws before any data collection took place for this study.


Methodology

This study included the development of a survey, pilot testing of the survey, data collection, and analysis for each of the components of this descriptive study. An explanatory descriptive method design was employed.

System-wide Survey

The first step of the study consisted of the development, pilot testing, distribution, and collection of the data from a system-wide survey to all 113 California community colleges. In order to develop survey questions, information from the literature regarding student learning outcomes was used. In addition, information gained from a review of the ACCJC Student Learning Outcomes rubric for Sustained Continuous Quality Improvement was used. The questions were also framed keeping in mind the Concerns-Based Adoption Model’s stages of concern and levels of use. Twenty-five questions were developed, most requiring a multiple-choice response, while providing an opportunity for an open-ended response following several of the questions (see Appendix B). Several of the questions were adopted from the survey conducted in 2007 by the Academic Senate for California Community Colleges entitled “Agents of Change: Examining the Role of Student Learning Outcomes and Assessment Coordinators in California Community Colleges” (see Appendix C). The proposed questions were pilot tested to determine if the questions were clear and were understood in the same context across multiple individuals, as well as to ensure that no questions were slanted to create a bias in the responses of individuals. Very minor adjustments were made to the question content, the order of the questions, and their formatting (see Appendix D).

Once the survey questions were tested and adjusted, the system-wide survey was administered. The researcher used Qualtrics to create and distribute an electronic survey through various listservs to each of the 113 California community college SLO coordinators or campus leaders. Qualtrics is an online electronic survey-design, data-collection, and analysis tool that allows researchers to send a link to the survey through an email message for participants to follow and complete online.

Once the survey data were collected, they were exported into Excel for data analysis. Descriptive statistics, such as frequencies, were analyzed and used to summarize the participants’ responses. Each individual who chose to participate in the online survey electronically consented to the survey and had access to the consent form online (see Appendix E).

A system of open coding was utilized to examine the qualitative data compiled through the open-ended questions on the survey. According to Strauss and Corbin (1990), in open coding one would “sweep through the data and mark (by circling or highlighting) sections of the text selected codes or labels” (p. 3). Next, axial coding was employed to group the codes into themes that are representative of the research questions for this study. To establish reliability and trustworthiness of the coding process, the following steps were taken: (1) the researcher identified an individual to assist with the data analysis who was knowledgeable about student learning outcomes but was not part of the study, (2) the researcher and her assistant coded sections of open-ended questions separately, (3) the researcher and her assistant then compared the codes and themes that each person identified and came to an agreement on the appropriate codes and themes, and (4) the researcher and her assistant shared their rationales for the codes and themes chosen. Finally, selective coding was used to combine all the themes developed during axial coding into more centralized categories (Strauss & Corbin, 1990).

In the final step, all of the data collected through the system-wide survey were reviewed in the context of the major themes of this study, as reflected in the research questions. Those themes are (1) barriers to SLO implementation, (2) strategies used in implementing SLO processes, (3) strategies used to address barriers, (4) institutional cultural characteristics affecting implementation, and (5) the effect of accreditation status.

Data Collection

A web link to the Qualtrics survey instrument was sent in October and November 2013 to each campus SLO coordinator and to SLO leaders through various listservs. An electronic consent form was required for participants to move into and take the online survey. The data-collection process was completed by February 2014, and integration of the data occurred in March 2014, once all of the open-ended questions from the survey were coded.

Data Analysis

Both quantitative and qualitative approaches were used. The focus of the analysis was on the qualitative data but was supported by the quantitative data. The quantitative data collected were used to describe the demographic variables and various aspects of the SLO coordinator position on campuses.

Each of the research questions was assessed through the integration of the survey data. Assessing the research questions required the analysis (frequencies) of the descriptive data gathered from the survey instrument as well as the qualitative data coded from the open-ended survey questions.

Conclusion

In this chapter, the steps of the study were described along with a description of the data collection and analysis. The participants and the measures taken to protect their anonymity were also discussed. Chapter IV will present a description of how the data for the study were collected and analyzed. The results from the quantitative and the qualitative portions of the study will also be presented and integrated. This integration will provide a solid foundation for the researcher to further discuss the results of the study in relation to the Concerns-Based Adoption Model and the accreditation standards.

CHAPTER IV

FINDINGS

The purpose of this study was to explore how California community colleges utilize the Student Learning Outcomes Coordinator positions on their campuses, what barriers campuses face when trying to implement SLOs, how campuses have effectively addressed these barriers, and what role accreditation status plays in implementing SLOs. The literature shows that little is known about the duties and functions of California community college (CCC) Student Learning Outcomes Coordinators. Few studies have been identified that explored the assessment practices that campuses are using to implement SLO processes and the strategies being used to assist in overcoming barriers to implementation. With this information, SLO Coordinators should be better able to assist their colleges in achieving student learning outcomes sustainability, thus decreasing their chances of being sanctioned by the Accrediting Commission for Community and Junior Colleges (ACCJC).

Chapter IV begins with a review of the research questions this study addressed. The research sequence will be explained, and the selection method used to choose the survey participants will also be described. The data-collection process will also be discussed. The summary of the data collection includes a description of the survey development process, the pilot testing of the survey, and the administration of the survey. Chapter IV also includes information about the demographics of the study participants and a presentation of the findings from the surveys. The findings are presented in six sections, beginning with the SLO Coordinator demographic information collected from the survey, followed by five categories that are tied to the research questions. The five categories are (1) SLO Coordinator characteristics, (2) strategies for implementing SLOs, (3) barriers and how to overcome them, (4) campus cultural characteristics, and (5) effect on accreditation status. The chapter concludes with a discussion of the measures that were taken to ensure the quality of the data.

The study was designed to address five research questions. These five research questions were as follows:

1. What strategies have institutions in the California Community College system utilized to implement student learning outcomes processes?

2. What barriers exist in the implementation of student learning outcomes processes?

3. What strategies have been successful in eliminating or minimizing the barriers to implementing student learning outcomes processes?

4. What institutional cultural characteristics affect the successful implementation of student learning outcomes processes?

5. In what ways has the institutions’ accreditation status impacted the implementation of student learning outcomes processes?


Research Sequence

The research sequence employed for this study followed the timeline that was described in Chapter III. The first step took place October through December of 2013 and included the development of survey questions based on the Concerns Based Adoption Model (CBAM) theoretical framework and the 2007 survey conducted by the Academic Senate of the California Community Colleges (ASCCC), the pilot testing of the potential survey questions, and the data collection. The pilot testing of the survey was conducted with three individuals who were actively involved in the assessment process at the colleges where they were employed. One of the members was a retired Vice President of Student Personnel at a CCC, who also served as the campus Accreditation Liaison Officer (ALO). The second member was serving as an Institutional Researcher for a CCC. The third member was an active full-time faculty member at a CCC who had served in the capacity of Instructional Program Review Coordinator. Changes that were suggested by the pilot-test participants were incorporated into the final draft of the survey. The survey was administered from October 31 through December 1, 2013, using Qualtrics. The survey was sent to multiple assessment leaders in the California Community College system through various listservs, including the RP group listserv, SLO Coordinators listserv, RP assessment listserv, Leading from the Middle listserv, and the Vice Presidents of Instruction listserv (see Appendix F for the general membership of each listserv). All individuals who were registered with each of the listservs mentioned received the invitation to participate in the online survey. In addition, the survey was sent directly to several CCC personnel (a President, a Vice Chancellor of Institutional Effectiveness, a Dean of Humanities, and full-time faculty members) whom the researcher knew personally, asking them to distribute it to their faculty who were involved in SLO processes on their campuses.

The survey contained 24 questions encompassing eight different components that were broken into 11 separate pages online. The eight components were participant demographics, positions held, SLO position demographics, assessment benchmarks, needs, strategies used to develop and implement SLOs, barriers to implementing SLOs, and the campus culture in relation to SLOs.

The participant demographics section contained six questions, asking for the respondents’ names, the CCC by which they were employed, their contact information, the campus SLO website, and their campus’s accreditation status as of July 2013. In the next question, respondents were asked to list all of the positions they held on their campus. There was a space for them to respond as “Other” and manually enter the position held, as well as space for certain positions, such as full-time faculty, to enter the discipline in which they taught.

The third component asked six additional questions about the SLO position on the campuses. The questions required respondents to provide information about how long they had been involved with SLOs on their campus, the number of years the SLO position lasted on their campus, how much reassigned time was allocated for their SLO Coordinator, how the position was selected, whether there was a formal job description, and the primary duties of their SLO Coordinator. To evaluate the assessment benchmark status at each of the campuses, respondents were asked to complete a table that used the ACCJC SLO rubric benchmarks and the different campus SLO responsibilities, including course SLOs, program SLOs, general education SLOs, and various others.

The next component, needs, asked whether there were areas in which the respondents thought they could use support and training to better assist them in serving the SLO needs of their campus. Another component had a single question asking for a description of the strategies that had been or were being used to develop and implement SLOs. The component regarding barriers included three questions: which barriers colleges had faced, what strategies they had used to reduce or eliminate them, and whether they believed that the campus’s accreditation status had affected their ability to implement SLOs or to reduce or eliminate barriers.

The final component asked four questions about the culture on the college campuses regarding SLOs. The questions asked respondents to describe their campus culture in relation to SLOs, to discuss whether the culture impacted their ability to eliminate barriers, to characterize the level of concern that faculty members had about SLOs, and to characterize the level of use with which faculty were engaged with SLOs.

The final step comprised the survey data analysis and the results summary. The survey data were retrieved from Qualtrics and uploaded into Excel for analysis in December 2013. In addition, several reports were run within Qualtrics to gather the data for analysis. The qualitative data obtained from the open-ended questions were gathered from the Qualtrics reports and used for thematic analysis as well.

Data Collection

This study had one data collection point, the survey stage. The target population for the survey was SLO coordinators and others who provided leadership for assessment on their campuses and worked in a California community college.

The researcher registered for and subsequently posted to four well-known listservs within the research and SLO assessment communities. In addition, the researcher’s dissertation chair sent an email through the Vice Presidents of Instruction listserv toward the end of the survey data-collection process. Last, the researcher emailed several campus leaders that she knew personally and asked that they distribute the survey to their campus leadership and faculty who were associated with SLO assessment. An invitation to participate in the survey for the study was sent to each of these listservs and individuals along with a link to the online survey.

Population and Sample

The target population for this study consisted of SLO Coordinators and others actively involved with the assessment of SLOs on CCC campuses. Since not all CCC campuses had SLO Coordinators, it was important to broaden the scope of the target population to include others actively involved in SLO assessment. In addition, the position of SLO Coordinator is typically held by only one or two individuals, and the researcher did not want to discount the input of other campus leaders and faculty who were involved with the assessment of SLOs on their campuses. The survey participants were chosen based on their ability to access the online survey and their willingness to participate in the survey.

Ultimately, 90 individuals entered the survey and provided responses. The survey participants included individuals from 61 different CCCs (53.98%); 14 campuses had multiple responses, totaling 24; and 5 individuals did not provide a useable answer to the question regarding which campus they were currently employed by. Since participants were not required to answer each question, the number of total responses per question varied.

Table 1 summarizes the demographic data concerning the positions respondents held on their campuses. For this question, survey participants were asked to indicate all positions they held on campus and therefore could have responded to more than one position being held. Hence, the percentages will not add up to 100%, and the total number of responses will be greater than 90. The data below are the result of 90 total respondents indicating all the positions they held on their campus.

The majority of the participants (67.78%) were full-time faculty members. The other positions, ranging from the highest to lowest number of participants, were SLO Coordinators (58.89%), Deans (13.33%), administrators (12.22%), Accreditation Liaison Officers (ALOs) (10.00%), and institutional researchers (8.89%).


Table 1

Participant Positions Held on Campus (Q7)

Position Count %

Administrator (President/Vice President) 11 12.22
Dean 12 13.33
Manager 4 4.44
Full-time faculty 61 67.78
Part-time faculty 5 5.56
SLO Coordinator 53 58.89
Accreditation Liaison Officer (ALO) 9 10.00
Institutional Researcher 8 8.89
Other 3 3.33

Note: N=90

Table 2 indicates the discipline taught by the full-time faculty survey participants (61). These data show that 26.23% of the faculty were from the English discipline; 13.11% were from the Social Sciences; 11.48% were from the Science, Engineering, and Mathematics area; 9.84% were Library faculty; and 6.56% each were from the Humanities and Fine Arts areas. All other disciplines represented did not have more than 2 respondents and were categorized as Other; 4 participants did not indicate the discipline in which they taught.


Table 2

Discipline Taught by Full-time Faculty Participants (Q7)

Discipline Count %

English 16 26.23
Social Sciences 8 13.11
Science, Engineering, Math (SME) 7 11.48
Library/Librarian 6 9.84
Humanities 4 6.56
Fine Arts 4 6.56
Other 12 19.67
No response 4 6.56

Note: N=61

The participants were asked to indicate what other roles they had on their campus besides those documented above, and 29 respondents provided another role (2 individuals checked that they had other roles but did not indicate what the roles were). The responses show that the largest percentage of other roles involved program review, SLOs (as a committee member or an SLO Technician, not in the role of Coordinator), or Institutional Effectiveness (27.59%). The next highest percentage was for those who participated in an accreditation committee (17.24%), and the next three roles were tied at 13.79% each: Academic Senate, Curriculum Committee, and Department Chair. One individual also served as the Campus Articulation Officer, and 1 was a leader in the faculty union. Table 3 provides a summary of these data.


Table 3

Other Positions Held on Campus (Q7)

Position Count %

Program review, SLOs, or Institutional Effectiveness 8 27.59
Accreditation Committee 5 17.24
Academic Senate 4 13.79
Curriculum Committee 4 13.79
Department Chair 4 13.79
Articulation Officer 1 3.45
Faculty union leadership 1 3.45
No response 2 6.90

Note: N=29

Most of the survey participants had been involved with SLOs on their campuses for many years (see Table 4). More than 48% indicated that they had been involved for 5 or more years, 32.56% stated that their involvement had been between 2 and 4 years, and 18.60% had been involved for less than 2 years.

Table 4

Length of Time Participants Have Been Involved with SLOs (Q8)

How Long Involved Count %

0–1 semester 8 9.30
1–3 semesters 8 9.30
2–4 years 28 32.56
5–6+ years 42 48.84

Note: N=86


SLO Coordinator Characteristics

Table 5 shows the data indicating the number of years for which the SLO Coordinator position was contracted on the individual’s campus. The response with the largest percentage was campuses at which the SLO position was indefinite or not determined (36.05%). Of those that indicated a specific length of time, 2 years was the most common response (26.74%), followed by 3 years (16.28%), 1 year (5.81%), and 4 or more years (2.33%). Six respondents (6.98%) stated that there was no SLO Coordinator position on their campus.

Table 5

Number of Years the SLO Coordinator Contract Lasts (Q9)

Years Contract Lasts Count %

1 year 5 5.81
2 years 23 26.74
3 years 14 16.28
4 or more years 2 2.33
Indefinite or not determined 31 36.05
Not an official position 6 6.98
Other 5 5.81

Note: N=86

Most of the colleges whose employees responded to the survey used full-time faculty on reassigned time to fulfill the role of Student Learning Outcomes Coordinator. Only 5 of the 78 respondents to this question (6.41%) indicated that they did not use reassigned time to fulfill the position; however, this is because they each had allocated a full-time position to SLO coordination. Nine respondents stated that their coordinators were 100% or more reassigned, totaling 14 full-time coordinators. The reassigned-time allocations were broken into six main categories. The largest group of colleges allocated 40–60% reassigned time for their SLO Coordinators (42.31%), while 21.79% allocated only 20–30% time. However, almost 18% of the respondents reported having a full-time or 100%-reassigned coordinator. The other three groupings each accounted for 2.56% of the total respondents: 70–90% reassigned time, the use of stipends, and reassignment by hours at 4 hours per week. It is important to note that 13 of the 78 respondents (16.67%) indicated that their campus had multiple SLO coordinators who shared role responsibilities (see Table 6).

Table 6

Amount of Reassigned Time the SLO Position Provides (Q10)

Amount of Reassigned Time Count %

20–30% 17 21.79
40–60% 33 42.31
70–90% 2 2.56
100% or more 14 17.95
Use of stipends 2* 2.56
By hours – 4 hours 2 2.56
Unusable response 9 11.54
Total 78

Note: N=78; *One of the respondents reported receiving a stipend and reassigned time.


In Table 7, data are shown for participants who responded to the question about how SLO Coordinators were selected on their campuses. Seven different selection methods were noted; 1 respondent indicated that there was no coordinator on their campus, and 12 provided answers that were not clear enough to categorize. Selection by the Academic Senate (21.52%) and selection by the Administration (20.25%) were the most frequently utilized methods, each accounting for more than 20% of the responses. The next largest percentages were the regular hiring process (11.39%), a combination of the Academic Senate and the Administration (10.13%), volunteering (8.86%), selection by a committee (7.59%), and, finally, the position morphing from another role (3.80%).

Table 7

Method of Selection for the SLO Coordinator (Q11)

Method of Selection Count %

Academic Senate 17 21.52
Administration selection 16 20.25
Selection by committee 6 7.59
Position morphed from another role 3 3.80
No coordinator 1 1.27
Regular hiring process 9 11.39
Academic Senate and Administration 8 10.13
Volunteer 7 8.86
Answer was unclear 12 15.19
Total 79

Note: N=79


In an effort to explore how formalized the SLO Coordinator positions are on CCC campuses, participants were asked if there was an official job description for the position. As shown in Table 8, nearly 70% of respondents said that there was a formal job description on their campus (69.88%), while only 30.12% stated that there was not one.

Table 8

Indication of a Formal Job Description for the SLO Coordinator Position (Q12)

Formal Job Description Count %

Yes 58 69.88
No 25 30.12
Total 83

Note: N=83

Strategies for Implementing SLOs

Three survey questions addressed strategies for implementing SLOs on CCC campuses. As shown in Table 9, almost half of the survey participants responding to the question about what kind of support would help their campus’s SLO efforts indicated that each of the listed types of support would be beneficial. The support strategy with the highest response rate was regional meetings for Coordinators (67.06%), followed by planned training institutes for Coordinators (62.35%), access to online resources/web pages (58.82%), a statewide listserv for SLO Coordinators (49.41%), and access to experts to facilitate workshops on campuses (47.06%). Other responses noted were the need for funding (4.71%), for the ACCJC to provide clear and increased communication to colleges (3.53%), and for faculty to use professional responsibility time instead of pay (1.18%).

Table 9

Support Activities that Would Help Campus SLO Efforts (Q15)

Support Activity Count %

Regional meetings for Coordinators 57 67.06
Planned training institutes for Coordinators 53 62.35
Access to online resources/web pages 50 58.82
A statewide listserv for SLO Coordinators 42 49.41
Access to experts to facilitate workshops on your campus 40 47.06
Other: Funding (Coordinators, faculty help, travel) 4 4.71
ACCJC clarity or increased communication 3 3.53
Professional responsibilities instead of pay 1 1.18

Note: N=85

When survey respondents were asked which training opportunities would help their campus with their SLO efforts, 85 individuals responded, and 67.06% of them stated that training about using assessment data to improve student learning was needed, while 58.82% needed training about how to engage faculty and staff. More than 40% of the respondents indicated that they needed assistance with developing good-quality dialog about SLOs (49.41%), general education/institutional outcomes (44.71%), and closing the assessment loop (41.18%). The final four training opportunities were identified as helpful less often and included documenting evidence (35.29%), assessment basics (28.24%), course/program outcomes (25.88%), and writing SLOs – the basics (15.29%) (see Table 10).

Table 10

Training Opportunities that Would Help Campus SLO Efforts (Q16)

Training Opportunity Count %

Using assessment data to improve student learning 57 67.06
Engaging faculty/staff 50 58.82
Developing quality dialog 42 49.41
General education/institutional outcomes 38 44.71
Closing the assessment loop 35 41.18
Documenting evidence 30 35.29
Assessment basics 24 28.24
Course/program outcomes 22 25.88
Writing SLOs – the basics 13 15.29
Other:
Creating student awareness of how SLOs connect to class 1 1.18
The role of curriculum in SLOs 1 1.18
Templates for systematically capturing data 1 1.18

Note: N=85

The survey participants were asked to describe the strategies that had been used to develop and implement SLOs on their campuses, and the responses revealed four different strategies. The strategies included professional development, developing personal connections, integrating the SLO processes with other campus processes, and the use of electronic media. Multiple professional-development strategies were cited, including workshops/trainings, FLEX or other full-day campus trainings, providing money for involvement, and the use of peer training/assistance.

An example of a professional-development strategy is reflected in this response:

A useful strategy for “closing the loop” has been to provide faculty with a list of possible actions involved with closing the loop, so they are jump started into thinking of creative ways to deal with what their assessment evidence yielded about student success.

The “developing personal connections” strategy was represented by statements concerning the importance of the SLO Coordinator having one-on-one discussions with faculty, providing support to departments, and having support from the campus leadership. One of the survey participants offered the following strategy for developing personal connections:

We have done many all faculty training sessions but the real work and improvement seems to happen in the smaller more personal settings. Our strategy is to get those willing to be involved up to the point of experts so they can then be the voice to the rest of the institution. We give them as much attention and guidance as they need to be successful. We are available to both full and part time faculty alike.

The strategy of integrating SLO processes into other campus processes was evidenced by mentions of integrating them with the program-review processes, involving the Academic Senate and Curriculum Committee, enforcing consequences for those who do not comply, and pressure from ACCJC sanctions. Two examples of this strategy are reflected in the following excerpts:

A supportive Vice President of Academic Affairs who put “teeth” into assessments linking participation to classes offered (adjunct) and full-time overloads.

The Most effective one was being put on Show Cause! That finally got the ball rolling and helped faculty to understand the serious consequences of not being engaged in this work.


The final strategy was to use electronic media to either track or assess data from SLO assessments. Two main approaches were used to do this: electronic databases and websites. Although the importance of these strategies was mentioned several times, concerns about the effectiveness of such systems were also raised, as seen in the excerpt below:

This is a work in progress for us. All of our courses have SLOs and most are being assessed regularly, but our assessment system is clunky and confusing for faculty.

The open-ended survey question regarding strategies for developing and implementing SLOs indicated that campuses employ multiple strategies in an effort to gain support for SLOs. For example, 1 participant stated, “We are building a culture shift that is about meaningful, student centered, authentic assessment.” As indicated by the following statement, positive results can occur with the implementation of strategies over time:

We have had an ongoing process that has been improved over the last 5 years. The initial resentment from faculty has dissipated and the process is ongoing and there is an awareness of how the process should be used for improvement.

Barriers to Implementing SLOs and How to Overcome Them

In an effort to explore what barriers colleges either were facing at the time or had faced while implementing SLOs on their campuses, the survey participants were asked to identify the barriers they faced and the strategies they used to reduce or eliminate them. Three categories of barriers were identified: faculty issues, perception/knowledge, and a lack of support. The main faculty issues identified were resistance to the process, difficulty motivating faculty to get involved with the process, faculty feeling as though they were overworked and had no time to be involved with SLO assessment, the involvement of adjunct faculty, a lack of support from the faculty leadership, and fear. Faculty issues about resisting the process, not having time for it, and being overworked are reflected in the following responses:

We are getting resistance from a small group of faculty members who are saying that requiring this is an infringement on their academic freedom.

Too many tasks, and this one just seems like one more thing on a too-crowded agenda.

Faculty are beginning to feel overwhelmed with more and more obligations given to them outside of teaching responsibilities.

The ideas that faculty members do not value the SLO process or do not have the knowledge base to support the assessment and use of SLO data were also recurring subthemes. One of the survey participants made the following statement regarding the value that faculty members place on SLO assessment: “There has been a lack of vision and context regarding the assessment process; many people think it’s a fad that will soon go away, and many approach the process as yet another series of meaningless hoops to jump through.”

A lack of support, whether from the campus administrative leadership or through a failure to allocate adequate resources (technological, human, and fiscal) to conduct the required work, also emerged as a theme. An example of this barrier is reflected in the following excerpt:

In addition, the lack of a stable CIO (4 in the last 5 years – two interim) has resulted in a lack of leadership in these areas. ACCJC told us we would be assessing SLOs. However, colleges had to find a means to support these efforts. Had our college had a better understanding of this process back in 2002 and made a more effective effort in both supporting and implementing these processes then, AND had our union had the foresight to understand the workload this would entail, they might have refused the additional workload until the college had addressed this through collective bargaining. Ultimately, we would be much better off.

A second open-ended survey question regarding strategies for reducing or eliminating barriers on campuses indicated that campuses employ three primary strategies: professional development, support, and the enforcement of compliance.

Professional-development activities used by various colleges to overcome barriers included FLEX days, trainings through workshops, creating time for faculty to work with each other and making this work a campus priority, and sharing the success stories from involved departments. For example, with regard to professional development, one college stated,

Creating time to do the work. We now devote our convocation/FLEX days to outcomes work. Also have added two “Dialogue Days” to our official calendar—the campus shuts down in favor of division/department meetings to discuss outcomes/assessment.

The second theme, support, was mentioned across many subthemes, including using technology to track and maintain the process, communication across campus, working one-to-one with faculty, working to improve the SLO processes, providing encouragement, having an effective SLO Coordinator, and engaging faculty leadership. The three excerpts below signify the importance of support in reducing barriers:

We have been patient with resisters, working with them one on one as much as possible to try to overcome resistance.


Building trust, open dialogue, respecting the ideas and concerns of others; involve faculty in the decision making process with respect to SLO policy and procedures.

The gung-ho faculty work at encouraging the resistant faculty.

The third theme that resulted from the exploration of strategies that were used to reduce or eliminate barriers was enforcing compliance. There were three subthemes within compliance: ACCJC requirements, restricting access to funding or course offerings, and the use of the faculty contract. The following statement illustrates the use of compliance to overcome faculty resistance:

The “strategy” to eliminate the barrier is just that everyone knows if you don’t implement SLOs, you won’t get accredited. That may not be a positive way of looking at them, but I think that is how most faculty look at them, no matter how positive you try to present them as “assessment for improvement.”

Campus Cultural Characteristics

To explore the status of the campuses’ SLO assessment benchmarks, the culture on campus regarding SLOs, and how that culture may impact the campuses’ ability to eliminate barriers, three separate questions were asked on the survey. An additional two questions were asked to determine faculty members’ level of concern about, and level of use of, different SLO components. As shown in Table 11, the survey respondents were asked to evaluate their campus’s engagement in the SLO process and related benchmarks that are linked to the ACCJC SLO rubric. These data are meant to represent the involvement in and dedication to SLOs that exist on campuses. In addition, they show where campuses have focused their energies to reach the top benchmark: campuses value ensuring that certain SLO components are developed and integrated campus wide more than others. Table 11 indicates that the SLO component with the highest mean, indicating the overall highest level of development, is course-level SLOs (53.85% at the top benchmark), followed by Library and Learning Support Services (48.72%), student support services (46.15%), institutional SLOs (47.44%), program student learning outcomes (44.87%), general-education SLOs (33.33%), and administrative services SLOs (26.92%).

Table 11

Evaluation of SLO Components Based on ACCJC Benchmarks (Q14)

SLO Component: Not Yet Begun | Beginning to Develop | Developed on Most of the Campus | Developed Campus Wide | Developed Campus Wide and Integrated | Mean

Course Level 2.56 2.56 8.97 32.05 53.85 4.32
Program 1.28 8.97 20.51 24.36 44.87 4.03
General Ed 6.41 12.82 5.13 42.31 33.33 3.83
Institutional 2.56 10.26 6.41 33.33 47.44 4.13
Student Sup. Serv. 1.28 6.41 7.69 38.46 46.15 4.22
Library/Learning Support Services 1.28 6.41 7.69 35.90 48.72 4.24
Admin. Services 10.26 14.10 17.95 30.77 26.92 3.50

When asked how they would characterize the overall level of concern their faculty had about sustaining SLO processes on their campuses, 73 survey respondents selected the statement that best described their faculty on a scale of statements reflecting varying levels of concern (see Table 12). The results show that respondents overwhelmingly evaluated the overall level of concern of their faculty as being about how much time it takes (50.68%). The other half of the respondents fell into the following levels of concern: faculty are moving forward with SLOs and collaborating with others about them (21.92%), faculty are getting involved and putting forth new ideas about how to improve the process (12.33%), faculty are concerned about how SLOs will affect them (8.22%), faculty are not concerned or involved at all (4.11%), and faculty are concerned about the impact on their students (2.74%).

Table 12

Overall Level of Concern by Faculty about Sustaining SLO Processes (Q23)

Level of Concern Count %

1 – Not concerned or involved at all 3 4.11
2 – They are concerned and want to know more 0 0.00
3 – They are concerned about how it will affect them 6 8.22
4 – They are concerned about how much time it takes 37 50.68
5 – They are worried about how the use of data will affect students 2 2.74
6 – They are moving forward with it and collaborating with others 16 21.92
7 – They are getting involved and putting forth ideas to improve it 9 12.33

Note: N=73

When asked how they would characterize the overall level of use their faculty members employ regarding assessment data to improve student learning on their campuses, 73 survey respondents provided feedback on a scale ranging from “no use” to “seeking more effective uses for SLO assessment data” (see Table 13). The results show that respondents most often evaluated the overall level of use by their faculty as beginning to make changes in order to use the assessment data (26.03%). Only one other level of use exceeded 20%: faculty are making changes to increase or alter their SLOs (24.66%). The remaining respondents fell into the following levels of use: faculty are seeking more effective assessment alternatives to improve the process (19.18%), faculty are making deliberate efforts to coordinate with others about the use of assessment data (13.70%), faculty are taking the initiative to learn more but are not using data (6.85%), faculty have established a process for using assessment data (4.11%), faculty are taking no interest nor are they looking to use assessment data (4.11%), and faculty have definite plans to use the assessment data (1.37%).

Table 13

Overall Level of Use by Faculty of Assessment Data to Improve Student Learning

(Q24)

Level of Use Count %

1 – Faculty have no interest and are not looking to use them 3 4.11
2 – Faculty are taking initiative to learn more, but not using them 5 6.85
3 – Faculty have definite plans to use the assessment data 1 1.37
4 – Faculty are beginning to make changes in order to use data 19 26.03
5 – Faculty have a process for using assessment data established 3 4.11
6 – Faculty are making changes to increase or alter their SLOs 18 24.66
7 – Faculty are making deliberate efforts to coordinate with others 10 13.70
8 – Faculty are seeking out more effective alternatives to improve 14 19.18

Note: N=73


As indicated in Table 14, the vast majority (84.75%) of survey respondents stated that they believed that the culture on their campus impacted their ability to eliminate barriers, while only 15.25% believed that culture did not have an impact.

Table 14

Campus Cultural Impact on the Ability to Reduce or Eliminate Barriers (Q22)

Indication of Impact Count %

Yes 50 84.75
No 9 15.25

Note: N=59

The final question on the survey to address campus culture asked respondents to describe the culture of their campus in relation to SLOs. From the 65 responses to this question, three themes emerged: believers, compliers, and resisters. These three cultures not only exist as subcultures within the colleges but can also be used to describe populations of faculty and staff at the colleges.

We have three groups of people in this culture: believers, compliers, and resisters. The believers are a small group; they bought into the process early and have gotten benefits from it. The compliers are a larger portion of the campus, but since they take short-cuts in the process, they create self-fulfilling prophecies: they don’t give it effort so when it doesn’t pay out, they complain that it’s worthless. Finally, there are the resisters. While this is a small group, they are a loud and powerful group made up of a good portion of campus leaders. The believers will not challenge this group.


Although each of these groups is distinct, many respondents reported having more than one type of culture on their campus at a time. The believers can be characterized by the following excerpt:

Initially, there was widespread resistance and hostility (fear driven) from faculty. Then we had grudging acceptance. Now, we have acceptance and embrace by many departments who have used SLO data to improve their classes and train faculty on best practices. Many now see SLO assessment as a positive opportunity to discuss and improve as needed. There will always be those who resent/fear/and resist the SLO process, but these faculty are now in the minority. The support of the President, Academic Senate, and the new Dean of College Planning have really helped to make faculty and staff aware of the importance and benefit of SLO assessment.

The compliance culture is primarily characterized by grudging acceptance, being in compliance with campus policies or ACCJC requirements, beginning to see the value but not believing in it yet, and having many needs. As detailed in the passage below, compliers are caught between wanting to believe and feeling frustration:

“SLOs” have become an ugly word on our campus. We (SLO coordinators and committee members) are trying to reframe and use the terms “outcomes” and “assessment” and “learning” to try to dissociate with negative connotations of SLO. It isn’t that we OPPOSE learning outcomes or assessment of outcomes. We all desire to improve student learning and success. However, faculty don’t appreciate the “top-down” approach and the SIGNIFICANT work this requires to PROVE we are assessing outcomes and using data in decision-making. Faculty are discipline experts; they are not trained in evaluation and assessment—NOT in the manner that ACCJC seems to expect. Like many campuses, I assume that some are on board and actively participating. Others are completely opposed to this new practice. Most are just trying to survive … and hoping there is some value in this very time consuming activity. ALL this being said … our faculty is working hard to meet its obligations.


Unlike the compliance culture, in which faculty and staff feel torn, a culture of resistance is signified by blatant resistance, refusal, hostility, and lack of use. The following excerpt is an example of resistance: “Our campus is trying to avoid SLOs.… I do not see any campus wide dialogue occurring and only a few disciplines/departments have good dialogue regarding SLOs and student learning.”

Effect on Accreditation Status

As Table 15 indicates, of the 78 respondents, 78.21% stated that their college was not on sanction by the ACCJC as of July 2013, 19.23% were on some form of sanction, and 2.56% (two respondents) were from a college that had received its initial candidacy for accreditation.

Table 15

Accreditation Status as of July 2013 (Q6)

Accreditation Status Count %

Not on sanction 61 78.21
On sanction 15 19.23
Candidate for accreditation 2 2.56

Note: N=78

Survey respondents were asked to discuss whether their campus’s accreditation status had affected the implementation of SLOs or the reduction or elimination of barriers in implementing SLOs. Of the 68 responses to this question, 80.88% indicated that their accreditation status had affected the implementation of SLOs, while 19.12% stated that it had not (see Table 16).

Table 16

Accreditation Status Impacted SLO Implementation (Q20)

Impacted SLO Implementation Count %

Yes 55 80.88
No 13 19.12

Note: N=68

As a follow-up question, respondents were asked to describe how their accreditation status had or had not affected the implementation of SLOs on their campus. Two main themes were identified in the responses of those who indicated that their accreditation status had impacted the implementation of SLOs; these themes were a culture of compliance and a culture of continuous improvement. The culture of compliance was noted by respondents who stated that they were motivated to comply with accreditation regulations in order to avoid sanctions or to be removed from sanctions. To illustrate the effects of a culture of compliance, the following passage is provided:

Sadly, accreditation sanction has added fear-mongering to arsenal … leadership still lacks adequate understanding of assessment as quality improvement activities that are part-and-parcel of the teaching profession. Fear brings some faculty participation, regardless of union suggestions; but fear also affects administrators who cave into bad ideas/empty promises by faculty members for “ensuring” faculty engagement in assessment practice.


The culture of continuous improvement was mentioned less often but is one in which campuses are trying to shift away from the culture of compliance toward being more focused on improving student learning for the sake of helping students, not on meeting accreditation requirements. This is evidenced in the excerpt below:

The desire to continue to be accredited has helped facilitate the process. Faculty and staff are more inclined to be timely and consistent when they know the accreditation teams will be looking hard at the SLO data. We are striving, however, to not use ACCJC as a whip to drive the SLO process as the collection of data and assessment should really be about continuous improvement.

Two recurring themes were identified in the responses of the participants who stated that accreditation status had not impacted their campuses’ implementation of SLOs. Those themes were that SLOs had already been implemented and that the threat from the ACCJC had not been taken seriously. Of the campuses that had already implemented SLOs, the impact of accreditation status was seen as irrelevant since they were already engaged in the activities required. However, one campus respondent indicated that the threat of a poor accreditation status was not perceived by their faculty as being serious:

Our accreditation status has not affected the implementation of SLOs. There is a relatively strong belief on this campus that the ACCJC will not really punish anyone for lack of engagement in the SLO process. As a result, the threat of losing accreditation or being placed on warning has not served as an effective stick in getting people engaged.


Summary of Findings

The evidence of this study indicates that full-time faculty and SLO Coordinators were the main respondents to this survey. However, several administrators, deans, and Accreditation Liaison Officers (ALOs) also responded. The full-time faculty members who responded were professors in three core disciplines: English; the Social Sciences; and Science, Engineering, and Math (SME). Almost half of the respondents held multiple positions on their campus, including other SLO-related roles besides Coordinator (committee work or serving as a technician), program review, or institutional effectiveness; serving on leadership committees (accreditation, academic senate, curriculum); and serving as department chairs. The majority of the respondents had been involved with SLOs on their campus for at least 2 years, with almost half engaged in the process for 5 or more years.

California community colleges are diverse in the manner in which they handle SLO Coordinator positions: more than a third of the colleges represented in the survey had SLO positions that lasted indefinitely or for an undetermined term, while a little more than a quarter of the colleges had SLO positions that lasted for 2 years. While 14 of the colleges had a full-time position, another 17 allocated only 20–30% of a faculty member’s time to serve as the SLO Coordinator. Another example of the relative instability of SLO Coordinator positions was the examination of how many colleges had a formal job description for their SLO Coordinator. While nearly 70% did have one, 30% of the Coordinators had no guidance from a formal job description regarding their role in managing such a huge task on campus.

When the respondents were asked about strategies for implementing SLOs on their campuses, many indicated a need for more support through SLO Coordinator trainings and online resources. More specifically, the respondents echoed the need for training about topics that are relevant for colleges that are attempting to obtain sustainable continuous quality improvement with their campus SLOs: using data to improve student learning, engaging faculty and staff, and developing quality dialog. And while requests for basic training opportunities were in the minority, topics such as writing SLOs, assessment basics, and course or program outcomes still accounted for many concerns by college employees. In the open-ended questions of the survey regarding the implementation of SLOs, four main strategies emerged: professional development, developing personal connections, integration of the SLO processes, and use of electronic means to track data.

Data were also reviewed regarding what barriers exist on CCC campuses and how colleges work to overcome them. The three main barriers identified by respondents were faculty issues, perception and knowledge issues, and a lack of support. In efforts to reduce or eliminate the barriers on their campuses, the respondents used professional development, adequate support, and the enforcement of compliance. Faculty have shown their struggle to get involved in the SLO processes through resistance, by not making or having time, and by lacking motivation. In addition, faculty are fearful of the process, not having been formally trained to assess learning through SLOs. Finally, SLO processes on campuses remain undersupported, lacking the human, financial, and technical support they need to be successful.

When the respondents spoke of their campus cultural characteristics, most indicated that their campus had moved past the beginning stages of SLO development and implementation to having already developed SLOs campus wide and possibly having integrated SLOs into college planning. On average, campuses were furthest along with the campus-wide development and integration into planning of course-level SLOs and furthest behind on administrative-services SLOs and general-education SLOs. Although course SLOs were perceived to be furthest along for integration into college planning, faculty members were most concerned about how much time would be required to sustain the SLO processes and were less focused on getting involved and putting forth ideas to improve the processes. In addition, faculty members were not perceived by the survey respondents as using assessment data to improve student learning as often as they were making changes in order to start using the data. This indicates that a culture of compliance predominates on many campuses. Most respondents indicated that their campus culture impacted their ability to eliminate barriers. The open-ended question that asked respondents to describe their campus culture produced three types of campus cultures: believers, compliers, and resisters. Oftentimes, these cultures were intertwined on campuses, with one predominant group.


Finally, several questions were asked regarding campus accreditation and its effect on SLO efforts. The respondents were diverse in their accreditation status as of July 2013, with more than 75% not on sanction at the time of the survey, almost 20% on sanction, and 2 respondents from one institution that had received its initial candidacy for accreditation. Regardless of their campus accreditation status, more than 80% of the respondents stated that accreditation had impacted their SLO implementation efforts. In an open-ended question asking how it had affected efforts, respondents stated that two cultures were apparent: a culture of compliance and a culture of continuous improvement. Of those who stated that accreditation had not impacted their efforts, two themes emerged: they had already implemented SLOs ahead of the accreditation requirements, and they did not perceive the threat of accreditation as serious enough to affect their efforts.

Overall, the position of SLO Coordinator and the efforts to implement and sustain SLO processes on campuses throughout the CCC system are complex and challenging but play a vital role in the accreditation process.

Quality of the Research

The researcher took measures to ensure that the data were an accurate reflection of the participants’ point of view. There was one data collection point in the study: the system-wide survey. This survey was pilot tested with three individuals prior to its dissemination. The survey included both multiple-choice and open-ended questions. To ensure the quality of the open-ended survey data, the researcher downloaded each response into Excel, which was stored on the researcher’s password-protected computer. To protect the anonymity of the participants, any reference to individuals was omitted, and each respondent was assigned a code number known to the researcher only.

The survey was administered through the Qualtrics website, which is password protected and was used to collect the survey data. The responses from participants who did not complete the demographic portion of the survey were removed from the data set to ensure that participants identified themselves as part of the target population. Individual responses were kept in the Qualtrics site, and summary data and a spreadsheet of responses to open-ended survey questions were stored on the researcher’s password-protected computer to preserve the integrity of the data.

To establish reliability and trustworthiness of the coding process, the following steps were taken: (1) the researcher identified an individual who is knowledgeable about student learning outcomes but was not part of the study; (2) the researcher and the individual coded sections of open-ended questions separately; (3) the researcher and the individual then compared the codes and themes that each person identified and came to an agreement on the appropriate codes and themes; and (4) each person shared his or her rationale for the codes and themes they chose.

Summary

In this chapter, the results of the survey responses in five areas of SLO practices were presented. The five areas include SLO Coordinator characteristics, strategies for implementing SLOs, barriers and how to overcome them, campus cultural characteristics, and the effect on accreditation status. The open-ended survey data were interwoven into the description of survey results in order to provide insight into the multiple realities represented by the participants’ data. A summary of the results was given that reflects the overall study findings. Finally, evidence of the quality of the data was provided. In the next chapter, an interpretation of the findings is given and recommendations for actions based on the findings are presented.

Chapter V concludes with recommendations for further research and the researcher’s reflections on the research process.

CHAPTER V

SUMMARY, IMPLICATIONS, CONCLUSIONS, AND

RECOMMENDATIONS

This study was designed to explore five main research questions used to investigate the strategies utilized to implement student learning outcomes (SLOs), identified barriers and the strategies to reduce them, cultural characteristics on campuses in relation to SLOs, and the effect of accreditation on the implementation of SLOs. The theoretical framework used for the study was the Concerns Based Adoption Model (CBAM), which is characterized by the exploration of the level of concern and the level of use that individuals have for an innovation. The CBAM was brought in through two specific survey questions asking respondents to evaluate the level of concern and the level of use for SLO components that faculty have on their campuses. Twenty-four survey questions were designed to explore the research questions (see a sample of the survey questions in Appendix B). Six of the survey questions were used to obtain participant demographic information, one question asked the respondents to indicate the positions they held on their campuses, and six questions addressed the nature of the SLO Coordinator position on their campuses.

This was followed by one question, in the form of a table, asking for an evaluation of where the respondents believed their college stood on achieving SLO benchmarks in relation to various SLO responsibilities. Two questions asked what needs would provide benefits to their SLO efforts. The final questions addressed which strategies had been employed in implementing SLOs; three of these questions discussed barriers involving SLOs, and four questions were used to obtain information about the campus culture regarding SLOs. The survey questions consisted of a matrix question, multiple-choice questions, and open-ended response questions.

Summary of Previous Chapters

Chapter I provided an overview of topics that were important to this study, which included an introduction to student learning outcomes in the California Community College system and the need for more information about implementing student learning outcomes on campuses to avoid sanctions from the accrediting body. The chapter also introduced the focus of this study, with a discussion about the coordination and leadership needed to implement student learning outcomes. The research questions that guided this study were introduced, along with the assumptions, limitations, and theoretical framework that helped define and focus the study.

Chapter II began with a brief introduction and historical background about the Accrediting Commission for Community and Junior Colleges (ACCJC). To provide a context for this study, the researcher included extensive information about the ACCJC and student learning outcomes. The topics in Chapter II included an overview of the ACCJC standards, the ACCJC rubric for student learning outcomes (SLOs), a section about the purpose of accreditation in relation to SLOs, the assessment of SLOs, SLO leadership, and an overview of the Concerns Based Adoption Model (CBAM).


Chapter III provided a detailed description of the research design and the procedures used in this study and the setting where the study took place. Within this chapter, the researcher included the five primary research questions for the study and discussed the steps that were taken to ensure that the study would be conducted in a scholarly and ethical manner.

Chapter IV presented the findings of the study. This included the findings collected through open-ended and general survey questions. The findings from this study were organized and presented in five broad categories that related to the study’s core research questions.

The following sections provide the conclusions that were drawn from each of the research questions in this study. A comparison between the ASCCC study conducted in 2007 and the current study is presented. In addition, a discussion follows, along with the limitations of the study and the researcher’s recommendations.

Summary of Research Findings

An online survey was used in this study to collect the data. The survey was constructed to gather responses from California community college Student Learning Outcomes Coordinators and other leaders involved with SLOs on their campuses. The survey was electronically disseminated Statewide to California community college Student Learning Outcomes Coordinators and other SLO leaders as well as vice presidents of instruction. Full results of the surveys and tables summarizing the results are located in Chapter IV.


Overall, the survey gathered data regarding the general demographics of the participants and insights into the general structure of the Student Learning Outcomes Coordinator position. In addition, the survey gathered the perceptions of strategies and barriers for implementing SLOs and campus culture in relation to SLOs. Further, the survey provided insights into the Student Learning Outcomes Coordinator positions on California community college (CCC) campuses, training needs, and the processes used to implement SLOs, including a discussion about culture and the effect of SLO implementation on accreditation.

A total of 90 surveys were collected from respondents who were employed at 61 different California community colleges, which is more than half of the total 113 colleges within the system. The survey respondents were primarily full-time faculty (67.78%) and SLO Coordinators (58.89%). These two categories were not mutually exclusive; therefore, many of the full-time faculty members could also have held the position of SLO Coordinator. The average respondent in this study had been involved with SLOs on their campus for at least 2 years (81.40%), with 48.84% of those individuals exceeding 5 years of experience with SLOs. See Table 4 in Chapter IV for a complete summary of this information.

When survey respondents were asked what kinds of support they thought would prove beneficial to their SLO efforts, there tended to be general agreement that regional trainings, planned institutes, and online resources were needed. More than half of the survey respondents reported the need for trainings, institutes, and online resources, while just less than half wanted a Statewide listserv for SLO Coordinators.


Although there tended to be overall agreement between the respondents in terms of which support needs were important to campus SLO efforts, the same trend did not continue when they were asked to indicate which training opportunities were needed. More than half of the respondents indicated that two training opportunities were believed to be needed to help with SLO efforts: using assessment data to improve student learning and engaging faculty and staff. Other training opportunities that respondents recognized as important to help SLO efforts were developing quality dialog (enabling campus-wide dialog about how to use the results of SLO data), general education and institutional outcomes, closing the assessment loop (going through the entire SLO assessment cycle, including the reassessment of changes made based on SLO data), and documenting evidence (how a campus proves that it has done what it says it has concerning SLOs).

Further, when participants were asked to detail what strategies were used to implement SLOs, four core themes emerged: professional development, development of personal connections, integration of SLO processes, and use of electronic mediums. Participants also cited barriers to implementing SLOs as faculty issues

(including resistance or hostility toward the process), perception and knowledge issues (where there is a lack of knowledge about SLO assessment or where the faculty member perceives that they do not have the knowledge to do the process correctly), and a lack of support.

The impact of a campus culture on implementing SLOs was also explored, and the findings suggest that the culture does impact a campus’s ability to implement SLOs. Campus benchmarks from the ACCJC are not being met equally across all SLO responsibilities, with course-level SLOs being implemented campus wide and integrated into college planning at a higher rate than any other SLO responsibility. The integration into college planning of general education and administrative services SLOs was the furthest behind, with a third or less meeting this benchmark.

According to the survey respondents, campus faculty are still overly concerned about how much time the SLO process takes, and only 12.33% of the respondents stated that their college faculty had reached the highest level of concern based on the Concerns Based Adoption Model (getting involved and putting forth new ideas to improve the process). Similar results were found when using the CBAM to gauge the overall level of use by faculty of assessment data to improve student learning, with 19.18% of respondents stating that their campus faculty had reached the highest level of use (seeking more effective ways to conduct the work).

Finally, respondents were asked to address questions regarding accreditation and its effect on the implementation of SLOs and the reduction of barriers. It was generally agreed among the respondents that accreditation status does impact SLO implementation. Three main campus cultures were identified and discussed in Chapter IV: a culture of believers, a culture of compliance, and a culture of resistance. Oftentimes, these cultures coexist on campuses, with one being the predominant culture at the institution.

The foregoing summary provides a general overview of the findings of this research. This discussion is organized by first presenting a comparison of the parallel questions between this study and the Academic Senate for the California Community Colleges (ASCCC) study conducted in 2007 and then by each of the research questions in this study.

Comparison Between the Current Study and the ASCCC 2007 Study

Although the structure of the SLO Coordinator position was not a research question for the current study, it undoubtedly has implications for the work that campuses are doing in relation to implementing and sustaining SLO processes. In a review of the literature, only one study was found to address the structure of the SLO Coordinator position (ASCCC, 2007). In 2007, when the study was conducted, the SLO Coordinator role was considered a relatively new faculty-based position. In large part, the position was adopted by campuses in response to the 2002 accreditation standards regarding student learning outcomes. In the following excerpt, the researchers discuss the challenge of meeting accreditation standards and lay the groundwork for why the SLO Coordinator positions are needed:

Focusing on student learning outcomes and assessment involves more explicit and purposeful activities with respect to work faculty have always done. The difference in meeting the assessment expectations delineated in the new accreditation standards requires conventions beyond typical grading and beyond faculty focusing on individual classrooms. It requires that faculty become both discipline experts and skilled assessment practitioners. This demands leadership and clearly defined tasks, plus well organized training to make the process beneficial. In an extensive literature review by the Ad Hoc Committee there was no evidence that any system of higher education has addressed an organized training plan for Student Learning Outcomes Coordinators. (ASCCC, 2007, p. 9)

In California, some of these training needs have been addressed through the Research and Planning Group (RP Group). This Statewide group has organized regional SLO Coordinator trainings and workshops through its Strengthening Student Success conferences, as well as a Statewide listserv to enhance communication among the coordinators.

Several of the survey questions employed by the current study were designed to reassess the status of CCC SLO Coordinators in 2013–2014, 7 years after the ASCCC study was conducted. The components that were reassessed were how many years the contract for an SLO Coordinator lasts on campuses, how the position is selected, how much reassigned time the position provides annually, and which training opportunities are needed. In the 2007 ASCCC study, most respondents indicated that the SLO Coordinator contract does not have a determined length of time (48.75%), followed by 16.25% stating that the positions lasted 2 years. An additional 6.25% of the positions lasted 1 year, while 5.00% lasted 3 years and 3.75% lasted 5 or more years. In comparison, this study found that 36.05% of the positions do not have a defined term length, followed by 26.74% of the positions lasting 2 years, and 16.28% lasting 3 years. Since the initial study, nearly 7 years have passed, and the findings indicate that there is a trend of colleges moving away from having undetermined lengths of time for their SLO Coordinator positions and an increase in the number of colleges extending the time limit of the position to 2 years or more. This indicates a desire to stabilize and formalize the position of SLO Coordinator on campuses, as such positions are becoming increasingly valued among CCCs. Another example of this stabilization and formalization of the position is found in how the positions are selected on campuses.


While the 2007 study indicated that most SLO Coordinators did not have a job description, the current study found that 69.88% of the respondents indicated that there was a formal job description on their campus for the SLO Coordinator position.

However, this means that roughly 30% of CCC SLO Coordinators are operating without a formal list of the expectations for their role. While the trend toward formal job descriptions is encouraging for SLO Coordinators, it also points to a lag among some CCCs in providing the leadership needed to meet the SLO accreditation standards.

In the current study, the largest percentage of SLO Coordinators are selected by the Academic Senate (21.52%), followed closely by the college administration

(20.25%). In the 2007 study, the same groups were responsible for the majority of the appointments, but it was a smaller percentage by the Academic Senate (20.00%) and a slightly larger percentage by the college administration (21.25%). This may indicate the value that the college administration is placing on the position by wanting to have a say in who is providing the services to the college. However, it must also be seen from the faculty point of view, and one must be concerned that student learning outcomes is designed to be a faculty-driven process, yet the faculty may not be allowed to have a role in the selection of the individual who will coordinate that process. As the 2007 study points out, "Unfortunately, only 6% were appointed through joint academic senate and administrative processes, which model the support and cooperative decision making processes that contribute to the eventual success in implementing outcomes and assessment" (ASCCC, 2007, p. 13). This study found that 10.13% of the respondent colleges stated that they were selecting their SLO Coordinators using this model. Although there is an increase in the percentage of colleges using the cooperative model, it is still a relatively low percentage of colleges that have adopted it many years later.

Another indicator of how much value an institution places on SLO coordination is how much reassigned time the position is provided. Without proper compensation for an SLO Coordinator, it can be assumed that the position will naturally have a higher turnover rate as well as an inadequate amount of time to fully conduct the SLO Coordinator duties. In 2007, the largest share of responding colleges (21.25%) reported allocating between 20% and 30% reassigned time, compared to 21.79% in the current study. While the proportion allocating 20–30% reassigned time remained essentially the same seven years later, the use of 40–60% reassigned time increased dramatically, from 20.00% in 2007 to 42.31% in this study. Additionally, 17.95% of respondents in the current study allocated 100% or more reassigned time, while only 3.75% did so in 2007. Overall, more reassigned time is being allocated to the SLO Coordinator position over time, which should better enable colleges to meet the ever-changing and increasing demands of the accreditation standards.

The final component that was reassessed in the current study in relation to the ASCCC study of 2007 was training opportunities. In 2007, 5 years after the SLO accreditation standards were implemented, those involved with SLOs on their campuses indicated a need for training opportunities in documenting evidence and closing the loop. In the current study, now 12 years since the SLO accreditation standards were implemented, the two training opportunities most frequently indicated by respondents as needed to help with their SLO efforts were using data to improve student learning and engaging faculty and staff in the SLO process. It should be noted that these two options were not presented to the participants of the 2007 study. Also, 41.18% of the respondents in the current study indicated a need for training to close the loop, and 35.29% indicated that there is still a need for assistance to document evidence. With the exception of the training opportunity for writing SLOs (the basics: what SLOs are, what the format of SLOs should look like, the use of Bloom's taxonomy), all training opportunities presented in the current study were seen as potentially beneficial by at least 25% of the respondents. Given that 12 years have passed since the introduction of the SLO accreditation standards, these findings again show the lag of the SLO movement and potentially a lack of support for campus SLO efforts to fully implement this work at the level required of all CCCs by the ACCJC standards.

More than a decade later, campuses are still very diverse in how they structure and select their SLO Coordinator positions. In addition, SLO Coordinators are still struggling to fully understand the broad SLO accreditation requirements and how best to help their campuses meet them.


Research Question 1: What Strategies Have Institutions in the California Community College System Utilized to Implement Student Learning Outcomes Processes?

The first research question was designed to investigate what strategies institutions in the California Community College system have used to implement student learning outcomes processes. The survey findings reveal that SLO Coordinators and other campus SLO leaders on CCC campuses are actively using a multitude of strategies to implement SLOs, but that support and training needs remain to be met to help campus SLO efforts. The respondents indicated that their campuses were using many strategies to conduct professional development, to develop personal connections with faculty, to integrate the SLO processes into other planning processes on campus, and to expand and improve the use of electronic media to track and collect SLO data. In previous studies, both Waite (2004) and Ewell (2005) found that providing professional development and engaging the campus faculty were important to SLO processes. As campus SLO Coordinators and other campus SLO leaders employ professional-development activities to assist faculty and staff in implementing SLO processes, they report that they themselves need more support through regional meetings and training institutes, as well as more training on using assessment data to improve student learning and on engaging faculty and staff. The findings thus imply that SLO Coordinators and other campus SLO leaders often do not have all the information, knowledge, or training they need to assist others in professional-development activities. In addition, for those leaders who are providing professional-development activities to others on campuses, access to online resources, listserv access, and access to experts to facilitate workshops would benefit the entire campus SLO effort.

The respondents in this study also suggested that developing personal connections with faculty and staff to assist them in the SLO process was a useful strategy. Within the use of this strategy, respondents repeatedly expressed the need for an SLO Coordinator to manage the process but also emphasized the need for this individual to have the skills to develop one-to-one relationships with faculty and within campus departments. In addition, they discussed the need for campus administration to support the SLO Coordinator’s efforts. As indicated by the expression of training and support needs in the study, SLO Coordinators not only need to navigate and develop their own skills across the various aspects of the SLO processes, but they are also responsible for the training and development of other faculty and staff who are growing in these areas. SLO Coordinators must develop relationships with many diverse faculty and staff, find ways to engage them on many levels, and provide mentorship and instruction to the campus about SLO processes.

Similarly, Dunsheath (2010) explained that, without communication between the various campus stakeholders, the processes would be more difficult.

The integration of SLO processes with other planning processes on campus was documented in this study as a need when respondents were asked what training opportunities they believed would be beneficial to their campus. Additionally, the integration of SLO processes into other campus processes has been identified by earlier studies, including Waite (2004), who stated, "The process needs to be integrated with pre-existing processes in the college" (p. 120). In order to fully integrate the SLO processes into other campus processes, SLO Coordinators must assist faculty and staff to close the assessment loop, which also includes using assessment data to improve student learning and developing quality dialog. Both of those training opportunities were listed as needed by one-third to two-thirds of the study's respondents. The findings also suggest that one way in which the integration can occur is through the involvement of the Academic Senate or the Curriculum Committees on campus. In addition, study respondents believed that the campus administration needed to enforce the consequences established for those who chose not to comply with the SLO processes. Some of the consequences mentioned were not allowing courses to be offered if they had not been assessed, restricting the requesting or allocation of financial resources to departments, and utilizing the threat of ACCJC sanctions. Tharp (2012) found in his study of four California community colleges (two that had never been on sanction and two that had been on sanction multiple times) that the two colleges that had been on sanction multiple times did not have their accreditation processes (of which SLOs are a part) as well integrated as the two colleges that had never been placed on sanction.

More than half of the survey respondents to this study indicated the need to have access to online resources and web pages, and 35% stated that they thought additional training about documenting evidence would be beneficial. SLO Coordinators would benefit from having access to these types of electronic resources. It would enable them to gather information more easily to provide professional-development workshops and trainings to others, as well as to enhance their own knowledge base. Further, many of the respondents discussed their use of electronic databases, including TracDat, ELumen, or Curricunet. Although many described these as invaluable resources for their campus, they also explained struggles with their use. Tharp (2012) also noted that campuses that were on sanction more often than others had developed a tool to collect and use data but that his participants reported "ongoing negative interactions that had to be overcome in order to use the tool" (p. 153). Some of the challenges noted in this study were the database not being utilized campus wide, training needs for the electronic collection and tracking of SLO data (typically through a database), technological support on campus for faculty and staff, and implementation issues that made the systems inefficient or "clunky," as described by one respondent. Dunsheath (2010) discusses the need to have a "data point person" (p. 71) who has the knowledge to assist with the use of data from these systems.

Research Question 2: What Barriers Exist in the Implementation of Student Learning Outcomes Processes?

The second research question that the study was designed to explore looked at the barriers that exist in the implementation of student learning outcomes processes. The survey responses show that the three main types of barriers to the implementation of SLOs on CCC campuses are faculty issues, perception and knowledge issues, and a lack of support. Many of the same faculty issues documented through this study have also been noted in earlier studies, indicating that these barriers are widespread throughout the higher education system in our country and are not isolated to the survey respondents or the CCC system.

This study identified faculty resistance to the process as a key faculty issue (issues identified by survey respondents as concerns or issues presented as directly affecting faculty) when identifying barriers to SLO implementation. Resistance to SLO processes was also noted in several other studies (Dunsheath, 2010; Palomba & Banta, 1999; Waite, 2004), which indicates the need to minimize this barrier. Strategies to overcome resistance and engage faculty will be discussed under research question number three.

A second faculty issue found in this study, as well as within the literature, is a lack of time or a feeling of being overworked (Dunsheath, 2010; Palomba & Banta, 1999; Walvoord & Pool, 1998). Additionally, survey respondents discussed the difficulty colleges face in involving their part-time faculty, who are the sole providers of instruction for many of the courses that must be assessed through the SLO process. A similar issue was noted in the study conducted by Dunsheath (2010), who found that part-time faculty need to be integrated into the SLO processes on campuses.

The second main barrier noted in this study was an issue concerning perception and knowledge about SLO processes. As was previously discussed under the strategies for implementing SLO processes, many SLO Coordinators and other campus SLO leaders do not have all the training and knowledge they need to complete their jobs in the most effective manner. Many SLO Coordinators and other campus SLO leaders are full-time faculty members who are content experts in their fields, but they were not trained as assessment experts. Serban (2004) states, "One of the major challenges in building, sustaining, and effectively utilizing student learning outcomes assessment is having the needed expertise and skills on campus" (p. 23). Dunsheath (2010) echoes the concern about knowledge of assessment: "It follows that if SLOs require a transformation in faculty's thinking about teaching/learning, then some amount of retraining and education (i.e., professional development) would be required" (pp. 70–71). Many of the respondents of the current study indicated a need for additional training opportunities and identified professional development as a key strategy for the successful implementation of SLOs on their campuses.

The final barrier to implementing SLOs that was identified in the current study was a lack of support. As stated in Chapter II of this study, the support of the administration, without their intrusion, is vital to the success of the SLO processes on campus. If faculty members are not provided adequate support, they will be less likely to engage in the process (Goldstein & Young, 1992). Waite (2004) also noted in her study that a lack of support from the campus administration could serve as a barrier to successful implementation of SLOs. Additionally, she found that administrators struggled at times with providing adequate resources to support the SLO efforts: "Leadership felt that scarce resources for an initiative of this magnitude were [sic] a barrier. Measurement of SLOs was seen as a 'huge, unfunded mandate'" (p. 123).


Research Question 3: What Strategies Have Been Successful in Eliminating or Minimizing the Barriers to Implementing Student Learning Outcomes Processes?

The third research question the current study was designed to address asked which strategies have been successful in eliminating or minimizing the barriers to implementing student learning outcomes processes. The participants indicated that the primary strategies their campuses used to reduce or eliminate barriers to the implementation of SLOs were professional development, providing support, and enforcing compliance with SLO assessment activities. These strategies closely mirror two (professional development and support) of the four strategies named under the first research question addressing how SLOs can be implemented successfully. In addition, respondents indicated that enforcing compliance with the completion of SLO activities was an important strategy for reducing barriers.

Professional development was also cited by Waite (2004) and Dunsheath (2010) as a way to reduce faculty resistance because it helps to provide information and knowledge to faculty (a lack of which was noted as a barrier in the current study). Waite also suggests that, within professional-development activities, SLO Coordinators and other campus SLO leaders should encourage the sharing of successful SLO experiences to reduce barriers. Findings suggest that SLO Coordinators and other campus SLO leaders in the CCC system are in greatest need of the following professional-development opportunities: regional meetings and planned training institutes where these individuals from various campuses can network and share their experiences to assist each other in enhancing their knowledge about SLO processes. In addition, training is needed to assist the campus as a whole in using assessment data to improve student learning, engaging faculty and staff, developing quality dialog, improving general education and institutional learning outcomes, and closing the assessment loop. Campus leaders should be reminded that not only do campus SLO Coordinators and other campus SLO leaders need to be educated and trained in these areas, but the entire campus also needs to become educated. In order to do this and eliminate the barriers to implementing SLO processes on campuses, a high level of support is needed.

Support and the enforcement of compliance, as two additional core strategies, were very broad based and included using technology, ensuring good campus communication, enabling faculty to work one-to-one with SLO Coordinators, having an effective SLO Coordinator, and engaging faculty leadership. When campuses seek to reduce or eliminate barriers, they must provide support that is tailored to the stage that individual faculty and departments are in with regard to implementing SLO processes. Multiple studies concur with these findings and promote the need for a high level of support and a collegial relationship between faculty and the campus administration (Dunsheath, 2010; Gray, 1997; Loacker, 1988; Palomba & Banta, 1999; Waite, 2004). One approach to be mindful of when trying to create change on campuses is the Concerns Based Adoption Model (CBAM), which will be discussed further under research question number four regarding campus culture.


Research Question 4: What Institutional Cultural Characteristics Affect the Successful Implementation of Student Learning Outcomes Processes?

Research question 4 explored the cultural characteristics of institutions that affect the successful implementation of student learning outcomes processes. The respondents indicated that three common cultures existed on CCC campuses regarding SLOs: a culture of believers, a culture of compliance, and a culture of resistance. The predominant culture expressed by the survey respondents was a culture of compliance, with pockets on campus consisting of individuals creating a culture of believers and a culture of resisters. The campus cultures identified in this study are viewed as hierarchical in that individuals can move from resisters to compliers to the ideal: believers. Dunsheath (2010) provides a clear definition of campus culture:

Underlying the successful transformation to a teaching/learning college, the existing culture serves either as a barrier or asset for the SLOs process. Culture is the deeply shared values, assumptions and beliefs of an institution. Culture endures and it defies trends and fads to remain symbolically representative and reflective of the essence of the institution. Consideration of campus culture is essential for the successful implementation of a student learning outcomes process. (pp. 53–54)

As has been noted above, resisters (those who create the culture of resistance) are often fearful of the SLO processes and are undereducated about how to complete them. Resisters can be brought forward to the next level of culture (compliance) within a campus by utilizing strategies to eliminate barriers for them. In addition, using the CBAM, SLO Coordinators and other campus SLO leaders can identify where faculty are in their SLO cultural development on campus and provide targeted opportunities for growth. For example, in the current study, an assessment of the overall level of concern that faculty members have for sustaining SLO processes was provided by the survey respondents. This assessment indicated that most respondents viewed their campus faculty as being concerned about how much time SLO processes take. Earlier in this study, it was found that time and the feeling of being overworked are two faculty issues that need to be overcome. Utilizing the CBAM, if the overarching culture of the campus is defined, in part, by faculty members who are concerned about how much time it takes to sustain SLOs, the campus can develop targeted professional-development activities for these individuals. Concern about time is reflective of the fourth stage of the CBAM, consequence, which is characterized by individuals beginning to question whether or not the change will truly impact their students' learning: "Is it worth it?" (Loucks-Horsley, 1996). If SLO Coordinators and other SLO leaders on campus can design professional-development opportunities that recognize and address this concern, they may be able to persuade many in this large campus group that their efforts will be "worth it" and prevent them from sliding into the culture of resistance on campus.

The same can be true for using the CBAM to explore how much faculty members use SLO assessment data to improve student learning. Faculty members who have a desire to improve student learning through the use of assessment data are either in the compliance culture (participating because they believe they must) or the culture of believers (participating because they intrinsically believe that this will improve learning). According to the level-of-use assessment provided by the survey respondents in this study, only a small percentage (4.11%) have no interest in using data to improve student learning (resisters). A larger percentage are exploring the use of data or are begrudgingly using it (compliers) by learning more about it, developing plans to use the data, making changes in their departments or SLOs that will enable them to use the data, or developing a process for its use (levels 2–6, comprising 63.02% of the survey respondents). The culture of believers is found in faculty who were assessed as making deliberate efforts to coordinate their use of the assessment data with others or who were seeking more effective ways to improve the process (32.88% of survey respondents).

Research Question 5: In What Ways Has the Institution's Accreditation Status Impacted the Implementation of Student Learning Outcomes Processes?

The fifth research question of the study was tailored to investigate the ways in which an institution's accreditation status has impacted the implementation of student learning outcomes processes. The responses indicate that 78.21% of the survey respondents were employed by CCCs that were not on sanction by the ACCJC as of July 2013 (this does not mean that they had never been on sanction, only that they were not as of July 2013). Of the remaining participants who answered the question, 19.23% were employed by CCCs that were currently on sanction, and 2 respondents were from a CCC that had received its initial candidacy for accreditation. This cross section of CCCs that were on or off sanction appears very comparable to the proportions of colleges on and off sanction at any given time throughout the CCC system. The findings from this study support the conclusion that accreditation status affects the implementation of SLO processes on campuses. This finding is also supported by the work of several other studies of the CCC system (Tharp, 2012; Waite, 2004).

In addition, findings from this study support some of Tharp's (2012) recommendations, including, "Understand that accreditation processes may be influenced by campus culture and reframe accreditation as internally motivated" (p. 155). Tharp found these two cultural characteristics to be representative of campuses that were not on sanction by the ACCJC as often as campuses without these characteristics. These characteristics are also indicative of a culture of believers, not one of compliers (more externally motivated) or of resisters (not motivated to participate). An examination of the findings provided by the survey respondents revealed that the predominant culture of compliers often felt torn between wanting to believe in the process (moving to the next level of cultural SLO development) and feeling burdened by the barriers mentioned previously, including the time it would take to be fully invested in the process. SLO Coordinators and other campus SLO leaders can use this information to target strategies to reduce or eliminate barriers expressed by faculty on their campuses. The culture of compliance was exhibited in many of the comments by respondents concerning accreditation status. Many campuses were described as being motivated by the compliance aspect of accreditation to spur the implementation of SLOs, largely out of fear of losing their accreditation. This was also noted in the study conducted by Waite (2004), who found that accreditation pressure and fear helped to elicit the development of SLOs on campuses.

Discussion and Evaluation

The purpose of this study was to explore how California community colleges utilize the Student Learning Outcomes (SLO) Coordinator positions on their campuses, what barriers campuses face when trying to implement SLOs, how campuses have effectively addressed these barriers, and what role accreditation status plays in implementing SLOs. Relatively few studies have been conducted about the role of the Student Learning Outcomes Coordinator positions in the California Community College system (ASCCC, 2007). Given the lack of research involving SLO Coordinators, the objective of this descriptive study was to gather demographic information about SLO Coordinators and how campuses are handling the various aspects of SLO implementation. This researcher also set out to ascertain the perceived cultural characteristics that exist on campuses in relation to SLOs. The findings of this study can be used to help understand the challenges that SLO Coordinators and other SLO leaders face based on their roles in implementing SLOs.

Since SLO Coordinators often struggle to enhance their own knowledge and skill sets across the various aspects of student learning outcomes assessment and the use of its data, they find themselves in a quandary when trying to design effective professional-development activities to educate others about these topics. SLO Coordinators need basic through advanced training to transition faculty from being full-time faculty members and content experts in their fields of study to being assessment experts as well. Thus, if there is a lack of formalized training, or if there are gaps in the training that SLO Coordinators receive, then individuals in these positions may continue to struggle to move their colleges toward creating sustainable change in the SLO processes on campuses. Revealing this gap will also provide campus administrations with a solid foundation on which to base resource decisions to assist in this process. In reality, even if SLO Coordinators are well trained, if the campus administration does not provide them with adequate resources (human assistance, technological assistance, and financial assistance), SLO Coordinators will not be as effective as they could be.

There was general agreement among respondents about the barriers that exist on campuses in relation to the implementation of SLOs and the strategies that are being used to reduce or eliminate them. According to the findings of this study, campuses tend to address barriers, including faculty resistance and a lack of support, through professional-development activities and by working collegially with the campus leadership. However, providing professional-development opportunities alone is not enough. SLO Coordinators must explore where their faculty members are in the SLO process, regarding both their level of concern and their level of use, and then develop strategies based on their findings in order to provide the most effective training opportunities. Most faculty members are concerned about the amount of time it will take them to be fully engaged in the SLO processes; they are using the SLO assessment data, but only at a certain level. Targeting these faculty members is likely to bring about the largest shift on a campus since they are no longer resistant to the notion of participating in the SLO process.

Further, this study also revealed that three main cultures exist on campuses regarding SLO processes, with the predominant culture being one of compliance. The research suggests that these cultures are developmental to a degree and that individuals within campuses can grow and move from one culture to another with the proper support and training. The three cultures are a culture of resistance (comprising faculty members who refuse to participate in the process), a culture of compliers (composed of faculty who participate in the process, but only because they are forced to), and a culture of believers (consisting of faculty who believe in the intrinsic benefits of assessing SLOs). These three types of cultures, and thus these types of faculty members, most likely exist to some degree on all CCC campuses, and each campus has its own cultural "flavor." However, regardless of individual differences, it is believed that bringing campus faculty from resistance to compliance will ultimately enable campuses to create a critical mass of faculty members who are believers.

Resistance can be difficult to counteract but has been successfully handled by many campuses through professional development, follow-through by the campus administration concerning consequences for not complying, and threats from the ACCJC about sanctions. The largest cultural group on campus will be that of the compliers. A compliance culture allows the work to be completed and most likely also allows a campus to stay off accreditation sanction for noncompliance with the SLO proficiency standards. However, this can only be short-lived, as campuses must now focus on moving toward sustainable continuous quality improvement (SCQI) for SLOs in order to meet the accreditation standards. The SCQI level for SLOs requires a much greater integration of all the SLO processes, and this cannot be achieved if campuses are not prepared to embrace this movement. Campuses have the best chance to meet and exceed the expectations of the SCQI standards if they can create a culture of believers within their faculty. In order to be engaged at the level needed to meet the new SLO SCQI level, faculty will need to believe that the SLO processes are intrinsically beneficial to their teaching and to their students' learning.

Limitations of the Study

A number of limitations are associated with this study. The researcher examined SLO Coordinators and other campus SLO leaders from California community colleges only; 2-year colleges outside the CCC system and all 4-year colleges and universities were excluded. Further, the study focused primarily on SLO Coordinators and other campus SLO leaders. It did not attempt to include the perspectives of general faculty members engaged in the SLO process or of campus administrators or deans not serving as SLO leaders.

Survey respondents were solicited from all California community colleges

that had faculty or staff registered on one of the RP Group listservs mentioned earlier

or the Vice President of Instruction listserv. Although more than 50% of the

California community colleges had at least one survey respondent, many campuses

did not opt to participate in the study; therefore, the perspectives of those campuses are not included. Although the survey was considered anonymous for the purposes of this project, the researcher had access to many key identifiers for the respondents, and this may have deterred some participants from completing the survey.

Recommendations for Practice

Based on the findings of this study, the researcher has several recommendations for practice. The researcher is aware of the existing workloads of college faculty and administrators as well as the ongoing fiscal constraints that

California community colleges face. These factors have been taken into consideration with respect to the recommendations presented below.

The first recommendation addresses the need for training identified in this study.

Campuses simply cannot continue to leave SLO Coordinators and other campus SLO leaders without the formalized training necessary to lead their campuses’ SLO efforts. SLO Coordinators must be afforded opportunities to regularly attend trainings, including those held by the Academic Senate for California Community Colleges (the SLO Institute) and conferences where SLOs are a primary topic, including the RP Group’s annual Strengthening Student Success conference. In addition to these Statewide annual opportunities, it is suggested that regional SLO Coordinator meetings be organized to allow this group to share best practices and challenges. One example of this type of effort is the recent organization of Northern and Southern California SLO-Net meetings for professionals, held in May 2014. These opportunities need to be made available more frequently to all SLO professionals in the CCC system, and campuses must insist that both their SLO Coordinators and other campus SLO leaders attend these trainings and must ensure the funding for them.

Another recommendation is for campuses to create an annual SLO assessment budget. As mentioned, funds need to be made available for trainings and to ensure that SLO Coordinators have the resources they need to conduct their work as SLO Coordinators apart from their roles as faculty members. For example, if the SLO Coordinator is a mathematics faculty member, he or she should not need to use resources from the Mathematics Department to conduct his or her work as an SLO Coordinator. SLO Coordinators should have access to funds for duplicating, printing, and training materials. These funds also need to be directed toward purchasing, maintaining, and upgrading the database system used to collect, track, and store SLO assessment data, as this study has shown that the lack of an effective technology system for these purposes can be a barrier to the implementation of SLO processes.

A third recommendation is for the CCC system to assume a leadership role in assisting colleges in gaining access to SLO resources. SLO resources are abundant but are, for the most part, scattered across various institution and agency websites.

This researcher suggests that a list of the best resources for SLO practices be compiled and maintained on a main system website, similar to the one the CCC Transfer Center Directors maintain for Statewide transfer information. The RP Group has a section of its website dedicated to SLOs and SLO research, but something more intensive and complete for CCCs is needed. In addition, the system needs to continue to help itself by utilizing the resources available within it. Many CCC faculty, staff, and administrators are considered experts in the field of assessment and could truly help campuses move past their “sticky points” to a more positive place in their SLO efforts. SLO assessment experts should step forward and avail themselves (potentially for a fee) to other campuses to help the system move forward.

SLO Coordinators and other campus SLO leaders should champion the believer movement and work to create a culture of believers on campuses by understanding the various ways that campus faculty and staff may be thinking and feeling about SLOs and targeting the different groups with trainings and opportunities designed for them. Using the CBAM is one method that has been shown to help SLO

Coordinators and other campus SLO leaders understand the various levels of concern and levels of use by faculty and how they can best be served given those levels. The use of SLO teams or peer-to-peer SLO leaders is an effective way to help pockets of faculty and departments to move from one cultural level to another. This is not a job for only one person.

Based on the findings of this study, this researcher has two recommendations for campus administration. The first is to formalize the process of selecting, supporting, and evaluating the SLO Coordinator(s) on campuses. The findings of this study indicate a movement toward greater formalization since the 2007 ASCCC study

was conducted, but many Coordinators within the system still operate without a formal job description. In addition, by the nature of student learning outcomes and according to the ACCJC accreditation standards, SLOs should be a faculty-driven process. In formalizing the selection of the SLO Coordinator, the administration can ensure that the campus Academic Senate is involved in the decision-making process. Finally, some campuses have already recognized the need for a full-time SLO Coordinator, while others are still giving only partial release time. If campus administration chooses not to invest in a full-time SLO Coordinator, it needs to recognize the onerous burden it is placing on one or two faculty members to conduct sensitive work that requires the development of personal connections and time to create and sustain a cultural shift from resistance through compliance to belief. Faculty members will burn out and turn over, decreasing the consistency of campus SLO efforts; ultimately, these efforts will not be as effective as necessary to help the college maintain its clean accreditation status or remove itself from an accreditation sanction for SLOs.

The second recommendation for campus administrators and faculty leaders is to formalize the enforcement of compliance with participation in the SLO processes on campus. Although enforcement should not be the primary means of gaining compliance, and campus administration and faculty leadership should work together on issues that affect faculty, especially their time and compensation, enforcement is key to ending much of the resistance that remains on campuses. It is suggested that the campus administration work with the faculty leadership to negotiate union contract language regarding the responsibility that faculty members have to participate in the SLO processes, as well as any relevant details regarding compensation for their involvement, including that of part-time faculty. In addition, working with faculty committees, including the Academic Senate and the Curriculum Committee, can help the college move its SLO efforts forward through Senate resolutions and SLO regulations within curricula. A collegial relationship must exist, with administration supporting the efforts of a faculty-driven process.

Finally, to all faculty, staff, and administrators on CCC campuses: be proud of the accomplishments of the campus. Share and celebrate them, not only within the campus community, but with the community at large. Encourage and fund presentations at trainings and conferences throughout the State about SLO efforts that have made a difference. Too often, the champions of these efforts are “rewarded” by being made leaders responsible for the efforts of others; instead, believers should be rewarded with praise and with the showcasing of their work in various venues.

Recommendations for Further Study

The researcher has several recommendations for further study based on the findings of this descriptive study. The first is to further study the use of the Concerns Based Adoption Model (CBAM) as a tool to move a college’s SLO efforts forward. As this study explored, the CBAM can help identify where faculty members stand in their concerns about, and uses of, an innovation (in this case, SLOs) and can target opportunities for growth based on the level they have reached. This has the potential to help SLO Coordinators provide more meaningful and useful trainings to campus faculty.

In addition, it is recommended that more research be conducted that integrates the faculty perspective on the use of SLO assessment data to improve student learning. There is a large gap in this area at present, and campuses are struggling to fully integrate SLO assessment data into college planning to improve student learning.

Finally, it is recommended that further research be conducted on the notion of a developmental SLO culture on campuses spanning resisters, compliers, and believers. Such research can help campuses understand which cultures exist on their campuses and to what degree, as well as how to work with each culture to help it reach a higher level of SLO cultural development.

Research Process Reflections

The researcher is a full-time mid-level manager and Student Learning

Outcomes Coordinator at a medium-sized community college in the Central Valley of

California. Due to her role as the SLO Coordinator, she has some experience with the

SLO assessment cycle within her institution, and many of the questions that arose in this study were a result of reflections on her assessment experiences within her institution. The researcher has spent 5 years in this leadership position within her institution, which provided her with a mixed experience with SLO assessment. In her leadership role, the researcher was charged with creating an environment in which other campus departments, as well as her own, could develop, assess, analyze, and

then utilize the results of SLO assessments, while receiving little formalized training. The researcher used her own perceptions and her educational training in assessment to guide her work; she also used information she gathered from trainings off campus. The researcher has worked with her campus counterparts in SLO leadership and found the campus-wide processes to be riddled with complexity and confusion. The survey findings were very revealing and demonstrated that SLO experiences, although varied throughout the system, are very similar in the challenges faced and the strategies used. Initially, the researcher believed that the challenges experienced by SLO Coordinators were, in some cases, insurmountable, but through the survey findings and the use of the CBAM, she has a newfound passion for assisting colleges in overcoming their challenges.

It is hoped that the findings of this study can be used by the CCC system to aid SLO Coordinators and campus leadership through the challenge of implementing

SLOs and moving their campus culture toward sustainable continuous quality improvement.

Conclusion

The assessment of student learning outcomes has the potential to change not only how the CCC system focuses on pedagogy in the classroom, but also how resources are allocated to address the greatest needs on campuses. This change must be supported by the campus administration but must be led and directed by the campus faculty to be an effective innovation. Widespread SLO cultural change can occur when campuses properly fund and effectively use SLO Coordinators to coordinate professional-development opportunities related to SLOs across the campus and to build a culture of believers. The findings show that faculty are concerned about the time required to fully implement SLO processes, but most are assessing SLOs and at least beginning to use the data to improve student learning.

This momentum must be sustained by addressing the issues raised by faculty and by providing the right amount of support for campus SLO efforts.

REFERENCES

Academic Senate for California Community Colleges (ASCCC). (2007). Agents of

change: Examining the role of student learning outcomes and assessment

coordinators in California Community Colleges. Retrieved from

http://www.asccc.org/papers/agents-change-examing-role-student-learning-

outcomes-and-assessment-coordinators-california

Academic Senate for California Community Colleges (ASCCC). (2008). Have you

heard about the two-year rule and accreditation? Retrieved from

http://www.asccc.org/content

Accrediting Commission for Community and Junior Colleges (ACCJC). (2011).

Twelve common questions and answers about regional accreditation.

Retrieved from http://www.accjc.org

Accrediting Commission for Community and Junior Colleges (ACCJC). (2012a).

Accreditation standards. Retrieved from http://www.accjc.org

Accrediting Commission for Community and Junior Colleges (ACCJC). (2012b).

Rubric for evaluating institutional effectiveness. Retrieved from

http://www.accjc.org/all-commission-publications-policies

Accrediting Commission for Community and Junior Colleges (ACCJC). (2013a).

Commission actions on institutions from the June 2013 meeting. Retrieved

from http://www.accjc.org/


Accrediting Commission for Community and Junior Colleges (ACCJC). (2013b,

Spring). Trends in deficiencies leading to sanction. ACCJC News. Retrieved

from http://www.accjc.org/newsletter

Afshari, M., Bakar, K., Luan, W., Samah, B., & Fooi, F. (2009). Applying the

concerns-based adoption model to research on computers in education.

Retrieved from www.ukessays.co.uk/essays/education/concerns-based-

adoption-model.php

American Association of Community Colleges (AACC). (2013). The college

completion challenge fact sheet. Retrieved from http://www.aacc.nche.edu

Banta, T. W. (2002). Building a scholarship of assessment. San Francisco, CA: Jossey-Bass.

Banta, T. W., Black, K. E., Kahn, S., & Jackson, J. E. (2004). Perspective on good

practice in community college assessment. New Directions for Community

Colleges, 126, 5–16.

Beno, B. A. (2004). The role of student learning outcomes in accreditation quality

review. New Directions for Community Colleges, 126, 65–72.

California Community College Chancellor’s Office (CCCCO). (2014). Datamart.

Retrieved from http://datamart.cccco.edu/DataMart.aspx

Cohn, G. (n.d.). “Unfathomable”: Why is one Commission trying to close California’s largest public college? Retrieved from http://capitalandmain.com/unfathomable-why-is-one-commission-trying-to-close-californias-largest-public-college/


Corbin, J., & Strauss, A. (1990). Grounded theory research: Procedures, canons, and evaluative criteria. Qualitative Sociology, 13(1), 3–21.

Dunsheath, B. (2010). Searching for an A+: Techniques for implementing a

successful student learning outcomes process in California community

colleges. Doctoral dissertation. California State University, Long Beach.

Eaton, J. S. (2012). An overview of U.S. accreditation. Washington, DC: Council for

Higher Education Accreditation. Retrieved from http://chea.org/pdf/

Overview%20of%20US%20Accreditation%202012.pdf

Ewell, P. (2005). Can assessment serve accountability: It depends on the question. In

Joseph C. Burke and Associates (Eds.), Achieving accountability in higher

education: Balancing public, academic, and market demands. San Francisco, CA: Jossey-Bass.

Fain, P. (2012). Results are in. Retrieved from http://www.insidehighered.com/news/

2012/07/30

Freedberg, L. (2012). Accrediting agency under federal pressure to be tougher on

community colleges. Retrieved from http://www.edsource.org/today/2012

Gall, M. D., Borg, W. R., & Gall, J. P. (1996). Educational research: An

introduction. White Plains, NY: Longman.

Goldstein, B., & Young, F. (1992). The evolution of student outcomes assessment:

Politics and collegiality. In California State University (Ed.), Student

outcomes assessment: What makes it work? (pp. 31–40). Long Beach, CA:

California State University Institute for Teaching & Learning.


Gray, P. J. (1997). Viewing assessment as an innovation: Leadership and the

change process. In P. J. Gray & T. W. Banta (Eds.), The campus-level impact

of assessment: Progress, problems, and possibilities (pp. 5–15). San

Francisco, CA: Jossey-Bass.

Lane, I. (2007). Change in higher education: Understanding and responding to

individual and organizational resistance. Journal of Veterinary Medical

Education, 34(2), 85–92.

Leal, R. R. (1995). From collegiality to confrontation: Faculty-to-faculty conflicts. In

S. A. Holton (Ed.), Conflict management in higher education (pp. 19–25). San

Francisco, CA: Jossey-Bass.

Loacker, G. (1988). Faculty as a force to improve instruction through assessment. In

J. M. McMillan (Ed.), Assessing students’ learning (pp. 19–32). San

Francisco, CA: Jossey-Bass.

Loucks-Horsley, S. (1996). The concerns based adoption model (CBAM): A model

for change in individuals. Retrieved from www.nas.edu/rise/backg4a.htm

Magruder, J., McManis, M. A., & Young, C. C. (1997). The right idea at the right

time: Development of a transformational assessment culture. In P. J. Gray &

T. W. Banta (Eds.), The campus-level impact of assessment: Progress,

problems, and possibilities (pp. 5–15). San Francisco, CA: Jossey-Bass.

McClenney, K. M. (1998). Community colleges perched at the millennium:

Perspectives on innovation, transformation, and tomorrow. Leadership

Abstracts, 11 (9). Retrieved from http://league.org/labs0898.html


Palomba, C. A., & Banta, T. W. (1999). Assessment essentials: Planning,

implementing, and improving assessment in higher education. San Francisco,

CA: Jossey-Bass.

Ringel, R. (2000, Fall). Managing change in higher education. Assessment and

Accountability Forum. Retrieved from www.intered.com/storage/jiqm/

v10n3_ringel.pdf

Rogers, G. (2013). Sample protocol for pilot testing survey items. Retrieved from

http://www.abet.org/uploadedfiles/events/webinars/develop_survey.pdf

Senge, P., Kleiner, A., Roberts, C., Ross, R., Roth, G., & Smith, B. (1999). The dance

of change. New York, NY: Doubleday.

Serban, A. M. (2004). Assessment of student learning outcomes at the institutional

level. New Directions for Community Colleges, 126, 17–27.

Shugart, S. C. (2013). Moving the needle on college completion, thoughtfully.

Retrieved from http://www.insidehighered.com/users/sanford-c-

shugart#sthash.fQab1v70.dpbs

Suskie, L. (2010). Why are we assessing? Inside Higher Education. Retrieved from

www.insidehighered.com/views/2010/10/26/suskie#sthash.n06GzYuf.dpbs

Tharp, N. (2012). Accreditation in the California community colleges: Influential

cultural practices. Doctoral dissertation. California State University,

Sacramento.


Waite, L. (2004). Implementing student learning outcomes: The link to

accreditation in California Community Colleges. Doctoral dissertation.

University of San Diego.

Walvoord, B. E., & Pool, K. J. (1998). Enhancing pedagogical productivity. In J. E.

Groccia & J. E. Miller (Eds.), Enhancing productivity: Administrative,

instructional, and technological strategies (pp. 35–48). San Francisco, CA:

Jossey-Bass.

APPENDICES


APPENDIX A

RUBRIC FOR EVALUATING INSTITUTIONAL EFFECTIVENESS


APPENDIX B

SAMPLE SURVEY QUESTIONS FOR SYSTEM-WIDE SURVEY

1. Name, College where serving as SLO Coordinator, Contact phone number, and email

2. List all positions you hold at your campus and indicate whether they are considered classified, management/administration, or faculty.

3. How long have you been a SLO Coordinator?
a. 0–1 semester
b. 1 semester–3 semesters (a year and a half)
c. 2–4 years
d. 5–6+ years

4. What is the term of your role as SLO Coordinator?
a. 1 year
b. 2 years
c. 3 years
d. 4 or more years
e. Indefinite or not determined
f. It is not an official position, but stems from serving on a committee that covers SLOs
g. Other: ______

5. How much reassigned time does your SLO position provide?

6. How is the SLO Coordinator selected on your campus?

7. Is there a formal job description for the SLO Coordinator position? a. Yes b. No

8. List the primary duties you are responsible for as the SLO Coordinator.


9. Please evaluate your campus, in your opinion, regarding the following student learning outcomes and student learning outcomes assessment benchmarks. Use 1=not yet begun; 2=beginning to develop; 3=developed on most of the campus; 4=developed campus-wide; 5=developed campus-wide and integrated into campus decision-making.
a. Course level SLOs
b. Program SLOs
c. General education SLOs
d. Institutional SLOs
e. Student Support Services SLOs
f. Library and learning support SLOs
g. Administrative services SLOs
h. The role of SLOs in accreditation
i. Assessing the outcomes of course SLOs
j. The role of assessment as an aid to instruction
k. The use of evidence to support student learning
l. Using assessment to create venues for dialogue
m. Level of overall faculty participation

10. Which of the following would provide beneficial support to your role as an SLO Coordinator? (check all that apply)
a. A statewide listserv for SLO coordinators
b. Regional meetings for Coordinators
c. Planned training institutes for Coordinators
d. Access to online resources/web page
e. Access to experts to facilitate workshops on your campus
f. Other: ______

11. Which of the following training opportunities would assist you in your role as an SLO Coordinator? (Check all that apply)
a. Writing student learning outcomes basics
b. Assessment basics
c. Closing the assessment loop
d. Course/program outcomes
e. General education/institutional outcomes
f. Documenting evidence
g. Developing quality dialogue
h. Using assessment data to improve student learning
i. Other: ______

12. What strategies have been used to develop and implement SLOs on your campus?


13. What barriers has/does your college faced/face in implementing SLOs?

14. What strategies have been used to eliminate the barriers your college has/is faced/facing?

15. Has your accreditation status affected the implementation of SLOs or the elimination of barriers in implementing SLOs?

16. Describe the culture of your campus in relation to SLOs.

17. Do you believe the culture of your campus in relation to SLOs impacts your ability to eliminate barriers? If yes, in what ways? If no, why?

18. As of today, how would you characterize the overall level of concern your faculty have about sustaining SLO processes on your campus?
a. Not concerned or involved at all
b. They are concerned and want to know more about it
c. They are concerned about how it will affect them
d. They are concerned about how much time it takes
e. They are worried about how the use of the assessment data will affect students
f. They are moving forward with it and collaborating with others
g. Not only have they moved forward with it, they are getting more involved and putting forth ideas to make it even better

19. As of today, please describe the overall level of use by faculty of assessment data to improve student learning on your campus.
a. Faculty have no interest and are not taking action to use results
b. Faculty are taking the initiative to learn more about it, but are not using them now
c. Faculty have definite plans to use assessment data
d. Faculty are beginning to make some changes to be able to use the assessment data
e. Faculty are not making many more changes as they now have a process for using the assessment data established
f. Faculty are making changes to increase or alter their outcomes (refining their use)
g. Faculty are making deliberate efforts to coordinate with others about using assessment data
h. Faculty are seeking out more effective alternatives to using assessment data

20. Please provide the website address to your campus’s SLO home page.


*Several of the questions above were adapted from the survey conducted in 2007 by the Academic Senate for California Community Colleges entitled, “Agents of Change: Examining the Role of Student Learning Outcomes and Assessment Coordinators in California Community Colleges” (see Appendix C).


APPENDIX C

ASCCC SLO COORDINATOR STUDY SURVEY


APPENDIX D

PROTOCOL FOR PILOT TESTING SURVEY ITEMS

Time Required: Approximately 1 hour

Subjects: 4–6 persons

Purpose: As a part of a research project to fulfill a doctoral dissertation, I am developing a survey designed to assess SLO Coordinator positions and the implementation of SLOs across the CCC system. I appreciate your willingness to help us pilot test the survey and provide us some feedback on your understanding and perception of the survey items. Your individual responses in the pilot test phase are not going to be recorded or reported to anyone except those who are designing the survey.

Process:

1. The researcher will provide copies of the survey.
2. Please note how much time is required to answer all items.
3. Once you have completed the survey, respond to each survey item in four ways.
a. Understandable: Was the item understandable? That is, did you have to read the item more than once to understand what it was asking? Was the meaning of the question clear and straightforward?
b. Scale adequate: Was the scale adequate? That is, do you feel the scale provided you with an appropriate way to respond?
c. Only one response: Was the item written in such a way that you could have answered it more than one way? (E.g., could you have said BOTH “very little” and “very much”?)
d. Loaded: In your opinion, was the item written in such a way that there was ONLY one OBVIOUS answer for you? In other words, the way the item is worded, it is highly unlikely that respondents would be able to respond using more than one response choice.


4. Please circle yes/no for each item.

For any items you answered “no,” please explain why you responded this way in the Comments box.

The pilot-test form listed each survey item with four Yes/No columns (Understandable?, Scale adequate?, Only one response?, and Loaded?) and a Comments column. The items reviewed were:

List all positions you hold at your campus and indicate whether they are considered classified, management/administration, or faculty.
How long have you been a SLO Coordinator?
What is the term of your role as SLO Coordinator?
How much reassigned time does your SLO position provide?
How is the SLO Coordinator selected on your campus?
Is there a formal job description for the SLO Coordinator position?
List the primary duties you are responsible for as the SLO Coordinator.
Please evaluate your campus, in your opinion, regarding the following student learning outcomes and student learning outcomes assessment benchmarks.
Which of the following would provide beneficial support to your role as an SLO Coordinator?
Which of the following training opportunities would assist you in your role as an SLO Coordinator?
What strategies have been used to develop and implement SLOs on your campus?
What barriers has/does your college faced/face in implementing SLOs?
What strategies have been used to eliminate the barriers your college has/is faced/facing?
Has your accreditation status affected the implementation of SLOs or the elimination of barriers in implementing SLOs?
Describe the culture of your campus in relation to SLOs.
Do you believe the culture of your campus in relation to SLOs impacts your ability to eliminate barriers?
As of today, how would you characterize the overall level of concern your faculty have about sustaining SLO processes on your campus?
As of today, please describe the overall level of use by faculty of assessment data to improve student learning on your campus.
Please provide the website address to your campus’s SLO home page.

*Format adapted from Sample Protocol for Pilot Testing Survey Items by Gloria Rogers, ABET, Inc.


APPENDIX E

SURVEY CONSENT FORM

Dear Participant:

You are being asked to participate in a research project that is being done to fulfill requirements for a Doctoral degree in Educational Leadership at CSU Stanislaus. We hope to understand more clearly the role of Student Learning Outcomes Coordinators and their influence on the implementation and sustainability of SLOs on CCC campuses. If you decide to volunteer for this study, you will be asked to answer questions regarding your position as an SLO Coordinator and to provide your opinion on where you believe your campus faculty stand on several SLO issues. These questions will be asked of you in an electronic survey. The survey should take about 20 minutes to complete. SLO Coordinators from each California Community College campus will be surveyed. The results of this study will be used to understand the role SLO Coordinators play in implementing and sustaining SLOs on campuses.

There are no risks to you for your participation in this study. It is possible that you will not benefit directly by participating in this study. The information collected will be protected from all inappropriate disclosure under the law. All data will be maintained for a period of one year from the completion of the study and will be destroyed by May 2015. Only the researcher will have access to data that can be linked to individual subjects or individual institutions. The researcher will provide each participant with a pseudonym in any written papers related to this study.

There is no cost to you beyond the time and effort required to complete the procedure(s) described above. Your participation is voluntary. Refusal to participate in this study will involve no penalty or loss of benefits. You may withdraw at any time without penalty or loss of benefits.

If you agree to participate, please continue to the electronic survey. Completion of the survey indicates your willingness to participate. If you wish to re-visit this consent form at any time during the survey, please click the link provided. If you have any questions about this research project, please contact me, Regina Coletto, at [—] or my faculty sponsor, Dr. Jim Riggs at 209-664-6789. If you have any questions regarding your rights and participation as a research subject, please contact Campus Compliance, California State University, Stanislaus at 209-667-3747.

Only people 18 years of age or older will be allowed to participate in the study.


APPENDIX F

GENERAL MEMBERSHIP OF RP GROUP LISTSERVS

RP Listserv Facilitates information sharing among California community college researchers and planners on issues, methodologies, and tools.

SLO Listserv Facilitates information sharing among California community college learning assessment coordinators on issues, methodologies, and tools.

Assessment Listserv A moderated listserv featuring effective practices, relevant studies, and useful resources on assessment issues.

Leading from the Middle Listserv A listserv to facilitate information sharing among California community college practitioners who are leading change in their roles as deans, department chairs, coordinators, and committee chairs.