
Graduate Student Exit Interviews

A Brief Analysis of

Graduate Student Exit Interviews,

California State University, Chico

December 1995-December 2004

William M. Loker
Interim Associate Dean
School of Graduate, International and Interdisciplinary Studies
California State University, Chico

June 2005

Introduction

The School of Graduate, International and Interdisciplinary Studies routinely collects exit interview data from graduating masters students in the form of a brief survey (see attached). Students receive this survey when applying for graduation, and many take the time to fill it out and return it with their graduation application. The same survey has been administered since 1990. While attempts have been made to tally responses to the survey, there has been no systematic attempt at further analysis of results. Unfortunately, it appears that the surveys from 1990-94 were tallied and discarded. Therefore, the current analysis covers the period December 1995-December 2004. There are a total of 885 surveys from this time period.

The procedure for analyzing the surveys has been as follows. The surveys were reviewed and responses coded by a student assistant under the supervision of the author during Fall 2004. This process included reducing the open-ended responses (to Questions 20-22) to a manageable number of categories and coding these for data input. The responses were then entered into an Excel spreadsheet by the student assistant. The data were reviewed, cleaned, and prepared for analysis. For the purposes of analysis, the Excel data were exported to SPSS, and subsequent analysis was carried out using that program.

What follows is a preliminary attempt to summarize the responses to the survey, looking at the graduates as a whole and then comparing the responses of individual programs to the entire sample. It should always be borne in mind, when reviewing these results, that we are examining self-reported data from students. Hence, we are examining perceptions of programs, not the actual conditions prevailing in these programs. Having said that, the perceptions of our graduates regarding their experiences are clearly important and should be considered, among other data, in assessing the strengths and weaknesses of the graduate experience at CSU, Chico.
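As a point of reference for anyone repeating this kind of tabulation with current tools, the short Python sketch below illustrates the coding and tallying steps described above using pandas rather than Excel and SPSS. The file name, column name, and keyword rules are hypothetical placeholders for illustration only; they are not the actual survey file or coding scheme used for this report.

import pandas as pd

# Load the exit-survey spreadsheet (assumed layout: one row per graduate,
# one column per survey question; file name is a placeholder).
surveys = pd.read_excel("exit_surveys_1995_2004.xlsx")

# Collapse free-text answers to Question 20 into a small set of codes,
# mirroring the manual coding step described above (keyword rules assumed).
def code_q20(text):
    if not isinstance(text, str) or not text.strip():
        return None                      # treat blanks as missing
    t = text.lower()
    if t in ("none", "nothing", "n/a"):
        return "None (everything OK)"
    if "thesis" in t:
        return "Thesis related"
    if "advis" in t or "committee" in t:
        return "Advising"
    return "Miscellaneous/other"

surveys["q20_code"] = surveys["q20_do_differently"].apply(code_q20)

# Frequency and valid-percent tabulation analogous to Table 5 below.
counts = surveys["q20_code"].value_counts(dropna=True)
print(pd.DataFrame({"F": counts, "Valid %": 100 * counts / counts.sum()}))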

Overall Summary

Table 1 lists the number of masters graduates by program for the period under consideration. It is notable that more than half of our masters graduates received degrees in five programs: Computer Science, various Business degrees, various Psychology degrees, Speech Pathology, and PE/Kinesiology. An additional five masters programs provide another 25% of our graduates (various Education masters programs, various Interdisciplinary Studies programs, various Communications graduate degrees, various Political Science degrees, and Social Work). It is notable that Social Work appears among the top degree-granting graduate programs, as it only came into existence in AY 1999-2000, graduating its first cohort of students in Spring 2003. Clearly it is a large and growing program, much in demand. The remaining 15 degree-granting programs account for less than 25% of our graduate degrees.

Turning to the results of the exit surveys, Tables 2-6 summarize the primary questions directed at the quality of the graduate experience. Several of these questions are ordinal questions using a 4-point ranked scale, where "1" is excellent and "4" is poor. The one exception is Question 6, "Is the focus of your masters program:" with responses 1. Too Specialized, 2. About Right, 3. Too Broad. Questions 20-22 are open-ended questions and will be discussed in more detail below.

Table 1. Masters Programs, Number of Exit Surveys, 12/95-12/04
(columns: Frequency, Percent, Valid Percent, Cumulative Percent)
Computer Science: 109, 12.3, 12.4, 12.4
Business (All MA/MS) (a): 97, 11.0, 11.0, 23.4
Psychology (All MA/MS) (b): 93, 10.5, 10.6, 34.0
Speech Pathology: 82, 9.3, 9.3, 43.3
PE/Kinesiology: 81, 9.2, 9.2, 52.5
Education (all MAs) (c): 55, 6.2, 6.3, 58.8
All Interdisciplinary MA/MS (d): 46, 5.2, 5.2, 64.0
Communications (All MAs) (e): 38, 4.3, 4.3, 68.3
Political Science (All MAs) (f): 35, 4.0, 4.0, 72.3
Social Work: 30, 3.4, 3.4, 75.7
English (All MAs) (g): 29, 3.3, 3.3, 79.0
Biology/Botany: 24, 2.7, 2.7, 81.7
Anthropology: 21, 2.4, 2.4, 84.1
Nutrition: 21, 2.4, 2.4, 86.5
Teach Int'l Languages: 18, 2.0, 2.0, 88.5
Recreation: 17, 1.9, 1.9, 90.5
Instructional Technology: 17, 1.9, 1.9, 92.4
Geography (Geo & P): 13, 1.5, 1.5, 93.9
Music: 11, 1.2, 1.3, 95.1
Social Science: 11, 1.2, 1.3, 96.4
History: 9, 1.0, 1.0, 97.4
Electrical Engineering: 8, .9, .9, 98.3
Art (All MAs) (h): 7, .8, .8, 99.1
Geosciences: 5, .6, .6, 99.7
Nursing: 3, .3, .3, 100.0
Valid total: 880, 99.4, 100.0
Missing (system): 5, .6
Total: 885, 100.0

(a) Includes MBA, MS Accounting, MIS, Redding program.
(b) Includes Psychology, MFT, PPS.
(c) Includes Education, Ed Admin, Special Education, Curriculum.
(d) Includes Interdisciplinary MA, MS, ISMA Math, ISMS Math Ed.
(e) Includes Communications, Information and Communications Studies.
(f) Includes Political Science, MPA, MPA with Health Admin option.
(g) Includes English, MFA Creative Writing, English Literature.
(h) Includes Art, Art History.

Table 2 summarizes the responses to what are arguably the most important quality measures of the graduate programs at CSU, Chico: overall quality, variety of courses, availability of courses, overall quality of faculty, and the answer to the question "Were you intellectually challenged by your program?" Recall that in all cases, the closer the response is to "1" the better the program's performance. In these pooled responses the mean is around 1.6 for all measures except variety and availability of courses, where responses average closer to "2." In all cases, the median and modal responses are "2," or "good." Absent external benchmarks, it is difficult to assess how we are doing on these measures of overall quality. But it is clear that the variety and availability of graduate-level courses are viewed as somewhat problematic by our students.

Table 2. Graduate Student Exit Surveys, All Graduate Programs, CSU, Chico
(columns: N, Missing, Mean, Median, Mode, Std Dev)
Rate overall quality of graduate program: 881, 4, 1.6657, 2.0000, 2.00, .62323
Rate variety of course offerings in program: 880, 5, 2.0057, 2.0000, 2.00, .73393
Rate availability of course offerings: 881, 4, 2.0023, 2.0000, 2.00, .81464
Rate overall quality of department faculty: 878, 7, 1.6093, 2.0000, 1.00, .66300
Were you intellectually challenged by program?: 876, 9, 1.6261, 2.0000, 2.00, .59504
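For readers who want to see how figures like those in Table 2 could be computed, the following sketch shows the descriptive statistics (N, missing, mean, median, mode, standard deviation) for a set of 4-point items. The spreadsheet name and column names are assumptions made for illustration; they do not correspond to the actual SPSS variable names used for this report.

import pandas as pd

# Hypothetical column names for the five ranked items summarized in Table 2
# (1 = excellent, 4 = poor); the file and columns are assumed, not actual.
surveys = pd.read_excel("exit_surveys_1995_2004.xlsx")
quality_items = [
    "q_overall_quality", "q_course_variety", "q_course_availability",
    "q_faculty_quality", "q_intellectual_challenge",
]

# Descriptive statistics, one row per survey item.
summary = surveys[quality_items].agg(["count", "mean", "median", "std"]).T
summary["mode"] = [surveys[col].mode().iloc[0] for col in quality_items]
summary["missing"] = len(surveys) - summary["count"]
print(summary.round(4))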

The second set of quality indicators, in Table 3, focuses more on what might be termed "services" provided by the programs and university: advising, assistance with financial aid, and career counseling. Faculty score high on "availability for advice" but slightly lower on "satisfaction with advice received" and "quality of encouragement to persist toward goals." Scores are notably lower on "career counseling," and especially on "satisfaction with financial aid assistance." On this last question, the modal response was "Not applicable." While this can be interpreted in a variety of ways, it appears to indicate that students received little or no assistance in obtaining financial aid. Why this is so is impossible to determine from the survey responses alone and bears further investigation. But clearly, our graduate students view the situation of financial aid as problematic.

Table 3. Graduate Student Exit Surveys, All Graduate Programs, CSU, Chico
(columns: N, Missing, Mean, Median, Mode, Std Dev)
Rate faculty availability for advice: 877, 8, 1.5753, 1.00, 1.00, .71196
Rate degree of satisfaction with advice received: 873, 12, 1.7520, 2.0000, 2.00, .77634
Rate degree of satisfaction with assistance obtaining financial aid: 874, 11, 3.2946, 3.0000, 5.00, 1.5776
Rate quality of encouragement received from department/committee to persist toward goals: 870, 15, 1.7109, 2.0000, 1.00, .80207
Rate quality of career information received: 792, 93, 2.3068, 2.0000, 2.00, .92715

The third set of quality indicators, in Table 4, focuses on a variety of topics: student morale in the program, quality of thesis direction (for those completing a thesis), focus of the program, quality of resources in the program, and, finally, the quality of services offered by the Graduate School. The means of most responses hover around 2.0. (Remember that 2.0 is the optimal response for the "focus of program" question.) Students ranked quality of direction on the thesis highest among all these indicators. This is somewhat surprising, given that many students mention issues with the thesis as one of the "things they would do differently" (see below). One possible interpretation is that students underestimate the challenge of the thesis, but feel that, ultimately, faculty are doing a very good job of guiding them through the thesis writing process.

Table 4. Graduate Student Exit Surveys, All Graduate Programs, CSU, Chico
(columns: N, Missing, Mean, Mode, Std Dev)
Rate student morale in your program: 858, 27, 1.9790, 2.00, .67167
Rate thesis direction received from committee and chair: 477, 408, 1.6887, 2.00, .74051
Rate the focus of your masters program (2 = "About Right"): 874, 11, 2.0561, 1.00, .32683
Rate quality of resources and facilities in your program: 843, 42, 2.0991, 2.00, .81223
Rate quality of assistance received from Graduate School: 838, 47, 1.9511, 2.00, .75134

Next we will briefly review the responses to the open-ended questions: Question 20, "Given what you know now, what would you have done differently in the pursuit of your master's degree?"; Question 21, "What specific suggestions do you have for improving the quality of your program?"; and Question 22, "What specific suggestions do you have for improving the quality of the service offered by the Graduate School?" In reviewing the responses to these questions, we were faced with the difficulty of taking hundreds of responses and distilling them down to a manageable number of categories without sacrificing too much of the sense and meaning of each response. In some cases this was not too difficult: many people responded with variations of "none" or "nothing" to these questions. Other cases were more difficult to resolve. Also, there tend to be many more missing data here, as many graduates did not take the time to write out responses to these questions. With these caveats in mind, we will proceed to examine the responses. Tables 5, 6, and 7, which summarize these responses, are accompanied by bar charts that present a graphic depiction of the responses.

Table 5. "In retrospect, what would you do differently?"
(columns: Frequency, Percent, Valid Percent)
None (everything OK): 129, 14.6, 21.5
Personal planning (start earlier, finish more quickly, etc.): 103, 11.6, 17.2
Thesis related (start thesis sooner, etc.): 93, 10.5, 15.5
Advising (sought more, better, relationship with committee): 76, 8.6, 12.7
Pursued degree elsewhere: 42, 4.7, 7.0
Taken more classes: 39, 4.4, 6.5
Miscellaneous/other: 25, 2.8, 4.2
Program planning (course sequence, not write thesis): 19, 2.1, 3.2
Program-specific comments: 16, 1.8, 2.7
Different program (social work to psychology, vice versa, other): 16, 1.8, 2.7
Not worked full-time: 14, 1.6, 2.3
Academic preparation (better, different): 10, 1.1, 1.7
More independent study, internship, field, job experience: 9, 1.0, 1.5
Financial aid: 5, .6, .8
No degree: 3, .3, .5
Valid total: 599, 67.7, 100.0
Missing: 286, 32.3
Total: 885, 100.0

Table 5 presents the results from Question 20, "What would you do differently?" It is gratifying that the modal response (21.5%) is "None, everything OK," indicating student satisfaction with their experience. At the other end of the spectrum is the response "pursued degree elsewhere," given by about 7% of those who answered this question. In other words, more than three times as many respondents would change nothing as would not choose CSU, Chico as the university at which to pursue their degree. On the other hand, it is a little discouraging that roughly 1 in 14 graduate students would go elsewhere if they had to do it over again. Other notable categories of response had to do with planning around the thesis (15.5%) and seeking more or better quality advice about the program (12.7%).

Table 6 summarizes responses to the question regarding suggested improvements to the quality of the specific degree program. Here the modal response (28.8%) focuses on the variety and availability of classes. As we saw in Table 2 (above), variety and availability of courses is the indicator on which our graduate programs score lowest. Doubtless this is symptomatic of the overall focus of CSU, Chico (and the CSUs in general) on undergraduate education, to the neglect of graduate programs. Variety and availability of classes is followed by "improved advising" and "improved facilities," which, together with variety/availability of classes, account for about 50% of the responses to this question. Again, these responses reflect the secondary role of graduate education, where faculty time is focused mostly on undergraduate education and the research facilities necessary to support graduate student research are not a priority at our campus. As faculty and administrators involved in graduate education, this will not come as a surprise to most of us. What is interesting is that students notice this, too. In fact, this pattern of responses enhances our confidence in the validity of student perceptions in general, giving the overall results of the exit surveys more credibility.

Table 6. Suggested Improvements to Your Degree Program
(columns: Frequency, Percent, Valid Percent, Cumulative Percent)
More classes, better availability: 161, 18.2, 28.8, 28.8
Improved advising: 61, 6.9, 10.9, 39.7
Improved facilities: 52, 5.9, 9.3, 49.0
None (no changes needed): 52, 5.9, 9.3, 58.3
Miscellaneous, other (more selective admissions): 48, 5.4, 8.6, 66.9
Better faculty: 42, 4.7, 7.5, 74.4
More professors: 33, 3.7, 5.9, 80.3
Program specific (specific courses, sequences): 33, 3.7, 5.9, 86.2
More rigor: 16, 1.8, 2.9, 89.1
More practical training, internships: 14, 1.6, 2.5, 91.6
Thesis related: 14, 1.6, 2.5, 94.1
More job information: 12, 1.4, 2.1, 96.2
Financial aid: 10, 1.1, 1.8, 98.0
More specialized: 8, .9, 1.4, 99.5
More opportunity to teach: 2, .2, .4, 99.8
Less specialized: 1, .1, .2, 100.0
Valid total: 559, 63.2, 100.0
Missing: 326, 36.8
Total: 885, 100.0

Table 7. Suggested Improvements for Graduate School
(columns: Frequency, Percent, Valid Percent, Cumulative Percent)
None (OK as is): 112, 12.7, 30.5, 30.5
Improve communication (send more, web, newsletter): 85, 9.6, 23.2, 53.7
Don't know, little/no contact: 74, 8.4, 20.2, 73.8
Improve efficiency (availability, processing of forms): 28, 3.2, 7.6, 81.5
Personnel complaints: 21, 2.4, 5.7, 87.2
Program specific (more classes): 15, 1.7, 4.1, 91.3
Thesis related (prep workshop each semester): 11, 1.2, 3.0, 94.3
Miscellaneous, other: 9, 1.0, 2.5, 96.7
More job information: 7, .8, 1.9, 98.6
Commencement (more seating): 5, .6, 1.4, 100.0
Valid total: 367, 41.5, 100.0
Missing: 518, 58.5
Total: 885, 100.0

The final set of open-ended responses focuses on improving services provided by the Graduate School. The Graduate School is an administrative arm of the university responsible for admissions, general information for students on procedures throughout their graduate education, review of theses, processing applications for graduation, and the organization of commencement, among other duties. The Graduate School does not deliver educational content to students, but provides a context for programs to operate and for students to pursue their degrees. About half of the respondents to this question replied either "none (no improvements needed)" (30%) or "don't know, little contact with Graduate School" (20%). This is not a bad outcome, as both responses basically indicate no complaint with the Graduate School. The "don't know" responses are a little troubling, as students seem to be unaware of what the Graduate School does, but it could be worse! The most common suggestion for improvement was "more and better communication." Complaints focused on perceived bureaucratic delays and inefficiencies (7.6%), with only 5.7% having specific complaints about treatment received from Graduate School personnel. Given the volume of graduate students and limited staff resources, this is a very positive result.

Program-Specific Results

The next section of this report reviews the same information on a program-by-program basis. The general analytical procedure was to carry out t-tests on differences in means among the ranked quality indicators for each program compared to the rest of the sample, looking for statistically significant results (see note i). Only statistically significant differences are reported. The ranked quality indicators reviewed in Table 2 above, identified as the primary quality indicators, were also examined for trends over time by taking the responses of each program and dividing them into two groups: pre-2000 and post-2000 graduates. This split the graduates into two approximately equal cohorts. When this was done for all masters graduates taken as a group (not by program), there were no statistically significant trends. But a few programs did show trends over time, and these are noted below. The pattern of responses to the open-ended questions was also reviewed. In this case, responses were simply examined for patterns or trends, without employing any formal means of statistical evaluation. Results are presented as briefly as possible.

In reviewing the results, keep in mind that closer to "1" is always better on all of these measures. When a program's mean on a quality measure is greater (farther from "1") than that of the overall sample, student perception of quality is lower than for the overall sample. When the program's mean is smaller (closer to "1") than the overall mean, student perception of quality is higher than overall student perceptions. In the text below, "negative" indicates student perceptions of quality that are below average for the group as a whole, while "positive" indicates a significant difference in the direction of higher student perception of program quality. Also, a few programs (Music, Social Science, Electrical Engineering, Art, Geosciences, and Nursing) were not analyzed due to the small number of cases (n ≤ 11), which vitiated any formal statistical analysis. Recreation was not analyzed because of a small number of cases (17) and the fact that its graduate program is currently suspended.
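The comparison just described can be stated compactly in code. The sketch below performs the two tests used throughout the program-by-program sections: a two-sample t-test of one program's ratings against all other programs, and a pre-2000 versus post-2000 split within a program. The column names ("program", "grad_year", and the quality items) are assumptions about how such a spreadsheet might be laid out, not the variable names actually used in SPSS.

import pandas as pd
from scipy import stats

surveys = pd.read_excel("exit_surveys_1995_2004.xlsx")

def program_vs_rest(df, program, item):
    """t-test of one program's mean rating on `item` against all other programs."""
    in_prog = df.loc[df["program"] == program, item].dropna()
    others = df.loc[df["program"] != program, item].dropna()
    t, p = stats.ttest_ind(others, in_prog)   # plain two-sample t-test
    return t, p

def pre_post_2000(df, program, item):
    """Time-trend check: pre-2000 vs post-2000 graduates within one program."""
    prog = df[df["program"] == program]
    pre = prog.loc[prog["grad_year"] < 2000, item].dropna()
    post = prog.loc[prog["grad_year"] >= 2000, item].dropna()
    return stats.ttest_ind(pre, post)

# Illustrative calls; program and item labels are placeholders.
print(program_vs_rest(surveys, "Computer Science", "q_faculty_quality"))
print(pre_post_2000(surveys, "Computer Science", "q_overall_quality"))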

Anthropology: There was no significant difference on any of the primary indicators of student perception of quality between Anthropology masters graduates and the overall sample, nor were time trends evident in the data, though the number of cases is small (21). There were, however, significant differences in student morale (negative) and quality of facilities (negative). The small N also complicates the analysis of the open-ended questions, but Anthropology graduate students appear to have more thesis-related concerns (4 responses, 23.5%) and advising concerns (4 responses, 23.5%) compared to the overall sample. Results are presented in Table 8, below.

Table 8. ANTH vs. All Other Graduate Programs, T-test results of significant differences
Rate student morale in your program: All N = 833, mean 1.9694, SD .66864; ANTH N = 20, mean 2.3750, SD .77587; t = 2.671, Sig. = 0.008
Rate quality of resources and facilities in your program: All N = 818, mean 2.0905, SD .81284; ANTH N = 20, mean 2.3750, SD .77587; t = 2.151, Sig. = 0.032

Biology: Biology graduate students rated their program below the mean on student morale and on both variety and availability of course offerings (see Table 9). There were no time trends evident in the responses, but the number of cases is small. The number of respondents to the open-ended questions is also small. However, it is notable that for Question 20 the response "pursued degree elsewhere," at 15.8% (three responses), was twice as high as the university average (7%). Biology graduate students also seemed to have more complaints about Graduate School personnel (4 responses = 40%) compared to the average (6%). BIOL students are < 0.5% of all CSU, Chico graduates but make up nearly 20% of those who complained about Graduate School personnel.

Table 9. BIOL vs. All Other Graduate Programs, T-test results of significant differences
Rate variety of course offerings: All N = 851, mean 1.9935, SD .72818; BIOL N = 24, mean 2.5000, SD .82092; t = 3.348, Sig. = 0.001
Rate availability of course offerings: All N = 852, mean 1.9853, SD .80507; BIOL N = 24, mean 2.6875, SD .90665; t = 4.199, Sig. = 0.000
Rate student morale in your program: All N = 829, mean 1.9710, SD .66875; BIOL N = 24, mean 2.2500, SD .79400; t = 2.003, Sig. = 0.045

Business: With Business graduates, we turn from two of the smaller programs (ANTH, BIOL) to the second largest program on campus. Among Business graduates who responded to the exit questionnaire, there were statistically significant differences in intellectual challenge and student morale (both negative); see Table 10. Other quality indicators revealed significant differences in "degree of satisfaction with advice received" (positive) and "encouragement to persist toward goals" (negative). There were no time trends among the major quality indicators. Among the open-ended questions, the only remarkable response is to Question 21, where "better faculty" was mentioned by 15% (n = 9) of Business graduates compared to 7.5% of the rest of the sample.

Table 10. Business vs. All Other Graduate Programs, T-test results of significant differences
Were you intellectually challenged by program?: All N = 776, mean 1.61, SD .602; Business N = 96, mean 1.74, SD .530; t = 2.253, Sig. = 0.026
Rate degree of satisfaction with advice received: All N = 766, mean 1.77, SD .786; Business N = 95, mean 1.69, SD .749; t = 2.096, Sig. = 0.036
Encouragement toward goals: All N = 773, mean 1.68, SD .804; Business N = 94, mean 2.01, SD .873; t = 3.678, Sig. = 0.000
Rate student morale in your program: All N = 756, mean 1.9993, SD .67867; Business N = 97, mean 1.8196, SD .61316; t = 2.482, Sig. = 0.013

Communication Studies: On the primary quality indicators, Communication Studies shows significant differences on availability of courses (negative) and quality of faculty (negative); see Table 11. On the secondary quality indicators, Communication Studies shows a significant difference on career advice (negative). There were no notable time trends or patterns in the responses to Questions 20, 21, and 22.

Table 11. CMST vs. All Other Graduate Programs, T-test results of significant differences
Rate availability of course offerings: All N = 838, mean 1.99, SD .808; CMST N = 38, mean 2.39, SD .886; t = 3.029, Sig. = 0.003
Rate overall quality of department faculty: All N = 836, mean 1.59, SD .655; CMST N = 37, mean 1.90, SD .798; t = 2.786, Sig. = 0.005
Rate quality of career information received: All N = 776, mean 2.37, SD 1.07; CMST N = 35, mean 2.77, SD .883; t = 2.285, Sig. = 0.023

Computer Science is the largest program in CSU, Chico's Graduate School. It has a large number of international students as well as an online degree component. (Surveys do not identify either category of student specifically.) Data analysis is reported in Tables 12 and 13. Among the primary quality indicators, the only significant difference from the rest of the sample is in "quality of faculty" (negative). Among other quality indicators, Computer Science scores significantly below the mean on "encouragement toward reaching goals" and above the mean on "quality of facilities." Perhaps most notable is that Computer Science shows negative time trends on all of the major quality indicators. This indicates that more recent Computer Science graduates (post-2000) view the program as having significantly lower quality compared to those who graduated from 1995-1999. There were no particularly outstanding patterns in the responses to the open-ended questions.

Table 12. CSCI vs. All Other Graduate Programs, T-test results of significant differences
Rate overall quality of department faculty: All N = 765, mean 1.58, SD .658; CSCI N = 108, mean 1.82, SD .667; t = 3.587, Sig. = 0.000
Encouragement toward goals: All N = 761, mean 1.69, SD .818; CSCI N = 106, mean 1.96, SD .780; t = 3.397, Sig. = 0.001
Rate quality of resources and facilities in your program: All N = 734, mean 2.13, SD .824; CSCI N = 105, mean 1.91, SD .709; t = 2.512, Sig. = 0.012

Table 13. Time Trends on Measures of Quality, CSCI MA Graduates, Pre- and Post-2000 Graduates
Rate overall quality of graduate education: Pre-2000 N = 59, mean 1.5593, SD .62343; Post-2000 N = 48, mean 1.8646, SD .64197; t = 2.486, Sig. = 0.015
Rate variety of course offerings in program: Pre-2000 N = 59, mean 1.7627, SD .65229; Post-2000 N = 49, mean 2.1837, SD .78192; t = 3.051, Sig. = 0.003
Rate availability of course offerings in program: Pre-2000 N = 59, mean 1.9492, SD .62763; Post-2000 N = 49, mean 2.2857, SD .81650; t = 2.421, Sig. = 0.017
Rate overall quality of department faculty: Pre-2000 N = 59, mean 1.6610, SD .60487; Post-2000 N = 49, mean 2.0204, SD .69191; t = 2.880, Sig. = 0.005
Were you intellectually challenged by course of study?: Pre-2000 N = 60, mean 1.5333, SD .57392; Post-2000 N = 48, mean 1.8333, SD .66311; t = 2.519, Sig. = 0.013

Education. CSU, Chico offers several distinct types of MA degrees in education that have been combined for the purposes of this analysis (see footnotes to Table 1, above). Taken together, these students form a substantial share of our graduate students. Data analysis is reported in Tables 14-16. Education students rate their educational experience significantly more positively than average on many indicators: overall quality, course variety, intellectual challenge, quality of faculty, quality of advice received, encouragement toward goals, quality of resources, quality of thesis advice, and quality of service from the Graduate School (all positive). There were no time trends on any measures. On the open-ended questions, Question 20 responses indicating a need for more personal planning are over-represented (31%), as are responses indicating a need for more or better advising (26%), compared to 17% and 12% in the rest of the sample. On the same question, no Education students indicated they would "pursue degree elsewhere." On Question 21, "improved advising" is 30% versus 11% for the rest of the sample. There were no prominent trends that differ from the rest of the sample on Question 22, Graduate School services, though there were two personnel complaints.

Table 14. EDMA vs. All Other Graduate Programs, T-test results of significant differences
Rate overall quality of graduate education: All N = 821, mean 1.6784, SD .62966; EDMA N = 55, mean 1.4727, SD .50386; t = 2.372, Sig. = 0.018
Rate variety of course offerings in program: All N = 820, mean 2.0299, SD .73796; EDMA N = 55, mean 1.6727, SD .60260; t = 3.511, Sig. = 0.000
Were you intellectually challenged by course of study?: All N = 816, mean 1.6373, SD .60074; EDMA N = 55, mean 1.4364, SD .49117; t = 2.426, Sig. = 0.015

Table 15. EDMA vs. All Other Graduate Programs, T-test results of significant differences
Rate degree of satisfaction with advice received: All N = 817, mean 1.8819, SD .78657; EDMA N = 55, mean 1.5091, SD .66312; t = 2.415, Sig. = 0.016
Rate quality of encouragement received from department/committee to persist toward goals: All N = 816, mean 1.6373, SD .60074; EDMA N = 55, mean 1.4364, SD .49117; t = 2.750, Sig. = 0.006

Table 16. EDMA vs. All Other Graduate Programs, T-test results of significant differences
Rate quality of resources and facilities in your program: All N = 787, mean 2.1252, SD .81488; EDMA N = 52, mean 1.7212, SD .68894; t = 3.493, Sig. = 0.001
Rate thesis direction received from committee and chair: All N = 442, mean 1.7104, SD .74373; EDMA N = 35, mean 1.4143, SD .64723; t = 2.287, Sig. = 0.023
Rate quality of assistance received from Graduate School: All N = 782, mean 1.9738, SD .75883; EDMA N = 52, mean 1.6250, SD .55902; t = 3.255, Sig. = 0.001

English. English, a small-to-medium-sized graduate program, showed significant differences on quality of faculty (positive) and on thesis advice and encouragement (positive). There were significant negative time trends in the variety and availability of course offerings. (See Table 17.) There were no strikingly different responses among English graduates to the open-ended questions.

Table 17. ENGL vs. All Other Graduate Programs, T-test results of significant differences
Rate overall quality of department faculty: All N = 844, mean 1.6209, SD .66756; ENGL N = 29, mean 1.2586, SD .43549; t = 4.309, Sig. = 0.000
Rate quality of encouragement received from department/committee to persist toward goals: All N = 838, mean 1.7315, SD .82457; ENGL N = 29, mean 1.3793, SD .49380; t = 3.668, Sig. = 0.001
Rate thesis direction received from committee and chair: All N = 454, mean 1.7093, SD .74677; ENGL N = 23, mean 1.2826, SD .44788; t = 4.277, Sig. = 0.000

ENGL Graduate Programs, Time Trends, T-test results
Rate variety of courses available in program: Pre-2000 N = 12, mean 1.7500, SD .96531; Post-2000 N = 17, mean 2.4118, SD .71229; t = 2.128, Sig. = 0.043
Rate availability of courses available in program: Pre-2000 N = 12, mean 1.5833, SD .79296; Post-2000 N = 17, mean 2.4706, SD .71743; t = 3.141, Sig. = 0.004

Geography and Planning: This is the program with the smallest number of graduates analyzed here (n = 13). Geography and Planning showed significant differences from the rest of the sample on the measures "faculty availability for advice" and "student morale" (both negative; see Table 18). The sample size was too small to do time trend analysis or to evaluate responses to the open-ended questions.

Table 18. GEOP vs. All Other Graduate Programs, T-test results of significant differences
Rate faculty availability for advice: All N = 859, mean 1.5664, SD .70687; GEOP N = 13, mean 2.1538, SD .89872; t = 2.962, Sig. = 0.003
Rate student morale in your program: All N = 840, mean 1.9726, SD .67421; GEOP N = 13, mean 2.3846, SD .50637; t = 2.193, Sig. = 0.029

Interdisciplinary Studies: This category refers to students pursuing a variety of interdisciplinary degrees, from programs the students design themselves to well-established programs such as Math Education. This rather mixed bag of students rated their programs more positively than average on the variables "quality of encouragement received," "student morale," and "quality of resources" (all positive; see Table 19). Positive trends over time were detectable for thesis supervision, quality of faculty, and faculty availability for advice. On the open-ended questions, the Question 20 response "pursued degree elsewhere" was over-represented, with 18% of Interdisciplinary graduates giving this response versus 7% in the total sample. On Question 21 ("suggestions for program improvement"), "none" was the response for 32% of Interdisciplinary graduates versus 9% in the total sample, and "improved advising" was given by 18% of Interdisciplinary graduates versus 11% in the total sample. On Question 22, there were two personnel complaints. These responses are rather difficult to interpret and may reflect the varied backgrounds, interests, and graduate experiences of these students.

Table 19. IDST vs. All Other Graduate Programs, T-test results of significant differences
Rate quality of encouragement received from department/committee to persist toward goals: All N = 838, mean 1.7315, SD .82457; IDST N = 29, mean 1.3793, SD .49380; t = 3.668, Sig. = 0.001
Rate student morale in your program: All N = 812, mean 1.9895, SD .67605; IDST N = 41, mean 1.7683, SD .59264; t = 2.056, Sig. = 0.040
Rate quality of resources and facilities in your program: All N = 801, mean 2.1192, SD .81614; IDST N = 38, mean 1.6974, SD .63181; t = 4.141, Sig. = 0.002

INDS Graduate Programs, Time Trends, T-test results
Rate thesis direction received from committee and chair: Pre-2000 N = 12, mean 2.0000, SD .42640; Post-2000 N = 17, mean 1.2353, SD .43724; t = 4.686, Sig. = 0.000
Rate overall quality of department faculty: Pre-2000 N = 20, mean 1.6750, SD .56835; Post-2000 N = 26, mean 1.2692, SD .45234; t = 2.698, Sig. = 0.010
Rate faculty availability for advice: Pre-2000 N = 20, mean 1.8000, SD .69585; Post-2000 N = 26, mean 1.3269, SD .46781; t = 2.755, Sig. = 0.009

Instructional Technology: This program is affiliated with Communications and could logically be grouped with that department's MA program. Regardless, there are no significant differences between INST students and the rest of the sample on any quality indicators. There were also no time trends (small N of 17). On the open-ended questions, a larger than average number of INST students mentioned issues related to the thesis on Question 20, "What would you do differently?": more than 50% of respondents (6 of 11) compared to 15% in the total sample. Admittedly the sample size is small, so conclusions are tempered by the numbers.

Nutrition: This program, with 21 respondents, showed positive significant differences on "overall quality," "quality of faculty," and several secondary measures of quality ("availability of faculty for advice," "quality of advice received," "encouragement to reach goals," and "quality of career advice received"). (See Table 20.) There were no significant time trends. On Question 20, "What would you do differently?" (n = 15), the most frequent response was "nothing/none" at 40% (vs. 21% in the sample as a whole), and thesis-related responses were 40% (vs. 15% in the sample as a whole). On Question 22 there was one personnel complaint.

Table 20. NUTR vs. All Other Graduate Programs, T-test results of significant differences
Rate overall quality of graduate education: All N = 855, mean 1.6760, SD .62469; NUTR N = 21, mean 1.2381, SD .43644; t = 4.487, Sig. = 0.000
Rate overall quality of department faculty: All N = 852, mean 1.6203, SD .66596; NUTR N = 21, mean 1.1429, SD .35857; t = 5.858, Sig. = 0.000
Rate faculty availability for advice: All N = 851, mean 1.5846, SD .71489; NUTR N = 21, mean 1.1905, SD .51177; t = 3.447, Sig. = 0.002
Rate degree of satisfaction with advice received: All N = 848, mean 1.7683, SD .78633; NUTR N = 21, mean 1.2381, SD .43644; t = 5.356, Sig. = 0.000
Rate quality of encouragement received from department/committee to persist toward goals: All N = 846, mean 1.7305, SD .82063; NUTR N = 21, mean 1.2857, SD .56061; t = 3.543, Sig. = 0.002
Rate quality of career information received: All N = 792, mean 2.3965, SD 1.02621; NUTR N = 19, mean 1.9474, SD .77986; t = 2.460, Sig. = 0.003

PE/Kinesiology: All graduates examined completed their degrees prior to the name change from Physical Education to Kinesiology. This is one of the larger graduate programs on campus. PE graduate students rated their department significantly more positively on "availability of courses" and "student morale." On the question of "intellectual challenge," however, PE was perceived by its graduates as significantly less challenging than other programs were by theirs; the result on this last measure is significant at the 0.000 level, indicating a wide gap between the means for PE and other programs. There were no time trends found on any measure. On Question 20, thesis-related issues were mentioned by 30% of PE graduates (vs. 15% in the total sample). Only 3.7% of PE students indicated that they wished they had pursued their degree elsewhere, versus 7% of the total sample.

Table 21. PHED vs. All Other Graduate Programs, T-test results of significant differences
Rate availability of course offerings in program: All N = 795, mean 2.0239, SD .82357; PHED N = 81, mean 1.8148, SD .70907; t = 2.203, Sig. = 0.028
Rate student morale in your program: All N = 772, mean 1.9987, SD .67038; PHED N = 81, mean 1.7901, SD .67945; t = 2.632, Sig. = 0.010
Were you intellectually challenged by course of study?: All N = 792, mean 1.5960, SD .58012; PHED N = 79, mean 1.9114, SD .67823; t = 4.534, Sig. = 0.000

Table 22. POLS vs. All Other Graduate Programs, T-test results of significant differences
Rate variety of course offerings in program: All N = 840, mean 1.9946, SD .72847; POLS N = 35, mean 2.3143, SD .83213; t = 2.528, Sig. = 0.012
Rate availability of course offerings in program: All N = 841, mean 1.9845, SD .80829; POLS N = 35, mean 2.4857, SD .85307; t = 3.586, Sig. = 0.000
Rate quality of encouragement received from department/committee to persist toward goals: All N = 832, mean 1.7037, SD .80702; POLS N = 35, mean 2.1000, SD .98369; t = 2.819, Sig. = 0.005

Political Science: This category lumps together those pursuing an MPA and those pursuing an MA in Political Science. There are significant differences between POLS graduate students and the rest of the sample on the measures for variety and availability of courses and for encouragement received to reach goals (all negative; see Table 22, above). There were no significant time trends. On Question 20 (n = 22), advising issues were mentioned by 23% of POLS students (vs. 13% of others). Also, POLS students mentioned pursuing their degree elsewhere more frequently than other students: 14% vs. 7% of the rest of the sample. There was one personnel complaint on Question 22.

Psychology: Lumping together all of the masters degrees in Psychology, the program is the third largest on campus (behind Computer Science and Business). Psychology students rated their department significantly better on "quality of faculty" and "availability of faculty for advice." (See Table 23.) In terms of trends over time, more recent graduates rank the department significantly better on "variety of courses." Psychology graduates also rate the Graduate School significantly lower on quality of services received compared to graduates of other programs. Interestingly, given these positive rankings, 11% of Psychology graduates responded to Question 20 (n = 64) that they would "pursue their degree elsewhere" (vs. 7% of the total sample). On Question 22, there were three personnel complaints.

Table 23. PSYC vs. All Other Graduate Programs, T-test results of significant differences
Rate overall quality of department faculty: All N = 780, mean 1.6250, SD .67035; PSYC N = 93, mean 1.4731, SD .59603; t = 2.089, Sig. = 0.037
Rate faculty availability for advice: All N = 779, mean 1.5950, SD .71656; PSYC N = 93, mean 1.4086, SD .66327; t = 2.389, Sig. = 0.017
Rate quality of assistance received from Graduate School: All N = 747, mean 1.9324, SD .74750; PSYC N = 87, mean 2.1207, SD .77746; t = 2.214, Sig. = 0.027

Time Trends, T-test results of significant differences
Rate variety of course offerings in program: Pre-2000 N = 37, mean 2.1757, SD .86776; Post-2000 N = 56, mean 1.8393, SD .65441; t = 2.128, Sig. = 0.036

Social Work: A recent and rapidly growing program. Social Work graduates rated the "overall quality of program" more negatively than graduates of other programs (see Table 24). On the retrospective open-ended question (Question 20), a higher percentage of Social Work graduates said "everything was OK" with the program (41% vs. 21% of the total sample). However, they also more frequently indicated a desire for "more independent study and field work" (9% vs. 1% of the total sample). The lower rating of overall quality is cause for concern in this relatively new program. Because the program is so new, no time trend analysis was possible.

Table 24. SWRK vs. All Other Graduate Programs, T-test results of significant differences
Rate overall quality of graduate education: All N = 846, mean 1.6572, SD .62596; SWRK N = 30, mean 1.9000, SD .53175; t = 2.442, Sig. = 0.020

Speech Pathology and Therapy: Analysis of responses from SPPT graduates revealed significant differences on the quality measures "availability of courses" and "intellectual challenge" (both positive), as well as significant differences on "quality of thesis direction" and "program resources" (both negative, though note the N of 4 for thesis advice). (See Table 25.) There were no detectable time trends in the quality indicators. On open-ended Question 20 (n = 54), 42% of SPPT graduates said they would change nothing about their experience (vs. 21% of the total sample). On Question 21 ("suggestions for program improvement"), 23% mentioned "improved facilities" and 23% mentioned "improved advising," versus 9.3% and 10.9% in the total sample. On the same question, "more classes" was mentioned by only 14% of SPPT graduates compared to 29% in the total sample. On Question 22 there were two personnel complaints.

Table 25. SPPT vs. All Other Graduate Programs, T-test results of significant differences
Rate availability of course offerings in program: All N = 796, mean 2.0496, SD .80937; SPPT N = 80, mean 1.5563, SD .74202; t = 5.235, Sig. = 0.000
Were you intellectually challenged by course of study?: All N = 791, mean 1.6435, SD .59795; SPPT N = 80, mean 1.4375, SD .54758; t = 2.958, Sig. = 0.003
Rate thesis direction received from committee and chair: All N = 473, mean 1.6818, SD .73842; SPPT N = 4, mean 2.5000, SD .57735; t = 2.209, Sig. = 0.028
Rate quality of resources and facilities in your program: All N = 761, mean 2.0631, SD .79552; SPPT N = 78, mean 2.4615, SD .89649; t = 3.776, Sig. = 0.000

Teaching International Languages. There were no significant differences on any quality indicators. TIL is a relatively recent program, so no time trends could be analyzed. On the open-ended questions, the N is small, at 10-13 responses. On Question 21, suggestions for program improvement, "more professors" is the most common suggestion among TIL graduates (mentioned by 25%, vs. 6% in the total sample), whereas it ranks only seventh in the total sample.

Conclusions

This report has endeavored to present the maximum amount of data gleaned from the exit interviews in the minimum amount of space, and with minimal interpretation of the statistical results. Much, or little, can be made of the results presented herein depending on one's perspective and attitude toward student perceptions, self-reported survey data, and statistical analysis. None of the results presented should be considered definitive, for these and other methodological reasons. But these data can be considered suggestive of further lines of inquiry, or may serve to reinforce other lines of evidence regarding the strength and quality of graduate programs on the CSU, Chico campus. The usual caveats regarding sample size, sampling bias, student perceptions, and indirect measures of program quality all apply. Having said that, here are a few conclusions:

1. Overall, students seem to give graduate programs at CSU, Chico relatively high marks on most measures of quality. (See Tables 2-4.)

2. The exceptions to (1) are the following: variety of courses, availability of courses, advice on financial aid, and career counseling advice. (See Tables 2-4.) Here students see some inadequacies on the part of many or most programs and the university as a whole.

3. Individual programs can be sorted into several categories: those with largely negative measures of quality, those with largely positive measures of quality, those with mixed measures, and those that seemed about average on most measures. The problems of quality of most immediate concern are those focused on the overall quality of the program, the intellectual challenge of a program, and student perceptions of faculty quality. These are the primary indicators of quality over which faculty have the most control. Variety and availability of courses and other measures are also important indicators of program quality, but are largely beyond the control of faculty. Examining programs along these lines, we can create the following table:

Table 26. Programs sorted according to performance on primary measures of quality

Overall Quality
  Above Average: Education, NUTR
  Average: ANTH, BIOL, Business, CMST, CSCI, ENGL, GEOP, INDS, PHED, POLS, PSYC, SPPT
  Below Average: SWRK

Quality of Faculty
  Above Average: ENGL, NUTR, PSYC
  Average: ANTH, BIOL, Business, Education, GEOP, INDS, PHED, POLS, SWRK, SPPT
  Below Average: CMST, CSCI

Intellectual Challenge
  Above Average: Education, SPPT
  Average: ANTH, BIOL, CMST, CSCI, ENGL, GEOP, INDS, NUTR, POLS, PSYC, SWRK
  Below Average: Business, PHED

4. Another set of quality measures has to do with the resources actually available to students in terms of variety and availability of courses and the facilities and other resources that support graduate education. Looking at student perceptions of these dimensions, we generate Table 27:

Table 27. Programs sorted according to performance on other measures of quality

Variety of Courses
  Above Average: Education
  Average: ANTH, Business, CMST, CSCI, ENGL, GEOP, IDST, NUTR, PHED, PSYC, SWRK, SPPT, TIL
  Below Average: BIOL, POLS

Availability of Courses
  Above Average: PHED, SPPT
  Average: ANTH, Business, CSCI, Education, ENGL, GEOP, IDST, NUTR, PSYC, SWRK, TIL
  Below Average: BIOL, CMST, POLS

Quality of Resources
  Above Average: CSCI, Education, IDST
  Average: BIOL, Business, CMST, ENGL, GEOP, NUTR, PHED, POLS, PSYC, SWRK, TIL
  Below Average: ANTH, SPPT

5. Finally, there are the services that the programs and the university provide to students: advising, encouragement, career counseling, and other services that contribute to student morale. Exit interviews indicate that students feel the university and programs do a pretty poor job of providing financial aid information and assistance, across the board. But programs vary in terms of student perceptions of other services, as illustrated in Table 28.

6. Last, but not least, there is the time trend data. The major problem identified from the time trend analysis has already been pointed out: student perceptions of the Computer Science graduate program have definitely declined over time on all primary measures of quality, including overall quality, quality of faculty, and intellectual challenge, as well as variety and availability of courses (see Table 13). Aside from this striking finding, there is little in the time trend data of consequence for understanding student perceptions of other programs.

Table 28. Programs sorted according to performance on advising and other measures of student assistance (not financial aid)

Availability of Faculty for Advising
  Above Average: NUTR, PSYC
  Average: ANTH, BIOL, Business, CMST, Education, PHED, POLS, SWRK, SPPT, TIL
  Below Average: GEOP

Quality of Advice Received
  Above Average: Business, Education, NUTR
  Average: ANTH, BIOL, CMST, GEOP, PHED, POLS, PSYC, SWRK, SPPT, TIL
  Below Average: (none)

Career Counseling
  Above Average: NUTR
  Average: ANTH, BIOL, Business, Education, GEOP, PHED, POLS, PSYC, SWRK, SPPT, TIL
  Below Average: CMST

Encouragement Toward Goals
  Above Average: Education, ENGL, NUTR
  Average: ANTH, BIOL, CMST, GEOP, PHED, PSYC, SWRK, SPPT, TIL
  Below Average: Business, CSCI, POLS

Thesis Advice
  Above Average: Education, ENGL
  Average: ANTH, BIOL, Business, CMST, GEOP, NUTR, PHED, POLS, PSYC, SWRK, TIL
  Below Average: SPPT

Student Morale in Program
  Above Average: PHED
  Average: CMST, Education, ENGL, NUTR, POLS, PSYC, SWRK, SPPT, TIL
  Below Average: ANTH, BIOL, Business, GEOP

Services of Graduate School
  Above Average: Education
  Average: ANTH, Business, CMST, ENGL, GEOP, NUTR, POLS, SWRK, SPPT, TIL
  Below Average: BIOL, PSYC

These results should be shared with faculty in graduate programs across the campus in order to jump start a discussion of the rigor, strengths and weaknesses of graduate education at CSU, Chico. These data and their preliminary analysis should be viewed as suggestive, not definitive. They should not be used “against” departments or faculty. But they can be combined with other sources of evidence to identify problems and seek solutions. All of us involved with graduate education know that it is a struggle to provide a high quality experience for our graduate students given budgetary constraints and the primary mission of our university: excellence in undergraduate education. However, some programs seem to be doing a better job at delivering on quality than others. We can learn from each other and work together to enhance the quality of graduate education at CSU, Chico.

24 i The Student’s t statistic assumes that data being analyzed are interval level data, but can, with some justification, be extended to ordinal level data without doing undue violence to the underlying assumptions of the statistic (see Bernard 2002).
