Student Learning Outcomes Assessment Plan and Report

Ed.D. in Educational Leadership 2013 SLO Report

Spring 2013 and Fall 2013 Student Learning Outcomes Assessment Plan and Report (Document student learning outcomes assessment plans and assessment data for each undergraduate and graduate degree program and certificate program, stand-alone minor, and distance education program offered online only.)

College: College of Education Department: Educational Leadership Name of Degree or Certificate Program/Stand Alone Minor/Online Distance Education Program: Ed.D. in Educational Leadership

Reflection on the Continuous Improvement of Student Learning
1. List the changes and improvements your program planned to implement as a result of last year's student learning outcomes assessment data.
2. Were all of the changes implemented? If not, please explain.
3. What impact did the changes have on student learning?

From the 2012 SLO Report: As studied and discussed at the Doctoral Advisory Committee, there was general positive assessment of the new scoring rubrics. We believed we needed to change; we have changed; and now we need to assess whether the changes were helpful to our students. Early indications are positive.

Implemented Changes: The implementation and use of the new rubrics have proven to be a positive experience for students, as expectations are shared in advance of program milestones. In the 2013 edition of the Doctoral Program Handbook, we included rubrics for the Qualifying Examination, Dissertation Proposal Defense, Dissertation Defense, and the Internship (for Superintendent licensure-focused students). While the use of the rubrics has been generally positive among faculty, sharing them with students has also improved students' understanding of expectations.

From the 2012 SLO Report: There was considerable attention given to the logistics and mechanics of rubric usage in those cases wherein student remediation is necessary. It was decided that the rubrics would record student performance at the conclusion of the Qualifying Exam, Proposal Defense, and Dissertation Defense and that the overall pass/fail committee decision would occur after any needed remediation took place.

Implemented Changes: Basing the pass/fail decision on the work, including remediation, has proven to be an important philosophical and procedural clarification in our department. Thus, we use the rubrics to evaluate performance and obtain information on outcomes, yet we retain flexibility to assess revisions of work as deemed necessary by faculty committees. These discussions have also launched a Qualifying Examination procedural revision, which is set for implementation in January 2014 (and is thus not a part of this report). The revision is based on achieving relative fairness to students and faculty regarding consistency of exams while promoting flexibility of content to allow faculty to ask questions that cross multiple courses. In addition, the new Qualifying Examination procedure sets clearer guidelines for what constitutes a "pass" and steps for remediation and/or retakes. The changes have been shared with students in the program, and next year's SLO report will include data to demonstrate learning outcomes using the new procedure. It should be noted that the evaluation rubrics for the Qualifying Examination will be unchanged.

From the 2012 SLO Report: All of our students performed well in their intern placement settings. Change is on the near horizon, however, as the North Carolina Department of Public Instruction is adopting a new set of school executive expectations. These will inevitably result in both curricular and instructional changes, which have not yet been developed. Not all of our students will be affected by these state changes, only the superintendent aspirants. We have faculty members attending the state meetings as these expectations continue to evolve.

Implemented Changes: During the fall 2013 semester, many faculty in the Department of Educational Leadership engaged in a North Carolina Department of Public Instruction-mandated remodeling of the Ed.D. in Educational Leadership for those pursuing superintendent licensure. The primary changes in the remodeled program include topic-based "evidences" aligned with professional standards. The intent is to replace the qualifying examination for students pursuing superintendent licensure; however, these changes cannot occur until the Blueprint is revised and approved by the State Board of Education later in 2014 and the changes go through the faculty governance process. Thus, implementation will likely not occur until Fall 2015.

Student Learning Outcome 1 (knowledge, skill or ability to be assessed) SLO 1: Candidates for other professional school roles demonstrate an understanding of the professional and contextual knowledge expected in their fields and use data, current research, and technology to inform their practices.

Changes to the Student Learning Outcomes Assessment Plan: If any changes were made to the assessment plan (which includes the Student Learning Outcome, Effectiveness Measure, Methodology and Performance Outcome) for this student learning outcome since your last report was submitted, briefly summarize the changes made and the rationale for the changes.

New scoring rubrics were implemented during the previous reporting year; thus, no changes were made to the outcomes.

Effectiveness Measure: Identify the data collection instrument, e.g., exam, project, paper, etc., that will be used to gauge acquisition of this student learning outcome and explain how it assesses the desired knowledge, skill or ability. A copy of the data collection instrument and any scoring rubrics associated with this student learning outcome are to be submitted electronically to the designated folder on the designated shared drive.

The Department of Educational Leadership (EDLD) program uses the following measures to assess SLO 1.

1. Qualifying Examination. The Qualifying Exam may be taken after the candidate completes a minimum of 24 credit hours and before completing 36 hours. The exam has two parts: a written portion, completed over approximately 12 hours, followed by an oral defense of the written work. The six components of the rubric are applied to the candidate's combined performance on the written and oral exams. Specific areas assessed are:
a. Ability to recognize and articulate the problems at hand.
b. Expression of the problems' background; able to employ critical analysis and relevant literature.
c. Reasoning skills.
d. An understanding of and ability to apply appropriate research methods vis-à-vis problems posed during the exam.
e. Ability to apply critical reflection to knowledge gained from the academic program.
f. Ability to effectively respond to scholarly questions.

2. Proposal Defense. (The proposal is a draft of the first three chapters of the candidate's dissertation.) Results of the proposal defense are used to estimate the candidate's ability to design a research project that answers important questions in the candidate's content area. Specific areas assessed are:
a. A research problem which is clear, articulated and significant.
b. Research methods which provide detailed description of (if applicable): subjects, design/approach, methods/procedures and analyses.
c. Research methods and analyses that are appropriate to the research questions.
d. A relationship between the research problem and the student's role as an educational leader.
e. A preliminary literature review that describes prior conceptual and research investigations of the research problem.

3. Dissertation Defense. (Completion of the research; includes five chapters.) Results from the dissertation defense are used to determine the knowledge and skills of the candidate to conduct a research project. Specific areas assessed are:
a. Develops clear and appropriate research questions or hypotheses that guide the study.
b. Demonstrates how research questions or hypotheses have been examined in previous studies.
c. Analysis is comprehensive, complete, sophisticated, and convincing.
d. All pertinent results reported in a clear and concise manner. Tables/figures are labeled appropriately.
e. Draws clear conclusions based on collected data that answer the research questions or test the hypotheses.
f. Makes recommendations for further research that can build on this project.
g. Provides reflection on problems or errors in the study and discusses how they could be avoided in subsequent studies.

Methodology: Describe when, where and how the assessment of this student learning outcome will be administered and evaluated. Describe the process the department will use to collect, analyze and disseminate the assessment data to program faculty and to decide the changes/improvements to make on the basis of the assessment data.

Qualifying Examination:

Written component. Candidates should take the written comprehensive examination as soon as possible after completing 24 credit hours of foundations and research coursework and no later than enrollment in ADMN 8699 (Dissertation Proposal Seminar). The examination may occur at any time during the year and normally will include questions from six different doctoral courses, to be completed within twelve hours (six hours on each of two consecutive days). The questions will require candidates to connect basic concepts from completed coursework and to apply what they have learned to different situations and educational contexts. A committee consisting of the candidate's advisor and the faculty members who have instructed the candidate will prepare and evaluate the written examinations.

Steps in the written examination:
1) The candidate and advisor will determine a date for the examination, which will be at least 60 days from the day of the decision.
2) The advisor will notify committee members that the student will take the examination and will request any materials/information (if appropriate) to guide the candidate's preparation.
3) The candidate will take the examination in the department area on a department laptop computer (unless otherwise indicated by a faculty member, no materials or resources may be used during the examination).
4) The advisor will give the candidate's responses to the examination questions to the appropriate faculty members for evaluation.
5) If the candidate's performance on the written examination is unsatisfactory (Not Acceptable), in whole or in part, the candidate will be allowed to retake the failed portion(s) of the examination. A second failure will result in termination from the program.

The written examination is scored on the following scale across multiple dimensions: Expectations Not Met-0; Meeting Expectations-1; and Exceeding Expectations-2.

Oral component. The oral examination will normally occur within 30 days of successful completion of the written examination. During the oral examination, the candidate's advisor and committee will engage in dialogue with the candidate about the written examination. The discussion has two purposes. First, it provides an opportunity for the candidate to address in more detail or to clarify responses to questions on the written examination. Second, it allows the committee to engage the student in a discussion of issues not addressed in the written examination but which are pertinent to the content. If the candidate's performance on the oral examination is unsatisfactory, an additional oral examination may be scheduled and/or the candidate may be required to take additional coursework. Subsequent failure on the oral examination will result in termination from the program. Upon successful completion of the written and oral examinations, the student's advisor and committee must sign and submit the Qualifying Examination/Comprehensive Examination Report for Doctoral Candidates. The oral exam is scored on the following scale: Expectations Not Met-0; Meeting Expectations-1; and Exceeding Expectations-2, across multiple dimensions. A rubric is used to evaluate the combination of the written and oral portions of the qualifying exam. It contains the six assessment dimensions mentioned earlier.

Dissertation Proposal Defense

The development and defense of a dissertation proposal is an important aspect of dissertation research. The proposal is a draft of the first three chapters of one's dissertation. The proposal defense is scored on the following scale: Expectations Not Met-0; Meeting Expectations-1; and Exceeding Expectations-2, across multiple dimensions. After the student/candidate "meets" or "exceeds" all dimensions, they are allowed to begin their research. A rubric is used to evaluate each of the five domains in the proposal defense.

Dissertation Defense

When the candidate’s dissertation committee believes that the dissertation is in satisfactory form, a final defense is scheduled. The dissertation defense is scored on the following scale: Expectations Not Met-0; Meeting Expectations-1; and Exceeding Expectations-2, across multiple dimensions. Students/candidates who do not meet expectations are provided feedback and another defense is scheduled. A rubric is used to evaluate each of the seven domains in the dissertation defense.

Assessments are administered at identified points during the program. Work samples are scored using the designated method and scores are collected and analyzed at the program level. Simple descriptive statistics are used to report the scores. Findings are discussed at monthly Doctoral Advisory Committee meetings and during department faculty meetings. Recommendations for changes and improvements are examined and adopted as deemed appropriate. All data reports created by the College of Education are housed on a secure website which is accessible to all faculty members within the College of Education.

Performance Outcome: Identify the percentage of students assessed that should be able to demonstrate proficiency in this student learning outcome and the level of proficiency expected. Example: 80% of the students assessed will achieve a score of “acceptable” or higher on the Oral Presentation Scoring Rubric. (Note: a copy of the scoring rubric, complete with cell descriptors for each level of performance, is to be submitted electronically to the designated folder on the designated shared drive.)

The program expects at least 80% of the students to score “1” or “2” (meet or exceed expectations) on each of the elements of the Qualifying Exam, Proposal Defense, and Dissertation Defense. The results indicated that candidates’ performance exceeded the expectations.
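The 80% target above reduces to a simple proportion: the share of candidates scoring "1" or "2" on a given rubric element. A minimal sketch of that calculation (illustrative only; the function name and sample scores are hypothetical, not actual program records):

```python
# Illustrative sketch of the report's 80% performance target, using its
# scoring scale: 0 = Expectations Not Met, 1 = Meeting Expectations,
# 2 = Exceeding Expectations. Sample scores below are hypothetical.

def meets_or_exceeds_rate(scores):
    """Fraction of candidates scoring 1 or 2 on one rubric dimension."""
    if not scores:
        raise ValueError("no scores recorded for this dimension")
    return sum(1 for s in scores if s >= 1) / len(scores)

# Hypothetical cohort of 7 candidates on one dimension.
scores = [2, 1, 2, 2, 1, 2, 1]
rate = meets_or_exceeds_rate(scores)
print(f"{rate:.0%} met or exceeded expectations")  # prints "100% met or exceeded expectations"
print("Target met" if rate >= 0.80 else "Target not met")  # prints "Target met"
```

The same proportion is computed per dimension for the Qualifying Exam, Proposal Defense, and Dissertation Defense tables that follow.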

Fall 2012 and Spring 2013-Fall 2013 Assessment Data

Qualifying Examination: The percentages of candidates who "met" or "exceeded" expectations are reported below. One hundred percent of the candidates scored at the "met" or "exceeded" level for all dimensions in 2013.

% Meets or Exceeds, 2012 (n=13) / 2013 (n=7):
a. Ability to recognize and articulate the problems at hand: 85% / 100%
b. Expression of the problems' background; able to employ critical analysis and relevant literature: 77% / 100%
c. Reasoning skills: 77% / 100%
d. An understanding of and ability to apply appropriate research methods vis-à-vis problems posed during the exam: 75% / 100%
e. Ability to apply critical reflection to knowledge gained from the academic program: 77% / 100%
f. Ability to effectively respond to scholarly questions: 85% / 100%

Proposal Defense: The percentages of candidates who "met" or "exceeded" expectations are reported below. One hundred percent of the candidates scored at the "met" or "exceeded" level for all dimensions.

% Meets or Exceeds, 2012 (n=5) / 2013 (n=9):
a. A research problem which is clear, articulated and significant: 100% / 100%
b. Research methods which provide detailed description of (if applicable): subjects, design/approach, methods/procedures and analyses: 80% / 100%
c. Research methods and analyses that are appropriate to the research questions: 100% / 100%
d. A relationship between the research problem and the student's role as an educational leader: 100% / 100%
e. A preliminary literature review that describes prior conceptual and research investigations of the research problem: 100% / 100%

Dissertation Defense: The percentages of students who "met" or "exceeded" expectations are reported below. One hundred percent of the students scored at the "met" or "exceeded" level for all dimensions.

% Meets or Exceeds, 2012 (n=5) / 2013 (n=9):
a. Develops clear and appropriate research questions or hypotheses that guide the study: 100% / 100%
b. Demonstrates how research questions or hypotheses have been examined in previous studies: 80% / 100%
c. Analysis is comprehensive, complete, sophisticated, and convincing: 80% / 100%
d. All pertinent results reported in a clear and concise manner; tables/figures are labeled appropriately: 80% / 100%
e. Draws clear conclusions based on collected data that answer the research questions or test the hypotheses: 80% / 100%
f. Makes recommendations for further research that can build on this project: 100% / 100%
g. Provides reflection on problems or errors in the study and discusses how they could be avoided in subsequent studies: 100% / 100%

Plans for 2014: Based upon the 2013 assessment data included in this annual report, what changes/improvements will the program implement during the next year to improve performance on this student learning outcome?

Based on a combination of student outcomes, faculty advisory committee discussions, and data from a 2011 survey of Ed.D. graduates, the Department embarked on a revision of the Qualifying Examination process to ensure consistent experiences for both students and faculty. While the measurement of outcomes using the established rubric will not change in 2014, the new procedure will be implemented on January 1, 2014. Key changes to the process include:
a. A standardized committee structure (3 members rather than up to 6).
b. Establishment of a 3-question exam, rather than up to 6 questions; this revision will allow faculty to ask more comprehensive questions spanning multiple courses rather than being restricted to course-specific questions.
c. Standardized exam periods: written exams in February and July, with oral exams in March or August/September.
d. Clearer articulation of results and paths for retests and/or remediation; in the past this was done inconsistently and at the discretion of individual faculty members, whereas the revised process allows for a majority vote and clarity regarding what constitutes a pass or failure.

It is the Department's hope that this improved process will enhance clarity and improve learning outcomes for students.

In addition, the NCDPI-mandated revised "Blueprint" necessitated the identification of "evidences" for superintendent licensure. The plan is currently under review by NCDPI and will likely be sent back to UNC Charlotte for revision prior to consideration by the State Board of Education in Summer 2014. If approved, it is the intent of the faculty who assembled the Blueprint that the evidences will take the place of qualifying examinations for superintendent licensure-focused students. Once the Blueprint is approved, the plan must be discussed further and approved by the faculty of the entire Department later in 2014. However, any new processes likely will not be enacted until Fall 2015.

Student Learning Outcome 2 (knowledge, skill or ability to be assessed) SLO 2: Candidates for other school professions demonstrate professional behaviors consistent with fairness and the belief that all students can learn, including creating caring, supportive learning environments, encouraging student-directed learning, and making adjustments to their own professional dispositions when necessary.

Changes to the Student Learning Outcomes Assessment Plan: If any changes were made to the assessment plan (which includes the Student Learning Outcome, Effectiveness Measure, Methodology and Performance Outcome) for this student learning outcome since your last report was submitted, briefly summarize the changes made and the rationale for the changes.

New scoring rubrics were implemented during the previous reporting year; thus, no changes were made to the outcomes.

Effectiveness Measure: Identify the data collection instrument, e.g., exam, project, paper, etc. that will be used to gauge acquisition of this student learning outcome and explain how it assesses the desired knowledge, skill or ability. A copy of the data collection instrument and any scoring rubrics associated with this student learning outcome are to be submitted electronically to the designated folder on the designated shared drive.

The Intern Summary Evaluation instrument is used during the candidates' internship. Students/candidates are scored on seven domains. The professional behaviors are scored on the following scale: Expectations Not Demonstrated-0; Developing-1; and Proficient-2. Domains include Strategic Leadership, Instructional Leadership, Cultural Leadership, Human Resource Leadership, Managerial Leadership, External Development Leadership, and Micro-political Leadership.

Additionally, all students/candidates must take the Collaborative Institutional Training Initiative course in the protection of human research subjects and must participate in a tutorial which assesses their knowledge of the procedures for protecting human research subjects and conducting research in the social, educational, and behavioral sciences. All students/candidates must correctly answer at least 80% of the items on the tutorial assessment instrument, which is embedded within the tutorial itself, before they can conduct research at UNC Charlotte.

Methodology: Describe when, where and how the assessment of this student learning outcome will be administered and evaluated. Describe the process the department will use to collect, analyze and disseminate the assessment data to program faculty and to decide the changes/improvements to make on the basis of the assessment data.

During Internship: Students/Candidates are scored by two supervisors (i.e., the University Supervisor and Intern Site Mentor) on the Intern Summary Evaluation instrument during the internship experiences (Parts I and II). At the end of the internship, the two raters provide a summative evaluation based on the ratings.

Assessments (i.e., Intern Summary Evaluation and Collaborative Institutional Training Initiative exam) are administered at identified points during the program. The tutorial assessment is embedded in the training session and the user is not permitted to proceed until mastery is attained.

Work samples are scored using the designated method and scores are collected and analyzed at the program level. Simple descriptive statistics are used to report the scores. Findings are discussed at monthly Doctoral Advisory Committee meetings and during department faculty meetings. Recommendations for changes and improvements are examined and adopted as deemed appropriate. All data reports created by the College of Education are housed on a secure website which is accessible to all faculty members within the College of Education.

Performance Outcome: Identify the percentage of students assessed that should be able to demonstrate proficiency in this student learning outcome and the level of proficiency expected. Example: 80% of the students assessed will achieve a score of “acceptable” or higher on the Oral Presentation Scoring Rubric. (Note: a copy of the scoring rubric, complete with cell descriptors for each level of performance, is to be submitted electronically to the designated folder on the designated shared drive.)

At least 80% of students/candidates will score "2" (Proficient) across all domains on the Intern Summary Evaluation instrument, and 100% must pass (score 80% or higher) the Collaborative Institutional Training Initiative tutorial in order to conduct research at UNC Charlotte.

Fall 2012 and Spring 2013-Fall 2013 Assessment Data

Professional Domains (Internship): The percentages of students who were determined to be "Proficient" (a score of 2, the highest score) are reported below. Eighty-three percent of the students scored at the proficient level for all dimensions. These data are a combination of the final internship report from Spring 2013 and the interim report in Fall 2013, since students complete the internship over a two-semester period.

% Proficient, 2012 (n=6) / 2013 (n=7):
a. Strategic Leadership: 83% / 86%
b. Instructional Leadership: 100% / 100%
c. Cultural Leadership: 100% / 100%
d. Human Resources Leadership: 100% / 100%
e. Managerial Leadership: 100% / 100%
f. External Development Leadership: 100% / 100%
g. Micro-political Leadership: 100% / 83% (n=6 for this item; one Supt. not reported)

Collaborative Institutional Training Initiative exam—100% of all candidates successfully passed the exam in 2012 and 2013 (from the Office of Research Services and instructor records).

Plans for 2014: Based upon the 2013 assessment data included in this annual report, what changes/improvements will the program implement during the next year to improve performance on this student learning outcome?

The NCDPI-mandated Ed.D. Remodeling Blueprint will be the focus of attention in 2014. Once revised and approved by the State Board of Education, the Blueprint will guide program revisions around the Qualifying Examination for those seeking superintendent licensure. While these revisions are mostly focused on the coursework linked with standards (vision, staffing, resources, instruction/learning, and governance), materials from the internship will also play a role in assembling evidences. Thus, we should reach greater synergy between the Qualifying Examination and the Internship. If approved, these changes will be enacted by Fall 2015.

In addition, the Department is phasing in the tracking of dispositions for all candidates. Future SLO reports will include instructor-assessed dispositions for the midpoint and/or final evaluations as data are collected.

Student Learning Outcome 3 (knowledge, skill or ability to be assessed) SLO 3: Candidates for other school professions establish positive educational environments that support and build upon the developmental levels of students, the diversity of students, families, and communities; and the policy contexts within which they work.

Changes to the Student Learning Outcomes Assessment Plan: If any changes were made to the assessment plan (which includes the Student Learning Outcome, Effectiveness Measure, Methodology and Performance Outcome) for this student learning outcome since your last report was submitted, briefly summarize the changes made and the rationale for the changes.

New scoring rubrics were implemented during the previous reporting year; thus, no changes were made to the outcomes.

Effectiveness Measure: Identify the data collection instrument, e.g., exam, project, paper, etc. that will be used to gauge acquisition of this student learning outcome and explain how it assesses the desired knowledge, skill or ability. A copy of the data collection instrument and any scoring rubrics associated with this student learning outcome are to be submitted electronically to the designated folder on the designated shared drive.

The Intern Summary Evaluation instrument is used during the candidates' internship. Students/candidates are scored on three domains. The professional behaviors are scored on the following scale: Expectations Not Demonstrated-0; Developing-1; and Proficient-2. Domains include Managerial Leadership, External Development Leadership, and Micro-political Leadership.

Methodology: Describe when, where and how the assessment of this student learning outcome will be administered and evaluated. Describe the process the department will use to collect, analyze and disseminate the assessment data to program faculty and to decide the changes/improvements to make on the basis of the assessment data.

During Internship: Students/Candidates are scored by two supervisors (i.e., the University Supervisor and Intern Site Mentor) on the Intern Summary Evaluation instrument during the internship experiences (Parts I and II). At the end of the internship, the two raters provide a summative evaluation based on the ratings.

Assessments (i.e., Intern Summary Evaluation and Collaborative Institutional Training Initiative exam) are administered at identified points during the program. The tutorial assessment is embedded in the training session and the user is not permitted to proceed until mastery is attained.

Work samples are scored using the designated method and scores are collected and analyzed at the program level. Simple descriptive statistics are used to report the scores. Findings are discussed at monthly Doctoral Advisory Committee meetings and during department faculty meetings. Recommendations for changes and improvements are examined and adopted as deemed appropriate. All data reports created by the College of Education are housed on a secure website which is accessible to all faculty members within the College of Education.

Performance Outcome: Identify the percentage of students assessed that should be able to demonstrate proficiency in this student learning outcome and the level of proficiency expected. Example: 80% of the students assessed will achieve a score of "acceptable" or higher on the Oral Presentation Scoring Rubric. (Note: a copy of the scoring rubric, complete with cell descriptors for each level of performance, is to be submitted electronically to the designated folder on the designated shared drive.)

At least 80% of students/candidates will score "2" (Proficient) across all domains on the Intern Summary Evaluation instrument.

Fall 2012 and Spring 2013-Fall 2013 Assessment Data

Professional Domains (Internship): The percentages of students who were determined to be "Proficient" (a score of 2, the highest score) are reported below. Eighty-three percent of the students scored at the proficient level for all dimensions; this figure reflects one student demonstrating development toward proficiency on one dimension in 2013, and one dimension was not reported. These data are a combination of the final internship report from Spring 2013 and the interim report in Fall 2013, since students complete the internship over a two-semester period.

% Proficient, 2012 (n=6) / 2013 (n=7):
a. Managerial Leadership: 100% / 100%
b. External Development Leadership: 100% / 100%
c. Micro-political Leadership: 100% / 83% (n=6 for this item; one Supt. not reported)

Plans for 2014: Based upon the 2013 assessment data included in this annual report, what changes/improvements will the program implement during the next year to improve performance on this student learning outcome?

The NCDPI-mandated Ed.D. Remodeling Blueprint will be the focus of attention in 2014. Once revised and approved by the State Board of Education, the Blueprint will guide program revisions around the Qualifying Examination for those seeking superintendent licensure. While these revisions are mostly focused on the coursework linked with standards (vision, staffing, resources, instruction/learning, and governance), materials from the internship will also play a role in assembling evidences. Thus, we should reach greater synergy between the Qualifying Examination and the Internship. If approved, these changes will be enacted by Fall 2015.
