Annual Department Assessment Plan Summary


Annual Assessment System Summary

Department: Newman Division of Nursing
Date: March 5, 2008

Part 1: Outcomes

1. Synthesize empirical and theoretical knowledge from nursing and the arts, sciences, and humanities to practice professional nursing.
   A. Develop and use higher-order problem-solving and critical thinking skills for developing, implementing, and evaluating nursing interventions.
   B. Synthesize evidence-based, theoretical, and empirical findings from nursing, the arts, sciences, and humanities as appropriate in identifying, framing, and making nursing-related decisions across a variety of health care situations.

2. Demonstrate values central to professional nursing practice within the frameworks of legal, ethical, and professional standards.
   A. Understand and practice the concept of caring in the practice of nursing across the health care continuum and in all health care settings.
   B. Demonstrate knowledge of legal, ethical, and professional standards when identifying, framing, and making nursing-related decisions.
   C. Demonstrate abilities to collaborate with Person and with other health care professionals in meeting diverse health care needs of Person.
   D. Advocate for Person within the health care delivery system.
   E. Assume responsibility and accountability for own actions in the practice of nursing.
   F. Assume responsibility for life-long learning and plan for professional career development.

3. Use leadership skills and knowledge to develop in the role of a professional nurse.
   A. Demonstrate knowledge of health care systems and the factors that shape organizations and environments in which nursing and health care are delivered when identifying, framing, and making nursing-related decisions with Persons.
   B. Understand the profession of nursing and participate in the political and regulatory processes that shape the health care delivery system.
   C. Demonstrate knowledge of the professional nurse's role to design, coordinate, and manage nursing care using the application of outcome-based practice and the skills of communication, collaboration, negotiation, delegation, coordination, and evaluation of interdisciplinary work.
   D. Demonstrate the knowledge and skills needed to be a member of interdisciplinary health care teams and support agendas that enhance high-quality, cost-effective health care.

4. Provide professional nursing care to promote health, reduce risk, prevent disease, and manage illness and disease.
   A. Demonstrate knowledge of factors that promote, protect, and predict the health of Persons when delivering and evaluating nursing care across the lifespan and health care continuum.
   B. Synthesize knowledge of pharmacology, pathophysiology, health assessment, and nursing interventions in the identification, management, and evaluation of signs and symptoms of Person when delivering nursing care across the lifespan and health care continuum and in a variety of health care settings.
   C. Synthesize knowledge of the biopsychosocial, cognitive, and spiritual aspects of Person when delivering nursing care across the lifespan and health care continuum.

5. Demonstrate technical skills and communication methods necessary to deliver professional nursing care.
   A. Demonstrate appropriate, effective use of communication skills (i.e., nonverbal, listening, oral, written, electronic) when delivering nursing care to Person across the lifespan and health care continuum and in a variety of health care environments.
   B. Demonstrate the ability to modify communication methods in response to cultural or special needs of Person when delivering nursing care.
   C. Demonstrate the ability to access and document health-related information in a timely manner.
   D. Perform a wholistic assessment of Person across the lifespan and health care continuum and in a variety of health care environments, and use assessment findings to diagnose, plan, deliver, and evaluate quality care.
   E. Demonstrate appropriate, safe, and efficient technical skills and use of technology when delivering nursing care to patients across the lifespan and health care continuum and in a variety of health care environments.

6. Demonstrate knowledge of human diversity in a global community while in the role of a professional nurse.
   A. Synthesize knowledge of human diversity, cultural values, and global health practices when providing culturally sensitive care to Person.
   B. Demonstrate knowledge of the impact of globalization on health care systems, policy, modalities, and practices when in the role of a professional nurse.

Part 2: Assessment Planning Charts

A. Direct Measures - Evidence, based on student performance, which demonstrates actual learning (as opposed to surveys of "perceived" learning or program effectiveness). See the "Assessment type" chart at the end of this document for a list of potential assessment types and their definitions. Note how it is possible to have an objective covered by more than one assessment, or one assessment to cover more than one objective.

The following are examples of the assessment data collected in the NDN. Complete data are on file in the NDN and can be provided as needed.

Each chart entry below lists the Objective(s), the Assessment(s), the assessment Type # (see chart), the Data/Results, and the Action Taken/Recommendations (if necessary).

Objective(s): 1
Assessment: ATI Comprehensive Exam (total scale scores, individual level data)
Type: 3
Data/Results: Percent of students meeting the NDN benchmark (.99 predictability of passing NCLEX-RN): 2006 (N = 32): 48%; 2007 (N = 32): 55%. The number of students achieving the benchmark improved from 2006 to 2007.
Action Taken/Recommendations: Recommend continuing the ATI Comprehensive testing and trending the data over the next 2 years, with a goal of 75% of students meeting the benchmark by 2009. Also recommend ongoing assessment and review by course faculty and curriculum to identify additional strategies to strengthen student learning outcomes. Strengthen content areas and content delivery methods based on the ATI Comprehensive outcomes.

Objective(s): 1
Assessment: ATI Comprehensive Exam (Critical thinking subscale score, individual level data)
Type: 3
Data/Results: 2006 and 2007 results: Interpretation = 40% to 42%; Analysis = 56% to 74%; Evaluation = 56% to 81%; Inference = 78% to 71%; Explanation = 28% to 19%. Improvement was seen in the categories of analysis and evaluation; however, student outcome performance in interpretation and explanation remains below the benchmark (.99 probability). Inference remained above the benchmark for both testing periods.
Action Taken/Recommendations: Explore use of the ATI Critical Thinking Assessment. Also recommend ongoing assessment and review by course faculty and curriculum to identify additional strategies to strengthen student learning outcomes. Note: The ATI Critical Thinking Assessment will begin with the August 2008 admission class. This will provide more critical thinking data early in a student's progression through the curriculum. It also will allow more comparison data between 1st-semester sophomore and 2nd-semester senior students.

Objective(s): 1
Assessment: NU 490 Leadership Paper (NU 490 Writing Rubric)
Type: 1 & 5
Data/Results: Benchmark: 100% at 70% or above. 2006: 32/32 students; 2007: 31/32 students.
Action Taken/Recommendations: Recommend the development/implementation of an NDN-wide rubric for written papers.

Objective(s): 1, 2, 3, 4, 5 & 6
Assessment: NU 491 Clinical Practicum (NU 491 Clinical Evaluation Tool)
Type: 1 & 5
Data/Results: Benchmark: 100% of students successfully complete the NU 491 practicum and meet all course objectives. 2006: 100%; 2007: 97%.
Action Taken/Recommendations: To facilitate measurement of outcome data, recommend exploring the development of a clinical evaluation tool across the curriculum that utilizes a multidimensional scale.

Objective(s): 2 & 3
Assessment: ATI Comprehensive Exam (Leadership subscale scores, individual level data)
Type: 3

Objective(s): 2
Assessment: ATI Leadership Content Mastery (Quality and Legal, Ethical Issues subscale scores, individual level data)
Type: 3

Objective(s): 3
Assessment: ATI Leadership Content Mastery (ATI total scale scores, individual level data)
Type: 3

Objective(s): 4
Assessment: ATI Comprehensive Exam (Nursing process subscale score, individual level data)
Type: 3

Objective(s): 4
Assessment: ATI Comprehensive Exam (Health Promotion subscale score, individual level data)
Type: 3

Objective(s): 4
Assessment: ATI Comprehensive Exam (Management of Care subscale score, individual level data)
Type: 3

Objective(s): 4
Assessment: ATI Comprehensive Exam (Pharmacology subscale score, individual level data)
Type: 3

Objective(s): 4
Assessment: ATI Comprehensive Exam (Physiological Adaptation subscale score, individual level data)
Type: 3

Objective(s): 4
Assessment: ATI Comprehensive Exam (Reduction of Risk subscale score, individual level data)
Type: 3

Objective(s): 4
Assessment: ATI Comprehensive Exam (Basic Care and Comfort subscale score, individual level data)
Type: 3

Objective(s): 4
Assessment: ATI Comprehensive Exam (Psychosocial Integrity subscale scores, individual level data)
Type: 3

Objective(s): 4
Assessment: ATI Comprehensive Exam (Safety and Infection Control subscale scores, individual level data)
Type: 3

Objective(s): 5
Assessment: ATI Comprehensive Exam (Therapeutic Nursing Interventions subscale scores, individual level data)
Type: 3
Data/Results: Benchmark: .99 predictability of passing NCLEX-RN. 2006: 53% of senior students; 2007: 52% of senior students.
Action Taken/Recommendations: Recommend evaluation by the curriculum committee of current TNIs and explore competency evaluation in key courses as determined through curriculum review. In 2007, adult and pediatric simulators were purchased. Faculty were educated on the use of simulation. Use of simulation should increase student learning in TNIs and critical thinking.

Objective(s): 5
Assessment: ATI Comprehensive Exam (Communication subscale scores, individual level data)
Type: 3

Objective(s): 1, 4, 5 & 6
Assessment: NCLEX-RN pass rate
Type: 3
Data/Results: Benchmark: at least 80% or the state/national average. 2001: 81%; 2002: 83.90%; 2003: 84%; 2004: 65.52%; 2005: 82.14%; 2006: 76.67%; 2007: 76.67%. Some improvement since the drop in 2004; however, still not at benchmark.
Action Taken/Recommendations: See the explanation given in another section of this report: Part 4: Summary, Level B, Program Improvement.

Objective(s): 1, 2 & 4
Assessment: ATI Content Mastery Exams: Fundamentals of Nursing; Medical-Surgical Nursing Care; Pharmacology in Nursing Practice; Maternal/Newborn Nursing Care; Nursing Care of Children; Mental Health Nursing Care; Community Health Nursing Care; Nursing Leadership
Type: 3
Data/Results: See additional information included in other sections of Part 4: Summary regarding the use of formative measures and the development of "thumb prints."

B. Indirect Measures - Reflection about the learning gained, or secondary evidence of its existence. Please refer to the "assessment type" chart at the end of this document.

Each chart entry below lists the Objective(s), the Assessment(s), the assessment Type # (see chart), the Data/Results, and the Action Taken/Recommendations (if necessary).

Objective(s): 1, 2, 3 & 5
Assessment: ESU/NDN Employer Satisfaction Survey
Type: 11
Data/Results: Overall, the data indicate employer satisfaction with graduates.
Action Taken/Recommendations: Even though the data indicate satisfaction, the return rate is so low that there are questions about the validity of the data. Steps have been taken to increase the return rate; however, it is time intensive for a relatively small increase. Once the curriculum review is completed, the employer and graduate satisfaction surveys will be revised to reflect changes in the curriculum and the educational outcomes.

Objective(s): 1, 2, 3 & 5
Assessment: ESU/NDN Graduate Satisfaction Survey
Type: 11
Data/Results: Overall, the data indicate graduate satisfaction with the program.
Action Taken/Recommendations: See the note regarding the employer satisfaction survey.

Part 3: Evaluation Rubric for Assessment System

Rating scale: 1 = Beginning; 2 = Developing; 3 = At Standard; 4 = Above Standard

Level A: Beginning Implementation

Professional standards and student learning outcomes
1 (Beginning): Development of the assessment system does not reflect professional standards/outcomes, nor are the standards established by faculty and/or outside consultants.
2 (Developing): Development of the assessment system is based on professional standards/outcomes, but the faculty and the professional community were not involved.
3 (At Standard): Development of the assessment system is based on professional standards/outcomes, and the faculty AND the professional community were involved.
4 (Above Standard): Development of the assessment system is based on professional standards/outcomes, and the faculty AND professional community are engaged in continuous improvement through systematic (e.g., yearly) activities.

Faculty involvement
1 (Beginning): No faculty involvement is evidenced in department assessment activities.
2 (Developing): Faculty involvement consists of one or two individuals who work on program assessment needs and activities. Little or no communication is established with other faculty or professionals.
3 (At Standard): Faculty involvement consists of a small core within the department, but input from other faculty and professionals about assessment issues is evidenced.
4 (Above Standard): Faculty involvement is widespread throughout the program or department. All faculty within the department have contributed (and continue to contribute) to the use and maintenance of an assessment plan.

Assessment alignment
1 (Beginning): No alignment between faculty-identified learning outcomes and assessments is evidenced.
2 (Developing): Alignment exists with some outcomes and assessments, but not others, OR the alignment is weak/unclear.
3 (At Standard): Alignment between outcomes and assessments is complete and clear.
4 (Above Standard): Alignment between outcomes and assessments is complete. Courses are identified that address each outcome.

Level B: Making Progress in Implementation

Assessment structure
1 (Beginning): The assessment plan has only one of the following attributes: 1) multiple direct and indirect assessments are used; 2) assessments are used on a regular basis (i.e., not just given once to get initial data); 3) assessments provide comprehensive information on student performance at each stage of their program.
2 (Developing): The assessment plan has only two of the following attributes: multiple, regular and comprehensive, at each stage.
3 (At Standard): The assessment plan has all of the following attributes: multiple, regular and comprehensive, at each stage.
4 (Above Standard): The assessment plan has all necessary attributes, and they are embedded in the program (versus "added-on").

Data management
1 (Beginning): No data management system exists.
2 (Developing): A data management system is in place to collect and store data, but it does not have the capacity to store and analyze data from all students over time.
3 (At Standard): A data management system is in place that can store and process most student performance data over time.
4 (Above Standard): A data management system is in place that can store and process all student performance data over time. Data are regularly collected and stored for all students and analyzed and reported in user-friendly formats.

Data collection points
1 (Beginning): Data are not collected across multiple points and do not predict student success.
2 (Developing): Data are collected at multiple points, but there is no rationale regarding their relationship to student success.
3 (At Standard): Data are systematically collected at multiple points, and there is strong rationale (e.g., research, best practice) regarding their relationship to student success.
4 (Above Standard): Data are systematically collected at multiple points and provide a strong relationship between assessments and student success.

Data collection sources
1 (Beginning): Data collected from applicants, students, and faculty, but not graduates or other professionals.
2 (Developing): Data collected from applicants, students, faculty, and graduates, but not other professionals.
3 (At Standard): Data collected from applicants, students, recent graduates, faculty, and other professionals.
4 (Above Standard): Data collected from multiple information sources on/from applicants, students, recent graduates, faculty, and other professionals.

Program improvement
1 (Beginning): Data are only generated for external accountability reports (e.g., accreditation), are not used for program improvement, and are available only to administrators.
2 (Developing): Some generated data are based on internal standards and used for program improvement, but are available only to administrators "as needed."
3 (At Standard): An ongoing, systematic, objectives-based process is in place for reporting and using data to make decisions and improve programs within the department.
4 (Above Standard): An ongoing, systematic, objectives-based process is in place for reporting and using data to make decisions and improve programs both within the department and university-wide.

Level C: Maturing Stages of Implementation

Comprehensive and integrated measures
1 (Beginning): The assessment system consists of measures that are neither comprehensive nor integrated.
2 (Developing): The assessment system includes multiple measures, but they are not integrated or they lack scoring/cut-off criteria.
3 (At Standard): The assessment system includes comprehensive and integrated measures with scoring/cut-off criteria.
4 (Above Standard): The assessment system includes comprehensive and integrated measures with scoring/cut-off criteria that are examined for validity and utility, resulting in program modifications as necessary.

Monitoring student progress, & managing & improving operations & programs
1 (Beginning): Measures are used to monitor student progress, but are not used to manage and improve operations and programs.
2 (Developing): Measures are used to monitor student progress and manage operations and programs, but are not used for improvement.
3 (At Standard): Measures are used to monitor student progress and manage and improve operations and programs.
4 (Above Standard): Measures are used to monitor student progress and manage and improve operations and programs. Changes based on data are evident.

Assessment data usage by faculty
1 (Beginning): Assessment data are not shared with faculty.
2 (Developing): Assessment data are shared with faculty, but with no guidance for reflection and improvement.
3 (At Standard): Assessment data are shared with faculty with guidance for reflection and improvement.
4 (Above Standard): Assessment data are shared with faculty with guidance for reflection and improvement. Remediation opportunities are made available.

Assessment data shared with students
1 (Beginning): Assessment data are not shared with students.
2 (Developing): Assessment data are shared with students, but with no guidance for reflection and improvement.
3 (At Standard): Assessment data are shared with students with guidance for reflection and improvement.
4 (Above Standard): Assessment data are shared with students with guidance for reflection and improvement. Remediation opportunities are made available.

Fairness, accuracy, and consistency of assessments
1 (Beginning): No steps have been taken to establish fairness, accuracy, and consistency of assessments.
2 (Developing): Assessments have "face validity" regarding fairness, accuracy, and consistency.
3 (At Standard): Preliminary steps have been taken to establish fairness, accuracy, and consistency of assessments.
4 (Above Standard): Assessments have been established as fair, accurate, and consistent through data analysis.

Part 4: Summary

For each factor below, the rubric score (1 2 3 4) and the evidence/rationale are given.

Level A

Factor: Professional standards and student learning outcomes
Rubric score: 1 2 3 4
Evidence/Rationale: The educational outcomes and sub-outcomes for the Newman Division of Nursing (NDN) were developed by the nursing faculty following a review of professional standards and educational/program requirements from professional organizations, including the American Association of Colleges of Nursing (AACN), the National League for Nursing Accrediting Commission (NLNAC), and the Kansas State Board of Nursing (KSBN). In addition, the detailed test plan for the national licensure exam (NCLEX-RN) developed by the National Council of State Boards of Nursing was reviewed. A Web-based search for educational outcomes from other baccalaureate programs (in Kansas and in other states) also provided more information that was considered when the NDN outcomes were developed. The outcomes were finalized in August 2007.

According to requirements from NLNAC and KSBN, the NDN has an established Systematic Evaluation Plan (SEP) that provides for the systematic evaluation of the entire program. As part of the SEP, the NDN engaged in assessment activities at least annually. Some data originally collected as part of the SEP can now also be used as data for the NDN’s recently developed assessment plan for student learning, for example, pass-rate on the NCLEX-RN and results from the graduate and employer surveys. In addition, the NDN had already implemented the administration of required achievement tests that were linked to specific courses in the nursing curriculum. Data pertaining to the results of the achievement tests can also be used with the assessment plan for student learning.

The NDN is required to report certain information to KSBN and NLNAC at least annually. Some of the reported information pertains to aspects of the NDN's assessment plan for student learning, for example, the pass rate on the NCLEX-RN. The NDN also is evaluated based on its performance in reference to the established educational outcomes as a part of accreditation (NLNAC) and approval (KSBN) self-study reports and site visits.

Factor: Faculty involvement
Rubric score: 1 2 3 4
Evidence/Rationale: For the past 2 years, all members of the faculty have been involved in activities that resulted in the development of formal, measurable educational outcomes and sub-outcomes that are now being used to assess student learning. At the beginning of this effort, the faculty members, overall, had concerns about the time-intensive process of developing the educational outcomes and about assessment in general. For example, there was uncertainty about the meaning of the various terms or concepts associated with assessment; the differences between program outcomes and student learning outcomes; and measurement techniques for concepts such as globalization. The faculty members also questioned the fact that the NDN had already established educational outcomes for our graduates when the NDN was first built; however, we also acknowledged that the NDN had not been effective in measuring the outcomes, collecting data, and actually using the data to assess student learning. The NDN had been collecting data; yet, it is now understood that much of the data pertained to program outcomes and not actually student learning as students progressed through our program. It is now understood that even though the NDN had been collecting data, even some of which were specific to student learning, there had not been a clear understanding as to why the data were being collected or how the data could or should be used.

To develop the educational outcomes and sub-outcomes, the faculty members used an iterative process that included the NDN's Mission and Philosophy and all of the standards and information addressed in the previous section of this report. Once the faculty members identified the educational outcomes and sub-outcomes, these were used to guide an in-depth curriculum review that had been scheduled according to the NDN's systematic evaluation plan.

Use of the iterative process is evident in our work with the curriculum review and our use of data from the educational outcomes. We also use the iterative process in making certain that the concepts in the educational outcomes are clearly defined, operationalized and consistent with our Mission and Philosophy.

As part of the curriculum review, data previously collected (e.g., data on achievement tests or content mastery tests, data from first-time pass rates on NCLEX-RN, clinical evaluation tool results) were presented. It is important to note that even though the data had been collected before the adoption of the educational outcomes and sub-outcomes, the data were important indicators for our new educational outcomes. We realized that we had been collecting data all along; however, we had not been collecting the data with educational outcomes in mind. We had not realized the extent and richness of what the data could provide to us. The data have been used to guide curriculum review discussions and decisions.

Factor: Assessment alignment
Rubric score: 1 2 3 4 (This is likely somewhere between a 3 and 4.)
Evidence/Rationale: Direct summative measures for all educational outcomes have been identified. Indirect summative measures have been identified for outcomes #1, #2, #3, and #5. A draft of direct formative measures is now being reviewed by all NDN faculty members. Formal adoption of the formative measures will occur before the end of the Spring 2008 semester. It is important to note that data from the proposed direct formative measures are currently being collected, for example, data from the ATI Content Mastery exams and data from course-specific clinical evaluation tools. The data have been essential to the curriculum review in identifying necessary content and placement of nursing content in specific courses. The data already have contributed to our discussions regarding the development of "thumb prints," or indicators of minimal performance expectations at certain points in the curriculum as students progress through the NDN.

In the curriculum review, the faculty members identified/discussed the placement of content needed to meet the educational outcomes and sub-outcomes. We also discussed what content should be added, emphasized, de-emphasized or eliminated. These decisions were partially guided by the educational outcomes and sub-outcomes and the data that had been collected. As part of the curriculum review, we will be reviewing the course objectives for each of the nursing courses and comparing the objectives to what is needed for the educational outcomes and sub-outcomes.

Level B

Factor: Assessment structure
Rubric score: 1 2 3 4
Evidence/Rationale: The assessment plan has the following attributes:
1) Multiple direct and indirect assessments are used (see Part 2 of this report).
2) Assessments are used on a regular basis (i.e., not just given once to get initial data). Assessments are completed at various points in the nursing program. Some of the assessments are associated with specific courses (e.g., pediatrics or OB); at a specific point in the program (e.g., ATI Comprehensive exam); with each clinical course (e.g., a student's clinical evaluation tool); at graduation (e.g., NCLEX-RN); and at 6 months and 5 years following graduation (e.g., graduate and employer survey).
3) Assessments provide comprehensive information on student performance at each stage of their program. Faculty members have questioned the definition of "at each stage of their program." We have agreed that we obtain assessment information about students at certain points in the program, although this may not be defined as a "stage." For example, when students complete the pediatrics or OB course, students are assessed by the ATI Content Mastery exams (direct, formative measures) as to their knowledge about pediatrics and OB. For each student, the assessment provides information about the level of the student's knowledge, and the assessment results are incorporated into the course grade for each student. Collectively, the data provide information about the overall performance of the entire group and may reflect how the course content is delivered.

As identified in another section of this report, the nursing faculty members are in the process of developing “thumb prints” or indicators of minimal performance expectations at certain points in the curriculum as students progress through the NDN. These “thumb prints” are believed to be more likely equated with “stages” of the program.

Factor: Data management
Rubric score: 1 2 3 4
Evidence/Rationale: In September 2007, an ad hoc committee consisting of 6 faculty members was formed. The committee was given the following charges:
1. Devise a plan for obtaining assessment data relative to the educational outcomes. The ad hoc committee will start with the data associated with the summative measures that were identified in the last curriculum review meeting. As the committee and the faculty feel comfortable using assessment data from the summative measures, the data may be extended to the identified formative measures. The time frames for the data will need to be as far back as is possible, and yet be within a time frame that the data can be interpreted and that we can identify measures the NDN took in response to the data.
2. Enter the data into a statistical package and run appropriate statistical tests.
3. Work closely with the Curriculum Affairs Committee, faculty, and the Division Chair in analyzing the results; identifying possible causes of the results; and identifying possible responses the NDN has taken or could take in light of the findings regarding the academic learning of our students.
4. As appropriate, trend the data. This will be particularly important as we review results, participate in the curriculum review, and prepare for our accreditation visits and assessment reports. A sketch of this kind of trending appears after this list.
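To make charges 2 and 4 concrete, the following is a minimal sketch of how one summative measure could be trended against its benchmark, using the first-time NCLEX-RN pass rates reported in Part 2 of this report. The report does not name the statistical package the ad hoc committee uses, so plain Python stands in here purely for illustration; this is not the committee's actual tooling.

    # Illustrative only: trend a summative measure against its benchmark.
    # Values are the first-time NCLEX-RN pass rates reported in Part 2;
    # 80% is the NDN benchmark (at least 80% or the state/national average).

    BENCHMARK = 80.0  # percent

    pass_rates = {
        2001: 81.00, 2002: 83.90, 2003: 84.00, 2004: 65.52,
        2005: 82.14, 2006: 76.67, 2007: 76.67,
    }

    previous = None
    for year in sorted(pass_rates):
        rate = pass_rates[year]
        # Year-over-year change, once a prior year exists.
        delta = "" if previous is None else f" ({rate - previous:+.2f} vs. prior year)"
        status = "meets benchmark" if rate >= BENCHMARK else "below benchmark"
        print(f"{year}: {rate:.2f}%, {status}{delta}")
        previous = rate

A full statistical package would add significance testing on top of this; the sketch only flags benchmark attainment and year-over-year change, which is the trending described in charge 4.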

To date, the ad hoc committee has met numerous times, has compiled assessment reports, and has presented the assessment findings during curriculum review meetings. The reported findings have been critical to the decisions made during the curriculum review.

Data files are kept on the SHARE directory, a directory that is available to all NDN faculty members and staff. The data files are in a read-only format. Data are entered by members of the ad hoc assessment committee. The raw data are password protected. Data are gathered from various sources, for example, the KSBN Web site for the latest NCLEX-RN pass rates; the ATI Web site for results on content mastery achievement tests and the comprehensive exam; and reports generated by the NDN’s Student Affairs Committee or the Curriculum Affairs Committee that reflect data gathered from the Senior Exit Interview or the graduate and employer surveys. Hard copies of the reports generated by the ad hoc assessment committee are kept in the NDN faculty workroom.

After receiving and reviewing the reports from the ad hoc assessment committee, the faculty members realized that we need to guard against collecting so much data that it becomes almost impossible to digest and interpret the findings. The NDN is now in the process of reviewing the measurements and data and determining what measurements and data provide the most information specific to predicting student success and academic learning.

Factor: Data collection points
Rubric score: 1 2 3 4
Evidence/Rationale: Data are collected at multiple points and provide a strong relationship between assessments and student success. The ATI content mastery exams and the ATI comprehensive exam are standardized tests that have been shown to be psychometrically sound in reflecting student knowledge and predicting student success on the NCLEX-RN. Results on specific ATI content mastery exams are included as part of each student's course grade. The NDN has academic progression policies that are clearly stated in the NDN's Student Handbook. In order to progress in the program, students must earn at least a C in all nursing theory and laboratory courses and must earn a "pass" in all clinical courses. Any student earning a "fail" in a clinical course is withdrawn from the program with no option to re-apply.

Direct summative measurements for educational outcomes are administered as follows:
Last semester of nursing program: Comprehensive ATI data (total scale scores and specific sub-scale scores, group and individual level); ATI Leadership Content Mastery; NU 490 Leadership paper, based on the NU 490 writing rubric; NU 491 Clinical Practicum Evaluation Tool.
Following graduation: NCLEX-RN pass rate.

Indirect summative measurements for educational outcomes are administered as follows:
Following graduation (6 months and 5 years): ESU/NDN Employer Survey; ESU/NDN Graduate Survey.

Direct formative measurements for educational outcomes are administered (or in a few instances are proposed to be administered) in every semester of the nursing program curriculum. These measurements include ATI critical thinking tests, ATI content mastery exams, clinical evaluation tools, completion of a formal research critique, formal written papers in identified courses (following the NDN rubric for written papers), formal presentations in identified courses (following the NDN rubric for oral presentations), and a cultural competency paper from an identified course.

Factor: Data collection sources
Rubric score: 1 2 3 4
Evidence/Rationale: Assessment data pertaining to student learning are collected from multiple sources. These sources include students (e.g., ATI content mastery and comprehensive exams), recent graduates (e.g., NCLEX-RN, ESU/NDN Graduate Survey), and faculty (e.g., completion of a student's clinical evaluation tool, evaluation of a student's written paper or oral presentation). Other professionals (i.e., registered nurses working in the role of preceptor with our senior student nurses) are consulted regarding the performance of students in the clinical area. This information is considered in determining a course grade for the student. However, at this time, we have not identified this evaluation information as part of the NDN's formal assessment of academic learning plan. Likewise, we administer a pre-admission assessment test, the ATI TEAS. However, we have not identified this evaluation information as part of the NDN's formal assessment of academic learning plan, although the results from the pre-admission assessment test are used in making admission decisions.

Factor: Program improvement
Rubric score: 1 2 3 4 (Note: Data from the nursing program are never used for improving programs university-wide; therefore, the NDN is restricted in its ability to ever earn a "4" in this category.)
Evidence/Rationale: Although the NDN's formal assessment of student learning plan is in an infancy stage, the NDN has been collecting data about student learning since the beginning of the program. Some data were collected because of the NDN's SEP, and some data were collected from vague beliefs that the data reflected student learning. Nevertheless, the data collected have been used to make decisions and improve the NDN in its ability to deliver the curriculum and promote student learning. With the NDN's formal assessment of student learning plan, the data will provide a more focused assessment and will allow us increased ability to interpret the data and make program improvements as necessary. Previously collected data and data currently collected have been used to guide the curriculum review the NDN is conducting. The data have provided evidence of possible weaknesses in the NDN's curriculum content or delivery methods.

Another example of how assessment data have been used to make program decisions involves the first-time pass rate on the NCLEX-RN. Data from 2001 to 2003 indicate that the NDN's pass rate was approximately at the state average for BSN programs and slightly below the national average for all programs (i.e., diploma, associate degree, and baccalaureate). However, in 2004, the pass rate dropped below the state and national averages and was below the minimum requirements from the KSBN. In response to the data (i.e., the pass rate), the NDN conducted an extensive review of internal and external factors that may have contributed to the drop in the pass rate. This review did not reveal any definite contributing factors. However, the NDN did initiate changes, including the administration of a pre-admission assessment test; the administration of standardized achievement tests throughout the nursing program; the development and implementation of an elective course (that all students enrolled in) that was specific to NCLEX-RN preparation and success; the introduction and use of NCLEX-RN test-taking strategies in most nursing courses; the development and implementation of a sophomore-level (2nd semester of 1st year in the nursing program) elective course that prepared students with strategies for success; and increased involvement and accountability from students in the preparation for the NCLEX-RN. Since implementing the changes, the first-time pass rate has increased.

An ongoing concern of the NDN is determining how to balance admission, retention and attrition, and first-time pass rates on the NCLEX-RN. What is the correct balance between retention and attrition? The program is evaluated on its retention and attrition rates, yet to what extent should the NDN work with "at risk" students, including international students?

Level C

Factor: Comprehensive & integrated measures
Rubric score: 1 2 3 4
Evidence/Rationale: The assessment system includes comprehensive and integrated measures with scoring/cut-off criteria that are examined for validity and utility, resulting in program modifications as necessary. All direct and indirect summative measures have expected levels of achievement identified. All standardized assessment measures (i.e., ATI comprehensive exam, NCLEX-RN) have been examined for validity. Other assessment measures, including the NU 491 clinical evaluation tool, ESU/NDN Graduate Survey, and ESU/NDN Employer Survey, have been examined for at least face validity and have been shown to be valid and useful measures for assessing student learning. The faculty members know that the clinical evaluation tool and the graduate/employer surveys will need to be revised depending on the outcome of the curriculum review. As explained in previous sections of this report, the findings have been used to make program/course modifications as necessary.

All direct formative measures used to assess student learning have expected levels of achievement identified. All standardized assessment measures (i.e., ATI content mastery exams) have been examined for validity. The clinical evaluation tools have been examined for at least face validity and have been shown to be valid and useful measures for assessing student learning. The faculty members know that the clinical evaluation tools will need to be revised depending on the outcome of the curriculum review. The use of the formative data has been critical to guiding the curriculum review and to identifying students who have demonstrated sufficient ability to progress in the nursing program.

The ad hoc assessment committee and other members of the nursing faculty are in the process of developing NDN-wide rubrics that can be used for written papers and oral presentations.

Factor: Monitoring student progress, & managing & improving operations & programs
Rubric score: 1 2 3 4
Evidence/Rationale: Assessment measures, particularly the direct formative measures, have been used to monitor student progress in the NDN. Clinical evaluation tools are kept on each student enrolled in a clinical course. Course faculty members determine if each student is meeting the objectives established in the clinical evaluation tool. If a student is not meeting the course objectives before the end of the course, the student is placed on a clinical contract. The purpose of the clinical contract is to help the student remediate or correct any deficiencies and to ultimately be successful in the course. However, if a student does not meet the clinical objectives, the student fails the clinical course and is withdrawn from the nursing program.

In addition, the ATI content mastery exams are used to guide decisions about students progressing in the NDN. Scores on the content mastery exams are incorporated into a student’s course grade. Depending on the level of achievement, the student’s grade is affected. Students must earn at least a C in the course in order to progress in the NDN. This past semester, course faculty members noted the correlation between students not progressing in the NDN and the students’ levels of achievement on the ATI content mastery exams.

Assessment measures, particularly the direct summative measures, have been used to manage and improve operations and programs. Examples of this have been included in other sections of the report (e.g., curriculum review, first-time pass rate for NCLEX-RN).

Factor: Assessment data usage by faculty
Rubric score: 1 2 3 4
Evidence/Rationale: Assessment data have been and continue to be used extensively by the faculty for making decisions about course content and content delivery, for determining student progress in a course, and for making NDN-wide decisions associated with efforts such as the NDN-wide curriculum review. Faculty members have access to assessment data through the SHARE directory and by hard copies kept in the faculty workroom. Course faculty members also have access to course-specific ATI content mastery exam scores for individual students and the class as a whole.

Factor: Assessment data shared with students
Rubric score: 1 2 3 4
Evidence/Rationale: All individual assessment data are shared with the student. For example, each student knows the individual results for the ATI content mastery and comprehensive exams. Each student knows the individual results for clinical performance. Remediation opportunities are made available to students, including clinical contracts, availability of student tutors for assistance, additional time in the NDN laboratory, and additional time spent with NDN faculty members. Remediation also is available through the ATI content mastery exam system. Data about first-time pass rates for the NCLEX-RN are available on the KSBN Web site. Graduates failing the NCLEX-RN have access to library resources. Many of the faculty members make themselves available to offer guidance to these students.

However, the NDN has identified the need to develop and implement a more detailed plan to share assessment data with students and to promote individual accountability regarding reflection and improvement. This need has been included in the proposed NDN Strategic Plan that should be finalized before the end of the Spring 2008 semester. It has been suggested that each student should have a portfolio or record of their academic learning in the areas of knowledge, skills, and attitudes, and that the portfolio would be a required component of each advising session with the student's assigned academic advisor and/or each advising session with a specific course faculty member.

Factor: Fairness, accuracy & consistency of assessments
Rubric score: 1 2 3 4
Evidence/Rationale: Faculty members have worked to establish the assessments as fair, accurate, and consistent through data analysis. For example, faculty members reviewed the data for the ATI content mastery exams for over 1 year before a cut-score of Level 2 was established. Performance expectations for students are clearly identified in course syllabi and in the NDN Student Handbook. Some students have said that the performance expectations have not been fair. However, all students have the right to the academic appeal process as outlined in the NDN Student Handbook and in the University's Policy Manual. In all cases, the assessment decisions have been upheld, meaning that the performance expectations and assessments had been fair and accurate.

GENERAL FINDINGS

Data are being used to make decisions in the NDN about the curriculum and what could or should be done to enhance student learning and meet the NDN's educational outcomes and sub-outcomes. Trended data on summative measures indicate that some benchmarks are being met and progress has been made on some measurements, but not in all areas. Particular areas of focus include the performance of graduates on the NCLEX-RN.

The faculty is reviewing the use of methods to help students retain the information presented in previous courses; the use of methods to help students maintain competency and proficiency in the use of nursing skills; and ways to enhance the critical thinking of the students. Decisions have been made to include student performance on most of the ATI content mastery exams (identified as formative measures) as a component of the student's course grade. In addition, the faculty continues to discuss "at risk" students and what can be done to identify these students earlier and to help them with the factors that place them at risk for not progressing in the program and/or not passing the NCLEX-RN. The completion of the future goals listed below should help in the assessment process.

FUTURE GOALS

1. Complete the curriculum review. Review course objectives for each nursing course and revise objectives to enhance the presentation of content matched to our educational outcomes and sub-outcomes.
2. Develop and implement NDN-wide communication rubrics (e.g., written papers, oral presentations, listening, electronic).
3. Formally adopt formative measures (in addition to ATI content mastery exams) to measure aspects of the educational outcomes and sub-outcomes.
4. Develop and implement the "thumb prints" for expected levels of performance as students progress through the program. It has been discussed that there possibly could be three levels.
5. Develop and implement a more detailed plan to share assessment data with students and to promote individual accountability regarding reflection and improvement.
6. Revise the ESU/NDN graduate and employer surveys to be consistent with any changes in the curriculum and with the educational outcomes.
7. Reach consensus among the nursing faculty regarding the definitions of concepts such as globalization, diversity, and life-long learning. Review the assessment measures presently being used to operationalize these concepts.
8. Refine/reduce the number of summative measures. Concentrate on the essential or most important indicators.

RESOURCES NEEDED TO IMPLEMENT ASSESSMENT SYSTEM

The assessment system has been developed and implemented. At this stage of implementation, guidance regarding the refinement of the assessment system would be helpful.

+ Assessment Type Legend (use numbers in the "Type" column above)

Direct Measures (evidence, based on student performance, which demonstrates the learning itself)

1. Locally Developed Achievement Measures. This type of assessment generally is one that has been created by individual faculty members, their department, the college, or the university to measure specific achievement outcomes, usually identified by the department and its faculty.
2. Internal or External Expert Evaluation. This type of assessment involves an expert using a pre-specified set of criteria to judge a student's knowledge, and/or disposition, and/or performance.
3. Nationally Standardized Achievement Tests. These are assessments produced by an outside source, administered nationally, that usually measure broad exposure to an educational experience.
4. Portfolio Analysis. A portfolio is a collection of representative student work over a period of time. A portfolio often documents a student's best work and may include a variety of other kinds of process information (e.g., drafts of student work, the student's self-assessment of their work, other students' assessments). Portfolios may be used for evaluation of a student's abilities and improvement. The portfolio can be evaluated at the end of the student's career by an independent jury or used formatively during a student's educational journey towards graduation.
5. Capstone Experience. Capstone experiences integrate knowledge, concepts, and skills associated with an entire sequence of study in a program. Evaluation of students' work is used as a means of assessing student outcomes.
6. Writing Skill Assessment. Evaluation of written language.
7. Other (please list): ______
8. Other: ______

Indirect Measures (reflection about the learning or secondary evidence of its existence)

9. Persistence Studies. The number/percentage of students who, from entry into the university, graduate/complete the program within a given number of years, usually 6 to 7.
10. Student or Faculty Surveys (or Focus Groups or Advisory Committees). This type of assessment involves collecting data on one of the following: 1) perceptions of knowledge/skills/dispositions, either from a student, faculty member, or group; 2) opinions about experiences in a course/program or at the university; 3) opinions about the processes or functioning of a department/course/program; 4) minutes from an advisory committee.
11. Alumni Surveys (or Focus Groups or Advisory Committees). This type of assessment involves collecting data on the same topics as presented in "Student or Faculty Surveys" above, except the respondent is a past graduate and not a current student or faculty member.
12. Exit Interviews. Individual or group interviews of graduating students. Could be a survey format, but also can involve face-to-face interviews.
13. Placement of Graduates. Any data that survey post-graduate professional status. Data can include graduate employment rates, salary earned, position attained, geographic locations, etc.
14. Employer Satisfaction Surveys. Employer surveys can provide information about the curriculum, programs, and students that other forms of assessment cannot produce. Through surveys, departments traditionally seek employer satisfaction levels with the abilities and skills of recent graduates. Employers also assess programmatic characteristics by addressing the success of students in a continuously evolving job market.
15. Other (please list): ______
16. Other: ______

v.4-18-05
