
2008 PTLC Research Proposals

Index of 2008 PTLC Research Proposals:

1. Professor John Basey - University of Colorado Boulder
2. Professor Lynne Bemis - University of Colorado Denver
3. Professor Peter Blanken - University of Colorado Boulder
4. Professor Elaine Cheesman - University of Colorado Colorado Springs
5. Professor Judith Coe - University of Colorado Denver
6. Professor Alejandro Cremaschi - University of Colorado Boulder
7. Professor Kendra Gale - University of Colorado Boulder
8. Professor Scott Grabinger - University of Colorado Denver
9. Professor Jean Hertzberg - University of Colorado Boulder
10. Professors Jane Kass-Wolff and Ernestine Kotthoff-Burrell - University of Colorado Denver
11. Professor Yvonne Kellar-Guenther - University of Colorado Denver
12. Professor Mary Klages - University of Colorado Boulder
13. Professor Suzanne MacAulay - University of Colorado Colorado Springs
14. Professor Stefanie Mollborn - University of Colorado Boulder
15. Professor Mary Jane Rapport - University of Colorado Denver
16. Professor Cathy Thompson - University of Colorado Denver
17. Professor Cindy White - University of Colorado Boulder

John M. Basey

Senior Instructor Department of Ecology and Evolutionary Biology University of Colorado at Boulder UCB 334 Boulder, CO 80309 303-492-8487 [email protected]

Project Proposal

Characteristics of science labs vary in delivery (teacher demonstration, computer simulation, hands-on observation, hands-on experimentation), approach (inductive, process-based, constructivist (1); implicit, explicit, explicit and reflective (2)), style (expository, discovery, problem-based, inquiry (3)), and a host of other miscellaneous features. Likewise, student-learning outcomes vary in a similar manner (Table 1), along with student learning-style preferences (4). Different combinations of lab characteristics should result in variations in student-learning outcomes. For instance, take two fetal pig dissection labs. The first is a hands-on observational lab in an expository style that has students identify structures and learn their functions. The second is an inquiry-style experiment that uses fetal-pig dissection as the avenue for gathering data, in which students derive a question and then design and carry out an investigation to test it. Both labs address roughly the same content topic and lab skills, but the first may help students more with term recognition and conceptualization associated with discipline content, while the second may help students more with science process and reasoning skills and with improving students' attitudes toward science. In other words, trade-offs are likely to exist among the learning outcomes of labs with various designs.

Theoretically, for a given set of student-learning goals, there should be an optimal lab design that maximizes students' realization of those goals. Actually determining the "optimal lab design" for a set of learning goals, or, even tougher, an "optimal curriculum design," is virtually impossible. However, quantifying the differential influences of lab characteristics on various learning goals can bring lab and curriculum designs much closer to the optimal design.

Research in this area is difficult to evaluate for several reasons.

- Relatively few studies experimentally compare the effects of different characteristics of hands-on science labs on student-learning outcomes. Most of these studies compare only two or three lab characteristics and, with the exception of studies of understanding the nature of science (NOS), use unique assessments that cannot be compared on relative scales between studies.

- Studies often do not delineate learning outcomes (especially higher- and lower-order cognition) in their assessments, so trade-offs cannot be evaluated for different learning outcomes.

- Studies commonly examine specific adjustments to labs or curricula rather than broad theoretical categories.

In 2007, I was part of the President's Teaching and Learning Collaborative. As part of that program, I attempted to quantify the relative impacts of six characteristics of lab design on students' attitudes toward lab. Results indicated the following quantitative relationship: Total Student Attitude = 0.39 (Exciting) + 0.25 (Time Efficient) + 0.15 (Not Difficult) + 0.10 (Lecture Help) + 0.08 (Experimental) + 0.03 (Open-Ended). In this proposal, I would like to extend this research and quantify how lab style can influence the six characteristics listed above. For example, what are the trade-offs in excitement, time efficiency, level of difficulty and lecture help for an expository vs. discovery vs. problem-based vs. full-inquiry lab?
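
To make the form of this relationship concrete, the sketch below shows one way such weights can be estimated: regress students' overall lab ratings on their ratings of the six characteristics and normalize the fitted coefficients. This is a hypothetical Python illustration with simulated data, not the actual 2007 analysis; the rating scale and all variable names are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
true_w = np.array([0.39, 0.25, 0.15, 0.10, 0.08, 0.03])  # weights reported above

# Hypothetical per-lab ratings: rows = lab evaluations, columns = the six
# characteristics (on an invented 1-5 scale). Simulated for illustration only.
ratings = rng.uniform(1, 5, size=(30, 6))
overall = ratings @ true_w + rng.normal(0, 0.1, 30)  # simulated overall ratings

coef, *_ = np.linalg.lstsq(ratings, overall, rcond=None)  # least-squares fit
weights = coef / coef.sum()  # normalize so the relative weights sum to 1
for name, w in zip(["Exciting", "Time Efficient", "Not Difficult",
                    "Lecture Help", "Experimental", "Open-Ended"], weights):
    print(f"{name}: {w:.2f}")
```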

Methods. In 2001-2003, I derived and verified an assessment that could be used to determine changes in students' attitudes toward lab between years and be sensitive to changes in labs between years. In spring 2007, I modified the assessment for a new question and used it in General Biology Lab II. The assessment was an end-of-semester survey that had students provide an overall rating of each lab in the curriculum and a rating based on their perception of the degree to which the independent variables of excitement, time efficiency, lecture help, level of difficulty, open-endedness and experimentation impacted the lab. I plan to use the same assessment at the end of fall 2007 in General Biology Lab I (GBLI) and General Biology Lab: A Human Approach (HAL). I then plan to redesign some labs for fall 2008 so that they are the same in content covered but differ in lab style, then reassess with the same assessment.

Extensions. This research could directly impact my lab designs and the lab designs of others. After examining attitudes, I intend to examine impacts of lab style and design on other learning goals, such as memorization, comprehension and science reasoning.

When completed, I intend to submit my findings to one or more journals (e.g. The Journal of Research in Science Teaching, The Journal of Biological Education, or The International Journal of Science Education). I also plan to speak at national meetings (e.g. The Alliance of Biology Lab Educators), and at other universities (similar to the invited seminar I did in 2004 for the biological education program at UNC).

I have chaired the senior outcomes assessment committee in the Department of EEB for the last two years. I do not have anyone in mind right now as a coach. If I am selected, I plan to attend the required meetings. Since I was part of the PTLC cohort of 2007, I also plan to serve as a coach.

Literature Cited: (1) Hodson D (1996) Journal of Curriculum Studies, 28(2), 115-135; (2) Toh and Woolnough (1993) Journal of Research in Science Teaching, 30(5), 445-457; (3) Domin DS (1999) Journal of Chemical Education, 76, 543-547; (4) Dunn R, Dunn K, & Price GE (1982) Lawrence, Kansas: Price Systems.

Lynne T. Bemis

Associate Professor of Medicine Cancer Biology University of Colorado Denver School of Medicine Mail Stop 8117 PO Box 6511, Aurora, CO 80045 303-724-3846 [email protected]

What is the central question, issue, or problem you plan to explore in your proposed work?

The central question of our work is: Does training in health disparities and cultural diversity enhance health care professionals' knowledge of the factors influencing the health needs of people from diverse backgrounds? The long-term goal of our program is to prepare healthcare professionals trained in health disparities research to implement positive solutions to health disparities.

Why is your central question, issue, or problem important, to you and to others who might benefit from or build on your findings? Recall that the goal of the scholarship of teaching and learning is not simply to improve your own teaching but to contribute to the practice and profession of teaching more broadly.

Comprehensive Health Disparities Training for First Year Medical Students was funded last year as a Diversity and Excellence pilot project. The project was initiated because of a deficit that several of our first year medical students had identified in their preparation for a medical career. They had come to realize that, prior to entering medical school, many of their peers had developed laboratory research skills, whereas laboratory research experience was not available to many of the minority students. Another area of concern involves cross-cultural issues and health disparities: students of color are often expected to be experts on health disparities by virtue of their heritage. Although many students from underrepresented populations may have experienced health disparities firsthand, most probably have not discussed these issues with national leaders in the field of health disparities research. The goal of this project is to rectify both the lack of research experience and the inadequate preparation to discuss health disparities by providing a comprehensive training program. The training requires that participants (first year medical students) attend the weeklong Annual Summer Workshop, Disparities in America: Working Towards Social Justice, held in Houston, Texas, and, if their schedules permit, spend the remainder of the summer conducting research in a laboratory or public health setting.

Description of the Program

Comprehensive Health Disparities Training for First Year Medical Students is a student-driven educational initiative. The program was open to all students in the first year of medical school, and eight students inquired about it. Two students who were initially interested later determined that they were not available for the summer workshop and were thus unable to participate. This year ten students replied with interest in attending the disparities conference, and others are attending a Health Disparities Interest Group (HDIG) being initiated in January by two second-year medical students, Ade Taiwo and Rose Montoya. We would like to work with the PTLC to develop evaluation tools for these three integrated programs (Health Disparities Conference attendance, Research Fellowship, and HDIG participation) and for evaluation of the long-term outcomes of health disparities training and its impact on the future careers of medical students and healthcare professionals.

Description of the Research Experience

First year medical students have just one summer between their first and second years to address areas of their education that may affect their success in future endeavors such as residency or fellowship. The authors of this proposal realize that some students have not had the research experience available to their peers. To address this disadvantage, we proposed to provide a stipend so that students could be readily placed in research laboratories at CU. The current state of grant funding makes it almost impossible for a research laboratory to provide training to a volunteer; having their own stipend, however, allowed the students to overcome the major barrier to obtaining research experience. The format for research experiences through this project followed that required of CU Cancer Center Fellowships. Briefly, two papers are required: the first is a description of the proposed research project, and the second is a final paper written in the format of a short research paper with findings and conclusions. A final oral presentation, in the form of a poster presentation, is also required.

Broader Impact and Contribution to Teaching

Many programs in the advanced training of health care professionals are administered to very small groups of students or fellows. The evaluation tools we develop for the small group of students who attend the comprehensive health disparities training will also be applied to other fellowship programs in the Division of Medical Oncology. Indeed, we hope to publish the evaluation tools we develop so that fellowship programs attended by small numbers of participants can be adequately evaluated.

How do you plan to conduct your investigation? What sources of evidence do you plan to examine? What methods might you employ to gather and make sense of this evidence? How might you make your work available to others in ways that facilitate scholarly critique and review, and that contribute to thought and practice beyond the local? (Keep in mind that coaching will be available to help you develop these aspects of your proposal.)

Part one of the project is attendance at the Summer Disparities Conference at MD Anderson. To evaluate this part of the program, we would like to develop pre- and post-evaluations. Currently the students are required to interview one of the speakers and then report back in a PowerPoint presentation to other students who participate in the Health Disparities Interest Group. The second part of the evaluation will be pre- and post-evaluation of the research experience, including the papers submitted before and after the research. The final part of the project is to evaluate the Health Disparities Interest Group; our hope is to develop this part of the project into a seminar elective course. Each part of the Comprehensive Health Disparities Training project needs improved evaluation tools. Help in developing assessment methods for this training program is our anticipated outcome of working with coaches in the PTLC. Dr. Bemis will be the faculty advisor on the project, and the medical students Ade Taiwo and Rose Montoya will be using this project as their mentored scholarly activity (a required project for graduation from medical school).

Include a literature review of the theory and practice of the subject of your inquiry in order to locate your research in the literature preceding it. (The Website offers expert information and advice on how to conduct a modest literature review.)

In reviewing the literature, it seems that our program best fits into the category of fellowship programs. We were able to find a few publications that evaluated such programs, but they had many years of follow-up or pooled their data from small groups (1-3). Through interactions with the PTLC we are hoping to lay the groundwork for long-term follow-up and evaluation. Our expectation is that short-term evaluation is also of great importance to our ability to develop a sustainable program.

What aspects of the design and character of this work are you not yet fully prepared to describe?

We pilot tested the program last year, except for the HDIG, which is a new initiative proposed by the students for January 2008. Since we have not yet pilot tested the HDIG part of the program, we will be describing it and determining additional needs for evaluation through the pilot stage from January to June 2008.

What questions do you have and what do you still need to know?

The main question that remains is how to think about long-term evaluation. Does our program really have the desired effect, thereby helping health care professionals begin to solve the health disparities problems they encounter in their own practice?

What is your record of innovation in teaching and/or the assessment of learning?

I am the co-founder of the innovative teaching program Genetics Education for Native Americans (GENA). This program focuses on teaching genetics to underrepresented minorities (URM) with the goal of providing education to inform decision making. Students from URM backgrounds often do not have exposure to scientific careers and are thus often excluded from these fields of study. In addition, genetic information is increasingly required for communities to make decisions about healthcare, and yet many community members have had little education in the field of genetics. GENA developed and evolved into a 30-objective curriculum that addressed the needs of Native American community members and was useful to other URM community members as well. Descriptions of the GENA curriculum and its evaluation have been published in numerous peer-reviewed articles (4-8). The most recent assessment of learning for GENA has included use of an Audience Response System (ARS), which has also been previously described (9). My previous experience gave me an interest in applying innovative methods to collaborative curriculum development and is the reason I am interested in working with the students to develop this student-initiated training.

Can you suggest an appropriate coach for your project?

Dr. Gail Armstrong might be an appropriate coach for this project; I spoke with her briefly about it. Currently our program is reaching out to first year medical students, but it would be appropriate for other students preparing to work in the health care professions as well. Thus, we are hoping that through interactions with Dr. Armstrong we could reach out to additional student groups.

Are you able to attend the required meetings as specified above?

Yes, I will attend meetings as required.

If your project is selected, are you willing to serve as a coach in PTLC in a future year?

Yes, I am willing to serve as a coach in future years.

References:

1) Mitchell, J., Levine, R., Gonzalez, R., Bitter, C., Webb, N., and White, P. Evaluation of the National Science Foundation Graduate Teaching Fellows in K-12 Education (GK-12) Program. 2003. Paper presented at the Annual Meeting of the American Educational Research Association. (This reference is available on AskERIC.)

2) Ball, W.J. Using Public Policy-Oriented Community-Based Research to Boost Both Community and Political Engagement. 2003. Paper presented at the annual political science association meeting. (This reference is available on AskERIC.)

3) Trumbull, E. and Pacheco, M. Leading with Diversity: Cultural Competency for Teacher Preparation and Professional Development. 2005. Education Alliance at Brown University.

4) Burhansstipanov, L., Bemis, L.T., Dignan, M. and Dukepoo, F.C. Development of a Genetics Education Workshop Curriculum for Native American College and University Students. Genetics. (2001) 158:941-948.

5) Romero, F., Bemis, L.T., Dignan, M. and Burhansstipanov, L. Genetic Research and Native American Cultural Issues. Journal of Women and Minorities in Science and Engineering. (2001) 7:97-106.

6) Burhansstipanov, L., Bemis, L.T., and Dignan, M. Native American cancer education: genetic and cultural issues. J Cancer Educ (2001) 16(3):142-5.

7) Burhansstipanov L, Bemis L.T., Kaur JS, Bemis G. Sample genetic policy language for research conducted with native communities. J Cancer Educ. (2005) 20:52-7.

8) Dignan MB, Burhansstipanov L, Bemis L.T. Successful implementation of Genetic Education for Native Americans workshops at national conferences. Genetics. (2005) 169(2):517-21.

9) Gamito EJ, Burhansstipanov L, Krebs LU, Bemis L.T., Bradley A. The use of an electronic audience response system for data collection. J Cancer Educ. (2005) S20(1 Suppl):80-6.

Peter Blanken

Associate Professor Geography and Environmental Studies University of Colorado, Boulder 260 UCB Boulder, Colorado 80309 303-492-5887 [email protected]

What is the central question, issue, or problem you plan to explore in your proposed work? Based on my informal discussions with students and faculty, there exists a general conception that class size (number of students) directly affects the quality of the student's learning experience. For example, many students feel that large classes (100 or more students) are not conducive to effective learning for several reasons (e.g. limited or no contact with the instructor, a disruptive learning environment, not being able to hear/see the front of the classroom, no place for discussion, increased occurrences of violations of the Honor Code...). In fact, at the time of tenure and promotion reviews, many instructors are "excused" from receiving poor classroom FCQs/peer evaluations in large classes, since that is the "norm".

My hypothesis is that there is no relationship between class size and the student's learning experience or instructor evaluations. This hypothesis was developed from my own evaluation of my teaching, but I wish to extend the analysis across the Boulder campus using data provided by FCQ scores.

Why is your central question, issue, or problem important, to you and to others who might benefit from or build on your findings? Recall that the goal of the scholarship of teaching and learning is not simply to improve your own teaching but to contribute to the practice and profession of teaching more broadly.

My objective is to quantify the relationship between class size and the student's learning experience, including the rating of the instructor. I believe that the student's learning experience in a class does not vary with the number of students in the class. Quantifying this relationship using actual data will help instructors identify reasons other than class size why their teaching is or is not effective. This knowledge will then help improve the student's learning experience, since the influence of class size on the learning experience will be determined. Quantifying the influence of class size on instructor evaluations will also help in interpreting FCQ scores at the time of tenure and promotion reviews.

How do you plan to conduct your investigation? What sources of evidence do you plan to examine? What methods might you employ to gather and make sense of this evidence? How might you make your work available to others in ways that facilitate scholarly critique and review, and that contribute to thought and practice beyond the local? (Keep in mind that coaching will be available to help you develop these aspects of your proposal.)

The hypothesis will be tested using data readily provided by FCQ results for courses taught in the College of Arts and Sciences, coupled with interviews with both students taking and instructors teaching small (< 15 students), medium (15-99 students), and large (> 100 students) classes.

The actual relationship between class size and the learning experience will be determined using FCQs available online for courses taught in the College of Arts and Sciences (since Fall 2003). These include data on the independent variable (class enrollment) and the following dependent variables: learning experience, course rating, instructor rating, and work effort. The recent change in FCQ format should not affect the "course", "instructor", and "learning" questions, since they are worded similarly and both formats can be standardized to a common scale. The "work effort" question changed from a relative scale to number of hours/week, so these responses will have to be standardized relative to a 40-hour week. All data will be archived, plotted, and analyzed for statistical significance using Matlab software.
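
Although the analysis is planned in Matlab, the Python sketch below illustrates the two steps just described: standardizing the new hours-per-week "work effort" item against a 40-hour week, and testing for a relationship between class size and the learning-experience rating. The data are simulated placeholders, not the actual FCQ records, and the rating scale is invented.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated class sizes: 50 small (<15), 100 medium (15-99), 50 large (>=100)
enrollment = np.concatenate([rng.integers(5, 15, 50),
                             rng.integers(15, 100, 100),
                             rng.integers(100, 300, 50)])
learning = rng.normal(4.5, 0.8, enrollment.size)        # learning-experience ratings (invented scale)
hours = rng.normal(8, 3, enrollment.size).clip(1, 30)   # reported hours/week (post-2003 format)

work_effort = hours / 40.0  # standardize "work effort" relative to a 40-hour week

# Is class size related to the learning-experience rating?
r, p = stats.pearsonr(enrollment, learning)
print(f"class size vs. learning experience: r = {r:.3f}, p = {p:.4f}")

# Compare the three class-size groups with a one-way ANOVA.
small = learning[enrollment < 15]
medium = learning[(enrollment >= 15) & (enrollment < 100)]
large = learning[enrollment >= 100]
f_stat, p_anova = stats.f_oneway(small, medium, large)
print(f"small/medium/large ANOVA: F = {f_stat:.2f}, p = {p_anova:.4f}")
```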

The perceived relationship between class size and the learning experience will be determined using a series of questions asked during interviews with volunteer students and instructors (coaching from PTLC mentors will help here). Students will be asked questions such as "Was this course an effective learning experience? Why or why not? Which aspects of this course made it unique? How could this course be improved?"; instructors will be asked questions such as "What makes an effective learning environment? What makes this course easy or hard to teach? Are there any special needs you require to effectively teach a course of this size?"; and both groups will be asked "Do you feel there is a connection between class size and the learning experience?"

This aspect of the research is the least developed and could be modified through PTLC mentoring. My initial plan is to interview at least 10 randomly selected instructors from each class-size group myself, and to have an undergraduate student assistant interview at least 10 randomly selected students from each group. HRC approval of the questions will be obtained prior to conducting any interviews.

Include a literature review of the theory and practice of the subject of your inquiry in order to locate your research in the literature preceding it. (The Website offers expert information and advice on how to conduct a modest literature review.)

Research productivity is relatively easy to quantify through the quantity of peer-reviewed publications and grant awards (Laird and Bell, 2007), and teaching evaluations such as FCQs offer an attractive means to similarly quantify teaching (Renaud and Murray, 1996). The relationship between class size (quantity) and teaching quality immediately comes into question, yet few studies have examined it (and to my knowledge, none at CU). I could find only two studies similar to the one proposed here (Fernandez et al., 1998; Kuo, 2007), and both found weak relationships between teaching scores and class size. These studies were performed within specific programs/departments at specific institutions (with small sample sizes), and it is not at all clear whether the results can be generalized. Also, neither study assessed perceptions of the relationship between class size and quality.

What aspects of the design and character of this work are you not yet fully prepared to describe?

The method for assessing the perceived relationship between class size and teaching effectiveness, from both students and instructors, needs to be better designed and implemented.

What questions do you have and what do you still need to know?

I need to know the actual relationship between class size and learning/teaching effectiveness. My question from this knowledge will be how to better design (or modify) the classroom environment to make the students' learning experience more effective (i.e. is class size an issue or not?).

What is your record of innovation in teaching and/or the assessment of learning? Can you suggest an appropriate coach for your project? (This is NOT a requirement but may increase your likelihood of acceptance.)

I have participated in several FTEP and LEAP activities during my nine years of teaching at CU. In addition, I have twice been invited to run workshops on topics such as "Teaching large classes", "Active-teaching techniques" and "Web-based teaching" at the Geography Faculty Development Alliance (annual workshops for new geography faculty).

In every class I teach, I try to develop and test new teaching methods. Examples include: pairing kindergarten students with undergraduate students in my GEOG 1001 lectures; active learning through "on-campus field trips" using several meteorological instruments; development of an interactive, cross-discipline information web site on the boreal forest; and assessing the impact of visual aids on information retention using simple in-class experiments. At this time I cannot suggest an appropriate coach.

Are you able to attend the required meetings as specified above? Yes.

If your project is selected, are you willing to serve as a coach in PTLC in a future year? Yes.

References Cited

Fernandez, J., Mateo, M.A., and Muniz, J. 1998: Is there a relationship between class size and student ratings of teaching quality? Educational and Psychological Measurement, 58(3), 596-604.

Kuo, W. 2007: Editorial: How reliable is teaching evaluation? The relationship of class size to teaching evaluation scores. IEEE Transactions on Reliability, 56(2), 178-181.

Laird, J.D. and Bell, R.E. 2007: Assessing the publication productivity and impact of eminent geoscientists. EOS, 88(38), 370-371.

Renaud, R.D. and Murray, H.G. 1996: Aging, personality, and teaching effectiveness in academic psychologists. Research in Higher Education, 37(3), 323-340.

Elaine A. Cheesman

Assistant Professor Special Education University of Colorado at Colorado Springs 1420 Austin Bluffs Parkway, P.O. Box 7150 Colorado Springs, CO 80933-7150 719-262-5861 [email protected]

Central Research Question

Will intensive, systematic practice using classroom response systems significantly improve the skills of special education teacher-candidates who struggle to learn English word-structure?

Relevance of Research Question

Knowledge of English word structure is essential for teachers who work with students who struggle with reading and writing acquisition. In the past three decades, the science of reading instruction has provided solid evidence on the nature of reading acquisition and the causes of reading failure (National Reading Panel, 2000). Scientists now estimate that over 90% of children can be taught to read using research-based instruction (Torgesen, 2000). Research evidence suggests that students with serious reading problems do respond to instruction that is explicit, comprehensive, intensive, and supportive (Foorman & Torgesen, 2001). For these children, effective reading instruction requires a teacher who thoroughly understands and has competent skills in English word-structure, and a complete understanding of the content, scope, and sequence of instruction (International Dyslexia Association, 1997).

A growing body of evidence suggests, however, that many teachers and teacher candidates do not have the recommended background knowledge or skills in some core components of reading-related content to provide reading instruction that is based on empirically validated instructional content and methodology (Moats & Lyon, 1996). Spear-Swerling and Brucker (2003) provided evidence that many teacher candidates enrolled in a special education licensure program have incomplete knowledge about word structure and the ability to understand how speech sounds map to letters in written language. Subsequent investigations by these researchers suggest that teachers' own basic skills may affect their ability to absorb the complexities of English language structure. In a follow-up study, Spear-Swerling and Brucker (2006) examined the relationship between teacher-education students' component reading-related abilities and their acquisition of word-structure knowledge. Results showed that some students' basic decoding and spelling skills were below average, despite average reading comprehension and passing scores on the PRAXIS I (Educational Testing Service, 1994) teacher competency test. Students' decoding and spelling skills, but not their comprehension or reading speed, influenced their ability to acquire word-structure knowledge.

My investigations of teacher knowledge here at UCCS produce similar results. Prior to taking the first of two special education reading-methods courses (SPED 4010/5010, Multisensory Structured Language Education, and SPED 4012/5012, Differentiated Instruction), students are pre-tested using the Survey of English Word Structure, which I previously developed. Results reflected the research evidence: the majority of these students were unable to do simple tasks indicative of English word-structure knowledge. Post-test results on an alternate version of the Survey of English Word Structure, taken at the end of the second term, showed significant gains by most students, but universal mastery remained elusive. In addition, students found the content of these classes, which focused on basic reading and spelling skills, to be extraordinarily difficult to master. This evidence suggests that ordinary university instruction in English word structure may not be sufficient for some would-be teachers.

Teaching Innovations

In 2006, I began using classroom response systems, or "clickers," to help students both understand the course content and practice the basic skills necessary for beginning reading instruction. Although common in science courses, to my knowledge this is the first application in a reading-methods course. Using clicker technology, the instructor poses a multiple-choice question, which the students first discuss and then vote on using small hand-held transmitters. A bar graph showing the range of responses appears after the vote. Well-crafted questions encourage lively discussions as peers try to convince each other of the correct answer. Another innovative part of my instruction is the use of peer mentors. Students in the second term (SPED 4012/5012) attend three sessions of the first-term class (SPED 4010/5010). As part of this assignment, they mentor small groups of first-term students to help them refine their teaching strategies. This serves three purposes. First, it provides small-group instruction to first-term students. Second, it provides additional repetition for students in the second term. Finally, it prepares teacher-candidates to coach and supervise para-professionals in the classroom.

Investigation Plan

I propose first to analyze the post-test results on the Survey of English Language Knowledge from previous terms to determine which items are most difficult for the teacher candidates to learn. I will then refine my existing bank of clicker questions to provide systematic review of these items. I will analyze the number of repetitions necessary for universal mastery as indicated by individual responses to clicker questions during class. All students enrolled in SPED 4010/5010, Multisensory Structured Language Education, will take the Survey of English Word Knowledge, Form A, as a pre-test. Tests and clickers are coded with the student's mother's maiden name and do not influence the course grade. In addition, students will take additional tests of decoding and spelling ability, similar to those used in Spear-Swerling and Brucker (2006), administered by the graduate research assistant. After course instruction, students will take the Survey of English Word Knowledge, Form B, and alternate forms of the decoding and spelling pre-tests. Quantitative data analysis will include examination of in-class clicker responses, pre- and post-tests of English word knowledge, and pre- and post-tests of decoding and spelling. I want to examine correlations between English word knowledge and decoding and spelling skills in the pre-test data. Also, how will the rate and depth of learning of students with initially low scores (lowest 20%) compare to those of students with initially high scores (highest 20%)?
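
As a rough illustration of this quantitative plan, the Python sketch below runs the pre-test correlations and compares pre-to-post gains for the lowest and highest 20% of initial scorers. The scores are simulated and the variable names are placeholders; this is not the study's actual data or analysis pipeline.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 30  # hypothetical number of teacher candidates

word_pre = rng.normal(60, 10, n)                  # pre-test word-structure scores
decoding = 0.5 * word_pre + rng.normal(30, 5, n)  # decoding scores (correlated by construction)
spelling = 0.4 * word_pre + rng.normal(35, 5, n)  # spelling scores
word_post = word_pre + rng.normal(15, 5, n)       # post-instruction scores

# Correlations between word-structure knowledge and component skills at pre-test.
for name, scores in [("decoding", decoding), ("spelling", spelling)]:
    r, p = stats.pearsonr(word_pre, scores)
    print(f"word structure vs. {name}: r = {r:.2f}, p = {p:.3f}")

# Compare gains for the lowest 20% vs. highest 20% of initial scorers.
gain = word_post - word_pre
lo_cut, hi_cut = np.percentile(word_pre, [20, 80])
t, p = stats.ttest_ind(gain[word_pre <= lo_cut], gain[word_pre >= hi_cut])
print(f"gains, lowest vs. highest 20%: t = {t:.2f}, p = {p:.3f}")
```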

Qualitative data can include students' perceptions of their decoding and spelling skills, both pre- and post-instruction, collected anonymously as part of the Survey of English Word Knowledge. The qualitative data may provide increased understanding of the personal and academic struggles of low-performing students, and insights into the effectiveness of this intervention plan. Published results of this study would add to the limited body of evidence concerning teachers' acquisition of English word-structure knowledge as it relates to reading instruction for special education students.

Design Preparation

I am not yet fully prepared to describe the research design for this project. This is an area in which I need assistance. I believe a hierarchical linear model may prove useful in analyzing clicker data. I would value coaching from faculty with a more sophisticated understanding of research design, quantitative statistics and qualitative analysis. An appropriate research-design and publication coach for this project would be Barbara Wise of CU Boulder; Dr. Wise has published extensively in the area of reading interventions for children with dyslexia. In addition, I will need assistance from someone who has a more sophisticated understanding of quantitative statistics. I am able to attend the required meetings as specified above. If my project is selected, I am very willing to serve as a coach in PTLC in a future year.

References

Educational Testing Service. (1994). PRAXIS I: Academic Skills Assessments. Princeton, NJ: Author.

Foorman, B. R., & Torgesen, J. (2001). Critical elements of classroom and small-group instruction promote reading success in all children. Learning Disabilities Research and Practice, 16(4), 203-212.

International Dyslexia Association. (1997). Informed instruction for reading success: Foundations for teacher preparation. A position paper of The International Dyslexia Association. Baltimore, MD: International Dyslexia Association.

Moats, L., & Lyon, G. R. (1996). Wanted: Teachers with knowledge of language. Topics in Language Disorders, 16(2), 73-86.

National Reading Panel. (2000). Teaching children to read: An evidence- based assessment of the scientific research literature on reading and its implications for reading instruction. Bethesda, MD: National Reading Panel, National Institute of Child Health and Human Development.

Spear-Swerling, L., & Brucker, P. O. (2003). Teachers' acquisition of knowledge about English word structure. Annals of Dyslexia, 53, 72-103.

Spear-Swerling, L., & Brucker, P. O. (2006). Teacher-education and students' reading abilities and their knowledge about word structure. Teacher Education and Special Education, 29(2), 116-126.

Torgesen, J. K. (2000). Individual differences in response to early interventions in reading: The lingering problem of treatment resisters. Learning Disabilities Research and Practice, 15(1), 55-64.

Judith Coe

Assistant Professor Music & Entertainment Industry Studies University of Colorado at Denver 162 UCD (303) 556-6013 [email protected]

Creativity and Reflection: Expanded Learning in a Singer/Songwriter Ensemble

Project Ethos

This project focuses on commercial/popular song creation, expression, and reflection within the context of a university singer/songwriter performance ensemble, and seeks to discover and explore how reflection enhances and expands creativity and understanding of the creative process and its products. The project is qualitative in nature and consists of original student creative/reflective work within an established signature student performance ensemble, including peer discussions on and individual reflective writing assignments about:

- why students engage in songwriting and what they want from it

- what creativity means to them

- their preferred or dominant creative domains and rituals

- steps and strategies in the creative/songwriting process with regards to both solitary and collaborative explorations

- play, brainstorming, journaling, cyber sleuthing, clustering, and other creative strategies

- creativity and guided/individual research

- imagery/visual cues and language/literary sources for exploration of and experimentation within creative ideas and processes

- curiosity, learning/creating patterns, and lifestyle issues

- creativity and culture

- outcomes and evaluations

The Process

Over the course of two semesters, data will be collected from student singer/songwriters, including: pre-, mid-, and end-of-semester surveys and reflections; in-class and out-of-class individual and collaborative research and creative work and artifacts; reflective writing about this work and these artifacts; and recordings of weekly in-class workshop performances and an end-of-semester expo concert performance of original songs. A variety of approaches and activities will be explored, including guided assignments and student-centered project choices.

The Product

One result of this project will be a substantial amount of original creative work and correlative reflective writing by students, which traces the basic stages in this project: presentation and discussion of a creative idea, strategy, or topic; collaborative and independent workshopping of that creative idea, strategy, or topic; either a group selection or an individual choice about an original work to be undertaken; a series of research and reflection exercises specific to the work; creation of the work; reflective writing about the creative process; in-class presentation of and discussion about the work; and group reflection and discussion about individual processes, strategies, and outcomes in the work. A particularly rich aspect of the process will be a live digital recording of every student's original creative work and its presentation/performance/discussion/reflection.

I will design a digital portfolio (showcase and development) course website that highlights this project and also shares information and knowledge about SoTL, songwriting, creativity, and reflective practice. Student links will include artifacts and reflections (such as artist statements, writing exercises, original lyrics, research projects, etc.) and a Virtual Concert Hall with streamed digital audio files of weekly class work and concert performances of original songs, along with student stories behind the songs. CAM will work with ITS to secure server space and support for audio files; my expectation is that this site can be launched by mid-summer.

Some anticipated initial outcomes

Previous students in this ensemble have concluded that the exercises and assignments are wonderful creative thinking and learning activities on their own, but that adding the reflective/discussion piece atop these activities (something that most of them had never tried or even heard of) enriches the entire experience to such an extent that it profoundly changes not only the ways in which they create and their creations, but also the ways in which they think about their own creative capabilities, boundaries and barriers. One student in this ensemble a few years ago remarked that he finally understood how his brain worked and that he was able to access his creativity in a more consistent manner.

Results of this research are expected to contribute significantly to the field of commercial/popular music performance and pedagogy, as well as adding to the research base on creativity and reflection. In this respect, I seek to clarify the pedagogical and practical results of reflection on creativity, and the ways in which reflection on creativity enriches, enhances and expands learning.

Record of Innovation in Teaching and the Assessment of Teaching

This project on reflection on creativity and expanded learning will add to the growing list of activities in my principal research area, which emphasizes: (1) creativity and performance; (2) commercial/popular music pedagogy and assessment/digital portfolios in the arts; (3) popular music and culture; and (4) the creation and performance of, and reflection on, original lyrics and music. As a specialist in songwriting, I am an experienced teacher, composer and performer. I have given workshops on songwriting and creativity, and have performed my original works and collaborative works with other artists, in this country and abroad.

During AY 2006-2007, I was a Visiting Fulbright Scholar in the World Academy of Music and Dance at the University of Limerick, Ireland. I was approached by the Director of UL's Centre for Teaching and Learning, Dr. Sarah Moore, to teach a series of workshops on creativity for faculty. Those interactions with UL faculty, and other Fulbright-related research undertaken during my year in Ireland, informed and transformed my understanding of creativity and of teaching and learning creativity.

I was the 2006-2007 recipient of the CAM Teaching Excellence Award at CU-Denver and in 2003, I was the CU-Denver campus winner of the President's Award for Excellence in Teaching with Technology. I have received numerous departmental, college, university and system grants to design and develop innovative teaching, digital portfolio, and assessment initiatives.

Additional Considerations

I would like to work with Mitch Handelsman as my process coach, and if my project is selected, I would be delighted to serve as a coach in PTLC in a future year.

Literature Research

In addition to personal sources on songwriting and creativity, I will also examine general resources on creativity and reflection, such as:

Amabile, T. (1998) How to Kill Creativity, Harvard Business Review, Sep-Oct 1998

Bohm, D. (1998) On Creativity, Routledge, London

Cannatella, H. (2004) Embedding Creativity in Teaching and Learning, Journal of Aesthetic Education, vol. 38, no. 4, winter 2004

Cropley, A. (2001) Creativity in Education and Learning, Kogan Page, Ltd., UK

Csikszentmihalyi, M. (1990) The Domain of Creativity, in M. Runco and R. Albert, eds., Theories of Creativity, Sage Publications, London

Hartnell-Young, E. (2003) Innovation in Practice: from Consumption to Creation, keynote paper, ICTEV Conference, Melbourne, Australia

Hui, Ming Fai (2003) Cultivating Creativity in the Classroom: Assessment for Improving Teaching and Learning, Asia-Pacific Journal of Teacher Education & Development, vol. 6, Sept-Oct 2003, pp. 69-86

Koestler, A. (1964) The Act of Creation, Macmillan, NY

Persson, R. (1996) Studying with a Musical Maestro: A Case Study of Common Sense in Artistic Training, Creativity Research Journal, vol. 9, no. 1, pp. 33-46

Ramsden, P. (1992) Learning to Teach in Higher Education, Routledge, London

Reid, A. (2000b) Self and Peer Assessment in a Course on Instrumental Pedagogy, in D. Hunter and M. Russ, eds., Peer Learning in Music, University of Ulster, Belfast

Reid, A. (2001) Variations in the ways that instrumental and vocal students experience learning music, Music Education Research, vol. 3, no. 1, pp. 25-40

Sternberg, R.J., ed. (1988) The Nature of Creativity, Cambridge University Press, Cambridge

Alejandro Cremaschi

Assistant Professor of Piano Pedagogy Coordinator of Class Piano University of Colorado Boulder 301 UCB [email protected]

Background information and research proposal

All undergraduate music students whose major is not piano are required to reach a certain level of piano playing proficiency as part of their degree. To accomplish this, most incoming students take two to four semesters of group piano (also known as "class piano") during their freshman and sophomore years. These piano requirements and teaching modality are common to most university music programs in the US. Approximately 85% of all incoming undergraduate students in our College of Music take these piano courses. They are taught in sections of up to 12 students in an electronic piano laboratory, twice a week for a total of 2 hours. Students in these classes are required to practice outside class time. These students learn, practice and perform a series of increasingly difficult music pieces and exercises to develop physical coordination and control of sound and tempo (technique), music reading skills in treble (right hand) and bass (left hand) clefs, sightreading skills (the ability to perform pieces "at sight" with little previous rehearsal), etc.

Most incoming non-piano students are proficient performers on their major instruments, yet most have had very little previous piano instruction. Although I do not have data to support this claim, I suspect that the slow learning curve encountered when tackling a new instrument, combined with inefficient use of practice time and strategies and an inability to set achievable practice goals, causes a decline in these students' perceived self-efficacy. From my informal observations, this decline creates a vicious cycle that reduces their interest and motivation to spend time practicing and, ultimately, negatively affects the academic performance of some of these students in these classes. Self-efficacy is defined as "people's judgment of their capabilities to organize and execute the courses of action required to attain designated types of performances" (Bandura, 1986). According to Bandura, "the stronger the perceived self-efficacy, the higher the goal challenges people set for themselves and the firmer their commitment to them" (Bandura, 1991). The self-perception of one's efficaciousness has been strongly related to persistence and performance (Schunk, 1991). Individuals who possess a high degree of self-efficacy have been shown to outperform others who possess equivalent knowledge and ability but lower self-efficacy (Collins, 1982, cited in Bandura, 1991; Pintrich, 1991; Bong and Skaalvik, 2003). Furthermore, self-efficacy has been shown to play a determining role in the individual's academic growth and development (Bong and Skaalvik, 2003).

Academic self-efficacy beliefs have been shown to have a positive relationship to the use of effective learning and study strategies by college students (Pintrich, 1999; Pintrich and De Groot, 1990). These study strategies include: cognitive strategies (rehearsal, elaboration, organization and critical thinking); metacognitive strategies (metacognitive self-regulation); and resource management strategies (time and study environment, effort regulation, peer learning and help seeking) (Pintrich et al., 1991, as cited in Nielsen, 2004). Nielsen (2004) found that students with a higher degree of self-efficacy were more likely to be cognitively and metacognitively involved in trying to learn the material for their principal instrument than less efficacious students. To my knowledge, no such studies have been conducted with students learning piano as a secondary instrument after they have become experts at their primary instrument.

There are several questions that I am interested in exploring in this project. They relate to issues of motivation, practice strategies, self-efficacy, and goal setting in group piano classes. These questions are still "in progress"; they need fine-tuning and probably a narrower focus, but I think they will serve as good starting points for my project:

- What level of self-efficacy do students possess at the beginning of their piano studies, and how does their perceived self-efficacy change toward the end of their first year?

- How much and how well do students practice the piano through the year?

- What cognitive and metacognitive strategies do these students use when practicing the piano?

- Is there a relationship between the use of these strategies, the time spent practicing, and the perception of self-efficacy in students who are learning the piano as their SECONDARY instrument? (Nielsen showed that this is true for students who are already experts at their PRIMARY instrument.)

- What are the students' goals and expectations as pianists at the outset of their piano studies, and to what degree do they feel that they have successfully attained them by the end of the year? Is there a relationship between the number of unattained goals and expectations and the degree of self-efficacy and academic performance in these classes?

My research questions have not been tested in the field of secondary group piano teaching. By answering them, I would be able to confirm and better understand my "informal" observations about decreasing motivation and academic performance in these classes. Finding relationships between motivation, practice strategies and self-efficacy, and perhaps causes for the decrease in motivation, would help me devise and test strategies to help these students (a) use better practicing strategies, (b) develop and maintain a healthier and stronger sense of self-efficacy, and (c) set realistic goals, with the class tailored to help them attain those goals. These answers and strategies would be useful not only for our classes at CU but also for the field of group piano pedagogy in general.

The peer consultations and coaching through the President's Teaching and Learning Collaborative would help me (a) fine-tune my research questions, (b) gain insight from the perspective of other disciplines, (c) develop appropriate and reliable instruments to gather data, and (d) receive additional guidance in the use of appropriate statistical tools to analyze these data.

Q: How do you plan to conduct your investigation? What sources of evidence do you plan to examine? What methods might you employ to gather and make sense of this evidence? How might you make your work available to others in ways that facilitate scholarly critique and review, and that contribute to thought and practice beyond the local?

A: I plan to conduct my investigation through questionnaires and observation of approximately 50 students enrolled in undergraduate group piano classes at CU. I have not yet developed these instruments. The results of my investigation will likely be submitted for publication to journals such as the Journal of Research in Music Education, and presented at conferences such as the National Conference on Keyboard Pedagogy.

Q: What is your record of innovation in teaching and/or the assessment of learning?

A: Please see attached CV.

Q: Can you suggest an appropriate coach for your project? (This is NOT a requirement but may increase your likelihood of acceptance.)

A: I think some of my colleagues in the music education department in our College may be appropriate coaches. I would also be interested in working with someone from the College of Education with experience in this line of research.

Q: Are you able to attend the required meetings as specified above? If your project is selected, are you willing to serve as a coach in PTLC in a future year?

Yes and yes.

References

Bandura, A. (1986). Social Foundations of Thought and Action: A Social Cognitive Theory. Englewood Cliffs, NJ: Prentice Hall.

Bandura, A. (1993). "Perceived Self-Efficacy in Cognitive Development and Functioning." Educational Psychologist 28(2):117-48.

Bong, M., and E. M. Skaalvik. (2003). "Academic Self-concept and Self-efficacy: How Different Are They Really?" Educational Psychology Review 15(1): 1-40.

Collins, J. L. (1982, March). "Self-efficacy and ability in achievement behavior." Paper presented at the annual meeting of the American Educational Research Association, New York.

Nielsen, S. G. (2004). "Strategies and self-efficacy beliefs in instrumental and vocal individual practice: A study of students in higher music education." Psychology of Music 32(4): 418.

Pintrich, P. R., and E.V. De Groot. (1990). "Motivational and Self-Regulated Learning Components of Classroom Academic Performance." Journal of Educational Psychology 82(1): 33-40.

Pintrich, P. R., et al. (1991). A Manual for the Use of the Motivated Strategies for Learning Questionnaire (MSLQ). Technical Report No. 91-B-004. Ann Arbor, MI: University of Michigan.

Pintrich, P. R. (1999). "The Role of Motivation in Promoting and Sustaining Self-Regulated Learning." International Journal of Educational Research 31(6): 459-470.

Schunk, D. H. (1991). "Self-Efficacy and Academic Motivation." Educational Psychologist 26(3&4): 207-31.

Kendra Gale

Assistant Professor School of Journalism and Mass Communication University of Colorado - Boulder 478 UCB / 1511 University Avenue Boulder CO 80309 303.735.2940 [email protected]

PROPOSAL

What is the central question, issue or problem you plan to explore in your proposed work? Do students demonstrate a shift away from "formula following" to increased personal responsibility in a civic engagement course designed to facilitate a move toward self-authoring?

Why is your central question, issue or problem important, to you and to others who might benefit from your findings? Educational institutions are under increased pressure to facilitate the development of students prepared to engage as citizens in a multicultural society. Self-authoring provides an educational model that develops the intrapersonal as well as interpersonal and cognitive skills required to take action on beliefs.

How do you plan to conduct your investigation? What sources of evidence do you plan to examine? What methods might you employ to gather and make sense of this evidence?

In Spring 2008 I am teaching a freshman course designated as the first in a series of four civic engagement courses offered within the School of Journalism and Mass Communication (SJMC). The class comprises a cohort of "Dean's Scholars" admitted directly from high school rather than through application during the sophomore year. They were all high achievers in high school, and all have expressed an interest in one of the sub-disciplines offered within the SJMC.

As one aspect of preparing to teach this course, I applied to teach in the ARSC 1001 program (formerly known as CU 101). One of the absolute joys of this class has been the opportunity to problem-solve collaboratively with other faculty. A second inspiration is tapping into knowledge regarding college student cognitive and social development. While I have always been interested in pedagogy (see CV), the ARSC experience introduced me to methods for integrating knowledge and skill development in conjunction with personal and social development.

I was particularly intrigued by the references to self-authoring in the background reading for the ARSC faculty. While I was already designing the civic engagement course using active learning strategies, I began to incorporate ideas from the Learning Partnership Model (Baxter Magolda, 2004), which is based on the idea of self-authoring. In oversimplified terms, this means sparking cognitive development through reflection on "provocative" experiential situations that afford no clear right/wrong answer, in order to facilitate self-authorship skills, i.e., "the capacity to internally define their own beliefs, identity and relationships" (Baxter Magolda, 2001, p. xvi). These are precisely the skills required for effective civic engagement.

The first phase of self-authorship for most college students is The Crossroads. At this point, they are challenged to give up the expectation of absolutes and external authority for decisions and to begin investigating internal sources of belief and definition. Development continues through Becoming the Author of One's Own Life (phase 2) and then Internal Foundations (phase 3). Having an internal foundation for decision-making allows students to act on the beliefs formed in phase 2 as well as balance the inevitable competing interests.

The nudge toward Phase 1 (The Crossroads) is facilitated by self-reflection on provocative moments that induce disequilibrium between received knowledge and actual experience. Within the course, these moments are most apt to occur in the context of their service with a community group that deals with social issues: homelessness, aging, literacy, etc. For example, encountering homeless people who go to "real" jobs every day and are neither addicts nor mentally ill can unsettle received stereotypes.

Assessment Methods

Measurement of self-authorship within a first-year course presents a challenge. Freshmen tend to see knowledge as fixed rather than constructed. As high achievers in high school, this particular group of students is likely to approach learning from a performance perspective (compliance) rather than a learning perspective; i.e., they have mastered the "formulas" for success in a lecture and recitation course. The self-authorship model, like other cognitive models, describes a multi-year process that is often not realized until after graduation from college. It is highly unlikely that one semester will yield a significant shift, but ideally it will nudge them along toward The Crossroads.

Because the model is relatively new, there are only a handful of articles on how to assess self-authoring. To date, the assessment of self-authorship is primarily qualitative and based on interviews and narratives written by students (Baxter Magolda and King 2007). Jane Pizzolato is at the forefront in assessing self-authorship: broadening the base of students under study (Pizzolato 2003), examining the characteristics of effective provocative moments (2005), identifying beliefs that are precursors to self-authorship (Pizzolato and Ozaki 2007), and developing a survey to be used in conjunction with written narrative (Wawrzynski and Pizzolato 2006; Pizzolato 2007).

I plan to use interviews and students' self-reflections about important decisions to look for shifts away from learning as "formula following" and to assess the characteristics of effective provocative moments. Although I will administer the Self Authorship Survey at the beginning and end of the semester, enrollment is too small to expect any kind of statistical significance. In addition, I will keep a journal of my experience teaching the course as a parallel to the students' reflections on the class. For example, I will make explicit my intent with each written exercise, monitor their levels of stress, identify what I hear in the class discussion as provocative moments as a point of comparison with what they may actually write about, and so on.

How might you make your work available to others in ways that facilitate scholarly critique and review, and that contribute to thought and practice beyond the local?

I imagine several ways of sharing this project: 1) a session within the SJMC or through the Graduate Teaching Program with current graduate students on self-authoring and how elements can be incorporated into the courses they teach; 2) creation of exercises to be included in subsequent civic engagement courses; 3) a published case study of the attempt to create a civic engagement course based on a different set of pedagogical principles; 4) publication of assessment data; and 5) providing a baseline measure for ongoing civic engagement efforts within the school.

What aspects of the design and character of your work are you not yet fully prepared to describe? What questions do you have and what do you still need to know?

I have some concerns about a full course assessment given that this is a new pedagogy for me. If, after further review of the course, a full assessment appears to be premature, I would still like to pursue a more detailed assessment of provocative moments: how can they be designed and managed effectively? Do created moments or naturally occurring moments work better in triggering The Crossroads phase?

Further, I am not clear about the nuances of coding the written narratives. My immediate next step is to digest the assessment studies referenced above more carefully as well as track down other citations.

What is your record of innovation in teaching and/or the assessment of learning?

Design and implementation of a revised curriculum in the undergraduate advertising sequence that included three new courses; several pedagogical articles regarding advertising education as well as invited presentations at the annual teaching workshop; and complete transformation of the courses I teach from lecture format to active learning.

Can you suggest an appropriate coach for your project?

I would like to work with a coach who is well-versed in and committed to the model of self-authorship. I was socialized to the lecture and recitation model. I've progressed to active learning strategies. But I would appreciate a periodic sounding board for the incorporation of self-authorship strategies, particularly the prompts for self-reflective exercises. I know that Alphonse Keasley and Elease Berry Robbins are both knowledgeable and possibly Joan Gabriele as well.

I am able to attend the meetings and will happily serve as a coach/mentor to others.

References

Baxter Magolda, M. 2004. Learning Partnership Model: A framework for promoting self-authorship. In Learning Partnerships: Theory and Models of Practice to Educate for Self-Authorship, edited by M. Baxter Magolda and P. King. Sterling, VA: Stylus.

Baxter Magolda, M. and P. King. 2007. Interview Strategies for Assessing Self-Authorship: Constructing Conversations to Assess Meaning Making. Journal of College Student Development 48 (5): 491-508.

Belenky, M., et al. 1986. Women's Ways of Knowing: The development of self, voice and mind. New York: Basic Books.

Perry, W. G. 1970. Forms of Intellectual and Ethical Development in the College Years. New York: Holt, Rinehart & Winston.

Pizzolato, J. 2003. Developing Self-Authorship: Exploring the Experiences of High-Risk College Students. Journal of College Student Development 44 (6): 797-812.

---. 2005. Creating Crossroads for Self-Authorship: Investigating the Provocative Moment. Journal of College Student Development 46 (6): 624-41.

---. 2007. Assessing Self-Authorship. New Directions for Teaching and Learning 109: 31-42.

Pizzolato, J. and C. C. Ozaki. 2007. Moving Toward Self-Authorship: Investigating Outcomes of Learning Partnerships. Journal of College Student Development 48 (2): 196-214.

Wawrzynski, M. and J. Pizzolato. 2006. Predicting Needs: A Longitudinal Investigation of the Relations between Student Characteristics, Academic Paths, and Self-Authorship. Journal of College Student Development 47 (6): 677-92.

Scott Grabinger Associate Professor Information and Learning Technologies Program University of Colorado at Denver and Health Sciences Center School of Education Campus Box 106, PO Box 173364 Denver, CO 80217-3364 303-556-4364 [email protected]

Instructional Accessibility for Learners with Cognitive Impairments -- Central Issue. Students with mental illness are an underserved population on campus and online. Mental illnesses cause cognitive impairments that affect executive control functions such as concentrating, planning, strategizing, and monitoring progress, which in turn affect learning and quality of work. University disabilities offices (our office is excellent) offer on-campus support including tutors, note takers, quiet rooms, extra time for tests, etc. However, this kind of support is external to the class and not available online. Though useful, these supports do not keep students in the classroom, and such students usually drop online classes. They are labeled "disabled" and not capable of doing "normal" work. This is my issue: it is not necessary to label and isolate students with cognitive impairments to help them learn. We can help students through the instructional strategies used within classes, rather than taking them out of classes, and thereby improve retention and success. The central question is: How can we best support learners with cognitive impairments through instructional strategies, on campus and online?

-- Importance. Unfortunately this is not a rare problem locally or nationally. The National Alliance for the Mentally Ill found that 14% to 27% of college students between the ages of 18 and 24 struggle with various kinds of cognitive impairments caused by depression, ADD, posttraumatic stress, bipolar disorder, and other physical ailments. (In 2004, there were over 17,000,000 postsecondary students; just 10% means 1.7 million learners with impairments.) My own survey of disabilities offices around the country found that requests for services by those with mental illness, especially for bipolar disorder, are climbing rapidly, up to 50% of those needing services. Neurologically, we have a good idea what the problems are; we've done little to deal with them. Work in this field is almost nonexistent.

-- The Agenda. My long-term research agenda is to turn the locus of support for students with impairments 180 degrees, addressing support at the instructional level rather than the institutional level. Instructional-level support helps learners at the time and place needed (in the classroom or within online instruction) rather than at a later time in another place. This just-in-time approach personalizes instruction, minimizes frustration, and encourages persistence, leading to better learning and success. Ultimately I will develop a set of exemplars demonstrating instructional and learning practices to support both learners with cognitive impairments from physical and mental disorders, and faculty. Most importantly, these practices will improve instruction for all students, not just those with disabilities. This concept recognizes that learners fall along a continuum: there is neither a "normal" nor a "disabled" learner; everyone is unique.

-- Cognitive Impairments. We know where to begin because impairments affect five general learning categories: concentration, perception, and attention; long- and short-term memory; conceptual learning; problem solving and reasoning; and metacognition and executive function. Work done by the Center for Applied Special Technology (CAST) on Universal Design for Learning provides the conceptual framework, focusing on the brain's recognition, strategic, and affective networks.

-- Investigation. The first step to improving learning and teaching is to identify those impairments and teaching practices most affected. We need to identify obstacles to both teaching and learning to create best practices for learning and teaching around the brain's recognition, strategic, and affective networks. Considering how new this work is, answering the following big questions will provide a significant national contribution to teaching, learning, and mental illness. Specifically: What are the most common obstacles in utilizing the brain's recognition network? What strategies effectively activate the brain's recognition network among learners with cognitive impairments to encourage concentration, perception, and attention? Graphics? Sound? Pictures? Layout? What obstacles affect the brain's strategic network, including problem solving, reasoning, and strategy selection? How do learners with mental illness select strategies? How can we help them choose strategies for memory, problem solving, writing, and metacognition? What obstacles affect the brain's affective network? What strategies are most useful to take advantage of the affective network and keep cognitively impaired learners engaged?

-- Conducting the Investigation. After discussing this with my coaches, we have decided that the best place to begin is by asking teachers and students about obstacles and best practices. For example, we might ask students with mental illness: "What are some of the difficulties you encounter in class or online caused by your mental illness? What supports have teachers provided within your instruction that have helped you?" We might ask faculty: "In what ways have students with mental illness requested support? Have you changed your teaching in any way after working with learners with mental illness?" This kind of research requires surveys, observation, and interviews of both teachers and students. Descriptive statistics and content analysis will support interpretation. The purpose of this beginning stage of research is to identify a core set of strategies that faculty use, identify factors of success and failure, associate mental illness characteristics with practices and strategies, discover exemplary instructors and their characteristics, and note messages sent by faculty, all related to the cognitive impairments described above. This phase needs refinement in the Scholar's program. My plan is big, so a critical methodological question for my coaches is, "At this phase is it better to focus on students or teachers or both? On on-campus or online or both means of delivery?" The disabilities office will support me in finding students with cognitive impairments for the study. Coaches include Nancy Leech, research methodology; Bonnie Utley, special education and disabilities; and Deanna Sands and the SEHD Research Center. Dissemination plans include publications, presentations, web tools (e.g., blogs, web site, podcasts, etc.), and applications for funding.

-- Innovation. I have a history of innovation. Going back to 1993, I (with Joni Dunlap) formulated the sociocultural construct Rich Environments for Active Learning. REALs synthesized research from constructivist thinking into four guidelines for instruction: intentional, authentic, and generative learning; and continuous assessment. REALs have influenced research around the world. I am also a leader in using problem-based learning. I've published and conducted training sessions in the U.S. as well as in the United Kingdom, Turkey, and elsewhere. I teach online and have found that I use techniques not used by other online professors, very successfully.

-- Scope. Obviously, I have an ambitious agenda that I have created from my recent studies in mental illness and neurophysiology while on sabbatical. With your help, I plan on developing research protocols and conducting the initial exploration of specific successes and obstacles encountered by students with cognitive impairments and faculty within the brain's recognition, strategic, and affective networks.

-- Attendance and Coaching. I can attend meetings and am willing to serve as a coach.

Jean Hertzberg

Associate Professor Department of Mechanical Engineering College of Engineering and Applied Science University of Colorado 427 UCB Boulder, CO 80309-0427 303 492 5092 [email protected] Letter of Proposal

Two courses that I teach, in the same topical area, fluid mechanics, seem to have vastly different impacts on my students' attitudes towards the topic. The two courses are Flow Visualization, and MCEN 3021 Fluid Mechanics. This proposal is to find out whether my perception is accurate.

In the Department of Mechanical Engineering, there are a number of courses on fluid mechanics that teach how fluids (gases and liquids) are analyzed and used for engineering purposes, such as piping systems for domestic and industrial use, turbojet engines for propulsion and power generation, etc. These courses use a highly analytic, mathematical approach. The equations that govern fluid flow are derived, and students practice solving quantitative problems that have a realistic engineering context. This semester I'm teaching MCEN 3021 Fluid Mechanics to 175 students, and I use a traditional lecture format, with the addition of some in-class group problem solving and conceptual clicker questions.

The contrasting course is Flow Visualization, which I've been developing for the past four years in partnership with Prof. Alex Sweetman, of Art and Art History. Flow Visualization (Flow Vis hereafter) is taught to mixed teams of engineering and fine arts photography majors. Rather than solve engineering problems, students are asked to create images of fluid physics for aesthetic purposes – to make art using fluid flow physics.

The Flow Vis course covers significant technical content related to how to make the physics of transparent fluids (e.g., air and water) visible and recordable by photographic means. For example, a dye such as food coloring can be added to water to show flow patterns. Extensive examples are available on the course website, http://www.colorado.edu/MCEN/flowvis.

Flow vis techniques have been used in research on fluid physics throughout history, and are constantly being expanded and adapted to new technologies and research problems. Flow visualization images are used to some extent in all courses on fluid flow, to illustrate various physics phenomena. Using flow visualization for aesthetic purposes is itself not novel: most fluids researchers own a copy of Milton Van Dyke's 1982 "An Album of Fluid Motion", a compendium of particularly compelling flow visualization images. An appreciation for the beauty of fluid flows is acknowledged in an annual juried 'art' exhibition conducted by the American Physical Society Division of Fluid Dynamics. However, the Flow Vis course here at CU has a number of unique aspects. First, students are given very open-ended assignments, which require creativity. Second, unlike other 'art/science' courses where the art students do the art work, and the engineers do the science, here the engineering students are expected to be artists, and create images with impact and intent. The art students are expected to display scientific discipline in the documentation of their results. Another unusual aspect from the engineering students' perspective is that the course is almost entirely non-competitive. Expectations are high while grading is very generous.

Anecdotally, the Flow Vis course has had a tremendous impact on students. On exit surveys, students write comments such as "I'll never ignore the sky again" or "I see examples of flow vis all the time now". Students write me years later, excited about having witnessed a notable example of flow vis in everyday life: clouds, fountains, oil in a puddle, colorful bubbles, wildfires. This never happens with the traditional fluids courses.

We hypothesize that the Flow Vis course has somehow taught students to see, to perceive the fluid flow physics that surrounds them every day. While the traditional courses are packed with relevant engineering examples, students have not demonstrated that they view fluid physics as existing outside of the textbook. In subsequent fluids courses, students demonstrate difficulty in applying the concepts they've learned (and yes, we do some concept inventory testing). In contrast, Flow Vis alumni seem more able to recognize fluid physics, a necessary first step to understanding.

Currently Kara Gray, a graduate Physics Education Research (PER) student in Derek Briggs' EDUC 8804 Measurement in Survey Research course, is beginning development of a survey instrument that can quantify what effect the Flow Vis course has on student perception of fluid physics, and whether this differs from that of students in traditional fluids courses. We plan to administer the survey to students in Flow Vis and MCEN 3021 this semester. Kara plans to analyze the results and revise the survey next semester, again as part of her coursework (her thesis work is on a different topic). PTLC support would enable me to hire another student to continue this work and administer the survey again next AY, in a pre-post format, which will allow quantitative conclusions to be drawn about the impact of Flow Vis vs. 3021. Development of such an attitudinal survey is a lengthy process; with preliminary data, NSF support can be sought.

If the anecdotal evidence is borne out by the survey, a whole host of interesting questions opens up. Does increased perception of fluid physics have an impact on students' ability and/or willingness to deal with fluid physics in engineering practice? For example, would a preliminary flow vis experience enhance learning in a traditional fluids course? What aspect of the flow visualization course is most important? Is the effect related to the way the course is taught, or to the content itself, i.e., does flow visualization have an inherently greater impact on student understanding of, and interest in, fluids than a mathematical approach? Can this approach be extended to other topics? Other questions involve the impact on the art students: is physics less intimidating afterwards?
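The proposal does not commit to a specific pre-post analysis, but one convention common in PER for pre-post surveys is the average normalized gain, g = (post - pre) / (100 - pre). The minimal Python sketch below only illustrates that arithmetic; the scores and section names are hypothetical assumptions, not project data.

    # Hake-style normalized gain for pre/post scores on a 0-100 scale.
    # All numbers below are hypothetical illustrations, not survey results.
    import numpy as np

    def normalized_gain(pre, post):
        pre = np.asarray(pre, dtype=float)
        post = np.asarray(post, dtype=float)
        return (post - pre) / (100.0 - pre)  # undefined if pre == 100

    # Hypothetical pre/post scores for the two course sections
    flow_vis_pre, flow_vis_post = [55, 60, 48], [80, 85, 70]
    mcen_pre, mcen_post = [57, 62, 50], [63, 66, 55]

    print("Flow Vis mean gain:", normalized_gain(flow_vis_pre, flow_vis_post).mean())
    print("MCEN 3021 mean gain:", normalized_gain(mcen_pre, mcen_post).mean())

Averaging the per-student gains keeps the comparison fair even when the two sections start from different pre-survey baselines.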

There are a number of survey instruments on student attitudes towards science, although none are focused on student perception of fluid physics, and thus they contain questions that are not relevant to our context. For example, the Colorado Learning Attitudes about Science Survey (CLASS), developed here at CU Boulder by the PER group, has a number of questions regarding problem-solving attitudes that Flow Vis students would find inappropriate to their experience in the course. However, due to their expertise in this area, any of the PER faculty (or Derek Briggs) would be welcome as mentors for this project. Note that further exploration of the literature in this area would be undertaken as part of the PTLC project. I have found that the nomenclature and concepts of education and cognitive science research represent a formidable barrier, but I can surmount it with appropriate assistance.

There are a number of venues available for dissemination of this work. The community of fluid mechanics instructors has recently begun supporting sessions on fluids education at the annual DFD meeting. The International Symposium on Flow Visualization has also shown interest by including sessions on art and visualization. These are the communities that would be most interested in the results of the survey, and would make most immediate use of the results. The engineering education research (EER) and PER communities both have professional societies, meetings, and peer-reviewed journals where this work would be appropriate.

In closing, I am very willing to attend any required meetings, and to serve as a coach in the future. I have been interested in EER and PER for some time, and this represents my first quantitative foray in this direction.

An Album of Fluid Motion by Milton Van Dyke. Parabolic Press, Stanford, CA, 1982.

A Gallery of Fluid Motion by M. Samimy, K.S. Breuer, L.G. Leal, and P.H. Steen. Cambridge University Press, 2003.

"A new instrument for measuring student beliefs about physics and learning physics: the Colorado Learning Attitudes about Science Survey," published in Physical Review Special Topics - Physics Education Research, January 2006.

Jane Kass-Wolff

Assistant Professor School of Nursing University of Colorado at Denver Health Sciences Center 4200 E. Ninth Ave, C 288 Denver, CO 80262 303-315-1034 [email protected]

Ernestine Kotthoff-Burrell

Assistant Professor School of Nursing University of Colorado at Denver Health Sciences Center 4200 E. Ninth Ave, C 288 Denver, CO 80262 303-315-8234 [email protected]

Central Aim of the Proposal

The purpose of this proposal is to replicate a previous study conducted at the UCDHSC School of Medicine (Barley, Fisher, Dwinell, and White, 2006) to explore whether there is a difference in learning outcomes for students enrolled in a graduate nursing advanced health assessment course taught by Standardized Physical Examination Teaching Associates (SPETAs) versus those taught by advanced practice nursing faculty. This research represents an interprofessional educational and research initiative between faculty at UCDHSC schools of nursing and medicine.

Review of the Literature

Advanced health assessment skills are a critical foundational core competency in the preparation of master's level nurses to become clinically competent in advanced practice nursing roles (e.g., adult, family, pediatric, women's health, and gerontological nurse practitioners, clinical nurse specialists, and nurse midwives) (American Academy of Nurse Practitioners [AANP], 1999; American Association of Colleges in Nursing [AACN], 1996; American College of Nurse-Midwives [ACNM], 1997; Kelley & Kopac, 2007; Gibbons, et al., 2002; Mahoney, 2002). Advanced health assessment builds upon prior basic health assessment skills from undergraduate nursing education to: interview prospective patients; take a comprehensive or a problem focused health history; conduct a complete physical or an appropriate problem focused exam; derive differential diagnoses; and develop and implement an appropriate plan of care to promote health, treat and manage acute and chronic illnesses (National Organization of Nurse Practitioner Faculty [NONPF], 2002). A course in advanced health assessment is a prerequisite for advanced practice clinical experiences in nursing (Hagopian, Gerrity, & Lynaugh, 1990).

Advanced health assessment skills in the graduate nursing programs at UCDHSC and across the nation are traditionally taught by advanced practice nursing faculty who employ a variety of teaching/learning strategies (i.e., lectures, demonstrations of the required skills, video and CD-ROM demonstrations, and/or web-based patient case studies). In recent years, however, UCDHSC nursing faculty have made limited use of SPETAs from the UCDHSC School of Medicine to teach the breast and female gynecological exams and the male genital exam. The inclusion of these two experiences in the course has received overwhelmingly positive reviews from the students as a more acceptable approach to acquiring the requisite knowledge and skills for these exams. Our students, as well as those in other health professional programs, have been reluctant to perform breast, gynecological, and male genital exams on student colleagues. The reasons cited at UCDHSC and in the literature include gender, religious, or cultural beliefs that preclude examining fellow student colleagues (Barley, Fisher, Dwinell, & White, 2006; O'Neill, Larcombe, Duffy, & Dorman, 1998).

The success and acceptance of SPETAs for the limited use described above has led graduate nursing faculty to explore expanded use of SPETAs in teaching advanced health assessment to students in advanced practice. In 2006, Barley, Fisher, Dwinell, and White published the results of a UCDHSC School of Medicine pilot study comparing learning outcomes of medical students taught by SPETAs versus those trained by physician faculty. The results indicated comparable learning outcomes for students taught by SPETAs, and significantly better mean scores on the standardized abdominal exam: SPETA-trained students had a mean score of 88 versus 85.4 for students taught by physician faculty, a difference that was statistically significant (p = .03, SD = 9.4).

Benefits of the Project

Standardized teaching of advanced assessment skills is extremely important not only to graduate nursing education at the UCDHSC School of Nursing but also to graduate advanced practice nursing programs across the United States, for a variety of reasons. First, this proposal has the potential to increase uniformity and consistency by imposing standardized teaching/learning strategies for both medical and graduate nursing students at UCDHSC. Secondly, this proposal can potentially reduce the costs of graduate nursing education at UCDHSC. Research indicates that clinical education is labor intensive and costly to provide because of the faculty salaries associated with teaching skills to small groups of students (Davidson, Duerson, Rather, Pauly & Watson, 2001). Thirdly, standardization of teaching advanced assessment skills has the potential to enhance respect for clinical expertise among health care disciplines, with the recognition that students in all health professional programs acquire advanced health assessment skills using the same standardized approach to teaching and evaluation. Fourth, if this replication pilot project yields comparable or significantly better student learning outcomes, it represents a new and innovative endeavor for graduate nursing education at UCDHSC and across the nation. Although some programs have reported the use of standardized patients, we are not aware of any other national graduate nursing programs that use SPETAs for advanced health assessment. Lastly, with the consolidation of the health sciences center on the new Aurora campus and use of the innovative CAPE center's state-of-the-art training facilities, interprofessional research between nursing and medicine will continue to increase, improving the provision of health care.

Study Design

Pending Institutional Review Board (IRB) approval, this pilot study replicates and extends the previous pilot study (Barley, Fisher, Dwinell, & White, 2006) in which SPETAs are trained and instructed to use their bodies to teach physical examination skills to small groups of students. Students in the study group receive immediate feedback from SPETAs on their skills and practice the skills until they are done correctly. Since all SPETAs receive the same training and instruction, they are less likely to be biased in favor of individualized adaptation or to modify the exam (Barley, Fisher, Dwinell, & White, 2006). This study will employ a convenience sample of graduate nursing students enrolled in two sections of advanced health assessment (anticipated enrollment = 30 students). Students will be randomly assigned either to the traditional faculty-led small-group lecture and faculty-run demonstration groups or to SPETAs. In the control group, students are expected to practice physical examination skills on each other, whereas those in the study group are assigned to SPETAs and conduct the exam on the SPETA. One exception to this approach is that all enrolled students will participate in SPETA-conducted groups for acquiring the knowledge and skills to conduct the breast and the female and male genital exams.

Data Collection and Analysis

At the completion of the semester, SPETAs will use standardized performance checklists to evaluate both the study and control groups' skills on four standardized physical examination techniques (pulmonary, cardiovascular, abdominal, and musculoskeletal exams). The instrument rating scale consists of a score of 0 = not performed, 0.5 points = performed incorrectly, and 1 point = performed correctly. Students' skills at the end of the semester will be evaluated either by conducting the exam on the SPETA, who will assess the student's skill, or, on more extensive exams, by two SPETAs (e.g., one SPETA serves as the patient, and another SPETA observes and evaluates the student's performance). All scores will be entered into the Statistical Package for the Social Sciences (SPSS). The aims are: 1) determine the percentage of correct scores for the control (faculty-taught) and SPETA-taught groups for five portions of the physical examination: vital sign measurement, upper and lower extremity musculoskeletal exam, cardiovascular exam, pulmonary exam, and abdominal exam; 2) compare the percentage of correct scores on the five exams between the faculty-taught group and the SPETA-taught group; and 3) determine student satisfaction with each specific method of training.
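As an illustration of the arithmetic behind aims 1 and 2 (the proposal specifies SPSS; the Python below is only an equivalent sketch, and every checklist value in it is hypothetical), each student's percent-correct score can be computed from the 0 / 0.5 / 1 item ratings and the two groups compared with an independent-samples t-test:

    # Hypothetical checklists: rows are students, columns are checklist items
    # scored 0 (not performed), 0.5 (performed incorrectly), or 1 (correct).
    import numpy as np
    from scipy import stats

    faculty_group = np.array([[1, 0.5, 1, 0, 1],
                              [1, 1, 0.5, 1, 0.5],
                              [0.5, 1, 1, 1, 0]])
    speta_group = np.array([[1, 1, 1, 0.5, 1],
                            [1, 1, 1, 1, 0.5],
                            [1, 0.5, 1, 1, 0.5]])

    # Percent correct per student = mean item score * 100
    faculty_pct = faculty_group.mean(axis=1) * 100
    speta_pct = speta_group.mean(axis=1) * 100

    # Independent-samples t-test comparing the two teaching conditions
    t, p = stats.ttest_ind(speta_pct, faculty_pct)
    print(f"SPETA {speta_pct.mean():.1f}% vs faculty {faculty_pct.mean():.1f}% (t={t:.2f}, p={p:.3f})")

With the anticipated enrollment of 30, the same computation simply runs over 15 or so students per condition and one checklist per exam.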

Prior Innovation in Teaching

Dr. Kotthoff-Burrell was a co-principal investigator and project manager for a demonstration project that involved the development, implementation, and evaluation of an interdisciplinary geriatric course that included UCDHSC graduate nursing, dental, and pharmacy students, as well as graduate law and social work students from the University of Denver. This project was one of eight sites across the U.S. sponsored by the John A. Hartford Foundation. It was the only demonstration site that included law and dentistry. Dr. Kotthoff-Burrell and project faculty developed interdisciplinary case studies, published an interdisciplinary team training manual, produced an interdisciplinary team training video for assessing older adults with cognitive impairment, and authored a chapter in a book on Ethical Decision-Making in the Care of Older Adults. Dr. Jane Kass-Wolff joined the faculty at the UCDHSC School of Nursing in 2006. She has taught the advanced assessment course for the master's advanced practice students since her arrival. Consistency and uniformity of training of advanced practice students has been an overriding focus in her teaching of this course. Her vision of standardized patients and teaching assistants in the development of core competency skills in advanced assessment led to the development of this proposal.

Suggested Coach and Mentor

Dr. Gwyn Barley of the Center for Advancing Professional Excellence (CAPE) at the UCDHSC School of Medicine has agreed to be our coach and mentor for this project. Both Drs. Jane Kass-Wolff and Ernestine Kotthoff-Burrell are able to participate in the required meetings and are willing to serve as coaches for PTLC in the future.

American Academy of Nurse Practitioners. (1999). Position statement on nurse practitioner education. Austin, TX: Author.

American Association of Colleges in Nursing. (1996). The essentials of master's education for advanced practice nursing. Washington, DC: Author.

American College of Nurse-Midwives. (1997). The core competencies for basic nurse- midwifery practice. Washington, DC: Author.

Barley, G., Fisher, J., Dwinell, B., & White, K. (2006). Teaching foundational physical examination skills: Study results comparing lay teaching associates and physician instructors. Academic Medicine, 81(10), S95-S97.

Gibbons, G., Adamo, D., Padden, R., Riccardi, M., Graziano, M., & Levine, E. (2002). Clinical evaluation in advanced practice nursing education: Using standardized patients in health assessment. Journal of Nursing Education, 41, 215-221.

Hagopian, P., Gerrity, P., & Lynaugh, J. (1990). Assessment preparation for nurse practitioners. Nursing Outlook, 38, 272-274.

Inman, C. (2003). Providing a culture of learning for advanced practice students undertaking a masters degree. In P. McGee & G. Castledine (Eds.), Advanced nursing practice (2nd ed., pp. 73-84). Oxford: Blackwell.

Kelley, F.J., & Kopac, C. (2001). Advanced health assessment in nurse practitioner programs. Journal of Professional Nursing, 17, 218-225.

Mahoney, J. (2002). Improving the advanced health assessment course. Nurse Practitioner, 27, 85-86.

National Organization of Nurse Practitioner Faculty. (2002). Nurse practitioner primary care competencies in specialty areas: Adult, family, gerontological, pediatric, and women's health. Washington, DC: Author.

O'Neill, P.A., Larcombe, C., Duffy, K., & Dorman, T.L. (1998). Medical students' willingness and reactions to learn basic skills through examining fellow students. Medical Teacher, 20, 433-437.

Pulcini, J., & Marion, L. (2000). Nurse practitioner education in the new millennium: Challenges and opportunities. Nursing Outlook, 48(3), 107-108.

Yvonne Kellar-Guenther

Assistant Professor Communication & Evaluation University of Colorado at Denver and Health Sciences Center 4200 E. 9th Avenue, C-230 Denver, CO 80262 303-315-1945 [email protected]

Central question, issue, or problem & Literature Review

UCDHSC, as with many organizations, utilizes trainings to provide education for its staff. Education through training, however, is only useful if the student is able to transfer what they learn to their job. Kontoghiorghes (2004) stated that training investments often fail to deliver the desired and expected outcome. This is because it is difficult to transfer what is learned in the training environment to the workplace environment (Subedi, 2006). This transfer of learning between the two environments is called training transfer.

The question I would like to study is the efficacy of training, especially a short training, to teach staff a new skill. Specifically, I would like to evaluate the impact of a two-hour educational training on how to write at lower reading levels. I currently teach a two-hour course on how to write consent forms and recruitment material at lower reading levels. To date, approximately 100 professionals have taken the course. Attendance is voluntary and free. During the course, there are a couple of hands-on exercises and, as a class, we rewrite a paragraph that one of the attendees submitted. While students have evaluated the course highly, I am not sure what the long-term impact of the course is. While students are allowed to request TA after the course, none have. Are these students able to re-write their materials at lower reading levels? There are many articles that stress the importance of writing research materials, specifically consents, at lower reading levels (see Raich, Plomer, and Coyne, 2001 for a review).

There is literature on how to maximize training transfer so that students are more likely to retain information. The most widely used model is by Baldwin and Ford (1988). In their model of the transfer process, Baldwin and Ford hypothesized that several pedagogical principles used during training, such as teaching the theory behind the practice and providing time for hands-on learning, will maximize training transfer. They do not, however, test their model. What the training research shows is that it is difficult to evaluate the efficacy of this form of education (Santos & Stuart, 2003). The most typical evaluation is the use of evaluation forms collected at the end of the training session (Bramley, 1996). At this point, however, the student is not sure if they have actually learned the skill. Alliger and Janak (1989) reported that good reactions at the end of a training were not correlated with the student having learned anything during the training. A participant in Santos and Stuart's study said, "During a training course, everything makes sense. But after training, you go back to the office and realize that it is difficult to apply what you learned to the real job" (p. 37). As a result, Santos and Stuart (2003) recommend that an evaluation be done 6 months after the end of the program.

There have been some studies that have looked at retention or training transfer a few months after training. All of these, however, have relied on student self-assessments of their own learning. I would like to use an objective measure to determine if the students in the training did learn how to write at lower reading levels.

Importance of the Problem & Who Might Benefit from or Build on your Findings

First, there are many courses at UCDHSC that are short training sessions. Currently, I am leading a group that is designing 24 hours of training for clinical trial coordinators and PIs. A majority of this curriculum is training sessions. None of the 24 hours, however, includes follow-up, and neither does the consent writing course. While the literature on training transfer does identify what can be done during the training to promote training transfer, the literature states that other factors, such as the job environment and maintenance of the learned material over time on the job, are also key (Baldwin & Ford, 1988). It would be helpful to know if this approach works or if another approach should be considered to teach these groups.

In addition to understanding how to evaluate training, the content taught in this training is important. COMIRB is now requiring investigators to have consent forms written at lower reading levels. Those who have taken the course should be able to do this and should be able to go through the COMIRB process more quickly. This would save the institution money and the investigators should be less frustrated with the COMIRB submission process.

Methods and Dissemination

First, I would like to look at students in the course and a COMIRB submission from before the class and, if possible, a few submissions from after taking the class. If possible, I could use the layperson's evaluation (given during COMIRB panel meetings) to see which is clearer. This would include analyzing the qualitative responses given (these are written). I would need permission from COMIRB to do this. If I could not get this, I could get copies of the different consent forms, with permission from the authors, and have an independent party rate them. The rater would be blind to which author had written which forms and when the forms were written. I would need to develop a scale to rate the consent forms. I would most likely run a t-test to look for a significant difference between pre and post consents. This analysis would be unique in that most of the training literature relies on the participant's self-report for assessing the effectiveness of the training. This is even true when the articles ask several months post training.
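The proposal leaves the objective measure open; one candidate, offered here only as an assumption, is a standard readability formula such as the Flesch-Kincaid grade level. The Python sketch below uses a crude vowel-group syllable heuristic, and the pre/post grade levels fed to the paired t-test are hypothetical, not study data.

    # Flesch-Kincaid grade = 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59
    # Syllables are approximated by counting vowel groups, a rough heuristic.
    import re
    from scipy import stats

    def syllables(word):
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def fk_grade(text):
        sentences = max(1, len(re.findall(r"[.!?]+", text)))
        words = re.findall(r"[A-Za-z']+", text)
        if not words:
            return 0.0
        return (0.39 * len(words) / sentences
                + 11.8 * sum(map(syllables, words)) / len(words) - 15.59)

    # Hypothetical grade levels for each author's consent form, before and after the class
    pre = [14.2, 13.1, 15.0, 12.4]
    post = [9.8, 8.9, 10.5, 9.1]
    t, p = stats.ttest_rel(pre, post)  # paired t-test: same authors pre and post
    print(f"t = {t:.2f}, p = {p:.4f}")

Because each author contributes both a pre and a post form, the paired version of the t-test is the natural fit for the pre/post comparison described above.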

Second, I'd like to ask past students to rate whether they feel they were more effective at writing at lower reading levels. This approach is more consistent with past approaches published in the training literature. I would need to construct a questionnaire that looked not only at their ratings of effectiveness but also at how many consents they write, years of experience prior to the course, and other demographics, to see if there is a difference in perceived effectiveness. I would also include questions about their work environment that have been identified in the training literature as key to training transfer. I would first look at the items used by Kontoghiorghes (2004) and Kirwan and Birchall (2006) to see if any work to assess the perceived training transfer of this course.

To analyze the self-report data, I think I would run a multiple regression to see which variables might explain the variance in perceived effectiveness and actual effectiveness (using the ratings from the actual consent ratings described above). Again, the evaluations have been high, but those are collected immediately following the class. I have a roster for the classes I have taught. Also, I'll start teaching this course again in January.
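To make the analysis concrete, here is a minimal sketch of the kind of multiple regression described, assuming hypothetical predictors (years of experience, consents written per year, and a work-environment support rating) and a 1-5 perceived-effectiveness outcome; statsmodels' ordinary least squares simply stands in for whatever statistical package is ultimately used.

    # Multiple regression on hypothetical questionnaire data; no real responses are used.
    import numpy as np
    import statsmodels.api as sm

    years_experience = [2, 5, 10, 1, 7, 3, 8, 4]
    consents_per_year = [4, 12, 20, 2, 15, 6, 18, 9]
    env_support = [3, 4, 5, 2, 4, 3, 5, 3]       # 1-5 work-environment support
    perceived_effect = [3, 4, 5, 2, 5, 3, 4, 3]  # 1-5 self-rated effectiveness

    # Stack predictors into a design matrix and add an intercept column
    X = sm.add_constant(np.column_stack([years_experience, consents_per_year, env_support]))
    fit = sm.OLS(perceived_effect, X).fit()
    print(fit.params)     # intercept and one coefficient per predictor
    print(fit.rsquared)   # share of variance in perceived effectiveness explained

The same design matrix could be reused with the objective consent-form ratings as the outcome, allowing perceived and actual effectiveness to be compared predictor by predictor.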

As for making my work available, I would share my findings with COMIRB and the Office of Regulatory Compliance. In addition, I am going to direct the Clinical Trial Training for Investigators and Coordinators (CTTIC) which includes several courses with similar approaches. These courses would be revised based upon what I learn. Finally, I'd be interested in publishing my findings but would need assistance finding the correct journal.

What Assistance is Needed?

First, I know very little about the pedagogy and training literature. I'd also like to read more on how to measure long-term impact of application of knowledge.

Next, I would want assistance with designing and carrying out the evaluation. Based on what Baldwin and Ford (1988) report, my course utilizes the key training design factors that enhance training transfer. These factors are still referred to in current articles (e.g., Klein, Noe, & Wang, 2006; Kontoghiorghes, 2004) as key pieces of training transfer. There are many potentially confounding variables that would need to be minimized, however. It would be helpful to talk with someone who has experience evaluating the efficacy of teaching or training. The supportive and collaborative structure of the PTLC will be very helpful in designing and implementing the evaluation.

I would also appreciate input on the analyses used to test my research questions.

Record of Teaching Innovation

I have received two teaching awards: the Golden Apple and the Wayne Thompson Professorship. I have also always scored highly on my teaching evaluations. Prior to coming to UCDHSC, my primary duties were teaching. As part of my position here, I have designed and delivered many trainings to research professionals. I love teaching, and training allows me to continue doing that. Also, as stated earlier, the training helps UCDHSC provide professional development and, hopefully, increase the skills of UCDHSC personnel.

Commitment to and from PTLC

I do not have a specific coach in mind but I would appreciate working with someone who has a background in educational evaluation and a strong knowledge of the pedagogy literature. It would be ideal if I could be mentored by someone who has studied adult education.

Also, I am able to attend the required PTLC meetings as specified above and would be happy to be a mentor/coach in a future year.

References

Baldwin, T.T., & Ford, J.K. (1988). Transfer of training: A review and directions for future research. Personnel Psychology, 41, 63-105.

Chiaburu, D.S., & Marinova, S.V. (2005). What predicts skill transfer? An exploratory study of goal orientation, training self-efficacy and organizational supports. International Journal of Training and Development, 9(2), 110-123.

Kirwan, C., & Birchall, D. (2006). Transfer of learning from management development programmes: Testing the Holton model. International Journal of Training and Development, 10(4), 252-268.

Klein, H.J., Noe, R.A., & Wang, C. (2006). Motivation to learn and course outcomes: The impact of delivery mode, learning goal orientation, and perceived barriers and enablers. Personnel Psychology, 59, 665-702.

Kontoghiorghes, C. (2004). Reconceptualizing the learning transfer conceptual framework: Empirical validation of a new systemic model. International Journal of Training and Development, 8(3), 210-221.

Santos, A., & Stuart, M. (2003). Employee perceptions and their influence on training effectiveness. Human Resource Management Journal, 13(1), 27-45.

Subedi, B.S. (2006). Cultural factors and beliefs influencing the transfer of training. International Journal of Training and Development, 10(2), 88-97.

Mary Klages

Associate Professor English Department University of Colorado, Boulder 226 UCB Boulder, CO 80309-0226 303 492 7891 (office) 303 641 2602 (cell) [email protected]

PROPOSAL

In 1993 the English Department at CU decided to offer an undergraduate course in literary theory for the first time. Professor Kelly Hurley and I designed and taught the first section of the course, which introduced students to the writings and thought of the poststructuralist, feminist, Marxist, and postmodern theorists whose work had permeated literary studies in the 1980s. Since then the course has become a required part of the English major, standing as one of two "gateway" courses designed to provide students in their freshman or sophomore year with the skills and concepts necessary for advanced work in literary studies.

Faculty insist that the course makes a significant difference in student performance in upper division English courses: most find that the students who have taken the theory course can work at a higher, more sophisticated level of literary and cultural analysis than those students who have not taken it. Students, however, often delay taking the course until their senior year, having heard that it is extremely difficult, covering highly abstract material from a variety of non-literary disciplines, including philosophy, linguistics, anthropology, and political science.

The main question for my investigation revolves around this discrepancy. I wish to find out, on the one hand, why the faculty think that the theory course makes such a difference in student performance: what makes students better? Is that improvement due to the specific course content (which varies by faculty member teaching the course) or to the particular cognitive and metacognitive ways of thinking the course fosters, regardless of specific content? How can we assess the kind and degree of learning that comes from this required course? On the other hand, I also want to find out what the students think the theory course is and does. Why do students put it off until late in the major? What do they think they learn from the course when they do take it? Do they share the faculty's conception that the theory course contributes significantly to a student's ability to perform well in upper division literature courses, and if so, why?

These questions, it seems to me, are central to the design of the English major (which the English department will be re-thinking in AY 08-09). We want our students to move from "novice" forms of knowledge within the discipline to "expert" forms of knowledge and ways of thinking, and to design a major to do this we need to articulate what "novice," "intermediate," and "expert" modes of thought are in relation to our particular department. More simply put, we need to articulate what it is we want our students to learn in the course of their English major. And we need to find out what our students think they are here to learn. Beginning students in particular are often mystified by the difference between their own levels of thought and those modeled by the faculty; they also have trouble piecing the various requirements of the major together into some kind of coherent whole. If we want to foster our students' growth from "novice" to "expert" learning, we need to know what conceptions and reservations the students bring to the learning process as well.

My investigation has already begun; I have been teaching the theory class consistently for 14 years, and have gathered a fair amount of feedback from students about their expectations, fears, and experiences with the course. I want to continue my investigation through two methods of data collection. I will interview all 60 of the current English department faculty to ask whether they share the perception that the theory course makes a significant difference in student performance, from "novice" to "expert," and if so, why they think that is the case. I will also ask permission of those faculty who teach the theory class (myself included) to distribute an entrance survey to the students, asking what they think the theory course is about and what they will learn from it, followed by a midterm survey asking for their impressions of the course, and concluding with an exit survey, distributed at the end of the course, asking what they learned and how that learning has affected their study of literature. I may also ask permission to survey a sample of senior-level seminars to ask students how the theory course affected their literary study several semesters after they have taken the course. I would hope to write a "documentary" style article, letting the faculty and students speak for themselves as I analyze and synthesize their comments.

My investigation is grounded in the material I read during the FTEP Classroom Learning Assessment Institute held in June of 2006. I was particularly influenced by a presentation given by Lorrie Shepard, of the CU School of Education, who addressed what makes the difference between "novice" and "expert" learners. The presentation by Kathy Perkins, Wendy Adams, Carl Wieman, Steve Pollock, Noah Finkelstein, Kara Gray and Eli Quinn, of the CU Physics Dept., on "Attending to More than Content Mastery: Assessing Student Attitudes and Beliefs in our Classrooms" gave me a model for understanding how student pre- and misconceptions about the kind of work they are being asked to do affect classroom learning situations. Randy Bass of Georgetown University gave a presentation on "Making Learning Visible" which also provided me with a model for understanding how students go about learning the material we set out for them; I have been able to videotape some "think-alouds," based on Bass's model, from an American literature course I taught last spring, and may be able to use this method of information gathering with the theory course. I have subsequently read a number of works suggested by the Institute, including How People Learn (Bransford, Brown, and Cocking 1999); Randy Bass, "The Scholarship of Teaching: What's the Problem?" Inventio Vol. 1, No. 1, February 1999; and Randy Bass and Dan Bernstein, "The Scholarship of Teaching and Learning," Academe, July-August 2005.

The PTLC application materials ask me to include in this proposal a discussion of aspects of the investigation I'm not yet fully prepared to describe, as well as what I still need to know. I'm not sure I know what I don't know yet. I would hope that my work with a mentor (Marty Bickman comes immediately to mind) would help illuminate the pieces I can't quite see now.

I have been working with the scholarship of learning for a number of years, and have published two articles in the field: one on using email technology in a literature class, the other on the problem of affective learning in the theory course. In addition to pioneering the theory course, I have worked on redesigning the entire English major twice, once as a committee member and once as the Associate Chair for Undergraduate Studies in the English Department. I have won two University-wide teaching awards, largely based on my numerous evolutions of the theory course. I have also been an FTEP faculty associate, serving as a consultant to other CU faculty on improving their teaching, for the past 5 years. In 2006 I published a book on literary theory designed for use in literature classrooms.

I will be able to attend the required meetings, and am willing to serve as a PTLC coach in a future year.

Suzanne P. MacAulay

Chair and Associate Professor, Department of Visual and Performing Arts Art History and Ethnography University of Colorado at Colorado Springs 1420 Austin Bluffs Parkway Post Office Box 7150 Colorado Springs, Colorado 80933-7150 719.262.3865 [email protected]

Question: What is the effectiveness of the Classroom Assessment Technique known as Annotated Portfolios in fostering students' critical awareness of artistic and intellectual processes and in determining student engagement or investment in interdisciplinary learning? The Annotated Portfolio is created by the student as a type of conceptual "call and response" to course content and objectives. It contains a limited number of examples of creative work (textual or visual) accompanied by explanations of how that work relates to course content and goals.

Background: Interdisciplinary departments such as the Department of Visual and Performing Arts (VAPA) at UCCS, which encompass multiple perspectives from fields ranging from the theoretical to the applied, are challenged to discover assessment techniques both consonant with program goals and revelatory of students' capacities to reflect on their own creativity and skills of self-evaluation. The faculty of the Visual & Performing Arts Department (music, theatre, film studies, art, and art history) continues to develop the core curriculum for this interdisciplinary major. During this process the following content-based objectives have been identified: (1) application of interpretive and critical skills across VAPA disciplines, which (for students) necessitates approaching visual and performing arts from multiple perspectives and perceiving associations and constructing relationships among disciplines in new ways; and (2) ability to collaborate creatively with VAPA peers by pooling students' intellectual and creative skills in order to produce original works.

Goals: These assessable goals are embedded in the VAPA core courses to date. It appears that implementing the Annotated Portfolio as an assessment technique would help faculty measure how effectively students are applying their interpretive skills in this multidisciplinary milieu and whether they perceive novel associations among disciplines and can create new and unique relationships, conceptually, thematically, etc. The student's creation of the Annotated Portfolio could also be the initial step toward understanding the potential and the value of being able to collaborate creatively with one's VAPA peers. In terms of scholarship on interdisciplinary education, particularly its methods and assessment practices, the application of the Annotated Portfolios assessment technique offers a vehicle to investigate the real or fundamental efficacy of interdisciplinary departments such as the Visual and Performing Arts Department at UCCS.

Plan: I intend to begin simply by adapting a version of the Annotated Portfolio to my required entry-level VAPA core course, VAPA 100, The Ethnography of Performing Arts. This course is the first step of the VAPA core curriculum and has an interesting mixture of students from all program areas. In addition to classroom lectures, workshops, and exercises, it also has experiential components, such as a modicum of fieldwork, all of which should inspire fairly diverse material for Annotated Portfolios. All of my non-Western art history courses (e.g., Islamic Arts, Art of Japan, South Pacific Art, etc.), which primarily attract art history and visual arts majors, also offer rich possibilities for assigning the Annotated Portfolios. Since these courses are not within the VAPA core curriculum, they could provide comparative data.

Methodology: Students will be asked to respond to a topic, theme, question or problem (e.g. performativity as social behavior) through creative work (images, poetry, field notes with or without photos, etc.). They should create two or three samples. Reflecting on this work, students will be asked to write succinct explanations of how their pieces address the selected topic, theme, question, etc. These projects will be presented in portfolios, folders, notebooks, etc. The number of these assignments per semester depends on how much time is involved in analyzing and critiquing student portfolios.

Analysis of Data: Assess: (1) students' creative and interpretive abilities, capacity for reflection, critical skills, and innovative strategies in addressing assigned topics and questions, and (2) the "quality of synthesis apparent in the annotations." Evaluate students' ability to synthesize course content and goals with their own work.

- N.B. One of my objectives in proposing this project is to get help in developing an instrument for effective assessment of the Annotated Portfolios.

Dissemination of Results: The findings of such a project, which enhances interdisciplinary teaching and learning, would be of interest to a large sector of the academic community. This type of investigation could be published in journals concerned with qualitative pedagogy, interdisciplinary methodologies, and creative and experiential teaching and learning strategies, as well as presented at conferences and workshops addressing the same concerns.

Literature Review: The Annotated Portfolios is presented as Classroom Assessment Technique #18 in Thomas A. Angelo and K. Patricia Cross. 1993. Classroom Assessment Techniques: A Handbook for College Teachers. Second Edition. San Francisco: Jossey-Bass Inc. Publishers. Its adaptability for fine arts as well as performing arts makes it suitable for assessment of VAPA curricula as well as other interdisciplinary programs. This technique can also measure the change and growth of student awareness and intellectual development as the semester progresses.

- Charles L. Briggs. 1986. Learning How to Ask. Cambridge: Cambridge University Press. This book is a critique of how types of questions determine types of answers and is valuable in establishing a classroom climate of investigative collaboration of 'seekers' through inquiry – inclusive of professors and students.

- Robert M. Diamond. 1998. Designing & Assessing Courses & Curricula. San Francisco: Jossey-Bass Inc. Publishers. Diamond's book is a touchstone for curricular development and assessment; the proposed Annotated Portfolios project follows it in creating an assessment instrument focused on an assessable goal.

- Carol Dweck. 1999. Self-theories: Their role in motivation, personality, and development. Philadelphia: Psychology Press. Applicable to interpreting and analyzing the process of student self-evaluation, which is intrinsic to the Annotated Portfolios project.

Design of project: Basically, I am initiating this as an individual project before experimenting with it and incorporating it into the VAPA core curriculum. At this point, it is difficult to predict its applicability and effectiveness throughout such a heterogeneous program as VAPA. I am also interested in its potential to be an instrument which tracks substantive change in student intellectual understanding.

Innovation in teaching: I have always taught in an interdisciplinary manner, incorporating experiential elements and devising ways that students can assume responsibility for their learning. The "Ethnography of Performing Arts" (performing arts viewed through an anthropological lens) is my latest invention and was created to answer the need for an entrance into the VAPA major that is inclusive, experimental, and experiential. I have also created an art history curriculum, "The Pacific Rim," which is based on a dialogic model where the issues and challenges of certain areas (Asia, North America, etc.) are compared to those of other regions (Oceania, Mesoamerica, etc.) by means of a cross-cultural dialogue. Every other year I co-teach a composite course in ethnographic textiles (my contribution) and contemporary fiber art (my colleague's domain).

Coach: Someone with expertise in interdisciplinary courses, ideally with grounding in both the theory and practice of the visual and performing arts.

- I am able to attend required meetings and serve as a future PTLC coach.

Stefanie Mollborn

Assistant Professor of Sociology and Faculty, Health and Society Program, Institute of Behavioral Science UCB 483, 1416 Broadway Boulder, CO 80309-0483 (303) 735-3796 [email protected]

Having taught both small, seminar-style and large, lecture-style courses in Sociology, I have long been frustrated by the difficulties involved in making students' learning "stick" when using lectures (Boud 1981; Pintrich 1994). When I arrived at CU to begin work as an assistant professor in 2006, I was already enthusiastic about the prospect of using student response systems, or "clickers," in my larger classes. I attended the FTEP sessions on clicker pedagogy in preparation for using iClickers in my course on Sex, Gender, and Society in Spring 2007 (I was the first instructor or professor in Sociology to fully integrate clickers into a course). These FTEP sessions taught the standard style of pedagogy used in the science classes that paved the way for clicker use in the classroom, which involves short lecture periods broken up by "ConcepTests" that apply scientific concepts using quiz-style clicker questions (Mazur 1997). Students work together in small groups to try to solve the problem and correctly answer the ConcepTests. An exemplary application of "problem-based learning" (Cooper and Robinson 2000), this technique has been shown to improve students' learning in science classes (Asirvatham 2005; Duncan 2005).

When preparing my first class session using clickers, I followed this convention and wrote several questions involving the correct application of sociological theories to new situations. The students worked hard at answering the questions but seemed intensely focused on getting the correct answer, at the expense of thinking critically about the concepts they were applying. After this experience, I realized that the learning goals of a science class may differ from those in the social sciences and humanities. Constructivist approaches to learning suggest that students should regularly practice applying, evaluating, and critiquing course concepts in the classroom (Anderson 1987; von Glasersfeld 1992). Most conventional ConcepTests focus only on the first of these practices. For clickers to work well in my sociology class, I needed to develop new types of clicker questions in order to meet each of what I consider to be the learning goals of my discipline (as well as many other social sciences and humanities): (1) understanding and applying concepts and theories, (2) critically evaluating them, and (3) using them to understand the life experiences of self and others.

While conventional ConcepTest-like questions address the first learning goal, I developed new pedagogical techniques to address the second and third. Specifically, I began using clicker questions that became known to the students as critical thinking questions and past experience questions. For example, in a unit on gender and the household division of labor the students first collaborate as a group to create a definition of housework that includes "invisible" types of labor. Then, I ask "past experience" clicker questions in which students report who does more housework in their families, as well as who works more hours when paid work and housework are combined. Showing the distribution of students' responses to these clicker questions leads to a group discussion of the various factors influencing the household division of labor in their families. After students identify families trying to maximize their income as one of these factors, I then lecture for 10 minutes on human capital explanations for who does the housework in a family. Next, I ask a clicker question that tests their understanding of this theory, following the science model. Finally, I ask a "critical thinking" clicker question, such as: How useful do you think human capital explanations are for understanding who does the housework in a family? The ensuing group discussion links back to the past experience questions and points to weaknesses in the human capital explanation, which leads to the next mini-lecture on cultural explanations, followed by more concept tests and/or critical thinking questions. This pedagogical model addresses each of the three learning goals identified above through the use of three types of clicker questions.

In the project I am proposing here, I plan to investigate students' perceptions of how each of these three clicker-based instructional techniques affects their learning and motivation. Data collection is planned for Fall 2007 and Spring 2008 using the same course, so if students' fall responses recommend amendments to the techniques I can implement them in the spring. If students state that the combination of three types of clicker questions facilitates their learning and/or boosts motivation, then this pedagogical model can be of use to other instructors in the social sciences and humanities, at CU and elsewhere.

I plan to collect data from students using a mixed-methods approach that combines optional, anonymous surveys, face-to-face interviews, and optional freewrites. Angel Hoekstra, a Ph.D. student in Sociology who is writing her dissertation on clickers, has agreed to assist with data collection and intends to include some of the data that is not specifically related to my research question in her dissertation. Human subjects approval of this study is pending. The use of clicker technology will allow me to collect survey data from the students, and their responses to any individual question are never linked to their identities. I cannot disaggregate the data case by case, but histograms will provide useful information about the distribution of responses. The planned survey is attached below; the last questions about the usefulness of different types of clicker questions are of the most interest to this project. We plan to conduct 10 to 15 face-to-face interviews with students who have been randomly selected from my courses (so that students with strong feelings about clickers do not self-select into the sample by volunteering) and recruited via email. Interviews are expected to last between 30 and 45 minutes, and students will be paid $10 for their participation. Interview questions will explore students' perceptions of the effects of different types of clicker questions on their learning in more depth. Finally, this fall students will be offered the option of doing two-minute anonymous freewrites in recitation about the use of clickers. One freewrite addresses clicker use in my class generally, and the other asks about the pros and cons of different types of questions. In combination, these data should allow for a multifaceted understanding of students' responses to different clicker techniques.
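To make concrete how the anonymous, aggregate-only survey data could be summarized, here is a minimal sketch in Python. The question wording, response codes, and sample responses are all invented for illustration; the actual instrument is the attached survey, and a real analysis would read the response counts exported from the clicker software rather than a hand-typed list.

```python
from collections import Counter

# Hypothetical 5-point scale for a survey item such as "How useful were the
# critical thinking questions for your learning?" (codes are assumptions,
# not the actual survey's).
SCALE = {"A": "Very useful", "B": "Useful", "C": "Neutral",
         "D": "Not very useful", "E": "Not useful at all"}

def summarize(responses):
    """Tally anonymous responses into counts and percentages per option."""
    counts = Counter(responses)
    total = sum(counts.values())
    return [(SCALE[code], counts[code], 100.0 * counts[code] / total)
            for code in SCALE]

# Invented responses from one class session; identities are never recorded,
# so only this aggregate distribution is ever available.
responses = ["A", "A", "B", "B", "B", "C", "A", "B", "D"]

for label, n, pct in summarize(responses):
    print(f"{label:18s} {n:3d} ({pct:5.1f}%) {'#' * n}")
```

The text-bar output at the end mimics the histogram view the students themselves see when the distribution is displayed in class.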

All data will be collected by the end of Spring 2008, and the interviews should be transcribed by about the same time. With Angel as a second author, I plan to draft an article outlining the pedagogical techniques and students' responses to them during the summer and submit it to Teaching Sociology, the premier journal of its type, in fall of 2008. Along the way I hope to present these findings to colleagues at CU and at a professional conference to get feedback and share this model of clicker-based instruction.

Considering my short tenure at CU as an assistant professor, I have a strong record of innovation in teaching. I have created two new sociology courses here and have pioneered the full integration of clicker technology into Sociology classrooms at CU. Several instructors in Sociology have since begun using clickers, and I conducted a training session on their use in Sociology classes over the summer. I am not aware of an appropriate coach for my project, but I am very flexible and eager to work with someone who can provide an outside perspective on my project. I would be happy to serve as a PTLC coach in the future and look forward to the prospect of being involved with PTLC and attending its regular meetings.

I appreciate your consideration of this proposal. Please do not hesitate to contact me if you would like any further information.

Respectfully,

Stefanie Mollborn

References

Anderson, C.W. 1987. Strategic Teaching in Science. In Strategic Teaching and Learning: Cognitive Instruction in the Content Areas, 73-91. Alexandria, VA: Association for Supervision and Curriculum Development.

Asirvatham, M.R. 2005. "IR Clickers and ConcepTests: Engaging Students in the Classroom." CONFCHEM 2005 Winter On-line Conference: Trends and New Ideas in Chemical Education. Retrieved 11/15/2006 from http://www.chem.vt.edu/confchem/2205/a/asirvatham.pdf

Boud, D. 1981. Developing Autonomy in Student Learning. London: Kogan Page.

Cooper, J.L. and P. Robinson. 2000. The Argument for Making Large Classes Seem Small. In New Directions for Teaching and Learning: Energizing the Large Classroom, 5-16. San Francisco: Jossey-Bass.

Duncan, D. 2005. Clickers in the Classroom: How to Enhance Science Teaching Using Classroom Response Systems. San Francisco: Pearson Education.

Mazur, E. 1997. Peer Instruction: A User's Manual. Upper Saddle River, NJ: Prentice Hall.

Pintrich, P.R. 1994. Student Motivation in the College Classroom. In Handbook of College Teaching: Theory and Applications, 23-43. Westport, CT: Greenwood Press.

von Glasersfeld, E. 1992. A Constructivist's View of Teaching and Learning. In Research in Physics Learning: Theoretical Issues and Empirical Studies, 29-39. Kiel, Germany: IPN.

Mary Jane Rapport

Professor, School of Medicine Assistant Director, Physical Therapy Program University of Colorado Denver 4200 E. Ninth Avenue, C244 Denver, CO 80262 303-372-9148 [email protected]

Background: The University of Colorado Denver Physical Therapy Program offers a unique opportunity for students to interact with, appreciate, and learn from persons in the community who have a physical disability. Student pairs are placed with a volunteer during the first month they enter the Doctor of Physical Therapy (DPT) program. Students follow the same community volunteer, along with his or her family, throughout the entire 3-year DPT curriculum. Each student is engaged with his/her volunteer through both structured and informal interaction, activities, and reflection. Specific program objectives for the Community Volunteer Program (CVP) are intended to assist students to: understand life with a disability; explore accessibility issues; link classroom learning to real life; observe life situations over time; and develop communication skills.

Additional objectives include facilitating the development of core professional values in physical therapy and allowing students a better understanding of the role of the physical therapist within a doctoring profession. Assignments are designed to enable students to appreciate their community volunteer (CV) as an individual and to observe the individual's ability to function on a daily basis and participate actively in society. Students gain a perspective on living with a disability, including the social and emotional consequences, navigating the healthcare system, and managing life in today's world. Both specific and reflective assignments are placed in courses throughout the three-year curriculum. Faculty serve an ongoing role as mentors for the CVP; however, actual visits between the students and their volunteers are unsupervised. Therefore, students are unable to provide any physical therapy services, and physical therapy interventions are not part of the CVP.

What is the Central Question/Issue to Explore?: The CVP concept is unique in physical therapy and other healthcare educational preparation programs. The University of Colorado Denver School of Medicine also incorporates volunteers into courses for medical students through presentations or small group discussion, but the students do not follow these individuals over time. Currently, the Physical Therapy Program does not have a meaningful assessment process to determine if the CVP is meeting program objectives (see background information above). There is anecdotal information to support the belief that the CVP is an important component of student learning in the Physical Therapy Program, but there are no objective data to support this claim. I would like to answer the following questions:

Is the CVP meeting its intended objectives from the perspectives of students, faculty, and the volunteers (or their families)? Which aspects of the CVP are most beneficial to the professional growth and development of the DPT student? Which aspects of the CVP should be changed to enhance student learning to better accomplish the stated CVP objectives?

Why is This Important?: By obtaining and analyzing appropriate data regarding the CVP, an assessment could be made of the relevance of the CVP in the DPT curriculum, and changes could be highlighted that would enhance student learning and the benefit students derive from the experience. Use of a CVP as an effective teaching tool in other professional healthcare education programs could be better supported through a complete and measurable assessment of the students, faculty, and participating volunteers and their families. Evidence that the CVP is meeting the stated objectives is necessary in order to promote its continued use both internally and externally.

What is the Plan for Methodology and Dissemination?: I would like to develop a survey to measure the CVP outcomes from the student, faculty, and CV participant perspectives. This step would likely involve development and implementation of a survey tool and a plan for capturing data for assessment at entry point, midpoint, and upon completion of the 3-year DPT curriculum. The survey tool should include both open- and closed-ended questions and may need to be adapted for volunteers who cannot provide a written response. Focus groups may be useful to verify the findings of the surveys. Other types of methodology may be useful in assessing specific CV characteristics (type of disability, age, living environment, etc.) and using those to determine which volunteers provide the students with exceptional learning experiences and where faculty should focus efforts on future volunteer recruitment. The complexity of the data could provide a clearer picture of the experience from both student and CV viewpoints. Data analysis would be significantly improved by the addition of an experienced statistician, educational research expert, or other similar assistance through the PTLC program. With an improved data analysis, this CVP and the concept of using similar experiences in other healthcare professional education programs could be shared through publications and presentations. There are multiple schools and programs in our own university system, as well as across the US, that could benefit from a more complete picture of how to incorporate a CVP as a teaching and learning tool to enhance student learning about disability within the context of daily life.

Literature Review: The CVP was loosely based on the senior mentor program described by Roberts and colleagues at the University of South Carolina (USC) School of Medicine (2002, 2004, 2006). The senior mentor program was designed to dispel myths about aging and better meet the health needs of older adults. Our CVP was implemented as a teaching tool to directly incorporate patient-centered care in the educational philosophy of the DPT curriculum. Service learning has a much longer history in professional education programs for healthcare providers, and these experiences have been shown to be beneficial (Brosky et al., 2006). Researchers have recommended that service learning experiences be assessed in a comprehensive manner and involve all stakeholders (Driscoll et al., 1998), and a variety of assessment methods have been used to gather data (Hegeman, 2002; Lohman & Aitken, 2002; Beling, 2004; Vanderhoff, 2005). Outcome assessment is also an important component of all accredited physical therapy education programs for purposes of accreditation and "as a vehicle to facilitate meaningful programmatic change" (Tippett, 2006, p. 38).

What Aspects Cannot Yet be Fully Described?: The complete methodology for data collection and appropriate assessment tools cannot yet be fully described. However, a pilot survey has been developed and used with a single cohort of students.

What Questions Still Exist?: One primary question that still exists is how to design an appropriate assessment process, including development of tool(s), for accurately assessing the CVP from student, faculty, and community volunteer perspectives.

Record of Innovation and Mentor Suggestion: As an educator, I have been involved in the development of several educational programs and numerous courses. Prior to joining the Physical Therapy Program faculty, I was the Interdisciplinary Training Director at JFK Partners and was responsible for developing an extensive continuing education program, an advanced clinical residency program, and a course sequence for graduate students funded on federal training grants. In my current position, I have been responsible for the implementation of the Transition DPT Program and have participated in the assessment of the entry-level DPT Program. I have been involved with multiple professional educational conferences and programs at local and national levels. Potential coaches who might be appropriate for this project include Dr. Carol Kamin, Dr. Carol Hodgson, Dr. Gretchen Guiton, or Dr. Kenneth Wolf based on their experiences with educational assessment of programs and students in the School of Medicine.

Commitment: If selected to participate in this project, I would attend the required meetings and would be willing to serve as a coach for PTLC participants in subsequent years.

Cathy J. Thompson

Associate Professor and Option Coordinator, CNS Option School of Nursing University of Colorado at Denver and Health Sciences Center 4900 E. Ninth Ave Denver, CO 80262 303-315-0453 [email protected]

The central question I plan to explore in my proposed work is the effectiveness of prerecorded audio and video lectures and other online teaching methods designed to engage students, as measured by student satisfaction and the application of content to clinical practice. The goal is to improve the science of teaching and instructional delivery in online classes to enhance student learning and promote positive outcomes.

The significance of this evaluation research, and its potential impact on instructional programs within the University of Colorado (CU) system, can be demonstrated in many ways. There has been a growing demand for online courses in university settings (Hubble & Richards, 2006), and faculty in Schools of Nursing (SON) across the country have accordingly been increasing the number of educational offerings using web-based formats (Bangert, 2005; Bata-Jones & Avery, 2004; Buckley, 2003).

The advantages and disadvantages of online education are well documented (Bangert, 2005; Bata-Jones & Avery, 2004; Huckstadt & Hayes, 2005; Sole & Lindquist, 2001). Online learning has been embraced as an accessible source of education for students who work or live a distance from a traditional educational setting, or have other commitments that hinder daytime attendance or travel to traditional classes. Research studies comparing online versus traditional classroom learning have shown similar or positive outcomes between the two methods of instructional delivery (Hubble & Richards, 2006; Yucha & Princen, 2000). One criticism of web-based education is that many online offerings show little imagination in the delivery of course content (Huckstadt & Hayes, 2005). Students have reported reluctance to take online classes when a classroom alternative is available, citing an opinion that some content is better taught in traditional classroom settings, as well as the loss of face-to-face contact with faculty and socialization with fellow students, the loss of engagement in real-time class discussions and questions, the inability to interpret nonverbal communication cues, feelings of isolation, and frustration with technology and connectivity difficulties (Gruendemann, 2007; Huckstadt & Hayes, 2005; Thurmond, 2004). The challenge of online learning for the educator is to communicate knowledge, insights, and experiences in a way that students can understand and apply to their practice. As healthcare professionals, it is important for our patients that knowledge obtained by our students is transferred into practice.

While online courses are proliferating, the outcomes of student satisfaction and performance have been variable (Bata-Jones & Avery, 2004; Hubble & Richards, 2006; Huckstadt & Hayes, 2005). Many faculty have evaluated a variety of content delivery methods in online courses. Active engagement of the student has been shown to correlate with increased academic performance and satisfaction with online learning (Hubble & Richards, 2006; Huckstadt & Hayes, 2005). Learning which strategies are effective in transferring knowledge to application is important to educators in the health sciences, as well as to other faculty whose students need to be able to apply important concepts to practice situations.

Up until this semester, I have taught a weekly Advanced Pathophysiology class for the graduate students in the SON. This semester, I have transitioned this class to a completely online format. Yucha and Princen (2000) found no significant difference in two groups of students taking a graduate pathophysiology course in either the traditional or an online format. It is known that students learn in different ways. In my years of teaching this class, I have developed a variety of materials designed to engage the students to think as advanced practice nurses. For example, students apply their knowledge using a pseudo-problem-based learning format (the full case is provided with questions interspersed) to sharpen their diagnostic reasoning skills and clinical judgment. Students positively comment on this teaching and evaluation method. My method of instructional delivery in the classroom provides auditory and visual stimulation using interactive lecture and PowerPoint slides to accompany the slide handouts and readings the student has completed before class. A series of “comprehension checks” are integrated throughout the lectures, in addition to other assignments. Students have always positively commented on my enthusiasm for teaching and my ability to make difficult concepts more understandable. My challenge in the online class has been how to communicate this same level of enthusiasm and skill that the students have responded to without the face-to-face interaction.

This semester I have incorporated the use of video lectures and podcasts to engage the students at what I hope is the same or a greater level in the online class than they had in the traditional face-to-face class. This has been a learning opportunity for me to think about how to work within the expanse and limits of available technology, while still providing a quality learning experience. Instructional delivery for the online students this semester is being provided using slide handouts of my PowerPoint slides combined with prerecorded video lectures and audiocasts (podcasts). I have used two different formats – providing presentations in either the Adobe Connect system or as streaming video recorded with the RSS unit. I am currently conducting formative evaluations of students' reactions to the use of these new methods in the online class. I asked for anonymous feedback on the quality of the video lectures during the early weeks of the course, and outside of some initial sound or technology issues, the response has been positive. Students commented that they were excited to have the ability to hear the lecture and see the faculty. A midpoint assessment is currently being solicited.

Other strategies to actively engage students include the use of the discussion forum. Students are asked on a weekly basis to contribute questions about their readings and the lectures and are encouraged to share clinical experiences that will add to the knowledge base of both novice and experienced nurses. Virtual office hours for real-time answers to student questions have also been scheduled.

As faculty become more comfortable with the online teaching format, innovative methods of delivering content to facilitate the application of knowledge into practice need to be explored. While there have been many studies published on the evaluation of online learning in nursing curricula, only one report of the effectiveness of video and audio teaching methodologies in online learning was located in the nursing literature (Sole & Lindquist, 2001). A study to establish the effectiveness of prerecorded audio and video lectures and other online teaching methods designed to engage students, for a variety of outcome variables including the ability to apply content to clinical practice, is being proposed.

Methods: In the spring 2008 semester, I will teach this content as a blended class. I will teach a limited number of days on campus as a traditional class, and the rest of the content will be provided online using the video lectures and audiocasts I've already recorded, as well as new recordings. I plan to collect data using a mixed-method approach, by asking a convenience sample of students in the spring class to complete a variety of formative and summative assessments to evaluate the effectiveness of the instructional methods on student satisfaction and the transfer of knowledge to clinical practice. The instructional methods will include student evaluation of asynchronous and synchronous discussions, virtual office hours, and audio and video lectures. The evaluations will be posted as anonymous surveys on the class website. The use of these data sources has been shown to provide comprehensive evaluation data (Bata-Jones & Avery, 2004). Only aggregate data will be reported; no students will be individually identified. Outcomes of interest include: instructor-student interaction, satisfaction, engagement, and the ability of students to think critically and apply their learning to their clinical experiences and current nursing positions.

I will do a pretest to assess students' a) expectations of the course for their preparation for practice; b) comfort with online learning and technology; c) experience with previous online classes; and d) perception of whether they will avail themselves of all methods offered. For the posttest, I will assess a) whether students' expectations of the class were met, not met, or exceeded; b) comfort with online learning and technology; c) ratings of the different learning methodologies as to ease of use, quality of the teaching and AV, ability to pause and rewind, etc.; d) satisfaction with this course overall, the perception of their ability to transfer this knowledge into practice, and any absolute transfer they can describe; e) whether the video/audio lectures compare favorably with the classroom lecture; and f) how often they used the different methods – all of them, some, none. Do they feel they missed important content or networking by participating in the online class?

I will analyze the data by downloading the assessment statistics through Blackboard for each survey made available. Descriptive statistics, obtained through the pretest, will describe the sample; demographics will include age, gender, geographic location, number of credits taken that semester (whether classes are online or classroom), self-assessment of computer literacy, place in the program (beginning, middle, end), role (NP, CNM, CNS), program (MS, DNP, PD), and number of years of nursing practice. Students' responses to questions of course effectiveness, satisfaction, and the different methods will be elicited using Likert-type scales. Open-ended questions will be devised to elicit richer data regarding students' actual use of knowledge in practice. The Likert-type scales will be analyzed as interval-level data, as the responses will be summed and averaged; inferential statistics will be used as appropriate to answer the research questions. As the surveys will be investigator-developed and thus new tools, the scales will also be tested for reliability, and a factor analysis of the scales will be performed to assess construct validity. Qualitative responses will be transcribed into a qualitative computer-based program and the data will be analyzed using content analysis. Common themes with supporting quotes will be isolated as they emerge from the data. The qualitative data will provide richness to the quantitative results for a robust description of the effect of these strategies on the ability to apply concepts to practice.
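As an illustration of the quantitative steps described above (summing and averaging Likert items, then checking scale reliability), the following is a brief sketch in Python. The items and responses are fabricated, and Cronbach's alpha is shown as one standard reliability estimate; since the surveys are still under development, this is a sketch of the analysis logic, not of the study's actual instrument.

```python
import numpy as np

def cronbach_alpha(items):
    """Internal-consistency estimate for a Likert scale.

    items: (n_students, n_items) array of responses on the same scale.
    alpha = k/(k-1) * (1 - sum(item variances) / variance of summed scores)
    """
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Invented data: five students x four satisfaction items,
# each rated 1 (strongly disagree) to 5 (strongly agree).
satisfaction = np.array([[4, 5, 4, 4],
                         [3, 3, 4, 3],
                         [5, 5, 5, 4],
                         [2, 3, 2, 3],
                         [4, 4, 5, 5]])

print("Mean scale score per student:", satisfaction.mean(axis=1))
print("Cronbach's alpha: %.2f" % cronbach_alpha(satisfaction))
```

An alpha of roughly 0.70 or above is conventionally taken as acceptable reliability for a new scale; items that depress alpha would be candidates for revision before the factor analysis.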

My work will be made available locally to faculty in the SON through a brown-bag presentation of innovative teaching in online courses and feedback will be solicited and questions answered. Scholarly critique and review will be facilitated on a national and international level through my submission of an abstract for presentation at a national education conference, upon completion of the data analysis. In addition, an article describing the project and the results will be submitted to a professional journal, such as Nursing Education in Practice, with an international audience. The findings from the study would be important to educators in practice and academia as to which strategies are effective in transferring knowledge into practice, as well as to administrators who plan to implement online learning for employee orientation and continuing education.

What aspects of the design and character of this work are you not yet fully prepared to describe? The survey questions have not been completely fleshed out, though the ideas for the types of data have been presented above. I have a tendency to want to evaluate everything, so I need help in narrowing my study questions. I also need help to be realistic about what is feasible to accomplish in this time frame.

What questions do you have and what do you still need to know? What tools may already be available to measure student application of learning in online courses? “What tools will help us know whether what is taught is learned, transferable, and usable in professional practice?” (Muth, 2007, PTL proposal).

My record of innovation in teaching and/or the assessment of learning: I have developed a variety of ways to engage students in understanding and really learning content from both a novice and an advanced perspective. My teaching and course evaluations are consistently excellent. I have been awarded the Chancellor's award 3 times and the President's award from the students in the SON and graduate school in the last 8 years at CU. I'm Chair of curriculum and, as senior faculty, am charged with mentoring others to develop effective teaching strategies. I continually use both formative and summative student comments to evaluate my own teaching effectiveness in each class, every semester. My goal is to provide a quality experience for every student. I teach evidence-based practice for the graduate students and strive to have evidence to teach effectively too. The results of this study could serve as a basis for future research in online teaching effectiveness.

Potential mentors: Dr. Diane Skiba is a FACMI and internationally known speaker on online education with multiple training grants to her credit. She has been a leader in online education at the SON. Dr. Gayle Preheim is a former PTLC awardee and is an expert in teaching effectiveness. I would welcome a mentor from the SOM who has experience in video and/or audio lectures, too.

References

Bata-Jones, B., & Avery, M. D. (2004). Teaching pharmacology to graduate nursing students: evaluation and comparison of Web-based and face-to-face methods. Journal of Nursing Education, 43(4), 185-189.

Bangert, A. W. (2005). The seven principles of effective teaching: A framework for designing, delivering, and evaluating an Internet-based assessment course for nurse educators. Nurse Educator, 30(5), 221-225.

Buckley, K. (2003). Evaluation of classroom-based, Web-enhanced, and Web-based distance learning nutrition courses for undergraduate nursing. Journal of Nursing Education, 42(8), 367-370.

Gruendemann, B. J. (2007). Distance learning and perioperative nursing. AORN Journal. 85(3), 574-6, 578-86.

Hubble, M. W., & Richards, M. E. (2006). Paramedic student performance: comparison of online with on-campus lecture delivery methods [Abstract]. Prehospital and Disaster Medicine. 21(4), 261-7.

Huckstadt, A., & Hayes, K. (2005). Evaluation of interactive online courses for advanced practice nurses. Journal of the American Academy of Nurse Practitioners, 17(3), 85-89.

Sole, M. L., & Lindquist, M. (2001). Enhancing traditional, televised, and videotaped courses with Web-based technologies: A comparison of student satisfaction. Nursing Outlook, 49, 132-137.

Thurmond, V. (2004). Towards an understanding of interactions in distance education. On-line Journal of Nursing Informatics (1089-9758), 8(2), 18 pp.

Yucha, C., & Princen, T. (2000). Insights learned from teaching pathophysiology on the World Wide Web. Journal of Nursing Education, 39, 68-72.

Cindy White

Associate Professor Communication University of Colorado Boulder 270 UCB [email protected]

I teach courses in communication on interpersonal communication processes (interpersonal communication, family communication, and intimate relationships). One of the pedagogical issues in teaching courses like these is that students' experiences of these processes are highly personal and emotion-laden, but the goal of such courses is to teach students to think about communication processes conceptually and analytically. In the past, I have designed my courses with an emphasis on theoretical and research-based discussion. My goal in doing this was to ensure that students learned to step outside their own personal experiences and take an analytical approach to communication. However, I have been aware that this approach pays insufficient attention to how students' personal experiences impact their understanding of the material. The Summer Institute for Assessing Learning that I attended in 2006 reinforced my sense that these courses could be more effective if I had a better understanding of students' intuitive conceptions of communication, which I could use to guide the development of their learning in the course. According to Bransford, Brown and Cocking (2000), my teaching approach could be seen as over-emphasizing a knowledge-centered environment and under-emphasizing learner-centered aspects of learning. Learner-centered teaching environments attend to the knowledge and experiences students bring to the classroom and utilize that knowledge to help students acquire the conceptual and content-specific knowledge they need. In order to better address this in my teaching, I want to ask: How do students' assumptions about communication and their experiences within their own relationships influence their learning in interpersonal communication courses? I have two ideas for how I might explore this question. First, I want to develop ways to better understand how students' assumptions about communication processes change over the semester as a result of taking an interpersonal communication course. Currently much of what we know about student perspectives is based either on teacher impressions of students (Engen, 2002) or on research that assesses students' feelings about specific communication events, such as giving a formal speech (Witt & Behnke, 2006). Second, I want to create assignments that allow me to explore how student views or understandings of communication change across the course. This spring, I hope to use my family communication course as a site for developing these ideas, but ultimately these teaching strategies might be incorporated into the multi-section, lower-division course in interpersonal communication, which I direct.

Importance of the Issue

The challenge of addressing and utilizing student concepts is a key problem in communication pedagogy. Morreale, Applegate, Wulff, and Sprague (2002) have noted this challenge: communication processes are highly ingrained, socially situated, and largely automatic, which means that teaching about such processes requires development of self-reflexive abilities in students. Additionally, the difficulties of communication education are further heightened by the fact that "communication" as a cultural category in everyday talk has come to mean many different things. Craig and Carlone (1998) note that the "evolution of academic communication studies has been driven by, and by now is probably helping to drive, the evolution of communication as a cultural category" (p. 78). In terms of undergraduate education, the intersection between cultural and academic views of communication is relevant because students bring to the classroom complex, but often implicit, understandings of communication that may be in tension with the academic perspectives they then encounter. Given that our goal in communication education is to help students develop sophisticated and analytical views of their social world, we could do a better job if we understood how their intuitive conceptions of communication processes impact the interpretation and acquisition of new perspectives or theoretical knowledge.

Conducting the Investigation

I hope, in the tradition of the scholarship of teaching and learning, to meaningfully investigate processes within my own course to learn about how student perspectives on communication shape their understanding of course material. I hope to develop an assessment tool that could be used to generate information about students' experiences with family communication and their expectations regarding how their view of such processes will change in the course. In order to assess how student perspectives on communication are shaped by the course, I hope to develop specific writing prompts for self-reflection, which I could then examine to determine how student conceptualizations change across the course. (I have in mind a sort of thematic analysis across time, but I am not sure exactly how this would play out.) The short-term goal of this project is to assess and improve the way I help students use their prior knowledge and experiences throughout their learning in the course. A more long-term goal is to consider how the knowledge I gain here could be used in designing the multi-section, lower-division interpersonal communication course. This course is often taught by graduate part-time instructors whom I supervise. I hope both to improve the way the course is structured and to assist GPTIs in their understanding of how learner-centered environments can contribute to knowledge-centered processes. Additionally, family and interpersonal communication courses are offered at nearly all colleges and universities with active communication studies departments. The knowledge I gain could contribute to disciplinary discussion about how to teach these courses.

What aspect of the project do I still need to develop? What do I still need to know?

I have not fully examined literature in other areas of social science to see if there is discussion of how student assumptions about social processes are changed by courses on social behavior. I also need to more fully consider how to effectively generate information about student assumptions regarding communication; research by Dannels, Anson, Bullard, & Peretti (2003) provides one model for this type of study.

Record of Teaching Innovation or Learning Assessment

I am currently the Associate Chair for Undergraduate Studies in the Department of Communication. This means I am involved in decision-making about curriculum and student development. I have taught graduate and undergraduate courses in interpersonal communication, family communication, communication in intimate relationships, research methods, and nonverbal communication. I headed a team of faculty in our department who were selected to participate in the Campus Compact summer institute on Campus Engagement and Change, and the work produced by our department was featured in a session at the National Communication Association convention on civic engagement. I have attended the FTEP summer institute on technology as well as the summer institute on assessing learning. Additionally, I worked with a colleague (Lisa Keranen) on a research project designed to assess students' perceptions of challenging coursework; this project was funded in part by the Arts and Sciences Dean's Office and has been submitted for presentation at a regional conference. At this time, I cannot suggest a coach for this project, but would be happy to work with whoever seems appropriate.

Are you able to attend the required meetings as specified above?

Yes, I would be pleased to attend the meetings that are part of this collaborative.

If your project is selected, are you willing to serve as a coach in PTLC in a future year? Yes.

References

Bransford, J. D., Brown, A. L., & Cocking, R. R. (Eds.) (2000). How people learn: Brain, mind, experience, and school. Washington, D.C.: National Academy Press.

Craig, R., & Carlone, D. (1998). Growth and transformation of communication studies in U.S. higher education: Toward a reinterpretation. Communication Monographs, 47, 67-82.

Dannels, D. P., Anson, C. M., Bullard, L., & Peretti, S. (2003). Challenges of learning communication skills in chemical engineering. Communication Education, 52, 50-56.

Engen, D. E. (2002). The communicative imagination and its cultivation. Communication Quarterly, 50, 41-57.

Morreale, S., Applegate, J., Wulff, D., & Sprague, J. (2002). The scholarship of teaching and learning in communication studies and communication scholarship in the process of teaching and learning. In M. T. Huber & S. P. Morreale (Eds.), Disciplinary styles in the scholarship of teaching and learning: Exploring common ground. Washington, D.C.: National Academy Press.

Witt, P. L., & Behnke, R. R. (2006). Anticipatory speech anxiety as a function of public speaking assignment type. Communication Education, 55, 167-177.