Internet Learning – www.ipsonet.org


Table of Contents

Collaborating with Faculty to Compose Exemplary Learning Objectives – Matthew M. Acevedo

Get Rid of the Gray: Make Accessibility More Black and White! – Erin Blauvelt

Continuous Improvement of the QM Rubric and Review Processes: Scholarship of Integration and Application – Kay Shattuck, Whitney Alicia Zimmerman, Deborah Adair

Jiyu You, Sue Ann Hochberg, Phoebe Ballard, Mingli Xiao, Anthony Walters

Deborah Adair, Susan W. Alman, Danielle Budzick, Linda M. Grisham, Mary E. Mancini, A. Sasha Thackaberry

Leah A. Geiger, Daphne Morris, Susan L. Suboez, Kay Shattuck, Arthur Viterito

Bobbie Seyedmonir, Kevin Barry, Mehdi Seyedmonir

Andria F. Schwegler, Barbara W. Altman, Lisa M. Bunkowski

Penny Ralston-Berg

Internet Learning Volume 3 Issue 1 - Spring 2014

Dear Internet Learning Colleagues,

I would like to take this opportunity to thank each and every one of you for your valuable contributions to Internet Learning this past year. We are already off to a productive start for 2014, and are currently working on the Internet Learning Quality Matters Special Issue Fall 2014. If you have not had a chance yet to do so, please check our current issue (and past issues) for a multitude of exceptionally written articles covering an array of topics on online learning at http://www.ipsonet.org/publications/open-access/internet-learning

As part of expanding authorship and readership, I would also like to welcome several new members to our Editorial Reviewer Board. These highly distinguished scholars come from a variety of disciplines and also represent perspectives from a broader, international level of expertise—thus capturing topics around learning on the Internet on a more global scale. They include:

Paul Prinsloo, University of South Africa
Herman van der Merwe, North-West University: Vaal Triangle Campus
Ngoni Chipere, University of the West Indies
Tony Onwuegbuzie, Sam Houston State University
Molly M. Lim, American Public University
Clark Quinn, Quinnovation
Ben W. Betts, University of Warwick, UK
Tony Mays, South African Institute Distance Education
Robert Rosenbalm, Dallas County Community College District & The NUTN Network
Carmen Elena Cirnu, National Institute for Research & Development in Informatics, Bucharest
Mike Howarth, Middlesex University
Tarek Zoubir, Middlesex University
Jackie Hee Young Kim, Armstrong Atlantic State University
Hannah R. Gerber, Sam Houston State University
Debra P. Price, Sam Houston State University
Mauri Collins, St. Rebel Design, LLC
Ray Schroeder, University of Illinois Springfield
Don Olcott, Jr., HJ Global Associates
Kay Shattuck, Quality Matters and Penn State University
Karan Powell, American Public University System
John Sener, Senerknowledge LLC
Melissa Langdon, University of Notre Dame, Australia
Kristen Betts, Drexel University
Barbara Altman, Texas A&M, Central Texas

Additionally, we have also added a new member to our Executive Editorial Board, Daniel Benjamin, who serves as the Vice President and Dean of the School of Science and Technology at American Public University System. With his extensive background in the field of distance education, he will undoubtedly offer a wealth of knowledge to Internet Learning.

Thank you once again for your commitment to serve and support Internet Learning. We look forward to your continued support with manuscript submissions, peer reviews, editing, copyediting, web and journal design, etc. and also welcome any comments or suggestions aimed toward improvement in these areas.

Warm regards,

Melissa Layne, Ed.D.
Editor-in-Chief, Internet Learning
Director of Research Methodology, American Public University System

Editorial
Internet Learning Journal, April 2014

This issue of ILJ consists of a selection of papers concerning different aspects of online teaching, learning, and quality assurance, stimulated by interaction with The Quality Matters Higher Education Rubric and course review process. It captures a small, but significant, sample of the kind of detailed analysis of online education toward which engagement with QM typically leads. This kind of work is advancing incrementally toward a better understanding of the most effective course design elements to promote learner persistence, performance, and satisfaction, as well as the most effective strategies to persuade faculty to adopt best practices and become part of the growing community of effective and committed online instructors and facilitators.

Why has the study of effective standards for online education grown steadily, beyond the usual confines of departments and colleges of education, in contrast to the level of interest in the higher education classroom? Of course, the Quality Matters™ Program would like to take credit for this trend, together with other organizations like ITC, WCET, MERLOT, and the Sloan Consortium, through our conferences, sponsored research projects, and activities to engage both faculty and instructional design and technology specialists. But something deeper is at work.

The process for online and blended course creation and improvement at the postsecondary level has engaged individuals and institutions in ways seldom experienced in face-to-face education. This work increasingly involves teams of individuals who need to share and collaborate in order to succeed. And the successful course, or even the well-designed learning object, is itself an artifact that begs to be shared, analyzed, and improved. This sharing begins locally, but spreads quickly to become regional, national, and even international. The collaborations that result from this phenomenon, and the courses that are thereby strengthened, term by term, hold promise for the continual improvement of online education, year by year and version by version.

In historical terms, we are only at the beginning of this process. I suspect, however, that it will be a permanent feature of online education, leading to new strategies and tools, and, ultimately, a re-conception of advanced learning, individualized to the learner. It will inevitably impact classroom-based education as well. We may be taking baby steps at present with studies such as these, but the baby (online education) is growing rapidly.

Ron Legon, Ph.D.
Executive Director, The Quality Matters Program, and Provost Emeritus, The University of Baltimore

Collaborating with Faculty to Compose Exemplary Learning Objectives

Matthew M. Acevedo [A]

Inclusion of well-written, measurable, and student-centered learning objectives represents a major component in the Quality Matters (QM) higher education rubric. However, when an instructor-expert and instructional designer collaborate to create an e-learning product, oftentimes the instructor-expert either comes to the table with course and/or unit learning objectives that are already prepared, but are not measurable, student-centered, or aligned with planned instructional materials and strategies; or has no learning objectives at all. The responsibility then falls on the instructional designer to not only explain the importance of properly written learning objectives, but also to guide and support the instructor-expert through the process of composing learning objectives that are measurable and appropriate for the e-learning product. This paper discusses the purpose and importance of learning objectives and suggests several strategies for instructional designers, faculty trainers, and others who work with instructor-experts to compose learning objectives. These strategies are based on commonly encountered scenarios and are framed around a discussion of terminal and enabling objectives. These strategies also represent an alternative to the common practice of providing an instructor-expert with a list of Bloom's Taxonomy-aligned verbs, and can aid in successful collaboration leading to compliance with learning objective-related QM standards.

Keywords: learning objectives, faculty collaboration, higher education instructional design, Quality Matters

The collaborative process that takes place as a catalyst for the creation of an e-learning product, whether it is a higher education course, a corporate training module, or some other type of instructional digital object, is a unique one. Very often, two individuals – an instructor-expert and an instructional designer – are responsible for working together to design and develop the end product (Aleckson & Ralston-Berg, 2013), and e-learning courses in the higher education space are no exception. The relationship is unique as a result of the wildly varying backgrounds of the two parties – the instructor-expert is typically a scholar in a field unrelated to education, and a competent instructional designer is well versed in learning theories, instructional strategies, design thinking, instructional design process models, and uses of technology to promote learning.

[A] Matthew M. Acevedo, FIU Online, Florida International University. This article is based on "Beyond Understanding: Working with Faculty to Compose Exemplary Learning Objectives," a conference session presented at the Quality Matters' 5th Annual Conference on Quality Assurance in Online Learning in Nashville, TN on October 2, 2013. Correspondence concerning this article should be addressed to Matthew Acevedo, FIU Online, 11200 SW 8th St, Modesto A. Maidique Campus, RB227B, Miami, FL 33199. E-mail: mmaceve@fiu.edu


Further adding to the unique dynamic of this relationship is the idea that everyone, innately, is a teacher. In a recent webinar, well-known instructional design scholar M. David Merrill recounts a story in which he silently and patiently listened to his brother-in-law, a nuclear physicist, describe advanced and complicated mathematical derivations related to particle physics. Afterward, when Merrill began discussing his own work related to instructional design theory, his brother-in-law interrupted him and argued his points, feeling qualified to do so since "everyone is a teacher" (Merrill, 2013). The point is valid: parents and family members instinctively teach children basic skills and manners; friends may naturally teach other friends hobbies and games. Teaching and learning are part of the human experience.

A natural extension of this is the idea that faculty members may feel completely prepared to teach subjects in which they are experts or scholars. As a result, these instructor-experts may come to the drawing board early in the e-learning design process with learning objectives prepared, and, oftentimes, these learning objectives are not measurable, student-centered, realistic, or aligned with the planned assessment strategy. The responsibility then falls on the instructional designer to not only explain the importance of properly written learning objectives, but also to guide and support the instructor-expert through the process of composing appropriate learning objectives.

Learning Objectives and the Quality Matters Program

The Quality Matters Program is an organization dedicated to the promotion of quality assurance in online courses in the higher education and K–12 arenas through an iterative, faculty-centered peer review process. Quality Matters (QM) also emphasizes inter-institutional collaboration, faculty training, and implementation of research-based best practices in online course design. Courses that undergo the QM review process are evaluated based on a detailed rubric with standards for the course's learning objectives, assessment strategy, instructional materials, learner engagement, use of technology, learner support, and accessibility.

In a session at the Quality Matters' 5th Annual Conference on Quality Assurance in Online Learning, an audience of approximately ninety – mostly instructional designers – were asked to submit to a live poll via text message (see Figure 1 below); the prompt was "Working with faculty to compose learning objectives can often be…," and responses were enlightening. Typical replies included "frustrating," "maddening," "an uphill battle," "a challenge," and even "painful" (Acevedo, 2013).

Clearly, working with faculty to compose objectives represents a challenge for instructional designers, faculty leaders, faculty trainers, and others who work with instructor-experts in higher education environments. In this article, I will provide a framework, based on common scenarios, for collaborating with faculty members during the process of either composing new, or rewriting ineffective, learning objectives. Additionally, I will include discussion as to how this process relates to the QM review process, since learning objectives represent a significant portion of the QM rubric.


Figure 1. Sample responses from the text message poll.

The Importance of Learning Objectives

Robert Mager, likely the foremost classical authority on learning objectives, describes a learning objective as "an intent communicated by a statement describing a proposed change in a learner – a statement of what the learner is to be like when he has successfully completed a learning experience. It is a description of a pattern of behavior (performance) we want the learner to be able to demonstrate" (1962, p. 2). According to Mager, "When clearly defined goals are lacking, it is impossible to evaluate a course or program efficiently, and there is no sound basis for selecting appropriate materials, content, or instructional methods" (p. 2).

Learning objectives, derived from an appropriate needs analysis, serve as the underpinning to all well-known instructional design process models. According to Dick, Carey, and Carey, learning objectives "are an integral part of the design process […] Objectives serve as the input documentation for the designer or test construction specialist as they prepare the test and the instructional strategy" (2009, pp. 113–114). Furthermore, "objectives are used to communicate to both the instructor and learners what may be learned from the materials" (p. 114). Renowned educational psychologist Robert Gagné further elaborates on the importance of informing learners of the objectives in his classic text, The Conditions of Learning:

[T]he learner must be informed of the nature of the achievement expected as an outcome of learning. […] The purpose of such a communication to the learner is to establish an expectancy of the performance to be achieved as a result of learning. […] The primary effect of providing learners with an expectancy of the learning outcome is to enable them to match their own performances with a class of performance they expect to be "correct" (Gagné, 1977, p. 291).

Lastly, learning objectives are invaluable instruments in a climate increasingly focused on outcomes assessment and alignment with institutional, regional, and national standards.

Some, however, have expressed skepticism or disillusionment with the use of learning objectives. Rosenberg (2012), for example, questions the value of presenting learning objectives to students:

[D]o objectives truly help the learners? […] We've all been there; sitting in class while the instructor reads (or we view online) any number of statements, sometimes dozens of them, for each lesson or module, that often begin, "at the conclusion of this course, the student will be able to…" Each objective focuses on a specific skill or knowledge taught in the course, but may be too much in the weeds to answer students' bigger questions like, "Why am I taking this course?" "What's in it for me?" and "How will this help me down the road?" (para. 5)

Rosenberg offers that learning objectives don't offer students a sense of value in the course, and should be replaced (or supplemented with) a list of statements of expectations to "truly broadcast the value and worthiness of your training efforts" (para. 10). Rossett (2012) counters Rosenberg directly: "Marc, you urge us to add expectations to [learning objectives], expectations that assure links to work and results. I say that good [learning] objectives are themselves that statement of expectations" (para. 11).

Keeping in mind a terminal goal of a successful QM review, it will be assumed the learning objectives are, indeed, vital and foundational to the design of effective and quality instruction.

It should be emphasized that, despite their importance, learning objectives are of little value if not constructed properly. The most detailed, comprehensive learning objectives are framed using the "ABCD" model: audience, behavior, conditions, and degree. Audience refers to the targeted learners, behavior refers to what the learner is expected to be able to do after instruction, conditions refer to any setting or circumstance in which the behavior should occur, and degree refers to the acceptable standard of performance of the stated behavior. An example of an ABCD objective is "Given a right triangle with stated lengths of each leg, eighth-grade students will be able to use the Pythagorean Theorem to determine the length of the triangle's hypotenuse with 90% accuracy." In this example, the audience is "eighth-grade students," the behavior is "determine the length of the triangle's hypotenuse," the condition is "given a right triangle with stated lengths of each leg," and the degree is "with 90% accuracy."

In higher education environments, including e-learning, the ABCD framework might be overkill. The audience ("college students" or similar) is implied by the context of the institution or course, and stating it would be redundant. The condition is typically also implied by the provided instructional materials and sequence. Including the degree element in higher education environments has the downside of declaring a less than optimal expectation (why not expect 100%?). Mastery of the objective in college courses is typically assessed on a sliding scale (A through F). The behavior, then, is the most essential element of the learning objective. This is the element that is evaluated during a QM review, and it is also the most misunderstood and most misrepresented aspect.

QM Standard 2 requires learning objectives at both the course level and module or unit level that are student-centered ("The student will…" as opposed to "This course will…") and measurable. This measurable quality is the one with which faculty members often seem to have the most trouble. There are certain words and phrases that come up time and time again that are vague and immeasurable (see Table 1). The problem is not that instructors (and instructional designers) don't want students to accomplish these objectives; rather, these objectives cannot be assessed, because they are too open to interpretation, they are internal processes, or they are, by their nature, entirely subjective. Sound learning objectives should reflect measurable, observable, external behaviors that can be evaluated and assessed.

Working with Faculty

A common practice among instructional designers, faculty trainers, and others who are working with instructor-experts to compose learning objectives, oftentimes in preparation for QM review, is to hand over a sheet of paper with Bloom's Taxonomy and a list of measurable verbs that correspond to each level of the hierarchy. These lists are common on the internet and found easily with a basic search engine query. The designer or trainer informs the instructor-expert to reframe his or her objectives with these terms with no further explanation or conversation.

Understand, Learn, Know, Realize, Recognize, Become acquainted with, Internalize, Appreciate, Believe

Table 1. Commonly Seen Immeasurable Objective Roots

This practice (of which I'm guilty) presents a number of problems. First, without additional guidance, the list of objectives that is returned often doesn't mirror the planned (or existing) assessment instruments. For example, "Describe how controversies over constitutional issues shape much of the content of American politics" cannot be assessed using a multiple choice exam. Second, the level of cognitive complexity implied by the objective doesn't match the complexity of the instructional content itself ("Evaluate the 50 state capitals" is an example). In some cases, verbs from these lists are chosen seemingly randomly.

Clearly, another approach – one that involves a more meaningful collaborative conversation – is necessary. This alternative approach excludes Bloom's Taxonomy altogether, starts with a conversation about the goals of the course, and also provides a clear connection to the standards set forth by the QM rubric. Dick, Carey, and Carey (2009) describe two types of objectives: terminal objectives and enabling objectives. Terminal objectives are those skills that a learner will be able to perform once an entire unit or course is complete. Enabling objectives are subordinate to the terminal objectives; that is, achievement of a terminal objective is impossible without achievement of the enabling objectives.

In QM language, the terminal objectives translate to the course-level objectives, and the enabling objectives translate to the module- or unit-level objectives. An example may aid in illustration. Let us consider an overly simple course: "Foundations of Peanut Butter and Jelly Sandwiches." The terminal objectives of this course are:

Upon completion of this course, students will be able to:

• Select appropriate ingredients for a peanut butter and jelly sandwich.
• Assemble a peanut butter and jelly sandwich.
• Consume a peanut butter and jelly sandwich.
• Properly dispose of sandwich remains.

These terminal objectives, for the sake of QM compliance, become the course-level objectives. Each of these objectives has enabling objectives, or module/unit-level objectives. For example, the enabling objectives for the first terminal objective ("select appropriate ingredients") are as follows:

• Differentiate between different types of breads.
• Identify types of jellies and jams, including flavors appropriate for PB&J sandwiches.
• List the features of the different varieties of peanut butter.
• Describe accommodations for those with dietary preferences and/or restrictions.

It is also possible (and likely) that enabling objectives will have their own subordinate enabling objectives. For example, in order to "Describe accommodations for those with dietary preferences and/or restrictions," students must be able to:

• Explain the purpose of gluten-free bread.
• Explain the purpose of low-sugar jelly.

All of these enabling objectives become the module- or unit-level objectives, and can also aid in informing the organization of instructional content within a course. Skills that are necessary to perform the enabling objectives but will not be included in the instructional sequence or materials of the course are referred to as entry skills or prerequisite skills; these are requirements for learners before they begin the course of study.

Refer to Figure 2 for a visual breakdown of these objectives and entry skills in the "Foundations of Peanut Butter Jelly Sandwiches" course.

Put simply, the conversation that needs to take place with the instructor-expert involves asking what, broadly, learners should be able to accomplish once they finish the course, as well as what learners need to be able to do, specifically, to accomplish those behaviors, including what will and will not be taught in the course. Based on this conversation, course and module objectives can be determined without bringing Bloom's Taxonomy into the conversation.

Practical Application

An analysis of terminal (course) objectives, enabling (unit/module) objectives, and prerequisite skills is a useful tool in working with faculty, but a QM review looks at courses, not at objective analysis in the wild. What does this breakdown translate to in "real life"? Figure 3 depicts a unit of the "Peanut Butter and Jelly Sandwich" course deployed in the Blackboard Learning Management system. This unit is framed around the first terminal objective ("select appropriate ingredients") and is called "Module 1: Selecting Your Ingredients." After a brief introduction to the module, the course objective addressed in this module is listed, followed by a list of that module's specific objectives. This layout is sufficient to satisfy QM Standards.

Common Scenarios

The typical scenarios faced by instructional designers and faculty trainers who work with instructor-experts to compose learning objectives can be categorized into five types. Each of these scenarios has a recommended course of action based on the terminal/enabling objective breakdown.

Scenario 1: Faculty member already has well-written, measurable objectives.

Given a scenario in which a faculty member or instructor-expert comes to the table with well-written, measurable, and appropriate objectives, the job of the instructional designer or faculty trainer is simple: commend the instructor-expert on the achievement, and provide any further support as needed. This scenario, however rare, does exist, typically with faculty who either have a background in education or have attended an Applying the QM Rubric training.

The remaining scenarios are more common.

Scenario 2: Faculty member needs help writing new course objectives.

In this scenario, perhaps the faculty member is preparing a new course, or a currently running course doesn't already have objectives (obviously the latter is not the most ideal scenario given good instructional design practice). In either case, the recommended action is to ask the instructor-expert, "What can students do, after taking your course, that they couldn't do before?" The answer to this question leads to a discussion of the terminal or course-level objectives.


Figure 2. A visual breakdown of learning objectives in the PB&J course.


Figure 3. Course and module objectives used practically in the LMS.


Scenario 3: Faculty member has course objectives but doesn't have module/unit objectives.

In this scenario, perhaps the faculty member or instructor-expert has course objectives that are mandated by a department or program, or it's possible that the collaboration team has just graduated from Scenario 2. The recommended course of action is to ask the question, "What must students be able to do before accomplishing the course objectives?" The answer to this question will provide the team with the enabling or module/unit-level objectives. However, be sure to differentiate between enabling objectives and entry/prerequisite skills.

Scenario 4: Faculty member has some or all objectives that are immeasurable, vague, or "fuzzy."

This scenario is arguably the most common. Instructor-experts, as described earlier, often feel equipped to provide their own learning objectives with little or no background in education or sound instructional design practice. When an instructor-expert comes to the table with learning objectives that don't meet QM Standards, the recommended action is to inquire as to how that particular objective will be assessed in the course. If the answer is a multiple choice exam, chances are good that an appropriate verb for the learning objective is "identify." If the answer is fill-in-the-blank questions, more appropriate verbs include "recall," "name," and "recite." If the assessment instrument is an essay or a project, the prompt or instructions become the objectives themselves, although they may have to be generalized. For example, an essay prompt of "Compare and contrast the propaganda techniques of the Black Panther Party and the Socialist Workers Party" lends itself to an objective of "Students will be able to evaluate propaganda techniques of 20th-century revolutionary movements."

Scenario 5: Nothing else has worked. You've reached a "brick wall."

Some instructor-experts remain absolutely convinced that either their subjects are too abstract to warrant measurable objectives or that their immeasurable objectives are already suitable with no revision necessary. The recommended action in this case is to present the following situation: "Your student is going to work at an entry-level job in the area of this course. What is he/she going to do at work? What earns him/her a paycheck?" This doesn't necessarily give the collaboration team any direct answers, especially in liberal arts-type subject areas, but it can provide a jumping-off point or conversation starter to get on a productive and positive path.

Summary

The interaction and collaboration that take place between an instructional designer and instructor-expert tend to be unique, partially as a result of the widely varying backgrounds of the two parties, and can be somewhat complicated by the notion that part of an instructional designer's skill set is innate and can be performed solely by the instructor-expert. Part of this collaborative process can include the composition of course- and module/unit-level learning objectives, either (ideally) during the design phase or (less ideally) retroactively after the course has been developed. In either case, properly written, measurable, and appropriate learning objectives are vitally important because they provide students clear expectations, they inform the selection of instructional materials and the instructional strategy, and they are used to develop assessment instruments.

When instructor-experts approach the collaborative environment without learning objectives or with learning objectives that are not measurable or well written, a discussion of terminal and enabling objectives is an effective tool for beginning the process or revising existing objectives. This approach is clearer and more direct than other methods, such as providing framed statements using Bloom's Taxonomy-aligned verbs.

References

Acevedo, M. (2013). Beyond understanding: Working with faculty to compose exemplary objectives. Concurrent session presented at the 5th Annual QM Conference on Quality Assurance in Online Learning. October 1–4.

Aleckson, J., & Ralston-Berg, P. (2013). Mindmeld: Micro-collaboration between eLearning designers and instructor experts (1st ed.). Madison, WI: Atwood Publishing.

Dick, W., Carey, L., & Carey, J. (2009). The systematic design of instruction (7th ed.). Upper Saddle River, NJ: Merrill.

Gagné, R. (1977). The conditions of learning (3rd ed., p. 291). New York, NY: Holt, Rinehart and Winston.

Mager, R. F. (1962). Preparing instructional objectives. Palo Alto, CA: Fearon Publishers.

Merrill, M. D. (2013, October 17). My hopes for the future of instructional technology [Webinar]. Retrieved from http://cc.readytalk.com/play?id=2xrvdz

Rosenberg, M. (2012, July 12). Marc my words: Why I hate instructional objectives [Web log message]. Retrieved from http://www.learningsolutionsmag.com/articles/965/marc-my-words-why-i-hate-instructional-objectives

Rossett, A. (2012, July 12). Why I LOVE instructional objectives [Web log message]. Retrieved from http://www.learningsolutionsmag.com/articles/968/

Get Rid of the Gray: Make Accessibility More Black and White!

Erin Blauvelt

The Quality Matters Rubric (Quality Matters, 2011), a nationally recognized benchmark for the quality of the design of online courses, holds accessibility as an essential element of a high-quality online course. Creating and editing courses with accessible elements can be difficult, both in understanding and in process, as being able to interpret and administer the technical standards of Section 508 of the Rehabilitation Act of 1973 (1998) takes time and study. Customizing a definition of accessibility in online courses and creating the specific elements and best practices for an institution is essential in carrying out a plan for editing and developing accessible online courses and meeting the Quality Matters Rubric (Quality Matters, 2011). The purpose of this paper is to outline the process that Excelsior College used to establish an accessibility standards list and implementation plan to fit specifically with the course design and student population and to describe some best practices in coding and accessible design requirements to meet Standards 8.3 and 8.4 of the Quality Matters Rubric.

Keywords: accessibility, Quality Matters, online education, online course quality, best practices

Introduction

Laws, publications, and standards for web content accessibility exist for the purpose of assisting in designing accessible web pages for users with disabilities. Implementing the standards for web page design into an online course remains a "gray" topic and can be difficult to discuss and carry out in a "black and white" manner. The features of an online course, both technical and in purpose, differ from those of a typical website, for which most publications on web content accessibility are written. It can be difficult to define what an accessible online course means to an institution and to move forward with a plan to be reactive and proactive toward accessibility. Excelsior College is currently completing a four-year project to make the entire library of 500+ online courses accessible as well as meet Standard 8 of the Quality Matters Rubric (Quality Matters, 2011).

Background Information

The Americans with Disabilities Act (1990), or ADA, which has been amended multiple times since its inception in 1990, outlines regulations and guidelines for providing equal access for persons with disabilities. Section 508 of the Rehabilitation Act of 1973 (1998) outlines technology-related regulations and standards for accessible web design. When students with medically documented disabilities request an accommodation during an online course, the institution is required to provide reasonable accommodation to the student. A reasonable accommodation adapts an exam, educational aid (in this case, an online course), or degree program requirement allowing equal access for an individual with a disability (Excelsior College, 2013).

In compliance with ADA mandates, Excelsior has a system in place for students with documented disabilities to receive accommodations for their online courses. This process is reactive in that students must first request an accommodation, and then the course is outfitted to meet their needs. In 2012, the decision was made to go beyond current federal laws and become more proactive in the approach to serve both students with documented disabilities and those that would also benefit from ADA-accessible course design principles, which is the concept of universal design. Universal design is a set of guidelines for the development of educational materials that provides all individuals, including those with disabilities and those without, comparable access to those educational materials (CAST, 2013). Individuals without documented disabilities can also benefit from universal design principles. For example, individuals with learning preferences (i.e., auditory or visual), environmental limitations (i.e., no access to speakers or a headset to listen to a lecture), and language barriers (i.e., English as a Second Language) reap benefits from universal design.

Excelsior College is pursuing institution-level recognition by Quality Matters and is currently in the second year of a three-year implementation plan. Accessibility is one of the eight General Standards of the Quality Matters Rubric (Quality Matters, 2011); thus certain criteria must be met in order to meet Quality Matters standards.

Accessibility Project Overview

The first two years of Excelsior's accessibility project have included creating a standards list, editing cascading style sheets (CSS) and Dreamweaver templates to meet accessibility standards, implementing a process to bring online courses in accordance with the developed standards list, and editing roughly 208 courses to comply with the developed standards list. A contractor was hired to assist in all areas of the project, but mostly for the purpose of serving as a co-subject matter expert and completing the bulk of the actual course edits. After developing a standards list customized for our courses (see Table 1), we edited institutional-level online course Dreamweaver templates and CSS for compliance with the standards list. The creation and implementation of the course revision process (see Figure 1) began once the standards list, templates, and CSS files were created and edited.

Developing an Instructional Accessibility Standards List

The first step in the project was to work collaboratively with the contractor to develop a standards list based on Section 508 standards (Rehabilitation Act, 1998), WCAG 1.0 Priority guidelines (W3C, 1999a), the design of and elements in Excelsior's online courses, and our specific student population. WCAG 1.0 includes three Priority levels, Priority 1 containing standards that content developers "must satisfy" (W3C, 1999b). Excelsior, serving mostly nontraditional adult learners, has a unique student population that would benefit from certain accessibility standards/universal design principles beyond WCAG 1.0 Priority 1 guidelines, so Priorities 2 and 3 were also considered when developing the customized standards list. For example, standards were added from Priorities 2 and 3, which address elements such as page organization and expanded detail of acronyms and tables, with our high military student population (approximately 38% of our current student population) in mind.

According to the American Council on Education (2011), individuals who served in Iraq and Afghanistan have up to a 40% chance of acquiring a traumatic brain injury. We can anticipate that there are more students with Post-Traumatic Stress Disorder (PTSD) and Traumatic Brain Injury (TBI) than have registered with our Office of Disability Services. PTSD and TBI sufferers typically experience difficulty with attention, concentration, and information processing (American Council on Education, 2011), so page organization can be important in their ability to absorb the content. Some Section 508 and WCAG standards do not apply to Excelsior's online courses, so the customized standards list was simpler than the Section 508 or WCAG standards lists. Upon establishing the standards list, Excelsior modified institutional course development practices to ensure that all newly developed and revised courses aligned with the standards list.

The Revision Process

This project is composed of continuously moving parts; therefore there existed challenges to arriving at a process that would account for periods of review, editing, and collection of materials – let alone sidestep the continuous tasks of preparing our online courses for each term, implementing emergency fixes unrelated to this project, and normal course revision cycles. Excelsior College has regimented course development, course editing, and term preparation procedures and deadlines, which limit the amount of time a course may be out of commission for completion of accessibility edits. An additional challenge was minimizing, as much as possible, the time we ask of the academic units, who are responsible for managing the course content.

The process outlined in Figure 1 was built toward the beginning of the project, and we have had success with its tasks and their order. Experimentation with the time span for each task and the number of courses in each task at one time has taken place. During the first year of the project, 101 courses were put through the process, with all of the courses moving through each task at the same time. This was found to be difficult to manage with only one staff member acting as Project Manager and long-term preparation periods when courses could not be edited. The decision was made to schedule courses in batches of ten, with a new batch starting the process every few weeks during the second year of the project, which was found to be a much more manageable solution.

As each course moves through the process, it is analyzed by the contractor using the accessibility standards list described above and submitted to the Project Manager in spreadsheet format. The spreadsheet is divided into items that the contractor can edit without any additional input, and items that need either Project Manager or academic unit input in order to be edited. The Project Manager first provides input that can be handed over without anyone else's involvement and then reaches out to the academic unit responsible for the course for input if needed. The Project Manager creates a clear list of input needing the attention of the academic unit and places a deadline for the input to be returned. Examples of input often needing the attention of the academic unit include creating alternative text for complex diagrams and tables and obtaining text-based versions of PDFs to replace scanned versions. After all input has been gathered, the course returns to the contractor for editing.

Key: AU = Academic Unit; PM = Project Manager; C = Contractor

Figure 1. Course revision process.

Once editing is complete, the Project Manager reviews the course using the accessibility standards list and either approves the course or sends it back to the contractor for additional work if needed. If additional work has been requested, the Project Manager needs to review the course again for approval.

A goal for the second year of the project is to add staff members to the project to support the Project Manager, spreading out the work that needs to be completed, as well as the knowledge of online course accessibility in general. We also plan to implement the accessibility standards list into our course development process as soon as possible.

Quality Matters Standards 8.3 and 8.4

Quality Matters Standards 8.3 and 8.4 are both two-point standards, meaning that Quality Matters declares them as "Very Important" but not "Essential" (Quality Matters, 2011).

Standard 8.3 requires that "course design facilitates readability and minimizes distractions," which focuses on the visual aspects of the course. This standard most obviously affects students with physical impairments such as low vision or blindness, but it also affects those with cognitive disabilities, as their brains do not process visual elements in the same way that nondisabled students would. For example, while flashing objects can cause trouble for someone with a physical disability such as epilepsy, they may also be a barrier for processing anything else on the page for someone with Attention Deficit Hyperactivity Disorder (WebAIM, 2014). This standard covers elements such as the use of color, tables, graphics, text placement, and text formatting (Quality Matters, 2011). Table 1 provides some examples of common elements in an online course web page and some best practices to make each of these elements accessible.

While minimizing distractions does not mean that any design elements or images should be eliminated from online course pages, simplifying the design of the pages can make it easier for those with cognitive disabilities to process the page, or manipulate the page using an assistive technology program.

Quality Matters Standard 8.4 states "The course design accommodates the use of assistive technologies" (Quality Matters, 2011). Assistive technology refers to equipment or software that is used to improve or correct the functions of disabled persons (Assistive Technology Industry Association, n.d.). Assistive technologies can either be input devices that allow users to control and navigate computers and web pages, or output devices that interpret and/or manipulate data and elements on computers and web pages, such as screen magnifiers, screen readers, and learning disabilities programs (Microsoft, 2014). A vision-impaired student may use an assistive technology like the screen reader JAWS (2014) to have the elements on the screen read aloud to them. A student with a cognitive disability may be much more successful processing the information when it is read to them by a screen reader or other type of assistive technology, and they may also benefit from being able to take the content on the screen and manipulate it, adding highlighting, breaking up areas of text, turning off images, or adding notes.

Standard 8.4 covers elements such as text formatting, equations, links, tables, scanned PDFs (portable document formats), and media (Quality Matters, 2011). Table 2 provides some examples of common elements in an online course web page and some best practices to make each of these elements accessible.


Table 1. Examples of Standard 8.3 Elements and Best Practices

Color
• Do not use color for instruction
• Use only high-contrast colors together
• Use color sparingly to keep the look simple

Tables
• Use tables to convey data in a simple, clean format

Graphics
• Include an alt text tag with a description of the image for all essential images
• Do not include the citation of the image in the alt text tag
• Avoid flashing graphics
• Avoid animations that do not align with content

Text placement and order
• Use <h1>, <h2>, etc. heading elements to convey headings and subheadings
• Use bulleted and numbered lists where possible for simplicity
• Break up large areas of text by chunking topics or using relevant graphics

Text formatting
• Use consistent font types and sizes
• Do not use the underline tag for emphasis, only for links (use bold and italics for emphasis)

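To make these page-level practices concrete, the following minimal HTML sketch shows one way a course page might apply them: true heading elements for structure, a short bulleted list in place of a block of text, and an essential image with descriptive alt text and no citation in the alt attribute. The module title, text, and image file name are hypothetical illustrations, not markup taken from an actual Excelsior course.

<!-- Hypothetical course page fragment illustrating Standard 8.3 practices -->
<h1>Module 1: Getting Started</h1>

<h2>Overview</h2>
<p>This module introduces the course tools and the weekly routine.</p>

<h2>What to Do This Week</h2>
<ul>
  <li>Read the course orientation page.</li>
  <li>Post a short introduction in the discussion forum.</li>
  <li>Complete the practice quiz.</li>
</ul>

<!-- Descriptive alt text for an essential image; any citation for the
     image is placed in visible text rather than in the alt attribute. -->
<img src="weekly-cycle.png"
     alt="Diagram of the weekly cycle: read, discuss, practice, and submit.">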



Table 2. Examples of Standard 8.4 Elements and Best Practices

Text formatting
• Use the <abbr> tag for abbreviations or acronyms
• Use em for size instead of px (em is resizable, px is not)

Equations
• Use HTML codes for symbols when possible
• If it is not possible to use HTML codes (complex equations), add the equation as an image with a proper alt text tag giving the text form of the equation

Links
• Use the destination description as the link title, not the URL
• If a link opens a PDF, video, audio file, etc., include the file type and size

Tables
• Only use tables for data, not design
• Use the <caption> tag to describe table data
• Use the <th> (table header) tag to signify column headings

Scanned PDFs
• Find a text-based version of the PDF, or scan a copy of the PDF (or original work) using an OCR (optical character recognition) software program

Media
• Provide text-only copies of media

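As a companion illustration at the markup level, the brief HTML sketch below applies several of these practices: an abbreviation tag, em-based font sizing instead of px, link text that describes the destination and notes the file type and size, and a small data table with a caption and header cells. The course content, file name, and file size are invented for the example rather than drawn from an actual course.

<!-- Hypothetical fragment illustrating Standard 8.4 practices -->
<p>The <abbr title="Quality Matters">QM</abbr> review of this course opens next week.</p>

<!-- Size in em so text can be resized by the user; px does not scale. -->
<p style="font-size: 1.1em;">Reminder: the first draft is due Friday.</p>

<!-- Link text names the destination and includes file type and size. -->
<p><a href="syllabus.pdf">Course syllabus (PDF, 350 KB)</a></p>

<!-- Table used for data only, with a caption and table header cells. -->
<table>
  <caption>Discussion posts required each week</caption>
  <tr>
    <th>Week</th>
    <th>Posts required</th>
  </tr>
  <tr>
    <td>1</td>
    <td>2</td>
  </tr>
  <tr>
    <td>2</td>
    <td>3</td>
  </tr>
</table>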

Conclusion

Excelsior College is committed to offering accommodations to students with disabilities and assisting all students in being successful in their online courses. The Quality Matters Rubric (Quality Matters, 2011) holds accessibility as an essential element of a high-quality online course; however, creating accessible online courses and retrofitting existing courses for accessibility can be difficult. Careful consideration of specific institutional needs and online course structure, along with a structured implementation plan, can be helpful in administering the technical standards of Section 508 (Rehabilitation Act, 1998) into your online course program and meeting the Quality Matters Rubric (Quality Matters, 2011).

References

American Council on Education. (2011). Accommodating student veterans with traumatic brain injury and post-traumatic stress disorder: Tips for campus faculty and staff. Retrieved from http://www.acenet.edu/news-room/Documents/Accommodating-Student-Veterans-with-Traumatic-Brain-Injury-and-Post-Traumatic-Stress-Disorder.pdf

Americans with Disabilities Act of 1990, Pub. L. No. 101-336, § 2, 104 Stat. 328 (1990). Retrieved from http://www.ada.gov/pubs/adastatute08.htm

Assistive Technology Industry Association. (n.d.). What is assistive technology? How is it funded? Retrieved from http://www.atia.org/i4a/pages/index.cfm?pageid=3859

CAST. (2013). About UDL. Retrieved from http://www.cast.org/udl/

Excelsior College. (2013). Excelsior college disability services. Retrieved from http://www.excelsior.edu/disability-services

JAWS [software]. (2014). St. Petersburg, FL: Freedom Scientific. Retrieved from http://www.freedomscientific.com/products/fs/jaws-product-page.asp

Microsoft. (2014). Microsoft accessibility. Retrieved from http://www.microsoft.com/enable/at/types.aspx

Quality Matters. (2011). Quality Matters rubric for higher education 2011–2013 edition. Retrieved from https://www.qualitymatters.org/rubric

Rehabilitation Act of 1973, 29 U.S.C. § 794D (1998). Retrieved from http://www.section508.gov/section-508-standards-guide

W3C. (1999a). Web content accessibility guidelines 1.0. Retrieved from http://www.w3.org/TR/WCAG10/

W3C. (1999b). Checklist of checkpoints for web content accessibility guidelines 1.0. Retrieved from http://www.w3.org/TR/WCAG10/full-checklist.html

WebAIM. (2013). Cognitive disabilities. Retrieved from http://webaim.org/articles/cognitive/

Continuous Improvement of the QM Rubric and Review Processes: Scholarship of Integration and Application

Kay Shattuck [A], Whitney Alicia Zimmerman [B], Deborah Adair [C]

Quality Matters (QM) is a faculty-centered, peer review process that is designed to certify the quality of online and blended courses. QM is a leader in quality assurance for online education and has received national recognition for its scalable, peer-based approach and continuous improvement in online education and student learning. Regular, robust review and refreshment of the QM Rubric™ and processes keep them current, practical, and applicable across academic disciplines and academic levels. The review ensures validity in the set of quality standards that make up the Rubric. An overview of the regular review of the QM Rubric and process, as well as examples of the use of data to continuously improve the Rubric and process, are presented. The guiding principles of QM – a process that is continuously improved upon and that is collegial and collaborative – are discussed in relationship to Boyer's scholarship of application and scholarship of integration. Glassick (2000) noted that Boyer's scholarship of overlapping discovery, integration, application, and teaching is "a hard but worthwhile task" (p. 880). This article outlines how the dynamic and rigorous processes adopted by QM continue to take on that worthwhile task.

Keywords: Quality Matters, course design, professional development, continuous improvement, quality assurance, rater agreement

Introduction and Background

The Quality Matters (QM) Program was initially developed under a 2003–2006 Department of Education Fund for the Improvement of Post-Secondary Education (FIPSE) grant. The grant, awarded to the not-for-profit consortium, MarylandOnline, was for the development of a replicable quality assurance program focused on faculty peer review and improvements to the design of online courses. During the grant period, a community of practice within Maryland researched, developed, implemented, and disseminated a set of quality benchmarks (standards) (Shattuck, 2007), as well as a rigorous peer review process to improve student learning in online courses. In their wisdom, the developers of the QM program recognized that providing an instrument (a Rubric) and a process for using this Rubric would not be enough. Drawing from their own experiences as members of a community of practice that worked together for many years to solve the common issue of improving online course designs (Cervero & Wilson, 1994; Lave & Wenger, 1991; Schön, 1983; Cousin & Deepwell, 2005; Guldberg & Pilkington, 2006), they included required credentialing of specific competencies in the use of the Rubric and in an understanding of the application of the QM guiding principles of being collaborative, collegial, continuous, and centered in academic foundations around student learning.

[A] Director of Research, Quality Matters Program
[B] Doctoral Student, The Pennsylvania State University
[C] Managing Director and Chief Planning Officer, Quality Matters Program


Quality Matters is a program that subscribing educational institutions use within the cadre of other components necessary to assure quality in their online learning programs. While the QM Rubric is focused on the design of online and blended courses, the QM process was developed with the awareness that it impacts faculty readiness through the QM professional training program (emphasizing the pedagogical underpinning of course design), as well as the benefits of collegial interactions across academic disciplines and educational institutions. Other factors affecting course quality include course delivery (teaching), course content, course delivery system, institutional infrastructure, faculty training/readiness, and student readiness/engagement. The importance of other components in an institution's quality assurance commitment to online education is acknowledged within the QM standards.

Quality Matters is a faculty-centered, peer review process that is designed to certify the quality of online and blended courses. QM is a leader in quality assurance for online education and has received national recognition for its scalable, peer-based approach and continuous improvement in online education and student learning. As of the winter of 2013, there are 825 subscribing educational institutions and 160 individual subscribers; 3,998 courses have been formally peer reviewed; and 28,756 online educators have successfully completed QM professional development courses.

In this article, the QM guiding principles – a process that is continuously improved upon and that is collegial and collaborative – are discussed in relationship to Boyer's scholarship of application and scholarship of integration. An overview of the regular review of the QM Rubric and process, as well as examples of the use of data to continuously improve the Rubric and process, are presented.

Scholarships of Application and Integration

While the construct of CoP (community of practice) (Shattuck, 2007) is useful in understanding the developmental phases of the QM program, the past decade can be described as an evolving practice of Boyer's (1990) scholarships of application and integration. In a seminal publication of The Carnegie Foundation for the Advancement of Teaching, Scholarship Reconsidered, Ernest Boyer challenged higher education to move beyond "teaching versus research" (p. 16) and for faculty to take on a scholarly approach to teaching by rigorous study of teaching in ways that are collaborative and connect theory with the realities of teaching. The term "the scholarship of teaching and learning (SoTL)" is becoming an increasingly familiar concept in higher education (Hutchings, Huber, & Ciccone, 2011). Lesser known is that Boyer suggested "four separate, yet overlapping, functions" (p. 16) of scholarship. Those are the scholarships of discovery, integration, application, and teaching, and have been applied as useful tools in defining scholarship (AACN, 1999).

• The scholarship of discovery relates to the most traditional functions of research, that is, exploration to generate new knowledge.

• The scholarship of integration is "inter-disciplinary, interpretive, integrative" (italics in original) (Boyer, p. 20) and about "making connections across disciplines" (p. 18).
• The scholarship of application is about the use of knowledge from research to improve societal problems.
• The scholarship of teaching encompasses the relationship between teacher and student in which the teacher is also a learner to improve student intellectual growth.

Boyer's call "to liberate academic careers from the hegemony of published research as the dominant product and measure of scholarship" (Bernstein & Bass, 2005, para. 41) served as a "tipping point" in the century-long debate of research versus teaching (Rice, 2002, p. 7). The growing sophistication of digital technologies of the past decade introduces new formats for the production, publication, and dissemination of faculty scholarship (Bernstein & Bass, 2005; Hatch, Bass, Iiyoshi, & Mace, 2004). The scholarship of application and integration is evident in QM's research on continuous improvement. Examples described in this article are ongoing:

• Regular review and refinement of the QM Rubric and peer review processes;
• Consistently rigorous applications of the QM process, which are inter-disciplinary and integrative, and provide tools and strategies for interpreting research into useable processes; and
• Statistical analyses of data gathered during the QM peer reviews, which inform continuous improvement of the QM Rubric and application of research and shared online teaching/designing expertise across academic disciplines and educational institutions.

Ultimately, the scholarship of teaching is behind the QM commitment to development and dissemination of standards of quality in online course design, which is a key phase in developing strong teaching presence. The scholarship of discovery – "disciplined work that seeks to interpret, draw together, and bring new insight to bear on original research" (Boyer, p. 19) – is the focus of QM's interest in original research. This interest will be the focus for 2014–2015.

Regular Review and Refinement of the QM Rubric and Processes

The 2007 article by Shattuck describes the development of the eight general standards of quality online course design as they were (and continue to be) informed by the independent research literature and established best practices. The QM Rubric and processes are dynamically interpretive of evolving research and best practices. The plan to conduct a complete review of the QM Higher Education Rubric and peer review process was established during the grant period, and reviews have become more thorough over the past decade. The history of review and refinement of the QM Higher Education Rubric and Processes chart outlines the review process and outcomes for the past five Rubrics, from the first to the current review.

The chart outlines the continuously improving processes used by QM to ensure wide input and transparency in the refinement of the Rubric and the peer review process. Figure 1 represents the current, rigorous, and comprehensive process followed to launch each new edition of the QM Rubric. The process is undergirded by the commitment to interpret research, best practices, and teaching/designing expertise into an applicable process that can be used across all academic disciplines. The collaboration of peer reviewers across disciplines points to Boyer's scholarships of application (practice) and integration.

The collaboration of peer reviewers across disciplines points to Boyer's scholarships of application (practice) and integration.

Consistently Rigorous Application of the QM Peer Review Process

Following the principles of faculty-centered and continuous improvement, the QM higher education Rubric has been thoroughly reviewed and refined to ensure it remains a current and effective set of quality guidelines in online course design. It is important to recognize that while there is an openly accessible listing of QM standards, the full QM Rubric contains detailed annotations for each standard that assist in interpreting and applying standards during a course review. A course review conducted without access to the complete QM Rubric, or conducted by non-QM-certified reviewers, does not meet the rigors of the QM process. QM course reviews are conducted by a team of three certified QM Peer Reviewers (PRs) – all are active online instructors, all are currently certified as QM PRs, at least one PR is from outside the institution of the course under review, and at least one PR is a subject matter expert (SME) in the academic discipline of the course under review. Each team is led by a QM Master Reviewer (MR) who has extensive experience in online teaching and in the QM review process, as well as additional training in facilitating an inter-institutional virtual collaboration of academic peers.

Each QM PR brings at least two years of current experience teaching online. Additionally, each is required to complete rigorous QM training to become QM certified; each is subsequently added to the QM database of PRs available to conduct QM course reviews. Each certified PR's academic discipline is included in the database. Course review teams are developed using the database of certified PRs. This ensures that at least one SME related to the course under review is included on each team. While a QM review does not evaluate the content of a course, an SME serves as a resource for others on a review team on any course design implications for a particular academic discipline. Each review team is chaired by a QM MR, an experienced reviewer with advanced training on the rubric and review process, who guides the team as needed in interpretation of the standards.

Consistent Application of QM Peer Reviews

Quality Matters is sometimes mistakenly described as a "Rubric," while in fact it is a process of engaging online faculty who have further training in their use of a validated set of standards (encapsulated in the QM Rubric). This set of standards guides reviewers in their collaborative assessment of the design quality of a particular online course. The rigorous QM peer review process that results in courses meeting QM standards of quality (either initially or upon amendments) includes formal and informal reviews of online courses and online components of blended courses. Informal use of the QM Rubric is at the discretion of the subscribing institution. Formal course reviews are either managed by the QM program staff (QM-managed) or by certified QM representatives within a subscribing institution (subscriber-managed).

The analysis of 2008–2010 data found no difference between QM- and subscriber-managed formal course reviews in terms of total points (t(272) = 0.831, p = .406) or review statuses (χ²(2) = 0.500, p = .779).

Figure 1. Steps in QM Rubric revision process. (© 2014 MarylandOnline / Quality Matters Program)


This attests to the level of competencies provided by the QM professional development required to become a PR and to the consistency of application of the inter-institutional, faculty-focused, peer-collegial review processes.

Examination of Peer Reviews 2008–2013

Statistical analyses were conducted on data gathered from formal course reviews conducted from 2008 through 2010 (n included 434 course reviews; of those, 180 were "informally managed") and from 2011 through July 2013 (N = 1,494). These data were explored to identify (1) the frequency of courses meeting QM standards in initial reviews and after amendments, (2) the most frequently missed standards, (3) differences between courses from different academic disciplines, (4) differences between courses submitted by faculty developers/instructors with and without familiarity with QM, and (5) the proportion of inter-rater agreement by specific standards.

Rate of Courses Initially Meeting QM Standards

In the technical report for the 2008–2010 QM Rubric (Zimmerman, 2010), the results of 274 QM- and subscriber-managed course reviews were analyzed. At the time of the data analysis, 39% (105) of courses met standards in the initial course review. An additional 48% (131) did not meet standards in the initial course review but did meet standards after an amendment. The remaining 14% (38) of course reviews were considered to be in the process of amendments.

In the technical report for the 2011–2013 QM Rubric (Zimmerman, 2013), the results of 1,490 course reviews were analyzed. At the time of the data analysis, 70.5% (1,051) of courses met standards in the initial review. An additional 26.6% (397) of courses did not meet standards in the initial course review but did meet standards after an amendment. The remaining 2.8% (42) of courses were pending amendment. Explanations of the increase in courses meeting QM standards during the initial peer review include (1) more courses are being developed using the QM Rubric as a course design guide; (2) more subscribing institutions are providing informal course reviews prior to submission for a formal peer review; and (3) more faculty and design teams have acquired effective competencies in the nuances of online teaching and course design.

Most Frequently Missed Standards

For the 2008–2010 course reviews, the most frequently met standards were 6.1 and 6.5; both were met in 96% of course reviews. The standards most frequently not met were 8.2 and 3.5; they were not met in 54% and 60% of course reviews, respectively.

For the 2011–2013 course reviews, the most frequently met standards were 6.1 and 7.2; both were met in 95% of course reviews. The standards most frequently not met were again 8.2 and 3.5; they were not met in 58.9% and 65.3% of course reviews, respectively. Note that for the analyses of the 2011–2013 course reviews, standards met in initial reviews versus amended reviews were not distinguished.

Frequently missed standards are reviewed carefully by the Rubric Committee, as they might indicate a need for refinement of the standard and annotation wording or a need for focused QM professional development for PRs.
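For readers who want to reproduce comparisons of this kind (the QM- versus subscriber-managed t-test and chi-square test reported above), the following sketch shows one possible workflow. The column names, the toy data, and the SciPy-based calls are illustrative assumptions; this is not the code behind the cited technical reports.

```python
# Hypothetical re-creation of the managed-vs-subscriber comparison; the
# DataFrame columns ("managed_by", "total_points", "review_status") are
# invented for illustration and are not the QM data set.
import pandas as pd
from scipy import stats

reviews = pd.DataFrame({
    "managed_by":    ["QM", "subscriber", "QM", "subscriber", "QM", "subscriber"],
    "total_points":  [88, 79, 92, 85, 81, 90],
    "review_status": ["met", "amended", "met", "met", "amended", "not met"],
})

qm = reviews.loc[reviews["managed_by"] == "QM", "total_points"]
sub = reviews.loc[reviews["managed_by"] == "subscriber", "total_points"]

# Independent-samples t-test on total points, analogous to t(272) = 0.831, p = .406.
t_stat, t_p = stats.ttest_ind(qm, sub)

# Chi-square test of independence on review status, analogous to chi2(2) = 0.500, p = .779.
contingency = pd.crosstab(reviews["managed_by"], reviews["review_status"])
chi2, chi_p, dof, _ = stats.chi2_contingency(contingency)

print(t_stat, t_p, chi2, chi_p, dof)
```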

Differences of Review Success by Course Discipline

Analysis of courses reviewed from 2011 through July 2013 revealed that business courses tended to have the best outcomes. Business courses were most likely to meet standards in the initial review, followed by education courses. Business courses also had the highest total scores. Courses in the remaining disciplines did not significantly differ from one another.

Relationship between Faculty Developer/Instructor of Reviewed Course and Familiarity with the QM Rubric

In the analyses of the 2011–2013 course reviews, courses submitted by individuals familiar with QM had higher initial scores than courses submitted by individuals who were not familiar with QM (Mann–Whitney U (N = 1,488) = 43,537, p < .001). However, there were no total point differences after amendment (Mann–Whitney U (N = 1,488) = 61,900, p = .108). (The amendment phase includes interaction with the peer review team.)

The familiarity of faculty developers and instructors with the QM Rubric was examined in relation to the outcome of the initial course review and the amended course review (when needed). In the analysis of the 2011–2013 Rubric, the majority (93.3%) of individuals who submitted courses for review were familiar with the Rubric. Only 98 out of 1,492 (6.6%) individuals stated that they were not familiar with the Rubric.

Proportion of Rater Agreement by Specific Standards

Measures of reliability are often given when discussing scores such as those assigned using the QM Rubric. The term "reliability" refers to consistency of results. Inter-rater reliability is a measure of the relationship between scores assigned by different individuals (Hogan, 2007). In its strictest sense, however, inter-rater reliability works under the assumption that reviewers are randomly selected and interchangeable (see Suen, Logan, Neisworth, & Bagnato, 1995). This assumption is not met in QM's process, in which reviewers may be selected on the basis of their previous experiences or areas of expertise. The measurement of interest concerning the QM Rubric is the proportion of reviews in which all three raters assigned the same rating to a specific standard (i.e., all three reviewers assessed a standard as met or not yet met). This is different from inter-rater reliability in that it is not an attempt at describing unsystematic variance (see Hallgren, 2012; Liao, Hunt, & Chen, 2010); its purpose is to provide an easily interpretable statistic that allows for the comparison of specific standards for practical purposes. Thus, in the discussion of consistency of results of QM's reviews, the term proportion of rater agreement is used, as it explicitly describes the analyses performed, as opposed to inter-rater reliability, which it technically is not.

One of the primary purposes of analyzing the proportion of rater agreement is to identify specific standards that may require attention to keep the Rubric reflective of the research and fields of practice while remaining workable for a team of inter-institutional, inter-disciplinarian academic peers. A specific standard for which reviewers frequently submit different scores may lack clarity; this could result in the need for changes to the specific standard or it could signal a need for more reviewer training.
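A minimal sketch of how such a proportion can be computed is shown below, assuming review-level data with one met/not-met rating per reviewer per standard. The column names and values are invented for illustration and are not the QM data set.

```python
# Minimal sketch: proportion of reviews in which all three reviewers assigned
# the same rating (met = 1 / not met = 0) to a given standard. Data are invented.
import pandas as pd

ratings = pd.DataFrame({
    "standard":   ["2.1", "2.1", "2.1", "8.2", "8.2", "8.2"],
    "review_id":  [101, 102, 103, 101, 102, 103],
    "reviewer_1": [1, 1, 0, 0, 1, 0],
    "reviewer_2": [1, 1, 1, 0, 0, 0],
    "reviewer_3": [1, 0, 1, 0, 1, 0],
})

reviewer_cols = ["reviewer_1", "reviewer_2", "reviewer_3"]

# A review "agrees" on a standard when all three ratings are identical.
ratings["all_agree"] = ratings[reviewer_cols].nunique(axis=1) == 1

# Proportion of rater agreement, computed per standard.
agreement = ratings.groupby("standard")["all_agree"].mean()
print(agreement)
```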

The "standards with lowest rater agreement" table provides an overview of the revisions made by the 2010 Rubric Committee for standards that statistically had the lowest rater agreement. The chart also provides data from the most recent data analysis and has been provided to the 2014 Rubric Committee.

Individual ratings given by a QM peer reviewer in course reviews reflect, to at least some extent, that particular reviewer's professional/pedagogical opinion and, therefore, may vary from the ratings of the other individual reviewers. However, markedly lower rater agreement for specific standards in the QM Rubric is a prompt to members of the Rubric Committee to focus attention on those standards during the regular review and refreshment of the QM Rubric.

Summary

Regular, robust (breadth and depth) review and refreshment of the QM Rubric and processes keep them current, practical, and applicable across academic disciplines and academic levels. The review includes interpretation of educational research, as well as an emerging emphasis on research generation. Expertise from online educators across the United States plays a critical role in the transparent, faculty-centered processes. The review ensures validity in the set of quality standards within the Rubric. Statistical analyses of data gathered from formal course reviews reveal that the peer review process has been consistently applied across review types and academic disciplines and point to the value of QM's professional development, in which over 28,000 online educators have participated. The analyses also provide critical information to the Rubric Committee on the frequency of met standards and on the proportion of rater agreement by specific standards.

Glassick (2000) noted that Boyer's scholarship of overlapping discovery, integration, application, and teaching is "a hard but worthwhile task" (p. 880). This article outlines how the dynamic and rigorous processes adopted by QM continue to take on that worthwhile task. All aspects of the QM program are regularly reviewed and refreshed with and for online teaching faculty.

References

American Association of Colleges of Nursing, AACN Task Force on Defining Standards for the Scholarship of Nursing. (1999). Defining scholarship for the discipline of nursing. Retrieved from http://www.aacn.nche.edu/publications/position/defining-scholarship

Bernstein, D., & Bass, R. (2005). The scholarship of teaching and learning. Academe, 91(4), 37–43.

Cervero, R. M., & Wilson, A. L. (1994). The politics of responsibility: A theory of program planning practice for adult education. Adult Education Quarterly, 45(1), 249–268.

Cousin, G., & Deepwell, F. (2005). Designs for network learning: A community of practice perspective. Studies in Higher Education, 30(1), 57–66.

Glassick, C. E. (2000). Boyer's expanded definitions of scholarship, the standards for assessing scholarship, and the elusiveness of the scholarship of teaching. Academic Medicine, 75(9), 877–880. Retrieved from http://www.academicpeds.org/events/assets/Glassick%20article.pdf

Guldberg, K., & Pilkington, R. (2006). A community of practice approach to the development of non-traditional learners through networked learning. Journal of Computer Assisted Learning, 22, 159–171.

Hallgren, K. A. (2012). Computing inter-rater reliability for observational data: An overview and tutorial. Tutorials in Quantitative Methods for Psychology, 8(1), 23–34.

Hatch, T., Bass, R., Iiyoshi, T., & Mace, D. (2004). Building knowledge for teaching and learning. Change, 36(5), 42–49.

Hogan, T. P. (2007). Psychological testing: A practical introduction. Hoboken, NJ: John Wiley & Sons.

Hutchings, P., Huber, M. T., & Ciccone, A. (2011). The scholarship of teaching and learning reconsidered: Institutional integration and impact. Hoboken, NJ: Wiley.

Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. Cambridge, UK: Cambridge University Press.

Liao, S. C., Hunt, E. A., & Chen, W. (2010). Comparison between inter-rater reliability and inter-rater agreement in performance assessment. Annals of the Academy of Medicine, 39(8), 613–618.

O'Banion, T. (1997). A learning college for the 21st century. Phoenix, AZ: Oryx Press.

Rice, R. E. (2002). Beyond scholarship reconsidered: Toward an enlarged vision of scholarly work for faculty members. In K. J. Zahorski (Ed.), Scholarship in the postmodern era: New venues, new values, new visions (New Directions for Teaching and Learning, No. 90, pp. 7–17). San Francisco: Jossey-Bass.

Schön, D. A. (1983). The reflective practitioner: How professionals think in action. New York: Basic Books.

Shattuck, K. (2007). Quality Matters: Collaborative program planning at a state level. Online Journal of Distance Learning Administration, 10(3). Retrieved from http://www.westga.edu/~distance/ojdla/fall103/shattuck103.htm

Suen, H. K., Logan, C. R., Neisworth, J. T., & Bagnato, S. (1995). Parent-professional congruence: Is it necessary? Journal of Early Intervention, 19(3), 243–252.

Zimmerman, W. A. (2010). Quality Matters 2008–2010 rubric report.

Zimmerman, W. A. (2013). Analyses of 2011–2013 Quality Matters Rubric.

End Notes

1 O'Banion's (1997) focus on learning as a key concept of the learner-centered movement has become intertwined with the scholarship of teaching.

2 While this article focused on the continuous refinement of the QM Higher Education Rubric, QM provides other Rubrics and accompanying review processes:

• K–12 Secondary Rubric (Grades 6–12), which, after the first regular Rubric review, is now in the second edition
• Higher Education Publisher Rubric

• K–12 Publisher Rubric
• Continuing and Professional Education Rubric

Internet Learning Volume 3 Issue 1 - Spring 2014

Measuring Online Course Design: A Comparative Analysis

Jiyu You, Sue Ann Hochberg, Phoebe Ballard, Mingli Xiao, Anthony Walters

This paper investigated the differences between students' and QM peer reviewers' perspectives on essential QM standards in three online courses. The results indicated that peer reviewers and students largely share the same point of view in regard to evidencing the standards; however, they differed significantly regarding three of the essential standards. Factors that might cause the discrepancy are further discussed.

Keywords: Quality Matters, online course, design

Introduction

Online learning programs have grown tremendously over the last ten years. Best practices and standards for online programs and courses have been developed and implemented in higher education. To ensure the quality of online courses, it is critical that they are designed according to a set of best practices or standards before they are delivered to students. Quality Matters is a faculty-driven, peer-review process that is collaborative, collegial, continuous, and centered in national standards of best practices and research findings in online and blended learning to promote student learning. It has been widely adopted by higher education across the nation as a process and a rubric to continuously improve online course quality.

This study attempted to (1) validate the instrument design based on QM Standards to measure online course design; (2) investigate to what degree the selected courses meet QM standards from a student's perspective; and (3) identify gaps between students' perspectives and QM certified reviewers' perspectives about QM essential standards.

The results of this study indicated that most of the items in the instrument, which was designed according to the Quality Matters standards, work to measure the design aspect of online courses. The results also show that there are three tiers (Tier I: to a great extent, Tier II: to a moderate extent, and Tier III: to little or some extent) in regard to meeting the standards in the three courses. The results on most of the standards evaluated in this study, provided by both reviewers and students, are the same, indicating that both peer reviewers and students take the same point of view in regard to evidencing the standards; however, they differed significantly on three of the essential standards regarding course objectives, unit learning objectives, and grading policy. One factor that might possibly lead to this discrepancy is that reviewers look for solid evidencing aligned with measurable learning outcomes, while students look for clearly articulated objectives.

A The University of Toledo


Literature Review

Student Perspectives

Several QM-related studies have been conducted with regard to student perspectives. These studies can be separated into two categories: a) student perceptions of the value of QM features in an online course, and b) student opinions about whether a course meets QM standards or not. Ralston-Berg and Nath (2008) stated that students value the same standards marked as essential ("3") and very important ("2") by QM, but place significantly less value on standards marked as important ("1") by QM. They further noted that students who claim to have high satisfaction in online courses also value all QM features more than those who claim low satisfaction. Similarly, in Ralston-Berg's (2011) study, results ranked by importance to students for success correlated with QM standards. Knowles and Kalata (2010), as cited in Shattuck (2012), stated that there might be a discrepancy in expectations between students and experienced QM master reviewers. They further offered possible explanations for this discrepancy: that students simply completed the survey without thinking about the standards and the course content, or that many of the design aspects that were clarified by the instructors while the course was being taught were conveyed via channels that are not available to the peer reviewers.

Quality Matters Standards and Review Process

Quality Matters (QM) is a process and a rubric to continuously improve online course quality (Shattuck, 2012). It is a faculty-driven, peer-review process that is collaborative, collegial, continuous, and centered in national standards of best practices and research findings in online and blended learning to promote student learning. Quality Matters is a leader in quality assurance for online education and has received national recognition for its peer-based approach and continuous improvement in online education and student learning. The research-based QM Rubric is designed to evaluate only course design--not course delivery or content. The QM Rubric consists of eight broad categories broken down into 41 individual standards. These 41 standards can be used in a variety of ways, ranging from providing guidelines for course development to the evaluation and certification of courses through an internal or external review process.

The goal of the QM review process is to continuously improve online course quality. According to Shattuck (2007), the process begins with a mature course, meaning the course has been offered for at least two semesters and the course instructor has revised it based on previous experiences. A review team with three certified QM reviewers who have online teaching experience will review the course and provide feedback to the course developer. When conducting formal reviews, one of the review team members must be a subject matter expert in the field of the course being reviewed and one member must be a master reviewer. In the event that a course does not meet the required 85% (81 of 96 points, including all 21 3-point essential specific standards), constructive recommendations will be sent to the instructor/course developer. The instructor/course developer can meet with instructional designers to revise the course according to the recommendations. All courses reviewed by the QM review team are expected to meet the standards after necessary design improvements.
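As a purely illustrative aid, the pass rule described above can be expressed as a small check. The variable names and toy values are assumptions made for this sketch; this is not QM's scoring software.

```python
# Illustrative check of the QM review decision rule described above:
# a course meets standards when it earns the required 85% of the 96 possible
# points (the article cites 81 of 96) AND every 3-point essential standard is met.
# The values below are invented for the example.

POINTS_REQUIRED = 81                    # threshold cited in the article

essential_standards_met = [True] * 21   # outcomes for the 21 essential standards
total_points_awarded = 83               # points earned across all 41 standards

meets_qm_standards = (
    total_points_awarded >= POINTS_REQUIRED and all(essential_standards_met)
)
print(meets_qm_standards)  # True in this invented example
```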


Statement of Problem

Research indicates that there are many factors that can affect online course quality. Some of these factors include course design, course delivery, infrastructure, learning management systems, faculty readiness, and student readiness. Course design is one of the critical pieces in the quality control process, as it affects course delivery and the overall success of online programs. Quality Matters (QM) is a process and a tool to continuously improve online course quality (Shattuck, 2012). The 2011-2013 edition of the QM Rubric standards for higher education includes eight general categories with 41 specific standards addressing different aspects of online course design. Each of the standards is supported by rigorous independent online/distance research and designed by a team of experts in the field of online and blended learning. A team of three certified QM peer reviewers reviews online courses according to the QM annotated standards and provides constructive feedback to course developers. Although QM peer reviewers are asked to assume a student's point of view when reviewing online courses, there exists the potential for differing perspectives. Therefore, it is necessary to collect feedback from students about the course design.

This study attempts to achieve three objectives. First, it attempts to validate the instrument design based on QM Standards to measure online course design. Second, it attempts to analyze the data and understand to what degree the selected courses meet QM standards from a student's perspective. Third, it attempts to identify existing gaps between a student's perspective and QM certified reviewers' perspectives about QM essential standards.

Method

Instrument

Based upon the QM standards, an instructional design team developed a questionnaire that included 27 Likert-type items (rated from 1, to little or no extent, to 5, to a very great extent) and three open-ended questions. Feedback was also obtained from a professor in the field of research and measurement. The instrument, referred to as the Online Course Design Evaluation Tool, specifically focuses on the design aspect of online courses.

Data Collection

Student Data

Since fall 2011, the Online Course Design Evaluation Tool has been used at the university to collect feedback from students about design aspects of online courses. The project team identified three online courses for this project. One course was offered in fall 2011, and 35 students completed the survey; two courses were offered in spring 2012, whereby 18 students completed the survey in the first course and 20 students in the second course.

Reviewer Data

Three QM certified reviewers who were trained to review online courses from a student's point of view collected data and provided reports on each of the three courses; however, because this particular review was not an official review, none of the reviewers were subject matter experts in the field of study of these courses.


Data Coding and Analysis

To satisfy the first and second objectives of this study, data were collected from the three online courses and analyzed separately with Winsteps--a Windows-based software program that assists with several Rasch model applications, particularly in the areas of educational testing, attitude surveys, and rating scale analysis (Linacre, 2009).

To address the third objective of this project, the resulting data were treated as follows. Students' results were converted into a measure that is comparable to the reviewers' rating. Student responses of to a great extent ("4") or to a very great extent ("5") were used as at or above the 85% level and coded as "1". Student responses of to a moderate extent ("3"), to some extent ("2"), and to little or no extent ("1") were used as below the 85% level and coded as "0". According to the majority rule principle, if 2/3 of the students selected to a great extent ("4") or to a very great extent ("5") for an item in the survey, then it was determined that the course meets that specific standard from a student's perspective. See Tables 1, 2, and 3.

Three QM certified peer reviewers reviewed the three courses according to QM standards and input their scores into a spreadsheet. If a standard was met, "1" was recorded for the standard. If a standard was not met, "0" was recorded. If two of the three peer reviewers scored a specific standard as met, then it was determined that the course met the standard from a peer reviewer's perspective. See Tables 1, 2, and 3.

The data were treated in a spreadsheet and analyzed with SPSS. A nonparametric Mann-Whitney U test (two independent samples) was used to evaluate median differences between the two groups (students and peer reviewers). Although the peer reviewers were asked to take a student's point of view, the two groups were independent.

Results

Course A

Thirty-five out of the 44 students completed the course design evaluation survey, a response rate of 79.55%. The person reliability was 0.83 and the item reliability was 0.48. The item statistics indicate that Item 1 (MNSQ = 3.31) will need to be revised and Item 16 (MNSQ = 3.13) will need to be revised or dropped if the instrument is used in the future. The Item Map (Fig. 1) indicates that there are three tiers regarding course quality from a student's perspective. Tier I contains the items that students strongly agreed with, indicating the course met Standards 2.1 (Item 4), 3.1 (Item 14), 3.2 (Item 15), 6.1 (Item 20), and 3.3 (Item 7) to a great extent. Tier II contains items that students agreed with, which indicates that the course met those standards to a moderate extent. Tier III contains items that students agreed with only to some extent, indicating that the course did not meet Standards 7.1 (Item 24), 5.1 (Item 13), and 8.1 (Item 25).

Course B

Eighteen out of the 38 students completed the course design evaluation survey, a response rate of 47.37%. The person reliability was 0.95 and the item reliability was 0.63. The item statistics indicate that Item 14 (MNSQ = 2.29) needs to be revised if the instrument is used in the future.

Figure 1. Course A item map

The Item Map (Fig. 2) indicates that there are three tiers regarding course quality from a student's perspective. Tier I contains the items that students strongly agreed with, which indicates the course met Standards 3.3 (Item 7) and 3.2 (Item 15) to a great extent. Tier II contains the items that students agreed with, thereby indicating that the course met these standards to a moderate extent. Tier III contains items that students agreed with only to some extent, indicating that the course did not meet Standards 7.2 (Item 24) and 8.1 (Item 25).

Course C

Twenty out of the 22 students completed the course design evaluation survey, a response rate of 90.91%. The person reliability was 0.96 and the item reliability was 0.78. The item statistics indicate that Item 10 (MNSQ = 2.83) will be dropped, and Item 12 (MNSQ = 2.64) and Item 6 (MNSQ = 2.60) will need to be revised, if the instrument is used in the future. The Item Map (Fig. 3) indicates that there are three tiers regarding course quality from a student's perspective. Tier I contains the item that students strongly agreed with, indicating the course met Standard 2.2 (Item 5) to a great extent. Tier II contains the items that students agreed with, showing that the course met these standards to a moderate extent. Tier III contains the item that students agreed with only to some extent, demonstrating that the course did not meet the standard measured by Item 10, a non-essential standard.

To satisfy the third objective of this project, the data were treated as follows:

• Students' results were converted into a measure comparable to that of the reviewers' rating. Student responses of to a great extent ("4") or to a very great extent ("5") were used as at or above the 85% level and coded as "1". Student responses of to a moderate extent ("3"), to some extent ("2"), and to little or no extent ("1") were used as below the 85% level and coded as "0". According to the majority rule principle, if 2/3 of the students selected to a great extent ("4") or to a very great extent ("5") for an item in the survey, then it was determined that the course met that particular standard from a student's perspective. See Tables 1, 2, and 3.

• Three QM certified peer reviewers reviewed the three courses according to QM standards and recorded their scores in a spreadsheet. If a standard was met, "1" was recorded for the standard. If a standard was not met, "0" was recorded. If two of the three peer reviewers scored a specific standard as met, then the course met that standard from a peer reviewer's perspective. See Tables 1, 2, and 3.

The data were treated in a spreadsheet and analyzed with SPSS. A nonparametric Mann-Whitney U test (two independent samples) was used to evaluate a difference in medians between the two groups (students and peer reviewers). The two groups were different and independent of each other even though peer reviewers are asked to take a student's view when completing course reviews.
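The treatment just described (binary recoding, the two-thirds majority rule, and the Mann-Whitney U comparison) can be sketched in code as follows. The data values are invented, and the pandas/SciPy workflow is an assumption for illustration; the authors' analysis used a spreadsheet and SPSS.

```python
# Hypothetical sketch of the coding and comparison scheme described above;
# the values below are invented and do not come from the study's data.
import pandas as pd
from scipy.stats import mannwhitneyu

# Example student responses (1-5 scale) for one standard's survey item.
student_ratings = pd.Series([5, 4, 3, 4, 5, 2, 4, 4, 5, 3])

# Recode: 4 ("to a great extent") or 5 ("to a very great extent") -> 1, else 0.
student_binary = (student_ratings >= 4).astype(int)

# Majority rule: the standard is met from the student perspective
# if at least 2/3 of respondents chose 4 or 5.
students_say_met = student_binary.mean() >= 2 / 3

# Peer-reviewer codes for the same standard (1 = met, 0 = not met);
# the standard counts as met if at least 2 of the 3 reviewers scored it met.
reviewer_binary = pd.Series([1, 0, 1])
reviewers_say_met = reviewer_binary.sum() >= 2

# Nonparametric comparison of the two independent groups of binary codes.
u_stat, p_value = mannwhitneyu(student_binary, reviewer_binary,
                               alternative="two-sided")
print(students_say_met, reviewers_say_met, u_stat, p_value)
```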


Figure 2. Course B item map


Figure 3. Course C item map


Course A

Students reported that the course met all of the essential standards except Standards 7.1, 7.2, and 8.1, as measured by the instrument developed by the research team. The review results from the three certified peer reviewers (none of whom served as an SME on this review) also indicate that the course met most of the essential standards, with the exceptions of Standards 2.1, 2.2, 2.4, 3.3, and 8.1.

Table 1 Student and Peer Reviewer Results on the Essential Standards

Essential Standards   Student Results   Peer Reviewer Results   Items
1.1                   YES               YES                     2
1.2                   YES               YES                     1
2.1                   YES               NO                      4
2.2                   YES               NO                      5
2.4                   YES               NO                      6
3.1                   YES               YES                     14
3.2                   YES               YES                     15
3.3                   YES/YES           NO                      7, 16
4.1                   YES               YES                     8
5.1                   YES               YES                     13
5.2                   YES               YES                     12
6.1                   YES               YES                     20
6.3                   YES               YES                     21
7.1                   NO                YES                     22
7.2                   NO                YES                     24
8.1                   NO                NO                      25

No statistical differences were detected regarding the standards, with the exception of Standards 2.1, 2.4, and 3.3. The two groups differed significantly regarding Standard 2.1 (U = 1.000, Z = -5.192, p < .001), Standard 2.4 (U = 7.500, Z = -3.393, p = .001), and Standard 3.3 (Item 7; U = 22.000, Z = -2.819, p = .005). See Fig. 4 below.

Figure 4. Mann-Whitney U test statistics


Course B

Students reported that the course met all of the essential standards except Standards 7.1, 7.2, and 8.1, as measured by the Online Course Evaluation Tool; however, the review results from the three certified peer reviewers (none of whom served as an SME on this review) indicated that five essential standards (1.1, 1.2, 2.2, 2.4, and 3.2) were not met. The students' results indicated that the course did not meet Standards 7.1, 7.2, and 8.1; interestingly, however, the peer reviewers disagreed with that assessment.

Table 2 Student and Peer Reviewer Results on the Essential Standards

Essential Standards   Student Results   Peer Reviewer Results   Items
1.1                   YES               NO                      2
1.2                   YES               NO                      1
2.1                   YES               YES                     4
2.2                   YES               NO                      5
2.4                   YES               NO                      6
3.1                   YES               YES                     14
3.2                   YES               NO                      15
3.3                   YES/YES           YES                     7, 16
4.1                   YES               YES                     8
5.1                   YES               YES                     13
5.2                   YES               YES                     12
6.1                   YES               YES                     20
6.3                   YES               YES                     21
7.1                   NO                YES                     22
7.2                   NO                YES                     24
8.1                   NO                YES                     25

No statistical differences were detected regarding the standards, with the exception of Standards 2.2 and 3.2. The two groups differed significantly regarding Standard 2.2 (U = 4.500, Z = -2.887, p = .004) and Standard 3.2 (U = 3.000, Z = -3.266, p = .001). See Fig. 5 below.


Figure 5. Mann-Whitney test statistics

Figure 6. Mann-Whitney test statistics


Course C

Students reported that the course met only a few of the essential standards: 1.2, 2.1, 2.2, 2.4, 3.2, 4.1, and 5.2. However, the review results from the three certified peer reviewers (none of whom served as an SME on this review) indicated that only three of the essential standards were not met: Standards 2.2, 7.2, and 8.1. The peer reviewers' results for Standards 7.2 and 8.1 agree with the students' results that the course does not meet these standards.

Table 3 Student and Peer Reviewer Results for the Essential Standards

Essential Standards   Student Results   Peer Reviewer Results   Items (see the instrument for questions)
1.1                   NO                YES                     2
1.2                   YES               YES                     1
2.1                   YES               YES                     4
2.2                   YES               NO                      5
2.4                   YES               YES                     6
3.1                   NO                YES                     14
3.2                   YES               YES                     15
3.3                   NO/YES            YES                     7, 16
4.1                   YES               YES                     8
5.1                   NO                YES                     13
5.2                   YES               YES                     12
6.1                   NO                YES                     20
6.3                   NO                YES                     21
7.1                   NO                YES                     22
7.2                   NO                NO                      24
8.1                   NO                NO                      25

No statistical differences were detected regarding the standards except Standard 2.2. The two groups differed significantly regarding Standard 2.2 (U = 11.500, Z = -2.892, p = .004). See Fig. 7 below.

Figure 7. Mann-Whitney U test statistics


Discussion

We investigated the differences between students and peer reviewers regarding the essential standards in three online courses. When the courses were approved for design, the faculty course developers were provided a copy of the Quality Matters Rubric at the beginning of the course development process. Instructional designers, who were certified QM peer reviewers, were available for individual consultations during the design and development process. The faculty course developers were very familiar with the standards and agreed that it was essential to incorporate the standards into online course design processes.

As reported in the results section, in course A the results reported by students and peer reviewers differed significantly in regard to Standard 2.1 (The course learning objectives describe outcomes that are measurable) and Standard 2.4 (Instructions to students on how to meet the learning objectives are adequate and stated clearly). For Standard 2.1, the students were asked to report whether course objectives were clearly presented in the course syllabus. For Standard 2.4, both reviewers and students were asked to report whether clear instructions on how students should meet the learning objectives are articulated in the course. Students reported that the instructions were available. However, the reviewers did not agree, as one stated:

Standard 2.4 calls for clear instructions on how students should meet the learning objectives. The course design does a good job in providing students with a brief introduction to each Chapter topic; however, it is somewhat difficult to understand which learning activities, resources, assignments, and assessments support the learning objectives for each unit week. It is important to help students connect the dots between chapter level objectives and the assigned activities and assessment for the week.

Apparently, peer reviewers are looking for above-average evidence at approximately the 85% level. Students might think the brief introduction to each chapter provides instructions on how to achieve the learning objectives. The overall satisfaction with the course might also affect students' ratings on the standards, as the majority rated the course as excellent. A third factor that might contribute to the difference is student satisfaction with the teacher. The responses to the open-ended questions indicated that the professor was excellent and cares about student learning, as one student stated:

The professor always leads a very informative, fun, and creative class and this one was not an exception. I learned a plethora of new things from the reading, assignments, and independent studies throughout the semester.

In course B the results reported by students and peer reviewers differed significantly regarding Standard 2.2 (The module/unit learning objectives describe outcomes that are measurable and consistent with the course-level objectives) and Standard 3.2 (The course grading policy is stated clearly). For Standard 2.2, the students were asked to report whether module/unit objectives were clearly stated in each unit. For Standard 3.2, both reviewers and students were asked to report whether the grading policy was clearly articulated in the course. Students reported that the grading policy was available; however, the majority of the reviewers thought that the policy was not clear enough. One reviewer stated:

Standard 3.2 asks for a clear, written description on how students' grades will be calculated, for instance, the total points for each assignment, the percentages or weights for each component of the course grade. It would be helpful to provide an overall list of assignments, points, percentages or weights in the syllabus so that students are acknowledged upfront on how they will be evaluated without digging deeper in the Unit content pages.


As mentioned previously, the overall satisfaction with the course and the instructor might also affect students' ratings on the standards, as students stated:

Overall, this course has given me a lot of valuable information that I can use in the classroom.

I appreciate all the help given to me throughout the years. This was not an easy thing to accomplish, but I have and I will always remember all those that have helped me succeed.

In course C the results reported by students and peer reviewers differed significantly in regard to Standard 2.2 (The module/unit learning objectives describe outcomes that are measurable and consistent with the course-level objectives). The students were asked to report whether module/unit objectives were clearly stated in each unit, while the reviewers looked for solid evidencing of measurable learning objectives. One reviewer stated:

Standard 2.2 requires that the module/unit learning objectives describe outcomes that are measurable and consistent with the course level objectives. Many of the module level learning objectives are overlapping. It is suggested that you develop unique learning objectives for each module based on Bloom's taxonomy.

The peer reviewers had expected the course to meet this standard at or above the 85% level and used this opportunity to suggest modifications to the course toward meeting the standards.

Conclusion

Most of the items in the Online Course Evaluation Tool were designed according to the Quality Matters standards and integrate very well toward measuring the design aspect of online courses. However, the misfit items will be dropped (Item 10) or revised (Items 1, 6, 14, and 16) according to the analysis results. The student results fell into three tiers of meeting the standards in the three courses: Tier I (to a great extent), Tier II (to a moderate extent), and Tier III (to little or some extent). The results on most of the standards evaluated in this study, provided by both reviewers and students, were the same, indicating that both peer reviewers and students take the same point of view in terms of evidencing standards; however, they differed significantly regarding three of the essential standards. One factor possibly contributing to this discrepancy could be that reviewers looked for solid evidencing of measurable learning outcomes while students looked for clearly articulated objectives. A second factor might be that instructors clarified unclear design aspects via email while the course was delivered, and these clarifications were not available to the reviewers. A third factor might be that the reviewers looked for above-average evidence at approximately the 85% level, while students looked for the basic elements regarding the standards. The reviewers also perceived that the overall satisfaction with the course and the instructor might affect students' ratings regarding the essential standards. Further study, however, is needed to investigate the causes of the discrepancy.
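As a purely illustrative footnote to the misfit-item decisions mentioned above, the sketch below flags items by their reported mean-square (MNSQ) fit values. The 2.0 cutoff and the pandas layout are assumptions made for this sketch, not part of the authors' Winsteps analysis.

```python
# Hedged sketch: flag misfitting survey items by mean-square (MNSQ) fit.
# The MNSQ values are those reported in the Results; the cutoff is assumed.
import pandas as pd

item_fit = pd.DataFrame({
    "course": ["A", "A", "B", "C", "C", "C"],
    "item":   [1, 16, 14, 10, 12, 6],
    "mnsq":   [3.31, 3.13, 2.29, 2.83, 2.64, 2.60],
})

MISFIT_CUTOFF = 2.0  # common rule-of-thumb threshold, assumed here
flagged = item_fit[item_fit["mnsq"] > MISFIT_CUTOFF]
print(flagged.sort_values("mnsq", ascending=False))
```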


References

Chickering, A., & Gamson, Z. (Eds.). (1987). Seven principles for good practice in undergraduate education. AAHE Bulletin, 38(7), 3–7.

Higher Learning Commission. (2007). Statement of commitment by the regional accrediting commissions for the evaluation of electronically offered degree and certificate programs. Retrieved from http://www.ncahlc.org/index.php?&Itemid=236

Holmberg, B. (1995). Theory and practice of distance education (2nd ed.). London and New York: Routledge.

Knowles, E. E., & Kalata, K. (2010, June). The impact of QM standards on learning experiences in online courses [2009 QM Research Grant]. Presentation at the 2nd Annual Quality Matters Conference, Oak Brook, IL.

Linacre, J. M. (2009). Winsteps (Version 3.68.0) [Computer software]. Chicago: Winsteps.com.

Quality Matters. (2011). Quality Matters rubric standards 2011–2013 edition. Retrieved from https://www.qmprogram.org/rubric

Ralston-Berg, P., & Nath, L. (2008). What makes a quality online course? The student perspective. 24th Annual Conference on Distance Teaching & Learning. Retrieved from http://www.uwex.edu/disted/conference/Resource_library/proceedings/08_12876.pdf

Ralston-Berg, P. (2011). What makes a quality online course? The student perspective. Presentation at the 3rd Annual Quality Matters Conference, Baltimore, MD.

Rosner, B., & Grove, D. (1999). Use of the Mann-Whitney U-test for clustered data. Statistics in Medicine, 18, 1387–1400.

Shattuck, K. (2012). What we're learning from Quality Matters-focused research: Research, practice, continuous improvement. Retrieved from https://www.qmprogram.org/files/Learning%20from%20QM%20Focused%20Research%20Paper_0.pdf

Shattuck, K. (2007). Quality Matters: Collaborative program planning at a state level. Online Journal of Distance Learning Administration. Retrieved from http://www.westga.edu/~distance/ojdla/fall103/shattuck103.htm

The Higher Education Program and Policy Council. (2000). Distance education: Guidelines for good practice. Retrieved from http://www.aft.org/pubs-reports/higher_ed/distance.pdf

Acknowledgement

This study was sponsored by a Quality Matters research grant for 2013.


Online Course Design Evaluation Tool

The course evaluation focuses on the design of this online course, NOT the performance of your instructor. Please use the scale from 1 (To little or no extent) to 5 (To a very great extent) to make your evaluation. If an item is not applicable, leave the response blank. Thanks.

1. The purpose and structure of the course were introduced to the students.
a. To little or no extent  b. To some extent  c. To a moderate extent  d. To a great extent  e. To a very great extent

2. The introductions of the course made clear how to get started and where to find course components.
a. To little or no extent  b. To some extent  c. To a moderate extent  d. To a great extent  e. To a very great extent

3. The communication policy and preferred form of communication were clearly stated.
a. To little or no extent  b. To some extent  c. To a moderate extent  d. To a great extent  e. To a very great extent

4. Course objectives were clearly presented in the course syllabus.
a. To little or no extent  b. To some extent  c. To a moderate extent  d. To a great extent  e. To a very great extent

5. Learning objectives were clearly stated for each unit or module.
a. To little or no extent  b. To some extent  c. To a moderate extent  d. To a great extent  e. To a very great extent

6. Instructions were clearly given as to how students would meet learning objectives.
a. To little or no extent  b. To some extent  c. To a moderate extent  d. To a great extent  e. To a very great extent

7. Instructions were clearly given as to how students would be assessed, such as a detailed grading rubric for assignments.
a. To little or no extent  b. To some extent  c. To a moderate extent  d. To a great extent  e. To a very great extent

8. The course materials were helpful for me to achieve the learning objectives.
a. To little or no extent  b. To some extent  c. To a moderate extent  d. To a great extent  e. To a very great extent

9. The visual presentations of this course were helpful for me to achieve the learning objectives.
a. To little or no extent  b. To some extent  c. To a moderate extent  d. To a great extent  e. To a very great extent


10. The audio or video clips in this course were helpful for me to achieve the learning objectives.
a. To little or no extent  b. To some extent  c. To a moderate extent  d. To a great extent  e. To a very great extent

11. The resources provided in this course were relevant and useful.
a. To little or no extent  b. To some extent  c. To a moderate extent  d. To a great extent  e. To a very great extent

12. Opportunities for engagement, including group discussions, collaboration projects, online meet-ups, virtual office hours, or other use of collaboration tools were used in this course.
a. To little or no extent  b. To some extent  c. To a moderate extent  d. To a great extent  e. To a very great extent

13. The interactive activities in the course were helpful for me to achieve the learning objectives.
a. To little or no extent  b. To some extent  c. To a moderate extent  d. To a great extent  e. To a very great extent

14. The course quizzes/exams/assignments were consistent with the course objectives.
a. To little or no extent  b. To some extent  c. To a moderate extent  d. To a great extent  e. To a very great extent

15. The course grading policy (e.g. how the grades were computed) was clearly stated.
a. To little or no extent  b. To some extent  c. To a moderate extent  d. To a great extent  e. To a very great extent

16. A description of criteria used to evaluate students' work and participation in the course was clearly stated.
a. To little or no extent  b. To some extent  c. To a moderate extent  d. To a great extent  e. To a very great extent

17. The course provided multiple opportunities for students to measure their own learning progress.
a. To little or no extent  b. To some extent  c. To a moderate extent  d. To a great extent  e. To a very great extent

18. The tools selected for this course were easy to use.
a. To little or no extent  b. To some extent  c. To a moderate extent  d. To a great extent  e. To a very great extent

19. Instructions were provided regarding how technology tools were used to achieve learning objectives.
a. To little or no extent  b. To some extent  c. To a moderate extent  d. To a great extent  e. To a very great extent


20. The tools selected for this course support student learning.
a. To little or no extent  b. To some extent  c. To a moderate extent  d. To a great extent  e. To a very great extent

21. Navigation throughout the online components of the course was intuitive and consistent.
a. To little or no extent  b. To some extent  c. To a moderate extent  d. To a great extent  e. To a very great extent

22. The course instructions articulated or linked to tech support.
a. To little or no extent  b. To some extent  c. To a moderate extent  d. To a great extent  e. To a very great extent

23. The course instructions articulated or linked to other academic support services and resources.
a. To little or no extent  b. To some extent  c. To a moderate extent  d. To a great extent  e. To a very great extent

24. The course provided information and guidance on how to access disabilities support services.
a. To little or no extent  b. To some extent  c. To a moderate extent  d. To a great extent  e. To a very great extent

25. The course was accessible to assistive technologies such as screen readers (PDF, graphics available to screen readers, videos with captions, etc.).
a. To little or no extent  b. To some extent  c. To a moderate extent  d. To a great extent  e. To a very great extent

26. The course was well-organized.
a. Poor  b. Fair  c. Good  d. Excellent

27. Overall, how would you rate this course?
a. Poor  b. Fair  c. Good  d. Excellent

28. What parts of this course were most useful to you?

29. What parts of this course need improvement?

30. Please provide additional comments.

Internet Learning Volume 3 Issue 1 - Spring 2014

Many Shades of MOOCs

Deborah AdairA, Susan W. AlmanB, Danielle BudzickC, Linda M. GrishamD, Mary E. ManciniE, A. Sasha ThackaberryF

Massive Open Online Courses (MOOCs) represent an innovation in teaching and learning around which there is keen interest and much experimentation. MOOCs are being developed using different pedagogical approaches for different purposes and for different audiences. Starting with a theoretical framework to identify significant differences in basic approaches to MOOCs, this paper presents a set of four case studies of MOOCs developed and delivered in 2013 by four different institutions, community colleges as well as universities, on four different platforms with different approaches, purposes, and intended audiences. An examination of the association between the purpose and audience of these MOOCs, their design considerations, and their outcomes raises important questions for future research.

Keywords: MOOC, case study, connectivist, higher education, online teaching and learning, models, RN–BSN, developmental math, professional development, computer programming, library and information science, community college, Quality Matters

Introduction

In a startlingly short time frame, Massive Open Online Courses (MOOCs) have captured the interest and imagination of the higher education community and its many stakeholders. This interest is reflected in the extent of experimentation with an educational delivery model that has yet to develop a track record for effectiveness or efficiency in producing learning outcomes. Originating with a more focused constructivist pedagogy, the MOOCs developed over the last few years have moved from a connectivist learning experience toward a more traditional behaviorist approach. Today, there is experimentation on different MOOC models that reflect the diverse creativity of their faculty and developers. In fact, much of the experimentation with these new MOOCs is focused on what kinds of outcomes, for whom, and with what pedagogical frame these massive and open courses are best suited. MOOCs come in many shades; however, what counts is the achievement of purpose and the quality of the experience for the learner.

Regardless of approach, quality in instructional design is a critical component for a course meant to engage large (massive) numbers of learners who have not been through the typical institutional filters that produce a student body more homogenous in their preparation for learning.

A The Quality Matters Program
B San Jose State University
C Cuyahoga Community College
D Massachusetts Bay Community College
E The University of Texas at Arlington College of Nursing
F Cuyahoga Community College

In courses offered at large scale and that are open to an audience diverse in experiences, skills, abilities and disabilities, orientation to learning, and even language, it becomes especially critical to have a course designed to provide the communication and guidance to the learner that the course instructor can't otherwise offer at scale. Clarity and specificity in objectives, the communication of learner expectations, and guidance about how to get help or support become critical in a learning structure where the responsibility for completion and achieving learning outcomes rests almost solely on the learner.

To ensure that the components of the course are clearly aligned with its purpose and objectives, many institutions rely on the Quality Matters Rubric™ to guide development and to evaluate the quality of instructional design. Quality Matters (QM) has a version of its rubric developed for use with courses like MOOCs. The QM Continuing and Professional Education Rubric (CPE Rubric) is intended for the design and evaluation of online and blended courses – facilitated, mentored, or self-managed – that may have pass/fail, skills-based, or other completion or certification criteria but that do not carry academic credit. Courses to which it applies may be either instructor led or self-paced; either way, they must be structured and have completion criteria.

The QM CPE Rubric differs from the QM Higher Education Rubric in a number of ways that make it more appropriate for courses that do not bear academic credit. With the CPE Rubric, courses can meet standards without active instructor facilitation and without direct student-to-student contact. There are reduced expectations of institutional support but greater expectations for enriched student-to-content interaction and requirements for clear descriptions of resources available to the continuing education student.

To date, QM has reviewed little more than a dozen MOOCs and, of these, only a few have met the CPE Rubric standards. Although the educational content of these MOOCs was very strong, it was clear that much less attention is being paid to the instructional design considerations that may be most important for such open enrollment courses offered at a scale outside of degree and credit-bearing programs. Such design considerations as effectively orienting the learner to the purpose and structure of the course and communicating resources and expectations are critical for learners who are not otherwise connected to the academic institution and have no other recourse to gain such information. The instructional design of MOOCs must be strong enough for students to be self-reliant and must be so well aligned with the purpose, objectives, and audience that students can succeed with the limited faculty interaction that has thus far defined the MOOC experience.

Because of the necessity for such strong alignment, the context of the MOOC is critical for its design. Placing MOOCs within the appropriate theoretical framework is one broad way to understand context. Explicitly identifying MOOCs by purpose and audience might be another. This paper will look at both perspectives, first laying out a theoretical framework to identify significant differences in approaches and then presenting a set of case studies to examine in detail the association between the purpose and audience of particular MOOCs, design considerations, and outcomes.


Theoretical Framework of MOOCs

MOOCs are a recent phenomenon in higher education. By widespread acknowledgment, the first MOOC was offered in 2008. The term itself was coined in Canada when Dave Cormier and Bryan Alexander used it to describe an open course with over 2,000 students that was free and took place at the University of Manitoba.

Since then, MOOCs have exploded in higher education, with first Ivy League institutions embracing and scaling up the trend, and new companies emerging to host MOOCs (Educause, 2012). But what specifically about the MOOC model is disruptive? Daniel writes "While the hype about MOOCs presaging a revolution in higher education has focused on their scale, the real revolution is that universities with scarcity at the heart of their business models are embracing openness" (Daniel, 2012, p. 1). The rush of institutions offering MOOCs will itself transform the landscape of higher education, or at the very least, help to precipitate change.

The very concept of disruptive innovation addresses this directly. "According to Christensen (1997), organizations that don't pay attention to disruptive innovation (1) maintain that their goods and services will always be needed, (2) develop sustaining improvements based on current customers, (3) don't understand the natural laws of disruptive innovation, and (4) fail to spin off an organization in direct competition with itself. These organizations risk becoming obsolete" (Thornton, 2013, p. 47). Institutions of higher education are particularly vulnerable to external influences during a time when funding is uncertain and pressures to perform come from students, citizens, and businesses alike (Lattuca & Stark, 2009). This directly addresses the discussion of "part of a more fundamental shift in universities" … which "is taking place at a time when the nature and purpose of the university as well as higher education are very much in question" (Blackmore & Kandiko, 2012, p. 128). What will college education become as a result of MOOCs and other disruptive innovations? Will they persist at all?

Despite their relatively short history, MOOCs have already splintered into two distinct models for massive learning: cMOOCs and xMOOCs. "Their differences are so stark so distinct in pedagogy that it is confusing to designate them by the same term" (Hill, 2012, as cited in Daniel, 2012, p. 2). cMOOCs embrace a constructivist approach whereas xMOOCs embrace a more traditional, behaviorist approach to massive online learning.

cMOOCs refer to a constructivist or connectivist learning experience typified by the initial MOOCs that followed a more organic philosophy of interacting with resources and with fellow students to connect learning and construct knowledge. Wiley and Green describe them as applying "the 'open' ethos to course outcomes. In other words, students are empowered to learn what they need/want to learn, and the journey of learning is often more important than any predefined learning outcomes" (Wiley & Green, 2012, p. 88). cMOOCs often encompass four main types of activities: aggregation or curation of content, remixing of content, repurposing of content, and feed forward – the term referring to sharing the newly crafted knowledge with a variety of outward facing streams (Kop, Fournier, & Sui, 2011).

Is this type of MOOC effective at positively impacting student learning? While there currently exists no robust body of research on the effectiveness of MOOCs to say one way or another, there is related evidence to suggest that this model of massive education could be effective for student learning, when extrapolated from the perspective of a student's participation in a knowledge community.

Is this type of MOOC effective at positively impacting student learning? While there currently exists no robust body of research on the effectiveness of MOOCs to say one way or another, there is related evidence to suggest that this model of massive education could be effective for student learning, when extrapolated from the perspective of a student's participation in a knowledge community. "Participation in these knowledge communities is both the process and the goal of learning in higher education" (Lattuca & Stark, 2009, loc 3785 of 8572). The authors go on to write that "Learning is thus a vehicle of socialization... and at the same time the result (or goal) of socialization" (Lattuca & Stark, 2009, loc 3785 of 8572). cMOOCs are uniquely set up for social learning. The development of a learning community "benefits both students and faculty, as it can lead toward better retention of students. In turn, course throughput rates increase (Santovec, 2004). There are different views on what route to follow to enable such a community to establish itself" (Nagel & Kotze, 2010, p. 46).

What implications does this model of MOOC have for the respective roles of teacher and learner? Blackmore addresses this challenge from a perspective wider than the debate about MOOCs, writing that "Increasingly, students are seen as the consumers of an educational service. Inadequate and unhelpful though the metaphor might be, it is a powerful one, challenging a more traditional relationship between teacher and student. The development of a network of colleagues with a shared view of the purposes of a change can be a powerful way of enabling a change" (Blackmore, 2012, p. 134). The demands of facilitating such learning requires facilitators "to adopt a multifaceted role so as to guide or influence the learners and communities to get involved and embrace social media practices" (Kop, Fournier, & Sui, 2011, p. 89). MOOCs as a model seem to be uniquely designed to challenge the traditional roles of teacher and student, instead framing the concepts within the larger concept of learner-directed education, both inside and outside of institutions of higher education.

Research into early MOOCs suggests that participation in MOOCs is bifurcated further, into categories of participants and consumers. A small percentage of students who enroll in MOOCs actually fully participate. A separate group of students tend to participate via a "consuming" style, wherein they review resources and the work of fellow students, but are not active participants in the course (Kop, Fournier, & Sui, 2011).

cMOOCs have some identified challenges that aren't necessarily in play in xMOOCs. One way it is described is that the "lack of a coherent and centralized structure and a lack of summary around learning in the MOOCs also presented challenges for some participants, in particular the novice learners" (Kop, Fournier, & Sui, 2011, p. 86). There are also concerns about the level of support provided by the instructors as an ongoing challenge of the model. The degree to which the design of the course allows for peer-to-peer feedback to foster a higher level of cognitive presence can "contribute value beyond the knowledge base of the lecturer, irrespective of the large class size" (Nagel & Kotze, 2010, p. 50).

xMOOCs are also changing the educational landscape. Though far more similar to traditional online courses, xMOOCs attempt to scale learning with extremely large class sizes that are highly structured, but in which only minimal customized feedback is provided. Often more detailed feedback is provided on a peer-evaluation basis. Because of the sheer number of students in a given course, new roles have emerged for teacher and learner, wherein the teacher becomes a facilitator of the learning process.


xMOOCs are more representative of a behavioral approach that indicates a more traditional, codified, and structured educational experience far more similar to traditional online courses, but with instructional mechanisms to allow them to serve thousands of students (Daniel, 2012). EdX, Coursera, and Udacity all offer more traditional xMOOCs. An ever-expanding marketplace of xMOOCs includes courses from a range of top-tier universities. There are currently efforts underway in several states to force universities to accept the successful completion of MOOCs for college credit as a way to accelerate the achievement of baccalaureate degrees.

Case Studies

The following section contains case studies of four MOOCs designed and delivered in 2013. Table 1 provides an overview of the four different institutions that implemented these MOOCs, on four different MOOC platforms, with different approaches, purposes, and intended audiences.

Case Study 1: San Jose State University, School of Library and Information Science

A MOOC Model for Professional Development

Background

The San Jose State University (SJSU) School of Library and Information Science (SLIS) is a recognized leader in online learning with a cutting-edge curriculum, offering students the convenience of a 100% online program, as well as the technology skills today's employers seek. SLIS has provided totally online programs since 2007, and the reputation for excellence is evidenced by the 2013 Sloan-C Quality Scorecard Effective Practice Award, faculty expertise, student support, and the SJSU Center for Information Research and Innovation. The SLIS faculty were early adopters of the concept of MOOCs, and in Fall 2012, support was provided to develop and offer a professional development MOOC for a global audience. Course development progressed, and the first MOOC was offered in Fall 2013.

MOOC Development: Purpose, Audience, and Objectives

Two faculty members (Michael Stephens and Kyle Jones) were responsible for the design and delivery of the course, Hyperlinked Library, that explored how libraries are using emerging technologies to serve their diverse communities. They were supported by a team composed of faculty and MLIS students to work on the administrative, instructional, technical, and support elements of the MOOC and assist with elements of content development, design, and management. Students enrolled in the SLIS master's program (MLIS) earned academic credit for their work while students from other universities volunteered their time. In the first term, they were involved in research, site construction, instructional design, and learning how to interact with members of a virtual community. In the second term, the students led discussion groups and assisted with the delivery of the MOOC.

SJSU/SLIS is committed to offering quality professional development to individuals across the globe, and MOOCs provide a mechanism to engage a large audience. The content of the MOOCs includes cutting-edge topics that provide information professionals with an introduction to the material and enables them to explore the issues and network with others. The intended audience was reached through marketing and PR efforts as noted below.

Institution     | Platform                                                    | Type        | Purpose/Course                   | Audience
Tri-C           | Coursesites/Blackboard (https://tric.coursesites.com/)     | xMOOC       | Dev. Ed. Math                    | Multiple
UT Arlington    | Canvas (https://learn.canvas.net/courses/83)                | xMOOC       | RN-BSN Program "test drive," CE  | Nurses, Providers, Continuing Education
SJSU            | WordPress and BuddyPress (http://mooc.hyperlib.sjsu.edu/)   | cMOOC       | Professional Development         | LIS Professionals
MassBay, BHCC   | edX                                                         | xMOOC Blend | Intro. Computer Programming      | CS and IT Undergraduates

Table 1. Four Approaches to MOOCs

Instructional Design

The design of this 10-week course used a combination of three types of learning theories in order to maximize the experience of the participants. The course developer, Michael Stephens, adapted concepts in connected learning, transformative learning, and connectivist learning to provide an environment for the users to be engaged in a variety of activities. The structure of the MOOC enabled the participants to access a wide range of resources, reflect on the content, create a project based on the experience, and share with others in the community. Details about the instructional design can be found here: https://mooc.hyperlib.sjsu.edu/about/instructional-design/

Technical Design

The course was built with an open source content management system, WordPress, and an open source plugin, BuddyPress, to provide a flexible platform for social interactions that supported the teaching philosophies. The design was proven successful since the instructors had used it to build learning environments for the prior six years. Additional information about the technical design is located here: https://mooc.hyperlib.sjsu.edu/about/technical-design/

Marketing

Promotion of the MOOC involved pages on the SLIS website: http://slisweb.sjsu.edu/programs/moocs. It included a MOOC program landing page and a page specific to the Hyperlinked Library MOOC: http://slisweb.sjsu.edu/programs/moocs/hyperlinked-library-mooc. Also, there was a web page with information on how to register.

Several strategies were used, including news features on the SLIS website, emails to target audiences, and information shared via SLIS social media channels. The instructor also promoted the MOOC on his blog. Additionally, Community Profile stories about student assistants helping with the MOOC were posted online.

Outcomes and Next Steps

Enrollment was limited to 400, and many individuals interested in the MOOC were unable to register for the course. Those who participated in the 10-week course were placed in smaller groups for easier discussion, and they were encouraged to form additional groups based on special interests. Each person had the opportunity to earn individual badges after completing specific assignments and a master badge for the successful completion of the MOOC.

A second MOOC, Exploring Future Technologies, will be offered in Fall 2014 using the same model. Additional details are located on the SLIS website.

Case Study 2: The University of Texas at Arlington College of Nursing

MOOC2Degree Case Study

Background

With an enrollment approaching 33,500, the University of Texas at Arlington (UT Arlington) is the second largest institution in the UT System and the sixth largest in Texas. The University's College of Nursing (UTACON) is one of the largest and most successful in the country, with a 94% graduation rate and a first-time NCLEX (National Council Licensure Examination) pass rate consistently over 90% for new nurse graduates entering the nursing field. The New America Foundation, based in Washington, DC, has honored UT Arlington as a Next Generation University, in part, for its success with online degree programs. More than 10,000 students were enrolled in online classes and degree programs in Fall 2013.

The College of Nursing began its development of high-volume, online programs in 2008 when it offered the university's first Academic Partnership degree-granting option – an RN-to-BSN completion program. Prior to the initiation of this program in 2008, the College of Nursing graduated approximately 100 RN-to-BSN completion students per year. In the 2012-2013 academic year, 1,746 students graduated from that program. This is the power of a dynamic, online program designed to be accessible and affordable.

MOOC Development: Purpose, Audience, and Objectives

After monitoring the expansion of MOOCs into higher education, in the summer of 2013, the University, the College of Nursing, and Academic Partnerships – a Dallas-based organization that assists universities to develop and offer scalable online programs – designed the university's first MOOC. The MOOC movement provided UT Arlington and its partner, Academic Partnerships, an opportunity to expand the online RN-to-BSN program through what is called a MOOC2Degree initiative. It was determined that this MOOC would be specifically designed to:

• Provide potential students with the ability to "Test Drive" UTACON's Academic Partnership RN-to-BSN program. This would ultimately lead to increased student enrollment.
• Provide a seamless process for awarding academic credit for students who complete the MOOC and enroll in the online RN-to-BSN program (this would help streamline the enrollment process for students).
• Provide continuing professional education to nurses in a key area within the healthcare field.
• Expand our brand and reputation as a leader in nursing education.
• Provide a community service by offering access to important information to nonnurse healthcare providers and the general public with the opportunity to receive a certificate of completion.

Instructional Design

This MOOC was designed specifically to achieve its articulated purposes. This started with the selection of the MOOC topic. The topic is important to practicing nurses and other healthcare professionals as well as one where there was an adequate amount of available open-access material to use for learning. The MOOC was entitled "Enhancing Patient Safety through Interprofessional Collaborative Practice" and was designed to be completed in six weeks with the seventh week open for those learners who wished to take the proctored final examination.

As the primary intent of the course was to allow students to "Test Drive" online education, no payment was required until course completion when the learner decided on his or her desired endpoint – credit toward a required course in the RN-to-BSN program or Continuing Education Units (CEU) credit. Consistent with the core belief of open access in MOOCs, the costs to the learners were "pay to play." Entrance into the MOOC was free of charge. Learners who wished to receive academic credit in UTACON's RN-to-BSN program enrolled to take a final summative examination and paid the online proctoring service directly for that service (<$30, with the fee varying dependent upon the individual's preference for when the examination was scheduled). The only additional charge for receiving credit was that associated with the established process for receiving "credit by exam" ($25). Learners who wished to receive CEU for the course paid the CE provider $25 for obtaining a certificate for the 45 hours of continuing education credit.

An Operations Team was responsible for identifying the flow of information to allow individuals to sign up for the MOOC, participate, and reach their desired end point (academic credit, continuing education credit, certificate of completion) in a seamless way. Individuals on this team include representatives from the university's Departments of Distance Education and Admissions along with the College of Nursing and Academic Partnerships.

A Course Development team was responsible for the content and the presentation of the course. After review, a decision was made to use Canvas Open Network as the Learning Management System. Identification of course objectives, learning outcomes, curricula flow, and included content and evaluation methods were the responsibility of the course faculty. Media experts and instructional designers from Academic Partnerships assisted in course construction on the Canvas Open Network. Working collaboratively, open access learning artifacts appropriate to the course objectives were identified so as to avoid any costs associated with books or other supplementary material.

Outcomes and Next Steps

In August 2013, the course went live with a "soft launch" – limited enrollment – to pilot test the course structure and processes. On the start date, there were approximately 300 learners enrolled. In a start-of-course survey, the following information was obtained:

Country of origin

• 70% were from the United States
• 9% were from Western Europe
• 7% were from Africa
• 4% were from South America
• 2% were from Central/East Asia
• English was the primary language for 75% of the participants

Professional discipline (see Figure 1)

• 57% of respondents were nurses.
• 43% of these respondents were interested in course credit
• 14% were interested in CEUs

Figure 1. Breakdown of respondents by profession.

Learner experience

In later surveys, additional data were collected about learners' experience. The course load was in line with learner expectations. Learners expected between 2 and 4 hours of work a week. At mid-course, 66% thought the course load was manageable. At the end of the course, 80% felt the course length was appropriate. Furthermore, 90% of the students gave the course a 4- or 5-star rating, which included comments such as:

• "I would like to have more of everything."
• "Great and very informative."
• "I thoroughly enjoyed this course.... It has provided me with some amazing resources to consult and dig deeper into. I am very motivated to continue to study this issue further and start seeking out opportunities to get involved in organizations focused on improving healthcare through educating others in IP collaboration."
• "The case studies give examples of real life scenarios which make me think critically. The follow up discussion opens my mind to other people's opinions."

The case studies (using TeamSTEPPS videos from the Agency for Healthcare Research and Quality) and discussion, "From My Perspective" videos, and the lecture videos were favorites (see Figure 2). While Twitter Chat was in the course design as an engagement strategy, there was virtually no engagement with these activities. On reflection, this is perhaps not unexpected given a target audience that includes a large number of nurses who had limited experience with online education. The other interactive activities were found to be very engaging.

Figure 2. Favorite part of the course.

Course completion

An interesting challenge appeared when new learners continued to enroll in the course up through the last week. This was challenging, as the group had built engagement activities into the course with the assumption that the active cohort of learners would stay reasonably constant. How best to design the MOOC to deal with new learners who join while the course is in session is something the group will be addressing in future iterations of the course.

Understanding completion rates is one of the major challenges with MOOCs, and UTACON is currently considering what approach to take when reporting completion rates. For example, should one measure success by using the total number of individuals who enroll at any point in the MOOC as the denominator, only the ones who had some level of instructional activity, or only those who expressed an interest in achieving the goals for the course? There is much debate in the literature right now about this issue. Based on our experience, there is a critical need for more robust subgroup analysis so as to understand how to define and quantify success.
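The denominator question raised above can be made concrete with a short sketch. The counts below are illustrative placeholders only, not reported UTACON figures; the point is simply that the same number of completers yields very different "completion rates" depending on which enrollment group serves as the denominator.

    # Illustrative counts only -- not UTACON data.
    completed = 90
    denominators = {
        "everyone who ever enrolled": 450,
        "enrollees with some instructional activity": 300,
        "enrollees who stated a goal (credit/CEU/certificate)": 120,
    }

    for label, n in denominators.items():
        print(f"completion rate, {label}: {completed / n:.1%}")

With these placeholder numbers the reported rate ranges from 20% to 75%, which is why a clear definition of the denominator matters before completion rates are compared across MOOCs.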

Of twenty-nine registered nurses who responded to a participant survey within the MOOC, twenty-eight expressed interest in applying to UTACON's RN-to-BSN program and receiving academic credit. By the time the MOOC closed, 50% of these were already moving forward with the application process.

Lessons learned

There were numerous lessons learned as part of this offering. Developing a MOOC is different than developing a traditional course, as the learners have different motivations; course developers need to be clear about what they want to accomplish for the course and build with the learner's goals in mind. It is also important to focus on engagement strategies and develop a sense of community ("high touch") even though you are constructing the course to be "low touch" from the perspective of faculty/facilitators. For discussions, it is helpful to use case studies revolving around actual patient care situations and use facilitators to help students feel more engaged with the course and the instructor. It is also important to determine if the course will run in a set "term" or run "open access." Obtaining useful metrics is difficult – but critical – and needs to be considered from the start. Developing an evaluation plan should not be an afterthought. It is important to have a clear definition of "success," and a plan to assess for any mid-course corrections or revisions when running the course again is critical.

UTACON's inaugural MOOC2Degree effort provided important information that will inform the approach taken in the future with the initiative. In particular, it provided valuable insight into the ways that MOOCs differ from traditional for-credit courses and the ways in which the group might consider adapting our approach as it relates to course design, student engagement, and measurement in the future. Early indicators give the group reason to be enthusiastic about the potential of this initiative, and this group looks forward to sharing more detailed results once it has implemented the initiative on a broader scale.

Case Study 3: Cuyahoga Community College, Developmental Mathematics MOOC

A Competency-Driven MOOC Using Game-Based Mechanics

Background

Cuyahoga Community College (Tri-C) is a multicampus college in Cleveland, Ohio, serving over 52,000 credit and noncredit students. As an Achieving the Dream Leader College, Tri-C has committed substantial staff and financial resources to develop, implement, and evaluate highly structured, multiyear initiatives designed to improve student success. The College is a member of the League for Innovation in the Community College, a 19-member international organization committed to improving community colleges through innovation, experimentation, and institutional transformation. In Fall 2013, Tri-C was awarded one of the Bill & Melinda Gates Foundation grants to expedite the transition into mainstream college coursework for massive numbers of developmental education students. This was the beginning of turning the vision into a reality.

The Tri-C MOOC ran four separate offerings: March, April, May, and June 2013. These were four faculty-facilitated offerings each spanning four weeks. The Tri-C faculty also utilized the MOOC content in one blended offering during Summer 2013.


MOOC Development: Purpose, Audience, and Objectives

The goal of designing and developing a Developmental Mathematics MOOC was to leverage the college's extensive experience in subject matter and online learning to expedite the transition into mainstream college coursework for massive numbers of students.

In the Fall 2011 semester, Tri-C had 2,285 "new-to-college" students test into the College's first-level developmental mathematics course – MATH 0910 – Basic Arithmetic and Pre-Algebra. In the Spring 2012 semester, another 1,109 students tested into this course. Of these nearly 3,400 students, approximately 1,600 tested at the upper end of the placement score range for the MATH 0910 course. Tri-C's Developmental Mathematics MOOC targeted these students who tested into the upper levels of pre-algebra. The MOOC was intended to bridge the gap for these students, allowing them to skip the college's MATH 0910 course altogether and go directly into the college's Beginning Algebra or Quantway course sequence. The overarching outcomes for the MOOC pilot included:

• Addressing the developmental education challenge and Tri-C's priority to help students get to college-ready status at a faster pace.
• Opportunities for partnership with K-12 by targeting high school students and helping students get to college-ready status before they enroll at Tri-C.
• Supporting returning students who want/need a brief math refresher.
• Contributing to the exploration of innovative and experiential practices in teaching and learning and being a leader among community colleges, as a Board Member Institution in the League for Innovation in the Community College.

The audience for the MOOC included a number of different student populations – both current students and nonstudents. These audiences included students currently enrolled in Tri-C's bridge courses, as well as students who desired additional practice after completing mandatory placement prep. Tri-C's work with local high school partnerships and the community also expanded the target audience to first generation, returning, post-secondary, and tech-prep students. Lastly, in partnership with Blackboard Coursesites, Tri-C was able to enroll students outside the region, state, and nation.

Instructional Design

The Tri-C Math MOOC was developed by a collaborative team of faculty and instructional designers. Several faculty members served as subject matter experts, and members of the instructional design team served as both designers and developers, supporting the faculty by aligning the course, developing the course structure in the Learning Management System, loading the vetted content and materials, and setting up adaptive release for the gaming aspect of the learning experience. Finally, an external graphics developer provided unique graphics for the entire course.

The course was designed and developed during a two-month period (January-February 2013), as the first offering was scheduled for March 2013. This required a high level of interaction between the faculty and the design team, needing regular communication, quick turnaround times, and collaboration. The collaboration was critical to the success of the project, and working together, the full team was able to fix technical issues, adapt the course as needed, and improve the support to students.


Game mechanics

The MOOC was designed using game mechanics with a storyline (similar to the reality television show "Survivor") about the world of math challenges on "Believe Island." The course consisted of four different levels for the competencies related to Tri-C's lowest level of developmental math. In each level, students were able to interact with a variety of open educational resources, including an open educational textbook, instructional videos, and practice activities. Once they felt confident, students were then required to complete both checkpoints and challenges. Each checkpoint helped the students as a "self-test" on their proficiency of a key concept, while the challenges were designed to demonstrate mastery of all the concepts in a particular level. Students had to complete the challenge with a score of 80% or better. If they successfully completed it, students "leveled up" into the next level of the course and earned a virtual badge (integrated with Mozilla Open Badges). If students did not earn an 80%, they had the opportunity to complete the challenge as many times as they needed, based on a random block question pool developed by the faculty subject matter experts. The challenges created a low-risk, low-failure environment to encourage persistence in the learners.
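The level-up rule described above can be restated as a short sketch. This is only an illustration of the logic as narrated (an 80% mastery threshold, retries drawn from a randomized question pool, and a badge for each level completed); the names and structure are hypothetical and do not reflect Tri-C's actual Blackboard Coursesites adaptive-release configuration.

    import random

    PASS_THRESHOLD = 0.80  # challenge score needed to "level up"

    def draw_retry_questions(question_pool, size=10):
        """Assemble a retry attempt from a random block of the faculty-built pool."""
        return random.sample(question_pool, k=size)

    def attempt_challenge(score, current_level, badges):
        """Pass at 80% or better to unlock the next level and earn a badge."""
        if score >= PASS_THRESHOLD:
            badges.append(f"believe-island-level-{current_level}")
            return current_level + 1      # adaptive release opens the next level
        return current_level              # low-stakes retry on the same level

    # Illustrative walk-through: pass level 1 at 85%, fail a level 2 attempt at 70%.
    badges = []
    level = attempt_challenge(0.85, 1, badges)      # level 2 unlocked, badge earned
    level = attempt_challenge(0.70, level, badges)  # stays on level 2
    pool = [f"question-{i}" for i in range(40)]     # hypothetical question pool
    retry = draw_retry_questions(pool)
    print(level, badges, len(retry))

The unlimited, randomized retries are what make the design low-risk: a failed attempt costs the learner nothing except another pass through the material.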

Open educational resources

The Tri-C MOOC did not recreate the wheel. Instead, the course was designed using existing open learning objects for the Pre-Algebra MOOC. This included the open textbook, videos, practice activities, and more. The checkpoint and challenge questions were developed by the faculty.

The selection and vetting process to align the OER with the course objectives was a time-consuming task. The faculty worked collaboratively with the instructional designers to vet and view the resources through Khan Academy, Connexions, TeacherTube, and other sources. Tri-C also openly licensed, through Creative Commons, the images and the entire course for use by any nonprofit institution.

Quality Matters

Tri-C's Developmental Math MOOC was designed with the principles of QM in mind. The course site was the first MOOC in the country to earn QM recognition via the QM CPE Rubric (Quality Matters, 2014). This demonstrates that MOOCs can indeed meet high standards of course design quality.

Course video tour. An overview of the full course design can be found in the navigational video at http://www.youtube.com/embed/kMehDOaVtHo.

Technical Design

The course was designed in Blackboard Coursesites, using open educational resources from Khan Academy and a number of additional repositories. Students could register and enroll directly in Blackboard Coursesites to gain access to the course.

Marketing

Tri-C used a number of different marketing strategies to reach out to the multiple audiences, including: (1) informational flyers (in both print and virtual formats), (2) emails, (3) webpages – Tri-C's website and the eLearning & Innovation blog, (4) face-to-face communication at the testing centers where students complete the placement tests, and (5) collaboration with a number of local high schools.


Furthermore, the Ohio Board of Regents and the Ohio Association of Community Colleges shared the MOOC information via listserv to the statewide memberships.

Outcomes and Next Steps

The process of designing, developing, and implementing Tri-C's MOOC was a definite success with a number of learning opportunities for best practices. The collaborative and iterative design and development process, partnering a team of faculty with instructional designers, worked extremely well to deliver the MOOC in a short time frame.

Figure 3 provides an overview of the total number of students engaged at each level. The total success and completion rate equaled 18.4%, which is nearly the national average. The results indicate that the incorporation of low-tech game mechanics in the course through the use of adaptive release may have been one of the reasons for success. The low-risk, low-failure learning created by the game-based learning strategies proved successful for this MOOC for a developmental education audience and may prove beneficial for all MOOCs.

Figure 3. Total number of students engaged at each level versus number of certificate/credit-seeking students engaged at each level.

Gates grant report results

• The full MOOC Report can be found at https://breeze.tri-c.edu/moocreport/, which includes MOOC completion rates by age group and satisfaction data.
• Keep up on the latest about Tri-C's MOOCs at http://elearningandinnovation.com/pilots-and-initiatives/moocs/.

Case Study 4: Massachusetts Bay Community College and Bunker Hill Community College

xMOOC Content Implementation: Community College MIT edX Partnership

Background

The edX organization at MIT, Massachusetts Institute of Technology, Cambridge, MA ("edX Home Page," 2014) approached Massachusetts Bay Community College (MassBay) in August 2012 and proposed that MassBay offer the MITx MOOC course, 6.00x Introduction to Computer Science and Programming, to MassBay students in a blended (hybrid) format ("edX Intro Python," 2013).

The community college instructor would use (in whole or part) the MITx 6.00x MOOC course content (syllabus, course materials, video lectures, problem sets, exams, etc.) in a pilot course in spring semester 2013. Bunker Hill Community College (BHCC) was invited in September 2012 to participate in the project.

MassBay, located in Wellesley Hills, and BHCC, located in Boston, are both multicampus, urban institutions in the greater Boston area with many students from low-income and underrepresented communities. MassBay and BHCC serve 6,500 and 14,000 full- and part-time students, respectively. Both schools are comprehensive colleges; MassBay and BHCC offer 70+ and 100+ associate and certificate degree programs, respectively. BHCC serves a highly diverse student population with 67% students of color ("BHCC Fast Facts," 2014). MassBay similarly serves a diverse student body where 44% are students of color ("MassBay Fast Facts," 2014).

MassBay's Computer Science Department has a larger computer science associate's degree program in comparison to BHCC's Information Technology Department, which offers large computer support, database, networking, and computer security degree programs, along with a small computer science program. Instructors at both colleges were identified to develop courses to implement the MITx 6.00x course for blended (hybrid) delivery in spring semester 2013.

MOOC Development: Purpose, Audience, and Objectives

This edX-Community College Partnership, funded by the Bill and Melinda Gates Foundation, was established to conduct the first empirical study exploring the efficacy of offering massive open online courses (MOOCs) for college course credit in a more traditional community college setting (Bell, Hunter, L'heureux, and Petersen, 2013).

Important Project Research Questions:

• Can community colleges (and other credit-granting institutions) adopt and use MOOCs to benefit their students?
• To what extent do edX courses (and MOOCs in general) need modification for delivery in a community college classroom?
• How do different types of students respond to the flipped classroom approach?
• How do the community college students' experiences (and performance) compare to those of students who completed the same course as a MOOC in Fall 2012?
• What support does the faculty need to use the edX courseware? How are institutions able to support them?
• Is this a scalable approach for community college courses in computer science?

This project focused on two audiences: (a) U.S. community colleges and (b) the highly diverse (i.e., by income, gender, age, race, ethnicity, language, prior academic preparation, especially in mathematics) undergraduate student populations commonly served by community colleges.

The edX MOOC course, 6.00x Introduction to Computer Science and Programming Using Python, is similar in content and structure to a course taken by noncomputer science majors at MIT. 6.00x was "designed to help people with no prior exposure to computer science or programming learning to think computationally and write programs to tackle useful problems" ("edX Intro Python," 2013). The MIT edX 6.00x MOOC ran for the first time in fall 2012 with roughly 20,000 participants active in the MOOC (over 80,000 had enrolled initially).

During the fall 2012 semester, a team of faculty at MassBay and members of BHCC's Computer Information Technology Department worked with edX administrators and technical staff to design distinctly different courses in order to address differences in students' math proficiency at the two colleges.

The majority of the MassBay students were computer science majors. However, these blended courses both used the MITx 6.00x MOOC course unchanged (including the problem sets and exams). The MassBay course, CS 270 Practical Python Programming, followed the same schedule as the MITx 6.00x MOOC course. The BHCC course, CIT 523 Python Programming, would progress more slowly through the MITx 6.00x MOOC materials – completing seven of the original 14 weeks (Bell, Hunter, L'heureux, and Petersen, 2013; "MCO-Keynote," 2013).

The MITx 6.00x MOOC course was analyzed with regard to its organization, pedagogical style, course outcomes, video lectures, activities, support materials, etc. The instructors at both community colleges recognized that the in-class sessions needed to give students a holistic and clear understanding of the academic challenges to be addressed in the MITx 6.00x MOOC assignments. The community college instructors supplied the missing "alignments" or "scaffolding" between MITx 6.00x MOOC course outcomes and the individual MITx 6.00x MOOC assignments.

Discussions during the course design phase on how best to support students, given the differences in the math comfort levels and prior programming experiences between MassBay and BHCC students, led to different pedagogical approaches. Only 29% of BHCC students had taken at least one college programming course, compared to 83% of MassBay students (Bell, Hunter, L'heureux, and Petersen, 2013).

The MassBay instructor adapted course materials used to teach a previous programming course and created online "notebooks." These short tutorials, which MassBay students accessed online, contained supplemental materials and interactive preparatory exercises so that students could independently complete their MITx 6.00x MOOC assignments and tests. At the weekly classroom sessions, the MassBay instructor primarily worked, as needed, with students singularly or in small groups; lectures were rare. The MassBay course, CS 270 Practical Python Programming, followed the same timetable and schedule as the MITx 6.00x MOOC course.

The BHCC instructors elected to teach more traditionally with lectures and small group work with many hands-on activities. Students met twice weekly with their instructors. The BHCC course, CIT 523 Python Programming, used the same content but at a slower pace, such that seven weeks of the MITx 6.00x MOOC materials were covered by the end of the Spring 2013 semester rather than the full 14 weeks. BHCC students could still access the remainder of the course materials and finish the MITx 6.00x MOOC course on their own so they might qualify for the edX completion certificate.

Instructional Design

MassBay and BHCC courses were designed to support the "flipped classroom" pedagogy. Students accessed MITx 6.00x MOOC materials online: watched the online videos; performed the online exercises; submitted the online homework; and took the online tests (the edX platform supported instant scoring, feedback, and multiple submission attempts), just like any MITx 6.00x MOOC student. Students at each community college had required classroom meetings each week. At MassBay, students met for one 90-minute session; BHCC students met twice weekly for 60-minute sessions. The community college students participated in classroom activities, completed additional homework assignments, and took in-class exams.


Technical Design

The community college courses were mounted on the stand-alone edX LMS platform developed by edX during Fall 2012 and piloted for this project. The entire MITx 6.00x course was copied into what has become the "Open edX" platform ("Open edX code," 2014). MassBay and BHCC instructors could independently access their respective course shells to insert announcements, set up discussion forums, etc. The edX staff provided extensive technical support throughout the design phase and during the Spring 2013 semester.

Implementation

The pilot courses (CS 270 and CIT 523) ran once, starting in January and ending in May, 2013. Students registered for these college credit-bearing (and transferable) courses at their respective colleges, as usual. Upon completion of the course, students received a final (letter) grade along with the opportunity to qualify for the certificate of completion issued by edX. A student thus could be successful in the course by completing the stated course requirements in the syllabus for CS 270 or CIT 523 and not qualify for the edX certificate.

Marketing

MassBay and BHCC recruited students internally through informational flyers, posters, emails, and a specially produced edX video posted on the websites ("edX-BHCC," 2013; "edX-MassBay," 2013). However, the most effective approach was to visit classrooms in fall 2012 and explain the project with its potential benefits to the students.

Outcomes and Next Steps

Dr. Damien Bell, the edX evaluator from Boston College, conducted interviews and completed pre- and postsurveys of students' and instructors' perspectives at both colleges. He conducted student focus groups, gathered data on student participation for in-class and online course activities, and made classroom observations (Bell, Hunter, L'heureux, and Petersen, 2013). Preliminary analysis of project results demonstrates that students at both community colleges were able to handle the MITx 6.00x MOOC course materials with structured, in-class support from their instructors. The MassBay and BHCC students' overall academic performance was better than that of the participants in the Fall 2012 MITx 6.00x MOOC, where the great majority of those that earned the MITx completion certificates had at least a bachelor's degree or higher. The Fall 2012 MITx 6.00x MOOC started with around 20,000 active students. Of the roughly 11,000 who took the MITx 6.00x MOOC midterm exam, 59% passed, compared to 90% of the community college students that tested (N = 29). The retention rate was better for the community college students. Of the original 40 community college students (21 at MassBay; 10 at BHCC), 73% took the MITx 6.00x midterm exam and 26 students (65%) completed their courses (and also earned MITx completion certificates). For the Fall 2012 MITx 6.00x MOOC, about 5,000 participants (~25% of the original 20,000) successfully finished the course and earned the MITx completion certificate (Bell, Hunter, L'heureux, and Petersen, 2013; "MCO-Keynote," 2013). The final report with the full analysis of this project is expected in Spring 2014.


Discussion and Conclusion

The four case studies highlight the kind of experimentation on MOOCs occurring in higher education today. As the purpose and audience for MOOCs vary, so do their design and development. Each of the MOOCs described here was a learning experience for its institution and its individuals – developers, instructors, and students alike. MOOCs will continue to evolve as we continue to experiment, examine the outcomes, and continually improve our efforts as a result. As is the case in the most effective experimentation, the questions being raised by these MOOCs and others are often the most important part of the innovation.

These MOOCs were designed for a variety of different audiences; however, can every kind of learner take advantage of MOOCs? What adaptations need to be made – in pedagogy, design, or content – to accommodate those learners who would otherwise be disadvantaged by a MOOC approach?

Do low completion rates of MOOCs matter? What other success measures, in addition to or instead of completion, are important? Will the integration of game mechanics or related techniques improve engagement and completion? Does a blended learning structure improve performance and completion rates?

Should MOOCs offer college credit and/or should learners receive credit after-the-fact for MOOCs? What criteria need to be met for MOOCs to be credit-worthy? Can a single MOOC support multiple purposes or outcomes; in particular, can it effectively provide multiple completion pathways to include credit toward degree? Can it be an important piece of such a pathway?

Is grading at scale possible? With the appropriate software, can machine grading be effective in all courses? How can automated grading software be used to promote student engagement?

With MOOCs, one of the biggest attractions is also the biggest challenge. MOOCs provide a learning platform that can bring together hundreds to hundreds of thousands of learners in a course. Sharing the platform, however, is much different and far simpler than engaging in shared learning. The creation of real learning communities is made more challenging by scale, not easier, in the behaviorist approach of the xMOOC. Yet, such communities and the learning they afford may be essential to the awarding of academic credit in all but direct assessment models. This challenge is one reason the next generation of MOOC experimentation involves blended learning models where the learning community is nurtured outside of the MOOC and the MOOC becomes the high-quality material with which the community engages. In these models, MOOCs are transitioning from online course to online content.

It is still very early in the development of the MOOC model to fully understand the potential of MOOCs and the lessons we can learn from them about teaching with technology. Whether their future is as scalable online courses, content supplements, or something altogether different, the energy and momentum they have created for experimentation around teaching and learning is a singular achievement. Regardless of the shade of any particular MOOC, their lasting impact will be to energize the field to understand, improve, and enhance the quality of the educational experience for the learner.


References

Bell, D., Hunter, L., L'heureux, J., & Petersen, R. (2013). MOOCs in the community college: Implications for innovation in the classroom. Retrieved from http://sloanconsortium.org/conference/2013/blended/moocs-community-college-implications-innovation-classroom

Blackmore, P., & Kandiko, C. (2012). Strategic curriculum change: Global trends in universities. Milton Park, Abingdon: Routledge.

Bunker Hill Community College – Fast Facts. (2014). Retrieved from http://www.bhcc.mass.edu/about/institutionaleffectiveness/fastfacts/

Daniel, J. (2012). Making sense of MOOCs: Musings in a maze of myth, paradox and possibility. Retrieved from Academic Partnerships website: http://www.academicpartnerships.com/research/white-paper-making-sense-of-moocs

Educause (2012). What campus leaders need to know about MOOCs. Retrieved from http://net.educause.edu/ir/library/pdf/PUB4005.pdf

edX Bunker Hill Community College student recruitment. (2013). Retrieved from http://www.bhcc.mass.edu/edx/

edX Home Page. (2014). Retrieved January 14, 2014, from https://www.edx.org/

edX Introduction to Computer Science and Programming Python. (2013). Retrieved January 14, 2014, from https://www.edx.org/course/mitx/mitx-6-00-1x-introduction-computer-1122

edX Massachusetts Bay Community College student recruitment. (2013). Retrieved from http://www.massbay.edu/Academics/New-Course--Practical-Python-Programming.aspx

Kop, R., Fournier, H., & Mak, J. (2011). A pedagogy of abundance or a pedagogy to support human beings? Participant support on massive open online courses. International Review of Research in Open and Distance Learning, 12(7), 74–93.

Lattuca, L. R., & Stark, J. S. (2009). Shaping the college curriculum: Academic plans in context (2nd ed.). San Francisco, CA: Jossey-Bass.

Massachusetts Colleges Online Conference – Keynote address by O'Donnell, P., Bell, D., & L'heureux, J. (2013). MOOCs on campus: A closer examination. Retrieved from http://www.mco.mass.edu/conference/

Nagel, L., & Kotze, T. G. (2010). Supersizing e-learning: What a CoI survey reveals about teaching presence in a large online class. Internet and Higher Education, 13(1-2), 45–51.

Open edX source code. (2014). Retrieved from http://code.edx.org/

Quality Matters (2014). Quality Matters continuing and professional education rubric. Retrieved from https://www.qualitymatters.org/continuing-and-professional-education-rubric-program

Thornton, J. S. (2013). Community colleges: Ready to disrupt again. In R. Glaspar & G. E. De los Santos (Eds.), Disruptive innovation and the community college (pp. 41–49). Retrieved from http://league.org/publication/whitepapers/files/Disruptive_Innovation_and_the_Community_College.pdf

Wiley, D., & Green, C. (2012). Why openness in education? In Game changers: Education and information technologies (6). Retrieved from http://www.educause.edu/research-publications/books/game-changers-education-and-information-technologies

Effect of Student Readiness on Student Success in Online Courses

Leah A. Geiger (A), Daphne Morris (B), Susan L. Suboez (C), Kay Shattuck (D), Arthur Viterito (E)

This research determined the effect of student readiness on student success in online courses that were of quality course design; were taught by experienced, engaging instructors; and were delivered within a well-supported and familiar Learning Management System (LMS). The research team hypothesized that student success in well-designed courses (those that meet the Quality Matters standards) and that are taught by experienced, engaging faculty is most influenced by student readiness factors, including individual attributes (such as motivation), life factors, comprehension, general knowledge, reading rate and recall, and typing speed and accuracy. A goal of the study was to determine which of these factors correlated most closely to student success. Results of this study indicated that, when course design, instruction, and LMS are held constant, only typing speed/accuracy and reading rate/recall were statistically significant as measured by the SmarterMeasure instrument and correlated to student course retention and course grade. Recommendations for further research are made.

Keywords: student readiness, course design, course retention, student success, online instructors

Student success in online learning continues to rightfully receive a lot of research and practice attention. That success has been measured by student satisfaction (Allen, Omori, Burrell, Mabry, & Timmerman, 2013; Aman, 2009), by levels of engagement (Hall, 2010), by grades (Swan, Matthews, & Bogle, 2010; Swan, Matthews, Bogle, Boles, & Day, 2011; Runyon, 2006), and by course persistence (Sherrill, 2012; Harkness, Soodjinda, Hamilton, & Bolig, 2011; Pittenger & Doering, 2010; Diaz & Cartnal, 2006; Dietz-Uhler, Fisher, & Han, 2007) and program retention (Boston, Ice, & Gibson, 2011). It is a complicated process to identify the correlations among the factors that influence student success.

(A) College of Southern Maryland
(B) College of Southern Maryland
(C) College of Southern Maryland
(D) Director of Research, Quality Matters Program
(E) College of Southern Maryland

The impact of student characteristics (Moore & Kearsley, 2012; Jung, 2012; Poellhuber, Roy, & Anderson, 2011; Hall, 2011; Battalio, 2009; Wojciechowski & Palmer, 2005), of online learning experience (Adkins, 2013; Sherrill, 2012; Boston, Ice, & Burgess, 2012), of the design of a course (Naidu, 2013), of institutionally provided academic, counseling (Curry, 2013), and technical supports (Jung, 2012), of the skills and engagement of the instructor (Stavredes & Herder, 2013; Jung & Latchem, 2012; Swan, 2012; Rutland & Diomede, 2011), and of familiarity with technology have been identified as impacting the success of online student learning.

This research project stemmed from a recommendation coming out of a study conducted by Jurgen Hilke (2010) for Maryland Online. He inquired as to why students withdraw from online classes. Four clusters of related factors were identified as obstacles for students completing an online course: (1) student chooses too many courses; (2) student gets behind in the course; (3) student experiences personal and family-related problems; and (4) student cannot cope with the combined workload of employment and academic studies (p. 7). As a result of these findings, he recommended further investigation into "an analysis of variables for inclusion into a potential regression model to predict online course retention" (p. 9). Student orientation to distance learning was one of the areas of possible intervention recommended. This research was inspired by Hilke's call for a further analysis of intervening variables.

The College of Southern Maryland (CSM) is a regionally accredited community college located in the mid-Atlantic, serving three counties of mixed social and economic demographics. CSM was one of the Maryland community colleges that, in 2002, field-tested what would become the Quality Matters Rubric and process for the continuous improvement of online learning. CSM is committed to strengthening quality assurance for students, faculty, and community with specific focus on continuing to design and to have all CSM online courses meet Quality Matters (QM) standards.

The SmarterMeasure student readiness assessment was identified as a vetted instrument to provide analysis of a number of variables identified in the Hilke study that might impact success in online courses. The SmarterMeasure Learning Readiness Indicator is a web-based tool to assess a learner's likelihood for succeeding in an online and/or technology-rich learning program (para. 1). Previously, Argosy University had conducted research using SmarterMeasure and found "statistically significant relationships of technical competency, motivations, availability of time, and retention" (SmarterMeasures, Research Results, para. 4). The CSM study originated as a replication and extension of the Argosy study by collecting data from courses that meet QM standards (defined as a well-designed online course) and that were taught by experienced online instructors with established positive track records of student engagement and who were highly trained in the LMS and instructional design. The rationale for including these inputs to quality online learning was to hold constant three known influencing factors in student success: course design, instructor engagement, and familiarity and support of technology (LMS). The intent of this research was to determine the effect of student readiness on student success in online courses. The aim of this research project was to offer direction for student remediation in order to better prepare students for online learning and enhance student success, while noting the possible impact of course design and engaging instructor recommendations to enhance the quality of online learning.

Methodology

The study was conducted over two semesters. The disciplines of the courses varied (Fundamentals of Physics, Introduction to Sociology, and Applied Business Communications), but were first-year college courses.

All three instructors had over seven years of online teaching experience each and had established student engagement skills. Additionally, all three had successfully completed at least two QM professional development trainings, were certified QM Master Reviewers, and had been active participants in the Middle States accreditation process.

Participants

The data collected in this study occurred in 11 classes with a total of 200 students over the period of two semesters – Fall 2012 and Spring 2013. Students were required to take the SmarterMeasure Learning Readiness Indicator before beginning the course work. The Indicator is a vetted web-based tool which assesses a learner's likelihood for succeeding in an online and/or technology-rich learning program by measuring the degree to which an individual student possesses attributes, skills, and knowledge that contribute to success. At the end of the semester, a correlational analysis was run to measure the relationships between SmarterMeasure scores and measures of course retention and grade distribution as measures of academic success. The study was conducted for two semesters to ensure a valid data pool.

SmarterMeasure data for six indicators were aggregated based on a percentage scale of 0% to 100%. The six indicators include On-screen Reading Rate and Recall; Typing Speed and Accuracy; Life Factors; Technical Knowledge; Reading Comprehension; and Individual Attributes (including motivation, procrastination, and willingness to ask for help). The final grades earned for the selected CSM courses were aggregated and rated by academic success. The findings were analyzed through Chi-square tests for statistical significance. At the end of the semesters, we conducted a statistical analysis to measure the relationships between SmarterMeasure scores and CSM measures of course retention and grade distribution as measures of academic success.

Statistical Analysis

A Chi-squared analysis was conducted to search for statistical significance to the scores of the SmarterMeasure assessment compared to the final course grades the students earned in the selected course sections. The six SmarterMeasure indicator scores were aggregated and compared to the final grade the individual student earned in the course. SmarterMeasure scores rely on student answers, some being subjective (life factors and individual attributes) as well as objective measures.

The scores from the SmarterMeasure assessment are delivered as ranges, labeled blue for rates between 85% and 100%, green for rates between 70% and 84%, and red for rates between 0% and 69%. As we analyzed the data, we realized that (a) there were a number of small cells, and (b) there were "zero" cells. Therefore, as per acceptable social statistical analysis, the only practical alternative was to combine categories in such a manner as to eliminate these small and zero cells. The red cells were highly problematic in most of the cases; therefore, we combined the green and red labels (frequencies) to eliminate any biasing that the low red frequencies may have introduced into the analysis. Therefore, we used two SmarterMeasure Indicator Rates – (a) students earning a rate from 85% to 100% (the blue labels), and (b) students earning a rate from 0% to 84% (the green and red labels, combined).
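As a minimal sketch of the kind of 2 x 2 Chi-square test described above, the snippet below recodes scores into the two combined bands and runs a test of independence on the Reading Rate counts reported in Table 1 later in this article. The use of Python with scipy is an assumption made for illustration (the authors do not state which statistical package they used), and details such as the continuity correction that scipy applies by default to 2 x 2 tables affect the statistic, so the sketch illustrates the procedure rather than reproducing the published values.

    from scipy.stats import chi2_contingency

    def band(score):
        """Collapse the blue (85-100), green (70-84), and red (0-69) ranges
        into the two combined bands used in the analysis."""
        return "85-100" if score >= 85 else "0-84"

    print(band(87), band(72))  # -> 85-100 0-84

    # 2 x 2 contingency table built from the Reading Rate counts in Table 1:
    # rows = final grade earned (Successful, Not Successful)
    # columns = indicator band (85%-100%, 0%-84%)
    reading_rate = [[62, 12],
                    [10, 6]]

    chi2, p_value, dof, expected = chi2_contingency(reading_rate)
    print(f"chi2 = {chi2:.3f}, p = {p_value:.3f}, dof = {dof}")

The same layout applies to each of the six indicators; significance is then judged against the alpha levels (.05, and in two cases .1) reported in the Results section.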

The final grades for the class were measured as "successful" at the rate of 70% or higher, equating to a letter grade of C, B, or A. CSM policy supports this valuation, as 70% is the cut-off score for credit being earned for the course as well as for its ability to be transferred to another school. In addition, the majority of student learning outcomes assessments at CSM use the benchmark of 70% or higher.

Results

At the 95% confidence level, two of the SmarterMeasure indicators (typing speed/accuracy and reading rate/recall) were statistically significant, thereby exerting significant influence on student success in the course. At the same confidence level, the remaining SmarterMeasure indicators did not exert significant influence on student success. See Table 1 for the aggregate data.

SmarterMeasure Indicator, Reading Rate

The results for reading rate and recall indicate with a high degree of confidence that this indicator exerts an influence on student success. Specifically, 72 students rated in the upper band (85% to 100%) on this indicator, and 62 of them were successful in the course. The results are statistically significant, α = .05. See Figure 1.

SmarterMeasure Indicator, Typing Speed

The results for typing speed and accuracy indicate with a high degree of confidence that this indicator exerts an influence on student success. Specifically, 88 students rated in the upper band on this indicator, and 81 of them were successful in the course. The results are statistically significant, α = .05. See Figure 2.

SmarterMeasure Indicator, Life Factors

The results for life factors do not show that this indicator exerts an influence on student success. The results are not statistically significant at α = .05. (The result reaches significance only if alpha is set at the .1 level.) See Figure 3.

SmarterMeasure Indicator, Technical Knowledge

The results for technical knowledge do not show that this indicator exerts an influence on student success. The results are not statistically significant at α = .05. (The result reaches significance only if alpha is set at the .1 level.) See Figure 4.

SmarterMeasure Indicator, Reading Comprehension

The results for reading comprehension do not show that this indicator exerts a significant influence on student success, as the results are not statistically significant at α = .05. See Figure 5.

SmarterMeasure Indicator, Individual Attributes

The results for individual attributes do not show that this indicator exerts an influence on student success. The results are not statistically significant at α = .05. See Figure 6.


Table 1. SmarterMeasure Indicators Compared to Final Grades Earned

Indicator Name           Final Grade Earned    Rate at 85%–100%    Rate at 0%–84%    Total (N)
Reading Rate*            Successful                   62                 12               74
                         Not Successful               10                  6               16
                         Total                        72                 18               90
Typing Speed*            Successful                   81                 59              140
                         Not Successful                7                 14               21
                         Total                        88                 73              161
Life Factors             Successful                   49                122              171
                         Not Successful                4                 25               29
                         Total                        53                147              200
Technical Knowledge      Successful                   95                 74              169
                         Not Successful               21                  7               28
                         Total                       116                 81              197
Reading Comprehension    Successful                  144                 26              170
                         Not Successful               24                  4               28
                         Total                       168                 30              198
Individual Attributes    Successful                   26                145              171
                         Not Successful                5                 24               29
                         Total                        31                169              200

Note: "Indicator Name" refers to the SmarterMeasure Indicators. "Successful" in the Final Grade Earned column is based on the institutional benchmark of earning 70% or higher (A, B, or C letter grade). "Not Successful" is based on the institutional benchmark of earning 69% or lower (D or F letter grade).

*Reading Rate and Typing Speed are statistically significant, α = .05, while the other SmarterMeasure Indicators are not statistically significant, α = .05.
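To make the category combination and chi-square comparison described in the Statistical Analysis section concrete, the sketch below runs the test on the Reading Rate counts from Table 1 (the 85%–100% band versus the combined 0%–84% band, by successful versus not successful). The authors do not name their statistical software, so this SciPy-based example is illustrative only and is not claimed to reproduce their exact figures.

# Minimal sketch (assumed tooling: Python with SciPy) of a chi-square test on
# one 2 x 2 table built from Table 1; the counts are the published Reading Rate cells.
from scipy.stats import chi2_contingency

reading_rate = [
    [62, 12],   # successful:     85%-100% band, 0%-84% band
    [10, 6],    # not successful: 85%-100% band, 0%-84% band
]

# correction=False gives the plain Pearson statistic (no Yates continuity correction).
chi2, p_value, dof, expected = chi2_contingency(reading_rate, correction=False)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.4f}")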


Figure 1. Reading rate success comparison

Figure 2. Typing speed score comparison

Figure 3. Life factors score comparison


Figure 4. Technical knowledge score comparison

Figure 5. Reading comprehension score comparison

Figure 6. Individual attributes score comparison

Discussion

The study was based on the hypothesis that student success in well-designed courses (those that meet QM standards), delivered via a familiar LMS and taught by experienced and engaging online instructors, is most influenced by student readiness factors, including individual attributes (such as motivation), life factors, learning styles, technical competency, technical knowledge, reading rate and recall, and typing speed and accuracy. Based on the results of the study, most of our hypothesis was not supported.

Only two of the SmarterMeasure indicators (Reading Rate and Typing Speed) were statistically significant, thereby exerting an influence of student readiness on student success in these particular online courses. There are two limitations to consider. First, the sample size was small. Second, there are alternative interpretations for these results. For example, a higher typing speed with accuracy may be indicative of a student's expertise with computer technology. A higher typing speed with accuracy may also be indicative of a student's attention to detail, and it may be the attention to detail that exerted an influence on student success. An additional possibility is that higher typing speeds were developed from experience in previous online courses, and success in previous online courses has been identified as a predictor of success (Boston, Ice, & Burgess, 2012). The possible impact of previous student success in online courses was not explored during this study and would be an additional source of correlation with readiness.

In this controlled study, two indicators (Life Factors and Technical Knowledge) were not statistically significant unless the alpha level is relaxed from α = .05 to α = .1. The last two indicators (Reading Comprehension and Individual Attributes) were not statistically significant. The small sample size may have affected the results. An important caveat from this study is that these findings come from students in courses that meet quality standards for course design and were taught by experienced, engaging online instructors. It could be important to further explore the impact of quality course design and engaging faculty on student readiness factors, especially those identified by SmarterMeasure.

Our findings differ from the Argosy University study, "SmarterServices" (2011). The Argosy study found that the following SmarterMeasure indicators have a statistically significant impact on student success: technical competency, motivation, availability of time, and retention (SmarterServices). Two factors may have contributed to the different findings. First, our small sample size may have affected our results compared to the Argosy study. Second, our study controlled for the course design, teaching, and LMS variables compared to the Argosy study; therefore, our results may be more focused.

The current study allowed a closer analysis of student readiness by controlling three variables: (a) the course design was considered high quality, as only courses that had previously met QM standards were used; (b) the LMS utilized was industry-standard and was familiar to students and instructor; and (c) the faculty participating in the study had strong, positive track records of student engagement and were highly trained in the LMS and instructional design. We caution against generalizing these findings to conclude that only typing speed/accuracy and reading rate/recall are important to the successful completion of an online course.

Suggestions for Future Research

The sample size could be broadened to increase validity and reliability, thereby leading to institutional policy changes, such as a mandatory student orientation course or standardized modules for all online courses that incorporate resources for typing and/or reading rate practice.

The study could be easily replicated for extended statistical analysis using our methodology or utilizing other designs, such as a matched pair design. Another approach to increase the sample size would be to expand the study to multiple institutions/instructors with characteristics similar to the institution in the first study. We would alert future researchers to control the inputs of quality course design and experienced, engaging online instructors.

This study was quantitative. Qualitative information could be gathered and analyzed (1) to discover other indicators of student success and (2) to test alternative analyses. For example, students who complete the SmarterMeasure instrument, perhaps as part of an online learning orientation (Koehnke, 2013), may be more likely to complete class work leading to student success compared to students who elect not to complete the required SmarterMeasure instrument. Focus groups of student participants in a replicated study would add additional depth to any findings, as would using educational analytics to determine whether any correlations exist between students' previous online course success and readiness factors.

Another avenue of study would be to explore the actions of the experienced, engaging online instructors teaching the courses. It could be enlightening to learn whether the highly skilled online instructors in this study mitigated the impact of the four other readiness factors measured that were not found statistically significant (life factors, individual attributes, technical knowledge, and reading comprehension). The findings could reveal a snapshot of pedagogical habits that promote student success in the online classroom.

The data for Life Factors and Individual Attributes indicate that a large number of students ranked at the 0% to 84% level. In this study of 200 students, 147 ranked within 0% to 84% for Life Factors, while 53 ranked at the upper level, and 169 ranked within 0% to 84% for Individual Attributes, while 31 ranked at the upper level. A future study could compare these online student rankings with students taking comparable courses using other delivery methods (e.g., face-to-face, web-hybrid). The results should also be compared to success factors in different disciplines using a matched pair experiment. For example, how does an English course, where reading comprehension is critical, compare to courses in other disciplines?

In addition, future studies could compare results from QM-certified courses to courses that have not been designed using QM standards. Likewise, a study could compare the results of less experienced online instructors with those of highly skilled, experienced online instructors.

References

Adkins (2013, April 25). Concerning online learning: Experience matters [Web log post]. Retrieved from http://wcetblog.wordpress.com/2013/04/25/experience_matters/

Allen, M., Omori, K., Burrell, N., Mabry, E., & Timmerman, E. (2013). Satisfaction with distance education. In M. G. Moore (Ed.), Handbook of distance education (3rd ed., pp. 143–154). New York, NY: Routledge.

Aman, P. R. (2009). Improving student satisfaction and retention with online instruction through systematic faculty peer review of courses (Unpublished doctoral dissertation). Oregon State University, Corvallis, OR. Retrieved from http://ir.library.oregonstate.edu/xmlui/bitstream/handle/1957/11945/Aman_Dissertation.pdf


Battalio, J. (2009). Success in distance education: Do learning styles and multiple formats matter? American Journal of Distance Education, 23(2), 46–60. doi:10.1080/08923640902854405

Boston, W., Ice, P., & Burgess, M. (2012). Assessing student retention in online learning environments: A longitudinal study. Online Journal of Distance Learning Administration, 15(2). Retrieved from http://www.westga.edu/~distance/ojdla/summer152/boston_ice_burgess152.pdf

Boston, W. E., Ice, P., & Gibson, A. M. (2011). Comprehensive assessment of student retention in online learning environments. Online Journal of Distance Learning Administration, 4(1). Retrieved from http://www.westga.edu/~distance/ojdla/spring141/boston_ice_gibson141.pdf

Curry, R. F. (2013). Academic advising in degree programs. In M. G. Moore (Ed.), Handbook of distance education (3rd ed., pp. 201–215). New York, NY: Routledge.

Diaz, D., & Cartnal, R. (2006). Term length as an indicator of attrition in online learning. Journal of Online Education, 2(5). Retrieved from http://innovateonline.info%2Fpdf%2Fvol2_issue5%2FTerm_Length_as_an_Indicator_Of_Attrition_in_Online_Learning.pdf

Dietz-Uhler, B., Fisher, A., & Han, A. (2007). Designing online courses to promote student retention. Journal of Educational Technology Systems, 36(1), 105–112. doi:10.2190/ET.36.1.g

Hall, A. (2010, June). Quality Matters Rubric as "teaching presence": Application of CoI framework to analysis of the QM Rubric's effects on student learning [2009 QM Research Grant]. Presentation at the 2nd Annual Quality Matters Conference, Oak Brook, IL.

Hall, M. (2011). A predictive validity study of the Revised McVay Readiness for Online Learning questionnaire. Online Journal of Distance Learning Administration, 14(3). Retrieved from http://www.westga.edu/~distance/ojdla/fall143/hall143.html

Harkness, S., Soodjinda, D., Hamilton, M., & Bolig, R. (2011, November). Assessment of a pilot online writing program using the QM Rubric. Paper presented at the 3rd Annual Quality Matters Conference, Baltimore, MD.

Hilke, J. (2010). Maryland Online, executive summary. In A "w" study: Why do students take a course online? Why do they withdraw? [PDF document]. Retrieved from http://www.marylandonline.org/sites/default/files/W-Study%20Summary%20Report.pdf

Jung, I. (2012). Learners' perceptions and opinions of quality assurance. In I. Jung & C. Latchem (Eds.), Quality assurance and accreditation in distance education and e-learning: Models, policies and research (pp. 244–254). New York, NY: Routledge.

Jung, I., & Latchem, C. (2012). Competencies and quality assurance in distance and e-learning. In I. Jung & C. Latchem (Eds.), Quality assurance and accreditation in distance education and e-learning: Models, policies and research (pp. 231–243). New York, NY: Routledge.

Koehnke, P. J. (2013). The impact of an online orientation to improve community college student retention in online courses: An action research study (Doctoral dissertation, Capella University). Retrieved from http://www.cpcc.edu/pd/resources-1/doctoral-research-group/dissertations/paul-koehnke-full-dissertation

Moore, M. G., & Kearsley, G. (2012). Distance education: A systems view of online learning (3rd ed.). Belmont, CA: Wadsworth.

Naidu, S. (2013). Instructional design models for optimal learning. In M. G. Moore (Ed.), Handbook of distance education (3rd ed., pp. 268–281). New York, NY: Routledge.

Pittenger, A., & Doering, A. (2010). Influence of motivational design on completion rates in online self-study pharmacy-content courses. Distance Education, 31(3), 275–293.

Poellhuber, B., Roy, N., & Anderson, T. (2011). Distance students' readiness for social media and collaboration. The International Review of Research in Open and Distance Learning, 12(6), 102–125. Retrieved from http://www.irrodl.org/index.php/irrodl/article/view/1018/1992

Runyon, J. M. (2006). Quality in design: Impact on student achievement [2005 QM Research Grant]. Unpublished final report. LaPlata, MD: College of Southern Maryland.

Rutland, S. R., & Diomede, S. (2011, November). The impact of QM course revision on withdrawal rates [2010 QM Research Grant]. Presentation at the 3rd Annual Quality Matters Conference, Baltimore, MD.

Sherrill, J. (2012, April). 22 variables that don't affect retention of online or dev ed courses anywhere (and a few that do). Retrieved from http://wcet.wiche.edu/wcet/docs/par/22VariablesNotAffectingRetentionofOnlineStudents_SHerrill_April18-2012.pdf

SmarterServices. (2011, December 1). SmarterMeasure research findings: Results of institutional level research projects [PDF document]. Retrieved from http://www.smartermeasure.com/smartermeasure/assets/File/SmarterMeasure%20Research%20Findings.pdf

Stavredes, T. M., & Herder, T. M. (2013). Student persistence—and teaching strategies to support it. In M. G. Moore (Ed.), Handbook of distance education (3rd ed., pp. 155–169). New York, NY: Routledge.

Swan, K. (2012). Teaching and learning in post-industrial distance education. In M. F. Cleveland-Innes & D. R. Garrison (Eds.), An introduction to distance education: Understanding teaching and learning in a new era (pp. 108–134). New York, NY: Routledge.

Swan, K., Matthews, D., & Bogle, L. R. (2010, June). The relationship between course design and learning processes [2009 QM Research Grant]. Presentation at the 2nd Annual Quality Matters Conference, Oak Brook, IL.

Swan, K., Matthews, D., Bogle, L. R., Boles, E., & Day, S. (2011). Linking online course design to learning processes using the Quality Matters and Community of Inquiry frameworks. Unpublished report on continuing study. Springfield, IL: Authors.

Wojciechowski, A., & Palmer, L. B. (2005). Individual student characteristics: Can any be predictors of success in online classes? Online Journal of Distance Learning Administration, 8(2). Retrieved from http://www.westga.edu/~distance/ojdla/summer82/wojciechowski82.htm

Developing a Community of Practice (CoP) through Interdisciplinary Research on Flipped Classrooms

Bobbie Seyedmonir, Kevin Barry, Mehdi Seyedmonir (A)

This article describes an interdisciplinary research project that resulted from the creation of a community of practice (CoP) for faculty teaching blended and online courses at a small, historically black university. Using a flipped-classroom approach, two modules of a Principles of Biology course were redesigned. Already-created PowerPoints were converted to screencasts and homework was completed in small groups during class. Results showed that students in the flipped classroom performed better on application-type questions but showed no difference on overall test scores or on knowledge-type questions. A survey of student perceptions found that students liked the autonomy to watch content videos anytime, anywhere, and that they enjoyed the more active classroom experience. Students also noted that technical issues sometimes hindered their ability to learn; they missed the opportunity to ask questions in real time; and they did not appreciate the amount of out-of-class work this approach required. Overall, the results indicate that the flipped-classroom model has the potential to increase student learning but that it requires a more thoughtful redesign process than is suggested in popular literature on the subject.

Keywords: flipped classroom, Community of Practice (CoP), instructional design, blended courses, teaching, teaching biology, higher education

Introduction development and teaching of an online course, thereby allowing the faculty to fo- ince the time of correspondence stud- cus on content (Ko & Rossen, 2010). ies, the ideal approach to the design of At many smaller institutions, how- distance education courses was team- ever, there are fewer resources and design Sbased in nature (Diehl, 2013). As distance sta!, leaving much of the work of course education has evolved to include the use design and development up to the indi- of online learning environments, the basic vidual faculty member. "is can lead fac- premise of course design has not changed. ulty to feeling overwhelmed and underpre- Instead of expecting faculty to become pared for the task of online course design, experts in the technical aspects of online especially since many who are asked to course design and content creation (e.g., teach online have no training in basic in- developing web pages and designing inter- structional practices (Baran, Correia, & active simulations), the team approach to "ompson, 2011). Distance education ad- online course design provides faculty ac- ministrators at such institutions have the cess to instructional designers, program- unenviable task of #nding innovative ways mers, web developers, etc. to assist in the to provide faculty support and develop- ment opportunities in order to build skill

A West Virginia State University

85 Developing a Community of Practice levels in designing, creating, and teaching ticism as to whether a literal translation of in blended and online environments. &ipped classroom (i.e., taking already-ex- To deal with these challenges, our isting presentation materials and recording University’s Center for Online Learning ad- them and using slightly modi#ed homework opted a community of practice (CoP) mod- assignments as in-class activities) would be el (Wenger, 1998). According to Wenger e!ective without implementing additional (1998), within a CoP framework, more ex- course design modi#cations such as inqui- perienced faculty (or those with more ex- ry- or problem-based approaches. pertise in a subject area) can mentor and "e result of these discussions was assist in the development of faculty who are the creation of an interdisciplinary re- newer to the discipline, thus alleviating the search team which included a biologist, an strain of understa!ed instructional design educational psychologist, and an instruc- departments. "e purpose of West Virgin- tional designer/technologist to study the ia State University’s (WVSU’s) CoP was e%cacy of a literal translation of the &ipped threefold: (1) to gather a cohort of interest- classroom design on student learning in a ed faculty from a variety of disciplines to general education biology course. discuss and learn about di!erent approach- es to blended and online course design, (2) Literature Review to develop skills and knowledge that could then be shared with the group, and (3) to lended or hybrid learning experienc- work together on projects of interest. es have been a common part of high- Participants in this CoP go through er education for the past decade. A a semester-long training program focus- Bthree-year study of over 1,000 U.S. colleges ing on online teaching and course design and universities found that roughly 46% of called the Online Teaching Institute (OTI). four-year undergraduate institutions of- Upon graduating from OTI, faculty contin- fered blended courses (Allen, Seaman, & ue meeting monthly to discuss and receive Barrett, 2007). However, the popularity of feedback on issues they are experiencing in the &ipped classroom, as brought to nation- their blended and online courses. al attention by Bergman and Sams (2012), During these meetings, several fac- has seen a marked growth over the past ulty members from science #elds shared year. While there are some slight variations their struggle to #nd a mode of instruc- of the model (e.g., Musallam, 2013), most tion that would utilize blended and on- of the available literature suggests that the line approaches to teach science but also basic &ipped instructional model consists preserve the integrity of their classrooms. of recorded lecture materials which are "e &ipped model of instruction seemed watched by students at home and applica- to be especially attractive to science facul- tion-type questions and problems (i.e., the ty because (1) the transition to this mode traditional homework) which are worked of teaching would be relatively easy as they on in class (Mangan, 2013; Bergman & could utilize already-existing lecture mate- Sams, 2012; Topo, 2011) (see Movie 1). rials and (2) they would not have to give up On a surface level, this model ap- any class time as they familiarized them- pears to be relatively simple to adopt and selves with the format. institute. 
An instructor needs a computer, However, discussions in the larger, screen capture so$ware (such as Camtasia, interdisciplinary CoP indicated some skep- Screencast-o-matic, etc.), a headset, and a 86 Internet Learning

Movie 1. The flipped classroom (Flipped Learning, 2010).

87 Developing a Community of Practice place to post videos (such as YouTube), and Methods he/she has all the equipment and so$ware needed to get started. In short, the &ipped Participants classroom approach has a low cost of adop- tion, making it relatively easy to imple- "e classes chosen for the &ipped ment. classroom experiment were two sections In addition to the low cost of adop- of an undergraduate Principles of Biology tion, proponents of this model have pro- course taught by one of the researchers (n = vided testimonials and anecdotal evidence 99). Of those enrolled, n = 88 completed the suggesting a high level of success in the experiment (34% male, n = 30 and 66% fe- &ipped classroom. Students are more en- male, n = 58). "e comparison group for this gaged, better able to address questions re- study consisted of two sections of the Princi- quiring application of content knowledge, ples of Biology course from the previous fall and are more satis#ed with the classroom semester (n = 89). "e comparison group experience in general (e.g., Mangan, 2013; was 53% male (n = 47) and 47% female (n Springen, 2013; Satullo, 2013; Flipped = 42). As this course is a general education, Learning Network, 2012). Such factors non-majors introductory biology course, combine to make the &ipped classroom the students were not science majors. very attractive to teachers, faculty, school districts, and universities that are under Design pressure to initiate changes that increase student learning in their classrooms. Two sections of the Principles of However, the empirical research Biology course taught in fall 2013 utilized in this area is still relatively sparse. Of the the &ipped instructional design. Results research that is currently available on the from learning assessment scores were then subject of &ipped classrooms, #ndings have compared to data from the previous fall se- been mixed: some researchers are report- mester. In order to ensure the comparison ing signi#cant learning gains in students groups and the &ipped groups were similar (Tune, Sturek, & Basile, 2013; Mason, in terms of prior knowledge coming into the Shuman, & Cook, 2013) and some are re- course, ACT Science scores were compared porting no di!erence between the &ipped and found to be not signi#cantly di!erent classroom and the traditional model (Win- (F(1,141) = 0.82, p = 0.3667), suggesting that ston, 2013). Because of the relative new- student groups between the two years had ness of this speci#c pedagogical approach similar backgrounds in science, making and the subsequent need for more empiri- them comparable for the purposes of this cal research focused speci#cally on the ef- research. Furthermore, the same book, con- #cacy of &ipped classroom techniques, the tent modules, visuals, assessments, and in- purpose of this research project was to (1) structor were utilized in both the treatment determine whether there is a di!erence in and comparison groups to ensure a similar- student performance in a &ipped classroom ity of experience. versus a traditional classroom setting and (2) gauge student perceptions of the &ipped Procedure classroom model and its e%cacy on their learning. "e #rst two modules of the course, Chemistry of Biology and Biological Mole-

cules, were redesigned to support flipped instruction. The videos were recorded in advance using the programs Camtasia and Adobe Presenter. While the same PowerPoint presentations were used in all sections (both flipped and comparison), the screencasts in the flipped group were cut into segments that ran between 10 and 12 minutes to prevent loss of attention due to the length of the screencasts (Middendorf & Kalish, 1996). For this same reason, participants were required to watch no more than two videos before each class meeting.

The participants were given an orientation to the flipped-classroom model on the first day of class and were shown how and where to access the videos on the University's learning management system (LMS). The PowerPoint files themselves (without narration) were also posted online to allow the slides to be printed for note-taking purposes. To further encourage participants to watch the videos, they were required to take an online quiz covering the assigned videos' material prior to each class.

During class meetings, when lectures would typically be held, the participants were instructed to break into small groups and were given application-type problems to work through while the professor provided guidance as needed. At the end of each class, the groups were called upon to answer each of the problems they had collectively worked through, and the professor clarified answers or provided the correct answers to reinforce or correct learning. At the end of the two modules, participants submitted a study guide assignment which encompassed all the material from the videos, and the professor provided final clarification of material covered in both modules.

The two modules culminated in a final assessment of participant learning which consisted of 40 multiple choice questions and four short answer questions. The multiple choice questions were divided into those that tested knowledge/comprehension (n = 28) and application/synthesis (n = 12), as defined by the exam question bank associated with the textbook.

In addition to the final assessment, participants were asked to complete a brief survey via surveymonkey.com that measured participant perceptions of the flipped-classroom model. This survey included questions dealing with ease of use, their personal preferences, and their beliefs about their own learning using a flipped-classroom model. The survey included both closed and open-ended questions to allow a full range of participant responses. Participants were given extra credit for completing the survey.

Results

Results of Participant Performance Test Scores

Statistical analyses of quantitative test data were conducted in SAS statistical software using a mixed-model ANOVA (PROC MIXED). Though there was no significant difference in test scores overall (F(1,172) = 0.06, p = 0.8079), participants in the flipped classroom did perform significantly better on the application/synthesis multiple choice questions (F(1,167) = 4.28, p = 0.0402) (see Figure 1). There was no significant difference between scores on the knowledge/comprehension multiple choice questions (F(1,167) = 0.13, p = 0.7171).

Results of Participant Survey

The online survey was sent to all participants (n = 99) because, while a participant may not have taken the final assessment, they were all exposed to

89 Developing a Community of Practice the &ipped-classroom approach. "e re- room, loss of real-time response, technolo- sponse rate for the survey was 78% (n = gy problems, and more work in the class. 77). Demographic questions indicated that Learner autonomy. One of the survey responders were 73% female (n = major themes that emerged from the par- 51) and 27% male (n = 19) with 9% not re- ticipant responses was the idea of learner sponding (n = 7). Age of respondents broke autonomy, or being in control of when, down as 50% (n=35) between 18 and 20 where, and how frequently to access video years of age, 44% (n = 31) between the ages content for the course. Typical responses of 21 and 29, 6% (n = 4) over 29 years of for this item included, I thought the &ipped age, and 9% (n = 7) not responding. classroom method was e%ective because I Participants’ thoughts on the e!ca- could watch it on my own time. I enjoyed the cy of "ipped classroom. Participants were fact that I could rewind parts that I did not asked to rank their level of agreement with understand and I could rewatch the videos if several statements regarding &ipped class- necessary). room using a Likert-type scale of 1 – Strong- Active classroom. "e second ly Disagree to 5 – Strongly Agree. Overall, theme that presented itself was the idea of participants indicated satisfaction with the the active classroom. Participants indicated format tending to average above the mid- that they enjoyed working on problems in point in the scale on all items related to in- class, some saying that …it was great to do structional format, including items such as I the homework in class because I had already think I learned more as a result of this meth- seen the videos and PowerPoint, so if I had od and I felt more engaged in class when the any questions I could ask them. Lecturing in classroom was &ipped (see Figure 2). the classroom just gets boring, but when we #e role of the textbook. Additional- engage in the class and work together, I feel ly, participants were asked to identify when like it was easier to learn. they read their textbooks in relationship Loss of real-time response. State- to the videos. Results indicated that 33% ments such as I did not like not being able to of participants (n = 24) read the textbook communicate and ask questions represent a a$er watching the videos, thus suggesting theme found in open-ended responses that that the videos acted as an advanced orga- indicated to the researchers that the loss of nizer for participants. Additionally, 31% of real-time interaction while watching the participants (n = 22) claimed they did not videos was uncomfortable for some partici- read at all (see Figure 3). pants. Results of open-ended questions Technology problems. While tech- regarding the "ipped- classroom. Partic- nical problems were not common, for those ipants were also asked to respond to two students who did experience them, they open-ended questions, namely (1) Please appeared to negatively impact their percep- describe what you LIKED or thought was EF- tions of learning. While di%culties ranged FECTIVE about the &ipped classroom meth- from Internet connectivity to computer od and (2) Please describe what you DID hardware and so$ware, the general trend NOT LIKE or thought was INEFFECTIVE was that having any technical problem at all about the &ipped classroom method. "e- decreased the comprehension and overall matic analysis of these responses showed satisfaction with the format. 
"is is not un- #ve major themes related to the &ipped expected as overcoming technical support classroom: learner autonomy, active class- issues is part of any blended or online course.
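The test-score comparisons reported in the Results of Participant Performance Test Scores section were run as a mixed-model ANOVA in SAS (PROC MIXED). For readers without SAS, the sketch below shows the general shape of such a group comparison as a simple one-way ANOVA in Python; the score lists are hypothetical placeholders rather than the study's data, and a one-way ANOVA is a simplification of the mixed model actually used.

# Minimal sketch (assumed tooling: Python with SciPy) of a flipped-vs-comparison
# group contrast; the scores below are invented placeholders for illustration only.
from scipy.stats import f_oneway

flipped_scores = [78, 85, 69, 91, 74, 83]      # hypothetical application-item scores
comparison_scores = [72, 80, 65, 88, 70, 76]   # hypothetical comparison-group scores

f_stat, p_value = f_oneway(flipped_scores, comparison_scores)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")  # compare p to the .05 criterion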


Figure 1. Mean scores on application questions.

Figure 2. Mean scores of Likert-type questions.

Figure 3. Counts of responses to question regarding when/if textbook was read.


More work for participants. Finally, seems to indicate that simply &ipping the participants indicated that they did not like class without the inclusion of other prov- the fact that the &ipped- classroom design en teaching practices such as inquiry- or required more work from them outside of problem-based approaches does not yield class. A representative response from this the greatest gains in student learning. It theme was: I did not like spending so much also might indicate that, as with so many time out of class working for the class. While other issues related to student content participants considered this a negative of knowledge, the skill of the teacher might the &ipped classroom, the researchers felt be a greater determinant of student learn- this was actually a positive outcome of the ing gains than the &ipping itself; however, &ipped-classroom model. more research is needed in these areas to determine the individual factors that lead Discussion to greater learning gains in a &ipped envi- ronment. he results of this study seem to in- "e responses that participants dicate that the &ipped-classroom provided regarding how and when they model has some promise as a teach- accessed textbook materials were especial- Ting method, as participants did score ly interesting and suggest that the videos signi#cantly higher on the application could be acting as an advanced organizer portions of the learning assessment. Par- for the denser textbook materials. If this is ticipants also seem to enjoy the level of the case, then the use of a quick overview autonomy they have when the course con- video with the speci#c purpose of guiding tent is posted online and accessible when reading might lead to better reading com- needed as well as the ability to play back prehension and thus greater learning gains. the videos as o$en as needed to understand Further, the fact that an almost equally the concept. Participants also seem to en- large number of participants did not read joy the change in the nature and quality of at all suggests that perhaps a more thought- the face-to-face components of the course ful evaluation of the role of textbook, and as they get to spend it engaging in active textbook alternatives, is needed in order to learning experiences. In addition, while ensure students understand its relevancy to participants did not like the idea of having the course. More research on the interac- to exert more e!ort outside of class in the tion between video and textbook resourc- &ipped classroom approach, from an in- es should be done to further delineate how structor standpoint, participant reports of best to interweave those two resources to increased e!ort on course-related content optimize student learning. was welcome, and might be a way to im- "ere are some limitations to this prove participant study behaviors. study that require us to examine the #ndings However, the learning gains found of this research in the proper context. As are not as high as anecdotal reports suggest the groups in this study were not randomly (e.g., Flipped Learning Network, 2012). selected, generalization to the larger popu- While the &ipped classroom model did lead lation of college students (or even college to signi#cant increases in test scores on students in &ipped biology courses) might application-type questions, there were no be unwise. Additionally, given the amount signi#cant di!erences in knowledge-type of time that had passed between the &ipped questions or in overall test scores. 
"is group and the comparison group (one aca-

92 Internet Learning demic year), there could be timing a!ects References that made the groups perform di!erently. Further research in this area utilizing more Allen, I. E., Seaman, J., & Barrett, R. robust research designs is warranted in (2007). Blending in: "e extent and prom- order to arrive at more conclusive results ise of online education in the U.S. Babson about the e%cacy of the &ipped classroom Research Group. Retrieved from http:// model. sloanconsortium.org/sites/default/files/ pages/Blending_In.pdf Conclusions Baran, E., Correia, A. P., & "omp- esearch on the &ipped classroom son, A. (2011). Transforming online teach- is only beginning, and, while stud- ing practice: Critical analysis of the litera- ies on equivalency with traditional ture on the roles and competencies of online Rinstruction are needed, more research on teachers. Distance Education, 32(3), 421– how and when to e!ectively implement 439. doi:10.1080/01587919.2011.610293 this teaching strategy is the logical next step in &ipped-classroom research. While Bergman, J., & Sams, A. (2012). Flip this research shows that a &ipped class- your classroom: Reach every student in ev- room can increase student learning, it does ery class everyday. Washington, DC: ISTE. not identify which speci#c factors within the &ipped-classroom model lead to great- Diehl, C. A. (2013). Charles A. er learning gains in students, and further Wedemeyer: Visionary pioneer of distance research in this area can help clarify for education. In Michael Moore’s (ed.), Hand- educators how best to incorporate this ap- book of Distance Education (3rd ed.). New proach in their own classroom. In the end, York, NY: Routledge. the #ndings of this research seem to indi- cate that the &ipped-classroom approach Flipped Learning (2010). "e may not be the panacea for science in- &ipped classroom. [YouTube video]. Re- struction many wish it to be, but rather one trieved from http://www.youtube.com/ more tool for a skilled instructor to use in watch?v=2H4RkudFzlc his/her e!orts to support student learning. Flipped Learning Network. (2012). Improve student learning and teacher sat- isfaction with one &ip of the classroom. Re- trieved from http://&ippedlearning1.#les. wordpress.com/2012/07/classroomwin- dowinfographic7-12.pdf

Ko, S., & Rossen, S. (2010). Teaching online: A practical guide (3rd ed.). New York, NY: Routledge.

Mangan, K. (2013, October 4). Inside the flipped classroom. Chronicle of Higher Education, 60(5), B18–B21.


Mason, G. S., Shuman, T. R., & Cook, K. E. (2013). Comparing the effectiveness of an inverted classroom to a traditional classroom in an upper-division engineering course. IEEE Transactions on Education, 56(4), 430–435. doi:10.1109/TE.2013.2249066

Musallam, R. (2013). 3 rules to spark learning [Web video]. Retrieved from http://www.youtube.com/watch?v=YsYH-q*0X2A

Winston, H. (2013). Quickwire: 'Flipping' classrooms may not make much difference [Web log post]. Retrieved from http://chronicle.com/blogs/wiredcampus/quickwire-flipping-classrooms-may-not-make-much-difference/47667

Middendorf, J., & Kalish, A. (1996). The 'change-up' in lectures. National Teaching & Learning Forum, 5(2), 1–12. doi:10.1002/ntlf.10026

Satullo, S. K. (2013). Pa. colleges flip classrooms to boost engagement. Community College Week, 26(4), 13.

Springen, K. (2013). Flipping the classroom: A revolutionary approach to learning presents some pros and cons. School Library Journal, 59(4), 23.

Topo, G. (2011, October 7). 'Flipped' classrooms take advantage of technology. USA Today. Retrieved from http://usatoday30.usatoday.com/news/education/story/2011-10-06/flipped-classrooms-virtual-teaching/50681482/1

Tune, J. D., Sturek, M., & Basile, D. P. (2013). Flipped classroom model improves graduate student performance in cardiovascular, respiratory, and renal physiology. Advances in Physiology Education, 37, 316–320. doi:10.1152/advan.00091.2013

Wenger, E. (1998). Communities of practice: Learning, meaning, and identity. Cambridge, UK: Cambridge University Press.

Internet Learning Volume 3 Issue 1 - Spring 2014

Beliefs Regarding Faculty Participation in Peer Reviews of Online Courses

Andria F. Schwegler (A), Barbara W. Altman (B), Lisa M. Bunkowski (C)

Prior to implementing a voluntary, unofficial Quality Matters peer review process for online courses at our institution, several faculty members openly expressed concerns about the process. To systematically identify and examine how highly endorsed these beliefs actually were, we used the Theory of Planned Behavior (Ajzen, 1985) to investigate faculty beliefs and their plans to participate in the peer review. This behavior prediction model provided a logical theoretical basis for this investigation because it targets intentions to perform volitional behaviors and directly examines salient beliefs underlying attitudes, subjective norms, and perceived behavioral control toward the behavior. Though differences in belief endorsement between faculty members who chose to participate in the peer review and those who did not could not be tested statistically due to small sample sizes, a qualitative examination of the endorsement of the modal belief statements provided useful information about faculty members' perceptions of completing the peer review. Our results indicated that many of the concerns and criticisms of the peer review process were not as highly endorsed as initially assumed. Our objective examination of faculty beliefs, instead of reliance on hearsay and a vocal minority, was useful in identifying genuine faculty concerns that could be directly addressed. Our data provided directions to guide administrative changes in our process to increase participation in internal peer reviews with the goal of improving online course design quality.

Keywords: peer review, online course design, faculty beliefs, Quality Matters, faculty attitudes

A Dr. Andria F. Schwegler is an Assistant Professor of Psychology and the Online Coordinator for the College of Education at TAMUCT. She teaches graduate and undergraduate courses in Psychology including social psychology, statistics, and research methods. Her research areas of interest include social norm formation and change, post-traumatic stress and depression symptoms in soldiers returning from combat, and the scholar- ship of teaching and learning. B Dr. Barbara W. Altman is an Assistant Professor in Management and the Online Coordinator for the College of Business Administration at TAMUCT. She teaches graduate and undergraduate courses in Business Ethics, Leadership and Strategy. Her research interests are in corporate social responsibility, organizational change, inter-organizational partnerships, and the leadership traits necessary to facilitate such linkages. Her research interests also involve methods to improve online and blended course design and delivery. C Dr. Lisa M. Bunkowski is an Assistant Professor of History and the Director of Instructional Enhancement and Innovation (IEI) at TAMUCT. She teaches graduate and undergraduate courses in American History. Her disciplinary research focus is on the mid-nineteenth century USA, with emphasis on issues of violence and gender. She is also involved with several regional Oral History projects. As the Director of IEI, she is fo- cused on the scholarship of teaching and learning, faculty development, and is actively involved in supporting course design and delivery of online and blended programs.


uality Matters (QM) is one of the review on a voluntary basis is a way to gain most widely accepted set of stan- faculty buy-in in the process. Obviously, dards guiding the online course when participation is not optional, faculty Qdesign quality. As of 2013, QM reported members do not need to accept or endorse over 600 member institutions, and over the process to prompt participation, but 22,000 faculty and instructional designers when the process is optional, what prompts trained on the QM standards and course faculty members to participate in the peer review process (Quality Matters, 2013). review? More speci#cally, what do facul- "e core of the QM approach is a rubric ty members believe about the peer review covering eight overarching student/learner process when it is implemented at their focused standards, with a total of 41 spe- institution? How are these beliefs related ci#c course design standards. To ascertain to plans to participate? How can beliefs be whether an online course meets these stan- used to modify procedures with the goal of dards, a faculty developer submits a course increasing participation rates? To provide to the faculty peer review process. "e initial answers to these questions and guide goal of this process is continuous course our QM implementation process, the pres- improvement so that any identi#ed weak- ent research investigated faculty beliefs re- nesses are corrected, based on constructive garding the introduction and #rst wave of peer feedback (Finley, 2012). reviews in a QM peer review process. "e When initiating a peer review of on- goal of this research was to improve our line courses, subscribing institutions have understanding of faculty beliefs regarding the option to participate in either o%cial voluntary completion of a peer review of an QM reviews or uno%cial internal reviews. online course so that revisions to our pro- O%cial QM reviews include a Master Re- cess and new institutional changes could be viewer, a Subject Matter Expert, and an ex- designed to increase faculty participation ternal Reviewer who constitute the peer re- and ultimately improve our online course view committee, and successful completion quality. of the process leads to the o%cial QM des- ignation as a quality assured course. Unof- Literature Review #cial reviews do not receive the QM desig- nation, but they allow institutions to select o systematically investigate faculty their own peer review committee members beliefs and plans to participate in a to accommodate unique institutional needs peer review, the "eory of Planned and course improvement goals. Both o%- TBehavior provided a logical theoretical cial QM reviews and internal peer reviews basis for this study because it targets voli- are governed by the same set of standards tional behaviors and directly examines sa- and consensus protocols for determining lient beliefs regarding the behavior. Ajzen’s when standards are met. "eory of Planned Behavior (Ajzen, 1985, "e choice to participate in o%cial 1991, 2012) is one of the most widely ap- versus uno%cial reviews is determined at plied behavior prediction models in the so- the institutional level, along with the policy cial psychology literature. In a recent re- of mandated versus voluntary peer review. &ection, Ajzen (2011) noted that the theory For institutions undergoing an organiza- was cited 22 times in 1985, and citations tional change to implement QM, o!ering have grown steadily to 4,550 in 2010. 
A re- participation in an uno%cial internal peer view of the model’s signi#cance found this

research to have the highest scientific impact score among U.S. and Canadian social psychologists (Nosek et al., 2010).

The Theory of Planned Behavior model is shown in Figure 1 (Ajzen, 2013a). The key tenets of the model include direct and indirect measures of attitudes, subjective norms, and perceived behavioral control that are used to predict intention, which subsequently predicts behavior (Ajzen, 1991, 2012, 2013a). According to this expectancy-value model, which weights beliefs about actions by their value, behavior is the actual manifestation of an individual's action in a particular circumstance. Intention is the proximal predictor of a person's behavior and indicates an individual's willingness/readiness to demonstrate that behavior. The model postulates a strong relationship between intention and behavior for those behaviors that are under one's own volitional choice. Intention is predicted from an individual's attitude toward the behavior, subjective norms regarding the behavior, and perceived behavioral control over the behavior. Attitudes are defined as "the individual's positive or negative evaluation of performing the behavior" and are assessed by having individuals respond to bipolar adjective scales regarding the behavior under examination (e.g., good–bad; Ajzen & Fishbein, 1980, p. 6). Subjective norms are "the person's perception of the social pressures put on him to perform or not perform the behavior in question" (Ajzen & Fishbein, 1980, p. 6). This social pressure is measured as a general sense of what important others think one should do. Finally, perceived behavioral control is one's sense of his or her ability to perform the behavior under examination, which can be assessed by having individuals rate the extent that performing the behavior is up to them and whether they perceive they have control over it.

Underlying these attitudes, norms, and control perceptions are one's beliefs, which are weighted by the subjective value of these beliefs. Specifically, behavioral beliefs are the accessible thoughts a person holds regarding a certain behavior. These beliefs are tempered by one's evaluations of the outcomes associated with these beliefs. In terms of the model, each of a person's behavioral beliefs is multiplied by the outcome evaluation associated with that belief. Then, each of these products is summed to form an indirect assessment of one's attitude toward the behavior. In a similar manner, normative beliefs are the salient expectations perceived by individuals that are set by members of a relevant referent group (e.g., family members, co-workers). These beliefs are weighted by one's motivation to comply with these expectations, and the sum of products (i.e., each belief multiplied by the motivation to comply with it) constitutes an indirect measure of one's subjective norms regarding the behavior. Finally, control beliefs are thoughts regarding factors in the setting which may either impede or enhance the performance of the behavior. These factors are weighted by the power each control factor holds over the individual. One's level of perceived behavioral control is the sum of these products (i.e., each control factor multiplied by its power of control) to perform the behavior.

Given the wide application of the theory, several notable meta-analyses speak to the model's utility in predicting intentions to engage in a multitude of behaviors. For example, in a review of 185 independent and varied applications of the model conducted prior to 1997, Armitage and Conner (2001) found that across all behaviors studied, the correlation of intention and perceived behavioral control was significant, with perceived behavioral control accounting for 27% of the variance

97 Beliefs Regarding Faculty Participation in intention (R2 = 0.27). In addition, the light the overall e%cacy of the model in multiple correlations of attitude, subjec- behavior prediction. Unfortunately, appli- tive norm, and perceived behavioral con- cations of the "eory of Planned Behav- trol accounted for an average of 39% of the ior to teaching online, predicting faculty variance in intention (R2 = 0.39). "eir me- behaviors, and revising higher education ta-analysis supported the overall e%cacy of practices are extremely limited. Of the the model though they called for further few studies in this area Celuch and Slama study of the subjective norm component (2002) applied the theory to a business and attention to di!erences in self-report school course to evaluate the impact of versus observed behavior measurement. faculty-led interventions on student be- In a meta-analysis of 33 studies, havior. "ese researchers used pre- and Cooke and French (2008) examined the post-course assessments to examine how model’s overall ability to predict intention variables in the "eory of Planned Behav- to participate in health screenings and ior impacted learning critical-thinking subsequent attendance behavior. "eir skills in a marketing course. "eir #ndings meta-analysis found the strongest relation- show that some variables, speci#cally atti- ships between attitudes and intentions and tudes from the "eory of Planned Behav- the weakest relationships between subjec- ior, were accurate predictors of changes in tive norms and intentions. For attendance behavior and con#rmed the positive e!ect behavior, they found a medium-sized rela- of the course’s pedagogy on critical think- tionship between intention and behavior ing. Speci#cally, they reported that certain and a small relationship between perceived systematic elements of the course such behavioral control and behavior. "ese as expectation setting, opportunities for #ndings support the overall e%cacy of the practice, and constant feedback were sys- model, consistent with Armitage and Con- tem interventions that positively impacted ner’s #ndings. observed instances of critical thinking be- In a more recent meta-analy- haviors. sis predicting health-related behaviors, Utilizing a portion of the "eory McEachan, Conner, Taylor and Lawton of Planned Behavior, Alshare, Kwum, and (2011) analyzed 206 papers, representing Grandon (2006) examined faculty inten- 237 tests of the theory. Like previous me- tion to teach online at one American and ta-analyses, their study showed a strong two Korean institutions. "eir model in- relationship between intention and behav- cluded two factors derived from previous ior, and perceived behavioral control pre- research on faculty adoption of online dicted a small proportion of the variance courses, communication e%cacy, and &exi- in behavior. Attitude, subjective norm, and bility. "e third factor was subjective norm perceived behavioral control emerged as taken directly from Ajzen’s (1991) work. In the strongest predictors of intention rela- this context, subjective norm was de#ned tive to other variables added to the model, as the combined social pressure of school and attitude was consistently the strongest administrators and close faculty members predictor. "e purpose of this research, like to teach online courses. 
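The indirect, sum-of-products measures described in the model overview above are conventionally written as expectancy-value sums. The equations below are a standard rendering of that formulation using generic symbols; they are not reproduced from this article or its survey instrument.

% Standard expectancy-value formulation of the Theory of Planned Behavior;
% the symbols are conventional placeholders, not notation taken from this study.
A \propto \sum_{i} b_i e_i \qquad SN \propto \sum_{j} n_j m_j \qquad PBC \propto \sum_{k} c_k p_k

Here b_i is the strength of behavioral belief i and e_i its outcome evaluation; n_j is normative belief j and m_j the motivation to comply with it; and c_k is control belief k and p_k the perceived power of that control factor.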
"e hypothesis that of Cooke and French (2008), was to that subjective norms had a positive rela- propose interventions to modify behavior tionship with the adoption of online teach- that could be examined in further research. ing was supported at both the American Taken together, these meta-analyses high- and Korean institutions.


More closely related to the current project, Hartmann (2011) used the Theory of Planned Behavior to explore whether institutional-level interventions would alter faculty willingness to submit research grant proposals in what had been a traditional teaching institution. The hypotheses grounding the case study were derived directly from the theory and stated that a faculty member would be more likely to intend to write and actually submit a proposal for funding when that individual "believes that submitting funding proposals is a desirable and valued behavior; sees other similar people successfully writing and submitting proposals; and perceives they are able to write and submit proposals, that obstacles can be overcome" (Hartmann, 2011, p. 48).

A number of interventions were put in place to test the behavioral change regarding attitudes, subjective norms, and perceived control. For example, to change attitudes regarding submitting research grant proposals, interventions included publishing a monthly newsletter and promoting public awards and recognition. To change perceptions of subjective norms, interventions included welcome letters to new faculty and department chairs emphasizing the importance of funded research, published college-wide statistics, and faculty workload allocations. To change perceived behavioral control, grant writing workshops, how-to manuals, and tutorials were offered to faculty along with administrative support. The case study findings, documented over a 10-year period, show average annual grant proposals rising, with indirect cost support to the college increasing steadily. The case study concludes that managerial interventions can impact faculty members' intentional behaviors to increase their participation in submitting sponsored research.

The Present Research

Previous research involving the Theory of Planned Behavior supports our expectation that the model can be used to identify and measure faculty members' underlying beliefs, attitudes, subjective norms, and perceptions of control regarding voluntary decisions to participate in a peer review. As such, we used the Theory of Planned Behavior to provide a process for eliciting faculty beliefs regarding participating in a peer review as we introduced an internal QM peer review process at our institution. After the beliefs were identified, we had faculty members, those who volunteered to participate in a peer review and those who did not, evaluate the statements. Their ratings provided the basis for revisions to our procedures and for the development of interventions to increase participation in a peer review, which should subsequently improve our online course offerings.

Method

Participants

Research participation was offered to all faculty members who were eligible to participate in the peer review process during its first year of implementation (i.e., four semesters from Summer 2012 through Summer 2013). To be eligible to participate in the peer review, faculty members must have taught at least one fully online course on the recently adopted learning management system. Of these eligible faculty members (N = 60), 19 faculty members volunteered to participate in the peer review process for at least one course. Of these peer review participants, eight faculty members also volunteered to participate in this research, a 42% participation rate. Of the 41 faculty members who chose not to participate in the peer review, six faculty members volunteered to participate in this research, a 15% participation rate.

Though participation in the peer review process was incentivized with a $1000 stipend for successful completion, participation in this research study was not incentivized. Faculty members received no compensation for their participation in this research, which was entirely voluntary and was not linked to peer review outcomes. The researchers, who worked with faculty members on their peer reviews, were blind to the research participation status of all faculty members until the peer review process was concluded at the end of its first year. This research was reviewed and approved by the Institutional Review Board of Texas A&M University Central Texas.

Pilot Questionnaire

Utilizing the Theory of Planned Behavior requires the development of a survey questionnaire that is based on the salient attitudes, subjective norms, and perceptions of control regarding the behavior for the target group. Therefore, the initial step in developing our primary questionnaire was documenting faculty comments and beliefs regarding the introduction of the peer review process.

Prior to implementing our internal peer review, many faculty members openly expressed concern about the process and were reluctant to participate. The only previous experience the majority of our faculty members had with a similar process was when department chairs visited their classrooms to complete their administrative faculty evaluations. These faculty evaluations tend to be stress-provoking events for most faculty members because the outcomes of the observations are directly associated with contract renewals and merit raises. So, when the peer review was discussed at our institution, many faculty members equated it with an administrative review and were not receptive.

Among the criticisms initially targeted at the peer review of online courses were claims that the comments regarding course revisions made during the context of peer review would be an infringement on the faculty course developer's academic freedom. In addition, faculty members who had previous experience with only a review of their teaching (i.e., evaluation of course delivery) were not familiar with distinguishing between course design and course delivery and held persistent beliefs that confounded the two concepts.

Faculty-generated concerns and criticisms of the peer review process were consistently directed to the Online Coordinators (i.e., faculty members who carried administrative duties to work with faculty to teach online), who were responsible for introducing and explaining the process to the faculty in their respective colleges. Immediately prior to implementation, the Online Coordinators recorded this information on a pilot questionnaire, which was used to create the main survey for this research. The behavior targeted in both the pilot questionnaire and the main survey was defined as "completing the TAMUCT peer review process for one online course by the end of the current semester." The pilot questionnaire included three open-ended items to elicit behavioral outcomes mentioned by faculty members (i.e., advantages and disadvantages of completing the peer review and "what else comes to mind when you think about" completing this process).

Normative referents for the peer review process were elicited with four open-ended questions that requested a list of the individuals or groups who would approve, disapprove, be most likely to complete, and be least likely to complete this process. Perceived behavioral control regarding the peer review process was elicited with two open-ended items that requested a list of any factors or circumstances that would make it easy and difficult to complete the internal peer review.

Each Online Coordinator independently responded to the pilot survey with the concerns and comments that faculty members in the respective college made. The surveys were collected approximately one week after being distributed and the responses compiled. Similar comments were combined (e.g., "I will learn some new techniques for online teaching" and "See what others are doing…so I can borrow good ideas"), and some comments were reworded to communicate a neutral affective tone (e.g., "People who want to get out of the required training"). All faculty comments that were listed by the Online Coordinators on the pilot survey were represented on the final survey as a set of modal faculty beliefs. The combined, revised list of behavioral beliefs that were expressed by faculty members when the peer review process was introduced is listed in the first column of Table 1, the normative beliefs are listed in the first column of Table 2, and the control beliefs are listed in the first column of Table 3. Few control beliefs were elicited by the pilot study, so these beliefs were supplemented with example items from Ajzen (2013b).

Final Peer Review Survey

The final survey consisted of 79 items assessing each of the constructs proposed by the Theory of Planned Behavior. This "Peer Review of Online Courses: Opinion Survey" included six statements that were direct measures of faculty members' attitudes toward completing the peer review process, rated on 7-point scales ranging from 1 (extremely good) to 7 (extremely bad), 1 (valuable) to 7 (worthless), 1 (pleasant) to 7 (unpleasant), 1 (enjoyable) to 7 (unenjoyable), 1 (difficult) to 7 (easy) (reverse scored), and 1 (unnecessary) to 7 (necessary) (reverse scored). When examining the inter-item reliability of these statements, the item assessing how necessary completing the peer review process was displayed low correlation with the rest of the items and was removed. The remaining five items were averaged into an overall measure of attitudes (Cronbach's α = 0.90). Lower scores indicate more positive attitudes toward completing the peer review.

Direct measures of norms consisted of five items. Norms indicating what most other faculty members do were measured by responses to the items "faculty who are similar to me will complete the peer review process" and "most faculty will complete the peer review process," on 7-point scales ranging from 1 (definitely true) to 7 (definitely false). Norms indicating social pressure to complete the peer review were measured by responses to the following items: "most of my colleagues whose opinions I value approve of me completing the peer review process," rated from 1 (agree) to 7 (disagree); "most people who are important to me think that I 1 (should) to 7 (should not) complete the peer review process"; and "it is expected of me to complete the peer review process," rated from 1 (definitely true) to 7 (definitely false). When examining the inter-item reliability of these statements, the items regarding "faculty who are similar to me will" and "it is expected of me to" complete the peer review process produced low correlations with the rest of the items and were removed. The remaining three items were averaged into an overall measure of norms (Cronbach's α = 0.69). Lower scores indicate more supportive norms regarding completing the peer review.

Direct measures of perceived behavioral control were assessed by four items, including "I am confident that I can complete the peer review process" and "I have full control over whether I complete the process," rated from 1 (definitely true) to 7 (definitely false). One item assessed agreement with the statement that "whether or not I complete the peer review process is completely up to me," from 1 (strongly agree) to 7 (strongly disagree), and one item measured whether completing the peer review process is 1 (impossible) to 7 (possible) (reverse scored). When examining the inter-item reliability of these statements, the item assessing how possible completion of the peer review process was displayed low correlation with the rest of the items and was removed. The remaining three items were averaged into an overall measure of perceived control (Cronbach's α = 0.72). Lower scores indicate greater perceived control over completing the peer review process.

Intention to complete the peer review process for one online course by the end of the current semester was assessed by four items. Participants rated the following statements: "I plan to complete the process" on a 1 (extremely likely) to 7 (extremely unlikely) scale, "I will make an effort to complete the process" on a 1 (definitely will) to 7 (definitely will not) scale, "I intend to complete the peer review process" on a 1 (strongly agree) to 7 (strongly disagree) scale, and "I am going to complete the process" on a 1 (definitely true) to 7 (definitely false) scale. These items were averaged into an overall measure of intention (Cronbach's α = 0.99). Lower scores indicate stronger intentions to participate in the peer review process.
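To make these scoring rules concrete, here is a minimal sketch (in Python, with made-up ratings for a hypothetical three-item scale, not the study's data) of how an oppositely keyed 7-point item can be reverse scored, how items are averaged into a scale score, and how Cronbach's alpha can be computed:

import numpy as np

def reverse_score(ratings, low=1, high=7):
    # Flip a 1-7 rating so that lower values keep meaning "more favorable"
    return low + high - np.asarray(ratings, dtype=float)

def cronbach_alpha(items):
    # items: respondents x items array of ratings
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical responses to three control items; the third item is keyed in
# the opposite direction (e.g., 1 = impossible, 7 = possible), so it is
# reverse scored before averaging.
raw = np.array([
    [2, 1, 6],
    [3, 2, 6],
    [1, 1, 7],
    [5, 4, 2],
])
items = raw.astype(float)
items[:, 2] = reverse_score(raw[:, 2])

scale_scores = items.mean(axis=1)   # lower scores = more perceived control
print(scale_scores, round(cronbach_alpha(items), 2))

Lower scale scores preserve the interpretation used throughout this article, in which lower numbers indicate more favorable responses.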
Research participants' past behavior in the internal peer review process since its inception was assessed with two open-ended items requesting the number of courses submitted and the number of courses successfully completing the peer review process. All but three participants had no courses reviewed through the internal peer review process prior to their participation in this research. Research participants' actual behavior regarding peer review completion was recorded at the end of the initial round of peer reviews (i.e., 15 months after the project was implemented) with a 0 (nonparticipant) and 1 (participant) distinction. All research participants who started the peer review process successfully completed it before the review process was closed.

Indirect measures of attitudes (i.e., behavioral beliefs and outcome evaluations), norms (i.e., normative beliefs and motivation to comply), and perceived behavioral control (i.e., control beliefs and power of control factors) were assessed with the beliefs elicited from the pilot study presented in Tables 1–3, respectively.

For the indirect measure of attitudes, each behavioral belief listed in Table 1 was written as the conclusion to the statement "Completing the TAMUCT peer review process will" and was rated on a 7-point scale from 1 (extremely unlikely) to 7 (extremely likely). The positive items phrased in terms of benefits of participating in the process (i.e., items 1 through 7) were reverse scored. Each outcome evaluation was adapted to fit as the conclusion to "For me to" and was rated on a 7-point scale from 1 (extremely good) to 7 (extremely bad). Lower scores indicate more supportive beliefs regarding completing the peer review process. Consistent with the Theory of Planned Behavior model, each behavioral belief was multiplied by the corresponding outcome evaluation prior to summing all the products for the composite indirect measure of attitudes.

For the indirect measure of norms, each normative belief listed in Table 2 was written as the subject of the statement "think(s) that I should complete the TAMUCT peer review process for one online course by the end of the current semester" and was rated on a 7-point scale from 1 (extremely likely) to 7 (extremely unlikely). Motivation to comply with each referent was inserted in the blank "Generally speaking, how much do you care what your ___ thinks you should do" and was rated on a 7-point scale from 1 (not at all) to 7 (very much), which was reverse scored. Lower scores indicate more supportive beliefs regarding completing the peer review process. Consistent with the Theory of Planned Behavior model, each normative belief was multiplied by the corresponding motivation to comply prior to summing all the products for the composite indirect measure of norms.

For the indirect assessment of perceived behavioral control, each control belief listed in Table 3 was adapted to fit the blank "How often do you encounter ___" and was rated on a 7-point scale from 1 (very rarely) to 7 (very frequently). The items regarding receiving assistance from the Online Coordinator and receiving incentives to complete work were reverse scored. To assess the power of control factors, each control belief was inserted at the beginning of the statement "it would make it more difficult (or easier, as noted in Table 3) for me to complete the TAMUCT peer review process for one online course by the end of the current semester" and was rated on a 7-point scale from 1 (strongly agree) to 7 (strongly disagree). All items were reverse scored except the items regarding receiving assistance from the Online Coordinator and receiving incentives to complete work. Lower scores indicate more supportive beliefs regarding completing the peer review process. Consistent with the Theory of Planned Behavior model, each control belief was multiplied by the corresponding power of control factor prior to summing all the products for the composite indirect measure of perceived behavioral control.
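Written out, the three multiply-and-sum rules just described take the familiar expectancy-value form. The notation below simply restates the text, with b and e denoting the behavioral beliefs and outcome evaluations (Table 1), n and m the normative beliefs and motivations to comply (Table 2), and c and p the control beliefs and power of control factors (Table 3):

\[
A_{\mathrm{indirect}} = \sum_{i} b_i\, e_i, \qquad
SN_{\mathrm{indirect}} = \sum_{j} n_j\, m_j, \qquad
PBC_{\mathrm{indirect}} = \sum_{k} c_k\, p_k
\]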
Procedure

Our recently independent, regional university began offering online courses in Fall 2009 and became a QM-subscribing institution in Fall 2010. Concurrently, the institution submitted our "Institutional Plan for Distance Education" to the state's Higher Education Coordinating Board, outlining 17 fully online programs to be implemented over three years. During this high online growth period, University leadership was committed to providing institutional supports (e.g., training, incentives, mentors) to faculty to design high quality courses and put in place a culture where online quality was valued. In support of this goal, the Online Coordinator (OC) position was created, in which one faculty member from each college assumed part-time administrative duties to facilitate and mentor faculty teaching online courses. All OCs were QM Certified Peer Reviewers and taught fully online courses. As our procedures evolved, QM training was made mandatory for faculty teaching online courses, and submitting a course for peer review became a voluntary but incentivized option. The University's QM goal was that as many faculty as possible would submit their courses for peer review so that course improvements could achieve design quality as demonstrated through meeting QM standards. By summer 2012, we had trained 56 of our faculty members on the QM Rubric when we introduced our peer review process.

The decision to develop an internal process at our institution, rather than adopt the official QM process, was based on a consensus of faculty preference. Our faculty wanted to be personally engaged in the peer review process, and they were reluctant to involve external reviewers, whom they believed might not understand our unique institution. In addition, some faculty members expressed concern about a pre-existing, external procedure being "imposed" upon them. To gain faculty support, we tailored our peer review process to allow faculty ownership of it and to reassure them that they would have the chance to revise their courses before any official reviews were undertaken.

In our process, faculty members intending to submit a course for peer review made the request through the Distributed Learning and Instructional Technology Office, which informed the respective college OC. During the initial contact with the faculty member, the OC invited the faculty member to participate in this research study. Potential participants received a copy of the IRB-approved Informed Consent form and a link to the online survey administered via Survey Monkey. Faculty members were instructed to return the signed Informed Consent form to a designated staff member in the institution's research office, not to the OC, and then complete the online survey. Because each OC collaborated as a facilitator and mentor with each peer review participant on course revisions prior to the course being submitted to the internal peer review team, all OCs were blind to research participation status until all peer reviews were completed.

To initiate the review process, the faculty course developer conducted a self-review of the course using the QM Rubric by identifying the location in the course where each standard was met. The purpose of the self-review was to get faculty familiar with using the Rubric as a peer review tool and to allow them to systematically examine their course from a reviewer's perspective, to reduce their apprehension, and to assist them in making revisions to the course before revealing it to the peer review team. While the faculty course developer completed the self-review, the faculty member's college OC conducted an independent review of the course. After the faculty course developer and the OC completed their reviews, they met to discuss revisions to the course. Faculty members were under no obligation to implement any of the revisions suggested by the OC, who served strictly in a support role to assuage faculty concerns regarding administrative evaluation. Once the faculty course developer was satisfied with the course, it was opened to the peer review team.

To be eligible to serve on an internal peer review team, each reviewer completed the Applying the Quality Matters Rubric (APPQMR) course, the established, basic QM Rubric course. Once the review teams were set and the course opened to the peer review team, the OC stepped out of the process, and the faculty Chair of the review team was responsible for scheduling timelines, leading team meetings, and closing the review in the QM Course Review Management System. The peer review was conducted in accordance with the official QM process and its met/not met numbering thresholds. If the course did not meet the threshold requirements on the initial review, the faculty course developer consulted with the peer review team and made revisions to the course until it earned enough points to meet requirements. Eligible faculty members received a $1,000 stipend when their courses successfully completed the entire peer review process, and these courses were distinguished as internally quality assured.

As part of the transition when training to teach online became mandatory, prior successful completion of the peer review was sufficient evidence of online course development proficiency to exempt a faculty member from taking the required training to teach online, which rendered the faculty member ineligible for the incentive. Faculty members who volunteered to peer review courses also received a small stipend (i.e., $250 for every three courses reviewed).

At the close of the fourth and final semester of peer reviews following this procedure, each college OC sent an email invitation to participate in this research study to the faculty members identified as eligible to submit a course for peer review but who chose not to participate. These individuals received an electronic copy of the Informed Consent form and a link to the online survey. They were instructed to return the signed Informed Consent form to the staff member in the institution's research office prior to completing the online survey. Research participation was closed for all faculty members at the beginning of the Fall 2013 semester.

Results

The means and standard deviations for each behavioral belief and outcome evaluation rated by faculty members during the introduction of the internal peer review process are listed in Table 1 by peer review participation status. Each belief was multiplied by its outcome evaluation, and all products were summed to compute the indirect measure of attitude.

The means and standard deviations for each normative belief and motivation to comply rated by faculty members during the introduction of the internal peer review process are listed in Table 2 by peer review participation status. Each belief was multiplied by its motivation to comply, and all products were summed to compute the indirect measure of subjective norms.

The means and standard deviations for each control belief and power of control factor rated by faculty members during the introduction of the internal peer review process are listed in Table 3 by peer review participation status. Each belief was multiplied by its power of control factor, and all products were summed to compute the indirect measure of perceived behavioral control.

The means and standard deviations for the direct and indirect measures of attitudes, norms, perceived behavioral control, and intention are presented in Table 4. Group means by peer review completion status are also presented. All participants who submitted a course to the peer review successfully completed the process by the end of the 15-month data collection period.

The correlation coefficients among direct and composite indirect measures of attitudes, norms, control, intentions, and behavior are presented in Table 5. When examining the relationships between the direct assessments of attitudes, norms, and control and the indirect assessments of these constructs, respectively, only the two measures of attitude were highly correlated for this small sample. The direct measure of attitudes was positively correlated with the composite indirect measure of attitudes. The direct measure of norms was not highly correlated with the indirect measure of norms, and the direct measure of perceived behavioral control was not highly correlated with the indirect measure of control.

Additional analyses that were planned, including multiple regression, could not be computed due to the small sample size.

Discussion

Though differences between participants' and nonparticipants' belief endorsement could not be tested statistically due to unexpectedly small sample sizes, a qualitative examination of the endorsement of the modal belief statements provides some useful information about faculty members' perceptions of completing the peer review. Analyzing the data with a qualitative lens after quantitative analysis conforms with mixed-method approaches that point out the advantages of complementarity, in which the alternative method can enhance or clarify the results from the initial method used, leading to an improved understanding of the phenomenon under study (Greene, Caracelli, & Graham, 1989; Johnson & Onwuegbuzie, 2004; Molina-Azorin, 2012).

When measured directly, both participants in the internal peer review process and those who did not participate held relatively positive attitudes toward completing the peer review (see Table 4), an unexpected outcome given the reluctance and skepticism expressed by some faculty members when the process was introduced. Of course, these positive attitudes may not be representative of those held by all faculty members, given that those who held the most negative attitudes may have refused to participate in the peer review and in this research. But, if these negative attitudes remain for some, they were not pervasive enough to affect all faculty members.

For our sample, consistent with the direct measures of attitudes, the behavioral beliefs underlying participants' and nonparticipants' attitudes regarding the peer review process were positive (see Table 1). Both groups believed that completion of the peer review would allow them to improve their courses, learn new techniques, and gain a better understanding of quality. Both groups indicated moderately positive beliefs that completion of the peer review would be useful in their promotion and tenure packets and would help other faculty members improve their courses. Nonparticipants were more likely than participants in the process to believe that the peer review would be effortful and time consuming. Initial concerns regarding faculty not getting along and infringement on academic freedom were not highly endorsed by either group. Both groups agreed that these outcomes would be bad, but neither group believed that these outcomes were very likely. Neither group held strong beliefs that the peer review process would be confusing or require changes that they did not want to make to their courses.

Regarding norms, on the direct measure, both participants in the peer review process and those who did not participate held beliefs supportive of completing the peer review process (see Table 4). Examining this scale by item, both participants and nonparticipants thought that valued colleagues (participants M = 2.25, SD = 1.17; nonparticipants M = 2.00, SD = 1.10) and important people (participants M = 2.25, SD = 0.71; nonparticipants M = 1.50, SD = 1.00) would approve of them completing the peer review process. However, when asked whether most faculty members will complete the peer review process, both participants (M = 4.13, SD = 2.10) and nonparticipants (M = 4.50, SD = 1.05) failed to acknowledge this item as definitely true. This is not a surprising outcome given that peer review of online courses had just been introduced.

Consistent with direct measures of norms, the normative beliefs underlying participants' and nonparticipants' perceptions regarding the peer review process and their motivations to comply with normative referents were supportive of completing the peer review (see Table 2).

Both participants and nonparticipants believed that Department Heads, Online Coordinators, School Directors (i.e., Deans), the distance learning office personnel, and administrators in the Provost's office supported completion of the peer review process, and faculty members were motivated to comply with these referents. However, colleagues, those who teach online and those who do not, were less likely to be endorsed as sources of support for completion of peer review, and faculty were less motivated to comply with these referents. This is a paradoxical finding because colleagues who teach online are the peers who are performing the peer review of courses. Similarly, both participants and nonparticipants did not believe it was likely that students would think they should complete a peer review of an online course, though faculty members indicated that they did care what students thought they should do. Paradoxically, students are the direct beneficiaries of a course improved by a peer review, but faculty members did not believe that students thought they should complete one.

Regarding perceived behavioral control, on the direct measure, participants in the peer review process were less likely than nonparticipants to agree that completing the peer review was entirely up to them and that they had full control over it (see Table 4). It appears that the participants acknowledged that the peer reviewers would have some control over the process. Nonparticipants in the peer review indicated more control over (not) completing the process.

On the indirect perceived behavioral control items, both nonparticipants and participants in the peer review acknowledged that unanticipated demands on their time were frequent and would make it difficult to complete the peer review (see Table 3). But the two groups held divergent beliefs on several control-related items. Nonparticipants reported more frequent problems using the learning management system, more family obligations, more employment demands, and more feelings of being ill that would make it difficult to complete the peer review process than did participants. Both groups indicated that disagreements with colleagues were rare, but if they occurred, peer review participants thought these disagreements would make it more difficult to complete the peer review process than did nonparticipants. Those who did not participate in the peer review thought that having assistance from the Online Coordinator would make it easier to complete the process. Those who chose to participate in the peer review were already working with the Online Coordinator to start the process but reported less reliance on the Online Coordinator. Neither group reported that incentives to complete work were frequent, but both groups acknowledged that incentives would make completing the peer review process easier, particularly the nonparticipants, though completion of the peer review process was already incentivized.

Not surprisingly, participants in the peer review process indicated stronger intentions to complete the process than did nonparticipants. Participants also indicated less variability in their intentions than did nonparticipants, who responded less consistently regarding their intentions to complete the process (see Table 4).

Implications

Looking at our research results as a whole, many of the initial concerns and criticisms of the peer review process were not as highly endorsed as initially assumed.

An objective examination of faculty beliefs, instead of reliance on hearsay and a vocal minority, was useful in identifying genuine faculty concerns that can be directly addressed. Consistent with previous research, most notably Hartmann (2011), who used the Theory of Planned Behavior to design interventions geared to changing behaviors, our data suggest some initial directions to guide administrative changes in our process.

Based on this research, we are revising the delivery of our process in an attempt to increase participation in our internal peer review. Despite institutional recognition and monetary incentives, the majority of faculty members at our institution chose not to participate in the peer review. Apparently, institutional supports alone are not sufficient when introducing the peer review to faculty members who have experienced only administrative reviews of teaching. We are exploring additional ways to support faculty participation in peer reviews. For example, as indicated in this research, limited faculty time and the perceived difficulty of completing the process were concerns endorsed by faculty. Therefore, we are examining how we can link our required training to teach online with our peer review process to consolidate what faculty members perceive as two distinct processes with different goals. Though both the prerequisite training to develop a course so that it can be taught online and the peer review performed after the course has been taught at least once share the same goals of course development and revision, our faculty do not necessarily view the processes as related. By incorporating the peer review process into the conclusion of the required training, course development and revision in light of the QM standards would become a direct application of training. Linking the processes would allow faculty to utilize near transfer of learning from training and make it clear that course development is the goal, instead of merely completing trainings to gain the ability to teach online and then participating in a peer review if time permits. Consolidation of the faculty workload may create the perception of one process that is directly applicable to their primary responsibilities. Related to this issue, we also intend to reframe our QM training and course development activities to better emphasize their linkage to improved student learning. The finding that our faculty perceived that students would not desire their involvement in the peer review was concerning, given that the foundational element of the process is improving courses so that student learning improves. We think that understanding this linkage is critical to fostering faculty buy-in to the process.

To demonstrate to faculty that peer review is a valid use of their limited time and that their effort will produce visible results, we are planning to showcase peer-reviewed courses as model course exemplars for other faculty. It is our goal to create a teaching and learning community in which faculty members openly share course improvement ideas. If effective, this practice may increase the incentive to participate in peer reviews without increasing the cost of the process. We are currently hosting faculty brown bags to set the conditions and are drafting a plan to establish a new peer-to-peer mentoring program to support our peer review process.

Another revision to our internal procedures with the goal of increasing peer review participation is increasing the incentive to become a peer reviewer. Though we had several faculty member volunteers to review courses even before we included a small stipend, a small group of faculty members shouldered a heavy review workload. Expanding our pool of trained internal peer reviewers will increase the level of training of our faculty overall and better distribute the course review workload.

An alternative for alleviating our staffing limitations is to shift to external peer reviews once faculty members have a better understanding of, and buy-in to, the peer review process.

Future Research

Two additional research streams are suggested by this initial study. The first is expansion of the original project to include other institutions that are at a similar point in QM implementation (i.e., hosting voluntary, internal peer reviews) to increase the available participant pool. Initially, after we identified and assessed faculty members' beliefs, we planned to test the utility of attitudes, norms, and perceived behavioral control in predicting faculty intention to participate in the peer review and then to predict actual behavior from intention. However, at the close of our data collection period, when the researchers were no longer blind to the research participation status of our peer review participants, we realized that our sample size was too small to support such an examination. A power analysis confirmed this concern. Given the R² from the current dataset (i.e., R² = 0.32), a sample size of at least 34 participants would be needed for a test of the model with power = 0.90 and α = 0.05 (Cohen & Cohen, 1983); an illustrative version of this calculation appears at the end of this section. Though our total number of eligible faculty members was large enough to support such a test, we were not able to recruit enough participants for this entirely voluntary, un-incentivized research study. With a larger sample of faculty who are being introduced to the QM peer review at other institutions, a broader picture of the accommodations that are made to the process to gain faculty buy-in could be obtained. In addition, given that norms are group-specific expectations, groups of faculty members may hold different norms at other institutions. Conducting this study at a larger set of institutions would allow for more general statements regarding faculty beliefs and motivations to comply with expectations regarding the peer review of courses. Such research may also shed light on the direction in which norms and attitudes shift as faculty members embrace peer review as a method of continuous course improvement.

A second stream of research will be directed at improving the feedback provided during the course of internal peer reviews. A cursory review of comments provided to faculty course developers at the close of this initial set of internal reviews revealed substantial inconsistencies across reviewers. Given that we firmly believe that internal peer review is a tool that is helping our institution build a culture of continuous course improvement, promoting more rigorous standards for acceptable reviewer comments may have the potential to improve course quality more efficiently. To evaluate this prediction, this research team is planning to systematically examine the content of the comments provided by our peer reviewers to evaluate the extent to which the feedback provided to faculty course developers was consistent with the QM training that reviewers received (e.g., that reviewers referenced specific Rubric standards and provided evidence from the course). The results of this research will shed light on the nature of comments that peer reviewers make and suggest areas for revision of training and minimum content standards for comments. Follow-up research is planned to determine whether revisions of training and comment standards positively affect the content of reviewer comments and assist faculty course developers in improving their courses.
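For readers who wish to reproduce the kind of sample-size estimate cited above, the sketch below approximates it in Python with scipy rather than with the Cohen and Cohen (1983) tables the authors used. It assumes the three direct predictors of intention (attitude, subjective norm, and perceived behavioral control) and the common noncentrality convention λ = f² × n for the overall R² test, so the minimum n it reports may differ slightly from the tabled value of 34.

from scipy.stats import f as f_dist, ncf

def regression_power(n, k, r2, alpha=0.05):
    # Approximate power of the overall F test for a k-predictor multiple
    # regression with population R-squared r2 (noncentrality = f2 * n).
    f2 = r2 / (1 - r2)                       # Cohen's f-squared effect size
    df1, df2 = k, n - k - 1
    f_crit = f_dist.ppf(1 - alpha, df1, df2)
    return 1 - ncf.cdf(f_crit, df1, df2, f2 * n)

# Smallest sample size reaching 90% power for R-squared = 0.32, three predictors
n = 10
while regression_power(n, k=3, r2=0.32) < 0.90:
    n += 1
print(n)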


Conclusion

In this paper we have presented a preliminary research study using the Theory of Planned Behavior, a well-supported model of behavior prediction, to examine beliefs that underlie faculty participation in the peer review of online course design quality. While the results of this study were limited due to the small sample size, the qualitative interpretations presented lead to both refinements in our institutional processes and avenues for future research. Online course quality is an important goal, not only in our newly independent University with a rapidly growing online presence, but in all institutions of higher education with online programs. Peer review and the use of an established benchmark, like the QM Rubric, command scientific inquiry to improve their application. The findings of this study are an important first step in this ongoing line of inquiry.

References

Ajzen, I. (1985). From intentions to actions: A theory of planned behavior. In J. Kuhl & J. Beckman (Eds.), Action-control: From cognition to behavior (pp. 11-39). Heidelberg: Springer.

Ajzen, I. (1991). The theory of planned behavior. Organizational Behavior and Human Decision Processes, 50, 179-211.

Ajzen, I. (2011). The theory of planned behaviour: Reactions and reflections. Psychology & Health, 26(9), 1113-1127. doi: 10.1080/08870446.2011.613995

Ajzen, I. (2012). The theory of planned behavior. In P.A.M. Lange, A. W. Kruglanski, & E.T. Higgins (Eds.), Handbook of theories of social psychology, 1, 438-459. London: Sage.

Ajzen, I. (2013a). Icek Ajzen TPB Diagram. Retrieved November 25, 2013 from http://people.umass.edu/aizen/tpb.diag.html

Ajzen, I. (2013b). Sample TPB Questionnaire. Retrieved December 30, 2013 from http://people.umass.edu/aizen/pdf/tpb.questionnaire.pdf

Ajzen, I., & Fishbein, M. (1980). Understanding attitudes and predicting social behavior. Englewood Cliffs, NJ: Prentice-Hall.

Alshare, K., Kwum, O., & Grandon, E. (2006). Determinants of instructors' intentions to teach online courses: A cross-cultural perspective. The Journal of Computer Information Systems, 46(2), 87-95.

Armitage, C. J., & Conner, M. (2001). Efficacy of the theory of planned behaviour: A meta-analytic review. British Journal of Social Psychology, 40(4), 471-499.

Celuch, K., & Slama, M. (2002). Promoting critical thinking and life-learning: An experiment with the theory of planned behavior. Marketing Education Review, 12(2), 13-22.

Cohen, J., & Cohen, P. (1983). Applied multiple regression/correlation analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum.

Cooke, R., & French, D. P. (2008). How well do the theory of reasoned action and theory of planned behaviour predict intentions and attendance at screening programmes? A meta-analysis. Psychology & Health, 23(7), 745-765.

Finley, D.L. (2012). Using Quality Matters to improve all courses. Journal of Learning and Teaching with Technology, 1(2), 48-50.

Greene, J. C., Caracelli, V. J., & Graham, W. F. (1989). Toward a conceptual framework for mixed-method evaluation designs. Educational Evaluation and Policy Analysis, 11(3), 255-274.

Hartmann, A. (2011). Case study: Applying the theory of planned behavior as interventions to increase sponsored project proposal submission from liberal arts faculty. Journal of Research Administration, 42(1), 46-60.

Johnson, R. B., & Onwuegbuzie, A. J. (2004). Mixed methods research: A research paradigm whose time has come. Educational Researcher, 33(7), 14-26. doi: 10.3102/0013189X033007014

McEachan, R.R.C., Conner, M., Taylor, N., & Lawton, R.J. (2011). Prospective prediction of health-related behaviors with the theory of planned behavior: A meta-analysis. Health Psychology Review, 5, 97–144. doi: 10.1080/17437199.2010.521684

Molina-Azorin, J. F. (2012). Mixed methods research in strategic management: Impact and applications. Organizational Research Methods, 15(1), 33-56. doi: 10.1177/1094428110393023

Nosek, B.A., Graham, J., Lindner, N.M., Kesebir, S., Hawkins, C.B., Hahn, C., . . . Tenney, E.R. (2010). Cumulative and career-stage citation impact of social-personality psychology programs and their members. Personality and Social Psychology Bulletin, 36, 1283–1300. doi: 10.1177/0146167210378111

Quality Matters. (2013). Quality Matters Overview. Retrieved December 26, 2013 from https://www.qualitymatters.org/applying-rubric-15/download/QM_Overview_for%20Current%20Subscribers_AE2013.pdf


Participants Nonparticipants

Belief Evaluation Belief Evaluation

M SD M SD M SD M SD

Help me improve my course 2.00 0.93 1.13 0.35 1.50 1.22 1.00 0.00

Give me the opportunity to earn the incentive 2.63 2.20 1.63 0.92 1.83 1.33 2.17 1.47

Keep me from taking the required training 5.29 1.98 4.25 1.91 5.83 2.04 4.83 2.64

Allow me to learn some new techniques 2.57 1.27 1.25 0.46 1.83 1.60 1.17 0.41

Help me gain a better understanding of quality 2.00 0.93 1.50 1.07 2.00 2.00 1.33 0.82

Support my Promotion and Tenure packet 2.63 0.92 1.25 0.46 2.83 1.94 1.83 0.75

Allow me to help other faculty improve their courses 3.25 1.58 1.88 0.83 2.83 1.47 1.50 0.55

Be time consuming and effortful 4.88 1.36 2.88 1.55 6.00 1.55 2.67 1.37

Require changes I do not want to make 3.63 1.41 3.63 1.41 2.17 1.94 3.67 2.07

Require me to commit time that I do not have 4.63 0.74 4.13 1.25 5.17 1.72 4.83 2.04

Subject me to faculty members not getting along 2.88 1.46 5.13 1.36 2.33 1.97 6.00 1.27

Be an unfamiliar process and cause me to be confused 2.50 1.60 4.25 1.67 3.00 1.58 4.00 1.10

Be an infringement on my academic freedom 2.25 1.49 5.63 1.19 2.00 2.00 5.67 1.37

Table 1

Behavioral Beliefs and Outcome Evaluations (i.e., Indirect Assessment of Attitudes) Expressed by Faculty Members during the Introduction of the Internal Peer Review Process by Peer Review Participation


Participants Nonparticipants

Belief Motivation Belief Motivation

M SD M SD M SD M SD

My Department Head 2.00 1.07 1.57 0.79 2.67 1.97 1.67 0.52

My Online Coordinator 1.75 1.04 1.63 0.74 1.67 0.41 1.80 0.84

My colleagues who teach online 3.50 0.76 2.25 0.89 3.33 1.03 3.17 0.98

My colleagues who do not teach online 4.13 0.99 3.13 0.99 3.67 1.86 4.33 1.87

Administrators in the Provost’s Office 2.75 1.58 2.13 0.99 3.00 2.00 2.17 0.75

My School (College) Director 1.88 0.99 1.71 0.76 3.17 1.83 1.83 0.75

University Distance Learning Personnel 1.50 0.76 1.75 1.04 2.33 1.97 2.50 1.38

Students 4.13 1.73 1.63 0.74 4.33 2.50 1.83 0.98

Table 2

Normative Beliefs and Motivations to Comply (i.e., Indirect Assessment of Norms) Expressed by Faculty Members during the Introduction of the Internal Peer Review Process by Peer Review Participation


Participants Nonparticipants

Belief Power Belief Power

M SD M SD M SD M SD

If I encountered unanticipated events that placed demands on my time 5.50 0.76 4.63 1.51 5.83 0.98 5.17 1.47

If I had problems using Blackboard when teaching online 2.50 1.93 4.63 1.92 3.83 1.72 5.83 2.04

If I had family obligations that placed unanticipated demands on my time 3.63 1.77 4.29 1.25 5.17 1.17 5.67 1.37

If work or employment placed unanticipated demands on my time 4.75 1.67 4.88 0.99 6.00 1.10 5.33 1.51

If I felt ill, tired, or listless 2.50 1.69 4.50 1.41 4.00 1.67 4.83 1.94

If I had information or assistance from the Online Coordinator (easier) 3.36 1.69 3.75 1.67 2.33 0.82 2.00 2.00

If I had disagreements with my colleagues 1.63 0.74 4.14 1.68 1.50 0.55 2.17 1.94

If I had monetary or other incentives (easier) 5.50 1.41 3.13 1.89 5.67 1.21 2.00 1.10

Table 3

Control Beliefs and Power of Control Factors (i.e., Indirect Assessment of Perceived Behavioral Control) Expressed by Faculty Members during the Introduction of the Internal Peer Review Process by Peer Review Participation


Total Participants Nonparticipants

M SD M SD M SD

Direct Attitude 2.31 1.07 1.98 0.57 2.77 1.45

Indirect Attitude 138.50 51.28 133.57 45.16 145.40 63.79

Direct Norm 2.79 0.95 2.88 1.14 2.67 0.70

Indirect Norm 55.64 30.96 48.00 29.77 64.80 33.12

Direct Control 2.14 1.20 2.54 1.40 1.61 0.65

Indirect Control 137.39 32.38 125.71 29.82 151.00 32.19

Intention 2.11 1.69 1.38 0.63 3.08 2.21

Table 4

Means and Standard Deviations for Direct and Composite Indirect Measures of Attitudes, Norms, Perceived Behavioral Control, and Intention by Peer Review Participation

                       1      2      3      4      5      6      7      8
1. Direct Attitude     —
2. Indirect Attitude   0.66   —
3. Direct Norm         0.27   0.52   —
4. Indirect Norm       0.02   0.30   0.28   —
5. Direct Control      0.32   0.58   0.43   0.03   —
6. Indirect Control    0.18   0.36   0.23   0.18   0.24   —
7. Intention           0.48   0.28   0.16   0.07   0.01   0.60   —
8. Behavior            0.38   0.12   0.11   0.28   0.40   0.41   0.52   —

Table 5

Correlations among Direct and Composite Indirect Measures of Attitudes, Norms, Control, Intention, and Behavior


Figure 1. The Theory of Planned Behavior. The beliefs on the left of the figure are multiplied by their respective values and summed to create an indirect measure of attitudes, subjective norms, and perceived behavioral control, respectively. These beliefs and their value underlie one's attitude, subjective norm, and perceived behavioral control regarding a target behavior. One's attitude, subjective norm, and perceived behavioral control are also directly measured and predict one's intention to perform a target behavior. One's intention to perform a behavior is used to predict one's actual behavior. One's perceived behavioral control can be used to serve as a proxy for one's actual behavioral control and can be included with intention to predict behavior.

Reprinted from “TPB Diagram” by I. Ajzen, 2013a, http://people.umass.edu/aizen/faq.html. Copyright 2006 by Icek Ajzen.
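For readers who prefer equations to the diagram, the paths in Figure 1 are commonly summarized in the following generic form (standard Theory of Planned Behavior notation; the weights w are estimated empirically and are not coefficients from the present study):

\[
BI = w_1 A + w_2 SN + w_3 PBC, \qquad B \approx w_4 BI + w_5 PBC
\]

where A, SN, and PBC are the directly measured attitude, subjective norm, and perceived behavioral control, BI is behavioral intention, and B is the behavior. PBC appears in the second equation because, as the figure caption notes, it can serve as a proxy for actual behavioral control.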

Surveying Student Perspectives of Quality: Value of QM Rubric Items

Penny Ralston-Berg A

The Quality Matters (QM) Rubric is based on academic research. A national survey was conducted to compare QM Rubric item numerical rankings with student rankings of quality elements. Results of the survey are shared.

Keywords: online course, quality, student perspective, QM, Quality Matters, course design elements

Introduction and Background

This survey builds on previous work started in 2007 at University of Wisconsin Extension (Nath & Ralston-Berg, 2008; Ralston-Berg & Nath, 2008; Ralston-Berg & Nath, 2009). The Quality Matters (QM) program offers quality assurance through a research-based rubric for online course design. From an instructional design standpoint, questions arise about the student perspective. If the QM Rubric is based on academic research initiated most often by content experts or others in academia delivering online content, do online students – consumers of those courses – have a differing perspective on what makes a quality online course? Do students agree that items presented in the QM Rubric indicate quality? Are items in the QM Rubric perceived as contributors to student success?

Method

Data were collected through an online survey made available through a unique URL by a contact person at each participating institution. The URLs were delivered to students via email, a link posted on a CMS home page, or an online course announcement. Data from each institution were then compiled into a cumulative dataset. Three datasets were gathered from 2010 to 2011.

Participants

Participants were currently enrolled or had taken online, for-credit courses and were over 18 years of age. Information here describes cumulative results of all datasets for a total online sample of N = 3,160 students from 31 institutions in 22 states. Participants ranged in age from 18 to 65+, with the largest group being 26–44-year-olds. They represented 25 areas of study and a range of online experience from 1 to 9+ courses completed. Most participants were enrolled in cohort, for-credit online courses from a four-year institution. The majority were enrolled part time (two or more courses) or full time and reported being comfortable or very comfortable with technology.

Survey

The instrument contained QM items from the 2008–2010 Rubric converted to student-centered language, open-response questions about quality, and demographic information (Appendix A).

A Penn State World Campus

Participants were asked to consider only the online course environment when rating each online course feature in terms of how valuable they thought it to be. Students rated each course characteristic on a four-point Likert scale as to how each item contributes to student success (i.e., 0 = not at all important – does not contribute to my success; 1 = important; 2 = very important; and 3 = essential – could not succeed without it).
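As a small illustration of how such ratings can be summarized (hypothetical responses, not the study's dataset; the QM point values shown are taken from Table 1 below), each item's mean and standard deviation on the 0–3 scale can be computed and set alongside its QM-assigned points:

import statistics

# Hypothetical 0-3 importance ratings for two survey items
ratings = {
    "Clear instructions tell me how to get started": [3, 2, 3, 3, 2],
    "I am asked to introduce myself to the class": [1, 0, 2, 1, 0],
}
qm_points = {  # 2011-2013 QM-assigned points for the same items (see Table 1)
    "Clear instructions tell me how to get started": 3,
    "I am asked to introduce myself to the class": 1,
}

for item, values in ratings.items():
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    print(f"{item}: M = {mean:.2f}, SD = {sd:.2f}, QM points = {qm_points[item]}")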

Results

The cumulative mean and standard deviation were calculated for each survey item. This was then compared to the corresponding 2008–2010 and 2011–2013 QM Rubric item numbers and their QM-assigned point values. Participants found all survey items to be important. However, some survey items were rated differently when compared to the 2008–2010 and 2011–2013 QM-assigned point values for each item. The results for each of the eight categories of QM Rubric items are listed in Tables 1 through 9.

Conclusions

All QM items were ranked important, although some items were ranked differently than QM-assigned values. Some QM-ranked "3" items were participant-ranked less than 2. Some QM-ranked "1" items were participant-ranked more than 2.

The results of this independent research were also incorporated into other works:

The QM Student Bill of Rights, http://online.collin.edu/QM%20Bill%20of%20Rights%20for%20Online%20Learners%20with%20Preamble.pdf (Quality Matters, 2011), and accompanying video: http://www.youtube.com/watch?v=2mDbSvqBvR8

Joni Tornwall from the Ohio State University College of Nursing discussing QM Standards from the student perspective: http://youtu.be/pzFYs8-IxN0

References

Nath, L. E., & Ralston-Berg, P. (2008). Why Quality Matters matters: What students value. Paper presented at the American Sociological Association Annual Meeting, Boston, MA.

Ralston-Berg, P., & Nath, L. (2008). What makes a quality online course? The student perspective. Annual Conference on Teaching and Learning Proceedings. Madison, WI.

Ralston-Berg, P., & Nath, L. (2009). What makes a quality online course? The student perspective. Annual Conference on Teaching and Learning Proceedings. Madison, WI.


Survey Item   Survey Mean   Survey SD   2008–2010 QM Rubric Item # (Points)   2011–2013 QM Rubric Item # (Points)
Clear instructions tell me how to get started and how to find various course components.   2.66   0.60   1.1 (3)   1.1 (3)
A statement introduces me to the purpose of the course and its components.   2.04   0.83   1.2 (3)   1.2 (3)
Etiquette (or "netiquette") guidelines for how to behave online are clearly stated.   1.43   0.93   1.3 (1)   1.3 (2)
The instructor introduces her- or himself.   1.91   0.87   1.4 (1)   1.7 (1)
I am asked to introduce myself to the class.   1.00   0.96   1.5 (1)   1.8 (1)
Minimum preparation or prerequisite knowledge I need to succeed in the course is clearly stated.   2.08   0.82   1.6 (1)   1.5 (1)
Minimum technical skills expected of me are clearly stated.   1.99   0.87   1.7 (1)   1.6 (1)

Table 1. Course Overview and Introduction

Survey Item   Survey Mean   Survey SD   2008–2010 QM Rubric Item # (Points)   2011–2013 QM Rubric Item # (Points)
The course learning objectives describe outcomes that I am able to achieve.   1.84   0.88   2.1 (3)   2.1 (3)
The module/unit learning objectives describe outcomes that I am able to achieve and are consistent with the course-level objectives. (For example, upon completing this lesson you will be able to…)   1.80   0.89   2.2 (3)   2.2 (3)
All learning objectives are clearly stated and written from my perspective.   1.83   0.90   2.3 (3)   2.3 (3)
Instructions on how to meet the learning objectives are adequate and stated clearly.   2.30   0.77   2.4 (3)   2.4 (3)
The learning objectives (my expected learning) are appropriate for the level of the course.   2.18   0.77   2.5 (2)   2.5 (3)

Table 2. Learning Objectives


Table 3. Assessment and Measurement

Survey Item | Survey Mean | Survey SD | 2008–2010 QM Rubric Item # (Points) | 2011–2013 QM Rubric Item # (Points)
Instructional materials contribute to the achievement of the course and module/unit learning objectives. | 2.29 | 0.72 | 4.1 (3) | 4.1 (3)
The relationship between the instructional materials and the learning activities is clearly explained to me. | 2.17 | 0.79 | 4.2 (3) | 4.2 (3)
Instructional materials have sufficient breadth, depth, and currency for me to learn the subject. | 2.32 | 0.73 | 4.3 (2) | n/a
All resources and materials used in the course are appropriately cited. | 1.79 | 0.95 | 4.4 (1) | 4.3 (2)

Table 4. Resources and Materials


Table 5. Learner Engagement


Survey Item | Survey Mean | Survey SD | 2008–2010 QM Rubric Item # (Points) | 2011–2013 QM Rubric Item # (Points)
The tools and media used are appropriate for the content being delivered. | 2.17 | 0.77 | 6.1 (3) | n/a
The tools and media used support the achievement of the stated course and module/unit learning objectives. | 2.05 | 0.83 | 6.1 (3) | 6.1 (3)
Navigation throughout the online components of the course is logical, consistent, and efficient. | 2.51 | 0.67 | 6.3 (3) | 6.3 (3)
Technologies required for the course are readily available – provided or easily downloadable. | 2.62 | 0.64 | 6.4 (2) | 6.4 (2)
The course components are web-based or easily downloaded for use offline. | 2.47 | 0.74 | 6.5 (1) | n/a
Instructions on how to access resources online are sufficient and easy to understand. | 2.47 | 0.69 | 6.6 (1) | n/a
The course design takes full advantage of available tools and media. | 2.06 | 0.85 | 6.7 (1) | n/a

Table 6. Course Technology


Survey Item | Survey Mean | Survey SD | 2008–2010 QM Rubric Item # (Points) | 2011–2013 QM Rubric Item # (Points)
Course includes or links to a clear description of the technical support offered to me. | 2.05 | 0.83 | 7.1 (2) | 7.1 (3)
Course includes or links to a clear explanation of how the institution’s academic support system can assist me in effectively using the resources provided. | 1.83 | 0.87 | 7.2 (2) | 7.3 (2)
Course includes or links to a clear explanation of how the institution’s student support services can help me reach my educational goals. | 1.68 | 0.93 | 7.3 (1) | 7.4 (1)
Course includes or links to tutorials and resources that answer basic questions related to research, writing, technology, etc. | 1.75 | 0.92 | 7.4 (1) | n/a

Table 7. Learner Support

Survey Item | Survey Mean | Survey SD | 2008–2010 QM Rubric Item # (Points) | 2011–2013 QM Rubric Item # (Points)
Course is accessible to people with disabilities. | 1.74 | 1.13 | 8.1 (3) | n/a
Course includes equivalent alternatives to audio and visual content. | 1.65 | 1.06 | 8.2 (2) | 8.2 (2)
Course includes web links that are self-describing and meaningful. | 1.84 | 0.89 | 8.3 (2) | n/a
Course ensures screen readability. | 2.32 | 0.83 | 8.4 (1) | n/a
Table 8. Accessibility


All survey items were compiled into one list to determine overall rank or value to participants. This provides a resource for instructional designers and course developers.
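For course developers who want to build a similar ranked list from their own survey data, the minimal sketch below shows the ordering step that produces a Table 9-style list. The item texts and values are copied from Tables 1 and 9 purely as examples; the layout is an assumption, not the study's actual data structure.

    # Sketch: order survey items by participant mean to produce a Table 9-style ranking.
    # Each tuple is (survey item, QM rank, participant mean, SD); values are examples only.
    items = [
        ("I am asked to introduce myself to the class.", 1, 1.00, 0.96),
        ("Clear instructions tell me how to get started and how to find various course components.", 3, 2.66, 0.60),
        ("Course ensures screen readability.", 1, 2.32, 0.83),
    ]

    # Sort in descending order of mean rating; Python's sort is stable, so ties keep input order.
    ranked = sorted(items, key=lambda row: row[2], reverse=True)

    for item, qm_rank, m, sd in ranked:
        print(f"{item} | QM rank {qm_rank} | mean {m:.2f} | SD {sd:.2f}")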

Table 9. All Survey Items in Order of Participant Rank
Survey Item | QM Rank | Survey Rank (Mean) | SD
Clear instructions tell me how to get started and how to find various course components. | 3 | 2.66 | 0.60
Technologies required for the course are readily available – provided or easily downloadable. | 2 | 2.62 | 0.64
Criteria for how my work and participation will be evaluated are descriptive and specific. (For example, a grading rubric or list of expectations.) | 3 | 2.52 | 0.64
Navigation throughout the online components of the course is logical, consistent, and efficient. | 3 | 2.51 | 0.67
Assessments (quizzes, exams, papers, projects, etc.) are appropriately timed within the length of the course, varied, and appropriate to the content being assessed. | 2 | 2.49 | 0.65
The grading policy is stated clearly. | 3 | 2.49 | 0.65
Assessments (quizzes, exams, papers, projects, etc.) measure the stated learning objectives and are consistent with course activities and resources. | 3 | 2.48 | 0.66
Instructions on how to access resources online are sufficient and easy to understand. | 1 | 2.47 | 0.69
The course components are web-based or easily downloaded for use offline. | 1 | 2.47 | 0.74
Requirements for my interaction with the instructor, content, and other students are clearly explained. | 2 | 2.35 | 0.76
Instructional materials have sufficient breadth, depth, and currency for me to learn the subject. | 2 | 2.32 | 0.73
Course ensures screen readability. | 1 | 2.32 | 0.83
Instructions on how to meet the learning objectives are adequate and stated clearly. | 3 | 2.30 | 0.77
Instructional materials contribute to the achievement of the course and module/unit learning objectives. | 3 | 2.29 | 0.72
Clear standards are set for instructor response (turn-around time for email, grade posting, etc.). | 2 | 2.29 | 0.78
The learning objectives (my expected learning) are appropriate for the level of the course. | 2 | 2.18 | 0.77
The tools and media used are appropriate for the content being delivered. | 3 | 2.17 | 0.77
The relationship between the instructional materials and the learning activities is clearly explained to me. | 3 | 2.17 | 0.79
Clear standards are set for instructor availability (office hours, etc.). | 2 | 2.16 | 0.83
“Self-check” or practice assignments are provided, and I am provided with timely feedback. | 2 | 2.08 | 0.87
Minimum preparation or prerequisite knowledge I need to succeed in the course is clearly stated. | 1 | 2.08 | 0.82
The course design takes full advantage of available tools and media. | 1 | 2.06 | 0.85
Course includes or links to a clear description of the technical support offered to me. | 2 | 2.05 | 0.83
The tools and media used support the achievement of the stated course and module/unit learning objectives. | 3 | 2.05 | 0.83
A statement introduces me to the purpose of the course and its components. | 3 | 2.04 | 0.83
The learning activities promote the achievement of the stated course and module/unit learning objectives. | 3 | 2.01 | 0.78
Minimum technical skills expected of me are clearly stated. | 1 | 1.99 | 0.87
Learning activities encourage me to interact with content in the course. | 3 | 1.96 | 0.82
The instructor introduces her- or himself. | 1 | 1.91 | 0.87
The course learning objectives describe outcomes that I am able to achieve. | 3 | 1.84 | 0.88
Course includes web links that are self-describing and meaningful. | 2 | 1.84 | 0.89
All learning objectives are clearly stated and written from my perspective. | 3 | 1.83 | 0.90
Course includes or links to a clear explanation of how the institution’s academic support system can assist me in effectively using the resources provided. | 2 | 1.83 | 0.87
The module/unit learning objectives describe outcomes that I am able to achieve and are consistent with the course-level objectives. (For example, upon completing this lesson you will be able to…) | 3 | 1.80 | 0.89
All resources and materials used in the course are appropriately cited. | 1 | 1.79 | 0.95
Course includes or links to tutorials and resources that answer basic questions related to research, writing, technology, etc. | 1 | 1.75 | 0.92
Course is accessible to people with disabilities. | 3 | 1.74 | 1.13
Course includes or links to a clear explanation of how the institution’s student support services can help me reach my educational goals. | 1 | 1.68 | 0.93
Course includes equivalent alternatives to audio and visual content. | 2 | 1.65 | 1.06
Learning activities encourage me to interact with my instructor. | 3 | 1.53 | 0.94
Etiquette (or “netiquette”) guidelines for how to behave online are clearly stated. | 1 | 1.43 | 0.93
Learning activities encourage me to interact with other students. | 3 | 1.24 | 0.98
I am asked to introduce myself to the class. | 1 | 1.00 | 0.96


Survey Item | QM Rank | Participant Rank
Learning activities encourage me to interact with content in the course. | 3 | 1.96
The course learning objectives describe outcomes that I am able to achieve. | 3 | 1.84
All learning objectives are clearly stated and written from my perspective. | 3 | 1.83
The module/unit learning objectives describe outcomes that I am able to achieve and are consistent with the course-level objectives. (For example, upon completing this lesson you will be able to…) | 3 | 1.80
Course is accessible to people with disabilities. | 3 | 1.74
Learning activities encourage me to interact with my instructor. | 3 | 1.53
Learning activities encourage me to interact with other students. | 3 | 1.24

Table 10. QM “3” Items Ranked <2 by Participants

Survey Item | QM Rank | Participant Rank
Instructions on how to access resources online are sufficient and easy to understand. | 1 | 2.47
The course components are web-based or easily downloaded for use offline. | 1 | 2.47
Course ensures screen readability. | 1 | 2.32
Minimum preparation or prerequisite knowledge I need to succeed in the course is clearly stated. | 1 | 2.08
The course design takes full advantage of available tools and media. | 1 | 2.06
Table 11. QM “1” Items Ranked >2 by Participants
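Tables 10 and 11 can be read as simple filters over the combined list: QM “3” items whose participant mean fell below 2, and QM “1” items whose participant mean rose above 2. A minimal sketch of that selection, reusing the illustrative tuple layout from the earlier sketches (items and values copied from the tables as examples only):

    # Sketch: flag mismatches between QM-assigned point value and participant mean,
    # in the spirit of Tables 10 and 11. Tuples are (survey item, QM rank, participant mean).
    items = [
        ("Course is accessible to people with disabilities.", 3, 1.74),
        ("Course ensures screen readability.", 1, 2.32),
        ("The grading policy is stated clearly.", 3, 2.49),
    ]

    qm3_ranked_below_2 = [row for row in items if row[1] == 3 and row[2] < 2]  # Table 10 pattern
    qm1_ranked_above_2 = [row for row in items if row[1] == 1 and row[2] > 2]  # Table 11 pattern

    print(qm3_ranked_below_2)
    print(qm1_ranked_above_2)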
