IMAGE-BASED AUGMENTED REALITY:

REINFORCING LEARNING IN

MEDICAL SCHOOL EDUCATION

A Project

Presented to the

Faculty of

California State Polytechnic University, Pomona

In Partial Fulfillment

Of the Requirements for the Degree

Master of Arts

In

Education

By

Joseph S. Marilo

2017

SIGNATURE PAGE

PROJECT: IMAGE-BASED AUGMENTED REALITY: REINFORCING LEARNING IN MEDICAL SCHOOL EDUCATION

AUTHOR: Joseph S. Marilo

DATE SUBMITTED: Spring 2017

Education Department

Shahnaz Lotfipour, Ph.D. ______ Project Committee Chair, Professor of Education

Jerry Kellogg, MA ______ Senior Consultant, Creative Edge


ACKNOWLEDGEMENTS

I am forever appreciative of my parents for the sacrifices they made toward my scholastic pursuits. They did not have the same educational opportunities back in their respective countries, and I know that I would not be the person I am today without their love and support.

To my wife, Christy, for her tireless encouragement despite the challenges. At one point, we were both in school – she completed her Doctorate last year! We will be breathing a sigh of relief very soon...again!

This project would not have been possible without the assistance of my work colleagues. The team members in Instructional Technology and Distributed Learning provided technical expertise and guidance that kept the project on track. The generous support of the administration, post-doctoral teaching fellows, and staff of the

NMM/OMM department was invaluable, as was the support of the administration in the

College of Osteopathic Medicine.

To my classmates in my cohort who inspire and exemplify the selfless act of imparting knowledge. I’m grateful for your friendship and generosity.

And lastly, a very special thank you to my faculty advisor, Dr. Shahnaz Lotfipour.

I always left our project meetings buoyed by her encouragement, enthusiasm, and guidance.


ABSTRACT

Medical school education is a rigorous endeavor where students undergo hundreds of hours of lecture and laboratory instruction during their pre-clinical years alone (Smith, Peterson, Degenhardt, & Johnson, 2007). Additionally, it is not uncommon for the courses to be at times demanding, abstract, and conceptual (Smith et al., 2007).

Augmented reality (AR) is an up-and-coming technology that may aid the learner in comprehending abstract and difficult concepts.

Augmented reality is the process where computer-generated information is layered upon a real-world view, thus adding to – or augmenting – the information presented (Azuma, 1997; Caudell & Mizell, 1992). The technology surrounding augmented reality has its origins in World War II (Vaughan-Nichols, 2009). However,

AR has since diffused from the military to other sectors including manufacturing, marketing, and entertainment (Carmigniani et al., 2011; Vaughan-Nichols, 2009; Yuen,

Yaoyuneyong, & Johnson, 2011).

While augmented reality is not new, it is still considered emergent in the field of education (Bacca, Baldiris, Fabregat, Graf, & Kinshuk, 2014; Billinghurst & Duenser,

2012). Bacca et al. (2014) and Thornton et al. (2012) have suggested that the subjects of science, engineering, and mathematics are well suited for augmented reality. AR has the potential to clarify complex and abstract theories and relationships and to increase engagement and comprehension, all while placing the student in a safe environment

(Cuendet, Bonnard, Do-Lenh, & Dillenbourg, 2013).

The purpose of this research project was to create image-based augmented reality experiences for medical students enrolled in The Expanding Osteopathic Concept (EOC),

a five-day intensive cranial course taught to second-year osteopathic medical students.

Accompanying this course is The Expanding Osteopathic Concept Manual, a 170-page spiral-bound workbook. AR experiences were embedded in specific sections of this workbook as a means of clarifying concepts, all while reinforcing and engaging the medical student.

Three hundred forty-six second-year osteopathic medical students were invited to take part in this study via an email invitation. Of those, 82 participated in the study and completed the survey. The study involved students accessing the study website, viewing augmented reality-enhanced illustrations, and taking a short, anonymous survey. The survey consisted of twelve questions: ten multiple-choice and two open-ended.

In the analysis of the data, participants’ attitudes and perceptions of augmented reality were largely favorable. Perceived comprehension, retention, and engagement scored an average of 4.41, 4.32, and 4.8 out of 5, respectively. According to participant feedback, addressing the technical limitations of image stabilization and device compatibility is paramount to creating a successful user experience.

Continued research in this field will aid in augmented reality’s development and in the potential impact on student outcomes. Further studies are needed to measure AR’s effectiveness and viability beyond student perception and into measured, academic results.


TABLE OF CONTENTS

Signature Page ...... ii

Acknowledgements ...... iii

Abstract ...... iv

List of Tables ...... ix

List of Figures ...... x

Chapter One: Introduction ...... 1

Background of the Problem ...... 2

Statement of the Problem ...... 4

Purpose of the Project ...... 5

Assumptions ...... 6

Limitations ...... 6

Delimitations ...... 7

Definition of Terms...... 7

Chapter Two: Literature Review ...... 12

What is Augmented Reality? ...... 14

History of Augmented Reality ...... 16

How does Augmented Reality Work? ...... 18

Augmented Reality in Current Use ...... 19

Manufacturing ...... 20


Marketing and Commerce...... 20

Entertainment and Informational ...... 21

Medicine ...... 23

Education ...... 26

Augmented Reality and Education ...... 26

The Future of Augmented Reality ...... 28

Summary ...... 29

Chapter Three: Methodology ...... 31

Content Development ...... 32

Analysis Phase ...... 32

Design Phase ...... 33

Program Development ...... 35

Field Testing Procedure ...... 37

Implementation Phase ...... 37

Evaluation Phase ...... 38

Chapter Four: Summary, Conclusions, and Recommendations ...... 40

Summary ...... 40

Conclusions ...... 43

Recommendations ...... 55

References ...... 56


Appendix A: Approval of Study ...... 66

Appendix B: Institutional Review Board Approval...... 67

Appendix C: Institutional Review Board Approval...... 68

Appendix D: Invitation Flyer ...... 70

Appendix E: Study Questionnaire ...... 71

Appendix F: Question 7 ...... 74

Appendix G: Question 11 ...... 76

Appendix H: Question 12 ...... 80


LIST OF TABLES

Table 4.1 ...... 49

Table 4.2 ...... 50

Table 4.3 ...... 53

Table 4.4 ...... 54


LIST OF FIGURES

Figure 2.1 Milgram’s Virtuality Continuum...... 14

Figure 2.2 Heilig's Sensorama Device ...... 16

Figure 2.3 Sutherland’s Head-Mounted Display ...... 17

Figure 2.4 1st and 10 Graphic Overlay ...... 21

Figure 3.1 The ADDIE Model ...... 31

Figure 3.2 Two cranial bones rotating in opposite directions ...... 34

Figure 3.3 A left side bending rotation dysfunction ...... 34

Figure 3.4 Cranial bones with axis of rotation animated in Maya ...... 36

Figure 3.5 Cranial bones in Unity with accompanying targets ...... 37

Figure 4.1 Familiarity with AR ...... 43

Figure 4.2 Operating platform downloaded ...... 44

Figure 4.3 Type of mobile device used ...... 45

Figure 4.4 Comprehension of material ...... 46

Figure 4.5 Retention of information ...... 47

Figure 4.6 User engagement ...... 48

Figure 4.7 Difficulties encountered ...... 49

Figure 4.8 Usefulness of troubleshooting information ...... 50

Figure 4.9 Effectiveness rating ...... 51

Figure 4.10 AR desired in future coursework ...... 52

Figure 4.11 Word cloud representation of experiences ...... 53

Figure 4.12 Word cloud representation of suggestions ...... 54


CHAPTER ONE

INTRODUCTION

It is generally agreed that embarking on a medical school education is a rigorous endeavor (Smith et al., 2007). In their pre-clinical years alone, medical students undergo hundreds of hours of lecture and laboratory instruction. It is also not uncommon for medical school curricula to be abstract and conceptual at times (Smith et al., 2007).

Understanding concepts and theories that occur at a molecular level, or with degrees of movement so minuscule they defy the unaided eye – such as the motion of cranial bones – is just one of the daunting challenges confronting the medical student (Barsom, Graafland, & Schijven, 2016).

In a health sciences university located in Southern California, second-year osteopathic medical students undergo a comprehensive week-long course dealing with the human cranium. Entitled The Expanding Osteopathic Concept (EOC), this module takes an in-depth look into osteopathy and the cranial field. The intricacies of the human skull, its role in the overall function of the musculoskeletal system, the cranial bones’ inherent motions, and palpatory and treatment techniques are just some of the topics covered.

Mastering and excelling in academics is critical not only for medical students, but also for their future patients. Future health care providers require superior levels of proficiency and professionalism since patient wellbeing will ultimately be at stake

(Barsom et al., 2016; Herron, 2016). A burgeoning technology that may aid the learner in comprehending abstract and complex concepts is augmented reality (AR).


Background of the Problem

Azuma (1997), and Caudell and Mizell (1992) described augmented reality as the process where computer-generated information is layered upon a real-world view, thus adding to – or augmenting – the information presented. The technology surrounding augmented reality has its origins in World War II when the British military developed a means to project tactical data directly onto an aircraft’s windshield, within the pilot’s field of view, thus limiting distractions in the cockpit (Vaughan-Nichols, 2009). AR has since diffused from the military realm to other sectors such as manufacturing, marketing, and entertainment (Carmigniani et al., 2011; Vaughan-Nichols, 2009; Yuen et al., 2011).

In manufacturing, head-worn devices have been used during the assembly of aircraft wiring harnesses (Caudell & Mizell, 1992). Updates to schematics would be sent directly to workers’ headsets where they could immediately visualize those revisions.

Nowadays, some companies are also including AR in brochures, catalogues, and other forms of marketing and promotion (Carmigniani et al., 2011). Shopping for a sofa? One furniture manufacturer’s catalog incorporated AR: simply place the catalog on the floor of a proposed location, aim an AR-ready mobile device at it, and a sofa appears on the screen – as if that furniture were indeed physically present in that designated space.

In entertainment, one example of the regular use of augmented reality is in the broadcast of American football games (Pence, 2011). As teams assemble along the line of scrimmage, a “1st and 10” graphic is superimposed onto the playing field allowing television-viewers to see yardage and other information relevant to the current play. In addition to extending beyond the military and into the civilian realm, AR has also


evolved from bulky, wired contraptions into portable, wireless, and even fashionable devices (Perlin, 2016).

Researchers have suggested that, due to their design and functionality, portable mobile devices lend themselves well to augmented reality experiences (Kapp & Balkun,

2011). Growth in smartphone and tablet ownership has also aided the public’s awareness of augmented reality. In a 2015 Pew Research analysis, Anderson (2015) revealed that 68% of Americans own smartphones and 45% own tablets. Globally, the number of smartphone owners was forecast to reach 2.1 billion by 2020 (“Number of smartphone users worldwide 2014-2020,” n.d.), or just over a quarter of the world’s population (“2016 World Population Data Sheet,” n.d.). Consequently, these statistics position AR on an upward trajectory of growth and acceptance (Cheng & Tsai, 2013;

Nifakos, Tomson, & Zary, 2014; Zhu, Hadadgar, Masiello, & Zary, 2014).

While augmented reality is not new, it is still considered emergent in the field of education (Bacca et al., 2014; Billinghurst & Duenser, 2012). Bacca et al. (2014) and

Thornton et al. (2012) have suggested that the subjects of science, engineering, and mathematics are well suited for augmented reality. AR has the potential to clarify complex and abstract theories and relationships and to increase engagement and comprehension, all while placing the student in a safe environment (Cuendet et al., 2013).

Moreover, Akçayır, Akçayır, Pektaş, and Ocak (2016) investigated the effects of augmented reality on the attitudes of university students. They observed AR’s positive effects on both the students’ skills and their performance in laboratory exercises.

The scarcity of AR authoring tools, however, is but one of the challenges currently facing its use in education (Billinghurst & Duenser, 2012). Another is AR’s


intrinsic dependence on technology. In other words, the learning curve is steep for students and instructors alike. Critics question whether AR is better than other technologies in promoting academic success (Billinghurst & Duenser, 2012). Moreover, authors agree that further studies are needed to measure AR’s effectiveness and viability, especially for learners with special needs (Bacca et al., 2014).

Statement of the Problem

Human anatomy is foundational knowledge for the incoming medical student. One method of instruction used in teaching gross anatomy is experiential learning. This is typically accomplished in cadaveric laboratories where groups of students are assigned specimens for dissection. In fact, the gross anatomy laboratory is an important part of medical education programs worldwide (Bentley & Hill, 2009).

The teaching of human anatomy and the various organ systems is a very complicated, highly detailed, and conceptual affair. For instance, osteopathic medical students learn the intricacies and subtleties of cranial osteopathy as part of their training.

This training in turn leads to the use of hands-on manipulative techniques in order to diagnose and treat dysfunction (Bordoni & Zanier, 2015).

Augmented reality, according to Billinghurst and Duenser (2012), Thornton et al.

(2012), and Yoon and Wang (2014), allows learners to experience what would otherwise be too difficult or abstract. Wasko (2013) posited that AR helped users to better understand the world around them. Thornton, Ernst, and Clark (2012) and Di Serio,

Ibáñez, and Kloos (2013) reported increased student motivation and collaboration when integrating augmented reality technology into education. Additionally, improvements to engagement and retention have also been documented (Thomas, William, & Delieu,


2010; Yoon & Wang, 2014).

Proponents of this technology indicate that improvements in comprehension were due in part to AR’s ability to create situational learning experiences and by recruiting more of the senses. “The more senses that are involved (sound, sight, touch, emotions, etc.), the more powerful the learning experience is” (Pérez-López & Contero, 2013, p.

19). Critics, however, contend that AR technology alone cannot facilitate the learning process (Chiang, Yang, & Hwang, 2014), but rather should be complementary to it

(Billinghurst & Duenser, 2012). Incorporating augmented reality as a supplement to anatomy training, then, has the potential to clarify the inherent complexities of the subject matter, reinforce learning, and increase learner engagement.

Purpose of the Project

The purpose of this research project was to create image-based augmented reality experiences for medical students. The Expanding Osteopathic Concept (EOC) is a five-day intensive anatomy course taught to second-year osteopathic medical students at a

Southern California health sciences university. Accompanying this course is The

Expanding Osteopathic Concept Manual, a 170-page spiral-bound workbook. This manual covers such topics as osteopathy’s history in cranial therapy, cranial anatomy and landmarks, and a host of cranial treatment techniques. Using multimedia creation tools such as Autodesk Maya, Adobe Photoshop, Unity 3D, and Vuforia, augmented reality experiences were embedded in specific sections of this workbook. The AR experiences were intended to be a supplement to the course as well as a means of clarifying, reinforcing, and engaging the student in complex and abstract cranial concepts.


“EOC is a required course that exposes the student to the details, relationships, and physiologic properties of the cranial bones,” explains Jesus Sanchez, DO, the clinical coordinator, vice-chair and associate professor of the department of Neuromusculoskeletal Medicine/Osteopathic Manipulative Medicine (NMM/OMM) (J. Sanchez DO, personal communication, April 16, 2016). “Resources for this course have included workbooks, videos and study models.” Dr. Sanchez (personal communication, April 16,

2016) characterized the addition of augmented reality into his curriculum as another tool in the students’ armamentarium of resources.

Assumptions

The following assumptions were made during the development of this project:

• An individual’s ability to learn can be improved.

• Individuals are desirous of self-improvement.

• This project would positively reinforce learning and increase engagement.

• This project would be adopted into curricula and be used in future courses.

• Participants had equal opportunity and accessibility to the technology (smart

phones or tablets) to participate in this project.

Limitations

The following limitations affected the scope of this project:

• This project was limited to second-year osteopathic medical students enrolled

in the EOC course or to osteopathic medical students who have previously

taken the EOC course.

• This project was limited to the duration of the 5-day EOC course.


• This project was limited by the participants’ varying levels of technical

expertise and backgrounds.

• This project was limited by students’ accessibility to smart phones and tablets.

Delimitations

The author delimited the scope of this project as follows:

• The author chose image-based augmented reality over other forms of AR.

• The project specifically targeted Southern California-based osteopathic

medical students enrolled in the EOC course.

• The project involved the selection of specific authoring software.

• The project involved the specific use of smart phones and tablets running on

either Apple iOS or Android operating systems.

Definition of Terms

ADDIE

ADDIE, a framework for instructional systems development, is an acronym that

stands for the five phases of instructional design: (a) Analysis, (b) Design, (c)

Development, (d) Implementation, and (e) Evaluation (Molenda, 2003).

Adobe Photoshop

“A raster graphics editor developed and published by Adobe Systems for macOS

and Windows” (“Adobe Photoshop,” 2017).

Adobe Muse

“A website builder that allows designers to create fixed, fluid, and adaptive

websites without having to write any code” (“Adobe Muse,” 2017).


Android Studio

“The official integrated development environment (IDE) for the Android

platform” (“Android Studio,” 2017).

Augmented Reality (AR)

Augmented reality (AR) can be defined as “the overlapping of actual surrounding

environment with virtual images” (Ferrer-Torregrosa, Torralba, Jimenez, García,

& Barcia, 2015, p. 120). The major attributes of augmented reality are: (a) virtual

and real objects combined in an existing setting, (b) real-time interaction between

the user and the environment, and (c) real and virtual objects precisely situated in

the user’s environment (Azuma et al., 2001).

Autodesk Maya

“A 3D computer graphics software that… is used to create interactive 3D

applications, including video games, animated film, TV series, or visual effects”

(“Autodesk Maya,” 2017).

Doctor of Osteopathic Medicine (DO)

Doctors of Osteopathic Medicine (DOs) are “fully licensed physicians whose

practice covers all areas of medicine” (J. Sanchez, DO, personal communication,

April 16, 2016). “DOs receive special training in the musculoskeletal system”

(“What is a DO?,” n.d.).


The Expanding Osteopathic Concept (EOC)

The Expanding Osteopathic Concept is a week-long course taught to second-year

osteopathic medical students at a health sciences school in Southern California.

This course focuses on the “details, relationships, physiologic motions, and

treatment techniques of the cranial bones” (J. Sanchez, DO, personal

communication, April 16, 2016).

Head display unit

A head display unit is a monitoring device worn on the head that covers the user’s

field of view. Video from the unit is “combined with the graphic images created

by the scene generator, blending the real and virtual” (Azuma, 1997, p. 6).

Heads-up Display

“Also known as a HUD, is any transparent display that presents data without

requiring users to look away from their usual viewpoints” (“Head-up display,”

2017).

Image-based Augmented Reality

Image-based augmented reality is a form of augmented reality using markers “to

trigger the virtual images” (Carlson & Gagnon, 2016, p. 124).

Learner Control Principle

The Learner Control Principle is a multimedia design principle recommending

that instructional designers “allow learners to control pacing so they can proceed

at their own rate” (Clark & Mayer, 2011, p. 301).


Location-based Augmented Reality

Location-based augmented reality is a form of augmented reality that

“take into account the user’s real-world location, ensuring that contextually

relevant virtual data are provided to the user at geographically significant

locations” (Bower, Howe, McCredie, Robinson, & Grover, 2014, p. 2).

Multimedia Principle

The Multimedia Principle is a multimedia design principle that recommends

instructional designers “include words and graphics rather than words alone”

(Clark & Mayer, 2011, p. 70).

Osteopathic Manipulative Treatment (OMT)

Osteopathic Manipulative Treatment is the Doctor of Osteopathy’s “hands-on

technique to treat pain, promote healing, and increase overall mobility and

wellness in a patient” (J. Sanchez, DO, personal communication, April 16, 2016).

Spatial Contiguity Principle

The Spatial Contiguity Principle is a multimedia design principle that

recommends instructional designers ensure “corresponding graphics and printed

words be placed near each other on the screen (that is, contiguous in space)”

(Clark & Mayer, 2011, p. 93).

Temporal Contiguity Principle

The Temporal Contiguity Principle is a multimedia design principle that

recommends instructional designers make certain that “the narration describing

each step should be presented at the same time as the action shown on the screen”

(Clark & Mayer, 2011, p. 102).


Unity

Unity is a professional authoring program that allows developers to create

interactive applications (Squire, 2015).

Vuforia

“Augmented Reality Software Development Kit (SDK) for mobile devices that

enables the creation of Augmented Reality applications” (“Vuforia Augmented

Reality SDK,” 2016).

Xcode

“An integrated development environment for macOS containing a suite

of software development tools developed by Apple for developing software for

macOS, iOS, watchOS and tvOS” (“Xcode,” 2017).


CHAPTER TWO

LITERATURE REVIEW

Augmented reality (AR) is the process by which computer-generated information is layered upon a view of the real world, thus adding to – or augmenting – the information presented to the user (Azuma, 1997; Caudell & Mizell, 1992). AR has been around, in one form or another, since World War II, when the British military developed a means of projecting tactical information directly onto pilots’ windshields. This precursor to the heads-up display allowed aviators to identify the nature of neighboring and incoming aircraft: friend or enemy (Vaughan-Nichols, 2009).

Morton Heilig, a cinematographer in the 1950s, added to the notion of augmenting user experience with his multi-sensory movie-going experience he called

“Sensorama” (Carmigniani et al., 2011). His invention allowed audiences not only to watch movies, but to feel and smell them as well. Interactivity between the real and the virtual would not come into play until the mid-1970s with Myron Krueger’s Videoplace – a room where users could manipulate virtual objects (Carmigniani et al., 2011; Krueger,

Gionfriddo, & Hinrichsen, 1985). Decades later, scientists in the 1990s developed a head-worn device that assisted workers during aircraft assembly (Berryman, 2012).

Augmented reality has evolved from bulky, tethered contraptions into wireless, hand-held, and even fashionably wearable devices (Perlin, 2016). Due to the miniaturization, proliferation, and growing affordability of wireless mobile devices, the number of individuals owning smartphones and tablets is steadily increasing. According to a 2015 Pew Research analysis, 68% of Americans own smartphones and 45% own tablets (Anderson, 2015). Globally, the number of smartphone owners is forecast to


reach 2.1 billion by 2020 (“Number of smartphone users worldwide 2014-2020,” n.d.) or approximately 28% of the world’s population (“2016 World Population Data Sheet,” n.d.). Researchers have suggested that portable mobile devices, due to their design and functionality, lend themselves well to augmented reality experiences (Kapp & Balkun,

2011).

Consequently, these statistics position AR on a path for continued growth and acceptance (Cheng & Tsai, 2013; Nifakos et al., 2014; Zhu et al., 2014). In fact, augmented reality has since diffused from the military and engineering arenas into various other sectors. These include, but are not limited to, entertainment, marketing, manufacturing, medicine, and education (Carmigniani et al., 2011; Vaughan-Nichols,

2009; Yuen et al., 2011).

While the technology involving AR is not new, the literature considers it nascent, particularly in the field of education (Bacca et al., 2014; Billinghurst & Duenser, 2012).

Bacca et al. (2014) highlighted AR’s impact on education, suggesting benefits that included increased student motivation, collaboration, and retention, while Cuendet et al. (2013), Martin, Dikkers, Squire, and Gagnon (2014), along with other investigators, raised cautions and concerns involving best practices in the development and implementation of augmented reality in academia.

This review of the literature explores the definition and taxonomy of augmented reality, traces its early developments and milestones, and discusses functionality and technical requirements. In addition, this review examines augmented reality in various sectors including healthcare and education, considers the benefits and challenges associated with augmented reality in education, and concludes with what developers are


saying about the future of augmented reality.

What is Augmented Reality?

Augmented reality overlays computer-generated data over real-world imagery – either directly or indirectly (Bacca et al., 2014). Instead of replacing reality, AR enhances it. Augmented reality is considered a subset of virtual reality (Azuma et al.,

2001). Virtual reality (VR) is an experience whereby the user is completely immersed in an artificial, computer-generated environment. This is accomplished with goggles which are fully enclosed, completely obscuring the user’s view of the real world. AR, on the other hand, places computer-generated information atop the user’s view of the real world

(Azuma, 1997). With a growing number of researchers and developers entering this burgeoning field, the need to codify and classify this young technology has become apparent.

In their seminal paper, “A Taxonomy of Mixed Reality Visual Displays,”

Milgram and Kishino (1994) devised a classification system so that researchers and developers could use common, agreed-upon terminology and better understand each other’s contributions to the field. Their classification system, referred to as “Milgram’s Virtuality Continuum,” is a scale ranging from the completely real on one end to the completely virtual on the other (see Figure 2.1).

Figure 2.1. Milgram’s Virtuality Continuum (Milgram & Kishino, 1994).


Everything between these extremes is said to belong to the domain of “mixed reality” (Kapp & Balkun, 2011). Within this mixed reality space reside both augmented reality – where the virtual augments the real – and, conversely, augmented virtuality – where the real augments the virtual (Borrero & Márquez, 2012; Yuen et al., 2011).

Several years after Milgram and Kishino’s work, Azuma (1997), in his survey of augmented reality, proposed certain attributes that augmented reality must possess. For an experience to be considered augmented it must: “1) Combine real and virtual. 2) Be interactive in real time. 3) Be registered in three dimensions” (Azuma, 1997, sec. 1.2).

Augmented reality must incorporate elements of both the real and the virtual.

This differentiates it from virtual reality, where the whole experience is computer-generated. Furthermore, merely superimposing computer-generated data over the real-world environment does not represent an augmented experience, thus disqualifying the visual effects capabilities of film and television. Additionally, for the experience to fall within the definitions set forth by Azuma, real-time interactivity is necessary. That is, the user must be able to interact with the virtual overlay.

The final attribute, registered in three dimensions, describes the illusion that augmented reality affords: the appearance that the virtual information is anchored to the real world. As the user moves about in their space, the virtual overlay moves similarly in real-time. The notion that AR enhancements occur in real-time has also been brought forth by Carmigniani et al. (2011), and Di Serio, Ibáñez, and Kloos (2013). In short, the virtual object appears to have the width, height, and depth that makes it seem to coexist in the user’s space.


Some researchers consider AR and VR to be closely related, while others see them as very different concepts (Berryman, 2012; Lee, 2012a). Simply put, augmented reality is the amalgamation of our real-world view with computer-generated information. The purpose of augmented reality is to enhance the user’s experience (Pence, 2011).

History of Augmented Reality

The technology behind augmented reality was first introduced during World War

II when the British military developed a means that permitted pilots to view flight information projected onto their windshield. This early form of a Heads-Up Display

(HUD) permitted pilots to focus on critical information directly within their field of view, freeing them from distractions elsewhere in the cockpit. In the 1950s, cinematographer

Morton Heilig proposed a movie-going experience that engaged all of the audience’s senses

– not just those of sight and sound (see Figure 2.2).

Figure 2.2. Heilig's Sensorama Device (Heilig, 1961).


His “Sensorama” machine was an immersive, three-dimensional, multi-sensory experience. Berryman (2012) noted that Heilig’s accomplishments, having occurred prior to the digital age, made him a forerunner of mixed reality production. His invention, however, was not portable: approximately the size of a small refrigerator, it required a person to sit before it and place their face into a viewing area.

Portability, at least to some extent, would have to wait another decade with the advent of Ivan Sutherland’s proposal of a Head-Mounted Display (HMD) (Olshannikova, Ometov, Koucheryavy, & Olsson, 2015; Perlin, 2016). Unlike “Sensorama”, Sutherland’s prototype helmet-like apparatus was worn by the operator. In his paper “A head-mounted three-dimensional display,” Sutherland’s aim was to allow the wearer to see information that was not only layered atop their field of view but also registered to their vantage point. That is, the computer-generated imagery’s perspective would change based upon the perspective of the wearer (see Figure 2.3).

Figure 2.3. Sutherland’s Head-Mounted Display (Sutherland, 1968).


The phrase “augmented reality” did not come into existence until the 1990s. Scientists at Boeing Corporation, the aircraft manufacturer, coined the term during the development of a device that allowed workers to efficiently assemble aircraft wiring harnesses (Berryman, 2012; Carmigniani et al., 2011). While wearing these head-mounted display units, technicians received relevant technical schematics overlaid directly onto their visual field (Cheng & Tsai, 2013). Regardless of their form, augmented reality experiences rely on common mechanisms to operate.

How does Augmented Reality Work?

Augmented reality requires four essential components: a display, an input device, a tracking device, and a computer running and processing the AR software. The display serves as the means of imparting the visual information to the user. The AR experience can be presented visually in one of several ways: through head-worn, handheld, or projection technologies.

The input device serves as the interface between the user and the software. The most common input devices today are the touchscreens found on smartphones and tablets. Other forms of user input range from specialized gloves, which sense hand and finger motion, to devices that detect eye gaze.

Tracking refers to the system’s ability to determine the user’s position in space. Tracking mechanisms may take the form of optical, magnetic, or laser sensors, global positioning systems (GPS), accelerometers, or electronic compasses.

Finally, augmented reality relies on some method of computer processing. Laptops, desktops, and, more recently, smart devices such as tablets and phones are all suitable means of computing and running AR software (Carmigniani et al., 2011).


Various forms of augmented reality triggers are discussed in the literature: image-based, location-based, and, less frequently, object-based (Pérez-López & Contero, 2013; Cheng & Tsai, 2013). Image-based AR, sometimes referred to as marker-based, relies on visual prompts to trigger, or call up, the data. These fiducial markers act as points of reference and serve as guides, positioning the computer-generated material in proper orientation and alignment with the real world relative to the view screen.
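The registration that fiducial markers make possible can be illustrated with a toy pinhole-camera projection. The sketch below is purely conceptual: the pose values, camera intrinsics, and function name are hypothetical and are not drawn from any particular AR toolkit.

```python
# Conceptual sketch of image-based AR registration (hypothetical values).
# A marker detector supplies the marker's pose relative to the camera as a
# rotation matrix R and a translation vector t (in meters); a pinhole camera
# model then projects any virtual 3D point anchored to the marker onto the
# device's view screen.

def project_marker_point(point_marker, R, t, fx, fy, cx, cy):
    """Project a 3D point given in marker coordinates to pixel coordinates."""
    # Transform from marker space into camera space: X_cam = R * X_marker + t
    x = sum(R[0][i] * point_marker[i] for i in range(3)) + t[0]
    y = sum(R[1][i] * point_marker[i] for i in range(3)) + t[1]
    z = sum(R[2][i] * point_marker[i] for i in range(3)) + t[2]
    # Pinhole projection with focal lengths (fx, fy) and principal point (cx, cy)
    return (fx * x / z + cx, fy * y / z + cy)

# Identity rotation: the marker faces the camera half a meter away.
R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
t = [0.0, 0.0, 0.5]
# A virtual overlay point offset 10 cm along the marker's y-axis.
u, v = project_marker_point([0.0, 0.1, 0.0], R, t, fx=800, fy=800, cx=320, cy=240)
```

As the user moves the device, the detector updates R and t each frame, and re-projecting the overlay’s points keeps the virtual content aligned with the marker from the viewer’s new vantage point.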

Location-based AR, on the other hand, uses information provided by the mobile device’s internal global positioning system and compass. These systems determine the user’s location and orientation and, depending on the software, call up nearby objects and statistics based on those coordinates (Johnson, Smith, Willis, Levine, & Haywood, n.d.).

Some studies mention a third kind of AR: a marker-less solution (Bacca et al., 2014). Oftentimes referred to as object-based, this variety uses image recognition rather than markers to call up the necessary data. Critics suggest this is merely a variation of image-based augmented reality. Nonetheless, these various means of triggering and processing augmented reality are gaining awareness, exposure, and acceptance today (Berryman, 2012; Lee, 2012b).

Augmented Reality in Current Use

Continued development throughout the years brought technical breakthroughs that enabled AR to be introduced into areas outside of the military. It has grown in acceptance and use in a variety of sectors: manufacturing, marketing, entertainment, medicine, and education, to name just a few (Berryman, 2012; Billinghurst & Duenser, 2012; Carmigniani et al., 2011).


Manufacturing

Caudell and Mizell (1992) proposed a prototype head-worn display to assist in the manufacturing process of airliners. The challenge in producing a craft of this magnitude lies in its inherent complexity. Unlike an automobile, which may be manufactured on an automated assembly line, an aircraft requires more parts and a significant number of individuals to assemble them. Schematics and other necessary build information are regularly updated during the manufacturing process. A portion of the cost comes from efficiently getting updated information to the workers. Rather than having the workers go to the information, it comes to them through their head-worn displays (Caudell & Mizell, 1992).

Greenhalgh, Mullins, Grunnet-Jepsen, and Bhowmik (2016) presented the DAQRI smart helmet, which improved upon some of its predecessors’ capabilities, such as thermal detection and threat avoidance. The DAQRI helmet included a rear-facing camera, allowing users to visualize objects located behind them or in their blind spots (Greenhalgh et al., 2016).

Marketing and Commerce

Today’s companies are embedding AR information in brochures, catalogues, and other forms of marketing and promotion (Carmigniani et al., 2011). Uncertain how a new piece of furniture will appear in the family room? Place the manufacturer’s catalog in the proposed location. Hold an AR-enabled mobile device in the direction of the target – in this case, the catalog – and a virtual sofa appears on the view screen, positioned in alignment with the viewer’s perspective. The object can then be further manipulated in a variety of ways: repositioned, scaled, recolored, etc.


Virtual fitting rooms allow customers to try on apparel by merely standing in front of a specially configured monitor. For example, the Magic Mirror enables the consumer to consider styles and colors of clothing before stepping into the fitting room. The technology is not seen as a replacement for actually trying on clothes; rather, stores are hopeful that consumers will view it as a time-saving feature that leads to increased sales (Carmigniani et al., 2011).

Entertainment and Informational

Other examples of real-time augmented reality in use today are regularly seen by TV-viewing sports fans. American football broadcasts, for instance, use a “1st and 10” graphic superimposed onto the playing field (Pence, 2011). Computer-generated elements are transmitted to the television audience, depicting a team’s position and how much yardage is required for their next down (see Figure 2.4).

Figure 2.4. 1st and 10 Graphic Overlay (“Football’s augmented reality prepares us for an all-robot gridiron future,” 2016).


This is accomplished by overlaying graphical elements onto the image of the playing field. The graphics are tracked with the live video feed so precisely that they appear to belong in the environment. To the spectators in the stands, however, none of this augmentation is apparent. Automobile racing enthusiasts also witness AR in action when team member information is presented on their television screens. Vital statistics are displayed and tracked with the target vehicle (Lee, 2012b). As in the case of the football game, spectators in the bleachers witnessing the race live are unable to appreciate the enhanced information.

In addition to sports, informational AR experiences can be found in applications such as Google’s Sky Map. This app allows astronomy enthusiasts and the curious alike to observe the heavens in a novel way. Working in concert with the device’s global positioning system, compass, and accelerometer capabilities, the app allows the user to scan the sky in any direction – even below the horizon (Ouilhet, 2010). Users hold their mobile devices up to the night sky and computer-generated planets, stars, constellations, and other celestial objects appear and position themselves in sync with their movements.

To identify an object, the user simply taps on the screen and supplemental information is presented. Navigating with AR, however, is not limited to outer space.

Wayfinding applications being developed allow users not only a means of navigation, but of searching for contextual information based on their location (Kapp & Balkun, 2011). A smartphone or tablet’s GPS, compass, and accelerometer capabilities make this feature possible. Craig Kapp, Clinical Associate Professor of Computer Science at the Courant Institute of Mathematical Sciences at New York University, finds the notion of contextual information part and parcel of an AR experience:


It puts me in a space where I can coexist with the data; the data becomes part of my environment. Recently it has become a very popular technology because we can now view and experience AR on consumer-grade devices (Kapp & Balkun, 2011, p. 101).

Location-based augmented reality, as in the example of Google’s Sky Map, and image-based augmented reality, as in the case of the furniture company’s catalog, successfully take what would be a rather simple image and layer upon it a level of complexity, interactivity, and richness that, as the name implies, augments the user’s reality and perception. The augmented information engages users and allows them to experience what would otherwise be too complex or abstract (Billinghurst & Duenser, 2012; Thornton, Ernst, & Clark, 2012; Yoon & Wang, 2014). Wasko (2013) posits that AR helps users better understand the world around them.

Medicine

Augmented reality has been used in surgery and medicine for nearly two decades (Armstrong, Rankin, Giovinco, Mills, & Matsuoka, 2014). Azuma et al. (1997) proposed augmented reality use in clinical and surgical environments. Carmigniani et al. (2011) suggested that the management of patients’ medical records could also be accomplished through AR. For instance, physicians wearing AR-enabled headsets would be able to receive specific medical information layered directly upon their field of view during patient encounters. X-rays and other health information could be called up, thereby giving doctors a complete view of their patients without having to look away.


Surgical applications of AR would allow surgeons to see through the patient, beyond the point where their instruments enter the body (Monkman & Kushniruk, 2015). For example, Masutani et al. (1998) proposed the use of AR in intervascular surgery by combining and projecting 3D models of the patient’s vascular system with video x-ray fluoroscopy. This marker-based AR method could aid the surgeon in the often-difficult processes inherent in vascular surgery. Surgeons could navigate catheters while minimizing invasiveness (Masutani, Dohi, Yamane, Iseki, & Takakura, 1998).

Meanwhile, improvements in head-worn AR appliances led Armstrong et al. (2014) to propose their use as a surgical tool. Citing lower cost, a lighter form factor, and improved battery life, the study suggested that this novel heads-up display may improve patient communication, safety, and care, specifically in limb salvage surgery (Armstrong et al., 2014).

Augmented reality may also support robotic surgeries. Due to the nature of robotic surgery, a surgeon’s sense of touch is diminished or completely removed (Carmigniani et al., 2011). Haptic feedback, a computerized recreation of the sense of touch, may be rendered graphically during these robotic surgeries using augmented reality. For example, the level of force applied by the surgeon can be displayed as a color-coded graphic on the display. Insufficient, proper, or excessive forces would be presented and updated in real time (Carmigniani et al., 2011).

In their review of the use of AR in the evaluation and treatment of psychological disorders, Giglioli et al. (2015) noted that the number of applications and studies in this field were few. Although AR is an emerging technology, they did find early, positive results of AR in the treatment of certain specific phobias, such as the fear of insects, small animals, and heights. In fact, Botella et al.’s (2016) randomized controlled trial of in vivo versus augmented reality exposure revealed clinically significant outcomes with the use of augmented reality experiences. They caution, however, that further research be performed to ensure patient safety.

Augmented reality’s use in physical and developmental rehabilitation is noted in the works of Al-Issa, Regenbrecht, and Hale (2012), Lin and Chang (2015), and Requejo (2012). Kapp and Balkun (2011) and Lin and Chang (2015) have developed augmented reality programs to assist patients in both physical and developmental rehabilitation. Both cited AR’s cost-effectiveness as a viable alternative to more expensive solutions. For instance, with only a laptop and an inexpensive web camera, a prototype device was developed to assist patients with acquired visual disturbances (Kapp & Balkun, 2011). Additionally, this same team created software to aid children undergoing physical and cognitive therapies in long-term care post-surgery.

Along these same lines, children with developmental disabilities may find value in AR therapies. Strength exercises, for example, were developed and modified for children with special needs. Once again, using a web camera and customizable software, applications were developed that not only engaged the patients but also led to statistically significant results (Lin & Chang, 2015).

Although results appear promising, investigators have suggested that AR’s effectiveness is limited by the scarcity of patient-based studies, due in part to AR’s novelty in the field of rehabilitation (Al-Issa, Regenbrecht, & Hale, 2012; Requejo, 2012). Authors agree that patient safety is critical and that technical and privacy challenges require further investigation (Mitrasinovic et al., 2015; Monkman & Kushniruk, 2015).

Education

The literature has shown augmented reality technology in use from grades K-12 through higher education (Akçayır, Akçayır, Pektaş, & Ocak, 2016; Kapp & Balkun, 2011). Science, technology, engineering, and mathematics (STEM) subjects have been widely developed for AR applications. SimSnails, for instance, is an application that allows students to learn about natural selection concepts. AR has also been used in the field of language arts. Word Lens enables the learner to hold up their device to a foreign word or phrase and have it automatically translated into the language of their choosing (Kapp & Balkun, 2011).

Some educational applications include experiences that recreate historic or cultural events. This form of location-based AR uses location coordinates to trigger an event. Users are able to experience, situationally, events that may have occurred tens or hundreds of years in the past, right where they are standing (Carmigniani et al., 2011).

Additionally, museums have adopted AR to serve as virtual tour guides. Unlike the location-based example, this form of object-based augmented reality utilizes the museum’s pieces as triggers for an AR experience. While the technology involving AR is not new, it is considered emerging in the area of education (Yoon & Wang, 2014; Pérez-López & Contero, 2013).

Augmented Reality and Education

Studies have reported the subjects of science, engineering, and mathematics as primary AR candidates (Bacca et al., 2014; Thornton et al., 2012). Thornton et al. (2012) cited numerous benefits, including collaboration and increased engagement and retention (Thomas et al., 2010; Yoon & Wang, 2014). Akçayır, Akçayır, Pektaş, and Ocak (2016) investigated the effects of augmented reality on university students’ laboratory skills and their attitudes toward laboratory science. Augmented reality, they concluded, had a positive effect on both their skills and their performance in the laboratory. Teachers also observed that students could better understand concepts and experiments with AR. Proponents indicate that this is due in part to AR’s ability to create ideal learning experiences by involving more senses. “The more senses that are involved (sound, sight, touch, emotions, etc.), the more powerful the learning experience is” (Pérez-López & Contero, 2013, p. 19).

Providing learner safety is another benefit of AR in education and training. With augmented reality, learning environments that would otherwise be impossible or dangerous can be created (Cuendet et al., 2013). For instance, learners can practice the process of cleaning up hazardous spills or the course of action to take during a viral outbreak, both without risking exposure. However, critics contend that AR technology alone cannot facilitate the learning process (Bower, Howe, McCredie, Robinson, & Grover, 2014; Chiang, Yang, & Hwang, 2014).

Contemporary articles on AR recognize that the ubiquity of smartphones, high-bandwidth portable mobile devices, and the burgeoning area of wearable technology are fueling the growth of AR in education (Green, Hill, & Mcnair, 2014). Experiencing augmented reality is no longer limited to bulky, head-worn display units. As such, the proliferation of these wireless mobile devices (WMD) prompted the New Media Consortium’s 2011 Horizon Report to forecast AR’s adoption in education within two to three years’ time (Johnson et al., 2011).


Many challenges still face AR in the classroom, such as the scarcity of authoring tools (Billinghurst & Duenser, 2012). AR is a technology-dependent endeavor, and the learning curve is steep for students and instructors alike. Critics question whether AR is better than other technologies at promoting academic success (Billinghurst & Duenser, 2012). Moreover, authors agree that further studies are needed to measure AR’s effectiveness and viability, especially for learners with special needs (Bacca et al., 2014).

Kapp and Balkun (2011) raised concerns about too much emphasis on lower-order thinking and the passive experience created by simply embedding AR targets in textbooks. Interactivity, on the other hand, adds to the experience by allowing the user to manipulate and problem-solve (Kapp & Balkun, 2011). Carmigniani et al. (2011) refer to this as “tangible AR interfaces.” These tangibles can take the form of markers, paddles, or other small objects. Furthermore, Kapp and Balkun (2011) note that this interactivity facilitates comprehension. The power of AR lies in its ability to put data into context (Kapp & Balkun, 2011).

The Future of Augmented Reality

Although augmented reality has existed in one form or another since World War II, its role in education did not emerge until decades later (Bacca et al., 2014; Billinghurst & Duenser, 2012). Proponents favoring the use of augmented reality in education have demonstrated impactful results on student outcomes (Akçayır et al., 2016; Cuendet et al., 2013; Thomas et al., 2010; Yoon & Wang, 2014). Critics maintain that AR technology alone does not support the learning process. Rather, they contend that augmented reality should be complementary to it (Bower et al., 2014; Chiang et al., 2014).


The possibilities for future augmented reality applications are endless, due in part to AR’s infancy (Nifakos et al., 2014). Advances in technology may one day supplant the tools needed for current AR experiences. Today’s displays, input devices, tracking systems, and computers will inevitably become obsolete. Information projected directly onto the retina, for example, may be a technology of the not-too-distant future.

For instance, SixthSense, from the laboratories at the Massachusetts Institute of Technology, is wearable technology that – as the name suggests – augments a person’s other five senses. It accomplishes this via projectors, sensors, and the user’s hand gestures, which are used to interact with and manipulate the surrounding data (Carmigniani et al., 2011).

Babak Parviz, an affiliate professor at the University of Washington, and the creator of Google Glass, is leading the development of a bionic contact lens that allows data to be transmitted directly to the wearer’s vision. Citing privacy issues, Carmigniani et al. (2011) suggest that this method of information delivery is advantageous since only the wearer can view the information. AR glasses may perform similarly, but the contact lens’ form makes for a less obtrusive solution. Moreover, both glasses and contact lenses address privacy concerns more readily than do mobile devices (Carmigniani et al., 2011).

Summary

Although augmented reality has existed in one form or another since World War II, its role in education did not emerge until decades later. This late emergence is due in no small part to relatively recent technological advances such as processing speeds, miniaturization, and the proliferation of mobile devices. Smartphones and tablets have afforded billions worldwide the opportunity to experience AR.


This literature review discussed the definition and taxonomy of augmented reality, traced its early developments and milestones, and described its functionality and technical requirements. It also examined augmented reality in various sectors, including healthcare, and considered the benefits and challenges associated with augmented reality in education. Studies have demonstrated impactful and measurable results with the implementation of AR in education; however, technology alone cannot facilitate the learning process. Critics contend that augmented reality should be complementary to it.

AR-enabled contact lenses projecting data directly onto the user’s retina, for example, may now seem more like science fiction than science fact. If history is any indication, though, this technology should continue to advance and amaze.


CHAPTER THREE

METHODOLOGY

The purpose of this research project was to create and embed image-based augmented reality (AR) experiences into the coursework of second-year osteopathic medical students enrolled in The Expanding Osteopathic Concepts (EOC) cranial module. More precisely, AR experiences were embedded in specific sections of the students’ workbook which accompanied the cranial course. These experiences were intended not only to supplement the course, but also to elucidate concepts, reinforce learning, and increase student engagement.

The researcher utilized the ADDIE instructional design model as the framework for producing this project. ADDIE is an acronym for Analysis, Design, Development, Implementation, and Evaluation. The five phases of ADDIE form a systematic and iterative approach to instructional systems development (Molenda, 2003) (see Figure 3.1).

Figure 3.1. The ADDIE Model (Braunschweig, 2014).


Content Development

Analysis Phase

The analysis phase addresses the needs of the learners by determining educational objectives and goals (Cheung, 2016). The researcher conducted an informal needs assessment for the analysis phase of the project. Conversations with the curriculum coordinator and the chair and vice-chair of the department of neuromusculoskeletal medicine/osteopathic manipulative medicine at a medical university located in Southern California included analyses of the audience, technology, critical incidents, and extant data.

Audience Analysis. As this project involved the implementation of augmented reality experiences in The Expanding Osteopathic Concepts training manual, invitations to participate in this study were limited to osteopathic medical students who had previously taken the course. Also, as EOC is offered only to second-year medical students, the target audience was determined to be second-year medical students or older. Of the 346 students that made up the second-year class, 178 were male and 168 were female. Their ages ranged from 20 to 49 (Office of the Registrar, 2016).

Technology Analysis. During the informal needs assessment, it was tacitly agreed, based on students’ current use of technology in the classroom, that the clear majority of prospective participants currently possess, or have access to, mobile devices such as smartphones or tablets. They are also familiar with downloading, installing, and uninstalling applications on their devices, and have demonstrated basic computer literacy skills such as navigating the Internet, filling out and submitting online surveys, and communicating via e-mail.


Critical-incident Analysis. The decision as to which images in the cranial manual were to be augmented was made during the critical-incident analysis. This was accomplished with the assistance of a third-year pre-doctoral osteopathic teaching fellow, who also served as the study’s subject matter expert.

Extant-Data Analysis. In a meeting with the vice-chair and associate professor of the department of Neuromusculoskeletal Medicine/Osteopathic Manipulative Medicine, it was revealed that existing resources for this course included workbooks, videos, and plastic study models. However, there had been no prior use of augmented reality in the EOC coursework. As such, no pre-existing training materials incorporating AR were identified.

Design Phase

Once the analysis phase was completed, the researcher moved into the design phase of the study. The design phase involves the creation of a general plan to tackle the objectives identified during the analysis phase (Cheung, 2016). After a comprehensive review of the training manual, it was determined that a select number of images were well-suited for augmented reality. These were found in the section entitled Introduction to Cranial Somatic Dysfunction/Strain Patterns. Thirteen illustrations depicting axes of rotation and movements of cranial bones were selected as candidates due to their inherent abstraction and complexity. That is, the chosen illustrations attempted to depict 3-dimensional movements on a 2-dimensional surface; in this case, a sheet of paper (see Figures 3.2 and 3.3).


Figure 3.2. Two cranial bones rotating in opposite directions (NMM/OMM Department, 2014, p. 88).

Figure 3.3. A left side bending rotation dysfunction (NMM/OMM Department, 2014, p. 92).

Following along the lines of Billinghurst and Duenser (2012), Thornton et al. (2012), and Yoon and Wang (2014), who posited that augmented reality allows learners to experience what would otherwise be too difficult or abstract, the researcher theorized that AR would aid learner comprehension in this circumstance.

Additionally, Richard E. Mayer’s principles for designing multimedia instruction were followed: the Coherence principle – people learn better when unnecessary words, pictures, and sounds are omitted; the Multimedia principle – people learn better from words and pictures than from words alone; the Spatial Contiguity principle – people learn better when related words and pictures are presented near each other; and the Temporal Contiguity principle – people learn better when corresponding words and pictures are presented concurrently rather than successively (Clark & Mayer, 2011).

Using the image-based AR, whether directed by instructors or while studying on their own, students could simply aim their mobile devices at the AR markers, triggering, depending on the illustration, various augmented reality experiences. Some experiences would simply demonstrate the axes of rotation of the various bones, while others would add animation about those axes. All AR experiences would display on the students’ mobile device view screens. Students could further interact with the content by reorienting their position relative to the AR marker, allowing them to view it from various perspectives. It should be noted that the AR experiences work with both the hardcopy version of the manual and the electronic (PDF) version.

Program Development

Development Phase

The next phase, development, speaks to the production of the individual components that make up the plan established in the design phase (Cheung, 2016). Various pieces of software and hardware were used in the creation of this project. Hardware included an Apple MacBook Pro (Retina, 15-inch, Mid 2014), an Apple iPhone 5s 16GB, an Apple iPad Air first-generation 16GB, and an ASUS TF700T Transformer Pad Infinity 32GB tablet. Software used included Autodesk Maya 2017 Update 2, Unity 5.5.1f1 Personal Edition, the Vuforia 6.2 plug-in for Unity, Xcode 8.2.1, Android Studio 2.2.3, Adobe Photoshop CC, and Adobe Muse for web page authoring.


Three-dimensional models of the cranial bones were acquired using human specimens supplied by the department of Neuromusculoskeletal Medicine/Osteopathic Manipulative Medicine. These were in turn scanned and digitized with the assistance of the university’s 3D Visualization & Printing department. The models were then brought into Autodesk Maya, optimized, and textured. Other graphical elements, such as axes of rotation, were then created and added to the scene. Lastly, cranial motions were animated as necessary. For example, elements of flexion, extension, side bending, and/or rotation were added as appropriate to the specific AR experience (see Figure 3.4).

Figure 3.4. Cranial bones with axis of rotation animated in Maya.

Once the animations for each of the illustrations were approved, they were exported into the Unity application for further development. Unity, in conjunction with Vuforia, creates the augmented reality experience by associating target images with their respective three-dimensional models (see Figure 3.5).
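Conceptually, the association that Unity and Vuforia maintain can be thought of as a lookup from a recognized target image to the model and animation to render. The sketch below is a simplified, hypothetical illustration; in the actual project this mapping was configured visually in the Unity editor, and the target and animation names here are invented for the example.

```python
# Hypothetical sketch of a target-to-model association table in an AR engine.
# In the real workflow this mapping is built in the Unity editor with the
# Vuforia plug-in rather than written in code; names below are illustrative.

TARGET_MODELS = {
    "cranial_rotation_target": "cranial_bones_rotation_animation",
    "sidebending_rotation_target": "left_sbr_animation",
}

def models_for_frame(detected_targets):
    """Return the animations to render for the target images found this frame."""
    return [TARGET_MODELS[t] for t in detected_targets if t in TARGET_MODELS]
```

Each frame, the engine reports which target images are visible to the camera; only the models associated with those targets are drawn, which is why pointing the device at a different workbook illustration calls up a different animation.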


Figure 3.5. Cranial bones in Unity with accompanying targets.

After the models and their associated targets were developed, an accompanying website was created as the delivery method of the study. Adobe Photoshop was used to create the graphics, and Adobe Muse was utilized as the web authoring tool. The site was uploaded to the university-hosted URL: https://learn.westernu.edu/ARstudy. How the subjects participated in the study will be explored in the following section.

Field Testing Procedure

Implementation Phase

The execution, or delivery, of the study occurs during the next phase of ADDIE: implementation (Cheung, 2016). Specific steps had to be undertaken for the researcher to accomplish this. First, permission to conduct the study was obtained from the vice-dean of the university’s college of medicine (see Appendix A). Next, approvals from the Institutional Review Boards (IRB) of both the Southern California health sciences university (see Appendix B) and California State Polytechnic University, Pomona (see Appendix C) were secured. This approval allowed the researcher to begin the recruitment process.

Three hundred and forty-six second-year medical students were invited via e-mail sent by their college’s dean of students to participate in the study. Attached to the email was an invitation flyer explaining the background and purpose of the study, content and procedure instructions, commitment and confidentiality information, as well as the link to the study’s website (see Appendix D).

This link served as a launching point, directing participants to the study’s website and to the implied consent page, where students were given the choice to participate. Those who agreed to participate were directed to download the appropriate application, while those who did not agree were thanked for their time and exited the study. Those who mistakenly declined or who changed their minds had the opportunity to reenter the study.

Participants continued to a series of web pages containing four pairs of illustrations. The first illustration in each pair was presented traditionally, while the second utilized augmented reality. When prompted, participants held up their mobile devices to trigger the augmented reality experiences. The final page of the study’s website directed the students to a 12-question anonymous survey where feedback was collected (see Appendix E).

Evaluation Phase

The final phase in the ADDIE continuum, the evaluation phase, collects and analyzes the data from participant feedback to determine the effectiveness of the instruction (Cheung, 2016). This feedback drives the iterative processes of ADDIE in a recurring fashion as improvements and revisions are made with each subsequent cycle. In fact, although the ADDIE model is described in a linear fashion – that is, from analysis to evaluation – moving back and forth between phases or revising within any one phase is not uncommon (Cheung, 2016).

The questionnaire was developed using the university’s subscription to Qualtrics.com, a service that builds and hosts online surveys and questionnaires. Participants were asked about their familiarity with augmented reality as well as the kinds of devices and operating systems they used. Additionally, using a five-point Likert-type scale, participants rated their perceptions of AR’s impact on the clarity of, their retention of, and their engagement with the material. Two questions captured any technical difficulties encountered and their resolutions, if any.

The final questions centered around the subjects’ perceptions of augmented reality’s usefulness and effectiveness, and the participants’ overall likes, dislikes, and suggestions. Upon submission, participants were thanked for their effort, and the study concluded. The next chapter will present the results of the questionnaire, offer conclusions, and propose recommendations.


CHAPTER FOUR

SUMMARY, CONCLUSIONS, AND RECOMMENDATIONS

Summary

Generally speaking, a medical school education is a rigorous and challenging venture (Smith et al., 2007). In their first two years alone, medical students undergo hundreds of hours of lecture and laboratory instruction. Moreover, medical school curricula can be difficult, abstract, and conceptual at times (Smith et al., 2007). Among the intellectual challenges facing the medical student is the ability to grasp difficult concepts and theories (Barsom et al., 2016).

Mastering and excelling in academics is not only critical for the medical student, but also for their future patients. Ultimately, patient wellbeing is at stake. As such, student health care providers require superior levels of proficiency and professionalism to ensure positive patient outcomes (Barsom et al., 2016; Herron, 2016). An emerging technology that has the potential to assist the student in mastering and excelling in academics is augmented reality (AR).

Azuma (1997) and Caudell and Mizell (1992) described augmented reality as the process whereby computer-generated information is superimposed upon a real-world view, thus augmenting the information presented. The technology surrounding augmented reality has its origins in World War II (Vaughan-Nichols, 2009). Augmented reality has evolved from its early days of bulky, tethered contraptions into wireless, hand-held, and even fashionably wearable devices (Perlin, 2016).

According to a 2015 Pew Research analysis, 68% of Americans own smartphones and 45% own tablets (Anderson, 2015). Globally, the number of smartphone owners is forecast to reach 2.1 billion by 2020 (“Number of smartphone users worldwide 2014-2020,” n.d.), or approximately 28% of the world’s population (“2016 World Population Data Sheet,” n.d.). Due to their design and functionality, researchers have suggested that portable mobile devices lend themselves well to augmented reality experiences (Kapp & Balkun, 2011).

Augmented reality has since spread from the military and engineering realm into various other sectors. These include, but are not limited to, entertainment, marketing, manufacturing, medicine, and education (Carmigniani et al., 2011; Vaughan-Nichols, 2009; Yuen et al., 2011).

While the technology involving AR is not new, the literature considers it emerging in the field of education (Bacca et al., 2014; Billinghurst & Duenser, 2012).

The subjects of science, engineering, and mathematics have been reported as primary candidates for augmented reality (Bacca et al., 2014; Thornton et al., 2012). Thornton et al. (2012) cited numerous benefits of AR, including collaboration and increased engagement and retention (Thomas et al., 2010; Yoon & Wang, 2014).

Akçayır, Akçayır, Pektaş, and Ocak (2016) investigated the effects of augmented reality on university students’ laboratory skills and their attitudes toward laboratory science. Augmented reality, they concluded, had a positive effect on both the students’ skills and their attitudes. Teachers also observed that students could understand concepts and experiments better with AR. Proponents indicated that this was due in part to AR’s ability to create ideal learning experiences by involving more senses.

“The more senses that are involved (sound, sight, touch, emotions, etc.), the more powerful the learning experience is” (Pérez-López & Contero, 2013, p. 19).


Critics contend, however, that AR technology alone cannot facilitate the learning process (Bower et al., 2014; Chiang et al., 2014). Many challenges still face AR in the classroom, including the scarcity of authoring tools (Billinghurst & Duenser, 2012).

Additionally, AR is a technology-dependent endeavor. The learning curve is steep for students and instructors alike. Critics question whether AR is better than other technologies in promoting academic success (Billinghurst & Duenser, 2012). Moreover, authors agree that further studies are needed to measure AR’s effectiveness and viability especially for learners with special needs (Bacca et al., 2014).

The purpose of this research project was to create and embed image-based augmented reality experiences into the coursework of second-year osteopathic medical students enrolled in The Expanding Osteopathic Concepts (EOC) cranial module. AR experiences were associated with specific illustrations found in the workbook accompanying the cranial course. This was done to not only supplement the course, but also to elucidate concepts, reinforce learning, and increase student engagement.

Once permissions and approvals were obtained from the necessary departments and Institutional Review Boards (IRB) of both organizations, the researcher invited second-year osteopathic medical students to participate via e-mail. An attached recruitment flyer explained the background and purpose of the study, and supplied the link to the study’s website along with a short, anonymous questionnaire.

The study’s anonymous questionnaire followed Kirkpatrick’s four-level approach to evaluating training. Developed in the 1950s, Kirkpatrick’s model offers a sequential approach to rating the effectiveness of training. The four levels of training outcomes are: “reaction, learning, behavior, and results” (Bates, 2004, p. 341). The questionnaire was evaluated at ‘level one,’ which measures the participants’ emotional responses to the quality or the relevance of the training (Bates, 2004).

Three hundred forty-six second-year osteopathic medical students were invited to take part in this study. Of those students, 82 participated and completed the questionnaire. The survey consisted of twelve questions: ten multiple choice and two open-ended. An analysis of the responses follows in the next section.

Conclusions

Participants responded to the first question “Had you heard of ‘augmented reality’ before this study?” (see Figure 4.1). Twenty-seven participants (33%) responded yes, thirteen (16%) thought that they had heard of it, and forty-two (51%) responded that they had not. The data suggest that most respondents either were not aware of AR or were unsure whether they had heard of it before this study.

Figure 4.1. Familiarity with AR.
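The percentages reported throughout this chapter follow directly from the raw response counts. A minimal sketch of that conversion (the counts are those reported for Figure 4.1; the function name is illustrative, not part of the study’s materials):

```python
def response_percentages(counts):
    """Convert raw response counts to whole-number percentages."""
    total = sum(counts.values())
    return {option: round(100 * n / total) for option, n in counts.items()}

# Counts reported for "Had you heard of 'augmented reality' before this study?"
familiarity = {"Yes": 27, "Maybe": 13, "No": 42}
print(response_percentages(familiarity))  # {'Yes': 33, 'Maybe': 16, 'No': 51}
```

The same helper reproduces the percentages for the remaining multiple-choice questions from their respective counts.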


Participants responded to the next question “Which operating platform did you download to your device?” (see Figure 4.2). Fifty-eight (71%) responded with Apple iOS, twenty-four (29%) responded with Android, and no one indicated that they downloaded both operating systems. Although informative, these data do not offer statistically significant information. They do, however, reveal the distribution of device types and operating systems among participants during this study.

Figure 4.2. Operating platform downloaded.


Participants responded to the following question “Which type of mobile device did you use?” (see Figure 4.3). Seventy-seven respondents (95%) indicated that they used a smart phone, no one reported using just a tablet, and four individuals (5%) indicated that they used both a phone and a tablet. These data suggest that the clear majority of participants had access to a smartphone rather than a tablet.

Figure 4.3. Type of mobile device used.


The next three questions addressed participants’ perceptions of augmented reality’s impact on their comprehension and retention of, and engagement with, the material. Participants responded to the question “Do you feel augmented reality would help you better understand the concepts shown in the illustrations?” (see Figure 4.4). Forty-four (54%) of the participants responded definitely yes, twenty-nine (35%) indicated probably yes, seven (9%) felt that augmented reality might or might not have helped, and two (2%) responded with probably not. No one responded with definitely not. Each response was assigned a rating of one to five, five being the most favorable, ‘Definitely yes.’ This resulted in an average ‘Perceived Comprehension’ score of 4.41 (see Table 4.1).

Figure 4.4. Comprehension of material.


Participants then responded to the question “Do you feel augmented reality would help you better retain the information?” (see Figure 4.5). Thirty-six (44%) of the participants responded definitely yes, thirty-nine (48%) indicated probably yes, four (5%) felt that augmented reality might or might not have helped, two (2%) responded with probably not, and one (1%) responded with definitely not. Each response was assigned a rating of one to five, five being the most favorable, ‘Definitely yes.’ This resulted in an average ‘Perceived Retention’ score of 4.32 (see Table 4.1).

Figure 4.5. Retention of information.


Participants responded to the next question “Do you feel augmented reality would help to better engage your attention?” (see Figure 4.6). Forty-nine (60%) of the participants responded definitely yes, twenty-three (28%) indicated probably yes, seven (9%) felt that augmented reality might or might not have helped, two (2%) responded with probably not, and one (1%) responded with definitely not. Once again, each response was assigned a rating of one to five, five being the most favorable, ‘Definitely yes.’ This resulted in an average ‘Perceived Engagement’ score of 4.44 (see Table 4.1).

Figure 4.6. User engagement.


Table 4.1

Average Participant Perception Scores
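The perception scores in Table 4.1 follow the weighted-average scheme described above: each response is weighted by its assigned rating and divided by the number of respondents. A sketch of that computation (the function name is illustrative; small differences from the tabulated values reflect rounding of the reported percentages):

```python
def likert_average(counts):
    """Weighted mean of Likert responses; counts maps rating (1-5) to n."""
    total = sum(counts.values())
    return sum(rating * n for rating, n in counts.items()) / total

# Raw counts reported for the comprehension question (5 = 'Definitely yes')
comprehension = likert_average({5: 44, 4: 29, 3: 7, 2: 2, 1: 0})
print(round(comprehension, 2))  # 4.4
```

The retention and engagement scores are obtained the same way from their respective counts.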

Participants responded to the next question “Did you encounter any difficulties?” (see Figure 4.7). Two (2%) of the participants responded yes, fifty-nine (72%) indicated no, and twenty-one (26%) responded yes and went on to elaborate on those technical issues. A sample of those comments follows (see Table 4.2). For a complete list, please refer to Appendix F.

Figure 4.7. Difficulties encountered.


Table 4.2

Explanation of Technical Difficulties

Participants then responded to the question “Was the Troubleshooting information useful in answering your question/resolving your issues?” (see Figure 4.8). Eight participants (10%) responded yes, six (7%) indicated no, sixty-seven (82%) indicated that they did not use it, and one (1%) fell into the remaining category.

Figure 4.8. Usefulness of troubleshooting information.


Participants responded to the next question “On a scale of 1 to 10 (1 being not very and 10 being very), how would you rate the effectiveness of augmented reality in these examples?” (see Figure 4.9). Fifteen participants (18%) gave a rating of ten – ‘very effective;’ sixteen (20%) offered a rating of nine; twenty-three (28%) gave a rating of eight; sixteen (20%) gave a rating of seven; eight (10%) gave a rating of six; three (4%) offered a score of five; no participants scored the effectiveness a four, three, or two; and lastly, one participant (1%) rated the effectiveness of the augmented reality used in the study’s illustrations a one – ‘not very effective.’ Each response was assigned a rating of one to ten, ten being the most favorable. This resulted in an average ‘Perceived Effectiveness’ score of 8.05.

Figure 4.9. Effectiveness rating.


Participants responded to the next question “Would you like to see AR-enabled experiences in future coursework?” (see Figure 4.10). Sixty-eight respondents (83%) responded yes, fourteen (17%) were not sure, and none (0%) indicated that they did not want to see augmented reality used in future coursework. The data suggest that the clear majority of participants would like to see augmented reality developed into future coursework.

Figure 4.10. AR desired in future coursework.


Two open-ended questions rounded off the questionnaire. The first, “What did you like about this experience?” resulted in 63 responses. A sampling of those comments follows (see Table 4.3). A complete list of responses may be found in Appendix G.

Table 4.3

What Did You Like About This Experience?

A word cloud was then generated representing the top thirty recurring words found in the responses to this question (see Figure 4.11).

Figure 4.11. Word cloud representation of experiences.
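Word clouds like Figure 4.11 are driven by a simple word-frequency tally over the free-text responses. The generating tool used for the figures is not specified; the sketch below shows the underlying count (the stop-word list and sample responses are illustrative assumptions):

```python
import re
from collections import Counter

def top_words(responses, n=30):
    """Return the n most frequent words across free-text responses."""
    stop = {"the", "a", "an", "to", "of", "and", "it", "is", "was", "i"}
    words = re.findall(r"[a-z']+", " ".join(responses).lower())
    return Counter(w for w in words if w not in stop).most_common(n)

sample = ["Seeing the motion helped", "the motion was clear"]
print(top_words(sample, n=1))  # [('motion', 2)]
```

In a word cloud, each word is then rendered at a size proportional to its count.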


The second open-ended question, “What improvements or suggestions would you like to offer?” resulted in 60 responses (see Table 4.4). A complete list of responses may be found in Appendix H.

Table 4.4

What Improvements or Suggestions Would You Like to Offer?

Similarly, a word cloud was generated representing the top thirty recurring words found in the responses to that question (see Figure 4.12).

Figure 4.12. Word cloud representation of suggestions.


Recommendations

In the analysis of the data, participants’ attitudes toward and perceptions of augmented reality were largely favorable. Perceived comprehension, retention, and engagement averaged 4.41, 4.32, and 4.44 out of 5, respectively.

According to participant feedback, addressing the technical limitations of image stabilization and device compatibility is a high priority. Additionally, participants indicated that they would like more user control. The abilities to start and stop motion, to reveal or hide axes of rotation, to exaggerate the motions, and to reposition and scale the models are among the recurring suggestions. Overcoming these limitations and expanding user control are paramount in creating a successful augmented reality experience.
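The requested controls amount to a small piece of per-model view state that an AR application would expose to the learner. A hypothetical sketch of such a settings object (all names and defaults are assumptions for illustration, not part of the study’s application):

```python
from dataclasses import dataclass

@dataclass
class ARModelControls:
    """User-adjustable view state suggested by participant feedback."""
    motion_playing: bool = True   # start/stop the cranial motion
    show_axes: bool = False       # reveal or hide axes of rotation
    exaggeration: float = 1.0     # 1.0 = unexaggerated motion amplitude
    scale: float = 1.0            # model scale relative to the image target

    def toggle_motion(self):
        self.motion_playing = not self.motion_playing

controls = ARModelControls()
controls.toggle_motion()
print(controls.motion_playing)  # False
```

The rendering layer would read this state each frame, so every suggested control becomes a one-field change rather than a new code path.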

An additional recommendation would be to host the application through the appropriate app store: the Apple App Store or Google Play, for example. Doing so would streamline the downloading and installation of the app and overcome any trust-management hurdles that could lead to negative user experiences. Lastly, incorporating additional multimedia elements such as videos and photographs could enrich the experience and further strengthen the learners’ grasp of the material.

Introducing augmented reality as a supplement to anatomy training, then, has the potential to clarify the inherent complexities of the subject matter, reinforce learning, and increase learner engagement. Continued research in the field of augmented reality will benefit AR’s development and its potential impact on education. Further studies are needed to measure AR’s effectiveness and viability beyond student perception and into measured, academic results.


REFERENCES

2011 Horizon Report. (n.d.). Retrieved from https://library.educause.edu/resources/2011/2/2011-horizon-report

2016 World Population Data Sheet. (n.d.). Retrieved from http://www.prb.org/Publications/DataSheets/2016/2016-world-population-data-sheet.aspx

Adobe Muse. (2017, January 27). In Wikipedia. Retrieved from https://en.wikipedia.org/w/index.php?title=Adobe_Muse&oldid=762302401

Adobe Photoshop. (2017, April 4). In Wikipedia. Retrieved from https://en.wikipedia.org/w/index.php?title=Adobe_Photoshop&oldid=773768221

Akçayır, M., Akçayır, G., Pektaş, H. M., & Ocak, M. A. (2016). Augmented reality in science laboratories: The effects of augmented reality on university students’ laboratory skills and attitudes toward science laboratories. Computers in Human Behavior, 57, 334–342. doi:10.1016/j.chb.2015.12.054

Al-Issa, H., Regenbrecht, H., & Hale, L. (2012). Augmented reality applications in rehabilitation to improve physical outcomes. Physical Therapy Reviews, 17(1), 16–28. doi:10.1179/1743288X11Y.0000000051

Anderson, M. (2015, October 29). Technology Device Ownership: 2015. Retrieved from http://www.pewinternet.org/2015/10/29/technology-device-ownership-2015/

Android Studio. (2017, April 4). In Wikipedia. Retrieved from https://en.wikipedia.org/w/index.php?title=Android_Studio&oldid=773747745

Armstrong, D. G., Rankin, T. M., Giovinco, N. A., Mills, J. L., & Matsuoka, Y. (2014). A Heads-Up Display for Diabetic Limb Salvage Surgery. Journal of Diabetes Science and Technology, 8(5), 951–956.

Autodesk Maya. (2017, March 21). In Wikipedia. Retrieved from https://en.wikipedia.org/w/index.php?title=Autodesk_Maya&oldid=771354663

Azuma, R., Baillot, Y., Behringer, R., Feiner, S., Julier, S., & MacIntyre, B. (2001). Recent advances in augmented reality. IEEE Computer Graphics and Applications, 21(6), 34–47. doi:10.1109/38.963459

Azuma, R. T. (1997). A survey of augmented reality. Presence: Teleoperators & Virtual Environments, 6(4), 355.

Bacca, J., Baldiris, S., Fabregat, R., Graf, S., & Kinshuk. (2014). Augmented Reality Trends in Education: A Systematic Review of Research and Applications. Journal of Educational Technology & Society, 17(4), 133–149.

Barsom, E. Z., Graafland, M., & Schijven, M. P. (2016). Systematic review on the effectiveness of augmented reality applications in medical training. Surgical Endoscopy, 30(10), 4174–4183. doi:10.1007/s00464-016-4800-6

Bates, R. (2004). A critical analysis of evaluation practice: the Kirkpatrick model and the principle of beneficence. Evaluation and Program Planning, 27(3), 341–347. doi:10.1016/j.evalprogplan.2004.04.011

Bentley, B. S., & Hill, R. V. (2009). Objective and subjective assessment of reciprocal peer teaching in medical gross anatomy laboratory. Anatomical Sciences Education, 2(4), 143–149. doi:10.1002/ase.96

Berryman, D. R. (2012). Augmented Reality: A Review. Medical Reference Services Quarterly, 31(2), 212–218. doi:10.1080/02763869.2012.670604

Billinghurst, M., & Duenser, A. (2012). Augmented Reality in the Classroom. Computer, 45(7), 56–63. doi:10.1109/MC.2012.111

Bordoni, B., & Zanier, E. (2015, Spring). Sutherland’s Legacy in the New Millennium: The Osteopathic Cranial Model and Modern Osteopathy. Advances in Mind - Body Medicine, 29(2), 15–20.

Borrero, A. M., & Márquez, J. M. A. (2012). A Pilot Study of the Effectiveness of Augmented Reality to Enhance the Use of Remote Labs in Electrical Engineering Education. Journal of Science Education and Technology, 21(5), 540–557. doi:10.1007/s10956-011-9345-9

Botella, C., Perez-Ara, M. A., Breton-Lopez, J., Quero, S., Garcia-Palacios, A., & Banos, R. M. (2016). In Vivo Versus Augmented Reality Exposure in the Treatment of Small Animal Phobia: A Randomized Controlled Trial. Plos One, 11(2), 1–22.

Bower, M., Howe, C., McCredie, N., Robinson, A., & Grover, D. (2014). Augmented Reality in Education--Cases, Places and Potentials. Educational Media International, 51(1), 1–15. doi:10.1080/09523987.2014.889400

Braunschweig, D. (2014). English: ADDIE Instructional Design Model - All phases highlighted. Retrieved from https://commons.wikimedia.org/wiki/File:Addie.png

Carlson, K. J., & Gagnon, D. J. (2016). Augmented Reality Integrated Simulation Education in Health Care. Clinical Simulation in Nursing, 12(4), 123–127. doi:10.1016/j.ecns.2015.12.005

Carmigniani, J., Furht, B., Anisetti, M., Ceravolo, P., Damiani, E., & Ivkovic, M. (2011). Augmented reality technologies, systems and applications. Multimedia Tools and Applications, 51(1), 341–377. doi:10.1007/s11042-010-0660-6

Caudell, T. P., & Mizell, D. W. (1992). Augmented reality: an application of heads-up display technology to manual manufacturing processes. In Proceedings of the Twenty-Fifth Hawaii International Conference on System Sciences, 1992 (Vol. 2, pp. 659–669). doi:10.1109/HICSS.1992.183317

Cheng, K. H., & Tsai, C. C. (2013). Affordances of Augmented Reality in Science Learning: Suggestions for Future Research. Journal of Science Education & Technology, 22(4), 449–462. doi:10.1007/s10956-012-9405-9

Cheung, L. (2016). Using the ADDIE Model of Instructional Design to Teach Chest Radiograph Interpretation. Journal of Biomedical Education, 2016, e9502572. doi:10.1155/2016/9502572

Chiang, T. H. C., Yang, S. J. H., & Hwang, G. J. (2014). An Augmented Reality-based Mobile Learning System to Improve Students’ Learning Achievements and Motivations in Natural Science Inquiry Activities. Journal of Educational Technology & Society, 17(4), 352–365.

Chicchi Giglioli, I. A., Pallavicini, F., Pedroli, E., Serino, S., & Riva, G. (2015). Augmented Reality: A Brand New Challenge for the Assessment and Treatment of Psychological Disorders. Computational and Mathematical Methods in Medicine, 2015, 1–12.

Clark, R., & Mayer, R. (2011). E-learning and the Science of Instruction: Proven Guidelines for Consumers and Designers of Multimedia Learning. San Francisco, CA: Pfeiffer.

Cuendet, S., Bonnard, Q., Do-Lenh, S., & Dillenbourg, P. (2013). Designing augmented reality for the classroom. Computers & Education, 68, 557–569. doi:10.1016/j.compedu.2013.02.015

Di Serio, Á., Ibáñez, M. B., & Kloos, C. D. (2013). Impact of an augmented reality system on students’ motivation for a visual art course. Computers & Education, 68, 586–596. doi:10.1016/j.compedu.2012.03.002

Ferrer-Torregrosa, J., Torralba, J., Jimenez, M., García, S., & Barcia, J. (2015). ARBOOK: Development and Assessment of a Tool Based on Augmented Reality for Anatomy. Journal of Science Education and Technology, 24(1), 119–124. doi:10.1007/s10956-014-9526-4

Football’s augmented reality prepares us for an all-robot gridiron future. (2016, September 16). Retrieved April 29, 2017, from http://www.avclub.com/article/footballs-augmented-reality-prepares-us-all-robot--242711

Green, M., Hill, J., & Mcnair, C. (2014). Reality Check. Teacher Librarian, 41(5), 28–34.

Greenhalgh, P., Mullins, B., Grunnet-Jepsen, A., & Bhowmik, A. K. (2016). Invited Paper: Industrial Deployment of a Full-featured Head-mounted Augmented-reality System and the Incorporation of a 3D-sensing Platform. SID Symposium Digest of Technical Papers, 47(1), 448–451. doi:10.1002/sdtp.10704

Head-up display. (2017, May 19). In Wikipedia. Retrieved from https://en.wikipedia.org/w/index.php?title=Head-up_display&oldid=781151723

Heilig, M. (1961). Illustration of Morton Heilig’s Sensorama device, precursor to later virtual reality systems. Retrieved from https://commons.wikimedia.org/wiki/File:Sensorama_patent_fig5.png

Herron, J. (2016). Augmented Reality in Medical Education and Training. Journal of Electronic Resources in Medical Libraries, 13(2), 51–55. doi:10.1080/15424065.2016.1175987

Kapp, C., & Balkun, M. M. (2011). Teaching on the Virtuality Continuum: Augmented Reality in the Classroom. Transformations, 22(1), 100.

Krueger, M. W., Gionfriddo, T., & Hinrichsen, K. (1985). Videoplace—an Artificial Reality. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 35–40). New York, NY, USA: ACM. doi:10.1145/317456.317463

Lee, K. (2012a). Augmented Reality in Education and Training. TechTrends: Linking Research & Practice to Improve Learning, 56(2), 13–21. doi:10.1007/s11528-012-0559-3

Lee, K. (2012b). The Future of Learning and Training in Augmented Reality. InSight: A Journal of Scholarly Teaching, 7, 31–42.

Lin, C. Y., & Chang, Y. M. (2015). Interactive augmented reality using Scratch 2.0 to improve physical activities for children with developmental disabilities. Research in Developmental Disabilities, 37, 1–8. doi:10.1016/j.ridd.2014.10.016

Martin, J., Dikkers, S., Squire, K., & Gagnon, D. (2014). Participatory Scaling Through Augmented Reality Learning Through Local Games. TechTrends: Linking Research & Practice to Improve Learning, 58(1), 35–41. doi:10.1007/s11528-013-0718-1

Masutani, Y., Dohi, T., Yamane, F., Iseki, H., & Takakura, K. (1998). Augmented Reality Visualization System for Intravascular Neurosurgery. Computer Aided Surgery, 3(5), 239–247.

Milgram, P., & Kishino, F. (1994). A taxonomy of mixed reality visual displays. IEICE Transactions on Information and Systems, 77(12), 1321–1329.

Mitrasinovic, S., Camacho, E., Trivedi, N., Logan, J., Campbell, C., Zilinyi, R., … Connolly, E. S. (2015). Clinical and Surgical Applications of Smart Glasses. Technology and Health Care: Official Journal of the European Society for Engineering and Medicine, 23(4), 381.

Molenda, M. (2015). In Search of the Elusive ADDIE Model. Performance Improvement, 54(2), 40–42. doi:10.1002/pfi.21461

Monkman, H., & Kushniruk, A. W. (2015). A See Through Future: Augmented Reality and Health Information Systems. Studies in Health Technology and Informatics, 208, 281.

Nifakos, S., Tomson, T., & Zary, N. (2014). Combining Physical and Virtual Contexts Through Augmented Reality: Design and Evaluation of a Prototype Using a Drug Box as a Marker for Antibiotic Training. PeerJ, 2, e697.

NMC Horizon Report > 2016 Higher Education Edition. (n.d.). Retrieved from https://www.nmc.org/publication/nmc-horizon-report-2016-higher-education-edition/

NMM/OMM Department. (2014). The Expanding Osteopathic Concept. Pomona, CA: Western University of Health Sciences.

Number of smartphone users worldwide 2014-2020. (n.d.). Retrieved from https://www.statista.com/statistics/330695/number-of-smartphone-users-worldwide/

Office of the Registrar. (2016). Strategic Enrollment Management Report, College of Osteopathic Medicine of the Pacific, 2015/2016 AY (p. 52). Pomona, CA: Western University of Health Sciences.

Olshannikova, E., Ometov, A., Koucheryavy, Y., & Olsson, T. (2015). Visualizing Big Data with augmented and virtual reality: challenges and research agenda. Journal of Big Data, 2(1), 22. doi:10.1186/s40537-015-0031-2

Ouilhet, H. (2010). Google Sky Map: Using Your Phone As an Interface. In Proceedings of the 12th International Conference on Human Computer Interaction with Mobile Devices and Services (pp. 419–422). New York, NY, USA: ACM. doi:10.1145/1851600.1851695

Parviz, B. A. (2009). For your eye only. IEEE Spectrum, 46(9), 36–41. doi:10.1109/MSPEC.2009.5210042

Pence, H. E. (2010). Smartphones, Smart Objects, and Augmented Reality. The Reference Librarian, 52(1–2), 136–145. doi:10.1080/02763877.2011.528281

Pérez-López, D., & Contero, M. (2013). Delivering Educational Multimedia Contents Through an Augmented Reality Application: A Case Study on Its Impact on Knowledge Acquisition and Retention. Turkish Online Journal of Educational Technology, 12(4), 19–28.

Perlin, K. (2016). Future Reality: How Emerging Technologies Will Change Language Itself. IEEE Computer Graphics and Applications, 36(3), 84–89. doi:10.1109/MCG.2016.56

Requejo, P. (2012). The potential of augmented reality applications for physical rehabilitation. Physical Therapy Reviews, 17(5), 350–351. doi:10.1179/1743288X12Y.0000000022

Smith, C. K., Peterson, D. F., Degenhardt, B. F., & Johnson, J. C. (2007). Depression, anxiety, and perceived hassles among entering medical students. Psychology, Health & Medicine, 12(1), 31–39. doi:10.1080/13548500500429387

Squire, K. (2015). Creating the Future of Games and Learning. Independent School, 74(2). Retrieved from http://proxy.library.cpp.edu/login?url=http://search.ebscohost.com/login.aspx?direct=true&AuthType=ip,uid&db=eric&AN=EJ1062732&site=ehost-live&scope=site

Sutherland, I. E. (1968). A Head-mounted Three Dimensional Display. In Proceedings of the December 9-11, 1968, Fall Joint Computer Conference, Part I, 757–764. doi:10.1145/1476589.1476686

Thomas, R. G., William John, N., & Delieu, J. M. (2010). Augmented Reality for Anatomical Education. Journal of Visual Communication in Medicine, 33(1), 6–15. doi:10.3109/17453050903557359

Thornton, T., Ernst, J. V., & Clark, A. C. (2012). Augmented Reality as a Visual and Spatial Learning Tool in Technology Education. Technology & Engineering Teacher, 71(8), 18–21.

Tunanidas, A. G., & Burkhart, D. N. (2005). American Osteopathic Association Commitment to Quality and Lifelong Learning. Journal of Continuing Education in the Health Professions, 25(3), 197–202.

Vaughan-Nichols, S. J. (2009). Augmented Reality: No Longer a Novelty? Computer, 42(12), 19–22. doi:10.1109/MC.2009.380

Vuforia Augmented Reality SDK. (2016, December 16). In Wikipedia. Retrieved from https://en.wikipedia.org/w/index.php?title=Vuforia_Augmented_Reality_SDK&oldid=755140723

Wasko, C. (2013). What Teachers Need to Know About Augmented Reality Enhanced Learning Environments. TechTrends: Linking Research & Practice to Improve Learning, 57(4), 17–21. doi:10.1007/s11528-013-0672-y

What is a DO? (n.d.). Retrieved May 27, 2017, from https://www.osteopathic.org/OSTEOPATHIC-HEALTH/about-dos/what-is-a-do/Pages/default.aspx

Yoon, S. A., & Wang, J. (2014). Making the Invisible Visible in Science Museums Through Augmented Reality Devices. TechTrends: Linking Research & Practice to Improve Learning, 58(1), 49–55. doi:10.1007/s11528-013-0720-7

Yuen, S., Yaoyuneyong, G., & Johnson, E. (2011). Augmented Reality: An Overview and Five Directions for AR in Education. Journal of Educational Technology Development & Exchange, 4(1), 119–140.

Zhu, E., Hadadgar, A., Masiello, I., & Zary, N. (2014). Augmented reality in healthcare education: an integrative review. PeerJ, 2, e469. doi:10.7717/peerj.469


APPENDIX A

APPROVAL OF STUDY


APPENDIX B

INSTITUTIONAL REVIEW BOARD APPROVAL


APPENDIX C

INSTITUTIONAL REVIEW BOARD APPROVAL

Memorandum California State Polytechnic University, Pomona Institutional Review Board -- Office of Research Compliance Federalwide Assurance 00001759 -- IRB principles: respect for persons, beneficence, and justice

Date: Apr 19, 2017 PI Name: Joseph Marilo; Department/College: other CEIS, Educational Multimedia Co-PI(s): Shahnaz Lotfipour

IRB protocol number: IRB-17-58 Protocol Title: IMAGE-BASED AUGMENTED REALITY: REINFORCING LEARNING IN MEDICAL SCHOOL EDUCATION Protocol Submission Type: Initial; Review Board Type: review by the CPP IRB office

Decision Date: Apr 19, 2017 Decision: Exempt

Dear Investigator(s),

The protocol as described above has been reviewed by the Cal Poly Pomona Institutional Review Board (IRB) by the exempt review method. It was found to be in compliance with both applicable federal and state regulations and Cal Poly Pomona policies regarding the protection of human subjects used in research. Thus, the Cal Poly Pomona IRB grants you approval to conduct the research. On its behalf, I thank you for your adherence to established policies meant to ensure the safety and privacy of your study participants. You may wish to keep a copy of this memo with you while conducting your research project.

You may initiate the project as of Apr 19, 2017.

The reason for approving by exempt review is as follows: Exempt-Category 2

It would be appreciated that you advise the IRB upon the completion of your study involving interaction with human subjects. Please use the closure form in the Cayuse system.

Approval is conditional upon your willingness to carry out your responsibilities as the investigators under University policy. Your research project must be conducted according to the methods described in the final approved protocol. Should there be any changes to your research plan as described, please advise the IRB, because you may be required to

submit an amendment (with re-certification). Additionally, should you as the investigator or any of your subjects experience any “problems which involve an undescribed element of risk” (adverse events in regulatory terms), please immediately inform the IRB of the circumstances. There are forms for both modifications and adverse events in the Cayuse system.

If you need further assistance, you are encouraged to contact the IRB. The Board wishes you success in your future research endeavors.

Sincerely,

Bonny Burns-Whitmore, MPH DrPH RD
Chair, Institutional Review Board
Professor, Human Nutrition and Food Science
Huntley College of Agriculture

This message has been automatically generated by the Cayuse system installed at Cal Poly Pomona by Evisions. Please contact the IRB office ([email protected] or 909.869.4215 or .3713) if you have questions or you believe you have received this message in error. Thanks for your compliance with the regulations while conducting human subjects research.

APPENDIX D

INVITATION FLYER

APPENDIX E

STUDY QUESTIONNAIRE

APPENDIX F

QUESTION 7

Did you encounter any difficulties? If yes, please explain.

1. The 3rd and 4th sections did not appear to have any motion.

2. App constantly crashed after the "teapot" testing VR screen. I was not able to see the SBS flexion/extension or any of the strain patterns.

3. Could be better if motions were more gross.

4. Hard to appreciate movements unless completely still.

5. Having long arms makes watching a small screen difficult. Plus, there was a lot of shaking.

6. I didn't see the motions for the strain patterns.

7. I understand this is beta. Though by just holding the device the image shook too much and at times it was difficult to assess the subtleties.

8. I would really like to be able to rotate the bones around the axis.

9. Keeps freezing and is shaky.

10. Platform is too sensitive to motion leading to an unstable image.

11. Security settings on my phone for downloading the app.

12. Seeing the movement was a little difficult.

13. Shaky image.

14. Some of the motions were too slight to notice.

15. Some of the strain patterns were not correct, but I was told you were already aware of that. Some of the images are a little too shaky, despite me stabilizing the phone.

16. Sometimes it glitched in switching between the different strains.

17. Strain patterns weren’t right, the image needed to face me differently in some

18. The animation was jittery at times.

19. Too shaky... could be made more smooth... maybe exaggerate the movements of the strains also.

20. Trust device issues. Was unable to follow directions on how to trust my device based on the information provided.

21. Very shaky.

APPENDIX G

QUESTION 11

What did you like about this experience?

1. Appreciated learning 3D concepts with AR as opposed to looking at 2D images on paper.

2. The 3D is incredibly valuable for this type of mechanical understanding; would be particularly helpful for cranial bone movement, since some of the diagrams are difficult to understand even though the concepts aren't.

3. It was cool to maneuver around the bones.

4. Being able to explore the augmented reality from various perspectives as I moved the phone around.

5. Being able to visualize the cranial movements in 3D.

6. Very easy to use and super easy to understand. Great way to visualize something that is hard to understand.

7. I was able to move around to get different views.

8. I like the ability to visualize it and it clicks more this way

9. I am a visual kinesthetic learner so I liked the general experience overall.

10. It was fun.

11. Very cool perspective for those who are visual-spatial learners.

12. It shows you exact examples of strains that you only could imagine the movement. It was my first time visualizing some of these strains.

13. It was cool to see it in all direction.

14. More interactive; easier to visualize very small movements

15. It allows you to see the 3d structure without having to have the actual structure in your hand.

16. It allows you to see it in 3d how it is actually happening.

17. Helped to visualize very subtle movements.

18. Good 3D representation.

19. 3d, easy to use, better visualization.

20. Cleared up axis.

21. It depends on the learner; it was nice to see but I learn better with using my hands to visualize what is going on.

22. Ability to visualize a difficult concept, which made it much easier to understand.

23. The motions were easy to follow.

24. I liked how it was easier to visualize the axes and the motions.

25. It was easy to visualize cranial complex SBS motion/axis/dx.

26. Very interesting being able to see the motion of strains.

27. Being able to visualize in 3D

28. Actual representation of the strain patterns.

29. It was very engaging and helps visual learners.

30. Visual and animated.

31. It was really cool to see it in 3d and see the real motion.

32. Interactive. Able to see images in different angles.

33. I thought it helped me picture it a lot better.

34. Much more easy to visualize all the movements

35. Not for cranial, I feel that the manual and the images and skulls provided are sufficient, and we don’t need an additional service for this. Cranial week is long as is, so i feel that this would be difficult to integrate (INCLUDING TRAINING ETC).

36. As a visual person it really helped me as opposed to only seeing a 2d image.

37. It was a nice visual representation to learn.

38. Being able to see in 3D in space is just so nice to be able to have, and it gives me more control over what I'm seeing as opposed to a video.

39. Being able to see the motion of the bone along the axes.

40. Would make studying more efficient.

41. Spatial visualization.

42. New tech is always cool.

43. I like the moving bones.

44. Seeing the 3D helped orient everything.

45. See movement and axes.

46. 3d component.

47. it was def a cool experience but it would be nice if we could manipulate the 3D image on the phone screen.

48. Really helped me visualize the motions of SBS! And it would be a great tool to review these motions, since most of us don't have our own disarticulated skulls at home...

49. It was nice to see a 3d picture of the movements.

50. It made visualizing the somatic dysfunctions much easier.

51. I could actually see the movements with all the appropriate and complete bony landmarks.

52. I am a visual learner, so seeing everything in 3D was especially helpful.

53. Smooth interface and easy to use. Intuitive and sharp and clear.

54. So much better to visualize the axis and movements!

55. 3d experience is cool.

56. Anatomy is 3D, so it makes sense seeing it that way.

57. Really clear images and being able to move phone around to view from any angle.

58. An ability to see the skull from different angles. It's like having a skull model without needing to have a skull model.

59. it's always nice to have the 3-D representation of what's going on. Also, the technology is cool.

60. 3D rendition of a 3D concept allowed enhanced learning experience rather than having to take a 2D image which had to be rendered in our minds as a 3D image.

61. It was SO cool!

62. I felt it was very easy to download and use. The graphics were very clear.

63. 3D

APPENDIX H

QUESTION 12

What improvements or suggestions would you like to offer?

1. Ability to change viewing angles, start and stop the motions.

2. Tools that allow you to manipulate or change the axes.

3. It would be key if you could add tap on tap off buttons that would allow the student to see what joint they are on, what planes of motion are happening, what phase of motion they are currently in.

4. Greater ability to manipulate the virtual model.

5. Include visuals for how the strain patterns are incurred. i.e. Show the injuries that cause the strains.

6. More exaggerated movements. Along with hand movements to go alongside the AR.

7. Would there be a way to move your perspective while looking at the strain pattern. I think this would to notice the motion from a different hold or observing another person treat.

8. Not having to use a computer screen.

9. N/A

10. This is awesome.

11. Make an option to see the movement more exaggerated because I didn't notice the movement at first.

12. Have an option to exaggerate movement.

13. Show movements of different strain patterns within the 3d structure. Don't just have a static structure.

14. I'd like to be able to flip the images.

15. Hard to visualize 360.

16. Having more obvious movements to visualize the motion more easily.

17. I think it would be better if the movements were more prominent because it is hard to see the movements sometimes when my hands are shaking

18. None

19. Make it less shaky.

20. More clear motions/movement of the cranial bones.

21. The image is a little shaky I don’t know how to fix that though.

22. There was a little shakiness with the bones, which made it harder to see the motions.

23. Easier to see at different angles - sometimes it disappears.

24. Ability to zoom in or pause from the phone.

25. The ability to move the imagine in the phone in three dimensions.

26. To me, it seemed a little shaky but overall was really well done.

27. I think the movements should be more exaggerated because they were too subtle sometimes.

28. None.

29. More exaggerated cranial bone movements.

30. Maybe have exaggerated movement option.

31. I wish I could change the level of exaggeration, so as to make it easier to see the strains.

32. I would like to see how the hands would move in relation to the bones if possible.

33. The only skull that moved in my experience was in flexion and extension.

34. Be able to rotate bones around axes for movement and diagnoses.

35. Have the bones move with the dysfunctions.

36. Be able to manipulate the bones.

37. Making the movements more exaggerated because it was hard to tell the motion of the cranium and when the phone shakes when you are holding the phone, it makes the small movement of the cranium even harder to see.

38. More exaggerated movement.

39. Better motion of movement and auto stabilizing feature for the shaky hands.

40. Allow us to manipulate the image on the screen, like zooming in and rotating it.

41. I'm so impressed that I can't think of any ^_^.

42. They could be a little more exaggerated and the movement of the phone when holding it made it difficult to see movement sometimes.

43. Can't think of any!

44. Movements were subtle for some of the vertical/lateral strains, was also a bit jittery, but otherwise very good.

45. I think this is great! No suggestions.

46. Animations if possible. The movements are sometimes the most difficult part, and while the axes are helpful, the movement and animations would be great.

47. The skull motion may be accurate - but it was a little too subtle and slow for learning experiences, perhaps for beginners. I would like to see the gross motion first, and then be able to access the real motion.

48. The bone components should exhibit motion around the respective axis' to show movement from normal into the strain pattern.

49. More fluid movements (not as jittery). Maybe exaggerate the strain pattern.

50. Too shaky, if that is improved then it would be perfect!!

51. Maybe exaggerate the motions more. Include a slide with exaggerated motion in addition to what an actual strain would look like.

52. Less sensitive - the animations shake with the motions of my hands.

53. Make platform less sensitive to motion while the operator holds the phone. Currently it is too sensitive and the instability makes it difficult to visualize the strain patterns.

54. The movements of the bones relative to one another may have to be a bit more exaggerated so the student can better appreciate the directions/axes, etc.

55. Ability to move the 3D skull around on phone, larger gross motions.

56. Make the movements a little more gross in some of the images; it was a little hard to see the motion in a few. Also, the vertical strain looked like normal flexion and extension.

57. Crashing on android.

58. App stability across platforms (especially Android) would be crucial to the success of the project!

59. I know that this is a preliminary version at the moment but one suggestion I would have is to make the motions just a bit more exaggerated so it is easier to see.

60. Definitely need to exaggerate the motions of the strain patterns; they are too subtle and can easily get confused with motion artifact.
