The Impact of Emotional Design on the Effectiveness of Instructional Materials

A dissertation presented to

the faculty of

The Patton College of Education of Ohio University

In partial fulfillment

of the requirements for the degree

Doctor of Philosophy

Dana C. Simionescu

May 2020

© 2020 Dana C. Simionescu. All Rights Reserved.

This dissertation titled

The Impact of Emotional Design on the Effectiveness of Instructional Materials

by

DANA C. SIMIONESCU

has been approved for

the Department of Educational Studies

and The Patton College of Education by

Greg Kessler

Professor of Educational Studies

Renée A. Middleton

Dean, The Patton College of Education

Abstract

SIMIONESCU, DANA C., Ph.D., May 2020, Instructional Technology

The Impact of Emotional Design on the Effectiveness of Instructional Materials

Director of Dissertation: Greg Kessler

Emotional design refers to those features of design that do not have any specific informational or pedagogical function, but are aimed at stimulating the affective dimension of learning. In this study, I investigated the effect of two types of emotional design of learning materials on undergraduate students’ learning outcomes and perceptions, within the theoretical framework of the Cognitive Affective Theory of Learning with Multimedia (Moreno, 2007; Moreno & Mayer, 2007). The study design was experimental, with participants randomly assigned to each condition.

The independent variable was the type of design of the learning materials (a short biology lesson). The materials were designed in four versions: a plain lesson consisting of text with simple black and white graphics; text with images that were designed based on emotional design principles for visual design (using warm colors and anthropomorphisms); text with simple black and white graphics plus basic storytelling elements (a character, a very simple plot, and narrative tenses); and a lesson with both enhanced graphics and storytelling elements. After studying the lesson, the students were tested on retention and transfer, once immediately after the lesson and once after one week. The students were also asked to rate the effectiveness of the instructional content and the visual design of the content.

The dependent variables were the scores on the retention and transfer tests, immediate and delayed, and the student ratings for content effectiveness and for visual design. Both the immediate and delayed retention scores were based on the answer to an open question asking students to recall the process in detail. The immediate transfer score was based on the answers to five open questions, and the delayed transfer score was based on the answers to five multiple-choice questions. The number of valid cases was 220 for the initial task and 143 for the delayed task.

The results showed that graphical emotional design and storytelling may contribute to improved learning outcomes by promoting positive affect.

Recommendations were made for future research to further investigate these effects, and for practitioners to incorporate emotional design into their learning materials.

Dedication

To my parents.


Acknowledgments

I would like to thank my committee members Drs. Greg Kessler, Gordon Brooks, David Moore, and Sam Girton for their ideas, resources, and guidance. I would also like to thank Dr. Mark Slayton for checking the accuracy of my materials and helping me develop test questions. Special thanks to Dr. Dawn Bikowski who listened patiently to my whining and shared valuable writing tips. Finally, I’m thankful to my classmates for being great company on this journey, especially Sarah McCorkle who helped with writing motivation and test rating.


Table of Contents

Page

Abstract ...... 3
Dedication ...... 5
Acknowledgments ...... 6
List of Tables ...... 11
List of Figures ...... 12
Chapter 1: Introduction ...... 13
Background of the Study ...... 13
Purpose of the Study ...... 19
Significance of the Study ...... 20
Research Questions ...... 21
Definitions of Terms ...... 21
Summary ...... 25
Chapter 2: Literature Review ...... 27
The Psychology of Learning ...... 27
The Role of Memory ...... 27
The Role of Attention and Working Memory ...... 30
The Role of Motivation ...... 31
The Role of Emotions ...... 33
Theories of Learning ...... 37
The Cognitive Load Theory ...... 38
Multimedia learning ...... 39
The Cognitive Theory of Multimedia Learning ...... 43
The Cognitive Affective Theory of Learning with Multimedia ...... 52
Visual Design and Learning ...... 55
Effective design ...... 55
The concept of emotional design ...... 62
Aesthetics and learning ...... 65
Storytelling as an Instructional Technique ...... 69
Emotional Design in Learning Materials ...... 73
Emotional ...... 74

Summary ...... 90
Chapter 3: Method ...... 93
Research Questions ...... 93
Research Design ...... 94
Variables ...... 94
Participants ...... 95
Instruments ...... 96
Pilot Study ...... 101
Procedure ...... 102
Inter-Rater Reliability ...... 103
Group Equivalency ...... 105
Data Analysis ...... 105
Procedure ...... 105
Assumptions ...... 107
Summary ...... 109
Chapter 4: Results ...... 110
Summary of Demographic Data ...... 110
Research Question 1 ...... 111
Research Question 2 ...... 117
Research Questions 3 and 4 ...... 120
Research Question 5 ...... 121
Summary ...... 121
Chapter 5: Discussion ...... 122
Research Question 1 ...... 122
Research Question 2 ...... 124
Research Question 3 ...... 125
Research Question 4 ...... 126
Research Question 5 ...... 127
Colors and Anthropomorphisms ...... 127
Storytelling Elements ...... 128
Implications ...... 129
Implications for theory ...... 129
Implications for instructional design practice ...... 133

Limitations ...... 134
Ecological validity ...... 134
Online setting ...... 135
Convenient sampling frame ...... 135
Attrition ...... 135
Type I error ...... 135
Storytelling ...... 136
Affect was not directly measured ...... 136
Recommendations for Future Research ...... 136
Research context ...... 136
Storytelling ...... 137
Affect ...... 137
Cognitive load ...... 137
Lesson subject matter ...... 138
Population ...... 138
Additional emotional design types ...... 138
Conclusion ...... 139
References ...... 141
Appendix A: Consent Form ...... 167
Appendix B: IRB Approval ...... 170
Appendix C: Lesson Slides ...... 172
Appendix D: Questionnaire ...... 182
Appendix E: Immediate Test ...... 183
Appendix F: Delayed Test ...... 184
Appendix G: Rating Rubric ...... 186
Appendix H: Materials Evaluation Questions ...... 188
Appendix I: Group Equivalency Checks ...... 189
Appendix J: Normality Checks ...... 194
Appendix K: Linearity Check ...... 197
Appendix L: Homogeneity Checks ...... 198
Appendix M: MANOVA for Immediate Task Performance ...... 200
Appendix N: Discriminant Analysis ...... 202
Appendix O: MANOVA for Delayed Task Performance ...... 204

Appendix P: ANOVA for Instructional Content Rating ...... 205
Appendix Q: ANOVA for Visual Design Rating ...... 206
Appendix R: Reliability Analysis for Pilot Study ...... 207


List of Tables

Page

Table 1 Inter-Rater Reliability: Intraclass Correlation Coefficient ...... 103
Table 2 Inter-Rater Reliability: Crosstabs and Kappa Coefficient ...... 104
Table 3 Inter-Rater Reliability: Pearson Correlations ...... 105
Table 4 Homogeneity Tests ...... 109
Table 5 Descriptive Statistics for Immediate Task Performance ...... 114
Table 6 Discriminant Analysis: Tests of Functions ...... 115
Table 7 Discriminant Analysis: Function Coefficients ...... 115
Table 8 Discriminant Analysis: F for Pairwise Differences ...... 116
Table 9 Descriptive Statistics for Delayed Task Performance ...... 120


List of Figures

Page

Figure 1. The basic dimensions of emotions ...... 33
Figure 2. Game characters from Plass et al. (2019) ...... 36
Figure 3. An affective-cognitive model of academic learning ...... 36
Figure 4. The Cognitive Theory of Multimedia Learning ...... 43
Figure 5. The Cognitive Affective Theory of Learning with Media ...... 53
Figure 6. The Integrated Cognitive Affective Model of Learning with Multimedia ...... 54
Figure 7. Emotional design example from Um et al. (2012) ...... 78
Figure 8. Example slides from Mayer and Estrella (2014) ...... 81
Figure 9. Example content from Heidig et al. (2015) ...... 83
Figure 10. Example screens from Kumar et al. (2018) ...... 85
Figure 11. Example screens from Uzun and Yıldırım (2018) ...... 87
Figure 12. Example decorative images from Schneider et al. (2018) ...... 89
Figure 13. Control sample slide ...... 97
Figure 14. Graphics sample slide ...... 98
Figure 15. Story sample slide ...... 99
Figure 16. Mean scores for Retention ...... 112
Figure 17. Mean scores for Transfer ...... 113
Figure 18. Mean scores for Delayed Retention ...... 118
Figure 19. Mean scores for Delayed Transfer ...... 119
Figure 20. An affective-cognitive model of academic learning ...... 130
Figure 21. A proposed model of learning ...... 131


Chapter 1: Introduction

Background of the Study

Instructional materials are created and delivered in a multitude of forms these days: print materials like textbooks or handouts, digital materials like presentation slides or e-learning modules, and virtual learning environments such as learning management systems, websites, or games. All of these can take multiple shapes in how they are designed, from purely functional to attractive and eye-catching, with multiple multimedia elements. Does the medium of instruction matter? Does the way materials are designed make a difference in the students’ ability to learn? How do we ensure maximum effectiveness? While there are many elements to take into account, one aspect of the learning process that has received less attention from designers and researchers is the affective one. Emotional design aims to tap into this dimension and boost learning outcomes. The main form of emotional design that has been studied, and which is one of the types employed in this study, relates to the visual aesthetics, or appearance, of the materials.

When people think of graphic or visual design, what usually comes to mind is successful marketing, attractive advertisements, or eye-catching magazines and posters.

Indeed, in the field of consumer science, numerous studies have supported the use of aesthetic product design to increase usability and consumer satisfaction. However, graphic design is not just about selling products, but about communicating a message in the most effective and engaging way possible, with the help of layout, images, typography, and colors; thus, it can be a valuable tool to use in education because, just as in other domains, the audience’s attention and motivation are key factors in obtaining successful outcomes.

It is important to emphasize that the quality of the visual design is not just important in order to cater to “visual learners.” There is a pervasive belief among educators that learners can be categorized into sensory “styles” – visual, auditory, kinesthetic – which can be measured by diagnostic instruments, and that teaching methods need to match the students’ styles in order to optimize learning (Geake, 2008). In other words, so-called “visual learners” would benefit from information presented in visual form, while “auditory learners” would learn better from listening to a lecture. This belief, together with the resulting practices and materials, has been playing a central role in education for over 30 years.

Nonetheless, there is no scientific evidence to support the learning styles idea, and comprehensive reviews have found that the studies claiming to provide evidence for learning styles fail to satisfy criteria for scientific validity (Coffield et al., 2004; Pashler et al., 2008). The validity and reliability of the learning styles tests are also dubious, with self-report being the most used method, and different results being obtained by the same people at different points in time (Massa & Mayer, 2006; Stahl, 1999; Veenman et al., 2003). Moreover, using pedagogical strategies associated with learning styles may even be harmful to learners: first, they “pigeon-hole” students and may cause them to believe that they are unable to learn material presented in a different modality; second, they waste time and effort that could be spent on strategies that have been proven to work (P. A. Kirschner, 2017; P. A. Kirschner & van Merriënboer, 2013; Newton, 2015; Newton & Miah, 2017; Riener & Willingham, 2010; Willingham et al., 2015).

Consequently, when we provide well-designed visual materials, we are not helping “visual learners”—we are helping all learners. Educational materials need to be developed based on sound instructional design principles, of which effective visual design can be seen as a subcategory. The visual design of learning materials is an essential component of the instructional design process: it impacts how the materials are perceived by the learners (i.e. credibility and ability to stimulate interest) and, more importantly, how well they do their job of delivering the instructional content (i.e. instructional effectiveness). Richey, Klein, and Tracey (2011) described visual elements as “a type of language” with its own “grammar” consisting of specific organizing principles. Learning these principles and how they connect to the learning process is an essential step in optimizing our materials.

There are two aspects to the role of visual design: function and aesthetics. Visual design serves a functional role because the way materials are formatted is the most basic form of learning guidance (Dick et al., 2009). While learners are interacting with pedagogical resources, aspects of the design (e.g., headings, fonts, graphics) can serve as signposts to guide and orient learner attention in the activity. We want to help the learner use that cognitive energy on understanding, remembering, and constructing meaning, rather than on deciphering the structure of what is on the page. Thus, elements of design support cognitive processes by providing guidance: materials need to be designed to facilitate navigation and reduce cognitive load. These are some of the things we need to think about: Is the learner’s eye guided to the most relevant points first? Is the information flow easy to follow? Is it obvious how the content is organized, what its structure and interrelationships between components are? Is the text highly readable and easy to skim? Do the elements form a cohesive whole? (Dick et al., 2009; Malamed, 2015; Mayer, 2009)

These days, learning resources are not just print materials like textbooks or worksheets, but also digital ones such as PDF or PowerPoint documents, or virtual environments like websites, e-learning modules, or Learning Management Systems. While similar visual design principles apply to all of these, in a virtual environment we also need to take into account interactivity, which adds a whole new layer of complexity. This means that we also need to think of how learners engage with a page, how they identify the structure and flow of the content, and how they decide on the next thing to click. When looking at a web page or app, it should be, as much as possible, obvious what information is on there, how it is organized, what can be clicked on, or where to start—this way, the learner can focus on the task at hand. Visual design plays an important role in making this a smooth process, and we can draw on lessons from user experience (UX) design in order to ensure a high-quality learner experience in a digital environment.

Visual design also serves an aesthetic role. This has to do with how attractive the materials look, how well the design captures the learners’ attention, and how much it contributes to sustaining their interest. This aspect of design has mostly been neglected by both researchers and instructional designers, because it is generally not considered that important for learning (Glore & David, 2012; Hancock, 2004; C. Miller, 2011; Parrish, 2005; Rodríguez Estrada & Davis, 2015). However, recently there have been more studies examining the impact of aesthetics on learning outcomes. While research is still needed on the nature of this impact and how best to achieve it, the evidence so far, together with research from other fields such as product design and website experience design, suggests that the visual attractiveness of the design does make a difference. The current theories posit that elements of design support cognitive processes by capturing attention and sustaining motivation, and the questions we need to ask with respect to the aesthetic function of design are: Does the material look appealing? Does it trigger emotion in the learner? (Hancock, 2004; Heidig et al., 2015; Mayer & Estrella, 2014; Park et al., 2015; Plass et al., 2014; Um et al., 2012)

While these two aspects overlap and it is hard to say where function stops and where aesthetics begins (in fact, some literature suggests that aesthetics can be considered to serve function), this categorization is generally helpful for distinguishing elements that have purely practical roles from those that are meant to influence affect. Both research and practice have focused more on the former elements, with interest in the latter growing in the last decade. Thus, the term emotional design has been used to refer to those elements of design whose main purpose is to create a positive emotion in the user—in the case of instructional design, the user being the learner (Mayer & Estrella, 2014; Norman, 2005; Park et al., 2015; Um et al., 2012).

The emotional design hypothesis postulates that visually appealing elements initiate, guide and maintain the learner’s cognitive processing, resulting in improved learning outcomes (Mayer & Estrella, 2014). It is, however, important to distinguish between visual elements that are intrinsic to the learning material—that is, have a clear instructional role—and elements that are extraneous and irrelevant to the topic—also known as seductive details, such as an interesting story or graphic that may be only remotely connected to the instruction itself. The latter have been shown to have a negative effect on learning, due to the added cognitive load, and should therefore be avoided (Harp & Mayer, 1998; Mayer, 2014c; Moreno & Mayer, 2000). The emotional design hypothesis refers to creating an appealing design of those elements of the study materials that are essential in the lesson, and has its roots in the cognitive load theory, the cognitive theory of multimedia learning, and the cognitive affective theory of learning with multimedia.

Elements of emotional design investigated so far include anthropomorphisms (human-like characteristics), warm colors, and round, face-like shapes (Mayer & Estrella, 2014; Park et al., 2015; Plass et al., 2014; Um et al., 2012; Uzun & Yıldırım, 2018), classical and expressive aesthetic color combinations (Heidig et al., 2015), holistic aesthetic design principles (Hancock, 2004; Tomita, 2018) and, more recently, positively or negatively charged textual elements (Stark et al., 2018). Emotional design has been discussed in relation to various media, such as printed handouts, PowerPoint presentations, websites, e-learning materials, and online education environments. Overall, these studies support the idea that emotions can positively influence learning, and that emotional design improves learning outcomes. However, the results are mixed and show some contradictions, and more research is needed in order to understand how the various design elements work together to impact motivation and learning.

While the term emotional design has been mainly used in relation to visual design, it can be expanded to include other elements whose main purpose is to tap into the affective dimension of learning. One study focused on the use of emotional text design in multimedia learning (Stark et al., 2018). Another researcher examined the effect of sounds as emotional design (Königschulte, 2015).

One way to include emotional design in instructional material is by adding storytelling elements. While not sufficiently studied, storytelling is considered beneficial to learning because it enhances the emotion associated with the material by helping the learner better relate to the content and making the material more meaningful (Andrews et al., 2009; Gerrig, 1993; Hokanson & Fraher, 2008; Malamed, n.d.-b; Neibert, 2014). This paper will address the use of storytelling elements in instructional design and perform an exploratory investigation of their effects.

Purpose of the Study

In the present study, I examine the role of aesthetics and storytelling within the framework of the CATLM (cognitive affective theory of learning with multimedia). The purpose of the research is to test the emotional design hypothesis by relating the design of the multimedia materials to task performance, and to investigate the effect the type of design may have on students’ perceptions of effectiveness. Additionally, I investigate whether the emotional graphics are indeed perceived as more visually appealing by students, which can shed light on the mechanism through which emotional design functions. This study replicates and expands on the research by Mayer and Estrella (2014), using their materials with slight modifications. The present study adds to the previous research in four ways:

• It uses a different learner population: students at a midwestern university.
• It adds a new type of emotional design: storytelling elements. So far, storytelling elements have been scarcely investigated, and have never been researched in the context of emotional design.
• It includes a new dependent variable: delayed task performance (after one week). Previous similar studies have investigated immediate task performance, in the form of comprehension, retention, and transfer; however, so far, no research has been conducted to examine delayed task performance.
• It checks whether emotional design graphics are perceived as more appealing by students. Since visual attractiveness is subjective and culture-dependent, it is useful to know whether the graphics provoke the desired emotional response in learners.

Significance of the Study

Graphic design can be a valuable tool to use in education because, just as in other domains, the audience’s attention and motivation are key factors in obtaining successful outcomes. It has been argued that aesthetics heightens the learning experience and boosts intellectual activities, since well-designed materials facilitate navigation, reduce the cognitive load, and keep the learners interested and engaged; hence, aesthetic considerations should be an essential part of instructional design (Parrish, 2005; Wilson, 2005). Storytelling is being used extensively by instructional design practitioners, but there is hardly any research to support its use. This study will further our understanding of the relation between emotion and cognition, will offer practical insights into the effective design of learning materials, and will strengthen the case for including visual design training in instructional design and technology programs.

Research Questions

1. What is the effect of graphic emotional design and storytelling on immediate task performance (i.e., the results on a retention and transfer test)?
2. What is the effect of graphic emotional design and storytelling on delayed task performance?
3. What is the effect of graphic emotional design on students’ perceived effectiveness of learning materials?
4. What is the effect of storytelling on students’ perceived effectiveness of learning materials?
5. Do students perceive the materials designed with graphic emotional design as more visually appealing?

Definitions of Terms

Accessibility. In education, accessibility refers to the ability of courses and learning materials to meet the needs of people of different abilities and from a variety of backgrounds.

Behavioral level. The behavioral level is one of the three levels at which the brain operates, according to Norman (2005, 2013). It refers to the processes that control everyday behavior.

Central executive. The central executive is a component of the working memory. It is the supervisory system that combines the information from the subcomponents of the working memory and integrates it with the knowledge in long-term memory (Baddeley, 1986).

Cognitive Affective Theory of Learning with Multimedia (CATLM). CATLM is a modified version of CTML, which adds the idea that learning is mediated by emotion and motivation, self-regulatory skills, and learners’ prior knowledge and characteristics (Moreno, 2006).

Cognitive load. Cognitive load is the total effort performed by the working memory (Sweller, 2011).

Cognitive Load Theory (CLT). CLT is a theory of learning that states that human cognitive processing is constrained by the capacity of our working memory, and that learning design should aim to reduce unnecessary demands on this system (Sweller, 2011).

Cognitive Theory of Multimedia Learning (CTML). CTML is a theory of learning that states that learning is a process of filtering, organizing, and integrating information with prior knowledge, and that working memory contains a verbal channel and a pictorial channel, which do not compete for resources (Mayer, 2014).

Emotional design. In product design, emotional design refers to the elements of design whose main purpose is to create emotion (usually, pleasure) in the user (Norman, 2005). In instruction, emotional design refers to those elements of design that do not have a specific pedagogical or technical function, but are intended to evoke an affective response in the learner.

Essential processing. According to CTML, essential processing is the cognitive processing of the learning material which is caused by the intrinsic difficulty of the content.

External mood induction. External mood induction is the process of stimulating emotion in the learner through elements outside the learning task (for example, watching a happy or sad video before learning).

Extraneous cognitive load. According to CLT, extraneous cognitive load is the cognitive load caused by processing elements unnecessary to the instructional task.

Extraneous processing. According to CTML, extraneous processing is the processing of instructional elements that are not relevant to the learning goal.

Forgetting curve. Also known as the retention curve, it is the shape of the function representing how forgetting occurs over time after learning (Ebbinghaus, 1885).

Generative processing. According to CTML, generative processing is the processing that supports deeper understanding of the learning material.

Germane cognitive load. According to CLT, germane cognitive load is the effective cognitive load required to learn.

Graphic design.

The traditional role of design has been to improve the visual appearance and function of messages and information. […] In all these practices, designers use typography, photographs, illustrations, and graphic elements to construct messages that attract attention, cause us to think about their meaning, and stay in our memories over time. (The Professional Association for Design, 2017, para. 1)

Instructional design. Instructional design is a systematic process for developing and evaluating instructional materials and activities, based on learning theories.

Internal mood induction. Internal mood induction is the process of stimulating emotion in the learner through elements pertaining to the learning task (for example, graphic design elements of the instructional material).

Intrinsic cognitive load. According to CLT, intrinsic cognitive load is the cognitive load produced by the nature and difficulty of the task.

Learning outcomes. Learning outcomes refer to what students should know or be able to do after instruction.

Multimedia. Multimedia is defined as a presentation comprised of a combination of words and pictures (static or dynamic, with or without audio components) (Mayer, 2009).

Multimedia learning. Multimedia learning is the process of building mental representations from multimedia instructional materials (Mayer, 2009).

Perceptual fluency. Perceptual fluency is “the subjective experience of ease or difficulty associated with completing a mental task” (Oppenheimer, 2008, p. 237).

Reflective level. The reflective level is one of the three levels at which the brain operates, according to Norman (2005, 2013). It is the highest level, which monitors and reflects on experience and actions.

Storytelling elements. Storytelling elements are content and language features that are usually found in stories, such as plot, characters, and narrative tenses.

Usability. Usability is the degree to which a product is easy to access and use.

User experience (UX) design. User experience design is “the process of creating products that provide meaningful and relevant experiences to users. This involves the design of the entire process of acquiring and integrating the product, including aspects of branding, design, usability, and function” (What is User Experience (UX) Design?, n.d.).

Virtual learning environment (VLE). A virtual learning environment is an online environment or platform designed to contain the digital contents of a course. It usually allows for online interaction with the material, instructors, and other learners.

Visceral level. The visceral level is one of the three levels at which the brain operates, according to Norman (2005, 2013). It is the immediate, automatic reaction to a stimulus.

Working memory. Working memory is the immediate memory, which selects and processes information coming from the senses, and integrates it with information stored in long-term memory.

Summary

In this chapter, I gave an overview of the background information and literature relevant to the study. I outlined the purpose of the study and its significance, including the original contributions that this research adds to the existing work. The research questions were also presented, followed by definitions for key terms. The second chapter will cover the literature on the psychology of learning, visual design, storytelling, and emotional design. Chapter three will present the methodology used in this research.

Chapter four will include the findings of the study, and chapter five will present a discussion of these findings, limitations, implications, and recommendations.

Chapter 2: Literature Review

This literature review will start with a description of the current concepts and theories that pertain to the psychology of learning. Then, I will explore the role of visual design in learning and introduce the concept of emotional design. Next, I will analyze the concept of storytelling and its importance for instruction. Finally, I will review the research that examines the effect of emotional design on learning.

The Psychology of Learning

There are many definitions of learning, but most of them see learning as a long-term change in knowledge or behavior, as a result of experience (Malamed, n.d.-a; What is learning? | Center for Teaching & Learning, n.d.). Learning is a complex process, which is active, builds on prior knowledge, occurs in a complex social environment, is situated in an authentic context, and requires learners’ motivation and cognitive engagement (What is learning? | Center for Teaching & Learning, n.d.). Therefore, learning involves several psychological mechanisms. In this section, I will outline the role of memory, attention, motivation, and emotions as psychological processes that play important roles in learning. I will then describe the main theories of learning that explain how all these mechanisms interact to support instruction.

The Role of Memory

Memory plays the central role in learning – it is “the mechanism by which our teaching literally changes students’ minds and brains” (Miller, 2014, p. 88). According to modern theories, memory involves three major processes: encoding, storage, and retrieval. Encoding means transforming information into memory representations, in other words, forming new memories; storage refers to the maintenance of these representations for a long time; retrieval is the process of accessing the stored representations when we need them for some goal or task (Miller, 2014). Working memory is one of the most widely accepted memory models for explaining how we encode and store information—due to its tight connection with attention, working memory is described in the next section.

M.D. Miller (2014) explained that long-term memory is considered unlimited; nonetheless, the challenging part is the retrieval (recall), which means accessing the information when needed. The success of retrieval depends on a few factors. To retrieve memory representations, we use cues—information that serves as a starting point. Since a memory can include different sensory aspects, information with rich sensory associations is usually remembered more easily. Recall is also influenced by how the information was first processed, for instance deep versus shallow processing. Finally, emotions have been shown to boost memory (Kensinger, 2009), with negative emotions (such as fear or anger) in particular having a powerful effect on memory (Mackay et al., 2004; Porter & Peace, 2007; Rubin, 2005).

One of the most significant contributions to memory research is the work of Hermann Ebbinghaus, who conducted a series of experiments on the shape of forgetting. The result was the forgetting curve (also called the retention curve), which is a function showing that the majority of forgetting takes place soon after learning, after which less information will be forgotten (Ebbinghaus, 1885). Most memory researchers assumed that this is a continuous function, progressing in a relatively constant pattern (Fisher & Radvansky, 2018). Murre and Dros (2015) successfully replicated Ebbinghaus’s work and found that the curve is not completely smooth, but shows an increase at the one-day retention interval, which the authors attribute to the effect of sleep on memory.
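To make the shape of this curve concrete, later modeling work often summarizes retention with a simple exponential decay. This parameterization is a common shorthand rather than Ebbinghaus’s original formula, and the symbols R, t, and S below are illustrative labels of my own:

\[ R(t) = e^{-t/S} \]

where R(t) is the proportion of material retained at time t after learning and S is a stability parameter reflecting the strength of the memory trace; the larger S is, the flatter the curve and the slower the forgetting.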

However, the assumption of a continuous function has been challenged in recent years. A recent review of studies on the retention curve concluded that the rate of forgetting may increase somewhere around seven days (Fisher & Radvansky, 2018). Fisher and Radvansky’s research adds evidence that forgetting does not follow a continuous function; instead, there seem to be two phases of retention, with the most forgetting happening in the first seven days, and slowing down afterwards. The authors also differentiate between three levels of representation (surface form, textbase, and event model) and call for future research on patterns of forgetting for different types of information. They also explain the implications of these findings for education:

What the results show is that the basic propositional content of knowledge that is learned is retained for about seven days before it begins to be lost. However, a broader understanding of material may persist for long periods of time, although with some forgetting. This could explain why people are able to readily comprehend much of what is going on in a novel, television, or movie series, even after a long delay between exposures to the content. There may be some detailed information that is lost over time, with more details lost with longer periods of time, but the essential understanding and memory of the basics of the story are intact. (Fisher & Radvansky, 2018, p. 138)

Since most forgetting happens in the first period of around seven days, this is a useful interval to consider when planning learning experiments, as well as when planning instruction. A well-designed course will include sufficient opportunities for practice and retrieval during this time, so as to minimize the forgetting that naturally occurs.

The Role of Attention and Working Memory

We receive information from our senses (visual, auditory, etc.), and then we perform a preconscious analysis to check whether it is important to survival and if it is related to our current goals. If it is, this information is retained and will be further processed and turned into mental representations. Thus, attention is the major process through which information enters our consciousness (MacKay, 1987). Attention is limited, and it is to some extent under voluntary control, but it can be easily disrupted by strong stimuli. Visual processing is highly dependent on attention, which guides how we explore our environment and how we form our subjective representations of what we see.

Attention is also crucial for memory, and without attention we cannot remember much (Miller, 2014). How attention is directed depends on the way the content is presented and on the nature of the content itself (Richey et al., 2011). Therefore, the manner in which we design our instructional materials will have an effect on how learners focus their attention to select and process this information, and in turn on what and how much gets stored in their memory.

Working memory is a concept introduced in the 1970s by Alan Baddeley. This model describes immediate memory as a system of subcomponents, each of them processing specialized information, for example sounds and visual-spatial information. This system also performs operations on this information, and the subcomponents are managed by a mechanism called the central executive. The central executive combines the information from the various subcomponents, draws on information stored in long-term memory, and integrates new information with the old (Baddeley, 1986).

As mentioned earlier, these processes are closely intertwined with attention; like attention, the capacity of each of the working memory components is limited. However, these components are also independent; so, for instance, visual information will interfere with other visual information, but not much with another type such as verbal information (Baddeley, 1986). This is why the most effective instructional materials will include a combination of media, such as images and text (or better yet, audio narration), rather than just images or just text.

Some researchers consider attention and working memory to be the same thing; while not everyone agrees, it is clear that they are highly interconnected and overlapping processes (Cowan, 2011; Engle, 2002). Attention is the process that decides what information stays in working memory and keeps it available for the current task, in other words, what information is relevant to us. Attention is also involved in coordinating the working memory components and allocating resources based on needs and goals (Miller, 2014).

The Role of Motivation

Motivation is generally categorized into two types: intrinsic and extrinsic. Intrinsic motivation is the desire to do something because the person is interested in that topic or activity for its own sake. Extrinsic motivation is the desire to do something because of an outside reward or punishment. In learning, intrinsic motivation can mean that the learner really enjoys a certain topic, or has a specific problem they need to solve. Extrinsic motivation can be the need to learn for a promotion or because it is required (Dirksen, 2016).

Deci and Ryan (2014) proposed the self-determination theory (SDT), which presents a continuum of motivation, from extrinsic (only due to a reward or punishment) to intrinsic (purely from enjoyment of the task). Between the extremes, there are different degrees of intrinsic / extrinsic motivation, such as doing something because it will make someone else happy or to fit in a group. In general, intrinsic motivation is considered superior, and sometimes extrinsic motivation can even affect intrinsic motivation negatively (Deci, 1971). However, many extrinsic motivators can be very powerful. The self-determination theory states that people are motivated the most by the need for three basic things: competence (being good at something), relatedness (having connections with other people), and autonomy (making their own choices) (Ryan & Deci, 2000).

While instructional design cannot change the purely intrinsic motivation of a learner, it can certainly support it by making sure the learning activities are meaningful to the students. Moreover, it can enhance motivation through various strategies such as designing materials that catch and sustain the learners’ attention, providing an appropriate degree of control, creating opportunities for success, ensuring timely and informative feedback, and including social and collaborative learning activities. Emotional design is one tool that designers can use to boost the learners’ motivation, by making learning materials more attractive.

The Role of Emotions

Studies have shown that emotions impact cognitive processes (Izard, 2009; Pekrun, 2006; Russell, 2003); therefore, it is important to identify elements that may affect these emotions in a way that optimizes learning. Emotions have been defined as people’s judgment and appraisal about the world, in reaction to and through interaction with specific stimuli, such as instructional materials (Oatley & Johnson-Laird, 1987; Ortony et al., 1988). Russell (2003) defined core affect as a “neurophysiological state that is consciously accessible as a simple, nonreflective feeling” (p. 147), with two basic dimensions: pleasure (pleasant vs. unpleasant affect) and arousal (activation vs. deactivation). Figure 1 shows a diagram of these dimensions.

Figure 1. The basic dimensions of emotions. This figure shows the main dimensions of core affect (Russell, 2003, p.148). Permission for reproduction not required for up to 3 figures by APA.


Plass and Kaplan (2016) talked about the gap between the importance of emotions for learning and the fact that educational research and practice largely ignore them, focusing mostly on cognitive issues. They warned that “The effectiveness of instructional design will depend on the extent to which it takes into account the pervasive and motivating nature of emotions and their natural interconnectedness with cognition” (p. 134).

Some researchers investigated the influence of negative emotions on learning, such as research on test anxiety (Pekrun, 2005) and their effect on learning performance tasks (Brand et al., 2007). While in most cases negative emotion appears to hinder learning (Brand et al., 2007; D’Mello & Graesser, 2012; Pekrun, 2005), certain negative emotions such as mild anxiety were shown to boost it (D’Mello et al., 2014; Stark et al., 2018).

However, most research has focused on positive emotions, looking to find whether they have an effect on learning, and whether this effect is a beneficial or detrimental one. For ethical reasons, manipulating positive emotions seems the most promising avenue for research and practice.

Several studies indicated that positive emotions facilitate cognitive processing and learning, due to the fact that they enhance motivation (Efklides et al., 2006; Isen et al., 1987; Pekrun, 2006). Others found that even positive emotions can hinder learning, as they can increase the cognitive load and distract the student (Oaksford et al., 1996; Plass et al., 2010; Seibert & Ellis, 1991).

A number of authors investigated the effect of manipulating emotions in learning situations through external mood induction—that is, through factors extrinsic to learning such as reading emotionally charged or neutral statements (Park et al., 2015; Um et al., 2012) or watching videos before starting learning (Plass et al., 2014), and found that positive emotions facilitated learning on some of the measures. Um et al. (2012) used a 2 x 2 factorial experimental design, in which college students received a computer-based lesson on immunization, and were assigned to 4 conditions: external mood induction (positive vs. neutral) and emotional design induction (positive vs. neutral). They found that external induction increased transfer, but not comprehension. Plass et al. (2014) replicated this study but used two 2-minute videos to induce positive or neutral emotions, and had similar results. Park et al. (2015) conducted a similar study and found that the group that received a self-referencing positive mood-induction procedure achieved higher learning outcomes than the control group, in both comprehension and transfer.
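For readers who want to see how such a 2 x 2 factorial design translates into an analysis, the sketch below shows one conventional way to test the two main effects and their interaction with a two-way ANOVA. This is not the analysis reported by Um et al. (2012) or used in the present study; the file name and column names are hypothetical.

```python
# Hypothetical sketch: two-way ANOVA for a 2 x 2 design
# (mood induction: positive/neutral x emotional design: positive/neutral).
# Assumes a CSV with one row per participant and the columns
# 'induction', 'design', and a numeric 'transfer' score.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

data = pd.read_csv("emotional_design_scores.csv")  # hypothetical data file

# Model transfer scores with both factors and their interaction.
model = ols("transfer ~ C(induction) * C(design)", data=data).fit()

# Type II sums of squares: main effect of induction, main effect of design,
# and the induction x design interaction.
print(sm.stats.anova_lm(model, typ=2))
```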

Finally, Plass et al. (2019) investigated how color, shape, expression, and dimensionality of game characters influence emotions in digital instructional games, in four studies. In the first three studies, the researchers asked participants (adults and adolescents) to report their emotional states after being shown a series of game characters: which character made them feel saddest, happiest, and most neutral? Participants in the fourth study had to complete two surveys on their emotional states as well as think-aloud protocols. Figure 2 shows the types of game characters used in Study 1, with differences on the color, expression, and shape attributes, and types of game characters used in Study 3, with differences on the color, expression, and dimensionality attributes. The findings indicated that expression and dimensionality had the strongest effect, color had a medium effect, and shape had a small to medium-sized effect.

Figure 2. Game characters from Plass et al. (2019). This figure shows the game characters used in the study by Plass et al. (2019, Sections 3.1.2. and 5.1.2.), depicting differences on the color, expression, and shape attributes (left) and differences on the color, expression, and dimensionality attributes (right). Permission for reproduction not required for up to 3 figures by Elsevier.

Thus, these studies suggested that positive emotions can influence learning performance, and that further study is necessary in order to understand the best ways to utilize this effect. Mayer (2019) presented “an affective-cognitive model of academic learning which incorporates both affective processing and cognitive processing during learning: the learning episode causes an emotional reaction in the learner that affects cognitive processing during learning and leads to a learning outcome” (Section 1) (Figure 3).

Figure 3. An affective-cognitive model of academic learning. This figure presents a model of learning proposed by Mayer (2019, Section 1). Permission for reproduction not required for up to 3 figures by Elsevier.

Mayer (2019) explained that the causal links between the four boxes are yet to be established; he highlighted recent evidence from a large-scale meta-analysis of emotions in technology-rich learning environments which suggests links between affective processing and cognitive processing, and between affective processing and learning outcomes (Loderer et al., 2018). Correlations between positive emotions and learning outcomes and negative correlations between negative emotions and learning outcomes were also found in a study by Duffy et al. (2018). Mayer (2019) also summarized key findings on emotions in academic learning, with a focus on e-learning, and explained that the main emotions related to learning, as identified by the literature, are negative activating emotions (e.g. anxiety, frustration) and positive activating emotions (e.g. enjoyment). Positive emotions seem to have the best research potential, and most studies, including emotional design studies, have focused on them.

Theories of Learning

Clark and Mayer (2016) defined learning as “a change in the learner’s knowledge due to experience” (p. 32). According to these authors, learning is a process through which the learner, as active sense-maker, builds mental representations; hence, effective instruction involves not only presenting information, but also stimulating appropriate cognitive processing by the learner. Instruction is “the training professional’s manipulation of the learner’s experiences to foster learning” (Mayer, 2011, p. 33). The learning theories that help us understand the role of visual elements in the design of instructional materials are the Cognitive Load Theory and the theories of Multimedia Learning.

The Cognitive Load Theory. Cognitive Load Theory (Sweller, 2011) asserts that people learn best when learning conditions are aligned with human cognitive architecture, and seeks to integrate our knowledge of cognitive processes with instructional design principles. In Cognitive Load Theory (CLT), long-term memory is conceptualized as a very large store of information—this store is what constitutes expertise and allows people to recognize situations, solve problems, and obtain fluency in a skill. The contents of the long-term memory are schemas—cognitive structures that constitute the knowledge base.

The aim of instruction is to induce appropriate alterations to this information and to assist learners in acquiring fluency (Paas & Sweller, 2014). We obtain information from other people—such information is accumulated through borrowing and reorganizing, and this is achieved by imitating, listening, and reading.

The system that assists these processes is the working memory, which is limited in the amount and duration of new information it can hold because its function is to select the most relevant information from the environment (Paas & Sweller, 2014). These limitations gave rise to the concept of cognitive load in instruction: the learners’ ability to process new information and to construct knowledge that gets stored in long-term memory is vulnerable to the information processing load that happens during learning tasks (Sweller et al., 2019). According to Paas and Sweller (2014), there are three types of cognitive load (CL): intrinsic, extraneous, and germane. Intrinsic CL is caused by the nature and difficulty of the task; since it is intrinsic to the content, it cannot be changed through instructional design. Extraneous CL is produced by how the information is presented and by the instructional task—thus, instructional design that includes unnecessary elements for the students to process will increase extraneous CL. Germane CL is the effective CL required to learn, the working memory resources that deal with intrinsic CL, rather than extraneous CL. “Germane cognitive load redistributes working memory resources from extraneous activities to activities directly relevant to learning by dealing with information intrinsic to the learning task” (Sweller et al., 2019, p. 264). Hence, effective instructional design needs to decrease the amount of extraneous CL and ensure that the learners’ cognitive resources can be focused on the task itself.
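One compact way to summarize this relationship, a shorthand often used in the CLT literature rather than a formula stated by the sources cited above, is to treat the loads as additive against a fixed working memory capacity:

\[ \text{Intrinsic CL} + \text{Extraneous CL} \leq \text{Working memory capacity} \]

with germane resources understood as the share of available capacity actually devoted to dealing with intrinsic CL. Under this reading, reducing extraneous CL frees capacity that can be redirected to learning.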

Multimedia learning. Multimedia is defined as a presentation containing both words and pictures (the words can be spoken or printed, and the pictures can be static or dynamic: photos, video, animations, etc.). In other words, it is comprised of information in both verbal and pictorial form. Multimedia learning refers to the process of building mental representations (constructing knowledge) from these words and pictures. The multimedia principle states that students learn better from presentations that include both words and pictures, rather than words alone or pictures alone (Mayer, 2014c). Within the current landscape of instructional content variety and complexity, the multimedia principle refers to “learning supported by varied forms of visual and verbal content when presented in combination” (Butcher, 2014, p. 174). The visuals can include photographs, diagrams, videos, animations, simulations, three-dimensional models etc.

It is important to note that supporting learning content with visuals is not meant to help “visual learners”; the learning styles theory has been debunked by research (Coffield et al., 2004; P. A. Kirschner, 2017; Newton & Miah, 2017; Pashler et al., 2008). The studies underlying the multimedia principle show that adding visuals to text is helpful for learners in general, as they support deeper learning by facilitating the development of mental models, knowledge integration and application, and transfer (Butcher, 2006; Cuevas et al., 2002; Mayer et al., 1996, 2005).

When designing multimedia instruction, both the learners’ prior knowledge and the type and complexity of the learning materials need to be carefully assessed. Butcher (2014) explained that, for learners with limited prior knowledge, visual representations should be kept to essential components and well-organized structures. Additionally, when motion is important in the instructional concepts, animation should be included; however, if the learners are able to mentally animate the materials themselves, static images may be more effective because they facilitate more active processing (Hegarty et al., 2003). Whether the images are static or dynamic, visual cues should be added whenever possible to direct the learners’ attention and support meaningful processing of the visual elements (Boucheix & Guignard, 2005; de Koning et al., 2010). Whenever possible, it is best to provide learners with both concrete and more abstract representations of the concept or object (Goldstone & Son, 2005; Moreno et al., 2011).

When presenting instructional materials in digital environments, another important consideration is interaction. According to Butcher (2014), interactive elements have the potential to guide the students’ attention, thus facilitating processing of concepts. Effective interactions can also help the integration of visual and verbal content (Bodemer et al., 2004). Requiring learners to generate their own visual models has been shown to improve learning outcomes by fostering active learning (Gadgil et al., 2012; Grossen & Carnine, 1990), and digital environments offer the possibility to easily compare the learner’s model with a high-quality one.

While the benefits of enhancing text materials with visuals are well documented, using visuals for learning might not come easily to all students. Some studies have shown that many students miss the advantages of visuals because they skip over the graphics or engage with them in a superficial manner (Cromley et al., 2010; Moore, 1993). They also do not know how or when to generate their own diagrams (Uesaka et al., 2007). However, training students on how to use visuals has been effective (Moore, 1993), and providing adequate guidance can also help overcome this issue, especially in a digital environment where signaling and interaction can ensure optimal sequencing and highlighting of content.

Clark and Feldon (2014) defined multimedia in a narrower sense, as “the capacity of computers to provide both realistic and constructed audio and visual presentations in a desired combination of text, audio, still images, video or animation” and consider multimedia programs as allowing learners to “interact with the computer and influence the pace, sequence, and content of the presentations they receive” (p. 152). After reviewing the existing research, they warned about a number of multimedia learning principles that are commonly believed to be true, but are not supported by empirical evidence. While their findings apply only to the digital environment, and mostly to self-study media, they are important considerations for instructional design; hence, I summarize them here:

• computer multimedia is not by itself more effective than older media (e.g. textbooks) or live teaching by an instructor, but what makes the difference is the instructional strategies employed;
• multimedia does not increase motivation, and the fact that students simply “prefer” a medium does not improve learning;
• animated pedagogical agents do not add any benefits for learning;
• multimedia does not owe its effectiveness to the fact that it accommodates different “learning styles;”
• discovery-based multimedia programs are beneficial for students with high levels of prior knowledge, while novice to intermediate students need fully guided instruction;
• having control over the pacing of a lesson is helpful for learners, while control over the order, tasks, and support is mostly detrimental; however, students with high prior knowledge are an exception and do better with higher control;
• multimedia in itself does not promote higher order thinking;
• multimedia does not offer any benefits of “incidental learning;”
• the interactivity allowance of multimedia does not necessarily promote learning; rather, there are specific types of interaction that are effective, and they are not exclusive to multimedia;
• creating more “authentic” learning environments with multimedia does not necessarily promote transfer to real life situations.

While these conclusions may be debatable—for instance, a simulation environment does provide different and superior opportunities for instruction compared to other media, and studies have shown that the design of multimedia environments does influence motivation, as will be discussed later—it is helpful to refer to them when developing digital media. They can help us avoid what Clark and Feldon (2014) called “the root” of the misconceptions: conflating the medium with individual differences, instructional content, and instructional methods. When designing instruction, it is important to remember the role of each of these elements and be clear about what they bring to the table.

The Cognitive Theory of Multimedia Learning. The Cognitive Theory of Multimedia Learning (CTML) proposes that learning is an active process of filtering, selecting, organizing, and integrating information based on prior knowledge. These cognitive processes rely on the learner’s working memory, which has limited capacity; therefore, it is essential to optimize them through the appropriate presentation of stimuli (Mayer, 2014). Figure 4 shows the CTML model.

Figure 4. The Cognitive Theory of Multimedia Learning. This figure shows how learning takes place according to CTML (Mayer, 2014, p. 52). Permission for reproduction was obtained from Cambridge University Press.


CTML is based on a model of the human information processing system with three memory stores: sensory memory, working memory, and long-term memory. This model was introduced by Baddeley and Hitch (1974) and is currently the most widely accepted model of memory. Thus, the stimuli from a multimedia presentation (images and words) enter sensory memory, where they are briefly held as visual and auditory images. Then, relevant selected visual and sound images enter working memory, where they are organized into verbal and pictorial models. The third store is the long-term memory, which is practically unlimited and can hold material for a long time; however, this material also needs to be brought into the working memory in order to think about it actively, and to integrate new knowledge with the old.

CTML is grounded on three assumptions about how the human mind works: dual channels, limited capacity, and active processing. The dual-channel assumption is that we have separate channels for visual / pictorial and auditory / verbal information processing.

However, we can convert the representation acquired through one channel for processing in a different channel (for example, words describing a landscape can turn into an image of that landscape). The limited-capacity assumption is that the amount of information that can be processed in one channel is limited; this results in the need to make decisions on which information to pay attention to, how to connect the different pieces of information together, and how much of this information to integrate with existing knowledge.

However, thanks to the two channels, we can accommodate more information as long as it is presented through different modalities—that is, pictorial information does not consume the resources devoted to verbal information—hence the superiority of learning materials that combine both (Mayer, 2014a). Norman (2013) also stressed the use of multiple sensory modalities in product design as a way to mitigate the limited capacity of working memory: he advised presenting information through different modalities such as sight, sound, haptics, spatial location, etc. This increases the efficiency of cognitive processing and minimizes the chance of errors.

The active-processing assumption is that we construct a mental representation of our experiences through active cognitive processing. This results in a mental model that is comprised of the key parts of the material that was presented, and the relationship between them. The processes involved in this active learning are: selecting relevant material and bringing it into working memory (through paying attention); organizing the selected information within working memory by building structural relations between elements; and integrating cognitive structures with each other and with prior knowledge which is activated from long-term memory (Mayer, 2014).

According to Mayer (2014), there are three kinds of demands on cognitive processing capacity: extraneous processing, essential processing, and generative processing. Extraneous processing is the processing of unnecessary elements, which are not relevant for the learning goal; it is produced by poor instructional design and does not result in knowledge being constructed. Examples of unnecessary elements are irrelevant pictures or the act of looking back and forth trying to find a piece of information.

Essential processing refers to the cognitive processing of the instructional content (selecting and organizing relevant information) and is caused by the nature and difficulty of this content. The result of essential processing is the construction of the verbal and pictorial representations in working memory. Generative processing is processing that supports deeper understanding of the material by reorganizing information and integrating it with prior knowledge, and depends on the learner’s motivation to make sense of the instructional material. Since the learner’s working memory capacity is limited, it is important to use cognitive resources on essential and generative processing and to minimize extraneous processing; this is achieved through good instructional design of the learning materials and tasks.

Research on educational techniques has yielded a number of CTML principles of instructional design that help ensure optimal cognitive processing and utilization of working memory capacity during learning. Following is a summary of the main principles that refer to the visual design of learning materials.

The multimedia principle. The multimedia principle states that people learn better from a combination of words and pictures than from words alone (Butcher, 2014; Halpern et al., 2007; H. Pashler et al., 2007). The pictures should be considered from the very beginning of the design process, and designers need to ensure that the words and pictures work well together to deliver the instructional content (Clark & Mayer, 2016).

Clark and Mayer (2016) explained that this combination is effective because it engages students in active learning: they need to create mental representations of the material in words and pictures and make connections between the verbal and pictorial representations. This is conducive to meaningful learning and supports understanding, whereas using words alone is more likely to result in shallow learning.

Moreover, using both words and pictures makes more efficient use of the capabilities of working memory. It is believed that working memory has separate channels for verbal and visual information, and using both channels helps avoid overloading the limited capacity of working memory. Consequently, combining words and pictures optimizes processing (Butcher, 2014).

Clark and Mayer (2016) described six functions that graphics can serve:
• Decorative graphics: these images do not add to the instructional content – for instance, a picture of a puppy in a lesson about types of dog food.
• Representational graphics: these pictures show a single element, for instance a photo of an octopus with the caption “octopus.”
• Relational graphics: these graphics show a quantitative relationship between variables, for example a bar graph showing demographic data of a population.
• Organizational graphics: these show relationships between elements, for instance an object with labels of its parts.
• Transformational graphics: these graphics illustrate changes in an object over time, such as the stages of a cooking recipe.
• Interpretive graphics: these show relationships and processes that are invisible, for instance the wind in a weather simulation.
According to Clark and Mayer (2016), decorative and representational graphics should be used sparingly, and designers should focus instead on the other types of graphics, which support understanding and organization of the material.

The contiguity principle. The contiguity principle states that text and images should be presented in an integrated manner by placing the words next to the graphic that they describe. This minimizes the need to figure out which parts of the image correspond to which words, thus reducing the extraneous cognitive load and allowing the learner to spend cognitive resources on understanding the learning material. Ideally, the words should be embedded in the graphic—for example, the name of a part of an object should be placed near that part, rather than underneath the graphic (Clark & Mayer, 2016). In e-learning, mouse-overs (text that appears when you mouse over a certain part of the image) can be used to save space. Also in e-learning, other instances of this principle are placing the feedback for a question on the same page as the question, so that the learner does not have to page back and forth between question and answer or feedback, and placing directions for an exercise on the same page as the exercise itself (Clark & Mayer, 2016).

If we place words and their corresponding pictures separately, making the connections between them takes more time and effort and wastes precious cognitive resources, leading to cognitive overload. This has also been called split attention (Ayres & Sweller, 2014). If, on the other hand, we place them together, they can be more easily held in the working memory and thus learning is optimized.
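
To make the contiguity principle concrete for a digital lesson, the following TypeScript sketch places each label directly on the part of a diagram it names, rather than in a caption below the image; the element id, label texts, and pixel coordinates are hypothetical and serve only as an illustration.

// A minimal sketch of the contiguity principle in an e-learning page:
// each label is positioned on the part of the diagram it names,
// instead of in a caption below the image. Ids and coordinates are hypothetical.
interface PartLabel {
  text: string; // the word(s) describing a part of the graphic
  x: number;    // horizontal offset within the diagram, in pixels
  y: number;    // vertical offset within the diagram, in pixels
}

function embedLabels(diagramId: string, labels: PartLabel[]): void {
  const diagram = document.getElementById(diagramId);
  if (!diagram) return;
  diagram.style.position = "relative"; // so labels can be absolutely placed inside it

  for (const label of labels) {
    const span = document.createElement("span");
    span.textContent = label.text;
    span.style.position = "absolute";
    span.style.left = `${label.x}px`;
    span.style.top = `${label.y}px`;
    diagram.appendChild(span); // the word sits next to the part it describes
  }
}

// Usage: labels placed directly on a (hypothetical) diagram of a bicycle pump.
embedLabels("pump-diagram", [
  { text: "handle", x: 40, y: 10 },
  { text: "piston", x: 60, y: 80 },
  { text: "valve", x: 70, y: 150 },
]);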

The modality principle and the redundancy principle. The modality principle is most relevant in online learning environments. It recommends using audio narration, rather than printed text, to describe visuals when the two are presented simultaneously.

This, again, avoids overloading the visual / pictorial channel and increases ease of processing. However, the words should be made available to support memory, especially when there is difficult or unfamiliar vocabulary, and for accessibility reasons. Moreover, it is beneficial to include key words and highlight relevant information (Clark & Mayer, 2016).

The redundancy principle is related to the modality principle. It states that, when audio narration is provided, simultaneous on-screen text should be avoided. This, again, is to reduce the load on the visual channel of the working memory. Research has shown that, when both audio and visual text are provided to accompany a graphic, the learners pay too much attention to the visual text and neglect the graphic (Clark & Mayer, 2016).

Moreover, learners will spend cognitive resources trying to match the visual text with the audio narration (Mayer & Fiorella, 2014).

Some exceptions to these two principles are when the pace of the presentation is very slow, when only some key words or a summary appear as on-screen text, or when it is very hard for the learners to follow or understand the spoken text—such as learners with disabilities or second language learners (Clark & Mayer, 2016).

The coherence principle. The coherence principle states that any material that does not support the instructional goal should be minimized or eliminated. Sometimes designers add text or graphics that are entertaining or ornamental, for example a picture of a person riding a bike in a lesson about how a bicycle pump works (Clark & Mayer, 2016). However, this also takes away from the limited processing capacity of the working memory. One type of such material is the so-called seductive details, which are interesting but irrelevant elements (Garner et al., 1989); for instance, in the above-mentioned lesson, a seductive detail could be a text box on how the bicycle was invented, or a picture of an innovative type of bike. These extraneous elements may interfere with the learning process by distracting the learner and using up cognitive resources; hence, it is recommended to avoid trying to “spice up” a course by including such materials.

Richey et al. (2011) talked about “noise,” which refers to stimuli that can distort a message and confuse the learner.

However, some researchers have challenged the “seductive details” effect and the effect of decorative images. Schneider, Dyrna, et al. (2018) examined the effect of decorative pictures in three experiments with secondary school and university students, using instructional texts about South Korea (in two of the experiments) and the human body (in the third one). The pictures were positively or negatively charged, and weakly vs. strongly connected to the text. The results revealed that learning outcomes were higher for materials with positive or strongly connected pictures in comparison with negative or weakly connected pictures; moreover, strongly connected positive pictures promoted learning while negative, weakly connected pictures impaired learning when compared with a control group. These findings suggest that decorative pictures can be beneficial for learning as long as they have a positive affective charge and are strongly connected with the instructional content.

In another recent study, Kühl et al. (2019) examined the effects of text-based seductive details on learning outcomes. In two experiments, university students took part in instruction in one of four conditions—no seductive details, details with positive emotional valence, details with negative emotional valence, and neutral details. They found no differences between the groups, suggesting that there was no seductive detail effect. The authors stated “Considering that we used established material but did not find a seductive details effect, we cautiously want to hint to the possibility that the harmful effect of seductive details on learning may be somewhat overestimated due to a publication bias” (Kühl et al., 2019, p. 57).

Related to the coherence principle, it is also advisable to use simpler visuals rather than graphics with a lot of detail. In general, a two-dimensional drawing is preferable to a three-dimensional photograph, a series of static images is more effective than an animation, and a computer-generated animation that depicts essential elements is better than a video (Butcher, 2006; Mayer et al., 2005; Scheiter et al., 2009).

The signaling principle. The signaling (or cueing) principle states that “multimedia learning materials become more effective when cues are added that guide the learners’ attention to the relevant elements of the material or highlight the organization of the material” (Van Gog, 2014, p. 263). Cueing can take the form of guiding text, or visual elements such as colors, arrows, icons, flashing, shading, zooming in and out, etc.

This technique helps especially with the first step of cognitive processing (selecting information), but it also supports the subsequent steps of organization and integration. Research has shown that learners direct their attention towards visually salient features, even when they are not the most relevant to the task (Lowe, 1999, 2003). Consequently, the visual design of learning material needs to ensure that relevant elements are appropriately signaled, in order to optimize the cognitive load.
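
As a rough illustration of how signaling might be implemented in an e-learning screen, the TypeScript sketch below highlights the element currently being explained and dims the others; the element ids and the cue color are hypothetical.

// A minimal sketch of the signaling principle: when a step of the lesson is
// narrated, the corresponding element of the graphic is highlighted and the
// rest is de-emphasized, guiding attention to the relevant part.
function signalStep(currentStepId: string, allStepIds: string[]): void {
  for (const id of allStepIds) {
    const el = document.getElementById(id);
    if (!el) continue;
    if (id === currentStepId) {
      el.style.outline = "3px solid #e07b00"; // salient warm cue on the current element
      el.style.opacity = "1";
    } else {
      el.style.outline = "none";
      el.style.opacity = "0.4";               // dim elements not under discussion
    }
  }
}

// Usage: cue the second stage of a (hypothetical) three-stage process diagram.
signalStep("stage-2", ["stage-1", "stage-2", "stage-3"]);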

These multimedia learning principles help us minimize extraneous processing, manage essential processing, and promote generative processing in learning. They are not meant to be absolute rules, but rather guiding principles whose application may need to be adapted, depending on the learning context and the level of the students. More research is needed to define the conditions in which these techniques work best and the conditions in which different approaches are better suited. Nevertheless, following the above principles when designing multimedia can help ensure that extraneous processing is minimized and that the learner’s cognitive resources are spent on essential and generative processing. Thus, they need to be taken into account when designing learning materials and environments.

The Cognitive Affective Theory of Learning with Multimedia. In addition to cognitive processes, factors that impact the students’ attention, motivation, and emotion also need to be considered. The Cognitive Affective Theory of Learning with Media (CATLM) is a slightly different version of CTML, proposed by Moreno (2006). It introduced the idea that learning is mediated by emotion and motivation (the affective assumption), by self-regulatory skills (the metacognitive assumption), and by differences in the learners’ prior knowledge and characteristics (the individual differences assumption) (Moreno, 2007; Moreno & Mayer, 2007; Park et al., 2015). The emotional design hypothesis is related to the affective assumption: emotions impact cognitive processes by guiding the learners’ attention and by increasing motivation, thus sustaining cognitive processing (Mayer & Estrella, 2014). Figure 5 shows the CATLM model, in which affect is depicted as influencing the learning process at multiple levels: the selection of information from the sensory memory, the organization of the information in the working memory, and the retrieval of information from the long-term memory and integration with the new material.

Figure 5. The Cognitive Affective Theory of Learning with Media. This figure shows how learning works according to CATLM (Moreno & Mayer, 2007, p. 314). Permission for reproduction was obtained from Springer Nature.

Recently, this model has been extended by Plass and Kaplan (2016) into the Integrated Cognitive Affective Model of Learning with Multimedia (ICALM). ICALM includes affective processes and takes into account the effect of the learning environment. Figure 6 shows the ICALM model.


Figure 6. The Integrated Cognitive Affective Model of Learning with Multimedia. This figure shows how learning works according to ICALM (Plass & Kaplan, 2016, p. 150). Permission for reproduction not required for up to 3 figures by Elsevier.

The authors explained that “affective processes are intertwined with, and inseparable from cognitive processes” and that these affective processes “make demands on cognitive resources, and vice-versa” (Plass & Kaplan, 2016, p. 150). In the same way that effective instructional design needs to minimize extraneous cognitive load, it must also minimize emotional overload. However, if emotions are at too low a level, learning is also impacted negatively; hence, designers of learning materials have to make sure they foster positive emotions in order to stimulate the learners’ motivation and engagement.

As Plass and Kaplan (2016) explained:

In order to fully understand how we process the world around us, we need to consider our affective responses to the information we perceive. This is especially important for the designers of digital educational materials, as these materials offer many important opportunities to incorporate emotional considerations. (p. 131)

Visual Design and Learning

In this section, I will start by examining what effective product design means and how this applies to instructional materials. Then, I will introduce the concept of emotional design. Finally, I will analyze the role of aesthetics in learning and outline the main principles of graphic design.

Effective design. Educational materials have to be created according to sound instructional design theories and practices. However, as instructional products, they also need to follow principles of good product design. According to Norman (2013), products that are well designed “fulfill human needs while being understandable and usable” (p. 4) and ideally are also “delightful and enjoyable, which means that not only must the requirements of engineering, manufacturing, and ergonomics be satisfied, but attention must be paid to the entire experience, which means the aesthetics of form and the quality of interaction” (p. 4). In his book The Design of Everyday Things, Norman (2013) explained how to optimize the experience that users have with objects and reviewed some common design mistakes. His focus was on industrial, interaction, and experience design; hence, his advice is relevant for educational products, in particular for those involving the learners’ interaction with technology.

Norman (2013) recommended human-centered design: when we design a product, we need to start with human needs, capabilities, and behavior. Communication between the person and the technology needs to work smoothly, especially when things go wrong.

In other words, it is essential to make the design highly intuitive, so that the user does not need to spend time and mental energy figuring out how something works. This is consistent with the tenets of the cognitive theory of multimedia learning (Mayer, 2014a), as well as with cognitive load theory (Sweller, 2011), which state that extraneous cognitive load or extraneous processing has to be minimized or eliminated if possible.

An essential feature of a product, from the point of view of user experience, is discoverability, which means the ability to discover what something does, how it works, and what the user can do with it. Discoverability is achieved by applying six concepts: affordances, signifiers, constraints, mappings, feedback, and conceptual model (Norman, 2013). As these are highly relevant for the design of educational products, I will give a brief overview of each.

Affordances refer to the relationship between the characteristics of a product and the user’s capabilities, which determine how the product might be used. It is important to note that the properties of both elements—user and product—define what can be done with the product; this is something that product designers sometimes get wrong, because they fail to think of the user. An example would be the explanation of a concept in a learning material. If the explanation contains terms or ideas that novice learners are not familiar with, then the material does not offer the affordance of instruction to novice learners.

Once an affordance is designed, it is crucial that it is visible—that there are clear clues that show that it exists, and how to access it. These are called signifiers. For instance, we may offer the possibility to click on an image in an e-learning program in order to obtain more information. While the image may be clickable, the users may not realize this if it is not properly signaled. The change of the mouse pointer to a hand can help on a computer but may not be enough, and it is not useful on a mobile phone. In that case, a more obvious signifier such as a “Click the image to learn more” text could be necessary. In the same way, arrows that show that you can swipe left, right, up, or down on a webpage or app are signifiers.
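
A minimal sketch of such a signifier, assuming a browser-based lesson and hypothetical element ids, might look as follows in TypeScript: besides the pointer cursor (which is invisible on touch screens), a short text prompt is added next to the clickable image.

// A minimal sketch of adding a signifier to a clickable image. The prompt text
// and element id are hypothetical.
function addClickSignifier(imageId: string, promptText = "Click the image to learn more"): void {
  const image = document.getElementById(imageId);
  if (!image || !image.parentElement) return;

  image.style.cursor = "pointer";           // desktop-only cue

  const hint = document.createElement("p"); // explicit cue that works on any device
  hint.textContent = promptText;
  image.parentElement.insertBefore(hint, image.nextSibling); // placed right after the image
}

addClickSignifier("octopus-photo");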

Mapping is the relationship between the layout of the controls and the devices being controlled. If this relationship follows natural spatial analogies, it is much easier to understand and follow. This is more applicable to control of objects or machinery, but we could also have such mapping, for example, in the choice of keys for advancing a slideshow or for moving around a game.

Constraints are elements of design that limit the set of possible actions, thus helping the user figure out what to do next. In instructional design, an example would be Clark and Feldon’s (2014) finding that novice learners benefit from less control over the sequence of the lesson or the selection of tasks. A forcing function is a type of constraint where failing one step prevents advancement to the next step. This is also implemented in instruction by not allowing the learner to proceed with a module until a certain score on a previous module quiz or task has been achieved.
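
A forcing function of this kind can be sketched in a few lines of TypeScript; the Module type, the passing score, and the module content below are hypothetical and only illustrate the gating logic.

// A minimal sketch of a forcing function in an e-learning course: the next
// module stays locked until the learner reaches a passing score on the
// current module's quiz.
interface Module {
  title: string;
  quizScore: number | null; // null until the quiz has been attempted
}

function canAdvance(current: Module, passingScore = 80): boolean {
  // Failing (or skipping) the quiz prevents advancement to the next step.
  return current.quizScore !== null && current.quizScore >= passingScore;
}

const module1: Module = { title: "Cell respiration basics", quizScore: 65 };
console.log(canAdvance(module1)); // false: the learner must retake the quiz first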

Feedback is information on the results of an action. It needs to be just the right amount (not too much and not too little), immediate, and informative. For instance, when doing a task or answering questions online, learners need to be told whether what they are doing is working, or whether their answer is correct or incorrect. Moreover, the feedback needs to explain why their answer is right or, especially, wrong. This is consistent with another multimedia learning principle, the feedback principle, which states that explanatory feedback is more effective, especially for novice students, than corrective feedback alone (Johnson & Priest, 2014).
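
The difference between corrective and explanatory feedback can be illustrated with a small TypeScript sketch in which every answer option carries its own explanation, shown immediately and on the same page as the question; the question content and names below are invented.

// A minimal sketch of explanatory feedback: instead of only saying
// "correct / incorrect," each answer option carries a short explanation.
interface AnswerOption {
  text: string;
  correct: boolean;
  explanation: string; // the "why," shown along with the verdict
}

function feedbackFor(option: AnswerOption): string {
  const verdict = option.correct ? "Correct." : "Not quite.";
  return `${verdict} ${option.explanation}`;
}

const option: AnswerOption = {
  text: "Warm colors tend to attract attention",
  correct: true,
  explanation: "Saturated warm hues are visually salient, which is why they work as cues.",
};
console.log(feedbackFor(option));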

A conceptual model is a mental model of how the product we interact with works, and it should be easy to infer from the design of the product. This model helps the user understand what is happening, predict how things will behave, and figure out what to do when they encounter problems. The model does not need to be complete or accurate but rather to be helpful for the user. For instance, in a Learning Management System, a conceptual model of a collection of folders with files can help the learner find their way around. When there is not enough information to form a helpful conceptual model, people will create an erroneous one, which causes unnecessary effort when trying to use that particular product.

Norman (2013) also talked about people’s tendency to blame themselves for failures when something goes wrong in interacting with a product, especially with technology. This can lead to learned helplessness, where people think they must be “technically inept.” However, very often the true cause is poor design. Norman (2013) gave advice on how to avoid this, including design techniques such as eliminating error messages and replacing them with help and guidance, and making it possible to correct problems directly from the help and guidance messages instead of interrupting users and making them start over. Moreover, the thorough implementation of affordances, signifiers, constraints, mapping, and feedback should minimize the possibility of errors and maximize their quick rectification. Such techniques could be implemented in the design of digital learning materials and environments, to avoid inducing a negative affective state in the learners, as well as wasting their time and mental effort.

These days, more and more instructional materials are in the form of websites, mobile apps, or e-learning modules. These platforms need to be designed in such a way as to minimize extraneous cognitive load and maximize generative processing. That means making sure that the learner’s efforts are spent on understanding the instructional material, and not on figuring out how to use the website or app. Along with learning theories and effective design best practices, research and practice in the field of user experience design can help inform the design of learning materials.

Krug (2014) explained that, in order for a website or app to be easy to use, the most important principle can be stated as “don’t make me think.” That may sound like a strange principle in an educational context, but what Krug referred to is precisely the need to avoid wasting the users’ cognitive resources on figuring out how that particular platform works (thus reducing cognitive load), and to make them feel comfortable using that product (enhancing generative processing). When looking at a web page or app, it should be, as much as possible, obvious what information is on there, how it is organized, what can be clicked on, or where to start; this way, the user can focus on the task at hand. This can be achieved through well-chosen names, the appearance of things—such as size, color, or layout—and occasionally through small amounts of explanatory text.

When designing a web page or app, we need to take into account how users usually interact with them. According to Krug (2014), people do not read everything thoroughly, analyzing the organization of the page and making careful decisions. Instead, they “glance at each new page, scan some of the text, and click on the first link that catches their interest or vaguely resembles the thing they’re looking for” (Krug, 2014, Chapter 2). Therefore, in an educational product, the design needs to ensure that their scanning is guided to promote the instructional goal; this is consistent with the signaling principle of multimedia learning and Lowe’s (1999, 2003) findings that learners’ attention is caught by visually salient elements, even when those do not support the task.

Krug (2014) provided a few guidelines for ensuring that the users effortlessly see and understand what we want them to:

• Use conventions: Using standardized patterns makes it easier to see them quickly, and to know what to do—for example, the “hamburger” menu on an app, the layout for a specific type of website, or the position of the navigation elements.
• Create effective visual hierarchies: The visual cues should represent the actual relationships between the things on the page—for instance, the more important elements are larger, and the parts that are connected are grouped together on the page or designed in the same style. This saves the user effort in the selection and organization processes in the working memory.
• Separate the content into clearly defined areas: If the content is divided into areas, each with a specific purpose, the page is easier to parse, and the user can quickly select the parts that are the most relevant to them.
• Make it obvious what is clickable: Figuring out the next thing to click is one of the main things that users do in a digital environment; hence, it is essential that the design makes this process painless. This can be done through shape, location, or formatting—for example, one important rule is to use a distinctive color for all clickable text (see the sketch following this list).
• Eliminate distractions: Having too much complexity on a page can be frustrating and impairs the users’ ability to complete their task effectively. Thus, we need to avoid having too many things that are “clamoring for your attention” (Krug, 2014, Chapter 3). This is consistent with the coherence principle of multimedia learning, which states that elements that do not support the learning goal should be kept to a minimum, and that clutter should be avoided. Additionally, we should aim for a clean and organized look by using proper alignment.
• Format text to support scanning: Users often need to scan pages in order to find what they are looking for. This is similar in a learning environment. There are a few things we can do towards this goal: include many well-written headings, with clear formatting differences between the different levels and appropriate positioning close to the text they head; make the paragraphs short; use bulleted lists; and highlight key terms.
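
As a rough sketch of how two of these guidelines (a single distinctive color for clickable text, and highlighted key terms to support scanning) might be applied to a lesson page, consider the following TypeScript; the selectors, color choice, and key terms are hypothetical.

// A minimal sketch of two of Krug's guidelines applied to a lesson page:
// one consistent color reserved for clickable elements, and bolded key terms
// so the learner can scan the body text.
const CLICKABLE_COLOR = "#0055cc"; // hypothetical color used only for clickable text

function styleClickables(): void {
  document.querySelectorAll<HTMLElement>("a, button.inline-link").forEach((el) => {
    el.style.color = CLICKABLE_COLOR;
    el.style.textDecoration = "underline";
  });
}

function highlightKeyTerms(terms: string[]): void {
  document.querySelectorAll<HTMLElement>("p.lesson-text").forEach((p) => {
    for (const term of terms) {
      // Bold each exact occurrence of the key term so it stands out while scanning.
      p.innerHTML = p.innerHTML.split(term).join(`<strong>${term}</strong>`);
    }
  });
}

styleClickables();
highlightKeyTerms(["working memory", "cognitive load"]);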

Besides following these guidelines to improve the ease of using a website or app, enhancing its attractiveness has also been shown to improve its effectiveness.

Studies have shown that perceived aesthetics strongly correlates positively with perceived usability in product and software use (Hassenzahl, 2004; Kurosu & Kashimura, 1995; Parizotto, 2007; Tractinsky et al., 2000). Emotional design is the area of design that focuses on aesthetics and on producing a pleasurable emotion in the user.

The concept of emotional design. In The Design of Everyday Things, Norman (2013) stressed the importance of emotion for the overall experience. He explained that cognition and emotion are linked and interdependent, and therefore the designers must consider them both when they design a product. This mirrors the statements of Plass and Kaplan (2016), who included affective states in the cognitive theory of multimedia learning model. Furthermore, Norman (2013) explained that

Emotion is highly underrated. In fact, the emotional system is a powerful information processing system that works in tandem with cognition. Cognition attempts to make sense of the world: emotion assigns value. It is the emotional system that determines whether a situation is safe or threatening, whether something that is happening is good or bad, desirable or not. Cognition provides understanding: emotion provides value judgments. A human without a working emotional system has difficulty making choices. A human without a cognitive system is dysfunctional. (Chapter 2)

Norman (2005, 2013) described three levels at which the brain operates: visceral, behavioral, and reflective. The visceral level is the immediate, automatic, primitive reaction to a stimulus—it makes quick evaluations and alerts the brain. The behavioral level refers to those processes that control everyday behavior. The reflective level is the highest one—it monitors and reflects on experience and actions and can influence the behavioral level.

These levels work together, and the result is a person’s cognitive and emotional state. For a product design to be successful, it needs to take all these levels into account: for the visceral level, the first impression of a product is what matters (what it looks and feels like); at the behavioral level, usability and performance are the factors that affect a person’s experience with that product; the reflective level is the hardest to please because here the user’s emotion is influenced by interpretation, culture, education, and individual differences. In short, visceral design will address the appearance of a product, behavioral design will address the pleasure and effectiveness of use, and reflective design will address the product’s impact on self-image, personal satisfaction, and memories.

This connection between emotion and cognition and its role in the effective design of a product gave rise to the concept of emotional design, which Norman introduced and explored at length in his 2005 book Emotional Design: Why We Love (or Hate) Everyday Things. Along with making sure that the product fulfils a need, functions well, and is user-friendly, the emotional component also needs to be addressed: how does the product influence the user’s emotions? Norman (2005) asserted that “aesthetically pleasing objects enable you to work better” (p. 10) and “products and systems that make you feel good are easier to deal with and produce more harmonious results” (p. 10).

Emotional design has to do mainly with the visceral level of design. For example, finding symmetrical faces and bodies attractive is a visceral preference, likely due to evolutionary reasons. Likewise, there are general preferences for size, color, and appearance, even when these are shaped by culture. Since these preferences are so wired in us, the attractiveness of visceral design is timeless, unlike more sophisticated design. Visceral design is ubiquitous in advertising, objects for children, and folk arts and crafts. However, traditional product design shies away from visceral design in favor of behavioral and reflective-oriented design (Norman, 2005).

To highlight the impact of aesthetic design on usability, Norman (2005) gave the example of an experiment examining different layouts for ATMs. While they were all the same in function, some of them had attractive arrangements of buttons and screens, and others unattractive ones. The surprising result was that the attractive machines were perceived to be easier to use (Kurosu & Kashimura, 1995). This experiment, originally performed in Japan, was replicated in Israel, with even stronger results (Tractinsky et al., 2000).

Another study examined this relationship in a Virtual Learning Environment (VLE), and found that manipulation of aesthetics impacted the ratings of both aesthetics and usability attributes of the VLE. This was observed in three separate experiments (Parizotto, 2007).

Credibility is another factor that affects the emotional relationship between user and product, and that can be influenced by the attractiveness of the design. One study found that over 45% of consumers judged the credibility of a website based only on its design, assessing things like layout, typography, color scheme, and font size (Fogg et al., 2003). Robins and Holmes (2008) examined the connection between page aesthetics and the users’ judgment of the site’s credibility and found that a higher aesthetic treatment increased credibility—they named this “the amelioration effect of visual design and aesthetics on content credibility” (p. 397).

Perceptual fluency is the subjective sense of how easy or difficult information is to process (Oppenheimer, 2008), and its effect on cognitive processes has been studied extensively. Researchers have examined perceptual fluency in various disciplines and across different modalities (visual and auditory) and found that it may influence people’s judgments of the presented information. Specifically, these studies showed that when the stimuli were harder to perceive (such as difficult to read fonts, weak type-background contrast, unfamiliar pronunciation, hard to hear auditory content), participants perceived the content as more difficult, time-consuming, less intelligent, more risky, or less likely to be true (Reber & Schwarz, 1999; Sanchez & Jaeger, 2015; Schwarz, 2004; Song & Schwarz, 2009). Credibility is an important consideration in the design of learning materials as well, since getting “buy-in” from students greatly contributes to their motivation in engaging with the content. Hence, producers of instructional materials need to ensure the clarity and attractiveness of their products.

Aesthetics and learning. In the field of instruction, aesthetics has been defined as the elements of design that are meant to enhance the learner experience; this is in contrast with elements that fulfill purely pedagogical and technical needs, as required by learning objectives (P. Kirschner et al., 2004; Wilson, 2005). Aesthetics goes beyond these functional requirements, and its role is to provide a pleasurable experience. Some authors have argued that aesthetics heightens the learning experience and boosts intellectual activities; hence, it should be an essential part of instructional design (Parrish, 2005; Wilson, 2005). In synergy with solid usability and pedagogy, aesthetics plays an important role in the design of online learning environments (Löwgren & Stolterman, 2004; Sharp et al., 2007). While research on aesthetics and learning is scarce, findings from other fields strongly suggest a relation between aesthetics and cognition. Studies have shown that perceived aesthetics strongly correlates positively with perceived usability in product and software use (Hassenzahl, 2004; Kurosu & Kashimura, 1995; Parizotto, 2007; Tractinsky et al., 2000). A new evolving area, called neuro-aesthetics, is dedicated to the connections between aesthetics and cognitive processes such as comprehension, perception, and decision making (Peak et al., 2017).

However, the aesthetic aspect of learning materials design is often neglected by instructional designers, who tend to focus on functionality (Martin, 1986; Miller, 2011; Parrish, 2005). Ervine (2016) explained the importance of visual literacy in all disciplines, and especially for instructional designers, and called for higher education to include visual literacy skills in the curriculum. Estrada and Davis (2015) also discussed the role of graphic design in science communication and highlighted the need for visual literacy training for students of that discipline. Moreover, there has been little research on the importance of graphic design for instructional materials (Glore & David, 2012; Hancock, 2004). A better understanding of the connection between the aesthetic design of print or digital materials and learning outcomes can help teachers and instructional designers maximize the effectiveness, accessibility, and attractiveness of their instructional materials. Additionally, these insights can be useful for other materials producers, such as publishers of textbooks or e-learning resources. In recent years, several authors have investigated the concept of emotional design and its role in education, and this study aims to contribute to this growing body of knowledge.

Attractiveness can be subjective and culture dependent. However, applying a few basic graphic design principles can greatly enhance the visual aspect of our learning materials, and create products that look pleasing to most people. “The traditional role of design has been to improve the visual appearance and function of messages and information. […] In all these practices, designers use typography, photographs, illustrations, and graphic elements to construct messages that attract attention, cause us to think about their meaning, and stay in our memories over time” (The Professional Association for Design, 2017, para. 1). Williams (2015) described four principles that well-designed materials follow: contrast, alignment, repetition, and proximity.

Contrast. Various elements of the design (type, color, size) should be contrasted by making them very different. For example, we can contrast a large font heading with a small font subheading or paragraph, a light background in one section with a black background in another, or a very square font with a round, smooth one. This draws the reader’s eye into the page and guides them, organizes information, and clarifies the hierarchy. It also creates interest, making it more likely that the reader will give it their attention.

Alignment. Each item on the page should have a visual connection with another item, which creates a clean, organized look and a stronger cohesive unit. Thus, various parts of the material have to be aligned with other elements. “Even when aligned elements are physically separated from each other, there is an invisible line that connects them, both in your eye and in your mind” (Williams, 2015, Chapter 3, para. 3).

Repetition. Some elements of design should be repeated throughout the material, which results in consistency and unity. This can be done, for instance, by making all headings the same size and font, by using the same kind of line or bullet point, or by using the colors of a graphic to develop the color theme of the other items on the page.

Proximity. Related items should be placed close together, so that they become one visual unit. Examples are placing a caption closer to its corresponding image than to other elements on the page, or a heading closer to the paragraph it belongs to than to the text above it. This makes it clear that there is a relationship between them, which highlights the structure and enhances the organization of the material, while eliminating the need for the reader to figure out relationships.

These principles also parallel the guidelines of the cognitive theory of multimedia learning: the proximity principle supports the contiguity rule, and all four principles support the signaling guideline. Applying them not only promotes engagement by making the material look visually pleasing, but also optimizes communication by directing attention; consequently, they have a beneficial impact on the learning process.

Along with the four general principles, Williams (2015) discussed the use of color and type to enhance readability, improve organization, and increase attractiveness. She explained how we can create pleasing combinations by using the color wheel, and how to apply certain guidelines such as using hot colors (the ones with a lot of red or yellow) sparingly, for things that need to stand out. Malamed (2015) stated that color affects learning positively by improving visual discrimination, working as a signal, enhancing storytelling, and evoking emotion. She also argued that brightness and saturation are even more important than hue when it comes to impacting emotions, with bright and saturated (vivid) colors being more conducive to pleasure.

Typography greatly influences the visual aspect of a text. Williams (2015) recommended combining typefaces in a way that results in concordant designs (with similar characteristics) for a calm, formal look, or a contrasting design (with very different features) for a more interesting, eye-catching look. While in general we aim for high readability in order to minimize extraneous cognitive load, some research suggests that a certain level of “desirable difficulty” may aid recall because it causes more cognitive engagement, leading to deeper processing; hence, using more unusual fonts or fonts that are a bit harder to read can be beneficial for certain types of content (Diemand-Yauman et al., 2011; Wehr & Wippich, 2004).

While the above guidelines will help us develop clear and attractive learning materials, an important visual design consideration is accessibility. Since many people have some form of color blindness, it is advisable to avoid using color as the sole means of signaling, and to aim for color combinations that are easy to perceive. Similarly, since many people may have less-than-optimal eyesight, we need to ensure that they can also have access to the content. The Association of Registered Graphic Designers of Ontario (2010) recommended at least a 70% difference in color value between type and background, and specific ways to manipulate shape, scale, style, dimension, spacing, and alignment of typography in order to improve both legibility and readability. In digital environments, we should also provide affordances for the users to modify the size, typeface, and color of the on-screen material.
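
One possible reading of the 70% value-difference guideline can be sketched as a simple check in TypeScript, where “value” is computed HSV-style as the maximum RGB channel; the color values below are hypothetical, and this interpretation of the guideline is an assumption rather than the association’s own formula.

// A minimal sketch of checking that type and background differ in color value
// by at least 70%. "Value" here is taken as the largest RGB channel (HSV-style),
// normalized to the 0-1 range.
type RGB = { r: number; g: number; b: number }; // each channel 0-255

function colorValue(c: RGB): number {
  return Math.max(c.r, c.g, c.b) / 255; // 0 (black) to 1 (white)
}

function hasSufficientValueContrast(text: RGB, background: RGB, minDiff = 0.7): boolean {
  return Math.abs(colorValue(text) - colorValue(background)) >= minDiff;
}

// Dark grey text on a pale yellow background: value difference ≈ 0.82, so it passes.
console.log(hasSufficientValueContrast({ r: 40, g: 40, b: 40 }, { r: 250, g: 245, b: 220 }));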

Storytelling as an Instructional Design Technique

A story or narrative is defined as “one method of recapitulating past experiences by matching a verbal sequence of clauses to the sequence of events” and at a minimum a “sequence of two clauses which are temporally ordered” (Labov, 1972, pp. 359-360). According to Polkinghorne (1988), “…narrative is a scheme by means of which human beings give meaning to their experience of temporality and personal actions” (p. 13).

Andrews, Hull, and Donahue (2009) explained that a story helps instruction both directly through linguistic means (semantic structures) and indirectly by facilitating the construction of a mental model from the sequence of events. Thus, it promotes inquiry, decision-making, and learning by acting as an “attention-focusing mechanism” (Gerrig, 1993).

When it comes to instructional design, storytelling can be understood in two main ways. The first way is including stories, usually true ones, as examples for the content being explained. For example, a cross-cultural communications course may include stories of people going through culture shock or committing cultural gaffes. This type of storytelling is often used by instructors and presenters in order to illustrate their ideas.

The second way is weaving a narrative around the content by including characters and a plot. In interactive media, this sometimes includes the learner as a character themselves and requires them to make decisions or solve problems.

While both types of storytelling can serve to engage learners, it is the second type that is the focus of this research. This is because in recent years, under the influence of entertainment products such as movies and video games, instructional design has increasingly adopted such storytelling features. For example, an award-winning conference presentation by North (2019) about learner engagement featured a character—a detective called “Inspector Knowles”—who wants to solve the mystery of the disengaged learner. The detective uncovers clues and finally solves the case, while informing and engaging the audience with findings from the research literature. Tucker (2015), in a presentation to a client in the early stages of selecting a Learning Management System, started by introducing Anna, an instructional designer who spends too much time on administrative tasks, and Anna’s manager, who has big plans for their e-learning courses. Anna comes up with the solution of a new LMS and explains how it would help. An interesting example is a textbook called An Adventure in Statistics (Field, 2016), written as a science fiction book in which the main character needs to find his girlfriend, who has disappeared, and has to learn and use statistics in order to do so. This last example is a very complex and creative type of storytelling that most content creators usually do not have the time or talent for. Most often, storytelling takes the form of adding one or two characters and a simple plot, and it is especially used in e-learning.

A quick internet search yields a plethora of articles and videos from instructional designers, trainers, and e-learning developers expounding the benefits of using storytelling in learning design. An article from the E-Learning Industry, a popular online news and knowledge base, claimed that “The storytelling approach in corporate training creates higher retention and recall, leading to a memorable and sticky learning experience” (Section 3). The writer gave advice for using storytelling in corporate training and explained that stories are engaging, can simplify a complex subject, and help us remember. According to the author, storytelling used in corporate training usually follows a typical narrative that includes “1. A beginning (background or origin); 2. A middle (build-up through challenges); 3. An end (a logical conclusion with take-aways or that triggers the required action)” (Pandey, 2018, Section 2). Malamed (n.d.-b), in her blog The E-Learning Coach, outlined ten reasons why storytelling should be used in learning. Among these, we learn that “stories are the emotional glue that connects the audience to the message” (Section 1), “stories reshape knowledge into something meaningful” (Section 4), and “stories make people care” (Section 5). Learning Solutions, another popular learning portal, featured an article emphasizing the importance of storytelling for learning. The writer explained that humans crave connections with other humans, and therefore having characters and a plot in a learning material makes us pay more attention (Neibert, 2014). In their article Narrative structure, myth, and cognition for instructional design, Hokanson and Fraher (2008) claimed that using narrative in instructional design lowers the learner’s cognitive load, because people are so used to this structure from entertainment, history, and their own lives that they unconsciously expect it even in learning experiences.

Although including storytelling in instructional design has become a common practice, and it is often promoted online and at learning design conferences, empirical research has been lacking. Existing research mostly comes from using video games in learning. This research claims that gaming elements such as role playing, challenges, interactive choices, and narrative arcs contribute to superior learning outcomes in game-based learning and should be incorporated in instructional design (Dickey, 2005; Prensky, 2001; Rieber, 1996). Some researchers focusing on the use of narrative in learning design found that it can help comprehension (Laurillard, 1998) and it can be a useful tool for navigation in multimedia environments (McLellan, 1993). Since storytelling has become such an important instructional design strategy, more research is needed in order to provide the empirical support for the claims that are made regarding its effectiveness for learning.

Emotional Design in Learning Materials

In instruction, emotional design refers to those elements of design that do not have a specific pedagogical or technical function—rather, they are intended to make the learning materials more attractive and evoke some sort of emotion in the learner. In the past, various authors have written on the potential of aesthetic elements such as original graphics, pleasant colors, or interesting layout to induce positive emotions and thus influence learning (Chan, 1988; Hathway, 1984; Martin, 1986). Martin (1986) asserted that:

Aesthetically sound productions can serve multiple functions; they can enhance aesthetic awareness and they can actually increase cognitive learning as well, since, in essence, aesthetically sound productions are better productions. They capture and hold the attention of the learner longer, thus focusing the learner’s attention on the content of the production. (p. 15)

Nonetheless, research on whether aesthetics influences learning has been scarce.

In the context of multimedia learning, the emotional design hypothesis postulates that visually appealing elements initiate, guide, and maintain the learner’s cognitive processing, resulting in improved learning outcomes (Mayer, 2014b; Mayer & Estrella, 2014). Several authors have sought to test this idea. Most of the studies that measured learning outcomes focused on two variables: retention and transfer. According to Mayer (2009), retention tests evaluate superficial learning, in other words the ability to recall the content, while transfer tests gauge deep understanding—whether the learners can apply that knowledge to new situations.

Emotional design research. Emotional design can be thought of in terms other than visual elements; for instance, one researcher investigated the effects of sounds as emotional design in learning materials (Königschulte, 2015). The author hypothesized that sound has emotive qualities and therefore can have an influence on learners’ motivation and engagement with a lesson. The experiment included two groups of students learning a history lesson: the control group was presented a slideshow with narration, while the experimental group had the same material but with environmental sounds that illustrated the content of the lesson (for example, the sound of wind, sea, and creaking wood to accompany the picture of a cog at sea). The results did not show significant differences in retention or transfer; however, learner involvement seemed to be higher in the experimental group.

Stark et al. (2018) examined textual emotional design. In their study, the instructional text was scientific and, in order to avoid non-essential material, the researchers decided that the best way to manipulate its emotional potential in an authentic way was to embed words with strong emotional potential in the text itself, by using metaphorical structures. In this mixed-methods study, using an experimental design, the participants were assigned to three groups: positive emotional text design, negative emotional text design, and the original learning text. Specifically, the text for the experimental groups was enhanced with emotional parentheses that contained metaphorical nouns describing parts of the ATP synthase molecule: blossom and petal for the positive design, and abscess and wart for the negative design. These words were chosen based on the fact that they have strong emotional valence and high imagination values, and provide some kind of visual similarity with the ATP synthase molecule, as per the criteria from the Berlin Affective Wordlist Revised (Võ et al., 2009). Eye tracking was utilized to support the think-aloud procedure that was done after the learning took place.

The results showed that students who learned with positive and negative emotional text had better learning outcomes compared to the control group, which is supporting evidence for the idea that affective states influence cognitive processing.

Other findings were: emotional text design facilitated elaboration processes, but hindered metacognitive processes; the positive text did not influence the students’ emotional state, but the negative one did; the cognitive mechanisms that facilitated learning were different for the positive and negative designs, as shown by qualitative data obtained through cued retrospective think-aloud and interview questions. This study demonstrated the use of an original type of emotional design and highlighted the importance of affect; however, implementing this kind of textual emotional design may be difficult due to the constraints of the subject matter.

In most cases, the concept of emotional design is applied to multimedia learning and refers to visual elements. Multimedia materials, being composed of a combination of text and graphics, offer more opportunities for adding elements out of aesthetic or emotional considerations without changing the essential learning material. This is important since, as the coherence principle states, extraneous material can be detrimental to learning and needs to be minimized or eliminated.

So far, researchers have examined the role of anthropomorphisms (human-like characteristics), warm colors, and round, face-like shapes (Mayer & Estrella, 2014; Park et al., 2015; Plass et al., 2014; Um et al., 2012), classical and expressive aesthetic color combinations (Mayer & Estrella, 2014; Park et al., 2015; Plass et al., 2014; Um et al., 2012), fonts (Kumar et al., 2018), and holistic aesthetic design principles (Hancock, 2004; Tomita, 2018). In these studies, emotional design was investigated in relation to various types of multimedia, such as printed handouts, PowerPoint presentations, websites, e-learning materials, and online education and testing environments.

Some of these studies took a more holistic approach, employing materials or environments that were enhanced aesthetically using multiple criteria. Hancock (2004) examined the impact of aesthetic design in distance learning media. Based on a review of literature from industry professionals, including graphic artists, interface designers, and academics, criteria for visual aesthetics were developed and applied to three courses built in a Learning Management System (WebCT). The three courses were presented in two experimental conditions, with and without aesthetic enhancement, and student satisfaction scores as well as time spent on content were compared. The results showed increased time on content for the aesthetic treatment group in two of the courses, and higher satisfaction scores overall.


Tomita (2018) used minimalist vs. appealing handouts and asked groups of students to report on their motivation with regard to the learning materials. The minimalist handout followed the multimedia learning Coherence Principle, “people learn better when extraneous material is excluded rather than included” (Mayer, 2009, p. 89), and was designed using academic paper conventions (black text on a white page, Times New Roman typeface). The appealing handout was created after the author researched the visual culture of undergraduate students and identified six visual trends: flat design, clean look, monochromatic color, white text, sans-serif typeface, and bright colors.

The results showed that learners perceived the appealing handout as motivating when they saw it before the minimalist handout, but not when they saw it after. The results of this study are therefore inconclusive in terms of the design’s impact on motivation. Moreover, this study did not examine learning outcomes; hence, it does not offer any evidence for the effectiveness of the design.

The studies above took a more holistic approach and examined a “style” of design. Most studies, however, have tried to identify specific design elements that can enhance instructional materials. Um et al. (2012) investigated the role of emotions, as well as the impact of emotional design elements, using a 2 x 2 factorial experimental design in which college students received a computer-based lesson on immunization and were assigned to four conditions crossing external mood induction (positive vs. neutral) and emotional design induction (positive vs. neutral). The external mood induction was achieved through a self-referencing mood-induction procedure (for the positive emotion group).

The emotional design induction was done through saturated, bright, warm color combinations such as yellow, orange, and pink, and the use of round shapes for illustrations and characters in the learning environment (Figure 7), versus the control condition, which used a neutral design with a grey-scale color scheme and square shapes.

Figure 7. Emotional design example from Um et al. (2012). This figure shows two sample screens from the study by Um et al. (2012, p. 489). Neutral emotional design (left), positive emotional design (right). Permission for reproduction not required for up to 3 figures by APA.

The emotional design in this study was based on the fact that warm colors are believed to be stimulating (Bellizzi & Hite, 1992; Wolfson & Case, 2000), and research on advertising suggests that higher levels of saturation and value also induce positive feelings (Gorn et al., 1997; Thompson et al., 1992). Round shapes, in turn, create positive feelings due to the baby-face bias, which means that they suggest innocence and honesty, and the anthropomorphism effect—the attribution of human characteristics to non-human beings or objects (Dehn & Van Mulken, 2000; Reeves & Nass, 1996). The results of the study suggest that external induction increased transfer, but not comprehension, while emotional design increased both comprehension and transfer. Overall, positive emotions increased motivation, satisfaction with the materials, and the learners’ perception of their own learning.

The study by Um et al. (2012) was replicated by Plass et al. (2014) at a German university, with the same learning materials (in German translation) and a different external induction method: 2-minute videos intended to induce either positive or neutral emotions. As a follow-up to this work, Plass et al. (2014) examined the two elements, color and shape, and found that round shapes induced positive emotions both when used alone and together with warm colors, but colors alone did not have an effect. However, warm colors and face-like shapes increased comprehension, both independently and together, while face-like shapes also increased transfer when used with neutral colors.

The researchers also investigated the effect of external emotion induction and emotional design on motivation, perceived task difficulty, perceived mental effort, satisfaction, and perception of learning materials, and found that emotional design increased reported motivation and decreased the perceived task difficulty, but did not affect the other factors.

Park et al. (2015) also replicated this study in Germany, with a focus on anthropomorphisms. They employed the same materials in their German version and the same self-referencing external mood induction as Um et al. (2012); additionally, they used eye tracking to study the cognitive processes at work during multimedia instruction.

Eye tracking shows which elements the learners looked at and for how long, thus providing clues regarding the selection of information that gets passed into working memory. Longer fixation is believed to indicate deeper processing (Just & Carpenter, 1976; Rayner et al., 2007). They found that the group that received the self-referencing positive mood-induction procedure achieved higher learning outcomes than the control group, in both comprehension and transfer. The results also showed that, while the anthropomorphisms did catch the learners’ attention, they did not produce positive emotions. Furthermore, the group that had both the external mood induction and the design with anthropomorphisms performed significantly better on the comprehension test than all the other groups, and also achieved the highest results on the transfer test. The authors also examined whether emotional design and external mood induction influenced perceived cognitive load, perceived task difficulty, acceptance of the learning program (how much they liked it), perceived learning outcomes, and situational interest (e.g., “I find the topic interesting”). Neither of the two emotion induction methods appeared to have influenced these subjective dimensions.

Mayer and Estrella (2014) used a multimedia lesson on how a virus causes a cold in two experiments with college students. The participants were assigned to two groups: the control group had simple, gray-scale drawings, and the “enhanced” group had the same material but with colorful graphics and anthropomorphized features of the cells and virus (Figure 8). Then, they were tested on retention and transfer. The difference between the experiments was that one of them had a time limit, while the other did not. The results showed that the enhanced group had overall better learning outcomes in both experiments; however, a significant difference was only found on the retention test. With regard to subjective measures, experiment 1 found no differences between groups on the appeal of the lesson, enjoyment of the lesson, desire for similar lessons, or difficulty rating, but the emotional design group reported investing a higher level of effort. Experiment 2 produced similar results, except that there was no difference in reported effort. According to the authors, such findings suggest that learners may not be very aware of their own affective states during learning, and it is recommended to use more objective measures.

Figure 8. Example slides from Mayer and Estrella (2014). This figure shows two sample slides from the study by Mayer and Estrella (2014, p. 13). Control (upper), enhanced with emotional design (lower). Permission for reproduction not required for less than 3 figures by Elsevier.

Heidig et al. (2015) further examined the effects of emotional design factors on learners’ emotional states and on their learning outcomes, this time including negative states as well as positive. They aimed to induce these states by using designs with high vs. low levels of two visual aesthetics design factors (classical and expressive aesthetics), and high vs. low usability (Figure 9). The authors chose to manipulate visual aesthetics as a holistic design factor because literature from web design shows that it strongly impacts the subjective perception of a website and the users’ emotional responses (Lavie & Tractinsky, 2004; Moshagen & Thielsch, 2010; Schenkman & Jönsson, 2000; Tuch et al., 2010). The two dimensions (classical and expressive) were described by Lavie and Tractinsky (2004) and refer to a clean, orderly, symmetrical aspect (classical aesthetics) vs. a novel, unconventional aspect that may use special effects (expressive aesthetics). To make this more concrete, the researchers focused on color as the design feature distinguishing between the two dimensions. Usability was used as the design element that can induce negative emotions, as low usability has been linked to negative emotions (Lazar et al., 2006). The concrete feature used by the authors was long loading time, which has been shown to cause frustration and impatience in users (Ceaparu et al., 2004; Dellaert & Kahn, 1999). The results of the study suggested that objective differences in aesthetics and usability did not impact the students’ emotional states, but the perceived differences did. Furthermore, the emotional states of the learners had only a small effect on learning outcomes, but a larger one on their intrinsic motivation, including willingness to continue using the instructional material.

Figure 9. Example content from Heidig et al. (2015). This figure shows sample graphics from Heidig et al. (2015, p. 86). Permission for reproduction not required for less than 3 figures by Elsevier.

Brom et al. (2016) used funny graphics in a wastewater treatment instructional animation for high school students in the Czech Republic. The experimental group graphics were modified by adding static human-like faces to two chemical elements and by giving a fish and a riverbed a funny appearance. They tested the students’ learning outcomes through retention and transfer tests, as well as their state engagement (positive affect and flow levels). The results showed that the experimental group performed better on the retention test, but did not have significantly different results on the transfer test or on state engagement. The authors also conducted a qualitative investigation which suggested that the graphical enhancements may play a role as memory cues, implying a cognitive rather than affective effect. However, participants did describe the enhancements as “positive, funny, accurate, and not distracting (perhaps with the exception of one element—the funny, mutated fish—for some participants)” (Brom et al., 2016, Section 5).

One study examined the effect of aesthetics on performance for engineering students in Malaysia (Kumar et al., 2018). The authors investigated the participants’ use and outcomes of an e-learning course which was designed in two versions: one with positive (happy) aesthetics and one with negative (sad) aesthetics. The positive design was characterized by warm or bright colors and the Kristen ITC font, while the negative one featured low brightness, dull colors, and the Impact font (see Figure 10). It was found that students performed better with the negative design, and the authors interpreted this finding as possibly due to the association of the engineering discipline with more serious, darker colors. This raises the interesting issue of the role of visual design in creating learning experiences that promote what some authors call “affiliation” with the content, considered a main factor in online learners’ motivation and learning outcomes (Chen et al., 2010). Kumar et al. (2018) concluded: “Therefore, before an e-learning tool is designed, the first question always remains as ‘who is it designed for?’” (p. 261).

Figure 10. Example screens from Kumar et al. (2018). This figure shows example screens from Kumar et al. (2018, p. 257). Positive (happy) aesthetics (left), negative (sad) aesthetics (right). Reproduction allowed by the Creative Commons Attribution 4.0 License.

Le et al. (2018) looked into mental effort investment as an expression of positive affect by measuring heart rate variability during learning. Participants who studied the lesson designed using emotional design principles (saturated warm colors and round shapes) had a stronger decrease in heart rate variability during instruction and performed better on a retention test than the control group (monochromatic grayscale and rectangular shapes). The authors interpreted this finding as evidence for the affective mediation assumption of the Cognitive Affective Theory of Learning with Media.

Uzun and Yıldırım (2018) explored the individual effects of anthropomorphisms and colors on multimedia learning for 7th grade middle school students. They evaluated learning outcomes, positive emotions, and mental effort investment by using instructional materials in four different designs: Neutral (no emotional design), Colorful (with bright, saturated graphics), Anthropomorphic (expressive facial expressions, both on lifeless objects and on human characters added to the colorful graphics), and Anthropomorphic with Sound Effects (interesting sound effects added to the other emotional design features—for example, a friction sound when the animation depicted friction on a hard surface). Figure 11 shows sample screenshots from this study.

The researchers measured positive emotions by using an emWave device (a cardiovascular instrument for emotion detection). They found that positive emotions increased as the amount of emotional design increased, the Colorful design group invested more mental effort compared to the Neutral design group, and the Colorful design group outperformed the Neutral group on a recall test. However, no difference was found on transfer scores.

Figure 11. Example screens from Uzun and Yıldırım (2018). This figure shows example screens from Uzun and Yıldırım (2018, pp. 116–118). Neutral design (upper), Colorful design (middle), Anthropomorphic design (bottom). Permission for reproduction not required for up to 3 figures by Elsevier.

The effect of anthropomorphisms was also examined in a study by Schneider et al. (2018), using instructional content about artificial intelligence. In a first experiment, the authors analyzed the impact of decorative pictures with anthropomorphisms and/or personalized labels on learning performance, cognition, motivation, and emotion (the pictures did not convey any relevant information). The anthropomorphism was realized by giving the robots human-like faces, and the personalized labels were designed through the use of personal pronouns (you, your) and speech bubbles (see Figure 12). The 2 x 2 factorial design (human faces vs. no human faces in pictures, and personalized vs. non-personalized labels of pictures) showed that both human faces and personalized labels improved learning performance, but only human faces influenced motivation and emotion. In a second experiment, the researchers compared three groups—anthropomorphized pictures, non-anthropomorphized pictures, and no pictures—and found that learners who studied with anthropomorphized pictures performed better than both of the other groups. These experiments suggested that anthropomorphisms can play a beneficial role in learning and may be a useful addition to instructional materials, even when they are in the form of decorative pictures. In general, however, emotional design features such as anthropomorphisms are added to materials that are intrinsic and directly relevant to the content: “For anthropomorphisms, a key defining feature is that the anthropomorphic elements should be depicted on an existing, non-anthropomorphic graphical object (i.e., it is not a new graphical object)” (Brom et al., 2018, p. 103).


Figure 12. Example decorative images from Schneider et al. (2018). This figure shows example images from Schneider et al. (2018, p. 222). Pictures with anthropomorphic and personalized features (left), pictures without any additional features (right). Permission for reproduction not required for up to 3 figures by APA.

Recently, Brom et al. (2018) conducted a meta-analysis of 20 studies related to the benefits of anthropomorphisms and color for learning. The analysis included 33 independent samples, and found significant positive effects for retention, comprehension, and transfer. When it comes to subjective factors, the meta-analysis revealed mixed effects: a strong effect for intrinsic motivation, a medium effect for enjoyment, and a small effect for positive emotions and perceptions of difficulty. There was no effect for perceived learning effectiveness or perceived effort.

Summary

These studies provided mixed findings regarding the effectiveness of emotional design, but, overall, they support the idea that emotions influence learning and that the emotional design of multimedia materials can affect learning outcomes. Since the existing studies provide inconclusive findings, additional research is needed to clarify and extend these results. Thus, in the present study I seek to test the emotional design hypothesis and to further the understanding of the role of various emotional design elements in learning by replicating existing research.

Replication plays a crucial role in research because it allows us to test previous studies for validity and to expand our knowledge by identifying specific conditions where the findings hold true or not (Collins, 1985). It has been referred to as “the cornerstone of science” (Simons, 2014, p. 76). Suter and Suter (2019) stated that: “Trust comes with replication of research findings and without it, science becomes unsettled” (p. 207), because “without replication, chance or biased findings (even fraudulent ones) go unchecked” (p. 207). In education, replicating and extending preliminary findings is particularly important for advancing scientific research. Shavelson and Towne (2002) proposed six principles for educational research, three of which involve replication: “Provide a Coherent and Explicit Chain of Reasoning” in order to “permit others to critique, to analyze, and to attempt to replicate, a study” (p. 4); “Replicate and Generalize Across Studies” (p. 4); and “Disclose Research to Encourage Professional Scrutiny and Critique” so as to open studies “to examination, criticism, review, and replication” (p. 72).

Historically, replication studies have been scarce and underappreciated, even though calls for them have been made for over 40 years (Cai et al., 2018). Recently, the “replication crisis” or “reproducibility crisis” came into the public spotlight as a disturbingly large number of well-established scientific studies were found impossible to reproduce, in particular in psychology and other social sciences (Fidler & Wilcox, 2018; Suter & Suter, 2019). Suter and Suter (2019) explained that replication is valuable because science evolves by identifying and explaining contradictions and gaps in knowledge; many authors increasingly call for more replication (Dennis & Valacich, 2014; Hedges & Schauer, 2019; Morin, 2016; Suter & Suter, 2019).

To test the emotional design hypothesis and expand on previous findings, this study replicates the research by Mayer and Estrella (2014) by carrying out their experiment on a different population and with slightly modified materials. In addition to replicating Mayer and Estrella’s research, this study also includes a new type of emotional design: the inclusion of storytelling elements in the lesson. Storytelling elements can be considered a type of emotional design because they help the learner connect with the instructional material emotionally, by making the content relatable and meaningful. Storytelling features are highly popular in instructional design but do not yet have justification in the form of empirical evidence, which this study seeks to provide in an exploratory manner.

The relationship between the different types of design of the multimedia materials and task performance is investigated, and the present study adds a delayed performance variable, which has not been researched so far in the context of emotional design. As learning involves storing information in the long-term memory and the ability to retrieve it, it is important to know whether the type of design impacts retrieval later in time and not just immediately after studying.

This chapter addressed the main concepts and theories related to the psychology of learning. It included an analysis of the role of visual design in learning and the concepts of emotional design and storytelling, as well as their role in learning. The research examining the effect of emotional design on learning was presented, followed by a justification for the need for replication. The next chapter outlines the methodology used in this study.


Chapter 3: Method

In this study, I aimed to relate the design of the multimedia (text and pictures) materials to immediate and delayed task performance, for students at a midwestern university, using an experimental design. In other words, the goal was to find out whether changing the design of the materials according to emotional design principles produces an effect on learning outcomes. A secondary purpose was to find out whether each of the two types of emotional design influences the students’ perceived effectiveness of the learning materials, and whether students do indeed find the graphic emotional design more visually appealing. In this chapter, I will describe the research methodology, including the research questions, type of research design, variables, participants, instruments, pilot study, procedure, inter-rater reliability analyses, group equivalency checks, and data analysis methods.

Research Questions

The research questions were:

1. What is the effect of graphic emotional design and storytelling on immediate task performance (i.e., the results on a retention and transfer test)?

2. What is the effect of graphic emotional design and storytelling on delayed task performance?

3. What is the effect of graphic emotional design on students’ perceived effectiveness of learning materials?

4. What is the effect of storytelling on students’ perceived effectiveness of learning materials?

5. Do students perceive the materials designed with graphic emotional design as more visually appealing?

Research Design

The experimental design is the appropriate method when we try to establish a cause and effect relationship between the independent and dependent variables. There are four conditions required to claim causality: the two variables must covary significantly; the “cause” precedes the “effect” in time; there is no alternative explanation (a confounding variable); and there is a theoretical explanation of the cause-effect relationship (Warner, 2008).

To conduct a rigorous experiment, individuals need to be randomly assigned to groups or conditions, in order to avoid any bias in the participants’ characteristics and to control for extraneous factors that could affect the outcome. Thus, the groups need to be equivalent, meaning that the variability of the individuals is distributed equally among them, as much as possible. One way to equate the characteristics of the groups is to measure variables that are correlated with the dependent variable and then statistically control for these variables through analysis of covariance (Creswell, 2012).
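As a rough illustration only (in this study the actual randomization was handled by Qualtrics, as described later in this chapter), the following Python sketch shows how participants could be randomly assigned to the four conditions; the participant IDs and seed below are hypothetical.

    import random
    from collections import Counter

    # Hypothetical participant IDs; in the actual study Qualtrics performed the randomization.
    participants = [f"P{i:03d}" for i in range(1, 221)]
    conditions = ["Control", "Graphics", "Story", "Graphics plus Story"]

    random.seed(42)  # fixed seed so the illustration is reproducible
    assignment = {pid: random.choice(conditions) for pid in participants}

    # Check how evenly the random assignment spread participants across conditions.
    print(Counter(assignment.values()))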

Variables

The variables used to study the relationship between design and learning outcomes are:

• Independent Variable: Type of Design. This variable has four conditions: Control, Graphics, Story, Graphics plus Story.

• Dependent Variables:

  o Retention Score: the score on the retention question immediately after the task. Retention tests evaluate superficial learning—whether the learners can recall the content.

  o Transfer Score: the total score on the transfer questions immediately after the task. Transfer tests seek to evaluate deep understanding of the content and whether learners can apply what they learned to new situations.

  o Delayed Retention Score: the score on the retention question after one week.

  o Delayed Transfer Score: the total score on the transfer questions after one week.

  o Effectiveness Rating: the perceived effectiveness of the learning materials.

  o Visual Design Rating: the attractiveness rating that the students give to the learning materials based on their graphic design.

Participants

The target population of this study is undergraduate students, as they represent one of the main populations that consume educational content. Ideally, a sample would be representative of the entire population, and therefore random sampling would be used to select the participants. This would ensure that each individual in the population has an equal probability of being selected (Creswell, 2012). However, due to practical constraints, a convenience sampling frame (Brooks, 2011) was used in this case.

Thus, the sampling frame consisted of undergraduate students from Ohio University. An email was sent to all undergraduate students from the Athens campus inviting them to participate in the study, with a link to the Qualtrics instrument. The email described the process and offered an $8 gift card compensation for all complete responses.

The link in the initial email was closed after 259 responses had been received. Thirty-nine responses were eliminated because the answers were incomplete or inappropriate, leaving 220 valid responses for the initial task. After one week, these students were sent an email with the delayed task. A total of 168 responses were received, of which 144 were valid.

Instruments

The instruments were presented to the students in Qualtrics and included:

1. Questionnaire: demographic data and evaluation of prior knowledge of the subject (Appendix D)

2. Instructional material (Appendix C)

3. Post-test questions for the immediate task performance score: retention and transfer (Appendix E)

4. Post-test questions asking the students to rate the visual design and the effectiveness of the learning materials (Appendix H)

5. Post-test questions for the delayed task performance score (Appendix F)

The instructional materials used in this study are slides explaining how a virus produces a cold. This topic was chosen because it is similar to what an undergraduate student would learn in a biology class, it can be explained in a short presentation, it is of general interest, and participants are not likely to have in-depth knowledge about it. The lesson is designed in four different versions corresponding to the independent variable.

Qualtrics randomly assigned each participant to one of the four conditions:

Group 1 (Control) received a plain lesson consisting of text plus black and white graphics. An example slide is shown in Figure 13.

Figure 13. Control sample slide. This figure shows an example slide from the lesson used with the control group.

Group 2 (Graphics) received the same lesson, but the images were designed based on emotional design principles, using warm colors and anthropomorphisms. An example slide is shown in Figure 14.

Figure 14. Graphics sample slide. This figure shows an example slide from the lesson with emotional design graphics.

Group 3 (Story) received the same instructional graphics as the Control group, but the lesson included basic storytelling elements: a character, a very simple plot, and narrative tenses (past tense). An example slide is shown in Figure 15.

Figure 15. Story sample slide. This figure shows an example slide from the lesson with storytelling elements.

Group 4 (Graphics plus Story) received the lesson with both enhanced graphics and storytelling elements.

The instructional materials were designed according to the principles of multimedia learning: the multimedia principle (combination of text and graphics), the contiguity principle (the words are next to the graphic they describe), the redundancy principle (only text, with no audio narration), the coherence principle (no material that does not support the instructional goal), and the signaling principle (new words are bolded for emphasis). In addition, the slides observed the principles of graphic design (contrast, alignment, repetition, and proximity) and used a font with high readability. The graphics are interpretive (illustrating changes over time) and transformational (showing relationships and processes that are invisible), with the exception of the ones comprising the storytelling elements, which were decorative. The enhanced graphics are designed with bright, saturated colors (red, orange, blue, green), which are believed to be stimulating (Bellizzi & Hite, 1992; Wolfson & Case, 2000) and to induce positive feelings (Gorn et al., 1997; Thompson et al., 1992). They also contain anthropomorphisms, which were found to increase comprehension and transfer in previous research (Plass et al., 2014; Um et al., 2012).

All groups were then tested using the same questions: one retention question asking them to describe the infection process in detail, and five transfer questions that evaluated how well the learners understood the material and could apply the knowledge to solve problems. The participants were tested once immediately after the lesson (immediate performance), and once after one week (delayed performance). The retention question was the same for both tasks, but the transfer questions were different. All the questions were open-ended except for the five transfer questions in the delayed task, which were multiple-choice. The open-ended questions were scored using a rubric (Appendix G).

The Control and Emotional Design Graphics lessons were the same (with slight modifications) as the ones used by Mayer and Estrella (2014), with permission from the authors. The questions were adapted and extended from the same study. Additional transfer questions were developed starting from the original ones, with the assistance of a virology specialist who also checked the accuracy of all the materials, questions, and rubrics.

The effectiveness and visual design questions were asked after the initial task assessment. Both questions were rated on a 5-point scale: from “effective” to “ineffective” and from “very appealing” to “unappealing,” respectively (Appendix H).

Pilot Study

A pilot study was conducted with a small group of students. In the invitation email, the students were asked to report any problems with the instruments or unclear items. Eleven students participated in the initial task, and five students completed the delayed task.

No problems were reported, and the time allotted for the lesson and the questions proved sufficient. However, the results suggested that participants did not realize that they were expected to give a detailed answer to Q1 (“Please describe how a virus attacks the body”); instead, they wrote a one- to three-sentence summary. Consequently, this item was changed to “Describe the steps of how a virus attacks a body. Please include as much detail as you can.”

A reliability analysis was conducted on the initial task dataset in order to check for problematic items. Cronbach’s alpha was .522, and one item (Q1) had a slightly higher Cronbach’s alpha if item deleted (Appendix R). However, since the sample was very small, these results did not indicate a need to change the items, with the exception of the modification mentioned above. Thus, the study proceeded as planned with the modified question.
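For readers who want to reproduce this kind of item analysis outside SPSS, the following is a minimal Python sketch using the pingouin package, assuming the pilot scores sit in a wide-format table with one column per item; the column names and values below are hypothetical, not the actual pilot data.

    import pandas as pd
    import pingouin as pg

    # Hypothetical wide-format pilot scores: one row per student, one column per item.
    pilot = pd.DataFrame({
        "Q1": [12, 8, 15, 10, 9, 14, 7, 11, 13, 6, 10],
        "Q2": [1, 0, 1, 1, 0, 1, 0, 1, 1, 0, 1],
        "Q3": [1, 1, 0, 1, 0, 1, 0, 0, 1, 0, 1],
    })

    alpha, ci = pg.cronbach_alpha(data=pilot)
    print(f"Cronbach's alpha = {alpha:.3f}, 95% CI = {ci}")

    # "Alpha if item deleted": recompute alpha after dropping each item in turn.
    for item in pilot.columns:
        a, _ = pg.cronbach_alpha(data=pilot.drop(columns=item))
        print(f"alpha without {item}: {a:.3f}")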

Procedure

Participants received the Qualtrics link by email. If they decided to participate, they were presented with a consent form and had to consent in order to proceed to the questions. After answering the demographic and previous knowledge questions, they were randomly assigned to one of the experimental conditions and were presented a self-study lesson (slides) on how a cold virus attacks the body. The lesson was timed, with a limit of 10 minutes.

After studying the slides, participants answered questions that aimed to assess their learning: one retention question asking them to recount the steps involved in viral infection, and five transfer questions. Each question had a 2.5-minute limit. At the end, the students were asked to rate the visual design and the effectiveness of the learning materials, with no time limit.

After one week, participants received an email invitation to complete the delayed task. The delay of one week was chosen based on the forgetting curve, which shows that most forgetting happens in the first 7 days (Ebbinghaus, 1885/1974). This task included the same retention question as the immediate task, but different transfer questions. The retention question had a 2.5-minute limit, and the transfer questions (which were multiple-choice) had a 1-minute limit each.

The task performance answers were scored blindly and independently by two different raters in order to eliminate bias and to ensure reliable scores. A rubric was used to inform the rating (Appendix G). The two ratings were averaged, and the mean scores were used for the dependent variables.

Inter-Rater Reliability

The open-ended questions were graded by two different raters using a rubric (Appendix G). To check for inter-rater reliability, a reliability analysis was conducted in SPSS. As the score for Question 1 was a scale variable graded on a scale from 0 to 20, the intraclass correlation coefficient (ICC) was calculated, for both absolute agreement and consistency. The ICC is a widely-used measure for inter-rater reliability for quantitative variables (Koo & Li, 2016). The ICC was also calculated for the total transfer score in the immediate task. The delayed transfer questions were multiple-choice and thus did not need to be rated. The results are presented in Table 1. Values over 0.75 indicate good reliability, while values over 0.9 indicate excellent reliability (Koo & Li, 2016).

Table 1

Inter-Rater Reliability: Intraclass Correlation Coefficient

                              ICC Absolute Agreement    ICC Consistency
Retention Score               .92                       .95
Transfer Score                .84                       .85
Retention Score Delayed       .87                       .92
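As a point of reference, an ICC of this kind can also be computed outside SPSS; the sketch below uses the pingouin package on hypothetical long-format data (one row per student-rater pair), and the scores shown are invented for illustration rather than taken from the study.

    import pandas as pd
    import pingouin as pg

    # Hypothetical long-format ratings: one row per (student, rater) combination.
    ratings = pd.DataFrame({
        "student": [1, 1, 2, 2, 3, 3, 4, 4, 5, 5],
        "rater":   ["A", "B"] * 5,
        "score":   [12, 13, 7, 8, 15, 14, 9, 9, 11, 10],  # retention scores on a 0-20 scale
    })

    icc = pg.intraclass_corr(data=ratings, targets="student",
                             raters="rater", ratings="score")
    # ICC2 corresponds to absolute agreement and ICC3 to consistency (single raters).
    print(icc[icc["Type"].isin(["ICC2", "ICC3"])][["Type", "ICC", "CI95%"]])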

Since the transfer questions were rated with only one point, making them essentially nominal variables with “correct” and “incorrect” values, each individual item Q2 through Q6 was also tested for inter-rater reliability using crosstabs and the Kappa coefficient. Evaluating individual items is more informative and more appropriate than an “across-the-board” approach (Lombard et al., 2004; Orwin & Cordray, 1985). The results are shown in Table 2. In general, values of Kappa between 0.40 and 0.59 are considered moderate, and values between 0.60 and 0.79 substantial (Landis & Koch, 1977).

Table 2

Inter-Rater Reliability: Crosstabs and Kappa Coefficient

     Percentage of items agreed on    Kappa
Q2   91.1                             .66 (p < .001)
Q3   80.4                             .61 (p < .001)
Q4   84.6                             .65 (p < .001)
Q5   77.3                             .47 (p < .001)
Q6   75.0                             .48 (p < .001)
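The same per-item statistics can be reproduced with standard libraries; the sketch below computes percent agreement and Cohen’s kappa for one hypothetical transfer item scored 1 (correct) or 0 (incorrect) by the two raters (the ratings are invented for illustration).

    from sklearn.metrics import cohen_kappa_score

    # Hypothetical scores given by the two raters to the same ten answers on one item.
    rater1 = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
    rater2 = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1]

    agreement = sum(a == b for a, b in zip(rater1, rater2)) / len(rater1)
    kappa = cohen_kappa_score(rater1, rater2)
    print(f"Percent agreement = {agreement:.1%}, Cohen's kappa = {kappa:.2f}")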

As the original study that was replicated reported Pearson’s r as a measure of inter-rater reliability (r = .92 for retention and r = .82 for transfer), the author of this study also calculated the correlations for the Q1 (Retention) and Transfer scores. The results are presented in Table 3. A coefficient over .5 is generally considered a strong correlation (Field, 2016).


Table 3

Inter-Rater Reliability: Pearson Correlations

                              r
Retention Score               .90 (p < .001)
Transfer Score                .75 (p < .001)
Retention Score Delayed       .85 (p < .001)
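For completeness, a Pearson correlation between the two raters’ scores can be obtained with SciPy; the scores below are again hypothetical stand-ins for the raters’ actual ratings.

    from scipy.stats import pearsonr

    # Hypothetical retention scores (0-20) given to the same answers by the two raters.
    rater1 = [12, 7, 15, 9, 11, 4, 18, 10, 6, 13]
    rater2 = [13, 8, 14, 9, 10, 5, 17, 11, 7, 12]

    r, p = pearsonr(rater1, rater2)
    print(f"r = {r:.2f}, p = {p:.4f}")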

While the Kappa values for Q5 and Q6 are lower than ideal, overall the inter-rater reliability appears strong. The consistency and agreement of the raters were sufficient to ensure accurate data.

Group Equivalency

To make sure that the four groups were equivalent in terms of demographics, prior knowledge, and time spent on task, the relevant independent variables were compared. The following scale variables were checked using one-way ANOVAs: age, science interest score, and duration. The following nominal and ordinal variables were checked using crosstabs: gender, year, high school science classes, college science classes, and biology knowledge self-rating. These analyses were performed both for the initial participants and for the ones answering the delayed task. No significant differences were found between the groups on any of these variables in either phase (Appendix I).
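These checks were run in SPSS; as a rough outside-SPSS sketch, the snippet below shows how a scale variable could be compared with a one-way ANOVA and a nominal variable with a crosstab and chi-square test, using randomly generated placeholder data (the variable names and values are hypothetical).

    import numpy as np
    import pandas as pd
    from scipy.stats import f_oneway, chi2_contingency

    # Placeholder data frame: one row per participant, with group, age, and gender.
    rng = np.random.default_rng(0)
    df = pd.DataFrame({
        "group":  rng.choice(["Control", "Graphics", "Story", "GraphicsStory"], size=220),
        "age":    rng.integers(18, 23, size=220),
        "gender": rng.choice(["female", "male", "other"], size=220, p=[0.73, 0.25, 0.02]),
    })

    # Scale variable (e.g., age): one-way ANOVA across the four groups.
    samples = [df.loc[df["group"] == g, "age"] for g in df["group"].unique()]
    print(f_oneway(*samples))

    # Nominal variable (e.g., gender): crosstab plus chi-square test of independence.
    table = pd.crosstab(df["group"], df["gender"])
    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2 = {chi2:.2f}, p = {p:.3f}")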

Data Analysis

Procedure. All analyses for this study were conducted in SPSS. A one-way multivariate analysis of variance (MANOVA) was performed to check if there was any difference on the two task performance variables between the four groups (research questions 1 and 2). The one-way MANOVA is a statistical procedure used to compare mean scores on multiple quantitative outcome variables. MANOVA is usually preferred over performing separate univariate analyses for each dependent variable (ANOVA) because it is more accurate and can provide more detailed information: running MANOVA eliminates the inflated risk of Type I error that comes with multiple analyses; it takes into account the linear correlations between the dependent variables; and it can identify an effect when the groups do not differ significantly on any one individual outcome variable but they do when the outcome variables are considered together (Barton et al., 2016; Tabachnick & Fidell, 2007; Warne, 2014; Warner, 2008). In the present study, the two outcome variables—retention score and transfer score—are assumed to be related, since remembering and understanding / ability to apply are different levels of learning that are interdependent: the more you understand, the more you recall, and the ability to apply relies on recalling the content. Therefore, MANOVA was determined to be the most appropriate analysis.
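The MANOVA itself was run in SPSS; as a minimal open-source sketch of the same kind of model, the snippet below fits a one-way MANOVA with statsmodels on randomly generated placeholder scores (group labels, means, and spreads are hypothetical).

    import numpy as np
    import pandas as pd
    from statsmodels.multivariate.manova import MANOVA

    # Placeholder data: one row per participant with group, retention, and transfer scores.
    rng = np.random.default_rng(1)
    df = pd.DataFrame({
        "Group":     rng.choice(["Control", "Graphics", "Story", "GraphicsStory"], size=220),
        "Retention": rng.normal(6.0, 2.5, size=220),
        "Transfer":  rng.normal(2.7, 1.2, size=220),
    })

    # One-way MANOVA: both outcomes modeled jointly as a function of group membership.
    manova = MANOVA.from_formula("Retention + Transfer ~ Group", data=df)
    print(manova.mv_test())  # reports Pillai's trace, Wilks' lambda, and related statistics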

The MANOVA was followed up with a Discriminant Analysis (DA) in order to obtain more detailed information on how each outcome variable differs across groups.

DA is used to predict group membership based on an individual’s scores on a group of predictor variables. Thus, DA is a MANOVA “in reverse,” and it is recommended that a DA be included as a follow-up to a significant MANOVA because it offers additional information. First, the DA provides the coefficients for the discriminant functions, which shed light on the relation between the dependent variable scores and group membership when intercorrelations between dependent variables are considered. Second, the DA can add information about the accuracy of classifying the cases into groups. In order to find out where the differences lie between groups, DA is more appropriate than the MANOVA post-hoc tests: while the latter show the univariate comparisons, DA can identify the differences on the combination of variables (Barton et al., 2016; Tabachnick & Fidell, 2007; Warne, 2014; Warner, 2008).

For assessing the differences in students’ perceptions of the materials (research questions 3, 4, and 5), a one-way between-subjects analysis of variance (ANOVA) was performed. The between-subjects ANOVA is used to compare means on a quantitative variable across two or more groups (Warner, 2008). After the F value was found to be significant, planned contrasts were performed for the target groups: graphic emotional design vs. no graphic emotional design for research questions 3 and 5, and storytelling design vs. no storytelling design for question 4.
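As an outside-SPSS approximation of this contrast logic (not an exact reproduction of SPSS’s contrast procedure), the sketch below runs the omnibus one-way ANOVA and then compares the two combined graphic emotional design groups against the two combined non-graphic groups with a Welch t-test; all ratings are randomly generated placeholders.

    import numpy as np
    from scipy.stats import f_oneway, ttest_ind

    # Placeholder 5-point effectiveness ratings for the four groups.
    rng = np.random.default_rng(2)
    control        = rng.integers(1, 6, size=55)
    graphics       = rng.integers(2, 6, size=55)
    story          = rng.integers(1, 6, size=55)
    graphics_story = rng.integers(2, 6, size=55)

    # Omnibus one-way ANOVA across the four groups.
    print(f_oneway(control, graphics, story, graphics_story))

    # Planned comparison for RQ3/RQ5: groups with graphic emotional design vs. groups
    # without it, using a Welch t-test so equal variances are not assumed.
    with_graphics    = np.concatenate([graphics, graphics_story])
    without_graphics = np.concatenate([control, story])
    print(ttest_ind(with_graphics, without_graphics, equal_var=False))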

Assumptions. The data were tested for assumptions before running the analyses. The assumptions for MANOVA, Discriminant Analysis, and ANOVA are independent observations, normal distributions, linear associations between the dependent variables, and homogeneous variance / covariance matrices across groups for the dependent variables (Warner, 2008).

Independent observations. The scores on any dependent variable should be independent of each other. This assumption is satisfied because participants are assumed to have responded independently.

Normal distributions. Each dependent variable should be quantitative and reasonably normally distributed. This assumption was checked by producing a histogram and a boxplot for each dependent variable and a scatterplot for each of the dependent variable pairs (Retention Score and Transfer Score; Retention Score Delayed and Transfer Score Delayed) (Appendix J). All of these graphs show fairly normal distributions.

Outliers were also checked by calculating the Mahalanobis distance, and a few outliers were identified. The data were also analyzed without the outliers, and there were no significant changes. Since the extreme scores are all possible scores, they were kept in the dataset.
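As a brief sketch of this kind of multivariate outlier screening (using invented score pairs rather than the study data), the Mahalanobis distance of each case from the centroid of the two dependent variables can be computed and compared against a chi-square cutoff, a common convention for MANOVA outlier checks.

    import numpy as np
    from scipy.stats import chi2

    # Hypothetical (retention, transfer) score pairs for 220 cases.
    rng = np.random.default_rng(3)
    scores = np.column_stack([rng.normal(6.0, 2.5, 220), rng.normal(2.7, 1.2, 220)])

    # Squared Mahalanobis distance of each case from the centroid.
    diff = scores - scores.mean(axis=0)
    inv_cov = np.linalg.inv(np.cov(scores, rowvar=False))
    d2 = np.einsum("ij,jk,ik->i", diff, inv_cov, diff)

    # Flag cases beyond the chi-square critical value (df = 2 dependent variables, alpha = .001).
    threshold = chi2.ppf(0.999, df=2)
    print(np.where(d2 > threshold)[0])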

Linear associations between the dependent variables. The relationships between pairs of dependent variables should be linear. This assumption was tested through the scatterplots between the dependent variable pairs, which show a linear relationship (Appendix K).

Homogeneous variance / covariance. The variance / covariance matrices for the dependent variables should be homogeneous across the groups. This assumption was tested with Box’s M and Levene’s tests, which did not find significant differences between the groups for the research question 1 and 2 analyses. Levene’s test was significant for the ANOVA analyses for research questions 3, 4, and 5; however, one-way ANOVA is robust to violations of the homogeneity of variance assumption, and the results are reliable when the groups have similar sizes (Warner, 2008), which is the case in this study (Appendix L). Nonetheless, the subsequent contrast test results were reported from the “Does not assume equal variances” rows. The p values for these tests are presented in Table 4.

Table 4

Homogeneity Tests

Analysis                          Test                  p
MANOVA Immediate Task             Box’s M               .625
                                  Levene’s Retention    .114
                                  Levene’s Transfer     .380
MANOVA Delayed Task               Box’s M               .421
                                  Levene’s Retention    .550
                                  Levene’s Transfer     .523
ANOVA Content Evaluation          Levene’s              .001
ANOVA Visual Design Evaluation    Levene’s              .001
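These homogeneity tests were obtained from the SPSS output; for reference, Levene’s test is also available in SciPy, as in the minimal sketch below with invented group scores (Box’s M is not part of SciPy and was taken from SPSS in this study).

    import numpy as np
    from scipy.stats import levene

    # Invented retention scores for four groups of 55 cases each.
    rng = np.random.default_rng(4)
    groups = [rng.normal(m, 2.5, size=55) for m in (5.2, 6.3, 5.6, 6.5)]

    # Levene's test for homogeneity of variances; center="mean" gives the classic
    # mean-centered version of the test (SciPy's default uses the median instead).
    stat, p = levene(*groups, center="mean")
    print(f"W = {stat:.3f}, p = {p:.3f}")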

Summary

This chapter addressed the research methodology of the study. The research questions, type of research design, variables, participants, instruments, pilot study, procedure, inter-rater reliability analyses, group equivalency checks, and data analysis methods were described. The findings of the analyses will be presented in chapter four.


Chapter 4: Results

This chapter presents a summary of demographic data and the findings of the analyses for each research question. The research questions addressed by this study were:

1. What is the effect of graphic emotional design and storytelling on immediate task performance (i.e., the results on a retention and transfer test)?

2. What is the effect of graphic emotional design and storytelling on delayed task performance?

3. What is the effect of graphic emotional design on students’ perceived effectiveness of learning materials?

4. What is the effect of storytelling on students’ perceived effectiveness of learning materials?

5. Do students perceive the materials designed with graphic emotional design as more visually appealing?

Summary of Demographic Data

The participants in this study were students at Ohio University. There were 220 participants in the first phase, and out of these, 143 also responded to the delayed task.

The age range in the first phase was 18-40, with 208 students (94.5%) being between 18 and 22 years old. The gender distribution was male 56 (25.5%), female 160 (72.7%), and “other” 4 (1.8%). The distribution by year of study was freshman 42 (19.2%), sophomore 56 (25.5%), junior 56 (25.5%), senior 63 (28.6%), and “other” 3 (1.4%). The four groups were equivalent from the point of view of demographic data.

For the students remaining in the second phase, the age range was 18-40, with 134 students (93.7%) being between 18 and 22 years old. The gender distribution was male 29 (20.3%), female 112 (78.3%), and “other” 2 (1.4%). The distribution by year of study was freshman 23 (16.1%), sophomore 38 (26.6%), junior 37 (25.9%), senior 43 (30.1%), and “other” 2 (1.4%). The four groups were equivalent from the point of view of demographic data.

Research Question 1

This question asked: What is the effect of graphic emotional design and storytelling on immediate task performance (i.e., the results on a retention and transfer test)?

The mean scores for each group are presented in Figures 16-17 and in Table 5:


Figure 16. Mean scores for Retention. This figure shows the mean scores for the Retention variable in the initial task, with 95% confidence intervals.

Figure 17. Mean scores for Transfer. This figure shows the mean scores for the Transfer variable in the initial task, with 95% confidence intervals.


Table 5

Descriptive Statistics for Immediate Task Performance

                    Group                  Mean      St. Deviation   N
Retention Score     Control                5.1792    2.72646         53
                    Graphics               6.3246    2.67009         57
                    Story                  5.5625    2.55852         56
                    Graphics plus Story    6.4722    2.09766         54
                    Total                  5.8909    2.56503         220
Transfer Score      Control                2.4528    1.20999         53
                    Graphics               2.5526    1.23449         57
                    Story                  2.7768    1.22445         56
                    Graphics plus Story    3.0370    1.04544         54
                    Total                  2.7045    1.19511         220

The multivariate omnibus test (Appendix M) identified a significant difference between the four groups, with Pillai’s Trace = .077, F (6, 432) = 2.869, p = .009. The univariate tests of between-subjects effects showed that a large part of the effect belongs to the Retention variable, with F (3, 216) = 3.229, p = .023, partial eta squared = .043. For the Transfer variable, the effect seems smaller and the p value is just above the significance threshold (at α = .05), with univariate F (3, 216) = 2.608, p = .053, partial eta squared = .035. Both of these are small-to-medium effects (Cohen, 1969; Richardson, 2011).
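For reference, partial eta squared, the effect-size index reported above, expresses the proportion of variance attributable to the effect relative to the effect plus error variance; a standard formulation is:

    \eta_p^2 = \frac{SS_{\text{effect}}}{SS_{\text{effect}} + SS_{\text{error}}}

Commonly cited benchmarks treat values around .01, .06, and .14 as small, medium, and large, respectively, which is why the .043 and .035 values above are described as small-to-medium.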

Because the null hypothesis was rejected, a follow-up Discriminant Analysis was used to identify the differences between the groups (Appendix N). The Discriminant Analysis identified two functions: function 1 explains 56.6% of the variance, and function 2 explains the remaining 43.4%. Table 6 presents the tests of functions.

Table 6

Discriminant Analysis: Tests of Functions

Test of Function(s)    Wilks’ Lambda    Chi-square    df    Sig.
1 through 2            .925             16.888        6     .010
2                      .967             7.345         2     .025

The Retention Score variable aligns the most with the first function, while the Transfer Score aligns with the second, as shown in Table 7.

Table 7

Discriminant Analysis: Function Coefficients

             Function 1    Function 2
Retention    1.098         -.470
Transfer     -.207         1.176


The pairwise group comparisons are presented in Table 8 and show significant differences between three pairs of groups: Control – Graphics, Control – Graphics plus Story, and Graphics – Story (at α = 0.05).

Table 8

Discriminant Analysis: F for Pairwise Differences

Group                              Control    Graphics    Story    Graphics plus Story
Control         F (2,215)                     3.329       1.018    4.358
                p                             .038        .363     .014
Graphics        F (2,215)          3.329                  3.800    2.857
                p                  .038                   .024     .060
Story           F (2,215)          1.018      3.800                1.784
                p                  .363       .024                 .170
Graphics        F (2,215)          4.358      2.857       1.784
plus Story      p                  .014       .060        .170

These values, together with the means in Table 5, show how the four groups compare on the dependent variables. The three pairs of groups Control – Graphics, Control – Graphics plus Story, and Graphics – Story have significant differences on the combination of the two dependent variables Retention and Transfer. On the Retention variable, which is the main variable contributing to the difference, the Graphics group (M = 6.3246) outperformed both the Control (M = 5.1792) and the Story (M = 5.5625) groups, while the Graphics plus Story group (M = 6.4722) outperformed the Control group (M = 5.1792). On the Transfer variable, the situation is slightly different: the Graphics plus Story group (M = 3.0370) outperformed the Control group (M = 2.4528), but the Story group (M = 2.7768) achieved a higher score than the Graphics group (M = 2.5526).

Research Question 2

The question asked was: What is the effect of graphic emotional design and storytelling on delayed task performance?

The mean scores for each group are presented in Figures 18-19 and in Table 9:

Figure 18. Mean scores for Delayed Retention. This figure shows the mean scores for the Delayed Retention variable, with 95% confidence intervals.


Figure 19. Mean scores for Delayed Transfer. This figure shows the mean scores for the Delayed Transfer variable, with 95% confidence intervals.


Table 9

Descriptive Statistics for Delayed Task Performance

                    Group                  Mean      St. Deviation   N
Retention Score     Control                4.1765    2.32210         34
                    Graphics               4.9487    2.33338         39
                    Story                  5.3125    2.28512         32
                    Graphics plus Story    5.0789    1.88352         38
                    Total                  4.8811    2.22419         143
Transfer Score      Control                2.68      .912            34
                    Graphics               2.79      1.031           39
                    Story                  3.06      1.076           32
                    Graphics plus Story    2.71      1.137           38
                    Total                  2.80      1.043           143

While all three experimental groups had higher means than the control group, the multivariate analysis did not find a significant difference between the four groups, with Pillai’s Trace = .048, F (6, 278) = 1.141, p = .338 (Appendix O).

Research Questions 3 and 4

Question 3 asked: What is the effect of graphic emotional design on students’ perceived effectiveness of learning materials?

Question 4 asked: What is the effect of storytelling on students’ perceived effectiveness of learning materials?

The one-way ANOVA test identified a difference between the four groups on the dependent variable “Rating of instructional content,” with F (3, 216) = 3.594, p = .014. The contrast tests analyzed the difference between the two groups with emotional design graphics versus the two groups without emotional design graphics (question 3), and between the two storytelling groups versus the ones without storytelling (question 4). It was found that the two emotional design graphics groups rated the effectiveness of the content significantly higher, with a mean difference of .36, t (192.225) = 3.164, p = .002. However, storytelling did not appear to make a difference in the rating, the mean difference being .02, t (192.225) = .159, p = .874 (Appendix P).

Research Question 5

Question 5 asked: Do students perceive the materials designed with graphic emotional design as more visually appealing?

The overall F for the one-way ANOVA was statistically significant, F (3, 216) = 9.012, p < .001. The contrasts showed that the participants presented with emotional design graphics rated the visual appeal of the materials higher than the participants presented with simple graphics, with a mean difference of .68, t (196.898) = 5.037, p < .001 (Appendix Q).

Summary

This chapter presented the results of the statistical analyses for each research question. Chapter five comprises a discussion of these findings, followed by implications, limitations, and recommendations for future research.

Chapter 5: Discussion

The emotional design hypothesis posits that visually appealing elements initiate, guide, and maintain the learner’s cognitive processing, resulting in improved learning outcomes (Mayer & Estrella, 2014). Overall, this study supports this hypothesis and provides some new data. Additionally, including storytelling elements as a type of emotional design seems to provide some benefits as well. This section will discuss the findings for each research question and relate them to previous research and theory. It will also present implications for practice and theory, explain the limitations of the study, and provide suggestions for future research.

Research Question 1

This question asked: What is the effect of graphic emotional design and storytelling on immediate task performance (i.e., the results on a retention and transfer test)?

Previous studies had mixed results with regard to this effect, but suggested that emotional design in the form of colors and anthropomorphisms impacts learning outcomes positively. Um et al. (2012) found that participants who studied with emotional design materials scored higher on comprehension and transfer tests compared to external mood induction and to no emotion induction. In a replication of this study in Germany, Plass et al. (2014) investigated the color and shape elements and found that warm colors and face-like shapes increased comprehension when used independently or together, and face-like shapes increased transfer when used with neutral colors. In a study by Park et al. (2015), the participants who performed best on both comprehension and transfer tests had been presented a lesson including anthropomorphisms as well as external mood induction, which suggests that emotional design may work well in combination with other types of affective manipulation. Mayer and Estrella (2014) found that the participants who received the emotional design content had overall higher scores on a retention and transfer test, but only the retention test showed a statistically significant difference. Research by Uzun and Yıldırım (2018) found that a colorful design improved recall but not transfer, and Brom et al. (2018) conducted a meta-analysis and concluded that colors and anthropomorphisms influence retention, comprehension, and transfer positively.

This study’s results are consistent with most of the previous findings: emotional design, in particular the colors and anthropomorphisms, has a positive impact on retention, and less so on transfer. The mean differences between the four groups were in the expected direction for both retention and transfer: the experimental groups had higher scores than the control group, with the Graphics plus Story group achieving the highest scores. The MANOVA omnibus test showed that there was a significant difference between the four groups on the combination of dependent variables. The effect on the Retention scores, in particular, is in line with the results of the original experiment that was replicated (Mayer & Estrella, 2014), as well as with several other studies.

The Discriminant Analysis follow-up identified significant differences between three pairs: (1) the Control group and the Graphics group, (2) the Control group and the Graphics plus Story group, and (3) the Graphics group and the Story group. These results suggest that emotional design graphics by themselves have a positive effect on the learning process, which is consistent with the previous studies, thus supporting the emotional design hypothesis. This effect seems to be stronger for retention.

Storytelling elements were examined in this study as another type of emotional design, as storytelling is an instructional design technique believed to tap into the affective dimension and support learning. However, there is no previous research relating this technique to learning outcomes. The results of this study show that storytelling by itself does not seem to make a significant difference; nonetheless, the positive effect of emotional design graphics is enhanced when storytelling elements are added. This is evidenced by the fact that both the Graphics group and the Graphics plus Story group outperformed the control group, while there was no significant difference for the Story group alone. This finding supports the assumption that storytelling is an effective instructional design strategy that can help promote learning.

Research Question 2

The question asked was: What is the effect of graphic emotional design and storytelling on delayed task performance?

So far, previous research has only examined the effect of emotional design immediately after the lesson was presented. This study is the first one asking whether there is a delayed effect. The statistical analysis did not find a significant difference between the groups for the delayed test scores. However, the group means are in the expected direction, with the experimental groups performing better than the control group on both retention and transfer. The lack of a significant difference suggests that the positive impact of emotional design is limited to a short period of time after the presentation of the learning content—in other words, the “forgetting curve” effect may be too strong for emotional design to make a difference.

Another possible explanation for this result is the experimental context. Since the students did not have the motivation for learning that they would have in a normal class, nor opportunities to practice or apply their knowledge between the two sessions, the forgetting process was strong and unmitigated. It would be interesting to investigate whether emotional design affects delayed performance in a more authentic learning situation.

Research Question 3

This question asked: What is the effect of graphic emotional design on students’ perceived effectiveness of learning materials?

Previous research found that an attractive design influences people’s perceptions of a product’s usability (Kurosu & Kashimura, 1995; Norman, 2005; Parizotto, 2007; Tractinsky et al., 2000). In education, research showed increased satisfaction with more appealing materials (Hancock, 2004; Heidig et al., 2015). Studies that focused on specific emotional design features such as bright colors, round shapes, and anthropomorphisms had mixed results. Participants in Um et al.’s (2012) study reported higher satisfaction with the materials and increased self-efficacy. However, other studies did not find an effect of emotional design on satisfaction with the learning program, perception of learning outcomes, or desire for similar lessons (Mayer & Estrella, 2014; Park et al., 2015; Plass et al., 2014). A meta-analysis of 20 studies found no effect for perceived learning effectiveness (Brom et al., 2018).

In this study, the two groups that studied the lesson with emotional design graphics rated the effectiveness of the instructional materials higher than those whose lesson had the simple graphics. These results contradict many of the earlier findings and add evidence for the idea that attractive visuals can make learners perceive learning materials as more effective, which may in turn make them more likely to engage with the materials and to seek similar materials in the future. Hence, an impact on self-efficacy and confidence may play a part in the motivational effect of emotional design.

Research Question 4

This question asked: What is the effect of storytelling on students’ perceived effectiveness of learning materials?

To date, there are no previous data regarding the effect of storytelling elements on learners’ perception of instructional materials. In this study, storytelling did not make a significant difference in the rating of the instructional content. This result is consistent with the finding from Research Question 1 that storytelling by itself did not seem to make a difference in learning outcomes. Since it did not make a difference in perception either, a possible explanation is that the storytelling elements in this study were not salient enough. It would be interesting to conduct a similar study with more numerous or more complex storytelling elements and investigate whether that creates a difference. However, this would create a potential conflict with the coherence principle of multimedia learning, which is discussed in the Storytelling Elements section further below.

Research Question 5

The question asked was: Do students perceive the materials designed with graphic emotional design as more visually appealing?

An assumption of the emotional design hypothesis is that enhanced graphics have a positive effect on learning outcomes because they tap into the affective dimension of learning. In other words, students find the visuals appealing, which increases positive affect. According to the Cognitive Affective Theory of Learning with Multimedia, this increases motivation and helps guide the learners’ attention (Moreno, 2007; Moreno & Mayer, 2007; Park et al., 2015). However, attractiveness can be very subjective and culturally dependent. In previous studies, researchers did not specifically ask learners to rate the visual design of the materials; Mayer and Estrella (2014) asked participants to rate the appeal and enjoyment of the lesson, but found no differences between the groups on these ratings.

In this study, the results showed that students who studied the materials with emotional design graphics did indeed find them more attractive than the students who studied the lesson with simple graphics. Therefore, at least in this context of a midwestern US university, the current principles of emotional design seem to match what learners find attractive, thus supporting the idea that the effect of emotional design on learning outcomes is mediated by affect.

Colors and Anthropomorphisms

This study adds to a body of research indicating that bright, warm colors together with human-like faces for certain graphical items may be beneficial for learning. In particular, retention seems to benefit the most from such design choices. So far, this effect appears to be limited to short-term retention, at least in an experimental setting without opportunities for practice or learner incentives for good performance. In addition, this type of design is visually appealing to students and might play a positive role in their perception of the effectiveness of the instructional materials, as evidenced by their ratings. This suggests that emotional design graphics influence the learning process by means of affect, which is in line with the Cognitive Affective Theory of Learning with Media.

Storytelling Elements

Due to the lack of research on storytelling for instruction, this study was an exploratory attempt to shed some light on the topic. The literature on storytelling describes it as a way to connect the learner with the content on the emotional dimension; hence, it seemed like a good candidate for inclusion among emotional design features. The results of this study suggest that storytelling mainly makes a difference in combination with graphical emotional design elements, and that it does not affect the perceived effectiveness of the materials. However, the group that had only storytelling features did obtain a higher mean score than the control group (although the difference was not statistically significant) in both the immediate and the delayed task, so future research may want to investigate this further.

An important consideration is whether storytelling elements such as those used in this study constitute irrelevant material, which is usually discouraged. According to the Cognitive Theory of Multimedia Learning, material that does not directly support the instructional goal should be avoided because it increases the learners’ extraneous cognitive load and impinges on the limited capacity of working memory (Clark & Mayer, 2016; Mayer, 2014a). One could argue that storytelling elements like a character or a plot are extraneous and may distract the learner. On the other hand, some recent research has challenged the negative impact of decorative pictures or “seductive details”: Schneider et al. (2018) concluded that decorative pictures may be beneficial if they promote positive affect and are strongly connected with the instructional content, while Kühl et al. (2019) did not find a seductive detail effect, even though they used established materials.

This study did not show a negative effect; on the contrary, storytelling elements seemed to promote learning. Nevertheless, extraneous cognitive load is a concern, and there are probably limits to how many storytelling elements should be included in the design and how complex they should be. Perhaps there is an ideal balance to be found between the supportive influence of positive affect and the hindering influence of extraneous design items. Such boundary conditions could be topics for future research.

Implications

Implications for theory. The Cognitive Affective Theory of Learning with Media (CATLM) posits that emotions play an important role in learning, and that the visual design of learning materials can help promote learning by supporting cognitive processing and by impacting the learners’ motivation (Moreno, 2007). The results of this study provide additional evidence for this assumption by showing that specific types of instructional design aimed at eliciting an emotional response in learners can help focus their attention and increase their motivation, thus promoting generative processing and leading to superior learning outcomes. While this study did not specifically measure affect, it did link the emotional design graphics with higher ratings of visual appeal, suggesting that an affective factor facilitates cognitive processes. Therefore, the current study reinforces the importance of emotion for learning and the need for affective processes to be included in learning theories.

As a reminder, Figure 20 shows the affective-cognitive model of academic learning (Mayer, 2019) that was introduced in Chapter 2.

Figure 20. An affective-cognitive model of academic learning. This figure presents a model of learning proposed by Mayer (2019, Section 1); it also appears on p. 35. Elsevier does not require permission for the reproduction of up to three figures.

Starting from this model, and based on the research literature and the findings of this study, I have created a new diagram that seeks to express the relationships between cognitive and affective processes and the role of emotional design in learning. This is presented in Figure 21.


Figure 21. A proposed model of learning. This figure shows a proposed model of academic learning, representing the relationships between cognitive and affective processes and emotional design.

The top part of the diagram represents the learning process: the instructional content is presented to the student, learning takes place, and learning outcomes result. Learning involves cognitive processes, which are extensively studied by cognitive science. However, learning also involves affective processes, as evidenced by a growing body of educational research. Affective processes influence cognitive processing by fueling or dampening motivation and by focusing or hindering attention. Cognitive processes, in turn, impact affective processing by providing stimuli and feedback; for example, a learner’s belief that they are doing well in a task can boost their confidence and motivation.

In addition to impacting cognition and influencing learning outcomes, affective processes can also play a more subtle, longer-term role in shaping learning attitudes. For instance, Heidig et al. (2015) found that emotional states had a small effect on learning outcomes but a large effect on the learners’ intrinsic motivation, including their willingness to continue engaging with the instructional material. It is likely that, regardless of learning outcomes, the affect associated with a positive learning experience can make a student more open to, or excited about, taking similar courses in the future, or about learning in general. I believe that this kind of effect is valuable in itself and worth exploring, as we seek to foster an interest in lifelong learning in today’s youth.

What is the role of emotional design in this framework? Emotional design, whether in the form of attractive graphics, storytelling elements, sounds, or other features, mainly acts at the level of affective processes and influences cognition indirectly, through positive activating emotions. This is what most research suggests, and it is consistent with the current understanding of our psychological makeup.

However, there also seems to be some degree of direct influence on cognitive processes, as some emotional design features may facilitate recall. For instance, in the study by Brom et al. (2016), qualitative investigation revealed that the funny features of the enhanced design may serve as memory cues. Another finding that supports this idea comes from studies in which emotional design made a difference in learning outcomes but no differences in reported emotion were found (Brom et al., 2018; Mayer & Estrella, 2014; Park et al., 2015; Plass et al., 2014), suggesting either that no affective mediation existed or that it was not salient enough for participants to be aware of it. Additionally, it is known that imagery plays an important role in recall: the “memory palace” (the creation of an imaginary place in which each piece of learning content is assigned an item or feature in that place) is an ancient mnemonic technique that is used today by memory athletes and is recommended to learners who want to improve their memory performance. It is possible that emotional design features such as color make the material easier to recall because they make for a more salient visual. In that case, the effect would not be mediated by affect but would act directly on cognition.

In conclusion, this diagram builds on previous models of learning by representing the role of emotional design, as well as the role of affect and its potential value for learning attitudes. It is based on the findings of previous research, the results of this study, and the existing theoretical frameworks of learning with multimedia.

Implications for instructional design practice. From a practical perspective, this study reiterates the recommendation that course creators and instructional designers create learning materials and environments that stimulate positive emotions. While certain negative emotions such as mild anxiety have also been shown to boost learning, there may be some ethical concerns regarding the practice of instilling negative emotions in learners. Focusing on positive emotions appears to be the safer and more ethical route.

Moreover, such design elements might contribute to the students’ positive attitudes towards that particular subject matter or towards learning in general.

One way to influence emotions is through graphics designed according to emotional design principles, such as warm colors and anthropomorphisms. This makes the materials more visually appealing and more personified, which can enhance student motivation and may provide some direct cognitive benefits for recall. One thing to consider is the appropriateness of the design for the content and audience, as one study found that engineering students performed better with “sad” aesthetics (Chen et al., 2010). Another way to design for emotion is by adding storytelling elements, while making sure they are relevant and do not significantly increase extraneous cognitive load. This can help learners relate more to the material and stimulate their curiosity about the content.

The results of this study showed no emotional design effect after a week, which suggests that the natural tendency to forget over that time is too powerful. Consequently, a practical implication is the need to review the material and offer retrieval practice at short intervals. This is something that would normally happen in a real-life learning situation; hence, it is possible that a study conducted in an authentic environment would find an emotional design effect after a week or over longer periods.

Limitations

This study had several shortcomings that will be described in this section. They could be addressed in future research.

Ecological validity. This study was an experiment conducted outside of any regular classwork, with students from a variety of majors who were paid for participation. As a result, the students did not have the motivation to study that they would normally have in a class. The material was not necessarily relevant to them (even if it was a topic of general interest), and the results did not have any influence on their grades. Thus, they had no incentive to perform well on the tests. It is likely that learning outcomes would be different in a real-life learning situation. Moreover, the lesson was short, which is not very representative of authentic lessons.

Online setting. The study was conducted online, with students being sent a link to the lesson and later to the delayed task. This methodology had the advantage of reaching more participants; however, it also had the disadvantage of a lack of supervision and control. As the researcher was not able to supervise the tasks, it is not possible to know how seriously the students worked, whether they really eliminated all distractions as required, and whether they worked independently and did not cheat. (Cheating was unlikely because there was no incentive to do well on the task; nonetheless, one case was removed from the dataset because the student had copied text from the internet, which was obvious because it did not answer the question based on the learning material.)

Convenience sampling frame. The sample in this study consisted of undergraduate students at Ohio University who answered the invitation to participate. Ideally, the sample would be random, but that was not possible here due to practical constraints.

Attrition. The sample for the initial lesson and task included 220 participants. However, only 143 students completed the second part of the study (the delayed task). The smaller number reduced the power of the statistical analysis for the delayed task.

Type I error. In the Discriminant Analysis that followed up the MANOVA, a Bonferroni correction was not applied to the pairwise group comparisons. If the alpha is adjusted for six comparisons (.05 / 6 ≈ .008), none of the differences are significant. The correction was not applied because the extremely low alpha would greatly reduce power. As a result of this decision, there is a higher risk of Type I error for the reported results.
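
For reference, the adjustment mentioned above follows the standard Bonferroni rule; the expression below simply spells out the arithmetic already reported:

    \alpha_{adj} = \alpha / m = .05 / 6 \approx .0083

where m is the number of pairwise comparisons. Only comparisons with p-values below this adjusted threshold would remain significant under the corrected criterion.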

Storytelling. Due to the brevity of the lesson and the nature of the material, the storytelling elements used in this study were quite limited. Therefore, the claims made with regard to the effects of storytelling are very tentative and need to be tested with different types of material.

Affect was not directly measured. Unlike some previous research, this study did not use any measures of emotion. It is assumed that the two types of design influenced learning outcomes through the affective channel; however, more research is needed to firmly establish this connection.

Perception of effectiveness measures. This study used only a single question to measure the perceived effectiveness of the instructional material. It may be informative to use more comprehensive measures for this variable.

Recommendations for Future Research

Research context. Following from the limitations described above, it would be highly informative to conduct a similar study in an authentic classroom context. This would address the problems of attrition and ecological validity posed by this study and would add valuable data on real-life use. In particular, it would be useful to investigate the effect of emotional design on delayed task performance in conditions where student motivation and opportunities for retrieval practice are the same as, or similar to, those in an authentic learning situation.

Storytelling. Storytelling is an instructional design strategy that has recently received a lot of praise and attention. However, research on its effectiveness is lacking. Future research should investigate this strategy using more salient storytelling elements (for example, a more complex plot with several steps in the narrative). It is possible that storytelling is better suited to some disciplines (for instance, psychology, economics, or communication) and less so to subjects such as the natural sciences. In this study, the storytelling group performed better than the emotional design graphics group, although the difference was not statistically significant. Future research could follow up on this tentative finding and seek to define the boundary conditions for employing storytelling elements, as well as which types of learning content benefit from them the most.

Affect. Most studies evaluating affect have used self-report measures, with some exceptions: for instance, Le et al. (2018) measured heart rate variability during learning as an expression of mental effort investment, and Uzun and Yıldırım (2018) used an emWave device. Employing direct measures of affect would strengthen the Cognitive Affective Theory of Learning with Multimedia by providing insight into how emotional design influences learning and into the connection between affective and cognitive processes.

Cognitive load. The multimedia learning theories are based on Cognitive Load Theory; however, most emotional design studies do not measure cognitive load, and when they do, it is usually through a self-report scale (Sweller, 2018). For example, Um et al. (2012), Park et al. (2015), Plass et al. (2014), and Mayer and Estrella (2014) asked participants “How much mental effort did you invest in studying the previous material?” and “How easy or difficult was the material to understand?” At the moment, there are few established measures of cognitive load, and they do not differentiate between intrinsic and extraneous cognitive load. Research is needed in this area, both in the development of more objective, reliable measures and in the use of such measures in emotional design studies.

Lesson subject matter. The learning materials used in this study were on the topic of biology, a subject that lends itself easily to instructional images. It would be interesting to conduct similar research on different academic topics, in particular ones where the subject matter is not as visual (for example, economics, political science, or education) and where other types of emotional design may be employed.

Population. Future research could also focus on different target populations. All research so far has focused on undergraduate students, who are usually around 18-22 years old. It would be interesting to see if the effects of emotional design are similar in other populations such as high school students, graduate students, or non-traditional students.

Additional emotional design types. So far, researchers have investigated emotional design in the form of certain visual elements (color, shapes, anthropomorphisms, fonts), textual elements, and storytelling elements. One study included a component of humor by giving two design elements a funny appearance (Brom et al., 2016). Another study looked into the effect of sounds (Königschulte, 2015). Further studies could examine other design features that have the potential to influence emotions: some examples would be images of smiling people, typography, narration features, or different kinds of humor and sounds. Future research could also investigate whether the quality of the design matters, since many materials are made by teachers who may not have graphic design expertise.

Conclusion

This paper reviewed the previous literature on the psychology of learning, including the mechanisms involved and the current theories of learning, the connection between visual design and learning, storytelling as an instructional design strategy, and emotional design in learning materials. In the current study, the researcher examined the effect of emotional design, that is, design features whose main purpose is to stimulate emotion, on the learning process. The types of design examined were emotional design graphics and storytelling elements. Their impact on learning outcomes (retention and transfer) was investigated, as well as the students’ perceptions of the effectiveness of the content and of the material’s visual appeal. The study took the form of an experiment, performed on a sample of undergraduate college students.

It was found that graphical emotional design and storytelling had a positive impact on learning outcomes, with most of the effect observed in the retention variable. The emotional design graphics also impacted the students’ perceived effectiveness of the learning materials, while the storytelling elements did not have such an impact. Additionally, students did perceive the emotional design graphics as more visually appealing. Possible implications for learning designers and instructors were discussed, as well as the limitations of the study. This paper also proposed a new theoretical model for the role of affect in learning and concluded with suggestions for future research.


References

Andrews, D. H., Hull, T. D., & Donahue, J. A. (2009). Storytelling as an Instructional

Method: Descriptions and Research Questions. 3(2), 19.

Ayres, P., & Sweller, J. (2014). The split‐attention principle in multimedia learning. In R.

E. Mayer (Ed.), Cambridge handbook of multimedia learning (2nd ed., pp. 206–

226). Cambridge University Press.

Baddeley, A. D. (1986). Working memory. Oxford University Press.

Baddeley, A. D., & Hitch, G. (1974). Working memory. In The psychology of learning

and motivation: Advances in research and theory (Vol. 8, pp. 47–89). Academic

Press.

Barton, M., Yeatts, P. E., Henson, R. K., & Martin, S. B. (2016). Moving beyond

univariate post-hoc testing in exercise science: A primer on descriptive

discriminate analysis. Research Quarterly for Exercise and Sport, 87(4), 365–

375. https://doi.org/10.1080/02701367.2016.1213352

Bellizzi, J. A., & Hite, R. E. (1992). Environmental color, consumer feelings, and

purchase likelihood. Psychology and Marketing, 9(5), 347–363.

https://doi.org/10.1002/mar.4220090502

Bodemer, D., Ploetzner, R., Feuerlein, I., & Spada, H. (2004). The active integration of

information during learning with dynamic and interactive visualisations. Learning

and Instruction, 14(3), 325–341.

https://doi.org/10.1016/j.learninstruc.2004.06.006 142

Boucheix, J.-M., & Guignard, H. (2005). What animated illustrations conditions can

improve technical document comprehension in young students? Format, signaling

and control of the presentation. European Journal of Psychology of Education,

20(4), 369–388. https://doi.org/10.1007/BF03173563

Brand, S., Reimer, T., & Opwis, K. (2007). How do we learn in a negative mood? Effects

of a negative mood on transfer and learning. Learning and Instruction, 17(1), 1–

16. https://doi.org/10.1016/j.learninstruc.2006.11.002

Brom, C., Hannemann, T., Stárková, T., Bromová, E., & Děchtěrenko, F. (2016,

October). Anthropomorphic faces and funny graphics in an instructional

animation may improve superficial rather than deep learning: A quasi-

experimental study.

Brom, C., Stárková, T., & D’Mello, S. K. (2018). How effective is emotional design? A

meta-analysis on facial anthropomorphisms and pleasant colors during

multimedia learning. Educational Research Review, 25, 100–119.

https://doi.org/10.1016/j.edurev.2018.09.004

Brooks, G. (2011). Qualitative experimentation, local generalizability, and other

oxymoronic opportunities for educated researchers. Mid-Western Educational

Researcher, 24(4), 30.

Butcher, K. R. (2006). Learning from text and diagrams: Promoting mental model

development and inference generation. Journal of Educational Psychology, 98,

182–197. 143

Butcher, K. R. (2014). The multimedia principle. In R. E. Mayer (Ed.), The Cambridge

handbook of multimedia learning (2nd ed., pp. 174–205). Cambridge University

Press.

Cai, J., Morris, A., Hohensee, C., Hwang, S., Robison, V., & Hiebert, J. (2018). The role

of replication studies in educational research. Journal for Research in

Mathematics Education, 49(1), 2.

https://doi.org/10.5951/jresematheduc.49.1.0002

Ceaparu, I., Lazar, J., Bessiere, K., Robinson, J., & Shneiderman, B. (2004). Determining

causes and severity of end-user frustration. International Journal of Human-

Computer Interaction, 17(3), 333–356.

https://doi.org/10.1207/s15327590ijhc1703_3

Chan, T. C. (1988). The aesthetic environment and student learning. School Business

Affairs, 54(1), 26–27.

Chen, K.-C., Jang, S.-J., & Branch, R. M. (2010). Autonomy, affiliation, and ability:

Relative salience of factors that influence online learner motivation and learning

outcomes. Knowledge Management & E-Learning: An International Journal, 30–

50. https://doi.org/10.34105/j.kmel.2010.02.004

Clark, R. C., & Mayer, R. E. (2016). E-learning and the science of instruction: Proven

guidelines for consumers and designers of multimedia learning (Fourth edition).

Wiley. 144

Clark, Richard. E., & Feldon, D. F. (2014). Ten common but questionable principles of

multimedia learning. In The Cambridge handbook of multimedia learning (2nd

ed., pp. 151–173). Cambridge University Press.

Coffield, F., Moseley, D., Hall, E., & Ecclestone, K. (2004). Learning styles and

pedagogy in post 16 learning: A systematic and critical review.

Cohen, J. (1969). Statistical power analysis for the behavioural sciences. Academic

Press.

Collins, H. M. (1985). Changing order: Replication and induction in scientific practice.

Sage.

Cowan, N. (2011). The focus of attention as observed in visual working memory tasks:

Making sense of competing claims. Neuropsychologia, 49(6), 1401–1406.

https://doi.org/10.1016/j.neuropsychologia.2011.01.035

Creswell, J. W. (2012). Educational research: Planning, conducting, and evaluating

quantitative and qualitative research (4th ed). Pearson.

Cromley, J. G., Snyder-Hogan, L. E., & Luciw-Dubas, U. A. (2010). Cognitive activities

in complex science text and diagrams. Contemporary Educational Psychology,

35(1), 59–74. https://doi.org/10.1016/j.cedpsych.2009.10.002

Cuevas, H. M., Fiore, S. M., & Oser, R. L. (2002). Scaffolding cognitive and

metacognitive processes in low verbal ability learners: Use of diagrams in

computer-based training environments. 32. de Koning, B. B., Tabbers, H. K., Rikers, R. M. J. P., & Paas, F. (2010). Attention

guidance in learning from a complex animation: Seeing is understanding? 145

Learning and Instruction, 20(2), 111–122.

https://doi.org/10.1016/j.learninstruc.2009.02.010

Deci, E. L. (1971). Effects of externally mediated rewards on intrinsic motivation.

Journal of Personality and Social Psychology, 18(1), 105–115.

https://doi.org/10.1037/h0030644

Deci, E. L., & Ryan, R. M. (2014). Intrinsic motivation and self-determination in human

behavior. Springer Science+Business Media.

Dehn, D. M., & Van Mulken, S. (2000). The impact of animated interface agents: A

review of empirical research. International Journal of Human-Computer Studies,

52(1), 1–22. https://doi.org/10.1006/ijhc.1999.0325

Dellaert, B. G. C., & Kahn, B. E. (1999). How tolerable is delay?: Consumers’

evaluations of internet web sites after waiting. Journal of Interactive Marketing,

13(1), 41–54. https://doi.org/10.1002/(SICI)1520-6653(199924)13:1<41::AID-

DIR4>3.0.CO;2-S

Dennis, A., & Valacich, J. (2014). A replication manifesto. AIS Transactions on

Replication Research, 1, 1–4. https://doi.org/10.17705/1atrr.00001

Dick, W., Carey, L., & Carey, J. O. (2009). The systematic design of instruction (7th ed).

Merrill/Pearson.

Dickey, M. D. (2005). Engaging by design: How engagement strategies in popular

computer and video games can inform instructional design. Educational

Technology Research and Development, 53(2), 67–83.

https://doi.org/10.1007/BF02504866 146

Diemand-Yauman, C., Oppenheimer, D. M., & Vaughan, E. B. (2011). Fortune favors

the bold (and the italicized): Effects of disfluency on educational outcomes.

Cognition, 118(1), 111–115. https://doi.org/10.1016/j.cognition.2010.09.012

Dirksen, J. (2016). Design for how people learn (Second edition). New Riders.

D’Mello, S., & Graesser, A. (2012). Dynamics of affective states during complex

learning. Learning and Instruction, 22(2), 145–157.

https://doi.org/10.1016/j.learninstruc.2011.10.001

D’Mello, S., Lehman, B., Pekrun, R., & Graesser, A. (2014). Confusion can be beneficial

for learning. Learning and Instruction, 29, 153–170.

https://doi.org/10.1016/j.learninstruc.2012.05.003

Duffy, M. C., Lajoie, S. P., Pekrun, R., & Lachapelle, K. (2018). Emotions in medical

education: Examining the validity of the Medical Emotion Scale (MES) across

authentic medical learning environments. Learning and Instruction, 101150.

https://doi.org/10.1016/j.learninstruc.2018.07.001

Ebbinghaus, H. (1885). Memory: A contribution to experimental psychology.

Efklides, A., Kourkoulou, A., Mitsiou, F., & Ziliaskopoulou, D. (2006). Metacognitive

knowledge of effort, personality factors, and mood state: Their relationships with

effort-related metacognitive experiences. Metacognition and Learning, 1(1), 33–

49. https://doi.org/10.1007/s11409-006-6581-0

Engle, R. W. (2002). Working memory capacity as executive attention. Current

Directions in Psychological Science, 11(1), 19–23. https://doi.org/10.1111/1467-

8721.00160 147

Ervine, M. D. (2016). Visual literacy in instructional design programs. Journal of Visual

Literacy, 35(2), 104–113. https://doi.org/10.1080/1051144X.2016.1270630

Fidler, F., & Wilcox, J. (2018). Reproducibility of scientific results. In E. N. Zalta (Ed.),

The Stanford Encyclopedia of Philosophy (Winter 2018). Metaphysics Research

Lab. https://plato.stanford.edu/archives/win2018/entries/scientific-reproducibility/

Field, A. (2016). An adventure in statistics: The reality enigma (1st edition). SAGE In.

Fisher, J. S., & Radvansky, G. A. (2018). Patterns of forgetting. Journal of Memory and

Language, 102, 130–141. https://doi.org/10.1016/j.jml.2018.05.008

Fogg, B. J., Soohoo, C., Danielson, D. R., Marable, L., Stanford, J., & Tauber, E. R.

(2003). How do users evaluate the credibility of Web sites?: A study with over

2,500 participants. Proceedings of the 2003 Conference on Designing for User

Experiences - DUX ’03, 1. https://doi.org/10.1145/997078.997097

Gadgil, S., Nokes-Malach, T. J., & Chi, M. T. H. (2012). Effectiveness of holistic mental

model confrontation in driving conceptual change. Learning and Instruction,

22(1), 47–61. https://doi.org/10.1016/j.learninstruc.2011.06.002

Garner, R., Gillingham, M., & White, C. (1989). Effects of seductive details on

macroprocessing and microprocessing in adults and children. Cognition and

Instruction, 6, 41–57.

Geake, J. (2008). Neuromythologies in education. Educational Research, 50(2), 123–133.

https://doi.org/10.1080/00131880802082518

Gerrig, R. J. (1993). Experiencing narrative worlds. Yale University Press. 148

Glore, P., & David, A. (2012). Design and aesthetics in e-learning: A usability and

credibility perspective. International Journal on E-Learning, 11(4), 383–390.

Goldstone, R. L., & Son, J. Y. (2005). The transfer of scientific principles using concrete

and idealized simulations. Journal of the Learning Sciences, 14(1), 69–110.

https://doi.org/10.1207/s15327809jls1401_4

Gorn, G. J., Chattopadhyay, A., Yi, T., & Dahl, D. W. (1997). Effects of color as an

executional cue in advertising: They’re in the shade. Management Science,

43(10), 1387–1400. https://doi.org/10.1287/mnsc.43.10.1387

Grossen, B., & Carnine, D. (1990). Diagramming a logic strategy: Effects on difficult

problem types and transfer. Learning Disability Quarterly, 13(3), 168–182.

https://doi.org/10.2307/1510699

Halpern, D. F., Graesser, A., & Hakel, m. (2007). 25 learning principles to guide

pedagogy and the design of learning environments. Association of Psychological

Science Taskforce on Lifelong Learning at Work and at Home.

Hancock, D. J. (2004). Improving the environment in distance learning courses through

the application of aesthetic principles. University of South Florida.

Harp, S. F., & Mayer, R. E. (1998). How seductive details do their damage: A theory of

cognitive interest in science learning. Journal of Educational Psychology, 90(3),

414.

Hassenzahl, M. (2004). The interplay of beauty, goodness, and usability in interactive

products. Human-Computer Interaction, 19(4), 319–349.

https://doi.org/10.1207/s15327051hci1904_2 149

Hathway, M. D. (1984). Variables of computer screen display and how they affect

learning. Educational Technology, 24(7), 7–11.

Hedges, L. V., & Schauer, J. M. (2019). More than one replication study is needed for

unambiguous tests of replication. Journal of Educational and Behavioral

Statistics, 44(5), 543–570. https://doi.org/10.3102/1076998619852953

Hegarty, M., Kriz, S., & Cate, C. (2003). The roles of mental animations and external

animations in understanding mechanical systems. Cognition and Instruction,

21(4), 209–249. https://doi.org/10.1207/s1532690xci2104_1

Heidig, S., Müller, J., & Reichelt, M. (2015). Emotional design in multimedia learning:

Differentiation on relevant design features and their effects on emotions and

learning. Computers in Human Behavior, 44, 81–95.

https://doi.org/10.1016/j.chb.2014.11.009

Hokanson, B., & Fraher, R. (2008). Narrative structure, myth, and cognition for

instructional design. Educational Technology, 48(1), 27–32. JSTOR.

Isen, A. M., Daubman, K. A., & Nowicki, G. P. (1987). Positive affect facilitates creative

problem solving. Journal of Personality and Social Psychology, 52(6), 1122–

1131. https://doi.org/10.1037/0022-3514.52.6.1122

Izard, C. E. (2009). Emotion theory and research: Highlights, unanswered questions, and

emerging issues. Annual Review of Psychology, 60(1), 1–25.

https://doi.org/10.1146/annurev.psych.60.110707.163539 150

Johnson, C. I., & Priest, H. A. (2014). The feedback principle in multimedia learning. In

R. E. Mayer (Ed.), The Cambridge handbook of multimedia learning (2nd ed.).

Cambridge University Press.

Just, M. A., & Carpenter, P. A. (1976). Eye fixations and cognitive processes. Cognitive

Psychology, 8(4), 441–480. https://doi.org/10.1016/0010-0285(76)90015-3

Kensinger, E. A. (2009). How emotion affects older adults’ memories for event details.

Memory, 17(2), 208–219. https://doi.org/10.1080/09658210802221425

Kirschner, P. A. (2017). Stop propagating the learning styles myth. Computers &

Education, 106, 166–171. https://doi.org/10.1016/j.compedu.2016.12.006

Kirschner, P. A., & van Merriënboer, J. J. G. (2013). Do learners really know best?

Urban legends in education. Educational Psychologist, 48(3), 169–183.

https://doi.org/10.1080/00461520.2013.804395

Kirschner, P., Strijbos, J.-W., Kreijns, K., & Beers, P. J. (2004). Designing electronic

collaborative learning environments. Educational Technology Research and

Development, 52(3), 47–66. https://doi.org/10.1007/BF02504675

Königschulte, A. (2015). Sound as affective design feature in multimedia learning –

benefits and drawbacks from a cognitive load theory perspective. 75–83.

Koo, T. K., & Li, M. Y. (2016). A guideline of selecting and reporting intraclass

correlation coefficients for reliability research. Journal of Chiropractic Medicine,

15(2), 155–163. https://doi.org/10.1016/j.jcm.2016.02.012

Krug, S. (2014). Don’t make me think, revisited: A common sense approach to Web

usability. New Riders. 151

Kühl, T., Moersdorf, F., Römer, M., & Münzer, S. (2019). Adding emotionality to

seductive details-Consequences for learning? Applied Cognitive Psychology,

33(1), 48–61. https://doi.org/10.1002/acp.3477

Kumar, J. A., Muniandy, B., & Yahaya, W. A. J. W. (2018). Exploring the effects of

visual aesthetics in e-learning for engineering students. Knowledge Management

& E-Learning: An International Journal, 250–264.

https://doi.org/10.34105/j.kmel.2018.10.015

Kurosu, M., & Kashimura, K. (1995). Apparent usability vs. inherent usability:

Experimental analysis on the determinants of the apparent usability. Conference

Companion on Human Factors in Computing Systems - CHI ’95, 292–293.

https://doi.org/10.1145/223355.223680

Labov, W. (1972). Language in the inner city: Studies in the black English vernacular.

University of Pennsylvania Press.

Landis, J. R., & Koch, G. G. (1977). The measurement of observer agreement for

categorical data. Biometrics, 33, 159–174.

Laurillard, D. (1998). Multimedia and the learner’s experience of narrative. Computers in

Education, 31, 229–243.

Lavie, T., & Tractinsky, N. (2004). Assessing dimensions of perceived visual aesthetics

of web sites. International Journal of Human-Computer Studies, 60(3), 269–298.

https://doi.org/10.1016/j.ijhcs.2003.09.002

Lazar, J., Jones, A., Hackley, M., & Shneiderman, B. (2006). Severity and impact of

computer user frustration: A comparison of student and workplace users. 152

Interacting with Computers, 18(2), 187–207.

https://doi.org/10.1016/j.intcom.2005.06.001

Le, Y., Liu, J., Deng, C., & Dai, D. Y. (2018). Heart rate variability reflects the effects of

emotional design principle on mental effort in multimedia learning. Computers in

Human Behavior, 89, 40–47. https://doi.org/10.1016/j.chb.2018.07.037

Loderer, K., Pekrun, R., & Lester, J. C. (2018). Beyond cold technology: A systematic

review and meta-analysis on emotions in technology-based learning

environments. Learning and Instruction, 101162.

https://doi.org/10.1016/j.learninstruc.2018.08.002

Lombard, M., Snyder-Duch, J., & Bracken, C. C. (2004). A call for standardization in

content analysis reliability. Human Communication Research, 30, 434–437.

Lowe, R. K. (1999). Extracting information from an animation during complex visual

learning. European Journal of Psuchology of Education, 14, 225–244.

Lowe, R. K. (2003). Animation and learning: Selective processing of information in

dynamic graphics. Learning and Instruction, 13, 157–176.

Löwgren, J., & Stolterman, E. (2004). Thoughtful interaction design: A design

perspective on information technology. MIT Press.

MacKay, D. G. (1987). The organization of perception and action: A theory for language

and other cognitive skills. Springer New York. http://dx.doi.org/10.1007/978-1-

4612-4754-8

Mackay, D. G., Shafto, M., Taylor, J. K., Marian, D. E., Abrams, L., & Dyer, J. R.

(2004). Relations between emotion, memory, and attention: Evidence from taboo 153

Stroop, lexical decision, and immediate memory tasks. Memory & Cognition,

32(3), 474–488. https://doi.org/10.3758/BF03195840

Malamed, C. (n.d.-a). 10 Definitions of Learning. Retrieved March 3, 2019, from

http://theelearningcoach.com/learning/10-definitions-learning/

Malamed, C. (n.d.-b). Why you need to use storytelling for learning. The ELearning

Coach. Retrieved November 29, 2019, from

http://theelearningcoach.com/elearning2-0/why-you-need-to-use-storytelling-for-

learning/

Malamed, C. (2015). Visual design solutions: Principles and creative inspiration for

learning professionals. Hoboken, New Jersey : John Wiley & Sons, 2015.

Martin, B. L. (1986). Aesthetics and media: Implications for the design of instruction.

Educational Technology, 26(6), 15–21.

Massa, L. J., & Mayer, R. E. (2006). Testing the ATI hypothesis: Should multimedia

instruction accommodate verbalizer-visualizer cognitive style? Learning and

Individual Differences, 16(4), 321–335.

https://doi.org/10.1016/j.lindif.2006.10.001

Mayer, R. E. (2009). Multimedia Learning (2 edition). Cambridge University Press.

Mayer, R. E. (Ed.). (2014a). Cognitive theory of multimedia learning. In The Cambridge

handbook of multimedia learning (2nd ed., pp. 43–71). Cambridge University

Press.

Mayer, R. E. (2014b). Incorporating motivation into multimedia learning. Learning and

Instruction, 29, 171–173. https://doi.org/10.1016/j.learninstruc.2013.04.003 154

Mayer, R. E. (Ed.). (2014c). The Cambridge handbook of multimedia learning (2

edition). Cambridge University Press.

Mayer, R. E. (2019). Searching for the role of emotions in e-learning. Learning and

Instruction, 101213. https://doi.org/10.1016/j.learninstruc.2019.05.010

Mayer, R. E., Bove, W., Bryman, R., Mars, R., & Tapangco, L. (1996). When less is

more: Meaningful learning from visual and verbal summaries of textbook lessons.

Journal of Educational Psychology, 64–73.

Mayer, R. E., & Estrella, G. (2014). Benefits of emotional design in multimedia

instruction. Learning and Instruction, 33, 12–18.

https://doi.org/10.1016/j.learninstruc.2014.02.004

Mayer, R. E., & Fiorella, L. (2014). Principles for reducing extraneous processing in

multimedia learning: Coherence, signaling, redundancy, spatial contiguity, and

temporal contiguity. In R. E. Mayer (Ed.), The Cambridge handbook of

multimedia learning (2nd ed., pp. 279–315). Cambridge University Press.

Mayer, R. E., Hegarty, M., Mayer, S., & Campbell, J. (2005). When static media promote

active learning: Annotated illustrations versus narrated animations in multimedia

instruction. Journal of Experimental Psychology: Applied, 11, 256–265.

McLellan, H. (1993). Hypertextual tales: Story models for hypertext design. Journal of

Educational Multimedia and Hypermedia, 2, 239–260.

Miller, C. (2011). Aesthetics and e-assessment: The interplay of emotional design and

learner performance. Distance Education, 32(3), 307–337.

https://doi.org/10.1080/01587919.2011.610291 155

Miller, M. D. (2014). Minds online: Teaching effectively with technology. Harvard

University Press.

Moore, P. J. (1993). Metacognitive processing of diagrams, maps and graphs. Learning

and Instruction, 3(3), 215–226. https://doi.org/10.1016/0959-4752(93)90005-K

Moreno, R. (2006). Does the modality principle hold for different media? A test of the

method-affects-learning hypothesis. Journal of Computer Assisted Learning,

22(3), 149–158. https://doi.org/10.1111/j.1365-2729.2006.00170.x

Moreno, R. (2007). Optimising learning from animations by minimising cognitive load:

Cognitive and affective consequences of signalling and segmentation methods.

Applied Cognitive Psychology, 21(6), 765–781. https://doi.org/10.1002/acp.1348

Moreno, R., & Mayer, R. E. (2000). A coherence effect in multimedia learning: The case

for minimizing irrelevant sounds in the design of multimedia instructional

messages. Journal of Educational Psychology, 92(1), 117–125.

https://doi.org/10.1037//0022-0663.92.1.117

Moreno, R., & Mayer, R. E. (2007). Interactive multimodal learning environments:

Special issue on interactive learning environments: contemporary issues and

trends. Educational Psychology Review, 19(3), 309–326.

https://doi.org/10.1007/s10648-007-9047-2

Moreno, R., Ozogul, G., & Reisslein, M. (2011). Teaching with concrete and abstract

visual representations: Effects on students’ problem solving, problem

representations, and learning perceptions. Journal of Educational Psychology,

103(1), 32–47. https://doi.org/10.1037/a0021995 156

Morin, K. H. (2016). Replication: Needed now more than ever. Journal of Nursing

Education, 55(8), 423–424. https://doi.org/10.3928/01484834-20160715-01

Moshagen, M., & Thielsch, M. T. (2010). Facets of visual aesthetics. International

Journal of Human-Computer Studies, 68(10), 689–709.

https://doi.org/10.1016/j.ijhcs.2010.05.006

Murre, J. M. J., & Dros, J. (2015). Replication and analysis of ebbinghaus’ forgetting

curve. PLOS ONE, 10(7), e0120644.

https://doi.org/10.1371/journal.pone.0120644

Neibert, J. (2014, November 10). The power of storytelling in elearning. Learning

Solutions. https://learningsolutionsmag.com/articles/1566/the-power-of-

storytelling-in-elearning-

Newton, P. M. (2015). The learning styles myth is thriving in higher education. Frontiers

in Psychology, 6. https://doi.org/10.3389/fpsyg.2015.01908

Newton, P. M., & Miah, M. (2017). Evidence-based higher education – is the learning

styles ‘myth’ important? Frontiers in Psychology, 8.

https://doi.org/10.3389/fpsyg.2017.00444

Norman, D. A. (2005). Emotional design. [electronic resource]: Why we love (or hate)

everyday things. New York : BasicBooks ; Oxford : Oxford Publicity Partnership

[distributor], 2005.

http://www.library.ohio.edu/ezpauth/redir/athens.php?http%3a%2f%2fsearch.ebs

cohost.com%2flogin.aspx%3fdirect%3dtrue%26db%3dcat00572a%26AN%3dali

ce.b5107607%26site%3deds-live%26scope%3dsite 157

Norman, D. A. (2013). The design of everyday things (Revised and expanded edition).

Basic Books.

North, C. (2019, October). The case of the disengaged learner. DevLearn, Las Vegas,

NV.

Oaksford, M., Morris, F., Grainger, B., & Williams, J. M. G. (1996). Mood, reasoning,

and central executive processes. Journal of Experimental Psychology: Learning,

Memory, and Cognition, 22(2), 476–492. https://doi.org/10.1037/0278-

7393.22.2.476

Oatley, K., & Johnson-laird, P. N. (1987). Towards a cognitive theory of emotions.

Cognition & Emotion, 1(1), 29–50. https://doi.org/10.1080/02699938708408362

Oppenheimer, D. M. (2008). The secret life of fluency. Trends in Cognitive Sciences,

12(6), 237–241. https://doi.org/10.1016/j.tics.2008.02.014

Ortony, A., Clore, G. L., & Collins, A. (1988). The cognitive structure of emotions.

Cambridge University Press. https://doi.org/10.1017/CBO9780511571299

Orwin, R. G., & Cordray, D. S. (1985). Effects of deficient reporting on meta-analysis: A

conceptual framework and reanalysis. Psychological Bulletin, 97, 134–137.

Paas, F., & Sweller, J. (2014). Implications of cognitive load theory for multimedia

learning. In R. Mayer (Ed.), Cambridge handbook of multimedia learning (2nd

ed., pp. 27–42). Cambridge University Press.

Pandey, A. (2018, July 24). How you can use storytelling in corporate training: Featuring

5 effective examples. ELearning Industry. 158

https://elearningindustry.com/storytelling-in-corporate-training-featuring-

effective-examples-use

Parizotto, R. (2007). Aesthetics and usability of virtual learning environment interfaces.

The University of York.

Park, B., Knörzer, L., Plass, J. L., & Brünken, R. (2015). Emotional design and positive

emotions in multimedia learning: An eyetracking study on the use of

anthropomorphisms. Computers & Education, 86, 30–42.

https://doi.org/10.1016/j.compedu.2015.02.016

Parrish, P. (2005). Embracing the aesthetics of instructional design. Educational

Technology, 45(2), 16–24.

Pashler, H., Bain, P., Bottage, B., Graesser, A., Koedinger, K., McDaniel, M., &

Metcalfe, J. (2007). Organizing instruction and study to improve student learning.

National Center for Educational Research, Institute of Education Sciences.

Pashler, Harold, McDaniel, M., Rohrer, D., & Bjork, R. (2008). Learning styles report:

Concepts and evidence. Psychological Science in the Public Interest, 9(3), 105–

119.

Peak, D., Prybutok, V., Mai, B., & Parsons, T. (2017). Bridging aesthetics and positivism

in IS visual with neuroscience: A pluralistic research framework

and typology.

Pekrun, R. (2005). Progress and open problems in educational emotion research.

Learning and Instruction, 15(5), 497–506.

https://doi.org/10.1016/j.learninstruc.2005.07.014 159

Pekrun, R. (2006). The control-value theory of achievement emotions: Assumptions,

corollaries, and implications for educational research and practice. Educational

Psychology Review, 18(4), 315–341. https://doi.org/10.1007/s10648-006-9029-9

Plass, J. L., Heidig, S., Hayward, E. O., Homer, B. D., & Um, E. (2014). Emotional

design in multimedia learning: Effects of shape and color on affect and learning.

Learning and Instruction, 29, 128–140.

https://doi.org/10.1016/j.learninstruc.2013.02.006

Plass, J. L., Homer, B. D., MacNamara, A., Ober, T., Rose, M. C., Pawar, S., Hovey, C.

M., & Olsen, A. (2019). Emotional design for digital games for learning: The

effect of expression, color, shape, and dimensionality on the affective quality of

game characters. Learning and Instruction, 101194.

https://doi.org/10.1016/j.learninstruc.2019.01.005

Plass, J. L., & Kaplan, U. (2016). Emotional design in digital media for learning. In

Emotions, Technology, Design, and Learning (pp. 131–161). Elsevier.

https://doi.org/10.1016/B978-0-12-801856-9.00007-4

Plass, J. L., Moreno, R., & Brünken, R. (Eds.). (2010). Cognitive load theory. Cambridge

University Press.

Polkinghorne, D. E. (1988). Narrative knowing and the human sciences. State Univ. of

New York Press.

Porter, S., & Peace, K. A. (2007). The scars of memory. Psychological Science, 18(5),

435–441. https://doi.org/10.1111/j.1467-9280.2007.01918.x

Prensky, M. (2001). Digital game-based learning. McGraw-Hill. 160

Rayner, K., Li, X., Williams, C. C., Cave, K. R., & Well, A. D. (2007). Eye movements

during information processing tasks: Individual differences and cultural effects.

Vision Research, 47(21), 2714–2726. https://doi.org/10.1016/j.visres.2007.05.007

Reber, R., & Schwarz, N. (1999). Effects of perceptual fluency on judgments of truth.

Consciousness and Cognition, 8(3), 338–342.

https://doi.org/10.1006/ccog.1999.0386

Reeves, B., & Nass, C. (1996). The media equation: How people treat computers,

televisions, and new media like real people and places. Cambridge University

Press.

Richardson, J. T. E. (2011). Eta squared and partial eta squared as measures of effect size

in educational research. Educational Research Review, 6(2), 135–147.

https://doi.org/10.1016/j.edurev.2010.12.001

Richey, R., Klein, J. D., & Tracey, M. W. (2011). The instructional

base: Theory, research, and practice. Routledge.

Rieber, L. P. (1996). Seriously considering play: Designing interactive learning

environments based on the blending of microworlds, simulations, and games.

Educational Technology Research and Development, 44(2), 43–58.

Riener, C., & Willingham, D. (2010). The myth of learning styles. Change: The

Magazine of Higher Learning, 42(5), 32–35.

https://doi.org/10.1080/00091383.2010.503139 161

Robins, D., & Holmes, J. (2008). Aesthetics and credibility in web site design.

Information Processing & Management, 44(1), 386–399.

https://doi.org/10.1016/j.ipm.2007.02.003

Rodríguez Estrada, F. C., & Davis, L. S. (2015). Improving visual communication of

science through the incorporation of graphic design theories and practices into

science communication. Science Communication, 37(1), 140–148.

Rubin, D. C. (2005). A basic-systems approach to autobiographical memory. Current

Directions in Psychological Science, 14(2), 79–83. https://doi.org/10.1111/j.0963-

7214.2005.00339.x

Russell, J. A. (2003). Core affect and the psychological construction of emotion.

Psychological Review, 110(1), 145–172. https://doi.org/10.1037/0033-

295X.110.1.145

Ryan, R. M., & Deci, E. L. (2000). Self-determination theory and the facilitation of

intrinsic motivation, social development, and well-being. American Psychologist,

55(1), 68–78. https://doi.org/10.1037/0003-066X.55.1.68

Sanchez, C. A., & Jaeger, A. J. (2015). If it’s hard to read, it changes how long you do it:

Reading time as an explanation for perceptual fluency effects on judgment.

Psychonomic Bulletin & Review, 22(1), 206–211. https://doi.org/10.3758/s13423-

014-0658-6

Scheiter, K., Gerjets, P., Huk, T., Imhof, B., & Kammerer, Y. (2009). The effects of

realism in learning with dynamic visualizations. Learning and Instruction, 19,

481–494. 162

Schenkman, B. N., & Jönsson, F. U. (2000). Aesthetics and preferences of web pages.

Behaviour & Information Technology, 19(5), 367–377.

https://doi.org/10.1080/014492900750000063

Schneider, S., Dyrna, J., Meier, L., Beege, M., & Rey, G. D. (2018). How affective

charge and text–picture connectedness moderate the impact of decorative pictures

on multimedia learning. Journal of Educational Psychology, 110(2), 233–249.

https://doi.org/10.1037/edu0000209

Schneider, S., Nebel, S., Beege, M., & Rey, G. D. (2018). Anthropomorphism in

decorative pictures: Benefit or harm for learning? Journal of Educational

Psychology, 110(2), 218–232. https://doi.org/10.1037/edu0000207

Schwarz, N. (2004). Metacognitive experiences in consumer judgment and decision

making. Journal of Consumer Psychology, 14(4), 332–348.

https://doi.org/10.1207/s15327663jcp1404_2

Seibert, P. S., & Ellis, H. C. (1991). Irrelevant thoughts, emotional mood states, and

cognitive task performance. Memory & Cognition, 19(5), 507–513.

https://doi.org/10.3758/BF03199574

Sharp, H., Rogers, Y., & Preece, J. (2007). Interaction design: Beyond human-computer

interaction (2nd ed). Wiley.

Shavelson, R. J., & Towne, L. (Eds.). (2002). Scientific research in education. National

Academy Press.

Simons, D. J. (2014). The value of direct replication. Perspectives on Psychological

Science, 9(1), 76–80. https://doi.org/10.1177/1745691613514755 163

Song, H., & Schwarz, N. (2009). If it’s difficult to pronounce, it must be risky: Fluency,

familiarity, and risk perception. Psychological Science, 20(2), 135–138.

https://doi.org/10.1111/j.1467-9280.2009.02267.x

Stahl, S. A. (1999). Different strokes for different folks? A critique of learning styles.

American Educator, 23(3), 27–31.

Stark, L., Brünken, R., & Park, B. (2018). Emotional text design in multimedia learning:

A mixed-methods study using eye tracking. Computers & Education, 120, 185–

196. https://doi.org/10.1016/j.compedu.2018.02.003

Suter, W. N., & Suter, P. M. (2019). Understanding replication: Trust but verify. Home

Health Care Management & Practice, 31(4), 207–212.

https://doi.org/10.1177/1084822319850501

Sweller, J. (2011). Cognitive load theory. In J. Mestre & B. Ross (Eds.), The psychology

of learning and motivation: Cognition in education (Vol. 55, pp. 37–76).

Academic Press.

Sweller, J. (2018). Measuring cognitive load. Perspectives on Medical Education, 7(1),

1–2. https://doi.org/10.1007/s40037-017-0395-4

Sweller, J., van Merriënboer, J. J. G., & Paas, F. (2019). Cognitive architecture and

instructional design: 20 years later. Educational Psychology Review.

https://doi.org/10.1007/s10648-019-09465-5

Tabachnick, B. G., & Fidell, L. S. (2007). Using multivariate statistics (5th ed.). Allyn

and Bacon.

The Association of Registered Graphic Designers of Ontario. (2010). AccessAbility: A

practical handbook on accessible graphic design.

The Professional Association for Design. (2017, October 5). What is design?

https://www.aiga.org/what-is-design

Thompson, E., Palacios, A., & Varela, F. J. (1992). Ways of coloring: Comparative color

vision as a case study for cognitive science. Behavioral and Brain Sciences,

15(01), 1–26. https://doi.org/10.1017/S0140525X00067248

Tomita, K. (2018). Does the visual appeal of instructional media affect learners’

motivation toward learning? TechTrends, 62(1), 103–112.

https://doi.org/10.1007/s11528-017-0213-1

Tractinsky, N., Katz, A. S., & Ikar, D. (2000). What is beautiful is usable. Interacting

with Computers, 13(2), 127–145. https://doi.org/10.1016/S0953-5438(00)00031-X

Tuch, A. N., Bargas-Avila, J. A., & Opwis, K. (2010). Symmetry and aesthetics in

website design: It’s a man’s business. Computers in Human Behavior, 26(6),

1831–1837. https://doi.org/10.1016/j.chb.2010.07.016

Tucker, C. (2015). Storytelling LMS Presentation. Experiencing Elearning.

https://www.christytuckerlearning.com/storytelling-lms-presentation/

Uesaka, Y., Manalo, E., & Ichikawa, S. (2007). What kinds of perceptions and daily

learning behaviors promote students’ use of diagrams in mathematics problem

solving? Learning and Instruction, 17(3), 322–335.

https://doi.org/10.1016/j.learninstruc.2007.02.006

Um, E. “Rachel,” Plass, J. L., Hayward, E. O., & Homer, B. D. (2012). Emotional design

in multimedia learning. Journal of Educational Psychology, 104(2), 485–498.

https://doi.org/10.1037/a0026609

Uzun, A. M., & Yıldırım, Z. (2018). Exploring the effect of using different levels of

emotional design features in multimedia science learning. Computers &

Education, 119, 112–128. https://doi.org/10.1016/j.compedu.2018.01.002

Van Gog, T. (2014). The signaling (or cueing) principle in multimedia learning. In R. E.

Mayer (Ed.), The Cambridge handbook of multimedia learning (2nd ed., pp. 174–

205). Cambridge University Press.

Veenman, M. V. J., Prins, F. J., & Verheij, J. (2003). Learning styles: Self-reports versus

thinking-aloud measures. British Journal of Educational Psychology, 73(3), 357–

372. https://doi.org/10.1348/000709903322275885

Võ, M. L. H., Conrad, M., Kuchinke, L., Urton, K., Hofmann, M. J., & Jacobs, A. M.

(2009). The Berlin Affective Word List Reloaded (BAWL-R). Behavior Research

Methods, 41(2), 534–538. https://doi.org/10.3758/BRM.41.2.534

Warne, R. T. (2014). A primer on multivariate analysis of variance (MANOVA) for

behavioral scientists. Practical Assessment, Research & Evaluation, 19(17), 10.

Warner, R. M. (2008). Applied statistics: From bivariate through multivariate

techniques. SAGE Publications.

Wehr, T., & Wippich, W. (2004). Typography and color: Effects of salience and fluency

on conscious recollective experience. Psychological Research Psychologische

Forschung, 69(1–2), 138–146. https://doi.org/10.1007/s00426-003-0162-5

Center for Teaching & Learning, University of California, Berkeley. (n.d.). What is learning?

Retrieved March 3, 2019, from https://teaching.berkeley.edu/resources/learn/what-learning

The Interaction Design Foundation. (n.d.). What is user experience (UX) design? Retrieved

March 6, 2019, from https://www.interaction-design.org/literature/topics/ux-design

Williams, R. (2015). The non-designer’s design book: Design and typographic principles

for the visual novice. Peachpit Press.

Willingham, D. T., Hughes, E. M., & Dobolyi, D. G. (2015). The scientific status of

learning styles theories. Teaching of Psychology, 42(3), 266–271.

https://doi.org/10.1177/0098628315589505

Wilson, B. (2005). Broadening our foundation for instructional design: Four pillars of

practice. Educational Technology, 45(2), 10–16.

Wolfson, S., & Case, G. (2000). The effects of sound and colour on responses to a

computer game. Interacting with Computers, 13(2), 183–192.

https://doi.org/10.1016/S0953-5438(00)00037-0


Appendix A: Consent Form

Title of Research:

The Impact of Emotional Design on the Effectiveness of Instructional Materials

Researchers: Dana Simionescu

IRB number: 19-E-126

You are being asked by an Ohio University researcher to participate in research. For you to be able to decide whether you want to participate in this project, you should understand what the project is about, as well as the possible risks and benefits in order to make an informed decision. This process is known as informed consent. This form describes the purpose, procedures, possible benefits, and risks of the research project. It also explains how your personal information will be used and protected. Once you have read this form and your questions about the study are answered, you will be asked to participate in this study. You may ask for a copy of this document to take with you.

Summary of Study

The purpose of this study is to understand how certain graphic design elements may impact the learning process. This study involves reading a lesson, answering questions based on the lesson, a demographic questionnaire, and a follow-up questionnaire one week afterwards.

Explanation of Study

This study seeks to find out if the way instructional materials are designed has an effect on learning outcomes – that is, on how much students understand and retain.

If you agree to participate, you will be asked to:

• complete a few demographic and previous knowledge questions
• study a short biology lesson

• answer a few questions to check how much you remember and understand.

This will take 20–30 minutes. After one week, you will receive an email asking you to complete a few more questions. This will take about 5 minutes.

Compensation

You will receive an $8 Walmart gift card after completing both the initial and the follow-up task. Only participants who complete both tasks will be eligible for compensation. The names and signatures of participants will be collected to document receipt of compensation.

Risks and Discomforts

No risks or discomforts are anticipated.

Benefits

This study is important because it would contribute to our knowledge of how emotions affect learning. It would also provide practical knowledge for improving instructional materials.

Individually, you may benefit by learning about how a virus infects the body to produce an illness.

Confidentiality and Records

The data is recorded with identifiers (your Ohio University email). You will be asked for your email in order to be able to send you the follow-up task. The data will be kept on the researcher’s computer under password protection, and only the researcher will have access to it. The data will be deidentified by April 2020.

Additionally, while every effort will be made to keep your study-related information confidential, there may be circumstances where this information must be shared with:

• Federal agencies, for example the Office of Human Research Protections, whose responsibility is to protect human subjects in research;
• Representatives of Ohio University (OU), including the Institutional Review Board, a committee that oversees the research at OU.

Future Use Statement

The data may be used for future research studies or distributed to another investigator for future research studies without additional informed consent from you or your legally authorized representative.

Contact Information

If you have any questions regarding this study, please contact the investigator Dana Simionescu, [email protected], 740-818-6506 or the advisor Greg Kessler, [email protected], 740-593-2748

If you have any questions regarding your rights as a research participant, please contact Dr. Chris Hayhow, Director of Research Compliance, Ohio University, (740)593-0664 or [email protected].

By agreeing to participate in this study, you are agreeing that:

• you have read this consent form (or it has been read to you) and have been given the opportunity to ask questions and have them answered;

• you have been informed of potential risks and they have been explained to your satisfaction;

• you understand Ohio University has no funds set aside for any injuries you might receive as a result of participating in this study;

• you are 18 years of age or older;

• your participation in this research is completely voluntary;

• you may leave the study at any time.

Version Date: 7/30/2019

Appendix B: IRB Approval

(The IRB approval letter appears as an image in the original document and is not reproduced here.)

Appendix C: Lesson Slides

(The lesson slides for each condition appear as images in the original document; only the condition titles are listed here.)

Plain Design (Control Group)

Emotional Design Graphics (Graphics Group)

Plain Design with Storytelling Elements (Story Group)

Design with Both Emotional Graphics and Storytelling Elements (Graphics Plus Story Group)

Appendix D: Questionnaire

Ohio University email: ______   Major: ______   Age: ______   Gender: M / F
Year: Freshman / Sophomore / Junior / Senior / Other: ______

Please check all the classes listed below that you completed in high school:
Biology
Chemistry
Physics (or Physical Science)
Earth Science
Other natural science related classes (please list): ______

How many science courses have you taken or are you currently taking in college? 1 / 2 / 3 / 4 / more than 4

Please check all the things that apply to you:
I have participated in science programs or fairs.
Biology was my favorite subject in high school.
I sometimes watch science documentaries in my free time.
I can name most of the cell's organelles from memory.
I sometimes find myself on the Internet looking up science related topics.
I own (or used to own) a microscope.
I took advanced biology classes in high school (AP, IB, Honors, etc.).

Please rate your knowledge of biology: Very high / Somewhat high / Average / Somewhat low / Very low


Appendix E: Immediate Test

1. RETENTION

Describe the steps of how a virus attacks a body. Please include as much detail as you can.

2. TRANSFER

Based on the lesson you just read, please answer the following questions.

(1) What would happen to a virus if it couldn’t find a host cell?
(2) What is the role of the cell membrane in a viral infection?
(3) Why do some kinds of viruses kill their host cells whereas others do not?
(4) If you could, how would you change the human body to minimize the chances of viral infection?
(5) What does DNA / RNA have to do with viral replication?


Appendix F: Delayed Test

1. RETENTION

Describe the steps of how a virus attacks a body. Please include as much detail as you can.

2. TRANSFER

Based on the lesson you just read, please choose THE BEST answer to the following questions. (correct answer is in bold)

(1) Suppose you are exposed to a cold virus from an infected person who sneezes on you, but you do not get sick. What is the most likely explanation?
• The virus was unable to enter the body.
• The virus was unable to replicate.
• The virus was unable to inject its material into the host cell.

(2) What would happen to viruses if the cells in our bodies developed thicker membranes?
• The virus would not be able to recognize the cell.
• The virus would need additional genetic material.
• The virus would not be able to enter the cell.

(3) Why don’t enveloped viruses inject their instructions through the membrane?
• Because they lack the necessary protein for injecting the instructions.
• Because the cell membrane prevents them from injecting.
• Because they can dissolve through the membrane.

(4) What would happen if the host cell’s enzymes were malfunctioning?
• It would be easier for the virus to take over the cell.
• It would be impossible for the virus to replicate.
• The infection would spread more quickly.

(5) Why do viral infections spread quickly throughout the body?
• Because one virus produces thousands of copies of itself.
• Because viruses can move very fast from cell to cell.
• Because one infected cell is enough to cause the disease.


Appendix G: Rating Rubric

Q1: One point for each of the 20 idea units below that the student included, regardless of specific wording:
(1) virus enters body
(2) through nose or mouth or break in skin
(3) virus floats around until it bumps into a host cell
(4) virus usually attacks respiratory or digestive tract
(5) virus recognizes host cell using a protein
(6) virus attaches to host cell
(7) some enveloped viruses dissolve right through the membrane
(8) virus injects genetic material into host cell
(9) through cell membrane
(10) injected genetic material recruits host cell’s enzymes
(11) to copy virus’s genetic material
(12) host cell’s enzymes produce parts
(13) new parts are packaged into new viruses in the host cell
(14) new viruses break free from host cell
(15) in lysis they destroy the host cell as they leave
(16) in budding they pinch out
(17) new virus can attack other cells
(18) one virus particle reproduces thousands of new copies
(19) viral infections spread quickly through the body
(20) now the person has a cold

For the following questions, give one point for ANY correct answer.

Q2 (1 point) What would happen to a virus if it couldn’t find a host cell?
It would not be able to replicate / spread / cause illness / it would become inactive after a while (“die” is acceptable here) / it may be destroyed by the immune system.
Note: “exit the body” is NOT correct; “remain dormant” is NOT correct.

Q3 (1 point) What is the role of the cell membrane in a viral infection?
The virus attaches itself to the membrane / injects its genetic material through it / enters the cell through it / some viruses take a piece of membrane upon exit / some viruses burst through the membrane and destroy it upon exit.
Note: “helps protect from viruses” is not correct, as the question is specifically about its role in infection.

Q4 (1 point) Why do some kinds of viruses kill their host cells whereas others do not?
Some of them need to break the cell as they get out of the cell (lysis), while others leave the cell by budding – pinching out and taking a piece of the membrane with them. (This is the only correct answer; give the point even if partial.)

Q5 (1 point) If you could, how would you change the human body to minimize the chances of viral infection?
Create a better barrier for entrance (e.g., better skin) / Change the receptors so the virus can’t attach / Change the membrane to be thicker / Equip the cell with a viral detection system. (These are specific answers based on the material; general answers like “better immune system” do not get points.)

Q6 (1 point) What does DNA / RNA have to do with viral replication?
It provides the instructions for making the virus.
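The scoring procedure above is applied by human raters. The following minimal Python sketch (not part of the original rubric; function and variable names are hypothetical) simply shows how the tallies could be computed once a rater has coded a response.

# Illustrative only: assumes a human rater has already judged which of the
# 20 retention idea units appear in an answer and which transfer questions
# (Q2-Q6) received any acceptable answer.
def retention_score(idea_units_present):
    # One point per distinct idea unit, numbered 1-20, judged present.
    return len({u for u in idea_units_present if 1 <= u <= 20})

def transfer_score(correct_flags):
    # One point for each transfer question judged correct.
    return sum(1 for q in ("Q2", "Q3", "Q4", "Q5", "Q6") if correct_flags.get(q, False))

# Example: a rater coded idea units 1, 3, 6, 8, and 14, and marked Q2 and Q6 correct.
print(retention_score({1, 3, 6, 8, 14}))          # 5
print(transfer_score({"Q2": True, "Q6": True}))   # 2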


Appendix H: Materials Evaluation Questions

Please rate the instructional content:

Effective – somewhat effective – neutral – somewhat ineffective – ineffective

Please rate the visual design of the instructional materials:

Very appealing – appealing – neutral – somewhat unappealing – unappealing


Appendix I: Group Equivalency Checks

Participants in the Initial Task - ANOVA

Descriptives: Duration (in seconds)

Group   N     Mean        Std. Deviation   Std. Error   95% CI Lower   95% CI Upper   Minimum   Maximum
1       53    807.6604    1569.28017       215.55721    375.1132       1240.2076      188.00    11641.00
2       57    1384.7719   4966.13094       657.78011    67.0800        2702.4638      237.00    37686.00
3       56    715.3393    602.94156        80.57145     553.8705       876.8081       157.00    4777.00
4       54    822.9815    877.96524        119.47594    583.3431       1062.6199      314.00    6126.00
Total   220   937.4455    2690.90438       181.42074    579.8914       1294.9995      157.00    37686.00

Test of Homogeneity of Variances: Duration (in seconds)

Levene Statistic   df1   df2   Sig.
2.571              3     216   .055

ANOVA: Duration (in seconds)

Source           Sum of Squares    df    Mean Square   F      Sig.
Between Groups   15768548.889      3     5256182.963   .723   .539
Within Groups    1570003093.457    216   7268532.840
Total            1585771642.345    219


Descriptives

Age
Group   N     Mean    Std. Deviation   Std. Error   95% CI Lower   95% CI Upper   Minimum   Maximum
1       53    20.09   1.701            .234         19.63          20.56          18        26
2       57    20.16   2.419            .320         19.52          20.80          18        33
3       56    20.04   1.868            .250         19.54          20.54          18        30
4       54    20.31   3.214            .437         19.44          21.19          18        40
Total   220   20.15   2.360            .159         19.84          20.46          18        40

Science Interest Total
Group   N     Mean     Std. Deviation   Std. Error   95% CI Lower   95% CI Upper   Minimum   Maximum
1       53    2.6792   1.43813          .19754       2.2828         3.0756         1.00      7.00
2       57    2.4737   1.29705          .17180       2.1295         2.8178         1.00      6.00
3       56    2.7321   1.42051          .18982       2.3517         3.1126         1.00      6.00
4       54    2.7778   1.62140          .22064       2.3352         3.2203         1.00      7.00
Total   220   2.6636   1.44157          .09719       2.4721         2.8552         1.00      7.00

Test of Homogeneity of Variances

                         Levene Statistic   df1   df2   Sig.
Age                      .793               3     216   .499
Science Interest Total   1.147              3     216   .331

ANOVA

Age
Source           Sum of Squares   df    Mean Square   F      Sig.
Between Groups   2.366            3     .789          .140   .936
Within Groups    1217.684         216   5.637
Total            1220.050         219

Science Interest Total
Source           Sum of Squares   df    Mean Square   F      Sig.
Between Groups   3.036            3     1.012         .484   .694
Within Groups    452.073          216   2.093
Total            455.109          219
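The checks above were run in SPSS. For readers working outside SPSS, the following is a minimal Python sketch of an equivalent group-equivalency check; the DataFrame df and its column names are hypothetical, not the study's actual data file.

# Minimal sketch (not the original SPSS workflow): Levene's test and a one-way
# ANOVA across the four conditions for a numeric variable such as age or duration.
import pandas as pd
from scipy import stats

def equivalency_check(df, dv, group_col="group"):
    samples = [g[dv].dropna().values for _, g in df.groupby(group_col)]
    levene = stats.levene(*samples, center="mean")  # mean-centered, as in the classic Levene test
    anova = stats.f_oneway(*samples)
    return levene, anova

# Example (df holds one row per participant):
# levene, anova = equivalency_check(df, "duration")
# print(levene.pvalue, anova.pvalue)  # non-significant p-values suggest comparable groups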


Participants in the Initial Task - Crosstabs

Pearson Chi-Square   df   p

Gender 1.777 6 .939

Year 3.482 12 .991

High school: biology 3.323 3 .344

High school: chemistry 6.982 3 .072

High school: physics 4.439 3 .218

High school: earth science 1.258 3 .739

High school: other 4.839 3 .184

Biology knowledge self-rating 19.171 12 .084
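The chi-square values above compare the four conditions on categorical variables. A rough Python analogue (again with a hypothetical DataFrame and hypothetical column names) is shown below.

# Rough analogue of the crosstab checks: chi-square test of independence
# between condition and a categorical variable, e.g., gender or class year.
import pandas as pd
from scipy.stats import chi2_contingency

def categorical_equivalency(df, var, group_col="group"):
    table = pd.crosstab(df[group_col], df[var])  # conditions x categories contingency table
    chi2, p, dof, expected = chi2_contingency(table)
    return chi2, dof, p

# Example: chi2, dof, p = categorical_equivalency(df, "gender")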

Participants in the Delayed Task - ANOVA

Descriptives

Age
Group                      N     Mean    Std. Deviation   Std. Error   95% CI Lower   95% CI Upper   Minimum   Maximum
Control                    34    20.29   1.767            .303         19.68          20.91          18        26
Emotional Design Graphic   39    19.97   1.495            .239         19.49          20.46          18        24
Storytelling               32    20.31   2.132            .377         19.54          21.08          18        30
EDG and Storytelling       38    20.39   3.738            .606         19.17          21.62          18        40
Total                      143   20.24   2.446            .205         19.83          20.64          18        40

Science Interest Total
Group                      N     Mean     Std. Deviation   Std. Error   95% CI Lower   95% CI Upper   Minimum   Maximum
Control                    34    2.9118   1.63980          .28122       2.3396         3.4839         1.00      7.00
Emotional Design Graphic   39    2.5641   1.39161          .22284       2.1130         3.0152         1.00      6.00
Storytelling               32    2.9688   1.59605          .28214       2.3933         3.5442         1.00      6.00
EDG and Storytelling       38    2.7895   1.61342          .26173       2.2592         3.3198         1.00      7.00
Total                      143   2.7972   1.54992          .12961       2.5410         3.0534         1.00      7.00

Duration (in seconds)
Group                      N     Mean       Std. Deviation   Std. Error   95% CI Lower   95% CI Upper   Minimum   Maximum
Control                    34    879.0882   1919.65924       329.21884    209.2875       1548.8890      188.00    11641.00
Emotional Design Graphic   39    651.4872   255.28950        40.87904     568.7319       734.2425       237.00    1495.00
Storytelling               32    808.5938   768.86199        135.91688    531.3894       1085.7981      424.00    4777.00
EDG and Storytelling       38    808.3158   914.01080        148.27213    507.8879       1108.7437      314.00    6126.00
Total                      143   782.4336   1108.08266       92.66253     599.2573       965.6099       188.00    11641.00

Test of Homogeneity of Variances

                         Levene Statistic   df1   df2   Sig.
Age                      1.239              3     139   .298
Science Interest Total   .487               3     139   .692
Duration (in seconds)    1.445              3     139   .232

ANOVA

Age
Source           Sum of Squares   df    Mean Square   F      Sig.
Between Groups   3.929            3     1.310         .215   .886
Within Groups    845.987          139   6.086
Total            849.916          142

Science Interest Total
Source           Sum of Squares   df    Mean Square   F      Sig.
Between Groups   3.509            3     1.170         .482   .696
Within Groups    337.610          139   2.429
Total            341.119          142

Duration (in seconds)
Source           Sum of Squares    df    Mean Square    F      Sig.
Between Groups   1033718.711       3     344572.904     .276   .842
Within Groups    173320580.408     139   1246910.650
Total            174354299.119     142

Participants in the Delayed Task - Crosstabs

Pearson Chi-Square   df   p

Gender 4.162 6 .655

Year 8.023 12 .783

High school: biology 1.906 3 .575

High school: chemistry 5.962 3 .113

High school: physics 1.831 3 .608

High school: earth science 6.658 3 .084

High school: other 7.029 3 .071

Biology knowledge self-rating 8.694 12 .729


Appendix J: Normality Checks

(The normality check plots appear as images in the original document and are not reproduced here.)
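As a rough Python analogue of such checks (not the plots actually used in the study; the DataFrame and column names are hypothetical), normality can be inspected with a histogram and a Q-Q plot:

# Rough analogue of the normality checks: histogram and Q-Q plot for one score.
import matplotlib.pyplot as plt
import statsmodels.api as sm

def normality_plots(series, title):
    fig, axes = plt.subplots(1, 2, figsize=(8, 3))
    axes[0].hist(series.dropna(), bins=15)              # distribution shape
    axes[0].set_title("Histogram: " + title)
    sm.qqplot(series.dropna(), line="s", ax=axes[1])    # against a fitted normal
    axes[1].set_title("Q-Q plot: " + title)
    plt.tight_layout()
    plt.show()

# Example: normality_plots(df["retention_immediate"], "Immediate retention")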

Appendix K: Linearity Check

(The linearity check scatterplots appear as images in the original document and are not reproduced here.)

Appendix L: Homogeneity Checks

MANOVA Immediate Task

Box's Test of Equality of Covariance Matrices (a)

Box's M   7.237
F         .791
df1       9
df2       527653.375
Sig.      .625

Tests the null hypothesis that the observed covariance matrices of the dependent variables are equal across groups.
a. Design: Intercept + Group

Levene's Test of Equality of Error Variances (a)

                      F       df1   df2   Sig.
Q1 Grade Mean         2.007   3     216   .114
Transfer Score Mean   1.031   3     216   .380

Tests the null hypothesis that the error variance of the dependent variable is equal across groups.
a. Design: Intercept + Group

MANOVA Delayed Task

Box's Test of Equality of Covariance Matrices (a)

Box's M   9.421
F         1.019
df1       9
df2       199620.163
Sig.      .421

Tests the null hypothesis that the observed covariance matrices of the dependent variables are equal across groups.
a. Design: Intercept + Group


Levene's Test of Equality of Error Variances (a)

                         F      df1   df2   Sig.
Q1 Delayed Grade Mean    .706   3     139   .550
Transfer Score Delayed   .751   3     139   .523

Tests the null hypothesis that the error variance of the dependent variable is equal across groups.
a. Design: Intercept + Group

ANOVA Evaluation of Content

Test of Homogeneity of Variances: Rating of instructional content

Levene Statistic   df1   df2   Sig.
5.954              3     216   .001

ANOVA Evaluation of Visual Design

Test of Homogeneity of Variances: Rating of visual design

Levene Statistic   df1   df2   Sig.
5.711              3     216   .001


Appendix M: MANOVA for Immediate Task Performance

Descriptive Statistics

                      Group   Mean     Std. Deviation   N
Q1 Grade Mean         1       5.1792   2.72646          53
                      2       6.3246   2.67009          57
                      3       5.5625   2.55852          56
                      4       6.4722   2.09766          54
                      Total   5.8909   2.56503          220
Transfer Score Mean   1       2.4528   1.20999          53
                      2       2.5526   1.23449          57
                      3       2.7768   1.22445          56
                      4       3.0370   1.04544          54
                      Total   2.7045   1.19511          220

Multivariate Tests (a)

Effect      Test                 Value   F          Hypothesis df   Error df   Sig.   Partial Eta Squared
Intercept   Pillai's Trace       .875    754.011b   2.000           215.000    .000   .875
            Wilks' Lambda        .125    754.011b   2.000           215.000    .000   .875
            Hotelling's Trace    7.014   754.011b   2.000           215.000    .000   .875
            Roy's Largest Root   7.014   754.011b   2.000           215.000    .000   .875
Group       Pillai's Trace       .077    2.869      6.000           432.000    .009   .038
            Wilks' Lambda        .925    2.857b     6.000           430.000    .010   .038
            Hotelling's Trace    .080    2.845      6.000           428.000    .010   .038
            Roy's Largest Root   .045    3.252c     3.000           216.000    .023   .043

a. Design: Intercept + Group
b. Exact statistic
c. The statistic is an upper bound on F that yields a lower bound on the significance level.


Tests of Between-Subjects Effects

Source            Dependent Variable    Type III Sum of Squares   df    Mean Square   F          Sig.   Partial Eta Squared
Corrected Model   Q1 Grade Mean         61.849a                   3     20.616        3.229      .023   .043
                  Transfer Score Mean   10.936b                   3     3.645         2.608      .053   .035
Intercept         Q1 Grade Mean         7612.059                  1     7612.059      1192.289   .000   .847
                  Transfer Score Mean   1608.202                  1     1608.202      1150.771   .000   .842
Group             Q1 Grade Mean         61.849                    3     20.616        3.229      .023   .043
                  Transfer Score Mean   10.936                    3     3.645         2.608      .053   .035
Error             Q1 Grade Mean         1379.032                  216   6.384
                  Transfer Score Mean   301.860                   216   1.397
Total             Q1 Grade Mean         9075.500                  220
                  Transfer Score Mean   1922.000                  220
Corrected Total   Q1 Grade Mean         1440.882                  219
                  Transfer Score Mean   312.795                   219

a. R Squared = .043 (Adjusted R Squared = .030)
b. R Squared = .035 (Adjusted R Squared = .022)
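The tables above come from SPSS's GLM procedure. A minimal Python sketch of the same one-way MANOVA (assuming a hypothetical DataFrame df with columns q1_grade_mean, transfer_mean, and group) could use statsmodels:

# Minimal sketch (not the original SPSS run): one-way MANOVA with the two
# immediate-task scores as dependent variables and condition as the factor.
from statsmodels.multivariate.manova import MANOVA

def immediate_manova(df):
    model = MANOVA.from_formula("q1_grade_mean + transfer_mean ~ C(group)", data=df)
    return model.mv_test()  # reports Pillai's trace, Wilks' lambda, etc.

# print(immediate_manova(df))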


Appendix N: Discriminant Analysis

Wilks' Lambda

Step   Number of Variables   Lambda   df1   df2   df3   Exact F   df1   df2       Sig.
1      1                     .957     1     3     216   3.229     3     216.000   .023
2      2                     .925     2     3     216   2.857     6     430.000   .010

Pairwise Group Comparisons (a, b)

Step 1:
Group 1 vs. Group 2: F = 5.643, Sig. = .018
Group 1 vs. Group 3: F = .626, Sig. = .430
Group 1 vs. Group 4: F = 7.004, Sig. = .009
Group 2 vs. Group 3: F = 2.569, Sig. = .110
Group 2 vs. Group 4: F = .095, Sig. = .759
Group 3 vs. Group 4: F = 3.564, Sig. = .060

Step 2:
Group 1 vs. Group 2: F = 3.329, Sig. = .038
Group 1 vs. Group 3: F = 1.018, Sig. = .363
Group 1 vs. Group 4: F = 4.358, Sig. = .014
Group 2 vs. Group 3: F = 3.800, Sig. = .024
Group 2 vs. Group 4: F = 2.857, Sig. = .060
Group 3 vs. Group 4: F = 1.784, Sig. = .170

a. 1, 216 degrees of freedom for step 1.
b. 2, 215 degrees of freedom for step 2.


Eigenvalues

Function   Eigenvalue   % of Variance   Cumulative %   Canonical Correlation
1          .045a        56.6            56.6           .208
2          .035a        43.4            100.0          .183

a. First 2 canonical discriminant functions were used in the analysis.

Wilks' Lambda

Test of Function(s)   Wilks' Lambda   Chi-square   df   Sig.
1 through 2           .925            16.888       6    .010
2                     .967            7.345        2    .025

Standardized Canonical Discriminant Function Coefficients

                      Function 1   Function 2
Q1 Grade Mean         1.098        -.470
Transfer Score Mean   -.207        1.176
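SPSS's stepwise discriminant analysis has no single-call Python equivalent, but the descriptive part (the canonical functions and the variance each explains) can be sketched with scikit-learn; the column names below are hypothetical, and the resulting coefficients are unstandardized rather than the standardized ones reported above.

# Sketch only: linear discriminant analysis of the two immediate-task scores
# by condition, assuming a hypothetical DataFrame df.
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def discriminant_sketch(df):
    X = df[["q1_grade_mean", "transfer_mean"]].values
    y = df["group"].values
    lda = LinearDiscriminantAnalysis(n_components=2).fit(X, y)
    # Share of between-group variance captured by each canonical function,
    # plus the (unstandardized) discriminant coefficients.
    return lda.explained_variance_ratio_, lda.scalings_

# variance_ratio, coefficients = discriminant_sketch(df)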


Appendix O: MANOVA for Delayed Task Performance

Descriptive Statistics

                         Group   Mean     Std. Deviation   N
Q1 Delayed Grade Mean    1       4.1765   2.32210          34
                         2       4.9487   2.33338          39
                         3       5.3125   2.28512          32
                         4       5.0789   1.88352          38
                         Total   4.8811   2.22419          143
Transfer Score Delayed   1       2.68     .912             34
                         2       2.79     1.031            39
                         3       3.06     1.076            32
                         4       2.71     1.137            38
                         Total   2.80     1.043            143

Multivariate Tests (a)

Effect      Test                 Value   F          Hypothesis df   Error df   Sig.   Partial Eta Squared
Intercept   Pillai's Trace       .897    602.234b   2.000           138.000    .000   .897
            Wilks' Lambda        .103    602.234b   2.000           138.000    .000   .897
            Hotelling's Trace    8.728   602.234b   2.000           138.000    .000   .897
            Roy's Largest Root   8.728   602.234b   2.000           138.000    .000   .897
Group       Pillai's Trace       .048    1.141      6.000           278.000    .338   .024
            Wilks' Lambda        .952    1.136b     6.000           276.000    .341   .024
            Hotelling's Trace    .050    1.131      6.000           274.000    .344   .024
            Roy's Largest Root   .036    1.689c     3.000           139.000    .172   .035

a. Design: Intercept + Group
b. Exact statistic
c. The statistic is an upper bound on F that yields a lower bound on the significance level.


Appendix P: ANOVA for Instructional Content Rating

Descriptives: Rating of instructional content

Group                      N     Mean   Std. Deviation   Std. Error   95% CI Lower   95% CI Upper   Minimum   Maximum
Control                    53    3.74   .964             .132         3.47           4.00           1         5
Emotional Design Graphic   57    4.00   .655             .087         3.83           4.17           2         5
Storytelling               56    3.66   .959             .128         3.40           3.92           1         5
EDG and Storytelling       54    4.11   .718             .098         3.92           4.31           2         5
Total                      220   3.88   .849             .057         3.76           3.99           1         5

ANOVA: Rating of instructional content

Source           Sum of Squares   df    Mean Square   F       Sig.
Between Groups   7.498            3     2.499         3.594   .014
Within Groups    150.189          216   .695
Total            157.686          219

Contrast Coefficients

Contrast   Control   Emotional Design Graphic   Storytelling   EDG and Storytelling
1          -.5       .5                         -.5            .5
2          -.5       -.5                        .5             .5

Contrast Tests: Rating of instructional content

                                  Contrast   Value of Contrast   Std. Error   t       df        Sig. (2-tailed)
Assume equal variances            1          .36                 .112         3.176   216       .002
                                  2          .02                 .112         .160    216       .873
Does not assume equal variances   1          .36                 .113         3.164   192.225   .002
                                  2          .02                 .113         .159    192.225   .874
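Contrast 1 above pools the two conditions with enhanced graphics against the two with plain graphics. The pooled-variance contrast test can be recomputed by hand from the group summaries in this appendix; the short sketch below does exactly that (values are taken from the tables above, and small differences from the reported .36 / 3.176 arise from using rounded group means).

# Pooled-variance planned contrast (contrast 1) recomputed from the summary
# statistics reported in this appendix.
import math
from scipy import stats

means = [3.74, 4.00, 3.66, 4.11]      # Control, EDG, Storytelling, EDG and Storytelling
ns = [53, 57, 56, 54]
weights = [-0.5, 0.5, -0.5, 0.5]      # contrast 1 coefficients
ms_error = 0.695                      # within-groups mean square from the ANOVA table
df_error = 216

estimate = sum(w * m for w, m in zip(weights, means))
se = math.sqrt(ms_error * sum(w**2 / n for w, n in zip(weights, ns)))
t = estimate / se
p = 2 * stats.t.sf(abs(t), df_error)
print(f"{estimate:.3f} {se:.3f} {t:.3f} {p:.4f}")  # approximately 0.355, 0.112, 3.16, .002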

Appendix Q: ANOVA for Visual Design Rating

Descriptives: Rating of visual design

Group                      N     Mean   Std. Deviation   Std. Error   95% CI Lower   95% CI Upper   Minimum   Maximum
Control                    53    3.30   1.067            .147         3.01           3.60           1         5
Emotional Design Graphic   57    3.95   1.076            .143         3.66           4.23           1         5
Storytelling               56    3.46   1.144            .153         3.16           3.77           1         5
EDG and Storytelling       54    4.19   .675             .092         4.00           4.37           2         5
Total                      220   3.73   1.063            .072         3.59           3.87           1         5

ANOVA: Rating of visual design

Source           Sum of Squares   df    Mean Square   F       Sig.
Between Groups   27.548           3     9.183         9.012   .000
Within Groups    220.089          216   1.019
Total            247.636          219

Contrast Coefficients

Contrast   Control   Emotional Design Graphic   Storytelling   EDG and Storytelling
1          -.5       .5                         -.5            .5

Contrast Tests: Rating of visual design

                                  Contrast   Value of Contrast   Std. Error   t       df        Sig. (2-tailed)
Assume equal variances            1          .68                 .136         5.017   216       .000
Does not assume equal variances   1          .68                 .136         5.037   196.898   .000


Appendix R: Reliability Analysis for Pilot Study

Reliability Statistics

Cronbach's Alpha   N of Items
.522               6

Item Statistics

           Mean     Std. Deviation   N
Q1_Score   3.1818   2.04050          11
Q2_Score   .8182    .40452           11
Q3_Score   .2727    .46710           11
Q4_Score   .1818    .40452           11
Q5_Score   .6364    .50452           11
Q6_Score   .4545    .52223           11

Item-Total Statistics

           Scale Mean if Item Deleted   Scale Variance if Item Deleted   Corrected Item-Total Correlation   Cronbach's Alpha if Item Deleted
Q1_Score   2.3636                       1.855                            .586                               .527
Q2_Score   4.7273                       8.418                            .294                               .497
Q3_Score   5.2727                       8.618                            .159                               .522
Q4_Score   5.3636                       8.255                            .368                               .482
Q5_Score   4.9091                       7.691                            .474                               .440
Q6_Score   5.0909                       7.491                            .528                               .422

Scale Statistics

Mean     Variance   Std. Deviation   N of Items
5.5455   9.273      3.04512          6
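For reference, the Cronbach's alpha reported above follows directly from the item variances and the scale variance. The generic sketch below (the items DataFrame of per-participant item scores is hypothetical) implements that formula; applied to the summaries above (sum of item variances of about 5.24, scale variance 9.273, k = 6), it gives an alpha of about .52, matching the table.

# Generic Cronbach's alpha from a participants x items score matrix.
import pandas as pd

def cronbach_alpha(items):
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Example: alpha = cronbach_alpha(df[["Q1_Score", "Q2_Score", "Q3_Score",
#                                     "Q4_Score", "Q5_Score", "Q6_Score"]])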
