
ENHANCING SPATIAL VISUALIZATION SKILLS IN FIRST-YEAR STUDENTS

DISSERTATION

Presented in Partial Fulfillment of the Requirements for the Degree Doctor of Philosophy

in the Graduate School of The Ohio State University

By

Yosef S. Allam

One-of-a-Kind Doctor of Philosophy

The Ohio State University

2009

Dissertation Committee:

Clark A. Mount-Campbell, Advisor

Patricia A. Brosnan

Robert J. Gustafson

Douglas T. Owens

Copyright by

Yosef S. Allam

2009

ABSTRACT

Spatial visualization skills are a function of genetics and life experiences. An

individual’s genetic spatial visualization aptitude can be enhanced through proper

instruction and practice. Spatial visualization skills are important to engineers as they

help with problem formulation and thus enhance problem-solving ability. They are also vital to an engineer’s ability to create and interpret visual representations of design ideas.

This study seeks to investigate the experiential factors affecting spatial visualization skills and methods with which these skills can be enhanced. This study also investigates the correlation between spatial visualization ability and pre-college life experiences, as well as spatial visualization ability and academic performance. Participants were selected from an introductory engineering course. Participants in the treatment and control groups

were pre- and post-tested using the Purdue Spatial Visualization Test—Rotations to

gauge spatial visualization ability. The treatment consisted of students being given a series of computer-generated representations of figures from various perspectives that may aid in visualization of these objects. Scores between the treatment and control groups were compared and checked for statistical significance. Participants were also given a questionnaire to complete. The answers from the questionnaire were coded for levels of pre-college experience in certain key areas that are hypothesized to aid in the development of spatial visualization skills. These quantitative experience levels were


correlated to pre-test results to verify the hypothesis of these life experiences’

significance in spatial visualization ability development. The relationship between student academic performance and spatial visualization ability was also investigated.

Instructional tool utilization and access effects on spatial visualization skill gains between pre-tests and post-tests were not significant. This is potentially due to a substitutive, rather than additive, effect on student experiences through usage of the instructional tool. Developmental experiences with stackable toys such as Legos and building blocks were a significant predictor of initial spatial ability, confirming previous findings. Developmental experiences with home improvement activities significantly affected graded performance in coursework. Initial spatial ability was also a significant predictor of course grades.

Web-based course grade and resource applications can be used successfully to deploy open-access electronic instructional tools. Institutional web applications can also be used to automate and conduct large studies. Access, utilization logging, grades, and enforcement of experimental design parameters and treatment group segregation can be provisioned via online course grade and resource repositories such as Carmen, by Desire2Learn, employed in this study. Improvements in data accessibility must be made in course web applications to facilitate more studies conducted in this manner. Improvements in usability, reporting, and analysis are necessary to allow for streamlined study implementation and data dissemination for educational research in courses employing web applications. Improvements to course web applications can allow educational researchers to effectively capitalize on this technology.


DEDICATION

For my family, my mother Magda, my father Serry, and my sister Dina.


ACKNOWLEDGEMENTS

It was through the guidance, support, and encouragement of many that the goals

of this researcher were realized. Great thanks and appreciation are due to those that

follow:

Dr. Clark Mount-Campbell, my advisor, for his patience and latitude to allow me

to find my own way, as well as his guidance and friendship throughout the years we have

known each other. Our conversations over the years spurred many thoughts in new

directions.

Dr. Patti Brosnan, for her support and encouragement to consider various research

methods, as well as her openness to allowing me to perform mathematics education course

projects with engineering students. These projects formed the initial groundwork for this

study.

Dr. Bob Gustafson, for allowing and encouraging me to use the First-Year

Engineering Program as a laboratory for my research, as well as the many other

opportunities which allowed me to gain experience in an environment rich in engineering

education innovation.

Dr. Doug Owens, for his help from the onset and throughout my journey into

educational research. His guidance in the realm of pedagogical theory as well as

selection of coursework best suited for my research interests and goals was indispensable.


Dr. John Merrill, who made available to me every opportunity in his power to

allow me to pursue my passion for engineering education, and for keeping me informed of new developments and opportunities.

Dr. David Tomasko, whose energy and passion for both research and engineering education are inspirational. Working with him and seeking his advice over the years has enriched my journey to this goal.

All the fine people of The Ohio State University over the years who unfortunately cannot be listed here. The institution and its people have helped me realize my dreams.

Finally, I would also like to thank my friends and family who encouraged me along the way and listened. Of particular note are my mother, father, and sister, as well as my friend Dr. Srikant Nekkanty, who empathized and commiserated, and my lifelong friend and confidant, Stephanie, for her help, encouragement, and critical eye in the final stages of this endeavor.


VITA

February 25, 1974………………………….. Born – Bayonne, New Jersey

1997………………………………………... B. S. Industrial and Systems Engineering The Ohio State University

1997 – 1999………………………………... Graduate Teaching Associate Industrial and Systems Engineering The Ohio State University

1999………………………………………... M. S. Industrial and Systems Engineering Specialization: Operations Research The Ohio State University

2002………………………………………... Instructor and Curriculum Developer First-Year Engineering Programs The Ohio State University

2002 – 2008………………………………... Graduate Teaching Associate, Lead Graduate Teaching Associate, Curriculum Developer First-Year Engineering Programs The Ohio State University

2008 – 2009……………………………….. NSF Graduate STEM Fellow in K-12 Education (GK-12) The Ohio State University

2009 – Present……………………………... Instructor First-Year Engineering Programs The Ohio State University


PUBLICATIONS

Allam, Y., & Irani, S. A. (1999). Systematic redesign of a manufacturing cell. In S. A. Irani (Ed.), Handbook of Cellular Manufacturing Systems (pp. 661-679). New York: Wiley.

Allam, Y., Tomasko, D. L., Trott, B., Schlosser, P., Yang, Y., Wilson, T. M., & Merrill, J. (2008). Lab-on-a-chip design-build project with a nanotechnology component in a freshman engineering course. Chemical Engineering Education, 42(4), 185-192.

FIELDS OF STUDY

Major Field: One-of-a-Kind Doctor of Philosophy

Engineering Education


TABLE OF CONTENTS

ABSTRACT ...... ii

DEDICATION ...... iv

ACKNOWLEDGEMENTS ...... v

VITA ...... vii

LIST OF FIGURES ...... xiii

LIST OF TABLES ...... xiv

CHAPTER 1: CONTEXT, RATIONALE, AND THEORETICAL FRAMEWORK ...... 1

Problem Context ...... 1
Motivating Factors ...... 1
Problem Statement ...... 3
Research Questions ...... 4
Rationale: The Significance of Spatial Visualization Skills in Engineering Education ...... 5
Theoretical Framework ...... 6
Factors Affecting Spatial Visualization Ability ...... 6
Learning Theory and Cognitive Development ...... 6
Piagetian Spatial Development ...... 10
Representation ...... 15
Theoretical Implications ...... 19


CHAPTER 2: LITERATURE REVIEW ...... 22

Historical Perspective ...... 23
Significance of Spatial Visualization Skills in Engineering Education ...... 24
Assessing Visualization Skills ...... 25
Factors Affecting Spatial Visualization Ability ...... 27
Alternatives to Promoting Good Visualization Skills ...... 30
Enhancing Visualization Skills ...... 31
Virtual Three-dimensional Remediation ...... 33
Problem-based Learning ...... 38
Implications ...... 39
Summary ...... 40

CHAPTER 3: METHODOLOGY, DATA COLLECTION AND ANALYSIS ...... 41

Participants ...... 42
Instruments ...... 42
Treatment ...... 46
Procedures ...... 47
Research Design ...... 51
Sample and Initial Calculations ...... 51
Underlying Considerations ...... 54
Data Collection and Parsing Devices ...... 55
Data Logs ...... 55
Other Data ...... 56
Master Spreadsheet Data and Pre-Analysis Activities ...... 57
Hypotheses, Models, and Variables ...... 57
Tool Access Effects on Spatial Visualization Improvement ...... 58
Tool Utilization Effects on Spatial Visualization Improvement ...... 59
Experience Effects on Spatial Visualization Ability Gains ...... 60
Experience Effects on Initial Spatial Visualization Ability ...... 61
Experience Effects on Grades ...... 62
Tool Utilization Effects on Grades ...... 63
Spatial Visualization Pre-test Effects on Grades ...... 64


Risks ...... 65
Internal Validity ...... 66
Data Analysis ...... 66
Tool Effectiveness ...... 66
Limitations ...... 67

CHAPTER 4: DATA ANALYSIS AND RESULTS ...... 70

Treatment of the Data ...... 70
Preliminary Analysis of Data: General Trends and Relationships ...... 72
Principal Analysis of Data: Hypothesis Testing ...... 81
Tool Access Effects on Spatial Visualization Improvement ...... 81
Tool Utilization Effects on Spatial Visualization Improvement ...... 86
Experience Effects on Spatial Visualization Ability Gains ...... 89
Experience Effects on Initial Spatial Visualization Ability ...... 92
Experience Effects on Grades ...... 94
Tool Utilization Effects on Grades ...... 100
Spatial Visualization Pre-test Effects on Grades ...... 105
Student Feedback ...... 113
Questionnaire ...... 113
Focus Group ...... 116

CHAPTER 5: DISCUSSION, CONCLUSIONS, AND RECOMMENDATIONS ...... 117

Discussion and Conclusions ...... 118
Discussion and Conclusions of Results ...... 118
Tool Access Effects on Spatial Visualization Improvement ...... 118
Tool Utilization Effects on Spatial Visualization Improvement ...... 118
Experience Effects on Spatial Visualization Ability Gains ...... 120
Experience Effects on Initial Spatial Visualization Ability ...... 121
Experience Effects on Grades ...... 122
Tool Utilization Effects on Grades ...... 123
Spatial Visualization Pre-test Effects on Grades ...... 123
Survey Responses ...... 124
Discussion and Conclusions of Limitations ...... 125

Recommendations ...... 127
Recommendations for Practice ...... 127
Recommendations for Research ...... 128

REFERENCES ...... 131

APPENDIX A: PURDUE SPATIAL VISUALIZATION TEST – ROTATIONS ...... 137

APPENDIX B: EXPERIENCE SCORE LIKERT-TYPE SURVEY QUESTIONS ...... 153

APPENDIX C: SURVEY AND FOCUS GROUP SOLICITATION ...... 155

APPENDIX D: RELEVANT ENG 181 WINTER 2008 DRAWING ASSIGNMENTS ...... 159

APPENDIX E: ENG 181 WINTER 2008 DAILY ASSIGNMENT LIST ...... 172

APPENDIX F: CONSENT DOCUMENTATION ...... 174

APPENDIX G: ENG 181 WINTER 2008 EXAM DRAWING PROBLEMS ...... 180

APPENDIX H: DATA PARSING MACRO ...... 188

APPENDIX I: SQL SCRIPT ...... 197


LIST OF FIGURES

Figure Page

1.1: Engineering graphics concepts required for creating graphical representations...... 2

1.2: Piagetian cognitive and spatial development...... 15

1.3: Important spatial visualization concerns and theoretical protocol...... 20

3.1: Power curve for one-way ANOVA, n=112, α=0.05, σ=5.3...... 53

3.2: Potential pattern of results given effective instructional tool...... 67

4.1: Average spatial visualization score plotted vs. Experience score...... 74

4.2: Average spatial visualization post-test score plotted vs. spatial visualization pre-test score...... 75

4.3: Main effects for PSVT—R post-test for the usage random factor...... 89

4.4: Drawing 11 grades plotted against PSVT—R pre-test scores by animation tool usage...... 113

LIST OF TABLES

Table Page

4.1: Descriptive statistics for blocked sections, excluding grades...... 73

4.2: Pearson product moment correlation coefficients for initial spatial ability, various Drawing and non-drawing grades, and Experience scores...... 77

4.3: Pearson product moment correlation coefficients for initial spatial ability, various Drawing and non-drawing grades, animation tool utilization, and Experience scores...... 78

4.4: T-test for significant differences in PSVT—R means between control and treatment groups, filtered for pre-test scores, grades, and Experience scores...... 79

4.5: T-test for significant differences in PSVT—R means between control and treatment groups, filtered for pre-test scores...... 80

4.6: ANOVA for significant differences in PSVT—R between blocked labs, filtered for pre-test scores...... 81

4.7: GLM for significant differences in blocked labs, filtered for pre-test and post-test scores excluding zeros...... 81

4.8: Descriptive statistics for data set filtered for PSVT—R and Experience scores...... 83

4.9: GLM ANCOVA for significant differences in PSVT—R post-test between control and treatment groups with blocked labs and experience and pre-test scores as covariates...... 84

4.10: GLM ANCOVA for significant differences in PSVT—R post-test between control and treatment groups, given blocked labs and pre-test scores as covariates...... 85

4.11: GLM ANCOVA for significant effects of the Overall Total Time utilization covariate on PSVT—R post-test scores...... 87

4.12: GLM for PSVT—R post-test score effects by the animation usage factor...... 89


4.13: Experience effects and interactions with PSVT—R test gains...... 91

4.14: GLM for PSVT—R pre-test score effects from Experience composite scores...... 92

4.15: GLM for PSVT—R pre-test score effects by Experience components: MODELS, SPORTS, and MUSIC...... 94

4.16: GLM for response Non-visualization grades effects by Experience score covariates...... 96

4.17: GLM for response all visualization grades effects by Experience score covariates. 97

4.18: GLM for response of drawing problem scores on midterm exam effects by Experience score covariates...... 98

4.19: GLM for drawing assignment visualization grades effects by Experience score covariates...... 99

4.20: Effects of animation tool usage on drawing visualization grades...... 101

4.21: Effects of animation tool visits on drawing visualization grades...... 102

4.22: Effects of animation tool visits on midterm exam visualization grades...... 103

4.23: T-test for Drawing 11 scores of those using the Drawing 11 animation tool versus those who did not use the tool...... 104

4.24: GLM for Drawing 11 animation tool usage effects on Drawing 11 grades...... 105

4.25: Effects of PSVT—R pre-test scores on all grades...... 106

4.26: Effects of PSVT—R pre-test scores on all but Drawing assignments...... 107

4.27: Effects of PSVT—R pre-test scores on all but Drawing assignments and exam Drawing grades...... 108

4.28: Effects of PSVT—R pre-test scores on Drawing assignment and exam Drawing grades...... 109

4.29: Effects of PSVT—R pre-test scores on Drawing assignment grades...... 110

4.30: Effects of PSVT—R pre-test scores on Drawing midterm exam grades...... 111

4.31: Effects of PSVT—R pre-test scores and Experience composite scores on Drawing midterm exam grades...... 112


CHAPTER 1: CONTEXT, RATIONALE, AND THEORETICAL FRAMEWORK

Problem Context

Interactions with everything concrete or tangible in this world form the experiences and actions from which mental representations of three-dimensional objects in space and time are created for understanding and mental processing. The ability to statically create and dynamically project mental representations of three-dimensional objects in different space and time settings is often referred to as spatial visualization. Spatial visualization skills are a result of genetics (potential) and life experiences (practice and development).

Varying levels of potential and practice and development in individuals result in varied levels of spatial visualization skills. Spatial visualization skills in young adults can be enhanced through instruction and practice (Piburn, Reynolds, McAuliffe, et al., 2005;

Crown, 2001; Deno, 1995; Lord, 1985; Pallrand & Seeber, 1984; Peters, Chisholm, &

Laeng, 1995; Sorby & Baartmans, 1996).

Motivating Factors

Specific to this application, spatial visualization skills are an important aspect of

engineering education. They enhance student understanding and help students formulate

problems. Spatial visualization ability is essential for engineers to perform adequately in

academia and industry. Visualization ability is often indicative of academic performance in the engineering curriculum (Yue, 2002).


Today, engineering graphics instructional teams strive to develop in students the

necessary spatial visualization and graphical representation skills to allow them to

express ideas in the course of solving problems. The major skills required of engineering graphics students for the generation of graphical representations are depicted

in Figure 1.1.

Figure 1.1: Engineering graphics concepts required for creating graphical representations.

In the context in which it was defined earlier, spatial visualization is a subcategory of visualization. Visualization can be defined as “the ability to take an idea from one’s mind and model it…” and “...the ability to comprehend someone else’s model” (Newcomer, McKell, & Raudebaugh, 2001, p. 33).

Given that first-year engineering students enter the academic field of engineering

with a variety of backgrounds and experiences, and thus, varying degrees of spatial

visualization skill development, some students need additional attention and effort. Standard instruction and coursework at the introductory level of engineering graphics are sometimes not enough. Students on the lower end of spatial visualization

skill development need to close the gap between the level of their skills in this area and

the level of the more honed skills of their peers.

Since spatial visualization skills are essential for success in engineering and other

technical fields of study, it is important to develop spatial visualization ability during the

developmental years of students. The need for remediation can be avoided through

activities involving building, tactile manipulation of objects, and other tasks to reinforce

these skills prior to college. Good spatial visualization skills can maintain low student

attrition rates in engineering and improve general student performance in the engineering curriculum. Problem-solving and problem formulation abilities are related to

spatial visualization skills (Carter, LaRussa, & Bodner, 1987). A lack of high spatial

visualization capabilities on the part of entering freshman students at the postsecondary

level may result in students selecting programs of study outside of engineering and other

technical fields. Spatial visualization competency is thus a significant concern.

Problem Statement

Undergraduate students entering a first-year engineering program arrive with a

variety of backgrounds, experiences, and abilities. Spatial visualization skills are among

those abilities that have been correlated to performance in engineering graphics and other

introductory first-year engineering courses, as well as the overall quality of engineers.

This study will examine tools and techniques that can be used to enhance the spatial visualization skills of beginning engineering students. The relationship between grades, spatial visualization instructional tool usage, developmental experiences, and spatial visualization test scores will also be investigated.

Research Questions

This study will address the following questions:

1. Can a computer-aided design (CAD)-based 3-dimensional animated model of existing, assigned, paper-based engineering graphics problems be used as an effective instructional tool that enables students to enhance their spatial visualization skills?

2. Does participation in certain activities prior to enrolling in an engineering degree program in college predict initial scores on spatial visualization tests?

3. Do students’ experience levels in certain developmental activities, positively correlated to spatial visualization ability, predict student gains in spatial visualization tests?

4. How do spatial visualization skills in engineering students, as gauged by spatial visualization tests, relate to performance on relevant engineering graphics assignments?

5. How do spatial visualization skills in engineering students, as gauged by spatial visualization tests, relate to performance on relevant non-graphics assignments?


Rationale: The Significance of Spatial Visualization Skills in Engineering Education

Engineering graphics courses develop graphical communication and spatial

visualization skills in student engineers. Spatial visualization skills should be emphasized throughout the engineering curriculum. The use of spatial instructional strategies in engineering courses can reinforce student abilities (Hsi, Linn, & Bell, 1997).

Exam scores in engineering and other technical fields of study are significantly correlated to spatial visualization ability. Understanding of concepts, as evaluated by testing, has been shown to be predicted by initial spatial visualization abilities of

physics students (Kozhevnikov & Thornton, 2006).

Spatial visualization skills should be promoted through remedial instruction at the

collegiate level or during the developmental periods prior to college. Students who claim

a general dislike for the sciences and technical disciplines often lack adequate spatial

visualization skills regardless of their performance in other areas (Pallrand & Seeber,

1984). Spatial visualization is also strongly related to problem solving ability, an essential aspect of engineering (Carter, et al., 1987).

Technology tools such as computer-aided design and modeling (CAD and CAM) do not alleviate the need for strong spatial visualization skills, nor does the practice of blind adherence to analytic procedures or algorithms in the course of providing alternate representations of objects and designs. Students who do not possess the prerequisite skill set should be given options to remediate and enhance their spatial visualization skills to prepare them for engineering graphics and the remaining engineering curriculum for which possession of these skills is essential. Students should be discouraged from circumventing a skill area necessary for proficiency in their future profession.


Theoretical Framework

Factors Affecting Spatial Visualization Ability

Early research on spatial visualization skills suggested that spatial visualization ability is wholly innate (Lord, 1985). Modern studies, however, indicate that spatial visualization skills can be improved through practice (Piburn, et al., 2005; Crown, 2001;

Deno, 1995; Lord, 1985; Pallrand & Seeber, 1984; Peters, et al., 1995; Sorby &

Baartmans, 1996). Those with deficiencies in spatial visualization ability can work to develop their skills.

The source of spatial visualization deficiencies includes a lack of previous experiences such as actions performed on objects in space. Childhood activities involving construction games and toys, art and sketching, video games, sports, and tactile experiences with concrete objects are among the positive socio-cultural influences listed by researchers (Pallrand & Seeber, 1984; Peters, et al., 1995; Yue, 2002).

Learning Theory and Cognitive Development

Jean Piaget proposes four stages of cognitive development: sensory-motor, which is pre-verbal; pre-operational representation, which creates the underpinnings of language and symbol; concrete operational, which precedes full capability of abstraction; and finally, formal or hypothetic-deductive operational, which allows for full abstraction, such as reasoning on hypotheses and the use of propositional logic (2003, p. S9). Piaget provides ages as guidelines to separate one stage from another: the sensory-motor stage ends around 18 to 24 months of age; the pre-operational representation stage reaches up to seven to eight years of age; the concrete operational stage lasts to eleven to twelve


years of age typically; and the hypothetic-deductive operational stage develops through

adolescence and onward. Piaget’s theories require that development precedes learning.

Learning at higher levels becomes possible only as the individual passes through the four

stages of development. Undergraduate engineering students would thus be in the fourth

Piagetian stage of development.

These stages are traversed as a result of four factors: maturation, limiting mental

potential; experience, or play and interaction with the environment; social transmission,

or social context; and equilibration, the fourth of which he stresses and discusses in great

depth (Piaget, 2003, S10-S14). Piaget says that individuals, upon encountering a new piece of knowledge, always seek to maintain balance or equilibrium. Upon feeling perturbation from a new piece of knowledge, an individual learner will first attempt to assimilate this information into their existing structures, and, if this fails to attain the necessary balance or comfort level, they will accommodate it via a restructuring of their

existing mental structures in an effort to avoid feeling perturbation (Piaget, 2003, S13-

S14).

Piaget talks of a structure which acts as an intermediary between “stimulus” and

“response” in the traditional S—R model (2003, S14). The response essentially already exists if there is any response at all, as the response is only elicited by the stimulus if there is supporting structure. The structure continues to develop through assimilation and accommodation, potentially affecting the evolution of the response with each iteration, thus the circular nature of the model.

Those in Piaget’s fourth stage of cognitive development are typified by the ability to delve into the hypothetical without having to see or interact with the concrete; abstract thought is possible (2003, pp. S9-S10). An individual in this last stage of development is capable of reasoning “on the basis of simple assumptions which have no necessary relation to reality or to the subject’s beliefs, and from the time when he relies on the necessary validity of an inference (vi formae), as opposed to agreement of the conclusions with experience” (Piaget, 1976, p. 148). In other words, thought is not bounded by the finite set of experiences of the individual in that “there is even more than reality involved, since the world of the possible becomes available for construction and since thought becomes free from the real world” (Piaget, 1976, p. 151). Eventually the ties to “real action” are not necessary as the individual can make “hypothetic-deductive operations concerning pure implications from propositions stated as postulates” (Piaget, 1976, p. 153).

Lev Vygotsky (1978) defines two terms, one of which is the Zone of Proximal

Development (ZPD), while the other is “actual developmental level” (p. 85). The actual developmental level is defined as what a child can do independently, without the assistance of others. The ZPD is everything beyond that point which can be performed by the child with assistance. Vygotsky (1978) specifically says the following about the

ZPD:

It is the distance between the actual developmental level as determined by

independent problem solving and the level of potential development as

determined through problem solving under adult guidance or in collaboration with

more capable peers. (p. 86)

Vygotsky states that learning should lead development and encourage development. The only learning of any use to children is that which advances development (Vygotsky, 1978, p. 89). Through the exercising of the child’s ZPD with


the help of those in the social context providing necessary guidance and social

interactions, internal developmental processes are utilized and knowledge from actual

experience in the social context is eventually re-constructed and internalized to expand

the child’s attained developmental level (Vygotsky, 1978, p. 90).

Progression of learning and development, and expansion of the ZPD thus requires

the assistance of an adult or capable peer. After Vygotsky’s time, this is referred to as

scaffolding (Bruner, 1990, p. 106). Scaffolding is any type of external aid or assistance

provided directly by or developed in proxy by someone more capable, and can take the

form of books, conversation, teamwork, computer-based exercises and instruction,

questions, discussion, or any cultural artifact in the social context which aids the learner

in some manner in their ZPD as they internalize and re-construct that which they take

from the social context. A form of scaffolding, for example, could be three-dimensional

rotated animations for students in engineering graphical communications classes.

So learning drives mental development; independent ability is representative of actual development, while ability with assistance is representative of prospective development. Therefore, prospective development, or the Zone of Proximal

Development, should be the target area of instruction. Two individuals may have the same level of actual developmental ability, but they do not necessarily have a zone of proximal development which encompasses the same scope, breadth, or depth in any given area. This suggests the need for a variety of different types, and by extension, sources of scaffolding. This is something difficult for one teacher to provide on their own, and causes educators to look elsewhere for “non-traditional” models and sources of scaffolding, including indirect instructional methods.


Discussion of Vygotsky and Piaget and their theories on the structuring of knowledge thus inevitably leads to the theory of constructivism. The two guiding principles, as stated by von Glasersfeld (1989) are:

a) Knowledge is not passively received either through the senses or by way of

communication; b) knowledge is actively built up by the cognizing subject.

a) The function of cognition is adaptive, in the biological sense of the term,

tending towards fit or viability; [italics in original] b) cognition serves the

subject’s organization of the experiential world, not the discovery of an objective

ontological reality. (p. 5)

Considering the documented work of classics and contemporaries on the issues of social versus radical constructivism, it is apparent that regardless of social considerations, the individual still constructs knowledge to satisfy his/her world as his/her perception allows.

Piagetian Spatial Development

In Piaget’s works, there are parent and subordinate spatial abilities in a two-tiered hierarchical tree of spatial abilities which overlap in development with some degree of sequence but also some concurrence. These spatial abilities generally increase in complexity and developmental level required for mastery and are acquired and honed as the child ages and develops mentally. These spatial abilities (or “spaces”) and the sub- abilities within each space chronologically, but loosely, follow the four stages of cognitive development (Piaget, 1967) at different paces and points of inception. In general, of the three spaces, the two more advanced spaces and their associated spatial

sub-abilities develop later in the four stages of cognitive development than the less advanced space and its sub-abilities.

Before delving into the details of these spatial abilities, sub-abilities, and how they relate to the stages of cognitive development, Piagetian terminology and nuances should be clarified. There are five terms referred to as “spaces.” To alleviate confusion, it is best to consider perceptual and conceptual space as types of thought, and topological, projective, and Euclidian space as spatial abilities, under each of which are several specific sub-abilities.

Piaget does not effectively distinguish between projective and Euclidian spatial abilities in instances, nor does he specify defined starting points of all spatial ability development relative to perceptual versus conceptual spatial thought (Reese, 1999, p.

167) thus eliciting many gray areas. The implication of all this is that it is important to be cognizant of the generalities involved in the spatial abilities, their corresponding spatial sub-abilities, and how they relate loosely—sometimes concurrently, and sometimes sequentially—to the cognitive stages of development.

Conceptual thought lags behind but grows in tandem with perceptual thought.

Both of these develop simultaneously, with perceptual thought as the original buttressing factor of conceptual thought. Perceptual thought remains ahead of conceptual thought throughout the spatial development of the child even as the child traverses the major spaces of topological, projective, and Euclidian spatial abilities. Most of the topological spatial ability develops alongside perceptual and conceptual thought. As these developments near maturity, the development of projective and Euclidian spatial abilities begins to flourish.


As a child progresses through Piaget’s four stages of cognitive development,

he/she first develops primarily in the area of topological spatial ability and its corresponding sub-abilities. Topological spatial abilities are the most basic of the three spatial abilities and include shape recognition, shape differentiation, order, surrounding, and continuity. At first the infant is in perceptual thought visio-spatially and simultaneously in the sensory-motor cognitive stage of development. At this point the child performs “an imitation of the object by an action” (Piaget, 1967, p. 455) and is unable to mentally capture objects, or conceptualize that which is not immediately accessible. Only the most basic spatial relationships, such as proximity, separation, order, enclosure, and continuity are available with development and time progression through the topological space.

The conceptual spatial ability begins late in the sensory-motor cognitive stage of development and occurs when the child can conceptualize or imagine an object or actions on an object regardless of the object’s presence in the field of the child’s activity. “The construction of space begins on the perceptual level and continues on the representational

[conceptual] one” (Piaget, 1967, p. 38). As development continues and the child progresses through the conceptual level, “the image, which from the very beginning is symbolic in character, plays an increasingly subordinate role as the active component of thought becomes better organized” (Piaget, 1967, p. 456). The perceptual thought development continues and allows for further conceptual thought development as the child acquires more topological spatial abilities. Simultaneously, the child experiences more “decentration,” corresponding to less egocentricity, allowing him/her to conceive of

spatial relationships outside his/her immediate sphere of influence in reality and the surrounding concrete environment, allowing for a later shift in spatial perspectives.

The development of projective and Euclidian spatial abilities are minimally concurrent with the development of the topological abilities and have unclear starting points. These two more advanced spatial abilities trickle at first prior to the final, or hypothetical-mathematical, stage of cognitive development while topological spatial abilities are rapidly developing. Projective and Euclidian spatial abilities then accelerate concurrently as the child nears or reaches full development of topological abilities and cognitive development progresses into the fourth, or logical-mathematical, abstract reasoning stage of cognitive development. It is the projective spatial abilities that allow for understanding of spatial relationships between multiple objects simultaneously and, importantly, the concept of viewpoint (Reese, 1999, p. 167). It is here that object orientation, viewer perspective and similar advanced spatial visualization abilities develop.

Euclidian spatial development allows for measurement, combining mathematical understandings such as conservation, similarities, proportions, mathematical generalizations applied to space and objects in space, systems and frames of reference and coordinate systems, and full understanding of accurate, scaled, visual, proportioned models or diagrammatic representations. It must be stressed that many of these Euclidian spatial abilities must develop concurrently with projective spatial abilities. Both stem from topologic spatial ability, but in different ways (Piaget, 1967). This study focuses on the Piagetian projective spatial abilities.


The relationships and developmental timing of Piagetian stages of cognitive development, spatial abilities, and perceptual/conceptual phases are schematically depicted in Figure 1.2. Note that the approximate development of each is indicated by increasing opaqueness of each representative bar as a function of age and cognitive developmental stage progression.

Spatial visualization skills can be honed by presenting a variety of different representations of objects. The simplest and oldest of these are concrete representations such as wooden blocks cut to a size and shape to be depicted graphically by the student.

Paper depictions and dynamic depictions on a computer screen are other forms that can help students see different perspectives of objects. Research in other fields of education has demonstrated the value of the use of multiple representations in the classroom.


Figure 1.2: Piagetian cognitive and spatial development.

Representation

A representation is a schema, transformation, model, symbol, or a map of an idea, concept, knowledge bit, datum, or set of these. Representations in education are commonly divided into internal and external representations. External representations occur outside the mind and can serve to communicate organized thoughts for ourselves or others, and include such things as graphs, text, diagrams, drawings, equations, models and prototypes. External representations are an expression of an idea or collection of ideas, perhaps just a bookmark for a thought, or as extensive as a full, formal

communication intended for others. The recognition and active application of

representations in the classroom are Piagetian in that:

Knowledge is not a copy of reality. To know an object, to know an event, is not

simply to look at it and make a mental copy or image of it. To know an object is

to act on it. To know is to modify, to transform the object, and to understand the

process of this transformation, and as a consequence to understand the way the

object is constructed. (Piaget, 2003, p. S8)

External representations can be static or dynamic (Goldin & Shteingold, 2001), especially

considering the current state of technology. Internal representations represent a mental

schematic, idea or picture of a concept. They are impossible to gauge directly because

they occur inside the mind. However, internal representations dictate how an individual

generates external representations, and in this sense they can be gauged indirectly. They

can also be corrected, refined, reinforced, extended, and built upon through exercising the

recursive use of representations in the classroom.

The concepts of internal representation and external representation (Janvier,

Girardon, & Morand, 1993) are similar in that they both refer to the symbolism of an object or idea and a resultant transformation. They are distinct in that internal representations are formed internally in the mind and consist of mental images or schemas that exist as part of the cognitive structure of the brain. External representations, however, exist outside the mind in the social context of the learning environment

(Vygotsky, 1978). Their variety and multitude (Brenner, et al., 1997) and their similarities to the existing internal representations and cognitive structures in the mind of a student can enhance that student’s learning and development.


Representations in the classroom can promote or hinder the cognition of concepts.

To promote the cognition of concepts, representations, both internal and external, must be

provided such that the figurative gaps between them are narrow and can easily be bridged

by the individual. In addition, representations can serve a formative purpose. Each

student has a different mental structural content. It is thus the role of the instructor to

gauge or interpret the structural content of each student’s mind. Given such a window, the instructor can provide (not necessarily directly) the external representations necessary

to allow the student to correctly internalize concepts without dictation through the use of

classroom discourse and the generation of verbal (discussion), written (minute papers,

other more formal papers), diagrammatic (graphs, figures, sketches, etc.), computer-

based, or other external representations from the instructor as well as from student peers.

As student internal representations develop and progress, more external representations

are solicited, begetting a cyclic development. It is important to note that students do not

necessarily have the skill set to synthesize an external representation provided by a peer

or instructor; the external representation may enhance their comprehension of a concept

without creating the ability to internalize that representation in those targeted for learning

and development.

The cognition of the student can be positively impacted through the connections

made between the student’s internal representations and the external representations in a

learning environment. As Greeno and Hall (1997) stated, “Forms of representation are

significant in the construction and communication of understanding” (p. 122).

Therefore, it is important to establish similarities between the students’ existing conceptualizations or cognitive structures of information (internal representations) and


the new concepts for which an instructor wishes to convey meanings through models

(external representations).

It is in this fashion that the effective use of representations in the classroom is

constructivist because “learning is possible if you base the more complex structure on

simpler structures, that is, when there is a natural relationship and development of

structures and not simply an external reinforcement” (Piaget, 2003, p. S16).

Representations, like the constructivism they stem from, are recursive and are also based

on Piagetian theory of continuous accommodation and assimilation.

The use of multiple representations in the classroom also invokes the Piagetian

developmental factor of experience. Piaget discusses two types of experience, physical

and logical-mathematical (2003, p. S11). Physical experience is empirical and deals with the concrete and “acting upon objects and drawing some knowledge about the objects by abstractions from the objects” (2003, p. S11). In logical-mathematical experience,

“knowledge is not drawn from the objects, but it is drawn by the actions effected upon the objects” (2003, p. S11). To clarify, logical-mathematical experience is “an experience of the actions of the subject, and not an experience of objects themselves”

(Piaget, 2003, p. S12). Eventually, the subject can perform such actions without the use

of the physical world or props, and can abstractly reason without the support of haptic or

otherwise reality-based interactions. Thus in the cycle of building and reconstructing

multiple internal and external representations, and internalizing external representations,

it is the inter-representational transformations of these understandings, not the actual media or content of the representations, that provide the most valuable experiences for the students.


Given that all students have different existing mental images, or schemas, and

unique cognitive structures, the presentation of multiple, varied representations allows the

concepts being represented to fit into the diverse cognitive frameworks in the minds of more students. One size does not fit all, and presenting multiple, varied representations is a fair response to the cognizance of different students with different learning needs and a desire to fulfill as many of these needs as possible. Representations, both internal and external, build on each other and are building blocks of constructivist learning.

Theoretical Implications

The Vygotskian implications of internalized construction of the history of external experiences in the socio-cultural context, coupled with the scaffolding involved in the

Zone of Proximal Development, are substantive. From this, any and all kinds of affective

socio-cultural support in the development of individuals are crucial. This includes

providing external representations to support the recursive development and advancement

of internal representations. The external representations provided in this case are

scaffolding in the form of three-dimensional animations of rotating objects in virtual

space on a computer screen.

The use of external representations in the form of virtual three-dimensional

animations of objects can serve to bridge gaps between students’ internal schemata of

how an object looks when rotated and how it should look two-dimensionally on paper or

on a computer screen. These viewing exercises also allow for students in the Piagetian

hypothetical-deductive fourth stage of cognitive development to gain what Piaget terms

logical-mathematical experience in the exercise of making the mental transformations

required to understand the provided external representation, while simultaneously

bridging aforementioned gaps in understanding resulting from potentially underdeveloped projective spatial ability. Assimilation and accommodation continue their cyclic structuring of knowledge through the mental operations performed on objects, concrete or otherwise. These transformations continue to affect students in their final cognitive stage of development, as well as in their projective spatial abilities.

Existing student spatial visualization ability is determined by genetics, life experiences, and developmental activity. Spatial visualization abilities can be improved through practice or further experiences in transforming object orientations in space.

Spatial visualization ability in individuals can be gauged through spatial visualization tests. The major issues surrounding spatial visualization ability and the coupling of the theoretical protocol are depicted in Figure 1.3.

Figure 1.3: Important spatial visualization concerns and theoretical protocol.


Throughout this investigation, student spatial visualization abilities prior to

instruction and after providing additional logical-mathematical experience in the form of

external, virtual, dynamic three-dimensional representations will be gauged to determine the effects of these experiences on students. This investigation serves to seek evidence of

whether this Vygotskian scaffolding and Piagetian experience that result in mental

transformations from viewing the animations will significantly enhance performance on a

known metric of spatial visualization ability.

In addition, surveying students’ prior transformative experiences during the

Piagetian stages of cognitive development and spatial development shown by others

(Deno, 1995) to affect Piagetian projective spatial ability can point to the relationship

between these experiences and initial spatial visualization ability prior to treatment. A

gauge of these developmental experiences can also indicate the potential for more

substantive gains during instruction and treatment as a function of the magnitude of

developmental experiences. In other words, those students with a greater extent of these

developmental experiences would also be expected to have a greater Zone of Proximal

Development, and thus should have more to gain from the scaffolding provided.

Research in the areas of the utilization and necessity of spatial visualization skills

in life, specific experiences during development affecting spatial visualization abilities,

spatial visualization assessment methods, spatial visualization enhancement, and the use

of two-dimensional and virtual media to enhance spatial visualization skills is well

documented and crucial to this study. However these are extensive, and thus will be

discussed in depth in Chapter 2.


CHAPTER 2: LITERATURE REVIEW

Spatial visualization skills are critical for success in engineering education and

essential ingredients for student understanding. Spatial visualization ability is an

important characteristic for engineers to function properly in their professional and

academic careers (Sorby & Baartmans, 2000). A variety of metrics show that students with superior visualization abilities perform better in engineering graphics as well as

other engineering curriculum classes (Yue, 2002).

Engineering graphics instructional teams strive to develop in students the

necessary visualization and graphical representation skills that allow them to express

ideas in the course of solving problems. Visualization has been defined as, “the ability to

take an idea from one’s mind and model it…”, and “...the ability to comprehend someone

else’s model” (Newcomer, et al., 2001, p. 33).

This chapter presents the historical perspective of engineering graphics

coursework, the significance of spatial visualization skills in engineering, spatial

visualization skills assessment methods, factors affecting spatial visualization ability,

alternatives to promoting good spatial visualization skills, and methods for enhancing

spatial visualization skills.


Historical Perspective

Engineering graphics remained relatively unchanged in the 20th century until the

early 1990s evolution of the economically accessible computer and software

technologies that brought computer-aided design (CAD) and computer-aided modeling

(CAM) to engineering classrooms. This affordability coupled with employer and student

demand led to the merging of traditional graphics with limited use of these technologies

in the classroom until the late 1990s. Much of the focus remained on traditional manual

drawing tools such as triangles, T-squares, compasses, and calipers.

Today engineering graphics is much more student-friendly, integrated, and

preparatory. In recent years, researchers have investigated and repeatedly stressed the

need to change the manner in which engineering is introduced to pre-engineers (Bolton &

Morgan, 1997; Condoor, 1999; Morgan & Bolton, 1998). In many institutions with engineering programs, integrated curricula incorporating an introduction to engineering,

engineering graphics and communication, technical writing, engineering technology

tools, engineering ethics, and teamwork have been proposed and implemented.

Typically in the past, more time was spent on individual work and lecture dealing

with the creation, projection, and manipulation of manually-drawn depictions of objects

and designs. Currently, more stress is placed on the learning styles of engineering

students, who tend to be more visual rather than verbal (Scribner & Anderson, 2005; The

Ohio State University College of Engineering, 2003). Modern engineering graphics,

often taught in “Introduction to Engineering,” “Fundamentals of Engineering,” “First-

Year Engineering,” or other similar titles, use a studio approach similar to the settings

found in art and architecture where students are arranged in teams to promote peer

instruction, more attention to physical space in the classroom arrangement, and term-length design projects (Barr, Schmidt, Krueger, & Twu, 2000; Little & Cardenas, 2001).

Significance of Spatial Visualization Skills in Engineering Education

The importance of promoting visualization is illustrated by Condoor (1999):

Visual thinking is one of the distinguishing characteristics of an engineer. At a mundane level, it is useful for documenting ideas, representing designs, and communicating them to others. At a more fundamental level, it helps reasoning about ideas and designs. For instance, designers use visual thinking to reason about stress, strain, fluid-flow, electric, and magnetic fields. Recognizing the importance of visual thinking as a means of communication and a tool for reasoning, educators have incorporated visual thinking throughout the engineering curriculum. (p. 13).

One of the stated goals of an engineering graphics course should be to develop good spatial visualization skills in the student engineer. It is important to stress spatial visualization skills not only at the introductory level; they should be emphasized throughout the engineering curriculum. Engineering instruction should use spatial instructional strategies to reinforce student abilities (Hsi, et al., 1997).

Exam scores in engineering and other technical fields of study are significantly correlated to spatial visualization ability. Academic failure has been shown to be related to deficiencies in two- and three-dimensional perceptional abilities (Rochford, Fairall, Irving, & Hurly, 1989).

Because spatial visualization skills are essential to engineers beginning and graduating from engineering programs at colleges and universities, it is necessary to promote spatial visualization skills either through remedial instruction at the collegiate level or during the developmental periods prior to college. Given the need for technically adept individuals during most economic periods, industry and academia cannot afford to lose intellectually competent individuals due to a lack of nurturing of spatial visualization abilities in developing students. Students who are competent in mathematics and other subject areas often withdraw or simply do not select technical courses due to a lack of the necessary spatial visualization skills and often, as a result, claim a general dislike for the sciences and technical disciplines (Pallrand & Seeber, 1984). In addition, as engineers are reputed problem solvers, spatial visualization skills are indispensable to engineers not only because of the need to represent designs, but also because spatial visualization ability is shown to be significantly correlated to problem solving ability (Carter, et al.,

1987). The importance of spatial visualization skills to engineers and others pursuing technical fields is also highlighted by Scribner and Anderson (2005).

Assessing Visualization Skills

Tests of spatial visualization ability usually measure the subject’s skills by requiring the completion of tasks demanding varying degrees of rotation, cutting, folding, or other alterations. Tasks that can only be performed by mental alterations of given images are said to be holistic or non-analytic. Tasks that require similar actions on images, but provide some verbal cue or can be accomplished algorithmically, are said to be analytic or not completely holistic in nature. Many researchers who wish to obtain a measurement of a subject’s pure spatial visualization ability often avoid tests of spatial visualization that have verbal or analytic cues, as the results can be confounded by such features (Carter, et al., 1987). It is apparent that some researchers tend to think that providing analytic aids such as verbal cues or algorithmically solvable problems allows the subject to “cheat” by forgoing his or her visualization abilities.

There are a number of spatial visualization tests available that researchers in this

field often use as instruments for gauging spatial visualization abilities of students. The

Purdue Spatial Visualization Test: Rotations (PSVT—R) is the most commonly used test of engineering students’ spatial visualization skills. The PSVT—R is often preferred because it effectively measures raw spatial visualization skills while allowing for minimal verbal, analytic, or otherwise non-holistic bias to affect the scoring of those being tested (Carter, et al., 1987). This and other tests typically measure the visualization

skills of students at the second stage of the three stages of Piagetian spatial development

(Sorby & Baartmans, 1996). It is in the second stage of Piagetian spatial development,

referred to as projective representation, where humans develop the ability to mentally

perceive the appearance of objects from different perspectives and orientations (Bishop,

1978).

The Mental Rotations Test (MRT) is also a commonly used non-analytic test that is similar to the PSVT—R in that it includes 24 problems to be solved via internal rotation of mental imagery, but there are two correct answers per problem. The Paper

Folding Test (PFT), an analytical spatial test, asks the participant to determine which of several choices of patterns on paper would result if the original piece of paper were folded a number of times and then hole-punched. The PFT and other tests are available in the “Kit of Factor Referenced Cognitive Tests” from the Educational Testing Service in Princeton,

New Jersey (Lord, 1985; Pallrand & Seeber, 1984; Peters, et al., 1995). There are also other tests either designed anew by researchers or that are alterations of the aforementioned tests. The technologies used and the quality and insightfulness of these tests vary and are often dependent on the available resources of the researchers

performing the study. A compilation of spatial aptitude tests given over the years by

military, university, and other researchers in the twentieth century through the early

1980’s is provided by Eliot and Smith (1983).

In nearly all studies where gender is mentioned as a variable, a gap typically is

reported to exist between the spatial visualization test scores of men and women, with

women typically having significantly lower scores than men. In posttests, after the

subjects of studies are exposed to some form of spatial visualization instruction, the gap narrows between those originally scoring lower (particularly women) and those originally scoring higher (mostly men) during pretests (Branoff, 1998; Peters, et al., 1995; Sorby &

Baartmans, 1996).

Factors Affecting Spatial Visualization Ability

Some researchers of the 1960’s and early 1970’s suggest that spatial visualization skills cannot be taught and are innate traits of individuals (Lord, 1985). More recent studies show that spatial visualization skills can be improved through activities involving spatial perception (Crown, 2001; Deno, 1995; Lord, 1985; Pallrand & Seeber, 1984;

Peters, et al., 1995; Sorby & Baartmans, 1996). When differences in male and female performance are noted, the existing gender gap often found in pretests tends to close

following effective exercises of spatial visualization skills as evidenced in posttest results

(Hsi, et al., 1997; Peters, et al., 1995; Sorby & Baartmans, 1996). The closing of this gap

indicates that those with deficiencies in spatial visualization ability can make up for lost

time by exercising their skills in this area until their skill level matches those of their

peers. This is analogous to a weaker muscle gaining strength more rapidly than regularly

worked muscles in the body and is a valid application of the ubiquitous “learning curve.”

The question then inevitably arises regarding the source of spatial visualization

deficiencies. To answer, the factors affecting visual spatial ability need to be determined.

Researchers suggest that previous experiences such as actions performed on objects in

space stimulate adolescents to develop spatial visualization skills (Bishop, 1978). Socio-cultural factors such as childhood activities involving construction games and toys, art

and sketching, video games, sports, and tactile experiences with concrete objects are

among the influences listed by researchers (Pallrand & Seeber, 1984; Peters, et al., 1995;

Yue, 2002). These are qualitative observations grounded in experience, but can be

speculative and tend to appear in the last sentences of research papers, seemingly as an

afterthought. The constructivist argument that, “Each of us makes sense of our world by

synthesizing new experiences into what we have previously come to understand” (Brooks

& Brooks, 1999, p. 4), supports the importance of experience in making increasingly

complex understandings.

One study in particular quantitatively compares the scores of subjects who took a

spatial visualization test to an inventory of activities in which the subjects participated

during their pre-collegiate development (Deno, 1995). This study employed two research

tools. The Mental Rotations Test (MRT), used to measure spatial visualization ability,

consists of 20 criterion figures for which there are two correct alternatives and two

incorrect alternatives. The Spatial Experience Inventory (SEI), developed by the

researcher and based on a list of spatial activities compiled through research done by

Guay et al. (Guay, 1977; Guay, McDaniel, & Angelo, 1978; Guay & McDaniel, 1979,

1982; McDaniel, Guay, Ball, & Kolloff, 1978) is comprised of a list of activities, the

frequency each activity occurred, and the time period during the student’s development


that each activity occurred. The SEI is comprised of 312 items. These items fall into three possible categories: Formal Academic, Non-academic, and Sports activities. Each activity can be placed in the following developmental periods: junior high school, high school, postsecondary; pre-school, elementary organized, elementary non-organized. The sample consisted of 396 beginning engineering students enrolled in Engineering Graphics

166 during Winter Quarter of 1993 at the Ohio State University. Of the 396 students, 324 were men and 72 were women. ANOVA was used to confirm statistical significance of mean differences between groups, reliability analysis was conducted on the SEI, and correlation analysis was conducted between predictors and spatial measures. The SEI spatial ability groups were shown to be highly reliable, as were the elements of the developmental period scale with the exception of the elementary organized activities. It was shown through an analysis using Pearson product-moment correlation that Non- academic activities during high school seem to be the strongest predictor of ability in spatial visualization.

The inventory is not only a catalog of the various activities in which the participants had previously engaged; it also relates the developmental time period of each activity, as well as whether that activity was academic, organized, or unorganized in nature. The findings confirm most of the other researchers’ quantitative assessments of the importance of life experiences in predicting one’s current spatial visualization abilities. The study shows that building activities (non-academic, during high school and middle school) are the most correlated to high spatial visualization skills for men.

Women with higher spatial visualization skills most commonly have frequent video game and educational program viewing experiences. This difference in activities between men


and women answers some questions about the gender gap dilemma as it pertains to

spatial visualization ability. Notably, those women with the greater than average building

activity experiences have correlations to higher spatial visualization ability, but visual

activities still seem to be more correlated to this phenomenon in females.

Alternatives to Promoting Good Visualization Skills

Some may say that development of spatial visualization skills is not as necessary

as it was before the advent of affordable technology tools such as computer-aided design

and modeling (e.g., CAD and CAM). As discussed earlier, however, spatial visualization skills are not required solely for depicting objects and designs. They are also significantly correlated with problem-solving ability and performance in the sciences. Koch (2006) found that spatial visualization skills, as tested by the Purdue Spatial Visualization Test—

Rotations, significantly predicted technical problem solving ability, yet found no significant difference in problem solving ability between students using manual drawing tools versus solid modeling.

There are students who may struggle through engineering graphics courses by blindly following analytic procedures or algorithms in the course of providing alternate representations to objects and designs. Once again, it is important to note that this approach towards (or avoidance of) a skill deficiency is not a viable solution given the possibility and value of remediation in the area of spatial visualization ability (Sorby &

Baartmans, 1996). Placement exams prove effective in properly placing students in courses for which they are adequately prepared (Sorby & Young, 1998). Students with the necessary spatial visualization skills can be placed in regular graphics courses, while others who do not possess the prerequisite skill set can be placed in remedial courses designed to enhance their spatial visualization skills and prepare them for engineering graphics and the remaining engineering curriculum for which possession of these skills is essential.

Finally, it should be reemphasized that poor performance in and withdrawal from courses in technical fields of study are correlated to deficiencies in spatial visualization ability (Carter, et al., 1987; Pallrand & Seeber, 1984; Rochford, et al., 1989). This further highlights the importance of spatial visualization skills, particularly in engineering.

Enhancing Visualization Skills

Spatial visualization skills can be honed in various ways. The most common methods involve the presentation of a variety of different representations of objects. The simplest and oldest of these are concrete representations such as wooden blocks cut to a size and shape to be depicted graphically by the student. These are still in use in classrooms today.

Stackable cubes were successfully used in a study concerning the effects of a remedial course on students with deficiencies in spatial visualization skills (Sorby &

Baartmans, 1996). In this study, a pre-graphics course was developed and piloted in

1993 at the Michigan Technological University for the purpose of providing remedial experience in 3-D spatial visualization skills for students scoring low on a spatial visualization placement test. The new course utilized a text and computer lab manual, written for the course, and I-DEAS software as a visualization tool. Of 535 students who took the Purdue Spatial Visualization Test: Rotations (PSVT—R) placement test, 117

(22%) were women. Of the 96 students who failed (scored less than 60% correct), 46

(39.3%) were women. From the 96 students who failed, 24 were selected to participate in the pre-graphics course as members of the experimental group. The average PSVT—R score for this group was only 51% correct. Their average score on the PSVT—R as a post-test was 86% with no students failing. There was a statistically significant increase in scores between the pre- and post-test (t=12.53, p<0.0001). Student comments were generally positive. The spatial visualization tools used all received an overwhelming majority of ratings in the positive area when students were polled. A majority of students also claimed they “learned a lot” from the course.

The remaining 72 students comprised the control group and were permitted to enroll in the standard engineering graphics course. The grade point averages in engineering graphics courses between the experimental and control groups were compared. The average student grade point average for the experimental group was 3.0 whereas the average grade point average for the control group was 2.6. Low grades (C,

D, or F) occurred in only 8.3% of the engineering graphics grades received by the

experimental group as compared to 16.3% of the control group students (Sorby &

Baartmans, 1996).

Engineering graphics instructors often use multiple external representations in the

course of strengthening their students’ skills through providing multiple perspectives and

imagery of the same objects. These representations range from concrete, wooden models

of objects to be depicted on paper, standard two- and three-dimensional paper depictions,

to technology-driven depictions on a computer screen that can be rotated and manipulated

through mouse and keyboard commands. This approach also helps reach a wider

audience of students who, as individuals, have distinct cognitive structures and existing


mental schemas with which these visualization concepts must mesh (Friedlander &

Tabach, 2001).

Virtual Three-dimensional Remediation

Engineering graphics instructors can look at the examples provided by other science and technical educators. Dynamic computer-based representations and dynamic computer-based translation between representations through computer animation and modeling of molecular structures have been shown to be valuable in improving the spatial visualization skills and understanding of subject matter concepts in chemistry

courses (Barnea & Dori, 1999). Animation has also been used to some benefit by

mathematics education researchers, where “animated learning materials can prove more useful than static representations” (Taylor, Pountney, & Malabar, 2007, p. 249) in topics such as rotational symmetry and matrices.

While computer-based representations are already in use by many engineering graphics departments, most computer-based engineering graphics representation systems were designed for engineering work rather than for educational purposes

(e.g., CAD software packages), so they are not as instructionally suited as those employed in other fields of science. One innovative exception is a cutting-plane simulator, a software tool that interfaces with AutoCAD and is taught to upperclassmen in mechanical and manufacturing engineering. Named 3D-Sim, it utilizes an AutoLISP program (interpreted by AutoCAD). Huang, Bhura, and Wang (2000) examined this commercially available tool. Their primary application was its originally intended purpose: simulating the cutting of material from stock to form a new finished object in a virtual environment without the inconvenience and expense of

setting up a laboratory around an expensive Numerical Control (NC) machine and

maintaining the equipment.

An alternate application of this software tool for pre-engineers enrolled in engineering graphics courses can be pursued. Rather than focus on the NC aspects of the

simulation, one could simply program the simulator to sequentially remove material from the virtual

stock to depict to the students the formation of a new 3-dimensional object. A simulated object in AutoCAD that utilizes 3D-Sim can thus be formed, rotated, and manipulated to display an infinite number of static or dynamic representations. This process may

provide additional means for connecting and reconstructing students’ internal

representations and spatial understandings of objects in the course of their engineering graphics studies.

Web-based interactive engineering graphics educational applications have been shown to be effective (Crown, 2001). Through the presentation of a series of puzzles students are challenged to solve, they interactively learn engineering graphics conventions and are promptly corrected with feedback if they do not solve a problem or answer a question correctly. In this sense, web-based scaffolding is provided for students as they progress through the exercises toward more difficult challenges. This instructional tool allows students to learn conventions and develop their spatial visualization skills without having to simultaneously learn the numerous functions and commands of a sophisticated CAD package normally used by professionals.

Microcomputer-based labs (MBLs) are used with sensors and a computer to collect data and display it graphically on a computer screen. In a study by Kozhevnikov and Thornton (2006), the use of MBLs to provide undergraduate physics students with


real-time experimental data graphing was shown to significantly increase the proportion

of correctly solved spatial visualization problems of the Paper Folding Test (p<0.05)

when science and non-science major treatment groups were compared to a science major

control group via pre- to post-test gains. The researchers examined the proportion of

correctly solved problems rather than simply total scores because all groups were able to

attempt more problems the second time they took the PFT, perhaps due to a test-retest

effect of familiarity where test takers learn the test when the time between identical pre-

and post-tests is too short. The authors credit Lohman (1988) for information on the test-

retest effect in other studies. It is notable, however, that an entire semester elapsed

between tests and the test-retest effect may not be so significant.

In another related study, the same authors (Kozhevnikov & Thornton, 2006)

performed the same test with high school physics and science teachers to determine if

significant increases can be detected in those with strong physics and science

backgrounds. The intention was to alleviate concerns that science or physics instruction

also causes some spatial visualization development. Using a paired t-test, the teachers

showed significant improvements both in total score (p<0.001) and number correct

(p<0.001); however, apparently no control group was used, and the test-retest effect may have been even stronger because only two weeks elapsed between pre- and post-tests.

Scribner and Anderson’s (2005) substantial literature review and study yielded three recommendations, one of which promotes the use of several types of two- dimensional renderings to reinforce spatial visualization skills:

Incorporate tools such as sketching, three dimensional handheld models, three- dimensional solid model software, and orthographic and isometric projections to aid in developing spatial visualization. (p. 56)


In their study, the control group received standard, traditional drafting instruction, whereas the treatment group received mixed methods instruction and materials to accommodate a variety of student learning styles. More treatment group students showed significant (greater than 5 points) increases in their PSVT—R scores than did control group students.

Piburn, Reynolds, and McAuliffe (2005) conducted a study using computer software “which allows the creation of detailed and realistic, two-dimensional representations depicting three-dimensional perspectives of simple and complex geologic structures and landscapes” (p. 517) for use by geology college student subjects. Using two control and two treatment groups, spatial orientation pre- and post-tests showed no

significant effect. However, spatial visualization pre- and post-test scores on the Surface

Development Test and time to complete the tests showed significant improvement. This

demonstrates the potential effectiveness of two-dimensional virtual computer-based

renderings of three-dimensional objects for increasing spatial visualization abilities.

Relevant to these results, it is noteworthy that Hegarty and Waller (2004) found further

evidence that spatial orientation (different viewpoints or perspectives), while correlated,

is dissociated from spatial visualization (mental rotation). Kwon (2003) demonstrated

that both paper-based and web-based virtual reality instruction given to groups of middle

school students can be effective in improving spatial visualization skills.

Third graders in control and treatment groups were tested on mental rotation skills

to determine whether those in the treatment group, playing video games, would show

increases in spatial visualization abilities over control group members (De Lisi &


Wolford, 2002). It was shown that computer-based activities or the use of virtual media

can result in increased spatial visualization ability.

Similarly, Dorval and Pepin (1986) were able to show spatial visualization test

score gains in pre- and post-test scores by undergraduate students in the humanities after

playing the video game Zaxxon for eight sessions over a six-week period at five plays per session. At the time, Zaxxon was unique for its three-dimensional presentation and depth, which consisted of a spaceship flying diagonally through an isometric world with no vanishing points, unlike many modern three-dimensional video games. Most spatial visualization tests involving mental rotation present objects in an isometric format.

Gerson, Sorby, Wysocki, and Baartmans (2001) found that “multimedia software is an effective tool for the development of 3-D spatial skills” (p. 111). After developing a

nine-piece modular software package with a National Science Foundation grant, they

found through quizzing, surveying, and various spatial visualization tests, significant

improvements in students who used the software over those who did not.

The sampling of work provided above and others that have been documented

elsewhere provide evidence indicating the viability of technology-based virtual three-

dimensional animated renderings in improving spatial visualization ability.

Technology-based virtual three-dimensional animated renderings used to improve

spatial visualization ability should be customizable to problems in the curriculum, should

not require additional purchases by engineering graphics and first-year programs, should be easy to implement and retrofit in universities with online curriculum aids such as

Blackboard, WebCT, or Desire2Learn, should not take additional class time, should not

require special software as it is a video (AVI) file, and should be usable on an on-demand basis. Given the goal of using convenient and readily available technological tools to enhance spatial visualization skills with minimal additional institutional investment of resources, it is important to survey the literature to assess the feasibility and validity of providing effective, remedial spatial visualization activities through the use of dynamic external representations via CAD-derived animations of objects projected on a computer screen.

Problem-based Learning

Problem-based learning can be a very important tool in educating future engineers. It not only applies as a method of teaching; it also simultaneously provides experiences and development of abilities and skills for young pre-engineers in the process of problem solving. According to Wilkerson and Gijselaers (1996), problem-based learning is defined as a student-centered process that occurs in small groups of which teachers are the facilitators or guides. Problems presented in some form such as a written case, case vignette, simulation of a real-life subject or computer simulation represent the challenges faced in practice and therefore provide relevance to form the organizing focus and stimulus for learning. Such problems also provide for the development of clinical problem solving skills and the means through which abilities can be acquired through self-directed learning. In this application, the problem could be one demanding the exercise of spatial visualization skills in a term-length design project.

In engineering graphics, the majority of the coursework involves student utilization of various visualization and graphical communication skills and abilities applied to the solving of problems. Due to the number of topics covered and the limited amount of time available, many of these problems are simple, lecture-based cases (Savin-

Baden, 2000). These involve problems assigned to the students after some preliminary information on the approaches relevant to solving the problem and the uses and applications for the skills gained through practicing the solution of similar problems. The routine problems occur in lecture and also in lab, although the labs are a bit more open-ended. Far less frequently, perhaps only once during the course, a project such as building a balsawood bridge is given to the students.

Implications

Engineers are typically seen as “problem solvers.” The field of engineering can no longer afford to have its ranks filled by people learning and applying a fixed body of knowledge. This ignores the exponentially increasing body of technical knowledge, technological advancements, and emerging problems associated with the burgeoning progress and expansion of humanity.

It is necessary that engineering professionals today, as demanded by both public and private sectors, have the ability to adapt and apply readily accessible resources in a manner relevant to solving unique problems. They also must satisfactorily complete a variety of projects through the use of teams and interactions with groups of people, thus requiring the use of effective verbal, written, and visual communication. This supports the use of problem-based learning in engineering. In the case of enhancing spatial visualization skills in engineers, while it is not possible to have students practice every possible situation requiring spatial visualization skills, it is possible to exercise skills with general problems that could be applied to a variety of future situations in engineering academics, as well as in the engineering profession. From the research and revelations of others interested in problem-based learning and engineering education, it has been

demonstrated that problem-based learning is very relevant and applicable to the education

of future engineers in the area of spatial visualization as well as other areas.

Another commonly practiced method for enhancing spatial visualization skills involves providing analytic methods such as verbal cues and algorithms for translating from one representation to another. While these may help some students eventually make the leap from a reliance on analytic methods to holistic methods through repeatedly seeing one representation translate to another, these practices may also allow students to

habitually use the analytic methods as a crutch. As a consequence, there is always a

concern that the students are not necessarily honing their spatial visualization skills but

are in effect blindly following algorithmic procedures.

Summary

Spatial visualization skills are essential for success in engineering and the

sciences. Focus should be placed on developing spatial visualization ability during the

developmental years of students through activities involving building, tactile manipulation of objects, transformations between internal and external representations, and other tasks to reinforce these skills. Remediation should be provided at the collegiate level as necessary to decrease student attrition rates in science and engineering and improve general student performance in their curricula. Spatial visualization skills are also related to problem-solving ability. Lack of high spatial visualization capabilities on the part of entering freshman students at the postsecondary level may be a predictor of those students’ selection of non-technical programs of study. It is therefore imperative that tools be made available to hone and/or remediate the spatial visualization skills of engineers and engineers-to-be.

CHAPTER 3: METHODOLOGY, DATA COLLECTION AND ANALYSIS

Spatial visualization skills are requisite for the academic and professional success

of engineers. Studies show that spatial visualization skills can be improved through

direct and indirect instruction, as well as through the use of modern technology. External remedial efforts are shown to be effective, as are engineering graphics courses in improving spatial visualization test scores. Some students enter college wishing to be

engineers, but score below average on spatial visualization tests. Because spatial

visualization test scores may be indicative of future academic and professional success in

engineering, it is important for engineers to be competent in spatial visualization. This

study seeks to use existing resources commonly available to first-year engineering

programs or engineering graphics departments. By using existing computer-aided design

(CAD) software to build instructional tools in the form of virtual 3-D animations of

objects, first-year engineering students in need of spatial visualization improvement can

enhance their skills while minimally taxing departmental or program resources.

Providing such instructional assistance on an open-access basis allows willing students to

take ownership of their own improvement while minimally disrupting direct instruction

and requiring minimal further investment in curriculum development.

All of the resources utilized in the development of the instructional tool and data

collection devices or their equivalents can be found at most engineering colleges. In this

study, only resources readily available at The Ohio State University were used. Thus, one objective and feature of the developments discussed here is to allow for the integration of these approaches with most typical first-year or engineering graphics programs. The instructional tool developed and studied, as well as the data collection methods and research design integration with the existing online university-wide course curriculum and grading web application, can be applied elsewhere in a similar manner as presented here.

Participants

The participants were first-year engineering students taking the first course of the

Introduction to Engineering series (ENG 181) at The Ohio State University in the First-Year Engineering Program. The participants were selected based on the introductory engineering graphics course section in which they were enrolled. There was a potential maximum of 216 students in each of the control and experimental groups, each group comprising an entire treatment level, for a total of approximately 432 students from six sections (12 labs). The actual number of students who enrolled and completed the course was 396; however, only 273 of these provided consent.

Instruments

The Purdue Spatial Visualization Test—Rotations was administered to gauge spatial visualization ability. The PSVT—R has been used to log student visualization data for several years in the First-Year Engineering Program and is therefore a convenient instrument for this study, in addition to avoiding the

confounding with non-holistic mental rotation methods that is problematic in other spatial visualization measures (Bodner & Guay, 1997); it is also deemed a good test (Yue & Chen,

2001). The PSVT—R consists of 30 problems, each involving a depicted object and instructions to rotate that object a certain number of degrees (multiples of 90) around one or more axes. The participant is given a choice of four possible responses, only one of which correctly represents the figure’s new orientation. The PSVT—R is integrated with the online gradebook for all sections of ENG 181 and has access windows that span the beginning and end of the quarter. A copy of the online version of the PSVT—R is available in Appendix A.
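To make the nature of a PSVT—R item concrete, the short sketch below (an illustration only, not part of the test, its administration, or its scoring) composes 90-degree rotations about coordinate axes using rotation matrices; the test asks the participant to perform the analogous operation mentally on a pictured object.

# Illustrative sketch: composing 90-degree rotations about the x- and y-axes,
# analogous to the mental operation a PSVT--R item requires. Hypothetical example,
# not drawn from the test itself.
import numpy as np

def rotation_matrix(axis, quarter_turns):
    """Rotation by quarter_turns * 90 degrees about axis 'x', 'y', or 'z'."""
    theta = np.deg2rad(90 * quarter_turns)
    c, s = np.cos(theta), np.sin(theta)
    if axis == "x":
        return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])
    if axis == "y":
        return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

# A corner point of a hypothetical object, rotated 90 degrees about x, then 90 about y.
point = np.array([1.0, 2.0, 3.0])
rotated = rotation_matrix("y", 1) @ rotation_matrix("x", 1) @ point
print(np.round(rotated, 6))  # the point's new coordinates after both rotations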

The Likert-type Experience survey was an augmented version of the “Journal

Entry 1,” already in existence, to collect Experience data. Journal Entry 1 is the first in a series of five online surveys the students are required to complete. These surveys query students on their impressions of the class, coursework, difficulty level of the assigned work, and the instructional team’s performance. The responses to the ten added Experience questions are quantified into a continuous rather than categorical Experience score, where each question is scored as an integer value of 1 through 5, ranging from “Never” to

“Always”, respectively. Thus a total composite experience score of 10 to 50 is possible.

The Likert-type scale Experience questions are also shown fully formatted in Appendix

B. The 10 Likert-type Experience score questions are also listed here with labels in parentheses.

1. How often have you engaged in drafting, design and design sketching, engineering graphics, or other manual (by hand) technical drawing activities? (DRAFTING)

2. How often have you constructed models or played with building blocks, Legos, Lincoln Logs, or other similar stackable toys? (MODELS)

3. How often have you played sports? (SPORTS)

4. How often have you fixed things, with a parent, guardian, other mentor, or on your own, such as working on a car, home improvements, construction, carpentry, electronics, or other similar activities? (HOME IMPROVEMENT)

5. How often have you engaged in creating artwork (drawing, graphic design, painting, 3D art, photography, etc.)? (ART)

6. How often have you played video games? (VIDEO GAMES)

7. How often have you used Computer Aided Design (CAD) software applications such as AutoCAD, SolidWorks, Autodesk Inventor, or others? (CAD)

8. How often have you played musical instruments or composed music? (MUSIC)

9. How often have you engaged in crafts (sewing, homemade decorations, pottery, etc.)? (CRAFTS)

10. How often have you used maps or navigated in a car, navigated while hiking, etc.? (NAVIGATION)
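As a concrete illustration of the scoring just described, the following minimal sketch (an assumption about data layout, not the actual survey export or the Visual Basic macro described later in this chapter) sums the ten 1-5 responses into the 10-50 composite Experience score.

# Minimal sketch of the composite Experience score: ten Likert items, each coded
# 1-5 ("Never" to "Always"), summed to a 10-50 total. The labels follow the
# parenthetical tags above; the data layout itself is hypothetical.
ITEMS = ["DRAFTING", "MODELS", "SPORTS", "HOME IMPROVEMENT", "ART",
         "VIDEO GAMES", "CAD", "MUSIC", "CRAFTS", "NAVIGATION"]

def composite_experience(responses):
    """Sum the ten 1-5 Likert responses; raise if an item is missing or out of range."""
    total = 0
    for item in ITEMS:
        value = responses[item]
        if not 1 <= value <= 5:
            raise ValueError(f"{item} response {value} is outside the 1-5 scale")
        total += value
    return total  # ranges from 10 to 50

example = {item: 3 for item in ITEMS}   # a hypothetical student answering "Sometimes" throughout
print(composite_experience(example))    # prints 30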

Students were emailed an open-ended survey inquiring as to their impressions, attitudes, and usage of the visualization tool. A copy of the survey and focus group invitation is included in Appendix C. A listing of the survey questions follows.


1. Were you aware of the animated instructional tool available on Carmen on a trial basis in ENG 181 in WI08? If not, you do not have to answer the remaining questions.

2. How did you hear of the availability of the animations?

3. Were you able to access the animations for the daily assignment problems?

4. How did you gain access to the tool, if it was not available through your own login? (No worries, you’re not in any trouble!)

5. Do you know of others who did not have access via their own login that also were able to somehow gain access?

6. Why did you stop using the animations?

7. Did you find the animations useful? In what ways?

8. Did you believe you needed help visualizing or understanding how objects appear after rotation or orientation change of the objects from their initial states? If so, please specify if you had trouble with: visualizing isometric drawings, visualizing orthographic drawings, the Purdue Spatial Visualization Test, exam problems involving drawings, or something else.

9. Do you believe that viewing the animations helped you understand how an object in an isometric view looks when rotated to an orthographic view?

10. Do you believe that viewing the animations helped you understand how an object in an orthographic view looks when rotated to an isometric view?

11. Do you believe that viewing the animations helped you generally understand how an object would look when rotated from one position or orientation to another? In other words, were you able to better understand, “see,” or mentally visualize how an object would appear, would be drawn, or would be depicted if it was rotated or given a new orientation from its initial state?

12. How would you improve this instructional tool to make it more useful for future students?

Treatment

A CAD-based 3-D animated model of existing, assigned, paper-based engineering

graphics problems was made available for students to view during engineering graphics

homework assignments to determine if this can be used as an instructional tool that

enables students to enhance their spatial visualization skills. Students were either granted

access to the tool, or not. Two possible treatments were analyzed. Those given access

are compared to those who were not given access to the tool. Analyses invoking tool utilization considerations (the amount of time the tool was used, the number of times the tool was used, and whether or not the tool was used at all) are used to gauge

absolute tool effectiveness. Analyses ignoring the utilization are also applied to gauge

real-world open access tool effectiveness.

The animations were developed by building existing manual engineering graphics drawing assignments in Autodesk Inventor. They were then imported into

“Presentation” mode and rotated from a starting isometric view to one of three

orthographic views and back to the original isometric view again by specifying angles of

rotation on relevant axes. The animations were recorded as AVI-type files to ensure wide

compatibility with user media player software. These animations are viewable on the

internet at http://smg.photobucket.com/albums/v221/yallam/ENG181WI08Animations/.

Each object has three associated animations, one for each orthographic view that would

normally be depicted by the students. The assignments included those typical of

manually-drawn engineering graphics assignments: drawing orthographic views given an isometric view of an object, drawing an isometric view given orthographic views of an object, supplying missing edge lines, and drawing a missing orthographic view. These assignments are included in Appendix D.

Procedures

The Ohio State University’s online course and grading web application, Carmen, was made available to the investigators via Carmen course designer access. Carmen is an online course grade and resource web application by Desire2Learn, similar to Blackboard or WebCT. Students rely heavily on this web application for content such as syllabi, daily assignment schedules, presentation slides, drawing tools and orthographic and isometric paper for printing should they deplete their own supplies, lab information, exam review materials, etcetera. There are also periodic quizzes and surveys with adjustable windows of availability. There are links to external resources, and important course- related news and reminders. With the granted designer permissions, access groups were formed for each lab section, representing the externally randomly generated control and treatment groups. Students designated as treatment group members were granted access to the animation tools. Others were denied access. Access windows were set such that, based on the daily assignment schedule, treatment group students were granted access to the relevant animation upon assignment. Access to these animations closed the day after the assignment’s due date. The assignment schedule, or “Daily Assignment List,” is provided in Appendix E.


The entire study occurred within the existing Carmen course framework for ENG

181. All test scores, assignment grades, and survey data were already available and automatically saved as records to Carmen gradebooks. Existing online surveys were augmented to collect the Likert-type scale experiential data (Experience). Tool access was provided via selective release hyperlinks to participants randomly selected from course sections. Existing Carmen-generated unique student identification numbers were used to protect student identities. These numbers are randomly generated and were not associated with sensitive university records and serve only as unique gradebook markers.

All data were downloadable from Carmen as comma-separated value files for manipulation within spreadsheet software applications. Student email addresses and names were the only identifying fields of data within grade records. These were retained for the sole purpose of allowing for post-study consent form processing and matching data records originating from different sources with varying formats. At the end of the course, a standard written consent form requesting permission to use the survey data, tool usage statistics, class assignments, and exam scores for the purpose of research was distributed and collected once the students completed the forms. Once the study concluded and the consent forms were processed, all identifying data were stripped.

Only those individuals who gave consent had their corresponding data used for the purposes of the research study.

Students enrolled in Engineering 181, Fundamentals of Engineering (ENG 181) course sections in Winter 2008 (WI08) were given an online version of the Purdue

Spatial Visualization Test—Rotations (PSVT—R) as a pre-test. They also completed a survey, in this case an augmented version of the “Journal Entry 1,” already in existence,


to collect Experience data. Students were surveyed via ten five-point Likert-type scale

questions at the beginning of the quarter and asked to indicate, by their own estimations,

their levels of participation in certain activities prior to enrolling in an engineering degree

program in college to determine a relationship, if any, to initial scores on spatial

visualization tests administered at the beginning of the quarter. The experience

quantified via a five-point Likert-type scale experiential survey (Experience) includes ten

questions posed concerning student developmental activities that research has shown to

be correlated to spatial visualization ability (Deno, 1995). These experience levels are

also used to gauge Experience effects on gains made during exposure to both regular

classroom activities as well as regular classroom activities coupled with the 3-D

animation tool being tested.

Enrollees were not informed of the study in advance to protect study validity and

to simulate the voluntary nature of online instructional tool usage. An email was sent

discreetly to treatment group members to alert them of the forthcoming availability of an

experimental instructional tool with limited access. They were told how to access this

tool and given a vague description of the purpose of this covert activity. Their confidentiality

was requested. Those randomly selected as treatment group participants were granted

instructional tool access via links on the “Content” page of the Carmen course web application. These links selectively opened for treatment group participants when the relevant engineering graphics drawing was assigned, and closed after the due date of the assignment. Link access tracking is built into Carmen, so utilization data were available for each treatment group participant. Much of the participant data were stored within the Carmen gradebook, with key exceptions detailed later.


At the conclusion of the study, coinciding with the end of the WI08 term,

students took the PSVT—R post-test. The spatial visualization test, PSVT—R, was already utilized by the First-Year Engineering Program at The Ohio State University

College of Engineering for internal purposes.

Per Institutional Review Board (IRB) suggestion, the study was treated as an

internal investigation, as the data collected is typical of quarterly activities, with the

addition of the devices described throughout this chapter. Consent scripts were read at

the end of the quarter and consent forms were distributed and collected during the final

class period. These are available in Appendix F. Non-published, internal research for

departmental or program continuous improvement in accepted educational settings for

purposes of gauging student performance or instructional methods may follow the

approaches outlined here without IRB approval, but this should be confirmed with the

researcher’s local Institutional Review Board to avoid any uncertainty on this issue.

Due to the nature of the treatment group participants’ voluntary usage of the tool,

a tool utilization covariate is included in the statistical model. Analyses invoking

animation tool utilization consideration are used to gauge absolute tool effectiveness,

while analyses ignoring the utilization are also applied to gauge real-world open access

tool effectiveness. The comparison occurs between pre- and post-tests administered

respectively at the beginning and end of the ten-week term in the First-Year Engineering

Program at The Ohio State University. Pre-tests are used to gauge initial student ability

levels in spatial visualization, and performance on post-tests is used to gauge the

magnitude of gains that can be expected from students of varying skill levels with the

standard instruction, as compared to instruction augmented with the use of the CAD-


based 3-dimensional animations. Extensions to this study were made using data collected through the course of the academic quarter. These extensions serve to identify relationships, if evident, between pre-test scores or instructional tool utilization and

grades in visualization (Drawing) and non-visualization-related (non-Drawing) items.

These extensions also serve to confirm the link between summarized student

developmental experiences and their initial spatial visualization test scores.

Finally, the significance, if any, of the relationship between student visualization test scores and relevant engineering graphics-related grades is determined. The final data collected

were open-ended survey responses regarding both treatment and control student impressions, suggestions, and usage patterns of the tool, if any. They were also queried regarding access to the tool and knowledge of the tool, especially for control group subjects who should not have had access to these animations.

Research Design

The research design is an experimental design with control and treatment groups where the treatment group receives access to a virtual 3-dimensional visualization tool that offers an additional representation of objects to be depicted in assigned engineering graphics drawing problems.

Sample and Initial Calculations

A randomized complete block design was used where one half of each block was randomly selected for designation of control or treatment. Initial sample size calculations assumed a conservative standard deviation of 5.3 based on the post-test spatial


visualization data from one section in Autumn 2006, a typical power value of 0.8, and an

alpha value for testing at 0.05.

With 12 blocks and 2 treatments, the analysis is equivalent to a one-way ANOVA

with 24 (12 x 2) treatments. For a sample size of 18 per treatment x block combination, power = 0.8, alpha = 0.05, the detectable difference in visualization test scores would be

8.5. However, there are several offerings of the course per quarter, all using the same

equipment, materials, time, and other resources, with the distinction being the times of

the offerings and the human resources involved in instruction. An instructor teaches one

or two offerings in the same lecture classroom. Each offering, at a unique time of day, is

divided into two sections, where each section has a unique Graduate Teaching Associate

and undergraduate Peer Mentor. Therefore, each section is unique only in the sense that

it has a unique instructional team and meeting time. The model addresses the course

section inconsistencies by blocking the course sections to accommodate variations caused

by attributes such as instructional staff and time. Since the course sections are the blocks,

there is no interest in detecting differences between blocks. There is, however, interest in detecting treatment differences. In this case, the analysis is equivalent to a 1-way

ANOVA with 2 treatment levels. For this scenario, there would be enough observations per group (216) to detect visualization test score differences as small as 1.43, assuming

maximum enrollment. Alternatively, to detect a difference as small as 2, one would need

at least 112 for treatment and 112 for control across all sections.

With nearly 400 projected students, it was deemed there was an adequate sample

to meet a pre-test to post-test gain standard of two points. Again, there is no interest in

detecting block differences. They are considered nuisance effects. There is a need to


control for them, but they are not of primary interest. A one-way ANOVA power curve illustrates the pre-test/post-test gain sensitivity in Figure 3.1.

[Figure 3.1 plots statistical power (0.0 to 1.0) against the maximum detectable mean difference (0 to 4 points) for a one-way ANOVA with 2 levels, assuming alpha = 0.05, a standard deviation of 5.3, and a sample size of 112 per group.]

Figure 3.1: Power curve for one-way ANOVA, n=112, α=0.05, σ=5.3.
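The sensitivity figures quoted above can be checked with standard power-analysis software. The sketch below is only an approximate reproduction of that reasoning in Python (the study itself used Minitab); with the stated assumptions it recovers a detectable difference of roughly two points for 112 students per group.

# Approximate check of the sample-size reasoning above: sigma = 5.3, alpha = 0.05,
# power = 0.8, two treatment levels, 112 students per group. Not the original
# Minitab calculation; statsmodels is used here purely for illustration.
from statsmodels.stats.power import FTestAnovaPower

sigma, alpha, power = 5.3, 0.05, 0.80
n_per_group, k_groups = 112, 2

# Solve for the detectable effect size (Cohen's f) at the given total sample size.
f = FTestAnovaPower().solve_power(effect_size=None, nobs=n_per_group * k_groups,
                                  alpha=alpha, power=power, k_groups=k_groups)

# For two equal groups, Cohen's f = (difference between group means) / (2 * sigma).
detectable_difference = 2 * f * sigma
print(f"Detectable mean difference: {detectable_difference:.2f} points")  # roughly 2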

While the preliminary sample analysis assumes six sections at two blocks per section (one for each unique lab/Graduate Teaching Associate) and 36 students per block, the sample size, and as a result the number of blocks, depend upon student enrollment, that is, the number of sections of students enrolled in a given quarter. It was also considered likely that the blocks and treatments would be unbalanced.


Underlying Considerations

This study was approached not just as an investigation, but as a simulation of how an indirect, digital instructional tool can be implemented and gauged for effectiveness.

The successful administration and documentation of this study may lead to its use as a roadmap for the investigation and installation of future indirect, open access instructional aids available to students on an as-needed basis. Of substantial note is the effort, in all phases of the design and integration and throughout the course of the study, to minimize disturbances to classes and disruptions in instruction of the course sections containing the subjects. A background but significant objective of this study involved the seamless integration and application of the various devices for data collection, control/treatment separation, and provision of the spatial visualization tool itself.

Students already take several surveys throughout the quarter for internal continuous improvement uses. It is for this reason that the types of activities identified by other researchers as having significant correlation to spatial visualization abilities were abbreviated or summarized from hundreds of items in survey batteries. There was a concern that students would not take the time to carefully answer too numerous a battery of Experience questions when considering the other feedback they are already expected to provide in the course. There was also a concern of disturbing course activities or overtaxing student time during a course which already requires of the students substantial time and commitment for success.

After the end of the quarter, students were emailed open-ended surveys inquiring as to their impressions, attitudes, and usage of the visualization tool. They were also


solicited for a focus group session to discuss their thoughts around a table, featuring pizza as an incentive.

Data Collection and Parsing Devices

Data Logs

One of the features of Carmen is an ongoing log it keeps of student activity. The activity log includes access information to each object by users in each course section.

Data logged includes last visit time and date, number of visits to each item, and total time spent with a particular object open. This information is typically of no concern to most faculty members, and it is available only in a raw, cryptic format through special request to the University’s Office of Information Technology (OIT). Upon request, OIT was able to provide a listing formatted as comma-separated-values, accessible via Microsoft Excel, but still not readily usable in this format.

PSVT—R scores for each student are automatically logged into the Carmen gradebook upon completion alongside all other grades automatically collected by the

Carmen system or more often manually entered by Graduate Teaching Associates.

Student grades are all available via Carmen export to comma-separated-value file, as this is the primary grade recording, processing, and calculation device used by the First-Year

Engineering Program. Data imported from the Carmen gradebook were readily compatible and properly formatted and oriented for data analysis. Course grades and

PSVT—R test scores were concatenated into a master spreadsheet, with additional fields added to designate lab, section, instructional team, and meeting time information.

Additional details, such as problem-by-problem exam scores, were collected separately

by Graduate Teaching Associates and compiled by personnel in the First-Year

Engineering Program. Copies of the relevant visualization-related exam problems are provided in Appendix G. Additional calculations, such as separating heavily visualization-dependent assignments from others, were done subsequently by formulae applied to grade components imported into Microsoft Excel spreadsheets. These results were combined again and checked against the original Carmen grade totals to ensure validity.

Other Data

Other pieces of data required for the analysis had to be parsed, properly oriented, and concatenated to this master spreadsheet manually or through Visual Basic (VB) macro and Structured Query Language (SQL) script from other spreadsheets and relational database tables. Problem-by-problem exam data described in the previous section were appended manually. Experience score raw survey data were individually downloaded by section. Each subject’s record occupied 58 rows and several columns. A macro was written to condense the data into composite scores in single-row records listed by name and crosstabbed along the columns by Experience item. These names were matched to the master spreadsheet by section and appended. The macro is listed in

Appendix H. Object utilization data provided by OIT was imported into Microsoft

Access. A series of queries was performed to crosstab the data by username and remove duplicate username listings, so that each username-tied datum occupied a single row, with columns for the access time and date, number of accesses, and time spent for each item. The SQL script for these queries is given in Appendix I. The usernames were then matched to the master spreadsheet and manually appended.
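The condensation and crosstab steps described above were carried out with the Visual Basic macro and SQL queries listed in Appendices H and I; the sketch below is only an analogous illustration in Python, with hypothetical column names, of pivoting a long-format export (many rows per student) into one row per student with one column per item.

# Analogous sketch (not the study's VB macro or SQL script): pivot a long-format
# survey export, one row per (student, item), into one wide row per student, and
# append a composite Experience column. Column names are hypothetical.
import pandas as pd

raw = pd.DataFrame({
    "name":  ["Student A"] * 3 + ["Student B"] * 3,
    "item":  ["DRAFTING", "MODELS", "SPORTS"] * 2,   # three of the ten items, for brevity
    "value": [4, 5, 2, 1, 3, 5],
})

wide = raw.pivot_table(index="name", columns="item", values="value", aggfunc="first")
wide["EXPERIENCE"] = wide.sum(axis=1)   # composite score (10-50 once all ten items are present)
print(wide)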

Master Spreadsheet Data and Pre-Analysis Activities

Upon finalizing the master spreadsheet, data were filtered as necessary for each analysis. Row records containing blanks and zeros were filtered only for columns containing required factor or covariate information for a particular model. This was replicated for each analysis, and for each variation of each analysis if alterations were made due to shifting significance to optimize model fit with the available data. Minitab

was used for analysis, and the model employed, a General Linear Model (GLM) with

Analysis of Covariance (ANCOVA), required entry of full rank data, meaning blanks

caused exceptions and thus had to be eliminated.

Ceiling effects resulting from students scoring extremely high or at the test

maximum score occasionally threatened model applicability for those records of greatest

concern, i.e., records of students scoring lower on the spatial visualization test. Data and residuals were also examined to ensure the validity of normality assumptions when required.
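A minimal sketch of the per-model filtering step described above, assuming a hypothetical master-spreadsheet layout: rows are dropped only when a column actually required by the model at hand is blank or zero.

# Minimal sketch (hypothetical column names, not the study's actual spreadsheet):
# drop a row only when a column required by the current model is blank (NaN) or zero.
import pandas as pd

def filter_for_model(master, required_columns):
    """Keep rows that have usable values in every column the model needs."""
    subset = master.dropna(subset=required_columns)
    for column in required_columns:
        subset = subset[subset[column] != 0]
    return subset

master = pd.DataFrame({
    "pretest":    [18, 25, 0, 30],
    "posttest":   [22, 27, 20, None],
    "experience": [35, 0, 28, 41],
})

# The required columns differ by model; the access-effects model would need all three here.
print(filter_for_model(master, ["pretest", "posttest", "experience"]))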

Hypotheses, Models, and Variables

Following are the hypotheses and models initially used to address the research questions. In many cases, these serve as starting points. As factors or covariates fail to show significance, components may be removed. In the interest of brevity and of providing a general illustration, all models involving Likert-type scale Experience score components below are shown with just one covariate. This Likert-type scale Experience score covariate should be assumed to be modeled in practice either as the total composite score of all ten components, or as a breakdown of the components into up to ten separate

covariates. The Likert-type scaled Experience score shall henceforth be referred to as

“Experience” or “Experience score.”

Typical of the models are the mean, µ, which is the overall post-test score mean;

the response due to treatment, α, indicating the effect of access to the tool; the block, β, for each course section; the pre-test score effects, γ; the experience level effects, δ, of which the actual experience level is determined by a composite score derived from the

Experience survey; the spatial visualization instructional tool utilization effects, η, of which utilization values are logged by the university course and grade web application

(Carmen, by Desire-to-Learn); and finally the error term, ε. The hypotheses tested and

associated models and variables are described below.

Tool Access Effects on Spatial Visualization Improvement

H 0 :1  0  0 ; to determine the effects on spatial visualization gains due to H1 :1  0  0

treatment, or free tool access. The Tool Access Effects on Spatial Visualization

Improvement are modeled as: y        T  X   postijk i j k k ijk

MEAN BLOCK EXPERIENCE

RESPONSE TREATMENT PRE-TEST ERROR where α represents effects on the spatial visualization post-test response due to treatment

(fixed factor),  represents the effects due to the block or lab section (random factor), 

represents the individual spatial visualization pre-test score (covariate), δ represents

58

effects on the response due to experience (covariate), on the mean response, μ. The

subscripts 0 and 1 designate control and response groups, respectively. The parameter

types of the model are summarized as: y        T  X   postijk i j k k ijk

RESPONSE MEAN FACTORS COVARIATES ERROR

i = 0, 1 (control, treatment)

j = 1, 2, 3, …, 12 (section/lab number, a blocking factor)

k = 1, 2, 3, …, n, where n = total number of participants (subject number)
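As an illustration only, an ANCOVA of this general form could be fit outside of Minitab as sketched below; the column names are hypothetical, and the lab block is entered here as a categorical (fixed) term for simplicity, whereas the study's Minitab GLM treats it as a random factor.

import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("master_spreadsheet.csv")   # hypothetical column names below

# ANCOVA analogue of the tool-access model: post-test ~ treatment + lab block
# + pre-test and Experience covariates.
# A closer analogue to a random block would be smf.mixedlm(..., groups=df["Lab"]).
fit = smf.ols("PSVT_R_POST ~ C(Treat) + C(Lab) + PSVT_R_PRE + L_Total", data=df).fit()

print(sm.stats.anova_lm(fit, typ=2))   # ANOVA table with adjusted (Type II) sums of squares
print(fit.summary())                   # coefficients, including the treatment contrast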

Tool Utilization Effects on Spatial Visualization Improvement

H0 :  0 ; to determine the effects on spatial visualization gains due to tool H1 :  0

utilization access within the treatment group. The Tool Utilization Effects on Spatial

Visualization Improvement are modeled as: y       T   X U   post1 jk 1 j k k k 1 jk

RESPONSE MEAN BLOCK EXPERIENCE ERROR

PRE-TEST UTILIZATION where  represents the effects due to the block or lab section (random factor), 

represents the individual spatial visualization pre-test score (covariate), δ represents

effects on the response due to experience (covariate),  represents effects on the

response due to tool utilization (covariate), on the mean response, μ1. The subscript 1

59

designates modeling within the original treatment group. The parameter types of the model are summarized as: y       T   X U   post1 jk 1 j k k k 1 jk

RESPONSE MEAN FACTOR COVARIATES ERROR

j = 1, 2, 3, …, 12 (section/lab number, a blocking factor)

k = 1, 2, 3, …, n, where n = total number of participants (subject number)

An alternate analysis here involves removing the continuous tool utilization covariate and replacing it with a Boolean tool usage random factor to determine the effects of using the tool in any capacity on the response.
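A minimal sketch of this substitution, under hypothetical column names and restricted to treatment-group records, is:

import pandas as pd
import statsmodels.formula.api as smf

treat = pd.read_csv("treatment_group.csv")   # hypothetical; treatment-group records only

# Continuous-utilization version of the model (eta * U_k entered as a covariate).
fit_cont = smf.ols("PSVT_R_POST ~ C(Lab) + PSVT_R_PRE + L_Total + Total_Time",
                   data=treat).fit()

# Alternate version: replace the continuous covariate with a Boolean used/not-used factor.
treat["Used"] = (treat["Total_Time"] > 0).astype(int)
fit_bool = smf.ols("PSVT_R_POST ~ C(Lab) + PSVT_R_PRE + L_Total + C(Used)",
                   data=treat).fit()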

Experience Effects on Spatial Visualization Ability Gains

H0 :  0 ; to determine the effects on spatial visualization ability improvements H1 :  0

due to developmental experiences. The Experience Effects on Spatial Visualization

Ability Gains are modeled as: y        T  X   postijk i j k k ijk

MEAN BLOCK EXPERIENCE

RESPONSE TREATMENT PRE-TEST ERROR where α represents effects on the spatial visualization post-test response due to treatment

(fixed factor),  represents the effects due to the block or lab section (random factor), 

represents the individual spatial visualization pre-test score (covariate), δ represents

60

effects on the response due to experience (covariate), on the mean response, μ. The

parameter types of the model are summarized as: y        T  X   postijk i j k k ijk

RESPONSE MEAN FACTORS COVARIATES ERROR

i = 0, 1 (control, treatment)

j = 1, 2, 3, …, 12 (section/lab number, a blocking factor)

k = 1, 2, 3, …, n, where n = total number of participants (subject number)

Experience Effects on Initial Spatial Visualization Ability

H0 :  0 ; to determine the effects on initial spatial visualization ability due to H1 :  0 developmental experiences. The Experience Effects on Initial Spatial Visualization

Ability are modeled as: y       X   pre jk j k jk

MEAN EXPERIENCE

RESPONSE BLOCK ERROR

where the spatial visualization pre-test is the response,  represents the effects due to the

block or lab section (random factor), δ represents effects on the response due to

experience (covariate), on the mean response, μ. Note that utilization, as well as control

and treatment group considerations have been ignored since this is a pre-treatment

collection of data. The parameter types of the model are summarized as:

61 y      X   pre jk j k jk

MEAN COVARIATE

RESPONSE FACTOR ERROR

j = 1, 2, 3, …, 12 (section/lab number, a blocking factor)

k = 1, 2, 3, …, n, where n = total number of participants (subject number)

Experience Effects on Grades

H0 :  0 ; to determine the effects on heavily Drawing and non-Drawing course H1 :  0 grades due to developmental experiences. The Experience Effects on Grades are modeled as: y       X   grade jk j k jk

MEAN EXPERIENCE

RESPONSE BLOCK ERROR where the grade is the response,  represents the effects due to the block or lab section

(random factor), δ represents effects on the response due to experience (covariate), on the mean response, μ. The parameter types of the model are summarized as: y       X   grade jk j k jk

MEAN COVARIATE

RESPONSE FACTOR ERROR

62

j = 1, 2, 3, …, 12 (section/lab number, a blocking factor)

k = 1, 2, 3, …, n, where n = total number of participants (subject number)

Tool Utilization Effects on Grades

H0 :  0 ; to determine the effects on grades due to tool utilization access within the H1 :  0

treatment group. The Tool Utilization Effects on Grades are modeled as: y       T  X U   grade1 jk 1 j k k k 1 jk

RESPONSE MEAN BLOCK EXPERIENCE ERROR

PRE-TEST UTILIZATION where α represents effects on the grades response due to treatment (fixed factor), 

represents the effects due to the block or lab section (random factor),  represents the

individual spatial visualization pre-test score (covariate), δ represents effects on the response due to experience (covariate),  represents effects on the response due to tool

utilization (covariate), on the mean response, μ. The subscript 1 designates modeling

within the original treatment group. The parameter types of the model are summarized

as: y       T  X U   grade1 jk 1 j k k k 1 jk

RESPONSE MEAN FACTOR COVARIATES ERROR

j = 1, 2, 3, …, 12 (section/lab number, a blocking factor)

k = 1, 2, 3, …, n, where n = total number of participants (subject number)


Once again, an alternate analysis here involves removing the continuous tool utilization

covariate and replacing it with a Boolean tool usage random factor to determine the

effects of using the tool in any capacity on the response.

Spatial Visualization Pre-test Effects on Grades

H0 :  0 ; to determine the effects on spatial visualization gains due to tool H1 :  0

utilization access within the treatment group. The Spatial Visualization Pre-test Effects

on Grades are modeled as: y       T  X U   grade1 jk 1 j k k k 1 jk

RESPONSE MEAN BLOCK EXPERIENCE ERROR

PRE-TEST UTILIZATION where α represents effects on the grades response due to treatment (fixed factor), 

represents the effects due to the block or lab section (random factor),  represents the

individual spatial visualization pre-test score (covariate) effects, δ represents effects on

the response due to experience (covariate),  represents effects on the response due to tool utilization (covariate), on the mean response, μ. The subscript 1 designates modeling within the original treatment group. The parameter types of the model are summarized as: y       T  X U   grade1 jk 1 j k k k 1 jk

RESPONSE MEAN FACTOR COVARIATES ERROR

64

j = 1, 2, 3, …, 12 (section/lab number, a blocking factor)

k = 1, 2, 3, …, n, where n = total number of participants (subject number)

Grades from assignments and test problems that rely heavily on spatial

visualization skills and grades that are less spatial visualization reliant will be analyzed.

It is important to note that in all of the above models, factors or covariates determined not to be significant in their effects on the responses may be removed to improve the model fit for investigative purposes. This is not done in every case, as some factors or covariates of marginal significance may still account for variance in the response.

Additional factors and levels would further limit the model should the experience scores be treated as discrete. A randomized complete block design is used where one half of each block is randomly selected for designation of control or treatment. Each block represents a course section with a unique combination of class time, instructor, and graduate teaching assistant. None of the investigators are involved in teaching or administering any of the participants' course sections during the quarter of the study.
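For illustration, the within-block random assignment could be carried out as sketched below; the roster columns and seed are hypothetical and this is not the study's actual assignment procedure.

import numpy as np
import pandas as pd

rng = np.random.default_rng(2009)            # any fixed seed; illustrative only
roster = pd.read_csv("roster.csv")           # hypothetical columns: username, Lab

# Randomized complete block design: within each lab section (block), randomly assign
# half of the students to treatment (1) and half to control (0).
def assign_half(block: pd.DataFrame) -> pd.Series:
    n = len(block)
    labels = np.zeros(n, dtype=int)
    labels[: n // 2] = 1
    return pd.Series(rng.permutation(labels), index=block.index)

roster["Treat"] = roster.groupby("Lab", group_keys=False).apply(assign_half)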

Risks

Risks to participants included treatment and control group student grade means showing statistically significant differences due to the application of or lack of treatment.

Significance was tested and grades were to be leveled accordingly, but differences in overall grade means were not significant. A minority of grade fields in the student gradebook were affected. Both the First-Year Engineering Program Director and

Associate Dean of the College of Engineering were aware of and approved of this approach.

Risks to participants regarding privacy and identity were minimal in that only one person administered, collected, and analyzed the data. This person stripped the data of

identifying fields after the consent forms were processed. This risk was also

incrementally small given the existing gradebook access given to the many personnel

who are members of the instructional teams providing instruction and grading to each of

the course sections in the First-Year Engineering Program. Similarly, there was

additionally a minor, also incrementally small risk involved in course performance

information leaking out to unauthorized persons.

Internal Validity

The greatest threat to internal validity is control group knowledge or usage of the

instructional tool. This is minimized by restricting tool access to Carmen, which requires

a unique login for each student participant. Additional risks to validity include half-hearted or limited use of the tool by treatment group participants.

Data Analysis

Analysis of Covariance, or ANCOVA, a general linear model, was used to analyze the data.

Tool Effectiveness

Upon completion of the study and subsequent data analysis, if the tool is effective, results would follow a pattern as generalized in Figure 3.1.


Figure 3.1: Potential pattern of results given an effective instructional tool (spatial visualization test score plotted against Experience composite score and against pre-test score/initial ability, comparing pre-test/control responses with post-test/post-instruction/experiment responses).

This illustrates a pattern where those student participants with the least initial spatial ability or experience may have the most to gain through new experiences and instruction designed to exercise spatial visualization skills. This has been shown in research where gaps (such as gender gaps, discussed earlier) between groups are closed quickly by the lower-scoring group making gains faster after experience and instruction, similar to a weak muscle developing more quickly to match an equally exercised strong muscle. It also depicts a ceiling, which exists because the maximum PSVT—R score of 30 is occasionally attained on the pre-test by some particularly gifted students (Ohio State University, 2003-2008).

Limitations

Students who score at or near the maximum may have lower post-test scores more often than students who score lower initially. This may be attributable to random fluctuations in test performance exceeding the potential gains from added instruction and experience for those whose spatial visualization abilities are already at or

near the ceiling of the metric employed. Because the tool is intended to help remediate

those in need of enhancing their spatial visualization abilities, this is an acceptable

drawback of the test, although tests with higher ceilings for measuring spatial

visualization ability of the gifted can be and have been created; however, they have not been proven through nearly as many iterations of studies and research as the test employed for this study. It was assumed that the potential volume of participants in this study would alleviate ceiling effect concerns.

In addition, it is difficult to know student participation levels and how missing

data as a result of student non-participation in certain measures will affect the outcome of the study. Finally, metrics of a voluntary or non-grade-dependent nature do not

always provide the incentives for whole-hearted participation. For example, students may opt to curtail their participation in the visualization test or may rush through the

Experience score survey questions. Student online connections may be cut, or students may be interrupted and never complete some activities. Because the data collected in this study were not actively proctored, there are associated risks to validity, and this can hurt the sensitivity of the statistical analyses, as full rank data sets are required. This may also alter the distribution of the data from what is true of the sample.

Students may leave windows open when using the animation tools provided to the treatment group. These will cause the Carmen object access log to continue counting seconds of access, even if an object is left open in a window and the computer is left unattended. Students in the treatment group may also, although unlikely, allow others outside the treatment group to use their login to access the animation objects.


Finally, while exams are graded uniformly across sections with one grader assigned per exam problem, all other work is graded by instructional team members assigned to a particular section; thus grades will vary by section, another reason for the application of blocking factors in this model.


CHAPTER 4: DATA ANALYSIS AND RESULTS

This study was conducted to investigate whether virtual 3-dimensional animated models of existing, assigned, paper-based engineering graphics problems can be used as an effective instructional tool that enables students to enhance their spatial visualization skills. In addition, this study sought to uncover, or in some cases confirm, evidence of links between developmental activities and student initial spatial visualization abilities and visualization gains after instruction and/or treatment. Finally, this investigation also sought to determine the relationship between initial spatial visualization skills and both heavily visualization reliant and less visualization reliant assignment grades.

Treatment of the Data

As discussed earlier, the data collected in this study was not only extensive, but its collection employed a variety of parsing and formatting devices and activities. Section information, grades (both individual and calculated items), and PSVT—R pre-test and post-test scores were collected. Necessary conversions and separations were made for the analysis, including separating grades into those which are heavily visualization-skill dependent and those that are not. Most of these separated grades were normalized to percentages for ease of comparison. Depending on the model, responses include PSVT—R post-test scores, PSVT—R pre-test scores, and visualization and non-


visualization dependent grades. Covariates, also model-dependent, include PSVT—R

pre-test scores, individual and composite Experience scores, time-based utilization data,

and access count-based utilization data. The 12 labs comprised the blocking factors, and

the students randomly assigned to receive access to the animations were designated as

treatment group members, while those students not granted access were designated as

control group members. Boolean access-based utilization is considered a random factor

within the original treatment group in an additional model that was analyzed.

Over 40,000 data points were collected altogether. Much of this data was filtered

for the analyses that follow, reducing resolution of the data and adversely affecting the

detectable difference of the statistical models. A large amount of data was removed due

to lack of consent from absent students, or students who simply refused to sign. This

reduced the number of study participants from a total of 396 students who enrolled and

completed the course to 273 who provided consent and completed the course. Often

missing data prevented full rank analysis, required for the General Linear Model module

in Minitab. Actions taken included removing subject records where data points essential

to achieving full rank data were missing. In addition, some data sets were truncated to

remove ceiling effects. This of course can affect the distribution and other characteristics

of the data, but considering the subsamples modeled for the purposes of evaluating the spatial visualization instructional tool, students with lower PSVT—R scores, appropriate liberties were taken considering that subjects at or near the ceiling of the PSVT—R are not targets for remediation. This remedy alleviated ceiling effects such as the conical pattern of residuals when plotted against the model response of PSVT—R test scores.

Ceiling effects may also cause an artificially altered slope, as the fitted model is forced to "funnel" towards the test maximum. Since the interest is in remediating students with lower-than-average PSVT—R scores, a more suitable model fits the slopes of students closer to the mean rather than those on the far right tail at the test maximum. This truncation also helped alleviate some of the left skew, as the original data collected leave the analysis tailless on the right side beyond the test maximum of 30. These issues and associated remedies, as well as their implications, are discussed further in the next chapter.
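A minimal sketch of this kind of truncation, with hypothetical column names and an illustrative pre-test cutoff, is:

import pandas as pd

df = pd.read_csv("master_spreadsheet.csv")   # hypothetical column names

# Remove non-participation zeros, then truncate near the ceiling: drop post-tests at the
# PSVT-R maximum of 30 and, for some models, pre-tests above a chosen cutoff (e.g., 23 or 25).
PRE_CUTOFF = 23
trimmed = df[(df["PSVT_R_PRE"] > 0) & (df["PSVT_R_POST"] > 0)]
trimmed = trimmed[(trimmed["PSVT_R_POST"] < 30) & (trimmed["PSVT_R_PRE"] <= PRE_CUTOFF)]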

Throughout this chapter, portions of tables indicating significant effects are highlighted.

Preliminary Analysis of Data: General Trends and Relationships

First, several general observations about the data, the relationships observed, and significant linear correlations are described and depicted graphically. In the subsequent analyses employing the models described in the previous chapter to test their respective hypotheses, adjustments made to data to achieve full rank status and/or remove ceiling effects will be noted.

Table 4.1 summarizes the data collected by block or lab. The missing data are evident when comparing control and treatment group counts to the number of data points collected for each data type listed in columns to the right. It is important to note that a total of 81 students opted to use the spatial visualization instructional animations at least once, or 60% of the treatment group and 29.7% of the total participants listed in the table.


Table 4.1: Descriptive statistics for blocked sections, excluding grades.

For illustrative purposes and to confirm earlier assumptions regarding student

gains in pre- versus post-test PSVT—R scores as related to experience and ceiling

effects, average spatial visualization scores at each Experience score level are plotted and

shown with trend lines in Figure 4.1.

Figure 4.1: Average spatial visualization score plotted vs. Experience score.

For further illustrative purposes and to confirm earlier assumptions regarding gains diminishing with increasing pre-test scores and ceiling effects, average spatial visualization post-test scores at each spatial visualization pre-test score level are plotted and shown with trend lines in Figure 4.2.


Figure 4.2: Average spatial visualization post-test score plotted vs. spatial visualization pre-test score.

In both of the above depicted cases, there is a decreasing gain (the distance between the fitted curves). For both cases, this signifies two possible phenomena. The first cause is the PSVT—R test ceiling score of 30. The other may be the effect of the learning curve, where an individual learns a skill at a decreasing rate with exposure.

It is shown in other studies (Hsi, et al., 1997; Peters, et al., 1995; Sorby & Baartmans,

1996) that students with initially weaker spatial visualization abilities narrow the gap after instruction and gain faster than their initially more capable peers. Both of these factors seem to potentially be in effect here.

All data used as covariates or responses, that is, those data items that are continuous or are treated as continuous and appear in the models, were checked for

correlation with each other. In Table 4.2, the Pearson product moment coefficient and

associated p-values between data types such as PSVT—R pre-test scores, various

Drawing and non-drawing grade types, and Experience scores are shown, excluding spatial visualization animation utilization. Items significantly linearly correlated are highlighted, assuming a critical α level of 0.05. Here, n=123 out of an original data set of

273. Records with missing PSVT—R pre- or post-tests, Experience scores, and grades

were removed.

Another similar listing of Pearson product moment correlation coefficients that

does include spatial visualization instructional animation correlations is shown in Table

4.3, resulting in only 38 full rank records. Both a simplified (Table 4.2) and more

detailed (Table 4.3) table are provided due to this drop in sample size as a result of full

rank data requirements.
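For readers reproducing this step, pairwise Pearson coefficients and p-values of the kind reported in Tables 4.2 and 4.3 could be computed as sketched below; the column names are hypothetical stand-ins for the master-spreadsheet fields.

import pandas as pd
from scipy.stats import pearsonr

df = pd.read_csv("master_spreadsheet.csv")   # hypothetical column names
cols = ["PSVT_R_PRE", "Vis_Grade_Pct", "NonVis_Grade_Pct", "L_Total", "MODELS", "SPORTS"]

full = df.dropna(subset=cols)                # pairwise statistics computed on full-rank records
rows = []
for i, a in enumerate(cols):
    for b in cols[i + 1:]:
        r, p = pearsonr(full[a], full[b])
        rows.append({"x": a, "y": b, "pearson_r": round(r, 3), "p_value": round(p, 3)})

print(pd.DataFrame(rows))                    # entries with p < 0.05 are significant at alpha = 0.05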

From the correlation tables, especially Table 4.2, several trends are evident. First,

Experience components (as listed in Appendix B) DRAFTING (drafting, design, and

sketching experience) and especially MODELS (building blocks, models, and stackable

toy experience) are linearly correlated with pre-test scores and most grades, as is the

overall Experience composite, L Total. It is also striking how correlated PSVT—R pre-

test scores, representing initial spatial visualization abilities, are with all grades, as well

as DRAFTING, MODELS, SPORTS (sports experience), and MUSIC (musical

experience). While correlation does not imply causality, it is also noteworthy that

SPORTS (sports experience) is negatively correlated with PSVT—R pre-test scores and

all visualization-related grades, but not so with non-visualization grades.

Table 4.2: Pearson product moment correlation coefficients and p-values for initial spatial ability (PSVT—R pre-test), various Drawing and non-drawing grades, and Experience scores (n=123; no missing pre-tests, Likert scores, or grades; correlations significant at α=0.05 highlighted).

Table 4.3: Pearson product moment correlation coefficients and p-values for initial spatial ability, various Drawing and non-drawing grades, animation tool utilization, and Experience scores (n=38; no missing pre-tests, Likert scores, grades, or utilization data; correlations significant at α=0.05 highlighted).

Preliminary tests were run with data sets filtered for ceiling effects to check samples and blocks for differences. Although sampling for control and experiment assignment was random and blocks were a result of existing environmental circumstances (student scheduling), it was deemed worthwhile to check for significant differences in the population before performing the hypothesis testing on the models described in Chapter 3.

The first test (Table 4.4) tests for significant differences between the control and

treatment groups’ PSVT—R pre-test means. In this scenario, the data are filtered for

records of students with PSVT—R pre-test scores, all grades, and Experience survey

participation. The difference between the control and treatment groups is not significant

(T=0.66, p=0.51, n=123), with a control group mean pre-test score of 23.80, 0.52 greater

than the treatment group mean pre-test score of 23.28.

Two-sample T for PSVT-R PRE

Cont. (0)/Exp. (1)    N   Mean  StDev  SE Mean
0                    70  23.80   4.30     0.51
1                    53  23.28   4.27     0.59

Difference = mu (0) - mu (1)
Estimate for difference: 0.517
95% CI for difference: (-1.028, 2.062)
T-Test of difference = 0 (vs not =): T-Value = 0.66  P-Value = 0.509  DF = 112

Table 4.4: T-test for significant differences in PSVT—R means between control and treatment groups, filtered for pre-test scores, grades, and Experience scores.
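The same comparison can be reproduced with an unpooled-variance (Welch) two-sample t-test, the form shown in the Minitab output above; the sketch below assumes hypothetical column names.

import pandas as pd
from scipy import stats

df = pd.read_csv("master_spreadsheet.csv")          # hypothetical column names
control = df.loc[df["Treat"] == 0, "PSVT_R_PRE"].dropna()
treatment = df.loc[df["Treat"] == 1, "PSVT_R_PRE"].dropna()

# Welch's two-sample t-test (unpooled variances).
t_stat, p_value = stats.ttest_ind(control, treatment, equal_var=False)
print(round(t_stat, 2), round(p_value, 3))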

The next test (Table 4.5) also tests for significant differences between the control

and treatment groups’ PSVT—R pre-test means. In this scenario, the data are filtered

only for records of students with PSVT—R pre-test scores. The difference between the

control and treatment groups is not significant (T=0.82, p=0.41, n=246), although the

treatment group mean pre-test score of 23.43 is 0.49 greater than the mean control group

score of 22.94.

Two-sample T for PSVT-R PRE

Cont. (0)/Exp. (1)     N   Mean  StDev  SE Mean
0                    124  22.94   4.64     0.42
1                    122  23.43   4.69     0.42

Difference = mu (0) - mu (1)
Estimate for difference: -0.491
95% CI for difference: (-1.663, 0.682)
T-Test of difference = 0 (vs not =): T-Value = -0.82  P-Value = 0.410  DF = 243

Table 4.5: T-test for significant differences in PSVT—R means between control and treatment groups, filtered for pre-test scores.

Testing the same filtered sample again for differences between the blocks, or labs, trends towards significance, but is outside the α level of 0.05 (R2=0.063, p=0.16, n=246).

This is shown in Table 4.6.

However, in some filtering scenarios, such as when only PSVT—R pre- and post-test zeros are filtered out, resulting in n=190, the effect of blocking by lab is significant, as shown in Table 4.7, where the course section blocking factor is significant with p=0.046. In addition, this blocking factor accounts for some variance in each of the models. Intuitively, it makes sense that different meeting times, instructional team members, and other circumstantial considerations would impact tool usage and participation in course and study activities, and this would impact data fields such as grades, exam scores, PSVT—R scores, participation rates, et cetera.


One-way ANOVA: PSVT-R PRE versus Lab

Source   DF      SS    MS     F      P
Lab      11   335.9  30.5  1.43  0.160
Error   234  4993.5  21.3
Total   245  5329.4

S = 4.619 R-Sq = 6.30% R-Sq(adj) = 1.90%

Table 4.6: ANOVA for significant differences in PSVT—R between blocked labs, filtered for pre-test scores.

General Linear Model: PSVT-R PRE versus Lab, Cont.(0)/Exp.(1)

Factor            Type    Levels  Values
Lab               random      12  1A, 1B, 2A, 2B, 3A, 3B, 4A, 4B, 5A, 5B, 6A, 6B
Cont.(0)/Exp.(1)  fixed        2  0, 1

Analysis of Variance for PSVT-R PRE, using Adjusted SS for Tests

Source            DF   Seq SS   Adj SS  Adj MS     F      P
Lab               11   386.47   373.72   33.97  1.87  0.046
Cont.(0)/Exp.(1)   1     4.06     4.06    4.06  0.22  0.637
Error            178  3238.35  3238.35   18.19
Total            190  3628.88

S = 4.26532 R-Sq = 10.76% R-Sq(adj) = 4.75%

Table 4.7: GLM for significant differences in blocked labs, filtered for pre-test and post- test scores excluding zeros.

Principal Analysis of Data: Hypothesis Testing

Tool Access Effects on Spatial Visualization Improvement

The hypothesis to test the effects on spatial visualization gains due to treatment, or free tool access, follows. Achieving full rank data required filtering to remove zeros,


indicating non-participation, from PSVT—R pre- and post-test scores, and blanks in the

Experience scores. Descriptive statistics are provided in Table 4.8, with PSVT—R pre-

test scores filtered to allow only scores between 1 through 23, inclusively; PSVT—R post-test scores excluding zero scores and ceiling scores of 30; and records with no

Experience scores removed. Table 4.9 presents results of the general linear model for

differences in PSVT—R post-tests between control and treatment groups, given a lab

blocking factor with experience and pre-test scores as covariates, filtered for pre-test

scores to exclude zeros, no pre-test scores over 23/30, no post-test scores at 30/30, with

experience scores available. Differences between control and treatment groups were

significant at p=0.035. However, the only covariate of significance is the PSVT—R pre-

test score with p<0.0005. Experience scores (1-10) were not significant in explaining this

model. Altogether the version of the model described accounts for over 53% of the

variance.

Another version of this model is shown in Table 4.10. In this version, the nonsignificant covariates were all removed stepwise. Here, significant differences attributed to control versus treatment groups are shown with p=0.01, with an increase in mean post-test score from control to experiment of 2.255. The lab blocking factor is kept in place because it accounts for 13% of the variance. Altogether the model accounts for just over 50% of the variance.
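A sketch of the kind of backward-elimination loop described here (dropping the least significant candidate covariate until all remaining candidates are significant) is given below; the column names and the α threshold of 0.05 are illustrative assumptions.

import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("master_spreadsheet.csv")          # hypothetical column names
keep = ["C(Treat)", "C(Lab)", "PSVT_R_PRE"]         # terms retained regardless of p-value
candidates = ["DRAFTING", "MODELS", "SPORTS", "HOME_IMPROVEMENT", "ART",
              "VIDEO_GAMES", "CAD", "MUSIC", "CRAFTS", "NAVIGATE"]
ALPHA = 0.05

while True:
    formula = "PSVT_R_POST ~ " + " + ".join(keep + candidates)
    fit = smf.ols(formula, data=df).fit()
    pvals = fit.pvalues[[c for c in candidates if c in fit.pvalues.index]]
    if pvals.empty or pvals.max() <= ALPHA:
        break
    candidates.remove(pvals.idxmax())               # drop the least significant covariate

print(fit.summary())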


Table 4.8: Descriptive statistics for the data set filtered for PSVT—R pre-test scores of 1 through 23 (inclusive), PSVT—R post-test scores excluding zeros and ceiling scores of 30, and records with blank Likert (Experience) entries removed (n=72).

General Linear Model: PSVT-R POST versus Lab, Cont.(0)/Exp.(1)

Factor            Type    Levels  Values
Lab               random      12  1A, 1B, 2A, 2B, 3A, 3B, 4A, 4B, 5A, 5B, 6A, 6B
Cont.(0)/Exp.(1)  fixed        2  0, 1

Analysis of Variance for PSVT-R POST, using Adjusted SS for Tests

Source            DF  Seq SS  Adj SS  Adj MS      F      P
DRAFTING           1    0.06   12.81   12.81   1.06  0.309
MODELS             1    5.70    2.20    2.20   0.18  0.672
SPORTS             1    0.33    4.69    4.69   0.39  0.537
HOME IMPROVEMENT   1    0.19    0.70    0.70   0.06  0.811
ART                1   20.55    4.53    4.53   0.37  0.544
VIDEO GAMES        1   39.95    6.02    6.02   0.50  0.484
CAD                1    1.00    0.64    0.64   0.05  0.820
MUSIC              1   27.94    0.02    0.02   0.00  0.966
CRAFTS             1   19.62    0.00    0.00   0.00  0.989
NAVIGATE           1   14.44    0.05    0.05   0.00  0.951
PSVT-R PRE         1  332.12  232.24  232.24  19.15  0.000
Lab               11  149.98  120.12   10.92   0.90  0.547
Cont.(0)/Exp.(1)   1   56.78   56.78   56.78   4.68  0.035
Error             48  582.01  582.01   12.13
Total             71 1250.65

S = 3.48213 R-Sq = 53.46% R-Sq(adj) = 31.16%

Table 4.9: GLM ANCOVA for significant differences in PSVT—R post-test between control and treatment groups with blocked labs and experience and pre-test scores as covariates.


General Linear Model: PSVT-R POST versus Lab, Cont.(0)/Exp.(1) Factor Type Levels Values Lab random 12 1A, 1B, 2A, 2B, 3A, 3B, 4A, 4B, 5A, 5B, 6A, 6B Cont.(0)/Exp.(1) fixed 2 0, 1

Analysis of Variance for PSVT-R POST, using Adjusted SS for Tests Source DF Seq SS Adj SS Adj MS F P PSVT-R PRE 1 389.09 246.10 246.10 22.85 0.000 Lab 11 161.40 120.27 10.93 1.02 0.445 Cont.(0)/Exp.(1) 1 75.53 75.53 75.53 7.01 0.010 Error 58 624.64 624.64 10.77 Total 71 1250.65 S = 3.28171 R-Sq = 50.06% R-Sq(adj) = 38.86% Term Coef SE Coef T P Constant 12.363 3.250 3.80 0.000 PSVT-R PRE 0.6224 0.1320 4.72 0.000 L Total -0.07187 0.09976 -0.72 0.474 Lab 1A -0.377 1.473 -0.26 0.799 1B 2.325 1.854 1.25 0.215 2A 2.147 1.840 1.17 0.248 2B 0.222 1.470 0.15 0.881 3A 0.287 1.246 0.23 0.819 3B 0.086 1.201 0.07 0.943 4A 1.844 1.051 1.75 0.085 4B -1.083 1.168 -0.93 0.358 5A 0.472 1.380 0.34 0.734 5B -2.521 3.208 -0.79 0.435 6A -2.415 1.249 -1.93 0.058 Cont.(0)/Exp 0 -1.1274 0.4272 -2.64 0.011 Tukey 95.0% Simultaneous Confidence Intervals Response Variable PSVT-R POST All Pairwise Comparisons among Levels of Cont.(0)/Exp.(1) Cont.(0)/Exp.(1) = 0 subtracted from: Cont. (0)/Exp. (1) Lower Center Upper -----+------+------+------+- 1 0.5438 2.255 3.966 (------*------) -----+------+------+------+- 1.0 2.0 3.0 4.0 Tukey Simultaneous Tests Response Variable PSVT-R POST All Pairwise Comparisons among Levels of Cont.(0)/Exp.(1) Cont.(0)/Exp.(1) = 0 subtracted from: Cont. (0)/Exp. Difference SE of Adjusted (1) of Means Difference T-Value P-Value 1 2.255 0.8545 2.639 0.0107

Table 4.10: GLM ANCOVA for significant differences in PSVT—R post-test between control and treatment groups, given blocked labs and pre-test scores as covariates.


Tool Utilization Effects on Spatial Visualization Improvement

The hypothesis to test the effects on spatial visualization gains due to tool

utilization access within the treatment group follows. Two primary versions of the model

are tested, one involving the number of visits by each student to the animation tool, and another involving the total time spent per student on the animation tool. Achieving full

rank data initially required filtering to remove zeros from PSVT—R pre- and post-test

scores, and blanks in the Experience scores, as with the previous hypothesis test.

Descriptive statistics are similar to previous models.

Several data subsamples and model variations, with reasonable assumptions for ceiling-effect removal of records and full-rank data, were run, as finding significance was elusive; mostly negative utilization coefficients of small magnitude consistently

appeared. Only one iteration showed a significant predictor for either utilization model,

whether time-based or visit count-based. With Experience data ignored completely and

only records with zeros for PSVT—R tests removed, a p-value of 0.003 was produced

with -0.00028 as a coefficient for the version of the utilization model incorporating

Overall Total Time spent on the animations (n=95). As shown in Table 4.11, the covariate predictor effect was significant, but its practical meaning is questionable given such a minuscule magnitude for the utilization coefficient.


General Linear Model: PSVT-R POST versus Lab

Factor  Type    Levels  Values
Lab     random      12  1A, 1B, 2A, 2B, 3A, 3B, 4A, 4B, 5A, 5B, 6A, 6B

Analysis of Variance for PSVT-R POST, using Adjusted SS for Tests

Source             DF   Seq SS   Adj SS   Adj MS      F      P
PSVT-R PRE          1  424.663  232.104  232.104  26.15  0.000
Overal Total Time   1  101.328   81.879   81.879   9.22  0.003
Lab                11   76.963   76.963    6.997   0.79  0.651
Error              81  719.046  719.046    8.877
Total              94 1322.000

S = 2.97945   R-Sq = 45.61%   R-Sq(adj) = 36.88%

Term              Coef   SE Coef      T      P
Constant        16.498     2.001   8.24  0.000
PSVT-R PRE     0.41179   0.08053   5.11  0.000
Overal Total -0.000280  0.000092  -3.04  0.003

Table 4.11: GLM ANCOVA for significant effects of the Overall Total Time utilization covariate on PSVT—R post-test scores.

The Overall Total Time is derived from the Carmen log which simply records the time a student’s internet browser window is open on the target of the link monitored, which in the case of the instructional tool is the animation file. The Total Number of

Visits is simply a count of the number of times a student clicks on a link in Carmen. In either scenario, there is no indication as to whether the student is actively using the instructional tool each time, or if the student simply left a browser window open or continues to click on the same link multiple times without successfully accessing the animations.
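For illustration, per-student utilization measures of this kind could be derived from a per-event export of the access log as sketched below; the file and column names are hypothetical.

import pandas as pd

# Hypothetical per-event export of the Carmen object access log:
# one row per click, with the seconds the target window was open.
log = pd.read_csv("carmen_access_log.csv")   # columns: username, item, seconds_open

per_student = log.groupby("username").agg(
    overall_total_time=("seconds_open", "sum"),          # inflated if a window is left open
    overall_number_of_visits=("seconds_open", "size"),   # counts clicks, not active use
).reset_index()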

Thus the questionable validity of the time-based and visit-count-based utilization variables suggests one more approach for the investigation. The continuous utilization covariate was removed, and a Boolean usage random factor was added to the model.

This Boolean factor represents whether or not a student in the original treatment group chose to use the visualization tool. This also did not produce significant results with most model approaches and data subsets. One scenario, illustrated in Table 4.12, shows significant but negative effects. In this case, only those with tool access (in the treatment group) who took the PSVT—R pre-test, scored no greater than 25 (to remove ceiling effects), and provided Experience survey information were included in the test. After stepwise removal of the least significant Experience components, a p-value of 0.050 for the Boolean usage factor was obtained. The main effects for the model, shown in

Figure 4.3, indicate a decrease in post-test scores for those who chose to use the visualization tool for any amount of time or any number of visits.

Figure 4.3: Main effects (fitted means) for the PSVT—R post-test for the usage random factor (Used? = 0, 1).

General Linear Model: PSVT-R POST versus Used?

Factor Type Levels Values Used? random 2 0, 1

Analysis of Variance for PSVT-R POST, using Adjusted SS for Tests

Source DF Seq SS Adj SS Adj MS F P PSVT-R PRE 1 154.744 111.206 111.206 18.42 0.000 DRAFTING 1 15.349 50.057 50.057 8.29 0.007 HOME IMPROVEMENT 1 10.198 6.786 6.786 1.12 0.296 ART 1 3.867 13.267 13.267 2.20 0.147 CAD 1 11.473 21.270 21.270 3.52 0.068 NAVIGATE 1 1.755 6.263 6.263 1.04 0.315 Used? 1 24.850 24.850 24.850 4.12 0.050 Error 37 223.409 223.409 6.038 Total 44 445.644

S = 2.45725 R-Sq = 49.87% R-Sq(adj) = 40.38%

Term Coef SE Coef T P Constant 9.806 4.111 2.39 0.022 PSVT-R PRE 0.7333 0.1709 4.29 0.000 DRAFTING -1.6993 0.5902 -2.88 0.007 HOME IMPROVEMENT 0.4478 0.4224 1.06 0.296 ART 0.6941 0.4683 1.48 0.147 CAD 0.7454 0.3972 1.88 0.068 NAVIGATE -0.5920 0.5812 -1.02 0.315 Used? 0 0.8679 0.4278 2.03 0.050

Table 4.12: GLM for PSVT—R post-test score effects by the animation usage factor.

Experience Effects on Spatial Visualization Ability Gains

The hypothesis to test the effects on spatial visualization ability improvements due to developmental experiences follows. Achieving full rank data initially required filtering to remove zeros from PSVT—R pre- and post-test scores, and blanks in the

Experience scores, as with previous hypothesis tests. In addition, ceiling effects severely distort gains for students scoring in the upper echelon of the pre-test, so records with pre-test scores above 25/30 were filtered, leaving n=148. Descriptive statistics are otherwise

similar to previous models.

Checking for interactions between pre-test PSVT—R scores and Experience

scores, nonsignificant covariates were removed stepwise until further iterations resulted in loss

of significance. In the resulting model, shown in Table 4.13, covariate predictor effects

were significant for HOME IMPROVEMENT (p=0.028), where students were asked,

“How often have you fixed things, with a parent, guardian, other mentor, or on your own,

such as working on a car, home improvements, construction, carpentry, electronics, or other similar activities?” The interaction between pre-test score and HOME

IMPROVEMENT is also significant (p=0.035). Experience item DRAFTING, “How often have you engaged in drafting, design and design sketching, engineering graphics, or other manual (by hand) technical drawing activities?” showed a trend towards significance (p=0.058), as did its interaction with pre-test scores (p=0.085). In both cases, pre-test score, HOME IMPROVEMENT, and DRAFTING had negative effects on the response, PSVT—R gain, while both interactions had positive effects on the response.


General Linear Model: PSVT-R GAIN versus Lab

Factor Type Levels Values Lab random 12 1A, 1B, 2A, 2B, 3A, 3B, 4A, 4B, 5A, 5B, 6A, 6B

Analysis of Variance for PSVT-R GAIN, using Adjusted SS for Tests

Source DF Seq SS Adj SS Adj MS F P PSVT-R PRE 1 453.967 30.273 30.273 3.58 0.061 Lab 11 62.024 48.037 4.367 0.52 0.890 DRAFTING 1 19.967 30.914 30.914 3.65 0.058 PSVT-R PRE*DRAFTING 1 28.123 25.460 25.460 3.01 0.085 HOME IMPROVEMENT 1 0.570 41.691 41.691 4.92 0.028 PSVT-R PRE*HOME IMPROVEMENT 1 15.204 38.557 38.557 4.55 0.035 ART 1 6.420 20.436 20.436 2.41 0.123 PSVT-R PRE*ART 1 19.501 19.074 19.074 2.25 0.136 CAD 1 0.052 7.897 7.897 0.93 0.336 PSVT-R PRE*CAD 1 6.655 8.409 8.409 0.99 0.321 CRAFTS 1 0.023 2.912 2.912 0.34 0.559 PSVT-R PRE*CRAFTS 1 1.689 2.748 2.748 0.32 0.570 NAVIGATE 1 4.601 20.816 20.816 2.46 0.119 PSVT-R PRE*NAVIGATE 1 18.045 18.045 18.045 2.13 0.147 Error 123 1041.483 1041.483 8.467 Total 147 1678.324

S = 2.90987 R-Sq = 37.95% R-Sq(adj) = 25.84%

Term Coef SE Coef T P Constant 14.552 6.676 2.18 0.031 PSVT-R PRE -0.5443 0.2879 -1.89 0.061 Lab 1A -0.156 1.003 -0.16 0.876 1B 0.679 1.032 0.66 0.512 2A 0.2590 0.8399 0.31 0.758 2B 0.246 1.142 0.22 0.830 3A 0.7096 0.8607 0.82 0.411 3B 0.2407 0.7652 0.31 0.754 4A 0.8514 0.6969 1.22 0.224 4B -0.2419 0.6802 -0.36 0.723 5A -0.2048 0.8260 -0.25 0.805 5B -0.2262 0.8321 -0.27 0.786 6A -1.2664 0.8375 -1.51 0.133 DRAFTING -4.423 2.315 -1.91 0.058 PSVT-R PRE*DRAFTING 0.17055 0.09836 1.73 0.085 HOME IMPROVEMENT -3.832 1.727 -2.22 0.028 PSVT-R PRE*HOME IMPROVEMENT 0.15782 0.07396 2.13 0.035 ART 2.643 1.701 1.55 0.123 PSVT-R PRE*ART -0.10668 0.07108 -1.50 0.136 CAD 2.100 2.174 0.97 0.336 PSVT-R PRE*CAD -0.09004 0.09035 -1.00 0.321 CRAFTS -1.171 1.996 -0.59 0.559 PSVT-R PRE*CRAFTS 0.04760 0.08355 0.57 0.570 NAVIGATE 2.883 1.839 1.57 0.119 PSVT-R PRE*NAVIGATE -0.11379 0.07795 -1.46 0.147

Table 4.13: Experience effects and interactions with PSVT—R test gains.


Experience Effects on Initial Spatial Visualization Ability

This section describes the test of the effects on initial spatial visualization ability due to developmental experiences. Achieving full rank data initially required filtering to remove zeros from PSVT—R pre-test scores, and blanks in the Experience scores.

Fortunately, ceiling effects are not as great an issue here.

Testing for the effect of the total Experience composite score in predicting pre-test scores reveals significance (p=0.040). This analysis is shown in Table 4.14.

General Linear Model: PSVT-R PRE versus Lab

Factor Type Levels Values Lab random 12 1A, 1B, 2A, 2B, 3A, 3B, 4A, 4B, 5A, 5B, 6A, 6B

Analysis of Variance for PSVT-R PRE, using Adjusted SS for Tests

Source DF Seq SS Adj SS Adj MS F P L Total 1 130.38 89.56 89.56 4.28 0.040 Lab 11 278.75 278.75 25.34 1.21 0.280 Error 278 5820.51 5820.51 20.94 Total 290 6229.64

S = 4.57571 R-Sq = 6.57% R-Sq(adj) = 2.53%

Term Coef SE Coef T P Constant 19.872 1.676 11.86 0.000 L Total 0.11341 0.05484 2.07 0.040 Lab 1A -1.4744 0.9132 -1.61 0.108 1B -0.0773 0.8810 -0.09 0.930 2A 1.1729 0.9134 1.28 0.200 2B 0.619 1.022 0.61 0.545 3A 0.5390 0.8241 0.65 0.514 3B -0.1007 0.8636 -0.12 0.907 4A 0.6040 0.8951 0.67 0.500 4B 1.2276 0.8215 1.49 0.136 5A -0.3500 0.8504 -0.41 0.681 5B 0.8930 0.8809 1.01 0.312 6A -1.0069 0.9533 -1.06 0.292

Table 4.14: GLM for PSVT—R pre-test score effects from Experience composite scores.


A similar model featuring all ten separate Experience scores was run, with

stepwise removal of nonsignificant covariates until maximum significance of the remaining

Experience components was achieved. This resulted in Experience components

MODELS, “How often have you constructed models or played with building blocks,

Legos, Lincoln Logs, or other similar stackable toys?”; SPORTS, “How often have you played sports?”; and MUSIC, “How often have you played musical instruments or composed music?”, showing significance (p=0.003, 0.042, and 0.013, respectively), with

SPORTS showing negative effects on the response. The results of the analysis for this model are shown in Table 4.15.


General Linear Model: PSVT-R PRE versus Lab

Factor Type Levels Values Lab random 12 1A, 1B, 2A, 2B, 3A, 3B, 4A, 4B, 5A, 5B, 6A, 6B

Analysis of Variance for PSVT-R PRE, using Adjusted SS for Tests

Source DF Seq SS Adj SS Adj MS F P MODELS 1 248.73 181.45 181.45 9.21 0.003 SPORTS 1 107.85 82.29 82.29 4.18 0.042 MUSIC 1 137.95 123.86 123.86 6.29 0.013 Lab 11 297.62 297.62 27.06 1.37 0.185 Error 276 5437.48 5437.48 19.70 Total 290 6229.64

S = 4.43858 R-Sq = 12.72% R-Sq(adj) = 8.29%

Term Coef SE Coef T P Constant 21.024 1.512 13.90 0.000 MODELS 0.9146 0.3014 3.03 0.003 SPORTS -0.5444 0.2664 -2.04 0.042 MUSIC 0.5037 0.2009 2.51 0.013 Lab 1A -1.6237 0.8872 -1.83 0.068 1B -0.2475 0.8568 -0.29 0.773 2A 1.5400 0.8919 1.73 0.085 2B 0.6723 0.9924 0.68 0.499 3A 0.9119 0.8030 1.14 0.257 3B -0.5108 0.8467 -0.60 0.547 4A 0.5948 0.8682 0.69 0.494 4B 0.9892 0.7985 1.24 0.216 5A -0.1405 0.8306 -0.17 0.866 5B 0.7149 0.8589 0.83 0.406 6A -1.0462 0.9262 -1.13 0.260

Table 4.15: GLM for PSVT—R pre-test score effects by Experience components:

MODELS, SPORTS, and MUSIC.

Experience Effects on Grades

The hypothesis to test the effects of developmental experiences on heavily visualization-dependent grades, such as drawings, and non-visualization grades, such as team lab assignments, follows. Achieving full rank data required filtering to remove zero grades and blanks in the Experience scores, in a manner similar to previous hypothesis tests. This results in n=130. Descriptive statistics are otherwise similar to previous models.

Grades were adjusted to remove components such as attendance, group assignment grades, and journal entry participation, as well as bonus points unrelated to the curriculum. They were then segregated into overall grades, visualization-dependent grades, and grades that are not heavily visualization dependent. Visualization-dependent grades were further broken into drawing assignment grades and midterm grades that include only midterm problems requiring drawing and visualization skills similar to those required by the drawing homework assignments. All models were analyzed with a stepwise covariate elimination approach until maximum significance was attained. It is worth noting that the lab blocking factor had significant effects in a few instances here, reflecting the differences in instructional settings and grading.

Experience scores did not significantly predict overall final grades. Non- visualization grades were significantly affected by HOME IMPROVEMENT, “How often have you fixed things, with a parent, guardian, other mentor, or on your own, such as working on a car, home improvements, construction, carpentry, electronics, or other similar activities?” Covariate DRAFTING, “How often have you engaged in drafting, design and design sketching, engineering graphics, or other manual (by hand) technical drawing activities?”, and ART, “How often have you engaged in creating artwork

(drawing, graphic design, painting, 3D art, photography, etc.)?”, also showed some significance. Table 4.16 shows the GLM table for this model.


General Linear Model: Non-vis, grd reduced percent versus Lab

Factor Type Levels Values Lab random 10 1B, 2B, 3A, 3B, 4A, 4B, 5A, 5B, 6A, 6B

Analysis of Variance for Non-vis, grd reduced percent, using Adjusted SS for Tests

Source DF Seq SS Adj SS Adj MS F P DRAFTING 1 10.583 26.120 26.120 3.22 0.075 SPORTS 1 0.280 14.885 14.885 1.84 0.178 HOME IMPROVEMENT 1 35.195 48.468 48.468 5.98 0.016 ART 1 9.128 25.861 25.861 3.19 0.077 VIDEO GAMES 1 15.369 10.579 10.579 1.31 0.256 CAD 1 1.325 5.073 5.073 0.63 0.431 MUSIC 1 30.476 20.498 20.498 2.53 0.115 Lab 9 249.029 249.029 27.670 3.41 0.001 Error 113 915.896 915.896 8.105 Total 129 1267.280

S = 2.84698 R-Sq = 27.73% R-Sq(adj) = 17.49%

Term Coef SE Coef T P Constant 92.107 1.959 47.02 0.000 DRAFTING 0.6401 0.3566 1.80 0.075 SPORTS -0.3622 0.2673 -1.36 0.178 HOME IMPROVEMENT 0.7783 0.3183 2.45 0.016 ART -0.5189 0.2905 -1.79 0.077 VIDEO GAMES -0.2761 0.2417 -1.14 0.256 CAD -0.2638 0.3335 -0.79 0.431 MUSIC 0.3425 0.2154 1.59 0.115 Lab 1B 0.4396 0.9927 0.44 0.659 2B 0.5540 0.7212 0.77 0.444 3A 2.7862 0.7544 3.69 0.000 3B -0.4376 0.8116 -0.54 0.591 4A -1.2695 0.7068 -1.80 0.075 4B 0.9225 0.6478 1.42 0.157 5A -1.1630 0.7616 -1.53 0.130 5B 0.9849 0.7966 1.24 0.219 6A -2.9671 0.9164 -3.24 0.002

Table 4.16: GLM for response Non-visualization grades effects by Experience score covariates.

Combined visualization grades were not significantly affected by any of the

Experience score covariates, but HOME IMPROVEMENT, “How often have you fixed things, with a parent, guardian, other mentor, or on your own, such as working on a car,

home improvements, construction, carpentry, electronics, or other similar activities?", and VIDEO GAMES, "How often have you played video games?", showed trends towards significance. Table 4.17 shows the GLM table for this model. Covariate

VIDEO GAMES negatively affected the response.

General Linear Model: All Vis Grade percent versus Lab

Factor Type Levels Values Lab random 10 1B, 2B, 3A, 3B, 4A, 4B, 5A, 5B, 6A, 6B

Analysis of Variance for All Vis Grade percent, using Adjusted SS for Tests

Source DF Seq SS Adj SS Adj MS F P SPORTS 1 21.86 126.39 126.39 2.64 0.107 HOME IMPROVEMENT 1 189.88 143.89 143.89 3.01 0.085 VIDEO GAMES 1 80.45 154.32 154.32 3.23 0.075 Lab 9 647.04 647.04 71.89 1.50 0.155 Error 117 5594.60 5594.60 47.82 Total 129 6533.83

S = 6.91499 R-Sq = 14.37% R-Sq(adj) = 5.59%

Term Coef SE Coef T P Constant 94.409 3.853 24.50 0.000 SPORTS -1.0198 0.6272 -1.63 0.107 HOME IMPROVEMENT 1.2410 0.7154 1.73 0.085 VIDEO GAMES -1.0123 0.5635 -1.80 0.075 Lab 1B 0.660 2.338 0.28 0.778 2B 2.733 1.749 1.56 0.121 3A 2.326 1.798 1.29 0.198 3B 0.140 1.962 0.07 0.943 4A -1.482 1.674 -0.89 0.378 4B 0.894 1.569 0.57 0.570 5A -1.546 1.829 -0.85 0.400 5B 2.836 1.897 1.49 0.138 6A -1.531 2.168 -0.71 0.482

Table 4.17: GLM for response all visualization grades effects by Experience score covariates.

Midterm exam visualization-related grades were significantly affected by

MODELS, "How often have you constructed models or played with building blocks,

Legos, Lincoln Logs, or other similar stackable toys?” Covariate VIDEO GAMES,

“How often have you played video games?”, showed trends toward significance with negative effects. Table 4.18 shows the GLM table for this model.

General Linear Model: MT Vis percent versus Lab

Factor Type Levels Values Lab random 10 1B, 2B, 3A, 3B, 4A, 4B, 5A, 5B, 6A, 6B

Analysis of Variance for MT Vis percent, using Adjusted SS for Tests

Source DF Seq SS Adj SS Adj MS F P MODELS 1 235.83 388.56 388.56 4.48 0.036 SPORTS 1 39.13 78.32 78.32 0.90 0.344 VIDEO GAMES 1 168.05 296.35 296.35 3.42 0.067 MUSIC 1 122.94 86.37 86.37 1.00 0.320 CRAFTS 1 39.87 57.25 57.25 0.66 0.418 Lab 9 915.73 915.73 101.75 1.17 0.318 Error 115 9968.46 9968.46 86.68 Total 129 11490.01

S = 9.31033 R-Sq = 13.24% R-Sq(adj) = 2.68%

Term Coef SE Coef T P Constant 92.015 6.157 14.94 0.000 MODELS 2.250 1.063 2.12 0.036 SPORTS -0.8053 0.8472 -0.95 0.344 VIDEO GAMES -1.4682 0.7941 -1.85 0.067 MUSIC 0.6872 0.6884 1.00 0.320 CRAFTS -0.7697 0.9471 -0.81 0.418 Lab 1B -1.144 3.122 -0.37 0.715 2B 2.547 2.340 1.09 0.279 3A 2.041 2.436 0.84 0.404 3B -1.790 2.710 -0.66 0.510 4A -2.161 2.258 -0.96 0.341 4B 1.471 2.100 0.70 0.485 5A 1.054 2.499 0.42 0.674 5B 4.910 2.561 1.92 0.058 6A -1.655 2.932 -0.56 0.574

Table 4.18: GLM for response of drawing problem scores on midterm exam effects by

Experience score covariates.


Drawing assignment visualization-related grades were not significantly affected by any of the Experience score covariates. However, SPORTS, “How often have you played sports?”, and HOME IMPROVEMENT, “How often have you fixed things, with a parent, guardian, other mentor, or on your own, such as working on a car, home improvements, construction, carpentry, electronics, or other similar activities?”, show trends toward significance and the lab blocking factor was significant at p<0.0005.

SPORTS had negative effects on the response. Table 4.19 shows the GLM table for this model.

General Linear Model: DWG Vis Grade percent versus Lab

Factor Type Levels Values Lab random 10 1B, 2B, 3A, 3B, 4A, 4B, 5A, 5B, 6A, 6B

Analysis of Variance for DWG Vis Grade percent, using Adjusted SS for Tests

Source DF Seq SS Adj SS Adj MS F P SPORTS 1 11.95 64.76 64.76 2.99 0.086 HOME IMPROVEMENT 1 217.85 76.80 76.80 3.55 0.062 VIDEO GAMES 1 18.59 49.23 49.23 2.27 0.134 Lab 9 831.64 831.64 92.40 4.27 0.000 Error 117 2534.49 2534.49 21.66 Total 129 3614.52

S = 4.65428 R-Sq = 29.88% R-Sq(adj) = 22.69%

Term Coef SE Coef T P Constant 93.113 2.593 35.91 0.000 SPORTS -0.7300 0.4222 -1.73 0.086 HOME IMPROVEMENT 0.9066 0.4815 1.88 0.062 VIDEO GAMES -0.5718 0.3793 -1.51 0.134 Lab 1B 3.062 1.573 1.95 0.054 2B 3.112 1.177 2.64 0.009 3A 2.976 1.210 2.46 0.015 3B 1.541 1.321 1.17 0.246 4A -0.404 1.127 -0.36 0.721 4B -0.572 1.056 -0.54 0.589 5A -4.204 1.231 -3.41 0.001 5B -0.052 1.277 -0.04 0.968 6A -0.545 1.459 -0.37 0.710

Table 4.19: GLM for drawing assignment visualization grades effects by Experience score covariates.

Tool Utilization Effects on Grades

The hypothesis tested here concerns the effects of tool utilization and access within the treatment group on grades. Achieving full rank data initially required filtering to remove zeros from PSVT—R pre- and post-test scores, blanks in the Experience scores, and missing or zero grades. In addition, this subsample only involved treatment group participants, leaving n=52.
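A minimal sketch of this full-rank filtering step, assuming the records live in a pandas DataFrame with hypothetical column names, might look like the following.

# Full-rank filtering sketch: keep treatment-group records and drop zero
# PSVT-R scores, blank Experience scores, and missing or zero grades.
# The file and column names are hypothetical.
import pandas as pd

df = pd.read_csv("eng181_study_data.csv")

subsample = df[
    (df["GROUP"] == "treatment")
    & (df["PSVT_R_PRE"] > 0)
    & (df["PSVT_R_POST"] > 0)
    & df["EXPERIENCE_TOTAL"].notna()
    & (df["DWG_VIS_GRADE"] > 0)
]
print(len(subsample))  # the corresponding subsample in this study had n = 52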

Absolute animation tool usage, time spent, and number of online visits to the animations (the utilization measures) do not significantly affect visualization-reliant drawing assignment grades. As shown in Tables 4.20 and 4.21, usage and number of visits do not show significant effects (p=0.122 and p=0.070, respectively).


General Linear Model: DWG Vis Grade percent versus Lab, Used?

Factor  Type    Levels  Values
Lab     random      10  1B, 2B, 3A, 3B, 4A, 4B, 5A, 5B, 6A, 6B
Used?   random       2  0, 1

Analysis of Variance for DWG Vis Grade percent, using Adjusted SS for Tests

Source       DF   Seq SS   Adj SS  Adj MS     F      P
PSVT-R PRE    1   143.21   124.11  124.11  7.46  0.009
Lab           9   378.62   396.07   44.01  2.64  0.016
Used?         1    41.45    41.45   41.45  2.49  0.122
Error        41   682.51   682.51   16.65
Total        52  1245.78

S = 4.08003   R-Sq = 45.21%   R-Sq(adj) = 30.52%

Term           Coef  SE Coef      T      P
Constant     81.805    3.629  22.54  0.000
PSVT-R PRE   0.4049   0.1483   2.73  0.009
Lab
  1B          3.025    2.012   1.50  0.140
  2B          3.740    1.790   2.09  0.043
  3A          3.671    1.591   2.31  0.026
  3B         -0.722    3.788  -0.19  0.850
  4A          1.655    1.437   1.15  0.256
  4B         -0.820    1.548  -0.53  0.599
  5A         -3.982    1.961  -2.03  0.049
  5B         -0.497    1.559  -0.32  0.752
  6A         -1.582    2.026  -0.78  0.439
Used?
  0         -1.0797   0.6843  -1.58  0.122

Table 4.20: Effects of animation tool usage on drawing visualization grades.


General Linear Model: DWG Vis Grade percent versus Lab

Factor  Type    Levels  Values
Lab     random      10  1B, 2B, 3A, 3B, 4A, 4B, 5A, 5B, 6A, 6B

Analysis of Variance for DWG Vis Grade percent, using Adjusted SS for Tests

Source                     DF   Seq SS   Adj SS  Adj MS     F      P
PSVT-R PRE                  1   143.21   129.90  129.90  7.98  0.007
Overall Number of Visits    1    86.65    56.38   56.38  3.46  0.070
Lab                         9   348.35   348.35   38.71  2.38  0.029
Error                      41   667.58   667.58   16.28
Total                      52  1245.78

S = 4.03514   R-Sq = 46.41%   R-Sq(adj) = 32.04%

Term            Coef  SE Coef      T      P
Constant      80.989    3.669  22.07  0.000
PSVT-R PRE    0.4153   0.1471   2.82  0.007
Overall Numb  0.1931   0.1038   1.86  0.070
Lab
  1B           3.153    1.969   1.60  0.117
  2B           3.150    1.765   1.79  0.082
  3A           3.400    1.532   2.22  0.032
  3B          -0.663    3.737  -0.18  0.860
  4A           1.477    1.432   1.03  0.308
  4B          -0.679    1.538  -0.44  0.661
  5A          -4.485    1.964  -2.28  0.028
  5B          -0.065    1.581  -0.04  0.968
  6A          -1.513    2.001  -0.76  0.454

Table 4.21: Effects of animation tool visits on drawing visualization grades.

Once again, animation tool usage, time spent, and number of online visits to the animations do not significantly affect midterm exam visualization-reliant grades. As shown in Table 4.22, however, the number of visits does trend toward significance (p=0.100).


General Linear Model: MT Vis percent versus Lab

Factor  Type    Levels  Values
Lab     random      10  1B, 2B, 3A, 3B, 4A, 4B, 5A, 5B, 6A, 6B

Analysis of Variance for MT Vis percent, using Adjusted SS for Tests

Source                     DF   Seq SS   Adj SS  Adj MS     F      P
PSVT-R PRE                  1   195.03   196.46  196.46  4.21  0.046
Overall Number of Visits    1   124.45   131.84  131.84  2.83  0.100
Lab                         9   570.16   570.16   63.35  1.36  0.238
Error                      41  1911.03  1911.03   46.61
Total                      52  2800.67

S = 6.82719   R-Sq = 31.77%   R-Sq(adj) = 13.46%

Term            Coef  SE Coef      T      P
Constant      78.074    6.207  12.58  0.000
PSVT-R PRE    0.5108   0.2488   2.05  0.046
Overall Numb  0.2953   0.1756   1.68  0.100
Lab
  1B          -4.412    3.331  -1.32  0.193
  2B           4.460    2.986   1.49  0.143
  3A           2.823    2.593   1.09  0.283
  3B          -5.508    6.322  -0.87  0.389
  4A           3.327    2.422   1.37  0.177
  4B           2.188    2.603   0.84  0.406
  5A          -0.074    3.324  -0.02  0.982
  5B           4.266    2.675   1.59  0.119
  6A          -2.947    3.385  -0.87  0.389

Table 4.22: Effects of animation tool visits on midterm exam visualization grades.

Further analysis examines the visualization-heavy drawing assignment whose corresponding animations were accessed by the most students. The Drawing 11 animations were fully accessed by 19 students; others accessed some, but not all, of the related animations for this assignment. A t-test, shown in Table 4.23, indicates that students who used the animation tool scored significantly better on average (p=0.045, one-tailed). In this scenario, n=245 because only records with zero scores on Drawing 11 or the PSVT—R pre-test were screened out.


t‐Test: Two‐Sample Assuming Equal Variances

                              Used DWG11 Tool  Did not use DWG11 Tool
Mean                              18.07894737             16.12389381
Variance                          5.673976608             24.54235988
Observations                               19                     226
Pooled Variance                   23.14470186
Hypothesized Mean Difference                0
df                                        243
t Stat                            1.701299768
P(T<=t) one-tail                  0.045082919
t Critical one-tail               1.651148402
P(T<=t) two-tail                  0.090165838
t Critical two-tail               1.969774341

Table 4.23: T-test for Drawing 11 scores of those using the Drawing 11 animation tool versus those who did not use the tool.
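The comparison in Table 4.23 is a two-sample t-test assuming equal (pooled) variances. The sketch below runs the same kind of test in Python; the score arrays are synthetic stand-ins for illustration only, not the study's data.

# Two-sample t-test assuming equal variances, as in Table 4.23.
# The arrays are synthetic placeholders, not the study's scores.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
used = rng.normal(loc=18.1, scale=2.4, size=19)       # used the Drawing 11 animations
not_used = rng.normal(loc=16.1, scale=5.0, size=226)  # did not use them

t_stat, p_two_tail = stats.ttest_ind(used, not_used, equal_var=True)
p_one_tail = p_two_tail / 2  # one-tailed p when the difference is in the hypothesized direction
print(t_stat, p_one_tail, p_two_tail)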

In addition, when accounting for pre-test scores in a GLM, as shown in Table 4.24, usage of the animation tool for Drawing 11 also shows significance (p=0.022).


General Linear Model: DWG11 versus USED?

Factor  Type    Levels  Values
USED?   random       2  0, 1

Analysis of Variance for DWG11, using Adjusted SS for Tests

Source    DF   Seq SS   Adj SS  Adj MS      F      P
PRE        1   193.82   244.43  244.43  11.00  0.001
USED?      1   117.61   117.61  117.61   5.29  0.022
Error    242  5379.73  5379.73   22.23
Total    244  5691.15

S = 4.71490   R-Sq = 5.47%   R-Sq(adj) = 4.69%

Term         Coef  SE Coef     T      P
Constant   12.341    1.542  8.00  0.000
PRE       0.21766  0.06564  3.32  0.001
USED?
  0       -1.3164   0.5723  -2.30  0.022

Table 4.24: GLM for Drawing 11 animation tool usage effects on Drawing 11 grades.
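Table 4.24 is essentially an analysis of covariance: Drawing 11 grade regressed on the PSVT—R pre-test score with tool usage as a two-level factor. A hedged sketch of an equivalent fixed-effects fit in Python follows; the file and column names are hypothetical, and a two-level usage factor gives the same F-test whether treated as fixed or random.

# ANCOVA-style analogue of Table 4.24: Drawing 11 grade on the PSVT-R pre-test
# covariate plus a two-level usage factor. File and column names are hypothetical.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("drawing11_scores.csv")  # assumed columns: DWG11, PRE, USED

fit = smf.ols("DWG11 ~ PRE + C(USED)", data=df).fit()
print(sm.stats.anova_lm(fit, typ=2))  # adjusted (Type II) sums of squares
print(fit.params)                     # coefficient estimates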

Spatial Visualization Pre-test Effects on Grades

The hypothesis tested here concerns the effects of spatial visualization pre-test scores on grades. Achieving full rank data required filtering to remove zeros from PSVT—R pre-test scores, blanks in the Experience scores, and missing or zero grades, leaving n=124.

As shown in Tables 4.25 through 4.30, PSVT—R pre-test scores were highly significant in their effects on all grades, in every scenario considered (p<0.012 for PSVT—R pre-test effects in all cases).


General Linear Model: Fin grade no team, no attend, versus Lab

Factor  Type    Levels  Values
Lab     random      10  1B, 2B, 3A, 3B, 4A, 4B, 5A, 5B, 6A, 6B

Analysis of Variance for Fin grade no team, no attend, using Adjusted SS for Tests

Source       DF   Seq SS   Adj SS  Adj MS      F      P
PSVT-R PRE    1   482.68   341.82  341.82  20.10  0.000
Lab           9   173.65   173.65   19.29   1.13  0.345
Error       113  1921.87  1921.87   17.01
Total       123  2578.20

S = 4.12404   R-Sq = 25.46%   R-Sq(adj) = 18.86%

Term            Coef  SE Coef      T      P
Constant      79.910    2.121  37.67  0.000
PSVT-R PRE   0.39922  0.08905   4.48  0.000
Lab
  1B          -0.269    1.366  -0.20  0.844
  2B           1.189    1.176   1.01  0.314
  3A           2.493    1.097   2.27  0.025
  3B          -0.888    1.132  -0.78  0.435
  4A          -0.237    1.027  -0.23  0.818
  4B          0.4739   0.9321   0.51  0.612
  5A          -0.791    1.057  -0.75  0.456
  5B           1.092    1.132   0.96  0.337
  6A          -1.723    1.292  -1.33  0.185

Table 4.25: Effects of PSVT—R pre-test scores on all grades.


General Linear Model: Fin grade NO DWG vis, no etc versus Lab

Factor  Type    Levels  Values
Lab     random      10  1B, 2B, 3A, 3B, 4A, 4B, 5A, 5B, 6A, 6B

Analysis of Variance for Fin grade NO DWG vis, no etc, using Adjusted SS for Tests

Source       DF   Seq SS   Adj SS  Adj MS      F      P
PSVT-R PRE    1   498.26   365.90  365.90  19.27  0.000
Lab           9   176.68   176.68   19.63   1.03  0.418
Error       113  2145.79  2145.79   18.99
Total       123  2820.74

S = 4.35767   R-Sq = 23.93%   R-Sq(adj) = 17.20%

Term            Coef  SE Coef      T      P
Constant      79.304    2.241  35.38  0.000
PSVT-R PRE   0.41304  0.09409   4.39  0.000
Lab
  1B          -0.680    1.443  -0.47  0.639
  2B           0.945    1.243   0.76  0.448
  3A           2.436    1.160   2.10  0.038
  3B          -1.144    1.196  -0.96  0.341
  4A          -0.345    1.085  -0.32  0.751
  4B          0.6766   0.9849   0.69  0.494
  5A          -0.364    1.117  -0.33  0.745
  5B           1.346    1.196   1.13  0.263
  6A          -1.897    1.365  -1.39  0.167

Table 4.26: Effects of PSVT—R pre-test scores on all but Drawing assignments.


General Linear Model: Non-vis, no final, no team, etc versus Lab

Factor  Type    Levels  Values
Lab     random      10  1B, 2B, 3A, 3B, 4A, 4B, 5A, 5B, 6A, 6B

Analysis of Variance for Non-vis, no final, no team, etc, using Adjusted SS for Tests

Source       DF    Seq SS    Adj SS  Adj MS      F      P
PSVT-R PRE    1   122.051    79.423  79.423  10.64  0.001
Lab           9   219.490   219.490  24.388   3.27  0.001
Error       112   836.350   836.350   7.467
Total       122  1177.891

S = 2.73266   R-Sq = 29.00%   R-Sq(adj) = 22.66%

Term            Coef  SE Coef      T      P
Constant      88.208    1.457  60.56  0.000
PSVT-R PRE   0.19890  0.06099   3.26  0.001
Lab
  1B          0.8773   0.9050   0.97  0.334
  2B          1.4185   0.7794   1.82  0.071
  3A          2.4225   0.7271   3.33  0.001
  3B         -0.3676   0.7504  -0.49  0.625
  4A         -1.4341   0.6813  -2.10  0.038
  4B          0.4263   0.6177   0.69  0.491
  5A         -1.2467   0.7246  -1.72  0.088
  5B          0.1097   0.7502   0.15  0.884
  6A         -2.5795   0.8568  -3.01  0.003

Table 4.27: Effects of PSVT—R pre-test scores on all but Drawing assignments and exam Drawing grades.


General Linear Model: Vis Grade percent versus Lab

Factor  Type    Levels  Values
Lab     random      10  1B, 2B, 3A, 3B, 4A, 4B, 5A, 5B, 6A, 6B

Analysis of Variance for Vis Grade percent, using Adjusted SS for Tests

Source       DF   Seq SS   Adj SS  Adj MS     F      P
PSVT-R PRE    1   499.51   290.30  290.30  9.76  0.002
Lab           9   476.76   476.76   52.97  1.78  0.080
Error       112  3331.70  3331.70   29.75
Total       122  4307.96

S = 5.45411   R-Sq = 22.66%   R-Sq(adj) = 15.76%

Term           Coef  SE Coef      T      P
Constant     82.677    2.907  28.44  0.000
PSVT-R PRE   0.3803   0.1217   3.12  0.002
Lab
  1B          0.791    1.806   0.44  0.662
  2B          3.408    1.556   2.19  0.031
  3A          1.994    1.451   1.37  0.172
  3B         -0.485    1.498  -0.32  0.747
  4A          0.950    1.360   0.70  0.486
  4B         -0.035    1.233  -0.03  0.978
  5A         -2.155    1.446  -1.49  0.139
  5B          1.082    1.497   0.72  0.471
  6A         -1.201    1.710  -0.70  0.484

Table 4.28: Effects of PSVT—R pre-test scores on Drawing assignment and exam

Drawing grades.


General Linear Model: DWG Vis Grade percent versus Lab

Factor  Type    Levels  Values
Lab     random      10  1B, 2B, 3A, 3B, 4A, 4B, 5A, 5B, 6A, 6B

Analysis of Variance for DWG Vis Grade percent, using Adjusted SS for Tests

Source       DF   Seq SS   Adj SS  Adj MS      F      P
PSVT-R PRE    1   337.08   178.57  178.57  10.54  0.002
Lab           9   835.84   835.84   92.87   5.48  0.000
Error       112  1897.83  1897.83   16.94
Total       122  3070.75

S = 4.11642   R-Sq = 38.20%   R-Sq(adj) = 32.68%

Term            Coef  SE Coef      T      P
Constant      84.623    2.194  38.57  0.000
PSVT-R PRE   0.29824  0.09187   3.25  0.002
Lab
  1B           3.171    1.363   2.33  0.022
  2B           3.251    1.174   2.77  0.007
  3A           2.975    1.095   2.72  0.008
  3B           1.278    1.130   1.13  0.261
  4A           0.694    1.026   0.68  0.500
  4B         -1.2218   0.9304  -1.31  0.192
  5A          -4.557    1.092  -4.17  0.000
  5B          -1.030    1.130  -0.91  0.364
  6A          -0.222    1.291  -0.17  0.863

Table 4.29: Effects of PSVT—R pre-test scores on Drawing assignment grades.


General Linear Model: MT Vis percent versus Lab

Factor  Type    Levels  Values
Lab     random      10  1B, 2B, 3A, 3B, 4A, 4B, 5A, 5B, 6A, 6B

Analysis of Variance for MT Vis percent, using Adjusted SS for Tests

Source       DF   Seq SS   Adj SS  Adj MS     F      P
PSVT-R PRE    1   618.29   374.63  374.63  6.59  0.012
Lab           9   513.63   513.63   57.07  1.00  0.442
Error       112  6370.23  6370.23   56.88
Total       122  7502.15

S = 7.54169   R-Sq = 15.09%   R-Sq(adj) = 7.51%

Term           Coef  SE Coef      T      P
Constant     81.449    4.020  20.26  0.000
PSVT-R PRE   0.4320   0.1683   2.57  0.012
Lab
  1B         -0.710    2.498  -0.28  0.777
  2B          3.506    2.151   1.63  0.106
  3A          1.375    2.007   0.69  0.495
  3B         -1.597    2.071  -0.77  0.442
  4A          1.112    1.880   0.59  0.556
  4B          0.714    1.705   0.42  0.676
  5A         -0.641    2.000  -0.32  0.749
  5B          2.414    2.070   1.17  0.246
  6A         -1.818    2.365  -0.77  0.444

Table 4.30: Effects of PSVT—R pre-test scores on Drawing midterm exam grades.

A final analysis was run, screening only for availability of midterm exam data, PSVT—R pre-test scores, and Experience information, resulting in n=187. In this instance, both the overall Experience composite score (p=0.019) and the PSVT—R pre-test (p=0.009) covariates are significant. This is shown in Table 4.31.


General Linear Model: MT 1,3,4 Total versus Lab

Factor  Type    Levels  Values
Lab     random      10  1B, 2B, 3A, 3B, 4A, 4B, 5A, 5B, 6A, 6B

Analysis of Variance for MT 1,3,4 Total, using Adjusted SS for Tests

Source       DF   Seq SS   Adj SS  Adj MS     F      P
PSVT-R PRE    1   410.40   325.27  325.27  6.94  0.009
L Total       1   204.04   262.77  262.77  5.61  0.019
Lab           9   436.55   436.55   48.51  1.04  0.414
Error       175  8201.17  8201.17   46.86
Total       186  9252.17

S = 6.84572   R-Sq = 11.36%   R-Sq(adj) = 5.79%

Term           Coef  SE Coef      T      P
Constant     43.375    4.003  10.84  0.000
PSVT-R PRE   0.3048   0.1157   2.63  0.009
L Total      0.2553   0.1078   2.37  0.019
Lab
  1B         -1.743    1.810  -0.96  0.337
  2B          2.134    1.778   1.20  0.232
  3A         -1.859    1.382  -1.34  0.180
  3B         -2.607    1.432  -1.82  0.070
  4A         -0.411    1.405  -0.29  0.770
  4B          1.127    1.335   0.84  0.400
  5A          1.249    1.470   0.85  0.397
  5B          0.968    1.541   0.63  0.531
  6A          1.465    1.551   0.94  0.346

Table 4.31: Effects of PSVT—R pre-test scores and Experience composite scores on

Drawing midterm exam grades.
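Table 4.31 adds the overall Experience composite ("L Total") alongside the PSVT—R pre-test covariate. Assuming, for illustration, that the composite is the sum of the ten Likert-type item scores listed in Appendix B, a sketch of the corresponding model follows; the file and column names are hypothetical.

# Sketch of the Table 4.31 model. The Experience composite is assumed here to be
# the sum of the ten Likert items from Appendix B; file and column names are
# hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("eng181_midterm_experience.csv")

items = ["DRAFTING", "MODELS", "SPORTS", "HOME_IMPROVEMENT", "ART",
         "VIDEO_GAMES", "CAD", "MUSIC", "CRAFTS", "NAVIGATION"]
df["L_Total"] = df[items].sum(axis=1)  # assumed composite: sum of the item scores

model = smf.mixedlm("MT_Total ~ PSVT_R_PRE + L_Total", data=df, groups=df["Lab"])
print(model.fit().summary())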

Finally, the scatterplot of Drawing 11 grades plotted against PSVT—R pre-test scores by usage (Figure 4.4) shows an overall grade gain for those who used the animations. It also shows a leveling of the advantage that students with higher PSVT—R scores may have in visualization-reliant assignments, to some extent nullifying the grade advantage of those who scored higher initially on the PSVT—R.


[Scatterplot of DWG11 (Drawing 11 grade, 0 to 25) versus PRE (PSVT—R pre-test score, 10 to 30), with points grouped by USED? = 0 or 1.]

Figure 4.4: Drawing 11 grades plotted against PSVT—R pre-test scores by animation tool usage.
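A plot in the style of Figure 4.4 can be reproduced from the screened records with a few lines of Python; the file and column names below are hypothetical.

# Scatterplot in the style of Figure 4.4: Drawing 11 grade versus PSVT-R
# pre-test score, with points split by animation tool usage.
# File and column names are hypothetical.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("drawing11_scores.csv")  # assumed columns: DWG11, PRE, USED

fig, ax = plt.subplots()
for flag, marker in [(0, "o"), (1, "s")]:
    grp = df[df["USED"] == flag]
    ax.scatter(grp["PRE"], grp["DWG11"], marker=marker, label=f"USED? = {flag}")
ax.set_xlabel("PSVT-R pre-test score (PRE)")
ax.set_ylabel("Drawing 11 grade (DWG11)")
ax.legend()
plt.show()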

Student Feedback

Questionnaire

Students were emailed a questionnaire after the course was completed to gauge their general impressions and reactions; to verify study validity regarding animation access between groups; and to solicit suggestions. All students who provided consent (273) were emailed. After a week without a response, another email was sent. Four emails were returned by the mail server. Sixteen replies were eventually received, nine from the control group and seven from the treatment group. None of the control-group respondents knew about the animations' availability or had ever accessed them. Following is a listing of the questions sent, with a summary of responses from the treatment group students below each question. A full copy of the script from the email is provided in Appendix C.

1. Were you aware of the animated instructional tool available on Carmen on a trial

basis in ENG 181 in WI08? If not, you do not have to answer the remaining

questions.

Yes.

2. How did you hear of the availability of the animations?

Received an email.

3. Were you able to access to the animations for the daily assignment problems?

Yes. One student reported a technical issue that was resolved.

4. How did you gain access to the tool, if it was not available through your own

login? (No worries, you’re not in any trouble!)

N/A.

5. Do you know of others who did not have access via their own login that also were

able to somehow gain access?

No.

6. Why did you stop using the animations?

One forgot about the tool’s availability on Carmen. Another generally finished

their work in class. Another already felt comfortable with their visualization

skills and did not feel they needed the animations.

7. Did you find the animations useful? In what ways?


They were helpful in visualizing the objects.

“Yes, because it's good to see the object from all angles.”

“I think it does give a bit more detail to overall structure when learning about

engineering drawings.”

8. Did you believe you needed help visualizing or understanding how objects appear

after rotation or orientation change of the objects from their initial states? If so,

please specify if you had trouble with: visualizing isometric drawings, visualizing

orthographic drawings, the Purdue Spatial Visualization Test, exam problems

involving drawings, or something else.

“When I first started drawing isometric drawings I had difficulty visualizing, but I

remember the tool on Carmen helped me think about when it rotated. It also

helped me see things I missed when I drew them.”

“I found it helpful, especially in visualizing orthographic drawings and the

Purdue Spatial Visualization Test.”

9. Do you believe that viewing the animations helped you understand how an object

in an isometric view looks when rotated to an orthographic view?

Yes.

10. Do you believe that viewing the animations helped you understand how an object

in an orthographic view looks when rotated to an isometric view?

Yes.

11. Do you believe that viewing the animations helped you generally understand how

an object would look when rotated from one position or orientation to another? In

other words, were you able to better understand, “see,” or mentally visualize how


an object would appear, would be drawn, or would be depicted if it was rotated or

given a new orientation from its initial state?

Yes.

“Yes, the animations were I thought were a very useful tool to help see how to

draw it. Sometimes I would use the animation more than once on a drawing

because I kept on missing part of the object I could not see before.”

12. How would you improve this instructional tool to make it more useful for future

students?

Allow the user to manipulate the objects in all directions. Make the tool available

to everyone, “especially for students who don’t have drawing experience like

me.”

Focus Group

An invitation to participate in a focus group was included in the same email as the questionnaire, so the solicitation and response figures described above apply here as well. Only one student showed interest in participating in the focus group, but did not reply after several attempts to schedule a discussion. A full copy of the script from the email, the same script as the questionnaire, is provided in Appendix C.


CHAPTER 5: DISCUSSION, CONCLUSIONS, AND RECOMMENDATIONS

This study investigated whether virtual 3-dimensional animated models of existing, assigned, paper-based engineering graphics problems can be used as an effective instructional tool that enables students to enhance their spatial visualization skills. This study also used data collected to investigate the relationship between developmental activities and student initial spatial visualization abilities and visualization gains after instruction and/or treatment. This investigation also sought to determine the relationship between initial spatial visualization skills and both heavily visualization-reliant and less visualization-reliant assignment grades.

An underlying theme of this investigation involved using readily available resources within the program offering the introductory engineering course. An objective was to show that such an undertaking could use existing technology accessible to an academic department or program and could be run in the background with minimal disruption to instruction and students, while collecting and analyzing a large dataset with hundreds of participants. This girded the activities in this investigation, particularly the data collection, administration of indirect instructional devices, and enforcement of the research design.


Discussion and Conclusions

Discussion and Conclusions of Results

Tool Access Effects on Spatial Visualization Improvement

The first hypothesis tested the effects of treatment, or free tool access, on spatial visualization gains. Participants in the treatment group, those who had free access to the spatial visualization instructional animations, scored higher on the PSVT—R post-test than the control group when pre-test scores were taken into account.

While at first this seems to validate the animation tool as an open-access aid for improving spatial visualization skills, further analysis is warranted for confirmation, and that analysis lowers confidence in the effects of the tool itself. The significantly higher mean test score may be a result of random fluctuation, producing a Type I error. The mean score of the treatment group was nearly one half of a point greater than that of the control group. If treatment group participants happened to perform below their ability on the pre-test and above it during the post-test, falsely significant results may ensue.

Tool Utilization Effects on Spatial Visualization Improvement

The second hypothesis tested the effects on spatial visualization gains due to tool utilization access within the treatment group. Although total time spent using animations was significant with slightly negative effects in one scenario, animation utilization did not appear to generally have significant effects on spatial visualization improvement.


However, a scenario employing a Boolean tool-usage random factor, intended to determine the effect of using the tool in any capacity on the response, showed that usage may have negative effects on post-test PSVT—R scores when pre-test PSVT—R scores are considered.

It appears that, rather than augmenting their existing transformational actions on objects and their representations, students may be replacing the substantial effort involved in holistically working out the drawn object's orientation in space as it transitions to the new orientation to be depicted, as demanded by the homework assignment, with a more passive set of transformations in which the student is simply provided a dynamic, continuous set of external representations. This is not to say that both are not helpful, but it would likely be more beneficial for the student to use both. Perhaps, if the animation tool were an additive activity rather than the substitutive one it appears to have become, it would result in a net gain in student spatial visualization development. In other words, in the setting in which it was presented, the animation tool may have replaced student developmental scaffolding rather than provided additional scaffolding; the two may not be equivalent, possibly resulting in a net loss in development.

Future iterations of this instructional tool should ensure that it is in fact additive and not substitutive, or, even worse, a crutch that substitutes a shortcut for the spatial visualization development gained through a traditional approach to the assignment. One way to implement this is to provide access to the tool only after the due date listed on the assignment schedule. Students in ENG 181 are permitted to redo assignments if they score below a threshold score, with a maximum redo score of 70% of the total point value of the assignment. This approach could help students who "just can't see it," could free Graduate Teaching Associate time, and would provide an incentive for students to review their graded work. It also addresses student comments received in the emailed survey, where the animation tool was praised for allowing them to see the objects depicted in orientations they had not themselves been able to conceive. Another approach would be to provide animations for an alternate problem set in the form of practice problems rather than the existing assigned problems.

Previous investigations (Sorby & Baartmans, 1996, 2000; Zavotka, 1987) show that virtual 3-D animations and computer-based spatial visualization remediation are effective. In those cases, the treatments were external to existing participant activities, and therefore not substitutive. It is also possible that the development provided by regular instruction in ENG 181 diluted the gains made through use of the animation tool, considering that the average time spent on the animations was only 34 minutes, compared to the magnitude of practice throughout the term of the course.

Experience Effects on Spatial Visualization Ability Gains

The next hypothesis tested the effects on spatial visualization ability improvements due to developmental experiences. The magnitude of student-reported experiences working around the house on home improvement projects (HOME

IMPROVEMENT) and drafting, design and sketching experiences (DRAFTING) showed some notable but not significant negative effects on spatial visualization gains, but interacted positively (also not significant) with pre-test scores on spatial visualization gains.


It is apparent here that those students with high Experience scores in these categories already scored near the ceiling of the PSVT—R. Thus, their potential gains were truncated by the test's maximum score.

Experience Effects on Initial Spatial Visualization Ability

The fourth hypothesis tested the effects on initial spatial visualization ability due to developmental experiences. Total experience scores showed significant effects on spatial visualization ability. Individual Experience scores showed significant effects on initial spatial visualization ability for those students reporting personal experiences playing with models and building block-type toys (MODELS), playing sports (SPORTS), and involvement in music (MUSIC), but SPORTS showed negative effects.

The significance of MODELS on spatial visualization ability confirms the findings of others (Deno, 1995) in that this type of developmental activity seems to be one of the strongest indicators of spatial visualization ability. Do spatial ability and experiences with building blocks, stackable toys, and models coincide because of an existing early ability and an innate interest in such things, or is the relationship causal? Spatial visualization gender-gap studies suggest the latter: girls who lacked early exposure to such activities for cultural reasons quickly close the gap later in life once given the opportunity to develop in this manner. If gender roles were less distinct, the gap may be smaller.

The negative effects of sports may be a result of the time sports take away from other activities that develop these skills. Sports may help more with developing skills for the mental manipulation of perspective, or spatial orientation, rather than the


manipulation of objects, or spatial visualization, although the two are related (Lord &

Garrison, 1998).

Experience Effects on Grades

The fifth hypothesis tested the effects of developmental experiences on heavily visualization-dependent and non-visualization grades. Home improvement experiences (HOME IMPROVEMENT) showed significant effects on both heavily visualization-invoking and less-visualization-invoking grades. Video game playing experiences (VIDEO GAMES) trended toward effects on visualization-related grades as well, but were not shown to be significant. Play with building block-type toys (MODELS) showed significant effects on midterm exam visualization-related problem performance. No significant effects by any of the Experience components were shown for visualization-related drawing assignment performance; however, SPORTS had some negative effect, while HOME IMPROVEMENT had some positive effects. The lab blocking factor showed significance.

Instructional setting, and particularly grading, plays a large part in the grades of daily assignments. Once again, MODELS seems to predict visualization skill indicators. Home improvement experiences that involve working with one's hands may help develop visualization skills, but perhaps they also instill a sense of motivation and work ethic. Further research should be conducted here; the activities listed under HOME IMPROVEMENT could be broken down and studied again to analyze the effects of its components.


Tool Utilization Effects on Grades

The sixth hypothesis tested the effects of tool utilization and access within the treatment group on grades. Number of visits to the animation tool and general usage had some effects on visualization-related drawing assignment grades, but these were not significant. However, for the specific drawing assignment most dependent on spatial visualization skills, and whose corresponding animations were accessed by the most participants (allowing for more data points), grades were shown to be significantly affected by use of the animation tool.

This provides more evidence that such an instructional tool should be applied in

an additive manner rather than used in place of traditional and/or holistic methods, as discussed earlier.

Spatial Visualization Pre-test Effects on Grades

The final hypothesis tested the effects of spatial visualization pre-test scores on grades. Of all the hypotheses tested, spatial visualization pre-test scores had the most consistent and significant effects on the response, which in this case was grades. Initial spatial visualization ability was a consistent predictor of all grades to some extent; in models where the final grade was the response, it regularly accounted on its own for one sixth to one fifth of the model variance. In models where the response was predominantly related to performance on daily drawing assignments, it was still consistently significant but accounted for only around one tenth of the model variance, with lab section gaining more prominence relative to the pre-test in accounting for model variance.


The significant effects of PSVT—R pre-test scores on grades highlight the importance of strong spatial visualization ability. Further efforts to enhance this skill set are called for, as those with greater spatial visualization skills consistently achieved higher grades in every facet of ENG 181 (excluding group assignments, which were removed from the analysis). Research has shown that those with better spatial visualization skills are better problem solvers (Koch, 2006), and this study, along with others, shows that spatial visualization skills are predictors of grades in first-year engineering.

These differences in accounted variance also point to differences in grading approaches. Final grades are heavily influenced (45%) by exam grades. Exams are graded uniformly across sections, with a single grader grading all submissions of one problem, whereas homework is graded individually by section.

Survey Responses

Students who responded to the emailed survey indicated that they found the animations useful, particularly for perceiving the objects virtually in various orientations. The most common suggestion was to grant the user control over the orientation and movement of the object.

The response rate here was low (6%) for a number of reasons. The survey was given well after the conclusion of the class rather than at its end. In addition, the required insertion of consent language and reminders as to the nature of the study and participant rights may have discouraged students from reading such a lengthy email and limited further involvement. Kaplowitz, Hadlock, and Levine (2004) found an email-only survey response rate of under 21% at Michigan State University, with the student population under 24 years of age underrepresented among respondents. Although the response rate for this study's email solicitation is unusually low, it may be expected considering the respondents' typical age, the delay in the solicitation, and the amount of text in the email.

Discussion and Conclusions of Limitations

This study demonstrated that an online, indirect instructional tool can be

implemented and measured for effectiveness in a real setting. However, several factors

affecting some of the results of this study should be noted. First, because of the manner

in which treatment group participants were informed of the tool’s availability, in a private

email, instructional team promotion was not applicable. Generally, when an important

guide or study tool is made available, students are informed and encouraged by

instructional team members to make use of the device. The process by which students

were informed of the availability of the instructional tool may have led to reduced awareness and utilization of the animations.

Student participation in the data collection devices can be encouraged through the use of incentives such as bonus points. Half-hearted participation could be addressed by rewarding students with bonus points for scoring high on the initial PSVT—R test, and then rewarding them again with more bonus points for showing gains at the end of the quarter.

Graduate Teaching Associates may also need encouragement, as, understandably, their top priorities in data processing are those items directly related to reporting student grades. If a study or data collection device does not directly contribute to generating student grades, they may not see it as a high priority, as may have been the case with the problem-by-problem midterm and final exam data. The final exam problem-by-problem data were not used due to a lack of availability.

The target population of the instructional tool is reduced as a result of an annually increasing average aptitude and achievement level of incoming freshmen. Considering the results shown when modeling subsamples of students scoring below the spatial visualization test ceiling, the greater average ability of the sample may also be a significant contributor to the outcome of some of the analyses. Comparable spatial visualization tests with a higher ceiling or increased difficulty could be utilized to test for effects on a larger sample. Another option is for the PSVT—R to be extended. The PSVT—R becomes more difficult as the participant progresses to the later problems in the test; perhaps this progression could continue for several extra problems to facilitate analyses for those at all skill levels. Special tests with higher ceilings exist for those with greater aptitudes; perhaps something similar exists or could be developed in the realm of spatial visualization. However, efforts to include upper-echelon students may not test the effects of the instructional tool on those most in need of remediation.

The analyses were also hindered by the need for full rank datasets. In this study it was necessary to remove the records with empty fields, particularly PSVT—R scores, Experience scores, and exam problem-by-problem data. Because this study ran alongside regular instruction and was not a closed experiment with actively proctored devices, participation rates were unpredictable. This was, however, part of the research design and a known risk. In the future, this could be alleviated by randomly assigning treatment and control participants by whole sections, although this may make it difficult to identify effects due to instructional settings. Another approach could involve splitting treatment and control groups by term, with similar instructional teams in each term, possibly allowing for valid comparisons without requiring blocking. This assumes lab section effects are leveled throughout similar instructional settings across the many lab sections in each term.

Recommendations

Recommendations for Practice

This study employed the use of The Ohio State University’s online course gradebook and curriculum web application, Carmen, by Desire-to-Learn. This application was used extensively in nearly every facet of the study. The application was used to enforce control and treatment group segregation. It was used for data collection and consolidation. Carmen was also used to selectively release the treatment to the appropriate group based on user login and access scheduling. It was also an essential deployment mechanism of the pre- and post-tests. Carmen also tracked the amount of treatment individuals received and collected online survey and Experience data. The

Carmen web application, and others like it, is a very powerful tool in more ways than those for which it is typically used. However, its potential has yet to be unleashed.

Although Carmen tracks and records a multitude of data, it is often difficult for designers or researchers to access these data in a readily useful manner. Much of the data, aside from regular grade records and calculations, are logged. These data logs require scripts written by administrators in information technology departments, and even then the output is not usable until properly parsed and reoriented; otherwise it is quite cryptic. Even a simple end-user download of survey data, if desired in a format other than the default generic report layout, is retrieved as cryptic output that requires further processing.

Software designers should also improve the user interface and tools available to instructors and designers. Compared to modern spreadsheets, formulae for simple calculations are too complex to enter and there is no way to query information on a large gradebook page with many students and many grade fields per student.

Recommendations for Research

Further inquiry as to how viewing virtual 3-D animations of rotating objects on a computer screen compares to other approaches for developing spatial visualization skills should be undertaken. Studies featuring multi-level treatments should be conducted to investigate the best way to provide indirect instruction for students wishing to enhance their spatial visualization skills while minimizing the burden on first-year program resources. Can the use of virtual 3-D animations be more effectively combined with other methods of spatial visualization enhancement? How much of a benefit do students get from viewing animated depictions of objects after completing assignments involving those same objects? How does giving students control of the orientation of the object in virtual 3-D space contribute to developing visualization skills compared to other virtual

3-D animations?

Further investigation should be conducted to ascertain a threshold of spatial visualization competency measured by a common spatial visualization test, such as the

PSVT—R, for being able to successfully complete a battery of different types of mental exercises in different disciplines such as mathematics, language, the sciences, etcetera.

Performance on spatial visualization tests seems to be a significant predictor of grades in

engineering and other technical fields, but how does this relate to non-technical fields?

Perhaps competency in spatial visualization is more obviously related to music and visual

arts, but how does this relate to achievement in verbal or literary activities? It would be

also useful to investigate the relationship between preferred learning styles and

visualization abilities in individuals across different fields of study.

While it is not possible to capture every facet of a student or any human in a

model, other predictors of student spatial visualization aptitude should be researched. In

addition to developmental experiences, what other indicators of spatial visualization

aptitude are there? Are spatial visualization skills indicators of achievement levels in

non-technical fields? Further investigation is warranted.

More surveys of different visualization tests would be useful. There are many tests available and many are well documented (Eliot & Smith, 1983), and some reviews have examined various tests. However, more direction on the application of such tests, ratings

of the difficulty levels, and the typical ceiling of different population samples, e.g.,

engineering students versus middle school math students, would be beneficial. In addition, a compendium of different visualization tests and whether they are best suited for testing traditional measures such as spatial visualization ability, spatial orientation ability, or some form of dynamic visualization ability (now possible with modern technology), would also be of great use. An inventory of common activities related to the types of visualization abilities they exercise merits compilation.

To unleash the full potential of Carmen or other similar institutional web applications, software engineers should consider building in a basic statistical analysis package, perhaps even one that dynamically updates as data such as grade entries, link or object accesses, or survey responses are recorded. Such a feature would allow for widespread educational and institutional quantitative research, letting faculty quickly check the significance of the effect that the number of accesses to a new piece of curriculum has on some response, such as a grade. Perhaps students are not aware of a new curriculum piece and are not accessing it. If institutional web applications like Carmen featured embedded statistical analysis packages, database query functionality, and spreadsheet-like capabilities and user interfaces, the sky would be the limit for educational researchers pursuing large-scale quantitative research projects.
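As a hedged illustration of the kind of one-line check such an embedded analysis package could make routine, the sketch below regresses a grade on the number of accesses to a curriculum item; the export file and column names are hypothetical.

# Illustrative quick check: does the number of accesses to a new curriculum item
# predict a grade? The export file and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

log = pd.read_csv("carmen_export.csv")  # assumed columns: accesses, grade

fit = smf.ols("grade ~ accesses", data=log).fit()
print(fit.pvalues["accesses"])  # p-value for the access effect
print(fit.params["accesses"])   # estimated grade change per access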

Finally, with an increasing proportion of courses either going fully online or providing student grades and other data entirely online, course grade applications such as Carmen need to be more than just repositories of information and data entry devices. If every simple analysis or presentation requires a complete download of updated data into a spreadsheet or statistical package, with each data type downloaded in a different format, the application becomes little more than an unwieldy web device for disseminating information to students – a mere webpage of grades and a static display of curriculum materials.


REFERENCES

Barnea, N., & Dori, Y. J. (1999). High-school chemistry students’ performance and gender differences in a computerized molecular modeling learning environment. Journal of Science Education and Technology, 8, 257-271.

Barr, R. E., Schmidt, P. S., Krueger, T. J., & Twu, C. (2000). An introduction to engineering through an integrated reverse engineering and design graphics project. Journal of Engineering Education, 89, 413-417.

Bishop, J. E. (1978). Developing students’ spatial ability. The Science Teacher, 45, 20- 23.

Bodner, G. M., & Guay, R. B. (1997). The Purdue visualization of rotations test. The Chemical Educator, 2(4).

Bolton, R. W., & Morgan, J. (1997). Engineering graphics in an integrated environment. Proceedings of the 27th ASEE/IEEE Frontiers in Education Conference.

Branoff, T. (1998). Coordinate axes and mental rotation tasks: a dual-coding approach. Proceedings of the 1998 American Society for Engineering Education Annual Conference & Exposition, Seattle, WA, Session 2238.

Branoff, T. (1998). The effects of adding coordinate axes to a mental rotations task in measuring spatial visualization ability in introductory undergraduate technical graphics courses. Engineering Design Graphics Journal, 62(2), 16-34.

Brenner, M. E., Mayer, R. E., Mosely, B., Brar, T., Duran, R., Reed, B. S., & Webb, D. (1997). Learning by understanding: the role of multiple representations in learning algebra. American Educational Research Journal, 34(4), 663-689.

Brooks, J.G., & Brooks, M. G. (1999). The Case For Constructivist Classrooms. Alexandria, VA: Association for Supervision and Curriculum Development.

Bruner, J. S. (1990). Acts of meaning. Cambridge, Mass: Harvard University Press.

Carter, C. S., LaRussa, M. A., & Bodner, G. M. (1987). A study of two measures of spatial ability as predictors of success in different levels of general chemistry. Journal of Research in Science Teaching, 24, 645-657.


Condoor, S. S. (1999). Integrating design in engineering graphics courses using feature- based, parametric solid modeling. Proceedings of the 29th ASEE/IEEE Frontiers in Education Conference, Session 12d2.

Crown, S. W. (2001). Improving visualization skills of engineering graphics students using simple JavaScript Web Based Games. Journal of Engineering Education, 90, 347-355.

De Lisi, R., & Wolford, J. L. (2002). Improving children’s mental rotation accuracy with computer game playing. The Journal of Genetic Psychology, 163, 272-282. Retrieved September 5, 2004, from http://pqasb.pqarchiver.com/heldref/

Deno, J. A. (1995). The relationship of previous experiences to spatial visualization ability. Engineering Design Graphics Journal, 59(3), 5-17.

Eliot, J., & Smith, I. M. (1983). An international directory of spatial tests. Windsor, Berkshire: NFER-Nelson.

Friedlander, A., & Tabach, M. (2001). Promoting Multiple Representations in Algebra. In A. Cuoco (Ed.), The Roles of Representation in School Mathematics (pp. 173- 185). Reston, VA: National Council of Teachers of Mathematics.

Gerson, H. B. P., Sorby, S. A., Wysocki, A., & Baartmans, B. J. (2001). The development and assessment of multimedia software for improving 3-D spatial visualization skills. Computer Applications in Engineering Education, 9, 105-113.

Goldin, G., & Shteingold, N. (2001). Systems of representations and the development of mathematical concepts. In A. A. Cuoco & F. R. Curcio (Eds.), The roles of representation in school mathematics: 2001 yearbook (pp. 1-23). Reston, VA: National Council of Teachers of Mathematics.

Greeno, J.G., & Hall, R.P. (1997). Practicing representation: Learning with and about representational forms. Phi Delta Kappan, 78(5), 361-367.

Guay, R. B. (1977). Factors affecting the development of two spatial abilities basic to technical drawing. Journal of Industrial Teacher Education, 14(3), 38-43.

Guay, R. B., & McDaniel, E. D. (1979). Toward explaining sex differences in spatial ability: An investigation of selected cultural and neurophysiological factors. Arlington, VA: Army Research Institute for the Behavioral and Social Sciences. (ERIC Document Reproduction Service No. ED 185 071)

Guay, R. B., McDaniel, E.D., & Angelo, S. (1978). Analytic factor confounding spatial ability measurement. Paper presented at the annual meeting of the American Psychological Association, Toronto, Canada.


Guay, R. B., & McDaniel, E. D. (1982). The relationship between spatial abilities and social-emotional factors. Lafayette, IN: Purdue University. (ERIC Document Reproduction Service No. ED 224 810)

Hegarty, M., & Waller, D. (2004). A dissociation between mental rotation and perspective-taking spatial abilities. Intelligence, 32(2), 175-191.

Hsi, S., Linn, M. C., & Bell, J. E. (1997). The role of spatial reasoning in engineering and the design of spatial instruction. Journal of Engineering Education, 86, 151-158.

Huang, S. H., Bhura K. R., & Wang G. (2000). Cutter path simulation and product visualization using AutoCAD. Computer Applications in Engineering Education, 8(2).

Janvier, C., Girardon, C., & Morand, J. C. (1993). Mathematics symbols and representations. In P. S. Wilson (Ed.), Research ideas for the classroom: High school mathematics (pp. 79-102). New York: Macmillan Publishing Company.

Kaplowitz, M. D., Hadlock, T. D., & Levine, R. (2004). A comparison of web and mail survey response rates. Public Opinion Quarterly. 68(1), 94-101.

Koch, D. S. (2006). The effects of solid modeling and visualization on technical problem solving. Blacksburg, Va: University Libraries, Virginia Polytechnic Institute and State University.

Kozhevnikov, M., & Thornton, R. (2006). Real-Time Data Display, Spatial Visualization Ability, and Learning Force and Motion Concepts. Journal Of Science Education And Technology. 15(1), 111-132.

Kwon, O. N. (2003). Fostering spatial visualization ability through web-based virtual- reality program and paper-based program. Lecture Notes in Computer Science, 2713, 701-706.

Little, P., & Cardenas, M. (2001). Use of studio methods in the introductory engineering design curriculum. Journal of Engineering Education, 90, 309-318.

Lohman, D. (1988). Spatial abilities as traits, processes, and knowledge. In R. J. Sternberg (Ed.), Advances in the Psychology of Human Intelligence, Vol. 4, Erlbaum, Hillsdale, NJ, pp. 181–248.

Lord, T. R. (1985). Enhancing the visuo-spatial aptitude of students. Journal of Research in Science Teaching, 22, 395-405.


Lord, T.R., & Garrison J. (1998). Comparing spatial abilities of collegiate athletes in different sports. Perceptual and Motor Skills. 86(3), 1016-8.

McDaniel, E.D., Guay, R. B., Ball, L., & Kolloff, M. (1978). A spatial experience questionnaire and some preliminary findings. Paper presented at the annual meeting of the American Psychological Association, Toronto, Canada.

Morgan, J., & Bolton, R. W. (1998). An integrated first-year engineering curricula. Proceedings of the 28th ASEE/IEEE Frontiers in Education Conference.

Newcomer, J. L., McKell, E. K., & Raudebaugh, R. A. (2001). Creating a strong foundation with engineering design graphics. Engineering Design Graphics Journal, 65(2).

The Ohio State University, College of Engineering, First-year Engineering Program. (2003). Summary of Student Learning Styles, ENG 181, Winter Quarter.

Pallrand, G. J. & Seeber, F. (1984). Spatial ability and achievement in introductory physics. Journal of Research in Science Teaching, 21, 507-516.

Pepin, M., & Dorval, M. (1986). Effect of playing a video game on adults' and adolescents' spatial visualization. Perceptual and Motor Skills, 62, 159-162.

Peters, M., Chisholm, P., & Laeng, B. (1995). Spatial ability, student gender, and academic performance. Journal of Engineering Education, 84, 69-73.

Piaget, J. (1967). The child’s conception of space. New York: W. W. Norton & Company, Inc.

Piaget, J. (1976). Psychology of intelligence. Totowa, NJ: Littlefield, Adams & Co.

Piaget, J. (2003). Development and learning. Journal of Research in Science Teaching. 40, S8-S18.

Piburn, M. D., Reynolds, S. J., McAuliffe, C., Leedy, D. E., Birk, J. P., & Johnson, J. K. (2005). The Role of Visualization in Learning from Computer-Based Images. Research Report. International Journal of Science Education. 27(5), 513-527.

Reese, H. W. (1999). Advances in child development and behavior (Vol. 25). San Diego: Academic Press.


Rochford, K., Fairall, A. P., Irvings, A., & Hurly, P. (1989). Academic failure and spatial visualization handicap of undergraduate engineering students. International Journal of Applied Engineering Education, 5, 741-749.

Savin-Baden, M. (2000). Problem-Based Learning in Higher Education: Untold Stories. (pp. 13-26, 143-148). Buckingham: The Society for Research into Higher Education and Open University Press.

Scribner, S. A., & Anderson, M. A. (2005). Novice Drafters' Spatial Visualization Development: Influence of Instructional Methods and Individual Learning Styles. Journal of Industrial Teacher Education. 42(2), 38-60.

Sorby, S. A., & Baartmans, B. J. (1996). A course for the development of 3-D spatial visualization skills. Engineering Design Graphics Journal, 60(1), 13-20.

Sorby, S. A., & Baartmans B. J. (2000). The development and assessment of a course for enhancing the 3-D spatial visualization skills of first year engineering students. Journal of Engineering Education, 89(3), 301-307.

Sorby, S. A., & Young, M. F. (1998). Assessment of a visualization-based placement exam for a freshman graphics course. Proceedings of the 1998 American Society for Engineering Education Annual Conference & Exposition, Session 2238.

Taylor, M., Pountney, D., & Malabar, I. (2007). Animation as an Aid for the Teaching of Mathematical Concepts. Journal of Further and Higher Education, 31(3), 249-261.

von Glasersfeld, E. (1989). An Exposition of Constructivism: Why Some Like It Radical. [S.l.]: Distributed by ERIC Clearinghouse.

Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Cambridge: Harvard University Press.

Yue, J., & Chen, D. M. (2001). Does CAD improve spatial visualization ability? Proceedings of the 2001 American Society for Engineering Education Annual Conference & Exposition, Session 2486.

Yue, J. (2002). Do basic mathematical skills improve spatial visualization abilities? Proceedings of the 2002 American Society for Engineering Education Annual Conference & Exposition, Session 3286.


Yue, J. (2002). Spatial visualization skills at various educational levels. Proceedings of the 2002 American Society for Engineering Education Annual Conference & Exposition, Session 2438.

Zavotka, S. L. (1987). Three-dimensional computer animated graphics: A tool for spatial skill instruction. Educational Communication and Technology Journal, 35(3), 133-44.


APPENDIX A: PURDUE SPATIAL VISUALIZATION TEST – ROTATIONS

(ONLINE VERSION)


APPENDIX B: EXPERIENCE SCORE LIKERT-TYPE SURVEY QUESTIONS


2007B0293, “Enhancing the Spatial Visualization Skills of First-Year Engineering Students” Survey Questions

Each item was rated on a five-point Likert-type scale: Never (1), Rarely (2), Occasionally (3), Frequently (4), Always (5).

1. DRAFTING: How often have you engaged in drafting, design and design sketching, engineering graphics, or other manual (by hand) technical drawing activities?

2. MODELS: How often have you constructed models or played with building blocks, Legos, Lincoln Logs, or other similar stackable toys?

3. SPORTS: How often have you played sports?

4. HOME IMPROVEMENT: How often have you fixed things, with a parent, guardian, other mentor, or on your own, such as working on a car, home improvements, construction, carpentry, electronics, or other similar activities?

5. ART: How often have you engaged in creating artwork (drawing, graphic design, painting, 3D art, photography, etc.)?

6. VIDEO GAMES: How often have you played video games?

7. CAD: How often have you used Computer Aided Design (CAD) software applications such as AutoCAD, SolidWorks, Autodesk Inventor, or others?

8. MUSIC: How often have you played musical instruments or composed music?

9. CRAFTS: How often have you engaged in crafts (sewing, homemade decorations, pottery, etc.)?

10. NAVIGATION: How often have you used maps or navigated in a car, navigated while hiking, etc.?

APPENDIX C: SURVEY AND FOCUS GROUP SOLICITATION


Enhancing Spatial Visualization Skills in First-Year Engineering Students Yosef Allam 03-09-09

Post-study Survey and Focus Group Feedback Solicitation

Survey Text and Questions:

Hello,

My name is Yosef Allam, and I need just a few minutes of your time. I am finishing a study to help engineering students with their visualization skills which I conducted in your ENG 181 class in WI08, as you may recall from my address to your class that quarter.

In ENG 181 in WI08, an instructional tool was provided on a trial basis to half of the students via links on the Carmen “Content” page, after logging in, which consisted of animations of daily assignments in Basics, showing a video, viewable in Windows Media Player. These were video animations of parts you were to draw as part of your daily assignments rotating to and from isometric and orthographic positions. The goal of this trial offering of these animations was to determine whether or not such a tool would help students with their visualization skills (per test results of the Purdue Spatial Visualization Test you took via Carmen online at the beginning and end of the WI08 academic quarter) and to aid students in understanding how an object correctly translates from an orthographic to isometric depiction, and vice versa.

There are just a few questions I would like to ask you regarding this. All you have to do is reply to this email. Your identity will not be used, and this is only meant to gauge general student impressions towards the use of the tool for future use and for research purposes. By answering these questions, you are helping future engineering students at The Ohio State University College of Engineering. To jog your memory, I have attached a sample animation that was used last year and is similar to the other animations that were made available to some students. Thanks in advance for your time and cooperation.

1. Were you aware of the animated instructional tool available on Carmen on a trial basis in ENG 181 in WI08? If not, you do not have to answer the remaining questions.
2. How did you hear of the availability of the animations?
3. Were you able to access to the animations for the daily assignment problems?
4. How did you gain access to the tool, if it was not available through your own login? (No worries, you’re not in any trouble!)
5. Do you know of others who did not have access via their own login that also were able to somehow gain access?
6. Why did you stop using the animations?
7. Did you find the animations useful? In what ways?


8. Did you believe you needed help visualizing or understanding how objects appear after rotation or orientation change of the objects from their initial states? If so, please specify if you had trouble with: visualizing isometric drawings, visualizing orthographic drawings, the Purdue Spatial Visualization Test, exam problems involving drawings, or something else. (possibly check boxes)
9. Do you believe that viewing the animations helped you understand how an object in an isometric view looks when rotated to an orthographic view?
10. Do you believe that viewing the animations helped you understand how an object in an orthographic view looks when rotated to an isometric view?
11. Do you believe that viewing the animations helped you generally understand how an object would look when rotated from one position or orientation to another? In other words, were you able to better understand, “see,” or mentally visualize how an object would appear, would be drawn, or would be depicted if it was rotated or given a new orientation from its initial state?
12. How would you improve this instructional tool to make it more useful for future students?

If you did use the instructional tool (you viewed the animations while taking ENG 181 in WI08), would you be interested in some free pizza? All you have to do is come in to a focus group session on campus to discuss the animation tool with other former students of ENG 181 from WI08. It should take only 30 minutes of your time, it’s a chance to reconnect with some fellow students from last year, and there’s free food! We will audio record the group’s conversation, but will not use your identity. If you are interested, please indicate so in your reply to this email and please provide your availability during the week and you will be given further details.

Per Institutional Review Board (IRB) requirements, we must acquire your consent to participate in this research endeavor. By replying to this email with responses to the survey questions, or by indicating a willingness to participate in the focus group and showing up for it, you will be assumed to have given your consent. Your name or any other identifying information will not be used in the research or subsequent publications.

Thanks again and good luck in your studies!

-Yosef Allam


Focus Group Starter Questions and Discussion topics:

1. Was availability of the animation well-communicated? If not, how could this be improved? (Information was provided via email.)
2. Did the instructional team encourage use, discourage use, or were they indifferent about the animation tool?
3. How did you find it useful?
4. Did you believe you needed help with visualizing rotated objects?
5. Did you believe you needed help with visualizing objects depicted orthographically?
6. Did you believe you needed help with visualizing objects depicted isometrically?
7. What areas of visualization or course assignments or exams and tests did it specifically help you with?
8. How else did you find it useful?
9. How could it be more useful?
10. Do you believe that having good visualization skills is important to being an engineer? How? Why? As a student? As a future professional?
11. Any other suggestions?
12. Do you have any questions for me?


APPENDIX D: RELEVANT ENG 181 WINTER 2008 DRAWING ASSIGNMENTS


DWG: Orthographic from Isometric (hidden line practice) Grading Guidelines

Part A (8 points)
  Extra line: ½ each, up to 1
  Missing visible line: ½ each, up to 2
  Missing hidden line: 1
  Wrong view orientation: 3
  Neatness: 1
Part B (8 points)
  Extra line: ½ each, up to 2
  Missing visible line: ½ each, up to 2
  Missing hidden line: 1
  Wrong view orientation: 3
  Neatness: 1
Title Information (4 points)
  Incomplete title information: ½ each, up to 2 points
  No use of engineering letters: 2
Total: 20 points


DWG: Orthographic to Isometric Drawings B15 & B16

Incorrect shape or dimensions: 2 each feature, max. 12
Incorrect orientation of views: 1 each, max. 2
Missing center lines: 1 each, max. 3
Incomplete title information: ½ each, max. 1
No use of engineering letters: 1
Neatness: 1
Total: 20 points


DWG: Missing Lines with Isometrics. Grading Guidelines

Missing Line (10 points)
  Incorrect line (including drawing more than one line): 1½ each, up to 9
  Neatness: ½
  Incomplete title information or no use of engineering letters: ½
Isometric Views (10 points)
  Isometric view according to the orthographic views: 1 each, up to 6
  Wrong orientation of the object: ½ each, up to 2
  Neatness: 1
  Incomplete title information or no use of engineering letters: 1
Total: 20 points

Note: If the student doesn’t rotate the isometric paper, the isometric views will not really be isometric views, and the assignment will have a maximum of 10 points. The student has the option to redo the complete assignment with a late penalty (max. 14 points).


APPENDIX E: ENG 181 WINTER 2008 DAILY ASSIGNMENT LIST


APPENDIX F: CONSENT DOCUMENTATION


2007B0293, “Enhancing the Spatial Visualization Skills of First-Year Engineering Students” Debriefing Script

Thank you for your participation in our research study, “Enhancing the Spatial Visualization Skills of First-Year Engineering Students”.

I would like to discuss with you in more detail the study you just participated in.

As you may know, scientific methods sometimes require that subjects in research studies not be given complete information about the research until after the experiment is completed. Although we cannot always tell you everything before you begin your participation, we do want to tell you everything when the experiment is completed.

Before I tell you about all the goals of this study, however, I want to explain why it is necessary in some kinds of studies to not tell people about the purpose of the study before they begin.

Discovering how people would naturally react is what we are really trying to find out in behavioral experiments. We don't always tell people everything at the beginning of a study because we do not want to influence your responses. If we tell people what the purpose of the experiment is and what we predict about how they will react, then their reactions would not be a good indication of how they would react in everyday situations.

Now, I would like to explain exactly what we were trying to study in this investigation.

Students enrolled in ENG 181 course sections in Winter 2008 (WI08) were given a PSVT-R (Purdue Spatial Visualization Test – Rotations) pre-test. They also completed a survey, in this case an augmented version of the already-existing “Journal Entry 1,” to collect experiential data. Enrollees were not informed of the study in advance to protect study validity and to simulate the voluntary nature of online instructional tool usage. Those randomly selected as treatment group participants were granted instructional tool usage via links on the “Content” page of their Carmen course application. These links selectively opened for treatment group participants when the relevant engineering graphics drawing was assigned and closed at the due date of the assignment. Link access tracking is built into the Carmen application, so utilization data was readily available for each treatment group participant. All participant data were stored within the Carmen gradebook. At the conclusion of the study, coinciding with the conclusion of the WI08 term, all students took the PSVT-R post-test, were emailed this study debriefing, and were instructed to reply in order to opt out. The instructional tool tested consisted of animations of ENG 181 Daily Assignment drawings. Any effect on grades will be leveled in the interest of fairness between control and treatment group participants, who also happen to be students receiving grades in ENG 181 in WI08. This study has been approved by the Institutional Review Board and the Director of the Fundamentals of Engineering Program, John Merrill, and the Associate Dean of the College of Engineering, Robert Gustafson.

To summarize, the goals of this study included gauging how your past experiences shape your spatial visualization abilities and your ability to further enhance them. In addition, the effectiveness of a spatial visualization tool is being gauged to determine how it can be augmented and whether some version of it should be used in the future in the Fundamentals of Engineering Program. Once again, it is important to emphasize that all necessary steps are being taken to make sure that status as a treatment versus control group participant does not unfairly affect your grades or the grades of your peers. Your participation is greatly appreciated and will help future students in the program.

If other people knew the true nature of the experiment, it would affect how they behaved, so we are asking you not to share the information we just discussed.

Now that the study has been explained, do you agree to allow the investigator to use the data we collected from your participation in this study? All identifying and confidential information that would tie you to grades and test scores will be stripped from our records and destroyed within 2 weeks after sending this message. If you would like to abstain from participation in this study, please email Yosef Allam ([email protected]) as soon as possible. If you choose to abstain, all records pertaining to you and the data collected as a result of your individual participation will be destroyed as well, and the results studied, reported, analyzed, etc., will not reflect any data tied to you. If you do not contact the investigators to abstain, it will be assumed that you agree to participate.

I hope you enjoyed your experience and, if you had access to the instructional tool as a member of the treatment group, I hope you found it useful.

If you have any questions later please feel free to contact me.

Yosef Allam, Co-Investigator (contact person) 614-657-2254 [email protected]

Clark Mount-Campbell, Principal Investigator [email protected]

Thank you again for your participation.


The Ohio State University Consent to Participate in Research

Study Title: Enhancing the Spatial Visualization Skills of First-Year Engineering Students
Researcher: Clark Mount-Campbell

Sponsor:

This is a consent form for research participation. It contains important information about this study and what to expect if you decide to participate.

Your participation is voluntary. Please consider the information carefully. Feel free to ask questions before making your decision whether or not to participate. If you decide to participate, you will be asked to sign this form and will receive a copy of the form.

Purpose: You are being asked to participate in this study because we are investigating the effectiveness of an animated visualization tool that we hope will enhance your and future students’ visualization skills. We are gauging the tool’s effectiveness via the “Purdue Spatial Visualization Test – Rotations,” which you took at the commencement and conclusion of this academic quarter. We are also trying to gauge how your prior experiences in activities that other research has related to strong spatial visualization skills affect your initial spatial visualization performance, as well as how these experiences affect the gains you can make after using the visualization tool. Your experience levels in these areas were collected when you completed 10 five-point questions in Journal Entry 1 at the beginning of the academic quarter.

Procedures/Tasks: The activities you performed this quarter that were above and beyond standard ENG 181 tasks include three items: the 10 five-point experiential survey questions that were added to the existing version of Journal Entry 1, the animated visualization tool that half of you randomly received and which we are evaluating, and this consent form. Your participation involves granting permission to use, for research purposes, data that has already been collected to evaluate the visualization tool. We are seeking permission to use the data from the three items above, in addition to relevant performance metrics, including your grades that pertain to spatial visualization and the results of your Purdue Spatial Visualization Tests. Once these consent forms have been processed, the records of those providing consent will be kept for analysis purposes, but any identifying information attached to these records will be stripped.


Duration: You may leave the study at any time. If you decide to stop participating in the study, there will be no penalty to you, and you will not lose any benefits to which you are otherwise entitled. Your decision will not affect your future relationship with The Ohio State University.

Risks and Benefits: Risks to participants’ privacy and identity are minimal in that only one person will administer, collect, and analyze the data. This person will strip the data of identifying fields after these consent forms have been processed. Similarly, there is a minor, incrementally small risk of course performance information leaking out to unauthorized persons. Once again, all identifying information will be stripped upon receipt and processing of these consent forms.

You may have the satisfaction of knowing that your participation in this study may help future students at the university through continuous improvements to the First-Year Engineering Program. Research of this type has the underlying goal of improving educational practices for current and future students.

CONFIDENTIALITY:

Efforts will be made to keep your study-related information confidential. However, there may be circumstances where this information must be released. For example, personal information regarding your participation in this study may be disclosed if required by state law. Also, your records may be reviewed by the following groups (as applicable to the research):
• Office for Human Research Protections or other federal, state, or international regulatory agencies;
• The Ohio State University Institutional Review Board or Office of Responsible Research Practices;
• The sponsor, if any, or agency (including the Food and Drug Administration for FDA-regulated research) supporting the study.

Incentives: You will not be paid to participate in the study.

PARTICIPANT RIGHTS:

You may refuse to participate in this study without penalty or loss of benefits to which you are otherwise entitled. If you are a student or employee at Ohio State, your decision will not affect your grades or employment status.

If you choose to participate in the study, you may discontinue participation at any time without penalty or loss of benefits. By signing this form, you do not give up any personal legal rights you may have as a participant in this study.


An Institutional Review Board responsible for human subjects research at The Ohio State University reviewed this research project and found it to be acceptable, according to applicable state and federal regulations and University policies designed to protect the rights and welfare of participants in research.

CONTACTS AND QUESTIONS:

For questions, concerns, or complaints about the study you may contact Yosef Allam, [email protected].

For questions about your rights as a participant in this study or to discuss other study-related concerns or complaints with someone who is not part of the research team, you may contact Ms. Sandra Meadows in the Office of Responsible Research Practices at 1-800-678-6251.

If you are injured as a result of participating in this study or for questions about a study-related injury, you may contact Yosef Allam, [email protected].

SIGNING THE CONSENT FORM

I have read (or someone has read to me) this form and I am aware that I am being asked to participate in a research study. I have had the opportunity to ask questions and have had them answered to my satisfaction. I voluntarily agree to participate in this study.

I am not giving up any legal rights by signing this form. I will be given a copy of this form.

Printed name of subject Signature of subject SEAT #

AM/PM Date and time

Printed name of person authorized to consent for subject Signature of person authorized to consent for subject (when applicable) (when applicable)

AM/PM Relationship to the subject Date and time

Investigator/Research Staff

I have explained the research to the participant or his/her representative before requesting the signature(s) above. There are no blanks in this document. A copy of this form has been given to the participant or his/her representative.

Yosef Allam Printed name of person obtaining consent Signature of person obtaining consent

AM/PM Date and time


APPENDIX G: ENG 181 WINTER 2008 EXAM DRAWING PROBLEMS


ENG181 Fundamentals of Engineering Midterm

Problem 1 (15 points)

Using Autodesk Inventor 2008, create a part file that meets the specifications shown in the drawing immediately below. Once completed, print an isometric view of the object, including a text block with your name, seat number, instructor, and section as shown below. Print in landscape mode. Do not leave your seat to pick up your printout; it will be brought to you.

Your printout should look like this.

WI08

Name ______Professor______Section_____Seat No._____


ENG181 Fundamentals of Engineering Midterm

Problem 3 (25 points)

Directions: On the rectilinear grid make a set of orthographic drawings for the object shown in the isometric drawing. One unit length in the isometric drawing should correspond to one unit length on the rectilinear grid. The point A should be located as shown.


WI08

Name ______Professor______Section_____Seat No._____


ENG181 Fundamentals of Engineering Midterm

Problem 4 (25 points)

Directions: On the rectilinear grid make a set of orthographic drawings for the object shown in the isometric drawing. One unit length in the isometric drawing should correspond to one unit length on the rectilinear grid. The point A should be located as shown.


WI08

Name ______Professor______Section_____Seat No._____


ENG181 Fundamentals of Engineering Final

Problem 1 (25 points)

Directions: Create the following object using Autodesk Inventor Professional 2008. All dimensions are in inches. Note that the object is symmetric front to back and the hole on the right side is a blind hole .5 in deep. Then create a drawing with orthographic views using A-size paper and a title block. Include TOP-FRONT-RIGHT views and an isometric view in their standard locations. Do NOT include centerlines or dimensions. Edit the title block to include your name and information. Title the drawing “ENG181 Exam Object”. Call it Drawing Number 1. Don’t leave your seat; your printout will be returned to you. At the end of the exam, staple your printout to the front of this page. Save your work on the desktop!

Note: If you can't remember how to create a drawing with orthographic views, you may add your name, instructor’s name, seat number, and class time to a sketch plane and print an isometric view of the model from the Inventor window for a partial credit of up to 15 points.

WI08

Name ______Professor______Section_____Seat No._____


ENG181 Fundamentals of Engineering Final

Problem 2 (20 points)

Directions: Using Autodesk Inventor Professional 2008, open the file “N:ENG181_Final_Dim_Model_A.idw”. The drawing should look as follows. If you have trouble locating or opening the file, raise your hand. Add centerlines, fully dimension the drawing using Inventor 2008 (use two-place decimal inches), complete the title block, and print. Don’t leave your seat; your printout will be returned to you. At the end of the exam, staple your printout to the front of this page.

Don’t forget that the “Drawing Annotation Panel” can be accessed by left-clicking on the header for the “Drawing Views Panel”.

WI08

Name ______Professor______Section_____Seat No._____

ENG181 Fundamentals of Engineering Final

Problem 3 (20 points)

Directions: There are missing lines in the three orthographic views below. The missing lines may be visible, hidden, or centerlines. First, draw the isometric view in the space provided. Then draw the missing lines in the orthographic views. Drawing the isometric view is mandatory!


WI08

Name ______Professor______Section_____Seat No._____


ENG181 Fundamentals of Engineering Final

Problem 4 (15 points)

Directions: Given the top and the right side views, draw the front view as an offset section. Indicate the cutting plane on the top view. Note that the outline of the front view has been given for you.

WI08

Name ______Professor______Section_____Seat No._____


APPENDIX H: DATA PARSING MACRO


'
' MatchMakerMacro2
' Macro recorded by Yosef S. Allam
'
' Parses the raw Likert survey export on Sheet1 into one row per student on Sheet2,
' using lookup text stored in column A of Sheet3:
'   A1:A10  - identifying text for Likert questions 1-10
'   A11:A15 - the five response labels (Never, Rarely, Occasionally, Frequently, Always)
'   A16     - text marking a row of the raw export as a Likert response row
'
Sub MatchMakerMacro2()

    'Declaration of variables
    Dim ansbool As Integer  '0 or 1 for answer selection
    Dim ansvalue As Integer '1-5 for value of each of 5 possible Likert responses per question
    Dim quesnum As Integer  'Likert question number 1-10
    Dim row1 As Integer     'Current row number for Sheet1
    Dim row2 As Integer     'Current row number for Sheet2
    Dim column2 As Integer  'Current column number for Sheet2
    Dim q As Integer        'Loop index over questions 1-10
    Dim r As Integer        'Loop index over the five response labels

    Beep

    'Variable initialization with out-of-range values for error checking
    ansbool = 6
    ansvalue = 7
    quesnum = 8
    row1 = 0
    row2 = 0
    column2 = 1

    For row1 = 2 To 4500    'Approximate maximum number of rows in the raw export

        If Sheets(1).Cells(row1, 1).Value <> "" Then
            'Row contains a student name: increment the record (row) number for Sheet2
            'and copy the student name to Sheet2 in that row
            row2 = row2 + 1
            Sheets(2).Cells(row2, 1).Value = Sheets(1).Cells(row1, 1).Value

        'Determine whether this row is a Likert question response
        ElseIf StrComp(Sheets(1).Cells(row1, 3).Value, Sheets(3).Cells(16, 1).Value) = 0 Then

            'Column 10 holds a positive (1) Boolean indication that the Likert scale
            'item listed in this row was the selected response
            ansbool = Sheets(1).Cells(row1, 10).Value

            'Determine which Likert question number (1-10) this row belongs to
            For q = 1 To 10
                If StrComp(Sheets(1).Cells(row1, 8).Value, Sheets(3).Cells(q, 1).Value) = 0 Then
                    quesnum = q

                    'Determine whether the selected response is Never, Rarely, Occasionally,
                    'Frequently, or Always and assign the corresponding 1-5 value
                    For r = 1 To 5
                        If StrComp(Sheets(1).Cells(row1, 9).Value, Sheets(3).Cells(10 + r, 1).Value) = 0 And ansbool = 1 Then
                            ansvalue = r
                        End If
                    Next r

                    'Paste the response for Likert questions 1-10 next to the student name
                    'on the same row in Sheet2, in the corresponding column B-K, with
                    'possible values of 1-5
                    column2 = quesnum + 1
                    Sheets(2).Cells(row2, column2).Value = ansvalue

                    Exit For
                End If
            Next q

        End If

    Next row1

    Beep

End Sub
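
The macro keys on lookup text stored in column A of Sheet3: rows 1-10 hold identifying text for the ten Likert questions, rows 11-15 hold the five response labels, and row 16 holds the text that marks a raw-export row as a Likert response. As a minimal sketch of that layout, with placeholder strings standing in for the question identifiers and the marker text (the actual Carmen export text is not reproduced in this appendix), Sheet3 could be populated as follows:

' Illustrative only: one possible Sheet3 lookup layout implied by the cell references
' in MatchMakerMacro2 above. The question-identifier strings (A1:A10) and the
' response-row marker text (A16) are placeholders, not the actual Carmen export text.
Sub SetUpLookupSheetExample()
    Dim labels As Variant
    Dim i As Integer

    'A1:A10 - identifying text for Likert questions 1-10 (placeholder values)
    For i = 1 To 10
        Sheets(3).Cells(i, 1).Value = "Question " & i & " identifier text"
    Next i

    'A11:A15 - the five response labels, in ascending order of their coded values 1-5
    labels = Array("Never", "Rarely", "Occasionally", "Frequently", "Always")
    For i = 0 To 4
        Sheets(3).Cells(11 + i, 1).Value = labels(i)
    Next i

    'A16 - text identifying a row of the raw export as a Likert response row (placeholder)
    Sheets(3).Cells(16, 1).Value = "Likert response row marker text"
End Sub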


APPENDIX I: SQL SCRIPT


TRANSFORM First(Sheet2.courseID) AS FirstOfcourseID
SELECT Sheet2.[student lastname# or username],
       First(Sheet2.courseID) AS [Total Of courseID]
FROM Sheet2
GROUP BY Sheet2.[student lastname# or username]
PIVOT Sheet2.[content item name];

TRANSFORM Last(Sheet2.[LateVisited by student]) AS [LastOfLateVisited by student]
SELECT Sheet2.[student lastname# or username],
       Last(Sheet2.[LateVisited by student]) AS [Total Of LateVisited by student]
FROM Sheet2
GROUP BY Sheet2.[student lastname# or username]
PIVOT Sheet2.[content item name];

TRANSFORM Sum(Sheet2.[number of visits]) AS [SumOfnumber of visits]
SELECT Sheet2.[student lastname# or username],
       Count(Sheet2.[number of visits]) AS [Total Of number of visits]
FROM Sheet2
GROUP BY Sheet2.[student lastname# or username]
PIVOT Sheet2.[content item name];

TRANSFORM Sum(Sheet2.[total time]) AS [SumOftotal time]
SELECT Sheet2.[student lastname# or username],
       Sum(Sheet2.[total time]) AS [Total Of total time]
FROM Sheet2
GROUP BY Sheet2.[student lastname# or username]
PIVOT Sheet2.[content item name];
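
The four queries above are Microsoft Access crosstab (TRANSFORM ... PIVOT) queries run against the Carmen content-tracking export: each returns one row per student and one column per content item, with each cell holding the indicated aggregate (the course ID, the [LateVisited by student] field, the number of visits, or the total time). As a rough sketch of what the visit-count crosstab produces, rewritten with ordinary conditional aggregation and two hypothetical content-item names standing in for the columns that PIVOT generates automatically:

-- Illustrative only: approximate conditional-aggregation form of the visit-count
-- crosstab above. The content-item names 'Animation B15' and 'Animation B16' are
-- hypothetical placeholders; the crosstab query creates one such column for every
-- distinct content item found in the tracking export.
SELECT Sheet2.[student lastname# or username],
       Sum(IIf(Sheet2.[content item name] = 'Animation B15', Sheet2.[number of visits], 0)) AS [Animation B15 visits],
       Sum(IIf(Sheet2.[content item name] = 'Animation B16', Sheet2.[number of visits], 0)) AS [Animation B16 visits]
FROM Sheet2
GROUP BY Sheet2.[student lastname# or username];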
