Developing a computer-based interactive system that creates a sense of deep engagement

Renusha Mandara Vishruta Athugala

ORCID ID - https://orcid.org/0000-0002-9641-1140

Doctor of Philosophy

Submission date – September 2020

Research was carried out in the Faculty of Fine Arts and Music, University of Melbourne

This thesis is submitted in partial fulfilment of the requirements of the degree.

Abstract

This project develops a low-cost computer-based interactive system that generates a sense of deep engagement in the user, similar to that experienced when engaging with an artwork. It is based on the Buddhist concept of 'Sati': being relaxed, focused, aware, and paying attention to the present moment, which is understood here as having a sense of deep engagement. The Sati Interactive System integrates elements of an artwork: colour, sound, and body movement, to create this sense.

Research has shown that the independent use of colour, sound, and body movement can provide benefits such as generating positive emotions, achieving a relaxed state, and improving psychological and physical health. This project integrates colour, sound, and movement in a low-cost computer-based interactive system to create a sense of deep engagement. Literature on colour, sound, movement, computer-based interactivity, philosophies of technology, human-computer interaction, aspects of grounded theory, and the arts forms the basis and methodology for the development and testing of the system.

An evaluation of the system was based on data collected from 30 participants. Seventy-one percent of participants provided positive responses to using the system, indicating that the Sati Interactive System was effective in creating a relaxed, focused state in which participants did not feel time passing, both while using the system and afterwards. From this it was understood that a sense of 'Sati', or deep engagement, had been generated.

This project includes the application "Sati Interactive", found here, that can be downloaded and used on a computer with a Windows or Macintosh operating system. Max and QuickTime software should be installed before running the Sati Interactive application. Instructions and links can be found in the Instructions.pdf in the Sati Interactive folder. The computer should have a webcam, speakers, and a screen. The minimum system specifications are an Intel i5 or AMD Ryzen 3 2.7 GHz processor, 8 GB of RAM, and a 1.5 GB graphics card, to run on a laptop or desktop computer without audio-video lag.

Declaration

This thesis comprises only original work towards the Doctor of Philosophy, except where indicated in the preface. Due acknowledgement has been made in the text to all other material used. As approved by the Research Higher Degrees Committee, this thesis does not exceed the maximum word limit, exclusive of tables, images, bibliographies, and appendices.

Acknowledgement

I have received a great deal of support and assistance throughout the writing of this thesis.

I would first like to thank my supervisor Dr. Roger Alsop for all the guidance, support, and encouragement he gave me. His constant feedback and mental support helped me improve my thinking and the quality of my work.

I would like to thank the University of Melbourne for granting me the Melbourne Research Scholarship which immensely helped me to follow my dreams of completing my PhD.

I would also like to thank my second supervisor, Associate Professor Denny Oetomo, and Dr. Danny Butt for their valuable guidance and support in improving my work.

In addition, I would like to thank my parents for their support and for believing in me. I would like to thank my wife for always being there with me and supporting me to focus on completing my PhD. Finally, I would like to thank my friends for their understanding and support.

iv Table of contents Abstract ...... ii Declaration ...... iii Acknowledgement ...... iv Table of contents ...... v Table of Figures ...... x Table of Tables ...... xi Instructions ...... xii For Windows and MacOS ...... xii Chapter 1: Introduction ...... 1 The Idea ...... 1 Hypothesis ...... 1 The Question ...... 1 Defining an art-based approach ...... 1 Defining a low-cost system ...... 2 The gaps in current computer-based interactivity in the arts ...... 2 Potential Benefits ...... 2 Methodology ...... 3 Outline of the thesis ...... 3 Chapter 2: Literature Review ...... 4 Sati ...... 4 Comparing Sati and Mindfulness ...... 5 Why is it important/benefits? ...... 5 Colour, Sound, and Movement ...... 6 Colour ...... 6 What influences the connections between colour and emotion? ...... 7 Culture ...... 7 Emotional state ...... 7 Personality ...... 7 Creating a relaxed environment using colour...... 7 Sound ...... 9 Sound and Emotion ...... 9 Creating a relaxed environment using sounds ...... 9 Analysing sounds ...... 9 Comparison ...... 10 Selected Nature Sounds...... 13 Movement ...... 13 Using Tai Chi as a movement guide ...... 14

v Engagement through movement ...... 14 Using Colour, Sound, and Movement to Create a Sense of Deep Engagement ...... 14 Computer-based Interactivity ...... 15 Interpretation of Interactivity ...... 15 Concepts of Interactivity ...... 16 Concepts/Principles of interactivity within a computer-based context ...... 16 Interactive Technologies ...... 18 An introduction to Interactive systems...... 18 Fields of Computer-based Interactivity ...... 19 Computer-based Interactivity in Entertainment ...... 19 First Person Shooter Games (FPS) ...... 20 Augmented Reality (AR) application ...... 20 Virtual Reality in Entertainment ...... 20 Problems with VR...... 21 Computer-based Interactivity in Healthcare Applications...... 21 Computer-based interactivity in Surgical Procedures ...... 21 Computer-based interactivity in Pre-surgical planning, practice, and rehabilitation ...... 22 Computer-based interactivity in Alzheimer’s disease type dementia ...... 22 Computer-based interactivity in dementia treatment for the elderly ...... 23 Computer-based Interactivity in Educational Applications ...... 23 Computer-based interactivity in cognitive modifiability ...... 23 Computer-based interactivity in digital game-based learning ...... 24 Computer-based interactivity in Training applications...... 25 Computer-based interactivity in Psychology and behavioural experiments ...... 25 Limitations of the technology used in computer-based interactivity ...... 26 Limitations in Virtual Reality technology ...... 26 Limitations in CAVE system technology ...... 26 Human Computer Interaction (HCI) ...... 26 Usability and User Experience ...... 26 Usability ...... 26 Standards related to Usability ...... 27 Five E’s of Usability ...... 27 Moving from Usability to User Experience ...... 28 The Development of HCI ...... 29 First Wave of HCI ...... 29 Second Wave of HCI ...... 29 Third Wave of HCI ...... 29 User Experience ...... 29

vi Models of user experience ...... 31 Interaction Design Framework ...... 31 What influences experience? ...... 31 User-product interaction ...... 32 Types of experiences ...... 33 Product experience ...... 34 Philosophy of Technology for Human Computer Interaction ...... 35 Albert Borgmann’s Device Paradigm ...... 35 Focal things and Devices ...... 35 Focal things ...... 35 Devices ...... 36 The device paradigm...... 36 Three types of Information ...... 37 Don Ihde’s Non-neutrality of Technology ...... 37 Ihde’s human-technology relations ...... 38 Relevance to the project ...... 39 Similar Applications to the Sati Interactive System ...... 40 Conclusion ...... 42 Chapter 3: An Arts-based Approach: Sati Interactive System, Deep Engagement, and Transformative Experience ...... 45 Feelings and emotional changes generated by an artwork ...... 45 Art as Text ...... 46 Art as Static Image ...... 48 Art as Sound/Music ...... 50 Art as Movement ...... 52 Mirror Neurons ...... 54 Conclusion ...... 55 Chapter 4: Methodology ...... 56 Technology used in the Sati Interactive System ...... 56 Software ...... 56 Max ...... 56 Hardware ...... 56 Sati Interactive System setup at the University ...... 57 Defining a low-cost system ...... 57 Analysing the data gathered from the questionnaire ...... 58 Why use these coding methods? ...... 58 The three coding methods ...... 58 Open Coding ...... 58 Axial Coding ...... 58

vii Selective Coding ...... 59 Likert Scale ...... 59 Why use a Likert Scale?...... 60 Applying the three coding methods ...... 61 Open Coding ...... 61 Axial Code ...... 69 Selective Code...... 75 Data Coding – Finding the Core-category ...... 76 Ethics ...... 77 Conclusion ...... 77 Chapter 5: Development Process ...... 78 Creative process ...... 78 Applying the principles/concepts of computer-based interactivity ...... 78 Development of the Interaction Design Framework ...... 80 Factors that influence experience ...... 80 User-product interaction and experience of the Sati Interactive System ...... 80 The Interaction Design Framework for Sati Interactive System ...... 81 Using the Framework ...... 81 The Development of the System – Software and Hardware ...... 82 Sati Interactive System setup used at the University of Melbourne ...... 82 Hardware ...... 83 Software ...... 83 Sati Interactive System Test Setup ...... 83 Functionality of the Sati Interactive System ...... 85 User Interface Design ...... 86 How the Sati Interactive System may be used in other contexts ...... 88 Conclusion ...... 90 Chapter 6: Testing Process and the Results ...... 91 Testing the Sati Interactive System ...... 91 The system used for testing with participants ...... 91 Participants ...... 91 Task and Procedure ...... 91 Measures ...... 92 Results and Discussion ...... 92 With user image outline ...... 92 Responses per Likert question ...... 92 Responses per Participant ...... 93 No user image outline ...... 95

viii Responses per Likert question ...... 95 Responses per Participant ...... 96 Interpretation of the data ...... 98 Comparison of the two analysis ...... 98 An overall analysis of data gathered from all the participants ...... 100 Responses per Participant ...... 100 Responses per Question ...... 101 Additional comments provided by the participants ...... 101 Summary of the questionnaire responses ...... 102 Finding the Core category through Selective coding ...... 102 Conclusion ...... 102 Chapter 7: Conclusion ...... 104 Project Summary ...... 104 Responding to the Hypothesis ...... 104 Responding to the Question ...... 104 Defining a low-cost system ...... 104 New knowledge brought to current computer-based interactivity in the arts ...... 104 Development Framework and Philosophies ...... 105 Methodology ...... 106 Sati Interactive System as an Artwork, or Not ...... 106 Further developments for the future of the Sati Interactive System ...... 106 Conclusion ...... 106 Bibliography ...... 108 Appendix ...... 129 Appendix 1 ...... 129 Appendix 2 ...... 131 Appendix 3 ...... 132 Appendix 4 ...... 133 Appendix 5 ...... 138 Appendix 6 ...... 139 Appendix 7 ...... 140 Appendix 8 ...... 141 Appendix 9 ...... 142 Appendix 10 ...... 143 Appendix 11 ...... 145

ix Table of Figures Figure 1 : Colours that will be used in the Sati Interactive System...... 8 Figure 2 : 8-hour YouTube sound clip - includes water, wind, and bird sounds...... 11 Figure 3 : 8-hour YouTube sound clip - only includes bird sounds...... 11 Figure 4 : 5-minute sound clip - includes water and bird sounds...... 12 Figure 5 : 5-minute sound clip - includes only the bird sounds...... 12 Figure 6 : Sense of deep engagement – transformative experience ...... 15 Figure 7 : Construct3D (Kaufmann and Schmalstieg, 2003) ...... 20 Figure 8 : The VIVIAN (Virtual Intracranial Visualization and Navigation) (Kockro et al., 2000) ...... 22 Figure 9 : The VENSTER Project - interactive art installation bringing the outside world into the nursing home (Luyten et al., 2017) ...... 23 Figure 10 : Demonstration of the 3D multimodal interaction prototype (Alves Fernandes et al., 2016) ...... 24 Figure 11 : Cave-like system - virtual bar scenario (Rovira et al., 2013) ...... 25 Figure 12 : Influences on experience (Forlizzi and Ford, 2000, p. 420) ...... 32 Figure 13 : Framework of product experience by Desmet and Hekkert (2007, p. 4) ...... 34 Figure 14 : Sati Interactive System conceptual development process ...... 43 Figure 15: Archaic Torso of Apollo Statue (Bernstein, 2019) ...... 47 Figure 16: Rothko’s paining – untitled 1968 (2020f) ...... 49 Figure 19 : The first 10 seconds of Sweet Child O’ Mine by Guns N’ ...... 51 Figure 17: Image of a sine wave at 440Hz over 100msecs ...... 51 Figure 18 : First 100msecs of Sweet Child O’ Mine by Guns N’ Roses ...... 51 Figure 20 : Vincent van Gogh’s The Starry Night painting (2020h) ...... 52 Figure 21 : The Rite of Spring dance performed by Pina Bausch in 1978 (Példatár a távoktatáshoz-MTE, 2020) 53 Figure 22 : Images of Pina Bausch’s dance Barbe Bleue (Demidiv, 2017) ...... 53 Figure 23 : Analysis Process used here ...... 59 Figure 24 : “Healthy Eating” 5 question, 5 alternative responses Likert Scale (Boone and Boone, 2012, p. 2) ... 60 Figure 25 : User and product factors that influence user experience ...... 80 Figure 26 : Interaction design framework for Sati Interactive System ...... 81 Figure 27 : Sati Interactive System ...... 83 Figure 28 : A participant using the Sati Interactive System at the University of Melbourne ...... 84 Figure 29 : The Tai Chi Master – Image of the right is without colours, on the left is with colours ...... 84 Figure 30 : Functionality of the Sati Interactive System ...... 85 Figure 31 : (a) not synced (b) when synced ...... 85 Figure 32 : User interface of the Sati Interactive System ...... 86 Figure 33 : Help statement ...... 87 Figure 34 : A screenshot of the Sati Interactive System designed using Max ...... 88 Figure 35 : Where to drag and drop your movie file ...... 89 Figure 36 : Images of the dance video ...... 89 Figure 37 : (1) no user image outline (2) with user image outline...... 91 Figure 38 : Chart – Responses per question (with user image outline) ...... 93 Figure 39 : Chart – Responses per participant (With user image outline) ...... 95 Figure 40 : Chart – Responses per question (No user image outline) ...... 96 Figure 41 : Chart – Responses per participant (No user image outline) ...... 98 Figure 42 : Individual responses per participant ...... 100 Figure 43 : Responses per Question ...... 101

x Table of Tables Table 1 : Colour emotion relationship from the research conducted by Naz and Epps (2004) ...... 7 Table 2 : Munsell notations for colour profiles...... 8 Table 3 : Munsell to RGB conversion values...... 8 Table 4 : Colour and its association with emotions ...... 8 Table 5 : Nature sounds for the Sati Interactive System ...... 13 Table 6 : Summary of the principles of computer-based Interactivity ...... 17 Table 7 : Classification of Virtual Reality Systems (Biocca and Delaney, 1995) ...... 19 Table 8 : Summery of a framework of user-product interaction (Forlizzi and Battarbee, 2004, p. 263)...... 33 Table 9 : User experience developed by Forlizzi and Battarbee (2004, p. 263) ...... 33 Table 10 : Similar relaxation and mindfulness applications from Steam, Google Play, and App Store...... 41 Table 11 : Tested system specifications ...... 57 Table 12 : General Open Code stage – Keywords/Focus words and the General Open Codes...... 62 Table 13 : Open Code stage – further categorisation of Open Code using alternative answers...... 69 Table 14 : Axial Code stage categorisation ...... 75 Table 15 : Selective Code stage ...... 76 Table 16 : Example of data coding using one question...... 76 Table 17 : Example of data coding using 10 participants...... 77 Table 18 : The use of the principles/concepts of computer-based interactivity in the Sati Interactive System. . 79 Table 19 : Using the Interaction Design Framework ...... 82 Table 20 : Features added to the main menu of the Sati Interactive System ...... 87 Table 21 : Data collected from the Sati Interactive System with the user image outline ...... 93 Table 22 : Responses per participant – with user image outline ...... 94 Table 23 : Data collected from the Sati Interactive System without the user image outline ...... 96 Table 24 : Responses per participant – no outline ...... 97 Table 25 : Summary of the analysis ...... 99 Table 26 : Total positive, negative, and neutral responses ...... 102 Table 27 : Selective coding process ...... 102

Instructions

For Windows and MacOS

1. Read the thesis – "ATHUGALA – Thesis".
2. If you do not have Max and QuickTime software, download and install the required software shown below:
   a. Max - https://cycling74.com/downloads
   b. QuickTime - https://support.apple.com/downloads/quicktime
3. The Sati Interactive files are in the Supplementary folder – Supplementary Files – ATHUGALA.
4. Download the Supplementary Files – ATHUGALA folder to your computer and open the folder.
5. The Supplementary Files – ATHUGALA folder consists of the following:
   a. Instructions.pdf
   b. Sati Interactive Folder
   c. Sati Interactive Prototype App builds Folder
   d. Sati Interactive Test Run.mp4
6. Open the Sati Interactive folder.
7. To use the program with the outline of your body, open Sati Interactive.mxf
8. To use the program without the outline of your body, open Sati Interactive (no outline).mxf

After opening the Sati Interactive application:

9. Make sure you have 2 metres of space to move back from the camera.
10. Select the camera that you want to use from the "select the camera" drop-down list.
11. Make sure your speakers are connected and the volume level is turned up so that you can hear nature sounds: birds, wind, and water.
12. As instructed in the Sati Interactive application, take a few steps back until the outline of your body fits the image displayed on the screen, and remember that position in the room.
13. If you want to view in full screen, press 'Esc' on the keyboard for full-screen mode.
14. Press 'Play' on the application, or Spacebar on the keyboard, when you are ready and move back to that position in the room.
15. Try to follow the actions of the image on the screen.
16. Enjoy.

I have also built prototype apps for MacOS and Windows; these apps can be found in the Sati Interactive Prototype App Builds folder. However, these prototype apps may not work on all Mac and Windows computers. For this reason, I have included the Max Collective Build mentioned above, which requires the Max software to run the Sati Interactive application.

Prototype App build for MacOS

Built and tested on macOS Catalina 10.15.6, with a 2 GHz i7 processor, 8 GB of memory, and an Intel Iris Pro 1.5 GB graphics card. This app can be found in the Sati Interactive for Mac folder.

Prototype App build for Windows

Built and tested on Windows 10, with a 2.7 GHz i7 processor, 16 GB of memory, and an Nvidia GTX 1050 4 GB graphics card. This app can be found in the Sati Interactive for Windows folder.

Chapter 1: Introduction

The Idea

This research explores the process of developing a computer-based interactive system that creates a sense of deep engagement, and therefore a transformative experience, through taking an arts-based approach. Here, a sense of deep engagement is considered as experiencing being relaxed, focused, and not feeling the passing of time; Goodall (2008, p. 159), referencing John Cage, describes this as the "vanishing point of time and space". The expression 'sense of deep engagement' is derived from the Buddhist concept of 'Sati': being aware of and paying attention to the present moment.

The computer-based interactive system is called the Sati Interactive System. I am attempting to create a transformative experience similar to the one a person may have when engaging with an artwork, without creating an artwork in the traditional sense of the word. To do this, the Sati Interactive System integrates colour, sound, and movement, the main elements of an artwork. This may seem an oversimplification; however, my contention here is that one or more of these elements is required in any or every artwork. It is in and from the constructing or viewing of these elements that a philosophy or theory might develop. Here, Tai Chi-based movement, varying colours, and natural sounds are used in the Sati Interactive System, and it is possible to use different movements, sounds, and colours selected by the user. With further development, this system may be used in areas such as entertainment, art, dance, education, and therapy.

The topics discussed in the development of Sati Interactive System are wide ranging, including:

• colour and emotion
• sound and emotion
• body movement techniques and relaxation
• computer-based interactive technologies and approaches.

Hypothesis

Engaging with an artwork can create a sense of deep engagement in the viewer. This research hypothesises that computer-based interactive technology developed using an arts-based approach can create a sense of deep engagement.

It is well known that a sense of deep engagement can be enhanced using computer-based interactive technology (Schoenau-Fog, 2011, Bontchev and Vassileva, 2016, Brockmyer et al., 2009, Liu, 2005, Buchanan, 2006, Hoffman and Nadelson, 2010, Gunn et al., 2020, Hart, 2011, McMorran, 2007). However, there is limited discussion regarding using an arts-based approach in developing low-cost computer-based interactive systems designed to create a sense of deep engagement.

The Question

How can colour, sound, and movement, combined with computer-based interactivity, be used to create a sense of deep engagement? The hypothesis and question will be tested and responded to through the development and implementation of the Sati Interactive System. A questionnaire is used to elicit and then analyse responses to the Sati Interactive System by 30 users.

Defining an art-based approach

An artwork is often made by the artist to express an idea or experience and is intended to invoke that idea or experience in the audience1 (Tolstoy, 1904). Often this idea or experience is not easily defined or put into words. For example, the colours and brush strokes of the painting 'The Starry Night' by van Gogh (1889), and his other works, are perhaps best described by the actor Bill Nighy in the Doctor Who episode Vincent and the Doctor (Curtis, 2010). Here Bill Nighy's character says, "to me Van Gogh is the finest painter of them all. Certainly, the most popular, great painter of all time. The most beloved, his command of colour most magnificent. He transformed the pain of his tormented life into ecstatic beauty. Pain is easy to portray, but to

1 This audience may be the artist themselves.

use your passion and pain to portray the ecstasy and joy and magnificence of our world, no one had ever done it before. Perhaps no one ever will again. To my mind, that strange, wild man who roamed the fields of Provence was not only the world's greatest artist, but also one of the greatest men who ever lived". When I engage with the painting, my response is not as rapturous as the one described by the actor; it creates a relaxed and focused state in me. Slash's guitar solo in 'Sweet Child O' Mine' (GunsN'Roses, 1987) also deeply engages me: it brings back memories, gives me energy, and makes me feel alive. Pina Bausch's Le Sacre du Printemps (Bausch, 1978) is also full of energy, creating senses of tension, relaxation, suspense, and intrigue, and experiencing these feelings deeply engages me. These artworks express something more than just the object or event, and in doing so create a sense of deep engagement in their audience.

The Sati Interactive System is designed to invoke a similar experience of engagement without the creation of an artwork per se.

Defining a low-cost system

Here a low-cost system is considered to be one costing between AU$800 and AU$1000, with the user providing their own screen and sound system, for example a flat-screen TV, and a webcam. Please see Defining a low-cost system on page 57 for more information.

The gaps in current computer-based interactivity in the arts

I have identified four gaps in computer-based interactivity in the arts; these are:

1. that there are few low-cost computer-based interactive systems
2. that there are few computer-based interactive systems that are explicitly designed to create a sense of deep engagement
3. that an arts-based approach to the design of such a system has not been explicitly explored
4. that interactive systems are often designed with one intention or outcome, and not to be flexible in their uses.

The creation of the Sati Interactive System intends to examine and address these gaps.

There are interactive systems built for presenting an artwork in a gallery, such as 'Water Light Graffiti' by Fourneau (2019), and interactive art that plays with perception, such as the work shown at the Museum of Contemporary Art Zagreb by Kogler (2019). There are also computer-based interactive performances that integrate colour, sound, and movement, such as Syngenta by Yeroheen (2017) and Pop Up Garden by Compagnia TPO (2016).

These systems are expensive and not easy to use. The Sati Interactive System, which can be used in gallery, therapeutic, and personal spaces, intends to address this problem.

Developing a computer-based interactive system that creates a sense of deep engagement through an HCI and arts-based perspective will add new knowledge to the field of computer-based interactivity in the arts.

Potential Benefits

Potential benefits to users/participants may include:

• Achieving a sense of deep engagement.
• A relaxed state of mind and body.
• The ability to focus the mind towards a single objective without disruptive thoughts.
• The ability to deal with stressful situations.
• The ability to control emotions.

Potential social benefits may include:

• Developing and refining an interactive system that promotes a sense of deep engagement.
• Developing understanding of the potential benefits of integrating sound, vision, and movement in a computer-based interactive system.

• A process for introducing the perceived benefits of meditation without the user specifically engaging in traditional meditation practices.
• An interactive system that can be used in a variety of fields such as education, entertainment, training, and healthcare.
• Sati Interactive can be used in: art making; galleries, performance venues, and museums; and therapeutic environments.

Methodology

I am using the three coding methods used in Grounded Theory: Open Coding, Axial Coding, and Selective Coding (Corbin and Strauss, 1990); this is discussed in greater detail in Chapter 3.

The theoretical approach I am taking is to develop a system based on gathering information and literature regarding computer-based interactive technologies, then test the result with participants, using the three coding methods listed to assess the efficacy of the system.

Outline of the thesis

Chapter 1 provides an introduction to the thesis and its chapters. This includes: an explanation of the thesis, the hypothesis, the question, the gaps, the expected benefits, the methodology, and the outline of the chapters.

Chapter 2 contains the literature review in areas such as: Sati, Colour-Sound-Movement technique, Computer-based Interactivity, Interactive Technologies, Human Computer Interaction (HCI), Philosophy of Technology for HCI, and Technology.

Chapter 3 discusses an Arts-based Approach: Sati Interactive System, Deep Engagement, and Transformative Experience.

Chapter 4 consists of the methodology for testing the system which includes: Three Coding Methods, Analysis Procedure, Applying the methodology, Data coding, and Ethics.

Chapter 5 discusses the development process of the Sati Interactive System. This includes: the development of the interaction design framework, using the framework, development of the Sati Interactive System, functionality of the system, interface design, anticipated user reactions, and uses in other contexts.

Chapter 6 discusses the testing process of the Sati Interactive System. This includes: information on how the test was conducted, results gathered from the questionnaires, raw data, average, positive, negative, and neutral responses according to Likert scale, establishing the Core-category of the methodology, suggested further developments to the system, and the conclusion of the analysis.

Chapter 7 concludes the thesis, reviewing the work done, addressing the hypothesis, and considering potential future developments.

Chapter 2: Literature Review

This chapter forms the foundation for the development of the Sati Interactive System. The areas discussed are: Sati, colour-sound-movement, computer-based interactivity, interactive technologies, human computer interaction, philosophy of technology, and the technologies used in the Sati Interactive System. It is primarily made up of definitions within these fields, and these definitions are first used and then tested through the Sati Interactive System.

These topics are broad; however, I am focusing directly on writing, practice, and other relevant research into these topics that are related to my hypothesis and question.

Sati

Sati is the state of being aware of and paying attention to the present moment (Thera, 2005, p.32, Gunaratana, 2010, p.38, Anālayo, 2003, p.47, Rhys Davids, 1910, p.323). The word is from the ancient Indian language Pali, in which Gautama Buddha preached over 2500 years ago, and it is at the core of Buddhist philosophy and practice (Germer, 2004, Childers, 1875). According to Childers' (1875, p.466) dictionary of the Pali language, Sati is "Recollection: active state of mind, fixing the mind sternly upon any subject, attention, attentiveness, thought, reflection, consciousness". Rhys Davids (1910, p.323), founder of the Pali Text Society, expands on this, saying

“Etymologically Sati is Memory. But as happened at the rise of Buddhism to so many other expressions in common use, a new connotation was then attached to the word, a connotation that gave a new meaning to it, and renders ‘memory’ a most inadequate and misleading translation. It became the memory, recollection, calling-to-mind, being-aware-of, certain specified facts. Of these the most important was the impermanence (the coming to be as the result of a cause, and the passing away again) of all phenomena, bodily and mental. And it included the repeated application of this awareness, to each experience of life, from the ethical point of view.”

Ven. Gunaratana (2010, p.38) gives nine definitions of Sati, translated into the current English term, mindfulness:

• mirror thought
• non-judgemental observation
• impartial watchfulness
• nonconceptual awareness
• present time awareness
• non-egoistic alertness
• goal-less awareness
• awareness of change
• participatory observation.

To summarise, Sati is being aware of the present moment without any judgemental thoughts or memories. In other words, paying attention without involving thoughts, concepts, ideas, opinions, or memories.

The contemporary expression ‘mindfulness’ emanates from the word, and Buddhist practice of, Sati (Bodhi, 2011, p.20, Dreyfus, 2011, p.44, Grossman and Van Dam, 2011, p.220, Purser and Milillo, 2015, p.2). However, ‘mindfulness’ as it is used in contemporary western therapies, has so many interpretations that its meaning is distant from its original Buddhist roots. For example, using mindfulness to improve sex life (Brotto, 2011, Paterson et al., 2017, Brotto, 2013), or as something that can be achieved after taking an eight week course (Rosch, 2015). Ven. Gunaratana (2010) describes mindfulness and Sati as synonymous, however, due to the western appropriations of the word given above, I am using Sati to create a distinction from those appropriations. Here, I am referring to the Buddhist meaning of Sati2, but without the religious connotations or philosophy.

2 The word sati has other definitions as well. For instance, sati or suttee was a Hindu Indian ritual in which a widow immolated herself on her husband's funeral pyre, or otherwise died shortly after his death (Hawley, 1994, Yang, 1989). Sati is also the name of a Hindu goddess in ancient Hinduism (Kinsley, 1988, Harman, 1992).

Comparing Sati and Mindfulness

Historically, Sati is referred to as a practice which is used in meditation (Bodhi, 2011, Gunaratana, 2010, Bhikkhu et al., 1995, Thera, 1998, Thera, 2005, Anālayo, 2003, Grossman and Van Dam, 2011). According to Ven. Gunaratana (2010), Sati is the basis of Vipassana Meditation, a Buddhist meditation technique. It is used in meditation to understand and focus on the present moment by concentrating entirely on a single thought. As an example, in Anapanasati meditation, the entire concentration is on the breath, breathing in and out (Nanamoli, 1964).

Mindfulness practice is popular in therapy, including: mindfulness training as a clinical intervention (Baer, 2003); medical and psychological symptoms and well-being (Carmody and Baer, 2008, Brown and Ryan, 2003); regional brain grey matter density (Hölzel et al., 2011); psychological health (Keng et al., 2011); and stress reduction on medical and premedical students (Shapiro et al., 1998).

Research findings indicate that mindfulness practice can:

• decrease mood disturbance and stress (Brown and Ryan, 2003)
• increase mindfulness and well-being (Carmody and Baer, 2008, Keng et al., 2011)
• decrease stress symptoms (Baer, 2003, Carmody and Baer, 2008)
• reduce self-reported state and trait anxiety (Shapiro et al., 1998)
• reduce overall psychological distress, including depression (Shapiro et al., 1998)
• increase scores on overall empathy levels (Shapiro et al., 1998)
• increase regional brain grey matter density, which affects learning and memory processes, emotion regulation, self-referential processing, and perspective taking (Hölzel et al., 2011)
• reduce psychological symptoms and emotional reactivity (Keng et al., 2011)
• improve behavioural regulation (Keng et al., 2011)

However, here Sati is not considered as overtly therapeutic, and this is the significant distinction between Sati and Mindfulness.

Here sound, vision, and guided physical actions, integrated through the low-cost system, are used to guide the user to focus their attention to achieve a sense of deep engagement.

Why is it important/benefits?

The anticipated benefits of creating this interactive technology-based system include:

• a process for introducing the perceived benefits of meditation (Horowitz, 2010, Monk-Turner, 2003, Zeidan et al., 2010, Campos et al., 2016) without the user specifically engaging in traditional meditation practices. It is known that sections of some societies are averse to traditional meditation (Bhuvanesh, 2013, Eifring, 2013);
• it may be used in other contexts such as:
  o Museums, where it can be used as a device to display and interact with historical objects and environments (Ciolfi and Bannon, 2002, Geller, 2006, Hornecker, 2008, Hornecker and Stifter, 2006).
  o Education, where it can be used as an immersive display that students can interact with and receive real-time feedback from (Dede, 2009, de Freitas, 2006, Dunleavy and Dede, 2014).
  o Entertainment, where it can be used as an interactive system that can detect body movements and create visual elements, or work as an interactive gaming system (Jacobson et al., 2005, Höysniemi et al., 2004, Roussou, 1999, Sra and Schmandt, 2015, Zyda, 2005).
  o Training, where it can be used as an interactive training ground for simulated training (Cook et al., 2012, Katz et al., 2005, Reznek et al., 2002) and military training (Zook et al., 2012, Nieborg, 2004, Lenoir, 2000).
  o Aged and other care environments, where it can be used as a device to improve health (Shneiderman et al., 2013, McCloy and Stone, 2001, Pouke and Häkkilä, 2013, Moyle et al., 2018, Siriaraya and Ang, 2014).

The anticipated benefits of Sati, focusing on the present moment, include:

• The ability to relax/calm yourself by avoiding distracting thoughts or problems (Gunaratana, 2010).
• The ability to focus the mind towards a single objective without disruptive thoughts (Gunaratana, 2010).
• The ability to deal with stressful situations (Grossman et al., 2004).
• The ability to control emotions (Davis and Hayes, 2011).

Colour, Sound, and Movement

This section looks at how colour, sound, and movement are used in the Sati Interactive System. Colour and sound are used to generate positive emotions in the user. Movement is used as a guide and to engage the user in an activity.

Colour

This section looks at how colour is used as a visual stimulus and the connection between colour and human emotions. It focuses on how colour can be used as a visual stimulus that helps capture the user's attention.

Research shows colour is strongly connected with emotions (Hemphill, 1996, Lang, 1987, Mahnke, 1996, Clark, 1978). For example, red emphasizes excitement, green shows a state of calm/tenderness/comfort, yellow highlights happiness/playfulness/excitement, and blue has been perceived as sad/dull (Odbert et al., 1942); but this connection is complex.

It is thought that one colour can trigger more than one emotion and some emotions can be triggered by a variety of colours (Linton and Linton, 1999, Saito, 1996). Naz and Epps (2004) studied the emotional response to a variation of colour hues, finding, for example, that red has both positive and negative effects. Naz and Epps (2004, p. 399) say, "Red was seen to be positive because it was associated with love and romance, while the negative aspects of red included having associations with fight and blood as well as Satan and evil". Red has been associated with positive emotions such as warmth, energy, and passion, and negative emotions such as fury, anger, blood, intensity, and violence (Naz and Helen, 2004, Naz and Epps, 2004). Green expresses positive emotions such as calm, quietness, refreshment, and peace, and it also expresses negative emotions such as guilt and exhaustion (Davey, 1998, Mahnke, 1996, Saito, 1996). Blue is responsible for positive feelings such as calmness, relaxation, happiness, peace, and comfort, as well as negative feelings such as sadness and depression; the stated reason is that blue is reminiscent of the night sky and darkness, which stimulates sadness. The observations of Kingston et al. (2007) and Naz and Epps (2004) are somewhat similar to the findings of Saito (1996). Saito's study shows the positive and negative feelings associated with the colour white: white is associated with feelings such as purity, cleanliness, harmony, clarity, beauty, refreshment, gentleness, and naturalness, but it also triggers negative feelings such as emptiness, void, loneliness, boredom, and death. However, people of different cultures, personalities, and countries can have different responses to the same colour (Choungourian, 1968, Saito, 1996). For example, the response to the colour red in China is different to the response in America.

Table 1 below shows a summary of the colour-emotion relationship from the research conducted by Naz and Epps (2004), involving a total of 98 participants.

Category             Colour         Positive Emotion   Negative Emotion   No Emotion
Principal Hues       Red            63                 32                 3
                     Yellow         92                 6                  0
                     Green          94                 0                  4
                     Blue           78                 17                 3
                     Purple         63                 32                 3
Intermediate Hues    Yellow-Red     73                 18                 7
                     Green-Yellow   24                 70                 4
                     Blue-Green     80                 15                 3
                     Purple-Blue    64                 30                 4
                     Red-Purple     75                 15                 8
Achromatic Colours   White          60                 36                 2
                     Gray           7                  88                 3
                     Black          19                 77                 2

Table 1 : Colour emotion relationship from the research conducted by Naz and Epps (2004)

What influences the connections between colour and emotion?

This section looks at the elements which can influence the response to colour, and what may cause people to respond to certain colours differently.

Culture

The study conducted by Ou et al. (2004) on colour emotion and colour preference shows that people with different cultural and ethnic backgrounds respond to colours differently. The data indicated that British and Chinese observers respond differently to similar colours. Another cognate study, conducted in the UK, Taiwan, France, Germany, Spain, Sweden, Argentina, and Iran by Ou et al. (2012) to identify the influence of colour on emotion, shows that cultural and ethnic background influences the relationship between colour and emotion, and that gender, professional background, and age also influence responses to colour. David McCandless (2012, p. 7) has provided an excellent infographic on the responses to colour in different cultures. I have included this infographic in Appendix 11 on page 145.

Emotional state

According to Birren (1961), colour can affect human emotions differently depending on each individual's emotional state. For instance, a person with a stable emotional state has an open response to colour, whilst an emotionally unstable person might avoid any direct response. Studies also show that children who have experienced dangerous situations tend to use black and red more in their paintings (Cotton, 1985, Gregorian et al., 1996). Wadeson's (1971) research shows that there is a noticeable difference between the amount of colour used in the paintings of patients with depression when compared to other patients.

Personality

The emotional response to colour can also change according to different types of personalities (Birren, 1961, Birren, 2016). People who are extroverted, for example, show interest in warm colours such as orange and red. Cool colours such as blue and green are liked by introverts, and extroverts find them less stimulating. Reserved people have displayed distress towards warmer colours, and are more sensitive towards colours than sociable people (Mahnke, 1996).

Creating a relaxed environment using colour

This project will be using the Munsell colour system (Munsell Color, 2019), referenced from D'Andrade and Egan (1974), Naz and Epps (2004), and Valdez and Mehrabian (1994). The Munsell colour system was chosen to create the colour profiles for this project because of its unique colour identification structure using hue, value (brightness), and chroma (saturation), and because it is used in major colour-emotion research projects (Ballast, 2002, Valdez and Mehrabian, 1994). The Principal Hues used in this project are displayed in Table 2 below. These colour profiles are referenced from Naz and Epps' (2004) research.

Colour    Hue     Value/Chroma
Blue      10B     6/10
Yellow    7.5Y    9/10
Green     2.5G    5/10
Purple    5P      5/10

Table 2 : Munsell notations for colour profiles.

The Munsell colour notations are converted into the RGB colour scheme in order to be used in the Sati Interactive System. Here, the Munsell to RGB conversion is done using four colour conversion websites: Munsell Colour Palette (Geng, 2019), e-paint (e-paint, 2019), procato (Bang, 2019), and Virtual Munsell Colour Wheel (Werth, 2019). The RGB values are chosen after comparing values from the four websites. Displayed below in Table 3 are the Munsell to RGB colour samples created using the above Munsell notations.

Colour    Munsell notation    RGB value
Blue      10B 6/10            5, 158, 212
Yellow    7.5Y 9/10           248, 230, 77
Green     2.5G 5/10           18, 142, 78
Purple    5P 5/10             149, 103, 172

Table 3 : Munsell to RGB conversion values.

Below displayed in Figure 1 are the RGB colours after the conversion. These colours are developed using Adobe Photoshop (Adobe, 2019).

Figure 1 : Colours that will be used in the Sati Interactive System.
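For readers who wish to reuse the palette outside of Max, the sketch below is a minimal, illustrative Python snippet (not part of the Sati Interactive System itself) that defines the four converted RGB values from Table 3 and prints them alongside their hexadecimal equivalents; the dictionary keys and helper function name are hypothetical conveniences.

# Illustrative only: the RGB values are taken from Table 3;
# the names and helper function are hypothetical.
PALETTE = {
    "Blue (10B 6/10)":    (5, 158, 212),
    "Yellow (7.5Y 9/10)": (248, 230, 77),
    "Green (2.5G 5/10)":  (18, 142, 78),
    "Purple (5P 5/10)":   (149, 103, 172),
}

def to_hex(rgb):
    """Convert an (R, G, B) tuple to a hex colour string for on-screen use."""
    return "#{:02X}{:02X}{:02X}".format(*rgb)

for name, rgb in PALETTE.items():
    print(f"{name}: RGB {rgb} -> {to_hex(rgb)}")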

In conclusion, colours affect human emotions. The colour-emotion connection is complicated, as other elements such as culture, emotional state, gender, age, and personality influence it. Research also shows that, in addition to the principal hues, intermediate hues and achromatic colours affect emotions differently (Naz and Epps, 2004). However, this project will only use the Principal Hues, because Achromatic Colours tend to create more negative emotions and Intermediate Hues can create a variety of emotions from positive to negative. To conclude, this project will use Blue, Yellow, Green, and Purple, colours selected for their most positive emotional associations.

Table 4 below shows a summary of the above discussion on colour and emotion.

Colour    Positive Emotions                                                                          Negative Emotions
Red       Excitement, warm, solid, energetic, passionate                                             Fury, anger, bloody, intense, violent
Blue      Comfort, security, calmness, relaxed, happiness, peace                                     Sadness and depression
Green     Calm, tender, comfort, natural, quietness, refreshing, peaceful, hope, prosperous, stable  Guilt, exhaustion, envy, greed, suspicion
Yellow    Happy, cheerful, warmth, laughter                                                          Intensity, frustration, anger, attention, hunger
Purple    Relaxed, calm, happiness, power, excitement, comfort                                       Sadness, tiredness, fear, boredom
White     Hope, peaceful, innocence                                                                  Loneliness, boredom
Black     Wealth, richness, power                                                                    Depression, fear, anger

Table 4 : Colour and its association with emotions

Sound

Sound and Emotion

This section discusses the relationship between sound and emotions. Sound has been used in psychological and physical health as a relaxation tool, which has proven to be beneficial in many ways (del Olmo and Cudeiro, 2005, Iyer et al., 2011, Rose and Weis, 2008, Salamon et al., 2003). However, here the focus is on how sound can be used as a stimulus to create a relaxed and calm environment.

It is believed that music/sound affects emotions (Rose and Weis, 2008, Aldridge, 1993, Mullooly et al., 1988, Snyder and Chlan, 1999, Tusek et al., 1997). Rose and Weis's (2008) study on sound meditation and the effects of receptive music therapy using the monochord (a musical instrument with one string) shows that the nervousness of patients was substantially reduced, and 75.6% of the patients felt relaxed and more balanced.

Salamon et al.'s (2003) and Fessenden and Schacht's (1998) research also shows that music can reduce feelings such as anxiety and stress, and that it helps generate nitric oxide molecules, which enhance physiological and psychological relaxation and help develop the auditory system. Krumhansl (1997) says that soothing music has a positive effect on the brain, and that feelings such as motivation may be enhanced.

Guetin et al.'s (2009) study on the effect of music on anxiety and depression in patients with Alzheimer's type dementia shows the positive effects of music therapy on reducing anxiety and depression. For example, Guetin et al.'s new technique of varying tempos allowed the patients to recall factual memories and images from 4 to 16 weeks.

Research has shown that nature sounds and environmental noise, such as tweeting birds, road traffic, and building sites, also affect emotions and provide psycho-physiological benefits (Alvarsson et al., 2010, Ratcliffe et al., 2013, Benfield et al., 2014, Annerstedt et al., 2013, Cutshall et al., 2011). Alvarsson et al.'s (2010) research, conducted with 40 participants, suggests that exposure to nature sounds helps people recover faster from psychological stress. Ratcliffe et al.'s (2013) experiment on assisting stress recovery and attention restoration using bird sound shows that these sounds have emotional benefits. Bird sounds and natural sounds are used in other research to explore attention restoration, stress recovery, and particularly perceived restorativeness (Kjellgren and Buhrkall, 2010, Payne, 2013, Valtchanov et al., 2010). Furthermore, research conducted by Benfield et al. (2014) provides evidence that natural sounds can have restorative benefits and aid mood recovery. Annerstedt et al. (2013) add to this by providing evidence of physiological stress recovery using sounds of nature in a virtual reality forest. Finally, a randomized study conducted by Cutshall et al. (2011) shows that recorded music and nature sounds can provide a means of relaxation for patients with common symptoms of pain and anxiety.

The results of these studies suggest that sound is an effective tool which enhances psychological health. It can also create a relaxed environment. Therefore, sound is used in this project as an element to stimulate positive emotions and create a calm and relaxed environment to achieve a state of deep engagement.

Creating a relaxed environment using sounds

Here I am looking at the types of sounds that will be used in the Sati Interactive System.

This project only uses a combination of nature sounds, such as the sounds of a forest, a creek, a waterfall, birds, ocean waves, and rain, because they are natural. Music is not used because it is man-made and various factors influence the connection between sound and emotion, such as:

• Listener's age – different types of music generate different personal feelings depending on the listener's age (Kemper and Danhauer, 2005).
• Ethnic background (Balkwill et al., 2004, Balkwill and Thompson, 1999).
• Health status (Beck, 1991, McCaffrey and Freeman, 2003).

Analysing sounds

When selecting nature sounds to use in the Sati Interactive System, the initial idea was to use the same nature sound clips used in the research studies by Ratcliffe et al. (2013), Benfield et al. (2014), Annerstedt et al. (2013), Kjellgren and Buhrkall (2010), Payne (2013), and Valtchanov et al. (2010). I contacted all of these researchers

requesting access to the sound clips they used in their research. However, only Alvarsson et al. (2010) responded and provided the sound clips used in their research; I have used one of their Ambient Nature sounds. For the remaining sounds, I used sounds from freesound.org.

Here, I have used YouTube to search for the most viewed relaxing nature sounds. This selection process was conducted under the assumption that 'the most viewed content has a positive effect on the viewer'. Firstly, a YouTube search was conducted using the phrase "most relaxing natural sounds". The first 87 videos were selected for a general analysis. Most of the 87 videos contain similar natural sounds, such as water, rain, birds, animals, and wind. All of these sounds are recorded in natural environments such as rainforests, mountains, waterfalls, and oceans.

Further analysis was conducted by analysing sound elements such as frequency and rhythmic patterns. These sound elements were analysed using Sonic Visualiser (Cannam et al., 2010). From the results of the YouTube search, 6 sound clips were chosen for the analysis. The first clip includes a combination of water, wind, and bird sounds. The second clip only includes bird sounds. The rest of the sound clips include varieties of nature sounds. However, only the first 5 minutes of each clip were extracted and normalized3 (AdobeAudition, 2019) for the analysis.

Nature sounds downloaded from YouTube –

• Relax 8 Hours-Relaxing Nature Sounds-Study-Sleep-Meditation-Water Sounds-Bird Song - https://youtu.be/eKFTSSKCzWA
• 8 Hour Nature Sound Relaxation-Soothing Forest Birds Singing-Relaxing Sleep Sounds-Without Music - https://youtu.be/2G8LAiHSCAs
• Calming Seas #1 - 11 Hours Ocean Waves, Nature Sounds, Relaxation, Meditation, Reading, Sleep - https://youtu.be/f77SKdyn-1Y
• Nature Sound 16 - THE MOST RELAXING SOUNDS - https://youtu.be/b2njHW9ydWs
• 8 Hours Relaxing Nature Sounds Meditation Relaxation Birdsong Sound of Water Johnnie Lawson - https://youtu.be/Jll0yqdQclw
• Forest and Nature Sounds 10 Hours - https://youtu.be/OdIJ2x3nxzQ

Comparison

The images of the sound waves below show the reader the differences between the types of sounds, such as bird, water, and wind. Figures 2 and 3 show the 8-hour sounds, and Figures 4 and 5 show 5-minute extracts of longer sounds.

Figure 2 shows the waveform and the spectrogram analysis of the 8-hour YouTube sound clip that includes water, wind, and bird sounds.

3 The normalize effect makes the audio reach the maximum amplitude that digital audio allows.


Figure 2 : 8-hour YouTube sound clip - includes water, wind, and bird sounds.

Figure 3 below shows the waveform and the spectrogram analysis of the 8-hour YouTube sound clip that only includes bird sounds.

Figure 3 : 8-hour YouTube sound clip - only includes bird sounds.

However, for a detailed analysis, 5 minutes of each clip is normalized and analysed. Figure 4 shows the waveform and the spectrogram of the 5-minute sound clip that includes water and bird sounds.


Figure 4 : 5-minute sound clip - includes water and bird sounds.

Here, the audio reaches a maximum of -2 decibels (dB). The bird sounds vary in frequency from around 1550.39 Hz to 5426.37 Hz. The overall analysis also shows a rhythmic pattern in the bird sounds.

Figure 5 below shows the waveform and the spectrogram of the sound clip that includes only the bird sounds.

Figure 5 : 5-minute sound clip - includes only the bird sounds.

Here, the audio reaches a maximum of -0.3 dB. The frequency of the bird sounds fluctuates from around 1466.03 Hz to 12898.8 Hz. The rhythmic pattern of the bird sounds can be seen clearly in this clip.

Comparing the two 5-minute sound clips shown above shows that the lower frequencies are in a similar range. However, the frequencies of the bird sounds in the clip without the water sound reach higher values. This might be because the clip without the water sound has less interference from other sounds. However,

the occurrence of the bird sounds shows a clear rhythmic pattern in both sound clips. Overall, the 6 sound clips do not show any similarities other than the rhythmic pattern. The images of the other audio analyses can be found in Appendix 1 on page 129.
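The analysis above was carried out in Sonic Visualiser; as a rough illustration of the same steps in code, the following sketch is a hedged Python example (assuming the librosa and numpy libraries, with a hypothetical file name; it is not the procedure used in this thesis). It loads the first five minutes of a clip, reports its peak level, peak-normalises it, and estimates the frequency band in which most of the energy, such as the bird calls, sits.

import numpy as np
import librosa  # assumed available; the thesis analysis itself used Sonic Visualiser

# Load the first five minutes of a clip (file name is hypothetical).
y, sr = librosa.load("nature_clip.wav", sr=None, mono=True, duration=300)

# Peak level of the raw excerpt, in dB relative to full scale.
peak_db = 20 * np.log10(np.max(np.abs(y)))
print(f"Peak level before normalisation: {peak_db:.1f} dBFS")

# Peak-normalise so the loudest sample reaches full scale,
# mirroring the normalisation described in the footnote above.
y = y / np.max(np.abs(y))

# Magnitude spectrogram, comparable to the Sonic Visualiser view.
spec = np.abs(librosa.stft(y, n_fft=4096, hop_length=1024))
freqs = librosa.fft_frequencies(sr=sr, n_fft=4096)

# Dominant frequency per frame, as a rough indication of where
# the most prominent sounds (e.g. bird calls) sit.
dominant = freqs[spec.argmax(axis=0)]
print(f"Dominant frequencies roughly span "
      f"{np.percentile(dominant, 5):.0f} Hz to {np.percentile(dominant, 95):.0f} Hz")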

The nature sounds for this thesis were selected by focusing on sound characteristics such as rhythmic patterns, and according to my personal preference for sounds that help me reach a relaxed state.

Selected Nature Sounds

The Sati Interactive System uses 9 different nature sounds. These sounds were downloaded from freesound.org (Freesound, 2019) and soundbible.com (Soundbible, 2019). More information about the nature sound files is displayed in Table 5 below.

Name of the sound   Audio Information         Producer        Website
Cuckoo              Mp3, 500kbps, mono        Vonora          https://freesound.org/people/Vonora/sounds/269570/
Water               Wave, 48000 Hz, Stereo    Shain_T         https://freesound.org/people/Shain_T/sounds/352962/
Hieronyme Birds     AIFF, 44100 Hz, Stereo    Hieronyme       https://freesound.org/people/hieronyme/sounds/62196/
Ocean waves         Wave, 44100 Hz, Stereo    Xserra          https://freesound.org/people/xserra/sounds/161701/
Creek               Flac, 96000 Hz, Stereo    Hitrison        https://freesound.org/people/Hitrison/sounds/191360/
Rain                Wave, 480000 Hz, Stereo   Lebaston100     https://freesound.org/people/lebaston100/sounds/243628/
Forest ambient      Wave, 48000 Hz, Stereo    Bajko           https://freesound.org/people/bajko/sounds/385279/
Birds               Wave, 48000 Hz, Stereo    Hargissssound   https://freesound.org/people/hargissssound/sounds/345852/
Ambience            Wave, 48000 Hz, Stereo    Hargissssound   https://freesound.org/people/hargissssound/sounds/345851/

Table 5 : Nature sounds for the Sati Interactive System

These sound files are under the Creative Commons 0 License (CreativeCommons, 2019b) and the Attribution license (CreativeCommons, 2019a). The appropriate credit has been given to the creators of each sound file. The selected sound files were converted into 16-bit, 44100 Hz, stereo mp3 for optimised, good-quality playback.
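As an illustration of how such a conversion can be scripted, the snippet below is a hedged example assuming the pydub library and an ffmpeg installation, with a hypothetical file name; it is not the tool used to prepare the thesis files. It resamples a source file to 44100 Hz, 16-bit stereo and exports it as an mp3.

from pydub import AudioSegment  # assumes pydub and ffmpeg are installed

# Hypothetical source file; any of the formats in Table 5 (wav, aiff, flac, mp3) would work.
clip = AudioSegment.from_file("creek.flac")

# 44100 Hz sample rate, stereo, 16-bit samples (sample width of 2 bytes).
clip = clip.set_frame_rate(44100).set_channels(2).set_sample_width(2)

# Export as mp3 for use in the interactive system.
clip.export("creek.mp3", format="mp3", bitrate="192k")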

Movement

Here the body movement practice Tai Chi is used as a movement guide, not for its potential benefits or as a part of the Sati Interactive System. It is possible to use any other form of movement, including dance, exercise, and/or yoga, as a movement guide. For example, the use of a dance video can be seen in How the Sati Interactive System may be used in other contexts on page 88. I selected Tai Chi after considering these factors:

• Tai Chi has a profound history (Guo et al., 2014)
• Tai Chi is simple and easy to follow (Lan et al., 2002)
• Tai Chi is nonreligious (Guo et al., 2014)
• Tai Chi is well-known and proven to be beneficial to mental and physical wellbeing (Lan et al., 2002, Lan et al., 2013, Burschka et al., 2014, Wall, 2005, Sandlund and Norlander, 2000, Mills and Allen, 2000).

Using Tai Chi as a movement guide

The Tai Chi movement guide used in the Sati Interactive System is provided by Master Wong4. I have chosen Master Wong because he is a professional Wing Chun, Tai Chi, and JKD teacher. The link to the original Tai Chi video from Master Wong (2016) is Tai Chi Daily - 14-minute Tai Chi Routine – Master Wong - https://youtu.be/GyX8iIRtECc

Engagement through movement

This section discusses using body movement to engage the user. As mentioned above, any other form of body movement can be used in the Sati Interactive System to engage the user. For instance, dance, yoga, forms of physical exercise, and sports training exercises can be used.

Movement can provide: psychological benefits (Byrne and Byrne, 1993, Yeung, 1996), social benefits (Muller and Berthouze, 2010, Mueller and Gibbs, 2007, Mueller et al., 2003), posture changes and bodily expressions affecting motivation (Riskind and Gotay, 1982), emotional benefits (Laird, 1974), improved self-performance (Stepper and Strack, 1993), and beneficial effects on self-image and food intake (Bloom et al., 2008).

While these benefits may be profound, the focus here is on engagement through movement in a digital environment, such as interaction in a game or other virtual environment, because the Sati Interactive System creates a similar digital/virtual environment.

Research shows that different forms of body movement can be used to engage the user. For example, Bianchi-Berthouze et al. (2007) suggest that a player’s engagement level can be increased through body movement, and that a full-body experience in a digital environment can enable the feeling of presence, enable the affective aspects of human-human interaction, and affect emotions (Bianchi-Berthouze et al., 2007, p. 1). They also found that the engagement created through body movement helped users enter into a fantasy world, and so experience more emotions. Bianchi-Berthouze’s (2013) research supports the idea of using movement to enhance user engagement, concluding that,

… the selection of the body movement to control the game and the degree of freedom offered by the controller will have an effect on how the player will engage with the game. (Bianchi-Berthouze, 2013, p. 28)

Another study by Lindley et al. (2008) shows that a controller that supports natural movements can increase levels of user engagement.

Devices such as Kinect and Leap Motion are used in various research projects that focus on body movement. For example, free-hand interaction with a Leap Motion controller is used for stroke rehabilitation (Khademi et al., 2014), and Kinect is used in a 3D virtual online gym application (Cassola et al., 2014) and to teach natural user interaction (Villaroman et al., 2011). However, these devices have issues such as tracking accuracy, not being user-friendly, being platform-specific, and being more expensive than a webcam. This is why I have chosen to use a camera-based system for interaction.

Using Colour, Sound, and Movement to Create a Sense of Deep Engagement As discussed in Colour, Sound, and Movement on page 6, the Sati Interactive System uses these three elements to create a sense of deep engagement. Colour and sound are used to generate positive emotions in the user. Movement is used as a guide and to engage the user in an activity, not for its benefits. This sense of deep engagement creates a transformative experience: an emotional change in the user. This is shown in Figure 6 below. This transformative experience is further discussed in Chapter 3: An Arts-based Approach: Sati Interactive System, Deep Engagement, and Transformative Experience on page 45.
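To make the camera-based interaction mentioned above concrete, the following is a minimal sketch of webcam motion detection using frame differencing; an amount-of-movement value of this kind is the sort of signal that could be mapped to changes in colour and sound. It assumes Python with OpenCV purely for illustration; the Sati Interactive System itself is built in Max, and the threshold shown is a hypothetical parameter.

import cv2  # OpenCV for webcam capture and image processing

MOTION_THRESHOLD = 5000  # hypothetical number of changed pixels that counts as movement

capture = cv2.VideoCapture(0)          # open the default webcam
ok, frame = capture.read()
if not ok:
    raise RuntimeError("webcam not available")
previous = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = capture.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    difference = cv2.absdiff(gray, previous)                # pixel-wise change between frames
    _, mask = cv2.threshold(difference, 25, 255, cv2.THRESH_BINARY)
    movement = cv2.countNonZero(mask)                       # how many pixels changed
    if movement > MOTION_THRESHOLD:
        print("movement detected:", movement)               # here colour or sound could be triggered
    previous = gray

capture.release()

A simple measure like this is one reason an ordinary webcam can stand in for a Kinect or Leap Motion sensor in a low-cost system: it does not track the skeleton, but it does report how much the user is moving.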

4 Master Wong has been informed about this project and he has approved the use of his video, see Appendix 5 on page 127.


Figure 6 : Sense of deep engagement – transformative experience

John Dewey (1934, p. 30) discusses the use of colour, sound, and movement, saying they may bring “delight to eye and ear and that evokes massive emotions of suspense, wonder, and awe” when discussing the use of “solemn processions, incense, embroidered robes, music, the radiance of colored lights, with stories that stir wonder and induce hypnotic admiration” to “direct appeal to sense and to sensuous imagination”. This may create a sense of deep engagement in the person experiencing it and potentially create a transformative experience. Dewey also gives the example of physicists and astronomers of his day (1934), who “answer to the esthetic need for satisfaction of the imagination rather than to any strict demand of unemotional evidence for rational interpretation”. Artworks are often inherently designed to answer this aesthetic satisfaction of the imagination, but the Sati Interactive System is not designed to meet this requirement. In this way, the Sati Interactive System attempts to create the experience Dewey discusses without answering to that aesthetic satisfaction of the imagination.

Computer-based Interactivity This section discusses computer-based interactivity. Firstly, various definitions of computer-based interactivity and its development are discussed. These definitions are based on different concepts which vary between contexts, including science, engineering, and statistics. The second part of the discussion covers different concepts associated with interactivity.

Interpretation of Interactivity Interactivity has numerous definitions according to its use in different fields (O'Sullivan et al., 1994); it is defined by its context of use. Here, the concept ‘interactivity’ is defined within the context of technology. There are, however, numerous definitions of computer-based interactivity (Downes and McMillan, 2000, Jensen, 1998a, Neuman, 2008, Jensen, 2000, Trindade et al., 2003, Williams et al., 2001, Rice and Williams, 1984, Rogers and Chaffee, 1983, Everett, 1986, Rafaeli, 1988, Steuer, 1992, Williams et al., 1988). I will give examples of definitions from a selection of these authors.

Over the years, the concept ‘interactivity’ has been developed by many researchers. According to Williams et al., interactivity is,

the degree to which participants in a communication process have control over, and can exchange roles, in their mutual discourse (Williams et al., 1988, p. 10).

This definition emphasizes that interactivity means being able to communicate while having the ability to control and mutually exchange roles. Newhagen & Rafaeli describe interactivity as,

the extent to which communication reflects back on itself, feeds on and responds to the past (Newhagen and Rafaeli, 1996, p. 6).

Newhagen & Rafaeli add a new facet by stating that interactivity should respond to previous communications. Rafaeli further develops this idea by pointing out that interactivity comprises feedback and being able to respond. He defines interactivity as,

an expression of the extent that in a given series of communication exchanges, any third (or later) transmission (or message) is related to the degree to which previous exchanges referred to even earlier transmissions (Rafaeli, 1988, p. 111).

Steuer’s definition of interactivity contains additional facets, such as the speed of response, the extent to which the environment can be modified, and the connection to human actions. He defines interactivity as,

the extent to which users can participate in modifying the form and content of a mediated environment in real time (Steuer, 1992, p. 14).

Lastly, there are several features included in the concept ‘interactivity’, such as reciprocity, responsiveness, and nonverbal information (Johnson et al., 2006).

In conclusion, based on the various interpretations discussed above, interactivity can be defined as the engagement between two entities that exchange information and experience whilst evolving through dialogic reasoning.

Concepts of Interactivity Concepts of computer-based interactivity take different forms in areas such as games, medicine, education, the arts, business, and the military. Jäckel (1995) argues that the concept ‘interactivity’ evolved from the concept ‘interaction’, which has the meaning of ‘interplay’, ‘exchange’ and ‘mutual influence’. These concepts can vary around different parameters (Rafaeli, 1988, Rice and Williams, 1984, Heeter, 1989, McLaughlin, 1984, Steuer, 1992, Neuman, 2008, Jäckel, 1995, Rogers, 1995). For example, in the field of digital media, interactivity refers to the phenomenon of mutual adaptation (Neuman, 2008); this can occur within the internet or a game, where the medium itself acts as an interlocutor.

Jensen focuses on how the definition of interactivity varies in different fields of study. He describes ‘interaction’ in the field of medical science as “the interplay between two medications given at the same time” (Jensen, 1998b, p. 188). In the field of engineering, ‘interaction’ means the reactivity and relationship between two different materials under stress, and in statistics it is described as “The common affect of several variables on an independent variable” (Jensen, 1998b, p. 188). Jensen says that Jäckel refers to the meaning of ‘interaction’ in linguistics as “the influence on language behavior of bi-lingual children”.

These different representations of ‘interaction’ show that its conceptual meaning depends on the field of study in question. As an evolving field, interactivity is extensively theorized. Therefore, the concept of ‘interactivity’ is discussed further in relation to the context of computer-based interactivity.

Concepts/Principles of interactivity within a computer-based context Numerous concepts of interactivity unfold within the context of computer-based interactivity. In the field of computer-based interactivity, studies of interactivity consist of concepts such as: interchangeable roles, user effort, real-time communication, and characteristics of the communicator and the medium (Rice, 1984, Williams et al., 1988, Rice and Williams, 1984, Rogers, 1995, Rafaeli, 1988, Heeter, 1989, McLaughlin, 1984, Steuer, 1992). I will give examples of the concepts from a selection of the above authors.

The notion of interchangeable roles was suggested by Rice (1984), one of the early investigators of interactivity. He says “fully interactive media imply that the sender and the receiver’s roles are interchangeable” (Rice, 1984, p. 35).

Later, Rogers’ (1995, p. 314) definition of interactivity is similar to Rice’s studies, as he states, “interactivity is the degree to which participants in a communication process can exchange roles and have control over their mutual discourse”.

Rafaeli (1988) also states that the roles of communication need to be interchangeable, involving a nonautomatic, or nearly nonautomatic, process of assigning roles and turn-taking. Consequently, the ideas of interchangeable roles are closely related to McLaughlin, Watzlawick, Beavin and Jackson’s interpersonal communication theory (McLaughlin, 1984, Watzlawick et al., 2011).

The concept of user effort was suggested by Heeter (1989), who stated that engagement with interactive media requires more effort from the user than traditional forms of media. Real-time communication as a concept was studied by Rice and Williams (1984) and Steuer (1992), who suggested that interactive media should involve instant exchange. However, Rheingold (2000) does not agree with the importance of real-time exchange. He suggests that interactive media tools such as email, listservs5, and newsgroups offer significant interactive advantages even though the exchange does not occur in real time.

The concept of ‘responsiveness’ was emphasized in Rafaeli’s (1988) definition of interactivity. He identifies ‘responsiveness’ as being able to respond to current and previous communications. Neuman (2008) suggests that responsiveness can also be understood as the speed with which the medium responds to the user. This has been highlighted by analysts of video games, while others have used the amount of flexibility or the frequency of the interaction (Neuman, 2008). Responsiveness also concerns the interruptibility of the communication stream between the interlocutor and the engaging subject (Rafaeli, 1988). According to Downes and McMillan (2000), the amount of effort required for interactive communication relates to responsiveness. This suggests that interactivity through different systems can require more or less effort.

Another principle of interactivity, the directionality of communication (Neuman, 2008), proposes a two-way stream of communication within interactivity. Communication methods from the late industrial era, such as newspapers, can be considered a one-way stream of communication: published material rarely received feedback from readers because the technology was limited. The era of radio had the same limitation of not allowing feedback, and so could not escape the one-way stream of communication. However, since the development of the Internet and advanced digital technologies, the fundamental conditions for establishing two-way communication are accessible. Technological advancement allows the immediate sending and receiving of information such as data, messages, audio, and video. As a result, the criterion of bidirectionality is met.

The principle of selectivity, as Neuman (2008, p. 2319) describes it, concerns the range of types and formats in which information is available: “the breadth of choice that is available to the user in terms of both types and formats of information and entertainment”. The principle of ‘awareness’ is “the degree of reciprocal awareness of system states and user reactions”. Neuman notes that Rafaeli specified that “later interactions of an exchange can reflect an awareness of earlier ones, as typified by human conversation”. Human-human interactions involve awareness, as both parties can understand the interaction.

Duncan (1989, p. 22) adds to this idea of awareness, stating, “interaction may be said to come into being when each of at least two participants is aware of the presence of the other, and each has reason to believe the other is similarly aware”. However, current technology still lacks the means to fully perceive this element. A summary of the various concepts/principles of interactivity is displayed in Table 6 below.

Concepts/Principles | Author(s) | Explanation
Directionality | Neuman / Downes / McMillan | Two-way stream of communication
Responsiveness | Rafaeli / Downes / McMillan | Effort exerted into communicating via different communication systems
Awareness | Neuman / van Dijk / Duncan | Understanding of context and meaning
Selectivity | Neuman | The availability of choices
Interchangeable roles | Ronald E. Rice | Communication roles can be interchangeable
Real-time communication | Rice, Williams, and Steuer | Interactive media should be an instant exchange
Table 6 : Summary of the principles of computer-based Interactivity6

5 An application that distributes messages to subscribers on an electronic mailing list (Lexico, 2020). 6 This table is developed in Applying the principles/concepts of computer-based interactivity on page 77 with specific reference to the Sati Interactive System and taking an arts-based approach.

Computer-based interactivity can be defined through these six concepts/principles, where a computer is included as a tool, an agent, or an interlocutor. Here I will be focusing on these six principles/concepts of interactivity in developing Sati Interactive. How these concepts are used is discussed in Applying the principles/concepts of computer-based interactivity on page 78.

Interactive Technologies The areas covered in this section are: interactive systems, virtual reality systems, exemplar projects in various fields of computer-based interactivity (uses of interactive technologies in different contexts), types of controllers in interactive systems, and interactive software paradigms. This section also includes exemplar projects on interactive technologies in entertainment, healthcare, and education. A brief outline of approaches to interactive technologies in these areas is discussed.

Significance:

This topic carries significant weight in this research project because the state of technology is rapidly developing. It is necessary to understand the current state and development of interactive technologies in order to develop a system that can endure into the future.

An introduction to Interactive systems An interactive system is a combination of technologies put together to create a system with interactive functionalities. Developments in three-dimensional visualization have facilitated a wide range of visualization and projection capabilities, such as Virtual Reality [VR] (Burdea Grigore and Coiffet, 1994), Augmented Reality [AR] (Meisner et al., 2003), 3D Projection Mapping (Priede, 2017), and immersive CAVE (Cave Automatic Virtual Environment) systems (Mazikowski and Lebiedź). Immersive technologies such as Oculus Rift (Facebook Technologies, 2019), PlayStation VR (Sony Interactive Entertainment, 2019b), Vive (HTC Corporation, 2019), Nintendo Wii (Nintendo, 2019b), Google Cardboard (Google, 2019b), and Google Glass (Google, 2019a) are popular mostly as entertainment devices. However, these immersive technologies are now often used in therapy and training as tools that support individuals in resolving various issues (Parsons and Rizzo, Stanney, 2002). Accessories for immersive devices include head-mounted displays with the ability to track eye motion and to generate multisensory VR that simulates smell, heat, water, wind, and vibration (Gortari, 2015, Moss, 2015); for example, FOVE-Inc (2019) and Feelreal (2019). These VR devices are used to interact with and explore virtual environments (Ausburn and Ausburn, 2004).

Currently, the use of interactive technologies in the fields of education, healthcare, museums, galleries, and entertainment is rapidly developing. This is similar to the development process of the internet in the early 21st century (Mehlenbacher and Miller). Interactive learning methods can be used to amplify understanding compared with conventional ways of learning. Interactive technologies enable the user to participate in discussions in 2D/3D audio-visual environments (Napier, 2017). Different types of interactive technologies being used for communication include: Microsoft HoloLens (Hanna et al., 2018), 3DNSITE (Pintore et al., 2012), Kinect (Lee et al., 2013), Leap Motion (Potter et al., 2013), Oculus Rift (Hoffman et al., 2014), Wii Remote (Schou and Gardner, 2007, Anderson et al., 2010), EyeToy (Rand et al., 2008), and Myo Gesture (Sathiyanarayanan and Rajan, 2016).

Immersive systems can be classified in different ways depending on their interactive capabilities and the sensory experience they provide. As Biocca and Delaney put it, “Like the computer itself, [VR] is a protean technology. There will be no single type of VR system and no paradigmatic virtual environment” (Biocca and Delaney, 1995, p. 57). Biocca explains the term ‘immersive’ as,

“a term that refers to the degree to which a virtual environment submerges the perceptual system of the user in computer-generated stimuli. The more the system captivates the senses and blocks out stimuli from the physical world, the more the system is considered immersive” (Biocca and Delaney, 1995, p. 57).

Table 7 displays the classification of virtual reality systems by Biocca and Delaney (1995), which was adapted from Louis Brill’s classification at Virtual Reality 1993, San Francisco.

Types of systems | Description
Window systems | A computer screen provides a window or portal onto an interactive, 3D virtual world. Desktop computers are often used, and users sometimes wear 3D glasses for stereoscopic effects.
Mirror systems | The users look at the projection screen and see an image of themselves moving in a virtual world. Video equipment is used to record the user’s body. A computer superimposes a cut-out image on a computer graphic background. The cut-out image of the user on the screen mirrors their movements, hence the name mirror systems.
Vehicle-based systems | The users enter what appears to be a vehicle (e.g., tank, plane, car, spaceship, etc.) and operate controls that simulate movement in the virtual world. The world is most often projected on screens. The vehicles may include motion platforms to simulate physical movement.
Cave systems | Users enter a room enclosure where they are surrounded by large screens that project a nearly continuous virtual scene. 3D glasses are sometimes used to enhance the sense of space.
Immersive Virtual Reality systems | Users wear displays that fully immerse a number of the senses in computer-generated stimuli. Stereoscopic head-mounted displays (HMD) are a distinctive feature of such systems.
Augmented Reality systems | Users wear a visual display (e.g., transmissive HMD) that superimposes 3D virtual objects on real-world scenes.
Table 7 : Classification of Virtual Reality Systems (Biocca and Delaney, 1995)

Fields of Computer-based Interactivity This section discusses how technology-mediated interactivity appears across these fields: entertainment, education, training, health, and simulation experiments. The discussion is broad, to show the use of interactive technology in various fields together with its benefits and limitations.

Computer-based Interactivity in Entertainment The development of technology for entertainment has influenced other fields7. ‘Video games’ is the most common term used to describe interactivity in entertainment. Since the development of games such as SpaceWar in 1962 (Graetz, 1981), Pac Man in 1980 (Gallagher and Ryan, 2003), and Grand Theft Auto in 1997 (Rockstar Games, 2019), videogames have made a huge impact in the field of interactive entertainment (Ritterfeld and Weber, 2006, Annetta et al., 2009). Games evolved from basic arcade games to computer games and console games which use a variety of gaming accessories such as joysticks, controllers, motion sensors, and VR headsets (Card et al., 1978, Carter et al., 1994). Technological development led from 2D games such as Pong in 1972 (Atari, 2018), Super Mario in 1982 (Nintendo, 2018), and Aladdin’s Castle in 1969 (Namco, 2018) to large-scale 3D games such as Call of Duty in 2003 (Activision, 2019), Grand Theft Auto in 1997 (Rockstar Games, 2019), Need for Speed in 1994 (Electronic Arts, 2019), and Battlefield in 2002 (Electronic Arts, 2018). Arcade gaming consoles developed into household gaming consoles such as PlayStation (Sony Interactive Entertainment, 2019a), Xbox (Microsoft, 2019), Nintendo (Nintendo, 2019a), and personal computers (Freiberger and Swaine, 1999). Technologies such as VR (Burdea Grigore and Coiffet, 1994), motion sensors (Microsoft Store, 2018), and gesture controllers (Ultrahaptics, 2019) have opened new paths in interactive entertainment (Zyda, 2005, Bates, 1992).

Currently, computer-based entertainment uses interactive technologies such as standard mice and keyboards, gamepads (Logitech, 2019a), joysticks (Guillemot Corporation, 2019), and racing wheels (Logitech, 2019b), which provide a wide range of tactile interactive possibilities, while motion sensors (for example, Kinect and Leap Motion) provide interaction through human gesture/non-tactile controls. Other interactive mechanisms such as the Oculus Rift enable interactivity within a 3D virtual space through the technologies

7 This is possibly because entertainment is one of the most wealthy and agile areas of paradigmatic software and hardware development.

mentioned earlier and technologies such as Oculus Insight Tracking, Oculus Touch Controllers (Facebook Technologies, 2019), VIVE Cosmos controllers, and inside-out tracking (HTC Corporation, 2019).

Below I discuss examples of interactive videogames and their applied technologies.

First Person Shooter Games (FPS) First person shooter games have contributed significantly towards computer-based interactivity in other fields such as education, training, and healthcare. FPS games require hand-eye coordination and fast reaction times, and can therefore develop these skills. First person shooter games allow us to point and select within an interactive 3D environment. The principles of an FPS game’s interaction system can be seen in other areas, such as the Construct3D AR interactive system discussed below.

Augmented Reality (AR) application Construct3D is a mobile collaborative augmented reality (AR) tool developed by Kaufmann and Schmalstieg (2003) explicitly for mathematics and geometry education. Augmented reality is used to provide students with the advantage of viewing constructed objects in 3D, rather than a traditional pen and paper illustration. The hardware setup consists of two augmented reality kits that can be worn by the student and the teacher. The two AR kits are powered by a backpack computer which also connects a stereoscopic head mounted display with a camera and custom pinch gloves for two-handed input. A projection screen is used to project the live monoscopic video of the work process. This application consists of basic functions such as the construction of primitives, points, lines, planes, cubes, spheres, cylinders, and cones. It also includes intersections, symmetry operations and measuring (Kaufmann and Schmalstieg, 2003).

Figure 7 : Construct3D (Kaufmann and Schmalstieg, 2003)

The AR head-mounted displays create the illusion of a 3D object in the direction of focus. Here, the player orientation (position and focus direction) is controlled by the AR head-mounted sensors. The user has the ability to move around and focus on the object. The AR headgear calculates the user’s orientation and focus position, creating an illusion of a moving 3D space. This is similar to the interaction functionalities of an FPS game, where a mouse and keyboard or a joystick is used to control the player orientation. Computer-based interactivity in healthcare and education is discussed in more detail below.

Virtual Reality in Entertainment Virtual reality (VR) gaming has spread across platforms, from PC-based VR games to mobile-based VR games. However, games are not the only entertainment category that uses VR technology. For example, YouTube and Netflix have enabled VR, offering a wide range of VR videos, from movies, music videos, and sports to wildlife documentaries. ‘Enter The Void’, ‘Fantasia & Fantasia 2000’, ‘Jungle Book’, ‘Jaws’, and ‘Trainspotting’ are some of the VR movies that Netflix offers (Walton, 2017). The ‘Documentary: Planet Earth Amazing Nature 3D SBS’ by 3D SBS videos HD compilations (2016) is an interesting showcase of VR video.

The gaming industry has merged with VR technology, allowing new pathways for gaming. Sony’s PlayStation currently has over 100 VR games available and more are in development. ‘Farpoint’, ‘Superhot VR’, ‘Resident Evil 7’, ‘Rise of the Tomb Raider: 20 Year Celebration’, and ‘Trackmania Turbo’ are some of the VR games that are currently available on the PlayStation Store (Sony Interactive Entertainment, 2017). Mobile platforms such as Android and iOS also support VR technology. Android VR games such as ‘VR Thrills: Roller Coaster 360’, ‘Deep Space Battle VR’, ‘VR Haunted House’ and ‘VR Bike’ can be downloaded onto Android devices from Google Play (Google Play, 2017). iOS has enabled VR experiences through mobile games such as ‘End Space VR’, ‘Fractal Combat X’, ‘InCell VR’, ‘Lamper VR: Firefly Rescue’ and ‘Roller Coaster VR’ (Broida, 2016).

These devices that support VR applications use different types of VR headgear as the main display and controller device. Sony’s PlayStation uses its own PlayStation VR headset which is powered by the PlayStation4 (Sony Interactive Entertainment, 2017). Samsung Gear VR, HTC Vive, Google Daydream View, Google Cardboard, Merge VR Goggles, Xiaomi Mi VR Play, Zeiss One Plus, BlitzWolf BR-VR3 are examples of VR headgear that support Android and iOS platform mobile devices (Lamkin, 2017).

Problems with VR There are limitations and issues with the use of VR in the fields of training and rehabilitation, such as: VR sickness8 (Bruun-Pedersen et al., 2016, Kolasinski, 1995), dark focus shifts (Kolasinski, 1995, Arthur and Brooks Jr, 2000), nausea, disorientation, sweating, oculomotor disturbances, fatigue, and headaches (Fernandes and Feiner, 2016, Liu, 2014). Research also indicates that postural sway9, also known as ataxia (Morel et al., 2015, Riccio and Stoffregen, 1991), which may cause musculoskeletal issues, eye strain, and oculomotor disturbances (Kolasinski, 1995), poses significant problems for the extended and widespread use of current VR technologies.

VR application development is expensive, and this significantly adds to the ongoing cost of a VR system. Developers should take into consideration that low- and middle-income countries (LMIC) may not be able to afford an expensive VR system. A screen-based or projection system could remedy some of these limitations and issues.

These problems also apply to the topics that follow; where a problem is specific to a particular topic, it is mentioned there.

Computer-based Interactivity in Healthcare Applications This section looks at examples of computer-based interactivity used to enhance healthcare-related applications, including surgical procedures, pre-surgical planning, Alzheimer’s disease type dementia, and dementia treatment for the elderly.

In the past decade, interactive applications for healthcare have developed rapidly. Research in this area has changed from curiosity-based research to commercially and clinically substantial technological research (Riva et al., 1999, Emerson et al., 1994). The healthcare sector uses a wide range of VR and related interactive technologies. Surgical procedures, medical therapy, preventive medicine and patient education, medical education and training, visualization of massive medical databases, skill enhancement and rehabilitation, and architectural design for healthcare facilities are some of the areas that use computer-based interactivity (Moline, 1997, Satava, 1995a).

Computer-based interactivity in Surgical Procedures Virtual reality technology is associated with robotics to develop applications for different stages of surgical procedures such as: remote surgery/telepresence, augmented reality surgery, and pre-surgery planning and simulation (Moline, 1997). Remote surgery/telepresence systems are used for remote surgeries where the surgeon is given the ability to perform surgeries remotely with the use of VR and robotics applications. The VR applications provide interactive visualizations while the robotics applications provide the remote operational capabilities, for example, on the battlefield. Green Telepresence Surgery System, invented by SRI International, is a surgical workstation equipped with remote operating capabilities, 3D vision, dexterous precision surgical instrument manipulation, and a force feedback system which provides sensory information (Satava, 1995b). This system has been demonstrated using fiber-optic connections up to 150 yards. The surgeon uses telepresence to operate on a virtual image, and the surgeon’s movements are mirrored by the robot on the battlefield. This application has proved effective and has provided a safe environment for the surgeons to operate. Another advantage of this system is that it provides a pre-operation platform for the surgeons where they can practice the surgery procedure and analyse the outcome before the actual operation process (Satava, 1995b). However,

8 Similar to motion sickness. 9 Similar to an effect caused by closing the eyes.

the lack of wireless connectivity in the Green Telepresence Surgery System is a major drawback; it limits the distance between the surgeon and the patient.

Computer-based interactivity in Pre-surgical planning, practice, and rehabilitation The VIVIAN (Virtual Intracranial Visualization and Navigation) system uses VR technology for medical procedures (Kockro et al., 2000). This is a simulation system created for pre-surgical planning and practice. According to Kockro et al. (2000), the system uses patient-specific, coregistered, fused data sets. The surgeon can view the virtual environment through stereoscopic glasses. The Dextroscope (Volume Interactions) used in this system allows the user to manipulate the VR environment. The pre-surgical planning and testing allowed by the system bring great advantages to the healthcare field. It can be used by professionals or for training purposes. However, this system has substantial limitations, such as the lack of haptic feedback. Refer to the figure below.

Figure 8 : The VIVIAN (Virtual Intracranial Visualization and Navigation) (Kockro et al., 2000)

Computer-based interactivity is also used in rehabilitation. The ‘Rehabilitation Gaming System’ (RGS) designed by Cameirão et al. (2010) is an example of a project that uses VR technology for neurorehabilitation. The RGS is a VR system built to allow unsupervised rehabilitation for adults recovering from stroke (Cameirão et al., 2010). A rehabilitation scenario called Spheroids was tested with 21 acute/subacute stroke patients and 20 healthy controls. The results of this study indicate that the RGS is a valuable rehabilitation tool.

Computer-based interactivity in Alzheimer’s disease type dementia The healthcare sector has a growing demand for interactive virtual technology and its use in therapy, training, and rehabilitation. Identifying, preventing, and providing necessary treatment for dementia has been supported by the European Union in 2011 (2021). Research and development of VR-based training systems for age-related cognitive deficiencies has also gained attention. The research on using virtual reality for cognitive training of the elderly, conducted by García-Betances et al. (2015), suggests that virtual reality based cognitive training systems are effective and have succeeded in reaching the expected objectives. The study focused on improving the quality of life of elderly people with early-stage Alzheimer’s type dementia and mild cognitive impairment. Memory, language, attention, daily living activities, and executive functions were used as the main domains of the study. The VR environments were designed specifically according to the requirements of the caregivers, specialists, and patients. Pre-testing of the system was conducted by specialists before it was used with patients. The success of the project depended on a variety of factors such as preliminary research, data collection, surveys, questionnaires, personal interviews, and the feedback of caregivers and specialists. The VR simulation was tailored to the tested patients’ profiles and their caregivers’. In addition, virtual reality-based systems are also used as an experimental platform that provides feedback and information to researchers and caregivers, making a substantial contribution towards research and treatment (García-Betances et al., 2015).

According to the World Health Organization, between 5 and 8 percent of people aged 60 and above have dementia (World Health Organization, 2017). This indicates that patients with mild cognitive impairment or dementia are likely to be elderly. Wortmann (2012) highlights in the ‘Alzheimer’s Disease International and World Health Organization report’ that by 2050 more than 75% of people with dementia will be from low-income and middle-income countries. Direct social care costs and informal care costs are high, and low-income countries may not be able to support these costs as well as high-income countries (World Health Organization, 2012).

Computer-based interactivity in dementia treatment for the elderly The interactive art project ‘VENSTER’, led by Luyten et al. (2017), shows that aged care residents triggered memories with the aid of interactive art; the artwork is shown in Figure 9. The VENSTER project was developed to pursue two goals: to study the general responses of residents with dementia towards the interactive art installation, and to identify whether the responses change according to the type of content displayed. Two touch screens, representing a window, were mounted on a wall and connected to a computer, a Kinect sensor, and a roller blind string. The content could be changed by pulling the roller blind string, which was connected to the system. The user could interact using physical gestures, tapping on the screen, or singing along to the music. The initial engagement with the screen was taught by the care providers. The study results indicate that the projected content is important: in most cases, recognition of the content triggered memories and verbal reactions. The limitations of this project include issues related to recording emotions and measuring interaction durations and categories (Luyten et al., 2018, p. 93). This project demonstrates practicality and simplicity in its use of content and technology compared with the ‘virtual reality for cognitive training of the elderly’ study conducted by García-Betances et al. (2015).

Figure 9 : The VENSTER Project - interactive art installation bringing the outside world into the nursing home (Luyten et al., 2017)

The studies discussed here show that computer-based interactivity can enhance healthcare outcomes and is not limited to a single healthcare procedure. However, there are limitations in these applications. The most common limitation is that most of the technologies used are complex and costly. This prevents the applications from being distributed widely, meaning they can only be used in countries that can afford the cost and can support the technology.

Computer-based Interactivity in Educational Applications This section focuses on how computer-based interactivity is used in education-related applications.

Computer-based interactivity in cognitive modifiability The development of advanced interactive technologies has been influential in the field of education as a learning tool. According to Passig et al.’s (2016) research, conducted in central Israel, incorporating teaching strategies and immersive technologies into the education system positively influences the child’s cognitive modifiability.

Their interactive learning system provided instant feedback by merging practical learning and integrated computer learning. Based on active learning, this approach increased students’ involvement and enthusiasm in the subject area.

Their dynamic assessment was conducted in four types of tests: pre-teaching, teaching, post-teaching, and a transfer test carried out two weeks after teaching. The pre- and post-teaching phases tested the children’s cognitive modifiability through the transfer test. One control group and three experimental groups (3D Immersive Virtual Reality, 2D, and tangible blocks) were formed from randomly assigned Grade 1 and 2 children. This project aimed to observe children’s cognitive modifiability in analogical reasoning in different modes of teaching, such as 3D virtual reality based learning and standard learning situations. Results show that children who participated in the 3D virtual reality study space developed greater cognitive modifiability than the standard learning group. Passig et al. (2016, p. 306) say, “integrating a computerized 3D IVR10 environment synergistically with MLE11 strategies within a DA12 procedure creates a computer-mediator-learner “intellectual partnership”. This partnership, it seems, generates a unique perceptual experience that broadens the child's world of mental images, it strengthens the internalization of the MLE principles and contributes to her/his cognitive achievements”.

Passig et al.’s study shows similar results to Rustin Webster’s (2016) study on immersive virtual reality and interaction effects in a classroom scenario built to create a virtual learning environment for the US Army. Bailenson et al. (2008), Brelsford (1993), Trindade et al. (2003), and Ragan et al.’s (2010) studies on immersive virtual reality as a learning tool also show similar results to Webster’s (2016) research. This suggests that the level of learning can be affected by the level of interaction, engagement, immersion, and motivation that the immersive virtual reality environment provides (Webster, 2016, p. 1330).

Objects and environments can be viewed and manipulated in VR, increasing the ability to explore objects factually and creatively. The above research shows potential benefits in learning in a variety of situations. However, VR technology has potential health issues as mentioned above.

Computer-based interactivity in digital game-based learning Immersive systems are also used in education as a digital game-based learning tool. The use of computer games for educational purposes was defined as digital game-based learning [DGBL] by Marc Prensky in 2001 (Prensky, 2007).

Fernandes et al. (2016) studied the user experience [UX] of immersive technologies used to enhance learning, using a multimodal immersive videogame prototype that depicts a Portuguese historical event. This project was conducted with two participants, where one user was given a VR headset and the other a motion sensor controller and a screen with no headset device. The user interactions with the different immersive devices were analysed separately. The results indicate both positive and negative responses to the use of immersion and the virtual world. Both users experienced motivation and enthusiasm towards the content shown through the VR headset (Alves Fernandes et al., 2016).

Figure 10 : Demonstration of the 3D multimodal interaction prototype (Alves Fernandes et al., 2016)

10 Immersive virtual reality 11 Mediated learning experience 12 Dynamic assessment

Lastly, research shows that the use of computer-based interactivity in educational applications enhances learning capabilities. Interactive virtual content, both screen-based and with headsets, captivates user attention and may therefore enhance focus and memory. However, there are limitations and risks, such as using expensive VR headgear and other technical instruments with children.

Computer-based interactivity in Training applications The full-immersive CAVE-based VR simulation system designed by Yuen et al. (2010) provides a virtual workplace to train forklift truck operators. The CAVE-based VR simulation system includes stereoscopic 3D image projection with four screens, VR head and hand tracking tools, and a MOMO Force Feedback Racing Wheel (Logitech, 2019c). It was built to deliver different workplace scenarios such as slopes, sharp turns, and overloads. The main intention of the system was to provide a safe training facility for drivers to improve their skills through visualizing and experiencing sensitive and dangerous situations, to gradually improve their ability, and to reduce real-world accidents (Yuen et al., 2010). However, it was unable to simulate the top and rear views, making it less suitable for safety training in forklift truck operations; the rear view should be visible to the driver when the truck is being reversed. CAVE-based VR simulation systems should be built on accurate parameters, analysing the user’s performance between the real and virtual situations, as the training is later applied in real-life situations (Morel et al., 2015).

Computer-based interactivity in Psychology and behavioural experiments Psychology and the behavioural sciences use immersive virtual reality systems to simulate real-world scenarios in user behavioural experiments. In Rovira et al.’s (2013) experiment exploring “The impact of enhanced projector display on the responses of people to a violent scenario in immersive virtual reality”, 10 participants in two groups were placed in a virtual bar scenario where they observed an argument about a soccer club which ended in a violent attack. The two groups faced different CAVE systems, one with higher resolution and luminance than the other. Participants were able to intervene as many times as they wanted, and the number of interventions and responses made by the participants was studied. The results suggest that there were more physical and verbal interventions made by the participants in the higher resolution CAVE system than in the lower resolution system. Rovira et al. (2013) say that the ecological validity of such experiments can be increased using immersive virtual reality systems, compared with traditional experimental methods created in a laboratory setting based on non-violent and abstract scenarios.

Figure 11 : Cave-like system - virtual bar scenario (Rovira et al., 2013)

The CAVE system has issues such as the cost of development compared to a VR or screen-based system, the inability to project on all six sides, and difficulty of transport. Projecting quality graphical content requires high-quality projectors, and higher-quality rendering requires a large amount of computational power. The cost of design and development has been a major problem for developers and users, and using low-cost devices can be beneficial for both. Factors such as quality of graphics, simulation logic, operational fluency, the user’s imagination, and movement mapping should be considered when assessing the cost of production in the course of designing immersive systems (Buń et al., 2017).

Limitations of the technology used in computer-based interactivity The above research shows that computer-based interactivity can be useful for various purposes; the benefits of each technology are outlined above. However, it is also understood that computer-based interactivity has limitations specific to each technology.

Limitations in Virtual Reality technology Here is a summary of the limitations of VR technology discussed in the above research.

• Cost of VR development and knowledge.
• The technology requires someone with experience and confidence in these kinds of systems to set it up13.
• Lack of haptic feedback.
• It is not always practical to use (e.g., children below the age of 10 and elderly people over the age of 50).
• VR sickness - nausea, disorientation and headaches, sweating, oculomotor disturbances, fatigue, postural sway (ataxia), eye strain, musculoskeletal issues.

Limitations in CAVE system technology Here is a summary of the limitations of CAVE system technology discussed in the above research.

• Development cost of a quality user experience - quality of graphics, simulation logic, operational fluency, user’s imagination, and movement mapping.
• Cost of equipment – projectors, computing power, sound systems.
• Requires a specific space to build a CAVE system.

In conclusion, computer-based interactive technologies have shown their usefulness in education, healthcare, entertainment, and training. VR, AR, CAVE systems, projections, Kinect, Leap Motion, and other haptic devices are beneficial technologies for creating computer-based interactive systems. However, I am not using any of these technologies because I am creating a low-cost system and want to reduce the problems outlined above.

Human Computer Interaction (HCI) This section discusses Human Computer Interaction (HCI). Here I discuss the importance of HCI and two of its features: usability and user experience.

The development of technology constantly changes the state of human-computer interaction (HCI). Conventional analogue systems have become digital, enabling possibilities that could not have been predicted. Interactive devices have developed from the traditional keyboard and mouse to motion sensors, hand gestures, and virtual reality devices. Technologies such as Wi-Fi, Bluetooth, 4/5G, and RFID have allowed communication between devices. Thus, the rapid development of technology changes the perspective of human-computer interaction.

“It is becoming increasingly difficult to ignore the impact that various digital interactive products, services, and systems have on the lives we live-our mobile phones keep ringing, our in-car navigation systems tell us to turn, our email inboxes stack up and Twitter feeds roll by, and that we rather meet friends over Facebook than over a pint at the pub” (Fallman, 2011, p. 1).

Usability and User Experience Here, I am discussing aspects of usability and user experience that were used to create the initial design of the Sati Interactive System.

Usability This section discusses the relationship between HCI and usability. In this thesis, the term ‘usability’ is always discussed within the context of human-computer interaction devices/systems (i.e., interfaces with digital displays).

13 Sati Interactive System is designed for people who might not have the confidence to set up a VR system, for example, people who need assistance setting up their phone.

Traditionally, HCI focused on the ‘five E’s’ of usability. The ‘five E’s’ of usability stand for effective, efficient, engaging, error tolerant, and easy to learn (Bevan, 2001, Fallman, 2010). Over the years, international standards have defined the general principles of usability in different areas, such as software interface interaction and hardware interfaces (Bevan, 2001). These standards have been issued to retain consistency in products. As an example, ISO/IEC14 standards have been defined for mechanisms such as icons, cursor control, PDA scripts, and software ergonomics to retain consistency throughout various products (Bevan, 2001).

Standards related to Usability There are two standards, ISO 9241-11 (1998) and ISO 13407 (1999), related to usability. ISO 9241-11 provides necessary guidance on how to filter the important information required to evaluate user performance and user satisfaction. However, in terms of defining usability according to ISO standards, there are other definitions in different contexts such as software product evaluation and software engineering product quality. In the field of software engineering, usability is more focused on interface design. Here, usability is defined as, “the capability of the software product to be understood, learned, used and attractive to the user, when used under specific conditions” (Bevan, 2001, p. 537).

The ISO 13407 (1999) ‘Human-centred design processes for interactive systems’ provides guidance on designing a user/human-centred design (UCD/HCD) for a computer-based interactive system. It describes user/human-centred design as a multi-disciplinary approach that improves: user satisfaction, productivity, project planning such as time and resources, user feedback, communication, and teamwork (Bevan, 2001, Jokela et al., 2003).

However, the ISO 9241-11 standard is revised over time. The 2018 revision included additional components such as aspects of user experience and new definitions for effectiveness, efficiency, satisfaction, and context of use (Bevan et al., 2015, DIS, 2009, ISO, 2018b).

An interpretative analysis conducted by Jokela et al. (2003) on the standard of user-centred design, from the viewpoint of the standard definition of usability, shows that ISO 13407 offers only limited guidance for designing usability. These limitations are in areas such as descriptions of user goals and usability measures, and guidance in the process of producing the various outcomes (Jokela et al., 2003, Bevan, 2001, Bevan et al., 2015).

Five E’s of Usability The ISO 9241-11 (ISO, 2018a) standard defines usability as “the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use”. It focuses on the ‘five E’s’ of usability, arranged in order of importance: the technology should be effective, it should be efficient, it must engage the user in its intended outcome, it must tolerate errors by the user or within the technology, and it must be easy to understand and use. Additionally, it must also satisfy the user’s needs.

Effective –

According to ISO 9241 (1998), effectiveness is the degree to which users achieve their specified goals. The definition of effectiveness consists of two principles, completeness and accuracy. Here, it is valued by the success of achieving the user-specified goals with no errors. However, the later update of the ISO 9241 (ISO, 2018b) standard added ‘appropriateness’ as another principle in the definition of effectiveness. Bevan et al. (2015) define appropriateness as avoiding and minimising the risk of errors, and the accuracy and form of the output. As an example, the user interface of an ATM should be, and is, designed differently from a game user interface. The ATM user interface is effective in withdrawing the correct amount, entering the right amount, and displaying simple yet complete and accurate icons and menus; if an ATM interface had more options, it could confuse the end user. In contrast, a game user interface is designed to engage the user for a longer period of time, offer many options, be portable, be ergonomic, and be extendable, for example to be used with many different games.

14 ISO - International Organization for Standardization, IEC - The International Electrotechnical Commission.

Efficiency –

Efficiency is the amount of resources used to reach the goal. According to ISO 9241 (1998), it is calculated by dividing effectiveness by the amount of resources consumed. However, this is only valid as a theoretical measure. The revised ISO 9241 (ISO, 2018b) describes efficiency as the resources used to complete a specific goal. Here, resources can be time, cost, materials, or effort; for example, the time or cost of completing a specific task. An interface for a self-service cash register should be designed differently from a store cash register application. The self-service cash register application might use a hierarchically structured set of navigation, while the store cash register might use non-hierarchical navigation, as it is used frequently by a specific set of users.
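As a worked illustration of the ratio described above (the figures are hypothetical and are not drawn from the ISO standard or from this study):

\[
\text{Efficiency} = \frac{\text{Effectiveness}}{\text{Resources consumed}},
\qquad \text{e.g.} \qquad
\frac{90\% \text{ of tasks completed successfully}}{3 \text{ minutes}} = 30\% \text{ per minute}.
\]

On this reading, a system that achieves the same task completion rate in less time, or with less effort, is the more efficient of the two.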

Engaging –

This is the amount of pleasure and satisfaction gained through the process of completing the task. It can involve audio-visual design elements and user interface characteristics such as graphical user interface icons, logos, colour, visuals, and audio. These elements attract the user, and the likelihood of the user’s engagement depends on these audio-visual elements. However, like all usability characteristics, this too should be designed with the specific context in mind. For example, the design aspects of a social website are different from those of an ATM. A social website’s interface design can have various visual elements such as colourful logos, menus, transition effects, or interactive menus, whereas an ATM might not find such design elements useful in its context.

Error Tolerant –

Here, error tolerant means designing a program to prevent and limit issues caused by the user’s interactions, and to guide the user through any unavoidable errors (Quesenbery, 2001). As an example, prompting the user to fill in certain information with a popup message can limit user interaction errors or act as a guide to prevent future errors.
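As a minimal sketch of this kind of error tolerance (a hypothetical form-validation helper written in Python for illustration; it is not part of the Sati Interactive System):

def validate_age(raw_value):
    """Check a text field and return (is_valid, message) so the interface can guide the user."""
    value = raw_value.strip()
    if value == "":
        return False, "Please enter your age before continuing."
    if not value.isdigit():
        return False, "Age should be a whole number, for example 35."
    return True, "Thank you."

ok, message = validate_age("")   # simulate the user leaving the field empty
if not ok:
    print(message)               # in a real interface this would appear as a popup or inline hint

The design choice here is that the check never simply fails: it always returns a message the interface can show, so the error becomes guidance rather than an interruption.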

Easy to learn –

The interface should be simple and minimal, and should allow the user to apply their prior knowledge (Quesenbery, 2001). This is different from providing the user with instructions for using the application. For example, once you have used a computer, you can apply your knowledge to use a different computer. To achieve this, the designer can use predictable interaction patterns similar to those of other systems.

Satisfaction –

The ISO 9241 (1998) standard defines satisfaction as, “freedom from discomfort, and positive attitudes towards the use of the product”. The updated ISO 9241 (ISO, 2018b) defines it as, “the extent to which attitudes related to the use of a system, product or service and the emotional physiological effects arising from use are positive or negative”.

Context of use –

According to the ISO 9241 (ISO, 2018b) standard, context of use can vary according to: all relevant contexts of use, a single instance, a specified context, or a single individual user context.

Moving from Usability to User Experience The ISO 9241 (ISO, 2018b) standard’s definition uses the word “specified” three times. This means that the conceptual definition of usability can be applied to anything outside the context of HCI, such as a cup, a bottle, or a table. In his System Usability Scale paper, Brooke says that “Usability does not exist in any absolute sense; it can only be defined with reference to particular contexts” (Brooke, 1996, p. 2). In order to specify the usability of an HCI system, it is important to define its intended users, the tasks it is required to perform, and the characteristics of the environment in which it is used.

Current developments in HCI focus more on user experience. It is now understood that usability (as defined above) is not sufficient as a measuring standard for modern HCI systems (Fallman, 2011).

Conventionally and historically, technology was viewed as a tool, but the current view is that technology is ‘an experience’. Culture, emotion, meaning, motivation, engagement, and complexity are also considered necessary aspects for measuring and quantifying modern HCI systems (Bødker, 2006, Bertelsen, 2006, Bertelsen and Pold, 2004, Norman, 2002, Petersen et al., 2004, McCarthy and Wright, 2004). I have used efficiency, effectiveness, engagement, error tolerance, and ease of learning from the Five E’s to develop the Sati Interactive System. How this is applied is further discussed in Factors that influence experience on page 80.

Contemporary HCI focuses on concepts such as: user experience and meaning, leading towards studying interaction between humans, environments, and artefacts. As Fallman (2011, p. 1) explains, modern HCI focuses on “what constitutes a ‘good’ user experience”.

The Development of HCI HCI’s development can be categorised into three sections: the first wave, the second wave, and the third wave (Fallman, 2011, Bødker, 2006, Norman, 2002). I have summarised their discussions below and included their references.

First Wave of HCI The first wave focuses on single user interaction with the computer and the interfaces were comparatively less user-friendly than now. The basis of the first wave was information processing (Fallman, 2011). Fallman says various information processing theories were adopted, such as: Card et al.’s (1983) quantitative predictions model for evaluating different interfaces, and Norman’s (1986) theory of action. Rasmussen and Vicente’s (1989) ecological interface design model for control and support recovery from errors was also influential in developing the first wave of HCI. These theories became the basis of usability driven HCI, allowing the design and testing of systems according to the user requirements.

Second Wave of HCI Fallman (2011) discusses the second wave of HCI as focusing on a broader interaction between the human and the computer. He notes that the second wave’s HCI analysis and design were based on theories such as: Wright’s distributed and external cognition (Wright et al., 2000), Nardi’s activity theory (Nardi, 1996), Ehn’s participatory design (Ehn, 1988), Gaver’s ecological psychology (Gaver, 1991), Heath and Luff’s ethnography and ethnomethodology (Heath and Luff, 1991), and Dourish and Winograd et al.’s phenomenology (Dourish, 2004b, Winograd et al., 1986). Here, the interaction focused on collaboration within well-established groups and communities using a variety of applications (Bødker, 2006). She says that the previously used rigid guidelines, predictive models, and systematic testing were mostly replaced with prototyping, contextual inquiry (Beyer and Holtzblatt, 1997), and participatory design workshops.

Third Wave of HCI The development of technology brings HCI into its third wave. New elements such as experience, emotion, culture, meaning, engagement, complexity, and motivation were considered when designing and analysing HCI (Norman, 2002, Bødker, 2006, Fallman, 2011). Fallman says philosophy, values, cultural analysis, history, aesthetics, and critical theory were also incorporated into HCI design (Bardzell, 2009, Bertelsen and Pold, 2004, Bødker, 2006, Friedman and Kahn Jr, 2007, McCarthy and Wright, 2004, Norman, 2002, Bertelsen, 2006, Petersen et al., 2004). Cultural aspects were analysed through a technological experience (McCarthy and Wright, 2004) or an emotional change (Norman, 2002).

User Experience Here, user experience is defined as the experience gained from the relationship created between the user and the device/technology (McNamara and Kirakowski, 2006). The primary focus is to develop a strong and meaningful bond between the user and the device. According to the ISO 9241-11 (2018a) standard, user experience is a “user’s perceptions and responses that result from the use and/or anticipated use of a system, product or service”. The ISO 9241-11 (2018a) further elaborates on user experience by stating:

“Users’ perceptions and responses include the users’ emotions, beliefs, preferences, perceptions, comfort, behaviours, and accomplishments that occur before, during and after use.”

“User experience is a consequence of brand image, presentation, functionality, system performance, interactive behaviour, and assistive capabilities of a system, product or service. It also results from the user’s internal and physical state resulting from prior experiences, attitudes, skills, abilities and personality; and from the context of use.”

Here are several definitions of user experience from different authors:

User experience is “about how people feel about a product and their pleasure and satisfaction when using it, looking at it, and opening or closing it. It includes their overall impression of how good it is to use, right down to the sensual effect small details have on them, such as how smoothly a switch rotates or the sound of a click and the touch of a button when pressing it.” (Preece et al., 2015, p. 12)

“user experience encompasses all aspects of the end-user’s interaction with the company, its services, and its products.” (Nielsen and Norman, 2019)

“all the aspects of how people use an interactive product: the way it feels in their hands, how well they understand how it works, how they feel about it while they’re using it, how well it serves their purposes, and how well it fits into the entire context in which they are using it.” (Alben, 1996, p. 12)

“the totality of the effect or effects felt (experienced) internally by a user as a result of interaction with, and the usage context of, a system, device, or product.” (Hartson and Pyla, 2012, p. 19)

“UX is a consequence of a user’s internal state (predispositions, expectations, needs, motivation, mood, etc.), the characteristics of the designed system (e.g. complexity, purpose, usability, functionality, etc.) and the context (or the environment) within which the interaction occurs (e.g. organisational/social setting, meaningfulness of the activity, voluntariness of use, etc.).” (Hassenzahl and Tractinsky, 2006, p. 95)

“the entire set of effects that is elicited by the interaction between a user and a product, including the degree to which all our senses are gratified (aesthetic experience), the meanings we attach to the product (experience of meaning), and the feelings and emotions that are elicited (emotional experience).” (Hekkert, 2006, p. 160)

These definitions help identify shared characteristics of user experience:

• Aesthetics of a design is an important aspect.
o The research conducted by Tractinsky et al. (2000) on the relationship between the user’s perception of a computerized system’s beauty and its usability, and Chawda et al.’s (2005) research on ‘attractive things work better’, support this argument.
o De Angeli et al. (2006) argue for the importance of aesthetics in an interactive system and how it influences user engagement.
• User experience involves human emotions/feelings.
o Wright et al. (2008, p. 19) argue that “the key to good aesthetic interaction design is understanding how the user makes sense of the artifact and his/her interactions with it at emotional, sensual, and intellectual levels”.
• User experience involves aspects of the interaction such as: the user’s responses, the designed system/product, its services/purpose, and the context.
o Logan et al. (1994), Hassenzahl (2001, 2003, 2018), Hassenzahl et al. (2000), and Hassenzahl and Tractinsky (2006) argue that HCI should also focus on user needs and values, considering pragmatic aspects such as acceptable results, efficiency, effectiveness, and safety, and hedonic aspects including stimulation (for example, personal growth, an increase of knowledge and skills), identification (for example, self-expression, interaction with relevant others), and evocation (for example, self-maintenance, memories) (Bevan, 2009).
• User experience involves identifying the user’s perspective on the experience gained from using the technology.
o McCarthy and Wright (2005, p. 266) state that user experience can be identified using questions such as, “how the person felt about the experience, what it meant to them, whether it was important to them, and whether it sat comfortably with their other values and goals.”

Models of user experience Here, I discuss the three models developed by Forlizzi and Battarbee (2004) to understand user experience from various perspectives. The three models are: product-centred models, user-centred models, and interaction-centred models. I have summarised their discussions below and included their references.

Product-centred models are useful for both designers and non-designers to design a product that provides a positive experience for the user. Forlizzi and Battarbee (2004) include the types of issues and experiences that must be considered during the process of designing and evaluating the system, whether artefact, service, or environment. Jääskö and Mattelmäki’s (2003) framework of user experience includes the most relevant aspects that impact the human-product relationship in product concept design. Alben’s (1996) criteria for evaluating the quality of experience are: understanding of users, learnable, needed, mutable, effective, appropriate, aesthetic, and manageable.

User-centred models are useful for designers and developers to understand their target customer/user base. These models help us understand the actions and aspects of experience in the user while they are interacting with the product (Forlizzi and Battarbee, 2004). For instance, Abras et al.’s (2004) methods that support user-centred design include: usability engineering, usability testing, discount evaluation, quick and dirty evaluation, participatory design, and heuristic evaluation. Vredenburg et al.’s (2002) survey of user-centred design practice shows the most widely used methods, processes, and key factors in developing a successful user-centred design.

Interaction-centred models focus on the gap between the user and the designer. These models help us understand various aspects of the experience. For example, Dewey’s (2005) theory helps designers understand user experience and the engagement or relationship between the product and the user. Forlizzi and Battarbee say that Wright et al. (2003) also discuss four threads of experience from the designer’s perspective: compositional, sensual, emotional, and spatio-temporal.

These models are used to create frameworks to understand interaction and the experience between the user and the product. The next section discusses the interaction design framework used to design the Sati Interactive System.

Interaction Design Framework There are different types of interaction design frameworks developed according to different perspectives such as person-product centred (Wensveen et al., 2004), design-centred (Bartneck and Forlizzi, 2004), and user-product interaction-centred (Forlizzi and Ford, 2000, Desmet and Hekkert, 2007). Here, I am focusing on a user-product interaction-centred model to design the Sati Interactive System, as it encompasses the necessary aspects to understand the gap between the designer and the user, from a designer’s perspective (Forlizzi and Battarbee, 2004). I discuss how user experience and interaction design are implemented in the development of the Sati Interactive System in Development of the Interaction Design Framework on page 80.

Before discussing user-product interaction, we must first understand which factors influence experience.

What influences experience? User experience is influenced by: emotions, values, prior experience, product features, aesthetic qualities, and usefulness (Forlizzi and Ford, 2000). Forlizzi and Ford (2000, p. 420) say, “The simple way to think about what influences experience is to think about the components of a user-product interaction, and what surrounds it”.

Figure 12 explains Forlizzi and Ford’s influences on experience.

Figure 12 : Influences on experience (Forlizzi and Ford, 2000, p. 420)

Here, Forlizzi and Ford (2000) describe the influences on experience in two categories: user and product.

User – is the influence that each user brings. These influences include their emotions, values, feelings, prior experiences, and cognitive models.

Product – is the influence generated by the product/artefact. These include the features of the product, its accessibility, form language, and aesthetic qualities.

These influences vary depending on social and cultural factors, and the context in which the interaction takes place. For example, designing a classroom for children under the age of 8 is different from designing a classroom for high school students. The objects might have similarities, but how people use them might be different. In the context of the children’s classroom, the tables and chairs would need to fit a certain height; the walls of the classroom would need to display various colours, numbers, or letters; and objects in the classroom would need to be child safe. In the context of the high school classroom, tables would need to accommodate large books, laptops, or other devices; the surroundings would need to accommodate lockers and other equipment necessary for a high school education. As explained, designers should understand what influences the user experience, consider user and product factors, and recognise the context of use. A comprehensive understanding of these aspects will help create a positive user experience.
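As a simple illustration of how a designer might record these influences when planning a system, the sketch below encodes Forlizzi and Ford's two categories plus the surrounding context as a small data structure. This is only an illustrative assumption and not part of the Sati Interactive System; the field names and the example values are mine.

# Hypothetical sketch: recording Forlizzi and Ford's influences on experience
# (user factors, product factors, and the surrounding context) for a design exercise.
from dataclasses import dataclass, field
from typing import List

@dataclass
class UserInfluences:
    emotions: List[str] = field(default_factory=list)
    values: List[str] = field(default_factory=list)
    prior_experience: List[str] = field(default_factory=list)

@dataclass
class ProductInfluences:
    features: List[str] = field(default_factory=list)
    aesthetic_qualities: List[str] = field(default_factory=list)
    form_language: str = ""
    accessibility: str = ""

@dataclass
class ExperienceContext:
    context_of_use: str = ""
    social_and_cultural_factors: List[str] = field(default_factory=list)

# Example: a first pass at describing the intended experience of a relaxation system.
user = UserInfluences(emotions=["stressed"], values=["calm"], prior_experience=["uses a laptop daily"])
product = ProductInfluences(features=["colour", "sound", "movement tracking"],
                            aesthetic_qualities=["soft colours", "slow sounds"])
context = ExperienceContext(context_of_use="home, evening",
                            social_and_cultural_factors=["used alone"])
print(user, product, context)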

User-product interaction User-product interaction is the type of interaction that occurs between the user and the product. Here, I am focusing on the three types of user-product interactions presented by Forlizzi and Battarbee (2004) – fluent, cognitive, and expressive.

Fluent user-product interaction is the most common, automatic, and skilled interaction between a user and a product (Forlizzi and Battarbee, 2004). These types of interactions are mostly done automatically because of prior experience. For instance, fluent user-product interactions can be a regular activity such as driving a car, cooking your usual dish, or washing the dishes.

Cognitive user-product interactions require attention towards the product. These interactions may be similar to interactions the user is already used to, but, in a different situation, the user may be required to focus on the product in order to use it. For instance, using a different currency, ordering food in a foreign country, or using public transportation in a foreign country are a few cognitive interactions.

Expressive user-product interactions create a relationship between the user and the product. These interactions enable the user to personalize, modify, or change the product in order to create their own relationship with the product (Forlizzi and Battarbee, 2004). For instance, changing settings on your mobile phone or personal computer, assembling or fixing a product, or painting old furniture are expressive interactions.

Each of these user-product interactions generate different experiences. Next, I discuss these types of experiences.

Types of experiences User-product interactions create different experiences. These experiences can be categorised as: experience, an experience, and co-experience.

Experience. Forlizzi and Battarbee (2004, p. 263) explain this as “the constant stream of “self-talk” that happens while we are conscious. [it] is how we constantly assess our goals relative to the people, products, and environments”.

An experience is where the user’s behaviour or emotion changes. These types of experiences have a beginning and an end, can be expressed or named, and can generate a sense of completion (Forlizzi and Battarbee, 2004). For instance, listening to a new song, watching a new movie, trying out new food, or trying out a new activity can be listed as an experience.

Co-experience is experience gained from a social context, such as a shared experience (Forlizzi and Battarbee, 2004, Battarbee and Koskinen, 2005), where each individual might gain a different experience. Co-experience can be experienced while using interactive technologies. For instance, winning a multiplayer game can be experienced as an achievement for the winner, but other players might experience it as a loss. An outsider who viewed the game might have a different experience again. These experiences can be affected by each individual’s understanding of the context (Dourish, 2004a). According to Forlizzi and Battarbee (2004, p. 263), “Co-experience reveals how the experiences an individual has and the interpretations that are made of them are influenced by the physical or virtual presence of others”.

Table 8 shows a summary of the framework of user-product interaction and Table 9 shows its relation to user experience developed by Forlizzi and Battarbee (2004, p. 263).

Type of User-Product Interaction | Description | Example
Fluent | Automatic and skilled interactions with products | riding a bicycle; making the morning coffee; checking the calendar by glancing at the PDA
Cognitive | Interactions that focus on the product at hand; result in knowledge or confusion and error | trying to identify the flushing mechanism of a toilet in a foreign country; using an online algebra tutor to solve a math problem
Expressive | Interactions that help the user form a relationship to the product | restoring a chair and painting it a different colour; setting background images for mobile phones; creating workarounds in complex software
Table 8 : Summary of a framework of user-product interaction (Forlizzi and Battarbee, 2004, p. 263).

Type of Experience | Description | Example
Experience | Constant stream of “self-talk” that happens when we interact with products | walking in a park; doing light housekeeping; using instant messaging systems
An Experience | Can be articulated or named; has a beginning and end; inspires behavioural and emotional change | going on a roller coaster ride; watching a movie; discovering an online community of interest
Co-Experience | Creating meaning and emotion together through product use | interacting with others with a museum exhibit; commenting on a friend’s remodelled kitchen; playing a mobile messaging game with friends
Table 9 : User experience developed by Forlizzi and Battarbee (2004, p. 263)

Product experience While Forlizzi and Battarbee (2004) look at user experience from a user-product interaction-based perspective, Desmet and Hekkert (2007) look at experience from an emotion-based perspective. Emotion-based product experience consists of three levels: aesthetic pleasure, attribution of meaning, and emotional response. The three types of product experience discussed below are drawn from Desmet and Hekkert (2007) and Forlizzi and Battarbee (2004). I have summarised their discussions below and included their references.

Figure 13 illustrates Desmet and Hekkert’s (2007) framework of product experience.

Figure 13 : Framework of product experience by Desmet and Hekkert (2007, p. 4)

According to Hekkert, product experience is,

the entire set of effects that is elicited by the interaction between a user and a product, including the degree to which all our senses are gratified (aesthetic experience), the meanings we attach to the product (experience of meaning), and the feelings and emotions that are elicited (emotional experience). (2006, p. 160)

Aesthetic experience is when the user-product interaction causes stimulation through vision, sound, touch, smell, and/or taste. Aesthetic experience mostly focuses on visual stimuli (Tractinsky et al., 2000). Overbeeke and Wensveen (2003, p. 94) say, “The user should experience the access to the product’s function as aesthetically pleasing as possible”. Wright et al. (2008), Black (1998), Battarbee and Koskinen (2005), and Wright and McCarthy (2008) discuss designing towards an aesthetic experience, acknowledging ‘empathy’ and the principle “know thy user” (Wright et al., 2008, p. 10). For example, designing to create in the user the sense of beauty experienced from an artwork, hearing music, or a tailored suit.

Experience of meaning is the experience gained through a personal or symbolic attachment towards a product and through understanding its value (Reinmoeller, 2002); through interpretation and associations (Crilly et al., 2004); or through history (Desmet and Hekkert, 2007). Desmet and Hekkert (2007) say that this type of experience varies due to different factors such as cultural differences, individual perceptions, figurative expressions (van Rompay et al., 2005), and linguistic expressions (Lakoff and Johnson, 1980, Gibbs Jr, 2003, Halliday and Matthiessen, 2006). Govers and Mugge’s (2004) research discusses how people become attached to products that have a similar personality to their own. Experience of meaning is also created by luxury products that provide a luxury experience (Uotila et al., 2005).

Emotional experience is the positive or negative emotions stimulated when using a product. Emotions are considered functional for their ability to influence us towards or away from products, objects, ideas, actions, or people (Frijda, 1986, Desmet, 2002) and a result of an automatic, unconscious, or cognitive process (Desmet and Hekkert, 2007). Roseman and Smith (2001) say that according to the appraisal theory emotions are caused by interpretations of events and situations, rather than the event or situation itself. Appraisal is a significant feature of emotion (Smith and Ellsworth, 1985, Roseman and Smith, 2001, Desmet, 2002, Ellsworth and Scherer, 2003). It is based on each person’s personal preference, therefore two people who appraise the same product or

situation may have a different emotional experience. For instance, some people might have a positive reaction towards rock music and others may have a negative reaction.

Here, this research is used to build a framework to support the initial design process of the Sati Interactive System; this is discussed in Chapter 5: Development Process on page 78. Philosophy of Technology for Human Computer Interaction Understanding the philosophy of technology provides a better understanding of HCI in relation to the Sati Interactive System. Here, Don Ihde’s non-neutrality of technology (Ihde, 1983, 1990, 1993, 1999, 2012) and Albert Borgmann’s device paradigm (Borgmann, 1987, 1993, 1999, 2000) are discussed. These philosophies are related to the context of HCI as the ideas remain contemporary.

Albert Borgmann’s Device Paradigm In his philosophy of technology, Borgmann (1987) argues that we should focus on the ‘good’ in technology rather than the ‘useful’. His focus is more towards user experience than usability. He also says that it is important to focus on the context in which the technology is used, as technology that is ‘good’ and ‘useful’ in one context can be harmful in another. Higgs and Strong (2010, p. 21) discuss this, giving the example:

on the one hand, ambulances save lives and so are eminently useful; on the other hand, cars save us bodily exertion and the annoyances of fellow pedestrians or passengers and are thus, at least in part, a threat to the goods of community and our physical health in the form of exercise.

Technological development has created positive outcomes such as faster communication, online banking, faster data gathering, faster information processing, advanced health science, and remote education. One might argue that the growth of technology and human interaction with computers increases the quality of life. However, Borgmann argues that technology also disengages us from one another, and that modern technology has the power to seduce people, leading them to focus on material goods.

Focal things and Devices The basis of Borgmann’s philosophy of technology is the difference between the ‘focal things’ and ‘devices’.

Focal things Borgmann explains ‘focal things’ as,

things that of themselves have engaged mind and body and centered our lives. Commanding presence, continuity with the world, and centering power are signs of focal things (Borgmann, 1993, p. 119).

He illustrates this idea using a hearth, saying that,

in a pretechnological house the fireplace constituted a center of warmth, of light, and of daily practices. [...] The fireplace often has a central location in the house. Its fire is now symbolical since it rarely furnishes sufficient warmth. But the radiance, the sounds, and the fragrance of living fire consuming logs that are split, stacked and felt in their grain have retained their force. (Borgmann, 1987, p. 196)

Although, the modern household might not have a fireplace, the symbolic centre of the house where people gather can be named as a hearth. Borgmann describes this as,

a place of warmth and activity that encompasses cooking, eating, and living and so is central to the house whether it literally has a fireplace or not. (Borgmann, 1987, p. 197)

Another characteristic of a ‘focal thing’ is that it involves a continuous activity. For example, the kitchen must be maintained in order to prepare food. A fireplace requires cutting trees, and chopping and drying wood, to keep the fire alive. Strong and Higgs (2000, p. 23) add to this by saying,

a focal thing is not an isolated entity; it exists as a material center in a complicated network of human relationships and relationships to its natural and cultural setting.

According to Borgmann, another characteristic of ‘focal things’ is that they have a tendency to unify means and ends. Fallman (2007, p.303) adds to this by saying,

Achievement and enjoyment are brought together; so are individual and community; mind and body; and body and world.

Focal things have value to our heart, can be a necessity to life, and contain human emotions.

Devices Devices are products designed for a purpose, and can be disengaged from focal things, emotions, and context. They are often disposable, mass-produced, can be discontinued, appealing, and fashionable (Fallman, 2011). Devices are technologies developed to be useful and might require very little effort, skill, attention, and patience to be used. However, this creates disengaged technology, without relationship to humans (Strong and Higgs, 2000).

Borgmann describes the shift from ‘focal things’ to ‘devices’ with the example of replacing the pretechnological house fireplace with the central heating system used in a contemporary household. He argues that this central heating device only provides a single service, which is heating. It replaces every other aspect of the fireplace, such as providing a focus for human relationships, family, gathering, and the consequent quality of life. He argues further that technological advancements can detach human involvement, leaving only disengaged materials. Fallman (2011, p. 5) illustrates Borgmann’s views by saying,

humanity has become captive to the promises of modern technology, whose devices keep demanding less and less human input. The shift from being involved and engaged with focal things to disengaging consumption of devices, then, frustrates the deeper aspirations of life.

However, Borgmann does not reject the fact that technology can be good. In the illustration of the fireplace, he also argues that the technological development of the central heating system has enabled us to focus on other interesting interactions by saying,

Technology, as we have seen, promises to bring the forces of nature and culture under control, to liberate us from misery and toil, and to enrich our lives. […] the parlance is convenient and can always be reconstructed to mean that implied in the technological mode of taking up with the world there is a promise that this approach to reality will, by way of the domination of nature, yield liberation and enrichment. (Borgmann, 1987, p. 41)

Borgmann argues that this promise of technology has affected our views, thus leading us to believe that a technology-based life is a ‘good life’.

Another example of this is the use of the modern mobile phone. While wired telephones helped us connect with the world, the modern mobile phone, with its promise of a connected world, has equally disengaged us from each other through access to disembodied and displaced media.

This is the irony of technology, according to Borgmann—the promise of technology is both tempting and disengaging. Rather than being technologically enriched by the devices we buy, use, and eventually dispose of, we tend to find ourselves disengaged, diverted, and distracted. (Fallman, 2010, p. 58)

And Higgs and Strong (2010, p. 31) say

The good life that devices obtain disappoints our deeper aspirations. The promise of technology, pursued limitlessly, is simultaneously alluring and disengaging.

Technology can enhance life as well as disconnect us from each other; devices, as defined by Borgmann, are technologies designed for a purpose with a promise to enrich our lives. Technology is supposed to enhance the development of mankind, to improve ‘good life’. However, not all technological advancements lead to this.

The device paradigm This disengagement between the user and the device happens because technology hides its working mechanism (Borgmann, 1987, Strong and Higgs, 2000). The advancement of technological devices makes it hard for the user to understand how the technology is built or how the device works. For example, it is simple to understand how a fireplace works, however, understanding the mechanism behind a central heating system is somewhat complicated. Here, the user might only need to know and understand that a controller knob increases or decreases the amount of heat provided. The background of its technology, such as how the heat is generated

and distributed, is hidden, showing that modern technology tends to hide its means but not its ends. This disconnection of means and ends may lead to a disengagement or an emotionless connection between the technology and the user.

Three types of Information In the device paradigm, Borgmann (1999) also emphasizes the user’s experience of reality and how information and communication technology affects it. He names three types of information regarding the connection between humans and reality: natural information, cultural information, and technological information.

Natural information emerges of itself, intimates rather than conveys its message, and disappears. Cultural information, to the contrary, is wrested and abstracted from reality, carries a definite content, and assumes an enduring shape. Before it can be encountered, it has to be produced, the signs that are used to extract and convey information must have evolved. (Borgmann, 1999, p. 59)

Borgmann further clarifies Technological information with the following discussion.

Technological information could simply be defined as the object of information technology. But we can be more explicit and define it structurally as the information that is measured in bits, ordered by Boolean algebra, and conveyed by electrons. (Borgmann, 1999, p. 166)

Technological information, moreover, is typically written, read, and copied by unfailing and objective copyists machines. (Borgmann, 1999, p. 194)

Borgmann focuses on technological information, as it is related to the advancing information technology. He also argues that technological development pushes us further away from reality. Current examples of this include virtual spaces such as: Facebook, Instagram, Fortnite, and World of Warcraft, where people spend time away from reality. Fallman quotes Borgmann as saying,

Postmodern technology uses the hyperreality of simulations to get rid of the limitations imposed by reality. The limit of postmodern reality is not the total objectification of nature, but the replacement of reality by virtual reality totally under our control. The objects of reality disappear to the extent that we as subjects gain control over them, but we are similarly reduced to ‘a point of arbitrary desires’. (Fallman, 2011, p. 5)

Borgmann argues that the advancement of technology has created ‘hyperrealities’ that lack engagement with reality. We control these created realities and engage with them for their inherent experiences. However, this disconnects us from reality, rather than expanding our connection with reality and each other.

Agreeing with Borgmann’s views, Fallman (2007) says that,

Borgmann’s prophecy seems to be that we have become mesmerised by the promises of modern technology […] whose devices demand less and less of our own skills, efforts, patience, and risk. But in this shift from engagement with focal things and practices to disengaged consumption of devices, his fear is that we have come to disappoint our own, deeper aspirations. Rather than the promise of technological enrichment and consumption, we have come to find ourselves disengaged, diverged, and distracted, and—ultimately—lonely. (Fallman, 2007, p. 305)

In conclusion, Borgmann uses the device paradigm to explain the effects of technological developments on Human Computer Interaction. Here, he emphasises how devices and information technology have moved away from focal things, reality, and human engagement, thus changing the ‘good life’. Focal things, which hold value to our hearts and are a necessity to life, seem to be fading away. Devices, in their attempt at being useful, can disengage relationships between humans and with nature.

Don Ihde’s Non-neutrality of Technology Don Ihde’s philosophy of technology is based on the non-neutrality of the technology-mediated experience. In his books Technology and the Lifeworld: From Garden to Earth (Ihde, 1990) and Technics and Praxis (Ihde, 1978), he argues that for every development in technology, there is also an aspect that is decreased. For example, the development of optical technologies such as microscopes, binoculars, and telescopes has opened access to different worlds which were beyond the reach of the human eye. A telescope increases long-distance vision while decreasing short-distance vision. Ihde writes:

For every enhancement of some feature, perhaps never before seen, there is also a reduction of other features. To magnify some observed object, optically, is to bring it forth from a background into a foreground and make it present to the observer, but it is also to reduce the former field in which it fit, and—due to foreshortening—to reduce visual depth and background. (Ihde, 1979, p. 111)

Ihde also argues that this enhancement of some features and reduction of others belongs to all types of technologies; he calls this non-neutral transformation (Ihde, 1990). His argument is that even a technology such as eyeglasses has the characteristic of being non-neutral. He points out that even though eyeglasses improve vision, they also come with a cost. Here, cost implies the requirement of maintenance. The glasses require frequent maintenance, such as cleaning, to avoid damage and a disrupted view. Similarly, eyeglasses are somewhat fragile, and they do not fit every occasion, such as swimming or underwater activities. Dust, water, and sun glare not only disrupt the view, they give the user a fringe awareness that their vision is disrupted and enframed:

Now I have to care for my glasses or my contacts and the paraphernalia which go with them. But more, particularly with eyeglasses, my world now comes to me enframed, intruded upon, perhaps almost imperceptibly, with the backglares that occur, with dust or water spots that accumulate, and with a fringe awareness that the world seen through the lens is, however slightly, changed. (Ihde, 1990, p. 49)

However, according to Ihde, the non-neutral behaviour of technology changes with different types of technologies. According to Fallman (2010), technologies such as eyeglasses and telescopes enhance and reduce certain aspects of the technological experience, whereas technologies such as clocks and speedometers behave differently, as they provide an interpretation of reality rather than a direct view of it. While eyeglasses or a telescope enhance vision, which can be called a direct interpretation of reality, a speedometer represents a value, the speed, which needs to be interpreted using other knowledge. Ihde argues that this behaviour is also non-neutral. The thrilling experience of ‘speed’ is represented by a figure. Here, the technology of the speedometer provides information regarding speed, such as kilometres per hour. However, this information must be translated using previously obtained knowledge. Ihde writes:

for every revealing transformation there is a simultaneously concealing transformation of the world, which is given through a technological mediation. Technologies transform experience, however subtly, and that is one root of their non-neutrality. (Ihde, 1990, p. 49)

Furthermore, Ihde separates the human-technology relationship into four categories according to their behaviour: embodiment, hermeneutical, alterity, and background relations (Ihde, 1990).

Ihde’s human-technology relations Embodiment relations

Embodiment relations are created by technologies that allow the user to experience reality through the technology, which becomes a part of the user. For example, eyeglasses allow the user to see the world through them. Here, the human-technology relationship becomes transparent, which Ihde categorises as a profound existential relationship. However, it takes time to generate an embodiment relation between the human and the technology. For example, it takes time for new users to get used to wearing eyeglasses. At first the user might experience their weight, headaches, and eyestrain. When the user gets used to them, the embodiment relationship is created.

Hermeneutical relations

A hermeneutical relationship does not enhance the user’s vision, senses, or abilities. It is a relationship created by technological instruments which provide indirect information that must be interpreted using previously obtained knowledge. For instance, the speedometer and the clock provide information which the user interprets and perceives. Here, the knowledge of reading and understanding speed and time is required. The user focuses on the device that provides the information. This device or technology does not directly enhance the user’s connection with the world; rather, it disengages the user from the world. Fallman writes:

When reading a speedometer, one’s perceptual focus is not on the world but on the technological instrument, where one perceives the instrument itself rather than the object being referred to in the world. Hermeneutical

instruments do generally neither enhance any of their users’ innate capabilities or senses, nor are they meant to become invisible; quite the opposite, they are often designed to be objects of focus and, as a result, the world tends to withdraw from their users. (Fallman, 2011, p. 7)

Additionally, the user might completely rely on the acquired information from these technological devices, hence it is important that the user has the proper knowledge to understand the information and, in some cases, understand its functionality.

Alterity relations

An alterity relation is a primary relationship created between the human and the technology, also referred to as a relationship between the human and a quasi-otherness (Ihde, 1990). According to Ihde, alterity can be seen in the relationship between a human and a computer. For instance, playing a computer game creates embodiment relations (using the hand to control the keyboard and mouse) and hermeneutical relations (understanding and living in the world created in the game), as well as alterity relations. Here, the alterity relation is created when the player is interacting with a quasi-otherness, i.e., the computer game object appears as another individual to the player. Ihde describes this relation as an experience created from the interaction:

There is the sense of interacting with something other than me, the technological competitor. In competition there is a kind of dialogue or exchange. It is the quasianimation, the quasi-otherness of the technology that fascinates and challenges. I must beat the machine or it will beat me. In each of these cases, features of technological alterity have shown themselves (Ihde, 1990, p. 100).

Background relations

Background relations provide structure to the experience. These relations do not involve a direct human-technology relation in the way embodiment, hermeneutical, and alterity relations do. Digital technologies, electronic devices, and automated machines are a few examples of technologies with a background relation. Ihde states,

Different technologies texture environments differently. They exhibit unique forms of non-neutrality through the different ways in which they are interlinked with the human lifeworld. Background technologies, no less than focal ones, transform the gestalts of human experience and, precisely because they are absent presences, may exert more subtle indirect effects upon the way a world is experienced. There are also involvements both with wider circles of connection and amplification/reduction selectivities that may be discovered in the roles of background relations (Ihde, 1990, p. 112).

For instance, the central heating system of a house provides warmth and creates a comfortable atmosphere, while its main machinery works in the background to produce heat. Here, the technology remains in the background; therefore, it is a background relation.

Relevance to the project Here, I am explaining why I have chosen Borgmann and Ihde’s views, and how their philosophies contribute towards modern HCI within the context of my work.

According to Fallman (2011, p. 8), when

compared with many other philosophers, they appear attractive to HCI in that they deal directly with today’s technologies. At the same time, they sustain strong links to earlier philosophy, for instance in Borgmann’s case with the dystopian undertones of Heidegger and Ihde’s somewhat instrumental approach and the pragmatism of Dewey.

And Mitcham (1994, p. 77) says “Ihde argues against any view of technology as neutral or pure transparency with utopian possibilities”.

Borgmann and Ihde linked their philosophies to HCI; therefore their philosophies may be applied to modern technology. Examples include: human-technology relations and data visualization (Hogan and Hornecker, 2011), hypertext fiction reading where digital technology replaces books (Mangen, 2008), and human-electricity relations treated as Ihde’s human-technology relations (Pierce and Paulos, 2011).

Other researchers have used Borgmann and Ihde’s philosophies to discuss their views on HCI. For instance, Fallman (2010) discusses how Borgmann’s work raises social, cultural, moral, and ethical issues in modern HCI and how it can be used to design interactive systems that connect with reality. Verbeek (2002) discussed how Borgmann’s philosophy of technological devices and information technology helps analyse technology as a technological artefact, rather differently to the transcendentalism movement in early philosophies. Leshed et al. (2008) use Borgmann’s device paradigm to discuss commodification and de-skilling issues of an in-car GPS navigation technology. Ermer (2009) analysed Borgmann’s philosophy in order to create recommendations for the engineering curriculum. Verbeek (2008) used Don Ihde’s human-technology relations to study three types of ‘cyborg intentionality’ involved in human-technology relations.

To summarise, Borgmann and Ihde’s philosophies are useful to this project because:

• They focus more on user experience than usability – the ‘good’ in technology rather than the ‘useful’ (Borgmann, 1987).
• The focus towards ‘focal things’ – understanding the value of emotional attachment (Borgmann, 1987, Borgmann, 1993).
• Focusing on the positivity of technology – a technology-based life is a ‘good life’ (Borgmann, 1987, Strong and Higgs, 2000, Higgs and Strong, 2010).
• Understanding the negativity of technology – the promises of technology (Borgmann, 1987, Fallman, 2011).
• Understanding that for every development in technology, there is also an aspect that is decreased (Ihde, 2012, Ihde, 1979, Ihde, 1999).
• Understanding types of human-technology relations and their value towards developing technologies (Ihde, 1990).
How I use Borgmann and Ihde’s philosophies to develop the Sati Interactive System is discussed in Development Framework and Philosophies on page 105.

To conclude, Borgmann and Ihde’s philosophies of technology can be related to both past and modern technologies. Many researchers have used their philosophies as a foundation or a reference. These philosophies have helped HCI to move from a traditional usability-based principle towards a user experience-based model. However, it is important to understand that technology always changes, and so will HCI and its philosophical paradigms. Similar Applications to the Sati Interactive System This section discusses relaxation- and mindfulness-based computer interactivity applications. This includes mobile and screen-based applications within the frame of the thesis.

The search for these applications was conducted using keywords such as ‘relaxing’, ‘relaxation apps’, ‘relaxation games’, ‘mindfulness’, ‘sati’, and ‘interactive relaxation’. Here, I have used Google search, Steam, Google Play, and Apple’s App Store as the search engines because they are the most popular computer and mobile application databases. The search was limited to the top hits, filtered by most relevant.

The table below is a summary of the search.

Application | Database | Interactive experience | Link
Deep Meditation: Relaxation & Sleep Meditation App | Google Play | Mobile experience using serene music, white noise, nature sounds, and teaching meditation techniques. | https://play.google.com/store/apps/details?id=org.deeprelax.deepmeditation
Relax Melodies: Sleep Sounds | Google Play | Mobile experience using nature sounds, white noise, guided meditations, brainwaves, body-mind exercises and breathing techniques. | https://play.google.com/store/apps/details?id=ipnossoft.rma.free
Brain Yoga Brain Training Game | Google Play | Mobile experience using puzzles and relaxing music. | https://play.google.com/store/apps/details?id=com.megafaunasoft.brainyoga
The Mindfulness App: relax, calm, focus and sleep | Google Play | Mobile experience using mindfulness training, meditation guides, bell and nature sounds. | https://play.google.com/store/apps/details?id=se.lichtenstein.mind.en
Shinrin-yoku: Forest Meditation and Relaxation | Steam | VR experience using forest environments, realistic interactive wildlife and nature, and immersive sounds. | https://store.steampowered.com/app/774421/Shinrinyoku_Forest_Meditation_and_Relaxation/
River Relaxation VR | Steam | VR experience – simulation of calmly floating down a river in a kayak. | https://store.steampowered.com/app/938760/River_Relaxation_VR/
Just Sleep - Meditate, Focus, Relax | Steam | Windows experience – Guided meditations, white noise, weather and nature sounds, and particle animations. | https://store.steampowered.com/app/1093150/Just_Sleep__Meditate_Focus_Relax/
Headspace: Meditation & Sleep | App Store | Mobile experience – Teaching meditation using animations and sounds. | https://www.headspace.com/headspace-meditation-app
The Mindfulness App - meditate | App Store | Mobile experience – Guided meditation practice using audio and visual aids. | https://apps.apple.com/us/app/the-mindfulness-app-meditate/id417071430
My Oasis: Relaxing Clicker App | App Store | Mobile experience – Idle tap game designed using graphics and music to provide a mellow and emotional experience. | https://apps.apple.com/au/app/my-oasis-relaxing-clicker-app/id1247889896
Wildfulness: Meditate & Relax | App Store | Mobile experience – trying to help calm the mind using illustrated imagery and immersive 3D nature sounds. | https://apps.apple.com/au/app/wildfulness-meditate-relax/id1082150954
myNoise | Website | Windows experience – using relaxing nature sounds with The Ultimate Noise Machine that lets the user change frequency levels. | https://mynoise.net/
Beautiful Games - For Relaxation & Stress Relief | Website | Windows experience – using interactive 360-degree camera rotations with 3D environments and sounds to create a relaxing state. | https://www.free-training-tutorial.com/mindfulness.html
Sway | App Store | Mobile experience – interactive meditation experience using the phone to track user movements and provide feedback to help gain focus and improve attention. | https://apps.apple.com/us/app/sway-mindfulness-in-motion/id1200737413?ls=1
Table 10 : Similar relaxation and mindfulness applications from Steam, Google Play, and App Store.

These applications provide a similar experience to the Sati Interactive System. However, most of the apps only use visual and sound elements to stimulate the mind. The VR technology apps use visual, sound, and movement interactivity to create a full body stimulation. Nevertheless, VR technology has its limitations (see Limitations in

Virtual Reality technology on page 26). The Sati Interactive System is different from these applications because it:

• combines colour, sound, and body movements to stimulate a full-body experience and create a sense of deep engagement;
• is developed based on in-depth research in colour, sound, and body movements;
• is designed as a low-cost system; and
• supports both Windows and Mac operating systems.
Conclusion This literature review covers seven topics: Sati; Colour and Sound; Art and Transformative Experience; Computer-based Interactivity; Interactive Technologies; Human Computer Interaction (HCI); and Philosophy of Technology for Human Computer Interaction.

Sati is being aware and paying attention to the present moment. Anticipated benefits of Sati include the ability to: relax/calm your mind, help focus on a single objective, deal with stressful situations, and control emotions. Sati does this by training the mind to focus on a single thing.

Colour and sound are also known to affect human emotions; however, they do this passively, in contrast to Sati, which requires action by the user. Here, I use colour and sound to generate positive emotions in the user. Movement is used as a tool to engage the user while interacting with Sati Interactive.

An artwork creates a transformative experience, an emotional change in the user, and here colour, sound, and movement are considered the main elements of any artwork. Sati Interactive System uses these three elements to create the transformative experience of an artwork without creating an artwork per se.

Regarding computer-based interactivity I focused on six concepts/principles to develop the Sati Interactive System: directionality, responsiveness, awareness, selectivity, interchangeability, and real-time communication. Integrating these concepts in the Sati Interactive System can improve the interaction between the user and the system.

Interactive technologies have benefits such as improving learning, providing remote interaction, enhancing engagement, and providing real-time feedback to the user. But interactive technologies also have limitations, such as the cost of development and equipment, a lack of haptic feedback, and possibly requiring specialist assistance to use and/or set up. This knowledge is useful in developing the Sati Interactive System because it helps select the necessary technologies. The Sati Interactive System is a low-cost computer-based interactive system that requires only a computer, a webcam, and speakers; these technologies are often available in any home-based computer system.
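To illustrate why a webcam alone can serve as the movement sensor in such a low-cost setup, the sketch below estimates a user's movement level by frame differencing with OpenCV. This is only an illustrative assumption about one possible approach; it is not the Max patch that implements the Sati Interactive System, and the parameter values are arbitrary.

# Illustrative sketch (not the Sati Interactive implementation): estimating a
# movement level from a webcam by comparing consecutive greyscale frames.
import cv2

def movement_level(prev_gray, gray, threshold=25):
    # Fraction of pixels that changed noticeably between the two frames (0.0-1.0).
    diff = cv2.absdiff(prev_gray, gray)
    _, mask = cv2.threshold(diff, threshold, 255, cv2.THRESH_BINARY)
    return cv2.countNonZero(mask) / mask.size

def main():
    cap = cv2.VideoCapture(0)                      # default webcam
    ok, frame = cap.read()
    if not ok:
        raise RuntimeError("No webcam frame available")
    prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        print(f"movement: {movement_level(prev_gray, gray):.3f}")
        prev_gray = gray
        cv2.imshow("webcam", frame)                # window so 'q' can stop the loop
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    main()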

Regarding HCI, models of user experience, user-product interaction, types of experience, and interaction design framework provide knowledge used to develop an interaction design framework that can enhance the user interaction with the Sati Interactive System.

Borgmann and Ihde’s philosophies of technology focus on usability and the ‘good’ in technology, on technology developed as a ‘focal thing’ that considers the value of emotional attachment, and on understanding the relationship between humans and technology. Integrating this knowledge in the Sati Interactive System can enhance its relationship with the user.

This information helped develop the Sati Interactive System, indicating: what devices to use, how to enhance interactivity, how to develop an interaction design framework, how to enhance the relationship between the device and the user, how to integrate colour, sound, and movement with computer-based technology, and how to create a transformative experience.

Figure 14 : Sati Interactive System conceptual development process

The figure above shows the Sati Interactive System’s conceptual development process. The initial idea was to create a positive emotional experience. I used the concept of Sati, being aware and paying attention to the present moment, to create this positive emotional experience. I am creating the transformative experience of an artwork without creating an artwork. To create this transformative experience, I am using colour, sound, and movement, the main elements of an artwork. Colour and sound are used to create positive emotions, and movement is used to engage the user in the Sati Interactive System. I considered these ideas through discussions on HCI, Computer-based Interactivity, Philosophy of Technology, and Interactive Technologies. The culmination of this approach resulted in building the low-cost computer-based interactive system, the Sati Interactive System. A detailed discussion on how these elements blended and influenced the development of the Sati Interactive System is in Chapter 5: Development Process on page 78.
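As a simple illustration of how a movement level could drive colour and sound in one loop, the sketch below maps a normalised movement value to a colour and a pitch. The mapping and the numbers are invented for demonstration; they are assumptions, not the mapping used in the Sati Interactive System, which is built in Max.

# Hypothetical sketch: mapping a normalised movement level (0.0-1.0) to a
# colour and a pitch, to show how colour, sound, and movement could be linked.
import colorsys

def movement_to_colour(level: float) -> tuple[int, int, int]:
    # Drift from a calm blue (low movement) towards a warm red (high movement).
    level = max(0.0, min(1.0, level))
    hue = 0.6 * (1.0 - level)                      # 0.6 = blue, 0.0 = red
    r, g, b = colorsys.hsv_to_rgb(hue, 0.6, 0.9)
    return int(r * 255), int(g * 255), int(b * 255)

def movement_to_pitch(level: float, base_hz: float = 220.0, span_hz: float = 220.0) -> float:
    # Map movement onto a frequency range of 220-440 Hz.
    return base_hz + span_hz * max(0.0, min(1.0, level))

if __name__ == "__main__":
    for level in (0.0, 0.25, 0.5, 1.0):
        print(level, movement_to_colour(level), movement_to_pitch(level))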

The computer-based Sati Interactive System is developed based on the knowledge gained from: concepts of computer-based interactivity, limitations of computer-based technology, HCI, user experience, interaction design frameworks, and the philosophy of technology for human computer interaction. Here, selected concepts of computer-based interactivity, HCI, and user experience are used to develop an interaction design framework.

The use of computer-based interactivity in various fields was discussed to find its benefits and limitations. These limitations helped the development process of the Sati Interactive System. Following Borgmann’s distinction between focal things and devices, the Sati Interactive System is developed as a focal thing; something that has value to our hearts, can create the experience of an artwork, can create a sense of deep engagement, and can generate positive human emotions.

The Sati Interactive System intends to provide: a low-cost computer-based interactive system; an arts-based approach to designing the system; and flexibility in its uses. Currently there are few similar low-cost computer-based interactive systems; few computer-based interactive systems that are explicitly designed to create a sense of deep engagement; and few computer-based interactive systems that explicitly take an arts-based approach in their design and are explicitly designed to be flexible in their uses. These are the gaps the Sati Interactive System is attempting to fill. To sum up, developing a system that creates this sense of deep engagement through an HCI and arts-based perspective may add new knowledge to the field of computer-based interactive arts.

Chapter 3: An Arts-based Approach: Sati Interactive System, Deep Engagement, and Transformative Experience We can assume that an artist creates an artwork to create a sense of deep engagement and a transformative experience in the user; this is often done by the artist making a personal expression that they wish to infect the audience with, with reference to Tolstoy. When I listen to a song, watch a movie, or see an artwork, sometimes I feel deeply engaged with it. This sense of deep engagement is transformative. For example, the movie ‘The Art of Racing in the Rain’ (Curtis, 2019) deeply engaged me, and I felt different emotions such as happiness, sadness, and excitement. Feeling these emotions had a transformative effect. This particular experience is possibly unique to me, and different artworks may create similar responses in others. It can be assumed that most artworks are made to create a response in the viewer, and that most viewers attend to artworks in order to engage with and experience a response to them. Below I discuss how art and the Sati Interactive System affect transformative experiences.

Unlike the normal creation of an artwork, where the creator intends the perceiver to understand their efforts through the lens of ‘art’, the Sati Interactive System does not require that the viewer perceive it as a work of art. Instead, it tries to evoke the experience of an artwork without being an artwork. This is different to Duchamp’s ‘Fountain’ (2020e) and any other “readymade” art object, where the viewer is required to perceive an everyday object, such as the urinal, as if it were a work of art.

My research does not attempt to take a philosophical or theoretical position within the field of aesthetics. I am focusing solely on the making of the Sati Interactive System that creates an experience of an artwork. An artwork may inspire a philosophical thought and discussion, but to quote Haddenhorst (1971),

A work of art is not a picture or reproduction of something; it is something. It is a whole new thing, an invention never seen before. If a painting, "before being a war horse, a female nude, or some anecdote, it is essentially a flat surface covered with colors assembled in a particular order," as Maurice Denis expressed it early in this century. If a sculpture, it is an arrangement of forms and spaces. One may or may not find the arrangement pleasing, he may or may not like or agree with the message; but even to get the message, the art work must be viewed with candor and without the preconceived idea that it must moralize or tell a story or have a literal interpretation.

This implies that an artwork need not be viewed from or through a philosophical or theoretical perspective, but rather as an object in itself that does not moralise, theorise, philosophise, or tell a story. The Sati Interactive System intends to be such an object.

The term “transformative experience” is used to express different meanings in different contexts: for example, epistemologically transformative experiences arising from life-changing decisions (Paul, 2014), “situations where learning leads to an aesthetic transformation of students’ everyday experience” (Pugh, 2004, p. 183), and the interaction between a person and the world (Dewey, 1934). However, here I use the term “transformative experience” to refer to the emotional change that happens in our mind and the physical changes that happen in our body when we interact with an artwork. The Sati Interactive System improves awareness of the present moment, and through that it creates a transformative experience. The term “transformative experience” expresses the idea I am trying to convey better than the term “change”.

Feelings and emotional changes generated by an artwork

These feelings, or the emotional changes evoked by art, are the transformative experience of art. Professor of Psychology Shimamura (2015, p. 2) says,

These days art can evoke a variety of emotions, such as sadness, anger, surprise, and even disgust. It can corral our sensory experience through artistic balance and form. It may even deny emotions altogether and make us think– such as when it reminds us of our cultural background or forces us to think about the world in a different way. By this view, art offers a wonderful opportunity to gain from our senses, learn new things, and become emotionally engaged.

Art historian Christensen (2017, p. 3) discusses how Rothko’s art interacts with the viewer in perceptual, ethical, or political ways, creating an emotional impact. She says,

Performative art is approaching the viewer's own self directly, which means that the viewer has to reflect on his/her [below, feminine gender only] experience encountering the artwork and has to question her personal feelings and actions exerted in this encounter. In this way, it could be argued that Rothko's art is performative in the sense that it is presuming a self-reflecting viewer taking part in a dialogue with the artwork pointing forward, even, towards contemporary art forms such as interactive installation art.

These emotional changes or feelings aroused by art are the transformative experience that I refer to. It can be evoked by artworks such as paintings, sculptures, dance performances, theatre, movies, and sounds.

Creative industries, social, and cultural studies researchers Franklin and Sansom’s (2018, p. 30) set of museum aims states that they wish to, “Transform visitors lives and consciousness: To provide a transformative experience - to change visitors’ lives, in some way.” They also discuss how art creates a transformative experience. Arts educator, Gutiérrez-Vicario (2016) agrees with the transformative experience of art while discussing the benefits created by art and artmaking.

Professor in Sociology and Criminology, Maggie O’Neill (2008, p. 9) explains the transformative role of art, saying

Art is a “feeling form” created in the tension between sensuous knowing, the playfulness and creativity of the artist and the historically given techniques and means of production. Art is a social product not just a reflection on its social origins and it manifests its own specificity-it is constitutive. Art makes visible experiences, hopes, ideas; it is a reflective space and socially it brings something new into the world-it contributes to knowledge and understanding.

Applied neuroscientist Preminger (2012) discusses art and its effects on the brain, such as long-term neurocognitive changes, saying

Art is a medium of inducing experiences. Artistic experiences can be a vehicle to convey meanings, a way to provide pleasure, or means for self-expression and communication. Every artwork leads to a mental experience by the observer, participant, or experiencer.

Art as Text

Rainer Maria Rilke’s poem ‘Archaic Torso of Apollo’ (2020a, Bernstein, 2019) describes the Miletus Torso. The poem is Rilke’s account of how seeing the statue transformed him, inspiring him to change his life; in fact, the last sentence of the more formal translations is “You must change your life/You have to change your life”. These formal translations are similar to the Google translation. However, the informal translation of the last phrase is “the way you live”.

Different translations of the original German poem can convey different meanings. The first four translations below come from Echopoetics: A Brief History of Poetry from Sappho, Blake, and Baudelaire to Stein, Morris, and Scalapino (Bernstein, 2019). These translations of the second stanza of the poem indicate how different meanings can be derived from the same text. The original German text reads,

sich hält und glänzt. Sonst könnte nicht der Bug der Brust dich blenden, und im leisen Drehen der Lenden könnte nicht ein Lächeln gehen zu jener Mitte, die die Zeugung trug.

C.F. MacIntyre’s translation of the second stanza of the poem is,

restrained and shining. Else the curving breast could not thus blind you, nor through the soft turn of the loins could this smile easily have passed into the bright groins where the genitals burned.

Stephen Mitchell’s translation of this stanza is,

gleams in all its power. Otherwise the curved breast could not dazzle you so, nor could a smile run through the placid hips and thighs to that dark center where procreation flared.

Sarah Stutt’s translation of this stanza is,

still persists and gleams. If this were not so, the bow of his breast could not blind you, nor could a smile, steered by the gentle curve of his loins, glide to the centre of procreation.

A looser translation of this stanza is,

incandescent light radiates from his torso, and in the curve of his loins, a smile turns towards the centre of creation.

The Google translation of this stanza is,

holds and shines. Otherwise the bug couldn't your chest blind you, and in a gentle twist the loins could not smile to the middle that bore the conception.

The first translation and the last translation give very different understandings of the text and therefore different understandings of the statue. These translations can be found in Appendix 10 on page 143.

If we take Peirce’s (1865, p. 6) statement that “words are ideas in the mind of the speaker”, we can say that a poem such as Rilke’s is an assemblage of ideas. Each of these translations conveys different ideas. Rilke assembled his ideas to express what he felt when he saw the statue, but each of the five translations can give the reader a different experience. When I read these translations, they had a transformational effect on me. I felt the presence of a great human with strength; this presence gave me a sense of power; this power motivated me to be a better person; and this motivation brought me a positive feeling and a ‘can do’ attitude.

When looking at the statue itself I felt as if I were looking at an old general or commander, a strong character who had power and a great reputation. It also gave me a sense of strength, and it made me feel like I should start working out again. An image of the statue is shown below.

Figure 15: Archaic Torso of Apollo Statue (Bernstein, 2019)

Then when considering the statue through Rilke’s interpretation, my experience when looking at the statue was reinforced. Rilke and I had similar experiences.

These engagements with the statue as a work of art and Rilke’s poem as a work of art show similar responses, but with some differences. For example, when I saw the statue I felt that it was of an old general or commander from the past. However, when I read Rilke’s interpretation, I did not feel any connection towards the past or that it was a statue of a war hero.

Each of these responses transformed me by having me think intellectually about how I respond to art, rather than responding purely instinctually and primitively when experiencing art. However, sometimes when listening to music a purely instinctual and primitive response is best; for example, when listening to Ace Of Spades by Motörhead (Motörhead, 1980).

Art as Static Image

Art has the ability to transform emotions, for example the transformations that happened when looking at the Archaic Torso of Apollo statue and listening to Ace Of Spades. The Sati Interactive System creates this transformation through engagement. Artists and researchers have discussed this transformation. Philosopher Stephen Snyder (2018, p. 2) says,

when an object is recognized as art, a transformation occurs that changes how we relate to it, and this is always linked more to a transformation of thought than of perception itself.

Here, the object recognised as art is what the user is led to create through their engagement with the Sati Interactive System. The transformation occurs when the engagement happens. Snyder (2018, p. 24) also says,

An object of art is more than the mere matter from which it is made: it embodies the struggle of spirit’s metamorphosis of the world.

This metamorphosis is the transformation that art invokes. The artwork provides this metamorphosis/transformation in the moment of experiencing the artwork. The Sati Interactive System intends to provide this momentary transformation. Of course, the transformation may reside in the participant as a memory of an experience, or more deeply.

Dancer Valerie Durham (2012) also believes in the transformative ability of art. Durham (2012, p. 30) says,

I believe part of the reason people spend their time engaging with art is because of its transformative ability. It can affect our thinking, raise our consciousness, and evoke our feelings. I wondered can it also change our movements and our physical actions?

John Dewey (1934, p. 3) says,

the actual work of art is what the product does with and in experience.

The Sati Interactive System creates the transformative experience with and in the act of engagement and participation with it.

As discussed under the headings on Colour, Sound, and Movement on page 6, these elements can have an effect on emotions. Different colours are associated with different emotions. For example, the colour red is known to be associated with excitement, warmth, energy, passion, fury, anger, blood, intensity, and violence.

Rothko’s (2020f) painting, shown below, deeply engaged me and transformed my mood. For a moment I felt as if I could not take my eyes off the image of the painting. I believe this is because of the uniqueness of the colours that Rothko has used. I also had mixed emotions such as warmth, hunger, intensity, nervousness, and frustration.

Figure 16: Rothko’s painting – untitled 1968 (2020f)

Visual artist Kosoi (2005) expresses her thoughts on Rothko’s paintings by saying

I believe that any observer cannot fail to notice that these paintings [in Rothko’s Chapel] include anxiety rather than delight. (Kosoi, 2005, p. 26)

She also says,

Many critics have mentioned, although not always expressly using the word, that Rothko’s paintings have this effect. Robert Rosenblum, for example, describes Rothko’s paintings as “awe-inspiring”; Jeffery Weiss describes them as “objects of emotional or spiritual awe”; while Robert Goldwater writes, “It is significant that at the entrance to this room one pauses, hesitating to enter. Its space seems both occupied and empty”; and Peter Selz writes, “The spectator contemplates an atmosphere of alarm…” (Kosoi, 2005, p. 26)

Kosoi admits that different people have different responses to Rothko’s works. However, all the people she has listed say that there is some kind of a transformational effect resulting from experiencing the artwork. What I see from the images of Rothko’s Chapel are dark coloured paintings displayed on non-rectangular walls. All the paintings are at eye height, and if you stand about three metres from a painting it will take up all of your peripheral vision. On the Rothko Chapel 2014 WordPress site, Matikopi (2014) states, “It was amazing how such a plain building could have such a massive message.” This statement shows that the artworks had a transformative effect on the person. However, they could not say what that transformative effect was, other than saying that there was a “massive message”.

When interacting with the Sati Interactive System, most people said that they had some kind of a transformative experience while and after using the system: feeling relaxed, feeling focused, and being in a different zone. The Sati Interactive System creates a transformative experience, though it might not be “anxiety rather than delight”, “awe-inspiring”, “emotional or spiritual awe”, hesitation, or “an atmosphere of alarm”.

These experiences people have listed when looking at the paintings in the Rothko Chapel relate back to the discussion on Colour, Sound, and Movement on page 6.

Denis’s quote from 1890, shown in Oxford Reference (2021), explains what a painting is.

[A] picture—before being a war horse or a nude woman or an anecdote—is essentially a flat surface covered with colours assembled in a certain order.

It is the assembling of colours on a visual artwork that creates a transformative effect in the perceiver. In sound and music, it is the assemblage of air molecules that creates the transformative effect.

Art as Sound/Music

Sound and music are not static art forms; they are known to create a transformative effect in their audience. Music therapist Barbara Hesser says, “we chose this profession because we think, feel, intuit, and/or know that music can be transformative” (Hesser, 2001, p. 57). From her studies of spontaneous, improvised group music-making experiences, she says,

The [creating of] music allows for expression of emotion and thoughts as they arise and subside, and for the elaboration and expansion of these expressions. This process involves developing an increasing awareness of ourselves. We learn to become more conscious from minute to minute of our individual actions, feelings, and thoughts. Through this process we become more aware of ourselves in the psychological, physical, and spiritual realms. (Hesser, 2001, p. 56)

According to the above quote, “expression of emotions”, “increasing awareness”, “become more conscious”, and “aware of ourselves” are transformative effects of creating music. The experience of listening to music and the experience of creating music can have a similar transformative effect. When interacting with the Sati Interactive System, most people said they were feeling “relaxed” and “focused”. The Sati Interactive System integrates colour, sound, and movement to create this transformative effect: for example, expressing emotions, increasing awareness, becoming more conscious, and becoming aware of ourselves.

Music and sound are used for many purposes; for example, Caldwell et al.’s (1999) research shows that music can be used to influence people to spend more time at a restaurant, Cutshall et al.’s (2011) research shows that recorded music and nature sounds may provide a means of relaxation for anxiety patients, Benfield et al.’s (2014) research shows that natural soundscapes can provide restorative benefits, Koelsch’s (2014) research shows the positive effects of music in the treatment of psychiatric and neurological disorders, and Zentner et al.’s (2008) research examines music-induced emotions. The Long Range Acoustic Device (2020c), used for crowd control, SWAT operations, and traffic control, shows how sound can be used for other purposes.

The transformative effect of assembling colour on a flat surface is within the viewer, and it is the viewer’s interpretation of that assembly that causes the transformative experience. When Denis made this statement, he was referring to paintings of things that are recognisable to their audience: for example, a painting of a horse, a naked woman, a landscape, or a still life. With the advent of abstract painting this representation of a known object was subverted. However, the transformative effect in a viewer of colours assembled in a certain order on a flat surface was not reduced and, in some ways, may actually be enhanced by not being representative of something known.

It could also be said that composed music or designed sound is an assemblage of air molecules moving in a certain order. This assemblage is often referred to as frequencies or wave forms; for example, a sine wave form oscillating at 440Hz causes air molecules to concentrate and rarefy every 75.9cm at sea level. Figure 18 shows an oscillating sine wave at 440Hz.
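To make the relationship between frequency, wavelength, and the sampled waveform concrete, the short Python sketch below calculates the wavelength of a 440Hz tone and generates one second of its sine wave. It is only an illustration: the 343 m/s speed of sound, the 44.1kHz sample rate, and the variable names are assumptions made for the example rather than values taken from the Sati Interactive System, and the exact wavelength depends on air temperature (cooler air gives a slightly shorter figure, closer to the one quoted above).

```python
import math

SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 degrees C (assumed value)
SAMPLE_RATE = 44100      # samples per second, a common audio sample rate
FREQUENCY = 440.0        # Hz, the pitch discussed above

# Wavelength = speed of sound / frequency: the distance between successive
# concentrations of air molecules for a 440Hz tone.
wavelength_cm = SPEED_OF_SOUND / FREQUENCY * 100
print(f"Approximate wavelength of a 440Hz tone: {wavelength_cm:.1f} cm")

# One second of a 440Hz sine wave as a list of amplitude values; each sample
# is the waveform's amplitude at one instant in time.
samples = [
    math.sin(2 * math.pi * FREQUENCY * n / SAMPLE_RATE)
    for n in range(SAMPLE_RATE)
]
print(f"Generated {len(samples)} samples; the first five are "
      f"{[round(s, 3) for s in samples[:5]]}")
```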

Figure 18: Image of a sine wave at 440Hz over 100msecs

Figure 19 : First 100msecs of Sweet Child O’ Mine by Guns N’ Roses

Composed music and designed sound, as assemblages of air molecules, create far more complex concentrations and rarefactions of air molecules than simple sine waves. For example, Figure 17 below, of the first 10 seconds of Sweet Child O’ Mine, shows a far more complex assemblage of air molecules than the first 100msecs of the 440Hz sine wave shown in Figure 18. The assembling of air molecules happens in time, and then this assemblage resides in the listener’s memory. It is this memory of the assemblage that causes the transformative experience.

Figure 17 : The first 10 seconds of Sweet Child O’ Mine by Guns N’ Roses

A musical work is different from a painting. Listening to music is a continuous process. Music is constantly changing, and it is the fact that music is constantly changing that makes it what we normally think of as music. A painting is a static object; it does not continuously change, and it is the fact that a painting does not change that makes it a painting. The perceiver has control over what elements of the painting they are focusing on and the order in which they are focusing on those elements. However, a musical work or sound design work happens in time. The perceiver has no control over the order in which the work is perceived. The composer decides the order in which the assemblages of wave forms are perceived. For example, it is impossible to hear the end of a song before you hear the beginning.

It is, however, possible to skip around a composed musical work, for example, by moving the needle on a vinyl record or skipping on the timeline of a musical work heard on a computer. However, when that is happening, we are no longer listening to the musical work as it was created. This is similar to cutting up a painting and rearranging its elements.

Figure 20 : Vincent van Gogh’s The Starry Night painting (2020h)

When I look at Vincent van Gogh’s the Starry Night (2020h) painting shown in Figure 20 above, the first thing I look at is the sky with stars and clouds. Next, I look at the houses and the mountains and then I look at the moon on the top right corner. Lastly, I look at the green plant towards the left. Here I can look at any part of the picture in any order I want to. I can also return to see any of these things in any order I want to. Different people may look at this painting in different orders, focusing on different objects in the painting. It can be argued that this is how the painter intended, or assumed, the work to be seen, and this is normal whenever viewing paintings or static artworks.

Art as Movement

Dance is an assemblage of human body movements that changes continuously in time, and like sound and music this assemblage resides in our memory; we cannot re-see a dance sequence at our leisure as we can with a painting. Dance is often accompanied by sound/music. However, when we look at a dance work we are focused on the movement. Sometimes the music/sound can help us interpret the dance or influence the interpretation, but it is not the primary focus of the dance artwork.

Here I discuss two dance performances by Pina Bausch: The Rite of Spring, first performed on December 3, 1975 at the Opera House Wuppertal (2020g), and Barbe Bleue, first performed on January 8, 1977 at the Opera House Wuppertal (2020b). The Rite of Spring is a ballet composed by Igor Stravinsky in 1913 and performed by different artists over the years (Schwarm, 2020). I am discussing The Rite of Spring dance performed by Pina Bausch in 1978. Figure 21 below is an image captured from Pina Bausch’s performance.

Figure 21 : The Rite of Spring dance performed by Pina Bausch in 1978 (Példatár a távoktatáshoz-MTE, 2020)

Climenhaga (2018, p. 13) says Pina Bausch’s dance movements “draw on the roots of earlier expressionist dance’s formulation of the essential emotive gesture, but take on the source of that creation rather than adopting it as an established technique.” He then says that Pina Bausch “returned to the source of everyday actions as they expressed a sense of ourselves as individuals and maintained that individualist stance in performance”. Then he says that “gestural movement structures that begin to break down as the dancer violently throws herself into the exhausting ritual” show emotions such as “the fear of the woman going through [a suffering]” and that “the physicality [of the person] does not point at this fear”. Pina Bausch used dance movements as a way of transforming emotions in the perceiver by using essential emotive gestural movements.

Pina Bausch’s dance work, Barbe Bleue performed in 1977 shows “The violence of male-female interactions and the seemingly relentless victimisation of women, … the victim not only of domestic but also of state violence” (Mumford, 2004, p. 44 and p. 54). Figure 22 below shows two images of Barbe Bleue.

Figure 22 : Images of Pina Bausch’s dance Barbe Bleue (Demidiv, 2017)

Birringer (1986, p. 91) describes this work as portraying “the overtly brutal treatment of the theme in Bluebeard - a piece about an obsessive male sadist who kills the women he loves [while being] caught up with his machine, "listening to a tape recording" of his murderous desire”. He goes on to say, “most American audiences were outraged by the violence portrayed in the performance”. He then goes on to ask “why they responded to the violence and not to the sharply focussed process of recognition that leads from the pathos of self-absorbed sexual obsession to the much larger patterns of mechanical evasion that a guilt-ridden society resorts to when it prefers to deny the consequences of its continuing aggression.” This is an example, first, of the way that an artwork can evoke an emotion that may then create a transformation and, second, of how that transformation may be considered and interrogated through a text, as in the question Birringer asks above.

When I watched Pina Bausch’s Barbe Bleue before reading Birringer’s comments, I felt that Pina Bausch’s movements narrated a story. I felt that the female character was trying to escape from a violent situation in which a man was trying to forcefully keep her. Her movements made me feel tense, sad, and scared. These emotions were strong, and I felt the urge to help her. Pina Bausch transformed my emotions/feelings through body movements alone.

When I watched Pina Bausch’s Barbe Bleue after reading Birringer’s comments, I deeply connected with the dance movements and the story/message Bausch is trying to narrate. This is an example of how text can create a transformative experience.

When I look at Pina Bausch’s dance, I feel that she is expressing her emotions using various body movements and that these movements are in sync with Stravinsky’s music. The dance starts with slow movements and gradually increases in speed with the music. All the dancers are very active, and they move with so much passion. These dance movements are unique and filled with energy. Slow movements make me feel calm and relaxed. Fast movements make me feel excited and lively.

Mirror Neurons

When I watch Pina Bausch’s dancers, I feel as if my body is trying to move along with them. This activates mirror neurons in me. According to Acharya and Shukla (2012, p. 118),

Essentially, mirror neurons respond to actions that we observe in others. The interesting part is that mirror neurons fire in the same way when we actually recreate that action ourselves.

And Berrol (2006, p. 302) says,

An interactive phenomenon, studies are revealing that the identical sets of neurons can be activated in an individual who is simply witnessing another person performing a movement as the one actually engaged in the action or the expression of some emotion or behavior.

One of the aspects of an artwork is that it can generate empathy in the perceiver. Iacoboni (2009, p. 653) says that “Social psychology studies have demonstrated that imitation and mimicry are pervasive, automatic, and facilitate empathy”. This imitation can be physical, or it can be an imitative experience produced through mirror neurons. Feeling empathy towards others is one of the stronger things that connects us to them. Activating mirror neurons has been shown to generate a sense of empathy, and this sense of empathy may influence a positive transformative effect.

The activation of mirror neurons helps the transformation process by creating emotions/feelings. For example, when I look at Bluebeard, I feel tense and sad. I also feel that the female character feels the same way. When I look at the male character, I feel an anger towards him, and I also feel violent because I am feeling his violence.

Regarding movement in the Sati Interactive System, the viewer/perceiver is also the participant, who is engaging/interacting through movement by seeing and following. When the user looks and follows the movement of the person displayed on the screen, it activates mirror neurons.

When I dance, I feel alive and energetic. When I practise Tai Chi, I feel positive emotions such as calmness, relaxation, and energy. When I exercise or box, I feel positive emotions such as energy, strength, motivation, and happiness. I also feel wide awake and aware of my surroundings. This is because engaging in movement makes my body active. After engaging in an activity, I feel healthy. These positive transformative effects are caused by movement.

The positive transformative effect of movement has been researched by many people, for example: Zajenkowski et al.’s (2015) research shows the positive emotional change in dancing, Bond and Stinson’s (2000) research shows the positive experiences of young people in dancing, Kerr and Kul’s (2001) research shows that exercise can lead to positive emotions, and Narasimhan et al.’s (2011) research shows how yoga practices can increase positive emotions in the user.

Conclusion

This chapter has discussed art as a way of generating a transformative experience and how the Sati Interactive System responds to that. The Sati Interactive System is attempting to generate the same transformative experience as an artwork does without defining itself as an artwork.

Here we have considered text-based art as an assemblage of ideas, visual art as an assemblage of colours, sound and music as an assemblage of air molecules, and dance as an assemblage of movements. Normally the artist or creator of these assemblages is trying to use them to convey a specific idea for the perceiver to experience. This is seen in Tolstoy’s idea of the artist infecting the perceiver with their idea through an artwork.

By certain external signs – movements, lines, colours, sounds, or arrangements of words – an artist infects other people so that they share his feelings. (Tolstoy, 1904, p. 100)

Art as text, static image, sound/music, and movement can create a sense of deep engagement and a transformative experience in the perceiver. For example, reading a poem, looking at a painting, listening to a song, or watching a dance can create a positive or negative emotion in us. This emotional change is the transformative experience.

Here we are considering that colour, sound, and movement are the basic elements of an artwork. This relates back to Denis’s quote on page 50 regarding a painting being “essentially a flat surface covered with colours assembled in a certain order”. Here Denis says that the colours are the basic elements of a painting and that all meaning is derived from the assemblage of colours. Music is the assembling of air molecules in a certain order to create meaning, and dance is the assembling of body movements in a certain order to create meaning as well. If colours were removed from a painting it would no longer be a painting (considering black and white to be colours); if we removed the movement of air molecules we would no longer hear music; if we removed movement we would no longer see dance. When we are looking at a film, we see colour in the image, hear sound in the soundtrack, and see movement in the motion of the actors and other things on the screen.

The basic elements of an artwork (colour, sound, and movement) are used in the Sati Interactive System to create this transformative experience. However, unlike the examples given above, where the creator intended to make an expressive artwork through assembling colours, wave forms, or movement, the Sati Interactive System is not designed to make an expressive artwork through such assembling. Here I consider that the Sati Interactive System creates the transformative experience of an artwork without creating an artwork per se. This transformative experience creates a sense of deep engagement, and that sense of deep engagement then loops back to reinforce the transformative experience.

Chapter 4: Methodology

This chapter discusses the technology used in the Sati Interactive System, including the software, hardware, and design process. Then I discuss the process of analysing the data gathered from user responses when the Sati Interactive System was tested at the University of Melbourne; the testing process itself is discussed in Chapter 6: Testing Process and the Results on page 91. I discuss the data analysis process here because it informed the testing mechanism and how the mechanism was implemented. Finally, I outline the ethics application; this was also influential in designing the testing process of the Sati Interactive System.

There are two approaches to testing the hypothesis put forward in this research: the first is to build the Sati Interactive System based on the information discussed in Chapter 2: Literature Review, and the second is to test the system and its intended outcomes by asking 30 participants to engage with the system and respond to 15 questions in a questionnaire. Their responses are analysed through three coding methods to understand the success of the Sati Interactive System.

Questionnaires are used to test human responses when researching computer-based interactivity. For example, Homer and Plass (2014), Proske et al. (2007), and Moreno et al.’s (2001) research on learning, Moreno and Mayer’s (2005) research on agent-based multimedia games, Khalil et al.’s (2005) research on instructional programs, and Sundar and Kim’s (2005) research on interactivity and persuasion use questionnaires to gather and test human responses. These examples show the use of questionnaires when researching computer-based interactivity. The methods used for analysing the responses from the questionnaire are based on the approaches outlined in Why use these coding methods? on page 58.

First, I discuss the approaches and technology used in building the Sati Interactive System.

Technology used in the Sati Interactive System

This section discusses the software and hardware used to develop the Sati Interactive System.

Software

Max

I have used Max, an “interactive media software creation tool” developed by Cycling '74 (2019). I also looked at TouchDesigner (Derivative, 2019) and Unity (Unity Technologies, 2019). I decided that Max is the best choice for the reasons outlined below.

Max can be considered a node-based audio-visual development software program with a unique set of tools to create real-time interactive productions. Max is used to create many kinds of applications, for example interactive installations, live audio systems, and projection mapping systems; more examples of the use of Max are shown in Made With Max (Cycling '74, 2020) and many other forums. Max does not require programming knowledge to develop interactive audio-visual programs, and it comes with extensive support including tutorials, help files, demonstration examples, and a broad, informative community. Another reason for choosing Max is its ability to connect to a variety of external devices such as: cameras, microphones, Kinect, MIDI controllers, synthesizers, projectors, DMX lights, Arduino, and musical instruments. Here, Max is used to create a proof-of-concept interactive system that detects human body movements and manipulates visual images and audio. This proof-of-concept may be further developed using other programming languages.

The software architecture has three simple categories: gathering data, analysing the data, and providing the user with feedback according to the data. The program gathers data from the live camera feed, compares it with the reference filmed image, then generates visual images and audio based on the comparison data. This is discussed in detail in The Development of the System – Software and Hardware, on page 82.
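Although the system itself is built as a Max patch, the gather-analyse-feedback loop described above can be sketched in a few lines of Python using OpenCV. The sketch below is only an illustration of that architecture, not the Sati Interactive code: the reference image path, the mean-difference comparison, and the colour-overlay feedback are all assumptions made for the example.

```python
import cv2
import numpy as np

# Gather: open the default webcam and load a reference image
# ("reference.jpg" is a placeholder for the filmed reference material).
camera = cv2.VideoCapture(0)
reference = cv2.imread("reference.jpg", cv2.IMREAD_GRAYSCALE)
if reference is None:
    raise SystemExit("reference.jpg not found")

while True:
    ok, frame = camera.read()
    if not ok:
        break

    # Analyse: compare the live frame with the reference image using the
    # mean absolute pixel difference, a crude stand-in for the movement
    # comparison performed in the Max patch.
    grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    grey = cv2.resize(grey, (reference.shape[1], reference.shape[0]))
    difference = float(np.mean(cv2.absdiff(grey, reference))) / 255.0

    # Feedback: map the difference onto a colour overlay so the user sees
    # a visual response that tracks how closely they match the reference.
    overlay = frame.copy()
    overlay[:] = (0, int(255 * (1 - difference)), int(255 * difference))
    cv2.imshow("Feedback sketch", cv2.addWeighted(frame, 0.7, overlay, 0.3, 0))

    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

camera.release()
cv2.destroyAllWindows()
```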

Hardware

The Sati Interactive System requires a computer, monitor, webcam, and speakers. The software system is built to run on low-cost hardware compatible across Windows and MacOS operating systems; low-cost is defined according to the analysis below.

The Sati Interactive System is built using a Dell XPS 9560 2017 laptop. The system was tested on the computer systems listed in Table 11 below.

Model | Cost Estimate | Year | OS | Processor | Ram | Video Ram | How well it worked
iMac | $500 | 2012 | MacOS | i5 2.7Ghz | 8GB | 512MB GeForce GT 640M | 5
HP Pavilion 15 | $650 | 2015 | Windows 10 | i3 2.1Ghz | 8GB | 4GB Intel HD Graphics | 3
Acer Aspire S7 | $700 | 2014 | Windows 10 | i7 1.8Ghz | 8GB | 2GB Intel HD Graphics | 8
MacBook Pro | $800 | 2013 | MacOS | i7 2Ghz | 8GB | 1.5GB Intel Iris Pro | 7
Dell XPS 9560 | $1300 | 2017 | Windows 10 | i7 2.8Ghz | 16GB | 4GB Nvidia GTX 1050 | 10
MacBook Pro | $1500 | 2014 | MacOS | i7 2.5Ghz | 16GB | 2GB AMD Radeon R9 M370X | 10
Dell XPS 15 | $2000 | 2019 | Windows 10 | i5 2.4Ghz | 8GB | 4GB Nvidia GTX 1650 | 10

Table 11 : Tested system specifications

The Sati Interactive program worked on all the tested systems. However, older systems were not able to run it as smoothly as the newer systems. Each system has been given a score out of 10, where 10 means no audio-video lags, 5 means noticeable video lags but no audio lags, and 0 means the system cannot be used because of audio-video lags. I have provided a video clip of the Sati Interactive System performing without any audio-video lags, taken on the Dell XPS 15 9560. This video, “Sati Interactive Test Run.mp4”, can be found in the ‘Supplementary files - ATHUGALA’ folder.

Sati Interactive System setup at the University

The Sati Interactive System was tested at the Immersive Space 1 at the University of Melbourne (Room 202, Level 3, Arts West - Building 148). This system consists of an external webcam, two projectors, and a set of speakers connected to the Dell XPS 15 9560. This is also the location where testing of the system was done with the participants to see if the Sati Interactive System did create a sense of deep engagement in the user. An image of the Immersive Space 1 can be seen in Figure 28 on page 84.

Defining a low-cost system

Here a low-cost system is considered to be between $AU800 and $AU1000. This amount will provide a laptop computer with a built-in camera and speakers, or a desktop computer with a screen, external webcam, and external speakers. From the tests discussed in Table 11 above, the Sati Interactive System requires an Intel i5 or AMD Ryzen 3 2.7 Ghz processor, 8GB ram, and a 1.5GB graphics card to run on a laptop or a desktop computer without audio-video lags. If the user wants to see a larger image, they can use a smart TV or projector.

From a Google search conducted within the Melbourne region in May 2020, a new Windows laptop with this specification costs around $AU 800 (CentreCom, 2020a, 2020b) and a desktop costs around $AU 800 (Harvey Norman, 2020). However, a new MacBook with this specification costs around $AU 2000 (Apple, 2020b) and an iMac costs around $AU 1700 (Apple, 2020a).

A used Windows laptop with this specification costs around $AU 550 (recompute, 2020) and a desktop computer costs around $AU 350 (Amazon, 2020). However, the prices mentioned here are without the cost of a monitor/display and a webcam15.

15 For example, an HP 250 G7 15.6" HD Core i5-8265U and an Acer Aspire 5 15.6" HD Core i5-10210U laptop are available from Centrecom.com for $AU799. A Dell Optiplex 9020 SFF Intel Core i5 4570 renewed desktop without a screen or keyboard is available from Amazon.com.au for $AU319. In the USA, a ThinkCentre M720e SFF desktop without a screen or keyboard is available from Lenovo.com for $US 605. This information is accurate at the time of writing.

Analysing the data gathered from the questionnaire

I am using the three coding methods used in Grounded Theory to analyse the data gathered from the questionnaire. The three coding methods are: Open Coding, Axial Coding, and Selective Coding (Corbin and Strauss, 1990). I also applied the use of ‘key words’ from grounded theory; here the ‘key words’ are used in developing the Likert Scale questionnaire.

Why use these coding methods?

These three coding methods are used because of their:

• capability to filter qualitative and quantitative data such as user experience, emotions, and feelings.
• flexibility in gathering and analysing data according to the researchers’ preference.
• capability to be implemented with the Likert Scale to gather data, which is used here.

These coding methods are used in various areas of research to gather both qualitative and quantitative data. They are used in areas that are closely related to this research, such as: Human-Computer Interaction (Adams et al., 2008), the experience of software development (Adolph et al., 2011), art video games and their involvement with feelings (Castaño Díaz and Tungtjitcharoen, 2015), audience experience in sound performance (Lai and Bovermann, 2013), developing a model of participants’ perception of an immersive telematic artwork (Hohl, 2009), reflection in interactive system design (Baumer et al., 2014), and social interaction in open-ended play environments (de Valk et al., 2015).

These grounded theory coding methods are used to study psychological behaviours and feelings such as: emotions, personal experience, attraction, motivation, identity, prejudice, interpersonal co-operation, and conflicts (Charmaz and Belgrave, 2007). For example, grounded theory coding is used in research areas such as medical education research (Watling and Lingard, 2012, Kennedy and Lingard, 2006), higher education research (Conrad, 1982), medical trainees’ requests for clinical support (Kennedy et al., 2009), psychological research (Pidgeon and Henwood, 1997), counselling psychology research (Fassinger, 2005), software engineering research (Stol et al., 2016), studying the experience of software development (Adolph et al., 2011), gambling as emotion management (Ricketts* and Macaskill, 2003), music and emotional states (Bishop et al., 2007), art and creativity (Mace and Ward, 2002), and indigenous music therapy theory building (Daveson et al., 2008).

The three coding methods

The following sections describe the three coding methods used in this analysis.

Open Coding

Open coding is the initial stage of the data analysis. This process has no pre-determined set of codes. The main purpose of open coding is to help the researcher identify, conceptualise, and label data into categories according to similarities and differences (Moghaddam, 2006). According to Corbin and Strauss (1990, p. 423),

Open Coding is the interpretive process by which data are broken down analytically. The purpose of open coding is to help the analyst gain new insights into the data by breaking through standard ways of thinking about (interpreting) phenomena reflected in the data.

The researcher should be open to study and change throughout the analysis process (Adams et al., 2008). In other words, the researcher compares similarities and differences in words, events, actions, interactions, views, and expressions (Corbin and Strauss, 1990). Then, these comparisons are conceptually labelled so that they can be easily identified and grouped into categories and subcategories. However, more categories may emerge during the axial coding stage (M. Alammar et al., 2019).

Axial Coding

Axial coding is the second stage of analysis. Axial coding studies the relationships among the open coding categories and subcategories while testing them against the data, further developing the categories and subcategories (Corbin and Strauss, 1990). This coding stage might reduce or increase the number of categories while building a much deeper relationship among them. For instance, categories that share clear similarities can be merged to create one core category, or categories that do not provide an acceptable basis can be discarded. This technique helps generate deeper connections between categories, connecting or disconnecting them, building towards a theory (M. Alammar et al., 2019). The data analysed from this stage is used to support the emerging theory (Moghaddam, 2006).

Selective Coding

Selective coding is the final stage of the analysis process, where all categories are connected to the core category (Corbin and Strauss, 1990). Here, the core category signifies the main phenomenon of the research study. Corbin and Strauss (1990, p. 424) suggest that, to identify the core or the main phenomenon of the study, the researcher should ask themselves questions such as:

• what is the main analytic idea presented by this research?
• if I had to conceptualize my finding in a few sentences, what would I say?
• what does all the action/interaction seem to be about?
• how can I explain all the variation that I see between and among the categories?

However, the main phenomenon or the core category does not have to emerge from the previously created categories. As a researcher, it is important to comprehend that the core category might develop as a new category which was not identified in the previous analytic stages. As research progresses, the core category might develop in relation to the previous categories (Corbin and Strauss, 1990). The core category will also be in relation to the rest of the categories.

The figure below is designed to provide a clear picture of the analysis process used here.

Figure 23 : Analysis Process used here (Data → Open Coding: codes → Axial Coding: categories → Selective Coding: core category → Developed Theory)

This three-stage analysis process is a part of grounded theory. It starts with gathering data, then categorising and labelling it according to its characteristics. The next stage of analysis builds a stronger connection between the categories by comparing their similarities and dissimilarities while justifying them with the gathered data, working towards a theory. The final stage develops a core category based on the analysis, which ultimately becomes the basis for the developed theory.
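To make the three stages concrete, the short Python sketch below runs invented example data through an open, axial, and selective pass. The responses, open codes, and category groupings are illustrative only, and taking the most frequent category as the core category is a simplification of what is, in practice, an interpretive judgement by the researcher.

```python
from collections import Counter

# Open coding: hypothetical open codes assigned to three participant responses.
open_codes = {
    "response_1": ["felt relaxed", "positive emotion", "after the experience"],
    "response_2": ["able to focus", "positive emotion", "during the experience"],
    "response_3": ["distracting sounds", "negative emotion", "during the experience"],
}

# Axial coding: group related open codes under broader categories.
axial_map = {
    "felt relaxed": "positive experience",
    "able to focus": "positive experience",
    "positive emotion": "positive experience",
    "distracting sounds": "negative experience",
    "negative emotion": "negative experience",
    "after the experience": "temporal context",
    "during the experience": "temporal context",
}
category_counts = Counter(
    axial_map[code] for codes in open_codes.values() for code in codes
)

# Selective coding: relate everything to a single core category; here the most
# frequent category stands in for that judgement.
core_category, _ = category_counts.most_common(1)[0]
print("Category counts:", dict(category_counts))
print("Core category:", core_category)
```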

I am using these three coding methods to develop a questionnaire that uses a Likert Scale; this is discussed below.

Likert Scale

According to Nemoto and Beglar (2014, p. 2), a Likert scale is

a psychometric scale that has multiple categories from which respondents choose to indicate their opinions, attitudes, or feelings about a particular issue.

The Likert Scale was initially discussed by Rensis Likert (1932) in ‘A technique for the measurement of attitudes’ and further expanded in his article, ‘A Simple and Reliable Method of Scoring the Thurstone Attitude Scales’ (Likert et al., 1934). A Likert Scale usually consists of questions directed towards measuring personal attitudes and feelings using five response alternatives such as: (1) strongly agree, (2) agree, (3) neither agree nor disagree, (4) disagree, (5) strongly disagree (Boone and Boone, 2012). Over the years, various Likert Scale response alternatives have been developed and used, such as: the four-point Likert alternatives (Chang, 1994), five-point Likert alternatives (Adelson and McCoach, 2010), six-point Likert alternatives (Chomeya, 2010), and eleven-point Likert alternatives (Leung, 2011).

A Likert Scale is used to collect attitudinal data in many areas of research such as that of: Wrzesien et al.’s (2011) research in psychology, HCI, augmented reality, and mental health technology; Francese et al.’s (2012) research in gestural user interfaces and HCI; Fisher et al.’s (2001) research in self-directed learning and nursing education; Lochbaum and Roberts (1993) research in goal orientations, perceptions, and sport experience; Sinha et al.’s (2004) research in neural circuits and emotional distress in humans; De Choudhury’s (2012) research in human emotional states in social media; Torres et al.’s (2013) research in sports-related concussion; and Kellmann’s (2010) research in overtraining, high-intensity sports and stress/recovery monitoring.

Why use a Likert Scale?

I am using a Likert Scale questionnaire to understand participants’ feelings/emotions: relaxed, focused, and not feeling the passing of time.

A Likert scale is used here because of its ability to,

• quickly gather data from a large number of participants.
• gather personal experiences with the use of an alternative point scale.
• gather reliable and direct data.
• easily compare all data as a single composite.
• compare, combine, and analyse qualitative information using quantitative measures/techniques.

An example of a Likert questionnaire is shown below.

Figure 24 : “Healthy Eating” 5 question, 5 alternative responses Likert Scale (Boone and Boone, 2012, p. 2)

The Likert Scale questionnaire shown above consists of 5 statements regarding ‘Healthy eating’. Here the researcher is trying to find out the participants’ habits of healthy eating. Researchers can use the same questionnaire for any number of participants and understand trends based on quantitative responses to qualitative statements. For example, if 75% of the participants strongly disagree that they eat healthy foods on a regular basis but strongly agree that a healthy diet is important to their family, it could be assumed that there is a difference between their intentions and their actions. This kind of data can give researchers richer information.
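The kind of trend described above can be read directly off the tallied responses. The Python sketch below shows one way of doing this for two of the ‘Healthy eating’ statements; the ten responses are invented for the example, and the agree/disagree groupings follow the five-point alternatives listed earlier.

```python
from collections import Counter

# Invented responses (1 = strongly disagree ... 5 = strongly agree) from ten
# hypothetical participants to two of the example statements.
responses = {
    "I eat healthy foods on a regular basis": [1, 2, 1, 2, 3, 1, 2, 1, 1, 2],
    "A healthy diet is important to my family": [5, 5, 4, 5, 5, 4, 5, 5, 4, 5],
}

for statement, scores in responses.items():
    counts = Counter(scores)
    agree = sum(counts[s] for s in (4, 5))      # Agree or Strongly agree
    disagree = sum(counts[s] for s in (1, 2))   # Disagree or Strongly disagree
    total = len(scores)
    print(f"{statement}: agree {100 * agree / total:.0f}%, "
          f"disagree {100 * disagree / total:.0f}%")
```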

I created a Likert Scale questionnaire using my own keywords/focus words, shown in Appendix 2 on page 131. For example, “felt relaxed”, “focused”, and “distracted”. I chose these keywords to find out if the Sati Interactive System helped the participants feel relaxed and focused while using the Sati Interactive System and after using it. I used the three coding methods mentioned above to analyse the data from the questionnaires. This is discussed in more detail below.

Applying the three coding methods

Here I discuss how the three coding methods are applied to the Likert Scale questionnaire to gather data.

Open Coding

Firstly, I created 15 statements and their ‘keywords/focus words’. For example, the first statement in the questionnaire, “I felt relaxed after using the system”, is assigned 3 focus words: “I felt”, “relaxed”, and “after”. These focus words are assigned the General Open Codes ‘personal feeling’, ‘positive emotion’, and ‘after the experience’ respectively. Other statements have other Open Codes, which are assigned to each statement depending on its alternative answer, shown in Table 13 on page 69.
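A minimal sketch of how this mapping might be represented is shown below; the Python structure and the helper loop are assumptions made for illustration, with only the first statement’s focus words and General Open Codes taken from the text.

```python
# Each questionnaire statement maps to its focus words, and each focus word
# maps to one or more General Open Codes. Only the first statement from the
# text is shown; the data structure itself is an illustrative assumption.
open_coding = {
    "I felt relaxed after using the system": {
        "I felt": ["personal feeling"],
        "relaxed": ["positive emotion"],
        "after": ["after the experience"],
    },
}

# Collect every General Open Code attached to each statement.
for statement, focus_words in open_coding.items():
    codes = [code for word_codes in focus_words.values() for code in word_codes]
    print(statement, "->", codes)
```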

Table 12 below shows the questions asked in the questionnaire, the keywords/focus words, and the General Open Codes; each focus word is listed with its corresponding General Open Code(s) in the same order.

Question | Keywords/Focus words | General Open Codes
I felt relaxed after using the system | I felt; Relaxed; After | Personal feeling; Positive emotion; After the experience
I was focused while using the system | I; Focused; While | Personal experience; Positive emotion; During the experience
I felt transported into a different environment | I felt; Different environment | Personal experience; Positive emotion, Accepting change
I felt distracted by the visuals | I felt; Distracted | Personal experience; Negative emotion, Uncomfortable, Dissatisfaction, Displeasure, Disapproval
I felt distracted by the sounds | I felt; Distracted | Personal experience; Negative emotion, Uncomfortable, Dissatisfaction, Displeasure, Disapproval
I would recommend this system to a friend | I; Recommend this system | Personal experience; Satisfaction, Approval, Affirmation
I would give this system to a friend | I; Give this system to a friend | Personal experience; Satisfaction, Approval, Affirmation
I found the Sati Interactive System user-friendly | I; User-friendly | Personal experience; Satisfaction, Acceptance, Approval
I would use this system at home | I; Use at home | Personal experience; Satisfaction, Approval, Affirmation
I feel that using this system daily would help reduce stress | I feel; Would help reduce stress | Personal experience; Satisfaction, Accepting change, Believes
I did not feel the time passing | I; Did not feel the time passing | Personal experience; Positive emotion, Satisfaction, Enjoyed, Interested
I felt like 10 minutes is enough | I felt; Enough | Personal experience; Personal emotion
I felt relaxed while using the system | I felt; Relaxed; While | Personal experience; Positive emotion, Satisfaction, Enjoyed, Interested; During the experience
I felt like I was in a different ‘zone’ while using the system | I felt; Different zone; While | Personal experience; Positive emotion, Comfortable, Satisfaction, Enjoyed; During the experience
I feel like I am still in a different ‘zone’ after using the system | I feel; Still; Different zone | Personal experience; Present state, after the experience; Positive emotion, Comfortable, Satisfaction, Enjoyed

Table 12 : General Open Code stage – Keywords/Focus words and the General Open Codes.

The question “do you feel a sense of deep engagement?” was not used as we felt this question would be confusing for the respondents. However, the questions regarding feeling relaxed, feeling focused, and not feeling the time passing relate directly to the qualities of Sati and deep engagement.

The table below shows the further categorisation of the Open Codes using the alternative answers. The first column is the question, the second column shows the general Open Codes, the third column shows the five alternative answers, and the fourth column shows the Open Codes categorised according to the alternative answer.

Question | General Open Code | Five Alternative Answers | Open Code
I felt relaxed after using the system | Personal feeling; Positive emotion; After the experience | 1 – Strongly disagree; 2 – Disagree | Not feeling relaxed. Negative experience. Negative emotions. Uncomfortable. Dissatisfaction. Disapproval. Displeasure.
 | | 3 – Neither agree nor disagree | Neutral experience. Neutral emotions. Neither uncomfortable nor comfortable. Neither dissatisfied nor satisfied.
 | | 4 – Agree; 5 – Strongly agree | Felt relaxed. Positive experience. Positive emotions. Comfortable. Satisfied. Approved. Enjoyed. Interested.
I was focused while using the system | Personal feeling; Positive emotion; During the experience | 1 – Strongly disagree; 2 – Disagree | Could not focus. Negative experience. Negative emotions. Uncomfortable. Dissatisfaction. Disapproval. Displeasure.
 | | 3 – Neither agree nor disagree | Neutral experience. Neutral emotions. Neither uncomfortable nor comfortable. Neither dissatisfied nor satisfied.
 | | 4 – Agree; 5 – Strongly agree | Able to focus. Positive experience. Positive emotions. Comfortable. Satisfied. Approved. Enjoyed. Interested.
I felt transported into a different environment | Personal experience; Positive emotion; Accepting change | 1 – Strongly disagree; 2 – Disagree | Did not feel the change of the environment/atmosphere. Negative experience. Negative emotions. Not accepting change. Uncomfortable. Dissatisfaction. Disapproval. Displeasure.
 | | 3 – Neither agree nor disagree | Do not understand the change. Neutral experience. Neutral emotions. Neither uncomfortable nor comfortable. Neither dissatisfied nor satisfied.
 | | 4 – Agree; 5 – Strongly agree | Felt the changing environment/atmosphere. Positive experience. Positive emotions. Comfortable. Satisfied. Approved. Enjoyed. Interested.
I felt distracted by the visuals | Personal experience; Negative emotion; Uncomfortable; Dissatisfaction; Displeasure; Disapproval | 1 – Strongly disagree; 2 – Disagree | Visuals were not distracting. Positive experience. Positive emotions. Comfortable. Satisfied. Approved. Enjoyed. Interested.
 | | 3 – Neither agree nor disagree | Neutral experience. Neutral emotions. Neither uncomfortable nor comfortable. Neither dissatisfied nor satisfied.
 | | 4 – Agree; 5 – Strongly agree | Distracting visuals. Negative experience. Negative emotions. Not accepting change. Uncomfortable. Dissatisfaction. Disapproval. Displeasure.
I felt distracted by the sounds | Personal experience; Negative emotion; Uncomfortable; Dissatisfaction; Displeasure; Disapproval | 1 – Strongly disagree; 2 – Disagree | Sounds were not distracting. Positive experience. Positive emotions. Comfortable. Satisfied. Approved. Enjoyed. Interested.
 | | 3 – Neither agree nor disagree | Neutral experience. Neutral emotions. Neither uncomfortable nor comfortable. Neither dissatisfied nor satisfied.
 | | 4 – Agree; 5 – Strongly agree | Distracting sounds. Negative experience. Negative emotions. Not accepting change. Uncomfortable. Dissatisfaction. Disapproval. Displeasure.
I would recommend this system to a friend | Personal experience; Satisfaction; Approval; Affirmation | 1 – Strongly disagree; 2 – Disagree | Negative experience. Negative emotions. Not accepting change. Uncomfortable. Dissatisfaction. Disapproval. Displeasure.
 | | 3 – Neither agree nor disagree | Neutral experience. Neutral emotions. Neither uncomfortable nor comfortable. Neither dissatisfied nor satisfied.
 | | 4 – Agree; 5 – Strongly agree | Positive experience. Positive emotions. Comfortable. Satisfied. Approved. Enjoyed. Interested.
I would give this system to a friend | Personal experience; Satisfaction; Approval; Affirmation | 1 – Strongly disagree; 2 – Disagree | Negative experience. Negative emotions. Not accepting change. Uncomfortable. Dissatisfaction. Disapproval. Displeasure.
 | | 3 – Neither agree nor disagree | Neutral experience. Neutral emotions. Neither uncomfortable nor comfortable. Neither dissatisfied nor satisfied.
 | | 4 – Agree; 5 – Strongly agree | Positive experience. Positive emotions. Comfortable. Satisfied. Approved. Enjoyed. Interested.
I found the Sati Interactive System user-friendly | Personal experience; Satisfaction; Acceptance; Approval | 1 – Strongly disagree; 2 – Disagree | Not user-friendly. Negative experience. Negative emotions. Not accepting change. Uncomfortable. Dissatisfaction. Disapproval. Displeasure.
 | | 3 – Neither agree nor disagree | Neutral experience. Neutral emotions. Neither uncomfortable nor comfortable. Neither dissatisfied nor satisfied.
 | | 4 – Agree; 5 – Strongly agree | User-friendly. Positive experience. Positive emotions. Comfortable. Satisfied. Approved. Enjoyed. Interested.
I would use this system at home | Personal experience; Satisfaction; Approval; Affirmation | 1 – Strongly disagree; 2 – Disagree | Negative experience. Negative emotions. Not accepting change. Uncomfortable. Dissatisfaction. Disapproval. Displeasure.
 | | 3 – Neither agree nor disagree | Neutral experience. Neutral emotions. Neither uncomfortable nor comfortable. Neither dissatisfied nor satisfied.
 | | 4 – Agree; 5 – Strongly agree | Positive experience. Positive emotions. Comfortable. Satisfied. Approved. Enjoyed. Interested.
I feel that using this system daily would help reduce stress | Personal experience; Satisfaction; Accepting change; Believes | 1 – Strongly disagree; 2 – Disagree | Doesn’t help reduce stress. Not feeling relaxed. Negative experience. Negative emotions. Not accepting change. Uncomfortable. Dissatisfaction. Disapproval. Displeasure.
 | | 3 – Neither agree nor disagree | Neutral experience. Neutral emotions. Neither uncomfortable nor comfortable. Neither dissatisfied nor satisfied.
 | | 4 – Agree; 5 – Strongly agree | Helps reduce stress. Felt relaxed. Positive experience. Positive emotions. Comfortable. Satisfied. Approved. Enjoyed. Interested.
I did not feel the time passing | Personal experience; Positive emotion; Satisfaction; Enjoyed; Interested | 1 – Strongly disagree; 2 – Disagree | Could not engage with the system. Negative experience. Negative emotions. Not accepting change. Uncomfortable. Dissatisfaction. Disapproval. Displeasure.
 | | 3 – Neither agree nor disagree | Neutral experience. Neutral emotions. Neither uncomfortable nor comfortable. Neither dissatisfied nor satisfied.
 | | 4 – Agree; 5 – Strongly agree | Engaged in the system. Positive experience. Positive emotions. Comfortable. Satisfied. Approved. Enjoyed. Interested.
I felt like 10 minutes is enough | Personal experience; Personal emotion | 1 – Strongly disagree; 2 – Disagree | Not enough time to engage. Negative experience. Negative emotions. Not accepting change. Uncomfortable. Dissatisfaction. Disapproval. Displeasure.
 | | 3 – Neither agree nor disagree | Neutral experience. Neutral emotions. Neither uncomfortable nor comfortable. Neither dissatisfied nor satisfied.
 | | 4 – Agree; 5 – Strongly agree | Enough time to engage. Positive experience. Positive emotions. Comfortable. Satisfied. Approved. Enjoyed. Interested.
I felt relaxed while using the system | Personal experience; Positive emotion; Satisfaction; Enjoyed; Interested; During the experience | 1 – Strongly disagree; 2 – Disagree | Not relaxing. Negative experience. Negative emotions. Not accepting change. Uncomfortable. Dissatisfaction. Disapproval. Displeasure.
 | | 3 – Neither agree nor disagree | Neutral experience. Neutral emotions. Neither uncomfortable nor comfortable. Neither dissatisfied nor satisfied.
 | | 4 – Agree; 5 – Strongly agree | Felt relaxed. Positive experience. Positive emotions. Comfortable. Satisfied. Approved. Enjoyed. Interested.
I felt like I was in a different ‘zone’ while using the system | Personal experience; Positive emotion; Comfortable; Satisfaction; Enjoyed; During the experience | 1 – Strongly disagree; 2 – Disagree | Did not feel any change of environment. Audio Visual interaction did not have any effect. Negative experience. Negative emotions. Not accepting change. Uncomfortable. Dissatisfaction. Disapproval. Displeasure.
 | | 3 – Neither agree nor disagree | Neutral experience. Neutral emotions. Neither uncomfortable nor comfortable. Neither dissatisfied nor satisfied.
 | | 4 – Agree; 5 – Strongly agree | Felt the environment change. Audio Visual interaction is effective. Positive experience. Positive emotions. Comfortable. Satisfied. Approved. Enjoyed. Interested.
I feel like I am still in a different ‘zone’ after using the system | Personal experience; Present state, after the experience; Positive emotion; Comfortable; Satisfaction; Enjoyed | 1 – Strongly disagree; 2 – Disagree | Did not feel any change of environment. Audio Visual interaction did not have any effect. Negative experience. Negative emotions. Not accepting change. Uncomfortable. Dissatisfaction. Disapproval. Displeasure.
 | | 3 – Neither agree nor disagree | Neutral experience. Neutral emotions. Neither uncomfortable nor comfortable. Neither dissatisfied nor satisfied.
 | | 4 – Agree; 5 – Strongly agree | Felt the environment change and its effects are continuing. Audio Visual interaction is effective. Positive experience. Positive emotions. Comfortable. Satisfied. Approved. Enjoyed.

68 Question General Open Code Five Alternative Open Code Answers Interested. Table 13 : Open Code stage – further categorisation of Open Code using alternative answers.

Axial Code
During the Axial Code process, shown in Table 14 below, the Open Code categories are categorised further, building deeper relationships among them. The first column shows the question, the second column shows the five alternative answers, the third column shows the Open Code, and the last column shows the Axial Code.

Question | Five Alternative answers | Open Code | Axial Code

I felt relaxed after using the system
1 – Strongly disagree / 2 – Disagree: Not feeling relaxed. Negative experience. Negative emotions. Uncomfortable. Dissatisfaction. Disapproval. Displeasure. → Axial Code: Did not feel relaxed.
3 – Neither agree nor disagree: Neutral experience. Neutral emotions. Neither uncomfortable nor comfortable. Neither dissatisfied nor satisfied. → Axial Code: Neutral feeling.
4 – Agree / 5 – Strongly agree: Felt relaxed. Positive experience. Positive emotions. Comfortable. Satisfied. Approved. Enjoyed. Interested. → Axial Code: Felt relaxed.

I was focused while using the system
1 – Strongly disagree / 2 – Disagree: Could not focus. Negative experience. Negative emotions. Uncomfortable. Dissatisfaction. Disapproval. Displeasure. → Axial Code: Not able to focus.
3 – Neither agree nor disagree: Neutral experience. Neutral emotions. Neither uncomfortable nor comfortable. Neither dissatisfied nor satisfied. → Axial Code: Neutral feeling.
4 – Agree / 5 – Strongly agree: Able to focus. Positive experience. Positive emotions. Comfortable. Satisfied. Approved. Enjoyed. Interested. → Axial Code: Able to focus.

I felt transported into a different environment
1 – Strongly disagree / 2 – Disagree: Did not feel the change of the atmosphere/environment. Negative experience. Negative emotions. Not accepting change. Uncomfortable. Dissatisfaction. Disapproval. Displeasure. → Axial Code: Negative environment/atmosphere.
3 – Neither agree nor disagree: Do not understand the change. Neutral experience. Neutral emotions. Neither uncomfortable nor comfortable. Neither dissatisfied nor satisfied. → Axial Code: Neutral feeling.
4 – Agree / 5 – Strongly agree: Felt the changing environment/atmosphere. Positive experience. Positive emotions. Comfortable. Satisfied. Approved. Enjoyed. Interested. → Axial Code: Positive atmosphere/environment.

I felt distracted by the visuals
1 – Strongly disagree / 2 – Disagree: Visuals were not distracting. Positive experience. Positive emotions. Comfortable. Satisfied. Approved. Enjoyed. Interested. → Axial Code: Visuals help engage with the system.
3 – Neither agree nor disagree: Neutral experience. Neutral emotions. Neither uncomfortable nor comfortable. Neither dissatisfied nor satisfied. → Axial Code: Neutral feeling.
4 – Agree / 5 – Strongly agree: Distracting visuals. Negative experience. Negative emotions. Not accepting change. Uncomfortable. Dissatisfaction. Disapproval. Displeasure. → Axial Code: Visuals do not help engage with the system.

I felt distracted by the sounds
1 – Strongly disagree / 2 – Disagree: Sounds were not distracting. Positive experience. Positive emotions. Comfortable. Satisfied. Approved. Enjoyed. Interested. → Axial Code: Sounds help engage with the system.
3 – Neither agree nor disagree: Neutral experience. Neutral emotions. Neither uncomfortable nor comfortable. Neither dissatisfied nor satisfied. → Axial Code: Neutral feeling.
4 – Agree / 5 – Strongly agree: Distracting sounds. Negative experience. Negative emotions. Not accepting change. Uncomfortable. Dissatisfaction. Disapproval. Displeasure. → Axial Code: Sounds do not help engage with the system.

I would recommend this system to a friend
1 – Strongly disagree / 2 – Disagree: Negative experience. Negative emotions. Not accepting change. Uncomfortable. Dissatisfaction. Disapproval. Displeasure. → Axial Code: Does not recommend the experience.
3 – Neither agree nor disagree: Neutral experience. Neutral emotions. Neither uncomfortable nor comfortable. Neither dissatisfied nor satisfied. → Axial Code: Neutral feeling.
4 – Agree / 5 – Strongly agree: Positive experience. Positive emotions. Comfortable. Satisfied. Approved. Enjoyed. Interested. → Axial Code: Recommends the experience.

I would give this system to a friend
1 – Strongly disagree / 2 – Disagree: Negative experience. Negative emotions. Not accepting change. Uncomfortable. Dissatisfaction. Disapproval. Displeasure. → Axial Code: Does not recommend the experience.
3 – Neither agree nor disagree: Neutral experience. Neutral emotions. Neither uncomfortable nor comfortable. Neither dissatisfied nor satisfied. → Axial Code: Neutral feeling.
4 – Agree / 5 – Strongly agree: Positive experience. Positive emotions. Comfortable. Satisfied. Approved. Enjoyed. Interested. → Axial Code: Recommends the experience.

I found the Sati Interactive System user-friendly
1 – Strongly disagree / 2 – Disagree: Not user-friendly. Negative experience. Negative emotions. Not accepting change. Uncomfortable. Dissatisfaction. Disapproval. Displeasure. → Axial Code: Not a user-friendly experience.
3 – Neither agree nor disagree: Neutral experience. Neutral emotions. Neither uncomfortable nor comfortable. Neither dissatisfied nor satisfied. → Axial Code: Neutral feeling.
4 – Agree / 5 – Strongly agree: User-friendly. Positive experience. Positive emotions. Comfortable. Satisfied. Approved. Enjoyed. Interested. → Axial Code: User-friendly experience.

I would use this system at home
1 – Strongly disagree / 2 – Disagree: Negative experience. Negative emotions. Not accepting change. Uncomfortable. Dissatisfaction. Disapproval. Displeasure. → Axial Code: Does not like the experience.
3 – Neither agree nor disagree: Neutral experience. Neutral emotions. Neither uncomfortable nor comfortable. Neither dissatisfied nor satisfied. → Axial Code: Neutral feeling.
4 – Agree / 5 – Strongly agree: Positive experience. Positive emotions. Comfortable. Satisfied. Approved. Enjoyed. Interested. → Axial Code: Likes the experience.

I feel that using this system daily would help reduce stress
1 – Strongly disagree / 2 – Disagree: Doesn’t help reduce stress. Not feeling relaxed. Negative experience. Negative emotions. Not accepting change. Uncomfortable. Dissatisfaction. Disapproval. Displeasure. → Axial Code: Did not feel relaxed.
3 – Neither agree nor disagree: Neutral experience. Neutral emotions. Neither uncomfortable nor comfortable. Neither dissatisfied nor satisfied. → Axial Code: Neutral feeling.
4 – Agree / 5 – Strongly agree: Helps reduce stress. Felt relaxed. Positive experience. Positive emotions. Comfortable. Satisfied. Approved. Enjoyed. Interested. → Axial Code: Felt relaxed.

I did not feel the time passing
1 – Strongly disagree / 2 – Disagree: Could not engage with the system. Negative experience. Negative emotions. Not accepting change. Uncomfortable. Dissatisfaction. Disapproval. Displeasure. → Axial Code: Could not engage with the system.
3 – Neither agree nor disagree: Neutral experience. Neutral emotions. Neither uncomfortable nor comfortable. Neither dissatisfied nor satisfied. → Axial Code: Neutral feeling.
4 – Agree / 5 – Strongly agree: Engaged in the system. Positive experience. Positive emotions. Comfortable. Satisfied. Approved. Enjoyed. Interested. → Axial Code: Was able to engage with the system.

I felt like 10 minutes is enough
1 – Strongly disagree / 2 – Disagree: Not enough time to engage. Negative experience. Negative emotions. Not accepting change. Uncomfortable. Dissatisfaction. Disapproval. Displeasure. → Axial Code: Not enough time for a positive experience.
3 – Neither agree nor disagree: Neutral experience. Neutral emotions. Neither uncomfortable nor comfortable. Neither dissatisfied nor satisfied. → Axial Code: Neutral feeling.
4 – Agree / 5 – Strongly agree: Enough time to engage. Positive experience. Positive emotions. Comfortable. Satisfied. Approved. Enjoyed. Interested. → Axial Code: Enough time for a positive experience.

I felt relaxed while using the system
1 – Strongly disagree / 2 – Disagree: Not relaxing. Negative experience. Negative emotions. Not accepting change. Uncomfortable. Dissatisfaction. Disapproval. Displeasure. → Axial Code: Not a relaxing experience.
3 – Neither agree nor disagree: Neutral experience. Neutral emotions. Neither uncomfortable nor comfortable. Neither dissatisfied nor satisfied. → Axial Code: Neutral feeling.
4 – Agree / 5 – Strongly agree: Felt relaxed. Positive experience. Positive emotions. Comfortable. Satisfied. Approved. Enjoyed. Interested. → Axial Code: A relaxing experience.

I felt like I was in a different ‘zone’ while using the system
1 – Strongly disagree / 2 – Disagree: Did not feel any change of environment. Audio-visual interaction did not have any effect. Negative experience. Negative emotions. Not accepting change. Uncomfortable. Dissatisfaction. Disapproval. Displeasure. → Axial Code: Audio-visual interaction did not have a positive effect on the experience.
3 – Neither agree nor disagree: Neutral experience. Neutral emotions. Neither uncomfortable nor comfortable. Neither dissatisfied nor satisfied. → Axial Code: Neutral feeling.
4 – Agree / 5 – Strongly agree: Felt the environment change. Audio-visual interaction is effective. Positive experience. Positive emotions. Comfortable. Satisfied. Approved. Enjoyed. Interested. → Axial Code: Audio-visual interaction had a positive effect on the experience.

I feel like I am still in a different ‘zone’ after using the system
1 – Strongly disagree / 2 – Disagree: Did not feel any change of environment. Audio-visual interaction did not have any effect. Negative experience. Negative emotions. Not accepting change. Uncomfortable. Dissatisfaction. Disapproval. Displeasure. → Axial Code: Audio-visual interaction did not have a positive effect on the experience.
3 – Neither agree nor disagree: Neutral experience. Neutral emotions. Neither uncomfortable nor comfortable. Neither dissatisfied nor satisfied. → Axial Code: Neutral feeling.
4 – Agree / 5 – Strongly agree: Felt the environment change and its effects are continuing. Audio-visual interaction is effective. Positive experience. Positive emotions. Comfortable. Satisfied. Approved. Enjoyed. Interested. → Axial Code: Audio-visual interaction has a positive effect on the experience.

Table 14 : Axial Code stage categorisation

Selective Code
In the Selective Code stage, shown in Table 15 below, the Selective Code is developed from the Refined Axial Code. The first column shows the Axial Code, the second column shows the Refined Axial Code, and the last column shows the Selective Code.

Axial Code | Refined Axial Code | Selective Code

Axial Codes: Did not feel relaxed; Not able to focus; Negative atmosphere/environment; Visuals do not help engage with the system; Sounds do not help engage with the system; Does not recommend the experience; Not a user-friendly experience; Does not like the experience; Could not engage with the system; Not enough time for a positive experience; Not a relaxing experience; Audio-visual interaction did not have a positive effect on the experience.
Refined Axial Code: Negative experience. → Selective Code: Does not create a sense of deep engagement.

Axial Code: Neutral feeling.
Refined Axial Code: Neutral feeling. → Selective Code: Neither.

Axial Codes: Felt relaxed; Able to focus; Positive atmosphere/environment; Visuals help engage with the system; Sounds help engage with the system; Recommends the experience; User-friendly experience; Likes the experience; Was able to engage with the system; Enough time for a positive experience; A relaxing experience; Audio-visual interaction had a positive effect on the experience.
Refined Axial Code: Positive experience. → Selective Code: Creates a sense of deep engagement.

Table 15 : Selective Code stage

Here, I have developed three Selective Codes: ‘Creates a sense of deep engagement’, ‘Does not create a sense of deep engagement’, and ‘Neither’. The ‘Neither’ Selective Code is not considered further because it provides neither a positive nor a negative answer, and this information is not useful for this research at this stage.

Next, I will discuss how the collected data is analysed using the three coding methods to find the Core-category.

Data Coding – Finding the Core-category
This section shows the process of finding the Core-category. The first step is to define a set of relevant questions/statements and a set of General Open Codes for them. A set of Open Codes is then created in relation to each of the five alternative answers to each question/statement; Axial Codes are created in relation to the Open Codes; the Axial Codes are refined into three Refined Axial Codes; and three Selective Codes are created in relation to the three Refined Axial Codes. The Selective Code that receives the most responses defines the Core-category.

As an example, if a participant answers ‘Strongly agree’ to the question ‘I felt relaxed after using the system’, the response falls into a positive Open Code, then a positive Axial Code, then the positive Selective Code category, and finally fits the Core-category of ‘Creates a sense of deep engagement’.
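This mapping from a Likert answer through the Open, Axial, and Selective Codes to the Core-category can be expressed as a small script. The sketch below is only an illustration of the logic, not the analysis tool used in this project; the function names and the REVERSE_SCORED set are my own labels, and the reverse-scored items reflect the way the ‘distracted by the visuals/sounds’ statements are coded in Table 13.

    # A minimal sketch (not the thesis code) of the Open -> Axial -> Selective coding
    # pipeline described above. Names are illustrative only.

    # Statements where agreement indicates a negative experience (reverse-scored),
    # following the coding of these statements in Table 13.
    REVERSE_SCORED = {
        "I felt distracted by the visuals",
        "I felt distracted by the sounds",
    }

    def axial_code(statement: str, likert: int) -> str:
        """Map a 1-5 Likert answer to a positive/neutral/negative Axial Code group."""
        if likert == 3:
            return "neutral"
        agreed = likert >= 4
        positive = (not agreed) if statement in REVERSE_SCORED else agreed
        return "positive" if positive else "negative"

    def core_category(responses):
        """Tally Selective Codes and return the Core-category (most frequent code)."""
        tally = {"positive": 0, "negative": 0, "neutral": 0}
        for statement, likert in responses:
            tally[axial_code(statement, likert)] += 1
        selective = {
            "positive": "Creates a sense of deep engagement",
            "negative": "Does not create a sense of deep engagement",
            "neutral": "Neither",
        }
        return selective[max(tally, key=tally.get)], tally

    # A 'Strongly agree' to "I felt relaxed after using the system" falls into the
    # positive Axial Code and the positive Selective Code, as in the example above.
    print(core_category([("I felt relaxed after using the system", 5)]))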

The table below is an example of the above-explained method. The first column shows the Question/statement, the second column shows the alternative answers, the third column shows the Axial Code, and the last column shows the Selective Code16.

Question/statement: I felt relaxed after using the system
1 – Strongly disagree / 2 – Disagree → Axial Code: Negative experience → Selective Code: Does not create a sense of deep engagement.
4 – Agree / 5 – Strongly agree → Axial Code: Positive experience → Selective Code: Creates a sense of deep engagement.
Table 16 : Example of data coding using one question.

The table below shows the 30 participants’ responses. This is discussed further in Results and Discussion on page 92. The first column shows the Axial Code, the second column shows the number of responses, the third column shows the Selective Code, and the last column shows the Core-category.

Axial Code | Responses | Selective Code
Negative responses | 35 | Does not create a sense of deep engagement.
Positive responses | 321 | Creates a sense of deep engagement.
Neutral responses | 94 | Neither
Core-Category: Creates a sense of deep engagement
Table 17 : Example of data coding using the 30 participants’ responses.

16 The Open Codes and the response ‘Neither agree nor disagree’ are not shown here for ease of understanding.

According to the example in Table 17 above, the Selective Code with the most responses – and therefore the Core-category – is ‘Creates a sense of deep engagement’. Therefore, I conclude that the Sati Interactive System can create a sense of deep engagement in the user.

Ethics
This research project has been approved as a minimum-risk project by the Fine Arts and Music Human Ethics Advisory Group (2019). This approval covers the Consent Form, Plain Language Statement, Privacy Databank Module, Recruitment Advertisement, and the Questionnaire. A copy of the approval letter can be found in Appendix 3 on page 132. The Plain Language Statement, Consent Form, and the Questionnaire are in Appendix 4 on page 133.

Conclusion
The Sati Interactive System is built using Max. I chose Max because I did not have to learn a new programming language, which saved time; Max is also designed for creating real-time interactive productions, and it makes it easy to build alternative prototypes while developing a program.

I have defined a low-cost system as one costing between AU$800 and AU$1,000, and I tested the Sati Interactive System on a variety of computers in this price range. For example, a laptop or desktop with an Intel i5 or AMD Ryzen 3 2.7GHz processor, 8GB of RAM, and a 1.5GB graphics card will run the system without audio-video lag.

The analysis process uses the three coding methods from Grounded Theory. This was an effective process because it filtered the questionnaire data down to a Core-category, which showed the effectiveness of the Sati Interactive System in creating a sense of deep engagement.

The following chapter discusses the development process of the Sati Interactive System.

Chapter 5: Development Process
This chapter discusses the development process of the Sati Interactive System, including: the interaction design framework, the developed software, aspects of the software, the interface design, the functionality of the interactive system, anticipated reactions of the user, and how the system could be used in other contexts.

Creative process
The focus of the Sati Interactive System is to create a sense of deep engagement in the user, rather than to create an artwork or an interactive system aimed at entertainment, education, training, or healthcare. It does not provide the user with achievements or level-ups, as in a first-person shooter game, and does not provide options to increase the difficulty; nor does it attempt to educate people.

Most computer-based interactive artworks have entertaining the participants as their main goal, and often aim to create a sense of excitement; examples include DJ Light created by CinimodStudio (2019), the Yeosu Spanish Pavilion project created by Zappulla et al. (2019), and Voice Array created by Lozano-Hemmer (2019). The main goal of computer-based interactive systems such as Lange et al.’s (2012) Interactive game-based rehabilitation using the Microsoft Kinect, Jaume-i-Capó et al.’s (2013) Interactive Rehabilitation System for Improvement of Balance Therapies in People With Cerebral Palsy, and the Augmented Chemistry: Interactive Education System by Singhal et al. (2012), is learning and developing or improving physical abilities.

Creating a sense of deep engagement is usually thought of as requiring stillness and quiet, with eyes closed, focusing on one object or idea. The Sati Interactive System instead requires the participant to move, see, and listen. These seem like opposing activities. The process of developing this system therefore required a new way of thinking, which is why I adopted an arts-based approach; this is introduced on page 1 and discussed in detail in Chapter 3: An Arts-based Approach: Sati Interactive System, Deep Engagement, and Transformative Experience on page 45.

Applying the principles/concepts of computer-based interactivity
Here I focus on the principles/concepts of computer-based interactivity and how they can be linked to arts-based interactivity. The literature on computer-based interactivity is discussed broadly in Computer-based Interactivity from page 15 to page 18. This link is important because I am using an arts-based approach, as discussed in Chapter 3: An Arts-based Approach: Sati Interactive System, Deep Engagement, and Transformative Experience on page 45 above, to develop the Sati Interactive System. An artwork is inherently designed to create a sense of deep engagement/transformative experience; this is what I want the user to experience when they interact with the Sati Interactive System.

Table 18 on the following page explains the concepts/principles of interactivity in a computer-based and in an arts-based approach, and how the two approaches are linked in the making of the Sati Interactive System. The first column shows the concept/principle of computer-based interactivity, the second column explains the concept/principle in computer-based interactivity, the third column explains it in arts-based interactivity, and the last column shows how the two approaches are linked in the making of the Sati Interactive System.

Principles/Concepts | Explanation in computer-based (CB) interactivity | Explanation in arts-based (AB) interactivity | How these two approaches are linked in the making of the Sati Interactive System

Directionality
CB: Typically a two-way, bi-directional stream of communication – typical of all computer-based interactions.
AB: Typically uni-directional, but this is developing with the advent of computer-based arts and art making.
Sati Interactive System: The system provides ‘bi-directionality’ as it communicates with the user in real time by providing feedback. For example, colour and sound are generated according to the user’s movements.

Responsiveness
CB: The ability to respond to current and previous communications, and the response speed.
AB: Typically non-responsive, but this is developing with the advent of computer-based arts and art making.
Sati Interactive System: The system responds to the user immediately using visuals and audio.

Awareness
CB: Understanding of context and meaning.
AB: An artwork is not typically aware of, or responsive to, its context or meaning. However, this is developing with computer-based interactive artworks.
Sati Interactive System: The system is aware of the user’s actions, such as movements and input via the user interface. However, it does not have an awareness of its context or meaning; the user can understand the context and meaning of the Sati Interactive System.

Selectivity
CB: The availability of choices for the user.
AB: The creator may have the ability to select the medium used to design the artwork; the user does not, although the user can select which artwork they engage with.
Sati Interactive System: ‘Selectivity’ is achieved by providing choices to the user. Sati Interactive gives the choice to use both colour and sound: for example, the user can select different colours, adjust the speed of the interaction, adjust the responsiveness of the interaction, and adjust the volume. The user interface includes a ‘Help’ button that provides information on using the system, and it also provides clear, detailed information for each available option, such as the ‘Play’ button, the various sliders, the ‘Full Screen’ button, and the audio controllers.

Being interchangeable
CB: Communication roles can be interchanged.
AB: Communication roles cannot be interchanged; typically, an artwork does not communicate with the user.
Sati Interactive System: ‘Being interchangeable’ is achieved by giving the user the opportunity to control certain factors of the system to meet their needs, according to the feedback provided by the system. Here, the user communicates with the Sati Interactive System and the communication roles are interchanged through that feedback.

Real-time communication
CB: Interactive media should be an instant exchange.
AB: An artwork can create real-time communication; for example, an artwork can create a transformative experience in the user at that moment.
Sati Interactive System: This is achieved by providing real-time feedback to the user. For example, the user receives feedback from the system in real time through colour and sound, which can also be seen as a form of communication between the user and the system.

Table 18 : The use of the principles/concepts of computer-based interactivity in the Sati Interactive System.

Development of the Interaction Design Framework
The development of the framework is based on the research discussed in Interaction Design Framework on page 31. This framework is used to develop the Sati Interactive System: its user interface, the user-product interactions, and the back-end program, resulting in the final product.

The framework takes an interaction-centred perspective that includes aspects of user experience, discussed in greater detail under Interaction Design Framework, on page 31. I am using this framework to create a low-cost design suitable for the initial development, not to test the Sati Interactive System.

Factors that influence experience
The first step in designing the framework is to understand what influences experience. Figure 25 shows a summary of the factors that influence user experience, as discussed in What influences experience? on page 31. This figure is developed from Figure 12 : Influences on experience (Forlizzi and Ford, 2000, p. 420) on page 32.

Figure 25 : User and product factors that influence user experience

Here, I have added further factors, combining aspects specific to HCI usability that affect user experience, such as efficiency, effectiveness, engagement, freedom from errors, and ease of learning. For example, a product that includes these features has a better chance of creating a positive user experience when competing with similar products17. Understanding what influences experience can help in developing an interactive system that promotes a positive user experience.

User-product interaction and experience of the Sati Interactive System
According to the framework of user-product interaction and its relation to user experience presented by Forlizzi and Battarbee (2004, p. 263), discussed in Types of experiences on page 33, the Sati Interactive System can be categorised as a product with an Expressive interaction that creates An Experience with aspects of Co-Experience.

Expressive interaction – The Sati Interactive System consists of functions that enable users to customise certain aspects of the software to their preferences. For example, the user can change the colour of the visuals, adjust volume levels, adjust the speed, and adjust the sensitivity of the camera.

An Experience – Using the Sati Interactive System is an activity that provides an experience. This activity has a beginning and an end, and it influences emotional changes. For example, the activity could be described as a ‘relaxing’ or ‘learning tai chi’ experience, and the interaction might influence positive or negative emotional changes.

17 For more information and examples on these features see Five E’s of Usability on page 27.

Aspects of Co-Experience – The Sati Interactive System does not inherently allow Co-Experience. However, Co-Experience is created when an individual user’s experience is verbally shared with others18.

The Interaction Design Framework for the Sati Interactive System
The interaction design framework for the Sati Interactive System is shown in Figure 26.

Figure 26 : Interaction design framework for Sati Interactive System

The figure above shows the interaction between a user, a product – in this case the Sati Interactive System – and the designer’s role of understanding user experience. Here, the designer’s role includes understanding both the user-based and the product-based aspects that influence user experience. This framework can be used to create any interactive system.

Using the Framework
According to the framework, the designer initially has no control over factors such as social and cultural influences, the prior experience of the user, and the personal background of the user. Understanding these limitations is the first step in using the framework.

In Table 19 below, I analyse each component of the Interaction Design Framework and discuss how it is used in the design process of the Sati Interactive System.

18 Further developments of the Sati Interactive System will allow Co-Experience through web-based systems.

Aspects that influence experience | How it is used in the Sati Interactive System
Emotions – positive emotions | Using variations of colour and sound to enhance positive emotions. See Colour, Sound, and Movement on page 6.
Stimuli – vision, sound | Positive emotions are stimulated through vision and sound using colour, sound, body movement techniques, and real-time interactive motions. See Colour, Sound, and Movement on page 6.
Product features | Ability to change the colour of the visuals; ability to adjust the speed, camera sensitivity, and volume; ability to plug in various cameras. See User Interface Design on page 86.
Aesthetics | Simple graphical interface; smooth interactive visuals; interactive sounds. See User Interface Design on page 86.
Ease of access | The Sati Interactive System can be copied to any computer with the required specifications and played (see page 56); small file size. See Software on page 83.
User-friendliness | Direct and short instructions.
Simplicity | Simple user-interface design that includes direct instructions. See User Interface Design on page 86.
Efficiency | The software is programmed using simple algorithms and coding techniques to deliver a smooth interaction, using resources efficiently. See Functionality of the Sati Interactive System on page 85.
Effectiveness | The system is designed with a focus only on the end goal; the user interface is designed to be user-friendly and simple enough to be understood by anyone; the system is designed to deliver smooth interactions and quality visuals and sounds. See User Interface Design on page 86.
Engaging | The system uses visuals and sounds to create an attractive environment that engages the user. See Colour, Sound, and Movement on page 6.
Error-free | The Sati Interactive System is tested on both Windows and Mac systems. See Functionality of the Sati Interactive System on page 85.
Table 19 : Using the Interaction Design Framework

In conclusion, the aspects that influence user experience have been used in the development of the Sati Interactive System. The use of aspects such as product features, aesthetics, ease of access, simplicity, efficiency, effectiveness, and engagement is discussed later in this chapter. As mentioned at the start of Chapter 5: Development Process on page 78, I am using this framework to create a simple, low-cost design suitable for development.

The Development of the System – Software and Hardware
Here, I discuss the initial system design of the Sati Interactive System, including the hardware used, the software functionality, and the interface design. First, I discuss the Sati Interactive System setup used at the University of Melbourne: I tested how it would work in an immersive environment, and then tested it with participants to see whether it creates a sense of deep engagement in the user19. I then discuss anticipated reactions of the user and how the Sati Interactive System can be used in other contexts.

Sati Interactive System setup used at the University of Melbourne
I used an isolated environment to test the Sati Interactive System with participants because:

19 This is discussed in Chapter 5 on page 77.

• There are no distractions or external influences – this allows the user to focus on the activity without being distracted, which is an important factor in helping the user achieve a sense of sati (Gunaratana, 2010, Bodhi, 2011).
• A discrete environment might provide a more immersive experience – studies have shown that an immersive experience can enhance learning (Dede, 2009).

Hardware
The Sati Interactive System setup used at the University of Melbourne consists of the laptop I used to develop the system, as described on page 56, an external webcam, a projector, and external speakers. The system was tested in the Immersive Space 1 at the University of Melbourne (Room 202, Level 3, Arts West – Building 148)20. However, the Sati Interactive System is designed to be used on a low-cost computer equipped with, or connected to, a camera, speakers, and a monitor/display/TV, as defined on page 57.

Software
Max is used to design the Sati Interactive System; this was discussed in Max on page 56. The system is designed to perform tasks such as analysing the webcam data (body movement), comparing it with the reference video, playing audio clips according to different body movements, and projecting various visual elements according to the analysed data. The system runs on Windows and macOS.

Sati Interactive System Test Setup
Figure 27 shows the diagram of the Sati Interactive System setup used for testing in the Immersive Space 1 at the University of Melbourne.

Figure 27 : Sati Interactive System

The Sati Interactive System test setup consists of a computer with a projector, an external webcam, and two external speakers connected to it. A detailed description of the Sati Interactive software application is given in the next section.

Figure 29 below shows the Sati Interactive System test setup at the Immersive Space. This is discussed in greater detail in Chapter 6: Testing Process and the Results on page 91.

20 While this is not a low-cost system, the intention here was to test the effect of the Sati Interactive System on the users.

Figure 28 : A participant using the Sati Interactive System at the University of Melbourne

Figure 29 : The Tai Chi Master – the image on the right is without colours, the image on the left is with colours

Functionality of the Sati Interactive System
Here, I use Figure 30 to describe the functionality of the Sati Interactive System software.

Figure 30 : Functionality of the Sati Interactive System

As shown in Figure 30, the Sati Interactive software has two inputs: the reference video21 (Tai Chi) and the Camera (User). First, the Tai Chi reference video is duplicated to create two outputs. The first output processes the original resolution of the video and creates an outline of the Tai Chi Master. The second output reduces the resolution to 5 by 5 pixels.

The same process is applied to the input from the camera: the first output uses the original camera resolution to create an outline of the user, and the second output uses a reduced 5 by 5 pixel resolution. The camera resolution is reduced to make it easier to compare the user’s movements with the reference video movements (the Tai Chi Master’s movements) and to determine whether the user is in sync with the reference video. For example, as shown in Figure 31, when the user’s movements and the Tai Chi Master’s movements

Figure 31 : (a) not synced (b) when synced

21 I have used a Tai Chi video as the reference video here; however, the user can select their own video. In the examples presented here, the ability to add different reference videos has been disabled. The recommended video resolution is 480p with a frame rate of 30fps, because the application will only output a maximum resolution of 480p. The reference video must have a plain background, and the character in the video must be dressed in a colour different from the background colour.

are in sync, the Tai Chi Master’s outline image is filled with colour (b); when they are not in sync, the colour fades out (a).
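The comparison itself reduces to measuring how close two 5 by 5 luminance grids are. The sketch below, assuming NumPy and OpenCV are available, illustrates that idea outside Max: both frames are reduced to 5 by 5 greyscale grids and their difference is turned into a 0–1 ‘in sync’ score that could drive the colour fill of the outline. The exact metric used in the Max patch is not reproduced here.

    # Minimal sketch (assumes numpy + opencv-python) of the 5x5 movement comparison:
    # each frame is reduced to a 5x5 grayscale grid and the mean difference between
    # the two grids becomes a 0-1 "in sync" score.
    import cv2
    import numpy as np

    def sync_score(reference_frame: np.ndarray, camera_frame: np.ndarray) -> float:
        ref = cv2.cvtColor(reference_frame, cv2.COLOR_BGR2GRAY)
        cam = cv2.cvtColor(camera_frame, cv2.COLOR_BGR2GRAY)
        ref_small = cv2.resize(ref, (5, 5), interpolation=cv2.INTER_AREA)
        cam_small = cv2.resize(cam, (5, 5), interpolation=cv2.INTER_AREA)
        # Mean absolute difference of the 25 cells, normalised to the 0-1 range.
        diff = np.abs(ref_small.astype(float) - cam_small.astype(float)).mean() / 255.0
        return 1.0 - diff  # 1.0 = grids match, 0.0 = completely different

    if __name__ == "__main__":
        a = np.full((480, 640, 3), 40, dtype=np.uint8)  # synthetic test frames
        b = np.full((480, 640, 3), 40, dtype=np.uint8)
        print(round(sync_score(a, b), 2))  # identical frames -> 1.0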

The audio changes according to the user’s movements. Each movement triggers different natural sounds, such as ocean waves, birds, creeks, and waterfalls. The reasons for these sound choices were discussed in Selected Nature Sounds on page 13.
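The patch description does not spell out exactly how a movement is mapped to a particular sound, so the snippet below is only one plausible illustration: frame-to-frame change is measured in four regions of the camera image, and the most active region selects one of the four nature sounds. The file names and the threshold are hypothetical, not part of the shipped application.

    # Illustrative only: one plausible mapping from movement to the four nature sounds.
    # Frame-to-frame change is measured per quadrant of the camera image and the most
    # active quadrant selects a (hypothetical) sound file.
    import numpy as np

    SOUNDS = ["ocean_waves.wav", "birds.wav", "creek.wav", "waterfall.wav"]

    def pick_sound(previous: np.ndarray, current: np.ndarray, threshold: float = 8.0):
        """Return a sound file for the quadrant with the most movement, or None."""
        diff = np.abs(current.astype(float) - previous.astype(float))
        h, w = diff.shape[:2]
        quadrants = [diff[:h // 2, :w // 2], diff[:h // 2, w // 2:],
                     diff[h // 2:, :w // 2], diff[h // 2:, w // 2:]]
        activity = [q.mean() for q in quadrants]
        if max(activity) < threshold:        # ignore noise when the user is still
            return None
        return SOUNDS[int(np.argmax(activity))]

    if __name__ == "__main__":
        prev = np.zeros((480, 640), dtype=np.uint8)
        curr = prev.copy()
        curr[:240, 320:] = 200               # simulate movement in the top-right quadrant
        print(pick_sound(prev, curr))        # -> birds.wav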

User Interface Design
The user interface design is based on the Interaction Design Framework discussed on page 31. Figure 32 below shows the user interface of the Sati Interactive System.

Figure 32 : User interface of the Sati Interactive System

Here, aspects from the interaction design framework – effectiveness, efficiency, engagement, aesthetics, simplicity, and user-friendliness – have been applied. I have used a consistent menu design, including:

• one colour scheme – blue and white
• simple and direct words – for example: play, volume, select colour, and speed
• large buttons – the buttons on the left: Play [Space], Full Screen [Esc], Help, and the Close button
• dropdown menus – Select Colour (under Background Colour) and Integrated Webcam menus

• controllers – Speed, Difficulty, Volume, and Mute.

When the mouse is held over an object, a drop-down help message appears.

Additionally, I have included the following features listed in the Table 20 below.

Features added to the main menu:
• Change the background colour of the projected image
• Change the speed of the interaction
• Change the difficulty level
• Adjust the volume level
• Select external cameras
• Apply full screen mode
• Access instructions

Table 20 : Features added to the main menu of the Sati Interactive System

The interface design also consists of a ‘Help’ statement that can be accessed by pressing the Help button on the left side of the main menu. Figure 33 below shows the ‘Help’ statement that includes instructions on using the system and its features.

Figure 33 : Help statement

Figure 34 below shows a screenshot of the Sati Interactive System designed using the Max software; this is the internal programming behind the interface. The section in red shows the reference video input, the section in yellow shows the camera input, the section in green shows the comparison of the reference video and the camera, the section in orange shows the audio functions, the section in blue shows the main menu functions, and the section in purple is the final video output. Each of these sections has different boxes that perform different functions and are connected through wires.

Figure 34 : A screenshot of the Sati Interactive System designed using Max

How the Sati Interactive System may be used in other contexts
The Sati Interactive System has potential uses for other purposes, such as dance training, yoga training, exercise training, and entertainment. To test this, I replaced the Tai Chi video with a dance video; images from this self-test are shown in Figure 36 below.

If users want to add a different video, they can drag their video on top of the playlist shown in Figure 35 below. It is essential that the video is 480p and no more than 30fps; while other codecs may work, .mp4 is recommended. There should be high contrast between the character and the background of the video. Dragging and dropping a video onto the playlist adds the new video to the playlist; however, if you drag and hold the video on top of the playlist until a red outline appears and then drop it, it will replace the existing video in the playlist.
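Because a replacement video needs to be 480p at no more than 30fps, a quick check of a candidate file can save a failed drag-and-drop. The helper below is an optional sketch using OpenCV’s video property queries; it is not part of the Sati Interactive application.

    # Optional check (not part of the Sati Interactive application) that a candidate
    # reference video meets the stated constraints: 480p and at most 30 fps.
    import cv2

    def check_reference_video(path: str) -> bool:
        video = cv2.VideoCapture(path)
        if not video.isOpened():
            print(f"Could not open {path}")
            return False
        height = int(video.get(cv2.CAP_PROP_FRAME_HEIGHT))
        fps = video.get(cv2.CAP_PROP_FPS)
        video.release()
        ok = height == 480 and fps <= 30
        print(f"{path}: {height}p at {fps:.0f} fps -> {'OK' if ok else 'not suitable'}")
        return ok

    # Example (hypothetical file name):
    # check_reference_video("my_dance_video.mp4")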

Examples of how this system can be used for other purposes include:

• teaching in other modalities such as dance, sport, and yoga training. • interactive entertainment such as dancing or movement-oriented games and interactive art installations. • health-based programs such as physical activity improvements.


Figure 35 : Where to drag and drop your movie file

Figure 36 : Images of the dance video

Conclusion
In this chapter I have discussed the development process of the Sati Interactive System, including how principles/concepts of computer-based interactivity can be linked to arts-based interactivity. The Sati Interactive System was developed using a user-product interaction design framework focusing on the user’s prior experience, emotions, and stimuli. I also focused on aspects of the product such as its features, aesthetics, ease of access, user-friendliness, simplicity, language, efficiency, effectiveness, engagement, and freedom from errors. This process resulted in the Sati Interactive System, which has features developed from the framework I designed based on Forlizzi, Ford, and Battarbee’s approaches discussed on page 31. These include: an interface that is simple and easy to use; features designed to enhance user experience; an application that works on Windows and Mac operating systems; and a system that is easy to set up, is low-cost, and can integrate with a home entertainment system.

Chapter 6: Testing Process and the Results
This chapter discusses how I tested the Sati Interactive System with participants, and the test results. It includes information on how the test was conducted; the results gathered from the questionnaires; the raw data; the averages; the positive, negative, and neutral responses according to the Likert scale used; and the process of establishing the Core-categories used to assess the Sati Interactive System. The development of this process is discussed in detail from page 58 to page 77 above. I then discuss the findings from the results and the suggested developments to the system.

Testing the Sati Interactive System
The system used for testing with participants
The testing was conducted in the Immersive Space 1 at the University of Melbourne (Room 202, Level 3, Arts West – Building 148) on four separate days within two weeks. The laptop running the Sati Interactive System was connected to the projector and the sound system in the Immersive Space. More information about this setup can be found in Sati Interactive System setup used at the University of Melbourne on page 82. A Dell XPS 15 9560 laptop was used to run the Sati Interactive application; it has an Intel i7 2.8GHz processor, 16GB of RAM, and a dedicated 4GB GTX 1050 graphics card. More information on the hardware can be found on page 56.

Participants
The test involved 30 participants who were asked to try the Sati Interactive System. They included university staff members, students from different universities, and people not related to the university. All participants were above the age of 18; however, age was not taken into consideration because it is not relevant to the process.

Participants were invited by me and by University of Melbourne staff via email, Twitter (@Digital Studio UM), and face-to-face communication, and participants also invited their friends to be involved. See Appendix 9 on page 142 for the Twitter advertisement.

Task and Procedure
The initial plan was to conduct 30 identical trials of the Sati Interactive System. However, I noticed that three out of the first five participants were focusing only on trying to align themselves with the Tai Chi Master’s outline, and kept adjusting themselves during the movement. I felt that this interfered with what the Sati Interactive System is trying to create – a sense of deep engagement – because these users lost focus while trying to adjust their pose, which prevented them from relaxing and enjoying the colour and sounds. These participants said that it was hard to keep up with the Tai Chi Master’s movements. As a result of this observation, I decided to conduct two types of trial: (1) the user can only see the Tai Chi Master (no user image outline), and (2) the user can see their own image and the Tai Chi Master on the screen (with user image outline). There were 15 participants for each type of trial.

Figure 37 shows the difference between (1) no user image outline and (2) with user image outline.

Figure 37 : (1) no user image outline (2) with user image outline

Participants received a hard copy of the Consent Form and the Plain Language Statement prior to participating. They were also provided with a brief description of the project. Immediately after engaging with the Sati Interactive System, the participants were asked to answer a short questionnaire. Each participant engaged with the Sati Interactive System for 5-10 minutes. All 30 participants completed the questionnaire, and a few provided additional comments on the questionnaire itself. Some gave personal comments, which I wrote down as additional notes; these are discussed later in this chapter.

The Plain Language Statement, Consent Form, and Questionnaire are in Appendix 4 on page 133.

Measures
As mentioned in Chapter 4: Methodology on page 56, the Sati Interactive System is evaluated using the three coding methods: Open coding, Axial coding, and Selective coding. First, the raw data is analysed using the Open Code and Axial Code, measuring the total number of positive, negative, and neutral experiences. This data is then used to choose the Selective Code and the Core-category, which provide the final evaluation. This is discussed in greater detail under Data Coding on page 76.

Results and Discussion
Here I discuss the results gathered from the trials. First, the 15 trials conducted with the user image outline are discussed; second, the 15 trials conducted without the user image outline; and lastly, I compare both sets of trials.

Data will be analysed under these topics:

• 15 trials with the user image outline
  o Responses per Likert question
  o Responses per participant
• 15 trials without the user image outline
  o Responses per Likert question
  o Responses per participant
• Comparison of all data

With user image outline
Responses per Likert question
Here, I analyse the data according to each Likert question. Table 21 shows the 15 participants’ responses according to the Axial Code on page 69. A more detailed table that includes the individual questionnaire responses can be found in Appendix 6 on page 139. The data in Table 21 is also represented in the chart shown in Figure 38, to make it easier to understand and compare multiple continuous data sets.

Question | Positive experience | Negative experience | Neutral (Axial Code points)
I felt relaxed after using the system | 13 | 0 | 2
I was focused while using the system | 15 | 0 | 0
I felt transported into a different environment | 8 | 1 | 6
I felt distracted by the visuals | 9 | 4 | 2
I felt distracted by the sounds | 12 | 3 | 0
I would recommend this system to a friend | 9 | 1 | 5
I would give this system to a friend | 11 | 1 | 3
I found the Sati Interactive System user-friendly | 13 | 0 | 2
I would use this system at home | 10 | 1 | 4
I feel that using this system daily would help reduce stress | 11 | 1 | 3
I did not feel the time passing | 12 | 0 | 3
I felt like 10 minutes is enough | 10 | 1 | 4
I felt relaxed while using the system | 12 | 0 | 3
I felt like I was in a different ‘zone’ while using the system | 10 | 1 | 4
I feel like I am still in a different ‘zone’ after using the system | 8 | 0 | 7
Total | 163 | 14 | 48
Table 21 : Data collected from the Sati Interactive System with the user image outline

Figure 38 : Chart – Responses per question (with user image outline)

After analysing the data according to the Data Coding method discussed in Applying the three coding methods on page 61, these 15 participants provided 163 positive responses, 14 negative responses, and 48 neutral responses.
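The per-question totals in Table 21 can be reproduced with a short tally over the coded responses. The sketch below assumes each response has already been coded as 1 (positive), -1 (negative), or 0 (neutral), as in Table 22; the function name and the example data structure are illustrative.

    # Sketch of the per-question tally behind Table 21, assuming responses are already
    # coded as 1 (positive), -1 (negative), or 0 (neutral) as in Table 22.
    def tally_per_question(coded: dict[str, list[int]]) -> dict[str, tuple[int, int, int]]:
        """Return (positive, negative, neutral) counts for each question."""
        return {
            question: (values.count(1), values.count(-1), values.count(0))
            for question, values in coded.items()
        }

    example = {
        "I felt relaxed after using the system": [0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 1, 1, 1],
        "I was focused while using the system": [1] * 15,
    }
    print(tally_per_question(example))
    # -> {'I felt relaxed after using the system': (13, 0, 2),
    #     'I was focused while using the system': (15, 0, 0)}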

Data shows that all 15 participants have agreed to the statement: “I was focused while using the system”. Thirteen participants have agreed to the statement: “I felt relaxed after using the system” and “I found the Sati Interactive System user-friendly”. Twelve participants have stated that the sound did not distract them, and they also felt relaxed while using the system. There is a significant difference between the positives and the negatives. However, there are more neutral responses than negative responses.

Four participants stated that they felt distracted by the visuals, and three participants stated that they felt distracted by the sounds. Seven participants neither agreed nor disagreed with the statement: “I feel like I am still in a different ‘zone’ after using the system”. Six participants neither agreed nor disagreed with feeling transported into a different environment.

These responses indicate that the participants felt they gained a positive experience from the Sati Interactive System; this is discussed further below when analysing the responses per participant.

Responses per Participant
Table 22 shows how each participant responded to each question. This data is also displayed in the chart shown in Figure 39.

• Positive responses: 1, Green
• Negative responses: -1, Red
• Neutral responses: 0, White

Responses of participants 1–15, in order:
I felt relaxed after using the system: 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 1, 1, 1
I was focused while using the system: 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1
I felt transported into a different environment: -1, 1, 0, 0, 0, 1, 1, 1, 0, 1, 1, 0, 1, 0, 1
I felt distracted by the visuals: -1, 1, 1, 1, 0, -1, -1, 1, 1, 0, -1, 1, 1, 1, 1
I felt distracted by the sounds: 1, 1, 1, 1, 1, -1, -1, 1, -1, 1, 1, 1, 1, 1, 1
I would recommend this system to a friend: 1, 1, 1, -1, 0, 1, 0, 0, 1, 0, 1, 0, 1, 1, 1
I would give this system to a friend: 1, 1, 1, 1, -1, 1, 1, 0, 1, 0, 1, 0, 1, 1, 1
I found the Sati Interactive System user-friendly: 1, 1, 1, 1, 1, 1, 0, 0, 1, 1, 1, 1, 1, 1, 1
I would use this system at home: 1, 1, 0, 1, -1, 1, 0, 1, 1, 0, 1, 0, 1, 1, 1
I feel that using this system daily would help reduce stress: 1, 1, 0, 1, 1, 1, -1, 1, 0, 1, 1, 0, 1, 1, 1
I did not feel the time passing: 1, 0, 1, 1, 1, 1, 0, 1, 1, 0, 1, 1, 1, 1, 1
I felt like 10 minutes is enough: 0, 0, -1, 1, 1, 1, 1, 1, 1, 0, 1, 1, 1, 0, 1
I felt relaxed while using the system: 0, 1, 1, 1, 1, 1, 0, 1, 1, 1, 1, 0, 1, 1, 1
I felt like I was in a different ‘zone’ while using the system: -1, 1, 0, 1, 0, 1, 1, 1, 0, 1, 1, 0, 1, 1, 1
I feel like I am still in a different ‘zone’ after using the system: 1, 1, 0, 0, 1, 1, 0, 0, 1, 1, 1, 0, 1, 0, 0
Total individual positive responses (out of 15): 9, 13, 9, 12, 9, 13, 6, 11, 11, 9, 14, 6, 15, 12, 14
Total individual positive responses (%): 60, 87, 60, 80, 60, 87, 40, 73, 73, 60, 93, 40, 100, 80, 93
Table 22 : Responses per participant – with user image outline

Figure 39 : Chart – Responses per participant (With user image outline)

The individual participant data shows that most participants gave a positive response, with a significant difference between positive and negative responses. 87% of the participants provided positive responses to more than 50% of the questions. All 15 participants gave a positive response to the statement: “I was focused while using the system”. The data also indicates that most participants felt relaxed while and after using the system. Only two participants provided positive responses to fewer than 50% of the questions. There are more neutral responses than negative responses.
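The per-participant percentages quoted here come from the same coded matrix as Table 22. A brief sketch of that summary, again assuming the 1/-1/0 coding:

    # Sketch of the per-participant summary behind Table 22: the share of statements
    # each participant answered positively, using the same 1 / -1 / 0 coding.
    def positive_percentages(matrix: list[list[int]]) -> list[float]:
        """matrix[q][p] is the coded response of participant p to question q."""
        participants = len(matrix[0])
        questions = len(matrix)
        return [
            100 * sum(1 for q in range(questions) if matrix[q][p] == 1) / questions
            for p in range(participants)
        ]

    # Two-question example with three participants; the full table uses all 15 statements.
    demo = [
        [0, 1, 1],   # participant 1 neutral, participants 2-3 positive
        [1, 1, 1],
    ]
    print(positive_percentages(demo))  # -> [50.0, 100.0, 100.0]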

The results show fewer positive responses to statements such as: “I felt transported into a different environment”, “I would recommend this system to a friend”, “I would use this system at home”, and “I feel like I am still in a different ‘zone’ after using the system”. Four participants felt distracted by the visuals.

In conclusion, this analysis shows that most participants found that the Sati Interactive System helped them achieve a sense of relaxation and the ability to focus on that specific moment.

No user image outline
Responses per Likert question
Here I analyse the data collected from the trials conducted without the user image outline. Table 23 shows the 15 participants’ responses according to the Axial Code discussed on page 69. A more detailed table that includes the individual questionnaire data can be found in Appendix 7 on page 140. Figure 40 shows a chart representing the data in Table 23.

Question | Positive experience | Negative experience | Neutral (Axial Code points)
I felt relaxed after using the system | 14 | 0 | 1
I was focused while using the system | 15 | 0 | 0
I felt transported into a different environment | 7 | 3 | 5
I felt distracted by the visuals | 9 | 6 | 0
I felt distracted by the sounds | 12 | 2 | 1
I would recommend this system to a friend | 13 | 0 | 2
I would give this system to a friend | 11 | 0 | 4
I found the Sati Interactive System user-friendly | 12 | 1 | 2
I would use this system at home | 10 | 1 | 4
I feel that using this system daily would help reduce stress | 11 | 0 | 4
I did not feel the time passing | 8 | 1 | 6
I felt like 10 minutes is enough | 10 | 1 | 4
I felt relaxed while using the system | 14 | 0 | 1
I felt like I was in a different ‘zone’ while using the system | 9 | 2 | 4
I feel like I am still in a different ‘zone’ after using the system | 3 | 4 | 8
Total | 158 | 21 | 46
Table 23 : Data collected from the Sati Interactive System without the user image outline

Figure 40 : Chart – Responses per question (No user image outline)

Results show that these 15 participants provided 158 positive responses, 21 negative responses, and 46 neutral responses.

The data shows that all 15 participants agreed with the statement: “I was focused while using the system”. Fourteen participants felt relaxed while and after using the system. Twelve participants stated that the sounds were not distracting, and ten participants said that they would use the Sati Interactive System at home.

Six participants felt distracted by the visuals, and two participants felt distracted by the sounds. Eight participants neither agreed nor disagreed with the statement: “I feel like I am still in a different ‘zone’ after using the system”. Four participants neither agreed nor disagreed with feeling transported into a different environment.

These responses also indicate that the participants felt they gained a positive experience from the Sati Interactive System. The responses per participant are analysed and discussed below.

Responses per Participant
Table 24 shows how each participant responded to each question; this data is also displayed in the chart shown in Figure 41.

• Positive responses: 1, Green
• Negative responses: -1, Red

• Neutral responses: 0, White

Participants
Question | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 14 | 15
I felt relaxed after using the system | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1
I was focused while using the system | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1
I felt transported into a different environment | -1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 1 | 1 | -1 | 1 | 0 | 0 | -1
I felt distracted by the visuals | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | -1 | -1 | -1 | -1 | -1 | -1
I felt distracted by the sounds | 1 | 1 | -1 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | -1 | 1
I would recommend this system to a friend | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 1 | 0 | 1
I would give this system to a friend | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 1 | 0 | 1
I found the Sati Interactive System user-friendly | 1 | 1 | 1 | 1 | 1 | 0 | -1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1
I would use this system at home | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | -1 | 0
I feel that using this system daily would help reduce stress | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 1
I did not feel the time passing | 0 | 1 | -1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1
I felt like 10 minutes is enough | 1 | 1 | 1 | 1 | -1 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 1 | 1 | 0
I felt relaxed while using the system | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1
I felt like I was in a different ‘zone’ while using the system | -1 | 1 | 1 | 1 | 0 | 1 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | -1 | 1
I feel like I am still in a different ‘zone’ after using the system | -1 | 1 | 1 | -1 | 0 | 0 | -1 | 0 | 0 | 0 | 0 | 1 | 0 | -1 | 0
Total individual positive responses (out of 15) | 9 | 13 | 13 | 14 | 11 | 12 | 10 | 11 | 12 | 12 | 7 | 13 | 9 | 2 | 10
Total individual positive responses (%) | 60 | 87 | 87 | 93 | 73 | 80 | 67 | 73 | 80 | 80 | 47 | 87 | 60 | 13 | 67

Table 24 : Responses per participant – no outline
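As an illustration of this coding scheme, the following is a minimal sketch in Python (the system itself is built in Max, and this is not the thesis’s own code) showing how a five-point Likert answer might be recoded into the 1 / 0 / -1 values used in Table 24. Treating disagreement with negatively worded items (such as “I felt distracted by the visuals”) as a positive experience is an assumption inferred from the tallies above.

# Minimal sketch: recode five-point Likert answers into the 1 / 0 / -1 scheme of Table 24.
# Assumption: negatively worded items are reverse-scored, so disagreement counts as a
# positive experience.

NEGATIVE_ITEMS = {"I felt distracted by the visuals",
                  "I felt distracted by the sounds"}  # assumed wording

def recode(question: str, answer: str) -> int:
    """Map a Likert answer to +1 (positive experience), -1 (negative), or 0 (neutral)."""
    agree = answer in ("Agree", "Strongly agree")
    disagree = answer in ("Disagree", "Strongly disagree")
    if question in NEGATIVE_ITEMS:      # reverse-score negatively worded items
        agree, disagree = disagree, agree
    if agree:
        return 1
    if disagree:
        return -1
    return 0                            # "Neither agree nor disagree"

print(recode("I felt relaxed after using the system", "Agree"))      # 1
print(recode("I felt distracted by the visuals", "Disagree"))        # 1
print(recode("I felt distracted by the visuals", "Strongly agree"))  # -1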

[Chart: Responses per participant; positive, negative, and neutral response counts for each of the 15 participants]

Figure 41 : Chart – Responses per participant (No user image outline)

Data analysed according to the responses per participant indicates an overall positive response to the Sati Interactive System. Thirteen participants provided positive responses to more than 50% of the questions, indicating a significant difference between positive and negative responses. Results also show that all participants were focused on the activity while using the system. Most participants felt relaxed while and after using the system. 73% of the participants felt that using this system daily would help reduce stress.

Six participants stated that the visual elements were distracting. Four participants disagreed with the statement: “I feel like I am still in a different ‘zone’ after using the system”, and eight participants neither agreed nor disagreed with it. Only one person disagreed with both the statements “I felt like I was in a different ‘zone’ while using the system” and “I felt transported into a different environment”. Two participants provided negative responses to more than 50% of the questions.

This analysis shows that most participants provided a positive response to the questionnaire. Below I compare the data gathered from the two analyses.

Interpretation of the data

Comparison of the two analyses
Here I compare the results gathered from the two analyses: with user image outline, and no user image outline. Table 25 below shows a summary of the analyses discussed above.

With user image outline

Responses per Likert question: Positive
• Fifteen participants agreed with the statement: “I was focused while using the system”.
• Thirteen participants agreed with the statements: “I felt relaxed after using the system” and “I found the Sati Interactive System user-friendly”.
• Twelve participants stated that the sound did not distract them, and they also felt relaxed while using the system.
• There is a significant difference between the positives and the negatives.
• There are more neutral responses than negative responses.

Responses per Likert question: Negative
• Four participants stated that they felt distracted by the visuals.
• Three participants stated that they felt distracted by the sounds.
• Seven participants stated that they neither agree nor disagree with the statement: “I feel like I am still in a different ‘zone’ after using the system”.
• Six participants stated that they neither agree nor disagree with feeling transported into a different environment.

Responses per participant: Positive
• Most participants showed a positive response.
• There is a significant gap between positive and negative responses.
• 87% of the participants provided positive responses to more than 50% of the questions.
• All 15 participants displayed a positive response to the statement: “I was focused while using the system”.
• Data also indicates that most participants felt relaxed while and after using the system.
• Only two participants provided negative responses to more than 50% of the questions.

Responses per participant: Negative
• Fewer positive responses to questions such as “I felt transported into a different environment”, “I would recommend this system to a friend”, “I would use this system at home”, and “I feel like I am still in a different ‘zone’ after using the system”.
• Four participants felt distracted by the visuals. (This may have been because the user was trying to align their outline on the screen with the Tai Chi master’s outline.)
• There are more neutral responses compared to negative responses.

No user image outline

Responses per Likert question: Positive
• Fifteen participants agreed with the statement: “I was focused while using the system”.
• Fourteen participants felt relaxed while and after using the system.
• Twelve participants stated that the sounds were not distracting, and ten said they would use the Sati Interactive System at home.

Responses per Likert question: Negative
• Six participants stated that they felt distracted by the visuals.
• Two participants stated that they felt distracted by the sounds.
• Eight participants stated that they neither agree nor disagree with the statement: “I feel like I am still in a different ‘zone’ after using the system”.
• Four participants stated that they neither agree nor disagree with feeling transported into a different environment.

Responses per participant: Positive
• Overall positive response.
• Thirteen participants provided positive responses to more than 50% of the questions.
• A significant difference between positive and negative responses.
• Results also show that all participants were focused on the activity while using the system.
• Most participants felt relaxed while and after using the system.
• 73% of the participants felt that using this system daily would help reduce stress.

Responses per participant: Negative
• Six participants stated that the visual elements were distracting.
• Four participants disagreed with the statement: “I feel like I am still in a different ‘zone’ after using the system”, and eight participants neither agree nor disagree with it.
• Only one person disagreed with both the statements “I felt like I was in a different ‘zone’ while using the system” and “I felt transported into a different environment”.
• Two participants provided negative responses to more than 50% of the questions.

Table 25 : Summary of the analysis

According to the comparison, there is no significant difference between the responses of the two trials: with user image outline and no user image outline. However, there are minor differences in some areas. The trials conducted with the user image outline have more positive responses, fewer negative responses, and more fluctuation between positive and neutral responses, compared with the responses gathered from the no user image outline trials.

This suggests that a user image outline provides better interaction with the system, thus helping the user achieve a better sense of deep engagement. For this reason, I have created two different systems: one with the user image outline and one without it. However, as mentioned above, there is no significant difference in the user experience of the two systems; both can create a sense of deep engagement in the user. An overall analysis using all 30 participants’ responses is conducted below.

An overall analysis of data gathered from all the participants
The data is analysed according to the responses per participant and the responses per question. I use charts to compare and analyse this data.

Responses per Participant
The chart displayed in Figure 42 shows the 30 participants’ responses to the questionnaire.

[Chart: Responses per Participant; positive, negative, and neutral response counts for each of the 30 participants]

Figure 42 : Individual responses per participant

According to the results, both trials show an overall positive response, and negative responses are lower than neutral responses. Only two participants, participants 14 and 27, showed results that differ from this trend. The highest number of negative responses, five, was made by participant 14, who also gave eight neutral responses. Participant 27 gave six positive responses and nine neutral responses, but no negative responses. I consider these responses anomalies. Twenty-five participants gave at least one neutral response and seven participants gave more than five neutral responses, which indicates that neutral responses are common among participants. Of the total responses, 8% are negative, 71% are positive, and 21% are neutral. This data suggests that most participants felt relaxed and focused while and after using the Sati Interactive System. Below I analyse the data according to the responses per Likert question.
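The per-participant tallies and overall percentages described here can be illustrated with a small Python sketch. This is not the analysis pipeline used in the thesis: participant 14’s row is copied from the corresponding column of Table 24 (2 positive, 5 negative, 8 neutral, matching the counts reported above), participant 27’s row is a hypothetical example consistent with the counts described above, and the rule used to flag rows that depart from the trend is an assumption.

# Minimal sketch of per-participant tallies and overall percentages.
responses = {
    14: [0, 1, 0, -1, -1, 0, 0, 0, -1, 0, 0, 1, 0, -1, -1],  # values from Table 24
    27: [1, 1, 0, 0, 0, 1, 0, 1, 0, 1, 0, 0, 1, 0, 0],       # hypothetical row
}

totals = {"positive": 0, "negative": 0, "neutral": 0}
for pid, row in responses.items():
    pos, neg, neu = row.count(1), row.count(-1), row.count(0)
    totals["positive"] += pos
    totals["negative"] += neg
    totals["neutral"] += neu
    # Assumed rule of thumb for flagging rows that depart from the overall trend.
    if neg > pos or neu >= 8:
        print(f"Participant {pid}: {pos} positive, {neg} negative, {neu} neutral")

n = sum(totals.values())
for kind, count in totals.items():
    print(f"{kind}: {count} ({count / n:.0%})")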

Responses per Question
Figure 43 shows a chart representing the data organised according to responses per question.

[Chart: Responses per Question; positive, negative, and neutral response counts for each of the 15 questions]

Figure 43 : Responses per Question

The chart displayed in Figure 43 illustrates positive, negative, and neutral responses. Positive responses are higher than negative and neutral responses for every question except question 15, which has more neutral responses than positive or negative responses. Responses to question 3, “I felt transported into a different environment”, question 4, “I felt distracted by the visuals”, question 14, “I felt like I was in a different ‘zone’ while using the system”, and question 15, “I feel like I am still in a different ‘zone’ after using the system”, show a difference from the overall trend. This difference may be because these questions require a more subjective response from the participants than the other questions, which can be answered with a simple yes or no. The terms “environment” and “zone” may be interpreted differently by each user, and this could be what causes the lower positive responses and higher neutral responses. However, it is interesting to note that question 14 has an overall positive response even though it uses the same subjective term “zone”.

Negative responses to question 4, “I felt distracted by the visuals”, suggest that the quality of the visual elements could be improved; however, most participants did not see this as a problem.

Additional comments provided by the participants
Below is feedback provided by some of the participants; some of it was written on the questionnaire and some participants spoke to me after participating. I did not solicit these comments.

1. Sound felt immersive (10 people).
2. Want more colours on the screen (6 people).
3. Camera position was not good (2 people).
4. Gets tired trying to keep up (2 people).
5. Trying to compare the movements using the outline is a struggle (2 people).
6. Need more time to get used to the movements (2 people).
7. Hard to understand left and right-angled movements from the Tai Chi master video (1 person).
8. Did not understand the questions “I would give this system to a friend” and “I felt like 10 minutes is enough” (1 person).
9. Did not feel like being in a different ‘zone’ after using the system because the trial took place during work time (1 person).

The comments are interesting and useful; however, in many cases there is not much that can be done to improve the user experience. Regarding these comments: it is possible for the user to change the colours on the screen (2), the Sati Interactive System can possibly improve the user’s physical fitness (4), and the user can adjust the speed of the Tai Chi master’s movement (6 and 7).

Summary of the questionnaire responses
Overall, there are 71% positive responses, 8% negative responses, and 21% neutral responses. This indicates that the Sati Interactive System is successful in its goals. The fact that most participants provided a positive response suggests that the system helped them relax and focus. There was no significant difference between the experience of using the user image outline and not using it. Two systems are therefore provided: one with the user image outline view and one without it.

The final evaluation is decided by the Selective coding process outlined in the methodology. This is discussed below.

Finding the Core category through Selective coding
Here, the three Selective codes are used to determine the Core category, as outlined in the methodology. This was discussed in detail in Selective Code on page 75.

Table 26 below shows the total number of positive, negative, and neutral responses.

(Axial Code points)
Question | Positive responses | Negative responses | Neutral responses
I felt relaxed after using the system | 27 | 0 | 3
I was focused while using the system | 30 | 0 | 0
I felt transported into a different environment | 15 | 4 | 11
I felt distracted by the visuals | 18 | 10 | 2
I felt distracted by the sounds | 24 | 5 | 1
I would recommend this system to a friend | 22 | 1 | 7
I would give this system to a friend | 22 | 1 | 7
I found the Sati Interactive System user-friendly | 25 | 1 | 4
I would use this system at home | 20 | 2 | 8
I feel that using this system daily would help reduce stress | 22 | 1 | 7
I did not feel the time passing | 20 | 1 | 9
I felt like 10 minutes is enough | 20 | 2 | 8
I felt relaxed while using the system | 26 | 0 | 4
I felt like I was in a different ‘zone’ while using the system | 19 | 3 | 8
I feel like I am still in a different ‘zone’ after using the system | 11 | 4 | 15
Total | 321 | 35 | 94
Average | 71% | 8% | 21%

Table 26 : Total positive, negative, and neutral responses

According to the overall responses, there are 321 positive responses, 35 negative responses, and 94 neutral responses. The positive responses represent 71% of the total responses.

Table 27 below shows the total responses processed through Selective coding.

Axial Code | Responses | Selective Code
Negative responses | 35 | Does not create a sense of deep engagement
Positive responses | 321 | Creates a sense of deep engagement
Neutral responses | 94 | Neither
Core category: Creates a sense of deep engagement

Table 27 : Selective coding process
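The Selective coding step summarised in Table 27 can be read as a simple majority rule over the axial-code tallies from Table 26. The following Python sketch illustrates that rule; it is an illustration under that assumption, not the thesis’s own tooling, and the tally figures are taken from Table 26.

# Minimal sketch of the Selective coding step summarised in Table 27.
axial_tallies = {"positive": 321, "negative": 35, "neutral": 94}

selective_codes = {
    "positive": "Creates a sense of deep engagement",
    "negative": "Does not create a sense of deep engagement",
    "neutral":  "Neither",
}

total = sum(axial_tallies.values())
for code, count in axial_tallies.items():
    print(f"{selective_codes[code]}: {count} ({count / total:.0%})")

# Core category: the selective code backed by the largest share of responses.
core = max(axial_tallies, key=axial_tallies.get)
print("Core category:", selective_codes[core])   # Creates a sense of deep engagement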

Conclusion
Using the adapted Grounded Theory process discussed on page 58, this analysis shows that most participants felt that the Sati Interactive System helped them achieve a sense of relaxation and the ability to focus on that specific moment: a sense of deep engagement. While only one testing process, with a limited number of participants, was carried out, we can surmise that similar responses would occur if we were to do more testing in different environments. The questions were designed to be interpreted by the user. However, using the process outlined in Table 27, each of these questions can be interpreted as having a positive, negative, or neutral response, and as can be seen, the positive responses far outweigh the negative responses.

While the responses to the questionnaires show that the Sati Interactive System is successful in creating a sense of deep engagement in the participants, this success may be questioned due to the comparatively small number of respondents. However, this number of respondents, or fewer, is not unusual in this field. For example, Castellano et al. (2007) had only 10 participants when recognising human emotions from body movement and gesture dynamics, Dow et al. (2007) had only 12 participants when exploring the impact of immersive technologies on presence and engagement in an interactive drama, and Yang et al. (2011) had only 23 respondents when considering the effect of yoga on type 2 diabetes.

Chapter 7: Conclusion

Project Summary
This thesis explored the process of developing a computer-based interactive system that creates a sense of deep engagement: the Sati Interactive System. Results of the Sati Interactive System trials show that it can create an experience of being relaxed, focused, and not feeling the passing of time, a sense of deep engagement, and therefore a transformative experience in the user. This experience may not be easily expressed in words, similar to how the experience of art may not be easily expressed in words.

The program uses sounds of nature, the Tai Chi movement technique, and four types of colours. Nature sounds and colours are used to create positive emotions. The Tai Chi movement technique is used as a visual guide to help engage the user in an activity, not for its ability to calm and relax the mind and body. It is important to note that any other kind of video can be used, as discussed on page 88.

Here, a sense of deep engagement is considered to be the experience of being relaxed, focused, and not feeling the passing of time. The expression ‘sense of deep engagement’ is derived from the Buddhist concept of ‘Sati’, which is being aware of and paying attention to the present moment. This thesis focuses on ‘Sati’ for its ability to calm and relax oneself and to focus the mind on a single objective, without disruptive thoughts, when dealing with stressful situations while controlling emotions.

Here an arts-based approach is the process of creating an idea or experience that is not easily defined or put into words in its audience, the people experiencing the work.

Responding to the Hypothesis
While the number of people who participated in the testing was comparatively small, their responses validate the hypothesis “A sense of deep engagement can be enhanced using computer-based interactive technology through an arts-based perspective”, by indicating a majority (more than 70%) of positive responses.

Responding to the Question
The research question “How can interactive technology be used to create a sense of deep engagement?” is answered by demonstrating a method that combines aspects of colour, sound, and Tai Chi movements with a computer-based interactive system.

Defining a low-cost system
Here, low-cost is defined as a computer system that costs around $AU800-$AU1000 with a minimum specification of an Intel i5 or AMD Ryzen 3 2.7 GHz processor, 8GB RAM, and a 1.5GB graphics card. Please see Defining a low-cost system on page 57 for more details.

New knowledge brought to current computer-based interactivity in the arts
Below I discuss how I responded to the four gaps identified on page 2:

1. “there are few low-cost computer-based interactive systems”: by developing the Sati Interactive System, a low-cost computer-based interactive system, as defined on page 57. The Sati Interactive System requires a computer, screen, speakers, and a webcam. This system is designed for in-home use, but the software can be used in other situations such as a gallery, education, or training on more expensive hardware. This contributes to the field of computer-based interactivity because it provides an approach to developing computer-based interactive systems with a primary focus on being low-cost and available to as many people as possible.

2. “there are few computer-based interactive systems that are explicitly designed to create a sense of deep engagement”: by designing a computer-based interactive system with a primary focus of creating a sense of deep engagement in the user. This is a significant difference from interactive systems that are designed with the focus of being educational, entertaining, or providing or gathering information, and with a commercial intention.

3. “an arts-based approach to the design of such a system has not been explicitly explored”: by creating an idea or experience in the audience through integrating colour, sound, and movement, the basic elements of an artwork. The Sati Interactive System integrates colour, sound, and movement, the basic elements of an artwork, as discussed on page 55, with computer-based interactivity to create a sense of deep engagement in the user. Here the Sati Interactive System uses these three elements to create a transformative experience and thereby a sense of deep engagement. It uses these three elements of an artwork to create this transformative experience without explicitly creating an artwork. This contributes to the field of computer-based interactivity in the arts by providing an arts-based approach to developing computer-based interactive systems.

4. “interactive systems are often designed with one intention or outcome, and not to be flexible in their uses”: by developing a computer-based interactive system that can be used in entertainment, healthcare, and training. For example, it can be used in entertainment as an interactive game for many age groups, in healthcare as therapeutic movement, and in training for dance and sports. By allowing users to insert their own content, they may find other uses for the Sati Interactive System.

Development Framework and Philosophies
The Sati Interactive System is developed based on an ‘interaction design framework’ which is built using a set of HCI concepts and philosophies. The framework consists of aspects that influence user experience such as emotions, stimuli, aesthetics, and product features. Please see Development of the Interaction Design Framework on page 80 for more details. The philosophy of technology is based on Borgmann’s (1987) and Ihde’s (1990) philosophies, discussed in Philosophy of Technology for Human Computer Interaction on page 35, which focus on the following aspects:

• user experience rather than usability
• understanding the value of emotional attachments
• understanding a low-cost, simple, easy-to-use system
• understanding the values of human-technology relations.

The following four paragraphs are directly related to each of these dot points.

Considering ‘user experience rather than usability’ can be seen in the simple user interface design. The interface is divided into two parts for easy identification and to provide a clear thought progression: everything can be accessed with one click, the user can start the program by clicking the ‘play’ button or pressing the ‘spacebar’, full screen mode can be accessed by pressing the ‘Esc’ key, and there is no clutter. Please see Figure 32: User interface of the Sati Interactive System on page 86 and Figure 33: Help statement on page 87 for more details. There are also contextual drop-down help menus available for each element in the Sati Interactive System.

Considering ‘understanding the value of emotional attachments’ can be seen in the use of colour, sound, and Tai Chi movement. Colour and sound are used for their strong connection to human emotions. For instance, blue expresses positive emotions such as calmness, relaxation, happiness, peace, and comfort, as well as negative feelings such as sadness and depression, while red expresses negative emotions and associations such as fury, anger, blood, intensity, and violence, and positive ones such as warmth, solidity, energy, and passion (Naz and Helen, 2004, Naz and Epps, 2004, Mahnke, 1996, Saito, 1996). Nature sounds can generate positive emotions (Alvarsson et al., 2010, Benfield et al., 2014, Annerstedt et al., 2013, Cutshall et al., 2011). Tai Chi movement is used as a form of movement to guide the user in an activity, not for its benefits. Please see the discussions on colour, sound, and movement on pages 6 to 14 for more details.

‘Understanding a low-cost, simple, easy-to-use system’ can be seen in the focus directed towards the development of a low-cost in-home system that does not use technologies such as VR and CAVE systems, which are costly and require specialist knowledge, a dedicated space, and specific equipment. Please see Limitations of the technology used in computer-based interactivity on page 26 for more information.

‘Understanding the values of human-technology relations’ can be seen in the use of technology to create embodiment relations (using body movements as the input), hermeneutical relations (understanding and living in the world created by the Sati Interactive System), and alterity relations (the user interacting with a quasi-otherness, i.e., the Tai Chi master). Please see Ihde’s human-technology relations on page 38 for more information.

Methodology
This project used the three coding methods from Grounded Theory: Open coding, Axial coding, and Selective coding (Corbin and Strauss, 1990). Data was collected using a Likert scale questionnaire and analysed using the three coding methods. The purpose of collecting and analysing data was to identify whether the Sati Interactive System can be used to achieve a sense of deep engagement; using a simple questionnaire immediately after people engaged with the Sati Interactive System was considered the most effective way to ascertain responses. While the questionnaire did not directly ask “did you feel a sense of deep engagement?”, it addressed the qualities of a sense of deep engagement: being relaxed, focused, and with reduced stress (calm), the attributes outlined on page 1. Participants also indicated that they felt a sense of timelessness and of being in a different zone; these could also be considered attributes of feeling a sense of deep engagement. The overwhelmingly positive responses indicate that the hypothesis can be validated. Please see Chapter 4: Methodology on page 56 for more details.

Sati Interactive System as an Artwork, or Not
An artwork is often intended to create a sense of deep engagement in its audience. This may be to make the audience feel the emotions the maker intends, such as in Munch’s ‘The Scream’ (Munch, 1893), where he wanted to show his feelings of anxiety, in ‘Grave of the Fireflies’ (Takahata, 1988), where the writer wanted to show the consequences of war and the suffering of people in Japan after the war, or Bock, Harnick and Stein’s ‘Fiddler on the Roof’ (Bock et al., 1964), where the creators wanted to show the challenges of religious traditions. The ideas discussed in these artworks are easily spoken; the ideas they intend to communicate are easily expressible through words. There are also creative works such as Bach’s ‘Well-Tempered Clavier’ (Bach, 1722), Pollock’s ‘Number 11’ (Pollock, 1952), or Beckett’s ‘Quad’ (Beckett, 1981) which are designed to generate a response or sense in the audience, potentially of deep engagement, that is ineffable; the ideas of these artworks are not easily expressible through words.

The test results show that the Sati Interactive System creates a sense of deep engagement in the user that is expressible through words. A sense of deep engagement was defined as being relaxed, focused, and not feeling the passing of time (see page 1). These three phrases describe feelings that are expressible as words. Participants understood the terms ‘relaxed’, ‘focused’, and ‘not feeling the passing of time’ and responded to them according to their experiences. Participants provided positive responses to the questions “I felt relaxed while using the system”, “I felt relaxed after using the system”, and “I was focused while using the system”, and the question “I did not feel the time passing” had only one negative response.

The success of this project has opened up other possible uses for the Sati Interactive System, as mentioned in How the Sati Interactive System may be used in other contexts on page 88. Developing the Sati Interactive System to be used in other contexts would add further benefits to a low-cost system. The Sati Interactive System can be commercialised as a product to suit various markets, such as a Tai Chi training application, a dance training application, an interactive art installation in galleries or public places, and a health-based program for physical and psychological wellbeing.

Further developments for the future of the Sati Interactive System
Possible enhancements to the Sati Interactive System are in areas such as interactivity, visuals, and other extra features. Interactivity could be developed by providing the ability to automatically detect the user and fit them to the screen position, enhanced camera and video resolutions and frame rates (although this would require a more expensive system), additional sound and video banks, voice activation, and availability across a variety of platforms including iOS and Android. These developments may enhance the interactivity between the user and the system and create a better immersive engagement.

Conclusion
Developing the Sati Interactive System required exploring three main themes:

• Computer-based interactivity, philosophy of technology for human computer interaction, computer software and hardware to develop the system;

• Arts-based approach: to create an experience of Sati, deep engagement, and transformative experience without creating an expressive artwork;
• Coding methods to understand the effect of the Sati Interactive System.

These are broad and wide-ranging topics, and discussing them was required in the development of the Sati Interactive System. The Sati Interactive System is used to explore a process for creating a sense of deep engagement. This thesis shows that computer-based interactive technology can be used to integrate colour, sound, and movement to create a sense of deep engagement. This thesis assumes that the purpose of an artwork is to create a mental state, in this case a sense of deep engagement, in the perceiver, and that this experience is considered the artwork. Developing a system that creates this sense of deep engagement through an HCI-based perspective has added new knowledge to the field of computer-based interactive arts.

People go to see an artwork to engage with it. Through that engagement we have a transformative experience, such as gaining new ideas, a change in mood, or being inspired; this is a form of deep engagement. The Sati Interactive System is developed specifically to create that sense of deep engagement.

107 Bibliography Maurice Denis. 2010. EUROPEAN PARLIAMENTARY COMMITTEE ADOPTS REPORT ON A EUROPEAN INITIATIVE ON ALZHEIMER'S DISEASE AND OTHER DEMENTIAS. 2016. Understanding Interactive Learning [Online]. Scholastic. Available: http://www.scholastic.com/parents/resources/article/your-child- technology/understanding-interactive-learning [Accessed 3rd February 2016]. 2017. Interactive Learning: Increasing Student Participation through Shorter Exercise Cycles. ACM International Conference Proceeding Series Volume Part F126225, 17. 2019. Research ethics and integrity [Online]. @unimelb. Available: https://finearts- music.unimelb.edu.au/current-students/research-students/research-ethics-and-integrity2 [Accessed]. 2020a. Archaic Torso of Apollo [Online]. poets.org. Available: https://poets.org/poem/archaic-torso- apollo [Accessed 3 June 2020]. 2020b. Bluebeard. While Listening to a Tape Recording of Béla Bartók´s opera “Duke Bluebeard´s Castle” [Online]. pina-bausch.de. Available: http://www.pina-bausch.de/en/works/complete- works/show/blaubart-beim-anhoeren-einer-tonbandaufnahme-von-bela-bartoks-oper- herzog-blaubarts-burg/ [Accessed 20 July 2020 2020]. 2020c. Genasys [Online]. Genasys, Inc. Available: https://genasys.com/lrad/law-enforcement/ [Accessed 17 June 2020 2020]. 2020d. Lexico Lexico. Oxford University Press. 2020e. Marcel Duchamp [Online]. tate.org.uk. Available: https://www.tate.org.uk/art/artists/marcel- duchamp-1036 [Accessed 09/07/2020 2020]. 2020f. Mark Rothko Untitled (1968) [Online]. The Museum of Modern Art. Available: https://www.moma.org/collection/works/37042?sov_referrer=artist&artist_id=5047&page= 1 [Accessed 7 June 2020]. 2020g. The Rite of Spring [Online]. pina-bausch.de. Available: http://www.pina- bausch.de/en/works/complete-works/show/das-fruehlingsopfer/ [Accessed 20 July 2020 2020]. 2020h. The Starry Night [Online]. MoMA. Available: https://www.moma.org/learn/moma_learning/vincent-van-gogh-the-starry-night-1889/ [Accessed 09/07/2020 2020]. 2021. HATICE [Online]. hatice.eu. Available: http://www.hatice.eu/ [Accessed 2019]. 3D SBS VIDEOS HD COMPILATIONS 2016. Documentary: Planet Earth Amazing Nature 3D SBS. ABRAS, C., MALONEY-KRICHMAR, D. & PREECE, J. 2004. User-centered design. Bainbridge, W. Encyclopedia of Human-Computer Interaction. Thousand Oaks: Sage Publications, 37, 445- 456. ACHARYA, S. & SHUKLA, S. 2012. Mirror neurons: Enigma of the metaphysical modular brain. Journal of natural science, biology, and medicine, 3, 118-124. ACTIVISION. 2019. Call of Duty® [Online]. @CallofDuty. Available: https://www.callofduty.com//content/atvi/callofduty/hub/web/en_au/home [Accessed 2019]. ADAMS, A., LUNT, P. & CAIRNS, P. 2008. A qualititative approach to HCI research. ADELSON, J. L. & MCCOACH, D. B. 2010. Measuring the mathematical attitudes of elementary students: The effects of a 4-point or 5-point Likert-type scale. Educational and Psychological measurement, 70, 796-807. ADOBE. 2019. Adobe Photoshop | Best photo, image and design editing software [Online]. Available: https://www.adobe.com/au/products/photoshop.html [Accessed].

108 ADOBEAUDITION. 2019. Adjust the amplitude and compression effects [Online]. Available: https://helpx.adobe.com/au/audition/using/amplitude-compression- effects.html#normalize_effect_waveform_editor_only [Accessed]. ADOLPH, S., HALL, W. & KRUCHTEN, P. 2011. Using grounded theory to study the experience of software development. Empirical Software Engineering, 16, 487-513. ALBEN, L. 1996. Quality of experience: defining the criteria for effective interaction design. interactions, 3, 11-15. ALDRIDGE, K. 1993. The use of music to relieve pre-operational anxiety in children attending day surgery. The Australian Journal of Music Therapy, 4, 19-35. ALVARSSON, J. J., WIENS, S. & NILSSON, M. E. 2010. Stress recovery during exposure to nature sound and environmental noise. International journal of environmental research and public health, 7, 1036-1046. ALVES FERNANDES, L. M., CRUZ MATOS, G., AZEVEDO, D., RODRIGUES NUNES, R., PAREDES, H., MORGADO, L., BARBOSA, L. F., MARTINS, P., FONSECA, B., CRISTÓVÃO, P., DE CARVALHO, F. & CARDOSO, B. 2016. Exploring educational immersive videogames: an empirical study with a 3D multimodal interaction prototype. Behaviour & Information Technology, 35, 907-918. AMAZON. 2020. Desktops [Online]. Available: https://www.amazon.com.au/Optiplex-Processor- Computer-Wireless- Professional/dp/B07Z4MVYRZ/ref=asc_df_B07Z4MVYRZ/?tag=googleshopdsk- 22&linkCode=df0&hvadid=341743943895&hvpos=&hvnetw=g&hvrand=5704911129460483 155&hvpone=&hvptwo=&hvqmt=&hvdev=c&hvdvcmdl=&hvlocint=&hvlocphy=9071321&hv targid=pla-835041758085&psc=1 [Accessed 11 May 2020 2020]. ANĀLAYO 2003. Satipaṭṭhāna: The direct path to realization, Windhorse. ANDERSON, F., ANNETT, M. & BISCHOF, W. F. 2010. Lean on Wii: physical rehabilitation with virtual reality Wii peripherals. Stud Health Technol Inform, 154, 229-34. ANNERSTEDT, M., JÖNSSON, P., WALLERGÅRD, M., JOHANSSON, G., KARLSON, B., GRAHN, P., HANSEN, Å. M. & WÄHRBORG, P. 2013. Inducing physiological stress recovery with sounds of nature in a virtual reality forest—Results from a pilot study. Physiology & behavior, 118, 240- 250. ANNETTA, L. A., MINOGUE, J., HOLMES, S. Y. & CHENG, M.-T. 2009. Investigating the impact of video games on high school students’ engagement and learning about genetics. Computers & Education, 53, 74-85. APPLE. 2020a. iMac [Online]. Available: https://www.apple.com/au/shop/buy-mac/imac [Accessed 11 May 2020 2020]. APPLE. 2020b. MacBook Pro [Online]. Available: https://www.apple.com/au/shop/buy- mac/macbook-pro/13-inch [Accessed 11 May 2020 2020]. ARTHUR, K. W. & BROOKS JR, F. P. 2000. Effects of field of view on performance with head-mounted displays. University of North Carolina at Chapel Hill. ATARI. 2018. Atari - Pong [Online]. Available: https://www.atari.com/search/node/pong [Accessed]. AUSBURN, L. J. & AUSBURN, F. B. 2004. Desktop Virtual Reality: A Powerful New Technology for Teaching and Research in Industrial Teacher Education. Journal of Industrial Teacher Education, 41, 1-16. BACH, J. S. 1722. The Well-Tempered Clavier. Germany: unknown. BAER, R. A. 2003. Mindfulness training as a clinical intervention: A conceptual and empirical review. Clinical psychology: Science and practice, 10, 125-143. BAILENSON, J. N., YEE, N., BLASCOVICH, J., BEALL, A. C., LUNDBLAD, N. & JIN, M. 2008. The Use of Immersive Virtual Reality in the Learning Sciences: Digital Transformations of Teachers, Students, and Social Context. Taylor & Francis Group. BALKWILL, L.-L. & THOMPSON, W. F. 1999. 
A cross-cultural investigation of the perception of emotion in music: Psychophysical and cultural cues. Music perception: an interdisciplinary journal, 17, 43-64.

109 BALKWILL, L. L., THOMPSON, W. F. & MATSUNAGA, R. 2004. Recognition of emotion in japanese, western, and hindustani music by japanese listeners 1. Japanese Psychological Research, 46, 337-349. BALLAST, D. 2002. Interior design reference manual. Professional Pub. Inc.: Belmont, CA. BANG, P. 2019. Procato [Online]. Available: http://www.procato.com/home/ [Accessed]. BARDZELL, J. Interaction criticism and aesthetics. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 2009. ACM, 2357-2366. BARTNECK, C. & FORLIZZI, J. A design-centred framework for social human-robot interaction. RO-MAN 2004. 13th IEEE International Workshop on Robot and Human Interactive Communication (IEEE Catalog No. 04TH8759), 2004. IEEE, 591-594. BATES, J. 1992. Virtual reality, art, and entertainment. Presence: Teleoperators & Virtual Environments, 1, 133-138. BATTARBEE, K. & KOSKINEN, I. 2005. Co-experience: user experience as interaction. CoDesign, 1, 5-18. BAUMER, E. P., KHOVANSKAYA, V., MATTHEWS, M., REYNOLDS, L., SCHWANDA SOSIK, V. & GAY, G. Reviewing reflection: on the use of reflection in interactive system design. Proceedings of the 2014 conference on Designing interactive systems, 2014. ACM, 93-102. BAUSCH, P. 1978. Le Sacre du Printemps. BECK, S. The therapeutic use of music for cancer-related pain. Oncology nursing forum, 1991. 1327- 1337. BECKETT, S. 1981. Quad. Germany: Süddeutscher Rundfunk. BENFIELD, J. A., TAFF, B. D., NEWMAN, P. & SMYTH, J. 2014. Natural sound facilitates mood recovery. Ecopsychology, 6, 183-188. BERNSTEIN, C. 2019. Course Syllabi [Online]. Pennsylvania: upenn.edu. Available: http://www.writing.upenn.edu/bernstein/syllabi/readings/Rilke-Archaic.html [Accessed 5 June 2020]. BERROL, C. F. 2006. Neuroscience meets dance/movement therapy: Mirror neurons, the therapeutic process and empathy. The Arts in Psychotherapy, 33, 302-315. BERTELSEN, O. W. 2006. Tertiary Artifacts at the interface. Aesthetic computing, 357-368. BERTELSEN, O. W. & POLD, S. Criticism as an approach to interface aesthetics. Proceedings of the third Nordic conference on Human-computer interaction, 2004. ACM, 23-32. BEVAN, N. 2001. International standards for HCI and usability. International journal of human- computer studies, 55, 533-552. BEVAN, N. Extending quality in use to provide a framework for usability measurement. International Conference on Human Centered Design, 2009. Springer, 13-22. BEVAN, N., CARTER, J. & HARKER, S. ISO 9241-11 revised: What have we learnt about usability since 1998? International Conference on Human-Computer Interaction, 2015. Springer, 143-151. BEYER, H. & HOLTZBLATT, K. 1997. Contextual Design: A Customer-Centered Approach to Systems Designs (Morgan Kaufmann Series in Interactive Technologies). BHIKKHU, B., POTPARIC, O. & BUDDHA, G. 1995. The Middle Length Discourses of the Buddha: A Translation of the Majjhima Nikaya, Simon and Schuster. BHUVANESH, E. 2013. Issues and perspectives in meditation research: In search for a definition. Frontiers in Psychology, Vol 3 (2013). BIANCHI-BERTHOUZE, N. 2013. Understanding the role of body movement in player engagement. Human–Computer Interaction, 28, 40-75. BIANCHI-BERTHOUZE, N., KIM, W. W. & PATEL, D. Does body movement engage you more in digital game play? and why? International conference on affective computing and intelligent interaction, 2007. Springer, 102-113. BIOCCA, F. & DELANEY, B. 1995. Immersive virtual reality technology. Communication in the age of virtual reality, 57-124. BIRREN, F. 1961. 
Color psychology and color therapy: a factual study of the influence of color on human life, Secaucus, N. J., The Citadel Press.

110 BIRREN, F. 2016. Color psychology and color therapy; a factual study of the influence of color on human life, Pickle Partners Publishing. BIRRINGER, J. 1986. Pina Bausch: Dancing across Borders. The Drama Review: TDR, 30, 85-97. BISHOP, D. T., KARAGEORGHIS, C. I. & LOIZOU, G. 2007. A grounded theory of young tennis players’ use of music to manipulate emotional state. Journal of Sport and Exercise Psychology, 29, 584- 607. BLACK, A. 1998. Empathic design. User focused strategies for innovation. Proceedings of new product development. BLOOM, J., HUNKER, R., MCCOMBS, K., RAUDENBUSH, B. & WRIGHT, T. 2008. Nintendo Wii vs. Microsoft Xbox: Differential effects on mood, physiology, snacking behavior, and caloric burn. Appetite, 2, 354. BOCK, J., HARNICK, S. & STEIN, J. 1964. Fiddler on the Roof. Broadway: Imperial Theatre. BODHI, B. 2011. What does mindfulness really mean? A canonical perspective. Contemporary Buddhism, 12, 19-39. BØDKER, S. When second wave HCI meets third wave challenges. Proceedings of the 4th Nordic conference on Human-computer interaction: changing roles, 2006. ACM, 1-8. BOND, K. E. & STINSON, S. W. 2000. “I Feel Like I'm Going to Take Off!”: Young People's Experiences of the Superordinary in Dance. Dance Research Journal, 32, 52-87. BONTCHEV, B. & VASSILEVA, D. Assessing engagement in an emotionally-adaptive applied game. Proceedings of the fourth international conference on technological ecosystems for enhancing multiculturality, 2016. 747-754. BOONE, H. N. & BOONE, D. A. 2012. Analyzing likert data. Journal of extension, 50, 1-5. BORGMANN, A. 1987. Technology and the character of contemporary life: A philosophical inquiry, University of Chicago Press. BORGMANN, A. 1993. Crossing the postmodern divide, University of Chicago Press. BORGMANN, A. 1999. Holding on to Reality: The Nature of Information at the Turn of the Millennium, University of Chicago Press. BORGMANN, A. 2000. Reply to my critics. BRELSFORD, J. W. Physics Education in a Virtual Environment. 1993. HFES, 1286-1290. BROCKMYER, J. H., FOX, C. M., CURTISS, K. A., MCBROOM, E., BURKHART, K. M. & PIDRUZNY, J. N. 2009. The development of the Game Engagement Questionnaire: A measure of engagement in video game-playing. Journal of experimental social psychology, 45, 624-634. BROIDA, R. 2016. The 7 best VR games for iPhone [Online]. cnet. Available: https://www.cnet.com/how-to/the-7-best-vr-games-for-iphone/ [Accessed]. BROOKE, J. 1996. SUS-A quick and dirty usability scale. Usability evaluation in industry, 189, 4-7. BROTTO, L. A. 2011. Non-judgmental, present-moment, sex… as if your life depended on it. Taylor & Francis. BROTTO, L. A. 2013. Mindful sex. University of Toronto Press Incorporated. BROWN, K. W. & RYAN, R. M. 2003. The benefits of being present: mindfulness and its role in psychological well-being. Journal of personality and social psychology, 84, 822. BRUUN-PEDERSEN, J. R., SERAFIN, S. & KOFOED, L. B. 2016. Going Outside While Staying Inside — Exercise Motivation with Immersive vs. Non–immersive Recreational Virtual Environment Augmentation for Older Adult Nursing Home Residents. 2016 IEEE International Conference on Healthcare Informatics (ICHI), 216. BUCHANAN, K. 2006. Beyond attention-getters: Designing for deep engagement, Michigan State University. BUŃ, P. K., WICHNIAREK, R., GÓRSKI, F., GRAJEWSKI, D., ZAWADZKI, P. & HAMROL, A. 2017. Possibilities and Determinants of Using Low-Cost Devices in Virtual Education Applications. Eurasia Journal of Mathematics, Science & Technology Education, 13, 381. 
BURDEA GRIGORE, C. & COIFFET, P. 1994. Virtual reality technology, London: Wiley-Interscience.

111 BURSCHKA, J. M., KEUNE, P. M., HOFSTADT-VAN OY, U., OSCHMANN, P. & KUHN, P. 2014. Mindfulness-based interventions in multiple sclerosis: beneficial effects of Tai Chi on balance, coordination, fatigue and depression. BMC neurology, 14, 165. BYRNE, A. & BYRNE, D. 1993. The effect of exercise on depression, anxiety and other mood states: a review. Journal of psychosomatic research, 37, 565-574. CALDWELL, C. & HIBBERT, S. A. 1999. Play that one again: The effect of music tempo on consumer behaviour in a restaurant. ACR European Advances. CAMEIRÃO, M. S., BADIA, S. B. I., OLLER, E. D. & VERSCHURE, P. F. 2010. Neurorehabilitation using the virtual reality based Rehabilitation Gaming System: methodology, design, psychometrics, usability and validation. Journal of NeuroEngineering and Rehabilitation, 7, 48. CAMPOS, D., CEBOLLA, A., QUERO, S., BRETÓN-LÓPEZ, J., BOTELLA, C., SOLER, J., GARCÍA-CAMPAYO, J., DEMARZO, M. & BAÑOS, R. M. 2016. Meditation and happiness: Mindfulness and self- compassion may mediate the meditation–happiness relationship. Personality and Individual Differences, 93, 80-85. CANNAM, C., LANDONE, C. & SANDLER, M. Sonic visualiser: An open source application for viewing, analysing, and annotating music audio files. Proceedings of the 18th ACM international conference on Multimedia, 2010. ACM, 1467-1468. CARD, S., MORAN, T. & NEWELL, A. 1983. The Psychology of Human-Computer Interaction. Hillsdale, New Jersey: Lawerence Erlbaum Associates. Inc. CARD, S. K., ENGLISH, W. K. & BURR, B. J. 1978. Evaluation of mouse, rate-controlled isometric joystick, step keys, and text keys for text selection on a CRT. Ergonomics, 21, 601-613. CARMODY, J. & BAER, R. A. 2008. Relationships between mindfulness practice and levels of mindfulness, medical and psychological symptoms and well-being in a mindfulness-based stress reduction program. Journal of behavioral medicine, 31, 23-33. CARTER, R. L., REEVES, D. W., HOFFMAN, C. A. & MOFFETT, G. L. 1994. Video game joystick controller. Google Patents. CASSOLA, F., MORGADO, L., DE CARVALHO, F., PAREDES, H., FONSECA, B. & MARTINS, P. 2014. Online- Gym: a 3D virtual gymnasium using Kinect interaction. Procedia Technology, 13, 130-138. CASTAÑO DÍAZ, C. M. & TUNGTJITCHAROEN, W. 2015. Art video games: Ritual communication of feelings in the digital era. Games and Culture, 10, 3-34. CASTELLANO, G., VILLALBA, S. D. & CAMURRI, A. Recognising human emotions from body movement and gesture dynamics. International Conference on Affective Computing and Intelligent Interaction, 2007. Springer, 71-82. CENTRECOM. 2020a. CentreCom [Online]. Available: https://www.centrecom.com.au/acer-aspire-5- 156-hd-core-i5-10210u-laptop?gclid=CjwKCAjw7-P1BRA2EiwAXoPWA- RDwajkXN1kzOUWtzhZtPTQYOBsWAVehMoPdRr3lQnYb-138ENmQhoC92MQAvD_BwE [Accessed 11 May 2020 2020]. CENTRECOM. 2020b. CentreCom [Online]. Available: https://www.centrecom.com.au/hp-250-g7-156- hd-core-i5-8265u-laptop-2?gclid=CjwKCAjw7- P1BRA2EiwAXoPWA_sZ38pCd79PBgFDqg6l5TM5iXQVzFq6wpQ- CtPLqJKn2_OlLcqASBoCh3sQAvD_BwE [Accessed 11 May 2020 2020]. CHANG, L. 1994. A psychometric evaluation of 4-point and 6-point Likert-type scales in relation to reliability and validity. Applied psychological measurement, 18, 205-215. CHARMAZ, K. & BELGRAVE, L. L. 2007. Grounded theory. The Blackwell encyclopedia of sociology. CHAWDA, B., CRAFT, B., CAIRNS, P., HEESCH, D. & RÜGER, S. 2005. Do “attractive things work better”? An exploration of search tool visualisations. CHILDERS, R. C. 1875. A dictionary of the Pali language, Trübner. 
CHOMEYA, R. 2010. Quality of psychology test between Likert scale 5 and 6 points. Journal of Social Sciences, 6, 399-403. CHOUNGOURIAN, A. 1968. Color preferences and cultural variation. Perceptual and motor skills, 26, 1203-1206.

112 CHRISTENSEN, R. 2017. Mark Rothko: Art as an Experience. The Significance of Interaction between Painting and Viewer in the Rothko Chapel. RIHA Journal, 183, 1-27. CINIMODSTUDIO. 2019. DJ Light [Online]. Available: http://www.cinimodstudio.com/dj-light [Accessed]. CIOLFI, L. & BANNON, L. Designing Interactive Museum Exhibits: Enhancing visitor curiosity through augmented artefacts. Eleventh European Conference on Cognitive Ergonomics, 2002. Citeseer. CLARK, L. A. 1978. The ancient art of color therapy, Pocket Books. CLIMENHAGA, R. 2018. Pina Bausch, Routledge, Taylor & Francis Group. COMPAGNIA TPO 2016. POP UP GARDEN. Paris. CONRAD, C. F. 1982. Grounded theory: An alternative approach to research in higher education. The Review of Higher Education, 5, 239-249. COOK, N. F., MCALOON, T., O'NEILL, P. & BEGGS, R. 2012. Impact of a web based interactive simulation game (PULSE) on nursing students' experience and performance in life support training—A pilot study. Nurse education today, 32, 714-720. CORBIN, J. & STRAUSS, A. 1990. Grounded Theory Research: Procedures, Canons and Evaluative Criteria. Zeitschrift für Soziologie, 19, 418-427. COTTON, M. A. 1985. Creative art expression from a leukemic child. Art Therapy, 2, 55-65. CREATIVECOMMONS. 2019a. Creative Commons — Attribution 3.0 Unported — CC BY 3.0 [Online]. Available: https://creativecommons.org/licenses/by/3.0/ [Accessed]. CREATIVECOMMONS. 2019b. Creative Commons — CC0 1.0 Universal [Online]. Available: https://creativecommons.org/publicdomain/zero/1.0/ [Accessed]. CRILLY, N., MOULTRIE, J. & CLARKSON, P. J. 2004. Seeing things: consumer response to the visual domain in product design. Design studies, 25, 547-577. CURTIS, R. 2010. Vincent and the Doctor [Online]. Available: https://youtu.be/ubTJI_UphPk?t=102 [Accessed 12 August 2020]. The Art of Racing in the Rain (2019) - IMDb, 2019. Directed by CURTIS, S. CUTSHALL, S. M., ANDERSON, P. G., PRINSEN, S. K., WENTWORTH, L. J., OLNEY, T. L., MESSNER, P. K., BREKKE, K. M., LI, Z., SUNDT III, T. M. & KELLY, R. F. 2011. Effect of the combination of music and nature sounds on pain and anxiety in cardiac surgical patients: a randomized study. Alternative Therapies in Health & Medicine, 17. CYCLING '74. 2019. MAX [Online]. @cycling74. Available: https://cycling74.com/ [Accessed 2019]. CYCLING '74. 2020. Made with Max [Online]. USA: Cycling '74. Available: https://cycling74.com/products/made-with-max [Accessed April 2 2020]. D'ANDRADE, R. & EGAN, M. 1974. The colors of emotion. American ethnologist, 1, 49-63. DAVESON, B., O’CALLAGHAN, C. & GROCKE, D. 2008. Indigenous music therapy theory building through grounded theory research: The developing indigenous theory framework. The Arts in Psychotherapy, 35, 280-286. DAVEY, P. 1998. True colours. Architectural Review, 204, 34-5. DAVIS, D. M. & HAYES, J. A. 2011. What are the benefits of mindfulness? A practice review of psychotherapy-related research. Psychotherapy, 48, 198. DE ANGELI, A., SUTCLIFFE, A. & HARTMANN, J. Interaction, usability and aesthetics: what influences users' preferences? Proceedings of the 6th conference on Designing Interactive systems, 2006. ACM, 271-280. DE CHOUDHURY, M., COUNTS, S. & GAMON, M. Not all moods are created equal! exploring human emotional states in social media. Sixth international AAAI conference on weblogs and social media, 2012. DE FREITAS, S. I. 2006. Using games and simulations for supporting learning. Learning, Media and Technology, 31, 343-358. DE VALK, L., BEKKER, T. & EGGEN, B. 2015. 
Designing for social interaction in open-ended play environments. International Journal of Design, 9, 107-120.


Appendix

Appendix 1

8 Hours Relaxing Nature Sounds Meditation Relaxation Birdsong Sound of Water Johnnie Lawson

Calming Seas #1 - 11 Hours Ocean Waves, Nature Sounds, Relaxation, Meditation, Reading, Sleep


Forest and Nature Sounds 10 Hours

Nature Sound 16 - THE MOST RELAXING SOUNDS

Appendix 2

Appendix 3

Appendix 4

Plain Language Statement
Victorian College of the Arts

Project: Developing a computer-based interactive system that creates a sense of deep engagement.

Dr. Roger Alsop (Responsible Researcher) Email: [email protected] Tel: +61 3 90359387
Renusha Athugala (Student Researcher) Email: [email protected] Tel: +61 420620588

Introduction Thank you for your interest in participating in this research project. The following few pages will provide you with further information about the project, so that you can decide if you would like to take part in this research.

Please take the time to read this information carefully. You may ask questions about anything you don’t understand or want to know more about.

Your participation is voluntary. If you don’t wish to take part, you don’t have to. If you begin participating, you can also stop at any time.

What is this research about? This project explores how a computer-based interactive system may create a sense of deep engagement. Here, a sense of deep engagement means a calm and focused state of mind. The Buddhist concept of ‘Sati’ is used to achieve this: Sati is the state of being aware of and paying attention to the present moment. The interactive system is designed to help you focus on the present moment by blending sound, vision, and physical movement; the Tai Chi movement technique is used to guide that movement.

What will I be asked to do? Should you agree to participate, you will be asked to read and sign the accompanying consent form. You will also be shown how to use the interactive system, which will guide you through the Tai Chi movement technique. A visual representation of a Tai Chi Master will be displayed, along with an outline of your body, and you will be asked to copy the Master’s movements. The more accurately your movements align with the Tai Chi Master’s, the more the Master’s outline will be filled with colour. These images will be recorded by the system, but because they are outlines they will not be identifiable. The estimated time commitment for one participant is 10 minutes. You will also be asked to fill in the accompanying questionnaire after the session.


A non-identifiable outline of the participant’s image will be recorded through the Sati Interactive software. This outline will only be recorded if the participant agrees.
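As an illustration of the mechanism described above, the sketch below shows one way an alignment score between the user's pose and the Tai Chi Master's pose could drive the colour fill. It is only an illustrative sketch: the Sati Interactive system itself is built in Max and works from the webcam image, so the joint lists, the offset threshold, and the function names here are assumptions made for the example.

from math import hypot

# Illustrative sketch only: the real Sati Interactive system is a Max patch,
# so these joints, thresholds, and function names are assumptions.
# A pose is a list of (x, y) joint positions normalised to the range 0..1.
MASTER_POSE = [(0.50, 0.20), (0.50, 0.45), (0.35, 0.40), (0.65, 0.40)]  # head, torso, left hand, right hand
USER_POSE = [(0.52, 0.22), (0.49, 0.46), (0.40, 0.42), (0.63, 0.41)]

MAX_OFFSET = 0.25  # joint offset (in normalised units) at which alignment counts as zero


def alignment_score(user, master, max_offset=MAX_OFFSET):
    """Return a score from 0 to 1, where 1 means the user's joints sit exactly on the Master's."""
    per_joint = []
    for (ux, uy), (mx, my) in zip(user, master):
        distance = hypot(ux - mx, uy - my)
        per_joint.append(max(0.0, 1.0 - distance / max_offset))
    return sum(per_joint) / len(per_joint)


def fill_amount(score):
    """Map the alignment score to how much of the Master's outline is filled with colour."""
    return score  # e.g. a score of 0.8 fills 80 per cent of the outline


score = alignment_score(USER_POSE, MASTER_POSE)
print(f"alignment {score:.2f} -> outline fill {fill_amount(score):.0%}")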

What are the possible benefits?
Expected benefits to the participants may include:
• Achieving a sense of Sati.
• A relaxed state of mind and body.
Potential on-going benefits may include:
• The ability to focus the mind towards a single objective without disruptive thoughts.
• The ability to deal with stressful situations.
• The ability to control emotions.
Expected benefits to society of this research may include:
• Developing and refining an interactive system that promotes a sense of Sati.
• Developing an understanding of the potential benefits of integrating sound, vision, and movement within an arts-based system.
• A process for introducing the perceived benefits of meditation without the user specifically engaging in traditional meditation practices.
• An interactive system that can be used in a variety of fields such as education, entertainment, training, and healthcare.

What are the possible risks? Slight muscle strain and uncomfortable feelings have been identified as potential risks of using the system. However, these risks are reduced because the test duration per person is limited to 10 minutes, and participants may stop whenever they want. These potential risks are normal in daily life, but I will inform each participant of these risks before they participate.

If you require counselling services, you can contact the Counselling and Psychological Services (CAPS): https://services.unimelb.edu.au/counsel/individual/making-an-appointment or call 8344 6927.

Do I have to take part? No. Participation is completely voluntary. You are able to withdraw at any time. Data will only be collected if you participate and you will be able to withdraw any unprocessed data.

Will I hear about the results of this project? The results of this project will be published in the final thesis and possibly in journal articles and conference papers. Participants will be notified of any publication if they request it.

What will happen to information about me? The participants’ personal details will not be recorded; each participant’s data will be recorded anonymously. The images recorded in the system will be retained for 5 years and then deleted. All data will be stored according to university policies. Data gathered in this research project may also be used in future projects that are closely related to this project, or in the same general area of research as this project.

Is there any potential conflict of interest? There are no potential conflicts of interest anticipated at present.

Who is funding this project? This project is supported by the University of Melbourne, but no direct financial support is supplied.

Where can I get further information? If you would like more information about the project, please contact the researchers; Dr. Roger Alsop (+61 3 90359387), or Renusha Athugala (+61 420620588).

Who can I contact if I have any concerns about the project? This research project has been approved by the Human Research Ethics Committee of The University of Melbourne. If you have any concerns or complaints about the conduct of this research project, which you do not wish to discuss with the research team, you should contact the Manager, Human Research Ethics, Research Ethics and Integrity, University of Melbourne, VIC 3010. Tel: +61 3 8344 2073 or Email: HumanEthics- [email protected]. All complaints will be treated confidentially. In any correspondence please provide the name of the research team or the name or ethics ID number of the research project.

Consent Form
Victorian College of the Arts

Project: Developing a computer-based interactive system that creates a sense of deep engagement.

Responsible Researcher: Dr. Roger Alsop (Principal supervisor)
Additional Researchers: Renusha Athugala (Student researcher)

Name of Participant:

1. I consent to participate in this project, the details of which have been explained to me, and I have been provided with a written plain language statement to keep.

2. I understand that the purpose of this research is to investigate the effectiveness of the student-designed software Sati Interactive.

3. I understand that my participation in this project is for research purposes only.

4. I acknowledge that the possible effects of participating in this research project have been explained to my satisfaction.

5. In this project I will be required to test the Sati Interactive system and to answer the accompanying questionnaire at the end of the session. The Sati Interactive system will run for up to 10 minutes; however, this can be stopped or repeated according to the participant’s needs. The estimated time commitment for one participant is 10 minutes. I have read and agreed to the Plain Language Statement.

6. I understand that my participation involves body movement recognition.

7. I understand that an outline of my image will only be recorded if I agree to it.

8. I understand that my participation is voluntary and that I am free to withdraw from this project at any time, without explanation or prejudice, and to withdraw any unprocessed data that I have provided.

9. I understand that the data from this research will be stored at the University of Melbourne and will be deleted after 5 years.

10. I have been informed that the confidentiality of the information I provide will be safeguarded subject to any legal requirements; my data will be password protected and accessible only by the named researchers.

11. I understand that I cannot obtain a copy of the results until the results are published in the final thesis. The thesis will be made available on request.

12. I understand that counselling services are provided by the Counselling and Psychological Services (CAPS) and information regarding this is provided in the PLS.

13. I understand that after I sign and return this consent form, it will be retained by the researcher.

Participant Signature: Date:


Questionnaire

Select your age group – (18-25) (26-30) (31-40) (over 40)

1 = Strongly disagree, 2 = Disagree, 3 = Neither Agree nor Disagree, 4 = Agree, 5 = Strongly Agree

Strongly Disagree | Disagree | Neither Agree nor Disagree | Agree | Strongly Agree

I felt relaxed after using the system. 1 2 3 4 5
I was focused while using the system. 1 2 3 4 5
I felt transported into a different environment. 1 2 3 4 5
I felt distracted by the visuals. 1 2 3 4 5
I felt distracted by the sounds. 1 2 3 4 5
I would recommend this system to a friend. 1 2 3 4 5
I would give this system to a friend. 1 2 3 4 5
I found the Sati Interactive system user-friendly. 1 2 3 4 5
I would use this system at home. 1 2 3 4 5
I feel that using this system daily would help reduce stress. 1 2 3 4 5
I did not feel the time passing. 1 2 3 4 5
I felt like 10 minutes is enough. 1 2 3 4 5
I felt relaxed while using the system. 1 2 3 4 5
I felt like I was in a different ‘zone’ while using the system. 1 2 3 4 5
I feel like I am still in a different ‘zone’ after using the system. 1 2 3 4 5

Appendix 5
Approval for the Tai Chi Video from Master Wong.

Appendix 6
Data collected from the Sati Interactive trials with the user image outline.

Responses are coded Positive (1), Negative (-1), or Neutral (0). Each row lists the responses of participants 1–15, followed by the totals of positive, negative, and neutral responses.

Question | Responses (participants 1–15) | Positive | Negative | Neutral
I felt relaxed after using the system | 0 1 1 1 1 1 1 1 1 1 1 0 1 1 1 | 13 | 0 | 2
I was focused while using the system | 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 | 15 | 0 | 0
I felt transported into a different environment | -1 1 0 0 0 1 1 1 0 1 1 0 1 0 1 | 8 | 1 | 6
I felt distracted by the visuals | -1 1 1 1 0 -1 -1 1 1 0 -1 1 1 1 1 | 9 | 4 | 2
I felt distracted by the sounds | 1 1 1 1 1 -1 -1 1 -1 1 1 1 1 1 1 | 12 | 3 | 0
I would recommend this system to a friend | 1 1 1 -1 0 1 0 0 1 0 1 0 1 1 1 | 9 | 1 | 5
I would give this system to a friend | 1 1 1 1 -1 1 1 0 1 0 1 0 1 1 1 | 11 | 1 | 3
I found the Sati Interactive system user-friendly | 1 1 1 1 1 1 0 0 1 1 1 1 1 1 1 | 13 | 0 | 2
I would use this system at home | 1 1 0 1 -1 1 0 1 1 0 1 0 1 1 1 | 10 | 1 | 4
I feel that using this system daily would help reduce stress | 1 1 0 1 1 1 -1 1 0 1 1 0 1 1 1 | 11 | 1 | 3
I did not feel the time passing | 1 0 1 1 1 1 0 1 1 0 1 1 1 1 1 | 12 | 0 | 3
I felt like 10 minutes is enough | 0 0 -1 1 1 1 1 1 1 0 1 1 1 0 1 | 10 | 1 | 4
I felt relaxed while using the system | 0 1 1 1 1 1 0 1 1 1 1 0 1 1 1 | 12 | 0 | 3
I felt like I was in a different ‘zone’ while using the system | -1 1 0 1 0 1 1 1 0 1 1 0 1 1 1 | 10 | 1 | 4
I feel like I am still in a different ‘zone’ after using the system | 1 1 0 0 1 1 0 0 1 1 1 0 1 0 0 | 8 | 0 | 7

Total individual positive responses (out of 15), participants 1–15: 9 13 9 12 9 13 6 11 11 9 14 6 15 12 14
Totals: 163 positive, 14 negative, 48 neutral.

Appendix 7 Data collected from the Sati Interactive trials without the user image outline.

Responses coded as: positive = 1, negative = -1, neutral = 0. Each row lists the coded responses of participants 1–15, followed by the totals of positive, negative, and neutral responses for that statement.

I felt relaxed after using the system: 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 | positive 14, negative 0, neutral 1
I was focused while using the system: 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 | positive 15, negative 0, neutral 0
I felt transported into a different environment: -1 1 1 1 0 1 0 0 1 1 -1 1 0 0 -1 | positive 7, negative 3, neutral 5
I felt distracted by the visuals: 1 1 1 1 1 1 1 1 1 -1 -1 -1 -1 -1 -1 | positive 9, negative 6, neutral 0
I felt distracted by the sounds: 1 1 -1 1 1 0 1 1 1 1 1 1 1 -1 1 | positive 12, negative 2, neutral 1
I would recommend this system to a friend: 1 1 1 1 1 1 1 1 1 1 0 1 1 0 1 | positive 13, negative 0, neutral 2
I would give this system to a friend: 0 0 1 1 1 1 1 1 1 1 0 1 1 0 1 | positive 11, negative 0, neutral 4
I found the Sati Interactive system user-friendly: 1 1 1 1 1 0 -1 1 1 1 1 1 1 0 1 | positive 12, negative 1, neutral 2
I would use this system at home: 1 0 1 1 1 1 1 1 1 1 0 1 0 -1 0 | positive 10, negative 1, neutral 4
I feel that using this system daily would help reduce stress: 0 1 1 1 1 1 1 1 1 1 0 1 0 0 1 | positive 11, negative 0, neutral 4
I did not feel the time passing: 0 1 -1 1 1 1 0 0 0 0 1 1 1 0 1 | positive 8, negative 1, neutral 6
I felt like 10 minutes is enough: 1 1 1 1 -1 1 1 0 0 1 1 0 1 1 0 | positive 10, negative 1, neutral 4
I felt relaxed while using the system: 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 | positive 14, negative 0, neutral 1
I felt like I was in a different ‘zone’ while using the system: -1 1 1 1 0 1 0 1 1 1 0 1 0 -1 1 | positive 9, negative 2, neutral 4
I feel like I am still in a different ‘zone’ after using the system: -1 1 1 -1 0 0 -1 0 0 0 0 1 0 -1 0 | positive 3, negative 4, neutral 8
Total individual positive responses (out of 15): 9 13 13 14 11 12 10 11 12 12 7 13 9 2 10 | overall totals: positive 158, negative 21, neutral 46

Appendix 8 All the data collected from the Sati Interactive trials.

Responses coded as: positive = 1, negative = -1, neutral = 0. Each row lists the coded responses of participants 1–30, followed by the totals of positive, negative, and neutral responses for that statement.

I felt relaxed after using the system: 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 0 1 1 1 1 1 1 1 1 1 1 0 1 1 1 | positive 27, negative 0, neutral 3
I was focused while using the system: 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 | positive 30, negative 0, neutral 0
I felt transported into a different environment: -1 1 1 1 0 1 0 0 1 1 -1 1 0 0 -1 -1 1 0 0 0 1 1 1 0 1 1 0 1 0 1 | positive 15, negative 4, neutral 11
I felt distracted by the visuals: 1 1 1 1 1 1 1 1 1 -1 -1 -1 -1 -1 -1 -1 1 1 1 0 -1 -1 1 1 0 -1 1 1 1 1 | positive 18, negative 10, neutral 2
I felt distracted by the sounds: 1 1 -1 1 1 0 1 1 1 1 1 1 1 -1 1 1 1 1 1 1 -1 -1 1 -1 1 1 1 1 1 1 | positive 24, negative 5, neutral 1
I would recommend this system to a friend: 1 1 1 1 1 1 1 1 1 1 0 1 1 0 1 1 1 1 -1 0 1 0 0 1 0 1 0 1 1 1 | positive 22, negative 1, neutral 7
I would give this system to a friend: 0 0 1 1 1 1 1 1 1 1 0 1 1 0 1 1 1 1 1 -1 1 1 0 1 0 1 0 1 1 1 | positive 22, negative 1, neutral 7
I found the Sati Interactive system user-friendly: 1 1 1 1 1 0 -1 1 1 1 1 1 1 0 1 1 1 1 1 1 1 0 0 1 1 1 1 1 1 1 | positive 25, negative 1, neutral 4
I would use this system at home: 1 0 1 1 1 1 1 1 1 1 0 1 0 -1 0 1 1 0 1 -1 1 0 1 1 0 1 0 1 1 1 | positive 20, negative 2, neutral 8
I feel that using this system daily would help reduce stress: 0 1 1 1 1 1 1 1 1 1 0 1 0 0 1 1 1 0 1 1 1 -1 1 0 1 1 0 1 1 1 | positive 22, negative 1, neutral 7
I did not feel the time passing: 0 1 -1 1 1 1 0 0 0 0 1 1 1 0 1 1 0 1 1 1 1 0 1 1 0 1 1 1 1 1 | positive 20, negative 1, neutral 9
I felt like 10 minutes is enough: 1 1 1 1 -1 1 1 0 0 1 1 0 1 1 0 0 0 -1 1 1 1 1 1 1 0 1 1 1 0 1 | positive 20, negative 2, neutral 8
I felt relaxed while using the system: 1 1 1 1 1 1 1 1 1 1 1 1 1 0 1 0 1 1 1 1 1 0 1 1 1 1 0 1 1 1 | positive 26, negative 0, neutral 4
I felt like I was in a different ‘zone’ while using the system: -1 1 1 1 0 1 0 1 1 1 0 1 0 -1 1 -1 1 0 1 0 1 1 1 0 1 1 0 1 1 1 | positive 19, negative 3, neutral 8
I feel like I am still in a different ‘zone’ after using the system: -1 1 1 -1 0 0 -1 0 0 0 0 1 0 -1 0 1 1 0 0 1 1 0 0 1 1 1 0 1 0 0 | positive 11, negative 4, neutral 15
Total individual positive responses (out of 15): 9 13 13 14 11 12 10 11 12 12 7 13 9 2 10 9 13 9 12 9 13 6 11 11 9 14 6 15 12 14 | overall totals: positive 321, negative 35, neutral 94
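The coding above collapses each 5-point questionnaire response into a single positive, negative, or neutral score before tallying. The following is a minimal illustrative sketch only: the exact coding rule is an assumption here (agreement of 4–5 treated as favourable, 3 as neutral, 1–2 as unfavourable, with the two negatively worded 'distracted' statements reverse-scored) and may differ from the rule actually applied in the trials. The statement strings and function names are hypothetical.

# Illustrative sketch only: the mapping below is an assumption, not the
# documented coding rule used in the thesis.

NEGATIVELY_WORDED = {
    "I felt distracted by the visuals.",
    "I felt distracted by the sounds.",
}

def code_response(statement, likert):
    """Collapse a 1-5 Likert rating into -1 (negative), 0 (neutral) or 1 (positive)."""
    if likert == 3:
        return 0
    agreed = likert >= 4
    # Negatively worded items are reverse-scored: disagreement counts as favourable.
    favourable = (not agreed) if statement in NEGATIVELY_WORDED else agreed
    return 1 if favourable else -1

def tally(codes):
    """Count positive, negative and neutral codes for one statement or one participant."""
    return {
        "positive": codes.count(1),
        "negative": codes.count(-1),
        "neutral": codes.count(0),
    }

# Example: one participant's ratings for three statements.
ratings = {
    "I felt relaxed after using the system.": 5,   # agree, coded +1
    "I felt distracted by the visuals.": 2,        # disagree, reverse-scored to +1
    "I did not feel the time passing.": 3,         # neutral, coded 0
}
codes = [code_response(s, r) for s, r in ratings.items()]
print(codes)         # [1, 1, 0]
print(tally(codes))  # {'positive': 2, 'negative': 0, 'neutral': 1}

Applying a tally like this across all participants for one statement would reproduce per-statement totals of the kind shown in the tables above, under whatever coding rule was actually used.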

Appendix 9

Appendix 10 These translations are from Bernstein (2019).

Archaïscher Torso Apollos
Rainer Maria Rilke

Wir kannten nicht sein unerhörtes Haupt,
darin die Augenäpfel reiften. Aber
sein Torso glüht noch wie ein Kandelaber,
in dem sein Schauen, nur zurückgeschraubt,

sich hält und glänzt. Sonst könnte nicht der Bug
der Brust dich blenden, und im leisen Drehen
der Lenden könnte nicht ein Lächeln gehen
zu jener Mitte, die die Zeugung trug.

Sonst stünde dieser Stein entstellt und kurz
unter der Schultern durchsichtigem Sturz
und flimmerte nicht so wie Raubtierfelle

und bräche nicht aus allen seinen Rändern
aus wie ein Stern: denn da ist keine Stelle,
die dich nicht sieht. Du mußt dein Leben ändern.

Torso of an Archaic Apollo

Translated by C. F. MacIntyre

Never will we know his fabulous head
where the eyes' apples slowly ripened. Yet
his torso glows: a candelabrum set
before his gaze which is pushed back and hid,

restrained and shining. Else the curving breast
could not thus blind you, nor through the soft turn
of the loins could this smile easily have passed
into the bright groins where the genitals burned.

Else stood this stone a fragment and defaced,
with lucent body from the shoulders falling,
too short, not gleaming like a lion's fell;

nor would this star have shaken the shackles off,
bursting with light, until there is no place
that does not see you. You must change your life.

From Rilke: Selected Poems (Univ. of California Press, 1957)

Translated by Stephen Mitchell

We cannot know his legendary head
with eyes like ripening fruit. And yet his torso
is still suffused with brilliance from inside,
like a lamp, in which his gaze, now turned to low,

gleams in all its power. Otherwise
the curved breast could not dazzle you so, nor could
a smile run through the placid hips and thighs
to that dark center where procreation flared.

Otherwise this stone would seem defaced
beneath the translucent cascade of the shoulders
and would not glisten like a wild beast's fur:

would not, from all the borders of itself,
burst like a star: for here there is no place
that does not see you. You must change your life.

Translated by Sarah Stutt

Apollo's Archaic Torso

We cannot know his incredible head, where the eyes ripened like apples, yet his torso still glows like a candelabrum, from which his gaze, however dimmed, still persists and gleams. If this were not so, the bow of his breast could not blind you, nor could a smile, steered by the gentle curve of his loins, glide to the centre of procreation.

And this stone would seem disfigured and stunted, the shoulders descending into nothing, unable to glisten like a predator's pelt, or burst out from its confines and radiate like a star: for there is no angle from which it cannot see you. You have to change your life.

(Looser translation)

We will never know his magnificent head, the ebb and flow of his youth - an orchard of ripening fruit, yet his fire has not diminished, incandescent light radiates from his torso, and in the curve of his loins, a smile turns towards the centre of creation.

Or else this body would be disfigured - a lump of rock with no vision, unable to glisten like a lion's mane.

It would not burst out of its skin

like a star: for its searing gaze penetrates your soul, the way you live.

Appendix 11

David McCandless – Information is Beautiful, page 7


Minerva Access is the Institutional Repository of The University of Melbourne

Author/s: Athugala, Renusha Mandara Vishruta

Title: Developing a computer-based interactive system that creates a sense of deep engagement

Date: 2020

Persistent Link: http://hdl.handle.net/11343/273217

File Description: Final thesis file