Published by the Open Science Centre, University of Jyväskylä, Finland
Volume 13, Number 1, May 2017
SPECIAL ISSUE
HUMAN–TECHNOLOGY CHOREOGRAPHIES: BODY, MOVEMENT, AND SPACE IN EXPRESSIVE INTERACTIONS
Antti Pirhonen, Kai Tuuri, and Cumhur Erkut, Guest Editors
Pertti Hurme, Editor in Chief
Volume 13 Number 1, May 2017
DOI: http://dx.doi.org/10.17011/ht/urn.201705302564
Contents
From the Editors
External Assessors as “Reviewers” for Quality Assurance of Open Access Journals (pp. 1–5)
Pertti Hurme and Barbara J. Crawford

Guest Editors’ Introduction
Human–Technology Choreographies: Body, Movement, and Space in Expressive Interactions (pp. 6–9)
Antti Pirhonen, Kai Tuuri, & Cumhur Erkut

Original Articles

Technology Choreography: Studying Interactions in Microsoft’s Future Visions Through Dance (pp. 10–31)
Olli Poutanen, Salu Ylirisku, & Petri Hoppu

Body, Space, and Emotion: A Perceptual Study (pp. 32–57)
Donald Glowinski, Sélim Yahia Coll, Naëm Baron, Maëva Sanchez, Simon Schaerlaeken, & Didier Grandjean

Musical Instruments, Body Movement, Space, and Motion Data: Music as an Emergent Multimodal Choreography (pp. 58–81)
Federico Visi, Esther Coorevits, Rodrigo Schramm, & Eduardo Reck Miranda

Bodily Interactions in Motion-Based Music Applications (pp. 82–108)
Marcella Mandanici, Antonio Rodà, & Sergio Canazza

Designing a Computer Model of Drumming: The Biomechanics of Percussive Performance (pp. 109–141)
John R. Taylor

Book Review

Public Access ICT Across Cultures: Diversifying Participation in the Network Society, Francisco J. Proenza (Ed.) (pp. 142–144)
Reviewed by Sakari Taipale
ISSN: 1795-6889 humantechnology.jyu.fi
Submission: humantechnologypublishing.jyu.fi
Editor in Chief: Pertti Hurme, University of Jyväskylä, Finland
Founding Editor in Chief: Pertti Saariluoma, University of Jyväskylä, Finland
Managing Editor: Barbara J. Crawford
Editorial Assistant: Rachel Ferlatte Kuisma
Board of Editors
Morana Alač, University of California San Diego, USA
Michael Arnold, University of Melbourne, Australia
Jeffrey Bardzell, Indiana University Bloomington, USA
Dhrubes Biswas, Indian Institute of Technology, India
Adele Botha, CSIR Meraka & University of South Africa
José Cañas, University of Granada, Spain
Ling Chen, Hong Kong Baptist University
Monica Divitini, Norwegian University of Science and Technology
Markku Häkkinen, Educational Testing Service, USA
Päivi Häkkinen, University of Jyväskylä, Finland
Sung H. Han, Pohang University of Science and Technology, South Korea
Jun He, University of Michigan, Dearborn, USA
Jörn Hurtienne, Julius-Maximilians University, Germany
R. Malatesha Joshi, Texas A&M University, USA
Pilar Lacasa, University of Alcalá, Spain
Ann Light, University of Sussex, UK
Kalle Lyytinen, Case Western Reserve University, USA
Jim McGuigan, Loughborough University, UK
Jan Pawlowski, Hochschule Ruhr West, Germany, & University of Jyväskylä, Finland
Raul Pertierra, Ateneo de Manila University, the Philippines
Roger Säljö, University of Gothenburg, Sweden
Stephen Viller, University of Queensland, Australia
Marita Vos, University of Jyväskylä, Finland
Chihiro Watanabe, Tokyo Institute of Technology (emeritus)
Human Technology is an interdisciplinary, scholarly journal publishing innovative, peer-reviewed articles exploring the issues and challenges within human–technology interaction and the human role in all areas of ICT-infused societies.
Human Technology, published by the Open Science Centre, University of Jyväskylä, is distributed online free of charge.
ISSN: 1795-6889 www.humantechnology.jyu.fi Volume 13(1), May 2017, 1–5
From the Editors
EXTERNAL ASSESSORS AS “REVIEWERS” FOR QUALITY ASSURANCE OF OPEN ACCESS JOURNALS
Pertti Hurme, Editor in Chief
Department of Language and Communication Studies
University of Jyväskylä, Finland

Barbara J. Crawford, Managing Editor
Department of Language and Communication Studies
University of Jyväskylä, Finland
The interdisciplinary journal Human Technology has been a venue for scholarly research on the interactions between humans and technology since 2005. The journal’s editorial philosophy has been steeped, from the start, in the value of open access (OA) research. The leadership of the Agora Center at the University of Jyväskylä, Finland, as publisher, and founding editor Professor Pertti Saariluoma decided that the journal’s OA financing model would be institution funded, in conjunction with the funding of special issues through the journal and the university working as partners on research grants. As a result, no author has been charged an article processing charge (APC) to submit to or be published in the online-only journal. For its first 12 years, the journal was published by the interdisciplinary research unit, the Agora Center; with this current issue, the publishing responsibility has been assumed by the Open Science Centre, also at the University of Jyväskylä.

From the start, the publisher and editors of Human Technology have endeavored to produce a high-quality journal with impeccable ethical standards and a robust peer review process. We established an editorial board with experts from a diversity of fields that research human–technology interaction, and we are continually expanding the number of disciplines and research focuses represented by the members of our board. In virtually every way, the editorial team has embraced the ethical standards for high-quality publishing, in line with long-established and well-respected journals.

By some estimates (e.g., Ware & Mabe, 2015, p. 6), as many as 10,000 publishers around the globe are actively publishing academic journals. But questions have arisen from academics across the spectrum regarding the quality of the newer, particularly OA and online, journals (see, e.g., Butler, 2013; Sorokowski, Kulczycki, Sorokowska, & Pisanski, 2017).
© 2017 Pertti Hurme & Barbara J. Crawford, and the Open Science Centre, University of Jyväskylä
DOI: http://dx.doi.org/10.17011/ht/urn.201705272514
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.

One way to substantiate the quality of a journal is through quality assurance undertaken by external organizations, a process known as “whitelisting” (see, e.g., Berger & Cirasella, 2015; Butler, 2013; Gasparyan, Yessirkepov, Diyanova, & Kitas, 2015; Shen & Björk, 2015): articulating standards regarding peer review, ethical behavior, adherence to scope, and other measures of excellence through which journal editors and publishers can attain accreditation. In many ways, such organizations serve as the “reviewers” of journals, assuring that they have academic quality and relevance and that they abide by accepted scholarly publishing standards that advance scientific fields rather than threaten their integrity (Bartholomew, 2014; Clark & Smith, 2015). External validation provides important recognition, particularly for young, open access, and interdisciplinary journals.

Human Technology has been indexed by the Directory of Open Access Journals1 (DOAJ)—a key player in whitelisting OA journals (Berger & Cirasella, 2015)—since 2005, receiving reaccreditation under the directory’s revised guidelines in 2016. In February 2017, our journal was accepted by Scopus,2 Elsevier’s bibliographic database, and will be indexed there in the coming months. In the process of being considered for, and then accepted by, these two well-known indexers, we at Human Technology had to supply to the indexers’ editors a wide variety of materials that supported our claims of quality, of reputable OA practices, and of adherence to ethical standards. We are pleased that our consistent emphasis on publishing ethics, our attention to quality in accepting papers, and our filling of an important niche in the vast academic publishing world have been assessed as equally valuable as the qualities of journals published by large, established publishing houses.

Obviously, young, independent OA journals cannot have their quality assured right away; indeed, in the early years, many face several challenges.
For instance, predatory journals (see, e.g., Cook, 2017) and low-quality journals (Gasparyan et al., 2015) threaten the reputations of good, ethics-abiding journals: The unscrupulous and poor scientific behaviors of these journals cast a long shadow over all journals, particularly OA journals that cover their publication costs through APCs (Shen & Björk, 2015). Good, young journals accept that, inevitably, it takes time and ongoing attention to reputable and ethical publishing to earn the recognition of the academic community and accreditation by the quality assessors.

But, considering the reality that quality journals may not yet have been vetted by external assessors such as DOAJ, Scopus, or Web of Science, a separate but equally important question is, How can authors and researchers know to which journal to submit their work, or which OA papers reflect sufficiently peer-reviewed quality for citation? The following points are based in part on the checklist of the Open Access Scholarly Publishers’ Association,3 as well as on Butler (2013), Clark and Smith (2015), and Hansoti, Langdorf, and Murphy (2016):

• Examine the reputation of the journal and publisher: Do you or your colleagues know the journal? Is the journal associated with a noted organization, such as a scientific association, university, or research institute? Are there any unpleasant rumors about the journal or publisher on the Internet?

• Carry out an analysis of the information on the journal’s Web site: Can you easily identify and contact the publisher? Does the journal articulate (and then follow) basic ethical standards? What are the academic affiliations and credentials of the editors and editorial board members?
Is the journal clear about the type of peer review process it uses? Are there outside experts reviewing papers, or are all reviewers associated with the journal (e.g., editorial board members)? What is the editorial process like? (A reputable journal will need at least several months to assess a paper’s quality, get it through review, reassess, and allow time for revisions.) Does the journal have a peer-review label or some other independent notation of quality assurance? How does the journal assess whether any parts of a paper have been plagiarized, knowingly or unknowingly? What fees (if any) will be charged? Is the publication fee easily identified on the Web site, or is it hidden many pages back with obscure navigation or written in a less-than-clear manner?

• Consider the source and approach: Are the research articles published through the journal under evaluation available through a reputable indexer or research database? Predatory publishers and journals rarely are accepted by these organizations, even if the articles they publish are readily available on the Internet. Are you being solicited for submission, and in what manner? Most reputable journals do not need to send emails with enticing lures of quick publication. Does the offer seem too good to be true? Chances are, if it does, it is.

Please note that while a shortcoming in any of the above areas does not necessarily mean a journal is predatory or low quality, the number and/or combination of issues should at least merit caution.

In addition to the assurances provided by external assessors, such as DOAJ, Scopus, and Web of Science, authors and researchers can turn to one of several organizations aimed at identifying quality journals. For example, national lists of reputable journals often are produced (and/or ranked) in collaboration with public organizations and the academic community. Comprehensive lists have been produced by several Nordic countries.
Human Technology is included in the Finnish Publication Forum4 and the Norwegian Register for Scientific Journals, Series, and Publishers.5 Currently, academics in the Nordic countries, in conjunction with their governmental officials and several academic communities, are discussing the potential of creating a joint Nordic List of quality journals and publishers. The organizations involved are collaborating as well with DOAJ in regard to open access journals.6

Finally, although the list of potential, possible, and probable predatory journals and publishers,7 compiled by University of Colorado librarian Jeffrey Beall (2012, 2016a, 2016b), is no longer available, many found the list useful and influential (see, e.g., Butler, 2013). However, it also was controversial (see, e.g., Berger & Cirasella, 2015; Butler, 2013; Gasparyan et al., 2015; Straumsheim, 2017). Some academics have called for a more systematic way to identify problematic journals and publishers, involving a wider swath of academic players (e.g., librarians, researchers, accrediting agencies, journal publishers, publisher and editor associations), as a companion to the organizations recognizing quality (Berger & Cirasella, 2015; Gasparyan et al., 2015).

As the editor in chief and managing editor of a reputable journal that for several years was not yet evaluated by the major indexers, we see value in a formal way of indicating that a young journal is not disreputable as it develops toward official acknowledgment by the well-known external assessors. The publisher and editors of Human Technology share the long-term goal of producing a journal that will be accepted as well by the Thomson Reuters Web of Science, supplementing DOAJ and Scopus in recognizing our journal’s continued emphasis on quality and ethical publishing and its contribution to the interdisciplinary field of human–technology interaction. For this we need to maintain our long-standing reputation as a quality journal, attract good submissions, and rely on the hard work of our reviewers and editors, supported by our distinguished Board of Editors.

An additional support toward the journal’s goals is provided by our newly established (2017) internal Publisher’s Board, a team of five experts from the University of Jyväskylä with various scientific backgrounds. The Publisher’s Board members are Päivi Fadjukoff (adjunct professor, Psychology), Marja Kankaanranta (research professor, Digital Learning Environments), Raine Koskimaa (professor, Digital Culture), Pekka Olsbo (head of the publishing unit, Open Science Centre), and Pasi Tyrväinen (professor, Digital Media). Their multifaceted task involves supporting and adding expertise to the publisher’s role in maintaining quality and ethical publishing and in advancing Human Technology within the university, in Finland, and internationally. In their advisory role, they facilitate the development of the journal from the publishing perspective, thus serving as an instrumental collaborator with the day-to-day editorial leadership (i.e., the editor in chief and managing editor) of the journal. In many ways, they are to the publisher what the editorial board is to the journal’s editors, although there is considerable overlap in their combined efforts toward the advancement and integrity of the journal.
As technological and global interconnectedness continues to develop in the coming years, and new journals are founded to address niches in the scholarly publishing arena, authors and researchers—and journal publishers, as a community supporting and advocating quality, relevant, and ethical journals—need to remain vigilant against unscrupulous and low-quality journals and publishers. External indexers and accrediting organizations play the essential role of reviewers of ethical and quality publishing practices in journals, particularly young, niche, and OA journals.
ENDNOTES
1. For more information on the DOAJ, see https://doaj.org
2. Information on the service that Scopus provides is available at https://blog.scopus.com/posts/is-a-title-indexed-in-scopus-a-reminder-to-check-before-you-publish
3. The Open Access Scholarly Publishers’ Association (http://thinkchecksubmit.org/check/) provides advice to submitters on assessing reputable open access journals.
4. See https://www.tsv.fi/julkaisufoorumi/haku.php?lang=en for the Finnish Publications Forum.
5. More information on the Norwegian Register for Scientific Journals, Series, and Publishers is available at https://dbh.nsd.uib.no/publiseringskanaler/Forside
6. The DOAJ announcement of collaboration with the Nordic countries regarding evaluating open access journals can be found at https://doajournals.wordpress.com/2017/03/31/nordic-research-organizations-support-doaj/
7. A copy of Beall’s List of Predatory Journals and Publishers currently can still be found on the Internet (e.g., http://beallslist.weebly.com/standalone-journals.html), but it is no longer updated.
REFERENCES
Bartholomew, R. E. (2014). Science for sale [Editorial]. Journal of the Royal Society of Medicine, 107(10), 384–385. doi: 10.1177/0141076814548526
Beall, J. (2012, September 13). Predatory publishers are corrupting open access [Commentary]. Nature, 489, 179.
Beall, J. (2016a). Ban predators from the scientific record [Letter to the Editor]. Nature, 534, 326.
Beall, J. (2016b). Best practices for scholarly authors in the age of predatory journals. Annals of the Royal College of Surgeons of England, 98, 77–79. doi: 10.1308/rcsann.2016.0056
Berger, M., & Cirasella, J. (2015). Beyond Beall’s list: Better understanding predatory publishers. College & Research Libraries News, 76(3), 132–135.
Butler, D. (2013, March 28). The dark side of publishing. Nature, 495, 433–435.
Clark, J., & Smith, R. (2015). Firm action needed on predatory journals: They’re harming researchers in low and middle income countries most, but everyone must fight back. BMJ (British Medical Journal), 350, h210. doi: 10.1136/bmj.h210
Cook, C. (2017). Predatory journals: The worst thing in publishing, ever. Journal of Orthopaedic & Sports Physical Therapy, 47(1), 1–2.
Gasparyan, A. Y., Yessirkepov, M., Diyanova, S. N., & Kitas, G. D. (2015). Publishing ethics and predatory practices: A dilemma for all stakeholders of science communication. Journal of Korean Medical Science, 30, 1010–1016. doi: 10.3346/jkms.2015.30.8.1010
Hansoti, B., Langdorf, M., & Murphy, L. (2016). Discriminating between legitimate and predatory open access journals: Report from the International Federation for Emergency Medicine Research Committee. Western Journal of Emergency Medicine, 17(5), 497–507.
Shen, C., & Björk, B.-C. (2015). “Predatory” open access: A longitudinal study of article volumes and market characteristics. BMC Medicine, 13, Article 230, unpaginated. doi: 10.1186/s12916-015-0469-2
Sorokowski, P., Kulczycki, E., Sorokowska, A., & Pisanski, K. (2017, March 23). Predatory journals recruit fake editor [Commentary]. Nature, 543(7646), 481–483.
Straumsheim, C. (2017, January 18). No more “Beall’s List.” Inside Higher Ed. Retrieved May 2, 2017, from https://www.insidehighered.com/news/2017/01/18/librarians-list-predatory-journals-reportedly-removed-due-threats-and-politics
Ware, M., & Mabe, M. (2015). The STM report: An overview of scientific and scholarly journal publishing (4th ed.). The Hague, the Netherlands: International Association of Scientific, Technical, and Medical Publishers.
Authors’ Note
All correspondence should be addressed to:
Pertti Hurme
Department of Language and Communication Studies
University of Jyväskylä
P.O. Box 35
40014 University of Jyväskylä, FINLAND
[email protected]
ISSN: 1795-6889 www.humantechnology.jyu.fi Volume 13(1), May 2017, 6–9
Guest Editors’ Introduction
HUMAN–TECHNOLOGY CHOREOGRAPHIES: BODY, MOVEMENT, AND SPACE IN EXPRESSIVE INTERACTIONS
Antti Pirhonen
Department of Teacher Education
University of Jyväskylä, Finland

Kai Tuuri
Department of Music, Art & Culture Studies
University of Jyväskylä, Finland

Cumhur Erkut
Department of Architecture, Design & Media Technology
Aalborg University Copenhagen, Denmark
THE EXPRESSIVE POTENTIAL OF HUMAN–TECHNOLOGY CHOREOGRAPHIES
The design of anything that is intended for us human beings is fundamentally influenced by how the designers conceptualize what is “human.” Researchers in the field of human–computer interaction happily use human to denote the counterpart of technology. In this tradition, however, human often becomes synonymous with a user of technology. As a result, the emphasis tends to be on the user’s cognitive abilities and the successful end results of interaction. Consequently, the embodied nature of humans—as intentional, moving beings—often receives less focus. We, as editors, but more so as researchers, thus want to promote a shift in focus from what can be done with technology to the inherently expressive processes of how humans move while interacting with technology.

In parallel with the emergence of digital applications in everyday life, there is a need to go beyond the traditional usability approach, to see human beings and the available technology from several points of view. Above all, the discussions inevitably relate to the very core of the nature of humanity. When discussing the relationship between humans and technology, researchers and designers take a stand—either implicitly or explicitly—on what is essentially human and what can be outsourced to technology.
© 2017 Antti Pirhonen, Kai Tuuri, & Cumhur Erkut, and the Open Science Centre, University of Jyväskylä DOI: http://dx.doi.org/10.17011/ht/urn.201705272515
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
In this thematic issue of Human Technology, we editors turn the focus of human–technology interaction from the traditional cognitive aspects to the moving human being, specifically emphasizing the concept of choreography in design and application. This issue was preceded by another special issue in the same topic area (see Pirhonen, Tuuri, & Erkut, 2016). By choreography we want to emphasize the meaningful continua of movement that humans, as individuals or as groups, constitute and experience during interaction with technology (see also, e.g., Loke & Reinhardt, 2012; Parviainen, Tuuri, & Pirhonen, 2013; Tuuri, Parviainen, & Pirhonen, in press).

Our perspective on choreographies in technology design and interaction clearly has proven timely and interesting. The call for papers on this topic attracted a large number of high-quality manuscripts. Following the review process, we ended up with enough papers to fill two separate thematic issues. In addition, three submissions were shifted to an open submissions issue of Human Technology.1 Another reason for two issues was the variation of topics among the submissions. We found it quite effortless to divide the submissions into two categories: The submissions in the May 2016 issue concentrated on interaction design, while the submissions in this issue focus more specifically on expressivity in human–technology choreographies. Therefore, the papers published in the present issue deal with themes such as emotion perception, analysis through dancing, interfaces for musical expression, and the biomechanics of drumming. Interaction design, even if it was the primary focus of the previous choreography-themed special issue, can still be seen at least implicitly as an area that potentially benefits from the insights of the contributions of this issue.

Human–technology choreographies denote a multifaceted approach to human life.
As can be seen in the contributions of this issue, the concept may be approached from a wide variety of perspectives. Both the backgrounds of the authors and the application areas are so diverse that, in the end, the common ground that all the texts share is an interest in the movements of human beings.

Nearly 3 years ago, we first considered how to continue the work involving human–technology choreographies that formed the basis for our NordiCHI ’14 conference workshop. We were convinced then, and remain so, that movement and the bodily basis of human cognition would be an appropriate approach to conceptualizing the relationship between humans and technology. Much has happened in technology and in our culture since that workshop. Indeed, it is often argued in the human–computer interaction research field that the form of technology changes so rapidly that research in the area becomes outdated quickly. However, as we observe the research world around us, it seems our theme has not lost any of its topicality. Rather, the expressive power of the moving human body remains a valuable and valued topic that deserves more attention among designers of technology.
PREVIEW OF THIS THEMATIC ISSUE
Olli Poutanen, Salu Ylirisku, and Petri Hoppu propose a method that highlights the value of first-person enactment in user-centered design. They capture the implied choreographies in particular interaction scenarios (drawn from two of Microsoft Corporation’s Productivity Future Vision videos) and literally dance these interactions while gathering insights. Their research
design is theoretically and practically relevant for the choreographic approach in interaction design (Parviainen et al., 2013). Theoretically, the authors extend the choreographic approach by applying the hermeneutics of the body; practically, they provide a methodical, movement-based framework for operationalizing the approach. Their work offers much for future intelligent environments designed with this approach.

In their article “Body, Space and Emotion: A Perceptual Study,” Donald Glowinski, Sélim Yahia Coll, Naëm Baron, Maëva Sanchez, Simon Schaerlaeken, and Didier Grandjean take a holistic stance on the expression and experience of body movements, specifically focusing on emotional content. In their conceptual framework, they combine the perspectives of the performer and the observer, thus outlining an integrated and interactive model of communication through the body.

In this second thematic issue, we are fortunate to have three articles that relate to embodied musical interaction. The first, titled “Musical Instruments, Body Movement, Space, and Motion Data: Music as an Emergent Multimodal Choreography,” comes from Federico Visi, Esther Coorevits, Rodrigo Schramm, and Eduardo Reck Miranda. The authors first propose techniques for extracting meaningful information from motion capture data. But the core of the paper lies in their suggestion that artistic practices, such as musical performances, can be utilized to better understand movement-related data for expressive purposes that ultimately extend beyond musical or artistic applications.

Marcella Mandanici, Antonio Rodà, and Sergio Canazza investigated bodily choreographies in producing musically structured events within a music education-oriented interactive space, explicitly explored through an application titled Harmonic Walk.
The authors suggest that movement coordination in such applications relates to entrainment phenomena between the human user and the environmental stimuli in the interactive space. Consequently, they propose a framework for designing and assessing motion-based music applications.

John R. Taylor demonstrates yet another approach to the notion of human–technology choreographies. Playing drums requires learning a large repertoire of choreographed body movements. Applying an information-processing stance as a model of human motor control, the author tackles the biomechanics of percussive performance in relation to the various developmental goals of a drummer. Motivated by research into computer modeling, and as a precursor to future empirical studies, the article outlines a theoretical framework for considering aspects of movement patterns in the interaction between the drummer and his or her instrument.

Finally, this issue includes a book review. The ubiquity of technology and its benefits in 21st-century societies undergirds much research and product development. But access to this technology is uneven across the globe and is the subject of the tome Public Access ICT Across Cultures: Diversifying Participation in the Network Society, edited by Francisco Proenza. In his review of this 2015 publication, Sakari Taipale assesses the research regarding workable and problematic provision of public access venues (e.g., government-subsidized telecenters, libraries) in 10 developing countries and emerging economies, as well as among various demographic groups, on three continents. He argues that the book provides an unprecedented overview of Internet access provision that allows the poorer citizens of the world to participate in the Network Society.
ENDNOTE
1. See Human Technology’s website (http://humantechnology.jyu.fi/archive/vol-12/issue-2) for papers by Marc-Eric Bobillier Chaumon, Bruno Cuvillier, Salima Body, & Florence Cros; Elena Márquez Segura, Laia Turmo Vidal, & Asreen Rostami; and Pablo Ventura & Daniel Bisig.
REFERENCES
Loke, L., & Reinhardt, D. (2012). First steps in body-machine choreography. In L. Loke & T. Robertson (Eds.), Australasian Computer Human Interaction Conference, 2nd International Workshop: The Body in Design (OzCHI ’12; pp. 20–23). Sydney, Australia: Interaction Design and Work Practice Laboratory (IDWoP).
Parviainen, J., Tuuri, K., & Pirhonen, A. (2013). Drifting down the technologization of life: Could choreography-based interaction design support us in engaging with the world and our embodied living? Challenges, 4(1), 103–115.
Pirhonen, A., Tuuri, K., & Erkut, C. (2016). Human–technology choreographies: Body, movement, and space [Guest Editors’ Introduction]. Human Technology, 12(1), 1–4.
Tuuri, K., Parviainen, J., & Pirhonen, A. (in press). Who controls who? Embodied control within human–technology choreographies. Interacting with Computers. doi: 10.1093/iwc/iww040
Authors’ Note
All correspondence should be addressed to:
Antti Pirhonen
Department of Teacher Education
FI-40014 University of Jyväskylä, Finland
[email protected]
ISSN: 1795-6889 www.humantechnology.jyu.fi Volume 13(1), May 2017, 10–31
TECHNOLOGY CHOREOGRAPHY: STUDYING INTERACTIONS IN MICROSOFT’S FUTURE VISIONS THROUGH DANCE
Olli Poutanen
Department of Design
Aalto University
Helsinki, Finland

Salu Ylirisku
SDU Design
University of Southern Denmark
Kolding, Denmark

Petri Hoppu
Media and Performing Arts
Oulu University of Applied Sciences
Finland
Abstract: In the future, an increasing number of devices will be utilized in concert to support human activities, but little is known about how these interacting multidevice settings should be designed optimally in a human-centered manner. We report on a study in which we took two visions created by the Microsoft Corporation as a starting point. The aim of the paper is to describe a method for user-centered design that extends the ideas of a choreographic approach to interaction design and to demonstrate how micromovement analysis can be conducted in practice. We utilized a structural reorganization of movement continua originally presented in the videos for a first-person enactment of that choreography as a means to understand the kinesthetic quality and the potential of the implied choreographies. The approach underscores the influence of interaction designs on the moving and experiencing body, as well as the potential that the moving and experiencing body has for interaction design.
Keywords: interaction design, intelligent environment, choreographic analysis, hermeneutics of the body, embodied mind, dance.
© 2017 Olli Poutanen, Salu Ylirisku, & Petri Hoppu, and the Open Science Centre, University of Jyväskylä DOI: http://dx.doi.org/10.17011/ht/urn.201705272516
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
Studying Interactions Through Dance
INTRODUCTION
Interaction design is moving from a single-device focus towards interconnected objects in intelligent environments (Gubbi, Buyya, Marusic, & Palaniswami, 2013). This study focuses on human–computer systems in which interplay occurs between the user and a combination of various form factors—for example, smartphones, tablets, screens, and smart objects, such as office accessories with inbuilt intelligence—that collectively constitute a communicating–actuating network. The enablers of the described interactions are embedded in systems built upon logic that exceeds that of any individual device. The ever-increasing sensing, processing, and communication capabilities of everyday objects enhance their capability of producing a “deeper understanding of the internal and external worlds encountered by humans” (Swan, 2012, p. 217). The main challenge of designing environments that are capable of responding sensibly to the user’s actions is, as Rogers puts it, to make the resulting data “intelligible, usable and useful” (Rogers, 2006, p. 412). In this study, we focus on intelligent environments in which the user is given the role of an active agent who orchestrates the network of smart, interconnected devices to serve his/her needs. We argue that, when defining these systems from the perspective of user-centered interaction design, the moving, sensing, and meaning-making body of researchers and designers should be considered the core of the design research method. We describe an analysis method that quite literally puts researchers and designers “into the shoes” of the user and increases their awareness of how physical setups of interconnected visual displays, moving application controls, and different haptic–tactile input and visual output mechanisms influence the user’s body during the interaction.
Our focus is on the ways in which a designer or a researcher can explore and analyze the qualities of movement experienced by the user during interaction. We describe how we used the method to analyze a conceptual use context. The analysis data come from two Microsoft Corporation videos on intelligent environments, namely Productivity Future Vision (2009, 2011). In this study, choreography is the central concept: It serves both as a term for describing body movement in space and time, in which case it is useful in conducting practical movement analysis, and as a concept that links our approach to the mobile aspects of embodied methodology. Through choreography, movement can be analyzed both as an ephemeral action and as a sustained, recurrent structure. The term choreography originates in the domain of dance. It refers primarily to the structuring of movement in space. However, it also implies a profound attentiveness to the body and its movements both on stage and in everyday life, where any movement or mobile action can be turned into choreography when performed by people trained in these activities, such as dance (Thomas, 2013, p. 3). Even within the field of dance, the concept of choreography does not merely refer to an external model or schema for structured actions; the term has been expanded to encompass its frame and parameters, dancing bodies and their physicality, and processes of learning, as well as the ways in which the choreography is actualized and experienced (Foster, 2011, pp. 215–218). Furthermore, choreography is not limited to dance but is used to describe manifold embodied mobile processes, even the functioning of the brain. As J. L. Hanna (2015) suggested, the brain organizes sensory experiences, thoughts, and actions into meaningful entities, such as thinking, speaking, and dancing, which Hanna regarded as a complex choreographic process.
Choreography may indeed point to the fact that the mode of cognitive organization is characterized by mobility
and, therefore, the concept of choreography can be seen as a valid tool for analyzing complex systems of human behavior (Mason, 2009, p. 32). Performed actions, as well as sequences of movement progression, can be encompassed within the term choreography. The study of movement is not new to interaction design and research. Even as the focus in the field has shifted over decades from usability, user experience, and co-experience to embodied interaction (see Dourish, 2001), all these frameworks have considered the user’s movement as part of the interaction. Understandably, the different frameworks’ relationship to movement, as well as the scope and rigor of movement analysis, vary considerably among these traditions. In some approaches to interaction design, movement has been studied with special attention. This applies especially to movement-oriented interaction design (see Loke & Robertson, 2013). We acknowledge this background, yet we also claim that the role of the physical human body and the ways in which the acting body forms structures of movements in space, in interaction, and in environments with networked technologies has thus far attracted limited attention. Not surprisingly, the scientific treatment of the human body in interaction design has its roots in the ontological, epistemological, and methodological structures of Western knowledge. Science has been regarded as belonging to the realm of cognition which, in turn, has been treated as nonembodied, that is, a matter of the mind. This, however, has been questioned by scholars from neurologists (e.g., Damasio, 1994) and linguists (e.g., Lakoff & Johnson, 1999) to dance researchers (e.g., Parviainen, 1998). Moreover, philosophers, such as Nietzsche (“there is more reason, sanity and intelligence in your body than in your best wisdom”; 1883/2010, p. 32), Heidegger (1975), and Merleau-Ponty (1945/1965), also aimed at challenging the dominance of logocentric knowledge in Western thinking. 
Interaction design that factors in the human body, its knowledge, and its versatile intentions in human–technology choreographies may positively contribute to the development of everyday environments. It is challenging to define the interrelationship between bodies, movement, and objects in interactive environments. A possible starting point for understanding the nature of the interaction and for describing complicated embodied interrelationships is to count human beings as part of the “thing world.” Thus, the body is characterized as the user of things that, in turn, can be considered augmentations of the biological body. The body and the things are thus characterized by constant coevolution in which the reach and the possibilities of the body are defined by the nature of the hybrids created (see Thrift, 2008, p. 10). Because the moving body and its intentional behavior are emphasized as constitutive elements of human–technology interaction in this study, the consequent definitions of interaction can be built upon theories in which the body, movement, and perception are seen as sources of knowledge. The question of the embodied mind from the perspective of natural sciences has been studied by a number of scholars since the late 20th century. As a biologist, philosopher, and pioneer of the science of embodied cognition, Francisco Varela (1999) applied phenomenological analyses of time-consciousness to neuro-dynamic processes connected to the lived temporal experience of the embodied ego. Influenced by Varela and other scholars of the embodied mind, philosopher Evan Thompson (2007) stated that concepts, language, and thinking are based on embodied information: In other words, the lived body is the foundation of all knowing. Similarly, neurologist Antonio Damasio (1994, p. 118) concurred that the human body, not only the brain, is inextricably and comprehensively intertwined with the mind in the sense that the mind can be seen as embodied. 
Linguists, such as Lakoff and Johnson (1999), have developed similar themes
from the perspective of cognitive science, addressing how language and thinking have their deepest roots in embodied metaphors. The embodied mind is tightly connected to embodied knowledge, a concept that emphasizes the body as the basis of knowledge. Michael Polanyi (1966) named the embodied dimension of knowing “tacit knowing,” which referred to the kind of knowing that cannot be completely articulated verbally, as it remains rooted in human bodily abilities and expressions. Embodied knowledge belongs to the realm of nontraditional episteme, in which valid knowledge is not merely propositional. This was further demonstrated by Maxine Sheets-Johnstone (1999, pp. 490–491), who stated that thinking is essentially embodied in a way that is often a matter of movement, not of language, and, as a result, it is not restricted to the mental capacities of the brain. Thinking in movement is not the work of a symbol-making body but of an existentially resonant body that knows the world through its own kinesthetic qualia as well as through the movement of the surrounding world (Sheets-Johnstone, 1999, pp. 516–517). Embodied or tacit knowledge is often regarded as something that escapes verbal explication, which makes the phenomenon difficult to recognize and analyze. This, however, is partly a delusion because language is not the only way to articulate interpretations of reality. In this article, we argue that movement is an integral and natural part of human thinking and that it supports the organization of ideas in human–computer interaction. Movement serves as a means of documenting and reporting results that are easier to understand by looking at the enactment of the movement sequences. In this study, enacting micromovement continua formed an important step in gaining access to the kinesthetic experience of human–technology interaction.
Furthermore, performing movement sequences with awareness of specific technological contexts made it possible to concentrate on a user-centered understanding of the mutual interplay between the user’s body and senses in interaction with technology. Microsoft’s videos Productivity Future Vision (2009, 2011) provided a visual reference for the technological settings. These videos provided comprehensive scenarios of how interactions between a user and a set of devices unfold in space and time. The videos are precise in representing movement continua, which makes them suitable for choreographic analysis. Because the researcher who danced the choreography (author Olli Poutanen) is very familiar with the Microsoft video material that presents the interactions in context, the enactment of choreography in this study also serves as a mediator between the felt essences of movement and the situated interactions. The aim of the study is to extend the choreographic approach to interaction design introduced by Parviainen, Tuuri, and Pirhonen (2013) and to introduce a research procedure that supports conducting microlevel choreographic analysis in practice. The outcome of a choreographic approach enhances the design of human–technology interfaces that stimulate the imagination of the user with creative movement practices. Ideally, the choreographic approach facilitates the creation of configurations of networked products, services, and systems that enable users to become more expressive, effective, and productive. According to Parviainen et al. (2013), movement-based interfaces, actions, and the movements themselves may support the creation of sensitive or intelligent user-centered experiences. We find the ideas of the choreographic approach highly relevant, yet the existing literature does not offer a clear methodical framework for operationalizing it. We adopt the idea of micromovement continua by Parviainen et al.
and extend it with another approach called hermeneutics of the body (Hoppu, 1999, 2005). Hermeneutics of the body enabled us to build a framework for conducting first-person enactment and for performing interaction choreography as
part of a micromovement analysis. An experiment utilizing this approach is described and discussed below. The method development presented in this paper is based on a previous experiment (Poutanen, 2015); thus, the experiment is presented in a descriptive manner. The presented method contributes to the understanding of how the kinesthetic quality and potential of the implied interaction choreographies can be addressed through interaction design research.
RELATED WORK
One of the pioneering documented works in the field of prototyping ubiquitous computing environments is reported by Johanson, Fox, and Winograd (2002). Their work outlined both design principles and aspects that need to be considered in evaluation. Their principles for the design of a ubiquitous computing room for teamwork included valuing the collocation of people and the materials they work with, the appraisal of the social conventions of the people, and the aspiration for simplicity. Using these principles, Johanson et al. created an interactive space with novel applications and computing devices that enabled them to study how people used these applications. They studied interaction with a special focus on three generic task scenarios: moving data, moving control, and dynamic application control. Johanson et al. concluded that one of the most critical aspects of the fluency of the space and interactivity is user learning. To facilitate success, users should not be forced to learn more than a minimal number of new controls and modes. We view this as a point where both the embodied approach and the concept of choreography become very useful: Designers can become aware of, and build systematically on, users’ existing repertoires of bodily conventions. Iqbal, Sturm, Kulyk, Wang, and Terken (2005) studied how ubiquitous technologies could provide support for collocated communication and collaboration in meetings and in teaching. The evaluation method focused on social dynamics, and the goal was to discover how well a computing environment might function as a proactive “butler” that senses what the users need and then assists them at the right time in the right way. This approach emphasized the intelligence of computing systems over people’s embodied capabilities. Their evaluation built on the collaboration usability analysis (Pinelle, Gutwin, & Greenberg, 2003).
The evaluation of intelligent environments, such as control rooms with complex networked equipment, has thus far been based primarily on the traditional usability and ergonomics emphasis (cf. Savioja, 2014). These evaluations emphasize safety, human well-being, and the attainment of predefined goals. It is understandable that, in production environments, this is a highly feasible approach that conforms to general work regulations and expectations of efficiency. However, when design focuses on private and creative workspaces, a more flexible use of digital resources may lead to better results in terms of work productivity and enjoyment of interaction. In order to achieve these goals, further development of methods and approaches that recognize variation in users’ needs, as well as differences in users’ motivations, is needed. Loke and Robertson (2013) reported on a long line of research in movement-oriented interaction design and, on this basis, formulated a method presented in their article “Moving and Making Strange” (i.e., moving, sensing, and feeling in order to open up new spaces for design). Central to this method is providing a coherent approach to movement-based interaction design in which movement is considered an input for interaction design. According
to Loke and Robertson, the approach allows for a systematic and principled development of movement-based interaction. The approach operates from three central perspectives: those of the mover, the observer, and the machine. The process of moving and making strange consists of seven phases: (a) investigating movement, (b) inventing and choreographing movement, (c) re-enacting movement, (d) describing and documenting movement, (e) visually analyzing and representing movement, (f) representing machine input, and (g) interpreting moving bodies (Loke & Robertson, 2013, p. 7:11). The moving and making strange approach is well suited to the design of movement-oriented interactive systems. Kaasinen et al. (2013) approached intelligent environments from a user-centered perspective and built their approach on the concepts of user expectation, user acceptance, and user experience. They recognized the interplay between the concepts of expectation and experience: On the one hand, experiences are affected by individual expectations and, on the other hand, experiences can become influential in the formation of future expectations (Kaasinen et al., 2013, p. 12). User acceptance, meanwhile, is defined as the intention to use a certain technology or the actual use of that technology (p. 6). The authors pointed out that the majority of user acceptance studies were first conducted in organizational contexts. User acceptance theories have been used to provide an understanding of how workers adopt new information technologies and of how this information could facilitate adoption processes in the target organization. Since then, the arena of user acceptance studies has widened to accommodate complex interrelating systems and consumer-oriented applications. With these reinforcements, Kaasinen et al. argued that user acceptance studies have incorporated some of the critical variables needed to benefit studies on intelligent environments (p. 6). Kaasinen et al.
(2013) also noted that users should be provided with a role as co-crafters of their intelligent environments (p. 3). They argued that, as intelligent technologies become ubiquitous and pervasive, the use of intelligent technologies becomes less optional; rather, smart technologies become part of users’ surroundings and daily life. Thus, users should be allowed to participate in the development of these environments because the solutions directly affect their lives (p. 3). Rogers (2006) presented similar ideas and stated that, despite the expected major leaps in technological intelligence and automation, taking the initiative to be creative, constructive, and in control of the interactions within the technological environments is an activity that should belong to the user. Furthermore, technologies could be designed to activate and motivate the user through interactions that creatively, excitedly, and constructively extend what people currently do (Rogers, 2006, p. 406). Typically, interaction design in intelligent environments is conducted and presented in the form of narratives, such as scenarios (Carroll, 2000), customer journeys, or journey maps (Polaine, Løvlie, & Reason, 2013). These methods omit a large portion of the actual movement structures in which people interacting in a technologically mediated environment engage. Choreographic analysis ensures that indirect yet meaningful movements that emerge in the particular local context of use are critically examined.
METHODS AND RESEARCH DATA
For this study, we expanded the choreographic approach to interaction design (Parviainen et al., 2013) that was used in a previous experiment (Poutanen, 2015), in which two videos
presenting a future view of networked technology by the Microsoft Corporation were analyzed. Both videos are called Productivity Future Vision: The 2009 version runs 5 minutes and 46 seconds, and the 2011 version has a length of 6 minutes and 18 seconds. These videos narrate how future information networks could be utilized with ubiquitous interactive technologies, clearly emphasizing the benefits of linked data resources, open access, and powerful information processing. The visions are presented as a sequence of different scenes. Together, these two videos can be interpreted as providing views into a day in the life of the two central characters. The scenarios cover places and situations that characterize the smart home, intelligent transportation, and the smart office, merging personal and professional communication. Our analysis of these videos was conducted with the choreographic analysis method extended with the hermeneutics of the body method. The approach yielded new research data in the form of notes and experiences that were utilized in the analysis. For example, the data included notes about desirable and undesirable characteristics of movement continua, such as intuitive movement sequences, logical visual–haptic couplings of augmented and internal feedback, and kinesthetic coherence. The notes were initially made in the format of a dance diary, which was kept throughout the research process. The dance diary was in itself an open-ended but frequent process of writing entries about ideas and experiences related to different steps of the experiment. Connections between the dance diary and the adopted methods—choreographic analysis and hermeneutics of the body—made the dance diary a central tool for reflecting on theory in relation to the research data, which consisted of the two Productivity Future Vision videos (2009, 2011).
The dance diary also provided key data that allowed documentation of, and reflection on, the hermeneutic process as experienced by the enacting researcher, Olli Poutanen.
Choreographic Analysis
Choreographic analysis (Parviainen et al., 2013) is an analytic approach that addresses diverse movement dynamics and emphasizes the perspective of an embodied and active user. It enables the assessment of interaction choreographies on various temporal and spatial scales. The approach may be used to detect design flaws and highlight successful choreographic designs related to both individual and intersubjective interactions. Choreographies serve as instruments for shifting the focus of design from the shapes and structures of objects to the activity and intended use of technology. Understanding and expressing the user’s movement continua is a practical way to approach the movement-oriented flow of everyday activities (Parviainen et al., 2013). The choreographic approach addresses the embodied dimensions of experience that can be accessed only by an experiencing body, and choreographic analysis hence recognizes the researcher’s body as a legitimate research instrument (cf. Hoppu, 2005). In practice, choreographic analysis can be described with a three-level structure. The levels are labeled as micro, local, and macro, although all of the levels are interconnected. The microlevel describes the dynamicity of “acting-sensing bodies and enactive minds” (Parviainen et al., 2013, p. 110), and the focus is directed toward the subtleties and habituations of the user’s muscular activity. The spatial focus of microlevel choreographies is the kinesphere, that is, the space within the body’s reach (Parviainen et al., 2013, pp. 110–111; also Laban, 1963, p. 85). Different ways of touching, looking, twisting, squeezing, pushing, turning, and so on, are examples of micromovements that are considered in the analysis. These individual micromovements are not
studied in isolation but as constituents of micromovement sequences. These movement sequences form the units of analysis for this study, and they are examined in relation to the kinesthetic experience of conducting the movement continua. Local-level analysis concentrates on the relation between the engaged performer and the “intentional, environment-oriented and social aspects of interaction” (Parviainen et al., 2013, pp. 110–111). A local-level analysis aims at clarifying how the flow of the movement continua fits into the user’s situation and addresses usability-related questions, such as the intuitive and engaging character of the interactions. Macrolevel analysis addresses the “socio-cultural effects of the design choices” (Parviainen et al., 2013, p. 110). This is a systems-level perspective that helps in addressing the question of choreographic sustainability. Macrolevel analysis operates on a global scale, addressing phenomena in both geographical and virtual space (Parviainen et al., 2013, pp. 110–111). Microsoft’s Productivity Future Vision videos (2009, 2011) presented, as expected, an abundance of visions related to computation, on-demand processing and visualization of data, smart supply chain management tools, and telepresence meeting solutions, all relevant to both local-level and macrolevel choreography analyses. However, because the kinesthetic analysis is the main focus of this study, the local-level and macrolevel perspectives were not aspects of our analysis. More information on local-level analysis can be found in the previous study (Poutanen, 2015, pp. 44–53).
Hermeneutics of the Body
Hermeneutics of the body is a method developed by Petri Hoppu (2005), building primarily on the thinking of Hans-Georg Gadamer (1960/1975). It presents a perspective of science that connects research and dancing. In hermeneutic analysis, the researcher is in contact with his/her object of research (in Gadamer’s work, primarily texts) and, as the researcher gains new insights concerning the object, his/her perceptions and expectations change. The basic idea of hermeneutics can be abstracted as a spiral representing a process that deepens the researcher’s knowledge of the research object (Hoppu, 2005, p. 107). With the felt dimensions of movement as the object of research, the researcher’s own body and the memories carried by the body can be seen as primary resources of the study (Sklar, 2000, p. 71). This entire process of interaction can be regarded as an extension of the human empathic capacity, which, according to de Waal (2008, p. 281), enables a person to access and assess another person’s emotional state and the reasons for the state, and to adopt the other person’s perspective. According to philosopher Edith Stein (1917/1989, p. 20), adopting the other person’s perspective is a special way of “knowing within others,” which differs from conceptual knowledge. The latter knowledge is typically based on verbal communication, whereas the former is that which can be grasped primarily through experienced, embodied acts with the other. Stein’s view was elaborated by Parviainen (2003, pp. 157–162), who referred to kinesthetic empathy as a key feature in understanding another person’s nonverbal kinesthetic experiences and in acquiring knowledge of his/her movement. When one perceives another person’s movements, one does not merely perceive bodily expressions but also the living experiences of those movements, though not primordially.
Moreover, Dee Reynolds and Matthew Reason (2012) argued that kinesthetic empathy is not limited to human interaction, but it also includes objects and spaces. An object may merge with its human counterpart through an empathic act, of which riding a bike is an example, and a space may have a strong, embodied impact, such as on a person who enters a large room.
Typically, researchers have paid little attention to the embodiment of research, despite the fact that corporeality forms an essential part of social reality. According to Thomas (2003, p. 77), little discussion has explored the embodiment of research activities, that is, practices such as observation, interpretation, and/or analysis, and, more importantly, the influence of these on the experiences of both the researcher and the researched (see also Hoppu, 2005, p. 107). However, the body is clearly a means to acquire knowledge of the movement and—through understanding “qualitative and associative nuances” of the movement—the body provides the researcher with a language to express this knowledge (Sklar, 2000, p. 75). Therefore, in this approach, the moving and experiencing body is not only a phenomenon under investigation but also an evolving reflexive process of inquiry involving the researcher as a living subject whose activities are examined from the perspective of embodiment. Traditionally, hermeneutic analysis begins in the field, where the researcher meets practice. During this phase, s/he also articulates his/her experiences in text. In our research, the fieldwork phase refers to learning and performing a choreography based on the micromovements performed in the Microsoft videos. Here, choreography is seen as a process of both producing and reproducing embodied knowledge—a way of articulating and structuring movement as well as learning and conveying it through empathic acts. In other words, choreography is not limited to something that is merely observed; it is something that may become a part of the researcher’s body. This resembles Nonaka and Takeuchi’s (1995, p. 64) mode of socialization, where tacit knowledge is conveyed from one person to other people through interaction or observation. Their model has been criticized for being limited (Gourlay, 2006).
Nevertheless, adopting tacit knowledge into one’s body is regarded as a significant facet of developing an understanding of the body-movement phenomenon under investigation. Embodied knowledge needs to be made explicit for people to become aware of it. What Nonaka and Takeuchi (1995, p. 65) called externalization is also a central element of research. In our study, externalization encompassed (a) learning to perform the movements, including the continua of body postures, individual limb movements, and sensuous foci reaching out to the surrounding space, (b) understanding the meanings of interactions, and (c) interpreting the nuances relating to the microlevel choreographies. In the overall hermeneutic process, embodied knowledge was externalized and documented in active writing, which implies that the researcher constantly compared what he had written regarding evolving experiences, feelings, and skills, and iteratively worked towards a deeper understanding of what was examined (Hoppu, 2005, p. 109). The hermeneutic process intertwines human embodiment, its experience, and its interpretation (Hoppu, 2005, p. 107).
The Research Procedure
The research procedure for the micromovement interactions was conducted in five steps: (a) extracting, (b) reorganizing, (c) performing, (d) dance writing, and (e) analyzing. During the extraction step, the main elements of the choreographies were extracted as separate movements from the Productivity Future Vision videos (2009, 2011). A total of 112 individual screenshots depicting individual micromovements were taken and arranged. The screenshots were first arranged into a sequential continuum from the start of an action to its completion. The screenshots were organized on a computer screen so that each movement continuum was organized on a vertical axis, resulting in separate image clusters. Within each
image cluster, the first micromovement was set in the uppermost part of the screen. Screenshot clusters representing different scenarios, movement continua, and use contexts were aligned horizontally in rows. This approach provided a wide visual view of the material of the choreography as a whole and thus a good starting point for the reorganization of the data. In the reorganizing step, the previous screenshot clusters were reorganized into three clusters corresponding to the body position of the user during interaction, namely the sitting position, the standing position, and walking. The reason for reordering the formerly mixed order was to enhance the flow of activities that composed the choreography, thereby allowing a movement progression from the performance components (i.e., rehearsal and “dancing”) to analysis. Based on body postures, the enactment of the choreography logically flowed through all relevant movement continua, first from sitting, proceeding to interactions in a standing position, and moving on from standing to walking, in line with the original interaction. In the performance step, the enacting researcher performed the rearranged choreography through the chunked movement flows. The performing was supported by a process known as dance writing, that is, developing a script of the micromovements to be performed as well as writing down observations that emerged during the performing. The performing and dance writing steps were carried out alternately. In the rehearsal of the movement continua, the focus of dance writing was on the technical reproduction of the movement: the correctness of body alignment and executing the right micromovements in the right order. Once the choreography was internalized by the researcher, the dance performance process was initiated, with the focus shifting to assessing the bodily experience of the movement.
Dancing the movements was thus the premise for creating an embodied understanding of interaction and for engaging in a state of critical thinking in which the nuances of interaction could unfold. At this point, a dance diary was used to explicate emerging ideas and interpretations of the kinesthetic experience of human–technology interaction. Figure 1 illustrates how the movements were performed without representations of technology to better focus on the movements and the kinesthetic experience. The enacting researcher performed all the micromovements involved in the specified activity, including the gazing, and imagined the use context that involved the devices, the physical architectonic space and furniture, and the information content around him. This was important because, without keeping the use context in mind, the analysis would only have represented the pleasurability of a certain movement sequence. The researcher’s ability to differentiate transitions from one device to another, as well as the spatial organization of the choreography, allowed him, for instance, to locate discontinuities in the movement flow and to reflect on the cause of dysfunction in the suggested micromovement continuum for a specific interaction.

The researcher conducted the choreographed dance within a 7-day timespan of intensive engagement with the movements. The dancing cycle consisted of morning and evening sessions. Writing about the dancing experience was conducted daily in connection with the two sessions, resulting in an iterative writing process typical of the hermeneutic process. The choreography was danced five times in a row in each session. A dance diary, written by the enacting researcher immediately after each session, was used for documenting the ideas and findings that emerged during and after the dancing. Diary entries were written in a nonstructured manner. As the writings from the sessions started to accumulate, several interests and themes started to take shape.
The practice of writing helped the enacting researcher to explicate kinesthetic sensations and become aware of specific nuances in the choreography. Performing and writing thus constituted a reciprocal process.
19 Poutanen, Ylirisku, & Hoppu
Figure 1. In the movement continuum example above, the user copies information from one device to another with gesture commands without actually employing the technology, thus reflecting the movement experience of using the technology within the imagined use context. The enacting researcher holds his left hand in a pointing position and enacts a sequence in which he directs imaginary information content on a computer screen. Using his other hand, he drags the content from a computer to a tablet device. This enactment also demonstrates how the micromovement of dragging is part of a continuous flow of movement in the overall choreography, which progressed seamlessly through the various actions conducted with the imaginary technology surrounding the performer.
Altogether, the enacting researcher performed the choreography 70 times during the entire enacting period, which means a total of 7840 micromovements performed in the cumulative choreography. One round of performing the full choreography took approximately 2 minutes 30 seconds; thus, the entire study totaled 175 minutes of active dancing, traversing all the studied movements. Going through movements to this extent is not essential for learning an advanced skill, but it nevertheless allows for the attainment of the kind of repeated embodied experience of the studied choreographies that functions as a foundation for drawing conclusions about the kinesthetic experience and about the experienced effectiveness of the performed actions.

At the end of the performing and dance writing steps, a final performance was documented on video with three cameras in a television studio. The choreography included the 112 micromovements, flowing from sitting to standing to moving, resulting in a total of 150 seconds of performed choreography, that is, the dance. The video shows the enacting researcher’s body in a quasi-empty studio; the space was furnished with only a chair for presenting movements in a sitting position. Because the performances were created without physically employing the technologies under investigation, the dance rendered visible the movements that were obscured by the overwhelming visual richness of the original video material, which utilized special effects extensively. Thus, the visual documentation of the dance underscored the relationship of reciprocal influence between the technological designs and interactive interfaces on the one hand and the user’s body and movement on the other. The visual performance was, however, only a part of the process; addressing the experience of performing the movements as if immersed in the use context formed another challenge. The research process concluded with the analysis.
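As an aside, the totals reported in this section follow directly from the session structure described earlier (five rounds per session, two sessions per day, over 7 days); a minimal arithmetic sketch, with all figures taken from the text:

```python
# Sanity check of the enactment totals reported in the text.
micromovements_per_round = 112   # micromovements in the full choreography
rounds_per_session = 5           # the choreography was danced five times in a row
sessions_per_day = 2             # morning and evening sessions
days = 7                         # 7-day enacting period

rounds_total = rounds_per_session * sessions_per_day * days
print(rounds_total)              # → 70 performances of the full choreography

micromovements_total = rounds_total * micromovements_per_round
print(micromovements_total)      # → 7840 micromovements in total

seconds_per_round = 150          # approx. 2 min 30 s per round
minutes_total = rounds_total * seconds_per_round / 60
print(minutes_total)             # → 175.0 minutes of active dancing
```

The reported figures (70 performances, 7840 micromovements, 175 minutes) are thus internally consistent.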
It would be erroneous, however, to regard the analysis as a discrete final phase, because performance, reflection, and analysis occurred in a hermeneutic loop throughout the study. The analysis process therefore unfolded as a reflective and analytical dialogue within the construction of the choreography. In this dialogue, attention was paid to microlevel phenomena through the definition of movement continua (Poutanen,
2015, pp. 38–53). The analytical focus was on the kinesthetic experience, on the one hand, and on technology design, on the other. The notes compiled in the dance diary regarding the microlevel choreographies were grouped, and the groups were labeled by the enacting researcher during and shortly after the active performing of the choreography. The focus was on discovering the desirable and undesirable characteristics of the movement continua, such as intuitive movement sequences, logical visual–haptic couplings of augmented and internal feedback, and kinesthetic coherence. An example of the imaginary technology and the gestural trajectory is depicted in Figure 2 to illustrate how the enacting researcher conceived the technology in his immediate surroundings while performing the interaction.
Figure 2. A visualization depicting the imaginary technology (i.e., a tablet and a desktop computer) superimposed onto a frame from the movement continuum used by the enacting researcher in performing a sequence. These visualizations were not part of the performance but are added here to show how the enacting researcher was imagining the technologies.
FINDINGS
The process of reorganizing the micromovements and the iterative processes of rehearsing/dancing and dance writing yielded discoveries in multiple areas. Most importantly, the enactment generated insight into the kinesthetic experience and usability of the technologies proposed in the Microsoft visions, as well as into generic considerations of how particular technology artifacts constrain, guide, and support movement, positioning, posture, orientation, and sensuous foci. The visual–tactile continua in time and space were found to be highly significant for the experience of interaction. This outcome points to the importance of how the orchestration of human–computer interaction within a network of heterogeneous, interconnected devices could be made to enhance the user’s kinesthetic experience.

The technological designs in both Productivity Future Vision (2009, 2011) productions are based on five distinct technological artifacts: mobile resources, tabletop screens, smart walls, wearable technologies (glasses and a wristwatch), and small smart objects. All these intelligent interactive elements affect the user’s movement, position in space, posture, and sensuous focus, among other embodied experiences. In the Productivity Future Vision videos, many of the artifacts employed have specific vocabularies and typical movement continua for
interaction, both individually and when combined with other types of artifacts. One key finding of our research was that combining various artifacts creates the setting for choreographies. Understanding the different choreographic possibilities related to certain technologies and their collaborative use can thus facilitate interaction design in specific use contexts.

Kinesthetically, the enacting researcher experienced the use of small handheld devices as unpleasant and restricting. The experience probably was compounded by the lack of any actual reference point in the enacting researcher’s hand. Larger screens, however, enabled the utilization of broader movement trajectories, allowing the activation of larger areas of the body. The enacting researcher considered these movements more natural and pleasurable, although this outcome also could be a question of personal preference. The smartphone- and tablet-sized interfaces performed well in situations in which the actual data manipulation was realized with embedded information resources, such as on large wall screens or on a tabletop. The smart handheld devices were experienced as performing best when operated within an ecology of resources, such as screens shared by connected computing processes (see also Poutanen, 2015, pp. 55–59).

The tactile–visual input–output loops between the user and the technological system can be considered a basic organizing principle of human–computer interaction in the Productivity Future Vision (2009, 2011). The user’s body during interaction seemed to follow the spatial–temporal progression of visual foci in the ongoing interaction. This aspect of the analysis suggests that the dynamic relations between the visual and tactile foci have a comprehensive influence on the user’s body during interaction, which should be examined further.
A key finding was related to the role of the user’s sight in interaction: When his/her gaze traveled across multiple screens provided by a device network, it seemed to have the potential to enable choreographies that activate the user’s body, for instance, prompting shifts between body postures and alignment in space in reaction to the relocation of visually displayed information in the surrounding space. We suggest that shifts in visual focus can be intentionally designed to support smooth changes in body alignment and posture. In contrast, superimposed visual and tactile foci, especially when extended over time, can lead to stagnant and unpleasant kinesthetic experiences (see also Poutanen, 2015, pp. 58–59).
Kinesthetic Pleasure
Dance reflections proved valuable in recognizing the challenges within the micromovement sequences and, conversely, in finding insightfully designed choreographies that resulted in pleasing kinesthetic experiences. The first indications of successful choreographies arose at the beginning of the dance practice, when the enacting researcher recognized that a particular interaction was easy and quick to internalize into dance. Dancing also evoked personal experiences of the pleasurability of particular movement continua. In particular, tapping as a method for drawing data from the device was reported as fun by the enacting researcher. In the dance diary, he described the experience as resembling the act of tapping a floating object so that it submerges and resurfaces with a popping motion.

The kinesthetic feel of the micromovement choreography of a map application that zoomed in response to vertical alterations of the handheld smartphone screen also was experienced as pleasurable: Lifting the device up enlarged the map, and moving the device down shrank it (see Figure 3). This movement was among the favorite movement sequences reported by the enacting researcher. The application enabled dynamic exploration of a map on a small-screen device through the use of two hands. It recognized sweeps and finger-based zoom commands (two or three fingers) without compromising the usability of the application. This particular use feature also provided a surprise for the enacting researcher: The visual image of the use situation on video was not appealing at first, yet the related somatic experience of the micromovement choreography was experienced as pleasing. Thus, it can be stated that embodying the choreography changed the enacting researcher’s preconception of the use experience of this particular interaction.

Figure 3. Example of using level alterations to scale the content to a desired level of detail on a relatively small screen; moving the device upward zoomed in, and moving it downward zoomed out.

Enacting also revealed that it is not enough to design good individual movements in isolation: It is necessary to think through how the complete interaction choreography flows as a movement sequence. A clever micromovement idea can be lost within a clumsy movement sequence. Based on the experience of the enacting researcher, some of the movement combinations required uncomfortable body postures, and the suggested micromovement sequences resulted in unnatural bodily positions. For example, the interaction was experienced as unpleasant when the researcher had to stretch his arms to the side and perform a converging movement. The enacting researcher, a young healthy male, experienced these movements as straining. Working through the complete dance choreography nevertheless sensitized the enacting researcher to the differences and similarities across the kinesthetic experience of the movement sequences. Thus, some of the movement continua were experienced as complex and physically demanding while others were free flowing. Some movement sequences were experienced as rigid, and their rigidness was highlighted by the surrounding fluent movement sequences.
Thus, it is possible that technological interactions presented as smooth and pleasing in video productions might not be experienced as such in reality. When the kinesthetic experience is the focus of interest, analyzing technology use through reenactment is therefore a more truthful and accurate means of assessing the quality of experience than watching a mediated use.
Usability and Embodied Engagement
In the Productivity Future Vision (2009, 2011) videos, it was suggested that the envisioned interactions enable the user to accomplish technological interactions smoothly. The visions presented rich variations in choreographic strategies that seem to have been created to avoid strictly logic-driven interactions. Usability is taken into account through the optimization
of user–system behavior by minimizing the time, effort, and errors of users trying to attain their goals (cf. Nielsen, 1993). Some of the usability issues were easy to identify through the choreographic approach. For example, it was apparent that some of the micromovement sequences were too complex, as exemplified by the difficulty of internalizing and performing them. This resulted in extra movements, as compared to other interaction choreographies with a similar intent. By making the usability issues explicit between homologous choreographies, it was possible to locate inconsistencies in the overall choreographic vocabulary. However, there were some examples of intuitive choreographies that could be figured out just by using the technologies, that is, “learning the possibility of drag and drop interaction can also be learned by accident” (Dance diary excerpt). Usability was also reported as a user-driven recombination of visual displays in space, in that the “wider table-surface-computer provides a larger screen that supports extending the illustration into a form that supports [the activity of the user]” (Dance diary excerpt). The transmission of information between a table-surface computer and a tablet device is presented in Figure 4.

In sum, usability was assessed by the enacting researcher in terms of how a technological system supported the body’s ability to learn, remember, and become skillful in controlling networked resources in the intelligent environment. The researcher’s personal learning process through choreographed reenactments of the micromovements provided a benchmark for how that can be achieved.

Figure 4. Example of how devices network spontaneously in Productivity Future Vision and how the advantages of different forms can be seized; in this figure, the surface of a large screen has been taken into use in scaling up content in order to create a more appropriate visual representation for the user’s task at hand.

Limits of the Choreographic Approach

The choreographic approach had limitations in instances in which particular information perceptualizations had to be considered. Productivity Future Vision (2009, 2011) presented a number of interactions in which content at the edge of the screen seemingly disappeared from a device and reappeared on another. Because no augmentations or props of any kind were used when reenacting these specific choreographies, interactions in the video data had to be performed consciously, keeping the original context in mind. The problematic character of these events, however, unfolded only once the nature of that particular movement continuum had developed into embodied knowledge of the enacting researcher, and thus into an active component of the reflection that emerged during the performance. In confronting these types of situations, it is important that the researcher observe closely in the original video the relative position of the devices, for example, whether the target screen is situated behind the sending screen, on top of it, or somewhere else.

Moreover, a choreographic analysis is based on subjective experience, and the quality of the analysis corresponds with the researcher’s experience and skills. The approach can be considered artistic research on technology; further development and adaptation of the method toward a more systematic evaluation of human–computer interaction are needed. Some usability issues are also more challenging to assess through the choreographic approach, such as movements that require painstaking care with body positioning. For example, while a menu interface required the user to hold his/her hand accurately in an exact position, the danced performance was unable to recreate the role of physical surfaces in supporting the accurate performance of the movements. Such realities make the enactment insufficient for providing, for instance, tactile and haptic feedback on the devices with which the user directly interacts. The choreographic approach conducted with only one person enacting the choreography also overlooks the social aspects of situations in which interaction between different users plays a role.
This study was nevertheless an experiment with the goal of developing an understanding of what would happen if technology-design critique were based on the experiences arising from the continuous movement of a single user in an environment with networked devices, such as visual displays with touchscreen interfaces. The study concentrated on the experience of movement, and it was realized in a way that explicitly moved technology out of the way of the moving body.

Finally, this research raises the question of whether data can be gathered and analyzed from only a single level of the choreographic approach. It became apparent in our analysis that local-level choreography, that is, collaborative use, also needs to be investigated when considering the productivity of an activity. For instance, when people use data across devices and work with it collaboratively, the social choreography is heavily influenced by the way in which the users are allowed to make changes to the data on the fly. In addition, collaborative choreographies are closely connected to the perceptual qualities of technology. Further research is needed on developing a multiple-dancer choreography for learning about multiactor movement characteristics and for investigating the collaborative kinesthetic experience in interactive technology environments. Such a research approach also would address how it feels to perform particular kinds of movements while others are observing.
DISCUSSION
When performing a well-rehearsed choreography, in this case, based on movement extracted from a fictional representation of human–computer interaction, the enacting researcher is not responding to any new information that might appear on the imaginary screens. Hence, the
technique is inward-looking in that it focuses on the embodied experience of using one’s body to control and attend to multiple technology channels in imaginary, ubiquitous environments rather than on the human–computer interaction happening around the body. Thus, the discovery of new affordances, that is, new ways of engaging with technology, is based not on feedback from external-to-body devices but rather on the performing and experiencing body.

The choreographic approach directs attention toward personal engagement in the activities of reorganizing, performing, and writing about the choreography, thus fostering the development of a detailed understanding of the kinesthetic experience of the interaction choreographies. As a result, the approach promotes the development of researcher sensitivity, especially to the kinesthetic experience of a movement continuum in which the arrangement of trajectories between visual–tactile foci influences the body posture, alignment, and reach of movement. Considering the differences between the architectonic space represented in the video and the real physical surroundings where the choreography is performed, it may be surprising that the choreographies also facilitate an understanding of the interplay between the user and the physical contexts of an interaction. The process of enactment had to be carried out in a way that constructs the context and thus helps the enacting researcher perform the movements while taking into account the missing elements, such as the devices and the characteristics of the physical space in the original use context. The combination of watching the original human–computer interactions and dancing the choreography made it possible to assimilate the experiential and the contextual into an embodied understanding.
The approach emphasizes the importance of understanding the embodied mind for the quality of interaction design, and it seems especially well suited for the study of multichannel experiences. Investigating complex interactions through a choreographic approach is neither constrained nor enabled by knowledge or experience of particular technologies; rather, the process is driven fully by the development of the choreography, which makes visible the demands and delights that the studied design context sets for the experiencing body.

Contemporary visions of intelligent environments are dominated by visual inputs. What if we researchers and designers started to think about movement as an input to interaction design, or even as a major factor in the user experience? The development and study of interaction choreographies provides an alternative way to consider users’ preferences in choosing a channel, thereby opening up opportunities to discover how intelligent environments could be improved to provide alternative choices for a diversity of users. In the context of software design, developers typically talk in terms of workflow, conceptualized as sequences of tasks leading to a goal. The process of embodying choreographic principles to tap into the body’s knowledge requires time, effort, and creative approaches. However, such investments may be beneficial in increasing opportunities for novel design that better addresses the kinesthetic experience.

The studied videos depicted people interacting with envisioned future technologies. The research data restricted the choreographic analysis solely to the kinds of interactions that were depicted in the original videos. However, we see that the choreographic analysis can also be used methodically without such detailed examples.
For instance, researchers could start from a particular technical possibility and then investigate the mutual interplay between device networking, technological forms, adaptable spaces, and/or the user’s capacity to move application controls and data across the visual displays present. This would allow a greater role and flexibility for embodied improvisation, through which novel interaction qualities can emerge spontaneously.
The number of ad hoc networks created by various kinds of devices is increasing rapidly. This creates abundant possibilities for combining and recombining microchoreographic continua based on the concept of adaptable technology. A new era seems to have dawned, one in which users have a multitude of possible channels to “resource” (see Ylirisku, Buur, & Revsbæk, 2016a, 2016b). This change may also provide new opportunities for users to choose, develop, and implement preferred personal interaction choreographies, in other words, to engage in an idiosyncratic process of body-making and meaning-making (see Sklar, 2000, p. 74).

The technological contexts presented in the Productivity Future Vision (2009, 2011) videos cannot be accessed; they exist only as visual stories. The depicted technologies and interactions could, however, be projected into a physical setting where an enacting researcher could adapt the choreographic approach with spatially imagined displays, surfaces, and devices. Because the enactment of a choreography as research poses minimal requirements for the physical and technological environments, the ability to learn about micromovements may prove to be a useful and versatile skill for interaction designers. In line with interaction design patterns in graphical user interfaces, it is feasible to expect that choreographic design patterns for intelligent environments may become commonplace in the future.
CONCLUSIONS
This paper described a choreographic approach to interaction design and introduced a procedure to extract micromovement continua, reorganize the movement material into a practicable choreography, conduct a first-person enactment of that choreography, and document the analysis. The procedure built on the choreographic approach to interaction design (Parviainen et al., 2013) and extended it by providing an example of how micromovement analysis can be conducted in practice. The choreographic analysis was strengthened with the hermeneutics of the body approach (Hoppu, 2005). The extended method allows operationalizing first-person enactment and performing the movement continua related to the human–computer interactions under scrutiny.

The extended choreographic approach to interaction design starts with documenting the movement continua present in a human–computer interaction. The material under investigation also can be selected from established technology visions, as was done in the experiment (Poutanen, 2015) analyzed in this study. The movement continua can be video recorded in a real use context, or one can design animated situations or document improvised interactions in a mock-up setting. The key criterion in drawing on video sources is that the video depiction presents all the individual movements that are constitutive of the intended human–computer interaction. Based on the video representations of specific human–computer interactions, the movement continua are then reorganized to form a coherent choreography that can be rehearsed and enacted efficiently. The reorganization can be carried out with the help of a movement sequence table based on screenshots from the video. The screenshots should present, understandably and without interruption, the continua of the represented user’s limbs and the changes in body alignment and posture, and should differentiate between the micromovements constitutive of the interaction.
The resulting choreography and the use context are then realized by rehearsing the choreography while keeping the use context in mind. The use context should somehow be present in the movement sequence table so that the images captured from the video representation of the target human–computer interaction
serve both rehearsing and embodying information about the use context. Thorough analysis is possible when the choreography is performed with the use context in mind. Ideally, performing through dance-like movement helps the researcher organize ideas related both to the experienced movement and to the imagined, technologically augmented space. The kinesthetic experience related to the micromovement continua can be reported by writing a dance diary. Expressions of the experiences become more explicit in the course of the hermeneutic process, which, in this experiment, included rehearsing movement sequences and use contexts with the image movement sequence table and viewing the video excerpts from the original interaction data. Results can be reported as textual descriptions and/or video documentation of the practices and procedures of enacting and commenting on the experience related to the studied movement continua.

Evaluations of intelligent environments have typically concerned safety, human well-being, and the attainment of predefined goals. Choreographic analysis through the hermeneutics of the body approach sheds light on how multiple networked artifacts in ubiquitous computing settings influence the movements and embodied experiences of the user. The choreographic and hermeneutic analyses in this study were based on research data that consisted of futuristic visions of interactions within user-centered, intelligent environments. The visions were crafted thoughtfully. Nevertheless, they included interactions that did not fit well within the movement continuum that they were designed to be part of. Although videos can offer an illusion of smoothness, naturalness, and embodied qualities of interaction, the moving body provides the ultimate reference for the human experience of movement.
The extended choreographic approach to interaction design shares many features with the moving-and-making-strange approach developed by Loke and Robertson (2013). Both may include investigating and analyzing movement as well as inventing and choreographing movement, and both apply visual analysis and representation of movements. The major methodological premise of both approaches is the reenactment of movement. This characteristic connects the two approaches methodically and makes them mutually constructive; in future work, this connection should be analyzed more deeply. The choreographic approach to interaction design could benefit from adopting something similar to the three perspectives, the mover, the observer, and the machine, introduced in the moving-and-making-strange approach. This would enable the choreographic approach to mature as a tool for the design of intelligent environments.

This study focused on a microlevel analysis (Parviainen et al., 2013) that proved useful for depicting the felt qualities in sequences of micromovement interactions. A description and discussion of local-level and macrolevel analyses were not in the scope of this study. Ideally, however, microlevel, local-level, and macrolevel choreographic analyses would be applied together; a combined analysis of these interrelated levels remains to be addressed in future studies. The introduction of new contextual services to be consumed and experienced in increasingly intelligent urban spaces, transportation services, educational facilities, and homes will impose requirements on the methods and methodologies that guide the development of everyday environments in a user-centered, movement-oriented manner.
A choreographic approach to interaction design that is extended with the hermeneutics of the body method has the potential to mature into a useful tool, mindset, and method for interaction designers who design intelligent environments.
28 Studying Interactions Through Dance
IMPLICATIONS FOR THEORY AND APPLICATION
This study explored what would happen if technology-design critique were based on experiences that explicitly moved technology out of the way of the moving body. While the study is an initial exploration, it emphasizes the value of taking the kinesthetic experience (comprising the felt dimensions of movement and body, as well as the memories and expectations carried by the body) as the starting point for developing and evaluating interactions with technology. The approach is gaining relevance in the design of augmented and virtual reality environments, which provide designers with open-ended possibilities for creating new kinds of interaction choreographies. It is also useful for considering how people may combine and recombine microchoreographic continua with available adaptable and networked technologies in intelligent environments. Theoretically, the work opens up the development and study of new kinds of design patterns and choreographic patterns aimed at kinesthetic appeal.
REFERENCES
Carroll, J. M. (2000). Making use: Scenario-based design of human–computer interactions. Cambridge, MA, USA: MIT Press.

Damasio, A. (1994). Descartes’ error: Emotion, reason, and the human brain. New York, NY, USA: Putnam.

de Waal, F. B. (2008). Putting the altruism back into altruism: The evolution of empathy. Annual Review of Psychology, 59, 279–300. doi: 10.1146/annurev.psych.59.103006.093625

Dourish, P. (2001). Where the action is: The foundations of embodied interaction. Cambridge, MA, USA: MIT Press.

Foster, S. (2011). Choreographing empathy: Kinesthesia in performance. Abingdon, England: Routledge.

Gadamer, H.-G. (1975). Truth and method (G. Barden & J. Cumming, Trans.). London, England: Sheed and Ward. (Original work published 1960)

Gourlay, S. (2006). Conceptualizing knowledge creation: A critique of Nonaka’s theory. Journal of Management Studies, 43(7), 1415–1436. doi: 10.1111/j.1467-6486.2006.00637.x

Gubbi, J., Buyya, R., Marusic, S., & Palaniswami, M. (2013). Internet of things (IoT): A vision, architectural elements, and future directions. Future Generation Computer Systems, 29(7), 1645–1660. doi: 10.1016/j.future.2013.01.010

Hanna, J. L. (2015). Dancing to learn: The brain’s cognition, emotion, and movement. Lanham, MD, USA: Rowman & Littlefield.

Heidegger, M. (1975). Early Greek thinking (D. F. Krell & F. A. Capuzzi, Trans.). New York, NY, USA: Harper & Row. (Compilation of selected works published in the 1940s and 1950s)

Hoppu, P. (1999). Symbolien ja sanattomuuden tanssi. Menuetti Suomessa 1700-luvulta nykyaikaan [The dance of symbols and wordlessness: Minuet in Finland from the 18th century to the present]. Helsinki, Finland: Suomalaisen Kirjallisuuden Seura.

Hoppu, P. (2005). Hermeneutics of the body: Connecting research and dancing in dance research. In I. Björnsdóttir (Ed.), Dance heritage: Crossing academia and physicality (pp. 106–110). Reykjavik, Iceland: Nordic Forum for Dance Research.

Iqbal, R., Sturm, J., Kulyk, O., Wang, J., & Terken, J. (2005). User-centred design and evaluation of ubiquitous services. In Proceedings of the 23rd Annual International Conference on Design of Communication: Documenting and designing for pervasive information (pp. 138–145). New York, NY, USA: ACM. doi: 10.1145/1085313.1085346
29 Poutanen, Ylirisku, & Hoppu
Johanson, B., Fox, A., & Winograd, T. (2002). The interactive workspaces project: Experiences with ubiquitous computing rooms. IEEE Pervasive Computing, 1(2), 67–74. doi: 10.1109/MPRV.2002.1012339

Kaasinen, E., Kymäläinen, T., Niemelä, M., Olsson, T., Kanerva, M., & Ikonen, V. (2013). A user-centric view of intelligent environments: User expectations, user experience and user role in building intelligent environments. Computers, 2(1), 1–33. doi: 10.3390/computers2010001

Laban, R. (1963). Modern educational dance (2nd ed.). London, England: Macdonald & Evans.

Lakoff, G., & Johnson, M. (1999). Philosophy in the flesh: The embodied mind and its challenge to Western thought. New York, NY, USA: Basic Books.

Loke, L., & Robertson, T. (2013). Moving and making strange: An embodied approach to movement-based interaction design. ACM Transactions on Computer–Human Interaction, 20(1), 1–25. doi: 10.1145/2442106.2442113

Mason, P. (2009). Brain, dance and culture: Broad hypotheses of an intuitive science of dance. Brolga: An Australian Journal About Dance, 3, 27–34.

Merleau-Ponty, M. (1965). Phenomenology of perception (C. Smith, Trans.). London, England: Routledge & Kegan Paul. (Original work published 1945)

Microsoft. (2009). Productivity Future Vision. Retrieved April 23, 2017, from https://www.youtube.com/watch?v=t5X2PxtvMsU

Microsoft. (2011). Productivity Future Vision. Retrieved April 23, 2017, from https://www.youtube.com/watch?v=a6cNdhOKwi0

Nielsen, J. (1993). Usability engineering. Boston, MA, USA: Academic Press.

Nietzsche, F. (2010). Thus spoke Zarathustra (T. Common, Trans.; B. Chapko, Ed.). Paris, France: Feedbooks. (Original work published 1883)

Nonaka, I., & Takeuchi, H. (1995). The knowledge-creating company: How Japanese companies create the dynamics of innovation. New York, NY, USA: Oxford University Press.

Parviainen, J. (1998). Bodies moving and moved: A phenomenological analysis of the dancing subject and the cognitive and ethical values of dance art. Tampere, Finland: Tampere University Press.

Parviainen, J. (2003). Kinaesthetic empathy. Dialogue and Universalism, 1(11–12), 154–165.

Parviainen, J., Tuuri, K., & Pirhonen, A. (2013). Drifting down the technologization of life: Could choreography-based interaction design support us in engaging with the world and our embodied living? Challenges, 1(1), 103–115. doi: 10.3390/challe4010103

Pinelle, D., Gutwin, C., & Greenberg, S. (2003). Task analysis for groupware usability evaluation: Modeling shared-workspace tasks with the mechanics of collaboration. ACM Transactions on Computer–Human Interaction, 10(4), 281–311. doi: 10.1145/966930.966932

Polaine, A., Løvlie, L., & Reason, B. (2013). Service design: From insight to implementation. Brooklyn, NY, USA: Rosenfeld Media, LLC.

Polanyi, M. (1966). The tacit dimension. Garden City, NY, USA: Doubleday.

Poutanen, O. (2015). Embodied interaction choreographies: Kinesthetic approach to intelligent environment design. Master’s thesis, Aalto University, School of Art, Design and Architecture. Available at https://aaltodoc.aalto.fi/handle/123456789/19664

Reynolds, D., & Reason, M. (2012). Conclusion. In D. Reynolds & M. Reason (Eds.), Kinesthetic empathy in creative and cultural practices [Kindle Version 1.12.4]. Bristol, England: Intellect.

Rogers, Y. (2006). Moving on from Weiser’s vision of calm computing: Engaging UbiComp experiences. Ubiquitous Computing, Lecture Notes in Computer Science, 4206, 404–426. doi: 10.1007/11853565_24

Savioja, P. (2014). Evaluating systems usability in complex work: Development of a systemic usability concept to benefit control room design. Espoo, Finland: Valtion teknillinen tutkimuskeskus.

Sheets-Johnstone, M. (1999). The primacy of movement. Amsterdam, the Netherlands: John Benjamins Publishing Company.
Sklar, D. (2000). Reprise: On dance ethnography. Dance Research Journal, 32(1), 70–77. doi: 10.2307/1478278

Stein, E. (1989). On the problem of empathy (W. Stein, Trans.). Washington, DC, USA: ICS Publications. (Original work published 1917)

Swan, M. (2012). Sensor mania! The Internet of things, wearable computing, objective metrics, and the quantified self 2.0. Journal of Sensor and Actuator Networks, 1(3), 217–253. doi: 10.3390/jsan1030217

Thomas, H. (2003). The body, dance and cultural theory. Basingstoke, England: Palgrave Macmillan.

Thomas, H. (2013). The body and everyday life. London, England: Routledge.

Thompson, E. (2007). Mind in life: Biology, phenomenology, and the sciences of mind. Cambridge, MA, USA: Belknap Press.

Thrift, N. (2008). Non-representational theory: Space, politics, affect. New York, NY, USA: Routledge.

Varela, F. J. (1999). The specious present: A neurophenomenology of time consciousness. In J. Petitot, F. J. Varela, B. Pachoud, & J.-M. Roy (Eds.), Naturalizing phenomenology: Issues in contemporary phenomenology and cognitive science (pp. 266–329). Stanford, CA, USA: Stanford University Press.

Ylirisku, S., Buur, J., & Revsbæk, L. (2016a, June). Resourcing in co-design. Paper presented at the Design+Research+Society: Future-Focused Thinking conference, Brighton, UK.

Ylirisku, S., Buur, J., & Revsbæk, L. (2016b, November). Resourcing of experience in co-design. Paper presented at the 11th Design Thinking Research Symposium (DTRS’11), Copenhagen, Denmark.
Authors’ Note

We express our gratitude to the Energizing Urban Ecosystems Programme and WSP in Finland for funding the research. We also thank Professors Aija Staffans and Ramia Mazé for their useful comments and support in finalizing the manuscript, and Maria Sannemann, Glen Forde, Heidi Konttinen, Toni Tolin, and Max Mäkinen for their kind support in the choreography documentation process.
All correspondence should be addressed to:
Olli Poutanen
Aalto University
Hämeentie 135 C
00076 Helsinki, Finland
[email protected]
Human Technology ISSN 1795-6889 www.humantechnology.jyu.fi
ISSN: 1795-6889 www.humantechnology.jyu.fi Volume 13(1), May 2017, 32–57
BODY, SPACE, AND EMOTION: A PERCEPTUAL STUDY
Donald Glowinski
Neuroscience of Emotion and Affective Dynamics (NEAD Lab), Faculty of Psychology and Educational Sciences, and the Swiss Center for Affective Sciences (SCAS), University of Geneva, Switzerland, and Swiss Integrative Center for Human Health (SICHH), Fribourg, Switzerland

Sélim Yahia Coll
Neuroscience of Emotion and Affective Dynamics (NEAD Lab), Faculty of Psychology and Educational Sciences, and the Swiss Center for Affective Sciences (SCAS), University of Geneva, Switzerland

Naëm Baron
Neuroscience of Emotion and Affective Dynamics (NEAD Lab), Faculty of Psychology and Educational Sciences, University of Geneva, Switzerland

Maëva Sanchez
Neuroscience of Emotion and Affective Dynamics (NEAD Lab), Faculty of Psychology and Educational Sciences, University of Geneva, Switzerland

Simon Schaerlaeken
Neuroscience of Emotion and Affective Dynamics (NEAD Lab), Faculty of Psychology and Educational Sciences, and the Swiss Center for Affective Sciences (SCAS), University of Geneva, Switzerland

Didier Grandjean
Neuroscience of Emotion and Affective Dynamics (NEAD Lab), Faculty of Psychology and Educational Sciences, and the Swiss Center for Affective Sciences (SCAS), University of Geneva, Switzerland
Abstract: The present study aims at providing a systematic account of emotion perception applied to expressive full-body movement. Within the framework of the lens model, we identified the decoding process underlying one’s capacity to categorize emotions while watching others’ behaviors. We considered the application of Laban movement analysis, a method focusing on qualitative aspects of movement. An original
© 2017 Donald Glowinski, Sélim Yahia Coll, Naëm Baron, Maëva Sanchez, Simon Schaerlaeken, & Didier Grandjean, and the Open Science Centre, University of Jyväskylä DOI: http://dx.doi.org/10.17011/ht/urn.201705272517
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
experimental setup used a contemporary choreography interpreted by four expert dancers in an environment that restricted their movement to their peripersonal space. Each performance consisted of a subtle or intense emotional interpretation of the choreography (e.g., happiness, anger, surprise, fear, and sadness). Results showed that emotions being expressed in this confined environment could still be identified, categorized, and associated with a profile of movement qualities and specific body parts.
Keywords: nonverbal expressive movement, emotion, peripersonal space, lens model, Laban movement analysis, dance, choreography.
INTRODUCTION: BODY, SPACE, AND EMOTIONS
The body is a tool of unparalleled power for expressing emotion beyond the use of verbal utterances, and this capacity has been the focus of numerous research studies (e.g., de Gelder, 2013; Glowinski et al., 2011; Wallbott, 1998). However, the relationship between space and the body’s expression of emotion has been far less studied. This is due in part to prior difficulties in exploring and recording human interaction or body expression with a high degree of precision and flexibility. Most performances recorded for scientific experiments take place within a laboratory setting to allow absolute control over lighting conditions that otherwise might affect the robustness of the tracking. In fact, far fewer studies engage the natural setting of the performance (e.g., open-air performance and theatre). The role of space, and specifically the impact of a small space, on the progression of movement, particularly on the full-body expression of an individual’s emotion, remains to be clarified. A better insight into such relationships is valuable for a wide range of domains, including psychological research on emotional and aesthetic expression and the field of human–computer interaction (Pantic, Pentland, Nijholt, & Huang, 2007).

In a context that combines psychological, aesthetic, and human–computer interaction research, studies focusing on emotion and space often refer to boundary conditions, extreme situations (e.g., flight space), or phobias (e.g., claustrophobia; see Palinkas, 2001). To address the space–emotion relationship, we present a case study of a dance choreography that was performed by professional dancers conveying various expressive emotional intents. This experimental method drew upon previous work by Camurri, Lagerlöf, and Volpe (2003). What characterizes these performances, and distinguishes the novelty of our contribution, is that the space made available to each dancer corresponded to their peripersonal space.
This peripersonal space has been approximated by the kinesphere in dance theory as the environmental sphere surrounding the body whose periphery can be easily reached by extending a limb (Laban & Lawrence, 1947; Sutil, 2013). We are interested in understanding how external observers can discriminate the expressed emotions based on the expressive behavior of the dancer within the boundaries of his/her peripersonal space. Our contribution integrated the Laban movement analysis (LMA) categories, used to describe qualitative movement, to fit the decoding process underlying such observers’ emotional categorizations. We also were interested in evaluating other factors’ impact on the emotion recognition process: (a) the observers’ expertise (e.g., does being a trained dancer augment sensitivity to emotion expression?), (b) the performance expressive intensity (e.g., are emotions better recognized when expressed in an emphatic manner?), and (c) the potential role of body parts
in emotion recognition (e.g., are there specific relationships between expressed emotions and body parts?). This paper is organized in the following manner. We first review the existing approaches in the literature that attempt to recognize emotions through body postures and movements with a specific focus on dance as a test case. Next, we present the developed experimental framework, which includes the steps dedicated to the recording and presentation of the stimuli and details related to the statistical analysis methods applied to the participants’ answers. This is followed by the presentation of our results: how emotions are recognized and how they can be described in terms of the LMA categories. In the Discussion section, we examine the impact of the participants’ expertise, the intensity of the stimuli, and the relationship between specific emotions and identified body parts. Finally, we conclude this paper by addressing future research directions.
Body as a Source of Emotional Expressivity
The body is a key source of information for emotion recognition. An increasing trend in research relates to facial and vocal expression, gesture, and dynamic body motion recognition (e.g., Glowinski et al., 2011). The recent development of low-cost digital image recording equipment, together with the advent of professional motion-capture technologies, has enabled a close analysis of nonverbal modalities in human communication of emotions and, in particular, bodily behavior (Wallbott, 1998). Recently, affective computing, along with the wide range of related application areas, has led the way in meeting an increasing demand for the creation of natural, intelligent, adaptive, and personalized multimodal environments (Vinciarelli et al., 2012).

Until the turn of the 21st century, various coding systems were proposed by psychologists. The main focus has been on emotional facial expression, due in part to the pioneering work of Ekman, who offered a systematic account for facilitating explicit coding and categorization (the Facial Action Coding System, FACS; Ekman & Friesen, 1978). A realm of new standards is emerging in this domain, opening opportunities for commercial applications of automatic emotion recognition. Furthermore, alternative approaches have been developed in recent decades. Research results in psychology suggest, in particular, that body movements do constitute a significant source of affective information (Wallbott, 1998). For example, body gesture, as a complement to facial expressions, can help disambiguate emotional information (de Gelder, 2006; Todorov, Baron, & Oosterhof, 2008). Yet, the vast number and combination of body postures and gestures offer a degree of freedom for expression that can be difficult to manage during analysis. No standard body-movement coding scheme equivalent to the FACS for facial expressions exists to facilitate decomposing bodily expression into elementary components.
Various systems have been suggested by psychologists (e.g., Bobick, 1997; Dael, Goudbeek, & Scherer, 2013), but none has reached the consensus achieved by Ekman’s system (Ekman & Friesen, 1978) for facial expression analysis. Alternative approaches have been developed to fill this need. For instance, research on upper-body expressivity took advantage of the clear conceptualization of sign language (e.g., Gunes & Piccardi, 2009). The world’s many sign languages, now being extensively documented, have become a resource for emotion recognition. In sign language, signs made with the hands work in complex coordination with signs made with the face, head movements, torso shifts, gaze, gestures, and mimetic moves. As upper-body movements also correspond to what can be
captured easily through a webcam in the typical working environment (i.e., where people sit in front of their desktop computer), the recognition of such movements has stimulated a realm of applications in this domain. Yet, with the advent of wearable computing, that is, devices worn on the body that afford digital interaction, particular attention is now being devoted to the expressivity of specific limbs (e.g., arms, fingers; see Velloso, Bulling, & Gellersen, 2013). In this context, the lack of systematic coding has been successfully compensated for through a promising alternative approach developed by Caramiaux, Montecchio, Tanaka, and Bevilacqua (2014), who investigated and demonstrated how the variability of body behavior itself can stand as a central cue for capturing expressivity. However, other than the seminal work by Pollick, Paterson, Bruderlin, and Sanford (2001) on the motion of knocking, very few experiments have investigated emotional communication through specific limb variations.

The key issue in the recognition of affective bodily expression is to consider the level of information that determines the quality of movement. The view that the body as a whole represents a complete source of affective information is now receiving increased attention in the scientific literature. The recent interest in full-body emotion expressivity can be related to the diffusion of advanced motion-capture systems (e.g., Vicon) and especially to the wider dissemination of RGB-depth cameras (e.g., Kinect) that allow for affordable and relatively fine-tuned tracking of an individual’s body movement. Existing studies on full-body movement have used coarse-grained posture features (e.g., leaning forward or slumping back) or low-level physical features of movements (i.e., kinematics; see, for example, Bianchi-Berthouze, Cairns, Cox, Jennett, & Kim, 2006).
Other approaches have exploited the dynamics of gestures, referring to psychological studies reporting that temporal dynamics play an important role in interpreting emotional displays (e.g., Kapur, Kapur, Virji-Babul, Tzanetakis, & Driessen, 2005).
Towards a Unified Approach
One may note the disparity among the different approaches and, as pointed out earlier, the lack of a broader and systematic view to address emotion recognition based on full-body expressivity. A few attempts have drawn inspiration from dance notation and theories (Camurri et al., 2003; Laban & Ullmann, 1971). The key issue is to consider the level of information that determines the quality of movement, that is, the general characteristics of the way a movement is performed (e.g., the effort dimensions of LMA described below). This level of information may lie between low-level features (e.g., position of, and derivatives in, a limb’s trajectory, which report a mere displacement and inform about a specific gesture or motion activity) and a high level of information related to the emotional categories that people may use to infer emotional attitudes through observation (Glowinski et al., 2011). In this context, Laban’s conceptualization (Laban & Lawrence, 1947) has proved useful in modeling what could be an intermediate level of information representing the qualitative properties of movement in which expressivity is embedded and conveyed. Emotion recognition may ultimately rely on the specific combination of these qualitative properties of movement (Glowinski et al., 2011). Since the pioneering work led by Zhang et al. (2006), an increasing number of computational implementations have been suggested, but much more needs to be done on the perceptual side. Thus, our study aims to tackle this aspect by focusing on the perception of emotions
expressed through full-body movement in a restricted environment (i.e., the performer’s peripersonal space). Our specific goal is to provide a systematic account of the decoding process underlying such bodily emotion recognition and to observe the respective impacts of the observer’s expertise, the intensity of the performance’s expressivity, and the role of body parts in this process. In particular, we are interested in understanding whether an intermediate level of emotional information, related to the perception of movement qualities based on the LMA conceptual framework (Laban & Ullmann, 1971), could help tie together the low and high levels of emotional information. We expect this study will shed light on which level of information may be decisive in researchers’ understanding of the perceptual processes underlying emotion classification.

The research approach to the expression of emotion employed in this study relies on the lens model initially developed by Brunswik (1956), adapted to the performing arts by Juslin and Laukka (2003), and recently reviewed by Glowinski et al. (2014; see Figure 1). According to this model, the analysis of emotion expression must take into account both the sender’s (e.g., the performer’s) and the receiver’s (e.g., the observer’s) perspectives. Two types of processes are thus considered: emotion expression/communication through body behavior (e.g., the proximal cues exhibited by a performer) and emotion recognition (e.g., the observer’s inferences based on the behavioral cues of that performer).
Figure 1. An illustration of the revisited lens model (Brunswik, 1956) integrating the effort dimensions (i.e., factors and elements) of Laban movement analysis (Laban & Lawrence, 1947). This conceptual framework includes both the point of view of the sender (e.g., the dancer) and of the receiver (e.g., spectator). The respective encoding and decoding processes of emotion expression through body components can be analyzed in a systematic way. In particular, the attribution of a specific emotion category during the recognition phase of the movement can be related to the way a spectator combines bodily, effort-based, features of the dancer’s performance.
Perceptual Evaluation in Terms of Movement Qualities
An analysis in terms of movement qualities helps identify the differences between the perspective of the person expressing the movement (i.e., the sender) and that of the person observing the same movement (i.e., the receiver). According to Laban (as cited by Sutil, 2013), movement can be experienced intuitively as a continuum, an indivisible flux of changes. Alternatively, movement can be rationalized as a series of snapshots that can be ordered, structured, and formalized as the building blocks of a representation (Glowinski, Camurri, Chiorri, Mazzarino, & Volpe, 2007). This conceptual framework facilitates investigating and understanding the implicit or explicit strategies an observer applies in manipulating the collected snapshots of movement to build up a complete, sensitive, and yet subjective representation of the (performer’s) sequence.

Grounded in a profoundly dualistic approach, Laban’s key movement concepts come in pairs of opposites (Laban & Lawrence, 1947). The theory of efforts developed by Laban aims at characterizing such dynamism in relation to four basic properties (effort factors): weight, space, time, and flow (Laban & Lawrence, 1947). Each of these factors is in turn divided into opposed subcategories known as effort elements (heavy–light, direct–indirect, quick–slow, and free–bound). These effort elements allow researchers to understand the fundamental qualitative differences in human movement. As stated by Sutil (2013, p. 5), “The difference between punching someone in anger and reaching for a glass is slight in terms of body organization—both rely on extension of the arm and the same spatial direction of the movement. The weight of the movement and the intensity of the movement are very different, though.”

In this study, the Laban elements (Laban & Lawrence, 1947) are operationalized via measurable descriptions, as follows:

1. The weight element considers the individual’s movements in relation to gravity and may describe their vigorousness. The two associated subcategories are heavy (i.e., powerful) and light (i.e., delicate).

2. The space element here considers the individual’s movements in relation to his/her peripersonal space. The two associated subcategories are direct and indirect. Direct motion proceeds along a mostly straight line without deviation, whereas indirect motion is interrupted and roundabout.

3. The time element is a measure of movement activity speed. The two associated subcategories are quick (i.e., sudden and urgent) and slow (i.e., sustained, continuous, and time stretching).

4. The flow element characterizes the continuity of the movement. The two associated subcategories are free (i.e., a fluid and released movement) and bound (i.e., a controlled or contained movement).
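The four effort factors and their paired elements form a small, fixed taxonomy, so any rated movement sequence can be summarized as one choice from each opposed pair. As a minimal illustrative sketch (not from the article; all identifiers and the example profile are our own assumptions), this taxonomy could be encoded as:

```python
from dataclasses import dataclass
from enum import Enum

# The four Laban effort factors, each with its two opposed effort elements
# as operationalized above. Naming is ours, for illustration only.

class Weight(Enum):
    HEAVY = "heavy"   # powerful
    LIGHT = "light"   # delicate

class Space(Enum):
    DIRECT = "direct"       # mostly straight line, no deviation
    INDIRECT = "indirect"   # interrupted, roundabout

class Time(Enum):
    QUICK = "quick"   # sudden, urgent
    SLOW = "slow"     # sustained, time stretching

class Flow(Enum):
    FREE = "free"     # fluid, released
    BOUND = "bound"   # controlled, contained

@dataclass(frozen=True)
class EffortProfile:
    """One movement sequence described by one element per effort factor."""
    weight: Weight
    space: Space
    time: Time
    flow: Flow

# A plausible (hypothetical) coding of Sutil's punching example:
punch = EffortProfile(Weight.HEAVY, Space.DIRECT, Time.QUICK, Flow.BOUND)
```

Such a profile is one way to make explicit that two movements with similar body organization (punching vs. reaching) can differ sharply in their effort elements.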
Emotion Categories, Peripersonal Space, and Laban’s Effort Dimensions
Based on the qualitative approach to movement expressivity, recent computational studies have investigated how emotion could be rendered by integrating the effort dimensions (i.e., factors and elements) of Laban movement analysis (Laban & Lawrence, 1947). Initially, the objective in this domain was not only to access perceptual processes but also to create complementary strategies to organize and classify large databases on motion that included more subtle aspects
of expressions relating to emotion. A first attempt by Wakayama, Okajima, Takano, and Okada (2010), followed by Okajima, Wakayama, and Okada (2012), showed that motion retrieval can benefit from the use of a subset of LMA dimensions, especially when searching for body-movement data in large research databases (Kapadia, Chiang, Thomas, Badler, & Kider, 2013). Recently, Aristidou, Charalambous, and Chrysanthou (2015) examined the similarities among various emotional states, classified according to the arousal and valence of Russell’s (1980) circumplex model, and a subset of features that encode stylistic characteristics of motion based on the LMA. Overall, previous experimental results, based on video processing or the trajectory dynamics of body limbs, show that these features can be extracted using the LMA dimensions and thus allow researchers to encode body posture differences depicting emotion expression.

The pertinence of Laban’s dimensions (Laban & Ullmann, 1971) as descriptors for motion expressivity is attested further by their use in avatar animation. Since the seminal work of Chi, Costa, Zhao, and Badler (2000), various studies have demonstrated that LMA-derived dimensions can be exploited efficiently in motion parameterization and expression (Zhao & Badler, 2005). Unfortunately, few studies have considered the application of Laban-based analysis in understanding perceptual processes during emotion recognition. Levy and Duke (2003) used LMA dimensions to score the capacity of nonprofessional dancers to improvise sequences of expressive movement. Correlation analyses in emotion recognition revealed relationships among the emotional states of depression and anxiety and certain movement qualities. Focusing on the specific case of walking, Crane and Gross (2013) explicitly instructed participants to use LMA-based analysis to classify the observed nuances of emotions.
Sadness, anger, contempt, and joy were decoded with accuracy ranging from 74% to 32%, respectively; for most of the targeted emotions, decoding accuracy was related to the four effort factors (Serino, Annella, & Avenanti, 2009).
Peripersonal Space, Kinesphere, and Emotions
Peripersonal space has been defined in contrast to general space, in a way similar to how space is defined in geometry or topology (Serino et al., 2009). In dance theory, this peripersonal space has been further approximated by a kinesphere, which refers to the space occupied by an outreaching body without it moving from one spatial location (Laban, 1966). Specifically, it can be represented as the area within reach of the body’s extended limbs which, when projected in all directions from the body’s center, can be conceived as a totality of movement or a sphere of movement (Sutil, 2013). The kinesphere therefore offers an operationalized way to analyze peripersonal space (see Figure 2): how each individual might explore the surrounding environment to express emotion and, in turn, how such peripersonal space may reciprocally impact emotional expression.
Dance as a Test Case
Dance appears as an ideal test case to study emotion in relation to movement expressivity. It condenses within a minimal amount of space and time the human capacity to express emotional information. Aesthetically, it is possible to create a minimally expressive dance or a dance that
Figure 2. Illustration of the peripersonal space as operationalized by the concept of kinesphere (Laban, 1966). The figure on the right represents the motion capture of the dancer’s movement (on the left) that was used in the perceptual experiment presented in this paper. unfolds slowly. Typically, dance provides no functional or utilitarian use of the space (i.e., related to a practical task achievement, such as opening a door or typing on a keyboard), but rather to express oneself. In contemporary Western dance art, however, the aesthetics have turned into quite everyday activities; therefore objects and space may be interrelated in very functional ways. Dance as a test case allows the experimenter to observe movement of an undetermined wide range of complexity that can vary according to the level of the dancer’s expertise (e.g., from beginner to accomplished dancer) as well as according to the choreography. Specific to our research interest, dance also can reveal exceptional and evolved ways of exploiting the surrounding space. By focusing on the expression of emotion, through movement in a limited space or constrained environment, we sought to observe how a dancer faces this challenge: Specifically, we were interested in how the dancer could transform a source of what could appear to most people as a stress or difficulty into a positive experience. Therefore, we experimentally defined a limiting condition where the performer was bounded by a choreography that limited the variety of possible movements within a further, restricted environment that constrained the dancer’s displacement within his/her personal space. As a consequence, we expected that this challenging situation could impact the performer’s creative capacity in expressing emotion through the only dance component that remained freely available: body expressivity. We explored this issue by assessing how external observers perceived such emotional body expression.
Laban Movement Analysis and Dance
The LMA is widely used in dance, either to annotate and generate choreography or to train dancers. With the advent of new systems for motion capture, dancers have shown an increased interest in using these new forms of interaction to map their expressive body movements in real time to audio or visual feedback. The LMA dimensions have resurfaced as a source of inspiration for capturing key expressive variations in dance performance and for improving “naturalness” during interaction with automatic systems (Mancas, Glowinski, Volpe, Coletta, & Camurri, 2010). Drawing upon Laban’s approach, Camurri et al. (2003),
39 Glowinski, Coll, Baron, Sanchez, Schaerlaeken, & Grandjean
and Van Dyck, Vansteenkiste, Lenoir, Lesaffre, and Leman (2014) developed a qualitative approach to human full-body movement for emotion recognition. In those studies, LMA dimensions were approximated through a combination of low-level physical features to allow a coarse description of the process of encoding emotion through body behaviors (e.g., emotional arousal revealed by acceleration peaks). Based on a more explicit computational modeling of Effort-Shape features, Alaoui, Caramiaux, Serrano, and Bevilacqua (2012) developed an interactive augmented dance performance that extracts movement qualities (energy, kick, jump/drop, verticality/height, and stillness) to generate a visual simulation. However, these previous works overlooked the insights that can be gained through perceptual studies based on the LMA. When drawing on dance examples, it is useful to consider the conceptual distinctions embodied in this art form. Dance can be considered a specific case of stylized body movement in which the entirety of the movement encodes a particular emotion. Stylized motions usually originate in laboratory settings, where subjects are asked to act out an emotion freely, without any constraints. Another key distinction concerns the propositional and nonpropositional aspects of movement (Boone & Cunningham, 1998). Raising one’s hand to indicate stop, for example, may be considered a propositional movement; such movements constitute established signs that transmit shared meaning. As stated in Camurri et al. (2003), emotions can be expressed through propositional movements (e.g., a clenched fist to show anger or raised arms to demonstrate joy), whereas nonpropositional movements are embodied not in discrete, easily segmented motions but rather in a subtle combination of movement qualities (e.g., lightness or heaviness).
From the point of view of a perceiver, this distinction can be interpreted as a shift in attentional focus, either toward the configurational aspects (e.g., gesture as an explicit, well-delimited code) or toward the dynamics themselves (i.e., how one expresses an emotional intent). In this study, we focused on the nonpropositional style of movement as represented by dance sequences. Specifically, we considered how external observers combine body-movement qualities, described through the LMA, to recognize the emotional intent of contemporary dancers during their performance, and we assessed the significance of the time, weight, space, and flow factors in observers’ emotion recognition.
METHODS
Participants
Dancers
Two female (D1 and D2) and two male (D3 and D4) professional contemporary dancers were recruited for this study. Their average age was 28 years, and they were remunerated 100 CHF for their participation. All participants provided signed informed consent prior to the study. The protocol was approved by the Faculty of Psychology and Educational Sciences of the University of Geneva, Switzerland. For the recordings, the dancers were asked to wear tight clothing, to tie up their hair, and to remove all jewelry and accessories.
Observers
Forty-eight observers, recruited through social networks, participated in the experiment (17 males; age M = 27.77 years, SD = 10.77). Among them, 38 were taking or had taken dance classes and 10 had never practiced dance. We followed the standard of the Geneva dance education system to categorize our research participants: The experts in our study (n = 22) were dancers with 8 or more years of practice; those with 1 to 7 years of dance experience were considered novices (n = 16); and individuals with a year or less of dance practice were grouped into the nonexpert condition (n = 10). All observers were competent in French, the language used to collect the data.
Stimuli Recording
Motion-capture (MoCap) optical reflectors (markers) were positioned on the dancers’ bodies to record their movement (see Figure 3). Each dancer had an equal number of reflectors attached to the left and right halves of the body; however, the reflectors were not positioned symmetrically, which allowed the camera system to distinguish them more easily and facilitated offline postprocessing of the data. Eight Vicon Bonita 3 cameras recorded the dancers’ movements, sampling the data at 120 frames per second. This setup allowed us to record in three dimensions the positions of the 30 markers placed on each dancer’s body. Using a body model, we estimated the state (orientation and, where applicable, position) of each body part.

Figure 3. Disposition of the motion capture optical reflectors on the dancers’ bodies.

Katrin Blantar, a professional choreographer, designed specific sequences of stylistic movements that excluded stereotyped postures that could be perceived as expressions of a basic emotion (e.g., upward movements expressing anger). This 30-second microdance (see Glowinski et al., 2007) was performed in a controlled environment of 3 x 3 m that strictly enclosed the dancer’s kinesphere. The choreography was first learned by the dancers in a formal way, without emotional engagement. Then, during the experiment, each dancer explicitly addressed a scenario for each emotion, meaning that they reviewed the proposed definitions (see Table 1) before integrating them into their expression of the choreography. Each dancer performed each emotional expression twice, at two levels of intensity: first in a subtle way (i.e., the low-intensity condition) and then in a more emphasized and demonstrative manner (i.e., the high-intensity condition). This distinction was based upon a paradigm used in music to distinguish between levels of expressivity (Davidson, 1993). Dancers were asked not to modify the choreography, but no instructions were given concerning how emotions should be expressed while dancing it. By the completion of the recording process, each dancer had performed the choreography in six versions: a neutral version containing no emotional expressivity (danced twice) and versions expressing each of the five emotions (i.e., happiness, anger, surprise, fear, and sadness) at both low and high intensity, for a total of 12 recorded performances per dancer and, overall, about 5 hours of recordings.
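Low-level kinematic features, such as instantaneous marker speed, which is often used as a coarse proxy for Laban’s time factor, can be derived directly from position data sampled at 120 frames per second. A minimal sketch; the trajectory is invented for illustration and this is not the authors’ actual processing pipeline:

```python
import numpy as np

FPS = 120  # sampling rate of the motion-capture system

def marker_speed(positions: np.ndarray, fps: int = FPS) -> np.ndarray:
    """Instantaneous speed (m/s) of one marker, computed from the
    frame-to-frame displacement of successive 3-D positions."""
    return np.linalg.norm(np.diff(positions, axis=0), axis=1) * fps

# Hypothetical trajectory: a marker moving 1 cm per frame along x,
# which corresponds to 1.2 m/s at 120 frames per second.
traj = np.array([[0.01 * t, 0.0, 1.0] for t in range(5)])
print(marker_speed(traj))
```

Peaks in such speed (or acceleration) profiles are the kind of low-level physical feature that earlier LMA-inspired systems used to approximate movement qualities.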
Stimuli Preparation
The Vicon Nexus software program1 was used to reconstruct and label the data. At the end of the processing, the reflectors were linked to each other to simulate the head, chest, pelvis, arms, and legs (see Figure 4). To create the video stimuli (i.e., the various choreographies performed in abstract representations), we displayed the linked markers through the Vicon Nexus view, using Camtasia software2 (see Figure 5).
Table 1. Definitions of the Five Emotions Used as the Basis for Expressive Choreography and Observer Perceptions (based on Bänziger & Scherer, 2007).
Anger: Feel violent discontent caused by an action deemed stupid or malicious.
Happiness: Feel transported by a wonderful event happening in a more or less unpredictable way.
Fear: Feel threatened by an imminent danger that could affect one’s survival or physical or mental integrity.
Surprise: Feel confronted, often abruptly, with an unexpected or unusual event (without a positive or negative connotation).
Sadness: Feel depressed and discouraged by the loss of a relative, an object, or a familiar environment.
Note. The definitions were used by the dancers in creating the emotional expressiveness of the choreography versions for the stimuli preparation; the observers then employed the definitions during the emotion recognition task.
Figure 4. A silhouette of a dancer at the end of the Vicon Nexus processing, created through connecting the data collected via optical reflectors on a dancer’s body.
Figure 5. Screenshot examples of movement sequences of the choreography expressing happiness, as performed by dancer D1.
Questionnaires
The Qualtrics software program3 was used to create the online questionnaires. Four questionnaires, each containing videos of two dancers, a man and a woman, were created. They were written in French and checked by all authors of the study; all English translations provided
for this paper were made by the authors. The first questionnaire displayed the performances of dancers D1 and D3; the second, D1 and D4; the third, D2 and D3; and the fourth, D2 and D4. Two videos of D1 expressing sadness were lost due to a technical problem; thus, the first and second questionnaires presented 22 videos, whereas the other two contained 24 (i.e., all 12 videos of each of the two selected dancers). All videos were presented in a 428 x 1027 format and, in each questionnaire, in random order. Each observer completed only one of the four questionnaires in order to reduce the duration of the experiment. The questionnaire was administered in a lab. Observers were first asked to provide information regarding their age, gender, and diploma or study degree. Questions followed concerning their dance training (i.e., “Have you already followed dance classes?” “Which kind of dance did you practice?” “How many hours per month?” “How many years of practice?”) and their familiarity with contemporary dance (i.e., “How frequently do you watch contemporary dance performances?”). Before the videos were presented, the observers were asked to read carefully the definitions of the five emotions they would use in the questionnaire to judge the performances (see Table 1). The neutral condition was simply presented as a performance without clear emotional engagement. Depending on the questionnaire, 22 or 24 videos were then displayed to each participant. For each video, the participants did not identify the perceived emotions directly but rather followed Dael’s method (Dael et al., 2013), rating the perceived intensity of each emotional expression from 0 (low intensity) to 100 (high intensity). This 100-point scale enabled a fine-grained evaluation of participants’ responses and a better distinction of how they recognized emotion.
Participants were then instructed, via a forced-choice question, to indicate which body part had most captured their attention during the performance, choosing among the head, shoulders, arms, pelvis, and legs. Then four scales, each ranging from 0 to 100, measured the elements of Laban’s theory of effort (Laban & Lawrence, 1947): weight (0 = heavy, 100 = light), space (0 = direct, 100 = indirect), time (0 = quick, 100 = slow), and flow (0 = free, 100 = bound). The observers were asked to rate these elements intuitively, without overthinking. Participants completed the questionnaire in 40 minutes, on average. Figure 6 provides an overview of the experimental protocol adopted in this study.
Statistical Analyses
To analyze the data regarding the participants’ emotion recognition in the performances, their evaluations of the Laban dimensions across emotions, their responses regarding emotions and their intensity, and the body parts capturing their attention, we used the generalized linear mixed model (GLMM) statistical method. GLMMs combine the properties of linear mixed models, which incorporate random effects, and generalized linear models, which handle nonnormal data by allowing researchers to specify different distributions, such as Poisson or binomial (Bolker et al., 2009). By using GLMMs, we also could control for the random effect of interindividual variability. To investigate the contribution of each variable and its interactions, we compared different models, from the simplest (i.e., with one unique variable) to the most complex (i.e., all combinations of variables). Statistical differences were evaluated through chi-square difference tests. Our fixed effects comprised the specific emotion expressed by the dancers in individual videos (i.e., happiness vs. surprise vs. sadness vs. anger vs. fear vs. neutral), the intensity of the dances (low vs. high), and the dance expertise of the observers (nonexpert vs. novice vs. expert). Our random effect consisted of the interindividual differences in ratings between the observers. Our dependent variables were the observers’ responses on the emotional scales (happiness, surprise, sadness, anger, fear, and neutral), which we transformed into binomial variables: 1 for the emotion with the highest rating in a trial and 0 for all other emotions. Four other dependent variables were the Laban factors (flow, space, time, and weight), which were continuous variables ranging from 0 to 100. Finally, five binomial variables concerned the body part (head, shoulders, arms, pelvis, or legs) that most captured participants’ attention during each performance.

Figure 6. An overview of the experimental protocol adopted in this study, comprising (a) the microdance sequence of emotionally expressive choreography by professional dancers using motion-capture technology to create the stimuli, and observers’ perception regarding (b) the rating of the perceived emotions, (c) the body part of the dancer’s representation that drew the participant’s attention during the dance, and (d) the questions regarding the performance qualities and expressivity (LMA elements).
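The binomial recoding of the emotional scales described above can be sketched as follows. This is a simplified illustration, not the authors’ actual analysis code, and the article does not specify how ties between equally rated emotions were handled:

```python
def to_binomial(ratings: dict) -> dict:
    """Recode one trial's 0-100 emotion ratings: 1 for the emotion with
    the highest rating, 0 for every other emotion. If two emotions tie
    at the maximum, both receive 1 here (tie handling is unspecified
    in the article)."""
    top = max(ratings.values())
    return {emotion: int(score == top) for emotion, score in ratings.items()}

# Hypothetical trial: one observer's ratings of a dance on the six scales.
trial = {"happiness": 72, "surprise": 40, "sadness": 5,
         "anger": 10, "fear": 8, "neutral": 15}
print(to_binomial(trial))
```

The resulting 0/1 variables are what the binomial GLMMs then model as a function of the expressed emotion, intensity, and observer expertise.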
RESULTS
Emotion Recognition, Influence of Expertise, and Relations Between Emotion and Laban’s Dimensions
The dancers’ intended emotion showed a significant main effect on all emotional scales, as displayed in Figure 7: happiness, χ2(5, N = 48) = 92.15, p < .001; sadness, χ2(5, N = 48) = 192.81, p < .001; anger, χ2(5, N = 48) = 52.02, p < .001; fear, χ2(5, N = 48) = 82.47, p < .001; surprise, χ2(5, N = 48) = 28.36, p < .001; and neutral, χ2(5, N = 48) = 30.65, p < .001.

Figure 7. Histogram representing on the horizontal axis the emotions expressed by the dancers and on the vertical axis the emotions rated by the observers (N = 48). Vertical bars denote a 95% confidence interval.

As shown by contrast analysis, observers perceived significantly more happy, sad, angry, and neutral expressions when the dancers expressed happiness, sadness, anger, and neutrality, respectively, than the other emotions: happiness, χ2(1, N = 48) = 69.08, p < .001; sadness, χ2(1, N = 48) = 31.81, p < .001; anger, χ2(1, N = 48) = 62.19, p < .001; and neutral, χ2(1, N = 48) = 25.23, p < .001. The data did not show any difference in emotion recognition based on level of expertise; no significant interaction between emotion and expertise was observed on the emotional scales: happiness, χ2(10, N = 48) = 4.52, p = .92; sadness, χ2(10, N = 48) = 11.98, p = .29; anger, χ2(10, N = 48) = 4.40, p = .93; fear, χ2(10, N = 48) = 10.91, p = .36; surprise, χ2(10, N = 48) = 16.96, p = .75; and neutral, χ2(10, N = 48) = 5.14, p = .88. The data showed that fear was recognized differently according to the level of intensity (low vs. high), χ2(4, N = 48) = 13.13, p < .05 (see Figure 8). As shown by a simple-effect analysis, expressed fear was perceived as fear significantly more often when it was expressed with low rather than high intensity, χ2(1, N = 48) = 20.09, p < .001. No other significant interaction was found for the other emotions: happiness, χ2(4, N = 48) = 7.36, p = .12; sadness, χ2(4, N = 48) = 3.97, p = .41; anger, χ2(4, N = 48) = 5.69, p = .22; and surprise, χ2(4, N = 48) = 8.77, p = .07. Emotion showed a significant main effect on all of Laban’s factors: time, χ2(5, N = 48) = 220.32, p < .001; weight, χ2(5, N = 48) = 62.50, p < .001; space, χ2(5, N = 48) = 107.93, p < .001; and flow, χ2(5, N = 48) = 51.51, p < .001. These results are displayed in Figure 9.
Figure 8. Plot of the interaction between the emotion expressed by the dancers and the observers’ perceived intensity of the dances in the fear scale. Vertical bars denote a 95% confidence interval.
Figure 9. Histogram representing differences in Laban’s criteria evaluation (speed, weight, extension and flow) between each expressed emotion (anger, happiness, neutral, fear, surprise and sadness; N = 48). Vertical bars denote a 95% confidence interval.
Time Factor
As shown by the comparisons, anger was rated as significantly quicker, representing sudden and urgent movement, χ2(1, N = 48) = 99.65, p < .001, than the other emotions. Moreover, fear and surprise were rated as significantly quicker, χ2(1, N = 48) = 127.29, p < .001, than the happy, neutral, and sad expressions. Neutral and sad expressions were rated significantly
slower than happiness, χ2(1, N = 48) = 24.92, p < .001. However, no significant difference was observed between neutral and sad expressions, χ2(1, N = 48) = 1.03, p = .31, or between fearful and surprise expressions, χ2(1, N = 48) = 0.86, p = .36.
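As a sanity check, the p values of the df = 1 contrasts reported in these sections can be recovered from the chi-square statistics alone: For one degree of freedom, the chi-square survival function reduces to the complementary error function. A stdlib-only sketch:

```python
import math

def chi2_sf_df1(stat: float) -> float:
    """p value for a chi-square statistic with 1 degree of freedom:
    P(X > stat) = erfc(sqrt(stat / 2))."""
    return math.erfc(math.sqrt(stat / 2.0))

# Reproduces the neutral-vs.-sad time contrast above: chi2(1) = 1.03, p = .31
print(round(chi2_sf_df1(1.03), 2))  # → 0.31
```

The same function confirms, for instance, that the anger contrast with χ2(1) = 99.65 yields a p value far below .001.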
Weight Factor
Happiness was rated as significantly lighter than the other emotions, χ2(1, N = 48) = 4.56, p < .05. Conversely, sadness was rated as significantly more anchored and heavier than the other emotions, χ2(1, N = 48) = 19.50, p < .001. No significant difference was observed between the anger and neutral, χ2(1, N = 48) = 0.32, p = .57, or anger and surprise, χ2(1, N = 48) = 0.59, p = .44, expressions. However, anger was recognized as significantly more anchored than fear, χ2(1, N = 48) = 5.66, p < .05. No significant difference was obtained between the expressions of neutral and fear, χ2(1, N = 48) = 1.99, p = .16, neutral and surprise, χ2(1, N = 48) = 0.002, p = .96, or fear and surprise, χ2(1, N = 48) = 2.71, p = .10.
Space Factor
Sadness was rated as significantly more indirect than the other emotions, χ2(1, N = 48) = 70.25, p < .001. Anger and happiness were rated as significantly more direct than neutral, fear, or surprise expressions, χ2(1, N = 48) = 19.99, p < .001. Between these two groups, happiness was rated as significantly more direct than anger, χ2(1, N = 48) = 8.58, p < .01, and neutral was considered significantly more indirect than the fear or surprise expressions, χ2(1, N = 48) = 7.45, p < .01. However, no significant difference was observed between fear and surprise, χ2(1, N = 48) = 0.99, p = .32.
Flow Factor
Happiness was rated as significantly more fluid, that is, showing free, relaxed, and uncontrolled movement, than the other emotions, χ2(1, N = 48) = 36, p < .001. By contrast, surprise and sadness were the two emotions recognized as the most bound, χ2(1, N = 48) = 26.22, p < .001; no significant difference was observed between them, χ2(1, N = 48) = 0.74, p = .39. Moreover, anger was rated no differently than the neutral or fear expressions, χ2(1, N = 48) = 0.34, p = .56 and χ2(1, N = 48) = 0.78, p = .38, respectively, which also did not differ significantly from each other, χ2(1, N = 48) = 0.03, p = .86.
Relationships Among Expressed Emotions and Body Parts
Emotion showed a significant main effect on the body-part dependent variables: the head, χ2(5, N = 48) = 71.23, p < .001; arms, χ2(5, N = 48) = 12.23, p < .05; shoulders, χ2(5, N = 48) = 11.23, p < .05; and pelvis, χ2(5, N = 48) = 12.01, p < .05. However, no significant effect was observed for the legs variable, χ2(5, N = 48) = 6.42, p = .27. For the sake of clarity, we developed a visualization procedure to illustrate how each perceived emotion could be systematically related to a specific body area. Based on the analysis of the contrasts, we defined three gray-scale intensities (light gray, dark gray, and black) to designate the level of explicit attention the observers reported regarding the head,
shoulders, arms, pelvis, and legs with respect to the emotion expressed (see Figure 10). Black indicates the highest significant correlation between an emotion and the observed body part; light gray indicates the lowest. The other body parts, for which the correlation was neither the highest nor the lowest, are colored dark gray. The contrast analyses showed that observers noted the head as the focus during dances expressing fear in comparison to the neutral condition, χ2(1, N = 48) = 20.21, p < .001; the shoulders for sadness in comparison to surprise, χ2(1, N = 48) = 7.37, p < .01; the arms for anger in comparison to fear, χ2(1, N = 48) = 9.47, p < .01; the pelvis for happiness in comparison to fear, χ2(1, N = 48) = 8.49, p < .01; and the legs for anger in comparison to sadness, χ2(1, N = 48) = 3.61, p = .06.
Figure 10. Degree of reported observer attention to the different body areas depending on the emotions expressed by the dancers. The neutral and surprise conditions showed no clear consensus regarding which body part observers attended to most; rather, responses were spread across multiple body areas.
DISCUSSION
In agreement with the literature (Camurri et al., 2003; Crane & Gross, 2013; Shikanai, Sawada, & Ishii, 2013), our results show that the emotions of happiness, anger, and sadness were typically identified correctly by the observers. Of the three, sadness was recognized most often, followed by happiness. This partially reproduces the results of Camurri et al. (2003) and of Dittrich, Troscianko, Lea, and Morgan (1996), in which the highest recognition rate was obtained for sadness, followed by anger and joy. The neutral performance, in which no emotion was expressed, also was predominantly perceived correctly. However, when the dancers expressed fear or surprise, these two emotions were not readily distinguished. This perceptual confusion finds support in two prior studies: In the seminal work by Ekman and Friesen (1974) on emotional facial expressions, and in Meijer’s (1989) study of full-body behavior in a daily life scenario, expressions of surprise tended to be recognized as fear. In our study, the perceived expressions of fear and surprise shared most of the Laban factors: time, space, and weight. For the time factor, for example, speed changes were seen as particularly fast for both emotions; for the weight factor, both performances were perceived as moderately anchored to the ground, light, and delicate. Contrary to our expectations based on earlier studies (e.g., Bläsing et al., 2012; Broughton & Davidson, 2014; Cross, Kirsch, Ticini, & Schütz-Bosbach, 2011), the expertise of the observers had little influence on emotion recognition in this study. Overall, our results suggest that the affective components of body expression are not strongly driven by expertise in dance; we saw broadly consistent responses across observers, in line with van Paasschen, Bacci, and Melcher (2015). However, the use of a point-light display based on MoCap recordings could have diminished the impact of expertise.
This type of minimalistic display transmits an amount of gestural information sufficient for emotion recognition when deployed in a restricted environment, so perhaps even those without dance experience were able to recognize the emotional expressions from these images. It is possible that additional physical evidence (e.g., facial expression and skin texture) would have allowed dance expertise to become more apparent. These findings also could be explained by the fact that the criterion chosen to classify our groups (i.e., novice, expert, and nonexpert) was not sufficient to differentiate the levels of expertise. Our criterion for expertise was based on the standard training period (8 years of regular practice) that Swiss dance institutions suggest to their most motivated students preparing for a professional career. Among the 22 experts who participated in our study, only four were dancing professionally. In future studies, we plan to include a larger number of experts, specifically professional dancers, to test whether sensorimotor expertise or aesthetic familiarity may affect emotion recognition in the specific context of this study (see also Bläsing et al., 2012). Previous research has indicated that emotions expressed in a high-intensity condition (i.e., with greater emphasis and in a demonstrative manner) are better recognized than those expressed with low emotional intensity, where emotion is conveyed more subtly (Burger, Saarikallio, Luck, Thompson, & Toiviainen, 2013). Our results show a statistically significant and a marginal difference, respectively, for fear and surprise in the opposite direction: Emotion recognition was higher in the low-intensity condition. However, anger, joy, and sadness were perceived in a similar way for
both intensities. This result goes against our expectations based on the literature and the results obtained by Hill and Pollick (2000), who showed that an increase in the expressive intensity of movements produced better recognition of joy and anger. In their study, however, the differences between the expressive conditions were achieved through offline manipulation of the recorded point-light display used to render the performances. More noteworthy is that, in our context of investigating body gestures and movements, fear was better recognized when conveyed through less intense movement. One can argue that, in real-life conditions, the experience of fear might be related to reduced motion of body parts in order to be less detectable by dangerous animals or conspecifics in a threatening situation. Such fear-related body movement might have been integrated as an internal representation of fear by dancers and spectators alike, which would explain our results. Results showed that Laban’s effort dimensions can inform an intermediate level of perceptual processing underlying emotion recognition. They revealed that movement qualities can be discriminated in a statistically significant way and significantly associated with emotion portrayals, thus highlighting their expressive pertinence. Considering the time factor, the anger, fear, and surprise emotions were related to quicker ratings in comparison to the sad and neutral conditions, in line with results in the scientific literature (Camurri et al., 2003; Crane & Gross, 2013; Meijer, 1989). As previous research has shown, however, fear and surprise cannot be distinguished on the time factor alone. Concerning the weight factor, performances expressing joy turned out to be lighter and more delicate, whereas those expressing sadness were seen as firmer and anchored to the ground.
However, this single criterion may not be sufficient to discriminate among surprise, anger, and fear. For the space factor, the movement underlying the joy and anger emotions was considered more direct than that of sadness, confirming recent studies on this specific issue (Crane & Gross, 2013; Shikanai et al., 2013). This factor alone, however, did not allow any distinction between fear and sadness. Finally, on the flow factor, happiness displayed the LMA elements of relaxed, free, and uncontrolled movement, whereas surprise and sadness seemed to involve bound, tense, and controlled movement. These results replicate the findings of Camurri et al. (2003). In addition, our results show that the surprise and fear emotions, which prior research has found are typically confused by observers, could be distinguished on the flow factor: The movement related to fear was rated as more uncontrolled than that of surprise (see also Wallbott, 1998). These findings, in totality, are consistent with Crane and Gross’s (2013) observation that the emotions displayed by dancers affect movement style in distinctive ways that can be described consistently by a specific combination of the effort dimensions. An original finding of this study concerns how various body parts attract the attention of observers according to the emotions they perceive the dancers expressing. It extends and confirms an approach developed by Nummenmaa, Glerean, Hari, and Hietanen (2014) for observing how emotions may be preferentially related to specific body parts. Our results concerning the upper body parts (i.e., head, shoulders, arms) and the pelvis, and the absence of any significant effect concerning the legs, are confirmed partially by Sawada, Suda, and Ishii (2003) and by empirical observation from early contemporary dance.
In that period, expressions and movements were confined essentially to the upper body, particularly the chest; the legs were used less. In the mid-20th century, choreographer Ted Shawn stated,
“The torso must become the most sensitive and expressive part of the body” (Shawn, 1963, p. 63). Our results, however, show that the head attracted more of the observers’ attention during dance performances expressing happiness, anger, or no emotion (the neutral condition) than during those expressing fear, surprise, or sadness. The arms were primarily observed during performances expressing anger, whereas dances expressing sadness attracted less attention to the arms. Concerning the shoulders, the happy, sad, and neutral conditions attracted the participants’ attention to this area of the body, whereas performances expressing surprise did so significantly less. Finally, the pelvis was considered most salient when watching happy and neutral performances and least salient when sad dances were presented. Our results thus provide further, detailed insight into how emotion recognition is differentially associated with specific body parts. The use of peripersonal space (Cléry, Guipponi, Wardak, & Ben Hamed, 2015; Serino et al., 2009) as an environmental restriction to constrain the dancers’ expressive movement is also a unique component of this study. It allowed us to evaluate whether motion expressed within a limited environment could still be recognized by observers. Our findings showed that the angry, fearful, happy, surprised, and sad emotions can still be well discriminated in constrained movement. In addition, our findings highlight that not only space-related features but also complementary movement qualities (e.g., flow, weight, and time) characterize the decoding of emotion expression in a systematic manner within this specific context (see also Taffou & Viaud-Delmon, 2014). To better understand how emotion recognition may vary in relation to a space representation, a future experimental protocol could include systematic manipulation of the space rendering during the stimuli presentation.
The same point-light displays could be presented in a virtual environment that differs in terms of spatial extension (e.g., either strictly matching an individual’s peripersonal space or placing a dancer’s rendered movement within a larger neutral space). These results also reveal the dancer’s potential for exploiting his/her peripersonal space for emotion expression. According to Cléry et al. (2015), not only is the type of action performed key to the representation of peripersonal space, but the emotional consequences of those actions can also dynamically modify it. A computational analysis of the Laban effort dimensions may help clarify the processes of encoding emotion through expressive body behavior. Of specific interest will be the relationship among the expressed emotions, the exploration of the confined space, and the qualitative features of movement. An experimental manipulation also could consider the dance performance within a wider variety of environments (e.g., restricted to peripersonal space or extended to a larger space, such as a theater stage). Finally, this study reveals the potential of using dance as a test case for investigating emotion expression and the representation of peripersonal space.
CONCLUSIONS
As novel digital environments increase the degrees of freedom in movement expression, ongoing research would benefit from a conceptual framework and a set of methodological procedures for considering the implicit effects of the space factor on emotion expressivity and perception. Using dance as a test case, with a choreographed performance presenting a variety of emotional variations, we systematically considered the decoding processes underlying a spectator’s identification of the emotion expressed within the performer’s peripersonal space.
52 Body, Space, and Emotion: A Perceptual Study
Within the integrated conceptual apparatus of the lens model and LMA, we contribute better insight into the qualitative features of body expression on which external observers may rely. We further reveal commonalities in the perceptual strategies developed by dance experts and nonexperts. This study sets the agenda for future developments in the thriving and quickly evolving fields of motion analysis and nonverbal communication in technologically mediated environments. Future work includes (a) replicating, with a greater number of participants, and extending the correlations observed so far, and (b) establishing relationships within the encoding process (i.e., the processes implemented by the dancers through their body behavior). The computational modeling of LMA features may be exploited to this end.
IMPLICATIONS FOR THEORY OR APPLICATION
Our study not only advances the technological state of the art in the analysis of full-body expressivity but also acknowledges the embodied turn in cognition research, confirming the prominent role of emotion and embodiment as essential components of cognitive processes. We used dance as a test case and bound the dancer’s movements to his/her peripersonal space in order to better study the impact of space constraints on emotional body expressivity and on observers’ perception of the emotions expressed within those constraints. Drawing upon the Laban movement analysis framework proved an efficient approach for recognizing emotions and for revealing the underlying perceptual processes. Within Brunswik’s (1956) lens model, we believe the qualities categorized along Laban’s effort dimensions are key intermediate-level components in emotion recognition. This study could help advance a new generation of digital environments allowing for natural and emotionally vivid interactions. Equally, this line of research could facilitate better automated recognition of emotion from bodily movement. In addition, it suggests how dance research can inform a wide variety of disciplines also interested in exploring the perception and interpretation of emotional conveyance.
ENDNOTES
1. More information on the Nexus software is available at http://www.vicon.com/products/software/nexus
2. More information on the video capture software used to display the motion capture videos is available at https://www.techsmith.com/camtasia.html
3. More information on the software used to develop the online questionnaire is available at http://www.qualtrics.com
REFERENCES
Alaoui, S. F., Caramiaux, B., Serrano, M., & Bevilacqua, F. (2012). Movement qualities as interaction modality. In Proceedings of the Designing Interactive Systems Conference (pp. 761–769). Newcastle, UK: ACM Press. doi: 10.1145/2317956.2318071
Aristidou, A., Charalambous, P., & Chrysanthou, Y. (2015). Emotion analysis and classification: Understanding the performers’ emotions using the LMA entities. Computer Graphics Forum, 34(6), 262–276. doi: 10.1111/cgf.12598
Bänziger, T., & Scherer, K. R. (2007). Using actor portrayals to systematically study multimodal emotion expression: The GEMEP corpus. In A. C. R. Paiva, R. Prada, & R. W. Picard (Eds.), Affective computing and intelligent interaction (pp. 476–487). Berlin, Germany: Springer. doi: 10.1007/978-3-540-74889-2_42
Bianchi-Berthouze, N., Cairns, P., Cox, A., Jennett, C., & Kim, W. (2006). On posture as a modality for expressing and recognizing emotions. In Proceedings of the Emotion in HCI Workshop (pp. 74–80). London, UK: University College of London Press.
Bläsing, B., Calvo-Merino, B., Cross, E. S., Jola, C., Honisch, J., & Stevens, C. J. (2012). Neurocognitive control in dance perception and performance. Acta Psychologica, 139(2), 300–308. doi: 10.1016/j.actpsy.2011.12.005
Bobick, A. F. (1997). Movement, activity and action: The role of knowledge in the perception of motion. Philosophical Transactions of the Royal Society of London B: Biological Sciences, 352, 1257–1265. doi: 10.1098/rstb.1997.0108
Bolker, B. M., Brooks, M. E., Clark, C. J., Geange, S. W., Poulsen, J. R., Stevens, M. H. H., & White, J. S. S. (2009). Generalized linear mixed models: A practical guide for ecology and evolution. Trends in Ecology and Evolution, 24(3), 127–135. doi: 10.1016/j.tree.2008.10.008
Boone, R. T., & Cunningham, J. G. (1998). Children’s decoding of emotion in expressive body movement: The development of cue attunement. Developmental Psychology, 34(7), 1007–1016. doi: 10.1037/0012-1649.34.5.1007
Broughton, M., & Davidson, J. W. (2014). Action and familiarity effects on self and other expert musicians’ Laban effort-shape analyses of expressive bodily behaviors in instrumental music performance: A case study approach. Frontiers in Psychology, 5(10), 1201–1206. doi: 10.3389/fpsyg.2014.01201
Brunswik, E. (1956). Perception and the representative design of psychological experiments. Berkeley, CA, USA: University of California Press.
Burger, B., Saarikallio, S., Luck, G., Thompson, M. R., & Toiviainen, P. (2013). Relationships between perceived emotions in music and music-induced movement. Music Perception: An Interdisciplinary Journal, 30(5), 517–533. doi: 10.1525/mp.2013.30.5.517
Camurri, A., Lagerlöf, I., & Volpe, G. (2003). Recognizing emotion from dance movement: Comparison of spectator recognition and automated techniques. International Journal of Human Computer Studies, 59(1), 213–225. doi: 10.1016/S1071-5819(03)00050-8
Caramiaux, B. (2014). Motion modeling for expressive interaction: A design proposal using Bayesian adaptive systems. In Proceedings of the International Workshop on Movement and Computing (pp. 76–81). Paris, France: ACM Press. doi: 10.1145/2617995.2618009
Caramiaux, B., Montecchio, N., Tanaka, A., & Bevilacqua, F. (2014). Adaptive gesture recognition with variation estimation for interactive systems. ACM Transactions on Interactive Intelligent Systems, 4(4), 1–34. doi: 10.1145/2643204
Chi, D., Costa, M., Zhao, L., & Badler, N. (2000). The EMOTE model for effort and shape. In Proceedings of the 27th Annual Conference on Computer Graphics and Interactive Techniques (pp. 173–182). New York, NY, USA: ACM Press. doi: 10.1145/344779.352172
Cléry, J., Guipponi, O., Wardak, C., & Ben Hamed, S. (2015). Neuronal bases of peripersonal and extrapersonal spaces, their plasticity and their dynamics: Knowns and unknowns. Neuropsychologia, 70, 313–326. doi: 10.1016/j.neuropsychologia.2014.10.022
Crane, E. A., & Gross, M. M. (2013). Effort-shape characteristics of emotion-related body movement. Journal of Nonverbal Behavior, 37(2), 91–105. doi: 10.1007/s10919-013-0144-2
Cross, E. S., Kirsch, L., Ticini, L. F., & Schütz-Bosbach, S. (2011). The impact of aesthetic evaluation and physical ability on dance perception. Frontiers in Human Neuroscience, 5(9), 1–10. doi: 10.3389/fnhum.2011.00102
Dael, N., Goudbeek, M., & Scherer, K. R. (2013). Perceived gesture dynamics in nonverbal expression of emotion. Perception, 42(6), 642–657. doi: 10.1068/p7364
Davidson, J. W. (1993). Visual perception of performance manner in the movements of solo musicians. Psychology of Music, 21, 103–113. doi: 10.1177/030573569302100201
de Gelder, B. (2006). Towards the neurobiology of emotional body language. Nature Reviews Neuroscience, 7(3), 242–249. doi: 10.1038/nrn1872
de Gelder, B. (2013). From body perception to action preparation: A distributed neural system for viewing bodily expressions of emotion. In K. Johnson & M. Shiffrar (Eds.), People watching: Social, perceptual, and neurophysiological studies of body perception (pp. 350–368). New York, NY, USA: Oxford University Press.
Dittrich, W. H., Troscianko, T., Lea, S. E. G., & Morgan, D. (1996). Perception of emotion from dynamic point-light displays represented in dance. Perception, 25(6), 727–738. doi: 10.1068/p250727
Ekman, P., & Friesen, W. V. (1974). Nonverbal behavior and psychopathology. In R. J. Friedman & H. M. Katz (Eds.), The psychology of depression: Contemporary theory and research (pp. 203–224). New York, NY, USA: Wiley.
Ekman, P., & Friesen, W. V. (1978). Facial action coding system: A technique for the measurement of facial movement. San Francisco, CA, USA: Consulting Psychologists Press.
Glowinski, D., Camurri, A., Chiorri, C., Mazzarino, B., & Volpe, G. (2007). Validation of an algorithm for segmentation of full-body movement sequences by perception: A pilot experiment. In Proceedings of the International Gesture Workshop (pp. 239–244). Berlin, Germany: Springer.
Glowinski, D., Dael, N., Camurri, A., Volpe, G., Mortillaro, M., & Scherer, K. (2011). Toward a minimal representation of affective gestures. IEEE Transactions on Affective Computing, 2(2), 106–118. doi: 10.1109/T-AFFC.2011.7
Glowinski, D., Shirole, K., Torres-Eliard, K., Grandjean, D., Riolfo, A., & Chiorri, C. (2014). Is he playing solo or within an ensemble? How the context, visual information, and expertise may impact upon the perception of musical expressivity. Perception, 43(8), 825–828. doi: 10.1068/p7787
Gunes, H., & Piccardi, M. (2009). Automatic temporal segment detection and affect recognition from face and body display. IEEE Transactions on Systems, Man, and Cybernetics Part B, 39(1), 64–84. doi: 10.1109/TSMCB.2008.927269
Hill, H., & Pollick, F. E. (2000). Exaggerating temporal differences enhances recognition of individuals from point light displays. Psychological Science, 11(3), 223–228. doi: 10.1111/1467-9280.00245
Juslin, P. N., & Laukka, P. (2003). Communication of emotions in vocal expression and music performance: Different channels, same code? Psychological Bulletin, 129(5), 770–814. doi: 10.1037/0033-2909.129.5.770
Kapadia, M., Chiang, I., Thomas, T., Badler, N. I., & Kider, J. T. (2013). Efficient motion retrieval in large motion databases. In Proceedings of the ACM SIGGRAPH Symposium on Interactive 3D Graphics and Games (pp. 19–28). Orlando, FL, USA: ACM Press.
Kapur, A., Kapur, A., Virji-Babul, N., Tzanetakis, G., & Driessen, P. F. (2005). Gesture-based affective computing on motion capture data. In J. Tao, T. Tan, & R. W. Picard (Eds.), Affective computing and intelligent interaction (pp. 1–7). Berlin, Germany: Springer.
Laban, R. (1966). The language of movement: A guidebook to choreutics. Boston, MA, USA: Plays, Inc.
Laban, R., & Lawrence, F. (1947). Effort. London, UK: Macdonald and Evans.
Laban, R., & Ullmann, L. (1971). The mastery of movement. Boston, MA, USA: Plays, Inc.
Levy, J. A., & Duke, M. P. (2003). The use of Laban movement analysis in the study of personality, emotional state and movement style: An exploratory investigation of the veridicality of “body language.” Individual Differences Research, 1(1), 39–63.
Mancas, M., Glowinski, D., Volpe, G., Coletta, P., & Camurri, A. (2010). Gesture saliency: A context-aware analysis. Gesture in Embodied Communication and Human-Computer Interaction, 5934, 146–157. doi: 10.1007/978-3-642-12553-9_13
Meijer, M. (1989). The contribution of general features of body movement to the attribution of emotions. Journal of Nonverbal Behavior, 13(4), 247–268. doi: 10.1007/BF00990296
Nummenmaa, L., Glerean, E., Hari, R., & Hietanen, J. K. (2014). Bodily maps of emotions. Proceedings of the National Academy of Sciences, 111(2), 646–651. doi: 10.1073/pnas.1321664111
Okajima, S., Wakayama, Y., & Okada, Y. (2012). Human motion retrieval system based on LMA features using interactive evolutionary computation method. In T. Watanabe & L. C. Jain (Eds.), Innovations in intelligent machines 2: Intelligent paradigms and applications (pp. 117–130). Berlin, Germany: Springer.
Palinkas, L. A. (2001). Psychosocial issues in long-term space flight: Overview. Gravitational and Space Biology Bulletin, 14(2), 25–33.
Pantic, M., Pentland, A., Nijholt, A., & Huang, T. S. (2007). Human computing and machine understanding of human behavior: A survey. In T. Huang, A. Nijholt, M. Pantic, & A. Pentland (Eds.), Artificial intelligence for human computing (pp. 47–71). Berlin, Germany: Springer.
Pollick, F. E., Paterson, H. M., Bruderlin, A., & Sanford, A. J. (2001). Perceiving affect from arm movement. Cognition, 82(2), 51–61. doi: 10.1016/S0010-0277(01)00147-0
Russell, J. (1980). A circumplex model of affect. Journal of Personality and Social Psychology, 39, 1161–1178. doi: 10.1037/h0077714
Serino, A., Annella, L., & Avenanti, A. (2009). Motor properties of peripersonal space in humans. PLoS ONE, 4(8), e6582. doi: 10.1371/journal.pone.0006582
Shawn, T. (1963). Every little movement: A book about François Delsarte, the man and his philosophy, his science and applied aesthetics, the application of this science to the art of the dance, the influence of Delsarte on American dance. New York, NY, USA: Dance Horizons.
Shikanai, N., Sawada, M., & Ishii, M. (2013). Development of the movements impressions emotions model: Evaluation of movements and impressions related to the perception of emotions in dance. Journal of Nonverbal Behavior, 37(2), 107–121. doi: 10.1007/s10919-013-0148-y
Sutil, N. (2013). Rudolf Laban and topological movement: A videographic analysis. Space and Culture, 16(2), 173–193. doi: 10.1177/1206331213475776
Taffou, M., & Viaud-Delmon, I. (2014). Cynophobic fear adaptively extends peri-personal space. Frontiers in Psychiatry, 5, 122–128. doi: 10.3389/fpsyt.2014.00122
Todorov, A., Baron, S. G., & Oosterhof, N. N. (2008). Evaluating face trustworthiness: A model based approach. Social Cognitive and Affective Neuroscience, 3(2), 119–127. doi: 10.1093/scan/nsn009
van Dyck, E., Vansteenkiste, P., Lenoir, M., Lesaffre, M., & Leman, M. (2014). Recognizing induced emotions of happiness and sadness from dance movement. PLoS ONE, 9(2), e89773. doi: 10.1371/journal.pone.0089773
van Paasschen, J., Bacci, F., & Melcher, D. P. (2015). The influence of art expertise and training on emotion and preference ratings for representational and abstract artworks. PLoS ONE, 10(8), e0134241. doi: 10.1371/journal.pone.0134241
Velloso, E., Bulling, A., & Gellersen, H. (2013). AutoBAP: Automatic coding of body action and posture units from wearable sensors. In Proceedings of the Humaine Association Conference on Affective Computing and Intelligent Interaction (pp. 135–140). Los Alamitos, CA, USA: IEEE. doi: 10.1109/ACII.2013.29
Vinciarelli, A., Pantic, M., Heylen, D., Pelachaud, C., Poggi, I., D’Errico, F., & Schröder, M. (2012). Bridging the gap between social animal and unsocial machine: A survey of social signal processing. IEEE Transactions on Affective Computing, 3(1), 69–87. doi: 10.1109/T-AFFC.2011.27
Wakayama, Y., Okajima, S., Takano, S., & Okada, Y. (2010). IEC-based motion retrieval system using Laban movement analysis. In R. Setchi, I. Jordanov, R. J. Howlett, & L. C. Jain (Eds.), Knowledge-based and intelligent information and engineering systems (pp. 251–260). Berlin, Germany: Springer.
Wallbott, H. G. (1998). Bodily expression of emotion. European Journal of Social Psychology, 28(11), 879–896. doi: 10.1002/(SICI)1099-0992(1998110)28:6<879::AID-EJSP901>3.0.CO;2-W
Zhang, S., Li, Q., Yu, T., Shen, X., Geng, W., & Wang, P. (2006). Implementation of a notation-based motion choreography system. In H. Zha, Z. Pan, H. Thwaites, A. C. Addison, & M. Forte (Eds.), Interactive technologies and sociotechnical systems (pp. 495–503). Berlin, Germany: Springer.
Zhao, L., & Badler, N. I. (2005). Acquiring and validating motion qualities from live limb gestures. Graphical Models, 67(1), 1–16. doi: 10.1016/j.gmod.2004.08.002
Authors’ Note The authors thank the dancers, Amory, Katrin, Maraussia, and Phileas, for their performances and Katrin Blantar for the choreography. The work reported in this paper was supported in part by the National Centre of Competence in Research in Affective Sciences supported by the Swiss National Science Foundation, Grant Number 51NF40-104897 – DG, and by the Swiss Institute of Rome (ISR).
All correspondence should be addressed to Donald Glowinski Swiss Center for Affective Sciences University of Geneva, Campus Biotech, bât. H8-2 Chemin des Mines 9, Case postale 60, 1211 Geneva 20, Switzerland [email protected]
Human Technology: An Interdisciplinary Journal on Humans in ICT Environments ISSN 1795-6889 www.humantechnology.jyu.fi
ISSN: 1795-6889
www.humantechnology.jyu.fi Volume 13(1), May 2017, 58–81
MUSICAL INSTRUMENTS, BODY MOVEMENT, SPACE, AND MOTION DATA: MUSIC AS AN EMERGENT MULTIMODAL CHOREOGRAPHY
Federico Visi
Interdisciplinary Centre for Computer Music Research (ICCMR)
Plymouth University
UK

Esther Coorevits
Institute for Psychoacoustics and Electronic Music (IPEM)
Ghent University
Belgium
and
Institute for Systematic Musicology
University of Hamburg
Germany

Rodrigo Schramm
Department of Music
Federal University of Rio Grande do Sul
Brazil

Eduardo Reck Miranda
Interdisciplinary Centre for Computer Music Research (ICCMR)
Plymouth University
UK
Abstract: Music is a complex multimodal medium experienced not only via sounds but also through body movement. Musical instruments can be seen as technological objects coupled with a repertoire of gestures. We present technical and conceptual issues related to the digital representation and mediation of body movement in musical performance. The paper reports on a case study of a musical performance where motion sensor technologies tracked the movements of the musicians while they played their instruments. Motion data were used to control the electronic elements of the piece in real time. It is suggested that computable motion descriptors and machine learning techniques are useful tools for interpreting motion data in a meaningful manner. However, qualitative insights regarding how human body movement is understood and experienced are necessary to inform further development of motion-capture technologies for expressive purposes. Thus, musical performances provide an effective test bed for new modalities of human–computer interaction.
Keywords: music, movement, performance, musical instrument, motion sensor, score.
© 2017 Federico Visi, Esther Coorevits, Rodrigo Schramm, & Eduardo Reck Miranda, and the Open Science Centre, University of Jyväskylä DOI: http://dx.doi.org/10.17011/ht/urn.201705272518
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
INTRODUCTION AND BACKGROUND SCENARIO
In the interdisciplinary field of musicological research, the idea that music is a multimodal phenomenon that engages body movement has given rise to a wide range of methodologies for studying musical gestures. Musical gestures are “human body movement that goes along with sounding music … the gestures of those that produce the sounds (the musicians), and the gestures of those that perceive the sounds (the listeners or dancers)” (Jensenius, Wanderley, Godøy, & Leman, 2010, p. 13). On the one hand, the advent of new technologies, such as infrared motion capture, has allowed researchers to observe human movement in detail, extracting precise three-dimensional data and kinematic features of bodily movement. This brought about a corpus of studies where motion analysis is based on the computation of several low-level descriptors—or movement features—that could be linked with musical expression (Godøy & Leman, 2010). For example, acceleration and velocity profiles have been shown to be useful in the study of musical timing (Burger, Thompson, Luck, Saarikallio, & Toiviainen, 2014; Dahl, 2014; Glowinski et al., 2013; Goebl & Palmer, 2009; Luck & Sloboda, 2009). Quantity of motion (QoM) has been related to expressiveness (Thompson, 2012) and has been used to study the dynamic effects of the bass drum on a dancing audience (Van Dyck et al., 2013), while contraction/expansion of the body can be used to estimate expressivity and emotional states (Camurri, Lagerlöf, & Volpe, 2003). More advanced statistical methods, such as functional principal component analysis and physical modeling, have led to midlevel descriptors, including topological gesture analysis (Naveda & Leman, 2010), curvature and shape (Desmet et al., 2012; Maes & Leman, 2013), and commonalities and individualities in performance (Amelynck, Maes, Martens, & Leman, 2014). On the other hand, gestures in musical performance can be accessed by means of high-level descriptors. 
Verbal descriptions, subjective experiences, and the musician’s intentions play an important role in daily interaction with music. This is the way performers and audiences naturally communicate about music. Leman (2008b) referred to these descriptions as first-person perspectives on music experience, resulting in intention-based symbolic/linguistic expressions. In the analysis of musical performance, this qualitative approach has been explored in depth in the studies of, among others, Davidson (Davidson, 2007, 2012; Williamon & Davidson, 2002) and King (2006). In studies such as these, musical gestures are accessed by means of verbal descriptors or directly perceivable movements that appear to be expressive. In that sense, the concept of musical gesture can be useful for bridging the gap between the mental and subjective experiences of the performer/listener and the directly observable physical world. Recent studies have attempted to close the gap between these two perspectives in musical performance research by applying a performer-informed analysis (Coorevits, Moelants, Östersjö, & Gorton, 2016; Desmet et al., 2012). In trying to understand the relationship between the physical aspects of movement in space and expressive qualities, the study of musical gestures has also resulted in new understandings of the relationship between musician and musical instrument. Here, the instrument becomes a natural extension of the musician (Nijs, Lesaffre, & Leman, 2013), being part of the body and hence integrated into the mediation process of communicating musical meaning. In the context of musical practice, an instrumentalist’s gestures bear great expressive potential. Many new interfaces for musical expression (NIMEs) developed in recent years take advantage of this (Miranda & Wanderley, 2006). The development of human–computer interfaces also exploits the expressive potential of musical
gestures to enhance the interaction between the digital and the human environments and to create meaningful applications for musical practice (Camurri & Volpe, 2003). Recently, artistic practice has been increasingly adopted as a complementary research method in the arts and humanities, leading to mixed, interdisciplinary methodologies (Smith & Dean, 2009). In this article, we aim to explore issues related to the representation and mediation of body movement in musical performance through digital data. We do so by adopting an interdisciplinary approach, which involves the theoretical analysis of implications brought about by (a) representing movement through data, (b) defining computable motion descriptors, and (c) employing a case study of a musical composition for viola, guitar, and motion sensors. The structure of the paper is as follows. We first review and comment on several technological approaches to the analysis, computation, and interpretation of movement data obtained from different devices. We then propose some techniques useful for extracting meaningful information from data collected via wearable motion sensors. Next, we describe a musical performance in which motion sensors were employed alongside musical instruments. The analysis of this case study then leads to a discussion on the interpretation of human expressive movement through musical experience and digitized motion data. Finally, we suggest that musical practice can aid the development of new techniques for the interpretation and understanding of movement data for expressive purposes. This can result in valuable contributions to the development of motion-based technologies for human–computer interaction beyond music applications.
CAPTURING, STORING, AND MEDIATING MOVEMENT
Human movement can be digitally captured and stored via different means for purposes of analysis, description, and notation. In the context of musicological studies, movement has been recorded over the years using visual media, such as photography (Ortmann, 1929) and video (Davidson, 1993). More recently, motion capture has become widely adopted as the medium of choice for quantitative studies of human motion. Even though new technologies are emerging, marker-based optical motion capture is still regarded as the most reliable solution for precise, high-speed tracking. Data obtained from these systems are usually in the form of three-dimensional vectors referring to a global coordinate system. Each sample in the data returns three-dimensional information regarding the position of a point (marker) in space in relation to the origin of the Cartesian axes. The origin is defined during the calibration procedure and is usually set in an arbitrary place on the floor within the capture area. As Salazar Sutil pointed out, the term motion capture (sometimes shortened to MoCap) indicates not only a technological setup but also a “technologized language of movement, involving the formalized description of movement coordinates and movement data for its subsequent computational analysis and ... processing” (2015, p. 198). Compared to photography and film, MoCap is a much younger medium. This has obvious technological implications, as MoCap technologies are still being developed and only recently have become more widely accessible to researchers and practitioners. However, the nature of body movement itself makes its mediation still somewhat conceptually challenging. Salazar Sutil noted that the conceptualization of corporeal movement often is optically biased, whereas sensations unrelated to sight are often neglected. The ubiquity of visual records is certainly a factor in this
process. Still, movement cannot be entirely represented, and therefore fully understood, exclusively by means of visual media. In fact, interpreting human movement objectively as a displacement of body parts in a three-dimensional space results in a limited interpretation. Merleau-Ponty famously pointed this out with the example of typing:

The subject knows where the letters are on the typewriter as we know where one of our limbs is, through a knowledge bred of familiarity which does not give us a position in objective space. The movement of her fingers is not presented to the typist as a path through space which can be described, but merely as a certain adjustment of motility, physiognomically distinguishable from any other. (Merleau-Ponty, 1945/2002, p. 166)

This is possibly one of the reasons why the use of absolute position in a Cartesian coordinate system imposes constraints and challenges on high-level analysis of motion data and on its use for expressive applications. In previous works, we used MoCap to carry out experiments aimed at analyzing relationships between body movements and other musical features in instrumental musical performance (Visi, Coorevits, Miranda, & Leman, 2014; Visi, Coorevits, Schramm, & Miranda, 2016). For real-time musical applications, we preferred various wearable sensors because they are easier to transport and use in performance situations. Optical MoCap, on the other hand, is considerably more demanding in terms of portability and setup time. As we will show in more detail, the raw data returned by wearable sensors are intrinsically different from those of optical MoCap, and this has implications for how the data are eventually interpreted and applied. Understanding how to extract meaningful descriptors from such sensors is useful beyond the domain of musical practice because similar technologies are becoming ubiquitous, employed in everyday objects such as mobile devices and game controllers.
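To make the contrast between the two data types concrete, a low-level kinematic descriptor such as instantaneous speed, or a simple quantity-of-motion estimate, can be computed from positional MoCap samples by finite differences. The sketch below is illustrative only; the function names are ours, and published QoM definitions (e.g., in EyesWeb or the MoCap Toolbox) differ in their details.

```python
def speeds(positions, dt):
    """Instantaneous speeds (m/s) from successive 3D position samples
    of one marker, estimated by finite differences."""
    out = []
    for (x0, y0, z0), (x1, y1, z1) in zip(positions, positions[1:]):
        d = ((x1 - x0) ** 2 + (y1 - y0) ** 2 + (z1 - z0) ** 2) ** 0.5
        out.append(d / dt)
    return out

def quantity_of_motion(marker_tracks, dt):
    """A simple quantity-of-motion estimate: mean speed summed over all
    markers. Illustrative only; not a published QoM implementation."""
    total = 0.0
    for track in marker_tracks:
        s = speeds(track, dt)
        total += sum(s) / len(s)
    return total
```

Note that such descriptors presuppose positional samples in a global coordinate system; this is precisely what wearable inertial sensors do not provide directly, which motivates the adaptations discussed later.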
Previous research (e.g., Freedman & Grand, 1977; McNeill, 1996) has pointed out that upper body movements are of particular interest when observing expressive behavior. In instrumental musical performance, the upper limbs have a central role; they typically are involved in the main sound-producing gestures. Moreover, in most cases, the hands and arms are the primary points of contact between the body of the performer and the instrument. Therefore, in the studies described in this article, we placed the sensor bands on the forearms of the performers. However, as we will show in the following sections, processing data from the inertial measurement units (IMUs) using motion descriptors and machine learning models allowed us to obtain information related to full-body movements, which can be used to extract expressive movement features. In earlier tests (Visi, Schramm, & Miranda, 2014a, 2014b), we used fingerless gloves to reduce interference with the manipulation of the musical instrument. We progressively moved away from gloves in order to obtain an even less obtrusive configuration: We first placed the sensor on the wrists and eventually moved further away from the hands of the performer, onto the upper forearm. Doing so did not reduce the amount of information about hand movements we were able to retrieve. On the contrary, by using multimodal sensing and exploiting the constraints posed by the structure of the limbs and the interdependence of their parts, we were able to estimate various measures describing the movement of both hands. Initially, we mostly employed IMUs; later we sought to include a form of muscle sensing in order to address and estimate body movement components beyond those strictly related to displacement in space, such as proprioception and effort qualities.
INERTIAL MEASUREMENT UNITS (IMUs)
An IMU is a device that incorporates accelerometers and gyroscopes. When these devices are paired with magnetometers, the resulting arrays are also known as magnetic, angular rate, and gravity (MARG) sensors. These sensor arrays allow for the tracking of acceleration, rotational velocity, and orientation of the object they are attached to, relative to the earth’s magnetic field. They are used extensively in aviation, robotics, and human–computer interaction (HCI), and their increasing affordability and small size have made them a very common feature of mobile and wearable devices and other consumer electronics (see Figure 1). Recently, 9 degrees of freedom (9DoF) sensors have become the most widely used type of IMU/MARG. Featuring three types of tri-axis sensors (hence the name), they enable the estimation of various motion features, including an optimized three-dimensional (3D) orientation obtained by fusing the data from the three types of sensors. Whereas the raw data obtained using marker-based optical motion capture consist of samples of position based on a 3D Cartesian coordinate system,1 the data returned by 9DoF sensors are usually in the form of three 3D vectors, expressing acceleration, rotational velocity, and orientation, respectively. The sensor band we used more recently2 returns acceleration in units of g, rotational velocity in degrees per second, and orientation angles in radians. Orientation also is estimated using a quaternion representation, which—unlike Euler angles—is not subject to problematic singularities such as gimbal lock (Brunner, Lauffenburger, Changey, & Basset, 2015). In addition, the sensor band returns 8-channel electromyogram (EMG) data, which we used to compute descriptors of muscular effort and to estimate the movements of the wrists and fingers.
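For illustration, one common use of a quaternion orientation estimate is to rotate the sensor-frame acceleration vector into the earth frame and subtract gravity, leaving linear acceleration. The following is a minimal sketch under our own naming assumptions; it is not the code of any particular sensor SDK.

```python
def quat_rotate(q, v):
    """Rotate 3D vector v by unit quaternion q = (w, x, y, z),
    using the expanded form of q * (0, v) * q_conjugate."""
    w, x, y, z = q
    vx, vy, vz = v
    # t = 2 * cross(q_vec, v)
    tx = 2 * (y * vz - z * vy)
    ty = 2 * (z * vx - x * vz)
    tz = 2 * (x * vy - y * vx)
    # v' = v + w * t + cross(q_vec, t)
    return (vx + w * tx + (y * tz - z * ty),
            vy + w * ty + (z * tx - x * tz),
            vz + w * tz + (x * ty - y * tx))

def earth_frame_linear_acceleration(q, accel_g):
    """Express sensor-frame acceleration (in units of g) in the earth
    frame and subtract gravity (1 g along the earth z-axis)."""
    ax, ay, az = quat_rotate(q, accel_g)
    return (ax, ay, az - 1.0)
```

With the identity quaternion and a resting sensor reading 1 g along its z-axis, the linear acceleration is zero, as expected. The quaternion form avoids the gimbal-lock singularities mentioned above because it never passes through an intermediate Euler-angle decomposition.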
Calculating absolute position from IMU data in real time is technically very difficult, if not outright infeasible, as the operation would require double integration of the acceleration data. This would result in a considerable level of residual error because drift would accumulate quadratically.
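The quadratic accumulation of drift can be illustrated with a short simulation (ours, not from the original studies): double-integrating even a small constant accelerometer bias produces position errors of several meters within seconds, and doubling the duration roughly quadruples the error.

```python
import numpy as np

def position_drift(bias_g, duration_s, fs):
    """Double-integrate a constant accelerometer bias to show quadratic drift.

    bias_g: residual accelerometer error in g; duration_s: seconds; fs: sample rate in Hz.
    Returns the accumulated position error in meters.
    """
    g = 9.81
    dt = 1.0 / fs
    n = int(duration_s * fs)
    accel = np.full(n, bias_g * g)       # constant residual error, in m/s^2
    velocity = np.cumsum(accel) * dt     # first integration: velocity error
    position = np.cumsum(velocity) * dt  # second integration: position error
    return position[-1]

# A tiny 0.01 g bias already yields meters of error within 10 s...
drift_10s = position_drift(bias_g=0.01, duration_s=10, fs=100)
# ...and doubling the duration roughly quadruples the error (quadratic growth).
drift_20s = position_drift(bias_g=0.01, duration_s=20, fs=100)
```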
Figure 1. Some of the wearable devices used in prior research. Clockwise from the bottom-left corner: Myo armbands, Sense/Stage controllers with wristbands, Axivity WAX9 with silicone wristband, Adafruit 9-DOF IMU Breakout, FreeIMU v0.4.
Moreover, it would also be relatively expensive in terms of computation. Madgwick, Harrison, and Vaidyanathan (2011) designed computationally efficient algorithms for compensating for this residual error. These can be used to estimate position from IMU data recorded in situations where specific constraints can be exploited, such as gait analysis and cyclic motion. The data obtained from IMUs are therefore morphologically very different from the positional data returned by optical MoCap. The differences in the way movement is tracked and represented by the two technologies have implications for how movement data are eventually interpreted and used, particularly in the context of expressive movement tracking. High-level movement descriptors often are used to extract features from the raw motion data that can help describe the meaning that the movements of the subject convey. This is no trivial task, and various interdisciplinary approaches have been adopted over the past 2 decades. In the following section, we look at several of the motion descriptors most widely used with positional data and discuss how they can be adapted for use with IMU data.
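Madgwick’s actual filter performs a gradient-descent correction on quaternions; as a much simplified, one-dimensional illustration of the same fusion principle, a complementary filter blends fast gyroscope integration with slow accelerometer correction. The function below is our own sketch, not the algorithm from Madgwick et al. (2011).

```python
def complementary_tilt(gyro_rates, accel_angles, dt, alpha=0.98):
    """Fuse gyroscope rates (deg/s) and accelerometer tilt estimates (deg).

    The gyro term tracks fast changes; the small accelerometer term (1 - alpha)
    slowly corrects the drift that pure gyro integration would accumulate.
    A 1D toy version of sensor fusion, not Madgwick's quaternion filter.
    """
    angle = 0.0
    for rate, acc_angle in zip(gyro_rates, accel_angles):
        angle = alpha * (angle + rate * dt) + (1 - alpha) * acc_angle
    return angle

# With zero rotation and a constant 10-degree accelerometer reading,
# the estimate converges toward 10 degrees despite starting at 0.
est = complementary_tilt([0.0] * 500, [10.0] * 500, dt=0.01)
```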
MOVEMENT DESCRIPTORS AND WEARABLE SENSORS: UNDERSTANDING DIGITIZED MOVEMENT QUALITIES
Computable descriptors of human motion are used across several disciplinary fields for various applications, ranging from kinesiology and gait analysis to HCI and gaming. Even though human motion data analysis has become an increasingly active field, there is still little consensus regarding which descriptors and methodologies yield meaningful representations of human body motion. The MoCap Toolbox (Burger & Toiviainen, 2013) provides a wide range of MATLAB scripts for offline kinematic analysis and visualization. Alternatively, expressive feature extraction and real-time interaction are prominent features of the EyesWeb platform (Camurri et al., 2007). Going beyond traditional low-level kinematic features has proven challenging, especially when dealing with expressiveness, emotions, affective states, and meaning. Larboulette and Gibet (2015) recently attempted a thorough review of computable descriptors of human motion. This was indeed a useful endeavor; however, it showed that the field remains segmented and that many procedures are either ill defined or redundant (i.e., similar concepts appear in other research literature under different names). Most of the descriptors we mention below were conceived for positional data. However, the principles behind their design are nonetheless useful for describing certain movement qualities; therefore, we attempted to adapt them to the data obtained from the IMU/MARG sensors. We implemented a series of objects using Max,3 which we chose over other programming environments because it allowed for rapid prototyping and testing of algorithms for real-time interaction and for easily integrating them with other music applications.
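As a toy example of what adapting a descriptor from positional to inertial data can look like, consider a simple quantity-of-motion measure. The positional version below uses the common definition (mean speed over a window); the inertial analogue is our own assumption for illustration, not a standard descriptor.

```python
import numpy as np

def qom_positional(positions, dt):
    """Quantity of motion from MoCap data: mean speed over the window.

    positions: (n, 3) array of 3D marker positions sampled every dt seconds.
    """
    vel = np.diff(positions, axis=0) / dt
    return float(np.mean(np.linalg.norm(vel, axis=1)))

def qom_inertial(accel_g):
    """A hypothetical IMU analogue: mean deviation of the acceleration
    magnitude from 1 g, i.e., from the reading of a sensor at rest.

    accel_g: (n, 3) array of accelerometer samples in units of g.
    """
    mags = np.linalg.norm(accel_g, axis=1)
    return float(np.mean(np.abs(mags - 1.0)))

# A static marker (or a resting sensor measuring only gravity) yields
# zero quantity of motion in both representations.
still_pos = np.zeros((100, 3))
still_acc = np.tile([0.0, 0.0, 1.0], (100, 1))
```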
Fluidity and Jerkiness
In kinematic analysis, “jerk” is the name given to the third-order derivative of movement position, namely the variation of acceleration over time. The definition of “jerk index” as the
magnitude of the jerk averaged over the entire movement (Flash & Hogan, 1985) was used by Pollick, Paterson, Bruderlin, and Sanford (2001), alongside other descriptors, to correlate arm movement with basic affective states. Jerk relates to the fluidity or smoothness of a movement—fluid movement tends to have an even level of velocity and, therefore, low values of higher-order derivatives—and can be used to detect emotionally relevant information in movement data (Glowinski et al., 2011). In fact, roughly speaking, jerkiness can be seen as the inverse of fluidity. Piana, Staglianò, Odone, and Camurri (2016) defined the fluidity index as a local kinematic feature4 equal to 1/|ji|, where ji is the jerk of the joint i. This means that higher values of jerk correspond to lower fluidity. To estimate jerkiness using 9DoF sensor data instead of positional data, we averaged the derivatives of the longitudinal acceleration components returned by the accelerometer (ax, ay, az) into a single jerk index. The resulting value was then combined with the averaged second-order derivatives of the angular velocity components returned by the gyroscope (ωx, ωy, ωz) and then summed over a time window of N samples:
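The procedure just described can be sketched as follows (our reconstruction for illustration; where the article specifies the exact combination of the two terms, this sketch simply uses an unweighted sum):

```python
import numpy as np

def jerkiness(accel, gyro, dt):
    """Estimate a jerkiness index from 9DoF data, per the procedure in the text.

    accel: (n, 3) accelerometer samples; gyro: (n, 3) gyroscope samples;
    dt: sampling interval in seconds. The unweighted sum of the two terms
    is an assumption of this sketch, not the article's exact formulation.
    """
    # First-order derivative of acceleration -> jerk, averaged across the axes
    jerk = np.diff(accel, axis=0) / dt
    jerk_index = np.mean(np.abs(jerk), axis=1)
    # Second-order derivative of angular velocity, averaged across the axes
    gyro_dd = np.diff(gyro, n=2, axis=0) / dt ** 2
    gyro_index = np.mean(np.abs(gyro_dd), axis=1)
    # Combine the two terms and sum over the window of N samples
    m = min(len(jerk_index), len(gyro_index))
    return float(np.sum(jerk_index[:m] + gyro_index[:m]))

# Constant acceleration and angular velocity produce zero jerkiness.
accel = np.tile([0.0, 0.0, 1.0], (50, 1))
gyro = np.tile([5.0, 0.0, 0.0], (50, 1))
```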