Daniel Stainsby Thesis


Australia Felix: Immersive Visions of The Great Southern Land
PhD by artefact and exegesis
Daniel Stainsby

Abstract

This project surveys how New Media is used to develop Virtual Reality (VR) experiences for creative, immersive, interactive photography. It investigates how VR can enrich our experience and perceptions of the Australian landscape. The artefact and the exegesis intersect to explore how technology can shape our various relationships with the environment; together they reflect the physical, spiritual and artistic research journey undertaken for this project. The artefact is an online experiential gallery that engages with the natural world as communicated through panoramic images of the Australian landscape. It demonstrates technical and artistic rigour and elevates VR photography as a digital art form. The exegesis reflects upon the aesthetics of the natural world in the digital milieu. It uses the theoretical prism of ecocriticism to illustrate the confluence of the love of nature (biophilia) with the love of technology (technophilia). The exegesis also offers personal reflections on the creative process and its importance to scholarship, drawing together insights into creativity as knowledge, the importance of a love of nature, and the impact of new technologies. In doing so it illustrates technobiophilia. The artefact and exegesis complement one another as the creativity of VR panoramic landscape art photography leads the academic research.

Acknowledgements

I would like to thank my supervisors for their insights and guidance: Dr Carolyn Beasley and Dr Matthew Mitchell. Further thanks to: Professor Josie Arnold; Peter Arnold; Emma-Leigh Miller-Olsen; Sand Martin. Thanks also to my family: John Stainsby; Dr Clare Stainsby; Greg Stainsby; Javari Green. To my supportive friends, Claire Oriander, Ruby Walker-Smith, Adrienne Fanning, Amanda McMillan and Bernadette Trenchman: my thanks.
Declaration

I declare that this thesis:

1. contains no material which has been accepted for the award to me of any other degree or diploma, except where due reference is made in the text of the examinable outcome;
2. to the best of my knowledge contains no material previously published or written by another person except where due reference is made in the text of the examinable outcome; and
3. contains no work based on joint research or publications.

This thesis has been professionally copy edited by Dr Rachel Le Rossignol according to the Australian Standards for Editing Practice. Specifically, the standards applied included D1, D3 to D5, and E1, E2 and E4. These standards relate to appropriate academic editing, including clarity of expression, spelling, punctuation and grammar, and ensuring the document meets the examining university's format, style and sequencing requirements.

Signed, Daniel Stainsby

Australia Felix: Immersive Visions of the Great Southern Land
Immersive interactive gallery
www.australiafelix.com.au
Artefact
Daniel Stainsby

The gallery of Australia Felix: Immersive Visions of The Great Southern Land transports you, the user, through interactive immediacy into virtual reality experiences of the Australian landscape. You have the opportunity to interact first-hand with a wide variety of photographic panoramas of remote locations in Australia. Choose any of the carefully selected, artistically inspired immersive interactions with the deserts, seas, skies and forests of Australia's most beautiful and challenging landscapes. Daniel Stainsby invites you to share the artistic and inspirational realisations of his journey around Australia through the magic of virtual reality.

System Requirements

Australia Felix: Immersive Visions of The Great Southern Land
Exegesis
Daniel Stainsby

Table of Contents

Abstract 2
Acknowledgements 3
Declaration 4
Table of Contents 7
List of Plates 10
Introduction 12
The Journey 17
Virtual Technologies 18
Technobiophilia 21
Section 1: Method & Methodology 29
Artistry in new media 29
The artistic ‘want’ and ‘electracy’ 30
The research journal in PLR 31
Autoethnography 32
Practice Led Research 33
Photographic poesis 35
Section 2: The Australian Landscape 37
Presence and absence in the Australian Landscape 37
Colonial pictorial representations of ‘Australia Felix’ 38
Colonial landscape photography: Fred Kruger 39
Australian photographic narratives: Charles Bayliss 41
Australian landscape visions 43
The spiritual elements of the Australian landscape 45
Landscape and memory 50
Landscape representation and Virtual Reality 51
Section 3: My Project 53
Artistic photography 54
VR and The Gaze 56
Manufactured realities 57
Photographic poetry for new media: visual verses 58
The VR moment 59
The creative breakthrough: an epic VR visual poem 60
Sublime solitude 61
The outback experience 62
From the real to the hyperreal 64
Section 4: Theoretical Perspectives 67
Biophilia and technophilia 67
Ecocriticism and landscape aesthetics 69
Ecophobia 71
Technobiophilia 74
Electracy 75
The uncanny 76
Landscape, virtual reality and the uncanny 79
New electronic textual possibilities for my photography 81
The Eureka experience: invention, intuition, want and curiosity 82
The Artistic Oeuvre, the VR tourist gaze and the journey of this project 83
Section 5: Creativity and Technology 85
Photographer and/as the camera 85
Vilem Flusser: the eye of the camera, the teeth of the camera; the creative aesthetic and photographic techniques 87
High Dynamic Range techniques in panoramic photography and questions of authenticity in photography 89
Visual quality 93
The photographer and the camera 94
Immersive representations and postproduction 99
Aesthetic decision making: postproduction 101
Challenges and considerations 102
A significant contribution to the genre of Digital Panoramic Immersive Photography 104
Visual quality 105
Ideal audience 107
VR: the changing audience 108
A ‘sense of living movement’ 113
Section 6: Artistic influences 118
Artists and influences 118
New Media immersive photography: a new aesthetic 118
The online gallery 119
Artists and inspirations 122
Australian landscape as a digital narrative 125
Contemporary digital art 127
Immersion not representation 128
Virtual technologies: demonstrations of digital applications for VR photography 130
‘Images of Thought’ 132
Conclusion 134
The exegetical journey 135
The creative journey 137
The photographer’s absent presence 138
‘Fieldpath’ 140
Where to? 141
References Cited 143

List of Plates

Plate 1 Journey undertaken for this project. 18
Plate 2 West Kimberley Coast. 21
Plate 3 Cable Beach. 22
Plate 4 Roebuck Bay. 23
Plate 5 Pindan Cliffs. 25
Plate 6 Goulet Point, WA. 29
Plate 7 Detail from Goulet Point panorama, showing sharks swimming in the shallows. 30
Plate 8 Eugene von Guérard: Ferntree Gully in the Dandenong Ranges. 39
Plate 9 Nyirrpi. 49
Plate 10 Sunset. 52
Plate 11 Winkypop. 56
Plate 12 The Lost Cities, NT. 62
Plate 13 Forecourt of the San Francisco Museum of Art. 77
Plate 14 The Arc de Triomphe. 77
Plate 15 Nature’s Window, Kalbarri NP, WA. 80
Plate 16 Nambung, WA. 82
Plate 17 A crocodile lies in wait in the Daintree. 87
Plate 18 Rex Creek, Daintree, Queensland. 92
Plate 19 Nikon 10.5 DX Fisheye (Google images). 95
Plate 20 Shaving the inbuilt lens-hood off the Nikon 10.5 Fisheye (Google images). 96
Plate 21 Nikon 10.5 Fisheye rear gelatin filter holder (Google images). 97
Plate 22 Sunset off Prevelly panoramas. 98
Plate 23 Prevelly. 99
Plate 24 Albany, WA. 104
Plate 25 Kakadu: Image with people included (on right). 105
Plate 26 Kakadu: Image with people removed. 105
Plate 27 Mr Wyld’s large model of the Earth. 110
Plate 28 Bea Maddock exhibition. 123
Plate 29 Details of Bea Maddock’s Terra Spiritus exhibition. 125

List of the 39 landscapes portrayed as interactive panoramas (each with a non-interactive duration of 3 minutes 10 seconds)

1. Jervis Bay Territory Booderee; photographed 2016
2. NSW Barrington Tops Creek; photographed 2015
3. NSW Crowdy Bay Diamond Head; photographed 2015
4. NSW Lake Mungo; photographed 2015
5. NSW Myall Lakes; photographed 2015
6. NT Karrku; photographed 2015
7. NT Kata Tjuta; photographed 2015
8. NT Kings Canyon; photographed 2015
9. NT Lake Bennett; photographed 2015
10. NT Lasseter Desert Oaks; photographed 2015
11. NT Ormiston Gorge; photographed 2015
12. NT Palm Valley; photographed 2015
13. NT Rainbow Valley; photographed 2015
14. NT Tanami Ghost Gums; photographed 2015
15. NT Uluru; photographed 2015
16. QLD Daintree Noah's Beach; photographed 2014
17. QLD Daintree Rex Creek; photographed 2014
18. QLD Hinchinbrook Island Nina's Peak; photographed 2014
19. QLD Lamington Forest; photographed 2014
20. QLD Whitehaven Beach; photographed 2014
21. SA Bunda Cliffs; photographed 2013
22. SA Lake Eyre; photographed 2013
23. TAS Bruny Island Beach; photographed 2014
24. TAS Cradle Mountain Dove Lake; photographed 2014
25. TAS Donoughys Lookout; photographed 2014
26. TAS Friendly Beaches; photographed 2014
27. TAS Mount Field; photographed 2014
28. TAS St Helens Point; photographed 2014
29. TAS Tarkine; photographed 2014
30. VIC Apostles; photographed 2013
31. VIC Grampians; photographed 2014
32. VIC Mt Buffalo; photographed 2015
33. WA Albany; photographed 2013
34. WA Cape Le Grand; photographed 2013
35. WA Gascoyne Coast; photographed 2013
36. WA Gascoyne Storm; photographed 2013
37. WA James Price Point; photographed 2013
38. WA Kalbarri; photographed 2013
39. WA Shark Bay; photographed 2013

Introduction