360 VR Based Robot Teleoperation Interface for Virtual Tour

Yeonju Oh, Department of Computer and Information Technology, Purdue University, West Lafayette, IN 47907, USA ([email protected])
Ramviyas Parasuraman, Department of Computer and Information Technology, Purdue University, West Lafayette, IN 47907, USA ([email protected])
Tim McGraw, Department of Computer Graphics Technology, Purdue University, West Lafayette, IN 47907, USA ([email protected])
Byung-Cheol Min, Department of Computer and Information Technology, Purdue University, West Lafayette, IN 47907, USA ([email protected])

ABSTRACT

We propose a novel mobile robot teleoperation interface that demonstrates the applicability of a robot-aided remote telepresence system with a virtual reality (VR) device in a virtual tour scenario. To improve the realism and provide an intuitive replica of the remote environment at the user interface, we implemented a system that automatically moves a mobile robot (the viewpoint) while displaying 360-degree live video streamed from the robot on a virtual reality headset (Oculus Rift). Once the user chooses a destination among a given set of options, the robot generates a route to that destination based on a shortest-path graph and travels along the route using a wireless signal tracking method that relies on measuring the Direction of Arrival (DOA) of the radio signal. In this paper, we present an overview of the system and its architecture, highlight the current progress in the system development, and discuss the implementation aspects of the system.

Figure 1: Illustrative example of the proposed VR interface.

1 INTRODUCTION

In addition to the entertainment industry, Virtual Reality (VR) has a myriad of potential application areas such as tourism [22], industrial design [6], architecture, real estate, medical care [7, 8], and education [20]. Among these applications, we are interested in the use of VR in a virtual tour scenario, where users can be provided with vivid touring experiences of real-world environments. For instance, at the 2017 Google I/O conference, Google introduced educational VR content for smartphones whose main feature is that students can watch 360-degree field trip videos to realistically experience what they see in a book. Through the pilot studies, students could engage in more immersive, interactive, and social activities to obtain enhanced learning experiences.

Currently, such virtual tour applications are limited mainly to displaying prerecorded media (videos or panorama images) [9, 13, 21]. In [9, 21], the studies constructed a 3D map for a virtual tour of world cultural heritage, but they do not reflect the real-time situation. In [13], the authors proposed a web-based telepresence system with 360-degree panorama images; this mobile robot platform adopts landmark-based path planning entered by the users, and its monitoring window provides an omnidirectional image but does not present it on a VR device. In addition, existing research and applications can only adjust the viewpoint at a fixed position or move the viewpoint using manual control.

Therefore, to mitigate such limitations and to realize a truly immersive remote tour experience, this study proposes a system exploiting a VR display that presents a (near real-time) stream from a live 360-degree camera on a moving agent, specifically a mobile robot. Figure 1 illustrates an example scenario of the proposed system. The VR display shows the general interface so that a user equipped with a VR device can visualize the robot's 3D view and also control the robot to navigate through the remote environment (and change the viewpoint).

Recent years have witnessed active research in robot control using VR [4]. In particular, immersive teleoperation systems using Head Mounted Displays (HMDs) have been developed [10, 12]. Telepresence via a VR interface has been shown to be easier and more intuitive than interfaces using liquid crystal displays and joysticks [3, 11]. For instance, in [1], a panoramic image is generated by joining the images taken by the omnidirectional camera on a Remotely Operated underwater Vehicle (ROV), and the VR display renders the combined image to improve spatial perception in the underwater environment. In another study [14], a VR-based teleoperation interface was developed to allow users to perform specific tasks (assembly work) with a robot manipulator by mimicking the user's movements with the VR controller, which is more accurate than the conventional automatic assembly used in automated processes.

These previous studies show that a VR interface can be more effective than a conventional display (such as a screen and a joystick controller) and can be used in real life through an integration of various systems. Taking inspiration from these works, in this paper we propose a VR interface that incorporates a 360-degree camera on a mobile robot to better utilize the immersive experience of the VR display. Additionally, we design an autonomous navigation system for the robot through a wireless signal tracking method [15, 16]. This way, the users are expected to feel less fatigue (and not worry about fine-tuned movement control) during the virtual tour experience through the VR display and to have a more relaxed experience, since the mental workload of navigating the environment is minimized.

Moreover, we expect that this platform can be generalized to realize an intuitive telepresence robot with a 360-degree camera video streaming interface in VR environments. Therefore, the usability of the VR-based robot interface can be improved not only for the virtual tour case but also, more broadly, for general telepresence.

2 METHODOLOGY AND SYSTEM DESIGN

This section provides the overall architecture and discusses the user interface (UI) components for building the entire system.

2.1 System Architecture

To fully support the concepts of the envisioned virtual tour, we identify the following essential components: VR video streaming and rendering, the robot's signal tracking for indoor autonomous navigation, and integration with the VR interface. Figure 2 shows an overview of the overall system architecture. It categorizes the system into three entities: the VR user interface at the client (User), the mobile robot, which is the moving agent (Robot), and the path planning and tracking functions that generate the robot's path (Server). The robot's path planning features are integrated with the Robot Operating System (ROS) [19], a powerful tool that simplifies the development process in robotic programming.

Figure 2: An overview of the proposed 360 VR based virtual (remote) tour system (Oculus Rift HMD and controller on the user side; ROS bridge, path planning, and RSSI tracking nodes on the server; directional antenna, wheels, sensors, and 360-degree camera on the robot; the 360-degree video feed is served over HTTP).
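The paper does not include code, but a minimal sketch can make the Server-side ROS integration concrete. The rospy node below subscribes to the destination selected in the VR interface and publishes velocity commands to the robot base. The topic names (/user/destination, /cmd_vel), message types, and the placeholder constant-velocity loop are assumptions for illustration only; they stand in for the actual path-planning and signal-tracking nodes described later.

```python
#!/usr/bin/env python
# Minimal sketch of a server-side ROS node (hypothetical topic names).
# It receives the destination chosen in the VR interface and, while a goal
# is active, forwards simple velocity commands to the robot base.
import rospy
from std_msgs.msg import String
from geometry_msgs.msg import Twist


class VirtualTourServer(object):
    def __init__(self):
        # The robot base is assumed to accept standard Twist commands on /cmd_vel.
        self.cmd_pub = rospy.Publisher('/cmd_vel', Twist, queue_size=10)
        # Destination chosen by the user in the VR interface (assumed topic).
        rospy.Subscriber('/user/destination', String, self.on_destination)
        self.goal = None

    def on_destination(self, msg):
        rospy.loginfo('User selected destination: %s', msg.data)
        self.goal = msg.data  # here the path planner / DOA tracker would be invoked

    def spin(self):
        rate = rospy.Rate(10)  # 10 Hz control loop
        while not rospy.is_shutdown():
            if self.goal is not None:
                cmd = Twist()
                cmd.linear.x = 0.2  # placeholder forward velocity (m/s)
                self.cmd_pub.publish(cmd)
            rate.sleep()


if __name__ == '__main__':
    rospy.init_node('virtual_tour_server')
    VirtualTourServer().spin()
```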
The 360-degree camera produces a complete 360-degree image by combining two fisheye lenses. By stitching the image frames along the orange guidelines in Figure 3(a), a live video stream is rendered onto a spherical object, as shown in Figure 3(b). Each fisheye image is mapped onto one hemisphere to produce a panoramic image on a single sphere. The 3D view the user sees is shown in Figure 3(c) (stretched to 2D in the picture). Since each lens captures more than 180 degrees, part of each original image is cut and merged when the two hemispheres are joined, which leaves a slight seam in the panorama. To improve the streaming quality in terms of latency, we have to change the streaming settings, such as reducing the bitrate of the video frames or the streaming buffer size, because the camera needs to send the two fisheye images together, which is a characteristic of dual fisheye lenses. Due to this limitation, such changes of settings may degrade the video quality, so there is a trade-off every time we apply this post-processing.
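To make the hemisphere mapping concrete, here is a rough numpy sketch that converts one side-by-side dual-fisheye frame into an equirectangular panorama by assigning each lens to one half of the sphere. It assumes an ideal equidistant fisheye model, a symmetric side-by-side layout, and no blending across the seam, so it only approximates what the camera's own stitching (or a calibrated pipeline) would do.

```python
import numpy as np


def dual_fisheye_to_equirect(frame, out_w=2048, out_h=1024, fov_deg=190.0):
    """Map a side-by-side dual-fisheye frame to an equirectangular panorama.

    Assumes two equidistant fisheye images of `fov_deg` degrees placed
    left/right in `frame`, looking in opposite directions along the x axis,
    each image circle filling its (roughly square) half of the frame.
    """
    h, w = frame.shape[:2]
    fw = w // 2                      # width of one fisheye image
    fov = np.radians(fov_deg)

    # Spherical coordinates of every output pixel.
    lon = (np.linspace(0, out_w - 1, out_w) / out_w - 0.5) * 2 * np.pi
    lat = (0.5 - np.linspace(0, out_h - 1, out_h) / out_h) * np.pi
    lon, lat = np.meshgrid(lon, lat)

    # Unit viewing directions on the sphere (x forward, y left, z up).
    x = np.cos(lat) * np.cos(lon)
    y = np.cos(lat) * np.sin(lon)
    z = np.sin(lat)

    # Front lens covers x >= 0; back lens covers x < 0 (mirrored axis).
    front = x >= 0
    xs = np.where(front, x, -x)

    # Equidistant fisheye projection: image radius proportional to the
    # angle between the viewing direction and the lens axis.
    theta = np.arccos(np.clip(xs, -1.0, 1.0))
    r = theta / (fov / 2)                      # normalized radius in the lens circle
    phi = np.arctan2(z, np.where(front, y, -y))

    u = (0.5 + 0.5 * r * np.cos(phi)) * (fw - 1)
    v = (0.5 - 0.5 * r * np.sin(phi)) * (h - 1)
    u = u + np.where(front, 0, fw)             # shift into the second lens image

    ui = np.clip(u.round().astype(int), 0, w - 1)
    vi = np.clip(v.round().astype(int), 0, h - 1)
    return frame[vi, ui]
```

In the actual interface the resulting panorama is simply textured onto the inside of a sphere centered at the VR camera, which corresponds to the spherical rendering described above; this sketch only illustrates the geometry of the seam and hemisphere assignment.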
2.2 Telepresence Robot Server

Once the user selects a destination using the UI, the robot starts to plan the path based on the registered Wi-Fi Access Points (APs) and tracks the generated path using the directional-antenna-based wireless signal tracking method proposed in [15]. Wi-Fi signal tracking has several advantages in terms of cost effectiveness and of supporting consistent live streaming and path planning. By using the indoor APs, the server can keep streaming the live video consistently over a larger area. Also, even if the selected destination is not in the line of sight, a path can be generated by passing through a waypoint AP that is accessible from the robot's current location.

2.2.1 Hardware. Multiple APs are pre-installed throughout the environment, and they can serve as waypoints or destinations for the robot's tracking, as illustrated in Figure 4. This approach is cost-effective compared to infrastructures using beacons or RFID, and it is practical because typical Wi-Fi APs can be utilized. A 360-degree camera and a rotating Wi-Fi directional antenna are connected to the onboard computer of the mobile robot, which provides a ROS-compatible driver. The directional antenna measures the Wi-Fi Received Signal Strength Indicator (RSSI) within a specific angular range swept by a servo pan system. This angular scan of the RSSI is then used to calculate the Direction of Arrival (DOA) of the Wi-Fi AP. The built-in ultrasonic sensors on the mobile robot are used to determine the DOA of obstacles, which is then used to avoid obstacles on the trajectory.

2.2.2 Software. The software module includes a path planning algorithm, which finds the shortest path from a graph consisting of the registered APs.
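As a rough illustration of the navigation logic in Sections 2.2.1 and 2.2.2, the sketch below pairs a naive DOA estimate (the pan angle with the strongest smoothed RSSI in one antenna sweep) with Dijkstra's algorithm over a small, hand-made AP graph. The smoothing window, the synthetic RSSI values, and the example graph are illustrative assumptions; the actual tracking method of [15] is more sophisticated.

```python
import heapq
import numpy as np


def estimate_doa(angles_deg, rssi_dbm, window=3):
    """Return the pan angle (deg) with the strongest smoothed RSSI.

    `angles_deg` are the antenna pan angles of one servo sweep and
    `rssi_dbm` the RSSI measured at each angle.
    """
    kernel = np.ones(window) / window
    smoothed = np.convolve(rssi_dbm, kernel, mode='same')  # reduce measurement noise
    return angles_deg[int(np.argmax(smoothed))]


def shortest_path(graph, start, goal):
    """Dijkstra over a dict graph: {node: [(neighbor, cost), ...]}."""
    queue = [(0.0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return path, cost
        if node in visited:
            continue
        visited.add(node)
        for neighbor, edge_cost in graph.get(node, []):
            if neighbor not in visited:
                heapq.heappush(queue, (cost + edge_cost, neighbor, path + [neighbor]))
    return None, float('inf')


if __name__ == '__main__':
    # Hypothetical AP graph; edge weights are rough corridor distances in meters.
    ap_graph = {
        'AP1': [('AP2', 10.0), ('AP3', 18.0)],
        'AP2': [('AP1', 10.0), ('AP3', 6.0)],
        'AP3': [('AP1', 18.0), ('AP2', 6.0)],
    }
    print(shortest_path(ap_graph, 'AP1', 'AP3'))   # (['AP1', 'AP2', 'AP3'], 16.0)

    angles = np.arange(-90, 91, 10)                # one servo sweep
    rssi = -60 - 0.02 * (angles - 30) ** 2         # synthetic RSSI peaking near 30 deg
    print(estimate_doa(angles, rssi))              # ~30
```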
Recommended publications
  • Underwater Virtual Tours Using an Omnidirectional Camera Integrated in an AUV
    Creating 360° Underwater Virtual Tours Using an Omnidirectional Camera Integrated in an AUV. Josep Bosch, Pere Ridao, David Ribas and Nuno Gracias, Computer Vision and Robotics Institute (VICOROB), University of Girona, 17071 Girona, Spain. [email protected], [email protected], [email protected], [email protected]

    Abstract—The use of omnidirectional cameras underwater is enabling many new and exciting applications in multiple fields. Among these, the creation of virtual tours from omnidirectional image surveys is expected to have a large impact in terms of science and conservation outreach. These surveys can be performed by Remotely Operated Vehicles (ROVs) and Autonomous Underwater Vehicles (AUVs) that can cover large marine areas with precise navigation. Virtual tours are relevant in zones of special interest such as shipwrecks or underwater nature reserves for both scientists and the general public. This paper presents the first results of surveys carried out by an AUV equipped with an omnidirectional underwater camera, and explores the process of automatically creating virtual tours from the most relevant images of the datasets. The impact of acquiring omnidirectional images from an AUV is illustrated with the results of a survey over a shipwreck in the bay of Porto Pim in Horta, Azores Islands, that was carried out in September 2014 during sea trials in the framework of the MORPH EU-FP7 project [3]. A selection of the related work can be found in Section II. The equipment used is described in Section III. Sections IV and V present the construction of the panoramas and the virtual tours. Section VI presents practical results. Finally, Section VII briefly draws the conclusions and future work.
  • Matterport: Virtual Tour As a New Marketing Approach in Real Estate
    Advances in Social Science, Education and Humanities Research, volume 502. Proceedings of the International Conference of Innovation in Media and Visual Design (IMDES 2020). Matterport: Virtual Tour as A New Marketing Approach in Real Estate Business During Pandemic COVID-19. Mohamad Zaidi Sulaiman1,*, Mohd Nasiruddin Abdul Aziz2, Mohd Haidar Abu Bakar3, Nur Akma Halili4, Muhammad Asri Azuddin5. 1,2,4,5 Faculty of Art & Design, Universiti Teknologi MARA, Perak Branch, 32610 Seri Iskandar, Perak, Malaysia. 3 Cerenkov Scientific Sdn. Bhd., 35-1, Jalan Seroja 4, Taman Seroja, 42900 Sepang, Selangor, Malaysia. *Corresponding author. Email: [email protected]

    ABSTRACT: The real estate business was highlighted in several reports as one of the critical economic sectors affected by the COVID-19 pandemic. Potential buyers can no longer visit a property to experience the real space physically as they did before. In this situation, an advanced tool is needed to improve the marketing strategy in the real estate business. Currently, 360 photography and virtual tours have become the most relevant marketing strategy for selling property. As a new approach, Matterport, one of the available online platforms, is presented. The study engaged in an archival method of document review with thematic and content analysis. Four themes were compared and discussed between the conventional approaches and the Matterport online application: (1) accessibility, (2) visual capture, (3) detailed information, and (4) visual experience. This paper presents and explains Matterport as a new platform for the marketing approach, with the aim of helping the real estate business survive during the COVID-19 pandemic. Keywords: Real Estate, 360 Photography, Virtual Tour, Matterport, Marketing.
  • Development of Panoramic Virtual Tours System Based on Low Cost Devices
    The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Volume XLIII-B2-2021, XXIV ISPRS Congress (2021 edition). DEVELOPMENT OF PANORAMIC VIRTUAL TOURS SYSTEM BASED ON LOW COST DEVICES. Issam Boukerch 1*, Bachir Takarli 1, Kamel Saidi 1, Mokrane Karich 1, Mostapha Meguenni 1. 1 Agence Spatiale Algérienne, Centre des Techniques Spatiales, Arzew, Algeria - (iboukerch, btakarli, ksaidi)@cts.asal.dz. KEY WORDS: panoramic photography, low cost, virtual tour, open source software, web application.

    ABSTRACT: The virtual visit is a simulation of reality that allows virtually changing the time or the place. In contrast, virtual reality offers the possibility of associating the real world with the virtual world, while ensuring interactivity and improving the exploration of reality. One of the most used forms of virtual reality is 360° panoramic imagery and video; the market currently offers many providers of panoramic cameras, ranging from costly professional cameras to low-cost consumer-grade cameras. In addition, many software providers offer virtual tour creation and visualisation, but, to our knowledge, only Google Street View offers full mapping functionalities, and its data capture and collection is an opaque operation. Generally, smartphones are equipped with positioning capabilities based on sensors such as gyroscopes, accelerometers, a magnetometer used as a compass, and GPS; the integration of these data allows the estimation of the pose. This paper discusses the creation of an interactive virtual reality experience with 360° panoramic images, with the possibility of automating its acquisition and integration into a mapping environment by using smartphone positioning methods, while the panoramic images are taken using a consumer-grade camera, which composes a low-cost system for acquiring and diffusing panoramic virtual tour maps.
  • 360-Degree Panoramas As a Reality Capturing Technique in Construction Domain: Applications and Limitations
    55th ASC Annual International Conference Proceedings. Copyright 2019 by the Associated Schools of Construction. 360-Degree Panoramas as a Reality Capturing Technique in Construction Domain: Applications and Limitations. Ricardo Eiris Pereira, PhD Student, and Masoud Gheisari, PhD, University of Florida, Gainesville, Florida.

    360-degree panorama is a reality capturing technique for visualizing environments that fully immerses the user in a true-to-reality representation of real-world spaces. 360-degree panoramas create highly realistic and detailed replications of construction sites while giving users a high sense of presence. This paper aims to discuss the applications of 360-degree panoramas and extend the understanding of the current state of this technology in the construction setting and other affine fields. Additionally, the methodology used to develop 360-degree panoramas and the limitations of the technique are examined for the construction domain. Overall, three major areas were identified that employ 360-degree panoramic technology in construction: interactive learning, reality backdrop to augmented information, and visualizing safe and unsafe situations. It was found that these applications leverage the realism embedded in 360-degree panoramas to visualize remote, inaccessible, or unsafe places that are common in construction. Keywords: 360-Degree Panoramas, Mixed Reality, Reality Capturing.

    Introduction: 360-degree panorama is a reality capturing technique that creates an unbroken view of a whole region surrounding an observer, giving a "sense of presence, of being there" (Bourke 2014). Panoramas were devised by the English artist Robert Baker in 1792 (Ellis 2012) as a method to describe his paintings on the internal face of cylindrical-shaped buildings. The concept present in these paintings was later extended to photography by combining pictures taken from a static vantage point in a cylindrical fashion.
  • Designing Virtual Reality Experiences in Education
    Bulletin of the Technical Committee on Learning Technology (ISSN: 2306-0212), Volume 21, Number 1, March 2021, 19-22. Designing Virtual Reality Experiences in Education. Ramesh Chander Sharma and Yash Paul Sharma.

    Abstract—Pedagogy of student engagement embodies creativity, autonomy, engagement, and metacognition. We have been working on developing a framework for transformed pedagogies by designing and creating virtual reality experiences for learners. These transformative learning experiences enable learners to learn creatively by exploring and experimenting; as active citizens by making choices, taking decisions, and solving problems; engaging intellectually by generating ideas; reflecting on their own learning; and learning how to learn through metacognition. We created virtual immersive experiences for the students using real-world content (360-degree media), synthetic content (computer-generated), or a mix of these two. Our work involved creating virtual reality …

    … hardware and software are available. An immersive experiences lab has been set up at Grinnell College (https://gciel.sites.grinnell.edu/) to explore innovative ways to teach liberal arts using immersive 3D, virtual reality, and mixed reality content. A commendable aspect of their approach is to distribute these as open educational resources for the benefit of the larger education community. Another interesting example is from the Engineering Education Transformations Institute, University of Georgia (https://eeti.uga.edu/area_of_focus/research_initiation/)
  • Guided 360-Degree Visual Perception for Mobile Telepresence Robots
    Guided 360-Degree Visual Perception for Mobile Telepresence Robots. Kishan Chandan, Xiaohan Zhang, Jack Albertson, Xiaoyang Zhang, Yao Liu, Shiqi Zhang. SUNY Binghamton, Binghamton, NY 13902 USA. {kchanda2, xzhan244, jalbert5, xzhan211, yaoliu, szhang}@binghamton.edu

    Abstract—Omnidirectional (360-degree) cameras have provided robots a field of view that covers approximately the entire sphere, whereas people's visual field spans only about 120° horizontally. The difference in visual field coverage places a challenge for people to directly process the rich visual information (in spherical or planar forms) from the robots' 360-degree vision system. In this paper, we focus on mobile telepresence robot platforms equipped with 360-degree vision, where the human operator perceives and interacts with a remote environment through a mobile robot. We develop a framework that, for the first time, leverages the mobile robot's 360-degree scene analysis capability to guide human attention to the remote areas with potentially rich visual information. We have evaluated our framework using a remote target search task, where a human operator needs to use a semiautonomous Segway-based mobile robot to locate a target object in a remote environment. We have compared our approach to a baseline from the literature that supports 360-degree visual perception, but does not provide guidance to human attention. From the results, we observed significant improvements in the efficiency of the human-robot system conducting remote target search tasks.

    Fig. 1: Segway-based mobile robot platform (RMP-110) with a 360-degree camera (Ricoh Theta V) mounted on top of the robot, and a laser range finder (SICK TiM571 2D) used for prototyping and evaluation purposes in this research.
  • Viewer's Role and Viewer Interaction in Cinematic Virtual Reality
    Computers - Article. Viewer's Role and Viewer Interaction in Cinematic Virtual Reality. Lingwei Tong 1,*, Robert W. Lindeman 1 and Holger Regenbrecht 2. 1 HIT Lab NZ, University of Canterbury, Christchurch 8041, New Zealand; [email protected]. 2 Department of Information Science, University of Otago, Dunedin 9016, New Zealand; [email protected]. * Correspondence: [email protected]

    Abstract: Cinematic Virtual Reality (CVR) is a form of immersive storytelling widely used to create engaging and enjoyable experiences. However, issues related to the Narrative Paradox and Fear of Missing Out (FOMO) can negatively affect the user experience. In this paper, we review the literature about designing CVR content with consideration of the viewer's role in the story, the target scenario, and the level of viewer interaction, all aimed at resolving these issues. Based on our explorations, we propose a "Continuum of Interactivity" to explore appropriate spaces for creating CVR experiences that achieve high levels of engagement and immersion. We also discuss two properties to consider when enabling interaction in CVR: the depth of impact and the visibility. We then propose the concept framework Adaptive Playback Control (APC), a machine-mediated narrative system with implicit user interaction and backstage authorial control. We focus on "swivel-chair" 360-degree video CVR with the aim of providing a framework for mediated CVR storytelling with interactivity. We target content creators who develop engaging CVR experiences for education, entertainment, and other applications without requiring professional knowledge in VR and immersive systems design. Keywords: cinematic VR; digital storytelling; interaction; 360-degree narrative.
  • "Virtual Field Trips." Teachers Can Acquire Prepared Lesson … - Premier Travel Media, Inc.
    2021 Edition. A Premier Travel Media publication / StudentTravelPlanningGuide.com. Cover stories: Virtual field trips - tour museums, national parks and laboratories from the safety of home; 6 easy tips for chartering a safe motorcoach trip; discover learn-at-home programs offered by museums; develop teamwork skills at outdoor adventure parks.

    Editorial & Advertising Office: 621 Plainfield Road, Suite 406, Willowbrook, IL 60527; P 630.794.0696; F 630.794.0652; [email protected]; www.ptmgroups.com. Publisher - Jeffrey Gayduk; Senior Editor - Randy Mink; Associate Editor - Miles Dobis; Contributing Writers - Lisa Curtin, Ayumi Davis, George Hashemi, Amanda Landwehr, Dove Rebmann, Zoe Strozewski; Director, Design & Production - Lisa Hede; Design & Production Intern - Abbie Wilson.

    If there have been any silver linings to the ongoing COVID-19 pandemic, then one of them has certainly been the growing appreciation for teachers across the world. The ordinary stresses of the job have been compounded with limited face time with students, concerns about classroom safety and uncertainty about the future. However, both educators and students alike have quickly adapted to our new digital normal, a world where lessons and tests are administered over Zoom calls.
  • Underwater Navigation and Mapping with an Omnidirectional Optical Sensor
    Underwater Navigation and Mapping with an Omnidirectional Optical Sensor. Josep Bosch Alay. Doctoral Thesis, Doctoral Program in Technology, Universitat de Girona, 2018. Supervised by Dr. Nuno Gracias and Dr. Pere Ridao. Thesis submitted to the University of Girona in fulfillment of the requirements for the degree of Doctor of Philosophy. Use this URL to cite or link to this publication: http://hdl.handle.net/10803/664260. This work is licensed under a Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0/deed.ca).

    Certificate of Thesis Direction: Dr. Nuno Gracias and Dr. Pere Ridao, members of the Departament d'Arquitectura i Tecnologia de Computadors of the Universitat de Girona, declare that the work entitled Underwater Navigation and Mapping with an Omnidirectional Optical Sensor, presented by Josep Bosch Alay to obtain the degree of Doctor of Philosophy, has been developed under our supervision. Therefore, in order to certify the aforesaid statement, we sign this document. Girona, August 2018. Dr. Nuno Gracias, Dr. Pere Ridao.

    Acknowledgements: This thesis would not have been possible without the help of many people, to whom I want to address the lines below. As it could not be otherwise, I would like to begin with an immense thank-you to my parents. It is thanks to them that I have been able to study, and through their example they have taught me invaluable values: humility, honesty, and hard work.
  • The Use of Virtual Reality to Promote Sustainable Tourism: a Case Study of Wooden Churches Historical Monuments from Romania
    Remote Sensing - Article. The Use of Virtual Reality to Promote Sustainable Tourism: A Case Study of Wooden Churches Historical Monuments from Romania. Tudor Caciora 1,*, Grigore Vasile Herman 2, Alexandru Ilieș 2, Ștefan Baias 2, Dorina Camelia Ilieș 2, Ioana Josan 2 and Nicolaie Hodor 3. 1 Doctoral School in Geography, University of Oradea, 1 Universitatii Street, 410087 Oradea, Romania. 2 Department of Geography, Tourism and Territorial Planning, Faculty of Geography, Tourism and Sport, University of Oradea, 1 Universitatii Street, 410087 Oradea, Romania. 3 Faculty of Geography, Babes-Bolyai University, 5-6 Clinicilor Street, 400006 Cluj-Napoca, Romania. * Correspondence: [email protected]; Tel.: +40-740-941-144.

    Abstract: The accelerated development and expansion of cultural tourism in areas with unique tourist objectives, characterised by a high degree of risk in terms of their physical and chemical integrity, requires sustained efforts by all stakeholders to identify new methods, techniques, and procedures for their conservation, protection, and capitalisation, with respect to tourism. The aim of this study was to propose an optimal methodology for capitalising on tourism related to wooden churches, regarded as a structural item of tangible cultural heritage, with positive effects on the protection, conservation, information, and awareness of all stakeholders in tourism development. This involved the development of a web portal, in which were integrated the 3D models related to the analysed objects, the panoramic images inside them, the audio support, the photographs, and the …
  • Online Virtual Tour by Panoramic Photography
    Matias Tukiainen. Online Virtual Tour by Panoramic Photography. Helsinki Metropolia University of Applied Sciences, Bachelor of Engineering, Degree Programme in Media Engineering, Thesis, 3 May 2013. Number of pages: 35. Instructor: Erkki Aalto, Head of Degree Programme.

    Abstract: The purpose of the project was to construct a virtual tour of the Espoo City Museum's mansion museum, usable in a web browser, to ensure the availability of the exhibition during renovation of the premises. The project included defining the scope with the client, technical implementation based on the specifications, and instructing the client on installation of the application. The virtual tour was defined as a 360-degree horizontal, multi-room panorama presentation that can be viewed with both a traditional desktop computer browser and a mobile device. The virtual tour was implemented using a DSLR system camera with high dynamic range imaging, a special panoramic tripod head, and two pieces of computer software for stitching the pictures together and creating the browser platform for the virtual tour. In addition to creating the virtual tour and specifying the software and hardware, the workflow was defined, with special attention paid to possible issues in the process. Errors might occur in handling the dynamics of interior photography, specifying the camera settings, selecting a proper camera lens and tripod, and stitching the images together. The project acted both as a service done for a client and as a definition of requirements for a panoramic virtual presentation.
  • NCTech 360 & Virtual Tour
    Extract from NCTech Application Notes & Case Studies. Download the complete booklet from nctechimaging.com/technotes. [Application note - World wide locations] NCTech 360 & Virtual Tour. Date: 04 March 2016. Author: Araceli Perez Ramos, Application Assistant. Organisations involved: NCTech. Products used: iSTAR, iris360, Immersive Studio, Google Street View App, PanoTour Pro, Pano2VR, plus other web-sharing providers for 360° content.

    NCTech to provide rapid HDR imaging for virtual tour deliverables. The NCTech 360-degree cameras provide rapid, automatic High Dynamic Range (HDR) 360 imaging and can be used as automated devices or for 'virtual tour' or other virtual site-documentation applications. NCTech products provide advanced capabilities to make the work easier, faster and more enjoyable for their users. NCTech is a pioneer in 360° imaging, and its iris360 has the prestigious position of being the first automatic 360° camera approved by Google for use by their Street View | Trusted photographers. iris360 can also be used by anyone to instantly publish panoramic images to the Street View platform through its direct integration with Google's Street View app. Both iris360 and iSTAR can also be used for other non-Google imaging applications, most commonly for virtual tours, site documentation or asset management. Both NCTech imaging devices rapidly and automatically provide standard-format image output, fully compatible with all standard 3rd-party software for virtual tour creation and customisation. NCTech Ltd, 1 Boroughloch Square, Edinburgh, EH8 9NJ, UK. T: +44 131 202 6258 - www.nctechimaging.com.

    The aim of this report is to show how NCTech products improve the task of virtual tour creation by making it simpler and faster, so anyone can achieve good results with no need of advanced photographic knowledge. For further information about how to use Kolor AutoPano Giga or PTGui for stitching, visit the NCTech Vimeo channel: vimeo.com/nctech. Here you can see an iSTAR image in equirectangular projection view.