Stitching Software


[Chapter-opening panorama: "Can't See the Forest for the Trees …", Liepaja, Latvia]
Shot Using: Nikon D70, Nikkor AF-S f3.5-4.5/24-85 mm zoom lens, hand-held, 5 images using a 24 mm focal length
Stitching Software: PTMac (alignment, blending), Photoshop (parallax error corrections, image fine-tuning)
Projection Type: Cylindrical, 150° × 35°

Don't Lose Your Way in the Software Diversity Jungle

Working with a computer is a highly significant part of the process of creating digital panoramas from multiple source images. So why are we dedicating so much time and column space to theory? Firstly, it is simply interesting to understand the background and the connections between the individual parts of the process; and secondly, it would otherwise be too easy to become disoriented by the complex choice of processing options and parameters offered by professional stitching software. Some background knowledge helps you to avoid making basic mistakes from the outset, or at least allows you to correct your mistakes effectively during the stitching process, making the results produced by the computer sometimes less of a surprise.

Stitching software is available in many forms, from free, highly specialized open-source algorithms right up to complex (and sometimes expensive) software suites that offer functionality way beyond the stitching process itself. The handling and performance of these programs is as varied as their design and price range. Most programs offer free downloadable test versions, and most are also supported by Internet forums, managed either by independent third parties or by the software manufacturers themselves. A little time spent surfing through these forums will help you decide whether a particular program suits your level of creativity and experience, and whether the accompanying documentation is appropriate to your personal style and workflow.

Internet Panorama Photography Forums
• International VR Photography Association: http://ivrpa.org/forum
• Panoguide: http://www.panoguide.com/forums
• Panotools: http://tech.groups.yahoo.com/group/PanoToolsNG
• QuickTime VR: http://lists.apple.com/mailman/listinfo/quicktime-vr
• PTMac: http://www.kekus.com/forum
• Autodesk Stitcher: http://discussion.autodesk.com/forums
• PTGui: http://groups.google.com/group/ptgui
• Max Lyons: http://www.tawbaware.com/forum2

The introduction of Apple QuickTime in 1991 was one of the major starting signals for the subsequent development of multimedia software. QuickTime VR appeared in 1994, and QuickTime VR Authoring Studio (QTVRAS) followed in 1997. QTVRAS was the first commercially available software that enabled users to create 360° panoramas using computer technology instead of special cameras, and to display the results as a rolled-flat, cylindrical image or interactively on a computer monitor using QuickTime Player. Since the introduction of QuickTime 5 in 2001, 360° × 180° panoramas can also be viewed interactively. These days, there are a multitude of stitching and viewing programs available, often in the form of Java applets and Shockwave or Flash applications.

[Image 1: QuickTime VR Authoring Studio]

In 1998, mathematics professor Helmut Dersch at the University of Furtwangen in Germany introduced his Panorama Tools software suite. He has since earned himself a legendary international reputation among panorama fans, and his program also provided a strong impulse for the further development of panorama software in general. Panorama Tools is basically a loose collection of software tools designed to distort, correct, align, and transform digital images for the specific purpose of creating digital panorama images, with excellent results. The only problem is that the tools themselves are not easy to operate and do not have their own graphical user interface. In order to operate the tools effectively, you need a fair amount of background knowledge, and you must also use a command line to execute some tasks. The main aspects of the necessary background knowledge have been addressed graphically (if not mathematically) in the previous chapters, so you shouldn't have much trouble coming to grips with the following sections. Additionally, in order to allow non-physicists and non-programmers access to these great tools, various groups have programmed user-friendly GUIs for the Panorama Tools algorithms. PTGui (Panorama Tools Graphical User Interface, introduced in 2000) and PTMac (Panorama Tools for Apple Macintosh, introduced in 2002) are both based entirely on Panorama Tools and are still clearly related to the original program.

[Image 1: The Panorama Tools Editor in version 2.6.1 of the program]

REALVIZ Stitcher (now called Autodesk Stitcher) was probably the first generally available, modern spherical panorama stitching program with a user-friendly GUI. While Panorama Tools (and others) offered purely manual panorama stitching functionality, REALVIZ offered a highly automated stitching process without manual control point functionality. Image alignment was also performed semi-automatically, and was unfortunately not always reliable. Until recently, if you wanted to produce high-quality panoramas for printing, you would most likely end up using a program based on one of Professor Dersch's algorithms, in spite of the steep learning curve involved.

All of these tools were part of the early development stages of digital photographic blending techniques. They demanded perfect parallax correction and exposure, as well as excellent alignment and composition on the part of the photographer, in order to produce error-free panoramas.

Without going into too much detail, we now compare four currently available, professional-level stitching programs that offer very different handling characteristics and diverse functionality. But before we go any further, we would like to point out that recent developments in stitching software algorithms have led to an ever-increasing level of fault tolerance for the overlap areas in panorama source images. Until recently, it was necessary to shoot precisely aligned and exposed source material in order to produce acceptable results. Modern blending algorithms, such as Enblend and Smartblend, can produce astonishing results even if the source material is not of optimum quality. We can be fairly certain that the available blending algorithms will continue to improve, and that the near future will bring even more exciting developments.

The Perfectionist: New House Software – PTGui Pro

PTGui is well known and extremely popular among professional panorama photographers. Although it had its origins as a GUI for Helmut Dersch's Panorama Tools, it is constantly being developed and expanded, and is now only loosely based on the original. Today's PTGui not only includes all of its original (and complex) command line-based stitching tools, but also a suite of automatic, GUI-based tools for aligning and merging images. As we have already […] at various stages in the stitching process. You can enter lens correction parameters manually, and, should you not want to use the program's built-in control point generator, you can align your individual source images by manually entering tilt, pan, and roll coordinates, with the mouse, or by selecting your own control points. You can also use additional plug-ins, such as Autopano, Enblend, or Smartblend, to control the alignment and blending processes.

The 7.0 and 8.0 versions of PTGui significantly increase the program's range of functionality and go a long way beyond simple panorama stitching. Constant program maintenance and fine-tuning have markedly improved the automatic control point selection function, the program's own blending algorithm, and general processing speed (no need to be scared of gigapixels!). The program now includes built-in QuickTime VR functionality, allowing you to output images directly to the interactive digital panorama format without having to use additional, external tools.

[Image 7: The PTGui Pro user interface in Advanced mode]

While most stitching programs offer only the three standard projection types (flat, cylindrical, and spherical), PTGui has included Mercator and fisheye projections for a while now, as well as various transverse projection types for use with vertically stitched panoramas. The Mercator projection produces results similar to a cylindrical projection, but with significantly less distortion at the upper and lower edges of an image; it is thus a very useful addition for producing printed panoramas. The program even allows an adjustable Vedutismo projection type; these projection types offer a real plus when it comes to producing printed output.

The automatic EXIF data recognition function helps to realize large stitching projects at least semi-automatically, and the program's outstanding batch and template functionality allows you to render hundreds of projects overnight. Additional support for HDR and bracketing sequences rounds out PTGui's functionality, which also includes Exposure Fusion algorithms as an alternative […]

In a nutshell: PTGui is very fast and offers an extremely wide range of useful functions. Once you get used to the somewhat cryptic user interface, it quickly becomes a pleasure to use the program to produce perfect panoramas.

The Ghost Buster: Autodesk (REALVIZ) Stitcher

Bought by Autodesk in 2008, this program was probably the first to offer user-friendly […]
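The command-line workflow of Panorama Tools described above revolves around a plain-text project script. The fragment below is only a rough sketch of the general shape of such a script; all file names, image sizes, and angles are invented for illustration. A p line describes the output panorama, i lines describe the source images with their yaw, pitch, and roll angles, and c lines link control points between image pairs.

```
# Hypothetical Panorama Tools-style project script (all values invented)
p f1 w4000 h1200 v150 n"JPEG q90"              # panorama: f1 = cylindrical, 150° horizontal FoV
i f0 w3008 h2000 v61 y-24 p0 r0 n"img01.jpg"   # image: f0 = rectilinear lens, yaw -24°
i f0 w3008 h2000 v61 y0   p0 r0 n"img02.jpg"
i f0 w3008 h2000 v61 y24  p0 r0 n"img03.jpg"
c n0 N1 x2700 y980 X310 Y975                   # control point pairing images 0 and 1
```

GUI front ends such as PTGui and PTMac essentially generate and edit scripts of this kind behind the scenes.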
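Aligning a source image by entering tilt, pan, and roll values, as described above, amounts to composing three axis rotations applied to each viewing direction. The sketch below illustrates the idea; the axis conventions and composition order are an assumption for illustration, and real stitchers differ in both.

```python
import math

def rotate(v, pan=0.0, tilt=0.0, roll=0.0):
    """Rotate a direction vector v = (x, y, z): roll about the viewing
    axis (z), then tilt about the horizontal axis (x), then pan about
    the vertical axis (y). Angles in radians."""
    x, y, z = v
    # roll (about z)
    x, y = (x * math.cos(roll) - y * math.sin(roll),
            x * math.sin(roll) + y * math.cos(roll))
    # tilt (about x)
    y, z = (y * math.cos(tilt) - z * math.sin(tilt),
            y * math.sin(tilt) + z * math.cos(tilt))
    # pan (about y)
    x, z = (x * math.cos(pan) + z * math.sin(pan),
            -x * math.sin(pan) + z * math.cos(pan))
    return (x, y, z)

# Panning the forward direction (0, 0, 1) by 90° swings it onto the x axis:
print(rotate((0.0, 0.0, 1.0), pan=math.pi / 2))
```

Entering these three values per image, as PTGui permits, fixes each source image's orientation on the panosphere before remapping and blending.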
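The difference between the cylindrical and Mercator projections noted above comes down to how the vertical image coordinate grows with latitude. A minimal sketch using the standard map-projection formulas (not PTGui's actual code) shows why Mercator distorts the upper and lower image edges less:

```python
import math

def cylindrical_y(lat):
    """Central cylindrical projection: y = tan(latitude)."""
    return math.tan(lat)

def mercator_y(lat):
    """Mercator projection: y = ln(tan(pi/4 + latitude/2))."""
    return math.log(math.tan(math.pi / 4 + lat / 2))

# The higher the latitude, the more the cylindrical projection
# stretches the image vertically compared to Mercator.
for deg in (0, 30, 60, 75):
    lat = math.radians(deg)
    print(f"{deg:2d} deg: cylindrical y = {cylindrical_y(lat):6.3f}, "
          f"mercator y = {mercator_y(lat):6.3f}")
```

Both projections agree near the horizon, but toward the top and bottom of the frame the cylindrical y value grows much faster, which is why Mercator output is friendlier for print.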
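Blending tools such as Enblend work with multi-resolution (multi-band) splines; the sketch below shows only their simplest relative, a single-band linear feather across the overlap of two scanlines, to illustrate why overlapping source images can merge without a visible seam. This is pure illustration, not any shipping algorithm.

```python
def feather_blend(a, b, overlap):
    """Cross-fade two scanlines: the last `overlap` samples of `a`
    cover the same scene content as the first `overlap` samples of `b`."""
    out = list(a[:len(a) - overlap])
    for i in range(overlap):
        w = (i + 1) / (overlap + 1)  # blend weight ramps from a toward b
        out.append((1 - w) * a[len(a) - overlap + i] + w * b[i])
    out.extend(b[overlap:])
    return out

# Two exposures that disagree in brightness meet in a smooth ramp
# instead of a hard seam:
print(feather_blend([10, 10, 10, 10], [20, 20, 20, 20], overlap=2))
```

The fault tolerance of modern blenders comes from doing this kind of cross-fade separately per frequency band, so small misalignments and exposure differences are smeared out invisibly.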