2. Camera, Optics

Total pages: 16
File type: PDF, size: 1020 kB

2. Camera, Optics
E-mail: [email protected], http://web.yonsei.ac.kr/hgjung

2.1. Camera

Virtual image, perspective projection [1]
How large a pinhole? [1]
Shrinking the aperture [2]
Same function with a large pinhole: the lens [1]

Photoelectric Effect

In the photoelectric effect, electrons are emitted from matter (metals and non-metallic solids, liquids or gases) as a consequence of their absorption of energy from electromagnetic radiation of very short wavelength, such as visible or ultraviolet light. Electrons emitted in this manner may be referred to as "photoelectrons". (Symbol for photodiode; light-matter interaction.)
http://en.wikipedia.org/wiki/Photoelectric

Image Sensor

http://www.rocketroberts.com/astro/ccd_fundamentals.htm

An image sensor is a device that converts an optical image into an electric signal. It is used mostly in digital cameras and other imaging devices. Early sensors were video camera tubes; a modern one is typically a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) active pixel sensor.
A CCD image sensor on a flexible circuit board.
http://en.wikipedia.org/wiki/Image_sensor

Image Sensor: CCD

A charge-coupled device (CCD) is a device for the movement of electrical charge, usually from within the device to an area where the charge can be manipulated, for example converted into a digital value. This is achieved by "shifting" the signals between stages within the device one at a time. CCDs move charge between capacitive bins in the device, with the shift allowing for the transfer of charge between bins. The device is often integrated with an image sensor, such as a photoelectric device that produces the charge being read, making the CCD a major technology for digital imaging.

The charge packets (electrons, blue) are collected in potential wells (yellow) created by applying positive voltage at the gate electrodes (G). Applying positive voltage to the gate electrodes in the correct sequence transfers the charge packets.
http://en.wikipedia.org/wiki/Charge-coupled_device

Vertical smear.

Image Sensor: CMOS APS (Active Pixel Sensor)

An active-pixel sensor (APS) is an image sensor consisting of an integrated circuit containing an array of pixel sensors, each pixel containing a photodetector and an active amplifier. There are many types of active pixel sensors, including the CMOS APS used most commonly in cell-phone cameras, web cameras and some DSLRs. Such an image sensor is produced by a CMOS process (and is hence also known as a CMOS sensor), and has emerged as an alternative to CCD image sensors. (A three-transistor active pixel sensor.)
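The bucket-brigade shifting described above can be sketched as a toy one-dimensional register: on each clock cycle every charge packet moves one well toward the output node, and the packet at the output is digitized. All names and values here are illustrative, not from the slides:

```python
def clock_shift(wells):
    """One clock cycle: the packet in well 0 leaves the register for
    readout, every other packet moves one well toward the output, and
    an empty well enters at the far end."""
    readout = wells[0]
    shifted = wells[1:] + [0]
    return readout, shifted

def read_register(wells):
    """Read out a full line by repeated shifting (bucket-brigade readout)."""
    samples = []
    for _ in range(len(wells)):
        value, wells = clock_shift(wells)
        samples.append(value)
    return samples

print(read_register([5, 9, 2, 7]))  # [5, 9, 2, 7]
```

This also hints at why CCDs show vertical smear: if light keeps adding charge to the wells while the packets are being shifted, bright sources contaminate every well that passes under them.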
http://en.wikipedia.org/wiki/Active_pixel_sensor

강문식, 신경욱, "IT CookBook, 전자회로: 핵심 개념부터 응용까지" (IT CookBook, Electronic Circuits: From Core Concepts to Applications), 한빛미디어 (Hanbit Media).

Interlaced vs. Progressive Scan

With progressive scan, an image is captured, transmitted, and displayed in a path similar to text on a page: line by line, from top to bottom. The interlaced scan pattern in a CRT (cathode-ray tube) display completes such a scan too, but only for every second line, carried out from the top left corner to the bottom right corner of the display. The process is then repeated, this time starting at the second row, to fill in the gaps left behind by the first pass over alternate rows. When interlaced video is watched on a progressive monitor with poor deinterlacing, it exhibits combing wherever there is movement between the two fields of a frame.
http://en.wikipedia.org/wiki/Interlaced_video

Shuttering

- The better the image sensor's sensitivity, the higher the shutter speed can be and the smaller the motion blur.
- Rolling shutter vs. global shutter: to synchronize exposure, a global shutter is preferred.
http://www.vision-systems.com/articles/print/volume-10/issue-5/features/component-integration/auto-cameras-benefit-from-cmos-imagers.html

In photography, shutter speed is a common term for exposure time, the effective length of time a camera's shutter is open. The total exposure is proportional to this exposure time, i.e., to the duration of light reaching the film or image sensor. Short shutter times are preferred. Shutter speed can have a dramatic impact on the appearance of moving objects. (A pinwheel photographed at three different shutter speeds.)
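The trade-off stated above (higher shutter speed, less motion blur, but more sensitivity needed) can be made concrete with the pinhole model: an object moving at speed v parallel to the image plane at distance Z moves across the image at f·v/Z, so the smear during one exposure is that speed times the exposure time. All numbers below (vehicle speed, focal length, pixel pitch) are made-up illustrations:

```python
def motion_blur_px(v, Z, f, t_exp, pixel_pitch):
    """Approximate motion blur in pixels for an object moving at speed v (m/s)
    parallel to the image plane, at distance Z (m), with focal length f (m),
    exposure time t_exp (s) and pixel pitch (m).

    Pinhole model: image-plane speed = f * v / Z."""
    return f * v / Z * t_exp / pixel_pitch

# Car at 20 m/s seen from 10 m, f = 8 mm lens, 4.2 um pixels:
blur_slow = motion_blur_px(20, 10, 0.008, 1 / 60, 4.2e-6)    # ~63 px of smear
blur_fast = motion_blur_px(20, 10, 0.008, 1 / 1000, 4.2e-6)  # ~3.8 px
```

Cutting the exposure from 1/60 s to 1/1000 s reduces the smear by the same factor of ~17, but the sensor then collects ~17 times less light, which is exactly why the slide calls for a high-sensitivity device.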
Changes in background blurring are apparent from the need to adjust the aperture size to achieve proper exposure. The image may become dark, so a device with good sensitivity is needed. (A demonstration of the effect of exposure in night photography: longer shutter speeds result in increased exposure.)
http://en.wikipedia.org/wiki/Shutter_speed

A rolling shutter (also known as line scan) is a method of image acquisition in which each frame is recorded not from a snapshot of a single point in time, but by scanning across the frame either vertically or horizontally. In other words, not all parts of the image are recorded at exactly the same time, even though the whole frame is displayed at the same time during playback. This is in contrast with a global shutter, in which the entire frame is exposed during the same time window. A rolling shutter produces predictable distortions of fast-moving objects, or when the sensor captures rapid flashes of light. (Rolling shutter vs. frame (global) shutter.)
http://www.ptgrey.com/support/kb/index.asp?a=4&q=115

Automotive applications, which capture images while moving at high speed, require a global shutter.
http://en.wikipedia.org/wiki/Global_shutter

Shuttering: Stroboscopic Effect

Extraordinary stroboscopic effect: http://youtu.be/rVSh-au_9aM
Stroboscope, grinder: http://youtu.be/8mQaXaRVUoM

HDRC (High Dynamic Range CMOS) Camera

http://www.vision-systems.com/articles/print/volume-10/issue-5/features/component-integration/auto-cameras-benefit-from-cmos-imagers.html

Far-Infrared (FIR) Camera

http://en.wikipedia.org/wiki/Electromagnetic_wave

Flir Systems, "Path Finder" automotive infrared camera: http://youtu.be/PM9OcBpZaPo

Near-Infrared (NIR) Camera

Active-infrared night vision: the camera illuminates the scene at infrared wavelengths invisible to the human eye. Despite a dark back-lit scene, active-infrared night vision delivers identifying details, as seen on the display monitor.
http://en.wikipedia.org/wiki/Infrared

Color: Bayer Filter

How is a color image acquired?
http://en.wikipedia.org/wiki/Bayer_filter

Bayer demosaicing:
http://en.wikipedia.org/wiki/Demosaicing
http://www.sharplabs.com/2pfc/Demosaicing%20One.html

Color: Three-CCD Camera

http://en.wikipedia.org/wiki/Three-CCD_camera

Far-Infrared (FIR) Camera

How does a thermal image measure temperature?
http://globale-solutions.blogspot.kr/2012/11/thermographic-camera-thermal-imaging.html
http://logicalmystery.blogspot.kr/2012/09/thermal-camerasreally-why-updated.html

The radiant energy over the entire wavelength range (the area beneath each curve) increases with the fourth power of the temperature. These relationships were recognized by Stefan (1879) and Boltzmann (1884) and show that an unambiguous temperature can be measured from the radiation signal.
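The fourth-power relation above is the Stefan-Boltzmann law, M = σT⁴, and its monotonicity is what makes the temperature recoverable from the radiation signal. A minimal numerical sketch, assuming an ideal blackbody (emissivity 1); real thermographic cameras additionally correct for emissivity and atmospheric transmission:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiant_exitance(T):
    """Total power radiated per unit area by a blackbody at temperature T (K)."""
    return SIGMA * T**4

def temperature_from_exitance(M):
    """Invert the T^4 law: recover the unambiguous temperature (K) from a
    measured radiant exitance M (W/m^2) -- the principle behind thermography."""
    return (M / SIGMA) ** 0.25

T = 310.0                # roughly human skin temperature, K
M = radiant_exitance(T)  # about 524 W/m^2
assert abs(temperature_from_exitance(M) - T) < 1e-9
```

Because M grows as T⁴, even small temperature differences produce measurable changes in radiated power, which is why FIR cameras can resolve pedestrians against a cooler background at night.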