Efficient Ray Tracing Through Aspheric Lenses and Imperfect Bokeh Synthesis

Total Pages: 16

File Type: PDF, Size: 1020 KB

Eurographics Symposium on Rendering 2016
Volume 35 (2016), Number 4
E. Eisemann and E. Fiume (Guest Editors)

Efficient Ray Tracing Through Aspheric Lenses and Imperfect Bokeh Synthesis

Hyuntae Joo1, Soonhyeon Kwon1, Sangmin Lee1, Elmar Eisemann2 and Sungkil Lee†1
1 Sungkyunkwan University, South Korea
2 Delft University of Technology, Netherlands
† [email protected] (corresponding author)

Figure 1: Examples of bokeh textures generated with our ray-tracing solution.

Abstract
We present an efficient ray-tracing technique to render bokeh effects produced by parametric aspheric lenses. Contrary to conventional spherical lenses, aspheric lenses generally do not permit a simple closed-form solution of ray-surface intersections. We propose a numerical root-finding approach, which uses tight proxy surfaces to ensure good initialization and convergence behavior. Additionally, we simulate mechanical imperfections resulting from lens fabrication via a texture-based approach. A fractional Fourier transform and spectral dispersion add further realism to the synthesized bokeh. Our approach is well suited for execution on graphics processing units (GPUs), and we demonstrate complex defocus-blur and lens-flare effects.

Categories and Subject Descriptors (according to ACM CCS): I.3.3 [Computer Graphics]: Picture/Image Generation—Display Algorithms

1. Introduction

Optical systems are typically imperfect, and not all rays follow the expected optical path toward the sensor, which leads to optical aberrations [Smi00]. Classical systems use many spherical lens elements to reduce this effect. A recent alternative is the aspheric lens, whose profile does not correspond to a part of a sphere [KT80]. Aspheric lenses typically require fewer optical elements for a similar reduction of aberrations and, consequently, less weight. This property has made them a popular choice (e.g., in smartphone lenses, pancake lenses, and eyeglasses). Despite their widespread use in practice, aspheric lenses have hardly been investigated in computer graphics.

In this paper, we model and simulate aspheric lenses. Because their surface profile is usually nonlinear and non-spherical, there is no general closed-form solution for calculating ray-surface intersections, as exists for spherical lenses [KMH95, SDHL11, HESL11]. Instead, we present an efficient numerical scheme to trace rays through these lens elements. Additionally, aspheric lenses often contain small imperfections due to the fabrication process (e.g., mechanical grinding [LB07]), which influence out-of-focus blur. In particular, they can affect the intensity profile of bokeh, the aesthetic appearance of out-of-focus highlights, resulting in "onion-ring bokeh" [ST14]. We simulate this characteristic aspect by using a normal map derived from a virtual grinding process.

The contributions of this paper can be summarized as:
• an aspheric lens model, including imperfections;
• an efficient numerical aspheric-lens intersection method;
• a rendering solution for optical systems.

Our paper is organized as follows. We discuss previous work (Sec. 2) and introduce our aspheric lens model (Sec. 3), including imperfections (Sec. 4). We then cover our rendering algorithm and intersection test for aspheric lenses (Sec. 5) before presenting results (Sec. 6) and concluding (Sec. 7).
2. Related Work

This section reviews previous work on simulating optical systems and ray-tracing techniques for parametric surfaces.

2.1. Optical Simulation

Early approaches for optical simulation relied on simple optical models [Coo86, Che87], but simulating a full optical system has been of growing interest. Many approaches opted for additional realism in terms of geometric distortion, defocus blur, and spectral dispersion. They usually build upon formal data sheets specifying the optical system in terms of its optical elements. The seminal work of Kolb et al. obtained geometric accuracy for distortion and field of view [KMH95]. Steinert et al. included additional aberrations and spectral diversity using Monte-Carlo rendering [SDHL11]. Hullin et al. added reflections and anti-reflective coatings, which play a role for lens flares [HESL11]. They attained physically faithful image quality but were restricted to spherical lenses in order to simplify the ray-surface intersection tests. In this paper, we add support for aspheric lenses.

Optical imperfections can be classified into intrinsic and extrinsic ones. The former include aberrations and flares [HESL11, LE13]; the latter include issues due to mechanical manufacturing (e.g., grinding and polishing) and real-world artifacts (e.g., dirt or axis misalignment).

One of the most pronounced intrinsic imperfections is (spherical) aberration, which manifests in bokeh and can be simulated via ray tracing [LES10, WZH∗10, WZHX13]. For higher performance, early models captured the appearance phenomenologically [BW02]. Later approaches relied on efficient filtering: polygonal filtering [MRD12], separable filtering [MH14], low-rank linear filters [McG15], or look-up tables [GKK15].

Extrinsic imperfections are not easily incorporated into ideal parametric models [KMH95, WZH∗10, SDHL11, HESL11, WZHX13]. One attempt was to use artificial noise for diffraction at the aperture [RIF∗09, HESL11], whereas our method is driven by simulating the lens-fabrication process.

2.2. Ray Tracing of Parametric Surfaces

While many modern ray tracers focus on polygonal models [PBMH02, FS05], early approaches often included several parametric surface types, which can be classified into four categories: subdivision [Whi80], algebraic methods [Kaj82, Bli82, Han83, Man94], Bézier clipping [NSK90, EHS05], and numerical techniques [Tot85, JB86, AGM06].

In optics, much work addressed intersections with spherical surfaces, but few approaches covered aspheric lenses; typically, tessellation [MNN∗10], Delaunay triangles [OSRM09], or Newton-Raphson methods [AS52, For66] were applied. Tessellation is costly for sufficient precision, while the Newton-Raphson method is fast but unreliable for complex surfaces. These problems manifest themselves even more clearly when using limited-precision floating-point numbers, as is often the case for GPU approaches. In contrast, our solution relies on a bracketing method supported by tight proxy geometry to perform the intersection test, which works even for lower-precision floating-point values.

Figure 2: A spherical lens (a) can cause spherical aberration, while even a single aspheric lens (b) may focus well.

3. Aspheric-Lens Model

Here, we introduce our parametric aspheric-lens model. An aspheric lens adds nonlinear deviations to the base curvature c of a spherical surface (Figure 2), leading to a better convergence of peripheral rays on the focus plane. The common parametric form of an aspheric surface, with its sagittal profile z along the optical axis (the z-axis), is expressed in terms of the radial coordinate r from the optical axis as:

    z(r) := \frac{c r^2}{1 + \sqrt{1 - (1 + k)\, c^2 r^2}} + \sum_{i} A_{2i}\, r^{2i},    (1)

where k is a conic constant [FTG00] and the A_{2i} are coefficients of the even-polynomial terms; the coefficients of particular systems can be found in patents or lens-design books [Smi05].

The design of an aspheric lens goes through an iterative optimization procedure to minimize aberrations. The conic constant k is often a consequence of the lens' base shape from which the aspheric lens is derived; hence, the focus lies on optimizing the A_{2i}. The design requires high expertise and care in optics. We refer the reader to the textbooks for mathematical properties and practical usage [FTG00, Smi05].
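To make Eq. (1) concrete, the following C++ sketch evaluates the sag profile of an even asphere. It is not code from the paper: the AsphericSurface struct, the coefficient layout (A[0], A[1], ... multiplying r^4, r^6, ...), and the example values are illustrative assumptions; real coefficients would come from a patent or lens data sheet.

    // Minimal sketch (assumed, not from the paper) of the even-asphere sag in Eq. (1).
    #include <algorithm>
    #include <cmath>
    #include <cstdio>
    #include <vector>

    struct AsphericSurface {
        double c;               // base curvature (1 / radius of curvature)
        double k;               // conic constant
        std::vector<double> A;  // even-order coefficients A4, A6, A8, ... (assumed layout)
    };

    // Sagittal profile z(r): conic base term plus even-polynomial deviations.
    double sag(const AsphericSurface& s, double r) {
        const double r2   = r * r;
        const double disc = 1.0 - (1.0 + s.k) * s.c * s.c * r2;  // valid inside the clear aperture
        double z  = s.c * r2 / (1.0 + std::sqrt(std::max(disc, 0.0)));
        double rp = r2 * r2;                                     // polynomial terms start at r^4 here
        for (double a : s.A) {
            z += a * rp;
            rp *= r2;
        }
        return z;
    }

    int main() {
        AsphericSurface s{0.05, -1.0, {1e-5, -2e-8}};  // placeholder values, not a real lens
        std::printf("z(r = 2.0) = %f\n", sag(s, 2.0));
        return 0;
    }

Evaluating this profile is cheap, which is what makes the numerical intersection test discussed later feasible per ray on the GPU.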
4. Imperfections

In the real world, the fabrication of large spherical/aspheric lenses involves mechanical grinding and polishing [LB07] (Figure 3). The process usually has several phases; for instance, spherical grinding, precision grinding, and polishing are applied in sequence [FTG00]. Similarly, in precision glass molding, a mold is carved, which will then exhibit the imperfections that are passed on to the lens when it is produced with the mold. The processes are not entirely standardized and, although the various phases have been improved over the years, the result is still not perfect.

The most visible artifacts resulting from the fabrication process are due to a misalignment between the grinding and optical axes, grinding asymmetries resulting from the swinging of the tools, and limited grinding precision. A few nanometers of deviation can lead to a visible impact on bokeh. We model the resulting imperfections indirectly by simulating the grinding process itself. Following the fabrication

Figure 3: (a) grinding, (b) polishing (swinging tool).

Offline preprocessing:
    load an optical system
    for each aspheric lens
        - find proxy surfaces

Online rendering:
    for each RGB channel
        for each ray
            for each surface element s
                if s is an aperture
                    apply aperture stop at intersection
                else if s is an aspheric lens
                    find analytical intersection with proxies
                    test numerical intersection
                    refract ray (with
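The listing above ends where the excerpt is cut off, but the "test numerical intersection" step it refers to can be illustrated with a simple bracketing scheme in the spirit of the method mentioned at the end of Sec. 2.2. The C++ sketch below is an assumption-laden illustration, not the authors' exact algorithm: it reuses the AsphericSurface and sag() helpers from the earlier sketch, assumes the ray is expressed in the surface's local frame (vertex at the origin, optical axis along z), and assumes the bracket [t0, t1] comes from closed-form intersections with proxy surfaces that enclose the aspheric profile.

    // Hedged sketch of a bracketed (bisection) ray-asphere intersection; append to the
    // previous sketch. [t0, t1] is assumed to bracket the surface via tight proxies.
    struct Ray { double ox, oy, oz, dx, dy, dz; };  // origin and (normalized) direction

    // Signed gap along z between the ray point at parameter t and the lens surface.
    static double surfaceGap(const AsphericSurface& s, const Ray& r, double t) {
        const double x = r.ox + t * r.dx;
        const double y = r.oy + t * r.dy;
        const double z = r.oz + t * r.dz;
        return z - sag(s, std::sqrt(x * x + y * y));
    }

    // Bisection keeps the sign change inside [t0, t1]; unlike Newton-Raphson it cannot
    // diverge, which also makes it tolerant of single-precision GPU arithmetic.
    bool intersectAspheric(const AsphericSurface& s, const Ray& r,
                           double t0, double t1, double* tHit) {
        double f0 = surfaceGap(s, r, t0);
        if (f0 * surfaceGap(s, r, t1) > 0.0) return false;  // proxies did not bracket a crossing
        for (int i = 0; i < 48; ++i) {                      // fixed iteration count
            const double tm = 0.5 * (t0 + t1);
            const double fm = surfaceGap(s, r, tm);
            if (f0 * fm <= 0.0) { t1 = tm; }                // root lies in [t0, tm]
            else                { t0 = tm; f0 = fm; }       // root lies in [tm, t1]
        }
        *tHit = 0.5 * (t0 + t1);
        return true;
    }

The paper's full pipeline additionally handles the aperture stop, refraction, and per-channel dispersion as outlined in the listing above; the sketch only covers the root-finding step.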