Topic 3: Operation of Simple Lens


Aim: This topic covers imaging by a simple lens using Fresnel diffraction, resolution limits, and the basics of aberration theory.

Contents:
1. Phase and Pupil Functions of a Lens
2. Image of an Axial Point
3. Example of a Round Lens
4. Diffraction Limit of a Lens
5. Defocus
6. The Strehl Limit
7. Other Aberrations

Ray Model

[Figure: object at distance u, image at distance v, lens of focal length f]

Simple ray optics gives the imaging property

    \frac{1}{u} + \frac{1}{v} = \frac{1}{f}

where the focal length is given by

    \frac{1}{f} = (n - 1)\left(\frac{1}{R_1} + \frac{1}{R_2}\right)

For an infinite object, ray optics gives a delta function at the focus, a distance f behind the lens. The lens introduces a path-length difference, or PHASE SHIFT.

Phase Function of a Lens

[Figure: lens of refractive index n and axial thickness \Delta between points P_0 and P_1; a ray at height h passes through surface sags \delta_1 and \delta_2 of the surfaces with radii R_1 and R_2]

With no lens, the phase shift between P_0 and P_1 is

    \Phi = k\Delta, \qquad k = \frac{2\pi}{\lambda}

With the lens in place, at distance h from the optical axis,

    \Phi = \underbrace{k(\delta_1 + \delta_2)}_{\text{air}} + \underbrace{k n (\Delta - \delta_1 - \delta_2)}_{\text{glass}}

which can be rearranged to give

    \Phi = k n \Delta - k(n - 1)(\delta_1 + \delta_2)

where \delta_1 and \delta_2 depend on h, the ray height.

Parabolic Approximation

Lens surfaces are spherical, but if R_1, R_2 \gg h we may take the parabolic approximation

    \delta_1 = \frac{h^2}{2 R_1} \qquad \text{and} \qquad \delta_2 = \frac{h^2}{2 R_2}

so that

    \Phi(h) = k \Delta n - \frac{k h^2}{2}(n - 1)\left(\frac{1}{R_1} + \frac{1}{R_2}\right)

Substituting for the focal length, we get

    \Phi(h) = k \Delta n - \frac{k h^2}{2 f}

In two dimensions h^2 = x^2 + y^2, so that

    \Phi(x, y; f) = k \Delta n - \frac{k (x^2 + y^2)}{2 f}

which is the phase function of the lens.

Note: in many cases the constant k \Delta n can be ignored, since absolute phase is not usually important. The difference between parabolic and spherical surfaces will be considered in the next lecture.

Pupil Function

The pupil function p(x, y) defines the physical size and shape of the lens, so the total effect of a lens of focal length f is

    p(x, y) \exp(i \Phi(x, y; f))

For a circular lens of radius a,

    p(x, y) = 1 \quad \text{if } x^2 + y^2 \le a^2, \qquad p(x, y) = 0 \quad \text{otherwise}

The circular lens is the most common, but all the following results apply equally well to other shapes. The pupil function of a simple lens is real and positive; later it will be used to include aberrations and will then become complex.
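As a concrete illustration of the pupil-times-phase description above, the short numerical sketch below samples p(x, y) exp(i \Phi(x, y; f)) for a circular lens on a grid. This example is not part of the original notes, and the wavelength, focal length, radius and grid size are arbitrary illustrative values.

# Sketch only: sample the pupil-plane factor p(x,y) * exp(i * Phi(x,y; f))
# of a thin circular lens. All numerical values are illustrative assumptions.
import numpy as np

wavelength = 633e-9          # wavelength [m] (assumed)
f = 0.1                      # focal length [m] (assumed)
a = 5e-3                     # lens radius [m] (assumed)
k = 2 * np.pi / wavelength   # wavenumber k = 2*pi/lambda

N = 512                                   # samples per side (assumed)
x = np.linspace(-1.5 * a, 1.5 * a, N)     # pupil-plane coordinates [m]
X, Y = np.meshgrid(x, x)

# Pupil function: 1 inside the circular aperture, 0 outside.
p = np.where(X**2 + Y**2 <= a**2, 1.0, 0.0)

# Phase function of the lens, dropping the constant k*n*Delta term
# (absolute phase is not usually important).
phi = -k * (X**2 + Y**2) / (2 * f)

# Total effect of the thin lens: multiply any incident field by this array.
lens = p * np.exp(1j * phi)

In the thin-lens step of the derivation that follows, this array is exactly the factor that multiplies the field arriving at the lens plane.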
Fresnel Image of an Axial Point

We will now consider the imaging of an axial point using the Fresnel propagation equations from the last lecture. Consider the system:

[Figure: point source u_0(x, y) in plane P_0; thin lens of focal length f between planes P_1 and P_1'; field u_2(x, y) in plane P_2; distance z_0 from P_0 to P_1 and z_1 from P_1' to P_2]

In plane P_0 we have

    u_0(x, y) = A_0 \, \delta(x, y)

so in plane P_1, a distance z_0 away,

    u_1(x, y; z_0) = h(x, y; z_0) \ast A_0 \, \delta(x, y) = A_0 \, h(x, y; z_0)

where h(x, y; z) is the free-space impulse response function and \ast denotes convolution.

If we now assume that the lens is thin, there is no diffraction between planes P_1 and P_1', so in P_1' we have

    u'(x, y; z_0) = A_0 \, h(x, y; z_0) \, p(x, y) \exp(i \Phi(x, y; f))

and finally in P_2, a further distance z_1 away,

    u_2(x, y) = u'(x, y; z_0) \ast h(x, y; z_1) = \left[ A_0 \, h(x, y; z_0) \, p(x, y) \exp(i \Phi(x, y; f)) \right] \ast h(x, y; z_1)

Take the Fresnel approximation, where

    h(x, y; z) = \frac{\exp(i k z)}{i \lambda z} \exp\left( i \frac{k}{2 z} (x^2 + y^2) \right)

and

    \Phi(x, y; f) = k n \Delta - \frac{k}{2 f} (x^2 + y^2)

so we get

    u_1(x, y; z_0) = \frac{A_0 \exp(i k z_0)}{i \lambda z_0} \exp\left( i \frac{k}{2 z_0} (x^2 + y^2) \right)

Then, by further substitution,

    u'(x, y; z_0) = \frac{A_0 \exp(i k (z_0 + n \Delta))}{i \lambda z_0} \, p(x, y) \exp\left( i \frac{k}{2} \left( \frac{1}{z_0} - \frac{1}{f} \right) (x^2 + y^2) \right)

Finally, we can substitute and expand to get

    u_2(x, y) = \underbrace{\frac{A_0}{\lambda^2 z_0 z_1} \exp(i k (z_0 + z_1 + n \Delta))}_{1} \; \underbrace{\exp\left( i \frac{k}{2 z_1} (x^2 + y^2) \right)}_{2} \int\!\!\int \underbrace{p(s, t) \exp\left( i \frac{k}{2} (s^2 + t^2) \left( \frac{1}{z_0} + \frac{1}{z_1} - \frac{1}{f} \right) \right)}_{3} \; \underbrace{\exp\left( -i \frac{k}{z_1} (s x + t y) \right)}_{4} \, ds \, dt

Look at the terms:

1. A constant: its amplitude gives the absolute brightness; its phase is not measurable.
2. A quadratic phase term in plane P_2: no effect on intensity, and usually ignored.
3. A quadratic phase term across the pupil, depending on both z_0 and z_1.
4. A scaled Fourier transform of the pupil function.

If we select the location of plane P_2 such that

    \frac{1}{z_0} + \frac{1}{z_1} - \frac{1}{f} = 0

(note: the same expression as ray optics), then we can write

    u_2(x, y) = B_0 \exp\left( i \frac{k}{2 z_1} (x^2 + y^2) \right) \int\!\!\int p(s, t) \exp\left( -i \frac{k}{z_1} (s x + t y) \right) ds \, dt

so in plane P_2 we have the scaled Fourier transform of the pupil function (plus a phase term). The intensity in plane P_2 is thus

    g(x, y) = |u_2(x, y)|^2 = B_0^2 \left| \int\!\!\int p(s, t) \exp\left( -i \frac{k}{z_1} (s x + t y) \right) ds \, dt \right|^2

which is the scaled power spectrum of the pupil function.

Image of a Distant Object

For a distant object z_0 \to \infty and z_1 \to f.

[Figure: plane wave incident on pupil p(x, y) at planes P_1/P_1'; field u_2(x, y) in plane P_2 a distance f beyond the lens]

The amplitude in P_2 then becomes

    u_2(x, y) = B_0 \exp\left( i \frac{k}{2 f} (x^2 + y^2) \right) \int\!\!\int p(s, t) \exp\left( -i \frac{k}{f} (s x + t y) \right) ds \, dt

which is the scaled Fourier transform of the pupil function. The intensity is therefore

    g(x, y) = B_0^2 \left| \int\!\!\int p(s, t) \exp\left( -i \frac{k}{f} (s x + t y) \right) ds \, dt \right|^2

which is known as the Point Spread Function (PSF) of the lens. (Key result.)

Note on units: x, y, s, t all have units of length (m), and the scaled Fourier kernel is

    \exp\left( -i \frac{2\pi}{\lambda f} (s x + t y) \right)

Simple Round Lens

Consider the case of a round lens of radius a. The amplitude in P_2 becomes

    u_2(x, y) = \hat{B}_0 \int\!\!\int_{s^2 + t^2 \le a^2} \exp\left( -i \frac{k}{f} (s x + t y) \right) ds \, dt

where the external phase term has been absorbed into \hat{B}_0. This can be integrated, using standard results (see the Physics 3 Optics notes or the tutorial solution), to give

    u_2(x, 0) = 2 \pi \hat{B}_0 a^2 \, \frac{J_1(k a x / f)}{k a x / f}

where J_1 is the first-order Bessel function.
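This Airy-pattern result can be checked numerically. The sketch below is not part of the original notes; the wavelength, focal length and aperture radius are assumed illustrative values. It computes the focal-plane intensity as the scaled power spectrum of a sampled circular pupil and compares a cut through it with the analytic profile |2 J_1(v)/v|^2, with v = k a x / f.

# Sketch only: PSF of a circular lens as the scaled power spectrum of its
# pupil, compared with the analytic Airy profile. Values are assumptions.
import numpy as np
from scipy.special import j1

wavelength = 633e-9          # wavelength [m] (assumed)
f = 0.1                      # focal length [m] (assumed)
a = 5e-3                     # aperture radius [m] (assumed)
k = 2 * np.pi / wavelength

# Sample the pupil on a grid much wider than the aperture so that the FFT
# gives a finely sampled focal plane.
N, L = 2048, 80e-3                        # grid points, grid width [m]
x = (np.arange(N) - N // 2) * (L / N)     # pupil-plane coordinates [m]
X, Y = np.meshgrid(x, x)
pupil = (X**2 + Y**2 <= a**2).astype(float)

# Scaled Fourier transform of the pupil; the focal-plane coordinate is
# x' = lambda * f * (spatial frequency).
u2 = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(pupil)))
freqs = np.fft.fftshift(np.fft.fftfreq(N, d=L / N))
x_image = wavelength * f * freqs          # image-plane coordinate [m]

psf_fft = np.abs(u2[N // 2])**2           # cut along y = 0
psf_fft /= psf_fft.max()

# Analytic Airy profile along x: |2 J1(v)/v|^2 with v = k*a*x/f.
v = k * a * x_image / f
v[v == 0] = 1e-12                         # avoid 0/0 at the centre
psf_airy = (2 * j1(v) / v)**2

print("max |FFT - Airy| along the cut:", np.max(np.abs(psf_fft - psf_airy)))

The two profiles should agree to within the discretisation error of the pixelated aperture edge; increasing N (and the grid width L) tightens the match.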