The Spatial Frequency Domain


MIT 2.71/2.710 Optics, 10/31/05, week 9a

Recall: plane wave propagation. For a plane wave traveling at angle θ to the optical axis z, the path delay increases linearly with x, so the field is

  E0 exp[ i2π (sinθ/λ) x + i2π (cosθ/λ) z ],

with the plane of observation at z = 0.

Spatial frequency ⇔ angle of propagation. The cross-section of the optical field at the plane of observation is a sinusoid of the form

  E0 exp[ i2π (sinθ/λ) x + φ0 ],

i.e. it looks like E0 exp(i2π u x + φ0), where

  u ≡ sinθ / λ

is called the spatial frequency.

2D sinusoids. The sinusoid E0 cos[2π(ux + vy)] has the corresponding phasor E0 exp[i2π(ux + vy)]. Its period along x is 1/u, its period along y is 1/v, its fringes make an angle ψ with tanψ = u/v, and the fringe spacing is

  Λ = 1 / √(u² + v²).

Spatial (2D) Fourier transforms

The 2D Fourier integral (aka the inverse Fourier transform)

  g(x, y) = ∬ G(u, v) e^{+i2π(ux + vy)} du dv

is a superposition of sinusoids; the complex weight G(u, v) expresses the relative amplitude (magnitude and phase) of the superposed sinusoids.

The 2D Fourier transform. The complex weight coefficients G(u, v), aka the Fourier transform of g(x, y), are calculated from the integral

  G(u, v) = ∬ g(x, y) e^{−i2π(ux + vy)} dx dy.

(In 1D, so we can draw it easily: Re G(u) = ∫ g(x) Re[e^{−i2πux}] dx.)
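The u ≡ sinθ/λ correspondence and the fringe-spacing formula can be checked numerically. The sketch below is an illustration using numpy, not part of the lecture; the grid size and the frequencies (u0, v0) are arbitrary choices. It synthesizes a tilted 2D sinusoid and confirms that its discrete Fourier transform peaks at (u, v):

```python
import numpy as np

# Sample one unit square with N points; pixel size is 1/N
N = 256
x = np.arange(N) / N
X, Y = np.meshgrid(x, x)           # X varies along columns, Y along rows

u0, v0 = 10.0, 6.0                 # cycles per unit length along x and y
g = np.cos(2 * np.pi * (u0 * X + v0 * Y))

# Discrete Fourier transform; frequency axis in cycles per unit length
G = np.fft.fft2(g)
fx = np.fft.fftfreq(N, d=1 / N)

# Locate the spectral peak in the positive-frequency quadrant
iy, ix = np.unravel_index(np.argmax(np.abs(G[: N // 2, : N // 2])),
                          (N // 2, N // 2))
print(fx[ix], fx[iy])              # → 10.0 6.0

# Fringe spacing of the 2D sinusoid: Λ = 1/√(u² + v²)
Lam = 1 / np.hypot(u0, v0)
```

Because the grid spans exactly one unit, the FFT bins fall on integer cycles per unit length, so the peak lands exactly at (u, v) = (10, 6).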
2D Fourier transform pairs (from Goodman, Introduction to Fourier Optics, page 14).

Space and spatial frequency representations:
• Space domain: g(x, y).
• 2D Fourier transform: G(u, v) = ∬ g(x, y) e^{−i2π(ux + vy)} dx dy.
• 2D Fourier integral, aka inverse 2D Fourier transform: g(x, y) = ∬ G(u, v) e^{+i2π(ux + vy)} du dv.
• Spatial frequency domain: G(u, v).

Periodic grating /1 (vertical): a vertical grating in the space domain maps to a discrete set of components along the u axis of the frequency (Fourier) domain.

Periodic grating /2 (tilted): tilting the grating in the space domain rotates its spectrum by the same angle in the frequency domain.

Superposition: multiple gratings added in the space domain contribute their spectra additively in the frequency domain. This is the spatial frequency representation of a superposition: g(x, y) in the space domain, G(u, v) = ℑ{g(x, y)} in the spatial frequency domain.

Fourier transform properties /1
• Fourier transforms and the delta function:
  ℑ{δ(x, y)} = 1,
  ℑ{exp[i2π(u0 x + v0 y)]} = δ(u − u0) δ(v − v0).
• Linearity of Fourier transforms: if ℑ{g1(x, y)} = G1(u, v) and ℑ{g2(x, y)} = G2(u, v), then
  ℑ{a1 g1(x, y) + a2 g2(x, y)} = a1 G1(u, v) + a2 G2(u, v)
for any pair of complex numbers a1, a2.
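Both properties can be spot-checked with the discrete transform. The numpy sketch below is illustrative and not part of the notes; the grid size and the coefficients a1, a2 are arbitrary. A discrete impulse at the origin transforms to the constant 1, and the transform is linear:

```python
import numpy as np

N = 64
# Discrete impulse at the origin: its 2D DFT is identically 1
delta = np.zeros((N, N))
delta[0, 0] = 1.0
D = np.fft.fft2(delta)
assert np.allclose(D, 1.0)

# Linearity: F{a1 g1 + a2 g2} = a1 G1 + a2 G2
rng = np.random.default_rng(0)
g1, g2 = rng.standard_normal((2, N, N))
a1, a2 = 2.0 - 1.0j, 0.5 + 3.0j
lhs = np.fft.fft2(a1 * g1 + a2 * g2)
rhs = a1 * np.fft.fft2(g1) + a2 * np.fft.fft2(g2)
assert np.allclose(lhs, rhs)
```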
Fourier transform properties /2
Let ℑ{g(x, y)} = G(u, v).
• Shift theorem (space → frequency): ℑ{g(x − x0, y − y0)} = G(u, v) exp[−i2π(u x0 + v y0)].
• Shift theorem (frequency → space): ℑ{g(x, y) exp[i2π(u0 x + v0 y)]} = G(u − u0, v − v0).
• Scaling theorem: ℑ{g(ax, by)} = (1/|ab|) G(u/a, v/b).

Modulation in the frequency domain illustrates the shift theorem; the relationship between the size of an object and its frequency content illustrates the scaling theorem (shrinking an object in the space domain broadens its spectrum).

Fourier transform properties /3
Let ℑ{f(x, y)} = F(u, v) and ℑ{h(x, y)} = H(u, v).
Let g(x, y) = ∬ f(x′, y′) h(x − x′, y − y′) dx′ dy′.
• Convolution theorem (space → frequency): ℑ{g(x, y)} = F(u, v) · H(u, v).
Let Q(u, v) = ∬ F(u′, v′) H(u − u′, v − v′) du′ dv′.
• Convolution theorem (frequency → space): Q(u, v) = ℑ{f(x, y) · h(x, y)}.

Fourier transform properties /4
Let ℑ{f(x, y)} = F(u, v) and ℑ{h(x, y)} = H(u, v).
Let g(x, y) = ∬ f(x′, y′) h(x′ − x, y′ − y) dx′ dy′.
• Correlation theorem (space → frequency): ℑ{g(x, y)} = F(u, v) · H*(u, v).
Let Q(u, v) = ∬ F(u′, v′) H(u′ − u, v′ − v) du′ dv′.
• Correlation theorem (frequency → space): Q(u, v) = ℑ{f(x, y) · h*(x, y)}.

Spatial frequency representation and spatial filtering. A pattern built from 3 sinusoids appears in the Fourier (spatial frequency) domain as a few discrete components; removing one of them (leaving 2 sinusoids) is the simplest example of spatial filtering. More generally, for an image g(x, y) with spectrum G(u, v) = ℑ{g(x, y)}:
• spatial filtering (low-pass) removes the high-frequency content;
• spatial filtering (band-pass) removes both the high- and the low-frequency content.

Example: optical lithography. When the original pattern ("nested L's") undergoes mild low-pass filtering, notice (i) blurring at the edges and (ii) ringing; severe low-pass filtering makes both effects worse.

2D linear shift invariant (LSI) systems. In the space domain, the output is the convolution of the input with the impulse response:

  g_o(x′, y′) = ∬ g_i(x, y) h(x′ − x, y′ − y) dx dy.

In the spatial frequency domain, the output spectrum is the input spectrum multiplied by the transfer function:

  G_o(u, v) = G_i(u, v) H(u, v).

The two descriptions are connected by the 2D Fourier transform and its inverse, and the impulse response h(x, y) and the transfer function H(u, v) are a 2D Fourier-transform pair.
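The equivalence of the two LSI descriptions is exactly the convolution theorem, which a small numpy sketch can verify. This is illustrative and not from the slides; the discrete transform implies circular boundary conditions, and the 16×16 grid and random input are arbitrary choices:

```python
import numpy as np

N = 16
rng = np.random.default_rng(1)
gi = rng.standard_normal((N, N))   # input g_i(x, y)
h = rng.standard_normal((N, N))    # impulse response h(x, y)

# Space domain: direct circular convolution g_o = g_i * h
go = np.zeros((N, N))
for xp in range(N):
    for yp in range(N):
        for xx in range(N):
            for yy in range(N):
                go[xp, yp] += gi[xx, yy] * h[(xp - xx) % N, (yp - yy) % N]

# Frequency domain: multiplication with the transfer function
Go = np.fft.fft2(gi) * np.fft.fft2(h)

# Both routes give the same output field
assert np.allclose(go, np.fft.ifft2(Go).real)
```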
Sampling space and frequency. In the space domain, δx is the pixel size and Δx the field size; in the spatial frequency domain, δu is the frequency resolution and u_max the maximum frequency. Nyquist relationships:
• from space → spatial frequency domain: u_max = 1/(2 δx);
• from spatial frequency → space domain: Δx/2 = 1/(2 δu), i.e. δu = 1/Δx.

The Space–Bandwidth Product. Combining the two Nyquist relationships,

  N ≡ Δx/δx = 2 u_max/δu

is the 1D Space–Bandwidth Product (SBP), aka the number of pixels in the space domain; the 2D SBP scales as N².

SBP example: in the space domain δx = 1 and Δx = 1024; in the Fourier (spatial frequency) domain u_max = 0.5 and δu = 1/1024 = 9.765625×10⁻⁴.
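The example's numbers follow directly from the discrete frequency grid. The sketch below is an illustrative numpy check, assuming the standard FFT convention in which the frequency-bin spacing is 1/Δx and the Nyquist limit is 1/(2δx):

```python
import numpy as np

dx = 1.0                          # pixel size δx
Nx = 1024                         # number of pixels, so field size Δx = Nx·δx = 1024
freqs = np.fft.fftfreq(Nx, d=dx)  # discrete spatial-frequency axis (cycles per unit)

du = freqs[1] - freqs[0]          # frequency resolution: δu = 1/Δx = 1/1024
u_max = 0.5 / dx                  # Nyquist limit: u_max = 1/(2 δx)
assert np.isclose(freqs.max(), u_max - du)  # grid stops one step below +u_max

N = (Nx * dx) / dx                # 1D space–bandwidth product Δx/δx
print(u_max, du, N)               # → 0.5 0.0009765625 1024.0
```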
Recommended publications
  • Precise and Fast Spatial-Frequency Analysis Using the Iterative Local Fourier Transform
    Vol. 24, No. 19 | 19 Sep 2016 | OPTICS EXPRESS 22110. Sukmock Lee (Department of Physics, Inha University, Incheon, South Korea), Heejoo Choi, and Dae Wook Kim (College of Optical Sciences, University of Arizona, Tucson, Arizona, USA). Abstract: The use of the discrete Fourier transform has decreased since the introduction of the fast Fourier transform (fFT), which is a numerically efficient computing process. This paper presents the iterative local Fourier transform (ilFT), a set of new processing algorithms that iteratively apply the discrete Fourier transform within a local and optimal frequency domain. The new technique achieves 2¹⁰ times higher frequency resolution than the fFT within a comparable computation time. The method's superb computing efficiency, high resolution, spectrum zoom-in capability, and overall performance are evaluated and compared to other advanced high-resolution Fourier transform techniques, such as the fFT combined with several fitting methods. The effectiveness of the ilFT is demonstrated through the data analysis of a set of Talbot self-images (1280 × 1024 pixels) obtained with an experimental setup using a grating in a diverging beam produced by a coherent point source. ©2016 Optical Society of America. OCIS codes: (300.6300) Spectroscopy, Fourier transforms; (070.4790) Spectrum analysis; (070.6760) Talbot and self-imaging effects.
  • Fourier Optics
    Fourier Optics. 1 Background. Ray optics is a convenient tool to determine imaging characteristics such as the location of the image and the image magnification. A complete description of the imaging system, however, requires the wave properties of light and associated processes like diffraction to be included. It is these processes that determine the resolution of optical devices, the image contrast, and the effect of spatial filters. One possible wave-optical treatment considers the Fourier spectrum (space of spatial frequencies) of the object and the transmission of the spectral components through the optical system. This is referred to as Fourier Optics. A perfect imaging system transmits all spectral frequencies equally. Due to the finite size of apertures (for example the finite diameter of the lens aperture) certain spectral components are attenuated, resulting in a distortion of the image. Specially designed filters on the other hand can be inserted to modify the spectrum in a predefined manner to change certain image features. In this lab you will learn the principles of spatial filtering to (i) improve the quality of laser beams, and to (ii) remove undesired structures from images. To prepare for the lab, you should review the wave-optical treatment of the imaging process (see [1], Chapters 6.3, 6.4, and 7.3). 2 Theory. 2.1 Spatial Fourier Transform. Consider a two-dimensional object (a slide, for instance) that has a field transmission function f(x, y). This transmission function carries the information of the object. A (mathematically) equivalent description of this object in the Fourier space is based on the object's amplitude spectrum

      F(u, v) = (1/(2π)²) ∬ f(x, y) e^{i2πux + i2πvy} dx dy,   (1)

    where the Fourier coordinates (u, v) have units of inverse length and are called spatial frequencies.
  • The Representation of Complex Images in Spatial Frequency Domains of Primary Visual Cortex
    9310 • The Journal of Neuroscience, August 29, 2007 • 27(35):9310–9318. Behavioral/Systems/Cognitive. The Representation of Complex Images in Spatial Frequency Domains of Primary Visual Cortex. Jing X. Zhang, Ari Rosenberg, Atul K. Mallik, T. Robert Husson, and Naoum P. Issa (Department of Biomedical Engineering, Illinois Institute of Technology, Chicago, Illinois 60616; Committee on Computational Neuroscience, Committee on Neurobiology, and Department of Neurobiology, University of Chicago, Chicago, Illinois 60637). The organization of cat primary visual cortex has been well mapped using simple stimuli such as sinusoidal gratings, revealing superimposed maps of orientation and spatial frequency preferences. However, it is not yet understood how complex images are represented across these maps. In this study, we ask whether a linear filter model can explain how cortical spatial frequency domains are activated by complex images. The model assumes that the response to a stimulus at any point on the cortical surface can be predicted by its individual orientation, spatial frequency, and temporal frequency tuning curves. To test this model, we imaged the pattern of activity within cat area 17 in response to stimuli composed of multiple spatial frequencies. Consistent with the predictions of the model, the stimuli activated low and high spatial frequency domains differently: at low stimulus drift speeds, both domains were strongly activated, but activity fell off in high spatial frequency domains as drift speed increased. To determine whether the filter model quantitatively predicted the activity patterns, we measured the spatiotemporal tuning properties of the functional domains in vivo and calculated expected response amplitudes from the model.
  • Vision: from Eye to Brain (Chap 3, Part B)
    Vision: From Eye to Brain (Chap 3, Part B). Lecture 7, Jonathan Pillow, Sensation & Perception (PSY 345 / NEU 325), Princeton University, Spring 2015. More "channels": spatial frequency channels. Spatial frequency: the number of cycles of a grating per unit of visual angle (usually specified in degrees); think of it as the number of bars per unit length, running from low through intermediate to high frequency. Why sine gratings? They provide a useful decomposition of images (technical term: Fourier decomposition). Fourier decomposition is the mathematical decomposition of an image (or sound) into sine waves; an image can be progressively reconstructed from 1, 2, 3, 4, ... sine waves. The "Fourier decomposition" theory of V1 claims that the role of V1 is to do Fourier decomposition, i.e., break images down into a sum of sine waves; any pattern can be broken down into a sum of spatial sine waves, as in the summation of two spatial sine waves. An original image can likewise be split into its high-frequency and low-frequency parts (original, low, medium, high). Retinal ganglion cells are tuned to spatial frequency, as shown by the response of a ganglion cell to sine gratings of different frequencies. The contrast sensitivity function describes human contrast sensitivity. Images illustrating spatial frequency channels: if it is hard to tell who a famous person is in such an image, try squinting or defocusing ("Lincoln illusion", Harmon & Julesz 1973). "Gala Contemplating the Mediterranean Sea, which at 30 meters becomes
  • Visual Performance Theoretical Limit of Resolution
    Visual Performance. Aspects: resolution limit, pattern detection, pattern recognition, contrast level, color, temporal response. Conditions: illumination, monocular vs. binocular, distance, on-axis vs. off-axis, single or multiple targets, literacy and verbal ability. Theoretical limit of resolution: the Rayleigh criterion,

      θ = 1.22 λ / D radians.

    For λ = 587.6 nm and pupil diameters D from 2–8 mm, 0.090 ≤ θ ≤ 0.358 mrad, i.e. 0.3 ≤ θ ≤ 1.23 minutes of arc. (Figure: Rayleigh criterion for a 1.6 mm pupil at 14% contrast; relative irradiance of Airy pattern #1, Airy pattern #2, and their sum versus photoreceptor position in microns.) Visual acuity is a measure of the smallest detail that can be resolved by the visual system; there are different types of acuity measures. Point acuity ("binary star" test): typically 1 arcmin resolution. Vernier acuity (two lines slightly offset from each other; find the smallest detectable offset): typically 10 seconds of arc. Grating acuity (sinusoidal or square-wave gratings are used to determine the smallest separation between peaks that can be resolved): typically 2 arcmin. Letter acuity (different letters or symbols need to be recognized, as in the ETDRS, Landolt C, Tumbling E, and Lea charts): typically 5 arcmin. Visual acuity charts are designed so the 20/20 line subtends 5 arcmin; 20/40 subtends 10 arcmin and 20/10 subtends 2.5 arcmin. Stereo acuity: given one object slightly closer than the other, find the smallest separation that is resolvable.
  • Psychophysical Aspects of Contrast Sensitivity*
    S Afr Optom 2013 72(2) 76–85. Psychophysical aspects of contrast sensitivity. Anusha Y Sukha and Alan Rubin, Department of Optometry, University of Johannesburg, PO Box 524, Auckland Park, 2006 South Africa. <[email protected]> <[email protected]>. Received 23 November 2012; revised version accepted 9 June 2013. Abstract: This paper reviews the psychophysical aspects of contrast sensitivity, which concern the components of visual stimuli and the behavioural responses and methods used in contrast sensitivity testing. Some discussion is included of the different types of contrast sensitivity charts available, as well as a brief background on the different types of graphical representations of contrast sensitivity and contrast visual acuities. Two illustrations also demonstrate stereo-pair representation of contrast visual acuities in the context of diabetic eyes. The doctoral research of the first author (AYS) applies similar ideas to understanding both inter- and intra-ocular variation of contrast visual acuities. Key words: contrast, contrast sensitivity, contrast visual acuity, vision science, vision psychophysics, stereo-pair scatter plots, multivariate statistics. Introduction: Psychophysics is a scientific discipline designed to measure internal sensory and perceptual responses to external stimuli1. Sensory stimuli and behavioural responses are the defining or crucial concepts in determining contrast thresholds (for example, visual stimuli are used in chart-based contrast sensitivity measurements). Visual stimuli can be analysed in terms of their components (namely spatial frequency, contrast, spatial phase, and orientation) by means of Fourier analysis. Fourier analysis is an analytical method that calculates simple sine-wave components whose linear sum forms a given complex image2, 3. The visual stimuli typically used in contrast sensitivity testing consist of sine-wave or square-wave gratings whose luminance perpendicular to the bars is modulated sinusoidally.
  • A Tutorial Essay on Spatial Filtering and Spatial Vision
    Spatial Filters/Meese/Nov 2000. A Tutorial Essay on Spatial Filtering and Spatial Vision. Tim S Meese, Neurosciences, School of Life and Health Sciences, Aston University. Feb 2009: minor stylistic modifications and technical corrections. March 2009: referencing tidied up. To follow: final round-up sections and detailed references added. 1. Introduction. To the layperson, vision is all about 'seeing': we open our eyes and effortlessly we see the world 'out there'. We can recognize objects and people, we can interact with them in sensible and useful ways, and we can navigate our way about the environment, rarely bumping into things. Of course, the only information that we are really able to act upon is that which is encoded as neural activity by our nervous system. These visual codes, the essence of all of our perceptions, are constructed from information arriving from each of two directions. Image data arrives 'bottom-up' from our sensory apparatus, and knowledge-based rules and inferences arrive 'top-down' from memory. These two routes to visual perception achieve very different jobs. The first route is well suited to providing a descriptive account of the retinal image while the second route allows these descriptions to be elaborated and interpreted. This chapter focuses primarily on one aspect of the first of these routes, the processing and encoding of spatial information in the two-dimensional retinal image. One of the major success stories in understanding the human brain has been the exploration of the bottom-up processes used in vision, sometimes referred to as early vision.
  • Session 2. (1) Spatial Frequency and Fourier Transformation
    2005 Autumn semester, Pattern Information Processing. Topic 1: Sampling and digital processing of images. Session 2(1): Spatial frequency and Fourier transformation. The organization of digital images is explained in this and the next sessions. Although an image is naturally a continuous distribution of brightness, it has to be converted into a discrete set of integers for computer processing. The conversion into discrete pixels is called sampling, and the conversion into integers is called quantization. The sampling period is quite an important issue, and it is evaluated by the concept of spatial frequency. The concept of spatial frequency and the Fourier transformation are explained in this session. Diffraction of light and imaging: we start from the optical phenomenon of imaging. Imaging is defined as the collection of diverged light spread from a point of an object into one point by a lens (Fig. 1: Imaging). This phenomenon can be observed from the following point of view: light has the property of diffraction. Diffraction is a phenomenon in which a wave reaches beyond an opaque object obstructing its path (Fig. 2: Diffraction). For example, even if the progress of a wave on a water surface is obstructed by a board, it reaches beyond the board. Since light is a kind of electromagnetic wave, light has this property; the radio wave likewise reaches beyond obstructing objects from the broadcasting station by diffraction. If the light passes through a diffraction grating, of first order, is obtained in this direction.
  • Contrast Sensitivity and Spatial Frequency Response of Primate Cortical Neurons in and Around the Cytochrome Oxidase Blobs DAVID P
    Vision Res. Vol. 35, No. 11, pp. 1501–1523, 1995. Copyright © 1995 Elsevier Science Ltd. Contrast Sensitivity and Spatial Frequency Response of Primate Cortical Neurons in and Around the Cytochrome Oxidase Blobs. David P. Edwards, Keith P. Purpura, Ehud Kaplan. Received 26 January 1994; in revised form 26 September 1994. The striate cortex of macaque monkeys contains an array of patches which stain heavily for the enzyme cytochrome oxidase (CO blobs). Cells inside and outside these blobs are often described as belonging to two distinct populations or streams. In order to better understand the function of the CO blobs, we measured the contrast sensitivity and spatial frequency response of single neurons in and around the CO blobs. Density profiles of each blob were assessed using a new quantitative method, and correlations of local CO density with the physiology were noted. We found that the CO density dropped off gradually with distance from blob centers: in a typical blob the CO density dropped from 75% to 25% over 100 μm. Recordings were confined to cortical layers 2/3. Most neurons in these layers have poor contrast sensitivity, similar to that of the parvocellular neurons in the lateral geniculate nucleus. However, in a small proportion of layers 2/3 neurons we found higher contrast sensitivity, similar to that of the magnocellular neurons. These neurons were found to cluster near blob centers. This finding is consistent with (indirect) parvocellular input spread uniformly throughout layers 2/3, and (indirect) magnocellular input focused on CO blobs.
  • Nanophotonics Enhanced Coverslip for Phase Imaging in Biology
    Wesemann et al., Light: Science & Applications (2021) 10:98. https://doi.org/10.1038/s41377-021-00540-7, www.nature.com/lsa. Letter, Open Access. Nanophotonics enhanced coverslip for phase imaging in biology. Lukas Wesemann, Jon Rickett, Jingchao Song, Jieqiong Lou, Elizabeth Hinde, Timothy J. Davis, and Ann Roberts. Abstract: The ability to visualise transparent objects such as live cells is central to understanding biological processes. Here we experimentally demonstrate a novel nanostructured coverslip that converts phase information to high-contrast intensity images. This compact device enables real-time, all-optical generation of pseudo three-dimensional images of phase objects on transmission. We show that by placing unstained human cancer cells on the device, the internal structure within the cells can be clearly seen. Our research demonstrates the significant potential of nanophotonic devices for integration into compact imaging and medical diagnostic devices. Introduction: Phase-contrast microscopy has had a profound impact on biology, enabling weakly absorbing microscopic organisms to be observed without staining or fixing1. Recently, nanostructured thin-film devices have been developed with the potential to replace the bulky optics used in traditional phase-contrast microscopes. These devices perform mathematical operations on wavefields, such as first- and second- The invention of the phase-contrast microscope by Frits Zernike14 earned him the Nobel Prize for Physics in 1953. The invention came from Zernike's analysis of the phase profile of light that he describes as a sum of plane waves travelling in different directions15. The absence of intensity contrast results from an ideal convolution of all the plane wave components, such that a perturbation to any one of them disrupts the convolution, creating contrast.
  • Nanophotonics in Microscopy
    Lecture 12 – Nanophotonics in Microscopy. EECS 598-002 Winter 2006, Nanophotonics and Nano-scale Fabrication, P.C. Ku. Schedule for the rest of the semester: introduction to light-matter interaction (1/26): how to determine ε(r)? The relationship to basic excitations. Basic excitations and measurement of ε(r) (1/31). Structure dependence of ε(r), overview (2/2). Surface effects (2/7): surface EM waves, surface polaritons, size dependence. Case studies (2/9 – 2/16): quantum wells, wires, and dots; nanophotonics in microscopy; nanophotonics in plasmonics. Dispersion engineering (2/21 – 3/7): material dispersion, waveguide dispersion (photonic crystals). Principles of optical microscopy: the information contained in the object/image is carried by the light wave to the detector (e.g. eyes): Information = F(x, y) → F(kx, ky). Optical vs electron microscopy. Advantages of optical microscopes: photon energy is low (~eV) while electron energy is high (~10–100 keV); photon momentum is low (~10⁻²⁷ kg·m/s) while electron momentum is high (~10⁻²³–10⁻²² kg·m/s); they are especially good for studying optical processes (including nonlinear optical response) in the sample; ultra-short optical pulses are available to study ultra-fast processes; fluorescence can be exploited, which is useful for biological imaging. Disadvantages of optical microscopes: the photon does not have charge, so one cannot study the Coulomb interaction in the sample; the photon does not reveal information regarding the chemical composition of the sample (unless we go to the x-ray wavelength).
  • Standards for Visual Acuity (June 15, 2006)
    Standards for Visual Acuity. A report prepared for Elena Messina, Intelligent Systems Division, National Institute for Standards and Technology, in support of ASTM International Task Group E54.08.01 on Performance Measures for Robots for Urban Search and Rescue. Prepared by John M Evans LLC, Newtown, CT, [email protected], under Contract 2006-02-13-01 through KT Consulting. 1. Summary. Visual acuity is defined as the ability to read a standard test pattern at a certain distance, usually measured in terms of a ratio to "normal" vision. Multiple standard test patterns have been developed and are in use today. Closely related are measures of image quality or imaging or printing system quality, generally referred to as optical resolution. Again, standard test patterns can be used, or the modulation transfer function (MTF) can be measured. The MTF defines the ability of an imaging system to reproduce a given spatial frequency; the MTFs of the components of a system can be multiplied to give the MTF of the composite system. The problem of defining the quality of vision systems for robots for application in Urban Search and Rescue (USAR) involves viewing a scene with a camera (which includes a lens, sensor chip, and digitization circuitry), transmission of the digital data, and conversion to an image on a screen that can be viewed by the operator. Hence, either approach is relevant. We will discuss both approaches with emphasis on the first.