RADIOMETRY (1 Lecture)


RADIOMETRY (1 lecture) COS 429: COMPUTER VISION

Topics: Elements of Radiometry, Radiance, Irradiance, BRDF, Photometric Stereo. Reading: Chapters 4 and 5. Many of the slides in this lecture are courtesy of Prof. J. Ponce.

(Slide figure: geometry, viewing, lighting, and surface albedo combine to produce the images I. Photometric stereo example data from: http://www1.cs.columbia.edu/~belhumeur/pub/images/yalefacesB/readme)

Image Formation: Radiometry
What determines the brightness of an image pixel? The light source(s), the sensor characteristics, the surface normal, the surface properties, and the optics.

The Illumination and Viewing Hemisphere
At an infinitesimal scale, each surface point (x, y), with normal n(x, y) and albedo ρ(x, y), has a tangent plane and thus a hemisphere Ω of directions above it. A ray of light is indexed by polar coordinates (θ, φ); incident directions are written (θi, φi) and outgoing directions (θo, φo).

Foreshortening
• Principle: two sources that look the same to a receiver must have the same effect on the receiver.
• Principle: two receivers that look the same to a source must receive the same amount of energy.
• "Look the same" means produce the same input hemisphere (or output hemisphere).
• Reason: what else can a receiver know about a source but what appears on its input hemisphere? (Ditto, swapping receiver and source.)
• Crucial consequence: a big source (resp. receiver), viewed at a glancing angle, must produce (resp. experience) the same effect as a small source (resp. receiver) viewed frontally.

Measuring Angle
• To define radiance, we require the concept of solid angle.
• The solid angle subtended by an object from a point P is the area of the projection of the object onto the unit sphere centered at P.
• Measured in steradians (sr); the definition is analogous to projected angle in 2D.
• If I am at P and I look out, the solid angle tells me how much of my view is filled with an object.

Solid Angle of a Small Patch
Later, it will be important to talk about the solid angle of a small piece of surface. For a patch of area dA at distance r, tilted by θ away from the line of sight:
  dω = dA cosθ / r²
In spherical coordinates, dω = sinθ dθ dφ.

DEFINITION: Angles and Solid Angles
An arc of length L on a circle of radius R subtends the angle θ = L/R (radians). A patch of area A on a sphere of radius R subtends the solid angle Ω = A/R² (steradians).

DEFINITION: Radiance
The radiance is the power traveling at some point in a given direction, per unit area perpendicular to this direction, per unit solid angle:
  δ²P = L(P, v) δA δω
For a surface patch whose normal makes an angle θ with the direction v:
  δ²P = L(P, v) cosθ δA δω
PROPERTY: Radiance is constant along straight lines (in vacuum).

DEFINITION: Irradiance
The irradiance is the power per unit area incident on a surface:
  δ²P = δE δA = Li(P, vi) cosθi δωi δA
  δE = Li(P, vi) cosθi δωi
  E = ∫H Li(P, vi) cosθi dωi

Photometry
The image irradiance E is related to the scene radiance L by
  E = [ (π/4) (d/z′)² cos⁴α ] L
where d is the lens diameter, z′ the distance from the lens to the image plane, and α the angle between the ray and the optical axis.

DEFINITION: The Bidirectional Reflectance Distribution Function (BRDF)
The BRDF is the ratio of the radiance in the outgoing direction to the incident irradiance (units sr⁻¹):
  Lo(P, vo) = ρBD(P, vi, vo) δEi(P, vi) = ρBD(P, vi, vo) Li(P, vi) cosθi δωi
Helmholtz reciprocity law: ρBD(P, vi, vo) = ρBD(P, vo, vi).

DEFINITION: Radiosity
The radiosity is the total power leaving a point on a surface per unit area (W·m⁻²):
  B(P) = ∫H Lo(P, vo) cosθo dωo
Important case: Lo is independent of vo. Then B(P) = π Lo(P).

DEFINITION: Lambertian (or Matte) Surfaces
A Lambertian surface is a surface whose BRDF is independent of the outgoing direction (and by reciprocity of the incoming direction as well): ρBD(vi, vo) = ρBD = constant. The albedo is ρd = π ρBD.
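The factor of π that appears both in the irradiance under a hemisphere of constant radiance and in the Lambertian radiosity B(P) = π Lo(P) is easy to verify numerically. The snippet below is a small illustrative check (not part of the lecture; the variable names and grid sizes are arbitrary) that integrates E = ∫H Li cosθi dωi with dω = sinθ dθ dφ for constant Li = L and recovers E = πL.

```python
import numpy as np

# Numerical check of E = ∫_H L cos(theta) dω with dω = sin(theta) dtheta dphi,
# for constant incident radiance L. The closed form is E = pi * L, the same
# factor of pi that gives B(P) = pi * Lo(P) for a Lambertian surface.
L = 1.0
theta = np.linspace(0.0, np.pi / 2, 1001)   # polar angle over the hemisphere
phi = np.linspace(0.0, 2.0 * np.pi, 1001)   # azimuth

inner = np.trapz(L * np.cos(theta) * np.sin(theta), theta)   # integral over theta = L / 2
E = np.trapz(np.full_like(phi, inner), phi)                  # integral over phi = pi * L

print(E)   # ≈ 3.14159
```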
DEFINITION: Specular Surfaces as Perfect or Rough Mirrors
A perfect mirror reflects an incident ray about the normal (θi = θs); a rough mirror spreads the reflected light around the mirror direction by an angle δ.
  Perfect mirror: Lo(P, vs) = Li(P, vi)
  Phong (non-physical) model: Lo(P, vo) = ρs Li(P, vi) cosⁿ δ
  Hybrid model: Lo(P, vo) = ρd ∫H Li(P, vi) cosθi dωi + ρs Li(P, vi) cosⁿ δ

DEFINITION: Point Light Sources
A point light source is an idealization of an emitting sphere with radius ε at distance R, with ε << R and uniform radiance Le emitted in every direction. For a Lambertian surface, the corresponding radiosity is
  B(P) = [ ρd(P) Le π ε² / R(P)² ] cosθi ≈ ρd(P) N(P)·S(P) / R(P)²
where N(P) is the unit surface normal and S(P) is a source vector pointing toward the light whose magnitude absorbs the π ε² Le term.

Local Shading Model
Assume that the radiosity at a patch is the sum of the radiosities due to the light sources alone; there are no interreflections.
  For point sources: B(P) = Σ over visible sources s of ρd(P) N(P)·Ss(P) / Rs(P)²
  For point sources at infinity: B(P) = ρd(P) N(P) · Σ over visible sources s of Ss(P)

Photometric Stereo (Woodham, 1979)
Problem: given n images of an object, taken by a fixed camera under different (known) light sources, reconstruct the object shape.

Photometric Stereo: Example (1)
Assume a Lambertian surface and distant point light sources. With g(P) = ρ N(P) and V = k S,
  I(P) = k B(P) = k ρ N(P)·S = g(P)·V
Given n images, we obtain n linear equations in g:
  i = [I1, I2, …, In]ᵀ = [V1ᵀ; V2ᵀ; …; Vnᵀ] g = V g,  so g = V⁻¹ i

Photometric Stereo: Example (2)
What about shadows? Just skip the equations corresponding to zero-intensity pixels. This only works when there is no ambient illumination.

Photometric Stereo: Example (3)
Since g(P) = ρ(P) N(P):
  ρ(P) = |g(P)| and N(P) = g(P) / |g(P)|

Photometric Stereo: Integrability
The recovered gradients must satisfy the integrability constraint
  ∂/∂y (∂z/∂x) = ∂/∂x (∂z/∂y)
With N = [a, b, c]ᵀ ∝ [−∂z/∂x, −∂z/∂y, 1]ᵀ, we get ∂z/∂x = −a/c and ∂z/∂y = −b/c, and the depth is recovered by integrating along a path:
  z(u, v) = ∫₀ᵘ ∂z/∂x (x, 0) dx + ∫₀ᵛ ∂z/∂y (u, y) dy

Photometric stereo example data from: http://www1.cs.columbia.edu/~belhumeur/pub/images/yalefacesB/readme
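To make the g = V⁻¹ i recipe concrete, here is a minimal sketch of Lambertian photometric stereo on synthetic data. It is an illustration of the steps outlined above rather than the lecture's own code: the function name, the spherical-cap test scene, and the three light directions are invented for the example, and a pseudo-inverse is used in place of V⁻¹ so the same code also handles more than three lights in a least-squares sense.

```python
import numpy as np

def photometric_stereo(images, lights):
    """images: (n, H, W) intensity stack; lights: (n, 3) rows V_j = k * S_j.
    Solves i = V g per pixel, then albedo = |g| and N = g / |g|."""
    n, H, W = images.shape
    i = images.reshape(n, -1)                       # (n, H*W)
    g = np.linalg.pinv(lights) @ i                  # (3, H*W); least squares if n > 3
    rho = np.linalg.norm(g, axis=0)                 # albedo (up to the constant k)
    N = g / np.maximum(rho, 1e-8)                   # unit normals; unreliable where rho ~ 0
    return rho.reshape(H, W), N.T.reshape(H, W, 3)

# Synthetic test: a unit-albedo spherical cap lit from three known directions.
H = W = 64
y, x = np.mgrid[-1:1:H * 1j, -1:1:W * 1j]
z = np.sqrt(np.clip(1.0 - x**2 - y**2, 0.0, None))
N_true = np.dstack([x, y, z])
N_true /= np.maximum(np.linalg.norm(N_true, axis=2, keepdims=True), 1e-8)

lights = np.array([[0.0, 0.0, 1.0], [0.5, 0.0, 1.0], [0.0, 0.5, 1.0]])
lights /= np.linalg.norm(lights, axis=1, keepdims=True)

# I_j = max(N . V_j, 0): zero-intensity pixels correspond to shadowed points.
images = np.clip(np.tensordot(lights, N_true, axes=([1], [2])), 0.0, None)

albedo, N_est = photometric_stereo(images, lights)
print(np.abs(albedo[z > 0.7] - 1.0).max())            # ≈ 0: unit albedo recovered away from the rim
print(np.abs(N_est[z > 0.7] - N_true[z > 0.7]).max()) # ≈ 0: normals recovered there too
```

With real images, the zero-intensity (shadowed) equations would be dropped pixel by pixel before solving, as noted above, and the recovered normals could then be integrated to a depth map using the path integral for z(u, v).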
Recommended publications
  • Fundamentals of Rendering - Radiometry / Photometry
    Fundamentals of Rendering - Radiometry / Photometry. “Physically Based Rendering” by Pharr & Humphreys: Chapter 5, Color and Radiometry; Chapter 6, Camera Models (we won’t cover this in class). Realistic Rendering: determination of intensity; mechanisms are emittance (+), absorption (−), and scattering (+) (single vs. multiple); cameras or retinas record the quantity of light. Pertinent questions: the nature of light and how it is measured and characterized/recorded; (local) reflection of light; (global) spatial distribution of light. Electromagnetic spectrum. Spectral power distributions, e.g., fluorescent lamps. Tristimulus theory of color: metamers are SPDs that appear the same visually; color matching functions of the standard human observer (International Commission on Illumination, or CIE, of 1931). “These color matching functions are the amounts of three standard monochromatic primaries needed to match the monochromatic test primary at the wavelength shown on the horizontal scale.” (from Wikipedia, “CIE 1931 Color Space”). Optics, three views: geometrical or ray optics (traditional graphics; reflection, refraction; optical system design); physical or wave optics (dispersion, interference; interaction with objects of size comparable to the wavelength); quantum or photon optics (interaction of light with atoms and molecules). What is light? Particle model (Newton): light travels in straight lines, can travel through a vacuum (waves need a medium to travel in), and carries quantum amounts of energy. Wave model (Huygens): electromagnetic radiation, a sinusoidal wave formed by coupled electric (E) and magnetic (H) fields. Nature of light, wave-particle duality: light has some wave properties (frequency, phase, orientation) and some quantum particle properties (quantum packets, photons). Dimensions of light: amplitude or intensity, frequency, phase, polarization. Coherence refers to the frequencies of waves: laser light waves have uniform frequency; natural light is incoherent, with waves of multiple frequencies and random phase.
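As a rough numerical illustration of the metamer idea mentioned above (two SPDs that integrate to the same tristimulus values look the same), the sketch below compares a flat spectrum to a finely rippled one. The Gaussian lobes are crude stand-ins for the CIE 1931 color matching functions, not tabulated CIE data, and the two SPDs are invented for the demonstration.

```python
import numpy as np

# Crude illustration of metamerism: two different SPDs that integrate to nearly the
# same (X, Y, Z) against the matching functions. The Gaussians are rough stand-ins
# for the CIE 1931 color matching functions, not real CIE data.
lam = np.linspace(380.0, 780.0, 401)            # wavelength in nm

def lobe(mu, sigma):
    return np.exp(-0.5 * ((lam - mu) / sigma) ** 2)

xbar = 1.06 * lobe(599, 38) + 0.36 * lobe(446, 19)   # assumed shapes
ybar = 1.00 * lobe(556, 47)
zbar = 1.78 * lobe(449, 23)

def tristimulus(spd):
    return np.array([np.trapz(cmf * spd, lam) for cmf in (xbar, ybar, zbar)])

flat = np.ones_like(lam)                               # spectrally flat SPD
rippled = 1.0 + 0.3 * np.sin(2.0 * np.pi * lam / 40.0) # different SPD, fine ripple

print(tristimulus(flat))
print(tristimulus(rippled))   # nearly identical: the broad curves average out the ripple
```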
  • Black Body Radiation and Radiometric Parameters
    Black Body Radiation and Radiometric Parameters: All materials absorb and emit radiation to some extent. A blackbody is an idealization of how materials emit and absorb radiation. It can be used as a reference for real source properties. An ideal blackbody absorbs all incident radiation and does not reflect. This is true at all wavelengths and angles of incidence. Thermodynamic principles dictate that the BB must also radiate at all wavelengths and angles. The basic properties of a BB can be summarized as: 1. Perfect absorber/emitter at all wavelengths and angles of emission/incidence (cavity BB). 2. The total radiant energy emitted is only a function of the BB temperature. 3. Emits the maximum possible radiant energy from a body at a given temperature. 4. The BB radiation field does not depend on the shape of the cavity; the radiation field must be homogeneous and isotropic. If the radiation going from a BB of one shape to another (both at the same T) were different, it would cause a cooling or heating of one or the other cavity, which would violate the 1st Law of Thermodynamics. Radiometric Parameters: 1. Solid Angle: dΩ = dA / r², where dA is the surface area of a segment of a sphere surrounding a point and r is the distance from the point on the source to the sphere. The solid angle looks like a cone with a spherical cap. An element of area of a sphere is dA = r² sinθ dθ dφ; therefore dΩ = sinθ dθ dφ. Integrating over the full sphere (θ from 0 to π, φ from 0 to 2π) gives the full solid angle surrounding a point source, Ω = 4π. Integrating only out to a half-angle θ < π gives Ω = 2π (1 − cosθ). The unit of solid angle is the steradian.
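The closed forms quoted above are easy to confirm numerically. The snippet below (an illustrative check assuming the element dΩ = sinθ dθ dφ; the function name is arbitrary) integrates the solid-angle element and reproduces 4π for the full sphere and 2π(1 − cosθ) for a cone of half-angle θ.

```python
import numpy as np

# dΩ = sin(theta) dtheta dphi, so the solid angle out to half-angle theta_max is
# Omega = 2*pi * ∫_0^{theta_max} sin(theta) dtheta = 2*pi * (1 - cos(theta_max)).
def solid_angle(theta_max, n=2001):
    theta = np.linspace(0.0, theta_max, n)
    return 2.0 * np.pi * np.trapz(np.sin(theta), theta)

print(solid_angle(np.pi))        # ≈ 12.566 = 4*pi (full sphere)
print(solid_angle(np.pi / 3))    # ≈ 3.1416 = 2*pi*(1 - cos 60°) = pi
```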
  • Guide for the Use of the International System of Units (SI)
    Guide for the Use of the International System of Units (SI). NIST Special Publication 811, 2008 Edition. Ambler Thompson, Technology Services, and Barry N. Taylor, Physics Laboratory, National Institute of Standards and Technology, Gaithersburg, MD 20899. (Supersedes NIST Special Publication 811, 1995 Edition, April 1995.) March 2008. U.S. Department of Commerce, Carlos M. Gutierrez, Secretary; National Institute of Standards and Technology, James M. Turner, Acting Director. Natl. Inst. Stand. Technol. Spec. Publ. 811, 2008 Ed., 85 pages (March 2008; 2nd printing November 2008). CODEN: NSPUE3. Note on 2nd printing: This 2nd printing dated November 2008 of NIST SP811 corrects a number of minor typographical errors present in the 1st printing dated March 2008. Preface: The International System of Units, universally abbreviated SI (from the French Le Système International d’Unités), is the modern metric system of measurement. Long the dominant measurement system used in science, the SI is becoming the dominant measurement system used in international commerce. The Omnibus Trade and Competitiveness Act of August 1988 [Public Law (PL) 100-418] changed the name of the National Bureau of Standards (NBS) to the National Institute of Standards and Technology (NIST) and gave to NIST the added task of helping U.S.
  • Extraction of Incident Irradiance from LWIR Hyperspectral Imagery Pierre Lahaie, DRDC Valcartier 2459 De La Bravoure Road, Quebec, Qc, Canada
    DRDC-RDDC-2015-P140. Extraction of incident irradiance from LWIR hyperspectral imagery. Pierre Lahaie, DRDC Valcartier, 2459 De la Bravoure Road, Quebec, Qc, Canada. ABSTRACT: The atmospheric correction of thermal hyperspectral imagery can be separated into two distinct processes: Atmospheric Compensation (AC) and Temperature and Emissivity Separation (TES). TES requires as input, at each pixel, the ground-leaving radiance and the atmospheric downwelling irradiance, which are the outputs of the AC process. Extracting the downwelling irradiance from imagery requires assumptions about the nature of some of the pixels, the sensor, and the atmosphere. Another difficulty is that the sensor’s spectral response is often not well characterized. To deal with this unknown, we defined a spectral mean operator that is used to filter the ground-leaving radiance, together with a computation of the downwelling irradiance from MODTRAN. A user selects a number of pixels in the image for which the emissivity is assumed to be known. The emissivity of these pixels is assumed to be smooth, so that the only spectrally fast-varying variable is the downwelling irradiance. Using these assumptions we built an algorithm to estimate the downwelling irradiance. The algorithm is applied to all the selected pixels, and the estimated irradiance is the average over the spectral channels of the resulting computation. The algorithm performs well in simulation, and results are shown for errors in the assumed emissivity and in the atmospheric profiles. The sensor noise mainly influences the required number of pixels. Keywords: Hyperspectral imagery, atmospheric correction, temperature emissivity separation. 1. INTRODUCTION: The atmospheric correction of thermal hyperspectral imagery aims at extracting the temperature and the emissivity of the material imaged by a sensor in the long wave infrared (LWIR) spectral band.
  • RADIANCE® ULTRA 27” Premium Endoscopy Visualization
    RADIANCE® ULTRA 27” Premium Endoscopy Visualization. The Radiance® Ultra series leads the industry as a revolutionary display that aims to transform advanced visualization technology capabilities and features into a clinical solution supporting the drive to improve patient outcomes, improve workflow efficiency, and lower operating costs. Key features: optimized for endoscopy applications; advanced imaging capabilities; high brightness and color calibration; cleanable splash-proof design; 10-year scratch-resistant-glass guarantee; optional ZeroWire® embedded receiver. Enhanced endoscopic visualization is accomplished with an LED backlight technology that produces the brightest typical luminance level to enable deep abdominal illumination, overcoming glare and reflection in high ambient light environments. Medi-Match™ color calibration assures consistent image quality and accurate color reproduction. The result is outstanding endoscopic video image performance. With a focus on improving workflow efficiency and workplace safety, the Radiance Ultra series is available with an optional built-in ZeroWire receiver. When paired with the ZeroWire Mobile battery-powered stand, the combination becomes the world’s first and only truly cordless and wireless mobile endoscopic solution (patent pending). To eliminate display scratches caused by IV poles or surgical light heads, the Radiance Ultra series uses scratch-resistant, splash-proof edge-to-edge glass that includes an industry-exclusive 10-year scratch-resistance guarantee.
  • The International System of Units (SI)
    The International System of Units (SI). NIST Special Publication 330, 2001 Edition. Barry N. Taylor, Editor. National Institute of Standards and Technology, Technology Administration, U.S. Department of Commerce. The National Institute of Standards and Technology was established in 1988 by Congress to "assist industry in the development of technology . . . needed to improve product quality, to modernize manufacturing processes, to ensure product reliability . . . and to facilitate rapid commercialization . . . of products based on new scientific discoveries." NIST, originally founded as the National Bureau of Standards in 1901, works to strengthen U.S. industry's competitiveness; advance science and engineering; and improve public health, safety, and the environment. One of the agency's basic functions is to develop, maintain, and retain custody of the national standards of measurement, and provide the means and methods for comparing standards used in science, engineering, manufacturing, commerce, industry, and education with the standards adopted or recognized by the Federal Government. As an agency of the U.S. Commerce Department's Technology Administration, NIST conducts basic and applied research in the physical sciences and engineering, and develops measurement techniques, test methods, standards, and related services. The Institute does generic and precompetitive work on new and advanced technologies. NIST's research facilities are located at Gaithersburg, MD 20899, and at Boulder, CO 80303.
  • Ikonos DN Value Conversion to Planetary Reflectance Values by David Fleming CRESS Project, UMCP Geography April 2001
    Ikonos DN Value Conversion to Planetary Reflectance Values. By David Fleming, CRESS Project, UMCP Geography, April 2001. This paper was produced by the CRESS Project at the University of Maryland. CRESS is sponsored by the Earth Science Enterprise Applications Directorate at NASA Stennis Space Center. The following formula from the Landsat 7 Science Data Users Handbook was used to convert Ikonos DN values to planetary reflectance values:
      ρp = π · Lλ · d² / (ESUNλ · cos(θs))
    where ρp is the planetary reflectance, Lλ the spectral radiance at the sensor's aperture, ESUNλ the band-dependent mean solar exoatmospheric irradiance, θs the solar zenith angle, and d the Earth-Sun distance in astronomical units. For Landsat 7, the spectral radiance is Lλ = gain · DN + offset, which can also be expressed as Lλ = (LMAX − LMIN)/255 · DN + LMIN. The LMAX and LMIN values for each of the Landsat bands are given below in Table 1. These values may vary for each scene, and the metadata contains image-specific values.
    Table 1. ETM+ Spectral Radiance Range (W/(m² · sr · µm))
      Band    Low Gain LMIN    Low Gain LMAX    High Gain LMIN    High Gain LMAX
      1       -6.2             293.7            -6.2              191.6
      2       -6.4             300.9            -6.4              196.5
      3       -5.0             234.4            -5.0              152.9
      4       -5.1             241.1            -5.1              157.4
      5       -1.0             47.57            -1.0              31.06
      6        0.0             17.04             3.2              12.65
      7       -0.35            16.54            -0.35             10.80
      8       -4.7             243.1            -4.7              158.3
    (from the Landsat 7 Science Data Users Handbook)
    The Ikonos spectral radiance Lλ can be calculated by using the formula given in Space Imaging Document Number SE-REF-016: Lλ (mW/(cm² · sr)) = DN / CalCoefλ. However, in order to use these values in the conversion formula, they must be in units of W/(m² · sr · µm).
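The DN-to-reflectance chain described above is straightforward to script. The sketch below follows the two formulas quoted in the text; the band-3 high-gain LMIN/LMAX come from Table 1, while the DN, ESUN, solar zenith angle, and Earth-Sun distance are made-up example values rather than values from any real scene.

```python
import numpy as np

def dn_to_radiance(dn, lmin, lmax):
    """ETM+ style: L_lambda = (LMAX - LMIN) / 255 * DN + LMIN, in W/(m^2 sr um)."""
    return (lmax - lmin) / 255.0 * dn + lmin

def radiance_to_reflectance(L, esun, sun_zenith_deg, d_au):
    """Planetary reflectance: rho_p = pi * L * d^2 / (ESUN * cos(theta_s))."""
    return np.pi * L * d_au**2 / (esun * np.cos(np.radians(sun_zenith_deg)))

# Hypothetical band-3 high-gain pixel: DN = 120, with LMIN = -5.0 and LMAX = 152.9
# from Table 1; ESUN, solar zenith, and Earth-Sun distance are illustrative only.
L = dn_to_radiance(120, lmin=-5.0, lmax=152.9)
rho_p = radiance_to_reflectance(L, esun=1536.0, sun_zenith_deg=35.0, d_au=1.0)
print(L, rho_p)   # ≈ 69.3 W/(m^2 sr um), reflectance ≈ 0.17
```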
  • Radiometry of Light Emitting Diodes Table of Contents
    TECHNICAL GUIDE THE RADIOMETRY OF LIGHT EMITTING DIODES TABLE OF CONTENTS 1.0 Introduction . .1 2.0 What is an LED? . .1 2.1 Device Physics and Package Design . .1 2.2 Electrical Properties . .3 2.2.1 Operation at Constant Current . .3 2.2.2 Modulated or Multiplexed Operation . .3 2.2.3 Single-Shot Operation . .3 3.0 Optical Characteristics of LEDs . .3 3.1 Spectral Properties of Light Emitting Diodes . .3 3.2 Comparison of Photometers and Spectroradiometers . .5 3.3 Color and Dominant Wavelength . .6 3.4 Influence of Temperature on Radiation . .6 4.0 Radiometric and Photopic Measurements . .7 4.1 Luminous and Radiant Intensity . .7 4.2 CIE 127 . .9 4.3 Spatial Distribution Characteristics . .10 4.4 Luminous Flux and Radiant Flux . .11 5.0 Terminology . .12 5.1 Radiometric Quantities . .12 5.2 Photometric Quantities . .12 6.0 References . .13 1.0 INTRODUCTION Almost everyone is familiar with light-emitting diodes (LEDs) from their use as indicator lights and numeric displays on consumer electronic devices. The low output and lack of color options of LEDs limited the technology to these uses for some time. New LED materials and improved production processes have produced bright LEDs in colors throughout the visible spectrum, including white light. With efficacies greater than incandescent (and approaching that of fluorescent lamps) along with their durability, small size, and light weight, LEDs are finding their way into many new applications within the lighting community. These new applications have placed increasingly stringent demands on the optical characterization of LEDs, which serves as the fundamental baseline for product quality and product design.
  • Radiometric and Photometric Measurements with TAOS Photosensors Contributed by Todd Bishop March 12, 2007
    TAOS Inc. is now ams AG. The technical content of this TAOS application note is still valid. Contact information: Headquarters: ams AG, Tobelbaderstrasse 30, 8141 Unterpremstaetten, Austria. Tel: +43 (0) 3136 500 0. e-Mail: [email protected]. Please visit our website at www.ams.com. NUMBER 21, INTELLIGENT OPTO SENSOR DESIGNER’S NOTEBOOK. Radiometric and Photometric Measurements with TAOS PhotoSensors, contributed by Todd Bishop, March 12, 2007. ABSTRACT: Light sensing applications use two measurement systems, radiometric and photometric. Radiometric measurements deal with light as a power level, while photometric measurements deal with light as it is interpreted by the human eye. Both systems of measurement have units that are parallel to each other, but are useful for different applications. This paper will discuss the differences and how they can be measured. RADIOMETRIC QUANTITIES: Radiometry is the measurement of electromagnetic energy in the range of wavelengths between ~10 nm and ~1 mm. These regions are commonly called the ultraviolet, the visible and the infrared. Radiometry deals with light (radiant energy) in terms of optical power. Key quantities from a light detection point of view are radiant energy, radiant flux and irradiance.
    SI Radiometry Units:
      Quantity         Symbol   SI unit                  Abbr.    Notes
      Radiant energy   Q        joule                    J        energy
      Radiant flux     Φ        watt                     W        radiant energy per unit time
      Irradiance       E        watt per square meter    W·m−2    power incident on a surface
    Energy is an SI derived unit measured in joules (J). The recommended symbol for energy is Q. Power (radiant flux) is another SI derived unit. It is the derivative of energy with respect to time, dQ/dt, and the unit is the watt (W).
  • Lighting - the Radiance Equation
    CHAPTER 3: Lighting - the Radiance Equation. Lighting: The Fundamental Problem for Computer Graphics. So far we have a scene composed of geometric objects. In computing terms this would be a data structure representing a collection of objects. Each object, for example, might be itself a collection of polygons. Each polygon is a sequence of points on a plane. The ‘real world’, however, is ‘one that generates energy’. Our scene so far is truly a phantom one, since it simply is a description of a set of forms with no substance. Energy must be generated: the scene must be lit; albeit, lit with virtual light. Computer graphics is concerned with the construction of virtual models of scenes. This is a relatively straightforward problem to solve. In comparison, the problem of lighting scenes is the major and central conceptual and practical problem of computer graphics. The problem is one of simulating lighting in scenes, in such a way that the computation does not take forever. Also the resulting 2D projected images should look as if they are real. In fact, let’s make the problem even more interesting and challenging: we do not just want the computation to be fast, we want it in real time. Image frames, in other words virtual photographs taken within the scene, must be produced fast enough to keep up with the changing gaze (head and eye moves) of people looking and moving around the scene, so that they experience the same visual sensations as if they were moving through a corresponding real scene.
  • Radiometry and Photometry
    Radiometry and Photometry. Wei-Chih Wang, Department of Power Mechanical Engineering, National Tsing Hua University. Materials covered: Radiometry (radiant flux, radiant intensity, irradiance, radiance); Photometry (luminous flux, luminous intensity, illuminance, luminance); conversion between radiometric and photometric quantities. Radiometry: Radiometry is the detection and measurement of light waves in the optical portion of the electromagnetic spectrum, which is further divided into ultraviolet, visible, and infrared light. (Example of a typical radiometer.) Photometry: All light measurement is considered radiometry, with photometry being a special subset of radiometry weighted for a typical human eye response. (Example of a typical photometer.) Human Eyes: A figure shows a schematic illustration of the human eye (Encyclopedia Britannica, 1994). The inside of the eyeball is clad by the retina, which is the light-sensitive part of the eye. The illustration also shows the fovea, a cone-rich central region of the retina which affords the high acuteness of central vision. The figure also shows the cell structure of the retina, including the light-sensitive rod cells and cone cells, as well as the ganglion cells and nerve fibers that transmit the visual information to the brain. Rod cells are more abundant and more light sensitive than cone cells, and rods are sensitive over the entire visible spectrum. There are three types of cone cells, namely cone cells sensitive in the red, green, and blue spectral ranges. The approximate spectral sensitivity functions of the rods and the three types of cones are shown in the accompanying figure. Eye sensitivity function: The conversion between radiometric and photometric units is provided by the luminous efficiency function or eye sensitivity function, V(λ).
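The eye sensitivity function V(λ) mentioned at the end is what turns a radiometric quantity into a photometric one: luminous flux is 683 lm/W times the V(λ)-weighted spectral radiant flux. The sketch below illustrates this with a Gaussian stand-in for the photopic V(λ) curve (peaking near 555 nm) and an invented flat-spectrum source; it is not based on tabulated CIE data.

```python
import numpy as np

# Luminous flux = 683 lm/W * ∫ V(lambda) * Phi_e(lambda) dlambda.
# The Gaussian V(lambda) below approximates the photopic curve (peak 1.0 near 555 nm);
# it is a stand-in, not tabulated CIE data.
lam = np.linspace(380e-9, 780e-9, 401)                   # wavelength grid, m
V = np.exp(-0.5 * ((lam - 555e-9) / 45e-9) ** 2)         # assumed eye sensitivity

# Hypothetical source: 1 W of radiant flux spread uniformly over 500-600 nm.
phi_e = np.where((lam >= 500e-9) & (lam <= 600e-9), 1.0 / 100e-9, 0.0)   # W/m

phi_v = 683.0 * np.trapz(V * phi_e, lam)                 # luminous flux in lumens
print(phi_v)   # several hundred lumens for this green-centered 1 W source
```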
  • Basic Principles of Imaging and Photometry Lecture #2
    Basic Principles of Imaging and Photometry, Lecture #2. Thanks to Shree Nayar, Ravi Ramamoorthi, Pat Hanrahan. Computer Vision: Building Machines that See. Lighting, camera, physical models, computer, scene, scene interpretation: we need to understand the geometric and radiometric relations between the scene and its image. A Brief History of Images: Camera Obscura, Gemma Frisius, 1558; Lens-Based Camera Obscura, 1568; Still Life, Louis Jacques Mandé Daguerre, 1837; Silicon Image Detector, 1970; Digital Cameras, 1995. Geometric Optics and Image Formation, topics to be covered: 1) Pinhole and perspective projection; 2) Image formation using lenses; 3) Lens-related issues. Pinhole and the Perspective Projection: Is an image being formed on the screen? Yes, but not a “clear” one. A scene point r = (x, y, z) projects through the pinhole onto the image plane at r′ = (x′, y′, f′), where f′ is the effective focal length along the optical (z) axis:
      x′/f′ = x/z and y′/f′ = y/z
    Magnification: for two nearby scene points A(x, y, z) and B(x + δx, y + δy, z) on a planar scene, separated by d in the scene and d′ in the image (A′, B′), perspective projection gives
      m = d′/d = √((δx′)² + (δy′)²) / √((δx)² + (δy)²) = f′/z
    and Area_image = m² · Area_scene. Orthographic Projection: x′ = m x, y′ = m y; when m = 1, we have orthographic projection. This is possible only when z ≫ ∆z; in other words, the range of scene depths is assumed to be much smaller than the average scene depth.
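The perspective-projection relations above translate directly into code. The short sketch below is an illustration (the function name and the sample points are arbitrary): it maps scene points (x, y, z) to image coordinates via x′/f′ = x/z and y′/f′ = y/z, so doubling a point's depth halves its image coordinates, consistent with the magnification m = f′/z.

```python
import numpy as np

# Perspective projection: a scene point r = (x, y, z) maps to r' = (x', y', f')
# with x' = f' * x / z and y' = f' * y / z; a small patch at depth z is scaled
# by m = f'/z, and image area scales by m^2.
def perspective_project(points, f_prime):
    """points: (N, 3) array of scene points with z > 0; returns (N, 2) image coords."""
    points = np.asarray(points, dtype=float)
    z = points[:, 2:3]
    return f_prime * points[:, :2] / z

pts = np.array([[0.1, 0.2, 2.0],
                [0.1, 0.2, 4.0]])        # same (x, y), doubled depth
print(perspective_project(pts, f_prime=0.05))
# image coordinates halve as depth doubles: [[0.0025, 0.005], [0.00125, 0.0025]]
```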