ELEC 7450 - Digital Image Processing: Image Acquisition

Total Pages: 16

File Type: PDF, Size: 1020 KB

Stanley J. Reeves

Image acquisition

Digital images are acquired by
- direct digital acquisition (digital still/video cameras), or
- scanning material acquired as analog signals (slides, photographs, etc.).
In both cases, the digital sensing element is one of the following: single sensor, line array, or area array.

Images may also come from indirect imaging techniques, e.g., MRI (Fourier), CT (backprojection):
- physical quantities other than intensities are measured
- computation leads to a 2-D map displayed as intensity

Single sensor acquisition / Linear array acquisition
(figures)

Array sensor acquisition
- Irradiance incident at each photo-site is integrated over time
- The resulting array of intensities is moved out of the sensor array and into a buffer
- Quantized intensities are stored as a grayscale image

Two types of quantization:
- spatial: limited number of pixels
- gray-level: limited number of bits to represent intensity at a pixel
(A short code sketch of both quantization types follows below, after the sensor comparison.)

Spatial resolution / Grayscale resolution
(figures)

Sensors - CCD & CMOS

CCD (charge-coupled device)
- Quantum efficiency of 70% (film has 2% QE)
- Mature technology; in development since 1969
- Uses photodiodes in conjunction with capacitors to store charge
- Charge converted to voltage at limited nodes; varied architectures used for read-out
- Most of the pixel area is light sensitive; good fill-factor

CMOS (complementary metal-oxide-semiconductor)
- QE of 19-26%
- Whole systems can be integrated on the same device; camera-on-chip
- Standard semiconductor device manufacturing process
- Each pixel has read-out electronics, amplifiers, noise correction, and ADC
- Consumes far less power than CCDs
- Needs more room for electronics; fill-factor generally not as good as CCDs
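The two types of quantization listed above (spatial and gray-level) are easy to demonstrate numerically. Below is a minimal sketch, assuming an 8-bit grayscale image stored in a NumPy array; the function names, subsampling factor, and bit depth are illustrative choices, not part of the lecture notes.

```python
import numpy as np

def spatial_quantize(img, factor):
    """Reduce spatial resolution by keeping every `factor`-th pixel in each direction."""
    return img[::factor, ::factor]

def graylevel_quantize(img, bits):
    """Requantize an 8-bit image to `bits` bits per pixel (2**bits gray levels)."""
    step = 256 // (2 ** bits)
    # Map each intensity to the center of its quantization bin.
    return (img // step) * step + step // 2

# Illustrative usage on a synthetic 8-bit ramp image.
img = np.tile(np.arange(256, dtype=np.uint8), (256, 1))
coarse = spatial_quantize(img, 4)        # 64 x 64 pixels instead of 256 x 256
few_levels = graylevel_quantize(img, 3)  # 8 gray levels instead of 256
```

Both operations lose information: subsampling discards high spatial frequencies, while coarse gray-level quantization introduces false contouring in smooth regions.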
CCD architectures

CCDs function in two stages: exposure and read-out.
- Photons are collected and charge is accumulated during exposure
- Area arrays use vertical and horizontal shift registers for read-out
- In some architectures, charge is transferred to an inactive/opaque region before read-out

Linear array
- Pixel intensities are read sequentially

Full frame transfer
- The entire pixel area is active
- Time between exposures is significant
- Needs a mechanical shutter

Interline transfer
- Charge is shifted to an adjacent opaque area
- Subsequently shifted row-wise to a horizontal shift register
- Complex design (requires micro-mirrors or microlenses for good optical efficiency)

Frame transfer
- Needs 2x the optically active area and is thus larger and costlier
- Half of the array (for storage) is masked
- Shutter delay is smaller than full frame transfer

Image formation
- Both CCD and CMOS sensors are monochromatic
- Color images are acquired using color filters overlaid on the sensor

The intensity measured at a pixel is

    c_i = ∫_{−∞}^{∞} f_i(λ) g(λ) x(λ) l(λ) dλ + η_i

where
- i = 1, ..., k are distinct color channels sampled at each location
- f_i(λ) - spectral transmittance of the color filter
- g(λ) - sensitivity of the sensor
- x(λ) - spectral reflectance of the imaged surface
- l(λ) - spectral power density of the illuminant
- η_i - measurement noise
(A numerical sketch of this measurement model follows below.)

Spectral response of common illuminants
(figure; source: http://www.ni.com/white-paper/6901/en/)

Multiple sensors
- To acquire a 2-D image, multiple CCDs are used to acquire separate color bands
- A dichroic prism is used to split incoming irradiance into narrow-band beams
- Red, blue, and green beams are directed to separate optical sensors
- Issues: cost, weight, registration
(figures: dichroic prism; beam splitter in action)

Single sensor acquisition
- To avoid the cost and complexity associated with multiple-sensor acquisition, most color digital cameras use a single sensor
- Each pixel is overlaid with a color filter such that only one color channel is acquired at a particular pixel location
- The Bayer array is the most common color filter array
- Green is sampled at twice the density of red and blue since the human visual system (HVS) is more sensitive in the green region of the spectrum
- The quincunx sampling arrangement ensures that aliasing in the green channel is least along the horizontal and vertical directions
- The full color image is recovered in a post-processing stage known as demosaicking

Direct color imaging
- The Foveon X3 sensor captures colors at different depths at the same spatial location
- The increased density leads to much better spatial resolution
- The spectral sensitivity functions at the different layers have substantial overlap
- Color separation is a major issue for such sensors
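The measurement equation above can be approximated numerically once the spectra are sampled on a wavelength grid. The sketch below is a minimal illustration using synthetic Gaussian filter transmittances and flat sensor/illuminant spectra; none of these curves describe a real sensor, and all names are placeholders.

```python
import numpy as np

lam = np.arange(400.0, 701.0, 5.0)   # wavelength grid in nm
dlam = 5.0                           # grid spacing in nm

def gaussian(center, width):
    return np.exp(-0.5 * ((lam - center) / width) ** 2)

f = [gaussian(610, 30), gaussian(540, 30), gaussian(460, 30)]  # R, G, B filter transmittances f_i(λ)
g = np.ones_like(lam)                 # sensor sensitivity g(λ)
x = 0.5 + 0.3 * gaussian(550, 60)     # surface reflectance x(λ)
l = np.ones_like(lam)                 # illuminant power density l(λ)

rng = np.random.default_rng(0)
# c_i = ∫ f_i(λ) g(λ) x(λ) l(λ) dλ + η_i, approximated by a Riemann sum plus noise
c = [float(np.sum(fi * g * x * l) * dlam + rng.normal(0.0, 0.01)) for fi in f]
print(c)  # three channel responses in arbitrary units
```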
Digital camera pipeline

Lens assembly
- IR blocking (hot mirror)
- Anti-aliasing: blurs to increase spatial correlation among color channels to help with demosaicking

Focus control
- Active auto-focus systems use IR emitters to estimate distance
- A passive method dynamically adjusts the focus setting to maximize high-frequency energy

Exposure control
- Good contrast across the image by manipulating aperture size and exposure time
- Prevents over- and under-exposed images

Corrections
- Correct for lens distortion: barrel (fish-eye), pincushion (telephoto), vignetting (reduced brightness at edges)
- Gamma correction to compensate for nonlinearity of the sensor response (opto-electronic conversion function)
- Compensation for dark current: capture an appropriate "dark image" and subtract it from the acquired image
- Lens flare (scattered light) compensation (mostly proprietary)

White balancing
- The HVS is remarkably adaptive; e.g., paper appears white under incandescent light or sunlight
- An imaging system will integrate the spectral content of the irradiance; without color compensation, images appear unnatural and dissimilar to the viewed scenes
- White balancing algorithms are based on one of two philosophies:
  - Gray-world assumption: R ← k_r R, B ← k_b B, with k_r = G_mean / R_mean and k_b = G_mean / B_mean
  - Perfect reflector method: the brightest pixel corresponds to white; R ← R / R_max, G ← G / G_max, B ← B / B_max
(A code sketch of the gray-world rule follows at the end of this pipeline summary.)

Demosaicking
- Reconstruct the sparsely sampled signal to form a 3-color image
- Multitude of methods based on heuristics, properties of the HVS, and mathematical formulations
- Since the Bayer array is the most common, most algorithms are tailored specifically for it
- Effective algorithms use inter-channel correlation

Color space conversion
- The captured image is in the digital camera color space; colors are not impulses at specific wavelengths, and the sensitivity functions of the camera color sensors dictate the camera color space
- The camera-RGB image is transformed to one of many standard color spaces; most commonly, the transformation is camera-RGB → CIEXYZ
- The CIEXYZ space, defined by the CIE (Commission Internationale de l'Eclairage, the International Commission on Illumination), corresponds to the human visual subspace
- Many enhancement algorithms use non-RGB color spaces

Post-processing
- Removal of color artifacts due to demosaicking: algorithms based on the constant-hue assumption
- Sharpening: performed on the luminance component only
- Denoising: median filters, bilateral filtering, and thresholding

Output
- Display: images are converted to a format appropriate for the display medium (sRGB for monitors, CMY/CMYK for printers)
- Compression: most cameras offer flexible compression options; JPEG is standard in current models, with some JPEG2000
- Storage: low-end cameras offer only JPEG images as output; some high-end point-and-shoot cameras and most SLRs allow retrieval of unprocessed RAW images, which can be processed later on a PC without time and computational constraints

Stanley J. Reeves, ELEC 7450 - Digital Image Processing
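As a concrete illustration of the gray-world white balancing rule in the pipeline above, here is a minimal sketch; it assumes a floating-point RGB image with channels in the last axis, and the function name and test image are illustrative.

```python
import numpy as np

def gray_world_white_balance(img):
    """Scale R and B so their means match the green mean (gray-world assumption)."""
    r_mean = img[..., 0].mean()
    g_mean = img[..., 1].mean()
    b_mean = img[..., 2].mean()
    k_r = g_mean / r_mean
    k_b = g_mean / b_mean
    out = img.copy()
    out[..., 0] *= k_r   # R <- k_r * R
    out[..., 2] *= k_b   # B <- k_b * B
    return out

# Illustrative usage: a random image with a simulated red cast is pulled back toward neutral.
rng = np.random.default_rng(0)
img = rng.uniform(0.0, 1.0, (64, 64, 3))
img[..., 0] = np.clip(img[..., 0] * 1.4, 0.0, 1.0)   # red color cast
balanced = gray_world_white_balance(img)
```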
Recommended publications
  • Management of Large Sets of Image Data: Capture, Databases, Image Processing, Storage, Visualization (Karol Kozak)
    Management of large sets of image data: Capture, Databases, Image Processing, Storage, Visualization. Karol Kozak. 1st edition © 2014 Karol Kozak & bookboon.com. ISBN 978-87-403-0726-9.
    Contents: 1 Digital image; 2 History of digital imaging; 3 Amount of produced images – is it danger?; 4 Digital image and privacy; 5 Digital cameras (5.1 Methods of image capture); 6 Image formats; 7 Image Metadata – data about data; 8 Interactive visualization (IV); 9 Basic of image processing; 10 Image Processing software; 11 Image management and image databases; 12 Operating system (OS) and images; 13 Graphics processing unit (GPU); 14 Storage and archive; 15 Images in different disciplines (15.1 Microscopy, 15.2 Medical imaging, 15.3 Astronomical images, 15.4 Industrial imaging); 16 Selection of best digital images; References.
  • What Resolution Should Your Images Be?
    What Resolution Should Your Images Be? The best way to determine the optimum resolution is to think about the final use of your images. For publication you'll need the highest resolution, for desktop printing lower, and for web or classroom use, lower still. The following table is a general guide (use; pixel size; resolution; preferred file format; approx. file size); detailed explanations follow.
    - Projected in class: about 1024 pixels wide for a horizontal image, or 768 pixels high for a vertical one; 102 DPI; JPEG; 300–600 KB
    - Web site: about 400–600 pixels wide for a large image, 100–200 for a thumbnail image; 72 DPI; JPEG; 20–200 KB
    - Printed in a book or art magazine: multiply intended print size by resolution, e.g. an image to be printed as 6" W x 4" H would be 1800 x 1200 pixels; 300 DPI; EPS or TIFF; 6–10 MB
    - Printed on a laserwriter: multiply intended print size by resolution, e.g. an image to be printed as 6" W x 4" H would be 1200 x 800 pixels; 200 DPI; EPS or TIFF; 2–3 MB
    Digital Camera Photos: Digital cameras have a range of preset resolutions which vary from camera to camera (designation; resolution; max. image size at 300 DPI; printable size on a color printer).
    - 4 Megapixels: 2272 x 1704 pixels; 7.5" x 5.7"; 12" x 9"
    - 3 Megapixels: 2048 x 1536 pixels; 6.8" x 5"; 11" x 8.5"
    - 2 Megapixels: 1600 x 1200 pixels; 5.3" x 4"; 6" x 4"
    - 1 Megapixel: 1024 x 768 pixels; 3.5" x 2.5"; 5" x 3"
    If you can, you generally want to shoot larger than you need, then sharpen the image and reduce its size in Photoshop.
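The "multiply intended print size by resolution" rule from the table is a one-line calculation. A small sketch (illustrative function name and values):

```python
def pixels_for_print(width_in, height_in, dpi):
    """Pixel dimensions needed to print at a given physical size and resolution."""
    return round(width_in * dpi), round(height_in * dpi)

print(pixels_for_print(6, 4, 300))  # book/magazine at 300 DPI -> (1800, 1200)
print(pixels_for_print(6, 4, 200))  # laser printer at 200 DPI -> (1200, 800)
```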
  • Invention of Digital Photograph
    Invention of the Digital Photograph. Digital photography uses cameras containing arrays of electronic photodetectors to capture images focused by a lens, as opposed to an exposure on photographic film. The captured images are digitized and stored as a computer file ready for further digital processing, viewing, electronic publishing, or digital printing. Until the advent of such technology, photographs were made by exposing light-sensitive photographic film and paper, which was processed in liquid chemical solutions to develop and stabilize the image. Digital photographs are typically created solely by computer-based photoelectric and mechanical techniques, without wet bath chemical processing. The first consumer digital cameras were marketed in the late 1990s.[1] Professionals gravitated to digital slowly, and were won over when their professional work required using digital files to fulfill the demands of employers and/or clients, for faster turnaround than conventional methods would allow.[2] Starting around 2000, digital cameras were incorporated into cell phones, and in the following years cell phone cameras became widespread, particularly due to their connectivity to social media websites and email. Since 2010, the digital point-and-shoot and DSLR formats have also seen competition from the mirrorless digital camera format, which typically provides better image quality than the point-and-shoot or cell phone formats but comes in a smaller size and shape than the typical DSLR. Many mirrorless cameras accept interchangeable lenses and have advanced features through an electronic viewfinder, which replaces the through-the-lens finder image of the SLR format. While digital photography has only relatively recently become mainstream, the late 20th century saw many small developments leading to its creation.
  • Multispectral Imaging for Medical and Industrial Machine Vision Systems
    Tech Guide: Multispectral imaging for medical and industrial machine vision systems
    Table of contents: Introduction; Chapter 1: What is multispectral imaging?; Chapter 2: Multispectral imaging applications; Chapter 3: Multispectral camera technologies; Chapter 4: Key considerations when selecting camera technology for multispectral imaging; Chapter 5: Hyperspectral and the future of multispectral imaging.
    Introduction: Just as machine vision systems have evolved from traditional monochrome cameras to many systems that now utilize full color imaging information, there has also been an evolution from systems that only captured broadband images in the visible spectrum, to those that can utilize targeted spectral bands in both visible and non-visible spectral regions to perform more sophisticated inspection and analysis. The color output of the cameras used in the machine vision industry today is largely based on Bayer-pattern or trilinear sensor technology. But imaging is moving well beyond conventional color where standard RGB is not enough to carry out inspection tasks. Some applications demand unconventional RGB wavelength bands while others demand a combination of visible and non-visible wavelengths. Others require exclusively non-visible wavelengths such as UV, NIR or SWIR, with no wavebands in the visible spectrum. Complex metrology and imaging applications are beginning to demand higher numbers of spectral channels or possibilities to select application-specific spectral filtering at high inspection throughputs. With the traditional machine vision industry merging with intricate measurement technologies, consistent, reliable, high-fidelity color and multispectral imaging are playing key roles in industrial quality control.
  • Spatial Frequency Response of Color Image Sensors: Bayer Color Filters and Foveon X3 Paul M
    Spatial Frequency Response of Color Image Sensors: Bayer Color Filters and Foveon X3. Paul M. Hubel, John Liu and Rudolph J. Guttosch, Foveon, Inc., Santa Clara, California.
    Abstract: We compared the Spatial Frequency Response (SFR) of image sensors that use the Bayer color filter pattern and Foveon X3 technology for color image capture. Sensors for both consumer and professional cameras were tested. The results show that the SFR for Foveon X3 sensors is up to 2.4x better. In addition to the standard SFR method, we also applied the SFR method using a red/blue edge. In this case, the X3 SFR was 3–5x higher than that for Bayer filter pattern devices.
    Introduction: In their native state, the image sensors used in digital image capture devices are black-and-white. To enable color capture, small color filters are placed on top of each photodiode. The filter pattern most often used is derived in some way from the Bayer pattern [1], a repeating array of red, green, and blue pixels that lie ...
    Bayer Background: The Bayer pattern, also known as a Color Filter Array (CFA) or a mosaic pattern, is made up of a repeating array of red, green, and blue filter material deposited on top of each spatial location in the array (Figure 1: typical Bayer filter pattern showing the alternate sampling of red, green and blue pixels; rows alternate R G R G and G B G B). These tiny filters enable what is normally a black-and-white sensor to create color images. By using 2 green filtered pixels for every red or blue, the Bayer pattern is designed to maximize perceived sharpness in the luminance channel, composed mostly of green information.
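To make the sampling pattern of Figure 1 concrete, here is a minimal sketch that simulates a Bayer mosaic from a full-RGB image, assuming an RGGB tiling as in the figure; the array shapes and names are illustrative.

```python
import numpy as np

def bayer_mosaic(rgb):
    """Keep one color sample per pixel according to an RGGB Bayer tiling."""
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w), dtype=rgb.dtype)
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]  # R at even rows, even columns
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]  # G at even rows, odd columns
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]  # G at odd rows, even columns
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]  # B at odd rows, odd columns
    return mosaic

rgb = np.random.default_rng(0).random((4, 6, 3))
print(bayer_mosaic(rgb).shape)  # (4, 6): one sample per pixel, half of them green
```

Half of the retained samples are green, which is what gives the Bayer pattern the luminance-sharpness advantage described above.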
  • United States Patent 6,072,635 (Hashizume et al.)
    US 6,072,635 - United States Patent - Hashizume et al. - Date of Patent: Jun. 6, 2000
    (54) DICHROIC PRISM AND PROJECTION DISPLAY APPARATUS
    (75) Inventors: Toshiaki Hashizume, Okaya; Akitaka Yajima, Tatsuno-machi, both of Japan. (73) Assignee: Seiko Epson Corporation, Tokyo, Japan
    (21) Appl. No.: 09/112,132. (22) Filed: Jul. 9, 1998. (30) Foreign Application Priority Data: Jul. 15, 1997 (JP) Japan 9-1900OS
    (51) Int. Cl.: G02B 27/12; G02B 27/14. (52) U.S. Cl.: 359/640; 359/634. (58) Field of Search: 359/634, 637, 640
    (56) References Cited - Foreign Patent Documents: 7-294.845 11/1995 Japan; U.S. ...
    Primary Examiner: Ricky Mack. Attorney, Agent, or Firm: Oliff & Berridge, PLC
    (57) ABSTRACT: The invention provides a dichroic prism capable of reducing displacements of projection pixels of colors caused by chromatic aberration of magnification. A dichroic prism is formed in the shape of a quadrangular prism as a whole by joining four rectangular prisms together. A red reflecting dichroic plane and a blue reflecting dichroic plane intersect to form substantially an X shape along junction surfaces of the prisms. The red reflecting dichroic plane is convex shaped by partly changing the thickness of an adhesive layer for connecting the rectangular prisms together. Accordingly, since a red beam can be guided to a projection optical system while being enlarged, it is possible to reduce a projection image of the red beam to be projected onto a projection plane via the projection optical system. This makes it possible to reduce relative displacements of projection pixels.
  • 1/2-Inch Megapixel CMOS Digital Image Sensor
    MT9M001: 1/2-Inch Megapixel CMOS Digital Image Sensor. MT9M001C12STM (Monochrome) Datasheet, Rev. M. For the latest datasheet, please visit www.onsemi.com.
    Features:
    - Array Format (5:4): 1,280H x 1,024V (1,310,720 active pixels); total (incl. dark pixels): 1,312H x 1,048V (1,374,976 pixels)
    - Frame Rate: 30 fps progressive scan; programmable
    - Shutter: Electronic Rolling Shutter (ERS)
    - Window Size: SXGA; programmable to any smaller format (VGA, QVGA, CIF, QCIF, etc.)
    - Programmable Controls: gain, frame rate, frame size
    Table 1: Key Performance Parameters
    - Optical format: 1/2-inch (5:4)
    - Active imager size: 6.66 mm (H) x 5.32 mm (V)
    - Active pixels: 1,280 H x 1,024 V
    - Pixel size: 5.2 µm x 5.2 µm
    - Shutter type: electronic rolling shutter (ERS)
    - Maximum data rate / master clock: 48 MPS / 48 MHz
    - Frame rate: SXGA (1280 x 1024), 30 fps progressive scan; programmable
    - ADC resolution: 10-bit, on-chip
    - Responsivity: 2.1 V/lux-sec
    - Dynamic range: 68.2 dB
    - SNR (max): 45 dB
    - Supply voltage: 3.0 V–3.6 V, 3.3 V nominal
    - Power consumption: 363 mW at 3.3 V (operating); 294 µW (standby)
    - Operating temperature: 0°C to +70°C
    - Packaging: 48-pin CLCC
    Applications: digital still cameras, digital video cameras, PC cameras.
    General Description: The ON Semiconductor MT9M001 is an SXGA-format, 1/2-inch CMOS active-pixel digital image sensor with an active imaging pixel array of 1,280H x 1,024V. It incorporates sophisticated camera functions on-chip such as windowing and column and row skip mode. The sensor can be operated in its default mode or programmed by the user for frame size, exposure, gain setting, and other parameters.
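A quick sanity check on the datasheet numbers (a worked calculation, not text from the datasheet): at full SXGA resolution and 30 fps, the pixel rate stays below the quoted maximum data rate of 48 megapixels per second.

```python
width, height, fps, adc_bits = 1280, 1024, 30, 10
pixel_rate = width * height * fps       # pixels per second
print(pixel_rate / 1e6)                 # ~39.3 Mpixels/s, under the 48 Mpixels/s maximum
print(pixel_rate * adc_bits / 1e6)      # ~393 Mbit/s of raw 10-bit data
```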
  • More About Digital Cameras: Image Characteristics
    More about Digital Cameras - Image Characteristics. Several important characteristics of digital images include:
    Physical Size: How big is the image that has been captured, as measured in inches or pixels?
    File Size: How large is the computer file that makes up the image, as measured in kilobytes or megabytes?
    Pixels: All digital images taken with a digital camera are made up of pixels (short for picture elements). A pixel is the smallest part (sometimes called a point or a dot) of a digital image; the total number of pixels makes up the image and helps determine its size and its resolution, or how much information is included in the image when we view it. Generally speaking, the larger the number of pixels an image contains, the sharper it will appear, especially when it is enlarged, which is what happens when we want to print our photographs larger than will fit into small 3 1/2 x 5 inch or 5 x 7 inch frames. You will notice in the first picture below that the Grand Canyon is in sharp focus and there is a large amount of detail in the image. However, when the image is enlarged to an extreme level, the individual pixels that make up the image become visible, and the image is no longer clear and sharp.
    Megapixels: The term megapixel means one million pixels. When we discuss how sharp a digital image is or how much resolution it has, we usually refer to the number of megapixels that make up the image. One of the biggest selling features of digital cameras is the number of megapixels they are capable of producing when a picture is taken.
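A small worked example of the megapixel arithmetic described above, reusing the 300 DPI print rule from the earlier resolution table (function names are illustrative):

```python
def megapixels(width_px, height_px):
    """Total pixel count expressed in millions of pixels."""
    return width_px * height_px / 1_000_000

def max_print_size_inches(width_px, height_px, dpi=300):
    """Largest print size (in inches) at a given resolution."""
    return width_px / dpi, height_px / dpi

print(megapixels(2272, 1704))             # ~3.87 million pixels ("4 megapixel" class)
print(max_print_size_inches(2272, 1704))  # about 7.6 x 5.7 inches at 300 DPI
```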
  • Cameras
    Cameras. Digital Visual Effects, Spring 2007, Yung-Yu Chuang, 2007/3/6 (with slides by Fredo Durand, Brian Curless, Steve Seitz and Alexei Efros)
    Outline: • Pinhole camera • Film camera • Digital camera • Video camera
    Camera trial #1: Put a piece of film in front of an object.
    Pinhole camera: Add a barrier to block off most of the rays. • It reduces blurring • The pinhole is known as the aperture • The image is inverted
    Shrinking the aperture: Why not make the aperture as small as possible? • Less light gets through • Diffraction effect
    High-end commercial pinhole cameras: $200~$700
    Adding a lens: A lens focuses light onto the film. • There is a specific distance at which objects are "in focus" • Other points project to a "circle of confusion" in the image
    Lenses: Thin lens equation. • Any object point satisfying this equation is in focus • Thin lens applet: http://www.phy.ntnu.edu.tw/java/Lens/lens_e.html
    Exposure = aperture + shutter speed: • Aperture of diameter D restricts the range of rays (aperture may be on either side of the lens) • Shutter speed is the amount of time that light is allowed to pass through the aperture
    Exposure: • Two main parameters: aperture (in f-stop) and shutter speed (in fraction of a second)
    Effects of shutter speeds: • Slower shutter speed => more light, but more motion blur • Faster shutter speed freezes motion
    Aperture: • Aperture is the diameter of the lens opening, usually specified by f-stop, f/D, a fraction of the focal length.
    Depth of field: ...
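The thin lens equation named in the excerpt is the standard relation 1/f = 1/d_o + 1/d_i between focal length f, object distance d_o, and image distance d_i; the excerpt itself only names the equation, so this form is supplied here as a well-known assumption. A minimal sketch solving it for the in-focus image distance:

```python
def image_distance(focal_length_mm, object_distance_mm):
    """Solve the thin lens equation 1/f = 1/d_o + 1/d_i for the image distance d_i."""
    if object_distance_mm == focal_length_mm:
        raise ValueError("object at the focal plane: image forms at infinity")
    return 1.0 / (1.0 / focal_length_mm - 1.0 / object_distance_mm)

# A 50 mm lens focused on an object 2 m away forms the image ~51.3 mm behind the lens.
print(image_distance(50.0, 2000.0))
```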
  • Oxnard Course Outline
    Course ID: DMS R120B. Curriculum Committee Approval Date: 04/25/2018. Catalog Start Date: Fall 2018.
    COURSE OUTLINE - OXNARD COLLEGE
    I. Course Identification and Justification:
    A. Proposed course id: DMS R120B. Banner title: AdobePhotoShop II. Full title: Adobe PhotoShop II
    B. Reason(s) course is offered: This course provides the development of skills necessary to combine the use of Photoshop digital image editing software with Adobe Lightroom's expanded digital photographic image editing abilities. These skills will enhance a student's ability to enter into employment positions such as web master, graphics design, and digital image processing.
    C. C-ID: 1. C-ID Descriptor: 2. C-ID Status:
    D. Co-listed as: Current: None
    II. Catalog Information:
    A. Units: Current: 3.00
    B. Course Hours: 1. In-Class Contact Hours: Lecture: 43.75; Activity: 0; Lab: 26.25. 2. Total In-Class Contact Hours: 70. 3. Total Outside-of-Class Hours: 87.5. 4. Total Student Learning Hours: 157.5
    C. Prerequisites, Corequisites, Advisories, and Limitations on Enrollment: 1. Prerequisites: Current: DMS R120A: Adobe Photoshop I. 2. Corequisites: Current: 3. Advisories: Current: 4. Limitations on Enrollment: Current:
    D. Catalog description: Current: This course will continue the development of students' skills in the use of Adobe Photoshop digital image editing software by integrating the enhanced editing capabilities of Adobe Lightroom into the Adobe Photoshop workflow. Students will learn how to "punch up" colors in specific areas of digital photographs, how to make dull-looking shots vibrant, remove distracting objects, straighten skewed shots, and how to use Photoshop and Lightroom to create panoramas, edit Adobe raw DNG photos on a mobile device, and apply Boundary Wrap to a merged panorama to prevent loss of detail in the image, among other skills.
  • Characterization of Color Cross-Talk of CCD Detectors and Its Influence in Multispectral Quantitative Phase Imaging
    Characterization of color cross-talk of CCD detectors and its influence in multispectral quantitative phase imaging
    Azeem Ahmad (1,2,3), Anand Kumar (1), Vishesh Dubey (1,2), Ankit Butola (1), Balpreet Singh Ahluwalia (2), Dalip Singh Mehta (1,4)
    1 Department of Physics, Indian Institute of Technology Delhi, Hauz Khas, New Delhi 110016, India
    2 Department of Physics and Technology, UiT The Arctic University of Norway, Tromsø 9037, Norway
    [email protected] [email protected]
    Abstract: Multi-spectral quantitative phase imaging (QPI) is an emerging imaging modality for wavelength-dependent studies of several biological and industrial specimens. Simultaneous multi-spectral QPI is generally performed with color CCD cameras. However, color CCD cameras suffer from the color crosstalk issue, which needs to be explored. Here, we present a new approach for accurately measuring the color crosstalk of 2D area detectors, without needing prior information about camera specifications. The color crosstalk of two different cameras commonly used in QPI, single chip CCD (1-CCD) and three chip CCD (3-CCD), is systematically studied and compared using compact interference microscopy. The influence of color crosstalk on the fringe width and the visibility of the monochromatic constituents corresponding to the three color channels of a white light interferogram is studied both through simulations and experiments. It is observed that the presence of color crosstalk changes the fringe width and visibility over the imaging field of view. This leads to an unwanted non-uniform background error in the multi-spectral phase imaging of the specimens. It is demonstrated that the color crosstalk of the detector is the key limiting factor for the phase measurement accuracy of simultaneous multi-spectral QPI systems.
  • Comparative Analysis of Color Architectures for Image Sensors
    Comparative analysis of color architectures for image sensors
    Peter B. Catrysse (a)*, Brian A. Wandell (b), Abbas El Gamal (a)
    (a) Dept. of Electrical Engineering, Stanford University, CA 94305, USA
    (b) Dept. of Psychology, Stanford University, CA 94305, USA
    ABSTRACT: We have developed a software simulator to create physical models of a scene, compute camera responses, render the camera images, and measure the perceptual color errors (CIELAB) between the scene and rendered images. The simulator can be used to measure color reproduction errors and analyze the contributions of different sources to the error. We compare three color architectures for digital cameras: (a) a sensor array containing three interleaved color mosaics, (b) an architecture using dichroic prisms to create three spatially separated copies of the image, and (c) a single sensor array coupled with a time-varying color filter measuring three images sequentially in time. Here, we analyze the color accuracy of several exposure control methods applied to these architectures. The first exposure control algorithm (traditional) simply stops image acquisition when one channel reaches saturation. In a second scheme, we determine the optimal exposure time for each color channel separately, resulting in a longer total exposure time. In a third scheme we restrict the total exposure duration to that of the first scheme, but we preserve the optimum ratio between color channels. Simulator analyses measure the color reproduction quality of these different exposure control methods as a function of illumination, taking into account photon and sensor noise, quantization and color conversion errors. Keywords: color, digital camera, image quality, CMOS image sensors.
    1. INTRODUCTION: The use of CMOS sensors in digital cameras has created new opportunities for developing digital camera architectures.