RawTherapee 4 User Manual

Total pages: 16

File type: PDF, size: 1020 KB

RawTherapee 4.0.10 User Manual

Downloaded from Google Docs on 2013-03-08:
https://docs.google.com/document/d/1DHLb_6xNQsEInxiuU8pz1-sWNinnj09bpBUA4_Vl8w8/edit

If you would like to help translate this manual, read this forum introductory thread:
http://www.rawtherapee.com/forum/viewtopic.php?f=4&t=2943#p21647

Table of Contents

  • Introduction
      • Welcome!
  • Features
      • General Features
      • Exposure and Color Features
      • Detail Features
      • Transformation Features
      • Raw Pre-Demosaicing Features
      • The Floating Point Engine
  • Installation notes
      • Making a Portable Installation
          • For Windows
          • For Linux
  • Using RawTherapee
      • First Run
      • Editor Tab Modes
      • The Image Editor Tab
          • The Left Panel
          • The Right Panel
          • Preview modes
          • Saving
      • The Batch Queue
      • 8-bit and 16-bit
      • Naming
      • Sidecar Files - Processing Profiles
      • Creating processing profiles for general use
      • Closing RawTherapee
      • Edit Current Image in External Editor
  • The Toolbox
      • Panels
      • Sliders
      • Curve Editors
      • The Preview Area
      • Exposure Tab
          • Exposure panel
          • Highlight Reconstruction
          • Shadows/Highlights
          • Tone Mapping
          • Lab Adjustments
          • RGB versus Lab
          • CIE Color Appearance Model 2002
      • Detail Tab
          • Sharpening
          • Edges
          • Microcontrast
          • Impulse Noise Reduction
          • Noise Reduction
          • Defringe
          • Contrast by Detail Levels
      • Color Tab
          • White Balance
          • Vibrance
          • Channel Mixer
          • HSV Equalizer
          • RGB Curves
          • Color Management
      • Transform tab
          • Crop
          • Resize
          • Lens / Geometry
      • Raw Tab
          • Demosaicing
          • Preprocessing
          • Raw White & Black Points
          • Dark Frame
          • Bad Pixels
          • Flat Field
          • Chromatic Aberration
      • Metadata tab
          • Exif tab
          • IPTC tab
  • Preferences Window
Recommended publications
  • Breaking Down the “Cosine Fourth Power Law”
    Breaking Down The "Cosine Fourth Power Law", by Ronian Siew, inopticalsolutions.com
    Why are the corners of the field of view in the image captured by a camera lens usually darker than the center? For one thing, camera lenses by design often introduce "vignetting" into the image, which is the deliberate clipping of rays at the corners of the field of view in order to cut away excessive lens aberrations. But it is also known that corner areas in an image can get dark even without vignetting, due in part to the so-called "cosine fourth power law" [1]. According to this "law," when a lens projects the image of a uniform source onto a screen, in the absence of vignetting, the illumination flux density (i.e., the optical power per unit area) across the screen from the center to the edge varies according to the fourth power of the cosine of the angle between the optic axis and the oblique ray striking the screen. Actually, optical designers know this "law" does not apply generally to all lens conditions [2–10]. Fundamental principles of optical radiative flux transfer in lens systems allow one to tune the illumination distribution across the image by varying lens design characteristics. In this article, we take a tour into the fascinating physics governing the illumination of images in lens systems.
    Relative Illumination In Lens Systems: In lens design, one characterizes the illumination distribution across the screen where the image resides in terms of a quantity known as the lens' relative illumination — the ratio of the irradiance (i.e., the power per unit area) at any off-axis position of the image to the irradiance at the center of the image.
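    A quick numerical illustration of the cos⁴ falloff described above; this is a minimal sketch in Python, with the field angles chosen by me and with vignetting and pupil effects ignored, which is exactly the idealized case the article says the "law" assumes:

        import math

        # Relative illumination predicted by the cosine-fourth-power model:
        # E(theta) / E(0) = cos^4(theta), ignoring vignetting and pupil aberrations.
        for theta_deg in (0, 10, 20, 30, 40):
            rel = math.cos(math.radians(theta_deg)) ** 4
            print(f"{theta_deg:2d} deg off-axis -> {100 * rel:5.1f}% of center illumination")

    At 40° off-axis the simple model already predicts only about a third of the center illumination, which is why, as the article notes, designers can tune lens characteristics to reshape the illumination distribution rather than accept the cos⁴ prediction.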
  • Estimation and Correction of the Distortion in Forensic Image Due to Rotation of the Photo Camera
    Master Thesis, Electrical Engineering with emphasis on Signal Processing, February 2018. Estimation and Correction of the Distortion in Forensic Image due to Rotation of the Photo Camera. Sathwika Bavikadi and Venkata Bharath Botta, Department of Applied Signal Processing, Blekinge Institute of Technology, SE–371 79 Karlskrona, Sweden. This thesis is submitted to the Department of Applied Signal Processing at Blekinge Institute of Technology in partial fulfillment of the requirements for the degree of Master of Science in Electrical Engineering with Emphasis on Signal Processing. Contact information: Sathwika Bavikadi, e-mail: [email protected]; Venkata Bharath Botta, e-mail: [email protected]. Supervisor: Irina Gertsovich. University examiner: Dr. Sven Johansson, Department of Applied Signal Processing, Blekinge Institute of Technology, SE–371 79 Karlskrona, Sweden. Internet: www.bth.se. Phone: +46 455 38 50 00. Fax: +46 455 38 50 57.
    Abstract: Images, unlike text, are an effective and natural communication medium for humans because of their immediacy and the ease with which their content is understood. Shape recognition and pattern recognition are among the most important tasks in image processing. Crime scene photographs should always be in focus, and a ruler should always be present so that investigators can rescale the image and accurately reconstruct the scene. The camera must therefore be mounted on a stable platform, such as a tripod. Rotation of the camera around the camera center introduces a distortion in the image which must be minimized.
  • darktable 1.2
    darktable 1.2. Copyright © 2010-2012 P.H. Andersson; Copyright © 2010-2011 Olivier Tribout; Copyright © 2012-2013 Ulrich Pegelow. The owner of the darktable project is Johannes Hanika. Main developers are Johannes Hanika, Henrik Andersson, Tobias Ellinghaus, Pascal de Bruijn and Ulrich Pegelow. darktable is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. darktable is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with darktable. If not, see http://www.gnu.org/licenses/. The present user manual is under the CC BY-SA license, meaning Attribution-ShareAlike. You can visit http://creativecommons.org/about/licenses/ to get more information.
    Table of Contents: Preface to this manual; 1. Overview; 1.1. User interface; 1.1.1. Views ...
  • Camera Raw Workflows
    RAW WORKFLOWS: FROM CAMERA TO POST. Copyright 2007, Jason Rodriguez, Silicon Imaging, Inc.
    Introduction: What is a RAW file format, and which cameras shoot to these formats? How does working with RAW-format cameras change the way I shoot? What changes happen inside the camera that I need to be aware of, and what happens when I go into post? What are the available post paths? Is there just one, or are there many ways to reach my end goals? What post tools support RAW file-format workflows? How do RAW codecs like CineForm RAW enable me to work faster and with more efficiency?
    What is a RAW file? In the simplest terms, it is the native digital data off the sensor's A/D converter with no further destructive DSP processing applied. It is derived from a photometrically linear data source, or can be reconstructed to produce data that directly correspond to the light captured by the sensor at the time of exposure (i.e., a LOG->Lin reverse LUT). Photometrically linear means a 1:1 relationship between photons and digital values: doubling the light means doubling the digitally encoded value. In a film analogy, a RAW file would be termed a "digital negative", because it is a latent representation of the light that was captured by the sensor (up to the limit of the sensor's full-well capacity). "RAW" cameras include the Thomson Viper, Arri D-20, Dalsa Evolution 4K, Silicon Imaging SI-2K, Red One, Vision Research Phantom, noXHD, and Reel-Stream; "quasi-RAW" cameras include the Panavision Genesis. In-camera processing: most non-RAW cameras on the market record to 8-bit YUV formats.
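    A small numeric sketch of the "photometrically linear" point above. The sensor gain, 14-bit range, and pure 1/2.2 gamma below are illustrative assumptions, not properties of any camera named in the excerpt; the point is simply that doubling the light doubles the raw value, while a gamma-encoded 8-bit rendering does not:

        # Raw sensor data is (ideally) linear in exposure; display-referred 8-bit
        # values are gamma-encoded, so the same one-stop steps look very different.
        def raw_value(photons, gain=0.25, full_well_dn=16383):
            """Hypothetical 14-bit linear sensor response."""
            return min(int(photons * gain), full_well_dn)

        def gamma_8bit(linear):
            """Simplified display encoding: pure 1/2.2 power law on [0, 1]."""
            return round(255 * linear ** (1 / 2.2))

        for photons in (1000, 2000, 4000):  # each step is +1 stop of light
            dn = raw_value(photons)
            print(photons, "photons ->", dn, "raw DN,", gamma_8bit(dn / 16383), "as 8-bit gamma")

    The raw digital numbers go 250 -> 500 -> 1000, doubling with each stop, while the 8-bit gamma values go roughly 38 -> 52 -> 72; this is why non-linear (log or gamma) encodings need a reverse LUT before the data can be treated as scene-referred, as the excerpt notes.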
  • FOSS & Photography
    FOSS & Photography. What is FOSS, and how does it relate to photography? FOSS stands for Free/Open Source Software. Not free as in price, although this is also often the case, but free as in freedom. You are free to run the program for any purpose you choose, be it personal or commercial. You are free to make copies and share them with your friends and family. You are free to inspect and modify/improve the source code, and to share the results. I discovered FOSS many years ago, and as a philosophical decision, my digital equipment runs almost exclusively on free software: my OS (GNU/Linux), web browser (Firefox), office suite (LibreOffice), RAW development (RawTherapee), photo editing (Gimp - the GNU Image Manipulation Program). The entire system I am writing this on is run by free software. OK Nick, how does all this help me as a photographer? We all know how expensive this passion of ours can become. Between camera bodies, various lenses, filters, supports, lights, and the software to develop our creations, the expense can become extreme. Most people don't know about FOSS or the interesting projects that have been developed for the photographic community, so for those who share my philosophical disposition, or those who wouldn't mind saving some money, I thought I would present a couple of FOSS options for photo editing: RawTherapee and Gimp. With RawTherapee and Gimp you have two very powerful tools to develop your photos. Both are FOSS, and as such are free to download and update. Both are in constant development and improvement by dedicated teams of individuals who are as passionate about photography as we are.
  • Effects on Map Production of Distortions in Photogrammetric Systems
    EFFECTS ON MAP PRODUCTION OF DISTORTIONS IN PHOTOGRAMMETRIC SYSTEMS. J. V. Sharp and H. H. Hayes, Bausch and Lomb Opt. Co.
    I. INTRODUCTION. This report concerns the results of nearly two years of investigation of the problem of correlating known distortions in photogrammetric systems of mapping with observed effects in map production. To begin with, distortion is defined as the displacement of a point from its true position in any plane image formed in a photogrammetric system. By photogrammetric systems of mapping is meant every major type (1) of photogrammetric instrument. The distortions investigated are of a magnitude which is considered by many photogrammetrists to be negligible, but unfortunately this is not always true of their combined effects. Buyers of finished maps need not be alarmed by open discussion of these effects of distortion in photogrammetric systems for producing maps. The effect of these distortions does not limit the accuracy of the map you buy; rather, it restricts those who use photogrammetric systems to make maps to limits established by experience. Thus the users are limited to proper choice of flying heights for aerial photography and control of other related factors (2) in producing a map of the accuracy you specify. You, the buyers of maps, are safe behind your contract specifications. The real difference that distortions cause is the final cost of the map to you, which is often established by competitive bidding.
    II. PROBLEM. In examining the problem of correlating photogrammetric instruments with their resultant effects on map production, it is natural to ask how large the distortions are whose effects are being considered.
  • Model-Free Lens Distortion Correction Based on Phase Analysis of Fringe-Patterns
    sensors Article: Model-Free Lens Distortion Correction Based on Phase Analysis of Fringe-Patterns. Jiawen Weng 1,†, Weishuai Zhou 2,†, Simin Ma 1, Pan Qi 3 and Jingang Zhong 2,4,*. 1 Department of Applied Physics, South China Agricultural University, Guangzhou 510642, China; [email protected] (J.W.); [email protected] (S.M.). 2 Department of Optoelectronic Engineering, Jinan University, Guangzhou 510632, China; [email protected]. 3 Department of Electronics Engineering, Guangdong Communication Polytechnic, Guangzhou 510650, China; [email protected]. 4 Guangdong Provincial Key Laboratory of Optical Fiber Sensing and Communications, Guangzhou 510650, China. * Correspondence: [email protected]. † These authors contributed equally to this work.
    Abstract: Existing lens correction methods handle distortion correction through one or more specific image distortion models, and distortion determination may fail when an unsuitable model is used, so model-based methods have some drawbacks. A model-free lens distortion correction based on the phase analysis of fringe-patterns is proposed in this paper. Firstly, the mathematical relationship between the distortion displacement and the modulated phase of the sinusoidal fringe-pattern is established in theory. By phase demodulation analysis of the fringe-pattern, the distortion displacement map can be determined point by point for the whole distorted image, and the image correction is then achieved according to the distortion displacement map by a model-free approach. Furthermore, the distortion center, which is important for obtaining an optimal result, is measured automatically from the instantaneous frequency distribution, according to the character of the distortion. Numerical simulations and experiments with a wide-angle lens are carried out to validate the method.
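    The paper's contribution is recovering the displacement map itself from the demodulated fringe phase; the sketch below only illustrates the final step described in the abstract, resampling an image according to a given point-by-point displacement map. The function name and the toy radial displacement field are invented for illustration:

        import numpy as np
        from scipy.ndimage import map_coordinates

        def correct_with_displacement_map(image, dy, dx):
            """Resample `image` using a per-pixel displacement map (dy, dx).
            For every corrected pixel, (dy, dx) gives where to sample in the
            distorted image, as an offset from the pixel's own position."""
            h, w = image.shape
            rows, cols = np.mgrid[0:h, 0:w].astype(float)
            return map_coordinates(image, [rows + dy, cols + dx], order=1, mode="nearest")

        # Toy usage: a synthetic radial displacement applied to a gradient image.
        h, w = 256, 256
        img = np.tile(np.linspace(0.0, 1.0, w), (h, 1))
        yy, xx = np.mgrid[0:h, 0:w].astype(float)
        r2 = ((yy - h / 2) ** 2 + (xx - w / 2) ** 2) / (h * w)
        dy, dx = 0.05 * r2 * (yy - h / 2), 0.05 * r2 * (xx - w / 2)
        corrected = correct_with_displacement_map(img, dy, dx)

    Because the displacement map is defined per pixel, no parametric distortion model (radial, division, polynomial, etc.) is ever assumed, which is the "model-free" property the abstract emphasizes.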
  • Measuring Camera Shannon Information Capacity with a Siemens Star Image
    https://doi.org/10.2352/ISSN.2470-1173.2020.9.IQSP-347 © 2020, Society for Imaging Science and Technology. Measuring Camera Shannon Information Capacity with a Siemens Star Image. Norman L. Koren, Imatest LLC, Boulder, Colorado, USA.
    Abstract: Shannon information capacity, which can be expressed as bits per pixel or megabits per image, is an excellent figure of merit for predicting camera performance for a variety of machine vision applications, including medical and automotive imaging systems. Its strength is that it combines the effects of sharpness (MTF) and noise, but it has not been widely adopted because it has been difficult to measure and has never been standardized. We have developed a method for conveniently measuring information capacity from images of the familiar sinusoidal Siemens Star chart. The key is that noise is measured in the presence of the image signal, rather than in a separate location where image processing may be different — a commonplace occurrence with bilateral filters. The method also enables measurement of SNRi, which is a key performance metric for object detection. Information capacity is strongly affected by sensor noise, lens quality, ISO speed (Exposure Index), and the demosaicing algorithm, which affects aliasing.
    Measurement background: To measure signal and noise at the same location, we use an image of a sinusoidal Siemens-star test chart consisting of n_cycles total cycles, which we analyze by dividing the star into k radial segments (32 or 64), each of which is subdivided into m angular segments (8, 16, or 24) of length P_seg. The number of sine-wave cycles in each angular segment is n = n_cycles / m.
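    A schematic reading of the "noise measured in the presence of the signal" idea from the abstract: fit a sinusoid of known frequency to the samples of one angular segment and treat the fit residual as noise. The least-squares formulation, the 144-cycle chart, and every name below are my assumptions for illustration, not Imatest's published algorithm:

        import numpy as np

        def segment_signal_and_noise(samples, cycles_in_segment):
            """Estimate sine amplitude and residual noise for one angular segment
            of a sinusoidal Siemens star (schematic sketch only)."""
            phase = 2 * np.pi * cycles_in_segment * np.linspace(0.0, 1.0, len(samples), endpoint=False)
            # Least-squares fit of offset + a*cos + b*sin at the known chart frequency.
            design = np.column_stack([np.ones_like(phase), np.cos(phase), np.sin(phase)])
            coef, *_ = np.linalg.lstsq(design, samples, rcond=None)
            amplitude = np.hypot(coef[1], coef[2])
            noise_rms = np.std(samples - design @ coef)
            return amplitude, noise_rms

        # n = n_cycles / m sine-wave cycles fall inside each of the m angular segments.
        n_cycles, m = 144, 24          # 144 total cycles is an assumed chart value
        n = n_cycles / m               # 6 cycles per segment
        rng = np.random.default_rng(0)
        t = np.linspace(0.0, 1.0, 512, endpoint=False)
        samples = 0.5 + 0.4 * np.sin(2 * np.pi * n * t) + rng.normal(0.0, 0.02, t.size)
        print(segment_signal_and_noise(samples, n))   # roughly (0.4, 0.02)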
  • Multi-Frame Demosaicing and Super-Resolution of Color Images
    Multi-Frame Demosaicing and Super-Resolution of Color Images. Sina Farsiu*, Michael Elad‡, Peyman Milanfar§.
    Abstract: In the last two decades, two related categories of problems have been studied independently in the image restoration literature: super-resolution and demosaicing. A closer look at these problems reveals the relation between them, and since conventional color digital cameras suffer from both low spatial resolution and color filtering, it is reasonable to address them in a unified context. In this paper, we propose a fast and robust hybrid method of super-resolution and demosaicing, based on a MAP estimation technique that minimizes a multi-term cost function. The L1 norm is used for measuring the difference between the projected estimate of the high-resolution image and each low-resolution image, removing outliers in the data and errors due to possibly inaccurate motion estimation. Bilateral regularization is used for spatially regularizing the luminance component, resulting in sharp edges and forcing interpolation along the edges rather than across them. Simultaneously, Tikhonov regularization is used to smooth the chrominance components. Finally, an additional regularization term is used to force similar edge location and orientation in the different color channels. We show that the minimization of the total cost function is relatively easy and fast. Experimental results on synthetic and real data sets confirm the effectiveness of our method.
    * Corresponding author: Electrical Engineering Department, University of California Santa Cruz, Santa Cruz, CA 95064, USA. E-mail: [email protected], Phone: (831) 459-4141, Fax: (831) 459-4829. ‡ Computer Science Department, The Technion, Israel Institute of Technology, Israel.
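    The abstract lists the terms of the cost function one by one; written schematically in LaTeX (the operator symbols and weights are my notation, not copied from the paper), the MAP objective has the shape:

        % Schematic only; operator names and weights are assumptions.
        % X = high-resolution color estimate, Y_k = k-th low-resolution raw frame,
        % A_k, H, D, F_k = CFA sampling, blur, downsampling and warp operators,
        % X_L / X_C = luminance / chrominance components of X.
        \hat{X} = \arg\min_{X} \Big[ \sum_{k} \big\lVert A_k\, D\, H\, F_k\, X - Y_k \big\rVert_{1}
                  + \lambda_{L}\, J_{\mathrm{bilateral}}(X_L)
                  + \lambda_{C}\, \lVert \Lambda\, X_C \rVert_{2}^{2}
                  + \lambda_{O}\, J_{\mathrm{orientation}}(X_R, X_G, X_B) \Big]

    The L1 data term supplies robustness to outliers and motion-estimation errors, the bilateral term keeps luminance edges sharp, the Tikhonov term smooths chrominance, and the last term ties edge location and orientation across the color channels, matching the four ingredients named in the abstract.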
  • iQ-Analyzer Manual
    iQ-ANALYZER User Manual 6.2.4, June 2020. Image Engineering GmbH & Co. KG, Im Gleisdreieck 5, 50169 Kerpen, Germany. T +49 2273 99 99 1-0, F +49 2273 99 99 1-10, www.image-engineering.de
    Content: 1 Introduction; 2 Installing iQ-Analyzer; 2.1 System requirements; 2.2 Software protection; 2.3 Installation; 2.4 Antivirus issues; 2.5 Software by third parties; 2.6 Network site license (for Windows only); 2.6.1 Overview; 2.6.2 Installation of MxNet; 2.6.3 Matrix-Net; 2.6.4 iQ-Analyzer
  • Demosaicking: Color Filter Array Interpolation [Exploring the Imaging Process and the Correlations Among Three Color Planes in Single-Chip Digital Cameras]
    [Bahadir K. Gunturk, John Glotzbach, Yucel Altunbasak, Ronald W. Schafer, and Russel M. Mersereau]
    Demosaicking: Color Filter Array Interpolation [Exploring the imaging process and the correlations among three color planes in single-chip digital cameras]
    Digital cameras have become popular, and many people are choosing to take their pictures with digital cameras instead of film cameras. When a digital image is recorded, the camera needs to perform a significant amount of processing to provide the user with a viewable image. This processing includes correction for sensor nonlinearities and nonuniformities, white balance adjustment, compression, and more. An important part of this image processing chain is color filter array (CFA) interpolation, or demosaicking. A color image requires at least three color samples at each pixel location. Computer images often use red (R), green (G), and blue (B). A camera would need three separate sensors to completely measure the image. In a three-chip color camera, the light entering the camera is split and projected onto each spectral sensor. Each sensor requires its proper driving electronics, and the sensors have to be registered precisely. These additional requirements add a large expense to the system. Thus, many cameras use a single sensor covered with a CFA. The CFA allows only one color to be measured at each pixel. This means that the camera must estimate the missing two color values at each pixel. This estimation process is known as demosaicking. Several patterns exist for the filter array.
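    To make "estimate the missing two color values at each pixel" concrete, here is a minimal bilinear-interpolation demosaic for an RGGB Bayer layout. It is the textbook baseline rather than one of the correlation-aware methods the article goes on to survey, and the layout choice and function name are assumptions:

        import numpy as np
        from scipy.ndimage import convolve

        def demosaic_bilinear_rggb(raw):
            """Bilinear demosaic of a single-channel Bayer mosaic (RGGB layout assumed)."""
            h, w = raw.shape
            r_mask = np.zeros((h, w), dtype=bool)
            r_mask[0::2, 0::2] = True
            b_mask = np.zeros((h, w), dtype=bool)
            b_mask[1::2, 1::2] = True
            g_mask = ~(r_mask | b_mask)

            # Green lives on a quincunx grid; red/blue on a grid with 3 of 4 samples missing.
            k_g  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
            k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0

            out = np.zeros((h, w, 3), dtype=float)
            for ch, mask, kernel in ((0, r_mask, k_rb), (1, g_mask, k_g), (2, b_mask, k_rb)):
                plane = np.where(mask, raw, 0.0)      # keep known samples, zero the rest
                out[..., ch] = convolve(plane, kernel, mode="mirror")
            return out

    Because bilinear interpolation treats each color plane independently, it produces the zipper and false-color artifacts near edges that the correlation-exploiting algorithms discussed in the article are designed to avoid.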
  • CODE by R.Mutt
    dcraw.c

    /*
       dcraw.c -- Dave Coffin's raw photo decoder
       Copyright 1997-2018 by Dave Coffin, dcoffin a cybercom o net

       This is a command-line ANSI C program to convert raw photos from
       any digital camera on any computer running any operating system.

       No license is required to download and use dcraw.c. However,
       to lawfully redistribute dcraw, you must either (a) offer, at
       no extra charge, full source code* for all executable files
       containing RESTRICTED functions, (b) distribute this code under
       the GPL Version 2 or later, (c) remove all RESTRICTED functions,
       re-implement them, or copy them from an earlier, unrestricted
       Revision of dcraw.c, or (d) purchase a license from the author.

       The functions that process Foveon images have been RESTRICTED
       since Revision 1.237. All other code remains free for all uses.

       *If you have not modified dcraw.c in any way, a link to my
       homepage qualifies as "full source code".

       $Revision: 1.478 $
       $Date: 2018/06/01 20:36:25 $
     */

    #define DCRAW_VERSION "9.28"

    #ifndef _GNU_SOURCE
    #define _GNU_SOURCE
    #endif
    #define _USE_MATH_DEFINES
    #include <ctype.h>
    #include <errno.h>
    #include <fcntl.h>
    #include <float.h>
    #include <limits.h>
    #include <math.h>
    #include <setjmp.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <time.h>
    #include <sys/types.h>