Imaging Defects of Real Camera Systems


Traditional 3D rendering uses a perfect, idealized camera model that is easy to define in mathematical terms and provides reasonably fast and efficient rendering. Real-world cameras, and similar optical imaging systems such as human eyes, exhibit quite a few inherent defects, some of which are actually essential to the perceived "realism" of an image. Images that are too perfect easily stand out as synthetic. The following text is an attempt to briefly summarize the various defects and their causes, along with some examples of how they may be simulated using digital methods. Higher resolution versions of the example images can be found on the web at: http://www.itn.liu.se/~stegu/TNM057-2002/.

Depth of field

A synthetic camera renders everything in view with perfect sharpness. A real camera has a finite depth of field, so that only objects at a certain distance, in the focal plane, are rendered distinctly; objects at other distances are progressively more blurred with increasing distance to the focal plane. The depth of field depends on the focal length of the lens and the aperture. Telephoto lenses generally have a smaller depth of field than wide-angle lenses. A smaller numerical aperture (a larger aperture opening) lets more light through the lens but yields a smaller depth of field.

Depth of field is one of the most prominent and artistically important effects of a real camera, but unfortunately it is rather difficult to simulate. One option is to cheat by saving the depth information for each pixel in the rendered scene and blurring the image in a post-processing step with a convolution filter that is chosen differently for different parts of the image, depending on the distance to the objects. This is a comparatively cheap simulation, and it is quite often sufficient, but it does have problems around edges between objects that are in focus and objects that are not.

A common method for depth of field simulation in ray tracing is to use multisampling. Several camera rays can be traced through each image pixel on the projection plane from slightly different projection reference points. The resulting intensity for the pixel is calculated as the average of the intensity values from the rays. All camera rays for one image pixel are traced so that they converge in a common plane, which becomes the focal plane of the rendering. The area over which the projection reference points for the different rays are spread is directly equivalent to the aperture of the camera.

The equivalent of multisampling can be used for regular scanline-based rendering as well. Depth of field can be simulated by projecting the scene onto the projection plane from several slightly separated projection reference points, and averaging a fairly large number of images.

Figure 1: Depth of field (DOF). Left to right: infinite DOF, post effect DOF, multisample DOF
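The multisampling idea can be sketched in a few lines of code. The following is a minimal illustration in Python/NumPy, not the method of any particular renderer; the function name, the camera-space conventions (camera at the origin looking along +z) and the parameters `aperture_radius` and `focal_distance` are assumptions made for the example.

import numpy as np

def dof_camera_rays(pixel_dir, aperture_radius, focal_distance, n_samples, rng=None):
    """Jittered camera rays for one pixel (illustrative sketch).

    Works in camera space: the ideal projection reference point is the
    origin and the camera looks along +z, so pixel_dir[2] > 0.  All
    jittered rays converge on the point where the ideal pinhole ray
    crosses the focal plane z = focal_distance, so objects in that
    plane stay sharp while everything else is blurred.
    """
    if rng is None:
        rng = np.random.default_rng()
    # point on the focal plane hit by the ideal pinhole ray
    focus_point = pixel_dir * (focal_distance / pixel_dir[2])
    origins = np.empty((n_samples, 3))
    directions = np.empty((n_samples, 3))
    for i in range(n_samples):
        # uniform sample on the aperture disk: the spread of the
        # projection reference points plays the role of the aperture
        r = aperture_radius * np.sqrt(rng.uniform())
        phi = 2.0 * np.pi * rng.uniform()
        origins[i] = (r * np.cos(phi), r * np.sin(phi), 0.0)
        d = focus_point - origins[i]
        directions[i] = d / np.linalg.norm(d)
    return origins, directions

# The pixel value is the average of the radiance returned by tracing each
# ray; with a hypothetical trace() function this would read something like
#   colour = np.mean([trace(o, d) for o, d in
#                     zip(*dof_camera_rays(pd, 0.05, 4.0, 64))], axis=0)

A larger `aperture_radius` spreads the reference points over a larger area and therefore gives a shallower depth of field, exactly as described above.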
Defocus

A real camera, particularly if it is of low quality, often has a visibly imperfect focus even in the focal plane. A generally defocused image is quite easy to simulate by applying a local averaging filter to the image. Blurring an image wholesale like this is a very common and simple operation, available in almost any image editing program. Formally, the blurring operation is a convolution with a filter kernel. Smaller kernels give little blur, and large kernels give heavy blur. For most blurring operations, a simple Gaussian filter kernel shape gives good results.

Noise and grain

Noise in electronic image sensors, and its analog counterpart, the grain of photographic film, are defects that are often clearly noticeable in real-world images, and may therefore need to be added to synthetic images, particularly if they are to be blended with real images. Electronic noise and film grain look quite different, but both are quite easily simulated as an image post effect by adding some randomness to the pixels of the perfect synthetic image.

Figure 2: Blur and grain. Left to right: sharp image, blurred image, simulated film grain

Motion blur

Another common defect is motion blur, which occurs because a real camera has a shutter. Some time is required for the available light to expose a photographic film or to give a large enough signal in an electronic image sensor, and any objects that move in the image during that time are blurred. A simple synthetic camera samples the scene at one single instant in time, and thus does not exhibit motion blur.

Motion blur is often simulated by averaging together several sampled images, distributed over some simulated shutter time. A successful motion blur simulation done this way requires quite a few images, so motion blur often adds considerably to the rendering time. For ray tracing, the multisampling mentioned above also works well for simulating motion blur. Motion blur and depth of field can be simulated together by distributing the ray samples for each pixel both in time and space, thereby cutting down some of the total increase in rendering time. However, methods that incorporate explicit averaging of images over time are general but rather costly. The effort can be reduced by performing the explicit averaging only in those areas of the image where objects move. Animation software packages have options for specifying which objects in a scene should be motion blurred. This is often called object motion blur, and it has the advantage of not adding anything to the rendering time for static parts of the scene.

Just like depth of field, motion blur may also be simulated by image post-processing methods. At rendering time, the renderer knows the position change of each object in the scene between one frame and the next. Velocity information can be stored for each object, or even for each pixel, making it possible to blur moving parts of the image without having to spend any extra work on the non-moving parts. This method is often referred to as image motion blur. For many applications it can be both faster and better than multisample motion blur.

Figure 3: Motion blur. Left to right: no motion blur, post effect blur, multisample blur

Camera motion blur is a similar but more global effect. Even static scenes can be blurred by a camera motion. Blur from panning, tilting or rotating an otherwise stationary camera can be successfully simulated in a post-processing step as a local pixel averaging along the direction of the apparent motion in the image. Motion blur for camera trucking (changing the location of the camera in the scene) requires multisampling for an absolutely correct appearance, but that is only rarely an issue; it can be simulated much more easily, and quite accurately, using depth information for the scene. Even without depth information, cheap post-processing methods using linear or radial blurring can often be sufficient with proper care and thought. A small sketch of such a linear blur is given after the figure captions below.

Figure 4: Post-effect camera motion blur. Top row: stationary camera, left-right pan, up-down tilt. Middle row: camera rotation, camera zoom.

Figure 5: Multisample camera motion blur. Left: sidewards motion. Right: forwards motion.
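As a concrete illustration of the "local pixel averaging along the direction of the apparent motion" used for a panning or tilting camera, the following Python/SciPy sketch builds a line-shaped kernel and convolves the image with it. It is only a post-processing approximation under the stated assumptions; the function name and parameters (`length`, `angle_deg`) are invented for the example, not taken from any renderer.

import numpy as np
from scipy.ndimage import convolve

def pan_blur(image, length=15, angle_deg=0.0):
    """Approximate camera-pan motion blur as a 1-D average along the
    apparent motion direction (post-processing sketch)."""
    # build a line-shaped averaging kernel of the given length and angle
    size = length if length % 2 == 1 else length + 1
    kernel = np.zeros((size, size), dtype=np.float64)
    c = size // 2
    a = np.deg2rad(angle_deg)
    for t in np.linspace(-c, c, size * 4):
        y = int(round(c + t * np.sin(a)))
        x = int(round(c + t * np.cos(a)))
        kernel[y, x] = 1.0
    kernel /= kernel.sum()
    # convolve each colour channel with the line kernel
    if image.ndim == 2:
        return convolve(image, kernel, mode='nearest')
    return np.dstack([convolve(image[..., ch], kernel, mode='nearest')
                      for ch in range(image.shape[-1])])

A rotation or zoom blur would instead average along circular arcs or along lines radiating from the image centre, but the principle is the same.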
Glare, bleeding and soft focus

Glare manifests itself as glowing halos around strong light sources, and as a generally reduced contrast in the dark parts of scenes with strongly illuminated objects. The effect is due to imperfections of the camera optics, even at the focal plane. Most of the light is projected onto a small vicinity of the ideal projection point, but some smaller part of it ends up in a larger vicinity of that point, sometimes even spread out over the entire image. Glass imperfections, dust, dirt and fingerprints can all have that effect, as can reflections within the camera housing. (That is why the inside of a camera is matte and black.) Glare can also arise from lateral light transport in the film or in the image sensor, or from electrical crosstalk between image sensor pixels, although in these cases it is often referred to as bleeding. Glare is not necessarily due to the camera; it can also arise from atmospheric light scattering by dust particles.

Any type of glare can be easily simulated by a convolution with a large filter kernel, adding a small part of the blurred result to the original image. Compared to defocusing, glare is a quite subtle secondary effect, but it is very visible around high-intensity parts of the image. If the dynamic range of the digital image intensity is restricted to 8 bits, the glare effect has to be fudged by making a separate blurring of only the maximum-intensity parts of the image. In rendering software there is often a facility for specifying only some light sources or objects which should have simulated glare around them, and this is often good enough. Generally speaking, glare simulation is best performed on real radiance values for the image pixels. If possible, truncation to 8-bit image pixels should not be done until after such simulations.

A strong and widespread glare is sometimes used for a dream-like or romantic effect called soft focus. Such strong glare is visible not only around light sources, but also around any reasonably light object. Commercial soft-focus optical filters exist for photography, but a similar effect can be achieved very inexpensively by smearing some vaseline on part of the front lens. (The vaseline method is in common use by professional photographers.) As with glare, soft focus is easily simulated by a convolution filter.
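The glare recipe above (blur with a large kernel, add back a small fraction of the result, and restrict the blurring to the brightest pixels when only 8-bit data is available) maps almost directly to code. Below is a minimal sketch using SciPy's Gaussian filter; the parameter names, default values and the thresholding scheme are illustrative assumptions, not part of the original text.

import numpy as np
from scipy.ndimage import gaussian_filter

def add_glare(radiance, spread=25.0, amount=0.08, threshold=None):
    """Glare / soft-focus sketch: blur the image with a large kernel and
    add a small fraction of the blurred result back to the original.

    radiance  : float array of shape (H, W) or (H, W, C), ideally linear
                radiance values rather than 8-bit pixel values
    spread    : sigma of the Gaussian used as the large glare kernel
    amount    : fraction of the blurred image mixed back in
    threshold : if given, only pixels above this value contribute to the
                glare; this is the fudge needed for 8-bit images
    """
    if threshold is None:
        source = radiance
    else:
        # keep only the brightest parts, as suggested for low dynamic range
        source = np.where(radiance > threshold, radiance, 0.0)
    # sigma of 0 along the channel axis leaves the colours unmixed
    sigma = [spread] * 2 + [0] * (radiance.ndim - 2)
    halo = gaussian_filter(source, sigma=sigma)
    return radiance + amount * halo

Increasing `amount` and `spread` so that the halo becomes visible around every bright object turns plain glare into the soft-focus look described above.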