Making Your Bokeh Fascinating: Real-Time Rendering of Physically Based Optical Effect in Theory and Practice (SIGGRAPH 2015 Course)
16 pages, PDF, 1,020 KB
Recommended publications
Depth-Aware Blending of Smoothed Images for Bokeh Effect Generation
Saikat Dutta, Indian Institute of Technology Madras, Chennai, PIN-600036, India.

Abstract: Bokeh effect is used in photography to capture images where the closer objects look sharp and everything else stays out of focus. Bokeh photos are generally captured using Single Lens Reflex cameras using shallow depth of field. Most modern smartphones can take bokeh images by leveraging dual rear cameras or good auto-focus hardware. However, for smartphones with a single rear camera and without good auto-focus hardware, we have to rely on software to generate bokeh images. This kind of system is also useful for generating the bokeh effect in already captured images. In this paper, an end-to-end deep learning framework is proposed to generate a high-quality bokeh effect from images. The original image and different versions of smoothed images are blended to generate the bokeh effect with the help of a monocular depth estimation network. The proposed approach is compared against a saliency-detection-based baseline and a number of approaches proposed in the AIM 2019 Challenge on Bokeh Effect Synthesis. Extensive experiments are shown in order to understand different parts of the proposed algorithm. The network is lightweight and can process an HD image in 0.03 seconds. This approach ranked second in the AIM 2019 Bokeh Effect Challenge, Perceptual Track.

1. Introduction: Depth-of-field effect or bokeh effect is often used in photography to generate aesthetic pictures. […] important problem in Computer Vision and has gained attention recently. Most of the existing approaches (Shen et al., 2016; Wadhwa et al., 2018; Xu et al., 2018) work on human portraits […]
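The blending idea described in this abstract can be illustrated with a simplified, hand-rolled sketch (not the paper's network or exact recipe): given an image and a monocular depth map from any estimator, pre-compute a few Gaussian-smoothed copies and blend them per pixel according to depth. The blur sigmas, the focus depth, and the weighting scheme below are illustrative assumptions.

```python
# Minimal sketch: blend differently smoothed copies of an image according to a depth map
# (an illustration of the idea described above, not the paper's method).
# `image` is an HxWx3 float array in [0, 1]; `depth` is an HxW float array from any
# monocular depth estimator (larger = farther). Sigmas and weights are assumptions.
import numpy as np
from scipy.ndimage import gaussian_filter

def naive_depth_bokeh(image, depth, focus_depth=0.1, sigmas=(0.0, 2.0, 5.0, 9.0)):
    # Normalize depth to [0, 1] and turn distance from the focal plane into a blur level.
    d = (depth - depth.min()) / (depth.max() - depth.min() + 1e-8)
    blur_level = np.clip(np.abs(d - focus_depth), 0.0, 1.0)

    # Pre-compute Gaussian-smoothed versions of the image (sigma = 0 keeps the original).
    stack = [np.stack([gaussian_filter(image[..., c], s) for c in range(3)], axis=-1)
             for s in sigmas]

    # Per pixel, linearly interpolate between the two nearest smoothed images.
    idx = blur_level * (len(sigmas) - 1)
    lo = np.floor(idx).astype(int)
    hi = np.minimum(lo + 1, len(sigmas) - 1)
    frac = idx - lo

    out = np.zeros_like(image)
    for i, smoothed in enumerate(stack):
        weight = np.where(lo == i, 1.0 - frac, 0.0) + np.where(hi == i, frac, 0.0)
        out += weight[..., None] * smoothed
    return out
```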
Chapter 3 (Aberrations)
3.1 Introduction. In Chap. 2 we discussed the image-forming characteristics of optical systems, but we limited our consideration to an infinitesimal thread-like region about the optical axis called the paraxial region. In this chapter we will consider, in general terms, the behavior of lenses with finite apertures and fields of view. It has been pointed out that well-corrected optical systems behave nearly according to the rules of paraxial imagery given in Chap. 2. This is another way of stating that a lens without aberrations forms an image of the size and in the location given by the equations for the paraxial or first-order region. We shall measure the aberrations by the amount by which rays miss the paraxial image point.

It can be seen that aberrations may be determined by calculating the location of the paraxial image of an object point and then tracing a large number of rays (by the exact trigonometrical ray-tracing equations of Chap. 10) to determine the amounts by which the rays depart from the paraxial image point. Stated this baldly, the mathematical determination of the aberrations of a lens which covered any reasonable field at a real aperture would seem a formidable task, involving an almost infinite amount of labor. However, by classifying the various types of image faults and by understanding the behavior of each type, the work of determining the aberrations of a lens system can be simplified greatly, since only a few rays need be traced to evaluate each aberration; thus the problem assumes more manageable proportions.
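As a concrete, hedged illustration of "trace an exact ray and measure how far it misses the paraxial image point," the sketch below traces a meridional ray, parallel to the axis, through a single refracting spherical surface and compares its axis crossing with the paraxial focus. The surface radius, indices, and ray heights are made-up example values, not taken from the chapter.

```python
# Sketch: longitudinal spherical aberration of a single refracting spherical surface,
# measured as the gap between an exactly traced ray's axis crossing and the paraxial
# focus. Example values (R, n1, n2, h) are arbitrary; assumes R > 0 and no total
# internal reflection.
import numpy as np

def exact_axis_crossing(R, n1, n2, h):
    """Trace a ray parallel to the axis at height h through a spherical surface
    (vertex at z = 0, center of curvature at z = R); return where it crosses the axis."""
    z0 = R - np.sqrt(R**2 - h**2)              # intersection of the ray y = h with the surface
    P = np.array([h, z0])                      # (y, z) hit point
    n = np.array([h, z0 - R]) / abs(R)         # unit normal, pointing toward the incoming side
    d = np.array([0.0, 1.0])                   # incoming direction (+z)

    # Vector form of Snell's law.
    mu = n1 / n2
    cos_i = -np.dot(n, d)
    cos_t = np.sqrt(1.0 - mu**2 * (1.0 - cos_i**2))
    t = mu * d + (mu * cos_i - cos_t) * n      # refracted unit direction

    # Follow the refracted ray until it reaches the axis (y = 0).
    s = -P[0] / t[0]
    return P[1] + s * t[1]

R, n1, n2 = 50.0, 1.0, 1.5                     # mm, air -> glass (example values)
paraxial_focus = n2 * R / (n2 - n1)            # 150 mm from the vertex for these numbers
for h in (1.0, 5.0, 10.0):
    z = exact_axis_crossing(R, n1, n2, h)
    print(f"h = {h:5.1f} mm: crossing at {z:8.3f} mm, "
          f"longitudinal aberration {z - paraxial_focus:+.3f} mm")
```

Running this shows the marginal rays crossing the axis short of the paraxial focus, which is exactly the "amount by which rays miss the paraxial image point" that the chapter proposes to tabulate.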
Depth of Field
Depth of Field for Digital Images. Robin D. Myers, Better Light, Inc.

In the days before digital images, before the advent of roll film, photography was accomplished with photosensitive emulsions spread on glass plates. After processing and drying the glass negative, it was contact printed onto photosensitive paper to produce the final print. The size of the final print was the same size as the negative. During this period some of the foundational work into the science of photography was performed. One of the concepts developed was the circle of confusion.

Contact prints are usually small enough that they are normally viewed at a distance of approximately 250 millimeters (about 10 inches). At this distance the human eye can resolve a detail that occupies an angle of about 1 arc minute. The eye cannot see a difference between a blurred circle and a sharp-edged circle that just fills this small angle at this viewing distance. The diameter of this circle is called the circle of confusion. Converting the diameter of this circle into a size measurement, we get about 0.1 millimeters.

If we assume a standard print size of 8 by 10 inches (about 200 mm by 250 mm) and divide this by the circle of confusion, then an 8x10 print would represent about 2000x2500 smallest discernible points. If these points are equated to their equivalents in digital pixels, then the resolution of an 8x10 print would be about 2000x2500 pixels, or about 250 pixels per inch (100 pixels per centimeter). The circle of confusion used for 4x5 film has traditionally been that of a contact print viewed at the standard 250 mm viewing distance.
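The arithmetic in this excerpt is easy to reproduce. A short, hedged check: the 1 arc-minute acuity, 250 mm viewing distance, and 8x10 print size come from the text; everything else is plain unit conversion.

```python
# Reproduce the circle-of-confusion arithmetic from the excerpt.
import math

viewing_distance_mm = 250.0                  # standard contact-print viewing distance
acuity_rad = math.radians(1.0 / 60.0)        # 1 arc minute in radians

# Smallest circle the eye resolves at that distance (~0.07 mm; the text rounds to ~0.1 mm).
coc_mm = viewing_distance_mm * acuity_rad
print(f"circle of confusion ~ {coc_mm:.3f} mm")

# Discernible points across an 8x10 inch print (~200 mm x 250 mm), using the rounded 0.1 mm value.
coc_rounded = 0.1
points = (200.0 / coc_rounded, 250.0 / coc_rounded)
print(f"discernible points ~ {points[0]:.0f} x {points[1]:.0f}")    # ~2000 x 2500
print(f"density ~ {points[0] / 8.0:.0f} pixels per inch")           # ~250 ppi
```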
Portraiture, Surveillance, and the Continuity Aesthetic of Blur
Michigan Technological University, Digital Commons @ Michigan Tech, Michigan Tech Publications, 6-22-2021.

Portraiture, Surveillance, and the Continuity Aesthetic of Blur. Stefka Hristova, Michigan Technological University, [email protected]

Recommended Citation: Hristova, S. (2021). Portraiture, Surveillance, and the Continuity Aesthetic of Blur. Frames Cinema Journal, 18, 59-98. http://doi.org/10.15664/fcj.v18i1.2249. Frames Cinema Journal, ISSN 2053-8812, Issue 18 (Jun 2021), http://www.framescinemajournal.com

Introduction: With the increasing transformation of photography away from a camera-based analogue image-making process into a computerised set of procedures, the ontology of the photographic image has been challenged. Portraits in particular have become reconfigured into what Mark B. Hansen has called "digital facial images" and Mitra Azar has subsequently reworked into "algorithmic facial images." This transition has amplified the role of portraiture as a representational device, as a node in a network […]
Depth of Focus (DOF)
Erect Image: An image in which the orientations of left, right, top, bottom and moving directions are the same as those of a workpiece on the workstage.

Depth of Focus (DOF), unit: mm. Also known as "depth of field", this is the distance (measured in the direction of the optical axis) between the two planes which define the limits of acceptable image sharpness when the microscope is focused on an object. As the numerical aperture (NA) increases, the depth of focus becomes shallower, as shown by the expression:

DOF = λ / (2·(NA)²), where λ = 0.55 µm is often used as the reference wavelength.

Example: for an M Plan Apo 100X objective (NA = 0.7), the depth of focus is 0.55 µm / (2 × 0.7²) ≈ 0.6 µm.

Field number (FN), real field of view, and monitor display magnification, unit: mm. The observation range of the sample surface is determined by the diameter of the eyepiece's field stop. The value of this diameter in millimeters is called the field number (FN). In contrast, the real field of view is the range on the workpiece surface when actually magnified and observed with the objective lens. The real field of view (the diameter of the range of the workpiece that can be observed with the microscope) can be calculated with the following formula: Real field of view = FN of eyepiece / objective lens magnification.

Bright-field Illumination and Dark-field Illumination: In bright-field illumination a full cone of light is focused by the objective on the specimen surface. This is the normal mode of viewing with an optical microscope. With dark-field illumination, the inner area of the light cone is blocked so that the surface is only illuminated by light from an oblique angle.
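These two glossary formulas are easy to evaluate. The short sketch below does so, using the λ = 0.55 µm reference wavelength quoted above; note that the denominator of the field-of-view formula is cut off in the scanned text, so the standard "FN ÷ objective magnification" relation used here should be treated as an assumption, as should the FN 24 mm eyepiece in the example.

```python
# Evaluate the microscope glossary formulas from the excerpt.
def depth_of_focus_um(na, wavelength_um=0.55):
    # DOF = lambda / (2 * NA^2), with lambda = 0.55 um as the usual reference wavelength.
    return wavelength_um / (2.0 * na**2)

def real_field_of_view_mm(field_number_mm, objective_magnification):
    # Assumed standard relation: real field of view = FN / objective magnification
    # (the denominator is truncated in the source text).
    return field_number_mm / objective_magnification

# M Plan Apo 100X example from the text (NA = 0.7): ~0.56 um, quoted as ~0.6 um.
print(f"DOF at NA 0.7: {depth_of_focus_um(0.7):.2f} um")

# Hypothetical eyepiece with FN 24 mm and a 100X objective -> 0.24 mm field of view.
print(f"Real field of view: {real_field_of_view_mm(24.0, 100.0):.2f} mm")
```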
DMC-TZ7 Digital Camera Optics
DMC-TZ7 Digital Camera (LUMIX Super Zoom): 12x optical zoom, 25 mm ultra wide-angle LEICA DC lens, HD movie recording in AVCHD Lite with Dolby Digital Stereo Creator, advanced iA (Intelligent Auto) mode with Face Recognition and Movie iA mode, large 3.0-inch 460,000-dot high-resolution Intelligent LCD with wide viewing angle, Venus Engine HD with HDMI compatibility and VIERA Link.

Optics:
Camera Effective Pixels: 10.1 Megapixels
Sensor Size / Total Pixels / Filter: 1/2.33-inch / 12.7 total Megapixels / primary colour filter
Aperture: F3.3 - 4.9 / iris diaphragm (F3.3 - 6.3 (W) / F4.9 - 6.3 (T))
Optical Zoom: 12x
Focal Length: f = 4.1 - 49.2 mm (25 - 300 mm in 35 mm equiv.)
Extra Optical Zoom (EZ): 14.3x (4:3 / 7M), 17.1x (4:3 / 5M), 21.4x (under 3M)
Lens: LEICA DC VARIO-ELMAR, 10 elements in 8 groups (2 aspherical lenses / 3 aspherical surfaces, 2 ED lenses)
2-Speed Zoom: Yes
Optical Image Stabilizer: MEGA O.I.S. (Auto / Mode1 / Mode2)
Digital Zoom: 4x (max. 48.0x combined with optical zoom without Extra Optical Zoom; max. 85.5x combined with Extra Optical Zoom)
Focusing Area: Normal: Wide 50 cm / Tele 200 cm - infinity; Macro / Intelligent AUTO / Clipboard: Wide 3 cm / Max 200 cm / Tele 100 cm - infinity
Focus Range Display: Yes
AF Assist Lamp: Yes
Focus: Normal / Macro, Continuous AF (On / Off), AF Tracking (On / Off), Quick AF (On / Off)
AF Metering: Face / AF Tracking / Multi (11 pt) / 1 pt HS / 1 pt / Spot
Shutter Speed: 8 - 1/2000 sec (selectable minimum shutter speed); Starry Sky Mode: 15, 30, 60 sec

Awards: Photography Blog (online), Essential Award, March 2009; CameraLabs (online), Highly Recommended Award, March 2009.
A N E W E R a I N O P T I
Experience an unprecedented level of performance.

NIKKOR Z Lens Category Map (figure): new-dimensional S-Line optical performance realized with the Z mount — NIKKOR Z 14-30mm f/4 S, NIKKOR Z 24-70mm f/2.8 S, NIKKOR Z 24-70mm f/4 S, NIKKOR Z 58mm f/0.95 S Noct, NIKKOR Z 35mm f/1.8 S, NIKKOR Z 50mm f/1.8 S. Lenses other than the S-Line series will be announced at a later date.

The title of the S-Line is reserved only for NIKKOR Z lenses that have cleared newly established standards in design principles and quality control that are even stricter than Nikon's conventional standards. The "S" can be read as representing words such as "Superior", "Special" and "Sophisticated." Whichever model you choose, every S-Line lens achieves richly detailed image expression, delivering a sense of reality in both still shooting and movie creation.

Top of the S-Line model: the culmination of the NIKKOR quest for groundbreaking optical performance. It offers a new dimension in optical performance, including overwhelming resolution, bringing fresh […] This lens has the ability to depict subjects in ways that have never been seen before, including by rendering them with an extremely shallow depth of field.

Versatile lenses offering a new dimension in optical performance. These lenses bring a higher level of imaging power, achieving superior reproduction with high resolution even at the periphery of the image, and utilizing […]

Well-balanced, high-performance lenses. These lenses strike an optimum balance between advanced functionality, compactness and cost effectiveness, while […]
Spherical Aberration • Field Angle Effects (Off-Axis Aberrations) – Field Curvature – Coma – Astigmatism – Distortion
Astronomy 80B: Light. Lecture 9: curved mirrors, lenses, aberrations. 29 April 2003, Jerry Nelson.

Topics for today: optical illusion; reflections from curved mirrors (convex mirrors, anamorphic systems, concave mirrors); refraction from curved surfaces (entering and exiting curved surfaces, converging lenses, diverging lenses); aberrations.

Slide titles (figures): Images from convex mirror; Reflection from sphere (Escher drawing of images from a convex sphere); Anamorphic mirror and image; Anamorphic mirror (conical); The artist Hans Holbein made anamorphic paintings; Ray rules for concave mirrors; Image from concave mirror; Reflections get complex; Mirror eyes in a plankton.

Constructing images with rays and mirrors: paraxial rays are used; these rays may only yield approximate results. The focal point for a spherical mirror is half way to the center of the sphere. Rule 1: all rays incident parallel to the axis are reflected so that they appear to be coming from the focal point F. Rule 2: all rays that (when extended) pass through C (the center of the sphere) are reflected back on themselves.
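The two ray rules, together with "the focal point is half way to the center of the sphere," locate images numerically as well as graphically. The mirror equation itself is not written on the slides, so the sketch below should be read as a hedged illustration using the standard paraxial relation with f = R/2 and example distances.

```python
# Paraxial image location for a concave spherical mirror, using f = R/2 and the
# standard mirror equation 1/s_o + 1/s_i = 1/f (distances in front of the mirror
# taken as positive). The equation is textbook optics, not quoted from the slides.
def mirror_image_distance(radius_of_curvature, object_distance):
    f = radius_of_curvature / 2.0
    return 1.0 / (1.0 / f - 1.0 / object_distance)

R = 1.0                                   # 1 m radius of curvature -> f = 0.5 m
for s_o in (2.0, 1.0, 0.75):
    s_i = mirror_image_distance(R, s_o)
    print(f"object at {s_o} m -> image at {s_i:.3f} m, magnification {-s_i / s_o:+.2f}")
```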
Using Depth Mapping to Realize Bokeh Effect with a Single Camera Android Device EE368 Project Report Authors (SCPD Students): Jie Gong, Ran Liu, Pradeep Vukkadala
Using Depth Mapping to Realize Bokeh Effect with a Single Camera Android Device. EE368 Project Report. Authors (SCPD students): Jie Gong, Ran Liu, Pradeep Vukkadala.

Abstract: In this paper we seek to produce a bokeh effect with a single image taken from an Android device by post-processing. Depth mapping is the core of bokeh effect production. A depth map is an estimate of depth at each pixel in the photo which can be used to identify portions of the image that are far away and belong to the background, and therefore apply a digital blur to the background. We present algorithms to determine the defocus map from a single input image. We obtain a sparse defocus map by calculating the ratio of gradients from the original image and a re-blurred image. Then, a full defocus map is obtained by propagating values from edges to the entire image by using a nearest-neighbor method and the matting Laplacian.

Bokeh effect is usually achieved in high-end SLR cameras using portrait lenses that are relatively large in size and have a shallow depth of field. It is extremely difficult to achieve the same effect (physically) in smartphones, which have miniaturized camera lenses and sensors. However, the latest iPhone 7 has a portrait mode which can produce the bokeh effect thanks to its dual-camera configuration. To compete with the iPhone 7, Google recently also announced that the latest Google Pixel phone can take photos with a bokeh effect, which would be achieved by taking two photos at different depths to camera and combining them via software. There is a gap: neither of the two biggest players can achieve the bokeh effect using only a […]
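A minimal sketch of the gradient-ratio idea mentioned in the abstract: estimate per-edge blur by comparing gradient magnitudes before and after re-blurring the image. The Gaussian re-blur amount, the edge threshold, and the use of SciPy here are illustrative assumptions, and the propagation step (nearest neighbor / matting Laplacian) is deliberately omitted.

```python
# Sparse defocus estimation from the ratio of gradient magnitudes between an image
# and a re-blurred copy (a simplified sketch of the approach described above;
# parameter choices are assumptions, and defocus propagation is not included).
import numpy as np
from scipy import ndimage

def sparse_defocus_map(gray, sigma0=1.0, edge_threshold=0.05):
    """gray: HxW float image in [0, 1]. Returns per-pixel blur sigma at strong edges, NaN elsewhere."""
    reblurred = ndimage.gaussian_filter(gray, sigma0)

    def grad_mag(img):
        gy, gx = np.gradient(img)
        return np.hypot(gx, gy)

    g1 = grad_mag(gray)
    g2 = grad_mag(reblurred)

    # For an ideal step edge blurred by sigma, re-blurring with sigma0 gives
    # |grad1| / |grad2| = sqrt((sigma^2 + sigma0^2) / sigma^2), so sigma = sigma0 / sqrt(R^2 - 1).
    ratio = g1 / (g2 + 1e-8)
    sigma = sigma0 / np.sqrt(np.maximum(ratio**2 - 1.0, 1e-8))

    # Keep estimates only at strong edges; a full method would propagate these
    # sparse values to the whole image (e.g. with a matting Laplacian).
    sparse = np.full_like(gray, np.nan)
    edges = g1 > edge_threshold
    sparse[edges] = sigma[edges]
    return sparse
```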
How Your Digital Camera Works
By Todd Vorenkamp. Have you ever wondered what is going on inside that picture-taking box that you just held up to your eye, or out at arm's length, to capture a photograph?

The Basics. The camera is, in its most simplified terms, a box that allows light to enter and strike a light-sensitive surface. This surface is either a frame of film or a digital sensor. Cameras can accomplish this task in the most simple way—a pinhole camera, for instance. Pinhole cameras may have only one moving part, or none. Or, the camera can have dozens of moving parts, like the modern film or digital single-lens reflex (SLR or DSLR) camera. In this piece, we will discuss the modern cameras popular with today's photographers. We are going to talk about cameras in general terms, so please know that I am aware of dozens of different ways in which different cameras make images. For simplicity's sake, we will keep it simple!

A Common Path. Modern cameras, more or less, work similarly to produce a photograph. Obviously, some are more complex than others, but, in general, light travels a similar path once it meets the camera lens:
• Aperture
• Shutter
• Image Plane
How the image is viewed on the camera, through an optical or electronic viewfinder or an electronic screen, is one thing that differentiates different types of cameras.

The Lens. Light first enters a lens. This is an optical device made from plastic, glass, or crystal that bends the light entering the lens toward the image plane.
Visual Effect of the Combined Correction of Spherical and Longitudinal Chromatic Aberrations
Pablo Artal (1,*), Silvestre Manzanera (1), Patricia Piers (2) and Henk Weeber (2). (1) Laboratorio de Optica, Centro de Investigación en Optica y Nanofísica (CiOyN), Universidad de Murcia, Campus de Espinardo, 30071 Murcia, Spain. (2) AMO Groningen, Groningen, The Netherlands. *[email protected]

Abstract: An instrument permitting visual testing in white light following the correction of spherical aberration (SA) and longitudinal chromatic aberration (LCA) was used to explore the visual effect of the combined correction of SA and LCA in future new intraocular lenses (IOLs). The LCA of the eye was corrected using a diffractive element and SA was controlled by an adaptive optics instrument. A visual channel in the system allows for the measurement of visual acuity (VA) and contrast sensitivity (CS) at 6 c/deg in three subjects, for the four different conditions resulting from the combination of the presence or absence of LCA and SA. In the cases where SA is present, the average SA value found in pseudophakic patients is induced. Improvements in VA were found when SA alone or combined with LCA was corrected. For CS, only the combined correction of SA and LCA provided a significant improvement over the uncorrected case. The visual improvement provided by the correction of SA was higher than that from correcting LCA, while the combined correction of LCA and SA provided the best visual performance. This suggests that an aspheric achromatic IOL may provide some visual benefit when compared to standard IOLs.

©2010 Optical Society of America. OCIS codes: (330.0330) Vision, color, and visual optics; (330.4460) Ophthalmic optics and devices; (330.5510) Psychophysics; (220.1080) Active or adaptive optics.