Calibration of Ultra-Wide Fisheye Lens Cameras by Eigenvalue Minimization


Memoirs of the Faculty of Engineering, Okayama University, Vol. 46, pp. 10-20, January 2012

Kenichi KANATANI*
Department of Computer Science, Okayama University, Okayama 700-8530 Japan
*E-mail [email protected]

(Received November 21, 2011)

We present a new technique for calibrating ultra-wide fisheye lens cameras by imposing the constraint that collinear points be rectified to be collinear, parallel lines to be parallel, and orthogonal lines to be orthogonal. Exploiting the fact that line fitting reduces to an eigenvalue problem, we do a rigorous perturbation analysis to obtain a Levenberg-Marquardt procedure for the optimization. Doing experiments, we point out that spurious solutions exist if collinearity and parallelism alone are imposed. Our technique has many desirable properties. For example, no metric information is required about the reference pattern or the camera position, and separate stripe patterns can be displayed on a video screen to generate a virtual grid, eliminating the grid point extraction processing.

This work is subject to copyright. All rights are reserved by the author(s).

1. Introduction

Fisheye lens cameras are widely used for surveillance purposes because of their wide angles of view. They are also mounted on vehicles for various purposes, including obstacle detection, self-localization, and bird's-eye view generation [10, 13]. However, fisheye lens images have a large distortion, so in order to apply the computer vision techniques accumulated over the past decades, one first needs to rectify the image into a perspective view. There is already a large literature on this [3, 5, 7, 8, 11, 12, 13, 14, 17, 18].

The standard approach is to place a reference grid plane, whose precise geometry is assumed to be known, and match the image with the reference [3, 4, 5, 7, 18]. However, this approach is not very practical for the recently popularized ultra-wide fisheye lenses, because they cover more than 180 degrees of view, and hence no reference plane, even an infinite one, can cover the entire field of view. This difficulty can be circumvented by using the collinearity constraint, pointed out repeatedly: first by Onodera and Kanatani [15] in 1992, later by Swaminathan and Nayar [17] in 2000, and by Devernay and Faugeras [1] in 2001. They pointed out that camera calibration can be done by imposing the constraint that straight lines should be rectified to be straight. This principle was applied to fisheye lenses by Nakano et al. [12], Kase et al. [8], and Okutsu et al. [13]. Komagata et al. [9] further introduced the parallelism constraint and the orthogonality constraint, requiring that parallel lines be rectified to be parallel and orthogonal lines to be orthogonal. However, the cost function has so far been minimized directly by brute-force means such as the Brent method and the Powell method [16].

In this paper, we adopt the collinearity-parallelism-orthogonality constraint of Komagata et al. [9] and optimize it by eigenvalue minimization. The fact that imposing collinearity implies eigenvalue minimization, and that the optimization can be done by invoking the perturbation theorem, was pointed out by Onodera and Kanatani [15]. Using this principle, they rectified perspective images by gradient descent. Here, we consider ultra-wide fisheye lenses and do a more detailed perturbation analysis to derive a Levenberg-Marquardt procedure, currently regarded as the standard tool for efficient and accurate optimization.

The eigenvalue minimization principle has not been known in the past collinearity-based work [1, 8, 12, 13, 17]. Demonstrating its usefulness for ultra-wide fisheye lens calibration is the first contribution of this paper. We also show by experiments that the orthogonality constraint plays an essential role, pointing out that a spurious solution exists if only collinearity and parallelism are imposed. This fact has not been known in the past collinearity-based work [1, 8, 12, 13, 15, 17]. Pointing this out is the second contribution of this paper.

For data acquisition, we take images of stripes of different orientations shown on a large-screen display, placing the camera in various positions. Like the past collinearity-based methods [1, 8, 12, 13, 15, 17], our method is non-metric in the sense that no metric information is required about the camera position or the reference pattern. Yet many researchers have pointed out the necessity of some auxiliary information. For example, Nakano et al. [11, 12] proposed vanishing point estimation using conic fitting to straight line images (recently, Hughes et al. [5] proposed the same technique again). Okutsu et al. [14] picked out the images of antipodal points by hand. Such auxiliary information may be useful for suppressing spurious solutions. However, we show that accurate calibration is possible without any auxiliary information by our eigenvalue minimization for the collinearity-parallelism-orthogonality constraint.

This paper is organized as follows. In Sec. 2, we describe our imaging geometry model. Section 3 gives derivative expressions of the fundamental quantities, followed by a detailed perturbation analysis of the collinearity constraint in Sec. 4, of the parallelism constraint in Sec. 5, and of the orthogonality constraint in Sec. 6. Section 7 describes our Levenberg-Marquardt procedure for eigenvalue minimization. In Sec. 8, we test our non-metric technique using stripe images on a video display. We point out that a spurious solution arises if only collinearity and parallelism are imposed, and that it can be eliminated without any auxiliary information if orthogonality is introduced. We also show some examples of real scene applications. In Sec. 9, we conclude.

2. Geometry of Fisheye Lens Imaging

We consider recently popularized ultra-wide fisheye lenses with the imaging geometry modeled by the stereographic projection

    r = 2f tan(θ/2),    (1)

where θ is the incidence angle (the angle of the incident ray of light from the optical axis) and r (in pixels) is the distance of the corresponding image point from the principal point (Fig. 1). The constant f is called the focal length. We consider (1) merely because our camera is built that way; the following calibration procedure is identical whatever model is used.

Figure 1: The imaging geometry of a fisheye lens and the incident ray vector m.

For a manufactured lens, the value of f is unknown, or may not be exact if provided by the manufacturer. Also, the principal point may not be at the center of the image frame. Moreover, (1) is an idealization, and a real lens may not satisfy it exactly. So we generalize (1) into the form

    r/f0 + a1 (r/f0)^3 + a2 (r/f0)^5 + ··· = (2f/f0) tan(θ/2),    (2)

and determine the values of f, a1, a2, ... along with the principal point position. Here, f0 is a scale constant that keeps the powers of r/f0 within a reasonable numerical range (in our experiment, we used the value f0 = 150 pixels). The linear term r/f0 has no coefficient because f on the right-hand side is an unknown parameter; a1 = a2 = ··· = 0 corresponds to the stereographic projection. Since a sufficient number of correction terms could approximate any function, the right-hand side of (2) could be any function of θ, e.g., the perspective projection model (f/f0) tan θ or the equidistance projection model (f/f0)θ. We adopt the stereographic projection model merely for the ease of initialization.

In (2), even-power terms do not appear, because the lens has circular symmetry: r is an odd function of θ. We assume that the azimuthal angle of the projection is equal to that of the incident ray. In the past, these two were often assumed to be slightly different, and geometric correction of the resulting "tangential distortion" was studied. Today, lens manufacturing technology is so advanced that the tangential distortion can be safely ignored. If not, we can simply include tangential distortion terms in (2); the subsequent calibration procedure remains unchanged.

In the literature, a model of the form r = c1 θ + c2 θ^3 + c3 θ^5 + ··· is frequently assumed [7, 8, 12, 13]. As we will see shortly, however, the value of θ for a specified r is needed in each step of the optimization iterations. So, many authors computed θ by solving a polynomial equation numerically [7, 8, 12, 13], but this causes a loss of accuracy and efficiency. It is more convenient to express θ in terms of r from the beginning. From (2), the expression of θ is

    θ = 2 tan^{-1}[ (f0/(2f)) ( r/f0 + a1 (r/f0)^3 + a2 (r/f0)^5 + ··· ) ].    (3)

3. Incident Ray Vector

Let m be the unit vector in the direction of the incident ray of light (Fig. 1); we call m the incident ray vector. In polar coordinates, it has the expression

    m = ( sin θ cos φ,  sin θ sin φ,  cos θ )^T,    (4)

where θ is the incidence angle from the Z-axis and φ is the azimuthal angle from the X-axis. Since φ, by

    ∂θ/∂v0 = −( cos^2(θ/2) / f ) ( 1 + Σ_{k=1}^∞ (2k+1) a_k (r/f0)^{2k} ) (y − v0)/r.    (10)

Substituting these into (9), we can evaluate ∂m/∂u0 and ∂m/∂v0. Next, we consider differentiation with respect to f.
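The convenience of the closed form (3) can be illustrated with a minimal sketch. This is not the paper's code; `f`, the coefficients `a`, and the sample radius below are assumed illustrative values, with `f0 = 150` pixels as in the text:

```python
import math

# Incidence angle theta for an image radius r (pixels) via Eq. (3).
# Because the correction series of Eq. (2) is written in r, theta comes out
# in closed form; models of the form r = c1*theta + c2*theta^3 + ... would
# instead require solving a polynomial equation for theta at every call.
# f and a = (a1, a2, ...) are illustrative, not calibrated, values.
def incidence_angle(r, f, a=(), f0=150.0):
    x = r / f0
    series = x + sum(ak * x ** (2 * k + 3) for k, ak in enumerate(a))
    return 2.0 * math.atan((f0 / (2.0 * f)) * series)
```

With `a=()` (all correction coefficients zero) the function reduces to the stereographic model θ = 2 tan⁻¹(r / 2f), matching (1).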
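The pixel-to-ray mapping of (3) and (4) can then be sketched as follows. Again a hedged illustration rather than the paper's implementation: the principal point `(u0, v0)`, `f`, and the coefficients `a` are assumed values, and the azimuth of the projection is taken equal to that of the incident ray (no tangential distortion), as in the text:

```python
import math

# Incident ray vector m of Eq. (4) for a pixel (x, y): the radius r and the
# azimuth phi are measured from the principal point (u0, v0), and theta
# comes from Eq. (3).
def incident_ray(x, y, u0, v0, f, a=(), f0=150.0):
    dx, dy = x - u0, y - v0
    r = math.hypot(dx, dy)
    t = r / f0
    series = t + sum(ak * t ** (2 * k + 3) for k, ak in enumerate(a))
    theta = 2.0 * math.atan((f0 / (2.0 * f)) * series)
    phi = math.atan2(dy, dx)
    return (math.sin(theta) * math.cos(phi),
            math.sin(theta) * math.sin(phi),
            math.cos(theta))
```

By construction m is a unit vector, and the principal point itself maps to the optical axis direction (0, 0, 1).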
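The sense in which "line fitting reduces to an eigenvalue problem" can also be sketched. The following is a minimal illustration of the standard formulation, not the paper's implementation: incident ray vectors of collinear scene points lie on a single plane through the projection center, so the residual of fitting that plane is the smallest eigenvalue of the 3x3 moment matrix of the rays, and calibration drives such eigenvalues toward zero:

```python
import numpy as np

# Collinearity residual of a set of incident ray vectors m_a (rows of m):
# if the scene points are collinear, the rays satisfy n . m_a = 0 for the
# unit normal n of a plane through the projection center, and
# sum_a (n . m_a)^2 = n^T M n with M = sum_a m_a m_a^T.  The minimum over
# unit n is the smallest eigenvalue of M, which vanishes exactly when the
# rays are coplanar, i.e. when the rectification is correct.
def collinearity_residual(m):
    M = m.T @ m                      # 3x3 moment matrix
    return np.linalg.eigvalsh(M)[0]  # smallest eigenvalue (>= 0 up to rounding)
```

Rays toward collinear targets give a numerically zero value; a miscalibrated lens bends the rays off the plane and gives a strictly positive value, which is the quantity the Levenberg-Marquardt iterations minimize over the lens parameters.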
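Derivatives such as (10) follow from the chain rule through (3), using ∂r/∂v0 = −(y − v0)/r, and can be checked numerically. The sketch below uses assumed illustrative parameter values and truncates the series after a1 for brevity:

```python
import math

# theta as a function of the pixel (x, y) and the principal point (u0, v0),
# per Eqs. (2)-(3), truncated after a1 (illustrative values only).
def theta(x, y, u0, v0, f, a1, f0=150.0):
    r = math.hypot(x - u0, y - v0)
    t = r / f0
    return 2.0 * math.atan((f0 / (2.0 * f)) * (t + a1 * t ** 3))

# Closed-form d(theta)/d(v0): the chain rule through Eq. (3) gives
# -(cos^2(theta/2)/f) * (1 + 3*a1*(r/f0)^2) * (y - v0)/r,
# the one-coefficient instance of Eq. (10).
def dtheta_dv0(x, y, u0, v0, f, a1, f0=150.0):
    r = math.hypot(x - u0, y - v0)
    th = theta(x, y, u0, v0, f, a1, f0)
    return -(math.cos(th / 2.0) ** 2 / f) * (1.0 + 3.0 * a1 * (r / f0) ** 2) * (y - v0) / r

# Central finite difference agrees with the closed form.
x, y, u0, v0, f, a1 = 200.0, 120.0, 160.0, 130.0, 150.0, 0.02
h = 1e-5
numeric = (theta(x, y, u0, v0 + h, f, a1) - theta(x, y, u0, v0 - h, f, a1)) / (2.0 * h)
assert abs(numeric - dtheta_dv0(x, y, u0, v0, f, a1)) < 1e-8
```

The same finite-difference check applies to the derivatives with respect to u0, f, and each a_k that the perturbation analysis requires.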