Top 5 Optical Design Parameters for Chip on Tip Endoscopes


Introduction

Defining optical design specifications for custom chip on tip endoscopes can be a daunting task. In optics, the critical specifications are literally invisible, and parameters critical to the success of the device are commonly overlooked. Furthermore, it is now understood that the current ISO standards for measuring some of these parameters have known limitations and sources of error. The error is greatest for close focus endoscopes; see Wang et al., Biomedical Optics Express, Vol. 8, No. 3 (1 Mar 2017), p. 1441, for more details.

Here is a list of the most important optical parameters to use to define and design a custom chip on tip endoscope:

1. Field of view
2. Direction of view
3. Depth of field
4. F/#
5. Optical resolution

In this article, we explain what each of these parameters means, list common specifications, and recommend the correct way to measure them.

Optical design parameters for endoscopes

1. Field of View

The field of view (FoV) of a chip on tip endoscope defines the size of the object viewable at the extent of the displayed image. The FoV is defined in angular (typically degrees) rather than linear dimensions. This is because the endoscope is used across a broad range of working distances (see depth of field below), and defining the FoV as an angle rather than a linear dimension removes the dependence on a fixed working distance.

Confusion typically arises over which image location the FoV is defined and measured to. The ISO standard defines the FoV as the "largest visible" angle. For endoscopes that have a square or rectangular display, this means the FoV is measured to the diagonal or corners of the image. For endoscopes with a circular FoV defined by a mask (mechanical or digital), this refers to the diameter of the circle. In some applications the horizontal or vertical FoV is important.
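Diagonal and horizontal/vertical FoV are related through the sensor aspect ratio. The sketch below shows one way to convert between them; it assumes a distortion-free rectilinear projection, which is a simplification (real endoscope lenses often carry significant distortion), and the function name and default aspect ratio are illustrative, not from this article.

```python
import math

def fov_components(fov_diag_deg, aspect_w=4, aspect_h=3):
    """Split a diagonal FoV into horizontal and vertical FoV,
    assuming an ideal rectilinear (distortion-free) lens."""
    diag = math.hypot(aspect_w, aspect_h)
    t = math.tan(math.radians(fov_diag_deg) / 2.0)
    fov_h = 2.0 * math.degrees(math.atan(t * aspect_w / diag))
    fov_v = 2.0 * math.degrees(math.atan(t * aspect_h / diag))
    return fov_h, fov_v
```

For example, a 120-degree diagonal FoV on a 4:3 sensor yields a horizontal FoV of roughly 108 degrees and a vertical FoV of roughly 92 degrees under this assumption.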
It is acceptable to define the FoV to the horizontal or vertical dimension as long as it is clearly identified in the specification. In defining the FoV of a chip on tip endoscope, it is important to specify:

1. The FoV in angle
2. The image feature defining the FoV (image diagonal vs. diameter)

765 Roosevelt Trail, Suite 9, Windham, ME 04062
T: 207-893-8233 | F: 207-893-8245 | www.lighthouseoptics.com

2. Direction of view

Direction of view refers to the view angle of the endoscope relative to the axis of the device. Typically, if an endoscope has a non-zero direction of view, the distal (front) face will be at the same angle as the optical direction of view. Direction of view is defined as an angle relative to the axis, or long dimension, of the device.

3. F/#

F/# (pronounced "F-number") is the same parameter on a chip on tip endoscope as on a digital camera. Formally, in optics this parameter is defined as the ratio of the effective focal length to the diameter of the entrance pupil. It can be viewed as defining the cone of light collected by the endoscope from the object. As in digital cameras, F/# impacts the amount of light captured by the endoscope and the depth of field, and it sets an upper limit on the achievable optical resolution. This parameter is typically the hardest to specify on a chip on tip endoscope, but it needs to be defined properly to balance the overall imaging performance.

4. Optical resolution

In digital imaging systems, optical resolution is easily confused with pixel resolution. Both of these parameters are important to understand and balance in the system design. In a digital imaging system, optical resolution is the resolution limited by the lens, while sensor resolution is defined by the pixel pitch and array size of the image sensor. The system resolution is the final image resolution resulting from the combination of the lens and the image sensor.
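The balance between sensor resolution and lens-limited resolution can be made concrete with two standard formulas: the sensor's Nyquist frequency (one line pair spans two pixels) and the diffraction cutoff of an ideal lens. This is a minimal sketch; the function names, pixel pitch, and wavelength in the example are assumptions for illustration, not specifications from this article.

```python
def sensor_nyquist_lp_mm(pixel_pitch_um):
    """Sensor-side resolution limit in lp/mm: one line pair
    requires two pixels, so Nyquist = 1 / (2 * pitch)."""
    return 1000.0 / (2.0 * pixel_pitch_um)

def diffraction_cutoff_lp_mm(f_number, wavelength_nm=550.0):
    """Frequency (lp/mm) at which a diffraction limited lens's
    MTF reaches zero: f_cutoff = 1 / (wavelength * F#)."""
    return 1.0e6 / (f_number * wavelength_nm)
```

For instance, a 1.1 um pixel pitch (typical of very small chip on tip sensors) gives a Nyquist frequency of about 455 lp/mm, while an ideal F/4 lens at 550 nm cuts off near 455 lp/mm as well, illustrating how tightly these limits must be balanced at this scale.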
In the case of an endoscope, the optical resolution must be specified in conjunction with the pixel size, image sensor array size, and F/# of the system. Optical resolution is best quantified for endoscopes using the modulation transfer function (MTF). This is a continuous function that describes how well a lens focuses objects of different sizes, or frequencies, onto the image sensor. For chip on tip endoscopes, where the primary use is to display an image at a fixed size or zoom on a traditional monitor, the optical resolution should be defined and measured in a way that quantifies the optical performance contributing to the perceived image sharpness. This typically corresponds to an object size or frequency within the middle of the total range of sizes/frequencies captured by the lens. This target is defined as a particular MTF frequency, commonly in lp/mm units. The target frequency is determined by the display resolution and viewing distance of the monitor. For more on perceived image quality and how it relates to the display system, see this page at Imatest.com: http://www.imatest.com/docs/sqf/

5. Depth of field

Depth of field (DoF) is a specification that is influenced by the lens F/#, the quality of the lens, and the lens focal distance. For example, an endoscope with a fixed F/# may have two very different DoFs simply by changing the focal distance from close to the endoscope to far away. The apparent DoF visible on the display is also influenced by the size of the object. For example, a 5mm object viewed at 30mm from the endoscope may be at the limit of resolution of the system, while the same 5mm object viewed 3mm from the endoscope will appear larger and in focus. Thus, to quantify DoF, it is best to define it in terms that are independent of object size. This is best done by defining the DoF as sharpness in image space.
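The dependence of DoF on F/#, focus distance, and an image-space sharpness criterion can be sketched with the textbook thin-lens depth-of-field formula. This is a generic approximation, not an endoscope-specific model or a method from this article; the function name and example values are hypothetical, and the circle of confusion (coc_mm) plays the role of the image-space sharpness limit, e.g. one pixel pitch.

```python
import math

def thin_lens_dof(f_mm, f_number, focus_mm, coc_mm):
    """Near/far limits of acceptable sharpness around a focus
    distance, using the textbook thin-lens approximation."""
    # Hyperfocal distance: focusing here extends the far limit to infinity.
    h = f_mm ** 2 / (f_number * coc_mm) + f_mm
    near = focus_mm * (h - f_mm) / (h + focus_mm - 2.0 * f_mm)
    far = math.inf if focus_mm >= h else focus_mm * (h - f_mm) / (h - focus_mm)
    return near, far
```

With an assumed 1 mm focal length, F/4, and a 2 um blur criterion, focusing at 20 mm gives roughly 17.4-23.6 mm of acceptable sharpness, while refocusing the same lens to 50 mm gives roughly 36-82 mm, illustrating how the same F/# yields very different DoFs at different focal distances.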
See these links for additional information:
http://www.lighthouseoptics.com/tutorial/measuring-endoscope-field-of-view/
http://www.lighthouseoptics.com/tutorial/endobench-endoscope-image-quality-tester/

Measurement methods

1. Field of View

ISO 8600-3:1997 prescribes a method for measuring the field of view (FoV) of endoscopes using a target placed at a specific distance from the distal window of the endoscope. Wang et al., Biomedical Optics Express, Vol. 8, No. 3 (1 Mar 2017), p. 1441, describes the limitations of this approach in great detail, showing how significant errors are present in the values obtained through this method, especially for close focus endoscopes. Briefly, the limitations are associated with the way lenses work and the actual location of the entrance pupil within the lens. Lighthouse Imaging's Endobench product implements another method of measuring endoscope FoV that circumvents this limitation. Using a concentric ring target with inner radius r1 and outer radius r2, the FoV is quantified by determining the distance Δd along the scope's optical axis between the positions where the outer and inner circles of the target are aligned to the diagonals of the image:

FoV = 2·atan((r2 − r1) / Δd)

This method has been shown to have lower error than the current ISO method.

[Figure: concentric ring target (radii r1, r2) viewed by the endoscope, illustrating the half angle FoV/2 over the axial translation Δd.]

2. Direction of view

Direction of view (DoV) is measured with respect to the axis of the endoscope. ISO 8600-3:1997 describes a basic way to measure endoscope DoV: a simple method using a protractor aligned perpendicular to the endoscope axis. Lighthouse's Endobench includes a target system that first allows one to find the true optical axis of the endoscope, and second measures the DoV relative to the fixture using a rotary stage with encoders. If a system like an Endobench is not available, the ISO method is generally adequate.

[Figure: DoV defined as the angle between the view direction and the endoscope axis.]
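The concentric-ring FoV relation described under the field of view measurement method above, FoV = 2·atan((r2 − r1)/Δd), can be evaluated directly. A minimal sketch (the function name is illustrative):

```python
import math

def fov_from_ring_target(r1_mm, r2_mm, delta_d_mm):
    """FoV (degrees) from the concentric-ring method: delta_d is the
    axial translation between the positions where the inner (r1) and
    outer (r2) rings align to the image diagonals."""
    return 2.0 * math.degrees(math.atan((r2_mm - r1_mm) / delta_d_mm))
```

For example, rings of radius 5 mm and 10 mm aligning 5 mm apart along the axis correspond to a 90-degree FoV.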
3. F/#

Measuring F/# directly is challenging and requires specialized equipment. Lighthouse recommends confirming F/# through design inspection of the physical aperture stop prior to assembling the lens. For near diffraction limited systems (nearly perfect lens systems), the F/# can be inferred through measurement of the MTF. The frequency at which the MTF goes to zero, or cutoff frequency, can be used to calculate the F/#, assuming diffraction limited lens performance.

4. Optical resolution

Optical resolution as defined by ISO 8600-5:2005 is measured at the resolution limit of the endoscope. While this measurement is valuable, it typically does not quantify the perceived sharpness of the endoscope. This is because, when viewing the image formed by the endoscope, our eyes are most sensitive to object sizes or frequencies in the mid-range of resolution, rather than at the extreme. The standard does specify how to convert a linear resolution, measured in line pairs per millimeter, to an angular resolution in line pairs per degree. This conversion is required for rigid endoscopes with an eyepiece, and it is particularly valuable when comparing two different types of endoscopes and when quantifying depth of field. As stated above, the method specified by ISO has limitations as a means of representing perceived image sharpness. It also has limitations for chip on tip endoscopes where the results are to be correlated back to a design or specification. Lighthouse uses and recommends slanted edge MTF for quantifying image sharpness for chip on tip endoscopes. Slanted edge MTF uses targets with rotated squares to measure the line spread function and MTF across the image. The advantage of slanted edge MTF over line pair based measurements is that the full MTF curve can be calculated from a single measurement.
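The core of the slanted edge computation is: differentiate the edge spread function (ESF) to get the line spread function (LSF), then take the magnitude of its Fourier transform and normalize. The sketch below shows only that core; a full ISO 12233-style implementation would first fit the edge angle and project pixels along it to build an oversampled ESF, which is omitted here, and the function name is illustrative.

```python
import cmath

def mtf_from_edge(esf, pitch_mm):
    """Simplified slanted-edge core: ESF -> LSF by differencing,
    then a one-sided DFT magnitude normalized to 1 at DC.
    Returns (frequencies in lp/mm, MTF values)."""
    lsf = [b - a for a, b in zip(esf, esf[1:])]
    n = len(lsf)
    nbins = n // 2 + 1
    mtf = []
    for k in range(nbins):
        coeff = sum(v * cmath.exp(-2j * cmath.pi * k * i / n)
                    for i, v in enumerate(lsf))
        mtf.append(abs(coeff))
    dc = mtf[0]
    mtf = [m / dc for m in mtf]
    freqs = [k / (n * pitch_mm) for k in range(nbins)]
    return freqs, mtf
```

A perfect step edge yields a delta-function LSF and a flat MTF of 1 at all frequencies; any real, blurred edge rolls the MTF off toward zero at high frequencies, and the whole curve comes from that single edge measurement.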