Characterization and Compensation of Stray Light Effects in Time of Flight Based Range Sensors
Jonas Holmlund
[email protected]

Master's Thesis in Engineering Physics
Department of Physics, Umeå University
Linnaeus väg 20, 901 87 Umeå, Sweden
www.physics.umu.se

May 2013
Supervisors: Emil Jonze, Jonas Sjöberg
Examiner: Ove Axner

Abstract

Much recent work has gone into the development of real-time 3D imagers, among them the time-of-flight (ToF) camera. This type of camera uses the time-of-flight principle to measure the distance to every point in its field of view. A light signal is sent towards the objects in the observed scene, where it is reflected and returns to the camera. The signal is detected at the camera sensor, usually a CCD or CMOS chip with an array of pixels, and a distance measurement is made at every pixel, producing a 3D image of the whole scene.

The problem is that some of the signal light travels through the optical system of the camera in ways not intended by the design. This light is called stray light. The distance measured at each sensor pixel depends both on the amount of light reaching that pixel and on the distance that light has traveled, so stray light spreading across the sensor directly degrades the accuracy of the depth data.

The purpose of this master's thesis work was to devise a software-based method for studying stray light effects, with the aim of reducing them. This was done by simulating light, emitted from predefined observed scenes, entering a simulated model of the optical system of a ToF camera, using Zemax.
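In the continuous-wave variant of the time-of-flight principle (covered in the theory chapter), the per-pixel distance is typically recovered from four phase-shifted correlation samples. The sketch below illustrates that standard four-bucket demodulation; the function name, sample convention, and modulation frequency are illustrative assumptions, not taken from the thesis.

```python
import numpy as np

C = 299_792_458.0  # speed of light (m/s)

def cw_tof_distance(a0, a1, a2, a3, f_mod):
    """Estimate distance from four correlation samples of a continuous-wave
    ToF pixel, taken at 0, 90, 180 and 270 degrees of the modulation period
    (convention: a_k = cos(phi - k*pi/2)). f_mod is the modulation
    frequency in Hz."""
    phase = np.arctan2(a1 - a3, a0 - a2) % (2 * np.pi)
    return C * phase / (4 * np.pi * f_mod)

# Synthetic check: a target 2.5 m away at 20 MHz modulation produces a
# round-trip phase shift of 4*pi*f*d/c; build the four samples and
# recover the distance from them.
f = 20e6
d_true = 2.5
phi = 4 * np.pi * f * d_true / C
a0, a1, a2, a3 = (np.cos(phi - k * np.pi / 2) for k in range(4))
d_est = cw_tof_distance(a0, a1, a2, a3, f)  # d_est ≈ 2.5 m
```

Note that a single modulation frequency only resolves distances unambiguously within half a modulation wavelength (about 7.5 m at 20 MHz).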
The ray-tracing data obtained in Zemax was then processed in Matlab to create simulated ToF images that display the same stray light effects as real ToF images. A short analysis of the optical system was made using the simulated model. The results suggest that the reflective surface at the sensor, as well as an additional band-pass filter sometimes used to prevent light from external sources from entering the camera, were fundamental sources of stray light.

Two algorithms designed to correct the errors caused by stray light were tested and evaluated. One used the properties of the evaluated incoming signal to approximate the contribution from stray light and subtract this contribution from the detected signal. The other used filtering, approximating the distribution of light spread across the sensor in order to create an appropriate filter. Both methods show some promise and could be used to decrease the effects of stray light, but the results differ from case to case. The weighting factors applied in the algorithms gave different results in different cases, and no values that work well for all measurement situations were found. It is therefore recommended that both methods be investigated and optimized further before being implemented in practice.

Sammanfattning

Recently, much work has been carried out on the development of 3D sensors that can construct 3D images in real time, including time-of-flight (ToF) cameras. These cameras use time of flight to measure the distance to every point in the camera's field of view. The camera sends a light signal towards the objects in its field of view, where the signal is reflected and returns to the camera. The signal is registered at a sensor in the camera, usually a CCD or CMOS chip built with an array of pixels, and the distance is calculated at every pixel. In this way a 3D image of the entire surroundings within the ToF camera's field of view is obtained at once.
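The filtering-based correction described in the abstract can be illustrated with a toy one-dimensional model: each pixel's measurement is treated as a complex phasor (amplitude times e^{iφ}), stray light adds a blurred, weighted copy of the scene, and the correction subtracts an estimate of that blur. Everything below (the flat spread kernel, the weighting factor `w`, the array sizes) is an illustrative assumption, not the thesis implementation.

```python
import numpy as np

def compensate_stray(z, kernel, w):
    """Subtract an estimated stray-light contribution from the complex
    pixel phasors z. The stray light is modelled as the measured scene
    convolved with a broad spread kernel; w is a tunable weighting
    factor (chosen per scene in this sketch)."""
    stray = np.convolve(z, kernel, mode="same")
    return z - w * stray

# Toy scene: a bright near object next to a dim far background.
n = 64
z = np.full(n, 0.1 * np.exp(1j * 2.0), dtype=complex)  # dim background, phase 2.0 rad
z[20:28] = 5.0 * np.exp(1j * 0.5)                      # bright foreground, phase 0.5 rad

kernel = np.ones(15) / 15.0                            # crude flat spread kernel
measured = z + 0.05 * np.convolve(z, kernel, mode="same")  # add simulated stray light
corrected = compensate_stray(measured, kernel, w=0.05)

# Phase error (i.e. distance error) of a background pixel near the
# bright patch, before and after compensation: the residual after
# correction is of second order in w, so the error shrinks markedly.
err_before = abs(np.angle(measured[30]) - 2.0)
err_after = abs(np.angle(corrected[30]) - 2.0)
```

The sketch shows why a single weighting factor is scene-dependent: `w` must match the (unknown) true stray-light fraction, which is consistent with the thesis finding that no one value worked for all measurement situations.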
A problem is that some of the light travels through the camera's lens system in a way that was not calculated or foreseen in the design of the camera. This light is called stray light. The distance measurement at each pixel depends both on how much light reaches the pixels and on how far that light has traveled, which means that stray light spreading across the sensor directly affects the distance measurement at the pixels.

The purpose of this thesis project was to devise a method, using computer software, for studying stray light effects, with the goal of being able to counteract them. With the program Zemax, simulations were run in which light from different objects was sent into a simulated model of the camera's lens system. Data from these simulations was transferred to Matlab and processed to create simulated ToF images that show the same stray light influence as real ToF images. A simple analysis of the optical system was made using the model. The results indicate that the reflective surfaces of the glass covering the sensor pixels, the camera's inner housing, and a band-pass filter, sometimes used to keep light from other sources from reaching the sensor, are fundamental sources of stray light.

Two algorithms designed to correct the errors that arise due to stray light were tested and analyzed. One method used the properties of the signal light to estimate how much the stray light contributes to each calculated distance value, in order to subtract this contribution from the registered signal. The other method used approximations of how the light spreads across the sensor to design a suitable filter that can reduce the influence of the stray light. Both methods proved potentially able to improve the camera's performance, but the results vary somewhat from case to case. The choice of values for certain weighting factors, used to adjust the effects of the algorithms, gave varying results depending on the measurement situation.
No optimized values that worked well for every measurement situation tested could be determined during the project. It is therefore recommended that these two algorithms be investigated further before they are applied in practice.

Preface

This master's thesis work was done in co-operation with the organizations Adopticum and Optronic, based in Skellefteå, Sweden. I would like to thank the people at Adopticum, particularly my supervisors Emil Jonze and Jonas Sjöberg, for giving me a lot of support and encouragement throughout my work on this thesis. I would also like to thank Emil Hällstig at Optronic for presenting me with the opportunity to do this master's thesis work, as well as for additional support during the project. Many thanks go to Bo Lindberg at Lens-Tech AB, who offered a great deal of help and instruction valuable to the project. I also want to thank Ove Axner at Umeå University for taking on the role of examiner for this project. This has been a great experience for me. I have taken a big step towards life outside the university, and I look forward to seeing what the future holds.

Jonas Holmlund

Table of Contents

1 INTRODUCTION
1.1 Background
1.2 The purpose of this thesis
1.3 Restrictions
2 THEORY
2.1 Time of flight
2.1.1 The Time of Flight principle
2.1.2 Continuous wave modulation
2.2 The camera – How it works
2.2.1 Components
2.2.2 The sensor
2.2.3 Partial reflection
2.2.4 Other sources of error
3 METHOD
3.1 General approach
3.1.1 Running simulations
3.1.2 Treating data using Matlab