CMOS Depth Image Sensor with Offset Pixel Aperture Using a Back-Side Illumination Structure for Improving Disparity
Jimin Lee 1, Sang-Hwan Kim 1, Hyeunwoo Kwen 1, Juneyoung Jang 1, Seunghyuk Chang 2, JongHo Park 2, Sang-Jin Lee 2 and Jang-Kyoo Shin 1,*

1 School of Electronics Engineering, Kyungpook National University, 80 Daehak-ro, Buk-gu, Daegu 41566, Korea; [email protected] (J.L.); [email protected] (S.-H.K.); [email protected] (H.K.); [email protected] (J.J.)
2 Center for Integrated Smart Sensors, KAIST, 291 Daehak-ro, Yuseong-gu, Daejeon 34141, Korea; [email protected] (S.C.); [email protected] (J.P.); [email protected] (S.-J.L.)
* Correspondence: [email protected]; Tel.: +82-53-950-5531

Received: 10 August 2020; Accepted: 7 September 2020; Published: 9 September 2020
Sensors 2020, 20, 5138; doi:10.3390/s20185138

Abstract: This paper presents a CMOS depth image sensor with offset pixel aperture (OPA) using a back-side illumination structure to improve disparity. The OPA method is an efficient way to obtain depth information with a single image sensor and without additional external components. Two types of apertures, the left-OPA (LOPA) and the right-OPA (ROPA), are applied to the pixels. The depth information is obtained from the disparity caused by the phase difference between the LOPA and ROPA images. In a CMOS depth image sensor with OPA, the disparity is the key quantity, and increasing it is a direct way to improve the sensor's performance. Disparity is affected by pixel height. Therefore, this paper compares two CMOS depth image sensors with OPA using front-side illumination (FSI) and back-side illumination (BSI) structures. Because FSI and BSI chips are fabricated in different processes, two similar chips were used for the measurements, matched by the ratio of the OPA offset to the pixel size. Both chips were evaluated for chief ray angle (CRA) response and disparity in the same measurement environment, and the experimental results of the two CMOS depth image sensors with OPA were compared and analyzed.

Keywords: offset pixel aperture; CMOS depth image sensor; back-side illumination structure; improving disparity

1. Introduction

Recent studies on image sensors have focused on depth cameras. Since the development of the camera, research has continuously advanced imaging technology for capturing two-dimensional (2D) images, higher-dimensional images, and more. However, despite the improved quality of 2D imaging technology, depth information is still not captured. Because conventional 2D image sensors cannot capture the depth information of a scene, the quality of 2D images is lower than that of the actual scene seen by the human eye. Three-dimensional (3D) imaging technology complements 2D imaging technology: it provides observers with more stereoscopic image information through depth information, which is not present in 2D images. Presently, a wide variety of studies are being conducted on 3D image sensor technology. Typical 3D imaging methods include time of flight (ToF), structured light, and stereo vision [1–17]. The ToF method determines distance by measuring the return time of light emitted from a high-power infrared (IR) source and reflected by the object. Structured light imaging illuminates an object with a specific light pattern and analyzes the pattern algorithmically to determine the object distance.
Finally, a stereo camera is a setup that uses a plurality of cameras to obtain images according to the angle of incidence, from which the distance is obtained. The above-mentioned methods rely on external elements to extract depth information and realize a 3D image. The manufacturing cost and power consumption of 3D image sensors using these methods are relatively high because two or more cameras, structured light sources, or IR sources are required. However, depth image sensors adopting the offset pixel aperture (OPA) technique can extract depth information with a simple structure and without external elements. Previously, a sensor fabricated using the OPA technique was reported with a front-side illumination (FSI) structure [18–20]. Generally, in depth image sensing, disparity tends to be inversely proportional to pixel height. However, CMOS depth image sensors with OPA that have the FSI structure are limited in that the pixel height cannot be reduced because of the multiple metal layers on the front side of the pixel. Therefore, a back-side illumination (BSI) structure is applied to the OPA CMOS depth image sensor to improve disparity. The OPA CMOS depth image sensor with the BSI structure has a lower pixel height than one with the FSI structure. When the pixel height is low, the difference between the peak response angles of the left-OPA (LOPA) and the right-OPA (ROPA) increases, and the disparity increases accordingly.
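Since the depth cue in the OPA sensor is the disparity between the LOPA and ROPA images, a minimal sketch of how such a disparity could be estimated may help fix ideas. The routine below performs simple one-dimensional block matching along the aperture-offset axis using plain NumPy; it is an illustration only, not the processing chain used for the sensors in this paper, and the array names (lopa_img, ropa_img), the window size, and the search range are assumptions made here.

```python
import numpy as np

def estimate_disparity(lopa_img, ropa_img, window=15, max_shift=8):
    """Estimate a per-column disparity between the LOPA and ROPA images by
    1-D block matching along the aperture-offset axis (illustration only)."""
    _, width = lopa_img.shape
    half = window // 2
    cols = range(half + max_shift, width - half - max_shift)
    disparity = np.zeros(len(cols))
    for i, x in enumerate(cols):
        ref = lopa_img[:, x - half:x + half + 1]
        best_shift, best_err = 0, np.inf
        for s in range(-max_shift, max_shift + 1):
            cand = ropa_img[:, x + s - half:x + s + half + 1]
            err = np.mean((ref - cand) ** 2)  # mean squared difference of the two patches
            if err < best_err:
                best_err, best_shift = err, s
        disparity[i] = best_shift
    return disparity

# Quick check with synthetic data: the ROPA image is a copy of the LOPA image
# shifted by 3 pixels, so the estimated disparity should be close to 3.
rng = np.random.default_rng(0)
lopa = rng.random((64, 128))
ropa = np.roll(lopa, 3, axis=1)
print(estimate_disparity(lopa, ropa)[:5])
```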
The rest of this paper is organized as follows. Disparity is described in Section 2. The design of the pixel structures is described in Section 3. Measurement of disparity based on distance and measurement of the chief ray angle (CRA) with the OPA-based depth image sensors are described in Section 4. Finally, Section 5 concludes the paper.

2. Disparity

Depth information is the most important factor for realizing 3D images. A conventional 2D image sensor uses only the x- and y-axis information of the object to generate data. Therefore, the image data from a 2D camera are formed without information about the distance to the actual object. To realize a 3D image, depth information must be added to the planar image data; when the depth information is extracted and applied to the image, the distance to the object can be determined. The OPA depth image sensor described in this paper uses disparity to extract the depth information. In the OPA method, an aperture is placed inside the pixel to obtain disparity information. The aperture integrated inside the pixel exhibits optical characteristics equivalent to those of the camera lens aperture. Figure 1 shows that the effects of the OPA and the camera lens aperture are equivalent: O1 is the offset value of the camera lens aperture and O2 is the offset value of the OPA.

Figure 1. Equivalent effect between OPA and camera lens aperture.

Figure 2 shows a cross-sectional view of the pixel of the CMOS depth image sensor with OPA. The equivalent offset f-number (F_OA) is determined by the pixel height (h) and the aperture offsets, and is given by:

F_OA = f/(2·O1) = h/(2·O2),
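To make the role of the pixel height concrete, the sketch below evaluates the relation above, together with the equivalent lens-aperture offset O1 = f·O2/h that follows from the OPA/lens-aperture equivalence of Figure 1 under a simple similar-triangles assumption made here, for two assumed pixel heights (a taller FSI-like stack and a shorter BSI-like stack). All numerical values are illustrative assumptions, not parameters of the fabricated chips.

```python
# Illustrative evaluation of the equivalent offset f-number F_OA = h / (2*O2)
# and the equivalent lens-aperture offset O1 = f * O2 / h (similar-triangles
# assumption). All numbers below are assumed; they are not taken from the paper.

def equivalent_offset(f_mm, o2_um, h_um):
    """Return (F_OA, O1 in micrometers) for a lens focal length f_mm,
    an OPA offset o2_um, and a pixel height h_um."""
    f_um = f_mm * 1000.0
    f_oa = h_um / (2.0 * o2_um)    # equivalent offset f-number
    o1_um = f_um * o2_um / h_um    # equivalent offset at the camera lens aperture
    return f_oa, o1_um

f_mm = 8.0    # assumed lens focal length
o2_um = 0.5   # assumed OPA offset inside the pixel

for label, h_um in [("FSI (taller stack)", 6.0), ("BSI (shorter stack)", 3.0)]:
    f_oa, o1 = equivalent_offset(f_mm, o2_um, h_um)
    print(f"{label}: F_OA = {f_oa:.1f}, equivalent lens offset O1 = {o1:.0f} um")
```

Under these assumed values, halving the pixel height halves F_OA and doubles the equivalent lens-aperture offset, which is consistent with the paper's observation that the lower pixel height of the BSI structure improves disparity.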