Multidimensional Optical Sensing-Imaging MOSIS PIEEE 2017

Total pages: 16

File type: PDF, size: 1020 KB

Multidimensional Optical Sensing and Imaging System (MOSIS): From Macroscales to Microscales

This review paper describes a passive multidimensional optical sensing and imaging system (MOSIS), which can be used for 3-D visualization, seeing through obscurations, material inspection, 3-D endoscopy, and 3-D object recognition from microscales to long-range imaging. The system utilizes time and space multiplexing, polarimetric, temporal, photon flux, and multispectral information to reconstruct multidimensionally integrated scenes.

By BAHRAM JAVIDI, Fellow IEEE, XIN SHEN, ADAM S. MARKMAN, PEDRO LATORRE-CARMONA, ADOLFO MARTÍNEZ-USÓ, JOSÉ MARTÍNEZ SOTOCA, FILIBERTO PLA, MANUEL MARTÍNEZ-CORRAL, GENARO SAAVEDRA, YI-PAI HUANG, AND ADRIAN STERN, Member IEEE

ABSTRACT | Multidimensional optical imaging systems for information processing and visualization technologies have numerous applications in fields such as manufacturing, medical sciences, entertainment, robotics, surveillance, and defense. Among different 3-D imaging methods, integral imaging is a promising multiperspective sensing and display technique. Compared with other 3-D imaging techniques, integral imaging can capture a scene using an incoherent light source and generate real 3-D images for observation without any special viewing devices. This review paper describes passive multidimensional imaging systems combined with different integral imaging configurations. One example is the integral-imaging-based multidimensional optical sensing and imaging system (MOSIS), which can be used for 3-D visualization, seeing through obscurations, material inspection, and object recognition from microscales to long-range imaging. This system utilizes many degrees of freedom, such as time and space multiplexing, depth information, polarimetric, temporal, photon flux and multispectral information based on integral imaging, to record and reconstruct the multidimensionally integrated scene. Image fusion may be used to integrate the multidimensional images obtained by polarimetric sensors, multispectral cameras, and various multiplexing techniques. The multidimensional images contain substantially more information compared with 2-D images or conventional 3-D images. In addition, we present recent progress and applications of 3-D integral imaging, including human gesture recognition in the time domain, depth estimation, mid-wave-infrared photon counting, 3-D polarimetric imaging for object shape and material identification, dynamic integral imaging implemented with liquid-crystal devices, and 3-D endoscopy for healthcare applications.

KEYWORDS | 3-D endoscopy; 3-D human activity recognition; 3-D imaging; dynamic integral imaging; long-range integral imaging; material analysis; multidimensional object recognition; multispectral imaging; photon counting; polarimetric imaging

Manuscript received August 19, 2016; revised October 18, 2016; accepted December 16, 2016. Date of publication March 2, 2017; date of current version April 20, 2017. The work of B. Javidi was supported in part by the National Science Foundation (NSF) under Grant NSF/IIS-1422179, in part by the Defense Advanced Research Projects Agency (DARPA), and in part by the U.S. Army under Contract W911NF-13-1-0485. The work of P. Latorre-Carmona, A. Martínez-Usó, J. M. Sotoca, and F. Pla was supported by the Spanish Ministry of Economy under Project ESP2013-48458-C4-3-P, by MICINN under Project MTM2013-48371-C2-2-PDGI, by Generalitat Valenciana under Project PROMETEO-II/2014/062, and by Universitat Jaume I through Project P11B2014-09. The work of M. Martínez-Corral and G. Saavedra was supported by the Spanish Ministry of Economy and Competitiveness under Grant DPI2015-66458-C2-1R, and by the Generalitat Valenciana, Spain, under Project PROMETEOII/2014/072.

B. Javidi, X. Shen, and A. S. Markman are with the Electrical and Computer Engineering Department, University of Connecticut, Storrs, CT 06269 USA (e-mail: [email protected]; [email protected]; [email protected]). P. Latorre-Carmona, A. Martínez-Usó, J. M. Sotoca, and F. Pla are with the Institute of New Imaging Technologies, Universidad Jaume I, 12071 Castellón de la Plana, Spain (e-mail: [email protected]; [email protected]; [email protected]; [email protected]). M. Martínez-Corral and G. Saavedra are with the Department of Optics, University of Valencia, 46100 Burjassot, Spain (e-mail: [email protected]; [email protected]). Y.-P. Huang is with the Department of Photonics and the Institute of Electro-Optical Engineering/Display Institute, National Chiao Tung University, Hsinchu 30010, Taiwan (e-mail: [email protected]). A. Stern is with the Electro-Optical Engineering Unit, Ben-Gurion University of the Negev, Beer Sheva 81405, Israel (e-mail: [email protected]).

Digital Object Identifier: 10.1109/JPROC.2017.2654318

0018-9219 © 2017 IEEE. Personal use is permitted, but republication/redistribution requires IEEE permission. See http://www.ieee.org/publications_standards/publications/rights/index.html for more information. Proceedings of the IEEE, Vol. 105, No. 5, May 2017, p. 850.

I. INTRODUCTION

There have been significant technological advancements in sensors, devices, materials, algorithms, and computational hardware. Therefore, sensing and visualization capabilities applied to real world objects have improved extensively. In recent decades, 3-D imaging technology has received interest from many research groups. Instead of conventional 2-D sensing techniques, which record the intensity of the scene, passive 3-D imaging also includes depth and directional information. Many techniques for 3-D imaging have been proposed, such as holography and interferometry [1], [2], two-view based stereoscopy [3], [4], and multi-view techniques for autostereoscopic 3-D imaging [5], [6], to cite a few.

Integral imaging [7] is an autostereoscopic 3-D sensing and imaging technique, which provides true 3-D images with full parallax and quasi-continuous viewing angles [8]. In addition, integral imaging can work well for long-range objects [9]. In contrast, some other 3-D sensing techniques, such as the time-of-flight camera [10] or structured light techniques [11], [12], may not work well for long-range objects. Integral imaging is a promising technique that has been used in various fields, such as 3-D sensing [13], 3-D displays [14]–[17], holographic display [18], 3-D imaging of objects in turbid water [19], 3-D tracking [20] and 3-D target detection and recognition [21], [22], photon counting 3-D sensing and visualization [23]–[25], 3-D microscopy [26]–[31] and endoscopy for microscale 3-D imaging and display [32], [33], head tracking 3-D display [34], and 3-D augmented reality [35]–[38], to cite a few.

Originally developed for space-based imaging [39], multispectral imaging captures the information corresponding to specific wavelengths of light. The spectrum for an imaging system can be extended from the visible range to the near-infrared (NIR) range, mid-wave-infrared (MIR) range, or long-wave-infrared (LWIR) range. Applications of multispectral imaging range from remote sensing [40]–[42] to medical imaging [43], to name a few.

One of the fundamental properties of light is its state of polarization.

Integrating features from multidimensional and multimodal imaging, that is, 3-D imaging, multispectral imaging, polarization imaging, etc., provides unique information about a scene. In this paper, we present an overview of some recent work on multidimensional sensing and integrated visualization with 3-D integral imaging technology. In addition, new work on using MOSIS 2.0 for 3-D object shape, material inspection, and recognition, such that similar objects with different materials can be discriminated, is presented. To the best of our knowledge, this is the first time that all of these degrees of freedom are integrated in a passive 3-D integral imaging system. This paper is organized as follows: the original concept of MOSIS [51] is first reviewed in Section II, followed by the principle and recent progress of the integral imaging technique in Section III. Section IV presents the development of 3-D polarimetric integral imaging sensing and visualization. Integral imaging techniques in the infrared domain are presented in Section V. Three-dimensional human gesture recognition using integral imaging videos is discussed in Section VI, and recent progress of MOSIS 2.0 is given in Section VII. Section VIII presents a brief overview of MOSIS in microscales for medical applications and dynamic integral imaging systems with time multiplexing implemented with liquid crystal devices. Conclusions are given in Section IX. Progress in these topics has grown substantially in recent years, and therefore it is difficult to give a complete overview of all the reported work. Thus, we apologize if some relevant work has been omitted in this review.

II. MULTIDIMENSIONAL OPTICAL SENSING AND IMAGING SYSTEMS (MOSIS)

In this section, the integral-imaging-based MOSIS is reviewed. MOSIS is an extension of the conventional 3-D integral imaging technique that incorporates multimodality into image sensing and reconstruction. Additional information obtained by the system can be further used for object recognition, material inspection, and integrated 3-D visualization, which can significantly enhance the amount of information extracted from a scene. The concept of MOSIS [51] is to use different degrees of freedom
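At the heart of the integral imaging techniques reviewed here is computational volumetric reconstruction: each elemental image, captured from a slightly different perspective, is shifted in proportion to its lens index and a depth-dependent disparity, and the shifted stack is averaged so that objects at the chosen depth add up in register and come into focus. A minimal sketch of this shift-and-sum idea (the function name, the pixel-pitch geometry, and the wrap-around shifting are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def reconstruct_plane(elemental, pitch_px, focal_px, depth):
    """Shift-and-sum integral-imaging reconstruction of one depth plane.

    `elemental` maps lens indices (i, j) to 2-D arrays. Each elemental
    image is shifted by its lens index times the per-lens disparity
    pitch_px * focal_px / depth, and the shifted stack is averaged.
    Objects lying at `depth` add coherently and appear in focus; objects
    at other depths are averaged out of register and blur.
    """
    shift = pitch_px * focal_px / depth  # disparity per lens, in pixels
    stack = [np.roll(img, (round(i * shift), round(j * shift)), axis=(0, 1))
             for (i, j), img in elemental.items()]
    return np.mean(stack, axis=0)
```

With a matched depth the point sources realign exactly; with a mismatched depth each elemental image lands at a different offset and the reconstruction blurs, which is the principle behind depth estimation and seeing through partial obscurations.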
Recommended publications
  • Scalable Multi-View Stereo Camera Array for Real World Real-Time Image Capture and Three-Dimensional Displays
    Scalable Multi-view Stereo Camera Array for Real World Real-Time Image Capture and Three-Dimensional Displays. Samuel L. Hill. B.S. Imaging and Photographic Technology, Rochester Institute of Technology, 2000; M.S. Optical Sciences, University of Arizona, 2002. Submitted to the Program in Media Arts and Sciences, School of Architecture and Planning, in partial fulfillment of the requirements for the degree of Master of Science in Media Arts and Sciences at the Massachusetts Institute of Technology, June 2004. © 2004 Massachusetts Institute of Technology. All Rights Reserved. Thesis Supervisor: Dr. V. Michael Bove Jr., Principal Research Scientist, Program in Media Arts and Sciences. Accepted by: Andrew Lippman, Chairperson, Department Committee on Graduate Students.

    Abstract: The number of three-dimensional displays available is escalating, and yet the capturing devices for multiple-view content are focused on either single-camera precision rigs that are limited to stationary objects or the use of synthetically created animations. In this work we will use the existence of inexpensive digital CMOS cameras to explore a multi-image capture paradigm and the gathering of real-world real-time data of active and static scenes.
  • Dr. Juan Carlos Jácome Fernández
    Capturing of 3D Content Using a Single Aperture Camera. Dr. Juan Carlos Jácome Fernández. A thesis submitted to the University of Bedfordshire in partial fulfilment of the requirements for the degree of Doctor of Philosophy, October 2017. (This is a digitised version of the dissertation; it is available to view only and is subject to copyright.)

    Abstract: Integral imaging has recently re-emerged as an alternative to current 3D capturing systems, providing post-production refocusing capabilities and reducing the complexity of 3D capturing systems. One of the main drawbacks of conventional plenoptic 1 integral imaging systems is the implementation of a single custom-made microlens array, which has a fixed focal length and a high cost/low scalability associated with it. This thesis demonstrates a variable focal length microlens array system, which can flexibly operate within a range of focal lengths, increase the cost-effectiveness of the integral imaging system, and offer the opportunity to manipulate the main camera optical parameters without modifying the objective lens settings. To validate the proposed concept, a custom-made integral imaging camera system was designed and built (IMPERX 4K system). Based on the results obtained from two initial optical simulations, a software simulation and a mathematical model, suitable microlens arrays were acquired, and several experiments were performed to establish the feasibility of a variable focal length microlens array system. The results obtained show the proposed system performed as expected.
The variable focal length system represents an ideal method to control microlens focal length: it offers a higher microlens count without a dedicated electronic system, its performance is not susceptible to temperature or pressure changes, and it can operate in real time because it does not require downtime to readjust.
  • Synthesis, Coding, and Evaluation of 3D Images Based on Integral Imaging
    Synthesis, Coding, and Evaluation of 3D Images Based on Integral Imaging. Roger Olsson, Department of Information Technology and Media, Mid Sweden University. Doctoral Thesis No. 55, Sundsvall, Sweden, 2008. ISBN 978-91-85317-98-1, ISSN 1652-893X.

    Abstract (translated from Swedish): In recent years, camera prototypes capable of capturing three-dimensional (3D) images have been presented, based on the 3D technique integral imaging (II). When these II pictures are viewed on a 3D display, they convey both a depth and a content that realistically change perspective as the viewer changes viewing position. The thesis concentrates on three limiting factors concerning II pictures. First, there is very limited public access to II pictures for comparative research and for the development of coding methods. There are also no objective quality metrics that explicitly measure distortion with respect to the II picture's properties: depth and viewing-angle dependence. Finally, current image-coding standards achieve low coding efficiency when applied to II pictures. A computer-rendering-based method has been developed that allows the production of different types of II pictures. An II camera model is included as a base, combined with a scene description language that allows arbitrarily complex virtual scenes to be defined. The light transport within the scene and up to the II camera is simulated using ray tracing and geometrical optics. The presented method is used to create a number of II camera models, scene definitions, and II pictures.
  • Generation of 3-D Synthetic Autostereoscopic Integral Images Using Computer Simulated Cameras
    GENERATION OF 3-D SYNTHETIC AUTOSTEREOSCOPIC INTEGRAL IMAGES USING COMPUTER SIMULATED CAMERAS. A thesis submitted in partial fulfilment of the requirements for the degree of Doctor of Philosophy by Shafik Salih, Electronic and Computer Engineering, School of Engineering and Design, Brunel University, March 2014.

    Abstract: Production of artificial three-dimensional (3-D) images has been the aim of much research over hundreds of years. 3-D images are images that create a sense of depth when viewing them. 3-D images are closer to real-world scenes than 2-D images due to the 3-D effect, or sense of depth, that they provide. The sense of depth can be caused by binocular cues, including convergence and parallax. Convergence is created by the difference between the angles of the left-eye and right-eye viewing axes. Parallax is the effect of viewing with one eye a view of the scene that is inherently shifted relative to the view seen by the other eye. Several techniques have targeted the creation of 3-D images with the mentioned cues. A technique is preferred when it is able to create 3-D images that the viewer can view without wearing special glasses and without viewer fatigue. Integral photography, which was invented in 1908, is able to meet these requirements. Based on integral photography, several techniques, research efforts and studies have been published. The purposes of this thesis include the computer simulation of flexible integral photography systems; the computer generation of good-quality 3-D static and animated integral images using the simulated systems; optimising the generation process to be more accurate, less expensive, more effective, and faster; and producing a portable specialist software tool to achieve these targets.
  • STEREOSCOPIC IMAGING TECHNOLOGY: a Review Of
    THE INTERNATIONAL JOURNAL OF VIRTUAL REALITY, Vol. I, No. 2. STEREOSCOPIC IMAGING TECHNOLOGY: A Review of Patents and the Literature. Michael Starks, 3DTV Corporation, San Rafael, California. Editor: Hal Thwaites.

    Abstract: This article contains comments on the nature of stereo perception as it relates to stereo video displays, as well as a discussion of interfaces for stereo graphics and the interaction of stereo video and stereo graphics. A number of areas of interest are briefly reviewed and accompanied by extensive citations from the patent and technical literature. These include single-camera (70 references) and dual-camera (100 references) stereoscopy, compatible 3D recording and transmission (57 references), and head-mounted displays (85 references).

    Excerpt: ...and anaglyph cassettes and videodiscs appear occasionally. Unfortunately, this technique, like those employing Pulfrich or prism glasses, is hopeless for high-quality or comfortable viewing with video. It is better with computer displays. James Butterfield broadcast side-by-side stereo images for viewing with prism glasses in Mexico in the 1950s. He was one of many to make stereo systems with dual cameras viewed through a binocular stereoscope. He was also one of the first to use polarized glasses to view anaglyphs by placing dichroic polarizers on the face of the CRT, an idea later refined by Benton [2]. Polarized glasses for use with dual cross-polarized CRTs, as well as interdigitated images on a single display covered with crossed polarizers, were proposed many times in the patent literature. However, the problem of manufacture prevented commercialization until Faris applied lithography to create the micropolarizer arrays [Faris]. This may be a viable alternative to field-sequential techniques where flat screens or projection are involved, but for CRTs it has the same problem of aligning pixels with the optical elements through a thick glass surface as the lenticular technique.
  • Towards 3D Television Through Fusion of Kinect and Integral
    894 JOURNAL OF DISPLAY TECHNOLOGY, VOL. 11, NO. 11, NOVEMBER 2015. Towards 3D Television Through Fusion of Kinect and Integral-Imaging Concepts. Seokmin Hong, Donghak Shin, Byung-Gook Lee, Adrián Dorado, Genaro Saavedra, and Manuel Martínez-Corral.

    Abstract—We report a new procedure for the capture and processing of light proceeding from 3D scenes of some cubic meters in size. Specifically, we demonstrate that with the information provided by a Kinect device it is possible to generate an array of microimages ready for their projection onto an integral-imaging monitor. We illustrate our proposal with some imaging experiments in which the final results are 3D images displayed with full parallax. Index Terms—Integral imaging, Kinect, 3D monitors.

    I. INTRODUCTION: Conventional photography is fully adapted for recording in a 2D sensor the images of the 3D world. However, the need for capturing and displaying the 3D information of 3D scenes is increasing very fast in the 21st century. Its potential application in, for example, microscopy [1], [2], medical imaging [3]–[6], optical inspection in production chains [7], security monitoring [8], or virtual simulators for civil or military applications [9], etc., makes the capture and display of 3D images a hot topic in research and engineering for the next decade. If we discard, at this moment, holography, which still needs coherent illumination, or stereoscopy, which does not provide real 3D experiences, we can affirm that technology based on the integral photography principle is on the right way to producing an acceptable 3D experience. Integral photography was proposed in 1908 by Gabriel Lippmann [10].
  • Symposium Program (Pdf)
    Display Week 2015, Society for Information Display: "The Evolution of Displays through Innovation." As electronic information displays become more and more ubiquitous throughout the world, the technology must evolve at an ever-increasing pace in order to meet the increasing demands of users. Display Week's keynote addresses will provide a vision of how the evolution of display technology will change the look and functionality of the displays of the future, and what role new materials, advances in technology, and emerging markets will play. Keynote speaker: Mr. Brian Krzanich, CEO, Intel Corp., Santa Clara, CA, USA: "Toward an Immersive Image Experience." Business Track: the Business Conference will kick off the Business Track for Display Week 2015 on Monday, June 1, 2015; the theme of this year's rendition is "Game Changers: Finding Ways to Increase Profitability." IHS/DisplaySearch analysts will anchor each session, presenting market and technology analysis and up-to-date forecasts; each session will include industry executives and marketing managers speaking on topics such as: Is Good Enough, Good Enough? Can LCDs Persevere?; A Better Consumer Experience: Display Customers' Point of View on Increasing Profitability; and Financial Implications that Can Lead to Success. A ribbon-cutting ceremony to declare the opening of the Display Week Exhibition will be held on Tuesday morning at 10:30 am directly outside the Exhibit Hall, presided over by SID President Amal Ghosh along with local dignitaries and major exhibitor executives.
  • Cutting Edge 3D Display Technology Optical Hardware
    Cutting Edge 3D Display Technology Optical Hardware. Jukka-Tapani Mäkinen | 3/2017. VTT Technical Research Centre of Finland. 3D display and rendering technology will play a pivotal role in powering augmented reality, mixed reality, and VR devices and experiences. The purpose of this report is to collect the state-of-the-art (SoA) optical hardware currently used in the field of 3D display technology. The first part of the document presents some basic background information on human visual perception and its effects on 3D display technologies. The second part lists existing 3D display types and briefly explains the optical hardware used in them. The third part of the document is focused on emerging 3D display optical hardware technologies, with main emphasis on light field (LF) and holographic techniques. InterDigital.com. About InterDigital: InterDigital, Inc., designs and develops advanced technologies that enable and enhance mobile communications and capabilities. Founded in 1972, our engineers have designed and developed a wide range of innovations used in mobile technology products and networks that include 2G, 3G, 4G and IEEE 802-related products and networks. For more than four decades, InterDigital has been a pioneer in mobile technology and a key contributor to global wireless standards. Our team of more than 150 engineers, nearly 75 percent of whom hold advanced degrees, including nearly 50 PhDs, has unparalleled expertise in major mobile connectivity and content delivery technologies. Since 2000, InterDigital has spent close to $900 million on technology research and development. The company's activities are organized around the concept of the Living Network: a future where intelligent networks self-optimize to deliver service tailored to the content, context and connectivity of the user, device or need.
  • Advancements in 3D Display Technologies for Single and Multi-User Applications By: Alex Lyubarsky
    University of Arizona, College of Optical Sciences. OPTI 909 – Master's Report. Professor Jim Schwiegerling; Academic Advisor: Dr. Hong Hua. Advancements in 3D Display Technologies for Single and Multi-User Applications. By: Alex Lyubarsky.

    Table of Contents: I. Preface/Introduction; II. Human Visual System (a. Depth Cues: How We Perceive 3D); III. The Rise and Fall of Stereoscopic 3D Displays; IV. What is the Light Field?; V. Various Autostereoscopic/Automultiscopic 3D Display Technologies (a. Parallax Barrier; b. Pinhole Array; c. Lenticular and Integral Imaging; d. Projection-based Multi-View; e. Super-Multi-View and Light Field Displays; f. Cascaded/Stacked Displays); VI. Holographic Displays; VII. Volumetric Displays (a. Static Volume Displays; b. Swept Volume Displays); VIII. 3D Displays for Near Eye Applications (a. Virtual Reality; b. Augmented and Mixed Reality); IX. Conclusion; X. References.

    I. Preface/Introduction: The trend in 3D displays has grown exponentially in the past couple of decades due to the many advantages over a two-dimensional display. Since humans live in a three-dimensional world, it is only natural to want to create visual content that mimics our daily experiences. With recent advancements in display technologies, 3D displays have transitioned from being a fantasy to becoming more of a reality. As Allied Market Research predicts, the 3D display market is expected to reach $112.9 billion globally by 2020 [1]. The most common displays that create a 3D visual experience can be found today in a local movie theater. Known as stereoscopic 3D, this creates a fun visual experience but is also known to be uncomfortable, especially over longer viewing times.
  • 2019 OLEN 115 172.Pdf
    Optics and Lasers in Engineering 115 (2019) 172–178. Journal homepage: www.elsevier.com/locate/optlaseng. GPU-accelerated integral imaging and full-parallax 3D display using stereo–plenoptic camera system. Seokmin Hong (a,∗), Nicolò Incardona (a), Kotaro Inoue (b), Myungjin Cho (b), Genaro Saavedra (a), Manuel Martinez-Corral (a). (a) 3D Imaging and Display Laboratory, Department of Optics, University of Valencia, 46100 Burjassot, Spain; (b) Department of Electrical, Electronic and Control Engineering, IITC, Hankyong National University, Anseong, 17579, South Korea.

    Keywords: 3D display; integral imaging; 3D data registration; point cloud; GPU; plenoptic camera; stereo camera.

    Abstract: In this paper, we propose a novel approach to produce integral images ready to be displayed onto an integral-imaging monitor. Our main contribution is the use of a commercial plenoptic camera, which is arranged in a stereo configuration. Our proposed set-up is able to record the radiance information, spatial and angular, simultaneously in each different stereo position. We illustrate our contribution by composing the point cloud from a pair of captured plenoptic images, and generating an integral image from the properly registered 3D information. We have exploited graphics processing unit (GPU) acceleration in order to enhance the integral-image computation speed and efficiency. We present our approach with imaging experiments that demonstrate the improved quality of the integral image. After the projection of such an integral image onto the proposed monitor, 3D scenes are displayed with full parallax.
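The registration-then-projection pipeline summarized in this abstract can be illustrated with a toy version of its final stage: projecting a registered 3D point cloud through a virtual pinhole (microlens) array to synthesize the microimages of an integral image. Everything below (function name, geometry, default parameters) is an illustrative assumption; the paper's actual system works from stereo-plenoptic captures and runs the computation on the GPU:

```python
import numpy as np

def point_cloud_to_microimages(points, colors, n_lens=8, mi_px=16,
                               pitch=1.0, gap=2.0):
    """Project a 3D point cloud through a virtual pinhole array to
    synthesize microimages (a simplified stand-in for integral-image
    generation; the geometry and names are assumptions).

    Each pinhole (u, v) sits at (u*pitch, v*pitch, 0). A point (x, y, z)
    with z > 0 projects into that pinhole's microimage where the
    point-to-pinhole ray meets a sensor plane at distance `gap` behind
    the array.
    """
    out = np.zeros((n_lens * mi_px, n_lens * mi_px, 3))
    for (x, y, z), c in zip(points, colors):
        for u in range(n_lens):
            for v in range(n_lens):
                px, py = u * pitch, v * pitch
                # extend the point->pinhole ray to the sensor plane
                sx = px + (px - x) * gap / z
                sy = py + (py - y) * gap / z
                # local pixel coordinate inside this microimage
                ix = int(round((sx - px) / pitch * mi_px)) + mi_px // 2
                iy = int(round((sy - py) / pitch * mi_px)) + mi_px // 2
                if 0 <= ix < mi_px and 0 <= iy < mi_px:
                    out[v * mi_px + iy, u * mi_px + ix] = c
    return out
```

A point on the optical axis of a given pinhole lands at the center pixel of that pinhole's microimage; off-axis pinholes see it shifted, which is exactly the parallax the monitor later replays.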
  • All-In-Focus Image Rendering Based on Holoscopic 3D Camera
    Journal of Computer and Communications, 2016, 4, 24-35. Published online May 2016 in SciRes. http://www.scirp.org/journal/jcc http://dx.doi.org/10.4236/jcc.2016.46003. Digital Refocusing: All-in-Focus Image Rendering Based on Holoscopic 3D Camera. Obaidullah Abdul Fatah, Peter Lanigan, Amar Aggoun, Mohammad Rafiq Swash. College of Engineering, Design and Physical Sciences, Brunel University, London, UK. Received 15 March 2016; accepted 27 May 2016; published 30 May 2016. Copyright © 2016 by authors and Scientific Research Publishing Inc. This work is licensed under the Creative Commons Attribution International License (CC BY). http://creativecommons.org/licenses/by/4.0/

    Abstract: This paper presents an innovative method for digital refocusing to different points in space after capture, which also extracts an all-in-focus image. The proposed method extracts the all-in-focus image using the Michelson contrast formula; hence it helps in calculating the coordinates of the 3D object location. With the light field integral camera set-up, the scene is captured with the objects precisely positioned at a measurable distance from the camera; this helps the refocusing process return the original location where the object is in focus, while elsewhere it appears blurred with less contrast. The highest contrast values at different points in space identify the focused points where the objects were initially positioned; as a result, an all-in-focus image can also be obtained. Detailed experiments are conducted to demonstrate the credibility of the proposed method.

    Keywords: Holoscopic 3D Image, Integral Image, Viewpoint, 3D Image, Digital Focusing, Light Field Image.

    1. Introduction: Holoscopic 3D imaging, also known as integral imaging, has been under constant development in the scientific as well as the entertainment community in recent years.
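The Michelson contrast criterion this abstract relies on is simple to state: C = (Imax − Imin)/(Imax + Imin), computed on each refocused slice, with the in-focus depth taken where C peaks. A hedged sketch of that selection step (function names are illustrative; the paper applies the criterion within its own refocusing pipeline):

```python
import numpy as np

def michelson_contrast(patch):
    """Michelson contrast C = (Imax - Imin) / (Imax + Imin) of an image patch."""
    imax, imin = float(patch.max()), float(patch.min())
    total = imax + imin
    return (imax - imin) / total if total > 0 else 0.0

def best_focus_index(refocused_stack):
    """Index of the slice with the highest Michelson contrast.

    `refocused_stack` is a list of 2D arrays, one per candidate depth
    (e.g. produced by shift-and-sum refocusing). The sharpest slice,
    i.e. the one with maximal contrast, is taken as the depth at which
    the object is in focus.
    """
    return int(np.argmax([michelson_contrast(s) for s in refocused_stack]))
```

Repeating this per image region, instead of globally, yields a depth label per region, from which the all-in-focus composite is assembled.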
  • New 3D Integral Imaging Camera Viewing Parameters
    International Journal of Emerging Engineering Research and Technology, Volume 6, Issue 12, 2018, pp. 17-29. ISSN 2394-5923 (Print) & ISSN 2394-5931 (Online). New 3D Integral Imaging Camera Viewing Parameters. Mahmoud G. Eljadid (1), A. Aggoun (2), Osama H. Youssef Atallah (3). (1) Software Engineering Department, Faculty of Information Technology, Tripoli University, Libya. (2) School of Mathematics and Computer Science, University of Wolverhampton, United Kingdom. (3) Biomedical Engineering Department, Medical Research Institute, Alexandria University, Egypt. Corresponding author: Mahmoud G. Eljadid, Software Engineering Department, Faculty of Information Technology, Tripoli University, Libya. Email: [email protected]

    Abstract: 3D holoscopic imaging (also known as integral imaging) is a technique that is capable of creating and encoding a true volume spatial optical model of the object scene in the form of a planar intensity distribution by using unique optical components. It is akin to holography in that 3D information recorded on a 2D medium can be replayed as a full 3D optical model. Typically, moving from one cylindrical lens to the next is carried out knowing the lens width, Lw, which defines the translation of the camera along the x optical axis and hence the position of the new camera cylindrical lens. This new renderer is referred to in the paper as integral ray tracing (IRT). In this paper, a method for generating 3D integral imaging camera file parameters is proposed. The camera file is an important part of the process of generating micro-images and setting up the camera position. The method is mainly based on a parallel multiprocessor ray tracing system, with "Tachyon" adapted as the renderer. Experimental results show validation of the new algorithm; the lenticular lens can be set up and tested on small-balls, gears, room, tree and teapot scenes.
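The camera-stepping rule quoted in this abstract is straightforward to express: the virtual camera advances one lens width Lw along the x optical axis per cylindrical lens, so lens k is rendered from x0 + k·Lw. A tiny sketch of generating those camera-file positions (the function name and parameters are illustrative, not from the paper):

```python
def camera_positions(n_lenses, lens_width, x0=0.0):
    """x-coordinates of the virtual camera for each cylindrical lens.

    Following the stepping rule described in the abstract, the camera is
    translated along the x optical axis by one lens width, Lw, per lens,
    so lens k is rendered from x0 + k * Lw.
    """
    return [x0 + k * lens_width for k in range(n_lenses)]
```

Each position would become one camera entry in the generated camera file, from which the renderer produces that lens's micro-image.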