Optical See-Through 2D/3D Compatible Display Using Variable-Focus Lens and Multiplexed Holographic Optical Elements


Optical See-Through 2D/3D Compatible Display Using Variable-Focus Lens and Multiplexed Holographic Optical Elements

Qinglin Ji 1, Huan Deng 1,*, Hanle Zhang 2, Wenhao Jiang 1, Feiyan Zhong 1 and Fengbin Rao 1

1 College of Electronics and Information Engineering, Sichuan University, Chengdu 610065, China; [email protected] (Q.J.); [email protected] (W.J.); [email protected] (F.Z.); [email protected] (F.R.)
2 School of Instrumentation Science and Optoelectronics Engineering, Beihang University, Beijing 100191, China; [email protected]
* Correspondence: [email protected]

Citation: Ji, Q.; Deng, H.; Zhang, H.; Jiang, W.; Zhong, F.; Rao, F. Optical See-Through 2D/3D Compatible Display Using Variable-Focus Lens and Multiplexed Holographic Optical Elements. Photonics 2021, 8, 297. https://doi.org/10.3390/photonics8080297

Received: 28 June 2021; Accepted: 23 July 2021; Published: 27 July 2021

Copyright: © 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

Abstract: An optical see-through two-dimensional (2D)/three-dimensional (3D) compatible display using a variable-focus lens and multiplexed holographic optical elements (MHOE) is presented. It mainly consists of a MHOE, a variable-focus lens and a projection display device. The customized MHOE, by using the angular multiplexing technology of volumetric holographic grating, records the scattering wavefront and the spherical wavefront array required for 2D/3D compatible display. In particular, we propose a feasible method to switch between the 2D and 3D display modes by using a variable-focus lens in the reconstruction process. The proposed system solves the problem of bulky volume and makes the MHOE more efficient to use. Based on the requirements of the 2D and 3D displays, we calculated the liquid pumping volume of the variable-focus lens under the two corresponding diopters.

Keywords: optical see-through 2D/3D; variable-focus lens; holographic optical elements

1. Introduction

Augmented reality (AR) is a technology which allows computer-generated virtual imagery to exactly overlay physical objects [1,2]. AR displays mainly include optical see-through [3], video see-through [4] and reflection-type [5,6] devices. At present, the optical see-through display has become the mainstream technical solution, because it allows the user to watch the real environment directly through an image combiner and improves the user's perception of the real world. As a kind of true three-dimensional (3D) display technology, integral imaging has a compact structure and can provide various physiological depth cues [7,8]. It is conducive to the natural integration of virtual 3D images and real scenes, and is well suited to being integrated into AR display devices [9,10]. However, due to the optical modulation of the lens array, integral imaging is not compatible with two-dimensional (2D) display. Although a 3D display can provide depth information that is absent in a 2D display, current 3D resolution is severely degraded. The 2D display has the advantages of high resolution and a large viewing angle, but it cannot replace 3D display. Therefore, some researchers have proposed 2D/3D compatible displays, such as switching backlight units by using a polarized pinhole array [11,12] or polymer-dispersed liquid crystal (PDLC) [13], or using dynamic parallax grating technology [14], but these technologies cannot be applied to optical see-through AR display devices.

The key to realizing an optical see-through AR display is the image combiner that overlaps the virtual images and the real scenes. Image combiners can be further divided into two categories: reflection-type devices [15,16] and diffraction-type devices [17,18]. Compared with reflection-type devices, diffraction-type devices have many advantages, such as unique angle and wavelength selectivity, flexibility of design, and high diffraction efficiency and transparency. A diffractive combiner can replace one or several optical elements in the system and is therefore an ideal image combiner [19]. Yeom et al. proposed a 2D/3D convertible projection system using see-through integral imaging based on a HOE. The use of a prism reduced the number of projectors and enabled a compact form factor, but the utilization of the HOE was only 50%; that is, one half of the HOE was used for the 2D display and the other half for the 3D display [20]. Chou et al. proposed a 2D/3D hybrid integral imaging display based on a liquid crystal micro-lens array (LCMLA), in which the LCMLA either focuses light or has no optical effect, depending on the polarization state of the incident light. The system therefore needed additional devices, such as a twisted nematic cell and a polarizer, to control the polarization state, which increased its redundancy [21]. In our previous work, we proposed attaching a PDLC film to a lens-array HOE to constitute a 2D/3D projection screen. The lens-array HOE and the PDLC film were used to realize the integral imaging 3D display and the 2D display, respectively. However, the PDLC film degraded the transparency of the system, and the optical see-through property could not be achieved in 2D mode [22].

In this paper, we propose an optical see-through 2D/3D compatible display using a variable-focus lens and multiplexed holographic optical elements (MHOE). Based on angular multiplexing technology, the MHOE records the scattering wavefront of a diffuser for the 2D display and the spherical wavefront array of a micro-lens array (MLA) for the 3D display. The incident angle of the projected light varies with the state of the variable-focus lens so as to meet the two Bragg diffraction conditions of the 2D and 3D displays. Therefore, our approach does not require polarization devices, which reduces system complexity; it also enhances the effective utilization of the MHOE and maintains the optical see-through property in both 2D and 3D modes, all of which is highly beneficial for AR applications.

2. Structure and Principle

Figure 1 shows the structure of our proposed optical see-through 2D/3D compatible display system. It is mainly composed of a MHOE, a variable-focus lens and a projector. Essentially, the MHOE is a volumetric holographic grating: the scattering wavefront of a diffuser and the spherical wavefront array of a MLA are recorded on a single volumetric holographic grating by two-step recording with angular multiplexing technology. The variable-focus lens has two optical states with diopters φ1 and φ2, and it modulates the light from the projector to meet, respectively, the two Bragg diffraction conditions of the MHOE.

Figure 1. Schematic diagrams of the system structure: (a) 3D image reconstruction when the probe beam satisfies Bragg diffraction condition 1; (b) 2D image display when the probe beam satisfies Bragg diffraction condition 2.
Recommended publications
  • Stereo Capture and Display At
Creation of a Complete Stereoscopic 3D Workflow for SoFA, by Allison Hettinger and Ian Krassner. 1. Introduction. 1.1 3D Trend. Stereoscopic 3D motion pictures have recently risen to popularity once again following the success of films such as James Cameron’s Avatar. More and more films are being converted to 3D, but few films are being shot in 3D. Currently available technology, knowledge of that technology and cost prevent most films from being shot in 3D. Shooting in 3D is an advantage because two slightly different images are produced that mimic the two images the eyes see in normal vision. Many take the cheaper route of shooting in 2D and converting to 3D. This results in a 3D image, but usually one nowhere near the quality of a film originally shot in 3D, because a computer has to create the second image, which can introduce errors. It is also important to note that a 3D image does not necessarily mean a stereo image. "3D" can describe images that merely have an appearance of depth, such as 3D animations. Stereo images refer to images that make use of retinal disparity to create the illusion of objects going out of and into the screen plane. Stereo images are optical illusions that make use of several cues that the brain uses to perceive a scene. Examples of monocular cues are relative size and position, texture gradient, perspective and occlusion. These cues help us determine the relative depth positions of objects in an image. Binocular cues such as retinal disparity and convergence are what give the illusion of depth.
  • Depth-Aware Blending of Smoothed Images for Bokeh Effect Generation
Depth-aware Blending of Smoothed Images for Bokeh Effect Generation. Saikat Dutta, Indian Institute of Technology Madras, Chennai, PIN-600036, India. ABSTRACT: Bokeh effect is used in photography to capture images where the closer objects look sharp and everything else stays out of focus. Bokeh photos are generally captured using Single Lens Reflex cameras with a shallow depth-of-field. Most modern smartphones can take bokeh images by leveraging dual rear cameras or good auto-focus hardware. However, for smartphones with a single rear camera and without good auto-focus hardware, we have to rely on software to generate bokeh images. This kind of system is also useful for generating the bokeh effect in already captured images. In this paper, an end-to-end deep learning framework is proposed to generate a high-quality bokeh effect from images. The original image and different versions of smoothed images are blended to generate the bokeh effect with the help of a monocular depth estimation network. The proposed approach is compared against a saliency-detection-based baseline and a number of approaches proposed in the AIM 2019 Challenge on Bokeh Effect Synthesis. Extensive experiments are shown in order to understand different parts of the proposed algorithm. The network is lightweight and can process an HD image in 0.03 seconds. This approach ranked second in the AIM 2019 Bokeh Effect Challenge, Perceptual Track. 1. Introduction: Depth-of-field effect or bokeh effect is often used in photography to generate aesthetic pictures. … an important problem in Computer Vision that has gained attention recently. Most of the existing approaches (Shen et al., 2016; Wadhwa et al., 2018; Xu et al., 2018) work on human portraits.
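The blending idea in the abstract can be sketched without any network: given a depth map (here supplied by hand; in the paper it is predicted by a monocular depth estimation network), blend the sharp image with a smoothed copy using a per-pixel weight that grows with distance from the focal plane. The function names and the box blur standing in for the paper's smoothing stages are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def box_blur(img, k):
    """Crude box blur: average of k*k edge-padded shifted copies of the image."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    h, w = img.shape
    out = np.zeros((h, w), dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + h, dx:dx + w]
    return out / (k * k)

def synthetic_bokeh(img, depth, focus_depth, k=5):
    """Depth-aware blend: pixels whose depth is far from focus_depth receive the
    blurred image, while in-focus pixels keep the original values."""
    blurred = box_blur(img, k)
    w = np.clip(np.abs(depth - focus_depth), 0.0, 1.0)  # 0 = perfectly in focus
    return (1.0 - w) * img + w * blurred
```

Replacing the single box blur with several smoothing strengths selected by depth bands recovers the "different versions of smoothed images" idea from the abstract.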
  • Virtual Reality and Visual Perception by Jared Bendis
Virtual Reality and Visual Perception by Jared Bendis. Introduction: Goldstein (2002) defines perception as a “conscious sensory experience” (p. 6), and as scientists investigate how the human perceptual system works they also find themselves investigating how the human perceptual system doesn’t work and how that system can be fooled, exploited, and even circumvented. The pioneers in the ability to control the human perceptual system have been in the field of Virtual Reality. In Simulated and Virtual Realities – Elements of Perception, Carr (1995) defines Virtual Reality as “…the stimulation of human perceptual experience to create an impression of something which is not really there” (p. 5). Heilig (2001) refers to this form of “realism” as “experience”; in his 1955 article about “The Cinema of the Future” he addresses the need to look carefully at perception and breaks down the precedence of perceptual attention as: sight 70%, hearing 20%, smell 5%, touch 4%, taste 1% (p. 247). Not surprisingly, sight is considered the most important of the senses, as Leonardo da Vinci observed: “The eye deludes itself less than any of the other senses, because it sees by none other than the straight lines which compose a pyramid, the base of which is the object, and the lines conduct the object to the eye… But the ear is strongly subject to delusions about the location and distance of its objects because the images [of sound] do not reach it in straight lines, like those of the eye, but by tortuous and reflexive lines. … The sense of smell is even less able to locate the source of an odour.”
  • Intraocular Lenses and Spectacle Correction
MEDICAL POLICY POLICY TITLE INTRAOCULAR LENSES, SPECTACLE CORRECTION AND IRIS PROSTHESIS POLICY NUMBER MP-6.058 Original Issue Date (Created): 6/2/2020 Most Recent Review Date (Revised): 6/9/2020 Effective Date: 2/1/2021 POLICY PRODUCT VARIATIONS DESCRIPTION/BACKGROUND RATIONALE DEFINITIONS BENEFIT VARIATIONS DISCLAIMER CODING INFORMATION REFERENCES POLICY HISTORY I. POLICY Intraocular Lens Implant (IOL) Initial IOL Implant A standard monofocal intraocular lens (IOL) implant is medically necessary when the eye’s natural lens is absent including the following: Following cataract extraction Trauma to the eye which has damaged the lens Congenital cataract Congenital aphakia Lens subluxation/displacement A standard monofocal intraocular lens (IOL) implant is medically necessary for anisometropia of 3 diopters or greater, and uncorrectable vision with the use of glasses or contact lenses. Premium intraocular lens implants including but not limited to the following are not medically necessary for any indication, including aphakia, because each is intended to reduce the need for reading glasses. Presbyopia correcting IOL (e.g., Array® Model SA40, ReZoom™, AcrySof® ReStor®, TECNIS® Multifocal IOL, Tecnis Symfony and Tecnis SymfonyToric, TRULIGN, Toric IO, Crystalens Aspheric Optic™) Astigmatism correcting IOL (e.g., AcrySof IQ Toric IOL (Alcon) and Tecnis Toric Aspheric IOL) Phakic IOL (e.g., ARTISAN®, STAAR Visian ICL™) Replacement IOLs
  • Stereo 3D (Depth) from 2D
Stereo 3D (Depth) from 2D

• 3D information is lost by projection; how do we recover it? (Topics: viewing stereograms, autostereograms, depth from a stereo image pair, 3D models.)
• Depth cues in a single image: shadows, occlusions, shading, size constancy, perspective illusions, height in plane, texture gradient.
• 3D from 2D + accommodation: accommodation (focus) relies on the change in lens curvature according to object depth (effective depth about 20–300 cm); eye vergence also varies with object depth (effective depth up to about 6 m); motion and stereo provide further cues.
• Stereo vision: in a system with 2 cameras (eyes), 2 different images are captured. The "disparity" between the images is larger for closer objects (disp ∝ 1/depth), and "fusion" of these 2 images gives depth information.
• Image separation for stereo: special glasses showing red/green images through red/green filters (anaglyphs); orthogonal polarization, linear or circular, using 2 polarized projectors or alternating polarization; alternating shuttering (active shutter systems). Viewers through history: parlor stereo viewer (1850), Viewmaster (1939), ViduTech (2011).
• Glasses-free TV and computer screens: the parallax stereogram / parallax barrier display uses vertical slits to block part of the screen from each eye; the lenticular lens method uses lens arrays to send a different image to each eye.
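The disparity relation above (disp ∝ 1/depth) becomes quantitative for a rectified camera pair: depth Z = f·B/d, with focal length f in pixels, baseline B, and disparity d in pixels. A minimal sketch with illustrative numbers (a roughly eye-like 6.5 cm baseline):

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Stereo triangulation: Z = f * B / d, so disparity falls off as 1/depth."""
    if disparity_px <= 0:
        raise ValueError("zero or negative disparity corresponds to a point at or beyond infinity")
    return focal_px * baseline_m / disparity_px

# Doubling the disparity halves the estimated depth: closer objects shift more.
near = depth_from_disparity(700, 0.065, 70)   # 0.65 m
far = depth_from_disparity(700, 0.065, 35)    # 1.3 m
```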
  • Adaptive Optics in Laser Processing Patrick S
Salter and Booth, Light: Science & Applications (2019) 8:110. Official journal of the CIOMP 2047-7538. https://doi.org/10.1038/s41377-019-0215-1 www.nature.com/lsa REVIEW ARTICLE Open Access. Adaptive optics in laser processing. Patrick S. Salter and Martin J. Booth. Abstract: Adaptive optics are becoming a valuable tool for laser processing, providing enhanced functionality and flexibility for a range of systems. Using a single adaptive element, it is possible to correct for aberrations introduced when focusing inside the workpiece, tailor the focal intensity distribution for the particular fabrication task and/or provide parallelisation to reduce processing times. This is particularly promising for applications using ultrafast lasers for three-dimensional fabrication. We review recent developments in adaptive laser processing, including methods and applications, before discussing prospects for the future. Introduction: Over the past two decades, direct laser writing (DLW) with ultrafast lasers has developed into a mature, diverse and industrially relevant field [1–5]. The ultrashort nature of the laser pulses means that energy can be delivered to the focus in a period shorter than the characteristic timescale for thermal diffusion, leading to highly accurate material modification [1]. Thus, by focusing ultrashort pulses onto the surface of the workpiece, precise cuts and holes can be manufactured with a minimal heat-affected zone. An adaptive optical element can enhance ultrafast DLW by enabling control over the fabrication laser beam and allowing it to be dynamically updated during processing. Adaptive elements can modulate the phase, amplitude and/or polarisation of the fabrication beam, providing many possibilities for advanced control of the laser fabrication process. In this review, we briefly outline the application areas of AO for laser processing before considering the methods of AO, including the range of adaptive elements.
  • A Curriculum Guide
FOCUS ON PHOTOGRAPHY: A CURRICULUM GUIDE. This page is an excerpt from Focus on Photography: A Curriculum Guide, written by Cynthia Way for the International Center of Photography. © 2006 International Center of Photography. All rights reserved. Published by the International Center of Photography, New York. Printed in the United States of America. Please credit the International Center of Photography on all reproductions. This project has been made possible with generous support from Andrew and Marina Lewin, the GE Fund, and public funds from the New York City Department of Cultural Affairs Cultural Challenge Program. PART IV: Resources. Focus Lesson Plans and Activities. INDEX TO FOCUS LINKS: Focus Link 1, LESSON 1: Introductory Polaroid Exercises; Focus Link 2, LESSON 2: Camera as a Tool; Focus Link 3, LESSON 3: Photographic Field
  • A Theoretical and Practical Introduction to Optics a Theoretical and Practical Introduction to Optics
White Paper: A Theoretical and Practical Introduction to Optics. Be honest: do you really know how to calculate the focal length of a lens? If so, you are an exception to the rule and can stop reading here!! For the rest of you, here is a second chance. Back to square one. "Piece of broken glass starts forest fire" – a common headline during the summer. But how could this have happened? Due to the enormous distance between the Earth and the Sun, the Sun only appears as a tiny point emitting parallel rays of light (figure 1a). Should these parallel rays pass through a lens (or a piece of glass, which has similar characteristics), the rays would meet behind the lens at what is called the focal point. But what happens if our point of light is so near to the lens that we cannot assume parallel rays of light? They cross each other behind the focal point (figure 1b). If we take a look at the image of our point of light at the focal point's position, we will see an unclear, blurred spot. And so the question arises: "what is focusing?" Focusing is to increase the distance between the focal plane and the lens until the focal plane and the junction of the rays overlap each other (figure 1c). Thus, for single points of light the situation is quite simple. But what happens to the image of screws, PCBs or plates of steel? From points of light to images: a point of light does not necessarily originate directly from the sun, candles or lamps; it can also result from a reflection.
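The "what is focusing?" question above can be made concrete with the thin-lens equation, 1/f = 1/d_o + 1/d_i: as the object comes closer, the image plane moves further behind the lens, which is exactly why focusing increases the lens-to-sensor distance. A minimal sketch (paraxial thin-lens model; the 50 mm focal length and 1 m object distance are illustrative numbers):

```python
def image_distance(f_mm, object_distance_mm):
    """Solve the thin-lens equation 1/f = 1/d_o + 1/d_i for the image distance d_i."""
    if object_distance_mm == f_mm:
        return float("inf")  # object at the focal point: rays emerge parallel
    return 1.0 / (1.0 / f_mm - 1.0 / object_distance_mm)

# A 50 mm lens imaging an object 1 m away: the image plane sits ~52.6 mm behind
# the lens, i.e. ~2.6 mm further than for an object at infinity.
d_near = image_distance(50.0, 1000.0)
d_far = image_distance(50.0, 1e9)  # effectively parallel rays: d_i approaches f
```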
  • Mission to the Solar Gravity Lens Focus: Natural Highground for Imaging Earth-Like Exoplanets L
Planetary Science Vision 2050 Workshop 2017 (LPI Contrib. No. 1989) 8203.pdf MISSION TO THE SOLAR GRAVITY LENS FOCUS: NATURAL HIGHGROUND FOR IMAGING EARTH-LIKE EXOPLANETS. L. Alkalai1, N. Arora1, M. Shao1, S. Turyshev1, L. Friedman8, P. C. Brandt3, R. McNutt3, G. Hallinan2, R. Mewaldt2, J. Bock2, M. Brown2, J. McGuire1, A. Biswas1, P. Liewer1, N. Murphy1, M. Desai4, D. McComas5, M. Opher6, E. Stone2, G. Zank7. 1Jet Propulsion Laboratory, Pasadena, CA 91109, USA; 2California Institute of Technology, Pasadena, CA 91125, USA; 3The Johns Hopkins University Applied Physics Laboratory, Laurel, MD 20723, USA; 4Southwest Research Institute, San Antonio, TX 78238, USA; 5Princeton Plasma Physics Laboratory, Princeton, NJ 08543, USA; 6Boston University, Boston, MA 02215, USA; 7University of Alabama in Huntsville, Huntsville, AL 35899, USA; 8Emeritus, The Planetary Society. Figure 1: An SGL Probe Mission is a first step in the goal to search for and study potentially habitable exoplanets. This figure was developed as a product of two Keck Institute for Space Studies (KISS) workshops on the topic of the “Science and Enabling Technologies for the Exploration of the Interstellar Medium” led by E. Stone, L. Alkalai and L. Friedman. Fig. 2: Imaging of an exo-Earth with the solar gravitational lens. The exo-Earth occupies a (1 km × 1 km) area at the image plane. Using a 1 m telescope as a 1-pixel detector provides a (1000 × 1000) pixel image! Introduction: Recent data from the Voyager 1, Kepler and New Horizons spacecraft have resulted in breathtaking discoveries that have excited the public and invigorated the space science community. Voyager 1, the first spacecraft to arrive at the heliopause, discovered that the interstellar medium is far more complicated and turbulent than expected; the Kepler telescope discovered that exoplanets are not only ubiquitous but also diverse in our galaxy, and that Earth-like exoplanets … According to Einstein’s general relativity, gravity induces refractive properties of space-time, causing a massive object to act as a lens by bending light.
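The mission's distant destination follows from that one line of general relativity: light grazing the Sun at impact parameter b is deflected by α = 4GM/(c²b), so rays grazing the solar limb cross the optical axis at F = b/α ≈ 547 AU, the start of the focal region. A back-of-envelope check:

```python
# Deflection of limb-grazing light and the resulting minimum focal distance of
# the solar gravitational lens.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # solar mass, kg
C = 2.998e8        # speed of light, m/s
R_SUN = 6.957e8    # solar radius (minimum impact parameter), m
AU = 1.496e11      # astronomical unit, m

alpha = 4 * G * M_SUN / (C**2 * R_SUN)  # ~8.5e-6 rad, the classic 1.75 arcsec
F_au = (R_SUN / alpha) / AU             # ~547 AU: why the probe must travel so far
```

Rays with larger impact parameters focus even further out, so the focal "point" is really a focal half-line beginning near 550 AU.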
  • Dr. Juan Carlos Jácome Fernández
Title: CAPTURING OF 3D CONTENT USING A SINGLE APERTURE CAMERA. Name: Dr. Juan Carlos Jácome Fernández. This is a digitised version of a dissertation submitted to the University of Bedfordshire. It is available to view only. This item is subject to copyright. CAPTURING OF 3D CONTENT USING A SINGLE APERTURE CAMERA. By Juan Carlos Jácome Fernández. A thesis submitted to the University of Bedfordshire in partial fulfilment of the requirements for the degree of Doctor of Philosophy, October 2017. Abstract: Integral imaging has recently re-emerged as an alternative to current 3D capturing systems, providing post-production refocusing capabilities and reducing the complexity of 3D capturing systems. One of the main drawbacks of conventional plenoptic 1 integral imaging systems is the implementation of a single custom-made microlens array, which has a fixed focal length and a high cost/low scalability associated with it. This thesis demonstrates a variable focal length microlens array system which can flexibly operate within a range of focal lengths, increase the cost-effectiveness of the integral imaging system and offer the opportunity to manipulate the main camera optical parameters without modifying the objective lens settings. To validate the proposed concept, a custom-made integral imaging camera system was designed and built (IMPERX 4K system). Based on the results obtained from two initial optical simulations (a software simulation and a mathematical model), suitable microlens arrays were acquired, and several experiments were performed to establish the feasibility of a variable focal length microlens array system. The results obtained show the proposed system performed as expected.
The variable focal length system represents an ideal method to control microlens focal length: it offers a higher microlens count without a dedicated electronic system, its performance is not susceptible to temperature or pressure changes, and it can operate in real time, as it does not require downtime to re-adjust.
  • Camera Settings Guide
Camera Settings Guide • " " and " " are trademarks or registered trademarks of Sony Corporation. • All other company and product names mentioned herein are used for identification purposes only and may be the trademarks or registered trademarks of their respective owners. TM and ® symbols are not included in this booklet. • Screen displays and effects used to illustrate some functions are simulated. Conventional autofocus has until now dealt with space alone. Sony goes one step further, a big step, with an innovative image sensor that picks up both space and time to capture moving subjects with new clarity. Sony spells the beginning of a new autofocus era. 4D FOCUS allows you to take crisper photos than ever. Plain old autofocus is a thing of the past. The future of photography is in motion. What is 4D FOCUS? Space: 3D. Time: 4D. Area, Wide: the wide AF area, covering nearly the entire frame, allows focusing on a subject positioned even off the center of the frame. Depth, Fast: Fast Hybrid AF, combining phase-detection AF and contrast-detection AF, instantly detects distance to the subject to focus accurately. Time, Steadfast: an advanced AF algorithm accurately predicts the subject’s next move; precise AF tracking allows focus to be maintained even on fast-moving subjects. Meeting your focusing demands. Basic AF performance: Wide (focusing over a wide area): the α6000 employs a focal-plane phase-detection AF sensor with 179 AF points spread to cover nearly the entire frame. Fast (instant focusing!): Advanced Fast Hybrid AF combines phase-detection AF and contrast-detection AF to achieve … Steadfast (once it's focused, it never lets go): with Focus Mode set to AF-C, the camera displays one or more small green frames …
  • Review of Stereoscopic 3D Glasses for Gaming
ISSN: 2278 – 1323 International Journal of Advanced Research in Computer Engineering & Technology (IJARCET) Volume 5, Issue 6, June 2016. Review of Stereoscopic 3D Glasses for Gaming. Yogesh Bhimsen Joshi, Avinash Gautam Waywal. Abstract: Only a decade ago, watching in 3-D meant seeing through a pair of red and blue glasses. It was really great at first sight, but 3-D technology has been moving on. Scientists have become aware of how human vision works, and the current generation of computers is more powerful than ever before. Therefore, most computer users are familiar with 3-D games. Back in the '90s, most enthusiasts were amazed by the game Castle Wolfenstein 3D, which took place in a maze-like castle that existed in three dimensions. Nowadays, gamers can enjoy even more complicated graphics with the available peripherals. This paper gives an overview of 3-D gaming technology and various gaming peripherals. … sound cards and CD-ROMs had the multimedia capability. Early 3D games such as Alpha Waves and Starglider 2 began with flat-shaded graphics and then progressed with simple forms of texture mapping, such as in Wolfenstein 3D. In the early 1990s, the most popular method of publishing games for smaller developers was shareware distribution, including then-fledgling companies such as Apogee (now branded as 3D Realms), Epic MegaGames (now known as Epic Games), and id Software. It gave consumers the opportunity to try a trial portion of the game, restricted to the complete first section or an episode of the full version, before purchasing it.