

Article

Verification of an Accommodative Response for Depth Measurement of Floating Hologram Using a Holographic Optical Element

Leehwan Hwang 1, Sungjae Ha 2, Philippe Gentet 2, Jaehyun Lee 1, Soonchul Kwon 3 and Seunghyun Lee 4,*

1 Department of Plasma Bio-Display, Kwangwoon University, Seoul 01897, Korea; [email protected] (L.H.); [email protected] (J.L.)
2 Spatial Computing Convergence Center, Kwangwoon University, Seoul 01897, Korea; [email protected] (S.H.); [email protected] (P.G.)
3 Graduate School of Smart Convergence, Kwangwoon University, Seoul 01897, Korea; [email protected]
4 Ingenium College, Kwangwoon University, Seoul 01897, Korea
* Correspondence: [email protected]; Tel.: +82-940-5290

Received: 31 August 2020; Accepted: 24 September 2020; Published: 28 September 2020

Abstract: Floating holograms using holographic optical element screens differ from existing systems because they can float 2D images in the air and provide a sense of depth. Until now, the verification of such displays has been conducted only on the system implementation, and only the diffraction efficiency and angle of view of the hologram have been verified. Although such displays can be directly observed with the human eye, the eye's accommodative ability has not been quantitatively verified. In this study, we verified that the focus of the observer coincided with the appropriate depth value determined with experiments. This was achieved by measuring the amount of accommodative response of the eye of the observer on the image of the floating hologram using a holographic optical element (HOE). An autorefractor was used, and we confirmed that an image with a sense of depth can be observed from the interaction of the observer's focus and convergence on the 2D floating image using a HOE. Thus, the realization of content with a sense of depth from 2D projected images using a HOE was quantitatively verified in terms of human factors.

Keywords: accommodation; floating hologram; projection hologram; holographic optical element; three-dimensional image

1. Introduction

When a floating hologram is seen by an observer using a reflector, it appears as if a 2D image is floating in the air. This is also called a projection hologram, and it is used not only for concerts and musical performances, but also in augmented reality (AR) systems for fighter aircraft and in head-up display (HUD) systems for automobiles [1]. A floating hologram uses a half-mirror system [2]. However, this system has a limitation in image size, as large floating holograms must have a screen equal to the display size. Thus, it is difficult to implement a sense of depth, because the screen must be projected from a long distance behind the half mirror, which also makes enlargement of the system inevitable [3]. This limitation can be solved with a holographic optical element (HOE), such as a holographic lens, which is developed using holography. A HOE is recorded using two laser beams, i.e., an object beam and a reference beam, which interfere in the volume of the holographic film. During recording, the object beam is shaped by introducing a lens on the beam path. During reconstruction by illumination with the reference beam, the holographic lens reproduces the refractive power of the lens used in the holographic recording.

A system that projects 2D images using a HOE is being studied in companies, institutions, and schools. However, such systems are limited in terms of implementation, and there is currently no work that verifies and evaluates such a system from a human factor perspective. To address this human factor perspective, we attempted to verify the depth of a floating hologram using a HOE. We estimated the change in lens thickness that occurs when a person looks at an object. Through this verification, the content of focal imaging using a HOE was quantitatively evaluated in terms of the accommodative ability of the human eye.

2. Background Theory

The human visual system relies on various factors, such as depth, to perceive the shape of a real object's three-dimensional (3D) effect. Depth perception refers to obtaining a perception of the distance from the front to the back of a 3D solid. The use of the difference in vision between the eyes to perceive the 3D effect of an object is called stereoscopic vision [4].

In general, there are a number of factors influencing the 3D perception of objects in space. One can perceive depth with the image generated by two eyes and with that generated by one eye. When both eyes look at an object, the vision function is automatically adjusted to recognize the distance to the target object. With this recognition of the absolute distance, the functions of accommodation and convergence act, resulting in a natural state in which the same 3D image is obtained.

2.1. Convergence Response

Since the two human eyes are separated laterally by approximately 6 cm, their slightly different views allow for the perception of depth. The main factors affecting depth perception are binocular disparity and vergence [5]. Eye movement changes according to the binocular function based on the focus adjustment and the absolute distance to the actual object; moreover, the relative distance can also be recognized. Figure 1a shows that the vergence angle of both eyes increases when viewing a nearby object. Figure 1b indicates that there is a decrease in the convergence angle when viewing a distant object [6,7].
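The increase in vergence angle with proximity can be illustrated with simple triangulation. The following sketch is our own illustration (not part of the original study); it assumes symmetric fixation and the approximately 6 cm interocular separation mentioned above, and the function name and example distances are purely illustrative.

```python
import math

def vergence_angle_deg(distance_m: float, ipd_m: float = 0.06) -> float:
    """Vergence angle (degrees) for symmetric fixation on a point at
    distance_m, given an interocular separation ipd_m (assumed ~6 cm)."""
    # Each eye rotates by atan((ipd / 2) / distance); the angle between
    # the two lines of sight is twice that.
    return 2.0 * math.degrees(math.atan((ipd_m / 2.0) / distance_m))

# As in Figure 1: a nearby object demands a larger vergence angle than a distant one.
print(f"{vergence_angle_deg(0.5):.2f} deg at 0.5 m")   # ~6.87 deg
print(f"{vergence_angle_deg(3.0):.2f} deg at 3.0 m")   # ~1.15 deg
```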


Figure 1. Convergence angle of the human eyes: (a) wide; (b) narrow.

2.2. Accommodation Response

When a person looks at an object, the eyes focus on a specific point and create a clear image, and objects before and after the target object are perceived as blurred such that the relative position from the object can be known [8]. That is, the focus of the lens is automatically adjusted to create a clear image, and this change in focus occurs due to one eye, not two eyes. This is called focusing. As shown in Figure 2, at close range, the ciliary muscle contracts, the ciliary body relaxes, and the lens becomes thick. When viewed from farther away, the ciliary muscle relaxes, the ciliary body contracts, and the lens becomes thinner [9].

Figure 2. Changing thickness of the lens of the human eye: (a) looking at a far point; (b) looking at a near point.

The accommodation and convergence responses are closely related: the accommodative reaction causes the convergence reaction, and the convergence reaction causes the accommodative reaction [10]. All objects that are in focus naturally exist in the spatial domain and are expressed accordingly [11]. However, stereoscopic 3D technologies do not preserve this close coupling between accommodation and convergence. As shown in Figure 3, the image is clearly visible, but its distance is not; the viewer, who would naturally adjust focus according to the depth of a real object, must instead keep the focus at a fixed distance. Eventually, in a stereoscopic 3D image, the image appears blurred on the viewer's retina. As a result, the human visual system operates to refocus on the image that is not clear because of the focus control function; thus, the motion to automatically focus on the object of interest and the motion to refocus on a clear image alternate continuously. This causes repetitive motion, resulting in eye strain [12–15].

Figure 3. Accommodation and convergence distances: (a) agreement; (b) disagreement.

3. Materials and Methods

3.1. Design of a Floating Hologram System Using a HOE

Figure 4 shows the recording stage of a holographic lens using reflection-mode geometry and its reconstruction stage. At the recording stage in Figure 4a, the HOE was developed by exposing the holographic film with two beams incident on the film from opposite sides. The developed HOE is an optical device that reproduces the signal light beam by illuminating it with the reference light beam, as shown in Figure 4b. The HOE has the transmission characteristics required in augmented reality (AR) and the characteristics of an optical element for image expression [16–21].

Figure 4. Holographic optical element (HOE) recording and reconstruction: (a) recording; (b) reconstruction.

Figure 5 shows the design of a 2D projection system using the HOE. The HOE can replace the existing aspherical concave mirror, providing a relatively free optical path design. Because it is transparently arranged to project the external environment, the observer can see a reality in which the external environment and the AR image are mixed [22].

Figure 5. Floating hologram system using a HOE.

3.2. Materials to Measure the Accommodative Response

In this study, the accommodative response in the binocular state was measured using a Shin-Nippon N-vision K5001 autorefractor [23]. An autorefractor is a computer-controlled measuring machine commonly used during eye examinations to objectively measure a patient's refractive error. With the wide-view window of the N-vision K5001, the subject can view the target naturally with both eyes, thereby easing measurement collection. Figure 6 shows such a system. By combining an open-type autorefractor with the 2D projection display system using a HOE, an environment was created in which the observer views the image naturally and in which the changes in the observer's eye can be measured while the content is observed.

Figure 6. Measurement system using autorefractor for floating hologram system with a HOE.


3.3. Experiment Design

Two experimental configurations were used to test this hypothesis. The first configuration verified the depth of the content located according to the actual optical design. To check whether the augmented content was reproduced with the appropriate depth and size, a marker was set up at the depth where the augmented 2D projected image was reproduced, and a camera was focused on the same plane. In the second setup, the N-vision K5001 autorefractor, as shown in Figure 7a, was placed to enable the viewer to see the image at a distance of 0.67 D (diopters), which was the distance at which the actual content was augmented. After projecting an image at a location that satisfied the initial conditions for recording the HOE, the accommodative response of the observer's eye was measured.
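For reference, the stimulus values used here follow directly from the definition of the diopter as the reciprocal of the viewing distance in meters. The short sketch below is ours (not part of the original protocol) and simply checks the correspondence between the 1.00 D and 0.67 D stimuli and the 100 cm and 150 cm target distances.

```python
def stimulus_diopters(distance_m: float) -> float:
    """Accommodative stimulus in diopters for a target at distance_m (meters)."""
    return 1.0 / distance_m

def viewing_distance_m(stimulus_d: float) -> float:
    """Viewing distance in meters corresponding to a stimulus in diopters."""
    return 1.0 / stimulus_d

print(stimulus_diopters(1.0))      # 1.00 D  -> real target at 100 cm
print(stimulus_diopters(1.5))      # ~0.67 D -> virtual image plane at 150 cm
print(viewing_distance_m(0.67))    # ~1.49 m
```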


Figure 7. Main device for measuring accommodative response: (a) autorefractor, and (b) HOE screen.

The HOE was manufactured as a screen that functioned as a convex lens with a focal length of 500 mm using the reflective hologram recording method, as shown in Figure 7b. The material used was U08C silver halide holographic film. The hologram was recorded using monochromatic light with a wavelength of 532 nm [24–26]. The parameters of the floating hologram system using the HOE screen are listed in Table 1.

Table 1. Specifications of the floating hologram projection system using the HOE.

Index                                      Value
Recording material                         U08C
Focal length of HOE                        500 mm
Size of HOE screen                         100 mm × 125 mm
Wavelength                                 532 nm
Resolution                                 Over 1000 lines/mm
Diffraction efficiency                     48.50%
Distance: Virtual image–HOE screen         3000 mm
Distance: Diffuser–HOE screen              430 mm
Distance: HOE screen–eye box               500 mm
Distance: Projector–diffuser               300 mm

Figure 8 shows the entire system for accommodative response measurement using the HOE. At the time of measurement, the comparison target group was measured using an actual target, and a virtual image was reproduced using the HOE screen. Shin-Nippon's N-vision K5001 autorefractor, currently used in ophthalmology to measure the thickness change of the lens in real time, was used as the measuring instrument. Accordingly, the amount of accommodative response of the eye was measured by comparing the actual target and the virtual image reproduced using the HOE screen.

Figure 8. Real system model for accommodative response measurement.
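As a rough consistency check on the geometry in Table 1 (an idealized calculation of our own, not the authors' design procedure), the HOE screen can be treated as an ideal focusing element of 500 mm focal length with the diffuser placed 430 mm away. Because the object lies inside the focal length, a magnified virtual image forms on the same side as the object.

```python
def image_distance_mm(f_mm: float, object_mm: float) -> float:
    """Image distance from an ideal thin lens/mirror of focal length f_mm for an
    object at object_mm, using 1/v = 1/f - 1/u. A negative value indicates a
    virtual image on the same side as the object."""
    return 1.0 / (1.0 / f_mm - 1.0 / object_mm)

v = image_distance_mm(f_mm=500.0, object_mm=430.0)
print(round(v))  # ~-3071 mm: a virtual image roughly 3 m behind the HOE screen,
                 # of the same order as the 3000 mm "virtual image-HOE screen" entry in Table 1.
```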

4. Results and Discussion

4.1. Measuring the Depth of a Floating 2D Image

We needed to confirm that the virtual image was floating at 150 cm, as designed in the initial optical layout. The marker was placed at 150 cm, and the camera was used to focus on the marker. We verified that the virtual image was clearly visible at 150 cm. In Figure 9a, the camera was focused on the HOE plane. In this case, both the marker 100 cm away and the floated virtual image 150 cm away were out of focus. In Figure 9b, the camera was focused on the marker, and Figure 9c shows the result of the camera focusing on the virtual image display. Through this process, verification that the virtual image was floating at 150 cm was acquired by focusing the camera.

Figure 9. Focal plane distance measurement: (a) HOE plane (500 mm); (b) target plane (1000 mm); and (c) virtual image plane (1500 mm).

4.2. Measurement Results of the Accommodative Responses

The difference in mean values between the measurement conditions was analyzed using a paired t-test. SPSS Ver. 18.0 for Windows was used for data analysis. The paired t-test is a statistical method that compares two variables measured within the same group to determine whether there is a difference in means. A large t-value means that there is a high possibility of a difference in means. In addition, if the p-value is less than 0.05, it can be interpreted that the difference between the means of the two groups is significant. If the p-value is greater than 0.05, it can be interpreted that the difference between the means of the two groups is not significant. Parametric statistics, which analyze the population probability distribution assuming a normal distribution, were used to compare the amount of accommodative response in each environment [27]. In general, parametric statistics are used for data that are continuous variables and have a large number of samples; a sample of at least 30 people is therefore required. The accommodative response of 30 participants was measured in this experiment [28,29].
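The decision rule described above can be reproduced with a standard paired t-test routine. The sketch below uses synthetic diopter readings as placeholders (illustrative values, not the raw data of this study).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical accommodative responses (in diopters) of 30 participants,
# once toward the real target and once toward the HOE virtual image.
target_response = rng.normal(loc=1.0, scale=0.3, size=30)
virtual_response = rng.normal(loc=0.7, scale=0.3, size=30)

t_stat, p_value = stats.ttest_rel(target_response, virtual_response)
mean_difference = float(np.mean(virtual_response - target_response))

print(f"MD = {mean_difference:.2f} D, t = {t_stat:.3f}, p = {p_value:.4f}")

# Same interpretation as in the text: p < 0.05 -> the mean responses differ.
if p_value < 0.05:
    print("Significant difference between the two conditions")
else:
    print("No significant difference between the two conditions")
```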

4.2.1. Comparison Results of the Accommodative Response between the Target Position of 1.00 D and the Virtual Image Plane

The N-vision K5001 was used to measure and compare the accommodative response to the 1.00 D (100 cm) real stimulus and to the holographic stimulus (Table 2). When the accommodative responses to the 1.00 D stimulus and to the stimulus at the virtual image position were compared, the response to the former was higher. The significance probability was less than 0.05, which implied that the difference in the means of the two measured diopter values was significant. Therefore, the difference between the two values was statistically significant, and the two results of the first experiment were statistically different.

Table 2. Comparison of the accommodative response between the 1.00 D target and the virtual image.

Target (Mean ± SD)    Virtual Image (Mean ± SD)    MD       t         p-Value
(1.03 ± 1.01)         (0.67 ± 1.02)                −0.36    −8.335    p < 0.001

Unit: D; SD: standard deviation; MD: mean difference.

4.2.2. Comparison Results of the Accommodative Response between the Target Position of 0.67 D and the Virtual Image Plane

The N-vision K5001 was used to measure and compare the accommodative response to the 0.67 D (150 cm) real stimulus and to the holographic stimulus (Table 3). The two responses were found to have similar diopter values when the accommodative response to the 0.67 D stimulus was compared to that for the stimulus at the virtual image position. The significance probability was higher than 0.05, which meant that the difference between the means of the two measured diopter values was not significant. Therefore, the difference between the two values was not statistically significant, and the two results of the second experiment were statistically the same.

Table 3. Comparison of the accommodative response between the 0.67 D target and the virtual image.

Target (Mean ± SD)    Virtual Image (Mean ± SD)    MD       t        p-Value
(0.69 ± 1.03)         (0.67 ± 1.37)                −0.02    1.225    0.311

Unit: D; SD: standard deviation; MD: mean difference.
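The mean differences reported in Tables 2 and 3 are close to what the stimulus geometry alone predicts: the accommodative demand of a real target at 100 cm and that of the virtual image plane at 150 cm differ by about 0.33 D, whereas a target at 150 cm should show essentially no difference. The short check below only summarizes this arithmetic over the reported values; it is our own comparison, not an additional measurement.

```python
def stimulus_diopters(distance_cm: float) -> float:
    """Accommodative stimulus in diopters for a target at distance_cm."""
    return 100.0 / distance_cm

virtual_plane = stimulus_diopters(150)          # ~0.67 D virtual image plane

# Experiment 1: real target at 100 cm vs. virtual image plane at 150 cm.
print(virtual_plane - stimulus_diopters(100))   # ~-0.33 D expected; Table 2 reports MD = -0.36 D (significant)

# Experiment 2: real target at 150 cm vs. virtual image plane at 150 cm.
print(virtual_plane - stimulus_diopters(150))   # 0.00 D expected; Table 3 reports MD = -0.02 D (not significant)
```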

5. Conclusions

This study proposed a method to measure the depth of the reconstructed image of a floating hologram system using a HOE by measuring the accommodative response. Figures 10 and 11 show the results of observing the accommodative response to the real target and to the virtual image plane at 100 cm and 150 cm, respectively. As seen in Figure 10, the results of the experiment showed that a conflict between accommodation and convergence, and a mismatch of the accommodative response, occurred at a distance of 1.00 D, i.e., for a target located 50 cm closer than the reconstructed image in the virtual image plane. Figure 11 shows the result of observing the accommodative response after placing the actual target at a distance of 0.67 D, which was the same distance as the virtual image plane. The results of this experiment confirmed that the participants focused on the virtual image at a distance of 0.67 D (150 cm), which was the location determined by the initial optical design using the HOE. This confirmed that the virtual image depth identified by the camera was the same as the virtual image depth measured through the accommodative response of the eye. The experimental results confirmed that, when the augmented virtual image of the floating hologram produced through the HOE screen was observed by the human eye, the accommodative response due to the adjustment of the thickness of the lens was consistent with the image restoration position, i.e., the optical design value. This indicated that the floating hologram system fabricated using the HOE screen can provide an observer with a natural and deep 3D image.

Figure 10. Comparison of accommodative response of target (1.00 D) and virtual image planes.

Figure 11. Comparison of accommodative response of target (0.67 D) and virtual image planes.

Author Contributions: All the authors have made substantial contributions regarding the conception and design of the work, and the acquisition, analysis and interpretation of the data. Specifically, conceptualization, L.H. and S.L.; methodology, S.H.; formal analysis, P.G.; validation, J.L. and L.H.; investigation, S.K. All authors have read and agreed to the published version of the manuscript.

Funding: This research was supported by the Ministry of Science and ICT (MSIT), Korea, under the Information Technology Research Center (ITRC) support program (IITP-2020-0-01846) supervised by the Institute of Information & Communications Technology Planning & Evaluation (IITP).

Conflicts of Interest: The authors declare that there are no conflicts of interest.

References

1. Thomas, W.; Frey, H.; Jean, P. Virtual HUD using an HMD. Helmet- and Head-Mounted Displays VI. Proc. SPIE Int. Soc. Opt. Eng. 2001, 4361, 251–262.
2. Choi, P.H.; Choi, Y.H.; Park, M.S.; Lee, S.H. Non-glasses Stereoscopic 3D Floating Hologram System using Polarization Technique. Inst. Internet Broadcasting Commun. 2019, 8, 18–23.

3. Toshiaki, Y.; Nahomi, M.; Kazuhisa, Y. Holographic Pyramid Using Integral Photography. In Proceedings of the 2nd World Congress on Electrical Engineering and Computer Systems and Science, Budapest, Hungary, 16–17 August 2016; MHCI 109; pp. 16–17.
4. Reichelt, S.; Haussler, R.; Fütterer, G.; Leister, N. Depth cues in human visual perception and their realization in 3D displays. SPIE Proc. 2010, 7690, 111–112.
5. De Silva, V.; Fernando, A.; Worrall, S.; Arachchi, H.K.; Kondoz, A. Sensitivity analysis of the human visual system for depth cues in stereoscopic 3-D displays. IEEE Trans. Multimed. 2011, 13, 498–506. [CrossRef]
6. Owens, D.A.; Mohindra, I.; Held, R. The Effectiveness of a Retinoscope Beam as an Accommodative Stimulus. Investig. Ophthalmol. Vis. Sci. 1980, 19, 942–949.
7. Avudainayagam, K.V.; Avudainayagam, C.S. Holographic multivergence target for subjective measurement of the astigmatic error of the human eye. Opt. Lett. 2007, 32, 1926–1928. [CrossRef] [PubMed]
8. Ciuffreda, K.J.; Kruger, P.B. Dynamics of human voluntary accommodation. Am. J. Optom. Physiol. Opt. 1988, 65, 365–370. [CrossRef] [PubMed]
9. Cacho, P.; Garcia, A.; Lara, F.; Segui, M.M. Binocular accommodative facility testing reliability. Optom. Vis. 1992, 4, 314–319.
10. Bharadwaj, S.R.; Candy, T.R. Accommodative and vergence responses to conflicting blur and disparity stimuli during development. J. Vis. 2009, 9, 1–18. [CrossRef]
11. Iribarren, R.; Fornaciari, A.; Hung, G.K. Effect of cumulative near work on accommodative facility and asthenopia. Int. Ophthalmol. 2001, 24, 205–212. [CrossRef]
12. Rosenfield, M.; Ciufreda, K.J. Effect of surround propinquity on the open-loop accommodative response. Investig. Ophthalmol. Vis. Sci. 1991, 32, 142–147.
13. Rosenfield, M.; Ciufreda, K.J.; Hung, G.K.; Gilmartin, B. Tonic accommodation: A review. I. Basic aspects. Ophthalmic Physiol. Opt. 1993, 13, 266–284. [CrossRef] [PubMed]
14. McBrien, N.A.; Millodot, M. The relationship between tonic accommodation and refractive error. Investig. Ophthalmol. Vis. Sci. 1987, 28, 997–1004.
15. Krumholz, D.M.; Fox, R.S.; Ciuffreda, K.J. Short-term changes in tonic accommodation. Investig. Ophthalmol. Vis. Sci. 1986, 27, 552–557.
16. Nadezhda, V.; Pavel, S. Application of Photopolymer Materials in Holographic Technologies. Polymers 2019, 11, 2020.
17. Zhang, H.; Deng, H.; He, M.; Li, D.; Wang, Q. Dual-View 3D Display Based on Multiplexed Lens-Array Holographic Optical Element. Appl. Sci. 2019, 9, 3852. [CrossRef]
18. Maria, A.F.; Valerio, S.; Giuseppe, C. Volume Holographic Optical Elements as Solar Concentrators: An Overview. Appl. Sci. 2019, 9, 193.
19. Liu, J.P.; Tahara, T.; Hayasaki, Y.; Poon, T.C. Incoherent Digital Holography: A Review. Appl. Sci. 2018, 8, 143. [CrossRef]
20. Reid, V.; Adam, R.; Elvis, C.S.C.; Terry, M.P. Hologram stability evaluation for Microsoft HoloLens. Int. Soc. Opt. Photonics 2017, 10136, 1013614.
21. Oh, J.Y.; Park, J.H.; Park, J.M. Virtual Object Manipulation by Combining Touch and Head Interactions for Mobile Augmented Reality. Appl. Sci. 2019, 9, 2933. [CrossRef]
22. Takeda, T.; Hashimoto, K.; Hiruma, N.; Fukui, Y. Characteristics of accommodation toward apparent depth. Vis. Res. 1999, 39, 2087–2097. [CrossRef]
23. Davies, L.N.; Mallen, E.A.; Wolffsohn, J.S.; Gilmartin, B. Clinical evaluation of the Shin-Nippon NVision-K5001/Grand Seiko WR-5100K autorefractor. Optom. Vis. Sci. 2003, 80, 320–324. [CrossRef] [PubMed]
24. Gentet, P.; Gentet, Y.; Lee, S.H. Ultimate 04: The new reference for ultra-realistic color holography. In Proceedings of the Emerging Trends & Innovation in ICT (ICEI), 2017 International Conference on IEEE, Yashada, Pune, 3–5 February 2017; pp. 162–166.
25. Gentet, P.; Gentet, Y.; Lee, S.H. New LED's wavelengths improve drastically the quality of illumination of pulsed digital holograms. In Proceedings of the Digital Holography and Three-Dimensional Imaging, JeJu Island, Korea, 29 May–1 June 2017; p. 209.
26. Lee, J.H.; Hafeez, J.; Kim, K.J.; Lee, S.H.; Kwon, S.C. A Novel Real-Time Match-Moving Method with HoloLens. Appl. Sci. 2019, 9, 2889. [CrossRef]
27. Moses, L.E. Non-parametric statistics for psychological research. Psychol. Bull. 1952, 49, 122–143. [CrossRef] [PubMed]
28. Andrew, J.K. Parametric versus non-parametric statistics in the analysis of randomized trials with non-normally distributed data. BMC Med. Res. Methodol. 2005, 5, 35.
29. Davison, M.L.; Sharma, A.R. Parametric statistics and levels of measurement. Psychol. Bull. 1988, 104, 137–144. [CrossRef]

© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).