Geometric Considerations for Stereoscopic Virtual Environments

Larry F. Hodges
College of Computing-0280
Graphics, Visualization & Usability Center
Georgia Institute of Technology
Atlanta, Georgia 30332

Elizabeth Thorpe Davis
School of Psychology-0170
Graphics, Visualization & Usability Center
Georgia Institute of Technology
Atlanta, Georgia 30332

Abstract

We examine the relationship among the different geometries implicit in a stereoscopic virtual environment. In particular, we examine in detail the relationship of retinal disparity, fixation point, binocular visual direction, and screen parallax. We introduce the concept of a volumetric spatial unit called a stereoscopic voxel. Due to the shape of stereoscopic voxels, apparent depth of points in space may be affected by their horizontal placement.

I Introduction

The display component of the most common implementations of virtual environments provides the user with a visual image that incorporates stereopsis as a visual cue. Examples include head-mounted displays (Teitel, 1990), time-multiplexed CRT-based displays (Deering, 1992), and time-multiplexed projection systems (Cruz-Neira, Sandin, DeFanti, Kenyon, & Hart, 1992). It is not always recognized, however, that the characteristics of a stereoscopic image can be very different from those of a monoscopic perspective image. The visual impression given by stereoscopic images is very sensitive to the geometry of the visual system of a user, the geometry of the display environment, and the modeling geometry assumed in the computation of the scene. In this paper we present a tutorial whose purpose is to review these basic geometries and to analyze their relationship to each other in a stereoscopic virtual environment. In Section 2 we review the geometry of binocular vision and retinal disparity. In Section 3 we discuss modeling geometry and its relationship to retinal disparity. Section 4 describes the effects of the discrete nature of display geometry and the distortions caused by optical and tracking artifacts. In Section 5 we revisit our approximate model of the visual system geometry and discuss its limitations.

2 Visual System Geometry

Stereopsis results from the two slightly different views of the external world that our laterally displaced eyes receive (e.g., Schor, 1987; Tyler, 1983). This difference is quantified in terms of retinal disparity. If both eyes are fixated on a point, f1, in space, then an image of f1 is focused at corresponding points in the center of the fovea¹ of each eye. Another point, f2, at a different spatial

Presence, Volume 2, Number 1, Winter 1993
© 1993 The Massachusetts Institute of Technology

1. The fovea is the part of the human retina that possesses the best spatial resolution or visual acuity.


location, would be imaged at points in each eye that may not be the same distance from the fovea. This difference in distance is the retinal disparity. Normally we measure this distance as the sum of the angles δ1 + δ2 as shown in Figure 1, where angles measured from the center of the fovea toward the outside of each eye are negative. In the example shown in Figure 1, δ1 has a positive value, δ2 has a negative value, and their sum (δ1 + δ2) is positive. The retinal disparity that results from these two different views can provide information about the distance or depth of an object as well as about the shape of an object. Stereoacuity is the smallest depth that can be detected based on retinal disparity. In some humans, under optimal stimulus conditions, stereoacuities of 5″ or less can be obtained (Westheimer & McKee, 1980; McKee, 1983); however, stereoacuities of more than 5″ are more common (Davis, King, & Anoskey, 1992; Schor & Wood, 1983).

Figure 1. Retinal disparity (retinal disparity = δ1 + δ2).

2.1 Binocular Visual Direction

Visual direction is the perceived spatial location of an object relative to the observer. Usually, it is measured in terms of azimuth (left and right of the point of fixation) and of elevation (above and below the point of fixation). Often, the binocular visual direction of an object is not the same as the monocular visual direction of either eye. (You can verify this yourself by looking at a very close object first with one eye, then with the other eye, then with both eyes. Notice how the apparent spatial location of the object changes.) Hering proposed that binocular visual direction will lie midway between the directions of the monocular images; others have reported that the binocular visual direction will lie somewhere between the left and right monocular visual directions, but not necessarily midway (e.g., Tyler, 1983).

2.2 Convergence Angles and Retinal Disparities

For symmetric convergence of the two eyes on a fixated point in space, f1, the angle of convergence is defined as

α = 2 arctan(i/2D1)

where α is the angle of convergence, D1 is the distance from the midpoint of the interocular axis to the fixated point, f1, and i is the interocular distance (Arditi, 1986; Graham, 1965).² (See Fig. 2.) Notice that the angle of convergence, α, is inversely related to the distance, D1, of the fixated point from the observer; this inverse relation is nonlinear.

For another point in space, f2, located at a distance D2, the angle of convergence is β (see Fig. 2). Note that since

α + a + b + c + d = 180 and β + c + d = 180

then

α − β = (−a) + (−b) = (δ1 + δ2)

The difference in vergence angles (α − β) is equivalent to the retinal disparity between the two points in space, measured in units of visual angle. This retinal disparity is monotonically related to the depth between the

2. For asymmetric convergence of the two eyes, the formula for the angle of convergence is basically the same as that shown for symmetric convergence. The difference is that D1 now represents the perpendicular distance from the interocular axis to the frontoparallel plane that intersects the asymmetrically converged point of fixation. This interpretation of the convergence angle formula for asymmetric convergence is not exact, but it is a good approximation.

Figure 2. Convergence angles.

Figure 3. Vieth-Mueller circle.

two points in space (i.e., at a constant distance, D1, a larger depth corresponds to a larger retinal disparity); this monotonic relationship is a nonlinear one.

If an object is closer than the fixation point, the retinal disparity will be a negative value. This is known as a crossed disparity because the two eyes must cross to fixate the closer object. Conversely, if an object is farther than the fixation point, the retinal disparity will be a positive value. This is known as uncrossed disparity because the two eyes must uncross to fixate the farther object. An object located at the fixation point or whose image falls on corresponding points in the two retinae has a zero disparity.

2.3 Horopters

Corresponding points on the two retinae are defined as being the same vertical and horizontal distance from the center of the fovea in each eye (e.g., Tyler, 1983; Arditi, 1986; Davis & Hodges, 1994). When the two eyes binocularly fixate on a given point in space, there is a locus of points in space that fall on corresponding points in the two retinae. This locus of points is the horopter, a term originally used by Aguilonius in 1613. The horopter can be defined either theoretically or empirically.

The Vieth-Mueller circle is a theoretical horopter defined only in terms of geometric considerations. This horopter is a circle in the horizontal plane that intersects each eye at the first nodal point of the eye's lens system (Gulick & Lawson, 1974; Ogle, 1968). (See Fig. 3.) This circle defines a locus of points with zero disparity. However, in devising this theoretical horopter it is assumed that the eyes are perfect spheres with perfectly spherical optics and that the eyes rotate about axes that pass only through their first optical nodal points (e.g., Arditi, 1986; Bishop, 1985; Gulick & Lawson, 1974). None of these assumptions is strictly true. Thus, when one compares the Vieth-Mueller circle to any empirically determined horopter there is a discrepancy between the theoretical and empirical horopters. With few exceptions (Deering, 1992) the eye geometry used for image calculations for stereoscopic virtual environments assumes a visual model consistent with the Vieth-Mueller circle. We will address the results of this assumption in Section 5.

Horopters usually describe a locus of points that should result in zero disparity. Stereopsis, however, occurs when there is a nonzero disparity that gives rise to the percept of depth. That is, an object or visual stimulus can appear closer or farther than the horopter for crossed and uncrossed disparity, respectively.
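The convergence-angle relation and the crossed/uncrossed sign convention above can be checked numerically. Below is a minimal Python sketch under the paper's symmetric-convergence assumption; the function names are ours, not the paper's:

```python
import math

def convergence_angle(i, D):
    """Convergence angle (radians) for symmetric fixation on a point at
    distance D from the midpoint of the interocular axis (Section 2.2):
    alpha = 2 * arctan(i / (2 * D)), with interocular distance i."""
    return 2.0 * math.atan(i / (2.0 * D))

def classify(i, fixation_dist, object_dist):
    """Disparity of an object relative to the fixation point, taken as the
    difference in vergence angles (alpha - beta): negative -> crossed
    (object closer than fixation), positive -> uncrossed (object farther)."""
    disparity = convergence_angle(i, fixation_dist) - convergence_angle(i, object_dist)
    if disparity < 0:
        return "crossed"
    return "uncrossed" if disparity > 0 else "zero"

i = 6.4  # interocular distance in cm

# Convergence is inversely (and nonlinearly) related to fixation distance.
alpha = convergence_angle(i, 30.0)    # roughly 12.2 degrees
beta = convergence_angle(i, 100.0)    # roughly 3.7 degrees
print(math.degrees(alpha), math.degrees(beta))

print(classify(i, 50.0, 30.0))    # closer than fixation -> "crossed"
print(classify(i, 50.0, 100.0))   # farther than fixation -> "uncrossed"
print(classify(i, 50.0, 50.0))    # at fixation -> "zero"
```

Note the strongly nonlinear falloff: moving the fixation point from 30 cm to 100 cm cuts the convergence angle by more than a factor of three.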

Figure 4. Screen parallax. (The figure shows the display screen, the left and right eye positions, and an object with negative parallax.)

3 Modeling Geometry

For any type of stereoscopic display, we are modeling what would be seen by each eye if the image were projected onto a screen or window being viewed by an observer (Hodges, 1992). The simplest case, which is equivalent to the geometry of most head-mounted displays, occurs when the observer's eyes lie in a plane parallel to a single planar projection screen on which both the left- and right-eye projected views of an image are displayed (Fig. 4). If we choose a point, P, on the object and follow its projection vectors to each eye position, then the screen parallax is the distance between the projected location of P on the screen, P_left, seen by the left eye and the projected location, P_right, seen by the right eye. Crossed or negative parallax (i.e., the left-eye view of P is to the right of the right-eye view of P) results in an image that appears spatially in front of the display screen. Uncrossed or positive parallax results in an image that appears behind the screen. Zero or no parallax results in a spatial position at the plane of the screen.

The amount of screen parallax in this case may be computed with respect to the geometric model of the scene as

p = i(D − d)/D

where p is the amount of screen parallax for a point, f1, when projected onto a plane a distance d from the plane containing the two eyepoints. The modeled distance between eyepoints is i and the distance from f1 to the nearest point on the plane containing the two eyepoints is D (Fig. 5).

Figure 5. Screen parallax, p, is equal to i(D − d)/D.

Figure 6. Screen parallax and convergence angles.

Screen parallax does not correspond directly to retinal disparity. Retinal disparity is measured on the two retinae and its value is relative to the current binocular fixation point and convergence angle. Screen parallax is measured on the surface of a display device and its value is dependent on the computational model of the observer (i.e., distance of each eye position from the projection plane and the orientation of the plane relative to the plane containing the two eye positions) and the display geometry (how values are mapped to a display screen).

Points with constant screen parallax describe a straight line that is parallel to a horizontal plane through the two eye points. This assumes that the eyepoints do not change positions when the eyes converge and fixate on a different point. In reality the location of the first nodal point of the eye's lens system will vary slightly with convergence and possibly also with accommodation. We will address this issue in Section 5.

For now we assume a model in which perfectly spherical eyes rotate about the eyepoint so that the eyepoint is stationary with respect to the head position. For these assumptions the resulting screen parallax that is computed for each point in the scene will produce proper retinal disparity for any fixation point and resultant convergence angles. This is illustrated in Figure 6. Points f2 and f3 both have the same screen parallax as point f1. These equal screen parallaxes induce different convergence angles α, β, and γ for points f1, f2, and f3, respectively. If the observer is fixated on f1, then the resultant retinal disparities for f2 and f3 are (α − β) and (α − γ), respectively.

Screen parallax is an absolute measure of a point's distance from the plane of an observer's eyepoints. To calculate retinal disparity from screen parallax, however, we must look at values that are proportional to differences in screen parallax. For symmetric convergence, this calculation is straightforward for points that lie on a line along a given binocular visual direction. If an observer's eyes are fixated on a point, f1, in a stereoscopic virtual environment, then there is a screen parallax p_α associated with that point and an angle of convergence, α. Another point, f2, with convergence angle, β, would also have an associated screen parallax, p_β. Remember that retinal disparity is the difference in vergence angles (α − β) between two points in space, measured in units of visual angle. For points that lie on a line along the binocular visual direction, the retinal disparity may be closely approximated directly from the screen parallax by the formula

2{tan⁻¹[p_α/2(D_α − d)] − tan⁻¹[p_β/2(D_β − d)]}

where D_α and D_β are the distances of f1 and f2 from the plane containing the eyepoints.

4 Display Geometry

4.1 Stereoscopic Voxels

A stereoscopic display system provides a partitioning of three-dimensional space into volumetric spatial units that we shall refer to as stereoscopic voxels. Each stereoscopic voxel is defined by the intersection of the lines of sight from each eye through two distinct pixels. The size and shape of stereoscopic voxels are determined by the position of each eye as well as the pixel pitch and shape. If we assume an idealized rectangular pixel shape,³ then stereoscopic voxels can be modeled as six-sided polyhedra with diamond-shaped horizontal cross

3. When displayed on a CRT, pixels actually have a Gaussian shape and adjacent pixels may blend together. LCD pixels have well-defined edges, but most head-mounted displays incorporate an optical low-pass filter to reduce the perception of this effect. HMDs also incorporate optics that distort the size and location of pixels.

Figure 7. Cross section of stereoscopic voxels.

section. A horizontal slice (created by one row of pixels) is shown in Figure 7. The distance between the eye points, measured in number of pixels, specifies the total number of discrete depth positions at which a point may be placed. In Figure 7, the number in each diamond cell indicates the number of pixels of parallax that defines that row of stereoscopic voxels. All stereoscopic voxels in a particular row have the same volume. They differ in shape in that each is a shear of the center (between the eyepoints) stereoscopic voxel.

A point's apparent position may be defined as the center of the cross-sectional area in which it resides.⁴ The diamond cross-sectional shape of stereoscopic voxels means that a point's horizontal placement can affect its apparent depth location. In Figure 7, point A would appear to be closer to an observer than point B. Similarly, the distinct steps in depth at which a point may be placed are affected by its horizontal placement. For example, the position of the triangular points in Figure 7 puts them in only three different depth positions (1, 3, and 5). Yet the square points, which exist in the same range of depth, are positioned in five depth positions (1, 2, 3, 4, and 5).

Conversely, if we allow some jitter in a point's depth location, then we can provide an effective doubling of horizontal spatial resolution for point positioning in a stereoscopic display as compared to a monoscopic display. For example, if we allow points to be positioned in either row 0 or row 1 of Figure 8, then we have 15 horizontal positions represented across the front of the eight-pixel screen in Figure 8.

We can analytically characterize a stereoscopic voxel in row j by the values d_jnear, d_jfar, d_jmid, and d_jwidth (Fig. 9). The distance of nearest approach of voxel j to the plane of the display screen is d_jnear, the distance from the plane to the most distant point of the voxel is d_jfar, the distance to the widest point of the voxel is d_jmid, and the maximum width of the voxel is d_jwidth.
From similar triangles we can derive the following equations:

d_jnear = d_(j−2)far
d_jfar = (j + 1)pD/[i − (j + 1)p]
d_jmid = d_(j−1)far
d_jwidth = (j + 1)p(d_jfar − d_jmid)/d_jfar

4. Alternately, we could think of the volume of a stereoscopic pixel as a measure of the uncertainty in the position of a point or as a type of spatial aliasing.

Table 1. Width and Most Distant Point from Projection Surface of Voxels in Rows 0 through 11ᵃ

Row    d_far (cm)    d_width (cm)
 0        3.39          0.50
 1        7.41          0.54
 2       12.24          0.59
 3       18.18          0.65
 4       25.64          0.73
 5       35.29          0.82
 6       48.28          0.94
 7       66.67          1.10
 8       94.74          1.33
 9      142.86          1.68
10      244.44          2.29
11      600.00          3.56

a. Assuming interocular separation of 6.4 cm, projection surface at 40 cm, uncrossed parallax, and pixel width of 0.5 cm.

Figure 8. Jittering depth locations to achieve better horizontal resolution.
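The row equations above can be implemented directly as a numerical check. The Python sketch below (symbol and function names are ours) reproduces the d_far and d_width columns of Table 1 and cross-checks d_jfar against the screen-parallax relation p = i(D − d)/D from Section 3:

```python
def screen_parallax(i, eye_to_point, eye_to_screen):
    # Section 3: p = i * (D - d) / D, where D is the eye-to-point distance
    # and d the eye-to-screen distance.
    return i * (eye_to_point - eye_to_screen) / eye_to_point

def d_far(j, i, pitch, D):
    """Distance from the screen to the most distant point of the row-j
    voxel: d_jfar = (j + 1) * pitch * D / (i - (j + 1) * pitch), where i is
    the interocular distance and D the eyeplane-to-screen distance."""
    return (j + 1) * pitch * D / (i - (j + 1) * pitch)

def d_mid(j, i, pitch, D):
    # Widest point of the row-j voxel: d_jmid = d_(j-1)far.
    return d_far(j - 1, i, pitch, D)

def d_near(j, i, pitch, D):
    # Nearest approach of the row-j voxel: d_jnear = d_(j-2)far.
    return d_far(j - 2, i, pitch, D)

def d_width(j, i, pitch, D):
    # Maximum width: d_jwidth = (j + 1) * pitch * (d_jfar - d_jmid) / d_jfar.
    far, mid = d_far(j, i, pitch, D), d_mid(j, i, pitch, D)
    return (j + 1) * pitch * (far - mid) / far

# Table 1 parameters: i = 6.4 cm, screen at D = 40 cm, pixel width 0.5 cm.
i, pitch, D = 6.4, 0.5, 40.0
for j in range(12):
    print(f"row {j:2d}: d_far = {d_far(j, i, pitch, D):7.2f} cm, "
          f"d_width = {d_width(j, i, pitch, D):4.2f} cm")

# Consistency check: the far boundary of row j lies exactly where the
# screen parallax spans (j + 1) pixels.
j = 5
p = screen_parallax(i, D + d_far(j, i, pitch, D), D)
print(p / pitch)   # approximately (j + 1) = 6 pixels of parallax
```

Note how d_far grows hyperbolically with j: the denominator i − (j + 1)p approaches zero as the parallax approaches the interocular distance, which is why the last rows of Table 1 span meters of depth with a single pixel of parallax.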

In these equations p is the pixel pitch, i is the interocular distance, and D is the distance from the eyeplane to the projection plane.

Figure 9. Characterization of a stereoscopic voxel.

Table 1 shows some representative values of these parameters assuming an interocular distance of 6.4 cm, a distance from the eyeplane to the plane of the screen of 40 cm, and a pixel width (through the optics) of 0.5 cm. These parameters are consistent with those of currently available head-mounted displays in the $5000-7000 price range (Robinett & Rolland, 1992).

4.2 Geometric Distortion

Image distortion caused by display technology can be a problem for both time-multiplexed and head-mounted displays. Two possible sources of distortion for time-multiplexed CRT displays have been analyzed by Deering (1992). The first source of distortion is that standard projection geometry assumes the display surface of a CRT is planar. Modern CRT screens, however, are shaped as a section of a sphere or a cylinder with a radius of curvature of about 2 m. The second source of distortion is that the phosphor screen on a CRT is viewed through a thick glass faceplate with an index of refraction significantly higher than that of air. For a viewing angle of 60° relative to a normal vector to the plane of the CRT, the curvature of the CRT screen can cause positional errors for objects at the surface of the display screen on the order of 1.8 cm. For the same viewing angle, the thickness of the glass faceplate can cause positional errors at the surface of the display screen of up to 2.0 cm. These errors are both viewpoint dependent and nonlinear. Equations to compensate for these distortions are described by Deering (1992).

Figure 10. Change in eyepoint separation with change in point of fixation. Centers of rotation of the eyes are assumed to be 6.4 cm apart. (The plot shows eyepoint separation against distance in centimeters from the eye plane, with curves for symmetric convergence and for convergence 20 centimeters to the left.)
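The eyepoint-separation effect plotted in Figure 10 can be sketched numerically. The model below is our own reconstruction under stated assumptions (6.4 cm between the centers of rotation, first nodal point 0.6 cm in front of the center of rotation, symmetric convergence, points on the midline); the exact values depend on the details of the eye model, so treat the numbers as illustrative rather than as the paper's computation:

```python
import math

ROT_SEP = 6.4       # cm between the centers of rotation of the two eyes
NODAL_OFFSET = 0.6  # cm from center of rotation to first nodal point

def nodal_separation(fix_dist):
    """Separation of the first nodal points for symmetric fixation at
    fix_dist cm from the plane of the rotation centers. Each eye rotates
    by theta toward the midline, carrying its nodal point inward."""
    theta = math.atan((ROT_SEP / 2.0) / fix_dist)
    return ROT_SEP - 2.0 * NODAL_OFFSET * math.sin(theta)

def disparity_arcmin(fix_dist, near, far):
    """Approximate retinal disparity (min of arc) between midline points at
    distances `near` and `far`, with the nodal points positioned as if the
    eyes were fixating at fix_dist."""
    theta = math.atan((ROT_SEP / 2.0) / fix_dist)
    x = ROT_SEP / 2.0 - NODAL_OFFSET * math.sin(theta)  # nodal lateral offset
    z = NODAL_OFFSET * math.cos(theta)                  # nodal forward shift
    ang = lambda dist: 2.0 * math.atan(x / (dist - z))  # convergence angle
    return math.degrees(ang(near) - ang(far)) * 60.0

# Separation shrinks as fixation gets closer (cf. Figure 10).
print(nodal_separation(100.0), nodal_separation(30.0))

# Section 5 example: points at 30 cm and 100 cm. The computed disparity
# depends on the current fixation distance; in this reconstruction the two
# values differ by about 7 min of arc (the paper reports approximately 521
# vs. 514 min of arc).
print(disparity_arcmin(30.0, 30.0, 100.0))
print(disparity_arcmin(100.0, 30.0, 100.0))
```

The design point is that the same pair of virtual points produces measurably different retinal disparities depending on where the user is fixating, which is why Section 5 argues that screen parallax would have to be recomputed per fixation for exact fidelity.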

For head-mounted displays there are image distortions introduced by the optics of the system, by improper alignment of the user's eye position with respect to the mathematical model of eye position, and by mismatches between the display's field-of-view and the implicit mathematical model for field-of-view. Of particular interest are the distortions caused by the optical system in the head-mounted display. The purpose of the optical system is to provide an image to the user that is located at a comfortable distance for accommodation and magnified to provide a reasonable field-of-view. The optics also cause nonlinear distortions in the image so that straight lines in the model appear curved in the visual image. Robinett and Rolland (1992) have developed a computational model for the geometry of head-mounted displays to correct for these distortions. Watson and Hodges (1993) have demonstrated a real-time implementation of this model.

4.3 Head Tracking

Accurate head tracking is crucial to the presentation of spatially consistent stereoscopic images. Screen parallax information is computed based on implied eyepoint positions relative to head position and head orientation. If the observer's head is not exactly in the position reported by the head tracker, then scaling and distortion effects in the image occur. If the observer's head is further away from the plane of the display than reported by the head tracker, then the image is elongated in depth. If his head is closer, then the image is compressed in depth. If the observer's head is incorrectly tracked for side-to-side or up-down motion, the image is distorted by a shear transformation.

5 A Dose of Real Reality

From our analysis in Section 3 we observed that screen parallax is dependent on the display geometry and the value chosen for interocular distance in the modeling geometry, but independent of the fixation point and of the visual direction. In that section we assumed that the eyes were perfect spheres with perfectly spherical optics and that each eye rotated about its first nodal point (the eyepoint for perspective projection). This model is implicit in all current implementations of stereoscopic virtual environments.

As indicated by the Hering-Hillenbrand deviation between an empirically determined horopter and the Vieth-Mueller circle, however, these assumptions are only approximations of the true geometry of the eye. For example, the first nodal point does not actually correspond to the center of rotation of the eye (Deering, 1992; Ogle, 1968). As a result, if the eye rotates, then the first nodal point of each eye also shifts position depending on the current visual direction. The shifting creates a change in eyepoint separation. The greatest changes occur for symmetric convergence on points close to the observer. Figure 10 illustrates the range of change for an assumed separation of 6.4 cm between the centers of rotation of the eyes and a distance of 0.6 cm between the first nodal point and the center of rotation for each eye.

From Section 2.2 we know that the angle of convergence, α, for a fixated point, f1, is dependent not only on the distance to the point but also on eyepoint separation. If we correctly model the location of the eyepoint of a user and locate the center of rotation of each of his eyes, then we can compute the eye separation for any fixated point. Stereopsis is dependent on relative convergence angles specific to a particular fixated point. The result is that the screen parallax must be recomputed based on the current fixation point if we want to get retinal disparities in the three-dimensional virtual scene that are identical to retinal disparities created by a real image with the same geometry. For example, consider symmetric convergence on a fixated point 30 cm from the eyeplane, with 6.4 cm interocular distance and 0.6 cm between the center of rotation of each eye and the first nodal point. We can compute the retinal disparity between that point and another point 100 cm along the same gaze direction as approximately 521 min of arc. If we now fixate on the point 100 cm away, the change in eye separation results in a retinal disparity of approximately 514 min of arc between the two points. This difference based on fixation point and gaze direction cannot be modeled without an accurate description of the optical characteristics and physiology of the individual user's eye.

6 Summary

Geometric considerations are critical in the visual display of stereoscopic virtual environments. The geometry of the two eyes, the geometry of the display system, and the geometry implicit in the graphics model all must be carefully coordinated to create an accurate visual representation.

In particular, we have examined in detail the relationship of retinal disparity, fixation point, binocular visual direction, and screen parallax. We also have shown geometrically that a stereoscopic display partitions three-dimensional space into volumetric spatial units (stereoscopic voxels). Due to the shape of stereoscopic voxels, apparent depth of points in space may be affected by their horizontal placement.

Acknowledgments

We would like to acknowledge both the anonymous reviewers and Mr. Ben Watson for many helpful comments on how to improve this paper.

References

Arditi, A. (1986). Binocular vision. In K. R. Boff, L. Kaufman, & J. P. Thomas (Eds.), Handbook of perception and human performance. Vol. 1. Sensory processes and perception. New York: Wiley.

Bishop, P. O. (1985). Binocular vision. In Adler's physiology of the eye. St. Louis: Mosby.

Bryson, S. (1992). Survey of virtual environment technologies and techniques. Siggraph '92 course notes for course #9: Implementation of Immersive Virtual Environments.

Cruz-Neira, C., Sandin, D. J., DeFanti, T. A., Kenyon, R. V., & Hart, J. C. (1992). The CAVE: Audio visual experience automatic virtual environment. Communications of the ACM, 35(6), 64-72.

Davis, E. T., & Hodges, L. F. (1994). Human stereopsis, fusion and stereoscopic virtual environments. In W. Barfield & T. Furness (Eds.), Virtual environments and advanced interface design. New York: Oxford University Press, in press.

Davis, E. T., King, R. A., & Anoskey, A. (1992). Oblique effect in stereopsis? Human Vision, Visual Processing and Digital Display III, Proc. SPIE 1666, 465-475.

Deering, M. (1992). High resolution virtual reality. Computer Graphics, 26(2), 195-202.

Graham, C. H. (1965). Visual space perception. In C. H. Graham (Ed.), Vision and visual perception. New York: Wiley.

Gulick, W. L., & Lawson, R. B. (1974). Human stereopsis: A psychophysical analysis. New York: Oxford.

Hodges, L. F. (1992). Time-multiplexed stereoscopic display. IEEE Computer Graphics & Applications, 12(2), 20-30.

McKee, S. P. (1983). The spatial requirements for fine stereoacuity. Vision Research, 23, 191-198.

Ogle, K. N. (1968). Optics. Springfield, IL: Charles C. Thomas.

Robinett, W., & Rolland, J. (1992). A computational model for stereoscopic optics of a head-mounted display. Presence, 1(1), 45-62.

Schor, C. M. (1987). Spatial factors limiting stereopsis and fusion. Optic News, 13(5), 14-17.

Schor, C. M., & Wood, I. (1983). Disparity range for local stereopsis as a function of luminance spatial frequency. Vision Research, 23, 1649-1654.

Teitel, M. A. (1990). The eyephone: A head-mounted stereo display. SPIE Proc., 1256, 168-171.

Tyler, C. W. (1983). Sensory processing of binocular disparity. In C. M. Schor & K. J. Ciuffreda (Eds.), Vergence eye movements: Basic and clinical aspects. Boston: Butterworth.

Watson, B. A., & Hodges, L. F. (1993). Real-time compensation for optical distortion in head-mounted displays. Technical report GIT-GVU-93-10, Georgia Tech Graphics, Visualization & Usability Center.

Westheimer, G., & McKee, S. P. (1980). Stereogram design for testing local stereopsis. Investigative Ophthalmology and Visual Science, 19, 802-809.