
Review

Electrooculograms for Human–Computer Interaction: A Review

Won-Du Chang

School of Electronic and Biomedical Engineering, Tongmyong University, Busan 48520, Korea; [email protected]; Tel.: +82-51-629-1314

Received: 10 May 2019; Accepted: 12 June 2019; Published: 14 June 2019

Abstract: Eye movements generate electric signals, which a user can employ to control his/her environment and communicate with others. This paper presents a review of previous studies on such electric signals, that is, electrooculograms (EOGs), from the perspective of human–computer interaction (HCI). EOGs represent one of the easiest means of estimating eye movements using a low-cost device, and they have often been considered and utilized for HCI applications, such as typing on a virtual keyboard, moving a mouse, or controlling a wheelchair. The objective of this study is to summarize the experimental procedures of previous studies and provide a guide for researchers interested in this field. In this work, the basic characteristics of EOGs, the associated measurements, and signal processing and pattern recognition algorithms are briefly reviewed, and various applications reported in the existing literature are listed. It is expected that EOGs will be a useful source of communication in virtual reality environments, and that they can act as a valuable communication tool for people with amyotrophic lateral sclerosis.

Keywords: electrooculogram (EOG); bio-signal processing; EOG calibration; eye-tracking

1. Introduction

In the past decades, keyboards, mice, and touchscreens have been the most popular input devices for controlling computers. These traditional devices are effective in most general situations, such as writing documents, searching and navigating the internet, and gaming. However, people may encounter special situations in which such traditional devices are not effective, for instance, when they want to operate computer devices in a hands-free manner. Such scenarios include surgery, as medical doctors often wish to search for information and discuss the procedure with advisors during an operation [1]; augmented reality (AR) applications, as they require fast and intuitive interactions [2]; and some game playing [3,4]. Some people also experience difficulties in using traditional input devices; in particular, people with advanced amyotrophic lateral sclerosis (ALS) are unable to communicate with others except by using their eye movements [5,6]. Numerous studies have focused on novel input approaches for such situations. Hand movements have been detected using optical or infrared cameras to recognize the intention of a user [7,8]. Bioelectric signals, such as electroencephalograms (EEGs), electromyograms (EMGs), and electrooculograms (EOGs), have also been utilized to understand a user's intention [9–11]. Among these approaches, the present paper focuses on utilizing EOGs. An EOG originates from the standing potential between the front and back of the eye, and thus, it represents eye-movements [12,13]. The use of EOGs was one of the first methods to measure eye-movements [14], and it has been expected to be a promising source for communication between humans and computers [11,15,16]. The main benefit of utilizing EOGs as an input source for human–computer interaction (HCI) is that eye-movements can be estimated using low-cost devices.
It is known that a complete EOG recording system can be assembled for less than 100 EUR [17], and eye-movements can be estimated

Sensors 2019, 19, 2690; doi:10.3390/s19122690 www.mdpi.com/journal/sensors

with a precision of up to 1.5° [18]. Considering this background, several researchers have investigated EOG-based HCI systems. Although most of these studies focused on utilizing instant movements in a single direction, recent studies on eye-writing have enabled the estimation and recognition of more complex eye-movements [16,19,20], thereby allowing users to write Arabic numerals, English letters, and Japanese Katakana directly. In this manner, users can communicate even complicated messages in a relatively short time.

Several review papers on eye tracking exist, in which the origin and measurement methods of EOGs are briefly introduced. Young and Sheena reviewed early techniques for measuring eye movements in 1975 [14,18]. They compared the techniques of electrooculography, corneal reflexes, and contact lenses; techniques for tracking the limbus, pupil, and eyelid were also reviewed. From the perspective of EOGs, the early historical aspects and measurement methods were described well. A comprehensive review on eye tracking was conducted in 1995 by Glenstrup and Engell-Nielsen in their bachelors' thesis [21], in which eye tracking techniques, psychological and physiological aspects concerning eye tracking, and relevant applications in the literature were described in detail. However, because their review covered a wide research domain, studies on EOGs were not detailed adequately. Singh and Singh reviewed eye tracking technology, including EOGs, in 2012 [22]. They described a concise history of eye tracking and the types of eye movements, and compared four approaches to eye tracking: scleral search coil, infrared oculography, electro-oculography, and video oculography. The basic characteristics of EOGs—such as the origin of the signal, the types of electrodes, and a typical electrode placement—were also introduced briefly.

The purpose of this study is to summarize the recent studies on EOGs and suggest research directions from the perspective of HCI.
In spite of recent advances on this topic, there have not been many attempts to summarize the progress: previous reviews introduced EOGs only briefly and did not cover recent advances. By summarizing these advances, I would like to provide a small guide for studying this topic, as many different measurement and signal processing methods exist. The aims of this study are as follows: (1) to suggest a generalized procedure for utilizing EOGs, (2) to categorize and discuss the methods of previous studies at each step of the generalized procedure, and (3) to suggest research directions for the topic. This paper describes the characteristics of EOGs, measurement issues, and signal processing techniques. The presented content should be valuable for developing EOG-based applications or initiating research on this topic.

2. Characteristics

An EOG is a signal that changes according to eye-movements; the changes are caused by the standing potential between the cornea and the retina of the eyes [12,23]. The potential increases when the cornea approaches an electrode, and it decreases when the cornea moves in the opposite direction. Figure 1 illustrates this process. When the eyes are directed straight ahead and the gaze is fixed, the electric potential does not change and remains at a certain value. When the eyes move left, toward the sensor, the electric potential increases sharply as the cornea approaches the sensor. This phenomenon also occurs when the eyes are closed [24], and in the case of blind people [25].

An EOG often reflects activities other than eye-movements, and these are commonly named artifacts. Eyelid-related artifacts are strong and occur frequently when closing and opening the eyes. Such artifacts correspond to a dramatic increase in the potential above the eyes. This high potential is the sum of the retinal and corneal potentials, as the eyelid conducts the positive charges of the cornea to the frontopolar region when it comes in contact with the cornea [23]. Hence, although eyelid-related artifacts originate from the same source as the eye-movement signals, they are not significantly related to the eye-movements: the eyes move only slightly (less than 5°) during eye blinks, as there exists a mechanism to suppress saccadic movements during blinks [23,26].

Figure 1. Illustrations of eye-movements and changes in electric potential around the eyes. S denotes the location at which a sensor is placed to record the EOG signal, and R is the location of a reference electrode.

EMGs represent another artifact in EOGs, and they are observed when a person moves his/her facial muscles or body during EOG measurement. EMGs originate from activities such as jaw clenching, raising of the eyebrows, smiling, or walking [27–29]. The shapes of these artifacts differ according to the type of movement, electrode position, and sampling rate [11,27,28]; however, generally, high-frequency noise and plateau-like waveforms, which exceed the potential range of eye-movement-related signals, are noted [11,29]. The shapes of a facial EMG measured above the eyes are illustrated well in Ref. [27].
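The amplitude behavior described above can be sketched numerically. In this minimal model, the measured potential is simply proportional to the gaze angle toward the sensor; the sensitivity constant is a hypothetical value chosen for illustration, not a figure from the cited studies.

```python
# Minimal sketch of the corneo-retinal potential model: to a first
# approximation, the measured EOG is proportional to the gaze angle.
# K_UV_PER_DEG is a HYPOTHETICAL sensitivity, for illustration only.

K_UV_PER_DEG = 10.0  # assumed sensitivity in microvolts per degree


def ideal_eog(gaze_angles_deg):
    """Map gaze angles (degrees, positive = cornea toward the sensor)
    to idealized EOG potentials in microvolts."""
    return [K_UV_PER_DEG * a for a in gaze_angles_deg]


# Fixation at center, a 30-degree saccade toward the sensor, then back:
gaze = [0, 0, 30, 30, 0, 0]
print(ideal_eog(gaze))  # -> [0.0, 0.0, 300.0, 300.0, 0.0, 0.0]
```

The potential jumps while the cornea points at the sensor and returns to baseline afterwards, matching the step-like traces sketched in Figure 1.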

3. Overall Procedure of Utilizing EOGs for Human–Computer Interface

The experimental procedures reported in the literature differ according to the purpose of the study; however, they can be generalized into five steps: measurement, signal processing, feature extraction, pattern recognition, and communication (see Figure 2). These steps are not mandatory and may be omitted depending on the purpose, signal quality, and application. For example, the raw signal could be used directly for training a pattern recognition model.

The first step is measurement, during which researchers need to decide the type of device to be used for data acquisition and select the types and placement of the electrodes. The decision made in this step determines the initial quality of the signal and affects the later steps. The device can be stationary or mobile; stationary devices are typically more expensive and support more accurate bioelectrical signal measurement. The electrode placement may also affect the signal quality [30]. In Section 4, I summarize and discuss the electrode placements used in previous studies.

Signal processing is the next step, in which the eye-related signals are extracted and separated. Because EOGs are composed of signals from various origins, it is important to remove signals not related to the eyes. EMGs, drifts, and high-frequency noises are well-known signals that need to be removed from EOGs; the removal methods vary in the literature. Crosstalk between EOG channels is another signal that needs to be removed, although it has not been discussed much in the past. These signals are suppressed and partly removed at this step, and further removed in the pattern recognition step. The methods and a discussion of this issue are summarized in Section 5.

The processed signal must be classified after feature selection. The feature selection is designed in consideration of the recognition tasks and the models to be trained. A typical feature for the majority of recognition tasks is the processed signal of the previous step, in which the series of signal samples is considered as an input vector for a recognition model. Hidden Markov model (HMM) and dynamic time warping (DTW) based approaches often use these features [16,20]. Stochastic features, such as the mean saccadic duration, fixation rate, and blink rate, have been used with the support vector machine (SVM) and artificial neural networks [20,31,32].

Figure 2. Overall Procedure of Utilizing EOGs for Human–Computer Interface.
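The five-step procedure above can be sketched as a chain of stages. All function bodies here are illustrative placeholders, not methods from the cited studies.

```python
# Skeleton of the generalized five-step EOG pipeline: measurement,
# signal processing, feature extraction, pattern recognition, and
# communication. Every function is a toy stand-in for illustration.

def measure():
    """Step 1: acquire raw samples from the electrodes (toy data, in uV)."""
    return [0.0, 0.0, 120.0, 125.0, 0.0]


def process(raw):
    """Step 2: suppress noise (identity here; real systems filter)."""
    return raw


def extract_features(signal):
    """Step 3: e.g., use the processed series itself as the feature vector."""
    return signal


def recognize(features):
    """Step 4: toy threshold classifier for the user's intention."""
    return "saccade" if max(features) > 100 else "fixation"


def communicate(label):
    """Step 5: forward the decision to the target system."""
    return f"user intent: {label}"


print(communicate(recognize(extract_features(process(measure())))))
# -> user intent: saccade
```

In a real system, each placeholder would be replaced by the techniques surveyed in Sections 4 and 5 (electrode configuration, filtering, and an SVM/HMM/DTW classifier), but the data flow between the stages stays the same.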

Recognition algorithms are widely available, and the SVM, HMM, artificial neural network, and DTW techniques have been utilized to recognize EOG signals, although the most popular algorithm is the thresholding method. The recognition algorithms have been utilized both for recognizing users' intentions and for filtering out EMG artifacts or eye blinks, where the users' intentions are expressed by single or complex eye movements. Thresholding methods are popular for classifying single directional movements, and advanced algorithms, such as the SVM, HMM, and DTW, have been utilized for classifying complex eye movements [16,20,25,31]. Although recognition algorithms for EMG artifacts have not been studied much yet, the detection of eye blink signals has been widely studied, because eye blink removal is an important issue in EEG signal processing in the prefrontal region of the brain [33]. Thresholding the amplitude is a popular algorithm to detect the eye-blink region in signals, and template matching algorithms with DTW, correlation coefficients, and other dissimilarity features have been proposed in the past decades [34]. Finally, the classified activity of a user is transferred to target systems; this activity could be used to control other devices or environments, to type using a virtual keyboard, or to directly write letters. The subsequent sections review previous studies and discuss each of these steps.

4. Measurement

The measurement of an EOG typically involves placing sensors at the left and right sides of the eyes (for the horizontal EOG), and above and below an eye (for the vertical EOG) [22,35]. Figure 3a illustrates the positions of the electrodes. The horizontal EOG is obtained by subtracting the signal pertaining to B from that of A (or that of A from that of B), and the vertical EOG is obtained by subtracting the signal pertaining to D from that of C (or that of C from that of D). The reference electrode is often placed on the forehead [15,16,31,36,37], in the middle of the eyes [38], or on the mastoids [35,39]. Young and Sheena suggested using six electrodes placed above and below both eyes of the user [14].

Some studies used only two electrodes in addition to a reference electrode: one above one eye and the other to the right of the right eye (or to the left of the left eye) (Figure 3b) [25,40,41]. Expectedly, the signals obtained using this electrode placement were less stable than those obtained using other methods; however, it was possible to recognize eye movements in eight directions with an accuracy of over 80% [25].

Some atypical electrode placements are shown in Figure 3c–f. The placement in Figure 3c involves embedding electrodes in a headband [27,42], which may increase the usability of the measuring device. Under this placement, the vertical eye movements are ignored. Yan et al. proposed a method of placing electrodes above and below the right side of the right eye and above and below the left side of the left eye (Figure 3d) [30]. Twenty-four levels of eye movements (−75° to 75° horizontal movements) were successfully classified with an accuracy of 87.1%, but a comparison between electrode placements was not performed. Kanoh et al. proposed the measurement of EOGs by using three electrodes—two in the middle of the eyes and one between the eyebrows [43]. The electrodes were embedded at the nose-pads (A and B in Figure 3e) and on the bridge of the eyeglasses (C in Figure 3e). The horizontal EOG was obtained by subtracting the signal pertaining to A from that of B, and the vertical EOG was obtained by subtracting the signal pertaining to (A + B)/2 from that of C. A product with this electrode placement was eventually commercialized by a Japanese company. This product enabled the measurement of EOGs just by wearing glasses, thereby improving its usability. Figure 3f shows another electrode placement: Favre-Felix et al. proposed the use of in-ear electrodes to record horizontal EOGs [44]. The accuracy of in-ear EOGs was slightly worse than that of conventional EOGs (a correlation of 82.60%, compared to 90.11% for conventional EOGs), although this accuracy could be improved by implementing advanced signal processing algorithms.

The electrode placements illustrated in Figure 3c,e,f were designed to be embedded in mobile devices. The quality of the obtained signal still remains unproven. Studies on the electrode placement and the quality of the EOG signal would be necessary in future research.
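The channel arithmetic described in this section can be sketched as follows. The electrode labels follow Figure 3, and the sample values are toy data for illustration.

```python
# Sketch of how differential EOG channels are formed from single-electrode
# recordings. Labels A, B, C follow Figure 3; values are toy data in uV.

def horizontal_eog(a, b):
    """Horizontal EOG: signal at A minus signal at B (Figure 3a)."""
    return [x - y for x, y in zip(a, b)]


def glasses_vertical_eog(a, b, c):
    """Vertical EOG for the eyeglasses placement of Figure 3e:
    C minus the mean of the nose-pad electrodes, (A + B) / 2."""
    return [z - (x + y) / 2 for x, y, z in zip(a, b, c)]


A = [10.0, 12.0, 30.0]
B = [10.0, 8.0, -30.0]
C = [0.0, 5.0, 40.0]
print(horizontal_eog(A, B))           # -> [0.0, 4.0, 60.0]
print(glasses_vertical_eog(A, B, C))  # -> [-10.0, -5.0, 40.0]
```

Subtracting paired electrodes cancels potentials common to both sites (such as the reference offset), which is why the differential channels track horizontal and vertical gaze separately.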



Figure 3. Six different electrode positions to measure EOGs in the literature. The letters A, B, C, and D indicated in circles denote the locations of electrodes.

The EOG amplitude of an eye-movement is typically less than 500 μV [35], and the amplitude is proportional to the magnitude of the eye movement [37]; Yagi et al. confirmed this by performing 70 min recordings per participant [45]. However, it is difficult to utilize small eye movements owing to DC drifts and other noises [18]. More DC drift is incurred as the recording time increases and, because of this, most applications utilize large eye movements or consider movements performed in a short time period [11,16,20].

An EOG is recorded using an amplifier because its amplitude is of the order of microvolts. Several types of measurement systems exist. The devices typically used for electroencephalograms (EEGs) can be utilized for EOG recording. Stationary EEG recording devices commonly cost over 10,000 EUR (BioSemi [11,20,35], NeurOne [39,46], BioPac [47,48], and NF Corporation [41,49]). Figure 4b shows a recording environment with a stationary system (BioSemi). Some companies offer


recording systems designed for EOGs (which record two to four channels of bioelectric signals). These include BlueGain [16,50], NI-USB-6008 [27,51], and JinsMeme [43,52], which cost from 300 to 1600 EUR. Among these, JinsMeme is a mobile system, in which an EOG amplifier is embedded in eyeglasses (Figure 4a). BioPac also offers a mobile device to record EOGs, in which the device is connected to the stationary system (MP160) via wireless communication [48]. Many researchers have also introduced self-designed EOG systems [15,29,36,53–58]. Some of these systems are implemented as stationary types, which are connected to a computer via a cable [36,57], whereas the others have portable forms, in which the devices are installed in a wheelchair [15,59] or embedded into goggles [29,56,60].
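As a rough illustration of why amplification and high-resolution conversion matter for microvolt-level signals, the sketch below computes the voltage represented by one ADC step. The converter parameters are hypothetical and do not describe any of the devices listed above.

```python
# Why microvolt-level EOGs need an amplifier / high-resolution ADC:
# compute the voltage of one quantization step. Both parameters below
# are HYPOTHETICAL, for illustration only.

FULL_SCALE_V = 0.375  # assumed ADC input span in volts
BITS = 24             # assumed ADC resolution in bits

lsb_uv = FULL_SCALE_V / (2 ** BITS) * 1e6  # one ADC step, in microvolts
print(round(lsb_uv, 3))  # -> 0.022
```

A step of roughly 0.02 μV comfortably resolves signals of tens to hundreds of microvolts; a coarse converter without front-end gain would bury the EOG in quantization noise.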


Figure 4. Devices with different styles and types of electrodes. (a) Dry electrodes embedded in a mobile device [52]; (b) a typical stationary device with wet electrodes [35]. Reprinted with permission.

In the literature, the use of wet electrodes is popular. This is possibly because many studies did not focus on making a product, as the signal-to-noise ratios (SNRs) of wet electrodes are higher. However, glasses with dry electrodes have demonstrated impressive performance, comparable to that of wet electrodes. Bulling et al. classified six eye-movement activities (copying, reading, writing, watching a video, browsing, and no activity) with the use of dry electrodes, leading to an accuracy (F1 score) of 68.5 [31], and they also reported the classification of eight eye-gestures with an accuracy of 87% [29]. Lledó et al. recognized six-directional eye-movements with an accuracy of 90% [60]. Although direct comparisons between the types of electrodes have not been carried out yet, the recognition accuracies do not exhibit a considerable difference. However, it can be expected that noises caused by body movements or sweat degrade the quality of the signal more when dry electrodes are used than with wet electrodes, because the contact between the electrodes and the skin could worsen in mobile environments.

5. Signal Processing

The main purpose of EOG signal processing is to remove noise and emphasize the signal of interest. A number of different eye movement types exist: saccades (fast eye movements), fixation (fixing the gaze onto a subject), miniature eye movements (drift, tremor, and microsaccades during fixation), smooth pursuit (when tracking a slowly moving target), and vergence (inward and outward eye movements) [22]. Among these movement types, saccades and smooth pursuit have been the main signal sources for HCI in the past decades, as the others, except for fixation, are uncontrollable. Hence, the signals related to those uncontrollable eye movements have been considered as noise to remove in this field.

High-frequency noises, such as tremor or high-frequency EMG artifacts, have often been removed using median and low-pass filters [11,15,16,20,27,29,38,53–55,57]. The cut-off frequency for the low-pass filters, as reported in the literature, varies from 10 to 60 Hz [15,16,27,38,53–55,57]. There is no consensus on the best filter to remove these noises, but the use of low-pass filters has been preferred in the literature. It is visually observable that a low-pass filter does not preserve the edge steepness of saccadic eye movements [56], whereas using a median filter, ten eye gestures could be

Sensors 2019, 19, 2690 7 of 20

recognized with an accuracy of 95.93% [20]. In addition, Bulling et al. demonstrated that an adaptive median filter is superior to a low-pass filter [56]. However, a recent study employed the median filter together with a 20-Hz low-pass filter, and twelve eye gestures were reported to be recognized with an accuracy of 95%; further, it was claimed that a cut-off frequency of 20 Hz is sufficient to include intentional eye-movements and remove high-frequency noises [16]. Despite this slight contradiction, both methods could obtain applicable results. Hence, as noted in [39], it is reasonable to consider that the method for high-frequency noise removal should be selected taking into account the other procedures of the entire system.

Drift has conventionally been removed using high-pass filters, with the cut-off frequency varying from 0.05 to 0.1 Hz [15,38,53,54]. In a few
exceptional cases, cut-offs of 0.175 [61] and 0.53 Hz [40] were used; however, these cases dealt only with single, directional eye movements [40], and they utilized the high-pass filtered signal together with the raw signal [61]. The reason for the low variation is that the high-pass filter easily deforms the signal during saccade and fixation. Figure 5 illustrates how the signals are deformed by high-pass filters as the cut-off frequency increases. An ideal eye movement signal was generated (top-left), and the high-pass filtered signals with different cut-off frequencies are displayed. The experimental code was written and run in Matlab™, and a first-order Butterworth filter was utilized, as in [34,57]. It is clearly shown that during fixation, the signal becomes rounded, which can be easily confused with the signals pertaining to smooth pursuit or complex and continuous movements.
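The Figure 5 experiment can be approximated in Python (the original code was written in Matlab). The sketch below applies a first-order Butterworth high-pass filter to an ideal saccade-plus-fixation signal; the sampling rate and signal shape are assumptions made for illustration.

```python
import numpy as np
from scipy.signal import butter, lfilter

fs = 250.0
t = np.arange(0, 8.0, 1 / fs)
# Ideal eye movement: one saccade at t = 1 s followed by a long fixation.
ideal = np.where(t > 1.0, 100.0, 0.0)

def highpass(signal, cutoff_hz, fs):
    # First-order Butterworth high-pass, as in the Figure 5 experiment.
    b, a = butter(N=1, Wn=cutoff_hz / (fs / 2.0), btype="high")
    return lfilter(b, a, signal)

# The held fixation decays toward zero after filtering, and it decays
# faster as the cut-off frequency increases; a held gaze then looks like
# a slow return movement, which is the deformation shown in Figure 5.
plateau_end = {fc: float(highpass(ideal, fc, fs)[-1]) for fc in (0.05, 0.1, 0.5)}
```

Inspecting `plateau_end` confirms the rounding effect: the residual plateau level shrinks monotonically as the cut-off frequency rises.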

Figure 5. Artificially generated eye movement signal and the changes after applying high-pass filters of different cut-off frequencies. The cut-off frequencies are noted above each axis, while the original signal is displayed in the top-left.

To avoid this problem, some researchers attempted to utilize alternative methods for drift removal. Polynomial regression is one of these methods [11,39], in which the EOG signal is fit to a polynomial expression and the fitted signal is subtracted from the original signal. The degree of the polynomial varies from 1 to 20, depending on the shape of the original signal. After removing the drift, Pettersson et al.'s method could detect horizontal and vertical saccades of eyes moving more than 7.5° with a sensitivity of 94% [39]. Another substitute for high-pass filtering is wavelet transformation. Bulling et al. proposed a method named CWT-SD, in which wavelet coefficients were calculated and signals were reconstructed for cases in which the coefficients were greater than a given threshold [29]. Pettersson et al. also removed high frequency noise over 100 Hz using the wavelet transformation [39]. The merits of this method were that the signals in the fixation region remained
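A minimal sketch of the polynomial-regression idea follows, assuming a purely linear drift and a degree-1 fit; real recordings need a degree matched to the signal shape, and fitting across saccades can distort them, so this is an illustration rather than a complete procedure.

```python
import numpy as np

def remove_drift_poly(signal, degree=3):
    """Fit the trace to a polynomial and subtract the fitted curve.
    The literature uses degrees from 1 to 20 depending on signal shape."""
    x = np.arange(signal.size)
    trend = np.polyval(np.polyfit(x, signal, deg=degree), x)
    return signal - trend

# Example: a fixation-only trace with slow linear electrode drift.
t = np.linspace(0.0, 10.0, 2500)
drifting = 4.0 * t                      # 4 uV/s drift, no eye movement
detrended = remove_drift_poly(drifting, degree=1)
```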


unchanged, and gradual changes such as drift were removed. This method was replicated by another research group, and they were able to recognize eye gestures after noise removal [11,20]. A disadvantage of utilizing the wavelet transformation is that slow eye movements, such as smooth pursuit or movements over a short distance, are removed along with the drift. This could be an issue when an application is developed for people with ALS, because their eye movements could be slower than those of people without ALS [62].

Methods concerning the detection of eye blinking are described in Section 7. Typically, the eye blink artifact region is completely removed and the signal before and after the blink is interpolated [20,29], because although eye blinking rarely occurs during saccades, it occurs immediately before and after the saccades [29].
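The removal-and-interpolation step can be sketched as follows, assuming a blink mask is already available from a detector; linear interpolation is used here as one common choice, though the cited works do not prescribe a specific interpolant.

```python
import numpy as np

def interpolate_blinks(signal, blink_mask):
    """Drop blink-artifact samples and bridge the gap by interpolating
    between the retained samples on either side of each blink region."""
    cleaned = np.asarray(signal, dtype=float).copy()
    idx = np.arange(cleaned.size)
    cleaned[blink_mask] = np.interp(idx[blink_mask],
                                    idx[~blink_mask],
                                    cleaned[~blink_mask])
    return cleaned

# Example: a flat trace with a blink-like deflection in samples 40-59.
eog = np.zeros(100)
eog[40:60] = 300.0                      # ~300 uV blink artifact
mask = np.zeros(100, dtype=bool)
mask[40:60] = True
cleaned = interpolate_blinks(eog, mask)
```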
The last feature that needs to be considered in the signal processing of EOGs is crosstalk, or the interdependency between the horizontal and vertical channels. In 2008, Tsai et al. [63] reported that a large amount of horizontal movement led to signal changes in vertical EOGs. The first strategy for addressing crosstalk was to ignore EOG signals with low amplitude. In 2016, another research group raised this issue and studied it intensively [35]. The authors instructed participants to move their eyes according to dots on a screen, and the EOGs during the experiments were recorded. By analyzing the experimental data, the authors reported that crosstalk occurred in the case of nine out of ten participants, and the amount of crosstalk was different for each participant. The authors also proposed a simple method to remove this crosstalk, which involved first measuring the amount of crosstalk by recording the EOG during horizontal eye movement, and then removing that amount of crosstalk from the test signals. Examples of the ideal EOG signal, measured signal, and compensated signal are shown in Figure 6.
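A minimal sketch of this calibration-based compensation follows, assuming the leakage is a fixed linear fraction of the horizontal channel; the least-squares estimator and the signal shapes are illustrative choices, not the exact procedure of [35].

```python
import numpy as np

def estimate_crosstalk_ratio(h_cal, v_cal):
    """Least-squares ratio k such that v_cal ~ k * h_cal, estimated from a
    calibration recording containing horizontal eye movements only."""
    return float(np.dot(h_cal, v_cal) / np.dot(h_cal, h_cal))

def compensate(h, v, k):
    # Subtract the estimated horizontal leakage from the vertical channel.
    return v - k * h

# Calibration example: a horizontal saccade waveform, 15% of which leaks
# into the vertical channel (the 15% figure is arbitrary).
h_cal = 100.0 * np.sin(np.linspace(0.0, 4.0 * np.pi, 500))
v_cal = 0.15 * h_cal
k = estimate_crosstalk_ratio(h_cal, v_cal)
```

After calibration, `compensate` is applied to the test signals with the stored ratio `k`, leaving only the genuine vertical component.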

Figure 6. Crosstalk in EOG signals [35]. The horizontal component represents the EOG recorded at the left and right sides of the eyes, whereas the vertical component represents the EOG recorded above and below an eye. The ideal signal is the expected EOG when a participant moves his/her eyes as instructed. The compensated EOG is the signal after the crosstalk is removed from the obtained (uncompensated) signal.

6. Applications for Human–Computer Interface

HCI applications for EOGs are often developed for paralyzed people, such as people with ALS. People suffering from ALS gradually lose the ability to control their muscles, and they lose most of their control in the later stages of this disease [5,6]. Because eye-movements are one of the ultimate ways of communication in such scenarios, it could be valuable if an EOG-based communication tool were developed as a product for patients and their caregivers.

Another application field of EOGs is gaming. In a virtual reality (VR) game environment, it is not easy to use conventional input devices. Voice recognition technology is a powerful tool and has demonstrated high accuracies; however, it could disrupt the player's surroundings.

Most HCI applications of EOG reported in the literature utilize two signal sources: saccades and eye blinks. Saccades are often used for controlling movement, and blinks help in selections. This section lists recent applications utilizing EOGs for HCI. Please note that although I classified the eye blink as an artifact according to the EOG's definition in Section 2, the eye blink is an important signal that can be obtained from EOGs because it is distinguishable from eye-movements. The eye blink can be utilized for selecting a button or starting an eye-based communication system.

In 1999 and 2002, Barea et al. proposed a wheelchair with an EOG input interface [15,59].
A user could control the wheelchair by providing six different commands using his/her eye movements: forward, backward, left, right, stop, and speed control. The four directional movements were commanded using fast and long saccadic movements upward, downward, leftward, and rightward. The speed was controlled via small and gradual eye movements (see Figure 7 for the wheelchair and menu screen). The stop condition was activated when the eyes were closed. In a comparable manner, some researchers suggested the use of EOG communication for robot control. Rusydi et al. and Oh et al. independently suggested moving a robot according to the moving direction and angle of the user's eyes [41], and controlling a robot's moving direction and speed by using eye movements in four directions [42]. The studies of both Rusydi et al. and Oh et al. suggested the idea of the application without implementation.
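A command scheme of this kind can be caricatured with a simple threshold rule. Every threshold, channel convention, and tie-breaking choice below is hypothetical and invented for illustration; it is not taken from Barea et al.'s implementation.

```python
# Hypothetical mapping from EOG deflections to the six wheelchair commands;
# thresholds, units, and the dominant-channel rule are invented for
# illustration and are not from Barea et al.
def classify_command(h_amp, v_amp, eyes_closed, threshold=50.0):
    """h_amp/v_amp: peak deflection on the horizontal/vertical channel,
    signed by direction (right/up positive); eyes_closed: closure flag."""
    if eyes_closed:
        return "stop"
    if abs(h_amp) < threshold and abs(v_amp) < threshold:
        return "speed"                    # small, gradual movements
    if abs(h_amp) >= abs(v_amp):          # dominant channel decides
        return "right" if h_amp > 0 else "left"
    return "forward" if v_amp > 0 else "backward"
```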


Figure 7. Wheelchair embedded with EOG input interface. (a) Overview; (b) Control display. Reprinted with permission from Ref. [15].

Some studies suggested utilizing eye-movements for cursor control. Yan et al. introduced a mouse control system for answering a questionnaire [64]. The system allows a user to select an answer by moving a cursor in a 4 × 4 grid, after displaying a question for a few seconds. The user can move the cursor by moving the eyes in four directions. The cursor does not move when the electric potential caused by an eye movement is smaller than a threshold. Deng et al. implemented a remote controller for a TV, in which the four-directional movements corresponded to changes in the channel and volume [53]. The channel changed when the eyes moved vertically, and the sound volume increased/decreased when the eyes moved horizontally.

Games are another application utilizing EOG. A simple shooting game was proposed in the same study conducted by Deng et al. [53]. The goal of the game was to shoot the balls of the same color as the user's ball, which moved left and right via eye movements (Figure 8a).
The user’s ball at the bottom, as illustrated in Figure 88a,a, movesmoves leftleft andand rightright accordingaccording toto thethe eyeeye movements, and the user can change the ball color by moving the eyes downwards. Target balls were displayed in a 3×5 grid, in three different colors. The user can shoot the ball by moving the eyes upwards, and the target balls having the same color as that of the user’s in the same line disappear. The game finished when all the target balls disappeared.

Kim and Yoon developed a Dance Dance Revolution game with eye movements [38]. The user was required to move his/her eyes by following directional instructions on the game screen, in which arrows for the instructions dropped gradually from top to bottom (see Figure 8b). The user was required to follow the instructions before the arrows reached the bottom; the game was terminated otherwise. It was difficult to recognize directions when the user repeatedly moved his/her eyes in the same direction. They amplified the signal sufficiently and set a threshold for noise removal precisely.

Kumar and Sharma [65] implemented a VR game with EOG, named "VRailSurfer." In this game, the user controls a game character's movement through eye blinks.
The goal of the game was to survive and collect coins during the run on the rails. The game character was required to avoid trains by moving left or right, and jumping; these movements of the character were controlled by blink, double blink, and wink, respectively. The accuracies of the blink detection and VR control were 96% and 80%, respectively. The drop in the accuracy for VR control was caused by impedance changes during head movement, although wet and sticker-type electrodes were used.


Figure 8. Games controlled using eye movement. (a) Shooting game to find the same color [53]. (b) Dance Dance Revolution game in which the user moved his/her eyes to follow directional instructions [38]. Reprinted with permission.

Although the majority of the previous studies have focused on utilizing a limited number of commands, such as the four-directional movements, there have been a few attempts to enable keyboard typing or writing with EOGs. Yamagishi et al. proposed a method for a user to type letters [25]. A virtual keyboard was introduced in which the cursor could move in eight directions according to eye movements. The users could freely move the cursor over a keyboard on a screen with eye movements, and wink to select the letter under the cursor. The merit of this system was that it facilitated complex communication. Instead of choosing one of four predefined options, users could communicate freely, similar to when using a computer keyboard.

Xiao et al. proposed a typing system with eye blink [66]. The system displayed letters in 4 × 8 grids, in which each button flashed in a random order. A user was required to stare at a button to type and blink when it flashed.
The system recognized the button by synchronizing the flashes and detecting the eye blink. The user can type a letter in 2 s using this system, as the flashing duration is as short as 100 ms and there is an overlap of 40 ms between two adjacent flashes. This was an impressive idea, which enabled fast communication using eye blinking only.

The gesture recognition proposed by Bulling et al. could be utilized in a similar way [29]. In their study, eight different gestures (combinations of eye movements in different directions) were designed and recognized (see Figure 9). These eye gestures could provide users with more options to communicate, especially because these gestures were diverse and easy to remember.
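The flash-synchronization idea of Xiao et al.'s typing system can be sketched as a lookup from the blink time into the flash schedule. Only the timings (100 ms flashes, 40 ms overlap) come from the text; the button order and the matching rule below are invented for illustration.

```python
# Sketch of flash-synchronized typing: each button flashes for 100 ms and
# a new flash starts every 60 ms (i.e., a 40 ms overlap). The typed button
# is one whose flash window covers the detected blink time.
FLASH_MS, STEP_MS = 100, 60

def flashed_buttons(blink_ms, flash_order):
    """Return the buttons whose flash window contains the blink time."""
    hits = []
    for i, button in enumerate(flash_order):
        start = i * STEP_MS
        if start <= blink_ms < start + FLASH_MS:
            hits.append(button)
    return hits

order = ["A", "B", "C", "D"]              # hypothetical flash order
```

Because consecutive flashes overlap by 40 ms, a blink can fall inside two windows at once, so the real system must disambiguate such cases, for instance by combining the blink timing with the button the user is staring at.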

In this perspective, eye-writing, in which users draw letters directly using eye movements, is a meaningful approach for utilizing EOGs for HCI. The concept of eye-writing was first introduced by Tsai et al. in 2007 [63].

Figure 9. Eye gestures of different complexities [29]. Dots denote the starting points.

Ten Arabic numbers and four mathematical symbols were written by 11 users, and fairly acceptable accuracy was achieved (believability: 75.5%, dependability: 72.1%). The authors also applied the same system to 40 different letters (English alphabets, Arabic numbers, and punctuation symbols). The reported mean accuracy was 80% for five users, although the small number of participants was a limitation of the study.

In 2016, Lee et al. attempted to build a system for eye-writing English letters [11]. They made participants sit in front of an empty panel and eye-write letter by letter according to audio instructions. The stroke orders of 12 letters (c, e, f, i, j, o, r, t, y, space, backspace, and enter) were predefined to avoid confusion, as the shapes of some letters are remarkably similar in the absence of pen-up information. The participants were asked to write naturally, in the manner in which they hand-wrote letters. Figure 10 shows a set of the eye-written letters. The accuracy (F1 score) of the system was 87.38% for 20 participants.

Figure 10. Example of eye-written characters (26 English alphabets and three punctuation symbols). The color of the traces indicates the stroke order (from red to blue). Reprinted with permission [11].

Recently, in 2018, Fang et al. published an article on eye-writing for Japanese Katakana [16]. Because Katakana is mostly composed of straight lines, the researchers developed a system to recognize 12 basic types of strokes. By recognizing these strokes, the proposed system was able to classify all the Katakana (48 letters). The accuracy of Katakana recognition was 93.8% for six participants. One of the distinguishing features of the study was that continuous eye-writing was implemented. By ignoring small eye movements, the system could recognize the eye-writing of


multiple letters continuously without discrete sessions. The mean input rate was 27.9 letters per minute.

The study of Chang et al. also considered eye-writing, but it focused on the feasibility of applying the eye-writing technique to individuals suffering from ALS [20]. They proposed a novel design of ten digits (from 0 to 9) so that people may eye-write with minimum effort. The shapes of the digits were designed to be similar to the corresponding Arabic numbers, to be as short as possible, and to be distinguishable from the other digits (see Figure 11). Eye-writing experiments were conducted with 18 healthy participants (healthy group) and three participants with ALS (ALS group).
Because Because people people withwith ALS get fatigued easilyeasily andand theythey findfind it it di difficultfficult to to communicate communicate verbally, verbally, all all the the instructions instructions for foreye-writing eye-writing were were explained explained systematically. systematically. The The mean mean accuracies accuracies of the of healthythe healthy and ALSand ALS groups groups were were95.74 95.74 and 86.11, and 86.11, respectively. respectively. Despite Despite the reduction the reduction in the meanin the accuracy,mean accuracy, the study the verified study verified that the thatEOG-based the EOG HCI-based system HCI couldsystem be could used be as used a communication as a communication tool for peopletool for withpeople ALS. with ALS.

Figure 11. Design of eye-writing digits for individuals with ALS. Reprinted with permission [20].

7. Feature Extraction and Pattern Recognition

Pattern recognition algorithms play the role of interpreting EOG signals for a given practical use. The algorithms are required for the following tasks: eye blink detection, EMG detection, recognition of saccadic direction, and eye-writing/activity recognition. Feature extraction is necessary to minimize the complexity of recognition problems. By converting the raw signals into notable features, higher accuracy may be achieved.

Automatic detection of eye blink artifacts has been studied extensively in the past decades. One of the easiest methods for such detection is to determine a threshold and classify a region of the signal as an eye blink if the potential of the region is higher than the threshold [67]. This method has an advantage when high accuracy is not necessarily required [68], as it can be easily implemented in a short time.
The main limitations of this method are the relatively low accuracy for detection and the low precision for determining the range of the artifacts, because the detection was conducted epoch by epoch, that is, it was only determined whether an epoch included an eye blink [34]. The mean accuracy of the amplitude thresholding was 93.24% for 24 participants, whereas an advanced method achieved 96.02% for the same database [34]. The drop in the accuracy was mainly caused by the variation of the eye blink amplitudes among participants.

To detect eye blinks with better accuracy, advanced algorithms have been utilized. Chambayil et al. utilized the kurtosis of adjacent windows and the minimum and maximum amplitudes of signals as input features to determine whether a signal in a window includes an eye blink [32]. A multi-layer perceptron model was developed with nine hidden layers. The reported regression value was 84.99% for 384 epochs from 13 participants. Delorme et al. suggested
a method of utilizing the kurtosis and entropy of a signal for eye blink detection [69]. The authors found that epochs with eye blinks exhibit higher kurtosis and entropy, and these features were utilized to classify the epochs with eye blinks after applying independent component analysis (ICA).

Durka et al. utilized the correlation coefficients between the EEG channels [70]. An epoch was said to include eye blinks if the coefficient between the EEG signals at Fp1–F3 and F3–C3 or at Fp2–F4 and F4–C4 was higher than the threshold. The accuracy of the proposed system was similar to that of human experts: the area under the curve (AUC) of the proposed system was 91.5%, whereas the AUC of the human experts was 95.4%. Chang and Im suggested utilizing dynamic positional warping, a template matching method that identifies the corresponding points dynamically [71]. The authors achieved an accuracy of 87.13%, which was higher than that of traditional dynamic time warping (77.92%). The authors also proposed a method to determine the eye blink region without employing templates by utilizing the summation of the derivatives within multiple windows (MSDW) [34]. The reported mean accuracy was 96.02% for the detection of epochs contaminated by eye blink artifacts, and the system detected more than 80% of the artifact region with an accuracy of 87.96%.

The automatic adaptation of a system to a user is another issue in eye blink detection [39,72]. Pettersson et al. proposed a method of selecting a threshold to distinguish saccades or eye blinks from noise [39]. The method first determines all the local maxima and sorts them according to their amplitudes. Subsequently, the curve formed by the maxima's amplitudes is fit to two consecutive straight lines, where the joint point becomes the threshold. Chang et al. proposed a similar algorithm utilizing the histogram of the local maxima [72]. A notable point of this study is that the authors detected eye blinks in real time while adjusting the threshold. The proposed algorithm determined a user's personal threshold for detecting eye blinks within a few seconds.

The detection or removal of EMG artifacts from EOGs has not been studied extensively yet.
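The threshold-selection idea above can be illustrated with a short sketch. This is not Pettersson et al.'s exact algorithm; the function names and the exhaustive least-squares search for the joint point are my own assumptions.

```python
import numpy as np

def local_maxima(signal):
    """Indices of strict local maxima in a 1-D signal."""
    s = np.asarray(signal, dtype=float)
    return np.where((s[1:-1] > s[:-2]) & (s[1:-1] > s[2:]))[0] + 1

def _line_sse(x, y):
    """Sum of squared errors of a least-squares straight-line fit."""
    coef = np.polyfit(x, y, 1)
    return float(np.sum((np.polyval(coef, x) - y) ** 2))

def knee_threshold(peak_amplitudes):
    """Sort the peak amplitudes in descending order, fit two consecutive
    straight lines to the sorted curve, and return the amplitude at the
    joint point: peaks above it are treated as blinks, the rest as noise."""
    a = np.sort(np.asarray(peak_amplitudes, dtype=float))[::-1]
    x = np.arange(len(a))
    best_k, best_err = 2, np.inf
    for k in range(2, len(a) - 1):  # keep at least two points per segment
        err = _line_sse(x[:k], a[:k]) + _line_sse(x[k:], a[k:])
        if err < best_err:
            best_err, best_k = err, k
    return a[best_k]

def detect_blinks(signal, threshold):
    """Return the local-maximum indices whose amplitude exceeds the threshold."""
    return [int(p) for p in local_maxima(signal) if signal[p] > threshold]
```

With synthetic peaks (a few large blink peaks, many small noise peaks), the joint point of the two fitted lines falls at the top of the noise cluster, so the returned threshold separates the two groups without any per-user tuning.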
Most studies on EOGs involve controlling the users' body or facial movements, and focus on clean eye-movement-related signals [16,20,35]. Bulling et al. suggested utilizing an adaptive median filter to remove EMGs from EOGs [29]. The adaptive filter reduced EMGs during a walk by up to 80% in horizontal EOGs and 60% in vertical EOGs. The window size of the median filter is critical and differs for each activity; it was adjusted to cover each activity by utilizing an accelerometer.

A few studies classified facial EMGs from the obtained EOG signals. Lee et al. considered a signal region with a higher amplitude than that of the typical EOG signals, which was experimentally set to 800 µV [11]. Paul et al. classified EOGs and EMGs by using different bandpass filters (a 10 Hz low-pass filter for EOGs, and a 25–125 Hz band-pass filter for EMGs) [27]. The signals with a larger root mean square (RMS) than the threshold were considered as EMGs, in which the threshold was set to 120 mV through calibration.

Classifications of eye-written characters and eye activities have utilized elaborate algorithms. Tsai et al. utilized an artificial neural network (learning vector quantization) for eye-writing recognition [63]. The input features were the initial movement direction and the number of directional changes in the horizontal and vertical EOGs. The proposed system recognized 14 patterns (10 Arabic numbers and four arithmetic symbols) with an accuracy (believability) of 75.5%. Lee et al. utilized DTW, which is a type of template matching technique [11]. Raw signals of horizontal and vertical EOGs were used as features after signal processing. The DTW technique involves finding the corresponding pairs between the template and test signal and calculating the dissimilarity between the two signals. Dynamic positional warping (DPW) is a modification of the conventional DTW, and is used to enhance the correspondence quality for two-dimensional data. Chang et al.
classified Arabic digits with an accuracy of 95.74%, utilizing DPW together with a support vector machine (SVM) [20]. The SVM is a traditional and popular classification model, which searches for the hyperplane that maximizes the distance between groups. This model was originally designed for binary classification, but it can solve multi-class problems by utilizing multiple SVM models [73]. Bulling et al. utilized the SVM alone to recognize six different activities (copying, reading, writing, watching a video, browsing, and no activity) [31]. Fifteen features were introduced to achieve a high accuracy, including the mean and variation of the amplitude of saccades, the proportion of specific saccades, fixation rates, blink rates, and wordbook counts. The obtained accuracy (F1 score) was 73.19% when the verifications were conducted in a subject-independent manner. Bulling et al. also attempted to use the edit distance from string matching to recognize eight eye gestures (Figure 9). The eye gestures were similar to eye-writing; however, the traces of gaze did not form a letter. The attained accuracy was 87% in subject-dependent verification [29].

The hidden Markov model (HMM) is a method of probabilistic modeling for changes in states. Because this method is suitable for describing time-domain signals, it has been utilized for voice recognition, gesture recognition, and online handwriting recognition [74–76]. Fang et al. employed an HMM together with a multilayer perceptron (MLP) to recognize ten basic strokes of Katakana [16]. The MLP was utilized to determine the states in the HMM, and the HMM trained the state transitions and calculated the probability of a series of transitions for a class. The MLP had two hidden layers with 200 and 100 nodes, respectively, and the trained HMM had 63.6 states on average for the user-independent model. The input features were the raw signals filtered via a low-pass filter, delta, and delta-delta signals.
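Delta and delta-delta features of this kind can be computed, for example, with a symmetric difference. This is a minimal sketch; the exact regression window and padding used by Fang et al. are not specified here, and the boundary-repeat convention is an assumption.

```python
import numpy as np

def delta(x):
    """First-order delta feature: symmetric difference, with the edges
    padded by repeating the boundary samples (a common convention)."""
    x = np.asarray(x, dtype=float)
    padded = np.concatenate([x[:1], x, x[-1:]])
    return (padded[2:] - padded[:-2]) / 2.0

def feature_stack(h, v):
    """Stack raw, delta, and delta-delta features of the horizontal and
    vertical EOG channels into a (T, 6) feature matrix."""
    feats = []
    for ch in (h, v):
        d1 = delta(ch)
        feats.extend([np.asarray(ch, dtype=float), d1, delta(d1)])
    return np.stack(feats, axis=1)
```

The resulting (T, 6) matrix is the kind of frame-wise input that a sequence model such as an HMM or MLP can consume directly.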
They reported an accuracy of 86.5% for ten types of isolated strokes, which was 8.9% higher than the accuracy of dynamic time warping.

After reviewing the pattern recognition algorithms for EOGs in the literature, sufficient evidence for the superiority of any single algorithm could not be found. Although a number of studies compared their proposed methods to conventional ones, the performance of an algorithm may differ according to the purpose of the study, the collected data, and the employed signal processing methods. For intensive comparisons and the advancement of this domain, I believe that in future studies, data will need to be collected considering various scenarios, and must be made public.
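A minimal nearest-template classifier built on DTW, in the spirit of the template matching approaches above, can be sketched as follows. The distance measure and the toy templates are illustrative assumptions, not the features or templates used in the cited studies.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic time warping distance between two sequences of
    2-D points (e.g., horizontal/vertical EOG samples)."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            # extend the cheapest of the three admissible warping steps
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(D[n, m])

def classify(trace, templates):
    """Return the label of the template with the smallest DTW distance."""
    return min(templates, key=lambda label: dtw_distance(trace, templates[label]))
```

For example, with one template per gaze gesture, a noisy rightward trace is matched to the "right" template because warping absorbs differences in timing while the point-wise distances stay small.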

8. Limitations and Future Work

As described and discussed above, EOGs are an effective and cost-efficient way to estimate eye-movements. The approach allows eye movements to be recorded under any light condition, is not influenced by obstacles, and works even when the subject's eyes are closed. Despite these merits, the number of studies on this subject is limited, and many topics and issues have been left unaddressed.

One of the most critical issues in utilizing EOGs for HCI is the handling of artifacts, especially EMGs. Studies on EMG removal from EOGs are limited, and more investigation is necessary. To increase usability, EOG-based HCI systems can be embedded into mobile devices by using dry electrodes. Another point to consider is that although mobile devices are easy for users or caregivers to carry, the mobile environment incurs unexpected movements of the users or devices, which generate EMGs or other artifacts. Future works on EOG-based HCI are expected to address this issue.

The next issue pertains to the application fields. Although the accuracy of the discussed systems was noted to be sufficiently high for practical use, the applications were limited to prototype games, wheelchair control, and virtual keyboards. I believe that the field of application must be expanded. Because a user can make four to eight selections in a few seconds, complicated systems for controlling devices could be invented in the near future. Eye-writing, which involves directly inputting characters by moving the eyes, is a relatively novel approach for EOG-based communication, and it could be embedded in virtual reality (VR) devices.

Another research topic for future study is a hybrid system combining EOGs with other biosignals for HCI. There are a number of hybrid systems in the literature; however, the ways of utilizing these signals vary. Ramli et al. proposed a wheelchair control system with electroencephalograms (EEGs) [77].
They utilized signals from three EEG channels (C3, C4, and O2 in the ten-twenty electrode placement) and successfully classified the EEG signals into four categories (eyes closed, and eyes moving left, right, and center). This study demonstrated that placing electrodes around the eyes is not mandatory for recording EOGs. Moreover, it indicated the possibility of utilizing other EEG features together with EOGs without placing additional electrodes on the user's face. Figure 12 shows eye- and eyelid-related signals observed at EEG channels.

Similarly, Oh et al. proposed a hybrid system utilizing both EEGs and EOGs, which classified eye closing/opening and the movement of the eyes in the left/right direction [42]. It differed from Ramli's system in terms of the placement of electrodes: the EEGs and EOGs were measured from the prefrontal region in the proposed system. The authors confirmed that EEGs measured from the prefrontal region could also support EOG-based HCI by automatically recognizing whether the eyes are open or closed, thereby leading to an increase in the accuracy of the entire system.



Figure 12. Eye- and eyelid-related signals observed at EEG channels. (a) Signal at O2 when closing and opening eyes, (b) Signals at C3 and C4 when moving eyes left, (c) Signals at C3 and C4 when moving eyes right. Reprinted with permission [77].

Ma et al. proposed another hybrid system with EEG for robot control [78]. The system was composed of two different modes, the EOG and EEG modes, which were switched by frowning. The EOG mode was used to control the directional movements of the robot, and the EEG mode was used to control activities (receiving, passing over an object, sitting down, and dancing) or to change the target robots. The EOGs were classified into seven activities (frowning, double and triple blinks, left and right winks, and horizontal eye movements) using a thresholding-based method. In the EEG mode, an event related potential (ERP) based system [79] was activated, in which a user was required to choose an icon from eight selections.

By adding more sensors to record a user's activities, an HCI system may increase the number of commands to control a device or increase the recognition accuracy of the activities. Minati et al. proposed a hybrid wearable device to control a robot arm, which measures EOGs, EEGs, EMGs, and accelerations.
By utilizing multiple signals from a device, they increased the number and types of activities used to command a system. The recognizable activities were head tilting, sideways head rotation, nodding, repeated or single eye blinks, horizontal/vertical eye movements, jaw clenching, and brain activities in the alpha-band.

Another important topic for future research is a standard database. Many research groups have introduced various methods, but it is difficult to compare them and find the best method because of the lack of a common and standard EOG database. A standard database will help researchers to develop novel algorithms, as newly developed algorithms can be easily tested on the database.

9. Summary

This paper presents a review of the studies on EOGs from the perspective of HCI (the various studies are tabulated in Table 1). The topic of EOGs for HCI has been widely studied in recent decades; however, comparisons between the studies have not been conducted sufficiently.
The main contribution of this study is that a general procedure for utilizing EOGs for HCI was suggested, and the recent studies were summarized for each step of the procedure. Through this study, researchers can overview the entire research field and easily compare one study to another. Moreover, research directions can be derived from this study by reviewing the overall research progress in this field. A number of future research directions, which may inspire the readers, are suggested.

Table 1. Summary of recent studies on EOGs for human computer interface.



Categories / Types: References

Measuring Device
  Stationary: [11,16,20,27,35,36,39,41,47,57,63]
  Mobile: [15,29,31,42–44,53–56,58–60]

Electrode Position *
  a: [11,15,16,20,29,31,36–39,54,56,57,59,61,80]
  b: [25,40,41]
  c: [27,42]
  d: [30]
  e: [43]
  f: [44]

Signal Processing
  Median filter: [11,16,20,29,31,35,56]
  Low/High pass filter: [15,16,25,27,37,38,40,53–55,57,59,61]
  Polynomial regression: [11,39]
  Saccade detection: [20,29]
  Interdependency removal: [35,63]

Feature
  Time-series signal: [20,70,71,81]
  Stochastic feature: [32,69]

Pattern Recognition
  Thresholding: [25,54,59,67]
  Support vector machine: [31]
  Hidden Markov model: [16]
  Artificial neural network: [16,63]
  Dynamic time warping: [11,20,71]

Applications
  Button selection: [61]
  Typing/virtual keyboard: [25,66]
  Eye-writing: [11,16,20,63]
  Robot control: [41,42,47]
  Wheelchair control: [15,59]
  TV control: [53]
  Game: [38,53,65]

* Refers to the electrode position subfigures in Figure 3.

EOGs are typically obtained around the eyes by measuring the electric potential between opposite sides of the eyes. Commonly, four electrodes are used (one each above and below an eye, and one each to the left and right of the eyes), and the minimum number of electrodes is three (at both sides of the bridge of the nose and between the eyebrows). This author found no evidence to substantiate the superiority of one electrode position over the others; however, the use of a smaller number of electrodes may increase the usability and reduce manufacturing costs.

A number of different signal processing and pattern recognition algorithms exist for EOGs. Artifacts and noise are required to be removed for further processing, and the movement directions, eye gestures, and eye-written characters are recognized by the algorithms. The process is generalized as follows: (1) high/low-pass filtering, (2) median filtering, (3) eye blink removal, (4) saccade detection, (5) feature extraction, and (6) pattern recognition. There may exist variations in this procedure according to the purpose, types of activities, and combination of algorithms used for an application.

In the literature, EOG-based communication has been applied for typing on a virtual keyboard, writing letters directly, moving a mouse, and controlling a wheelchair or a robot. Most applications utilized directional eye movements over a short time period, although a few exceptions utilizing complex eye gestures exist. When these gestures take the forms of general letters (English alphabets, Arabic numbers, or Japanese Katakana), the activity is called eye-writing, which is a relatively new concept of utilizing eye-movements for communication. Eye-writing increases the users' freedom by allowing them to write complex words or sentences directly. These gesture recognition techniques may improve EOG-based communication by increasing the speed of communication.
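The generalized procedure above can be illustrated end-to-end for the common four-direction case. This is a minimal sketch: the moving-average smoothing stands in for the filtering steps, and the 50 µV displacement threshold is an illustrative assumption, not a value taken from the cited studies.

```python
import numpy as np

def smooth(x, win=5):
    """Stand-in for steps 1-2: moving-average smoothing of one channel
    (the cited studies typically use band-pass and median filters)."""
    return np.convolve(np.asarray(x, dtype=float), np.ones(win) / win, mode="same")

def saccade_direction(h, v, threshold=50.0):
    """Steps 4-6 in miniature: compare the net horizontal and vertical
    displacements (in µV) against a threshold and return one of four
    directions, or None when no saccade is detected."""
    h, v = smooth(h), smooth(v)
    dh, dv = h[-1] - h[0], v[-1] - v[0]
    if max(abs(dh), abs(dv)) < threshold:
        return None  # displacement too small to be a saccade
    if abs(dh) >= abs(dv):
        return "right" if dh > 0 else "left"
    return "up" if dv > 0 else "down"
```

Fed with short windows of the horizontal and vertical channels, this kind of decision rule yields the four to eight discrete selections that most of the reviewed applications are built on.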
There are several promising directions pertaining to EOG-based communication. First, EOG-based communication needs to be tested in mobile environments to expand its usability; in this case, artifacts and other noise are the significant issues in building a reliable system. Second, new application fields for utilizing EOGs must be developed. For instance, input devices for VR environments would be interesting for both users and researchers. Finally, the use of hybrid systems with other bio-signals could improve the performance of EOG-based communication.

Author Contributions: W.-D.C. authored the paper and revised it.

Funding: This work was supported by a grant from the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (NRF-2017R1D1A1A09000774).

Conflicts of Interest: The author declares no conflict of interest.

References

1. Muensterer, O.J.; Lacher, M.; Zoeller, C.; Bronstein, M.; Kübler, J. Google Glass in pediatric surgery: An exploratory study. Int. J. Surg. 2014, 12, 281–289. [CrossRef][PubMed]
2. Nilsson, S.; Gustafsson, T.; Carleberg, P. Hands free interaction with virtual information in a real environment. PsychNol. J. 2007, 7, 175–196.
3. Dorr, M.; Bohme, M.; Martinetz, T.; Brath, E. Gaze beats mouse: A case study. PsychNol. J. 2007, 7, 16–19.
4. Agustin, J.S.; Mateo, J.C.; Hansen, J.P.; Villanueva, A. Evaluation of the potential of gaze input for game interaction. PsychNol. J. 2009, 7, 213–236.
5. Beukelman, D.; Fager, S.; Nordness, A. Communication support for people with ALS. Neurol. Res. Int. 2011, 2011, 714693. [CrossRef][PubMed]
6. Communication Guide. Available online: http://www.alsa.org/als-care/augmentative-communication/communication-guide.html (accessed on 8 September 2017).
7. Marin, G.; Dominio, F.; Zanuttigh, P. Hand gesture recognition with leap motion and kinect devices. In Proceedings of the 2014 IEEE International Conference on Image Processing, Paris, France, 27–30 October 2014; pp. 1565–1569.
8. Liu, H.; Wang, L. Gesture recognition for human-robot collaboration: A review. Int. J. Ind. Ergon. 2018, 68, 355–367. [CrossRef]
9. Lopes, J.; Simão, M.; Mendes, N.; Safeea, M.; Afonso, J.; Neto, P. Hand/arm gesture segmentation by motion using IMU and EMG sensing. Procedia Manuf. 2017, 11, 107–113. [CrossRef]
10. Han, C.H.; Kim, Y.W.; Kim, D.Y.; Kim, S.H.; Nenadic, Z.; Im, C.H. -based endogenous brain-computer interface for online communication with a completely locked-in patient. J. Neuroeng. Rehabil. 2019, 16, 18. [CrossRef]
11. Lee, K.-R.; Chang, W.-D.; Kim, S.; Im, C.-H. Real-time 'eye-writing' recognition using electrooculogram (EOG). IEEE Trans. Neural Syst. Rehabil. Eng. 2016, 25, 37–48. [CrossRef]
12. Malmivuo, J.; Plonsey, R. Bioelectromagnetism: Principles and Applications of Bioelectric and Biomagnetic Fields; Oxford University Press: New York, NY, USA, 1995.
13. Frishman, L.J. Electrogenesis of the electroretinogram. In Retina; Elsevier: Amsterdam, The Netherlands, 2013; pp. 177–201.
14. Young, L.R.; Sheena, D. Survey of eye movement recording methods. Behav. Res. Methods Instrum. 1975, 7, 397–429. [CrossRef]
15. Barea, R.; Boquete, L.; Mazo, M.; López, E. Wheelchair guidance strategies using EOG. J. Intell. Robot. Syst. Theory Appl. 2002, 34, 279–299. [CrossRef]
16. Fang, F.; Shinozaki, T. Electrooculography-based continuous eye-writing recognition system for efficient assistive communication systems. PLoS ONE 2018, 13, e0192684. [CrossRef]
17. Borghetti, D.; Bruni, A.; Fabbrini, M.; Murri, L.; Sartucci, F. A low-cost interface for control of computer functions by means of eye movements. Comput. Biol. Med. 2007, 37, 1765–1770. [CrossRef][PubMed]
18. Young, L.R.; Sheena, D. Eye-movement measurement techniques. Am. Psychol. 1975, 30, 315–330. [CrossRef][PubMed]

19. Tsai, J.-Z.; Chen, T.-S. Eye-writing communication for patients with amyotrophic lateral sclerosis. In Proceedings of the SIGACCESS Conference on Computers and Accessibility, Pittsburgh, PA, USA, 25–28 October 2009; pp. 239–240.
20. Chang, W.-D.; Cha, H.-S.; Kim, D.Y.; Kim, S.H.; Im, C.-H. Development of an electrooculogram-based eye-computer interface for communication of individuals with amyotrophic lateral sclerosis. J. Neuroeng. Rehabil. 2017, 14, 89. [CrossRef][PubMed]
21. Glenstrup, A.J.; Engell-Nielse, T. Eye Controlled Media: Present and Future State; University of Copenhagen: Copenhagen, Denmark, 1995.
22. Singh, H.; Singh, J. tracking and related issues: A review. Int. J. Sci. Res. Publ. 2012, 2, 1–9.
23. Iwasaki, M.; Kellinghaus, C.; Alexopoulos, A.V.; Burgess, R.C.; Kumar, A.N.; Han, Y.H.; Lüders, H.O.; Leigh, R.J. Effects of eyelid closure, blinks, and eye movements on the electroencephalogram. Clin. Neurophysiol. 2005, 116, 878–885. [CrossRef]
24. Schleicher, R.; Galley, N.; Briest, S.; Galley, L. Blinks and saccades as indicators of fatigue in sleepiness warnings: Looking tired? Ergonomics 2008, 51, 982–1010. [CrossRef]
25. Yamagishi, K.; Hori, J.; Miyakawa, M. Development of EOG-based communication system controlled by eight-directional eye movements. In Proceedings of the 2006 International Conference of the IEEE Engineering in Medicine and Biology Society, New York, NY, USA, 30 August–3 September 2006; pp. 2574–2577.
26. de Visser, B.W.O.; Bour, L.J. Eye and Eyelid Movements during Blinking: An Eye Blink Centre?; Elsevier B.V.: Amsterdam, The Netherlands, 2006; Volume 58, Chapter 3.
27. Paul, G.M.; Cao, F.; Torah, R.; Yang, K.; Beeby, S.; Tudor, J. A smart textile based facial EMG and EOG computer interface. IEEE Sens. J. 2013, 14, 393–400. [CrossRef]
28. Perdiz, J.; Pires, G.; Nunes, U.J. Emotional state detection based on EMG and EOG biosignals: A short survey. In Proceedings of the 2017 IEEE 5th Portuguese Meeting on Bioengineering (ENBENG), Coimbra, Portugal, 16–18 February 2017.
29. Bulling, A.; Roggen, D.; Tröster, G. Wearable EOG goggles: Seamless sensing and context-awareness in everyday environments. J. Ambient Intell. Smart Environ. 2009, 1, 157–171.
30. Yan, M.; Tamura, H.; Tanno, K. A study on gaze estimation system using cross-channels electrooculogram signals. Int. Multiconf. Eng. Comput. Sci. 2014, 1, 112–116.
31. Bulling, A.; Ward, J.A.; Gellersen, H.; Tröster, G. Eye movement analysis for activity recognition using electrooculography. IEEE Trans. Pattern Anal. Mach. Intell. 2011, 33, 741–753. [CrossRef][PubMed]
32. Chambayil, B.; Singla, R.; Jha, R. EEG eye blink classification using neural network. In Proceedings of the World Congress on Engineering 2010, London, UK, 30 June–2 July 2010.
33. Jung, T.-P.; Makeig, S.; Westerfield, M.; Townsend, J.; Courchesne, E.; Sejnowski, T.J. Removal of eye activity artifacts from visual event-related potentials in normal and clinical subjects. Clin. Neurophysiol. 2000, 111, 1745–1758. [CrossRef]
34. Chang, W.-D.; Cha, H.-S.; Kim, K.; Im, C.-H. Detection of eye blink artifacts from single prefrontal channel electroencephalogram. Comput. Methods Programs Biomed. 2016, 124, 19–30. [CrossRef][PubMed]
35. Chang, W.-D.; Cha, H.-S.; Im, C.-H. Removing the interdependency between horizontal and vertical eye-movement components in electrooculograms. Sensors 2016, 16, 227. [CrossRef]
36. Usakli, A.B.; Gurkan, S. Design of a novel efficient human–computer interface: An electrooculogram based virtual keyboard. IEEE Trans. Instrum. Meas. 2010, 59, 2099–2108. [CrossRef]
37. LaCourse, J.R.; Hludik, F.C.J. An eye movement communication-control system for the disabled. IEEE Trans. Biomed. Eng. 1990, 37, 1215–1220. [CrossRef]
38. Kim, M.R.; Yoon, G. Control signal from EOG analysis and its application. Int. J. Electr. Comput. Electron. Commun. Eng. 2013, 7, 864–867.
39. Pettersson, K.; Jagadeesan, S.; Lukander, K.; Henelius, A.; Haeggström, E.; Müller, K. Algorithm for automatic analysis of electro-oculographic data. Biomed. Eng. Online 2013, 12, 110. [CrossRef]
40. Hori, J.; Sakano, K.; Saitoh, Y. Development of communication supporting device controlled by eye movements and voluntary eye blink. In Proceedings of the 26th International Conference of the IEEE Engineering in Medicine and Biology Society, San Francisco, CA, USA, 1–5 September 2004; Volume 6, pp. 4302–4305.
41. Rusydi, M.; Sasaki, M.; Ito, S. Affine transform to reform pixel coordinates of EOG signals for controlling robot manipulators using gaze motions. Sensors 2014, 14, 10107–10123. [CrossRef]

42. Oh, S.; Kumar, P.S.; Kwon, H.; Varadan, V.K. Wireless brain-machine interface using EEG and EOG: Brain wave classification. In Proceedings of the Nanosensors, Biosensors, and Info-Tech Sensors and Systems, San Diego, CA, USA, 11–15 March 2012.
43. Kanoh, S.; Shioya, S.; Inoue, K.; Kawashima, R. Development of an eyewear to measure eye and body movements. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Milan, Italy, 25–29 August 2015; Volume 1, pp. 2267–2270.
44. Favre-Felix, A.; Graversen, C.; Dau, T.; Lunner, T. Real-time estimation of eye gaze by in-ear electrodes. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Seogwipo, Korea, 11–15 July 2017; pp. 4086–4089.
45. Yagi, T.; Kuno, Y.; Koga, K.; Mukai, T. Drifting and blinking compensation in electro-oculography (EOG) eye-gaze interface. In Proceedings of the 2006 IEEE International Conference on Systems, Man and Cybernetics, Taipei, Taiwan, 8–11 October 2006; Volume 4, pp. 3222–3226.
46. Bittium NeurOne™ Tesla. Available online: https://www.bittium.com/medical/bittium-neurone (accessed on 1 April 2019).
47. Wijesoma, W.S.; Wee, K.S.; Wee, O.C.; Balasuriya, A.P.; San, K.T.; Soon, K.K. EOG based control of mobile assistive platforms for the severely disabled. In Proceedings of the IEEE Conference on Robotics and Biomimetics, Shatin, China, 5–9 July 2005; pp. 490–494.
48. BioPac Electrooculogram Amplifier. Available online: https://www.biopac.com/product/electrooculogram-amplifier/ (accessed on 31 March 2019).
49. NF Corporation. Available online: http://www.nfcorp.co.jp/english/index.html (accessed on 1 April 2019).
50. BlueGain EOG Biosignal Amplifier. Available online: https://www.crsltd.com/tools-for-vision-science/eye-tracking/bluegain-eog-biosignal-amplifier/ (accessed on 31 March 2019).
51. National Instruments. Available online: http://www.ni.com/en-us.html (accessed on 31 March 2019).
52. JINS MEME. Available online: https://jins-meme.com/en/ (accessed on 31 March 2019).
53. Deng, L.Y.; Hsu, C.-L.; Lin, T.-C.; Tuan, J.-S.; Chang, S.-M. EOG-based human–computer interface system development. Expert Syst. Appl. 2010, 37, 3337–3343. [CrossRef]
54. Choudhury, S.R.; Venkataramanan, S.; Nemade, H.B.; Sahambi, J.S. Design and development of a novel EOG biopotential amplifier. Int. J. Bioelectromagn. 2005, 7, 271–274.
55. Ding, Q.; Tong, K.; Li, G. Development of an EOG (electro-oculography) based human-computer interface. In Proceedings of the International Conference of the IEEE Engineering in Medicine and Biology Society, Shanghai, China, 17–18 January 2006; Volume 7, pp. 6829–6831.
56. Bulling, A. Eye Movement Analysis for Context Inference and Cognitive-Awareness: Wearable Sensing and Activity Recognition Using Electrooculography; ETH Zurich: Zurich, Switzerland, 2010.
57. Venkataramanan, S.; Prabhat, P.; Choudhury, S.R.; Nemade, H.B.; Sahambi, J.S. Biomedical instrumentation based on electrooculogram (EOG) signal processing and application to a hospital alarm system. In Proceedings of the 2005 International Conference on Intelligent Sensing and Information Processing, Chennai, India, 4–7 January 2005; pp. 535–540.
58. Banerjee, A.; Datta, S.; Pal, M.; Konar, A.; Tibarewala, D.N.; Janarthanan, R. Classifying electrooculogram to detect directional eye movements. Procedia Technol. 2013, 10, 67–75. [CrossRef]
59. Barea, R.; Boquete, L.; López, E.; Mazo, M. Guidance of a wheelchair using electrooculography. In Proceedings of the 3rd IMACS International Multiconference Circuits, Systems, Communications and Computers, Athens, Greece, 4–8 July 1999; pp. 2421–2426.
60. Iáñez, E.; Azorin, J.M.; Perez-Vidal, C. Using eye movement to control a computer: A design for a lightweight electro-oculogram electrode array and computer interface. PLoS ONE 2013, 8, e67099. [CrossRef] [PubMed]
61. Yan, M.; Go, S.; Tamura, H. Communication system using EOG for persons with disabilities and its judgment by EEG. Artif. Life Robot. 2014, 19, 89–94. [CrossRef]
62. Käthner, I.; Kübler, A.; Halder, S. Comparison of eye tracking, electrooculography and an auditory brain-computer interface for binary communication: A case study with a participant in the locked-in state. J. Neuroeng. Rehabil. 2015, 12, 76. [CrossRef] [PubMed]
63. Tsai, J.-Z.; Lee, C.-K.; Wu, C.-M.; Wu, J.-J.; Kao, K.-P. A feasibility study of an eye-writing system based on electro-oculography. J. Med. Biol. Eng. 2008, 28, 39–46.

64. Yan, M.; Tamura, H.; Tanno, K. Development of mouse cursor control system using electrooculogram signals and its applications in revised Hasegawa dementia scale task. In Proceedings of the 2012 World Automation Congress, Puerto Vallarta, Mexico, 24–28 June 2012; pp. 1–6.
65. Kumar, D.; Sharma, A. Electrooculogram-based virtual reality game control using blink detection and gaze calibration. In Proceedings of the 2016 International Conference on Advances in Computing, Communications and Informatics, Jaipur, India, 21–24 September 2016; pp. 2358–2362.
66. Xiao, J.; Qu, J.; Li, Y. An electrooculogram-based interaction method and its music-on-demand application in a virtual reality environment. IEEE Access 2019, 7, 22059–22070. [CrossRef]
67. Nolan, H.; Whelan, R.; Reilly, R.B. FASTER: Fully automated statistical thresholding for EEG artifact rejection. J. Neurosci. Methods 2010, 192, 152–162. [CrossRef] [PubMed]
68. Aarabi, A.; Kazemi, K.; Grebe, R.; Moghaddam, H.A.; Wallois, F. Detection of EEG transients in neonates and older children using a system based on dynamic time-warping template matching and spatial dipole clustering. Neuroimage 2009, 48, 50–62. [CrossRef] [PubMed]
69. Delorme, A.; Makeig, S.; Sejnowski, T. Automatic artifact rejection for EEG data using high-order statistics and independent component analysis. In Proceedings of the International Workshop on ICA, San Diego, CA, USA, 9–13 December 2001; pp. 457–462.
70. Durka, P.; Klekowicz, H.; Blinowska, K.; Szelenberger, W.; Niemcewicz, S. A simple system for detection of EEG artifacts in polysomnographic recordings. IEEE Trans. Biomed. Eng. 2003, 50, 526–528. [CrossRef] [PubMed]
71. Chang, W.-D.; Im, C.-H. Enhanced template matching using dynamic positional warping for identification of specific patterns in electroencephalogram. J. Appl. Math. 2014, 2014, 528071. [CrossRef]
72. Chang, W.-D.; Lim, J.-H.; Im, C.-H. An unsupervised eye blink artifact detection method for real-time electroencephalogram processing. Physiol. Meas. 2016, 37, 401–417. [CrossRef] [PubMed]
73. Hsu, C.-W.; Lin, C.-J. A comparison of methods for multiclass support vector machines. IEEE Trans. Neural Netw. 2002, 13, 415–425. [PubMed]
74. Längkvist, M.; Karlsson, L.; Loutfi, A. A review of unsupervised feature learning and deep learning for time-series modeling. Pattern Recognit. Lett. 2014, 42, 11–24. [CrossRef]
75. Gales, M.; Young, S. The application of hidden Markov models in speech recognition. Found. Trends Signal Process. 2008, 1, 195–304. [CrossRef]
76. Wang, X.; Xia, M.; Cai, H.; Gao, Y.; Cattani, C. Hidden-Markov-models-based dynamic hand gesture recognition. Math. Probl. Eng. 2012, 2012, 986134. [CrossRef]
77. Ramli, R.; Arof, H.; Ibrahim, F.; Mokhtar, N.; Idris, M.Y.I. Using finite state machine and a hybrid of EEG signal and EOG artifacts for an asynchronous wheelchair navigation. Expert Syst. Appl. 2015, 42, 2451–2463. [CrossRef]
78. Ma, J.; Zhang, Y.; Cichocki, A.; Matsuno, F. A novel EOG/EEG hybrid human-machine interface adopting eye movements and ERPs: Application to robot control. IEEE Trans. Biomed. Eng. 2015, 62, 876–889. [CrossRef]
79. Zhang, Y.; Zhao, Q.; Jin, J.; Wang, X.; Cichocki, A. A novel BCI based on ERP components sensitive to configural processing of human faces. J. Neural Eng. 2012, 9, 026018. [CrossRef]
80. Chang, W.-D.; Cha, H.-S.; Lee, C.; Kang, H.-C.; Im, C.-H. Automatic identification of interictal epileptiform discharges in secondary generalized epilepsy. Comput. Math. Methods Med. 2016, 2016, 8701973. [CrossRef] [PubMed]
81. Chang, W.-D.; Shin, J. DPW approach for random forgery problem in online handwritten signature verification. In Proceedings of the 2008 Fourth International Conference on Networked Computing and Advanced Information Management, Gyeongju, Korea, 2–4 September 2008; Volume 1, pp. 347–352.

© 2019 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).