(19) United States
(12) Patent Application Publication (10) Pub. No.: US 2016/0070439 A1
Bostick et al. (43) Pub. Date: Mar. 10, 2016

(54) ELECTRONIC COMMERCE USING AUGMENTED REALITY GLASSES AND A SMART WATCH

(71) Applicant: International Business Machines Corporation, Armonk, NY (US)

(72) Inventors: James E. Bostick, Cedar Park, TX (US); John M. Ganci, Jr., Cary, NC (US); Sarbajit K. Rakshit, Kolkata (IN); Craig M. Trim, Sylmar, CA (US)

(21) Appl. No.: 14/477,127

(22) Filed: Sep. 4, 2014

Publication Classification

(51) Int. Cl.: G06F 3/0484 (2006.01); G06Q 30/06 (2006.01); G06T 19/00 (2006.01); G06F 3/01 (2006.01); G06F 3/16 (2006.01)

(52) U.S. Cl.: CPC G06F 3/04842 (2013.01); G06F 3/017 (2013.01); G06F 3/013 (2013.01); G06F 3/167 (2013.01); G06T 19/006 (2013.01); G06Q 30/0641 (2013.01)

(57) ABSTRACT

In an approach for electronic commerce using augmented reality glasses and a smart watch, a computer receives a configuration associating a user gesture to a command. The computer determines whether a user of the augmented reality glasses selects an object in a first electronic commerce environment and, responsive to determining the user selects an object, the computer determines whether the user performs a first gesture detectable by a smart watch. The computer then determines whether the first gesture matches the user gesture and, responsive to determining the first gesture matches the user gesture, the computer performs the associated command.
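The flow summarized in the abstract — receive a configuration associating user gestures with commands, then perform the command whose configured gesture matches the detected one — can be sketched in a few lines of Python. Every function and gesture name below is an illustrative assumption, not taken from the patent:

```python
# Sketch of the abstract's flow: a configuration associates user gestures
# with commands; when a gesture detected by the smart watch matches a
# configured user gesture, the associated command is performed.
# All identifiers here are invented placeholders.

def receive_configuration():
    # Associate user gestures with commands (hypothetical defaults).
    return {"finger_tap": "select_object", "finger_slide": "scroll_next_product"}

def match_gesture(config, detected_gesture):
    # Return the command associated with the detected gesture, or None
    # if the gesture matches no configured user gesture.
    return config.get(detected_gesture)

config = receive_configuration()
print(match_gesture(config, "finger_tap"))   # select_object
print(match_gesture(config, "shrug"))        # None
```

A dictionary lookup is the simplest possible stand-in for the matching step; the patent leaves the matching mechanism open, so anything from exact lookup to sensor-signal classification would fit.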


[FIG. 1 (Sheet 1 of 3): functional block diagram of augmented reality data processing environment 100 — AR glasses 120, containing eCommerce application 121, eCommerce database 125 and user interface 126; smart watch 130, containing sensor 132 and bar code scanner 133; and server 140, containing database 145; all connected over network 110.]
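As a rough sketch, the FIG. 1 environment can be modeled as plain data objects. The reference numerals come from the figure; the class names and fields are hypothetical stand-ins, not part of the patent:

```python
from dataclasses import dataclass, field

# Illustrative model of the FIG. 1 components; names are assumptions.

@dataclass
class SmartWatch:                      # smart watch 130
    sensors: list = field(default_factory=lambda: ["sensor 132"])
    barcode_scanner: str = "barcode scanner 133"

@dataclass
class ARGlasses:                       # AR glasses 120
    application: str = "E-commerce application 121"
    database: str = "E-commerce database 125"
    user_interface: str = "UI 126"

@dataclass
class Server:                          # server 140
    database: str = "database 145"

# Network 110 connects the three devices.
network_110 = {
    "AR glasses 120": ARGlasses(),
    "smart watch 130": SmartWatch(),
    "server 140": Server(),
}
print(sorted(network_110))   # ['AR glasses 120', 'server 140', 'smart watch 130']
```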

[FIG. 2 (Sheet 2 of 3): flowchart 200 — 202: receives a configuration of a command associated with a user gesture; 204: determines whether an object is selected; determines the object; stores the data viewed by the user; receives a command based on a detected user gesture; 216: determines whether the received command proceeds to a shopping cart — if so, proceeds to shopping cart; otherwise, executes the determined command.]
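The decision structure of flowchart 200 can be sketched as a single function. Step numbers 202, 204 and 216 appear in the figure; every other identifier below is an invented placeholder:

```python
# Sketch of flowchart 200 (FIG. 2); names other than the step numbers
# are illustrative assumptions, not from the patent.

def run_flowchart(object_selected, detected_gesture, config):
    # Step 202: a configuration of a command associated with a user
    # gesture has been received (passed in here as `config`).
    # Decision 204: determine whether an object is selected.
    if not object_selected:
        return "end"
    # Determining the object and storing the data viewed by the user
    # are elided in this sketch.
    # Receive a command based on the detected user gesture.
    command = config.get(detected_gesture)
    # Decision 216: does the received command proceed to a shopping cart?
    if command == "add_to_cart":
        return "proceeds to shopping cart"
    return f"executes {command}"

config = {"finger_tap": "add_to_cart", "finger_slide": "scroll_next_product"}
print(run_flowchart(True, "finger_tap", config))    # proceeds to shopping cart
print(run_flowchart(False, "finger_tap", config))   # end
```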

[FIG. 3 (Sheet 3 of 3): block diagram of computer components — processor(s), memory, persistent storage, communications unit, I/O interface(s), display, and external device(s).]

ELECTRONIC COMMERCE USING AUGMENTED REALITY GLASSES AND A SMART WATCH

BACKGROUND OF THE INVENTION

[0001] The present invention relates generally to the field of augmented reality glasses, and more particularly to the use of augmented reality glasses and a smart watch for electronic commerce.

[0002] Augmented reality (AR) is a live, direct or indirect view of a physical, real-world environment whose elements are augmented by computer-generated sensory input such as visually perceivable content, including graphics, text, video, global position satellite (GPS) data or sound. Augmentation is conventionally in real-time and in semantic context with environmental elements, for example, the addition of current, real-time sports scores to a non-related news feed. Advanced augmentation such as the use of computer vision, speech recognition and object recognition allows information about the surrounding real-world to be interactive and manipulated digitally. In many cases, information about the environment is visually overlaid on the images of the perceived real-world.

[0003] Some augmented reality devices rely, at least in part, on a head-mounted display, for example, with sensors for sound recognition. An example of existing head-mounted display technology or augmented reality glasses (AR glasses) uses transparent glasses which may include an electro-optic device and a pair of transparent lenses, which display information or images over a portion of a user's visual field while allowing the user to perceive the real-world. The displayed information and/or images can provide supplemental information about a user's environment and objects in the user's environment, in addition to the user's visual and audio perception of the real-world.

SUMMARY

[0004] According to aspects of the present invention, a method, a computer program product, and a computer system are disclosed for electronic commerce using augmented reality glasses and a smart watch. A computer receives a configuration associating a user gesture to a command. The computer determines whether a user of the augmented reality glasses selects an object in a first electronic commerce environment and, responsive to determining the user selects an object, the computer determines whether the user performs a first gesture detectable by a smart watch. The computer then determines whether the first gesture matches the user gesture and, responsive to determining the first gesture matches the user gesture, the computer performs the associated command.

BRIEF DESCRIPTION OF THE DRAWINGS

[0005] FIG. 1 is a functional block diagram illustrating an augmented reality data processing environment, in accordance with an embodiment of the present invention;

[0006] FIG. 2 is a flowchart depicting operational steps of an electronic commerce application on augmented reality glasses for electronic commerce using augmented reality glasses and a smart watch within the augmented reality data processing environment of FIG. 1, in accordance with an embodiment of the present invention; and

[0007] FIG. 3 depicts a block diagram of components of the augmented reality glasses executing the electronic commerce application, in accordance with an embodiment of the present invention.

DETAILED DESCRIPTION

[0008] Embodiments of the present invention recognize that several electronic commerce (E-commerce) applications for augmented reality glasses (AR glasses) have been developed using tactile and audio commands. Touch screens, or touch sensors on the AR glasses, may be used in conjunction with or as an alternative to audio sensors and speech recognition to command AR glasses. Embodiments of the present invention utilize gaze focal point detection to identify an object by identifying a focal point in the user's field of vision. Furthermore, embodiments of the invention use a smart watch or other wearable computing device with one or more sensors which can detect one or more muscle movements for a gesture such as a finger motion or a hand gesture. The smart watch sends sensor data for a detected gesture to AR glasses. The sensor data may include detected muscle movement data for a gesture. The gesture correlated to the sensor data of the muscle movements received by AR glasses may be configured to correspond to a user command. For example, a gesture associated with sensor data for one or more muscle movements may be configured to select an object or product.

[0009] Embodiments of the invention provide a capability to identify a selected object or product in an augmented reality view, such as an internet site or an on-line store database viewed using AR glasses. Embodiments of the present invention provide the ability to view or scan data of a product in a real world environment such as a brick and mortar store. Additionally, embodiments of the present invention provide the ability to capture an image of an object in a real world environment such as a brick and mortar store for possible selection, identification, shopping cart addition, and other object related actions. Furthermore, embodiments of the present invention provide the capability to search product data, to search product attributes, to search multiple websites, local or on-line, and real world environments, to select an object or product, to move an object or product to an on-line or augmented reality shopping cart for purchase, and to store and retrieve selected products and search results using AR glasses and a smart watch. Additionally, embodiments of the present invention provide a memory management function for recall of data on previously viewed or searched objects or products such as product images, product identification, product attributes, product type and product location.

[0010] The present invention will now be described in detail with reference to the Figures. FIG. 1 is a functional block diagram illustrating an augmented reality data processing environment, generally designated 100, in accordance with one embodiment of the present invention. FIG. 1 provides only an illustration of one implementation of the present invention and does not imply any limitations with regard to the environment in which different embodiments may be implemented. Modifications to the depicted environment may be made by those skilled in the art without departing from the scope of the invention as recited by the claims.

[0011] Augmented reality data processing environment 100 includes augmented reality glasses (AR glasses) 120, smart watch 130 and server 140, all connected over network 110. Network 110 can be, for example, a telecommunications

network, a local area network (LAN), a wide area network (WAN), a virtual local area network (VLAN) such as the Internet, or any combination of the three, and can include wired, wireless, or fiber optic connections. In general, network 110 can be any combination of connections and protocols that will support communications between AR glasses 120, smart watch 130 and server 140.

[0012] Server 140 may be a management server, a web server, or any other electronic device or computing system capable of receiving and sending data. In other embodiments, server 140 may represent a server computing system utilizing multiple computers as a server system, which may be a distributed computing environment created by clustered computers and components acting as a single pool of seamless resources such as a cloud computing environment. In another embodiment, server 140 may be a computer, a desktop computer, a personal computer (PC), a laptop computer, a personal digital assistant (PDA), a smart phone, or any programmable electronic device capable of communicating with AR glasses 120 and smart watch 130 via network 110. Server computer 140 includes database 145. While depicted as a single server and a single database in FIG. 1, in some embodiments, server 140 may include multiple databases.

[0013] Database 145 resides on server 140. In an embodiment, database 145 may reside on AR glasses 120. In another embodiment, database 145 may reside on smart watch 130 or another device (not shown) within augmented reality data processing environment 100 accessible via network 110. Database 145 may be implemented with any type of storage device capable of storing data that may be accessed and utilized by server 140, such as a database server, a hard disk drive, or a flash memory. In other embodiments, database 145 may represent multiple storage devices within server 140. In an embodiment, database 145 is a store database such as an on-line product catalog. Database 145 may include product images, product names, product specifications or product attributes including product availability and barcode information or a product barcode. An application within augmented reality data processing environment 100, for example, E-commerce application 121 on AR glasses 120, may access database 145, which may be any database including any store database, multi-vendor database, multiple advertisement database, or product database. E-commerce application 121 may retrieve information on an object or product from database 145 via network 110.

[0014] AR glasses 120 may be an augmented reality computing device, a wearable computer, a desktop computer, a laptop computer, a tablet computer, a smart phone, or any programmable electronic device capable of communicating with smart watch 130 and server 140 via network 110 and with various components and devices within augmented reality data processing environment 100. In the exemplary embodiment, AR glasses 120 are an augmented reality computing device implemented as a wearable computer. Wearable computers such as AR glasses 120 are especially useful for applications that require more complex computational support than just hardware coded logics. In general, AR glasses 120 represents a programmable electronic device, a computing device or a combination of programmable electronic devices capable of executing machine readable program instructions and communicating with other computing devices via a network, such as network 110. Digital image capture technology such as a digital camera or image scanning technology may be provided with AR glasses 120 in addition to digital image projection to the user in AR glasses 120, creating the augmented reality standard in augmented reality device technology. AR glasses 120 may be capable of sending and receiving data such as sensor data from smart watch 130 via network 110. AR glasses 120 include E-commerce application 121, E-commerce database 125, and user interface (UI) 126. AR glasses 120 may include internal and external hardware components, as depicted and described in further detail with respect to FIG. 3.

[0015] E-commerce application 121 uses eye gaze data received by E-commerce application 121 from AR glasses 120 to track user eye movement and uses data from one or more sensors included in smart watch 130 to capture user gestures or motions such as a finger motion or an arm motion associated with smart watch 130. In an exemplary embodiment, E-commerce application 121 allows a user to select an object using gaze focal point tracker capability. E-commerce application 121 may select an object with a gaze focal point tracker which uses the direction of a user gaze and binocular vision principles to extrapolate a focal point of the user's vision. E-commerce application 121 may receive sensor data from sensor 132 on smart watch 130 for muscle movements associated with a gesture such as bending a finger, turning a wrist or curling all fingers. E-commerce application 121 may use a gesture associated with muscle movements detected by a sensor, such as sensor 132 on smart watch 130, to configure a user identified command or action such as "move to shopping cart" or "select object". E-commerce application 121 provides a method for on-line and in-store shopping using an augmented reality data processing environment to enhance on-line and in-store shopping. The user initially configures E-commerce application 121 to receive sensor data of movements associated with a gesture and use the gesture to perform a command such as "scroll to the next product" or "drag and move the product to the shopping cart". E-commerce application 121 can receive sensor data from sensor 132 in smart watch 130 of a gesture and executes the corresponding command, for example, "add to shopping cart" or "scroll to next product". In addition, E-commerce application 121 may store in E-commerce database 125 the data of objects viewed by the user. The data may include images of the objects selected, the attributes of the object selected and the location of an object viewed and selected by the user of AR glasses 120. E-commerce application 121 may retrieve from E-commerce database 125 data regarding the object selected, including the attributes of a previously viewed object and the location of a previously viewed object, from the currently accessed database or from a previously accessed database.

[0016] E-commerce application 121 provides the user the capability to select another or second object or to search stored data on previously viewed or selected objects. The object may be a product, a person, a building, product data or other object, for example. The object discussed in the following embodiments of the invention will focus on an object such as a consumer product; however, the object should not be limited to "products" but may include other objects. While the method discussed herein focuses on on-line and in-store shopping, some embodiments of the present invention may be applied to other areas of technology. For example, an object selected may be a building that may be selected by gaze focal point tracking and the configured gesture may be for identification of, for example, a name of the object, other object

information identification such as a history of the building, or an identification of information from a social network regarding the selected object.

[0017] E-commerce database 125 resides on AR glasses 120. In an embodiment, E-commerce database 125 may reside on smart watch 130. In another embodiment, E-commerce database 125 may reside on server 140 or another device (not shown) in augmented reality data processing environment 100. E-commerce database 125 stores data regarding the identification of, and related information of, objects, products or locations that the user of AR glasses 120 may access or view. E-commerce application 121 may retrieve information on objects previously viewed from E-commerce database 125. E-commerce database 125 may receive updates, from E-commerce application 121, regarding new objects viewed, products or locations viewed. E-commerce database 125 may also receive, via network 110, additional information related to objects, products and locations from database 145. For example, E-commerce application 121 may store updates or additional information from database 145 to E-commerce database 125. In another example, server 140 may send updates or additional information to E-commerce database 125. Database 145, located on server 140, and database 125 on AR glasses 120 may be implemented with any type of storage device capable of storing data that may be accessed and utilized by server 140, such as a database server, a hard disk drive, or a flash memory.

[0018] UI 126 provides an interface between a user and a computer program, such as E-commerce application 121, and may utilize input such as sensor data from smart watch 130. A user interface, such as UI 126, may be an interface, a set of commands, a data input such as sensor data generated in response to a user gesture, a voice signal input using speech recognition, or a touch input using a touch screen or button, through which a user communicates the control sequences or commands to a program, and the interface can provide the information (such as graphic, text, and sound) that a program presents to a user. In one embodiment, UI 126 may be the interface between AR glasses 120 and E-commerce application 121. In other embodiments, UI 126 provides an interface between E-commerce application 121 and database 145, which resides on server 140. In one embodiment, UI 126 may be the interface between AR glasses 120 and smart watch 130. In an embodiment, the user interface input technique may utilize data received from one or more sensors which may be located on smart watch 130. In another embodiment, the user interface input technique may utilize barcode scan data received from a barcode scanner on smart watch 130. In an embodiment, the user input technique may utilize data received from sensors in AR glasses 120. In another embodiment, the user interface input technique may utilize data received from one or more tactile sensors such as a touch screen, a button, or a touch sensitive area on smart watch 130. Additionally, audio commands or speech recognition commonly applied in AR glasses 120 may be used by UI 126 to receive user input that may be used, for example, to configure E-commerce application 121.

[0019] Smart watch 130 may be a wearable computer, a personal digital assistant, a smart phone or a watch with sensing capability, such as with a motion sensor or a barcode scanner capable of communication with AR glasses 120. Smart watch 130 may be, for example, a hand gesture capturing device, such as a computing device capable of detecting motion or movement. Wearable computers are electronic devices that may be worn by the user under, with or on top of clothing, as well as in glasses, jewelry, hats, or other accessories. Smart watch 130 may be any other electronic device with sensing capability including hand gesture sensing, muscle movement detection, gesture sensing, barcode scanning and communication capability such as the ability to send and receive data over network 110 or wirelessly over a wireless local area network (WLAN) to AR glasses 120. In one embodiment, smart watch 130, with communication capability with an E-commerce application, such as E-commerce application 121, may include only a sensor 132. In another embodiment, smart watch 130 may include one or more sensors. As depicted, smart watch 130 includes sensor 132 and barcode scanner 133.

[0020] Sensor 132 may provide the capability to identify movement, for example, finger, hand, arm or muscle movement or a series of movements used in a user gesture such as a finger tapping movement. Barcode scanner 133 on smart watch 130 may be used, for example, to scan a barcode of a product in a brick and mortar store. Sensor data and barcode scan data may be sent over network 110 to AR glasses 120 or may be sent wirelessly via a wireless local area network (WLAN). In another embodiment, smart watch 130 may be a wearable computer including, for example, E-commerce application 121 and E-commerce database 125, which can send and receive data from AR glasses 120 and server 140 and may include components and capabilities discussed with reference to FIG. 3. In an embodiment, smart watch 130 may be a bracelet, a wristband, one or more rings, or other apparel, decorative item or jewelry with sensors and data transmission that may or may not include barcode scanner 133. In some embodiments, smart watch 130 includes a touch screen, button or other tactile activated area for user input to smart watch 130 for communication to E-commerce application 121.

[0021] Sensor 132 resides in smart watch 130 and may be any device capable of capturing a user gesture such as a hand gesture, a finger movement, an arm movement, a muscle movement or other user movement associated with the sensor location. Sensor 132 may consist of one or more sensors or other devices capable of capturing a user's movement such as a finger, a hand, a muscle movement, an arm movement or a combination of one or more movements associated with a user gesture. Sensor 132 provides sensor data which may be electrical potential data, motion data, or any similar digital data associated with a user gesture as captured by one or more sensors such as sensor 132. In an embodiment, sensor 132 may sense the electrical activity produced by the user's muscles, for example, similar to sensors used in electromyography. In one embodiment, sensor 132 may be a sensitive motion sensor capable of detecting both fine motions created by a finger gesture and a gross movement such as an arm movement. In an exemplary embodiment, sensor 132 may be located on the user's wrist in smart watch 130. Sensor data for a user's gesture or motion may be sent to E-commerce application 121 via network 110 or a wireless local area network (WLAN).

[0022] As discussed above, barcode scanner 133 resides in smart watch 130. Barcode scanner 133 may be used to scan a product barcode to select and retrieve information on the scanned product when a user is in a brick and mortar store. Barcode scanner 133 may scan a product barcode and send the barcode scan data to E-commerce application 121 using network 110 or a wireless local area network. E-commerce application 121 may use the received barcode scan data to identify attributes of the product using database 145.
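Paragraph 0021 describes sensor 132 sensing the electrical activity of the user's muscles, similar to electromyography, and distinguishing fine finger motions from gross arm movements. A minimal thresholding sketch follows; the thresholds, sample format and gesture labels are invented for illustration and are not from the patent:

```python
# Illustrative sketch of classifying sensor 132's muscle-activity samples
# into a gesture. Thresholds and gesture names are assumptions.

FINE_MOTION_MAX = 0.3    # peak activity below this: a fine motion (finger tap)
GROSS_MOTION_MIN = 0.7   # peak activity at or above this: a gross motion (arm)

def classify_gesture(samples):
    # `samples` is a sequence of normalized muscle-activity readings.
    peak = max(abs(s) for s in samples)
    if peak >= GROSS_MOTION_MIN:
        return "arm_movement"
    if peak >= FINE_MOTION_MAX:
        return None          # ambiguous mid-range activity: no gesture reported
    return "finger_tap" if peak > 0 else None

print(classify_gesture([0.05, 0.1, 0.2]))   # finger_tap
print(classify_gesture([0.2, 0.9, 0.4]))    # arm_movement
```

A real implementation would classify the waveform shape rather than a single peak, but the sketch shows where the fine/gross distinction from paragraph 0021 would live.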
E-commerce application 121 may send the barcode scan data directly, using a wireless local area network, to a local in-store database or in-store website, or via network 110 to an internet website with access to a store database for the brick and mortar store where the product is residing. In an embodiment, barcode scanner 133 may reside on AR glasses 120. In one embodiment, barcode scan data scanned by barcode scanner 133 may be sent to E-commerce database 125 by E-commerce application 121. In an embodiment, barcode scanner 133 may reside on another device (not shown) capable of communicating with E-commerce application 121, E-commerce database 125, or database 145.

[0023] FIG. 2 is a flowchart 200 depicting operational steps of E-commerce application 121, on AR glasses 120 within augmented reality data processing environment 100 of FIG. 1, for electronic commerce using AR glasses and a smart watch, in accordance with an embodiment of the present invention.

[0024] In step 202, E-commerce application 121 receives a configuration of a command associated with a user gesture. A configuration of a command corresponding to a user gesture may be, for example, created by the user upon initialization of E-commerce application 121, stored by the user prior to use of E-commerce application 121, or the configuration may be a default setting for use of E-commerce application 121. The exemplary embodiment of the present invention includes smart watch 130 with one or more sensors to detect and track one or more gestures. Using E-commerce application 121, the user can configure a gesture to correspond to a command. Common tasks used in E-commerce, such as drag and drop of a product to add, change a quantity of, or remove the product from a virtual shopping cart, and complete a purchase, for example, may be initially configured and correlated by the user to specific gestures detected by sensor 132. When the user initially configures E-commerce application 121, smart watch 130 may send the sensor data, for example, muscle movement data for a user gesture, to E-commerce application 121. The one or more sensors, such as sensor 132, which may be located on the watch band of smart watch 130, can detect one or more movements (e.g., finger, hand, arm or muscle movements) which correspond to a gesture configured for an action in E-commerce application 121. In another embodiment, upon receiving sensor data for a gesture from smart watch 130, the user may direct E-commerce application 121 to configure a command or an action to be executed in response to the gesture. In an embodiment, sensors may be used in AR glasses 120 to detect a head movement, which may correspond to a command in E-commerce application 121. The gesture may be configured by E-commerce application 121 according to a user input, which may be an audio input or voice input received by AR glasses 120 using speech recognition software or natural language processing algorithms, or a text input, such as a text, a note or another type of user input from another electronic or computing device which may be a laptop, a smart phone, or a wearable computer, for example, smart watch 130. For example, a user may configure E-commerce application 121 to use a gesture to select an object. In another embodiment, E-commerce application 121 may retrieve information for associating a command with a user gesture from a database, for example, E-commerce database 125 or database 145.

[0025] When configuring a gesture to a command, a user may use a gesture such as a tapping motion of a pointer finger and say "select" to configure E-commerce application 121 to select an object currently viewed or determined to be selected by the user's focal point by gaze focal point tracker. The sensor data, which may include the muscle movements associated with a pointer finger tapping movement, may be configured such that E-commerce application 121 selects an object when the gesture, in this case, a pointer finger tap, is detected in sensor data from smart watch 130. The sensor data may include muscle movement for a gesture of the user's body such as a finger movement or a hand movement. In another example, E-commerce application 121 may be configured to scroll through an on-line website to search, for example, the website or a store database, which may include product images, product descriptions, order information, product price or product specification data, with a gesture such as a sliding motion of the user's left pointer finger.

[0026] In decision block 204, E-commerce application 121 determines whether an object is selected. In an embodiment, when a user looks at or focuses on an object in an internet site such as a store catalog with AR glasses 120, E-commerce application 121, using a gaze focal point tracker, determines the object the user's gaze is focused on. E-commerce application 121 with a gaze focal point tracker utilizes input from AR glasses 120 on the spacing of the user's eyes or the spacing of the user's eye pupils, in conjunction with the direction of the user's gaze, to extrapolate a focal point of the user's gaze. The gaze focal point tracker, using detected eye or pupil spacing, direction of view and binocular vision principles, may identify the object in a locus or a focal point of the user's vision. In some embodiments, the user may open a web browser to view objects in a first electronic commerce vendor environment, which may be an internet site where the object may be an image of an object or an image of a product viewed in the website using AR glasses 120. The object viewed, which may be selected, may also be text or words in an on-line internet site or an on-line product catalog. In another embodiment, the object viewed for possible selection could be a real-world product (e.g., on a store shelf in a brick and mortar store).

[0027] E-commerce application 121 may determine the object is selected in one or more ways (the "YES" branch of decision block 204). In an embodiment, E-commerce application 121 with gaze focal point tracker may be configured to select an object based on a threshold period of time the user focuses on the object. For example, an object identified by gaze focal point tracker as the focal point of the user's gaze may be selected by E-commerce application 121 when the user views the object for five seconds. In another embodiment, E-commerce application 121 may be initially configured to select an object in the user's focal point of vision only when object selection is requested by the user using a voice command (for example, "select product") or a gesture. In the exemplary embodiment, the user may, for example, request an object selection by a gesture recorded by the one or more sensors in smart watch 130. In one embodiment, the user may also configure E-commerce application 121 to select an object using a gesture such as a nod of the head detected by sensors in AR glasses 120. In another embodiment, a user may use a tactile object selection method to request an object selection by using a touchscreen, a button or an active area on smart watch 130 to identify object selection to E-commerce application 121. In one embodiment, an object in the real world, which may be a product in a store, may be selected by digitally capturing an image of the product using AR glasses

120 (e.g. using image scanning or digital camera capability in AR glasses). In an embodiment, E-commerce application 121 may select an object in a brick and mortar store when E-commerce application 121 receives data from a barcode scan of a product in a store from barcode scanner 133 included within smart watch 130.

[0028] E-commerce application 121 may determine no object was selected and end processing (the "no" branch of decision block 204). In an embodiment, E-commerce application 121 may receive direction from the user to exit the application from one of several methods. The user may input an audio or speech command to exit the application into UI 126. E-commerce application 121 may receive sensor data from the sensors on smart watch 130 of a gesture configured to end the application. E-commerce application 121 may receive direction to end the application based on a tactile selection of an icon, a button, or a menu item selection from a touchscreen on smart watch 130 or a touch activated area on AR glasses 120 to exit the application, for example.

[0029] In step 206, E-commerce application 121 determines the selected object. An embodiment of the present invention uses image recognition of an image of an object to determine the selected object. The image of an object may be an image viewed in augmented reality on AR glasses 120, such as on an internet site which may be a store website, or the image of the object may be a scanned or digitally captured image of a real world object, for example, an image of a product on a shelf captured by AR glasses 120. E-commerce application 121 may search a store website, a multi-vendor website, a multiple advertisement website or database, an internet site, an object recognition database, or perform an internet search for a correlated or matching object image or product image using image recognition. E-commerce application 121 may use image recognition techniques to match or correlate the digital image of the real-world object or an augmented reality image of a product in a store website with a digital image in a store internet website or another such database that stores information on the product. In some embodiments, E-commerce application 121 may search another store website, a multi-vendor website, a multiple advertisement website or database, an object recognition database or another internet site connected by either network 110 or another network for an image matching the object or product. In one embodiment, E-commerce application 121 may receive from smart watch 130 a barcode or barcode data from barcode scanner 133 of a product to identify the object or product. E-commerce application 121 can be connected to database 145, which may be the store database on server 140, via network 110. In an embodiment, E-commerce application 121 may be connected wirelessly by a local area network provided by the brick and mortar store accessing the store database, which may include a product catalog and product information such as product attributes.

[0030] In step 208, E-commerce application 121 stores the data viewed by the user. In the exemplary embodiment, E-commerce application 121 stores the data viewed by the user, which may be, for example, an image of the selected object or a product description, in E-commerce database 125. E-commerce application 121 provides a memory management capability for data storage. For example, E-commerce application 121 may store or save a name of the selected object, save a price and product name, save a product by a user defined product type, an internet location, a store physical location, a product identification number, a product barcode, or a decoded product barcode for an object in E-commerce database 125. In an embodiment, the user may select the information or data to be saved in E-commerce database 125 by performing a gesture associated with a command to save the data or by a voice command (e.g., saying "save product" or "save product and price") when focusing on the desired object, for example, when looking at an image of the object in an on-line store catalog, a digital image or photograph of the object, the real-life object in a brick and mortar store, or a description of a product, a product type, or a product attribute associated with the object, such as an estimated shipping time or a product price. In one embodiment, E-commerce application 121 may store data viewed by the user when barcode data from bar code scanner 133 is used to identify a product and/or associated product information of an object such as a product in a brick and mortar store. In another embodiment, E-commerce application 121 may store an image of a product in a brick and mortar store as captured by AR glasses 120. In some embodiments, E-commerce application 121 may store the data viewed by the user in the order in which the data was viewed.

[0031] In another embodiment, a user may save an object viewed by the user, and associated data, by object type, which may be, for example, a product type. For example, a record may be created for a product type such as "cameras" and a user may indicate by selecting an object, for example, using gaze focal point detection, a menu item, a product image, a product description or attribute displayed by AR glasses 120, using a gesture or saying "save in cameras". In another embodiment, E-commerce application 121 may save or store a selected product when a user uses gaze tracker focal point detection to select a user configured icon or menu item in AR glasses 120 for the record or file for "cameras". The memory management function provided by E-commerce application 121 may save the data viewed by the user. In an embodiment, the data viewed by the user and/or the selected objects may be sent to E-commerce database 125 and stored in the order in which the objects were selected. The data sent to E-commerce database 125 may be a product image, for example, from an internet website or an image of a product in a brick and mortar store, or the data may be a product price saved and stored in the sequence as selected by the user. For example, a user selects a first lawn mower in a lawn and garden center store internet website and views the first lawn mower and price; then the user moves to another internet shopping site, such as a department store website, and searches for and selects a second lawn mower to view the price. The second lawn mower selected may be stored by the memory management function in E-commerce application 121 as a more recently viewed object in E-commerce database 125. E-commerce application 121 may be configured to store data viewed by a user or a selected object in E-commerce database 125 by any user defined category. E-commerce application 121 may be configured by the user to store selected objects by product type, by a store name, or by product availability, for example.

[0032] In step 210, E-commerce application 121 receives a command based on a detected user gesture. Sensor 132 on smart watch 130 detects a gesture and sends the sensor data to E-commerce application 121. E-commerce application 121, in response to receiving the sensor data for the gesture, determines what the associated command is for the gesture. In an embodiment, E-commerce application 121 may receive a command to navigate to a second electronic commerce vendor environment, such as a second store website, to search for the selected

object. In an embodiment, the user may configure the websites or databases to be searched and may include the order in which to search the websites or databases. For example, a user may wish to search three specified stores, for example, store A, store B, and store C, starting with the user preferred store, which is identified as store A. E-commerce application 121 may be configured to search only these three stores. The order in which E-commerce application 121 searches the three stores may be configured by the user. In addition, the user may configure the type of data retrieved from a store website or a database such as database 145. For example, a user may only want to look for shoes in the first and the third of the three stores (i.e., store A and store C) configured in the previous example. E-commerce application 121 can then retrieve the data stored by the user (step 208). The stored data may be an image of a product, a scan of a barcode, a decoded barcode, a product description, or a product attribute, such as price, for example.

[0033] In one embodiment, E-commerce application 121 may retrieve stored data associated with selected objects in the reverse order in which the objects were selected or, in other words, retrieve the objects by sequential order of entry starting from the most recent object to the oldest selected object. For example, the user may click an icon labeled "review last item" and the memory function in E-commerce application 121 will show the price for the first lawn mower viewed previously at the lawn and garden center database in the previous example. In another embodiment, E-commerce application 121 may retrieve from E-commerce database 125 data stored by a category. For example, data stored by the user may be searched by a user or other defined category, such as a product type, in E-commerce database 125 (e.g. "lawnmowers"). For example, a user may select to retrieve data associated with each object previously selected in a product type or category such as "high resolution printers". Upon the user completing a review of the retrieved data viewed by the user, E-commerce application 121 may return to step 204 to determine whether another object is selected by the user.

[0034] In decision block 212, E-commerce application 121 determines whether the command received is configured to proceed to a shopping cart. In the exemplary embodiment, based on the gesture and the associated command, E-commerce application 121 determines if the command received in response to the sensor data proceeds to the shopping cart. In step 214, E-commerce application 121 determines the command proceeds to the shopping cart (the "yes" branch of decision block 212) and executes the command to move the object to the shopping cart, which is a virtual shopping cart. The object in the shopping cart may be purchased using, for example, shopping cart directed actions such as payment entry, address entry, shipping address, shipping method, and other similar purchase related user data inputs. In one embodiment, a command based on a user's gesture may be a command to purchase an item, which may include E-commerce application 121 connecting with an automated payment program. In an embodiment, the shopping cart may utilize another website or vendor for payment or financial transactions related to the purchase of an object. Upon proceeding to the shopping cart and completing a purchase, E-commerce application 121 ends processing. In other embodiments, upon proceeding to the shopping cart, E-commerce application 121 may return to determine whether an object is selected (decision block 204), or determine whether sensor data is received indicating a command to navigate to another website or store.

[0035] In step 216, E-commerce application 121 executes the determined command (the "no" branch of decision block 212). E-commerce application 121 executes the command determined in step 210. The command may be, for example, to scroll to the next page on the website or to add the object to the shopping cart. E-commerce application 121 performs the configured action or command for the gesture. For example, E-commerce application 121 receives from sensor 132 on smart watch 130 sensor data of a gesture, such as the muscle movements associated with a pointer finger tap and slide, and according to the pre-configured command (see step 202) for the gesture, E-commerce application 121 drags and drops the selected object to a location indicated by a length of the user's slide of the finger (i.e., the dragging of the product depicted and directed by a gesture such as the sliding motion of the user's finger). In another embodiment, E-commerce application 121 may use the gaze focal point tracker to identify the location, for example, a virtual shopping cart, where the object is to be dropped when the right pointer finger tap and slide is used. In another embodiment, a tactile or touch screen on smart watch 130 may be configured to perform an action such as to select an object, drag an object, select an image, a word or a line of text, or perform another pre-configured command. Upon executing the determined command, E-commerce application 121 proceeds to determine whether another object is selected (decision block 204).

[0036] FIG. 3 depicts a block diagram 300 of components of a computing device, for example, AR glasses 120, in accordance with an illustrative embodiment of the present invention. It should be appreciated that FIG. 3 provides only an illustration of one implementation and does not imply any limitations with regard to the environments in which different embodiments may be implemented. Many modifications to the depicted environment may be made.

[0037] AR glasses 120 include communications fabric 302, which provides communications between computer processor(s) 304, memory 306, persistent storage 308, communications unit 310, and input/output (I/O) interface(s) 312. Communications fabric 302 can be implemented with any architecture designed for passing data and/or control information between processors (such as microprocessors, communications and network processors, etc.), system memory, peripheral devices, and any other hardware components within a system. For example, communications fabric 302 can be implemented with one or more buses.

[0038] Memory 306 and persistent storage 308 are computer readable storage media. In this embodiment, memory 306 includes random access memory (RAM) 314 and cache memory 316. In general, memory 306 can include any suitable volatile or non-volatile computer readable storage media.

[0039] E-commerce application 121, E-commerce database 125, and UI 126 can be stored in persistent storage 308 for execution by one or more of the respective computer processors 304 via one or more memories of memory 306. In this embodiment, persistent storage 308 includes a magnetic hard disk drive. Alternatively, or in addition to a magnetic hard disk drive, persistent storage 308 can include a solid state hard drive, a semiconductor storage device, read-only memory (ROM), erasable programmable read-only memory (EPROM), flash memory, or any other computer readable storage media that is capable of storing program instructions or digital information.
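The processing flow described in steps 202 through 216 above can be sketched in code as bookkeeping: a gesture-to-command configuration, ordered storage of viewed objects, and a dispatch that either proceeds to the virtual shopping cart or executes another configured command. This is an illustrative sketch only; all class, method, gesture, and command names here are hypothetical and are not taken from the patent.

```python
# Hypothetical sketch of steps 202-216 described above. Names such as
# ECommerceApp, "proceed_to_cart", and "wrist_flick" are illustrative
# assumptions, not part of the patent text.

from collections import deque


class ECommerceApp:
    def __init__(self):
        self.gesture_commands = {}   # step 202: gesture -> command
        self.viewed = deque()        # step 208: objects in viewing order
        self.cart = []               # virtual shopping cart

    def configure(self, gesture, command):
        # Step 202: associate a user gesture with a command.
        self.gesture_commands[gesture] = command

    def store_viewed(self, name, price, category=None):
        # Step 208: save data viewed by the user, preserving viewing order.
        self.viewed.append({"name": name, "price": price,
                            "category": category})

    def review_last_item(self):
        # Paragraph 0033: retrieve the most recently viewed object first.
        return self.viewed[-1] if self.viewed else None

    def retrieve_by_category(self, category):
        # Paragraph 0033: retrieve stored data by a user defined category.
        return [o for o in self.viewed if o["category"] == category]

    def on_gesture(self, gesture, selected_object):
        # Step 210: look up the configured command for the detected gesture.
        command = self.gesture_commands.get(gesture)
        if command is None:
            return "unrecognized_gesture"
        # Decision block 212 / step 214: proceed to the shopping cart.
        if command == "proceed_to_cart":
            self.cart.append(selected_object)
            return "in_cart"
        # Step 216: execute any other configured command.
        return command


app = ECommerceApp()
app.configure("pointer_finger_tap_and_slide", "proceed_to_cart")
app.configure("wrist_flick", "scroll_next_page")

app.store_viewed("first lawn mower", 299.99, "lawnmowers")
app.store_viewed("second lawn mower", 249.99, "lawnmowers")

result = app.on_gesture("pointer_finger_tap_and_slide", "second lawn mower")
```

A real implementation would first receive raw sensor data from the watch and classify it into a gesture before this dispatch; the sketch shows only the configuration-and-dispatch logic that the flowchart of FIG. 2 describes.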

[0040] The media used by persistent storage 308 may also be removable. For example, a removable hard drive may be used for persistent storage 308. Other examples include optical and magnetic disks, thumb drives, and smart cards that are inserted into a drive for transfer onto another computer readable storage medium that is also part of persistent storage 308.

[0041] Communications unit 310, in these examples, provides for communications with other data processing systems or devices, including resources of server 140 and smart watch 130. In these examples, communications unit 310 includes one or more network interface cards. Communications unit 310 may provide communications through the use of either or both physical and wireless communications links. E-commerce application 121 and database 125 may be downloaded to persistent storage 308 through communications unit 310.

[0042] I/O interface(s) 312 allows for input and output of data with other devices that may be connected to AR glasses 120. For example, I/O interface(s) 312 may provide a connection to external device(s) 318 such as a sensor on a smart watch, a keyboard, a keypad, a touch screen, and/or some other suitable input device. External device(s) 318 can also include portable computer readable storage media such as, for example, thumb drives, portable optical or magnetic disks, and memory cards. Software and data used to practice embodiments of the present invention, e.g., E-commerce application 121, sensor data from smart watch 130, and database 125, can be stored on such portable computer readable storage media and can be loaded onto persistent storage 308 via I/O interface(s) 312. I/O interface(s) 312 also connect to a display 320.

[0043] Display 320 provides a mechanism to display data to a user and may be, for example, a computer monitor.

[0044] The programs described herein are identified based upon the application for which they are implemented in a specific embodiment of the invention. However, it should be appreciated that any particular program nomenclature herein is used merely for convenience, and thus the invention should not be limited to use solely in any specific application identified and/or implied by such nomenclature.

[0045] The present invention may be a system, a method, and/or a computer program product. The computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present invention.

[0046] The computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. A non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer readable storage medium, as used herein, is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber optic cable), or electrical signals transmitted through a wire.

[0047] Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network. The network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. A network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.

[0048] Computer readable program instructions for carrying out operations of the present invention may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++ or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider). In some embodiments, electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present invention.

[0049] Aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer readable program instructions.

[0050] These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.

[0051] The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.

[0052] The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions.

[0053] The descriptions of the various embodiments of the present invention have been presented for purposes of illustration, but are not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the invention. The terminology used herein was chosen to best explain the principles of the embodiment, the practical application or technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

What is claimed is:
1. A method for electronic commerce using augmented reality glasses and a smart watch, the method comprising:
receiving, by one or more computing devices, a configuration associating a user gesture to a command;
determining, by one or more computing devices, whether a user of an augmented reality glasses selects an object in a first electronic commerce vendor environment;
responsive to determining the user selects an object, determining, by one or more computing devices, whether the user performs a first gesture detectable by a smart watch;
determining, by one or more computing devices, whether the first gesture matches the user gesture; and
responsive to determining the first gesture matches the user gesture, performing, by one or more computing devices, the associated command.
2. The method of claim 1, further comprising, responsive to determining the user selects an object, storing, by one or more computing devices, information associated with the object.
3. The method of claim 2, further comprising:
determining, by one or more computing devices, the user has navigated to a second electronic commerce vendor environment;
retrieving, by one or more computing devices, the information associated with the object; and
searching, by one or more computing devices, based, at least in part, on the information associated with the object, the second electronic commerce vendor environment for the object.
4. The method of claim 2, wherein storing information associated with the object further comprises storing a category of the object.
5. The method of claim 1, wherein receiving a configuration associating a user gesture to a command further comprises:
receiving, by one or more computing devices, sensor data corresponding to the user gesture from at least one sensor on the smart watch;
receiving, by one or more computing devices, a command from the user to be configured to the sensor data for the user gesture; and
configuring, by one or more computing devices, the command to be associated to the user gesture.
6. The method of claim 1, wherein determining whether the user of the augmented reality glasses selects an object further comprises:
determining, by one or more computing devices, an object of focus of the user by a gaze focal point tracker;
determining, by one or more computing devices, if the object of focus is viewed for a threshold period of time; and
determining, by one or more computer devices, the object is selected.
7. The method of claim 1, wherein determining whether the user of the augmented reality glasses selects an object further comprises receiving, by one or more computing devices, a barcode from the smart watch.
8. The method of claim 1, wherein determining whether the user of the augmented reality glasses selects an object further comprises receiving a voice command from the user.
9. The method of claim 1, wherein determining whether the user performs a first gesture detectable by a smart watch further comprises receiving, by one or more computing devices, sensor data from the smart watch, wherein the smart watch includes at least one sensor.
10. A computer program product for electronic commerce using augmented reality glasses and a smart watch, the computer program product comprising:
one or more computer readable storage media and program instructions stored on the one or more computer readable storage media, the program instructions executable by a processor, the program instructions comprising:
program instructions to receive a configuration associating a user gesture to a command;
program instructions to determine whether a user of an augmented reality glasses selects an object in a first electronic commerce vendor environment;
responsive to determining the user selects an object, program instructions to determine whether the user performs a first gesture detectable by a smart watch;

program instructions to determine whether the first gesture matches the user gesture; and
responsive to determining the first gesture matches the user gesture, program instructions to perform the associated command.
11. The computer program product of claim 10, further comprising, responsive to determining the user selects an object, program instructions to store information associated with the object.
12. The computer program product of claim 11, further comprising:
program instructions to determine the user has navigated to a second electronic commerce vendor environment;
program instructions to retrieve the information associated with the object; and
program instructions to search, based, at least in part, on the information associated with the object, the second electronic commerce vendor environment for the object.
13. The computer program product of claim 11, wherein program instructions to store information associated with the object further comprises program instructions to store a category of the object.
14. The computer program product of claim 10, wherein program instructions to receive a configuration associating a user gesture to a command further comprises:
program instructions to receive sensor data corresponding to the user gesture from at least one sensor on the smart watch;
program instructions to receive a command from the user to be configured to the sensor data for the user gesture; and
program instructions to configure the command to be associated to the user gesture.
15. The computer program product of claim 10, wherein program instructions to determine whether the user of the augmented reality glasses selects an object further comprises:
program instructions to determine an object of focus of the user by a gaze focal point tracker;
program instructions to determine if the object of focus is viewed for a threshold period of time; and
program instructions to determine the object is selected.
16. The computer program product of claim 10, wherein program instructions to determine whether the user of the augmented reality glasses selects an object further comprise program instructions to receive a barcode from the smart watch.
17. The computer program product of claim 10, wherein program instructions to determine whether the user of the augmented reality glasses selects an object further comprise program instructions to receive a voice command from the user.
18. A computer system for electronic commerce using augmented reality glasses and a smart watch, the computer system comprising:
one or more computer processors;
one or more computer readable storage media;
program instructions stored on the one or more computer readable storage media for execution by at least one of the one or more computer processors, the program instructions comprising:
program instructions to receive a configuration associating a user gesture to a command;
program instructions to determine whether a user of an augmented reality glasses selects an object in a first electronic commerce vendor environment;
responsive to determining the user selects an object, program instructions to determine whether the user performs a first gesture detectable by a smart watch;
program instructions to determine whether the first gesture matches the user gesture; and
responsive to determining the first gesture matches the user gesture, program instructions to perform the associated command.
19. The computer system of claim 18, further comprising, responsive to determining the user selects an object, program instructions to store information associated with the object.
20. The computer system of claim 18, wherein program instructions to receive a configuration associating a user gesture to a command further comprises:
program instructions to receive sensor data corresponding to the user gesture from at least one sensor on the smart watch;
program instructions to receive a command from the user to be configured to the sensor data for the user gesture; and
program instructions to configure the command to be associated to the user gesture.
* * * * *