A Wearable Hardware Platform for Capturing Expert's Experience

Puneet Sharma (University of Tromsø, Norway), Tre Azam (Myndplay, UK), Soyeb Aswat (Myndplay, UK), Roland Klemke (Open University of the Netherlands, Netherlands), Fridolin Wild (Oxford Brookes University, UK)

ABSTRACT

In this article, we propose a mapping of methods that facilitate the transfer of experience from expert to trainee onto low-level functions such as gaze, voice, video, body posture, hand gestures, bio-signals, fatigue levels, and the location of the user in the environment. This mapping is used to decompose the low-level functions into their associated sensors. After reviewing the requirements, a set of sensors is proposed for the experience-capturing platform. Based on this, a first version of the hardware platform is designed and implemented. Our proposed platform is modular and can be adapted to different sensors from different vendors as technology progresses. Furthermore, we discuss key challenges and future directions associated with the hardware platform. This work was done as part of an EU project called WEKIT.

Keywords: sensors, experience capturing, augmented reality, AR, WEKIT, WT

Index Terms: WT, AR, WEKIT

1 INTRODUCTION

For the daily smooth operation of an organization, experienced workers are vital at every level. By sharing their knowledge, experience, expertise in procedures, and best practices with colleagues, trainees, and managers, they build, maintain, and support the different functions of an organization. Industries are fully aware of this and are trying new ways to capture, support, and preserve the experience of an expert [1].

One of the key challenges faced by industry is keeping workers up to date in terms of their skill set. With this in mind, WEKIT is a European research and innovation project supported under Horizon 2020 to develop and test, within three years, a novel way of industrial training enabled by smart Wearable Technology (WT).

The WEKIT industrial learning methodology comprises capturing the experience of an expert and re-enacting that experience for training novices [2], with the former being the focus of this article. In a recent study [3], applying the principles of the 4C/ID model [4] to hands-on industrial training, an extensive review was performed, and the authors proposed instructional methods in the form of components that facilitate the transfer of experience from an expert to a novice.

2 METHODOLOGY

In this section, we first discuss the mapping of transfer mechanisms; second, we outline sensor recommendations; and finally, we discuss the proposed hardware platform prototype.

2.1 Mapping Transfer Mechanisms

The authors of [3] define instructional methods as transfer mechanisms, i.e., methods that facilitate the transfer of knowledge. The transfer mechanisms include: remote symmetrical tele-assistance, virtual/tangible manipulation, haptic hints, virtual post-its, mobile control, in situ real-time feedback, case identification, directed focus, self-awareness of physical state, contextualisation, object enrichment, think-aloud protocol, zoom, and slow motion. In this section, we decompose the different transfer mechanisms into low-level functions and their associated state-of-the-art sensors [5].

1. Remote symmetrical tele-assistance
   Description: View and capture the activity of another person from their perspective; transmit video and audio.
   Sensors: Smart/AR glasses.
   Key products: Moverio BT-200/2000, Microsoft HoloLens, Sony SmartEyeglass, Google Glass, Meta 2, Vuzix M-100, Optinvent Ora-1, ODG R-7.

2. Virtual/tangible manipulation
   Description: Hand movement tracking via accelerometer and gyroscope.
   Sensors: Depth camera, smart armband.
   Key products: Myo gesture control armband, Leap Motion controller.

3. Haptic hints
   Description: Vibrations on the arm or fingers.
   Sensors: Vibrotactile bracelets and rings.
   Key products: Myo, magic ring.

4. Virtual post-its, contextualisation, in situ real-time feedback
   Description: Object tracking in the environment.
   Sensors: Smart glasses, tablet computer, or mobile phone.
   Key products: Several.

5. Mobile control
   Description: Control dials and other user-interface elements.
   Sensors: Hand-gesture sensor, controller.
   Key products: Myo, Leap Motion.

6. Case identification
   Description: Link with existing cases; link with error knowledge.
   Sensors: Case-based reasoning component.
   Key products: No specific sensor.

7. Directed focus
   Description: Direct the focus of the user.
   Sensors: Gaze direction / object recognition, EEG (attention/focus/mental effort).
   Key products: Smart glasses (or gyroscope only), MyndPlay MyndBand, Interaxon Muse EEG, Neurosky Mindwave, Emotiv EEG.

8. Self-awareness of physical state
   Description: Fatigue level, vigilance level, bio-data (e.g., steps, sleep, heart rate), body posture and ergonomics (e.g., leaning back or forward).
   Sensors: EEG, smart watch, posture tracker.
   Key products: MyndPlay MyndBand, Neurosky Mindwave, Emotiv EEG, Fitbit, Apple Watch, Leap Motion, Myo, Lumo Lift, Alex posture tracker.

9. Think-aloud protocol
   Description: Capture the voice of the user.
   Sensors: Microphone.
   Key products: Cochlea wireless mini microphone, built-in microphone of camera/smart glasses, wireless microphones (e.g., from AKG).

10. Zoom
    Description: Zoom in and get details.
    Sensors: Smart glasses / tablet with a high-resolution camera.
    Key products: Several.

11. Slow motion
    Description: Allow replay at slower speed.
    Sensors: High-frame-rate camera (with smart glasses, a high frame rate often comes at the price of resolution, and vice versa).
    Key products: Several.

The mapping from transfer mechanisms to sensors is not injective. For instance, a transfer mechanism such as remote symmetrical tele-assistance requires both audio and video information, for which we need more than one sensor. On the other hand, some smart glasses (such as the Microsoft HoloLens) are equipped with a number of integrated sensors, which enables them to capture several transfer mechanisms. Some transfer mechanisms (e.g., virtual post-its, contextualisation, in situ real-time feedback) need highly processed information provided by subroutines or software libraries of an API. Others (such as zoom and slow motion) are computationally expensive and can be impractical given the current state of the art of wearable devices.

2.2 Sensor Recommendations

In this section, we discuss the recommendations on sensor choice used in the development of the first hardware platform, along with the associated issues. In order to design and develop the first version of the hardware prototype, we focus on the most important transfer mechanisms defined in the studies [2, 3].

Taking into consideration features such as its built-in microphone array, environment capture, gesture tracking, mixed-reality capture, Wi-Fi 802.11ac, and fully untethered holographic computing, the Microsoft HoloLens [6] was selected for the role of AR/smart glasses. Furthermore, the built-in components of the HoloLens enable us to capture several different attributes of the user and her environment. For EEG, the MyndBand [7] and the Neurosky chipset were favoured due to the availability of processed data and real-time feedback. For detecting hand and arm movements and gestures, the Leap Motion [8] and the Myo armband [9] were chosen. To track the position and angle of the neck, the Alex posture tracker [10] was suggested.

2.2.1 Data from Sensors

In this section, we analyze the different types of data and interpreted signals available from the selected sensors. As shown in Table 1, using the API associated with the Microsoft HoloLens, we get processed data in the form of spatial data, orientation, and gaze, and interpreted data in the form of gestures, but raw data is not accessible. We can get raw data from the Leap Motion, the Myo, and the MyndBand; this was important in allowing us to manage the visualisations, record the data for post-processing, and discover potentially new interpreted markers or algorithms relating to experience capture. Also, different sensors require different bandwidths: bandwidth is particularly high for video signals (associated with AR glasses), moderate for the Leap Motion and audio signals (the microphone of the AR glasses), and low for the Myo, the MyndBand, and the Alex posture tracker.

Table 1: Sensors and their raw, processed, and interpreted data.

Sensor               | Raw data                                             | Processed data                  | Interpreted data
Microsoft HoloLens   | Not accessible via official API                      | Spatial data, orientation, gaze | Gestures
Leap Motion          | Raw sensor images                                    | Hand model data                 | -
Myo                  | Raw EMG data                                         | Orientation and acceleration    | Detected pose
MyndPlay MyndBand    | 512 Hz raw spectrum, 0.5-100 Hz (delta to mid-gamma) | Bandwidth                       | Attention, meditation, zone, mental effort, familiarity
Alex posture tracker | Raw accelerometer and gyroscope data                 | Unconfirmed by vendor           | Head, neck, and shoulder posture

2.3 Proposed Hardware Platform

The proposed hardware platform (as shown in Figure 1) consists of three components: smart glasses, external sensors, and a sensor-processing unit, with the last two being housed in a WEKIT garment. Based on its computational power, the sensor-processing unit acts as a processor for collecting, syncing, and streaming the data from all the different sensors. The number of sensors can be decreased or increased based on the requirements of the industrial training scenario. Hardware components can use different communication standards, such as Wi-Fi and Bluetooth. The proposed architecture is quite flexible and allows for communication protocols such as Apache Thrift [11], TCP/IP, and UDP.

Figure 1: Proposed hardware architecture.

Figure 2: Hardware platform, first iteration.
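The non-injective, many-to-many relationship between transfer mechanisms and sensors described in Section 2.1 can be sketched as a simple lookup table. The mechanism and sensor names below follow the paper's list, but the data structure and helper function are our illustration, not part of the WEKIT software:

```python
# Sketch of the (non-injective) mapping from transfer mechanisms to
# sensors, following Section 2.1. A subset of mechanisms is shown.
MECHANISM_TO_SENSORS = {
    "remote symmetrical tele-assistance": {"smart/AR glasses"},
    "virtual/tangible manipulation": {"depth camera", "smart armband"},
    "haptic hints": {"vibrotactile bracelet/ring"},
    "directed focus": {"smart/AR glasses", "EEG headband"},
    "self-awareness of physical state": {"EEG headband", "smart watch",
                                         "posture tracker"},
    "think-aloud protocol": {"microphone"},
}

def required_sensors(mechanisms):
    """Union of sensors needed to support a set of transfer mechanisms."""
    return set().union(*(MECHANISM_TO_SENSORS[m] for m in mechanisms))

# One sensor (e.g., an EEG headband) serves several mechanisms, and one
# mechanism may need several sensors -- the mapping is many-to-many.
needed = required_sensors(["directed focus", "think-aloud protocol"])
```

A training scenario would pick its mechanisms first and derive the minimal sensor set from them, which is how the "decreased or increased based on the requirements" flexibility of Section 2.3 can be made concrete.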
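Table 1 distinguishes three levels of sensor data (raw, processed, interpreted), and Section 2.2.1 reasons about which sensors expose raw data. A minimal data model for that distinction might look as follows; the field values transcribe the table, while the class itself is a hypothetical illustration:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass(frozen=True)
class SensorData:
    """One row of Table 1. None means the level is unavailable
    (or unconfirmed by the vendor)."""
    name: str
    raw: Optional[str]
    processed: Optional[str]
    interpreted: Optional[str]

TABLE_1: List[SensorData] = [
    SensorData("Microsoft HoloLens", None,
               "spatial data, orientation, gaze", "gestures"),
    SensorData("Leap Motion", "raw sensor images", "hand model data", None),
    SensorData("Myo", "raw EMG data",
               "orientation and acceleration", "detected pose"),
    SensorData("MyndPlay MyndBand", "512 Hz raw spectrum, 0.5-100 Hz",
               "bandwidth",
               "attention, meditation, zone, mental effort, familiarity"),
    SensorData("Alex posture tracker", "accelerometer and gyroscope data",
               None, "head, neck, and shoulder posture"),
]

# Sensors whose raw data is accessible (cf. Section 2.2.1):
raw_capable = [s.name for s in TABLE_1 if s.raw is not None]
```

Modelling availability explicitly lets downstream code decide, per sensor, whether post-processing for new interpreted markers is even possible.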
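Section 2.3 describes the sensor-processing unit as collecting, syncing, and streaming data from all sensors. A minimal sketch of the collect-and-sync role, under the assumption (ours, not the paper's) that each sensor thread pushes samples into one shared queue tagged with a common monotonic clock so the streams can be time-aligned afterwards:

```python
import queue
import threading
import time

class SensorCollector:
    """Toy sketch of the sensor-processing unit's collect/sync role:
    every sample carries a shared-clock timestamp so that streams from
    different sensors can be merged chronologically."""

    def __init__(self):
        self.samples = queue.Queue()  # thread-safe buffer

    def push(self, sensor_name, value):
        # A shared monotonic clock is the simplest synchronisation basis.
        self.samples.put((time.monotonic(), sensor_name, value))

    def drain(self):
        out = []
        while not self.samples.empty():
            out.append(self.samples.get())
        return sorted(out)  # chronological order across all sensors

collector = SensorCollector()

def fake_sensor(name, readings):
    """Stand-in for a real sensor driver thread."""
    for r in readings:
        collector.push(name, r)

threads = [
    threading.Thread(target=fake_sensor, args=("myo", [0.1, 0.2])),
    threading.Thread(target=fake_sensor, args=("myndband", [42, 43])),
]
for t in threads:
    t.start()
for t in threads:
    t.join()

merged = collector.drain()  # timestamped samples from both sensors
```

A real implementation would also have to handle the bandwidth asymmetry noted in Section 2.2.1, e.g., by buffering high-rate video separately from low-rate posture data.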
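For streaming, the architecture admits lightweight protocols such as UDP alongside Apache Thrift and TCP/IP. A hedged sketch of UDP streaming for a low-bandwidth sensor, assuming JSON-encoded samples and a loopback receiver; the port handling, field names, and encoding are our choices, not the WEKIT wire format:

```python
import json
import socket

def send_sample(sock, addr, sensor, value):
    """Encode one sensor sample as JSON and send it as a UDP datagram.
    The JSON schema here is illustrative only."""
    payload = json.dumps({"sensor": sensor, "value": value}).encode("utf-8")
    sock.sendto(payload, addr)

# Loopback demo: one socket receives what the other sends.
recv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv.bind(("127.0.0.1", 0))      # let the OS pick a free port
recv.settimeout(5.0)
addr = recv.getsockname()

send = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_sample(send, addr, "alex_posture", {"pitch": -12.5})

data, _ = recv.recvfrom(4096)
sample = json.loads(data.decode("utf-8"))
send.close()
recv.close()
```

Datagram loss is acceptable for frequent, small posture or EEG summaries; a high-bandwidth video stream would instead favour TCP/IP or a Thrift service, consistent with the bandwidth ranking in Section 2.2.1.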
