(12) Patent Application Publication (10) Pub. No.: US 2016/0112279 A1, Kalanithi et al.

(19) United States
(12) Patent Application Publication    (10) Pub. No.: US 2016/0112279 A1
Kalanithi et al.                       (43) Pub. Date: Apr. 21, 2016

(54) SENSOR-BASED DISTRIBUTED TANGIBLE USER INTERFACE

(71) Applicants: Jeevan James Kalanithi, San Francisco, CA (US); David Jeffrey Merrill, Somerville, MA (US); Patricia Emilia Maes, Cambridge, MA (US)

(72) Inventors: Jeevan James Kalanithi, San Francisco, CA (US); David Jeffrey Merrill, Somerville, MA (US); Patricia Emilia Maes, Cambridge, MA (US)

(73) Assignee: MASSACHUSETTS INSTITUTE OF TECHNOLOGY, Cambridge, MA (US)

(21) Appl. No.: 14/519,050

(22) Filed: Oct. 20, 2014

Publication Classification

(51) Int. Cl.: H04L 12/24 (2006.01); G06F 3/0484 (2006.01); H04B 7/24 (2006.01); H04L 29/08 (2006.01)

(52) U.S. Cl.: CPC H04L 41/22 (2013.01); H04L 67/10 (2013.01); G06F 3/04847 (2013.01); G06F 3/04842 (2013.01); H04B 7/24 (2013.01)

(57) ABSTRACT: A distributed tangible user interface comprises compact, self-powered, tangible user interface manipulative devices having sensing, display, and wireless communication capabilities, along with one or more associated digital content or other interactive software management applications. The manipulative devices display visual representations of digital content or program controls and can be physically manipulated as a group by a user for interaction with the digital information or software application. A controller on each manipulative device receives and processes data from a movement sensor, initiating behavior on the manipulative and/or forwarding the results to a management application that uses the information to manage the digital content, software application, and/or the manipulative devices. The manipulative devices may also detect the proximity and identity of other manipulative devices, responding to and/or forwarding that information to the management application, and may have feedback devices for presenting responsive information to the user.
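The abstract describes a simple control flow on each manipulative: a controller reads a movement sensor, triggers local behavior, and forwards the event to a management application. The following is a minimal illustrative sketch of that flow; the class names and the 0.5 threshold are hypothetical assumptions, since the patent describes behavior, not an API.

```python
# Hypothetical sketch of the device-side control flow from the
# abstract: a controller processes movement-sensor data, initiates
# local behavior, and forwards the result to a management
# application.  Class names and the 0.5 threshold are illustrative
# assumptions; the patent does not define an API.

class ManagementApp:
    """Collects events reported by the manipulative devices."""
    def __init__(self):
        self.events = []

    def report(self, device_id, event):
        self.events.append((device_id, event))


class Manipulative:
    """One compact, self-powered device with sensing and display."""
    def __init__(self, device_id, app):
        self.device_id = device_id
        self.app = app
        self.display = "idle"

    def on_movement(self, reading):
        # Initiate behavior locally, then forward to the app.
        if abs(reading) > 0.5:
            self.display = "shaken"
            self.app.report(self.device_id, ("movement", reading))


app = ManagementApp()
cube = Manipulative("cube-1", app)
cube.on_movement(0.9)
print(cube.display)   # -> shaken
print(app.events)     # -> [('cube-1', ('movement', 0.9))]
```

The split mirrors the claim language: the device both initiates behavior locally (its display) and forwards the sensed data for the application to act on.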
[Drawing sheets 1-12 of 22 (Patent Application Publication, Apr. 21, 2016, US 2016/0112279 A1). Recoverable text: Sheet 1 shows a device block diagram with DATA STORAGE, APPLICATION LOGIC, and COMMUNICATION modules. Sheet 3 shows a flowchart in which instructions and data are transferred to the manipulatives; on detecting manipulative movement, the device identifies and reports to the management application and applies any change required; on detecting manipulative groupings, it likewise identifies and reports to the management application; the sequence ends at POWER OFF. Sheet 8 shows a device ready to respond to events, updating its response behavior on detecting a neighbor or detecting movement. Sheet 9 shows a display face looking forward, and looking at a neighbor when a neighbor is detected. The remaining sheets contain no recoverable text.]
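The proximity-detection behavior in the abstract, where devices detect the identity of neighboring devices and forward that information to the management application, amounts to grouping devices by their reported adjacencies. A small illustrative sketch of how an application might merge such reports (hypothetical names; the patent specifies behavior, not code):

```python
# Hypothetical sketch: each manipulative reports which device it is
# adjacent to, and the management application merges those pairs
# into groupings (connected components).  Names are illustrative;
# the patent describes the behavior, not this code.

def groups_from_pairs(pairs):
    """Merge reported neighbor pairs into device groupings."""
    groups = []
    for a, b in pairs:
        touching = [g for g in groups if a in g or b in g]
        merged = {a, b}.union(*touching) if touching else {a, b}
        groups = [g for g in groups if g not in touching] + [merged]
    return groups

# Three cubes placed in a row are reported as two adjacent pairs
# and resolve to a single group:
print(groups_from_pairs([("cube-1", "cube-2"), ("cube-2", "cube-3")]))
```

Grouping on the application side keeps each device simple: a manipulative only needs to sense and report its immediate neighbors.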
[Drawing sheets 13-22 of 22: line drawings with little recoverable text; Sheet 21 labels a PLUG (TO COMPUTER) and a SOCKET (TO SIFTABLE).]

US 2016/0112279 A1, Apr. 21, 2016

SENSOR-BASED DISTRIBUTED TANGIBLE USER INTERFACE

RELATED APPLICATIONS

[0001] This application is a continuation of U.S. patent application Ser. No. 12/365,885, filed Feb. 4, 2009, now abandoned, which claims the benefit of U.S. Provisional Application Ser. No. 61/063,479, filed Feb. 4, 2008, the entire disclosures of which are herein incorporated by reference.

FIELD OF THE TECHNOLOGY

[0002] The present invention relates to computer user interfaces and, in particular, to a tangible user interface that is distributed and sensor-based.

BACKGROUND

[0003] Humans exhibit particular skill at sifting, sorting, and otherwise manipulating large numbers of small physical objects. When a human performs a task such as overturning a container of nuts and bolts and sifting through the resulting pile to find one of a particular size, or spreading photographs out on a tabletop and sorting them into piles, he or she uses both hands and all of his or her fingers actively and efficiently. However, when the human sorts digital information or media such as digital photographs or emails, the task typically does not effectively leverage the full range of physical manipulation skills. For example, one typical user interaction with a modern graphical user interface (GUI) is to click on an icon with a mouse, drag the icon to another location on the screen, and then drop it to reposition it or to assign the data it represents to a folder. This so-called "direct manipulation" of information, as afforded by a GUI, is a poor substitute for a human's facile all-finger, two-handed manipulation of physical items.

[0004] Tangible user interfaces (TUIs) have made some progress towards leveraging human physical manipulation abilities for interaction with digital information. For example, Fitzmaurice et al. pioneered physical handles to digital objects, in which TUIs with handles operate by sensing the user's manipulation of each handle and displaying a co-located visual representation of the data being manipulated (Fitzmaurice, G., "Graspable User Interfaces", PhD thesis, University of Toronto, 1997). Benefits of this interaction style include two-handed interaction and a reduction in the level of indirection between a person's hand and the actual computation taking place when adjusting a parameter. These features make handle-based TUIs a more direct form of manipulation than a GUI alone.

[0005] Another class of tangible user interface largely dispenses with the GUI paradigm, featuring physical objects that directly embody the digital information or media that they represent. These TUIs do not implement handles to manipulate a GUI overlay; rather, as in Ishii's Music Bottles (Ishii, H., Mazalek, A., Lee, J., "Bottles as a minimal interface to access digital information", CHI '01 Extended Abstracts on Human Factors in Computing Systems, New York, NY, USA, ACM Press, 2001, 187-188) and Want et al.'s work with embedded RFID tags (Want, R., Fishkin, K. P., Gujar, A., Harrison, B. L., "Bridging physical and virtual worlds with electronic tags", Proceedings of CHI '99, New York, NY, USA, ACM Press, 1999, 370-377), the shape and features of the objects themselves suggest the semantics of the interaction. The inherent coupling between form and function in this class of TUIs brings an increased directness to an interaction with digital information or media. However, this gain in directness comes at a cost: since they represent the underlying data implicitly with their physical form, TUIs featuring special-purpose objects can be more limited to a particular application domain or style of interaction.

[0006] Sensor networks consist of collections of sensing and communication devices that can be distributed spatially. They are capable of exhibiting coordinated behavior, forming a kind of "functional fabric" in the spaces that they inhabit without requiring external sensing or power infrastructure. Sensor network nodes can be built with an array of sensing technologies that can be used to build rich models of local interactions and their surroundings. The sensor network community has focused a great deal of effort on the technical issues that it faces, many of which present themselves as tradeoffs in the design space: longer battery life for a sensor node requires that other priorities, such as battery size, radio transmit range, or frequency of communication, must suffer, and the use of ultra-small components can cause the device to require more specialized and expensive assembly.
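The battery-life tradeoff noted for sensor nodes, where longer life competes with radio transmit range and communication frequency, can be made concrete with a back-of-the-envelope duty-cycle model. All numbers below are illustrative assumptions, not values from the patent:

```python
# Back-of-the-envelope duty-cycle model of the battery tradeoff:
# keeping the radio on less often extends battery life at the cost
# of communication frequency.  All numbers are illustrative
# assumptions, not values from the patent.

def battery_hours(capacity_mah, sleep_ma, active_ma, duty_cycle):
    """Battery life as capacity divided by the mean current draw."""
    mean_ma = sleep_ma * (1 - duty_cycle) + active_ma * duty_cycle
    return capacity_mah / mean_ma

# A 250 mAh cell, 0.05 mA asleep, 20 mA with the radio active:
print(battery_hours(250, 0.05, 20.0, 0.10))  # 10% duty cycle
print(battery_hours(250, 0.05, 20.0, 0.01))  # 1% duty cycle: far longer life
```

Because the active current dominates the mean draw, cutting the duty cycle by a factor of ten extends life by nearly the same factor, which is why communication frequency is one of the first things sacrificed.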
Recommended publications
  • Visual Modelling of and on Tangible User Interfaces


    Faculty of Science, Technology and Communication. Visual Modelling of and on Tangible User Interfaces. Thesis submitted in partial fulfilment of the requirements for the degree of Master in Information and Computer Sciences. Author: Eric Tobias; Supervisor: Prof. Dr. Pierre Kelsen; Reviewer: Prof. Dr. Yves Le Traon; Advisors: Dr. Eric Ras (Public Research Centre Henri Tudor), Dr. Nuno Amalio. August 2012. Abstract: The purpose of this thesis is to investigate the use and benefit of visual modelling languages (VMLs) in modelling on tangible user interfaces (TUIs), and modelling TUI applications. Three main research questions are asked: • Is it possible and practical to model an application or process using a TUI? • Which general-purpose VML (GPVML) performs best in modelling a VML scenario for use on a TUI? • Is it realistic to use a GPVML to model complex TUI applications? To answer the first research question, a Business Process Model and Notation, version 2 (BPMN2), ideation scenario is used with a tangible widget toolkit prototype on a TUI table to evaluate the performance and obtain feedback from test candidates. The study showed that it is possible to model a process using a TUI, and the test candidates' feedback did not lead to the conclusion that the modelling was cumbersome or impractical. To find a suitable GPVML, the thesis explores and evaluates the current state of the art in VMLs for describing general problems. After gathering different VMLs, the thesis compares three languages using a simple scenario: the visual object constraint language (VOCL), augmented constraint diagrams (ACD), and the visual contract language (VCL).
  • Alternative Interfaces for Assisting Elderly Users Thomas R


    Exploring tangible interaction: Alternative interfaces for assisting elderly users. Thomas R. Iversen. Master's Thesis, Autumn 2015. Abstract: The use of tangible user interfaces in assisting elderly users is still a fairly unexplored domain. While current research provides some solutions in this domain, it remains limited. The elderly population is growing as a result of increased living standards and better health services. This creates high demand for nursing homes, which will lead to more elderly people living in their own homes. This thesis investigates how tangible user interfaces (TUIs) can make things easier compared to elderly people's current living situation. Domain knowledge of common age impairments and details of TUIs has been collected through literature review, to propose a framework for designing TUIs for elderly people, based on previous frameworks on TUIs and the common age impairments discovered. Further domain knowledge has been collected through focus groups, interviews, workshops, and observations. Four prototypes featuring TUIs and designed to compensate for some challenges of aging have been developed: T-Radio, a radio controlled by tangible blocks; Payless, an alternative way to pay for food and beverages in a canteen using only an RFID key card; Natural Charge, seven different wireless chargers to investigate the best configuration; and LightUp, a light bulb that changes its color intensity depending on environmental temperature. The prototypes were taken through a formative usability test with experts from the HCI community, revealing some problems with the prototypes before a few modifications were made. Then summative usability tests with elderly participants were conducted at two different research sites: a local care home and a senior center.
  • Tabletop Tangible Interfaces for Music Performance: Design and Evaluation


    MUSIC COMPUTING GROUP, DEPARTMENT OF COMPUTING AND COMMUNICATIONS, FACULTY OF MATHEMATICS, COMPUTING AND TECHNOLOGY, THE OPEN UNIVERSITY. Tabletop Tangible Interfaces for Music Performance: Design and Evaluation. Author: Anna Xambó Sedó, MSc (Universitat Pompeu Fabra), BA, MA (Universitat de Barcelona). A thesis submitted in partial fulfilment of the requirements for the degree of Doctor of Philosophy. Supervisors: Robin Laney (The Open University, UK), Chris Dobbyn (The Open University, UK), Sergi Jordà Puig (Universitat Pompeu Fabra, Spain). Examiners: Eduardo Reck Miranda (Plymouth University, UK), Janet van der Linden (The Open University, UK). Submitted 30 September 2014. Abstract: This thesis investigates a new generation of collaborative systems: tabletop tangible interfaces (TTIs) for music performance, or musical tabletops. Musical tabletops are designed for professional musical performance, as well as for casual interaction in public settings. These systems support co-located collaboration, offered by a shared interface. However, we still know little about their challenges and opportunities for collaborative musical practice: in particular, how best to support beginners, experts, or both. This thesis explores the nature of collaboration on TTIs for music performance between beginners, experts, or both. Empirical work was done in two stages: 1) an exploratory stage; and 2) an experimental stage. In the exploratory stage we studied the Reactable, a commercial musical tabletop designed for beginners and experts. In particular, we explored its use in two environments: a multi-session study with expert musicians in a casual lab setting; and a field study with casual visitors in a science centre. In the experimental stage we conducted a controlled experiment for mixed groups using a bespoke musical tabletop interface, SoundXY4.
  • Models and Mechanisms for Tangible User Interfaces


    Models and Mechanisms for Tangible User Interfaces. Brygg Anders Ullmer, Bachelor of Science, University of Illinois, Urbana-Champaign, Illinois, January 1995. Submitted to the Program in Media Arts and Sciences, School of Architecture and Planning, in partial fulfillment of the requirements for the degree of Master of Science in Media Arts and Sciences at the Massachusetts Institute of Technology, June 1997. © Massachusetts Institute of Technology, 1997. All Rights Reserved. Author: Brygg Anders Ullmer, Program in Media Arts and Sciences, May 9, 1997. Certified by: Hiroshi Ishii, Associate Professor of Media Arts and Sciences, Thesis Supervisor. Accepted by: Stephen A. Benton, Chair, Departmental Committee for Graduate Students, Program in Media Arts and Sciences. Abstract: Current human-computer interface design is dominated by the graphical user interface approach, where users interact with graphical abstractions of virtual interface devices through a few general-purpose input "peripherals." The thesis develops models and mechanisms for "tangible user interfaces" – user interfaces
  • Emerging Frameworks for Tangible User Interfaces


    In “Human-Computer Interaction in the New Millennium,” John M. Carroll, ed.; © Addison-Wesley, August 2001, pp. 579-601. Emerging Frameworks for Tangible User Interfaces. Brygg Ullmer and Hiroshi Ishii. ABSTRACT: For more than thirty years, people have relied primarily on screen-based text and graphics to interact with computers. Whether the screen is placed on a desk, held in one’s hand, worn on one’s head, or embedded in the physical environment, the screen has cultivated a predominantly visual paradigm of human-computer interaction. In this chapter, we discuss a growing space of interfaces in which physical objects play a central role as both physical representations and controls for digital information. We present an interaction model and key characteristics for such “tangible user interfaces,” and explore these characteristics in a number of interface examples. This discussion supports a newly integrated view of both recent and previous work, and points the way towards new kinds of computationally-mediated interfaces that more seamlessly weave together the physical and digital worlds. INTRODUCTION: The last decade has seen a wave of new research into ways to link the physical and digital worlds. This work has led to the identification of several major research themes, including augmented reality, mixed reality, ubiquitous computing, and wearable computing. At the same time, a number of interfaces have begun to explore the relationship between physical representation and digital information, highlighting kinds of interaction that are not readily described by these existing frameworks. Fitzmaurice, Buxton, and Ishii took an important step towards describing a new conceptual framework with their discussion of “graspable user interfaces” [Fitzmaurice 95].
  • Tangible User Interfaces: Past, Present, and Future Directions by Orit Shaer and Eva Hornecker Contents


    Foundations and Trends® in Human–Computer Interaction, Vol. 3, Nos. 1–2 (2009) 1–137. © 2010 O. Shaer and E. Hornecker. DOI: 10.1561/1100000026. Tangible User Interfaces: Past, Present, and Future Directions. By Orit Shaer and Eva Hornecker. Contents: 1 Introduction 3; 2 Origins of Tangible User Interfaces 6; 2.1 Graspable User Interface 7; 2.2 Tangible Bits 8; 2.3 Precursors of Tangible User Interfaces 10; 3 Tangible Interfaces in a Broader Context 14; 3.1 Related Research Areas 14; 3.2 Unifying Perspectives 17; 3.3 Reality-Based Interaction 19; 4 Application Domains 22; 4.1 TUIs for Learning 23; 4.2 Problem Solving and Planning 27; 4.3 Information Visualization 31; 4.4 Tangible Programming 33; 4.5 Entertainment, Play, and Edutainment 36; 4.6 Music and Performance 39; 4.7 Social Communication 43; 4.8 Tangible Reminders and Tags 44; 5 Frameworks and Taxonomies 46; 5.1 Properties of Graspable User Interfaces 47; 5.2 Conceptualization of TUIs and the MCRit Interaction Model 48; 5.3 Classifications of TUIs 49; 5.4 Frameworks on Mappings: Coupling the Physical with the Digital 51; 5.5 Tokens and Constraints 54; 5.6 Frameworks for Tangible and Sensor-Based Interaction 56; 5.7 Domain-Specific Frameworks 59; 6 Conceptual Foundations 62; 6.1 Cuing Interaction: Affordances, Constraints, Mappings and Image Schemas 62; 6.2 Embodiment and Phenomenology 64; 6.3 External Representation and Distributed Cognition 66; 6.4 Two-Handed Interaction 69; 6.5 Semiotics 70; 7 Implementation Technologies 73; 7.1 RFID 74; 7.2 Computer Vision 75; 7.3 Microcontrollers, Sensors, and Actuators 77; 7.4 Comparison
  • Design of a Tangible User Interface for Task Management at Home


    Meztli Sabina Morales Sanchez. Design of a Tangible User Interface for Task Management at Home. Master's thesis in Interaction Design. Supervisor: Anders-Petter Andersson. June 2019. Norwegian University of Science and Technology, Faculty of Architecture and Design, Department of Design. This thesis is the conclusion of a Master's Degree in Interaction Design in the Department of Design at the Norwegian University of Science and Technology. The thesis is the product of research begun in August 2018 and finishes with the development of a prototype in May 2019. I would like to thank my supervisor Anders-Petter Andersson for inspiration and for being available for guidance and feedback. I would also like to thank my husband for providing emotional support and his programming skills that made me feel confident enough to embrace a challenge like this. Finally, I want to say thank you to the participants who opened their houses and allowed me to know a bit about their lives. Abstract: The aim of this thesis was to design a tangible user interface (TUI) for the management of tasks at home. There are multiple existing solutions that families already use to organise their activities and chores at home.
  • Tangible User Interfaces Introduction


    Tangible User Interfaces – CHI 2006 workshop (draft: please do not distribute). Tangible User Interfaces. Hiroshi Ishii, Tangible Media Group, MIT Media Laboratory. Introduction: Where the sea meets the land, life has blossomed into a myriad of unique forms in the turbulence of water, sand, and wind. At another seashore between the land of atoms and the sea of bits, we are now facing the challenge of reconciling our dual citizenships in the physical and digital worlds. Our visual and auditory sense organs are steeped in the sea of digital information, but our bodies remain imprisoned in the physical world. Windows to the digital world are confined to flat square screens and pixels, or "painted bits." Unfortunately, one cannot feel and confirm the virtual existence of this digital information through one's hands and body. Imagine an iceberg, a floating mass of ice in the ocean. That is the metaphor of Tangible User Interfaces. A Tangible User Interface gives physical form to digital information and computation, salvaging the bits from the bottom of the water, setting them afloat, and making them directly manipulatable with human hands. From GUI to TUI: People have developed sophisticated skills for sensing and manipulating their physical environments. However, most of these skills are not employed in interaction with the digital world today. A Tangible User Interface (TUI) is built upon those skills and situates the physically-embodied digital information in a physical space. Its design challenge is a seamless extension of the physical affordance of the objects into the digital domain (Ishii and Ullmer, 1997; Ullmer and Ishii, 2000).
  • A Framework for Tangible Gesture Interactive Systems


    Machines 2015, 3, 173-207; doi:10.3390/machines3030173 OPEN ACCESS machines ISSN 2075-1702 www.mdpi.com/journal/machines/ Article Move, Hold and Touch: A Framework for Tangible Gesture Interactive Systems Leonardo Angelini 1,*, Denis Lalanne 2, Elise van den Hoven 3,4,5, Omar Abou Khaled 1 and Elena Mugellini 1 1 HumanTech Institute, University of Applied Sciences and Arts Western Switzerland, Fribourg 1700, Switzerland; E-Mails: [email protected] (O.A.K.); [email protected] (E.M.) 2 Human-IST research center, University of Fribourg, Fribourg 1700, Switzerland; E-Mail: [email protected] 3 Faculty of Design, Architecture & Building, University of Technology Sydney, Ultimo NSW 2007, Australia; E-Mail: [email protected] 4 Industrial Design Department, Eindhoven University of Technology, Eindhoven 5612 AZ, The Netherlands 5 Duncan of Jordanstone College of Art & Design, University of Dundee, Nethergate, Dundee DD1 4HN, UK * Author to whom correspondence should be addressed; E-Mail: [email protected]; Tel.: +41-(0)-26-429-6745. Academic Editor: David Mba Received: 15 June 2015 / Accepted: 5 August 2015 / Published: 18 August 2015 Abstract: Technology is spreading in our everyday world, and digital interaction beyond the screen, with real objects, allows taking advantage of our natural manipulative and communicative skills. Tangible gesture interaction takes advantage of these skills by bridging two popular domains in Human-Computer Interaction, tangible interaction and gestural interaction. In this paper, we present the Tangible Gesture Interaction Framework (TGIF) for classifying and guiding works in this field. We propose a classification of gestures according to three relationships with objects: move, hold and touch.
  • Tangible User Interface (TUI) - I Digital Data and Information by Prof


    D’source, Digital Learning Environment for Design - www.dsource.in. Design Course: Tangible User Interface (TUI) - I, Digital Data and Information, by Prof. Keyur Sorathia, DoD, IIT Guwahati. Source: http://www.dsource.in/course/tangible-user-interface-tui-i. Contents: 1. Introduction; 2. About; 3. Evolution; 4. Major Considerations; 5. Basic Model; 6. Types; 7. Key Question; 8. References; 9. Contact Details. Introduction: We live in a real world, surrounded by real people and objects, and interact with them in a real space; however, most of our digital data and information is restricted to a set of metaphors such as the keyboard, mouse, and screen. We have relied mainly on screen-based text and graphics to interact with computers, specifically with digital data or information, for more than three decades, whether through a wall-mounted screen, hand-held device, head-mounted display, etc. Interfaces are not designed to immerse into spaces and objects; instead they force people to interact with screen-based metaphors. We as human beings have a great ability to perform gestures and speech and to remember interactions with natural objects; however, computers do not use these mediums. They ask users to sit on a chair, play with a keyboard, and learn to use a mouse, drop-downs, and software to access even a tiny set of information.
  • Exploring the Use of Tangible User Interfaces for Human-Robot Interaction: a Comparative Study


    University of Calgary, PRISM: University of Calgary's Digital Repository, Science Research & Publications, 2007-09-26. Exploring the Use of Tangible User Interfaces for Human-Robot Interaction: A Comparative Study. Guo, Cheng; Sharlin, Ehud. http://hdl.handle.net/1880/45640. Downloaded from PRISM: https://prism.ucalgary.ca. Abstract: In this paper we suggest the use of tangible user interfaces (TUIs) for human-robot interaction (HRI) applications. We discuss the potential benefits of this approach while focusing on low-level-of-autonomy tasks. We present an experimental robotic interaction testbed we implemented to support our investigation. We used the testbed to explore two HRI-related task sets: robotic navigation control and robotic posture control. We discuss the implementation of these two task sets using an AIBO robot dog. Both tasks were also mapped to two different robotic control interfaces: a keypad interface, which resembles the interaction approach common in HRI, and a gesture input mechanism based on Nintendo Wiimotes and Nunchuks. The necessity to perform high-level robotic actions through the composition of low-level actions is not ideal. Depending on the low-level set of interactions, the overall experience can be unnatural and error prone, and can impose a need for advanced user training. We, like others, are exploring clearer and more intuitive spatial mappings between interface and robot to facilitate intuitive, high-level robotic interfaces. As the level of task difficulty increases, it is ideal if the operator spends more time on high-level problem solving and task planning than on low-level robot operations. Intuitive interfaces would allow users to focus on the content of their human-robot interaction tasks rather
  • Finger Identification-Based Hand Gestures and Point Cloud-Based 3D Motion Gestures for Natural User Interfaces


    Finger identification-based hand gestures and point cloud-based 3D motion gestures for Natural User Interfaces. Author: Unseok Lee. Year: 2016. Other title (Japanese): ナチュラルインタフェースを目指した指認識に基づいたハンドジェスチャとポイントクラウドに基づく3次元モーションジェスチャ. Degree-granting university: University of Tsukuba (筑波大学). Academic year: 2015. Report number: 12102甲第7703号. URL: http://hdl.handle.net/2241/00143823. Graduate School of Systems and Information Engineering, University of Tsukuba, March 2016. Abstract: In recent years, much research on 2D/3D gestures has been proposed for designing Natural User Interfaces (NUIs). In NUI, the needs for communication methods have evolved, aiming at more natural ways to communicate with computers. Recently, with developments in technology, the direction for designing NUIs is changing rapidly; our body has become the interface for communicating with computers. This is the new trend in designing interfaces. On this point, we focused on hand gestures and 3D motion gestures to design more natural interfaces. Our research focused on how we can design more natural interfaces with powerful and robust gesture recognition. Our research is composed of two main parts: finger identification-based hand gestures and point cloud-based 3D motion gestures. We aim at making the NUI easier to use for both beginners and experts, and at designing interactions that occur as naturally as possible, without having users go through an artificial step such as training. Finger identification-based hand gestures aim at designing natural hand gestures with robust recognition.