Interaction Aspects of Wearable Computing for Human Communication

File type: PDF, 16 pages, 1,020 KB

Interaction Aspects of Wearable Computing for Human Communication
Doctoral Thesis 2006:60
Mikael Drugge
Media Technology Research Group, Department of Computer Science and Electrical Engineering, Luleå University of Technology, SE-971 87 Luleå, Sweden
December 2006
Supervisor: Ph.D. Peter Parnes, Luleå University of Technology

Abstract

This thesis presents the use of wearable computers for aiding human communication over a distance, focusing on interaction aspects that need to be resolved in order to realize this goal. As wearable computers by definition are highly mobile, always on, and always accessible, the ability to communicate becomes independent of place, time and situation. This also imposes new requirements on the user interface of the wearable computer, calling for natural and unobtrusive interaction with the user.

One of the key challenges in wearable computing today is to streamline the user's interaction so that it is tailored for the situation at hand. A user interface that takes too much effort to use, interrupts, or requires more than a minimum of attention will inevitably hamper the user's ability to perform tasks in real life. At the same time, human communication involves effort, interruptions and paying attention, so the key is to find a balance where wearable computers can aid human communication without being intrusive. To design user interfaces supporting this, we need to know what roles different aspects of interaction have in the field of wearable computing.

In this thesis, the use of wearable computing for aiding human communication is explored around three aspects of interaction. The first aspect deals with how information can be conveyed by the wearable computer user, allowing a user to retrieve advice and guidance from experts, and remote persons to share experiences over a distance. The thesis presents findings of using wearable computing for sharing knowledge and experience, both for informal exchange among work colleagues and for enabling more efficient communication among health-care personnel. The second aspect is based on findings from these trials and concerns how the wearable computer interacts with the user. As the user performs tasks in the real world, it is important to determine how different methods of notifying the user affect her attention and performance, in order to design interfaces that are efficient yet pleasant to use. The thesis presents user studies examining the impact of different methods of interruption, and provides guidelines for how to make notifications less intrusive. The third and final aspect considers how the user's physical interaction with the wearable computer can be improved. The thesis presents rapid prototyping of systems employing user-centric design. Furthermore, a framework for ubiquitous multimedia communication is presented, enabling wearable computers to be dynamically configurable and to utilize resources in the environment to supplement the user's equipment.

All in all, the thesis presents how wearable communications systems can be developed and deployed, how their human-computer interaction should be designed for unobtrusive operation, and how they can come to practical use in real world situations.

Contents

Abstract
Preface
Publications
Acknowledgments
1 Thesis Introduction
  1.1 Introduction
  1.2 Thesis Organization
  1.3 Background and Motivation
    1.3.1 Wearable Computing
    1.3.2 Ubiquitous and Pervasive Computing
    1.3.3 Video Conferencing and E-meetings
    1.3.4 Mobile E-meetings
    1.3.5 Motivation of Thesis
  1.4 Research Questions
  1.5 Scope and Delimitation of the Thesis
  1.6 Research Methodology
  1.7 Summary of Included Publications
  1.8 Wearable Computing for Human Communication
    1.8.1 Mobile E-Meetings through Wearable Computing
    1.8.2 Managing Interruptions and Notifications
    1.8.3 Prototyping and Deploying Mobile E-Meeting Systems
  1.9 Discussion
    1.9.1 Future Research Directions
    1.9.2 Conclusions
  1.10 Personal Contribution
2 Sharing Experience and Knowledge with Wearable Computers
  2.1 Introduction
    2.1.1 Environment for Testing
  2.2 Related Work
  2.3 The Mobile User
    2.3.1 Hardware Equipment
    2.3.2 Software Solution
  2.4 Beyond Communication
    2.4.1 Becoming a Knowledgeable User
    2.4.2 Involving External People in Meetings
    2.4.3 When Wearable Computer Users Meet
  2.5 Evaluation
    2.5.1 The Importance of Text
    2.5.2 Camera and Video
    2.5.3 Microphone and Audio
    2.5.4 Transmission of Knowledge
  2.6 Conclusions
    2.6.1 Future Work
  2.7 Acknowledgements
3 Experiences of Using Wearable Computers for Ambient Telepresence and Remote Interaction
  3.1 Introduction
    3.1.1 Related Work
  3.2 Everyday Telepresence
  3.3 Wearable Computers
  3.4 Experiences of Telepresence
    3.4.1 User Interface Problems
    3.4.2 Choice of Media for Communicating
  3.5 Evaluation
    3.5.1 Time for Setup and Use
    3.5.2 Different Levels of Immersion
    3.5.3 Appearance and Aesthetics
    3.5.4 Remote Interactions made Possible
    3.5.5 Summary
  3.6 Conclusions
    3.6.1 Future Work
  3.7 Acknowledgments
4 Methods for Interrupting a Wearable Computer User
  4.1 Introduction
    4.1.1 Related Work
  4.2 Experiment
    4.2.1 Real World Task
    4.2.2 Interruption Task
    4.2.3 Combining the Tasks
    4.2.4 Treatments
  4.3 User Study
    4.3.1 Test Session
    4.3.2 Apparatus
  4.4 Results
    4.4.1 Comparison with Base Cases
    4.4.2 Pairwise Comparison of Treatments
    4.4.3 Comparison with Original Study
    4.4.4 Subjective Comments
  4.5 Conclusions
    4.5.1 Future Work
  4.6 Acknowledgments
5 Using the "HotWire" to Study Interruptions in Wearable Computing Primary Tasks
  5.1 Introduction
    5.1.1 Motivation
    5.1.2 Outline
  5.2 Related Work
  5.3 Experiment
    5.3.1 Primary Task
    5.3.2 Interruption Task
    5.3.3 Methods for Handling Interruptions
  5.4 User Study
    5.4.1 Apparatus
  5.5 Results
    5.5.1 Time
    5.5.2 Contacts
    5.5.3 Error rate
    5.5.4 Average age
  5.6 Evaluating the apparatus
  5.7 Conclusions
    5.7.1 Future Work
  5.8 Acknowledgments
6 Wearable Systems in Nursing Home Care: Prototyping Experience
  6.1 Introduction
  6.2 Scoping the Project
  6.3 Paper Prototyping
    6.3.1 Paper, Pen, and Plastic
    6.3.2 Paper Prototyping Benefits
  6.4 Moving to Multimodal Devices
    6.4.1 Wearable Prototype
    6.4.2 Communication Application
    6.4.3 Wizard of Oz Testing
    6.4.4 Feedback From the Nurses
  6.5 Final Remarks
  6.6 Acknowledgments
7 Enabling Multimedia
Recommended publications
  • GestTrack3D™ Toolkit
    GestTrack3D™ Toolkit: A Collection of 3D Vision Trackers for Touch-Free User Interface and Game Control
    Control interactive displays and digital signs from a distance. Navigate "PrimeSense™-like" 3D game worlds. Interact with virtually any computer system without ever touching it. GestureTek, the inventor and multiple patent holder of video gesture control using 2D and 3D cameras, introduces GestTrack3D™, our patented, cutting-edge, 3D gesture control system for developers, OEMs and public display providers. GestTrack3D eliminates the need for touch-based accessories like a mouse, keyboard, handheld controller or touch screen when interacting with an electronic device. Working with nearly any Time of Flight camera to precisely measure the location of people's hands or body parts, GestTrack3D's robust tracking enables device control through a wide range of gestures and poses. GestTrack3D is the perfect solution for accurate and reliable off-screen computer control in interactive environments such as boardrooms, classrooms, clean rooms, stores, museums, amusement parks, trade shows and rehabilitation centres.
    The Science Behind the Software
    GestureTek has developed unique tracking and gesture recognition algorithms to define the relationship between computers and the people using them. With 3D cameras and our patented 3D computer vision software, computers can now identify, track and respond to fingers, hands or full-body gestures. The system comes with a depth camera and SDK (including sample code) that makes the x, y and z coordinates of up to ten hands available in real time. It also supports multiple PC development environments and includes a library of one-handed and two-handed gestures and poses.
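    The SDK described above is said to expose the x, y and z coordinates of up to ten hands in real time. As a hedged illustration only (the real GestTrack3D API is not shown in this excerpt, so `poll_hands()` and the millimetre coordinate convention are assumptions), the sketch below shows how such a hand stream might drive a simple touch-free "push to select" gesture.

    ```python
    # Hedged sketch: consuming per-hand (x, y, z) samples, as a GestTrack3D-style
    # SDK is described to provide, and flagging a forward "push" when a hand moves
    # toward the display by more than a threshold within a short window.
    # poll_hands() is a stand-in for the vendor SDK call, not its real API.
    from collections import defaultdict, deque

    PUSH_MM = 120   # forward travel (mm) that counts as a push
    WINDOW = 15     # samples (~0.5 s at 30 Hz) to look back over

    def poll_hands():
        """Stand-in for the SDK: yields (hand_id, x, y, z) tuples in millimetres."""
        for t in range(60):
            yield (0, 10.0 * t, 400.0, 1500.0 - 10.0 * t)  # hand 0 moving toward the screen
            yield (1, -200.0, 350.0, 1600.0)                # hand 1 holding still

    history = defaultdict(lambda: deque(maxlen=WINDOW))
    for hand_id, x, y, z in poll_hands():
        track = history[hand_id]
        track.append(z)
        if len(track) == WINDOW and track[0] - track[-1] > PUSH_MM:
            print(f"hand {hand_id}: push detected at ({x:.0f}, {y:.0f})")
            track.clear()
    ```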
  • Literature Study Concerning Wearable Computers for Use in Process Plants
    Literature study: Wearable Control Room
    Literature study concerning wearable computers for use in process plants
    This report is a literature study concerning different aspects of wearable computers. The report is meant to be used mainly as background information for defining relevant projects within the program "wearable control room", as well as an introduction to this research topic for Ph.D. students and other interested readers.
    Introduction
    This report presents the state of the research concerning wearable computers with focus on use within process plants. In addition to wearable computers, the report focuses on human factors as well as the organisational and social aspects of substituting the existing centralised control room with distributed wearable computers. Research related to wearable computers has existed for several years, but most of the research has concentrated on other aspects and goals than developing a wearable control room for process plants. A wearable computer can generally be defined as (Bass 1997):
    • it may be used while the wearer is in motion;
    • it may be used while one or both hands are free or occupied with other tasks;
    • it exists within the corporeal envelope of the user, i.e., it should not be merely attached to the body but become an integral part of the person's clothing;
    • it must allow the user to maintain control;
    • it must exhibit constancy, in the sense that it should be constantly available.
    The following chapters outline the state of work related to different topics, applications and finally, universities and research institutions working with wearable computers. Chapter 2 presents a short review of equipment available on the market.
  • Tutorial on Designing for Wearability
    Tutorial on Designing for Wearability
    Francine Gemperle and Peter Sellar
    Carnegie Mellon University, Pittsburgh, PA 15213 USA
    [email protected], [email protected]
    http://www.wearablegroup.org
    ABSTRACT
    At the Second International Symposium on Wearable Computing we presented a paper titled Design for Wearability [1]. In this paper we discussed a set of guidelines for designing wearable products to fit the three-dimensional shapes of the dynamic human body. This paper also presented a set of wearable forms to be used as a reference tool for the creation of wearable products. In this tutorial we will share our process for both creating a wearable computer housing to fit the human body and designing the component placement on a printed circuit board to meet the complex and organic shape of the computer housing. This tutorial will focus on an iterative and interdisciplinary design process. We will present a case study application of the Design for Wearability work. We will do an exercise creating wearable shapes for computers and negotiating those shapes with the internal components of a computer.
    The objective of this workshop will be to familiarize the group with two things: the role of Industrial Design in an interdisciplinary design process, and how to Design for Wearability. The intended audience for this tutorial is anyone who is managing or participating in an interdisciplinary development team designing wearable hardware. This tutorial will also be valuable for anyone who is interested in the role of industrial design in the development of a wearable computer. Some knowledge of the field of wearable computing will be helpful. For the exercise, participants should be comfortable working with their hands and talking in a group.
    INSTRUCTORS
    Francine Gemperle has been a Design Researcher with Carnegie Mellon's Wearable [...]
  • PROJECTION – VISION SYSTEMS: Towards a Human-Centric Taxonomy
    PROJECTION – VISION SYSTEMS: Towards a Human-Centric Taxonomy
    William Buxton, Buxton Design, www.billbuxton.com (Draft of May 25, 2004)
    ABSTRACT
    As their name suggests, "projection-vision systems" are systems that utilize a projector, generally as their display, coupled with some form of camera/vision system for input. Projection-vision systems are not new. However, recent technological developments, research into usage, and novel problems emerging from ubiquitous and portable computing have resulted in a growing recognition that they warrant special attention. Collectively, they represent an important, interesting and distinct class of user interface. The intent of this paper is to present an introduction to projection-vision systems from a human-centric perspective. We develop a number of dimensions according to which they can be characterized. In so doing, we discuss older systems that paved the way, as well as ones that are just emerging. Our discussion is oriented around issues of usage and user experience. Technology comes to the fore only in terms of its affordances in this regard. Our hope is to help foster a better understanding of these systems, as well as provide a foundation that can assist in making more informed decisions in terms of next steps.
    INTRODUCTION
    I have a confession to make. At 56 years of age, as much as I hate losing my hair, I hate losing my vision even more. I tell you this to explain why being able to access the web on my smart phone, PDA, or wrist watch provokes nothing more than a yawn from me. Why should I care? I can barely read the hands on my watch, and can't remember the last time that I could read the date on it without my glasses.
  • A 0.13 μm CMOS System-on-Chip for a 512 × 424 Time-of-Flight Image Sensor
    IEEE Journal of Solid-State Circuits, Vol. 50, No. 1, January 2015
    A 0.13 μm CMOS System-on-Chip for a 512 × 424 Time-of-Flight Image Sensor With Multi-Frequency Photo-Demodulation up to 130 MHz and 2 GS/s ADC
    Cyrus S. Bamji, Patrick O'Connor, Tamer Elkhatib, Swati Mehta, Barry Thompson, Lawrence A. Prather, Dane Snow, Onur Can Akkaya, Andy Daniel, Andrew D. Payne, Travis Perry, Mike Fenton, and Vei-Han Chan
    Abstract—We introduce a 512 × 424 time-of-flight (TOF) depth image sensor designed in a TSMC 0.13 μm LP 1P5M CMOS process, suitable for use in Microsoft Kinect for XBOX ONE. The 10 μm × 10 μm pixel incorporates a TOF detector that operates using the quantum efficiency modulation (QEM) technique at high modulation frequencies of up to 130 MHz, achieves a modulation contrast of 67% at 50 MHz and a responsivity of 0.14 A/W at 860 nm. The TOF sensor includes a 2 GS/s 10 bit signal path, which is used for the high ADC bandwidth requirements of the system that requires many ADC conversions per frame.
    Generally, 3-D acquisition techniques can be classified into two broad categories: geometrical methods [1], [2], which include stereo and structured light, and electrical methods, which include ultrasound or the optical TOF described herein. The operating principle of optical TOF is based on measuring the total time required for a light signal to reach an object, be reflected by the object, and subsequently be detected by a TOF pixel array. Optical TOF methods can be classified in two subcategories: [...]
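    The excerpt above explains that optical TOF measures the total round-trip time of a light signal. For continuous-wave sensors of this kind, that time is typically recovered from the phase shift of the modulated illumination; the snippet below is only a generic worked example of that standard relation, using the 50 MHz and 130 MHz modulation frequencies quoted above, and is not code from the paper.

    ```python
    # Generic continuous-wave time-of-flight relation (illustrative, not from the paper):
    # the reflected signal lags the emitted one by a phase phi, so
    #   round-trip time  t = phi / (2 * pi * f_mod)
    #   distance         d = c * t / 2 = c * phi / (4 * pi * f_mod)
    import math

    C = 299_792_458.0  # speed of light, m/s

    def depth_from_phase(phi_rad: float, f_mod_hz: float) -> float:
        """Distance in metres for a measured phase shift at one modulation frequency."""
        return C * phi_rad / (4.0 * math.pi * f_mod_hz)

    def unambiguous_range(f_mod_hz: float) -> float:
        """Phase wraps at 2*pi, so measured ranges repeat every c / (2 * f_mod)."""
        return C / (2.0 * f_mod_hz)

    for f in (50e6, 130e6):  # frequencies quoted in the excerpt
        print(f"{f/1e6:.0f} MHz: pi/2 phase -> {depth_from_phase(math.pi / 2, f):.3f} m, "
              f"unambiguous range {unambiguous_range(f):.2f} m")
    ```

    Measuring at more than one modulation frequency, as the multi-frequency demodulation in the title implies, allows the wrapped phases to be combined so that the short unambiguous range of the highest frequency is extended.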
  • Real-Time Depth Imaging
    TU Berlin, Fakultät IV, Computer Graphics
    Real-time depth imaging
    Dissertation submitted by Uwe Hahne (Diplom-Mediensystemwissenschaftler), from Kirchheim unter Teck, Germany, and approved by Faculty IV (Elektrotechnik und Informatik) of Technische Universität Berlin for the degree of Doktor der Ingenieurwissenschaften (Dr.-Ing.). Doctoral committee: chair Prof. Dr.-Ing. Olaf Hellwich; reviewers Prof. Dr.-Ing. Marc Alexa and Prof. Dr. Andreas Kolb. Defended 3 May 2012, Berlin.
    For my family.
    Abstract
    This thesis depicts approaches toward real-time depth sensing. While humans are very good at estimating distances and hence are able to smoothly control vehicles and their own movements, machines often lack the ability to sense their environment in a manner comparable to humans. This discrepancy prevents the automation of certain job steps. We assume that further enhancement of depth sensing technologies might change this fact. We examine to what extent time-of-flight (ToF) cameras are able to provide reliable depth images in real time. We discuss current issues with existing real-time imaging methods and technologies in detail and present several approaches to enhance real-time depth imaging. We focus on ToF imaging and the utilization of ToF cameras based on the photonic mixer device (PMD) principle. These cameras provide per-pixel distance information in real time. However, the measurement contains several error sources. We present approaches to indicate measurement errors and to determine the reliability of the data from these sensors. If the reliability is known, combining the data with other sensors will become possible. We describe such a combination of ToF and stereo cameras that enables new interactive applications in the field of computer graphics.
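    The abstract notes that once the reliability of the ToF data is known, it can be combined with other sensors such as a stereo camera. The fragment below is only an illustrative confidence-weighted fusion of two depth maps, not the method developed in the thesis; the confidence maps and their scaling are assumptions for the example.

    ```python
    # Illustrative confidence-weighted fusion of a ToF depth map with a stereo depth
    # map (assumed setup for illustration, not the thesis's algorithm).
    import numpy as np

    def fuse_depth(tof_depth, tof_conf, stereo_depth, stereo_conf, eps=1e-6):
        """Per-pixel weighted average; pixels where both confidences are ~0 become NaN."""
        tof_conf = np.clip(tof_conf, 0.0, 1.0)
        stereo_conf = np.clip(stereo_conf, 0.0, 1.0)
        total = tof_conf + stereo_conf
        fused = (tof_conf * tof_depth + stereo_conf * stereo_depth) / np.maximum(total, eps)
        fused[total < eps] = np.nan
        return fused

    # Toy 2x2 example: the ToF camera is trusted on the left half, stereo on the right.
    tof = np.array([[1.00, 1.00], [2.00, 2.00]])
    stereo = np.array([[1.20, 1.20], [2.40, 2.40]])
    conf_tof = np.array([[0.9, 0.1], [0.9, 0.1]])
    conf_stereo = np.array([[0.1, 0.9], [0.1, 0.9]])
    print(fuse_depth(tof, conf_tof, stereo, conf_stereo))
    ```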
  • WT41N0 Wearable Computer Spec Sheet
    PRODUCT SPEC SHEET
    WT41N0: THE NEXT GENERATION IN RUGGED WEARABLE VOICE AND DATA MOBILE COMPUTERS
    Easily increase productivity and eliminate errors in your warehouse or distribution center with next generation hands-free voice and data. With the major increase in package volume driven by multi-channel support, demands for the ultimate in customer service, plus increasing regulations for traceability, you need to move more items through your warehouse or distribution center and capture more information about those items than ever before. With the WT41N0 wearable mobile computer on the arms of your workers, you will. Now, workers can keep their hands and eyes on the materials they are handling; no time is lost handling paper or a handheld mobile device. Add a ring-style scanner worn on a finger and workers can capture 1-D and 2-D bar codes on the fly, able to document the path of that item for full traceability, and verify that the right items are in the right orders, shipped to the right customers at the right time. The result? Improved customer satisfaction and loyalty. More throughput with the same staff, driving staff utilization up. And less time spent capturing more information on item movement, driving the cost of compliance with traceability regulations down.
    FEATURES
    • Ergonomic hands-free wearable design: award-winning ergonomic design increases user comfort and productivity.
    • High-performance next-generation platform: best-in-class dual core processor provides the power to run virtually any enterprise application.
    • 802.11a/b/g/n WLAN: easily connects to existing WLAN for fast integration; 802.11n and support for Zebra's advanced WLAN features greatly improves [...]
  • Classification of Smart Environment Scenarios in Combination with a Human-Wearable-Environment-Communication Using Wireless Connectivity
    CLASSIFICATION OF SMART ENVIRONMENT SCENARIOS IN COMBINATION WITH A HUMAN-WEARABLE-ENVIRONMENT-COMMUNICATION USING WIRELESS CONNECTIVITY
    Kristof Friess and Prof. Dr. Dr. h.c. Volker Herwig
    Department of Applied Computer Science, University of Applied Science Erfurt, Erfurt, Germany
    [email protected], [email protected]
    ABSTRACT
    The development of computer technology has been rapid. Not so long ago, the first computers were large and bulky; now, the latest generation of smartphones has a calculation power that would have been considered supercomputer-class in 1990. For a smart environment, person recognition and re-recognition is an important topic. The spread of new technologies like wearable computing offers a new approach to the field of person recognition and re-recognition. This article lays out the idea of identifying and re-identifying wearable computing devices by listening to their wireless communication connectivity, such as Wi-Fi and Bluetooth, and building a classification of interaction scenarios for the combination of human, wearable and environment.
    KEYWORDS
    Wireless Network, Smart Environment, Wearable-Computing, UbiComp, Interaction-Scenarios
    1. INTRODUCTION
    With the growing market of worn computer systems like smartphones and smartwatches, in short wearables, the possible interaction between human and computer has changed. Soon, the interaction between human, computer and the environment will also change. There are unlimited use cases where a human uses the computer as an assistant for filtering data, processing information or storing context information. Now and in the near future, many use cases are coming up where not the device itself helps the human to become smarter, but the environment, based on its knowledge about the human, acts smartly.
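    The article's idea is to recognise and re-recognise people by passively observing the wireless identifiers their wearables emit. The sketch below is purely illustrative: the capture layer is omitted, the observation format is an assumption, and real collection of such identifiers is subject to privacy law. It only shows how repeated sightings of the same hardware address across sensing zones could be grouped into re-identification events.

    ```python
    # Illustrative sketch: grouping passively observed wireless identifiers
    # (e.g. Wi-Fi or Bluetooth MAC addresses) into re-identification events.
    # The Sighting records are assumed; actual capture code is omitted.
    from collections import defaultdict
    from typing import NamedTuple

    class Sighting(NamedTuple):
        timestamp: float   # seconds since epoch
        mac: str           # hardware address as observed
        location: str      # identifier of the sensing zone

    def reidentify(sightings):
        """Return devices seen in more than one zone, with their zones and visit count."""
        by_device = defaultdict(list)
        for s in sightings:
            by_device[s.mac].append(s)
        report = {}
        for mac, visits in by_device.items():
            zones = {v.location for v in visits}
            if len(zones) > 1:   # the same device re-recognised somewhere else
                report[mac] = {"zones": sorted(zones), "visits": len(visits)}
        return report

    demo = [Sighting(0, "aa:bb:cc:dd:ee:01", "entrance"),
            Sighting(60, "aa:bb:cc:dd:ee:01", "showroom"),
            Sighting(90, "aa:bb:cc:dd:ee:02", "entrance")]
    print(reidentify(demo))
    ```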
  • Ubiquitous Computing: Trends and History
    Ubiquitous Computing: Trends and History
    Lecture 2, CSI 660, William A. Maniatty, Dept. of Computer Science, University at Albany
    Introduction
    Review: What is Ubiquitous Computing?
    • Immerses computers in a real environment.
    • Sensors support interaction with and control of the environment.
    • Limited power supply, storage, memory and bandwidth.
    • Operates unattended (much like embedded systems).
    • Devices are mobile/wireless.
    • May reside on a person (wearable computing).
    • Has special peripherals.
    • Contrast this with virtual reality, which immerses humans in a computer-generated artificial environment.
    Historical Origins and Trends
    Computers are becoming smaller and cheaper over time:
    • Originally few computers, many operators: machines expensive and large; people (relatively) cheap.
    • Trend toward more computers per person: users may not be tech savvy; even tech-savvy users have limited time; minimal intervention is required.
    People don't want to be separated from their data:
    • But spying on users upsets them
    • And can violate laws; security is important.
    • Mobility and wireless access are critical.
    Some Popular Views
    Many visions were popularized in the press:
    • First to work on it, although other visionaries preceded him
    • Entertainment industry (Ian Fleming, Gene Roddenberry)
    • Vannevar Bush's seminal article [1] "As We May Think" predicted the WWW and ubiquitous computing in 1945!
    • Vernor Vinge (retired computer science professor and science fiction writer) has interesting ubiquitous computing visions.
    • Movies: The Terminator, numerous Philip K. Dick books and screenplays (Blade Runner, Total Recall, Minority Report).
    Has been popular in the research community for over a decade.
  • Wearable Technology for Enhanced Security
    Communications on Applied Electronics (CAE), ISSN 2394-4714, Foundation of Computer Science FCS, New York, USA, Volume 5, No. 10, September 2016, www.caeaccess.org
    Wearable Technology for Enhanced Security
    Agbaje M. Olugbenga, PhD, Babcock University, Department of Computer Science, Ogun State, Nigeria
    ABSTRACT
    Wearables comprise sensors and have computational ability. Gadgets such as wristwatches, pens, and glasses with installed cameras are now available at cheap prices for users to purchase in order to monitor or secure themselves. Nigeria, faced with several kidnappings in schools and homes and abductions for ransom collection and other unlawful acts, necessitates these reviews. The success of wearable technology in medical uses prompted this research into its application to security uses. The method of research is the use of case studies and literature search. This paper takes a look at the possible applications of wearable technology to combat the cases of abduction and kidnapping in Nigeria.
    [...] Sproutling. Watches like the Apple Watch, and jewelry such as Cuff and Ringly. Taking a look at the history of computer generations up to the present, we could divide it into three main types: mainframe computing, personal computing, and ubiquitous or pervasive computing [4]. The divisions are based on the number of computers per user. Mainframe computing describes one large computer connected to many users; the second, personal computing, is one computer per person; while the term ubiquitous computing was used in 1991 by Mark Weiser. Weiser depicted a world full of embedded sensing technologies to streamline and improve life [5]. Nigeria, faced with several kidnappings in schools and homes, is in dire need of a solution.
    General Terms
  • The Challenges of Wearable Computing: Part 2
    THE CHALLENGES OF WEARABLE COMPUTING: PART 2
    Thad Starner, Georgia Institute of Technology
    WEARABLE COMPUTING PURSUES AN INTERFACE IDEAL OF A CONTINUOUSLY WORN, INTELLIGENT ASSISTANT THAT AUGMENTS MEMORY, INTELLECT, CREATIVITY, COMMUNICATION, AND PHYSICAL SENSES AND ABILITIES. MANY CHALLENGES AWAIT WEARABLE DESIGNERS. PART 2 BEGINS WITH THE CHALLENGES OF NETWORK RESOURCES AND PRIVACY CONCERNS. THIS SURVEY DESCRIBES THE POSSIBILITIES OFFERED BY WEARABLE SYSTEMS AND, IN DOING SO, DEMONSTRATES ATTRIBUTES UNIQUE TO THIS CLASS OF COMPUTING.
    Challenges
    The most immediately striking challenge in designing wearable computers is creating appropriate interfaces. However, the issues of power use, heat dissipation, networking, and privacy provide a necessary framework in which to discuss interface. Part 1 of this article covers the first two of these issues; Part 2 begins with the networking discussion.
    Networking
    As with any wireless mobile device, the amount of power and the type of services available can constrain networking. Wearable computers could conserve resources through improved coordination with the user interface. For example, the speed at which a given information packet is transferred can be bal- [...]
    [...] throughput. Another serious issue is open standards to enable interoperability between different services. For example, only one long-range radio should be necessary to provide telephony, text messaging, Global Positioning System (GPS) correction signals, and so on.
    For wearable computers, networking involves communication off body to the fixed network, on body among devices, and near body with objects near the user. Each of these three network types requires different design decisions. Designers must also consider possible interference between the networks.
    Off-body communications. Wireless communication from mobile devices to fixed infrastructure is the most thoroughly researched of these issues.
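    Starner's example above is that the transfer of a given packet can be balanced against how urgently it is needed and the power its transmission costs. The toy sketch below illustrates that trade-off only; the priorities, per-radio energy costs and battery thresholds are invented for the example and are not from the article.

    ```python
    # Toy illustration (invented numbers) of balancing packet urgency against the
    # power cost of the available radios, in the spirit of the trade-off described.
    RADIO_COST_MJ = {"long_range": 40.0, "on_body": 2.0}  # assumed energy per packet

    def choose_radio(priority: float, battery_frac: float) -> str:
        """priority and battery_frac are in [0, 1]; returns a radio name or 'defer'."""
        if priority > 0.8:
            return "long_range"            # urgent traffic always goes out immediately
        if battery_frac < 0.2 and priority < 0.5:
            return "defer"                 # cheap to wait; the battery is nearly empty
        return "on_body"                   # otherwise prefer the cheaper on-body hop

    for prio, batt in [(0.9, 0.15), (0.3, 0.15), (0.6, 0.7)]:
        print(f"priority={prio}, battery={batt:.0%} -> {choose_radio(prio, batt)}")
    ```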
  • A Smart-Dashboard Augmenting Safe & Smooth Driving
    Master Thesis, Computer Science, Thesis no: 2010:MUC:01, 02-10
    A Smart-Dashboard: Augmenting safe & smooth driving
    Muhammad Akhlaq
    School of Computing, Blekinge Institute of Technology, Box 520, SE-372 25 Ronneby, Sweden
    This thesis is submitted to the School of Computing at Blekinge Institute of Technology in partial fulfillment of the requirements for the degree of Master of Science in Computer Science (Ubiquitous Computing). The thesis is equivalent to 20 weeks of full time studies.
    Contact information: Author: Muhammad Akhlaq, Mohallah Kot Ahmad Shah, Mandi Bahauddin, PAKISTAN-50400, E-mail: [email protected]. University advisor: Prof. Dr. Bo Helgeson, School of Computing, Blekinge Institute of Technology, Box 520, SE-372 25 Ronneby, Sweden, Internet: www.bth.se/com, Phone: +46 457 38 50 00, Fax: +46 457 102 45.
    ABSTRACT
    Annually, road accidents cause more than 1.2 million deaths, 50 million injuries, and US$ 518 billion of economic cost globally [1]. About 90% of the accidents occur due to human errors [2] [3] such as bad awareness, distraction, drowsiness, low training, and fatigue. These human errors can be minimized by using an advanced driver assistance system (ADAS), which actively monitors the driving environment and alerts a driver to forthcoming danger, for example adaptive cruise control, blind spot detection, parking assistance, forward collision warning, lane departure warning, driver drowsiness detection, and traffic sign recognition. Unfortunately, these systems are provided only with modern luxury cars because they are very expensive due to the numerous sensors employed. Therefore, camera-based ADAS are being seen as an alternative because a camera has much lower cost, higher availability, can be used for multiple applications, and can be integrated with other systems.
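    Among the ADAS functions listed above is forward collision warning. As a generic, hedged example of how such a warning can be derived (the 2.5 s threshold is an assumption, and this is not taken from the thesis), the time-to-collision is simply the gap to the lead vehicle divided by the closing speed.

    ```python
    # Generic time-to-collision (TTC) check for a forward collision warning;
    # the warning threshold is assumed for illustration.
    def time_to_collision(distance_m: float, closing_speed_mps: float) -> float:
        """Seconds until impact if nothing changes; infinity when not closing."""
        if closing_speed_mps <= 0.0:
            return float("inf")
        return distance_m / closing_speed_mps

    def forward_collision_warning(distance_m: float, own_speed_mps: float,
                                  lead_speed_mps: float, threshold_s: float = 2.5) -> bool:
        return time_to_collision(distance_m, own_speed_mps - lead_speed_mps) < threshold_s

    # 30 m gap, ego vehicle at 25 m/s, lead vehicle at 15 m/s -> TTC = 3.0 s, no warning yet.
    print(forward_collision_warning(30.0, 25.0, 15.0))
    ```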