A Multilingual Keyboard and Mouse Interface for Motor-Impaired Users
Total pages: 16
File type: PDF, size: 1020 KB
Recommended publications
-
Free Notification App for Android
A round-up of notification apps for Android, touching on Material Design notification guidelines, customizing Android notification alerts, listicles such as "Top 5 Best Notification Apps for Android 2020" and "11 Best Notification Apps for Android 2019", free ringtone apps ("11 Best Free Ringtone Apps for Android"), and push notifications designed for both iOS and Android phones. It also notes that foreground services are app processes that run in the background while the user is not directly interacting with the app, and that they consume battery.
-
7 35138412 1.Pdf (2.296Mb)
MAUU(D)5900 Master Thesis in Universal Design of ICT, October 2018. An Accessible Directions-based Text Entry Method Using Two-thumb Touch Typing. Linghui Ye, Department of Computer Science, Faculty of Technology, Art and Design. Master Thesis Phase III Report.
Contents: Abstract; 1. Introduction; 2. Related work; 3. The prototype; 3.1 Physical direction; 3.2 Resolving ambiguities; 3.3 Special characters; 4. Methodology; 4.1 Experimental design
-
Fast and Precise Touch-Based Text Entry for Head-Mounted Augmented Reality with Variable Occlusion
JOHN J. DUDLEY, University of Cambridge, United Kingdom; KEITH VERTANEN, Michigan Technological University, USA; PER OLA KRISTENSSON, University of Cambridge, United Kingdom
We present the VISAR keyboard: an augmented reality (AR) head-mounted display (HMD) system that supports text entry via a virtualised input surface. Users select keys on the virtual keyboard by imitating the process of single-hand typing on a physical touchscreen display. Our system uses a statistical decoder to infer users' intended text and to provide error-tolerant predictions. There is also a high-precision fall-back mechanism to support users in indicating which keys should be unmodified by the auto-correction process. A unique advantage of leveraging the well-established touch input paradigm is that our system enables text entry with minimal visual clutter on the see-through display, thus preserving the user's field-of-view. We iteratively designed and evaluated our system and show that the final iteration of the system supports a mean entry rate of 17.75 wpm with a mean character error rate less than 1%. This performance represents a 19.6% improvement relative to the state-of-the-art baseline investigated: a gaze-then-gesture text entry technique derived from the system keyboard on the Microsoft HoloLens. Finally, we validate that the system is effective in supporting text entry in a fully mobile usage scenario likely to be encountered in industrial applications of AR HMDs.
CCS Concepts: • Human-centered computing → Text input
Additional Key Words and Phrases: augmented reality, text entry
1 INTRODUCTION
Recent progress in head-mounted displays (HMDs) for augmented reality (AR), such as the Microsoft HoloLens, demonstrates the commercial potential of AR to support new forms of interaction and work in a range of industries including construction, education and health.
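The entry and error rates quoted in this abstract are the standard text-entry metrics. As a rough aid to reading them, the sketch below shows how words per minute and character error rate are conventionally computed in this literature (a word counted as five characters, errors counted via minimum string distance); the paper itself may compute them differently, and the example strings and timing are made up:

```python
# Minimal sketch of the usual text-entry metrics (not the paper's code).
# WPM treats a "word" as five characters; CER is the Levenshtein distance
# between presented and transcribed text, normalised by the longer string.

def levenshtein(a: str, b: str) -> int:
    """Minimum number of insertions, deletions and substitutions."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            curr.append(min(prev[j] + 1,                # deletion
                            curr[j - 1] + 1,            # insertion
                            prev[j - 1] + (ca != cb)))  # substitution
        prev = curr
    return prev[-1]

def words_per_minute(transcribed: str, seconds: float) -> float:
    # (|T| - 1) characters entered after the first one, five characters per word.
    return ((len(transcribed) - 1) / seconds) * 60.0 / 5.0

def character_error_rate(presented: str, transcribed: str) -> float:
    return levenshtein(presented, transcribed) / max(len(presented), len(transcribed))

if __name__ == "__main__":
    presented = "the quick brown fox"
    transcribed = "the quick brwn fox"          # one dropped character
    print(words_per_minute(transcribed, 12.0))   # 17.0 wpm
    print(character_error_rate(presented, transcribed))  # ~0.053, i.e. ~5%
```
-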
Oracle Solaris 11 Accessibility Guide for the GNOME Desktop • December 2011 • E24675_02
Oracle® Solaris 11 Accessibility Guide for the GNOME Desktop. Part No: E24675, December 2011. Copyright © 2011, Oracle and/or its affiliates. All rights reserved. Oracle America, Inc., 500 Oracle Parkway, Redwood City, CA 94065.
-
Digital Access Project
DIGITAL ACCESS PROJECT LEARNING MODULE. Computer hardware, basic digital terminology and typing skills. Code: M5BO. Prepared by: ITPIO, October 2018. This project has been funded with support from the European Commission. This publication reflects the views only of the authors, and the Commission cannot be held responsible for any use which may be made of the information contained therein.
Contents: Summary; Keywords; Module objectives; Unit 1: Using the interface of computers (Specific objectives of Unit 1; Display; Mouse and Touchpad; Keyboard; Printer)
-
Free and Open Source Software
Free and open source software Copyleft ·Events and Awards ·Free software ·Free Software Definition ·Gratis versus General Libre ·List of free and open source software packages ·Open-source software Operating system AROS ·BSD ·Darwin ·FreeDOS ·GNU ·Haiku ·Inferno ·Linux ·Mach ·MINIX ·OpenSolaris ·Sym families bian ·Plan 9 ·ReactOS Eclipse ·Free Development Pascal ·GCC ·Java ·LLVM ·Lua ·NetBeans ·Open64 ·Perl ·PHP ·Python ·ROSE ·Ruby ·Tcl History GNU ·Haiku ·Linux ·Mozilla (Application Suite ·Firefox ·Thunderbird ) Apache Software Foundation ·Blender Foundation ·Eclipse Foundation ·freedesktop.org ·Free Software Foundation (Europe ·India ·Latin America ) ·FSMI ·GNOME Foundation ·GNU Project ·Google Code ·KDE e.V. ·Linux Organizations Foundation ·Mozilla Foundation ·Open Source Geospatial Foundation ·Open Source Initiative ·SourceForge ·Symbian Foundation ·Xiph.Org Foundation ·XMPP Standards Foundation ·X.Org Foundation Apache ·Artistic ·BSD ·GNU GPL ·GNU LGPL ·ISC ·MIT ·MPL ·Ms-PL/RL ·zlib ·FSF approved Licences licenses License standards Open Source Definition ·The Free Software Definition ·Debian Free Software Guidelines Binary blob ·Digital rights management ·Graphics hardware compatibility ·License proliferation ·Mozilla software rebranding ·Proprietary software ·SCO-Linux Challenges controversies ·Security ·Software patents ·Hardware restrictions ·Trusted Computing ·Viral license Alternative terms ·Community ·Linux distribution ·Forking ·Movement ·Microsoft Open Other topics Specification Promise ·Revolution OS ·Comparison with closed -
QB-Gest: Qwerty Bimanual Gestural Input for Eyes-Free Smartphone Text Input
Linghui Ye (1), Frode Eika Sandnes (1,2), and I. Scott MacKenzie (3)
(1) Department of Computer Science, Oslo Metropolitan University, 0130 Oslo, Norway; (2) Institute of Technology, Kristiania University College, 0153 Oslo, Norway; (3) Department of Computer Science, York University, Toronto, ON M3J 1P3, Canada
Abstract. We developed QB-Gest, a bimanual text entry method based on simple gestures where users drag their thumbs in the direction of the desired letter while visualizing the Qwerty layout. In an experiment with four sessions of testing, 20 users achieved text entry rates of 11.1 wpm eyes-free and 14.1 wpm eyes-on. An expert user achieved an eyes-free rate of 24.9 wpm after 10 rounds of entering the-quick-brown-fox phrase. The method holds potential for users with low vision and certain types of reduced motor function.
Keywords: Mobile text entry · Eyes-free text · Gestures · Qwerty
1 Introduction
Smartphones have become an important tool in modern society by facilitating communication independent of time and place. Many smartphone tasks require text input, such as searching the web, sending emails, or messaging. Text input typically uses a default virtual keyboard. Yet, many users find it hard to use virtual smartphone keyboards compared to physical desktop keyboards [1], because input requires accurately hitting keys without tactile feedback. Virtual smartphone keys are smaller than physical desktop keys; so, input is both visually intensive and requires careful eye-motor coordination [2].
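The central idea, dragging a thumb toward the intended letter while visualizing the Qwerty layout, can be sketched as a direction-binning lookup. The excerpt does not give QB-Gest's actual letter groups or angle thresholds, so everything below (the eight-way binning, the left-hand letter groups, the example drag) is an assumption for illustration only; groups holding several letters would still need a disambiguation step, as the related thesis above suggests under "Resolving ambiguities":

```python
import math

# Hypothetical sketch in the spirit of QB-Gest: bin a thumb drag into one of
# eight compass directions, then look up a group of Qwerty letters.
DIRECTIONS = ["E", "NE", "N", "NW", "W", "SW", "S", "SE"]

LEFT_THUMB = {  # assumed grouping of the left half of the Qwerty layout
    "NW": "qw", "N": "er", "NE": "t",
    "W": "as",  "E": "dfg",
    "SW": "zx", "S": "cv", "SE": "b",
}

def drag_to_direction(dx: float, dy: float) -> str:
    """Bin a drag vector into one of eight directions.

    dx grows to the right and dy grows upward; raw screen coordinates
    would need their y axis flipped first."""
    angle = math.degrees(math.atan2(dy, dx)) % 360.0
    return DIRECTIONS[int((angle + 22.5) // 45) % 8]

def left_thumb_candidates(dx: float, dy: float) -> str:
    """Return the candidate letters for a left-thumb drag."""
    return LEFT_THUMB[drag_to_direction(dx, dy)]

if __name__ == "__main__":
    print(left_thumb_candidates(30, 40))  # drag up and to the right -> "NE" -> "t"
```
-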
1 Background Many Applications for Handheld Electronic Devices Depend on the User Being Able to Enter Text
ABSTRACT
When using your smartphone (or similar handheld electronic device) with one hand, you have to type with the thumb, which isn't the finger you would normally use. The more agile fingers are tied up with holding the device. This is due to the interface concept, not to any functional deficit of the human hand. You could operate these devices with the use of all fingers, and not just for typing, but that calls for a different interface paradigm, one that mirrors the ergonomic capabilities of the human hand. You engage this kind of interface where your fingers spontaneously come into contact with the device when you grasp it, not at predetermined locations that are not necessarily easy to reach. This interface dynamically maps the controls of the device to the contact patches of your fingers and generates a visual, audible or haptic tag to tell you what function a finger controls. To activate a control you engage it with the corresponding finger.
KEYWORDS
Topical† tactile interfaces, atopical* tactile interfaces, operative interaction with back and sides of device, accessibility issues with virtual controls, atypical handheld devices (disks, cylinders, torons etc.)
† characterised by operative interactions confined to defined locations
* characterised by operative interactions not confined to defined locations
1 Background
Many applications for handheld electronic devices depend on the user being able to enter text. To this end, the tactile user interfaces of these devices are configured with a miniature keyboard. The idea behind this is presumably to replicate the functionality of a standard keyboard. This may be aiming a little too high, though.
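A rough sketch of the dynamic mapping described in this abstract follows: whatever contact patches are detected when the device is grasped are paired with device functions, and each pairing is announced through a feedback tag. The patch representation, the function list and the print-based "tag" are all invented here; the abstract prescribes no concrete data model:

```python
from dataclasses import dataclass

# Illustrative only: the abstract says controls are mapped to wherever the
# fingers happen to touch; the concrete types and functions below are made up.

@dataclass
class ContactPatch:
    finger_id: int   # stable id assigned by the touch controller
    x: float         # position on the device surface (any face)
    y: float

# Hypothetical controls, assigned in priority order.
FUNCTIONS = ["select", "scroll", "back", "volume", "shutter"]

def announce(patch: ContactPatch, function: str) -> None:
    # Stand-in for the visual, audible or haptic tag mentioned in the abstract.
    print(f"finger {patch.finger_id} at ({patch.x:.0f}, {patch.y:.0f}) -> {function}")

def map_controls(patches: list[ContactPatch]) -> dict[int, str]:
    """Assign one function per detected finger, in the order the fingers touched."""
    mapping: dict[int, str] = {}
    for patch, function in zip(patches, FUNCTIONS):
        mapping[patch.finger_id] = function
        announce(patch, function)
    return mapping

if __name__ == "__main__":
    grasp = [ContactPatch(0, 10, 40), ContactPatch(1, 12, 80), ContactPatch(2, 95, 60)]
    controls = map_controls(grasp)  # three fingers -> select, scroll, back
```
-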
Interactions with Smartphones and Smartwatches: Context-Awareness, Text Entry Interfaces, and Input Beyond Touch
Interactions with Smartphones and Smartwatches: Context-Awareness, Text Entry Interfaces, and Input Beyond Touch. Thesis submitted by Rajkumar Darbar to the Indian Institute of Technology Kharagpur for the award of the degree of Master of Science (MS) by Research, under the guidance of Dr. Debasis Samanta, Computer Science and Engineering, Indian Institute of Technology Kharagpur, Kharagpur - 721 302, India, July 2016. © 2016 Rajkumar Darbar. All rights reserved.
-
[U] User's Guide
Stata User's Guide, Release 17. A Stata Press Publication, StataCorp LLC, College Station, Texas. Copyright © 1985–2021 StataCorp LLC. All rights reserved. Published by Stata Press, 4905 Lakeway Drive, College Station, Texas 77845. Typeset in TeX. ISBN-10: 1-59718-353-9; ISBN-13: 978-1-59718-353-6. The automobile dataset appearing on the accompanying media is Copyright © 1979 by Consumers Union of U.S., Inc., Yonkers, NY 10703-1057 and is reproduced by permission from CONSUMER REPORTS, April 1979.
-
Linkat and Accessibility Software
Jordi Sanchez Riera, Computer Accessibility Group, 2006.
I. INTRODUCTION
The aim of our group is to develop and adapt software applications for several types of impairments, such as visual and mobility disabilities, and to modify hardware where necessary, to obtain a product suitable for our needs. From the beginning of the Linkat project we were concerned with making the distribution accessible to people with disabilities. Our Linux distribution (Linkat) includes the default GNOME desktop accessibility programs translated into Catalan.
II. SOFTWARE INCLUDED
II.a. General GNOME applications
Fonts and Desktop themes: Fonts and desktop themes translated into Catalan. Text size and color can make a big difference in legibility for people who have low vision. The themes improve the legibility of the desktop.
GNOME Onscreen Keyboard: GOK aims to enable users to control their computer without having to rely on a standard keyboard or mouse. Many individuals have limited voluntary movements and must control the computer using alternative input methods. These input methods may be controlled by a switch.
Gnopernicus: The Gnopernicus project aims to enable users with limited vision, or no vision, to use the GNOME desktop and applications effectively, by providing automated focus tracking and fullscreen magnification. Gnopernicus aids low-vision GNOME users, and its screen reader features allow low-vision and blind users access to standard applications via speech and braille output.
JoyMouse: A program, or daemon, for Linux which converts incoming joystick data to mouse data, so that you can use your joystick for games as a mouse device.
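The JoyMouse idea, turning joystick axis readings into relative pointer movement, can be sketched in a few lines. This is not JoyMouse's own code (the excerpt only describes it as a Linux program or daemon); the dead zone, scaling and example readings below are illustrative assumptions:

```python
# Illustrative sketch of joystick-to-mouse conversion (not JoyMouse's source).
# A normalized axis reading in [-1, 1] becomes a relative pointer step; a small
# dead zone keeps a centred stick from drifting the pointer.

DEAD_ZONE = 0.15   # assumed threshold below which the stick counts as centred
MAX_SPEED = 12     # assumed maximum pointer step in pixels per update

def axis_to_step(axis_value: float) -> int:
    """Map one normalized joystick axis to a relative mouse step in pixels."""
    if abs(axis_value) < DEAD_ZONE:
        return 0
    # Rescale the remaining range so movement starts smoothly after the dead zone.
    scaled = (abs(axis_value) - DEAD_ZONE) / (1.0 - DEAD_ZONE)
    step = round(scaled * MAX_SPEED)
    return step if axis_value > 0 else -step

def joystick_to_mouse(x_axis: float, y_axis: float) -> tuple[int, int]:
    """Convert an (x, y) joystick reading into a relative (dx, dy) mouse move."""
    return axis_to_step(x_axis), axis_to_step(y_axis)

if __name__ == "__main__":
    print(joystick_to_mouse(0.05, -0.9))  # centred x, strong push up -> (0, -11)
    print(joystick_to_mouse(0.6, 0.0))    # half push right -> (6, 0)
```
-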
Conventional Typing Dasher
6.S196: Principles and Practice of Assistive Technology
Lab 4: Alternative Text Input Devices and Strategies
Objectives:
- Experience using alternative communication methods for computer input
- Learn how language models can be used to improve word prediction
Activities:
- Lab session with laptop computers (bring your own computer if possible)
Deliverables:
- Write a brief (1.5-2 pages) reflection on this activity
You should do the "Conventional Typing" activity first on a laptop; afterward, you can do the other activities in any order depending on equipment availability.
Conventional Typing
For this exercise, you need a laptop computer to obtain a baseline of your conventional typing speed.
1. Go to www.typeracer.com and select "Enter a Typing Race." (Although this is a "typing race", do not type any faster than you can comfortably type.)
2. Do a typing race and record your words per minute (WPM).
3. Repeat two more times for a total of three trials. Note whether your typing speed has increased or decreased (due to learning effects or fatigue).
Dasher
For this exercise, you will test out Dasher, an input method developed by the Inference Group at the University of Cambridge.
1. On your laptop, go to www.inference.phy.cam.ac.uk/dasher/ and click on "Download" in the sidebar.
2. Download and install Dasher for your operating system (there are Windows, MacOS, and Linux versions).
3. Once installed, play around with Dasher for a minute or two. What are your first impressions and thoughts about the interface? Based on your interaction, how do you think it works? Adjust the speed of Dasher (in the bottom left corner of the interface) to a speed that you think is optimal.