New Directions in 3D User Interfaces Abstract
Recommended publications
Grants.Gov Workspace
Grants.gov Workspace. You must have Adobe Reader or Pro and must be registered in Grants.gov.

Register
1. Go to www.grants.gov.
2. Click on "Register" in the upper right corner.
3. Click on the "Get Registered Now" link near the bottom of the page. On the following page, fill in the requested information and select "Continue".
4. Click "Send Temporary Code" on the next screen. The email address you registered on the earlier screen will receive a six-digit code from the Grants.gov Registration Team. Retrieve it from your email, enter it where indicated, and select Continue.
5. Affiliate with our institution by selecting "Add Organization Applicant Profile", enter the UMKU DUNS number (010989619), a profile name (your choice), and your title. Save. The institution's authorized official will receive notification of your request automatically and will process your permissions. You can now apply for grants!

Initiate Workspace Application
Once you are registered, on the Grants.gov home page you can select Login, which is in the upper right corner next to Register. There are a couple of ways to get to the correct application forms:
- Within the guidelines for the funding opportunity announcement (FOA), you can click on "Go to Grants.gov" to download an application package.
- You can search for the FOA from the "Search Grants" tab in Grants.gov. Once you find it, click on the Opp Number, then click on "Package", then click on "Apply". To receive updates on this funding opportunity, select Subscribe to Opportunity in the top right of the page.
Chapter 2 3D User Interfaces: History and Roadmap
Chapter 2. 3D User Interfaces: History and Roadmap

Three-dimensional UI design is not a traditional field of research with well-defined boundaries. Like human–computer interaction (HCI), it draws from many disciplines and has links to a wide variety of topics. In this chapter, we briefly describe the history of 3D UIs to set the stage for the rest of the book. We also present a 3D UI "roadmap" that positions the topics covered in this book relative to associated areas. After reading this chapter, you should have an understanding of the origins of 3D UIs and their relation to other fields, and you should know what types of information to expect from the remainder of this book.

2.1. History of 3D UIs
The graphical user interfaces (GUIs) used in today's personal computers have an interesting history. Prior to 1980, almost all interaction with computers was based on typing complicated commands using a keyboard. The display was used almost exclusively for text, and when graphics were used, they were typically noninteractive. But around 1980, several technologies, such as the mouse, inexpensive raster graphics displays, and reasonably priced personal computer parts, were all mature enough to enable the first GUIs (such as the Xerox Star). With the advent of GUIs, UI design and HCI in general became a much more important research area, since the research affected everyone using computers. HCI is an interdisciplinary field that draws from existing knowledge in perception, cognition, linguistics, human factors, ethnography, graphic design, and other areas.
Graphical User Interface (GUI) Lab
GRAPHICAL USER INTERFACE (GUI) LAB
This lab will guide you through the complex process of graphical user interface (GUI) creation. GUIs are interfaces computer users invoke to make computer programs easier to use. They provide a graphical means to perform simple and complex operations or procedures. Computer programmers build most of their applications as GUIs so users are not required to learn computer programming languages. We each use GUIs on a daily basis: any computer program that implements buttons or menus to perform tasks is GUI based. Some examples include Microsoft Word, ArcMap, ENVI, and S-Plus.

GUIs in IDL
In IDL there are two ways to create GUIs: manual script generation (writing code line by line, as we have done in the previous labs) or semi-automatic script generation (a process that uses a GUI already built into IDL to generate GUIs; this will make more sense later in the lab). Before we create a functional GUI we need to understand basic GUI architecture. GUIs are composed of numerous widgets that interact to accomplish a task. Common widgets used in IDL include the base widget (widget_base), button widgets (widget_button), text widgets (widget_text), and label widgets (widget_label).

MANUAL GUI CREATION
Let's create a simple GUI (manually) to display a few basic concepts. First we must create the base widget (the matrix within which all other widgets in the GUI are contained).

1. Use the widget_base function to create a base widget by typing the following code in the IDL editor window.

   ; creates a widget_base called base
   pro simp_widg
     base = widget_base(XSIZE=175, YSIZE=50, TITLE='A Simple Example')
     ; realize (display) the widget
     widget_control, base, /REALIZE
   end

The XSIZE and YSIZE keywords specify the horizontal and vertical size (in pixels) of the base widget, while the TITLE keyword creates a title for the widget.
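To see how the other widget types named above fit the same pattern, here is a minimal sketch, not part of the original lab, that adds a label and a button to a column base; the procedure name simp_widg2 and the widget values are illustrative assumptions, and no event handler is registered (that would additionally require an event procedure and XMANAGER).

   ; a minimal illustrative sketch: a column base holding a label and a button
   pro simp_widg2
     ; /COLUMN stacks child widgets vertically inside the base
     base = widget_base(/COLUMN, TITLE='A Simple Example 2')
     ; a label widget displays static text
     label = widget_label(base, VALUE='Press the button')
     ; a button widget; UVALUE tags it for an event handler (not shown here)
     button = widget_button(base, VALUE='Done', UVALUE='done')
     ; realize (display) the widget hierarchy
     widget_control, base, /REALIZE
   end

Running simp_widg2 at the IDL prompt should display the window; reacting to button clicks is covered later when event handling is introduced.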
Correlating the Effects of Flow and Telepresence in Virtual Worlds: Enhancing Our Understanding of User Behavior in Game-Based Learning
CITATION: Faiola, A., Newlon, C., Pfaff, M., & Smyslova, O. (2013). Correlating the effects of flow and telepresence in virtual worlds: Enhancing our understanding of user behavior in game-based learning. Computers in Human Behavior, 29, 1113-1121. (Elsevier)

Correlating the effects of flow and telepresence in virtual worlds: Enhancing our understanding of user behavior in game-based learning
Anthony Faiola (a), Christine Newlon (a), Mark Pfaff (a), Olga Smyslova (b)
(a) Indiana University, School of Informatics (IUPUI), Indianapolis, IN, USA
(b) Kaiser Permanente, USA

Keywords: Flow; Telepresence; Human–computer interaction; Virtual worlds; Gaming; Online learning

ABSTRACT: Recent research on online learning suggests that virtual worlds are becoming an important environment in which to observe the experience of flow. From these simulated spaces, researchers may gather a deeper understanding of cognition in the context of game-based learning. Csikszentmihalyi (1997) describes flow as a feeling of increased psychological immersion and energized focus, with outcomes that evoke disregard for external pressures and the loss of time consciousness, issuing in a sense of pleasure. Past studies suggest that flow is encountered in an array of activities and places, including those in virtual worlds. The authors posit that flow in virtual worlds, such as Second Life (SL), can be positively associated with degrees of the cognitive phenomenon of immersion and telepresence. Flow may also contribute to a better attitude and behavior during virtual game-based learning. This study tested three hypotheses related to flow and telepresence, using SL. Findings suggest that both flow and telepresence are experienced in SL and that there is a significant correlation between them.
New Realities: Risks in the Virtual World
Emerging Risk Report 2018. Technology. New realities: Risks in the virtual world.

Lloyd's disclaimer
This report has been co-produced by Lloyd's and Amelia Kallman for general information purposes only. While care has been taken in gathering the data and preparing the report, Lloyd's does not make any representations or warranties as to its accuracy or completeness and expressly excludes to the maximum extent permitted by law all those that might otherwise be implied. Lloyd's accepts no responsibility or liability for any loss or damage of any nature occasioned to any person as a result of acting or refraining from acting as a result of, or in reliance on, any statement, fact, figure or expression of opinion or belief contained in this report. This report does not constitute advice of any kind. © Lloyd's 2018. All rights reserved.

About the author
Amelia Kallman is a leading London futurist, speaker, and author. As an innovation and technology communicator, Amelia regularly writes, consults, and speaks on the impact of new technologies on the future of business and our lives. She is an expert on the emerging risks of The New Realities (VR-AR-MR), and also specialises in the future of retail. Coming from a theatrical background, Amelia started her tech career by chance in 2013 at a creative technology agency where she worked her way up to become their Global Head of Innovation. She opened, operated and curated innovation lounges in both London and Dubai, working with start-ups and corporate clients to develop connections and future-proof strategies. Today she continues to discover and bring attention to cutting-edge start-ups, regularly curating events for WIRED UK.
Exploring Telepresence in Virtual Worlds
Exploring Telepresence in Virtual Worlds
Dan Zhang (z3378568). A thesis in fulfilment of the requirements for the degree of Doctor of Philosophy, School of Information Systems and Technology Management, UNSW Business School, March 2018.

Abstract
Virtual worlds, as computer-based simulated environments incorporating various representations of real-world elements, have great potential to not only transform the structures and operation modes of various industries but also change the way people work, do business, learn, play, and communicate. However, the existing sharp distinctions between virtual worlds and the real world also bring critical challenges. To address these challenges, the concept of telepresence, the user's feeling of 'being there' in the virtual environment, is adopted, as it is considered a direct and essential consequence of a virtual world's reality. To cultivate this feeling, it is essential to understand what factors can lead to telepresence. However, gaps in the literature on telepresence antecedents impede this understanding and affect the adoption of the telepresence construct in the design of virtual worlds. To address these issues, this study explores the concept of telepresence in the context of virtual worlds. Specifically, by adopting means-end chain (MEC) theory, the study aims to investigate the antecedents of telepresence; to reveal the inter-relationships among these antecedents by building a hierarchical structure; and to develop an innovative approach for user segmentation to understand in-depth individual differences in perceiving telepresence.
How to Use the Graphical User Interface TCS Technical Bulletin
How to Use the Graphical User Interface. TCS Technical Bulletin.

A. Create/Edit the Graphical Interface (Build Mode)
Accessing the site using the Graphical Interface requires that you first build a layout (one or more layers/tabs depending on your site). This is done using the setup wizard to upload images/backgrounds and place controllers in appropriate locations on those images/backgrounds. When finished and saved, the user accesses the site using the Graphical Interface.
1. Click the "+" button to add a layer/tab for the site. (Skip to step 7 to edit an existing layer.)
2. Name the layer/tab by clicking in the field and entering the desired name.
3. Click the Choose File button to select the desired background image from your computer's drive and click the Save button.
4. The Place View will open, showing you the layer/tab title, a Save Positions button, the background image, and a bin of available controllers along the right-hand edge of the Graphical Interface which can be placed onto the layer/tab.
5. Drag and drop controller icons from the icon bin to the desired location on the background image. Moving your mouse over each icon will show that controller's name. The arrows at the top and bottom of the scroll bar, or the scroll bar itself, allow you to scroll through the available controllers. NOTE: If you have placed controller icons too close to the icon bin and you would like to move them, you may need to scroll the available controllers up or down to clear the area around an icon to allow it to be dragged and dropped again.
Widget Toolkit – Getting Started
APPLICATION NOTE. Atmel AVR1614: Widget Toolkit – Getting Started. Atmel Microcontrollers.

Prerequisites:
- Required knowledge: basic knowledge of microcontrollers and the C programming language
- Software prerequisites: Atmel Studio 6; Atmel Software Framework 3.3.0 or later
- Hardware prerequisites: mXT143E Xplained evaluation board; Xplained series MCU evaluation board; programmer/debugger (Atmel AVR JTAGICE 3, Atmel AVR Dragon, Atmel AVR JTAGICE mkII, or Atmel AVR ONE!)
- Estimated completion time: 2 hours

Introduction
The aim of this document is to introduce the Window system and Widget toolkit (WTK) which is distributed with the Atmel Software Framework. This application note is organized as a training which will go through:
- The basics of setting up graphical widgets on a screen to make a graphical user interface (GUI)
- How to get feedback when a user has interacted with a widget
- How to draw custom graphical elements on the screen

Table of Contents
1. Introduction to the Window system and widget toolkit
1.1 Overview
1.2 The Window system
1.3 Event handling
1.3.2 The draw event
1.4 The Widget
Bootstrap Tooltip Plugin
BOOTSTRAP TOOLTIP PLUGIN
http://www.tutorialspoint.com/bootstrap/bootstrap_tooltip_plugin.htm  Copyright © tutorialspoint.com

Tooltips are useful when you need to describe a link. The plugin was inspired by the jQuery.tipsy plugin written by Jason Frame. Tooltips have since been updated to work without images, animate with a CSS animation, and use data-attributes for local title storage. If you want to include this plugin functionality individually, then you will need tooltip.js. Otherwise, as mentioned in the chapter Bootstrap Plugins Overview, you can include bootstrap.js or the minified bootstrap.min.js.

Usage
The tooltip plugin generates content and markup on demand, and by default places tooltips after their trigger element. You can add tooltips in the following two ways:

Via data attributes: To add a tooltip, add data-toggle="tooltip" to an anchor tag. The title of the anchor will be the text of the tooltip. By default, the tooltip is placed on top by the plugin.

   <a href="#" data-toggle="tooltip" title="Example tooltip">Hover over me</a>

Via JavaScript: Trigger the tooltip via JavaScript:

   $('#identifier').tooltip(options)

The tooltip plugin is NOT a CSS-only plugin like dropdown or the other plugins discussed in previous chapters. To use this plugin you MUST activate it using jQuery (read: JavaScript). To enable all the tooltips on your page, just use this script:

   $(function () { $("[data-toggle='tooltip']").tooltip(); });

Example
The following example demonstrates the use of the tooltip plugin via data attributes.

   <h4>Tooltip examples for anchors</h4>
   This is a <a href="#" title="Tooltip on left"> Default Tooltip </a>.
   This is a <a href="#" data-placement="left" title="Tooltip on left"> Tooltip on Left </a>.
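The excerpt shows the default initialization; as a small illustrative extension, not taken from the original tutorial, the options object passed to .tooltip() can also set the placement, title text, and trigger explicitly. The element id #saveButton below is a hypothetical example.

   // a minimal sketch: initialize a single tooltip with explicit options
   // (#saveButton is a hypothetical element id)
   $(function () {
     $('#saveButton').tooltip({
       placement: 'bottom',          // where the tooltip appears: top, bottom, left, or right
       title: 'Saves your changes',  // tooltip text used when the element has no title attribute
       trigger: 'hover focus'        // show on mouse hover and on keyboard focus
     });
   });

Data attributes such as data-placement in the anchor example above achieve the same effect declaratively.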
Insert a Hyperlink
Step by Step: Insert a Hyperlink
OPEN the Research on First Ladies Update1 document from the lesson folder.
1. Go to page four and select the Nancy Reagan picture.
2. On the Insert tab, in the Links group, click the Hyperlink button to open the Insert Hyperlink dialog box. The Insert Hyperlink dialog box opens. By default, Existing File or Web Page is selected. There are additional options on where to place the link.
3. Key http://www.firstladies.org/biographies/ in the Address box; then click OK. Hypertext Transfer Protocol (HTTP) is how the data is transferred to the external site through the servers. The picture is now linked to the external site.
4. To test the link, press Ctrl and click the left mouse button. When you hover over the link, a ScreenTip automatically appears with instructions on what to do.
5. Select Hillary Clinton and repeat steps 2 and 3. Word recalls the last address, and the full address will appear once you start typing. You have now linked two pictures to an external site.
6. It is recommended that you always test your links before posting or sharing. You can add links to text or phrases using the same process that you just completed.
7. Insert hyperlinks with the same Web address to both First Ladies' names. Both names are now underlined, showing that they are linked.
8. Hover over Nancy Reagan's picture and you should see the full address that you keyed.
The Three-Dimensional User Interface
The Three-Dimensional User Interface
Hou Wenjun, Beijing University of Posts and Telecommunications, China

1. Introduction
This chapter introduces the three-dimensional user interface (3D UI). With the emergence of virtual environments (VE), augmented reality, pervasive computing, and other "desktop disengage" technologies, 3D UI is steadily developing into an important area. At the same time, for most users, desktop-based 3D UI remains a part that cannot be ignored. This chapter explains what a 3D UI is and why it matters, and analyses some 3D UI applications. Drawing on human-computer interaction strategies and on the research methods and conclusions of WIMP interfaces, it focuses on desktop 3D UI and summarises some design principles for 3D UI. Starting from the principles of human spatial perception and spatial cognition, the chapter explains depth cues and other theoretical background, and introduces the Hierarchical Semantic model of "UE", the Scenario-based User Behavior Model, and Screen Layout for Information Minimization, which can guide the design and development of 3D UI. The chapter then focuses on the basic elements of 3D interaction behaviour: manipulation, navigation, and system control. It describes how, in a 3D UI, manipulation, the most fundamental task, is used to handle virtual objects effectively; how navigation reduces the user's cognitive load and enhances the user's spatial knowledge when exploring an environment; and how system control is used to issue commands, request the system to execute a specific function, and change the system state or interaction mode.
Metaverse Roadmap Overview, 2007
A Cross-Industry Public Foresight Project

Co-Authors: John Smart, Acceleration Studies Foundation; Jamais Cascio, Open the Future; Jerry Paffendorf, The Electric Sheep Company
Contributing Authors: Corey Bridges, Multiverse; Jochen Hummel, Metaversum; James Hursthouse, OGSi; Randal Moss, American Cancer Society
Lead Reviewers: Edward Castronova, Indiana University; Richard Marks, Sony Computer Entertainment; Alexander Macris, Themis Group; Rueben Steiger, Millions of Us

LEAD SPONSOR / FOUNDING PARTNERS
Futuring and Innovation Center
Graphic Design: FizBit.com
accelerating.org | metaverseroadmap.org

MVR Summit Attendees
Distinguished industry leaders, technologists, analysts, and creatives who provided their insights in various 3D web domains.
- Bridget C. Agabra: Project Manager, Metaverse Roadmap Project
- Patrick Lincoln: Director, Computer Science Department, SRI International
- Janna Anderson: Dir. of Pew Internet's Imagining the Internet; Asst. Prof. of Communications, Elon University
- Julian Lombardi: Architect, Open Croquet; Assistant VP for Academic Services and Technology Support, Office of Information Technology
- Tod Antilla: Flash Developer, American Cancer Society
- Wagner James Au: Blogger, New World Notes; Author, The Making of Second Life, 2008
- Richard Marks: Creator of the EyeToy camera interface; Director of Special Projects, Sony CEA R&D
- Jeremy Bailenson: Director, Virtual Human Interaction Lab, Stanford University
- Bob Moore: Sociologist, Palo Alto Research Center (PARC), PlayOn project
- Betsy Book: Director of Product Management, Makena Technologies/There;