Field Service Support with Google Glass and WebRTC
Total Pages: 16
File Type: PDF, Size: 1020 KB
Recommended publications
Wayne Community College 2009-2010 Strategic Plan End-Of-Year Report Table of Contents
Wayne Community College: Institutional Effectiveness through Plan & Budget Integration, 2009-2010 Strategic Plan End-of-Year Report.

Table of contents: Planning Group 1 – President (Foundation; Institutional Advancement). Planning Group 2 – VP Academic Services (Academic Skills Center; Ag & Natural Resources; Allied Health; Arts & Sciences; Business Administration; Cooperative Programs; Dental; Engineering & Mechanical Studies; Global Education; Information Systems & Computer Technology; Language & Communication; Library; Mathematics; Medical Lab Sciences; Nursing; Pre-Curriculum; Public Safety; Public Services; Science; SJAFB; Social Science; Transportation). Planning Group 3 – VP Student Services (VP Student Services; Admissions & Records; Financial Aid; Student Activities; Student Development). Planning Group 4 – VP Educational Support Services (VP Educational Support Services; Campus Information Services; Educational Support Technologies; Facilities Operations; Information Technology; Security). Planning Group 5 – VP Continuing Education (VP Continuing Education; Basic Skills; Business & Industry Center; Occupational Extension).

WCC Planning Document 2009-2010, Department: Foundation. Long Range Goal #8: Integrate state-of-practice technology in all aspects of the college's programs, services, and operations. Short Range Goal #8.2: Expand and improve program accessibility through technology. Objective/intended outcome: The Foundation has experienced phenomenal growth in the last three years. With the purchase of the Raiser's Edge software, we have seen a direct increase in our revenues. To sustain this level of growth, the Foundation either needs to hire additional staff or purchase additional Raiser's Edge software. 1. Raiser's Edge NetSolutions: enhance the Foundation office's fundraising abilities. The Foundation would be able to accept online donations and reservations for the golf tournament, gala, and arts and humanities programs, and reach out to alumni.
Developing Work Instructions with Augmented Reality (Työohjeistuksen kehittäminen lisätyn todellisuuden avulla)
Minna Vanhatapio: Developing Work Instructions with Augmented Reality: Base Station Factory Production (Työohjeistuksen kehittäminen lisätyn todellisuuden avulla, tukiasematehtaan tuotanto). Master's thesis, Autumn 2016, Master's Degree Programme in Technology Business, Oulu University of Applied Sciences (Oulun ammattikorkeakoulu). Supervisor: Hannu Päätalo. Pages: 107.

Abstract: The thesis explores augmented reality, meaning a real-time view onto which computer-generated information such as 3D images, sound and video is overlaid. The information can be shown, for example, on a smartphone, a tablet, a computer screen or smart glasses. The goal was to give the commissioning company a comprehensive picture of augmented reality and of its current possibilities for work instructions, and to determine what benefits it could bring to base station production at Nokia Networks. The thesis examined whether production work instructions could be improved with augmented reality, and considered more broadly what adopting the technology would require and what needs to be taken into account. The study was carried out by reviewing the research of those who have developed the technology and the assessments of users and professionals in different fields of application, and interviews were conducted.
Exploratory Research into Potential Practical Uses of Next Generation Wearable Wireless Voice-Activated Augmented Reality (VAAR) Devices by Building Construction Site Personnel
50th ASC Annual International Conference Proceedings. Copyright 2014 by the Associated Schools of Construction. Exploratory Research into Potential Practical Uses of Next Generation Wearable Wireless Voice-Activated Augmented Reality (VAAR) Devices by Building Construction Site Personnel. Christopher J. Willis, PhD, CAPM, LEED Green Assoc., P.Eng, Concordia University, Montreal, Quebec.

The miniaturization and increased functionalities of next generation augmented reality (AR) devices, as well as advances in computing technology in the form of cloud computing, are moving the building construction industry closer to adoption of AR devices for use by building construction site personnel. There is therefore a need to understand the potential practical uses of next generation AR devices in building construction site work. A conceptualization of a next generation AR device suitable for use by site personnel is provided. Based on this conceptualization, a focus group of industry professionals and postgraduate researchers has determined that potential practical uses of such a device include: easy access to digital information to support work tasks, live streaming of videos of tasks being worked on, and easy creation of a repository of as-built photographs and videos. Potential applied research studies that will aid the adoption of next generation AR devices by site personnel include those associated with usability testing, labor productivity measurement and improvement, and suitability testing based on the nature of work tasks. The major implication of this exploratory study is that its findings will help to bridge the gap between next generation AR devices and practical use in building construction.

Keywords: Augmented Reality, Building Construction, Cloud Computing, Next Generation, Practical Use.

Introduction: Augmented reality (AR) is an emerging technology that is increasingly acquiring greater relevance and usage.
Navigating Artificial Intelligence
July 24, 2018. Semiconductor: Get real with artificial intelligence (AI). Navigating Artificial Intelligence.

"Seriously, do you think you could actually purchase one of my kind in Walmart, say in the next 10 years?" "You do?! You'd better read this report from cover to cover, and I assure you Peter is not being funny at all this time."

■ Fantasies remain in Star Trek. Let's talk about practical AI technologies.
■ There are practical limitations in using today's technology to realise AI elegantly.
■ AI is to be enabled by a collaborative ecosystem, likely dominated by "gorillas".
■ An explosion of innovations in AI is happening to enhance user experience.
■ Rewards will go to the problem solvers that have invested in R&D ahead of others.

Analyst(s): Peter CHAN, T (82) 2 6730 6128, E [email protected]

Important disclosures, including any required research certifications, are provided at the end of this report. If this report is distributed in the United States, it is distributed by CIMB Securities (USA), Inc. and is considered third-party affiliated research. Powered by the EFA Platform.

Table of contents: Key Charts (p. 4), Executive Summary (p. 5), I. From human to machine (p. 10).
Augmented Reality Navigation
Brno University of Technology (Vysoké učení technické v Brně), Faculty of Electrical Engineering and Communication, Department of Radio Electronics. Bachelor's thesis: Augmented Reality Applications in Embedded Navigation Devices. Author: Martin Jaroš, ID 146847, 3rd year, bachelor's study field Electronics and Communication Technology, academic year 2013/2014. Supervisor: doc. Ing. Tomáš Frýza, Ph.D. Brno, 2014.

Assignment: Analyze the hardware possibilities of the OMAP platform and design an application to effectively combine captured video data and a rendered virtual scene based on navigational data from GPS and INS sensors. Design and create a functional prototype. Examine practical use cases of the proposed navigation device, design an applicable user interface.

Recommended literature: [1] BIMBER, O.; RASKAR, R. Spatial augmented reality: merging real and virtual worlds. Wellesley: A K Peters, 2005, 369 p. ISBN 15-688-1230-2. [2] Texas Instruments. OMAP 4460 Multimedia Device [online]. 2012 [cited 8 November 2012]. Available: http://www.ti.com/product/omap4460.

Assignment issued: 10 February 2014. Submission deadline: 30 May 2014. Supervisor: doc. Ing. Tomáš Frýza, Ph.D.
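The assignment above (combining live video with a rendered virtual scene driven by GPS and INS data) essentially reduces to projecting geographic waypoints into camera pixel coordinates. The Python sketch below shows one way to do that with a flat-earth approximation and a simple angle-to-pixel mapping; the coordinates, heading, field-of-view values and the 1.5 m height offset are illustrative assumptions, not values taken from the thesis.

```python
# Sketch: project a GPS waypoint into the pixel coordinates of a forward-facing camera,
# given the device's own GPS fix and INS-derived heading. Flat-earth approximation,
# linear angle-to-pixel mapping; all numeric inputs below are made-up examples.
import math

EARTH_RADIUS_M = 6_371_000.0

def waypoint_to_pixel(lat, lon, heading_deg,        # device pose (from GPS + INS)
                      wp_lat, wp_lon,               # waypoint to annotate
                      img_w=1280, img_h=720, hfov_deg=60.0, vfov_deg=40.0):
    # Local east/north offsets of the waypoint relative to the device (metres).
    d_north = math.radians(wp_lat - lat) * EARTH_RADIUS_M
    d_east = math.radians(wp_lon - lon) * EARTH_RADIUS_M * math.cos(math.radians(lat))

    # Bearing to the waypoint, then its angle relative to where the camera points.
    bearing = math.degrees(math.atan2(d_east, d_north)) % 360.0
    rel_az = ((bearing - heading_deg + 180.0) % 360.0) - 180.0   # -180..180 degrees

    distance = math.hypot(d_east, d_north)
    rel_el = math.degrees(math.atan2(-1.5, distance))  # assume waypoint ~1.5 m below eye level

    if abs(rel_az) > hfov_deg / 2 or abs(rel_el) > vfov_deg / 2:
        return None  # outside the camera's field of view

    # Linear mapping from angles to pixels (good enough for an overlay marker).
    x = int((rel_az / hfov_deg + 0.5) * img_w)
    y = int((0.5 - rel_el / vfov_deg) * img_h)
    return x, y

if __name__ == "__main__":
    px = waypoint_to_pixel(49.19, 16.61, heading_deg=45.0, wp_lat=49.1912, wp_lon=16.6125)
    print("Draw waypoint marker at pixel:", px)
```

A full implementation on an embedded platform would replace the linear mapping with the camera's calibrated intrinsics and feed the result into the renderer that draws the virtual scene over the captured video.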
Augmented Reality Environments for Immersive and Non-Linear Virtual Storytelling
Augmented reality environments for immersive and non-linear virtual storytelling. Stefanos Kougioumtzis, University of Piraeus, Department of Informatics, 80 Karaoli & Dimitriou St., 185 34 Piraeus, Greece, [email protected]. Nikitas N. Karanikolas, Technological Educational Institution (TEI) of Athens, Department of Informatics, Ag. Spyridonos St., 12210 Aigaleo, Greece, [email protected]. Themis Panayiotopoulos, University of Piraeus, Department of Informatics, 80 Karaoli & Dimitriou St., 185 34 Piraeus, Greece, [email protected].

Abstract: The work reported in this paper has the purpose of creating a virtual environment where the recipient (the user) will experience virtual storytelling. In order to introduce the user to a plausible, near-to-reality experience, we use augmentation methods to create an environment that maintains its physical properties, while three-dimensional virtual objects and textures change it for the needs of a real-time and controlled scenario. The augmentation is implemented using marker-based techniques, while the superimposed objects are rendered through the wearable equipment that the user is carrying. In the current implementation we try to blend intelligent, non-controllable characters that interpret the user's choices and actions in order to intensify his or her interest in the story, while the real-time events are being acted. The user is free to explore, interact and watch the desired events, while others are being carried out in parallel.

Keywords: virtual reality, mixed reality, virtual story-telling, wearable systems, computer vision.

1. Introduction and problems outline: As technology moves forward, the needs and purposes of virtuality are getting more and more clear. [...] telling, tangible devices, that have already achieved this goal to some extent. The techniques of Augmented Reality have
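Since the paper states that its augmentation is marker based, a small detection sketch may help make that concrete. The example below uses OpenCV's ArUco module to find markers in a live camera feed and draws a label where a full system would anchor a rendered virtual object; the classic cv2.aruco API from opencv-contrib-python, the camera index and the marker dictionary are assumptions for illustration, and the paper does not specify which marker library it actually used.

```python
# Sketch of marker-based augmentation: detect ArUco markers in a live camera feed
# and draw a label at each marker, where a full system would render a 3D object.
# Assumes opencv-contrib-python with the classic cv2.aruco module; quit with 'q'.
import cv2

def main() -> None:
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    cap = cv2.VideoCapture(0)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary)
        if ids is not None:
            cv2.aruco.drawDetectedMarkers(frame, corners, ids)
            for marker_corners, marker_id in zip(corners, ids.flatten()):
                x, y = marker_corners[0][0].astype(int)   # first corner of the marker
                # Placeholder for a virtual story object anchored to this marker.
                cv2.putText(frame, f"story object #{marker_id}", (x, y - 10),
                            cv2.FONT_HERSHEY_SIMPLEX, 0.6, (255, 0, 0), 2)
        cv2.imshow("Marker-based AR (sketch)", frame)
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break
    cap.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    main()
```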
Vancouver Cross-Border Investment Guide
Vancouver Cross-Border Investment Guide: essential legal, tax and market information for cross-border investment into Vancouver, Canada. Digital download. Published October 2020.

Contents: Why Invest in Vancouver (p. 1); Sectors to Watch (p. 3): Technology (p. 3), Cleantech (p. 4), Media and Entertainment (p. 5), Life Sciences (p. 6).

About the Vancouver Economic Commission: The Vancouver Economic Commission (VEC) serves one of the world's fastest-growing, low-carbon economies. As the economic development agency for the city's businesses, investors and citizens, VEC works to strengthen Vancouver's economic future by supporting local companies, attracting high-impact investment, conducting and publishing leading-edge industry research, and promoting international trade. VEC works collaboratively to position Vancouver as a global destination for innovative, creative, diverse and sustainable development. VEC respectfully acknowledges that it is located
GStreamer and dmabuf
GStreamer and dmabuf: OMAP4+ graphics/multimedia update. Rob Clark.

Outline:
• A quick hardware overview
• Kernel infrastructure: drm/gem, rpmsg+dce, dmabuf
• Blinky s***.. putting pixels on the screen
• Bringing it all together in GStreamer

A quick hardware overview. DMM/Tiler: like a system-wide GART; provides a contiguous view of memory to various hw accelerators (IVAHD, ISS, DSS); provides tiling modes for enhanced memory bandwidth efficiency, for initiators like IVAHD which access memory in 2D block patterns; provides support for rotation, with zero-cost rotation for DSS/ISS access in 0°/90°/180°/270° orientations (with horizontal or vertical reflection).

IVA-HD: multi-codec hw video encode/decode: H.264 BP/MP/HP encode/decode, MPEG-4 SP/ASP encode/decode, MPEG-2 SP/MP encode/decode, MJPEG encode/decode, VC1/WMV9 decode, etc.

DSS (Display Subsystem): 4 video pipes, 3 of which support scaling and YUV; any number of video pipes can be attached to one of 3 "overlay managers" to route to a display.

Kernel infrastructure: drm/gem, rpmsg+dce, dmabuf. DRM overview: DRM (Direct Rendering Manager) started life heavily based on x86/desktop graphics card architecture, but has more recently evolved to better support ARM and other SoC platforms. KMS (Kernel Mode Setting) replaces fbdev for more advanced display management: hotplug, multiple display support (spanning/cloning), and more recently support for overlays (planes). GEM (Graphics Execution Manager): the important/useful part here is the graphics/multimedia buffer management.

DRM/KMS models the display hardware as: Connector, the thing that the display connects to (handles DDC/EDID, hotplug detection); Encoder, which takes pixel data from a CRTC and encodes it to a format suitable for connectors, i.e.
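To connect the outline above to working code: the point of dmabuf in a GStreamer pipeline is that video buffers can travel between elements (for example from a capture device or decoder to the display sink) without CPU copies. The sketch below builds a simple capture-and-display pipeline with the GStreamer Python bindings; the element names (v4l2src, kmssink) are platform-dependent placeholders rather than anything taken from the slides, and zero-copy via dmabuf only happens when the chosen elements on a given board actually negotiate it.

```python
# Minimal GStreamer capture/display sketch using the Python (PyGObject) bindings.
# Element names below are platform-dependent placeholders; dmabuf-based zero-copy
# is negotiated between elements only when both sides support it.
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst, GLib

def main() -> None:
    Gst.init(None)
    # Capture from a V4L2 device and display with a KMS-based sink.
    pipeline = Gst.parse_launch(
        "v4l2src device=/dev/video0 ! videoconvert ! kmssink"
    )
    pipeline.set_state(Gst.State.PLAYING)

    loop = GLib.MainLoop()
    bus = pipeline.get_bus()
    bus.add_signal_watch()
    # Stop the main loop on end-of-stream or error messages from the pipeline.
    bus.connect("message::eos", lambda *_: loop.quit())
    bus.connect("message::error", lambda *_: loop.quit())
    try:
        loop.run()
    finally:
        pipeline.set_state(Gst.State.NULL)

if __name__ == "__main__":
    main()
```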
Intel Brings Home Top Awards, Recognition During CES 2016
January 11, 2016. Intel Brings Home Top Awards, Recognition during CES 2016: Intel Technologies, Products and Partners Win Best Drone, Best Wearable and More.

SANTA CLARA, Calif.--(BUSINESS WIRE)-- At CES 2016 last week, Intel Corporation announced innovative technologies and collaborations aimed at delivering amazing experiences throughout daily life. Many of these innovations, ranging from unmanned aerial vehicles (UAVs) and wearables to new PCs and tablets, received an array of prestigious awards and recognition. This Smart News Release features multimedia. View the full release here: http://www.businesswire.com/news/home/20160111006536/en/

For example, Intel's leadership to integrate human-like senses into technology was recognized with various awards. Engadget, PC Magazine, The Verge and Videomaker named the Intel® Atom™ processor-powered Yuneec Typhoon H* the best drone of CES 2016. With Intel® RealSense™ technology, the Yuneec Typhoon H is capable of collision avoidance, has a "follow-me" feature, and a 4K camera. CNET, Gizmodo and Reuters also included the drone in their best-of-CES lists.

Photo caption: Intel technologies, products and partners received top awards during last week's CES 2016. Clockwise from upper left: Engadget, PC Magazine and Videomaker named the Intel® Atom™ processor-powered Yuneec Typhoon H with Intel® RealSense™ technology the best drone of CES 2016. Engadget recognized the Empire EVS*, based on Recon's Snow2 heads-up display (HUD) technology, as the best wearable. Recon Instruments, an Intel company, enabled Empire Paintball* to create a smart paintball mask that allows users to access live tactical information with a glance. PBS, Reuters and Wired included the Ninebot* Segway* robot, which is powered by an Intel Atom processor and uses Intel® RealSense™ technology, in their best-of-CES lists.
Recon Jet Owner's Manual
Recon Jet Owner's Manual. Assembled in USA.

Safety information. Important safety information: read before using Recon Jet. Jet is designed to enhance your sports and fitness experience. If used improperly (e.g. when cycling without paying attention to the road), Jet may cause you to get in an accident that could result in property damage, serious injury, or death. Always pay attention to the road. Do not focus on Jet's display and become distracted from your surroundings. Ride safe and have fun.

Please read and understand these warnings before using Jet. If you have any questions about how to use Jet safely, contact Recon Customer Support at [email protected], or call us at 1.877.642.2486. Support hours are 6:00am to 5:30pm PST, 7 days a week.

1. Don't get distracted. Keep your eyes on the road. Jet's display sits just below your right eye, but that does not mean you can stare at it and still see everything on the road. Focusing on the display may cause you to miss cars, road debris, and other hazards, which may reduce or eliminate your ability to avoid an accident. Jet's display and the information delivered on it are designed for quick access. Glance at Jet's display quickly, in the same way you would glance at your car's dashboard or rearview mirror, and do so only when you are sure that you are safe from traffic and other hazards. Jet displays different types of information than your cycling computer or other fitness tracking devices.
Design, Modeling, and Analysis of Visual MIMO Communication
Design, Modeling, and Analysis of Visual MIMO Communication. By Ashwin Ashok. A dissertation submitted to the Graduate School-New Brunswick, Rutgers, The State University of New Jersey, in partial fulfillment of the requirements for the degree of Doctor of Philosophy, Graduate Program in Electrical and Computer Engineering. Written under the direction of Dr. Marco O. Gruteser, Dr. Narayan B. Mandayam and Dr. Kristin J. Dana. New Brunswick, New Jersey, October 2014. © 2014 Ashwin Ashok. All rights reserved.

Abstract of the dissertation: Today's pervasive devices are increasingly being integrated with light emitting diode (LED) arrays, which serve the dual purpose of illumination and signage, and photo-receptor arrays in the form of pixel elements in a camera. The ubiquitous use of light emitting arrays (LEAs) and cameras in today's world calls for building novel systems and applications where such light emitting arrays can communicate information to cameras. This thesis presents the design, modeling and analysis of a novel concept called visual MIMO (multiple-input multiple-output), where cameras are used for communication. In visual MIMO, information transmitted from light emitting arrays is received through the optical wireless channel and decoded by a camera receiver. The paradigm shift in visual MIMO is the use of digital image analysis and computer vision techniques to aid in the demodulation of information, in contrast to the direct processing of electrical signals as in traditional radio-frequency (RF) communication.
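The core receiver idea in visual MIMO (recovering data from a light emitting array by image analysis rather than RF demodulation) can be sketched very simply: locate each LED's region in the camera frame and threshold its intensity to recover a bit. The toy example below does exactly that on a synthetic frame; the grid layout, region size and threshold are illustrative assumptions, not parameters from the dissertation.

```python
# Toy visual-MIMO receiver: recover one bit per LED from a camera frame by
# averaging pixel intensity over each LED's region and thresholding it.
# Grid layout, ROI size and threshold are made-up example values.
import numpy as np

def decode_led_array(gray_frame: np.ndarray,
                     led_centers: list[tuple[int, int]],
                     roi: int = 8, threshold: float = 128.0) -> list[int]:
    bits = []
    for (cx, cy) in led_centers:
        # Average intensity in a small square region around the LED centre.
        patch = gray_frame[cy - roi:cy + roi, cx - roi:cx + roi]
        bits.append(1 if patch.mean() > threshold else 0)   # bright LED -> bit 1
    return bits

if __name__ == "__main__":
    # Synthetic 240x320 "camera frame" with a 2x2 LED array transmitting 1, 0, 0, 1.
    frame = np.zeros((240, 320), dtype=np.uint8)
    centers = [(100, 80), (220, 80), (100, 160), (220, 160)]   # (x, y) LED centres
    for (cx, cy), bit in zip(centers, [1, 0, 0, 1]):
        if bit:
            frame[cy - 8:cy + 8, cx - 8:cx + 8] = 255          # "on" LED
    print(decode_led_array(frame, centers))                     # expected: [1, 0, 0, 1]
```

A real receiver would first detect and track the LED array in the image (perspective correction, synchronization with the transmitter's frame rate) before this per-LED thresholding step.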
Head-Mounted Mixed Reality Projection Display for Games Production and Entertainment
Pers Ubiquit Comput, DOI 10.1007/s00779-015-0847-y. Original article: Head-mounted mixed reality projection display for games production and entertainment. Daniel Kade, Kaan Akşit, Hakan Ürey, Oğuzhan Özcan. Received: 20 November 2014 / Accepted: 3 April 2015. © Springer-Verlag London 2015.

Abstract: This research presents a mixed reality (MR) application that is designed to be usable during a motion capture shoot and supports actors with their task to perform. Through our application, we allow seeing and exploring a digital environment without occluding an actor's field of vision. A prototype was built by combining a retroreflective screen covering surrounding walls and a headband consisting of a laser scanning projector with a smartphone. Built-in sensors of a smartphone provide navigation capabilities in the digital world. The presented system was demonstrated in an initially published paper. Here, we extend these research results with our advances and discuss the potential use of our prototype in gaming and entertainment applications. To explore this potential use case, we built a gaming application using our MR prototype and tested it with 45 participants. [...] gaming applications as well. Therefore, our MR prototype could become of special interest because the prototype is lightweight, allows for freedom of movement and is a low-cost, stand-alone mobile system. Moreover, the prototype also allows for 3D vision by mounting additional hardware.

Keywords: Head-mounted projection display · Mixed reality · Motion capture · Laser projector · Immersive environments · Games production.

1 Introduction: Entertainment industry products such as video games and films are deeply depending on computer-generated imagery
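As a small illustration of the "built-in sensors of a smartphone provide navigation capabilities" idea mentioned in the abstract, the sketch below integrates gyroscope rate readings into a yaw/pitch orientation that a renderer could use to pan the projected digital environment. The sensor samples are synthetic; a real implementation would read the smartphone's IMU and typically fuse it with the accelerometer and magnetometer to limit drift.

```python
# Sketch: turn gyroscope rate samples (deg/s) into a yaw/pitch view orientation
# for navigating a projected digital environment. Sensor data below is synthetic.
from dataclasses import dataclass

@dataclass
class ViewOrientation:
    yaw_deg: float = 0.0
    pitch_deg: float = 0.0

    def integrate(self, gyro_yaw_dps: float, gyro_pitch_dps: float, dt_s: float) -> None:
        # Simple dead-reckoning integration; drifts over time without sensor fusion.
        self.yaw_deg = (self.yaw_deg + gyro_yaw_dps * dt_s) % 360.0
        self.pitch_deg = max(-89.0, min(89.0, self.pitch_deg + gyro_pitch_dps * dt_s))

if __name__ == "__main__":
    view = ViewOrientation()
    samples = [(30.0, 0.0), (30.0, 5.0), (15.0, -5.0)]   # (yaw rate, pitch rate) at 50 Hz
    for yaw_rate, pitch_rate in samples:
        view.integrate(yaw_rate, pitch_rate, dt_s=0.02)
        # A renderer would rotate the virtual camera by (view.yaw_deg, view.pitch_deg) here.
        print(f"yaw={view.yaw_deg:.2f} deg, pitch={view.pitch_deg:.2f} deg")
```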