COIMBATORE-641046 DEPARTMENT of COMPUTER APPLICATIONS (Effective from the Academic Year 2019-2020)

Total Pages: 16

File Type: pdf, Size: 1020 KB

Annexure – V
Annexure No: 55F - SCAA Dated: 09.05.2019

BHARATHIAR UNIVERSITY :: COIMBATORE-641046
DEPARTMENT OF COMPUTER APPLICATIONS
(Effective from the Academic Year 2019-2020)

Certificate Course in Augmented Reality

Eligibility for admission to the course
A pass in the Higher Secondary Examination (+2) conducted by the Government of Tamil Nadu, or an examination accepted as equivalent thereto by the Syndicate.

Duration of the course
Candidates can undergo this course either full-time (3 months) or part-time (6 months). The certificate programme consists of theory and practical courses.

Regulations
The general Regulations of the Bharathiar University Choice Based Credit System are applicable to this certificate programme.

Medium of Instruction and Examinations
The medium of instruction and examinations shall be English.

Submission of Record Notebooks for Practical Examinations
Candidates taking the Practical Examinations should submit the bonafide Record Notebooks prescribed for the Examinations; otherwise they will not be permitted to take the Practical Examinations.

Revision of Regulations and Curriculum
The above Regulations and Scheme of Examinations will be in vogue without any change for a minimum period of three years from the date of approval of the Regulations. The University may revise, amend or change the Regulations and Scheme of Examinations if found necessary.

Scheme of Examinations

Course Code   Subject and Paper            L   P   Credits   Max Marks
19CSEHC01     Paper I: Augmented Reality   4   2   6         150

Course Title: AUGMENTED REALITY
Course Code: 19CSEHC01
No. of Credits: 4
No. of Teaching Hours: T-30, P-60

Course Objectives
To impart knowledge that enables students:
1. To understand the concepts behind AR
2. To design and develop AR applications
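Unit I below introduces Unity3D as the content-generation tool, and Units II and III build scripting fundamentals on top of it; in Unity, a script is a C# component attached to a scene object. As a minimal, purely illustrative sketch of such a component (the class and field names are not prescribed by the syllabus), the following behaviour spins whatever object it is attached to:

```csharp
using UnityEngine;

// Illustrative sketch only: a minimal Unity C# component of the kind
// built in the scripting exercises. Attach it to any GameObject in the
// scene to make that object rotate about its vertical axis.
public class Spinner : MonoBehaviour
{
    // Rotation speed in degrees per second; editable in the Unity Inspector.
    public float degreesPerSecond = 45f;

    // Update is called once per frame by the Unity engine.
    void Update()
    {
        transform.Rotate(0f, degreesPerSecond * Time.deltaTime, 0f);
    }
}
```

Project 3 under Unit III ("Creating and using a Component") exercises this pattern: write the component, attach it to an asset, and observe the behaviour in Play mode.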
Unit - I
Introduction to Augmented Reality (AR), Virtual Reality (VR), eXtended Reality (XR) - Introduction to Unity3D and Content Generation Tools - History, evolution and market impact - Sample applications of AR, VR, XR: Presentation

Unit - II
Design application: Theory - Story and process - Scripting principles - Hardware: AR, VR, XR - Hardware: Development environment - Tools, Software Development Kit (SDK), Scripting

Unit - III
Basic development: Identifying basic design principles; reciting common choices, styles, and/or aesthetics (visual, audial, interactive, and narrative) - System Dynamics and Scripting Fundamentals - Interfaces, Environments, Asset Management, and Animation - Project 1: Creating a project and environment - Project 2: Creating and using an asset - Project 3: Creating and using a Component

Unit - IV
Creating Environment: Principles of Cameras and Lighting in Application Environments - Principles of Audio, Animation - Physics, Particle system - Interaction: Eye tap, Gaze, Handheld controllers - Tracking - Spatial immersion and interaction - Principles of Quality and Functionality Assurance in Development

Unit - V
Project 4: Creating first application - Project 5: Creating a simple application: Principles of Versioning and Release - Packaging - Installing application on the device - Practical Applications: Virtual Circuit - Virtual Chemistry lab - Virtual Dental experiment - Game - Virtual Assembly and Repair - Augmented Book - Augmented Tourism - Augmented Healthcare: X-rays

REFERENCE BOOKS
1. Erin Pangilinan, Steve Lukas, et al., 'Creating Augmented and Virtual Realities: Theory and Practice for Next-Generation Spatial Computing', April 14, 2019
2. Steve Aukstakalnis, 'Practical Augmented Reality: A Guide to the Technologies, Applications, and Human Factors for AR and VR (Usability)'
3. Jonathan Linowes, 'Augmented Reality for Developers: Build practical augmented reality applications with Unity, ARCore, ARKit, and Vuforia', October 9, 2017
4. Michael Wohl, 'The 360° Video Handbook: A step-by-step guide to creating video for virtual reality (VR)', July 1, 2017
5. John Bucher, 'Storytelling for Virtual Reality: Methods and Principles for Crafting Immersive Narratives', July 6, 2017
6. Jonathan Linowes, 'Unity Virtual Reality Projects: Learn Virtual Reality by developing more than 10 engaging projects with Unity 2018', 2nd Edition, Kindle Edition

AR LAB Exercises
(a) Virtual Circuit
(b) Virtual Chemistry lab
(c) Virtual Dental experiment
(d) Game
(e) Virtual Assembly and Repair
(f) Augmented Book
(g) Augmented Tourism
(h) Augmented Healthcare: X-rays
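Several of the lab exercises above share one core workflow: track a real surface, place a virtual asset on it, and let the user interact with it (the Tracking and Handheld controller topics of Unit IV). The sketch below is illustrative only and assumes Unity's AR Foundation package as the SDK, one possible choice alongside the Vuforia, ARCore and ARKit toolchains named in the reference books; the prefab field is a placeholder for whatever asset an exercise uses.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Illustrative sketch only, assuming Unity's AR Foundation package.
// On a screen tap, raycast against planes detected by the AR session and
// place a prefab (e.g. a circuit board or dental model) at the hit pose.
[RequireComponent(typeof(ARRaycastManager))]
public class TapToPlace : MonoBehaviour
{
    // The asset to place; assigned in the Unity Inspector (placeholder name).
    public GameObject objectPrefab;

    ARRaycastManager raycastManager;
    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Awake()
    {
        raycastManager = GetComponent<ARRaycastManager>();
    }

    void Update()
    {
        if (Input.touchCount == 0) return;

        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Cast a ray from the touch point against planes tracked by the AR session.
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            Pose hitPose = hits[0].pose;
            Instantiate(objectPrefab, hitPose.position, hitPose.rotation);
        }
    }
}
```

Packaging and installing the result on a device (Unit V) then follows Unity's normal platform build pipeline for Android or iOS.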
BHARATHIAR UNIVERSITY :: Coimbatore 641 046
Department of Computer Applications

Phygital Mode
'Phygital' (physical plus digital) is a combination of classroom-based teaching in the Department of Computer Applications and online, technology-based teaching: direct online teaching, videos by professors, and use of the online materials prepared by the Department of Computer Applications and by industry. For every course (or subject), 60% will be delivered in physical mode (face-to-face, classroom-based teaching) and the remaining 40% in digital mode. For online delivery of lectures, discussions, clarification of doubts and special classes, WebEx classrooms or Zoom will be used. Examinations will be conducted in physical mode by Bharathiar University at the Department of Computer Applications.
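Unit IV of the syllabus also lists gaze as an interaction technique alongside eye tap and handheld controllers. A minimal, illustrative sketch of gaze-based selection using a standard Unity physics raycast from the main camera (the class and field names are not prescribed by the syllabus):

```csharp
using UnityEngine;

// Illustrative sketch only: gaze-based selection (Unit IV). A ray is cast
// straight ahead from the main camera, i.e. along the user's view direction;
// whatever collider it hits is treated as the gazed-at object.
public class GazeSelector : MonoBehaviour
{
    // Maximum reach of the gaze ray, in metres.
    public float maxDistance = 10f;

    void Update()
    {
        if (Camera.main == null) return;
        Transform cam = Camera.main.transform;

        if (Physics.Raycast(cam.position, cam.forward, out RaycastHit hit, maxDistance))
        {
            // A real application would highlight the object or trigger its
            // behaviour here; logging keeps the sketch minimal.
            Debug.Log("Gazing at: " + hit.collider.name);
        }
    }
}
```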
Recommended publications
  • New Realities: Risks in the Virtual World
    Emerging Risk Report 2018 - Technology: New realities: risks in the virtual world (Lloyd's, 2018)

    Lloyd's disclaimer: This report has been co-produced by Lloyd's and Amelia Kallman for general information purposes only. While care has been taken in gathering the data and preparing the report, Lloyd's does not make any representations or warranties as to its accuracy or completeness and expressly excludes, to the maximum extent permitted by law, all those that might otherwise be implied. Lloyd's accepts no responsibility or liability for any loss or damage of any nature occasioned to any person as a result of acting or refraining from acting as a result of, or in reliance on, any statement, fact, figure or expression of opinion or belief contained in this report. This report does not constitute advice of any kind. © Lloyd's 2018. All rights reserved.

    About the author: Amelia Kallman is a leading London futurist, speaker, and author. As an innovation and technology communicator, Amelia regularly writes, consults, and speaks on the impact of new technologies on the future of business and our lives. She is an expert on the emerging risks of The New Realities (VR-AR-MR), and also specialises in the future of retail. Coming from a theatrical background, Amelia started her tech career by chance in 2013 at a creative technology agency where she worked her way up to become their Global Head of Innovation. She opened, operated and curated innovation lounges in both London and Dubai, working with start-ups and corporate clients to develop connections and future-proof strategies. Today she continues to discover and bring attention to cutting-edge start-ups, regularly curating events for WIRED UK.
  • A Review of Extended Reality (XR) Technologies for Manufacturing Training
    A Review of Extended Reality (XR) Technologies for Manufacturing Training
    Sanika Doolani *, Callen Wessels, Varun Kanal, Christos Sevastopoulos, Ashish Jaiswal, Harish Nambiappan and Fillia Makedon *
    Department of Computer Science and Engineering, The University of Texas at Arlington, Arlington, TX 76019, USA; [email protected] (C.W.); [email protected] (V.K.); [email protected] (C.S.); [email protected] (A.J.); [email protected] (H.N.)
    * Correspondence: [email protected] (S.D.); [email protected] (F.M.)
    Received: 30 October 2020; Accepted: 5 December 2020; Published: 10 December 2020

    Abstract: Recently, the use of extended reality (XR) systems has been on the rise to tackle various domains such as training, education, safety, etc. With the recent advances in augmented reality (AR), virtual reality (VR) and mixed reality (MR) technologies and the easy availability of high-end, commercially available hardware, the manufacturing industry has seen a rise in the use of advanced XR technologies to train its workforce. While several research publications exist on applications of XR in manufacturing training, a comprehensive review of recent works and applications is lacking, making it difficult to present clear progress in the use of such advanced technologies. To this end, we present a review of the current state of the art in the use of XR technologies for training personnel in the field of manufacturing. First, we put forth the need for XR in manufacturing. We then present several key application domains where XR is currently being applied, notably in maintenance training and in performing assembly tasks.
  • X-Reality Museums: Unifying the Virtual and Real World Towards Realistic Virtual Museums
    X-Reality Museums: Unifying the Virtual and Real World Towards Realistic Virtual Museums
    George Margetis 1, Konstantinos C. Apostolakis 1, Stavroula Ntoa 1, George Papagiannakis 1,2 and Constantine Stephanidis 1,2,*
    1 Foundation for Research and Technology Hellas, Institute of Computer Science, N. Plastira 100, Vassilika Vouton, GR-700 13 Heraklion, Greece; [email protected] (G.M.); [email protected] (K.C.A.); [email protected] (S.N.); [email protected] (G.P.)
    2 Department of Computer Science, University of Crete, GR-700 13 Heraklion, Greece
    * Correspondence: [email protected]; Tel.: +30-2810-391-741

    Abstract: Culture is a field that is currently entering a revolutionary phase, no longer being a privilege for the few, but expanding to new audiences who are urged to not only passively consume cultural heritage content, but actually participate and assimilate it on their own. In this context, museums have already embraced new technologies as part of their exhibitions, many of them featuring augmented or virtual reality artifacts. The presented work proposes the synthesis of augmented, virtual and mixed reality technologies to provide unified X-Reality experiences in realistic virtual museums, engaging visitors in an interactive and seamless fusion of physical and virtual worlds that will feature virtual agents exhibiting naturalistic behavior. Visitors will be able to interact with the virtual agents, as they would with real world counterparts. The envisioned approach is expected to not only provide refined experiences for museum visitors, but also achieve high quality entertainment combined with more effective knowledge acquisition.

    Keywords: extended reality; diminished reality; true mediated reality; augmented reality; virtual reality; natural multimodal interaction; unified user experiences; interactive museum exhibits
  • The Metaverse and Digital Realities (Transcript)
    [Scientific Innovation Series 9] The Metaverse and Digital Realities - Transcript
    Date released: 08/27/2021. Editors: Ji Soo KIM, Jooseop LEE, Youwon PARK

    Introduction. Yongtaek HONG: Welcome to the Chey Institute's Scientific Innovation Series. Today, in the 9th iteration of the series, we focus on the Metaverse and Digital Realities. I am Yongtaek Hong, a Professor of Electrical and Computer Engineering at Seoul National University. I am particularly excited to moderate today's webinar with the leading experts and scholars on the metaverse, a buzzword that has especially gained momentum during the online-everything shift of the pandemic. Today, we have Dr. Michael Kass and Dr. Douglas Lanman joining us from the United States. And we have Professor Byoungho Lee and Professor Woontack Woo joining us from Korea. Now, I will introduce you to our opening Plenary Speaker. Dr. Michael Kass is a senior distinguished engineer at NVIDIA and the overall software architect of NVIDIA Omniverse, NVIDIA's platform and pipeline for collaborative 3D content creation based on USD. He is also the recipient of distinguished awards, including the 2005 Scientific and Technical Academy Award and the 2009 SIGGRAPH Computer Graphics Achievement Award.

    Plenary Session. Michael KASS: So, my name is Michael Kass. I'm a distinguished engineer from NVIDIA. And today we'll be talking about NVIDIA's view of the metaverse and how we need an open metaverse. And we believe that the core of that metaverse should be USD, Pixar's Universal Scene Description. Now, I don't think I have to really do much to introduce the metaverse to this group, but the original name goes back to Neal Stephenson's novel Snow Crash in 1992, and the original idea probably goes back further.
  • Immersive Tourism
    Immersive Tourism: State of the Art of Immersive Tourism Realities through XR Technology
    This whitepaper contributes to the BUas project DigiReal, an IMPULS/Sprong project financed by the Dutch national funding organisation for research, SIA. Front page image credit: The WaveXR.

    About the Authors

    Jessika Weber Sabil, PhD - Senior Researcher & Lecturer, Games & Tourism, [email protected]. Dr. Jessika Weber Sabil is a senior researcher at the Faculty of Digital Entertainment at BUas under the professorship of Applied Games, Innovation and Society, and a senior lecturer at the Academy of Tourism of Breda University of Applied Sciences. Her research focusses on games applied to tourism ecosystems and experience design. Current and previous research projects explore (mobile) location-based AR games for experience enhancement, the application of serious games to understand complex systems, and games to facilitate creative processes. Jessika holds a PhD from Bournemouth University, where she explored the game experiences of tourists with location-based augmented reality games in urban environments, and a master's degree from the University of Applied Sciences Salzburg in Tourism and Innovation Management. She is a member of the International Federation for Information Technology in Travel and Tourism (IFITT), the Digital Games Research Association (DiGRA) and the Interaction Design Foundation.

    Dai-In Danny Han, PhD - Senior Researcher & Lecturer, Hotel & Facility, [email protected]. Dr. Dai-In Danny Han is a professor at the research centre Future of Food at Zuyd University of Applied Sciences and a senior researcher at Breda University of Applied Sciences. He holds a PhD in the area of mobile augmented reality user experience design and has been involved in numerous projects studying the user experience of immersive technologies in the hospitality and tourism context.
  • State of the Art in Extended Reality - Multimodal Interaction
    State of the Art in Extended Reality - Multimodal Interaction
    Ismo Rakkolainen, Ahmed Farooq, Jari Kangas, Jaakko Hakulinen, Jussi Rantala, Markku Turunen, and Roope Raisamo
    Technical report, HUMOR project, January 2021, Tampere University, Finland

    Contents
    1. Introduction
    2. Multimodal Interaction
       2.1 Human senses
       2.2 Introduction to multimodal interaction
       2.3 Three-dimensional user interfaces
    3. Core Multimodal Interaction Techniques
       3.1 Auditory interfaces
       3.2 Speech
       3.3 Gesture recognition technologies
  • The XRSI Definitions of Extended Reality (XR)
    XRSI Standard Publication XR-001: The XRSI Definitions of Extended Reality (XR)
    XR Safety Initiative, California, USA - www.xrsi.org - CC BY-NC-SA 4.0
    XR Data Classification Framework Public Working Group (XR-DCF Public Working Group); Liaison Organization: Open AR Cloud

    Abstract: The Extended Reality - Data Classification Framework - Public Working Group (XR-DCF-PWG) at the XR Safety Initiative (XRSI) develops and promotes a fundamental understanding of the properties and classification of data in XR environments by providing technical leadership in the XR domain. XR-DCF-PWG develops tests, test methods, reference data, proof-of-concept implementations, and technical analysis to advance the development and productive use of immersive technology. XR-DCF-PWG's responsibilities include the development of technical, physical, administrative, and management standards and guidelines for the risk-based security and privacy of sensitive information in XR environments. This Special Publication XR-series reports on XR-DCF-PWG's research, guidance, and outreach efforts in XR Safety and its collaborative activities with industry, government, and academic organizations. This specific report is an enumeration of terms for the purposes of consistency in communication. Certain commercial entities, equipment, or materials may be identified in this document in order to describe an experimental procedure or concept adequately.
  • Preprint: A Conceptual Model of Immersive Experience in Extended Reality (H. Lee, 2020)
    Preprint: A Conceptual Model of Immersive Experience in Extended Reality
    Hyunkook Lee ([email protected]), Applied Psychoacoustics Lab (APL), University of Huddersfield, HD1 3DH, UK. DOI: 10.31234/osf.io/sefkh

    Abstract: The term immersion or immersive is popularly used when describing and evaluating technologies in the area of extended reality (i.e., virtual/augmented/mixed reality). Much research has been conducted on immersion over the last few decades. However, there is still a lack of consistency in how the term is defined in the literature. Presence and involvement are other prominent concepts studied in the field of extended reality. However, there is currently no consensus on their relationship with immersion among researchers. This paper first discusses different dimensions of immersion as well as those of presence and involvement, aiming to resolve potential confusion around the terms and synthesise a relationship among them. From this, a new conceptual model of immersive experience for future studies in extended reality is proposed. The model defines physical presence, social/self presence and involvement as the main high-level attributes that collectively lead to an immersive experience. Each pair of the three attributes shares a common lower-level attribute of sensory, narrative or task/motor engagement, which is an initial step towards the higher-level experience. Plausibility, interactivity and interestingness are defined as the main properties of immersive system
  • All Reality: Values, Taxonomy, and Continuum, for Virtual, Augmented, Extended/Mixed (X), Mediated (X,Y), and Multimediated Reality/Intelligence
    All Reality: Values, taxonomy, and continuum, for Virtual, Augmented, eXtended/MiXed (X), Mediated (X,Y), and Multimediated Reality/Intelligence
    Steve Mann, John C. Havens, Jay Iorio, Yu Yuan, and Tom Furness

    [Figure 1 - Realities Timeline, from VR to All R: Virtual reality (VR, 1938), Augmented reality (AR, 1968), Phenomenal reality (ΦR, 1974), X-reality (XR, 1991), X-Y reality (XYR, 1994), Mediated/MiXed reality (MR, 1994), Quantimetric reality (QR, 1996), All reality (All R, 2018*).]

    Abstract: Humans are creating a world of eXtended/Artificial Reality/Intelligence (AR, AI, XR, XI or EI), that in many ways is hypocritical, e.g. where cars and buildings are always "allowed" to "wear" cameras, but humans sometimes aren't, and where machines sense our every movement, yet we can't even understand how they work. We're constructing a system of values that gives more rights and less responsibilities to AI (Artificial Intelligence) than to HI (Humanistic Intelligence). Whereas it is becoming common to separate the notions of IRL (In Real Life) and "Augmented" or "Virtual" Reality (AR, VR) into completely disparate realms with clearly delineated boundaries, we propose here the notion of "All Reality" to more holistically represent the links between these soon-to-be-outdated, culturally accepted norms of various levels of consciousness. Inclusive in the notion of "All Reality" is also the idea of "ethically aligned reality", recognizing values-based biases, cultural norms, and applied ethics.

    [Figure 2 - Disk Jockey (DJ) Mixer Metaphor of mixed reality: imagine two record players (turntables), feeding into an audio/video mixer; the continuum runs from the Real World through Augmented Reality (AR) and Augmented Virtuality (AV) to Virtual Reality (VR).]
  • Immersive Multimedia for Extended Reality
    Immersive Multimedia for Extended Reality

    Important dates
    Paper submission: 14th of June, 2020 (extended from the 10th of May)
    Notification of paper acceptance: 27th of July, 2020
    Camera-ready paper due: 16th of August, 2020
    Conference: 21st - 23rd of September, 2020

    Scope and topics
    One current big technology trend is Immersive Multimedia (IM), which includes 360-degree video, up to six degrees of freedom, spatial audio, interactivity, and multi-sensory processing (vision, hearing, tactile, olfaction, and gustatory), among others. User experiences range from Virtual Reality (VR), where the content becomes reality for the user, to Mixed Reality (MR) to Augmented Reality (AR), where the content is blended into the real world of the user. The wide spectrum of these experiences is often referred to as Extended Reality (XR). The application of immersive multimedia technologies for extended reality experiences is abbreviated as IMXR. The combination of several heterogeneous IMXR technologies poses several challenges. For example, IMXR experiences are more critical to delay and synchronization, and generally demand more resources from end devices and within the system (CPU, GPU, storage and network bandwidth). As a consequence, there are numerous open issues in the whole delivery pipeline from capture to rendering, including processing of a multitude of sensor information (audio, video, haptics, motion, location, etc.), and streaming and visualizing different IMXR streams. Measuring the QoE of such new applications and services is also a totally new territory. In this session, we would like to discuss these issues, connecting a broad and interdisciplinary field of research areas, including signal processing, transport protocols, streaming, architectures, standardization, quality of experience and applications from a range of domains, including entertainment, industry and others.
  • XRmas: Extended Reality Multi-Agency Spaces for a Magical Remote Christmas
    XRmas: Extended Reality Multi-Agency Spaces for a Magical Remote Christmas
    Yaying Zhang 1, Brennan Jones 2, Sean Rintel 3, Carman Neustaedter 4
    1 Microsoft Corporation, 2 University of Calgary, 3 Microsoft Research Cambridge, 4 Simon Fraser University
    [email protected], [email protected], [email protected], [email protected]

    [Figure 1: XRmas overview.]

    The COVID-19 pandemic has raised attention toward remote and hybrid communications. Currently, one highly studied solution lets a remote user use virtual reality (VR) to enter an immersive view of a local space, and local users use augmented reality (AR) to see the remote user's representation and digital contents. Such systems give the remote user a sense of 'being there', but we identify two more challenges to address. First, current systems provide remote users with limited agency to control objects and influence the local space. It is necessary to further explore the relationship between users, virtual objects, and physical objects, and how they can play a role in providing richer agency. Second, current systems often try to replicate in-person experiences, but hardly surpass them. We propose XRmas: an AR/VR telepresence system that (1) provides a multi-agency space that allows a remote user to manipulate both virtual and physical objects in a local space, and (2) introduces three family activities in a Christmas context that adopt holographic animation effects to create a 'magical' experience that takes users beyond merely the feeling of 'being there'. We report on preliminary insights from the use of such a system in a remote family communication context.

    1 INTRODUCTION
    The COVID-19 pandemic has raised attention toward remote communication, and there has been increasing interest in adopting augmented reality (AR) and virtual reality (VR) in telecommunications.
  • XRStudio: A Virtual Production and Live Streaming System for Immersive Instructional Experiences
    XRStudio: A Virtual Production and Live Streaming System for Immersive Instructional Experiences
    Michael Nebeling 1, Shwetha Rajaram 1, Liwei Wu 1, Yifei Cheng 1,2, Jaylin Herskovitz 1
    1 University of Michigan, 2 Swarthmore College
    [nebeling,shwethar,wuliwei,yifcheng,jayhersk]@umich.edu

    Abstract: There is increased interest in using virtual reality in education, but it often remains an isolated experience that is difficult to integrate into current instructional experiences. In this work, we adapt virtual production techniques from filmmaking to enable mixed reality capture of instructors so that they appear to be standing directly in the virtual scene. We also capitalize on the growing popularity of live streaming software for video conferencing and live production. With XRStudio, we develop a pipeline for giving lectures in VR, enabling live compositing using a variety of presets and real-time output to traditional video and more immersive formats.

    From the introduction: The goal of this work is not to transform all instructional tasks to VR or demonstrate the value of teaching in VR—this is the focus of other research [42]. Rather, we want to enable instructors who wish to teach in VR and students who lack access to VR. For most students, there is both a significant learning curve to use VR technologies and also the need to imagine how the person in VR actually perceives the content if they themselves cannot directly participate in the virtual experience [14, 29, 36]. We think of this work as systems research on how to make it easier for both instructors and students to have more immersive educational experiences.