Camera-Based Occlusion for AR in Metal via Depth Extraction from the iPhone Dual Camera


Cristobal Sciutto, Stanford University, cristobal.space
Breno Dal Bianco, Stanford University, [email protected]

Abstract

In this project, we explore the development of a proof-of-concept of occlusion for camera-based AR on iOS's graphics platform, Metal. Per Cutting and Vishton's survey of the relative importance of visual cues, occlusion is the most important cue for generating a realistic virtual experience [1]. We hope that our developments pave the way for future compelling consumer-grade applications in the cellular space. Our implementation involves tapping into the phone's capture system, extracting disparity information from an iPhone X's dual camera, and implementing appropriate shaders for occlusion. The difficulties tackled in this project are two-fold: (i) iOS API deficiencies, and (ii) processing of inaccurate depth information. Source code and GIF examples of the app in use are available at https://github.com/tobyshooters/teapottoss.

1. Introduction

In this section we explore the motivation behind the development of a solution to occlusion, particularly in the context of camera-based AR on the phone.

1.1. Motivation

Per GSMA's 2019 Mobile Economy report, of the 5.3 billion humans over the age of 15, 5 billion own a smartphone [2]. Of those 5 billion users, iOS and Android constitute the largest user base. Consequently, most consumer-grade AR experiences are being developed for Android and iOS devices. They are built with ARKit or ARCore, Apple's and Google's proprietary AR frameworks, which specifically target their own hardware.

The problem with mobile development, however, is that without specialized hardware, it is hard to get the sensor input needed to generate a quality AR or VR experience. Namely, there is no depth map, i.e. no notion of how far objects in the real world are from the phone. This information is essential to be able to occlude virtual objects that are behind real objects. For example, looking at the Ikea Place app in Figure 1, which allows users to place virtual furniture into the physical world, we see that it does not support occlusion. While the sofa is behind the man, it is pasted on top of him in the image plane. Rather, the region of the sofa that the man covers should not be rendered.

Figure 1. Ikea Place app. Note the lack of occlusion.

This is severely detrimental to the user's experience, as occlusion is the most important visual cue, per Cutting and Vishton [1]. In other words, it is the most important real-world effect through which a human distinguishes where objects are located in the world. Without occlusion, a user experience cannot be compelling.

Figure 2. Relative importance of visual cues per [1].

Devices such as Microsoft's HoloLens have time-of-flight sensors that query the real world and measure the time it takes for light to reflect back into the device. With this information, along with the intended location of a virtual object, it is possible to conditionally render the object only when it is the closest thing to the screen. While a time-of-flight sensor is the best possible solution to the problem, a low-resolution sensor is sufficient for most existing experiences: since most objects are of considerable scale, e.g. a couch, the depth resolution needed is closer to meters than millimeters.

Fortunately, a solution to this problem is possible within the form factor of a phone. Namely, two cameras that generate a stereo pair are sufficient for calculating disparity, which can then be converted to approximate depth (a conversion sketched below). iPhones since the 7 Plus have two rear cameras, and Android competitors such as Google's Pixel also estimate depth from their cameras. Currently, the dual cameras are primarily used for computational photography: a disparity map is sufficient for simulating effects such as depth-of-field or face-from-background isolation, which has led to applications such as the iPhone's portrait mode. Nevertheless, this information has not yet been tapped for AR experiences beyond face filters.

Using this information is the first frontier for bettering current consumer-grade AR experiences, and the area in which the greatest number of users can be impacted the quickest.
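For orientation, the classical stereo relation is depth = focal length × baseline / disparity; on iOS, the dual-camera pipeline surfaces this as an AVDepthData object holding normalized disparity (in 1/meters), which AVFoundation can convert to metric depth on request. The following Swift sketch is illustrative rather than taken from our implementation; the function name is ours.

```swift
import AVFoundation

// Illustrative sketch: the dual camera delivers *normalized* disparity (1/meters).
// Metric depth is the reciprocal of normalized disparity, and AVDepthData can
// perform that conversion for us via converting(toDepthDataType:).
func metricDepthMap(from depthData: AVDepthData) -> CVPixelBuffer {
    if depthData.depthDataType == kCVPixelFormatType_DepthFloat32 {
        return depthData.depthDataMap          // already metric depth in meters
    }
    // Convert disparity (1/m) to depth (m) as 32-bit floats.
    let converted = depthData.converting(toDepthDataType: kCVPixelFormatType_DepthFloat32)
    return converted.depthDataMap
}
```

In practice, either representation can be bound as a texture; the choice only changes which side of the comparison is inverted in the occlusion test described in Section 3.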
2. Related Work

In the mobile space, existing solutions to AR occlusion use one of two methods: (i) 3D scene reconstruction, or (ii) preprocessing of the scene. In the first case, a series of images captured from the phone are processed together to create an estimate of the real-world scene. With this 3D scene, along with the phone's position in it, one can estimate the depth along a given camera direction. The reconstruction is often done with the help of a machine learning algorithm; the Selerio SDK for cross-platform AR development implements this approach [5]. The obvious drawbacks are the inaccuracies of reconstruction and the computational intensity of the approach. Furthermore, most existing algorithms require an initial calibration of the scene at hand. ARKit requires similar initialization, so it could be argued that this is not a further cost to the user experience; still, the ability to have occlusion out-of-the-box would clearly be ideal.

Figure 3. Example of 3D scene mesh reconstruction with Selerio.

In the second case, the user is prompted to create bounding boxes for existing objects in the scene before beginning the experience. With these bounding boxes oriented in world space, it is similarly possible to estimate depth from them and thus occlude virtual objects. This initialization phase is even more costly to the user experience than reconstruction, as users are required to manually generate 3D geometry for the rendering system. This approach is taken by an existing open-source project on GitHub by Bjarne Lundgren [3], and can be seen in Figure 4.

Figure 4. Bjarne Lundgren's app, with the user establishing occluding boxes.

Overall, it is clear that any preprocessing step is non-ideal. Using depth sensors directly forgoes this problem, which further motivates our approach of using the mobile phone's dual cameras. Nevertheless, a combination of depth sensors with scene reconstruction, as used in current AR hardware, will most likely be the way to go: this combined approach allows both immediate usage and increased accuracy over time.

At Apple's Worldwide Developers Conference in 2019 (after the development of this project), Apple announced "people occlusion", i.e. occlusion for the specific case of people. These features will be released as part of the new RealityKit SDK. This does not yet solve the challenge addressed by our application, but it makes clear that it is Apple's intention to eventually expose this information. We speculate that since the depth information likely passes through a machine learning pipeline, objects for which more data is available (e.g. faces and bodies) will be the first to have occlusion enabled.
3. Our Work

In this section we elaborate on the final product developed and the process used to reach that solution.

3.1. Overview

Our final product is an iOS game called Teapot Toss, shown in Figure 5, which allows users to throw virtual teapots into real-world cups. This is done by accessing the camera stream for depth information and conditionally rendering the teapot depending on whether it is occluded by a real-world object. The camera information is extracted from the phone's camera stream using the iOS AVFoundation SDK. We then built a renderer in Metal (Apple's lower-level graphics framework) that describes the 3D scene and renders it with a shader.

Figure 5. Successful occlusion with our final product.

For Teapot Toss we have a 3D scene constituted of two objects: the teapot and an AR image plane. The former can be thrown by tapping; the latter hosts an image of the real world captured from the camera. The renderer receives both the captured image from the phone and the depth map from the iPhone's camera stream, and passes them as textures into a shader.

For the AR image plane, we place its edges along the border of window space, so that the plane occupies the entirety of the user's screen. Window space is then mapped to UV coordinates, which are used to sample the captured image. We thus effectively have a live stream of the phone's camera displayed on this plane.

The teapot is then rendered in front of this plane. We get the depth from the depth map corresponding to a fragment of the teapot by projecting the fragment onto window space and sampling the depth-map texture. If the world-space depth of the teapot is greater than the sampled depth, we use the same UV coordinates to sample the captured-image texture, thus painting the teapot with the video stream. This process is best summarized by an example Metal shader, available in the appendix.

This can best be visualized by using the color from the image while still performing specular highlighting on the teapot, per Figure 6. Notably, the pot is occluded by the finger, yet we can still see the contour of the object because the normals of the pot are still being used for highlights.

Figure 6. Occlusion shader uses the image as base color, but adds Phong shading.
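The overview above does not show how the AVFoundation capture is configured. As a minimal sketch, assuming the iPhone X's built-in dual camera and a depth data output (the class and property names are ours, not from the Teapot Toss source), the session could look like this:

```swift
import AVFoundation

// Minimal sketch of a dual-camera capture session that vends depth frames.
final class DepthCaptureController: NSObject, AVCaptureDepthDataOutputDelegate {
    let session = AVCaptureSession()
    private let depthOutput = AVCaptureDepthDataOutput()
    private let queue = DispatchQueue(label: "depth.capture.queue")

    func configure() throws {
        session.beginConfiguration()
        session.sessionPreset = .photo

        // The dual (wide + tele) rear camera provides disparity on the iPhone X.
        guard let device = AVCaptureDevice.default(.builtInDualCamera,
                                                   for: .video,
                                                   position: .back) else {
            fatalError("Dual camera unavailable on this device")
        }
        let input = try AVCaptureDeviceInput(device: device)
        if session.canAddInput(input) { session.addInput(input) }

        // Depth (disparity) frames are delivered alongside the video stream.
        if session.canAddOutput(depthOutput) { session.addOutput(depthOutput) }
        depthOutput.isFilteringEnabled = true          // temporally smoothed depth
        depthOutput.setDelegate(self, callbackQueue: queue)

        session.commitConfiguration()
        session.startRunning()
    }

    func depthDataOutput(_ output: AVCaptureDepthDataOutput,
                         didOutput depthData: AVDepthData,
                         timestamp: CMTime,
                         connection: AVCaptureConnection) {
        // Hand the CVPixelBuffer to the Metal renderer as a texture here.
        _ = depthData.depthDataMap
    }
}
```

A corresponding AVCaptureVideoDataOutput (or an AVCaptureDataOutputSynchronizer to keep color and depth frames aligned) would feed the captured-image texture in the same way.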
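The actual shader lives in the appendix and is not reproduced here; the following Metal Shading Language fragment is only an approximation of the depth-test-and-paint-over logic described above, with the vertex structure, binding slots, and lighting stand-in all assumed rather than taken from the original code.

```metal
#include <metal_stdlib>
using namespace metal;

// Approximate reconstruction of the occlusion test described in Section 3.1.
// VertexOut, binding indices, and the simple diffuse term are assumptions.
struct VertexOut {
    float4 position  [[position]];   // window-space position in the fragment stage
    float3 normal;
    float  viewDepth;                // fragment depth in meters, camera space
};

fragment float4 teapotFragment(VertexOut in                    [[stage_in]],
                               texture2d<float> cameraImage    [[texture(0)]],
                               texture2d<float> depthMap       [[texture(1)]],
                               constant float2 &screenSize     [[buffer(0)]]) {
    constexpr sampler s(address::clamp_to_edge, filter::linear);

    // Window-space position -> UV coordinates shared with the AR image plane.
    float2 uv = in.position.xy / screenSize;

    // Real-world depth at this pixel, sampled from the dual-camera depth map.
    float realDepth = depthMap.sample(s, uv).r;

    if (in.viewDepth > realDepth) {
        // Teapot is behind the real object: paint it with the camera stream,
        // so it visually disappears behind the occluder.
        return cameraImage.sample(s, uv);
    }

    // Otherwise shade the teapot normally (a flat diffuse term stands in for Phong).
    float3 n = normalize(in.normal);
    float diffuse = max(dot(n, normalize(float3(0.0, 1.0, 1.0))), 0.0);
    return float4(float3(0.2) + 0.8 * diffuse, 1.0);
}
```

The debug visualization of Figure 6 corresponds to always taking the camera-image color as the base while keeping the specular term computed from the teapot's normals.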