Introduction to Mixed Realities - Course 02
Course 02 Content
● Recapitulation of Course 1
○ AR/VR/MR/XR
● Computer Graphics in AR/VR
○ 3D Modeling
○ Blender
● Working in a game engine
○ Unity Interface
● Conclusions
● Bibliography

XR - Definitions
● Augmented Reality
● Virtual Reality
● Mixed Reality
● Extended Reality

Virtual Environment Rendering
In VR, rendering does not correspond only to the image synthesis process. It must encompass as many senses as possible to induce an immersive experience:
● Image synthesis
● Audio rendering (spatial)
● Tactile rendering and force feedback (both localized and full body)
● Smell simulation

3D Modeling
Used for the development of virtual models for computer simulation and XR, with applications in artificial intelligence (AI), big data analytics, etc.

Major Software in 3D Modeling
Tools: ZBrush, Blender, SketchUp, AutoCAD, SolidWorks, 3ds Max, Maya, Rhino3D, CATIA, etc.
Commonly used file formats for 3D models are FBX, OBJ, STL, etc.

Computer-Aided Industrial Design (CAID)
CAID tools provide 3D modeling features. CAID is used in industries such as 3D printing, animation, gaming, architecture, and industrial design for digital production. CAID tools give designers greater creative freedom than typical CAD.

Texture Painting Software
Texture painting is an essential step to enhance the visual effects and increase the realism of virtual environments.

Key 3D Modeling Terms - 3D Model
The digital representation of a three-dimensional object, created in dedicated 3D modeling software. A 3D model can be turned and viewed from every angle, and can be scaled, rotated, or freely modified.

3D Modeling - NURBS Modeling
A Non-Uniform Rational Basis Spline (NURBS) model is a mathematical modeling type commonly used to generate curves and surfaces. The main advantages of this technique are the great flexibility and precision it offers when generating shapes (it uses interpolation). In contrast to polygon modeling, the
curve is drawn in a 3D space and is edited by moving a series of handles called CVs (control vertices) along the x-, y-, or z-axis.

3D Modeling - Polygon Modeling
Polygon models (also known as meshes) are collections of vertices, edges, and faces that define the model, which allows easy and precise editing of parts of your object. By changing the coordinates of one or more vertices, you change the shape of the model.
Used in:
● Animation and film
● The games industry

Polygon Modeling - Vertices, Edges, and Faces
● Faces: the most basic parts of a 3D polygon. When three or more edges are connected, the face is what fills in the space between them and makes up what is visible on the polygon mesh
● Edges: defined by the two vertices at their end points
● Vertices: the smallest components of a polygon model; a vertex is simply a point in three-dimensional space

Polygon Modeling - Polygons
Polygons are four-sided (quads) or three-sided (tris, which are used more commonly in game modeling). The number of polygons in a mesh is called the poly count, while polygon density is called resolution.

Subdivision Surfaces / NURMS Modeling
Subdivision surfaces (NURMS, Non-Uniform Rational Mesh Smooth) are a method used to smooth out faceted meshes. NURMS subdivides each polygonal face into smaller faces that better approximate a smooth surface.

Polygon Modeling - Textures and Shaders
● Shaders: a set of instructions applied to a 3D model that tells the computer how the model should be displayed. With tools from 3D software packages, we can control the way the surface of the model interacts with light, including opacity, reflectivity, specular highlight (glossiness), etc.
● Textures: two-dimensional image files that are mapped onto the model's 3D surface (texture mapping).
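The vertex/edge/face structure of polygon meshes described earlier can be sketched in plain Python. This is an illustrative data structure only, not the API of Blender or any real 3D package:

```python
# Minimal sketch of a polygon mesh: vertices plus faces, with edges derived.

class Mesh:
    def __init__(self, vertices, faces):
        self.vertices = vertices  # list of (x, y, z) points
        self.faces = faces        # each face: tuple of indices into vertices

    def edges(self):
        """Derive the unique edges (pairs of vertex indices) from the faces."""
        found = set()
        for face in self.faces:
            for i in range(len(face)):
                a, b = face[i], face[(i + 1) % len(face)]
                found.add((min(a, b), max(a, b)))
        return sorted(found)

    def poly_count(self):
        """Number of polygons in the mesh (the 'poly count')."""
        return len(self.faces)

    def move_vertex(self, index, dx, dy, dz):
        """Editing: changing one vertex's coordinates reshapes the model."""
        x, y, z = self.vertices[index]
        self.vertices[index] = (x + dx, y + dy, z + dz)

# A single quad (four-sided polygon) in the XY plane:
quad = Mesh(
    vertices=[(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)],
    faces=[(0, 1, 2, 3)],
)
print(quad.poly_count())  # 1 polygon
print(len(quad.edges()))  # 4 edges
```

Calling `move_vertex` on any index changes the model's shape without touching the face topology, which is exactly the editing workflow the slides describe.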
Textures can range in complexity from simple flat colors up to completely photorealistic surface detail.

Polygon Modeling - UV Mapping
UV mapping is the process of projecting a 2D image texture onto a 3D object. Typically, such a texture is applied after a model, or body, is created. UV mapping can be used while creating a new material: the material affects the object, while the UV mapping only affects the surface of the material.

Polygon Modeling - Texture Maps
● Specular Map: a texture whose grayscale values dictate which parts of the surface are glossy/specular
● Diffuse Map: defines the base color of the surface and adds a first layer of realism to the shader
● Normal Map: fakes fine surface detail (bumps and dents) by perturbing the surface normals used in shading, without adding extra geometry

3D Modeling - Rigging
Rigging is a technique used in skeletal animation for representing a 3D character model with a series of interconnected digital bones. This bone structure is used to manipulate the 3D model like a puppet for animation. The result is a hierarchical structure in which each bone is in a parent/child relationship with the bones it connects to.
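The parent/child bone hierarchy just described can be sketched in plain Python (an illustrative sketch, not Blender's rigging API): each bone stores an offset relative to its parent, and world positions are found by walking up the parent chain.

```python
# Minimal sketch of a rig's bone hierarchy with parent/child relationships.

class Bone:
    def __init__(self, name, offset, parent=None):
        self.name = name
        self.offset = offset  # (x, y, z) position relative to the parent bone
        self.parent = parent  # parent bone, or None for the root

    def world_position(self):
        """Compose offsets up the chain: moving a parent moves all children."""
        x, y, z = self.offset
        if self.parent is not None:
            px, py, pz = self.parent.world_position()
            x, y, z = x + px, y + py, z + pz
        return (x, y, z)

# A small leg chain: hip -> knee -> foot
hip = Bone("hip", (0, 1, 0))
knee = Bone("knee", (0, -0.5, 0), parent=hip)
foot = Bone("foot", (0, -0.5, 0), parent=knee)

print(foot.world_position())  # (0, 0.0, 0)

# Moving only the root bone carries the whole chain with it, puppet-style:
hip.offset = (1, 1, 0)
print(foot.world_position())  # (1, 0.0, 0)
```

Moving one parent bone repositions every child automatically, which is why a rig makes animating a character far easier than moving vertices directly.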
This simplifies the animation process as a whole.

3D Modeling - Animations
These bones can be transformed using digital animation software, meaning their position, rotation, and scale can be changed. By recording these aspects of the bones along a timeline (a process called keyframing), animations can be recorded. A basic setup may take a few hours or less, while a complex rig for a movie could take days.
https://www.youtube.com/watch?v=XRVIuE2iIxA&feature=emb_logo

Blender
https://www.blender.org
Blender is a free and open-source 3D creation suite. It supports the entirety of the 3D pipeline: modeling, rigging, animation, simulation, rendering, compositing and motion tracking, and even video editing and 2D animation.

Blender - Edit Mode
Tools: Annotate, Measure, Extrude Region, Inset Faces, Bevel, Loop Cut, Knife, Poly Build, Spin, Smooth, Edge Slide, Shrink/Fatten, Shear, Rip Region

Blender - Sculpt Mode

Blender - Grabbing, Scaling, and Rotating
The three most basic ways of changing an object in a 3D scene are called transformations:
✓ Change location using translation (grabbing)
✓ Change size using scale
✓ Change rotation using rotation

Blender - Differentiating Between Coordinate Systems
All coordinate systems in Blender are based on a grid consisting of three axes:
✓ The X-axis typically represents side-to-side movement
✓ The Y-axis represents front-to-back movement
✓ The Z-axis goes from top to bottom

Blender - Transform Orientations
✓ Global: the primary orientation to which everything else relates
✓ Local: each 3D object in Blender has its own local coordinate system
✓ Normal: a set of axes perpendicular to some arbitrary plane
✓ Gimbal: used when you rotate an object about its X, Y, and Z axes
✓ View: relative to how you are looking at the 3D View

Blender - Move, Rotate, Scale

Blender - Hotkeys

Blender - Hotkeys on the Numerical Keypad

Modeling and Computer Graphics in VR
To create 3D models in VR for real-time interaction, one approach is to perform optimization that reduces complexity by minimizing the mesh size of the models; as a result, the visual realism of the models may be affected.
Computer graphics techniques used to create 3D models in VR include mesh shading: flat shading (left) versus smooth shading (right).

Techniques Used to Create 3D Models in VR
The auto smooth shading filter quickly and easily changes the way shading is computed. Mesh editing tools such as bevel, subdivision, loop cut, etc. may need to be applied at the edges to create better visual effects.
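The grab (translate), scale, and rotate transformations described above can be sketched as plain coordinate math on a single vertex. This is an illustrative Python sketch using the global axes, not Blender's API:

```python
# Minimal sketch of the three basic transformations applied to one vertex.
import math

def translate(v, dx, dy, dz):
    """Grabbing: change location by adding an offset on each axis."""
    x, y, z = v
    return (x + dx, y + dy, z + dz)

def scale(v, sx, sy, sz):
    """Change size by multiplying each coordinate."""
    x, y, z = v
    return (x * sx, y * sy, z * sz)

def rotate_z(v, degrees):
    """Rotate about the global Z-axis (the top-to-bottom axis)."""
    x, y, z = v
    a = math.radians(degrees)
    return (x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a),
            z)

v = (1.0, 0.0, 0.0)
v = scale(v, 2, 2, 2)      # (2.0, 0.0, 0.0)
v = rotate_z(v, 90)        # approximately (0.0, 2.0, 0.0)
v = translate(v, 0, 0, 1)  # approximately (0.0, 2.0, 1.0)
print(v)
```

Applying the same functions to every vertex of a mesh transforms the whole object; the order of operations matters, since rotating then translating generally differs from translating then rotating.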
Below: 3D models with the bevel modifier applied with 20 segments, 2 segments, and 6 loop cuts (from left to right).

Blender - Render Engine Eevee
The corresponding models are rendered using Eevee, the real-time render engine in Blender. The visual effects of the models are represented by red, green, and blue color (from left to right).

Free 3D Models
● Poly: https://poly.google.com
● TurboSquid: https://www.turbosquid.com
● Unity Asset Store: https://assetstore.unity.com/
● CGTrader: https://www.cgtrader.com/free-3d-models

Unity Interface
A - Toolbar
B - The Hierarchy Window
C - The Game View
D - The Scene View
E - The Inspector Window
F - The Project Window

User Interface - Toolbar
On the left, it contains the basic tools for manipulating the Scene view and the GameObjects within it. In the centre are the play, pause, and step controls. On the right, you have access to Unity Collaborate, Unity Cloud Services, and your Unity Account. Finally, there is a layer visibility menu and the Editor layout menu.

User Interface - Hierarchy Window
A hierarchical text representation of every GameObject in the Scene. Each item in the Scene has an entry in the hierarchy, so the two windows are inherently linked. The hierarchy reveals the structure of how GameObjects attach to each other.

User Interface - Game View
Simulates what your final rendered game will look like through your Scene Cameras. When you click the Play button, the simulation begins.

User Interface - Scene View
Allows you to visually navigate and edit your Scene. The Scene view can show a 3D or 2D perspective, depending on the type of Project you are working on.

User Interface - Inspector Window
Allows you to view and edit all the properties of the currently selected GameObject.
Because different types of GameObjects have different sets of properties, the layout and contents of the Inspector window change each time you select a different GameObject.

User Interface - Project Window
Displays your library of Assets that are available to use in your Project. When you import Assets into your Project, they appear here.

Assets in Unity
An asset is a representation of any item you can use in your project. An asset might come from a file created outside of Unity, such as a 3D model, an audio file, or an image. You can create some asset types in Unity, such as a ProBuilder Mesh, an Animator Controller, an Audio Mixer, or a Render Texture.

How Unity Imports Assets
Unity automatically imports assets and manages additional data about them:
1.