
Linköping University | Department of Computer and Information Science
Master's thesis, 30 ECTS | Datateknik
2021 | LIU-IDA/LITH-EX-A–21/010—SE

Synchronizing 3D data between software – Driving 3D collaboration forward using direct links

Synkronisering av 3D-data mellan mjukvaror

Carl Brage

Supervisor: Jonas Wallgren
Examiner: Cyrille Berger


Copyright

The publishers will keep this document online on the Internet - or its possible replacement - for a period of 25 years starting from the date of publication barring exceptional circumstances. The online availability of the document implies permanent permission for anyone to read, to down- load, or to print out single copies for his/hers own use and to use it unchanged for non-commercial research and educational purpose. Subsequent transfers of copyright cannot revoke this permission. All other uses of the document are conditional upon the consent of the copyright owner. The publisher has taken technical and administrative measures to assure authenticity, security and accessibility. According to intellectual property law the author has the right to be mentioned when his/her work is accessed as described above and to be protected against infringement. For additional information about the Linköping University Electronic Press and its procedures for publication and for assurance of document integrity, please refer to its www homepage: http://www.ep.liu.se/.

© Carl Brage

Abstract

In the area of 3D visualization there are often several stages in the design process. These stages can involve creating a model, applying a texture to the model and creating a rendered image from the model. Some software can handle all stages of the process while others focus on a single stage to try to perfect and narrow down the service provided. In that case there needs to be a way to transfer 3D data between software in an efficient way where the user experience isn't lacking. This thesis explores the area of 3D data synchronization by first building a foundation through a prestudy and a literature study. The findings from these studies are used in a shared-file-based implementation and a design of a network-based system. The work presented in this thesis forms a comprehensive overview which can be used for future work.

Acknowledgments

My gratitude goes to Jonas and Cyrille for helping me fix all the errors in this report, and to Configura for giving me a chance at my first programming job. I would also like to thank my family for being there for me and supporting me through these years at Linköping University. A special thank you goes to Erika who was my greatest support during this last period.

Contents

Abstract

Acknowledgments

Contents

List of Figures

List of Tables

1 Introduction
1.1 Motivation
1.2 Aim
1.3 Research questions
1.4 Delimitations

2 Theory
2.1 Building Information Modeling
2.2 3D design workflow
2.3 3D data representation
2.4 Network terminology
2.5 3D rendering software

3 Method
3.1 Prestudy
3.2 Literature study
3.3 File-based implementation
3.4 Network-based system

4 Results
4.1 Prestudy
4.2 Literature study
4.3 File-based implementation
4.4 Network-based system

5 Discussion
5.1 Results
5.2 Method
5.3 The work in a wider context

6 Conclusion
6.1 What research exists on 3D data sharing between software?
6.2 What limitations does an automated file-based import-export workflow have?
6.3 How can a network-based 3D data system be described using existing research?
6.4 Future work

Bibliography

List of Figures

2.1 Internet protocol layers.
2.2 CET Designer.

4.1 Stages of 3D modeling.
4.2 AtomDag node type hierarchy.
4.3 Structure of the two-way cooperation system.
4.4 Architecture of the collaborative system.
4.5 CET Designer extension for the export.
4.6 Unreal Engine Auto Reimport settings.
4.7 Blueprints for importing a FBX file.
4.8 Basic room created in CET Designer.
4.9 Architecture of the network-based system for CET Designer.
4.10 Communication chart for the system.
4.11 Format for the network design data.

List of Tables

4.1 3D render engine features.
4.2 Revit export file types.
4.3 Specifications for the computer used during development and testing.
4.4 Specifications for the files used during testing.
4.5 Import/export times in seconds for the files and software used.
4.6 Testing the implementation with FBX and glTF files.

1 Introduction

This chapter consists of the introduction of the thesis which gives a background of what we want to achieve and why. It is divided into motivation, aim and research questions with a description of delimitations at the end.

1.1 Motivation

Producing 3D content has been made easier with the introduction of different visualization software. These aim to provide many useful features and a clear UI. Over the years some of these software have made breakthroughs when it comes to producing realistic renderings of objects and scenery. Showing correct shadows, lighting and textures is important to give the viewer an accurate representation of 3D models and scenes. The introduction of features such as virtual reality (VR), augmented reality (AR) and haptic feedback has been part of bringing another level of realism to the digital world. Given these higher standards for 3D content, companies need to meet the changing demands of customers, who are accustomed to seeing content represented in an increasingly visually correct way. Implementing a proper engine which meets these demands requires a large workforce with a deep understanding of 3D rendering. This is not only costly, but it is also unnecessary to have several companies developing the same features. For this reason it is very common to instead rely on third-party software to handle the graphics. Exporting the 3D content from a drawing and modelling software to a rendering software can be done in various ways which have different benefits. Allowing for continuous changes to the content while having a quick and easy workflow is crucial for designers using this kind of software. The question is how this can be achieved for companies which produce their own design software and are looking to easily create high-quality renders.

This thesis was proposed by Configura, which is a company with offices around the world. Configura is one of the leading companies in space planning software. The software (CET Designer) is used in areas such as commercial furniture, material handling, kitchen, bath and industrial machinery. It is built with scalability and modularity in mind, with many partners delivering custom extensions and plugins. The aim of the thesis is to find out how a well-known 3D rendering engine can be more loosely integrated into the software while retaining important features. Photorealistic renderings and real-time updates are features which are important in this context. CET Designer already has advanced 3D rendering capabilities.


Using a separate 3D rendering engine to complement the existing software means that more features could be made available to the end user. Material editing and ray-traced lighting are two features which could be interesting for the users. It could also provide a different UI which the user is more familiar with. CET Designer currently supports many different file formats and Configura is always looking to expand their reach when it comes to exporting and importing 3D content. Being able to export drawings to separate rendering software during the design phase, for iterative development, is highly interesting for the company. This workflow needs to be simple and fast for the designer. Therefore, Configura wants to research this area further.

1.2 Aim

The aim of this thesis is to find out how companies producing design software can implement systems which allow for fast and efficient export to third-party rendering software. Using existing research and implementations by other companies, we will find out how the design import-export workflow can be optimized and made as simple as possible for the end user. Different approaches will be evaluated, such as a file-based approach and a network-based approach. Implementing the approaches for CET Designer will be done if it is possible within the time constraints.

1.3 Research questions

This thesis aims to answer these research questions:

1. What research exists on 3D data sharing between software?

2. What limitations does an automated file-based import-export workflow have?

3. How can a network-based 3D data system be described using existing research?

1.4 Delimitations

This thesis will be focused on evaluating different solutions for optimizing a 3D import-export workflow within the architectural visualization and space planning industry. It will focus primarily on CET Designer as the design software, with a selection of third-party rendering software for evaluation. The software used in this thesis will be limited by cost, as there is no budget for buying licenses and plugins/extensions. Student licenses can be used if they are available for the software.

2 Theory

In this chapter, relevant theory is presented to give a good understanding of the underlying concepts in this thesis. These concepts have to do with architectural visualization, 3D rendering and metrics used to evaluate the implementation.

2.1 Building Information Modeling

Building Information Modeling (BIM) is a digital representation of the physical and functional characteristics of a facility. It allows for creating accurate virtual models of a building by utilizing the shared knowledge resource that is BIM. In a practical sense, BIM represents relationships, attributes and geometries in a multi-dimensional model. BIM is a continuation of CAD software since it can also be used for documentation and maintenance of the building. Software using the BIM standard includes Revit[5] and Graphisoft ArchiCAD[26]. The file format Industry Foundation Classes (IFC) is often associated with BIM. It is a platform-neutral model which is used to describe architectural and construction data. IFC is used to communicate between CAD software and other software. IFC is developed by the organization buildingSMART. [32]

2.2 3D design workflow

A typical 3D design workflow can be composed of several stages. Different software exist to provide specialization in the areas of modeling, materials, rendering, post-processing etc. Being able to transfer 3D data between software is crucial since it allows for collaboration and an easier workflow. Creating a 3D design could consist of creating an initial 2D sketch which is then exported into a 3D modeling software where a 3D model is built using the sketch. This 3D model can then be exported into a separate software which applies a material onto the model. For the final step the model needs to be rendered, and there might be a need to apply post-processing to the final rendered images. In the different steps there is a need to allow for import and export functionality in the collaborating software.[36] There are different file types used for these steps, where DWG can be used for drawings, FBX for 3D models and PNG for rendered images.[38]

When you want to automate this workflow you can create a shared 3D file which one software writes to and another one reads from. The receiving software will then check the file system for any changes to the 3D file. If the file has been altered, it will reimport the file and update the generated scene or model.[39] This approach to automating a 3D import-export workflow will be referred to as file-based in this thesis. There is also an approach which doesn't use the file system; this approach will be referred to as network-based in this thesis. By setting up protocols for communication and specifying a format, the data can be streamed between software. The software involved could be on the same computer or spread out across the world on different computers connected through the internet. There can be a server in the middle of the communication channel which handles the 3D model storage and how the data is sent between the software.[12][30] A more detailed presentation of how these approaches can be used in practice is given in the results chapter.
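To make the shared-file idea concrete, the sketch below shows a minimal polling watcher in Python. It is only an illustration of the principle: the shared file name and the reimport_scene function are hypothetical stand-ins, since the actual reimport call depends on the receiving software.

```python
import time
from pathlib import Path

SHARED_FILE = Path("shared_scene.fbx")  # hypothetical path agreed on by both programs

def reimport_scene(path: Path) -> None:
    # Stand-in for the receiving software's import call.
    print(f"Reimporting {path} ({path.stat().st_size} bytes)")

def watch(poll_interval: float = 1.0) -> None:
    """Poll the shared file and reimport it whenever its modification time changes."""
    last_mtime = None
    while True:
        if SHARED_FILE.exists():
            mtime = SHARED_FILE.stat().st_mtime
            if mtime != last_mtime:
                last_mtime = mtime
                reimport_scene(SHARED_FILE)
        time.sleep(poll_interval)

if __name__ == "__main__":
    watch()
```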

2.3 3D data representation

This section presents theory about 3D data representation, which includes how the data is stored and how different components are connected. Two main components of 3D data are the geometry and the materials applied to the geometry.

2.3.1 Geometry

The geometry of a model can be defined in different ways. The geometry can be solid, which means that the volume of the object is defined, or a shell/boundary, which means that only the surface of the model is defined. How the geometry is represented can depend on how detailed the model needs to be and on storage requirements. A mesh representation means that the model is made up of vertices connected by edges to create polygonal shapes. Meshes are widely used in 3D modeling and rendering. 3D point clouds are also used for 3D geometry, where sets of three-dimensional coordinates define the envelope of the object. This representation is mostly used when a physical object has been scanned through, for example, photogrammetry. Voxel-based geometries have been used where curves and rounded objects aren't as important as performance and storage limitations. A voxel can be seen as a 3D pixel: a simple cubical unit.[42][24]
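As a small illustration of the mesh representation described above, the following Python sketch stores an indexed triangle mesh as a vertex list and a face list; the square geometry is made up for the example.

```python
# A minimal indexed triangle mesh: a unit square built from two triangles.
# Vertices are 3D coordinates; each face lists indices into the vertex list.
vertices = [
    (0.0, 0.0, 0.0),
    (1.0, 0.0, 0.0),
    (1.0, 1.0, 0.0),
    (0.0, 1.0, 0.0),
]
faces = [
    (0, 1, 2),  # first triangle
    (0, 2, 3),  # second triangle
]

# Edges can be derived from the faces rather than stored explicitly.
edges = {tuple(sorted((f[i], f[(i + 1) % 3]))) for f in faces for i in range(3)}
print(len(vertices), "vertices,", len(faces), "faces,", len(edges), "edges")
```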

2.3.2 Materials

The appearance of a 3D model is finalized by applying a material to the geometric data. This material can be represented in different ways, for example as a simple image which is mapped onto the 3D model. Properties can be applied to the material to achieve higher photorealism. Properties such as reflection, opacity and roughness of the material can elevate it to the next level.[38] There are different ways of specifying the properties of materials; one of them is an approach called Physically Based Rendering.

Physically Based Rendering

Physically Based Rendering, PBR, is an approach to rendering graphics with the aim of achieving photorealism. PBR tries to model the flow of light in the real world. The implementation of PBR can vary since it doesn't contain a strict set of rules; PBR is more of a methodology. Physical phenomena such as absorption and scattering of light on an object are studied to accurately simulate materials. Materials are defined through properties which determine light scattering and absorption. The file formats glTF and glb take advantage of PBR to create realistic 3D models. [40]
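As an illustration, the sketch below describes a material using metallic-roughness properties in the style of glTF's pbrMetallicRoughness block; the material name and the numeric values are invented for the example and not taken from any particular asset.

```python
import json

# Illustrative PBR material in the spirit of glTF's pbrMetallicRoughness block.
# The name and numeric values are made up for the example.
material = {
    "name": "BrushedMetal",
    "pbrMetallicRoughness": {
        "baseColorFactor": [0.8, 0.8, 0.85, 1.0],  # RGBA base color
        "metallicFactor": 1.0,                     # fully metallic surface
        "roughnessFactor": 0.35,                   # fairly smooth, sharp highlights
    },
    "emissiveFactor": [0.0, 0.0, 0.0],             # no self-illumination
}

print(json.dumps(material, indent=2))
```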

2.3.3 3D file formats

There exist several different 3D file formats which were developed for different purposes. Some of these include OBJ, FBX and glTF. The sizes and attributes vary between these file formats. glTF stands for Graphics Language Transmission Format and was designed by the Khronos Group to transport 3D content efficiently over networks. The structure of a scene is specified using JSON and consists of elements such as scenes, nodes, cameras and meshes.[34] The FBX file format was developed by Kaydara and is now owned by Autodesk. It is used in software such as 3ds Max[2], Maya[3], Blender[17] and Unreal Engine[23] since it can be used to store 3D scene and animation data. The OBJ file format was developed by Wavefront Technologies and supports geometries in the form of vertices/edges/faces and parametric surfaces, vertex normals, textures, material properties and groups. An OBJ file consists of a number of lines with keys and values.[38]
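To illustrate the line-based structure of OBJ mentioned above, the following Python sketch defines a single triangle as OBJ text and performs a minimal parse of it; it only handles the few keys used in the example.

```python
# A tiny OBJ file as text: each line starts with a key ("v" = vertex, "vn" = normal,
# "f" = face) followed by its values. The triangle below is made up for the example.
obj_text = """\
v 0.0 0.0 0.0
v 1.0 0.0 0.0
v 0.0 1.0 0.0
vn 0.0 0.0 1.0
f 1//1 2//1 3//1
"""

# Minimal parse: collect vertex positions and faces (OBJ indices are 1-based).
vertices, faces = [], []
for line in obj_text.splitlines():
    key, *values = line.split()
    if key == "v":
        vertices.append(tuple(float(v) for v in values))
    elif key == "f":
        faces.append(tuple(int(v.split("/")[0]) - 1 for v in values))

print(vertices)
print(faces)
```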

2.4 Network terminology

Some of the implementations used in various direct links between design software and rendering software are based on a network approach, for example the Live Link plugin for Unreal Engine[23]. This section gives a basic understanding of network terminology and how the layers work together. There are five layers in the Internet protocol suite. Each layer is built upon the layer directly below and uses it to create new services. The layers are, from the bottom up, the physical layer, link layer, network layer, transport layer and application layer.[33]

Figure 2.1: Internet protocol layers (application: HTTP, SMTP, FTP; transport: TCP, UDP; network: IP; link: Ethernet, LTE; physical: optical fiber, coaxial cable).

2.4.1 Physical Layer

The physical layer is responsible for transferring bits rather than data packets. Bits are converted to signals and then transferred between nodes in the network. The physical layer refers to the hardware components, which can be different types of network cables, routers or switches.

2.4.2 Link Layer

The link layer consists of protocols such as Ethernet and wireless network standards. The basic service of the link layer is to move a frame from one node to another, where a frame consists of a data field and a number of header fields. The link layer can also provide reliable delivery, error detection and error correction.

2.4.3 Network Layer

The Internet Protocol, IP, resides in the network layer. On the network layer, packets known as datagrams are transferred between hosts. Packets can vary in size depending on what the link layer can handle, and they can be split up into smaller packets before they are handed over to the link layer. The transport layer uses the network layer in a similar way as you would use the postal service to deliver a letter: an Internet transport layer protocol (TCP or UDP) sends a segment and a destination address to the network layer, which then delivers the segment to the transport layer of the destination host.

2.4.4 Transport Layer

The transport layer sends messages between application endpoints. There are two dominating transport protocols on the Internet, TCP and UDP. TCP stands for Transmission Control Protocol and is a connection-based protocol. A connection is set up between two hosts before any data is sent from the application. TCP guarantees that the sent data will make it to the target by asking for acknowledgments. If no acknowledgment is received from the target after a segment has been sent, the source host will try to resend the segment. TCP keeps the network from being congested by throttling the transmission rate when necessary. It also breaks long messages into shorter segments. UDP stands for User Datagram Protocol and is a connectionless protocol. This means that there is less reliability compared to TCP, as there is no acknowledgment when a segment is sent. UDP doesn't provide congestion control or flow control. Combined, these properties mean that UDP can be faster but less reliable than TCP.
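The practical difference between the two protocols can be sketched with Python's standard socket module: TCP requires a connection to be established before data is sent, while UDP simply sends a datagram. The host and port numbers below are arbitrary placeholders.

```python
import socket

MESSAGE = b"scene-update"  # placeholder payload

def send_tcp(host: str = "127.0.0.1", port: int = 9000) -> None:
    """TCP: a connection is established first, then data is sent over it reliably."""
    with socket.create_connection((host, port)) as conn:
        conn.sendall(MESSAGE)

def send_udp(host: str = "127.0.0.1", port: int = 9001) -> None:
    """UDP: no connection; the datagram is sent without any delivery guarantee."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(MESSAGE, (host, port))
```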

2.4.5 Application Layer

The application layer lies closest to the end user and includes many protocols such as HTTP and SMTP. The application layer describes how information is to be exchanged; in the case of HTTP, it describes the exchange between a web browser and a web server. Data is sent from the application layer to the transport layer, which then handles the transmission. The protocols can be seen as the format with which we transfer information.

2.5 3D rendering software

This section lists some of the 3D rendering engines and software commonly used in game development and architectural visualization.

2.5.1 CET Designer

CET Designer is developed by Configura, which was founded in Sweden and now has offices around the world. CET stands for Configura Extension Technology and the software is used for space planning and configuration of products. What separates CET Designer from a CAD software is that there are rules defined for the different products, which for example prevent a table from being stretched to a width that the manufacturer doesn't support. Products can be grouped together in a user-friendly manner which helps the designer quickly develop spaces to show a customer. Uses for CET Designer range from kitchen spaces to warehouses and storage. Completed drawings can be rendered as high-resolution images or exported in formats such as DWG, FBX, OBJ and IFC to other software. There are also extensions for software like Revit and SketchUp[48]. CET Designer[8] is developed in Configura Magic (Cm), a language created by Configura. An image of what the user sees when working in CET Designer is shown in figure 2.2.

Figure 2.2: CET Designer.

2.5.2 Unreal Engine

Unreal Engine is developed by Epic Games and contains a complete suite of creation tools for different uses across several platforms. Unreal Engine is for example used for game development, architectural visualization and television content creation. It is regularly updated with new features, bug fixes and community contributions. The latest version is Unreal Engine 4 and the fifth generation is scheduled to be released during 2021. Epic Games gives full access to the C++ source code for Unreal Engine 4 on GitHub, where contributors can fork and modify the engine. Unreal Engine has a well-established forum which allows developers and users to ask questions about features of the engine and associated tools. The license agreements allow for free use of Unreal Engine 4 for smaller projects. In projects which exceed the $1,000,000 USD gross revenue limit, Epic Games takes out a 5% royalty fee. There are also custom licenses which companies can apply for. [23]

In Unreal Engine 4 there is a plugin called Live Link which allows other software to stream data such as motion capture. This plugin can be used to connect software such as Maya and Motionbuilder[4] directly to Unreal Engine. The plugin is free to use and is open source just like the engine code itself. This allows further development to be done to expand the features of the plugin. [20]

For gathering data from different software in a unified format, Epic Games has developed Datasmith, which is a collection of tools and plugins. It is supported by 3ds Max, Cinema4D[37], Revit and IFC. [19]

Twinmotion

Twinmotion is a software used for architectural visualization. It was developed using Unreal Engine and uses BIM or CAD models to create realistic renderings. Unlike Unreal Engine, the source code is not available and there is no way to collaborate on the development of Twinmotion. Plugins and features are either added by the developers or in close collaboration with partners such as Autodesk and Trimble. Twinmotion is available through a perpetual license at a retail price of $499 USD, and there is also a free license for students and educators. [22]

2.5.3 Unity

Unity is developed by Unity Technologies and is, like Unreal Engine, a cross-platform engine. Unity has been used in industries such as game development, film, architecture and construction. The Unity release cycle is fairly frequent, where versions are gathered into Long-Term Support releases. Unity has shared the C# code that goes into the engine, though it does not allow for modification and contributions in the same way that Unreal does.[43]

There is a different licensing model for Unity compared to Unreal Engine, where Unity has opted for a model where companies buy licenses for each user. The plans vary from $399 USD per year to $200 USD per month depending on the needs of the company. The different plans depend on the revenue of the company using Unity. There is also a free plan if revenue is less than $100,000 USD in the last 12 months or if you are a student. [44]

Unity Reflect is a product which enables visualizing BIM and CAD projects in real time from software such as Revit, SketchUp and Rhino[1]. It is offered at an annual subscription cost of $690 USD, though there is also a free 30-day trial. [46]

2.5.4 Blender

Blender is a free and open-source 3D creation suite which supports development on Windows, macOS and Linux. Blender is a community-driven project supported by the Blender Foundation and the Blender Institute. Features of Blender include rendering, modeling, sculpting and video editing. Python is used as the internal scripting language for Blender. The majority of the Blender code is in C and C++. [17]

2.5.5 Enscape

Enscape is a plugin which integrates into design software to deliver real-time rendering and virtual reality. The plugin is developed by Enscape GmbH and is marketed towards architects and designers. Support for Enscape integration exists for Revit, SketchUp, ArchiCAD and others. The changes made in the design software are automatically translated to Enscape, where the higher quality output is shown.

Licensing for Enscape is based on a similar model as Unity, where licenses are bought for individual users or computers. A fixed-seat license, which is tied to one machine, is listed at approximately $39 USD per month, and a floating license, which can be shared between multiple machines, comes in at $58 USD per month. [25]

2.5.6 Lumion

Lumion is an architectural rendering software created by the Dutch company Act-3D, which was founded in 1998. Lumion focuses on a simple workflow with intuitive tools and features. It is compatible with software such as Autodesk Revit, SketchUp, Rhino and 3ds Max. Lumion has its own plugin called LiveSync for direct syncing with other software. Lumion is not open source and development of plugins is done in-house. The standard version of Lumion 11 costs €1499 as a one-time fee, and the pro version costs €2999. There is also a free student license and a free 14-day public trial license.

3 Method

In this chapter the method used in the thesis is thoroughly explained to improve reproducibility. Certain choices and thoughts are presented here to show what steps were taken to reach a conclusion. The method is divided into a prestudy, a literature study, a file-based implementation and a network-based system. Dividing the last two parts into an implementation and a system sketch was a choice made in the middle stage of the thesis.

The initial aim of this thesis was to create a fully functional network-based implementation and to test its performance. As the prestudy and literature study progressed, the author found that a network-based implementation would take a considerable amount of time to create. Setting up a server for communicating 3D scene changes between clients and creating plugins would be a large challenge considering the time constraints. There were also some reservations from developers at Configura regarding whether multi-threading could be implemented in CET Designer to support that kind of implementation. Discussions between the author and the supervisor at Configura led to the method described in the following sections.

3.1 Prestudy

In the prestudy different rendering software were evaluated to find a possible solution for synchronizing 3D models from CET Designer. This would lay the foundation for the file-based implementation, where the best suited software would be used. A list of possible programs was produced by using search terms such as "rendering software", "architectural visualization software" and "game engine". The search engine used was Google1 and results from the searches were compiled into a list of software candidates. The results were then filtered by Configura developers, who decided on the final list in section 2.5.

3.1.1 Configura's constraints

Evaluating the software to find a suitable candidate for implementation was done in several stages. The first stage was to gather the constraints that Configura had on the software; these are listed below:

1 https://www.google.com/


The software should be well known in the architectural visualization/space planning community.

The first criterion was that the software needed to be well known to designers in the architectural visualization/space planning community. This criterion was evaluated through the searches mentioned earlier. Based on feedback from developers at Configura, the software which was most relevant to the use case was gathered.

It should not cost anything to develop a solution, meaning there should not be any subscription costs for the software during development.

When it comes to the cost of development, an important criterion was that the software wasn't going to incur any costs. Since this is only a pilot study to show the capabilities and possibilities of connecting two programs, Configura didn't want to pay for a license. Having free use of the software for development and testing was a vital part of the selection.

The software should be open source or allow for calls to an API which has the necessary features.

Another criterion was that there needed to be access to the source code in some way, or to an API which could be used to make the proper function calls. There needed to be support for importing meshes, materials and other 3D model data. Previous work in this area, a thesis [41] performed at Configura, showed that this isn't always trivial. Some software manufacturers make sure that the source code is only open to their own developers or partners in order to keep a competitive advantage. And even if there is access to an API, it doesn't always give developers full control over importing and exporting 3D model data. Doing a proper investigation before starting development is crucial.

3.1.2 Supported file formats

Another part of the evaluation consisted of researching which file formats the software supports for importing and exporting 3D content. A large range of supported file formats means that there are many possibilities when it comes to saving the correct 3D information. Making sure that there are no data losses when exporting a drawing to a file is crucial for designers. 3D content can be designed and defined differently depending on optimizations, how materials are represented and whether any additional information should be stored. Supported file formats for the different rendering engines were evaluated based on the export capabilities of CET Designer as well as the capabilities of other drawing software in this industry. Theory about the different file formats is described in section 2.3.3.

3.2 Literature study

The aim of the literature study was to find out what advances had been made in the area of design workflow optimization. More specifically, the aim was to find out how to optimize the import-export workflow where 3D data is sent from one software to another. Papers were found through searches in different databases. The main search engine used was Google Scholar2, which links to papers on different websites and databases. Access to these databases was provided by Linköping University, and the databases mainly used were IEEE Xplore3, SpringerLink4, ScienceDirect5, ACM Digital Library6 and ResearchGate7.

2 https://scholar.google.com/
3 https://ieeexplore.ieee.org/
4 https://www.link.springer.com/
5 https://www.sciencedirect.com/
6 https://dl.acm.org/
7 https://www.researchgate.net/

The validity of the papers was evaluated by checking how well cited they were, where the papers were published and who the authors were. Finding additional papers in this area was done by searching for papers with related references and papers released by the same authors. In this way many papers could be traversed to find common denominators which could be used for categorizing the papers in the results chapter. Search terms used to find papers included "BIM", "streaming", "TCP", "UDP", "mesh", "plugin", "synchronization" and "3D". Searches for different game engines and rendering engines were also done to find specific implementations. The engines used in the searches are listed in section 2.5. Likewise, file-specific implementations were included in the searches as well. The relevant file formats found are described in section 2.3.3.

3.3 File-based implementation

The implementation of a possible improvement to the import-export workflow consisted of finding a simple solution which could be evaluated and possibly improved upon. There were two suggestions provided by developers at Configura: either a file-based approach or a network-based approach. The file-based approach meant that the design software would create a file in a format which both software supported. The file was then shared between the programs. The network-based approach meant that the 3D data would be sent between the software by means of a TCP or UDP link. Changes in the model would be sent directly through the link to keep the software synchronized. The file-based approach seemed to be the simplest to implement and test. Therefore, the file-based approach was implemented in software, whereas the network-based approach resulted in a system sketch based on the literature study. This approach was also suggested to the author by the developers at Configura, since some work had already been done when it comes to supporting many different file formats. The current state of available options was further researched in the literature study and network-based system sections.

3.3.1 Evaluation

The evaluation of the file-based system was done in two phases. In the first phase the correlation between different file sizes and the export/import times was examined. This provides information about the limitations of using a file-based system and the differences between software. The first phase was performed using the software listed in section 2.5. Files were gathered from designs by users of CET Designer as well as from the SketchUp 3D library[48]. Files varied in size depending on file format and how detailed the models or materials were. Since glTF has been shown to be a promising format for 3D models, it was going to be used for testing if possible. The prestudy provides more information about the different software and which file formats they support. Important metrics about the models, such as the number of triangles and vertices, can differ based on file type according to a recent study[34]. During the tests a timer was started when the export or import button in the software was pressed and stopped when the file had completely loaded into the software or had been saved to the file system. The results were compiled for further discussion.

In the second phase the actual implementation was tested. The aim of this phase was to find out if this type of system would be useful and if there were any benefits from using different file formats. Smaller files were used in this stage to enable faster testing and to focus on how file formats affect performance. The overhead eliminated by the implementation comes from the user navigating to and pressing the import and export buttons as well as locating the file in the file system. Each time a change was made to the scene a new file was generated. The testing in this phase included both the initial loading time of the entire scene and the loading time when a change had been made to a model in the scene.
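Where a software exposes a scriptable import or export call, the manual stopwatch procedure described above could be approximated with a small timing helper like the hypothetical sketch below; export_scene and import_scene are placeholders, not real API calls of any of the evaluated programs.

```python
import time

def timed(label: str, action) -> float:
    """Measure the wall-clock time of one import or export action, in seconds."""
    start = time.perf_counter()
    action()  # e.g. a scripted export or import call in the tested software
    elapsed = time.perf_counter() - start
    print(f"{label}: {elapsed:.2f} s")
    return elapsed

# Hypothetical usage; export_scene and import_scene stand in for software-specific calls:
# timed("Export FBX", lambda: export_scene("room.fbx"))
# timed("Import FBX", lambda: import_scene("room.fbx"))
```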


3.4 Network-based system

The network-based system was based on information from the prestudy and the literature study. Early discussions with Configura developers resulted in a desire to research how a network-based 3D data transfer system could be sketched out. The system was focused around the CET Designer software and the Configura Magic programming language it is built on. Technologies that were used when developing CET Designer are detailed in the results chapter. Limitations which hinder certain implementations are also discussed. The system lays a foundation for further development, and research from the literature study provides a basis for the results that could be achieved from using this kind of implementation.

The motivation behind building a network-based system for transferring 3D data between software is that it gives the users more freedom when designing, for example, offices. CET Designer can be used to set up all the furniture and get a price estimation in real time as more items are added. CET Designer is also used to get assembly instructions and produce 2D CAD drawings. Exporting the entire graphical scene into a different software such as Unreal Engine or Lumion would allow the users to take advantage of more advanced 3D features such as modifying materials and applying real-time and post-processing effects. It would also be possible to collaborate in real time on the same drawing, where one user is placing all the furniture while another is placing decorations such as flowers and paintings. The feedback Configura has received on their software over the years, together with discussions with developers, resulted in the following list of functionalities the system should have:

1. The system should support two-way communication between software.

2. The system should be scalable and support multiple machines at the same time.

3. The system should be able to send updates in real-time.

Supporting two-way communication means that there is a possibility of manipulating objects on a separate client and having that translate to a change in CET Designer. This could open new possibilities for interacting with objects using tools such as virtual reality. Scalability is an important metric as it allows the system to communicate changes to several users at once. A server would handle all requests so that user performance is not affected on either end. Sending updates in real time means that when a change has been made in CET Designer it is translated into a 3D format which can be sent to a client quickly and efficiently.
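A minimal sketch of the relay idea, assuming a line-delimited update message format, is shown below in Python: a central server accepts several clients and forwards every update it receives to all other connected clients. The message format and the port number are assumptions made for the illustration.

```python
import asyncio

clients = set()  # writers for all currently connected clients

async def handle_client(reader: asyncio.StreamReader, writer: asyncio.StreamWriter) -> None:
    """Relay every line-delimited update from one client to all other clients."""
    clients.add(writer)
    try:
        while line := await reader.readline():   # one update message per line
            for other in clients:
                if other is not writer:
                    other.write(line)
                    await other.drain()
    finally:
        clients.discard(writer)
        writer.close()

async def main(host: str = "0.0.0.0", port: int = 9000) -> None:  # port chosen arbitrarily
    server = await asyncio.start_server(handle_client, host, port)
    async with server:
        await server.serve_forever()

if __name__ == "__main__":
    asyncio.run(main())
```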

4 Results

This chapter presents the results in a similar format to the method chapter in order to give a clear understanding of the development of ideas. The implementation sections have been expanded to include a more in-depth explanation of the different components used.

4.1 Prestudy

Information for the prestudy was gathered through official websites of the companies which are referenced in section 2.5. This was compiled in table 4.1 which covers the questions and constraints listed in the method chapter.

Software | Usage | Open source | Cost | File formats | Compatible software for direct link
Blender | Game development, architectural visualization, film | Yes | Free | 3DS, FBX, OBJ, PLY, STL | No official links
Enscape | Architectural visualization | No | $39 USD per month (single machine), $58 USD per month (multiple machines) | FBX, glTF, OBJ | ArchiCAD, Revit, Rhino, SketchUp, Vectorworks
Lumion | Architectural visualization | No | Free license for students; €1499 for standard version, €2999 for pro version | 3DS, DAE, DWG, DXF, FBX, OBJ, SKP | ArchiCAD, AutoCAD, Revit, Rhino, SketchUp, Vectorworks
Twinmotion | Architectural visualization | No | Free license for students; $499 USD perpetual license | 3DS, C4D, FBX, OBJ, SKP | ArchiCAD, Revit, Rhino, RikCAD, SketchUp
Unity | Game development, architectural visualization, film, construction | No | Free license for students; $399 USD per year to $200 USD per month | 3DS, DAE, DXF, FBX, OBJ | Revit, Rhino, SketchUp
Unreal Engine | Game development, architectural visualization, television content creation | Yes | Free to develop; 5% royalty fee for projects exceeding $1,000,000 USD | FBX, glTF, OBJ | Grasshopper, Rhino, SolidWorks

Table 4.1: 3D render engine features.

Table 4.1 shows that when it comes to the cost of development for a proof of concept, there are a few options which would be viable. Since this is a student project it should be possible to get access to student licenses, meaning that Unreal Engine, Twinmotion, Unity, Blender and Lumion could be used. The caveat is that these software might not be open source for development. Only Blender and Unreal Engine are fully open source and would allow for greater modifications. Unity does allow for creating functionality using scripts and you can also create plug-ins.[45] Through researching the companies' websites and forums it was found that development of plugins is done in close collaboration between the companies.

In a forum thread[28] on the Vectorworks website, employees mention working directly with Epic Games to provide a Twinmotion plugin. The companies can choose which collaborations they want to focus on and which links they want to allow. A consequence of this can be that it is not possible to implement a direct link, or even an indirect link using a file-based approach. In the master's thesis by Pintar[41] it was found that limitations in the open Revit API prohibit a live connection to CET Designer.

These constraints and limitations led to the decision that Unreal Engine would be a good candidate for future development. It supports the file formats FBX and glTF, which are both used in CET Designer. In addition, it is open source and will not cost anything to get started with. Development will be aided by the well-defined documentation1 as well as the forum2 where other developers and users can post questions and answers to problems that arise. In the Unreal Engine marketplace3 there are plugins which could be used to avoid creating solutions which already exist. In the Architectural Visualization Rendering Survey by CGarchitect[7], Unreal Engine is one of the top five most popular rendering engines, which shows that there is an interest in using this software. The benefit of Unreal Engine over Blender, which has similar features, is that Configura has experience using Unreal Engine in the past.

4.2 Literature study

The results from the literature study are presented in this section. There are subsections which deal with different components of transferring 3D data between software. Searches in the area of 3D data sharing show that there are two common approaches[39]. One of them is to use a shared file which both software have access to. When changes are made to the file, the software reloads it into memory. Another approach is to use a protocol such as TCP to stream the data between software over a network. This is accomplished by establishing a connection and sending data when a model or drawing has been changed. This approach can also use a cloud-based sharing model, where a file is uploaded to the cloud and thereby made available to other software. More about these approaches is described below through analyzing related papers which were found through the method described in section 3.2.

4.2.1 File-based approaches

Using a shared file is the simplest approach as it can be built using existing technology. Most software already support exporting and importing files in different file formats. The important part is finding a file format which is optimal for both software and having a file path which both can access. This section investigates how this approach can be used to automate and improve the import and export workflow.

1 https://docs.unrealengine.com/
2 https://forums.unrealengine.com/
3 https://www.unrealengine.com/marketplace/

Industrial 3D importing

With the purpose of defining the 3D modeling workflow, Tran Thi wrote a thesis[47] at Helsinki Metropolia University. The thesis deals with the workflow of converting 3D models between CAD software and 3ds Max. As a collaboration between Helsinki Metropolia University of Applied Sciences and the ship-design company Deltamarin, the problem faced was exporting a 3D model of a ship design from a CAD software to 3ds Max. Tran Thi defined the 3D modeling stages as consisting of importing, set-up, modeling, texturing and rendering. These stages are shown in figure 4.1. The thesis covers all these stages to give a deeper understanding of the complete workflow.

Figure 4.1: Stages of 3D modeling (importing, set-up, modeling, texturing, rendering).

Tran Thi came across problems with polygon breaking, overlapping and duplicates when exporting and importing models. Polygon breaking refers to issues where the receiving software cannot interpret parts of the model. An example of this could be a rounded object which is defined in an unusual way. Overlapping means that objects have overlapping layers which cause graphical bugs. Duplicates are copies of objects which are cloned on top of each other. These issues could be fixed through various tools in the software. What Tran Thi learned from this work is that 3D models need to be properly prepared before being imported into other software. Problems often occur in these workflows, and depending on the software used there can be different issues and bugs.

CAD to VR workflow automation

In the paper by Engberg and Eriksson [14], the pipeline between CAD software and virtual reality is the focus of exploration. They identified a need in the manufacturing industry for visualizing assembly sequences in virtual reality to get a better sense of scale. The application they developed is based on Unreal Engine, and they outline some of the features which could be useful to their implementation. One of the features is Datasmith, which is a collection of tools that can be used to import CAD assets. Using Datasmith also allows developers to use a reimport workflow where CAD models are automatically reimported when the source has been changed. Unreal Engine also allows for preparing data before it is imported into the editor. This is done either through the visual scripting system called Blueprints or by using C++.

Engberg and Eriksson explain the various uses of VR in education and engineering through their literature study. They showed that this is an area of great interest and potential and that there have historically been great difficulties in moving CAD data between software. This is also shown in the paper, as there are struggles when they import the CAD file into Unreal Engine. Naming conventions for meshes and other assets differ between the software. In the implementation, Autodesk Fusion 360 was used to design and export the CAD files. This led to the authors having to create a pipeline for renaming assets according to their specifications. There were also problems with origin points for assets, which were fixed in a similar way. For the authors it was important to also be able to include part data in the CAD files. This means that the implementation needs to support additional information for components, such as size and amount. This also requires custom fixes and implementations as it is a customer-specific request.

The result was an implementation which was tested and evaluated for usability metrics. It was shown to be working with some flaws; there were still some manual configurations required, as the system was not completely automatic. The authors explain that many of the problems are due to limitations in Unreal Engine, where they were unable to make runtime import work. It is not a feature which is directly available; some development is needed, but it is also possible to use plugins from the marketplace. It seems you need to understand how Unreal Engine and its API work on a deeper level to be able to create custom import-export workflows. The differences in how Unreal Engine interprets CAD data compared to the other software used (Creo, Fusion 360 and Siemens NX) were also a hindrance to the implementation. For future studies the authors mention that it would be interesting to revisit this area after more functionality has been added to Unreal Engine. They would also like to evaluate different file formats.


Integrating a 3D building model for real-time visualization

Displaying 3D models through immersive technology can be useful in many different industries. Presenting a building using a 360-degree immersive display was the foundation of the work done in the paper by Fält [15]. The background of the paper was to investigate how Norrköping Visualization Center could use their 360-degree display to provide services to companies and schools where they can visualize and interact with 3D models of buildings. The software used included 3D Studio Max (3ds Max), Revit and Unity 5. Fält found issues with importing 3D data into a game engine. The file format .rvt which is used by Revit is not supported by Unity, which means that a conversion to FBX is needed. The file then needed to be imported into 3D Studio Max, where a texture is applied, before it could be imported into Unity. Fält found that automating the application of textures to the model could be done using different approaches, but rendering the textures still takes a long time.

The conclusion he reached is that you need to have an intimate understanding of how the software works to implement a fully functional pipeline. There are settings and features which can differ between two software, which causes problems when importing and exporting. Fält found that the model would look significantly different in Revit, 3D Studio Max and Unity 5 even though the same model and textures were used. He did not focus on lighting in the different software, as these can be very different and there would need to be some tinkering with settings to get the same results throughout the software. Fält suggested that this could be the basis for the next thesis in this area. He stated that it would also be interesting to evaluate using more models, as he only used one building model in his evaluation. He also stated that using 3D Studio Max directly with different rendering software could prove to be more effective, thereby skipping Revit since it caused issues.

Building information models in game engines

The paper[6] by Bille et al. is focused on visualizing buildings using game engines. The authors explain that the purpose of reusing building plans and models in a 3D interactive virtual environment is to aid model refinement and for use in training environments. Showing virtual models of oil refineries, production platforms, nuclear power plants and buildings during the construction phase provides a safe and comprehensive experience. The limitation of the BIM model is that it doesn't provide an interactive environment; it is rather an accurate description of the building and its components. The authors propose a move to game engines to provide an interactive 3D environment using BIM data. Their paper also reviewed related research and found that there have been a few papers about using BIM in game engines. There are often intermediate steps involved when exporting and importing models where texture mapping and metadata are applied. There can also be problems when models are updated, as the link between the model and alterations to the model is severed due to a reimport and rebuild. The FBX file format is often used, and it is also chosen in this project. The authors created a list of properties for the Revit export file types, which can be seen in table 4.2.

File type | Geometric data | Metadata
FBX | Yes | No
DWG | Yes | No
OBJ | Yes | No
DWF/DWFx | Yes | Partial
NWC | Yes | Yes
ODBC | Yes | Yes
gbXML | Yes | Partial
IFC | Yes | Partial

Table 4.2: Revit export file types.


Bille et al. found, like Fält [15], that there needs to be an intermediate step between Revit and Unity since the textures and colors do not transfer. They used 3ds Max with the AMC (Autodesk Material Converter) script to export an FBX file which contained all the correct components. Since FBX doesn't contain any metadata, it would be provided through a separate database. The authors noted a few improvements which could be made. One of them is that if you structure the data correctly it is possible to run automated scripts which attach dynamic events to certain objects in the BIM. One example of this is a door-opening animation; since the 3D model has collisions enabled, the player would otherwise not be able to enter the building. The conclusion of the paper is that they identified some pipelines from BIM to game engines which could be useful for future research. They also noted that custom tools can be created in the game engines for measuring and interacting with the building structures, which provides additional functionality to the users. A further conclusion is that game engines provide possibilities for using BIM on many different platforms and that further research should be done on which platform would be best suited for virtual demonstrations and skills training.

4.2.2 Network-based approaches

A different approach, compared to using a shared file, is streaming the 3D data between software over a network. Sometimes the data is sent with a server acting as an intermediary. The server could be cloud-based or running locally on the user's computer. Implementations can be achieved by using different protocols and libraries, which will be demonstrated in the papers that have been selected in this study.

Atom: real-time networked streaming of 3D scenes

In the paper[27] by Green, the problem of pipeline efficiency is tackled using networked streaming of 3D scenes. The focus of the paper is the game development industry, where it is common that several different software are used. It is helpful to get quick feedback on how a model will look in the final game even though the designer is working in a different software. is referenced as an example. A goal of the implementation presented in the paper is to bypass cumbersome exporting steps every time a preview is desired. Green mentions that there are solutions which reduce the pipeline stages for iterative development, though the flaw is that the visualizations are localized. Having remote visualizations means that you can preview a scene on a tablet device while the designer is modifying it from a workstation elsewhere. The proposed solution is a plugin for Autodesk Maya which acts as a server streaming data to networked clients. The plugin should be standalone and not restricted to specific third-party client applications; a client could for example be a custom game engine.

Green sent out a short survey before the development to find out which components of the product were considered most important. The survey was sent to 20 people who were artists, programmers or technical artists. Results from the survey showed that speed of data transfer was more important than ease of use of the API. The most important data was geometry, materials, lights and rigging. The final product is a plugin called Atom which acts as an interface into scene data, similar to the observer design pattern. Green also created a client for development and documentation purposes. The implementation is built on Boost Asio, which is a cross-platform C++ library that can be used for networking. Asio was chosen due to "its superior asynchronous capabilities and widespread usage". Messages sent by Atom are serialized using Google's Protocol Buffers library. A message sent by Atom consists of a 32-bit signed integer denoting the size of the data followed by the raw data itself. When a server instance is started it will continuously listen for new connection requests. When a connection has been made the client can send a request to the server to send data. The implementation supports multiple threads and multiple connected clients. It is built on TCP since UDP is less reliable, even though it would be faster. The data is structured as displayed in figure 4.2.
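The length-prefixed framing used by Atom can be sketched as follows; the byte order is an assumption, since the paper only states that a 32-bit signed size precedes the raw data, and in Atom the payload itself would be a serialized Protocol Buffers message.

```python
import struct

def pack_message(payload: bytes) -> bytes:
    """Prefix the payload with its length as a 32-bit signed integer."""
    return struct.pack("!i", len(payload)) + payload  # network byte order assumed

def read_message(sock) -> bytes:
    """Read one length-prefixed message from a connected TCP socket."""
    (size,) = struct.unpack("!i", _read_exact(sock, 4))
    return _read_exact(sock, size)

def _read_exact(sock, n: int) -> bytes:
    data = b""
    while len(data) < n:
        chunk = sock.recv(n - len(data))
        if not chunk:
            raise ConnectionError("socket closed mid-message")
        data += chunk
    return data
```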


Figure 4.2: AtomDag node type hierarchy (a base Node type with subtypes Annotation, Camera, Curve, Light, Material, Mesh, Texture, Utility and Xform).

The nodes in the figure represent different Maya classes. The Mesh node is a representation of the Maya MFnMesh class. The base node called Node contains all essential data for an element in the scene. When Atom was completed the author sent out a feedback survey to artists and programmers to find out the usefulness of Atom. The survey showed that almost all participants thought it was useful if a client could be implemented in their desired game engine or rendering software. The participants also stated that it would reduce the number of steps required for artists to preview their work with in-game visuals during development. Most participants said that Atom would be most suitable for single-object previews, though it does support entire scenes. There was a desire to be able to use Atom for animations. This could be an area for improvement in the future. Since Atom is an API it could be extended at a later stage and there are endless possibilities of using this in combination with third-party software.

Synchronizing BIM data to VR
The architecture, engineering and construction industries have witnessed a steady increase in interest for VR to improve existing work processes. Allowing users to interact with digital objects in real time can lead to new discoveries of flaws and strengths in the processes. Enhancing the process of converting BIM data to VR is the purpose of the paper[12] by Du et al. Converting BIM data to VR displays, such as the HTC Vive headset, has proven to be difficult. It starts with an established design built on traditional platforms such as CAD or BIM. The design models are converted into VR displays instead of built from the ground up using game engines. The authors have identified problems with this conversion process. Firstly, the design-to-VR process is time consuming. It could involve rendering a finished BIM model, such as a Revit file, in a third-party graphics program such as 3D Studio Max into the FBX format. The file is then transferred into Unity for VR programming and the entire process could take hours to days to complete. Secondly, the process doesn't support real-time data synchronization. Changes are very common, and it is important that feedback is given in a timely manner to avoid high costs later. Having fast feedback loops means that problems can be fixed at an early stage. Thirdly, data integrity is difficult to maintain in the present design-to-VR method with frequent changes. Data needs to be synchronized between many parties and the system should support a variety of formats and different platforms. Du et al. have researched this area extensively and came to the conclusion that many solutions are at a conceptual level and do not show a complete implementation using real software. Studies have also shown that exporting BIM is not straightforward. The problems that occur can vary depending on the environment used. The authors sought to develop an innovative data transfer protocol called BVRS (BIM-VR Real-time Synchronization) to solve the problems mentioned before. BVRS uses a cloud-based infrastructure where the middle layer between two software consists of a database. Objects are stored in the database and are identified using an ElementID which is generated using the FBX file format. The database contains information about the position, material, model and other metadata. The same structure is then created within the game engine to ensure that the database elements correspond to internal elements in the game engine. Revit is used in this implementation for creating the CAD models and the authors mention both Unity and Unreal Engine as candidates for game engines that could be used for displaying the scene in VR. Changes in Revit will result in a request to the cloud server. The relevant data will be changed and marked. The game engine then scans over all ElementIDs dynamically and examines if changes are discovered. If any changes are found they are used to change the visualization. The reverse process can also be implemented where changes in the game engine are sent to the database, though this is not currently implemented. The authors have focused on four major areas to create a real-time BIM data updating function: model transfer and refinement, design, cloud server connections and player controller function development. Model transfer and refinement refers to optimizing the response time of the application and database as well as providing the best possible graphical experience for the user. Setting a limiting texture size as well as reusing the same texture for different models means that there will be fewer data packets sent. Occlusion culling is used to optimize the performance in VR. It disables rendering of objects when they are not seen by the camera. The introduction of advanced lighting and shadows provides a more immersive experience for the user. These features are most commonly found in game engines and not in CAD software. The implementation is heavily dependent on cloud server connection optimization. Sending the entire BIM project when the VR application makes a request can be very resource demanding when the project gets large. The elements are instead tagged when changes have been made to them. Only tagged elements are sent to the application when a request has been made. Hash tables are used to store ElementIDs and properties, and searching for properties is done in constant time for a given ID. The application goes through the elements' updated properties and updates the model. User interface design and player controller function development refer to how the user traverses and views the 3D scene, which isn't interesting in the context of this literature study. Results from user tests show that real-time synchronization of BIM data in VR is possible for different scenarios. BVRS reduced data processing and transmission delays compared to the example system irisVR, which took 10 s to show a design change whereas BVRS was almost real-time. For the future the authors have found that there needs to be improved efficiency for complex models. For a large number of interdependent objects the synchronization is affected since it needs to dynamically monitor changes to every object. There is a large search/scanning process every time data is accessed and requested. Using ElementIDs as the search key does not seem to be the most efficient approach according to the authors. Other tree-based search algorithms will be examined in the future to improve search efficiency.
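The tagging and hash-table lookup that Du et al. describe can be illustrated with a small sketch. The function names and the data layout below are invented for illustration only; the paper does not publish its code.

    # Elements are stored in a hash table keyed by ElementID, and only elements
    # tagged as changed are returned when the VR application polls for updates.
    elements = {}        # ElementID -> property dict (position, material, ...)
    changed_ids = set()  # ElementIDs tagged since the last synchronization

    def update_element(element_id: str, properties: dict) -> None:
        # A change on the BIM side updates the stored properties and tags the element.
        elements.setdefault(element_id, {}).update(properties)
        changed_ids.add(element_id)

    def pull_changes() -> dict:
        # The VR application receives only the tagged elements, after which the
        # tags are cleared; lookups by ElementID are constant time on average.
        changes = {eid: dict(elements[eid]) for eid in changed_ids}
        changed_ids.clear()
        return changes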

Two-way cooperation of 3D data in game engines
For collaborations using 3D data it can be useful to establish a two-way communication channel where modifications can be made to the model or scene on both sides. Kado and Hirasawa presented such a system in their paper[30]. The authors explain that file-based data coordination proves to be a tedious process which is not suited for the cyclic design and visualization process. Their proposed system has a mesh-based representation in a VR application with smooth transmission and reception. It also allows the user to edit elements that have parametric behaviors and geometrical interactions. As opposed to the system presented by Du et al. [12], the system by Kado and Hirasawa is not limited to updating element locations and modifying element types. Their implementation uses ArchiCAD 20 as the 3D CAD software, which sends information to Unreal Engine 4 using a relay database based on PostgreSQL. The collaborative system consists of the following steps. First, 3D data is sent from the 3D CAD software to the database where it is stored. Then the VR application checks the database for any updates and loads them. Lastly, the new data is processed and visualized in the VR application. Since there is a two-way connection there are also steps in the opposite direction, where changes to the model can be made in the VR application. These changes are then stored in the database and they will be reflected in the 3D CAD software. The structure can be seen in figure 4.3.

[Figure: ArchiCAD 20 exchanges the 3D model through its API with a PostgreSQL relay database, which in turn exchanges the VR scene with Unreal Engine 4 through Blueprints and its API.]

Figure 4.3: Structure of the two-way cooperation system.

The implementation supports both a geometric mesh-based representation and a parametric-based representation. The reason for this is that the conversion between the two representations can be costly as complex algorithms need to be implemented. The parametric-based representation is needed to be able to modify the 3D model using the VR application. Changes to the parameters are sent to the 3D CAD software, which computes and creates the corresponding mesh. Measures were taken by the authors to handle meshes efficiently by the use of a delta update function. The delta update function makes sure that only the necessary operations to the database are performed. When a duplicate chair is added to the 3D scene there isn't a need to send and store the same mesh twice since it can be reused. When a chair is moved only the position is updated and the entire mesh isn't sent again to the database. A hash function determines if the mesh has been altered in any way. Using the system on a 3D model comprised of 685 elements and more than 87,000 polygons showed promising results where the VR editing function could be used to modify the geometry of a window. By using parametric data the geometry of the wall was also changed to accommodate the smaller hole needed to house the window. Updating accompanying geometries resulted in delays of several seconds, which is an area of future improvement for the authors. The authors plan to implement similar systems with other architectural 3D CAD software and game engines to further explore the possibilities of using game engines as an interface to BIM data.
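The hash-based delta update can be sketched as follows. The paper does not specify which hash function is used, so the choice of SHA-256 and all names below are assumptions for illustration.

    import hashlib

    stored_mesh_hashes = {}  # element_id -> hash of the mesh last sent to the database

    def mesh_needs_upload(element_id: str, mesh_bytes: bytes) -> bool:
        # Hash the serialized mesh; only upload it if the hash differs from what
        # the database already holds. Moving an object changes only its position,
        # so the mesh hash stays the same and no mesh data is re-sent.
        digest = hashlib.sha256(mesh_bytes).hexdigest()
        if stored_mesh_hashes.get(element_id) == digest:
            return False
        stored_mesh_hashes[element_id] = digest
        return True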

Collaboration and interaction with BIM data
The paper[13] by Edwards et al. tries to address the issue of professional designers not being able to effectively interact and collaborate with users or clients on a functional level. The end users want to be able to explore and interact with buildings and models to give constructive feedback during the development process. The authors propose the use of a game engine combined with BIM to solve this problem. They have identified some uses of game engines in the AEC (Architecture/Engineering/Construction) and FM (Facilities Management) industries. These include simulating the evacuation of the population of a building, providing interactive visualization of a structure, and teaching students about construction site safety. The system described has a multi-layered architecture consisting of a BIM environment, a data transmission element, a game engine environment and a client end. The BIM environment is based on Autodesk Revit, which provides building information as well as an API to access that information. Revit was selected since it was fully compatible with BIM standards and has good third-party support. This environment communicates with the data transmission element, which contains the database and FBX converter. It generates semantic and geometric data and stores it in the database, which has a two-way connection with the server game that is a part of the game engine environment. The game engine environment consists of a Unity game component which feeds data into available clients which have connected through an IP address. Unity was chosen since it has a simple object-oriented and editor-based design system. You can create executables which run on Windows and Mac. You can also create web player versions of games. At the client end, end users and designers can interact with the models via the Unity game engine. Many different input and output devices are supported, such as Windows and Mac operating systems that use monitors with keyboard and mouse, mobile platforms using iOS or Android with touch screens, or web-based environments that allow the user to connect to the server through their web browser. Figure 4.4 shows all the components and how they interact with each other.

[Figure: layered architecture with a client end (user/designer), a game engine environment (Unity client games connected over the network to a server game), a data transmission element (Autodesk plug-in, FBX converter, game collaboration data and a 'micro' web server) and a BIM environment (the Revit main application and the Revit API).]

Figure 4.4: Architecture of the collaborative system.

The system starts up by executing the plugin which exports the BIM data to the FBX format. It is then converted to the OBJ format, which can be sent to the server game instance when a request is made. The server game instance can also request parametric properties, which are sent separately. When a game client has connected to the server game it can request the model and parametric data from the server. When objects are modified in the client the data is sent back to the server, which sends the objects to the plugin. The authors found that there were some issues with sending data back to the plugin, which could be a topic for future improvements. The separation of model and parametric data was also an issue since it didn't provide an elegant solution. Using the IFC format would allow storing both geometric data and parametric data in the same format, which is something that isn't supported in the OBJ format. A technical improvement the authors identified was to implement Universal Plug and Play (UPnP) to reduce the user configuration by allowing programs to negotiate with a firewall in a network.

4.3 File-based implementation

The implementation of the shared file system consists of the exporting software CET Designer and the rendering software Unreal Engine. How they connect to each other and the technologies used are explained in this section.


4.3.1 File format
The file format glTF was chosen for the implementation since it is supported by Unreal Engine and it offers many new capabilities when it comes to rendering models compared to older formats. glTF uses PBR, which provides realistic materials with different properties. PBR is described in section 2.3.2. There was a desire from Configura to use the glTF format since support is currently being developed for the CET Designer software. In the study[34] by Lee et al. the file formats OBJ, FBX, STL and glTF are presented with comparisons of performance and structure. They find that glTF has more 3D attributes compared to the others and performs better for most of their test cases. They also expect that the glTF format will be used in more applications in the future. The problem with using glTF is that not a lot of software support it. To be able to make a correct evaluation of this implementation there needs to be support for the more commonly used format FBX as well. This isn't an issue since both Unreal Engine and CET Designer support FBX and it is simple to configure the implementation to support both file types.
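To make the structure of glTF more concrete, the sketch below builds a pared-down glTF 2.0 document with a single node, a mesh and a PBR metallic-roughness material. The buffers and accessors that carry the actual geometry are omitted, so this is not a complete loadable file, and the material values are arbitrary examples rather than anything used in the implementation.

    import json

    gltf_document = {
        "asset": {"version": "2.0"},
        "scene": 0,
        "scenes": [{"nodes": [0]}],
        "nodes": [{"mesh": 0, "translation": [0.0, 0.0, 0.0]}],
        # The primitive references accessor 0, which would normally point into a
        # binary buffer with vertex data; that part is left out of the sketch.
        "meshes": [{"primitives": [{"attributes": {"POSITION": 0}, "material": 0}]}],
        "materials": [{
            "name": "painted_metal",
            "pbrMetallicRoughness": {
                "baseColorFactor": [0.8, 0.1, 0.1, 1.0],
                "metallicFactor": 0.9,
                "roughnessFactor": 0.3,
            },
        }],
    }

    print(json.dumps(gltf_document, indent=2))

The JSON scene description plus PBR material parameters is what gives glTF the rendering capabilities referred to above, compared to purely geometric formats such as OBJ.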

4.3.2 CET Designer
The exported file comes directly from CET Designer and the code which performs the export is written in the CM programming language. CM is a strongly typed object-oriented programming language with extensible syntax and support for incremental development[9]. The development of the language started as there was a dissatisfaction with C++. The dissatisfaction was mostly caused by the long work cycles resulting from the need to restart and recompile the program after changes to the source code. A garbage collector is used to reclaim memory that is allocated by objects and other values. CM source files are compiled to machine code before they are executed, though this doesn't happen immediately. The source code is translated into different intermediate formats on demand and is only compiled to machine code when the compiler cannot delay it any further. This means that the source code is directly tied into the execution of the program. Files can be recompiled at runtime, which enables an incremental development style. For this implementation it meant that during development it was possible to make changes to the export process and have it up and running instantly without having to restart the CET Designer software. In CET Designer most objects are known as Snappers, which inherit from an Object base class. Snappers can be seen as 3D elements with custom properties and functionality. The Snapper base class is extended when more functionality is needed, which is the traditional object-oriented workflow. Each Snapper in the scene holds information about its position, materials and meshes. The export function in CET Designer makes sure each Snapper is included in the exported file. The file is exported to a specific path on the computer, which could be either statically defined or determined by the user through a save dialog. A new file is generated every time there are any changes to the scene. Changes are defined as any modification to a Snapper on the drawing area. This could include moving a Snapper, deleting a Snapper, changing Snapper properties or altering the material used on a Snapper. All of this functionality was gathered into an extension (also known as a plugin) which simply initiated the process of creating a shared file and keeping it up to date. The extension can be seen in figure 4.5. There are buttons which launch the Unreal Engine 4 editor, manually export the file to the specified location, and also a button which allows the user to set the file location through a dialog window.
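Since the CM exporter itself cannot be reproduced here, the following Python sketch only illustrates the shape of the export step described above: every Snapper contributes its position, material and mesh data, and a fresh file is written whenever the scene changes. All names are placeholders and the real exporter writes glTF/FBX rather than JSON.

    import json

    def export_scene(snappers, path: str) -> None:
        # Each Snapper contributes its position, materials and meshes to the
        # shared file that the rendering software watches for updates.
        data = [
            {
                "position": snapper.position,
                "materials": snapper.materials,
                "meshes": snapper.meshes,
            }
            for snapper in snappers
        ]
        with open(path, "w") as f:
            json.dump(data, f)

    def on_scene_changed(snappers, path: str) -> None:
        # Called after any modification (move, delete, property or material
        # change); a new file is written so the rendering side can reimport it.
        export_scene(snappers, path)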

4.3.3 Unreal Engine
Unreal Engine is used as the rendering software for this implementation. It is used to visualize the scene which has been exported from CET Designer as a glTF file. The implementation on the rendering software side consists of an automatic file import system which continuously looks for updates in the shared file. Two approaches have been identified for creating a workflow


where a file is automatically imported and updated in Unreal Engine. The first approach involves setting up the Auto Reimport feature, which is enabled by default in Unreal Engine. The other approach is to create a plugin which the user can download from the Unreal Engine Marketplace or directly from the developer. The plugin can be developed using Blueprints Visual Scripting or C++. Both approaches are explained in more detail below.

Figure 4.5: CET Designer extension for the Unreal Engine export.

Auto Reimport
Auto Reimport is an Unreal Engine feature which allows the user to monitor a certain folder for changes in source content files[18]. This feature was described in the paper by Engberg and Eriksson [14]. The user modifies settings in the editor which define how the feature works. The available settings are: Directories to Monitor, Import Threshold Time, Auto Create Assets, Auto Delete Assets, Detect Changes On Startup and Prompt Before Action. Under the Directories to Monitor setting the user can determine which directory should be monitored and which folder it should map to in the internal Unreal file structure. Import Threshold Time defines the time it takes before a change is reflected in the automatic reimport. Auto Create Assets and Auto Delete Assets determine if newly added source files should automatically create or delete Unreal Engine assets. Checking the setting Detect Changes On Startup means that Unreal Engine will update assets on restart if any changes have occurred in the monitored folders. Finally, the setting Prompt Before Action allows the user to get a prompt whenever new changes are to be imported. For our implementation it would suffice to add our export folder to the setting Directories to Monitor. This is the folder which CET Designer exported the glTF file to in the file system. The Import Threshold Time should be set to 0 for the quickest update time and the rest of the settings should be checked, except for Prompt Before Action. How the settings are defined for this implementation can be seen in figure 4.6.

Figure 4.6: Unreal Engine Auto Reimport settings.
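The general pattern behind Auto Reimport, polling a monitored location and triggering a reimport when the source file changes, can be sketched outside of Unreal as follows. This is not Unreal's actual implementation (which monitors whole directories inside the editor); the names and the polling interval are assumptions.

    import os
    import time

    def watch_for_reimport(path: str, reimport, threshold_s: float = 0.0) -> None:
        # Poll the exported file's modification time and trigger a reimport when
        # it changes; threshold_s corresponds to the Import Threshold Time setting.
        last_mtime = None
        while True:
            try:
                mtime = os.path.getmtime(path)
            except FileNotFoundError:
                mtime = None
            if mtime is not None and mtime != last_mtime:
                time.sleep(threshold_s)
                reimport(path)
                last_mtime = mtime
            time.sleep(0.5)

In the actual implementation this role is filled by the Auto Reimport feature itself, configured as in figure 4.6.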

Plugin development
Creating a new plugin for Unreal Engine is done by simply opening up the Unreal Editor's Plugin Browser and selecting from a few templates[21]. The development can then be done by using Blueprints or writing C++ code in an editor such as Visual Studio. Blueprints is a Visual Scripting system which uses a visual interface where nodes are created and connected to each other. An example of Blueprints scripting is shown in figure 4.7.

Figure 4.7: Blueprints for importing an FBX file.

Nodes are representations of C++ concepts such as variables, functions, objects and classes. The advantages of using Blueprints are faster creation, faster iteration and flexible editing. You can easily connect inputs and outputs to different nodes and create entire programs. The advantages that come from instead using C++ are faster runtime performance, more data control and easier version control. Using both Blueprints and C++ for Unreal Engine development is the best approach as it provides a good balance. While using the existing Auto Reimport feature would be the quickest way to get an implementation up and running, it doesn't provide the easiest workflow for the user. The user needs to go into the settings and find the correct output folder as well as make selections for the different check boxes. To make this as simple as possible a plugin was developed which uses predefined values, where the user only has to click on a single button to initiate the synchronization with CET Designer. Thus, it is ensured that only a minimal amount of input is needed from the user in both software. This doesn't have a significant impact on performance; it only affects the user experience. A button is placed in the Unreal Engine editor menu. This button uses Blueprints to read the file from a URL in the file system and import it into the scene. It continuously reloads the file when changes have been made. This builds upon existing code from the Auto Reimport feature provided by Epic Games.

4.3.4 Evaluation
To answer the second research question this thesis evaluates whether or not an automated file-based import-export workflow is viable for use with different types of 3D files. The evaluation is done in two phases. In the first phase the import and export loading times are evaluated for the software used in the prestudy. In the second phase the implementation described above is evaluated with regard to limitations when it comes to file formats and file sizes.

Testing hardware
It is important to mention the specifications of the computer used during testing since they directly affect the performance of the software used. In this instance the tests were done on a computer with high-level specifications, which was tested using UserBenchmark4. Results from this benchmark and the specifications can be seen in table 4.3. The percentages refer to how the individual component performs compared to the average component from other user benchmarks. UserBenchmark also provides a rating for how well the system is suited for gaming, desktop work and as a workstation. The results from this test give the following scores: Game - 105%, Desktop - 95%, Workstation - 89%. During the testing most software running in the background was closed down to minimize the interference.

CPU   Intel Core i5 8600k                    92.2%
GPU   Nvidia GTX 1080                        112.6%
SSD   Samsung 960 Evo NVMe PCIe M.2 250GB    185%
RAM   G.SKILL F4 DDR4 2400 C15 2x8GB         73%
MBD   Asus ROG STRIX Z370-E GAMING           -
Table 4.3: Specifications for computer used during development and testing.

4https://www.userbenchmark.com/

First phase of testing
Enscape was excluded from the tests since it is a plugin which doesn't provide the import-export functionality. There were four different files used for this testing. The files called Kitchen[16] and House[10] came from the SketchUp 3D Warehouse and were downloaded as Collada files and then imported to Blender. In Blender the models were then exported as FBX files. The files called Industrial and Office were created in CET Designer using existing Snappers in the built-in extensions. The files were saved as CET Designer drawings and then exported as FBX files. Information about the files was gathered in Blender and compiled in table 4.4. It was not possible to use glTF since most software didn't support it. There was unofficial support for glTF in some software through plugins or extensions, though it might not provide a good comparison.

Name         File size (kB)   Objects   Vertices    Edges       Faces/Triangles
House        596              203       17,197      21,292      9,556
Industrial   38,088           2,857     1,566,021   2,469,690   1,126,099
Kitchen      3,824            497       165,533     241,648     122,848
Office       18.8             796       192,005     354,719     181,145
Table 4.4: Specifications for files used during testing.

Testing was performed by importing the FBX file into the software and then exporting it to a new location using the menus in the software. Both stages were timed and the results were compiled into table 4.5. Times are shown in the format Import time/Export time. Tests were repeated five times and the values in the table represent the mean values of those five runs. For Twinmotion and Lumion there are no export times since the software do not support that functionality. These software are most commonly used as rendering engines and provide the final step for designers.

                House        Industrial    Kitchen      Office
Blender         0.91/1.36    27.22/16.19   1.95/2.32    5.45/3.48
CET Designer    2.44/1.27    58.49/3.42    6.27/1.03    6.36/1.41
Lumion          2.01/-       62.02/-       3.44/-       15.24/-
Twinmotion      2.29/-       17.14/-       6.22/-       4.63/-
Unity           1.05/1.2     25.9/76.98    3.41/3.98    4.69/6.55
Unreal Engine   32.77/1.61   188.55/3.68   56.92/1.65   362.01/1.55
Table 4.5: Import/export times in seconds for the files and software used.

There were large differences in both import and export times depending on which file and software was used. Unreal Engine was shown to be very inefficient in loading files as it took a lot of time to compile shaders. The Industrial file gave the slowest results in general, which makes sense since it is the largest file and has the highest number of objects.

Second phase of testing
The second phase of testing involves the implementation developed using CET Designer and Unreal Engine. In this test both FBX and glTF files are used to compare how fast the export and loading are compared to table 4.5 and to see which file format is best suited for the implementation. The addition of glTF support in CET Designer is very recent, so there hasn't been much optimization yet. A smaller scene was chosen which was created directly in CET Designer. This scene is shown in figure 4.8. The smaller scene is used due to the results we can see in table 4.5; it would take too long to use any of those files, and since we are only interested in comparing the formats FBX and glTF it should not affect the overall results. From the testing in the first stage it is clear that Unreal Engine didn't handle large file sizes particularly well. The tests were performed by clicking the export button in the CET Designer extension and waiting for the file to load completely in Unreal Engine. When the shaders had completely compiled in Unreal Engine the process had finished. The next step was to determine how fast Unreal Engine would reload a file when it had been changed. This was tested by moving one object in the CET Designer drawing and rewriting the FBX/glTF file. Unreal Engine then reimported the file and took care of the changes that had been made. When the shaders had completely compiled in Unreal Engine the process had finished. Results from testing the implementation are shown in table 4.6.

Figure 4.8: Basic room created in CET Designer.

Format   File size (kB)   Initial load time (s)   Reload time (s)
FBX      12,265           128.06                  10.31
glTF     14,291           64.4                    50.62
Table 4.6: Testing the implementation with FBX and glTF files.

From these results there is a minimal difference in size between the two file formats. The load times, however, differ significantly: FBX had a high initial load time compared to glTF, but it was faster to reload the FBX file when changes had been made to a single object. The results from this section are discussed further in chapter 5.

4.4 Network-based system

The network-based system is an effort to apply findings from the literature study to a system using CET Designer as the drawing software. A previous thesis[41] at Configura tried to implement a similar system between CET Designer and Revit. This system involved using Remote Procedure Calls to create a direct link on a local machine. The author found that there were issues with using the Revit API since it had some limitations which affected functionality in the system. There was also a problem getting optimal performance since CET Designer doesn't let multiple threads work with its models. At the end of the paper the author suggested some improvements for future projects. One of the improvements was to use a shared database with which both software can interact. Using a database/server as a middle layer between the software means that the time to transfer data will be longer compared to a direct link. With a direct link the system can simply send the updated model to the rendering software, which loads the changes and updates the visualization. With a server in the middle, the changes need to be processed and the database updated before the changes are passed on. It does however mean that the implementation can be scaled up to handle more clients. This was demonstrated in several papers[13][27][12] in the literature study. This route was chosen for the system design described in this section.

4.4.1 Architecture
The architecture of the proposed system is described in this section. It includes the overall structure as well as data flow graphs and data format descriptions. The simplified structure of the system can be seen in figure 4.9.

[Figure: CET Designer and the client software each expose their 3D scene through an API, and both communicate with an application server that stores the scene in a database.]

Figure 4.9: Architecture of the network-based system for CET Designer.

Instead of a two-tier architecture, which is used in the paper[30] by Kado and Hirasawa, an application server was added which handles the communication between clients and the database. A three-tier system is slower[11]; however, it opens up more possibilities in terms of scaling up the system and improving reliability. From a data perspective the system needs to make sure that the correct clients are writing to the database and that changes are properly propagated through the system before any changes are made. A three-tier system was used in papers[27][12] presented in the literature study, which showed great performance results. The clients in this system could be anything from game engines to rendering software and even websites. Any software that is capable of communicating with the server can request information even if it does not possess the ability to visualize it. When using game engines such as Unity and Unreal Engine there will be a need to create custom plugins/extensions which handle the connection to the database and make sure that the requested data is converted to the proper local assets. Since software use different formats to store and manage their assets, some configuration will be needed before the link works properly. There is no format or system which all rendering software support, so the focus is to keep this system simple so that development remains as easy as possible. Taking Unreal Engine as an example, Blueprints Visual Scripting can be used as described in section 4.3.3 to quickly get a client application up and running. Since CET Designer works with a lot of parametric data, the relations between objects (Snappers) need to be kept when changes are made. An example of this is that when a table is stretched length-wise the chairs should move at the same time so they stay uniformly spaced out. Another example given by Kado and Hirasawa[30] is related to how a wall will accommodate a window model by creating a hole in the mesh. When moving or changing the size of the window the hole needs to change as well to make sure the scene looks realistic. Figure 4.10 describes how the components of the system communicate with each other to keep the scenes up to date on the server and all the connected clients. An observer pattern is used to make sure that connected clients receive updates when changes have been made.

[Figure: sequence chart in which CET Designer and the clients connect to the server as observers. CET Designer sends its scene data to the server, which forwards it to the connected clients. When a model is modified on either side, the changes are sent to the server, the server updates its scene data, and the updated scene data is pushed out to the other observers.]

Figure 4.10: Communication chart for the system.
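A minimal observer-pattern sketch of the flow in figure 4.10 is shown below. The class and method names are invented for illustration; the design in this thesis does not prescribe a particular implementation.

    class SceneServer:
        # Clients (including CET Designer) register as observers; any change
        # pushed to the server is broadcast to every other connected observer.
        def __init__(self):
            self.scene = {}        # snapper_id -> scene data
            self.observers = []    # connected observers

        def connect(self, observer) -> None:
            self.observers.append(observer)
            observer.receive_scene(self.scene)   # new observers get the full scene

        def send_changes(self, sender, changes: dict) -> None:
            # Update the server-side scene data, then notify everyone but the sender.
            self.scene.update(changes)
            for observer in self.observers:
                if observer is not sender:
                    observer.receive_scene(changes)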

4.4.2 Database format
When it comes to the database format, the design should be efficient while containing all the information the system needs. Adding more functionality in the future should be supported, where information such as prices, information about materials used and assembly instructions could be of interest in the areas in which Configura is present. The format should be based on the internal components in CET Designer since scenes and models should be sent from CET Designer to the server as quickly as possible. Conversion between the data stored on the server and the internal structure in the client should be done in each client to offload computation from the server. The important data identified from the literature study is position, material, texture and mesh. Additional information such as camera position, lighting information and other visual effects could also be added. Constantly updating the camera position could be very costly and there isn't a need to have the cameras in both software synchronized. This could even be obtrusive for the users. Lighting information is difficult to implement as it will vary depending on the software. Du et al. [12] mentioned this issue in their paper. Usually BIM models don't contain lighting information, so it's up to the user to place light fixtures in the rendering engine manually to suit the model used. Building on the Snapper structure in CET Designer, each Snapper is assigned a unique ID which is stored in the database. The Snappers will be individual nodes which together make up the scene. Each node contains information about its own position, material, texture and mesh. The latter three components will be stored separately and accessed through unique IDs. Since materials, textures and meshes can be reused for different Snappers they shouldn't be stored more than once. This format is similar to the ones used by Green [27] and Du et al. [12]. Using a relational database, objects are stored according to figure 4.11.

[Figure: each Snapper record holds a Snapper ID, a position and references to a Material ID, Texture ID and Mesh ID; the material, texture and mesh data are stored in separate records keyed by those IDs.]

Figure 4.11: Format for the network design data.

Texture data can be in the form of JPG or PNG files and the materials define the physical properties of an object. An example of this comes from the glTF reference guide[31], where data is stored in the JSON format which describes the structure of a scene containing 3D models. A further study in this area could compare different types of data formats and see which is fastest with rendering engines. In the paper[29] by Hatledal both JSON and Google Protocol Buffers (GPB) were compared using Firefox, Chrome and Unity. GPB was shown to be significantly faster in Unity.
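The relational layout of figure 4.11 could be expressed roughly as in the sketch below, here using SQLite purely for illustration. The table and column names, key types and the choice of storing serialized data as blobs are all assumptions; the thesis only specifies that Snappers reference shared material, texture and mesh records by ID so that repeated assets are stored once.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE meshes    (mesh_id     TEXT PRIMARY KEY, mesh_data     BLOB);
        CREATE TABLE textures  (texture_id  TEXT PRIMARY KEY, texture_data  BLOB);
        CREATE TABLE materials (material_id TEXT PRIMARY KEY, material_data TEXT,
                                texture_id  TEXT REFERENCES textures(texture_id));
        CREATE TABLE snappers  (snapper_id  TEXT PRIMARY KEY,
                                position    TEXT,
                                material_id TEXT REFERENCES materials(material_id),
                                mesh_id     TEXT REFERENCES meshes(mesh_id));
    """)

Because materials, textures and meshes live in their own tables, a duplicated chair only adds a new row to the snappers table while its mesh and texture rows are reused.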

4.4.3 Protocols and technologies
Network protocols and technologies were seldom referenced in the papers from the literature study. Green [27] mentioned using TCP instead of UDP to get better reliability with the drawback of worse performance. A library which can handle TCP requests as well as IPC and UDP is ZeroMQ5. Since there is existing support in CET Designer for streaming data using ZeroMQ it would be a good candidate for this system. There is support for ZeroMQ in Unity6, Unreal Engine7 and Blender8. The library can be used in many programming languages such as C, C++, Python and Java. Hatledal[29] used ZeroMQ to create a network interface for a real-time simulation framework. It proved that the library could be used to stream 3D data to Unity with good performance results.
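As an illustration of how scene updates could be pushed over ZeroMQ, the sketch below uses the pyzmq binding and a publish/subscribe socket pair. The port number, topic name and message layout are made up for the example; the proposed system does not mandate a specific messaging pattern.

    import zmq

    context = zmq.Context()
    publisher = context.socket(zmq.PUB)
    publisher.bind("tcp://*:5556")

    def publish_update(snapper_id: str, payload: bytes) -> None:
        # Multipart message: a topic frame used for subscription filtering,
        # followed by the Snapper ID and the serialized scene data.
        publisher.send_multipart([b"scene.update", snapper_id.encode(), payload])

    # A client would mirror this with a SUB socket:
    #   subscriber = context.socket(zmq.SUB)
    #   subscriber.connect("tcp://server-address:5556")
    #   subscriber.setsockopt(zmq.SUBSCRIBE, b"scene.update")
    #   topic, snapper_id, payload = subscriber.recv_multipart()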

4.4.4 Optimization
From observations of the papers in section 3.2, it is important to employ optimizations of the 3D data and the system. One optimization could be to let the user define the level of detail of the models that are to be stored in the database. Perhaps there aren't strict requirements on the client for high-quality models. The system could for instance reduce the models' meshes and compress the textures to lower the file size. The user could determine the level of detail using a setting in the extension where the possible options are "Low", "Medium", "High" and "Super". This option already exists in CET Designer for local drawings, so it would be straightforward to implement in an extension. Du et al. [12] mentioned that this optimization could be important to improve the response time between server and client. Another optimization, used by Kado and Hirasawa [30], is to only update the database when meshes have been changed. This can be applied to textures as well, using hash functions and timestamps in the same way. When a new Snapper is added to the database, the system checks if the mesh/texture already exists; in that case it can be reused and the client doesn't need to download the same mesh/texture again. If the mesh or texture has been altered for a certain Snapper it will be uploaded to the database.

5https://zeromq.org/
6https://github.com/valkjsaaa/Unity-ZeroMQ-Example
7https://github.com/hdhauk/UnrealZeroMQ
8https://github.com/NumesSanguis/Blender-ZMQ-add-on

5 Discussion

This chapter contains the discussion of the results and method. There is also a section detailing how the work applies in a wider context.

5.1 Results

The results consisted of four main sections: prestudy, literature study, file-based implementation and network-based system. Findings from each section will be presented here.

5.1.1 Prestudy
The prestudy looked at different software used for architectural visualization and game development to find out which would be suitable for collaborations with CET Designer and other drawing software. It looked at cost, available file formats and access to source code as metrics for this evaluation. As detailed later in the implementation, this was not enough to get a complete grasp of the software and their capabilities. How fast the software imported and exported files of varying sizes would have been a good complement to this. Given the results from the evaluation, another software such as Blender could have been picked. It is ultimately a trade-off between several factors which the developer has to weigh themselves. Unreal Engine was chosen since it was known and used previously by developers at Configura and there is a large user forum where most questions about development can be answered. Requiring the software to be open source might have been too strict a requirement for this thesis. The thesis could have looked further into the software's APIs and tools for development to get a fairer representation of these software and what developers can do with them. Going deeper into plugin/extension development for software such as Lumion, Twinmotion, Unity and Revit could be a basis for future work, as there wasn't enough time to do it in this paper.

5.1.2 Literature study
In the literature study, papers were divided into either file-based approaches or streaming-based approaches. There are some general conclusions that can be drawn from the file-based approaches. They all seemed to be in agreement that there is a need for simplifying the process of transferring 3D data between software. There were a few different ways this problem was tackled in the included papers. Tran Thi[47] focused on how 3D models should be properly prepared, whereas the other papers by Engberg and Eriksson[14], Fält[15] and Bille et al.[6] turned their attention towards automating this process. The papers used in this section all mentioned difficulties in converting 3D models between software, where intermediate steps had to be taken between for example drawing software such as Revit and rendering software such as Unity. Difficulties with textures not being applied in certain software caused additional steps where software such as 3ds Max needed to be used in order to combine both textures and meshes. Papers relating to file sharing were more difficult to find, which could be due to a lacking interest in this area of research. The results from the implementation in this thesis could shed light on why sharing files between software is not an effective approach. This will be discussed later on in this section. Out of the four papers referenced, two of them used Revit and Unity and three of them used 3ds Max. In addition, one thesis used Unreal Engine in combination with Autodesk Fusion 360. A greater spread of software used might have produced different results. The prestudy and implementation evaluation showed that the results can vary greatly depending on which software is used. The software dictates whether or not a certain file format can be used and also how fast an implementation can be. Two software might not be compatible either, as they choose to handle 3D data in different ways. On the other hand, some software do not provide outsiders with the proper tools to build complete plugins or extensions. Therefore, it might be a limitation which has caused most research to be skewed towards the software listed in these papers. When it comes to streaming 3D data there are many papers which can be found, though they might not always be relevant to a certain area of research. These papers were mostly focused on synchronizing data between 3D software for the architectural visualization and space planning industries. Sources had to be discarded if they dealt with topics that strayed away from this area or if streaming 3D data was only a byproduct of a different area of research. The papers selected for this study had different approaches to streaming 3D content and they all provided good findings which can be used for future research. Game engines were used in three of the four papers, whereas Green[27] chose to build his own rendering client. His paper was heavily focused on users getting fast feedback on their models to see how they will look in the final game. He used short surveys to get feedback on his implementation, which streamed data from Autodesk Maya to networked clients. A large part of the paper explained the node structure and how it ties into Maya's internal structure. The papers by Du et al.[12] and Kado and Hirasawa[30] were focused on providing a VR interface to the user where modifications can be made to the model. The implementations thoroughly detailed how optimizations can be made to make the latency as small as possible and provide a lag-free user experience. Their implementations were designed to work with a variety of different software, which was also mentioned as future improvements. The final paper[13] by Edwards et al. used a unique approach where users either interacted with a client game or a server game. It also supported web player versions of the game.
The implementation used a combination of the FBX and OBJ file format but found that IFC would have been a better candidate due to it allowing for storing both geometric and parametric data. The other three papers created their own format for storing the 3D data which shows that both methods are possible.

5.1.3 File-based implementation
The implementation section described a file-based approach using CET Designer and Unreal Engine. Despite producing a small amount of code, an automated workflow was set up which allowed the user a hands-free export and import between software. Since Unreal Engine already had a lot of functionality built in there wasn't a lot of work on that end. Using another software such as Unity or Blender might have resulted in a longer and more difficult implementation. Since no papers were found to support or refute this claim, it would be interesting to find out more about this. Compared to a streaming-based approach this method is significantly easier to set up. The problems arose from picking the correct format and creating models which were well suited for evaluation. The initial aim was to test using glTF but it turned out not to be feasible since there was poor support in other software. This meant that changes had to be made during the results phase. As stated previously, the software should have been tested during the prestudy as well to give a more comprehensive view of their capabilities and downsides. The results from the evaluation show that there is a difference when using different files of varying sizes. It can depend on the number of objects and triangles, how large the texture files are and how many textures are used. There is also a difference between using FBX and glTF. glTF was only just added in CET Designer and it's in an experimental stage. This could have affected the results negatively with regard to the time it takes to create the glTF file as well as how it conforms with the file standards. Many different established software don't even support glTF yet, which shows that it might be in an early stage of adoption. Loading files in Unreal took a lot of time due to "Compiling shaders". This process has a lower priority to not interfere with other processes. According to user forums there were fixes for this, however the fixes tried didn't show an improvement to the load time. During loading the CPU and RAM usage spiked to above 90%. When very large files were loaded it wasn't possible to interact with the Unreal editor, and CET Designer was also less responsive. In the future these tests should be revisited with proper knowledge about the software and how they are used optimally. The differences in results come from how the software handle the imported files. In the case of CET Designer the models are turned into assets in the form of Snappers. Snappers are objects which contain not only meshes and textures but also information about how they relate to other Snappers. The same thing occurred in Unity and Unreal Engine, where the models were turned into internal assets for later use as game objects. This does not seem to be the case for Lumion, which loaded the models much quicker. In the paper[39] by Nopachinda and Ergan they found that game engines were very slow in importing files compared to the time it took to render them. Using three different geometric granularities (Coarse, Medium, Fine) in the files, they found that the import times grew from 12 minutes to almost an hour whereas the rendering time reached a peak at 7 minutes. Evaluating this further for different software and using files varying in geometric and material detail would be useful for future work in this area.

5.1.4 Network-based system
The sketch of the network-based system used lessons from the literature study. It combined useful architecture structures, database formats and optimizations. It could have been more detailed and become a larger part of this thesis. To give a comprehensive view of the entire 3D data synchronization area of research, the thesis was divided into these four parts. There wasn't enough time to put this system into actual code, though that could be the basis for future work. The papers which form a basis for the network-based system needed to be more detailed in how the systems were built in order to replicate them successfully. There are missing parts such as which programming languages, libraries and protocols were used. This leads to a sketch which is less detailed than a fully functional implementation. In general, there was a lack of detailed information in the papers, which might have been mitigated by looking at code bases instead. The papers didn't include links to GitHub repositories or sites where the source code can be audited. Combining the literature study with a code study could have proved helpful when getting down into the finer-grained details of the system. Searching for plugin implementations for rendering engines on GitHub and going through the code would be a great complement to the literature study.

5.2 Method

The method was divided into the same sections as the results chapter. This structure allowed for easing into the subject and getting more knowledge about this area as the work progressed.


Having these four parts meant that there was more flexibility when it comes to the size of the paper, as certain parts could be removed if there wasn't enough time. The implementation was kept small to allow for work on a network-based system which could be the basis for future work.

5.2.1 Prestudy
Improvements could be made to the method used. Starting with the prestudy, there was a lack of input from the community. Gathering feedback from Configura's users could be interesting to find out which rendering software they prefer to use and which they have used in the past. This was an idea that came up late in the project and wouldn't have been possible to complete in the given time frame. This information should also have been taken advantage of in a useful way, and the direction of this paper didn't align with performing such a study. Another improvement is that the paper could have gone more in-depth on what opportunities the different software provide for developers and what type of development they support. This information could be useful for future work in this area.

5.2.2 Literature study
The use of a literature study provided great insights into what has been done previously in this area and what has been suggested for future work. It was difficult to sort through all the papers, and a more clearly defined list of prerequisites should have been set up beforehand, such as only including papers released in the last five years, papers that have more than one author and papers which have been published in esteemed journals. The papers seemed to be very focused on BIM and architectural visualization in general. This could have skewed the general findings since it is only a part of the use case for 3D data and 3D rendering software. When it comes to replicability it is simple to find which sources have been used in the references chapter, though if you only use the same search words and follow the method described you could end up using other sources and coming to other conclusions. Many of the papers used were published in journals and were well cited, though the validity of some sources could be criticized since they are part of master's thesis work and might not be sufficiently peer-reviewed. Since so many papers were used and the findings seem to align, the validity shouldn't be an issue.

5.2.3 File-based implementation
The method of the file-based implementation is based on creating a simple workflow using a minimal amount of code. Since CET Designer is used, which isn't open source, it would be difficult to replicate this method. However, the general concept should be replicable in many different software using the explanations in this thesis. Unreal Engine is well documented and has an active forum where developers can get help with all kinds of problems. The implementation could have been tested and evaluated by developers or end users to see if it was useful and sufficiently simple to use. This was discarded since the implementation could not be fully packaged in time to allow for user testing. User testing was not the focus of this thesis, as this work was exploratory and a proof of concept for future work as explained in the delimitation section 1.4. The performance was tested instead of user metrics, since it would not be interesting to keep working on this type of implementation if it wasn't viable and there were more optimal solutions such as a network-based approach.

5.2.4 Network-based system There wasn’t enough information in the method to get a proper grasp on how the network- based system worked. This is partly due to the fact that it is dependent on the research found in the literature study which is described at a later stage in the paper. If there was more time

35 5.3. The work in a wider context available the network-based system could have been implemented. To not go through with that was a decision made between the author and one of the supervisors at Configura. To create a complete solution would take too long and Configura wanted to focus on creating a foundation for future work instead. In comparison, a similar system by Lin et al. [35] involved connecting BIM models to a database that sent 3D data to a VR display. This work took 4 developers a total of 6 months to create. It is a much larger implementation compared to what would have been created so it can be seen as a high estimation for this kind of project. This thesis previously mentioned the project by Green [27] where he created a system for streaming 3D data from Maya. That paper was also part of a master thesis though he didn’t include a pre-study survey, literature study and implementation of a file-based system. He also didn’t stream the data to a 3rd party software. This could have led to some additional issues.

5.3 The work in a wider context

The focus of this thesis is to find ways of allowing collaboration between 3D software. Collaboration can be very good for both companies and users since it allows software to be more specialized within a certain area. It is in the company's interest to keep the users satisfied, and that goes hand in hand with allowing them to produce content freely. Being able to export and import your drawings and designs to many different software means that you can take advantage of different features that might not be available if you were locked to a single software. As seen in table 4.1, the software have different usage and supported file formats. Some might support features such as virtual reality, augmented reality, material editing, animations and rigging whereas others don't. Providing easy-to-access plugins which enable direct links means that users can get instant feedback on changes to the model or drawing. This also opens up for collaborations between users, where two or more computers can be linked together and perform changes in unison. This could greatly improve the efficiency of the users and could be an interesting foundation for a new study in this area.

6 Conclusion

This chapter summarizes the conclusions that can be made from this work. It is divided into sections relating to each research question. At the end of the chapter future work is presented.

6.1 What research exists on 3D data sharing between software?

Several papers were found in the area, of which those deemed most interesting were included in this thesis. Some papers were also excluded because they showed the same results as others or were old enough that the methods and software used were outdated. What was found is that 3D data sharing can be done using a local shared file which software have access to for reading and modifying. There is also an approach where the 3D data is streamed through communication protocols, either directly or through intermediate servers and databases. The latter approach seemed to be a greater topic of research as it was easier to find papers on this subject, and they came with many different variations and modifications. It seems to be a continuing topic, as there were papers released in 2021 as well. Both approaches have advantages and disadvantages. The file-based approach seems to be easier to set up since it uses existing functionality with small modifications. All 3D software have some sort of import and export functionality and the modifications needed to automate the workflow can be relatively simple compared to a streaming approach. This can be seen in the implementation developed and tested in this thesis. It only required slight modifications in the CET Designer source code, and on the Unreal Engine side it was possible to use existing features and code. Setting up a server and database or creating a protocol for streaming 3D data over TCP would have required greater efforts. This can be shown by studying the papers in section 4.2.2. An example is the work by Du et al. [12], which involved three authors who created a data transfer protocol and format which needed many improvements to function optimally. At the same time, file-based approaches can also create problems where bugs can occur between different software, for example when file formats are interpreted differently. In summary, there is a fair amount of research in this area and there are some optimizations which can be applied to the processes and technologies used in the implementations. There is not yet a system which connects all major 3D drawing and rendering software, and there could be a great opportunity for researchers and developers to try to tie them all together.


6.2 What limitations does an automated file-based import-export workflow have?

The implementation described in section 4.3 shows that it can be fairly easy to set up an automated import-export workflow, at least using the software described in this thesis. Other software such as Revit, 3ds Max and Unity are referenced in papers in the literature study which also show that it is possible. Limitations mostly come from not having access to the source code or an API which holds all the necessary functions. There is also the issue of selecting which file format to use. In the prestudy in section 3.1, some file formats are only supported by a small number of software. This caused problems, as glTF was initially chosen for the implementation though there wasn't enough support for it in the software used for reference testing. Both FBX and glTF were used for testing the implementation to get a comparison. There isn't a single perfect file format for every use case, as some do not hold necessary information such as parametric data. There are also differences in how materials and geometric data are represented, which can lead to varying results when visualizing the 3D scenes and models. You also need to make sure that files are properly prepared before being imported into other software. From the testing the conclusion can be drawn that software can severely impact the user experience in the form of slow loading times. When choosing which 3D software to use for an implementation, consideration should be given to how efficient it is at importing and exporting files. When files get too large or too detailed it will result in long wait times during which the computer cannot be used properly. When software need to load in large models and scenes they take up resources which other software cannot use. These resources include processing power and memory.

6.3 How can a network-based 3D data system be described using existing research?

Using existing research, developers can take advantage of optimizations and avoid pitfalls which previous authors have come across. Useful information such as architecture designs, database structures and communication charts can be found in these works. Combining this with knowledge about the software and systems they want to work with, developers can create a customized system for a potential implementation. One thing that was lacking in the existing research was in-depth descriptions of the programming languages, libraries and protocols used. Since no source code is available, it would be difficult to completely reproduce the results from the papers. One possible improvement to the method used in this thesis could be to study working implementations published on GitHub. This could tie together theory and practice so the reader gets a complete overview of how an efficient system could be designed and implemented. The system specified in this thesis could be used in future work, where an evaluation of the user benefits as well as the performance requirements can be made.
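As a concrete illustration of what a network-based system could exchange, the sketch below serializes a minimal scene-update message as JSON and sends it, length-prefixed, over a TCP connection to an intermediate server. The message fields (object id, translation, mesh reference) and the server address are assumptions made for this example only; they are not taken from the cited papers or from the system specified in this thesis.

import json
import socket

# Hypothetical address of an intermediate synchronization server.
SERVER_ADDRESS = ("localhost", 9000)


def make_update_message(object_id: str, translation: tuple, mesh_uri: str) -> bytes:
    """Build a minimal scene-update message as length-prefixed JSON.
    The field names are assumptions for this sketch only."""
    payload = json.dumps({
        "type": "transform_update",
        "object_id": object_id,
        "translation": translation,
        "mesh_uri": mesh_uri,
    }).encode("utf-8")
    # Prefix with the payload length so the receiver can frame messages.
    return len(payload).to_bytes(4, "big") + payload


def send_update(message: bytes) -> None:
    """Send one update to the synchronization server over TCP."""
    with socket.create_connection(SERVER_ADDRESS) as connection:
        connection.sendall(message)


if __name__ == "__main__":
    send_update(make_update_message("chair_01", (1.0, 0.0, 2.5), "meshes/chair_01.glb"))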

6.4 Future work

For future work in this area it would be interesting to see the network-based system described in this thesis put into code and tested according to user-centered metrics. Demands on latency and usability could be evaluated to find out what users want to get out of this kind of system. Something which was missing from some of the reports was a comparison to existing frameworks and tools. Creating a system from scratch takes a significant amount of time, and there could be systems which already have the sought-after functionality. In this thesis there was not enough time to explore this fully. Therefore, for future work it could be interesting to look at systems like the NVIDIA Omniverse Platform¹, which is currently in an open beta stage.

¹ https://developer.nvidia.com/nvidia-omniverse-platform

Bibliography

[1] Robert McNeel & Associates. Rhino3D. url: https://www.rhino3d.com/. Accessed: 2020-12-17.
[2] Autodesk. 3ds Max. url: https://www.autodesk.se/products/3ds-max/overview. Accessed: 2020-12-17.
[3] Autodesk. Maya. url: https://www.autodesk.se/products/maya/overview. Accessed: 2020-12-17.
[4] Autodesk. MotionBuilder. url: https://www.autodesk.com/products/motionbuilder/overview. Accessed: 2020-12-17.
[5] Autodesk. Revit. url: https://www.autodesk.se/products/revit/overview. Accessed: 2020-12-17.
[6] Ross Bille, Shamus P. Smith, Kim Maund, and Graham Brewer. “Extending Building Information Models into Game Engines.” In: Proceedings of the 2014 Conference on Interactive Entertainment. IE2014. Newcastle, NSW, Australia: Association for Computing Machinery, 2014, pp. 1–8. isbn: 9781450327909. doi: 10.1145/2677758.2677764. url: https://doi-org.e.bibl.liu.se/10.1145/2677758.2677764.
[7] cgarchitect. Architectural Visualization Rendering Engine Survey Results. url: https://www.cgarchitect.com/features/articles/b352ebe4-2019-architectural-visualization-rendering-engine-survey-results. Accessed: 2021-02-11.
[8] Configura. CET Designer. url: https://www.configura.com/products/cet-designer. Accessed: 2020-12-07.
[9] Configura. CM Language. url: https://support.configura.com/hc/en-us/sections/360008682434-CM-Language. Accessed: 2021-02-11.
[10] Dilbert. House | 3D Warehouse. url: https://3dwarehouse.sketchup.com/model/bad472a435d3fab41992a70eb6b3a2a6/House. Accessed: 2021-03-01.
[11] Dmitriy Dorofeev and Sergey Shestakov. “2-Tier vs. 3-Tier Architectures for Data Processing Software.” In: ICAIT’2018. Aizu-Wakamatsu, Japan: Association for Computing Machinery, 2018, pp. 63–68. isbn: 9781450365161. doi: 10.1145/3274856.3274869. url: https://doi-org.e.bibl.liu.se/10.1145/3274856.3274869.


[12] Jing Du, Zhengbo Zou, Yangming Shi, and Dong Zhao. “Zero latency: Real-time synchronization of BIM data in virtual reality for collaborative decision-making.” In: Automation in Construction 85 (2018), pp. 51–64. issn: 0926-5805. doi: https://doi.org/10.1016/j.autcon.2017.10.009. url: http://www.sciencedirect.com/science/article/pii/S0926580517309172.
[13] Gareth Edwards, Haijiang Li, and Bin Wang. “BIM based collaborative and interactive design process using computer game engine for general end-users.” In: Visualization in Engineering 3.1 (Feb. 2015), p. 4. issn: 2213-7459. doi: 10.1186/s40327-015-0018-2. url: https://doi.org/10.1186/s40327-015-0018-2.
[14] Gustav Eriksson and Anton Engberg. “Automating the CAD to Virtual Reality Pipeline for Assembly Simulation.” MA thesis. Linköping University, Machine Design, 2020, p. 71. url: https://www.diva-portal.org/smash/get/diva2:1444936/FULLTEXT01.pdf.
[15] Viktor Fält. “Strategies to effectively integrate a 3D model of a building in a software systems for real-time visualization.” MA thesis. Linköping University, The Institute of Technology, 2015, p. 32. url: https://www.diva-portal.org/smash/get/diva2:838677/FULLTEXT01.pdf.
[16] FineHouse. Outdoor Kitchen Pergolas | 3D Warehouse. url: https://3dwarehouse.sketchup.com/model/f67b258f7f61d9a8e79d181dec16bca/Outdoor-Kitchen-Pergolas. Accessed: 2021-03-01.
[17] The Blender Foundation. Blender. url: https://www.blender.org/about/. Accessed: 2020-11-11.
[18] Epic games. Auto Reimport. url: https://docs.unrealengine.com/en-US/Basics/AssetsAndPackages/AutoReImport/index.html. Accessed: 2020-12-07.
[19] Epic games. Datasmith. url: https://www.unrealengine.com/en-US/datasmith. Accessed: 2020-12-07.
[20] Epic games. Live Link Plugin. url: https://docs.unrealengine.com/en-US/AnimatingObjects/SkeletalMeshAnimation/LiveLinkPlugin/index.html. Accessed: 2020-12-07.
[21] Epic games. Plugins. url: https://docs.unrealengine.com/en-US/ProductionPipelines/Plugins/index.html. Accessed: 2020-12-07.
[22] Epic games. Twinmotion. url: https://www.unrealengine.com/en-US/twinmotion. Accessed: 2020-11-11.
[23] Epic games. Unreal Engine. url: https://www.unrealengine.com/en-US/faq. Accessed: 2020-11-11.
[24] Scott Gebhardt, Eliezer Payzer, Leo Salemann, Alan Fettinger, Eduard Rotenberg, and Christopher Seher. “Polygons, Point-Clouds, and Voxels, a Comparison of High-Fidelity Terrain Representations.” In: Simulation Interoperability Workshop and Special Workshop on Reuse of Environmental Data for Simulation—Processes, Standards, and Lessons Learned. 2009.
[25] Enscape GmbH. Enscape. url: https://enscape3d.com/features/. Accessed: 2020-11-11.
[26] Graphisoft. Archicad. url: https://graphisoft.com/solutions/products/archicad. Accessed: 2020-12-17.
[27] Daniel Green. “An Interface for Real-Time Networked Streaming of 3D Scenes.” MA thesis. Teesside University, Jan. 2016, p. 50. url: https://www.researchgate.net/publication/327867528_An_Interface_for_Real-Time_Networked_Streaming_of_3D_Scenes.


[28] Nemetschek Group. Vectorworks. url: https://forum.vectorworks.net/index.php?/topic/51142-twinmotion-plugin-sync/page/14/. Accessed: 2021-02-11.
[29] Lars Ivar Hatledal. “A Flexible Network Interface for a Real-time Simulation Framework.” MA thesis. NTNU, 2017. url: https://ntnuopen.ntnu.no/ntnu-xmlui/handle/11250/2685194.
[30] Keita Kado and Gakuhito Hirasawa. “Two-Way Cooperation of Architectural 3d Cad and Game Engine.” In: Proceedings of the 16th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and Its Applications in Industry. VRCAI ’18. Tokyo, Japan: Association for Computing Machinery, 2018. isbn: 9781450360876. doi: 10.1145/3284398.3284420. url: https://doi-org.e.bibl.liu.se/10.1145/3284398.3284420.
[31] Khronos. glTF 2.0 Quick Reference Guide. url: https://www.khronos.org/files/gltf20-reference-guide.pdf. Accessed: 2021-03-01.
[32] Sam Kubba. “Chapter Five - Building Information Modeling (BIM).” In: Handbook of Green Building Design and Construction (Second Edition). Ed. by Sam Kubba. Second Edition. Butterworth-Heinemann, 2017, pp. 227–256. isbn: 978-0-12-810433-0. doi: https://doi.org/10.1016/B978-0-12-810433-0.00005-8. url: http://www.sciencedirect.com/science/article/pii/B9780128104330000058.
[33] James Kurose and Keith Ross. Computer networks: A top down approach featuring the internet. Pearson, 2010.
[34] Geon-hee Lee, Pyeong-ho Choi, Jeong-hwan Nam, Hwa-seop Han, Seung-hyun Lee, and Soon-chul Kwon. “A Study on the Performance Comparison of 3D File Formats on the Web.” In: International journal of advanced smart convergence 8.1 (Mar. 2019), pp. 65–74.
[35] Yu-Cheng Lin, Yen-Pei Chen, Huey-Wen Yien, Chao-Yung Huang, and Yu-Chih Su. “Integrated BIM, game engine and VR technologies for healthcare design: A case study in cancer hospital.” In: Advanced Engineering Informatics 36 (2018), pp. 130–145. issn: 1474-0346. doi: https://doi.org/10.1016/j.aei.2018.03.005. url: http://www.sciencedirect.com/science/article/pii/S1474034617303373.
[36] Ellen Lupton. Graphic design theory: Readings from the field. Chronicle Books, 2009, pp. 127–132.
[37] Maxon. Cinema4D. url: https://www.maxon.net/en/cinema-4d. Accessed: 2020-12-17.
[38] Kenton McHenry and Peter Bajcsy. “An overview of 3d data content, file formats and viewers.” In: National Center for Supercomputing Applications 1205 (2008), p. 22.
[39] Sutenee Nopachinda and S. Ergan. “Challenges in Converting Building Information Models into Virtual Worlds for FM Operations and User Studies in the Built Environment.” In: The 16th International Conference on Computing in Civil and Building Engineering (ICCCBE 2016). Osaka, Japan, 2016.
[40] “Physically Based Rendering.” In: Physically Based Rendering (Third Edition). Ed. by Matt Pharr, Wenzel Jakob, and Greg Humphreys. Third Edition. Boston: Morgan Kaufmann, 2017, p. 1235. isbn: 978-0-12-800645-0. doi: https://doi.org/10.1016/B978-0-12-800645-0.50029-4. url: http://www.sciencedirect.com/science/article/pii/B9780128006450500294.
[41] Freddie Pintar. “Investigation and Implementation of a Live Connection between Configura CET and Revit Architecture 2009.” MA thesis. Linköping University, PELAB - Programming Environment Laboratory, 2009, p. 70. url: http://liu.diva-portal.org/smash/get/diva2:208793/FULLTEXT01.pdf.


[42] Florent Poux. How to represent 3D Data? url: https://towardsdatascience.com/how-to-represent-3d-data-66a0f6376afb. Accessed: 2021-01-21.
[43] Unity Technologies. Unity. url: https://unity.com/products/unity-platform. Accessed: 2020-11-11.
[44] Unity Technologies. Unity - Compare plans. url: https://store.unity.com/compare-plans. Accessed: 2020-11-12.
[45] Unity Technologies. Unity - Plug-ins. url: https://docs.unity3d.com/Manual/Plugins.html. Accessed: 2020-11-12.
[46] Unity Technologies. Unity Reflect. url: https://unity.com/products/unity-reflect. Accessed: 2020-12-07.
[47] Thien Tran Thi. “Importing a 3D model from an industrial design.” BA thesis. Metropolia Ammattikorkeakoulu, 2015. url: https://www.theseus.fi/handle/10024/97743.
[48] Trimble. SketchUp. url: https://www.sketchup.com/. Accessed: 2020-12-17.
