<<

ExoViewer: An Interactive Visualization of Extrasolar Planets and Small Solar System Bodies

Master Thesis 29 February 2016 by Pascal Forny, Cham ZG, 10-756-120

Supervisors: Prof. Dr. Renato Pajarola, Matthias Thöny
Visualization and MultiMedia Lab, Department of Informatics, University of Zurich

Abstract

A recent trend in astronomy is the hunt for a second life-sustaining planet in the cosmos. This research generates a lot of data on celestial bodies, so-called extrasolar planets or exoplanets, which is visualized here. Concepts for the adequate representation of the discovered objects are developed and implemented as an interactive computer application called ExoViewer. Apart from planetary bodies, the program illustrates the asteroids and comets present in our Solar System. The challenge of rendering this huge amount of data in an appropriate manner is faced as a second part of the work. The results, frames rendered using the described techniques and methods, show that visualizing uncertain or missing data is a big issue and that the focus of a user can quickly be distracted by displaying too many things. Additionally, measurements of the performance of the application illustrate that calculating positions of bodies in space is a costly operation and that culling strategies are crucial to obtain a smoothly running program.

Abstract in German

Neuste Entwicklungen in der Astronomie verstärken die Bemühungen, neben der Erde einen anderen bewohnten Planeten im Universum zu finden. Diese Forschungsaktivität führt dazu, dass sehr viele sogenannte Exoplaneten entdeckt werden. Die vorliegende Arbeit präsentiert Konzepte, woraus ein interaktives Computerprogramm mit dem Namen ExoViewer erstellt wird, welches diese Entdeckungen visualisiert. Daneben liegt ein zweiter Fokus auf der Darstellung einer grossen Anzahl an Asteroiden und Kometen in unserem Sonnensystem. Das Resultat dieser Arbeit sind generierte Bilder, welche aus den angewendeten Techniken hervorgehen und zeigen, dass die Visualisierung von inkompletten oder mit Unsicherheit behafteten Datensätzen eine Herausforderung darstellt. Sie implizieren, dass beim Rendering von vielen Objekten Auswahlen, Priorisierungen oder durchdachte Darstellungsmethoden eingesetzt werden müssen. Des Weiteren zeigt eine Performanzanalyse, dass Positionsberechnungen aufwändig und die Anzahl ausgelöster Zeichnungsvorgänge kritisch für die benötigte Zeit pro Bild sind.

Contents

Abstract

1 Introduction
  1.1 Initial Situation
  1.2 Goal
  1.3 Outline

2 Related Work
  2.1 Fundamentals in Astronomy
    2.1.1 Small Body Object
    2.1.2 Exoplanet
    2.1.3 Habitable Zone
    2.1.4 Distances
  2.2 Databases
    2.2.1 Open Exoplanet Catalogue
    2.2.2 JPL Small-Body Database
    2.2.3 The Solar System
  2.3 Astronomical Visualization
    2.3.1 Challenges
    2.3.2 Existing Tools for Visualizing Celestial Bodies
  2.4 Implementation Specific Aspects

3 Theory
  3.1 Particular Visualization Aspects
    3.1.1 Exoplanetary Systems
    3.1.2 Small Body Objects
  3.2 Location of Bodies in Space
    3.2.1 Modelling of Positions
    3.2.2 Intrasystem Movement
  3.3 Numerical Precision Aspects
  3.4 Acceleration Data Structures
    3.4.1 View Frustum Culling
    3.4.2 Distance Culling

4 Implementation
  4.1 Architecture
    4.1.1 ExoViewer Package
    4.1.2 GlobeEngine Package
    4.1.3 geAstro Package
  4.2 Use Cases
  4.3 Data Model
    4.3.1 Data Sources
    4.3.2 Data Import
  4.4 Graphics Pipeline
    4.4.1 Deferred Rendering Pipeline
    4.4.2 Rendering Data Structures
    4.4.3 Qt
  4.5 User Experience
    4.5.1 Interaction Overview
    4.5.2 Parametrization
    4.5.3 Seeing the Essentials
    4.5.4 UI Design Trade-offs

5 Results
  5.1 Exoplanetary Systems
  5.2 Small Body Objects
  5.3 Measurements
    5.3.1 Application Initialization
    5.3.2 Running Application

6 Discussion
  6.1 Exoplanetary Systems
  6.2 Solar System
  6.3 Performance

7 Future Work
  7.1 Implementation

1 Introduction

Outer space has always held a special and mystical attraction for humankind. Records of early astronomical observations reach back thousands of years before Christ. Later, high cultures such as those in Egypt, Greece, or Rome initiated the era of astronomy as a science. However, with the acquisition of knowledge, the curiosity never decreased. Nowadays, a lot of effort is made to learn more about astronomical phenomena. Special institutions like the National Aeronautics and Space Administration of the United States of America (NASA) have been founded which consume a significant amount of the federal budget. So the motivation to invest and discover more in this field is a constant in history. In contrast, the style of research has changed drastically over time. With technical resources, e.g. computers or telescopes, available like never before, there are methods and branches of research which become increasingly data-intensive. One of these recent developments is the focus on finding extraterrestrial life. Novel instruments made it possible to detect planets outside of our Solar System. At present, around 2,000 of these so-called exoplanets are known and their data is available to the public. Being a very popular field of research, this number is increasing on a very frequent basis. In parallel, our Solar System has been examined further, such that the number of known planets, dwarf planets, moons, and small body objects keeps growing. This incredible amount of data, currently summing up to slightly more than 700,000 objects, is hardly comprehensible as a whole by looking at tables and numbers. That is exactly where data visualization comes into play. Interactive visual exploration of high-dimensional data sets can help to gain more insights than statistical analyses [Goo12]. Using the input of a visualization, the human brain is more capable of identifying patterns or irregularities than with pure numerical figures.

In general, visualization systems face many challenges in terms of computer graphics when it comes to precision and navigation. This aspect is particularly pronounced in astronomical visualization, where the dimensions of data cover a range unlike in any other field. Large scale visualization is important when all small body objects in our Solar System are considered, and scalable algorithms for visualization are needed. Another frequent issue is information visualization. The question posed here is how to translate numbers or strings into an image representing them. This is a hard problem to solve, especially when having only imprecise data or even no data at all, as is the case with some exoplanets.

1.1 Initial Situation

This thesis is the follow-up project of a bachelor's thesis, published in 2015 by Matthias Nötzli with the title 'Solar System Viewer' [Not15]. The subject of that work was a parallel 3D visualization of the Solar System. Based on this project as a starting point, the ExoViewer is an extension thereof. Subsequently, some main aspects of the Solar System Viewer are described briefly. In this predecessor project, an interactive visualization of the Sun, the planets and most of the known moons in the Solar System was developed. It allows to navigate through the stellar system, and to select and focus on an arbitrary body. Numerical information about bodies can also be viewed. A hand-made, locally stored file serves as the single data source. A big part of this project was the parallelization of the application, such that the rendering is done in a multi-screen environment.


1.2 Goal

The target of this thesis is to extend the Solar System Viewer to a large scale visualization system for exoplanets and their hosting systems, as well as small body objects in the Solar System. The viewer itself will give the opportunity to navigate through 3D space and explore an exoplanet database. In addition, it will be possible to view the numerical and textual information of each exoplanet within the application. In more detail, the initial task description speaks about an application to be developed fulfilling the following criteria:

• Define a concept and requirements to extend the currently available software package to visualize exoplanetary systems from NASA's Kepler project and the NASA/JPL Small Body Object Database.

• The viewer should allow navigating between stellar systems in a seamless manner and should allow zooming between a single planet and the survey of the Milky Way.

• It should be possible to visualize gravitational field changes within a stellar system with a simple grid-like structure.

• The viewer should allow the selection of objects and show currently available information about planets, stars and small body objects. This can be achieved over a Web API of the above mentioned databases.

• The application will be developed with C++, OpenGL, and Qt.

1.3 Outline

This work is structured in four main chapters, comprising the related work, the theory, the implementation, and the results. After this introduction, an overview of relevant related work is presented in chapter 2, including aspects of astronomy, of existing databases, as well as of visualization techniques and projects. Subsequently, the important and interesting facets of the theory the developed prototype is based on are treated in chapter 3. This part is meant to be concise because a lot of principles are already described in the previous chapter and the theories themselves are not revolutionary. Section 3.1 offers an overview of the non-technical concepts and ideas on which the developed application is founded. They may help to understand some design and implementation decisions. Chapter 4 then covers the part where most of the work for this thesis was spent, the explanation of the implementation. It contains all the important details about the implementation, focusing as always on changes and extensions compared to the predecessor project realized by Nötzli [Not15]. This chapter is interesting both for people focusing on technical aspects like the graphics pipeline in section 4.4, and for people wanting to get in touch with the prototype as a user (section 4.5). In chapter 5, the results are presented mainly as screenshots, which is natural for a graphical work. Different detailed aspects of the prototype are shown, focusing on the tasks of visualizing on the one hand the exoplanetary systems (5.1) and on the other hand the small body objects (5.2). To conclude the thesis, the results are discussed in chapter 6 and some future steps in the development of this project are suggested.

2 Related Work

2.1 Fundamentals in Astronomy

With celestial bodies and whole stellar systems to be visualized, it helps, or is even necessary, to dive into the wide field of astronomy. An understanding of the astronomical fundamentals is highly supportive for both the developer and the user of the application. The information given here is far from complete; it is limited to the topics this thesis essentially focuses on.

2.1.1 Small Body Object

Small body objects are defined by the International Astronomical Union (IAU), which refers to them as 'Small Solar System Bodies'. The definition is an exclusion of all known celestial bodies in the Solar System: Every object having its orbit centered on the Sun and not being a planet or a dwarf planet is classified as a Small Solar System Body. A dwarf planet has sufficient mass for its self-gravity to overcome rigid body forces so that it assumes a hydrostatic equilibrium (nearly round) shape, which is the main criterion differentiating it from the Small Solar System Body. Thus, a Small Solar System Body usually does not have a round shape. [IAUb] Since Small Bodies are only known for the Solar System, the 'Small Solar System Bodies' defined here are called 'small body objects' or 'small bodies' in this work. There is a big variety in this category of celestial bodies. These numbered objects are commonly grouped into two big categories: asteroids and comets. The distinction between them becomes increasingly difficult though, because more and more is learned about the formation of the universe.

Asteroids are small rocky bodies primarily in orbit between Jupiter and Mars. This accumulation of asteroids is referred to as the Main Belt. In the definition of the Jet Propulsion Laboratory (JPL) Small-Body Database (subsection 2.2.2), objects orbiting between Jupiter and Neptune, and even beyond Neptune, are also included. Their diameters range from a few meters up to nearly 1,000 km. [JPL]

Comets are small icy bodies, only a few kilometers in extent. They are believed to be products of the formation of the outer Solar System. The most famous accumulation of these objects is named . Among other categorization approaches for comets, an interesting one is based on their periodicity. The so-called long-period comets complete one orbit around the Sun between 100 and several million years, whereas the short-period comets have orbital periods of 5 to 7 years. The discovery of unknown comets has recently been boosted by the joint in-space observatory SOHO of the European (ESA) and American (NASA) space agencies. As of mid 2015, this mission had found nearly 3,000 comets within 20 years [Sea15]. [JPL]

These two categories, asteroids and comets, can further be subdivided as shown in table 2.1. Independently thereof, there are Small Bodies which pass close to our planet and can therefore be of special interest. They are called NEOs (Near Earth Objects) and PHAs (Potentially Hazardous Asteroids), which potentially could hit Earth and cause considerable damage [JPL].

2.1.2 Exoplanet

An extrasolar planet, more often referred to by its short form exoplanet, can be defined as the equivalent of a planet orbiting another barycenter (center of mass) than the one of the stellar system we live in [PLQ]. Nowadays, 2015 of them are known to exist in 1302 different exoplanetary systems [OEP].


(a) Asteroids:
Atira (IEO): An asteroid orbit contained entirely within the orbit of the Earth (q < 0.983 au). Also known as an Interior Earth Object.
Aten (ATE): Near-Earth asteroid orbits similar to that of 2062 Aten (a < 1.0 au; q > 0.983 au).
Apollo (APO): Near-Earth asteroid orbits which cross the Earth's orbit, similar to that of 1862 Apollo (a > 1.0 au; q < 1.017 au).
Amor (AMO): Near-Earth asteroid orbits similar to that of 1221 Amor (1.017 au 1.666 au).
Main-belt Asteroid (MBA): Asteroids with orbital elements constrained by (2.0 au 1.666 au).
Outer Main-belt Asteroid (OMB): Asteroids with orbital elements constrained by (3.2 au < a < 4.6 au).
Jupiter Trojan (TJN): Asteroids trapped in Jupiter's L4/L5 Lagrange points (4.6 au 30.1 au).
Parabolic Asteroid (PAA): Asteroids on parabolic orbits (e = 1.0).
Hyperbolic Asteroid (HYA): Asteroids on hyperbolic orbits (e > 1.0).
Asteroid (other) (AST): Asteroid orbit not matching any defined orbit class.

(b) Comets:
Hyperbolic (HYP): Comets on hyperbolic orbits (e > 1.0).
Parabolic Comet (PAR): Comets on parabolic orbits (e = 1.0).
Jupiter-family Comet* (JFC): Jupiter-family comet, classical definition (p < 20 y).
Halley-type Comet* (HTC): Halley-type comet, classical definition (20 y < p < 200 y).
Encke-type Comet (ETc): Encke-type comet, as defined by Levison and Duncan (T_Jupiter > 3; a < a_Jupiter).
Chiron-type Comet (CTc): Chiron-type comet, as defined by Levison and Duncan (T_Jupiter > 3; a > a_Jupiter).
Jupiter-family Comet (JFc): Jupiter-family comet, as defined by Levison and Duncan (2 < T_Jupiter < 3).
Comet (other) (COM): Comet orbit not matching any defined orbit class.

Table 2.1: Orbital classification of small bodies, (a) for asteroids and (b) for comets, with a being the semimajor axis, e the eccentricity, p the period (time to complete one orbit), and T the Tisserand's parameter (a value calculated from several orbital elements; explained in [Rid12]). [JPL]

It is commonly assumed that exoplanets can be orbited by moons, as observable with planets in the Solar System. However, there are no confirmed moons of exoplanets, but in 2014, an 'exomoon' candidate was announced [BBB+14]. Ideas of a plurality of worlds reach back to ancient Greece, but the first discovery of an exoplanet was made in 1995, when Michel Mayor and Didier Queloz from the University of Geneva reported an object orbiting 51 Pegasi. They observed it indirectly, which means that they did not see the exoplanet itself but the effect of its existence. This announcement triggered a wave of discoveries. By the end of the 20th century, several dozen exoplanets had been observed. The boom in this field of research, which is still holding on, is mainly due to technical progress such as significant improvements in spectrometers to separate starlight into its color components, better and more accurate telescope sensors, or the development of novel computer programs for sophisticated analyses. It can be said that these days exoplanet-hunting is a mainstream activity in the field of astronomy. The cosmos is assumed to be full of exoplanets; however, it is not trivial to detect them. [PLQ]

In 2006, the French-European mission CoRoT was launched, being the first space mission dedicated to the discovery of exoplanets. Only three years later, NASA started their first exoplanet mission, called Kepler, which is revolutionizing in many aspects. All signs point to further missions from different institutions, also because this field of research is a good instrument to gain media attention. [PLQ]

Discovery Methods

There exists a big variety of discovery methods, some of them only in theory, but the ones described subsequently account for the majority of all exoplanet detections (figure 2.1). Only a handful of exoplanets were ever directly imaged, due to the faint light they emit compared to the glare of one or multiple nearby stars. Methods of direct imaging use special masking devices to block out the star's light (coronography) or images from multiple telescopes in order to calculate a combination of them which cancels out the light of the host star (interferometry). [PLQ]

The transit method is a tool to detect exoplanets which pass between the point of observation and the hosting star. By doing so, such a planet blocks a tiny fraction of the star's light. Thus, sensitive instruments are able to measure a dimming of the star. From the depth of the dip in brightness and the periodicity, some planetary and orbital properties can be estimated. The Kepler Mission has already discovered more than 1000 exoplanets using the transit method; thus, it is the most popular one. [PLQ]

The method of radial velocity takes advantage of the fact that every body in a system orbits the barycenter. Thus, a star also moves around this center if other bodies are in the system. The change of speed of the star relative to the observer (oftentimes the Earth) can be analyzed using the spectrum of the star's light, which is blue-shifted when the star is moving in the observer's direction and red-shifted when it is moving in the opposite direction. This effect is also referred to as Doppler shift. Periodic shifts of the spectrum are taken to determine properties of other potential bodies in the system. [PLQ]

Figure 2.1: Discovery methods statistics of all exoplanets currently available in the Open Exoplanet Catalogue [OEP]: Transit 64%, Radial Velocity 31%, Microlensing 2%, Direct Imaging 2%, Timing 1%.

Hosting Systems

Exoplanets orbit the barycenter of their hosting system. The number of stars in such a system can be greater than or equal to 1, with no theoretically defined upper limit. Double or multiple stars are very common in our galaxy.

5 2 Related Work

In general, two or more stars sticking together for a long time (usually lifelong) by mutual gravitational attraction are called a double star. These systems can be structured in different manners. In a system with four stars, an exoplanet is not guaranteed to orbit around only one star; it can for instance also orbit around a binary. [Hei78]

Orphan Planets

Orphan planets represent a special type of exoplanets not orbiting inside a stellar system but floating freely in interstellar space, primarily moving around the galaxy's center. Stating this, it is important to know that it strongly depends on how planetary objects and stars are distinguished. Objects with a mass of less than approximately 13 Jupiter masses are normally considered to be planets. Heavier ones are classified as stars, starting with the 'brown dwarf' as the lightest star. Hence, these two categories of celestial bodies are not in principle distinct. There is no very solid consensus in the scientific community about orphan planets. [Car11] The problem with these objects is the uncertainty regarding their mass. Thus, it remains unsure whether they are really planets or small stars. Researchers estimate a ratio of 2 between orphan planets and stars in our galaxy regarding their respective quantities. Despite their large number, the hunt for extraterrestrial life is not focused on them, since they are assumed to have a small probability of sustaining permanent life. The Open Exoplanet Catalogue [Rei12] currently only contains two orphan planets, which are 'CFBDSIR2149' and 'PSO J318.5-22'. [Car11]

2.1.3 Habitable Zone

The so-called habitable zone is traditionally defined as the circumstellar region in which a terrestrial-mass planet with a CO2-H2O-N2 atmosphere can sustain liquid water on its surface. The presence of fluid H2O is assumed to be necessary for the existence of life. Terrestrial mass here means a mass between 0.3M⊕ and 10M⊕, where M⊕ represents the mass of the Earth. Since one of the main reasons why exoplanetary research is done is the strong public interest in a potential discovery of extraterrestrial life [PLQ], estimating the limits of the habitable zone in a system is a crucial and important tool. [KRK+13] Kopparapu et al. [KRK+13] present in their paper a simple method for the estimation of this zone. According to the theory and data presented there, the four numbers characterizing the habitable zone as shown in figure 2.2 can be inferred from the star's properties. Note that this model only allows estimating the habitable zone in single star systems. Each of the four radial distances defining the habitable zone is calculated in astronomical units (au) as follows [KRK+13]:

d = ( (L/L_⊙) / S_eff )^0.5 au. (2.1)

L/L_⊙ is the ratio of the luminosity of the star in the system, for which the limits are calculated, to that of the Sun. In case the luminosity of a star is unknown, the term is defined over the mass-luminosity relationship [Har88]

L/L_⊙ = (M/M_⊙)^3.5, (2.2)

with M representing the mass of the star in the system and M_⊙ the mass of the Sun, so roughly 1.989 × 10^30 kilograms. S_eff in equation 2.1 stands for the stellar flux reaching the top of the atmosphere of an Earth-like planet. Kopparapu et al. [KRK+13] provide coefficients to calculate this value for each of the four limits per star, depending on its temperature. Note that the calculated limits of the habitable zone are only valid on a plane which is defined by the stellar horizon. Thus, the zone is not shaped spherically. Many of the currently known exoplanets have non-zero eccentricities, which can carry them (and their possible moons) in and out of the habitable zone [KRK+13].
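To illustrate how equations 2.1 and 2.2 fit together, the following minimal C++ sketch computes one habitable zone limit from a star's mass and an effective flux value. It is only an illustration: the function names and the placeholder S_eff value are chosen here for the example and are not taken from ExoViewer or from the coefficient tables in [KRK+13].

#include <cmath>
#include <cstdio>

// Mass-luminosity relationship (equation 2.2): L / L_sun = (M / M_sun)^3.5.
double luminosityRatioFromMass(double massInSolarMasses) {
    return std::pow(massInSolarMasses, 3.5);
}

// Habitable zone limit (equation 2.1): d = ((L / L_sun) / S_eff)^0.5 in au.
// sEff is the effective stellar flux for the chosen limit, which in the real
// model depends on the star's temperature and on which of the four limits
// (e.g. runaway greenhouse) is being computed.
double habitableZoneLimitAu(double luminosityRatio, double sEff) {
    return std::sqrt(luminosityRatio / sEff);
}

int main() {
    double lRatio = luminosityRatioFromMass(0.8); // hypothetical 0.8 solar-mass star
    double sEffPlaceholder = 1.05;                // placeholder flux, not from [KRK+13]
    std::printf("Habitable zone limit: %.3f au\n",
                habitableZoneLimitAu(lRatio, sEffPlaceholder));
    return 0;
}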


Figure 2.2: Habitable zone estimates in single star systems. 'Early Mars', which is named after the orbit Mars is believed to have moved on some billions of years ago, probably having had liquid water on its surface [PKRP87], and 'Recent Venus', which refers to the empirical fact that Venus has not had liquid water on its surface at least for the last 1 billion years [SH91], limit the optimistic estimate of the habitable zone as outer and inner bound, respectively. The 'Maximum Greenhouse Limit' and the 'Runaway Greenhouse Limit' restrict the conservative estimate of the habitable zone, colored in dark green.

2.1.4 Distances

Distances in the universe are hardly imaginable to human beings, even if the object whose distance is to be measured is not another galaxy but an exoplanet. In astronomy, the popular units of distance measurement are not named inches or meters but light-years or parsecs. To establish a notion of these magnitudes, the units are defined and examples are given. One of the smaller length units used in the astronomical jargon is the astronomical unit (au), which approximates the average distance between the Sun and the Earth. It is defined by the International Astronomical Union (in Resolution B2 of 2012 [IAUc]) as

1 au = 149,597,870,700 m. (2.3)

Beyond a stellar system, distances become too big to express them in reasonable numbers using astronomical units. To describe structures inside a galaxy, as our own Milky Way is, a unit called parsec is used. It is defined as [IAUa]

1 parsec ≈ 30.857 × 10^12 km ≈ 206,000 au. (2.4)

The parallax measurement leading to a length in parsec is described in more detail below in this very subsection. Alternatively, the light-year (ly) is sometimes used as a generally more popular unit. A light-year, which is defined as the distance traveled by light in a vacuum in a Julian year, is smaller than a parsec [IAUa].

1 ly ≈ 0.3 parsec ≈ 9.46 × 10^12 km ≈ 63,241 au. (2.5)

The closest known stellar system, which is also the closest in the Open Exoplanet Catalogue used for the ExoViewer described in 2.2.1, is Alpha Centauri at a distance of 1.339 parsec. On the other hand, the most distant system in the database is a Kepler Object of Interest named KOI-5485, possibly located 10,579.85 parsec away from our Solar System.
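As a quick sanity check, the relations 2.3 to 2.5 can be encoded as plain constants for unit conversions. The sketch below is illustrative only; the constant names are not taken from the ExoViewer code.

#include <cstdio>

// Astronomical unit in metres (IAU Resolution B2, 2012), equation 2.3.
const double kAuInMetres = 149597870700.0;
// One parsec in kilometres, equation 2.4.
const double kParsecInKm = 30.857e12;
// One light-year in kilometres, equation 2.5.
const double kLightYearInKm = 9.46e12;

int main() {
    double auInKm = kAuInMetres / 1000.0;
    std::printf("1 parsec ~ %.0f au\n", kParsecInKm / auInKm);        // about 206,000
    std::printf("1 light-year ~ %.0f au\n", kLightYearInKm / auInKm); // about 63,200
    std::printf("1 parsec ~ %.2f light-years\n", kParsecInKm / kLightYearInKm);
    return 0;
}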


Figure 2.3: Schematic overview of how stellar parallax is measured. The red dot represents the star which is at distance d from the Sun. At the right hand side, other stars in the distant background are considered to be practically fixed over the period of observation. Usually, the time points 0 and 1, for which the Earth (E) is shown on its orbit around the Sun, are separated by six months. This leads to an angle ρ, also referred to as the parallax angle. [FGK15]

Stellar Parallax

The unit parsec is defined in terms of the astronomical unit, which is shown by figure 2.3. 1 parsec equals the length of the leg adjacent to an angle of 1 arcsec in a right-angled triangle if the opposed leg measures 1 au. Using the parameter notation from figure 2.3,

d = ( 1 / tan(ρ) ) au. (2.6)

So 1 parsec is defined formally as d iff ρ equals 1 arcsec. The stellar parallax method is used to measure the distance to an object in space. It is a measure of how much an object moves compared to the fixed background in two images from different view points, the so-called apparent motion. The above mentioned closest star to the Sun, Alpha Centauri, has a parallax of 0.762 arcsec. Obviously, this method has its limitations, as the angles become too tiny for far away objects. Measurements conducted by telescopes on the Earth are limited to around 20 parsec, which includes nearly 2000 stars. Recent space-based instruments are capable of measuring distances accurately up to 200 parsec. [FGK15]
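Because 1 parsec is exactly the distance at which the parallax angle is 1 arcsec, equation 2.6 reduces to d ≈ 1/ρ parsec when ρ is given in arcseconds (tan(ρ) ≈ ρ for such tiny angles). A minimal sketch, not part of ExoViewer:

#include <cstdio>

// Distance in parsec from a parallax angle given in arcseconds (equation 2.6).
double distanceInParsec(double parallaxArcsec) {
    return 1.0 / parallaxArcsec;
}

int main() {
    // Alpha Centauri has a parallax of 0.762 arcsec.
    std::printf("Alpha Centauri: %.3f parsec\n", distanceInParsec(0.762));
    return 0;
}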

2.2 Databases

All the data used in ExoViewer is taken from publicly available repositories. The number of different potential catalogues containing the data necessary for the ExoViewer application turns out to be rather large. Providers cover a diverse range, starting with professional astronomical institutes like NASA and ending with hobby astronomers. The choice of data repositories is mainly based on the following criteria: data completeness and correctness, format consistency of different entries, update frequency, and accessibility (e.g. how and how fast it can be downloaded). All data is taken directly as is, thus without adaptation, addition or deletion of entries, accepting also some shortcomings. The chosen databases are presented subsequently.


2.2.1 Open Exoplanet Catalogue

The Open Exoplanet Catalogue is a database containing all the discovered extrasolar planets and their hosting systems. It is a novel approach since the GitHub project containing the catalogue is decentralized and open. Everyone can make contributions and suggest corrections. [Rei12] A big advantage of this database is that it is well maintained. For instance, averaging the number of commits per week over the year 2015 gives roughly 10.9, which means that the catalogue is statistically updated more than once per day [OEP]. An inherent strength of this project is the inclusion of exoplanetary data from the variety of institutions making discoveries in this field. Further, the data comes in the convenient XML format (one file per system), which is later directly fed to the prototype. As a drawback, the Solar System is represented poorly and inconsistently. The file 'Sun.xml' contains the data concerning the system we live in. Pluto is listed as one of the planets in the Solar System, which should not be the case, as defined by the International Astronomical Union in resolution B5 of the year 2006 [IAUb]. But other objects officially classified as dwarf planets, e.g. Ceres, do not appear. Further, there is no information about moons. This issue is discussed in 2.2.3. Table 2.2 shows the volume of the catalogue. There are 342 more confirmed exoplanets available than in the Exoplanet Data Explorer (accessible via http://exoplanets.org/), which is the interface to another popular exoplanet database. Apart from the objects reported in this table, the catalogue contains at the moment around 3000 Kepler Objects of Interest (KOI). A KOI is a system tracked by NASA's Kepler telescope which possibly hosts planets.

Number of confirmed exoplanets: 2014
Total number of planets (including Solar System objects and unconfirmed exoplanets): 2129
Number of planetary systems: 1301
Size of decompressed database (including Kepler Objects of Interest (KOI)): ca. 7 MB

Table 2.2: Open Exoplanet Catalogue Statistics [OEP].

2.2.2 JPL Small-Body Database

The Jet Propulsion Laboratory (JPL) is a center for research and development associated with NASA. Their Small-Body Database contains information about small bodies, i.e. asteroids and comets, and their orbits. The database currently consists of 706,589 entries. It is accessible through a web interface, which allows to define filter criteria and to download the selected data subsequently in HTML or comma separated value (CSV) format. Per small body, 74 attributes are retrievable, whereof 27 relate to the object itself and 47 to its orbit and model. Another important fact is that the small body objects are classified into different orbit groups, i.e. objects with similar orbital parameters belong to the same category. There are 14 of them for asteroids and 8 for comets. [JPL] However, the data available in this repository is not fully consistent with the definition of a small body object given in subsection 2.1.1. A known disagreement is the dwarf planet Ceres, which has an entry in the database as '1 Ceres' representing an asteroid. Having an approximately round shape, it would clearly fall into the category of dwarf planets. Another famous dwarf planet, Pluto, is not listed in the database of small bodies but in the JPL repository for planets, also in contrast to the recent IAU definitions. JPL neither provides a definition of small bodies nor references one. Thus, it remains unclear why these bodies are classified as they are.


2.2.3 The Solar System

As mentioned above, the data on the Solar System in the Open Exoplanet Catalogue is not in a satisfactory state. For this reason, the decision was taken to combine it with the data acquired by the predecessor project [Not15]. So, the structure of the Open Exoplanet Catalogue is more or less preserved, as the previously available set is also written in XML. This format, despite being not very efficient, is human readable, which is important as it is manually edited. Additionally, this set is enriched with some missing but interesting objects like all the dwarf planets and their known moons, and even a planet for whose existence evidence has recently been collected. In the case of the natural satellites, i.e. the moons, JPL would offer a database. This database seems to be quite encompassing, even including candidate moons; however, it lacks the Earth's moon, which is highly disadvantageous.

2.3 Astronomical Visualization

The recent establishment of new observation facilities and simulations in astronomy leads to the generation of incredible amounts of data. This new era enhances the role of visualization-based knowledge discovery. Although visualization is very popular with the wide public, astronomers tend to hesitate to develop and to use visualization applications. [HF11] There are a lot of techniques which produce images very close to actual photographs from telescopes. Frequently visualized objects are the so-called astronomical nebulae, as done for instance by Wenger et al. [WLM13]. Also planets and their atmospheres have been the subject of visualizations. A model for this was developed very early [NSTN93], which produces good-looking output images [WB14]. The trend towards mobile devices is also followed by Bertin et al. [BPM15], who developed a remote application visualizing large sets of astronomical data in real-time. It is web-based, built on the standard HTML5 technology, and can be operated by both touch and mouse.

2.3.1 Challenges

The relatively young research area of astronomical visualization is facing a variety of challenges. The first big issue concerns the data used for the visualization. Telescopes and other instruments combine a low signal-to-noise ratio with a large dynamic range, which requires special data transformation and interpolation methods which may affect the visual output in the end [HF11]. There is also the lack of dominant efficient data representations, i.e. standard data formats are not very widespread in astronomy [Goo12]. Further, oftentimes the data to be visualized is huge in size, which touches the problem of efficiently loading the data into memory and updating a scene containing a lot of represented data in real-time rendering, and thus the computational power [Goo12]. Another peculiarity is the navigation inside such a visualization, since usually it takes place in a three-dimensional space and covers long distances. Good devices for navigating on a 2D screen have evolved over the last decades; in a 3D space, however, the optimal solution is probably not a mouse but a Kinect-like device. [Goo12]

2.3.2 Existing Tools for Visualizing Celestial Bodies

There are different tools available which visualize similar aspects as the prototype developed in this thesis. A quite sophisticated one is provided by NASA itself, called 'NASA's Eyes' (available here: http://eyes.nasa.gov/eyes-on-exoplanets.html), which can be downloaded for free. It includes a view of the Earth with different data sets which can be visualized (temperature, sea-level, etc.), one of our Solar System containing the planets and some main small bodies, and one of the exoplanetary systems. ExoPlanetSystems (available here: http://exoplanets.tommykrueger.com/app/) is an interesting web-based approach of visualizing the exoplanetary systems, which is done by overlaying them, so they can be compared easily.


Also, there are a lot of simpler representations of exoplanets having absolutely no claim to reproduce reality, such as the one provided by universetoday.com (http://www.universetoday.com/93915/excellent-exoplanet-visualization-the-kepler-orrery-ii/), which comes in a video format. Solar System Scope (available here: http://www.solarsystemscope.com/) is a web-based viewer of the Solar System. It is also available as a mobile application. The interactive viewer contains all planets and dwarf planets. A visualization tool specialized in small bodies, more precisely asteroids in this case, is Asterank (available here: http://www.asterank.com/3d/). Being a web-based tool as well, it allows navigation through the point cloud of asteroids, providing interesting information about them. Of course, the most important tool that existed before this work is the Solar System Viewer [Not15], which also represents the starting point for the ExoViewer. A lot of techniques are adopted from this application.

2.4 Implementation Specific Aspects

The Institute of Electrical and Electronics Engineers (IEEE) defines the floating point data types in [IEE08]. Consulting the C++ reference [FTC], the conclusion can be drawn that the 32-bit type is implemented as float and the 64-bit version as double. Thus, as the name suggests, the double type occupies twice as many bits as the float type. Whitehead and Fit-Florea [WFF11] discuss the two floating point data types implemented on the GPU. To keep it simple, the choice has to be made between these two types. Alternatively, arbitrary precision operations could be performed on the GPU, since data types other than the natively supported ones can be desired in case the required range and precision are too high. This can be implemented following the instructions in [Lan15]. The optimization of the use of the computational power is not only an important aspect when it comes to the choice of the adequate data type, but also and particularly when designing the graphics pipeline. When building a 3D real-time application, the economy of resources is important. Liang et al. [LYLJ00] propose a concept called deferred lighting, in the following also referred to as deferred rendering, which consists of doing the lighting calculations at the very end of the pipeline to avoid the treatment of finally invisible polygons. Rendering in real-time is an important aspect as well, since users expect to navigate instantly through space. There is no absolute definition of the term 'real-time', but it is commonly understood to be a rendering speed of around 30 frames per second. It was clear early on to exploit the potential of modern graphics cards (GPUs). This is done by programming so-called shaders directly on the GPU. [KR08] As a last point, line rendering is taken up, which with respect to this thesis is primarily associated with orbit rendering. In a region of dense lines, the challenge is to preserve the information of the many lines while also keeping other things of the scene in focus. The choice of which line to render, or even which parts of which line to render to achieve this, is a hard one. Lines can be rendered by prioritizing them, i.e. only showing the important ones, and creating 3D effects by applying regional transparency [GRT13]. Another potentially complementary method for displaying lines is implementing a lighting model on them, just like it is done on the normal bodies in the scene. In this case, a line is treated as an infinitely small cylindrical tube and is shaded accordingly [MPSS05].

3 Theory

3.1 Particular Visualization Aspects

When visualizing, in whatever discipline it may be, the product is some model of the reality and not the reality itself. There are subjects which are only theoretically possible to visualize reasonably close to the original; others are simplified on purpose to focus the attention on a particular aspect. Obviously, the same holds for astronomical visualization. So, the first question to be asked before designing the visualized model is: Which aspects of the subject deserve to be focused on and therefore be transported by the developed application?

Clearly, not every region in space is equally interesting. Considering the space containing all known exoplanets, only in a very tiny fraction of this volume are planets or exoplanets visible and therefore worth viewing, and an even significantly smaller part is filled with material of celestial bodies. For instance, the orbit of the outermost planet in our Solar System, Neptune, has a major axis of around 60 au, compared to the distance from the Sun to the closest system, Alpha Centauri, which is about 276,173 au. The space in between is 'empty' and of little interest for a person willing to detect exoplanets. To avoid getting lost in uninteresting interstellar space, the camera has to have particularly restrictive properties concerning navigation, such as a so-called arc ball camera. This camera is always attached to a body, such that at least one body is always in the view frustum. Further, it allows to rotate around the object in focus to view it from every possible side. Zooming is another mechanism which can lead to scenes without bodies in them, which is to be avoided. Therefore, the zooming is limited in both the in and the out direction, as illustrated by the sketch below.

The bodies are small relative to the distances between them. This not only holds for distances between systems, as described above, but even for intrasystem separations. In a proportionally down-scaled Solar System, viewed from a distance such that all bodies are in the scene, on an HD screen not even the Sun has a diameter of more than one pixel. As empty screens are not what this application aims for, it is appropriate to distort the model of the cosmos such that interesting parts are visible more easily. The sizes of the bodies have to be increased or distances have to be decreased, or a combination of both. It can be said that the focus at this stage of prototyping lies on the bodies and how they move, rather than on how they really look in detail. So, their trajectories should be visible, and it should be seen from a distance how they move along them. Additionally, the relative positions of the exoplanetary systems to each other are interesting. Therefore, a nice overview has to be provided which allows to analyze how they are distributed in space.
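The navigation constraints described above can be summarized as an arc-ball style camera whose position is always expressed relative to the focused body, with a clamped zoom distance. The following sketch only illustrates the idea; the class and member names are invented for this example and do not correspond to the camera implementation in GlobeEngine/ExoViewer.

#include <algorithm>
#include <cmath>

struct Vec3 { double x, y, z; };

// Arc-ball style camera attached to a focus body: the camera moves on a
// sphere around the focused object, so the object can never leave the view.
struct ArcBallCamera {
    Vec3 focus{0.0, 0.0, 0.0};   // position of the body in focus
    double radius = 10.0;        // current zoom distance
    double minRadius = 1.0;      // closest allowed zoom
    double maxRadius = 1000.0;   // farthest allowed zoom (avoids empty scenes)
    double azimuth = 0.0;        // horizontal angle in radians
    double elevation = 0.0;      // vertical angle in radians

    void rotate(double dAzimuth, double dElevation) {
        azimuth += dAzimuth;
        // Clamp the elevation so the camera cannot flip over the poles.
        elevation = std::clamp(elevation + dElevation, -1.5, 1.5);
    }

    void zoom(double factor) {
        // Zooming is limited in both directions, as described above.
        radius = std::clamp(radius * factor, minRadius, maxRadius);
    }

    Vec3 position() const {
        return {focus.x + radius * std::cos(elevation) * std::cos(azimuth),
                focus.y + radius * std::sin(elevation),
                focus.z + radius * std::cos(elevation) * std::sin(azimuth)};
    }
};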

3.1.1 Exoplanetary Systems

In principle, the Solar System and an exoplanetary system are very similar, apart from the latter potentially having more than one star. What is fundamentally different is the quality of the available data. By quality, completeness and certainty are meant here. For every missing data field of a celestial body, an assumption has to be made, otherwise it cannot be displayed. For instance, a missing eccentricity of an orbit can be filled with the value 0, or with the average eccentricity of all known exoplanetary orbits. Whatever the choice will be, despite probably not being a bad guess, a guess remains a guess. Similarly, the frequent presence of big sigmas, i.e. highly imprecise measurements, regardless of a potentially good accuracy, leads to problems regarding the visual representation of a body or its orbit. This issue rises in importance when the example is no longer a missing or imprecise eccentricity value, but a texture of a celestial body. For sure, the developer of this application lacks an astrophysical background, but the user would still like to see textures on an exoplanet instead of uniformly colored spheres. So, some default textures and shapes have to be assumed. Further, apart from this uncertainty visualization, it may be interesting how the dimensions (of body sizes and distances) differ from those present in the Solar System. For instance, is a given star bigger than the Sun? Which satellite in the system is the most similar to the Earth concerning the diameter? These questions are interesting to answer.

3.1.2 Small Body Objects

The main issue of the visualization of the Small Bodies is the fact that there is a vast quantity of them. The JPL Small-Body Database [JPL] contains more than 700,000 entries. Treating them equally to planets would probably lead to a chaos of orbital lines rendered on the screen; nothing would be recognizable and therefore the application would lose a lot of its value. This problem can be referred to as massive orbit rendering. There exist some techniques to preserve visually critical structures even if a lot of lines overlap, e.g. opacity optimization [GRT13]. To solve this problem, two distinct approaches are conceivable which can be applied appropriately. All orbits of the included small bodies are loaded, but the visibility can be managed on a category level. Since the small bodies are grouped based on orbital parameters [JPL], as described in subsection 2.1.1, the selection of only one of these categories then shows a bunch of similar trajectories. As an alternative or complementary method, it could be considered to render the orbits in different colors per category. The other option is the omission of the orbit and instead focusing on the movement of the small bodies. This is done by representing each object by only a dot, which is visible from any distance at the same size. It gives a nice general impression of the motion and the positioning of the small bodies as a whole. There is a similar challenge when it comes to label rendering. Firstly, there is not much value in displaying the highest number of labels possible on the screen, even if they are non-overlapping. Secondly, the labels have to be prioritized in some manner, i.e. a very small asteroid is maybe not that interesting to label, whereas a bigger one certainly is. Small body objects do not have a nearly round shape by definition, which makes the choice of a standard shape more difficult. Further, drawing this shape is in most cases a waste of computational power, since it only appears big enough when the focus is really close to it.

3.2 Location of Bodies in Space

How to determine the location of an entity if there is no fixed point in a system? What if the object one sees is no longer there but has moved away, just because light travels too slowly to observe the actual current position? These are issues which a human being is not confronted with in daily life. The concepts of commonly applied frames of reference are no longer valid. That is why the navigation in this thesis is simplified as far as possible: every system is assumed to have its center at a fixed position relative to the Sun, which is approximately valid for a short period of time. However, there is no commonly accepted object of reference when it comes to positioning. Everything is considered to be moving. Thus, in contrast to the apparent motion described in subsection 2.1.4, there is real movement of stars, referred to as proper motion [FGK15]. To exemplify this, Barnard's Star is considered, which exhibits the largest proper motion. It is located about 5.9 light-years from the Earth and has a proper motion of 10.3 arcsec per year, taking only 200 years to travel the angular diameter of the Moon across the Earth's sky. This may seem a very fast movement in astronomical terms, but it is negligible for the ExoViewer. [FGK15]


3.2.1 Modelling of Positions

For the sake of simplicity, it is assumed that the described proper motion does not exist; thus, the systems are placed at fixed locations over time in ExoViewer. To be able to make a three-dimensional representation of objects in space, a coordinate system has to be established. For the purpose of constructing a scene of exoplanets, a coordinate system originating in our own Solar System seems to be a reasonable idea. Admittedly, our Sun, which is located in an arm of the Milky Way galaxy, is probably not the center of the universe. However, it is likely to be approximately at the average position of all observable exoplanets, since they are commonly not assumed to be clustered in some region of the cosmos. The distribution of currently known exoplanets in space is not uniform, though. This is mainly due to NASA's Kepler telescope, which permanently points at the same constellations. So, most of the known exoplanets lie in its view cone. [BKB+10] Describing the position of an exoplanetary system in the cosmos then needs three parameters. Their names are right ascension (usually denoted as α), declination (usually denoted as δ) and a distance in any unit. In principle, this works like a simple spherical coordinate system with a few peculiarities in nomenclature and orientation. In figure 3.1, the equatorial coordinate system is presented. Right ascension is the equivalent of the geographical longitude. A complete round contains 24 h (hours), and is further subdivided into minutes and seconds. The so-called declination on the other hand is the projected geographical latitude. At the celestial equator it equals 0°, at the celestial north pole 90° and at the celestial south pole −90°. To illustrate the usual format of this data, the coordinates of Alpha Centauri are given [OEP]: α = 14h39m35s and δ = −60°50′14″. [DSZ11]

Figure 3.1: Equatorial coordinate system used for describing the position of celestial bodies on a transparent sphere around the Earth (shaded in the center). The celestial equator is the projected equator of the Earth, whereas the ecliptic is the projected imaginary orbit of the Sun around the Earth. The intersection of the celestial equator and the ecliptic where the Sun comes from the south side is called the vernal equinox, which is the origin of the coordinate system.

In addition to the equatorial coordinates, a distance has to be given to determine the position of the body relative to the Earth. Having the direction from the two angles α and δ, this vector only has to be stretched by the factor of the distance. Knowing the vector from the Earth to a body in space, it has to be converted to x-y-z coordinates to be suitable as input data for the application. For this, the right ascension is converted to degrees by adding the following three terms: the hour component is multiplied by 15 (the 360° of a circle are divided into 24 hours), the minutes are multiplied by 4^-1 (every hour is divided into 60 minutes) and the seconds are multiplied by 240^-1 (every minute is further divided into 60 seconds). The same thing is done for the declination, with the variation that it can be a negative value: the degree component is added to the minutes multiplied by 60^-1 and to the seconds multiplied by 3600^-1. This result is then multiplied by −1 if there is a minus sign indicating a negative value. By denoting the resulting terms as α′ and δ′, respectively, and the given distance to the object as d, the three coordinates x, y, and z can easily be inferred:

x = d ∗ sin(90° − δ′) ∗ cos(α′), (3.1)
y = d ∗ sin(90° − δ′) ∗ sin(α′), (3.2)
z = d ∗ cos(90° − δ′). (3.3)

To be accurate, the distance vector from the Earth to the Sun should be added to this result. However, being a vector of a length of approximately 1 au, with the nearest system having a distance of about 276,173 au (Alpha Centauri [OEP]), this step can be neglected in this case. [DSZ11]
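A possible implementation of this conversion is sketched below. The function and parameter names are chosen for illustration and do not necessarily match the geAstro code; the distance keeps whatever unit it is given in (e.g. parsec).

#include <cmath>

struct Vec3 { double x, y, z; };

const double kPi = 3.14159265358979323846;

// Right ascension (hours, minutes, seconds) to degrees: 1 h corresponds to 15 degrees.
double rightAscensionToDegrees(double h, double m, double s) {
    return h * 15.0 + m / 4.0 + s / 240.0;
}

// Declination (degrees, arcminutes, arcseconds) to degrees; the sign is handled separately.
double declinationToDegrees(double deg, double m, double s, bool negative) {
    double value = deg + m / 60.0 + s / 3600.0;
    return negative ? -value : value;
}

// Equations 3.1 - 3.3: spherical (alpha', delta', d) to Cartesian coordinates.
Vec3 equatorialToCartesian(double alphaDeg, double deltaDeg, double d) {
    const double toRad = kPi / 180.0;
    double alpha = alphaDeg * toRad;
    double polar = (90.0 - deltaDeg) * toRad;
    return {d * std::sin(polar) * std::cos(alpha),
            d * std::sin(polar) * std::sin(alpha),
            d * std::cos(polar)};
}

For Alpha Centauri (α = 14h39m35s, δ = −60°50′14″, d = 1.339 parsec), the two helper functions yield roughly α′ ≈ 219.9° and δ′ ≈ −60.8°, which equatorialToCartesian then turns into the scene position.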

3.2.2 Intrasystem Movement

As already mentioned, every body in a system is orbiting primarily around the barycenter of the (sub-)system. As a starting point of the calculation, the two-body problem could be considered, which defines the movement of the bodies respecting mutual gravitational influences [BMW71].


Figure 3.2: Two different settings of the two-body problem are shown. On the left, the bodies differ clearly in mass, for instance approximating a situation like Sun-Earth, whereas on the right both bodies have the same mass, e.g. similar to the situation Pluto-Charon. In the first case, the massive body is located at the barycenter (represented by the plus sign) with an approximate zero-movement and is orbited by the smaller body. Having a more balanced status regarding the objects’ mass as on the right, both bodies move in a similar way around the gravitational center of the system.

To keep the concepts and calculations as simple as possible, several assumptions are made. Of course, in a stellar system, there are more than two bodies moving around. The complexity of the calculation would explode; already an n-body problem [BMW71] with n equaling 3 is practically only numerically solvable. Further, a lot of systems, like the Solar System, are dominated by one massive object locating the barycenter close to itself, as shown by figure 3.2. Another reason for the simplification of having a fixed center is the fact that the data is available in this format [Rei12] [JPL]. There are no orbital parameters available for a central star. So every satellite's movement is calculated only in dependence of the position of the body it primarily orbits. This means for instance that the Moon's position is calculated relative to the Earth, ignoring any other gravitational influence. The calculation of the true anomaly is subsequently presented briefly, respecting the different cases of circular, elliptical, parabolic, and hyperbolic trajectories, which are illustrated in figure 3.3. For this purpose, Kepler's equation and its derivatives, describing the relationship between time and an object's place on an orbit, are applied. Kepler's equation was first derived by the German astronomer Johannes Kepler in 1609 [Kep93].


True Anomaly

The true anomaly ν describes the position of a celestial body which moves on an orbit. Together with the inferred radial distance from the central body r, the actual position vector (x-y-z position) can be defined as [r ∗ cos(ν), 0.0, −r ∗ sin(ν)], which still has to be adjusted considering the orbital parameters argument of periapsis, longitude of ascending node, and inclination [Not15]. Before developing the formulas for ν and r, the mean anomaly M has to be defined. It is exactly what the true anomaly would be if the body moved on a circular orbit: the mean motion in radians per day times the number of days elapsed since a time point at which the body was in periapsis (the point nearest to the central body in the orbital trajectory).

Figure 3.3: Schematic representation of orbit types (circular: e = 0, elliptic: 0 < e < 1, parabolic: e = 1, hyperbolic: e > 1), where e stands for eccentricity.

Elliptic Orbits

Elliptic orbits are characterized by 0 < e < 1, with e denoting the eccentricity [BMW71]. This is by far the most common type of orbit among the objects in the databases used for ExoViewer [OEP] [JPL]. The so-called eccentric anomaly E is defined by

M = E − e ∗ sin(E). (3.4)

To solve this equation for E, an iterative numerical approach is necessary. There are different methods of tackling this problem; Kepler himself showed one in his work [Kep93]. For this thesis, the one derived in [DSZ11] is applied. The true anomaly is then calculated as follows:

ν = 2 ∗ tan⁻¹( sqrt((1 + e) / (1 − e)) ∗ tan(E / 2) ). (3.5)

The radial distance is then immediately given as [BMW71]

r = a ∗ (1 − e²) / (1 + e ∗ cos(ν)) au, (3.6)

where a denotes the orbital parameter semimajor axis in astronomical units.
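For the elliptic case, equation 3.4 is typically solved with a few Newton-Raphson iterations, after which ν and r follow directly from equations 3.5 and 3.6. The sketch below shows one straightforward way to do this; it is not the exact iteration scheme from [DSZ11] that is used in the prototype.

#include <cmath>

// Solve Kepler's equation M = E - e*sin(E) for the eccentric anomaly E with
// Newton-Raphson iteration (elliptic case, 0 < e < 1). M is given in radians.
double eccentricAnomaly(double meanAnomaly, double e, int iterations = 10) {
    double E = meanAnomaly; // reasonable starting guess for moderate eccentricities
    for (int i = 0; i < iterations; ++i) {
        double f = E - e * std::sin(E) - meanAnomaly;
        double fPrime = 1.0 - e * std::cos(E);
        E -= f / fPrime;
    }
    return E;
}

// True anomaly from the eccentric anomaly, equation 3.5.
double trueAnomaly(double E, double e) {
    return 2.0 * std::atan(std::sqrt((1.0 + e) / (1.0 - e)) * std::tan(E / 2.0));
}

// Radial distance in au, equation 3.6, with the semimajor axis given in au.
double radialDistanceAu(double semimajorAxisAu, double e, double nu) {
    return semimajorAxisAu * (1.0 - e * e) / (1.0 + e * std::cos(nu));
}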

Circular Orbits

Orbits defined as circular have an eccentricity of e = 0 [BMW71]. Of course, the probability of finding such an orbit is approximately zero. The true anomaly ν equals the mean anomaly M, ν = M, and the distance r is the same as the semimajor axis a, r = a, as the orbit is a perfect circle [BMW71].


Parabolic Orbits

Parabolic orbits are also unlikely to occur, having an eccentricity of e = 1 [BMW71]. This type has a direct method to calculate the position, in contrast to the elliptic and the hyperbolic case, where approximations have to be made. If

W = t ∗ sqrt( 9 ∗ Γ / (8 ∗ q³) ), (3.7)

and

u = ( W + sqrt(W² + 1) )^(1/3), (3.8)

with t denoting the time since a passage through the periapsis, Γ the gravity parameter of the system (the heliocentric gravitational constant equals 1.32712440018 × 10^20 m³ s⁻² [JPL]), and q the distance of periapsis, then,

ν = 2 ∗ tan⁻¹( u − 1/u ). (3.9)

The distance from the central body is

r = 2 ∗ q / (1 + cos(ν)), (3.10)

again with q standing for the distance of periapsis. [BMW71]
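Because the parabolic case needs no iteration, equations 3.7 to 3.10 map almost directly to code. The sketch below assumes t is given in seconds and q in metres, so that the heliocentric gravitational parameter can be used in SI units; the unit handling in ExoViewer may differ.

#include <cmath>

// Heliocentric gravitational constant in m^3 s^-2 [JPL].
const double kGammaSun = 1.32712440018e20;

// True anomaly on a parabolic orbit (equations 3.7 - 3.9).
// t: seconds since the periapsis passage, qMetres: periapsis distance in metres.
double parabolicTrueAnomaly(double t, double qMetres) {
    double W = t * std::sqrt(9.0 * kGammaSun / (8.0 * qMetres * qMetres * qMetres));
    double u = std::cbrt(W + std::sqrt(W * W + 1.0));
    return 2.0 * std::atan(u - 1.0 / u);
}

// Distance from the central body on a parabolic orbit (equation 3.10).
double parabolicRadius(double qMetres, double nu) {
    return 2.0 * qMetres / (1.0 + std::cos(nu));
}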

Hyperbolic Orbits

Lastly, the hyperbolic case is discussed. Several comets happen to travel on hyperbolic trajectories, having an eccentricity e > 1 [BMW71]. The hyperbolic eccentric anomaly H is defined by

M = e ∗ sinh(H) − H, (3.11)

which, analogously to the elliptic case, cannot be solved analytically, but only by approximation, following one of the many existing step-by-step algorithms. Finally, the separation from the central body is calculated in the same manner as for elliptic orbits (equation 3.6).

3.3 Numerical Precision Aspects

The question of this section is whether to choose the float data type or the double data type to represent numerical floating point values in the prototype (data types defined in 2.4). The trade-off weighs speed against memory usage and against accuracy. For the moment, the memory restriction is adjourned, as it is not the first bottleneck and the whole framework the prototype is built on is not optimized in this respect anyway. The speed, on the other hand, is quite important. One of the development machines contains a GPU of the type 'NVIDIA GeForce GT 730M +', which has a so-called compute capability of 3.0. This implies that the number of operations per clock cycle per multiprocessor is 192 for 32-bit floating point (float) additions or multiplications, whereas the value for 64-bit floating point (double) additions or multiplications is 8. A difference of this magnitude certainly has to be taken into account when deciding which data type to take. [CUD] To exemplify the accuracy, positions of bodies in the scene are looked at. As a first step, the biggest and the smallest distance in the scene have to be estimated. The biggest distance in the Open Exoplanet Catalogue is 10,579.85 parsec, equaling 3.3×10¹⁷ km, which is the distance of the KOI-5485 system from the Sun. To have a conservative estimate of the size of each of the three axes of the global coordinate system of the scene, it is for the moment assumed that a system with the same distance exists in every arbitrary direction.

Thus, each axis should cover the range from -10,579.85 parsec to 10,579.85 parsec (in total 21,159.7 parsec or 6.529×10¹⁷ km), as the Solar System is the center of the coordinate system. Secondly, the available data types have to be looked at, which are float and double. The sizes of these two data types are 4 and 8 bytes, respectively [FTC]. They are implemented analogously on the GPU [WFF11]. The data type float offers 2^(4∗8) ≈ 4.295×10⁹ different values, whereas a double-typed object, with 4 more bytes available, has 2^(8∗8) ≈ 1.845×10¹⁹ different values. The gap between two values in the coordinate grid defined by the three axes needs to be small enough to conduct a visually smooth repositioning of a body on its orbit. Another argument is the requirement that the spacing should be small enough to always represent closely located bodies, e.g. planets and moons, without overlap. This lower limiting factor is a bit harder to define than the upper one; it is more an experimental determination than an analytical one. Table 3.1 illustrates different decision alternatives. As a conclusion of this table, the decision is taken to use double values for the global positions. A consequence of this decision is that the operations on the GPU are more expensive. In return, the precision with a minimum distance of less than 100 m is on a level at which even a satellite or space station in a low orbit around the Earth could be modeled.

Minimum Distance | Inferred Maximum Range (float) | Inferred Maximum Range (double)
10 m             | 4.295×10⁷ km                   | 1.845×10¹⁷ km
100 m            | 4.295×10⁸ km                   | 1.845×10¹⁸ km
...              | ...                            | ...
100,000,000 km   | 4.295×10¹⁷ km                  | 1.845×10²⁷ km
1,000,000,000 km | 4.295×10¹⁸ km                  | 1.845×10²⁸ km

Table 3.1: Table showing the desired minimum spacing of the coordinate system (in the leftmost column) and the total range for each of them, depending on the data type. The colors green and red symbolize whether the minimum distance, combined with the respective data type, is suitable for the purpose described in this section: conservatively estimated, the range has to be at least 6.529×10¹⁷ km in every direction.
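Note that an actual floating-point grid is not uniformly spaced; the spacing grows with the magnitude of the stored coordinate. The following small, self-contained check (names and the sampled distance are illustrative) makes the difference between float and double tangible at roughly the Earth's distance from the origin.

```cpp
#include <cmath>
#include <cstdio>

// Prints the grid spacing of float and double at about 1 au (1.496e11 m) from
// the origin of the scene, i.e. the step to the next representable value.
int main() {
    const double oneAuMeters = 1.496e11;

    const float  f    = static_cast<float>(oneAuMeters);
    const float  fGap = std::nextafterf(f, 2.0f * f) - f;
    const double dGap = std::nextafter(oneAuMeters, 2.0 * oneAuMeters) - oneAuMeters;

    std::printf("float  spacing near 1 au: %.3e m\n", static_cast<double>(fGap)); // roughly 1.6e4 m
    std::printf("double spacing near 1 au: %.3e m\n", dGap);                      // roughly 3.1e-5 m
    return 0;
}
```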

3.4 Acceleration Data Structures

In the majority of cases, only a fraction of the objects in the scene is really relevant when it comes to composing the finally rendered image. They can be too far away, too small, or occluded by others and therefore hardly have an impact on the synthesized frame. As draw calls on geometrical objects are expensive, the number of these calls has to be minimized. The first approach to achieve this is by building hierarchical data structures. Every stellar system, except for the one in focus, is too far away to recognize anything in it. Thus, objects not attached to the current system can be culled. Further, the same method can be applied to planets with their moons (currently only the case in the Solar System): if a planet is not visible, then it is likely that its moons are not either. A more sophisticated version of this approach would be to work with bounding boxes or even bounding spheres [AMHH08]. In this case, a sphere containing all the moons and the hosting planet could be tested for visibility. The term visibility has so far not been clarified. Its meaning in this particular case has two aspects: an object is visible if it can theoretically be seen from the current perspective, i.e. it is inside the view frustum and not occluded, and if it is big enough to have an impact on the final image.


3.4.1 View Frustum Culling

The view frustum culling method determines which objects are inside the frustum and which are outside. In the case of a spherical object (celestial body) or a bounding sphere, this calculation simply includes six sphere-plane tests. Given the positions of the limiting points of the view frustum, its restricting planes can be inferred. For every plane, a normal vector n is calculated such that it points outward of the view frustum. For instance, respecting the notation of figure 3.4 and writing Pᵢ for point i,

n_nearplane = (P₁ − P₀) × (P₂ − P₀). (3.12)

For a plane defined as a ∗ x + b ∗ y + c ∗ z = k, with a, b, and c representing the elements of a normal vector n and k a constant, the distance p of a point Q, which is the center of a sphere in this case, to that plane can be determined by

p = a ∗ Qx + b ∗ Qy + c ∗ Qz − k. (3.13)

If p < 0, the point lies potentially inside the view frustum; if p > 0, the point lies for sure outside the view frustum. In the latter case, the radius r of the sphere is brought into the calculation: if r > p, the sphere still potentially intersects the frustum, thus the test result is positive. The described test is conducted analogously for each of the six planes. If all the tests give a positive result, the sphere is not culled. This method can also be applied to the orbits, which is a bit more complicated, as they cover a much larger region of the scene than the object they are attached to. Thus, they cannot be hierarchically subordinated, but every fragment of the orbital object, which is a straight line, could theoretically be tested against the view frustum. By testing the starting and the end point of the line as well as enough samples in between, it can be determined whether parts of the line lie inside the truncated pyramid or not. Of course, the benefit of conducting this calculation for every part of the orbit has to be weighed carefully against the cost of doing so.
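A minimal C++ sketch of this sphere-frustum test is given below; the struct names and the plane representation are assumptions for this example and do not mirror the GlobeEngine classes.

```cpp
#include <array>

struct Vec3  { double x, y, z; };
struct Plane { Vec3 n; double k; };   // plane a*x + b*y + c*z = k, with n pointing out of the frustum

// Signed distance of point q to the plane, as in equation (3.13).
double signedDistance(const Plane& pl, const Vec3& q) {
    return pl.n.x * q.x + pl.n.y * q.y + pl.n.z * q.z - pl.k;
}

// Sphere-frustum test: the sphere (center c, radius r) is culled only if it lies
// completely outside at least one of the six planes.
bool sphereInFrustum(const std::array<Plane, 6>& planes, const Vec3& c, double r) {
    for (const Plane& pl : planes) {
        const double p = signedDistance(pl, c);
        if (p > 0.0 && p > r)   // fully on the outer side of this plane
            return false;       // cull
    }
    return true;                // potentially visible
}
```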

3.4.2 Distance Culling

As stated above, there are objects in the view frustum which are so distant and/or small that it is not worth the effort of drawing them. The mechanism to filter these objects is here referred to as distance culling. Every object in the scene has a distance to the camera, denoted by d in figure 3.4. This parameter is calculated simply by taking the length of the distance vector separating the center of the object and the center of the camera. The dimensions of the object, i.e. the radius r for spheres, are then compared to this distance. It can, for example, be decided to draw the object if d < 1,000,000 ∗ r. This is a straightforward method; in reality, the resolution of the screen, or more exactly the size of the frame buffer objects, is a crucial factor.
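Such a threshold test is only a few lines of C++; the names and the hard-coded factor below are illustrative.

```cpp
#include <cmath>

struct Vec3 { double x, y, z; };

// Distance culling: draw the object only if it is large enough relative to its
// distance from the camera. The factor 1,000,000 is the example threshold from
// the text; in practice it would be tuned to the frame buffer resolution.
bool passesDistanceCulling(const Vec3& object, const Vec3& camera, double radius) {
    const double dx = object.x - camera.x;
    const double dy = object.y - camera.y;
    const double dz = object.z - camera.z;
    const double d  = std::sqrt(dx * dx + dy * dy + dz * dz);
    return d < 1'000'000.0 * radius;
}
```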

Figure 3.4: View frustum of a camera determined by six planes: nearplane (defined by the points 0,1,2,3), farplane (4,5,6,7), bottomplane (0,1,5,4), topplane (2,3,6,7), rightplane (3,1,5,7), and leftplane (2,0,4,6). Line d represents the distance from the camera to an arbitrary object in the scene.

4 Implementation

This chapter revisits and deepens topics discussed in the theory part. It aims at explaining in detail the prototype developed within the scope of this six-month master thesis. The software is written in C++, using OpenGL and the Qt framework.

4.1 Architecture

The architecture of the application is in essence taken from the predecessor project [Nöt15]. The code is structured in different packages, presented subsequently, which have their particular responsibilities and tasks.

4.1.1 ExoViewer Package

The ExoViewer package is literally the starting point of the application since it contains the main.cpp file, which is executed. It creates an object of the geAstro::ExoEngine class and initiates the user interface (UI), since the package is the container for all elements concerning the UI. It constructs the interface as well as handles the user inputs. Figure 4.1 shows the structure of this package in more detail. It can clearly be seen that the ExoViewerUI class plays the most important role in this package, as it coordinates all the other classes.

Figure 4.1: The reduced structure of the ExoViewer package, with ExoViewerUI coordinating ExoViewerGLWidget, ExoViewerInputControl, VisibleLabelsDialog, BodySelectionDialog, NavigateToDialog, and ExoInfoWidgetClass. Only the central classes are represented, without member attributes and methods. The terms on the lines are equivalent to the reference names of the class at the arrowhead in the other connected class.


4.1.2 GlobeEngine Package

Apart from the code developed in this project, an important share of the application is the GlobeEngine package, which originates from the Visualization and MultiMedia Lab (VMML) of the University of Zurich. It provides a lot of basic elements which can be adapted and therefore facilitates the building of a rendering pipeline significantly. From ExoViewer's perspective, there are some important classes and features in this package. The ArcBallCamera is the camera integrated in ExoViewer. It provides a ready-to-use camera model and also handles the user input. Further, a lot of geometrical objects are provided, e.g. the Sphere class. Since celestial bodies are modeled as spheres for the sake of simplicity, this class is instantiated frequently. Its base class, which is at the same time the base class of all provided primitives (plane, cube, cone, etc.), is DrawableComponent.

4.1.3 geAstro Package

The geAstro package is essentially there to feed the application with frames of the 3D simulation. It can therefore be considered its core. The central class in this package is ExoEngine (crucial properties shown in figure 4.2), a derivative of the GlobeEngine class Engine. The ExoEngine class loads and holds pointers to all the shaders, it creates and manages the FrameBufferObjects, and it is the reference for the current date, which the scene is simulated for. The class is also central in that it holds all the loaded objects inside a vector object. With its three fundamental functions initializeScene(), drawScene(), and update(), it sets up all the needed components, executes all the draw calls of the objects to be rendered, and triggers the recalculation of the object positions, respectively. The important data model, which is also part of geAstro, is discussed in detail in section 4.3. There is a helper class called BodyCreator which is called in the initializeScene() method of ExoEngine. This class is responsible for loading the data into the application, from the local file system or from the Internet via the local storage. SolarComposition is derived from the GlobeEngine class Composition. As the name suggests, it composes the scene based on the previously filled FrameBufferObject instances.

geAstro::ExoEngine
  bodiesFbo: ge::FrameBufferObject, orbitsFbo: ge::FrameBufferObject, skyboxFbo: ge::FrameBufferObject, habZonesFbo: ge::FrameBufferObject
  outlineComposition: geAstro::SolarComposition
  runnables: vector
  currentDate: double
  systems: vector, systemInFocus: geAstro::System*
  selected: geAstro::Body*, arcballCenter: geAstro::Body*
  pickingevent: PickingEvent, pickingCoordX: int, pickingCoordY: int, mousePosition: vector2i
  initializeScene(string), drawScene(), update()

Figure 4.2: Important members of the ExoEngine class, contained in the geAstro package. The namespace ge is used for elements of the GlobeEngine package. PickingEvent is an enum of this class, which represents whether and which type of picking occurred, i.e. the selection the user made.


4.2 Use Cases

Figure 4.3 illustrates the desired functionality of the application. It shows what the user can do with the piece of software provided. Excluding the action ’Parametrize Application’, which takes place before executing the prototype, all the activities are done by interacting with the running program.

Figure 4.3: Use case diagram for the ExoViewer application, covering camera navigation (zooming, rotation, panning), selection of and navigation to celestial bodies, management of the visibility of bodies, orbits, labels, habitable zones, and unit circles, date and animation speed control, and the parametrization of the application before execution. Adapted from [Nöt15].

4.3 Data Model

This section relates to the data, i.e. information about celestial bodies, used in the application, and thus to how the steps between having the data in the external databases, described in section 2.2, and having it as a 3D representation in ExoViewer are designed. The data model of the celestial bodies is hierarchical. There is a parent class, Body, which is never instantiated. Every single object in the scene is an instance of a class derived from Body, which are Star, Planet, Moon, and SmallBody. Figure 4.4 shows how the data is structured inside the application. The chosen intermediate granularity of different body derivatives is refined by using the data structure 'enum'. This makes sense in cases where there are different types of an object but, apart from different names, everything is shared from a programmatic perspective. Referring to the notation in figure 4.4, the enum planetType has the following values: EXOPLANET, PLANET, DWARF_PLANET, HYPOTHETICAL_PLANET. At the same time, there are many more subtypes of the SmallBody class, which are distinguished by the enum smallBodyType. The values are: IEO, IMB, TNO,


ATE, MBA, PAA, APO, OMB, HYA, AMO, TJN, AST, MCA, CEN, HYP, ETc, PAR, CTc, JFC, JFc, HTC, COM. Please consider table 2.1 for the meaning of these abbreviations. However, this is not done for stars, where different categories also exist, as the types are not given by the consulted database as clearly as for small body objects. Stars could be further categorized by their spectral type, but this is a very fine-grained attribute. For this reason, only the color of a Star object is calculated based on the given spectral type and set as an attribute of the drawableObject. Each of the described objects, instantiated from a derived Body class, is mandatorily a member of a System. This fact is also illustrated in figure 4.4. A System object has as essential members a name, a position, and vectors of Body pointers. It represents the highest layer of the data model. The System object is neither a derivative of DrawableComponent of the GlobeEngine package nor drawable, i.e. it has no draw() function, but all its members are or have a drawable object.

Figure 4.4: UML class diagram representing the data model in a reduced manner, with the classes geAstro::System, geAstro::Body, geAstro::Orbit, and the Body derivatives geAstro::Star, geAstro::Planet, geAstro::Moon, and geAstro::SmallBody. Only relevant attributes are shown, and no functions are listed. For explanations of non-standard data types, refer to chapter 4, especially section 4.3.
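A strongly reduced C++ sketch of this hierarchy is shown below; the member selection and default values are illustrative and only loosely follow figure 4.4, not the actual geAstro code.

```cpp
#include <string>
#include <vector>

// Reduced sketch of the hierarchical data model described in section 4.3.
// Names follow figure 4.4; members are cut down to a few representative ones.
enum class planetType    { EXOPLANET, PLANET, DWARF_PLANET, HYPOTHETICAL_PLANET };
enum class smallBodyType { IEO, IMB, TNO, ATE, MBA, APO, OMB, AMO, TJN, MCA, CEN, COM /* ... */ };

struct Body {                       // never instantiated directly
    std::string name;
    double      radius = 0.0;       // 0 marks an unknown radius (body is not drawn)
    double      mass   = 0.0;
    virtual ~Body() = default;
};

struct Star      : Body { std::string spectraltype; };
struct Planet    : Body { planetType planetTypeEnum = planetType::EXOPLANET; };
struct Moon      : Body { };
struct SmallBody : Body { smallBodyType type = smallBodyType::MBA; };

struct System {                     // highest layer of the data model; not drawable itself
    std::string        name;
    bool               solarSystem = false;
    std::vector<Body*> bodiesInSystem;
};
```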

4.3.1 Data Sources

The data sources Open Exoplanet Catalogue and JPL Small-Body Database are accessible on the Internet as described in section 2.2. Both repositories can be downloaded either by manually executing a separate Python script, or by letting the application do it, setting the respective 'local' parameter in the parametrization file to false (subsection 4.5.2). The reason for doing this task in Python is the ease of implementation and the availability of convenient libraries. The scripts are written for Python 2.7 and import the libraries urllib, zipfile, and os for the Open Exoplanet Catalogue, and requests (available here: http://requests.readthedocs.org/)

and os for the JPL Small-Body Database. Of course, to use the scripts, all of the mentioned frameworks have to run properly and an Internet connection has to be available. In detail, the Open Exoplanet Catalogue is downloaded by the Python equivalent of the following pseudocode:

    zip_path = current_directory + "/catalog.zip"
    url = "https://github.com/OpenExoplanetCatalogue/open_exoplanet_catalogue/archive/master.zip"
    retrieve_and_save(url, zip_path)
    unzipped_data_folder = extract_zip(zip_path)
    unzipped_data_folder.rename("data")

The JPL Small-Body Database is a bit more tedious to get, as there are a lot of options to choose in the downloading interface. It is done by one HTTP request per available category. The crucial lines in pseudocode are the following:

    attributes = "AcApAsAtBgBhBiBjBkBlBmBoBp"  # coding of required attributes
    for category in all_available_small_body_categories:
        payload = assemble_payload(attributes, category)
        received_data = send_http_request("http://ssd.jpl.nasa.gov/sbdb_query.cgi#x", payload)
        filename = assemble_filename(current_directory, category)
        save_file(filename, received_data)

The data is downloaded and stored in CSV format. As an extension to the above functionality, a CSV-to-XML converter is written and located in the same directory in the code repository.

4.3.2 Data Import

This converter was produced because initially, all data was read in XML format. However, the implemented XML reader turned out to be rather slow. So, at least for the small body objects, the benefit of having a human-readable file format was dropped in favor of a speed-up when importing the data into ExoViewer. For the data formatted in XML, which is everything except the small body object files, a reader is implemented based on the TinyXML-2 parser (available here: https://github.com/leethomason/tinyxml2). Offering a lot of functionality, this reader empirically turned out not to be very lightweight. That is why, for the big data set of small bodies, an ad-hoc C++ CSV reader was put in place using the std::ifstream object.
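Such a reader can be sketched in a few lines; the following example is a simplified illustration and does not reproduce the field handling of the actual importer.

```cpp
#include <fstream>
#include <sstream>
#include <string>
#include <vector>

// Minimal ad-hoc CSV reader built on std::ifstream: one small body per line,
// fields separated by commas. Quoting and error handling are omitted.
std::vector<std::vector<std::string>> readCsv(const std::string& path) {
    std::vector<std::vector<std::string>> rows;
    std::ifstream file(path);
    std::string line;
    while (std::getline(file, line)) {
        std::vector<std::string> fields;
        std::stringstream ss(line);
        std::string field;
        while (std::getline(ss, field, ','))    // split on commas
            fields.push_back(field);
        rows.push_back(std::move(fields));
    }
    return rows;
}
```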

4.4 Graphics Pipeline

For each frame appearing on the screen, the color of every pixel is recalculated by the CPU and GPU. This non-trivial task is done in a multi-step procedure explained in more detail in this section. Figure 4.5 shows an overview of how this mechanism works. Before the rendering actually starts, the workload of updating the body positions is distributed such that the computing power of more than one core can be exploited (ExoEngine::setUpUpdateJobs()). The vectorial member variable runnables of the class ExoEngine is filled with pointers to MyRunnable objects. This class is a derivative of QRunnable. Every job (MyRunnable object) is assigned a number of bodies for which it has to conduct the positional update calculations. Later, the ExoViewerGLWidget calls the ExoEngine::update() and ExoEngine::drawScene() methods once per loop (initiated by the OpenGL framework), in this order. When the first one is called, it simply checks whether the new positions are ready or not by asking the QThreadPool, which manages all the jobs. If this is the case, the new coordinates of the bodies are set and the parallel calculations are initiated again. Otherwise, nothing is done. Whatever the answer of the QThreadPool may be, the draw() method is invoked right afterwards, either using recently calculated positions or, in case of a negative answer, older ones.

24 4 Implementation

Figure 4.5: Simple schematic high-level survey of the rendering process: within the GL rendering cycle, ExoViewerGLWidget triggers ExoEngine::update(), which asks the QThreadPool whether all jobs are finished (isDone()); if so, the new body positions are set (setAllBodyPositions()) and the jobs are restarted (startAllJobs()). ExoEngine::drawScene() is called afterwards in either case.
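The interplay of update() with the thread pool can be approximated by the following C++/Qt sketch. The class and function names (PositionJob, updateIfReady, updatePosition) and the atomic completion counter are assumptions made for this example; the prototype's MyRunnable and the isDone() check in figure 4.5 are organized differently in detail.

```cpp
#include <QRunnable>
#include <QThreadPool>
#include <atomic>
#include <utility>
#include <vector>

struct Body { void updatePosition(double /*date*/) { /* Kepler-based position calculation */ } };

// One job handles the positional update of a subset of all bodies.
class PositionJob : public QRunnable {
public:
    PositionJob(std::vector<Body*> bodies, double date, std::atomic<int>& finished)
        : bodies_(std::move(bodies)), date_(date), finished_(finished) {
        setAutoDelete(false);                 // jobs are reused every cycle
    }
    void run() override {
        for (Body* b : bodies_)
            b->updatePosition(date_);         // the costly calculation
        ++finished_;                          // signal completion
    }
private:
    std::vector<Body*> bodies_;
    double             date_;
    std::atomic<int>&  finished_;
};

// Called once per rendering cycle: restart the jobs only when all of them are done;
// otherwise the scene keeps drawing with the previously calculated positions.
void updateIfReady(std::vector<PositionJob*>& jobs, std::atomic<int>& finished) {
    if (finished.load() == static_cast<int>(jobs.size())) {
        // ... copy the freshly computed positions into the drawable objects ...
        finished.store(0);
        for (PositionJob* job : jobs)
            QThreadPool::globalInstance()->start(job);
    }
}
```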

4.4.1 Deferred Rendering Pipeline

In the above-mentioned ExoEngine::drawScene() function, in which the CPU essentially calls the programs on the GPU, geometrical objects are drawn in different steps. As the name of this technique implies, the rendering of the objects on the screen is deferred. As a first step, everything is drawn into so-called FrameBufferObject instances (FBOs) and at the end, a composition assembles them into a screen-sized image. This is shown in figure 4.6. The big benefit of the deferred rendering pipeline comes from the deferred lighting calculation of the bodies. The (celestial) bodies are the only objects which have to be included in the lighting calculation (planes, lines, and points are not shaded). That is why the bodiesFbo has an additional layer (number 3), which holds the normal of each fragment. Like that, the computationally expensive calculation of the lighting model is conducted at the end of the pipeline, once it is certain that this effort is not spent on occluded fragments. To illustrate this mechanism of the deferred rendering pipeline, figure 4.7 shows some interesting layers of the bodiesFbo object and the resulting frame. All the bodies are rendered into this FBO, filling four layers: color, depth, ID, and normals. The depth layers of the different FBOs are mostly used to determine the closest fragment. For the outline functionality (the fine red line on the selected body's borders), the ID layers are used. Part (b) of figure 4.7 shows this layer for the bodiesFbo. To clarify, the selected body in the composed image (d) of the very same figure is colored in red, other IDs in white. It can be seen that around the big body, a thin line is drawn in part (d). Note that the smaller body in the foreground is also separated by the outline. Analogously, the orbit of the selected body is colored. Of course, all the other FBOs are also included in the calculation of the final frame.

Shaders The above presented structure of FBOs allows quite trivial shaders for the single objects. As is normally the case in OpenGL applications, every drawable object is assigned two shaders, a vertex and a fragment shader. These GPU programs are kept very reduced, basically filling only layers of the bound FBO.


Figure 4.6: Overview of the ExoEngine::drawScene() method, with the four FBOs bodiesFbo, orbitsFbo, habZonesFbo, and skyboxFbo and their layers (color, depth, ID, and, for bodiesFbo, normals). Time increases from left to right. The main (drawing) tasks are shown in rounded boxes on the time line. The changes of the scene state are represented below, and the different FBOs, with all the layers and their associated data types, are above that line. At the top, the arrows symbolize that the last action (draw outlineComposition) is fed with references of all the FBOs.

To exemplify this approach, the point shaders (point.vert and point.frag), which are used for the small body point cloud, are chosen: the vertex shader essentially calculates the final position by applying the Model-View-Projection matrix, passes the category of the small body, which is set as a uniform, and sets gl_PointSize to 3. Subsequently, in the fragment shader, the category is used to look up the assigned color of the point, and the distance from the view point to the object is included in the determination of the final fragment color. All the shaders early in the pipeline are quite trivial; however, the big computational load has to be worked off at some moment. It is moved towards the end of the pipeline, i.e. the outlineComposition object, which is bound to a more complicated shader. Thus, the program outline.frag can be considered the main piece. The main task of this last shader in the pipeline consists of merging the different FBOs, which is in simplified terms done by comparing the values of the depth layers and taking the color of the FBO with the closest object in it. Taking up the example shown in figure 4.7, the calculation of the light is implemented as follows. For the current fragment, the value of the normal layer of the FBO bodiesFbo is read, denoted as vector n. Then the vector from the fragment to the light source l is required. Since the lighting is calculated in world space, the 3D fragment position has to be reconstructed. This can be done by left-multiplying the inverse View-Projection matrix with the normalized device coordinates (found with the aid of the depth value of the fragment, taken from the depth layer of bodiesFbo). Having the position in world space and the position of the light source (normally the system's star), set by the C++ code as a dvec3 uniform, the missing vector l can easily be inferred by subtraction, defining the fragment as the starting point of the vector. n · l, but at minimum a value of 0, is then the weighting of the diffuse component of the lighting model. Furthermore, some ambient lighting is added, such that fragments with normal vectors pointing away from the star are not rendered completely black. Of course, for this final step, the color value from the color layer of the bodiesFbo has to be considered.
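To make the reconstruction step concrete, the following CPU-side C++ sketch mirrors the described calculation; it uses GLM types for brevity (the prototype works with vmml and performs this in outline.frag on the GPU), and the ambient term of 0.1 is an assumed value.

```cpp
#include <algorithm>
#include <glm/glm.hpp>

// Reconstruct the world-space position of a fragment from its depth value, then
// evaluate a simple diffuse + ambient lighting model.
glm::dvec3 shadeFragment(const glm::dmat4& viewProjection,
                         double ndcX, double ndcY,          // fragment position in [-1,1]
                         double depth,                      // depth layer value in [0,1]
                         const glm::dvec3& normal,          // normal layer of bodiesFbo
                         const glm::dvec3& lightPosWorld,   // usually the system's star
                         const glm::dvec3& albedo)          // color layer of bodiesFbo
{
    // Normalized device coordinates, with the depth remapped to [-1,1].
    const glm::dvec4 ndc(ndcX, ndcY, 2.0 * depth - 1.0, 1.0);

    // Back-project into world space with the inverse View-Projection matrix.
    const glm::dvec4 h = glm::inverse(viewProjection) * ndc;
    const glm::dvec3 worldPos(h.x / h.w, h.y / h.w, h.z / h.w);

    const glm::dvec3 n = glm::normalize(normal);
    const glm::dvec3 l = glm::normalize(lightPosWorld - worldPos);

    const double diffuse = std::max(glm::dot(n, l), 0.0); // n . l, clamped at zero
    const double ambient = 0.1;                           // keeps the night side visible
    return albedo * std::min(diffuse + ambient, 1.0);
}
```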

4.4.2 Rendering Data Structures

The ExoViewer prototype contains only simple geometrical objects. Apart from the naturally roundish shaped stars, planets, and moons, also the small body objects are represented in a simplified manner, namely as spheres. The different data structures in the application are reviewed subsequently, starting with the spherical shape. Sphere is a class in the GlobeEngine package and a child of the DrawableComponent class. The geometrical mesh is created by giving two parameters, inferring the number of nodes by multiplication. Additionally, it can be textured by setting a texture handle. Both of them, the number of vertices as well as the resolution of the texture, should be chosen to optimally balance the resulting visual quality and the available resources.


Figure 4.7: Deferred rendering pipeline illustration showing three layers of the bodiesFbo (see figure 4.6) and the finally rendered scene: (a) layer 0 (color), (b) layer 2 (ID), (c) layer 3 (normals), and (d) the composed scene.

A sphere is defined to have 50 circles times 50 nodes, thus 2,500 vertices. The higher this number, the rounder the shape appears, but the more computationally intensive is the visualization. Similarly, the textures are loaded at a size of 1024x512 pixels, which is not the highest possible resolution, but goes easy on resources and gives a visually satisfying output. There is the option to view the small bodies not as spheres but as a point cloud. For this purpose, a PointCloud object of the geAstro package is created and filled with points, one per small body. When the draw() method is called on this object, the points are rendered as GL_POINTS. Every time the update() method finishes calculating the new positions of the small body objects, this structure is regenerated. Further, the orbits are drawn as a collection of GL_LINE_STRIP primitives. An Orbit object is drawable via the OrbitDrawable class, a member of the geAstro package. Again, the trade-off consists of rendering a round-appearing shape versus the computational effort of drawing the additional line segments. An orbit is implemented to be constructed of 200 straight parts. A circle (UnitCircle object), on the other hand, has 360 vertices. In fact, these numbers are chosen quite arbitrarily, but the unit circles are normally fewer in quantity and additionally rendered using instancing, thus an elevated number of vertices can be justified. The remaining structures are the ones of the habitable zone and of the skybox. On the one hand, the habitable zone is rendered as three different Ring objects. This is also a DrawableComponent-derived class in the geAstro package, which draws GL_TRIANGLES in the region between two defined radii. The habitable zone is composed of two optimistic border parts and the conservative middle ring. On the other hand, the skybox is represented by six textured planes forming a cube around the center of the system in focus.
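The latitude/longitude tessellation behind such a sphere can be sketched as follows; the Vertex struct, the function name, and the pole and seam handling are simplifications and do not reproduce the GlobeEngine Sphere class.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Latitude/longitude sphere tessellation: 'rings' * 'segments' vertices
// (50 x 50 = 2,500 in the prototype). Texture coordinates fall out of the two
// loop parameters; index generation is omitted.
struct Vertex { double x, y, z, u, v; };

std::vector<Vertex> buildSphere(int rings, int segments, double radius) {
    const double pi = 3.14159265358979323846;
    std::vector<Vertex> vertices;
    vertices.reserve(static_cast<std::size_t>(rings) * segments);
    for (int i = 0; i < rings; ++i) {
        const double v   = static_cast<double>(i) / (rings - 1);
        const double phi = v * pi;                        // 0 .. pi, pole to pole
        for (int j = 0; j < segments; ++j) {
            const double u     = static_cast<double>(j) / (segments - 1);
            const double theta = u * 2.0 * pi;            // 0 .. 2*pi around the axis
            vertices.push_back({ radius * std::sin(phi) * std::cos(theta),
                                 radius * std::cos(phi),
                                 radius * std::sin(phi) * std::sin(theta),
                                 u, v });
        }
    }
    return vertices;
}
```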


4.4.3 Qt

As a last step in the pipeline, the Qt elements are added onto the drawn scene, which is done by ExoViewerGLWidget of the ExoViewer package. In brief, as many objects as possible are labeled, i.e. a label is only placed if it does not overlap with an already drawn one. The following priority in labeling is implemented:

1. Selected object

2. Stars

3. Planets

4. Moons

5. Small body objects

6. Exoplanetary systems

Within a category, the priority is higher the greater the ratio between the body radius and the euclidean distance of the body to the camera. This means in simple terms that the closer and the bigger a celestial body, the higher the probability that it will be labeled. In the special case of the systems, there is no particular priority put in place; the order in the database is decisive. The one composition of Qt elements which is always drawn is the information box appearing when a system label is hovered. The exact position of a label is calculated using the View-Projection matrix and the viewport size of the active camera. With these numbers, the 3D coordinates of a body or a system can be transformed into a 2D screen position. The colors, sizes, and visibilities of the labels are defined by category and initialized with reasonable values. However, it is up to the user to customize these parameters on the fly.
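This projection step amounts to a few lines; the sketch below uses GLM types and an assumed function name, and flips the y coordinate because Qt places the origin at the top-left corner.

```cpp
#include <glm/glm.hpp>

// Projects the 3D world position of a body onto the 2D screen: apply the
// View-Projection matrix, perform the perspective divide and map the result to
// viewport pixel coordinates. Returns false when the body is behind the camera.
bool worldToScreen(const glm::dmat4& viewProjection, const glm::dvec3& worldPos,
                   double viewportWidth, double viewportHeight, glm::dvec2& screenPos)
{
    const glm::dvec4 clip = viewProjection * glm::dvec4(worldPos, 1.0);
    if (clip.w <= 0.0)
        return false;                                     // behind the camera
    const glm::dvec3 ndc(clip.x / clip.w, clip.y / clip.w, clip.z / clip.w);
    screenPos.x = (ndc.x * 0.5 + 0.5) * viewportWidth;
    screenPos.y = (1.0 - (ndc.y * 0.5 + 0.5)) * viewportHeight; // Qt's origin is top-left
    return true;
}
```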

4.5 User Experience

Since the application should help a user to gain new insights based on existing data and make the exploration of exoplanets as comfortable as possible, the user experience is an important facet in the development. This section overviews how a user can or should interact with the application.

4.5.1 Interaction Overview

The UI is essentially what the user sees of the application. Figure 4.8 shows this interface as a schema. Useful interaction mechanisms are presented here.

Camera Handling The camera handling is crucial for the user experience. Being always focused on a body, the camera can be navigated relative to it. It is controlled by the mouse. By clicking and dragging with the left mouse button, the camera is rotated around the object in focus, preserving the radial distance. Clicking and dragging with the right mouse button pans the camera. Scrolling up with the mouse wheel zooms in; analogously, scrolling down zooms out. The zooming is limited in both directions. Figure 4.9 shows how the maximal zoom factor is calculated, whereas the minimal one is fixed.


Figure 4.8: Schematic UI of the ExoViewer application. The interface is structured into the following most important parts: The main panel a) contains the central part of the prototype, the visualization of the stellar systems. On the right hand side, it is limited by the information widget b), which is composed of c), the container of different data tabs providing numerical and textual information about the currently selected body, and d) and e), the name and a potentially available image of this body, respectively. At the top of the screen, the menu bar f) is located, which has four extensible items: g) file, h) view, i) navigation, and j) debug. Another bar k) is spanned horizontally over the screen at the bottom, called the navigation bar. A button which focuses the camera on the Sun (l)), a widget containing the current date as well as a slider to adjust the speed of simulated time (m)), and a mechanism to jump to an arbitrary date in time (n)) are its elements.

Navigation to and Data View of Bodies To relocate the camera, which means to attach it to another body, several alternatives are implemented. The camera keeps the parameters it had on the previous body, i.e. the zooming and the position, except for the panning, which is reset in order to make sure the body is in the view frustum.

• The most intuitive method is to double click on the body or its label in part a) of figure 4.8.

• To navigate to the first body (usually the star) of another system, a double click on a system label, in the background of the scene, has to be executed. Hovering over it gives information on the bodies in the system.

• The dialog 'Navigate to Object' offers the possibility to look up any body loaded in the prototype by name. Entering the first letters, a list of bodies whose names start with this combination is presented.

• The menu option 'Planets in Habitable Zone' lists all the planets and exoplanets which are potentially habitable (conservatively and optimistically). Navigation is done by a single click on the desired body.

• Further, in the widget on the right hand side of the screen (figure 4.8 b)), information about the currently selected object is displayed. The hosting body or a satellite can be navigated to.

• A special body in the scene is the Sun (it is located at the origin of the coordinate system). That is why there is a Sun button at the bottom, which is always available.

• There is a back navigation functionality, available in the menu ('Back') or as the keyboard shortcut 'backspace'.


Figure 4.9: Calculation of the maximalzoomfactor (y-axis) of the ArcBallCamera, depending on the radius of the body in focus in astronomical units (x-axis). The function is y = 0.57 ∗ x^(−0.45), which empirically produced satisfying results. The zoom factor is integrated in the View matrix of the ArcBallCamera.
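Expressed as code, this is a one-line helper; the function name is chosen here for illustration only.

```cpp
#include <cmath>

// Maximal zoom factor as a function of the focused body's radius in au,
// following the empirically chosen curve from figure 4.9.
double maximalZoomFactor(double radiusAu) {
    return 0.57 * std::pow(radiusAu, -0.45);
}
```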

Managing Visibility of Different Elements Subsequently, a brief overview of the most important visibility options and why they might be interesting is presented. These options can be clicked in the menu 'View' or triggered by the assigned keyboard shortcut 'v'.

• Body visibility: Every body category can be set to visible or not and even whole systems can be hidden. This option is especially interesting in the Solar System, where a lot of bodies are present, e.g. it allows focusing on one group of small bodies or on planets only.

• Orbit visibility: For the objects moving on an orbit, the trajectory can be hidden, again for the purpose of not getting lost in the vast number of lines. As a side effect, the rendering is sped up.

• Label visibility: Per default, as many objects as possible are labeled in the scene such that there is no overlap. This labeling can be managed on the basis of the categories. Having the additional option to adjust the sizes and the colors of the labels allows focusing on what is considered to be interesting. With a lot of objects loaded, hiding labels speeds up the application significantly.

• Habitable zone visibility: The habitable zone can be made visible to see the estimated region where planets are potentially capable of sustaining life.

• Unit circles visibility: To better grasp the distances in space, concentric circles can be displayed. These circles can be modified in number and spacing.

4.5.2 Parametrization

When running the application, the user has to pass a command line argument which is the parametrization file, normally located at trunk/content/astro/paths/paths_exo.xml within the application repository. The modification of this file is the only way for the user to influence which data is loaded and how. The file structure is defined by an XML Schema; its elements are described below.



The content of these elements, with values and attributes, is now described in detail, referenced by name. If the value is a path, the default value relative to the project root is given in brackets:

shaders Path to the directory containing all the shaders (trunk/shaders/).

textures Path to the textures for the bodies in the scene (trunk/content/astro/textures/textures_resized/).

pictures Path to the pictures which are displayed as a part of the body information (trunk/content/astro/pictures/).

skybox Path to the images used for the skybox (trunk/content/astro/skybox/).

habZones Path to the XML file containing the precalculated data on which the estimation of the habitable zone is based (trunk/content/astro/data/HZs.xml).

openExolanetSystems Path to the folder where the exoplanetary data is located (trunk/content/astro/data/OpenExoplanetCatalogue/).

local (attribute of element 'openExolanetSystems') Boolean determining whether the Open Exoplanet Catalogue is already local (true) or not (false).

solarSystem Name of the file in the Open Exoplanet Catalogue containing data about the Solar System (Sun.xml).

numberOfSystemToDisplay Integer representing the number of exoplanetary systems which the prototype will try to load. This number does not include the systems additionally declared in a 'system' element.

system Name of an XML file in the Open Exoplanet Catalogue which is to be loaded (e.g. Kepler-186.xml).

ssBodies Path to the XML file containing data about the Solar System (trunk/content/astro/data/oldBodies.xml).

showMoons Boolean determining whether to load moons (true) or not (false).

jplSmallBodies Path to the folder where the JPL Small-Body Database is located (trunk/content/astro/data/JPLSmallBodyObjectsDatabase/).

local (attribute of element 'jplSmallBodies') Boolean determining whether the JPL Small-Body Database is already local (true) or not (false).

displayPerCategory Integer which gives the maximum number of small bodies loaded per category (as defined in table 2.1).

showSmallBodies Boolean determining whether to load the small bodies (true) or not (false).

smallBodiesAsPointCloud Boolean stating whether to load the small bodies as a point cloud (true) or as geometrical objects with an orbit (false).


4.5.3 Seeing the Essentials

Referring to section 3.1, how are the challenges stated there solved in the implemented software? Helping the user not to get lost in space is the first task. This is done by attaching the camera to an object. Further, a lot of different navigation methods are provided to find the body which is looked for. At the same time, random exploration is also made easy. To have a general view of all the systems, an overview map is provided. Furthermore, the rendering of a huge amount of bodies is a central topic. The solution here consists of being able to manage the visibility on a quite fine-granular level, based on the types of celestial bodies. Going further, the small body objects can be represented as dots rather than actual bodies, colored per category. In order to distinguish the orbits of the different object classes, different colors are chosen, as illustrated in the color table 4.1. The stellar orbital color, which is implemented to be pinkish, is missing there, because it is only a theoretical value, i.e. currently no stellar orbit exists in the prototype.

Table 4.1: The four orbital colors have maximal distance in RGB space. a) is the color of planetary orbits, b) the one for the moons' orbits, c) the one for small body object trajectories, and d) is the orbital color of the selected body (at the same time its outline color).

To convey the notion of a 3D space, different mechanisms are established: the orbits fade out the further away they are from the camera, a fixed background (skybox) is installed, and labels are more likely to appear for close objects, to name the most important examples. Concerning the exoplanetary systems, the uncertain data is one of the main issues. This is handled by adopting straightforward measures: No system is ignored in the process of loading because of missing or uncertain data. The confidence intervals which are given in both the Open Exoplanet Catalogue and the JPL Small-Body Database are ignored completely, whereas missing values are filled with average assumptions. A special case is the size: An exoplanet with a missing radius is rendered with a radius of 0, and is thus not visible. However, the trajectory and the label are shown in the system and the available data can be viewed in the information widget. Also for the texture assignment, very arbitrary assumptions are made in order not to represent the bodies as plain spheres. It is likely that the user is interested in comparisons of exoplanetary systems with the Solar System. For this purpose, when navigating to a system via a system label, the hover functionality provides information about the radii of the bodies compared to the Sun or the Earth. Additionally, the unit circles offer a means to compare distances.

4.5.4 UI Design Trade-offs

This subsection serves as a reflection on design decisions concerning the user interface. The benefits and drawbacks of certain aspects of the UI are discussed here.

Fix attached camera The fact that the camera is always attached to a body avoids getting lost in space, but in return, the feeling of being in a free-floating spacecraft is lost to a certain degree. Furthermore, the focusing on another body happens instantaneously, since a smooth transition between bodies would be unspectacular and slow.

Uncertainty visualization Leaving aside the relatively high degree of uncertainty in the exoplanetary data results in a tidy and simple user interface, but may convey a distorted image of reality to the user. The same holds for filling in the gaps of missing data, especially the texture.


Flexibility in label visualization Adjusting the appearance of labels gives a high degree of freedom, but at the same time, the user may be overwhelmed by these possibilities and would perhaps prefer a simple and good default setting.

Information widget On the one hand, providing numerical and textual data in a widget is a good supplement to what is seen in the 3D simulation; on the other hand, the attention could be distracted by reading numbers which are probably not very helpful in understanding what is represented in the rendered scene, of course always depending on the previous knowledge of the user.

Small bodies visualization as point cloud An advantage of the point cloud is that it is computationally a lot less expensive, i.e. many more bodies can be included. On the downside, the representation of a single body as just a dot is of course far from reality. The different colors of the categories are good to get an idea of the composition of the cloud; they can be distracting, though, when a user would like to observe all small bodies as a single entity. The point cloud gives an idea of the motion, in contrast to the representation with a trajectory per body, where that effect is minimized.

5 Results

As this is a work in the field of visualization, the results are mainly in the form of screenshots. The crucial elements of the parametrization, specified in more detail in subsection 4.5.2, are given per screenshot. For images without upper, lower, and right bar, only part a), or a fraction of it, as defined in figure 4.8, is shown. Please, as for every figure of this work, consider the appendix for digital versions of the images. For the printed edition, the contrast of some of the following screenshots is slightly adjusted. Figure 5.1 illustrates the Solar System in a survey, without small body objects. The Sun is selected and therefore labeled in red, as well as shown in the information widget on the right. The image exemplifies the depth effect of the orbits. The trajectory of Planet Nine, being far off in the background, is not visible, whereas orbital parts close to the camera are less transparent. The same effect can be seen with the lunar orbits (colored in green). When an object is picked, as in figure 5.2, it gets highlighted. The borders of the celestial body and its potentially existing orbit are marked in a reddish color. It can be seen that the orbit is also colored where the body is in the background, as at the bottom right of the label 'Earth'. Further, no orbit depth effect is applied in the picked state, so that the whole trajectory is visible. Figure 5.2 also illustrates the lighting model nicely, by presenting a bright day side and a darker night side. Both of the above-mentioned figures show the effect of the so-called skybox, i.e. the fixed background texture. However, as seen later in this chapter, it can be desirable to switch this texture off. The labeling for both figures is only turned on for stars and planets.

Figure 5.1: Showing the Solar System with the Sun in the center, and all the planets, dwarf planets, hypothetical planets, and moons. Labeling of distant systems is turned off. displayPerCategory attribute equals 0.


Figure 5.2: Earth is the center body of the ArcBallCamera, orbited by the Moon. It is selected such that the outline highlights the body as well as the orbit.

5.1 Exoplanetary Systems

Figure 5.3 shows one of the alternatives to navigate to another stellar system, namely by double clicking on a system label. Before doing so, the hovering functionality provides basic information about the system. In the presented case, the system contains five planets, all slightly bigger than the Earth, and one star with half the radius of the Sun. The picture is taken with the Solar System planet Jupiter in focus and its moons labeled.

Figure 5.3: The information box contained in this figure in the top left part appears when hovering over a system label. The number of planets and stars in the hovered system can already be inquired, additionally labeled with the size of the bodies: for planets in multiples of the Earth radius (RE) and for stars in multiples of the Solar radius (R☉).

Once navigated to a system, the focus is initially on the first body (as given by the database), which is usually a star. Figure 5.4 shows a survey of an exoplanetary system (Kepler-186). The habitable zone is illustrated in green in two nuances of transparency. The more opaque ring represents the conservative estimate, whereas the more transparent extensions stand for the optimistic alternative. Additionally, 20 unit circles are displayed with a spacing of 1 au. This setting makes it possible to recognize the presence of an exoplanet in the habitable zone of the system. The combination of the knowledge gained from the information box in figure 5.3 and from the information widget

containing data on the star type makes the location of the habitable zone, compared to the one in the Solar System, intuitively plausible.

Figure 5.4: System Kepler-186 with the habitable zone and unit circles displayed.

With more exoplanetary systems loaded, it can quickly get crowded in space (figure 5.5). Note that all the labels are shaded such that they remain visible, even if positioned on a bright or scattered background. To clearly distinguish bodies from systems, the labels representing other systems are per default not rendered in white but in a gray tone. Additionally, in the very same figure, the 2D map at the bottom shows clearly that the known exoplanetary systems are clustered around the Solar System. The visualization of uncertain values is illustrated in two different aspects. Figure 5.6 shows the texturing alternatives for exoplanets in the application. The texture is chosen mainly depending on the semimajor axis orbit attribute of the exoplanet. Further, there are several exoplanets with an unknown radius. These objects are not drawn at all, but labeled, as exemplified by exoplanet 11 Com b in figure 5.7.

5.2 Small Body Objects

The small body objects are only available for the Solar System, thus the following screenshots always show this stellar system. There are two fundamentally different alternatives of displaying these asteroids and comets. On the one hand, they can be loaded as a point cloud. To make the point cloud more informative, a different color is chosen for every category. In figure 5.8, all the loaded small bodies are in the scene, whereas in figure 5.9, only one category is made visible. Like this, groups can be recognized and the overall movement can be tracked. On the other hand, the small body objects can be displayed in a similar manner to planets, i.e. as a body orbiting on a drawn trajectory. Figures 5.10 and 5.11 illustrate this analogously to the point cloud case above. Showing the small bodies in this way, the points are transformed into spheres, which are textured with a moon-like texture to appear rocky. In figure 5.12, an arbitrary asteroid is picked to exemplify this fact. Lastly, the labeling issue referred to several times is presented for the small bodies. Figure 5.13 shows what happens when the labeling of the small bodies is turned on, using the default label size. The labels are so numerous that the point cloud in the background is no longer clearly recognizable. It can also be seen that the prioritization of the labeling activity is working properly, since all stars and planets in the scene are marked. The same holds of course for figure 5.14, which shows the labeling of small body objects for the non-point-cloud case.


Figure 5.5: View of the arbitrarily chosen exoplanetary system CoRoT-25. numberOfSystemToDisplay equals 300. On the bottom left, the 'Overview Map', available in the top bar (Navigation, Overview Map; shortcut M), is displayed. The yellow dot represents the Solar System.

As will be presented in the next section, this label rendering takes a huge amount of time. Thus, in its current form, it is hardly applicable for real-time rendering.

5.3 Measurements

A big part of the ExoViewer application was developed and analyzed on the following machine: a Lenovo ThinkPad T440p with a 4-core CPU of the model i7-4710MQ, 8 GB RAM, and a GPU of the model Nvidia GeForce GT 730M +. On this Windows 10 Pro computer, the IDE used is Microsoft Visual Studio Professional 2013. Subsequently, some empirical performance results are presented. The application is executed directly from the IDE in 'Release' mode. The basic scenario for this section of measurements is the following parameter setting: numberOfSystemToDisplay = 20, showMoons = true, displayPerCategory = 20000, showSmallBodies = true, smallBodiesAsPointCloud = true, and the databases are defined to be locally available. In general, numbers refer to this parametrization unless stated otherwise.

5.3.1 Application Initialization

The first interesting point is the loading of the celestial bodies into the application. The time for loading the Solar System accounts for roughly 10.2 seconds; this includes the Sun, all the planets, and the moons. When it comes to asteroids and comets, figure 5.16 shows the loading time for the 92,137 small bodies loaded into the prototype. This gives an average loading time per small body of 12,433 ms / 92,137 ≈ 0.13 ms. The import of bodies from the databases into the application is only a small part of the total time for initializing the program, as shown in figure 5.15. It can clearly be seen that the loading activity does not account for the biggest share of the total time, namely only around a third.



Figure 5.6: Simple texturing approach for exoplanets: Two different textures are available, either the Earth-like (a) or the Jupiter-like (b). (a) is located in the habitable zone of the system, represented by the green plane; (b) is located outside of it.

Actions in the 'other' category are manifold, e.g. reading the input parameter file or loading the shader programs. However, this time span correlates strongly and positively with the number of loaded bodies: the orbit creation forms part of these remaining activities, as well as the setup of the jobs for the parallel position calculations. In contrast, loading the small bodies without the point cloud option turned on (smallBodiesAsPointCloud = false) takes a lot longer. For displayPerCategory = 10, 180 actual bodies are loaded and the average loading time is around 64.3 ms per body. Additionally, as stated earlier in this work, the memory is an issue: these 180 small body objects occupy nearly 1 GB of RAM. So it depends highly on the available memory resources how many small bodies can be loaded with the parameter value smallBodiesAsPointCloud = false. Continuing with the previously presented parameter setting, the exoplanetary systems are loaded as a last step of this initialization. The accumulated time for the loaded systems is 2.9 seconds, with an average time of 162.8 milliseconds and a standard deviation of 75.4 milliseconds. Since the exoplanetary systems do not differ much in their absolute number of bodies, this loading time does not vary without limits. Still, the maximum value in this measurement, 403 milliseconds (Kepler-186, containing six bodies), is around four times the minimum value of 103 milliseconds (2M 2140+16, containing two bodies).

5.3.2 Running Application

The more interesting part of the measurements takes place once the application is initialized and running. The important aspect is how fast the program is able to produce the frames the user wants to see. Figure 5.17 shows a graph of frame times. It can be said that the time needed to render a frame is highly dependent on what has to be shown. An extreme example is peak d) in figure 5.17, which shows the frame time with small body labels. Between h) and i), there are significantly fewer bodies in the scene than when being located in the Solar System. Further, when toggling some features or opening a dialog, the number of frames per second drops. As figure 5.18 illustrates, the activity of calculating the new positions of bodies takes a lot of time. Note that, since these computations are conducted independently, this time is not the equivalent of, or contained in, the frame time. The sudden reduction of the update time after approximately two thirds of the taken sample is due to the creation of the point cloud, which is not done when being in an exoplanetary system.


Figure 5.7: Visualization of an arbitrary exoplanet, 11 Com b, with unknown radius.

Figure 5.8: View of the Solar System with small bodies rendered as a point cloud. Labels and skybox are turned off, moons are in the scene (green orbits). smallBodiesAsPointCloud = true, displayPerCategory = 80000. 153,388 small body objects are loaded. Every category is colored differently; the biggest groups here are MBA in red, IMB in yellow-green, OMB in light purple, APO in dark purple, TJN in light blue, and MCA in white.


Figure 5.9: The same parameters as in figure 5.8, but with only one category shown (TNO in this case), result in this image.

Figure 5.10: Solar System with displayPerCategory = 50 (778 small bodies loaded) and smallBodiesAsPointCloud = false. Labels are turned off. All the small body objects have a yellow trajectory.


Figure 5.11: Visibility of small body objects is limited to the MBA category. Same parameters as in figure 5.10.

Figure 5.12: Asteroid 2 Pallas captured from a very small distance. smallBodiesAsPointCloud = false. Labeling functionality is not active for small bodies.


Figure 5.13: This image originates from the same parametrization as in figure 5.8. For this view of the inner Solar System, all the labels for stars, planets and small body objects are shown.

Figure 5.14: Screenshot focusing on the inner Solar System with the same parametrization as in figure 5.10. All labels for stars, planets and small body objects are displayed.


(Chart for figure 5.15: 'Total Loading Time' in milliseconds, Exoplanetary Systems 2,947; Solar System (planets and moons) 10,233; Small Body Objects 12,433; other 44,940.)

Figure 5.15: Total loading time of the application for the parameter setting presented in section 5.3. Total loading time is defined as the duration of the ExoEngine::initializeScene() method. The time is on the x-axis and flows from left to right. The total time is 70,553 milliseconds, split into the actual time for data loading and 'other' activities.

(Chart for figure 5.16: 'Small Bodies (point cloud option) Loading Time', per-category loading times in milliseconds, x-axis from 0 to 14,000.)

Figure 5.16: Diagram showing the loading time of small body objects for the scenario introduced in section 5.3. The time in milliseconds increases from left to right on the x-axis. The total elapsed time is about 12.4 seconds. The bar includes a differently shaded part for every category, labeled with its abbreviation, the number of loaded bodies in brackets, and the time needed in milliseconds on the second line. The last category ('other') represents the time spent on operations which are not directly the loading of the bodies, e.g. the opening of the CSV files.


(Chart for figure 5.17: 'Frame Time', frame time in milliseconds on the y-axis (0 to 10,000) over roughly 900 frames on the x-axis, with annotated events a) to k); the peaks d) at roughly 70,000 ms and f) at roughly 12,000 ms exceed the plotted range.)

Figure 5.17: Diagram representing the frame time in milliseconds on the y-axis and the number of frames on the x-axis. It shows an arbitrary navigation through the prototype of approximately 900 frames in an accumulated time span of around 450,000 milliseconds. The average frame time is 496.12 milliseconds (dropping to 399 when ignoring the four samples over 10,000). Some peculiarities of this curve are the following: a) creation of the dialog reachable under Navigate, Navigate To Object..., b) navigation via this dialog to the Earth, c) navigation back to the Sun, d) labels of the small body objects are switched on, e) labels of the moons are switched on, f) and g) fullscreen functionality is toggled, h) navigation to Kepler-186 via a system label in the background, i) navigation back to the Sun, j) creation of the body visibility dialog and subsequent switching off of the small body objects, and k) switching off the lunar objects using the same dialog.

(Chart for figure 5.18: 'Jobs Timing', update job duration in milliseconds on the y-axis (0 to 1,400) over roughly 400 executions on the x-axis.)

Figure 5.18: Plot showing the performance of the update jobs calculating and setting the new positions, in milliseconds. The drop around execution 250 is attributed to the navigation out of the Solar System. The average for the presented sample lies at 272 milliseconds.

6 Discussion

The discussion chapter serves as a reflection on what this thesis contributed to the understanding of the problems and challenges that existed at the beginning of the process. Each subject of discussion is treated in the section that is influenced the most by it. Starting at the beginning of the process, we learned about the external databases in this field. A variety of them is available on the Internet, and every repository has its advantages and disadvantages. The choice of the database or databases should be based primarily on the purpose of the application; for our case, several of the repositories would be sufficient. We can also confirm the finding of Goodman [Goo12] that the lack of dominant data representation formats in astronomy is a handicap in general.

6.1 Exoplanetary Systems

Continuing with the topic of data, the uncertainty and nonexistence of data is a major issue, since there is hardly an exoplanetary system with a complete and reasonably certain data set, as outlined by Hassan and Fluke [HF11]. We did not expect the data to be this vague; as a consequence, the focus moved further towards uncertainty visualization. During the development phase, we realized that a constant comparison of basically every facet of a stellar system (radii of bodies, textures or colors of bodies, distances, habitability, etc.) with our Solar System was part of the discussions. This seems to be a very interesting aspect of exploring exoplanetary objects, which was not recognized as such in the beginning. Moreover, the systematic navigation to the different celestial objects serves, among other things, the purpose of exoplanet research: the user has to be led through this jungle of objects, most of which are of no special interest. This thought has to be considered throughout the design of the application.

6.2 Solar System

As the bodies in the Solar System are very numerous, it has become clear that coloring (orbits) per object type helps to preserve orientation. The question is how far to go with different colors, i.e. whether to distinguish the objects in a very fine-grained manner, as is done with the 22 different small body colors in the point cloud, or to only differentiate the small bodies as one big category from planets and moons, as in the non-point-cloud case. The labeling, apart from being computationally very intensive, becomes a problem when too many labels are in the scene. A default setting is provided, but several parameters can be changed by the user. As with many questions of this type, e.g. the granularity of the coloring, the core point is how much influence is given to the user. Why should it be possible to slow down the application to a frame time of 70 seconds, which then does not even produce visually appealing frames? It seems appropriate to give some degrees of freedom to the user, but a minimum quality of the produced frames and a minimum level of performance should be guaranteed. Further, the massive orbit rendering that occurs when many small bodies are loaded outside the point cloud was not foreseen to be an issue of this dimension at the beginning. The results demonstrate that a high quantity of orbits filling areas nearly without gaps is quite useless, especially without an algorithm that conserves some information about the local direction and depth of the rendered orbits. So, we completely share the principal idea of Günther et al. [GRT13] as well as of Mallo et al. [MPSS05].


The point cloud representation of the small body objects is an alternative to the classical way of displaying them. It conveys a very nice notion of the collective movement. This method takes a significant step away from the idea of rendering a close-to-reality image, but offers a lot of other advantages. "The more realistic the better" has never been the credo of this work, as for instance the scaling of distances implemented earlier shows, and realism remains a subordinate consideration.
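To illustrate why this representation is cheap to draw, the following sketch uploads all small body positions into a single vertex buffer and renders them as GL_POINTS with one draw call. The helper names and the attribute location are assumptions rather than the GlobeEngine API, and a valid OpenGL context with a suitable shader is presumed.

#include <GL/glew.h>
#include <vector>

// Hedged sketch (assumed names, not the GlobeEngine API): all small body
// positions live in one vertex buffer and are drawn as GL_POINTS, so the whole
// cloud costs a single draw call. A valid OpenGL context and a bound shader
// with a vec3 position attribute at location 0 are presumed.
struct PointCloud { GLuint vao = 0, vbo = 0; GLsizei count = 0; };

PointCloud createPointCloud(const std::vector<float>& xyz) {   // x, y, z interleaved
    PointCloud pc;
    pc.count = static_cast<GLsizei>(xyz.size() / 3);
    glGenVertexArrays(1, &pc.vao);
    glGenBuffers(1, &pc.vbo);
    glBindVertexArray(pc.vao);
    glBindBuffer(GL_ARRAY_BUFFER, pc.vbo);
    glBufferData(GL_ARRAY_BUFFER, xyz.size() * sizeof(float), xyz.data(), GL_DYNAMIC_DRAW);
    glEnableVertexAttribArray(0);
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, nullptr);
    glBindVertexArray(0);
    return pc;
}

void drawPointCloud(const PointCloud& pc) {
    glBindVertexArray(pc.vao);
    glDrawArrays(GL_POINTS, 0, pc.count);
    glBindVertexArray(0);
}

Because the whole cloud lives in one buffer, updating all positions amounts to rewriting that single buffer (e.g. with glBufferSubData) instead of touching tens of thousands of individual drawables.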

6.3 Performance

It is obvious that the more bodies are drawn, the higher the frame time becomes. Thus, the goal has to be to draw as few bodies as possible without affecting the visual output of the program. The culling strategy is a central aspect of the drawing algorithm. This insight is not really new, but it has been confirmed by this work; chapter 14, 'Acceleration Algorithms', of Akenine-Möller et al. [AMHH08] is essential here. What has not been measured, but became obvious during the development process, is that the GPU performs very well, which coincides with Kapferer and Riser [KR08].
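As one concrete instance of such a strategy, the following hedged sketch performs a simple distance culling test; the names and the pixel threshold are illustrative assumptions and not the thesis code. A body is only handed to the renderer when its projected diameter on screen exceeds a minimal number of pixels.

#include <cmath>
#include <vector>

// Illustrative distance-culling test (hypothetical names and threshold, not the
// thesis code): a body is only drawn when its projected diameter on screen
// exceeds a minimal number of pixels.
struct CullableBody { double x, y, z, radius; };

bool worthDrawing(const CullableBody& b, double camX, double camY, double camZ,
                  double fovY, int viewportHeight, double minPixels = 1.0) {
    const double dx = b.x - camX, dy = b.y - camY, dz = b.z - camZ;
    const double dist = std::sqrt(dx * dx + dy * dy + dz * dz);
    if (dist <= b.radius) return true;  // camera is at or inside the body
    // Approximate projected diameter in pixels for a perspective projection.
    const double pixels = (2.0 * b.radius / dist) * viewportHeight / (2.0 * std::tan(fovY * 0.5));
    return pixels >= minPixels;
}

// Usage: collect the visible bodies once per frame before issuing draw calls.
std::vector<const CullableBody*> cullByDistance(const std::vector<CullableBody>& bodies,
                                                double camX, double camY, double camZ,
                                                double fovY, int viewportHeight) {
    std::vector<const CullableBody*> visible;
    for (const auto& b : bodies)
        if (worthDrawing(b, camX, camY, camZ, fovY, viewportHeight)) visible.push_back(&b);
    return visible;
}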

7 Future Work

This chapter suggests work that could be done in a follow-up project. It mainly revisits topics discussed in chapter 3. In the field of visualizing uncertain and missing data, a lot of work can still be done. A starting point could be to integrate the given confidence intervals into the information widget on the right. More advanced approaches would represent these intervals, or even non-existing values, with more sophisticated methods of computer graphics. The orbital model, as presented in this work, is not complete, meaning that the exact real positions of the bodies cannot be determined with the current parameters. As stated earlier, the calculation of the position of a body on an orbit for any given time assumes a static trajectory. However, this is not the case, as orbits deform and move over time and are influenced by many gravitational forces in space. Especially for the Solar System, this could be interesting to integrate. The questions here are how to obtain the data needed for this calculation and whether the cost of the position computation would remain at a similar level. Connected to this issue is the fact that the currently implemented position update algorithm takes longer the further the date lies in the future. Furthermore, the accuracy of the positions could be increased by adopting a more realistic model of the center of the orbit. Currently, the center point of an orbit is defined as the position of the heaviest body in the (sub-)system. Moving it to the barycenter of the (sub-)system (sketched below) would change the position of the orbiting body. For bodies like planets orbiting the Sun this change may not be dramatic; for systems that are more balanced in terms of mass, such as Pluto and its moons or some exoplanetary systems, the new model could lead to significantly shifted positions. Additionally, the central body of the (sub-)system would then also move on an orbit. Again, a central question would be where to get this data from, or whether to calculate it ad hoc by implementing a complex physics model (discussed in subsection 3.2.2). Costs and benefits have to be weighed carefully. Lastly, some bodies seem to move in large discrete steps, but it has not been determined which bodies behave this way and at which times. It remains open whether this issue is connected to the numerical precision topic, a computation effect, or something else.
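The barycenter mentioned above is the mass-weighted mean of the body positions of the (sub-)system. A minimal sketch with hypothetical types, not part of the current implementation:

#include <vector>

// Hedged sketch of the barycenter computation (hypothetical types, not part of
// the current implementation): the orbit center becomes the mass-weighted mean
// of the body positions of the (sub-)system instead of the heaviest body alone.
struct MassiveBody { double mass; double x, y, z; };
struct Vec3 { double x, y, z; };

Vec3 barycenter(const std::vector<MassiveBody>& system) {
    double m = 0.0, cx = 0.0, cy = 0.0, cz = 0.0;
    for (const auto& b : system) {
        m  += b.mass;
        cx += b.mass * b.x;
        cy += b.mass * b.y;
        cz += b.mass * b.z;
    }
    if (m == 0.0) return Vec3{0.0, 0.0, 0.0};  // degenerate system, keep the old center
    return Vec3{cx / m, cy / m, cz / m};
}

For a pair like Pluto and Charon, where the companion carries a considerable fraction of the total mass, this point lies outside the primary body, which is exactly why the computed positions would shift noticeably under the new model.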

7.1 Implementation

The most important implementation-specific aspects proposed to be tackled in the future are the following: As seen in chapter 5, the rendering of labels is very time consuming, and the computational cost does not increase linearly. So, for labeling in general, but especially for labeling many bodies, a better, more effective and/or efficient algorithm could be found. Likewise, the representation of the single celestial body could be enhanced. In particular, a more realistic stellar model and a non-spherical small body mesh (when not represented in a point cloud) are the first points to improve. Also referring primarily to the small bodies, the level of detail of the drawable orbit object should be increased. Figure 7.1 shows that the problem is twofold: an orbit has corners, since it is composed of a number of straight line segments approximating an ellipse (in most cases). This leads to the problem that, when the body is positioned at its exact location, the points of intersection between the celestial object and its orbit are not where they should be. This displacement is of course bigger relative to the body size for asteroids and comets than for planets or moons. The brute-force solution to this issue would be to increase the number of line segments; however, drawing GL_LINE objects is, computationally speaking, not free.
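The sampling trade-off can be made concrete with the following hypothetical helper (not the geAstro code), which approximates an orbital ellipse, with the focus at the origin, by a closed polyline with a configurable number of segments; raising the segment count removes the visible corners but increases the vertex count that every orbit contributes to the line geometry.

#include <cmath>
#include <vector>

// Hypothetical helper (not the geAstro code): an orbital ellipse with the focus
// at the origin is approximated by a closed polyline of 'segments' straight
// lines. More segments hide the corners but add vertices to every orbit.
struct OrbitPoint { double x, y, z; };

std::vector<OrbitPoint> sampleEllipse(double semiMajor, double eccentricity, int segments) {
    const double kPi = 3.141592653589793;
    const double b = semiMajor * std::sqrt(1.0 - eccentricity * eccentricity);  // semi-minor axis (e < 1)
    const double c = semiMajor * eccentricity;                                  // center-to-focus distance
    std::vector<OrbitPoint> points;
    points.reserve(segments + 1);
    for (int i = 0; i <= segments; ++i) {
        const double E = 2.0 * kPi * i / segments;   // eccentric anomaly as the sampling parameter
        points.push_back({semiMajor * std::cos(E) - c, b * std::sin(E), 0.0});
    }
    return points;  // closed polyline in the orbital plane, ready for a GL line strip
}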


Figure 7.1: Arbitrary small body illustrating the problem of the orbit's level of detail. On the left, corners are observable in the orbit, which should be round, and the orbit does not pass through the body centrally.

Work can also be spent on implementing high-performance rendering data structures. For instance, all of the spheres could be drawn using instanced rendering, as sketched below. Another issue is the rendering of the orbits, which could technically be done with far fewer draw calls, namely by using a multi-line object; this is already partly in place in the geAstro package (OrbitCollection class). In addition, as stated various times, the implementation of culling strategies could lead to a significant performance boost. As a last point, the navigation of the ArcBallCamera could be improved; currently, the control of the camera position is not completely intuitive.
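A hedged sketch of the instanced-rendering idea follows; the attribute location and buffer layout are assumptions and not the existing GlobeEngine code. One shared sphere mesh is drawn for all bodies with a single glDrawElementsInstanced call, while the per-body position and radius come from an instance buffer that is advanced once per instance.

#include <GL/glew.h>
#include <vector>

// Hedged sketch of instanced sphere rendering (assumed attribute location and
// buffer layout, not the existing GlobeEngine code): one shared sphere mesh is
// drawn for all bodies with a single glDrawElementsInstanced call, while the
// per-body data (position and radius) is read from an instance buffer.
void setupInstanceBuffer(GLuint sphereVao, const std::vector<float>& perInstanceXYZR) {
    GLuint instanceVbo = 0;
    glGenBuffers(1, &instanceVbo);
    glBindVertexArray(sphereVao);   // VAO of the shared sphere mesh, geometry already set up
    glBindBuffer(GL_ARRAY_BUFFER, instanceVbo);
    glBufferData(GL_ARRAY_BUFFER, perInstanceXYZR.size() * sizeof(float),
                 perInstanceXYZR.data(), GL_STATIC_DRAW);
    glEnableVertexAttribArray(3);   // assumed shader attribute location for vec4 (x, y, z, radius)
    glVertexAttribPointer(3, 4, GL_FLOAT, GL_FALSE, 4 * sizeof(float), nullptr);
    glVertexAttribDivisor(3, 1);    // advance once per instance instead of once per vertex
    glBindVertexArray(0);
}

void drawSpheresInstanced(GLuint sphereVao, GLsizei indexCount, GLsizei instanceCount) {
    glBindVertexArray(sphereVao);
    glDrawElementsInstanced(GL_TRIANGLES, indexCount, GL_UNSIGNED_INT, nullptr, instanceCount);
    glBindVertexArray(0);
}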

Acknowledgments

First and foremost, I would like to thank my supervisor and adviser Matthias Thöny. He has been putting a lot of effort and ideas into this project. His expert knowledge and encouragements have been very valuable and motivating for me. Many thanks to Prof. Dr. Renato Pajarola, who gave me the opportunity to write this master thesis in the group for visualization and multimedia at the University of Zurich, and supported the work in every possible manner. Further, I would like to acknowledge the whole team of this research group, especially Dr. Markus Billeter, who helped in numerous debugging sessions and gave significant inputs.

Bibliography

[AMHH08] Tomas Akenine-Möller, Eric Haines, and Naty Hoffman. Real-Time Rendering. A. K. Peters, Ltd., 3rd edition, July 2008.

[BBB+14] David P. Bennett, Virginie Batista, Ian A. Bond, Charles Bennett, Daisuke Suzuki, Jean-Philippe Beaulieu, Andrzej Udalski, Jadzia Donatowicz, et al. MOA-2011-BLG-262Lb: A Sub-Earth-Mass Moon Orbiting a Gas Giant Primary or a High Velocity Planetary System in the Galactic Bulge. The Astrophysical Journal, 785:155, April 2014.

[BKB+10] William J. Borucki, David Koch, Gibor Basri, Natalie Batalha, Timothy Brown, Douglas Caldwell, John Caldwell, et al. Kepler Planet-detection Mission: Introduction and First Results. Science, 327(5968):977–980, February 2010.

[BMW71] Roger R. Bate, Donald D. Mueller, and Jerry E. White. Fundamentals of Astrodynamics. Dover Publications, June 1971.

[BPM15] Emmanuel Bertin, Ruven Pillay, and Chiara Marmo. Web-based Visualization of Very Large Scientific Astronomy Imagery. Astronomy and Computing, 10:43–53, April 2015.

[Car11] Jon Cartwright. 'Homeless' Planets May Be Common in Our Galaxy. http://www.sciencemag.org/news/2011/05/homeless-planets-may-be-common-our-galaxy, May 2011.

[CUD] Programming Guide :: CUDA Toolkit Documentation. http://docs.nvidia.com/cuda/cuda-c-programming-guide/index.html. Accessed: February 2016.

[DSZ11] Peter Duffett-Smith and Jonathan Zwart. Practical Astronomy with your Calculator or Spreadsheet. Cambridge University Press, 4th edition, June 2011.

[FGK15] Roger Freedman, Robert Geller, and William J. Kaufmann. Universe. W.H. Freeman, 10th edition, July 2015.

[FTC] Fundamental Types (C++). https://msdn.microsoft.com/en-us/library/cc953fe1(v=vs.120).aspx. Accessed: February 2016.

[Goo12] Alyssa A. Goodman. Principles of High-Dimensional Data Visualization in Astronomy. Astronomische Nachrichten, 333:505–514, June 2012.

[GRT13] Tobias Günther, Christian Rössl, and Holger Theisel. Opacity Optimization for 3D Line Fields. ACM Transactions on Graphics, 32(4):120:1–120:8, July 2013.

[Har88] Martin Harwit. Astrophysical Concepts. Springer-Verlag, 2nd edition, April 1988.

[Hei78] Wulff D. Heintz. Double Stars. Springer Netherlands, 15th edition, 1978.

[HF11] Amr Hassan and Christopher J. Fluke. Scientific Visualization in Astronomy: Towards the Petascale Astronomy Era. Publications of the Astronomical Society of Australia, 28:150–170, June 2011.


[IAUa] IAU - Measuring the Universe. https://www.iau.org/public/themes/measuring/. Accessed: January 2016.

[IAUb] IAU - Resolutions B5 and B6 of 2006. http://www.iau.org/static/resolutions/Resolution_GA26-5-6.pdf. Accessed: January 2016.

[IAUc] IAU - Resolutions B1, B2, B3, and B4 of 2012. http://www.iau.org/static/resolutions/IAU2012_English.pdf. Accessed: January 2016.

[IEE08] IEEE Task P754. IEEE 754-2008, Standard for Floating-Point Arithmetic. August 2008.

[JPL] Jet Propulsion Laboratory (JPL) Solar System Dynamics. http://ssd.jpl.nasa.gov/. Accessed: January 2016.

[Kep93] Johannes Kepler. New Astronomy. Cambridge University Press, January 1993.

[KR08] Wolfgang Kapferer and Tobias Riser. Visualization Needs and Techniques for Astrophysical Simulations. New Journal of Physics, 10(12):125008, December 2008.

[KRK+13] Ravi K. Kopparapu, Ramses Ramirez, James F. Kasting, Vincent Eymet, Tyler D. Robinson, Suvrath Mahadevan, Ryan C. Terrien, Shawn Domagal-Goldman, Victoria Meadows, and Rohit Deshpande. Habitable Zones around Main-sequence Stars: New Estimates. The Astrophysical Journal, 765:131, March 2013.

[Lan15] Bernhard Langer. Arbitrary-Precision Arithmetics on the GPU. In Central European Seminar on Computer Graphics for Students, 2015.

[LYLJ00] Bor-Sung Liang, Wen-Chang Yeh, Yuan-Chung Lee, and Chein-Wei Jen. Deferred Lighting: A Computation-efficient Approach for Real-time 3-D Graphics. In Proceedings of the IEEE International Symposium on Circuits and Systems, pages 657–660, 2000.

[MPSS05] Ovidio Mallo, Ronald Peikert, Christian Sigg, and Filip Sadlo. Illuminated Lines Revisited. In IEEE Visualization, page 3. IEEE Computer Society, 2005.

[Nöt15] Matthias Nötzli. Solar System Viewer. Bachelor's Thesis, University of Zurich, May 2015.

[NSTN93] Tomoyuki Nishita, Takao Sirai, Katsumi Tadamura, and Eihachiro Nakamae. Display of the Earth Taking into Account Atmospheric Scattering. In Proceedings of the 20th Annual Conference on Computer Graphics and Interactive Techniques, pages 175–182, 1993.

[OEP] Open Exoplanet Catalogue - GitHub. https://github.com/OpenExoplanetCatalogue/open_exoplanet_catalogue/. Accessed: January 2016.

[PKRP87] James B. Pollack, James F. Kasting, Steven M. Richardson, and K. Poliakoff. The Case for a Wet, Warm Climate on Early Mars. Icarus, 71:203–224, August 1987.

[PLQ] PlanetQuest - The Search for Another Earth. http://planetquest.jpl.nasa.gov/. Accessed: January 2016.

[Rei12] Hanno Rein. A Proposal for Community Driven and Decentralized Astronomical Databases and the Open Exoplanet Catalogue. ArXiv e-prints, December 2012.

[Rid12] Ian Ridpath. A Dictionary of Astronomy. Oxford University Press, 2nd edition, December 2012.


[Sea15] David Seargent. Comet Controversies. In Weird Astronomical Theories of the Solar System and Beyond, pages 133–201. Springer, December 2015.

[SH91] Sean C. Solomon and James W. Head. Fundamental Issues in the Geology and Geophysics of Venus. Science, 252(5003):252–260, April 1991.

[WB14] Nathan Walster and Michael Blain. Creating the Earth as a Backdrop in Gravity. In ACM SIGGRAPH Talks, pages 64:1–64:1, August 2014.

[WFF11] Nathan Whitehead and Alex Fit-Florea. Precision & Performance: Floating Point and IEEE 754 Compliance for NVIDIA GPUs. http://developer.download.nvidia.com/assets/cuda/files/NVIDIA-CUDA-Floating-Point.pdf, June 2011.

[WLM13] Stephan Wenger, Dirk Lorenz, and Marcus Magnor. Fast Image-based Modeling of Astronomical Nebulae. Computer Graphics Forum, 32(7):93–100, October 2013.

Appendix

The appendix of this thesis is located on a DVD which is attached at the very end of the document. It consists of the following elements:

Abstract.txt This file contains the abstract in English.

Code This folder contains the complete application code.

Figures This folder contains all the figures used in this work in order of appearance.

Thesis.pdf This file is the electronic version of the presented printed document.

Zusfsg.txt This file contains the abstract in German.
