Making 3D Movies of Northern Lights

Eric Hivon, Jean Mouette, Thierry Legault
To cite this version: Eric Hivon, Jean Mouette, Thierry Legault. Making 3D movies of Northern Lights. Journal of Space Weather and Space Climate, EDP Sciences, 2017, 7, pp.A24. DOI: 10.1051/swsc/2017023. HAL Id: hal-01622710, https://hal.sorbonne-universite.fr/hal-01622710, submitted on 24 Oct 2017.

Distributed under a Creative Commons Attribution 4.0 International License.

J. Space Weather Space Clim. 2017, 7, A24. © E. Hivon et al., Published by EDP Sciences 2017. DOI: 10.1051/swsc/2017023. Available online at: www.swsc-journal.org

TECHNICAL ARTICLE

Making 3D movies of Northern Lights

Eric Hivon (1,*), Jean Mouette (1) and Thierry Legault (2,a)

1 Sorbonne Universités, UPMC Univ. Paris 6 & CNRS (UMR 7095), Institut d'Astrophysique de Paris, 98 bis Bd Arago, 75014 Paris, France
2 THALES, 19 Avenue Morane Saulnier, 78140 Vélizy Villacoublay, France

Received 4 June 2017 / Accepted 16 August 2017

Abstract – We describe the steps necessary to create three-dimensional (3D) movies of Northern Lights or Aurorae Borealis out of real-time images taken with two distant high-resolution fish-eye cameras. Astrometric reconstruction of the visible stars is used to model the optical mapping of each camera and correct for it, in order to properly align the two sets of images. Examples of the resulting movies can be seen at http://www.iap.fr/aurora3d.
Keywords: Aurora / visualisation / technical / image processing / algorithm

1 Introduction

Aurorae borealis have always fascinated humans, who have long tried to report their observations by the best means available at the time. While the earliest known record of an Aurora was written on a clay tablet in 567 BC in Babylon (Stephenson et al., 2004), Aurorae have since been filmed in color with specially designed high-sensitivity cameras [1] in the mid-1980s (Hallinan et al., 1985), and filmed in 3D for the first time in 2008 [2].

In this paper we detail the steps taken to generate 3D movies of Aurorae Borealis shot with high-resolution wide-field cameras, producing movies fit for projection in hemispheric planetariums. To our knowledge, these 3D movies are the first ones showing Northern Lights ever produced thanks to post-shooting astrometric reconstruction. Since they are produced from real-time movie images instead of the usual time-lapsed sequences, they can be used for education and public outreach, and they give a glimpse of the rich and rapidly moving three-dimensional structure of the Aurorae. While they are close to the real-world sensations of Aurora watching in terms of colors, saturation, speed of evolution, and sheer size on the sky (if seen in a planetarium), such movies add the extra information of stereoscopy, which is inaccessible to a single human observer in the field. Now that such movies are made available, they will hopefully prompt the interest of scientists studying the properties of Aurorae, who may wish to use them, or build upon them, in their work.

The image-taking procedure and camera synchronization are described in Section 2. The image-processing steps allowing a good rendition of the stereoscopy are described in Section 3, while Section 4 shows how such 3D images and movies can be viewed. Section 5 is devoted to conclusions and perspectives.

* [email protected]
a Astrophotography, http://www.astrophoto.fr
[1] http://auroraalive.com/multimedia/autoformat/get_swf.php?videoSite=aurora&videoFile=aa_aurora_on_tv.swf++&videoTitle=Aurora+on+TV
[2] https://www.newscientist.com/article/dn15147-northern-lights-captured-in-3d-for-the-first-time/

This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

2 Observational setup

Observations are made from Norway, at the approximate latitude and longitude of 69.3°N and 20.3°E, with two cameras separated by either 6 or 9 km, depending on the local weather conditions. Since the Aurorae mostly happen in the upper atmosphere, at about 100 km above the ground, their parallax between the two cameras is expected to be large enough to construct a 3D rendering via binocular vision. As much as possible, the cameras point to the zenith, and are then rotated around the vertical so that the Northern Star is in the lower part of the image, aligning the long axis of the image with the West-East direction, as shown in Figure 1 for one of the cameras. The camera positions are measured by GPS receivers. In what follows, the western and eastern cameras are, respectively, dubbed left and right.

2.1 The cameras

Each of the two cameras used is a Sony a7s, on which is mounted a Canon fisheye lens of focal length f = 15 mm with an aperture of f/2.8. In order to further enlarge the field of view, and to allow the use of a Canon lens on the Sony E-mount camera, the lens is coupled to a Metabones Speed Booster that reduces the focal length by a factor of 0.71.
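As a back-of-the-envelope check of the baseline choice described in Section 2 (our own sketch, not from the paper): for an auroral feature near the zenith at roughly 100 km altitude, the parallax between the two stations is approximately arctan(baseline / altitude), assuming a flat ground between them.

```python
import math

def parallax_deg(baseline_km: float, altitude_km: float) -> float:
    """Approximate parallax angle (in degrees) of a feature at altitude_km,
    seen from two ground stations separated by baseline_km, for a feature
    near the zenith and neglecting Earth curvature over the baseline."""
    return math.degrees(math.atan2(baseline_km, altitude_km))

# The 6 km and 9 km baselines used in the paper, for an Aurora at ~100 km:
for b in (6.0, 9.0):
    print(f"baseline {b:.0f} km -> parallax ~ {parallax_deg(b, 100.0):.1f} deg")
# prints ~3.4 deg for the 6 km baseline and ~5.1 deg for the 9 km baseline
```

A few degrees of parallax is ample for binocular fusion; the angle shrinks as the auroral structures rise higher above the ground.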
The camera sensor is a full-frame CMOS 4240 × 2832 pixel RGBG Bayer matrix [3], with half of the pixels sensitive to green and the other half spread evenly between red and blue. The ISO setting is 12 800, representing a good compromise between noise and sensitivity. A video signal is produced at 25 or 30 frames per second (fps), in Ultra High Definition format (3840 × 2160 pixels, 16:9 aspect ratio), and recorded on an external video recorder with the Apple ProRes HQ codec (approximately 800 Mbit/s). As we will see in Section 3, the field of view is 220° × 110°, covering 48% of the sphere, with a resolution at the center of the image of 2.94′/pixel.

2.2 Cameras synchronization

In order to achieve a good stereoscopy out of the movies produced, we first have to make sure that their images are properly synchronized during their processing and visualisation. Keeping, in the field, the two distant cameras and/or their recorders synchronized thanks to a common absolute GPS-based time signal would have been ideal, but was out of reach of this project.

Finally, in post-production, but before starting the 3D reconstruction pipeline, we look in the filmed sequences for occurrences of bright, thin and fast-moving Aurora structures, and compare their evolution closely from frame to frame between the two cameras, in order to find the best-matching pair of frames. This provides the residual time offset between the two series of recorded time-stamps, and allows us to re-synchronize the two movies at a level compatible with the frame rate (25 or 30 fps), assuming the relative time drift to be negligible over the duration of a sequence (a few minutes).

3 Image processing

Time-lapsed stereoscopic observations of Aurorae are described for wide-field cameras in Kataoka et al. (2013), and for fast narrow-field cameras in Kataoka et al. (2016), following in either case the procedure described in Mori et al. (2013). We implement a similar astrometric method: since the actual pointing, orientation and relative position of the two cameras are only known with limited accuracy, and because each camera can distort the image differently, the positional astrometry of bright stars identified in the images is used to determine and characterize the cameras' geometrical settings and optical responses, in order to properly realign the left and right sets of images.

Fig. 1. The observation is done from colatitude $\theta_0$ and longitude $\phi_0$, in the Earth-bound xyz referential. A camera-bound referential is defined, with the Z axis along the camera boresight (roughly towards the zenith), the Y axis along the shorter dimension of the camera image, approximately pointing North, and the X axis along the longer dimension of the camera image, pointing East.

3.1 Finding the stars

The 5th edition of the Kitt Peak National Observatory catalog [4], adapted from the Yale catalog [5], lists the J2000 equatorial coordinates and magnitudes of more than 9000 stars of magnitude up to 6.5. After setting correctly the minus sign on the magnitudes of its four brightest stars, it was used as a catalog of bright stars potentially seen in the left and right images.

Candidate stars in the images are identified as follows. Since the Aurorae filmed here are mostly green in color, a red-channel frame is expected to be less exposed to them than the other channels, and is used to detect stars. We checked that using the green channel, expected to have a better intrinsic resolution, did not change the final result much, while it slightly increased the risk of false detections. The chosen image is convolved (via Fast Fourier Transform operations) with a difference of Gaussians (DoG) filter, defined in pixel space as
\begin{equation}
F(x, y) = \frac{1}{2\pi\sigma_1^2}\,\exp\left(-\frac{x^2+y^2}{2\sigma_1^2}\right) - \frac{1}{2\pi\sigma_2^2}\,\exp\left(-\frac{x^2+y^2}{2\sigma_2^2}\right). \tag{1}
\end{equation}
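The FFT-based DoG convolution of Equation (1) can be sketched as follows with NumPy. The filter widths σ1 and σ2 are not given in this excerpt, so the values below are placeholders; the toy image (a single point source, standing in for a star) merely illustrates that the band-pass filter preserves and localizes point-like features.

```python
import numpy as np

def dog_kernel(ny, nx, sigma1=1.0, sigma2=3.0):
    """Difference-of-Gaussians filter of Eq. (1), sampled on an ny x nx grid,
    with the kernel centre at pixel (0, 0) (FFT wrap-around convention)."""
    y = np.minimum(np.arange(ny), ny - np.arange(ny))  # periodic pixel distances
    x = np.minimum(np.arange(nx), nx - np.arange(nx))
    r2 = y[:, None] ** 2 + x[None, :] ** 2
    g1 = np.exp(-r2 / (2 * sigma1**2)) / (2 * np.pi * sigma1**2)
    g2 = np.exp(-r2 / (2 * sigma2**2)) / (2 * np.pi * sigma2**2)
    return g1 - g2

def dog_filter(image, sigma1=1.0, sigma2=3.0):
    """Circular convolution of `image` with the DoG kernel, done via FFTs."""
    kern = dog_kernel(*image.shape, sigma1, sigma2)
    return np.real(np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(kern)))

# Toy usage: a point source on a dark background.
img = np.zeros((64, 64))
img[32, 32] = 1.0
filt = dog_filter(img)
peak = np.unravel_index(np.argmax(filt), filt.shape)
print(peak)  # maximum of the filtered image remains at the source position
```

Because the two Gaussians each integrate to unity, the kernel sums to approximately zero: smooth large-scale backgrounds (such as diffuse auroral glow) are suppressed, while structures of a few pixels, like stars, are enhanced.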