

Article

HPT: A High Spatial Resolution Multispectral Sensor for Microsatellite

Junichi Kurihara 1,*, Yukihiro Takahashi 1, Yuji Sakamoto 2, Toshinori Kuwahara 2 and Kazuya Yoshida 2

1 Faculty of Science, Hokkaido University, Kita 10 Nishi 8, Kita-ku, Sapporo 060-0810, Japan; [email protected]
2 Department of Aerospace Engineering, Tohoku University, Aramaki Aza Aoba 6-6-11, Sendai 980-8579, Japan; [email protected] (Y.S.); [email protected] (T.K.); [email protected] (K.Y.)
* Correspondence: [email protected]; Tel.: +81-11-706-9244

Received: 19 January 2018; Accepted: 15 February 2018; Published: 18 February 2018

Abstract: Although nano/microsatellites have great potential as remote sensing platforms, the spatial and spectral resolutions of an optical payload instrument are limited. In this study, a high spatial resolution multispectral sensor, the High-Precision Telescope (HPT), was developed for the RISING-2 microsatellite. The HPT has four image sensors: three in the visible region of the spectrum used for the composition of true color images, and a fourth in the near-infrared region, which employs liquid crystal tunable filter (LCTF) technology for wavelength scanning. Band-to-band image registration methods have also been developed for the HPT and implemented in the image processing procedure. The processed images were compared with other satellite images, and proven to be useful in various remote sensing applications. Thus, LCTF technology can be considered an innovative tool that is suitable for future multi/hyperspectral remote sensing by nano/microsatellites.

Keywords: multispectral sensor; tunable filter; microsatellite; Earth observation; image analysis

1. Introduction

Artificial satellites can be categorized according to their masses, e.g., picosatellites (<1 kg), nanosatellites (1–10 kg), microsatellites (10–100 kg), minisatellites (100–1000 kg), and large satellites (>1000 kg). With the rapid development of electronic and communication technology, the adage "smaller, faster, and cheaper" summarizes the trend of satellite development activities, which has resulted in a significant increase in the number of nano/microsatellites launched over recent years, especially for Earth observation and remote sensing purposes [1]. Nano/microsatellites have great potential as remote sensing platforms because of the cost-effective implementation of satellite constellations and formations, which increase their overall temporal resolution and ground coverage [2]. However, nano/microsatellites have many limitations related to their small size, which lead to restrictions on their payload instruments in terms of design, power consumption, and data rate. For optical remote sensing, the spatial and spectral resolutions of an optical payload instrument are limited primarily by its aperture size, which is constrained by the dimensions of the satellite. For any optical system, irrespective of whether it is a spaceborne or ground-based instrument, its spatial resolution is limited theoretically by the diffraction of light, and its spectral resolution is limited practically by the signal-to-noise ratio (SNR). Both the diffraction limit and the SNR improve with larger apertures; thus, nano/microsatellites have fundamental disadvantages compared with larger satellites in terms of optical remote sensing [3].

Figure 1 shows a diagram of the distribution of high (<10 m) and moderate (10–100 m) spatial resolution multispectral Earth observation satellites/sensors launched in the previous few decades. The mass of a satellite and the decade of its launch are indicated by the area and the color of the circle, respectively. As can be seen, satellite sensors with three or four spectral bands show a trend toward higher ground resolutions and smaller satellite sizes. Another evident trend is the increase in the number of spectral bands for a series of large satellites, such as the Landsat and WorldView flagship series. Aside from the multispectral Earth observation satellites, only two satellites, Earth Observing-1 (EO-1) and Project for On-Board Autonomy-1 (PROBA-1), are experimentally equipped with high spatial resolution hyperspectral sensors, although additional operational hyperspectral missions are planned for launch before 2020 [4].

Figure 1. Diagram of the high and moderate spatial resolution multispectral Earth observation satellites/sensors launched in the past.

Hyperspectral sensors, which are also known as imaging spectrometers, are capable of obtaining images in hundreds of spectral bands, and they can provide abundant information in relation to a wide variety of Earth observation and remote sensing applications. Although the challenges in adapting high spatial resolution hyperspectral sensors to small satellites have been studied theoretically in terms of the resolution, quality, and volume of hyperspectral data, the practical feasibility appears low for satellites <100 kg, even when applying state-of-the-art technology [5,6]. The hyperspectral sensor considered in the above studies was a push-broom/line-scanning spectrometer, which had been conventionally mounted on large satellites or on manned aircraft. This type of hyperspectral sensor has a two-dimensional (2D) detector array, where one axis records spatial information and the other axis records spectral information. The spatial axis is arranged perpendicular to the satellite track; thus, by scanning the Earth's surface along the path of movement of the satellite, a three-dimensional (3D) hyperspectral data cube is obtained. Consequently, both the resolution and the quality of the acquired data are constrained considerably by the ground track velocity of the satellite and the aperture size of the sensor. These constraints make it difficult to realize effective deployment of high spatial resolution hyperspectral sensors on nano/microsatellites.

Another type of multi/hyperspectral sensor is a wavelength-scanning imager using a filter wheel or a tunable filter. This type of sensor also employs a 2D detector array, but it captures a spatial 2D image filtered at a certain spectral band that is switched by mechanically rotating the filter wheel or by electrically tuning the filtered wavelength. While a finite response time is required to switch from one spectral band to another for wavelength scanning, the resolution and the quality of the data do not depend directly on the satellite speed. A filter wheel mounted with 15 spectral filters has already been applied to spaceborne low spatial resolution (>100 m) multispectral sensors, such as the Polarization and Directionality of the Earth's Reflectances instrument on the Polarization and Anisotropy of Reflectances for Atmospheric Sciences coupled with Observations from a light detection and ranging (LiDAR) (PARASOL) microsatellite [7]. The disadvantage of a filter wheel in relation to small satellites is that the diameter, weight, and power consumption of the filter wheel all increase with the number of spectral bands.

Recently, we have developed a high spatial resolution multispectral sensor using a tunable filter to overcome the disadvantage of a filter wheel, and installed it on the RISING-2 microsatellite. This sensor, which is equipped with a liquid crystal tunable filter (LCTF) for wavelength scanning, is called the High-Precision Telescope (HPT). To the best of our knowledge, the HPT is the first spaceborne sensor using LCTF technology. The advantages for small satellites in using the LCTF are related not only to the reductions in the size, weight, and power consumption of the sensor, but also to the increased flexibility of the spectral bands and enhanced data volumes. The central wavelength of the spectral bands is electrically tunable for every image acquisition; hence, the data volume could be reduced by choosing appropriate spectral bands for specific purposes. This flexibility allows a single multispectral sensor on a nano/microsatellite to be applied to various remote sensing fields, making it comparable with hyperspectral sensors. As shown in Figure 1, the HPT holds a unique position amongst the current multispectral Earth observation sensors.

This paper presents descriptions of both the design and development of the HPT deployed on the RISING-2 microsatellite, and the data analysis procedure adopted for the acquired images.

2. Sensor Characteristics

2.1. RISING-2 Microsatellite

The RISING-2 microsatellite is a 50-kg-class platform developed jointly by Tohoku and Hokkaido universities in Japan [8]. It constitutes a substantial improvement over the SPRITE-SAT, which was launched in 2009 and renamed the RISING satellite after its launch [9]. Tohoku University developed the satellite bus system for the RISING-2 platform, including a main onboard computer called the Satellite Central Unit (SCU). Hokkaido University developed the mission payload instrumentation for Earth observation, including a controller unit called the Science Handling Unit (SHU; manufactured by AD Co., Ltd., Sagamihara, Japan).

The RISING-2 microsatellite was designed as a 50-cm cube with a weight of about 43 kg, such that it could be adopted as a secondary small payload on an H-IIA launch vehicle. The satellite was launched successfully by the H-IIA from Tanegashima Space Center in Japan on 24 May 2014, along with the primary payload (the Advanced Land Observing Satellite-2) and three other secondary small payloads (nano/microsatellites, each <50 kg). The RISING-2 microsatellite was placed into a Sun-synchronous orbit at an altitude of 628 km with a local Sun time on the descending node of 12:00. Three years after its launch, the satellite remains active, although its operation has become less frequent.

The three-axis active attitude control system of the RISING-2 microsatellite provides off-nadir pointing capability so that the optical mission payload instruments can keep pointing at a specific location on the ground [10]. This capability not only enhances the revisit frequency for the selected site, it also increases the number of spectral bands available for the site by wavelength scanning with the LCTF of the HPT. The location of the site to be observed by the HPT is stored preliminarily in the memory of the SCU, and the HPT observation is executed at a scheduled time by sending the stored commands from the SCU to the SHU.

The SHU controls both the power supply and the data acquisition by the mission payload instruments. Acquired images are stored temporarily in the 8 MB Static Random Access Memory (SRAM) of the SHU. As the data volume of an image taken by the HPT is about 650 kB, the HPT can capture up to 12 images sequentially. As described later, the HPT has the potential to serve as a hyperspectral sensor capable of obtaining images in hundreds of spectral bands, but this is not realized because of the limit on the number of images that can be stored temporarily. The SRAM allows fast access, but it is a volatile memory unit, and it loses stored data when the power is off. After an observation sequence, the images are combined with ancillary data and transferred to the 128-MB non-volatile flash memory of the SHU without image data compression. When the RISING-2 microsatellite is visible from the ground station located at Tohoku University, the data in the flash memory are downlinked via S-band telemetry at a data rate of 50 kbps [8], which means that it takes at least 1.7 min to downlink one image. Therefore, because of this limitation of the data rate, fewer than 10 images can be downlinked in one day.
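The storage and downlink figures above follow from simple arithmetic; the sketch below merely restates them, assuming decimal prefixes (1 kB = 1000 bytes, 1 kbps = 1000 bit/s), since the exact on-board conventions are not stated in the paper.

```python
# Back-of-envelope check of the RISING-2 image storage and downlink budget,
# using the figures quoted above (650 kB per HPT image, 8 MB SRAM, 50 kbps S-band).

IMAGE_SIZE_BYTES = 650e3        # one HPT image, ~650 kB
SRAM_BYTES = 8e6                # volatile image buffer in the SHU
DOWNLINK_BPS = 50e3             # S-band telemetry rate, 50 kbps

images_per_sequence = int(SRAM_BYTES // IMAGE_SIZE_BYTES)
seconds_per_image = IMAGE_SIZE_BYTES * 8 / DOWNLINK_BPS

print(f"images per observation sequence: {images_per_sequence}")      # -> 12
print(f"downlink time per image: {seconds_per_image / 60:.1f} min")   # -> ~1.7 min
```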

2.2. High Precision Telescope (HPT)

The HPT installed on the RISING-2 microsatellite is a high spatial resolution multispectral sensor equipped with four charge-coupled device (CCD) image sensors for the red, green, blue, and near-infrared (NIR) bands, which are split by dichroic mirrors, as shown in Figures 2 and 3. The LCTF is employed for wavelength scanning only in the NIR band, and it is placed in front of the NIR CCD module. The central wavelength of the NIR band is electrically selectable by controlling the LCTF for every image acquisition; thus, images of the spectral bands minimally required for specific purposes can be acquired sequentially. The specifications of the HPT are listed in Table 1.

FigureFigureFigure 2. 2. 2.Appearance Appearance Appearance ofof of the the High-Precision High-Precision Telescope Telescope (HPT). (HPT). (HPT).

FigureFigureFigure 3.3. 3. Schematic Schematic of ofof the the HPT. HPT.


Table 1. Specifications of the HPT.

Size: 380 mm × 161 mm × 124 mm
Weight: 3 kg
Focal length: 1000 mm
Aperture diameter: 100 mm
Ground sample distance: 4.6 m (at nadir from 628 km altitude)
Field of view: 0.28° × 0.21° (3.1 km × 2.3 km at nadir from 628 km altitude)
Spectral bands: Blue: 400–510 nm; Green: 520–600 nm; Red: 610–650 nm; NIR: 401 bands selectable at 1-nm intervals from 650 to 1050 nm
Image size: 659 × 494 pixels
Data quantization: 10 bit

The development of the HPT commenced in 2010, with the aim of achieving a high spatial resolution sensor suitable for deployment on a microsatellite for Earth observation. Earth observation has diverse requirements regarding spatial and spectral resolutions according to different fields of application, e.g., agriculture, environmental monitoring, hydrology, and oceanography (cf. Figure 2 in [2]). A multispectral sensor with 5-m spatial resolution can satisfy a wide range of these fields, except for intelligence services, traffic monitoring, and urban development purposes, which mostly require sub-meter resolution. Therefore, a 5-m-resolution multispectral sensor was set as the development goal for the HPT.

In order to achieve a ground sample distance of 5 m with a pixel size of 7.4 µm from an altitude of 700 km, a focal length of 1 m is required for the optical system of the sensor. Considering the high SNR required for multispectral imaging, an aperture diameter of 100 mm is considered the largest possible for a 50-kg microsatellite. Accordingly, the diffraction-limited resolution on the focal plane of the optical system is 6.4 µm at the 500-nm wavelength. As the diffraction-limited resolution is close to the pixel size, a diffraction-limited optical system is required. This is why the name of the HPT includes the word "precision" rather than "resolution".

The optical system of the HPT adopts a Cassegrain reflecting telescope that was designed and manufactured by Genesia Corp. (Mitaka, Japan). It is important in the design of an optical system for any spaceborne multispectral sensor to account for chromatic aberration and thermal stability. Compared with refracting telescopes using lenses, reflecting telescopes using mirrors have the advantage of no chromatic aberration, which is significant for wavelength scanning in a wide spectral range. Minimizing the chromatic aberration of a refracting telescope is possible, but balancing the chromatic aberration correction with thermal stabilization is usually difficult. The adoption of a reflecting telescope allows the development of the optical system to focus on thermal stability. In addition, the Cassegrain telescope comprises a parabolic primary mirror and a hyperbolic secondary mirror, which create the long focal length required for high spatial resolution imaging, while minimizing the length of the telescope tube by folding the light path. The telescope tube is made of carbon fiber-reinforced plastic, which has a very low coefficient of thermal expansion and high stiffness. The mirrors are made from zero thermal expansion pore-free ceramics, a material developed by Nihon Ceratec Co., Ltd. (Sendai, Japan) for use in semiconductor manufacturing equipment. Although zero thermal expansion pore-free ceramics have an extremely low coefficient of thermal expansion at around room temperature, active thermal control of the mirrors is impractical for the limited power supply of a microsatellite. Based on the results of thermal simulation analysis, both the tip of the telescope tube and the reverse side of the secondary mirror, which are exposed to the outside of the satellite, are covered by multilayer insulation to maintain the mirrors at room temperature passively. At room temperature, the Strehl ratio of the optical system measured with a laser interferometer is 0.92, which satisfies the criterion for a diffraction-limited system (>0.8).
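These geometric relations can be checked with a short back-of-envelope calculation. The sketch below is not the authors' optical analysis; it simply applies the pinhole scaling (GSD = altitude × pixel pitch / focal length) and the Rayleigh criterion for the diffraction-limited spot, which come close to the quoted design values (the 6.4-µm figure in the text presumably uses a slightly different criterion or wavelength weighting).

```python
# Rough check of the HPT design numbers quoted above (a sketch, not the authors'
# actual optical analysis).

PIXEL_PITCH_M = 7.4e-6      # CCD unit cell size
FOCAL_LENGTH_M = 1.0        # HPT focal length
APERTURE_M = 0.1            # HPT aperture diameter
WAVELENGTH_M = 500e-9       # reference wavelength used in the text

def ground_sample_distance(altitude_m: float) -> float:
    """Simple pinhole scaling of one detector pixel onto the ground at nadir."""
    return altitude_m * PIXEL_PITCH_M / FOCAL_LENGTH_M

rayleigh_radius = 1.22 * WAVELENGTH_M * FOCAL_LENGTH_M / APERTURE_M

print(f"GSD at 700 km: {ground_sample_distance(700e3):.1f} m")   # ~5.2 m (design goal: 5 m)
print(f"GSD at 628 km: {ground_sample_distance(628e3):.1f} m")   # ~4.6 m (Table 1)
print(f"Rayleigh spot radius on focal plane: {rayleigh_radius * 1e6:.1f} um")  # ~6.1 um
# The text quotes 6.4 um for the diffraction-limited resolution; the small
# difference presumably comes from the exact criterion or wavelength used.
```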
A monochromatic CCD module is used for imaging in each of the four spectral bands. Generally, CCD image sensors consume more power, but are more sensitive, than complementary metal oxide semiconductor (CMOS) image sensors. The most important characteristic of a CCD image sensor for satellite imaging is a global shutter that can capture the entire frame at the same instant, whereas many CMOS image sensors use a rolling shutter. The CCD module T065 manufactured by Watec Co., Ltd. (Tsuruoka, Japan) was developed originally for the optical mission payload instrumentation of the SPRITE-SAT. The unit cell size of this CCD module is 7.4 µm × 7.4 µm, and the effective image size is 659 × 494 pixels. The gain and exposure time of the CCD module are changeable from −6 to +36 dB and from 20 µs to 34 s, respectively, enabling the HPT to be applied to a wide range of spectral radiances on the Earth's surface. An important function supported by this CCD module is the external trigger shutter that allows the synchronization of the three CCD modules for the blue, green, and red bands, in order to acquire a true color composite image.

In the HPT on the RISING-2 microsatellite, LCTF technology was applied for the first time to a spaceborne multispectral sensor through collaboration with the Research Institute for Advanced Liquid Crystal Technology at the Aomori Support Center for Industrial Promotion in Japan. The LCTF is a type of optical band pass filter that uses liquid crystal for the birefringent plates of the Lyot filter, and LCTF technology has been used for many applications, e.g., agriculture, healthcare, archaeology, and art (see the review by [11] and references therein). The LCTF is composed of several stacked layers of liquid crystal sandwiched between crossed polarizers, and the transmission wavelength of the LCTF is controlled by square-wave voltages that are applied to each layer. On the RISING-2 microsatellite, the LCTF and the voltage-controlling circuit are separated into the HPT and the SHU, respectively. The LCTF developed for the HPT (Figure 4) has dimensions of 30.7 mm × 30.3 mm × 30.3 mm, and it weighs 80 g. The power consumption of the voltage-controlling circuit is 0.2 W. These features enhance the compatibility of LCTF technology with nano/microsatellites.

The LCTF is used for wavelength scanning in the NIR band, and the central wavelength of the NIR band is electrically selectable at 1-nm intervals from 650 nm to 1050 nm. In terms of the number of spectral bands, the HPT has 401 bands in the NIR. The peak transmittance and the full width at half maximum increase with the central wavelength from 7% and 12 nm (at 650 nm) to 29% and 30 nm (at 1050 nm), respectively. To maintain the LCTF performance over a wide temperature range, a temperature sensor is attached to the LCTF, and the optimum voltages at the measured temperature are applied automatically using a look-up table stored in the voltage-controlling circuit. The response time for switching from one central wavelength to another depends on the temperature and on the combination of wavelengths. For example, the response time at 25 °C ranges from 39 ms (for a switch from 850 nm to 840 nm) to 259 ms (for a switch from 710 nm to 970 nm), and it is slower at lower temperatures and faster at higher temperatures. The exposure of the NIR CCD is designed to start only after sufficient time has elapsed for the switching of the wavelengths.
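The acquisition sequencing described above (tune the LCTF, wait for the temperature-dependent settling time, then expose the NIR CCD) can be sketched as follows. The hardware interface and the settling-time model here are hypothetical placeholders chosen only to mirror the described behavior; the real SHU command set and LCTF timing tables are not documented in this paper.

```python
import time

def settle_time_s(prev_nm, next_nm, temp_c=25.0):
    """Crude placeholder model: the text quotes 39-259 ms at 25 degC, slower when colder.
    The linear dependence on wavelength separation is an assumption, not a measurement."""
    if prev_nm is None:
        return 0.3
    base = 0.039 + 0.22 * abs(next_nm - prev_nm) / 260.0
    return base * (1.0 if temp_c >= 25.0 else 2.0)

def acquire_nir_sequence(wavelengths_nm, tune_lctf, trigger_exposure, temp_c=25.0):
    """Tune, wait for the LCTF switch to settle, then expose, for each requested band."""
    prev = None
    for wl in wavelengths_nm:
        tune_lctf(wl)
        time.sleep(settle_time_s(prev, wl, temp_c))  # exposure starts only after settling
        trigger_exposure()
        prev = wl

# Example with dummy hardware callables standing in for the SHU/HPT interface:
if __name__ == "__main__":
    acquire_nir_sequence([665, 690, 700, 720, 750, 873],
                         tune_lctf=lambda wl: print(f"LCTF -> {wl} nm"),
                         trigger_exposure=lambda: print("expose NIR CCD"))
```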

Figure 4. Photograph of the liquid crystal tunable filter (LCTF) (unit: mm).


3. Image Analysis Methods and Results

3.1. Color Images

A true color composite image is synthesized from red, green, and blue (RGB) band images captured synchronously by the HPT. The fields of view of the three bands deviate spatially from one another because of the optical alignment of their CCD image sensor modules. While the CCD modules are aligned precisely along the optical axis by the focus adjustment, they have an alignment precision of ~100 µm in the direction perpendicular to the optical axis, which corresponds to a deviation of a few tens of pixels. Therefore, band-to-band registration using a feature-based method is implemented in the image processing procedure. The scale-invariant feature transform method [12] is used for feature matching between the blue and green, and between the blue and red band images. From a single set of the two band images for a certain scene, their matched feature points are limited in number, and they are localized within the frame because of the difference in the spectral radiance of the different spectral bands. The matched feature points, which are usually found at the edges of black or white objects and clouds, are accumulated from many sets of different scenes, and a homography matrix is estimated from all of the matched feature points. The homography matrix describes a transformation that maps points in one plane to the corresponding points in another plane, and it is widely used for georeferencing. Based on each estimated homography matrix, the green and red band images are transformed into the coordinate system of the blue band image. Once a sufficiently precise transformation is accomplished, the same homography matrix can be applied to any other RGB band images.
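A minimal sketch of this feature-based registration step, using the OpenCV SIFT implementation mentioned in the Acknowledgments. For brevity it estimates the homography from a single image pair, whereas the HPT processing accumulates matched points over many scenes before estimating one matrix per band pair; the ratio-test and RANSAC thresholds below are illustrative assumptions.

```python
import cv2
import numpy as np

def estimate_band_homography(ref_band: np.ndarray, other_band: np.ndarray) -> np.ndarray:
    """Estimate a 3x3 homography mapping `other_band` into the frame of `ref_band`."""
    sift = cv2.SIFT_create()
    kp_ref, des_ref = sift.detectAndCompute(ref_band, None)
    kp_other, des_other = sift.detectAndCompute(other_band, None)

    # Lowe's ratio test keeps only distinctive matches (0.75 is a common,
    # illustrative threshold, not necessarily the one used for the HPT).
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    good = [m for m, n in matcher.knnMatch(des_other, des_ref, k=2)
            if m.distance < 0.75 * n.distance]

    src = np.float32([kp_other[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp_ref[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    return H

def register_band(ref_band: np.ndarray, other_band: np.ndarray) -> np.ndarray:
    """Warp `other_band` (e.g., green or red) into the coordinate system of `ref_band` (blue)."""
    H = estimate_band_homography(ref_band, other_band)
    return cv2.warpPerspective(other_band, H, (ref_band.shape[1], ref_band.shape[0]))
```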
Figure 5 shows a true color image composed of the RGB bands selected from the "first light" scenes, which were acquired about one month after the launch of the RISING-2 microsatellite. For comparison, a true color image composed of Bands 1, 2, and 3 of the Enhanced Thematic Mapper Plus (ETM+) sensor on the Landsat 7 satellite with 30-m spatial resolution is also shown in Figure 5. The area in the scene is famous for its production of rice in Japan, and many rectangular paddy fields roughly 100 m × 50 m in size are visible in the image. Agricultural roads that are typically a few meters wide, which separate the paddy fields, are evident in the RISING-2 HPT image, but invisible in the Landsat 7 ETM+ image. This image demonstrates clear evidence of the 5-m resolution targeted by the HPT.

Figure 5. A true color composite image of Minami-Uonuma City, Niigata Prefecture, Japan, acquired by the RISING-2 HPT on 2 July 2014, overlaid on a true color composite image acquired by the Landsat 7 Enhanced Thematic Mapper Plus (ETM+) on 30 May 2014.


3.2. Multispectral Images

Multispectral images selectable from 650–1050 nm are acquired with the NIR band of the HPT. A combination of the selected wavelengths is configured in advance, and multispectral images are captured sequentially at a scheduled time and at a specified time interval.

Figure 6 shows a series of multispectral images acquired at 665 nm, 690 nm, 700 nm, 720 nm, 750 nm, and 873 nm. The area in these scenes is dominated by forests, whose spectral reflectance is lower at 665 nm and 690 nm (the red region), and higher at 750 nm and 873 nm (the NIR region). Due to attitude fluctuations during the off-nadir pointing of the satellite, these scenes deviate from one another spatially. Hence, band-to-band registration using the feature-based method is also required for the processing of the multispectral images prior to the calculation of indices such as the normalized difference vegetation index (NDVI), which is derived from red and NIR images. The NDVI is calculated from the following expression:

NDVI = (ρ_NIR − ρ_RED) / (ρ_NIR + ρ_RED)    (1)

where ρ_RED and ρ_NIR are the surface reflectance in the red and NIR regions, respectively. However, the result of feature matching between the red and NIR images is usually poor because of the differences in their spectral reflectance. As seen in Figure 6, the multispectral images at 720 nm, belonging to the "red edge" between the red and NIR regions, have features common to both regions, and sufficient numbers of matched feature points. Therefore, the feature matching of multispectral images is iterated sequentially with its neighboring spectral band from the red to the NIR through the red edge, before all of the images are transformed to the coordinate system of the NIR image. This spectrally sequential image registration method is also effective for the analysis of hyperspectral images obtained by a wavelength-scanning imager on an unmanned aerial vehicle [13].
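A sketch of this spectrally sequential registration: each band is matched only against its spectral neighbor, and the pairwise homographies are chained toward the NIR reference. It reuses the hypothetical estimate_band_homography helper from the sketch in Section 3.1, and the band ordering follows the example in Figure 6.

```python
import numpy as np
import cv2

def register_to_nir(images_by_wavelength: dict[int, np.ndarray]) -> dict[int, np.ndarray]:
    """Warp every band into the coordinate system of the longest-wavelength (NIR) image.

    Each homography is estimated only between neighboring wavelengths (so that red
    and NIR are linked via the red edge), then chained toward the NIR band.
    Relies on `estimate_band_homography` from the previous sketch.
    """
    wavelengths = sorted(images_by_wavelength)        # e.g. [665, 690, 700, 720, 750, 873]
    nir_wl = wavelengths[-1]
    h, w = images_by_wavelength[nir_wl].shape[:2]

    registered = {nir_wl: images_by_wavelength[nir_wl]}
    H_to_nir = np.eye(3)
    # Walk from the NIR reference back toward the red end, accumulating homographies.
    for shorter, longer in zip(wavelengths[-2::-1], wavelengths[:0:-1]):
        H_pair = estimate_band_homography(images_by_wavelength[longer],
                                          images_by_wavelength[shorter])
        H_to_nir = H_to_nir @ H_pair                  # shorter -> longer -> ... -> NIR
        registered[shorter] = cv2.warpPerspective(images_by_wavelength[shorter],
                                                  H_to_nir, (w, h))
    return registered
```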

Figure 6. Multispectral images of Shimonoseki City, Yamaguchi Prefecture, Japan, acquired on 29 May 2015.

In order to calculate the NDVI, the 10-bit digital numbers for each pixel after the image registration are converted to surface reflectance ρ using the following steps. (1) The digital number is converted into the spectral radiance at the sensor's aperture L_sat using the unit conversion coefficient, which is obtained by the preflight calibration test using an integrating sphere as the uniform light source. (2) An image-based atmospheric correction is applied to estimate the path radiance L_p using the dark object subtraction (DOS) method [14,15]. (3) The surface reflectance is calculated as:

ρ = π (L_sat − L_p) d² / (E cos θ)    (2)

where d is the Earth–Sun distance, E is the solar exoatmospheric irradiance, and θ is the solar zenith angle.
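Steps (1) to (3) and Equation (1) can be condensed into a short NumPy sketch. The gain/offset calibration coefficients and the 1% dark-object assumption in the DOS step below are illustrative assumptions, not the HPT calibration values or necessarily the exact DOS variant used by the authors.

```python
import numpy as np

def dn_to_radiance(dn: np.ndarray, gain: float, offset: float) -> np.ndarray:
    """Step (1): convert 10-bit digital numbers to at-sensor spectral radiance.
    `gain` and `offset` stand in for the preflight unit conversion coefficients,
    which are not listed in the paper."""
    return gain * dn.astype(np.float64) + offset

def surface_reflectance(l_sat: np.ndarray, esun: float, sun_zenith_deg: float,
                        d_au: float = 1.0, dark_reflectance: float = 0.01) -> np.ndarray:
    """Steps (2) and (3): simple dark object subtraction followed by Equation (2).

    The path radiance is estimated from the darkest pixels of the scene, assumed to
    correspond to a `dark_reflectance` (1%) target; this is one common DOS variant
    [14,15], not necessarily the exact scheme applied to the HPT data.
    """
    cos_theta = np.cos(np.deg2rad(sun_zenith_deg))
    l_dark = np.percentile(l_sat, 1)                          # darkest pixels of the scene
    l_path = l_dark - dark_reflectance * esun * cos_theta / (np.pi * d_au**2)
    return np.pi * (l_sat - l_path) * d_au**2 / (esun * cos_theta)

def ndvi(rho_red: np.ndarray, rho_nir: np.ndarray) -> np.ndarray:
    """Equation (1)."""
    return (rho_nir - rho_red) / (rho_nir + rho_red)
```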

Figure 7a,b shows the NDVI derived from the multispectral images in Figure 6. Images at 665 nm and 873 nm were used for the calculation of spectral reflectance in the red and NIR regions, respectively. A true color image comprising Bands 2, 3, and 4 of the Operational Land Imager (OLI) on the Landsat 8 satellite is also shown in Figure 7a for comparison. Figure 7c shows the NDVI derived from Bands 4 and 5 of the Landsat 8 OLI, acquired eight days before the RISING-2 HPT images and corrected similarly using the DOS method. Although the spatial resolution of the RISING-2 HPT in this scene is 7.2 m and approximately four times higher than the 30-m spatial resolution of the Landsat 8 OLI, the NDVI values in Figure 7b are in good agreement with those in Figure 7c. This agreement indicates that the multispectral image analysis was implemented successfully, and that the RISING-2 HPT was calibrated accurately during the preflight tests.

Figure 7. (a) Normalized difference vegetation index (NDVI) values derived from the multispectral images in Figure 6 overlaid on a true color composite image acquired by the Landsat 8 Operational Land Imager (OLI) on 21 May 2015, (b) enlarged view of the area framed by the black square in (a), (c) NDVI values derived from the Landsat 8 OLI bands for the same area.

4. Future Research

Although the HPT project on the RISING-2 microsatellite was a pilot study, the results achieved were utilized for next-generation satellites. Multispectral sensors using LCTF technology have been developed further at Hokkaido University, and installed on subsequent microsatellites, e.g., the DIWATA-1, DIWATA-2, MicroDragon, and Rapid International Scientific Experiment Satellite (RISESAT). In these satellite sensors, an LCTF covering the visible spectral region is used in conjunction with that for the NIR. Over the past seven years, since the HPT project started, other related technologies have also been improved. The capacities of the volatile and non-volatile memories in the SHU have been increased up to 256 MB and 32 GB, respectively, and the downlink data rate with X-band telemetry has increased up to 10 Mbps. These technological developments have made multispectral remote sensing with LCTF technology more effective and operational. In fact, the DIWATA-1 microsatellite, which was launched on 23 March 2016, now routinely acquires multispectral images of several spectral bands over selected sites in the Philippines. The MicroDragon satellite will be launched in 2018, and will operate over Vietnam for ocean-color remote sensing, which requires at least eight spectral bands from the visible to NIR spectral regions [16].

The RISESAT [17], which is intended for launch together with the MicroDragon satellite, will have 5-m resolution (or higher) imaging in both the visible and the NIR regions. The image analysis methods presented in this paper can be utilized for images acquired by the above microsatellites that will be operated by members of the Asian microsatellite consortium, which was established in 2016 to share microsatellite technologies, observation data, and data application methods between its members.

5. Conclusions

The HPT on the RISING-2 microsatellite was developed to demonstrate the efficacy of high spatial resolution multispectral remote sensing from nano/microsatellite platforms. LCTF technology was applied for the first time to realize a compact spaceborne multispectral sensor. The advantages offered by the LCTF technology are reductions in the size, weight, and power consumption of the sensor, the flexibility of the spectral bands and data volume, and compatibility with off-nadir pointing. The RGB color and NIR multispectral images were acquired successfully by the HPT and processed by the image analysis methods presented here. LCTF technology, as demonstrated by its application in the HPT on the RISING-2 microsatellite, can be considered an innovative tool suitable for future multi/hyperspectral remote sensing by nano/microsatellites.

Acknowledgments: This research was supported by the microsatellite research and development program of the Ministry of Education, Culture, Sports, Science and Technology (MEXT), Japan. This research was also supported by the Japan Society for the Promotion of Science (JSPS) Core-to-Core Program, B. Asia-Africa Science Platforms. The scale-invariant feature transform algorithm, available from the OpenCV library, was used for feature matching. Landsat images were downloaded from the United States Geological Survey website.

Author Contributions: J.K. and Y.T. conceived and designed the experiments; Y.S., T.K., and K.Y. performed the experiments and contributed materials; J.K. analyzed the data and wrote the paper.

Conflicts of Interest: The authors declare no conflict of interest.

References

1. Buchen, E. Small Satellite Market Observations. In Proceedings of the AIAA/USU Conference on Small Satellites, Logan, UT, USA, 8–13 August 2015; SSC15-VII-7.
2. Sandau, R.; Brieß, K.; D'Errico, M. Small satellites for global coverage: Potential and limits. ISPRS J. Photogramm. Remote Sens. 2010, 65, 492–504. [CrossRef]
3. Roberto, R.D.; Nascetti, A.; Paolozzi, A.; Paris, C. Optical payload for high-resolution Earth imaging suitable for microsatellites. In Proceedings of the IEEE 15th International Conference on Environment and Electrical Engineering, Rome, Italy, 10–13 June 2015.
4. Staenz, K.; Mueller, A.; Heiden, U. Overview of terrestrial missions. In Proceedings of the IEEE International Geoscience and Remote Sensing Symposium, Melbourne, VIC, Australia, 21–26 July 2013; pp. 3502–3505.
5. Guelman, M.; Ortenberg, F. Small satellite's role in future hyperspectral Earth observation missions. Acta Astronaut. 2009, 64, 1252–1263. [CrossRef]
6. Villafranca, A.G.; Corbera, J.; Martín, F.; Marchán, J.F. Limitations of hyperspectral Earth observation on small satellites. J. Small Satell. 2012, 1, 19–29.
7. Lier, P.; Bach, M. PARASOL a microsatellite in the A-Train for Earth atmospheric observations. Acta Astronaut. 2008, 62, 257–263. [CrossRef]
8. Sakamoto, Y.; Sugimura, N.; Fukuda, K.; Kuwahara, T.; Yoshida, K.; Kurihara, J.; Fukuhara, T.; Takahashi, Y. Development and Flight Results of Microsatellite Bus System for RISING-2. Trans. JSASS Aerosp. Technol. Jpn. 2016, 14, Pf_89–Pf_96. [CrossRef]
9. Yoshida, K.; Takahashi, Y.; Sakamoto, Y.; Ujiie, E.; Takiuchi, K.; Nakazato, Y.; Sawakami, T.; Sakanoi, T.; Kasaba, Y.; Kondo, S.; et al. SPRITE-SAT: A Micro Satellite for Scientific Observation of Transient Luminous Events and Terrestrial Gamma-ray Flashes. Trans. JSASS Aerosp. Technol. Jpn. 2010, 8, Tm_7–Tm_12. [CrossRef]

10. Fukuda, K.; Nakano, T.; Sakamoto, Y.; Kuwahara, T.; Yoshida, K.; Takahashi, Y. Attitude control system of micro satellite RISING-2. In Proceedings of the IEEE/SICE International Symposium on System Integration, Sendai, Japan, 21–22 December 2010; pp. 373–378.
11. Gat, N. Imaging spectroscopy using tunable filters: A review. In Proceedings of the AeroSense 2000, Orlando, FL, USA, 24–28 April 2000.
12. Lowe, D.G. Distinctive image features from scale-invariant keypoints. Int. J. Comput. Vis. 2004, 60, 91–110. [CrossRef]
13. Ishida, T.; Kurihara, J.; Viray, F.A.; Namuco, S.B.; Paringit, E.C.; Perez, G.J.; Takahashi, Y.; Marciano, J.J., Jr. A novel approach for vegetation classification using UAV-based hyperspectral imaging. Comput. Electron. Agric. 2018, 144, 80–85. [CrossRef]
14. Chavez, P.S., Jr. Image-based atmospheric correction—Revisited and improved. Photogramm. Eng. Remote Sens. 1996, 62, 1025–1036.
15. Sobrino, J.A.; Jiménez-Muñoz, J.C.; Paolini, L. Land surface temperature retrieval from LANDSAT TM 5. Remote Sens. Environ. 2004, 90, 434–440. [CrossRef]
16. IOCCG. Minimum Requirements for an Operational, Ocean-Colour Sensor for the Open Ocean. In Reports of the International Ocean-Colour Coordinating Group, No. 1; Morel, A., Ed.; IOCCG: Dartmouth, NS, Canada, 1998.
17. Kuwahara, T.; Yoshida, K.; Sakamoto, Y.; Tomioka, Y.; Fukuda, K.; Fukuyama, M.; Shibuya, Y. International Scientific Micro-satellite RISESAT based on Space Plug and Play Avionics. In Proceedings of the AIAA/USU Conference on Small Satellites, Logan, UT, USA, 13–16 August 2012.

© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).