
Article

Ultraviolet Imaging with Low Cost Sensors: Development and Application of a Raspberry Pi-Based UV Camera

Thomas C. Wilkes 1,*, Andrew J. S. McGonigle 1,2, Tom D. Pering 1, Angus J. Taggart 1, Benjamin S. White 3, Robert G. Bryant 1 and Jon R. Willmott 3

1 Department of Geography, The University of Sheffield, Winter Street, Sheffield S10 2TN, UK; a.mcgonigle@sheffield.ac.uk (A.J.S.M.); t.pering@sheffield.ac.uk (T.D.P.); a.taggart@sheffield.ac.uk (A.J.T.); r.g.bryant@sheffield.ac.uk (R.G.B.)
2 Istituto Nazionale di Geofisica e Vulcanologia, Sezione di Palermo, via Ugo La Malfa 153, Palermo 90146, Italy
3 Department of Electronic and Electrical Engineering, The University of Sheffield, Portobello Centre, Pitt Street, Sheffield S1 4ET, UK; ben.white@sheffield.ac.uk (B.S.W.); j.r.willmott@sheffield.ac.uk (J.R.W.)
* Correspondence: tcwilkes1@sheffield.ac.uk; Tel.: +44-7798-837-894

Academic Editor: Gonzalo Pajares Martinsanz Received: 5 September 2016; Accepted: 3 October 2016; Published: 6 October 2016

Abstract: Here, we report, for what we believe to be the first time, on the modification of a low cost sensor, designed for the smartphone camera market, to develop an ultraviolet (UV) camera system. This was achieved via adaptation of Raspberry Pi camera modules, which are based on back-illuminated complementary metal-oxide semiconductor (CMOS) sensors, and we demonstrated the utility of these devices for applications at wavelengths as low as 310 nm, by remotely sensing power station smokestack emissions in this spectral region. Given the very low cost of these units, ≈ USD 25, they are suitable for widespread proliferation in a variety of UV imaging applications, e.g., in atmospheric science, volcanology, forensics and surface smoothness measurements.

Keywords: UV camera; UV imaging; smartphone sensor technology; sulphur dioxide emissions; Raspberry Pi; low-cost camera

1. Introduction

Ultraviolet (UV) imaging has a wide variety of scientific, industrial and medical applications, for instance in forensics [1], industrial fault inspection [2], astronomy, monitoring skin conditions [3] and in remote sensing [4,5]. To date, scientific grade UV cameras, which have elevated quantum efficiencies in this spectral region, have been applied in this context. However, these systems are relatively expensive (typical unit costs of thousands of dollars) and can be power intensive, since they may incorporate thermo-electric cooling. Although these units may provide high signal-to-noise ratios, a lower price point solution could expedite more widespread implementation of UV imaging.

Recently, considerable effort has been invested in developing low cost back-illuminated complementary metal-oxide semiconductor (CMOS) sensor technology. Previously, this sensor architecture was applied only within specialist, low light imaging arenas, e.g., in astronomy. In the last few years, however, the manufacturing costs of these devices have been reduced markedly, such that they now feature prominently in consumer electronic products, particularly smartphones. The key advantage of the back-illuminated sensor architecture, over the conventional CMOS configuration, is that the photo-diodes are placed in front of the metal wiring matrix layer of the sensor, thereby improving fill factor and substantially increasing optical throughput to the photoreceptors, particularly in the UV, where these detectors are photosensitive.


To date, however, application of these inexpensive sensors has predominantly been focused on visible imaging, due to the choice of fore-optics and the Bayer filter layer applied to the sensors, in order to generate red-green-blue (RGB) mosaics. Recent studies have highlighted that these smartphone cameras do have some UV sensitivity [6–8], even with the above optical arrangement. Here, we build on this work by modifying such a camera sensor, to maximize UV throughput to the sensor. To the best of our knowledge, this constitutes the first report of an imaging system specifically adapted for the UV, based on a back-illuminated CMOS device developed for the smartphone market. In particular, we developed a UV camera system, based on Raspberry Pi camera boards, and demonstrated its utility for UV imaging applications down to 310 nm, via a case study involving remote sensing of sulphur dioxide (SO2) emissions from a power station smokestack.

2. UV Camera Development

This work was achieved using Raspberry Pi Camera Module v1.3 boards (referred to as PiCams hereafter), of cost ≈ USD 25 (Raspberry Pi Foundation). The PiCam is based on an Omnivision OV5647 back-illuminated CMOS sensor, developed primarily for the mobile phone market; the OV5647 is a 1/4” 5-megapixel (2592 × 1944 active array) backlit-CMOS unit, with 8-/10-bit RGB/RAW image output. PiCam boards were chosen here due to the ease of data acquisition and image processing using Raspberry Pi computer boards (Raspberry Pi Foundation), via the Python programming language. Due to their low expense and power consumption, Raspberry Pi computers are increasingly being used in a variety of sensing applications (e.g., [9,10]). Whilst this work is focused on one sensor type, preliminary tests with another such back-illuminated CMOS device, the Sony IMX219 (Sony Corporation, Minato, Tokyo, Japan), also evidence UV sensitivity.

The PiCam lens and filter, housed above the sensor within a plastic casing unit, both absorb the UV signal, and were therefore removed as a first step to developing the UV camera. This was achieved chemically using a Posistrip® EKC830™ (DuPont, Wilmington, DE, USA) bath. To improve UV sensitivity of the sensor, we then removed the microlens and Bayer filter layers, the latter of which masks the sensor in a mosaic of RGB colour filters (Figure 1), and attenuates much of the incident UV radiation. This process effectively turns the PiCam into a monochrome sensor.

Figure 1. A schematic (Centre) of the Bayer filter array and its positioning on the photodetector array. Also shown are microscope images of the PiCam sensor pre- (Left) and post- (Right) Bayer removal process.

Whilst the filter can be removed by careful scratching (a range of tools, such as metal tweezers or a pointed wooden object, can be used for this process), a far more uniform finish is achievable chemically; we adopted the latter approach using a five step procedure. We first submerged the sensors in photoresist remover (Posistrip® EKC830™) and heated (70–100 °C) until the filter was entirely removed; this generally takes 10–30 minutes, depending on the level of applied heating and agitation. A second bath of this remover was then applied to optimise the cleaning procedure. Following this, the photoresist remover was washed from the sensor by a three stage cleaning procedure, using successive baths of n-butyl acetate, acetone and isopropyl alcohol. All the above chemicals were applied undiluted. For each of these steps, the sensor was submerged on the order of minutes, to ensure a thorough cleaning. Careful and rigorous application of this methodology can result in uniform and clean sensors which exhibit greatly enhanced UV sensitivity relative to non-de-Bayered units. For example, a clear-sky image taken with a partially de-Bayered unit, captured through a 310 nm filter and with a shutter speed of 400 ms, exhibits an increase of ≈ 600% in a de-Bayered region relative to the unmodified section of the sensor.

For image capture, it is necessary to retrieve RAW sensor data, rather than the standard JPEG image output, as the former excludes the de-Bayering algorithm and image processing, which results in a non-linear response from the sensor. This, and all the below acquisition and processing steps, were achieved with in-house authored Python codes. Here the RAW images were saved to the camera, where they were stored in the metadata of the 8-bit JPEG image. These binary data are then extracted and saved as PNG images, to preserve the 10-bit RAW digital number (DN) format, in files of ≈ 6 MB. These images could then be straightforwardly processed and analysed.

A UV transmissive anti-reflection (AR) coated plano-convex quartz lens of 6 mm diameter and 9 mm focal length (Edmund Optics Ltd., Barrington, NJ, USA; $100) was then mounted to the fore of the PiCam, using an in-house designed, 3D printed lens holder, attached to the camera board using its pre-existing mount holes; this provided a field of view of ≈ 28°. This lens housing consists of two parts, enabling straightforward focusing via screw thread adjustment. As the housing covers the entire camera board, it was necessary to disable the board’s light-emitting diode (LED), which, by default, is programmed to turn on during image acquisition (see Figure 2).
By way of comparison, the all-in cost of this camera configuration (e.g., PiCam, lens, filter, filter holder, chemical removal costs), including a UV bandpass filter, is ≈ USD 200–300, depending on the filter, in comparison to a typical figure of at least ten times the upper value for such a system based on currently available scientific grade UV cameras.
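For illustration, a minimal Python sketch of the raw-capture step described above is given below. It assumes the standard picamera library (with numpy, and OpenCV for 16-bit PNG writing) rather than the in-house codes referred to in the text, and the exposure time and file name are nominal.

```python
import time

import cv2
import numpy as np
import picamera
import picamera.array

# framerate=1 is needed so that the requested shutter speed is not capped
# at the default 1/30 s frame period.
with picamera.PiCamera(resolution=(2592, 1944), framerate=1) as camera:
    camera.shutter_speed = 400 * 1000      # 400 ms exposure, set in microseconds
    time.sleep(2)                          # allow exposure/gain to settle ...
    camera.exposure_mode = 'off'           # ... then freeze the analog gain

    with picamera.array.PiBayerArray(camera) as stream:
        # bayer=True embeds the raw 10-bit sensor data in the JPEG metadata;
        # PiBayerArray strips it back out into a numpy array.
        camera.capture(stream, format='jpeg', bayer=True)
        # On a de-Bayered sensor the colour planes carry no colour information,
        # so summing over the colour axis collapses the mosaic into a single
        # 2D monochrome array of raw DNs.
        raw = stream.array.sum(axis=2).astype(np.uint16)

cv2.imwrite('sky_310nm_400ms.png', raw)    # 16-bit PNG preserves the RAW DNs
```

Writing to 16-bit PNG keeps the full DN range available for the linearity and absorbance analyses that follow.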

Figure 2. A profile image of the Raspberry Pi Camera Module (Right), and the modified system with custom built optics (Left). The custom design is bolted to the camera board using the pre-existing mount holes.

Following construction of the imaging system, the sensor linearity, which is important for quantitative applications, and UV sensitivity were tested. This was achieved by mounting a UV bandpass filter, of 12.5 mm diameter, centred on 310 nm and of 10 nm full width at half maximum (FWHM) (Edmund Optics Ltd., Barrington, NJ, USA), to the fore of the camera lens; these filters have no other transmission features in the spectral sensitivity range of the CMOS detectors.

To test the UV sensor response, images of uniformly illuminated clear-sky were taken through the 310 nm filter at varying shutter speeds; these experiments were performed in Sheffield, UK, at approximately 11:00 local time (solar zenith angle of around 50°), during February 2016. At each shutter speed four images were captured and analog gain was fixed throughout, for consistency; the operating software does not allow gain specification, but it does enable this to be fixed, following stabilization of the system after the camera start-up process. By averaging pixel DNs from an 800 × 600 pixel region in the centre of each image, we observed a linear increase in average pixel DN relative to shutter speed, for the RAW 10-bit sensor data (Figure 3A). Furthermore, the RAW images demonstrated near saturation at shutter speeds well below 1 s, in a spectral region where there is very little skylight due to ozone absorption, indicating usable UV sensor sensitivity.
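A sketch of this linearity check, operating on clear-sky frames saved as above, might proceed as follows; the shutter speeds and file names here are illustrative rather than those used in the experiment.

```python
import cv2
import numpy as np

shutter_ms = [50, 100, 200, 400, 800]          # illustrative exposure times
mean_dn = []
for t in shutter_ms:
    img = cv2.imread('sky_310nm_%dms.png' % t, cv2.IMREAD_UNCHANGED).astype(float)
    rows, cols = img.shape
    # Average the DNs over a central 800 x 600 pixel crop of each frame.
    crop = img[rows // 2 - 300:rows // 2 + 300, cols // 2 - 400:cols // 2 + 400]
    mean_dn.append(crop.mean())

# Straight-line fit of mean DN against shutter speed (cf. Figure 3A); small
# residuals indicate that the RAW output is linear over the tested range.
slope, intercept = np.polyfit(shutter_ms, mean_dn, 1)
residuals = np.array(mean_dn) - (slope * np.array(shutter_ms) + intercept)
print('DN per ms: %.2f; max |residual|: %.1f DN' % (slope, np.abs(residuals).max()))
```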

Figure 3. Plots of average pixel signal (in digital number; DN) vs. shutter speed (ms) for a cropped region (800 × 600 pixels) of clear-sky images taken at 310 nm: (A) the 10-bit RAW image output shows a linear increase in DN with respect to shutter speed; (B) the 8-bit standard output JPEG image shows a non-linear response in all three of the red-green-blue (RGB) channels (Red-channel DNs are plotted here) in line with gamma correction. An error bar is inserted on one data point per exposure time, indicating the standard deviation of the pixel intensities in the cropped region; the bar heights are approximately the same for all points of equivalent shutter speed, and just one bar is displayed for clarity.

3. Measurements of Power Station Sulphur Dioxide Emissions

Sulphur dioxide has strong UV absorption bands between 300 and 320 nm [11], which have been exploited in a range of atmospheric remote sensing measurements of this species, using differential optical absorption spectroscopy and UV imagery [12–14]. In particular, power stations release SO2 to the atmosphere from their smokestacks, and remotely sensing these emissions is one means of ensuring regulatory compliance and constraining effects on the atmosphere [4,5,15,16].

In order to demonstrate proof of concept of the utility of the developed sensors for UV imaging applications, we deployed two UV PiCam units at Drax power station in the United Kingdom over a ≈ 15 min period on 15 August 2016. Bandpass filters, centred at 310 and 330 nm (10 nm FWHM; Edmund Optics Ltd.), were mounted to the fore of the co-aligned cameras, one on one unit, with the other filter on the other camera. Both devices simultaneously imaged the rising smokestack plume. Shutter speeds of ≈ 300 and 40 ms, respectively, were applied for the cameras, in view of the greater scattered skylight intensity at the latter wavelength, and images were acquired at 0.25 Hz. Camera-to-camera pixel mapping was achieved subsequently in software, using the smokestack as a reference. The acquisition and retrieval protocols followed those of Kantzas et al. [17], in particular by collecting dark images for each camera, which were subtracted from all acquired sky images. Clear sky images, taken adjacent to the plume, were also acquired, then normalised to generate a mask, which all acquired plume images were divided by; this eliminates vignetting as well as any sensor non-uniformity which may have resulted from the de-Bayering process.

As SO2 absorbs at 310 nm, but not at 330 nm, contrasting resulting images provides a means of constraining the spatial distribution of gas concentration across the field of view, whilst eliminating sources of extinction common to both wavelengths, e.g., due to aerosols. In particular, SO2 apparent absorbance (AA) for each pixel was determined using the following relation [17]:

AA = -\log_{10}\left[\frac{IP_{310}/IB_{310}}{IP_{330}/IB_{330}}\right]    (1)

where IP is the in-plume intensity, and IB is the background intensity, taken as the average value in the clear sky adjacent to the rising gas plume; the subscripts specify the camera filter in question, for the dark subtracted, mask corrected imagery. Calibration of AA was then carried out using quartz cells containing known column amounts of SO2 (in our case 0, 300 and 900 ppm.m), in particular measuring AA values for each cell when pointing at plume free sky, plotting concentration vs. these, and then multiplying all acquired plume image AA values by this gradient. The cell column amounts were verified using an Ocean Optics Inc. USB2000 spectrometer, using the VolcanoSO2 differential optical absorption spectroscopy code [18].

Figure 4C shows a typical calibrated SO2 image, after binning pixels to a 648 × 486 array to reduce noise, highlighting the ability of the cameras to clearly resolve the smokestack emissions. Furthermore, background image noise levels are remarkably low (typically ≈ 25 ppm.m standard deviation in a 648 × 486 binned image), in contrast to values of ≈ 30 ppm.m, previously reported from more expensive scientific grade camera systems in atmospheric SO2 monitoring [5], indicating the potential of these low cost units in UV imaging applications.
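The processing chain described above (dark subtraction, clear-sky mask correction, Equation (1) and the cell calibration) reduces to a few lines of numpy. The sketch below is illustrative rather than a reproduction of the in-house codes: the array names are hypothetical, the two image streams are assumed to be co-registered already, and the cell AA readings are placeholders rather than measured values.

```python
import numpy as np

def correct(plume, dark, clear_sky):
    """Dark-subtract, then divide by a normalised clear-sky mask to remove
    vignetting and any non-uniformity left by the de-Bayering process."""
    dark_sub = plume.astype(float) - dark
    mask = clear_sky.astype(float) - dark
    return dark_sub / (mask / mask.max())

def apparent_absorbance(ip310, ib310, ip330, ib330):
    """Per-pixel AA = -log10[(IP310/IB310) / (IP330/IB330)], Equation (1).

    IP terms are corrected plume images; IB terms are scalar averages of the
    clear sky adjacent to the plume in the corresponding corrected image.
    """
    return -np.log10((ip310 / ib310) / (ip330 / ib330))

# Calibration: AA measured through quartz cells of known SO2 column amount
# while viewing plume-free sky; the fitted gradient converts AA to ppm.m.
cell_ppmm = np.array([0.0, 300.0, 900.0])
cell_aa = np.array([0.00, 0.05, 0.15])             # placeholder readings
gradient = np.polyfit(cell_aa, cell_ppmm, 1)[0]    # ppm.m per unit AA

# Applied to each co-registered image pair:
# p310 = correct(raw310, dark310, sky310); p330 = correct(raw330, dark330, sky330)
# so2_ppmm = gradient * apparent_absorbance(p310, bg310, p330, bg330)
# where bg310/bg330 are mean DNs of a plume-free sky region in p310/p330.
```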

Figure 4. (A) A cropped image of the Drax smokestack taken at 310 nm with a shutter speed of 300 ms. The initial image pixels are binned to generate a pixel resolution of 648 × 486, to reduce noise; dark image subtraction and mask corrections have been applied. (B) As in (A) but at 330 nm with a shutter speed of 40 ms. (C) The resulting calibrated SO2 image of Drax power station stack and plume showing the clear capacity of the system to resolve the plume emissions. (D) A cross-section of (C) showing gas concentrations along the row delineated by the red line. The background noise level can be clearly observed between pixels 300 to 350.

By integrating across the plume, perpendicular to the plume transport direction, integrated column amounts (ICAs) of SO2 were obtained, and a time series of ICAs through successive images was produced. A plume speed of ≈ 4.9 m/s was calculated by generating two such ICA time series, determined for parallel plume cross sections, located at different distances from the source, and then cross correlating these. This plume speed was then multiplied by the ICA values to generate a SO2 flux time series, as shown in Figure 5. The SO2 flux was found to be relatively stable from the smokestack, with a mean emission of 0.44 kg/s; nevertheless, notable gas “puffs” are apparent, and can be seen both in the flux time series and SO2 absorption image video (see Video S1 in the supplementary materials). The Drax Annual Reviews of Environmental Performance [19] provide the best available ancillary emissions data for comparison, stating annual SO2 loadings ranging 24.5–35.1 kt/yr for the period 2008–2013, and thus, average annual fluxes ranging 0.78–1.1 kg/s. These data corroborate our observations, falling within an order of magnitude, given that Drax’s operational output will fall somewhat below the mean annual value during the British summer.
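The flux retrieval can be sketched as follows. The pixel scale, the cross-section rows and the ppm.m-to-mass conversion (≈ 2.66 × 10^-3 g m^-2 per ppm.m at roughly 20 °C and 1 atm) are stated assumptions for illustration, not values taken from the field deployment.

```python
import numpy as np

PIXEL_SIZE_M = 0.5           # metres spanned by one binned pixel at the plume (assumed)
FRAME_INTERVAL_S = 4.0       # 0.25 Hz acquisition
G_PER_M2_PER_PPMM = 2.66e-3  # approx. SO2 mass column per ppm.m (~20 degC, 1 atm)

def integrated_column(so2_ppmm_image, row):
    """Integrated column amount (ICA) along one image row crossing the plume."""
    return so2_ppmm_image[row, :].sum() * PIXEL_SIZE_M      # units: ppm.m * m

def best_lag(downstream, upstream, max_lag):
    """Frame lag maximising the correlation between two ICA time series."""
    scores = [np.corrcoef(upstream[:len(upstream) - lag], downstream[lag:])[0, 1]
              for lag in range(max_lag + 1)]
    return int(np.argmax(scores))

def so2_flux_kg_per_s(ica_upstream, ica_downstream, separation_m, max_lag=20):
    """Cross-correlate the two ICA series for plume speed, then scale to flux."""
    lag = best_lag(np.asarray(ica_downstream), np.asarray(ica_upstream), max_lag)
    speed = separation_m / (lag * FRAME_INTERVAL_S)          # m/s; lag assumed > 0
    return speed * np.asarray(ica_upstream) * G_PER_M2_PER_PPMM / 1000.0
```

In the present data set, the lag between two such cross sections of known separation yielded the ≈ 4.9 m/s plume speed, which scales the ICA series into the flux time series of Figure 5.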

Figure 5. Time series of SO2 flux from Drax power station for a 15 min acquisition period.

4. Discussion and Concluding Remarks

In this paper we have presented, for the first time, a relatively simple methodology for the development of low-cost UV cameras, based on inexpensive sensors developed for the smartphone market. We show that by modification of Raspberry Pi camera sensors, and rebuilding of the optical systems, camera boards as cheap as ≈ USD 25 can be adapted to applications down to wavelengths of at least 310 nm. The potential utility of such devices in UV imaging applications was illustrated via a case study, in which we used the cameras to perform ultraviolet remote sensing of SO2 fluxes from a power station smokestack.

With this in mind, other possible areas in which this technology could be applied include, but are not limited to: ground-based mineral aerosol detection (331/360 nm), to mimic satellite based TOMS retrievals [20,21], monitoring skin conditions [3,22], forensics during crime scene investigation [1,23], fault detection in power systems [2], fault detection in vehicles, and astronomy. The low cost would be of particular benefit in application areas where budgets are limited, or arrays of these units are required. One example of such a scenario is in the field of volcanology, where UV cameras are used to image SO2 emissions in volcanic plumes [4,24,25] as a means of investigating magmatic processes. As many volcanoes are located in developing countries, lower cost sensors could greatly expedite more comprehensive monitoring of global volcanic hazards, with allowance for sensor destruction in the event of explosions.

One potential current limitation to this system is its maximum achievable frame rate. The shutter speeds used here at the power station (300 ms at 310 nm) correspond to a theoretically achievable acquisition rate of >1 Hz. However, in reality obtainable frame rates are rather lower, due to on-chip processing. Preliminary tests suggest that framerates >0.5 Hz are achievable; however, over prolonged periods this resulted in sporadic dropping of frames, which disappeared altogether for acquisition at 0.25 Hz. More work is required to identify the limiting factor in frame capture, and whether stable higher acquisition rates might be achievable with further software optimisation. For the majority of applications, however, we suggest that framerates of 0.25 Hz will be more than adequate.
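By way of illustration, a paced acquisition loop of the kind needed to hold a steady 0.25 Hz might look as follows; this is a sketch using the picamera library, not the acquisition software used here, and the file naming is hypothetical.

```python
import time

import picamera

FRAME_INTERVAL_S = 4.0                         # 0.25 Hz target rate

# framerate=1 so the 300 ms exposure is not capped by the frame period.
with picamera.PiCamera(resolution=(2592, 1944), framerate=1) as camera:
    camera.shutter_speed = 300 * 1000          # 300 ms at 310 nm, in microseconds
    time.sleep(2)
    camera.exposure_mode = 'off'               # freeze gain, as in Section 2

    next_due = time.time()
    for i in range(225):                       # ~15 min at 0.25 Hz
        camera.capture('frame_%03d.jpg' % i, format='jpeg', bayer=True)
        # Schedule against the clock so that slow on-chip processing of one
        # frame does not accumulate delay into subsequent frames.
        next_due += FRAME_INTERVAL_S
        time.sleep(max(0.0, next_due - time.time()))
```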


Future work could focus on the spectral range of such units; the sensitivity of this system below 310 nm has not been quantified herein, and whilst the sensors do likely respond at deeper UV wavelengths, there may be a trade off in terms of signal to noise. Investigating the longer wavelength cut off of these monochrome devices could also be of interest, in establishing their use in high temperature thermal imaging applications. Other work may include consideration of thermal effects on sensor stability/noise, and the degree to which cooling/temperature stabilisation might improve signal to noise. Finally, whilst signal to noise levels appear promising here, particularly given the low sensor price point, further work is now also merited in comparing the system sensitivity and noise characteristics against more expensive scientific grade UV cameras.

Supplementary Materials: The following are available online at http://www.mdpi.com/1424-8220/16/10/1649/s1, Video S1: Absorbance image sequence of Drax SO2 emissions. Warmer colours indicate larger SO2 column densities. Images were captured at 0.25 Hz, and this video sequence is speeded up to eight times real-time.

Acknowledgments: This work was funded by a Scholarship from the University of Sheffield, awarded to TCW. TCW would like to thank the Raspberry Pi community (found at https://www.raspberrypi.org/forums/), who provided great support whilst working with the PiCam module. We would also like to thank Matthew Davies for his help in selecting the object lens used herein. TDP acknowledges the support of a NERC studentship (NE/K500914/1) and AMcG acknowledges a Leverhulme Trust Research Fellowship (RF-2016-580).

Author Contributions: The paper was written by T.C.W. and A.Mc.G. Development of the camera system and data processing/analysis was predominantly performed by T.C.W.; A.Mc.G. aided in initial camera tests, in the field work, and is the supervisor; T.D.P. provided the initial concept and aided in field work; A.T. aided in early-stage camera development and software development for camera control; B.W. devised and performed the methodology for removal of the Bayer filter from the camera sensor; R.B. is a co-supervisor; J.R.W. aided in the development of the camera optics.

Conflicts of Interest: The authors declare no conflict of interest. The founding sponsors had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, and in the decision to publish the results.

References

1. Krauss, T.; Warlen, S. The Forensic Science Use of Reflective Ultraviolet Photography. J. Forensic Sci. 1985, 30, 262–268. [CrossRef] [PubMed]
2. Chen, Z.; Wang, P.; Yu, B. Research of UV detection system based on embedded computer. World Automation Congress 2008. Available online: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=4699210&isnumber=4698939 (accessed on 26 August 2016).
3. Fulton, J.E. Utilizing the Ultraviolet (UV Detect) Camera to Enhance the Appearance of Photodamage and Other Skin Conditions. Dermatol. Surg. 1997, 23, 163–169. [CrossRef] [PubMed]
4. McElhoe, H.B.; Conner, W.D. Remote Measurement of Sulfur Dioxide Emissions Using an Ultraviolet Light Sensitive Video System. JAPCA J. Air Pollut. Control Assoc. 1986, 36, 42–47. [CrossRef]
5. Smekens, J.-F.; Burton, M.R.; Clarke, A.B. Validation of the SO2 camera for high temporal and spatial resolution monitoring of SO2 emissions. J. Volcanol. Geoth. Res. 2015, 300. [CrossRef]
6. Igoe, D.; Parisi, A.V.; Carter, B. Characterization of a smartphone camera’s response to ultraviolet a radiation. Photochem. Photobiol. 2013, 89, 215–218. [CrossRef] [PubMed]
7. Igoe, D.; Parisi, A.V. Evaluation of a Smartphone Sensor to Broadband and Narrowband Ultraviolet A Radiation. Instrum. Sci. Technol. 2015, 43, 283–289. [CrossRef]
8. Igoe, D.; Parisi, A.V. Broadband direct UVA irradiance measurement for clear skies evaluated using a smartphone. Radiat. Prot. Dosim. 2015, 167, 485–489. [CrossRef] [PubMed]
9. Moure, D.; Torres, P.; Casas, B.; Toma, D.; Blanco, M.J.; Del Río, J.; Manuel, A. Use of Low-Cost Acquisition Systems with an Embedded Linux Device for Volcanic Monitoring. Sensors 2015, 15, 20436–20462. [CrossRef] [PubMed]
10. Schlobohm, J.; Pösch, A.; Reithmeier, E. A Raspberry Pi Based Portable Endoscopic 3D Measurement System. Electronics 2016, 5. [CrossRef]
11. Vandaele, A.C.; Simon, P.C.; Guilmot, J.M.; Carleer, M.; Colin, R. SO2 absorption cross section measurements in the UV using a Fourier transform spectrometer. J. Geophys. Res. 1994, 99, 25599–25605. [CrossRef]
12. Mori, T.; Burton, M.R. The SO2 camera: A simple, fast and cheap method for ground-based imaging of SO2 in volcanic plumes. Geophys. Res. Lett. 2006, 33. [CrossRef]
13. Kantzas, E.P.; McGonigle, A.J.S. Ground Based Ultraviolet Remote Sensing of Volcanic Gas Plumes. Sensors 2008, 8, 1559–1574. [CrossRef]
14. Theys, N.; De Smedt, I.; van Gent, J.; Danckaert, T.; Wang, T.; Hendrick, F.; Stavrakou, T.; Bauduin, S.; Clarisse, L.; Li, C.; et al. Sulfur dioxide vertical column DOAS retrievals from the Ozone Monitoring Instrument: Global observations and comparison to ground-based and satellite data. J. Geophys. Res. Atmos. 2015, 120, 2470–2491. [CrossRef]
15. Millán, M.M. Remote sensing of air pollutants. A study of some atmospheric scattering effects. Atmos. Environ. 1980, 14, 1241–1253. [CrossRef]
16. McGonigle, A.J.S.; Thomson, C.L.; Tsanev, V.I.; Oppenheimer, C. A simple technique for measuring power station SO2 and NO2 emissions. Atmos. Environ. 2004, 38, 21–25. [CrossRef]
17. Kantzas, E.P.; McGonigle, A.J.S.; Tamburello, G.; Aiuppa, A.; Bryant, R.G. Protocols for UV camera volcanic SO2 measurements. J. Volcanol. Geoth. Res. 2010, 194, 55–60. [CrossRef]
18. McGonigle, A.J.S. Measurement of volcanic SO2 fluxes with differential optical absorption spectroscopy. J. Volcanol. Geotherm. Res. 2007, 162, 111–122. [CrossRef]
19. Annual Review of Environmental Performance, Drax, 2013. Available online: http://www.drax.com/media/56551/Environmental-Performance-Review-2013.pdf (accessed on 26 August 2016).
20. Torres, O.; Bhartia, P.K.; Herman, J.R.; Ahmad, Z.; Gleason, J. Derivation of aerosol properties from satellite measurements of backscattered ultraviolet radiation: Theoretical basis. J. Geophys. Res. Atmos. 1998, 103, 17099–17110. [CrossRef]
21. Sinyuk, A.; Torres, O.; Dubovik, O. Combined use of satellite and surface observations to infer the imaginary part of refractive index of Saharan dust. Geophys. Res. Lett. 2003, 30. [CrossRef]
22. Mahler, H.M.; Kulik, J.A.; Harrell, J.; Correa, A.; Gibbons, F.X.; Gerrard, M. Effects of UV photographs, photoaging information, and use of sunless tanning lotion on sun protection behaviors. Arch. Dermatol. 2005, 141, 373–380. [CrossRef] [PubMed]
23. Yosef, N.; Almog, J.; Frank, A.; Springer, E.; Cantu, A. Short UV Luminescence for Forensic Applications: Design of a Real-Time Observation System for Detection of Latent Fingerprints and Body Fluids. J. Forensic Sci. 1998, 43, 299–304. [CrossRef] [PubMed]
24. Bluth, G.; Shannon, J.; Watson, I.M.; Prata, A.J.; Realmuto, V. Development of an ultra-violet digital camera for volcanic SO2 imaging. J. Volcanol. Geoth. Res. 2007, 161, 47–56. [CrossRef]
25. Peters, N.; Hoffmann, A.; Barnie, T.; Herzog, M.; Oppenheimer, C. Use of motion estimation algorithms for improved flux measurements using SO2 cameras. J. Volcanol. Geoth. Res. 2015, 300, 58–69. [CrossRef]

© 2016 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC-BY) license (http://creativecommons.org/licenses/by/4.0/).