Mobile Crowd Instrumentation: Design of Surface Solar Irradiance Instrument

A thesis submitted

to Kent State University in partial

fulfillment of the requirements for the

degree of Master of Science

by

Abhishek Singh

May, 2017

© Copyright. All rights reserved, except for previously published materials.

Thesis written by

Abhishek Singh

B.Tech., Shobhit University, INDIA, 2011

M.S., Kent State University, USA, 2017

Approved by

Dr. Javed I. Khan, Chair, Thesis Committee

Dr. Javed I. Khan, Chair, Department of Computer Science

Dr. James L. Blank, Dean, College of Arts and Sciences

TABLE OF CONTENTS

LIST OF FIGURES …………………………………………………………...... vii

LIST OF TABLES ……………………………………………...... x

ACKNOWLEDGEMENTS………………………………………………………………………...…… xi

CHAPTER 1 BACKGROUND ABOUT CROWDSOURCING……………………………………..13

1.1 Crowdsourcing Definition ...... 13

1.2 Motivation about crowdsourcing ...... 13

1.3 Classification of crowdsourcing ...... 15

CHAPTER 2 MOBILE CROWD INSTRUMENTATION BASED ON CROWD COMPUTING ……………………………………………………………………………………………………17

2.1 Definition of Crowd Computing ...... 17

2.2 Components of Crowd Computing ...... 17

2.2.1 Cloud computing ...... 18

2.2.2 Human Computation ...... 18

2.2.3 Mobile Computing ...... 18

2.2.4 Social Computing ...... 19

2.3 Overview About Mobile Instrumentation ...... 19

2.3.1 Mobile Crowd Instrumentation ...... 20

2.3.1.1 Types of Sensor Available in Smartphones ...... 20

CHAPTER 3 SCIENTIFIC APPROACH FOR ESTIMATING SURFACE SOLAR IRRADIANCE

………….…………...….…………………...……………………....……………………………………27

3.1 Background and Related Work for Solar Irradiance Measurement…………………………...27


3.1.1 Definition of Solar Irradiance ...... 27

3.1.2 Corroborative Work ...... 27

3.1.2.1 Different Values Retrieved from Sensors in Smartphone… ...... 28

3.1.2.2 Develop Program Using Camera Parameter ...... 30

3.2 Surface Solar Irradiance: preliminary and related works ...... 31

3.2.1 Overview ...... 31

3.2.2 Implementation of Our Approach Using Mathematical Method for SSI ...... 32

3.2.2.1 Notations ...... 32

3.2.2.2 Calculating Solar Irradiance ‘S’ ...... 33

3.2.2.3 Calculating Solar Elevation ...... 33

3.2.2.4 Calculating Solar Declination Angle...... 33

3.2.3 Classification of Cloud Level in Sky ...... 36

3.2.3.1 Related Approach to Categorize Different Clouds ...... 41

3.2.4 Related Works and Results ...... 42

3.3 EXPERIMENTATION ...... 42

3.3.1 Comparing Image Sample for Different Cloud Levels...... 44

3.3.2 Comparing SSI Values ...... 47

3.3.3 Bar Chart Comparison Between Calculated Data and SSI Reading ...... 50

3.3.4 Bar Chart for Minimum Mean Square Error ...... 51

3.3.5 Bar Chart for Signal to Noise Ratio ...... 52

3.3.6 Validation ...... 52

CHAPTER 4 DESIGNING OF COMPONENTS FOR SETTING UP FRAMEWORK FOR

CROWDSOURCING…………………...……………………...….….………………………………...53

4.1 Introduction to System Architecture...... 54

4.1.1 System Management ...... 54

4.1.2 Computational Engine ...... 55


4.1.2.1 Computation at Mobile Processing Engine ...... 55

4.1.2.2 Computation at Server End (Cloud) ...... 58

4.1.2.3 Computation at Web Application ...... 60

4.1.3 Approach for Crowd Computing Project...... 61

4.2 API Set and Web Services ...... 62

4.3 Design Structure and Architecture of Mobile Application ...... 63

4.3.1 Approach and Definition ...... 63

4.3.2 Main component of the application ...... 64

4.3.3 Workflow ...... 65

4.4 Design Structure and Architecture of Interactive Application and Dashboard ...... 72

4.5.1 Login Credentials ...... 73

4.5.2 Project Harvester / Data Rovers...... 73

4.5.2.1 Dashboard Feature for Project Uploader ...... 74

4.5.2.2 Dashboard Feature for Data Uploader ...... 74

4.5.3 Manager and Administrator ...... 78

4.5.3.1 Geographic Analysis ...... 78

4.5.3.2 Crowd Population ...... 78

4.5.3.3 Spatial distribution ...... 78

4.5.4 Reward System ...... 79

4.5.4.1 Improvised reward point system ...... 80

4.5.5 Quality Controls for Data Input ...... 81

4.5 Dashboard Analytics ...... 83

4.6.1 Different Type of Graphs and Chart for Data Representation ...... 83

4.6.2 List of Meters, Graphs and Charts ...... 83

4.6.3 List of Primary Parameters Charts ...... 84

4.6.4 List of Time Frame Quantity Charts ...... 84

4.6.5 Data Image List Charts ...... 85


4.6.6 Quality Control Charts ...... 86

4.6.7 Reward and Earned Points Charts ...... 87

4.6.8 Geographical Analysis Charts ...... 88

CHAPTER 5 SUMMARY AND FUTURE WORK……………………………………………………90

5.1 Conclusion ...... 90

5.2 Open Work and Future Scope...... 91

APPENDIX A…………………………………………………………………………………………….93

Appendix: Source code sample for android application ...... 93

APPENDIX B…………………………………………………………………………………………….97

Appendix: Source code sample for web application………………………………………………...97

BIBLIOGRAPHY………………………………………………………………………………………..99


LIST OF FIGURES

Figure 1. Sensor Circle for Mobile Crowd Instrument ...... 24

Figure 2. Surface Solar Instrument built from multiple sensors ...... 26

Figure 3. Display all values retrieved from sensors available in mobile phone...... 29

Figure 4. Display different value of camera parameters...... 30

Figure 5. Fisheye effect with normal phone camera...... 31

Figure 6. Different Solar Angles ...... 34

Figure 7. Solar Declination Angle variation over the year ...... 35

Figure 8. Cirrus Cloud ...... 37

Figure 9. Cirrostratus Cloud ...... 37

Figure 10. Cirrocumulus Cloud ...... 37

Figure 11. Altostratus cloud...... 38

Figure 12. Altocumulus cloud ...... 38

Figure 13. Nimbostratus cloud ...... 39

Figure 14. Cumulus Cloud...... 39

Figure 15. Stratus cloud ...... 40

Figure 16. Cumulonimbus cloud ...... 40

Figure 17. Stratocumulus cloud ...... 41

Figure 18. RGB Scale ...... 42

Figure 19. Grey Scale ...... 42

Figure 20. Binary Scale ...... 42

Figure 21. Bar chart of comparison between measured and calculated data values of SSI ...... 50

Figure 22. Bar chart for MMSE: min, max, average ...... 51

Figure 23. Bar chart for SNR: min, max, avg...... 52

Figure 24. Architecture for crowdsourcing platform ...... 54


Figure 25. Snapshot of Postgres Table for registered Ventures ...... 58

Figure 26. Snapshot of Postgres Table for Uploaded Data ...... 59

Figure 27. Snapshot of Postgres Table for Registered Rovers ...... 59

Figure 28. Screen shot of PHP code ...... 60

Figure 29. Screen shot of PHP code ...... 61

Figure 30. Google API Source ...... 62

Figure 31. Google Map Interface for App ...... 66

Figure 32. Venture display for an App ...... 67

Figure 33. Venture Information to collect Sensor Data ...... 68

Figure 34. Sensor Driver Interface for venture...... 68

Figure 35. Venture Information to collect Audio Data ...... 69

Figure 36. Audio Recording driver for Audio Data ...... 69

Figure 37. Venture Information to collect Image Data...... 70

Figure 38. Camera Driver Interface for Image Data ...... 70

Figure 39. Rover displayed on App ...... 71

Figure 40. Rover Information ...... 71

Figure 41. Rover Information Implementation for Solar Irradiance project: Image data collection technique ...... 72

Figure 42. Login Screen ...... 75

Figure 43. Screen to add new project by Project Uploader ...... 76

Figure 44. Added Venture with details ...... 76

Figure 45. Added data by Rover...... 77

Figure 46. Shows Sun position ...... 82

Figure 47. Details for capture image ...... 82

Figure 48. List of Gauge meters for Users ...... 85

Figure 49. Geo Map for different population of User in different country ...... 85


Figure 50. Annotation chart: Uploaded and Rejected Data ...... 86

Figure 51. Bar chart: Comparison between Data Upload and Quality data ...... 86

Figure 52. Bar chart: Bars for Different quality control ...... 87

Figure 53. Pie Chart: Showing different rewards point ...... 87

Figure 54. Country population map ...... 88

Figure 55. Geo Map: Different located cities ...... 88

Figure 56. Table Chart: List of population of different cities ...... 88

Figure 57. Stacked Bar chart: Showing Active user population in different groups of areas in different cities ...... 89

Figure 58. Motion Chart: User population based on location and different area...... 89


LIST OF TABLES

Table 1 Sources using Crowdsourcing ...... 15

Table 2. Sensor supported by Android OS ...... 23

Table 3. List of sensors and hardware specification ...... 28

Table 4. Summary of the General Notation ...... 32

Table 5. Image Samples for Clear Sky ...... 44

Table 6. Image Samples for Mid-Level Sky ...... 45

Table 7. Image Samples for Dense-Level Sky...... 46

Table 8. SSI for clear sky images ...... 47

Table 9. SSI for Mid- Level cloud images ...... 48

Table 10. SSI for Dense Level cloud images ...... 49

Table 11. Methods for crowdsourcing ...... 53

List 1. Main method for displaying rover and ventures on google map in android application ....96

List 2. Main method for establishing PostgreSQL connection ...... 97

List 3. Method shows insert query to register new Venture ...... 97

List 4. Method shows select query to display information of each registered Ventures ...... 98


ACKNOWLEDGMENTS

I am grateful to Dr. Javed I. Khan for his guidance and support throughout this research work, which greatly expanded my knowledge while working on this project.


I dedicate my thesis work to my parents, family, friends, and colleagues for their encouragement, inspiration, and endless support. I will always be thankful to everyone for their contributions to this thesis work.


CHAPTER 1

BACKGROUND ABOUT CROWDSOURCING

1.1. CROWDSOURCING DEFINITION

Jeff Howe and Mark Robinson introduced the term 'crowdsourcing' in 2006, and it has since become a popular approach for collecting data from a crowd. Crowdsourcing is a technique in which a large group of people from an online community helps gather or supply needed information or data through some sort of user interface [10]. Here we use this concept to gather all the primary data from the online community, and we have designed a complete architecture for implementing crowdsourcing for our project.

1.2 MOTIVATION ABOUT CROWDSOURCING

Crowdsourcing makes it possible to aggregate data from a wide range of stakeholders in many complex systems. It has the potential to replace or augment traditional ways of collecting data and feedback. It provides an open platform where an initiator with an innovative goal can collaborate with stakeholders to work out solutions. It is a practical and comprehensive method, as it cuts down cost and operational effort across many challenges. Crowdsourcing does not restrict itself to any boundaries; it invites any interested crowd to form and work toward a specific solution. Per Saxton et al., crowdsourcing can be characterized by three features: the process of outsourcing the problem, the crowd, and the web-based platform for collaboration.

A crowdsourcing objective can be driven by two main components [17]:

1. Audience-centric, where the sourcing must be performed by the crowd at a certain time and location.

2. Event-centric, where the sourcing is independent of time and location.

An important aspect of crowdsourcing is the conjunction of crowd knowledge with scientists and researchers to expand the nature of science and development.

A few examples of popular platforms are [2]:

1. Wiki systems

2. Open-source software

3. Geo-crowd mapping

4. Mash-ups, which combine most of the above.


Here are a few sets of websites that operate on the platforms discussed above [1].

Types: Professional Tasks | Solution Finding | Reviews & Ratings | Idea Collection & Data Collection | Design

Example websites (spanning these types): TopCoder, InnoCentive, Yelp, Amazon Mechanical Turk (AMT), 99designs, oDesk, Kaggle, Quora, CrowdFlower, DesignCrowd, NineSigma, Yahoo Answers, eBay, IMDb, Tomnod, eBird, crowdSPRING, ScriptLance, YouTube, iStockPhoto, Threadless, uTest, Baidu Knowledge, NameThis, PatternTap, Crowdfunder, Rate My Professors, MinuteWorkers, 12designer, spot.us, Fundable, Sina, Fiverr, OpenIDEO, PollDaddy, Accenture, SterlingFunder, Angie's List, ClickWorker, SpreadShirt, giveForward, TripAdvisor, Challenge.gov, Fiat Mio, CrowdFundZoom, InsiderPages, Chaordix, Freelancer, iBankers, MicroWorkers, Choosa, Angel Investment Network, MerchantCircle, MiniFreelance, IdeaScale, Zhubajie.com, HyperFund, JudysBook, ShopforDesigns, EquityNet, OpenTable, Idea Bounty, PlumFund, BBB.org, Dell IdeaStorm, Lammily, Buzzillions, LogoDesignPros, Epinions, Hatchwise, Foodiecrowdfunding, Wize, Ideaken, Soylent, PowerReviews, BootB, Big Idea Group, SiteJabber

Table 1. Sources using Crowdsourcing

1.3. CLASSIFICATION OF CROWDSOURCING

Crowdsourcing can be defined with different categories [2][9]. 1. The knowledge discovery and management approach

Organization tasks crowd with finding and collecting information into common

location and format. This type is ideal for information gathering collectively. But there

could be reporting a problem such as the creation of collective resources.

2. Broadcast search

The organization tasks the crowd with solving an empirical problem, for example by broadcasting a task for a specific location or time. This type is ideal for empirical problems such as scientific or research problems.

3. Peer-vetted creative production approach

The idea of this approach is to build solutions using online users or the crowd, who both create the candidate solutions and vet each other's work.

4. Distributed human intelligence tasking

This is not the most popular approach to crowdsourcing. It is used when the data or information is already present and the goal is to design or build a solution out of it; it mostly involves computing or processing within the data.

Challenges of crowdsourcing

There are many plausible ways in which an organization practicing crowdsourcing can be limited in its use, because crowdsourcing relies completely on a robust, active, and motivated crowd, and no established technique has yet been built that helps an organization sustain it and reliably produce successful, positive results. Crowdsourcing also requires a great deal of transparency and trust on the part of the organization: developing open challenges means leveraging the motivation of the crowd, and to harness the power of crowds the organization must surrender a bit of its own power by letting online communities become potential stakeholders.


CHAPTER 2

MOBILE CROWD INSTRUMENTATION BASED ON CROWD COMPUTING

2.1 DEFINITION OF CROWD COMPUTING

As discussed, we use crowdsourcing to implement the surface solar irradiance technique, and to implement its computational approach we will use crowd computing. Crowd computing is the concurrent use of crowdsourcing and cloud computing: the crowd plays the role of an intelligent worker, and the computation is done using cloud computing, where a large mass of the crowd interacts and works on one challenge or task. Human intelligence tasks are distributed using mobile devices to integrate the metadata.

2.2 COMPONENTS OF CROWD COMPUTING

There are several factors which contribute to form crowd computing such as

1. Cloud computing

2. Human Computation

3. Mobile Computing

4. Social Computing

There are a few key features in crowd computing:

● Micro-tasking: work is broken into small subtasks that are completed by the crowd.

● Automation: machines complete their respective work and judgments on behalf of the crowd.

● Hybrid crowd: crowd workers and machines work together on certain tasks.

● AI: machine learning builds a cascade of knowledge that enables more and more automation while continuously optimizing cost and quality.

2.2.1 CLOUD COMPUTING

Cloud computing refers to the delivery of different services over the internet. These services can take many different forms, such as:

● Cloud relational database storage and content delivery

● Cloud computing web services

● Software and mobile development platforms

● Access to virtual machines

● Cloud networking: virtual private cloud

● Platforms for the Internet of Things

2.2.2 HUMAN COMPUTATION

This type of computing refers to cases where the overall computation is orchestrated by machines, but the machines give subtasks to humans to perform; taken together, these subtasks are completed by people. This type of computation is therefore referred to as human-based computation.

2.2.3 MOBILE COMPUTING

This type of computing refers to the transmission of metadata such as sensor data, audio data, image or video data, or other data types to the central system over TCP/IP or UDP.

2.2.4 SOCIAL COMPUTING

This type of computing is generally performed by a group or mass of crowd members who interact to collect information. The interaction can be segregated into primarily two forms:

1. Mass convergence

2. Mass divergence

Overall, the interaction is performed as a social event, using either social media or digital media.
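The metadata transmission described above for the mobile-computing component can be sketched as a small serializer in plain Java (the language of the thesis's Android application). The JSON field names and shape here are our own illustration, not the thesis's actual wire format; the resulting string is what a client app could send to the central system over TCP/IP:

```java
public class SensorPayload {

    // Build a minimal JSON string for one sensor reading, ready to send
    // to the central server (e.g. via a socket or HttpURLConnection).
    // Field names ("sensor", "value", "lat", "lon", "ts") are illustrative.
    public static String toJson(String sensorType, double value,
                                double lat, double lon, long timestampMs) {
        return String.format(java.util.Locale.US,
            "{\"sensor\":\"%s\",\"value\":%.4f,\"lat\":%.6f,\"lon\":%.6f,\"ts\":%d}",
            sensorType, value, lat, lon, timestampMs);
    }
}
```

A real client would write this string to a socket or HTTP connection and handle retries; that plumbing is omitted here.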

2.3 OVERVIEW ABOUT MOBILE INSTRUMENTATION

Mobile devices are among the most capable devices available in today's world, and advances in mobile technology arrive on a daily basis. Mobile devices are becoming the primary platform for most operations involving different functions across the technical world, so every manufacturer in the cell phone industry is taking this device to a new level, defining new operations and potential for it in different ways. The most preeminent aspects enabling this advancement are the hardware components, the built-in sensors, and the capacity to assemble more hardware components, which turn this simple device into a complex instrument.

19

2.3.1 Mobile Crowd Instrumentation

If we talk about today’s smartphone which carries most of these intelligent

sensors. There is so many sensors available for cell phones, depending upon on

different manufacturer brands are as follows

2.3.1.1 Types of Sensor Available in Smartphones

● An accelerometer detects acceleration, tilt, and vibration to determine movement

and orientation.

● A gyroscope identifies up/down, left/right and rotation around three axes for more

complex orientation details.

● A light sensor detects data about lighting levels in the environment to adapt the

display accordingly.

● A proximity sensor detects when the phone is held to the face to make or take a

call, so the touch screen display can be disabled to avoid unintended input.

● A fingerprint sensor can enable biometric verification for secure device and website authentication, as well as mobile payments.

● A magnetometer detects the direction of magnetic north and, in conjunction with

GPS, determines the user's location.

● Touchscreen sensors

1. Resistive touch

a. Holds two layers: a conductive layer and a resistive layer

2. Capacitive touch

a. Contains a capacitive layer

3. Surface acoustic wave touch

● Range sensors

a. Infrared sensors

i. Built on an infrared emitter and an infrared detector.

ii. Work by emitting infrared light and capturing the amount reflected.

iii. E.g., PIR motion sensors and Kinect.

b. Infrared radars

c. Ultrasonic sensors

● High-frequency speaker

● Microphone

Together, the speaker and microphone can be used to measure distance and range using frequency and the sonar effect. Below is the list of sensors supported by the Android operating system [16].

Sensor | Type | Description

ACCELEROMETER | Hardware | Measures the acceleration force in m/s² that is applied to a device on all three physical axes (x, y, and z), including the force of gravity.

AMBIENT TEMPERATURE | Hardware | Measures the ambient room temperature in degrees Celsius (°C).

GRAVITY | Software or Hardware | Measures the force of gravity in m/s² that is applied to a device on all three physical axes (x, y, z).

GYROSCOPE | Hardware | Measures a device's rate of rotation in rad/s around each of the three physical axes (x, y, and z).

LIGHT | Hardware | Measures the ambient light level (illumination) in lx.

LINEAR ACCELERATION | Software or Hardware | Measures the acceleration force in m/s² that is applied to a device on all three physical axes (x, y, and z), excluding the force of gravity.

MAGNETIC FIELD | Hardware | Measures the ambient geomagnetic field for all three physical axes (x, y, z) in μT.

ORIENTATION | Software | Measures degrees of rotation that a device makes around all three physical axes (x, y, z). As of API level 3 you can obtain the inclination matrix and rotation matrix for a device by using the gravity sensor and the geomagnetic field sensor in conjunction with the getRotationMatrix() method.

PRESSURE | Hardware | Measures the ambient air pressure in hPa or mbar.

PROXIMITY | Hardware | Measures the proximity of an object in cm relative to the view screen of a device. This sensor is typically used to determine whether a handset is being held up to a person's ear.

RELATIVE HUMIDITY | Hardware | Measures the relative ambient humidity in percent (%).

ROTATION VECTOR | Software or Hardware | Measures the orientation of a device by providing the three elements of the device's rotation vector.

TEMPERATURE | Hardware | Measures the temperature of the device in degrees Celsius (°C). Its implementation varies across devices; this sensor was deprecated in API level 14 and replaced by the AMBIENT TEMPERATURE sensor.

Table 2. Sensors supported by Android OS


Figure 1. Sensor Circle for Mobile Crowd Instrument

Using all these sensors, we can develop many different applications for use in different fields. Working with the camera and a few more sensors, we came up with a technique to estimate surface solar irradiance. Many techniques have been developed to estimate solar energy, and they have taken the form of discrete devices. But in this technological evolution, cellular devices are getting smarter and more intelligent, comprising different sensors and the latest features. So the idea was to take this existing device and explore further opportunities: the device not only enables better communication, but its hardware, software, and features can be used to build something that can potentially replace dedicated solar calculator devices. That is how we started working on the mechanism and generated an approach by which smartphones can be molded into a new kind of instrument. Many applications have been developed based on these available sensors that provide significant information. Here are a few examples of such 'smart sensors' [18].

1. GammaPix: turns a smartphone into a radiation detector.

2. Metal Detector: uses the magnetic sensor to act as a metal detector.

3. Sound Meter: uses the microphone to measure sound levels.

4. Thermometer: uses an internal temperature sensor, GPS, and the internet (via a weather API) to determine the outside temperature; the ambient temperature reading can provide the room temperature.

5. EMF Detector: uses the magnetic sensor to measure the electromagnetic field.

6. Pothole Detector: detects potholes while driving a car using the accelerometer sensor.

25

The instrument combines the phone's GPS, camera, GPRS, gyroscope, and accelerometer with a smart calibrator, as shown below.

Figure 2. Surface Solar Instrument built from multiple sensors

There are many more applications that work with other smartphone sensors. Most of these applications provide information that is useful to a single user. But this information can be demonstrated using crowdsourcing and mapped to geographic locations: by offering an open-source API that accesses the available smart sensors, sets of sensor information can be formed from a crowd and uploaded to cloud storage, gathering data over a large crowd population over a span of time. For example, by using an ambient temperature sensor API integrated into a crowdsourcing application and uploading the temperature data to cloud storage, mass data can be collected across demographic locations; with this information, researchers can deduce much relevant information that can be used in many logical ways. We have demonstrated this ideology for estimating surface solar irradiance.

26

CHAPTER 3

SCIENTIFIC APPROACH FOR ESTIMATING SURFACE SOLAR IRRADIANCE

3.1 BACKGROUND AND RELATED WORK FOR SOLAR IRRADIANCE MEASUREMENT

3.1.1 DEFINITION OF SOLAR IRRADIANCE

Solar irradiance is a measure of the amount of solar energy, or electromagnetic radiation, received on a unit area per unit of time. It may be expressed as solar insolation: the amount of solar radiation arriving directly at the earth's surface is called direct solar insolation. Total solar irradiance is calculated from the amount of direct solar insolation plus the amount of solar radiation reflected from the earth's surface. Here we are interested in calculating solar irradiance, for which we discuss the full mathematical approach and the techniques we have developed to measure it.

3.1.2 CORROBORATIVE WORK

Before implementing our approach, we performed some tasks using smartphones that lend support to our mechanism. The smartphone is a complex and sophisticated device these days, which opens enormous possibilities for applications. We performed the following tasks in support of our approach to measuring solar irradiance.

27

3.1.2.1 Different values retrieved from sensors in smartphone.

Here we built an application on the Android platform using the ADT Bundle. This application displays values from the different sensors available on Android devices: accelerometer, gyro sensor, magnetic sensor, proximity sensor, ambient light sensor, pressure sensor, orientation sensor, and ambient temperature sensor. Below is the list of sensors, and their hardware specifications, that we used for taking readings.

Sensor | Model | Vendor | Version | Power | Maximum Range | Resolution

Acceleration sensor | – | STMicroelectronics | 1 | 0.25 | 19.6133 | 0.009576807

Light sensor | CM36651 | Capella Microsystems Inc. | 1 | 0.75 | 3000 | 0.009576807

Orientation sensor | – | Samsung Electronics | 1 | 12.35 | 360.0 | 0.00390625

Proximity sensor | CM36651 | Capella Microsystems Inc. | 1 | 0.75 | 5.0 | 5.0

Gyro sensor | LSM330DLS | STMicroelectronics | – | 6.1 | 8.726646 | 3.0543262E-4

Magnetic sensor | AK893C | Asahi Kasei Microdevices | – | 6.0 | 2000.0 | 0.06

Pressure sensor (barometer) | BMP182 | Bosch | 1 | 1.0 | 1000.0 | 1.0

Camera | 8-megapixel autofocus camera with LED flash and BSI; Best Photo, Best Face, and Low Light Shot modes

Table 3. List of sensors and hardware specification

28

Here we display all the retrieved values in a single layout. This task was performed on a Samsung Galaxy Note 2 smartphone, on which all of these sensors are available. The point of performing this task is that our approach requires a calibration process, for which we can use the orientation sensor and its components, i.e., the azimuth angle, roll, and pitch.

Figure 3. Display all values retrieved from sensors available in mobile phone.


3.1.2.2 Develop program using camera parameter

Another piece of supporting work was done with a camera application on the Android platform. Here we display the different parameters retrieved from the mobile camera, such as ISO, shutter speed, focal length, EV, and the maximum and minimum focus values. These values will help improve our image segmentation approach for estimating cloudiness in sky images, since we must work with exposure values for sky images.

Figure 4. Display different value of camera parameters.


3.2 SURFACE SOLAR IRRADIANCE: PRELIMINARY AND RELATED WORKS

3.2.1 OVERVIEW

Initially, for estimating solar irradiance, we decided to implement the research work presented in [4]. The basic idea of that estimation was to use a fisheye camera to retrieve a 180° hemispherical image. Our approach, however, is to develop an Android application that calculates solar irradiance on the mobile device itself; our future application will not be coupled with any external support for data measurement. Since paper [4] focused primarily on a fisheye camera, using a fisheye lens would digress from our approach, as it requires an external fisheye lens to obtain a hemispherical image.

Sky images taken with the mobile camera, as in Figure 5, show a generated fisheye effect, but they still do not provide a 180° hemispherical image. We therefore decided to take a single shot of the sky with the camera held horizontal and facing upright; later, to further enhance our image segmentation, we can use image stitching to obtain a fisheye image of the sky. We created an application to generate the fisheye effect shown in Figure 5, demonstrated with a Samsung Galaxy Note SGH-T889 camera.

Figure 5. Fisheye effect with normal phone camera.


3.2.2 IMPLEMENTATION OF OUR APPROACH USING MATHEMATICAL METHOD FOR SSI

Following our new approach, we implemented the methods developed by [5] to compute the components of solar irradiance, and we calculate the percentage of cloudiness on a scale using the approach developed in [6].

3.2.2.1 Notations

NOTATION | DESCRIPTION

S | Solar irradiance

S0 | Clear-sky solar insolation

µ | Cloud cover

Ѳ | Average angle of elevation

Ѳep | Solar elevation angle one hour before the current hour

Ѳc | Solar elevation angle at the current time

x | Longitude

y | Latitude

α | Solar zenith angle

β | Current latitude

Ω | Solar declination angle

H | Hour angle

ƛ' | Longitude of earth from the vernal equinox

tv | Current time in radians

T | Length of year from the vernal equinox

€ | Earth tilt, which is 23.5°

Table 4. Summary of the General Notation


3.2.2.2 Calculating Solar Irradiance 'S'

We calculate solar irradiance by estimating the cloudiness in the sky. The approach is implemented in several steps.

Step 1. Calculate the solar irradiance 'S' corrected for the cloud level in the sky [5]. We use the cloud cover radiation model proposed by Kasten and Czeplak, a power function for the ratio S/S0:

S = S0 (1 – 0.75 µ^3.4) (1)

where S0 is the clear-sky solar insolation and µ is the cloud cover in the sky; we therefore primarily need to calculate S0 and µ. The clear-sky solar insolation S0 is given by

S0 = 910 sin Ѳ – 30 (2)

where Ѳ is the average angle of elevation, with Ѳep the solar elevation angle one hour before the current hour and Ѳc the solar elevation angle at the current time:

Ѳ = (Ѳep + Ѳc) / 2 (3)

Substituting the value of Ѳ from (3) into (2) yields S0.
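As a sketch of Step 1, the relations above translate directly into code. This is plain Java (the language of the thesis's Android application); the class and method names are ours, and the constants follow the published Kasten–Czeplak model (exponent 3.4, clear-sky coefficients 910 and 30):

```java
public class KastenCzeplak {

    // Average elevation of the past hour and the current hour, in degrees.
    public static double averageElevation(double thetaEp, double thetaC) {
        return (thetaEp + thetaC) / 2.0;
    }

    // Clear-sky solar insolation S0 = 910 * sin(theta) - 30, in W/m^2.
    public static double clearSkyInsolation(double elevationDeg) {
        return 910.0 * Math.sin(Math.toRadians(elevationDeg)) - 30.0;
    }

    // S = S0 * (1 - 0.75 * mu^3.4), with cloud cover mu on [0, 1].
    public static double surfaceSolarIrradiance(double s0, double mu) {
        return s0 * (1.0 - 0.75 * Math.pow(mu, 3.4));
    }

    public static void main(String[] args) {
        double theta = averageElevation(40.0, 50.0);   // 45 degrees
        double s0 = clearSkyInsolation(theta);         // ~613.5 W/m^2
        double s = surfaceSolarIrradiance(s0, 0.5);    // half-covered sky
        System.out.printf("S0 = %.1f W/m^2, S = %.1f W/m^2%n", s0, s);
    }
}
```

Note that a fully overcast sky (µ = 1) still yields 25% of the clear-sky value, reflecting diffuse radiation in the Kasten–Czeplak model.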

3.2.2.3 Calculating Solar Elevation

Step 2. Calculate the solar elevation angle Ѳ using the phone camera and GPS. Finding the solar elevation angle (Ѳ) requires an estimate of the longitude and latitude (x, y) of the surface point, which we retrieve from the phone GPS. Since the sun's position varies throughout the day, the angle at a given latitude (β) varies as well, so we must calculate the solar elevation (Ѳ) for every hour.


Calculating the solar zenith angle

Figure 6. Different Solar Angles

This diagram describes what we are about to calculate called Zenith Angle.

‘α’ denotes Solar zenith angle

We use the following method for calculating the zenith angle. The solar elevation and zenith angles are complementary:

Solar elevation angle (Ѳ) = 90° − solar zenith angle (α)

and at solar noon the zenith angle reduces to:

Solar zenith angle (α) = latitude − angle of declination (Ω)

In general, the solar zenith angle α can be calculated by [13] [14] [15]

cos α = sin β · sin Ω + cos β · cos Ω · cos H (4)

To find ‘α’ we first find the values of the other components of equation (4):

β = current latitude, which we retrieve from the phone GPS (5)

Ω = solar declination angle (calculated in the next section)

H = hour angle, obtained from the registered time of the event as

H = 360° × (current time in hours / 24) (6)

3.2.2.4 Calculating Solar Declination Angle ‘Ω’

The solar declination angle is the angular distance of the sun north or south of the earth's equator [7].

Figure 7. Solar declination angle variation over the year

The summer solstice occurs when the sun shines most directly on the Tropic of Cancer in the northern hemisphere, making an angle of declination of +23.5° with the equator. The winter solstice occurs when the sun shines most directly on the Tropic of Capricorn in the southern hemisphere, making an angle of declination of −23.5°. Thus the solar declination angle Ω stays within the range −23.5° ≤ Ω ≤ +23.5° during its yearly cycle.

The solar angle of declination is given by [8]

Angle of declination (Ω) = 23.45° × sin(360° × (N + 284) / 365) (7)

where ‘N’ is the day of the year (e.g., N = 1 for January 1).

Substituting the values of β, H, and Ω into the cosine relation yields α, and the solar elevation angle is then Ѳ = 90° − α. This produces the value for the current time, Ѳc. Following the same steps for the time one hour earlier gives Ѳep. Finally, substituting Ѳep and Ѳc into (3) gives the average elevation angle Ѳ, from which we find the clear-sky solar insolation S0 using (2).
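The Step 2 calculations can be sketched in Java. The sketch follows the thesis's own conventions, in particular the hour angle H = 360° × (hours / 24) measured from midnight (standard solar-position references instead measure it from solar noon); class and method names are illustrative assumptions.

```java
// Sketch of the solar-position calculations in Step 2 (all angles in degrees).
class SolarPosition {
    // Equation (7): declination Omega = 23.45 sin(360 (N + 284) / 365), N = day of year.
    static double declinationDeg(int dayOfYear) {
        return 23.45 * Math.sin(Math.toRadians(360.0 * (dayOfYear + 284) / 365.0));
    }

    // Hour angle as defined in the text: H = 360 * (current time in hours / 24).
    static double hourAngleDeg(double hourOfDay) {
        return 360.0 * hourOfDay / 24.0;
    }

    // Zenith from cos(alpha) = sin(beta) sin(Omega) + cos(beta) cos(Omega) cos(H).
    static double zenithDeg(double latDeg, double declDeg, double hourAngle) {
        double b = Math.toRadians(latDeg);
        double o = Math.toRadians(declDeg);
        double h = Math.toRadians(hourAngle);
        double cosA = Math.sin(b) * Math.sin(o) + Math.cos(b) * Math.cos(o) * Math.cos(h);
        return Math.toDegrees(Math.acos(cosA));
    }

    // Elevation theta = 90 - alpha, combining the pieces above.
    static double elevationDeg(double latDeg, int dayOfYear, double hourOfDay) {
        return 90.0 - zenithDeg(latDeg, declinationDeg(dayOfYear), hourAngleDeg(hourOfDay));
    }

    public static void main(String[] args) {
        // Declination near the solstices: about +23.45 (late June) and -23.0 (January 1).
        System.out.printf("Omega(172) = %.2f, Omega(1) = %.2f%n",
                declinationDeg(172), declinationDeg(1));
    }
}
```

Calling `elevationDeg` for the current hour and for one hour earlier gives Ѳc and Ѳep, which feed equations (3) and (2).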

3.2.3 CLASSIFICATION OF CLOUD LEVEL IN SKY

After obtaining the value of S0, we need to find the value of µ.

Step 3. Classification of cloudiness in the sky

The percentage of cloudiness in the sky needs to be mapped onto a scale of 0.0 – 1.0.

Clouds are classified based on their altitude and structure. There are primarily 10 different types of clouds, grouped into three main categories [8]: high clouds, mid clouds, and low clouds.

High clouds are further divided into three categories.

1. Cirrus

Figure 8. Cirrus Cloud

Detached clouds in the form of white patches that stay between 18,000 ft. and 40,000 ft. above ground level, usually made up of ice crystals.

2. Cirrostratus

Figure 9. Cirrostratus Cloud

Transparent high clouds formed by rising air; they stay between 18,000 ft. and 40,000 ft. above ground level.

3. Cirrocumulus

Figure 10. Cirrocumulus Cloud

Thin, white, patchy and layered clouds made up of ice and supercooled water, usually staying between 20,000 ft. and 40,000 ft. above ground level.

Mid clouds are further divided into three categories.

1. Altostratus

Figure 11. Altostratus cloud

Gray or bluish layered clouds formed of water and ice, usually when clouds descend from high altitude; they stay between 6,500 ft. and 20,000 ft. above ground level.

2. Altocumulus

Figure 12. Altocumulus cloud

White or gray sheet or layered clouds with shaded sides, formed by the breakup of altostratus; they usually stay between 2,000 ft. and 18,000 ft. above ground level.

3. Nimbostratus

Figure 13. Nimbostratus cloud

The continuous rain cloud, appearing as thick, dark gray clouds; they usually stay between 2,000 ft. and 10,000 ft. above ground level.

Low clouds are further divided into four categories.

1. Cumulus

Figure 14. Cumulus Cloud

Detached, generally dense clouds that may bring occasional rain or snow showers; they stay between 1,200 ft. and 6,500 ft. above ground level.

2. Stratus

Figure 15. Stratus cloud

A generally thick, gray cloud layer with a uniform base, usually staying below 6,500 ft. above ground level.

3. Cumulonimbus

Figure 16. Cumulonimbus cloud

Clouds with fibrous upper edges, formed by convection when cumulus grows into cumulonimbus; they bring heavy rain and thunderstorms and stay between 1,100 ft. and 6,500 ft.

4. Stratocumulus

Figure 17. Stratocumulus cloud

Gray or whitish patchy, sheet, or layered clouds formed by the breaking up of stratus clouds; they stay between 1,200 ft. and 6,500 ft.

For estimating cloudiness in the sky, we group clouds into three basic levels:

 Almost clear sky (Low dense Clouds)

 Mid-level clouds (Partial dense Clouds)

 Highly dense clouds (High dense Clouds)

3.2.3.1 Approach to Categorize Different Cloud Levels

Initially, we convert the RGB image into a grayscale image in JPG format, and then convert the grayscale JPG image into a binary image in PNG format, using standard algorithms for both conversions.

Finally, we count the number of pixels representing cloudiness in the sky, i.e., the black pixels in the binary image, obtained with a chosen threshold. To calculate the value of µ, we simply compute the ratio of the number of black pixels to the total number of pixels in the image.

Here we present some results based on our approach. Initially, we convert the RGB image into grayscale, as shown in Figure 18 and Figure 19.

Figure 18. RGB Scale Figure 19. Grey Scale Figure 20. Binary Scale

We use a thresholding algorithm to convert the grayscale image into a binary image in PNG format. Below we present different sky image samples for different cloud levels in the sky, based on the cloud classification described above.
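This pixel-counting step can be sketched on the desktop with java.awt.image.BufferedImage. The luminance weights, the fixed threshold, and the assumption that cloud pixels end up black in the binary image are all illustrative choices, not the thesis's exact algorithm.

```java
import java.awt.image.BufferedImage;

// Sketch of the cloud-cover ratio mu: grayscale conversion, thresholding,
// and the ratio of "cloud" (black) pixels to total pixels.
class CloudCover {
    // Luminance grayscale value (0-255) of a packed RGB pixel.
    static int gray(int rgb) {
        int r = (rgb >> 16) & 0xFF, g = (rgb >> 8) & 0xFF, b = rgb & 0xFF;
        return (int) (0.299 * r + 0.587 * g + 0.114 * b);
    }

    // mu = (pixels at or below the threshold in the binarized image) / (total pixels).
    static double cloudRatio(BufferedImage img, int threshold) {
        long cloud = 0;
        long total = (long) img.getWidth() * img.getHeight();
        for (int y = 0; y < img.getHeight(); y++) {
            for (int x = 0; x < img.getWidth(); x++) {
                if (gray(img.getRGB(x, y)) <= threshold) {
                    cloud++;
                }
            }
        }
        return (double) cloud / total;
    }

    public static void main(String[] args) {
        BufferedImage img = new BufferedImage(2, 2, BufferedImage.TYPE_INT_RGB);
        img.setRGB(0, 0, 0x000000); // one "cloud" pixel
        img.setRGB(1, 0, 0xFFFFFF);
        img.setRGB(0, 1, 0xFFFFFF);
        img.setRGB(1, 1, 0xFFFFFF);
        System.out.println("mu = " + cloudRatio(img, 127)); // one of four pixels
    }
}
```

In the result tables, µ is exactly such a ratio of black pixels to total pixels, e.g. the fraction 42752/122880 for sample 102.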

3.2.4 RELATED WORKS AND RESULTS

Step 4: Final calculation

We calculate the solar irradiance ‘S’ using the method above by implementing this complete procedure: the retrieved pixel values on the 0 – 255 scale are transformed onto a 0 – 1.0 scale to obtain the value of ‘µ’, and substituting this value of ‘µ’ together with S0 into equation (1) gives our results.

3.3 EXPERIMENTATION

To demonstrate our method, we set up our experiment at different geographic locations. We used a cell phone camera and GPS to retrieve images and geolocation parameters, and we mounted the cell phone horizontally, parallel to ground level, while capturing images. We took many sample images of the sky and placed them in different tables: Table 3, Table 4, and Table 5 show different levels of sky images based on cloud density.

Each table has three columns: column 1 shows the captured images in RGB format, column 2 the converted grayscale images, and column 3 the converted binary images. Using these images, we obtain the cloud-pixel counts discussed in Table 8, Table 9, and Table 10.

Table 8, Table 9, and Table 10 provide detailed information about the different parameters for each sample: date, time, current location, solar elevation angle of the last hour, solar elevation angle of the current hour, average solar elevation angle, cloud cover, estimated solar irradiance, and the solar irradiance measured with a solar meter. These tables exhibit the procedure for calculating SSI.

The data sets were collected at different geo-locations, times, and dates.

3.3.1 Comparing Image Sample for Different Cloud Levels

Table 3. Image Samples for Clear Sky

Table 4. Image Samples for Mid-Level Sky

Table 5. Image Samples for Dense-Level Sky

3.3.2 COMPARING SSI VALUES

SSI for clear sky images:

This table provides information about the data images, with their respective time, location, and date, for clear-sky images. It also compares the estimated and measured SSI values for these images.

Image Seq. | Date | Time | Latitude | Ѳep (last hr) | Ѳc (current hr) | Ѳ (avg) | S0, clear-sky SSI (W/m²) | Cloud cover (µ) | Estimated SSI (W/m²) | Measured SSI (W/m²)
101 | 10/10/13 | 13:07 pm | 41.245 | 40.16 | 41.80 | 40.98 | 619 | 61581/675881 = 0.009 | 619 | 620
102 | 10/10/13 | 13:21 | 41.24 | 35.03 | 41.77 | 38.4 | 584.936 | 42752/122880 = 0.34 | 573.736 | 560
103 | 10/12/13 | 15:32 | 41.1527786 | 31.66 | 23.13 | 27.395 | 425.521 | 376519/7990272 = 0.0471 | 425.508 | 431.214
104 | 05/3/14 | 10:11 am | 41.53 | 23.56 | 32.38 | 27.97 | 434.319 | 617745/7990272 = 0.07 | 429.96 | 433.3
105 | 06/3/14 | 9:42 am | 41.52 | 18.99 | 28.55 | 23.77 | 369.03 | 9444/1228800 = 0.007 | 369.029 | 384
106 | 3/9/14 | 11:45 am | 41.241 | 30.03 | 37.87 | 33.95 | 552.884 | 216/1228800 = 0.001 | 552.883 | 560
107 | 3/14/14 | 14:46 pm | 41.24171135 | 46.37 | 43.56 | 44.965 | 669.607 | 113234/1228800 = 0.09 | 669.455 | 680-690

Table 8. SSI for clear sky images

SSI for mid-level cloud images:

This table provides information about the data images, with their respective time, location, and date, for mid-level cloud images. It also compares the estimated and measured SSI values for these images.

Image Seq. | Date | Time | Latitude | Ѳep (last hr) | Ѳc (current hr) | Ѳ (avg) | S0, clear-sky SSI (W/m²) | Cloud cover (µ) | Estimated SSI (W/m²) | Measured SSI (W/m²)
201 | 10/10/13 | 17:32 | 41.245 | 26.63 | 13.68 | 20.155 | 311.11 | 803498/921600 = 0.871 | 165.218 | 130-135
202 | 10/10/13 | 17:36 | 41.24194 | 23.63 | 12.94 | 18.285 | 280.606 | 774663/921600 = 0.85 | 159.495 | 102.8
203 | 10/12/13 | 12:12 | 41.15 | 33.87 | 39.21 | 36.54 | 559.430 | 5630063/7990272 = 0.704 | 432.210 | 346.256
204 | 10/24/13 | 12:56 pm | 41.1527786 | 36.73 | 34.19 | 35.46 | 544.333 | 734048/1228800 = 0.597 | 482.212 | 360
205 | 3/10/14 | 14:17 pm | 41.15 | 44.66 | 43.77 | 44.215 | 660.397 | 0.78 | 447.58 | 475-485
206 | 3/10/14 | 14:20 | 41.1523 | 44.72 | 43.84 | 44.28 | 661.180 | 1004223/1228800 = 0.81 | 428.966 | 480-485
207 | 3/10/14 | 14:25 | 41.152 | 44.80 | 43.52 | 44.16 | 659.697 | 1101010/1228800 = 0.89 | 326.783 | 369
208 | 3/10/14 | 11:30 am | 41.2417 | 28.10 | 36.48 | 32.29 | 498.862 | 0.701 | 387.025 | 460

Table 9. SSI for mid-level cloud images

SSI for dense-level cloud images:

This table provides information about the data images, with their respective time, location, and date, for dense-level cloud images. It also compares the estimated and measured SSI values for these images.

Image Seq. | Date | Time | Latitude | Ѳep (last hr) | Ѳc (current hr) | Ѳ (avg) | S0, clear-sky SSI (W/m²) | Cloud cover (µ) | Estimated SSI (W/m²) | Measured SSI (W/m²)
301 | 10/13/13 | 14:07 | 41.1527786 | 39.22 | 34.03 | 36.625 | 560.609 | 7990269/7990272 = 0.99 | 154.277 | 127.350
302 | 10/14/13 | 3:18 pm | 41.24194 | 37.98 | 32.38 | 35.18 | 540.385 | 7990269/7990272 = 0.99 | 135.095 | 160
303 | 10/15/13 | 13:03 | 41.24194 | 39.91 | 37.55 | 38.73 | 589.392 | 7989501/7990272 = 0.9999 | 147.493 | 55-56
304 | 10/15/13 | 5:21 pm | 41.209687 | 23.67 | 14.03 | 18.85 | 289.860 | 7845365/7990272 = 0.9842 | 86.69 | 69.25
305 | 10/16/13 | 11:30 am | 41.209687 | 27.38 | 34.43 | 30.905 | 478.478 | 7753852/7990272 = 0.9704 | 154.456 | 138
306 | 10/23/13 | 1:02 pm | 41.24167835 | 34.75 | 37.04 | 35.895 | 550.438 | 7415216/7990272 = 0.928 | 230.193 | 202
307 | 3/11/14 | 13:31 pm | 41.52 | 42.88 | 45.35 | 44.115 | 659.139 | 1116077/1228800 = 0.90 | 302.718 | 382

Table 10. SSI for dense-level cloud images

3.3.3 BAR CHART COMPARISON BETWEEN CALCULATED AND MEASURED SSI READINGS

Figure 21. Bar chart of comparison between measured and calculated data values of SSI


3.3.4 BAR CHART FOR MINIMUM MEAN SQUARE ERROR

Figure 22. Bar chart for MMSE: min, max, average


3.3.5 BAR CHART FOR SIGNAL TO NOISE RATIO

Figure 23. Bar chart for SNR: min, max, avg.

3.3.6 VALIDATION

To validate our experiment, we plotted graphs of the calculated SSI readings against the SSI readings measured with the solar meter for each category.

Figure 21 compares the measured values retrieved from the solar meter (shown in blue) with the calculated SSI values for clear-sky, mid-level, and high-density cloud images (shown in red). The x-axis of each graph represents the sample image number and the y-axis indicates solar insolation values. Our results are quite accurate for clear-sky (low-level cloud) images.

Next, Figure 22 presents the MMSE (minimum mean square error) with the minimum, average, and maximum values for each category.

Finally, Figure 23 compares the error ratio in the calculated SSI values: a bar chart of the signal-to-noise ratio with minimum, average, and maximum values.
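The thesis plots MMSE and SNR but does not spell out their formulas; the sketch below assumes the standard mean-squared-error and signal-to-noise definitions over the measured and estimated SSI series, which is an assumption on our part.

```java
// Sketch of the error metrics behind Figures 22 and 23 (standard definitions assumed).
class ErrorMetrics {
    // Mean squared error between measured and estimated SSI series.
    static double mse(double[] measured, double[] estimated) {
        double sum = 0.0;
        for (int i = 0; i < measured.length; i++) {
            double d = measured[i] - estimated[i];
            sum += d * d;
        }
        return sum / measured.length;
    }

    // Signal-to-noise ratio in dB: 10 log10(sum(signal^2) / sum(error^2)).
    static double snrDb(double[] measured, double[] estimated) {
        double signal = 0.0, noise = 0.0;
        for (int i = 0; i < measured.length; i++) {
            signal += measured[i] * measured[i];
            double d = measured[i] - estimated[i];
            noise += d * d;
        }
        return 10.0 * Math.log10(signal / noise);
    }

    public static void main(String[] args) {
        // First three measured/estimated pairs from Table 8 (clear sky).
        double[] measured = {620, 560, 431.214};
        double[] estimated = {619, 573.736, 425.508};
        System.out.printf("MSE = %.2f, SNR = %.2f dB%n",
                mse(measured, estimated), snrDb(measured, estimated));
    }
}
```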


CHAPTER 4

DESIGNING OF COMPONENTS FOR SETTING UP FRAMEWORK FOR CROWDSOURCING

Here we have designed a model to set up our methodology for executing the solar irradiance concept. The platform should allow users to contribute by uploading or submitting the required data at different levels and even in different categories.

Our basic idea is to make it easy for a user to access any project or create a new one. The traditional ways for this implementation are:

Online Campaign: 1. Internet, websites; 2. Social media; 3. Facebook, Twitter
Interactive Interface: 1. Mobile application
Face to Face: 1. Asking different groups of people to support by contributing sky images at different levels, such as locations and times

Table 11. Methods for crowdsourcing

4.1 INTRODUCTION TO SYSTEM ARCHITECTURE

The crowd can be described based on the system design or cluster, where a cluster refers to a crowd project. Since crowd computing requires gathering specific information or metadata, sorting the crowd is one of the most crucial processes in the architecture.

4.1.1 SYSTEM MANAGEMENT

To build the computing architecture and its hardware requirements, a few important design aspects need to be taken into consideration:

- Selecting the type of hardware

- Selecting the computational processing unit

- Graphical user interface

- Server and database

- Crowd interface (mobile application)

We have designed a generic template which helps to perform crowd computing.

[Architecture diagram: crowd interactive interface (mobile app), graphical user interface (web app), crowd processing engine, and computational engine]

Figure 24. Architecture for crowdsourcing platform.

4.1.2 COMPUTATIONAL ENGINE

This refers to the processing algorithm that produces the desired result; there can be multiple approaches to follow. Depending on the crowd computing project and on how complex the computation is, we can resolve this challenge through different approaches.

Here are the major approaches we will discuss:

1. Computation at the mobile processing engine

2. Computation at the server / cloud processing unit

3. Computation at the user interface end

4.1.2.1 COMPUTATION AT MOBILE PROCESSING ENGINE

As described, the crowd will interact with a mobile application to perform certain tasks and contribute metadata. This metadata is processed into the desired output, e.g., an image taken by the mobile app is converted from a PNG (RGB) image to a grayscale image. Smartphones nowadays hold strong processors. For example, we performed our operations on a Samsung Galaxy Note 4, which has:

 2.7 GHz quad-core Snapdragon 805 chipset

 Adreno 420 GPU

Other mobile processors are also available, for example:

 ARM v8-A Exynos Octa SoC

 Four Cortex-A57 cores at 1.9 GHz

 Four Cortex-A53 cores at 1.3 GHz

All these processors are strong enough to perform many complex computations, and most GPU units are sufficient for many image-processing operations. So, we can perform most of the computation at the mobile processing engine and then upload the processed metadata to the cloud (server).

Below is a code snippet we wrote in Android Studio to handle this upload.

Android code snippet converting a bitmap to a PNG file:

// Compresses the bitmap to PNG bytes and writes them to internal storage.
public File bitmapToFile(Bitmap bmp) {
    try {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        // The quality argument (80) is ignored for the lossless PNG format.
        bmp.compress(Bitmap.CompressFormat.PNG, 80, bos);
        byte[] bArr = bos.toByteArray();
        bos.flush();
        bos.close();
        // MODE_PRIVATE replaces the deprecated, insecure MODE_WORLD_WRITEABLE.
        FileOutputStream fos = openFileOutput("sample.png", Context.MODE_PRIVATE);
        fos.write(bArr);
        fos.flush();
        fos.close();
        return new File(getFilesDir().getAbsolutePath(), "sample.png");
    } catch (IOException e) { // FileNotFoundException is a subclass of IOException
        e.printStackTrace();
        return null;
    }
}

Android code snippet to upload data to the cloud database:

private static PostHelper client = new PostHelper(PostgresConn.HOST,
        PostgresConn.DB_NAME, PostgresConn.USERNAME, PostgresConn.PASSWORD);

public void DataUpload() {
    thread = new Thread() {
        public void run() {
            try {
                if (client.connect()) {
                    client.insertImage(bitmapToFile(bp));
                    Log.i("file", "inserted");
                }
                client.closeConnection();
            } catch (SQLException se) {
                System.out.println("Oops! Cannot connect. Error: " + se.toString());
            } catch (ClassNotFoundException e) {
                System.out.println("Oops! Cannot find the class. Error: " + e.getMessage());
            } catch (Throwable e) {
                // Also covers the NullPointerException case from the original code.
                e.printStackTrace();
            }
        }
    };
    thread.start();
}

4.1.2.2 COMPUTATION AT SERVER END (CLOUD)

In this approach, we will collect meta from mobile application and store into the

cloud database. As there are plenty of Relational Database available such as MYSQL,

SQL, POSTGRESQL and many other.

Also, there is Non-Relational Database such as Mongo DB, SQLite, Dynamo DB

etc. We also like to discuss Cache Data Structure. These Caching Data Structure use for

accessing most frequently accessed data from a database. There is few Cache available

such as Memcached and Redis.

For this example, we have used a PostgreSQL database, a structured relational DB in which we created a number of tables.

Figure 25. Snapshot of Postgres Table for registered Ventures


Figure 26. Snapshot of Postgres Table for Uploaded Data

Figure 27. Snapshot of Postgres Table for Registered Rovers

We can write a script to perform computation on the metadata at the cloud or server end and display the processed data at the GUI level. This is what we call server-side scripting. Again, this approach depends entirely on the project requirements, as it demands more memory on the server side and can be time-consuming as well.

4.1.2.3 COMPUTATION AT WEB APPLICATION

This is the most effective and traditional approach to performing the computation: once we get data from the cloud, we compute on that data to produce the desired result.

This interface can be designed on many different platforms, such as PHP, HTML, JSP, Node.js, or ASP.NET. To perform the computation we used JavaScript and our algorithm to derive the desired result from the data. We discuss the computation approach for our use case in more detail in a later chapter.

Below are a few snapshots of the source code in PHP and JavaScript.

Figure 28. Screen shot of PHP code


Figure 29. Screen shot of PHP code

The snapshots show the PHP code defining the function that downloads the PNG file from the cloud database and prepares it for display in the web application.

4.1.3 APPROACH FOR OUR USE CASE SCENARIO

We have discussed the most common practices for computing a complex task. We combined all three approaches to reduce the work and overhead at each level, so that processing time can be improved and the workload distributed accordingly.

Still, several things need to be taken into consideration so that we can:

● Mitigate data loss

● Mitigate computation failure

● Perform data backups

● Achieve successful computation

In cloud-based computing, data will eventually be uploaded from the mobile client application. Sluggish data transmission can cause discrepancies and data loss, so the upload mechanism should be intelligent enough to complete this task reliably.

When processing data from the cloud to the web application, there are a few important aspects to be considered:

● Successful Data transmission

● Less time consuming to perform a computation.

● Low Complexity

● Efficient and correct Result

4.2 API SET AND WEB SERVICES

We have implemented several Google APIs and web services. To embed Google Maps in the Android application we used the Google Maps API and the Google Geolocation API. Here is a snapshot of the Google APIs we used altogether in the Android and web applications.

Figure 30. Google API Source

For designing the analytics statistics, we used the Google Charts API.

4.3 DESIGN STRUCTURE AND ARCHITECTURE OF MOBILE APPLICATION

4.3.1 Approach and Definition

To implement the above-mentioned technique for measuring solar irradiance, we used a crowdsourcing methodology to gather the largest possible amount of data. We worked on creating a generalized interface to obtain the most desirable data or results from a crowd. We have designed three distinct components that reflect our understanding of a crowdsourcing application. Here are the three distinctive, fundamental components of our approach.

1. Venture

This category is the name of the project, describing what kind of data is desired and what results or information it needs to accomplish. Because the users who contribute this desired data or information will explore different geographic locations, we call each such project a ‘Venture’: the project to whose geographical areas the users responsible for accumulating data will travel. From here on we refer to each project as a Venture.

2. Project Harvester

This category covers the users who create new Ventures and expect to receive the desired information or data for them. Because these users are responsible for creating Ventures on the platform and will harvest the data or information collected by other users, we call them ‘Project Harvesters’.

3. Data Rover

This component is defined by the purpose of the users who use a Venture to collect the desired data or information for the Project Harvesters. Because these users move through different geographical locations to collect data, and explore Ventures in multiple locations, we call them ‘Data Rovers’. Rovers can collect rewards for achieving a Harvester's goals; we have defined a standard parameter with which Harvesters can build their own reward systems.

4.3.2 Main component of the application

We have already discussed the key components of our architecture for the mobile application, which will primarily be used by Data Rovers to collect data. We have not restricted the type of data a Project Harvester can request, but obviously it must be mobile-accessible or related to Rover inputs; this is left for future discussion. For our application, we have introduced very standard data types to give a better understanding of the architecture. This data can be defined in many ways.

1. Image / Picture data

This type of data is very standard but can be used in many different ways. To collect all types of images, we created an option in our application that uses the default camera hardware of the phone to capture the desired pictures and store them in JPEG format. This can be refined further in different ways, per the requirements of the Project Harvester.

2. Video Data

This type of data is very standard but can be used in many different ways. To collect all types of videos, we created an option in our application that uses the default camera hardware of the phone to record the desired videos. This can be refined further in different ways, per the requirements of the Project Harvester.

3. Audio Data

This type of data is very standard but can be used in many different ways. To collect all types of audio recordings, we created an option in our application that uses the default microphone hardware to record sounds or other sound-related data. This can be refined further in different ways, per the requirements of the Project Harvester.

4. Mobile Sensors Data

This type of data is very standard but can be used in many different ways. We have an option in our application that gathers data from the built-in sensors of the phone. Since smartphones carry many different types of hardware sensors, the application can collect all of this data and transfer it to the desired Venture. This can be refined further in different ways, per the requirements of the Project Harvester.

4.3.3 Workflow

The primary interface of this mobile application is based on Google Maps. We used the Google Maps API to create an interface where Rovers can find multiple Ventures depending on their location. Rovers are identified by a black icon and Ventures by a red icon on the interface. Below is a snapshot of the application.

Figure 31. Google Map Interface for App

This shows how different Ventures and active Rovers are laid out on Google Maps. Ventures are introduced by Project Harvesters to collect important data. We used the Google Maps API and other APIs to get active Rovers' locations.

Figure 32. Venture display for an App

Ventures are displayed on the map page of this application and represented by a red icon. All Ventures are introduced by Project Harvesters to gain information. Tapping a Venture icon navigates to the Venture page, which contains all the information about the Venture, such as:

 Venture ID

 Venture City

 Venture State

 Venture Description

 Latitude

 Longitude

 Venture Type Category

 Venture Published Data

A Driver button represents the type of data to collect, for example a Sensor Data, Camera Data, or Audio button. Once clicked, it navigates to the corresponding driver where the Rover can collect data using the smartphone hardware: the camera driver for picture data, the microphone for audio data, and the multiple-sensor readings for sensor data. Below, different Venture pages show the different drivers with their information.

Figure 33. Venture information to collect sensor data
Figure 34. Sensor driver interface for Venture

Figure 35. Venture information to collect audio data
Figure 36. Audio recording driver for Venture

Figure 37. Venture information to collect image data
Figure 38. Camera driver interface for Venture

Figure 39. Rover displayed on App Figure 40. Rover Information

Rovers are displayed on the map of this application and represented by a black icon. A Rover appears once it registers on the application. Tapping the black icon navigates to the Rover information page, which provides detailed information about the Rover. Below is an example of the Rover information page.

4.5 DESIGN STRUCTURE AND ARCHITECTURE OF INTERACTIVE APPLICATION AND DASHBOARD

Figure 41. Rover Information

Implementation for the Solar Irradiance project: image data collection technique

1. Sky image
2. Date
3. Time
4. Current location latitude
5. Solar elevation angle of last hour (Ѳep)
6. Solar elevation angle of present hour (Ѳc)
7. Average solar elevation angle (Ѳ)
8. SSI in clear sky (S0)
9. Cloud cover (µ)
10. Estimated SSI (W/m²)

We will maintain our data with location (latitude, longitude) as the primary key. The table fields are: 1. Location (country, state, city); 2. Time and date; 3. Reading. These are the main parameters users will upload to our database.

4.5.1 Login credentials

Here we create different login accounts for the two different user types on this dashboard: one user is responsible for uploading new projects, and the other is responsible for uploading the related data images for those projects.

4.5.2 Project Harvester / Data Rovers

Project uploaders are the users responsible for creating and adding new projects to the dashboard. Through the dashboard they can upload the projects for which they need data from data uploaders. Per our requirements, we can upload our solar irradiance project; for it we need different sky images at different locations and times, and these are the constraints of our project. We can also give users different categories for uploading their data through the dashboard or mobile apps. The project uploader is also responsible for designing a reward system appropriate to the project's needs, depending on different parameters and factors.

4.5.2.1 Dashboard feature for project uploader

 Login form

 The next form will be the Dashboard, having these fields:

[1] Project Name / Title (give a blank space to fill title 50 characters)

[2] Category (60 characters’ space)

[3] Project location (if possible we can search location), also can add more

locations

[4] Project Description (250 characters)

[5] Data Input Type (JPEG, PNG, JPG option should be there to choose)

[6] Format type (Resolution)

[7] Quality standards to maintain input data

[8] Population density

[9] Constraints and challenges

[10] Project Budget/fund

[11] Rewards and incentives: Rewards points system

Data uploaders are the online community members responsible for uploading data images for the projects. Through the dashboard interface they can see all uploaded projects and submit data images per each project's requirements; they are also responsible for sending their user information.

4.5.2.2 Dashboard feature for data uploader

Here the user is able to view projects, submit data, and earn rewards.

In this dashboard, the user has the following options:

 User ID

 Project Name


 Project Description

 Location

 Input Data

(a) Image Upload (format: JPEG, PNG, JPG)

(b) Image Location

(c) Time and Date

 Rewards points earned

Different layouts for the dashboard

Figure 42. Login Screen


Figure 43. Screen to add new project by Project Uploader

Figure 44. Added Venture with details


Figure 45. Added data by Rover


4.5.3 Manager and administrator

There are a few important things that need to be taken into consideration for any project, such as:

 Project name

 Project description

 Project duration

 Project feasibility

4.5.3.1 Geographic Analysis

Before implementing a project, we need to analyze the geographic location where it will be deployed, which means studying the geolocation and the spatial distribution of the crowd.

4.5.3.2 Crowd Population

Alongside the geographic analysis, we need to analyze the crowd population and its spatial distribution. For example, to deploy this project in Northeast Ohio we need to map all cities along with their areas and population densities, since the crowd at a location is proportional to the data input for our solar irradiance estimation. Here is a link to a population map of Kent, Ohio: http://bscstudent.buffalostate.edu/ambrosaj01/web/Damling/img/Ohio_population_map.png.

4.5.3.3 Spatial distribution

Here we need to analyze the population distribution in the areas mentioned. For example, to find the spatial distribution in the city of Kent, we should identify its major areas, such as Kent State University, the downtown area, apartments, and buildings. For the university area we can then use the number of enrolled students, faculty, and employees, and so find the densest areas in Kent itself.

4.5.4 Reward system

There are many possible ways a Rover can earn reward points. A few of them are as follows. Once a Rover registers with its latitude and longitude, any data uploaded from its registered zip code earns ‘x’ reward points; data uploaded from a different location earns reward points according to a location-based reward point scale.

Factors that determine location-based reward points:

 Geo-location of the upload

 Crowd population at that location

 Usefulness of the location data, e.g., no points are awarded if the user uploads data from a location that does not fall within the project area map.

The user earns points depending on location as follows.

Location reward scale for a project rover:

➢ Registers with longitude and latitude.

➢ Registers the current geo-location parameters every time the user uploads data.

➢ Every image upload earns 'x' reward points based on the location-based reward point scale.

A mathematical approach for building a location-based reward scale:

1. X = location of the user (registered location with latitude and longitude)

2. d = number of miles after which the reward points change for the user

3. r = base reward points

4. Xn = Xn-1 + d (n > 0)


Each band is d miles wide:

X------X1------X2------X3------X4------ ... ------Xn
Y1     Y2      Y3      Y4                       Yn

X = registered location of the project viewer
Y1 = 'r', the base reward point scale
Y2, Y3, ... = reward scales for the successive location bands, per the location grade scale

The location factors and the corresponding reward points are computed recursively, where 'r' is the reward scale point difference between adjacent bands:

X1 = X + d        Y2 = Y1 + r
X2 = X1 + d       Y3 = Y2 + r
X3 = X2 + d       Y4 = Y3 + r
...               ...
Xn = Xn-1 + d     Yn = Yn-1 + r

General formula for the location factor: Xn = X + (n-1)d

General formula for the reward points earned depending on location difference: Yn = Y1 + (n-1)r
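As an illustrative sketch (class and method names are our own, not from the project source), the general formulas Xn = X + (n-1)d and Yn = Y1 + (n-1)r can be applied to an upload's distance from the registered location as follows:

```java
// Sketch of the distance-based reward scale: an upload d miles further
// from the registered location moves into the next reward band.
// All names and values here are illustrative, not from the project code.
public class RewardScale {

    // distanceMiles: distance from the registered location X
    // d: width of each band in miles, baseReward: Y1, r: per-band increment
    static int rewardPoints(double distanceMiles, double d, int baseReward, int r) {
        // Band index n: distances in [0, d) fall in band 1 (base reward),
        // [d, 2d) in band 2, and so on.  Then Yn = Y1 + (n - 1) * r.
        int n = (int) Math.floor(distanceMiles / d) + 1;
        return baseReward + (n - 1) * r;
    }
}
```

For example, with d = 5 miles, a base of 10 points, and r = 2, an upload 12 miles away falls in the third band and earns 14 points.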

4.5.4.1 Improvised reward point system

X = user's registered location (home location)

Xi = latest registered location (latitude and longitude)

Xi-1 = previously registered location

D = registered number of miles for a reward point increment

R = base reward points

r = recurring reward points

Conditions:

Xi - Xi-1 > D: then R = R + r

Xi - Xi-1 <= D: then R = R

Xi outside the project area (Xi > Xn, a location the project does not require): then R = 0

Other factors that help users earn reward points are:

RLOGIN = the user earns 'x' reward points for the first login in every 24-hour period.

RSH = sharing the project also earns the project viewer reward points.

RTOPUPLOADER = within the same project there can be competition among users, and the top uploader earns a bonus; for example, the daily top uploader might earn a $10 Amazon gift card or something similar.

Uploaded images can also be shared on social networks; if an image reaches a certain number of likes, the uploader earns a certain number of reward points. For example, 100 likes on a single image could earn 'x' reward points, or each like across multiple images could also earn points.
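The conditions of the improvised reward point system in Section 4.5.4.1 can be sketched as follows (names are illustrative; the distance Xi - Xi-1 is assumed to be precomputed in miles):

```java
// Sketch of the improvised reward conditions from Section 4.5.4.1.
// milesMoved corresponds to Xi - Xi-1; outsideProjectArea corresponds
// to the "Xi > Xn" case (a location the project does not require).
public class ImprovisedReward {

    static int rewardPoints(double milesMoved, double thresholdD,
                            int base, int recurring, boolean outsideProjectArea) {
        if (outsideProjectArea) {
            return 0;                     // Xi outside project area -> R = 0
        }
        if (milesMoved > thresholdD) {
            return base + recurring;      // Xi - Xi-1 > D -> R = R + r
        }
        return base;                      // Xi - Xi-1 <= D -> R = R
    }
}
```

Bonuses such as RLOGIN, RSH, and RTOPUPLOADER would be added on top of this base value.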

4.5.5 Quality controls for data input

Several important factors contribute to maintaining quality input data:

1. Creating a mobile application where the user can only upload sky images, and cannot upload any other kind of image.

2. Maintaining a calibrated, horizontally upright camera interface to let the project viewer take sky images.


3. The user should not be able to upload night-sky images or images taken when there is no sunlight.

4. We can enforce this using a sun position method, for example the "PSA algorithm for High Accuracy Tracking of the Sun" [10].

5. Maintaining the quality of images:

 Maintaining the resolution of images.

 Rejecting blurred or corrupted images at upload time, so users cannot upload unwanted image data.

 Accepting only specific formats such as PNG, JPEG, and JPG.
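A hedged sketch of how some of these checks might be enforced on the client: the format check follows the list above, while the daylight check uses a textbook solar-elevation approximation rather than the PSA algorithm cited in [10]. All names are illustrative, not from the project source.

```java
import java.util.Set;

// Sketch of client-side upload checks: accept only PNG/JPEG/JPG files
// and reject images taken when the sun is below the horizon.  The
// elevation formula is a standard textbook approximation, not the PSA
// algorithm itself.
public class UploadValidator {

    private static final Set<String> FORMATS = Set.of("png", "jpg", "jpeg");

    static boolean hasAllowedFormat(String filename) {
        int dot = filename.lastIndexOf('.');
        return dot >= 0 && FORMATS.contains(filename.substring(dot + 1).toLowerCase());
    }

    // Approximate solar elevation in degrees from latitude, day of year,
    // and local solar hour.  Positive elevation means the sun is up.
    static double solarElevationDeg(double latDeg, int dayOfYear, double solarHour) {
        double decl = Math.toRadians(
                23.45 * Math.sin(Math.toRadians(360.0 / 365 * (284 + dayOfYear))));
        double hourAngle = Math.toRadians(15.0 * (solarHour - 12.0));
        double lat = Math.toRadians(latDeg);
        double sinElev = Math.sin(lat) * Math.sin(decl)
                       + Math.cos(lat) * Math.cos(decl) * Math.cos(hourAngle);
        return Math.toDegrees(Math.asin(sinElev));
    }

    static boolean acceptUpload(String filename, double latDeg,
                                int dayOfYear, double solarHour) {
        return hasAllowedFormat(filename)
                && solarElevationDeg(latDeg, dayOfYear, solarHour) > 0;
    }
}
```

A production check would also inspect resolution and blur, and would use the full PSA algorithm for the sun position.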

Figure 46. Sun position

Figure 47. Details for captured image


4.6 DASHBOARD ANALYTICS

Here we have designed the user interface for crowdsourcing our concept of gathering image data for solar irradiance measurement. To get more out of this user interface, we have also designed data analysis tools for this project, which help the project manager and administrator decide how to gather the most feasible data from all the data uploaders collectively.

4.6.1 DIFFERENT TYPE OF GRAPHS AND CHART FOR DATA REPRESENTATION

1. Area charts: x and y axes; areas and lines compare two quantities over a period.

2. Bar charts: x and y axes; each column is a variable and its height is the value.

3. Stacked column charts: x and y axes; each stack is a variable and its height is its value; the total column represents one category.

4. Gauge charts: the needle is the variable; the title gives the range (min. value, max. value).

5. Maps: geographic locations.

6. Line charts: x and y axes; lines compare multiple quantities over a period.

7. Pie charts: variables and percentages.

8. Table charts: rows and columns.

4.6.2 LIST OF METERS, GRAPHS AND CHARTS

1. User

2. Data image

3. Budget and expenses

4. Rewards and points

5. Quality


6. Spatial distribution

7. Crowd Population

8. Geographic Analysis

4.6.3 LIST OF PRIMARY PARAMETERS CHARTS

 per sec

 per hour

 per 3 hours

 per 6 hours

 per 12 hours

 per day

 days

 Weekly

 15 days

 30 days

 month

 Yearly

4.6.4 LIST OF TIME FRAME QUANTITY CHARTS

 Active Users

 Inactive Users

 New Users


Figure 48. List of Gauge meters for Users

1. Location based - active, inactive and new users

Figure 49. Geo Map of user populations in different countries

4.6.5 DATA IMAGE LIST CHARTS

 No of images uploaded

 No of images accepted

 No of images rejected


Figure 50. Annotation chart: Uploaded and Rejected Data

Figure 51. Bar chart: Comparison between Data Upload and Quality data

4.6.6 QUALITY CONTROL CHARTS

 No of quality images

 Blurred Images

 Resolution

 Corrupted Images


 Color accuracy

 Exposure Accuracy

 Noise

Figure 52. Bar chart: Bars for Different quality control

4.6.7 REWARD AND EARNED POINTS CHARTS

 No of total point earned

 Login rewards earned

 Share rewards earned

 Top Uploader reward earned

 Most liked images rewards

Figure 53. Pie Chart: Showing different reward points


4.6.8 GEOGRAPHICAL ANALYSIS CHARTS

1. Country population map

Figure 54. Country population map

2. Different Charts based on locations and areas type

Figure 55. Geo Map: Different cities located on the map

Figure 56. Table Chart: List of populations of different cities


Figure 57. Stacked Bar chart: Active user population in different groups of areas in different cities

Figure 58. Motion Chart: User population based on location and area


CHAPTER 5

SUMMARY AND FUTURE WORK

5.1 CONCLUSION

A major part of our implementation rests on crowdsourcing, and we have used the surface solar irradiance implementation as a prime example for introducing it.

Crowdsourcing is itself a vast topic to work with. To narrow down our approach and enhance the SSI implementation, we worked on the social approach. Building an interactive, user-friendly mobile application and designing the dashboard for the Project Harvester, which opens a gateway for implementing multiple projects, are among the goals we have achieved so far. The combination of techniques we used to build a more open platform for collecting data is unique, and the techniques are not so complicated as to obstruct future implementation. Most of the development technologies we used, such as Android Studio for designing the Android application and Google Charts for most of the statistics dashboard design, are open source, so any further implementation can be done easily.

Narrowing down to our initial goal of implementing surface solar irradiance estimation, there are still many factors that affect our resulting SSI value 'S', which varies with natural elements such as atmospheric reflectivity, cloud height, cloud reflectivity, and aerosol components in the atmosphere, along with sudden atmospheric changes and other factors such as interference from other flying bodies.

Cases that need to be understood [3]:

• In the morning, before sunrise, an absolute value of the solar radiance needs to be identified, as it corresponds to the clear sky.

• When the sky is scattered with clouds, the border pixels of clouds are classified as inhomogeneous and treated as mixed pixels:

- Overexposed: sometimes small branches are outshined.

- Underexposed: small holes in the canopy are lost.

Since the implementation depends largely on the crowd collecting and gathering information, there is a higher chance of human error, and sky images uploaded by the crowd may carry a noticeable amount of error. A future approach should therefore pivot toward image segmentation: to reduce the expected error in the values estimated with this method, we need to do more research and find the most appropriate image segmentation method for surface solar irradiance calculation.
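As one hypothetical starting point for such segmentation (not a method used in this thesis), the red-to-blue ratio threshold common in ground-based sky imaging separates whitish cloud pixels from blue sky pixels; the threshold value below is illustrative:

```java
// Sketch of red-to-blue ratio (RBR) cloud segmentation.  Clear sky
// scatters blue light strongly (low R/B), while clouds are whitish
// (R/B close to 1).  Threshold is illustrative, not tuned.
public class SkySegmentation {

    static boolean isCloudPixel(int red, int blue, double threshold) {
        if (blue == 0) {
            return true;                      // saturated/overexposed pixel
        }
        return (double) red / blue > threshold;
    }

    // Fraction of cloud pixels given parallel red and blue channel arrays.
    static double cloudFraction(int[] red, int[] blue, double threshold) {
        int cloudy = 0;
        for (int i = 0; i < red.length; i++) {
            if (isCloudPixel(red[i], blue[i], threshold)) {
                cloudy++;
            }
        }
        return (double) cloudy / red.length;
    }
}
```

The cloud fraction obtained this way could feed directly into cloud-cover-based irradiance models such as Kasten and Czeplak [5].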

5.2 OPEN WORK AND FUTURE SCOPE

This instrumentation brings researchers more extensibility for data gathering and analysis without requiring external equipment. If the complexity of data acquisition changes, however, it can also bring more constraints and challenges: the entire architecture relies on smartphone sensors, which can be limited, and on the crowd, whose interaction may produce human errors. So, for this crowdsourcing platform, there are still important aspects to take into consideration, such as:


 Building compatible mobile application for Android and iPhone users

 Uploading mobile app in App Store and Google Play

 Hosting / Maintaining Https and Data Server

 Reducing communication overhead

 Awareness of mobile Instrumentation among crowd population.

 Heterogeneity, transparency, security, privacy, scalability, stability, reliability, and

redundancy

Mobile crowd instrumentation can build a complex hybrid human-machine system for data acquisition, assessment, and analysis at low cost, without much involvement from a specially educated or technical group.


APPENDIX A

Appendix: Source code sample for Android application.

Here are some experimental data we retrieved with the mobile application built for the crowdsourcing platform. The application retrieves samples stored in a local PostgreSQL database and displays the information on a Google Map, as shown in Figure 30. We queried the database for the Ventures and rovers registered on the platform, displaying each with a different icon.

The following is the main method for the activity that displays rovers and Ventures on the Google Map.

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_maps);

    // Obtain the SupportMapFragment and get notified when the map is ready to be used.
    SupportMapFragment mapFragment = (SupportMapFragment) getSupportFragmentManager()
            .findFragmentById(R.id.map);
    mapFragment.getMapAsync(this);
}

@Override
public void onMapReady(GoogleMap googleMap) {
    googlemap = googleMap;
    /* Example (disabled): add a marker in Sydney and move the camera there.
    LatLng sydney = new LatLng(-34, 151);
    googlemap.addMarker(new MarkerOptions().position(sydney).title("Marker in Sydney"));
    googlemap.moveCamera(CameraUpdateFactory.newLatLng(sydney));
    CameraPosition myPosition = new CameraPosition.Builder()
            .target(sydney).zoom(10).bearing(90).tilt(0).build();
    googleMap.animateCamera(CameraUpdateFactory.newCameraPosition(myPosition)); */
    enableMyLocation();
}

private void enableMyLocation() {
    if (ContextCompat.checkSelfPermission(this, Manifest.permission.ACCESS_FINE_LOCATION)
            != PackageManager.PERMISSION_GRANTED) {
        // Permission to access the location is missing.
        PermissionUtils.requestPermission(this, LOCATION_PERMISSION_REQUEST_CODE,
                Manifest.permission.ACCESS_FINE_LOCATION, true);
    } else if (googlemap != null) {
        // Access to the location has been granted to the app.
        googlemap.setMyLocationEnabled(true);
        googlemap.setOnMyLocationButtonClickListener(this);
        MarkerCount();
    }
}

private void MarkerCount() {
    Handler handler = new Handler();

    // Background task: read registered ventures and rovers from PostgreSQL.
    Runnable runnable = new Runnable() {
        @Override
        public void run() {
            try {
                if (pgdb.connect()) {
                    ResultSet countrs = pgdb.executeQuery("SELECT count(*) FROM proinfo");
                    while (countrs.next()) {
                        int count = countrs.getInt(1);
                        if (count != 0) {
                            ResultSet rs = pgdb.executeQuery(
                                    "select project_name, latitude, longitude from proinfo");
                            while (rs.next()) {
                                project_name = rs.getString(1);
                                latitude = rs.getString(2);
                                longitude = rs.getString(3);
                                projectVal.add(project_name);
                                latval.add(Double.parseDouble(latitude));
                                longval.add(Double.parseDouble(longitude));
                            }
                            rs.close();

                            ResultSet rs1 = pgdb.executeQuery("select * from users");
                            while (rs1.next()) {
                                user_name = rs1.getString(1);
                                user_lat = rs1.getString(2);
                                user_long = rs1.getString(3);
                                userVal.add(user_name);
                                userlatval.add(Double.parseDouble(user_lat));
                                userlongval.add(Double.parseDouble(user_long));
                            }
                            rs1.close();
                        }
                    }
                    countrs.close();
                    pgdb.closeConnection();
                }
            } catch (SQLException se) {
                System.out.println("oops ! Can not connect. Error: " + se.toString());
            } catch (ClassNotFoundException e) {
                System.out.println("oops ! You can not find the class. Error: " + e.getMessage());
            } catch (NullPointerException e) {
                e.printStackTrace();
            } catch (Throwable e) {
                e.printStackTrace();
            }
        }
    };

    // UI task: place the retrieved ventures and rovers on the map.
    handler.post(new Runnable() {
        @Override
        public void run() {
            String[] prod = projectVal.toArray(new String[projectVal.size()]);
            Double[] lat = latval.toArray(new Double[latval.size()]);
            Double[] longit = longval.toArray(new Double[longval.size()]);

            String[] user = userVal.toArray(new String[userVal.size()]);
            Double[] userlat = userlatval.toArray(new Double[userlatval.size()]);
            Double[] userlongit = userlongval.toArray(new Double[userlongval.size()]);

            // One marker per venture; tapping it opens the upload activity.
            for (int i = 0; i < projectVal.size(); i++) {
                double latitude = lat[i];
                double longitude = -longit[i];
                final String projectName = prod[i];
                nav = 1;
                LatLng latLng = new LatLng(latitude, longitude);
                Projectmarker = googlemap.addMarker(new MarkerOptions()
                        .position(latLng).title(projectName)
                        .icon(BitmapDescriptorFactory.fromResource(R.drawable.user)));

                if (nav == 1) {
                    googlemap.setOnMarkerClickListener(new GoogleMap.OnMarkerClickListener() {
                        @Override
                        public boolean onMarkerClick(Marker marker) {
                            String projectname = marker.getTitle().trim();

                            SharedPreferences pref = getApplicationContext()
                                    .getSharedPreferences("MyPref", MODE_PRIVATE);
                            SharedPreferences.Editor editor = pref.edit();
                            editor.putString("Project_Name", projectname);
                            editor.commit();

                            Intent i = new Intent(MapsActivity.this, UploadActivity.class);
                            startActivity(i);
                            nav = 0;
                            return false;
                        }
                    });
                    googlemap.moveCamera(CameraUpdateFactory.newLatLng(latLng));
                }
            }

            // One marker per rover, centering the camera on each in turn.
            for (int i = 0; i < userVal.size(); i++) {
                double latitude = userlat[i];
                double longitude = -userlongit[i];
                final String userName = user[i];
                nav = 2;
                LatLng latLng = new LatLng(latitude, longitude);
                googlemap.addMarker(new MarkerOptions().position(latLng).title(userName)
                        .icon(BitmapDescriptorFactory.fromResource(R.drawable.user1)));
                CameraPosition myPosition = new CameraPosition.Builder()
                        .target(latLng).zoom(5).bearing(0).tilt(0).build();
                googlemap.animateCamera(CameraUpdateFactory.newCameraPosition(myPosition));
            }

            // Demo markers: alternating "Group" and "Wizard" icons around (40, -81).
            for (int i = 0; i < 10; i++) {
                LatLng latLng;
                if (i % 2 == 0) {
                    latLng = new LatLng(40 - i, -81);
                    googlemap.addMarker(new MarkerOptions().position(latLng).title("Group")
                            .icon(BitmapDescriptorFactory.fromResource(R.drawable.group)));
                } else {
                    latLng = new LatLng(40, -81 + i);
                    googlemap.addMarker(new MarkerOptions().position(latLng).title("Wizard")
                            .icon(BitmapDescriptorFactory.fromResource(R.drawable.star)));
                }
                CameraPosition myPosition = new CameraPosition.Builder()
                        .target(latLng).zoom(5).bearing(0).tilt(0).build();
                googlemap.animateCamera(CameraUpdateFactory.newCameraPosition(myPosition));
            }
        }
    });

    new Thread(runnable).start();
}

List 1. Main method for displaying rovers and Ventures on the Google Map in the Android application


APPENDIX B

Appendix: Sample source code for crowdsourcing web application.

We have designed a crowdsourcing web application where the Project Harvester can introduce new Ventures into the platform. Below is some of the sample code we used to retrieve information from the PostgreSQL database to display the registered Venture information shown in Figure 43, and to register a new Venture into the database as shown in Figure 42.

List 2. Main method for establishing PostgreSQL connection

List 3. Method shows insert query to register new Venture


List 4. Method shows select query to display information of each registered Ventures


BIBLIOGRAPHY

[1] Ten Basic Cloud Types. (n.d.). Retrieved March 30, 2017, from http://www.srh.noaa.gov/jetstream/clouds/cloudwise/types.html

[2] Brabham, D. "Crowdsourcing: A Model for Leveraging Online Communities,[w:] Delwiche A." Henderson J.(red.), The Participatory Cultures Handbook, New York (2012).

[3] B., J. (n.d.). Crowdsourcing: Today and Tomorrow, An Interactive Qualifying Project. Retrieved April 03, 2016, from http://www.academia.edu/23940296/Crowdsourcing_Today_and_Tomorrow_An_Interactive_Qualifying_Project

[4] C. Gauchet, P. Blanc, B. Espinar, B. Charbonnier, D. Demengel, “Surface solar irradiance estimation with low-cost fish-eye camera,” Workshop on Remote Sensing Measurements for Renewable Energy, Risoe, Denmark, 2012.

[5] F. Kasten, G. Czeplak, “Solar and terrestrial radiation dependent on the amount and type of cloud,” in Solar Energy, Volume 24, Issue 2 ed., Hamburg, Germany: Elsevier, 1980, pp. 177-189.

[6] E. Schwalbe, H.G. Maas, M. Kenter and S. Wagner, “Profile based sub-pixel classification of hemispherical images for solar radiation analysis in forest ecosystems,” in Proceedings of the Commission 7 Symposium, volume 36 (part 7), ISPRS – International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Enschede, Netherlands, 2006.

[7] Chapter 30, “Solar Energy Utilization,” ASHRAE Handbook, Applications, 1995. [TH7011.A12]

[8] Ten Basic Cloud Types. (n.d.). Retrieved March 30, 2017, from http://www.srh.noaa.gov/jetstream/clouds/cloudwise/types.html

[9] Brabham, D. C. (2013). Crowdsourcing. Cambridge, MA: The MIT Press.

[10] References for the PSA algorithm:

1. Vant-Hull LL, Hildebrandt AF. Solar thermal power system based on optical transmission. Solar Energy [Internet]. 1976;18:31-39. Available from: http://www.sciencedirect.com/science/article/B6V50-497SCJS-2H/2/78dfffb8fca290387fb2596f89696498

2. Blanco-Muriel M, Alarcón-Padilla DC, López-Moratalla T, Lara-Coira MÍ. Computing the solar vector. Solar Energy [Internet]. 2001;70:431-441. Available from: http://www.sciencedirect.com/science/article/B6V50-42G6KWJ-5/2/a61a5c50128325f281ca2e33e01de993

3. Reda I, Andreas A. Solar Position Algorithm for Solar Radiation Applications. 2003.

[11] M. Z. Jacobson, "Fundamentals of Atmospheric Modeling", 2nd ed., Cambridge University Press, 2005.

[12] D. L. Hartmann, "Global Physical Climatology", Academic Press, p. 30. ISBN 0080571638.

[13] Reda, I., & Andreas, A. (2008). Solar Position Algorithm for Solar Radiation Applications (Revised). doi:10.2172/15003974

[14] Duffie, J. A., & Beckman, W. A. (2013). Solar engineering of thermal processes. Hoboken, NJ: Wiley.

[15] Seinfeld, J. H. (2006). Atmospheric chemistry and physics: from air pollution to climate change.

[16] Sensors Overview. (n.d.). Retrieved April 03, 2016, from https://developer.android.com/guide/topics/sensors/sensors_overview.html

[17] Hetmank, L. (2013). Components and Functions of Crowdsourcing Systems-A Systematic Literature Review. Wirtschaftsinformatik, 4, 2013.

[18] N. (2013, June 22). 25 Cool Android Sensor Apps that give your phone some very useful super powers [Freeware]. Retrieved November 07, 2016, from http://www.redferret.net/?p=36668
