Surveillance, privacy concerns and beliefs about the utility of ICT4D in Africa – A hybrid approach to empirical investigation

(Submitted to & Society journal)

Egwuchukwu Ani, Renaissance University, Ugbawka, P.M.B. 01183 Enugu, Nigeria. [email protected]

Abstract: Much of the international development literature perceives the use of ICT4D in Africa as having a revolutionary impact on development. While this literature draws attention to important attributes of ICT4D and its transformational effects to date, it neglects the adverse possibilities that accompany its use. This work analyses the possible adverse effects of ICT4D through the lens of surveillance and privacy concerns, using rigorous and multiple empirical investigations. The authors first developed and tested a theoretical model using second-generation multivariate statistics (LISREL). Data were generated through online questionnaires and interviews. The authors then employed a novel computational approach to content analysis, scraping the Twitter data of key influencers on surveillance, privacy, and ICT4D in Africa. Overall, we found that growing apprehension about surveillance and privacy weighs on perceptions of ICT usefulness for development initiatives. We further discuss the possible reasons for our findings and their implications.

Introduction
The use of information and communication technology (ICT) in Africa for development interventions has been widely established in the literature on ICT for development (ICT4D1 - benefits and opportunities for development). Some ICT4D scholars have maintained the familiar implicit assumption that communication technology accelerates socio-economic development, eradicates poverty, closes the digital gap and spurs inclusive innovation (Brown and Grant 2010; Madon et al. 2009; Walsham 2005, 2012; World Bank 2012)2. This is, of course, with some good cause; new technologies present countless opportunities for expression, connectivity and empowerment (Graham 2016, 2017). However, such accounts ignore the contradictory impacts and adverse effects of these technologies, which can narrow the communication gap while constraining freedom of communication in equal measure (Carmody 2012). The result of this dichotomy of transformation is itself twofold. On the one hand, it has facilitated the deployment of development initiatives in areas such as education, health, politics, economic development and national security. On the other hand, it has adversely affected these same areas as well as the privacy of communication, in the form of surveillance and threats to civil liberties. Efforts to advance development in Africa through communication technologies fail to engage with users' digital privacy and the appropriateness of how information is shared and flows (Nissenbaum 2010), or with surveillance understood as the collection and monitoring of personal information for the purpose of social control (Lyon 2002).

1 ICT4D is used interchangeably in this work with ICT, referring to information and communication technology; ICT4D stands for information and communication technology for development.
2 See also the work of the Center for Global Development, http://international.cgdev.org/ (accessed 12 October 2017)


Beginning with the premise that both ICT corporations and international aid agencies play critical roles in shaping individuals' and communities' expectations of ICT for rapid development (Dailey et al. 2010), the current study focuses on ICT users' perceptions of the actual use of communication technologies for development interventions in Africa, and the associated privacy and surveillance risks. It questions the impact of such technologies on transformation, and also examines the capability of communication technologies to collect and process citizens' and consumers' data, couched primarily as surveillance (Lyon 2007).

Citizens and consumers of ICT benefit from the Internet through convenience, vast amounts of information sharing, and time savings. This information sharing has become extremely valuable because it allows marginal ICT users to tap knowledge from the free economy of global users (Graham 2013). But there are risks associated with these benefits, as citizens are affected by shifts in the meaning and nature of surveillance, evident from credible indications, particularly Snowden's revelations (Anderson 2014, 2). This risk may be proportionately greater in resource-constrained settings, such as the economically impoverished populations of Africa. This is particularly true of the inequity and information inequality that result from isolation (Graham 2014). Data profiling of marginalized groups by corporations and the state can exacerbate existing conditions of inequity. Through data collection, analysis and storage, members of the dependent or marginal Internet-using population are at risk of being stereotyped, exploited, and isolated. Furthermore, such risk reinforces the ways in which dependent, as opposed to independent, poverty has been systematically and historically produced through colonialism and other exploitative forms of international interconnection (Carmody 2011). In an attempt to deliver on development, corporations, agencies and states can use and reuse this profiled information, creating a feedback loop of injustice.

Yet Africa reflects societies in which privacy is highly treasured but often violated with the aid of communication technologies. Africans view privacy as an expression and safeguard of personal dignity (Neethling 2005). Privacy is among the highest of individual rights (Etzioni 1999; Westin 2001). There are several good illustrations of this. In Nigeria, for example, there was alarming public concern, especially among the Igbo ethnic group, when a taped conversation between two political officials from the other two major tribes revealed them scheming on how to tame the Igbo tribe. When asked, 60% of Igbo respondents, contrary to expectations, condemned this breach of privacy, notwithstanding the advantageous impact of the revelation.3 In a recent survey by Human Rights Watch on the development effects of mobile phone and Internet-related technologies in Ethiopia, 90% of ordinary citizens said that, yes, there are improvements in communication cost and ease, but these are nothing compared with the perceived fear of being monitored by government; this fear has driven them to self-censor and has greatly limited their freedom of communication.4 The South African government came under fire from its citizens when it announced a development programme that would wreck people's privacy: the proposed ID verification technology is capable of simulating human cognition using neural networks.5 In the highly sensitive setting of health care, African governments together with the World Health Organization (WHO) have made some efforts to introduce disease surveillance and e-health technology in Sub-Saharan African countries. A notable example is Integrated Disease Surveillance and Response (IDSR), a regional framework for strengthening national public health surveillance capabilities at all levels in Africa.6 However, little improvement has been made on the WHO's IDSR since its establishment in 1998, despite the expectation that it would ultimately drive significant advances in disease surveillance in Africa, where mobile penetration is growing at an unprecedented rate.

3 Techpoint, 'Nigeria's Big Brother is watching you' (23 June 2017) https://techpoint.ng/2017/06/23/nigeria-big-brother-watching/ (accessed 12 October 2017)
4 News from Africa, 'Privacy and Surveillance in Africa' (daily publications) https://privacyinafrica.com/category/news-from-africa/ (accessed 12 October 2017)
5 University of Johannesburg, 'New SA technology will verify your identity by simulating how you think' (28 June 2017) https://privacyinafrica.com/2017/06/28/new-sa-technology-will-verify-your-identity-by-simulating-how-you-think/ (accessed 12 October 2017)
6 Centres for Disease Control and Prevention, Global Health—Health Protection. What is Integrated

The introduction of the Rapid Diagnostic Test (RDT) for malaria (Chihanga 2012) provides instant diagnosis of infection via SMS-based, app-based, VRS-based and telephony-based services. The identified surveillance system follows the usual standard flow of information: (i) routine data collection at health care facilities, powered by mobile devices; (ii) forwarding to a district server located at national centres; and (iii) analysis and sharing with different national and international aid agencies (Brinkel 2014) in the case of diseases defined by the WHO (Pascoe 2012). Consequently, the extent to which these aggregated data can be shared is crucial to allow a precise and complete assessment of their impacts. However, little empirical research has been carried out on the adverse implications of data collected for health purposes. While investments are going into the development of functional e-health, clearance, confidentiality of data, data sharing and security were not even raised. The risk of weak privacy protection for marginal Internet users is often combined with poor direct protection for freedom of expression, leading to underdevelopment (Andrew et al. 2012). It should be noted that the lone notion of closing the communication gap has, in many ways, been unhelpful. Emphasis on technology alone draws attention away from other divides and inequalities that hamper development (Heeks 2002, 7). There have been some high-profile cases of such emphasis on technology use without appropriate consideration of its implications. For example, in Ghana, the Ministry of Fisheries and Aquaculture Development recently deployed surveillance drones to monitor and clamp down on illegal fishing in the country.7 In Mozambique, surveillance cameras are to be positioned in medical stores of the National Health Service in the fight against theft of medicines.8 The South African surveillance method is worrisome. The State Security Minister said: "We are monitoring everything, from forcing telecommunication companies such as MTN, Vodacom, Telkom and Cell C to hand over sensitive information about their customers' call logs, to intercepting calls at will."

In the following section, we develop a theoretical framework for a model that considers surveillance, privacy concerns and beliefs about the use of communication technologies as antecedents to the provision of development initiatives in Africa.

Theoretical framework
The fervour surrounding ICT4D discourse has been so cacophonous as to drown out, or arguably thwart, any critical analysis of the potential adverse effects of the adoption of new technologies on human rights and civil liberties (Hosein G. et al. 2013, 12). According to Graham (2002), surveillance tools such as CCTV are becoming the fifth utility, like water, electricity and gas. While scholars from a range of disciplines, including the social sciences and MIS, have addressed surveillance and privacy issues in the context of technology use from a global standpoint (Culnan et al. 1999; HelpAge 2011), Africa's case has remained largely unaddressed. To fill this void, we have critically analysed important research on the subject of ICT4D (ICT for development) through the lens of surveillance and privacy concerns. A large proportion of this work has focused on a range of issues, including the power of surveillance in everyday life and the fear of consumption (Lyon 2001, 2003), the transformational use of ICT in Africa (Yonazi, Kelly 2012),

Disease Surveillance and Response (IDSR)? Available online: www.cdc.gov/globalhealth/dphswd/idsr/what/ (accessed 7 October 2017)
7 Timothy Ngnenbe, 'Ghana to deploy drones to track illegal fishing' (22 June 2017) https://privacyinafrica.com/2017/06/22/ghana-to-deploy-drones-to-track-illegal-fishing/ (accessed 12 July 2017)
8 'Surveillance and privacy in Africa' (9 June 2017) https://privacyinafrica.com/2017/06/09/mozambique-surveillance-cameras-in-fight-against-theft-of-medicines/ (accessed 10 July 2017)

the end of geography and the conceptualization of space, place, connectivity and information technology (Graham 1998, 2012, 2013, 2014, 2015, 2016, 2017), empirical investigations into privacy concerns and beliefs about government surveillance in online transactions (Dinev, Hart 2008; Liu, Yu 2000; Winson 2015), the ASR forum on surveillance in Africa: politics, histories and techniques (Kevin Donovan, Philippe Frowd, Aaron Martin 2016), perceived website utility models (Jiang, Wang, Tan, & Yu 2016), and an exploration of how development and humanitarian aid initiatives are aiding surveillance in developing countries (Hosein, Nyst 2013). This research falls within the last category (development initiatives) and focuses on privacy and surveillance concerns about the usefulness of ICT4D in Africa. Scholars have discussed and written much about information and communication technologies (ICTs) as a possible panacea for African economic transformation (Graham 2012). Indeed, there have been many studies on the impact of ICT on Africa's economic development (Guerriero 2015; Owiny, Mehta, Audrey, Maretzki 2014), and much of this work seems to revolve around Internet access. This, we suggest, is due to the prominence of their powerful, charismatic Silicon Valley champions (Graham, Mann 2013). However, with inconclusive empirical evidence to support these impressive claims (Friederici, Ojanperä, Graham 2017, 3), this research seeks to address, through empirical investigation, the actual usefulness of ICT4D in Africa. Specifically, it addresses the question: in what ways has ICT4D been utilized in Africa? The nature of surveillance in Africa is very subtle, probably because of the common belief that Africa is the least developed continent in the world and, as such, is excluded from the modernity that underpins surveillance techniques from a global perspective. Yet, as some researchers have identified, surveillance does occur in Africa, albeit in low-tech ways (Kevin Donovan, Philippe Frowd, Aaron Martin 2016). This is in addition to the expansive monitoring enabled by ICT in many African states. What is known about this rapidly emerging surveillance capacity is documented in some of the literature. For example, in Uganda, a surveillance programme codenamed Funga Macho, or "open eye" in Swahili, has been used by the government to spy on opposition politicians and anyone deemed dangerous to state security.9 In Ethiopia, authorities are suspected of using off-the-shelf hacking software to spy on journalists and members of the Ethiopian diaspora living in the United States and Europe.10 In Nigeria, there is evidence of US- and UK-developed monitoring software.11 In South Africa, the National Communication Centre is at the centre of the country's surveillance programme and operates above the law.12 Sudan established the Cyber Jihadist Unit, using equipment developed in Italy, to monitor political opposition figures.13

In developing countries, mobile phones are arguably the success story in the domain of ICT4D. As such, most humanitarian programmes use the phone as a medium of deployment; examples include mobile health initiatives and digital identity registration. Such schemes can be transformational and solve the most basic developmental challenges.14 But developing countries are plagued by historical divisions, ethnic conflicts and other social and cultural vulnerabilities that heighten the risks and complications of using mobile phones for health services. Across the continent, governments are requiring citizens to register their SIM cards with personal information. Call and text message data records held by authoritarian actors stand at risk of misuse.

9 Nick Hopkins and Jake Morris, 'UK firm's surveillance kit used to crush Uganda opposition', BBC News (15 October 2015) http://www.bbc.com/news/uk-34529237 (accessed 12 August 2017)
10 United States district court, 'document on court proceedings' https://www.eff.org/files/2014/02/18/complaint-kidane.pdf (accessed 25 July 2017)
11 Morgan Marquis-Boire et al., 'Mapping Global Censorship and Surveillance Tools', The Citizen Lab (15 January 2013) http://citizenlab.org/2013/01/planet-blue-coat-mapping-global-censorship-and-surveillance-tools/ (accessed 3 October 2017)
12 http://pmg-assets.s3-website-eu-west-1.amazonaws.com/141106dr.pdf (accessed 15 September 2017)
13 Bill Marczak et al., 'Mapping Hacking Team's "Untraceable" Spyware', The Citizen Lab (17 February 2014) https://citizenlab.ca/2014/02/mapping-hacking-teams-untraceable-spyware/ (accessed 18 August 2017)
14 'The Science of Delivering Online IDs to a Billion People: The Aadhaar Experience', World Bank Live, 24 April 2013, available at http://live.worldbank.org/science-delivering-online-idsbillion-people-aadhaar-experience (accessed 18 July 2017)

Discrimination could easily result, pertaining, for example, to electoral trends, public health issues, political activities or location (Hosein 2013, 48). This raises an interesting research question: is the real impact of connectivity in Africa privacy invasion, or is it economic growth and inclusive development?

Methodology
The research setting for this investigation was both online and offline. The online survey used an e-platform, Survey Monkey's Audience feature. The authors, through the software, directed participants to complete the online survey before leaving their school libraries or public computer training centres. To avoid sampling bias, we did not make any prior advertisement to recruit participants for the survey. The software allows a researcher to distribute a survey questionnaire to a targeted audience satisfying certain demographic characteristics. Using a web-based questionnaire provides an easily accessible medium, at little or no cost, for both respondents and authors. The offline version of the questionnaire was administered to wholesalers, retailers and customers in the continent's well-known technology clusters (located in Lagos, Nigeria; Johannesburg, South Africa; and Nairobi, Kenya). The offline survey was employed to supplement the online survey, together with semi-structured interviews. Interviews were conducted in four different national languages and translated into English (see the interview section for a detailed explanation). It was appropriate to use local languages in interviewing a population with a very low level of education and computer exposure. Research subjects, all aged 18 years and above, included individuals from a wide range of walks of life: the military, public and private technology sectors, aid agency workers, business people, and undergraduate and graduate students from large universities in 15 African countries. Over 400 subjects participated in the final survey, following two pretesting surveys, with almost equal representation of gender. In our effort to expand the scope of conventional methods, we also explored the novelty of Twitter scraping for content analysis (see the Twitter section for a detailed explanation). The authors relied on the PACT Methodological Report on Survey Design & Survey Questionnaire15 developed for measuring individuals' attitudes toward privacy, with modifications to fit the African context and following analyses in the professional and popular literature (Gangadharan 2015). According to Straub (1989), a survey questionnaire is valid when it contains relevant questions drawn from a larger body of questions in the literature. All items use a 5-point Likert scale and are shown in Appendix A.

Theoretical model
To date, we are not aware of any empirical research that has incorporated surveillance, privacy concerns and beliefs about the usefulness of communication technologies (ICT) in the provision of development initiatives in Africa. Although development interventions have come a long way on the continent, there is little or no research that has empirically investigated the contrary impacts of the use of communication technologies in the delivery of aid interventions. Compared with the extensive research carried out across the Atlantic on surveillance and privacy, no other published report in Africa, to our knowledge, has done this. Thus, this study represents the most complete empirical analysis of the contradictory role of ICT in African development yet compiled and published.

According to Nicolas and Graham (2017), ICT4D in Africa has yet to live up to its grand visions of economic growth and inclusive development. Even if Africa is not yet fully ICT-automated and networked, surveillance is no longer the exclusive preserve of the high-tech global North (Egwuchukwu 2017). It should be noted, however, that a lack of resources does not prevent relatively unskilled, ordinary users of electronic devices from achieving anything like the surveillance power of large corporations (Lyon 2003, 83).

15 PACT (2013) - Public perception of security and privacy: Assessing knowledge, Collecting evidence, Translating research into action. Collaborative Project funded under Theme SEC-2011.6.5-2, “The relationship between human privacy and security” Available at http://www.projectpact.eu/ (accessed 13 August 2015)


Surveillance and privacy in the context of technology use are global issues (Lyon 2010). Researchers over the past 25 years have assessed technology acceptance using various iterations of the classical Technology Acceptance Model (TAM) and the Theory of Planned Behaviour (TPB). The research model for this investigation has its foundation in the technology acceptance model posited by Davis et al. (1989) and measures attitude on the construct of perceived usefulness, one of the variables described by the classical TAM. While it has foundations in these models, this investigation focuses on modifications made to the Jiang et al. (2016) model of website utility and aesthetics, presented later in this section. The Jiang model of website utility represents a comprehensive modification of TAM3 and ETPB. Many studies have used TAM to determine the outcome of information technology adoption (Adams, Nelson, Todd 1992; Taylor, Todd 1995; Wu, Wang 2005; Yang 2005). In like manner, TAM has been employed and modified in many privacy and surveillance studies (Dinev et al. 2008) to assess users' willingness to use ICT services for transactions. Users' willingness to accept or reject ICT-mediated services, or to use or misuse them, often shapes the actual utility of such ICT in any given context (Hu et al. 1999). Consequently, the dependent variable for this research is the actual utility of ICT in Africa (AUIA). This construct refers to the intended use of ICT in general rather than for a specific purpose. Because our study focuses on the role ICT plays in development in general, and on its possible contradictory impacts from the perspective of privacy and surveillance and the influence these exert on attitudes, a specific case of the transformational role of ICT bears no relevance to the main research question. Our theoretical model is shown in Figure 1 and each of the constructs is listed in Table 1.

Privacy concern
The best-known fundamental definition of privacy is that of Warren and Brandeis (1890, 193), who see it as a distinct individual 'right to be left alone'. Privacy concerns relate to individuals' apprehension and uneasiness over the use of their personal information (Westin 2003). The concept of privacy has been studied extensively in various disciplines and has proved to be important in everyday life. For example, Margulis, Lwin and Divev (2016) found that privacy concern is the single most cited reason for declining to use the Internet. Many individuals do not register for electronic voting in Nigeria and beyond because they are worried that their political affiliation could be disclosed to opponents and might be used against them depending on the election outcome (Osho 2015). As many as 60% provide false information when asked to register before using ICT products such as SIM cards or biometric registration for e-voting (Donovan et al. 2014). We propose that privacy concern determines the utility of ICT4D in Africa. The possibility of misuse of technology is not novel (Linden et al. 2003). Although our attention to ICT may seem limited to Internet services, it is not; perhaps this impression is due to the prominence of their powerful, charismatic Silicon Valley champions (Graham 2016). Specifically, we suggest that individuals' sense of how much ICT has aided development interventions in Africa emanates from their privacy concerns and ultimately shapes their responses. If individuals perceive corporations as acting responsibly in terms of their privacy policies, and that sufficient legal frameworks are in place and enforced to protect their privacy, users are expected to show less concern about privacy; otherwise their concerns are likely to be heightened (Lwin et al. 2007). Dinev and Hart (2004, 2006a) reported a negative relationship between perceived risk and intention to provide personal information while transacting on the Internet. These findings suggest the following.
Hypothesis 1 (H1): Privacy concerns are negatively correlated with the actual utility of ICT4D in Africa.


Figure 1: Proposed theoretical model, linking Privacy Concerns, Perceived use for surveillance and Perceived use for development initiative to the Actual utility of ICT in Africa through the hypothesized paths H1-H6.

Table 1. Summary of constructs and definitions
ICT usefulness: Actual utility of ICT in Africa (AUIA) - users' perception of ICT usefulness for general purposes in Africa.
Privacy, risk and beliefs: Privacy concerns of information misuse (PCIM) - worries about opportunistic behaviours related to personal information submitted on ICT platforms.
Fear beliefs: Perceived use for surveillance (PUS) - users' perception of ICT utility for the purpose of monitoring their personal information.
Sureness beliefs: Perceived use for development initiative (PUDI) - users' perception of ICT as being used solely for development initiatives and general wellbeing.

Perceived use for surveillance
Surveillance refers to any collection and processing of personal data, whether identifiable or not, for purposes of influencing or managing those whose data have been garnered (Lyon 2001, 2). New technologies often facilitate opportunities for surveillance owing to their powerful capability to generate and analyse personal data. The situation in Africa, however, is that new technologies and techniques are being deployed with no legal oversight. Investigation of the contradictory effects of the use of personal information for surveillance is completely neglected in developing countries (Eubanks 2014). Perceived use for surveillance is individuals' belief that governments, development actors, foreign aid donors and humanitarian organisations use ICT to increase malevolent surveillance activities (such as unfair discrimination, racial bias, tribal profiling, and disclosure of citizens' political affiliations), too often cloaked in development objectives such as national security or disease surveillance in the health sector. More specifically, in our model, it is defined as an individual's perception that development actors use ICT to gather personal information in order to stereotype individuals and widen inequity between them. The goal, they often claim impressively, is to classify each individual in context flexibly yet accurately, for example as a legitimate recipient of medical aid (drugs), a repeat offender or a terrorist, in order to determine to whom humanitarian aid should be applied. Perceived use for surveillance can be better understood if we derive it from racial discrimination. A warning of increased ethnic surveillance is captured in the South African and Ethiopian regimes (Kevin et al. 2016), which present a situation of systematic electronic apartheid, whereby the 'Valids' enjoy extraordinary social benefits, status and employment, while the 'InValids', or those conceived naturally or mistakenly, are deemed ineligible and constitute the marginalized underclass of the society in which they live. In Ethiopia, the ethnically based ruling government has used its power over communication technologies to perpetuate invasive monitoring and surveillance in order to maintain control of its population for 20 years (Human Rights Watch 2014). The sociologist Lyon (2003) has argued that this sort of discrimination, a condition under which individuals and/or groups are disadvantaged by virtue of their racial or ethnic composition, continues to receive serious attention, particularly in developing countries. To be sure, we used our model to assess the deepest fear of individuals (users of ICT) regarding their perception of the use of ICT to advance the cause of ill surveillance in Africa. Fear is a dominant factor in major domestic and neighbourhood concerns of the 21st century (Lyon 2003, 81). Fear involves uncertainty and the possibility of future peril. It is based on perceived threats and risks to security in the environment, including Internet infrastructure: social networking sites, electronic identity cards and mobile phones (Dinev, Hart 2007). Fear is the underlying factor for disbelief and doubt. Under this type of negative belief, users of ICT perceive that their private information could be sold illegally to third parties (government agencies, aid donors and advertising companies) and that their personal information could be used against them to an unanticipated extent. Holding these disbeliefs, ICT users do not feel obliged to provide their information when making use of ICT-mediated services. Users measure the risk of providing their information against the intentions of development actors in deploying ICT. These considerations bring us to the second and third hypotheses.
Hypothesis 2 (H2): Perceived use for surveillance is positively correlated with the actual utility of ICT in Africa.
Hypothesis 3 (H3): Perceived use for surveillance is negatively correlated with privacy concern.

Perceived use for development intervention
Our perception of development interventions is reported in two parts: development interventions as they relate to PUDI (health, education, economy) and development interventions as they relate to general wellbeing (PUDG). In order to satisfy users of ICT, business owners, institutions, citizens and governments, the outcomes achieved by development actors must match their original intentions in deploying communication technologies in Africa, namely to foster better governance, promote global public health, prevent sectarian unrest, alleviate poverty and human suffering, and support equitable economic growth (Center for Global Development). These development actors operate at the threshold of a significant inflexion point as they seek to answer questions associated with the dichotomy of digital technology usage in providing development in Africa. How many of the poorest developing countries that have made extensive use of telecommunications are on track to meet the UN development goals? The simple answer to this question is glaring. The situations in Nigeria and South Africa are apposite: while reportedly an estimated 50% of South Africa's 44 million population lives below the poverty line,16 there were 31 million cell phone users in the country in 2007, according to statistics supplied by the International Telecommunication Union.17 This finding, that the consumption rate of ICT is asymmetric to actual development, contrasts with the beliefs of credible sociologists.

16 CIA World Factbook – South Africa Country Report https://www.cia.gov/library/publications/the-world-factbook/print/sf.html
17 ITU (2008) 'ICT Statistics Newslog – Africa' http://www.itu.int/ITU-D/ict/newslog/CategoryView,category,Africa.aspx

Lyon (2003) insists that the more people use a certain product, the more is known about their consumption, and the more this is used as a guide both to what they will likely consume and to where incentives can be introduced to further encourage that consumption. These considerations suggest the following.
Hypothesis 4 (H4): Perceived use for development interventions is positively correlated with the actual utility of ICT4D in Africa.

Users who perceive governments and development actors as proactive will be less concerned that the government has access to their personal information (e.g. address, ethnicity and political affiliation) and specific racial inclination (Dinev, Hart 2006b). Those users assess their own privacy concerns and anxiety over personal risk from surveillance and juxtapose them against societal risk in general. Facing the need for compromise, users would prefer that their personal information be in the right hands of government rather than in criminal hands, and would thus be compliant with the need for information gathering (Culnan, Armstrong 1999). Provided that users feel the surveillance activity is performed in pursuit of development objectives (i.e. in an ethical and appropriate manner), they will be willing to share their personal information for this purpose. These deliberations suggest the following.
Hypothesis 5 (H5): Perceived use for surveillance is positively correlated with perceived use for development interventions.

Development initiatives intended to ensure the security and general wellbeing of citizens may also raise concerns about the potential side effect of broadening the scope for development actors to monitor citizens. If development actors are genuine, and sincere about their reasons for monitoring citizens' data, individuals are likely to perceive ICT as more of a development tool and consequently will be less worried about providing their private information to ICT platforms. The theory of reasoned action (TRA) (Fishbein, Madden, Albarracin 2001, 1975, 1992) has been used extensively as a basis for behavioural intentions. The TRA contends that behavioural intentions are antecedent to an individual's specific behaviours. More specifically, an individual's attitudes and perceptions of a certain ICT's utility will influence that individual's actions when he or she believes the ICT tool will be used for the intended outcome (Chang Liu et al. 2005). Furthermore, subjective norms, the social pressure to use or not to use a particular ICT tool, influence behavioural intentions, as determined by the individual's positive or negative evaluation of it. Based on this logic, the outcome of any tool geared towards development should influence an individual's perception of the utility of that tool, and in turn shape his or her behavioural intention to provide personal information on those platforms.
Hypothesis 6 (H6): Perceived use for development interventions is positively correlated with privacy concerns.


Analysis of findings from the questionnaire survey

Table 2. Descriptive statistics of survey respondents (N = 435)
Gender: Male 58.1%; Female 41.9%
Age: 0-18 3%; 19-25 45%; 26-30 15%; 31-40 29%; 41-50 4%; over 50 4%
Education: Under high school 1%; High school 8%; Undergraduate 39%; Postgraduate 47%; Doctoral 10%
Region: West 35%; North 16.3%; South 31%; East 17.7%
Language: Arabic 14.5%; English 42.1%; French 29.4%; Portuguese 14%
Computer skill: Novice 4.2%; Appreciation 20%; Proficient 57.1%; Expert 18.7%

Measurement assessment
The research model was tested using structural equation modelling with LISREL. LISREL is particularly suitable for theory development because it facilitates the simultaneous testing of measurement models with their items and of structural models. Unlike first-generation regression models and partial least squares (PLS), LISREL permits rigorous analysis of all the variance components of each observed variable (common, specific and error) as an integral part of assessing the structural model. We used a three-step approach to first assess the quality of our measures with the measurement items, sometimes referred to as confirmatory factor analysis (CFA). The steps are (1) individual item or construct reliability, (2) construct convergent validity and (3) construct discriminant validity. We then tested the hypotheses by estimating the structural paths of our model, also known as structural equation modelling (SEM). LISREL records the specific and error variance of the observed variables in the research model output (see Appendix B). All the items in the present study's model load heavily on their respective constructs, supporting the reliability and validity of the analysis.
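For readers without access to LISREL, the measurement-model step can be approximated with open tooling. The sketch below is a minimal illustration, assuming the Python package semopy (which accepts lavaan-style model syntax) and a data frame of the Appendix A item responses; it is not the authors' original LISREL syntax, and the file name is hypothetical.

```python
# Minimal CFA sketch, assuming the open-source semopy package and a pandas
# DataFrame whose columns are the survey items (PCIM1..AUIA3). This is an
# illustrative alternative to the LISREL run, not the authors' original syntax.
import pandas as pd
import semopy

measurement_model = """
PCIM =~ PCIM1 + PCIM2 + PCIM3 + PCIM4
PUDI =~ PUDI1 + PUDI2 + PUDI3 + PUDI4
PUDG =~ PUDG1 + PUDG2
PUS  =~ PUS1 + PUS2 + PUS3
AUIA =~ AUIA1 + AUIA2 + AUIA3
"""

df = pd.read_csv("survey_items.csv")       # hypothetical file of item responses
model = semopy.Model(measurement_model)    # specify the confirmatory factor model
model.fit(df)                              # estimate loadings and error variances
print(model.inspect())                     # parameter estimates, std. errors, p-values
print(semopy.calc_stats(model))            # fit indices (CFI, GFI, RMSEA, etc.)
```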

Item reliability
We obtained the reliability of the individual items by estimating Cronbach's alphas and the squared multiple correlations (R²) of the individual items (see Table 3). The Cronbach's alpha value should exceed .70 and the squared multiple correlation R² should exceed the recommended value of .25. The values from our model loadings are reported in Table 3 below. Cronbach's alpha values (obtained from the number of items in each measure and the average correlation between items) range from .76 to .92, demonstrating adequate internal consistency for our model. Furthermore, most of the R² values are higher than .25, providing evidence that the reliability criteria are met (Geftan et al. 2000).
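As an illustration of the reliability computation just described, the following sketch computes Cronbach's alpha for a single construct from raw item responses; the file and column names are hypothetical stand-ins for the Appendix A items.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of the scale total)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical usage: the four PCIM items from the survey data frame
# df = pd.read_csv("survey_items.csv")
# print(cronbach_alpha(df[["PCIM1", "PCIM2", "PCIM3", "PCIM4"]]))  # expected to exceed .70
```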

Construct convergent validity
The second step in assessing the model is construct convergent validity, which is established by (a) adequate model fit indices such as CFI, IFI, GFI, NFI, AGFI, χ² and RMSEA (see Table 5), and (b) high factor loadings and t-values. Average variance extracted (AVE) is estimated by summing the squared factor loadings and dividing by the number of indicators (items) for each measure. Composite (construct) reliability is obtained by summing all factor loadings and squaring this sum (SSI), and summing the error variances of the items (SEV); it is then computed as SSI / (SSI + SEV).
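These two quantities can be computed directly from the standardized loadings and error variances. The sketch below is a minimal illustration using the PCIM loadings reported in Table 3; when error variances are not supplied it assumes standardized items (error = 1 − λ²), so the composite reliability it returns may differ from the LISREL output.

```python
import numpy as np

def ave(loadings):
    """Average variance extracted: sum of squared standardized loadings / number of indicators."""
    loadings = np.asarray(loadings)
    return np.sum(loadings ** 2) / loadings.size

def composite_reliability(loadings, error_variances=None):
    """Composite reliability: (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)."""
    loadings = np.asarray(loadings)
    if error_variances is None:                 # assume standardized items: error = 1 - loading^2
        error_variances = 1 - loadings ** 2
    ssi = np.sum(loadings) ** 2
    sev = np.sum(error_variances)
    return ssi / (ssi + sev)

pcim_loadings = [0.90, 0.87, 0.78, 0.69]        # standardized PCIM loadings from Table 3
print(round(ave(pcim_loadings), 2))             # approx. .66, as reported in Table 3
print(round(composite_reliability(pcim_loadings), 2))
```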

Table 3. Confirmatory factor analysis statistics (standardized loadings with standard errors in parentheses)

PCIM (α = .80): composite reliability = .98, AVE = .66
  PCIM1: loading .90 (.04), t = 11.93, R² = .49
  PCIM2: loading .87 (.05), t = 25.94, R² = .63
  PCIM3: loading .78 (.04), t = 5.01, R² = .37
  PCIM4: loading .69 (.05), t = 5.02, R² = .76
PUDI (α = .86): composite reliability = .97, AVE = .63
  PUDI1: loading .52 (.06), t = 16.67, R² = .84
  PUDI2: loading .98 (.03), t = 9.77, R² = .37
  PUDI3: loading .81 (.05), t = 6.51, R² = .63
  PUDI4: loading .79 (.06), t = 6.56, R² = .25
PUDG (α = .81): composite reliability = .96, AVE = .80
  PUDG1: loading .85 (.04), t = 5.16, R² = .31
  PUDG2: loading .94 (.06), t = 4.67, R² = .26
PUS (α = .92): composite reliability = .97, AVE = .90
  PUS1: loading .76 (.06), t = 22.31, R² = .88
  PUS2: loading .88 (.06), t = 17.14, R² = .85
  PUS3: loading .93 (.05), t = 5.67, R² = .45
AUIA (α = .76): composite reliability = .97, AVE = .65
  AUIA1: loading .83 (.05), t = 12.70, R² = .27
  AUIA2: loading .90 (.03), t = 25.92, R² = .63
  AUIA3: loading .68 (.06), t = 5.55, R² = .50

For adequate construct convergent validity, the fit indices should exceed .70. In our model, all the fit indices are very strong (between .89 and .90), and the factor loadings exceed the benchmark (λ > .70) with large t-values, demonstrating convergent validity (see Appendix B). Also, all our estimated AVEs are above .50 (Fornell, Larcker 1981), indicating convergent validity, i.e. that the items are all correlated within their constructs.

Construct discriminant validity

The third step in assessing the measurement model is discriminant validity. For adequate discriminant validity, the correlations between pairs of constructs should be significantly different from unity. Looking at the table below, the largest correlation (.89) was between AUIA and PUS. The other correlations were smaller, with no confidence interval coming close to 1.00, indicating that each construct is significantly different from the others. Another criterion for assessing discriminant validity requires that the loadings of indicators on their respective latent variables be greater than the loadings of other indicators on those latent variables and than the loadings of those indicators on other latent variables. The loadings and cross-loadings show appropriate discriminant validity. The final step in assessing discriminant validity is to observe the diagonal and off-diagonal elements in Table 4. The off-diagonal elements represent the correlations between the latent variables, while the diagonal elements are the square roots of the average variance extracted (AVE) of the latent variables. This is demonstrated by the diagonal values for PCIM, PUDI, PUDG, PUS and AUIA (.66, .63, .80, .74 and .65 respectively). Discriminant validity is supported if items share more common variance with their own constructs than with others; this means that the diagonal elements should be greater than the corresponding off-diagonal ones. The figures in Table 4 satisfy this requirement.
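A compact way to check the Fornell-Larcker criterion described above is to compare the square root of each construct's AVE with its correlations with every other construct. The sketch below is a generic check; the correlation matrix file name is hypothetical, and the AVE values are those reported in Table 3.

```python
import numpy as np

def fornell_larcker_ok(corr: np.ndarray, ave: np.ndarray) -> bool:
    """Discriminant validity holds if sqrt(AVE) of each construct exceeds its
    absolute correlations with every other construct."""
    sqrt_ave = np.sqrt(ave)
    n = corr.shape[0]
    for i in range(n):
        for j in range(n):
            if i != j and abs(corr[i, j]) >= sqrt_ave[i]:
                return False
    return True

# Hypothetical usage with a 5x5 latent correlation matrix and the five AVEs
# corr = np.loadtxt("latent_correlations.csv", delimiter=",")
# ave = np.array([0.66, 0.63, 0.80, 0.90, 0.65])   # AVEs from Table 3
# print(fornell_larcker_ok(corr, ave))
```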

Common method variance test: Because the survey data were collected from a single source, we assessed the possibility of common method bias (Podsakoff et al. 2003). We performed the common method variance (CMV) test and found that the first factor explained 41% of the total variance. Richardson et al. (2009) note that when the results of this test show that one factor explains much less than 50% of the total variance, the method is sound and well represents the constructs presented in the model (an illustrative sketch of this check is given after Table 4 below).

Table 4. Latent variable statistics (standard errors in parentheses)
PCIM: mean 3.568, SD 1.259; PCIM .66
PUDI: mean 2.819, SD 1.703; PCIM .31 (.09); PUDI .63
PUDG: mean 3.577, SD 2.298; PCIM .32 (.10); PUDI -.54 (.05); PUDG .80
PUS: mean 2.273, SD 1.386; PCIM -.85 (.20); PUDI -.68 (.09); PUDG -.31 (.02); PUS .90
AUIA: mean 3.450, SD 1.324; PCIM -.51 (.06); PUDI .19 (.02); PUDG .29 (.03); PUS .89 (.07); AUIA .65
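The common method variance check can be approximated with a Harman-style single-factor test: extract the first (unrotated) component from all survey items and inspect the share of variance it explains. The sketch below uses scikit-learn's PCA on the raw items as a stand-in for the exact procedure used here; the file name is hypothetical.

```python
import pandas as pd
from sklearn.decomposition import PCA

def first_factor_variance_share(items: pd.DataFrame) -> float:
    """Share of total variance explained by the first principal component
    across all survey items (Harman-style single-factor check)."""
    pca = PCA()
    pca.fit(items.fillna(items.mean()))     # simple mean imputation for missing answers
    return pca.explained_variance_ratio_[0]

# Hypothetical usage over every measurement item in the survey data frame
# df = pd.read_csv("survey_items.csv")
# share = first_factor_variance_share(df)
# print(f"First factor explains {share:.0%} of total variance")  # well below 50% suggests CMV is not a major threat
```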

Table 5. Goodness-of-fit assessments and structural model
χ² (df): benchmark not significant; SEM model 2155.89
χ² per df: benchmark < 2.0; SEM model not reported
NFI: benchmark > .90; SEM model .93
CFI: benchmark > .90; SEM model .90
IFI: benchmark > .90; SEM model .95
RFI: benchmark > .90; SEM model .98
GFI: benchmark ≈ .90; SEM model .89
AGFI: benchmark > .80; SEM model .90
RMR: benchmark < .055; SEM model .019
RMSEA: benchmark < .233; SEM model 0.225


Figure 2: Standardized parameter estimates of the structural equation model. * p < .05; ** p < .01; *** p < .001

Structural model fit
The standardized path coefficients for testing the structural model are used to evaluate the hypothesized relationships and are shown in Figure 2. All the measures of fit were in the acceptable range and well above the minimum recommended values. Overall, the SEM analysis confirms that privacy concern for information misuse (PCIM) significantly (p < .05, < .01, < .001) influences the paths to AUIA, PUS and PUD, thus lending support to H1, H3 and H6. In addition, perceived use for development (PUD) has a positive effect on the actual utility of ICT in Africa (AUIA) at p < .05; hence H4 is supported. Our model, surprisingly, did not support H5. These results provide support for five of the six hypotheses (see Table 6).

Table 6. Summary of the results of hypothesis testing (SEM path coefficients)
H1: PCIM → AUIA, hypothesized (−), supported: YES; coefficient -0.51**
H2: PUS → AUIA, hypothesized (−), supported: YES; coefficient 0.89***
H3: PUS → PCIM, hypothesized (−), supported: YES; coefficient -0.85***
H4: PUD → AUIA, hypothesized (+), supported: YES; PUDI 0.19*, PUDG 0.21*
H5: PUS → PUD, hypothesized (+), supported: NO; PUDI -0.68***, PUDG -0.31*
H6: PCIM → PUD, hypothesized (+), supported: YES; PUDI 0.31*, PUDG 0.32*


Analysis of findings from scraping Twitter
This novel approach to data collection and analysis helped the researcher explore a target audience outside his sampling region. Computational methods, in theory, offer the potential to overcome some of the sampling and coding limitations of traditional content analysis (Lewis et al. 2013). First, with regard to sampling, this involved devising algorithmic measures to systematically gather the data of interest, in this case nearly 100,000 tweets published on Twitter on the topics of privacy, surveillance and ICT4D in Africa, a task too unwieldy for any human to accomplish manually. Secondly, with regard to coding, the Twitter data were collected, stored and analysed using the Python programming language and the pandas data analysis and plotting library (provided in the Anaconda software distribution). We identified and visualized the topics of interest and their frequencies of occurrence using infographic techniques. We next discuss these steps in detail.

Sampling of target audience
The researcher began by performing a Google search for curated lists containing the keywords surveillance, privacy, ICT4D and Africa. The results of this search are a series of Twitter lists that have been pre-curated to include influential and active people on surveillance, privacy and ICT4D in Africa. These lists contain the data (i.e. usernames) of the audience we are interested in. We used a data extraction website18 to collect the data from Twitter in an organized Excel format containing, in tabular form, the lists of Twitter members, cached status, the URL of each member, and each member's status message (see Appendix C for a screenshot). More than 20 pre-curated fields were gathered for every member. Using the URL of each member, we performed another search on the same website. The results of this second search contain the Twitter usernames of key influencers, policy makers and advocates of surveillance, privacy and ICT4D in Africa. To create a custom target audience on Twitter, a minimum of 500 usernames is needed; in total, about 1,000 usernames were obtained. The username is all that is needed to create a target audience on Twitter, which is the second step in sampling the audience. This second step involved creating a campaign on Twitter Ads (the Twitter targeted audience feature). We uploaded the file containing the username data into the Twitter targeted audience feature. We then sent custom-designed tweets to influencers in our field and allowed the discussion to populate over time, before analysing the Twitter feeds using Python visualization tools. By scraping Twitter data we sought to contribute to a growing body of work that uses social media data, specifically Twitter, to expand the range of content analysis.
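As an illustration of the audience-construction step, the sketch below consolidates exported member lists into one de-duplicated username file for upload to Twitter Ads; the file and column names are assumptions, since the actual import.io exports are not reproduced here.

```python
import pandas as pd

# Hypothetical export files from the curated Twitter lists (one spreadsheet per list)
list_files = ["surveillance_list.xlsx", "privacy_list.xlsx", "ict4d_africa_list.xlsx"]

frames = [pd.read_excel(path) for path in list_files]       # each assumed to contain a 'username' column
members = pd.concat(frames, ignore_index=True)

usernames = (members["username"]
             .str.strip().str.lstrip("@").str.lower()        # normalise handles
             .drop_duplicates())

print(f"{len(usernames)} unique usernames collected")        # Twitter Ads requires at least 500
usernames.to_csv("tailored_audience.csv", index=False, header=False)
```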

Coding of data
We used the Python programming language to code the scraped Twitter data. Our approach was built on the notion that Twitter is a social media data warehouse (Broersma et al. 2013). The Anaconda distribution and a GitHub library provided everything required for the coding. We first connected to our Twitter API page by passing four access parameters (API key, API secret, access token and access token secret). Our main goal in this tweet-mining task was to compare the frequencies of occurrence of surveillance, privacy and ICT4D and plot them on a chart for visualization. We used a time series to group the tweets by year, from 2010 to 2017 (see Figure 3), and plotted the frequency of tweets for each year; a minimal sketch of this step is given below, before Figure 3. Screenshots of each of these processes can be found in Appendix D. Looking at Figure 3, we can see that ICT4D received increasing attention from 2010 to 2015, rising to a frequency of 15 tweets, before declining below privacy and surveillance in the last two years (2016 to 2017). Although surveillance and privacy received less attention during the early years of ICT adoption in Africa, they grew over time without any decline.

18 https://www.import.io/

These results suggest that considerable benefits have accrued since the fast diffusion of mobile telecommunication on the continent. Such benefits include connectivity and access, service delivery platforms (in healthcare, agriculture, financial services, and governance) and improved market efficiency (Kelly and Minges, 2012). These transformations have yielded a large body of research and attention on the positive relationship between economic and social development and the adoption of ICT4D (Aker, Mbiti 2010). However, the surveillance and privacy implications of ICT use during that period (2010 to 2015) were largely overlooked. But with the introduction of SIM card registration in the majority of African countries, and with expansive monitoring by agencies, a key feature of Africa's emerging surveillance society, attention to surveillance and privacy has grown rapidly in recent years.
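The counting-and-plotting step behind Figure 3 can be sketched as follows, assuming the collected tweets were saved to a CSV file with 'created_at' and 'text' columns (these column names and the file name are assumptions, not the authors' exact schema).

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical export of the collected tweets
tweets = pd.read_csv("scraped_tweets.csv", parse_dates=["created_at"])
tweets["year"] = tweets["created_at"].dt.year

keywords = {"ICT4D": "ict4d", "Privacy": "privacy", "Surveillance": "surveillance"}

# Count, per year, how many tweets mention each keyword (case-insensitive substring match)
counts = pd.DataFrame({
    label: tweets[tweets["text"].str.contains(pattern, case=False, na=False)]
               .groupby("year").size()
    for label, pattern in keywords.items()
}).fillna(0)

counts.loc[2010:2017].plot(marker="o")     # restrict to the 2010-2017 window shown in Figure 3
plt.xlabel("Year")
plt.ylabel("Frequency of tweets")
plt.title("ICT4D, privacy and surveillance mentions over time")
plt.savefig("figure3_keyword_frequency.png", dpi=150)
```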

Figure 3: Chart visualizing the Twitter results (frequency of tweets per year, 2010-2017)

Summary of findings from interviews
Each interview session involved two participants and one moderator. We ensured that each moderator was completely bilingual (i.e. could speak his national language as well as English). The interviews were conducted in four languages: Arabic, English, French and Portuguese. We transcribed all interview responses into English for consistent analysis. For each question in the open-ended interview, the moderators identified patterns and grouped answers into three categories: surveillance, privacy and ICT4D. The interview questions were the same across the four languages. Interviews were conducted with three objectives in mind: (i) to obtain first-hand accounts of actual instances of privacy and surveillance abuse as a result of using ICT; (ii) to ask the participants to give reasons for their beliefs about ICT4D in Africa; and (iii) to check the validity of the results from the online questionnaire survey. The moderators started each session by asking the participants for their general view of phones, computers and the Internet since their inception. During the interviews, moderators avoided the terms "surveillance" and "privacy", as these would impose an entirely different conception on the participants. Rather, inferences were drawn from tactfully framed questions designed to elicit their views on privacy and surveillance issues. We then asked the participants whether they view ICT as having a revolutionary impact on development. To avoid bias, the moderators did not inquire directly about actual instances of abusive use of ICT unless the participant raised them first. We next asked the participants to rank ICT for development against ICT for surveillance. We tried to understand how much government surveillance the participants thought had occurred, as well as how safe and confidential they considered their data on ICT platforms. Each interview session lasted about 30-50 minutes. More than 50 participants were interviewed.

We start the analysis by reporting accounts of surveillance activities by government or aid agencies, and continue by presenting participants' general views of ICT for development in Africa.

50% of the participants from Northern Africa reported cases of unfair discrimination after diagnosis of HIV/AIDS by WHO agencies in those countries. Notably, a mother (anonymity prevents me from naming her or her country precisely) said her daughter was systematically expelled from a prestigious secondary school in the country before the commencement of the next academic session, a move she believed was motivated by the result of a free HIV/AIDS diagnosis carried out in that school using a mobile phone device. First, her daughter was not allowed to sit close to other students, especially children whose parents hold political positions in the country. Next, she was expelled for late payment of school fees, despite recording A's in her results.

Another participant, a former opposition party member in Ethiopia, divulged how his phone calls were tapped: "They took me into custody because we had a conversation over the phone about politics."

In Nigeria and South Africa, complaints centred on unsolicited, disturbing and dubious messages sent by agents of mobile phone service providers. 90% of Nigerian participants reported that the personal information they submitted upon registration and purchase of SIM cards has been repeatedly compromised. One participant in Nigeria said,

“I received, on average, 35 unsolicited messages every month from unidentifiable individuals requesting my ATM PIN and account number in order to resolve rubbish incomplete BVM registration.”

Another participant stated that whenever he purchases a recharge card for airtime from service providers, his account is immediately debited for services that he never subscribed to. As such he loses half of every recharge amount to unknown services and caller tunes.

Furthermore, on the ranking of ICT for development versus ICT for privacy abuse, 68% of the respondents simply or strongly agreed with the statement "ICT is more of a tool for privacy misuse than it is for development" (Mean = 1.6, Std. Dev. = 0.90, N = 50, where 1 is strongly disagree and 5 strongly agree).


Figure 4: Ranking of ICT4D versus Privacy Abuse (responses grouped by region: North, South, East, West)

Participants mentioned that the most annoying part of losing their personal information to third parties would not be the loss of data, but the hassle of receiving unsolicited SMS requesting their bank details to resolve some unfounded issues.

As other researchers have found, participants believed that ICT has made some significant changes in their lives. But they were quick to point out that the costs associated with using ICT services now seemingly outweigh the benefits. For example, one participant in South Africa said:

"I would rather prefer sending letters to relatives abroad via regular mail than email. The only reason why I would use email is speed of delivery. Cost is a serious issue because one doesn't just send emails; you need a computer, power supply and an internet connection to be able to do that. All this comes with serious costs."

However, what was regarded private or sensitive differed amongst participants and nationalities. The interview survey confirmed the result of the online questionnaire.


Result discussion
Many studies have been conducted to test the level of privacy and surveillance concerns among ICT users and how these concerns shape the willingness to provide personal information for transactions, notably Dinev, Hart and Muller (2008). None of these has focused on Africa and its marginal ICT users. Our research extends this line of work in the African context, adding the construct of ICT4D. Thus, our main goal was to develop and test, using multiple empirical approaches, the relationship between privacy and surveillance concerns and the usefulness of ICT in Africa to date. The findings show considerable support for our hypotheses; the six constructs indeed form a holistic framework for evaluating the utility of ICT4D in Africa. In the statistical analysis, the latent variables' psychometric properties exceeded the established benchmarks for reliability and validity (see Tables 3 and 4). The structural model supported five of our six hypotheses, with the exception of H5. In the social data analysis (scraping tweets of key influencers on surveillance, privacy and ICT4D), the chart (see Figure 3) shows that the concerns of ICT users in Africa regarding privacy and surveillance were not prominent in the earlier years (2010-2014), when ICT4D was more pronounced. Attention to privacy and surveillance began to grow significantly from 2015 onwards, while ICT4D became questionable, with a relatively significant drop. The results of our social data analysis are consistent with those of the statistical and interview analyses (see the interview and quantitative analysis sections).

Contributions and implications
This research attempted to better establish privacy and surveillance as factors determining the utility of ICT in Africa. While prior contentions have accounted for privacy and surveillance concerns, to the best of our knowledge this research is among the first to assess the utility of ICT in Africa through the lens of three elements (ICT4D, privacy and surveillance) by employing a more robust and rigorous empirical analysis. This is an important contribution.

According to the OECD (Organisation for Economic Co-operation and Development) conventional maxim, the "right to privacy is development": as part of development, governments must protect their citizens from invasive radicalism. This is particularly true in the case of terrorism and national security. Notwithstanding the traditional view that guides much research on the subject of surveillance and development in the context of technology use, recent research has shown that concern about intrusion by the authorities is positively related to citizens' privacy concerns (Dinev et al. 2008, p. 221). Consistent with these contentions, our findings clearly reveal the significance of privacy concerns, perceived use for surveillance and perceived use for development initiatives as antecedents to individuals' assessment of ICT utility in Africa. First, the results show that users' privacy concerns do have a negative effect on their perception of ICT's utility in Africa. This finding is consistent with prior findings and lends substantial support to the notion that privacy is valued as the highest of all individual rights, so any potential for misuse is treated with great scepticism. One potential avenue for information misuse is information discovery. This surely reflects a negative assessment of the utility of ICT as it relates to the availability of personal information on ICT platforms. Readers, however, should be cautioned against overestimating the negative impact of ICT4D as it relates to privacy concerns, given that the correlation between PCIM and AUIA is moderately weak (-0.51). Secondly, this study does not support the predominant claim that authorities' initiatives to improve security are important and, arguably, tolerable enough for users to willingly disclose their personal information (Hert et al. 2008). The initial impression was that security initiatives cannot be perceived as invasive, because "the justification for such initiative would decline" (Dinev et al. 2008). This has been demonstrably shown not to be the case in Africa. Perceived use for surveillance is positively related to the utility of ICT4D in Africa, and this supports our hypothesis. This is an interesting finding that deserves serious attention, because the correlation between PUS and AUIA is very strong (0.89). One interpretation could be the global perception of Africa as a dumping ground, which is particularly evident in the manner of deploying ICT without any legal oversight to safeguard its usage. Thus, perceived use for surveillance plays an important role in understanding how users view ICT4D, not least as a necessity, but more as a tool that has been utilized to the extreme because it operates above the law. Thirdly, the empirical support for perceived use for development was statistically significant in both parts (PUDI and PUDG). We consider this result relatively modest when compared with the correlation coefficient for the PUS → AUIA path. Readers should not be tempted to see this finding as conflicting with the second finding, as the correlations for both constructs are very weak (0.19 and 0.21). We therefore conclude that the utility of ICT4D in Africa is becoming more a tool for surveillance than for development.


Appendices

Appendix A: Items and Scales

Privacy Concern for Information Misuse (PCIM)
Prompt: How much do you agree with the following: "I am concerned that a person can find, as a result of using ICT4D in Africa, ..."
Scale: 1–5, disagree – strongly agree
PCIM1: My date and place of birth, telephone number and the names of my parents
PCIM2: My political affiliation, and the candidate I voted for in an election
PCIM3: I am concerned that the information I submit on any ICT-mediated platform could be misused
PCIM4: I am concerned about submitting information on any ICT-mediated platform because of what others might do with it

Perceived Utility for Development Initiatives (PUDI)
Prompt: How much do you agree with the transformational impact of ICT in the following sectors: health, education, legislation, economy and general wellbeing
Scale: 1–5, disagree – strongly agree
PUDIH1: Use of personal health information for unfair discrimination
PUDIH2: Use of personal health information to advance medical research
PUDIH3: Use of personal health information for timely diagnosis and prevention
PUDIE4: Use of ICT for CBT examination

General Wellbeing (PUDG)
PUDG1: In general, my need to use ICT for services is greater than my concern about its adverse effect
PUDG2: Use of ICT for research, study and teaching
PUDG3: I am more willing to pay tax because I gain more than I spend in using ICT for services
PUDG4: There is an existing framework to protect ICT privacy in Africa

Perceived Utility for Surveillance (PUS)
Prompt: How much do you agree with the use of ICT, by foreign agencies and African governments, for surveillance activities
Scale: 1–5, disagree – strongly agree
PUS1: The government uses ICT for national security purposes in a more intrusive way
PUS2: The government uses ICT to share citizens' personal information with foreign agencies
PUS3: Foreign agencies use their control over ICT to sell citizens' personal information to third parties (such as advertising companies)


Actual Utility of ICT in Africa (AUIA)
Prompt: To what extent do you believe that ICT has been utilized in Africa to effect development in the following ways?
Scale: 1–5, not at all – very much
AUIA1: Improving health service delivery
AUIA2: Improving education quality (teaching, learning and research)
AUIA3: Improving economic growth and political participation
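As an illustration of how the measurement items above relate to the reported construct correlations, the following is a minimal sketch of averaging the Likert items into composite construct scores and correlating them. The file responses.csv and its column names are assumptions for illustration only; the study itself estimates the measurement and structural models with LISREL rather than this composite-score shortcut.

# Illustrative composite scoring of the Appendix A items (1-5 Likert scale).
import pandas as pd

CONSTRUCTS = {
    "PCIM": ["PCIM1", "PCIM2", "PCIM3", "PCIM4"],
    "PUDI": ["PUDIH1", "PUDIH2", "PUDIH3", "PUDIE4"],
    "PUDG": ["PUDG1", "PUDG2", "PUDG3", "PUDG4"],
    "PUS":  ["PUS1", "PUS2", "PUS3"],
    "AUIA": ["AUIA1", "AUIA2", "AUIA3"],
}

# One row per respondent, one column per item (hypothetical file).
responses = pd.read_csv("responses.csv")

# Mean item score per construct, a simple proxy for the latent variables.
scores = pd.DataFrame({name: responses[items].mean(axis=1)
                       for name, items in CONSTRUCTS.items()})

# Pearson correlations between constructs, e.g. PCIM vs AUIA, PUS vs AUIA.
print(scores.corr().round(2))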

Appendix B: LISREL output file and t-values


Appendix C: import.io screenshot of Twitter members downloaded in Excel format
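For readers who wish to reproduce this step, the snippet below is a minimal sketch of loading such an import.io export for further analysis. The file name twitter_members.xlsx and its contents are assumptions, not the study's actual export.

# Load the hypothetical import.io Excel export (requires the openpyxl engine).
import pandas as pd

members = pd.read_excel("twitter_members.xlsx")
print(members.head())                    # inspect the captured influencer accounts
print(len(members), "accounts downloaded")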



References

Adams, D.A., Nelson, R.R., & Todd, P.A. 1992. "Perceived usefulness, ease of use and usage of information technology: A replication." MIS Quarterly 16(2), 227-247.
Anderson, R. 2014. "Privacy versus government surveillance: where network effects meet public choice." Proceedings of the 13th Annual Workshop on the Economics of Information Security (WEIS 2014).
Andrew, P., et al. 2012. "Global Survey on Internet Privacy and Freedom of Expression." United Nations Educational, Scientific and Cultural Organization (UNESCO), open access.
Beatty, D. 2005. The Ultimate Rule of Law. Oxford University Press, Oxford.
Brinkel, J., et al. 2014. "Mobile Phone-Based mHealth Approaches for Public Health Surveillance in Sub-Saharan Africa: A Systematic Review." International Journal of Environmental Research and Public Health 11, 11559-11582.
Brown, A.E., and Grant, G.G. 2010. "Highlighting the Duality of the ICT and Development Research Agenda." Information Technology for Development 16(2), 96-111.
Carmody, P. 2011. The New Scramble for Africa. Polity, Cambridge, UK.
Cavoukian, A., Chibba, M., & Stoianov, A. 2012. "Advances in Biometric Encryption: Taking Privacy by Design from Academic Research to Deployment." Review of Policy Research 29(1), 37-61.
Liu, C., Marchewka, J.T., Lu, J., & Yu, C.-S. 2005. "Beyond concern—a privacy-trust-behavioral intention model of electronic commerce." Information & Management 42, 289-304.
Chihanga, S., Tatarsky, A., et al. 2012. "Towards malaria elimination in Botswana: A pilot study to improve malaria diagnosis and surveillance using mobile technology." Malaria Journal 11, doi:10.1186/1475-2875-11-S1-P96.
Culnan, M.J., & Armstrong, P.K. 1999. "Information privacy concerns, procedural fairness, and impersonal trust: an empirical investigation." Organization Science 10(1), 104-115.
Albarracin, D., Johnson, B.T., Fishbein, M., & Muellerleile, P.A. 2001. "Theories of reasoned action and planned behavior as models of condom use: a meta-analysis." Psychological Bulletin 127(1), 142-161.
Lyon, D. (ed.). 2003. Surveillance as Social Sorting: Privacy, Risk, and Digital Discrimination. Routledge, London.
Lyon, D. 2002. "Everyday surveillance: Personal data and social classifications." Information, Communication & Society 5(2), 242-257.
Lyon, D. 2003. "Fear, Surveillance, and Consumption." Hedgehog Review 5(3), 81-95.
Lyon, D. 2007. Surveillance Studies: An Overview. Polity, Cambridge.
Lyon, D. 2001. Surveillance Society: Monitoring Everyday Life. Open University Press, Buckingham.
Davis, F.D. 1989. "Perceived usefulness, perceived ease of use, and user acceptance of information technology." MIS Quarterly 13(3), 319-339.


Dinev, T., & Hart, P. 2006. "An extended privacy calculus model for e-commerce transactions." Information Systems Research 17(1), 61-80.
Dinev, T., & Hart, P. 2006b. "Internet privacy concerns and social awareness as determinants of intention to transact." International Journal of Electronic Commerce 10(2), 7-31.
Donovan, K., & Martin, A. 2014. "The rise of African SIM registration: The emerging dynamics of regulatory change." Available at: http://firstmonday.org/ojs/index.php/fm/article/view/4351/3820 (accessed 12 September 2015).
Eubanks, V. 2014. "Want to predict the future of surveillance? Ask poor communities." The American Prospect, 15 January. Available at: http://prospect.org/article/want-predict-future-surveillance-ask-poor-communities (accessed 26 September 2014).
Gangadharan, S.P. 2015. "The downside of digital inclusion: expectations and experiences of privacy and surveillance among marginal internet users." New Media and Society. ISSN 1461-4448.
Graham, M., & Foster, C. 2014. "Geographies of Information Inequality in Sub-Saharan Africa."
Graham, M., and Mann, L. 2013. "Imagining a silicon savannah? Technological and conceptual connectivity in Kenya's BPO and software development sectors." The Electronic Journal of Information Systems in Developing Countries 56(2), 1-19.
HelpAge International. 2011. "Good practice in the development of management information systems for social protection." Available at: http://www.helpage.org/silo/files/good-practicein-the-development-of--management-information-systems-forsocial-protection.pdf
Hosein, G., and Nyst, C. 2013. "Aiding Surveillance: An Exploration of How Development and Humanitarian Aid Initiatives are Enabling Surveillance in Developing Countries." Available at SSRN: https://ssrn.com/abstract=2326229 or http://dx.doi.org/10.2139/ssrn.2326229
Neethling, J. 2005. "The Concept of Privacy in South African Law." The South African Law Journal 122(1), 18-28.
Lewis, S.C., Zamith, R., & Hermida, A. 2013. "Content Analysis in an Era of Big Data: A Hybrid Approach to Computational and Manual Methods." Journal of Broadcasting & Electronic Media 57(1), 34-52. doi:10.1080/08838151.2012.76170
Linden, A., and Fenn, J. 2003. "Understanding Gartner's hype cycles." Strategic Analysis Report No. R-20-1971. Gartner, Inc.
Lyon, D. 2010. "Surveillance, power and everyday life." In Emerging Digital Spaces in Contemporary Society. Palgrave Macmillan, Houndmills, UK, 107-120.
Fishbein, M., & Ajzen, I. 1975. Belief, Attitude, Intention, and Behavior: An Introduction to Theory and Research. Addison-Wesley, Reading, MA.
Madon, S., Reinhard, N., Roode, D., and Walsham, G. 2009. "Digital Inclusion Projects in Developing Countries: Processes of Institutionalization."
Margulis, S.T. 2003. "Privacy as a social issue and behavioral concept." Journal of Social Issues 59(2), 243-261.
Guerriero, M. 2015. "The impact of Internet connectivity on economic development in Sub-Saharan Africa." EPS-PEAKS, 1-27.
Lwin, M., Wirtz, J., & Williams, J.D. 2007. "Consumer online privacy concerns and responses: a power–responsibility equilibrium perspective." Journal of the Academy of Marketing Science 35, 572-585.
Friederici, N., Ojanperä, S., and Graham, M. 2017. "The impact of connectivity in Africa: Grand visions and the mirage of inclusive digital development." The Electronic Journal of Information Systems in Developing Countries 79(2), 1-20.


Osho, O., Yisa, V.L., & Jebutu, O.J. 2015. "E-voting in Nigeria: A survey of voters' perception of security and other trust factors." In 2015 International Conference on Cyberspace (CYBER-Abuja), pp. 202-211. IEEE.
Carmody, P. 2012. "The Informationalization of Poverty in Africa? Mobile Phones and Economic Structure." Annenberg School for Communication & Journalism, 8(3), 1-17.
Podsakoff, P.M., MacKenzie, S.B., Lee, J.-Y., and Podsakoff, N.P. 2003. "Common method biases in behavioral research: A critical review of the literature and recommended remedies." Journal of Applied Psychology 88(5), 879-903.
Smart, C., Donner, J., & Graham, M. 2016. "'Connecting the world from the sky': Spatial discourses around Internet access in the developing world." In Proceedings of the Eighth International Conference on Information and Communication Technologies and Development (p. 18). ACM. http://dx.doi.org/10.1145/2909609.2909659
Solove, D. 2011. Nothing to Hide: The False Tradeoff between Privacy and Security. Yale University Press.
Graham, S. 2002. "Technology, Place, and Planning: Looking Beyond the Hype." Planning Theory and Practice 3(2), 221-244.
Madden, T.J., Ellen, P.S., and Ajzen, I. 1992. "A comparison of the theory of planned behavior and the theory of reasoned action." Personality and Social Psychology Bulletin 18(1), 3-9.
Taylor, S., & Todd, P. 1995. "Understanding information technology usage: A test of competing models." Information Systems Research 6(2), 144-176.
Walsham, G. 2005. "Development, Global Futures and IS Research: A Polemic." Journal of Strategic Information Systems 14(1), 5-15.
Walsham, G. 2012. "Are We Making a Better World with ICTs? Reflections on a Future Agenda for the IS Field." Journal of Information Technology 27(2), 87-93.
Warren, S., and Brandeis, L. 1890. "The Right to Privacy." Harvard Law Review 4(5), 193-220.
Westin, A.F. 2003. "Social and political dimensions of privacy." Journal of Social Issues 59(2), 431-453.
World Bank. 2012. "ICT for Greater Development Impact: World Bank Group Strategy for Information and Communication Technology." The World Bank Group, Washington, DC.
Wu, J.-H., & Wang, S.-C. 2005. "What drives mobile commerce? An empirical evaluation of the revised technology acceptance model." Information & Management 42(5), 719-729.
Yang, K. 2005. "Exploring factors affecting the adoption of mobile commerce in Singapore." Telematics and Informatics 22(3), 257-277.
Zhenhjiang, Wang, Tan, and Yu. 2016. "The Determinants and Impacts of Aesthetics in Users' First Interaction with Websites." Journal of Management Information Systems 33(1), 229-259.
