Oxford Internet Institute – written evidence (DAD0060)

Written evidence submitted by Dr Victoria Nash, Dr Taha Yasseri and Dr Chico Camargo on behalf of the Oxford Internet Institute, University of Oxford.

The Oxford Internet Institute is a multi-disciplinary department conducting research and teaching on the social, economic and political implications of digital technologies, and it is our pleasure to provide written evidence in response to this important call. Please note that the responses below are compiled from research conducted by multiple faculty members and do not reflect a single, over-arching departmental position.

General 1. How has digital technology changed the way that democracy works in the UK and has this been a net positive or negative effect?

It may yet be too early to assess the effects of digital technologies on democracy in the UK, not least because the tools and devices we use are changing rapidly. However, as discussed in a key recent text by three OII faculty, Political Turbulence (Margetts, John, Hale, and Yasseri, 2015), the main societal effect of digital technology, and specifically its effect on political behaviour, has been to transform the nature of political engagement. Through actions such as liking, commenting on or sharing content, social media enable tiny acts of participation that can scale up and lead to tangible political change. These actions may, but do not necessarily, develop into larger collective action campaigns. It is hard to predict which will grow and sustain deeper engagement, but our research has shown that the dynamics of digitally driven collective action are fundamentally different from those of collective action campaigns before the age of the Internet (Yasseri, Hale, and Margetts, 2017). This calls for more research, discussion, and legislation.

2. How have the design of algorithms used by social media platforms shaped democratic debate? To what extent should there be greater accountability for the design of these algorithms?

It is not merely the design of algorithms that shapes online democratic debate: other choices in platform design, and how those choices are implemented, also make a significant difference. The way platforms encourage or discourage interactions between users matters, as do decisions regarding content moderation and content promotion. On some platforms the design permits considerable segmentation and polarisation (Twitter), whilst on others it encourages collaboration and consensus. Algorithms can determine how much social influence is channelled through each interaction, for example by promoting the content of popular individuals above that of individuals with few followers or small networks. Algorithms determine who sees what and in what order, often on the basis of how users have engaged with previous content, regardless of whether this engagement was positive or negative, constructive or toxic, fact-based or misinformed. This creates an ecosystem dominated by high-engagement content, which very often appeals to emotion or rests on false claims. It also leaves users with little to no control over their algorithmically sorted content or interactions, often without much indication of how or why such selection occurs. There is a great deal of public and academic debate as to the extent and possible democratic implications of ‘filter bubble effects’. The mixed academic findings suggest that we should not automatically assume social media will reinforce existing beliefs or increase partisanship. The annual Digital News Surveys conducted by Oxford colleagues at the Reuters Institute for the Study of Journalism have previously noted that social media users consume a more diverse array of news sources than those who do not use social media (Newman et al 2017), whilst a recent survey of UK Internet users conducted by OII’s Dr Grant Blank concluded that only a small portion of the population are likely to find themselves in such online political echo chambers (Dubois and Blank 2018).

Education 3. What role should every stage of education play in helping to create a healthy, active, digitally literate democracy?

Education clearly has a crucial role to play, and we welcome initiatives such as Ofcom’s new Making Sense of Media programme, which seeks to improve digital skills among UK children and adults. In considering how best to maximise the benefits of current and future educational initiatives supporting digital literacy, the Committee should note the research findings of OII colleague Prof. Rebecca Eynon, whose work has consistently revealed persistent inequalities in young people’s ability to access and use the Internet (e.g. Wilkin, Davies and Eynon 2017). Contrary to the ‘digital natives’ narrative, this empirical research suggests that a small but significant proportion of young people still struggle to gain the digital skills necessary to engage fully in modern society. Whilst considering the democratic benefits of new forms of online engagement, the Committee might thus also want to consider the growing opportunity cost borne by those who lack the skills to capitalise on them, and ask how this could be mitigated. It is also important to assess the actual effectiveness of educational measures put in place to tackle digital illiteracy, such as the implementation of England’s new National Curriculum for computing as a replacement for the previous ICT programme. In recent research, our OII colleague Dr Laura Larke has highlighted the need to work with teachers to develop National Curriculum standards for computing that actually meet students’ needs and schools’ resources (Larke 2019).

Online campaigning 4. Would greater transparency in the online spending and campaigning of political groups improve the electoral process in the UK by ensuring accountability, and if so what should this transparency look like?

Online campaigning is no longer separable from any other type of campaigning (McElwee and Yasseri, 2017). In principle, existing rules and conventions governing election spending and campaigning should be applied to online campaigning. However, given growing public and political concern about irregularities in campaign behaviour, and uncertainty around the scale and implications of new forms of political advertising, the OII established a Commission of experts to provide a clear framework for understanding the technological mechanisms, algorithms and big data practices that could undermine the mechanics of democracy. This Commission, the Oxford Technology and Elections Commission (OxTEC), will deliver its report on 16th October, and this will likely include specific recommendations on measures to improve accountability and transparency. An OxTEC background report summarising existing research literature on elections and campaigning in the digital age may also be of interest to this Committee (Thwaite 2019).

5. What effect does online targeted advertising have on the political process, and what effects could it have in the future? Should there be additional regulation of political advertising?

The OII has not conducted any research into the efficacy of targeted advertising in election campaigns or other political contexts. However, Prof. Sandra Wachter has drawn attention to relevant aspects of behavioural advertising which may merit more regulatory attention, regardless of the political effects. In a forthcoming article (Wachter, forthcoming), she highlights three possible legal challenges arising from affinity profiling in such advertising, by which she means the practice of grouping people according to assumed interests rather than their personal traits. Such assumed interests are effectively inferred from individuals’ online data traces and habits, and this practice may give rise to legal concerns relating to privacy, non-discrimination and group-level protection. Additional recommendations are likely to be included in the final OxTEC report due on 16th October.

Privacy and anonymity 6. To what extent does increasing use of encrypted messaging and private groups present a challenge to the democratic process?

We do not currently have any research results to report on this question, but the 2019 RISJ Digital News Report does include a relevant section, albeit one focused on countries such as Brazil, India and South Africa, where private messaging apps have become an increasingly important way of communicating and discussing news. Its data suggest that although news group users do tend to be more partisan, just 8% of UK Internet users use Facebook groups for news and politics, and just 2% currently use WhatsApp groups for this purpose (Newman et al 2019, p. 19).

7. What are the positive or negative effects of anonymity on online democratic discourse?


Democratic debate 8. To what extent does social media negatively shape public debate, either through encouraging polarisation or through abuse deterring individuals from engaging in public life?

OII researchers Lisa-Maria Neudert and Nahema Marchal have produced a report for the European Parliament summarising research into social media and political polarisation (Neudert & Marchal 2019). Their report identifies two key ways in which social media can encourage polarisation: through the technological design of platforms and apps, and through explicit manipulation by political actors. Three counter-trends are also highlighted, however, suggesting that social media are not inherently polarising. Social media can be used to counter polarisation by enabling activism and political mobilisation, through their potential to nudge users towards positive political behaviours and engagement, and through advances in AI which may gradually promote greater transparency and accountability.

9. To what extent do you think that there are those who are using social media to attempt to undermine trust in the democratic process and in democratic institutions; and what might be the best ways to combat this and strengthen faith in democracy?

The OII research project on Computational Propaganda has produced a substantial body of research identifying and measuring the spread of online disinformation. The most recent report offers an inventory of organised social media manipulation worldwide, providing evidence of such campaigns in 70 countries, including 28 where these tools are used to suppress human rights, discredit political opponents or quash dissent (Bradshaw & Howard 2019). In relation to the UK, the report notes evidence of organised social media accounts being used to promote the government, promote the opposition, distract attention and encourage political division.

Misinformation 10. What might be the best ways of reducing the effects of misinformation on social media platforms?

In addition to the report mentioned above, the OII has published two inventories of international policy approaches to tackling online disinformation and its political effects (Bradshaw et al 2019; Robinson et al 2019). These reports do not promote a particular policy intervention for tackling disinformation, but the array of strategies identified suggests that different policy responses may be appropriate in different political contexts and cultures. It seems plausible that multiple interventions targeting different social and political actors, akin to a ‘public health’ approach, will be necessary to tackle the problem.

Moderation 11. How could the moderation processes of large technology companies be improved to better tackle abuse and misinformation, as well as helping public debate flourish?


Technology and democratic engagement 12. How could the Government better support the positive work of civil society organisations using technology to facilitate engagement with democratic processes?

13. How can elected representatives use technology to engage with the public in local and national decision making? What can Parliament and Government do to better use technology to support democratic engagement and ensure the efficacy of the democratic process?

User-generated content on social media should be recognised as a potentially valuable source of information for gauging public opinion on a policy or service provision. OII faculty research on social media discussion of Universal Credit (Bright, Margetts, Hale and Yasseri, 2014), as well as our analysis of all petition activity and content on the UK Parliament petitioning website (Vidgen and Yasseri 2019), has demonstrated how information harvested from public discourse on social media can facilitate more informed policy making. Given that research from our faculty indicates an increase in the number and diversity of issues considered important by the public, digital channels offer flexible and responsive means for effectively engaging with an ever more fragmented public (Camargo, Hale, John and Margetts, 2018).

14. What positive examples are there of technology being used to enhance democracy?

References

Bradshaw, S. & Howard, P.N. (2019). The Global Disinformation Order: 2019 Global Inventory of Organised Social Media Manipulation. Working Paper 2019.2. Oxford, UK: Project on Computational Propaganda.

Bradshaw, S., Neudert, L-M., & Howard, P.N. (2019). Government Responses to Malicious Use of Social Media. Working Paper 2019.2. Oxford, UK: Project on Computational Propaganda.

Bright, J., Margetts, H., Hale, S., & Yasseri, T. (2014). The use of social media for research and analysis: a feasibility study. A report of research carried out by the Oxford Internet Institute on behalf of the Department for Work and Pensions.

Camargo, C. Q., Hale, S. A., John, P., & Margetts, H. Z. (2018). Volatility in the Issue Attention Economy. arXiv preprint arXiv:1808.09037.

Dubois, E. & Blank, G. (2018). The echo chamber is overstated: the moderating effect of political interest and diverse media. Information, Communication & Society, 21:5, 729-745.

Larke, L. R. (2019). Agentic neglect: Teachers as gatekeepers of England’s national computing curriculum. British Journal of Educational Technology, 50: 1137-1150.

Margetts, H., John, P., Hale, S., & Yasseri, T. (2015). Political turbulence: How social media shape collective action. Princeton University Press.

McElwee, L., & Yasseri, T. (2017). Social Media, Money, and Politics: Campaign Finance in the 2016 US Congressional Cycle. arXiv preprint arXiv:1711.10380.

Neudert, L-M., & Marchal, N. (2019). Polarisation and the use of technology in political campaigns and communication. Panel for the Future of Science and Technology, European Parliament.

Newman, N., Fletcher, R., Kalogeropoulos, A., and Nielsen, R.K. (2019). Reuters Institute digital news report 2019. Available at: http://www.digitalnewsreport.org/

Newman, N., Fletcher, R., Kalogeropoulos, A., Levy, D.A.L., and Nielsen, R.K. (2017). Reuters Institute digital news report 2017. Available at: http://www.digitalnewsreport.org/

Robinson, O., Coleman, A., & Sardarizadeh, S. (2019). A report of anti-disinformation initiatives. OxTEC, University of Oxford. Available at: https://oxtec.oii.ox.ac.uk/wp-content/uploads/sites/115/2019/08/OxTEC-Anti-Disinformation-Initiatives-1.pdf

Thwaite, A. (2019). Literature review on elections, political campaigning and democracy. OxTEC, University of Oxford. Available at: https://oxtec.oii.ox.ac.uk/wp-content/uploads/sites/115/2019/09/OxTEC-Literature-Review-Alice-Thwaite-Report-25-09-19.pdf

Vidgen, B., & Yasseri, T. (2019). What, When and Where of petitions submitted to the UK Government during a time of chaos. arXiv preprint arXiv:1907.01536.

Wachter, S. (forthcoming). Affinity profiling and discrimination by association in online behavioural advertising. Berkeley Technology Law Journal, 35:2.

Wilkin, S., Davies, H. & Eynon, R. (2017). Addressing digital inequalities amongst young people: conflicting discourses and complex outcomes. Oxford Review of Education, 43:3, 332-347.

Yasseri, T., Hale, S. A., & Margetts, H. Z. (2017). Rapid rise and decay in petition signing. EPJ Data Science, 6(1), 20.
