International Journal of Pedagogy, Innovation and New Technologies
journal homepage: http://www.ijpint.com
ISSN: 2392-0092, Vol. 7, No. 1, 2020, pp. 11-15
DOI: 10.5604/01.3001.0014.4453

The phenomenon of the filter bubble as a threat to freedom on the Web

Jakub Czopek, Ph.D., Assistant Professor, Institute of Pedagogy, College of Social Sciences, University of Rzeszów, Rzeszów, Poland
CONTACT: E-mail: [email protected]

Abstract: The internet is usually presented as a medium that gives unlimited freedom to the user. The main purpose of this article is to characterize the concept of the filter bubble. It describes the mechanism of filtering information, which can significantly affect the type of content that Internet users encounter on the Web. By limiting content to that which is compatible with the worldview of the person seeking information, the filter bubble can significantly limit the freedom of seeking information. The article also presents several ways to limit the negative impact of this phenomenon.

Keywords: gate-keeping, personalization, cyber-security

The Internet's characteristics make it embody the dream of a truly free medium, independent of any decision-making center and devoted entirely to the people using it. In a short time, the Web has become the basic tool for entertainment, contact with others, and gaining knowledge about the world (GUS, 2018).

In principle, since its inception, professional media information forwarding has meant providing recipients with a message unified and identical for each of them. Only the emergence and development of social media has brought the possibility of personalizing the message. The technical possibilities of these new means of providing information, and the individualized way of using them by users, being their immanent feature, reversed this trend from the very beginning and led to a situation in which everyone can receive (at least in theory) a different set of information. The use of appropriate algorithms by website operators, combined with tracking user behavior on a given website (for example by allowing cookies), allows personalization of the message and its adaptation to the expectations of the recipient. This phenomenon could already be observed in 2002, on the occasion of the debut of Google News, which was the first to rank individual messages without human participation. Digital protocols and "algorithmic gate-keepers" (Napoli, 2015, p. 753) have now become a key element that determines the way in which information is passed on to users on the Web. Technology, so far perceived as neutral, now "is becoming another gate, a doorman, determining the quality and form of information reaching us" (Szpunar, 2017, p. 27), which makes it legitimate to talk about technological gatekeeping as a new form of this phenomenon.

The first, natural reaction to such a development of the media was the enthusiasm of both recipients and broadcasters of online news. Who would not like to receive a daily portion of information prepared especially for him, containing only what really interests him? Not without significance was also the decreasing amount of time that people can devote to this type of activity every day. For the latter, the idea of reaching the recipient with information ideally "tailored to size" was an opportunity to gain a competitive advantage over other players on the market.

Over time, however, doubts and concerns arose as to whether the personalization of information limits, in a sense, the diversity of content that recipients have to deal with. This lack of diversity, boiling down primarily

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International (CC BY-NC 4.0) license.

to providing recipients with content consistent with their worldview, may in turn affect openness to new points of view and consequently lead to a deterioration in the quality of the public sphere. The first signals of this type (Sunstein, 2002) appeared in the literature along with the emergence of social media (in 2001), today so strongly identified with this phenomenon. The breakthrough was the year 2011, when the concept of a "filter bubble" was born, which in a simple way was supposed to capture the situation in which today's recipient finds himself (Pariser, 2011). The author defined it as the "unique universe of information" in which everyone lives (Pariser, 2011, p. 12). The creation of information-oriented communities was noticed by the European Commission's High Level Expert Group on Media Diversity and Pluralism, which wrote in a 2013 report: "Increasing filtering mechanisms make it more likely for people to only get news on subjects they are interested in, and with the perspective they identify with. (…) It will also tend to create more insulated communities as isolated subsets within the overall public sphere. (…) Such developments undoubtedly have a potentially negative impact on democracy." (Vīķe-Freiberga, Däubler-Gmelin, Hammersley, & Pessoa Maduro, 2013, p. 27). It is worth noting that while the very concept of the filter bubble (or, more broadly, the information bubble) is relatively new, in a broader context it describes a phenomenon that has been known for some time. Nicholas Negroponte wrote in 1995 about the "personal newspaper of the future", in which everyone will find only the information that interests him (Negroponte, 1995, p. 153). However, the definition of an internet filter bubble proposed by Pariser significantly narrows its understanding.
First of all, it is personalized and therefore adapted to each user. In other words, each user lives in a different bubble. Two types of personalization should be distinguished here: so-called self-selected personalization and pre-selected personalization (Thurman & Schifferes, 2012). The first is a conscious choice of the user and involves his deliberate selection of content corresponding to his expectations. In media studies, it is well known that viewers (as well as readers and radio listeners) choose a TV station (newspaper, radio station) that fits their views and supports their values, which allows them to avoid cognitive dissonance (Berger, 2014, p. 127). The second type of personalization seems to be somewhat more dangerous, as the decision regarding the content the recipient will interact with is not an autonomous decision of the individual but falls to the content sender. And it is this second type of personalization that is characteristic of the filter bubble. Secondly, the bubble is invisible, so the vast majority of users are not aware of its existence. Thirdly, the bubble begins to form automatically the moment the user decides to use the Web to search for information. Appropriate algorithms record each virtual step of the user and on this basis predict what kind of content will be the most suitable for him in the next steps. This is the case, for example, with the content filtering mechanism on Facebook (Malinowski, 2016) or the way specific answers are provided in the Google search engine (Pariser, 2011, p. 5-6). These tools have obviously been developed to simplify the search for information and shorten the time devoted to finding it, so we can talk about the generally noble motives of their creators, who were guided by the desire to provide users with the greatest convenience in accessing information. In the long run, however, they help to create an information bubble around the user.
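The automatic loop described above (logging each of the user's virtual steps and ranking new content by its similarity to past behavior) can be sketched in a few lines of Python. This is a deliberately naive illustration with hypothetical items and a bag-of-topics score, not the actual mechanism of Facebook or Google:

```python
from collections import Counter

# Hypothetical news items, each tagged with topics
# (real recommender systems use far richer feature vectors).
ITEMS = {
    "a1": {"climate", "science"},
    "a2": {"climate", "politics"},
    "a3": {"sports"},
    "a4": {"science", "space"},
}

def rank_for(history: Counter) -> list:
    """Rank items by overlap with the topics the user has clicked before."""
    def score(item_id):
        return sum(history[topic] for topic in ITEMS[item_id])
    return sorted(ITEMS, key=score, reverse=True)

# A user who once clicked two climate stories...
history = Counter()
for clicked in ("a1", "a2"):
    history.update(ITEMS[clicked])

# ...is now shown climate content first, while "sports" sinks to the bottom.
print(rank_for(history))
```

Even this toy loop closes the bubble: after two clicks on climate stories, climate content dominates the ranking and the unrelated topic drops to last place, so the user becomes ever less likely to encounter an information bubble's outside.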
This involves several negative implications that put into question the unrestricted freedom of internet users in using the Web. In recent years, the percentage of users who use the Internet to obtain news has been steadily increasing. Importantly, more than half of them (55%) prefer to access news through search engines, social media or content aggregators rather than directly through the website of a given information provider (e.g. a newspaper); the direct form is chosen by only 29% of respondents (Newman et al., 2019, p. 14). It is this group of 55% of Internet users that is more exposed to the information bubble, as the information

sources they choose play a more important role in the automated procedures for the selection and hierarchization of individual news items. It is a truism to say that the Internet, and especially social media, is a window that widens the world: a few clicks will help us find information on any topic that interests us. However, the information bubble makes every Internet user look at this world through a slightly different window, which makes him see it differently than others. Pariser, in his first book dedicated to this phenomenon, evokes quite a simple experiment (Pariser, 2011, p. 6). In April 2010, there was a fire on the Deepwater Horizon oil rig in the Gulf of Mexico,

resulting in an enormous amount of oil entering the sea. The platform was leased by the BP group. Pariser asked two of his female friends to enter the name "BP" in the search engine. Differences already appeared in the number of results of these searches: 139 million and 180 million records, respectively. What is more, they

also concerned the order of the records: while one of the women on the first page of results received only links to information related to the Deepwater Horizon disaster, the other saw only general BP investment information and its advertising. Both women belonged to the middle class, had similar political views and lived in the northeast of the USA. Despite the small demographic differences between them, the results they received were different. At this point, the question arises: what if people from different backgrounds, with different beliefs, levels of education, etc., are looking for the same information? It is not difficult to imagine a situation in which, when asked about the effectiveness of vaccines, the search engine will suggest radically different results to activists involved in the anti-vaccine movement and to parents expecting a child who are searching for this type of information on the internet for the first time. Responses to the search phrase "climate change" will, for some users, be a collection of alarming texts about human activity destroying the Earth, and for others a directory of pseudo-scientific studies denying the existence of the greenhouse effect. In such a situation, it is difficult to talk about widening horizons and opening up to new ideas and concepts. On the contrary: there is a tendency to limit and close users in specific "comfort zones", in which they encounter only non-controversial information that is congruent with their worldview. The existence of the information bubble puts into question the possibility of unrestricted contact and exchange of ideas with others.
This is what social media was supposed to be used for: connecting people into groups around common themes and interests, with the freedom to join such groups on the principle that "everyone will find something for themselves". In fact, the filter bubble does connect us to others, but only to those who think similarly. Based on likes, links shared in our Facebook newsfeed and our overall behavior on a given social networking site, it suggests to us the websites or groups that suit us the most. There we find users who (according to the algorithm) are similar to us. Thus, the possibility of getting to know the "other", understood as someone representing different views and a different system of values, but valuable nonetheless, becomes limited. In the literature on the subject, at the end of the 20th century, the concept of "Internet balkanization" appeared (Sagawa, 1997; see also: Kuner et al., 2015; Mueller, 2017). Initially, it referred to the situation in which the Internet, treated as a unity, begins to divide into closed (or limited-access) parts, mainly due to the impact of political, technological or business factors. An example is the Internet operating in China, where extensive censorship tools prevent access to such applications as Facebook or Google. Likewise, the governments of countries such as Cuba want to limit the influence of these American companies on their citizens. E. Morozov called this new dimension of the Network a "splinternet" (Morozov, 2010). The same phenomenon, described mainly in the macro dimension, can also be applied on a micro scale. It can be illustrated by the characteristics of "cybertribes", described by Paweł Matuszewski (2018). Among these isolated communities there is often a polarization of beliefs resulting from specific ways of obtaining information. Thus, two (or more) groups are formed which are somehow closed in their own worlds, and it rarely comes to a situation in which their members enter into deeper interactions.
And when such interactions do occur, they are based on competition and antagonism rather than on a willingness to cooperate. The existence of such a phenomenon raises questions about the nature of the modern Internet: is it an open platform, free and uncontrolled, accessible to everyone without any artificial divisions? Or are we dealing with a theoretically unlimited virtual space which is, however, divided into "information spheres", access to which is determined by an algorithm? The Internet functioning in the latter way connects the user only with similarly thinking individuals. One can venture to say that in this way it also divides him from everyone else. Thus, the Web ceases to be one global agora, where everyone can try to convince everyone else of their views, and heads towards an archipelago of network islands inhabited by "cybertribes" with different worldviews.
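The drift towards such "network islands" can be illustrated with a toy model (the users and interests here are hypothetical): if a platform suggests connections only between users whose declared interests overlap, the resulting social graph falls apart into disconnected, like-minded clusters.

```python
# Toy sketch of the "network islands" effect: connections are suggested
# only above an interest-overlap threshold, so the graph fragments.
USERS = {
    "ann": {"climate", "science"},
    "ben": {"climate", "politics"},
    "cho": {"sports", "cars"},
    "dee": {"cars", "music"},
}

def suggested_edges(users, threshold=1):
    """Connect two users only if their shared interests meet the threshold."""
    names = sorted(users)
    return [(a, b)
            for i, a in enumerate(names)
            for b in names[i + 1:]
            if len(users[a] & users[b]) >= threshold]

# Two separate islands emerge: {ann, ben} and {cho, dee}; no edge bridges them.
print(suggested_edges(USERS))
```

Even with four users the model splits into two components with no path between them, which is the micro-scale analogue of the balkanization described above.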

Considering the balkanization of the Internet on the socio-political level can of course lead to negative conclusions about the fragmentation of the Internet community. However, a glance from the technological perspective allows us to notice some advantages of such a direction of the Web's development. In autumn 2018, the French National Assembly and the Ministry of the Armed Forces announced that they were leaving Google as the default search engine and instead choosing Qwant, the result of joint work by French and German IT specialists. This decision was motivated by security reasons related to breaking with the monopoly of the Californian concern (Goujard, 2018). This relatively simple technical change can help to become independent of the solutions dictated by global concerns and be one of the first steps towards France's digital sovereignty. The introduction of similar solutions in other European countries would help in creating a counterbalance to American or Chinese IT companies.

It can be assumed that the majority of Internet users are not aware of the existence of something like the filter bubble. As reported by the Pew Research Center (Smith, 2018), over half (53%) of Facebook users do not understand why specific news posts appear in their newsfeeds, and for users over 50 years of age this percentage is 60%. What is more, only one third of users have tried to influence the content displayed and modify it in some way. An important variable in this case is the age of the respondents: in the youngest group (18-29 years) such attempts were undertaken by almost half (48%), while among the oldest (65+) by only 19%. IT competences, understood as the ability to use electronic devices and the Internet, seem to be crucial here. Undoubtedly, the existence of the filter bubble is a harmful phenomenon which can significantly contribute to lowering the level of public debate and public information. Traditional journalism is still associated with the ethics of the profession and with a person who is aware of his social role. In the case of providing information online, on the other hand, the major role is played by a machine, for which morality and ethics are meaningless concepts. One should therefore consider how to limit this negative impact of content filtering algorithms. Bartosz Malinowski draws attention to two levels at which limiting Facebook's filter bubble is possible (Malinowski, 2016, p. 20-21). On the one hand, there are individual solutions that can limit its negative impact at the level of settings: changing the filtering option from "most interesting" to "latest", selecting "receive notifications", or sorting and classifying pages according to their importance.
The same is true for limiting the impact of the information bubble associated with the use of Google and the products offered by the company. An interesting solution is to install the Chrome browser plug-in called "Escape Your Bubble", which inserts into Facebook's newsfeed links to articles presenting a worldview contrary to the user's beliefs. Many Internet users unwittingly close themselves in the filter bubble by using the most popular search tools. However, there are a number of products that make it possible to reduce the risk of getting stuck in a filter bubble. Privacy-oriented search tools such as DuckDuckGo, Entireweb and Exalead can be an interesting alternative to Google, placing more emphasis on protecting the privacy of their users and not collecting so much information about them. The next step may be to change the default search engine. Today, Google has become synonymous with the search engine, and the vast majority of Internet users treat it as a natural source of information when searching the Web. Meanwhile, the use of other search engines (Bing, Startpage, Yahoo) may allow one to reach information that, for some reason, appears in Google's results in distant positions that users practically never reach. Multi-search engines, such as Dogpile, Yippy or Zeekly, can also help. They too put the privacy and security of the user first, and in addition their mechanism of action allows one (with a suitably worded query) to reach the information sought faster and more efficiently. The essence of the multi-search engine is that the query entered by the user is sent to several or a dozen ordinary search engines, and the results obtained from them are compared with each other, which allows deeper and more complete penetration of the Internet. On the other hand, there are solutions on a macro scale, related to the education and awareness of the whole society. It is necessary to change the approach to information retrieval.
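The multi-search mechanism described above can be sketched as follows. The two "engines" are hypothetical stubs standing in for the real services a metasearcher such as Dogpile queries, and the merging uses a simple Borda-style count (earlier positions in each ranking score higher); actual metasearch engines use more elaborate fusion methods.

```python
# Hypothetical engine stubs returning canned, already-ranked result lists.
def engine_a(query):
    return ["bp.com", "news.example/spill", "wiki/BP"]

def engine_b(query):
    return ["wiki/BP", "bp.com", "blog.example/bp"]

def metasearch(query, engines):
    """Fan the query out to all engines and merge rankings by Borda count."""
    scores = {}
    for engine in engines:
        results = engine(query)
        for pos, url in enumerate(results):
            # A result in position 0 of a 3-item list earns 3 points, etc.
            scores[url] = scores.get(url, 0) + (len(results) - pos)
    # Deduplicated, ordered by combined score across all engines.
    return sorted(scores, key=scores.get, reverse=True)

print(metasearch("BP", [engine_a, engine_b]))
```

A result that ranks highly in several independent engines rises to the top of the merged list, which is why such tools can surface pages that any single engine's personalization would have buried.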
The Internet has accustomed people to the speed and availability of answers to every question they ask. This undoubted advantage of access to information, however, comes with challenges for the user, such as the so-called information smog, i.e. the overproduction of information (Tadeusiewicz, 2002). The Internet can no longer be presented as a wonderful place where one click of a computer mouse opens the answer to any question. It is necessary to change the approach to searching for information, especially in times of post-truth and fake news. The youngest Internet users, who learn to use the Web in computer science classes, should be made aware of the dangers to the reliability of content they may encounter. It is also important to build social awareness of, and sensitivity to, searching for information in various sources.

However, there is one fundamental doubt: will the information bubble surrounding us burst if we follow all these pieces of advice? Or will it just expand? The algorithm will certainly take our new choices into account and expand the catalog of suggested answers. Therefore, the only reasonable option seems to be healthy skepticism and a critical approach to the content that we encounter on the Web. A healthy skepticism and

a critical approach to the answers given to us by one search engine or another will certainly yield better results than blind confidence in the results obtained.


References

Berger, A.A. (2014). Media and Communication Research Methods. London: SAGE Publications.
Główny Urząd Statystyczny (2018). Społeczeństwo informacyjne w Polsce. Wyniki badań statystycznych z lat 2014–2018. Retrieved from: https://stat.gov.pl/download/gfx/portalinformacyjny/pl/defaultaktualnosci/5497/1/12/1/spoleczenstwo_informacyjne_w_polsce._wyniki_badan_statystycznych_z_lat_2014-2018.pdf
Goujard, C. (2018). France is ditching Google to reclaim its online independence. Wired.com, 20.11.2018. Retrieved from: https://www.wired.co.uk/article/google-france-silicon-valley
Kuner, Ch., Cate, F., Millard, Ch., Svantesson, D., Lynskey, O. (2015). Internet Balkanization gathers pace: is privacy the real driver? International Data Privacy Law, 5(1), 1-2.
Malinowski, B. (2016). Jak Facebook zamyka nas w bańce informacyjnej. Algorytm filtrujący newsfeed a zjawisko filter bubble. Zarządzanie mediami, 4(1), 15-22.
Matuszewski, P. (2018). Cyberplemiona. Analiza zachowań użytkowników Facebooka w trakcie kampanii parlamentarnej. Warszawa: PWN.
Morozov, E. (2010). Think Again: The Internet. Foreign Policy. Retrieved from: https://foreignpolicy.com/2010/04/26/think-again-the-internet/
Mueller, M. (2017). Will the Internet Fragment? Sovereignty, Globalization and Cyberspace. Cambridge, UK: Polity Press.
Napoli, P.M. (2015). Social media and the public interest: Governance of news platforms in the realm of individual and algorithmic gatekeepers. Telecommunications Policy, 39(9), 751–760.
Negroponte, N. (1995). Being Digital. New York: Alfred A. Knopf.
Newman, N., Fletcher, R., Kalogeropoulos, A., Nielsen, R.K. (2019). Reuters Institute Digital News Report 2019. Retrieved from: https://reutersinstitute.politics.ox.ac.uk/sites/default/files/2019-06/DNR_2019_FINAL_1.pdf
Pariser, E. (2011). The Filter Bubble: What the Internet Is Hiding from You. New York: Penguin.
Sagawa, P.I. (1997). The balkanization of the Internet. The McKinsey Quarterly, 126-135.
Smith, A. (2018). Many Facebook users don't understand how the site's news feed works. Pew Research Center. Retrieved from: https://www.pewresearch.org/fact-tank/2018/09/05/many-facebook-users-dont-understand-how-the-sites-news-feed-works/
Sunstein, C.R. (2002). Republic.com. Princeton, NJ: Princeton University Press.
Szpunar, M. (2017). Imperializm kulturowy Internetu. Kraków: Instytut Dziennikarstwa, Mediów i Komunikacji Społecznej, Uniwersytet Jagielloński.
Tadeusiewicz, R. (2002). Społeczność Internetu. Warszawa: Akademicka Oficyna Wydawnicza EXIT.
Thurman, N., Schifferes, S. (2012). The future of personalization at news websites: Lessons from a longitudinal study. Journalism Studies, 13(5-6), 775-790.
Vīķe-Freiberga, V., Däubler-Gmelin, H., Hammersley, B., Pessoa Maduro, L.M.P. (2013). A free and pluralistic media to sustain European democracy. Retrieved from: http://ec.europa.eu/digital-agenda/sites/digital-agenda/files/HLG%20Final%20Report.pdf
