Disinformation and the false belief that we are autonomous in our search for knowledge

Jessica van der Schalk
December 18, 2020


Disinformation is often attributed to the advent of social media and to algorithms that covertly use our preferences to prioritize certain information, but also to the rise of deepfakes. Online, an argument can be found for every possible notion, our own ideas are easily confirmed, and disinformation appears to spread more rapidly and widely than reliable information. Some say that we now live in a time in which we experience autonomy while, in reality, we are being manipulated. Is this a new phenomenon, and how worried should we be?

Our observations

  • In 2016, “post-truth” was chosen as word of the year by Oxford Dictionaries, defined as “circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief”.
  • In general, disinformation refers to the intentional spreading of manipulative or even untrue information to convince the public of a certain viewpoint and/or influence their behavior. Disinformation differs from misinformation in the underlying motive: with disinformation, the erroneous information is spread intentionally. There are different types of disinformation, such as false information (think of deepfakes, or simply made-up stories), “cherry-picked” information (certain parts of the truth are intentionally left out or highlighted), unproven links (true facts are wrongly linked to each other, leading to a false conclusion) or authentic information that is potentially harmful to a person or community (e.g. hate speech or leaked private information).
  • In its report Technology and Democracy – Understanding the influence of online technologies on political behaviour and decision-making (2020), the European Commission concludes, among other things, that people behave differently online from offline. The web offers a cognitively unique environment, resulting in specific psychological reactions; the online environment can, for example, influence the way individuals process information and communicate with each other. Importantly, there is scientific evidence suggesting that social media change people’s offline political behavior, including inciting dangerous behavior such as committing hate crimes, which could, if proved, be a valid reason to impose restrictions. “Establishing causality is crucial because it offers opportunity for intervention and control. If social media were found to cause social ills, then it would be legitimate to expect that a change in platform architecture might influence society’s well-being.”
  • Professors Dutilh Novaes and De Ridder claim that the tactics used in spreading disinformation to influence the public’s viewpoints or behavior are not new; rather, the technical knowledge and means are. These new means have led us into a potentially dangerous situation in which it may no longer be possible for reliable parties to abide by the democratic maxim “should they go low, then we go high”.

Connecting the dots

Disinformation is a global, public concern that threatens democratic societies. After all, a well-functioning democracy depends on the ability of citizens to make informed choices, and it is precisely reliable information that seems to have more and more difficulty reaching citizens. According to the European Commission report, circumstances today are perfect for the large-scale spread of disinformation because of the interplay between the attention economy and human psychology. The attention economy is driven by algorithms that select and then promote attractive, fascinating content at the individual level. Furthermore, people are naturally inclined to focus on negative news, and most disinformation provokes negative emotions such as fear, rage and indignation. Disinformation thus spreads more quickly and widely than reliable information. Moreover, individuals are exposed less to differing opinions, which are crucial to identifying the best arguments, exchanging viewpoints and reaching consensus. Scientific research corroborates the concerns over this phenomenon, but we are still in the early stages of policy-making on this theme.

And yet, the intentional manipulation of the public is not a new phenomenon. According to Professors Dutilh Novaes and De Ridder, three tactics for manipulating the public were already in use before the internet era. The first is pleasing and seducing the audience: by playing into deeply rooted sentiments with misleading but attractive information, the public is tempted to adopt a certain standpoint or change its behavior. If this form of manipulation is the only one applied, other information remains available and all parties can employ the same method, so that different stances can still be heard. The second tactic is propaganda and censorship: information is communicated by one party only and dissenting voices are suppressed. For practical reasons, this tactic is mainly used by political figures who have the means to censor. The final tactic is knowledge pollution: traditional sources of information (science, government, media) are discredited by a stream of alternative information of such proportions that the public no longer knows which information is reliable. Experts are depicted as subjective sources with views of their own, and those spreading disinformation as having equally valuable but different views. Due to this tactic, the public threatens to fragment and individuals tend to narrow their focus to those sources they consider credible and which confirm their worldview.

However, the technical means by which (dis)information is produced, spread and consumed are new. Besides knowledge of human psychology, technological knowledge is nowadays needed both to make disinformation effective and to combat it. In addition, these technological developments make it possible to reach prodigious numbers of people at low cost, whereas this previously required expensive means such as printing texts or employing people. The technologies used in this context include the aforementioned recommendation systems, as well as dark patterns, fake news websites, fake social accounts, troll farms and bots.
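To make the interplay between the attention economy and human psychology more concrete, the toy sketch below ranks a feed purely by predicted engagement. It is a minimal illustration, not a description of any real platform’s recommendation system: the weights, scores and example items are invented assumptions. The point it shows is that an objective which rewards attention, and in which emotionally charged content scores higher, will tend to surface such content ahead of calmer, more reliable information.

```python
# Purely illustrative toy model of engagement-driven ranking.
# All weights and scores are made up for the sake of the example.

from dataclasses import dataclass

@dataclass
class Item:
    text: str
    emotional_charge: float   # 0.0 (neutral) .. 1.0 (fear/outrage), hypothetical score
    relevance_to_user: float  # 0.0 .. 1.0, hypothetical personalization score

def predicted_engagement(item: Item) -> float:
    # Assumption for illustration: engagement grows with both personal relevance
    # and emotional charge, with emotion weighted more heavily.
    return 0.4 * item.relevance_to_user + 0.6 * item.emotional_charge

def rank_feed(items: list[Item]) -> list[Item]:
    # The feed optimizes for attention, not accuracy: nothing in this
    # objective rewards reliable information.
    return sorted(items, key=predicted_engagement, reverse=True)

if __name__ == "__main__":
    feed = rank_feed([
        Item("Calm, well-sourced explainer", emotional_charge=0.1, relevance_to_user=0.8),
        Item("Outrage-inducing rumour", emotional_charge=0.9, relevance_to_user=0.5),
    ])
    for item in feed:
        print(f"{predicted_engagement(item):.2f}  {item.text}")
```

Under these invented weights, the outrage-inducing rumour outranks the well-sourced explainer (0.74 versus 0.38), which is the dynamic the report describes: content that provokes negative emotion is amplified, while reliability plays no role in the objective.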

Implications

  • Most people would characterize themselves as critical thinkers, capable of forming judgments and acquiring knowledge autonomously. However, when we investigate more thoroughly what it means to be a critical thinker, human beings in general turn out to be poor critical thinkers, even though they see themselves as such. Professors Dutilh Novaes and De Ridder hold that the danger of knowledge pollution is that citizens still appear to be autonomous when in fact they are not. This makes it very difficult for reliable information to reach citizens without the deployment of so-called “counter-manipulation”.
  • According to the renowned contemporary philosopher Latour, some people cling to information that denies global problems such as the coronavirus and climate change because it allows them to live in a world in which these problems do not exist.
  • The company WordProof has received 1 million euros to combat fake news, fraud and privacy problems with blockchain technology. Its solutions try to transport some of the “rules” of our physical world to the digital world in order to make knowledge sharing more reliable. One example is linking someone’s identity to the extent to which their information can spread on the internet, thereby separating freedom of speech from freedom of reach.
About the author(s)
Jessica van der Schalk's research at FreedomLab is primarily centered on the impact of technology on education and the nature of virtual reality and artificial intelligence. Integral to her personal and professional development, Jessica delves deep into literature concerning the philosophical relationships between humans and nature, and the importance of critical thinking and human autonomy vis-à-vis the impending wave of technological revolutions.