Self-appointed experts on everything, quick-fix conclusions on complex policy matters, sensational labelling of enemies via clickbait posts, black-or-white statements, all-or-nothing approaches and conspiracy theorists: you get the gist.
The public sphere is saturated with such instances and characters. The apex of this hyperreality is articulated on social media: from anti-vaccination hypes to engineering-by-Facebook. Which gets me thinking: some silent scholar may have spent years investigating an issue. She or he would have made some important findings, which, however, remain confined within academic circles.
Then comes a Facebook influencer who publishes ten opinions a day on everything. Some of these get picked up in the algorithmic spiral. The usual suspects add their comments and, hey presto, thousands read the post. Clickbait vultures give the story a snowball effect.
Trying to engage rationally in such fora is simply out of the question. What often matters is how explosive a claim is, irrespective of its accuracy. The influencers involved may come from different backgrounds, sport different stripes and colours, and have different affiliations, big and small, institutional and not. This can lead to mob rule, elbowing reasonable dialogue out of the public sphere.
This new state of things is even more concerning when we situate it in the post-truth context. As author Ralph Keyes puts it, here “borders blur between truth and lies, honesty and dishonesty, fiction and nonfiction. Deceiving others becomes a challenge, a game, and ultimately a habit.”
Thus, for every scholar who researches diligently you find others who consider themselves experts in areas they never really studied. For every scientist you find ideological dogmatists ready to defy gravity if it does not fit their script. For every serious journalist who tries to ensure that reporting is grounded in research and credible evidence, there are crusaders and clickbait collectors who rush to publish claims even when these are not necessarily factual. All this becomes even more worrying when one factors in professional propagandists involved in the systematic dissemination of distortion.
Which takes us to a recent article I read in Time magazine by Naomi Oreskes, a historian of science at Harvard University. Referring to the challenges the scientific community faces in public outreach, she says that for several decades there has been an extensive and organized campaign intended to generate distrust in science, funded by regulated industries and libertarian think-tanks whose interests and ideologies are threatened by the findings of modern science.
Oreskes shows how, unlike quick-fix clickbait opinions, scientific claims are never accepted as true before a thorough process of examination by fellow scientists. This includes peer review, where reviewers are scientific peers: experts in the same field of study. Scientific findings also differ from ideological dogmas, as scientists may discover new evidence. As Oreskes eloquently puts it, “This is to their credit: it is a strength of science, not a weakness, that scientists continue to learn and to be open to new ways of thinking about old problems. The fact that we may learn new things in the future does not mean that we should throw away what hard-earned knowledge we have now.”
Another recommended recent read on the topic was penned by scientist Lucky Tran in the Guardian. Here he refers to ‘firehosing’, a term coined by social scientists Christopher Paul and Miriam Matthews of the RAND Corporation. It “relies on pushing out as many lies as possible as frequently as possible. That’s typical for propaganda, but the aspect that makes firehosing a unique strategy is that it doesn’t require the propagandist to make the lies believable.”
The strategy floods the public sphere with so many claims that ‘it becomes exhausting to continually disprove them’.
Quoting the RAND researchers, Tran suggests various methods to combat firehosing: These include forewarning audiences about the methods that propagandists use to manipulate public opinion; disrupting the flow of disinformation (e.g. removing false content from platforms); withdrawing participation and support for outlets that profit from falsehoods; and creating stronger accountability and self-regulation in the media.
In the social sciences and humanities, such as sociology, methods like discourse analysis can help us understand the political, historical and cultural functions of statements (or discourses). Here, one can systematically investigate which discourses are included in debates and which are excluded. Discourse analysis can also reveal how certain discourses gain dominance and how they can be challenged. It helps us understand how various social statements rise to prominence not necessarily because of their factuality, but because of the way they are interpreted and articulated.
This method thus enables the researcher to understand the politicization of issues. It helps us understand how certain facts and non-facts make it to the political agenda whilst others do not, and how they are interpreted within the public sphere.
Let us not allow any institution, organisation, influencer, feed or narrative to monopolize the way we think. Let us value rich dialogue, even if slower in pace, and let us recognize that we may be wrong when evidence shows this. Let us be patient before rushing to quick conclusions. And let us have proper impact assessments in policy decisions.
Dr Briguglio is a Sociologist and Senior Lecturer at the University of Malta