We need a safe digital information space: only accountable users can achieve that

May 9, 2024
freedom of speech
We all hear that governments, tech giants, and advertisers must press for changes in the digital space, but audiences often forget that they themselves are the foundation of it all.
The concept of a safe information environment may be elusive, but if we want to escape the current morass, we will have to find a way to define it. It has never been more crucial. As society grapples with the complexities of digital information flows, a significant responsibility has been placed on public actors and platforms to safeguard the integrity of our information. However, we will achieve nothing unless we also recognise that individual behaviour is expected to be the foundation of the whole system.
But first, what exactly constitutes a "safe information environment"? Historically, the concept would likely evoke images of journalism free from bias, misinformation, and malice. Yet one might argue whether such an idyll ever truly existed, given human nature's predilection for subjectivity. What has changed today, however, is the scale and speed with which information can be corrupted and disseminated, rendering traditional safeguards both outdated and overwhelmed. What we have lost, compared with the past, is that accountability was once much easier to ensure. Audiences could render themselves invisible, but only up to a point. Now, a hate-peddling influencer can reach millions from the safety of his or her hideout. Safety could be defined as the reliability and accountability of all elements and stakeholders: audience, sources, policymakers, content. Everything should be true to itself.
The transition from passive audiences to active participants in media ecosystems marks a critical evolution in the role of the consumer. Today's audiences are not just receivers but are customers with powerful platforms at their disposal. This shift necessitates a demand for higher accountability not only from those who produce content but also from those who consume it. Herein lies a profound dilemma: how can publications demand accountability from an audience that possesses the ultimate power—their patronage—which can be redirected towards entities that cater to their biases and conveniences?
This dilemma underscores a broader societal tradeoff that involves balancing privacy, freedom of speech, and accountability. Anonymity online, while a bastion of privacy and free expression, can also serve as a shield behind which hate and prejudice proliferate. The solution is neither straightforward nor uniform. In contexts where revealing one's identity could threaten physical safety, the parameters of this tradeoff shift significantly, demanding nuanced and context-sensitive approaches. More authoritarian societies, such as China, have a questionable form of accountability embedded in their systems, but at a dear cost to freedom. As it stands, this is the path China chose for itself, and that is another, equally difficult debate.
Moreover, the digital age has emboldened malicious actors who exploit the dual shields of privacy and free speech to disseminate disinformation. The propagation of false narratives under the guise of free expression is a poisoning of our informational environment, turning potential tools of empowerment into weapons of societal discord. This malignancy within our informational ecosystem reflects a broader societal ailment, suggesting that the health of our society can indeed be measured by the average integrity of its citizens. This is very visible in polarised societies like the US, Brazil, and the UK, where radicalisation appears to be sowing the seeds of future strife.
Thus, the question arises: what is the path forward? The solution can only be reached when every stakeholder assumes their responsibilities, from governments and corporations to individual users. Authorities must do their part not only by enforcing the law, but also by addressing the rise in inequality, which is the real cause of polarisation. Tech giants must enforce the law, not pretend that their revenues come only from legitimate customers. Privacy, freedom of speech, and accountability are the variables in this equation. Determining whether this is feasible or not is the ultimate challenge if society intends to prevail.
On an individual level, there is a profound need for digital literacy education that empowers users to discern credible information from questionable sources, and to understand the impact of their online behaviour. This educational imperative extends beyond mere knowledge of facts to a deeper understanding of the ethical and societal implications of our digital engagements. An often-forgotten aspect of literacy is closer to a societal illness: intolerance of dissent. Consider any fair cause of social activism, such as gender, race, immigration, or abortion. It is stunning to see how some people believe they will achieve the necessary tolerance for a fundamental right by being intolerant of another. It is a paradox created not by some monster, but by people like us.
Trust in institutions moves as a herd: in blocks. When trust in one falls, the others are dragged down too. Confidence in the media, the government, and the legal system is at its lowest because the fundamental link of the chain, trust, has been lost, as mentioned in this other text. That link must be reinforced. Uncertainty, fear, and frustration are born out of inequality. Even if the task of setting up a safe information environment is daunting, it may not yet be insurmountable. Doing so could have a game-changing effect on everything from polarisation to climate change, but it all has to start with the individual citizen. If he or she prefers to be right rather than to be well, then there is an abyss just around the corner.

© Cassiano Gobbet 2023 - 2024