In November 2021, the Aspen Institute’s Commission on Information Disorder published a report highlighting the impact of the spread of false and misleading information on societies and providing detailed guidelines aimed at addressing the problem. The Commission, its remit “to identify and prioritize the most critical sources and causes of information disorder and deliver a set of short-term actions and longer-term goals to help government, the private sector, and civil society respond to this modern-day crisis of faith in key institutions,” is among the most high-profile efforts to take on what has become an insidious and multi-faceted scourge.
Information disorder is defined as “the sharing or developing of false information with or without the intent of harming,” and it is not hyperbole to suggest that it has far-reaching consequences. Once a society’s information landscape becomes so inundated with false information that its people lose the ability or desire to discern fact from fiction, the society becomes a prime target for manipulation.
Manipulation comes in the form of disinformation—false information spread with the intent to deceive. Disinformation finds fertile ground in a muddied information landscape where mistrust in institutions, structural inequality, and intellectual apathy persist. Campaigns conducted by state and non-state actors leverage these conditions to manipulate societal decision-making in the short term and/or erode social cohesion over the long term. Successful campaigns can turn one group against another, a population against its government, or a child against a parent. That they can be conducted with little risk and high reward makes them all the more dangerous.
We should not be surprised that it has come to this. First, people are bombarded with more information than their brains are willing and able to process. The average person in the United States in 2020, for example, spent 1,300 hours on social media, consuming an unprecedented amount of information in the process. Evaluating each piece of information on its intellectual or factual merit is simply no longer possible. At the same time, the sheer scale has fostered the belief that information should be free, or at the very least not worth paying for.
Second, the economics of information, denominated in "attention," incentivize creators to produce content that garners attention, not praise for its journalistic integrity. Praise, after all, does not keep the lights on and journalists employed. Traditional publications, facing declining print ad revenues after the proliferation of the internet, confronted this exact dilemma: play ball with the evolving attention economy or suffer the financial consequences.
The result is increasingly polarized societies with no mutually trusted sources of information, no pursuit of shared truth, declining empathy, and an incentive structure that rewards producing information people want to read regardless of its integrity.
Where we're headed
Oxford Dictionaries, in naming "post-truth" the 2016 Word of the Year, defined it as "relating to or denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief." Although five years have passed, no single idea portends our future relationship with information quite like "post-truth."
Like the famous political propagandists before them, state and non-state actors will continue to use disinformation to exploit societies in which the pursuit of truth has become a zero-sum game of I’m Right, You’re Wrong. Where there is no room for the idea that two things can be true at the same time, disinformation will be there to provide the data points to support any perspective.
And as technology evolves, disinformation campaigns will only get more sophisticated in their attempt to sow division and erode faith in democratic institutions. The vast amount of data being collected from people every day on social media will make micro-targeting even more surgical. Deepfakes, already a grave threat to our ability to distinguish between fact and fiction, will continue to improve, while AI-powered bot networks will get better at mimicking human interaction.
Our ultimate destination can perhaps best be characterized by the political philosopher Hannah Arendt, who wrote in her 1951 book The Origins of Totalitarianism: "The ideal subject of totalitarian rule is not the convinced Nazi or the convinced communist, but people for whom the distinction between fact and fiction (i.e., the reality of experience) and the distinction between true and false (i.e., the standards of thought) no longer exist."
Preparing societies for the modern information landscape
It is unrealistic to aim for the elimination of disinformation. Instead, we need to first understand why disinformation has such a profound impact, particularly on post-truth societies. Opposing sides no longer trust each other’s facts and sources, empathy for alternative perspectives is in decline, the disinformation awareness gap is widening, and the attention economy rewards content that gets the most clicks, not content that prioritizes journalistic integrity. The challenge is finding tangible and realistic solutions that address these issues.
We know it is too late to stem the flow of disinformation, let alone eliminate it entirely. We also know that debating whether a piece of questionable information is true or false distracts us from the real question: What, if anything, is the intent behind a piece of information? If we really want to address the impact of disinformation, we need to focus on the ultimate targets of disinformation: people.
Operation: PLUTO, therefore, is a multi-faceted campaign designed to mitigate the impact of disinformation on societies. Using a three-pronged strategy of reparation, preparation, and reformation, we want to give societies the tools and knowledge to thrive in the modern information landscape.
Reparation addresses the damage disinformation has already done, preparation raises awareness and educates, and reformation promotes systemic change. At a tactical level, we have identified four specific initiatives to focus on:
Raise disinformation awareness
Through content, media interactions, expert engagement, and social media initiatives, we want to help societies understand the threat of disinformation and, ultimately, reevaluate their own information consumption habits.
Promote empathy as a tool to bridge societal divides
We want to demonstrate how two sides can have a constructive conversation around a divisive issue without burning bridges.
Develop educational initiatives
We want to develop and promote methods for evaluating information and understanding disinformation campaigns and build school curricula to prepare students for the modern information landscape.
Call for the realignment of attention economy incentives
Through media interactions and thought leadership, we want to urge governments to scrutinize attention economy incentives.
We know by now that there is no panacea that will stop disinformation. Operation: PLUTO is designed to mitigate its impact by focusing on the ultimate targets of disinformation campaigns: people. Through these initiatives, we hope to give societies the tools and knowledge to thrive in the modern information landscape.
Call for partners
We fully appreciate the enormity of the task at hand. Addressing disinformation requires contributions from a variety of organizations and individuals across disciplines. We welcome and encourage cooperation with anyone, regardless of background or area of expertise. Everyone has a role to play in building a healthier information landscape.
If you or your organization want to work together to address the impact of disinformation, please contact us.