The Entrepreneurs Helping Us Find Truth on the Internet
In his 2009 book, “Idiot America: How Stupidity Became a Virtue in the Land of the Free,” Charles P. Pierce wrote that in today’s age “any theory is valid if it sells books, soaks up ratings, or otherwise moves units.” He argues that validity is now determined by the volume of the speaker, not the substance of the message.
It appears today that facts are based not on absolutes but on the sheer number of social media shares and retweets. Riding this wave of misinformation, 76% of self-identified Republicans believe in widespread fraud in the 2020 election. These claims mobilized several thousand of President Trump’s supporters to riot at the Capitol on January 6th in an attempt to halt the certification of President Biden’s victory. In the case of COVID-19, a study by the Harvard Kennedy School found that nearly 20% of its 1,008 survey respondents during the early days of the pandemic believed the CDC was exaggerating the threat of SARS-CoV-2 to undermine the Trump administration. COVID-19 vaccine conspiracy theories continue to spread on social media platforms despite tech giants’ attempts to root out misinformation.
In the realm of social media, truth is determined not by scientific methodology and reproducibility but simply by fervent belief. Therein lies the crux of the problem: an overall lack of respect for science and the methodology by which it arrives at truth and fact. The scientific method is the cornerstone of all the sciences and central to the training of their practitioners. Regardless of the field of study, the method is used to find answers that are logical and supported by evidence.
HOW THE INTERNET ACCELERATES THE SPREAD OF MISINFORMATION
The internet has enabled the rapid dissemination of falsehood by groups that normally would not be aligned. In an April 2020 study, Velasquez et al. describe how a multiverse of hate is created from misinformation on unregulated platforms such as 4chan and Gab, which then spreads via nodes on mildly regulated platforms such as Facebook. This allows non-evidence-based vitriol to race along the internet superhighway with little to no validation of facts, propagating false narratives about COVID-19 and a stolen election.
The issue lies in vetting information for truth in a world where science and its meticulous methods are not appreciated. Science and truth take time. Often the truth is complex, and it takes several iterations to discover the core of an issue. But in a world of sound bites, retweets, character limits, news flashes, mindless sharing, and limitless sources of information, we no longer value the time it takes to know the truth.
Entrepreneurs can play a significant role in combatting misinformation. For example, startups like Logically and Fabula AI use AI to cut down the time needed to identify misinformation. Logically complements AI and machine learning with a dedicated team that provides fact-checking services for clients. Governments, private sector companies, and individuals can take advantage of a suite of products geared toward identifying, analyzing, and mitigating the spread of misinformation. Fabula AI takes a different approach: its algorithm focuses on how a piece of information spreads on social media rather than on the content itself. This machine-learning technique crunches through network-structured data to detect if and how networks of information are being manipulated, identifying “fake news” with 93% accuracy. Twitter bought the UK startup to aid its efforts to stop the spread of misinformation on its platform.
Though these startups can make discerning the truth a bit easier, AI-based solutions are not a magic wand for detecting misinformation. In 2018, researchers at the MIT Media Lab published an extensive study of rumor cascades on Twitter from 2006 to 2017, evaluating approximately 126,000 rumors spread by approximately 3 million accounts. They found that false news reached a wider audience than facts: the top 1% of falsehoods reached up to 100,000 people, while the truth rarely reached more than 1,000.
“Falsehood diffused significantly farther, faster, deeper, and more broadly than the truth in all categories of information,” wrote the authors.
They also found that when it came to fact versus fiction, bots did not discriminate. Automated accounts spread both types of information equally, while humans favored disseminating falsehoods at a much higher rate.
Our automated models are incomplete, and we can’t tech our way out of this one. We need to rebuild critical thinking from the bottom up. Luckily, there are also entrepreneurs working on just that. Ad Astra* is a company that works “to renew a faith in facts and reason and uplift underserved and minority communities by providing them with scientific role models in science, technology, engineering, art, and math (STEAM) to which they can aspire.” The company works on producing Spanish and English language “edutainment” content in the form of videos, comics, and toys that ignite a love for science at a young age. By instilling a habit of scientific thinking, companies like these look to inspire a generation of critical thinkers capable of discerning fact from rumor.
Startups alone, however, cannot address the issue of misinformation on social media. Policymakers also need to play a role in creating the systems necessary to incentivize truth and disincentivize lies. A global survey of misinformation policy reveals how challenging this can be in practice. For example, the Supreme Court recently vacated a lower court ruling that President Trump violated the First Amendment by blocking critics on Twitter. In a concurring opinion, Justice Thomas raised the question of regulating social media platforms as public goods, given their “substantial market power” and the lack of comparable alternatives, an opinion that has drawn both significant support and dissent. Defining misinformation, protecting free speech, and choosing where to start have stymied political action worldwide. What is certain, however, is that the easier it becomes to create content, the harder it will be to find the truth.
We live in a world driven technologically by the engine of science, whether it be the internet, GPS, or vaccine development. We see that its methodologies are effective in so many palpable capacities, and yet we do not elevate science when it comes to civic discourse and our everyday dealings with one another. We do not take the time to reason through the complexities of the problems we face today. We do not take the time to systematically address the data before us to find the root causes of the tendrils that have traversed time to wreak havoc on the present. We allow opinion and rumor to prevail. Through science and its method, we can find the truth. In science, we must trust.
This article was co-authored by:
Regie Mauricio, Project Manager, Legatum Center at MIT
The views expressed herein represent the authors’ own personal views, with no relation to any previous, current, or future affiliations.