On Aug. 9, 2022, Deputy President William Ruto was declared the winner of Kenya’s presidential election by the Independent Electoral and Boundaries Commission (IEBC); his opponent, Raila Odinga, quickly claimed the election was “rigged.” Although Odinga failed to prove his claims in Kenya’s highest court, his followers continue to propagate baseless allegations of election rigging.
Does any of this sound familiar?
It should. The story of Kenya’s 2022 election is eerily similar to that of the 2020 U.S. election, which culminated in the Jan. 6 insurrection at the U.S. Capitol, endangering thousands.
Misinformation today is more dangerous than it’s ever been, but the battle against misinformation isn’t a hopeless one. The power is in your hands — so, use it.
Granted, there are some factors over which we, as the general public, have no control. Recent political science research blames three factors for the sharp uptick in misinformation: ingrouping, rhetoric of political leaders and increased social media prevalence.
Ingrouping, the belief that one’s social identity makes one superior, is a result of increasing polarization. If information threatens a person’s core values and identity, they are likely to subconsciously resist it rather than learn from it.
Journalists and academics take advantage of this in how they frame the information they present.
A 2013 study found that reframing climate change as an issue of “purity” appeals to conservative moral values and “largely eliminate[s] the difference between liberals’ and conservatives’ environmental attitudes.” Reframing certainly applies to other issues, such as vaccines.
What’s more, this identity-protective tendency is only exacerbated when political leaders like former President Trump and former Prime Minister Odinga sow distrust in elections before and after they happen. Trump made over 400 false claims undermining the integrity of the election in the three weeks after his loss alone.
Finally, the big bad: commonly used social media platforms like Twitter, Facebook and Instagram target the brain’s quick processing system, making them incredibly effective at ingraining misinformation without incentivizing users to think critically and debunk baseless claims. Researchers found a sizable disconnect between what people believe and what they share on social media: 59 percent of Twitter news items in 2016 were shared without ever being opened.
But all hope is not lost. The most dangerous vehicle for misinformation is social media, and it just so happens to be the one over which you have the most leverage. The bottom line: while academics, politicians and journalists can make positive change, you have more power than you think.
OK – so, where do you come in?
When using social media, I want to challenge you to do a few things. First, before you share something online, slow down, read the article and ask yourself, “Does this seem true?”
Second, I want you to be an active — not passive — social media reader. When you come across something, and you aren’t sure if it’s true, comment “What is your source?” or “Is there evidence of this?” This strategy appeals to core, universal truth criteria.
It’s also crucial that when speaking to someone who believes misinformation, you don’t attack them or their ingroup. Rather, try to reframe issues and appeal to common, shared values.
The battle against misinformation is a dangerous one, but it’s not hopeless; if we let journalists and academics do their jobs and devote our own energy to fighting misinformation on social media, we can defeat it.
On your end, the work is minimal and non-confrontational; the outcome, however, is tremendous.
Luke Williams PO ’23 is a Politics major. He researches misinformation at the University of Washington’s Center for an Informed Public.