OPINION: Facebook’s Fake News Problem
Maddy Neff | Feb. 16, 2018, 12:09 a.m.
Last fall, U.S. Congressional investigators found that Russia played a role in the spread of deliberate misinformation over social media during the 2016 election. Facebook “identified more than $100,000 worth of divisive ads” bought by a Russian company tied to the Kremlin, according to Scott Shane of The New York Times. American intelligence agencies concluded that the company’s activity was designed to shift public opinion in order to improve the odds of Donald Trump’s victory and harm Hillary Clinton’s electability. Efforts to understand the impact of Facebook advertisements on the election itself are still in their early stages.
Facebook’s fake news problem poses far more complicated ethical questions for the future than it did immediately after the election. Outside interference in an election that may have undermined American democracy, swayed public opinion, and given way to a Trump presidency is sickening. Furthermore, the lack of foresight involved in a social media platform failing to recognize that foreign actors could use its product to proliferate falsehoods and influence the results of an election is inexcusable and, frankly, something that should have been prevented.
The investigation into the fake news that flooded Facebook feeds in 2016 will directly affect whether the company continues its open and relaxed policies regarding its advertisement space, or whether it implements regulations that prevent outside actors from disseminating lies.
This decision puts Facebook in a difficult position. The company will have to decide between advocating for free speech and enforcing restrictions to prevent the spread of fake news.
PEN American Center, a New York City-based nonprofit that defends free expression, defines “fake news” as “demonstrably false information that is being presented as a factual news report with the intention to deceive the public.” During the election, Russia spread fabricated events and promoted pages intended to stir controversy, targeting individuals based on their gender, age, political ideology, the pages they like, and more.
Since then, Facebook has taken baby steps toward policy and platform reform, such as creating a system that allows users to flag “objectionable” content for removal, hiring a new team of employees to police shared content, and even referring questionable content to third-party fact-checkers. However, these measures have been deemed ineffective in preventing the spread of fake news across Facebook’s millions of feeds for several reasons.
First, these actions do not catch fake news before it is seen by millions of people. Second, once sources have been reported or flagged by users, Facebook employees, or third-party fact-checkers, Facebook does not always remove those posts.
While I recognize its attempts to regulate fake news, I don’t believe Facebook’s efforts thus far have addressed the problem head-on. A crucial part of Facebook’s fake news problem has occurred within the advertisement space that it sells to different groups, including foreign actors who can use that space to manipulate the public.
This complicates the debate because social media outlets have historically prioritized money over the quality of the information they disseminate, making billions of dollars in profit while operating without a social conscience or accountability. I often question whether Facebook is truly committed to participating in the system of American values that supposedly allows social media sites to grow with protections that other places in the world would curtail with governmental censorship.
It is best that people take this into their own hands. Facebook is currently adopting various methods to try to control misinformation, and that process is not guaranteed to be completed anytime soon, or even at all. This means that ordinary people need to judge the truth of news for themselves, whether that means consulting multiple news sources, fact-checking, or supporting news literacy education.
Maddy Neff SC ’21 is a politics major and a big critic of “big tech” from Palo Alto, CA. Maddy is probably listening to Lorde’s album, “Melodrama,” while scrolling through President Donald Trump’s Twitter feed.