Facebook has become a formidable player on the news, political, and entertainment landscapes, with more than 1.8 billion active users a month. But its "gargantuan influence" has become its "biggest liability," Farhad Manjoo writes in the New York Times, unpacking how the site was the ideal conduit, especially during the election, for "fake news"—a "great disseminator of the larger cloud of misinformation and half-truths swirling about the rest of media." A study gathered further evidence to back up fears that social media in general, and Facebook in particular, insulated people in "echo chambers," where they could mingle with the like-minded and indulge their confirmation bias. The Facebook feature that exacerbated that effect: the News Feed, which spits out a constant stream of information (factual or otherwise) and creates what Manjoo calls the "solipsistic irresistibility of algorithmic news."
Facebook said it had long studied the technology behind the "filter bubble" that kept people segregated and found they wouldn't click on stories opposing their worldview even when the site's algorithms did offer those stories up—letting Facebook "off the hook" in CEO Mark Zuckerberg's eyes, per Manjoo. Zuckerberg has since appeared to come around somewhat, acknowledging the issues of an algorithm-driven platform and vowing to make changes, especially via his recent "social infrastructure" manifesto. But the main problems behind the News Feed—traditionally more reliant on data than on human judgment—remain a challenge. Manjoo notes that to combat fake news, the site may have to do something it never has: ignore what its users actually want. And if it starts hiring more people instead of over-relying on its algorithms, it may become something it never wanted to be: "a media company rather than a neutral tech platform."