Images, Videos on Facebook Will Now Be Fact-Checked

Efforts are ramping up as midterm elections approach
By Luke Roney, Newser Staff
Posted Sep 13, 2018 6:19 PM CDT
The Facebook logo is seen at the company's headquarters. (AP Photo/Marcio Jose Sanchez)

(Newser) – Facebook wants you to be able to believe your eyes. To that end, the social media platform will begin fact-checking images and videos posted to the site, the Wall Street Journal reports. Until now, Facebook has focused mainly on weeding out articles that include false information. But, as Facebook product manager Tessa Lyons says in a statement, "The same false claim can appear … as text over a photo or as audio in the background of a video. In order to fight misinformation, we have to be able to fact-check it across all of these different content types." Social media posts featuring doctored photos and other manipulated images were a big part of Russian agents' attempts to influence the 2016 presidential election, per CNBC.

In a blog post, Facebook product manager Antonia Woodford says Facebook has created a machine-learning model to flag possibly false content through the use of "engagement signals, including feedback from people on Facebook." If an image or video doesn't seem right, Facebook sends it to fact-checkers for review, she says. As the midterm elections approach, CNBC reports, Facebook has been bolstering its fact-checking efforts and has detected "coordinated inauthentic behavior." On Wednesday, CEO Mark Zuckerberg said Facebook has "developed sophisticated systems … to prevent election interference" based on lessons learned from Russian meddling in 2016, according to CNET.
