In a column published Monday, Washington Post columnist Geoffrey A. Fowler argued that Meta’s community notes fact-checking system for posts on Facebook, Instagram and Threads is “nowhere near up to the task” of keeping misinformation off the platforms.
After four months in Meta’s community notes program, during which he drafted more than 65 notes, Fowler saw only three of his proposed notes published, all while, he says, people’s feeds are “filled with inaccurate information.”
“Zuckerberg fired professional fact-checkers, leaving users to fight falsehoods with community notes. As the main line of defense against hoaxes and deliberate liars exploiting our attention, community notes appear — so far — nowhere near up to the task,” the columnist asserted.
Fowler says he did his best to avoid bias in his fact-checking, deliberately drafting notes that “crossed the political spectrum” and voting on other users’ contributions.
“For example, I suggested notes on a fabricated image of Pam Bondi seen over half a million times, as well as a false claim about the wealth of Alexandria Ocasio-Cortez seen a quarter of a million times,” he reported. “I also voted on dozens of notes drafted by others, rating them ‘helpful’ or ‘not helpful.’”
While the columnist conceded that community notes do “have upsides,” he noticed some glaring issues with Meta’s fact-checking system from the outset.
“I discovered problems quickly. Sometimes, posts I identified for notes wouldn’t accept them because they were written by accounts outside the U.S. (which are excluded from Meta’s initial program) or had other technical problems. I’ve seen notes suggested by others that were low quality, some with more opinions than facts, or that sourced to ‘Google it,’” he wrote.
One of the root problems facing Meta’s community notes system, according to Fowler, is the “bridging algorithm” the platform uses to determine which notes get published and which get passed over. The algorithm publishes a note only when contributors who have disagreed with each other on past notes agree that the new note is helpful.
“In theory, this is a good thing. You don’t want to publish notes that contain falsehoods or are simply attacks on particular people or ideas,” he noted. “However, agreement is tough to find. Notes I couldn’t get published included facts that shouldn’t be up for debate, including identifying AI deepfakes. This system also doesn’t lend itself to the unique risks of breaking news and fast-moving viral conspiracies.”
Kolina Koltai, who helped develop community notes at X, said the algorithm shared between Meta and X is a “very, very conservative system” that is better at avoiding harmful notes than at ensuring useful notes actually get published.
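In rough terms, the “bridging” requirement Fowler and Koltai describe can be illustrated with a toy rule: a note goes live only if it draws “helpful” ratings from raters who have historically disagreed with one another. The Python sketch below is an illustrative assumption, not Meta’s or X’s actual code; the function, cluster labels and thresholds are invented for the example, and the real system scores notes with a more elaborate model built from the full rating history.

```python
# Toy sketch of a "bridging" publication rule (illustrative assumption, not Meta's or X's code).
# Raters are pre-grouped into clusters of people who have tended to disagree in the past;
# a note is published only if enough raters in at least two different clusters call it helpful.

from collections import defaultdict

def publishes(ratings, rater_cluster, min_helpful_per_cluster=2):
    """ratings: list of (rater_id, is_helpful) pairs; rater_cluster maps rater_id -> cluster label."""
    helpful_by_cluster = defaultdict(int)
    for rater_id, is_helpful in ratings:
        if is_helpful:
            helpful_by_cluster[rater_cluster[rater_id]] += 1
    # Require support from at least two distinct clusters of raters who usually disagree.
    supporting = [c for c, n in helpful_by_cluster.items() if n >= min_helpful_per_cluster]
    return len(supporting) >= 2

clusters = {"a1": "group_a", "a2": "group_a", "b1": "group_b", "b2": "group_b"}
# Helpful votes from only one side of past disagreements: the note is not published.
print(publishes([("a1", True), ("a2", True), ("b1", False)], clusters))               # False
# Helpful votes from both sides: the note is published.
print(publishes([("a1", True), ("a2", True), ("b1", True), ("b2", True)], clusters))  # True
```

As the second example shows, a note backed only by one side of a past divide never clears the bar, which is why such a rule errs toward withholding notes rather than publishing questionable ones.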
When Fowler approached Meta with his findings, the social media giant told him that his tests could not be used to evaluate its community-driven fact-checking system, which has been active in the United States for only four months.
“Community Notes is a brand new product that’s still in the test-and-learn phase, and it takes time to build a robust contributor community. While there are notes continuously publishing across Threads, Instagram and Facebook, not every note will be broadly rated as helpful by the community — even if those notes were written by a Washington Post columnist,” Meta spokeswoman Erica Sackin told the Post.
Meta declined to answer Fowler’s questions about how many notes the platform has published, how many users are participating in the program or whether there is data showing the program is actually having an impact, despite the company’s promises of transparency regarding the system.
At the end of July, the Washington Post was forced to issue a correction for a story on Meta and fact-checking. A note appended to a piece by outgoing fact-checker Glenn Kessler conceded, “A previous version of this column incorrectly said that Meta allowed users to opt out of having posts fact-checked. In fact, Meta allowed users to opt out of seeing fact-checked posts.”
It was Kessler’s final column as a fact-checker. He took a buyout, and the Post has yet to replace him.