Shock Claim: Facebook Moderators Told to ‘Err on the Side of an Adult’ with Potential Child Sexual Abuse Material

A recent report from the New York Times reveals that Facebook's training documents instruct the company's content moderators to "err on the side of an adult" when they don't know the age of an individual shown in a photo or video suspected of being child sexual abuse material (CSAM).
Facebook co-founder, Chairman and CEO Mark Zuckerberg (Photo by Chip Somodevilla/Getty Images)
Companies like Facebook are required to monitor their platforms for child sexual abuse material and, if any is found, report it to the National Center for Missing and Exploited Children (NCMEC). Many tech firms employ content moderators to review content flagged as potential CSAM.
The Verge reports that the policy instructing Mark Zuckerberg's moderators to "err on the side of an adult" was created for Facebook's content moderators at the third-party contractor Accenture, and is mentioned in a California Law Review article from August, which states:
Interviewees also …

The post Shock Claim: Facebook Moderators Told to ‘Err on the Side of an Adult’ with Potential Child Sexual Abuse Material appeared first on Populist Press ©2022.

Published on April 02, 2022 05:27


Stephen K. Bannon's Blog