Source: Xinhua | 2018-10-25
Facebook, the leading U.S. social media network, said Wednesday it had removed 8.7 million pieces of content depicting child nudity from its platform over the past three months as part of its fight against child exploitation.
Facebook said it has been using artificial intelligence (AI) and machine learning to prevent child exploitation and has stepped up enforcement of its ban on photos that show minors in a sexualized context.
Facebook Global Head of Safety Antigone Davis wrote in an official post that 99 percent of those images were removed before anyone reported them for violating the company's policy prohibiting child exploitation.
She said Facebook has also removed accounts that promote child pornography, and has even taken action on nonsexual content, such as seemingly benign photos of children in the bath, to avoid any potential for abuse.
Davis disclosed that Facebook has been working hard to develop new technology over the past year to combat child exploitation and keep children safe on the platform.
"In addition to photo-matching technology, we're using artificial intelligence and machine learning to proactively detect child nudity and previously unknown child exploitative content when it's uploaded," she said.