
Facebook and Instagram removed millions of posts for violating their rules

20:35, 13 November 2019. Source: cnet.com

Instagram plans to restrict posts promoting weight loss products

Influencers promoting diet teas, supplements and certain cosmetic surgeries may have their posts hidden or removed completely.

Facebook said Wednesday it removed millions of posts for violating its rules against hate speech, child nudity and other offensive content from April to September. For the first time, the social network also released data about content taken down from Instagram, the photo app it owns.

Facebook and Instagram logos. Angela Lang/CNET

During the second and third quarters, Facebook removed 58 million posts for adult nudity and sexual activity, 5.7 million posts for harassment and bullying, and 11.4 million posts for hate speech, according to its biannual community standards enforcement report.

Facebook settles court case with company selling fake Instagram likes

Instagram has been struggling recently with an onslaught of spam comments, fake likes and fake follows. Now Facebook, which owns the service, has won a small victory against the spammers. The social media giant has settled a court case with a New Zealand company that sold fake likes on the platform. The target of the court case, Social Media Series Limited, was accused of violating cybersecurity laws by taking payments in exchange for artificially promoting posts and accounts. Its activities were estimated to bring in $9.4 million over its period of operation, according to Facebook.

The data highlights how the world's largest social network is still taking action against millions of posts that flow through its site and Instagram even as Facebook CEO Mark Zuckerberg pushes for free expression.

From July to September, Facebook removed 11.6 million pieces of content for violating its rules against child nudity and sexual exploitation, up from nearly 7 million in the previous quarter. On Instagram, more than 753,000 posts about child nudity and sexual exploitation were taken down in the third quarter.

Facebook logo. Angela Lang/CNET

The company attributed the rise in these takedowns to improvements in how it detects and removes content, including how Facebook stores digital fingerprints, called "hashes," of material that runs afoul of its rules against child nudity and sexual exploitation.

Apple removes Instagram stalking app from App Store

Like Patrol promised to keep tabs on partners' activity on Instagram. Apple said the app violated its guidelines. Like Patrol didn't respond to a request for comment. The app first appeared on Apple's store in July and doesn't have a version on Android. It charged people up to $80 a year to spy on their partners and had fewer than 300 people signed up in October, founder Sergio Luis Quintero said. The app isn't classified as stalkerware, which abusive partners use to keep track of private information like location data, call logs, text messages and contacts.

Guy Rosen, Facebook's vice president of integrity, also said in a blog post that the company has improved how it detects hate speech, allowing it to remove these posts before people even see them. That includes identifying images and text the company has already pulled down for violating this policy.

"While we are pleased with this progress, these technologies are not perfect and we know that mistakes can still happen," Rosen said in the post.

The company also included new data about how it handles suicide and self-injury content, as well as terrorist propaganda.

Facebook pulled down 4.5 million posts for depicting suicide and self-injury from April to September. On Instagram, it took down 1.7 million of these posts for violating the same policy.

Facebook's report also included more details about how much content it pulled down in the wake of the Christchurch terrorist attack.

In March, a gunman who killed 50 people at two mosques in Christchurch, New Zealand, used Facebook to livestream the attacks. From March 15 to September 30, Facebook removed about 4.5 million posts related to the attack. The company said it identified about 97% of those posts before users reported them.

Facebook, Instagram ban influencers from promoting guns and vaping.
Facebook and Instagram already ban ads for guns and e-cigarettes, but now they're shutting down a loophole that let merchants pitch the products regardless. The social networks have announced that they're banning "branded content" (read: influencer posts) that promotes weapons, tobacco and vaping. You'll also see "special restrictions" on posts that market products like alcohol and diet supplements. Enforcement of the new rules should take effect in the "coming weeks," Facebook said. It's also working on tools to help creators honor the new policy, such as setting minimum age requirements for their content.
