Inside Facebook's efforts to stop revenge porn before it spreads

20:22, 18 November 2019 · Source: nbcnews.com


Michaela Zehara, 22, was going through her Instagram three years ago when an account bearing her name, photo and phone number started following her.

Image: Facebook has built a team to combat non-consensual sharing of intimate photos and videos. © Chelsea Stahl

"I just had a gut feeling something bad was about to happen," Zehara, a Los Angeles-based fitness trainer and aspiring actor, told NBC News.

Her gut was right. Minutes later, friends and family members started messaging her saying that the account was uploading photos of her naked body that she had shared with her boyfriend at the time.

"My vagina, breasts, butt. Everything," she said. "I threw up, I started crying. For a moment I was suicidal; death sounded a little bit more fun than this."


Instagram took the images down in about 20 minutes, after Zehara and dozens of her friends had reported them. But the damage was already done.

"I don't know who screen-shotted them," she said. "And he still has the pictures. It could happen again. He has that hanging round my neck."

Zehara was the victim of revenge porn, a form of invasion of sexual privacy and online harassment where the perpetrator — usually a disgruntled ex-partner — posts or threatens to post intimate photos without consent, often with the aim of shaming the subject.

To combat this problem, Facebook has built a team of about 25 people, not including content moderators, working full-time to fight the nonconsensual sharing of intimate photos and videos. Each month, Facebook, which owns Instagram, has to assess about half a million reports of revenge porn and "sextortion," a source familiar with the matter said. The team's goal is not only to quickly remove pictures or videos once they have been reported, as happened in Zehara's case, but also to detect the images using artificial intelligence at the moment they are uploaded, to prevent them from being shared at all.



The team's work is complex and culturally nuanced, involving a wide variety of images, vulnerable people and time sensitivity. It's a problem that requires a human touch on the individual level, but that only an automated system can tackle at the necessary scale.

In interviews with NBC News, members of Facebook's team tasked with clamping down on revenge porn spoke publicly about their work for the first time. They recounted a number of missteps, including a poorly communicated pilot program inviting people to pre-emptively submit their nude photos to Facebook. They described how recent research across eight countries highlighted the cultural variation in what counts as an "intimate" image, which makes it more difficult for artificial intelligence to identify them. And they wrestled with what all that means for developing tools to quickly and effectively take these images down.


"In hearing how terrible the experiences of having your image shared was, the product team was really motivated in trying to figure out what we could do that was better than just responding to reports," said Radha Plumb, head of product policy research at Facebook.

But she noted that the problem extends beyond the company. There will always be "malicious actors" in the world who will "figure out how to hurt people in ways that are very hard to predict or prevent," she said.

Facebook's fight against revenge porn is just one piece of the broader challenge technology platforms face as they grapple with scalable content moderation solutions for ugly human behavior. From hate speech and violence to terrorist propaganda and conspiracy theories, companies including Facebook, Google and Twitter are all trying to teach artificial intelligence how to identify objectionable material.

If Facebook can do that with revenge porn, it could revolutionize this battle. But its efforts so far show just how difficult this will be.

Facebook's 'greater responsibility'

The problem of revenge porn is not confined to Facebook, as images can be posted elsewhere on the web, on pornography sites, for example, and may appear in search engine results. Some websites have specialized in what tech companies and victims' advocates call "nonconsensual intimate images," although the legal risk of doing so has grown. In 2015, a man who operated one such site was sentenced to 18 years in prison in California. (There is no federal law criminalizing the nonconsensual sharing of intimate images, although 46 states have such laws.)


Facebook, though, can have a huge impact on victims because it's where their real-life connections might see an image.

"Specialty websites show up in Google searches, but unless you are looking for them no one is going to see you nude," said Katelyn Bowden, founder of the victim advocacy group BADASS (Battling Against Demeaning and Abusive Selfie Sharing). "On Facebook and Instagram you have your family, friends, co-workers, bosses and your real name. Everybody is going to see."

Alex Stamos, a former head of security at Facebook, added that social platforms have a "civic responsibility" to focus on revenge porn.

"Because this type of abuse involves sending images to strangers in the social network of your victim, any app that lets you look up who is in someone's social network has a greater responsibility," he said.

While other platforms, such as Twitter, TikTok and Snap, prohibit users from posting intimate images, Facebook is alone in developing tools to prevent them from being shared in the first place.

In November 2017, Facebook launched a pilot in Australia inviting users to pre-emptively send the company their nude or intimate images. The idea was that Facebook could then block any attempts to distribute those images on the platform without the subject's consent.


The media reaction to the announcement, which CBS headlined "Facebook: send us your naked photos to stop revenge porn," was, at best, mockery, with prominent publications describing it as "idiotic" and highlighting the risks of sharing one's naked selfies with the social network. Some expressed concerns that a human content reviewer would look at the images before they were converted into indecipherable digital fingerprints, a process known as "hashing."

But some victims and support groups responded more positively. They saw the pilot as a way to claw back some control from people who threaten to share images.

"It's one way to ensure your photos won't be reposted," said Danielle Citron, vice president of the Cyber Civil Rights Initiative, a nonprofit dedicated to fighting nonconsensual porn. "It doesn't mean they won't appear on Pornhub, but it's better to have something you feel like you can do. You feel helpless and your sexual identity is ripped from your control. It's psychologically devastating."

Citron said she found the media coverage of the pilot "really frustrating."

"Facebook was unfairly attacked for something that was coming from a victim-centered place," she said.

What is considered 'intimate'?

The negative response to the pilot made Facebook officials realize that they needed to know more about the problem. The following year, 2018, Facebook launched a research program, detailed here for the first time, to explore how it could better support revenge porn victims and prevent images from being shared.


Company researchers interviewed victim support groups in the U.S., Brazil, the U.K., Bulgaria, the West Bank, Denmark, Kenya and Australia. They also interviewed five men and five women in the U.S. who had reported revenge porn to the company. Among them were a young man whose ex-girlfriend had posted a naked image of him to Facebook that was seen by many of his connections, and a young woman who had exchanged naked photos with someone she didn't know offline, who then started threatening to release the photos to her family members if she didn't give him money.

The victims and advocacy groups told researchers that the existing reporting process was confusing and insensitive, particularly at a time of high stress.

"People who aren't familiar with various types of gender-based violence or online abuse can sometimes assume it's only online, so how bad can it be? But it's a really traumatic experience," said Plumb, head of product policy research at Facebook. "Some victims talked about having suicidal thoughts or living in constant fear about their personal and professional reputation. It changes how they view the world around them, causing them to live in fear and paranoia about what other information could be shared without their consent."

Facebook's research highlighted significant geographical, cultural and religious differences in the types of images that are considered "intimate."

"Originally our policy focused on nudity that was shared without consent, but we found that there are images that may be shared without consent that wouldn't violate our nudity policy but are used to harass somebody," said Antigone Davis, Facebook's head of global safety.

Image: Facebook's head of global safety Antigone Davis speaks at a roundtable with first lady Melania Trump on March 20, 2018. © Evan Vucci

Davis gave the example of a woman in India who reported a photo in which she was fully clothed in a pool with a fully clothed man.


"Within her culture and family that would not be acceptable behavior," Davis said. "The photo was shared intentionally with her family and employer to harass her."

The consequences for victims can be extreme. Some advocacy groups noted that their clients face honor killings, disownment by their family or physical abuse.

How AI can help

Based on the research, Facebook has tried to train its artificial intelligence applications to recognize a wide variety of images as potential revenge porn.

Facebook's systems scan each post for clues, some of them subtle. For example, a laughing-face emoji and a phrase like "look at this" are two potential indications that an image could be revenge porn, according to the company.

Once the algorithms flag an image, it's sent for review by humans.
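
As a rough illustration of how signals like these might be combined before a post is routed to a human reviewer, here is a minimal, hypothetical Python sketch. The phrase list, emoji, classifier score and threshold are all assumptions made for the example; this is not Facebook's actual system.

```python
# Illustrative sketch only: a simplified, hypothetical combination of a
# caption cue and an image-classifier score, not Facebook's real pipeline.

from dataclasses import dataclass

# Hypothetical "vengeful context" cues of the sort the article describes.
SUSPICIOUS_PHRASES = ("look at this", "check this out")
MOCKING_EMOJI = ("\U0001F602",)  # laughing "face with tears of joy"


@dataclass
class Post:
    caption: str
    nudity_score: float  # assumed output of a nude/near-nude image classifier, 0.0-1.0


def should_flag_for_review(post: Post) -> bool:
    """Return True if the post should be queued for human review."""
    caption = post.caption.lower()
    has_phrase = any(p in caption for p in SUSPICIOUS_PHRASES)
    has_emoji = any(e in post.caption for e in MOCKING_EMOJI)
    # Require both a likely nude or near-nude image and a vengeful-context cue,
    # mirroring the "nude or near-nude" plus "vengeful context" goal quoted below.
    return post.nudity_score > 0.8 and (has_phrase or has_emoji)


if __name__ == "__main__":
    example = Post(caption="look at this \U0001F602", nudity_score=0.93)
    print(should_flag_for_review(example))  # True -> send to human reviewers
```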

"Our goal is to find the vengeful context, and the nude or near-nude, and however many signals we need to look at, we'll use that," said Mike Masland, Facebook's product manager for fighting revenge porn.

Artificial intelligence systems require large amounts of data to learn to distinguish images. To get enough examples, Facebook says it turned to a readily available source: nude and near-nude images uploaded to Facebook that were already flagged by the company's own human reviewers.

As more images are reported, the AI may struggle to keep up, but the additional examples should also help it improve.

"It will evolve," Masland said.

But some are skeptical that AI can effectively identify these images.

"Humans already struggle with determining intent," Sarah T. Roberts, an assistant professor at UCLA who studies commercial content moderation, said. "So how can AI, which is largely based on abstracted patterns of human behavior, be better positioned to know?"

'A big game of whack-a-mole'

One of the people Facebook consulted when developing tools to combat revenge porn was Katelyn Bowden from BADASS.

It was in April 2017 that Bowden discovered that nude images of her had been posted to a website known for sharing revenge porn.

"My immediate reaction was panic, embarrassment and shock," said Bowden, who was a bartender in Youngstown, Ohio, at the time. "The shock evolved into depression."

Bowden discovered that the only way to get her photos removed was to copyright them and then issue takedown notices to websites. But they started popping up on other sites like 4Chan and Discord, a chat platform for gamers.

"It was a big game of whack-a-mole," she said.


Soon Bowden started to connect with other victims she found online to help them get their photos taken down, and she created a Facebook group for what she called the "BADASS Army," which accumulated 3,000 members in 18 months.

In mid-2017, Bowden received a message from Antigone Davis, inviting her to Facebook's offices in Menlo Park, California, and Seattle to talk to the teams about the experiences of her "army."

Bowden said she met with some of Facebook's specialist content reviewers, responsible for checking images flagged as revenge porn. They had backgrounds in sex trafficking and child exploitation investigations, she said.

"I believe they are taking it seriously, but they are looking for a technical solution to a human problem," she said, suggesting that the company invest in more human moderators.

Robotic responses

Bowden and other leaders of victim support groups consulted by Facebook want the social network to take a far more personal approach with victims.

"Facebook doesn't seem to have a whole lot of empathy," Bowden said of the language the company uses in its policies and reporting systems.

Plumb said this was one of the most common pieces of feedback offered during Facebook's months of research. Victim support groups said that the language Facebook used did not convey the severity of the situation and at times could be perceived as victim blaming.

In response, Facebook has altered the language it uses on its site, in policy guidelines and in reporting tools, with the goal of making victims feel supported, not judged.

For example, it deleted a line on the customer support page dedicated to nonconsensual intimate images that stated: "The safest thing you can do is never share something you wouldn't want other people seeing."

"This is definitely true but not that helpful for victims after the fact," Plumb said.

Facebook has also updated the reporting process so that victims can file a revenge porn complaint on a single page, with clear instructions for gathering the evidence Facebook needs to take action.

On the back end, the company has made this type of material a priority in its content moderation queues. Anything flagged as revenge porn is now treated with a similar level of urgency as content related to self-harm.
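
For a sense of what that kind of prioritization looks like in code, here is a minimal, hypothetical sketch of a moderation queue in which revenge porn reports are pulled ahead of lower-priority categories. The category names and priority values are assumptions for the example, not Facebook's actual infrastructure.

```python
# Illustrative sketch only: a hypothetical priority queue for a moderation
# backlog, where some report categories are reviewed before others.

import heapq
import itertools

# Assumed priority levels: lower number = reviewed sooner.
PRIORITY = {
    "self_harm": 0,
    "revenge_porn": 0,  # treated with similar urgency, per the article
    "nudity": 2,
    "spam": 3,
}

_counter = itertools.count()  # tie-breaker so equal priorities stay first-in, first-out


def enqueue(queue: list, report_type: str, report_id: str) -> None:
    """Add a report to the review queue, ranked by its category."""
    heapq.heappush(queue, (PRIORITY.get(report_type, 5), next(_counter), report_id))


def next_report(queue: list) -> str:
    """Pop the most urgent report for a human reviewer."""
    _, _, report_id = heapq.heappop(queue)
    return report_id


if __name__ == "__main__":
    q: list = []
    enqueue(q, "spam", "r1")
    enqueue(q, "revenge_porn", "r2")
    enqueue(q, "nudity", "r3")
    print(next_report(q))  # "r2" -> the revenge porn report is reviewed first
```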

If Facebook determines that a user shared intimate images with malicious intent, both the content and the sharer's account are removed.

Victims and advocates say Facebook needs to do more.

Two victims who spoke to NBC News said their photos had been shared by their harassers in group chats on Messenger to which they had no access. This meant they couldn't view or report the content.

Amanda, 32, a stay-at-home mom from Lexington, Kentucky, said that even when her harasser did include her in a group chat, Facebook was not responsive. In September, she reported that he had shared photos of her breasts in group chats along with messages describing her as a "whore" and a "slut," but weeks later the images are still there.

Bowden said that members of her BADASS community regularly complain about Instagram's unresponsiveness.

"Instagram is awful," Bowden said. "They take forever to respond and they are not shutting down accounts that are bad."

A Facebook spokeswoman said that because Instagram shares the same policies and content reviewers as Facebook, it should be enforcing the rules evenly.

However, neither Instagram nor Messenger has specific language in its reporting flows to allow users to flag content as revenge porn. Instead, users must flag it as nudity or harassment, which means it won't be given top priority.

The 'next frontier'

After a rocky start, Facebook has expanded the tool that lets people submit their intimate photos pre-emptively to the U.K., U.S., Canada, Pakistan and Taiwan. A Facebook spokeswoman said the tool will launch in additional countries in Europe, the Middle East and Latin America in the coming months.

But for some victims, the prospect of sharing their intimate photos with anyone when they are feeling so vulnerable is terrifying.

In the summer of 2017, a disgruntled ex-partner of Nicole Brzyski posted intimate photos of her and her husband, along with their names and other biographical details, to several sites dedicated to nonconsensual intimate imagery.

"I thought my world was crumbling," Brzyski said. "It took a long time to accept it's not my fault."

Brzyski, now a paralegal and an advocate for others who have experienced online abuse, has spent hours making takedown requests to these websites. She has used her experience in digital marketing to create blogs, social media profiles and websites in an effort to drown out the revenge porn from the first page of Google's results for her name.

She hasn't, however, submitted her photos to Facebook for proactive removal.

"It sounds like it would be a helpful tool, but then you start thinking about sharing this photo with other people. We are already so uncomfortable and traumatized. I never did it because I was so afraid these photos would somehow be leaked again. Who knows who is looking at them?"

Plumb said this type of feedback is not uncommon, and that Facebook wasn't clear enough in its initial explanation of the pilot. Victims didn't understand how the matching process worked, who at Facebook would have access to the photos and whether there was a risk of their images being hacked or leaked.

Images submitted to Facebook are viewed briefly by a content moderator trained to deal with safety issues to ensure they are in fact intimate images before they are converted into digital fingerprints that can be used to prevent any subsequent posting of the image on Facebook, Instagram and Messenger. The process is similar to the one used to remove child sex abuse imagery from the web.

Facebook deletes the original image seven days after converting it to the indecipherable fingerprint. This means Facebook doesn't maintain a database of intimate photos that might be vulnerable to hacking or abuse — a fear articulated by some victims in Facebook's research.
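
To illustrate the general idea of matching uploads against a stored fingerprint rather than a stored photo, here is a minimal sketch using the open-source Pillow and imagehash libraries. It shows perceptual hashing in general terms; the hash function, distance threshold and file names are assumptions for the example, and the article does not disclose the hashing technology Facebook actually uses.

```python
# Illustrative sketch only: hash-based image matching in principle,
# not Facebook's production system.

import imagehash
from PIL import Image


def fingerprint(path: str) -> imagehash.ImageHash:
    """Convert an image into a compact perceptual hash ("digital fingerprint")."""
    return imagehash.phash(Image.open(path))


def matches_known_hash(upload_path: str, known_hashes, max_distance: int = 6) -> bool:
    """Check an uploaded image against stored fingerprints.

    Only the hashes need to be retained; the original photo can be deleted,
    which is the point the article makes about not keeping a database of
    intimate images.
    """
    upload_hash = fingerprint(upload_path)
    # Subtracting two ImageHash objects gives the Hamming distance between them;
    # a small distance means the images are visually very similar.
    return any(upload_hash - h <= max_distance for h in known_hashes)


# Hypothetical usage: hash a submitted image once, keep only the hash,
# then screen future uploads against it.
# known = {fingerprint("submitted_photo.jpg")}
# blocked = matches_known_hash("new_upload.jpg", known)
```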

In the U.K., the Revenge Porn Helpline, a nonprofit dedicated to helping those dealing with nonconsensual intimate images, pointed more than 400 people to Facebook's takedown tool over the course of a year.

"The relief that victims feel when they know their images can't be shared in this way is immense," Sophie Mortimer, manager of the Revenge Porn Hotline, said.

Mortimer said that Facebook's proactive approach stands out, compared to other platforms' reactive approaches.

"We would love to see a similar attitude to prevention elsewhere," she said.

Facebook's Davis said that's the "next frontier." The company hopes to collaborate with others in the industry, such as Twitter, YouTube, Microsoft, Snap and Reddit, in the same way that it did to tackle terrorist propaganda.

"What you see is that we will shut this type of content down on our platform, but then people will hop to other platforms and spaces," Davis said. "What would be great across industry is for us to share intelligence to disable someone from moving from one platform to another."
