How Facebook, Twitter and Google aim to combat election misinformation

15:50, 3 November 2020 | Source: cnet.com

Social networks came under fire after Russian trolls used them to sow discord among Americans during the 2016 US presidential election. Now Facebook, Twitter and Google say they're better prepared to tackle misinformation during this year's presidential election.

The 2020 presidential election is putting social media to the test. Getty Images

Facebook and Google, which owns the YouTube video-sharing service, have created databases that allow anyone to check the source of a political ad, who financed the message and how much was spent. Twitter banned political ads last year. Facebook works with third-party fact-checkers, though it exempts posts by politicians from this program. All of the social networks label problem posts, directing users to online hubs that include election information from authoritative sources. (Short-form video app TikTok, another popular social network, also launched a US elections guide and said it will reduce the spread of misleading videos.)

Social media companies say they're better prepared for the 2020 presidential election than they were four years ago. Getty Images

The efforts to combat misinformation come as social media companies weather a storm of criticism from all quarters ahead of the Nov. 3 election between President Donald Trump, a Republican, and challenger Joe Biden, a Democrat. Conservatives say social networks suppress their speech in an effort to sway the election. The companies deny the allegations. Liberals say the companies haven't done enough to stamp out fake news.

All of these companies will have their hands full after the polls close. Here are the ways the big three social media companies are trying to limit the spread of misinformation.

Facebook

  • Facebook let users turn off all political ads on both the social network's main site and its Instagram photo-sharing service ahead of the election.
  • Facebook and Instagram launched an online hub for US users with information about voting, including registration, mail-in voting and election-related deadlines. Facebook expects the hub to reach around 160 million people in the US, and CEO Mark Zuckerberg says the company helped an estimated 4.4 million Americans register to vote.
  • The social network stopped accepting new political or issue advertising during the final week of the campaign and is expanding its policies against voter suppression. Facebook will also temporarily halt all election and issue ads after Nov. 3, for an indefinite period.
  • The company will also label posts from politicians who declare premature victory.
  • Facebook is removing fake accounts designed to mislead others about their identity and purpose, including some with ties to Iran and Russia. The social network has also cracked down on accounts related to QAnon, a far-right conspiracy theory falsely alleging there's a deep state plot against Trump.
  • Facebook's Messenger and WhatsApp, both messaging apps, are limiting message forwarding.
  • Facebook temporarily suspended recommendations for new and political groups, online spaces where users gather to chat about shared interests, and Instagram disabled the "Recent" tab on hashtag pages to slow the spread of misinformation around Election Day.

Twitter

  • Twitter banned political ads, one of the strictest moves taken by a social media company.
  • Twitter may delete tweets that violate its policies, temporarily lock the accounts of offending users and suspend repeat offenders. Tweets that could violate the company's policies include messages that provide misleading information about voting, attempt to suppress or intimidate voters, provide false or misleading information about results, or fail to fully or accurately disclose the identity of the tweeter.
  • The short-message social network may also add labels to tweets with misleading information, including those from politicians who declare victory prematurely. In addition, Twitter is labeling tweets that include manipulated media, state-affiliated media, or content from politicians and government leaders that violates its rules, while leaving such tweets up because of the public interest.
  • Twitter shows users a warning to make them think twice before they share a tweet that contains disputed information. The company is also encouraging users to add a comment to retweets. Like other companies, Twitter is trying to direct people to a page with trustworthy election information.

Google

  • Google made changes to its popular search engine, blocking some suggestions its autocomplete function provides if a query is election-related. For example, if someone types the phrase "donate to," Google will block autocomplete suggestions that include the names of candidates or political parties.
  • Google will temporarily ban political advertisements after the polls close to try to prevent ads falsely claiming victory.
  • YouTube will label election videos and search results with an information panel that warns, "Results may not be final." The panel will link to a feature on Google with real-time results from the Associated Press.
  • YouTube will show people information panels on mail-in voting when they watch videos that discuss the subject. (The ballot-casting method has become fraught with misinformation as Trump has tried to discredit the process, while providing no evidence of security flaws in the time-tested system.)
  • YouTube banned some videos pushing false conspiracies such as QAnon, pledging to remove content that "targets an individual or group with conspiracy theories that have been used to justify real-world violence."
  • YouTube banned videos containing information that was obtained through hacking and could interfere with elections or censuses.
