Federal study of top facial recognition algorithms finds ‘empirical evidence’ of bias

17:05, 20 December 2019 · Source: theverge.com

A new federal study has found that many of the world’s top facial recognition algorithms are biased along lines of age, race, and ethnicity. According to the study by the National Institute of Standards and Technology (NIST), algorithms currently sold in the market can misidentify members of some groups up to 100 times more frequently than others.

Illustration by James Bareham / The Verge

NIST says it found “empirical evidence” that characteristics such as age, gender, and race impact accuracy for the “majority” of algorithms. The group tested 189 algorithms from 99 organizations, which together power most of the facial-recognition systems in use globally.

The findings provide yet more evidence that many of the world’s most advanced facial recognition algorithms are not ready for use in critical areas such as law enforcement and national security. Lawmakers called the study “shocking,” The Washington Post reports, and called on the US government to reconsider plans to use the technology to secure its borders.

Lawmakers called the results “shocking”

The study tested “one-to-one” matching, used to check someone against a passport or ID card, as well as “one-to-many” searching, where someone is matched against a single record in a larger database. African American women were misidentified most frequently in one-to-many searches, while Asian, African American, Native American, and Pacific Islander people all had higher error rates in one-to-one matching. Children and the elderly were also falsely identified more often. In some cases, Asian and African American people were misidentified as much as 100 times more often than white men. Accuracy was generally highest for middle-aged white men.
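To make the two search modes concrete, here is a minimal sketch (not drawn from the NIST study; all names, embeddings, and thresholds are invented for illustration) of how one-to-one verification differs from one-to-many identification over face embeddings:

```python
# Illustrative sketch: "one-to-one" verification vs. "one-to-many"
# identification over toy face embeddings. The embeddings, names, and
# threshold are made up; real systems use learned high-dimensional vectors.

def cosine_similarity(a, b):
    """Similarity between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b)

def verify(probe, enrolled, threshold=0.9):
    """One-to-one: does the probe match this single enrolled record
    (e.g. a passport photo)?"""
    return cosine_similarity(probe, enrolled) >= threshold

def identify(probe, gallery, threshold=0.9):
    """One-to-many: return the best-scoring identity in a gallery if it
    clears the threshold, else None (no match)."""
    best_id, best_score = None, threshold
    for person_id, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score >= best_score:
            best_id, best_score = person_id, score
    return best_id

gallery = {
    "alice": [1.0, 0.0, 0.0],
    "bob":   [0.0, 1.0, 0.0],
}
probe = [0.95, 0.05, 0.0]  # a noisy capture of "alice"

print(verify(probe, gallery["alice"]))  # True
print(identify(probe, gallery))         # alice
```

The study's relevance to this split is that false-positive rates differ by mode: a demographic group can fare worst in one-to-many searches (where one wrong high score in a large gallery suffices) while a different pattern shows up in one-to-one checks.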

The majority of commercial facial-recognition systems exhibit bias, according to a study from a federal agency released on Thursday, underscoring questions about a technology increasingly used by police departments and federal agencies to identify suspected criminals.

The NIST study relied on organizations voluntarily submitting their algorithms for testing. But missing from the list was Amazon, which sells its Rekognition software to local police and federal investigators. Previous studies have raised concerns about the accuracy of Amazon’s system, and AI researchers have called on the company to stop selling its “flawed” software. Amazon claims that its software cannot be easily analyzed by NIST’s tests (despite the fact that tech companies with similar products have had no trouble submitting their algorithms), and its shareholders have resisted calls to curb sales of Rekognition.

Experts say bias in these algorithms could be reduced by using a more diverse set of training data. The researchers found that algorithms developed in Asian countries, for example, did not have as big a difference in error rates between white and Asian faces.

However, even fixing the issue of bias won’t solve every problem with facial recognition if the technology is used in ways that don’t respect people’s security or privacy.

“What good is it to develop facial analysis technology that is then weaponized?” Joy Buolamwini, an AI researcher who has spearheaded investigations into facial recognition bias, told The Verge last year. “The technical considerations cannot be divorced from the social implications.”
