
Massive errors found in facial recognition tech: US study

10:55, 20 December 2019. Source: msn.com


Facial recognition systems can produce wildly inaccurate results, especially for non-whites, according to a US government study released Thursday that is likely to raise fresh doubts on deployment of the artificial intelligence technology.

Facial recognition systems are coming under scrutiny for producing inaccurate results. (Photo: gece33/iStock.com)

The study of dozens of facial recognition algorithms showed "false positive" rates for Asians and African Americans as much as 100 times higher than for whites.

The researchers from the National Institute of Standards and Technology (NIST), a government research center, also found two algorithms assigned the wrong gender to black females almost 35 percent of the time.


The study comes amid widespread deployment of facial recognition for law enforcement, airports, border security, banking, retailing, schools and for personal technology such as unlocking smartphones.

Some activists and researchers have claimed the potential for errors is too great and that mistakes could result in the jailing of innocent people, and that the technology could be used to create databases that may be hacked or inappropriately used.

The NIST study found both "false positives," in which an individual is mistakenly identified, and "false negatives," where the algorithm fails to accurately match a face to a specific person in a database.

"A false negative might be merely an inconvenience -- you can't get into your phone, but the issue can usually be remediated by a second attempt," said lead researcher Patrick Grother.




"But a false positive in a one-to-many search puts an incorrect match on a list of candidates that warrant further scrutiny."

The study found US-developed face recognition systems had higher error rates for Asians, African Americans and Native American groups, with the American Indian demographic showing the highest rates of false positives.

However, some algorithms developed in Asian countries produced similar accuracy rates for matching between Asian and Caucasian faces -- which the researchers said suggests these disparities can be corrected.

"These results are an encouraging sign that more diverse training data may produce more equitable outcomes," Grother said.

Nonetheless, Jay Stanley of the American Civil Liberties Union, which has criticized the deployment of face recognition, said the new study shows the technology is not ready for wide deployment.

"Even government scientists are now confirming that this surveillance technology is flawed and biased," Stanley said in a statement.

"One false match can lead to missed flights, lengthy interrogations, watchlist placements, tense police encounters, false arrests or worse. But the technology's flaws are only one concern. Face recognition technology -- accurate or not -- can enable undetectable, persistent, and suspicionless surveillance on an unprecedented scale."
