Amid reckoning on police racism, algorithm bias in focus

06:15, 05 July 2020. Source: msn.com



Algorithmic bias describes systematic and repeatable errors in a computer system that create unfair outcomes, such as privileging one arbitrary group of users over others.
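What such an unfair outcome looks like can be sketched in a few lines. The numbers below are hypothetical, not drawn from any real system: two groups see the same overall match quality, but the false positives fall almost entirely on one of them.

```python
# Minimal illustration (hypothetical numbers): a system can look accurate
# overall while its errors fall disproportionately on one group of users.
def false_positive_rate(predictions, labels):
    """Share of true negatives (label 0) that the system wrongly flags (1)."""
    negatives = [p for p, y in zip(predictions, labels) if y == 0]
    if not negatives:
        return 0.0
    return sum(negatives) / len(negatives)

# Hypothetical match results for two demographic groups (1 = flagged).
group_a_preds, group_a_labels = [0]*9 + [1], [0]*9 + [1]
group_b_preds, group_b_labels = [1, 1] + [0]*7 + [1], [0]*9 + [1]

fpr_a = false_positive_rate(group_a_preds, group_a_labels)  # 0.0
fpr_b = false_positive_rate(group_b_preds, group_b_labels)  # ~0.22
```

Both groups' genuine matches are caught, yet only group B's members are wrongly flagged — the "systematic and repeatable" error the definition describes.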

Photo: Facial recognition technology is increasingly used in law enforcement, amid concerns that low accuracy for people of color could reinforce racial bias. © DAVID MCNEW

A wave of protests over law enforcement abuses has highlighted concerns over artificial intelligence programs like facial recognition which critics say may reinforce racial bias.

While the protests have focused on police misconduct, activists point out flaws that may lead to unfair applications of technologies for law enforcement, including facial recognition, predictive policing and "risk assessment" algorithms.

Photo: San Francisco and several other cities have banned the use of facial recognition by police amid concerns about accuracy, while some big tech firms have suspended sales of the technology to law enforcement. © JUSTIN SULLIVAN

The issue came to the forefront recently with the wrongful arrest in Detroit of an African American man based on a flawed algorithm which identified him as a robbery suspect.



Racial bias in a medical algorithm favors white patients over sicker black patients. The algorithm wasn't intentionally racist; in fact, it specifically excluded race. Such racism, though not driven by a hateful ideology, could have the same result as earlier segregation and substandard care.
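The proxy effect behind that result can be sketched in a few lines. The data and field names here are hypothetical, not taken from the actual study: an algorithm that prioritizes patients by historical cost never consults race, yet inherits a gap in past spending.

```python
# Hypothetical sketch: ranking patients by historical *cost* rather than by
# medical need. Race is never consulted, but cost encodes unequal past access.
patients = [
    # (patient_id, group, illness_severity, historical_annual_cost_usd)
    ("p1", "A", 7, 9000),   # same severity...
    ("p2", "B", 7, 5000),   # ...but less was historically spent on this care
]

def priority_by_cost(patient):
    """Use past spending as a stand-in for medical need (the flawed proxy)."""
    return patient[3]

ranked = sorted(patients, key=priority_by_cost, reverse=True)
# The equally sick patient from group B ends up with lower priority.
```

Two patients with identical severity receive different priorities, which is how a race-blind model can still produce a racially skewed outcome.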

Critics of facial recognition use in law enforcement say the case underscores the pervasive impact of a flawed technology.

Mutale Nkonde, an AI researcher, said that even though the idea of bias and algorithms has been debated for years, the latest case and other incidents have driven home the message.

Photo: Many algorithms designed for criminal justice were meant to eliminate bias, but analysts say the data used can merely reinforce historical trends. © Philippe HUGUEN

"What is different in this moment is we have explainability and people are really beginning to realize the way these algorithms are used for decision-making," said Nkonde, a fellow at Stanford University's Digital Society Lab and the Berkman-Klein Center at Harvard.


Surfacing and responding to algorithmic bias upfront can avert harmful impacts on users and heavy liabilities for the operators and creators of algorithms. Amazon, for example, discontinued a recruiting algorithm after discovering that it led to gender bias in its hiring.

Racial bias, even if not racial animus, is present in everybody. The human brain, as psychologist Joshua Correll of the University of Colorado, Boulder, puts it, is a "meaning-making machine," forever sorting things into categories — good and bad, safe and unsafe, happy and sad — often on the fly.

Photo: Facial recognition is used by law enforcement around the world, including in China, where activists say it may help authorities carry out human rights abuses. © CHANDAN KHANNA

Amazon, IBM and Microsoft have said they would not sell facial recognition technology to law enforcement without rules to protect against unfair use. But many other vendors offer a range of technologies.

- Secret algorithms -

Nkonde said the technologies are only as good as the data they rely on.

"We know the criminal justice system is biased, so any model you create is going to have 'dirty data,'" she said.

Daniel Castro of the Information Technology & Innovation Foundation, a Washington think tank, said, however, that it would be counterproductive to ban a technology which automates investigative tasks and enables police to be more productive.

"There are (facial recognition) systems that are accurate, so we need to have more testing and transparency," Castro said.




"Everyone is concerned about false identification, but that can happen whether it's a person or a computer."

Seda Gurses, a researcher at the Netherlands-based Delft University of Technology, said one problem with analyzing the systems is that they use proprietary, secret algorithms, sometimes from multiple vendors.

"This makes it very difficult to identify under what conditions the dataset was collected, what qualities these images had, how the algorithm was trained," Gurses said.

- Predictive limits -

The use of artificial intelligence in "predictive policing," which is growing in many cities, has also raised concerns over reinforcing bias.

The systems have been touted to help make better use of limited police budgets, but some research suggests it increases deployments to communities which have already been identified, rightly or wrongly, as high-crime zones.

These models "are susceptible to runaway feedback loops, where police are repeatedly sent back to the same neighborhoods regardless of the actual crime rate," said a 2019 report by the AI Now Institute at New York University, based on a study of 13 cities using the technology.
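The feedback loop the report describes can be illustrated with a toy simulation (all numbers hypothetical): patrols go wherever recorded incidents are highest, but recording itself depends on patrol presence, so a small initial difference compounds.

```python
# Toy simulation of a runaway feedback loop (hypothetical numbers): patrols
# are assigned where past *recorded* incidents are highest, and presence
# generates new records, so early attention keeps reinforcing itself.
def simulate(recorded, rounds=5):
    recorded = dict(recorded)
    for _ in range(rounds):
        target = max(recorded, key=recorded.get)  # patrol the "hottest" area
        recorded[target] += 1                     # more presence, more records
    return recorded

# Both neighborhoods have the same true crime rate; A just has one extra record.
history = simulate({"A": 11, "B": 10})
# After five rounds, every patrol went to A: {"A": 16, "B": 10}
```

The model never observes the true crime rate at all; it only ever sees its own past deployment decisions reflected back as data.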


There is a saying in computer science: garbage in, garbage out. When we feed machines data that reflects our prejudices, they mimic them. Does a horrifying future await people forced to live at the mercy of algorithms?


These systems may be gamed by "biased police data," the report said.

In a related matter, an outcry from academics prompted the cancellation of a research paper which claimed facial recognition algorithms could predict with 80 percent accuracy if someone is likely to be a criminal.

- Robots vs humans -

Ironically, many artificial intelligence programs for law enforcement and criminal justice were designed with the hope of reducing bias in the system.

So-called risk assessment algorithms were designed to help judges and others in the system make unbiased recommendations on who is sent to jail, or released on bond or parole.

But the fairness of such a system was questioned in a 2019 report by the Partnership on AI, a consortium whose members include tech giants Google and Facebook as well as organizations such as Amnesty International and the American Civil Liberties Union.

"It is perhaps counterintuitive, but in complex settings like criminal justice, virtually all statistical predictions will be biased even if the data was accurate, and even if variables such as race are excluded, unless specific steps are taken to measure and mitigate bias," the report said.
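One way to see the report's point is a toy model that never sees race but learns from a correlated feature. The data below is hypothetical: a majority-vote predictor keyed only on zip code still reproduces the historical pattern, because the zip code acts as a proxy.

```python
# Sketch (hypothetical data): race is excluded, but a correlated feature
# such as zip code lets a simple model reconstruct the historical disparity.
from collections import defaultdict

# (zip_code, historical_label) pairs; labels reflect past biased decisions.
training = [("90001", 1), ("90001", 1), ("90001", 0),
            ("10001", 0), ("10001", 0), ("10001", 1)]

rates = defaultdict(list)
for zip_code, label in training:
    rates[zip_code].append(label)

def predict(zip_code):
    """Predict the majority historical outcome for this zip code."""
    labels = rates[zip_code]
    return int(sum(labels) / len(labels) >= 0.5)

# Residents of 90001 are flagged, residents of 10001 are not: the model
# never saw race, but the zip code carried the pattern through anyway.
```

This is why the report says excluding variables like race is not enough, and that specific steps to measure and mitigate bias are needed.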

Nkonde said recent research highlights the need to keep humans in the loop for important decisions.


"You cannot change the history of racism and sexism," she said. "But you can make sure the algorithm does not become the final decision maker."
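In software terms, this principle is often implemented as a human-in-the-loop gate. The function names and threshold below are illustrative, not from any system cited in this article: the algorithm's score only produces a recommendation, and a person's sign-off is the deciding input.

```python
# Human-in-the-loop sketch (hypothetical names and threshold): the score
# triggers a review, but never acts as the final decision maker by itself.
def recommend(score, threshold=0.8):
    """Turn an algorithmic score into a recommendation, not an action."""
    return "flag for human review" if score >= threshold else "no action"

def final_decision(score, human_approves):
    # Action requires BOTH an algorithmic flag and explicit human approval.
    if recommend(score) == "flag for human review" and human_approves:
        return "action"
    return "no action"
```

A high score with no human approval still results in no action, which is exactly the property Nkonde argues for.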

Castro said algorithms are designed to carry out what public officials want, and the solution to unfair practices lies more with policy than technology.

"We can't always agree on fairness," he said. "When we use a computer to do something, the critique is leveled at the algorithm when it should be at the overall system."

rl/dw/acb



