In A 'Dangerous And Sinister Step,' London Police Start Using Live Face Recognition Tech

20:00, 24 January 2020 | Source: gizmodo.com.au

Photo: Getty Images

The dystopian nightmare begins. Today, London’s Metropolitan Police Service announced it will begin deploying Live Facial Recognition (LFR) tech across the capital in the hopes of locating and arresting wanted people.

“We are using a tried-and-tested technology, and have taken a considered and transparent approach in order to arrive at this point,” Assistant Commissioner Nick Ephgrave said in a statement. “Similar technology is already widely used across the UK, in the private sector. Ours has been trialed by our technology teams for use in an operational policing environment.”

The way the system is supposed to work, according to the Metropolitan Police, is the LFR cameras will first be installed in areas where ‘intelligence’ suggests the agency is most likely to locate ‘serious offenders.’ Each deployment will supposedly have a ‘bespoke’ watch list comprising images of wanted suspects for serious and violent offences. The London police also note the cameras will focus on small, targeted areas to scan folks passing by. According to BBC News, previous trials had taken place in areas such as Stratford’s Westfield shopping mall and the West End area of London. It seems likely the agency is also anticipating some unease, as the cameras will be ‘clearly signposted’ and officers are slated to hand out informational leaflets.
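The matching step the Met describes (scan passers-by, compare each face against a bespoke watch list, raise an alert only on a likely match) can be sketched as a standard embedding-comparison loop. The Met has not published its algorithm; every function name, embedding, and threshold below is a hypothetical illustration, with cosine similarity standing in for whatever matcher the real system uses.

```python
import math

def cosine_similarity(a, b):
    # Compare two face embeddings (lists of floats) by the angle between them.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def check_against_watchlist(face_embedding, watchlist, threshold=0.8):
    """Return the best watch-list match scoring at or above `threshold`, else None.

    `watchlist` maps a suspect ID to a reference embedding. A None result
    models a passer-by whose image, per the Met, would simply be discarded.
    """
    best_id, best_score = None, threshold
    for suspect_id, reference in watchlist.items():
        score = cosine_similarity(face_embedding, reference)
        if score >= best_score:
            best_id, best_score = suspect_id, score
    return best_id  # an officer, not the system, decides whether to engage

watchlist = {"suspect-001": [0.6, 0.8], "suspect-002": [1.0, 0.0]}
print(check_against_watchlist([0.62, 0.79], watchlist))  # close to suspect-001
print(check_against_watchlist([-1.0, 0.1], watchlist))   # no match -> None
```

In this sketch a below-threshold face simply yields `None`, mirroring the Met's claim that the system only "prompts" officers rather than making decisions itself.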

The agency’s statement also emphasises that the facial recognition tech is not meant to replace policing—just ‘prompt’ officers by suggesting a person in the area may be a fishy individual...based solely on their face. “It is always the decision of an officer whether or not to engage with someone,” the statement reads. On Twitter, the agency also noted in a short video that images that don’t trigger alerts will be immediately deleted.

As with any police-related, Minority Report-esque tech, accuracy is a major concern. While the Metropolitan Police Service claims that 70 per cent of suspects were successfully identified and that only one in 1,000 people triggered a false alert, not everyone agrees the LFR tech is rock-solid. An independent review from July 2019 found that across six of the trial deployments, only eight of 42 matches were correct, an abysmal 19 per cent accuracy rate. Other problems found by the review included inaccurate watch list information (e.g., people were stopped over cases that had already been resolved), and the criteria for inclusion on the watch list weren’t clearly defined.
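The review's 19 per cent figure and the Met's one-in-1,000 figure measure different things, so both can be true at once: the first is precision (correct matches out of alerts raised), the second a false-alert rate per face scanned. A quick calculation with the article's numbers shows how a tiny per-face error rate can still produce many false alerts; the crowd size here is an illustrative assumption, not a figure from either report.

```python
# Precision across the six reviewed deployments: correct matches / total alerts.
correct_matches = 8
total_matches = 42
precision = correct_matches / total_matches
print(f"precision: {precision:.0%}")  # ~19%, the review's figure

# The Met's "one in 1,000" figure is a false-alert rate per face scanned.
false_alert_rate = 1 / 1000
faces_scanned = 50_000  # hypothetical footfall past a camera in a busy area
expected_false_alerts = false_alert_rate * faces_scanned
print(f"expected false alerts: {expected_false_alerts:.0f}")  # 50 in this crowd
```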

Privacy groups aren’t particularly happy with the development. Big Brother Watch, a privacy campaign group that’s been particularly vocal against facial recognition tech, took to Twitter, telling the Metropolitan Police Service they’d “see them in court.”

“This decision represents an enormous expansion of the surveillance state and a serious threat to civil liberties in the UK,” said Silkie Carlo, Big Brother Watch’s director, in a statement. “This is a breath-taking assault on our rights and we will challenge it, including by urgently considering next steps in our ongoing legal claim against the Met and the Home Secretary.”

Meanwhile, another privacy group, Liberty, has also voiced resistance to the measure. “Rejected by democracies. Embraced by oppressive regimes. Rolling out facial recognition surveillance tech is a dangerous and sinister step in giving the State unprecedented power to track and monitor any one of us. No thanks,” the group tweeted.

The London police’s decision comes at an interesting time. Just last week, the European Union began mulling a three-to-five-year ban on facial recognition tech in public areas. It’s unclear whether that ban will ever come to pass—the news came via a leaked version of an early draft of a European Commission white paper. It’s also not clear if it will ultimately matter, given the looming spectre of Brexit. That said, Liberty has already garnered over 22,000 signatures for a petition demanding Britain’s Home Secretary ban the use of facial recognition tech in public places.


