Offbeat: Fearful of bias, Google blocks gender-based pronouns from new AI tool

15:55, 27 November 2018 · Source: reuters.com




(© Reuters/Toby Melville) FILE PHOTO: The Google name is displayed outside the company's office in London

By Paresh Dave

SAN FRANCISCO (Reuters) - Alphabet Inc's Google in May introduced a slick feature for Gmail that automatically completes sentences for users as they type. Tap out "I love" and Gmail might propose "you" or "it."

But users are out of luck if the object of their affection is "him" or "her."

Google's technology will not suggest gender-based pronouns because the risk is too high that its "Smart Compose" technology might predict someone's sex or gender identity incorrectly and offend users, product leaders revealed to Reuters in interviews.


Gmail product manager Paul Lambert said a company research scientist discovered the problem in January when he typed "I am meeting an investor next week," and Smart Compose suggested a possible follow-up question: "Do you want to meet him?" instead of "her."

Consumers have become accustomed to embarrassing gaffes from autocorrect on smartphones. But Google refused to take chances at a time when gender issues are reshaping politics and society, and critics are scrutinizing potential biases in artificial intelligence like never before.

"Not all 'screw ups' are equal," Lambert said. Gender is a "big, big thing" to get wrong.

Getting Smart Compose right could be good for business. Demonstrating that Google understands the nuances of AI better than competitors is part of the company's strategy to build affinity for its brand and attract customers to its AI-powered cloud computing tools, advertising services and hardware.


Gmail has 1.5 billion users, and Lambert said Smart Compose assists on 11 percent of messages worldwide sent from Gmail.com, where the feature first launched.

Smart Compose is an example of what AI developers call natural language generation (NLG), in which computers learn to write sentences by studying patterns and relationships between words in literature, emails and web pages.

A system shown billions of human sentences becomes adept at completing common phrases but is limited by generalities. Men have long dominated fields such as finance and science, for example, so the technology would conclude from the data that an investor or engineer is "he" or "him." The issue trips up nearly every major tech company.
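The skew described above can be illustrated with a toy frequency-based predictor. This is a minimal sketch with an invented four-sentence corpus, not Google's actual model: a naive system that picks the most frequent pronoun seen near "investor" in training data will suggest "he" regardless of who the email is actually about.

```python
from collections import Counter

# Invented toy corpus that mimics the skew the article describes:
# male pronouns co-occur with "investor" more often than female ones.
corpus = [
    "the investor said he would call",
    "an investor asked if he could meet",
    "the investor confirmed he was ready",
    "one investor noted she had questions",
]

# Tally which pronouns appear after the word "investor" in each sentence.
pronoun_counts = Counter()
for sentence in corpus:
    words = sentence.split()
    if "investor" in words:
        for w in words[words.index("investor") + 1:]:
            if w in {"he", "she", "they"}:
                pronoun_counts[w] += 1

# A naive predictor picks the most frequent pronoun from training data,
# so the skewed corpus yields "he" even when the referent is unknown.
prediction = pronoun_counts.most_common(1)[0][0]
print(prediction)  # "he" — 3 of the 4 sentences used a male pronoun
```

Real NLG systems use neural language models rather than raw counts, but they learn from the same skewed distributions, which is why the bias surfaces at the scale of billions of sentences too.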

Lambert said the Smart Compose team of about 15 engineers and designers tried several workarounds, but none proved bias-free or worthwhile. They decided the best solution was the strictest one: Limit coverage. The gendered pronoun ban affects fewer than 1 percent of cases where Smart Compose would propose something, Lambert said.
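The "limit coverage" approach amounts to suppressing any candidate completion that contains a gendered pronoun rather than trying to guess the right one. The sketch below is an illustrative guess at that idea, with an invented function name and word list; it is not Google's implementation.

```python
import re

# Illustrative word list; a production system would need a far more
# comprehensive, language-specific set.
GENDERED_PRONOUNS = {"he", "him", "his", "she", "her", "hers"}

def filter_suggestions(suggestions):
    """Drop candidate completions that contain a gendered pronoun."""
    def is_safe(text):
        tokens = set(re.findall(r"[a-z]+", text.lower()))
        return not (tokens & GENDERED_PRONOUNS)
    return [s for s in suggestions if s and is_safe(s)]

candidates = ["Do you want to meet him?", "Do you want to set up a call?"]
print(filter_suggestions(candidates))  # only the pronoun-free suggestion survives
```

The trade-off is exactly the one Lambert describes: a blunt filter sacrifices a small fraction of otherwise useful suggestions in exchange for never making the high-cost mistake.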


"The only reliable technique we have is to be conservative," said Prabhakar Raghavan, who oversaw engineering of Gmail and other services until a recent promotion.

NEW POLICY

Google's decision to play it safe on gender follows some high-profile embarrassments for the company's predictive technologies.

The company apologized in 2015 when the image recognition feature of its photo service labeled a black couple as gorillas. In 2016, Google altered its search engine's autocomplete function after it suggested the anti-Semitic query "are jews evil" when users sought information about Jews.

Google has banned expletives and racial slurs from its predictive technologies, as well as mentions of its business rivals or tragic events.

The company's new policy banning gendered pronouns also affected the list of possible responses in Google's Smart Reply. That service allows users to respond instantly to text messages and emails with short phrases such as "sounds good."

Google uses tests developed by its AI ethics team to uncover new biases. A spam and abuse team pokes at systems, trying to find "juicy" gaffes by thinking as hackers or journalists might, Lambert said.


Workers outside the United States look for local cultural issues. Smart Compose will soon work in four other languages: Spanish, Portuguese, Italian and French.

"You need a lot of human oversight," said engineering leader Raghavan, because "in each language, the net of inappropriateness has to cover something different."

WIDESPREAD CHALLENGE

Google is not the only tech company wrestling with the gender-based pronoun problem.

Agolo, a New York startup that has received investment from Thomson Reuters, uses AI to summarize business documents.

Its technology cannot reliably determine in some documents which pronoun goes with which name. So the summary pulls several sentences to give users more context, said Mohamed AlTantawy, Agolo's chief technology officer.

He said longer copy is better than missing details. "The smallest mistakes will make people lose confidence," AlTantawy said. "People want 100 percent correct."

Yet, imperfections remain. Predictive keyboard tools developed by Google and Apple Inc propose the gendered "policeman" to complete "police" and "salesman" for "sales." Type the neutral Turkish phrase "one is a soldier" into Google Translate and it spits out "he's a soldier" in English. So do translation tools from Alibaba and Microsoft Corp. Amazon.com Inc opts for "she" for the same phrase on its translation service for cloud computing customers.

AI experts have called on the companies to display a disclaimer and multiple possible translations.
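The mitigation the experts propose can be sketched as follows. This is a hypothetical illustration, with an invented lookup table standing in for a real translation model: when the source phrase is gender-neutral, the service surfaces every gendered reading plus a disclaimer instead of silently picking one.

```python
# Invented variant table for illustration; "o bir asker" is the
# gender-neutral Turkish phrase discussed in the article.
NEUTRAL_VARIANTS = {
    "o bir asker": ["he is a soldier", "she is a soldier"],
}

def translate(phrase):
    """Return all gendered readings of a gender-neutral phrase, with a note."""
    variants = NEUTRAL_VARIANTS.get(phrase)
    if variants:
        return {
            "translations": variants,
            "note": "Source pronoun is gender-neutral; all readings shown.",
        }
    # Fallback: echo the input (a real system would translate it).
    return {"translations": [phrase], "note": None}

print(translate("o bir asker")["translations"])
```

Surfacing multiple candidates shifts the ambiguity back to the user, who actually knows the referent's gender, instead of letting the model guess.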

Microsoft's LinkedIn said it avoids gendered pronouns in its year-old predictive messaging tool, Smart Replies, to ward off potential blunders.

Alibaba and Amazon did not respond to requests to comment.

Warnings and limitations like those in Smart Compose remain the most-used countermeasures in complex systems, said John Hegele, integration engineer at Durham, North Carolina-based Automated Insights Inc, which generates news articles from statistics.

"The end goal is a fully machine-generated system where it magically knows what to write," Hegele said. "There’s been a ton of advances made but we’re not there yet."

(Reporting by Paresh Dave; Editing by Greg Mitchell and Marla Dickerson)
