Google is fixing gender bias in its Translate service
Fearful of bias, Google blocks gender-based pronouns from new AI tool
Alphabet Inc's Google in May introduced a slick feature for Gmail that automatically completes sentences for users as they type. Tap out "I love" and Gmail might propose "you" or "it." But users are out of luck if the object of their affection is "him" or "her." Google's technology will not suggest gender-based pronouns because the risk is too high that its "Smart Compose" technology might predict someone's sex or gender identity incorrectly and offend users, product leaders revealed to Reuters in interviews.
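As a rough illustration of the kind of safeguard described above (a guess at the general approach, not Google's actual Smart Compose code; the candidate suggestions and names below are invented), a completion pipeline can simply drop any suggestion that contains a gendered pronoun:

```python
# Hypothetical sketch: suppress autocomplete suggestions that contain gendered
# pronouns rather than risk guessing a user's gender incorrectly.
GENDERED_PRONOUNS = {"he", "him", "his", "she", "her", "hers"}

def filter_suggestions(candidates: list[str]) -> list[str]:
    """Keep only completions that contain no gendered pronoun."""
    return [c for c in candidates
            if not set(c.lower().split()) & GENDERED_PRONOUNS]

print(filter_suggestions(["you", "it", "him", "her so much"]))
# ['you', 'it']  -> the gendered completions are withheld entirely
```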

Google Translate has previously displayed signs of gender bias by assigning genders to certain adjectives and words describing occupations. Thankfully, the company’s rolling out an update to fix this.
The company said that after the update, Google Translate will provide both feminine and masculine translations for gender-neutral words:
Historically, it has provided only one translation for a query, even if the translation could have either a feminine or masculine form. So when the model produced one translation, it inadvertently replicated gender biases that already existed. For example: it would skew masculine for words like “strong” or “doctor,” and feminine for other words, like “nurse” or “beautiful.”
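To see how a single-output system ends up skewed, here is a minimal toy sketch (purely illustrative, not Google's model; the corpus counts and names are invented) in which the translator picks whichever English pronoun co-occurs with a profession most often in its training data:

```python
# Toy single-output "translator" for the gender-neutral Turkish pronoun "o":
# it chooses the pronoun with the higher co-occurrence count in a biased corpus.
# All counts below are invented for the illustration.
corpus_counts = {
    "doctor": {"he": 920, "she": 80},
    "nurse":  {"he": 60,  "she": 940},
}

def translate_o_is(profession: str) -> str:
    counts = corpus_counts[profession]
    pronoun = max(counts, key=counts.get)   # one output: the majority pronoun always wins
    return f"{pronoun} is a {profession}"

print(translate_o_is("doctor"))  # "he is a doctor"
print(translate_o_is("nurse"))   # "she is a nurse"
```

Because the majority pronoun always wins, any imbalance in the training corpus is reproduced in every single translation, which is the behaviour the update replaces with paired feminine and masculine outputs.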
Google CEO Sundar Pichai will face Congress next week
Google CEO Sundar Pichai will testify in front of the House Judiciary Committee on Dec. 5 at 10 a.m., according to a release from the committee. Pichai will be asked about potential bias on the platform and transparency around Google's practices. Google previously declined to send the CEO to a hearing with the Senate Select Committee on Intelligence.
Right now, the update applies to translations from English into French, Italian, Portuguese, or Spanish. Additionally, when you translate from a gender-neutral language like Turkish, you'll get two results: a feminine and a masculine translation.
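As a sketch of what those dual outputs look like (an illustration only, not Google's implementation; the tiny Turkish-English lexicon and helper below are invented), a translator can return both renderings whenever the source pronoun carries no gender:

```python
# Illustrative sketch: return both gendered renderings for a gender-neutral
# source pronoun instead of silently picking one. The lexicon is invented.
GENDER_NEUTRAL_PRONOUNS = {"o"}  # Turkish third-person pronoun, carries no gender
PROFESSIONS = {"doktor": "doctor", "hemşire": "nurse"}

def translate(sentence: str) -> list[str]:
    """Translate simple 'o bir doktor'-style Turkish sentences into English."""
    pronoun, _, profession = sentence.split()        # e.g. "o bir doktor"
    prof_en = PROFESSIONS[profession]
    if pronoun in GENDER_NEUTRAL_PRONOUNS:
        # New behaviour: surface both translations, labelled by gender.
        return [f"she is a {prof_en} (feminine)",
                f"he is a {prof_en} (masculine)"]
    return [f"{prof_en} (translation not handled in this sketch)"]

print(translate("o bir doktor"))
# ['she is a doctor (feminine)', 'he is a doctor (masculine)']
```

Surfacing both options moves the gender decision from the model's corpus statistics to the reader, which is the substance of the change described above.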
You can try out Google Translate on the web at translate.google.com to test the company's claims. The search giant said that it'll roll out these improvements to Translate's iOS and Android apps soon.
The Mountain View company noted in its blog post that it's determined to remove gender bias from its products, and that it's working on improving autocomplete for search queries next. Recently, Google removed some autocomplete suggestions from Gmail's Smart Compose because they leaned toward one gender.
It's good to see companies working to remove gender bias from technology products. While such biases are essentially a reflection of how humans use these tools, experts have warned that biased systems can hardwire sexism into the people who use them – so it's important to adjust for that and build technologies that serve humanity better.
New bug prompts earlier end to Google+ social network.
Google said Monday it will close the consumer version of its online social network sooner than originally planned due to the discovery of a new software bug. The Google+ social network will close in April -- four months earlier than planned -- and the internet giant will focus on operating a version tailored for businesses, according to G Suite product management vice president David Thacker. Application programming interfaces (APIs) used by developers to access Google+ data will be shut down within 90 days, according to Thacker.
Topical videos:
Iris Bohnet: "What Works: Gender Equality by Design" | Talks at Google
Iris Bohnet visited Google's office in Cambridge, MA to discuss her book "What Works: Gender Equality by Design". Gender equality is a moral and a business ...
Sara Wachter-Boettcher: "Technically Wrong: Sexist Apps, Biased Algorithms [...]" | Talks at Google
A revealing look at how tech industry bias and blind spots get baked into digital products—and harm us all. Many of the services we rely on are full of oversights, ...
Similar from the Web:
Enter Google Translate, the automated service that makes so much of the web comprehensible to so many of us. Its own biases seem to have been demonstrated by a series of tweets showing Google Translate gendering professions in a way that can only be described as problematic.
The Algorithm That Helped Google Translate Become Sexist
The result is services like language translation spitting those biases back out in subtle but worrisome ways. Earlier this year, for instance, examples of gender bias started cropping up on social media with Google Translate. Try translating terms into English from Turkish, which has gender-neutral…
Google Translate's Gender Problem (And Bing Translate's, And…)
Google Translate and other popular translation platforms often provide unintentionally sexist translations… To Google's credit, Mountain View regularly tweaks Google Translate's algorithms to fix… If source material used for translations has an aggregated bias in terms of one gender being…
Why gender-bias is good in Google Translate – Attila Ulbert – Medium
Google Translate was accused of sexism in a Turkish and a Finnish tweet recently because, according to the authors of the tweets, when it translated certain… Also, she states that "it does not matter which way the gender bias goes, but the fact that the algorithm is built this way, does matter."
Google Translate's gender bias pairs "he" with "hardworking" and…
Gender by Google Translate: he is a soldier, she's a teacher, he is a doctor, she is a nurse. In a way, this is not Google's fault. The algorithm is basing its translations on a huge corpus of… In Estonian, Google Translate converts "[he/she] is a doctor" to "she," so perhaps there is less cultural bias in…
Google Translate's algorithm has a gender bias
When Google Translate translates from Turkish to English, it has to guess which pronoun to use, and it tends to guess that the sentence is referring to a "he." However, the algorithm isn't completely at fault, as it's mostly reflecting a cultural bias that already exists. In Estonian, the Google algorithm…
Learn how to use the impressive Google Translate | Daily Mail Online
Google Translate can translate over 100 languages. But issues with translating gender-neutral… Is Google Translate SEXIST? Users report biased results when translating gender-neutral… Several users have taken to Twitter to complain about Google's sexist translations in its Translate tool.
A Fix for Gender Bias in Health Care? Check
Gender bias has received significant attention in recent years and has been scrutinized as a factor in the dearth of female chief executives, the treatment of… Indeed, structuring decision-making in order to root out bias is already gaining traction in business: companies like Google and Slack have begun…
AI Translate: Bias? Sexist? Or is this the way it should be?
In short, when translators (Google Translate, Microsoft Translator, etc.) translate a sentence from a gender-neutral language (e.g. Turkish) into a non-gender-neutral language (e.g. English), they make a guess about the gender (not a random guess, but a fact/trained guess). This behaviour has gotten my attention.
In 2017, society started taking AI bias seriously
Google Translate converted the gender-neutral Turkish terms for certain professions into "he is a doctor" and "she is a nurse" in English. From the ridiculous to the chilling, algorithmic bias -- social prejudices embedded in the AIs that play an increasingly large role in society -- has been exposed for…