Google to Begin Suggesting 'Inclusive Language' Corrections For Writers Using Docs


Google is rolling out new “inclusive language” correction suggestions for writers using Google Docs.

The aim of the new “assistive writing” function is to help users avoid “politically incorrect” language, such as the words “landlord” or “mankind,” much as Docs already alerts people to spelling or grammatical issues.

“Potentially discriminatory or inappropriate language will be flagged, along with suggestions on how to make your writing more inclusive and appropriate for your audience,” Google said in a press release about the new function.

“Users typing ‘landlord’ will see a warning that it ‘may not be inclusive to all readers’ with the suggestion they should try ‘property owner’ or ‘proprietor’ instead,” the Daily Mail reports. “The word ‘humankind’ is a suggested alternative to what the online giant apparently sees as the controversial term ‘mankind’.”

Google Docs will also nudge users away from gendered terms like “policeman” or “housewife,” suggesting “police officer” or “stay-at-home spouse” instead.

The tool even flags the word “motherboard” as possibly offensive.

The new speech-policing tool is not without its flaws, however.

The Daily Mail report noted that “A transcribed interview with ex-Ku Klux Klan leader David Duke, in which he uses offensive racial slurs and talks about hunting black people, prompted no warnings. But it suggested President John F Kennedy’s inaugural address should say ‘for all humankind’ instead of ‘for all mankind’.”

In the Google Developer Documentation Style Guide, the company cautions that “when trying to achieve a friendly and conversational tone, problematic ableist language might slip in. This can come in the form of figures of speech and other turns of phrase. Be sensitive to your word choice, especially when aiming for an informal tone. Ableist language includes words or phrases such as crazy, insane, blind to or blind eye to, cripple, dumb, and others. Choose alternative words depending on the context.”

Google also recommends avoiding terms that can carry violent meanings, even when they are not being used in that sense.

“When possible, avoid the use of figurative language that can be interpreted as violent, such as hang and hit. Although there might also be nonviolent interpretations for these terms, avoiding their use prevents unintentional harm that might be caused by the violent interpretations,” the guide continues. “Avoid the use of figurative language that relates to the slaughter of animals. For example, avoid using the metaphor of pets versus cattle when comparing on-premises or stateful systems with stateless cloud systems.”

When using terms with an established meaning, such as whitelist, Google recommends that you use an acceptable term like “allowlist” instead, but note in parentheses at first mention that it is “sometimes called a whitelist” so that readers know what you are referring to.

The guide also takes aim at words used in coding — such as “master and slave.”

“The first time that you refer to a code item that uses a non-inclusive term, you can directly refer to that term, but format it in code font, and put it in parentheses if possible,” the guide says. “In subsequent mentions, use the preferred term (parent node, replica). If it’s necessary to refer to the entity name or keyword, continue doing so only with code formatting.”

The new feature is now on by default for enterprise users.

While some may welcome help in making their writing as inoffensive as possible, many point to the potential implications for free speech and privacy online.

“Google’s new word warnings aren’t assistive, they’re deeply intrusive. With Google’s new assistive writing tool, the company is not only reading every word you type but telling you what to type,” Silkie Carlo, the director of Big Brother Watch, which campaigns for the protection of civil liberties, told The Telegraph. “This speech-policing is profoundly clumsy, creepy and wrong, often reinforcing bias. Invasive tech like this undermines privacy, freedom of expression and increasingly freedom of thought.”

Lazar Radic of the International Centre for Law and Economics added, “not only is this incredibly conceited and patronising – it can also serve to stifle individuality, self-expression, experimentation, and – from a purely utilitarian perspective – progress.”

According to Google, “assisted writing uses language understanding models, which rely on millions of common phrases and sentences to automatically learn how people communicate. This also means they can reflect some human cognitive biases.”

“Our technology is always improving, and we don’t yet (and may never) have a complete solution to identifying and mitigating all unwanted word associations and biases,” the tech giant added.

Google’s is not the first writing assistant to try to push users toward being more politically correct.

Grammarly has notoriously prompted users to adopt inclusive language.

“Grammarly’s writing assistant has incorporated suggestions through the delivery dimension of Grammarly Premium that can help you stay empathetic to the LGBTQIA+ people in your life. These suggestions are works in progress, and we welcome feedback from our users about their effectiveness—we’ve made multiple adjustments to our suggestions based on just that. Our goal with these suggestions is not to force you to write a certain way but to ask you to take a moment to consider how your audience may be affected by the language you choose,” the company said in a press release about its prompts relating to the LGBTQ community.

The grammar app has even nudged users toward making sure their writing supports Ukraine, underlining sentences about the conflict in the colors of the Ukrainian flag and suggesting they read more about ways to support the nation.

“We are adding a message in our product to direct users writing about the war to resources for helping Ukraine. We have also made the decision to block users located in Russia and Belarus from using Grammarly products or services,” the company said in a press release.
