Seeking to offer its users a better experience, Google Docs has released a series of (supposed) improvements, among them assisted writing. The idea is that the feature helps us write better and faster. It is no longer just about spelling, grammar, and style suggestions, but also about recommendations to avoid terms that are not considered inclusive.
When the system detects a term it deems inappropriate, the tool warns the user that the content may "not be inclusive to all readers." Although the tool is well intentioned, in practice it has proven restrictive and annoying for some of the users who have already tried it.
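Google has not published how the detector works, but the failures described below are consistent with pattern matching that lacks context awareness. A minimal sketch of that failure mode, assuming a hypothetical list of gendered roots and naive substring matching (the names GENDERED_ROOTS and flag_terms are illustrative, not Google's actual rules or API):

```python
import re

# Hypothetical list of gendered roots a naive flagger might watch for.
# Illustrative only; Google has not disclosed its actual rules or model.
GENDERED_ROOTS = ["mother", "father", "man", "lord", "lady"]

def flag_terms(text: str) -> list[str]:
    """Return words containing a watched root, with no context awareness."""
    flagged = []
    for word in re.findall(r"[a-z]+", text.lower()):
        if any(root in word for root in GENDERED_ROOTS):
            flagged.append(word)
    return flagged

# Context-free matching catches "landlord" but also "motherboard",
# which is how a technical term can end up flagged as non-inclusive.
print(flag_terms("Replace the motherboard before the landlord arrives."))
# ['motherboard', 'landlord']
```

A system working at this level of sophistication cannot tell a gendered job title from a hardware component, which matches the behavior Vice reported.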
According to Vice, among the terms that the English version has flagged as non-inclusive are "landlord" and "motherboard," even though they obviously are not. Vice's editors ran some tests, transcribing excerpts from famous works and speeches to see what suggestions the tool would make, and the criteria do not seem very sound: Martin Luther King's famous "I Have a Dream" speech raised more flags than the transcript of an interview with Ku Klux Klan leader David Ernest Duke.
After extensively testing the tool, the Vice team reached out to a Google Docs spokesperson, who explained that the technology is constantly evolving: "Assisted writing uses language understanding models that rely on millions of common phrases and sentences to automatically learn how people communicate. This also means they can reflect some human cognitive biases. Our technology is always improving, and we don't yet have (and may never have) a complete solution for identifying and mitigating all unwanted word associations and biases."
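The spokesperson's point about models absorbing human bias from training data is well documented in NLP research. A minimal sketch of the phenomenon, using pretrained GloVe word embeddings loaded through the gensim library (a generic illustration of this class of models, not the system behind Google Docs' assisted writing):

```python
# Word embeddings learn associations from co-occurrence statistics in a
# large text corpus, including stereotyped ones. This probes one example.
import gensim.downloader

# Small pretrained GloVe model distributed with gensim's downloader.
model = gensim.downloader.load("glove-wiki-gigaword-50")

# Analogy arithmetic: woman + doctor - man. The nearest neighbors reflect
# whatever associations the training corpus contained.
for word, score in model.most_similar(positive=["woman", "doctor"],
                                      negative=["man"], topn=3):
    print(f"{word}: {score:.3f}")
# Published audits of embeddings like these (e.g. Bolukbasi et al., 2016)
# report stereotyped completions such as "nurse" ranking highly here.
```

A tool built on models like this inherits those learned associations, which is exactly the limitation Google's statement acknowledges.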
Although the assisted writing feature can be turned off at any time and the system does not force the author to write in a particular way, the impression it gives is that it is trying to regulate how people express themselves. This case shows that, however powerful they are, artificial intelligence and algorithms still cannot differentiate and emulate some things that come naturally to humans.