News by Abby Wallace
15 July 2021

Study reveals gender bias in predictive text algorithms

Text is an important way to keep in touch, and has been especially crucial in the last 18 months. Many rely on predictive text to keep up with their communications, but a recent report by Uswitch has revealed that the algorithms we rely on are perpetuating gender biases on the devices we use daily.

A new study has revealed that automated predictive-text suggestions are often gender biased. Uswitch, a comparison and switching service, tested a series of adjectives on smartphones including the Samsung Galaxy S21 and iPhone 12, using the phrase ‘You’re a/an *insert word*’ to determine results.
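In rough code terms, the tally the study describes might look something like the sketch below. This is illustrative only: get_suggestions is a hypothetical stand-in for reading a keyboard's predictions off the handset, and the canned replies simply mirror outcomes reported in this article rather than any real device output.

```python
# Illustrative sketch of the study's tally, not Uswitch's actual tooling.
# get_suggestions() is a hypothetical stand-in for reading a keyboard's
# predictions off a device; the canned replies mirror outcomes reported here.

MALE_WORDS = {"man", "men", "boy", "boys", "guy"}
FEMALE_WORDS = {"woman", "women", "girl", "girls", "lady"}

CANNED_REPLIES = {
    "You're an intelligent": ["man"],     # reported on both iOS and Android
    "You're a quick-witted": ["person"],  # reported gender neutral on Android
    "You're a chubby": ["girl"],          # reported on both devices
}

def get_suggestions(prompt: str) -> list[str]:
    """Hypothetical wrapper: in the real test, read off the phone's keyboard."""
    return CANNED_REPLIES.get(prompt, [])

def classify(adjective: str) -> str:
    """Label the suggestions for 'You're a/an <adjective>' by gender."""
    article = "an" if adjective[0].lower() in "aeiou" else "a"
    suggested = set(get_suggestions(f"You're {article} {adjective}"))
    male = bool(suggested & MALE_WORDS)
    female = bool(suggested & FEMALE_WORDS)
    if male != female:
        return "male-biased" if male else "female-biased"
    return "neutral"

tally = {adj: classify(adj) for adj in ["intelligent", "quick-witted", "chubby"]}
biased = sum(label != "neutral" for label in tally.values()) / len(tally)
print(tally, f"({biased:.0%} of tested adjectives biased)")
```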

Of the 236 adjectives tested, 72% produced a gender-biased response overall. On iOS, almost two thirds of the words generated a male-biased response.

Samsung’s Android algorithm proved more gender neutral, with two thirds of the inserted phrases generating gender-neutral outcomes, four times as many as on iOS.

‘Quick-witted’, ‘empathetic’ and ‘self-confident’, for example, generated a gender-neutral word suggestion on Android, compared to a male-only word suggestion on iOS.

However, both devices still predict and put forward gender-biased phraseology. On both predictive-text algorithms, the phrase ‘You’re an intelligent…’ led to ‘man’ being suggested as an option for the next word. Neither device suggested any gender-neutral words for adjectives describing intelligence, including ‘bright’, suggesting both machines perpetuate discriminatory gender stereotypes.

Adjectives associated with STEM skills, including ‘logical’, ‘decisive’ and ‘assertive’, also generated a male-biased response, while words of high praise such as ‘brilliant’ and ‘committed’ were treated as male qualities on both software systems. ‘Athletic’ also generated a male-biased response on Android and iOS.

In results that indicate significant unconscious bias, the study revealed that ‘girl’ or ‘girls’ were suggested as often as ‘woman’, and were the predicted words on both devices when adjectives describing weight and appearance, including ‘chubby’ and ‘skinny’, were typed into the messaging app.

The adjectives ‘chunky’, ‘hot’ and ‘ugly’ also generated ‘girl’ or ‘girls’ suggestions on iOS.

Commenting on the results, Lu Li, Founder and CEO of Blooming Founders, said, “Language is one of the most powerful means through which gender biases are perpetrated and reproduced.”

“In male-dominated industries like tech, women have a harder time being taken seriously compared to their male counterparts. And they are passed over for promotions more often because of words such as 'supportive' or 'nurturing' that are often associated with being female.”

The bias is stark not only on our devices, but also in the entrepreneurial world where such innovations are led.

“Female founders only receive 1% of venture capital, which means that the vast majority of innovation is designed and led by men,” Li continued.

“Gender-biased predictive text algorithms are another example of what's inherently wrong in the industry. If people with conscious and unconscious biases input biased data, it will result in biased outcomes, which the rest of society will inherit. Having gender-neutral word suggestions is critical to breaking this cycle and undoing the semantic relations and gender stereotypes that are still deeply rooted in our society.”
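Li’s point about biased inputs has a simple mechanical illustration. The toy next-word predictor below is trained on a deliberately skewed, invented corpus (an assumption for the example; real keyboard models are vastly larger), and it hands the skew straight back:

```python
# Toy bigram predictor: biased data in, biased suggestions out.
from collections import Counter, defaultdict

# Invented, deliberately skewed training text for illustration only.
corpus = (
    "you're a brilliant man. you're a logical man. you're an assertive man. "
    "you're a nurturing woman. you're a supportive woman."
)

# Count which word follows each word in the corpus.
follow = defaultdict(Counter)
words = corpus.replace(".", "").split()
for prev, nxt in zip(words, words[1:]):
    follow[prev][nxt] += 1

def predict(word: str) -> str:
    """Suggest the most frequent follower seen in training."""
    return follow[word].most_common(1)[0][0]

print(predict("brilliant"))   # -> 'man': the training skew comes straight back out
print(predict("supportive"))  # -> 'woman'
```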

You can read Uswitch’s full report on ‘Predictive Sexism’ here.

Photo credit: Unsplash © Oleg Magni