Why Are Digital Assistants Overwhelmingly Female?
Voice assistants such as Siri, Cortana, and Alexa are among the most notable innovations of the past decade. But have you ever stopped to consider why these digital assistants are female? From their names to their voices and personalities, they are overwhelmingly feminine-presenting.
The creators of popular voice assistants have come under fire for choosing feminine names and voices, and for the harmful gender stereotypes this decision perpetuates. What started as a cool human-machine interaction has become a foundation for discussions about gender bias in technology and the perception of women in the evolving workplace.
CTOs have a multitude of choices for digital assistants, and that choice is certainly part of this discussion, but far more important is leading an organization that recognizes implicit and explicit bias around gender.
Devs cite insufficient training data for male AI voices
The creators of virtual assistants have claimed there is a lack of data on masculine voices.
Female voice recordings date back to 1878, when Emma Nutt became the first female telephone operator. Her voice was so soothing and welcoming that she became a model for other women and for the companies that hired them; by the end of the 1880s, telephone operating was an almost exclusively female profession. One consequence of this shift is that we now have an extensive archive of women’s audio recordings, and the creators of virtual assistants used that archived data to build and train new forms of voice-automated AI.
Research shows that creating male-voice automated systems can be difficult and expensive. Take Google, for example: the company initially wanted to launch its voice assistant with both a male and a female voice.
Unfortunately, the systems Google used to create its new assistant had only been trained on female voice data. Brant Ward, the global engineering manager for text-to-speech at Google, explained that it would be harder to achieve the same quality with a male voice as with a female one. As a result, the team working on Google Assistant advocated strongly for a female voice, having discovered how challenging it was to create male-voice AI. The team was also unsure how users would respond to male voices and what the end product would sound like.
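To see why the team leaned female, consider a quick audit of training-data balance. The sketch below is hypothetical, not Google’s tooling: it assumes a corpus metadata CSV with speaker_gender and duration_seconds columns and tallies recorded hours per gender, the kind of imbalance check that would reveal a shortage of male recordings.

```python
import csv
from collections import defaultdict

def audit_corpus_balance(metadata_path: str) -> dict:
    """Tally recorded hours per speaker gender in a TTS corpus.

    Assumes a hypothetical metadata CSV with 'speaker_gender' and
    'duration_seconds' columns; real corpora vary in format.
    """
    hours = defaultdict(float)
    with open(metadata_path, newline="") as f:
        for row in csv.DictReader(f):
            hours[row["speaker_gender"]] += float(row["duration_seconds"]) / 3600
    return dict(hours)

# e.g. audit_corpus_balance("corpus_metadata.csv")
# -> {'female': 412.5, 'male': 37.2}  (illustrative numbers)
```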
When market research meets limited case studies
Several studies have indicated that people prefer listening to feminine voices, finding them warm and calming. Another popular theory holds that women tend to articulate vowel sounds more clearly, which makes female voices easier to understand, particularly in the workplace. Female voice recordings were even used in airplane cockpits during World War II, because women’s voices, pitched higher than those of the male pilots, were easier to distinguish.
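To make the pitch claim concrete, fundamental frequency (F0) can be estimated from a recording with an open-source library such as librosa. As a rough rule of thumb (general acoustics figures, not from this article), adult male speech sits around 85–180 Hz and adult female speech around 165–255 Hz. A minimal sketch, with a placeholder file path:

```python
import librosa
import numpy as np

def mean_pitch_hz(audio_path: str) -> float:
    """Estimate the mean fundamental frequency (F0) of a speech recording."""
    y, sr = librosa.load(audio_path, sr=None)  # keep the native sample rate
    f0, voiced_flag, _ = librosa.pyin(
        y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C7"), sr=sr
    )
    return float(np.nanmean(f0[voiced_flag]))  # average over voiced frames only

# e.g. mean_pitch_hz("operator_sample.wav") might return ~210 Hz
# for a typical adult female voice (illustrative value).
```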
Market research therefore led technology companies to conclude that consumers prefer women’s voices to men’s. These consumer preferences, intertwined with deep-rooted cultural assumptions about women’s role in domestic settings, particularly as caregivers, gave rise to the idea that female voice assistants would perform better in the market.
CTOs should strive for change
Today, tech companies have begun taking concrete steps to correct the situation. Apple recently released a version of Siri that no longer defaults to the feminine voice for American English; users choose a voice during setup. Amazon, meanwhile, added a masculine-sounding voice option, Ziggy, to its Echo devices. And Google announced in an official blog post two new Google Assistant voices, Lime and Indigo, the former masculine-sounding and the latter feminine-sounding.
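Apple’s approach, prompting the user to pick a voice during setup rather than shipping a gendered default, is easy to emulate. Here is a minimal sketch assuming a hypothetical voice catalogue (the names are illustrative, not any vendor’s API): if the user never chooses, a voice is assigned uniformly at random so no single gendered voice becomes the implicit default.

```python
import random

# Hypothetical voice catalogue -- names are illustrative, not any vendor's API.
VOICES = ["voice_a_feminine", "voice_b_masculine", "voice_c_neutral"]

def choose_voice(user_choice: str | None = None) -> str:
    """Return the assistant voice for a newly set-up device.

    Honour an explicit user choice; otherwise assign a voice uniformly
    at random so no single gendered voice is the silent default.
    """
    if user_choice in VOICES:
        return user_choice
    return random.choice(VOICES)
```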
CTOs can start addressing gender bias in AI voice assistants by educating themselves and their teams about the issue and its implications. Creating a safe space for open dialogue, promoting diversity within development teams, and implementing strategies to detect and mitigate bias (one such strategy is sketched below) are crucial steps. Establishing ethical guidelines, fostering a culture of accountability, and collaborating with external experts can strengthen these efforts further. Transparent communication about initiatives and progress, along with regular measurement and reporting, ensures an ongoing commitment to reducing gender bias and fostering inclusivity in AI development.
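As one concrete version of the detect-and-mitigate step referenced above, a release checklist could include a lightweight audit that flags any locale still shipping a single gendered default voice. The configuration format here is invented purely for illustration:

```python
# Hypothetical per-locale voice configuration; the format is invented
# for illustration and does not reflect any vendor's real config.
DEFAULTS_BY_LOCALE = {
    "en-US": {"default": None, "voices": ["feminine", "masculine"]},  # user picks
    "en-GB": {"default": "feminine", "voices": ["feminine", "masculine"]},
    "de-DE": {"default": "feminine", "voices": ["feminine"]},
}

def audit_defaults(config: dict) -> list[str]:
    """Return findings for locales with a gendered default or only one voice."""
    findings = []
    for locale, cfg in config.items():
        if cfg["default"] is not None:
            findings.append(f"{locale}: defaults to a {cfg['default']} voice")
        if len(cfg["voices"]) < 2:
            findings.append(f"{locale}: offers only one voice option")
    return findings

for finding in audit_defaults(DEFAULTS_BY_LOCALE):
    print("FLAG:", finding)
```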
In Brief
Addressing gender bias in AI development is not only a technical challenge but also a moral necessity. Through ethical development practices, diverse teams, and continuous monitoring for bias, tech companies can build more equitable AI systems that serve all segments of society fairly.