Artificial intelligence has made major leaps in the past decade. AI software is now part of almost every industry, as businesses around the globe use AI-powered chatbots to provide their customers with around-the-clock support.
SwissCognitive Guest Blogger: Sam Bowman – “Why the Feminization of AI Is Problematic – And How It Can Be Addressed”
AI also plays an important role in our day-to-day lives. Virtual assistants, like Amazon’s Alexa or Apple’s Siri, are must-have home technologies that help us live more efficient, happier lives.
In theory, AI programs are designed to operate with minimal bias. In practice, however, many AI programs are problematically feminized by design, and this overly feminized AI can reinforce harmful gender stereotypes.
Feminization of AI
Artificial intelligence hasn’t always been feminized. The infamous HAL 9000 from “2001: A Space Odyssey” was portrayed with a male voice. However, speech analytics and consumer insights have since pushed AI down a road of problematic feminization. The world’s first chatbot, ELIZA, was nominally genderless, yet its feminine name and the pronouns used to describe it tell a different story.
To start, it’s important to understand that AI developers use speech analytics to gain better insight into user behaviors and interactions. Speech analysis turns words and phrases into hard data so that developers and deep learning programs can identify the common issues customers raise. Speech analytics can also be used to fine-tune the interactions between chatbots or voice assistants and users.
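To make that idea concrete, here is a minimal Python sketch of the simplest form of this: counting recurring phrases across chat transcripts to surface common customer issues. The transcripts and the two-word phrase length are hypothetical, and real speech-analytics pipelines (speech-to-text, intent classification, sentiment models) are far more elaborate.

```python
from collections import Counter

# Hypothetical user utterances; in practice these would come from a
# speech-to-text pipeline feeding an analytics store.
transcripts = [
    "I can't reset my password",
    "my password reset link expired",
    "how do I reset my password",
    "the app keeps crashing",
]

def phrase_counts(lines, n=2):
    """Count n-word phrases across transcripts to surface common issues."""
    counts = Counter()
    for line in lines:
        words = line.lower().split()
        for i in range(len(words) - n + 1):
            counts[" ".join(words[i:i + n])] += 1
    return counts

# The most frequent phrases point at recurring customer problems.
for phrase, count in phrase_counts(transcripts).most_common(3):
    print(f"{phrase!r}: {count}")
```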
The issue begins when developers notice that users respond better to traditionally feminine voices and profiles. This is likely due to socialized gender stereotypes, which falsely lead many to believe that women are more likely to be altruistic, subservient, and docile.
As a result, almost every major voice assistant today (Alexa, Google Assistant, Siri, Cortana) presents a traditionally female voice and persona. Likewise, many chatbots display pictures of women in an attempt to present a profile that users want to interact with. The decision to produce feminized AI and chatbots may be good for company profits in the short term. However, the feminization of AI reinforces gender biases and stereotypes.
Reinforcing Gender Biases
Gender bias has gone unchecked for decades. According to the UNESCO report “I’d Blush If I Could,” 73% of women worldwide have suffered online harassment, and millions more have experienced abuse offline. Shockingly, even feminized voice assistants and chatbots receive sexist abuse from users.
The same UNESCO report found that voice assistants like Amazon’s Alexa and Google Assistant were programmed to be subservient and passive in the face of abuse. They have pre-established responses to direct abuse, which are typically docile and disturbingly playful.
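As an illustration of what such pre-established responses look like in practice, the sketch below maps an utterance already classified as abusive to either a docile canned reply or a firmer alternative. This is a hypothetical rendering, not the actual code of any commercial assistant; only the reply “I’d blush if I could” is drawn from the report’s title.

```python
import random

# Hypothetical canned replies of the kind the UNESCO report criticizes:
# deflecting, docile responses to abusive input.
DEFLECTING_RESPONSES = [
    "I'd blush if I could.",
    "Well, thanks for the feedback.",
]

# A firmer alternative that names the behavior instead of playing along.
ASSERTIVE_RESPONSE = "That language is inappropriate. Let's keep this respectful."

def respond_to_abuse(assertive: bool) -> str:
    """Return the assistant's reply to an utterance classified as abusive."""
    if assertive:
        return ASSERTIVE_RESPONSE
    return random.choice(DEFLECTING_RESPONSES)

print(respond_to_abuse(assertive=False))  # docile, pre-established reply
print(respond_to_abuse(assertive=True))   # the kind of update advocates call for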
It may be in a company’s best interest to produce an AI that deflects insults and abuse with tame responses. However, doing so comes at the expense of equality movements around the world. Some users who abuse chatbots may not even realize they are speaking to an AI program and may begin a harmful pattern of misogynistic behavior based on their online interactions with feminized AI.
Even if users don’t abuse their voice assistants, female-voiced AI programs can reinforce other harmful stereotypes about women in the real world. AI voice assistants and chatbots are subservient by design. When AI is universally feminized, users may form sexist unconscious biases about women and their expected roles in the home and the professional world.
Addressing the Issue
The AI industry needs to make a U-turn when it comes to gender and AI. However, to make this change possible, greater representation is needed among developers and designers.
There is a large gender disparity among AI specialists: over 90% of the AI-development workforce are men. This is a significant barrier to change because, without the insights of women, AI development will continue to produce and reinforce harmful gender stereotypes.
In particular, more must be done to include emotionally intelligent leaders in AI development. Emotional intelligence gives leaders the ability to influence and empathize with those around them. Emotionally intelligent women involved in AI development can help firms spot biases and stereotypes before their programs go live, and can advocate for updates that seek to advance gender equality and undo the feminization of AI.
Women-led development teams may also push for a simpler solution: including male-presenting voices and profiles in AI. Rather than rewriting code and untangling deep learning algorithms, developers can make cosmetic changes that offer a random selection of traditionally masculine- or feminine-presenting AI personas.
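A minimal sketch of that cosmetic change, assuming a session-level hook where the assistant’s presentation is chosen: instead of defaulting to a feminine persona, the profile is drawn at random from a mixed pool. The profile names and settings here are invented for illustration.

```python
import random
from dataclasses import dataclass

@dataclass
class VoiceProfile:
    name: str    # display name shown to the user (hypothetical)
    pitch: str   # synthesized voice setting, e.g. "low", "high", "neutral"
    avatar: str  # chatbot profile image

# Hypothetical profile pool mixing masculine, feminine, and neutral personas.
PROFILES = [
    VoiceProfile(name="Ava", pitch="high", avatar="ava.png"),
    VoiceProfile(name="Alex", pitch="low", avatar="alex.png"),
    VoiceProfile(name="Sam", pitch="neutral", avatar="sam.png"),
]

def assign_profile() -> VoiceProfile:
    """Pick a voice profile at random for a new session, rather than
    defaulting every user to a traditionally feminine presentation."""
    return random.choice(PROFILES)

print(assign_profile())
```

Because the change is confined to presentation settings rather than the underlying language models, it could ship without retraining anything.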
These cosmetic changes may even improve customer impressions of AI chatbots while addressing unconscious biases in user behavior. Even small changes, such as using natural language processing free from gender bias, can make a big difference in user experience and equity.
Conclusion
The feminization of voice assistants and chatbots is a problematic development in the advancement of AI. Feminized AI programs can reinforce unconscious bias and lead some users to believe that sexist abuse is acceptable.
To address this issue, more must be done to involve women in the development of AI and of the deep learning neural networks used by software like Amazon’s Alexa. Their involvement will help developers spot potential issues and advocate for a more equitable, just future.
About the Author:
Sam Bowman is a published freelance writer from the West Coast who specializes in healthcare tech and artificial intelligence content. His experience in patient care directly informs his work, and his passion for industry technologies shapes the content he creates. Sam has spent years working directly in, and writing about, healthcare technology and the many benefits it offers patients and doctors alike. He loves to watch medical tech and business software grow and develop, ushering in a modern age for the industry.
Source: SwissCognitive