Posted on July 11, 2019 in Sexual Harassment
In the last few years, the rising popularity of artificial intelligence (AI) voice assistants has prompted families and homeowners nationwide to adopt them to answer questions, operate appliances, read news reports, and even play their favorite songs. Unfortunately, despite the value and entertainment that assistants such as Apple's Siri, Amazon's Alexa, Microsoft's Cortana, and Google Assistant provide, features that appear innocuous may normalize sexual harassment and perpetuate gender stereotypes. UNESCO recently published a study examining how AI voice assistants may cause harm.

What Voice Assistants Say

The study found that the models used to program the assistants appear docile, accepting verbal abuse and sexual harassment as typical and tolerable. Among other factors, the study traced the problem to the programming of the assistants themselves. Most of the programmers who developed voice assistants are male, and the lack of diversity in Silicon Valley means that the wide variety of voices and personalities throughout the world remains underrepresented. A significant drop in the number of women studying computer and information science may be an underlying factor as well. When originally developed, the assistants built into household devices used a female voice, while driving and GPS applications featured male voices.

When a user verbally sexually harasses an AI assistant, the assistant may respond with stereotypically female replies or in a submissive way. Depending on the phrasing of the comment, the assistant may offer a grateful response, a flirtatious answer, or friendly obliviousness.
Even though the companies producing the voice assistants insist that they are genderless, the report indicates that every trait the assistants demonstrate, including name, voice, speech, and personality, appears female.

Study Recommendations for Voice Assistants

Since voice assistants were first developed, no significant changes to the programming of their personalities and responses have occurred. The study offers recommendations to eliminate gender stereotyping and reduce the influence of sexual harassment on the assistants' responses. On a small scale, the language programmed into the assistants must be altered to avoid misogynistic responses and eliminate toxic language. Another option is to create assistants of both genders. One of the most important points raised in the study is the lack of diversity in the computer programming industry. By encouraging women to study technology, the industry may make them part of the solution to the gender stereotyping coming from voice assistants.