You don’t have to show any skin to be sexually harassed — and you don’t even have to be human to be the subject of offensive, suggestive commentary, it seems.
Deborah Harrison, an editorial writer in the Cortana division of Microsoft, told CNN that even female virtual assistants can’t escape the dirty and sometimes disrespectful minds of their human users. And as AI systems become increasingly humanized, this virtual harassment is a disturbing trend.
A large part of the issue stems from the fact that the vast majority of AI assistants feature female voices. Beyond Microsoft’s personal assistant, Cortana, there are Apple’s Siri and Amazon’s Alexa, and even Hollywood portrayed the operating system of the future with a sultry Scarlett Johansson. Beyond the issue of consistently conforming to gender stereotypes by placing female personas in subordinate roles, it now appears that people are getting so comfortable with their AIs that these machines are being bombarded with inappropriate questions.
According to Harrison, when Cortana was first launched in 2014, “a good chunk of early queries were about her sex life.” Now the team behind the AI is fighting back; Cortana is a true woman of the 21st century, you see, and she doesn’t take any crap.
“If you say things that are particularly a**holeish to Cortana, she will get mad,” said Harrison during a talk at the Re•Work Virtual Assistant Summit in San Francisco. “That’s not the kind of interaction we want to encourage.”
To combat this sort of behavior, Harrison and seven other Microsoft writers tasked with the fascinating job of determining how Cortana responds to inquiries have decided to be very careful with the way in which they structure this virtual woman.
While she is very clearly female — she’s represented by a female avatar, and the flesh-and-blood human woman Jen Taylor supplies her actual voice — Cortana doesn’t succumb to many stereotypical female pitfalls. She doesn’t find herself constantly apologizing, nor does she seem particularly, well, subordinate. And according to Harrison, that’s all a conscious decision made by the Microsoft team.
“We wanted to be very careful that she didn’t feel subservient in any way … or that we would set up a dynamic we didn’t want to perpetuate socially,” she told CNN.
A big part of creating a believable persona for a virtual assistant, Microsoft says, is talking to human beings who hold that actual job. Not only does this give the writers better material, but it also helps the team address harassment from the perspective of people who deal with it firsthand.
So don’t mouth off to Cortana. You might not like what she says in response.