A few years ago, when people figured out how to make Apple’s voice assistant swear, reporters were quick to blame the tech. “Siri’s got a bad mouth,” they said; she gave a “shockingly inappropriate response” or a “randy robo response” to anyone who asked for a second definition of “mother.” When the voice assistant answers a request to divide zero by zero with sass, we conclude that she’s run amok and we’re one step away from Terminator times.
It’s easy to see why we anthropomorphize Siri and her A.I. brethren. She has a name and, apparently, a gender. But faulting a device — not those who program it — is a symptom of a much larger problem, according to Dr. Yolande Strengers, an associate professor in the Department of Human-Centred Computing at Monash University, and Dr. Jenny Kennedy, a postdoctoral research fellow at RMIT University in Melbourne. Their new book, The Smart Wife: Why Siri, Alexa, and Other Smart Home Devices Need a Feminist Reboot, explores the harmful stereotypes that lie behind Alexa’s upbeat answers and Siri’s snark.
Even if you only occasionally interact with these digital assistants, they still fall under the category of smart wife, Kennedy tells Digital Trends. “It’s any form of technology, device, assistant that is designed to perform any kind of wifely role in the home,” she said. Many smart-home devices are meant to lessen what’s traditionally thought of as women’s work: robot vacuums, smart laundry machines, and all manner of kitchen appliances. Siri, Alexa, and Google Home also remind you to pick up your dry cleaning, have automated settings for when you’re hosting a party, and can tell the kids a bedtime story. “They’re also reinforcing how work within the home gets divided up between different actors,” Kennedy said of smart wives. “Currently, a lot of the housework, the day-to-day work, the emotional work, is all still falling to a feminized cohort, whether that’s human or other.”
“We have so many very similar, very nostalgic and problematic stereotypes of the wife and feminized labor being designed into these technologies that are then going into mass markets around the world,” said Strengers. “So, it’s not so much that she’s a wife, necessarily, that’s the problem. It’s the type of wife she is and the type of woman she’s portrayed to be.”
Fixing the problems with smart wives takes more than switching the default gender of the voice from female to male, Kennedy and Strengers say. They need a complete reboot.
If you’ve given your Roomba a name, there’s a good chance you call it Rosie, after The Jetsons’ robot maid. “As far as smart wives go, Rosie has it all,” write Kennedy and Strengers. “She embodies the core values commonly associated with the stereotypical dutiful 1950s’ housewife — with a few added bonuses.”
The reason your robot vacuum evokes the efficient cartoon maid is no accident. In the book, Strengers and Kennedy trace the history of robots, virtual assistants, and chatbots. Even an early natural language processing computer program, ELIZA, was feminized, named for Eliza Doolittle in George Bernard Shaw’s Pygmalion and My Fair Lady. A robotic woman appeared onscreen relatively early, in 1949’s The Perfect Woman. “There’s that really clear link that we demonstrated in the book between these popular culture versions of the smart wife and how women on the screen are portrayed in an artificial form, and how that creates inspiration and the basis for designs of real smart wives that we now have in our homes,” said Strengers.
“Technology has nearly always overpromised and underdelivered, particularly when it comes to the home,” write Strengers and Kennedy. Our robot vacs are nothing like Rosie. The Jetsons didn’t have to straighten up before Rosie rolled around, for fear of something getting tangled in her wheels or her bumping into an immovable object. They never worried about what would happen if Rosie ran into Astro the dog’s poop.
Because of their limitations, smart-home devices can sometimes create more work for us. It’s like the so-called “juicer problem”: juicers are supposed to make juicing easier, but if you’ve ever owned one, you know they can be a nightmare to clean. Still, now you have this expensive device and need to get your money’s worth. Before the juicer, you’d have just bought a carton of orange juice and called it a day. It’s why author Ruth Schwartz Cowan noted that technology often creates “more work for mother” by providing devices that were supposed to make her housework simple and the home pristine.
“Often the more devices that come into the home, the more work is created,” said Kennedy, and she predicts it will continue with smart devices. Re-adding them to the Wi-Fi network or troubleshooting errors creates what researchers call “digital housekeeping.”
“This actually takes up quite a lot of time and energy, and it’s not being considered part of the overall bucket of housework that needs to be done,” said Kennedy. It often falls to men, who might consider gadgets one of their hobbies, or it might be part of their work outside the home. But if one partner is detangling the robot vacuum, they’re also not taking out the garbage or folding the laundry, so that work may fall to the other person.
Fixing a smart-home device may take some skill or technical know-how, but Kennedy and Strengers write in the book that the robots themselves are often seen as replacing menial work: “Embedded in our Rosie idol is an assumption that housework is something that should be erased and removed from our lives — that women’s work should be done silently and efficiently, and it is simple enough to be assigned to an autonomous alternative. One implication is that it is mundane, easy, and valueless.”
When our robots don’t perform the way we expect them to, it’s easy to become frustrated. If you yell and swear at your voice assistant, it’s not going to upset them, right? But Kennedy and Strengers still think it’s a bad idea. “It’s not about protecting robot feelings as much as being concerned about how any form of abuse of an anthropomorphized and, especially, feminized object can help normalize abuse of feminization,” said Kennedy.
Mistreatment can be especially troubling for sexbots, she and Strengers say, which sometimes have programmed “personalities” such as “frigid.” This could raise complicated questions around consent, some critics say — not with robots, but with other humans.
“They’re not necessarily saying that that means they’ll go out and do that to women,” said Kennedy. “But still, it just feeds the culture, and that’s what we’re concerned about.”
Everyday robots tend to share certain traits with sexbots. “We are really concerned about this trend to make these robots and devices likable,” said Strengers. They’re often cute, with big eyes and soothing voices. “It does serve a purpose, which is to make us accept these devices into our lives,” she said.
But again, she says, that reinforces expectations of how women should look and behave. “They’re perpetuating a particular form of femininity, which is only one of the many, many forms of femininity and is arguably quite outdated in terms of the way that women act and are represented and live now in the 21st century,” she said.
Instead, designers should find new ways of designing robots and other devices. “There are other ways of approaching likability,” said Strengers. “It’s not that we have to design Terminator robots that are going to be rude and horrible and, you know, shoot us in our sleep.”
Simply changing the voice and physical design doesn’t go far enough, they argue. “It’s about the fundamental personalities that are being designed into these devices as well,” said Strengers. She and Kennedy offer a nine-step manifesto for retooling these smart wives so that they promote gender equity and diversity.
“One of the main ones is who gets to code,” said Kennedy. “Having more women represented in the industry is going to change the perspectives that are brought to the table in those early design phases and in the imagined future user and understanding the diversity of potential users.”
Jacqueline Feldman designed KAI, a chatbot, to have a bot personality; it steers people away from asking questions about its gender or making sexual suggestions. Feldman’s background is in writing and literature, and Strengers would like to see more people from disciplines like anthropology and sociology included in creating A.I. and bots, “so that we don’t just have on the frontiers of technology those really technical minds making these things,” she said, “that it’s seen as a collaborative effort with other disciplines and it’s seen as a social opportunity as much as it is a technical one.”
“It’s not an easy-fix, one-size-fits-all solution that we’re proposing here,” she added. “There are a lot of different elements that need to change for this to turn around.”