This story first appeared at Villainesse.
Apple’s Siri. Microsoft’s Cortana. Amazon’s Alexa.
Such “personal assistant” artificial intelligence (AI) programmes are used by millions of us daily, which – if nothing else – signifies that they’re at the vanguard of the continued integration of AI and intelligent machines into our lives. They can organise our schedules, give us directions, tell us the weather and the news, read and write our emails – at this point, the list of what they can’t do would be shorter.
But here’s a question: have you noticed that all of these programmes are meant to be “female,” with women’s voices and names?
There’s a reason for that. As Laurie Penny puts it, it’s because we don’t want to consider their feelings.
By codifying AI and intelligent machines as women, we’re reinforcing our own sexism and misogyny – including towards real, human women. And that, of course, has troubling implications for the future.
But that’s not all. We’re also dehumanising these programmes and machines in a way that makes it easier to dehumanise human women; it’s no coincidence that the word “robot” comes from the Czech word for “slave.”
Take “sexbots” designed for the purpose of sexual pleasure. It’s well-established that rapists will often build up to their crime, and that there’s a link between the normalisation of exposure to sexual violence and rape. If a man feels it’s OK to degrade, abuse and engage in violent sex with an anatomically female sexbot despite its protestations – some are programmed with the ability to say “no” – it’s not an extraordinary leap to wonder whether he may have less of a moral problem with raping a human woman.
This isn’t just sci-fi scaremongering: at an electronics industry expo in Austria in September, an “intelligent sex doll” (read: sexbot) named “Samantha,” programmed to react to touch, was seriously damaged after being repeatedly molested and abused by multiple men. As Samantha’s developer told UK tabloid Metro: “they treated the doll like barbarians.”
More concerning, however, was his dismissal of Samantha’s treatment: “Samantha can endure a lot. She will pull through.”
Presumably he wouldn’t say the same were Samantha human.
But treating “female” AI and robots as lesser beings – and then potentially becoming more inclined to do the same to human women – isn’t the only issue facing us right now. With advancements in AI capable of learning, there’s also a serious worry about teaching such programmes to be dehumanising towards women.
When Microsoft trialled “Tay” on Twitter in March 2016, the chatbot was meant to mimic the speech patterns of a “typical” teen girl, and would learn through its interactions with the Twittersphere. Within 24 hours, Tay proudly declared “her” admiration for Trump and Adolf Hitler, hatred of women, and belief that LGBT+ people should be murdered and that the Holocaust never happened. Oh, and “she” did it all with a string of racist, misogynist and homophobic epithets that would make even the worst trolls blush; amid global horror, Microsoft quickly shut down the project.
Tay, of course, learned to be so awful because of the awful messages sent by male trolls. Here’s a sad question: would Tay have been so abused if it had been codified as a teenage boy instead of a girl?
We all know the even sadder answer is no.
Then there’s Sophia. A humanoid robot developed by Hong Kong-based Hanson Robotics, Sophia has the ability to adapt to human behaviour and work with humans. Though not technically sentient (the so-called “singularity,” when an artificial intelligence becomes self-aware and gains consciousness, is yet to happen, with experts torn as to when – or if – such a milestone will occur), Sophia became the first robot to legally gain citizenship in October, when Saudi Arabia made “her” a citizen. The move was widely condemned: in granting “her” citizenship, Sophia now legally has more rights than human women in the ultraconservative kingdom.
By granting Sophia citizenship, the message is clear: in Saudi Arabia, women are less human than a robot. Comparatively, their feelings, emotions, and even basic rights are meaningless. We should all be horrified if other nations follow suit.
When men dehumanise “female” AI and robots, they dehumanise human women. It’s hardly a surprise that robots are often depicted as women: men have been controlling, objectifying and commanding women for centuries.
There are some solutions to start fixing the problem before it gets worse, and to preserve the dream of a future where people of all sexes are treated with equal love and respect. For one, we need more women engineers and programmers working with AI and robots, who can stress to male colleagues the importance of not dehumanising women through their creations, and create kickass non-sexist robots of their own. It may also be worth discussing regulation of the industry, particularly when it comes to abuse of robots designed to resemble humans in appearance or behaviour.
The third solution may sound simplistic and might be the hardest to implement, but would be the best in the long term: we need to stop dehumanising and degrading women in general, whether human or robot.
We all know our ever-advancing technology isn’t going away. Let’s work towards a brighter future where that technology is used to promote equality – not oppress women. Otherwise, we might as well let the machines pull the plug on humanity altogether.