If HBO's Westworld has taught us anything, it's that using robots to fulfill your fantasies can have devastating consequences. Fearing the moral, ethical, and social complications rippling out of the budding sex-robot industry worldwide, prominent scientists and researchers are speaking out about the potentially harmful ramifications of marketing these expensive fantasy-relationship devices.
Sociologists, robotics experts, and psychologists are making bold claims about how these lifelike, AI-driven sex robots might enable anti-social and deviant scenarios that could spill over into real-life behavior, and some are calling for restrictions on their sale alongside greater consumer awareness. But is the best solution really a combination of harsh regulatory oversight and self-imposed industry standards?
“Some robots are programmed to protest, to create a rape scenario,” Duke University researcher Christine Hendren told the BBC while speaking at the American Association for the Advancement of Science's annual meeting in Seattle. “Some are designed to look like children. One developer of these in Japan is a self-confessed pedophile, who says that this device is a prophylactic against him ever hurting a real child. But does that normalize and give people a chance to practice these behaviors that should be treated by just stamping them out?”
Ethicists have been sounding the alarm the past few years about this exact issue regarding silicone sex partners rolling off assembly lines.
One organization, the Campaign Against Sex Robots, has formed an online coalition to bring research-based information to the debate, warning of troubling consequences for humanity should pleasure-model machines become affordable to anyone seeking a semi-autonomous girlfriend.
Matt McMullen, CEO of Abyss Creations and creator of RealDoll and Harmony AI, sought to set the record straight back in a 2017 interview with Mel magazine as the industry was rapidly advancing.
"Our customers can be shy or socially intimidated by real social situations," he explained. "A lot of times the doll does something magical for them. It gives them a feeling of not being alone, not being a loner. It's that companionship, more than anything else, that appeals to people and gives them confidence to interact socially."
Of course, conventional wisdom will always assert that there is no viable substitute for knowing, caring for, and loving another real person. But do the perceived therapeutic benefits of these products — say, for people with social anxiety, disabilities, or past trauma, or for those who have difficulty forming lasting romantic relationships — outweigh what others see as the risks?
Robot ethicist Kathleen Richardson of De Montfort University in Leicester believes that the companies behind this technology should be barred from advertising and marketing these surrogate machines as life partners or substitutes for normal, flesh-and-blood relationships.
“These companies are saying, 'You don't have a friendship? You don't have a life partner? Don't worry, we can create a robot girlfriend for you,'" she explained to the BBC. "A relationship with a girlfriend is based on intimacy, attachment and reciprocity. These are things that can’t be replicated by machines ... If someone has a problem with a relationship in their actual lives, you deal with that with other people, not by normalizing the idea that you can have a robot in your life and it can be as good as a person."
It's a conundrum that researchers, advocates, and industry figures will have to work out the old-fashioned way: through actual human discourse and thought.