‘Divorced from reality,’ says critical law professor
Are “virtuous sex robots” the way of the future? University researchers suggest that robots created for human pleasure should be designed so that they can grant or withhold consent, as well as teach sex education.
Anco Peeters, a doctoral student at Australia’s University of Wollongong, and Pim Haselager, associate professor at The Netherlands’ Radboud University, published “Designing Virtuous Sex Robots” in the International Journal of Social Robotics last month.
The paper examined four areas: “virtue ethics and social robotics,” “Contra instrumentalist accounts,” “Consent practice through sex robots” and “Implications of virtuous sex robots.”
The authors do not focus on child sex robots or sex robots that play into rape fantasies, but rather on “the potential positive aspects of intimate human–robot interactions through the cultivation of virtues.”
They argue that robots can be designed to help humans become more virtuous: either by exhibiting virtuous behavior or “nudging human behavior directly.”
Sex robots should have a consent module to prevent “unwanted behavioural patterns,” according to Peeters, a “philosopher specialising in embodied cognition and artificial intelligence,” and Haselager, whose specialty is “Theoretical Cognitive Science.”
However, even “active consent” has its problems since “verbal consent does not necessarily mean that a partner is freely engaging in sex.”
They also suggest using “a compassion cultivating sex robot” to teach teenagers sex education. “A sex robot which not only can practise consent scenarios with a human partner, but which can actually cultivate a virtue like compassion could potentially be used in sex education and therapy,” the authors argue.
A law professor who has called for Congress to regulate sex robots says this paper “is totally divorced from reality,” however.
“This new highly theoretical article, based upon one of three competing ethics postulates, seems to do little in anything to assist in this area,” John Banzhaf, who teaches public interest law at George Washington University, told The College Fix in an email. “Indeed, its suggestions – that sexbot be use[d] to teach virtue, and to reduce campus rapes, etc. – seem very unrealistic.”
Danger of rape fantasies is to ‘normalize outside of a consensual context’
The authors argue that robots can help “nudge users towards virtuous (or vicious) behaviour,” such as congratulating a human for exercising.
Another way for robots to learn virtuous behavior is to mimic it: “The use of machine learning with artificial neural networks may be a way of avoiding the need to write an algorithm that specifies what action needs to be taken when.”
A fellow ethicist, Robert Sparrow, has argued that “sex robots could encourage vicious behaviour,” the authors acknowledge. People will be able to “live out whatever fantasies” they choose, including rape fantasies, which could make someone more vicious.
“Let us assume that rape-play between two consenting adults is not necessarily morally wrong,” Peeters and Haselager posit. What could be wrong about acting out a rape fantasy is it could “normalize the associated repeated behaviour outside of a consensual context – the cultivation of a vice.”
But they argue that “there are ways to involve consent in the case of intimate human-robot interaction aimed to prevent the risk Sparrow is drawing attention to, without condemning the manufacture and use of sex robots in principle.”
The authors believe that if sex robots can cultivate such negative traits in humans, they can also cultivate positive ones.
Banzhaf, the law professor, told The Fix it is “doubtful” that the men who purchase these sex robots do so in “an attempt to improve their ‘virtue.’”
New paper argues that sex robots should be programmed to make consent decisions (which could result in sex robots rejecting their owner's sexual advances) to prevent the normalization non-consensual sex: https://t.co/Lxt8H0Y3yf pic.twitter.com/vkB13Cxoec
— Cory Clark (@ImHardcory) October 4, 2019
Consent-based sexbots ‘may negatively affect’ the bottom line
In another point, Peeters and Haselager address an argument by computer pioneer David Levy, author of “Love and Sex with Robots,” that humans can be attracted to and fall in love with robots. Levy even says “robot sex could become better for many people than sex with humans, as robots surpass human sexual technique and become capable of satisfying everyone’s sexual needs.”
Levy has also argued that sex robots could be used by soldiers on long-term missions to “prevent cheating.” This is why, the authors contend, Levy sees sex robots as “tools to be used or products to be consumed.”
However, the authors suggest that “robots are not merely neutral tools” and any belief contrary “could lead to practices that provide cause for concern.”
There are some people who would consider sex robots as cheating on a partner and “[n]otions of love and sex will be changed by the development of humanlike robots.”
Peeters and Haselager recognize that consent-based sex robots “may negatively affect the potential economic gains of sex-robot producers,” though that is not their concern.
“A robot equipped with a consent-module could potentially be used to investigate ways of improving consent practice in general,” the authors argue.
Consent is tricky because “verbal consent does not necessarily mean that a partner is freely engaging in sex,” since “social pressure or substance abuse may be involved.”
Another issue with active consent is that “explicit consent has met with cultural resistance, as men and women generally believe discussing consent decreases the chance that sex will occur.”
Peeters and Haselager say “active consent” has been a helpful tool on college campuses in fighting sexual assault and rape.
Check out a consensual sexbot like a library book?
GWU’s Banzhaf told The Fix that he’s skeptical of the authors’ argument, as he paraphrased it, that “‘active consent’ (also known as Yes Means Yes) can help to reduce rapes on college campuses, and presumably that sexbots can be used to teach this concept.”
He said there was no evidence that trying to “redefine the time-honored legal concept and definition of consent” would reduce campus rape:
Furthermore, even assuming for the sake of argument that sexual interactions with a sexbot equipped with a consent module would teach virtue and/or the need for consent and thereby reduce rapes, it is hard to see a university requiring its incoming (presumably male) students to have sex with a robot enough times to teach this virtue, or even permitting students to interact with a sexy female robot on a voluntary basis much like checking out a book from the library or some scientific equipment from a lab.
Sex robots could also be used in therapeutic settings, specifically with people who have narcissistic personality disorder, according to Peeters and Haselager.
They recognize the implications of virtuous sex robots, such as violating a person’s autonomy. If a robot is meant to bring a person beer, for example, should the robot stop serving the person if the robot believes the human has had one too many?
Another implication of a consent module for sex robots: What would the robot do when it rejects sexual interactions? Would it shut off completely?
In their conclusion, Peeters and Haselager recognize “the misuse of sex robots could have a lasting impression on an adolescent learning about intimate relationships,” and that society’s response to sex robots is unpredictable.
“Rather than highly theoretical discussions by authors who apparently have no relevant educational or other background in this area,” Banzhaf concluded, “what is need[ed] are studies by trained professional about what is already happening, and then probably appropriate legislation and/or regulation.”