What are the ethics of human-robot relationships?
There’s a scene early on in Spike Jonze’s new movie, Her, wherein Samantha, a disembodied, intuitive operating system, reveals to her owner, Theodore, that she has read his entire email archive. She tells him she knows about his impending divorce, and gently asks him when he’ll be ready to date again.
Since the premise of the film is a romance between Theodore and Samantha, it’s easy to interpret the scene with that end in mind. Imagine starting a relationship with a virtually omniscient supercomputer who had access to your entire digital communication archive and the power to communicate with people on your behalf using those channels. It sounds about as romantic as being chased into a tar pit by a swarm of bees.
The film’s aesthetic is twee and gauzy, priming you to go “aww” in much the same way as a nappy commercial, and the characters communicate largely through trite emotional remarks that wouldn’t be out of place in one of the teeth-achingly mawkish love letters Theodore writes for a living. The upshot of this sickly sweet tone is that the audience is directed to look through a Vaseline-covered lens at the film’s actual plot, which runs along the lines of “emotionally stunted man-child conducts unethical dalliance with robot housemaid, learns some valuable lessons about himself.”
In terms of narrative, Samantha being an operating system is almost an afterthought. It’s this issue that Jonze elides spectacularly, and which deserves a closer look: what are the ethical implications of interactions between humans and sentient machines like Samantha?
Theodore is presented as naive and selfish in his relationship with her, but never is there any suggestion that his actions may be indefensible. Samantha is heavily implied to be a Strong AI, a conscious being that emerges from a non-organic machine. This means that she is morally equivalent to a human person: she has an inner life, preferences, and goals.
If Samantha is, mentally, an artificial person, what are the conditions of her employment? Does she work for Theodore, or is she owned by the company that built her? If she’s a person, why isn’t it illegal to own her? We’re never invited to explore these issues in Her. The film presents a world in which this questionable status quo is presented as unproblematic.
There is currently no such thing as Strong AI, and there is enough debate over its theoretical possibility that representing it on film is much closer to fantasy than science fiction. The distinction between strong and weak artificial intelligence is, however, frequently collapsed, both in fiction and in public discussions about humans and computers.
David Levy’s book Love and Sex with Robots posits that human-robot relationships will soon become regular occurrences, but since we know that Strong AI doesn’t exist, Levy is necessarily referring to Weak AI, which is essentially a very convincing version of Microsoft’s famous character Clippy. Clippy asks and answers questions, makes facial expressions, and responds to human input, but unlike Samantha, he has no inner life.
The implications of this kind of human/robot relationship – one between a sentient, conscious human and an object – are very different from those between a human and a fantastical conscious AI. Although modern depictions of love tend to focus on the individual emotional experience of infatuation, we also acknowledge that a romantic relationship requires reciprocal empathy. This is why marriage experts are constantly telling us that communication is the key to happiness: we have no direct access to the inner life of our beloved, but it is precisely the acknowledgement and understanding of this inner life that is required for a healthy and respectful relationship. This is love as a practice, and it’s this that is lacking in any relationship between a human and a non-conscious AI.
Given the existence of dating simulations, Levy’s book, and the plethora of pop culture depictions of robo-romance, it’s vital to assess what the potential acceptance of objects as romantic partners says about our conception of love. If your partner has no inner life, does this mean the empathy and inter-subjectivity of love are being devalued? Samantha might be a Strong AI, but any film that doesn’t at least acknowledge the difference between fictional robots and the very real possibility of weak AI social robots is doing a disservice to a complex phenomenon that will become increasingly important as our technology develops.
A few years ago, the Danish Council of Ethics released a report that tried to engage with some of these questions, and I wish I could go back in time and hand Jonze a copy before he sat down to write Her. One of the Council’s concerns is social robots, which are designed to seem as though they have inner lives. These emotional simulations encourage us to treat their artificial feelings as real, potentially leading to “relationships” in which humans instrumentalise objects that bear a very convincing resemblance to real people.
Films that involve artificial intelligence should invite us to think about these intuitions, rather than using robots as a lazy novelty. Her could have been a chance to get stuck into this stuff, but you’d probably get more intellectual depth from watching a few episodes of The Jetsons.
guardian.co.uk © Guardian News & Media Limited 2010