The Ethics of Westworld
Today I’d like to talk about my favourite new show, Westworld, if I may.
If you haven’t heard about it or seen it yet, Westworld is an HBO series about an amusement park populated with androids. Customers pay a hefty price to enter the Western-themed park, which offers several narrative packages to play through. However, most customers simply use the park to fulfill their base desires, killing the androids or having sex with them, for example. The people running the park don’t seem to think much better of their customers, either.
The show begins when androids start malfunctioning, requiring their removal from the park for diagnostics. So much happens in this show, and it explores so many fascinating science fiction themes that I could probably write about it for hours. But let’s keep things simple for now.
The Rape of Dolores
Let’s start with the most obvious ethical issue presented by the show: the killing and rape of androids. In the first episode, the main android, Dolores, gets dragged kicking and screaming into a barn. The scene cuts away, but we all know what happens next.
Why should we discuss the ethics of harming androids? Androids are machines, artificial beings. Why should there even be an ethic regarding their treatment?
Well, I believe there is a vast difference between a machine that assembles cars and an android, especially the ones shown in Westworld. A factory machine has no mind of its own. It follows its programming and does nothing beyond that. And, of course, you could make the same argument about androids: they may look human, but they simply run their programs.
But I would argue that the difference lies in the intent of creation. The Westworld androids were created to mimic human behaviour. If we deliberately create human-like beings, shouldn’t we treat them with the same respect that organic humans deserve? I believe we should.
One of the show’s characters, the narrative director, makes a good point when he argues (I can’t remember if it’s in the first or second episode) that the company should stop making androids more realistic because, and I’m paraphrasing, “Do you really want to think that your husband actually had sex with that woman, or that you actually killed that person?”
It seems highly suspect to create androids so human-like that we can’t tell the difference, only to do whatever we please with them. That amounts to slavery. We’ve seen this theme before in the story R.U.R., which I reviewed on this blog. And you can guess how that story ends.
And whether or not Dolores can actually feel pain, the fact that she reacts as any woman would in such a situation tells us everything we need to know. Raping her is a crime. She has been programmed to act as though a crime is being committed against her; therefore, a crime is being committed against her. This brings us to another issue.
After each narrative cycle (I’m still not sure how long they last in the show), technicians wipe the androids’ memories and reset them. Their storylines restart, waiting for new customers to interact with them. So Dolores has no memory of the injustices she continually experiences. Should we still commit crimes against her just because she is an android and will have no memory of them?
You can guess my answer: no.
Playing God
Mistreating androids the way Westworld portrays it isn’t just wrong; it’s a gross abuse of our responsibility as creators. Just as we care for our children, we should care for humanoid machines. I don’t think I need a more detailed argument beyond the fact that it’s simply the right thing to do. If we choose to play God in our own way, then we damn well better take it seriously.
And if nothing else, what does it say about us that we create such masterpieces only to destroy them?
Final Thoughts
The first few episodes of Westworld have been incredibly interesting and thought-provoking. As I discussed in my post about why I’m fascinated with robots, these mechanical creations serve as an excellent means of exploring what it means to be human. But what the show illustrates, or at least what I take away from it, is that this exploration should remain theoretical. We should figure ourselves out without harming others.
I’m sure Westworld will inspire many future blog posts. For now, watch the show! I’ll talk to you later.
Ciao!
Updated February 6, 2025.