Friday, June 1, 2018

Forgotten Evil: The Ethics Of Westworld

"I had a question.  A question you're not supposed to ask.  Which gave me an answer you're not supposed to know."
--Peter Abernathy, Westworld (season one, episode one)


Is it wrong to hurt a conscious being if it will suffer no permanent harm--not even the lasting mental harm of traumatic memories?  If an entire class of android beings was created by humans for the sole purpose of having other humans treat them however kindly or sadistically they wish, would the purpose intended by the creators render casual violence against the androids morally permissible?

Westworld addresses these issues directly.  After finishing all seven current seasons of Game of Thrones, I suspected it was only a matter of time before I started HBO's Westworld!  By just the fourth episode, the show presents an exploration of the relationship between sentience (consciousness) and ethics, the danger artificial intelligence could pose to humans, and the ways that sadism can seize people who otherwise seem just like us.  It depicts a park, managed by a corporation, full of androids (robots designed to appear like humans) placed in a reconstruction of a late 1800s Western town--hence the name of the park: Westworld.  Various androids are given their own partial backstories and scripted activities or speeches; some are cowboys, some are prostitutes, some are soldiers, and so forth.

The park is not meant for its human guests to merely observe a recreation of a past society, though.  It exists so that those who pay an entrance fee can do to the androids whatever they wish.  If they so desire, they can beat, rape, abduct, or "kill" them.  Each day, many of the androids return to their scripts, their memories of abusive treatment erased or suppressed by the park managers.  People can express impulses they would not be permitted to show in modern society as long as they pay for the experience.

Some guests visibly enjoy tormenting the androids, which have been programmed to be unable to effectively fight back, since they cannot actually kill in self-defense.  They might be able to fire shots from their nonlethal guns, or try to flee from assailants, but they are unable to save themselves from the actions of the guests.  And the ones who abuse the androids do not make any ethical judgments against the way the park is used.  The pleasure of an abuser cannot make something right, wrong, or amoral, however.  If an act is wrong simply because it involves the unjust suffering of a sentient being, then even erasing the memories of a victim would not change the fact that the act itself was evil, as the victim still suffered.  This demands a correct understanding of what it means to be a sentient being.

In episode three of season one, Robert Ford, the cofounder of Westworld, makes a major logical blunder when he tells an employee that the androids are "not conscious," though they have been intentionally designed to have perceptions, emotions, and drives.  A non-telepath can never know if any other consciousnesses exist, so a non-telepathic observer could never verify whether a robot or another human truly is conscious.  But if a robot truly has been crafted with the ability to perceive on any level, then it is conscious, for consciousness is the ability to perceive as well as the thing that perceives.  Consciousness is not the sum of multiple components of mental life, as if removing emotions or the will would remove consciousness; consciousness is a mind.  It is the state of having any mental life.  Even perception without the ability to actively reason, feel physical or emotional sensations, direct a will in a particular direction, or create mental images is still consciousness.  A robot with consciousness is, by nature, a sentient being.

Some might side with many of the guests of Westworld, insisting that as long as androids, and not actual humans, are the recipients of cruel treatment, then nothing evil has actually occurred.  But if an act of torture is wrong, then its wrongness does not disappear when it is inflicted on a non-human being, like an animal or an android.  If sadism is wrong, whether that sadism is directed towards a human or a non-human being is irrelevant to the fact that it is immoral.  Even if human life takes precedence over animal life or robot life--and this is exactly what Christianity teaches--this does not mean that there are no moral obligations governing human treatment of other conscious entities.  That one being has a greater metaphysical value than others does not mean that the others can be treated in just any way, with no moral dimension to the treatment.

If an evil act is forgotten, its evil is not erased.  It only ceases to be remembered.
