SUPPORT REQUEST: I’m playing a sim-style game, and the non-player characters that you deploy have particular skills, weaknesses, likes, and dislikes. So I sometimes put them in situations that I know will make them uncomfortable, like sending a guy who is afraid of space out to mine an asteroid. The results can be hilarious. But I also feel a little uneasy that I’m not letting them live their best lives. Am I being unethical?
Dear Dungeon Master,
Games of this sort allow ordinary mortals to live out the fantasy of playing God. You become the demiurge of your own digital cosmos, dictating the fates of characters whose lives, such as they are, remain subject to your whims. Playing them tends to raise the sorts of questions that have long been taken up by theological and tragic literature.
Ever since we humans started writing, it seems, we have suspected that we are pawns in the games of higher beings. In the Iliad, Hector, upon realizing that he is facing death, complains that men are playthings of the gods, whose wills change from one day to the next. It’s a conclusion echoed by Gloucester in King Lear, as he wanders the heath after being ruthlessly blinded. “As flies to wanton boys are we to the gods. / They kill us for their sport.”
In the book of Job, Satan and God place a bet on whether Job, a most righteous man, will curse God if enough suffering and hardship befall him. After securing God’s permission, Satan kills Job’s children, his servants, and his livestock and causes his body to break out in boils. Job, who has no clue that his suffering is simply the subject of a gentleman’s wager, can only assume that his woes are divine punishment. “My flesh is clothed with worms and clods of dust,” he cries out. “My skin is broken, and become loathsome … My life is wind.”
It’s difficult to read such passages without sympathizing with the human victims. And I imagine that the uneasiness you feel when provoking your characters means you suspect that you are similarly making them suffer for your own entertainment. Of course, non-player characters—NPCs—are just algorithms with no minds and no feelings, hence no ability to feel pain or discomfort. That is, at any rate, the consensus. But humans, as you probably know, have a bad track record of underestimating the sentience of other creatures (Descartes believed animals were simply machines and could not feel pain), so it’s worth taking a moment to really consider the possibility of algorithmic suffering.
Many NPCs rely on behavior tree algorithms that follow rote if-then rules, or—in more advanced characters—machine-learning models that develop their own adaptive methods. The ability to suffer is often tied to things like nociceptors, prostaglandins, and neuronal opioid receptors, so it would seem that video game characters lack the neurological hardware required for a pain response. Emotional distress (our ability to feel fear, anxiety, discomfort) is more complex, from a neurological standpoint, though emotion in humans and other animals often relies to some degree on external stimuli processed by the five senses. Given that these algorithms have no sensory access to the world—they can’t see, feel, or hear—it’s unlikely that they are capable of experiencing negative emotions.
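For readers curious what those "rote if-then rules" look like in practice, here is a minimal sketch of a behavior tree, the structure many NPC brains are built on. Everything here (the function names, the health threshold, the notion of an NPC as a dictionary) is illustrative, not taken from any particular game engine: a "selector" node simply tries its children in order until one succeeds.

```python
# Minimal behavior-tree sketch: an NPC picks an action from rote
# if-then rules. All names and thresholds are hypothetical.

def flee(npc):
    npc["action"] = "flee"
    return True

def attack(npc):
    npc["action"] = "attack"
    return True

def idle(npc):
    npc["action"] = "idle"
    return True

def selector(*children):
    """A selector node: try children in order until one succeeds."""
    def node(npc):
        return any(child(npc) for child in children)
    return node

def condition(pred, child):
    """A condition node: run the child only if the predicate holds."""
    def node(npc):
        return pred(npc) and child(npc)
    return node

# Self-preservation is checked first, so low health overrides combat.
npc_brain = selector(
    condition(lambda npc: npc["health"] < 30, flee),
    condition(lambda npc: npc["enemy_visible"], attack),
    idle,
)

npc = {"health": 20, "enemy_visible": True}
npc_brain(npc)
print(npc["action"])  # low health wins out: the NPC flees
```

The point of the sketch is how little is going on: the "fear" an NPC displays when wounded is just the first branch of a lookup winning out over the others, which is why the neurological case for algorithmic suffering is so thin.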
Still, when it comes to the ethics of suffering, neurology is not the only relevant consideration. Some moral philosophers have argued that the ability to hold preferences—the capacity to see the world in terms of positive and negative outcomes and to develop decisionmaking processes about these outcomes—is a definitive criterion for real suffering. One advantage of speaking of preferences rather than pain is that whereas pain is entirely subjective, felt only by the person who is suffering, preferences can be observed. We know cats have preferences because they recoil from bathtub water and sometimes scamper off when approached by dogs. The fact that your NPCs have, as you put it, "particular skills, weaknesses, likes, and dislikes" suggests that they do in fact have preferences, though this is also something you can test by simple observation. When you put them in undesired situations, do they resist or struggle? Do they exhibit facial expressions or body movements you associate with fear? You might object that such behavior is simply programmed in by their designers, but animal preferences could similarly be thought of as a kind of algorithm programmed by evolutionary history.
Brian Tomasik, an ethics consultant at the Foundational Research Institute, has argued that NPCs are “morally relevant processes,” meaning that we, as humans, have some degree of ethical responsibility toward them. Many NPC algorithms resemble the kind of goal-directed behavior (planning, welfare monitoring, adaptive responses) one finds in complex animals, Tomasik argues. Not all NPCs fit this description, obviously. Some, like the Goombas in Super Mario Bros., are little more than bouncing objects. But once you start talking about characters that try to avoid death or that suffer penalties to their health or well-being when they are injured, our treatment of them becomes “marginally ethically relevant.” Tomasik admits that NPC suffering is not among the world’s most pressing ethical challenges. “On any given occasion it’s not a big deal,” he said in a recent interview, “but aggregated over tens of millions of people killing thousands of these characters on a regular basis during game play, it does begin to add up to something nontrivial.”
Given that you’re not actually killing or torturing your characters, the stakes are not quite so high. In fact, it’s possible that putting your characters in difficult situations might actually be beneficial to them. Your decision to play them against their skill sets calls to mind a popular trope in epic literature—the unlikely hero. Paris is portrayed in the Iliad as unskilled and cowardly, epitomized by his preference for the bow and arrow over hand-to-hand combat in the Trojan War. And yet, according to later tradition, it is Paris’ arrow that manages to kill Achilles, striking him in the heel. Religious texts are similarly full of stories in which the gods choose unusual agents to carry out their will. Moses is commanded to confront Pharaoh and negotiate the freedom of his people despite having poor communication skills. (“Pardon your servant, Lord,” he protests, when receiving his divine mission. “I have never been eloquent, neither in the past nor since you have spoken to your servant. I am slow of speech and tongue.”) All of which is to say: Sending a space-fearing sim on an asteroid-mining mission has all the makings of heroic literature.
If you think about your own experience, there are undoubtedly times when being forced into an uncomfortable situation caused you to grow or expanded your understanding of your own capacities. Perhaps your decision to push your characters beyond their predetermined abilities betrays an underlying hope that some good might come from situations they find undesirable. Of course, only you can decide, by carefully examining your motives, whether that’s the case. When you put the characters in difficult situations, are you getting some kind of sadistic glee out of watching them suffer, or glorying in the unfettered power you hold over them? Or does part of you believe that they are capable of more than their scripts suggest, that they have the potential to evolve beyond their hardwired limits? I’d like to believe the latter, though not merely out of concern for the algorithms. All games are, to some extent, synecdoches of life, and the miniature worlds we create reflect the implicit convictions we hold about our own. To trust that predetermined digital beings are capable of overcoming their programming entails an expansive faith in human nature, a belief that we too can occasionally transcend the determinations of biology and genetics and summon our better natures from the brute mechanics of fate.
Be advised that CLOUD SUPPORT is experiencing higher than normal wait times and appreciates your patience.