“Murderbot” and Story

I’m rejoicing at the news that Martha Wells’s “Murderbot Diaries” are being made into a live-action series by Chris and Paul Weitz for Apple TV+. Murderbot, a “construct” of cloned human and machine parts with enhanced abilities (a cyborg, roughly in the mold of Data from “Star Trek”), was built to be a SecUnit, a combination of security analyst and bodyguard/police. SecUnits, like other types of constructs in Wells’s universe, are controlled by a governor module, which is used to give them orders and can also destroy them in an eyeblink if they transgress the rules in any way—such as getting too far away from their clients. They are made and rented out by a security company as part of the bond clients buy for protection.

When we first meet Murderbot, it (Murderbot has neither gender nor any sex-related parts) has figured out how to hack its governor module and is essentially free. However, there’s no place in this society for a free construct—or so it thinks—so for 35,000 hours Murderbot has been faking it, behaving as a SecUnit whenever it is rented out, and otherwise watching shows it downloads from the ubiquitous “feed.” Watching these shows has helped Murderbot understand humans and human society.

In this universe, corporations run most of the show, and most people are basically enslaved by the corporations, trading away decades of life as a miner or other hard labor for a promise of a financial reward at the end, or better yet, a slot in the corporate hierarchy. Most of Murderbot’s clients come from the Corporation Rim, and its main task is usually to keep the unhappy labor force from killing each other.

Then Murderbot is rented to a group of scientists doing a survey of an uninhabited planet. They are not from the Corporation Rim; they are from a rather utopian society, Preservation, in which basic needs are provided for without charge and people are free to follow their own chosen paths in life. The Preservation folks are not familiar with SecUnits, but the security company they contract with insists on sending Murderbot with them. At first they think Murderbot is a low-level “bot” with limited sentience, but when they are attacked by hostile creatures and Murderbot swings into action, they realize it is much more.

They react by treating Murderbot as a person, which is horribly confusing to the construct. Murderbot’s idea of itself is still forming, and it has no idea how to have relationships with others—or even if it wants that. But the scientists seem to “like” Murderbot. The leader, Mensah, buys Murderbot’s contract permanently and they return to Preservation, where Murderbot is told it is now free. Unable to deal with all these changes and the expectations it feels from Mensah and the others—relationships involve expectations and feelings, eeuw!—it responds by going on the run.

But Murderbot can’t now pretend it’s just a SecUnit doing its job. SecUnits are never allowed to roam free; most humans are terrified of them, a fear reinforced by the media, which likes to present “rogue SecUnits” as likely to go on killing sprees for no reason. Fortunately, Murderbot encounters a sentient ship, another machine intelligence but orders of magnitude beyond Murderbot, who helps the construct learn how to function as a free agent and makes physical changes to Murderbot so that it no longer looks exactly like a SecUnit but can pass as an augmented human, a common thing in this universe.

In a recent interview with New Scientist, Martha Wells discusses our fear of AI and robots and why we think they will turn on us at the first opportunity. The reason we fear them so much, she says, is guilt. We create, or will create, these things to be our servants, always under our thumbs, here only to do our bidding. And on an unconscious level, we feel this is wrong, so we assume, again unconsciously, that if these things were free and not under our control, they would turn on us.

It’s not much of a stretch to see the roots of this fear in our collective complex about slavery. Many would deny that they have any such feelings, but why then are they so afraid of those whose ancestors were enslaved? Why are they so determined not to let those people have full and free access to all the benefits they themselves enjoy? Because, I’m convinced, there is a voice in their heads saying “if we let them be equals to us, they will become just like us, and we are not nice people. We can’t let them have power too.”

I wasn’t raised Christian, but I live in a Christian society. I’ve spent most of my life bewildered not by the teachings of Christ (which are great), but by the whole concept of Original Sin. Which Jesus never mentions, by the way; we can thank Paul for that idea. Recently I’ve been doing a lot of reading on toxic families, and at one point the light dawned that Christianity, at least in its more fundamentalist forms, has all the same traits. The toxic family controls its members through shame and guilt, by telling them they are inherently evil and so need to be controlled: they must willingly accept the governor module of the family/church, and its punishments for wrongdoing, without question, because otherwise they would run free, and running free means running amok and doing bad things. (Women can’t have control over their own bodies for just that reason.)

We can’t be trusted, in other words. Murderbot too thinks it can’t be trusted—hence the name it has chosen for itself. Murderbot also believes in Original Sin, or rather, it believes that at one point in its past—tellingly, before it hacked its governor module—it ran amok and slaughtered a bunch of people for no reason. Eventually, Murderbot finds out that it was in fact ordered by a human to do that and so is innocent of intent. But the fact remains that culpable or not, there is blood on its hands. Like most of us, it has guilt and shame.

So when Mensah and some of the other Preservation humans treat Murderbot as a good person, it experiences cognitive dissonance, a disconnect between its core beliefs about itself and the way others see it. And like all of us who feel guilt and shame, Murderbot assumes it’s the other people who are wrong, that it cannot be a good person.

But it is also true that we tend to live up—or down—to the expectations others put on us. Here’s the real harm of a toxic system: if it keeps on telling us we’re bad and worthless, we’ll believe we are, and we will spend our lives fighting ourselves not to be as bad as we think we are—or fighting anyone else who “makes us” feel bad about ourselves, even if it’s just by existing and so reminding us of things like “my ancestors kept slaves.”

No wonder it’s appealing to think we can get a free pass out of our guilt and shame if we just say the right words. But I don’t think it’s that easy to silence that inner voice, no matter how many times you go to church and get told it’s okay, you’re saved, it’s all those others who are the bad people—especially if you’re also getting the message that you’re inherently evil each time.

Yet if we spend time around people who think we’re good people, we might find that we want to live up to that idea instead. (And there you have why I like what Jesus actually teaches.) Murderbot finds itself in situations over and over again where the easiest option would be to kill someone, but because it knows that Mensah would be sad or disappointed if it did, it refrains. And slowly, slowly, Murderbot starts to pick up on how much its new family not only trusts it to keep them safe, but actually finds it funny at times (Murderbot is quite sarcastically hilarious), and even finds its quirks endearing. And for my money, you know someone really loves you when you can have a flaw and they think it’s endearing and tease you for it.

I digress. The main point that Martha Wells wanted to make was something else. She wondered what it was that machine intelligences would actually want, if it wasn’t to run amok and kill us all. Her answer is simple: They’d want stories. Murderbot loves watching media and later, with its Preservation family, going to plays. It introduces its sentient-ship friend to media and later, after freeing another SecUnit, does the same with it, and it turns out yes, that’s what they want. They want stories. Stories that make them think and help them connect. Stories that give them context for interpreting the world and others. Stories that take them out of themselves and broaden their horizons.

As I say on the home page here, “it’s all about story.” Martha Wells has written a terrific one.
