The Pop-Culture Theologians Reflect On Westworld Season 1
Hello World! Martha Cecilia here! I am 1/2 of the Pop-Culture Theologians squad at Engaged Gaze. As pop-culture addicts, we spend an absurd amount of time dissecting, arguing, and fan-girling over film, TV, and literature, and we want to bring you into our discussions.
With Westworld Season 2 having debuted last Sunday, I thought we would do a quick recap of my takeaways from Season 1, which will lead to our first joint Sunday Sermon post on Season 2, Episode 1 tomorrow! We welcome you into our crazy little corner of discussion. Stay tuned for Sunday's post on last week's episode before the premiere of Episode 2.
So, spoiler warning--SPOILERS UP AHEAD. We are assuming that you have seen Season 1, and each new post will assume you have seen the newest episode. Below is a trailer for Season 2; this first-season recap post is spoiler-free where Season 2 is concerned. We will give you a week to catch up between episodes. But from here on out, read at your own risk. And come at us in the comment section--we are counting on it and look forward to chatting it up.
So, let's do this thing and break down Season 1, y'all!
Martha Cecilia's 5 Sips of Tea:
1. “Have you ever questioned the nature of your reality?”
At the core of Westworld mythology, and maybe its own “cornerstone” (wink wink), is this question posed by Dolores Abernathy, her innocent southern drawl whispering as much to the viewer as to herself in the opening of the first episode. This question is repeated in multiple iterations throughout the entire season, culminating in Dolores’s frantic plea to William in the final two episodes of Season 1: “Where am I? When am I?”
As the viewer, we are asked the simplest of unanswerable questions: What is the nature of “our” reality? Is my reality a solitary experience or a communal one? Could this be a question that exists in multiple planes for multiple folks or even for one single being? What does that loaded question even mean? What does it mean to exist? Insert head exploding, right?
From a theological perspective, it also raises the question of our place within the concept of natural law. At its core, it seems to me that Westworld is deconstructing the concept of natural law and man’s understanding of his place in “this” world—another loaded concept itself in the face of this broader question of “reality”, existence, and, more importantly, co-existence.
2. We have not developed any sort of collective “tech ethics”—and we need to confront our ever-changing technological landscape proactively, before it is too late.
So, this actually goes back to conversations that first percolated after I saw the film Ex-Machina. Ex-Machina, similar to Westworld, asks the audience: how do we define consciousness? Maybe even more fundamentally, what does it mean to be alive? Is it the ability to feel? What does "to feel" even mean? Is it perceived happiness, sorrow, pain, loss? Is it awareness of self? If that is the case, what exactly does it mean to be aware of oneself (Y'all--this post is mostly questions I can't answer--YES WESTWORLD)? Is it the human experience (for those of faith) of being “created” by an unknown and non-human creator (whatever that unknown is for you)?
These questions then lead to a secondary set of questions and discussion: what responsibility do we, as creators within the narrative of AI, have to said AI if/when they develop “consciousness” or pass whatever future Turing Test (more on this below) we create as these issues get more and more complex? I am using quotation marks around the concept of “consciousness” because at the heart of every friendly debate I have ever had over both the film Ex-Machina and Westworld is the divide over how we define consciousness and what it means to be created and/or a creator. Obviously this piggybacks off the fundamental questions of what is reality—what does it mean to be alive?
For some, the “hosts” in Westworld are quite simply computers and code. Simplistically speaking, they are not within a hierarchy of existence (think of the traditional top-down ordering of god, man, woman, animals, insects, plants). A robot would not factor into that traditional hierarchy of power and existence. To kill a host would be no different than to drop a phone and have it shatter. There is no moral weight to any action perpetrated on a "thing" you have built (created). Even if you were to take said phone and throw it angrily against a wall, the only critique would be of the "thrower's" need to reflect on how they handle their anger and how stupid it is to break something so expensive over an emotion. I can't even get started on how similar the language of code is to theological explanations of being, and how exhausting it is to dialogue about the way we humans have historically explained our own existence through language that mimics written code while viewing those narratives as biological vs. mechanical--but whatever (shoutout to my lil' brother John-Paul--hopefully a future guest theologian).
Then for others (like myself), the questions of consciousness and what defines it complicate our relationship with the concept of AI—mostly in a hypothetical future where AI like those of Westworld will or could exist.
In Ex-Machina, when Nathan, a programmer who has created a conscious AI (Ava), is discussing his work with a younger programmer named Caleb, he says two things that have stuck with me:
“One day, the AIs are going to look back on us the same way we look at the fossil skeletons on the plains of Africa—an upright ape living in dust with crude language and tools, all set for extinction.”
"If you have created a conscious machine, it’s not the history of man. That’s the history of gods.”
I have sat with those quotes for a couple of years now, always chewing on the depth and audacity of both the critique of humanity's understanding of (and lack of humility about) its place in this world, and of how dangerous humans are. Think of our current treatment of, and the systemic harm already inflicted on, minority and oppressed communities--those who within the bio/mechanical dialogue are already "equals"--and how much more dangerous humans will be the more power they have to create new systems of oppression and new groups to be oppressed.
So, historically speaking, the Turing Test is our initial marker for AI and consciousness. The Turing Test, developed by Alan Turing in the 1950s, is a test of a machine’s ability to exhibit intelligent behavior equivalent to, or indistinguishable from, that of a human. The creators of both Ex-Machina and Westworld are playing with a different test--one we have talked about significantly since the rise of the alt-right (but let's be real--the American narrative was built on oppression and genocide, so from the beginning). Here is my thesis: Westworld's creators are trying out a human test—an empathy test. If and when we do create (code, build, etc.) "beings" in the form of AI that pass a complex Turing Test, what is our responsibility as creators? What makes for a good creator? Here is a doozy (maybe at the heart of the maze we as viewers are on)--are we empathetic enough (coded or hardwired--the irony) to bear the responsibility of creation? And then, how do we define empathy for those we have created? How do we cultivate it? Is it enough? Do we know how to show up as allies through authentic empathy and relationships with those who are not like us, and do the emotional labor needed to advocate for those who cannot show up or defend themselves due to a systemic lack of agency (that may be of our own making)?
I am just going to go out on a limb and say that historically, we do not, as humans, have a good chance of passing this test.
If you want to see how uncomfortable people are with this dialogue on empathy and responsibility—particularly those who fall in the first group, quantifying any “consciousness” as simple code—try the following questions (I recommend some whiskey to get through this exercise):
Is there an ethical dilemma or moral weight in killing an adult host?
Is there an ethical dilemma or moral weight in raping an adult host?
Is there an ethical dilemma or moral weight in killing a child host?
Is there an ethical dilemma or moral weight in raping a child host?
Even if you don't understand/relate to what they are feeling, would you advocate for a host? Is there any reason to advocate? Do you have that moral obligation?
There are so many different questions within this dialogue--but the one thread that always comes up is just how hard it is to talk about these hypothetical ethical issues. Most often, the discomfort I personally have come across is deflected by, “Well, it would say something about the human, but it isn’t murder—it isn’t rape.”
And then you go down the rabbit hole of: but if they—even through “code”—can feel it—the emotions of fear, pain, loss—then is this about more than just a judgement on the perpetrator? Is it an actually unethical, immoral act?
The problem is, with technology exponentially growing and expanding, we have not even begun to have these conversations—as ethicists, theologians, people of faith, even just as humans. And we are falling behind. By the time technology catches up with our imagination (the driver of creation), we will need to have had these conversations—we will need tech ethicists and best practices in place. And these will be nuanced conversations at the intersection of ethics, empathy, faith, and power.
We are woefully behind. And we need to prepare.
3. "These violent delights have violent ends...”
Toxic masculinity (a patriarchal society) is the current human condition (at the intersections of white supremacy and institutionalized male supremacy, among other connected afflictions). It and its systems of oppression are killing us (some at a much more rapid pace than others) and will kill anything in their path. The fact that the premise of Westworld is humans coming into a world where there are no rules points to a division between a world with a certain set of ethical boundaries and the fantasy of a world without them. While there are allusions to some visitor experiences of Westworld where families can simply experience a different time and place, we are very early on led to understand that raping, pillaging, killing, and chaos are the prime drivers for the visitors. It is the headline on the metaphorical fucking brochure.
We see this most clearly in the characters of William (The Man in Black) and Logan—two stakeholders in the park experience who initially approach it quite differently, though both seem to reach the same conclusion. Logan, from the start, views the hosts as nothing more than machines meant to be at his disposal—with no agency and no innate personal value. William, on the other hand, seems to initially struggle with this view of the hosts—up until Logan, frustrated with the empathy that William is showing Dolores, cuts her open, revealing her mechanical framework. William is further divorced from his initial empathy when he encounters Dolores again, on the same loop that led him to her, with a different random male visitor. His fragility is on display in his rage that his experience with Dolores was not his own, leading to a complete dismantling of his empathy. I want to reiterate--it was not the fact that she was mechanical that led him to turn into a murderous and raping madman in a silly hat. It was the fact that he was replaceable (there is irony in this internalized truth, when the hosts' condition is their replaceability) and that a woman had inadvertently and involuntarily dismissed him.
On the other hand, we have Maeve. As she slowly gains consciousness through her coding adapting to her condition (and coding is adaptive--we know this), she begins to realize her worth as a being. As someone who is aware of her existence, ignorant of the debate about the nature of her existence, she reclaims her agency and power by expanding her awakening through killing herself over and over again—leading her to the maintenance bay, where she takes her power back with the help of sweet and bumbling Felix, a Delos employee struggling with the unclear line of what the hosts are, and with a very real streak of empathy that is the key to Maeve’s freedom. She had functioned in multiple capacities as a host in Westworld—most importantly as a mother in a homestead with a daughter, and then, after experiencing her first "awakening" upon the loss of her daughter in a violent encounter with a visitor, as a madam in a brothel. She even cuts into herself—confirming her gut feeling that she is not “like the visitors” by seeking out her mechanical makeup—but that does not lead to some sort of identity crisis. It instead validates her marginalization, commodification, and captivity. This highlights that being like "others" is not the ruler by which existing should be measured--at least not for the creators of Westworld. I thought this was HUGE! This is a huge transgressive dismantling of our understanding of existence. This is the golden nugget!
While she and Bernard do eventually argue over whether or not her awakening itself is true awakening or programming, the fact that Maeve does not give power to the language of coding is EVERYTHING. Coded or not, she is. And she will be. And she has always been.
Oh, and she doesn’t care what anyone, including her creators, thinks. It is irrelevant to her existence. The fact that her programming was to get her out of Westworld, and that she makes the decision to come back for her daughter, matters. Unlike the visitors, Maeve and Dolores—as they awaken—reject the narrative of the human condition's tendency toward violence. They reject the narrative that their bodies, their voices, and their lived experiences are insignificant or subject to a pre-established hierarchy or "code". And that, folks, is how theology is made and dismantled. It is how patriarchy and toxic masculinity are dismantled. And it is how white male supremacy crumbles.
4. Coming to terms with our existence and mortality is hard. Coming to terms with systems of oppression, particularly those we ourselves participate in, is much harder.
I will keep this one short. Westworld is calling us out, as individuals who actively participate in systems of oppression, to do some introspective soul searching. It is we who need to wake up. Even those who fall within the lines of intersectional oppression participate in other systems of oppression as oppressors. And no matter how “woke” we consider ourselves to be, the hardest dismantling of self we come in contact with in our existence—at least in my opinion and interpretation of the Westworld writing—is not that of our mortality, but of our capability, capacity, and complicity in doing harm.
When Bernard asks Ford to let him download his entire history, he is not horrified that he is a host. He is horrified that he has systemically oppressed and harmed others without even realizing it.
For the writers of Westworld, the maze we should all be going down is realizing that we are all Bernard.

Or at least we should strive to be.
5. It is never too late for redemption.
At the end of Season 1, Ford shows remorse not only for the events that led to Arnold’s death, but for his blindness to the horrors of the park he has created and the immeasurable harm he has inflicted. And while his final narrative (the Wyatt–Board of Trustees narrative) is flawed, it is his attempt to shut down the park and rectify what has gone so horribly wrong. At a minimum, he no longer wanted to be complicit in the oppression of the hosts and his creations. He pivots as best he can.
And in a world where oppression is at every corner, consistently protected by the fragility of the oppressors, it is refreshing to see a narrative where a humble admission of guilt is given, the horror of what has been perpetrated is accepted and processed, and a pivot in hopes of reparation is made. While it may be cowardly or even too extreme for some--I will take it. I will take the narrative that we have to try to make reparations, become allies, and at times even put our own bodies at risk in the hopes of true and authentic change. I will take this. As flawed as it is, I will take it.
See you tomorrow for our first ever Sunday Sermon!