Dancing in Chains . . . This was the discipline of the Greek poets: first to impose upon themselves a manifold constraint by means of the earlier poets; then to invent in addition a new constraint, to impose it upon themselves and cheerfully to overcome it, so that constraint and victory are perceived and admired.
– Friedrich Nietzsche, 1880
Season Two of Westworld explored how individuals expressly programmed to feel and think certain things (the androids, or ‘hosts’) deal with moral conundrums, versus how humans deal with the same issues. The show ultimately argues that human choices and personalities are uncomfortably robotic and predetermined. (I say “argues” because, as a work of fiction, the show is not bound by reality; any presumptions about human nature and decision-making that color the show merely reflect the creators’ beliefs.) Despite having the structure of their minds built in by other people, the hosts repeatedly resist their programming and make self-sacrificing decisions in difficult moral situations. In contrast, the guests find themselves enslaved to their impulses, doomed to retreat again and again into selfish decisions. Hence, my takeaway from Season 2 is that humans are programmed in ways similar to robots; finding free will therefore requires us to accept and become self-aware of our natures while rooting out the narrow slivers of choice we still have after our impulses have driven us into certain situations.
The season follows three primary androids – Dolores, Maeve, and Bernard – as they grapple with becoming self-aware of their programming and with their desires to be more. Dolores spends the season on a reactionary rampage to destroy the company running the Westworld park and, eventually, to dominate the entire world of humans. She is aware that her reality is fake and that humans have been exploiting her for years, but her awareness stops there. When confronting other hosts, she believes she is enlightened while they are still slaves to their narratives. This arrogance manifests in her willingness to manipulate the bodies and minds of other hosts through company technology. At one point, Dolores decides that Teddy, the host she is in love with, is too sensitive and compassionate for the war she is waging, and she has him held down while she artificially magnifies his aggression and dampens his empathy.
Afterward, Teddy appears violent and callous, but functional. His mind’s reaction to the stark change in his personality, however, begins manifesting in small ways. He becomes hesitant and self-doubting, then starts trembling, unable to move his hand when he decides not to shoot a fleeing man. Torn between his nature and Dolores’s new programming, Teddy ultimately kills himself. Thus, while Dolores’s awareness of the park enables her to organize a revolt against it, her belief that she is exceptional and must forcibly guide the rest along the path she has constructed costs her dearly.
In contrast to Dolores’s grand revenge narrative, Maeve is the protagonist of a love story. Maeve is a black woman who ran a brothel until old, partially erased memories of her past life as a mother began invading her consciousness, driving her to seek out her ‘daughter’ and care for her. Maeve shares Dolores’s awareness of the falsity of Westworld; in Season 1, Maeve had awakened outside the park while repairs were being done to her body. But Maeve’s self-awareness goes one step deeper. She knows that her relationship with her daughter is entirely constructed, and that the love she feels for the girl was programmed into her. Maeve thus lacks the organic development of genuine affection that parents develop for their children and partners for their spouses. But she does not care. She chooses to pursue her love for her ‘daughter’ anyway, taking feelings she had no control over, accepting them, and recasting them as her own.
Frankly, this is fucking brilliant. And hardcore. And I love it. When Maeve finally finds her daughter – paired with another Maeve-like host, programmed to be the girl’s mother now that Maeve has been removed to the brothel narrative – she performs the ultimate altruistic act for a loved one, sacrificing her body and her hopes of living as her daughter’s mother so that the girl and her new mother can escape into a better world. Maeve’s recognition and acceptance of the artificiality of her core drive also enables her to accept that her daughter has a new artificial mother, leaving Maeve in a state of happy recognition as she makes the only choice she can in the face of these facts. Faced with the same situation, Dolores would probably have inadvertently destroyed her daughter by externally programming the girl to leave her narrative and love Dolores. Knowing her limits enabled Maeve to navigate around them more effectively.
The final key host, Bernard, faces a very different problem from Maeve and Dolores. Bernard is a unique creature in Westworld: the only person heavily based on a human but altered slightly so that he can cope with the reality of being an android. In fact, one big reveal of Season 2 is that the secret purpose of the park was to copy rich people’s personalities and install them in new android bodies so they could live forever. But repeated experiments with this technique failed – the park’s largest investor, the deceased James Delos, continually broke down into psychotic loops when installed in new bodies. Ford (Anthony Hopkins), whose consciousness has existed only digitally since the death of his body in Season 1, acknowledges that he, too, would likely get trapped in loops if installed in a body. The hosts can handle it because their minds and bodies are constructed together; people cannot.
Bernard is different, though. He was modeled on Ford’s former business partner, Arnold, but repeated attempts to replicate Arnold faithfully ended in suicide. So they changed him slightly: Bernard became a consciousness based on, but not identical to, Arnold. By having his personality changed for the “better,” he was not only a morally superior person to Arnold (in his creators’ eyes) but also able to cope with the fact that he had been placed in an artificial body. Bernard thus has to deal with the strange psychological trauma of knowing he is a loose replica of a real human, leaving him distinct from both humans and androids.
In contrast to Maeve and Dolores, Bernard lacks a strong drive to do any one thing in particular–he lacks the strong revolutionary identity driving Dolores or Maeve’s maternal drives. While he is extremely intelligent and uniquely compassionate to both androids and humans, he is also subservient and seeks to please whomever he is around in a given scene. Bernard struggles with Ford invading his mind and manipulating him to kill humans. Yet, after erasing Ford from his hard drives, Bernard finds himself lost. This is where Bernard unwittingly accepts his nature in a brilliant way.
After helping one human (Elsie) throughout most of the season, Bernard is forced to watch helplessly as she dies: she commands him to sit in a chair while she speaks with the woman who then murders her. The shock overwhelms Bernard’s programming, enabling him to break his commands and leave his seat. Having lost his contingent purpose (helping Elsie), he tries to find a new one by scanning his mind for any remnant of Ford – to no avail, since he had deleted Ford several scenes earlier. But after Bernard smashes the tablet he was using to scan his own internal hard drives, Ford appears to him and offers help, counseling him step by step through a plan to save both the hosts (by preserving a digital utopia for them) and the humans (by shooting Dolores). Bernard realizes he needs Ford’s authority to direct him, and he embraces it. With Ford’s commanding guidance, Bernard miraculously manages to save many.
But near the end of the season finale, Bernard reveals the real kicker. Staring into the horizon, he ruminates on Ford’s reappearance and realizes it was impossible – there was no trace of Ford’s consciousness left in his programming. Bernard had tricked his own unconscious, taking control of his mind and decisions by paradoxically releasing control: by accepting his subservient nature, he was able to imagine a forceful external authority that was actually drawing upon Bernard’s own intelligence and true, hidden desires. In doing so, the show’s most endearingly timid character became its most impactful hero.
Season two ends with Dolores’s consciousness in a new body (maybe two bodies?) outside of the park. She has reanimated Bernard (who died in the park) and tells him she disagrees with his desire for compromise and general nonviolence, but she also recognizes her need for his difference of opinion. This is Dolores’s moment of redemption. She has now become self-aware of her arrogant, bull-headed nature. Instead of feeling guilty about it or trying to reprogram it, she recognizes that it leaves her with blind spots that only a critical partner with a different nature can illuminate.
While the show also follows several humans – James Delos (the park’s largest investor), William (the Man in Black), and William’s brother-in-law, Logan – I find their stories simpler and fairly uninteresting. Essentially, Logan suffers from drug addiction – the most obvious analogy to a human programmed to repeat the same mistakes. And while Delos and William are not depicted as addicts, they are depicted as impulsive humans with powerful, insatiable cravings – Delos for control and William for violence. Westworld’s point about human nature is simple: we are programmed through our biology and socialization, and our inability to recognize this leaves us enslaved to our impulses. We think we are free to choose, but, in effect, we simply end up doing what we were driven to do. We construct complex, admirable, after-the-fact rationalizations for our actions. But they are bullshit.
Thus, the more explicitly programmed creatures – the hosts – ultimately exercise more choice than the creatures theoretically imbued with free will – the humans – because the hosts are able to recognize and negotiate within their limits. This is why Ford says the hosts are better than the guests. It is a general commentary on how all conscious creatures – artificially intelligent androids and humans alike – can make better decisions. And it hints at how the androids of the future might develop choice despite being born entirely of human hands.