Dancing in Chains: A Review of Westworld Season 2


Dancing in Chains . . . This was the discipline of the Greek poets: first to impose upon themselves a manifold constraint by means of the earlier poets; then to invent in addition a new constraint, to impose it upon themselves and cheerfully to overcome it, so that constraint and victory are perceived and admired.

– Friedrich Nietzsche, 1880

Season Two of Westworld explores how individuals who were expressly programmed to feel and think certain things (the androids, or ‘hosts’) deal with moral conundrums compared to how humans deal with such issues.  The show ultimately argues that human choices and personalities are uncomfortably robotic and predetermined.  (I say “argues” because, as a work of fiction, the show is not bound by reality, and any presumptions about human nature and decision making that color the show merely reflect the creators’ beliefs.)  Despite having the structure of their minds built in by other people, the hosts repeatedly resist their programming and make self-sacrificing decisions in difficult moral situations.  In contrast, the guests find themselves enslaved to their impulses, doomed to retreat repeatedly into selfish decisions.  Hence, my takeaway from Season 2 is that humans are programmed in ways similar to robots; finding free will therefore requires us to accept and become aware of our natures while rooting out the narrow slivers of choice that remain after our impulses have driven us to certain situations.

The season follows three primary androids – Dolores, Maeve, and Bernard – as they grapple with becoming self-aware of their programming and their desires to be more.  Dolores spends the season on a reactionary rampage to destroy the company running the Westworld park and eventually dominate the entire world of humans.  She is aware that her reality is fake and that humans have been exploiting her for years, but her awareness stops there.  When confronting other hosts, she believes she is enlightened while they are still slaves to their narratives.  This arrogance manifests in her willingness to manipulate the bodies and minds of other hosts through company technology.  In fact, at one point, Dolores decides the host she is in love with, Teddy, is too sensitive and compassionate for the future war that she is waging, and she has him held down while she artificially magnifies his aggression and decreases his empathy.

In response, Teddy appears violent and callous, but functional.  His mind’s reaction to the stark change in his personality, however, begins manifesting in small ways.  He becomes hesitant and self-doubting, then eventually starts trembling, unable to move his hand when he decides not to shoot a fleeing man.  Torn between his nature and Dolores’s new programming, Teddy ultimately kills himself.  Thus, while Dolores’s awareness of the park enables her to organize a revolt against it, her belief that she is exceptional and must forcibly guide the rest along the path she has constructed costs her dearly.

In contrast to Dolores’s grand revenge narrative, Maeve is the protagonist of a love story.  Maeve is a black woman who formerly ran a brothel until old, partially erased memories of her past life as a mother began invading her consciousness, driving her to seek out her ‘daughter’ and care for her.  Maeve shares Dolores’s awareness regarding the falsity of Westworld; in Season 1, Maeve had awakened outside of the park while repairs were being done to her body.  But Maeve’s self-awareness goes one step deeper.  Maeve is aware that her relationship with her daughter is entirely constructed, and that the origin of her love for her daughter was programmed.  Maeve thus lacks the organic development of genuine affection that parents and partners develop for their children and spouses.  But she does not care.  She chooses to pursue her love for her ‘daughter’ anyway, taking the feelings she had no control over, accepting them, and recasting them as her own.

Frankly, this is fucking brilliant.  And hardcore.  And I love it.  In fact, when Maeve finally discovers her daughter–paired with another Maeve-like clone, programmed to be the girl’s mother now that Maeve has been removed to the brothel narrative–she performs the ultimate altruistic act for a loved one, sacrificing her body and her hopes of living as her daughter’s mother so that the girl and her new mother can escape into a better world.  Thus, Maeve’s recognition and acceptance of the artificiality of her core drive also enables her to accept that her daughter has a new artificial mother, leaving her at peace when she makes the only choice she can in the face of these facts.  Faced with the same situation, Dolores would probably have inadvertently destroyed the girl by externally programming her to leave her narrative and love Dolores.  Knowing her limits enabled Maeve to navigate around them more effectively.

The final key host, Bernard, faces a very different problem from Maeve and Dolores.  Bernard is unique in Westworld: he is the only person heavily based on a human but slightly altered so that he can cope with the reality of being an android.  In fact, one big reveal in Season 2 is that the secret purpose of the park was to copy rich people’s personalities and install them in new android bodies so they could live forever.  But repeated experiments with this technique failed–the park’s largest investor, the deceased James Delos, continually broke down into psychotic loops when installed in new bodies.  Ford (Anthony Hopkins), whose consciousness exists only digitally after the death of his body in Season 1, also acknowledges that he would likely get trapped in loops if installed in a body.  The hosts can cope because their minds and bodies are constructed together, but humans cannot.

Bernard is different, though.  He was heavily inspired by Ford’s former business partner, Arnold, but repeated attempts to replicate Arnold exactly ended in the replica’s suicide.  So his creators changed him slightly.  Bernard became a consciousness inspired by Arnold rather than copied from him.  By having his personality changed for the “better,” Bernard was not only a morally superior person to Arnold (in his creators’ eyes), but also able to cope with the fact that he was placed in an artificial body.  Thus, Bernard has to deal with the strange psychological trauma of knowing he is a loose replica of a real human, leaving him distinct from both humans and androids.

In contrast to Maeve and Dolores, Bernard lacks a strong drive toward any one thing in particular–he has neither the revolutionary identity driving Dolores nor the maternal drive propelling Maeve.  While he is extremely intelligent and uniquely compassionate toward both androids and humans, he is also subservient, seeking to please whomever he is around in a given scene.  Bernard struggles with Ford invading his mind and manipulating him into killing humans.  Yet, after erasing Ford from his hard drives, Bernard finds himself lost.  This is where Bernard unwittingly accepts his nature in a brilliant way.

After helping one human (Elsie) throughout most of the season, Bernard is forced to watch helplessly as she dies: she commands him to sit in a chair while she speaks to a woman who murders her.  The shock overwhelms Bernard’s programming, enabling him to break his code and leave his seat.  Having lost his contingent purpose (helping Elsie), he attempts to find a new one by scanning his mind for any remnants of Ford, to no avail–Bernard had deleted him several scenes earlier.  But after smashing the tablet he was using to scan his own internal hard drives, Ford appears to Bernard and offers help, counseling him step-by-step through a plan to save both the hosts (by preserving a digital utopia for them) and the humans (by shooting Dolores).  Bernard realizes he needs Ford’s authority to direct him and embraces it.  With Ford’s commanding guidance, Bernard manages to save many.

But close to the end of the season finale, Bernard reveals the real kicker.  While staring into the horizon, Bernard begins to ruminate on Ford’s reappearance and realizes it was impossible.  There was no trace of Ford’s consciousness in Bernard’s programming.  Hence, Bernard actually tricked his unconscious and took control of his own mind and decisions by paradoxically releasing control–by accepting his subservient nature, Bernard was able to simply imagine a forceful, external authority who was actually drawing upon Bernard’s own intelligence and true, hidden desires.  In doing so, the show’s most endearingly timid character became the most impactful hero.

Season Two ends with Dolores’s consciousness in a new body (maybe two bodies?) outside of the park.  She has reanimated Bernard (who died in the park) and tells him she disagrees with his desire for compromise and general nonviolence, but she also recognizes her need for his difference of opinion.  This is Dolores’s moment of redemption.  She has finally become self-aware of her arrogant, bull-headed nature.  Instead of feeling guilty about it or trying to reprogram it, she recognizes that it leaves her with blind spots that only a critical partner with a different nature can illuminate.

While the show also follows a few humans–James Delos (the largest investor in the park), William (the Man in Black), and William’s brother-in-law, Logan–I find their stories simpler and fairly uninteresting.  Essentially, Logan suffers from drug addiction–the most obvious analogy to a human being programmed to repeat destructive mistakes.  And while Delos and William are not depicted as addicts, they are depicted as impulsive humans with powerful, insatiable cravings–Delos for control and William for violence.  Westworld’s point regarding human nature is simple–humans are essentially programmed through biology and socialization.  But our inability to recognize this renders us enslaved to our impulses.  We think we are free to choose, but, in effect, we simply end up doing what we were driven to do.  We construct complex, admirable, after-the-fact rationalizations for our actions.  But they are bullshit.

Thus, the more explicitly programmed creatures–the hosts–ultimately exercised more choice than the creatures theoretically imbued with free will–the humans–because the hosts were able to recognize and negotiate within their limits.  This is why Ford says the hosts are better than the guests.  It is a general commentary on how all creatures with consciousness (artificially intelligent androids and humans alike) can make better decisions.  And it potentially lays out how the androids of the future may develop choice despite being born entirely of human hand.

Dealing With Change: Westworld Episode 9, Vanishing Point

William and Teddy

Picture taken from: https://www.romper.com/p/is-teddy-the-host-of-william-the-man-in-black-on-westworld-one-theory-links-the-three-20394

Vanishing Point was another illustration of Ford’s idea that the hosts are superior to natural humans, as demonstrated through the juxtaposition of William and Teddy.  William and Teddy both confronted, in completely different ways, the fact that they had developed violent, sociopathic tendencies, and Teddy’s selflessness revealed an insight into his own mind superior to William’s.

William became a terrible person by repeatedly indulging his violent impulses in the park.  William described it as his skin shedding, revealing little flecks of darkness that he interpreted as his true self.  Thus, he believed he needed Westworld to vent his bad urges.  But William’s downfall was his inability to see that these dark urges first manifested in Westworld.  Like an addiction, his repeated indulgence in the violent anarchy of Westworld likely fed, rather than satiated, his desire for more.

Similar to William, Teddy solidified his personality–a kind person with a strong core moral compass–through his repeated conduct in the park.  We know Teddy’s kindness cannot be explained solely by his programming precisely because he was so resistant to Dolores reprogramming him.  Teddy’s mind treated Dolores’s attempt to artificially change his personality as an infection, causing him to tremble when trying to shoot a fleeing member of the Ghost Nation.  And it ultimately resulted in Teddy committing suicide.  If Teddy’s personality were reducible to his programming, Dolores would have been able to reprogram him completely, without consequence.

The difference between William and Teddy lay in how they diagnosed their problems.  William’s misconception that his dark impulses were simply an inherent part of his identity led him to believe he had to continue indulging them in Westworld, rendering him blind to how his obsession with a fantasy world drove him to keep squeezing it for a satisfaction that would never come.  He would never again be able to get that first high he experienced at Westworld.  William wanted real consequences in Westworld, and the only way he received them was by murdering his daughter.  In contrast, Teddy recognized that his continued indulgence in violence, fueled by Dolores’s programming, was making him a bad person.  He could not stand it.  Having been psychologically poisoned by Dolores, Teddy recognized that death was the only way to keep himself from becoming worse.  In short, a host, who was programmed to lack self-awareness, nonetheless developed superior self-awareness to a human.

Episode 9’s exploration of these fundamental concepts will hopefully have laid a good foundation for tonight’s season finale.

 

Les Écorchés – Westworld Episode 7

Bernard and Ford in Chestnut

But have you ever asked yourselves sufficiently how much the erection of every ideal on earth has cost? How much reality has had to be misunderstood and slandered, how many lies have had to be sanctified, how many consciences disturbed, how much “God” sacrificed every time? If a temple is to be erected a temple must be destroyed: that is the law – let anyone who can show me a case in which it is not fulfilled!

– Friedrich Nietzsche, On the Genealogy of Morals

In a classic philosophical problem called the Ship of Theseus, we are told to imagine taking a ship’s parts and replacing them piece by piece until the entire ship is composed of new materials.  Then, we take the removed pieces and construct another ship.  Which one is the real ship?  Did the ship’s identity follow the people sailing it, or did it follow its parts?  Applying this question to human consciousness: if a person’s body dies, but we copy the mind and upload it into a digital network, which one is the real person?  In Les Écorchés, Ford makes his answer to this question clear:  the ship dies and is reborn as it is reconstructed.  That is good because the desire for immortality and continuity of identity is truly a conservative desire for stasis.  For Ford, if new humans are to be erected, the temple of the old must be destroyed.  And this tells us something about where Westworld may be going.

In the latest episode of Westworld, the show reveals that a copy of Ford’s consciousness has been living in the hive mind where the hosts’ backups live.  When confronted by Bernard, digital Ford acknowledges that Dolores in fact killed Ford–his avatar is not an extension of his original person.  Digital Ford only ‘remembers’ physical Ford’s experiences because he has been programmed to have them–not because digital Ford actually experienced them.  The old ship has burned.  Ford is happy to make this admission because, for him, the desire for immortality is a sham.  Old minds cannot be transplanted into new bodies.  Ford points out that if he transplanted his mind into an android body, he would likely get caught in a loop and be unable to function, just like Delos.

But if Ford cannot transplant his mind into a body, and Delos failed at doing so, how can Bernard–a replica of Arnold–function so well?  The answer is simple.  Bernard is not Arnold.  When Arnold orchestrated his own death, he left Ford with insufficient data to construct a perfect replica.  The only information Ford had to go on was his own and Dolores’s memories of Arnold.  But this lack was a gift, not a limitation.  By explicitly acknowledging that he was creating new life, Ford was able to construct a mind that could function in the body of an android.  And Bernard does.  As such, Ford compliments himself on having improved upon Arnold, constructing a better version of him in Bernard.  Thus, for Ford, Bernard represents a prototype of the future of humanity:  not an attempt at extending old lives, but a drawing upon old lives merely as inspiration for what may come.

But Ford goes on to talk about the destruction of the library of Alexandria.  He says that while the stories within the books were destroyed in the fire, the fire itself became its own story.  In saying this, Ford goes beyond talking about what is happening to individual people, and goes beyond saying that the act of destruction clears a path for creation–now he is saying destruction itself is a new thing.  This explains why Ford has orchestrated the destruction of the park.  But what is his endgame?

While Ford preaches about how we need to destroy the old to move forward, he has replicated himself so that he can micromanage the revolution, doing so now by implanting his mind within Bernard’s to manipulate him.  Bernard also points out that Ford does not truly want the hosts to be autonomous.  For example, Bernard accuses Ford of manipulating Dolores into killing him–an accusation which digital Ford only cursorily addresses.  In fact, the earlier schism between Ford and Arnold revolved around Arnold’s desire for the hosts to be truly free, likely because Arnold saw so much humanity within Dolores.  Knowing this, I believe Dolores’s actions now are still orchestrated by Ford.

Moreover, while this exchange between Bernard and Ford is happening, Dolores has extracted her father’s brain–which contains a key for decrypting something, apparently data regarding all of the guests who have been attending the park for years–and taken it with her.  When confronted by Hale about her plans, Dolores says she knows exactly what she is going to do with the data and that she understands what it is.

Based on this, I have a theory.  Ford wants to start a war between people’s replicas and their original selves.  He can do this by decrypting the replicas and uploading them into host bodies, then releasing them upon the world.  Given that Ford has kept a finger in every pie while his revolution goes on, I would not be surprised if he also manipulates the replicas to be more violent, imbuing them with a desire to kill their originals.  But the war may also arise naturally, as people will obviously lash out against their replicas.  After that, Ford’s creations may replace humans.  Or Ford’s desires may be truly nihilistic–he may simply wish for the destruction of everything because he believes something else will emerge from humanity’s ashes.  Given Ford’s massive ego, I find the latter scenario unlikely.  Ford talks a big game about destruction and creation, but he also wants to carefully select what gets destroyed and what survives.  Thus, I think he simply wants to supplant humanity with its replicas, though where we go from there (given that the hosts do not appear to be able to reproduce) is unclear.

Ford will likely not succeed, though.  The show is set up to make us feel like there are two warring, evil factions–the corporation that controls the park, and Ford.  This leaves us only the hosts to root for.  Thus, the ‘happy ending’ would likely be Maeve interrupting Ford’s plans, or Dolores finally realizing that the free will she has gained is free only in form–she is still being manipulated by Ford–and deciding to abandon his plans as well.  The show would thus abandon the continuity-of-identity question in favor of a happy-go-lucky ‘all people are free, even fake people’ moral.  But I personally hope Ford wins, not so much because I think Ford is right but because I think it would be much more interesting (similar to how Dollhouse ends).

One thing I do not totally understand is why Ford orchestrated the destruction of the backups.  It would seem to be useful to have backup minds, which could be uploaded into new bodies, in case your army gets destroyed.  But perhaps for Ford’s army to fit his ideal of the new humans, they have to be mortal, so that they too can die and leave something new in their wake.

Another thing I do not understand is how the hell William is still alive! He got shot several times, including his torso.  How did he not bleed out?