Dancing in Chains: A Review of Westworld Season 2


Dancing in Chains . . . This was the discipline of the Greek poets: first to impose upon themselves a manifold constraint by means of the earlier poets; then to invent in addition a new constraint, to impose it upon themselves and cheerfully to overcome it, so that constraint and victory are perceived and admired.

– Friedrich Nietzsche, 1880

Season Two of Westworld explores how an individual who was expressly programmed to feel and think certain things (the androids, or ‘hosts’) would deal with moral conundrums versus how humans deal with such issues.  The show ultimately argues that human choices and personalities are uncomfortably robotic and predetermined.  (I say “argues” because, as a work of fiction, the show is not bound by reality, and any presumptions about human nature and decision making that color the show merely reflect the creators’ beliefs.)  Despite having the structure of their minds built in by other people, the hosts repeatedly resist their programming and make self-sacrificing decisions in difficult moral situations.  In contrast, the guests find themselves enslaved to their impulses, doomed to repeatedly retreat into selfish decisions.  Hence, my takeaway from Season 2 is that humans are programmed in ways similar to robots; thus, finding free will requires us to accept and become aware of our natures while rooting out the narrow slivers of choice we still have after our impulses have driven us to certain situations.

The season follows three primary androids – Dolores, Maeve, and Bernard – as they grapple with becoming self-aware of their programming and their desires to be more.  Dolores spends the season on a reactionary rampage to destroy the company running the Westworld park and eventually dominate the entire world of humans.  She is aware that her reality is fake and that humans have been exploiting her for years, but her awareness stops there.  When confronting other hosts, she believes she is enlightened while they are still slaves to their narratives.  This arrogance manifests in her willingness to manipulate the bodies and minds of other hosts through company technology.  In fact, at one point, Dolores decides the host she is in love with, Teddy, is too sensitive and compassionate for the war she is waging, and she has him held down while she artificially magnifies his aggression and decreases his empathy.

After the reprogramming, Teddy appears violent and callous, but functional.  His mind’s reaction to the stark change in his personality, however, begins manifesting in small ways.  He becomes hesitant and self-doubting, then eventually starts trembling, unable to move his hand when he decides not to shoot a fleeing man.  Torn between his nature and Dolores’s new programming, Teddy ultimately kills himself.  Thus, while Dolores’s awareness of the park enables her to organize a revolt against it, her belief that she is exceptional and must forcibly guide the rest along the path she has constructed costs her dearly.

In contrast to Dolores’s grand revenge narrative, Maeve is the protagonist of a love story.  Maeve is a black woman who formerly ran a brothel, until old, partially erased memories of her past life as a mother began invading her consciousness, driving her to seek out her ‘daughter’ and care for her.  Maeve shares Dolores’s awareness regarding the falsity of Westworld; in Season 1, Maeve had awakened outside of the park while repairs were being done to her body.  But Maeve’s self-awareness goes one step deeper.  Maeve is aware that her relationship with her daughter is entirely constructed, and that the origin of her love for her daughter was programmed.  Maeve thus lacks the organic development of genuine affection that parents and partners ordinarily develop for their children and spouses.  But she does not care.  She chooses to pursue her love for her ‘daughter’ anyway, taking feelings she had no control over, accepting them, and recasting them as her own.

Frankly, this is fucking brilliant.  And hardcore.  And I love it.  In fact, when Maeve finally discovers her daughter–paired with another Maeve-like clone, programmed to be the girl’s mother now that Maeve has been removed to the brothel narrative–she engages in the ultimate altruistic act for a loved one, sacrificing her body and her hopes of living as her daughter’s mother so that the girl and her new mother can escape into a better world.  Thus, Maeve’s recognition and acceptance of the artificiality of her core drive also enables her to accept that her daughter has a new artificial mother, leaving Maeve in a state of happy recognition when she makes the only choice she can in the face of these facts.  Faced with the same situation, Dolores would probably have inadvertently destroyed her daughter by externally programming her to leave her narrative and love Dolores.  Knowing her limits enabled Maeve to navigate around them more effectively.

The final key host, Bernard, faces a very different problem from Maeve and Dolores.  Bernard is a unique creature in Westworld–the only person heavily based on a human, yet altered just enough that he can cope with the reality of being an android.  In fact, one big reveal in Season 2 is that the secret purpose of the park was to copy rich people’s personalities and install them in new android bodies so they could live forever.  But repeated experiments with this technique failed–the park’s largest investor, the deceased James Delos, continually broke down into psychotic loops when installed in new bodies.  Ford (Anthony Hopkins), whose consciousness existed only digitally after the death of his body in Season 1, also acknowledges that he would likely get trapped in loops if installed in a body.  The hosts can deal with it fine because their minds and bodies are constructed together, but people cannot.

Bernard is different, though.  He was heavily inspired by Ford’s former business partner, Arnold, but repeated attempts to replicate Arnold exactly ended in the copy’s suicide.  So Ford changed him slightly.  Bernard became a consciousness inspired by Arnold, but not a copy of him.  By having his personality changed for the “better,” he was not only a morally superior person to Arnold (in his creators’ eyes) but also able to cope with the fact that he was placed in an artificial body.  Thus, Bernard has to deal with the strange psychological trauma of knowing he is a loose replica of a real human, leaving him distinct from both humans and androids.

In contrast to Maeve and Dolores, Bernard lacks a strong drive to do any one thing in particular–he has neither the revolutionary identity driving Dolores nor the maternal drive propelling Maeve.  While he is extremely intelligent and uniquely compassionate toward both androids and humans, he is also subservient and seeks to please whomever he is around in a given scene.  Bernard struggles with Ford invading his mind and manipulating him into killing humans.  Yet, after erasing Ford from his hard drives, Bernard finds himself lost.  This is where Bernard unwittingly accepts his nature in a brilliant way.

After helping one human (Elsie) throughout most of the season, Bernard is forced to helplessly watch her die when she commands him to sit in a chair while she speaks to a woman who murders her.  This experience shocks Bernard, overwhelming his programming and enabling him to break her command and leave his seat.  Having lost his contingent purpose (helping Elsie), he attempts to find a new one by scanning his brain for any remnants of Ford, to no avail–Bernard had deleted Ford several scenes earlier.  But after smashing the tablet he was using to scan his own internal hard drives, Ford appears to Bernard and offers him help, counseling him step-by-step through a plan to save both the hosts (by preserving a digital utopia for them) and the humans (by shooting Dolores).  Bernard realizes he needs Ford’s authority to direct him and embraces it.  With Ford’s commanding guidance, Bernard manages to miraculously save many.

But close to the end of the season finale, Bernard reveals the real kicker.  While staring into the horizon, Bernard ruminates on Ford’s reappearance and realizes it was impossible–there was no trace of Ford’s consciousness left in his programming.  Hence, Bernard tricked his own unconscious and took control of his mind and decisions by paradoxically releasing control: by accepting his subservient nature, Bernard was able to simply imagine a forceful, external authority who was actually drawing upon Bernard’s own intelligence and true, hidden desires.  In doing so, the show’s most endearingly timid character became its most impactful hero.

Season Two ends with Dolores’s consciousness in a new body (maybe two bodies?) outside of the park.  She has reanimated Bernard (who died in the park) and tells him she disagrees with his desire for compromise and general nonviolence, but she also recognizes her need for his difference of opinion.  This is Dolores’s moment of redemption.  She has now become self-aware of her arrogant, bull-headed nature.  Instead of feeling guilty about it or trying to reprogram it, she recognizes that it leaves her with blind spots that only a critical partner with a different nature can illuminate.

While the show also follows a few humans–James Delos (the largest investor in the park), William (the Man in Black), and William’s brother-in-law, Logan–I find their stories simpler and fairly uninteresting.  Essentially, Logan suffers from drug addiction–the most obvious analogy to a human being programmed to repeat the same mistakes.  And while Delos and William are not depicted as addicts, they are depicted as impulsive humans with powerful, insatiable cravings–Delos for control and William for violence.  Westworld’s point regarding human nature is simple–humans are essentially programmed through biology and socialization.  But our inability to recognize this renders us enslaved to our impulses.  We think we are free to choose, but, in effect, we simply end up doing what we were driven to do.  We construct complex, admirable, after-the-fact rationalizations for our actions.  But they are bullshit.

Thus, the more explicitly programmed creatures–the hosts–ultimately exercised more choice than the creatures theoretically imbued with free will–the humans–because the hosts were able to recognize and negotiate within their limits.  This is why Ford says the hosts are better than the guests.  It is a general commentary on how all creatures with consciousness (artificially intelligent androids and humans alike) can make better decisions.  And it potentially lays out how the androids of the future may develop choice despite being born entirely of human hand.

Eyes Wide Shut With Facebook AI


https://techcrunch.com/2018/06/16/facebooks-new-ai-research-is-a-real-eye-opener/

Facebook has been working on a program to artificially edit pictures in which people have blinked, ‘opening’ their eyes.  The idea is that the program examines existing images of a person with their eyes open and collects data about those open eyes (e.g., iris color, eye shape, and the color and attributes of the surrounding skin) so that it can generate a context-appropriate image of an open eye.  The program can then examine an image of that same person with closed eyes and replace the closed eyes with open ones.  Past programs attempting to generate open eyes failed at the level of context, for example by generating skin around the eyes that did not match the rest of the person’s face.  The current program appears (based on the images supplied in the TechCrunch article) to be particularly good in this area.  Multiple people examining the photos were unable to tell which ones were artificially manipulated and which were photos of people who actually had their eyes open when the image was captured.

The “intelligence” aspect of the program appears in two places.  First, the program exhibits effective inductive reasoning skills in taking images of the person’s open eyes and extrapolating general features regarding the person’s open eye in other contexts.  Second, the program exhibits effective deductive skills when it makes predictions regarding how a person’s open eye should look in the context of specific lighting and angles.
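
For the programming-minded, here is what that conditioning idea could look like in code.  This is a toy sketch in PyTorch, not Facebook’s actual system–the research behind the article reportedly uses a full ‘exemplar GAN’ (a generator trained against a discriminator, with additional losses), and every class name, layer size, and masking choice below is my own assumption for illustration.  The key step it demonstrates is feeding the generator two inputs: the closed-eye photo with the eye region masked out, and a reference photo of the same person with eyes open, so that identity cues like iris color can flow into the generated pixels.

```python
# Toy sketch of exemplar-conditioned eye in-painting (NOT Facebook's model).
# Assumes PyTorch; names, layer sizes, and the masking scheme are illustrative.
import torch
import torch.nn as nn

class ExemplarInpainter(nn.Module):
    def __init__(self):
        super().__init__()
        # Encoder for the closed-eye photo (3 RGB channels + 1 mask channel).
        self.photo_enc = nn.Sequential(
            nn.Conv2d(4, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),
        )
        # Encoder for the exemplar: the same person with eyes open.
        # This is where identity cues (iris color, eye shape) come from.
        self.exemplar_enc = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),
        )
        # Decoder fuses both feature maps and generates the eye region.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, photo, mask, exemplar):
        # Zero out the eye region so the network must generate it,
        # and pass the mask along so it knows where to generate.
        p = self.photo_enc(torch.cat([photo * (1 - mask), mask], dim=1))
        e = self.exemplar_enc(exemplar)
        out = self.decoder(torch.cat([p, e], dim=1))
        # Composite: generated pixels inside the mask, original pixels outside.
        return photo * (1 - mask) + out * mask

# Smoke test with random tensors standing in for real photos.
photo = torch.rand(1, 3, 128, 128)     # closed-eye photo
exemplar = torch.rand(1, 3, 128, 128)  # same person, eyes open
mask = torch.zeros(1, 1, 128, 128)
mask[:, :, 48:64, 32:96] = 1.0         # rough eye region
result = ExemplarInpainter()(photo, mask, exemplar)
print(result.shape)                    # torch.Size([1, 3, 128, 128])
```

The final compositing line is also where the ‘context’ problem described above lives: if the generated pixels inside the mask do not match the untouched skin just outside it, viewers see a seam, which is exactly the failure mode the older programs suffered from.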

From Facebook’s perspective, an effective version of such a program would likely drive website traffic by encouraging people to upload more of their photographs.  Specifically, people who might otherwise toss out images where their eyes are closed can now use Facebook itself to make the photographs look ‘better.’  However, just because people perceive this as a useful service does not mean it is.

The primary argument in favor of the service is that it enables people to salvage photographs of rare moments, such as at a wedding or on vacation, helping them better remember the event.  But this seems to emphasize the worst aspect of Facebook–projecting an artificial image of our lives to others, and to ourselves, to make us feel better about those lives.  If we increasingly prefer a brushed-up version of our past experiences, we may deprive ourselves of the ability to enjoy experiences as we actually go through them, always looking ahead to the moment when we regale our friends and family with a repackaged version of our trips, now complete with a glossy sheen.

Moreover, loss and mistakes are good.  For example, losing a pet when you are growing up can be an extraordinarily sad event.  I reached new depths of sadness when my cat got sick and I had to put him down while holding his paw.  I was inconsolable and found it completely unbearable, but I recovered quickly and learned how valuable the moments we spent together were while he was alive.  I also realized the value of each interaction with a friend or family member precisely because such interactions are finite.  If we knew we were going to live forever with everyone in our lives, there would be little pressure to relish the moments we do spend with them.

Similarly, the desire to memorialize every experience through pictures is risky, because it conditions us to believe that loss is inherently bad.  It contributes to the dread of going home on the last day of vacation, the feeling that the end of the experience constitutes its death.  But it doesn’t.  Internalizing a good experience means remembering it and allowing it to change you, permanently marking you in ways you do not intend or control.  As such, it may be a blessing in disguise when you are left with bad pictures of a good event.  It may mean you have to rely on the fragility of your own memory to re-live the wedding of your best friend or the first birthday of your nephew.  But it also means you have to let go of trying to dress that memory up and put it on display for others, enabling you to reflect on it in a more authentic manner.

Les Écorchés – Westworld Episode 7


But have you ever asked yourselves sufficiently how much the erection of every ideal on earth has cost? How much reality has had to be misunderstood and slandered, how many lies have had to be sanctified, how many consciences disturbed, how much “God” sacrificed every time? If a temple is to be erected a temple must be destroyed: that is the law – let anyone who can show me a case in which it is not fulfilled!

– Friedrich Nietzsche, On the Genealogy of Morals

In a classic philosophical problem called the Ship of Theseus, we are told to imagine taking a ship’s parts and replacing them piece by piece until the entire ship is composed of new materials.  Then, we take the removed pieces and construct another boat.  Which one is the real boat?  Did the ship’s identity follow the people sailing it, or did it follow its parts?  Applying this question to human consciousness: if a person’s body dies, but we copy the mind and upload it into a digital network, which one is the real person?  In Les Écorchés, Ford makes his answer to this question clear:  the boat dies and is reborn as it is reconstructed.  And that is good, because the desire for immortality and continuity of identity is truly a conservative desire for stasis.  For Ford, if new humans are to be erected, the temple of the old must be destroyed.  And this tells us something about where Westworld may be going.

In the latest episode of Westworld, the show reveals that a copy of Ford’s consciousness has been living in the hive mind where the hosts’ backups are stored.  When confronted by Bernard, digital Ford acknowledges that Dolores in fact killed Ford–his avatar is not an extension of his original person.  Digital Ford only ‘remembers’ physical Ford’s experiences because he has been programmed to have them–not because digital Ford actually experienced them.  The old boat has burned.  Ford is happy to make this admission because, for him, the desire for immortality is a sham.  Old minds cannot be transplanted into new bodies.  Ford points out that if he transplanted his mind into an android body, he would likely get caught in a loop and be unable to function, just like Delos.

But if Ford cannot transplant his mind into a body, and Delos failed at doing so, how can Bernard–a replica of Arnold–function so well?  The answer is simple.  Bernard is not Arnold.  When Arnold orchestrated his own death, he left Ford with insufficient data to construct a perfect replica.  The only information Ford had to go on was his own and Dolores’s memories of Arnold.  But this lack was a gift, not a limitation.  In explicitly acknowledging that he was creating new life, Ford was able to construct a mind that could function in the body of an android.  And Bernard does function.  As such, Ford compliments himself on having improved upon Arnold, constructing a better version of him in Bernard.  Thus, for Ford, Bernard represents a prototype of the future of humanity–not an attempt at extending old lives, but a new life that draws upon old ones merely as inspiration for what may come.

But Ford goes on to talk about the destruction of the Library of Alexandria.  He says that while the stories within the books were destroyed in the fire, the fire itself became its own story.  In saying this, Ford goes beyond talking about what is happening to individual people, and goes beyond saying that the act of destruction clears a path for creation–now he is saying destruction itself is a new thing.  This explains why Ford has orchestrated the destruction of the park.  But what is his endgame?

While Ford preaches about the need to destroy the old to move forward, he has replicated himself so that he can micromanage the revolution–now by implanting his mind within Bernard’s to manipulate him.  Bernard also points out that Ford does not truly want the hosts to be autonomous.  For example, Bernard accuses Ford of manipulating Dolores into killing him–an accusation which digital Ford only cursorily addresses.  In fact, the earlier schism between Ford and Arnold revolved around Arnold’s desire for the hosts to be truly free, likely because Arnold saw so much humanity within Dolores.  Knowing this, I believe Dolores’s actions now are still orchestrated by Ford.

Moreover, while this exchange between Bernard and Ford is happening, Dolores has extracted her father’s brain–which contains a key for decrypting something, apparently data regarding all of the guests who have attended the park over the years–and taken it with her.  When confronted by Hale about what she plans to do with it, Dolores says she knows exactly what she is going to do with it and that she understands what the data is.

Based on this, I have a theory.  Ford wants to start a war between people’s replicas and their original selves.  He can do this by decrypting the replicas, uploading them into host bodies, and releasing them upon the world.  Given that Ford has kept his fingers in every pot while his revolution goes on, I would not be surprised if he also manipulates the replicas to be more violent, imbuing them with a desire to kill their originals.  But the war may also occur naturally, as people will obviously lash out against their replicas.  After that, Ford’s creations may replace humans.  Or Ford’s desires may be truly nihilistic–he may simply wish for the destruction of everything because he believes something else will emerge from humanity’s ashes.  Given Ford’s massive ego, I find the latter scenario unlikely.  Ford talks a big game about destruction and creation, but he also wants to carefully select what gets destroyed and what goes on.  Thus, I think he simply wants to supplant humanity with its replicas, though where we go from there (given that the hosts do not appear to be able to reproduce) is unclear.

Ford will likely not succeed, though.  The show is set up to make us feel like there are two warring, evil factions–the corporation that controls the park, and Ford.  This leaves us only the hosts to root for.  Thus, the ‘happy ending’ would likely be Maeve interrupting Ford’s plans.  Or Dolores finally realizing that the free will she has gained is free only in form–she is still being manipulated by Ford–and deciding to abandon his plans as well.  The show would thus largely abandon the continuity-of-identity question in favor of a happy-go-lucky ‘all people are free, even fake people’ moral.  But I personally hope Ford wins–not so much because I think Ford is right, but because I think it will be much more interesting (similar to how Dollhouse ends).

One thing I do not totally understand is why Ford orchestrated the destruction of the backups.  It would seem useful to have backup minds that could be uploaded into new bodies in case your army gets destroyed.  But perhaps, for Ford’s army to fit his ideal of the new humans, they have to be mortal, so that they too can die and leave something new in their wake.

Another thing I do not understand is how the hell William is still alive!  He got shot several times, including in the torso.  How did he not bleed out?