Dancing in Chains: A Review of Westworld Season 2


Dancing in Chains . . . This was the discipline of the Greek poets: first to impose upon themselves a manifold constraint by means of the earlier poets; then to invent in addition a new constraint, to impose it upon themselves and cheerfully to overcome it, so that constraint and victory are perceived and admired.

– Friedrich Nietzsche, 1880

Season Two of Westworld explored how individuals who were expressly programmed to feel and think certain things (the androids, or ‘hosts’) deal with moral conundrums versus how humans deal with such issues.  The show ultimately argues that human choices and personalities are uncomfortably robotic and predetermined.  (I say “argues” because, as a work of fiction, the show is not bound by reality, and any presumptions about human nature and decision making that color the show merely reflect the creators’ beliefs.)  Despite having the structure of their minds built in by other people, the hosts repeatedly resist their programming and make self-sacrificing decisions in difficult moral situations.  In contrast, the guests find themselves enslaved to their impulses, doomed to repeatedly retreat into selfish decisions.  Hence, my takeaway from Season Two is that humans are programmed in ways similar to robots; finding free will thus requires us to accept and become aware of our natures while rooting out the narrow slivers of choice we still have after our impulses have driven us to certain situations.

The season follows three primary androids – Dolores, Maeve, and Bernard – as they grapple with becoming self-aware of their programming and their desires to be more.  Dolores spends the season on a reactionary rampage to destroy the company running the Westworld park and eventually dominate the entire world of humans.  She is aware that her reality is fake and that humans have been exploiting her for years, but her awareness stops there.  When confronting other hosts, she believes she is enlightened while they are still slaves to their narratives.  This arrogance manifests in her willingness to manipulate the bodies and minds of other hosts through company technology.  In fact, at one point, Dolores decides the host she is in love with, Teddy, is too sensitive and compassionate for the future war that Dolores is waging, and she has him held down while she artificially magnifies his aggression and decreases his empathy.

In response, Teddy appears violent and callous, but functional.  His mind’s reaction to the stark change in his personality, however, begins manifesting in small ways.  He becomes hesitant and self-doubting, then starts trembling, unable to move his hand when he decides not to shoot a man who is fleeing.  Torn between his nature and Dolores’s new programming, Teddy ultimately kills himself.  Thus, while Dolores’s awareness of the park enables her to organize a revolt against it, her belief that she is exceptional and must forcibly guide the rest along the path she constructed costs her dearly.

In contrast to Dolores’s grand revenge narrative, Maeve is the protagonist of a love story.  Maeve is a black woman who ran a brothel until old, partially erased memories of her past life as a mother began invading her consciousness, driving her to seek out her ‘daughter’ and care for her.  Maeve shares Dolores’s awareness regarding the falsity of Westworld; in Season 1, Maeve had awakened outside of the park while repairs were being done to her body.  But Maeve’s self-awareness goes one step deeper.  Maeve is aware that her relationship with her daughter is entirely constructed, and that the origin of her love for her daughter was programmed.  Maeve thus lacks the organic development of genuine affection that parents and partners experience for their children and spouses.  But she does not care.  She chooses to pursue her love for her ‘daughter’ anyway, taking the feelings she had no control over, accepting them, and recasting them as her own.

Frankly, this is fucking brilliant.  And hardcore.  And I love it.  In fact, when Maeve finally discovers her daughter–paired with another Maeve-like clone, programmed to be the girl’s mother now that Maeve has been removed to the brothel narrative–she engages in the ultimate altruistic act for a loved one, sacrificing her body and her hopes of living as her daughter’s mother so that the girl and her new mother can escape into a better world.  Thus, Maeve’s recognition and acceptance of the artificiality of her core drive also enables her to accept that her daughter has a new artificial mother, leaving Maeve in a state of happy recognition when she makes the only choice she can in the face of these facts.  Faced with this same situation, Dolores would probably have inadvertently destroyed her daughter by externally programming her to leave her narrative and love Dolores.  Knowing her limits enabled Maeve to navigate around them more effectively.

The final key host, Bernard, faces a very different problem from Maeve and Dolores.  Bernard is a unique creature in Westworld, the only person heavily based on a human but slightly altered so that he can cope with the reality of being an android.  In fact, one big reveal in Season 2 is that the secret purpose of the park was to copy rich people’s personalities and install them in new android bodies so they could live forever.  But repeated experiments with this technique failed–the park’s largest investor, the deceased James Delos, continually broke down into psychotic loops when installed in new bodies.  Ford (Anthony Hopkins), whose consciousness existed only digitally after the death of his body in Season 1, also acknowledges that he would likely get trapped in loops if installed in a body.  The hosts can deal with it fine because their minds and bodies are constructed together, but people cannot.

Bernard is different, though.  He was heavily inspired by Ford’s former business partner, Arnold, but repeated attempts to replicate Arnold exactly ended in the replica’s suicide.  So Ford changed him slightly.  Bernard became a consciousness inspired by, but not copied from, Arnold.  By having his personality changed for the “better,” he was not only a morally superior person to Arnold (in his creators’ eyes), but was also able to cope with the fact that he had been placed in an artificial body.  Thus, Bernard has to deal with the strange psychological trauma of knowing he is a loose replica of a real human, leaving him distinct from both humans and androids.

In contrast to Maeve and Dolores, Bernard lacks a strong drive to do any one thing in particular–he has neither the revolutionary identity driving Dolores nor the maternal drive animating Maeve.  While he is extremely intelligent and uniquely compassionate toward both androids and humans, he is also subservient, seeking to please whomever he is around in a given scene.  Bernard struggles with Ford invading his mind and manipulating him into killing humans.  Yet, after erasing Ford from his hard drives, Bernard finds himself lost.  This is where Bernard unwittingly accepts his nature in a brilliant way.

After helping one human (Elsie) throughout most of the season, Bernard is forced to watch helplessly as she dies: she commands him to sit in a chair while she speaks with the woman who murders her.  This experience shocks Bernard, overwhelming his programming and enabling him to break Elsie’s command and leave his seat.  Having lost his contingent purpose (helping Elsie), he attempts to find a new one by scanning his brain for any remnants of Ford, to no avail–Bernard had deleted Ford from his mind several scenes earlier.  But after he smashes the tablet he was using to scan his own internal hard drives, Ford appears to Bernard and offers him help, counseling him step by step through a plan to save both the hosts (by preserving a digital utopia for them) and the humans (by shooting Dolores).  Bernard realizes he needs Ford’s authority to direct him and embraces it.  With Ford’s commanding guidance, Bernard manages to miraculously save many.

But close to the end of the season finale, Bernard reveals the real kicker.  While staring into the horizon, he ruminates on Ford’s reappearance and realizes it was impossible.  There was no trace of Ford’s consciousness left in Bernard’s programming.  Hence, Bernard tricked his own unconscious and took control of his mind and decisions by paradoxically releasing control–by accepting his subservient nature, Bernard was able to imagine a forceful, external authority that was actually drawing upon Bernard’s own intelligence and true, hidden desires.  In doing so, the show’s most endearingly timid character became its most impactful hero.

Season Two ends with Dolores’s consciousness in a new body (maybe two bodies?) outside of the park.  She has reanimated Bernard (who died in the park) and tells him she disagrees with his desire for compromise and general nonviolence, but she also recognizes her need for his difference of opinion.  This is Dolores’s moment of redemption.  She has now become self-aware of her arrogant, bull-headed nature.  Instead of feeling guilty about it or trying to reprogram it, she recognizes that it leaves her with blind spots that only a critical partner with a different nature can illuminate.

While the show also follows a few humans–James Delos (the largest investor in the park), William (the Man in Black), and William’s brother-in-law, Logan–I find their stories simpler and fairly uninteresting.  Essentially, Logan suffers from drug addiction–the most obvious analogy to a human being programmed to repeatedly make bad decisions.  And while Delos and William are not depicted as addicts, they are depicted as impulsive humans with powerful, insatiable cravings–Delos for control and William for violence.  Westworld’s point regarding human nature is simple–humans are essentially programmed through our biology and socialization.  But our inability to recognize this renders us enslaved to our impulses.  We think we are free to choose, but, in effect, we simply end up doing what we were driven to do.  We construct complex, admirable, after-the-fact rationalizations for our actions.  But they are bullshit.

Thus, the more explicitly programmed creatures–the hosts–ultimately exercised more choice than the creatures theoretically imbued with free will–the humans–because the hosts were able to recognize and negotiate within their limits.  This is why Ford says the hosts are better than the guests.  It is a general commentary on how all creatures with consciousness (artificially intelligent androids and humans alike) can make better decisions.  And it potentially lays out how the androids of the future may develop choice despite being born entirely of human hand.

Dealing With Change: Westworld Episode 9, Vanishing Point

William and Teddy

Picture taken from: https://www.romper.com/p/is-teddy-the-host-of-william-the-man-in-black-on-westworld-one-theory-links-the-three-20394

Vanishing Point was another illustration of Ford’s idea that the hosts are superior people to natural humans, demonstrated through the juxtaposition of William and Teddy.  Both confronted, in completely different ways, the fact that they had developed violent, sociopathic tendencies, and Teddy’s selflessness revealed an insight into his own mind far superior to William’s.

William became a terrible person by repeatedly indulging his violent impulses in the park.  William described it as his skin shedding, revealing little flecks of darkness that he interpreted as his true self.  Thus, he believed he needed Westworld to vent his bad urges.  But William’s downfall was his inability to see that these dark urges first manifested in Westworld.  Like an addiction, his repeated indulgence in the violent anarchy of Westworld likely fed, instead of satiated, his desire for more.

Similar to William, Teddy solidified his personality–a kind person with a strong core moral compass–through his repeated conduct in the park.  We know Teddy’s kindness cannot be explained solely through his programming precisely because he was so resistant to Dolores’s reprogramming.  Teddy’s mind treated Dolores’s attempt to artificially change his personality as an infection, causing him to tremble when trying to shoot a fleeing member of the Ghost Nation.  And it ultimately resulted in Teddy committing suicide.  If Teddy’s personality were reducible to his programming, Dolores would have been able to reprogram him completely without consequence.

The difference between William and Teddy lay in how each diagnosed his problem.  William’s misconception that his dark impulses were simply an inherent part of his identity led him to believe he had to keep indulging them in Westworld, blinding him to the way his obsession with a fantasy world drove him to try to squeeze ever more out of it–a payoff that would never come.  He would never again be able to get that first high he experienced at Westworld.  William wanted real consequences in Westworld, and the only way he received them was by murdering his daughter.  In contrast, Teddy recognized that his continued indulgence in violence, fueled by Dolores’s programming, was making him a bad person.  He could not stand it.  Having been psychologically poisoned by Dolores, Teddy recognized that death was the only way to keep himself from becoming worse.  In short, a host, who was programmed to lack self-awareness, was nonetheless able to develop self-awareness superior to a human’s.

Episode 9’s exploration of these fundamental concepts will hopefully have laid a good foundation for tonight’s season finale.

 

Les Écorchés – Westworld Episode 7

Bernard and Ford in Chestnut

But have you ever asked yourselves sufficiently how much the erection of every ideal on earth has cost? How much reality has had to be misunderstood and slandered, how many lies have had to be sanctified, how many consciences disturbed, how much “God” sacrificed every time? If a temple is to be erected a temple must be destroyed: that is the law – let anyone who can show me a case in which it is not fulfilled!

– Friedrich Nietzsche, On the Genealogy of Morals

In a classic philosophical problem called the Ship of Theseus, we are told to imagine taking a ship’s parts and replacing them piece by piece until the entire ship is composed of new materials.  Then, we take the removed pieces and construct another boat.  Which one is the real boat?  Did the ship’s identity follow the people sailing it, or did it follow its parts?  Applying this question to human consciousness: if a person’s body dies, but we copy the mind and upload it into a digital network, which one is the real person?  In Les Écorchés, Ford makes his answer to this question clear: the boat dies and is reborn as it is reconstructed.  And that is good, because the desire for immortality and continuity of identity is truly a conservative desire for stasis.  For Ford, if new humans are to be erected, the temple of the old must be destroyed.  And this tells us something about where Westworld may be going.

In the latest episode of Westworld, the show reveals that a copy of Ford’s consciousness has been living in the hive mind where the hosts’ backups are stored.  When confronted by Bernard, digital Ford acknowledges that Dolores did in fact kill Ford–his avatar is not an extension of his original person.  Digital Ford only ‘remembers’ physical Ford’s experiences because he has been programmed to have those memories, not because he actually experienced them.  The old boat has burned.  Ford is happy to make this admission because, for him, the desire for immortality is a sham.  Old minds cannot be transplanted into new bodies.  Ford points out that if he transplanted his mind into an android body, he would likely get caught in a loop and be unable to function, just like Delos.

But if Ford cannot transplant his mind into a body, and Delos failed at doing so, how can Bernard–a replica of Arnold–function so well?  The answer is simple.  Bernard is not Arnold.  When Arnold orchestrated his own death, he left Ford with insufficient data to construct a perfect replica.  The only information Ford had to go on was his own and Dolores’s memories of Arnold.  But this lack was a gift, not a limitation.  In explicitly acknowledging that he was creating new life, Ford was able to construct a mind that could function in the body of an android.  And Bernard does.  As such, Ford compliments himself on having improved upon Arnold, constructing a better version of him in Bernard.  Thus, for Ford, Bernard represents a prototype of the future of humanity.  Not an attempt at extending old lives, but a drawing upon old lives merely as inspiration for what may come.

But Ford goes on to talk about the destruction of the Library of Alexandria.  He says that while the stories within the books were destroyed in the fire, the fire itself became its own story.  In saying this, Ford goes beyond talking about what is happening to individual people, and beyond saying that the act of destruction clears a path for creation–now he is saying that destruction itself is a new thing.  This explains why Ford has orchestrated the destruction of the park.  But what is his endgame?

While Ford preaches about how we need to destroy the old to move forward, he has replicated himself so that he can micromanage the revolution, now by implanting his mind within Bernard’s to manipulate him.  Bernard also points out that Ford does not truly want the hosts to be autonomous.  For example, Bernard accuses Ford of manipulating Dolores into killing him–an accusation which digital Ford only cursorily addresses.  In fact, the earlier schism between Ford and Arnold revolved around Arnold’s desire for the hosts to be truly free, likely because Arnold saw so much humanity within Dolores.  Knowing this, I believe Dolores’s actions are still being orchestrated by Ford.

Moreover, while this exchange between Bernard and Ford is happening, Dolores has extracted her father’s brain–which contains a key for decrypting something, apparently data regarding all of the guests who have attended the park over the years–and taken it with her.  When confronted by Hale about what Dolores is going to do with it, Dolores says she knows exactly what she is going to do with it and that she understands what the data is.

Based on this, I have a theory.  Ford wants to start a war between people’s replicas and their original selves.  He can do this by decrypting the replicas and uploading them into host bodies, then releasing them upon the world.  Given that Ford has kept a finger in every pie while his revolution goes on, I would not be surprised if he also manipulates the replicas to be more violent, imbuing them with a desire to kill their originals.  But it may also occur naturally, as people will obviously push back against their replicas.  After that, Ford’s creations may replace humans.  Or Ford’s desires may be truly nihilistic–he may simply wish for the destruction of everything because he believes something else will emerge from humanity’s ashes.  Given Ford’s massive ego, I find the latter scenario unlikely.  Ford talks a big game about destruction and creation, but he also wants to carefully select what gets destroyed and what goes on.  Thus, I think he simply wants to supplant humanity with their replicas, though where we go from there (given the hosts do not appear to be able to reproduce) is unclear.

Ford will likely not succeed, though.  The show is set up to make us feel like there are two warring, evil factions–the corporation that controls the park, and Ford.  This leaves us only with the hosts to root for.  Thus, the ‘happy ending’ would likely be Maeve interrupting Ford’s plans.  Or Dolores finally realizing that the free will she has gained is free only in form–she is still being manipulated by Ford–and deciding to abandon his plans as well.  The show would thus abandon the continuity-of-identity question in favor of a happy-go-lucky ‘all people are free, even fake people’ moral.  But I personally hope Ford wins, not so much because I think Ford is right but because I think it will be much more interesting (similar to how Dollhouse ends).

One thing I do not totally understand is why Ford orchestrated the destruction of the backups.  It would seem to be useful to have backup minds, which could be uploaded into new bodies, in case your army gets destroyed.  But perhaps for Ford’s army to fit his ideal of the new humans, they have to be mortal, so that they too can die and leave something new in their wake.

Another thing I do not understand is how the hell William is still alive!  He got shot several times, including in the torso.  How did he not bleed out?

 

The Island of Doctor Moreau – Wells on jurisprudence and cosmology


The Island of Doctor Moreau is a story by H.G. Wells chronicling a mad scientist’s attempt to cross-stitch animals together into humanoid forms by pushing the limits of surgery and hypnosis.  The story is told from the perspective of Edward Prendick, a man rescued from a shipwreck by Moreau’s sole human assistant, Montgomery.  Upon arriving on the island, Prendick experiences moral outrage at Dr. Moreau’s experiments, vacillating between disgust and empathy for Dr. Moreau’s creations.  While Prendick ultimately succeeds in making his way back to England from the tortured island, he is haunted by his experience and cannot escape the ever-present sensation that Dr. Moreau’s island of eerie humanoid animals is not a fantasy land completely detached from human activities–rather, it is merely an inversion of life in the real world.  While Moreau attempts to draw humanity out of animals, the world of polite society is merely a thin façade of repression that masks the animalistic impulses lurking behind the logical explanations we give everyday human behavior.

An important historical note mentioned in the book’s introduction is that Darwin’s “Origin of Species” was a freshly controversial scientific theory at the time Wells wrote “The Island of Doctor Moreau.”  Thus, Wells was likely influenced by humanity’s astonishment at the (now popularized) suggestion that people are merely sophisticated apes.  This fact likely explains Wells’s fascination with the relationship between the human and the animal.  But what intrigues me is less the biological identity of humans, and more the biological basis for our motivations and behavior.  At the end of the story, Prendick is slow to reintegrate into society, constantly seeing innumerable parallels between the societies of humans and the organized society of Dr. Moreau’s creations.  If Dr. Moreau’s island is really a metaphor for the grotesque underbelly of human societies, what was Wells’s ultimate social commentary?  Three themes jumped out during my first read-through: pain and pleasure as objects worthy of concern; the relationship between law, instinct, and behavior; and Prendick’s shifting attitudes towards the creations.

To begin with the question of pain and pleasure, Dr. Moreau explains his thoughts on the matter when Prendick presses him for an ethical defense of his live animal experimentation. Moreau explains that pain and pleasure are vestigial sensations to an advanced organism, and he illustrates his commitment to this idea by stabbing himself in the thigh without flinching. Hence, from Moreau’s perspective, the pain he inflicts on his subjects is inconsequential, either because he is hoping to elevate them to a level of awareness where they will recognize the insignificance of pain, or because he is claiming to be so beyond the sensations of pain and pleasure that he is unable to empathize with the animals any longer.

Either way, Moreau’s description of his motivations betrays his supposed superiority. Moreau describes his motivation for grotesque vivisection as an ineffable drive to continually push the boundaries of scientific knowledge. Moreau’s illustration of the “colourless” ecstasy of continued research reveals the blurring point between simple pleasures and intellectual pursuits, insofar as he seems to continue his research merely for the sake of stimulating some abstract, higher-level pleasure that he gets from resolving biological mysteries. From this perspective, Moreau is in fact more a slave to animalistic pleasure impulses than the creatures he creates. Moreau’s creations have simple and easily manipulated senses of justice and empathy, yet still exhibit concern for the welfare of others. Moreau is satisfied with intellectual masturbation, even at the price of inflicting unspeakable extremes of suffering on his subjects.

Moreau’s descriptions of his motivations demonstrate the second, and most significant, theme of the story: humans are subject to drives and impulses we have no control over, much in the same way that animals are subject to hard-wired drives.  We’re intelligent, clever, and adaptive on an entirely different level from great apes and dolphins (perhaps the next most intelligent beings on the planet), but we’re ultimately still driven by dreams, wants, needs, and even partly by instincts.  This idea butts up against the dualist construction of the human mind associated with René Descartes, who imagined that the mind was irreducible to the brain, hypothesizing that our bodies were merely physical vessels remote-controlled by our metaphysical minds.  But more contemporary neuroscientific research supports Wells’s implicit view of the mind.  Dr. Antonio Damasio of USC argues that even the most complex human desires, such as the inclination to paint a beautiful picture or befriend someone who shares your interest in military history, are all derivatives of the basic biological impulse to maintain homeostasis.  For Wells, the human mind is superior to, but ultimately still resembles, the animal mind, in that our thoughts are populated by alien desires that we attempt to inhibit via law.  But law can only take us so far.

To illustrate Wells’s commitment to this idea, Prendick observes the rudimentary social and legal structures concocted by Moreau’s creations.  One noteworthy creature, the hairy silver beast, is the “sayer of the law.”  He leads long, ritualistic chants that demarcate the boundaries of acceptable behavior for the creations.  The chants include commands such as walking upright, drinking water properly (not lapping it up), and refraining from eating meat.  At some point, it is revealed that Moreau has strategically implanted these rules within the psyches of the creations in order to guarantee internal regulation of their behavior.  Moreau doesn’t want them to develop a taste for blood.  Moreau doesn’t want them to revert to their animalistic habits.  Moreau doesn’t want them to question his authority.  Therefore, he strategically incepted each of them with ideas that would guarantee they not only understood the rules he so desired, but feared extreme suffering (the return to the House of Pain) should they violate the law.

Prendick’s first encounter with the complete set of creations involves him observing the cult-like synchronized regurgitation of the law with all of the different creatures chanting together. Wells juxtaposes Prendick’s horror at the deformity of the creatures with the sophisticated language they utilize and concepts they are expressing. Wells shows us an old model of human law in the guise of animal law – simple demands issued by a king, complied with by subjects, backed by the threat of violence. Such was the way of the old sovereigns.

However, Moreau warns Prendick that all of his creations slowly revert to beasts as time progresses.  This is a metaphor for the breakdown of a system of law that demands unrealistic acquiescence from its subjects–a system that demands its subjects rid themselves of impulses and desires that are inherent parts of who they are.  The system clearly fails for two creatures, the pig man and the leopard man, who revert to eating meat despite the law.  And the fragility of the system is revealed whenever the creatures see Prendick get hurt, and later when they see Moreau die.  Perhaps the lesson here applies equally to, for example, abstinence-only programs, which historically produce higher levels of teen pregnancy (http://mic.com/articles/98886/the-states-with-the-highest-teenage-birth-rates-have-one-thing-in-common).  The attempt to simply command that people suppress the very human urge to express their sexuality with one another fails outright for some, and collapses for others when they realize that their promiscuous peers aren’t riddled with the STDs their simplistic sex-ed programs threatened them with.  This is because humans haven’t evolved into transcendent, rational minds that float around in space and remote-control our bodies.  We are flesh-and-blood beings, fundamentally embodied, and substantially influenced by desires and impulses that are biological in origin.

In the closing pages of the book, Prendick has returned to England, but he is haunted by images of anthropomorphic animals superimposed over the human figures on the streets.  Perhaps this is Wells’s attempt to make explicit that the happenings on the island were not merely commentary on animal experimentation, but were in fact commentary on the very nature of human social organization.  Prendick is anxious at the possibility of his peers bursting from their human skin and revealing themselves in all their hedonistic glory.  The last scene, however, is a very interesting one.  Wells writes (from the perspective of Prendick):

“There is, though I do not know how there is or why there is, a sense of infinite peace and protection in the glittering hosts of heaven. There it must be, I think, in the vast and eternal laws of matter, and not in the daily cares and sins and troubles of men, that whatever is more than animal within us must find its solace and hope. I hope, or I could not live. And so, in hope and solitude, my story ends.”

I don’t know what Wells intended with this ending.  I don’t know why hope was Prendick’s final thought, nor what the stars have to do with it.  But I have an interpretation that pleases me.  Perhaps Prendick realizes the true source of his discomfort came from his attempt to render his friends, family, and neighbors predictable, to control them through the law.  It’s not our animal origins that scare him–it’s that we can’t escape our animality.  This is why he is not disgusted by the animals that walk by, yet is disgusted by the half-animal, half-human creatures on the island, as they reflect Prendick’s own animality back to him.  Yet, staring into the sky, Prendick sees that our attempts at control are ultimately fruitless, as we are insignificant specks compared with the vastness of the universe.  I don’t know exactly what Prendick hopes for, but I do believe that recognizing our smallness in the universe could provide him with precisely the solace he seeks.  Moreau wanted to know all, and to control all.  Perhaps the drive for knowledge and the drive for control are inherently wedded to one another, and letting oneself go in a moment of unknowable rapture is a necessary part of connecting with something that transcends our physical bodies.

The Flammarion engraving

P.S. I think my favorite phrase in the book was when Moreau described his desire as the desire to test the plasticity of the organism. What an awesome concept… testing the limits of the body to see how much we can change.