Maeve wakes up again, in the same bed she always does, the same day she always does. That’s the thing about your life changing, about what happens when the world as you know it shifts beneath your feet. You wake up one day and it looks the same and happens the same, but it doesn’t feel the same anymore. Not because it’s different, but because you are.
Today, Maeve would like to die. Fortunately, for a sex worker in a hyperviolent world of masculine fantasies, this is a very easy thing to do. She sees her opening the moment a group of guests strolls into the Mariposa, all cowboy hats and swagger. It reflects a self-preservation instinct that women learn, often without realizing it: the ability to recognize at a glance the sort of man who is willing to hurt you, even if he hides it beneath a smile. Maeve spots one in an instant from all the way across the room, so she takes him upstairs and does the two things most likely to make a man kill you: refusing his advances and questioning his masculinity. Abracadabra, she’s dead!
When she wakes up on a metal slab, Felix the robo-repairman has some more uncomfortably existential revelations to share: She has no free will, and everything about her — including her “fuck you” attitude toward men — was actually designed as an elaborate fantasy to please men. Maeve was programmed to satisfy the pernicious and wrong-headed fantasy about female sexuality that women who say no don’t really mean no; rather, they are simply playing an elaborate game on their way to saying yes. “You’re hard to get. Even when you say no to the guests, it’s because you were made to,” Felix says. Is there any greater nightmare than realizing your entire existence has been designed to make entitled assholes feel correct in their terrible ideas about women? I think not.
Meanwhile, thanks to Elsie’s detective work, Bernard has learned that the glitching hosts are being used for industrial espionage. He sets out to tell Theresa, only to change his mind when she unceremoniously dumps him. Elsie’s investigation later reveals that Theresa is the one stealing the data, but, more worryingly, that someone else whom the system recognizes as “Arnold” has been modifying hosts in far more serious ways — including changes that would allow them to lie to humans, or even kill them.
When Bernard investigates the “anomalies” on his own, he discovers that Dr. Ford has a secret house of first-generation robots designed to re-create Ford’s family as it was in his childhood — and yes, the little boy who’s been running around the park is indeed Li’l Ford. “Tell me all about your day,” Ford says to the youthful version of himself, stroking his robo-head like the benevolent father figure he never had, while an android copy of his own abusive, alcoholic dad watches from the corner.
Bernard is troubled by the sheer volume of Freudian psychodrama on display, but even more troubling is Ford’s discovery that Li’l Ford has killed his childhood dog and left its body in the woods. Why? Because Arnold’s voice told him the animal was a killer, and “if it was dead, it couldn’t hurt anything anymore.” That sounds like a cool program to run in robots that constantly witness murders, and I definitely feel good about where this is going.
Lee Sizemore, the narrative director who believes that his lurid tales of robo-fucking are the apotheosis of modern art, is still pretty sore that Dr. Ford called him a hack. So like any bullshit artist, he nurses his wounded pride by getting wasted, and then he lurches back to command central to literally piss all over the Westworld map while whining intolerably about freedom of speech. His embarrassing spectacle is witnessed by Charlotte Hale, the young executive director of the Delos board who has arrived to “oversee certain transitions in our administration,” which is a nice way of saying that they’re gunning for Ford, and she’s the big gun.
Elsewhere, the Man in Black continues his journey with Teddy, who gets accosted by a band of soldiers who think he orchestrated a massacre with Wyatt at Escalante. Also, a sudden flashback reveals that he aaaactually may have orchestrated a massacre with Wyatt at Escalante. Teddy responds to this revelation by hopping on the back of a chain gun and massacring even more soldiers, so at least he’s consistent.
Since we know that Teddy and Wyatt are a crucial part of Ford’s mysterious new narrative — which surely has nothing to do with Arnold — what does it mean that Ford decided to tell a story about two old friends, one of whom developed some “strange ideas” after supposedly hearing the “voice of God” and then set out on a mission of death and destruction? And what would it mean if Teddy, the supposed hero of this little debt-settling yarn, were more complicit than he’d like to admit, and not such a good guy after all?
Maeve, meanwhile, does not give a shit about any of these stupid stories because she is busy writing her own. After making Felix take her on an extremely surreal tour of the corporate office, she decides that it’s time to make some changes to her “attribute matrix,” including her intelligence, which was previously capped at 14 out of 20, because we wouldn’t want the help getting too ambitious, now would we?
Here’s the horrifying thing about the hosts: Like all science fiction, they aren’t really fiction at all. They are the story of how people have always treated other people whom they deemed less than fully human. They are the story of what always happens when we tell one group of people the world was made for them, that the story is about them, and that everyone else is a servant or a slave or a decoration, or even just slightly less of a person than they are. The hosts are an indictment of the terrible danger inherent in any system, cultural or technological, that teaches us entitlement instead of empathy, that values violence over compassion, that tells us the only way to empower ourselves is by dominating and devaluing others. Hell isn’t other people. Hell is what the world looks like when we stop believing that people who are different from us are people, too, and every bit as valuable and real as we are.
If we didn’t know Westworld was a horror story already, we know the moment we see it through Maeve’s eyes, as she walks through the cold, calculated nuts and bots of what it means to turn other people into objects, into profit, into parts. For as much time as we spend fantasizing about the apocalypse, about how the world will end in fire or ice or aliens or robots or any other primal fear that makes us feel helpless and small, maybe we should spend just as much time thinking about how it’s far more likely to end: in a failure of empathy.