Wow a blog post! That’s right! In the middle of a busy semester I figured I’d take a short break from thinking about important things and instead shower you with trivial and disjointed thoughts about movies! Yay! Buckle up, we’re going for a trip!
When I saw the trailer for Her in theatres before Catching Fire, I couldn’t help but sigh. My feelings about the admittedly brief glimpse of the film that the trailer provided go beyond the usual passing annoyance, disinterest, and general sense of disillusionment with a film about yet another straight white man. Don’t get me wrong, that’s certainly part of it; straight white men, I am sick of hearing your stories all the time. Sorry, not sorry. But no, it was more than that.
“Her” makes me angry.
That’s probably pretty confusing for a lot of people. Most people who are uninterested in the film are just that — uninterested. Not actively angered by it. Certainly not enough to write a blog post about it. So why is this particular film eliciting a reaction like that from me?
Well, in short, “Her” rubs me the wrong way because it presents a concept that goes against my core understanding of human relationships. Maybe that sounds like a “yeah well duh, but that’s what it’s about!!! It’s supposed to make you question what love truly is!!!” kind of reaction. Yeah well no, let me tell you why. It’s a long road and there are many facets to my distaste for this film concept, so bear with me.
What do YOU think is the fundamental aspect of being human? Apparently someone was arguing that like, when AI can have “flaws” just like humans, then “how is it so different” or whatever; I don’t know, the usual stuff that gets thrown around when talking about AI. So is it that, the presence of “flaws” (and whatever that entails exactly)? Is it the ability to love or feel emotions? Not in my opinion.
What really defines humans is free will. The 100% completely free ability to choose. Because what that entails is totally arbitrary, nonsensical decisions. We don’t understand what exactly free choice entails. Even for mundane things like “what makes us like music”. Boy I could talk about that all day, I’ve been doing my research. There are all sorts of theories from all sorts of fields as to what influences our tastes. And there are some damn good theories. But they only explain general trends in our taste. They are completely unable to account for each individual decision.
Each of those theories holds some solid arguments; maybe they are all right. The human brain is vastly complex, and our decisions are affected by vastly complex experiences. Emotions. Memories. Social context. Specific situations we happen to be in at the time. What makes us love someone but not someone else? If we can’t pinpoint exactly what influences our decisions and how much, how would we be able to duplicate that with AI?
Because free will sometimes makes no sense. Humans are not computers. Sure our brains seem like a big huge complex computer system, with neurons firing in specific patterns, etc etc. But we haven’t figured out exactly how our decisions are made, and I don’t think we ever will. The decisions humans make are sometimes arbitrary. Sometimes we will make one decision one day, and a completely contradictory decision the next. They do not follow a clear-cut pattern.
You can’t duplicate the way our brains work without understanding it first. And we clearly don’t understand it. So AI may seem to imitate free will, but I don’t believe it is, or ever will be, truly duplicating it. Our entire life’s experiences seem to go into each decision we make, along with our emotions at the time, and that’s something that’s hard to replicate. Throw in other factors that we’re not even entirely sure about, and there’s pretty much no way you can really, actually copy it.
So problem number 1 with these movies for me is that they go “imagine if robots become just like humans!” and I pretty much feel that just. No. Sorry.
Okay, we get it, shut up Jenn, why does that even matter? Did you hate every single sci-fi movie about AI as much as this one? No, definitely not. There’s more wrong with it.
When we love a human, (hopefully) we love them in part for their humanity. We love the essence of their being human. Part of loving someone is loving their ability to be 100% their own complete person who makes their own decisions and has their own life experience. I mean, we can all agree that wanting your partner to no longer have the ability or desire to make decisions is abusive. But if AI can’t fully replicate free will, then its ability to make decisions the way humans do is lacking. So if someone loves an AI the way they would love a human (instead of, for example, a pet, where complete human free will isn’t as much of an issue)… aren’t they missing something integral in a relationship?
And here we get to why the concept makes me angry: maybe that’s the point. If someone who otherwise is attracted to humans finds themselves in love with an AI instead of a human… is it because that AI is missing the essence of humanity? No offense to our imaginary future AI brethren, but I’m pretty sure with 7 billion people on the planet you can find a human who’s just as awesome as that AI (I mean, there isn’t even only one person on earth we find awesome, so there’s no way an AI replicating human-ness is going to be like so totally way more awesome than any human ever), so what makes the AI so attractive? I’m not talking about people like the woman who fell in love with and married a bridge, or the man who’s dating his car. That’s a different kettle of fish. I’m talking about the people who are convinced their one true love is this almost-but-not-quite-human. But it’s more than that. It’s the story that’s being sold to us.
It just seems too… convenient.
In a society where women are constantly dehumanized, objectified, belittled and attacked for having opinions, a man being in love with a woman who is quite literally objectified and not human hits a little too close to home. Wow you’ve happened to fall in love with the one woman who doesn’t quite have free will! You lucky bastard! Yeah, but this is a movie, not a true story.
Which kind of makes it worse for me. It’s a movie; it’s trying to sell us this idea. Everything about it just squicks me the fuck out.
There are far more men in movies than women (especially when it comes to white men vs. women of colour). The past three years have actually seen a decline in women’s representation in blockbuster movies. This movie doesn’t even have to put its female lead on screen. Convenient!
We live in a society in which we see women turned into objects, and women are told constantly to be quiet, to be smaller, to take up less room, to not rock the boat, to not step out of line, that their “no”s and “yes”es (but especially their “no”s) aren’t quite as important as men’s. And now Hollywood is selling us a story all about a white man and his little computer girlfriend, who literally isn’t quite human and who sort of weirdly embodies that ideal? Convenient!
In a movie that is trying to sell us on relationship dynamics that I consider to be incredibly iffy at best, I can’t help but feel there’s a reason it’s about a man and not a woman.