Last night, lonely little ol’ me sat in a hotel room in Laramie, Wyoming (inauspiciously named for Jacques LaRamie, a French or French-Canadian trapper who disappeared in the Laramie Mountains in the late 1810s and was never heard from again–nice choice, Laramie). I decided to watch Ex Machina (mostly because it wasn’t very long) and see how it fared on that spectrum of AI representations. I’m sure you’ve seen a few of these: A.I. (Spielberg/Kubrick)–which I’ll confess made me so sad I cannot imagine watching it again; I, Robot; Her; 2001…etc.
Also, because the “machine” in question is the usual male wet dream–slinky, sexy, scantily clad, flesh-like–and I always wonder why. I mean, I know why, but it’s a stupid error from the beginning to conceive of a sex toy as the primary impetus for AI…though it seems like that’s all men can think about…though the movie suggests this is a diversionary tactic. And here, at least, I think that’s true. And in the world that is real around us, I think that’s true also. Entertainment sex is diversionary. Look, this robot is lifelike and will have sex with you and will even like it (or seem to). Oh, no, don’t look over here where there are military versions of robots that will eviscerate you via a programmed calculation.
I’m moving off my intention.
Military robots and self-driving cars are always “in the news.” One key issue that is always bandied about, at least in academic journals and the occasional mainstream rag, is morality. Yep, moral choices. So, the machine needs a moral calculus (not a compass…hmmm) to decide when and where to swerve or to kill. If a human cannot decide (you’re strapped into your Google car and can’t “take the wheel”) then the machine must have a reason (a line of code) to make a choice in a situation that surely will arise–avoiding a cat means crashing…don’t avoid; avoiding a child means crashing…avoid, etc.
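That “line of code” can be sketched crudely. To be clear: the weights, the `decide_swerve` function, and everything else here are my own hypothetical inventions, not anyone’s actual autopilot–the point is only that the whole ethical problem ends up hiding in a lookup table somebody typed.

```python
# Illustrative only: a crude utilitarian "moral calculus" for the
# swerve-or-not scenario. All names and numbers are hypothetical.

# Assigned "moral weights" -- the entire ethics lives in this table,
# and someone had to write it down.
OBSTACLE_VALUE = {
    "cat": 1,
    "child": 100,
    "adult": 50,
}

CRASH_COST = 10  # cost (to the occupant) of swerving into a crash


def decide_swerve(obstacle: str) -> bool:
    """Swerve (and crash) only if hitting the obstacle 'costs' more
    than the crash does. Avoiding a cat means crashing: don't avoid.
    Avoiding a child means crashing: avoid."""
    return OBSTACLE_VALUE.get(obstacle, 0) > CRASH_COST


print(decide_swerve("cat"))    # False: hit the cat, spare the car
print(decide_swerve("child"))  # True: crash to spare the child
```

Killing by equation, as I say below: the comparison runs in a microsecond, but the numbers in the table were chosen by somebody, somewhere, long before the cat stepped into the road.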
Now, this is fun utilitarianism, and Western governments and for-profit corporations have been using these kinds of formulae to decide on things like giving you healthcare or not; or retaliating via airstrike or ground troops or sanctions. Killing by equation. “Just War.” Ahem.
But, where is empathy? Can the machine feel? That’s always the last question. That’s the question being asked in Ex Machina (and of Data on Star Trek: The Next Generation). People are readily duped by their feelings. Machines are not. You can see why “Managers” want machines to do all the dirty work–no remorse. But the Turing Test on display here is: can the machine APPEAR to be empathetic? If YES, that means the machine can, and will, manipulate the easily duped human to whatever ends the machine deems appropriate.
Without a MORAL calculus, a machine that can act empathetic (like any good politician) WILL act without morality to achieve ends. The question, then, is: what are the ends, and who defines them? Humans, you say. Sure, but humans are writing code that self-corrects and adjusts to continuing information. So, the ends seem capable of being shifted without external command. IF this, then that…seems simple enough when you say it, but there are infinite Ifs in the world. That is the brain, that is uncertainty, that is randomness–it is thinking.
I suppose one can imagine a “fail safe”–if the calculus can’t be clearly discerned and applied, shut down (disarm) and die. But “fail-safe” is a kind of oxymoron, ain’t it?
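The imagined fail-safe can be sketched as a guard clause–and again, every name here is a hypothetical placeholder of mine, not anyone’s real safety system:

```python
# Illustrative only: the "fail safe" imagined above, as a guard clause.
# evaluate_calculus and shutdown are hypothetical stand-ins.

def act_or_die(situation, evaluate_calculus, shutdown):
    """Apply the moral calculus; if no clear verdict can be
    discerned, disarm and stop -- 'shut down and die.'"""
    verdict = evaluate_calculus(situation)
    if verdict is None:   # calculus can't be clearly discerned
        shutdown()        # disarm
        return None       # and die
    return verdict
```

Of course, the oxymoron survives the sketch: deciding that the calculus “can’t be clearly discerned” is itself a judgment the calculus has to make.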
So, empathy. Recently “philosophers” and psychologists have been arguing that empathy leads to error, and they advise against it (Paul Bloom for one). Apply a moral calculus instead (Stevie Pinker). Also, this is the program being touted to teach inner-city youth, minorities all: NO EXCUSES means no empathy. If I can’t empathize with you then I don’t care that you suffer; I don’t care if I make you suffer. Creating Cat Killers in No Excuses charter schools. But how often now do we hear about the sociopathic CEOs? They can’t empathize with the millions they impoverish or enslave–they have a fiduciary duty to be sociopathic! And that is PRAISED! Ava in Ex Machina is a sociopath–perhaps psychopath is better. What good is empathy? It is a barrier to action, after all.
Ex Machina tells us this. It could have told us this with an AI gendered male–he too could have been flirtatious and Machiavellian–ho-hum. Boys and toys; and apparently women are better conceived of as toys for boys who are not men. (“She has a hole between her legs with sensors…”; you had me at hole, quoth the boy.)
But really, AI conceived this way, and the way it’s presented in Her, seems to be more about human loneliness than anything else. We grow lonelier by the day, by the minute, by the tweet.