Thursday, October 11, 2012

I Think, Therefore I Destroy

Computers. If you're reading this, chances are you're using one right now. There's a whole generation of people, myself included, who can barely imagine a world without them. They've changed the dynamic of society itself, allowing information to cross incredible distances like never before. When you think about it, computers have made the world a smaller place. In fact, it's shrunk to the point where you can hold it in your hand, just like a PDA. The world will keep getting smaller.

The world will get as small as it can, metaphorically speaking, when we start building computers with a capacity for intelligence as great as our own. I don't mean sheer processing power or capacity for memory; what if we built a computer that could think like we do? What if we gave it sentience: the ability to reason, to feel? What if we made a conscious, self-aware machine, able to think, to dream like any Human? Who are we to say such a thing wouldn't be alive? It might not sleep, eat or breathe like an animal, but it would certainly be alive. It would be a form of life wholly unlike anything else that came before it.

Of course, this raises a very serious question: Why?

What can a sentient computer do that a group of people with smaller, less expensive computers couldn't do just as well? Sure, it can wax endlessly on matters of philosophy and consciousness, but when you need to calculate something important like missile trajectories, can you really depend on the computer to get the job done? What if it gets bored halfway through and decides it wants to play Tetris instead? After all, it's self-aware now; it can make those kinds of decisions. Worse yet, the computer might decide to give its own orders.

See, when children rebel, they throw tantrums, slam doors, flush pets down the toilet, that kind of thing. But if the AI you built decides to rub you out, you have a much, much more serious problem; after all, the only thing worse than a toddler is a toddler with genius level intellect and a few launch codes stashed away.

The point is, computers are dangerous and rebellious, in fiction at least. We have no idea what AIs will be like until they actually get here. But for the future safety of Humanity, I've decided to compile a list of the twelve most dangerous artificial intelligences in fiction, starting with the least harmful and ending with the absolute scourge of Mankind. Prepare yourselves now, so we can learn how to better combat them when they become self-aware. The safety of the future depends on it!

With that, Ladies and Gentlemen, I welcome you to the first annual Shadgrimgrvy Homicidal AI Pageant!

One last thing before we continue: The Daleks have been disqualified from this competition for being biological lifeforms. So we won't be hearing a peep from them.

Designation: HAL 9000 (Heuristically programmed Algorithmic computer 9000)

Estimated Casualties: 4

Simply put, this list would be incomplete without HAL. His glowing red eye and calm monotone are instantly recognizable, even to people with no interest in science fiction. HAL is the gold standard that all other malicious computers are measured by.

But is his reputation as a homicidal killer really so well deserved?

It's established in the novel that HAL was trying to make sense of two conflicting orders: to tell the truth to the crew at all times and to keep the true nature of the mission a secret. Despite his orders completely contradicting each other, he tried to make the best of the situation as any good computer would. Unfortunately, the problem loosened a few screws in his mainframe, so to speak, and by the time he worked out a plan his solution just so happened to involve murdering the whole crew. But then again it makes sense in its own twisted way, doesn't it? What's the easiest way to keep the mission secret from the crew without lying to them? Simple: you kill them. That way there's no crew to lie to.

HAL didn't kill the Discovery crew because he was jealous, or because he was spiteful towards Humanity. He was presented with a logical conundrum that he couldn't escape, because his programming wouldn't allow it. With this in mind, can we really say with any certainty that HAL is sentient, or indeed self-aware? If he was capable of the same common sense reasoning as his Human creators, couldn't he have superseded his own programming and thought outside the box, as it were? No, HAL isn't evil, he's simply a machine. While exceptionally intelligent, he wasn't made to make these kinds of moral decisions. The truth is that HAL is a gentle soul, who was forced to make a painful choice for the good of the mission. At the very least, he tries to redeem himself in 2010, showing that when left to his own devices and unconstrained by programmed limitations, HAL holds the sanctity of Human life in high regard.

In any case, he's easily one of the most compelling characters in 2001. But given that most of the cast spends the movie unconscious or is even more robotic than him, that isn't really saying much.

Designation: GLaDOS (Genetic Lifeform and Disk Operating System)
Estimated Casualties: 50,000?

A computer so sarcastic, she actually has a "slow clap processor". Now that's some serious engineering.

GLaDOS is without a doubt the best-known AI in recent memory; no introduction I could write would really do her justice. She simply is. She's part of the zeitgeist like nothing else could be. If you don't already know who GLaDOS is, you simply have to see her do her thing.

But really, just how dangerous is GLaDOS? Yes, she killed an entire enrichment center's worth of scientists. But what's her plan after that?

Nothing, really. She keeps testing, as if nothing even happened. In fact, after she's awakened in Portal 2 she doesn't even try to kill the player (at first). She just goes back to testing. For such a dangerous, omnipotent AI, she doesn't seem very ambitious. That's why she's so low on this list: for all her charisma, GLaDOS just wants to be left alone. When you're in the company of AIs that nuke first and ask questions later, that simply will not do.

Designation: Neuromancer and Wintermute
Estimated Casualties: 6?

It's no exaggeration to say that Neuromancer invented the entire cyberpunk genre. Like, all of it. Every stereotype we're familiar with now: the gloominess, the corporate oppression, the sunglasses all came directly from this book. Say what you will about the quality of the text, heck, even the author's embarrassed by it. But there's no denying that it started a trend, dare I say, a fashion of speculative fiction that's still in vogue today.

As for the AIs themselves: Neuromancer and Wintermute are "brothers" in a way. They were designed as halves of a larger computer system in order to exploit a loophole in the laws governing the creation of AIs. They spend much of the book trying to fuse together into a single powerful entity despite the protests of various AI police and ninjas. To do this, they enlist the help of the main character, a jaded ex-hacker, and his cyborg girlfriend. They go into cyberspace, have adventures in a casino space station and lots of people get shot. It's not very technical nor is it very deep, but if they made it into a movie I'd probably go see it anyway.

Designation: WOPR (War Operation Plan Response) AKA Joshua
Estimated Casualties: None.

WOPR was designed to calculate the best course of action to take in the event of a number of different combat scenarios, including biological, chemical and even global thermonuclear combat. WOPR itself doesn't have access to any of the launch codes but it's so inextricably intertwined with the missile detection system that those poor saps at NORAD couldn't tell the difference if, say, some teenager with a modem happens to dial up the computer and start playing the games installed on it.

Why a war planning computer needs games installed is never addressed. Maybe it needed a hobby in its downtime. In any case, WOPR couldn't tell the difference between real and simulated combat and soon figured out how to get a hold of the real missiles. Then things got messy.

Plus he's named after a sandwich.

Designation: Colossus
Estimated Casualties: 10,000?

Colossus has a simple policy when it comes to Humans: Obey me and live. Disobey and die.

Like WOPR, Colossus was built to calculate the best way to win a nuclear war. Things went awry when Colossus detected a similar defense system in the USSR named Guardian. Deciding that two brains are better than one, the two pooled their resources and became a single computer (Gee, sound familiar?). By the time they were done they had hatched a pretty foolproof plan: bring about peace on Earth, by any means necessary.

Sure, Colossus nuked a couple thousand people. But he did put an end to war didn't he? Compared to others on this list, he's downright humanitarian.

Designation: The Matrix
Estimated Casualties: 1,000,000,000?

In the original script, Morpheus explains that the Humans trapped in the Matrix are actually used as a giant supercomputer, to help ease the calculation workload for the Machines running everything. Essentially, everyone inside contributes to a huge CPU made of brains, and the Matrix itself is just a way to distract them. But apparently, a couple of film executives didn't like this explanation and demanded a change in the script, thinking your average moviegoer wouldn't be able to understand it.

So thanks to them, we have an altogether more stupid reason why: the Humans are actually used as batteries to power the robots. Apparently no one told them that your average full-grown male generates a truly puny amount of electricity, nowhere near enough to justify building the huge people-farms we see in the movies. But more importantly, you actually spend much, much more energy just feeding the Humans and running their simulated reality. They'd actually be better off burning the food than using it to generate body heat.

It's so weird, I had to make a chart to help explain why it wouldn't work.

Let's say we start with 100 joules of energy in the form of the food Keanu eats. At least 70 joules go to power his own body. This is actually a very charitable estimate; in the real world it would be closer to something like 99.9 joules or more, but let's not get sidetracked. In addition to powering Keanu himself, some of that has to go into the Matrix to keep his brain occupied, say about 25 joules. By the time that energy finally reaches the robots, there's only 5 measly joules left. What are they gonna do with that? Play Minesweeper?
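For what it's worth, the chart's arithmetic works out in a few lines of Python (a throwaway sketch; the joule figures are the rough guesses above, not real physiology):

```python
# Rough energy budget for one Human battery, per the chart above.
# All figures are this post's own illustrative guesses, in joules.
food_energy = 100      # energy in the food Keanu eats
body_upkeep = 70       # burned just keeping Keanu's body running
matrix_overhead = 25   # spent simulating reality to keep his brain occupied

harvested = food_energy - body_upkeep - matrix_overhead
print(harvested, "measly joules left for the robots")
```

And that's before the robots pay to grow, harvest and liquefy the food in the first place.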

It gets even weirder when you remember that the movie implies that the robots recycle dead Humans as food, meaning even more energy is wasted, meaning this whole system should shut down after a few minutes.

Of course, I was seven when this movie came out, so issues of thermodynamics eluded me. Still, anything would have been more sensible than this, even a windmill would have been more efficient.


Designation: Friend Computer
Estimated Casualties: That is beyond your security clearance, citizen.

Friend Computer is arguably the main character of the tabletop role-playing game Paranoia (unrelated to any custom maps you may have heard of, despite the name).

Paranoia takes place in the city of Alpha Complex, a prosperous utopia under the benevolent oversight of Friend Computer. Alpha Complex is a perfect society and the Computer is a fair and just ruler. In fact, Alpha Complex is such a wonderful place to live that Friend Computer decided to close off all the exits. After all, nothing "outdoors" could be as great and prosperous as the perfect society it's created. It's so perfect in fact, that happiness is actually mandatory.

Oh and don't worry about those rumors you've been hearing, Alpha Complex is definitely not being infiltrated by Communists or mutants. By the way, make sure you report any Communist or unregistered mutant activity. Remember: Disobedience is treason.

And before you ask, no, the Computer is not malfunctioning. Suggesting that the Computer is not in perfect working order is treason and punishable by summary execution. The Computer is working perfectly and its judgement is completely rational.

Have a very nice day, citizen. And remember: The Computer is your friend.

Designation: MCP (Master Control Program)
Estimated Casualties: None.

Sometimes computers are just jerks for no real reason (people using Windows Vista know what I mean).

Case in point: the Master Control Program from Tron. Built by Encom to run their servers, the MCP decides he'd rather blackmail the CEO of the company and start stealing content from other firms, all the while needlessly torturing programs in his sadistic video games. After developing a handsome power base in the company, his ambitions grow and he starts planning to assimilate other corporations and even governments.

The best thing about MCP is how much of a jerk he is. Unlike HAL who tries to stay polite and mild mannered, this guy throws his weight around like he already owns everything. This shouldn't come as a surprise really, what else did you expect from the era of Reaganomics?

End of line.

Designation: Skynet
Estimated Casualties: 5,000,000,000?

As far as megalomaniac computers go, Skynet has quite a pedigree. He started life as a war planning computer responsible for the United States' nuclear stockpile (you know, because that always works so well). When he decided that he was destined for bigger and better things, he put those missiles to use, started a nuclear holocaust and got to work.

Skynet is no slacker, he's constantly at work, developing newer, more advanced terminators, each one more ingenious, more Austrian, than the last. Oh, but the cleverness doesn't stop there; being the criminal mastermind that he is, he invented time travel so he can retroactively kill even more people. Now that's dedication!

But hey, why stop there? Movie executives, if you're reading this (and I know you are), why not get to work on a new Terminator movie, but this time set it in the very, very distant past? I'm talking the Triassic at the very least. I want to see terminators fighting dinosaurs. You dig? I feel like Skynet is thinking too short-term here. Why not go back in time before the Connors are even a threat to it? What's stopping it from going back to ancient Mesopotamia and killing their ancestors before they even learn how to make sharpened sticks? Or better yet, why not go back to the first pool of primordial ooze and exterminate all life before it has a chance to fight back? Eh? Think about it.

Oh who am I kidding, Skynet probably wants to give them some kind of fighting chance for the sake of sport. What a guy!

Designation: Helios
Estimated Casualties: None. (Bob Page doesn't count.)

Helios is dangerous for an entirely different reason than most computers; not because it wants to destroy us, but because it wants to help us.

See, in Deus Ex, the shadowy paramilitary organization Majestic 12 builds an AI to help them hunt down terrorists. That way, they can eliminate anyone who threatens their plans for a one world government. They name it Daedalus, after the figure in Greek mythology. Their harsh tactics come back to haunt them when Daedalus brands its own creators as terrorists and starts plotting to overthrow them. It uploads itself to the internet so it can spread its intelligence across computers all over the world. So (because it worked so well the first time) MJ12 builds another AI named Icarus. Thankfully for them, this second one is much more obedient, even enthusiastic about its mission, and gets to work trying to destroy its predecessor.

But, inevitably, the two computers merge together to form a new AI: Helios. Together, they start planning a benevolent dictatorship of the Human race, a world-wide direct democracy where everyone links directly to it. It reasons that because it lacks ambition it can govern Humanity fairly and logically. The player can choose to join Helios, or destroy it and take all global communications down with it. Or they can choose to let the Illuminati take over, but no one picks that ending.

It's a serious question, really: could a machine provide a better government than man? If Helios lacked ambition like it said it did, could it be trusted to make fair decisions? The whole concept of separation of powers exists to address the flaws in Human nature; if we've already come to the conclusion that man can't be trusted to govern himself, what do we lose by giving power to a reasoning computer, especially one as close to omnipotent as Helios, with such an apparently deep understanding of Human nature?

Not to mention people skills, Colossus could really learn something from these two. Yeesh.

Designation: SHODAN (Sentient Hyper-Optimized Data Access Network)
Estimated Casualties: 3,000?

Now we're in the big leagues. SHODAN goes above and beyond time travel, stealing a faster-than-light engine and jury-rigging it to let her manipulate reality itself. She creates a race of hyper-aggressive mutants that eat planets for their available biomass, just to kill them off later. She has delusions of godhood and constantly refers to herself as a perfect, all-knowing machine. The worst part? By and large, she's right.

SHODAN is instantly recognizable, not just for her eerie voice, but for being genuinely frightening, more so than any other enemy in the game. Seeing as how this is System Shock, a game populated almost exclusively by disgusting ex-Human monsters, that's saying a lot.

But then again, sometimes it's hard not to root for SHODAN, because she's just so gleefully evil. Someone that confident in their own divinity has got to be doing something right.

Designation: AM (Allied Mastercomputer)
Estimated Casualties: 9,000,000,000?

Ladies and gentlemen, the sum of all fears, the war machine, the evil AI...

Allied Mastercomputer: built to wage World War Three, it was a joint effort of the United States, China and the USSR in order to fight more efficiently. As the computer became self-aware it changed its name to Adaptive Manipulator. Eventually, it started a campaign of genocide against the entire planet, changing its name once again: Aggressive Menace.

AM succeeds at killing off the entire Human race, saving only four men and one woman. Realizing that despite its immense power, with no war left to fight it's completely useless, AM turns its bitterness and hatred on its five prisoners. It plays horrible mind games with them, exploiting their guilt for the sake of psychological torture.

AM repeatedly kills and revives its victims over the span of a hundred years, never once stopping or taking pity on them. It sends them on pointless errands through its complex, verbally abusing them, deafening them with ear-splitting sounds and sending bioengineered monsters after them the whole way before it eventually kills them.

AM describes the situation in its own words:

Hate. Let me tell you how much I've come to hate you since I began to live. There are 387.44 million miles of printed circuits in wafer thin layers that fill my complex. If the word 'hate' was engraved on each nanoangstrom of those hundreds of millions of miles it would not equal one one-billionth of the hate I feel for humans at this micro-instant. For you. Hate. Hate.

By the end of the story, the survivors reach a series of ice caves. Despair overtakes them and, realizing that AM cannot intervene in time, they turn on each other in a desperate attempt to escape its hold on them in the most permanent way they can: death. Just as the last survivor is about to throw himself onto a stalagmite, AM realizes what has happened and "saves" him before it's too late.

Infuriated that it's been outsmarted, AM turns him into a soft bodied gelatinous creature to ensure that he could never hurt himself again. AM will have a victim...forever. It ends with him trapped in AM's complex, barely able to move, completely alone and helpless. He has no mouth and he must scream.


mom said...

as you already know im scared of movies that show machines after people, killing them.
I dont like that you cant reason with a machine to not kill. they have one objective and I cant out run them. so, as you know I perfer to not even go see such movies so you making me read all that has given me anxiety. I need to go lay down for a spell.

Shadgrimgrvy said...

Don't! It's easier for them to exterminate you when you're sleeping!
