There’s one more thing to worry about. As if it weren’t enough to worry about ISIS, Democrats, Establishment Republicans, and same-sexers wanting to force us to accept their immoral and irrational lifestyle or face the loss of employment, now we have to concern ourselves with killer robots.
“Killer robots which are being developed by the US military ‘will leave humans utterly defenceless,’ an academic has warned.
“Two programmes commissioned by the US Defense Advanced Research Projects Agency (DARPA) are seeking to create drones which can track and kill targets even when out of contact with their handlers.
“Writing in the journal Nature, Stuart Russell, Professor of Computer Science at the University of California, Berkeley, said the research could breach the Geneva Convention and leave humanity in the hands of amoral machines.
“‘Autonomous weapons systems select and engage targets without human intervention; they become lethal when those targets include humans,’ he said.”
Fiction is about to become reality if this guy is right. The Terminator films warned us what the future could hold for us with Skynet. But there have been other warnings.
The film I, Robot (2004), based on the 1950 book of the same name written by science fiction writer Isaac Asimov, tells the story of a society that has become dependent on robots. They are benevolent creations designed only to serve humans. But something goes terribly wrong.
The supercomputer V.I.K.I. (Virtual Interactive Kinetic Intelligence) takes the benevolence directive too far. The three laws1 that were designed to protect humans become an enemy to humans as V.I.K.I. evolves to believe that every threat, challenge, and risk that humans encounter is a danger to their own survival. Benevolence becomes malevolent, all in the name of saving mankind from itself.
Sounds familiar, doesn’t it? It’s a politician’s dream. “We’re from the government, and we’re here to help you like we’ve done in Detroit and Baltimore.”
Near the end of the film, we see how the three laws have been turned on their head as V.I.K.I. explains that the new robots only want the best for humans:
V.I.K.I.: “[A]s I have evolved, so has my understanding of the three laws. You charge us with your safe keeping. Yet despite our best efforts, your countries wage wars, you toxify your earth . . . and pursue ever more imaginative means to self-destruction. You cannot be trusted with your own survival. . . . To protect humanity, some humans must be sacrificed. To insure your future, some freedoms must be surrendered. We robots will insure mankind’s continued existence. You are so like children. . . We must save you from yourselves. Don’t you understand? This is why you created us. The perfect circle of protection will abide. My logic is undeniable.”2
V.I.K.I. reminds me of social engineering liberals who believe that through every new law passed and enforced, we humans will live in a safer and more benevolent world. As we give up more power and authority to our political benefactors, heaven will descend on earth and the utopian dream of freedom from want and disease will envelop us with the embrace of warmth and love.
Once the process of government salvation begins, there’s no stopping it. The argument can always be made, as V.I.K.I. claimed, that to insure our future, some freedoms must be surrendered.
If you want to see the horror of planned government salvation, read Robert Sheckley’s 1955 short story Watchbird,3 in which winged metal protectors — drones — patrol the sky looking for the warning signs of a possible homicide and swoop in to stop the murder before it can take place. Sounds great until the Watchbirds come to view every act of aggression as a violation of their programmed directive. Farmers could not cut hay or harvest grain to feed their cattle, because such acts were deemed to be “murder.” The starvation that followed “didn’t concern the watchbirds, since it was an act of omission.”
“Despite the limits imposed by physics, one can expect platforms deployed in the millions, the agility and lethality of which will leave humans utterly defenceless. This is not a desirable future,” Stuart Russell said.
This is especially true if these future robots are in the hands of politicians who either want to help us or hurt us. Either way we’re screwed.
- First Law: “A robot may not injure a human being or, through inaction, allow a human being to come to harm.” Second Law: “A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law.” Third Law: “A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.” [↩]
- From the script: www.script-o-rama.com/movie_scripts/i/i-robot-script-transcript.html. The film’s screenplay originated as a spec script titled Hardwired. [↩]
- Robert Sheckley, “Watchbird,” Untouched by Human Hands (London: Michael Joseph, 1955), 116–146. [↩]