Saturday, August 1, 2015

Why Killer Robots Are Not the Problem

[Image: drone, via Wikimedia]
Humanity is on the cusp of a new age of warfare. For several millennia, our species was ruled by those with the most swords, spears, and arrows, and for centuries after that, we lived in the age of gunpowder. Now we're entering the age of drones and automated war machines. It may be an era unlike any we've witnessed before: a world where the nations with the most military might won't be the ones with the most bombs, soldiers, and jets, but the ones with the most robots.


Unless, of course, the most influential people of our time can stop this new arms race before it begins.

A global arms race to make artificial-intelligence-based autonomous weapons is almost sure to occur unless nations can ban the development of such weapons, several scientists warn.

Billionaire entrepreneur Elon Musk, physicist Stephen Hawking, and other tech luminaries have signed an open letter warning against the dangers of starting a global arms race in artificial intelligence (AI) weapons technology unless the United Nations supports a ban on weapons that humans "have no meaningful control over."

The letter, which was issued by the Future of Life Institute, is being presented today (July 27) at the International Joint Conference on Artificial Intelligence in Buenos Aires, Argentina.

"The key question for humanity today is whether to start a global AI arms race or to prevent it from starting. If any major military power pushes ahead with AI weapon development, a global arms race is virtually inevitable, and the endpoint of this technological trajectory is obvious: autonomous weapons will become the Kalashnikovs of tomorrow."
I have to admit, I was taken aback by their collective statement. I wasn't surprised that they said it; after all, people like Elon Musk and Stephen Hawking have voiced their concerns about AI before. What surprised me was that the heavy hitters of science, technology, and business don't seem to understand something very fundamental about technology.

There's a reason why phrases like "the cat's out of the bag" or "Pandora's box" are so often uttered when a new technology is being developed. You can't really stop technological progress, even when it's moving in a direction most people don't want it to go. You can refuse to contribute to that technology on ethical grounds, and you'd be well within your rights to do so, but if not you, then somebody else will always take your place.

The point is, once something is invented, it can't be "uninvented," for lack of a better word. It will run its course whether we like it or not. The reason can be seen in every technology that has ever been developed. If most people refuse to use or develop a technology, the few who ignore them and foster it anyway (and there will always be someone who does) will have a devastating advantage over the Luddites. Humans have a tendency to throw their morals out the window as soon as someone else achieves a major advantage over them, especially if that advantage comes in the form of a weapon. So the holdouts' noble stand against that technology will soon dissipate in the face of reality. They will have to either adapt or die.

Which brings me back to the warning issued by Elon Musk and Stephen Hawking. You'd think that the smartest and most ambitious people on the planet would understand this inescapable truth underlying technology, but they don't. Or rather, they do, but they don't want to apply that truth to killer robots, and who could blame them? I know I wouldn't. I'm deeply concerned about killer robots too, so I understand where they're coming from. However, my fear of these machines is a little different from theirs.

What worries me most is not a world filled with killer robots, but a world where only the government can have them. After all, a world where everyone has access to this technology wouldn't be all that different from the world we live in now. When they said that "autonomous weapons will become the Kalashnikovs of tomorrow," they weren't wrong. If most Americans had the right and the means to own a killer robot, that would be only one step removed from how it is now, where most Americans have the right and the means to own a firearm.

Which is why our government is very busy building automated war machines, while trying their best to prevent the rest of us from having them. That's the real reason the government is hassling the teenager who recently built that gun-firing drone. They don't want any competition. If they could have all the guns to themselves, they would, but they can't. So instead, they're making sure they gain a competitive edge with this technology before it gets off the ground, while ensuring that we never get hold of these weapons.

And that's the real problem with killer robots. Who's going to have access to them?

At the end of the day, these devices are just tools, not unlike the ones we have now. Admittedly, they are pretty horrifying, but you know what's scarier than killer robots? A world where only the most powerful people have them. A world where governments are capable of cheaply killing millions of people without facing any repercussions, because who would stop them?

Rest assured, no amount of pleading or begging at the UN is going to stop governments from building these machines. And even if a ban did succeed, someone out there would keep fostering this technology in the shadows, and if they pulled it off, the world would have to bow to their whims or face the automated consequences. The arms race would begin again, posthaste.

In other words, the cat's out of the bag and Pandora's box has been opened.

