…two words: Killer robots!
Clearly, we’re all doomed. Skynet, here we come!
The Professor is wrong: it costs around $800-1,000 to build an unmanned GPS-autopilot drone. We’ve priced the parts, and a decent glider is needed (something that won’t change course or altitude much on its own).
Wow, only the military could watch the Terminator series and think: Good idea! What could possibly go wrong? (And ask that as a rhetorical question, rather than the deep and serious question it is.)
The excellent Primordial Blog just reported that these killer robots may be able to read your brainwaves too. Be scared. (But don’t let them know you’re scared.)
I have a low-end GPS unit (a Magellan RoadMate 1200). When I use it to go to one of the pre-programmed locations, it tells me I have arrived just as I cross the curb cut into the parking lot. I call it a guidance system.
The robots can’t do a worse job of deciding who to shoot than we humans do in a war.
In a keynote address to the Royal United Services Institute (RUSI), Professor Noel Sharkey, from the University’s Department of Computer Science, will express his concerns that we are beginning to see the first steps towards an international robot arms race. He will warn that it may not be long before robots become a standard terrorist weapon to replace the suicide bomber.
Well, at least they’ll be easier to identify at checkpoints.
The main point here is not to be afraid of incredible Terminator-style AI (robots that can truly think and decide to turn against us), but rather of robots capable of killing that rely on existing classification and image-processing methods.
Hell, your generic classification system may have 80% accuracy, but that’s still 2 dogs/children/trees/nuns/movie posters mowed down per 8 bad guys. So much of it is about context – for example, how would it react to a guy running towards it with a gun in his hand shouting “Save me: the insurgents are trying to kill me!”?
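The arithmetic behind that comment can be made explicit with a toy calculation. This is a deliberately crude sketch: it assumes (hypothetically) that "80% accuracy" means two of every ten engagements misidentify the target, and that each misidentification is a wrongly engaged non-combatant. The numbers are illustrative, not from the original post.

```python
# Toy back-of-envelope for the "80% accuracy" point above.
# Hypothetical assumption: every misclassification means a
# non-combatant is wrongly engaged -- the point is scale, not realism.
accuracy = 0.8
engagements = 10

expected_wrong = engagements * (1 - accuracy)
print(f"Per {engagements} engagements: "
      f"{engagements * accuracy:.0f} intended targets, "
      f"{expected_wrong:.0f} misidentified")
```

In other words, at that error rate the classifier's mistakes scale linearly with how often it is allowed to fire, which is exactly why context-blind automation is the worry here.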
On the other hand, using robots that can only shoot on remote human command could actually reduce civilian deaths: the operator is in no real danger (unlike a soldier), and so would hopefully be less likely to shoot without properly assessing the threat. Like Noel, I strongly disapprove of taking away this safeguard.
With decent control algorithms you could probably get away with a cheaper glider.