Re: A robot law conference without Asimov's Laws?
The problem is that Asimov was an author, not a lawyer or philosopher (in the strictest sense), so the "laws" he created were a storytelling device. You'll note that many of the Robot stories involve robots "breaking" one or more of the Three Laws by finding loopholes. Even in the Will Smith movie, the AI extrapolated a "0th" law about not allowing humanity to come to harm, and used it to justify killing individual humans (in violation of the 1st law).
So in all honesty, I don't think Asimov's Three Laws are the right framework around which to establish, or even start discussing, serious ethical codes for the use of unmanned systems. They're certainly clever, and fun to consider, but they apply (in the stories) to intelligent systems with some agency to make their own decisions. That's a far cry from our discussion of whether drones controlled by humans sitting in control rooms in the Midwest can or cannot fire missiles. They're simply not laws that fit the situations we find ourselves in right now.
__________________
Mikell Taylor
Real-life robotics engineer
Mentor to team 5592, Far North Robotics
Back in the day:
President, Boston Regional Planning Committee
Mentor, team 2124
Captain, team 677