Sep 2, 2016

Elon Musk, Stephen Hawking and others call for regulation of AI and autonomous weapons

Artificial intelligence, robotics and self-learning machines are making incredible progress. But few people consider that if we can have self-driving cars, we can also have self-driving tanks. Robot armies would be a terrible, terrible thing, deciding on their own judgment whether to kill or not. And we don't really know how long we will retain full control over self-learning systems. So some smart minds have put together an open letter that is well worth reading, and not only if you are interested in AI or science fiction:

http://futureoflife.org/open-letter-autonomous-weapons/


Remember Isaac Asimov's Three Laws of Robotics? We need something like those.
(https://en.wikipedia.org/wiki/Laws_of_robotics)

  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey the orders given it by human beings, except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.