Autonomous cars are on the brink of being everywhere. From Tesla to Google to Uber, big corporations are all chasing a piece of the action that autonomy promises to deliver. But the one thing none of them seems to have fully considered is the moral aspect of autonomous cars and how they should act in a life-or-death situation.
The age-old commandment “Thou Shalt Not Kill” does not apply in real life all of the time. If someone is about to wipe out an entire city, then it is quite reasonable to take them down. A good place to start when writing moral rules for autonomous robots would be Asimov’s Three Laws of Robotics (a rough sketch of how their precedence might work in code follows the list). These are:
- A robot may not injure a human being, or, through inaction, allow a human being to come to harm.
- A robot must obey orders given to it by human beings except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
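The laws form a strict hierarchy: a higher law always outranks the ones below it. The sketch below is a hypothetical illustration of that precedence, not any real manufacturer's decision system; all names (`Action`, `law_violations`, `choose`) are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Action:
    name: str
    injures_human: bool      # First Law: direct harm to a person
    allows_harm: bool        # First Law: harm through inaction
    disobeys_order: bool     # Second Law: conflicts with a human order
    damages_self: bool       # Third Law: endangers the robot itself

def law_violations(a: Action) -> tuple:
    # Lexicographic key: any First Law violation outweighs any number of
    # Second Law violations, which in turn outweigh Third Law violations.
    first = int(a.injures_human or a.allows_harm)
    second = int(a.disobeys_order)
    third = int(a.damages_self)
    return (first, second, third)

def choose(actions: list) -> Action:
    # Pick the candidate that violates the highest-priority law the least.
    return min(actions, key=law_violations)

# Example: swerving damages the car but spares a pedestrian, so the
# precedence ordering prefers it over staying the course.
options = [
    Action("stay_course", injures_human=True, allows_harm=False,
           disobeys_order=False, damages_self=False),
    Action("swerve", injures_human=False, allows_harm=False,
           disobeys_order=False, damages_self=True),
]
print(choose(options).name)  # -> "swerve"
```

Even this toy version shows where the hard questions start: real crashes rarely offer an option with zero First Law violations, which is exactly the moral gap the manufacturers have yet to address.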