How Do Autonomous Cars Make the “Life or Death” Choice?

Autonomous cars are on the brink of being everywhere. From Tesla to Google to Uber, the big corporations all want a piece of the action that autonomy promises to deliver. But the one thing none of them seems to have settled is the moral side of autonomous cars: how should they act in a life-or-death situation?




Recently, Google’s director of engineering, Ray Kurzweil, was asked how a car that is about to hit another vehicle carrying three people should decide on the best course of action. Should it swerve to avoid the collision, but hit three children on the sidewalk in the process? In short, the answer was pretty much that they don’t know.

The age-old commandment “Thou Shalt Not Kill” does not apply in real life all of the time. If someone is about to wipe out an entire city, then it is quite reasonable to take them down. A good place to start for moral rules for autonomous robots would be Asimov’s Three Laws of Robotics; a toy sketch of how their priority ordering might look in code follows the list. The laws are:

  • A robot may not injure a human being, or, through inaction, allow a human being to come to harm.
  • A robot must obey orders given to it by human beings except where such orders would conflict with the First Law.
  • A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
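To make the strict priority ordering concrete, here is a minimal sketch in Python. Everything in it is invented for illustration: the `Action` record, its boolean fields, and the `permitted` check are hypothetical, and no real robot or vehicle stack encodes ethics this way.

```python
from dataclasses import dataclass

# Hypothetical description of a candidate action. The fields are a
# crude simplification: a real system would reason over uncertain
# predictions, not clean booleans.
@dataclass
class Action:
    name: str
    harms_human: bool      # would this action injure a person?
    violates_order: bool   # would it disobey a human instruction?
    endangers_robot: bool  # would it put the robot itself at risk?

def permitted(action: Action) -> bool:
    """Check the Three Laws in strict priority order."""
    if action.harms_human:       # First Law outranks everything
        return False
    if action.violates_order:    # Second Law: obey humans, checked only
        return False             # after the First Law is satisfied
    # Third Law: self-preservation matters least, so an action that
    # merely endangers the robot is still permitted at this point.
    return True

brake = Action("emergency brake", harms_human=False,
               violates_order=False, endangers_robot=True)
print(permitted(brake))  # True: risking the robot alone forbids nothing
```

Even this toy version shows why the laws are a starting point rather than a solution: in the three-people-versus-three-children dilemma above, every available action harms a human, and the First Law rejects them all.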




The same question about morality was then put to Andrew Chatham, a principal engineer at Google. He doesn’t feel this is a problem that would come up very often, if at all. His job as an engineer is to prevent the car from getting into that situation in the first place; if the car ever finds itself in such a dilemma, then he has in some way failed at his job. In any dangerous situation, almost 100 percent of the time, an autonomous car will simply apply the brakes.

Researchers have recently created a Moral Machine to see what humans would do in certain driverless-car dilemmas. Whether it will be used to improve the morals of autonomous cars remains to be seen.
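Chatham’s brake-first default can be pictured as a tie-breaking rule inside a maneuver selector. The sketch below is purely hypothetical: the `Maneuver` type, the risk estimates, and the tolerance are invented for illustration and are not drawn from Google’s software.

```python
from typing import NamedTuple

class Maneuver(NamedTuple):
    name: str
    collision_risk: float  # estimated collision probability (made up)

def choose_maneuver(candidates: list[Maneuver]) -> Maneuver:
    """Pick the lowest-risk option, but prefer braking whenever it is
    close to the safest choice (the 0.05 tolerance is arbitrary)."""
    safest = min(c.collision_risk for c in candidates)
    for c in candidates:
        if c.name == "brake" and c.collision_risk <= safest + 0.05:
            return c
    return min(candidates, key=lambda c: c.collision_risk)

options = [Maneuver("brake", 0.10),
           Maneuver("swerve_left", 0.08),
           Maneuver("swerve_right", 0.30)]
print(choose_maneuver(options).name)  # "brake", despite a marginally safer swerve
```

The bias toward braking reflects the article’s claim that, in danger, the car applies the brakes almost 100 percent of the time; anything subtler would drag the engineer back into exactly the dilemmas Chatham hopes to avoid.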

