Self-Driving Cars
As today's technology constantly advances, self-driving cars have become an important issue in the world of driving. Questions arise such as: if self-driving cars are proven to be safer than manually driven cars, should we completely ban manual driving? And who will decide the decisions programmed into these self-driving cars?
Looking at it from a deontological point of view, one would argue that killing is completely unethical in any regard, so the car would be programmed with zero tolerance for causing a death. However, suppose a child runs out into the middle of the street while the car is driving at the speed limit: the child does not have the right to be on the road, but the car does. Should the car swerve and potentially kill its owner, or hit the child?
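To make that deadlock concrete, here is a minimal sketch, written as a hypothetical illustration rather than any real vehicle's software, of a deontological hard rule that simply forbids every action that kills someone. The Action type, option names, and outcomes below are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Action:
    name: str
    kills_someone: bool  # does this action directly cause a death?

def deontological_choices(actions):
    """Keep only the actions that kill no one: a pure hard-rule filter."""
    return [a for a in actions if not a.kills_someone]

# The child-in-the-road dilemma: both available actions may kill someone,
# so the hard rule filters out everything and offers no answer.
options = [
    Action("swerve into the barrier (may kill the owner)", kills_someone=True),
    Action("continue straight (may kill the child)", kills_someone=True),
]

allowed = deontological_choices(options)
print(allowed if allowed else "No permissible action: the rule alone cannot decide")
```

Because both options violate the rule, the filter returns nothing, which is exactly the deadlock the question above points at.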
Looking at it from a utilitarian point of view, one would argue that if a death were inevitable in a crash, the car should try to minimize the deadly impact: one death is better than ten.
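A utilitarian rule can be sketched just as briefly: pick whichever action is expected to cause the fewest deaths. Again, the option names and casualty figures below are hypothetical and only meant to illustrate the comparison, not describe any real system.

```python
def utilitarian_choice(options):
    """Return the option expected to cause the fewest deaths."""
    return min(options, key=lambda o: o["expected_deaths"])

# Hypothetical crash scenario: the numbers are invented for illustration only.
options = [
    {"name": "swerve toward the crowd", "expected_deaths": 10},
    {"name": "crash into the barrier",  "expected_deaths": 1},
]

best = utilitarian_choice(options)
print(f"Chosen action: {best['name']} ({best['expected_deaths']} expected death(s))")
```

Unlike the deontological filter, this rule always produces an answer, but only by putting numbers on lives, which is itself the controversial part.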