Moral Machines

In 2018, an Uber self-driving car under test in Tempe, Arizona was involved in a crash that unfortunately killed a pedestrian. Last week, the National Transportation Safety Board concluded that Uber's self-driving software was at fault (apart from various valid non-technical issues), as the autonomous software was not programmed to react to pedestrians crossing the street outside of designated crosswalks. This flaw (which Uber seems to have fixed now) raises a question about situations in which incorrectly programmed software can lead to even more severe crashes. This reminded me of Moral Machine, a project at the Massachusetts Institute of Technology that creates extreme scenarios (similar to the trolley problem) to understand human perception. The data collected points to the fact that every individual has a different […]