What if autonomous cars weren’t allowed to optimize for the driver but instead were more broadly cognizant of the cost any single action imposes on society at large? Would self-driving cars be selfish or unselfish, optimizing for the efficiency of the driver or for the impact on other cars and pedestrians? While this question doesn’t carry quite the same moral weight as the trolley problem, it’s one that algorithm designers and regulators should consider.
For example, should a self-driving car be allowed to make a left turn across traffic, an action that typically delays everyone behind it, or should it make a series of right turns in order to cross a busy street? In a future where EVERY car is autonomous, the cars could simply communicate with one another and switch lanes at speed to route around the obstruction, but we’re due for many years of hybrid traffic on the road.
If there are a dozen people waiting to cross the street and only a single car approaching the intersection, should the car be required to wait while the pedestrian mass moves forward? A three-minute delay for one driver beats a cumulative 36-minute delay for the group, right?
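To make that arithmetic concrete, here’s a minimal sketch of the comparison as code. The function names, the numbers, and the simple “person-minutes of delay” cost metric are my own illustrative assumptions, not a real traffic model or anything an actual vehicle uses.

```python
# A minimal sketch of the utilitarian yield decision described above.
# All values and the cost metric (person-minutes of delay) are illustrative.

def total_delay_minutes(people_delayed: int, delay_per_person: float) -> float:
    """Cumulative delay, in person-minutes, if this group has to wait."""
    return people_delayed * delay_per_person

def car_should_yield(num_pedestrians: int, pedestrian_wait: float,
                     driver_wait: float) -> bool:
    """Yield when the pedestrians' cumulative delay exceeds the driver's delay."""
    group_cost = total_delay_minutes(num_pedestrians, pedestrian_wait)
    driver_cost = total_delay_minutes(1, driver_wait)
    return group_cost > driver_cost

# The scenario above: a dozen pedestrians vs. one driver, ~3 minutes either way.
print(car_should_yield(num_pedestrians=12, pedestrian_wait=3.0, driver_wait=3.0))
# -> True: 36 cumulative person-minutes outweighs the driver's 3 minutes.
```

Of course, a real system would weigh far more than headcounts (safety margins, traffic flow, emergency vehicles), but the sketch shows how a “social cost” objective differs from optimizing for the driver alone.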
Has anyone seen good studies on what an optimized traffic system looks like, and what the associated decision trees look like if you’re maximizing overall efficiency?