Tragically, we have seen the first fatal accident involving a self-driving car: a pedestrian in Tempe, Arizona has died after being hit by an Uber operating in autonomous mode. My thoughts are with the victim’s family, and with the Uber operator; it would have been a horrific experience.

I don’t know the facts of this case, but people are already opining that the victim was jay-walking.  

I sincerely hope commentators don’t simply line up around the clinical rights & wrongs of the road rules, as if a Self-Driving Car (SDC) shouldn’t be expected to cope with an errant human. I always thought the point of SDCs was that they’d work on real roads, without special signposts, beacons or machine-readable lane markings. Truly autonomous SDCs must adapt to the real world, where the rule is that people don’t always follow the rules.

As our cities fill with rule-bound robot vehicles, jay-walkers should not have to fear for their lives.  

Recently I wrote:

No algorithm is ever taken by surprise; it is humans who can recognise the unexpected. A computer can be programmed to fail safe (hopefully) if some input value exceeds a design limit, but logically, it cannot know what to do in circumstances its designers did not foresee. Algorithms cannot think (or indeed do anything at all) “outside the dots”.
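
To make that point concrete, here is a minimal sketch, in Python, of what “programmed to fail safe on a design limit” looks like. It is my own illustration, not code from any real SDC; the sensor, thresholds and decisions are all hypothetical.

```python
# Hypothetical sketch: fail-safe handling of a design limit.
# The sensor, thresholds and decisions are invented for illustration.

MAX_VALID_RANGE_M = 120.0  # designed upper bound on a range reading

def plan_braking(obstacle_range_m: float) -> str:
    """Map one (hypothetical) obstacle-range reading to a braking decision."""
    # A designer can anticipate out-of-range inputs and fail safe...
    if not (0.0 <= obstacle_range_m <= MAX_VALID_RANGE_M):
        return "EMERGENCY_STOP"  # input exceeds a design limit: fail safe

    # ...and enumerate the situations that were foreseen.
    if obstacle_range_m < 10.0:
        return "HARD_BRAKE"
    if obstacle_range_m < 30.0:
        return "SLOW"
    return "CRUISE"

# Every branch above was written in advance. A reading that is perfectly
# valid, but produced by a situation the designers never imagined, still
# flows through the same fixed logic; the program has no way to recognise
# that it is out of its depth.
```

The point is structural: every behaviour the program exhibits was enumerated before it left the factory, which is exactly why an unforeseen circumstance cannot trigger a response the designers never wrote.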