The first death of a "driver" in a self-driving car on public roads has just been reported. Such a death was eventually inevitable, so in that sense it is not much of a surprise. What is a surprise, though, is that the "driver", Joshua D. Brown, had spent 11 years as a Navy Seal. I've known a couple of Navy Seals, and I can think of ways they might die after retirement: in a skydiving accident, or a cave diving accident, or a climbing accident, or a motorcycle accident. A Seal might have a parachute fail to open properly, but you can dang well be sure he would have packed it himself. He might have a line break while climbing, but it would be one that he had chosen and inspected first. In other words, I would expect that if he died early, it would be because he was doing something active and placed too much trust in himself; otherwise he would be likely to die of a more humdrum cause, like too many cigarettes. He would know from experience not to put too much trust in technology, because he would have seen technology fail and would have been trained to adapt when that happens.
So when I read that a veteran Navy Seal died because he placed too much faith in a self-driving car, and that (according to the truck driver's account) he had decided a Harry Potter movie was a better use of his time and concentration than driving ... something just does not add up. No, I'm not claiming foul play and a cover-up; I'm just saying it does not add up.