I understand what you are saying, but I disagree.
If (and it’s a very big if) there were an incident caused by the automatic system, there would understandably be public concern. However, I would expect the public to be reasonably logical and to set it in the context of the big picture.
As far as I know, there has never been a fatal accident caused by an ATO system on any railway worldwide. If I’m wrong, someone will no doubt correct me. Similarly, I’m not aware of any fatal incidents caused by autoland on airliners. Back to railways: there have been no fatalities caused by MCB-OD level crossings. In all three of these areas of automation, there is plenty of history showing that keeping these things under manual control causes incidents and fatalities. I would expect the public to ask what had gone wrong, but to set it in the context of the much, much better safety record of automation compared with manual operation.
Sticking with a hypothetical railway example - if one day an SSI interlocking somehow causes a collision, would the public demand all SSI interlockings be switched off and we go back to men with flags?
OK, I didn't realise there has never been a fatal accident worldwide using an ATO system thus far. That certainly speaks exceptionally well for its track record.
Obviously I don't understand the ins and outs of the technical systems involved in ATO in the sense being described here. I understand from the posts that there is some sort of failsafe backup between the ATO and the driver, which means all failures revert to the safe mode, i.e. the train will stop and/or the movement authority essentially goes to red. Which is great.
OK, good points about driverless cars operating on unrestricted, open public infrastructure, whereas the railway is restricted to a high extent. And if someone went onto the line at a Thameslink station and the train hit them in ATO mode, as someone stated, the railway's position is that a trespasser shouldn't be there in the first place, so the liability falls on them. Which is totally fair enough, and the public accept this.
But where that changes is where the public have legitimate access to the infrastructure: say, people walking along the back of a bay platform. If an ATO train didn't stop in time, the way that Voyager didn't (granted, under manual control), as happened at King's Cross a few years ago, and endangered or hurt anyone at the rear of the bay, well, that would be serious, wouldn't it? Yet when a human driver did exactly that, as in the two incidents above, it could be dismissed as driver error rather than any fault of the infrastructure, the system, the TOC or NWR.
Now, in the extremely unlikely event that ATO braked too late for a bay platform, or the driver realised and braked but contact was inevitable, I think the story would hit the news if it was found that the computer had failed to stop correctly. And the public would judge the computer differently from how they judged a human driver who did the same. This is the point I am making.
Also, take the scenario where a train failed to brake in time, or at a high enough rate, or braked only at the last minute, and was too fast for a curve or a set of points, and this led to any sort of accident at all. Again, I don't think it would be treated like the tram derailment, where driver error was accepted as a fair explanation. It would hit the media that the computer drove the train too fast into the bend, or whatever, and even if the driver realised but couldn't prevent the accident, the public would say no to the computer, in my opinion.
Some have said they think the public would be understanding and accepting of risk to a certain degree, as we are with anything human-controlled. I'm not so sure, especially in the UK. I mean, look at the story where a certain model of washing machine spontaneously went bang, blowing a hole in the back, and this happened to a couple of households in the country. It went national. The whole model was recalled, and people were worrying if they had any sort of washer from that brand, even if it wasn't the model concerned.
Same with the Uber driverless cab. Thousands of car accidents happen worldwide every hour of the day, yet one serious accident involving a driverless cab goes viral worldwide, and rightly so, with huge worry and uproar. Because we accept that a human is allowed to make a critical mistake, but we don't accept that a computer can take overall charge of our critical safety and cause the death of an innocent person who was unsuspecting of the risk.
The slightest safety-related technical fault and we recall the whole product line in the UK, with word spreading via the media straight away. And I think the same principle would apply to a computer-controlled train if anyone who wasn't putting themselves directly in harm's way got hurt.