Safety Features Can Trap Passengers

Updated: 2026.03.22 · 1 source
Self-driving cars programmed to stop for nearby people can be weaponized: an attacker who stands next to the vehicle can trap and threaten riders, especially when vendor policy refuses remote overrides. This creates a class of safety-abuse incidents distinct from ordinary vandalism, forcing a trade-off between passive safety logic and active intervention mechanisms. If robotaxis become common, regulators and companies will have to choose among redesigning safety behaviors, authorizing remote interventions, or accepting new public-safety risks in cities.

Sources

Trapped! Inside a Self-Driving Car During an Anti-Robot Attack
EditorDavid · 2026.03.22
A January attack on a Waymo vehicle in San Francisco: an assailant surrounded the car and threatened the passengers for roughly six minutes, while Waymo told riders it would not manually direct the car away as long as someone stood nearby.