
Understanding the real drivers behind AV crashes is crucial if policymakers, manufacturers, and the public are to demand safer designs and realistic safety standards. As autonomous vehicles become more prevalent, transparent safety practices will determine whether society accepts or rejects the technology, a discussion that is especially timely amid regulatory battles between the U.S. and China.
I recently joined Anthony, Michael & Fred on the Center for Auto Safety podcast. Topics we discussed include:
• Waymo hitting a school child, and being caught dozens of times driving past stopped school buses. We get into what it really means for a crash to be unavoidable in the real world rather than in a narrow mathematical model, and what Waymo could be doing to avoid these situations but chooses not to do because it is comfortable taking the risk with other people’s kids. (And why not all stop signs are created equal.)
• The architecture report by The Autonomous on hardware approaches to managing redundancy for safe autonomous-vehicle computing systems.
• The irony of the US chasing China by trying to weaken automotive safety regulations while China is busy strengthening theirs.
• A mention of my book on Embodied AI Safety. Chapter 10 is about how the AV industry might rebuild the trust it is letting slip through its fingers.
  • Brief video explaining this topic here
  • A blog post on this topic here
More details are in the book, and we discuss it in this podcast episode. Short version: opacity and deflection stop working in the long term; transparency is required.
• And a lot more!
Listen to the full show here (free): https://rss.com/podcasts/autosafety/2521953/
