Starting the day by poking the fanbois: https://lnkd.in/eUHjtfE2.
Most of the population (except systems engineers) believes that automation always works perfectly. It is an idea propagated by Hollywood movies. I always show this video to my students when discussing automation and safety: https://youtu.be/n_1apYo6-Ow?si=N8xq_D6nfMfEvqLZ Automation only works as well as the mix of sensors, processors, software and actuators.
I don't trust the currently available autonomous cars. As a professor specializing in reliability engineering from a systems perspective, my hesitation arises partly because I've never been in a car accident. Given the potential for companies to misrepresent their safety numbers in their own interest, I don't want to increase my likelihood of experiencing an accident by relying on autonomous driving functions. It's crucial for these companies to make their decision-making processes transparent and explainable, even though achieving this may be challenging.
"...'move fast and break things.' Both bro culture and a disruptive mindset, as she sees it, incentivize companies to gloss over safety risks." It is a cornucopia of biases: these "tech bros" exhibit overconfidence bias in their abilities, leading to action bias where immediate implementations are made; then they ask for regulators to stop the competition, as OpenAI did. Confirmation bias causes them to focus only on supporting evidence, and the Dunning-Kruger effect makes them unaware of the complexities they're overlooking. There are more, but it all boils down to oversimplification and to thinking that technology is the solution to all societal problems. In Croatia, the government is showering robotaxi projects with money in the capital while it can't get waste disposal right. But that is more about the EU's desperation to catch up to the USA and China in AI than about local political (in)competence, which is another story. Continue with the great job.
I found it interesting in the BMW Group article by Dr. Sebastian Osswald on their new L3 car when he commented that you are allowed to "not pay attention". If you are allowed to "not pay attention", how can BMW Group possibly verify that the driver is ready to take back control in an emergency, and is not, for instance, sleeping, or in the back seat, Tesla-style? https://www.linkedin.com/posts/osswald_bmw-level3-activity-7112882713421836289-V2jT?utm_source=share&utm_medium=member_android
I've said before that if engineers have to be licensed by the state to design roads, then engineers need to be licensed by a government agency to build self-driving cars. Just as poor road design that kills someone lets us hold the licensee accountable, so should a self-driving car that kills someone.
I still find it astounding that these companies took the NHTSA data for "94% driver error," dumped all the caveats, arbitrarily changed it to "94% human error" and yet so many folks still hang on every word they say. Just goes to show how incurious most people are.
Rogers is incorrect to state "Until robot cars have traveled for hundreds of millions of miles, there's no way to get a statistically significant, unequivocal conclusion."
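For context on why the "hundreds of millions of miles" figure is debated, here is a back-of-envelope sketch of my own (not from the thread), using the rule of three: with zero observed events over N miles, the one-sided 95% upper confidence bound on the event rate is roughly 3/N. The mileage needed depends heavily on which event you count; the baseline rates below are illustrative ballpark figures, not official statistics.

```python
def miles_for_rule_of_three(baseline_rate_per_mile: float) -> float:
    """Miles of event-free driving needed before the 95% upper
    confidence bound on the event rate (rule of three: ~3/N)
    drops below a given human baseline rate."""
    return 3.0 / baseline_rate_per_mile

# Illustrative assumed baselines (ballpark, not official figures):
fatality_rate = 1.1 / 100_000_000  # ~1.1 fatalities per 100M miles
crash_rate = 1.0 / 500_000         # ~1 police-reported crash per 500k miles

print(f"{miles_for_rule_of_three(fatality_rate):,.0f}")  # hundreds of millions
print(f"{miles_for_rule_of_three(crash_rate):,.0f}")     # low millions
```

The point of the sketch: "hundreds of millions of miles" applies if fatalities are the yardstick, but far fewer miles suffice if you compare against more frequent surrogate events such as reported crashes, which is one reason the blanket claim can be disputed.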
"I think the military has made a lot of strides, but I do think that what's happening in these Silicon Valley companies is just a reminder that we haven't come as far in our society as I thought we would have." Unfortunately, an accurate observation. I hear regularly from female and non-binary professionals how hard it is to work in tech. We all need to call this behavior out; we need many more Missy Cummingses.
Software Engineer for Commerce Bank
I'm still really surprised and confused that self-driving cars got approved for public roads at all. And how does that even work with insurance agencies if you're not technically driving the car when it crashes? If the driver is always accountable, that really defeats the purpose of the tech, as you might as well be driving yourself.