Content Warning: Graphic self-driving car accident
October 5, 2023 9:29 PM

This post contains the graphic description of a person recently struck by a self-driving car in San Francisco. Link

Driver Strikes Pedestrian Near Fifth and Market; Cruise Car Subsequently Runs Her Over and Stops On Top Of Her

A female pedestrian who was allegedly in the middle of Fifth Street when the traffic light at Market Street turned green was struck by a human driver's car and subsequently knocked into the path of a Cruise autonomous vehicle (AV) driving alongside it. The Cruise vehicle, apparently sensing something under its tires after braking hard, stopped and turned its hazards on. Meanwhile, the car was reportedly on top of the woman's leg as she screamed in pain, according to witnesses who spoke to the Chronicle. The car's rear tire was still on the woman's leg when first responders arrived, and after the AV was disabled by Cruise, the jaws of life were used to lift the car off the woman.

A delivery person on a bike, Austin Tutone, told the Chronicle that he ran to the woman's aid. "I told her, 'The ambulance is coming' and that she’d be okay. She was just screaming," Tutone said.

According to a statement from SFFD Captain Justin Schorr, the woman was transported to SF General with "multiple traumatic injuries," and she was listed in critical condition as of Tuesday morning.
posted by AlSweigart (133 comments total) 5 users marked this as a favorite
 
after the AV was disabled by Cruise

I can only imagine that the FD had to wait on hold with fucking customer support. What an awful situation.
posted by They sucked his brains out! at 10:10 PM on October 5, 2023 [14 favorites]


I will not drive the streets of SF anyway, but this is even worse.
posted by jenfullmoon at 10:14 PM on October 5, 2023 [1 favorite]


Thanks for the warning, there's no way I'm clicking that link.
posted by JHarris at 10:18 PM on October 5, 2023 [10 favorites]


Per the link: Cruise put out a statement saying, "A human-driven vehicle struck a pedestrian while traveling in the lane immediately to the left of a Cruise AV. The initial impact was severe and launched the pedestrian directly in front of the AV. The AV then braked aggressively to minimize the impact. The driver of the other vehicle fled the scene, and at the request of the police the AV was kept in place. Our heartfelt concern and focus is the wellbeing of the person who was injured and we are actively working with police to help identify the responsible driver."

Cruise is blaming the cops for keeping its AV on top of the victim?
posted by Iris Gambol at 10:33 PM on October 5, 2023 [5 favorites]


But unquestionably, the image of an AV with its hazard lights on, on top of a screaming woman, is going to be hard for the industry to shake.

The solution is simple. [Some exceptions may apply.] Heavy motor vehicles do not belong in San Francisco.
posted by aniola at 10:47 PM on October 5, 2023 [30 favorites]


I hope her recovery is swift, and that she doesn't catch covid from the hospital.
posted by aniola at 10:48 PM on October 5, 2023 [8 favorites]


If we're going to have these fucking things, as a baseline they need to be able to be taken over and controlled remotely and immediately by authorities when there is an accident.
posted by They sucked his brains out! at 11:24 PM on October 5, 2023 [18 favorites]


The title of this post is misleading; this is not a self-driving car accident. Computers do not do things by accident; they do things because they are programmed to do them. Cruise has 2000 employees, and GM spent $2 billion last year alone on developing the code that ran in this car and that determined that the best course of action under these circumstances was to pin an injured woman to the ground, and crush her leg with a ton of pressure, ignoring her screams of pain, until both Cruise and first responders intervened.

Now I don't think that a team of software developers had a feature card they worked on to implement this specifically. Certainly I wouldn't argue for this as a feature. But, on the other hand, I would never write software to be used to control multi-ton pieces of heavy equipment and then release it in a completely uncontrolled environment filled with unsuspecting members of the public without making damn hell sure it wouldn't do something like this.
posted by Superilla at 11:37 PM on October 5, 2023 [35 favorites]


This is kind of a weird framing given that the woman was hit by a human-driven car first? If that one had been automated as well she likely would have been hit by zero cars and she'd be just fine instead of in critical condition.
posted by ethand at 11:44 PM on October 5, 2023 [40 favorites]


Something like 10-15 years ago, I remember yelling at Masshole drivers trying to run me over when I was crossing with the light, "I can't wait until self-driving cars make it so you can't do that anymore!" (Obviously with a lot more curse words.)

How incredibly naive I was.

Almost four years ago, just before the pandemic, I took BART into San Francisco to ride down the newly-opened Better Market Street, which banned (private) cars in favor of emphasizing a public transit corridor, with hopes of further improvements for people walking, rolling, biking down the key commercial corridor. The very same Market Street where this woman was just severely injured due to ... a car crash.
Cruise CEO Kyle Vogt said recently at TechCrunch Disrupt that "we cannot expect perfection" from self-driving cars, and that if too much political resistance continues, the company could just leave SF altogether.
Promise?
posted by Pandora Kouti at 11:49 PM on October 5, 2023 [20 favorites]


From browsing certain video subreddits, I've seen that one error drivers easily make in their panic to escape the situation is driving forward or backward too quickly while the person is still under their vehicle, causing catastrophic injuries. One case at our local school involved a driver in an SUV hitting a schoolkid at low speed and knocking them down - in her panic, the driver meant to slam on the brakes but hit the accelerator instead, and the car leapt forward and smashed into the kid.

Depending on the angle / tilt of the vehicle, a stationary Bolt might exert 300kg of weight through that one tyre onto a leg. I don't think that they'd be able to prove that failing to move the vehicle off quickly enough worsened the injuries suffered. It's certainly a unique case... the analogous advice would be: don't pull out an impaling object until the professionals get there, or they'll bleed out.
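
As a rough back-of-the-envelope check of that 300kg figure (the curb weight and weight split below are my assumptions, not anything from Cruise or GM):

# Rough static load on one rear tyre of a Chevy Bolt (assumed figures)
curb_weight_kg = 1600     # approximate Bolt EV curb weight
rear_axle_share = 0.45    # assumed split for a front-heavy EV
rear_wheel_load_kg = curb_weight_kg * rear_axle_share / 2
print(round(rear_wheel_load_kg))  # ~360 kg static; tilt and load shift could move this either way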

A Cruise AV has never contacted a pedestrian through the fleet's million-plus miles of testing, though stationary Cruise AVs have been crashed into by bicyclists, skateboarders and electric scooters. Their incident log says that 94% of contact incidents were the fault of the other party, which bodes well for their safety record versus a regular driver, where you'd think an average driver would have a 50/50 chance of being at fault.
posted by xdvesper at 11:50 PM on October 5, 2023 [14 favorites]


Ah yes the technology we are nowhere close to being able to create to solve a problem that doesn't exist.
posted by GallonOfAlan at 12:10 AM on October 6, 2023 [4 favorites]


"The driver of the other vehicle fled the scene, and at the request of the police the AV was kept in place."

I would really like to understand more about the decision tree here.
posted by Tell Me No Lies at 12:22 AM on October 6, 2023 [11 favorites]


My interpretation of that is just that the police asked Cruise to keep the AV on the scene, not on top of the victim. By the time police were making that decision, the jaws of life had likely already lifted the car.

I'm not at all on the autonomous bandwagon, but it's not clear this outcome was worse than if there had been a human driver. The AV detected the pedestrian and slammed on the brakes - the same thing a human driver would have done. It will be interesting to know if that reaction was faster or slower than a human; I guess we'll find that out as the investigation progresses.
posted by qxntpqbbbqxl at 12:35 AM on October 6, 2023 [9 favorites]


>interesting to know if that reaction was faster or slower than a human

And to be fair, most newer cars have automatic emergency braking, which is trained explicitly for scenarios like "child runs out directly into your path from between two parked cars so you couldn't possibly have seen them until the last millisecond".

My feeling after driving these cars is that one possible future might be like the Culture series. A benevolent AI keeps humans in check. There is no crime and no need for prisons because the AI will simply gently prevent you from committing the crime. You only have an illusion of choice.

I've made this comment before: so many of these systems in their present state are training me to drive like the AI, rather than us training the AI to drive like a human. The system tells me when to slow down, to stay in my lane, to maintain a safe distance, to indicate before changing lanes, to get off the road when it detects I'm tired or dozing. Any deviation from optimal driving behavior and the system pings me. And if I want to deviate from the optimal to cause a crash? The system can physically override me - automatically braking if I'm about to hit another car or pedestrian.
posted by xdvesper at 12:57 AM on October 6, 2023 [8 favorites]


Now we know something (Cruise) self-driving cars can't do that people easily can: notice when the car next to you runs someone over, and brake before they land in front of you so you won't run them over. That's not even expecting perfection.
posted by rhizome at 1:13 AM on October 6, 2023 [7 favorites]


If pretty much all city vehicles were autonomous, and the maximum speed limit they could drive would be something like 20 or 30 km/h, I could see them being safer than what we have now.

They’d probably need to be integrated with infrastructure that watches over the street for pedestrians, dogs, cats and anything else — air traffic control for the streets. The city should run that infrastructure, which should be able to stop vehicles when needed and should be programmed to give much more freedom to pedestrians, cyclists and others on the road and sidewalk. Cars should have paint that is highly visible to both machines and humans.

Street infrastructure would probably need to be updated to remove ambiguity. Streets should be certified to meet standards that allow for autonomous driving. Cars need to be tested against a plethora of scenarios that can happen on those streets.

Vehicles should be lighter, so they cause less catastrophic damage on impact with vehicles and humans — and to the environment. A self-driving four wheeled covered electric vehicle that’s basically a grown-up bicycle? Fine, let’s do that.

It wouldn’t be perfect, but it’d be a start. You could have the "go anywhere" mobility that cars promise, with far less of the dangerous baggage they come with.

Autonomous (and human-driven) vehicles are currently being designed with an ideology of creating a sort of wild west apocalypse vehicle. It’s short-sighted and simply not good enough.
posted by UN at 1:22 AM on October 6, 2023 [9 favorites]


This is kind of a weird framing given that the woman was hit by a human-driven car first?

Yeah, it was a hit-and-run accident. But you're only supposed to read this and shake your fist at self-driving cars, not consider the awful human drivers they're intended to replace.
posted by pracowity at 2:20 AM on October 6, 2023 [28 favorites]


This is kind of a weird framing given that the woman was hit by a human-driven car first?

Isn't it weirder to imply that it's somehow more ok that self-driving vehicles can't deal with unexpected, emergent situations? I mean, people don't all drive well, but adding automatons that literally park on casualties is helping anyone how?

Oh, and isn't it great that profit-driven technology gets to be tested live on our public streets. I wonder how this will impact the citizen's future health insurance?
posted by onebuttonmonkey at 2:30 AM on October 6, 2023 [20 favorites]


> This is kind of a weird framing given that the woman was hit by a human-driven car first?

Yeah, no.

We're all well aware of the issues with human drivers.

The specific issue here is that "autonomous" cars are sent out into the real world with the idea that they are actually autonomous and can deal with all the situations they will find themselves in.

But it is more and more clear that this is not true and will never be true. There will always be a certain percentage of situations where the best they can do is shut down and wait for a human to come along and tell them what to do.

Strange and very unusual situations are actually (and somewhat paradoxically) very common in everyday life. Something that happens to a given person only one day in a million still happens about 330 times every day in the U.S. and some 8,000 times daily across the earth.
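
A quick sanity check of that arithmetic (the population figures below are rough assumptions):

# "One-in-a-million-days" events, scaled up to populations
us_population = 330_000_000        # approximate
world_population = 8_000_000_000   # approximate
p_per_person_per_day = 1 / 1_000_000
print(us_population * p_per_person_per_day)     # ~330 occurrences per day in the US
print(world_population * p_per_person_per_day)  # ~8000 occurrences per day worldwide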

FWIW, about a hundred people are killed by automobiles daily in the U.S. We all know what bad drivers humans are, etc etc etc, but somehow despite that they do a pretty decent job of avoiding killing each other.

Point is, you might think it's fairly easy for autonomous cars to drive more safely than humans, but the competition is actually far stiffer than it looks.

> Their incident log says that 94% of contact incidents were the fault of the other party

It's worth pointing out three things here:

#1. This information comes from "THEIR incident logs". Literally their own, proprietary information. A very, very small proportion of which they have deigned to release publicly. One of the major issues with the self-driving car companies is they are very tight-fisted with their data. They release it very selectively. We actually don't even have the data to determine whether their evaluation is correct or not. We literally don't know how safe or unsafe autonomous vehicles are. The data to demonstrate that - or not - is all proprietary and secret.

#2. When police (highly biased towards the driver/motorist point of view) evaluated who was at fault in motor vehicle driver-bicyclist collisions, they found over 90% to be the fault of the bicyclists. When independent evaluators using more objective criteria evaluated the same incidents, they found the fault was closer to 50/50.

Could some similar bias be at fault here?

#3. These self-driving cars are literally driving on easy mode. They are on the easiest, most predictable streets with the slowest speed limits, all mapped to the nines, etc etc etc. The remaining 99.99999% of the streets, roads, and highways in the world are much, much, much, much, MUCH harder for autonomous vehicles to handle.

Cars that have a literally perfect safety record in these carefully curated small areas may well have a safety record 1000X worse than humans when let loose on all roads and streets. The "real world" is that much more difficult - or maybe even more so.

And it presents ambiguous and difficult situations that much more often.

Yet, these vehicles are getting stuck and stymied very regularly even in the "easy mode" areas.
posted by flug at 3:21 AM on October 6, 2023 [37 favorites]


What if instead of millions of self-driving cars each carrying a single passenger we ended up with a smaller fleet of larger vehicles driven by professional drivers that could help people get where they are going?
posted by yonega at 3:29 AM on October 6, 2023 [55 favorites]


I will say, I am not even against self-driving cars in principle.

But the way we are approaching it now is astonishingly cavalier and in disregard of public safety and of any possible form of best practices for a technology literally capable of dealing death. Tech companies want to experiment with human lives and the built environment in the same way they do when building a "killer app" like YouTube or Twitter.

They continually push for a federal law pre-empting state and local regulations on self-driving cars that would essentially make the AV industry self-regulating and exempt from any meaningful oversight, or even from releasing data and information.

They have come very, very close to passing such a federal law at least twice.

I'm not kidding - here are just a few specific examples:

We tend to just get into a "technology - wow!" mindset and give all our thought to some indefinite time in the future when X might happen. Because X would be really cool if it ever came to pass.

Would we allow this sort of philosophy towards, say, testing drugs? "Hey, I'm going to randomly experiment with this series of mostly unknown substances on millions of unwitting victims. I think it won't hurt (most of) them too badly and if it pans out we might help billions of people sometime in the future!!!11!! I think it might even be safer than all the drugs we have now! If it pans out, of course! Now, let's pass a federal framework for me to proceed with my experiments that provides the absolute minimum oversight and allows me to keep absolutely all of my experimental data, including that dealing with injuries and deaths, secret."

Who would ever, ever allow that type of experiment to move forward?

Yet - here we are.
posted by flug at 3:46 AM on October 6, 2023 [20 favorites]


This information comes from "THEIR incident logs". Literally their own, proprietary information. A very, very small proportion of which they have deigned to release publicly. One of the major issues with the self-driving car companies is they are very tight-fisted with their data. They release it very selectively.

Always remember the above any time someone asserts these companies' claims like, "94% of contact incidents were the fault of the other party."

We have little reason to take their claims at face value, especially given that reporting requirements are so lax. Anytime one of these experimental vehicles is involved in a "contact incident" on a public street, the data from that car should be released to the public.
posted by mediareport at 4:13 AM on October 6, 2023 [8 favorites]


I think 94% of contacts are the fault of the other party because the other party either failed to take reasonable evasive action or made sudden unpredictable motions instead of continuing and trusting the AI.
posted by Phanx at 4:33 AM on October 6, 2023 [4 favorites]


I wish self-driving cars could come onto the market safely, if only for the sole selfish reason I fucking hate driving the 401 to visit my in-laws who live outside the GTA. It makes me nervous and unhappy.
posted by Kitteh at 4:56 AM on October 6, 2023 [3 favorites]


Yeah, it was a hit-and-run accident.

Crash, not accident.

Continued use of the word "accident" implies that these events are outside human influence or control. In reality, they are predictable results of specific actions. Since we can identify the causes of crashes, we can take action to alter the effect and avoid collisions.

The concept of "accident" works against bringing all appropriate resources to bear on the enormous problem of highway collisions. Use of "accident" fosters the idea that the resulting damage and injuries are unavoidable.
-Source
posted by entropone at 5:14 AM on October 6, 2023 [18 favorites]


A human driver could have moved the car off the victim, or anyone at the scene, including fire fighters, could have driven the car off the victim, presumably. This sounds like an awful and messy accident and I'm also at a loss to know why the jaws of life were needed, but holy shit.

These are supposed to be easy streets at low speed. I do not want this company making decisions for these cars on the highway. Or anywhere, really. Enough pedestrians and cyclists die and are injured in my city already. Fuck that shit.
posted by tiny frying pan at 5:36 AM on October 6, 2023 [9 favorites]


Not aimed at any person in particular, or even this thread, but in general, I actually don't get this "post-truth" world where we can label any data that doesn't agree with our worldview as false and made-up. It's got exactly the same legitimacy as antivaxxers saying the studies showing Pfizer vaccine efficacy and safety profiles were made up by big-pharma with a known track record of unethical practices, and it's a conspiracy to make billions in profits... so we should all take Ivermectin instead.

Well there are several things we can do to verify the truth. If there were tens of thousands of deaths due to the vaccine, where is the evidence? It would get reported through medical and media channels, and the FDA would be extremely interested to hear about this.

Same thing here. If there were hundreds of crashes where an autonomous vehicle was at fault, where is the evidence? All it would take is one single dashcam video or mobile phone video disseminated on the internet to prove that Cruise was lying about its performance and was actively injuring innocent road users. It would be a huge scoop for any journalist or media outlet, and the negative publicity would lead to the California Department of Motor Vehicles yanking their Robotaxi license.

It would literally be easier to make up vaccine efficacy and safety trials performed in labs with double blinded patients (who don't even know if they're getting the placebo or not) than it would be to lie and make up AV safety figures when the testing is all done IN PUBLIC.

In the case of Tesla, they're not beholden to any agency for a license (they aren't doing Robotaxi testing) so they can lie or whatever with no consequence... that one I agree there's no rational reason to believe (or disbelieve) anything they report. Just look at the motives.

Same concept as a health supplement company. They don't need a license from the FDA because it's not a drug. They can make all these claims, their private study shows this, or not. Doesn't mean squat, really, because they aren't reliant on the FDA for a license to sell.
posted by xdvesper at 5:37 AM on October 6, 2023 [11 favorites]


I will still rely on the NHTS data and not the data from a business trying to succeed. The vaccines were obviously safe because MILLIONS of people were taking them with no ill effects. I am not aware of millions of self driving cars.

There are already hundreds of crashes per year involving cars that have self-driving components, and the emphasis I've seen is that a driver is necessary to reduce those. So I have little confidence that no driver is better.

This shit as it stands right now gives me the shivers.
posted by tiny frying pan at 5:46 AM on October 6, 2023 [15 favorites]


I wonder how many times BART has struck someone per passenger mile traveled.
posted by aspersioncast at 5:51 AM on October 6, 2023 [2 favorites]


Not aimed at any person in particular, or even this thread, but in general, I actually don't get this "post-truth" world where we can label any data that doesn't agree with our worldview as false and made-up.

We have reasons to think that pharma companies and the people who actually do the work are being kept approximately honest-ish. Fear of regulation at the firm level, fear of criminal or civil action and just becoming unhireable at the level of individual researchers.

I might be willing to trust that the numbers Cruise provides about the number of crashes their cars are involved in are in the ballpark. But why would we think that Cruise's assessment of whether their cars were at fault is any better than cops' assessment of whether their officers were at fault? What's the bad thing that happens to Cruise or its employees if someone else thinks that their vehicles were at fault 28% of the time instead of 6%? Who goes to prison? Who loses their career?

I wonder how many times BART has struck someone per passenger mile traveled.

Even if you had the numbers, it would be a devil of a time trying to figure out how many of those were suicides.
posted by GCU Sweet and Full of Grace at 5:54 AM on October 6, 2023 [6 favorites]


I’m not so concerned that “autonomous” vehicles are likely to increase actual safety risks, especially as human driver behavior in my area has degraded dramatically over the last couple of years. What does bother me about this and similar imaginable circumstances is the powerlessness of actual, physically present people to help when something has gone wrong. The thought of the typical failed product customer service experience, where the relevant phone number is made intentionally difficult to find, then puts you through a cumbersome menu, then puts you on hold for a while, then connects you to a low-level CSR whose grasp of your language is poor, and who can only plod through a script until they finally admit that they’re not empowered to do the thing that would actually solve your problem… that experience, but with a 3/4 ton weight on top of someone? That possibility is what I find nightmarish. These vehicles ought to have a big red button on the exterior that makes them instantly, locally controllable.
posted by jon1270 at 6:09 AM on October 6, 2023 [12 favorites]


I generally keep out of these threads but I have to point out that there are tens of thousands of very unsafe drivers on the roads right now who know that they're unsafe behind the wheel but can't bring themselves to stop driving because they don't want to lose their autonomy. If you have older friends or relatives, you're probably nodding right now. If self-driving vehicles were primarily positioned as assistive devices for those who can't or shouldn't drive themselves, I think we'd have a lot less of a problem with them.
posted by phooky at 6:38 AM on October 6, 2023 [15 favorites]


I'm so glad that everyone is giving the person who hit her with their car a pass so they can opine on the evils of autonomous vehicles. I'm not even an AV fan, but this is kind of ridiculous.

Whether or not the AV was at fault in the situation, or should have reacted differently to a pedestrian underneath it, remains to be debated.

The AV is certainly not at fault for the woman getting hit by the human driver and then flung into its path. Stopping, which is what it did, was literally the only thing it could do to minimize its impact. It is truly unfortunate that it ended up stopped on top of her leg, but at that point, what is the right course of action? If it rolls forwards or backwards it might run over her head - it doesn't know. That's an extremely low-information event for which the safest course of action is to lift it up and off - which is what happened. Hopefully this spurs a much faster response protocol in events like this.

FWIW, about a hundred people are killed by automobiles daily in the U.S. We all know what bad drivers humans are, etc etc etc, but somehow despite that they do a pretty decent job of avoiding killing each other.

That's 365,000 people killed by cars per year. I wouldn't call that a "pretty decent job of [not] killing each other." That's almost on par with coronary heart disease. Compare that number to 18 fatalities due to autonomous vehicles over a 2 year period.

Like, it feels really weird to be defending autonomous vehicles, because I don't trust car companies, but this situation is 100% the fault of the driver who hit the woman and then fled the scene. F that person, I hope they do a long stint in prison.
posted by grumpybear69 at 6:44 AM on October 6, 2023 [16 favorites]


These vehicles ought to have a big red button on the exterior that makes them instantly, locally controllable.

Most robots have big, visible emergency stop switches, but most robots aren't tooling around town with griefers constantly looking to slap the e-stop for lulz. Less visible but functional emergency stops inside vehicles are common (think elevators, subway trains). I don't have a good solution for this, but I could imagine exterior buttons that immediately get an emergency operator on line; not perfect but something.
posted by phooky at 6:46 AM on October 6, 2023 [2 favorites]


These vehicles ought to have a big red button on the exterior that makes them instantly, locally controllable.

Emergency services definitely should be able to take control of them. Including cops, though I know how well that proposition will go over. Taking emergency control of any dangerous machine shouldn't be an option open solely to the manufacturer or operator.

But you can't have just anyone slap a red button on the door when it's at the stop light and take control of the car. They need to be locally controllable by someone who has a key or knows the code or something like that.
posted by pracowity at 6:47 AM on October 6, 2023 [2 favorites]


Is it unreasonable to assume that a human driver may have clocked the situation as it unfolded and not just once the pedestrian was directly in front of the vehicle?
posted by badbobbycase at 6:56 AM on October 6, 2023 [5 favorites]


Crash, not accident.

Collision, not accident.
posted by biffa at 6:57 AM on October 6, 2023 [3 favorites]


That's 365,000 people killed by cars per year.

365 x 100 =36,500.
posted by biffa at 6:59 AM on October 6, 2023 [2 favorites]


365 x 100 =36,500.

They say musicians are good at math. I am the exception.
posted by grumpybear69 at 7:00 AM on October 6, 2023 [4 favorites]


Not a huge fan of AVs, but let's say the traditional car interface (steering wheel/brakes/throttle) was present on this car and also was active and accessible. (I'm not suggesting that as a solution, I'm just creating a hypothetical here)

Are bystanders expected to hop in and make a decision whether to drive the car forward or backwards off of the person pinned underneath it? I suspect that the normal first responder protocol is to jack the car up, not drive it off the person stuck beneath it. If I found myself in a bystander position I might consider sticking a scissor jack from another car under it and lifting it, but I definitely wouldn't risk driving it and potentially making the injuries worse.
posted by Larry David Syndrome at 7:01 AM on October 6, 2023 [10 favorites]


I guess what I'm saying is that "the car couldn't be controlled by a local human after the accident" is a bit of a straw man in this whole scenario.
posted by Larry David Syndrome at 7:02 AM on October 6, 2023 [4 favorites]


Don't worry about it - we just reduced US road traffic deaths by 90% in 15 minutes. We're the most effective measure ever taken.
posted by biffa at 7:02 AM on October 6, 2023 [6 favorites]


What if instead of millions of self-driving cars each carrying a single passenger we ended up with a smaller fleet of larger vehicles driven by professional drivers that could help people get where they are going?

Bus drivers are busy driving buses.

I want buses driven by robots -- actual anthropomorphic robots, if possible -- but with a human sidekick/guide/conductor/bouncer on every bus to keep the people in line. The conductor could also do stuff like maneuver the bus off a screaming hit-and-run victim if the occasion arose, but mainly you'd want a bus that just tootles along in the bus lane (eliminate all on-street parking to make room for those lanes) and gives everyone a nice safe ride from block to block while the nice conductor helped old ladies to their seats.
posted by pracowity at 7:08 AM on October 6, 2023 [2 favorites]


A recent SF Chronicle article clarifies the situation with the vehicle not being moved despite the pedestrian being pinned under the car:

According to Cruise, police had directed the vehicle to remain stationary, apparently with the pedestrian trapped beneath it. While spokespeople for the Police Department did not have an officer’s report Tuesday morning to clarify exactly when police made the order, they said it is standard procedure not to move cars at the scene of wrecks – regardless of whether a human is operating them.

“When it comes to someone pinned beneath a vehicle, the most effective way to unpin them is to lift the vehicle,” Sgt. Kathryn Winters, a spokesperson for the department, said in an interview. Were a driver to move a vehicle with a person lying there, “you run the risk of causing more injury.”


It does sound like Cruise did have the ability to move the car in some fashion and was legitimately asked not to.
posted by I EAT TAPAS at 7:09 AM on October 6, 2023 [22 favorites]


The obvious solution here is for Cruise to add an automated speech capability so that their cars can calmly and reasonably explain that, statistically speaking, they're parked on top of you for your own good.
posted by Gerald Bostock at 7:28 AM on October 6, 2023 [12 favorites]


’When it comes to someone pinned beneath a vehicle, the most effective way to unpin them is to lift the vehicle,” Sgt. Kathryn Winters, a spokesperson for the department, said in an interview. Were a driver to move a vehicle with a person lying there, “you run the risk of causing more injury.”’

It does sound like Cruise did have the ability to move the car in some fashion and was legitimately asked not to.


Hush, you’re ruining the narrative. The AV can’t do better in the situation than a naïve human would.
posted by Tell Me No Lies at 7:32 AM on October 6, 2023 [9 favorites]


...although the fact that many if not most of the commenters in this thread would have done the wrong thing and further injured the woman speaks a lot about the challenges AI faces.
posted by Tell Me No Lies at 7:49 AM on October 6, 2023 [11 favorites]


I'm a bit baffled at the hatred for self driving cars. What's up with that?

So far the record shows them to be better, safer drivers than humans. Is this sort of an airplane crash deal where the illusion of control is causing people to fear the safer option?
posted by sotonohito at 7:49 AM on October 6, 2023 [6 favorites]


Early Tuesday morning, a local (Oakland) TV news station reported the crash. They said that Cruise had released the dash cam video of the crash to them, but that they could not show it, only watch it privately. The reason seems to be that Cruise had evidence that the person was hit by another car and they wanted that reported. It wasn’t their fault - that was the message that had to get out immediately after the crash. The news people lamented that at 7am Tuesday they had no information about the condition of the victim. It’s 8am Friday and I still haven’t heard anything about the victim.
posted by njohnson23 at 8:04 AM on October 6, 2023 [5 favorites]


Autonomous (and human-driven) vehicles are currently being designed with an ideology of creating a sort of wild west apocalypse vehicle. It’s short-sighted and simply not good enough.

This. There are a lot of things that could be done with infrastructure and regulations to make self-driving cars easier and safer. But nope. All of these companies are taking an all-or-nothing approach to completely and instantly replacing human drivers within the current driving paradigm, because move fast and break things, people!

The current approach to self-driving cars is just a libertarian tech-bro fantasy about being able to out-engineer reality.
posted by RonButNotStupid at 8:19 AM on October 6, 2023 [6 favorites]


They have not in the slightest proved this is the safer option, imo. Maybe in the future it will be.
posted by tiny frying pan at 8:19 AM on October 6, 2023 [1 favorite]


It is likely, even probable, that autonomous cars are being allowed on our roads prematurely. These events, however, are not evidence of that. The autonomous car acted correctly in this circumstance.
posted by a faded photo of their beloved at 8:23 AM on October 6, 2023 [5 favorites]


So far the record shows them to be better, safer drivers than humans. Is this sort of an airplane crash deal where the illusion of control is causing people to fear the safer option?

Autonomous vehicles are currently about 10 times more dangerous than human drivers, having driven roughly 8 million miles per fatal crash, where the population-level human rate in the US is about 85 million miles per fatal crash. The fact that Cruise has driven a million miles without a fatal crash is statistically meaningless. This is a fundamental problem: it takes an incredible amount of exposure to develop strong statistics, and there is no evidence that the corporations involved are releasing good enough data to permit better estimates with less driving. (I'll note here that a median driver, for example someone over 25 and sober, is about twice as good as that average, and that autonomous vehicles also drive empty, and we should actually measure the crashes per mile serving passengers, which might mean that ~250 million miles per fatal crash is a reasonable baseline, in which case autonomous vehicles are roughly 30 times more dangerous per occupied mile than a median driver.)
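
A sketch of the arithmetic behind those ratios, taking the figures as stated above at face value:

# Miles per fatal crash, as cited in this comment (not independently verified)
av_miles_per_fatal_crash = 8_000_000            # total AV miles driven, one fatal crash
human_miles_per_fatal_crash = 85_000_000        # US population-level average
median_occupied_miles_per_fatal = 250_000_000   # suggested baseline: sober 25+ driver, occupied miles only
print(human_miles_per_fatal_crash / av_miles_per_fatal_crash)      # ~10x more dangerous than the average driver
print(median_occupied_miles_per_fatal / av_miles_per_fatal_crash)  # ~31x more dangerous than that baseline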

But that's only the first line problem. Brief sidebar: in this 1954 IBM press release, after a successful demonstration of translating Russian to English, Dr. Leon Dostert said that "five, perhaps three years hence, interlingual meaning conversion by electronic process in important functional areas of several languages may well be an accomplished fact." In other words, that by the mid-50s or surely by 1960 computers could do translation. I'm bringing this old AI history up because we're at the other end of the problem now: computers are actually capable of doing reasonable translation in many cases between common languages, albeit not to the level of a skilled translator and missing cultural nuance.

It's 70 years later, and it turns out that the promising approach of the 1950s would never work. And neither would the promising approach of the 70s and 80s, and even the promising approach of the 2000s. I remember in the early 2010s that Google Translate was so bad, the main use case was as a party game; pass something through three or four languages and see what mangled sentence came back in English. It was deep neural networks circa 2015 (the same core technology as in AVs) that made the current, decent solution possible. We don't know -- it is literally impossible to know -- whether this current approach to technology can make AVs safer than a median human driver if we just let them drive enough (and kill enough real people), or whether the best this technology can get is a doubling or tripling or quadrupling of the already irresponsibly high number of road deaths. And we won't find out until we're there.

Which takes us to the third line of problems. Let's say that AVs can get better, and in fact can get better than humans can. Where does that leave us? It takes us from a system where the roads are full of cars, each carrying one person, to a system where the roads are even fuller of cars carrying nobody at all. This cannot fundamentally be solved by "optimization"; 100,000 people want to go downtown in the morning and 5,000 people want to leave, and there is no algorithmic way around that fact that doesn't result in a lot of empty cars driving around.

We already have pretty full roads; do we want to build more? Do we want to sit in worse traffic? Do we want to produce more CO2 emissions? It turns out that the most common type of microplastic in the ocean is actually tire particles; do we want this to increase?

It takes a bold visionary to look at a highway full of cars, each weighing 3000, 4000, 5000 pounds to carry 150 pounds of person, and decide that what we need is less efficiency.
posted by Superilla at 8:28 AM on October 6, 2023 [32 favorites]


I'm a bit baffled at the hatred for self driving cars. What's up with that?

So far the record shows them to be better, safer drivers than humans. Is this sort of an airplane crash deal where the illusion of control is causing people to fear the safer option?


Speaking only for myself, it's an earned distrust of the people building self-driving cars. I don't trust the move-fast-and-break-things ethos to design a toaster, let alone a robot that causes people to die when it makes a mistake. Elon Musk, in particular, has done great harm to the industry's reputation with repeated indicators that he and Tesla do not take AV safety seriously. Couple that with an alarming lack of government oversight, where apparently companies are just allowed to alpha-test on the streets, and it's a recipe for disaster. The other companies, including Cruise, are a lot better, but I'm answering your question about the hatred for self-driving cars.

It's interesting that you mention airplane crashes. I've worked in the technology side of aerospace and in big tech, and I'd much rather have the aerospace industry designing these things (737 Max 8 fiasco notwithstanding).
posted by qxntpqbbbqxl at 9:06 AM on October 6, 2023 [7 favorites]


Autonomous vehicles are currently about 10 times more dangerous than human drivers, having driven roughly 8 million miles per fatal crash, where the population-level human rate in the US is about 85 million miles per fatal crash.

I don't necessarily disagree with your overall point, but it must be noted that there is a vast gulf in crashes and fatalities per mile traveled between different types of roads/streets and different levels of urbanization. As an example I happen to have to hand, rideshare drivers in San Francisco have a crash about every 20,000 miles. By contrast, Cruise cars are involved in crashes once every 63,000 miles or so. Waymo has a very similar crash rate in SF.
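
The same cited figures, expressed as crashes per million miles (a rough sketch that takes the numbers above at face value):

# Crash frequency per million miles, using the rates quoted in this comment
rideshare_miles_per_crash = 20_000   # SF rideshare drivers
cruise_miles_per_crash = 63_000      # Cruise AVs in SF
print(1_000_000 / rideshare_miles_per_crash)  # ~50 crashes per million miles
print(1_000_000 / cruise_miles_per_crash)     # ~16 crashes per million miles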

It would be very enlightening if other states required the same public reporting that California does, but they don't, so we have no reliable data about how AVs perform in other conditions. In any event, it's clearly not sound to lump in the hundreds of millions of "easy mode" miles humans drive on rural Interstates and then declare AVs worse when they have zero reported miles doing that kind of driving.

The feds really should enact reporting requirements that cover the entire US so that a better comparison can be made.
posted by wierdo at 9:12 AM on October 6, 2023 [10 favorites]


I'd much rather have the aerospace industry designing these things

Airplanes are another thing that should be self-driving. Set it for Tokyo and then go serve coffee like the rest of the flight attendants.
posted by pracowity at 9:31 AM on October 6, 2023


Whoa, according to this AAA Foundation report 20% of fatal car crashes that involve a pedestrian are hit and run.
posted by gwint at 9:38 AM on October 6, 2023 [3 favorites]


Airplanes are another thing that should be self-driving. Set it for Tokyo and then go serve coffee like the rest of the flight attendants.

To a large degree, they already are. Pilot complacency was cited as a contributing factor in the two 737 Max 8 crashes, the theory being that pilots accustomed to doing more manual flying would have been better at preventing the crashes (to be clear, I am not blaming the pilots for the crashes; the blame rests entirely on Boeing executives IMHO).
posted by qxntpqbbbqxl at 9:40 AM on October 6, 2023 [5 favorites]


The SF Chronicle story says the pedestrian was in the crosswalk (or so says Cruise). Did the car see the pedestrian and start across the crosswalk anyway? Maybe I'm the naïve one, but I thought crosswalks were special places, like churches in Highlander. But it seems the law in CA is that a car can YOLO across if it has the green light -- I don't know about SFO local laws. It seems like a good idea for a car to wait, anyway.
posted by credulous at 9:53 AM on October 6, 2023 [1 favorite]


New today: The final 11 seconds of a fatal Tesla Autopilot crash. Details of the 2019 Tesla failure that killed the passenger sitting in the driver's seat. The system engaged on a road it could not handle, at a speed 14 mph over the speed limit, then failed to notice a crossing truck and drove right through it. Or half under it. Half of the passenger remained with the truck; the Tesla decided to stop 40 seconds later.
posted by Nelson at 9:59 AM on October 6, 2023 [4 favorites]


Jumping from the assholery of the move fast and break things crowd to hatred for self driving cars is a mistake.

Yes, Elon Musk sucks. I loathe him.

No, that doesn't mean self driving cars are bad.

Humans are absolutely terrible drivers. I drive to work and it is statistically the most dangerous thing I have ever done or ever will do in my entire life. Every day I take my life in my hands because humans are shitty drivers.

I will also note that the whole "zomg empty cars" argument is flawed in that it seems predicated on the idea that everyone will individually own a self driving car and send it home or something.

Once it's to the point where you can get a self driving car service for cheaper than maintaining a car of your own I suspect you'll see individual car ownership decline steeply and traffic decline as well.

Right now we have three parking spaces for every car. And people spend hours driving around looking for a parking space.

Worse, in our current system everyone owns a car themselves, and that car spends 90% of its time sitting still somewhere, not being used, wasting space, and wasting the resources that went into its construction.

If you can just get dropped off and let the car go take other people places then you solve two problems at once. No more issues with parking, and less downtime on each vehicle meaning there's a need for fewer vehicles overall.

This is not to say that current self driving cars are super fantastic. But we can't exactly make progress towards better self driving cars if we give into luddite propaganda and ban them.
posted by sotonohito at 10:15 AM on October 6, 2023 [7 favorites]


I believe self-driving cars can be safer than human-driven cars, but it's not clear the evidence shows we're there yet. Either way though, just being safer is not enough. The cars also need to be trustworthy, and transparent enough that we understand when they fail. And to have a clear legal liability situation; certainly the passengers shouldn't be responsible when their Tesla kills someone. The companies building the software have to have some liability; it's the only way this is going to work legally.

The problem with the cowboy bullshit Musk's company is doing is that they aren't handling any of these concerns or questions responsibly. Tesla just says "lol, the driver should have taken over, we made them pinky swear not to let their full self driving autopilot car pilot or fully self drive." Cruise seems a little better but is clearly screwing up in how it is menacing the streets of my home city now. Waymo seems meaningfully better too; it's no accident there's a lot less drama with their more careful testing.

I think it's crucial these companies create this new technology in a careful and transparent way so that it's safe and socially responsible.
posted by Nelson at 10:21 AM on October 6, 2023 [6 favorites]


We already have pretty full roads; do we want to build more?

Roads? Where we're going, we don't need roads.
posted by kirkaracha at 10:28 AM on October 6, 2023 [2 favorites]


Yeah, I'm getting really tired of the aviation analogies here, because that's software written to military/government standards, with real specifications and requirements, and flown by trained test pilots in controlled and isolated situations before a single paying civilian gets to step on board.

When Tesla/Cruise/Waymo/WhatTheFuckEver decides to beta test their software on public roads alongside me and my family, and I didn't agree to be a part of this test, I have a real problem with that.

I'm all for progress and I believe self-driving cars can be a thing when we ALL drive them simultaneously and accidents like this can be avoided. But, until then, get them off the fucking roads.
posted by JoeZydeco at 10:32 AM on October 6, 2023 [7 favorites]


The SF Chronicle story says the pedestrian was in the crosswalk (or so says Cruise). Did the car see the pedestrian and start across the crosswalk anyway? Maybe I'm the naïve one, but I thought crosswalks were special places, like churches in Highlander. But it seems the law in CA is that a car can YOLO across if it has the green light -- I don't know about SFO local laws. It seems like a good idea for a car to wait, anyway.

This KRON story has a full accounting of the situation based on their review of video footage. The pedestrian was hit in the crosswalk by a human-driven car, not the Cruise. The pedestrian was carried on the hood of the human-driven hit-and-run car for approximately one block (!) and then fell off the hood of the hit and run car into the other lane, directly in front of a Cruise which then attempted to brake. The hit and run driver fled the scene.

It is not clear whether a human driver could have stopped faster than the Cruise to avoid running over the already-injured pedestrian, but it is clear that the pedestrian was initially hit by a human-driven car and thrown into the path of the Cruise.
posted by I EAT TAPAS at 10:33 AM on October 6, 2023 [10 favorites]


This was an awful accident: I have gotten used to reading about the antics of self-driving cars as though they are from the funny news page, and got halfway through this one with that same perspective before realizing just how awful it was.

Coming off that feeling, Cruise's deflections and assurances that they just did what they were told, or how they were helping with investigations by making sure the blame got pinned on the other driver, all come across as lacking in basic empathy or decency.

It feels like the old advice to never admit fault after an accident: presumably a good idea in a legal sense, but definitely makes you look like an asshole in the moment, especially if you're bad at it and come across as defensive and dishonest.
posted by pulposus at 10:37 AM on October 6, 2023 [1 favorite]


And to have a clear legal liability situation

It's a bit like an elevator. Some corporation designed it and built it. Some corporation owns and operates it. One or both probably have some responsibility if it falls fifty floors or chops someone in half or bursts into flame and burns the orphanage to the ground.
posted by pracowity at 10:38 AM on October 6, 2023 [2 favorites]


If you can just get dropped off and let the car go take other people places then you solve two problems at once. No more issues with parking, and less downtime on each vehicle meaning there's a need for fewer vehicles overall.

But that's EXACTLY the basis for the "zomg empty cars!" argument. If you have tens of thousands of people who right now commute by car into a city at the same time, those cars either have to

a) stay parked all day to wait for rush hour in the opposite direction (what currently happens)
b) drive empty back out of the city to acquire another inbound ride
c) drive empty around the city looking for a local ride while waiting for the evening commute out of town.

This self-driving car utopia where a majority of people can replace their vehicles with some sort of autonomous, on-demand rideshare is not going to happen because the math just doesn't work.

Autonomous vehicles are no substitute for public transit.
posted by RonButNotStupid at 11:04 AM on October 6, 2023 [22 favorites]


This KRON story has a full accounting of the situation based on their review of video footage. The pedestrian was hit in the crosswalk by a human-driven car, not the Cruise. The pedestrian was carried on the hood of the human-driven hit-and-run car for approximately one block (!) and then fell off the hood of the hit and run car into the other lane, directly in front of a Cruise which then attempted to brake. The hit and run driver fled the scene.

It is not clear whether a human driver could have stopped faster than the Cruise to avoid running over the already-injured pedestrian, but it is clear that the pedestrian was initially hit by a human-driven car and thrown into the path of the Cruise.
posted by I EAT TAPAS at 1:33 PM on October 6 [2 favorites]


Initially this collision didn't seem like an AI failure. This new info makes it seem that a human could have done better - (barring some strange sightline issues) it's unlikely that a human driver would fail to see and hear that the car next to them had struck a pedestrian and was carrying them along on the hood of their car. Most drivers would stop or at least slow down and hang back in that scenario, especially as it unfolded over the course of a full block. It's not a reaction time issue, it's a situational awareness issue. (Admittedly, this is enough of an edge case that it doesn't seem outrageous that the Cruise car wasn't more proactive and was just in "business-as-normal" mode until the pedestrian rolled off the hood of the car next to them and into their path.)
posted by Larry David Syndrome at 11:06 AM on October 6, 2023 [10 favorites]


But it seems the law in CA is that a car can YOLO across if it has the green light -- I don't know about SFO local laws.

Where on Earth did you get this idea? In California pedestrians always have the right of way.
posted by sigmagalator at 11:47 AM on October 6, 2023 [4 favorites]


On my walk yesterday, I was halfway through a signaled pedestrian crossing at a roundabout, where the speed limit is posted 15mph. A driver who apparently didn't see the flashing pedestrian crossing lights and didn't see me, or did and just didn't fucking care, had to slam on their brakes in a great squeal of tires, and stopped just short of hitting me. When they saw that I was recording them (because I always video when crossing the street, because this) they made an angry face and floored it, again just barely missing me. Made a report and sent the video to the cops, with absolutely zero confidence anything will come of it.

So many drivers are goddamn fucking monsters. I don't think AVs are ready yet, but shit's gotta be better than these assholes behind the wheel.
posted by xedrik at 12:20 PM on October 6, 2023 [4 favorites]


Maybe people are upset because the now-everyday tragedy of a pedestrian being hit by a human driver who is at no point interested in taking responsibility is "supposed" to be a strong argument for public transit, but instead it seems like we're taking this brand-new fork of computerized chauffeurs for everyone?
posted by Selena777 at 12:29 PM on October 6, 2023 [6 favorites]


Autonomous vehicles are no substitute for public transit.

Autonomous public transit vehicles.
posted by pracowity at 12:31 PM on October 6, 2023 [1 favorite]


I guess I was questioning if the law says cars need to stop the entire time the pedestrian is in the crosswalk, even if their paths don't cross. It looks like CA law doesn't explicitly say this, and some state laws say explicitly that you can proceed if the pedestrian is on the other half of the road. I would expect autonomous vehicles to take the more cautious approach, regardless of the law. (This video makes it seem like it does not.)

If the Cruise vehicle initially saw the pedestrian in the crosswalk, what then? Did it just see her suddenly vanish when the other car carried her away, and then figured it was safe to proceed? Or was it planning to proceed anyway since she was in the other lane?

I think autonomous vehicles will be safer than the average distracted/angry/hotboxing/all-of-the-above driver, but it would be nice if they were more polite. I wonder if they could communicate with pedestrians visually or audibly, telling them when they're detected and safe to proceed. Right now they're like the jerks with full tint.
posted by credulous at 12:33 PM on October 6, 2023 [1 favorite]


I wonder if they could communicate with pedestrians visually or audibly, telling them when they're detected and safe to proceed.

I know!

I just nodded to a mother and three small kids who were contemplating whether or not to enter a crosswalk. I had stopped at a stop sign and they were on the other side of the road from me. Since they hadn't actually stepped into the crosswalk *yet* I had every legal right to proceed, but I could easily see their intent and I decided the right thing to do was to eliminate one variable from their calculus by letting them know with little more than a nod that I'd wait for them.

What freaks out a lot of people is that an AV just can't do that. Sure a vehicle will stop when it's required to stop, but it can't participate in the myriad of little interactions that occur ALL THE TIME between drivers and pedestrians.
posted by RonButNotStupid at 1:01 PM on October 6, 2023 [5 favorites]


Okay, the Cruise being on top of a pedestrian is fucking awful, but honestly not as awful as the first driver, who hit the person, carried them on top of their car for a few feet, and then just LEFT.

As someone who is solely a pedestrian, I worry A LOT about the fact drivers think doing the little "oopsies! I didn't look both ways and nearly ended your life!" wave is okay when you've just nearly hit me with your car.
posted by Kitteh at 1:10 PM on October 6, 2023 [8 favorites]


I will also note that the whole "zomg empty cars" argument is flawed in that it seems predicated on the idea that everyone will individually own a self driving car and send it home or something.

No, the "zomg empty cars" is because self driving cars cruise around on the roads waiting for the next passenger. They aren't conveniently parked in someone's driveway.


Once it's to the point where you can get a self driving car service for cheaper than maintaining a car of your own I suspect you'll see individual car ownership decline steeply and traffic decline as well.

It certainly didn't happen with Lyft/Uber:

Here's the lowdown: Traffic congestion since the companies' introductions in urban areas is up by 0.9% and the duration of a jam increased 4.5%. All the while, individuals relied less on public transportation, with ridership down 8.9%. At the same time, the most urban areas saw personal vehicle ownership fall by just 1%, and it wasn't uniform across all areas. In conclusion, Uber and Lyft have no impact on the number of personal vehicles on US roads.
posted by oneirodynia at 1:14 PM on October 6, 2023 [10 favorites]


I live in intown Atlanta, where red lights are viewed as optional for the first 5-10 seconds, nearly every driver is high as hell after 4pm, the road quality is D+ at best, there's a shirtless dude doing the Asshole Stroll across every heavily-traveled block, people routinely drive on the wrong side of the road, there are fewer traffic cops per capita than almost anywhere else in the USA and the city government isn't even any good at being corrupt. I can't WAIT for these fuckers to come try it here.
posted by outgrown_hobnail at 1:16 PM on October 6, 2023 [5 favorites]


Another thing that just occurred to me is the working assumption -- one even I had! -- that autonomous vehicle software will one day (in the past or future) be better than humans, and this will mean a permanent, long-lasting reduction in road deaths (if not climate deaths).

Because baked into that assumption is that massively networked, complex computer systems run by private corporations only go one direction, and that is the direction of improvement. But it occurs to me that I've used Facebook 10 years ago and recently; I've shopped on Amazon five years ago and recently; I've done Google searches two years ago and recently; I've looked at Twitter one year ago and recently. This assumption seems less and less robust. (And as noted, pilots flew Boeing jets 10 years ago and more recently with the faulty MAX software.)

It is entirely plausible that in year 202X or 203X an autonomous vehicle is safer than the average or even the median human, and that in the year 203Y or 204Y due to under-maintenance or optimizing for faster trips or getting kickbacks to drive past McDonalds or whatever, they are no longer safer. But they will still be just as legal.
posted by Superilla at 2:23 PM on October 6, 2023 [14 favorites]


I can't WAIT for these fuckers to come try it here.

Cruise is testing in Miami. If their software can handle that, it can handle anything in the rest of the US or Canada that doesn't involve snow.

I'm not as optimistic about Cruise, but I'd be perfectly happy to have Waymo around me. Waymo's model/algorithm/whatever you want to call it is extensively tested before it is ever allowed to operate independently on the road. How, you ask? Human drivers drive around in fully instrumented cars. That data gets fed into a simulation where the ML model is initially trained on the real data and variations of that data with the parameters changed. When that has produced a reasonably safe "driver," the AI gets to drive a real car, but with a safety driver. Again, the data is fed back into the model and millions of miles more are run in the simulator. Only then does Waymo put cars on the road operating autonomously. Then, when there are inevitably crashes, that data is used in even more simulations to teach the AI the best way to handle similar situations in the future.

The key here is the extensive use of simulation based on, but not exactly the same as, real world conditions. It still doesn't eliminate all crashes, but it does appear to help reduce the severity of the crashes that do happen.
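
As a very rough sketch of the feedback loop described above (all class names, function names, and numbers below are invented for illustration; the real pipeline is proprietary and vastly more complex):

```python
# Illustrative sketch only; every class, function, and number is invented.
import random
from dataclasses import dataclass

@dataclass
class Scenario:
    """A recorded driving situation plus parameters that can be varied."""
    base_log: dict           # stand-in for real sensor/trajectory data
    pedestrian_speed: float  # m/s, one example parameter to perturb
    visibility: float        # 0.0 (night/fog) .. 1.0 (clear day)

def perturb(scenario: Scenario) -> Scenario:
    """Make a simulated variation of a real scenario with changed parameters."""
    return Scenario(
        base_log=scenario.base_log,
        pedestrian_speed=scenario.pedestrian_speed * random.uniform(0.5, 2.0),
        visibility=max(0.1, scenario.visibility * random.uniform(0.6, 1.0)),
    )

def train_driving_policy(real_scenarios, rounds=3, variations_per_scenario=100):
    """Alternate simulated training with (stubbed) real-world data collection."""
    policy = {}  # stand-in for the actual learned model
    for round_number in range(rounds):
        # 1. Expand real logs into many simulated variations.
        sim_set = [perturb(s) for s in real_scenarios
                   for _ in range(variations_per_scenario)]
        # 2. Train/evaluate the policy against the simulations (stubbed here).
        policy["round"] = round_number
        policy["trained_on"] = len(sim_set)
        # 3. In the real pipeline, the policy would then drive with a safety
        #    driver; any incidents would be logged and appended to
        #    real_scenarios before the next round.
    return policy

seed = [Scenario(base_log={}, pedestrian_speed=1.4, visibility=0.9)]
print(train_driving_policy(seed))  # {'round': 2, 'trained_on': 100}
```

The point is just the loop structure: real logs seed a much larger set of simulated variations, and whatever goes wrong on the road flows back into the next round of training.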
posted by wierdo at 3:16 PM on October 6, 2023


It's almost as if the solution to human-driven cars wasn't AI-cars, but rather less cars.
posted by signal at 3:53 PM on October 6, 2023 [5 favorites]


Watch out for the details, guys. KRON4, an independent SF news station, has more details.
A Cruise representative allowed KRON4 to view video footage recorded by its AV involved in the accident. The video confirmed that a human-driven car struck the woman first before she was thrown directly into the path of the Cruise car.

  • The video begins with the human-driven car and driverless Cruise car waiting side-by-side at an intersection for a traffic light to turn green.
  • After the light turns green, the video shows the woman walking in a crosswalk when she is struck by the human-driven car. At the moment of impact, the woman was crossing the street against a red light and walking in front of oncoming traffic.
  • The video then shows the woman being thrown onto the hood and windshield of the human-driven car. She is carried on the hood for about a block. She then tumbles onto the pavement directly in front of the Cruise car.
  • The video shows the Cruise car running the woman over.
  • The human-driven car pauses before fleeing the scene.
  • The video ends.
  • It is NOT possible to predict whether the Cruise vehicle could have stopped even if there had been a test driver behind the wheel.

    Also, I'd like to point out the intersection at 5th and Market is NOT a simple intersection. That vehicle is heading SOUTHeast bound on 5th St, so the traffic came from northwest of Market Street. I lived in SF for decades, and I know that intersection quite well. It's decently lit at night, but I honestly can't say if this accident was preventable, and I would NOT pass any judgment until we know more.
    posted by kschang at 4:29 PM on October 6, 2023 [1 favorite]


    Again, any human being driving a car who saw the driver next to them hit another human being would do almost anything but then run over the victim, too. Most likely, they would brake, honk their horn, scream, and/or call 911. They would not blithely drive next to a car with another human being on its hood and then run over that human being when they fell off the hood.
    posted by hydropsyche at 4:54 PM on October 6, 2023 [10 favorites]


    That's the thing that strikes me about this incident as well - even with the best of sensors, AVs lack the comprehension to have real situational awareness. I don't drive all that much, but once a month or so, I'll see something while I'm driving that puts me into high alert: a car on the highway having trouble staying in its lane, a poorly secured load on the truck in front of me, etc. Human drivers pick up subtle cues from the environment that let them anticipate and avoid trouble before it happens. AVs don't have this ability, so they have to rely on fast reactions alone. Maybe that's enough to make them safe enough eventually, but maybe not.
    posted by qxntpqbbbqxl at 5:10 PM on October 6, 2023 [7 favorites]


    Human drivers pick up subtle cues from the environment that let them anticipate and avoid trouble before it happens.

    Well, the asshole who actually hit the pedestrian is not one of them.
    posted by Kitteh at 6:26 PM on October 6, 2023 [3 favorites]


    Well, the asshole who actually hit the pedestrian is not one of them.

    The Cruise also actually hit the pedestrian. Most people would not have driven like the Cruise, because it's hard not to notice a person on the hood of the car next to them and naturally slow down a bit. The Cruise is a worse driver than a human.
    posted by netowl at 7:20 PM on October 6, 2023 [6 favorites]


    Again, any human being driving a car who saw the driver next to them hit another human being would do almost anything but then run over the victim, too.

    Nah, lots of people would run over the victim, too. Lots of people's startle reaction is to just do *something* and to do it hard and fast, and we don't always pick the right thing to do. And it's pretty common for people intending to mash the brakes to miss and hit the accelerator, and then just press it even harder because it's not working. We got rear-ended on the Lewiston-Queenston bridge that way once (thank God everyone involved was in the US).

    They wouldn't intend to; they wouldn't very quickly work through a flowchart and *decide* to run them over. They wouldn't "blithely" do it. But lots of people would absolutely run them over in a blind startled panic.
    posted by GCU Sweet and Full of Grace at 7:29 PM on October 6, 2023 [6 favorites]


    Most people would not have driven like the Cruise, because it's hard not to notice a person on the hood of the car next to them and naturally slow down a bit

    Perhaps, assuming that the person was even visible to the Cruise vehicle's cameras. I'd have more conviction on the subject if I hadn't read so many stories about people being struck by multiple vehicles in a single incident. Most drivers just don't pay that much attention to things that aren't directly in front of them.
    posted by wierdo at 9:34 PM on October 6, 2023 [6 favorites]


    DO NOT BLAME THE ROBOT
    posted by Nelson at 11:23 PM on October 6, 2023


    I'm not weighing in on whether robot cars are better or worse than human driven ones. I don't, honestly, know enough to have more than feelings rather than opinions. But I can speak to what drivers do when they see a human on the hood of a car.

    There was a BLM protest in Pensacola. Some folks got into the street. An SUV drove slowly through the crowd, somehow ending up with a guy on the hood, clinging to the edge of the hood near the windshield. That car then drove over the three mile bridge, at 45 mph, with the dude on the hood, in the right hand lane. Cars in the left lane pretty much drove normally. Most passed the car-with-dude. Some hung back and got behind it. Dude got off the car, with bruises but no serious injury, in Gulf Breeze. Well over three miles later.

    So no - I'm not at all convinced that people would either notice or take safe actions when a car beside them has a human on the hood.
    posted by Vigilant at 12:10 AM on October 7, 2023 [6 favorites]


    Worse, in our current system everyone owns a car themselves and that car spends 90% of its time sitting still somewhere not being used and just wasting space and being a waste of the resources that went into its construction.

    This seems like sort of a weird argument. I’m also not using my wok or my washing machine or my bathtub most of the time, but I don’t regret having them. I’m not a big fan of car culture (I don’t drive and so it does annoy me how much of the infrastructure is hostile to me) but material culture in general involves a lot of made things that are only in use a little bit of the time.
    posted by eirias at 4:08 AM on October 7, 2023 [1 favorite]


    Cars have orders of magnitude more space allocated to their storage (and yet more for their use) than home appliances or bathtubs or whatever other personal goods you want to compare them with. All of that space costs an enormous amount of money directly to build and maintain it and even more over time in reduced land value and other knock on effects of having so much land area dedicated to parking personal autos. Worse, the vast majority of the time the costs aren't borne entirely by those who choose to own and drive cars.

    Every one of those parking spaces represents somewhere between $1000 and $30000 in up front cost. The land area used is not generating tax revenue. That space forces productive uses to be spread farther apart, which means more linear miles of infrastructure to maintain, from the obvious like roads to the mostly unseen like utilities. Worse, the latter two issues combine to make our cities functionally insolvent in the long run because more pipes, lane miles, etc have to be paid for with less revenue per unit of land area.

    All of that together combines to drive up the cost of basically everything we buy. And that's not even touching on how utterly shitty a sea of asphalt makes a city. Not only does it make walking somewhere between unpleasant and nearly impossible it also exacerbates heat waves due to the heat island effect preventing cities from cooling down at night, which makes people much more susceptible to heat-related illnesses.

    All that said, I don't want to give people the impression that I hate cars or anything. I think cars are great. The problem is that we have way, way, way too many of them and insist on cramming them into cities where they simply don't belong. There are 500 foot tall residential skyscrapers in Miami that have nearly a hundred feet of their vertical height dedicated to car storage. The extreme cost of doing that added to the space dedicated to car storage that could instead be housing makes it goddamn near impossible to build anything but luxury housing. That's the kind of thing I hate, not the machines themselves.
    posted by wierdo at 5:14 AM on October 7, 2023 [5 favorites]


    in our current system everyone owns a car themselves and that car spends 90% of its time sitting still somewhere not being used and just wasting space and being a waste of the resources that went into its construction

    If most parking spaces had a charging port and it became normal and expected for cars to be wired back to such a port when parked, then far from wasting space and being a waste of the resources that went into their construction, the domestic car fleet would double as an aggregate grid-accessible energy storage resource sufficient to cover variability in solar and wind plant output many times over.
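
A back-of-envelope check on that claim, with assumed round figures for a US-sized fleet (none of these numbers are measurements, and the comparison ignores charger availability and how much of each battery owners would actually let the grid use):

```python
# Back-of-envelope only; every figure below is an assumed round number.
vehicles = 250e6             # light vehicles in the fleet (assumed)
usable_kwh_per_vehicle = 60  # usable battery if the whole fleet were EVs (assumed)
daily_demand_twh = 11        # approximate US electricity consumption per day (assumed)

fleet_storage_twh = vehicles * usable_kwh_per_vehicle / 1e9  # kWh -> TWh
print(f"Fleet storage: {fleet_storage_twh:.0f} TWh")
print(f"Fraction of a full day's demand: {fleet_storage_twh / daily_demand_twh:.1f}x")
# ~15 TWh of storage against ~11 TWh/day of demand: even if only a modest
# fraction of cars is plugged in at any moment, that's a very large buffer
# relative to hour-scale swings in wind and solar output.
```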
    posted by flabdablet at 5:34 AM on October 7, 2023 [3 favorites]


    I have to admit that I am weirded out that this thread is pretty much going ROBOT CARS BAD HUMAN DRIVERS GOOD when the initial catastrophic injury was done by a human driver. Did the robot car hit the person after? Absolutely. But everyone is going after the robot car, and just more or less ignoring the hit and run done by a human driver. To me, that is always always going to be worse. And humans are shit drivers. I have been nearly hit by many many human drivers as a person who cannot afford a car and walks to get to where I need to go. People think they are good drivers. They are not. I guess I can expect a future where I have to dodge being hit by humans and robots. What a shitty outcome.
    posted by Kitteh at 5:37 AM on October 7, 2023 [3 favorites]


    I don't think human drivers are good. But I also don't trust current self driving car tech. That seems to be what most people are saying.
    posted by tiny frying pan at 6:16 AM on October 7, 2023 [7 favorites]


    Nah, the thread veers more toward the latter than the former. And I have been saying I don't trust either one of those options.
    posted by Kitteh at 6:25 AM on October 7, 2023


    exacerbates heat waves due to the heat island effect

    When solar radiation strikes an unshaded surface, some is reflected off into the sky and some is absorbed. Absorption into a simple passive surface like a rooftop or pavement causes heating in that surface. Absorption into an active surface like a tree leaf or a solar PV panel causes less such heating, because some of the absorbed energy ends up doing other things.

    In tree leaves, some of the absorbed energy drives a phase change of water from liquid to vapor in the process of transpiration, ultimately supplying the mechanical energy required to keep water moving from roots to leaves. In thermodynamics terms this is a relatively inefficient heat engine, and yet the difference between having most of a surface occupied by these inefficient heat engines and not is the difference largely responsible for the heat island effect.

    Solar PV panels convert roughly a fifth of the radiation that strikes them into electrical energy that's then instantly removed from the immediate vicinity of the absorption site and dissipated somewhere else. I have not run the numbers but I would not be surprised to find out that this has local heat accumulation mitigation effects roughly similar on a surface area for surface area basis with tree leaves.
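
A crude way to put numbers on that comparison; the irradiance, albedo, and energy-removal fractions below are assumed round figures, not measurements:

```python
# Crude surface energy comparison; all inputs are assumed round numbers.
IRRADIANCE = 1000.0  # W/m^2, bright midday sun (assumed)

def local_heating(albedo, removed_fraction=0.0):
    """W/m^2 left over to heat the surface and the air around it."""
    absorbed = IRRADIANCE * (1.0 - albedo)
    return absorbed * (1.0 - removed_fraction)

# Dark pavement: low albedo, nothing removed, so nearly all becomes local heat.
print("asphalt :", local_heating(albedo=0.10))                         # 900 W/m^2
# PV panel: similar albedo, but roughly a fifth of the energy leaves as electricity.
print("solar PV:", local_heating(albedo=0.10, removed_fraction=0.20))  # 720 W/m^2
# Tree canopy: higher albedo plus a large share spent on transpiration.
print("tree    :", local_heating(albedo=0.25, removed_fraction=0.50))  # 375 W/m^2
```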

    Ah, you say, but most of that removed energy is still going to end up degrading to heat, and most of that degradation is going to be happening in the city because that's where most of the energy end use is at, so thermally speaking there's no net benefit, and this whole line of argument is just more techno utopian horseshit.

    To which I respond that the energy supplied by every single solar PV panel installed in a city is energy that hasn't come from a thermal generator. The total amount of high-quality, low-entropy energy that a city ultimately dissipates as heat doesn't depend on the specific source of that low-entropy energy, and to the extent that PV displaces thermal, that is a net benefit.

    Not only that, but the concentration of energy into high-quality, low-entropy forms is just a lossy, inefficient process no matter which way it happens. The flip side of which is that the vast bulk of the heat energy that a city-scale region ultimately dissipates as heat will not have made that little side trip through low-entropy conversion along the way. All you need to do in order to build a substantial heat island is replace vegetation with paving; adding a bunch of energy-consuming machinery as well doesn't make things a great deal worse.

    In effect, urban heat islands are global warming in microcosm. The energy flows involved are so mind-manglingly huge that all it takes is small local disruptions to create effects highly consequential to human beings.

    The single most significant factor when thinking about urban heat islands is the balance between reflectivity and absorptivity of solar radiation for exposed surfaces. If urban heat islands were a problem we were seriously interested in solving at the expense of every other consideration, we'd just deploy enough white paint to make every city show up as a pure white patch on Google Earth. But we're not going to do that because there's no economic (i.e. individually self-interested) argument for doing so, and if economic motivations didn't drive human behaviour to the extent that they do, we wouldn't have a heat island problem in the first place.

    There are economic arguments for building solar PV shade structures over every parking lot in America, especially given the current trajectory of ICE vehicle displacement by battery EVs and the ongoing tendency of mass production to drive PV panel costs down. And given the similarity noted above between what trees and PV panels do with some of the incident radiative energy they absorb, and given the ridiculous proportion of today's urban surface area devoted to parking, it seems to me that doing so pretty much has to lessen the heat island problem as a beneficial side effect.
    posted by flabdablet at 6:47 AM on October 7, 2023


    If you're worried about whether cars should be driven by robots or apes, focus on that. The amount of public space that cars use up is a different discussion. Cars use up all the space already, regardless of who or what is driving them.

    Yeah, life would be better if we reduced or eliminated on-street parking. There's no reason your private car needs to be parked on a public street overnight or while you work or shop or whatever. Buy or rent a parking spot somewhere off the street. If you have multiple cars, get multiple spots. Your parking burden should be your parking burden. With no on-street parking, we would have a free lane or two (a lot of streets have parking on both sides) to convert to trees, sidewalks, bicycle lanes, etc. But that's a totally different story that shouldn't be conflated with the "omg robot cars are coming to kill us all!" narrative.
    posted by pracowity at 6:55 AM on October 7, 2023


    Kitteh, that's because there's nothing to say about a hit and run driver...they're a piece of shit. But there's a lot to say about self driving cars and what those should have done in this scenario because the option to do something about it exists and is ongoing. I don't see it as picking one over the other to focus on.
    posted by tiny frying pan at 6:59 AM on October 7, 2023


    I don't think human drivers are good. But I also don't trust current self driving car tech. That seems to be what most people are saying.

    That's my exact position, for what it's worth.

    I have no time at all for the line of argument that says autonomous vehicles are on course to make a worthwhile contribution to public safety and that therefore Waymo, Cruise and for fuck's sake Tesla should be encouraged to keep on doing what they're doing. I think that argument is horseshit for a number of reasons, but first and foremost among those is its source: it came straight out of autonomous vehicle marketing and is therefore (to borrow Frank Wilhoit's pungent phrasing) axiomatically dishonest and undeserving of serious scrutiny.

    If you want to make a public safety argument, gimme complete, verifiable research data or STFU. No, a carefully curated information drip feed from motivated actors is not something I am ever going to be persuaded by.

    Yes, apes suck. No disagreement from me there, ever. But I have spent enough time in the software development industry, and enough time hanging about with software developers, to have no reason at all to believe that robots, especially mass-marketed universal robots, don't suck and won't suck at least as much if not way mo'.

    Autonomous vehicles are happening because overconfident techbros think they're cool and venture capitalists can smell money to be made. If you genuinely believe there's anything more to them than that, I think you're fooling yourself.
    posted by flabdablet at 7:12 AM on October 7, 2023 [7 favorites]


    So, maybe we could discuss whether a Cruise car would have hit the pedestrian in the first place. The issues with the auto-drive vehicle that hit the woman seem to be that it failed to get off her leg and whether it could have sensed her being flung in front of it while it still had time to stop.

    Also, stepping into a busy street in the middle of the block without looking at traffic is a terrible idea. So there's that, too. Do we know if the human driver who fled the scene could have avoided hitting the woman?
    posted by mule98J at 9:10 AM on October 7, 2023


    Also, stepping into a busy street in the middle of the block without looking at traffic is a terrible idea. So there's that, too.

    Before blaming the victim, consider that it could be the driver culture in the city. I forwarded this on to someone in SF, who tells me that drivers in cars there generally do not respect crosswalks or, more accurately, pedestrians in crosswalks. Make of that what you will.

    To the extent that AI models are trained on how people drive, maybe it makes sense that the Cruise car crashed into her, just as the human-driven car did.
    posted by They sucked his brains out! at 1:42 PM on October 7, 2023


    I have no time at all for the line of argument that says autonomous vehicles are on course to make a worthwhile contribution to public safety

    ….says a member of the gender that can safely get into taxis at night.
    posted by Tell Me No Lies at 6:29 PM on October 8, 2023 [3 favorites]


    I've driven far more taxis than taken them. I'm sorry the drivers available to you are as dangerous as you imply. I still think that expecting robot drivers, and specifically the corporations that run the robot drivers, to be less dangerous than that if they are eventually allowed to dominate the rides-as-a-service industry is kind of naive.

    Put my body inside a machine designed to lock me in there and take me wherever the fuck its remote operator wants? Thanks, no thanks.
    posted by flabdablet at 7:22 PM on October 8, 2023 [3 favorites]


    whether it could have sensed her being flung in front of it while it still had time to stop.

    I think this is a pretty low standard, and that the question should be whether it could have seen the person on the hood in its mirrors (or cameras) over the preceding block and reacted to that. Very few human drivers could react fast enough to avoid a person running out from between two vans, for example, so starting the evaluation at "person drops right in front of car" is going easy on the tech.
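
To put rough numbers on why that framing matters, here is a stopping-distance sketch; the speed, reaction latency, and braking figures are assumptions for illustration, not data from the incident:

```python
# Rough stopping-distance sketch; every input is an assumed round number.
speed_mph = 25     # assumed urban travel speed
reaction_s = 0.5   # assumed detection-plus-actuation latency
decel_g = 0.8      # assumed hard braking on dry pavement

speed_ms = speed_mph * 0.44704   # mph -> m/s
decel_ms2 = decel_g * 9.81

stopping_m = speed_ms * reaction_s + speed_ms**2 / (2 * decel_ms2)
print(f"Stopping distance at {speed_mph} mph: about {stopping_m:.0f} m")
# Roughly 13-14 m, i.e. several car lengths. Judging the AV only from the
# instant the pedestrian landed in front of it therefore sets a very low
# bar; what its sensors could have picked up during the preceding block is
# the more interesting question.
```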
    posted by rhizome at 12:36 AM on October 9, 2023


    Right this second public roads worldwide are a site of massive and ongoing carnage. Around 44,000 die every year from traffic accidents in the USA alone. That's close to 120 a day.

    It is clear that as long as humans are driving, traffic fatalities are going to continue. I'd love to see more walkable and bikable cities and more public transit, but we're not going to see an end to cars. Eliminating human drivers is the most direct path to ending the ongoing traffic massacre.

    Which is why I see the combination of NIMBY and luddism as so baffling.

    It is not POSSIBLE to jump instantly from no self driving cars to perfect self driving cars. If there are ever going to be perfect self driving cars then there is, unavoidably, going to be a period wherein imperfect self driving cars exist on public roads. I totally get the anger and distrust of the move fast and break things crowd, Musk sucks and the others aren't particularly better. But they're the ones doing this at the moment. I'd rather suffer the frustration of letting them get a win than allow humans to keep getting killed in traffic.

    Even more worrying to me is that the anti comments here are now veering into the same paranoid style you see in right wingers ranting about gas stoves. People here are now arguing that they fear they might be kidnapped by self driving car corporations? WT actual F? That's the same sort of thing as "the government wants you to have electric stoves so they can starve you if you commit wrongthink" thing you see over on the far right wing sites.

    And people are diverging into demanding that self driving cars fix other problems before they're introduced. Can't have self driving cars if there's pollution, or roads, or less than 100% perfect constant use of the vehicle without any downtime, or whatever, the important part is that the antis have now defined the end goal as being impossible to achieve and, importantly, being completely separate from the issue of safety that theoretically started this whole discussion. No longer is it just "self driving cars need to be safer than human drivers", now they also need to solve a whole slew of other problems or else nope.

    It's apparent that to many on the anti side the answer is "no to self driving cars" and the rest is just grasping for any excuse and justification that can be invented to reach that answer. And the traffic massacre goes on. 120 a day. Every day. Forever if the antis have their way.
    posted by sotonohito at 8:10 AM on October 9, 2023


    I'm just asking for tighter reporting requirements for "contact incidents," really, but your strawpeople are fun to look at, at least.
    posted by mediareport at 9:13 AM on October 9, 2023 [1 favorite]


    There is literally a person talking about being kidnapped by self driving cars only a few comments before mine. Not straw at all.

    I'm also strongly in favor of more laws mandating transparency and incident reports.
    posted by sotonohito at 9:44 AM on October 9, 2023


    And people are diverging into demanding that self driving cars fix other problems before they're introduced. Can't have self driving cars if there's pollution, or roads, or less than 100% perfect constant use of the vehicle without any downtime, or whatever

    We have limited resources, both financial and space in population centers, to spend on transportation options. Committing those resources to further advance individual vehicles (as opposed to trams or autonomous buses or other mass transit options) does impact stuff like pollution, the amount and type/quality of space in dense cities dedicated to roads vs pedestrians or bicycles, etc. These issues are fully relevant to a discussion of rules and regulations around autonomously driven cars, their development, and their deployment.
    posted by eviemath at 11:50 AM on October 9, 2023 [1 favorite]


    It is not POSSIBLE to jump instantly from no self driving cars to perfect self driving cars. If there are ever going to be perfect self driving cars then there is, unavoidably, going to be a period wherein imperfect self driving cars exist on public roads.

    The Red Queen Hypothesis says that as the issues that cause us to call self-driving cars "imperfect" are fixed, new imperfections will arise. The Greater Idiot principle assures us that there will be newer and more interesting ways of finding the blind spots in robot cars. The latter is the engine of the former, so the idiots/victims will always lead the problem solvers. This is why I always say there are never going to be "perfect self driving cars." It has nothing to do with the fact that I'm not personally Elon Musk Super Genius or any kind of scientician really; it's the relationship between nature and technology, and there's always a gap between nature and what tech can model of it. Always. The map is still not the territory, even if it's a territory-sized map.

    The only question, as I see it, is what level of harm we are willing to accept.
    posted by rhizome at 1:06 PM on October 9, 2023


    Around 44,000 die every year from traffic accidents in the USA alone. That's close to 120 a day.

    It is clear that as long as humans are driving, traffic fatalities are going to continue.


    What's every bit as clear to me as that apparently is to you is that as long as large numbers of multi-ton machines continue to operate at high speed in close proximity to large numbers of human beings, traffic deaths and maimings will likewise continue.

    That having more of those machines operated by robots than apes would lower the fatality rate remains a conjecture as yet unproven. A conjecture, furthermore, that I have relevant industry experience to recognize as run-of-the-mill self-serving self-deluding tech-bro disruptor horseshit.

    In any case, it is manifestly obvious that the present fatality rate is acceptable. Were that not the case, we would not be accepting it. Rather, we would long since have re-organized ourselves in ways that seek to reduce rather than increase the proximity of large numbers of high-speed multi-ton machines to large numbers of human beings.

    Personally I disagree with the prevailing consensus. To me, the present fatality rate is unqualifiedly appalling and our ongoing failure to take it seriously more so. We've done it with smoking, we're doing it with climate change, and we should be doing it with car culture as well.

    And in case it's not abundantly clear, I have no confidence whatsoever that we're likely to bring death and injury rates down by ceding direct control over the juggernauts we continue to design our lives around to corporations whose time is valuable to us and will be with us as soon as a consultant becomes available.

    Arguments in favour of taking and supporting that approach, especially when doing so is presented as a "realistic" alternative to what's actually required, look to me like bandaid makers spruiking their product as the best and most promising approach to treating gutshots. Just, no.

    It's not about making the perfect the enemy of the good. It's about making the necessary the enemy of the completely fucking inadequate.
    posted by flabdablet at 7:14 AM on October 10, 2023


    And in case it's not abundantly clear, I have no confidence whatsoever that we're likely to bring death and injury rates down by ceding direct control over the juggernauts we continue to design our lives around to corporations whose time is valuable to us and will be with us as soon as a consultant becomes available.

    The fact of the matter is that, in their currently limited domain, the two companies operating fleets of AVs at any kind of scale are already outperforming human drivers to at least some degree. We don't have the data yet to say for sure that it isn't just a statistical fluke, but it is promising. I tend to think that it's not because I know from personal experience that there are a large number of "contact incidents" involving human drivers that never get reported. I wish that every state Waymo and Cruise operate in had the kind of reporting requirements that California does, but the data from California is sufficient to draw preliminary conclusions. Between them they are driving millions of miles a year just in that limited area of that one single state.
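
To give a feel for why single-digit incident counts over a few million miles still only support preliminary conclusions, here is a rough Poisson-style comparison with invented example numbers (these are not the actual California figures):

```python
# Rough illustration with invented numbers; the point is how wide the
# uncertainty stays at the "millions of miles" scale.
import math

av_miles = 5e6     # assumed AV miles in the reporting period
av_crashes = 6     # assumed injury-causing incidents in that period
human_rate = 1.5   # assumed human benchmark, incidents per million miles

million_miles = av_miles / 1e6
av_rate = av_crashes / million_miles

# Normal approximation to a 95% interval on a Poisson count.
half_width = 1.96 * math.sqrt(av_crashes)
low = max(av_crashes - half_width, 0) / million_miles
high = (av_crashes + half_width) / million_miles

print(f"AV rate: {av_rate:.2f} per million miles (95% CI ~{low:.2f} to {high:.2f})")
print(f"Human benchmark (assumed): {human_rate:.2f} per million miles")
# The interval (~0.24 to ~2.16 here) easily straddles the assumed human
# benchmark, which is why a season of driving data is promising but not
# yet conclusive either way.
```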

    I'm as much in favor of cutting the amount of miles driven in cities as anyone, and would certainly prefer we tax the shit out of large companies and use that money to do transit better, but I'm also not going to let the perfect be the enemy of the good in the meantime. It takes decades to revamp cities. It is certainly doable, but it takes time. There is right now a model that can be largely rolled out in this decade that can drastically reduce the number of collisions in our cities and their severity. I'm not going to scoff at that just because that one megalomaniac asshole keeps calling his cars with a worse than average ADAS a self driving car.
    posted by wierdo at 9:14 AM on October 10, 2023


    the data from California is sufficient to draw preliminary conclusions

    I would really appreciate reading some thorough reporting about that. Do you have a reference? The stuff I've read is much more ambiguous.
    posted by Nelson at 9:52 AM on October 10, 2023 [1 favorite]


    I wish that every state Waymo and Cruise operate in had the kind of reporting requirements that California does

    That's kind of hilarious, because California's reporting requirements are ridiculously lax, for instance not requiring companies to report "bricking," when the AVs just stop functioning in the middle of the street:

    Since AVs hit the streets of San Francisco, there have been numerous instances of vehicles malfunctioning and stopping in the middle of the street — referred to as “bricking” — blocking the flow of traffic, public transit and emergency responders...“Several minutes of driverless AV stalled on Muni rail tracks can cause several hours of disrupted transit service.”

    ...Today, AV companies are only required to report collisions and not the many incidents of bricking.


    Neat! There's no requirement to report bricking incidents at all to California regulators. That's quite a model you're hoping goes to other states. And according to the president of the San Francisco Board of Supervisors, the city isn't even allowed to give AV companies tickets when their vehicles break basic traffic laws, like not stopping at a stop sign:

    “We’ve seen plenty of these things—making illegal turns or rolling through a stop sign—and we can’t even ticket the car or send a ticket to the company, because state law doesn’t allow it,” said Aaron Peskin, president of the San Francisco Board of Supervisors. “All these people are calling me and complaining about it, and they don’t believe it when I tell them that I have as much authority over autonomous vehicles as I do over the price of tires in Brazil.”

    These are extremely basic safeguards and incentives that the AV companies' lobbying money and connections have allowed them to circumvent:

    Getting signoff from the state utilities commission on 24/7 operations in San Francisco required a multiyear, multimillion-dollar lobbying campaign waged by two of the biggest companies in the world: Waymo is owned by Alphabet, the parent company of Google, and Cruise is owned by General Motors...

    [Cruise’s] state-level strategy has been orchestrated by Jason Kinney—one of Gov. Gavin Newsom’s closest associates and the man whose birthday prompted the governor’s infamous French Laundry dinner during the pandemic.


    The gaping holes in reporting requirements - yes, even in California - and the lack of transparency and accountability for mishaps are appalling.
    posted by mediareport at 10:43 AM on October 10, 2023 [6 favorites]


    ...when the AVs just stop functioning in the middle of the street:
    During Monday’s meeting, Nicholson said that the SFFD and other city agencies had been told by Waymo and Cruise that if an AV bricks in an active emergency zone, first responders can get in touch with their customer service and AV field support teams, or they can get in and take over the vehicle to move it.

    “It is not the responsibility of my people to get in one of your vehicles and take it over,” Nicholson said. “It is the responsibility of the autonomous vehicle companies to not have them impact us in the first place . . . our folks cannot be paying attention to an autonomous vehicle when we’ve got ladders to throw.”

    Cruise is working to get regulatory approval to deploy its purpose-built Origins, which are built without steering wheels or pedals. Those would be impossible for a first responder to move without towing them away.
    But of course what primarily motivates us is concern for public safety. /s

    See, this is the thing about engineers. Once an engineer decides they understand a problem they're going to go in hard to solve that problem. It's the nature of the profession. And engineers are good at solving problems whose scope they actually comprehend. Have to be, or nobody would pay us our asking rate.

    What we're not at all good at is having the humility to admit that there's more to a problem space than we'd ever considered before committing time, skill and large amounts of other people's money to addressing whatever little corner of it first attracted our attention. I have yet to meet an engineer, or anybody who has ever worked in software engineering, whom I could not fairly describe as exhibiting this kind of tunnel vision once the itch to solve an identified problem has begun to be seriously scratched.

    And you know, when this monomaniacal focus works it works really really well. We have endless amounts of cool tech available to all of us now exactly because engineers are that kind of person.

    But the older I get, the more alarmed I become about people who are so sure they know what they're doing, even when it's blindingly obvious that actually they have only the tiniest shred of a little piece cut from an obscure corner of a clue. It's no accident that "engineer's disease" is a ha ha only serious label for the worst kind of hubris.

    I see the combination of NIMBY and luddism as so baffling

    The main thing to bear in mind about the Luddites is that the concerns they were addressing in the only way open to them were completely legitimate, that the reactionary forces brought to bear to crush them were off-the-scale disproportionate, and that the complete contempt in which their concerns were held and continue to be held is indisputably the origin story of the hyper-capitalist hellscape we find ourselves having to navigate today.

    So yeah, I'm a neo-luddite and proud of it.
    posted by flabdablet at 11:38 AM on October 10, 2023 [6 favorites]


    It's hilarious that fire departments (is he really using 'ladders to throw'? LOL) are angry about autonomous car procedures when they, with their giant pointless trucks, are one of the main barriers to any pedestrian or biking improvements.

    And why do they even bring that big truck to car accidents, which are 85% of firefighters' job now (they aren't fire fighters anymore, they're 'car accident responders')? Because it's to protect them from being hit by other drivers crashing into them while they are working another accident.

    And of course a bricked car is the fire department's biggest issue in getting to other accidents, as though all roads are always free of traffic. They are probably going to advocate for another specialized lane for that. In the name of public safety.
    posted by The_Vegetables at 12:38 PM on October 12, 2023


    ^ I think the truck is necessary because the most versatile "Jaws of Life" extrication tools are heavy.
    posted by Iris Gambol at 3:20 PM on October 12, 2023 [2 favorites]


    Time for hack comedians to retire the ol' "No one ever said 'fuck the Fire Department!'" line I guess.
    posted by Lentrohamsanin at 5:31 PM on October 12, 2023 [4 favorites]


    it's to protect them from being hit by other drivers crashing into them while they are working another accident

    I'm a little confused. Are you suggesting that their work on those other crashes would be more efficient or effective if they didn't have the truck there to protect them? Or that saving lives after a crash is something that should get a lower priority than commuter and/or pedestrian convenience?
    posted by flabdablet at 6:15 PM on October 12, 2023 [3 favorites]


    The California DMV today revoked Cruise's permit to operate without a driver after it was revealed that Cruise did not provide footage of its vehicle dragging the woman 20 feet after it initially stopped.
    posted by oneirodynia at 12:16 PM on October 24, 2023 [5 favorites]


    Was just coming here to add that. Cruise straight up withheld information, presumably from the emergency response folks as well as the state authorities. I ... do not feel any more inclined to support them entering the Seattle market (supposedly they have some pilot authorization here).
    posted by R343L at 12:23 PM on October 24, 2023 [1 favorite]


    WTF were they thinking withholding the video? Did they hope they could get away with it? So sloppy they forgot to hand it over?

    A huge problem for these autonomous vehicle companies is convincing regulators that not only are they safe, but that their designers are responsible companies and the cars will work well in the regulatory regime. Hiding evidence of your car nearly killing someone is not a good way to do that.
    posted by Nelson at 12:49 PM on October 24, 2023 [4 favorites]


    Aaron Gordon on Bluesky says Cruise is disputing the claim that they withheld video initially. The DMV has said they stand by the order of suspension. Of course, the DMV folks have basically zero reason to lie since if it comes out they lose their jobs. Whereas corporations get away with lying with minimal consequences all the time and there’s a long history of corporations doing so to prevent or delay regulatory action.
    posted by R343L at 12:58 PM on October 24, 2023 [2 favorites]


    “Ultimately, we develop and deploy autonomous vehicles in an effort to save lives,” Lindow said. “In the incident being reviewed by the DMV, a human hit and run driver tragically struck and propelled the pedestrian into the path of the AV. The AV braked aggressively before impact and because it detected a collision, it attempted to pull over to avoid further safety issues. When the AV tried to pull over, it continued before coming to a final stop, pulling the pedestrian forward. Our thoughts continue to be with the victim as we hope for a rapid and complete recovery.”
    That said, our legal team has advised us not to provide her with any kind of direct monetary compensation and we will regretfully be taking that advice.

    We appreciate her contribution to our ongoing research effort and remain quite sure that in years to come she will look back at this incident and say to herself "it's so satisfying to know that Cruise has learned so much from what they did to me that they have done exactly the same thing to other people no more than six times over the last ten years and scarcely did it at all last year."
    posted by flabdablet at 3:55 PM on October 24, 2023 [1 favorite]


    As a person who is very firmly in the self driving vehicles are a good thing camp: Fuck Cruise.
    posted by sotonohito at 8:26 PM on October 24, 2023


    A huge problem for these autonomous vehicle companies is convincing regulators that not only are they safe, but that their designers are responsible companies and the cars will work well in the regulatory regime.

    A huge problem for these companies is convincing everybody that their trade secrets aren't a matter of public interest, like GM saying that they just can't tell you what the torque specs are for their seatbelt bolts. Sorry, they would if they could.
    posted by rhizome at 1:20 AM on October 25, 2023 [2 favorites]


    Thank you for the update on this. Shocking that they hid footage.
    posted by tiny frying pan at 5:15 AM on October 26, 2023


    more on the update via the LA Times:

    A Cruise car hit a pedestrian. The company's response could set back California's new robotaxi industry

    [It is curious to me that they only mention very late in the article that the victim was first hit by a human driver who fled the scene.]
    posted by chavenet at 9:09 AM on October 26, 2023 [1 favorite]


    I think it's horrifying that they hid the footage, but not shocking. You expect a corporation to be utterly immoral and to lie, cheat, and steal if that improves profits.

    Which is why we need some really draconian laws, with big whistleblower rewards and random unscheduled inspections, mandating total openness from all corporations.
    posted by sotonohito at 12:31 PM on October 26, 2023 [2 favorites]


    I meant shocking in exactly the same way you meant horrifying 🤷🏻
    posted by tiny frying pan at 1:05 PM on October 26, 2023 [2 favorites]




    Good
    posted by They sucked his brains out! at 11:14 AM on October 29, 2023




    This thread has been archived and is closed to new comments