submitted 5 days ago* (last edited 5 days ago) by lgsp@feddit.it to c/fuckcars@lemmy.world

WaPo journalist verifies that robotaxis fail to stop for pedestrians in a marked crosswalk 7 out of 10 times. Waymo admitted that it follows "social norms" rather than laws.

The reason is likely to compete with Uber, 🤦

WaPo article: https://www.washingtonpost.com/technology/2024/12/30/waymo-pedestrians-robotaxi-crosswalks/

Cross-posted from: https://mastodon.uno/users/rivoluzioneurbanamobilita/statuses/113746178244368036

[-] Kitathalla@lemy.lol 20 points 5 days ago

The reason is likely to compete with Uber, 🤦

A few points of clarity, as I have a family member who's pretty high up at Waymo. First, they don't want to compete with Uber. Waymo isn't really concerned with driverless cars that you or I would own or use, and they don't want (at this point, anyway) to start a new taxi service. Right now you order an Uber and a Waymo car might show up. They want the commercial side of the equation: how much would Uber pay to not have to pay drivers? How much would a shipping company fork over when it can jettison its $75k-$150k drivers?

Second, I know for a fact that the upper management was pushing for the cars to drive like this. I can nearly quote said family member opining that if the cars followed all the rules of the road, they wouldn't perform well, couching it in the language of 'efficiency.' It was something like, "being polite creates confusion in other drivers. They expect you to roll through the stop sign or turn right ahead of them even if they have right of way." So now the waymo cars do the same thing. Yay, "social norms."

A third point is that, as someone else mentioned, the cars are now trained, not 'programmed' with instructions to follow. Said family member spoke of when they switched to the machine learning model, and it was better than the highly complicated (and I'm dumbing down my description because I can't describe it well) series of if-else statements. With that training comes the issue of the folks in charge of things not knowing exactly what is going on. An issue that was described to me was their cars driving right at the edge of the lane, rather than in the center of it, and they couldn't figure out why or (at that point, anyway) how to fix it.

As an addendum to that third point, the training data is us, quite literally. They get and/or purchase people's driving data; I think at one time it was actual video, not sure now. So if 90% of drivers blast through a red light the moment it changes when they can, it's likely you'll see the same from Waymo eventually. It's a weakness that ties right into that 'social norm' thing. We're not really training safer driving by having machine drivers; we're just removing some of the human factors like fatigue or attention deficits. Again, as I get frustrated with the language of said family member (and I'm paraphrasing), 'how much do we really want to focus on low-percentage occurrences? Improving the miles-per-collision number works best on the big things.'

[-] leadore@lemmy.world 11 points 5 days ago

Then maybe they should make sure to train them with footage and/or data of drivers who are following the traffic laws instead of just whatever drivers they happen to have data from.

Do they review all this training data to make sure data from people driving recklessly is not being included? If so, how? What process do they use to do that?

[-] trolololol@lemmy.world 6 points 4 days ago

Hmmm, yeah, no surprises there, and I like how you articulated it all really well.

On the social norm thing, it's still a conscious decision how much they invest in teaching their AI to distinguish good from bad behavior. In AI speak, you can totally mark adequate behavior with rewards and bad behavior with penalties, and the car shifts its behavior in the right direction. You can't predict how it fine-tunes a specific behavior (like that lane-edge thing) unless you're willing to start from scratch if necessary, but overall that's how you teach it that running a red light is a big no-no: penalties, and if those aren't enough, start over.
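That reward/penalty idea can be sketched in a toy form. This is a minimal illustration of reward shaping, not anything Waymo-specific; every behavior name and number here is made up:

```python
# Toy reward shaping: penalize running a red light, reward stopping.
# All behaviors and reward magnitudes are hypothetical.
REWARDS = {
    "stop_at_red": +1.0,
    "run_red_light": -100.0,      # large penalty: the "big no-no"
    "stop_for_pedestrian": +5.0,
    "skip_crosswalk_stop": -50.0,
}

def update_value(value: float, action: str, lr: float = 0.1) -> float:
    """Nudge the learned value of a behavior toward its reward."""
    return value + lr * (REWARDS[action] - value)

# After repeated penalties, the learned value of running reds
# sinks toward -100, so the policy learns to avoid it.
v = 0.0
for _ in range(50):
    v = update_value(v, "run_red_light")
```

The point of the sketch: which behaviors get a large penalty (crosswalks? lane edges?) is a human choice made up front, which is why "how much they invest" in marking bad behavior matters.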

[-] astronaut_sloth@mander.xyz 7 points 5 days ago

A third point is that, as someone else mentioned, the cars are now trained, not ‘programmed’ with instructions to follow.

As an addendum to that third point, the training data is us, quite literally.

Yeah, that makes sense. I was in SF a few months ago, and I was impressed with how the Waymos drove: not so much the driving quality (which seemed remarkably average) as how lifelike the driving was. They still seemed generally safer than the human-driven cars.

Improving the ‘miles per collision’ is best at the big things.

Given the nature of reinforcement learning algorithms, this attitude actually works pretty well. Obviously, it's not perfect, and the company should really program in some guardrails to override the decision algorithm when it makes an egregiously poor decision (like, y'know, not stopping at crosswalks for pedestrians), but it's actually not as bad or ghoulish as it sounds.

[-] Kitathalla@lemy.lol 7 points 5 days ago

but it’s actually not as bad or ghoulish as it sounds

We'll have to agree to disagree on that one. I think decisions made solely to keep the company's costs as low as possible, while actively choosing not to care about low-probability but devastating failures, are ghoulish. We've all seen Fight Club, right? If A > B, where A = cost of a recall and B = cost of paying out × chance of occurrence, then no recall.
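That Fight Club recall rule is just an expected-value comparison, and it can be written out literally. A quick sketch (all dollar figures and probabilities are hypothetical):

```python
def do_recall(recall_cost: float, payout_cost: float, incident_prob: float) -> bool:
    """Recall only if it's no more expensive than the expected payouts.

    A = recall_cost, B = payout_cost * incident_prob; "if A > B, no recall".
    """
    expected_payout = payout_cost * incident_prob
    return recall_cost <= expected_payout

# A rare-but-devastating failure: a $10M payout at 1-in-10,000 odds has an
# expected cost of only $1,000, so a $5M recall "loses" under this rule.
rare_devastating = do_recall(5_000_000, 10_000_000, 1e-4)   # False: no recall
```

Which is exactly the objection: the formula never prices in that the low-probability event is someone getting hit in a crosswalk.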

this post was submitted on 31 Dec 2024
543 points (96.9% liked)
