[HN Gopher] Mercedes beats Tesla to autonomous driving in Califo...
       ___________________________________________________________________
        
       Mercedes beats Tesla to autonomous driving in California
        
       Author : belter
       Score  : 226 points
       Date   : 2023-06-10 13:45 UTC (9 hours ago)
        
 (HTM) web link (www.theregister.com)
 (TXT) w3m dump (www.theregister.com)
        
       | belter wrote:
       | "Conditionally automated driving: Mercedes-Benz DRIVE PILOT
       | further expands U.S. availability to the country's" -
       | https://media.mercedes-benz.com/article/81a29ac5-4d02-4b58-b...
       | 
       | "Mercedes-Benz DRIVE PILOT is the world's only SAE Level 3 system
       | with internationally valid type approval. It builds on a very
       | robust foundation, setting new industry standards. DRIVE PILOT
       | uses a highly sophisticated vehicle architecture based on
       | redundancy with a multitude of sensors enabling comfortable and
       | safe conditionally automated driving. The certification by the
       | authorities in California and in Nevada once again confirms that
       | redundancy is the safe and thus the right approach."
       | 
       | "...Mercedes-Benz is focusing on SAE Level 3 conditionally
       | automated driving with the ultimate goal of driving at speeds of
       | up to 80 mph (130 km/h) in its final iteration..."
       | 
       | "Mercedes-Benz S-Class DRIVE PILOT Sensors Details" -
       | https://youtu.be/9m-VS55w9HA
        
         | cj wrote:
         | >"Mercedes-Benz S-Class DRIVE PILOT Sensors Details" -
         | https://youtu.be/9m-VS55w9HA
         | 
         | - LiDAR
         | 
         | - Radar
         | 
         | - Cameras
         | 
         | - Ultrasonic sensor
         | 
         | - Road moisture sensor
         | 
          | Curious how this collection of sensors compares with Tesla and
          | also with run-of-the-mill cars with ACC / lane keeping.
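          | 
          | For what it's worth, the "redundancy" pitch in the press
          | release boils down to requiring agreement between independent
          | sensor types before trusting a detection. A minimal sketch
          | (purely illustrative; the names and numbers are made up, not
          | Mercedes' actual logic):
          | 
          |     from dataclasses import dataclass
          | 
          |     @dataclass
          |     class Detection:
          |         sensor: str        # "lidar", "radar", "camera", ...
          |         distance_m: float
          |         confidence: float
          | 
          |     def fused_obstacle_distance(detections, min_sources=2):
          |         """Accept an obstacle only if independent sensor types
          |         agree; return a confidence-weighted distance, or None
          |         (the system would then degrade / hand back control)."""
          |         if len({d.sensor for d in detections}) < min_sources:
          |             return None  # no redundancy: don't trust one modality
          |         total = sum(d.confidence for d in detections)
          |         return sum(d.distance_m * d.confidence
          |                    for d in detections) / total
          | 
          |     readings = [Detection("lidar", 42.0, 0.9),
          |                 Detection("radar", 41.5, 0.8),
          |                 Detection("camera", 44.0, 0.6)]
          |     print(fused_obstacle_distance(readings))  # ~42.3 m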
        
           | stefan_ wrote:
           | It seems a bit suboptimal to place LiDAR in the front and
           | low. More like a lane following kinda setup.
        
           | belter wrote:
           | https://youtu.be/BFdWsJs6z4c
        
             | cj wrote:
             | I'm aware of Elon's opinion: "LiDAR is lame. LiDAR is lame.
             | Lame. Losers use LiDAR. LiDAR is expensive."
             | 
             | I can't stomach watching a video that starts out like that.
        
               | pyinstallwoes wrote:
               | Do you use LiDAR when you drive?
        
               | timeon wrote:
               | Do you use camera when you drive?
        
               | fallingknife wrote:
               | I use 2
        
               | [deleted]
        
               | tgv wrote:
                | How much processing power do you use to process the input
                | and control the car? You've got more of it in your head
                | than whatever Tesla puts in their cars. Your cameras are
                | also better.
        
               | ChuckNorris89 wrote:
               | No, but at the same time Boeings and Airbuses don't flap
               | their wings and birds don't use jet engines to fly.
               | 
               | Just because nature and millions of years of evolution
               | have solved a problem in a way that looks simple on the
               | surface, doesn't mean the same thing can be copied with
               | current tech to solve the same problem.
               | 
               | Elon's "but humans drive with only use two eyes therefore
               | we can do it with cameras" is the most moronic argument
               | ever and it saddens me when I hear others here parrot it.
               | 
                | Our two eyes may be enough to drive, but our eyes can
                | move in their sockets, our heads can move around to give
                | parallax for depth perception, our retinas have far
                | better dynamic range than any commercial sensor Tesla is
                | using, and most importantly, our brains are orders of
                | magnitude more intelligent at reasoning about new or
                | unknown situations than current self-driving tech, which
                | is just basic pattern matching in comparison.
        
               | carlmr wrote:
               | It's moronic because it's motivated reasoning.
        
         | ajross wrote:
         | It's also worth pointing out that this is an extremely limited
         | system:
         | 
         | > On suitable freeway sections and where there is high traffic
         | density, DRIVE PILOT can offer to take over the dynamic driving
         | task, up to speeds of 40 mph.
         | 
         | Basically, this will follow the car in front of you in slow
         | freeway traffic. It won't navigate, it won't even change lanes.
         | It won't work (and it's not clear to me how it disengages) when
         | traffic speeds up.
         | 
         | That's not useless. There's a reasonable argument that this is
         | the product the market wants. But the "Germans beat Tesla"
         | framing here is really quite spun. Autonomy and capability
         | aren't the same thing.
        
           | Vespasian wrote:
           | You forgot the best "feature": It allows you to take your
           | hands off the wheel and your eyes off the road.
           | 
            | This is significant as it means MB takes on liability and
            | will give you sufficient time (I think 10 seconds in the
            | European version) to take over.
           | 
           | In the world of corporate liability this is huge even if it
           | isn't a technical achievement by itself.
        
             | justapassenger wrote:
              | Only people who don't understand the problem domain think
              | it's not a technical achievement.
        
               | ajross wrote:
               | I'll give you 2:1 odds that before this reaches market
               | (looks like end of the year for deliveries per the
               | article) Tesla will duplicate the stunt and lift the
               | driver monitoring requirements in similar situations.
               | 
               | (Edit: two replies here playing gotcha games with prior
               | FSD announcements. But I'm serious: Tesla can do what
               | this Mercedes does already, essentially perfectly, and
               | has been for the two years I've operated the vehicle. The
               | cost to them of duplicating this stunt is near zero. I'll
               | bet you anything they do it.)
        
               | noAnswer wrote:
                | They claimed in 2018 that in 2019 you would be able to
                | let your Tesla work for you as a robotaxi and make
                | 300,000 a year. Why would they still wait?
                | 
                | Why did Elon Musk wait so long to say the videos showing
                | "him" saying "We will have full self driving in 201x" are
                | deep fakes?
        
               | justapassenger wrote:
                | Tesla, who have been claiming "full self driving next
                | year" for more than half a decade now?
               | 
               | Cool, let's wait.
               | 
               | I think it shows how little you understand the problem,
               | to call it a "stunt".
        
             | panick21_ wrote:
              | There are lots of level 2 systems where you can take your
              | hands off the wheel. That's not the definition of level 3.
        
           | kelnos wrote:
           | The interesting thing is it's roughly the same system my 2022
           | Mercedes has, except it requires my hands on the wheel (and
           | does not disengage over 40mph). I mean, this is just adaptive
           | cruise with lane keeping.
           | 
           | Really the big thing here is Mercedes is saying it's good
           | enough (under 40mph) that the driver doesn't have to pay
           | attention at all.
        
       | guerby wrote:
       | I searched and it looks like the Mercedes L3 is available for
       | 7430 EUR on top of 149900 EUR for the base EQS 450+ model (link
        | below, in French).
       | 
       | Anyone looked if there's a cheaper way to get this Mercedes L3?
       | 
       | https://www.lesnumeriques.com/voiture/drive-pilot-mercedes-c...
       | 
       | https://www.lesnumeriques.com/voiture/essai-mercedes-eqs-450...
        
       | lvl102 wrote:
        | It doesn't take a lot to retrofit roads with autonomous
        | guardrail markers when we repave. It's going to take a federal-
        | level effort to accomplish this (which should have been a part
        | of the pandemic infrastructure stimulus).
       | 
       | I strongly believe this would improve safety for autonomous
       | vehicles by 1000x.
        
         | justapassenger wrote:
          | Detecting lanes is a mostly solved problem. Detecting AND
          | predicting what other things on the road will do (not only
          | cars, but people, animals, trash, etc.) is not.
        
           | pfannkuchen wrote:
           | > predicting what other things will do on the road... [like]
           | trash
           | 
           | At least we do have a model for this one (physics). I wonder
           | what additional signals the car would need to accurately
           | simulate trash movement? Wind?
        
             | MattRix wrote:
             | I don't think you could ever accurately simulate trash
             | movement.
             | 
              | Trash is irregularly shaped; even with the highest-
              | resolution cameras, it'd be impossible to know what the far
              | side of the trash is shaped like. Then on top of that
              | there's no way to predict highly localized gusts of wind.
             | 
             | The best way to deal with trash is to just have an ML model
             | deal with lots of it so it can make predictions about what
             | it is likely to do (which is basically what Tesla is
             | already doing).
        
             | justapassenger wrote:
              | Wind, yes. But even without it, things get tricky when
              | you're traveling at 80 mph. You need to observe enough of
              | the movement to be able to predict the future path. It's
              | not easy, even with very advanced sensors.
              | 
              | And you cannot "play it safe and assume it'll collide with
              | us". You get phantom braking, which is very dangerous.
        
           | ikekkdcjkfke wrote:
            | Have you seen the video where a Tesla tries a hard left into
            | the guard rail on a freeway because of construction/redirect
            | lines going to the left?
        
         | aedocw wrote:
         | Lane keeping is not the hard part, that's been solved for a
         | while. It's dealing with pedestrians and other drivers doing
         | unpredictable things. Slow moving infrastructure projects will
         | not help that at all.
        
         | zaroth wrote:
         | I used to think that V2X road infrastructure was a major
         | necessary component for self driving. I no longer hold that
         | opinion.
         | 
         | The "self-driving" infrastructure would not be magically any
         | better than the "human-driving" infrastructure, and how to
         | handle conflicting data from the infra in terms of sensor
         | fusion is not at all clear.
         | 
         | In short, it creates as many problems as it solves, and doesn't
         | really solve the problems that it sets out to in the real-
         | world.
         | 
          | Even simpler things like supplemental signaling for special
          | circumstances such as road work and emergency vehicles don't
          | help much: if they are not used properly and consistently in
          | all cases, the system still needs to handle those cases
          | correctly on its own.
        
           | lvl102 wrote:
            | I definitely agree the state of the art is good enough for
            | 97% of driving scenarios, but it's also not fail-safe and
            | robust. For autonomous vehicles to be truly acceptable, they
            | need to be 100x safer. No, nearly flawless. To accomplish
            | that you need physical and hardcoded methods as well. In
            | addition, and equally important, road infrastructure gives
            | you an improved ability to coordinate cars on the road
            | without relying on individual and disparate compute units.
        
         | kbos87 wrote:
         | There's just no need for this. Tesla can detect even the most
         | poorly defined edges of a dirt road without a problem.
        
           | belter wrote:
            | Maybe they should start by detecting children? Luminar Tech
            | can... - https://youtu.be/3mnG_Gbxf_w
        
       | throwaway_ab wrote:
        | Many people are saying Tesla FSD is far more advanced, and I've
        | seen videos of a Tesla driving around LA for 2 hours completely
        | autonomously, so I agree Tesla FSD is the world leader by far and
        | blows what Mercedes has built out of the water.
       | 
       | However Mercedes are taking liability whilst their system is in
       | use which implies Tesla takes no liability whilst their system is
       | in use.
       | 
       | I'm surprised Tesla is scared to take legal responsibility for
       | their system and I am surprised lawmakers are allowing autonomous
       | systems when the manufacturer doesn't believe in it enough to
       | take responsibility whilst it's in use.
       | 
       | How is Tesla getting away with this?
       | 
       | Of course they can beat the competition especially if they do not
       | need to take legal responsibility for any deaths/accidents that
       | occur when it's in use.
        
         | notyourwork wrote:
          | If I were Tesla, why would I take responsibility for something
          | no one is forcing me to take responsibility for? Lawmakers are
          | responsible for this charade, and it's embarrassing to me that
          | a company can be so misleading with its marketing and get away
          | with it.
        
           | wahnfrieden wrote:
           | Both state and capital can be bad at the same time y'know
        
           | TulliusCicero wrote:
           | Because it means the feature is actually useful.
           | 
           | If the human "driver" is liable for what an autonomous car
           | does, it means you have to watch the car like a hawk. At that
           | point, may as well be driving.
        
           | prmoustache wrote:
           | I wouldn't do any business with you for example if that is
           | your default level of ethics.
        
             | jchoca wrote:
             | You say that, but it's how most companies operate
             | unfortunately.
        
             | GreedClarifies wrote:
             | OK then don't!
             | 
             | That's the magic of the market! People have different
             | values.
        
         | radomir_cernoch wrote:
         | > [...] blows what Mercedes has built out of the water.
         | 
         | Mercedes FSD prototype, 10 years ago:
         | https://youtu.be/G5kJ_8JAp-w
        
           | elif wrote:
           | In that video they mention doing localization based upon a
           | prebuilt map of the route by matching images to the model 10
           | times per second.
           | 
            | That is, by definition, not FSD. That is, like the system
            | announced today, limited-route autonomy.
           | 
           | For comparison, FSD v3 (they are shipping v4 in every vehicle
           | now) performs localization 2,000 times per second based upon
           | a hybrid model of every road in open street maps and a
           | generalized model of roads. That is why it is FULL. Even if
           | you are on an unmapped brand new road built yesterday, it
           | will know how to drive appropriately.
        
             | akmarinov wrote:
             | They aren't shipping v4 in the model 3, so no on the "every
             | vehicle"
        
               | elif wrote:
               | https://www.teslaoracle.com/2023/06/07/tesla-
               | model-3-highlan...
        
               | akmarinov wrote:
               | That's not shipping...
               | 
               | If you buy a model 3 TODAY you get V3
        
           | ChuckNorris89 wrote:
           | _> Mercedes FSD prototype, 10 years ago:_
           | 
           | Mercedes FSD prototype 1986 to 1994 via the 400 Million Euro
           | EU funded Prometheus project
           | https://www.youtube.com/watch?v=I39sxwYKlEE
           | 
            | It's funny that German and Italian researchers and car
            | makers had the early lead on self-driving tech and then lost
            | it by shelving the tech. Oof.
            | 
            | Which reinforces a point I made in another thread here today:
            | innovation in the EU only happens as long as it's government
            | funded, and as soon as the funding stops, work stops and
            | everything gets shelved, instead of private industry picking
            | up the slack and funding it further to commercialize it like
            | in the US. Sad.
           | 
           |  _"It's possible that [Germany] threw away its clear vanguard
           | role because research wasn't consistently continued at the
           | time," Schmidhuber said. He added that carmakers might have
           | shied away from self-driving technology because it seemed to
           | be in opposition to their marketing, which promoted the idea
           | of a driver in charge of steering a car. "_
        
             | moffkalast wrote:
             | > What do they have now?
             | 
             | > > Mercedes sprinter
             | 
             | I don't know why but that is hysterical.
        
             | jeffreygoesto wrote:
              | Ernst Dickmanns. Legend. Sat close to him at a CVPR and he
              | could not resist ranting "How's that new? We did that in
              | the '90s!" =:-D
        
             | constantcrying wrote:
             | >It's funny that German, and Italian researchers and car
             | makers had the early lead on self driving tech and then
             | lost it by shelving the tech. Oof.
             | 
              | Actually a very common occurrence. I don't think FSD at
              | today's level was possible in '94, and the project's
              | failure was inevitable unless it had been continuously
              | funded for at least 15 more years.
             | 
             | >innovation only happens in the EU as long as it's
             | government funded and as soon as the funding stops
             | 
             | Seems like a bad example. Funding stopped because the
             | technology didn't work.
        
           | MattRix wrote:
           | Yeah that's nowhere close. It's easy to make a prototype that
           | looks good in a marketing video while driving a very tightly
           | mapped route. It's a whole other thing to let anyone use self
           | driving tech anywhere, especially on routes it has never seen
           | before.
        
             | jeffreygoesto wrote:
              | That was ten years ago, remember. All I can say is that
             | these guys are extremely knowledgeable, kind and an
             | absolute joy to work with. Big shout out to Eberhard,
             | Carsten, Christoph, Clemens and Thao, and to the ones not
             | appearing in the video, like Uwe (enjoy your retirement),
             | David and Henning and a lot others from the chair of
             | Christoph Stiller and from Mercedes research.
        
         | elif wrote:
         | The responsibility already isn't on the car occupants, it is on
         | the occupants' insurance carrier. The only way to meaningfully
         | diminish responsibility on the car occupants is to lower
         | insurance premiums.
         | 
         | To that end, Tesla is offering insurance directly to consumers
         | now, offering lower premiums based upon driver safety system
         | utilization. In my case it would cut my insurance rates in
         | half.
        
         | ajross wrote:
         | > However Mercedes are taking liability whilst their system is
         | in use
         | 
         | Are they? This press release doesn't actually say so. There was
         | an announcement a while back when they deployed this system in
         | Germany, but that's obviously a different legal environment.
         | 
         | FWIW, this is mostly spin anyway. Liability isn't generally
         | something that is granted, it's either there or not. If Tesla
         | FSD causes an accident, they can absolutely be sued for that in
         | the US. And they have been on a handful of occasions
          | (successfully even, I think? Pretty sure there were settlements
         | in some of the early AP accidents?).
         | 
         | The reason that "Tesla is liable for accidents" doesn't make
         | news is... that there are vanishingly few AP/FSD accidents. The
         | system as deployed is safe, but that doesn't match people's
         | priors so we end up in arguments like this about "accepting
         | liability" instead of "it crashed".
        
           | ra7 wrote:
           | > Liability isn't generally something that is granted, it's
           | either there or not. If Tesla FSD causes an accident, they
           | can absolutely be sued for that in the US.
           | 
           | That's not what liability means here. Assuming the story is
           | true, it means Mercedes is responsible for damages caused
           | when the system is engaged and the users know that when they
           | buy the car. Being sued afterwards is not the same thing.
           | 
           | > The reason that "Tesla is liable for accidents" doesn't
           | make news is... that there are vanishingly few AP/FSD
           | accidents.
           | 
           | Or because Tesla actively hides accident data by using
           | suspect methodology and not following regulations about
           | disclosure.
           | 
           | Just a couple of days ago there was a user report of FSD
            | hitting and killing a dog:
            | https://twitter.com/TeslaUberRide/status/1666860361381818384...
           | 
           | Unsurprisingly, it won't show up in any of the stats Tesla
           | publishes in their two-paragraph "safety report". That's
           | because they don't consider any contact that doesn't deploy
           | airbags to be an accident. There are plenty of reports like
           | this that are not being counted.
           | 
           | Not to mention, Tesla is openly violating at least CA DMV
           | regulations that require reporting of all disengagements and
           | contact events.
        
             | rad_gruchalski wrote:
             | Yes, because an accident is when there are personal
             | injuries, otherwise it's a collision.
        
               | ra7 wrote:
               | I'm not aware of any definition of accident that says
               | personal injuries have to occur. But sure, you can
               | replace accident here with collision or contact events.
               | Also, airbag deployment doesn't automatically mean there
               | are injuries either.
               | 
                | Point is, these types of events are not being reported by
                | Tesla, while every other company testing self-driving
                | technology (especially ones that have a CA DMV permit to
                | do so) is reporting them.
        
             | toast0 wrote:
             | > Not to mention, Tesla is openly violating at least CA DMV
             | regulations that require reporting of all disengagements
             | and contact events.
             | 
              | Tesla's letters to the CA DMV claim they don't participate
              | because their system isn't a self-driving system. Which is
              | fine, other than that they're telling customers it is at
              | the same time.
        
               | charcircuit wrote:
                | A word's meaning in a specific legal document can be
                | different from the word's meaning in a product's
                | marketing material.
        
           | enragedcacti wrote:
           | [1] (2) (A) "Autonomous vehicle" means any vehicle equipped
           | with autonomous technology that has been integrated into that
           | vehicle that meets the definition of Level 3, Level 4, or
           | Level 5
           | 
           | [2] (c) The manufacturer has in place and has provided the
           | department with evidence of the manufacturer's ability to
           | respond to a judgment or judgments for damages for personal
           | injury, death, or property damage arising from the operation
           | of autonomous vehicles on public roads
           | 
           | [1] https://leginfo.legislature.ca.gov/faces/codes_displaySec
           | tio...
           | 
           | [2] https://www.dmv.ca.gov/portal/file/adopted-regulatory-
           | text-p...
           | 
           | > FWIW, this is mostly spin anyway. Liability isn't generally
           | something that is granted, it's either there or not.
           | 
           | Mercedes is releasing an L3 product in a jurisdiction where
           | operation of L3 products is insured by the manufacturer. That
           | is substantially different than "someone could sue them and
           | maybe maybe maybe win"
           | 
           | > The reason that "Tesla is liable for accidents" doesn't
           | make news is... that there are vanishingly few AP/FSD
           | accidents.
           | 
           | There have been 736 known AP/FSD crashes and 17 deaths. The
           | reason that "Tesla is liable for accidents" doesn't make news
           | is that they aren't legally liable for their level 2 system
           | under existing AV regulation.
           | 
           | https://www.msn.com/en-us/autos/news/theres-been-a-
           | whopping-...
        
             | ajross wrote:
             | Sorry, where do you get "accept liability" from "ability to
             | respond to a judgement for damages"? That's not a
             | requirement to pay, that just requires that the company
             | have the ability to pay if they are found liable! It's the
             | corporate equivalent of requiring liability insurance.
             | 
             | Show me the contract (or hell, even a press release) where
             | Mercedes acknowledged accepting liability in the US. I
             | really don't think this happened.
        
               | piotrkaminski wrote:
               | There were a ton of articles around 2022-03-20 (e.g.,
               | [1]) that had a line like this:
               | 
               | > Once you engage Drive Pilot, you are no longer legally
               | liable for the car's operation until it disengages.
               | 
               | Not quite a press release but given that Mercedes never
               | denied the claims it's pretty close. It'll be interesting
               | to see how this is implemented legally, of course.
               | 
               | [1] https://www.roadandtrack.com/news/a39481699/what-
               | happens-if-...
        
               | ajross wrote:
                | That was a German product though. The press release about
                | the American rollout is notably missing that language.
        
               | piotrkaminski wrote:
               | Hmm, the latest article [1] specifically about the
               | California authorization says:
               | 
               | > When active, Mercedes takes responsibility for Drive
               | Pilot's actions.
               | 
               | and
               | 
               | > "Mercedes-Benz Drive Pilot is the world's only SAE
               | Level 3 system with internationally valid type approval,"
               | Mercedes CTO Markus Schafer said in a statement.
               | 
               | Not as clear-cut as you'd want it to be but certainly
               | leaning towards the claim. I guess we'll know for sure
               | once the cars actually go on sale in California.
               | 
               | [1] https://www.roadandtrack.com/news/a44139131/mercedes-
               | benz-se...
        
               | enragedcacti wrote:
                | Why are you quoting "accept liability" as if it's
                | something I actually said?
               | 
               | > Show me the contract (or hell, even a press release)
               | where Mercedes acknowledged accepting liability in the
               | US.
               | 
               | I'm not sure why this is the bar to clear, there is no
               | reason Mercedes would want to potentially open themselves
               | to more responsibility than the law requires. That said,
               | it should be obvious to most people that Mercedes is
               | taking some legal exposure when they:
               | 
               | 1) call their product SAE L3 when SAE L3 is the legal
               | definition of a vehicle where the operator doesn't have
               | to pay attention
               | 
               | 2) tell the user they can watch movies while driving! (no
               | US manual available yet but they make a similar statement
               | in the press release) https://moba.i.mercedes-
               | benz.com/baix/cars/223.1_mbux_2021_a...
               | 
               | That very obviously speaks to their level of confidence
               | in their system compared to something like FSD:
               | 
               | > It may do the wrong thing at the worst time, so you
               | must always keep your hands on the wheel and pay extra
               | attention on the road.
               | 
               | some legal analysis of US law on AV liability if you're
               | interested: https://www.nortonrosefulbright.com/en/knowle
               | dge/publication...
        
         | zaroth wrote:
         | Mercedes could also ship their own L2 AutoPilot without having
         | to take legal responsibility. Their customers would love it as
         | much as Tesla drivers love theirs.
         | 
         | The thing is, they just aren't capable of it.
        
           | bobsoap wrote:
           | Mercedes has been shipping their L2 system since 2013, at
           | least in Europe.
        
           | amf12 wrote:
           | Or Mercedes has certain reputation to maintain. This is not a
           | dig at Tesla, but Mercedes not shipping L2 AP doesn't imply
           | they aren't capable of it.
        
           | kelnos wrote:
           | Huh? The L2 system on my 2022 Mercedes works just fine. And
           | I'm sure that's not the first model year where it was
           | present.
        
       | renewiltord wrote:
       | I see. This car is good enough for a morning commute in Bay Area
       | traffic. The question is the disengagement mode. 40 mph on the
       | highway means it only makes sense in traffic, but when the
       | conditions are no longer met how does it decide to hand over
       | control?
       | 
       | If it can reliably wake me up or pull over at the off ramp then
       | it's good enough.
        
       | cryptoegorophy wrote:
       | All these comments are a perfect example and a study of
       | confirmation bias.
        
       | pvorb wrote:
       | By the way, Mercedes tested a self-driving van in 1986[1].
       | 
       | [1]: https://www.politico.eu/article/delf-driving-car-
       | born-1986-e...
        
       | TheAlchemist wrote:
        | Clickbait title, to say the least!
       | 
        | While Tesla did amazing things for EV adoption and self-driving
        | (even if one may say its main contribution is hyping it), it's
        | hard to see them as leaders anymore.
        | 
        | Fact is, Mercedes is taking responsibility for its system while
        | Tesla is not. Tesla's claims (and especially its fanboys') start
        | to look like Theranos. Yeah, it almost works and it will be a
        | game changer. Yeah, well...
       | 
        | I'm curious about those Tesla videos that are everywhere - is
        | there some kind of dataset somewhere with videos of similar
        | situations, annotated with which version it is, etc., so one can
        | make a kind of historical evaluation of its progress?
        | 
        | (Would also like the history of Elon tweets claiming each version
        | fixes this or that to go along in the dataset.)
        
         | onethought wrote:
          | Is there a database of videos of the Mercedes? I know "they
          | take liability", but if I'm dead, that still sucks.
          | 
          | What has Mercedes' CEO commented about it?
          | 
          | I think you'll find Mercedes just doesn't publish anything,
          | whereas Tesla pretty much develops in the open. Regardless of
          | the sausage, some people just don't like knowing how it's made.
        
       | Havoc wrote:
        | Comments are surprisingly negative - I would have thought HN
        | would celebrate such an advance.
        
         | warkdarrior wrote:
         | Only Elon is allowed to make advances in the automotive, space,
         | tunneling, and tweeting spaces.
        
           | kubb wrote:
            | They've probably held Tesla stock since its all-time high :)
        
           | claudiug wrote:
           | sadly I can only do a +1 once :)
        
         | TulliusCicero wrote:
         | Over the last handful of years, it does seem like HN commenters
         | have become much more negative about tech advances.
        
         | sebzim4500 wrote:
         | I think with a different headline the comments would have been
         | positive, but this one is misleading at best
        
       | dbcurtis wrote:
       | L3 is profoundly unwise. L3 can disengage at any moment. The
       | attention required of the human in order to drive safely exceeds
       | that required in L2, and is much more difficult to maintain.
        
         | ajross wrote:
         | I don't know that it's "unwise"; that depends on failure rate.
         | 
         | But it's absolutely true that the practical difference between
         | "human must supervise" and "human must be ready to take over"
         | is _MUCH_ smaller than people want it to be. Mostly everyone
         | wants to yell about Elon Musk and  "2 vs. 3" is ammunition.
        
         | Retric wrote:
          | L3 requires zero attention; that's what makes it L3 vs L2. The
          | driver needs to stay sober, awake, and in the driver's seat,
          | but is supposed to be able to read a book or something.
         | 
          | An L3 car is required to be able to handle all short-term
          | situations and only do a handoff with 5-10 seconds of warning.
          | The idea is that it's OK for the car to come to a complete stop
          | and say "I have no idea what to do," but it isn't OK to simply
          | fail on a freeway at 70 mph.
         | 
         | Failing safely is a huge difference, as mentioned in the
          | article: _if a driver doesn't take over when prompted the car
         | activates the hazard lights and slowly comes to a stop before
         | making an emergency system call to alert first responders to a
         | potential problem._
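          | 
          | The takeover/fallback sequence described above, as a tiny state
          | machine. A sketch only; the names and the 10-second constant
          | are illustrative, not Mercedes' implementation:
          | 
          |     from enum import Enum, auto
          | 
          |     class Mode(Enum):
          |         AUTOMATED = auto()         # system drives, driver reads
          |         TAKEOVER_REQUEST = auto()  # countdown, driver warned
          |         DRIVER = auto()            # driver resumed control
          |         MINIMAL_RISK = auto()      # hazards on, stop, e-call
          | 
          |     TAKEOVER_WINDOW_S = 10.0
          | 
          |     def step(mode, leaving_odd, driver_took_over, secs_warned):
          |         if mode is Mode.AUTOMATED and leaving_odd:
          |             return Mode.TAKEOVER_REQUEST  # start warning driver
          |         if mode is Mode.TAKEOVER_REQUEST:
          |             if driver_took_over:
          |                 return Mode.DRIVER
          |             if secs_warned >= TAKEOVER_WINDOW_S:
          |                 return Mode.MINIMAL_RISK  # hazards, stop, call
          |         return mode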
        
         | belter wrote:
         | It will give a 10 second warning - https://youtu.be/1gjweWq8qAc
        
           | dbcurtis wrote:
           | I have verrrry limited connectivity RN so can't watch video.
           | If they really do give 10 seconds, then does the system
           | remain engaged for the "dog/kid darting out between parked
           | cars" scenario? And engage a collision avoidance trajectory
           | planner? 10 seconds is an eternity, even in 35 mph zones. (60
           | kph)
        
             | Retric wrote:
              | Yes, L3 must handle the dog-running-into-the-road
              | situation; this Mercedes will even perform evasive
              | maneuvers when possible to avoid a collision.
        
         | foepys wrote:
          | Mercedes' system will always give the driver a 10-second
          | warning before disengaging. Enough to put away your phone and
          | assess the situation.
        
         | Gasp0de wrote:
         | How is that different from L2? SAE Levels 0-2 demand constant
         | supervision and the ability to take over at any moment. Level 3
         | is actually better, as it only requires you to be able to take
         | over after a short (e.g. 10 second) notice. Levels 4 and 5 do
         | not require the driver to be able to take over (e.g. drunk,
         | sleeping, no license).
         | 
         | https://www.sae.org/blog/sae-j3016-update
        
           | dbcurtis wrote:
           | How short is short? How fast is your OODA loop? L3 plays into
           | known human factor weaknesses.
        
       | gzer0 wrote:
        | Friendly reminder that this system is HEAVILY limited, with the
        | following restrictions (roughly the eligibility check sketched
        | below):
        | 
        |   - Must be under 40 mph
        |   - Only during daylight and only on certain highways
        |   - CANNOT be operated on city/country streets
        |   - CANNOT be operated in construction zones
        |   - CANNOT be operated in heavy rain, fog, or on flooded roads
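        | 
        | A minimal sketch of that eligibility gate (illustrative only;
        | the field names and exact checks are assumptions, not Mercedes'
        | actual code):
        | 
        |     from dataclasses import dataclass
        | 
        |     @dataclass
        |     class DrivingContext:
        |         speed_mph: float
        |         on_approved_freeway: bool
        |         is_daytime: bool
        |         in_construction_zone: bool
        |         heavy_rain_or_fog: bool
        |         road_flooded: bool
        | 
        |     def drive_pilot_available(ctx: DrivingContext) -> bool:
        |         return (ctx.speed_mph <= 40
        |                 and ctx.on_approved_freeway  # no city/country roads
        |                 and ctx.is_daytime
        |                 and not ctx.in_construction_zone
        |                 and not ctx.heavy_rain_or_fog
        |                 and not ctx.road_flooded)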
       | 
       | And for comparison: https://www.youtube.com/watch?v=DjZSZTKYEU4
       | 
       | Tesla FSD navigating the complex city streets of LA for 60
       | minutes with zero human intervention.
       | 
       | This seems like a marketing gimmick by Mercedes at best; the two
       | technologies aren't even in the same league. Any comparison is
       | laughable. They are outmatched, outclassed, and absolutely
       | outdone. Karpathy (now @ OpenAI) and the Tesla FSD team have
       | really done an incredible job.
        
         | golemiprague wrote:
         | [dead]
        
         | iancmceachern wrote:
         | Friendly reminder that there are many humans who choose to not
         | operate their vehicle outside of these conditions as well. Not
         | me, but I know people...
        
         | jacquesm wrote:
         | I don't think we're going to agree here.
         | 
         | > This seems like a marketing gimmick by Mercedes at best;
         | 
         | No, this is what a professional roll-out of a feature like this
         | should look like.
         | 
          | > the two technologies aren't even in the same league. Any
          | > comparison is laughable.
         | 
         | Well, at least we agree on these two. But probably not about
         | the direction.
        
         | [deleted]
        
         | akmarinov wrote:
          | In the end, Mercedes assumes liability for its system, whereas
          | Tesla doesn't. Tells you all you need to know.
        
           | slg wrote:
           | This doesn't mean the system is actually better. It just
           | means that Mercedes thinks that the cost of covering the
            | liability won't exceed the boost in sales that comes from this
           | decision. It reminds me of what an automotive innovator once
           | said[1]:
           | 
           | >Here's the way I see it, Ted. Guy puts a fancy guarantee on
           | a box 'cause he wants you to feel all warm and toasty
           | inside... Because they know all they sold ya was a guaranteed
           | piece of shit. That's all it is, isn't it? Hey, if you want
           | me to take a dump in a box and mark it guaranteed, I will. I
           | got spare time. But for now, for your customer's sake, for
           | your daughter's sake, ya might wanna think about buying a
           | quality product from me.
           | 
           | Not that I'm implying Autopilot is necessarily a "quality
           | product" in comparison. It is just that a guarantee or
           | liability coverage is nothing but a marketing expense for the
           | company issuing it. It doesn't actually mean the product is
           | higher quality.
           | 
           | [1] - https://www.youtube.com/watch?v=mEB7WbTTlu4
        
           | wilg wrote:
           | Is there evidence for this?
        
             | DoesntMatter22 wrote:
             | [flagged]
        
             | djcannabiz wrote:
             | https://insideevs.com/news/575160/mercedes-accepts-legal-
             | res...
             | 
             | "once a driver turns on Drive Pilot, they are "no longer
             | legally liable for the car's operation until it
             | disengages." The publication goes so far as to say that a
             | driver could actually pay zero attention to the road ahead,
             | play on their mobile phones, and even watch a movie. If the
             | car were to crash, Mercedes would be 100 percent
             | responsible."
        
               | wilg wrote:
               | Yes, but is there any evidence this is actually
               | happening? Not that they announced it might a year ago.
        
               | wilg wrote:
               | Folks, don't downvote me, show me where Mercedes actually
               | says they are taking liability for actual cars that are
               | actually on the road with customers. What people cite on
               | this is a vague marketing puff piece from a year ago. If
               | they are doing it, it should be pretty easy to find out
               | how that is structured from their website or whatever the
               | car owner has signed, or just like literally any evidence
               | whatsoever.
        
               | dmix wrote:
                | This explains the 40 mph limit and other extreme
                | restrictions.
        
               | froh wrote:
                | For now, that is. In German they call it "traffic jam
                | pilot" (Staupilot). They start L3 autonomy with the
                | boring and tedious scenario first.
                | 
                | It's obvious they'll expand it to more and more
                | freeway/highway driving scenarios, and from there grow
                | into any out-of-town driving.
                | 
                | Meanwhile Waymo and Tesla can take their bruises with
                | downtown traffic and pedestrians and children and hand-
                | drawn road signs and dirty signs and poorly placed ones
                | and whichever more crazy real-life reality-show surprise
                | guests appear on stage...
                | 
                | I wouldn't snicker too much at Mercedes (and I don't
                | think you did, but several others in such discussions
                | do). They have a knack for getting a couple of car-
                | related things pretty right.
        
           | jpalomaki wrote:
            | I can understand financial liability, but what if somebody
            | gets seriously hurt and there are criminal charges? Is there
            | already a legal framework in the US for transferring this
            | kind of liability from the driver to the manufacturer?
        
           | simondotau wrote:
           | How exactly is Mercedes accepting liability? I mean to say,
           | how does this work in practice? Who absorbs the demerit
           | points if your car is accused of going 40 in a 30 zone?
        
             | JumpCrisscross wrote:
             | > _How exactly is Mercedes accepting liability?_
             | 
             | You're indemnified.
        
               | kbos87 wrote:
               | Mercedes can't indemnify you. Local police and
               | prosecutors make those decisions.
        
               | nradov wrote:
               | Nope. Local police and prosecutors have no authority to
               | make decisions on liability. Their authority is limited
               | to criminal matters. However, police reports can
               | generally be used as evidence in civil trials where
               | judges and juries make decisions about liability.
               | 
               | There may be some circumstances where a driver operating
               | a Mercedes-Benz vehicle using Drive Pilot could be
               | committing a criminal offense. For example, I think
               | sitting in the driver's seat while intoxicated would
               | still be illegal even if you're not actually driving. But
               | that is separate from liability.
        
               | JumpCrisscross wrote:
               | > _Mercedes can't indemnify you. Local police and
               | prosecutors make those decisions_
               | 
               | Are you thinking of immunity? Indemnity is regularly
               | contracted between private parties. (Also, police can't
               | give you immunity in America.)
        
         | jimjimjim wrote:
          | Good god! No autonomous driving system should drive through
          | fog or flooded roads or construction zones. I wouldn't trust
          | any system in those conditions regardless of what stans say.
        
         | oxfordmale wrote:
         | Mercedes accepts legal liability for collisions if the system
         | is used under these stringent conditions. Tesla doesn't do
         | this.
         | 
          | It also doesn't mean the Mercedes lacks the technical
          | capability to drive through the streets of LA for 60 minutes.
         | 
         | https://www.thatcham.org/mercedes-to-accept-liability-for-ac...
        
           | BoorishBears wrote:
           | Working in the AV space, it's really frustrating how
           | confidently people who have no idea about what's hard and
           | what isn't go off about Tesla right now.
           | 
           | Mercedes has soundly beaten the last decade of Tesla efforts
           | by reaching L3.
           | 
           | I've personally watched FSD go off the rails and into a crash
           | situation within 60 seconds of being turned on three times
           | this month (I have a friend who loves to try it in San
           | Francisco)
           | 
           | Had it crashed it'd be on my friend, not Tesla. The fact
           | Mercedes is taking responsibility puts it in an entirely
           | different level of effectiveness.
           | 
           | -
           | 
            | People also don't seem to understand that Mercedes has a
            | separate L2 system that works above 45 mph and already scores
            | better than Autopilot in Consumer Reports' testing.
        
         | amelius wrote:
          | Do they use a real-time OS?
         | 
         | If so, (how) do they run GPU drivers on it?
         | 
         | And if not, how do they guarantee that the system is responsive
         | and safe?
         | 
         | (Of course a third option would be that they don't use an OS at
         | all for the mission critical stuff).
        
         | simion314 wrote:
         | Would you bet your life on a Tesla Twitter promise? They are
         | not even betting one cent on their car safety.
        
           | zaroth wrote:
           | > They are not even betting one cent in their car safety.
           | 
           | Do you mean strictly in the sense of accepting liability for
           | crashes while AutoPilot is engaged?
           | 
           | Because Tesla makes safety a primary selling point of their
           | vehicles. Objectively Tesla invests hugely in the safety of
           | their vehicles, and scores the absolute highest marks for
           | safety in many government tests.
           | 
           | Tesla makes an L2 system where the driver must remain
           | engaged. And part of their _FSD_ system includes the most
           | sophisticated driver gaze and attention tracking system on
           | the market. This has made their FSD system on predominantly
           | _non-highway_ roads safer than the average human driver
           | without FSD.
        
             | simion314 wrote:
              | No, I mean actually bet your life: say, put your little
              | children in the back and send them to some destination
              | because you've seen Tesla videos on YouTube.
              | 
              | If you won't do it for a random destination, on what would
              | you risk your children's lives? Some whitelisted highway?
              | Which one?
        
             | kelnos wrote:
             | In minor traffic a week or so ago, I ended up next to a
             | Tesla for a few minutes where the driver had zero hands on
             | the wheel, and her eyes were buried in her phone, with her
             | head angled downward. Whatever system was running seemed to
             | be totally fine with that situation; if that's the most
             | advanced driver attention monitoring system available,
             | we're in a lot of trouble. Tesla caring so much about
             | safety is so obviously a bad joke.
        
               | chroma wrote:
               | I don't believe you. Teslas have a cabin camera that
               | monitors your gaze and quickly emits warnings if you look
               | at your phone while autopilot is on. If you ignore the
                | warnings, the car puts its emergency flashers on, pulls
                | over, and disables autonomous driving until your next
                | trip. If you do this five times with the FSD beta, you
                | are banned from using FSD.
        
               | qwytw wrote:
               | > are banned from using FSD
               | 
               | Do you get a refund?
        
               | zaroth wrote:
               | It's a one week ban I believe.
        
               | zaroth wrote:
               | It is not. The FSD Beta enables an aggressive attention
               | monitor which is not active with the basic AutoPilot
               | system.
               | 
               | The nag when FSD is enabled is actually quite annoying.
               | Even glancing over at the screen for more than a second
               | will trigger it. If it triggers more than a couple times
               | you get locked out for the drive. If it triggers more
               | than 5 times in total across any number of drives, you
               | get locked out entirely for a full week.
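                | 
                | (The strike policy described above, as a sketch; the
                | thresholds come from this thread, not from official
                | documentation.)
                | 
                |     def on_attention_strike(per_drive, total):
                |         """Returns updated counters plus lockout flags."""
                |         per_drive += 1
                |         total += 1
                |         lock_drive = per_drive > 2  # out for this drive
                |         lock_week = total > 5       # out for a full week
                |         return per_drive, total, lock_drive, lock_week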
        
             | justapassenger wrote:
             | > And part of their FSD system includes the most
             | sophisticated driver gaze and attention tracking system on
             | the market.
             | 
              | That's just a blunt lie. Their driver monitoring system is
              | very deficient. They lack, for example, the industry-
              | standard infrared camera that lets the system see through
              | sunglasses.
              | 
              | And all older Model S/X that have FSD lack any camera at
              | all.
        
               | zaroth wrote:
               | You're right, their older cars don't have their most
               | advanced system. When I said "on the market" I mean the
               | cars they are selling now, not historically.
               | 
               | I've found the system extremely adept at gaze tracking
               | and alerting. The cabin camera is infrared in their
               | latest models (at least for the last 2 years).
               | 
               | Please don't call me a liar. I am happy to be called
               | wrong and corrected.
               | 
               | I should have said "one of the most advanced" because
               | this is in truth a subjective measure and I don't think
               | agencies like NHTSA rank and qualify attention tracking?
        
         | dclowd9901 wrote:
         | I'm curious what streets are under 40 mph and not "city" or
         | "country" streets?
        
           | spockz wrote:
           | Congested highways?
        
         | KennyBlanken wrote:
         | For comparison, Tesla FSD veering directly at a cyclist, who
         | was saved only by the driver's lightning reflexes:
         | https://www.youtube.com/watch?v=a5wkENwrp_k
         | 
          | For comparison, Tesla FSD cruising right through a crosswalk
          | with pedestrians while the driver is gleeful about it:
          | https://arstechnica.com/cars/2023/05/teslas-full-self-drivin...
         | 
         | For comparison, Tesla FSD veering into oncoming traffic
         | https://www.reddit.com/r/teslamotors/comments/128zncy/fsd_te...
         | 
         | I remember seeing a video where Tesla FSD veered right at an
         | island/telephone pole. Another where it veered at a _crowd_ of
         | pedestrians on the corner of a signalized intersection waiting
         | to cross.
         | 
         | A reminder that the software has been buggy for years and Tesla
         | is somehow being allowed to "beta test" it in public among
         | people who did not consent to the risk.
        
           | jeroenhd wrote:
            | As a counterpoint: these Tesla videos are much more common
            | because there are more Teslas driving around with the
            | feature enabled.
            | 
            | We don't know if the Mercedes is as much of a murder machine,
            | based on the little material there is from their system.
           | 
           | All self driving car manufacturers have videos of their cars
           | doing the most ridiculous things. It's hard to pick one above
           | the other. At least Mercedes seems to have documented their
           | limitations rather than assumed their system works
           | everywhere, so that's a good sign to me.
        
           | iknowstuff wrote:
           | Because ultimately there hasn't been a single publicized
           | death or injury resulting from FSD Beta, while it only took a
           | few months for a professional tester of Uber's SDC effort to
           | kill someone. So maybe it makes sense to test with millions
           | of people for a few minutes a day instead of 8 hours a day
           | with a few hired testers. The media would sure jump on such
            | Tesla headlines.
        
         | belter wrote:
          | Took me less than a few minutes to find, in the same channel,
          | FSD failing and the human having to intervene. Here it is at
          | the proper timestamp, in a video uploaded 10 hours ago... -
          | https://youtu.be/KG4yABOlNjk?t=995
         | 
          | And even worse failures... Looking at all these, I simply do
          | not believe the videos posted are unedited, or anything but a
          | selection of successes out of many failed runs.
         | 
         | https://youtu.be/KG4yABOlNjk?t=1252
         | 
         | https://youtu.be/WowhH_Xry9s?t=77
         | 
         | https://youtu.be/WowhH_Xry9s?t=196
         | 
         | Since the video was live on YouTube no editing was done...
         | 
          | These channels are only posting the successful runs; we need to
          | see them all. That is probably why Tesla is not putting its
          | money where its mouth is.
        
           | shdshdshd wrote:
            | Now, let's see the Mercedes ones...
        
           | elif wrote:
            | Why don't you find a video of FSD failing on a 40 mph highway
            | with no lane changes, for an apples-to-apples comparison...
            | 
            | Oh wait, that is an unreasonably niche scenario to be worthy
            | of consideration in the context of full autonomy.
        
             | belter wrote:
             | Will almost accelerating straight into the guard rail at 30
             | mph do it? - https://youtu.be/sNBEHumIHJI?t=14
             | 
             | https://www.carscoops.com/2022/05/tesla-driver-claims-
             | model-...
        
               | fallingknife wrote:
               | That road is not within the operating requirements of the
               | Mercedes L3 which is only allowed on highways where such
               | turns are never found.
        
           | zaroth wrote:
           | This is a great example! I hope people will take the time to
           | click and watch it drive.
           | 
           | I think it's a tough call in this case. The Tesla is trying
           | not to block an intersection where a light is green but
           | there's no room to clear the intersection on the other side
           | due to traffic ahead. In fact an oncoming car is able to turn
           | left in front of the Tesla because it waited.
           | 
           | Probably the law is that you should NOT enter the
           | intersection in such a state, but human nature would be to
           | make a more nuanced judgement of "how bad would it be" to
           | continue thru and possibly get stuck sticking out into the
           | intersection for a bit until traffic moves again.
           | 
           | I would think - how long might I be stuck? Would it actually
           | impede other traffic? Also factors like: am I late for work?
           | Am I frustrated because traffic has been horrible, or am I
           | enjoying a Sunday drive?
           | 
           | Ego (Tesla's name for the driving software) doesn't get
           | impatient. It can maximally obey all traffic regulations at
           | all times if they code it to do so, even if human drivers
           | behind might be laying on the horn.
           | 
           | This little clip really shows how much nuance is involved in
           | designing these systems, and how doing the technically right
           | thing can either be wrong or at least ambiguously right.
        
             | belter wrote:
             | That is not the failure. The failure is that it would not
             | move anymore and the driver had to intervene and accelerate.
        
               | zaroth wrote:
               | At 16:55 thru 17:15? Where he manually accelerates the
               | car and blocks the intersection?
        
               | belter wrote:
               | https://youtu.be/KG4yABOlNjk?t=999
               | 
               | 16:40 to 17:12
        
               | szundi wrote:
               | Seemed to be completely OK; a driver who errs on the safe
               | side would have done exactly this.
        
               | rvnx wrote:
               | Totally, you are not supposed to block an intersection,
               | and it is forbidden to stay on a pedestrian crossing.
        
               | belter wrote:
               | Again... that was not the failure. Actually there are two
               | failures:
               | 
               | - First, it stopped too far away from both the road
               | intersection and the pedestrian crossing. And of course
               | you should not stop on top of the pedestrian crossing.
               | The car behind the Tesla noticed that right away and went
               | around the Tesla. Even the Tesla driver in the video
               | commented on that. I wonder what FSD version that driver
               | has :-)
               | 
               | - Second, the car ahead, the one already past the
               | crossing, started moving, but the Tesla would not move at
               | all. It only moved when the video author accelerated, as
               | he mentions in the video.
        
               | belter wrote:
               | If you stay there, blocking traffic for no reason, and do
               | not proceed when you are clear to go, as it did until the
               | driver accelerated, you will fail a driving license exam
               | in most countries.
        
               | BHSPitMonkey wrote:
               | "Most countries" would fail a driver for not crossing an
               | intersection until there is space on the other side? I
               | think you're being a little absurd here.
               | 
               | The driver in this case could see that the cars beyond
               | the blue car were starting to move, and thus _predict_
               | that he could cross a little bit early (betting on the
               | space being available by the end of the maneuver, but
               | risking being the ass who ends up blocking the crosswalk
               | after miscalculating).
        
               | Phil_Latio wrote:
               | Maybe it would have moved a second later...
        
               | [deleted]
        
               | [deleted]
        
             | jondwillis wrote:
             | LA has an unwritten law where 2-3 cars make unprotected
             | lefts after oncoming traffic has cleared on yellow/red
             | lights. Letting FSD drive, it can't honor this "cultural"
             | (technically illegal) behavior. If I am operating it, off
             | goes FSD in that moment.
        
               | zeroonetwothree wrote:
               | It's legal to turn left in such a case as long as you
               | entered the intersection already. So at least the first
               | 1-2 cars are not doing anything illegal
        
               | jondwillis wrote:
               | Yeah, but sometimes the second and usually the third (and
               | fourth - I have seen it!) are not in the intersection.
        
               | kevin_thibedeau wrote:
               | You can't legally enter an intersection (advance past the
               | stopping line if present) until you are clear to perform
               | your transit of the intersection. Left on red is never
               | legal and marginal on yellow.
        
               | dragonwriter wrote:
               | > Left on red is never legal and marginal on yellow.
               | 
                | Not particularly relevant to the gridlock violation at
               | issue here, but left on red from a one-way street to a
               | one-way street is legal in the same conditions as right-
               | on-red generally in a large majority of US states, a few
               | of which also allow left-on-red from a two-way street to
               | a one-way street. Left-on-red is only completely illegal
               | in a few states.
        
               | toast0 wrote:
               | Dude it's totally following the law. You're allowed to
               | enter the intersection for an unprotected left even if
               | it's not clear to turn (possibly not if the end of the
               | intersection you're going to is full of cars). If you are
               | in the intersection, you're allowed to clear when
               | possible regardless of the color of your light.
               | 
               | Most intersections with unprotected left turns let you
               | fit at least a whole car, sometimes two in the
               | intersection to wait, and the third car has its front
               | wheels on the paint, so it's totally in the intersection.
        
               | shkkmo wrote:
               | While you are allowed to enter the intersection during a
               | yellow, you are also legally considered to have been
               | warned and would be considered liable if doing so causes
               | an accident.
               | 
               | Given that extra liability, it would seem to be in the
               | interest of automakers, who would be exposed to it, to
               | make self-driving cars err on the side of stopping.
        
               | toast0 wrote:
               | Mostly the cars will have entered during the green, not
               | the yellow, and were waiting for a chance to turn. When
               | there was no opportunity to turn while the light was
               | green, they must turn while the light is yellow or red,
               | because they are already in the intersection and must
               | clear the intersection.
        
               | illumin8 wrote:
               | Incorrect. As long as your front wheels pass the stop
               | line prior to the light turning red, you are legally
               | driving through the intersection and should continue and
               | clear the intersection on the other side (not stop in the
               | middle).
               | 
               | Imagine how crazy the law would be if this were the case:
               | 
               | - Light changes from green to yellow 1 ms before your
               | front wheels pass the stop line
               | 
               | - Traffic from the other direction runs a red light and
               | hits you from the side
               | 
               | - You're now somehow liable because your wheels entered
               | the intersection with zero ability to react quickly
               | enough (no human can react in 1 ms and no car can stop
               | that fast), and the other driver that clearly blew a red
               | is not?
               | 
               | That would be pants on head crazy, tbh...
        
               | illumin8 wrote:
               | Correct. The law states that as long as your front wheels
               | are in the intersection prior to the light turning red,
               | you should proceed through the intersection.
               | Inexperienced drivers that either stay in the
               | intersection or try to reverse into traffic behind them
               | are breaking the law and create a huge hazard for others.
               | 
               | Even if the light for opposing traffic turns green while
               | the turning car is still clearing the intersection,
               | opposing traffic is legally required to wait and not
               | enter the intersection until the turning car has cleared
               | it.
        
               | martythemaniak wrote:
               | Tesla has been specifically reprimanded by NHTSA for
               | allowing FSD to drive like a human rather than to the
               | letter of the law. Rolling stops are the one I remember,
               | but basically it'll apply to anything.
        
             | rhtgrg wrote:
             | > The Tesla is trying not to block an intersection where a
             | light is green
             | 
             | What light? There is none if you're talking about t=995.
        
           | jondwillis wrote:
           | Anecdotally, in LA, I turn off FSD basically every minute
           | while trying to use it, due to it doing something slightly
           | "inhuman", not ideal, or too to-the-letter of the law:
           | signaling incorrectly while staying in a lane that curves
           | slightly, etc.
           | 
           | I can't imagine letting it go for a full hour without causing
           | some road rage or traffic blockage.
           | 
           | To be clear there is a definite driving "culture" in LA that
           | is very permissive and aggressive (out of necessity). FSD
           | doesn't follow this culture.
        
             | belter wrote:
             | Now imagine it trying to manage a Dutch Roundabout -
             | https://youtu.be/41XBzAOmmIU or driving in a city like
             | Oporto - https://youtu.be/VIWikUUl5YQ
        
               | rlupi wrote:
               | Or Naples https://www.youtube.com/watch?v=dCE1aUkeMe0
        
               | belter wrote:
                | Well, I was still trying to give FSD a chance, but since
                | you are raising the stakes... Let's agree to have
               | Marrakech as the baseline - https://youtu.be/SsZlduEIyPQ
               | 
               | And we can also test it in Ho Chi Minh. Tesla says it has
               | superhuman abilities... - https://youtu.be/1ZupwFOhjl4
        
             | natch wrote:
             | Every time you disengage it invites you to leave immediate
             | voice feedback as to why, and presumably they are using all
             | this feedback in conjunction with camera and data feeds
             | from cars that are opted in (which includes all FSD beta
             | cars I believe).
             | 
             | So, they are getting what they need to make it better.
        
             | LeoPanthera wrote:
             | > signaling incorrectly while staying in a lane that curves
             | slightly
             | 
             | This is _super_ annoying, but the most recent FSD update
             | seems to have mostly fixed that.
             | 
             | It's still pretty crappy in a lot of other ways, but at
             | least that improved.
        
               | jondwillis wrote:
               | I only started noticing it since the Autopilot/FSD fusion
               | update a month or two ago. I haven't been driving much
               | the past few weeks so maybe it has been fixed.
        
               | noja wrote:
               | > but the most recent FSD update seems to have mostly
               | fixed that.
               | 
               | I have read this comment before.
        
           | kbos87 wrote:
           | I'm left wondering what this comment proves? The four
           | situations you posted were the most minor of minor annoyances
           | when the car was being _too cautious_ , and you've decided
           | they must be cherry picked without any evidence given.
        
             | belter wrote:
             | It shows the system does not have a semantic understanding
             | of what is happening. That is why it stops so far away from
             | the gate: it does not know it is a gate.
             | 
             | This is another one, definitely one of my favorite examples
             | (at the correct time) - https://youtu.be/2u6AgLuwVqI?t=86
        
           | KennyBlanken wrote:
           | Tesla not only isn't putting their money where their mouth
            | is, they have a long-standing history of using illegal DMCA
           | takedowns to remove these sorts of videos from Twitter,
           | reddit, Facebook, etc.
        
             | iknowstuff wrote:
             | Incorrect
        
           | wilg wrote:
           | I'm not sure exactly what your point is. The Tesla does
           | sometimes require intervention; that's why it's Level 2. But
           | it's still
           | attempting to drive in significantly more complicated
           | situations than this Drive Pilot thing. Does Drive Pilot stop
           | at stoplights or make turns? I don't think so.
           | 
           | Regarding deceptive editing, plenty of people post their
           | Teslas doing squirrely things and themselves intervening. So
           | it's no secret that you sometimes have to intervene.
        
             | belter wrote:
             | Look at what is happening at this point, yet another
             | failure mode, not seen in the links I posted before:
             | 
             | https://youtu.be/WowhH_Xry9s?t=272
             | 
             | See how it confuses the speed limits for cars and trucks
             | and brakes suddenly. More precisely, at 5:12.
             | 
             | So if a car is driving behind you, not expecting the sudden
             | braking, this could cause an accident. They are playing
             | with people's lives.
             | 
             | Ethics matter in Software Engineering.
        
               | iknowstuff wrote:
               | FYI: Not unique to Tesla, I get plenty of sudden
               | slowdowns when riding Cruise.
        
             | AbrahamParangi wrote:
             | Attempting and failing is clearly, _clearly_ worse for the
             | general public and in my opinion Tesla should be strictly
             | liable.
        
               | wilg wrote:
               | It's not necessarily worse, since there is a person
               | driving the car who can prevent the car from behaving
               | badly. What's the safety difference between this and a
               | regular cruise control, which will happily plow you into
               | a wall or car if you don't intervene?
               | 
               | And, empirically, there's no evidence that these cars are
               | less safe when driving this way. Tesla claims people
               | driving with FSD enabled have 4x fewer accidents than
               | those driving without it, and nobody has presented any
               | data that disputes that.
        
               | belter wrote:
               | "Tesla Again Paints A Crash Data Story That Misleads Many
               | Readers" - https://www.forbes.com/sites/bradtempleton/202
               | 3/04/26/tesla-...
               | 
               | "...Several attempts have been made to reach out to Tesla
               | for comment over the years since these numbers first
               | started coming out, however, Tesla closed its press
               | relations office and no longer responds to press
               | inquiries..."
        
               | wilg wrote:
               | This critique of their impact report (I was referring to
                | a more recent statement) only goes as far as saying FSD
                | beta is as safe as human driving, not worse, which seems
                | perfectly acceptable?
        
               | avereveard wrote:
                | Depends on the average human driver. Especially if the
                | average includes motorbikes.
                | 
                | Saying FSD on a Tesla has the same statistics as the
                | general driver population paints a grim picture, as it
                | puts it at strictly worse performance than peer vehicles
                | (SUVs or saloons, depending on the model).
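                | 
                | A toy illustration (made-up numbers and hypothetical
                | Python, not real crash data) of why "matches the fleet
                | average" can still mean "worse than comparable vehicles":
                | 
                |     # Hypothetical crashes per million miles and share
                |     # of miles driven: [motorbike, old sedan, luxury car]
                |     rate = [5.0, 1.5, 0.8]
                |     share = [0.05, 0.70, 0.25]
                |     fleet_avg = sum(r * s for r, s in zip(rate, share))
                |     print(fleet_avg)              # ~1.5
                |     print(fleet_avg / rate[-1])   # ~1.9x the peer rate
                | 
                | With that invented mix, merely matching the fleet-wide
                | average is roughly twice the crash rate of comparable
                | luxury cars.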
        
             | adsfgiodsnrio wrote:
             | We know Tesla cannot match Mercedes. We don't know whether
             | or not Mercedes can match Tesla. Mercedes isn't reckless
             | enough to let untrained fanboys play with defective
             | software in two-ton vehicles on public roads.
        
               | samr71 wrote:
               | "We know Tesla cannot match Mercedes" - how? You know
               | this?
               | 
               | "reckless" "untrained fanboys" "defective software" -
               | what is this tone? Why is it reckless? Why do the fanboys
               | need training? Why do you think the software is
               | defective? These are significant unjustified claims!
               | 
               | To me, it seems each company has a different strategy for
               | self-driving, which aren't directly comparable. Beta
               | testing with relatively untrained people on public roads
               | seems necessary for either strategy though.
        
               | adsfgiodsnrio wrote:
               | >how? You know this?
               | 
               | California seems to think so.
               | 
               | >Beta testing with relatively untrained people on public
               | roads seems necessary for either strategy though.
               | 
               | Then why is Tesla the only company doing it?
               | 
                | Mercedes has an L3 system shipping _today_ and they
                | didn't see any need to endanger my life to build it.
        
               | wilg wrote:
               | Mercedes' system does not do most of the things Tesla's
               | does, right? Such as stop at stoplights or make turns, or
               | do anything at all off-highway. It's a significantly
               | different product, and since they didn't try to do many
               | of the things Tesla is trying to do, it's pretty
               | difficult to claim that those things aren't necessary
               | because Mercedes didn't do them, when they haven't even
               | attempted to deliver the same feature.
        
           | belter wrote:
           | "Watch the Self-Driving Struggle" -
           | https://youtu.be/2u6AgLuwVqI?t=88
        
         | activiation wrote:
         | A few weeks ago, I saw a Tesla suddenly go 90 degrees on a
         | small road in downtown Atlanta... It took a while for it to get
         | going again... Of course I don't know if it was FSD but I have
         | never seen a human do something this dumb.
        
         | anaganisk wrote:
         | Yowzzaa, but the damn Beta on Tesla still can't think of
         | slowing down while cornering on steep curves (throwing people
         | to the side), or of not navigating into a dead end that drops
         | into the ocean. This is not an anecdote, it's my personal
         | experience. No way I'm gonna trust a system that puts human
         | lives (even those who are not driving a Tesla) on the line for
         | a beta test.
        
         | foota wrote:
         | Sounds like they don't want people to die from their product.
        
         | rvnx wrote:
         | They have very different goals. The Tesla approach is more of
         | a hack: release things without any liability or guarantee
         | ("we quickly hacked this together, good luck, if you die or
         | get injured this is your problem!"), while Mercedes is
         | delivering a product that only supports a few features but
         | does them well, and they put their responsibility behind it.
        
           | lm28469 wrote:
           | > "we quickly hacked this together, good luck, if you die or
           | get injured this is your problem!"
           | 
           | Or kill someone who didn't ask for any of this
        
           | dmix wrote:
           | What features does Tesla FSD offer that Mercedes doesn't?
        
             | jacquesm wrote:
             | Liability.
        
         | Gys wrote:
         | Very serious limitation of Tesla:
         | 
         | The human has to keep their hands on the wheel at all times
         | and their eyes on the road.
         | 
         | Edit: I looked at these FSD videos before and am very sure it
         | will not work that well in an average European city.
         | Unfortunately, because I really would love to buy a real self
         | driving car that legally allows me to watch a movie while
         | driving (highways would be enough).
        
         | fossuser wrote:
         | Yeah this isn't beating Tesla to anything.
         | 
         | If you constrain a system to where it's effectively useless and
         | declare victory, that's worse than trying to actually solve the
         | problem and saying you're not there yet.
         | 
         | It's lowering the goal precipitously until you can achieve it,
         | and then pretending you did it. Tesla has flaws, but this is a
         | dumb article.
        
           | cbsmith wrote:
           | > Yeah this isn't beating Tesla to anything.
           | 
           | They beat Tesla to being certified in California as Level 3,
           | with restrictions.
        
             | fossuser wrote:
             | anything _that matters_ (which was implied by the rest of
             | my comment)
             | 
             | They can check a box and make this press release though, I
             | guess that's worth something.
        
         | conroy wrote:
         | > Tesla FSD navigating the complex city streets of LA for 60
         | minutes with zero human intervention.
         | 
         | 17 fatalities, 736 crashes: The shocking toll of Tesla's
         | Autopilot
         | 
         | https://www.washingtonpost.com/technology/2023/06/10/tesla-a...
        
           | charcircuit wrote:
           | Just because you crash that doesn't mean you aren't able to
           | navigate without human intervention. Humans are also capable
           | of crashing their car.
        
           | merth wrote:
           | He means under the driver's supervision. There are "normal"
           | people who do their best to cause a crash.
        
           | bmicraft wrote:
           | Without any comparison to humans those numbers are completely
           | meaningless
        
           | readams wrote:
           | Autopilot is just fancy cruise control. How many crashes have
           | there been with cruise control turned on? It's just a
           | completely meaningless thing to even look at. How many
           | crashes would there have been during those miles without
           | autopilot?
        
             | pell wrote:
             | Tesla doesn't sell Autopilot as cruise control though.
        
           | fallingknife wrote:
           | You're right, that is shockingly low! 43,000 people die in
           | car crashes in the US every year, and only 17 in Teslas since
           | 2019. That's only 1 in 7500.
        
             | georgyo wrote:
             | These statistics are meaningless.
             | 
             | How many Tesla cars with full self driving are there
             | compared to regular cars?
             | 
             | A Tesla with FSD will only be using FSD a small fraction of
             | the time.
             | 
              | If we had statistics on the number of hours FSD was active
              | compared to the number of hours of all other driving
              | across all cars, we might be able to compare these
              | numbers.
             | 
             | Hour wise, I think normal driving is way above 7500:1.
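              | 
              | Roughly the normalization you'd want, as a sketch with
              | placeholder exposure figures (the real ones aren't
              | public, so the printed values mean nothing; the point is
              | only that both counts need the same denominator):
              | 
              |     def per_million_hours(events, hours):
              |         # normalize a raw count by exposure time
              |         return events / hours * 1_000_000
              | 
              |     fsd_hours = 1.0e8   # hours with FSD engaged (unknown)
              |     all_hours = 3.0e12  # all US driving since 2019 (guess)
              | 
              |     print(per_million_hours(17, fsd_hours))
              |     print(per_million_hours(43_000 * 4, all_hours))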
        
               | Eisenstein wrote:
               | Why are people willing to excuse things because they
               | happen in a car? If an AI power tool were going haywire
               | once in a while and chopping the arms off of bystanders
               | near by, would we find that acceptable because people
               | using non-AI powered tools also chop off their body parts
               | sometimes? 'We can't know if it would have happened if a
               | person were in control' is not an argument that would fly
               | for anything else just as dangerous.
        
         | pavlov wrote:
         | I have a Model 3 in Europe with the so-called FSD, and it's
         | mostly terrible. I regret paying for it. The car often doesn't
         | understand even speed limit signs, so it fails at the bare
         | minimum of safety.
         | 
         | Recently I visited LA and a friend gave me a drive in their
         | Model 3. The difference in sensor interpretation quality was
         | massive compared to what I see back home. That gave me hope of
         | my aging Model 3 still seeing improvements... But also a
         | nagging suspicion that Tesla may have spent a lot of manual
         | effort on data labeling and heuristics for LA roads
         | specifically, and the results may not be easily scalable
         | elsewhere.
        
           | chroma wrote:
           | The difference in FSD behavior is due to the EU's laws on
           | autonomous vehicles. The EU mandates maximum lateral
           | acceleration, lane change times, and many other details that
           | make for a worse (and less safe) driving experience.
        
             | BonoboIO wrote:
              | Could you explain why the limits make the experience less
              | safe?
              | 
              | Less lateral acceleration should mean more safety, or am I
              | wrong?
        
             | ifdefdebug wrote:
             | Now why would the EU mandate details that make for a less
             | safe driving? You are implying they are either malicious or
             | stunningly incompetent.
        
               | 221qqwe wrote:
               | > stunningly incompetent
               | 
               | Not stunningly, just moderately. But generally yes,
               | that's a good description for EU bureaucrats.
        
               | smabie wrote:
               | Government stunningly incompetent? Who would have
               | thought..
        
           | omgwtfbyobbq wrote:
           | It will likely take a while because there is a difference.
           | I'm not sure if it's manual effort or just having more data
           | in certain regions.
           | 
           | When I first got into the beta, it wasn't great, but it has
           | improved significantly since then.
        
         | adsfgiodsnrio wrote:
         | So, in the absolute best-case scenario cherry-picked from
         | thousands of videos, Tesla's system is capable of going
         | _twenty-five miles_ without making dangerous mistakes.
         | 
         | I agree the comparison is laughable.
        
         | karaterobot wrote:
         | Skeptical but genuine question here: if Tesla's Level 3 system
         | is more advanced, why don't they have authorization to release
         | it? And why does the article quote Tesla as saying they only
         | have a level 2 system?
        
           | jeremyjh wrote:
           | They aren't pursuing it, and stopped publicly reporting
           | disengagement data. They found out their stans are happy to
           | buy the cars and use FSD without liability protection or
           | safety data, so why bother? Whenever one crashes it's the
           | driver's fault by definition.
        
         | iknowstuff wrote:
         | Yeah, you will struggle to find a single video of Drive Pilot
         | where it's not driven by a representative from Mercedes.
         | 
         | In those, you can see the system disengage the second it
         | doesn't have a vehicle to follow at slow speeds right in front
         | of it (like when the vehicle ahead changes lanes).
         | 
         | Useless.
        
         | wonnage wrote:
         | Tesla FSD is a marketing gimmick
        
         | croes wrote:
         | They are all heavily limited.
         | 
         | But FSD isn't even necessary to help: an autonomous
         | acceleration-and-follow mode for all cars at traffic lights and
         | in traffic jams would already have huge benefits.
        
         | olliej wrote:
         | So it can only be used on highways with a 40mph speed limit???
         | :D
        
         | hristov wrote:
         | Even considering its heavy limitations, the Mercedes system is
         | miles better than the Tesla, because it is actually useful. You
         | can actually turn on the system and do your email, or take a
         | nap. Yes, the 40 mph limitation on highways seems to make it
         | useless, but many highways get congested to the point where you
         | are not going over 40 mph anyways, and those are the times when
         | it is most frustrating to drive.
         | 
         | And regarding the FSD, most youtube videos I have seen run into
         | human intervention within 20 minutes or so. Going 60 minutes
         | without human intervention is certainly possible if you get a
         | bit lucky but that does not mean you should take your eyes off
         | the road for a second. FSD still has to be constantly
         | supervised, which makes it of very limited utility. At this
         | stage it is still a fun experiment at most.
        
           | iknowstuff wrote:
           | Now try to look up videos of Mercedes' L3 system in operation
           | and you'll see how hilarious this claim is. It shuts off
           | immediately without a vehicle to follow in front of you. Good
           | luck taking a nap and typing emails. L3 my ass.
        
             | BoorishBears wrote:
             | You're not allowed to sleep, you need to take control
             | within 10-15 seconds.
             | 
             | And no, it doesn't turn off if there are no vehicles in
             | front, but it does turn off above 45 MPH. On most highways
             | you'll only be below 45 because of traffic.
             | 
             | Above 45 they have a separate L2 system that requires
             | "normal" attentiveness.
        
           | thegrim33 wrote:
           | "do your email, or take a nap" <- The definition of an SAE
           | Level 3 system is that you must remain in the seat, prepared
           | to take control immediately when the system requests you to.
           | Taking a nap or otherwise not paying attention is not what
           | such a system supports.
        
             | torginus wrote:
             | Afaik the key differentiator of a L3 and a L2 system is
             | that if you don't take control when the system requests you
             | to, the L3 system can safely stop/pull aside while all bets
             | are off with an L2 system.
        
             | GuB-42 wrote:
             | I guess it is not ok to be asleep as even a loud alarm will
             | take time to wake you up, but not paying attention, like
             | when you are doing your email is fine. In fact, that's the
             | entire point of level 3 autonomy.
             | 
             | You are "on call" rather than "at work", so you must be
             | prepared to act when the car rings. If the car doesn't
             | ring, you are free to do whatever you want as long as you
             | can hear the ring and take back control when it happens.
        
             | dkjaudyeqooe wrote:
             | > prepared to take control immediately
             | 
              | I don't think that's correct. I've seen "in a certain
              | amount of time", but you make it sound like it's a safety
              | issue, which it's not; that is a key differentiation from
              | an L2 system. When it can't drive, it stops driving and
              | you have to drive. If it's not a safety issue, you could
              | conceivably be sleeping, as long as you can wake up in a
              | timely manner.
             | 
             | The SAE makes it clear that the car is driving at L3, and
             | on that basis you would expect the transition to another
             | driver would be graceful, just like with two human drivers.
        
               | olex wrote:
               | That "certain amount of time" is variable, but in
               | practice is on the level of 10-15 seconds for the
               | Mercedes system - at least that's what it was when it was
               | first certified in Germany some time ago. It is designed
               | to let you take your hands off the wheel and look at the
               | scenery, but anything more than that is too much
               | distraction for when it requires you to take over. And it
               | will, because it's really not very capable at all - it's
                | basically an advanced traffic jam autopilot that can
                | follow other cars in a straight, well-marked lane, and
                | that's it.
        
               | dkjaudyeqooe wrote:
                | What would be great is if the SAE tightened the
                | definitions so there were standards of performance,
                | maybe with different quality sublevels within each.
        
               | [deleted]
        
               | hristov wrote:
                | For a person like me who takes very light naps when
                | sitting, 10 seconds is plenty to wake up and take
                | control. Also, if I am using a tablet doing email or
                | browsing or even playing games, 10 seconds is plenty of
                | time to put the tablet away and grab the wheel.
        
               | rl3 wrote:
               | > _For person like me that takes very light naps when
               | sitting, 10 seconds is plenty to wake up and take
               | control._
               | 
               | I'd argue that isn't remotely enough time to safely take
               | control. You're betting your brain wakes up sufficiently
               | in that time, and if it doesn't the consequences are
               | potentially deadly.
               | 
               | > _Also if I am using a tablet doing email or browsing or
               | even playing games, 10 seconds is plenty of time to put
               | the tablet away and grab the wheel._
               | 
               | That's a bit better from an alertness standpoint, but not
               | from a situational awareness one. You're deprived of both
               | context and situational awareness on the road.
               | 
                | Some key details extend far beyond a 10-15 second
                | window, such as: another car driving erratically, lanes
                | ending, or pedestrians and cyclists who were visible
                | earlier but are now occluded by traffic. The list goes
                | on.
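                | 
                | For scale, a back-of-the-envelope sketch (plain unit
                | conversion in Python; the 40 mph ceiling and the 10-15
                | second window are the figures already mentioned above)
                | of how much road goes by during a handover:
                | 
                |     mph = 40
                |     m_per_s = mph * 1609.344 / 3600   # ~17.9 m/s
                |     for w in (10, 15):
                |         # ~179 m at 10 s, ~268 m at 15 s
                |         print(w, "s ->", round(m_per_s * w), "m")
                | 
                | That is a lot of road on which to rebuild situational
                | awareness.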
        
           | cscurmudgeon wrote:
           | > And regarding the FSD, most youtube videos I have seen run
           | into human intervention within 20 minutes or so. Going 60
           | minutes without human intervention is certainly possible if
           | you get a bit lucky but that does not mean you should take
           | your eyes off the road for a second. FSD still has to be
           | constantly supervised, which makes it of very limited
           | utility. At this stage it is still a fun experiment at most.
           | 
           | Are there equivalent videos for Mercedes?
        
           | FormerBandmate wrote:
           | Autopilot works fine on highways under 40 miles per hour. You
           | still pay some attention, but that's also true with this: if
           | traffic goes above 40 mph you have to take control in 2
           | seconds, while Autopilot is more gradual.
        
         | qwytw wrote:
         | > This seems like a marketing gimmick by Mercedes at best; the
         | two technologies aren't even in the same league.
         | 
         | You might be right or not, but you gave zero arguments in your
         | comment. The fact that these safety/regulatory restrictions
         | exist does not mean the system is not more capable.
        
         | hn_throwaway_99 wrote:
         | Can you explain how
         | 
         |     - Must be under 40 mph
         |     - Only during the daylight and only on certain highways
         |     - CANNOT be operated on city/country streets
         | 
         | doesn't exclude all roads? Where I live, highways all have a
         | speed limit of 65-75 mph, and all streets with a speed limit of
         | 40 or below are city or county roads. So where can you
         | actually use this?
        
           | iknowstuff wrote:
           | It does. The system is useless. Not all freeways are even
           | allowed.
        
           | onion2k wrote:
           | In a traffic jam. Which is most of LA's roads at least some
           | of the time.
        
           | FormerBandmate wrote:
           | From what it sounds like, when you are in a traffic jam you
           | use it and then the second that traffic jam ends you
           | immediately have to take control.
           | 
           | Ngl, that sounds like it requires more attention than
           | Autopilot. If you're not doing anything and then have to take
           | full control, that sounds like it could lead to an absurdly
           | unsafe outcome.
        
       | barbariangrunge wrote:
       | We've actually had nearly self driving vehicles for over a
       | century. They're called trains. We should invest in some. The
       | navigation problem gets way easier
        
         | andylynch wrote:
         | Most trains are far from self driving; training to operate one
         | is easily 6-12 months work, depending on the line, with strict
         | licensing as well.
        
         | Spivak wrote:
         | Driverless trains are actually a pretty new thing; being on
         | tracks is not as set-and-forget as people assume.
         | 
         | If you mean "transportation where I'm not driving" then sure
         | but taxis also fit that bill.
        
           | brooke2k wrote:
           | Proportionally, they are far, far closer to self driving even
           | with multiple people driving the train. A ratio of just 1:100
           | drivers:passengers is a vast improvement on personal cars
           | which would probably be what, 1:1? Maybe 1:2 on average?
        
           | katbyte wrote:
           | The SkyTrain in Vancouver, BC has been driverless for nearly
           | 40 years now.
        
           | DanielVZ wrote:
           | Subway trains have been able to run autonomously for at least
           | a decade.
        
             | masswerk wrote:
             | The Paris Metro introduced driverless operation on all
             | their MP 55 train sets in 1967.
        
         | Longhanks wrote:
         | Do you have a train station in front of your house and in
         | front of every destination you can imagine wanting to go to?
         | Because most people don't.
        
           | rdlw wrote:
           | Do you have a self driving car? Because no one does.
        
           | panick21_ wrote:
           | Maybe with all the money invested in self-driving, EV
           | incentives, and insane highways and stroads, you could afford
           | decent public transit.
        
           | berkes wrote:
           | Yes. And yes.
           | 
           | I live in the Netherlands and we have this. Because it is
           | what people want and need. And because it's been invested in
           | for decades. And because it has a way higher ROI than car
           | infra.
        
             | yread wrote:
             | Well the road infra has also been invested in heavily.
             | Dutch highways are the best
        
               | berkes wrote:
                | Yup. It's not an either/or as many seem to think. Public
                | transport works best in addition to and alongside cars,
                | bikes, and such. It's mixing and combining that offers
                | the true value.
        
           | PartiallyTyped wrote:
           | Maybe if we built more of those, more people would be within
           | walking distance of them.
        
             | ImprobableTruth wrote:
             | Trains for local transportation make no sense. Trams for
             | high-throughput and buses for everything else is the only
             | sensible thing for local public transport.
        
               | mitthrowaway2 wrote:
               | Works well in Japan.
        
               | umanwizard wrote:
               | > Trains for local transportation make no sense.
               | 
               | Plenty of people take trains for local transportation
               | every day.
        
               | PartiallyTyped wrote:
               | Trams and metros however are quite reasonable.
        
       | megaman821 wrote:
       | Headlines like this really speed up the decline of media. While
       | it is factual, it really exploits how a common person would
       | interpret the headline. Tesla is pursuing autonomous driving on
       | most roads and in most conditions. L3 autonomous driving is not
       | binary: you can get it at certain times, in certain places,
       | under certain weather conditions, yet the comparison is anchoring
       | it to a system that isn't aiming to operate under those
       | constraints. So the headline is taking a decent accomplishment
       | for Mercedes and turning it into catnip for Tesla haters.
        
         | simondotau wrote:
         | [flagged]
        
           | dmbche wrote:
            | I doubt it; I think the guidelines discourage meta comments,
            | as they are not fruitful for conversation.
        
             | simondotau wrote:
             | I appreciate your contribution but I disagree that it's any
             | more "meta" to respond to the community's unwritten
             | reactions (votes) than responding to its written reactions
             | (normal replies).
        
               | dmbche wrote:
                | Oh, I was referring to your comment on the
                | editorialisation of the title, since it's not a comment
                | on the subject of the article but on the writing of its
                | title. Not a dig, just an observation.
        
               | simondotau wrote:
               | Oh, I misunderstood. Still, I'd argue that in a world
               | where most people who eyeball the headline don't read the
               | article, _the headline is the story._ And thus discussing
               | whether the headline leaves a misleading impression is
               | fair game -- and not at all meta.
        
         | dmbche wrote:
         | I'm not a fan of Tesla, but I'm not a fan of self-driving
         | overall either. This headline just tells me that Benz is the
         | first to be selling legal automated driving cars in
         | California, something I assumed Tesla was already doing. I
         | found it enlightening.
         | 
         | Non-technical people would probably assume that Tesla already
         | has this, since we hear about it all the time, and that this is
         | Benz getting up to speed - while it's not the case at all and
         | Benz is beating the largest competitor in the market.
         | 
         | I think this is a fair title.
        
         | belter wrote:
         | Mercedes achieves Level 3 while putting their money and
         | responsibility behind what is a life-and-death scenario. Tesla
         | fans call it "decent". It's worse than a religion... it's a
         | cult.
        
           | megaman821 wrote:
           | You address nothing about the misleading headline and call
           | the person who points it out part of a cult. Maybe you should
           | take a harder look at yourself.
        
         | justapassenger wrote:
         | Mercedes shipped a system that's legally self driving, under
         | some set of conditions.
         | 
         | Tesla sells a system that's not legally self-driving under any
         | set of conditions. You're always driving, and Tesla will fight
         | you in court if you want them to take liability for any action
         | it took.
         | 
         | There's nothing misleading about this headline.
        
           | simondotau wrote:
           | All of that is technically correct but isn't the point the OP
           | was making. These are products with wildly different
           | aspirations and at different stages of achieving those
           | aspirations.
        
             | justapassenger wrote:
              | Yes. One product works and the other doesn't.
              | 
              | And the one that doesn't work is better and will surely
              | improve and work in the future? And the one that does work
              | won't improve?
        
               | simondotau wrote:
               | I don't know who you're replying to, because that's
               | certainly nothing to do with what I said.
        
               | zaroth wrote:
               | It's bizarre and absurd to claim a system that's driven
               | _billions_ of miles "doesn't work".
               | 
               | I really like how Simondotau put it. These are wildly
               | different systems. Benchmarks can always be rigged and I
               | see this move by Mercedes no differently.
               | 
               | Having failed to compete with Tesla on actual self
               | driving technology, they concocted a minimally useful
               | system that could check a box in their marketing
               | material. Mercedes, here, is an ostrich.
               | 
                | My primary benchmark for how useful the system is would
                | be how many passenger-miles it's driven. We'll be waiting
                | a long time for Mercedes to brag about that metric.
        
               | justapassenger wrote:
                | Because it's very, very simple. One system is
                | self-driving and the other isn't, no matter how you try
                | to spin it.
        
               | zaroth wrote:
                | One is a universal L2 autonomous driving system and the
                | other is a highly limited L3 autonomous driving system.
                | They are both considered forms of autonomous driving.
               | 
               | [1] - https://www.synopsys.com/automotive/autonomous-
               | driving-level...
        
               | justapassenger wrote:
                | One legally drives the car for you. The other doesn't.
        
               | zaroth wrote:
               | Actually they are both legally autonomously driving
               | according to SAE. You should read up on it!
               | 
               | "SAE International is the world's leading authority in
               | mobility standards development." Forgive me if I'll take
               | their internationally recognized standards over your
               | characterization.
               | 
               | Certainly only one of the systems accepts liability for
               | accidents while it's enabled, but liability for accidents
               | is just one component of autonomous driving, and not the
               | most important one.
               | 
               | The most important one is clearly the percentage
               | availability of the system across varying roads and
               | conditions.
               | 
               | An L3 or L4 system that is geofenced to work on a few
               | miles of select private roads might be a "world first"
               | while simultaneously being of little practical value.
               | 
               | In essence, a pissing contest or vanity metric.
        
               | justapassenger wrote:
                | Ok, you've convinced me. I'll go drive a Tesla on FSD
                | and be certain that you and Musk will personally
                | guarantee its safety and pick up the bill if something
                | happens.
                | 
                | Also, please actually go read the standard and what it
                | says about design intent, which is the main way to
                | differentiate these systems.
        
       | as-j wrote:
       | The actual title is:
       | 
       | > Germans beat Tesla to autonomous L3 driving in the Golden State
       | 
       | Which makes a lot more sense, since Waymo, Cruise and Zoox all
       | have L4 autonomous cars on the road today in California operating
       | with no human inside at all.
        
       | jjmorrison wrote:
       | such clickbait
        
       | sidcool wrote:
       | Seems like a mistaken title. The limitations are so significant,
       | it can't be called autonomous driving.
        
       | elif wrote:
       | Level 2 FULL self-driving is a way harder problem than
       | location-limited, condition-limited, maneuver-limited,
       | speed-limited Level 3, so to me it is hilarious to call Tesla
       | beaten in any sense of the word.
       | 
       | 40 mph single-lane travel during clear skies and daytime down
       | certain highways can be done by any number of naive driver
       | assistants from many manufacturers. Tesla's autopilot from 2015
       | could certainly accomplish that, let alone the 2023 FSD versions.
       | The only thing Mercedes has accomplished here is ambitious
       | paperwork.
        
         | justapassenger wrote:
         | You really think there's no engineering difference between a
         | system that will actively kill you, and others, if you don't
         | pay attention, and a system that's designed not to do that,
         | with the company putting their money behind that statement?
        
           | elif wrote:
           | You really think Mercedes' legal filings guarantee a safer
           | system? It is more important to focus on the technical
           | capabilities of the systems than who can add more asterisks
           | to their crash statistics.
           | 
           | You are comparing a mature system with billions of miles of
           | testing to a system which you will struggle to keep engaged
           | for 2 mile intervals.
        
             | wahnfrieden wrote:
             | Tesla FSD is not mature, it's just old
        
             | justapassenger wrote:
             | Ok, let's focus on the technical aspects.
             | 
             | Tesla has no safety life cycle process and as a result FSD
             | is inherently unsafe, from the engineering point of view.
             | 
              | You can try to hand-wave over that, but if you are an
              | engineer working on safety-critical systems (it's an
              | actual engineering field), that's all you need to know
              | about FSD.
        
               | fallingknife wrote:
               | You are not actually focusing on technical aspects. You
               | are focusing on regulatory credentials and bureaucratic
               | process. Focusing on technical aspects would mean looking
               | at actual data from on road performance.
        
               | justapassenger wrote:
                | The design process and lifecycle of a product and its
                | development only count as non-technical aspects at
                | startups building apps to share cat photos.
        
               | elif wrote:
               | How will you discretely identify every problem to be
               | solved in the domain of live real drivers in real world
               | conditions? Applying a model of an industrial plant where
               | machines interface with machines, and occasionally humans
               | in prescribed and orderly functions, will not get you any
               | closer to safety.
               | 
                | What will get you closer to safety is an insanely large
                | corpus of real-world data being trained on in a
                | continuous feedback loop.
        
               | justapassenger wrote:
               | Ask Boeing how ignoring it works out.
        
               | jeremyjh wrote:
                | Which Boeing execs ended their careers in disgrace after
                | killing hundreds of people in pursuit of lower training
                | costs?
        
               | rdlw wrote:
               | Dennis Muilenburg.
               | 
               | 189 people:
               | https://en.wikipedia.org/wiki/Lion_Air_Flight_610
               | 
               | 157 people: https://en.m.wikipedia.org/wiki/Ethiopian_Air
               | lines_Flight_30...
               | 
               | That's 346 people killed.
               | 
               | Here's an article about his being fired in disgrace after
               | this: https://www.nytimes.com/2019/12/23/business/Boeing-
               | ceo-muile...
        
               | jeremyjh wrote:
                | Wow, I totally missed that anyone at all was held
                | accountable for that. It's not much, but it's something
                | at least.
        
       | pnw wrote:
       | I tried to buy the Mercedes EQS with that feature. All the
       | dealers had a $50k markup, you had to wait months for delivery,
       | and most of the cool features in their marketing were not
       | available due to parts shortages.
       | 
       | So call me skeptical...
        
         | ryanwaggoner wrote:
         | Skeptical? Because it's so popular that it's very hard to get?
        
           | empiricus wrote:
           | Popular would mean at least 100k-1M units. Is this Mercedes
           | really that popular?
        
             | hef19898 wrote:
             | Teslas sell at a 100k - 1 mil markup?
        
               | frenchman99 wrote:
               | I think they mean 100k to 1 million units sold
        
       | sixQuarks wrote:
       | I love reading the mental gymnastics the anti-Elon folks in
       | here are doing to actually believe Mercedes has beaten Tesla
       | in autonomous driving.
        
         | claudiug wrote:
         | Because it's not like that?
        
       | kbos87 wrote:
       | What exactly do people think it means when Mercedes says they
       | will "take liability"? Decisions about whether to prosecute a
       | driver when something bad happens are local and state level
       | decisions. Mercedes can't protect you when a local prosecutor
       | decides to say "yeah level 3, whatever, you were behind the
       | wheel."
        
         | belter wrote:
         | That was handled in Germany by changing existing laws. I would
         | imagine equivalent laws were made or are being prepared for
         | California and Nevada.
         | 
         | "Germany will be the world leader in autonomous driving" -
         | https://bmdv.bund.de/SharedDocs/EN/Articles/DG/act-on-autono...
         | 
         | "...The Autonomous Driving Act took effect in Germany on July
         | 28, 2021, introducing significant and comprehensive amendments
         | to the German Road Traffic Law..." -
         | https://www.ippi.org.il/germany-autonomous-driving-act/
         | 
         | Law here (German):
         | https://bmdv.bund.de/SharedDocs/DE/Anlage/Gesetze/Gesetze-19...
        
         | toast0 wrote:
         | Well, they can provide a rigorous criminal defense, perhaps.
         | 
         | But I don't think there's a lot of criminal prosecutions for
         | driving on highways at speeds under 40 mph.
        
         | jupp0r wrote:
         | I assume they can only take over civil, not criminal liability.
        
           | Vespasian wrote:
           | As another user posted at least in Germany the laws were
           | changed to account for this.
        
       | oblio wrote:
       | "Slow and steady" better than "move fast and break things"? :-)
        
         | seydor wrote:
         | break people
        
         | ajross wrote:
         | Define better? A Tesla will do exactly the same thing[1] and
         | just nag you every few minutes to tug on the steering wheel
         | (those nags are getting rarer every release as the driver
         | monitoring camera improves, FWIW). The Mercedes driver can
         | have their hands doing other things, but still needs their
         | eyes out of the cockpit, because the second traffic speeds
         | up past 40mph they need to take over.
         | 
         | In practice, this is "Level 3", but only within an
         | ephemeral subset of driving conditions that aren't going to
         | be consistent enough for you to actually exploit that
         | autonomy.
         | 
         | [1] Actually much more, obviously, but it at least does this.
        
           | zaroth wrote:
           | People forget the status quo is millions of deaths and
           | trillions in property and economic damages. Moving slow
           | condemns millions more to die.
        
             | qwytw wrote:
             | Those millions (especially in developed countries where the
             | per capita number of accidents is highest) won't be able to
             | buy Teslas anyway.
             | 
             | Statistically you could decrease the number of deaths
             | by 20-30% by just banning cars older than 10 years.
             | There are some other (cheaper) things that can be done
             | as well.
        
         | HDThoreaun wrote:
         | 40k+ people die per year in car accidents in the US alone.
         | I would say "move fast and break fewer things" is better.
        
           | waffleiron wrote:
           | We can get that to 60k!
        
           | oblio wrote:
           | You know how you could REALLY fix that?
           | 
           | 1. Actual car checks everywhere. I've seen some crazy cars in
           | the US that would be impounded ASAP in most of Europe.
           | 
           | 2. Add pedestrian safety requirements for new cars, which
           | would practically ban tall cars (trucks and SUVs) except for
           | professional use.
           | 
           | 3. Something for smartphones, not really sure about the best
           | attack angle.
           | 
           | This "might not ever happen" self driving shtick is just
           | moving the goalpost into the middle of nowhere.
        
       | olliej wrote:
       | Does Mercedes accept liability for any crashes that occur while
       | its system is in control?
        
       | porphyra wrote:
       | ... only when driving under 40 mph on highways. So it is pretty
       | useless.
       | 
       | The main thing is that Mercedes Benz will take liability for
       | anything that goes wrong. But the system as a whole is far less
       | capable than Tesla FSD.
        
         | ethanbond wrote:
         | > but the system as a whole is far less capable than Tesla FSD,
         | the renowned "exaggerator" told me so
         | 
         | We actually have no idea what Benz et al.'s systems would
         | be capable of if they were to deploy them with the same
         | recklessness as Tesla.
        
           | zaroth wrote:
           | This statement makes no logical sense. Their systems are
           | "capable of" what they can be used to achieve in actual
           | operation.
           | 
           | It's not like it's this massively sophisticated system for
           | FSD with most of the code commented out.
           | 
           | These are purpose-built and heavily tested to a
           | _specification_ - that's how engineering works. The Mercedes
           | system specification is extremely limited in practice, but
           | gives them the marketing win of being able to claim they do
           | L3.
        
             | adsfgiodsnrio wrote:
             | Tesla's system can be used to achieve _nothing_ in actual
             | operation. It makes constant mistakes and requires an ever-
             | vigilant human to take over in a fraction of a second when
             | it does. There is no advantage over manual driving, let
             | alone manual driving with modern driver assistance.
        
             | ethanbond wrote:
             | Systems that are required to have ~100% performance within
             | an operational design domain generally have more than 0%
             | performance beyond that domain.
             | 
             | For example, Tesla FSD seems like a pretty good highway
             | cruise control, but it can be dangerously deployed in city
             | driving too. Of course this is a little difficult to argue
             | about since AFAIK Tesla has never defined its ODD and their
             | lead autopilot software guy said in his deposition that he
             | didn't even know what an ODD was?
             | 
             | I suppose one man's "marketing win" to "claim" L3 is
             | another man's "actually deploying L3."
        
               | zaroth wrote:
               | I maintain that the Mercedes system is "capable of" what
               | it can functionally be deployed to achieve by a user.
               | That's what "capable of" means. If Mercedes wants to try
               | its hand at a universal L2 system, I wish them the best.
               | This system is in truth _not capable_ of L2 or L3
               | autonomous driving under almost all typical driving
               | conditions. There is a small geofenced and speed
               | constrained window under which the car will lane-keep and
               | TACC without requiring hands-on-wheel. To me, that's what
               | we call a "gimmick".
               | 
               | An autonomous system which cannot even change lanes
               | should not - in my opinion - be classified as L3
               | Autonomy. Not only is it a gimmick, it's basically a
               | loophole in the spec.
               | 
               | As an aside, since you mentioned it, I think that
               | describing FSD as "pretty good highway cruise control" is
               | gaslighting.
               | 
               | Describing FSD as dangerously deployed in city driving is
               | merely misinformed. In fact FSD (including the human
               | driver and FSD's comprehensive attention tracking and
               | alerting) is safer than the average driver in city
               | driving.
               | 
               | https://www.tesla.com/impact
        
               | qwytw wrote:
                | Well, Mercedes does supposedly offer an L2 system.
                | Why do you say it doesn't?
        
               | ethanbond wrote:
               | I'm not sure what you're even trying to argue here. You
               | disagree with SAE's designations? Interesting trivia I
               | guess.
               | 
               | Anyway I'm sure Tesla will say it's Level 3 whenever it's
               | Level 3. Would've thought that was a while back when they
               | marketed, sold, and rolled out "Full Self Driving" on
               | public roads.
        
         | dmbche wrote:
         | That's all enforced by the government; it's not a
         | limitation of the system. They might be comfortable with
         | liability for more than this but are not allowed to by
         | regulators - which is wise.
        
           | zaroth wrote:
           | It literally is a "limitation of the system".
        
             | dmbche wrote:
             | The implied system is Benz's autodriving. You seem to refer
             | to "society" as a system. Clever, but not really fruitful.
        
               | zaroth wrote:
               | I wasn't trying to be cheeky. I must have misunderstood
               | your comment?
               | 
               | As I understand it the L3 system is limited to limited
               | access highway roads, < 40mph, and no lane changes. Those
               | are its limitations, and I believe they are self-imposed
               | by Mercedes, not any US regulations.
        
               | dmbche wrote:
               | "Regardless of those capabilities, the California DMV is
               | placing some serious restrictions on the Merc system. It
               | can only operate during the day on certain limited roads,
               | and only at speeds of up to 40 miles per hour. The system
               | will be usable on California highways in the Bay Area,
               | Central Valley, Los Angeles, Sacramento and San Diego, as
               | well as on Interstate 15 between southern California and
               | Nevada, where the autopilot was approved for use in
               | January."
               | 
               | This indicates to me the limitations come from regulators
               | and not the manufacturer but I might be wrong.
        
         | HDThoreaun wrote:
         | People spend years of their lives going under 40 on the
         | highway in LA. This seems huge in SoCal.
        
         | elif wrote:
         | Only on CERTAIN highways if it's daytime and it's not raining.
        
         | oblio wrote:
         | > The main thing is that Mercedes Benz will take liability for
         | anything that goes wrong. But the system as a whole is far less
         | capable than Tesla FSD.
         | 
         | "It's less capable except for the minor fact that if we
         | <<kill>> someone, it's <<our fault, not yours>>".
         | 
         | You know, that minor factor called "life or death".
         | 
         | > ... only when driving under 40 mph on highways. So it is
         | pretty useless.
         | 
         | If Waymo/Tesla/whatever did this, they'd be the darlings of
         | the internet.
         | 
         | If Mercedes does this, no one thinks: "ah, it's the first
         | iteration, they can, you know, <<increase>> the speed later
         | on".
        
           | kelnos wrote:
           | If the local DA decides to prosecute you after your car kills
           | someone, there is nothing Mercedes can do for you. This
           | liability they are assuming must be civil only.
        
           | sidibe wrote:
           | > If Waymo/Tesla/whatever would do this, they'd be the
           | darlings of the internet.
           | 
           | How exactly do you think Waymo works that they wouldn't take
           | responsibility?
        
             | neysofu wrote:
             | Waymo takes responsibility for all accidents and Tesla
             | is able to drive autonomously at 40 mph on highways,
             | but neither does _both_. Mercedes does. That's - I
             | believe - the point that parent is making.
             | 
             | Now, I'd argue that Waymo has probably somewhat solved
             | highways already, as they're much simpler than city
             | driving; they're just not offering it to the public
             | yet.
        
           | belter wrote:
           | Also known as "draw the rest of the owl"
        
             | oblio wrote:
             | Mercedes has drawn a lot more of the owl than anyone else.
             | Accepting liability is huge.
             | 
             | It's putting their money where their mouth is.
             | 
             | Musk is putting his mouth on a lot of things, I guess that
             | makes him a baby?
        
             | sidibe wrote:
             | At least they started with an owl shape. Tesla is starting
             | with a square before "draw the rest of the owl"
        
           | diebeforei485 wrote:
           | You aren't going to kill people while going under 40mph on a
           | highway (cars only, no pedestrians) anyway. Under 40mph is
           | like stop and go traffic.
           | 
           | This is a marketing stunt. They've taken a well-solved
           | problem (stop and go), taken responsibility for it, and
           | are getting headlines by screaming something about J3016
           | Level 3.
        
             | Arainach wrote:
             | A highway is any public road per legal definition.
             | https://law.justia.com/codes/california/2022/code-
             | veh/divisi...
             | 
             | 40mph is more than enough to kill someone.
        
               | zaroth wrote:
               | "Limited access highway" is a different thing, and that's
               | the roads this system will engage on. Zero pedestrians,
               | unless you know, a climate protest is shutting it down.
        
           | fallingknife wrote:
           | Changing who holds liability does not make the system
           | more or less capable. Think about it. Was Mercedes'
           | system any less capable the day before they declared it
           | L3 than the day after? Clearly not. It was the same
           | system.
           | 
           | If you want to convince me that their system is way more
           | capable than Tesla's, you would have to show me evidence
           | that Tesla FSD fails in the same limited environment that
           | Mercedes is certified L3 in, and I have not seen any.
        
       ___________________________________________________________________
       (page generated 2023-06-10 23:00 UTC)