[HN Gopher] Normalized crash data shows Autopilot is much less s...
       ___________________________________________________________________
        
       Normalized crash data shows Autopilot is much less safe than Tesla
       claims
        
       Author : gnicholas
       Score  : 371 points
       Date   : 2022-02-02 18:53 UTC (4 hours ago)
        
 (HTM) web link (twitter.com)
 (TXT) w3m dump (twitter.com)
        
       | gameswithgo wrote:
       | EDIT: After reading the full paper and not just the twitter
       | graphs, some of what I said is wrong. That is, they also
       | controlled for age, and when controlling for age and road type,
        | autopilot was just worse.
       | 
       | On the one hand I had the same suspicion about the unadjusted
       | data, on the other hand the fact that autopilot isn't _worse_ is
       | pretty promising!
       | 
       | However, average human rates include people driving drunk, tired,
       | elders, teenagers etc.
       | 
       | So at 43 when I am driving sober, I'm probably safer not using
       | autopilot. When I was 19 probably safer to use autopilot.
        
         | lbrito wrote:
         | >So at 43 when I am driving sober, I'm probably safer not using
         | autopilot. When I was 19 probably safer to use autopilot.
         | 
         | That might work out for you, but if teens think like that _now_
         | and avoid years of driving experience, they can't expect to be
         | any better at driving at 43 than they are as teenagers, which
         | kind of kills the argument.
        
           | gameswithgo wrote:
           | The skills are pretty easy to acquire, it is the maturity
           | that is hard.
        
         | gnicholas wrote:
          | > _the fact that autopilot isn't worse is pretty promising!_
         | 
         | Presumably drivers take over when it makes mistakes, which
         | tilts the stats a bit.
         | 
         | > _When I was 19 probably safer to use autopilot._
         | 
         | As a parent, I wonder about what safety tech I would let my
         | kids use when learning to drive. I want to make sure they fully
         | learn how to drive, which makes me think I shouldn't let them
         | use too much semi-autonomous tech.
         | 
         | At the same time, it seems foolish to tell them not to use
         | blind spot monitors, or to even expect them to "check their
         | blind spot" the old-fashioned way if their cars have monitors.
         | 
         | Interestingly, my father (in his 70s) tells me he'll never
         | trust blind spot monitors, so I'm seeing a generational
         | difference even between him and me. Too bad, since older
         | drivers probably could benefit the most from not having to look
         | back when changing lanes. Older eyes take longer to re-focus,
         | so it's more likely an older driver would miss something
         | happening ahead on account of having looked rearward.
        
           | tacLog wrote:
           | > At the same time, it seems foolish to tell them not to use
           | blind spot monitors, or to even expect them to "check their
           | blind spot" the old-fashioned way if their cars have
           | monitors.
           | 
           | I have never owned a car with blind spot monitoring. Do
           | people really just rely on them completely? Even in dense
           | urban environments with bikes around? Or is it more of a
           | judgement call? Different perspectives on this would be nice.
           | 
           | I have never seen the little light on the mirrors fail to
           | detect me while I am riding but I never trust a driver not to
           | just turn right cutting me off anyways.
           | 
            | I am not trying to pass judgement, just to understand how
            | these driver aids change the way people drive.
        
             | gnicholas wrote:
             | I've never owned a car with BSM but have rented for
             | extended periods of time. I would expect that after a year
             | or two of using the system while also actively checking my
             | blind spot the conventional way, I would become comfortable
             | with understanding its reliability and limitations (if any)
             | and would adjust my behavior.
             | 
             | Some Hondas pipe through a camera feed from the side of the
             | vehicle when you turn on your blinker, which wouldn't have
             | the same potential weakness for cyclists that you mention
             | with the binary sensor-based systems.
        
             | sdoering wrote:
             | > Different perspectives on this would be nice.
             | 
             | Driving for ~25 years. Did quite some kilometers when I was
             | between 19 and 30.
             | 
              | My current car is the first one with a lot of assistance
              | systems. I actually see them as an additional safety
              | layer, but still use my own senses as the first line of
              | defense. Blind spot monitoring is something that I learned
              | to value as a slice in the Swiss Cheese Model of safety
              | [1].
              | 
              | And I catch myself driving extra carefully when driving a
              | car without them. Way more careful than I was driving
              | before I ever had blind spot monitoring.
             | 
             | [1]: https://en.wikipedia.org/wiki/Swiss_cheese_model?wprov
             | =sfla1
        
             | LeifCarrotson wrote:
              | It's practically a trope how in sci-fi movies with flying
              | futuristic cars, it's a demonstration of extreme competence
              | for a character to demand "manual controls" to pilot the
              | vehicle...like every normal driver does today. I expect
              | that will become more and more real-world. Even today, most
              | drivers don't know how to drive a manual transmission
              | vehicle, much less how to double-clutch one without
              | synchros or how to stop on ice without ABS. Those are
              | disappearing from public roads and from the capabilities of
              | average drivers, and I expect that driving without aids
              | will follow in the same way.
             | 
              | As part of my job, I end up renting vehicles pretty
              | frequently; they're often newer models than my daily driver
              | and have blind spot monitoring, backup cameras, radar
              | cruise, vision-based lanekeeping assist, etc. It's
              | frightening how in the course of a week you can get used to
              | putting less effort into centering yourself in the lane,
              | trusting that cruise control will just keep a comfortable
              | distance from the vehicle ahead of you. Driving becomes a
              | lot less stressful. When I get back home and climb into my
              | old manual-everything beater, though, it's quite an
              | adjustment.
             | 
             | Regarding blind spot checks, yes, the Audi I recently drove
             | had alerts for that, the blinking yellow light in my
             | peripheral vision when cars were passing me was a nice
             | reinforcement, but I'm too conditioned to do head checks to
             | skip those. Likewise, reverse cameras - I've driven many
             | work vans, pickups with headache racks, Jeeps with
             | scratched up plastic, pulled trailers and RVs, etc where
             | the windshield-mounted rearview mirror is useless; lots of
             | pros get used to backing up using the side mirrors only.
              | However, I asked my sister-in-law (who is extremely
              | competent at most things) to drive my truck for an errand
              | and she asked for help backing it out of the driveway - her
              | car has always had a backup camera, which is honestly a lot
              | easier, and she was completely uncomfortable using the side
              | mirrors.
             | 
             | It's not hard to imagine that someone who only drives with
             | assistive tools would adapt to become dependent on them;
             | I'd argue it's more unusual to expect that they wouldn't!
        
               | FireBeyond wrote:
               | Perhaps pedantic, but also genuine curiosity, thinking
               | about it:
               | 
               | > how to stop on ice without ABS
               | 
               | My understanding is that ABS won't do anything to help
                | you stop on ice, anyway. On ice, braking is hampered
               | by lack of traction on tires, whereas ABS is trying to
               | avoid the challenges of tires locking up, so you can
               | maintain (some semblance of) directionality.
        
             | FireBeyond wrote:
              | No. Even as a tech geek... I use it as an aid, but not as
              | the primary. Even when I'm driving a fire engine with a
              | dash monitor that has seven always-on, always-visible
              | cameras around it, I thought I'd use them a lot more than I
              | do. I do use the cameras in my car supplementarily too; it
              | helps me dial in parking perfectly, etc. But I still find
              | myself doing visual checks.
             | 
             | But I also grew up and had my formative driving learning
             | years without the benefit of such aids.
        
           | rzimmerman wrote:
           | I also think it will be important for any future teenager of
           | mine to learn to drive without the regenerative braking that
           | Teslas default to. It's a much better driving experience IMO,
           | but I worry a new driver won't learn the "oh no hit the
           | brakes!" reflex if they rarely use the brake pedal. I'd
           | probably force them to learn with the car set to "normal"
            | ICE/idle-style for the first few months.
        
             | Psyonic wrote:
             | Is there actually a "normal" style? I've only done a test
             | drive but it seemed like there were various levels of
             | automatic braking, but none that actually allowed you to
             | coast.
        
         | spywaregorilla wrote:
         | The road and age adjusted graph shows tesla still being better,
         | by a trivial amount
        
         | weego wrote:
         | "Tesla Autopilot: Marginally better than the worst drivers on
         | the road" is quite the sales pitch
        
         | justapassenger wrote:
         | > So at 43 when I am driving sober, I'm probably safer not
         | using autopilot. When I was 19 probably safer to use autopilot.
         | 
         | That's very dangerous thinking. Autopilot isn't self driving
         | and will actively try to kill you, from time to time, like all
          | driver assist systems. 19 year olds are much more likely to
          | abuse autopilot and drive distracted/drunk. The only reason
          | those systems aren't crashing left and right is because
          | drivers are paying attention.
        
           | gameswithgo wrote:
           | It is only dangerous thinking with the most uncharitable
           | interpretation of what I wrote.
        
         | jiggawatts wrote:
         | I've heard Tesla owners describing autopilot as feeling like
         | being a passenger with a learner driver behind the wheel.
         | 
          | That was my first-hand experience also: it feels like I'm the
         | nervous dad gritting his teeth while the teenager does
         | something technically legal and safe but nerve-racking.
        
       | rurp wrote:
       | I'm not too surprised by this and bet a lot of HN folks suspected
       | that this was the case, but I'm really glad to see this come out
       | and hope it gets a lot of visibility.
       | 
       | Controlling for confounding variables is a complicated and often
       | subtle process that many people don't have an intuitive grasp of,
       | so Elon's BS claims have probably been very convincing for many.
        
       | hnburnsy wrote:
       | From the paper:
       | 
        | (1) Statistics obtained from Hardman et al. (2019)
        | (2) Statistics obtained from Blanco et al. (2016)
        | (3) Statistics obtained from the Transportation Research Board
        | of the National Academy of Sciences (2013)
       | 
       | I don't understand how such chronologically disparate sets of
        | data could produce reliable statistics. The Tesla demographic
       | data (1) was from a "2018 demographic survey of 424 Tesla owners"
        
       | caditinpiscinam wrote:
       | How is it that Tesla (and other companies developing self-driving
       | capabilities) are allowed to deploy their products in the real
       | world? We make people take a test to get a drivers license -- why
       | don't we have requirements for self-driving systems? We require
       | that cars pass routine mechanical inspections -- why isn't the
       | software controlling these cars regulated?
        
         | bob1029 wrote:
         | This is the point that bothers me the most. Everyone is so
         | fixated on the things they can see and talk about (i.e.
         | regulate). The complex mountain of software running on these
         | cars is _the entire_ bucket of liability.
         | 
         | How do you mandate software quality? I don't think testing
         | alone is enough if you seriously care about safety. Any test
         | can be gamed as tesla has aptly demonstrated so far.
        
         | Dylan16807 wrote:
         | There are requirements in some places, and maybe there should
         | be more.
         | 
         | But on the other hand, even the most pessimistic examination of
         | this data doesn't seem to suggest that these cars are below the
         | "can get a drivers license" threshold.
        
         | dboreham wrote:
         | Because in the context of regulations, they don't claim it to
         | be self-driving.
        
           | FireBeyond wrote:
           | Exactly. Marketing: "Self driving!" "The driver is only in
           | the seat for legal purposes. The car is driving itself."
           | Legal: "Maintain control of the vehicle at all times."
           | 
           | Same with Summon. Marketing: "Bring your vehicle to you while
           | dealing with a fussy child!" Legal: "Maintain attention and
           | focus on vehicle at all times. Do not use while distracted."
        
         | heavyset_go wrote:
         | Regulators are asleep at the wheel.
        
           | ActorNightly wrote:
            | It's kinda amazing to see the cognitive dissonance that people
           | have when it comes to "regulators".
           | 
            | When these regulators get swayed by the car industry into
            | stupid stuff like the 25 year import rule, or keep speed
            | limits around that have nothing to do with safety (they are
            | based on an outdated MPG savings initiative) and end up
            | disproportionately affecting lower income people, they are
            | rightfully criticized.
           | 
           | However, as soon as some manufacturer pops up and does
           | something that the public doesn't like, even for silly
           | reasons like not protecting people from themselves, people
           | immediately jump to wanting the "regulators" to do something
           | about it. Kinda makes it easy to see who has irrational hate
           | for Tesla/Musk versus those who actually care about
           | technology.
           | 
           | The correct thing for the "regulators" to do is to make it
           | mandatory that every car has the same set of cameras and
           | instruments that Tesla cars do, and that data is streamed to
            | a publicly funded data warehouse where it's open source and
           | available for anyone to use.
        
         | kevingadd wrote:
         | Regulation hasn't caught up with the technology, and even when
         | regulation exists regulators are very slow to enforce it. Uber
         | and Lyft were various degrees of blatantly illegal in many
         | places they operated, but it took regulators years to actually
         | do anything about it and at that point they were so well-
         | established that it rarely made sense to actually punish them.
         | It's a reliable bet to make if you've got investor millions to
         | spend on lawyers and lobbyists.
        
         | jsight wrote:
         | Because based upon this data it is at least as safe as not
         | deploying it. This data actually suggests that it is marginally
         | safer.
        
       | gnicholas wrote:
       | Underlying study is here. [1] Full disclosure, I co-wrote the
       | piece for The Daily Beast [2] that originally suggested that
       | Tesla's methodology was seriously flawed. I am not a Tesla-hater
       | though -- I just thought it was odd that the company was playing
       | quite so fast and loose with their safety claims.
       | 
       | 1: https://engrxiv.org/preprint/view/1973/3986
       | 
       | 2: https://www.thedailybeast.com/how-tesla-and-elon-musk-
       | exagge...
        
         | julianz wrote:
         | So that Daily Beast article has been up for more than 5 years
         | with nobody fixing the horrible typo in the headline? Wow.
        
           | rosndo wrote:
           | Daily Beast, not the Economist.
        
         | jacquesm wrote:
         | > I just thought it was odd that the company was playing quite
         | so fast and loose with their safety claims.
         | 
         | That's because it's marketing, not science.
        
           | sandworm101 wrote:
           | And because it's Tesla, not Honda.
           | 
           | Every car company does marketing. Few make over-the-top
           | safety claims about their products. This is the company that
           | decided to have its cars not stop at stop signs. Normal rules
           | don't apply to Tesla.
           | 
           | https://www.bbc.com/news/technology-60230072
           | 
           | "In recall documents, the electric vehicle maker notes that
           | if certain conditions are met, the "rolling stop" feature is
           | designed to allow the vehicle to travel through all-way-stop
           | intersections at up to 5.6mph without coming to a complete
           | stop."
        
             | yumraj wrote:
             | > And because it's Tesla, not Honda.
             | 
             | And because it's Elon Musk, not some ethical CEO
        
               | aurelianito wrote:
                | An ethical CEO is an oxymoron.
        
               | yumraj wrote:
               | I feel sad that you believe that.
        
             | rhino369 wrote:
             | The Silicon Valley Stop is the new Hollywood Stop, huh.
        
             | omgwtfbyobbq wrote:
             | It's not like Honda doesn't engage in puffery.
             | 
             | Most retailers make over-the-top claims regarding their
             | products, including safety.
             | 
             | https://www.latimes.com/business/autos/la-fi-honda-
             | fine-2015...
        
             | NikolaNovak wrote:
              | I mean... I _am_ a "Tesla Hater" in that I don't like the
             | direction they're proceeding in; but every human being I
             | ever met in real actual life does rolling stops under some
             | conditions. Last time I can confidently state that "I never
             | rolled through a stop at very slow speed" is when I was 17,
             | and only NOW do I realize how much of an annoying bum I was
             | :P. So if a car does a rolling stop through a stop sign
              | when there are no other cars, obstacles, or people, my
              | thought is: 1. yes, that's a breach of the law as written;
              | 2. it's what everybody else safely does.
             | 
             | Kind of like in many jurisdictions, police will gently talk
             | to you if you drive 60kph on the freeway, or even if you're
             | doing 90kph in the left lane. Yes it's legal, no it isn't
             | safe.
        
               | enlyth wrote:
               | When I was a kid and we were living in Vancouver, my mum
               | told me how she did a 'rolling stop' through a stop sign
                | at 4am with empty streets, and a cop car
               | noticed it and gave her a massive fine. She was in tears
               | because my family wasn't earning much money at the time.
               | 
               | For some reason I still remember this story decades later
               | and always come to a full stop at stop signs, that's how
               | I was taught in driving school and that's how I will
                | always do it. I'm a calm driver and I don't care if
                | people are annoyed by me. I will drive the speed limit
                | and I will not rush anywhere; it's not my problem that
                | someone else needs to speed because they woke up late.
                | Safety should be the number one priority when you're
                | operating a vehicle.
        
               | ipaddr wrote:
               | Sounds like a ticket to fight as the cop working at 4am
               | would have a hard time showing up to court.
        
               | ceejayoz wrote:
               | In my area, cops have specific court dates; all their
               | tickets are set to be handled on the same day that's
               | already part of the cop's schedule.
        
               | sandworm101 wrote:
               | A) Cops don't work such shifts for the weeks/months it
               | takes to get a court date. B) Cops don't normally "show
               | up" in traffic courts. They can often appear via
               | electronic means. C) Cops have great respect for courts.
               | If called, they show up.
               | 
               | D) If your only defense is the faint hope that the cop
               | doesn't show, the judge is going to rain hellfire on you
               | for wasting everyone's time. Court costs. Amended
               | tickets. Contempt. Unless you are about to lose your
               | license, you don't want to roll that dice. If you go to
               | court to fight a traffic ticket you better have an actual
               | defense.
        
               | sokoloff wrote:
               | I've shown up to contest all but one traffic ticket I've
               | ever received*. It's up to the state to prove what I'm
               | being accused of; I am not required to admit to it or
               | passively pay the fine without appearing if I choose.
               | 
               | I've won many (outright or gotten reductions to non-
               | moving violations) and lost some, but I've never had a
               | judge "rain hellfire on [me]" or anything even remotely
               | similar to that, even in cases where I pled "not guilty",
               | listened to the state's case, and presented no argument
               | in my defense.
               | 
               | * I got a camera ticket in Switzerland on a business trip
               | that I did just pay rather than traveling back to appear
               | and undoubtedly lose.
        
               | Grustaf wrote:
               | That's a very odd reason to contest a fine. Clearly it
               | was justified.
        
               | ipaddr wrote:
               | It may not have been. Can the case be made, is there
               | proof, will anyone show up? Contesting a fine takes time
               | but the judge can reduce the amount if it was unusually
               | high.
        
               | hangonhn wrote:
               | I wonder if that might be a function of how roads near
               | you are designed. I live in the Bay Area and there are
               | definitely certain roads and intersections where I don't
               | feel safe not coming to a full stop and looking both
               | ways. It might be because of hills or that the other
               | direction has no stop sign, etc. Maybe where you live
               | roads have better visibility but at least in my
               | experience in my part of the Bay Area, I often find
               | myself needing to be very careful at some stop signs and
                | have consequently been doing complete stops.
        
               | bambax wrote:
                | > _Yes it's legal_
               | 
               | IDK about the US, but in France the speed limit is a
               | target speed; if conditions are good (weather,
               | visibility, traffic) you're supposed to drive at or near
               | maximum speed. It's illegal to drive too slow, and
               | there's a specific fine for this.
               | 
               | https://www.legifrance.gouv.fr/codes/article_lc/LEGIARTI0
               | 000... [in French]
        
               | FireBeyond wrote:
               | In Australia (I left 15 years ago) it was stated that you
               | were required to drive at the _maximum_ speed that met
               | the conditions:
               | 
               | - not exceeding posted speed limit
               | 
               | - appropriate for environmental conditions
               | 
               | - your ability to safely control the vehicle
               | 
                | And failure to do so could get you fined.
        
               | jonathankoren wrote:
               | In the US it's legally a max. I have seen -- although
               | rarely -- entire lines of cars pulled over and ticketed
               | for speeding on an interstate (limited access, multilane
                | divided highway, max speed 65-70 mph (104-112 kph)).
                | Driving too fast for conditions is a different fine, so
               | if it's foggy and you're going the posted limit, you
               | could also get fined.
               | 
               | You can get ticketed for driving too slow, but I've never
               | seen it. I've only seen a minimum speed posted on an
                | interstate (45 mph (72 kph)), but conceivably you can get
                | ticketed anywhere for impeding the flow of traffic.
        
               | jacquesm wrote:
               | > You can get ticketed for driving too slow, but I've
               | never seen it.
               | 
               | I have (very gently) forced a car off the road once in
               | Germany with the police on 112, a very elderly gentleman
               | was doing 30 Kph on the autobahn and caused one near miss
               | after another. Police came and helped him to get home, we
               | talked for a while and it turned out that it was his
               | first trip in a long time to go and see his sister in
               | another town, he'd gotten lost and was frightened out of
               | his wits by all the traffic zooming by.
               | 
               | I don't know how it ended, he probably kept his license
               | because clearly there was no officer around to witness
               | the event but I'm pretty sure he avoided the autobahn
               | after that.
        
               | pinkorchid wrote:
               | The law that you linked says that drivers aren't allowed
               | to drive at an unreasonably slow speed, and defines that
               | to be (for highways, with good weather, and just on the
               | leftmost lane) 80 km/h.
               | 
               | It doesn't say that the speed limit is a target, or that
               | you're supposed to drive at or near it.
        
               | bambax wrote:
               | Yes, but it's how we're taught at driving school. I think
               | it's written down somewhere but can't find it at the
               | moment, so the article I linked to is the closest I could
               | find.
        
               | cgriswald wrote:
               | > ...but it's how we're taught at driving school.
               | 
               | It doesn't matter. People have all kinds of things they
               | are taught about driving from their parents, instructors,
               | and even official documents that don't carry legal weight
                | (_e.g._, a highway patrol website).
               | 
               | Even when these types of ideas are good ideas, they
               | aren't binding and you can't count on others to follow
               | those rules. The only true rules of the road are the
                | subset of the laws that are enforced, where enforcement
               | might be done by law enforcement officers, civil judges,
               | or insurance company adjudication processes.
        
               | rtikulit wrote:
               | On a public road I almost always come to a full stop, and
               | if I don't I recognize it as an error. That's the law and
               | there are very good reasons for it. It's an unambiguous
               | standard of performance, for example. Arguments for
               | rolling stops based on personal utility are selfish,
               | IMHO, and arguments pleading utility to others are
               | disingenuous--the rolling stoppers say that it's safer to
               | rolling stop because of the rolling stoppers? Please.
               | Think it through. :-)
               | 
               | (Part of the reason I do it is because as I age I would
               | like to ingrain habits that will make me a safer driver
               | even as my cognitive ability declines.)
               | 
               | Near me there is an intersection where the same cars
               | drive through on a daily basis and where the drivers have
               | habituated themselves to rolling stops. Yes, it's almost
               | always fine. But I have been almost T-boned twice, and
               | was hit once, fortunately with minimal damage. And even
               | though they do not have the right of way, their habit of
               | rolling stops regularly pre-empts the actual rules of the
               | road, and they cut off drivers who have the right of way.
               | 
               | That this is due to the normalization of deviance is
               | abundantly obvious.
        
               | jacquesm wrote:
                | The way to deal with declining cognitive ability is to
               | stop driving, not to perform rituals. That said, you
               | should still follow the rules, but mostly because they
               | are the rules. Not because you are following arguments
               | based on personal utility, because they are selfish,
               | IMHO.
        
               | Swenrekcah wrote:
               | Safety rituals are extremely useful exactly because
               | accidents happen when you have reduced capacity without
               | realising it.
               | 
               | Doesn't matter if it's because of lack of sleep, work
               | stress, cognitive decline, etc.
        
               | rtikulit wrote:
               | I'm struggling to respond constructively to your comment.
               | Think about all the phenomena that make up "cognitive
               | ability" and all of the possible dimensions and
               | properties of "cognitive decline".
               | 
               | It should be obvious that your advice is not in any way
               | useful or actionable. And rituals are a very well
               | accepted strategy for dealing with situations which
               | demand consistent and good attention, where human
               | cognitive variability causes problems.
               | 
               | A couple of famous examples--checklists used by aircraft
               | pilots, or the pointing rituals used on Japanese
               | railways.
        
               | sandworm101 wrote:
               | It is one thing to note that humans regularly break the
                | law. It is another thing to task people with programming a
               | robot to break that law. I'm very sure that many people
               | break the speed limit regularly. But I would never expect
                | to get away with programming a car to break that limit. No
               | sane person would ever tell an employee or customer that
               | rolling stops are OK, not in writing.
               | 
               | And 5.6mph isn't rolling. That's beyond a walking pace.
               | That's jogging territory. Any cop watching an
               | intersection would ticket this.
        
               | NikolaNovak wrote:
               | I fully support the noble notions and idealistic ideas.
               | 
               | But let's make it real, and let's ask a question about
               | the real world:
               | 
               | Does that mean that expectation is all automated cars
               | will go exactly the speed limit on the left lane of North
               | American superhighways?
               | 
               | If so, those cars will inherently be a danger to life and
               | limb. I don't care about any self-righteous driver who
                | haughtily declares "_I_ drive the speed limit in the left
                | lane of an American superhighway"; I've spoken to advanced
               | driver safety instructors, highway police officers and
               | city councilors who all agree those people need to bloody
               | well move it along - it's just not _safe_.
        
               | cgriswald wrote:
               | > Does that mean that expectation is all automated cars
               | will go exactly the speed limit on the left lane of North
               | American superhighways?
               | 
               | In the vast majority of states this would also be
                | illegal. If Tesla decides it's safe, they should be able to
               | just rewrite the rules, though, right?
        
               | sandworm101 wrote:
               | Yes. For a company to program illegality into a product
               | opens them up to untold liability. Normal car companies
               | program their speedometers to read slightly high (5%
                | ish). This keeps them from being dragged into lawsuits
                | by people who claim that an inaccurate speedometer
                | contributed to the severity of a crash. Any car company
                | that programs a car to drive faster than the limit
                | _will_ either be liable or have to pay lawyers in every
                | crash involving such cars. Even if their car doesn't
                | cause the crash, the fact that it was speeding will
                | contribute to the severity of the accident. It would be
                | like programming a
               | robotic bartender to serve kids who are underage but look
               | old enough to pass for 21. Corporate lunacy.
        
               | sebzim4500 wrote:
               | >Yes. For a company to program illegality into a product
               | opens them up to untold liability.
               | 
               | People have been saying this for years yet Tesla is not
               | being sued into nonexistence. Presumably their contracts
               | are written well enough that if the user tells the car to
               | go above the speed limit that's on them.
        
               | sandworm101 wrote:
                | Because the number of incidents involving these programs
                | is very small. At the moment there are very few Teslas on
                | the road in comparison to other car companies, so
                | statistically there would be at most a handful of such
                | accidents. But give it a few years until there are 10x or
               | 20x as many teslas on the road. Then the class actions
               | will start. That is if, like here, Tesla hasn't already
               | recalled all such vehicles.
        
               | frosted-flakes wrote:
               | There's a difference between the driver setting the
               | cruise control above the speed limit, and a computer
               | unilaterally deciding that it should run a stop sign.
        
               | aspenmayer wrote:
               | > Does that mean that expectation is all automated cars
               | will go exactly the speed limit on the left lane of North
               | American superhighways?
               | 
               | It's already against road rules to stay in the passing
               | lane when cars behind wish to pass, regardless of the
               | speed limit, or if you're already driving at it. If cars
               | behind desire to overtake, give way. This is already
               | codified. No need to blame the cars or the self-driving
                | tech. It's the human driver who bears responsibility for
                | what the car does or doesn't do, as they are the only
                | ones able to countermand the autopilot. Blaming Tesla for
                | any of that seems like dogpiling and beside the point.
        
               | [deleted]
        
               | sokoloff wrote:
               | That depends on the jurisdiction. In my state (MA), I can
               | use the left lane while passing other traffic even if
               | there is traffic behind me who wishes to go faster than I
               | am.
        
               | hedora wrote:
               | Not in California. Left lane fast, right lane slow isn't
               | the law, and you can pass in whatever lane you'd like.
               | 
               | Studies show that, adjusted for congestion and weather,
                | SF Bay Area drivers have some of the highest accident
                | rates per mile in the country.
        
               | Calavar wrote:
               | > And 5.6mph isn't rolling
               | 
               | Most cars idle at 5 to 6 mph, so _any_ roll through a
               | stop sign is likely to break the 5 mph mark.
               | 
               | > Any cop watching an intersection would ticket this.
               | 
               | Not sure where you live, but this is absolutely not the
               | case where I am. I have only ever once heard of a traffic
               | officer ticketing for this. On the other hand, I see cops
               | watch on as people ignore "no turn on red" signs and let
               | them get away with it pretty much every day. Let alone
               | rolling through an intersection at 5 mph.
        
               | jmisavage wrote:
               | I've gotten a ticket before for a rolling stop. It
               | happens.
        
             | jlmorton wrote:
             | > This is the company that decided to have its cars not
             | stop at stop signs. Normal rules don't apply to Tesla.
             | 
             | It's a driving profile which must be turned on that
             | implements behavior extremely common among the driving
             | public. My Honda with Traffic Sign Recognition and adaptive
             | cruise control lets me set the speed above the speed limit,
             | too.
             | 
             | Our traffic laws are in many ways ambiguous, such that even
             | the police officers enforcing the laws will readily admit
             | you're allowed to drive above the posted speed limit. When
             | traffic rules are interpreted ambiguously, it's not that
             | strange for a driving profile to do the same.
        
               | kevin_thibedeau wrote:
                | The driving public knows when it _can't_ take such
               | calculated risks. Does the ML classifier know what it
               | isn't seeing?
        
             | somethoughts wrote:
             | It could also be that being slightly controversial and
             | possibly factually incorrect is by design in the "any press
             | is good press"/"it's better to ask forgiveness than
             | permission" kind of way. When you are a relative newcomer
             | going against legacy incumbents with much larger ad budgets
             | and fighting to live another day such guerrilla marketing
             | style tactics can be advantageous.
             | 
             | The trick is to realize when you've jumped the shark and
              | are now the established brand and wean yourself off the
             | "fast and loose" approach.
        
           | [deleted]
        
         | contravariant wrote:
         | So to summarize:
         | 
         | - Freeways account for 93% of distance travelled with Autopilot
         | 
         | - Freeways account for 28% of the distance travelled by regular
         | drivers.
         | 
         | - "vehicles on non-freeways crashed 2.01 times more often per
         | mile"
         | 
          | A quick back-of-the-envelope calculation suggests merely
          | driving on the freeway lowers accidents per mile by ~40%,
          | which accounts for most of the difference between Autopilot
          | and regular drivers.
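          | 
          | A minimal sketch of that arithmetic (Python, normalizing the
          | freeway crash rate to 1.0 and assuming the 2.01 non-freeway
          | ratio applies uniformly; the 93%/28% mileage splits are the
          | figures above):
          | 
          |     # freeway crash rate normalized to 1.0 crashes per mile
          |     freeway = 1.0
          |     non_freeway = 2.01 * freeway   # non-freeway rate
          | 
          |     # blended crash rate implied by each group's mileage mix
          |     autopilot_mix = 0.93 * freeway + 0.07 * non_freeway
          |     human_mix = 0.28 * freeway + 0.72 * non_freeway
          | 
          |     print(autopilot_mix / human_mix)  # ~0.62: the road mix
          |                                       # alone yields ~40% fewer
          |                                       # crashes per mile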
        
           | avs733 wrote:
           | so in sum, Thomas Bayes just kicked Tesla square in the ass?
        
         | [deleted]
        
         | RyEgswuCsn wrote:
         | So it sounds like it's Simpson's paradox [1] at work. I had
         | always wondered why many people are slow to accept self-driving
         | cars despite the claims of them being statistically safer. I
         | think that explains it --- to make consumers confident about
         | the self-driving technologies, it is not enough for them to be
         | able to handle the "easy" kind of driving (e.g. on highway and
         | under relatively good visibility conditions) better than
         | humans; they need to demonstrate that they can drive better
         | than humans in the most challenging driving environments too.
         | 
         | I guess this is the classic scenario where human intuition is
         | defeated by carefully presented statistics.
         | 
         | [1]: https://en.wikipedia.org/wiki/Simpson%27s_paradox
        
           | chaxor wrote:
            | Can you explain how Simpson's paradox applies here?
        
             | chemengds wrote:
              | It's more traditional sampling bias than Simpson's paradox
        
             | RyEgswuCsn wrote:
              | The performance appears to be superior because a large
              | proportion of the driving is done in an easy environment.
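              | 
              | A toy illustration of how that mix can flip the aggregate
              | comparison (the crash rates below are made up for the
              | example, not the study's figures):
              | 
              |     # hypothetical crashes per million miles by road type
              |     rate = {("freeway", "human"): 0.5,
              |             ("freeway", "ap"): 0.6,
              |             ("city", "human"): 2.0,
              |             ("city", "ap"): 2.4}
              |     # share of miles each group drives on each road type
              |     share = {("freeway", "human"): 0.28,
              |              ("freeway", "ap"): 0.93,
              |              ("city", "human"): 0.72,
              |              ("city", "ap"): 0.07}
              | 
              |     for who in ("human", "ap"):
              |         total = sum(rate[road, who] * share[road, who]
              |                     for road in ("freeway", "city"))
              |         print(who, round(total, 3))
              |     # human 1.58, ap 0.726: "ap" looks about twice as safe
              |     # in aggregate despite being worse on each road type.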
        
           | oblio wrote:
           | The thing is, you don't even need Tesla "self driving" for
           | that.
           | 
            | Adaptive cruise control - tech we've had for what, 20+
            | years? - is more than enough for most highways. It takes out
            | ~80% of the stress/boredom of driving, especially since lane
            | changes are minimal and curves are gentle.
           | 
           | The hard parts of driving (aka not highways) are the real
           | problem, as expected. We solved the easy parts in 2000 and
           | who knows if we'll solve the hard parts by 2050.
        
             | CSSer wrote:
             | I have a car with adaptive cruise control and lane assist,
             | and most of my morning commute (on days when I go into the
             | office) is driven on the highway. Most of the time I don't
             | turn it on because I'm in a self-perceived hurry to get to
             | work, so I forget.
             | 
             | You know what would be really nice? Something that detects
             | when it's safe and reasonable to suggest turning on
             | adaptive cruise control so I use it more. Maybe something
             | like detecting I've driven a few miles at a consistent rate
             | of speed above a certain threshold.
             | 
             | The hard parts are definitely the problem, but there are
             | lots of easy wins left on the table too.
        
               | oblio wrote:
               | Isn't it just 1 button on the steering wheel to turn it
               | on?
               | 
               | I definitely hope your car doesn't put it in some hidden
               | menus on the central screen.
        
               | Melatonic wrote:
               | I would personally hate for that to be on but I can see
               | your point. Maybe a simple hardware switch for on / off
               | and adaptive cruise control turns on once you are over
               | 65mph for more than 1 minute and will only adapt +-15mph
               | total?
        
             | RyEgswuCsn wrote:
             | For sure. Though even non-tech people are well aware that
             | adaptive cruise control cannot get us to true self driving
             | but "AI" allegedly can.
        
           | kybernetikos wrote:
           | > it is not enough for them to be able to handle the "easy"
           | kind of driving (e.g. on highway and under relatively good
           | visibility conditions) better than humans
           | 
           | Is there any evidence that they do the easy stuff better than
           | humans? I would want to see it compared with cars of similar
           | age and cost driving on similar roads. I'm pretty skeptical
           | that such a comparison would in fact be favourable to Tesla.
        
           | nradov wrote:
           | People are slow to "accept" self-driving cars because they
           | don't exist in a form that anyone can actually buy.
           | Regardless of safety or lack thereof, I can't purchase a real
           | SAE level 4+ vehicle today at any price.
        
           | [deleted]
        
           | wilde wrote:
           | I think traditional auto makers have done a much better job
           | of communicating to consumers what the tech can actually do.
           | We don't have self driving cars (which implies that they can
           | drive in those harder conditions). We have cars that are good
           | at lane keeping on freeways.
           | 
           | I wish Tesla would celebrate that victory rather than double
           | down on lies.
        
         | andrepd wrote:
         | >I am not a Tesla-hater though -- I just thought it was odd
         | that the company was playing quite so fast and loose with their
         | safety claims.
         | 
         | If you're still surprised by this you've not been paying
          | attention to how Tesla operates.
        
           | maxdo wrote:
            | That's a stupid projection by paid media.
           | 
            | Most of the car brands underperform compared to them in
            | terms of safety. But because other brands release millions
            | of boring cars with 0 innovation, they are not criticized.
           | 
            | If a cheap Ford model rolls out with a 3-of-5 safety rating,
            | or the brand drags until the last legally allowed moment to
            | introduce a mandatory safety feature, no one is going to
            | write "Ford/Car brand X is horrible". It's just a boring
            | CarX brand car.
           | 
            | In fact there are so many cars with mediocre crash test
            | results on the road. There are so many car brands that
            | implement safety measures cheaply, like the mandatory
            | backup camera. In half the cars on the market you can barely
            | see the image on a sunny day. No one is posting scary
            | articles like "Toyota screen is killing people while the
            | driver is backing up".
           | 
            | Facts: Tesla was the first to introduce such a wide range of
            | safety features in every car they sold. They don't make
            | safety a premium feature. They invest in their own crash
            | test facilities. Only Volvo is kind of on par.
           | 
           | You can calculate a lot and argue a lot but you can't deny
           | facts:
           | 
            | - every Tesla car has exceptionally good crash test
            | results
           | 
           | - drivers are more distracted nowadays mostly due to phones
           | 
            | - Any car that has safety features similar to Tesla's is
            | safer on the road. These are: stopping at a traffic light,
            | emergency stop if there is a person, a collision avoidance
            | system, slowing down near a police car, signaling if there
            | is a car in the blind zone
           | 
            | - Tesla is a major contributor to the mass adoption of these
            | AI-driven safety features
           | 
            | - Tesla has lots of measures to bring a distracted driver's
            | attention back to driving, as opposed to any car on the road
            | 10 years ago
           | 
            | - Tesla was first to introduce over-the-air updates of
            | software and firmware. Brands like GM are still planning to
            | bring this in 2023. Tesla has been able to deliver a number
            | of safety features instantly over the air. The infamous cop
            | car crashes are a good example of it.
           | 
            | It is really hard to understand how you can conclude that
            | safety is not a priority for Tesla despite how many
            | resources this brand invests in safety.
           | 
            | Autopilot is a tool. You can misuse it, sure. But if you
            | care about safety it's hard to match what they offer.
           | 
            | You compare 0 with some number. A 10 y.o. car with cruise
            | control has 0 smart safety features. Tesla now has Y
            | features with X amount of effectiveness. You can debate what
            | X is. But it is stupid to compare 0 vs (Y * X). It's just
            | safer. Period.
           | 
           | Other car brands don't even dare to release such numbers.
        
             | ummonk wrote:
             | It wasn't much of an accomplishment for Tesla to introduce
             | safety features across its lineup, when its lineup
             | consisted of luxury cars.
             | 
             | Today, other car manufacturers are standardizing safety
             | features across their model lineups while manufacturing
             | cars that are a third the price of the cheapest Tesla.
        
               | jliptzin wrote:
               | A third the price but still pouring disgusting fumes into
               | our neighborhoods
        
               | ummonk wrote:
               | The disgusting fumes are almost exclusively from diesel
               | trucks and very old cars. Manufacturing affordable (no,
               | the Model 3 is not affordable) new cars helps take old
               | cars off the streets and make our neighborhoods cleaner.
        
               | maxdo wrote:
                | BMW is a luxury car, and you still have to pay extra for
                | safety. That's the difference. One car brand makes it
                | mandatory; the other asks you for extra dollars for every
                | safety camera installed.
        
               | ummonk wrote:
               | BMW is hated for a reason, and most car manufacturers
               | (not just Tesla) would compare favorably to it.
               | 
               | Pretty much every major manufacturer makes safety
               | features standard on the base models across their lineup
               | today.
        
             | dmitriid wrote:
             | > Autopilot is a tool. You can miss use it, sure. But if
             | you care about safety it's hard to match with what they
             | offer.
             | 
             | 1. It's not autopilot, by any measurable criteria
             | 
              | 2. As the adjusted data clearly shows, it's not that hard
              | to match it. And since you mention Volvo, they definitely
              | match it. They are honest enough though not to call it
              | "autopilot" or "full self driving", or to pin the blame for
              | its failures on the driver.
        
               | maxdo wrote:
                | The author is projecting his assumptions based on 400
                | people questioned. Is that fair?
               | 
                | Most cars Volvo produces are not as safe, simply because
                | Volvo is still making most of its money on gasoline cars,
                | and those are less safe than electric. Do we need to
                | create a screaming article "Volvo is deliberately making
                | money on old tech that kills people"? Again, Volvo is a
                | premium brand, so they don't want to invest more in EVs
                | to make your car safer. They also promote hybrid cars,
                | which have the highest chance of catching fire compared
                | to EVs and regular ICE cars.
               | 
                | But yeah, the naming of an optional feature is so much
                | more important compared to real tech, crash tests, etc.
                | That's the logic the media is trying to put in our heads.
                | Don't think.
        
               | ummonk wrote:
               | What makes you say gasoline cars are much less safe than
               | electric?
        
               | oblio wrote:
               | > Again volvo is a premium brand, so they don't want to
               | invest more in EV to make your car more safe.
               | 
               | Dude, really?
               | 
               | https://group.volvocars.com/company/innovation/electrific
               | ati...
        
               | maxdo wrote:
                | Volvo's plans are all in the future; it's just a
                | commitment. Today it says 2025, tomorrow a web designer
                | puts up another beautiful article saying 2030. A gasoline
                | car is less safe than the equivalent EV with the same set
                | of airbags, simply because of physics: e.g. the EV has a
                | lower center of gravity, no engine in the front that can
                | kill you in a forward collision, and a battery at the
                | bottom that serves as a protection layer in other types
                | of crashes.
               | 
                | In fact Tesla is about to release cars with structural
                | batteries, which use the battery as part of the car's
                | body, making it even safer. The Model Y with the
                | structural pack will be out in March 2022.
               | 
                | Volvo at this moment is promoting hybrids and ICE cars
                | that don't have these features, because it's a bloody
                | business. And money is money.
               | 
                | Just compare the real actions of Volvo, which claims to
                | be one of the safest brands, with Tesla's.
        
               | oblio wrote:
               | Did you even read the article I linked? I have friends
               | that have the XC40 Recharge...
               | 
               | > In fact tesla is about to release structural batteries
               | cars, that will use battery as part of the cars body,
               | makes it even more safe. Model Y with structural pack
               | will be out in March 2022.
               | 
               | I'll believe that when I see it. How's the Tesla Semi
               | release going? Especially since structural batteries are
               | still an active research topic that actual researchers
               | think will be in mass production in 5 years or more.
        
               | [deleted]
        
               | ummonk wrote:
               | Yup. Autopilot on aircraft is very different because
               | pilots don't have to be ready to take over on a moment's
               | notice.
               | 
               | Of course they have to be monitoring the flight but they
               | can be generally assured that the autopilot won't decide
               | to just fly into a mountain with only fractions of a
               | second for the pilots to intervene.
        
               | laen wrote:
               | There are moments a pilot has seconds to react and
               | disengage autopilot.
               | 
               | Example: Traffic Resolution advisories require a 5 second
               | reaction time and 2.5 seconds for any follow-on
               | reactions. This reaction time includes the time to
               | disengage autopilot and also put the aircraft into a
               | climb-or-dive. Studies show that a situationally aware
               | pilot takes at least 2 seconds to start the reaction, and
               | another second to achieve the proper climb/descent. There
               | are other times on e.g. approach, or ground collision
               | warnings with even narrower margins for disengaging
               | autopilot.
               | 
               | Pilots generally know when they need to be hyper-aware
               | with autopilot and so do competent Tesla drivers. If
               | there are issues with the name "autopilot" for Tesla, the
               | same argument needs to be made for aircraft
               | manufacturers.
        
               | JaimeThompson wrote:
               | > If there are issues with the name "autopilot" for
               | Tesla, the same argument needs to be made for aircraft
               | manufacturers.
               | 
                | If a driver's license required as much training as a
                | pilot's license, then you might have a point.
        
               | dzader wrote:
               | "Boeing instructed pilots to take corrective action in
               | case of a malfunction, when the airplane would enter a
               | series of automated nosedives. "
               | 
               | https://en.wikipedia.org/wiki/Boeing_737_MAX_groundings
        
               | ummonk wrote:
               | Indeed, Tesla's autopilot behaves a lot more like MCAS
               | than like an airplane autopilot.
        
               | sokoloff wrote:
               | > It's not autopilot, by any measurable criteria
               | 
               | It does seem quite comparable in capabilities to the
               | actual autopilot in aircraft. And that's what's confusing
               | to people; most don't realize that an autopilot is a
               | control used by a driver or pilot to reduce their
               | workload. (As a pilot, I am still logging flight time as
               | the "sole manipulator of controls" during the time the
               | autopilot is engaged.)
        
               | watwut wrote:
               | And that "confusion" was created intentionally by Tesla.
               | So that it is easy to pretend it was not deliberately
               | pretending higher capabilities while giving yourself
               | option to pretend you are not.
               | 
               | You know like that claim about human being there just
               | because of pesky regulations and laws. Not because human
               | would be needed.
        
               | ummonk wrote:
               | The kind of environment that a car is in (and the kind of
               | monitoring required by the driver) is much more akin to
               | autoland than autopilot.
        
             | AnthonyMouse wrote:
             | > But because other brands release millions of boring cars
             | with 0 innovation they are not criticized.
             | 
             | That's not actually why.
             | 
             | It's because carmakers are _huge_ advertisers. You get on
             | their bad side, they pull their advertising.
             | 
             | Tesla's advertising consists of Elon Musk's Twitter
             | account, so Tesla gets the reporting the others would get
             | if the authors weren't reliant on the subjects of those
             | stories for their paychecks.
        
             | FireBeyond wrote:
             | > Facts: Tesla was the first to introduce such wide range
             | of safety features in every car they sold.
             | 
             | Please itemize these "wide range of safety features" for us
             | that other manufacturers do not have or did not have until
             | Tesla innovated them.
             | 
             | > They invest in own crash test facilities. Only Volvo is
             | kind of on par.
             | 
             | Audi owns one, for just one.
        
               | [deleted]
        
               | maxdo wrote:
               | Sure, https://www.tesla.com/safety
               | 
               | https://www.youtube.com/watch?v=9KR2N_Q8ep8
               | 
                | Tesla rolls out cars in generations. So just take the
                | $40k Model 3/Y platform, introduced in 2017, and
                | compare it with any car introduced that year. My
                | friend bought a BMW in 2019 for similar money, it
                | doesn't have half of that. Maybe you can have those
                | features, but you have to buy extra packages for
                | extra money.
                | 
                | Tesla doesn't sell safety features at an extra price.
                | The only extra feature is Autopilot; the hardware,
                | cameras, lane departure avoidance, blind spot
                | collision warning, etc. are in every car.
                | 
                | You can even take a simple feature like the forward
                | collision warning system. When you drive, it is super
                | accurate; it uses the same data and models as
                | Autopilot to predict when there will be a crash. If
                | your slowing-down dynamics are not good, or the car
                | in front/behind changes its dynamics, it will warn
                | you immediately. This feature alone has saved me from
                | a collision several times. I'm not a good driver.
                | 
                | As I said, in 2022 there are many car brands that are
                | catching up, and in some cases doing better. And this
                | is good.
                | 
                | Humans are bad, constantly distracted drivers.
                | Technology is the only way to make our lives safer.
        
               | FireBeyond wrote:
               | Automatic emergency braking is a required standard
               | feature in the US.
               | 
                | Forward collision warning was made standard in 2016 by
                | at least ten manufacturers: Audi, BMW, Ford, General Motors,
               | Mazda, Mercedes-Benz, Tesla, Toyota, Volkswagen, and
               | Volvo.
               | 
               | Without speaking to Blind Spot, my Audi does similar
               | intelligent things. If you change lanes next to a vehicle
               | or just in front of it, but you have a positive speed
               | differential, it won't activate. If there is a vehicle in
               | your blind spot approaching you but their speed
               | differential to you is low, it will activate later. If
               | there's a much higher speed differential (say they're in
               | a HOV lane, and you're in a much slower regular lane), it
               | will activate much earlier.
        
               | macintux wrote:
               | > Automatic emergency braking is a required standard
               | feature in the US.
               | 
               | Not currently; I just bought a 2022 Jeep Wrangler that
               | doesn't have anything of the sort.
        
               | gambiting wrote:
               | >> My friend bought a BMW in 2019 for similar money, it
               | doesn't have half of that.
               | 
               | That's an incredibly bad example, because in German
               | brands you have to pay extra for basic safety
                | features (famously, in the new Polo you have to pay extra
               | to have more than the basic 3 airbags, I don't even know
               | how that's allowed). Look at a brand new
               | Peugeot/Citroen/Kia/Hyundai and you will pay less than
               | what a Model 3 costs and get just as many safety
               | features. Yes, all the stuff that is optional in a BMW -
               | but look outside of Audi/Merc/BMW/Volkswagen and you will
               | find that this stuff is also completely standard.
        
               | oblio wrote:
               | > My friend bought a BMW in 2019 for similar money, it
               | doesn't have half of that. Maybe you can have them but
               | you have to buy extra packages for extra money.
               | 
               | #1 lesson of buying cars is:
               | 
               | 1. Don't buy German cars if you don't have to (or if you
               | buy them, never complain about the price of their
               | optional equipment).
               | 
               | The corollary is:
               | 
               | 1". Buy Korean or Japanese cars.
        
           | mdoms wrote:
           | He never said he was surprised.
        
         | Natsu wrote:
         | The study says this:
         | 
          | > An estimated 15% of injury crashes and 24% of property
          | damage-only crashes are never reported to police (M. Davis and
          | Company, Inc., 2015), while 9% of injury crashes and 24% of
          | property damage-only crashes are reported but not logged
          | (Blincoe et al., 2015). Even with robust data, establishing the
          | statistical significance of automated vehicle safety can be
          | expensive. Kalra and Paddock (2016) demonstrated that
          | establishing that an AV has a fatal crash rate equivalent to
          | the national average with 95% confidence would require driving
          | a fleet of 100 vehicles continuously for 12.5 years.
         | 
          | I presume there's an adjustment for this in the figures you
          | compare to, but I am having trouble finding it. That may be my
          | own failure, so I wanted to ask: could you help me find how
          | that's adjusted for?
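          | 
          | For a sense of scale on that last figure, here is a rough
          | back-of-the-envelope reconstruction (my own sketch, assuming
          | the commonly cited inputs of ~1.09 fatalities per 100 million
          | vehicle miles and a fleet running nonstop at 25 mph):
          | 
          |     import math
          | 
          |     fatality_rate = 1.09 / 100_000_000  # assumed US rate per mile
          |     confidence = 0.95
          | 
          |     # Failure-free miles needed to bound the rate at the human
          |     # level with 95% confidence ("rule of three").
          |     miles_needed = -math.log(1 - confidence) / fatality_rate
          | 
          |     fleet_miles_per_year = 100 * 25 * 24 * 365  # 100 cars, nonstop
          |     print(miles_needed / fleet_miles_per_year)  # ~12.5 years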
        
           | TedDoesntTalk wrote:
           | Why would injury crashes go unreported?
        
             | vanattab wrote:
             | I got hit a few weeks ago while walking my dog (dog was
             | fine). I thought I had just bruised my tailbone so I waved
              | the driver on after a short chat and never reported it. In
              | hindsight I wish I had, as it seems my tailbone was
              | fractured and not just bruised. But it definitely happens.
        
             | tempnow987 wrote:
             | After college I lived with roommates for a while. One went
             | on a joyride of sorts and damaged a bunch of parking
             | meters, crashed and injured themselves. They had a friend
             | drive them home and repaired the vehicle with no reporting.
             | 
              | In some cities currently, even if you want police to
              | respond to a traffic accident, you'd need to wait a VERY
              | long time - they will tell you that unless you need EMS,
              | you should get to a safe spot and call for a tow. Some
              | folks aren't willing to wait hours for police to show up
              | to do what, take a report?
        
             | LorenPechtel wrote:
             | Police responding to the actual crash scene, useful. Police
             | report filed after the fact, only if there's some need for
             | said report. And not all injuries are immediately apparent.
             | 3 1/2 years ago I got a first-person view of what it would
             | be like to be on the receiving end of a PIT maneuver
             | courtesy of a woman who didn't look adequately. (Yes, when
             | you looked left the road was empty--I saw that also and
             | turned into that empty spot the block before you!) At the
             | time the only injury I was aware of was a very minor
             | scrape, not even worthy of a bandaid. Later I discovered a
             | pulled muscle--it only hurt when I did certain things.
             | 
             | I also used to know a woman who thought it was nothing more
             | than a fender-bender, went to the doc a couple of days
             | later because her neck was bothering her. Doc sent her
             | straight to the hospital--incomplete cervical fracture, one
             | wrong move and she could have dropped dead.
        
             | sokoloff wrote:
             | If I had a single-car crash with minor injuries, why would
             | I report it? It seems like there's limited upside to me as
             | the (obviously at-fault) driver. Go get medical care and go
             | get my car fixed. (I drive cars cheap enough to replace and
             | so do not insure them against collisions that are my
             | fault.)
             | 
             | Drivers driving without a valid license, without insurance,
             | under the influence, wanted by the law, etc would have all
             | kinds of reasons to not report a crash.
        
       | pjkundert wrote:
       | An even _more_ serious problem with Tesla 's claims regarding the
       | safety of its Autopilot is this:
       | 
       | Autopilot failure statistics are _non-ergodic_. You don 't get to
       | "play again" after a catastrophic Autopilot failure.
       | 
       | So it's not enough for Autopilot to be 5x or 10x better than your
       | own driving -- it has to be more like 100x or 1000x better in
       | order to warrant you risking your life on it.
       | 
       | Trusting an autopilot that is just 10x better than you are?
       | 
       | Insane. It randomly drives under a semi-trailer that you would
       | have easily seen and avoided, and you're dead.
        
       | endisneigh wrote:
       | Are there stats that even show that any self driving system is
       | safer than the median driver?
       | 
       | Basically accidents involve two people. Said accidents have
       | responsibility divided between both individuals.
       | 
       | A hypothetically perfect driver could still be involved in fatal
       | accidents by virtue of the other driver.
       | 
       | The worse drivers obviously will cause these accidents.
       | 
       | The questions are three fold:
       | 
       | 1. Are self driving vehicles safer than the median driver?
       | 
       | 2. What's the ratio between at fault and not at fault for any
       | accidents the median driver gets in?
       | 
       | 3. What's the same ratio for self driving cars?
       | 
       | Normalized for location, of course.
        
         | spywaregorilla wrote:
         | The median driver is a difficult thing to study. 77% of people
         | have experienced an accident, but I think fewer than 50% of
         | people have been at fault for one. That's not to say they
         | wouldn't be at fault if they drove more, but the median has a
         | clean record. Hard to know who to exclude.
        
         | M2Ys4U wrote:
         | >Basically accidents involve two people.
         | 
         | Not always, no. It's perfectly possible to crash a car with
         | nobody else around.
        
           | bagels wrote:
            | Or for one car to crash into many cars.
        
       | ActorNightly wrote:
       | > Crash rates can be adjusted to account for differences in
       | environment and demographics in different data sets. A sample
       | dataset with a crash rate r_i is exposed to some variable i at a
       | different proportion p_i than the comparison dataset. In the case
       | study, for example, vehicles running Autopilot were driven on
       | freeways (i) 93% of the time, resulting in p_i = 0.93. In the SHRP
       | 2 NDS, only 28% of vehicle mileage was recorded on freeways, i.e.
       | p_i = 0.28. In the SHRP 2 NDS data, vehicles on non-freeways
       | crashed 2.01 times more often per mile than vehicles on freeways.
       | The observed Autopilot crash rate can be adjusted to reflect
       | national driving rates, i.e. the crash rate that might be observed
       | if 28% of Autopilot mileage was on freeways and 72% were on
       | non-freeways, assuming that the 2.01 ratio holds for Autopilot.
       | 
       | So if humans are 2 times more likely to crash on non-highways
       | than on highways, the argument presented here is that the added
       | danger of non-freeways affects Autopilot in the same way,
       | degrading its performance. That seems like a far-reaching
       | assumption.
       | 
       | If there is no data on autopilot performance on rural roads, then
       | you cannot make a claim either way.
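       | 
       | As a rough sketch of the reweighting the quote describes (my own
       | reconstruction, not necessarily the paper's exact method), using
       | the numbers above -- 93% freeway share for Autopilot, 28%
       | nationally, and the 2.01 non-freeway risk ratio:
       | 
       |     def adjust(observed, p_obs, p_target, ratio):
       |         # Back out the implied freeway rate from the observed
       |         # freeway/non-freeway mileage mix...
       |         freeway = observed / (p_obs + (1 - p_obs) * ratio)
       |         # ...then recombine it under the target (national) mix.
       |         return freeway * (p_target + (1 - p_target) * ratio)
       | 
       |     # 93% freeway share observed, 28% nationally, non-freeway
       |     # miles 2.01x riskier (the assumption questioned above).
       |     print(adjust(1.0, 0.93, 0.28, 2.01))  # ~1.61x observed rate
       | 
       | In other words, under that assumption the reweighting alone
       | inflates the observed Autopilot rate by roughly 60% before any
       | comparison is made.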
        
       | yrral wrote:
       | How about the predominantly male ownership of Teslas causing
       | higher accident rates? I see the paper adjusts for age and road
       | type, but missing gender seems like a pretty big slip-up.
       | 
       | I didn't read the paper fully but grepped for "gender" and "sex"
       | and couldn't find anything.
        
         | hervature wrote:
         | Don't forget to adjust for the 50% greater distance driven by
         | men [1]. In reality, per mile driven, the difference between
         | genders is negligible and maybe even counter to what most
         | people believe. Of course, men will still pay more in
         | insurance, because driving is inherently dangerous and doing
         | that much more of it means more total risk.
         | 
         | [1] - https://www.fhwa.dot.gov/ohim/onh00/bar8.htm
        
           | akira2501 wrote:
           | Men are also more likely to drive drunk, under the influence
           | and are more likely to buy a car with more than 250hp.
           | There's a _lot_ of confounding variables when examining
           | crashes.
        
         | gnicholas wrote:
         | Good question! According to one study in LA, [1] men cause 60%
         | of accidents. I do wonder, however, the extent to which this
         | gap fades with age. Given that most Tesla owners are not super
         | young, it might not make as much of a difference as the 60%
         | number (assuming it is nationally representative) would
         | indicate.
         | 
         | 1: https://xtown.la/2019/12/12/women-drivers-maybe-we-need-
         | more...
        
           | user123abc wrote:
           | Men also drive 63% more miles than women.
           | https://www.carinsurance.com/Articles/average-miles-
           | driven-p...
        
           | yrral wrote:
            | Note that when looking at 40% vs 60% of accidents (assuming
            | males and females drive the same amount), the difference
            | seems small, only 20 percentage points, but it actually
            | means males cause 50% more accidents than females do
            | (60/40 = 1.5).
        
             | nlowell wrote:
             | I don't know if we can assume males and females drive the
             | same amount.
        
             | user123abc wrote:
        
           | smnrchrds wrote:
           | Keep in mind that men drive 16,500 miles per year on average,
           | while women drive 10,100 miles. There is a noticeable gender
           | gap in "accidents per year", but not in "accidents per mile".
           | The paper compares accidents per mile, so there should be
           | little gender effect in the results.
           | 
           | https://www.fhwa.dot.gov/ohim/onh00/bar8.htm
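            | 
            | Rough check, combining the 60/40 accident split cited
            | upthread with these mileage figures (two different sources,
            | so treat this as a sketch only):
            | 
            |     men_share, women_share = 0.60, 0.40      # share of accidents
            |     men_miles, women_miles = 16_500, 10_100  # annual miles
            |     ratio = ((men_share / men_miles)
            |              / (women_share / women_miles))
            |     print(ratio)  # ~0.92: roughly even per mile driven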
        
       | [deleted]
        
       | freediver wrote:
       | As a Tesla driver, I do not trust the autopilot yet and think it
       | is years away (if ever achievable) from level 5...
       | 
       | The reason this study has this outcome could be survivorship
       | bias - people let Autopilot drive only in the easiest road
       | conditions, where the chance of something going wrong is smaller
       | to begin with.
       | 
       | I would have a hard time imagining people letting Autopilot
       | drive regularly on a tight, winding road (at night) or in snow,
       | thus completely preventing those (frequent types of?) accidents
       | from happening under Autopilot.
        
       | garbageT wrote:
       | Pre-print research based on an unpublished paper/dataset and
       | "personal correspondence" makes this questionable. Why are we
       | sharing this before knowledgeable and relevant reviewers get a
       | hold of it?
       | 
       | I'll wait for the peer-reviewed version.
        
         | kevingadd wrote:
         | So Tesla marketing gets the benefit of the doubt, but research
         | doesn't?
        
           | hunterb123 wrote:
           | That was never said. GP could doubt both.
           | 
           | I don't think his request is out of the ordinary here:
           | 
           | https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que.
           | ..
        
       | nja wrote:
       | I was just watching this video of a Tesla trying and miserably
       | failing to drive itself through the streets of Boston:
       | https://twitter.com/TaylorOgan/status/1488555256162172928
       | 
       | A number of these near-crashes seem to be pretty odd errors (a
       | couple of times it just decided to drive on the wrong side of the
       | street?) -- albeit ones that one _could_ see drivers unfamiliar
       | with the area making.
       | 
       | If a human driver started driving on the wrong side of a street,
       | etc -- at least in Boston -- they would be greeted by a chorus of
       | horns and rude gestures from those around, which would hopefully
       | indicate to them that they were in error. Brings up an
       | interesting question: does anyone know if Teslas are monitoring
       | for surrounding horns/etc?
       | 
       | Another common problem in Boston is being blocked by geese
       | crossing the road. Are Teslas able to detect such low, slow-
       | moving obstacles?
       | 
       | It seems like it's not super feasible to rely on self-driving
       | vehicles in a dense city like Boston, but maybe such features
       | could be geofenced to areas like highways?
        
         | JoshTko wrote:
         | To be fair, Boston city roads are like horse carriage routes
         | that were converted to streets, and will probably be one of
         | the last places FSD is optimized for.
        
           | nja wrote:
           | Interestingly, the routes in that video are in one of the few
           | parts of Boston with a street grid (Southie:
           | https://twitter.com/TaylorOgan/status/1488555262655045637 )
           | -- so I wonder how much _worse_ it is on the cowpath roads
           | elsewhere...
        
           | petee wrote:
            | Good point, Boston is a great real-world fuzzer. If they
            | spent their time making it work there, they'd have fewer
            | issues elsewhere.
        
           | mijamo wrote:
            | Those roads seem much easier to drive on than basically any
            | city I've been to in Europe. Paris is 10x more complex than
            | that, for instance. If FSD can't handle that, there's really
            | not much of a use case apart from highways in Europe.
            | 
            | To give you an idea of the incredible features in some
            | cities, I recently drove in Rennes, where they had the good
            | idea of making roads with a single central lane for cars and
            | a wide bicycle path on each side. You are supposed to drive
            | in the center, and if a car comes in the other direction,
            | each car needs to move onto the bicycle lane on its
            | respective side of the road while giving priority to
            | bicycles. I wonder how FSD would handle that...
        
             | masklinn wrote:
              | IIRC that concept actually comes from the Netherlands. I've
              | seen videos on Dutch road design showing it; I had no idea
              | it had been implemented in France.
        
           | mdoms wrote:
           | If FSD is expected to perform poorly on those roads then it
           | shouldn't be available on said roads and it certainly
           | shouldn't be legal to use it there.
        
           | croes wrote:
            | So the feature is useless in most European cities?
        
         | sakopov wrote:
         | This is interesting. I have seen similar videos of Tesla self-
         | driving successfully on similar type of streets in Seattle. I
         | wonder if the quality of FSD changes from city to city based on
         | how much data Tesla receives from other drivers in that city. I
         | guess I'm suggesting that Seattle could have more Tesla drivers
         | than Boston. I have no idea if any of this is true. Just a
         | thought.
        
         | ceejayoz wrote:
         | I like the guy who jumps in on the comments with "you're doing
         | it wrong".
         | 
         | https://twitter.com/ButchSurewood/status/1488602615407681541
         | 
         | Isn't the whole idea that you're _not_ doing anything?
        
       | borski wrote:
       | We need a startup that can test and correct dubious statistical
       | claims, or verify good ones. I would definitely pay for that.
        
         | Someone1234 wrote:
         | That would be cool, but companies are often not forthcoming
         | with the raw data needed to do such analysis independently
         | (particularly if it could make them look bad).
        
           | borski wrote:
           | Sure, but would they be if they could be "independently
           | verified" by a startup like this? It might be a PR win.
        
       | JoeAltmaier wrote:
       | Is it egregiously worse than normal human pilots? If not, then
       | that's an accomplishment in itself.
        
         | spywaregorilla wrote:
         | It's marginally better
        
           | JoeAltmaier wrote:
           | Right! A few years ago, claiming that a robot car would drive
           | as safely as a human driver would have been an incredible
           | claim. Just saying.
        
             | thomaskcr wrote:
              | It would have been. Musk didn't think so and felt the need
              | to embellish/exaggerate - his habit of exaggerating/lying
              | is, I think, more on trial here than the actual Tesla
              | vehicles.
        
       | lordnacho wrote:
       | Wow, adjusted for driver age the difference vanishes.
       | 
       | That's quite an interesting observation, and it shows the
       | importance of proper statistics. It also raises questions about
       | companies producing their own stats; it's often only when someone
       | takes the time to dig into them that we discover this kind of
       | thing.
       | 
       | I am not holding my breath waiting for the common consumer to
       | understand statistical critical thinking though. I sometimes
       | catch myself forgetting the due diligence even after many years
       | of forcing myself to think about stats.
       | 
       | Perhaps some sort of ombudsman could do this, pointing out when
       | stats are lying to people.
        
         | ProAm wrote:
         | If you have taken a course in statistics you'll know it's always
         | possible to make the numbers look favorable to your cause. It
         | would take several unrelated blind third parties to do this
         | correctly and no public company will ever let that happen,
         | especially Tesla.
        
           | kadoban wrote:
           | They should be forced to let that happen. It's crucial for
           | public safety.
        
             | IanDrake wrote:
        
             | ProAm wrote:
             | Public knowledge of safety affects stock price and brand
             | recognition. Will never happen.
        
               | gretch wrote:
               | This is an arrogant amount of speculation based on
               | nothing more than personal intuition.
               | 
               | In fact, there are plenty of examples where public
               | knowledge of safety is published by mandate of the
               | government. Here is an example of FAA flight data and you
               | can see details about incidents across airlines:
               | https://www.faa.gov/data_research/accident_incident/
               | 
               | I'm not going to spend time digging it up, but I'm pretty
               | sure there are just as many sources of data on other
               | important things such as pharmaceuticals e.g. the covid
               | vaccine.
               | 
               | So no, just because 'stock price go down' doesn't mean a
               | thing will never happen
        
         | tantalor wrote:
         | You could easily have a Simpson's paradox here, e.g., it's safer
         | overall, but for any single age group it's neutral or even
         | _less_ safe.
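         | 
         | A toy illustration with made-up numbers (not real crash data),
         | just to show how the aggregate can flip:
         | 
         |     # (miles, crashes) by driver age group -- invented figures
         |     autopilot = {"under_30": (100, 3), "over_30": (900, 9)}
         |     human     = {"under_30": (800, 16), "over_30": (200, 1)}
         | 
         |     rate = lambda miles, crashes: crashes / miles
         |     for group in ("under_30", "over_30"):
         |         # Autopilot is worse within each age group...
         |         print(group, rate(*autopilot[group]), rate(*human[group]))
         | 
         |     total = lambda d: (sum(c for _, c in d.values())
         |                        / sum(m for m, _ in d.values()))
         |     # ...yet looks better in the aggregate, because its miles
         |     # are concentrated in the lower-risk group.
         |     print("overall", total(autopilot), total(human))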
        
         | djanogo wrote:
         | They should adjust for car-age and price.
        
         | kfor wrote:
         | I highly recommend How to Lie with Statistics (1954) to learn
         | more about these sorts of misleading stats:
         | https://en.wikipedia.org/wiki/How_to_Lie_with_Statistics
        
         | contravariant wrote:
         | Driver age isn't the factor that explains the difference; the
         | accident rate adjusted for age shifts up by a similar amount
         | for both Autopilot and regular drivers.
         | 
         | What _truly_ makes a difference is the kind of road they 're
         | driving on. The accompanying paper provides a figure showing
         | Autopilot is essentially only used on freeways which are
         | inherently safer. Adjusting for that seems to explain most of
         | the difference.
         | 
         | Though personally I'd prefer to have just compared the accident
         | rate on freeways for both types rather than this weird
         | weighting method they try to use.
        
         | rurp wrote:
         | > It also raises questions about companies producing their own
         | stats
         | 
         | Companies saying "trust us, we have the data to prove it" while
         | keeping that data secret should be ignored. _Especially_ if
         | their CEO has a long history of blatant dishonesty! I 'm most
         | annoyed that people gave any weight to Tesla's safety
         | statistics claims in the first place.
        
           | pengaru wrote:
           | > I'm most annoyed that people gave any weight to Tesla's
           | safety statistics claims in the first place.
           | 
            | TSLA stock consistently made gains for a lot of people who
            | otherwise wouldn't be paying attention, let alone putting
            | any weight in anything Tesla-related.
        
           | dahfizz wrote:
           | > Companies saying "trust us, we have the data to prove it"
           | while keeping that data secret should be ignored.
           | 
           | While true, I do not think this is a great heuristic. How
           | would the average person know whether the "data" is
           | available? Most people don't even know what "data" really is.
           | They have never heard of CSVs or SQL. We need a better signal
           | for consumers to know what they can and can't trust.
        
             | rurp wrote:
             | A good journalist will include that context in a story. I
             | know I read at least one article that included these safety
             | claims while also noting that the data was private and
             | unverifiable.
             | 
             | I think it's a good default to assume any self-beneficial
             | corporate message is false, absent evidence to the
             | contrary.
        
             | captain_price7 wrote:
             | An average person doesn't need to directly access the data.
             | As long as journalists or researchers can, an avg person
             | could rely on their analysis/opinion instead.
        
               | dahfizz wrote:
                | I would certainly not rely on journalists to actually
               | have a look at the data, let alone to have the technical
               | skills to analyze and report on it. As it is, reporters
               | don't bother mentioning / linking to the data, or
               | mentioning the lack of publicly available data.
        
               | ceejayoz wrote:
               | It's OK to expect better from _both_ journalists and
               | companies.
        
               | jobu wrote:
               | Data-journalism is actually a thing, and bigger
               | organizations like the New York Times have done some
               | pretty impressive work building tools and training
               | programs for their reporters.
               | 
               | https://open.nytimes.com/how-we-helped-our-reporters-
               | learn-t...
        
           | Out_of_Characte wrote:
            | Companies don't validate their claims; they only provide data
            | and statistics to back up marketing angles. There's no point
            | to it either, since it would still be a conflict of interest
            | when you've researched your own product and found no errors
            | in it.
        
         | ajross wrote:
         | This isn't "proper statistics", this is P-hacking. Tesla's data
         | is incomplete and bad. But going out and finding confounding
         | variables to confirm your priors isn't the treatment.
        
       | hartator wrote:
       | Also, it feels like they compare Teslas with Autopilot vs. other
       | cars, instead of Teslas with Autopilot vs. Teslas without
       | Autopilot. Teslas have much better crash-test results, but that
       | doesn't mean Autopilot is safer.
        
         | maxerickson wrote:
         | Being safer in a crash doesn't make it okay to crash.
        
       | brk wrote:
       | Nobody working in machine vision is surprised by this. We are a
       | LONG way from having self-driving vehicles as a general thing,
       | particularly when Tesla wants to rely on MV as their source of
       | environment analysis.
        
       | paxys wrote:
       | I have felt since day 1 that "autopilot is better than the
       | average driver!" is the most misleading fact/statistic in the
       | industry.
       | 
       | The "average driver" involved in an accident includes drunk
       | drivers, road ragers, habitual speeders, teenagers, people
       | driving in bad conditions (rain, snow), people driving without
       | sleep and several other risky groups that most people buying
       | Teslas aren't part of. To be worth it, the car has to reduce _my
       | personal_ accident risk under the conditions I usually drive,
       | otherwise I prefer to keep my own hands behind the wheel over
       | handing over control to an algorithm.
        
         | [deleted]
        
         | stjohnswarts wrote:
         | So basically everyone in a car? :)
        
         | mrtksn wrote:
         | If I remember correctly, they also cover the easy part of the
         | road and make the human take over in tricky situations.
         | 
         | So at the end of the day, you have a spotless record for the
         | Autopilot that drove many, many miles in a straight line, and
         | humans having accidents at intersections or construction
         | sites. This translates into very favorable numbers for
         | Autopilot when presented as accidents per mile driven.
        
           | spywaregorilla wrote:
           | If it were the case that autopilot was benefiting from
           | dumping all of the difficult parts on the humans, you would
           | expect to see much higher than average rates of accidents per
           | mile on the human driven parts, because they're only getting
           | the hard parts. This tends not to be true, so I'm doubting
           | it's a real effect.
        
             | FireBeyond wrote:
              | Your conclusion relies on the - quite faulty - assumption
              | that "situations that are inherently difficult for FSD to
              | handle" are automatically "also more dangerous for human
              | drivers". In snowy conditions, humans do just fine,
              | generally, at following "lanes", be they the actual lane
              | or the safest route that everyone else is following.
              | Humans are also capable of deducing lane direction and
              | orientation even when there are contradictory/old lane
              | markings on the road, a situation in which FSD regularly
              | causes danger.
             | 
             | Or that that negative effect is lost in the orders of
             | magnitude of "all human drivers across all miles" versus
             | FSD.
        
               | tshaddox wrote:
               | You don't think that human driver accident rates in snowy
               | conditions are much higher than in fair weather
               | conditions?
        
             | mrtksn wrote:
             | Humans don't disengage and let the machines handle the
             | situation but machines do it all the time. Do the machines
             | disengage at tricky situations or straight lines?
        
               | spywaregorilla wrote:
               | The statistics are there to read however, and suggest
               | your reasoning is not correct.
               | 
               | Case A: Non Tesla Human drives 999 easy miles and 1 hard
               | mile
               | 
               | Case B: Autopilot drives 999 easy miles, and human drives
               | 1 hard mile
               | 
               | If the effect size of the hard mile is so large that its
               | skewing the statistics, you would expect the Telsa human
               | driver to have a horrendous per mile accident rate
               | relative to the non tesla human driver. What's most
               | likely is that the accidents following a human handoff
               | get correctly allocated to the Autopilot and the effect
               | size being described is not actually that significant.
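                | 
                | A quick sketch with toy numbers (assumed rates, purely
                | to illustrate the argument):
                | 
                |     easy_rate, hard_rate = 1e-6, 1e-5  # assumed rates
                |     easy_miles, hard_miles = 999, 1
                | 
                |     # Case A: a non-Tesla human drives everything
                |     case_a = (easy_miles * easy_rate
                |               + hard_miles * hard_rate) / 1000
                | 
                |     # Case B: Autopilot takes the easy miles, the
                |     # human only gets the hard mile
                |     case_b_human = hard_rate
                | 
                |     print(case_a)        # ~1.01e-6 per mile
                |     print(case_b_human)  # 1e-5, ~10x worse per mile
                | 
                | If the hand-off story were a big effect, the Tesla
                | human's per-mile rate should stick out like that; the
                | point above is that it doesn't appear to.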
        
               | buran77 wrote:
               | Student drivers (with an instructor in the car, and
               | secondary controls) have very few accidents, if any, not
               | because of _their_ safety record but because every
               | mistake is saved by the instructor. A car can be
               | considered as driving  "safer than a human" when it can
               | match the average human in _any_ conditions the average
               | human drives. Everything else is just squinting at the
               | data and _choosing_ an interpretation that fits your
               | personal opinion.
               | 
               | A Tesla can relatively safely cover traffic like divided
               | or controlled access highways. That's a very narrow slice
               | of all possible driving situations and not one
               | responsible for most accidents.
        
               | spywaregorilla wrote:
               | So as a thought experiment, if Autopilot + Human
               | intervention reduced the rate of accidents by 50% vs.
               | just humans after normalization; can we consider
               | autopilot to be adding value?
        
               | buran77 wrote:
               | Of course the Autopilot adds value, all such driver
               | assists are there to add value and help the driver on any
               | car. I just don't think Autopilot can _drive_ , let alone
               | _drive safer than a human_. There 's a difference between
               | "helping a human drive safer" and "driving safer than a
               | human". This is a confusion many people make when reading
               | these stats, to Tesla's benefit.
               | 
               | Also the oldest Tesla with AP is less than 10 years old
               | (with an average of 3-4 years given the sales trends).
               | The average age of cars on the street in the US is over
               | 12 years old. An accident statistic that looks at
               | reasonably new, premium cars against everything else
               | won't paint the correct picture.
        
               | Dylan16807 wrote:
               | If that means the supervision actually works, then it
               | doesn't sound like there's a problem?
        
               | runarberg wrote:
               | > Student drivers (with an instructor in the car, and
               | secondary controls) have very few accidents [...] because
               | every mistake is saved by the instructor
               | 
               | Is that the only reason? I suspect that many (most)
               | student drivers are also in general more alert, drive
                | slower, and follow each rule/guideline to an extreme. If
                | they don't, the instructor will probably end the lesson
                | and remove a dangerous driver from the road.
        
               | mrtksn wrote:
               | stats for easy miles and hard miles? link please?
               | 
                | Humans don't crash at every tricky situation, but Tesla
                | claims that humans are horrible drivers and their cars
                | give control to the humans when something happens.
        
               | spywaregorilla wrote:
                | Literally any distribution of hard miles and easy miles
                | will produce the same outcome in this thought experiment,
                | to varying degrees, if your premise -- that Tesla hides
                | the AI's incompetence by passing off to humans when
                | driving gets challenging -- is accurate.
                | 
                | If your premise is to be believed as a significant
                | effect, you must also accept that this outcome should be
                | visible in the data.
        
               | jonathankoren wrote:
               | It's not just that. There's a context switch when the
               | autodrive disengages, so the human is actually less ready
               | for the hard mile than if they were driving the whole
                | time. Sure, the human is supposed to be able to take
                | control at any time, but I don't think that really
                | happens. The whole purpose of autodrive is so you don't
                | have to pay attention and drive.
        
               | mkipper wrote:
               | Is this data available anywhere? How do I know that non-
               | autopilot Tesla mile per accident rates aren't horrible?
               | 
               | Tesla publishes their own data for the safety of
               | autopilot, which I presume is based on their own analysis
               | of accident records. Is this same detailed information
               | available to other groups (insurers, NIST, etc)? Or do
               | they just calculate an aggregate "Tesla mile per
               | accident" rate that is a blend of the great autopilot
               | rate and horrible human rate?
               | 
               | I'm not trying to be facetious here. I have no idea if
               | this data is available to any groups other than Tesla
               | themselves. And if so, do they publish those numbers?
        
               | spywaregorilla wrote:
               | Tesla releases the miles per crash rates quarterly, for
               | autopilot and non autopilot cases. Autopilot crashes
               | include anything within 5 seconds of disengagement. The
               | human rate tends to be more than 2x worse than the
               | autopilot rate. This is not normalized for factors like
               | road context.
               | 
               | The human rate for tesla driven miles tends to be ~4x
               | better than the other brands' average. To precisely
               | answer this question you would want to see both a
               | comparable brand's humans' performance; and probably the
               | split of humans who used autopilot and humans who don't
               | ever use autopilot. We don't have that, but in my
               | personal opinion there's enough evidence to suggest it's
               | probably not a grand conspiracy. I'm of the opinion that
               | autopilot being ballpark on par with other drivers is
               | more than enough to reduce accidents substantially, at
               | scale.
               | 
               | https://www.tesla.com/VehicleSafetyReport
        
             | jcranberry wrote:
             | Could you provide a source? That sounds fairly interesting.
        
               | spywaregorilla wrote:
               | Just some reasoning really. Statistics and proper
               | normalization are hard.
               | 
               | Tesla tends to say that autopilot crashes occur 1/2 as
               | often as non autopilot crashes. That's likely not
               | normalized to road conditions. But if you assume that
               | Tesla is secretly just putting all the hard miles on the
               | humans, then that would imply humans are driving many
               | more hard miles and should have higher accident rates.
               | The autopilots meanwhile must be performing worse on the
               | easy miles and racking up additional accidents that
               | wouldn't have otherwise happened.
               | 
               | If you combine those two, the overall rate of accidents
               | should be higher than average, but it's actually lower by
               | a fair margin. Again, normalization is hard.
               | 
               | Ideally you would be able to compare human drivers of
               | another comparable car brand to the human drivers of
               | Tesla to confirm the Tesla drivers don't seem to be being
               | judged on unreasonably difficult conditions.
        
               | mrtksn wrote:
                | There was a source but I could not find it ATM. It's
               | fairly simple, people don't disengage and their driving
               | safety is judged over all the miles they drive + all the
               | situations where Autopilot disengages.
               | 
               | Tesla Autopilot is judged only by the miles driven
               | without disengagement, which is quite limited actually.
               | You can watch Youtube videos to see at what kind of
               | situations Tesla autopilot gives up.
               | 
               | There's no situation where the Autopilot takes over from
               | the human saying "That's a tricky road, let me handle
               | it".
        
               | spywaregorilla wrote:
               | You seem to be missing the point though. If this were
               | significant, then human tesla drivers should be shown as
               | performing much worse than other car drivers, because
               | you're claiming they have a disproportionately large
               | riding time in "tricky roads".
               | 
               | A non tesla driver should be doing way better because
               | they get to pad their score with the easy roads the
               | autopilot supposedly gets.
        
               | FireBeyond wrote:
               | As mentioned elsewhere, just because a situation is
               | difficult for FSD to parse and process doesn't inherently
               | make it a dangerous situation for a human driver.
        
               | mrtksn wrote:
                | Maybe that's the case; Tesla's data isn't public. They
                | don't publish data, only conclusions, and their
                | conclusions are questionable.
        
               | spywaregorilla wrote:
                | Actually it's more an issue that other car companies don't
               | publish their data for comparison.
               | 
               | On a dumb average tesla is way better, but it'd be more
               | compelling if we could compare to new luxury brands with
               | similar target markets
        
               | mrtksn wrote:
                | How is it other car companies' fault that Tesla isn't
                | publishing data so we can check whether Autopilot miles
                | are mostly on straight roads and human miles are in
                | tricky situations?
        
         | spywaregorilla wrote:
         | So how much better than the median randomly assigned uber
         | driver does it need to be?
        
         | [deleted]
        
         | FireBeyond wrote:
         | Huh. Citation needed that drivers who buy, say, a Model S,
         | particularly one with Ludicrous mode are less likely to be
         | "habitual speeders" than the average driver.
         | 
         | This has come up before. When the new FSD beta started, people
         | started claiming that safe driving was a function of vehicle
         | price, and therefore Tesla drivers, especially those who had
         | paid for FSD, were more likely to be safer. When I noted that
         | my current vehicle costs more than a Model S, and that by this
         | logic Tesla should be recruiting me to beta test FSD, it was
         | hard to find a refutation.
        
         | tshaddox wrote:
         | But that's not really contradictory, is it? If you believe that
         | you're a better driver than autopilot, just don't use it, or
         | buy another car. It's not at all dishonest or deceptive
         | marketing (if the claim is factually accurate, of course).
        
         | ajross wrote:
         | > several other risky groups that most people buying Teslas
         | aren't part of.
         | 
         | This is a fallacy. Ask any demographic you like if they're road
         | ragers, drunk drivers, etc... and they'll deny it. Everyone is
         | sure that rare and terrible outcomes will never happen to them
         | because of their own behavior, _including_ the people to whom
         | it happens!
         | 
         | It's _probably_ true that middle aged premium car owners are
         | safer on average, but you don 't get to rule that stuff out by
         | fiat. In fact there have been a few "Autopilot Saves Passed Out
         | Driver" stories over the past few months, where Tesla drivers
         | were clearly impaired.
         | 
         | As for this particular paper: this is just P-hacking folks. You
         | can't take a vague dataset[1] and then "correct" it like this
         | without absolutely huge error bars. Why correct based on only
         | these variables? Why did they all push the results in one
         | direction? Why not gender? Why not income? Why not compare vs.
         | like cars?
         | 
         | This isn't good statistics, it's just more statistics. If we
         | want the real answers we should get a better data set (which,
         | I'll agree, would likely involve some regulatory pressure on
         | manufacturers).
         | 
         | [1] And, to be fair: Tesla's safety report is hardly
         | comprehensive and provides no data other than the aggregate
         | numbers.
        
           | stjohnswarts wrote:
           | I just think a lot of Tesla drivers see themselves as elite
           | drivers or something when we all know that's not true if you
           | step back away from the situation.
        
           | Majromax wrote:
           | > You can't take a vague dataset and then "correct" it like
           | this without absolutely huge error bars.
           | 
           | That ship has already sailed; Tesla makes active safety
           | claims based on that dataset. To hold research to your
           | standard here would be to say that Tesla can make its claims,
           | but nobody can challenge those claims.
           | 
           | > Why did they all push the results in one direction?
           | 
            | If you have two correction steps, then a priori you'd expect
            | a 25% chance that both push in this particular direction. I
            | don't think this is very remarkable.
           | 
           | > Why not gender?
           | 
           | This probably would be a reasonable addition. I doubt it
           | would change the results much, but we have the well-known
           | fact that insurance rates differ for men and women, so it may
           | be relevant.
           | 
           | > Why not income?
           | 
           | First, is this data even generally available? If the data
           | doesn't exist, then we can't control for it.
           | 
           | Second, should we expect crash rates per mile driven to
           | differ greatly by owner/driver income? A priori, I wouldn't
           | think this demographic quality to have a strong impact.
           | 
           | > Why not compare vs. like cars?
           | 
           | I think "personal vehicle" is a reasonable comparative
           | category. Would we expect collision rates to differ greatly
           | between more specific categorizations? For the sake of the
           | overall conclusion, would we expect Tesla-equivalents to be
           | particularly crash prone in the broad dataset?
           | 
           | Any statistical analysis will have its limitations, but when
           | we're talking about life-safety claims from a manufacturer I
           | think we should have wide latitude to look at critical
           | evaluations of the original data.
        
           | [deleted]
        
         | beambot wrote:
         | Confounding factors matter.
         | 
         | E.g. I would not be at all surprised if drivers of high-end
         | BMWs have lower deaths per mile due to older population, more
         | affluent, etc.
        
         | jjulius wrote:
         | I don't disagree with your broader statement about autopilot's
         | safety relative to other drivers, but...
         | 
         | >The "average driver" involved in an accident includes drunk
         | drivers, road ragers, habitual speeders, teenagers, people
         | driving in bad conditions (rain, snow), people driving without
         | sleep and several other risky groups that most people buying
         | Teslas aren't part of.
         | 
         | ... _what_? Tesla owners absolutely can be habitual speeders,
         | people who drive without sleep, road ragers, and drunk drivers.
         | Can you please explain your thought process behind this claim?
         | I really don 't understand why anyone would assume that Tesla
         | owners are less likely to have those traits than non-Tesla
         | owners.
        
           | w-j-w wrote:
        
           | sudosysgen wrote:
           | Because they are buying an expensive, new car, meaning higher
           | income, meaning that those things are less likely :
           | https://academic.oup.com/alcalc/article/46/6/721/129644
        
             | capableweb wrote:
              | In my personal experience, the "drunk drivers", "road
              | ragers", and "habitual speeders" groups seem to be over-
             | represented in expensive cars rather than cheaper cars.
             | Poor people cannot afford to crash their car while rich
             | people can.
        
               | paxys wrote:
                | Data released by the DOL and insurance companies does
                | not match your personal experience. There's a reason
                | 18-year-olds driving shitty cars pay through the nose
                | for liability coverage.
        
               | huubuub wrote:
               | I do not think a car crash is normally the result of a
               | decision people make based on whether they can afford it
               | or not.
        
             | aeternum wrote:
             | Keep in mind that this study could just mean that people in
             | expensive cars with higher incomes are less likely to be
             | arrested/stopped by police.
        
               | lelandfe wrote:
               | Hmm.. The underlying studies (Baum, 2000; Eensoo et al.,
               | 2005) that found links between socioeconomic status and
               | DUI/DWI did so by comparing those arrested against a
               | control group - which I believe should deal with that
               | concern?
        
           | paxys wrote:
           | For your specific question, actuarial tables for car
           | accidents/deaths are available publicly, and the profile for
           | the typical Tesla owner (middle aged, high income, driving
           | expensive sedan/SUV) is considered much safer than average.
           | 
           | Moreover it is true in general that most people driving
           | aren't drunk or high, aren't reckless etc., and so most Tesla
           | drivers aren't either. The number of accidents isn't
           | uniformly distributed among the public.
           | 
            | More broadly though, I never said that zero Tesla owners (or
            | drivers of any other specific car) do any of these things,
            | but that switching to an autopilot with an above-average
            | safety record can still mean the personal safety of an
            | individual or group of individuals goes down. For the system
            | to be a net benefit it has to reach the below-average
            | drivers.
        
           | vletal wrote:
            | Yeah, seems like the kind of typo GPT-3 would make.
           | 
           | The claim seems to be that Teslas do not drink before
           | driving, not Tesla owners, right?
        
           | jahewson wrote:
           | I read it that way at first too but I think the reasoning is
           | actually correct. The population of Tesla drivers will of
           | course resemble the overall population of drivers but most
           | drivers are not sleepy/raging/drunks and by extension "most
           | people buying Teslas aren't part of" that group either.
           | 
           | To use an analogy: the average person has 1.9 legs but a
           | product which results in its user having 1.95 legs is not an
           | improvement for most people.
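            | 
            | To make the arithmetic behind the analogy explicit (with a
            | hypothetical split, of course):
            | 
            |     two_legged, zero_legged = 0.95, 0.05  # hypothetical
            |     mean_legs = 2 * two_legged + 0 * zero_legged
            |     print(mean_legs)  # 1.9 -> a product "beating the
            |                       # average" at 1.95 still shortchanges
            |                       # the two-legged 95%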
        
             | ketzo wrote:
             | Wow. Excellent analogy.
        
           | jasonhansel wrote:
           | For an individual person to be willing to adopt autopilot on
           | safety grounds, autopilot has to be safer than _that
           | individual person 's_ driving, not just safer than the
           | _average person 's_ driving.
           | 
           | If your driving skill is above the mean (because you're never
           | a road rager, speeder, etc.), then you're worse off using
           | Autopilot, even if Autopilot is as safe as the average
           | driver.
           | 
           | Since most people probably consider themselves above average
           | drivers (and in fact most people may _be_ above the mean if
           | the bad drivers are outliers), this limits the number of
           | people who will believe that Autopilot makes them safer.
        
         | Xylakant wrote:
          | I think it's fair to compare to the average driver. You may not
          | be average by virtue of not being a teenager and not driving
          | drunk, but you're not magically immune to being run over or
          | crashed into by one of those. So even if a hypothetical FSD-
          | capable car would not drive one iota better than you do, it
          | would still make you safer by virtue of making the surrounding
          | environment safer.
        
           | paxys wrote:
            | Sure, the worst drivers out there using assist tech like
            | autopilot would benefit everyone, but data shows that they
            | aren't the ones spending money on fancy cars. My comment was
            | about me making a decision for myself.
            | 
            | Plus, your hypothetical assumes the autopilot can drive at
            | least as well as me, which is a big assumption and not
            | something I believe based on what I have seen so far.
        
           | djanogo wrote:
            | It can't be just any "average driver"; they should be
            | compared against an "average driver" who just bought a
            | $40-50K NEW car.
        
             | sandworm101 wrote:
             | >> "average driver" who just bought $40-50K NEW car.
             | 
              | Driver and buyer/owner are different things. Few teenagers
              | ever buy a Tesla, but they certainly drive them. A car
              | cannot only be safe in the hands of the rich initial buyer
              | who lives in a nice climate. It must be safe in the hands
              | of all the other people that owner may let drive. It must
              | also be safe for subsequent second and third owners,
              | people who might not be wealthy enough to stay home when
              | it rains or when the autopilot thinks conditions are too
              | rough. (It was -35C on my drive to work this morning.
              | Dark. Ice fog, then blowing snow.)
        
               | IshKebab wrote:
               | Uhm, I don't know what insanely rich middle eastern
               | country you're from but in most of the world teenagers do
               | not generally drive Teslas!!
        
               | sandworm101 wrote:
                | Have you been to LA, SF, Vancouver or Denver? These are
                | Teslas, not Ferraris. Lots of teenagers are driving their
                | parents' Tesla. Go look at any university parking lot and
                | you will find plenty of $50,000 cars.
        
             | jeffbee wrote:
             | Exactly right. The population has to be compared to people
             | driving new Volvo S90 or M-B C300, cars that have IIHS
             | fatality rates of zero.
        
             | obmelvin wrote:
             | If you are speaking about the safety features in a new car,
             | then I won't disagree. However, if you are implying that
             | those with a nice new car are somehow better drivers, more
             | attentive drivers, or care more about their car then I
             | would highly disagree. As someone who purchased a new car
             | last year, and does greatly care about it, I'm routinely
             | shocked by people and their clear disregard for their own
             | vehicle. Reckless driving, texting while not even
             | attempting to look, and the way people park their cars so
             | close to you that they can't even get out* have all shocked
             | me now that I'm paying more attention.
             | 
              | * I have even stared at a guy struggling to get out of
              | his fancy Jeep who opened his door into my car. I know
              | this is anecdotal, but all it takes is a few
              | counterexamples to show that car price etc. does not
              | correlate strongly with how careful & considerate you are.
        
               | heavyset_go wrote:
               | Look at the accident and fatality rates for drivers of
               | luxury cars. They are much, much lower than the same
               | rates for less expensive cars. In fact, if you break it
                | down by vehicle model, there are several models with no
                | fatalities at all.
        
           | rightbyte wrote:
           | No.
           | 
            | I'm not going to boast about how good of a driver I am (I
            | am!!), but let's say I am a median driver. I would want an
            | FSD to be _better_ than a median driver, not better than the
            | average crash/mile stats that include all kinds of long-tail
            | idiots doing the most damage.
            | 
            |     * ... alcohol-impaired driving crashes, accounting for
            |       28% of all traffic-related deaths in the United
            |       States.
            | 
            |     * Drugs other than alcohol (legal and illegal) are
            |       involved in about 16% of motor vehicle crashes.
           | 
           | https://www.cdc.gov/transportationsafety/impaired_driving/im.
           | ..
           | 
            | So roughly 44% involve intoxication to some degree (unless
            | the stats overlap). That number doesn't even include
            | reckless drivers who aren't under the influence, teenagers
            | who just got their license, elderly drivers in poor health,
            | etc.
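            | 
            | A rough sketch of why simple addition is only an upper bound
            | here: the two figures use different denominators (deaths vs.
            | crashes) and the overlap is unknown. The overlap value below
            | is purely hypothetical.
            | 
            |     # CDC figures quoted above (different denominators:
            |     # share of deaths vs. share of crashes)
            |     p_alcohol = 0.28
            |     p_other_drugs = 0.16
            |     p_both = 0.05  # hypothetical overlap, not a CDC number
            | 
            |     # if these were shares of the same population,
            |     # inclusion-exclusion would give the combined share:
            |     combined = p_alcohol + p_other_drugs - p_both
            |     print(f"{combined:.0%}")  # 39% under these assumptions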
        
           | cma wrote:
            | At one point Tesla's claim relied on the inclusion of
            | motorcycles, which seems super misleading and unfair.
        
             | FireBeyond wrote:
              | Misleading, from a company that once (several days ago)
              | claimed that they sold more cars in Australia last year
              | than the Camry did.
             | 
             | Until someone pointed out that vehicle registration...
             | disagreed. By nearly 30%.
             | 
             | Then "Oops. We made a mistake and counted 3,000+ deliveries
             | which haven't actually been, well, delivered."
             | 
             | I hope they get spanked hard for that. Australia has very
             | onerous "truth in advertising" laws.
        
           | yumraj wrote:
            | Given what I've been hearing/reading about _autopilot_, I'm
            | more worried about being hit by one of them than by a human
            | driver.
        
           | larksimian wrote:
            | My problem is that the median driver is much, much better
            | than the mean driver, and Tesla is comparing against means.
            | 
            | The distribution of damage done by drivers is heavily skewed
            | towards the people who get in fatal crashes or total a car.
            | I'd guess maybe 20-30% of drivers are worse than the mean.
            | Most people won't total a car, and the vast majority will
            | never kill anyone.
            | 
            | If you get median people into an autopilot car that has only
            | a mean safety record, we end up pulling the average down and
            | the road becomes less safe. And luxury sedan buyers tend to
            | be one of the safest demos on the road, which makes the
            | problem even worse.
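            | 
            | A toy simulation of that skew, with made-up numbers rather
            | than real crash data, just to show how a small high-risk
            | tail pulls the mean well above the median:
            | 
            |     import numpy as np
            | 
            |     rng = np.random.default_rng(0)
            |     # most drivers: low crash rates; a small tail is risky
            |     safe = rng.exponential(0.5, size=9_000)
            |     risky = rng.exponential(10.0, size=1_000)
            |     rates = np.concatenate([safe, risky])
            |     print(round(rates.mean(), 2))      # dragged up by tail
            |     print(round(np.median(rates), 2))  # far lower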
        
             | xapata wrote:
             | Median vs mean of what measure?
        
       | NaturalPhallacy wrote:
       | But is it safer than human drivers?
       | 
        | This is the real question, because a "perfect" autopilot may be
        | impossible: there will be cases where it must choose between
        | sacrificing something outside the car and sacrificing its
        | occupants, and people will disagree about the right answer.
       | 
       | Setting the bar at "perfect" is pretty unrealistic given that
       | humans are already pretty bad.
        
       | amelius wrote:
       | Can we have Self-driving Formula 1 before we get these cars on
       | the street please?
        
       | pengaru wrote:
        | Cruise control (including Autopilot) is only applicable to a
        | subset of ideal driving conditions anyway.
        | 
        | You'd only compare cruise-control crash rates against general
        | driving in all conditions if you were trying to abuse
        | statistics to market your gimmick cruise-control feature.
        | 
        | It's unclear to me how you'd even attempt to normalize the data
        | when you don't have national crash rates for cruise-control-
        | only miles in non-Tesla vehicles.
        | 
        | I saw mention that one of the TLAs investigating Teslas
        | crashing into emergency vehicles subpoenaed _all_ manufacturers
        | of vehicles sold with L2 cruise-control features for their
        | safety data. Maybe that will produce some normalized-enough
        | data for making a meaningful comparison.
        
       | YaBomm wrote:
        
       | [deleted]
        
       | spywaregorilla wrote:
       | I find this headline to be pretty misleading.
       | 
        | * It doesn't find anything to suggest Autopilot is less safe
        | than claimed. It finds reasons to believe Autopilot's safety
        | improvement relative to humans is smaller than claimed.
        | 
        | * Under both the road-adjusted model and the road+age-adjusted
        | model, Autopilot outperforms the average driver on average. Not
        | by much, but it's there.
        | 
        | * Everyone here seems to be assuming that the takeaway is that
        | Autopilot is less safe than humans, but there's no evidence for
        | that. If the numbers here are to be believed, I would take it
        | as a good sign for Tesla, and by definition an advantage to
        | people who are otherwise poor drivers?
        | 
        | * Question: why do crashes per million miles have such large
        | swings in the baseline? It's weird that they vary so heavily by
        | quarter without any obvious seasonal pattern that I can see.
       | 
       | edit: mildly annoyed at getting downvoted for seeming to be the
       | only person here noting that the blue line is on average below
       | the red line on every chart shown.
        
         | dboreham wrote:
         | Perhaps the downvotes are because most people would be
         | apprehensive about putting their life in the hands of an
         | "average" safe driver that's built from software? I probably
         | wouldn't allow any other average safe human to drive me, if the
         | alternative of me driving myself is available.
        
           | spywaregorilla wrote:
            | You never take Ubers, on the grounds of safety? Because an
            | Uber driver is, on average, an average human.
           | 
           | Either way, I'm not too impressed. I wouldn't purchase
           | Autopilot personally for similar safety reasons. But let's
           | not twist the stats to suggest they're actually much worse
           | than they are.
        
         | leobg wrote:
         | You've been commenting on a tweet by Edward Niedermeyer. The
         | guy who created a site called "Tesla Death Watch" back in 2008.
         | 
         | These downvotes have got nothing to do with you or with what
         | you pointed out. Just some people here with sour grapes.
        
       | maxdo wrote:
       | >Although the ages of drivers in Tesla's crash rate data are
       | unknown, their ages could be estimated from a 2018 demographic
       | survey of 424 Tesla owners (Hardman et al., 2019).
       | 
        | Seriously? You're trying to claim your method is fairer using
        | 424 people surveyed in 2019, when Tesla has demographics on
        | millions of cars across the globe?
       | 
       | My conclusion:
       | 
        | The author is saying it's not fair to compare against some old
        | car. Tesla was one of the first brands to adopt these safety
        | features across the board, in every Tesla. It is the only brand
        | that includes cameras and collision warnings in every trim,
        | even the cheapest one, and it was first to introduce many of
        | these features. The rest are catching up. Why can't Tesla be
        | proud of that? They give safety to every customer, whether it's
        | the cheapest option or a $150k+ car. Only Volvo invests on par
        | with Tesla in testing facilities, in-house extended crash
        | tests, etc.
        | 
        | Autopilot and similar systems are safer than any car on the
        | highway with old cruise control because they have features that
        | force you to stay focused. They can stop you at a traffic
        | light, they can react faster to an animal running across the
        | road, etc. And yes, all measures like collision warnings and
        | notifications make your driving safer. Yes, you can get those
        | without Autopilot, but as a consumer I don't care. If I'm
        | buying a car, the fact that it is much safer than a 10-year-old
        | Ford makes me more confident in the purchase.
        | 
        | You can debate how easy it is to defeat some of these safety
        | measures, but if you're an idiot you can fall asleep in a car
        | with plain cruise control as well. A Tesla will at least try to
        | wake you up, and it will stop at a traffic light.
        | 
        | FSD is a good example of this rational-vs-emotional thinking.
        | FSD Beta has very strict attention monitoring: look away from
        | the road a few times and you're blocked; fail to hold the wheel
        | a few times and you're blocked; etc.
        | 
        | So even though FSD Beta in its current state is complete
        | garbage that quite often literally drives you toward an
        | accident, the way it enforces your attention fixes everything.
        | 
        | FSD Beta, with these limitations, is on 60k cars on the road,
        | with zero major incidents leading to serious injury or death,
        | not counting a few scratched rims and one time a driver went
        | off the road because he overreacted to a maneuver.
        | 
        | It's bizarre how much money legacy auto is pouring into efforts
        | to make Tesla look bad. Before, it was the infamous data about
        | burning EVs, even though by that data a single brand, BMW, has
        | something like 100 times the fire risk of a Tesla. You don't
        | see paid articles in Times Square about how dangerous BMW is.
        | 
        | Now that all major brands are investing in EVs, suddenly all
        | those fire stories have disappeared and the attacks have
        | switched to Autopilot and FSD.
        | 
        | Is there room for improvement? Yes. Is a car that can stop at a
        | traffic light, warn about potential collisions, and
        | automatically avoid some of them (after some fixes for the
        | police car issue) safer? Definitely yes. The rest is just hype,
        | speculation, and quite often corruption and paid articles.
        
       | laomai wrote:
       | Regarding some of the conversations about programming something
       | to obey the law or obey human convention:
       | 
       | - it's tricky because the people on the roads expect human
       | convention and drive for that
       | 
       | - as long as there is a mix of humans that drive by convention,
       | it's probably best to err on the side of following human
       | convention to some degree
       | 
       | Case in point: center lane
       | 
        | When I first started using autopilot on two-lane roads, the car
        | would stay dead center in the lane. When a car was coming
        | towards me in the opposite lane, I began to notice that humans
        | would veer away from center to provide more buffer between
        | themselves and the oncoming car.
        | 
        | Because I didn't want to piss other drivers off, I would often
        | disengage and drift to the right of my lane while the oncoming
        | car approached me. If I didn't do that, there was always a
        | last-minute extra drift away from me by the other car... the
        | conventionally expected buffer distance wasn't enough to make
        | them feel comfortable, given the unexpected (lack of) behavior.
       | 
        | It's not a law / no-law issue above. But I have similar
        | experiences navigating into roundabouts with crosswalks in
        | front of them (autopilot stops at an empty crosswalk where a
        | normal driver would cruise up to the stop line to check for
        | cars in the roundabout -- and, if none were there, might pass
        | through with a rolling stop/check).
        
       | omgwtfbyobbq wrote:
       | Tesla should release demographic info if they have it. IIRC,
       | luxury vehicles tend to have lower accident/fatality rates
       | because their buyers tend to be more experienced, mature, and
       | wealthier.
       | 
       | I'm also wondering about the assumptions the pre-print makes.
       | 
       | https://engrxiv.org/preprint/view/1973/3986
       | 
       | For example, it's assumed that Tesla's data has the same
       | characteristics as the SHRP 2 NDS data, and that the demographic
       | survey data from Hardman et al is accurate WRT the SHRP 2 NDS
       | data.
       | 
       | > The analysis in this paper relies on the assumption that the
       | freeway-to-non-freeway and age group crash ratios found in the
       | SHRP 2 NDS are consistent with the manufacturer's data, as there
       | are no roadway specific nor age-related factors in the
       | manufacturer safety report.
       | 
       | > Although the ages of drivers in Tesla's crash rate data are
       | unknown, their ages could be estimated from a 2018 demographic
       | survey of 424 Tesla owners (Hardman et al., 2019).
       | 
        | But... It seems like Hardman et al. are using California-
        | specific data.
       | 
       | https://escholarship.org/content/qt58t7674n/qt58t7674n_noSpl...
       | 
       | > In this study, we used a cohort survey of Plug-in Electric
       | Vehicles (PEV) owners in California administered by the authors
       | in November 2019. Respondents had been previously surveyed by the
       | UC Davis Plug-in and Hybrid & Electric Vehicle (PH&EV) Research
       | Center between 2015 and 2018 as part of four surveys in the eVMT
       | project when they originally bought their PEV. Respondents for
       | the four phases of the eVMT survey were sampled from the pool of
       | PEV buyers who had applied for the state rebate from the
       | California Vehicle Rebate Program (CVRP). More than 25,000 PEV
       | owners were surveyed between 2015 and 2018. A total of 15,000 of
       | these respondents gave consent to be re-contacted and were
       | invited for the repeat survey in 2019. In all, 4,925 PEV owners
       | responded to the repeat survey.
       | 
        | And... the newer data indicates that age could be correlated
        | with Autopilot use and with long-distance travel (itself
        | correlated with more Autopilot use), which seems like a
        | confound WRT the pre-print's conclusions and so on.
       | 
       | > Age is negatively correlated, indicating that the lower the
       | driver's age the higher the odds of reporting more long-distance
       | travel. This suggests Autopilot induces travel among younger
       | Tesla owners. As with the model with all Level 2 automation,
       | users' income is negatively correlated.
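        | 
        | For what it's worth, the kind of re-weighting the pre-print
        | describes (adjusting the comparison baseline for road type and
        | for the Tesla owner age mix) looks roughly like the sketch
        | below. Every number here is made up for illustration; none of
        | them are the paper's actual values.
        | 
        |     # hypothetical sketch, not the paper's figures
        |     baseline = 4.0       # crashes per million miles, all roads
        |     freeway_ratio = 0.4  # freeway rate relative to overall
        |     freeway_share = 0.9  # assumed Autopilot freeway mile share
        |     age_ratio = 0.7      # ratio for the older Tesla owner mix
        | 
        |     adjusted = baseline * age_ratio * (
        |         freeway_share * freeway_ratio + (1 - freeway_share)
        |     )
        |     print(f"adjusted baseline: {adjusted:.2f} per million miles")
        |     # against this lower baseline, Autopilot's headline
        |     # advantage shrinks -- the pre-print's central point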
        
       | savant_penguin wrote:
        | So Tesla owners belong to some specific age group and that
        | makes it safer?
        
       | croes wrote:
       | I doubt that full self driving is possible in the near future
       | without changes to the cars and the infrastructure to help the
       | autonomous systems.
        
         | petilon wrote:
         | It is available today in a limited area:
         | https://techcrunch.com/2021/11/03/cruise-launches-driverless...
        
       | irthomasthomas wrote:
       | How is this still front page? Somebody sleeping?
       | 
        | In case you missed them, there have been 5 Tesla stories nuked
        | from the front page today:
        | 
        |     'Self driving' Tesla fails miserably on the streets of
        |     South Boston (30 points, 22 comments)
        |     https://news.ycombinator.com/item?id=30173566
        | 
        |     Tesla Australia admits its sales figures were wrong
        |     (39 points, 14 comments)
        |     https://news.ycombinator.com/item?id=30176635
        | 
        |     Tesla drivers report a surge in 'phantom braking'
        |     (42 points, 79 comments)
        |     https://news.ycombinator.com/item?id=30179681
        | 
        |     Self-driving Tesla does 'the craziest things you can
        |     imagine' (96 points, 172 comments)
        |     https://news.ycombinator.com/item?id=30178168
        
         | [deleted]
        
         | LeoPanthera wrote:
         | There are valid criticisms of Tesla, this normalized data story
         | is probably one of them, but the fans and anti-fans of Tesla
         | are so equally rabid that it is nearly impossible to have a
         | rational conversation about it.
         | 
         | Expressing a view in either direction results in an instant
         | pile-on of trolls and flames.
        
           | toomuchtodo wrote:
           | There are valid criticisms, and there is being...overly
           | enthusiastic.
           | 
           | https://news.ycombinator.com/submitted?id=camjohnson26
        
           | irthomasthomas wrote:
           | So Tesla news can be discussed anywhere except HN?
           | 
           | The reason HN is exceptional is that it has a lot of readers
           | working on or near AI, who will quickly call out the B.S. in
           | Tesla PR. They know this, so all Tesla stories get nuked.
        
         | contravariant wrote:
          | Lack of the word 'Tesla' in the title probably does it, as
          | well as not tripping the flame-war detector. We'll see how
          | this thread fares.
        
         | [deleted]
        
       ___________________________________________________________________
       (page generated 2022-02-02 23:00 UTC)