Self-driving Tesla does 'the craziest things you can imagine' (boston.com)
128 points by codechicago277 on Feb 2, 2022 | hide | past | favorite | 209 comments


Just to clarify the current process of being able to use the Tesla FSD beta because I've seen a lot of comments by people thinking Tesla released this and cars are now just driving around without a human driver:

1. pay $12k or $200/mo

2. Opt-in for beta. Drive over 100-200 miles and maintain a 98%+ "safety score" until you're selected (could be months)

3. Manually activate FSD each time you want to use it

4. Keep your hands on the wheel while it's engaged and be prepared to take over (it beeps and deactivates if you don't)

5. Be continuously recorded by a camera while FSD is on


Should this be legal? Some rich Tesla fanboy beta testing buggy software, with the misleading name of Autopilot or FSD, on the streets where your children walk?

I am not aware of such laws, but they should exist; otherwise other companies can start alpha testing with random fanboys.


As one of the “rich Tesla fanboys” you describe, no, probably it should not be legal.

That being said, it has made me re-realize that humans are really terrible drivers as well, myself included, just less terrible on average, currently.


I’m curious how a bad (terrible) “full self driving” car makes you realise humans are “really terrible drivers”. I can see how some drivers miscalculate and end up making a risky manoeuvre. It happens, intentionally or not, and most people can and will react in a way that avoids a collision. A Tesla, on the other hand, is beyond bad or terrible. It is basically incompetent: driving into oncoming traffic, opposing lanes, deviating wildly from the lane, ignoring speed limits, etc.


I respect your honesty and encourage you to try some remedial driving lessons to avoid being a “really terrible driver.” It can save your life and others also.


I'm a great driver on a relative basis. The average is just that terrible.


It's really hard to estimate that. The vast majority of drivers believe they are better than the average driver.


Probably this is true, but I bet accurate self-evaluation based on facts is positively correlated with driving skill.


All the remedial driving lessons in the world can't prevent you from making a mistake and killing someone.


If you mean prevent 100%, of course not since nothing is absolute.

But driving skill training can most certainly greatly increase your skills and awareness and thus immensely reduce your risk of mistakes.

Anyone who drives should take as many car control and accident avoidance clinics as they can and refresh every few years (if not more often). These should really be required by law, but since they're not, you can increase your odds of living by being proactive about it.

It's weird that this is somehow controversial for car driving. For airplane pilots, for example, there is a long and strict training and skill-maintenance routine, because this obviously makes for safer pilots. You don't just get to read a booklet and go flying.


Pilots spend a lot of time on simulators, I believe.

Seems like there would be economies of scale if the federal government created some really realistic simulators programmed with all the common accident situations and then everybody could be required to pass a test every four years or something.

I imagine it would be so much more enjoyable and hopefully more effective than the defensive driving classes that exist now.


Yeah, they can. If you practice defensive driving you are less likely to kill someone.


I drive a Tesla with FSD in an urban area, and based on what I've seen it do, I am 1000x more scared of some rich boy driving his new Lambo around hitting a pedestrian than of my Tesla on FSD, even with no one in it (if that were possible).


You're right, we probably shouldn't make reckless driving legal either.


That's technically not the correct analogue to what I said


You should probably demand that rich dudes get proportional fines; that would partially solve the problem. Probably the same irresponsible teen is also alpha testing a Tesla while he's watching Netflix on the dashboard, so there's no win in moving him from a Lambo into a Tesla.


I would hope to whatever God is left on this accursed Earth that Tesla dashboards will refuse to play video while the car is in motion? Or that they have that mirror trick where the video is only viewable from the shotgun seat.


A Tesla vs. a Lamborghini is a peculiar comparison. What self-respecting Tesla fan would keep quiet about how their Model S does 0-60 quicker?

Seriously:

2021 Lamborghini Aventador: 2.8 seconds

2020 Tesla Model S Performance: 2.4 seconds

(Car & Driver figures)
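For context, those 0-60 times work out to roughly one g of average acceleration. A quick sketch of the arithmetic (using the times quoted above):

```python
# Average acceleration implied by a 0-60 mph time, expressed in g.
MPH_TO_MS = 0.44704  # metres per second per mph
G = 9.81             # standard gravity, m/s^2

def avg_accel_g(zero_to_sixty_s: float) -> float:
    """Mean acceleration over a 0-60 mph run, in multiples of g."""
    return (60 * MPH_TO_MS) / zero_to_sixty_s / G

print(f"Aventador:           {avg_accel_g(2.8):.2f} g")  # → 0.98 g
print(f"Model S Performance: {avg_accel_g(2.4):.2f} g")  # → 1.14 g
```

So both cars pull close to the limit of what street tires can transmit; the Tesla's edge comes from instant electric torque off the line.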


Yes, but how does it compare to a normal responsible driver? Because that is the use case here, not the idiot with an exotic they can't handle.


It doesn't compare. It cannot drive by itself yet, it disengages in many normal-to-a-human situations with no ability to resume by itself.


I'm having trouble understanding your point.

You're more scared of rich boys in lambos than of a Tesla that "disengages in many normal-to-a-human situations with no ability to resume"?


The Tesla isn't allowed to drive around without human supervision yet. The rich kid in the Lambo is.


One behavior is discouraged by the manufacturer, the other is encouraged. I'm not sure the analogy works.


OK, but do you have a lot of faith in human drivers, a huge percentage of whom are talking on their phones with other people for work or play, some of whom I've witnessed holding up the video with one hand so their faces can be seen?

Forget FSD - whether it's Tesla or others, automation in the car that stops when a living being is detected in front seems like an advance, while human drivers sitting in a sales meeting on wireless AirPods appear to be an increasing threat to pedestrians.


> Should this be legal?

It should not be legal to do testing on public streets with unwilling participants.

Tesla should be required to be able to enable this only in closed areas where access is limited to those participating in the testing.


Why shouldn't it? Anyone can buy 500, 700, 1100hp, etc. RWD production cars and YouTube is covered in videos of people using them to cause destruction of property and injuries because they don't know what the car is going to do. These muscle cars, supercars, and hypercars require control inputs that are not necessarily intuitive and could be very surprising. At least Tesla is forcing people to go through a safety test first. These cars can be operated safely. If and when there is sufficient evidence that they can't, I expect things to change.


You can't drive cars that are illegal for public roads, so my point is: why should a car that can randomly drive you into a child be legal for public roads?

Let's assume Elon and Tesla are perfect; you still need the law for less perfect CEOs and companies, right? These less perfect ones might do bad things.


Can't any car with a drive-by-wire technically just randomly drive into a child? Yeah, the fact that more unpredictable software is involved definitely increases the risk, but where exactly would the line be?


The line would be here, IMO; let me know why you think differently.

Step 1: you get some independent people from the government (Tesla or Google, etc. pays for this, so no need to cry about taxes)

Step 2: you drive the cars in the area where you want to approve them for testing/production, and you record all mistakes and all the times a human had to intervene

Step 3: drive all year round, in all weather conditions and all traffic conditions

Step 4: you analyze the data, see how many times you had to intervene, and compute the damages that would have occurred. Get some real statisticians who can put bounds around the damages.

Using the data from step 4, you ask the population to vote on what level of risk they are comfortable with.

What Tesla is doing now does not match this because:

1. the tests are not transparent; we don't know, or can't trust, Tesla's numbers for the mistakes the AI is making

2. updates invalidate previous tests and stats

3. the computer will not drive in all conditions and defers the harder parts of the road to the human, which also skews the stats; if you let the AI drive the easy parts and the human the hard parts, then of course the AI has a good record.

Even if Tesla is perfect, you need these laws for the imperfect companies that would move fast... so think of these as laws that will protect you from the evil Google car or some Chinese brand.


> If and when there is sufficient evidence that they can't, I expect things to change.

Why should I let Tesla put my life at risk if they're not sure it can be operated safely?

The burden of proof should be on them: collect evidence that these cars are safe, instead of waiting for data soaked in blood before pulling back.

Any other company wouldn't be able to pull off this kind of scam like Tesla is doing.


Oh no, the children!

2025: Elon announces BabyX, in vitro baby factory, claims he's cloning any baby killed by a Tesla vehicle, for free.


>Oh no, the children!

Yeah dude, not sure where you are from, but there are special traffic rules related to children, like how to drive around schools, school buses, and kids, because children do stupid things, and schools are marked with special signs so the driver knows to pay more attention in the area.


Adults do plenty of stupid things too. Kids just have on average larger speeds and accelerations.


Sure, but I was a kid and I remember what stupid things I did, like riding a bike down a big hill, off road. Now I can't imagine doing that, not even if you paid me a lot of money. So it's clear that kids and teens will do a lot of stupid things, because for some reason they are blind to some risk categories.


If you remove the last two words it reads much more like an Elon tweet.


Or selected paid boys.


If it's safer on balance than average drivers (for some reasonable definition of "safer"), I think the answer is "yes". Do you want to expose your children to unnecessary risk? I don't posit that it is safer or that it should be legal, but we should frame the conversation in terms of actual safety metrics and not cherry-picked incidents. Of course, our media isn't equipped to facilitate these kinds of nuanced conversations, which is a bigger and broader problem.

EDIT: Seems like my "data-driven decision making" advocacy really struck a nerve with the cherry-picked anecdotes and rhetoric crowd. :)


But HOW do we know it is safer? We should not assume it is safer by default, and I see many problems with the Tesla fanboys' numbers:

1. the counts of the times drivers intervene to prevent a crash are secret, or are not counted as crashes.

2. I don't want in my village/city some American car that is merely safer than the average American driver, a pool that includes drunk 16-year-olds and people who don't even have to pass exams and medical tests. A self-driving car should be compared with healthy drivers; I don't care that some state/country is incapable of stopping speeding or drunk driving.

3. updating these cars should invalidate previous approvals; as a software developer I know well how updates include bugs, not only features.

Anyway, I am against companies deciding for themselves who is a capable alpha tester and how they should run alpha tests.

If we want real metrics, then incidents like the ones in the video should be counted against Tesla. It is MALICIOUS to show stats of an average driver in an average car versus an expensive car with a human+computer combo, assign all the good to the computer, and ignore the many times the human had to save the computer's ass. Fuck shit statistics and the shit companies and fanboys using them. At best you can prove the computer might be a decent copilot versus no copilot.
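To make that complaint concrete, here is a toy calculation (every number invented for illustration) showing how a crash-rate statistic changes depending on whether driver saves are counted as averted crashes:

```python
# Toy model: how counting (or ignoring) driver saves changes a crash-rate stat.
# All numbers below are made up for illustration.

def crashes_per_million_miles(crashes: int, saves: int, miles: float,
                              count_saves: bool) -> float:
    """Crash rate, optionally treating each driver save as an averted crash."""
    events = crashes + (saves if count_saves else 0)
    return events / (miles / 1_000_000)

miles = 10_000_000  # miles driven with the system engaged (hypothetical)
crashes = 5         # actual crashes while engaged (hypothetical)
saves = 45          # times the human grabbed the wheel to avoid one (hypothetical)

print(crashes_per_million_miles(crashes, saves, miles, count_saves=False))  # → 0.5
print(crashes_per_million_miles(crashes, saves, miles, count_saves=True))   # → 5.0
```

Same fleet, same miles: a 10x difference in the headline number, purely from whether interventions are in the denominator's event count.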


> But HOW do we know it is safer?

We measure. This requires transparency on the part of Tesla, but we can frame it as: if you want legal FSD, you have to comply with transparency. We regulate like this in all other industries, so there's no reason to think we can't do it here.

> We should not assume it is safer by default

No one is advocating "assume it is safer by default".

> I see many problems with the Tesla fanboys' numbers

Right, but by framing it with metrics rather than cherry-picked incidents, we can have a national conversation about which metrics are most appropriate. If Tesla fanboys bring numbers that you take issue with, you can debate them. As an aside, it's hilarious that my earlier comment was downvoted for advocating a debate based on data rather than cherry-picked anecdotes, but I should know my audience better :)

> the counts of the times drivers intervene to prevent a crash are secret, or are not counted as crashes.

See my transparency remarks above.

> I don't want in my village/city some American car that is merely safer than the average American driver, a pool that includes drunk 16-year-olds and people who don't even have to pass exams and medical tests.

Ignoring the obvious barbs at Americans: if we quantify the safety of any given FSD system, we can compare that against local safety and prohibit FSD systems which score worse than the local average, such that FSD will only ever be a boon.

> updating these cars should invalidate previous approvals; as a software developer I know well how updates include bugs, not only features.

Seems reasonable to me.

> Anyway, I am against companies deciding for themselves who is a capable alpha tester and how they should run alpha tests.

In this case, I agree.

> Fuck shit statistics and shit companies and fanboys using them.

I love these kinds of constructive comments. Keep 'em coming.


> This requires transparency on the part of Tesla

Good fucking luck, that's going to be harder than pulling Elon's teeth out.

Tesla has regularly lied about how well the system works, and will continue to do so unless we strong-arm some independent verification onto them.

And good luck getting that to happen. Governments roll onto their backs for Musk like it's the second coming of Christ.

FSD should not be on the road without transparency. I don't want to get in car accidents because Musk wants to beta test crappy software.


> Governments roll onto their backs for Musk like it's the second coming of Christ.

Lol, nonsense. Musk moved Tesla HQ out of California precisely because he couldn't get the local or state government to play ball (a California politician publicly Tweeted "F*ck Elon Musk"--hardly "rolling onto their backs"). Moreover, if you can't depend on governments to defy Musk and require regulatory oversight of FSD, how are you going to depend on them to outlaw FSD altogether?


>> updating cars should invalidate previous approvals

> Seems reasonable to me

Year 2026: autonomous drivers have 80% fewer accidents per million miles than human drivers.

Bug fix to improve that to 90% and prevent an additional 500,000 casualties per year must wait until 2028 for NHTSA certification.
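The trade-off in that hypothetical is simple arithmetic (all figures invented, following the scenario's own numbers):

```python
# Hypothetical cost of a two-year re-certification delay for a safety fix.
# All figures are invented to match the scenario above.
baseline = 5_000_000   # casualties/year at human-driver rates over the fleet's miles
certified_cut = 0.80   # certified build: 80% fewer accidents than humans
patched_cut = 0.90     # patched build: 90% fewer accidents than humans
delay_years = 2        # wait for re-certification

extra_per_year = baseline * patched_cut - baseline * certified_cut
total_extra = extra_per_year * delay_years
print(f"{extra_per_year:,.0f} extra casualties/year; {total_extra:,.0f} over the wait")
# → 500,000 extra casualties/year; 1,000,000 over the wait
```

Which is the argument for certification regimes that can fast-track safety-improving updates rather than treating every change as a fresh approval.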


> Year 2026: autonomous drivers have 80% fewer accidents per million miles than human drivers.

What does "autonomous drivers" mean in your numbers? Does it mean an autopilot without any driver in the car to intervene, like in TFA? Because this is how shit stats work: they ignore the cases where the autopilot was saved from 5 crashes in 5 minutes by its driver. In that case you could only say the driver+auto-copilot combo is safer. You should also compare the same cars on the same roads; self-driving software gives up in harder conditions, and this inflates your stats.

TL;DR: Tesla should also publish the disengagement numbers so we can add those to the stats; some of those disengagements would have been crashes if the driver had not intervened.


Why would anybody ever use it then? If you have to monitor it and it does a bad job (in situations documented in that video), what is the value? I recognize that it does better in other situations, like open highways, but if you have to keep your hands on the wheel anyway, what is the benefit to the driver compared to older technology like assisted cruise control?


I wonder how this compares to more limited features like lane-keeping. I did a few 10-hour drives between Oregon and California during the pandemic, and the lane-keeping made them easier, similarly to cruise control. I still had to watch it all the time, but it somehow made the drive more relaxed.


More relaxed might make for a happier driver, but I’m not sure it makes for a safer driver.


It's only for as long as the software is in beta testing (if we can really call it beta).


It is amazing that they crowdsourced their QA department and got people to pay $12k for the privilege of doing the work.


The driver probably has it in "assertive" mode. Outside of assertive mode it is actually annoyingly conservative. So much so that I often take manual control at turns to avoid impatient drivers honking at me as it inches through the intersection.

With that said, I hate driving in Boston, so much so whenever I go there I park outside the city and take the T (subway) the rest of the way. I would not turn on FSD in Boston in a million years.

I live in a mid-sized city much smaller than Boston and it does "ok" but even there I take manual control when downtown.


I "love" that your options are "annoyingly conservative" or "view road laws as optional".


There is an even more conservative option I haven't tried yet...

But to be fair, New Englanders are terribly impatient. Our ideas of road etiquette include: no loitering at a light for more than 0.25s after it turns green, and "yellow means speed up."


As someone that used to live 45 minutes outside of town, I'd kind of be okay with allowing the car to drive the monotonous highway bits and then taking over manual control for the city bits of driving.


Autopilot already does that very well, has for a few years, and comes free with every Tesla. You don't need to pay for the $12k FSD package for that.


Yeah, agreed. Enhanced Autopilot does very well on the highway.

Though FSD does navigate on-ramps and off-ramps better than Autopilot did (I don't know how Autopilot works these days, since I'm on the FSD beta now).

With that said, though, I don't think you can buy just autopilot anymore.

My other car is a Volvo, and using its Adaptive Cruise Control with Lane Keep Assist is scary as heck: anything more than a slight curve in the highway and you have to correct to avoid leaving your lane. A Tesla, on the other hand, can change lanes and has no issues with even the most extreme curves in the highway. But off the highway, FSD is definitely more stressful than just driving manually while it's in beta.


I've no experience with a Tesla, but Boston is a particularly unforgiving driving environment. I find it less stressful driving in NYC.


I don't know what it is about Boston... maybe it's the pedestrians that don't check for traffic at all, or the taxis that split lanes, or maybe the five way intersections, or maybe the construction, or the tunnels and overpasses, or the fact the roads were laid out for horses and have absolutely no rhyme or reason most of the time... the list is long :P

Never driven in NYC, though I have walked through it as a pedestrian... it seems easier to deal with.


These days when I'm near a Tesla on the road I use the same caution as if driving near an obviously drunk driver. Give them lots of space and prepare for them to act unpredictably.


My father has a 2019 P100D Model S. I’ve driven it a few times. It feels like you’re driving a cross between a budget Android tablet and a somewhat skittish horse with a rocket engine up its ass. What could possibly go wrong? On the last two occasions it slammed on the brakes in the middle of an empty road for no reason.

My usual wheels are a crappy French dumb city car with a peanut for an engine and I feel safer in that with a large truck driving 6 feet off my ass.

Edit: also my crappy French thing actually has CarPlay rather than the shit show mess Tesla came up with.


I think you should ride a horse, maybe cars just aren't your thing.


That's interesting: if enough people do it, this would skew Teslas towards looking like very safe vehicles!


This pretty much matches my experience too. When it does stuff right, it’s great. But then it goes and tries to swerve into a fast-approaching car, changes lanes in an intersection, decides at the last minute that the turn lane you’re in is actually a straight-through lane, checks that the brakes work on a wide-open road, etc.

I’m sure some people are more comfortable letting the car do these things, but I feel it’s just a matter of time until one of them gets in an accident. Of course Tesla would have an out: the person should have been paying attention, it’s still beta, and so on.

I just don’t feel confident the current sensor suite can allow for level 4.


You’re actually supposed to intervene anytime it does the slightest thing wrong. It’s the interventions that signal to Tesla what to improve.

I’ve seen many testers letting it “play out”. That’s probably dangerous and no help towards improving things.


>> "Ogan said he shared the video because he’s seen a lot of “perfectly curated” videos of people using the Tesla self-driving feature on beautiful roads in California with no issues"

Emphasis mine. Yeah, this is where the value of the human being can't be overstated. There are plenty of times when road conditions "just don't make sense," and it takes a certain level of intelligence, intuition, and experience to even attempt to make the right call (and even then, sometimes you're making the situation worse and other nearby drivers have to adapt to you). For example: road construction; freshly paved roads before the lines go down; lines painted over other lines so there are conflicting lane markings; detours that send you down one-way streets marked "wrong way"; etc.


Abridged history of Elon's self-driving predictions:

2015: "We’re going to end up with complete autonomy, and I think we will have complete autonomy in approximately two years."

2016: "I really consider autonomous driving a solved problem, I think we are less than two years away from complete autonomy, safer than humans, but regulations should take at least another year."

2018: "I think probably by end of next year (2019) self-driving will encompass essentially all modes of driving and be at least 100% to 200% safer than a person."

2019: "We need to be at 99.9999..% We need to be extremely reliable. When do we think it is safe for FSD, probably towards the end of this year then its up to the regulators when they will decide to approve that."

2019: "I feel very confident predicting autonomous robotaxis for Tesla next year."

2020: "I am extremely confident of achieving full autonomy and releasing it to the Tesla customer base next year. But I think at least some jurisdictions are going to allow full self-driving next year."

2021: "Tesla Full Self-Driving will work at a safety level well above that of the average driver this year, of that I am confident. Can’t speak for regulators though."


The increasing shift to blaming regulators is really notable when you put this on a timeline. It would be nice if people reacted with historical knowledge of what Musk says, rather than just presuming he is being true and honest when he blames regulators for Tesla's and SpaceX's problems.


Exactly. Regulators are getting more involved in this space because they should be, anyway, and because Tesla has repeatedly demonstrated that if they don't, Tesla will take more and more risks.

Regulators have nothing to do with FSD's failure to recognize: roundabouts, stationary vehicles, trains at level crossings, or its "selective observation" of road law.


FSD will never work for ordinary cars in cities. But it may be good enough for self-driving tanks fighting in WW4, AD 2050.

That is really the only scenario where the high collateral damage won't matter.

  "We didn't know we were training kill-bots"


> "We didn't know we were training kill-bots"

We knew all along. https://en.wikipedia.org/wiki/DARPA_Grand_Challenge_(2004)


Personal opinion... FSD won't work within the existing paradigm, largely because driving is a social experience, not purely a technical one. It could, and should, be made to work in CONCERT with infrastructure changes to enable and support it. All new roads should be built with systems in place to help FSD cars succeed. Infrastructure spending should be used to fix, redesign, and systematize areas where FSD cars are likely to struggle - largely (to my mind) because doing so will also improve things like pedestrian safety.


I think Musk would fight that, as he seems lately to be against subsidies and assistance that would help firms other than Tesla.


What Musk wants or will do is largely irrelevant to the eventual success of self-driving vehicles. Pardon the use of his pet acronym FSD, but what will make the technology (not Tesla) successful is a built environment that a car can interact with more reliably, as well as a car that can interrogate the built environment more effectively.

Both are important. A long-range RFID tag in every road sign means I don't need a camera to read it, and it can't fail in snow. Magnetic nails[1] in the highway that aid high-speed lane positioning absolutely beat the camera on my Volvo that struggles to see white lines when driving into the sun.

[1] https://trid.trb.org/view/574206


The cruise control does not even work in these cars.


It was kind of sad for me, to be honest.

Between the ages of 18-26 or so I was a really big Elon Musk fan. It's hard to deny the exciting technological progress made by Tesla and SpaceX on many fronts and I hugely bought into the idea that this stemmed from him. On some level I still believe it does - he's been running both successfully for a long time so he clearly isn't utterly full of shit.

But watching the autopilot predictions become progressively sillier, over and over - watching the Thai cave diving debacle - watching his pathetic Twitter exchanges where he needlessly belittles people in childish ways when he could have just maintained his silence and ignored them - his idiotic pronouncements on the pandemic - I found along the way I lost nearly all of my respect for him. I stopped taking him seriously and came to believe most of the things he says publicly aren't worth hearing or reading about.

I still view him as a brilliant business leader, but....interestingly, he seems to also be an imbecile. And I guess that isn't really surprising, because people are rarely reducible to one thing, but it wasn't until I reached my late 20s that I started to see it more clearly.


Henry Ford was a jackass. Edison: jackass. The guys in Led Zeppelin: child-molesting jackasses (well, Page is, at least). Musk is but the latest in a long line of people who can be admired for what they have brought to the world, and... then you're better off stopping right there while enjoying your electric lightbulb, rock music, and reusable rocket ships.


He isn't an imbecile. Every one of these statements earns him money. He might have started with certain technological aspirations, but then he made a lot of money and turned into a standard corporate psychopath. The only difference from the rest of the CEOs is the kind of cult following his early days built for him. He's gonna milk them for as long as he can.


Yeah, Musk is a giant jackass, but unlike most other jackasses he has big dreams and a proven ability to consistently advance towards them. I can respect that about him, if nothing else.


The only metric that matters in the self driving race is the number of markets where regulators have granted approval for full autonomy. Supporting metrics would be markets where there’s a pilot program and the number of fully autonomous miles driven.

Tesla is losing and using misdirection to convince the public they aren’t.


Got to give the guy his due though: the average capitalist bastard couldn’t literally sell a promise like that and hold it off for half a decade yet still maintain a rabid fanbase stating it’ll be any minute now and shooting down any dissident opinion.

/s for the sake of the challenged.

I figure most of the cars that have FSD paid for will be scrapped before it happens.


I don't know. The last few years of US politics has really opened my eyes to just how blatantly someone can lie and get away with it pretty much indefinitely.

What it comes down to is: do people want to believe your lie? If they do, almost nothing will dissuade them from believing it.


I disagree. This stuff can go on indefinitely until the big money pulls out, and the big money doesn't pull out until something happens to make the investment embarrassing to mention to your peers. It took an exposé from the WSJ (which probably wouldn't have worked from any other paper) to do it for Theranos; it took a memorable, repeatable joke to do the same type of thing for Bill Cosby.

I honestly don't think that Tesla is hiding fraud at that level. If anything is going to kill Tesla, it'll be cheaper, higher-quality competition, probably followed with lies about sales figures. Things that are years off (as far as I know.) Lies about what your technology is on the verge of doing are standard for the tech industry.


I don't see a huge implosion happening. What will tank Tesla's stock is when competitors catch up and pass them. Two years ago, Tesla was the only viable option. Today, buyers have a lot of options and with Tesla's continued problems with quality and service, that will mean other makers will continue to capture more market share.

On the autonomous driving side, Waymo and other teams are years ahead of Tesla. Elon's been able to convince the market otherwise and so far they've bought it. In the next 1-2 years as self-driving cars get deployed onto the roads this will be undeniable.

I don't see a big, Theranos style implosion. The market will slowly grow concerned, there will be a "correction" which Tesla stans will ignore and then will be a slow decline with dips each time competitors release strong numbers. By 2023-2024, Tesla's market cap will have fallen 70-80% down to that of other car makers.


Agree with this.

I suspect someone will release an actually affordable (comparable to ICE in up-front and ownership costs), safe, basic EV with decent range and standard car features, with better quality and less hype around it, and Tesla will lose the ass end of the Model 3 market. It’ll all collapse from there.


I think it is 100% fraudulent to charge for the FSD option that isn’t enabled at delivery of the vehicle.

Edit: to qualify this my father paid over £5000 for it three years ago.


I made sure to qualify it with "at that level." Minor fraud is minor.


2022: "I would be 'shocked' if Tesla does not achieve Full Self-Driving safer than human this year." [1]

[1] https://electrek.co/2022/01/31/elon-musk-tesla-full-self-dri...


I'm surprised other self driving car companies haven't tried to file a lawsuit to get Tesla FSD more regulated. If a Tesla FSD ends up in a(nother) notable accident, the ensuing media and regulation backlash will set back the whole industry by years.

As it stands today, I wouldn't let someone driving me activate FSD, much less use it myself.


Stupidity is doing the same thing over and over again. Obviously the system is not ready for such narrow streets with no line markings and bad visibility. Just don't engage it in such situations.

I sure hope that the future iterations of FSD beta will assess the environment and ask the driver to take over sooner.


Even if you discount the examples with no lane markings or low visibility (both quite common in the Boston area, so a system not designed to handle them is ultimately flawed), almost running into three separate trucks cannot even remotely be chalked up to driver stupidity. Even in the example where it turns toward the semi, that truck is clearly visible. In the last example, with the FedEx truck, the display pretty clearly shows it knows the truck/cars are there, and it still flubbed.


I look forward to the new name for the system. 'Totally Self Driving now can drive over 14% of the time, a 4% improvement from Full Self Driving. Look out soon for the next release, Completely Self Driving.'


The USB-IF normalized that sort of naming silliness; I fully expect it to spread.


Full Self-Driving 3.1 Release 2 SuperSpeed Beta 16 With Extra Cheese


I Can't Believe It's Not Self Driving (™)


Don't other manufacturers limit their level 2 systems to mapped and known-good highways? And, for example, my Ford will turn off BlueCruise and lane keeping if it detects it cannot handle the environment.


My Subaru will not enable lane-keep below highway speeds. Adaptive cruise control can be done at any speed (greater than zero... it turns itself off if traffic is stopped), though.


>Stupidity is doing the same thing over and over again.

Does that mean insanity is doing stupid things but expecting different results?


I should be able to roll stop signs without issue on my bicycle if Teslas are able to do it.



Pretty sure most of you do....


If a bicyclist does something wrong, all cyclists are bad. If a driver does something stupid, that driver is stupid.

People complain daily about the bad drivers they face during their commute. But that's quickly forgotten. That one time a cyclist did something wrong, however, is etched into memory.

Drivers (not cars) kill an incredible number of people every year, while cyclists kill virtually none. Cars pollute, make noise, and destroy cities.

And still, people go out of their way to harass cyclists every chance they get. Weird how that works.


Sure, there are bad drivers, but in SF bad cyclists are a matter of routine. Blowing stop signs, running red lights at full speed, weaving through cross traffic, and riding on pedestrian sidewalks at speed are not regular occurrences for car drivers.

Sure, I want a full biking city and what not, but the cyclists here are such a bad advertisement for everyone else that it breeds this strong negativity. Being jerks to the pedestrians and the drivers earns you few allies.


A few things that drivers do:

    - Failure to use signals to merge lanes
    - Speeding in excess of 80 mph
    - Tailgating (because otherwise someone would cut in front of you and you'd never get where you're going)
    - Left lane camping
    - Aggressively punishment passing left lane campers
    - Not stopping for right on reds
    - Having 1000 yard stare to the left when turning right on red and being oblivious to pedestrians
    - Failure to get up to speed on onramps
    - Stopping in the middle of the freeway when missing an exit
    - Multiple lane changes without looking to make an exit
    - Backing up on the breakdown lane to make a missed exit
    - Backing a semi down an exit lane to rejoin the freeway because it was the wrong exit
    - Merging into the merge lane of an onramp to pass traffic
    - Using the shoulder to pass traffic
    - Distracted driving and blowing through red lights (with the resultant T-bone side impact collisions)
    - Drunk driving
    - Wrong way driving (drunk, elderly or suicidal)
    - Lane cutting
    - Failure to zipper merge properly
    - Playing "follow the leader" through non-permissive left turns without looking for oncoming traffic
    - Turning left immediately when the non-permissive light turns yellow without yielding for oncoming traffic
    - Multiple cars turning left long after their permissive green arrow has changed to red
    - Detroit/Massachusetts left turns
    - The "right on red / illegal u-turn / right turn" trick (not an actual legal Michigan left / Mexican retorno)
    - Entering the oncoming traffic lane when their lane is blocked without checking for oncoming traffic
    - Using honking as a means to "punish" other drivers
    - Blocking other drivers and aggressive lane changes and road rage
BUT OH MY FUCKING GOD THAT BICYCLIST THAT BLEW THAT STOP SIGN IS THE REAL SIGN OF SOCIAL DECAY


You're kinda proving my point, though, by your sweeping generalizations...

Why do the cars get a pass for all bad stuff they cause?


If SF is anything like NYC, GP is right to make sweeping generalizations.

Cyclists do not follow the rules of the road here. They simply don't. In particular, they go through red lights as if they don't exist. As a pedestrian crossing a crosswalk -- with the cross signal -- you still need to look left and right for bicycles. I don't see cars running red lights too often, but bicycles are a constant threat. And they are silent and sometimes hard to see.

To be fair, this report[1] from 2018 shows the disparity in injuries caused by bicycles (134) versus cars (2,272). With that in mind, it doesn't make sense for me to be complaining about the bicycles.

At the same time, there's a reason these statistics don't match my experience and the experience of many others. I dodge bicycles every day and don't get injured, probably because they're relatively easy to dodge most of the time. I've been hit by a bicycle twice but never reported it because I wasn't injured enough to seek medical care. I've never had to dodge a car.

If you stand on any corner in NYC, you will see cyclists running red lights several times within an hour. You won't see cars doing it as often.

So, I think this issue would be solved if the majority of cyclists followed the rules of the road like the majority of drivers do. Until then, people like me are going to keep complaining, despite the apparent damages not being as severe as what cars cause.

[1] https://www1.nyc.gov/html/dot/downloads/pdf/bicycle-crash-da...


> So, I think this issue would be solved if the majority of cyclists followed the rules of the road like the majority of drivers do. Until then, people like me are going to keep complaining

Again, these are lies. The majority of cyclists do follow the rules, no worse than drivers do. You're just blind in one eye. If you're not a cyclist, you're probably ignoring, for instance, the dozens of cars illegally parked on bike lanes we see every trip.

And the majority of cases where cyclists break the law would be eliminated if cyclists got proper infrastructure. If you want to make your city safer, vote for that, instead of letting your hatred of cyclists blind you.


Again, not self-aware. They clearly do not follow the rules here. They are not saints just for riding.


And how many people did they kill per mile? Cars get an absolute free pass for all the carnage they cause.


Bikes have killed fewer people, of course, but actually not zero.

I'm not saying a free pass, I'm saying they act outside the flow of traffic so strongly and regularly compared to cars that people get angry with them.

I don't know why bikers are so un-self-aware that they blow red lights and stop signs as a matter of principle, bully pedestrians (in my experience far more than cars do), and then act like the victims.


> Weird how that works.

Because it doesn't. None of that is true, except in maybe some isolated incidents. Those incidents just happen to be etched in your memory, because you took them personally. It's not enough of a norm for your sweeping generalization to hold water.


Pft, go into any discussion on twitter/reddit/fb/newssite about some traffic incident. In all cases where a car is involved, it's likely "that driver is stupid" (or even excuses made for the driver, since most commenters see themselves in their shoes), but if a cyclist is involved the comments are full of vitriol against all cyclists.


Anecdotally, I've encountered more complaints from cyclists that they're all being painted with the same brush, than I have incidents in which all cyclists are painted with the same brush. Could just be where I hang out, though.


In-group/out-group bias is a well attested thing. I saw similar opinions from car drivers in Cambridge (UK) when I lived there, even though actual traffic cameras showed that local car drivers were exactly as bad as local cyclists.


We do. But let’s be real: I would say fewer than 5% of drivers fully stop where I am now, and from living in Boston for a while, 0% there.

Cue the Boston driver jokes on the Tesla, but that’s just really bad driving there. I do not expect them to get usable self-driving for another 10-15 years at least.


Why would I stop if I can clearly see there is no one coming from the other directions? I drive a stick shift too so it helps to be able to keep some non-zero momentum. This is completely disconnected from the stereotypical bad driver barreling through the intersection completely unaware there is a stop sign at all.

It's a different story on 4-way stop intersections. You look at who is where, which order they stopped in, and go when it's your turn.

The real bottom line ultimately is, are the average accident rates up or not?


Why is the Boston driver thing such a meme? The NHTSA Region 1, which is New England, has the fewest traffic fatalities per capita and per vehicle-mile. Florida, Texas, and California are twice as bad.


It’s the only place where a driver has waved a vodka (or other clear spirits, didn’t get a taste) bottle at me on a highway. YMMV.


Plain old drunk driving seems civilized compared to the rest of the country. Let me know if you see some clown in a monster truck brandishing a Glock in Boston.


Seriously, Tesla's autopilot rolls through stop signs because everyone does. How many people are going to get rear ended because every Tesla stops at a stop sign now? It feels like enforcing this behavior is more dangerous than letting it slide.

It's the same with following the speed limit on roads where everyone speeds. Nobody is going to like self driving cars if they're going 25 on a road where everyone goes 40, where the speed limit should clearly be adjusted. The accidents that occur from drivers not being able to stop for a slow car in time, drivers getting annoyed and attempting to overtake, or drivers causing road rage incidents will dwarf whatever crash rate you'd get by just letting the car go faster.


So Tesla is supposed to mimic bad driving of bad drivers? Whatever happened to the promised Age of Road Safety, brought about by smart AI whose ability to make decisions is so much better than the useless chunks of meat menacing our roads today?

>How many people are going to get rear ended because every Tesla stops at a stop sign now?

So bad drivers roll through stop signs, and this behaviour is somehow at equilibrium with terrible drivers who follow too closely and are thus perpetually on the verge of rear-ending someone? Best not upset this equilibrium, it clearly works so well!

>It's the same with following the speed limit on roads where everyone speeds.

Of course it is. Can't wait for speeding self-driving cars taking out cyclists or pedestrians. Whaddaya want from me, I put it in 'hurry mode' because I've got places to be!

>Nobody is going to like self driving cars if they're going 25 on a road where everyone goes 40, where the speed limit should clearly be adjusted. The accidents that occur from drivers not being able to stop for a slow car in time, drivers getting annoyed and attempting to overtake, or drivers causing road rage incidents will dwarf whatever crash rate you'd get by just letting the car go faster.

Speeders that endanger lives of everyone not currently inside a car, now embedded in an efficient algorithm. Whatever could go wrong? My North American barely liveable city is gonna be so much better for it!


Tesla is supposed to mimic human driving so its cars are predictable to other human drivers. What's the point of driving safely if it gets you in an accident?

> Speeders that endanger lives of everyone not currently inside a car, now embedded in an efficient algorithm

Maybe pedestrians should be voting to improve public transportation instead of trying to walk in roads clearly not meant for pedestrians. Has nobody heard of trains or buses? How is it a solution to make every car go 25 mph just in case some random person decides to jaywalk?


> Tesla is supposed to mimic human driving so its cars are predictable to other human drivers. What's the point of driving safely if it gets you in an accident?

Weird, Musk is always talking about how FSD is far better than human driving, and far safer.

He's never once said "it's trying to mimic humans", or that it's "much safer unless that might cause an accident, and only then will it drive unsafely (like rolling through stop signs)".

This sounds like post-facto rationalization to me.


>Tesla is supposed to mimic human driving so its cars are predictable to other human drivers.

Great! I don't roll or blow through stops, follow too closely or speed by more than 5 kmh within city limits. I drive carefully and defensively, avoiding dangerous manoeuvres, especially when a driver close to me seems erratic (for example, blows through a stop sign, speeds or follows too closely).

Tesla should mimic my driving.

Pedestrians should be voting to improve public transportation in addition to improving walkability of their neighbourhoods, not instead of that.

>How is it a solution to make every car go 25 mph just in case some random person decides to jaywalk?

In my Canadian province, crossing mid-block is perfectly legal as long as you yield to traffic, and 'jaywalking' is not a legal concept. Slowing cars down is safer not just because it reduces the likelihood and severity of hitting pedestrians when they cross incorrectly. It is also safer because it reduces the likelihood and severity of hitting pedestrians when they cross correctly, or walk on a sidewalk. It is also safer because it reduces the likelihood and severity of hitting cyclists and other cars. Highways should be designed for speed, whereas streets should be designed for safety at the expense of speed.

Besides, when a city gets above a certain size and is not missing traffic lights where they should be, speeding within city limits is often quite a futile endeavour. I can't tell you how many times I see drivers who go 70 in a 50 kmh zone zoom past me, just to be waiting at the next light, or the one after that as I pull up after going 55 kmh the whole way. I won't lie, it can be amusing.

If self-driving cars just emulate the stupid, dangerous and illogical habits of human drivers, then to hell with self-driving cars, and you can bet I'll be voting for that.


Who is just driving into cars at stop signs because they stop for an extra two seconds?


Ask my dad or the variety of people in your life that have had it happen to them. I guarantee you know people this has happened to. Have you never heard of getting rear ended?


> Ask my dad or the variety of people in your life that have had it happen to them

Every rear-end accident I've ever been aware of at a stop sign would have happened whether the person in front stopped for 2 seconds, 5 seconds, or 7 seconds. They were all due to the person in the back not paying attention and not driving safely. Whether the person in the front did a slow 5 second rolling stop or a 5-7 second full stop has never been relevant.


> I guarantee you know people this has happened to.

I'm unaware of anyone I know around here ever getting rear-ended because someone stopped at a stop sign. Granted, the culture here is to actually stop at each stop sign, so, you know, it's expected that people will actually stop and not just assumed that they're going to roll through.


Of course. Although most rear-ending I've seen is on highways when drivers (that shouldn't be on the road) fail to leave adequate distance in front of them.


Related situation, but not quite the same: I know someone who wrote off their car on the A14 near Cambridge (UK) because they weren’t paying attention in stop-start traffic and rear-ended the one in front.

Ironically, she hates Elon Musk and everything he does, so memes about how self driving cars are a terrible idea.


You know what is really ironic? If this person was in a Tesla or similar car with obstacle detection the car would have never let them rear end the one in front in stop-and-go traffic.


That’s the point ;)


Too subtle for this sleep deprived brain. Cheers to you sir :-)


Robots who do not follow traffic norms will quickly become infuriating.


Not an answer to the question.


Anyone who advocates that not stopping a car at stop signs is a "norm" should be launched directly into the sun. This is why traffic deaths in the U.S. are 40k/year and increasing at 15% per year.


The increase has more to do with cellphones, but people don’t want to talk about that because there’s a very simple, easy fix - it would just cost advertisers and anger people who couldn’t use their phone in the car anymore.


If you think a rolling stop warrants being launched into the sun, well, I will have to question your good judgement, or perhaps whether or not you have actually driven a car before.


> How many people are going to get rear ended because every Tesla stops at a stop sign now?

Probably not many, and they'll mostly be low-speed. And the person who follows too closely and doesn't brake at stop signs is 100% at fault in that situation. Why blame Tesla for following the law? If you think the law is stupid, get the vote out and change it.

I once got pulled over for driving 5 over the limit. There was a squadron of tens of motorcycle cops who pulled everybody over. I challenged the ticket in court, complaining that one cannot drive the speed limit on that road without other drivers getting aggro, and that I was driving with the flow of traffic. The judge did not care. He said it's my responsibility to follow the law. So, good luck.


I have never in my life seen a bicyclist slow down at a stop sign or sometimes even a red light.


Someone on a bike isn't very likely to kill someone they didn't happen to see when they roll through a stop sign or light. Also, bikes tend to be slower, and the biker has considerably more visibility than someone in a car. I roll through stop signs on a bike, but when I do so, I know for sure that there's no one around, because I slow down some, look around, and I'm able to properly see everything.

I think most bikers stop (or at the very least slow down) for red lights. I don't usually wait for red lights, but unless I'm able to see clearly that it's safe, I'd never just run one, because I'm not suicidal. Again, having much more visibility, and a lot more time to scope out the situation makes this considerably safer for the person on the bike.


I keep comparing it to rails. Streetcars solved many of the self-driving problems. It's not to hate on self-driving cars, but it's cool to look back at how they accomplished all that with the tools of the time. Imagine how batteries or a flywheel could bridge large gaps in overhead cables. If you run them on tax money and build enough lines, I imagine the convenience could be superior to cars.


https://www.smbc-comics.com/comic/fsd is the current definitive take on the subject.

Watching this video I really had no idea how bad it was. My opinion is that we will never have full self driving without infrastructure updates; basically putting the cars on rails and forbidding non-self-driving cars and pedestrians. Even then it's dicey.

At least part of the reason is that a self-driving car can never have skin in the game. So if you want to cut off a self-driving car, go for it! If there's a busy exit and you want to cut in line, find the FSD and go in front -- with a real driver there's a chance, however small, that they would play chicken, but an FSD never will. If you want to pass a FSD and there's not enough room, just do it anyway.

Even worse will be when FSDs learn these behaviors by observing them and use them on other FSDs, and then you have a race to the bottom.


> So if you want to cut off a self-driving car, go for it!

I'm not sure. Maybe it's my anxiety speaking but that sounds pretty dangerous, at least over time. An autonomous car may not play chicken but it can still make mistakes, or be unable to brake in time. There's also no way to know that a car is being autonomously driven since, currently, someone has to have their hands on the wheel.


This. So much this.

It’s 2022. We still have meat flavoured train drivers and they have a massively constrained set of complexity to navigate.


It helps when those meat bags aren't on their phones just like in cars.


I think it also helps when you’re not driving a distraction filled Android tablet on wheels.


How would you know it's not just a Tesla, but a Tesla under FSD? Those are 2 totally different things. I'd like to see big, bright lights that are enabled when FSD is engaged that alerts other drivers to the fact. Similar to Student Driver type of signs. Essentially, since FSD is still beta, isn't it technically still learning meaning "student" as well?


Disclaimer about this guy Ogan: if you track his online accounts, they're very anti-Tesla autopilot/self-driving. Rightfully so in some situations.

Also, he is an investor in a Chinese competitor to Tesla (EV and self-driving abilities). Take his words with a grain of salt.


Well, it seems like he has a reason to be highly skeptical of Tesla auto-pilot, based on how awful and dangerous the technology appears to be. Not to mention Musk's repeated broken promises about the timeline and functionality.


He claimed on Twitter spaces a couple weeks ago that he has a large investment in Tesla as well - based on their gross margins and growth, not based on FSD


Luckily we don't have to take his word for it, we can just watch his video with our own eyes.


anti Musk and anti Tesla astroturfing is a tale as old as time. A lot of it has always been rooted in people with financial interests in Tesla doing poorly

I always flag these topics. The comments are nearly verbatim identical to the last 20 times FSD gets posted and nothing about the conversation is interesting


aka the problems are the same as they were. That’s a good metric for the lack of progress on the issues.


aka there's been a lot of progress and people still like to say there hasn't been and that elon should be sued for fraud because it isn't a perfect self driving machine yet


Like Elon has himself said in corporate communications, and documents lodged with the SEC, for years on end?


I'm sure you also flag pro Tesla stories from people with large financial positions, too, for neutrality, right?

Right?


It is well past time for government to step in and get these dangerous cars off the road.


This whole situation fascinates me. At what point do people start suing Tesla for fraud? There is no evidence that Musk's claims about FSD will ever be true.


I'm confused:

> "evidence that [...] will be true"

Apart from provably recurring events already observed (e.g. Sun/Moon position in the sky), how can one present evidence for the future?

And what kind of society starts suing people for fraud for failing to provide evidence for the future?


There are many claims by Musk that had a specific amount of time on them. We're supposed to be able to use our cars as an automated taxi fleet already.


Anti musk hyperbole in a Tesla thread, how predictable


Self-driving is absolutely coming. With that 'self' being some $2/hour Uber driver virtually driving from their couch.


What surprises me is the simple fact that these self "driving" cars are allowed on the road without any federal oversight at all. Meanwhile you could not legally register a Mercedes Benz car that only has ONE wiper blade because it did not meet federal safety requirements.

Yet somehow replacing the human in the drivers seat, the most safety critical part of the whole damn car, with a gpu, camera, and junk code is no problem. Even after multiple stories of accidents and deaths caused by them.

And another point: touching LCD screens while operating a car is perfectly legal, but touching a phone screen isn't. At what point did the system stop caring, or was it paid to stop caring? Have the tech companies subverted the government and our safety for profit? WTF is going on here?


Not a huge regulations-by-default person but I agree strongly with this. The other issue is accountability laundering. If a human kills someone they're criminally responsible. If a company's software does it then they'll shift blame onto the human passenger or deflect into a small fine, at the worst. If I can't trust the incentives then I can't trust the tech.


> Even after multiple stories of accidents and deaths caused by them.

How many accidents and deaths are caused by human drivers? I haven't checked the stats but it has to be non-zero.


Watching the film, I knew where I had experienced these exact same problems:

When I taught my wife and kids to drive!

Incredible how many of the exact same mistakes it makes as they did.


Sounds like every other driver in Boston.


I think full self driving is the same distance away as cold fusion.


And when we get it, it's more like Cold Fusion. https://en.wikipedia.org/wiki/ColdFusion_Markup_Language


Please don’t insult CFML by comparing it to something that puts the user at risk and crashes every five minutes ;)


But is it closer than Linux on the desktop?


Well, Linux on the desktop is always this year. So it's closer than fusion, which is always ten years from now, and maybe closer than fully autonomous Teslas, which is always a year or two from now. At least the Steam Deck won't make rolling stops.


The irony of this is there is probably more Unix on your wrist than Linux on the desktop.


As someone who's heavily bought into the Apple ecosystem, I'm surrounded by Unix all the time, technically. On my wrist, in my pocket, on my lap. And I'm surrounded by Linux, too, at least at work. But Linux on the desktop? How delightfully absurd!

Nah, I really want my next PC build to run Linux. But that'll have to wait a while ... just paid $4k for a vacation. My girlfriend wasn't sympathetic that $4k would pay for an awesome rig instead.


I tried Linux on a fairly hefty Ryzen rig recently and it was the usual shit show. HiDPI doesn’t even work properly with wayland yet. My next $4k rig is going to be an iMac.

Typing this on my iPad in a hotel. That’s the life.


I'm sympathetic, and I don't think I'd ever run Linux as my primary computer. But as a secondary machine that gives me a chance to tinker where failure doesn't really matter, it seems viable.


I just rent mine from Linode. Less capital risk then.


Linux on the desktop is already here in the form of WSLg, and will be even more popular once Android support rolls out (though that's probably more useful on tablets anyway).


Already here. I am writing this on KDE Plasma.

The amount of other people reading this on a linux desktop is 100% irrelevant to me.


Cold fusion is a hoax. Did you mean regular fusion?

Both Cruise and Waymo are legally operating cars without drivers that are available to the public today.


Whoosh (first part). I think that was their point.

In one sense the cars are not available to the public.

They are available as services of course, so you’re right in that sense.

But we do not know what is going on in the background (such as a team of “safety drivers” monitoring every inch of the drive with the ability to do who knows what at any time).

As far as I can tell, Cruise has an appearance of autonomy to reap the benefits of the PR around that. With the safety drivers (by another name perhaps, to hide their role) not physically present in the car, but connected over a network and operating behind the curtain. All of hacker news has fallen for this hook, line, and sinker.

Waymo is another story. With a restricted sandbox, it’s essentially a shuttle, not a car.


The saying is that fusion (not cold fusion) is always 10 years away. It’s not a hoax but it always seems slightly out of grasp. We are far closer to self driving cars than we are towards regular fusion as we’ve had working prototypes for decades. There hasn’t yet been a break even fusion reactor prototype.

Cruise launched publicly in San Francisco yesterday.

I believe both they and Waymo are using remote safety drivers but at a ratio of one for every 20 cars on the road and they expect that ratio to continually decrease over time.


Both have been just around the corner since the 1970s. But just because they are obvious technologies doesn't make them more feasible.


> Both have been just around the corner since the 1970's.

Fusion, yes. Cold fusion, no.


Which law of nature does self-driving AI contradict?


We know enough about the physical laws of nature that we can write them down and figure out where the limits of tech/automation are.

We know nothing about how the human brain works. True artificial intelligence/AGI (which is needed for level 5 self driving) is a pipe dream.


I'm sorry, but it's you who know nothing. Insulting entire fields is neither nice nor constructive.

Sooner or later AGI will be available; it's just a matter of time. It does not have to work like a human brain.


If you ask around you will see that my thoughts are shared by most people actually working in the field. It's the enthusiasts/fans who think we are on the brink of the world they see in sci-fi movies.


Here's a list of nothing where nobody figured out anything:

https://en.wikipedia.org/wiki/Outline_of_neuroscience

I didn't ask around, but I did watch quite a few science lectures/talks by reputable scientists about how the brain and our mind works.

I also did not make any claims other than suggesting that self driving AI will eventually be possible since there is no natural law that prevents it.

Being skeptical is okay, but you might need to back up your claims.


Probably the one that attempting to process an unconstrained system in a constrained model, regardless of the method, always results in unhandled edge cases. Apply tons of metal and fragile entities made of meat to the problem and someone’s going to get hurt.


This is true for human brains too. Not surprisingly you are more likely to cause an accident if you are driving in a new environment, like in a heavy snowfall on an icy road, if California is your only driving experience so far.

In finance they say past performance does not indicate future success, same should apply to AI in both directions. Just because its applications were too narrow in the past predicts nothing about the future.


The law of human nature


AI is like graphene: it can do anything in the lab.

EDIT: thx


I think you meant graphene.


Graphene?


OP looks like Ogan himself. Mostly anti-Tesla posts.


tesla bad clickbait.

of course FSD does crazy shit right now; it's beta and it's learning! That's why you agree to not become complacent and have full control over steering at all times.

in my experience, it works well 95% of the time, but the 5% unpredictability factor is way too high, which is why I don't use it with passengers in the car unless asked to do so. more specifically, I might manually maneuver the car less than 2 miles out of a 20 mile trip with FSD beta enabled, which is an improvement over the 5 miles I would manually maneuver with FSD stable. but it's getting better, and it's definitely better than anything else you can use on public roads, so I'm sticking with it.


> which is why I don't use it with passengers in the car unless asked to do so.

If a fellow passenger's safety concern is enough to stop you from enabling it, would you consider a third party's request:

As a pedestrian, cyclist and driver I ask that you never engage your autopilot on a public street which I may be crossing, riding or driving on as I have concerns about its safety. I am also sure most other people doing the same would agree. Thank you.


Everyone wants to hate on Elon but he’s making incredible progress in machine learning. Tesla and OpenAI have done more for the field than almost anyone in the last decade, except maybe Google.


What progress exactly? Did Tesla actually invent anything related to ML, or are they just training and annotating, and maybe creating if-else rules?

Am I wrong, or is the number of times people take over to avoid a crash a secret?


How are you claiming this? There are tons of research publications from the likes of Google, Facebook, etc. Does Tesla ever publish AI/ML research?


No, instead they have AI Days where they get up on stage to stupefy and thrill investors with their technicality and claims that everything is almost ready. Last year they made it sound like Dojo was a done deal; last week on the earnings call they made it sound not ready, and also not necessarily useful.


Says a guy who is short on Tesla, but despite that conflict of interest, the video looks real. I mean, if he had managed to modify his car's firmware to make it less safe just to fake that video, that would also be a huge Tesla safety issue.


Are you long Tesla?

Because we can all play this game. It's a very silly one, and I'm honestly concerned that the SEC, FCC, and other regulators have the same issue: the people who are supposed to be regulating companies hold stocks, and those stocks affect their judgment, in much the same way that Pelosi and other prominent politician-traders would never vote against their stock portfolios.


I just wanted to point out what I thought might be new information for some readers. I have no position on Tesla, neither short nor long. I just consider them for entertainment value :)


TFA makes no mention of anyone being short TSLA. Besides, the person quoted isn't exactly short Tesla, since they paid for the vehicle.

But, okay, let’s roll with that: after that video, you expect “the guy” to be long Tesla? There’s the avoidance of “conflict of interest”, and then there’s “I should not be left alone with large sums of money”.


The comment section below the article does mention this though ...

The guy's twitter feed is basically one huge rant against Tesla and Elon Musk. He seems to have no other topic than that. Makes you wonder why he bought one anyway.


Thanks for the necessary context that was inexplicably left out.


If you're anti-Tesla due to your experience with and observation of their product (legitimate reasons like that), that's not a conflict of interest. You're not starting out with a bias. You're allowed to form an opinion for the right reasons and bring it forward without it being suspect.

If you're anti-Tesla because you didn't think Elon Musk was funny on SNL, or you don't like his haircut, or some other unrelated reason, that's what would be unfair. But you can't write off anyone short on Tesla as inherently biased. Otherwise you'd be writing off all criticism period.


The author of TFA is a known investor in a Tesla competitor


You would never rent a Tesla in that case.



