Tesla’s Latest FSD Beta Doesn’t Seem Ready For Public Use, Which Raises Big Questions

What I like about this test is that it presents a good mix of everyday, normal driving situations in an environment with varied traffic density, road complexity, lighting conditions, road markings, and more. In short: reality, the same sort of entropy-heavy reality all of us live in, and where we expect our machines to work.

There’s a lot that FSD does that’s impressive when you consider that this is an inert mass of steel and rubber and silicon that’s effectively driving on its own through a crowded city. We’ve come a long way since Stanley the Touareg won the DARPA Grand Challenge back in 2005, and there’s so much to be impressed by.

At the same time, this FSD beta proves to be a pretty shitty driver, at least in this extensive test session.

Anyone arguing that FSD in its latest state drives better than a human is either delusional, high on the fumes of their own raw ardor for Elon Musk, or in need of better-driving humans to hang out with.

FSD drives in a confusing, indecisive way, making all kinds of peculiar snap decisions that are hard for the other drivers around it to read and predict. Which is a real problem.

Drivers expect a certain baseline of behaviors and reactions from the cars around them. That means there’s not much that’s more dangerous to surrounding traffic than an unpredictable driver, which this machine very much is.

And that’s when it’s driving at least somewhat legally; there are several occasions in this video where the car actually breaks traffic laws, including two instances of it attempting to drive the wrong way down a street, into oncoming traffic.

Nope, not great.

In the comments, many people have criticized Kyle, the driver/supervisor, for allowing the car to make terrible driving decisions instead of intervening. The reasoning for this ranges from simple Tesla-fan rage, to the belief that disengagements help the system learn, to concern that by not correcting the mistakes, Kyle is potentially putting people in danger.

They’re also noting that the software is very clearly unfinished and in a beta state, which is pretty clearly true as well.

These are all reasonable points. Well, the people just knee-jerk shielding Elon’s Works from any scrutiny aren’t reasonable, but the other points are, and they bring up bigger issues.

Specifically, there’s the fundamental question of whether or not it makes sense to test an unfinished self-driving system on public roads, surrounded by people, in or out of other vehicles, who never agreed to participate in any sort of beta test.

You could argue that a student driver is a human equivalent of beta testing our brain’s driving software, though when this is done in any official capacity, there’s a professional driving instructor in the car, sometimes with an auxiliary brake pedal, and the car is often marked with a big STUDENT DRIVER warning.

Image: JDT/Tesla/YouTube

I’ve proposed the idea of some kind of warning lamp for cars under machine control, and I still think that’s not a bad idea, especially during the transition era we find ourselves in.

Of course, in many states, you can teach your kid to drive on your own without any special permits. That context is quite similar to that of FSD beta drivers, who don’t have any special training beyond a regular driver’s license (and no, Tesla’s silly Safety Score does not count as special training).

In both cases, you’re dealing with an unsure driver who may not make good decisions, and you may need to take over at a moment’s notice. In an FSD-equipped Tesla (or really any L2-equipped car), taking over should be easy, since your hands and other limbs are supposed to already be positioned on the car’s controls, ready to go.

In the case of driving with a kid, this is less easy, though still possible. I know because I was once teaching a girlfriend of the time how to drive and had to take control of a manual old Beetle from the passenger seat. You can do it, but I don’t recommend it.

Of course, when you’re teaching an uncertain human, you’re always very, very aware of the situation, and nothing about it would give you a false sense of confidence that could allow your attention to waver. This is a huge problem with Level 2 semi-automated systems, though, and one I’ve discussed at length before.

As far as whether or not the FSD beta needs driver intervention to “learn” about all the dumb things it did wrong, I’m not entirely sure this is true. Tesla has mentioned the ability to learn in “shadow mode,” which would eliminate the need for FSD to be active to learn driving behaviors by example.
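
To make that idea concrete, here’s a minimal sketch of how a shadow-mode data pipeline could work in principle. To be clear, this is a hypothetical illustration, not Tesla’s actual implementation; every name and threshold here is an assumption. The core concept is that the planner runs silently, its output is never executed, and frames where it disagrees with the human driver get logged as training material.

```python
# Hypothetical sketch of "shadow mode" data collection -- purely
# illustrative, NOT Tesla's actual implementation. The planner runs
# silently while a human drives; disagreements between the planner's
# proposed action and the human's actual action are logged as
# candidate training examples.
from dataclasses import dataclass

@dataclass
class Action:
    steering: float      # steering angle in radians
    acceleration: float  # m/s^2, negative means braking

def disagrees(predicted: Action, actual: Action,
              steer_tol: float = 0.05, accel_tol: float = 0.5) -> bool:
    """Flag frames where the silent planner and the human diverge.
    Tolerances are made-up values, chosen only for illustration."""
    return (abs(predicted.steering - actual.steering) > steer_tol
            or abs(predicted.acceleration - actual.acceleration) > accel_tol)

def shadow_mode_step(sensor_frame, planner, human_action: Action, log: list):
    predicted = planner(sensor_frame)  # computed, but never sent to actuators
    if disagrees(predicted, human_action):
        # Only the interesting (divergent) frames are kept for training.
        log.append((sensor_frame, human_action, predicted))
```

If something like this is what Tesla means by shadow mode, then the system can harvest “what a human did here” examples from every mile driven, whether or not FSD is engaged.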

As far as Kyle’s willingness to let FSD beta make its bad decisions, sure, there are safety risks, but it’s also valuable to see what it does to give an accurate sense of just what the system is capable of. He always stepped in before things got too bad, but I absolutely get that this in no way represents safe driving.

At the same time, showing where the system fails helps FSD users get a better sense of the capabilities of what they’re using, so they can understand just how vigilant they must be.

This is all really tricky, and I’m not sure yet of the best practice solution here.

This also brings up the question of whether Tesla’s goals make sense in regard to what’s known as their Operational Design Domain (ODD), which is just a fancy way of saying “where should I use this?”

Tesla has no restrictions on their ODD, as referenced in this tweet:

This raises a really good point: should Tesla define some sort of ODD?

I get that their end goal is Level 5 full, anywhere, anytime autonomy, a goal that I think is kind of absurd. Full Level 5 is decades and decades away. If Tesla freaks are going to accuse me of literally having blood on my hands for allegedly delaying, somehow, the progress of autonomous driving, then you’d think the smartest move would be to restrict the ODD to areas where the system is known to work better (highways, etc.) to allow for more automated deployment, sooner.

That would make the goal more Level 4 than 5, but the result would be, hopefully, safer automated vehicle operation, and, eventually, safer driving for everyone.

Trying to make an automated vehicle work everywhere, in any condition, is an absolutely monumental task, and there’s still so, so much work to do. Level 5 systems are probably decades away, at best. Restricted-ODD systems could be deployed much sooner, and maybe Tesla should consider doing that, just like many other AV companies (Waymo, Argo, and so on) are doing.
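
For what it’s worth, the whole idea of an ODD boils down to a gate check: the system is allowed to engage only when current conditions fall inside the domain it was designed and validated for. Here’s a minimal sketch of that concept; the condition names and the deliberately narrow highway-only domain are my own illustrative assumptions, not any automaker’s real logic.

```python
# Hypothetical sketch of an ODD (Operational Design Domain) gate --
# purely illustrative, not any automaker's actual logic. The system
# may engage only when every condition falls inside the validated domain.
from dataclasses import dataclass

@dataclass
class Conditions:
    road_type: str     # e.g. "divided_highway", "urban_street"
    weather: str       # e.g. "clear", "rain", "snow"
    is_daytime: bool
    in_geofence: bool  # inside a mapped, validated area

# A deliberately narrow, Level 4-style domain: highways only, decent weather.
ALLOWED_ROADS = {"divided_highway"}
ALLOWED_WEATHER = {"clear", "rain"}

def may_engage(c: Conditions) -> bool:
    """Return True only when every ODD condition is satisfied."""
    return (c.road_type in ALLOWED_ROADS
            and c.weather in ALLOWED_WEATHER
            and c.is_daytime
            and c.in_geofence)
```

An unrestricted ODD, in these terms, is a gate that always returns True, which is exactly why an unfinished system becomes everyone’s problem, everywhere.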

We’re still in a very early transition period on this path to autonomy, however that turns out. Videos like these, which show the real-world behavior of such systems, problems and all, are very valuable, even if we’re still not sure about the ethics of making them.

All I know is that now is the time to question everything, so don’t get bullied by anyone.

Something About Tesla’s Model S Plaid Nürburgring Run Doesn’t Sit Right

Musk May Have Lied About Modifications For The Plaid’s Ring Run

Twitter user @Benshooter later walked back the accusation of Musk lying about the record. Tesla posted onboard video to its own YouTube channel of the record run, and it appears that the only modification made was to put an aftermarket digital gauge readout in front of the driver, even retaining the stock yoke steering wheel. The video was also posted to The ‘Ring’s YouTube channel, corroborating the authenticity of the record. The official lap time for this onboard video is 7:35.579.

So what gives? What’s with the other time listed, a 7:30.909? Clearly Tesla has more speed up its sleeve. Judging by the fact that the onboard video was shot in a red car, it appears Tesla may have had two red cars on hand, one stock and one seriously modified. There was also at least one black car on hand. Maybe Tesla ran several iterations of the Plaid at the track, including some future track-focused version with carbon ceramic brakes and a round steering wheel.

With the results having been certified as a record by the necessary officials, Tesla seems to have gone through the motions of getting this done the right way. There’s no reason not to be a little skeptical of Elon Musk’s claims, but at least this time it appears that everything is on the up and up. It’s a little sketchy that Elon didn’t explain that the actual “record” was the slower of the two lap times he posted on Twitter, but it’s entirely possible he doesn’t know the difference.

I Figured Out How Chevy Can Sell A Ton Of Bolts And It Involves Tesla

Screenshot: Chevrolet/Jason Torchinsky/Tesla

By now you’ve possibly heard that there’s a new Chevy Bolt coming, and it’s going to have a very competitive range of 259 miles and an equally competitive price of just under $32,000. You likely haven’t heard all that much about it because, even though it’s a modern, capable EV built by a company that’s been building cars in quantity for over a century, it’s not a Tesla. And, as a not-Tesla EV, nobody gives a shit about it. But I have a plan to fix that, a way for Chevy to really sell a crapton of Bolts. And it involves a whole new kind of engineering.

You’ve heard of reverse engineering, right? Where a company takes a competitor’s product and figures out how it works? And you of course know about badge engineering, where a carmaker slaps its name on some other carmaker’s car to somehow make money, yeah? Well, consider this concept: reverse brand engineering.

What I’m suggesting is that Chevrolet needs to work out a licensing deal with Tesla that lets it offer buyers the option of a Bolt re-branded as a Tesla. Let’s call it the Tesla Model 2.

It could be one of the Bolt’s trim packages, like this:

Screenshot: Chevrolet

The TLS trim package is the one we’re talking about here. This package would offer a full Chevy badge/bowtie delete, ideally even in the little white print in the corners of the windows, too. No Chevy badging anywhere; it would all be replaced with Tesla badges, including a new faux-grille panel without Chevy’s distinctive diamond pattern.

So, we’d go from this:

Screenshot: Chevrolet

…to this:

Screenshot: Chevrolet/Jason Torchinsky/Tesla

Just the nose badge, the “grille” panel, and a Tesla badge on the tailgate, along with a MODEL 2 badge. Oh, and wheels without the Chevy logo.

There would be an adapter included so you could charge your disguised Bolt at any Tesla Supercharger station, and then be able to rub well-lotion’d, world-saving elbows with fellow Tesla owners, where you can talk about Bitcoin and make jokes with the numbers 69 and 420 in them.

Also, all of the Bolt’s instruments will get a UI re-skin to match Tesla’s look-and-feel (this is easy! It’s just software on a screen!) and the Bolt’s center-stack infotainment display screen will get a similar makeover, along with some ability to run Tesla infotainment applications, like the one that makes fart sounds or shows a fireplace or plays Atari games. 

Hardware permitting, it should just run some licensed variant of Tesla’s infotainment software, but even if it’s just emulated or copied, that’ll probably be just fine.

Also, it should have GM’s Super Cruise Level 2 semi-autonomy system installed, just re-named Autopilot SC. This could even be considered an upgrade, depending on who you ask.

Another very important part here is the very obviously Tesla-branded key fob. This should be big and showy and unmistakably Tesla. The key is key.

All that, plus a glossy 8×10 headshot of Elon Musk, ideally signed and with some sort of Tesla Certificate of Authenticity printed on the back, to be produced in case of arguments from gate-keeping Tesla owners, should complete the package.

I’m telling you, with this package, Chevy will move Bolts like they were electric hotcakes. It’s got everything people want in a modern electric car: a Tesla badge!

Plus, unlike a Tesla-built Tesla, the Model 2 will have bumpers that stay on in the rain and novel Sta-Put™ roofs and the parts and service side of things won’t be a total mess.

As far as what Tesla gets out of it, I guess it’s mostly money from their sweet licensing deal with GM, and an entry-level model below the Model 3 on the road with zero effort from them. Plus, plenty of brand visibility, too!

Also, it may help Tesla’s perceived reliability, kind of like how Pontiac benefited when it sold Toyota Matrixes as Pontiac Vibes. Kinda.

This is really Chevy’s best bet to finally get people buying Bolts, because, as we’ve already seen with the perfectly fine first-gen Bolt, nobody really gives a rat’s rectum about them. But mainstream culture is absolutely smitten with Tesla, for reasons that transcend logic.

Why fight it, Chevy? Just pay Elon some cash and start slapping Tesla Ts on your Bolts, and watch those things fly off the lots. You think I’m kidding, but deep down, you know there’s some painful truth here.

As always, you can just Venmo me my cut.

NHTSA Has A Lot Of Catch-Up Ahead

Photo: Associated Press (AP)

It’s happening too often. Someone spots a Tesla owner sleeping while motoring down the freeway, their car under the control of Tesla’s Autopilot driver assistance system. Next thing you know, it’s all over social media.

You may wonder how Tesla was able to release this product onto public roads. Are there no regulations covering such features? Isn’t this a safety issue? According to a report from the Los Angeles Times, it really breaks down to a lack of oversight from the government.

The Trump administration focused its efforts on rolling back fuel economy requirements. Its argument for doing so was that cars would become both cheaper and safer. That didn’t happen, and it’s a mystery why Trump thought it would. One explanation is he didn’t know shit about cars.

Unfortunately, fuel economy and emissions control rollbacks were just about the only things Trump’s NHTSA did get around to doing. NHTSA’s important regulatory oversight work stalled for four years with no director at the helm. Now, the Biden administration has a backlog of neglected tasks to dig through. As the Times report shows, NHTSA has been pretty much hands-off when it comes to driver-assistance systems, specifically when it comes to Tesla’s misleadingly named Autopilot:

Officially, the National Highway Traffic Safety Administration discourages such behavior, running a public awareness campaign last fall with the hashtag #YourCarNeedsYou. But its messaging competes with marketing of Tesla itself, which recently said it will begin selling a software package for “Full Self Driving” — a term it has used since 2016 despite objections from critics and the caveats in the company’s own fine print — on a subscription basis starting this quarter.

That NHTSA has so far declined to confront Tesla directly on the issue is firmly in character for an agency that took a hands-off approach to a wide range of matters under the Trump administration.

“Inactive,” is how Carla Bailo, chief executive of the Center for Automotive Research, summed up NHTSA’s four previous years. “Dormant,” said Jason Levine, executive director at the Center for Auto Safety. “No direction,” said Bryant Walker Smith, a professor and expert in autonomous vehicle law at the University of South Carolina.

The agency went the full Trump term without a Senate-confirmed administrator, leaving deputies in charge. It launched several safety investigations into Tesla and other companies, but left most unfinished. “A massive pile of backlog” awaits the Biden administration, said Paul Eisenstein, publisher of The Detroit Bureau industry news site.

While NHTSA has been absent on a number of issues, its lack of oversight on autonomous driving is perhaps the biggest. The Times says Level 2 autonomy is the biggest safety challenge since Ralph Nader’s Unsafe At Any Speed. Silly Nader references aside, the Times does have a point.

How to deal with emerging autonomous driving technologies is a long-term issue. But one thing is for sure: the way Tesla uses its customers as beta testers sets off alarm bells with experts.

Whoever takes charge must balance the long-term potential for next-generation cars to reduce pollution, traffic and greenhouse gases against the near-term risks of deploying buggy new technologies at scale before they’re fully vetted. In the “move fast and break things” style of Silicon Valley, Tesla Chief Executive Elon Musk has embraced those risks.

While other driverless car developers — from General Motors’ Cruise, to Ford’s Argo AI, to Amazon’s Zoox, to Alphabet’s Waymo, to independent Aurora and more — all take an incremental, slow rollout approach with professional test drivers at the wheel, Tesla is “beta testing” its driverless technology on public roads using its customers as test drivers.

Musk said last month that Tesla cars will be able to fully drive themselves without human intervention on public roads by late this year. He’s been making similar promises since 2016. No driverless car expert or auto industry leader outside Tesla has said they think that’s possible.

While law professor Smith is impressed by Tesla’s “brilliant” ability to use Tesla drivers to collect millions of miles of sensor data to help refine its software, “that doesn’t excuse the marketing, because this is in no way full self-driving. There are so many things wrong with that term. It’s ludicrous. If we can’t trust a company when they tell us a product is full self-driving, how can we trust them when they tell us a product is safe?”

The Detroit Bureau’s Eisenstein is even harsher. “Can I say this off the record?” he said. “No, let me say it on the record. I’m appalled by Tesla. They’re taking the smartphone approach: Put the tech out there, and find out whether or not it works. It’s one thing to put out a new iOS that caused problems with voice dictation. It’s another thing to have a problem moving 60 miles per hour.”

A late-2016 NHTSA directive under the Obama administration considered “predictable abuse” as a potential defect in autonomous driving tech deployment. Unfortunately, under Trump, NHTSA did nothing. For context, the directive came about a year after the software that enabled Autopilot driver assistance in the Tesla Model S was released.

The inaction of NHTSA drew ire from another federal safety agency, the National Transportation Safety Board. The NTSB, which is best known for its investigations of plane and train incidents, blamed predictable abuse for a 2018 incident in which a Tesla Model X crashed into a concrete divider.

Part of the issue is the lack of transparency from Musk and Tesla regarding how safe the Autopilot driver-assist system is as well as a lack of data in general. From the Times:

Musk regularly issues statistics purporting to show that Autopilot and Full Self Driving are on balance safer than cars driven by humans alone. That could be, but even if Musk’s analysis is sound — several statisticians have said it is not — the data is proprietary to Tesla, and Tesla has declined to make even anonymized data available to university researchers for independent confirmation. Tesla could not be reached — it disbanded its media relations department last year.

***

In 2019, after a series of Tesla battery fires, NHTSA launched a probe of the company’s software and battery management systems. Later, the agency said allegedly defective cooling tubes that could cause leaks were being investigated as well. At the time, the agency did not make public information it held about battery cooling tubes prone to leakage that were installed in early versions of the Model S and Model X.

Since late 2016, many Tesla drivers had been complaining about “whompy wheels” on their cars — a tendency for the suspension system to break apart, which sometimes caused a wheel to collapse or fall off the car. Chinese drivers lodged similar complaints, and last October, Chinese authorities ordered a recall of 30,000 Model S and Model X cars. A Tesla lawyer wrote NHTSA a letter arguing no U.S. recall was necessary and blamed driver “abuse” for the problems in China. NHTSA said in October it is “monitoring the situation closely.”

Four days before Biden’s inauguration, NHTSA announced that defects in Tesla touchscreen hardware can make the car’s rear-view camera go blank, among other problems. Rather than order a recall, NHTSA said it asked Tesla to voluntarily recall approximately 158,000 Model S and Model X cars for repair. On Feb. 2, Tesla agreed to recall 135,000 of those cars.

Check out the full Los Angeles Times report; it’s well worth the read!