Description:
In this episode of Kilowatt, I analyze Tesla's ambitious plans for autonomous vehicles following the controversial We, Robot event. I aim to fill the gaps left by the event's vague details, especially regarding the CyberCab and its wireless charging design. There’s a focus on how FSD operations, like Autopark, enhance the overall autonomous experience, contrasted with mixed feedback concerning Tesla's full self-driving technology claims. Discussions also address regulatory challenges and safety concerns surrounding these advancements.
Support the Show:
Other Podcasts:
News:
- Tesla needs to deploy lobbyists for Cybercab
- Tesla wants to flood the streets with Cybercab
- FSD is more dangerous as it gets better
- We, Robot and movie ripoffs
- Tesla sued by Blade Runner producers
- Were Optimus robots controlled by humans?
- Tesla's wireless charging solution is 90% efficient
- Will Robovan fit in a Boring Tunnel?
- Inspiration behind Robovan
- Tesla has a lot to prove on Robotaxi
- Tesla's stock dip
- Tesla FSD requires an intervention every 13 miles
- NHTSA and FSD Investigation 1
- NHTSA and FSD Investigation 2
*ART PROVIDED BY DALL-E
Support this show http://supporter.acast.com/kilowatt.
Hosted on Acast. See acast.com/privacy for more information.
Hosted by Simplecast, an AdsWizz company. See pcm.adswizz.com for information about our collection and use of personal data for advertising.
[00:00:21] Hello everyone and welcome to Kilowatt, a podcast about electric vehicles, renewable energy, autonomous driving, and much, much more.
[00:00:27] My name is Bodhi and I am your host and on today's episode we are going to talk about autonomous driving and a little bit more.
[00:00:34] Not much, much more, just a little bit more.
[00:00:36] It's been about two weeks now, almost two weeks, since we covered Tesla's We, Robot event.
[00:00:42] And, you know, a lot of the criticism that Tesla got for the We, Robot event is that it didn't include a lot of information.
[00:00:52] Like, Tesla had this, or Elon, you know, portrayed this grand vision for the CyberCab RoboTaxi service, but was really light on the details on how they were going to actually accomplish this.
[00:01:10] So I thought it would be a good idea to take an episode and actually fill in those gaps.
[00:01:16] But before we do that, I would like to welcome a new patron, Brian.
[00:01:22] Brian went to patreon.com forward slash Kilowatt, or he might have gone to supportkilowatt.com, and signed up to become a patron.
[00:01:30] Brian, thank you so much for supporting the show.
[00:01:32] I really appreciate it.
[00:01:34] It really does mean a lot and it really helps.
[00:01:36] So, again, I just want to say a very heartfelt thank you for doing that.
[00:01:41] It really does mean a lot.
[00:01:44] So, thank you.
[00:01:46] If you need any assistance setting up your RSS feed or if you have any questions or suggestions, please feel free to reach out to me.
[00:01:53] Of course, my email is bodie, B-O-D-I-E at 918digital.com.
[00:02:00] All right, let's move on to our news.
[00:02:03] Now, I do have to let you know that a normal show, a normal 20-minute show is about four pages of notes for me.
[00:02:14] Today's show is 10 pages of notes.
[00:02:17] This took me a very long time to not only go through all the articles that I needed to go through and find, you know, the things that were important, but also to write it up.
[00:02:30] I had to reorganize it a couple of times.
[00:02:33] Like, there's going to be a lot of information in this episode.
[00:02:35] So, hopefully, you find this episode useful.
[00:02:40] What I thought we would do is we'd start with some, you know, some of the lighter stuff.
[00:02:45] So, let's start off with CyberCab and wireless charging.
[00:02:49] One of the things that I found interesting was that the wireless charger for the vehicle is in the back of the vehicle, not in the front.
[00:02:59] So, in order to charge your vehicle, you need to drive over the pad or back onto the pad to charge your vehicle.
[00:03:07] And I thought, well, that's just really interesting the way Tesla does that.
[00:03:11] I wonder why that is.
[00:03:12] Because if you're charging in your garage, right, you normally drive, you know, forward into your garage.
[00:03:20] And you would probably want the charger to be near the front of the garage.
[00:03:25] At least, that would be my thought.
[00:03:28] But if, you know, Tesla brings wireless charging to other vehicles, it looks like it's going to need to be in the rear of the vehicle.
[00:03:35] So, I thought that was interesting.
[00:03:37] Or at least for the CyberCab, it's going to need to be in the rear.
[00:03:41] But then I thought, well, you know, one of my favorite features is Tesla's Autopark.
[00:03:48] So, Autopark requires full self-driving, which I don't have.
[00:03:53] But I did get granted another trial.
[00:03:56] So, I've been playing with it over the last couple days.
[00:03:59] But anyway, if you don't know what Autopark is, when you're driving through a parking lot,
[00:04:02] you'll actually see on the screen the parking spaces.
[00:04:07] And then it's got a little P.
[00:04:08] And you choose which parking space you want, and the car will park itself.
[00:04:12] It'll actually back into that parking space.
[00:04:15] And it works really well, to be honest.
[00:04:18] It's one of my favorite features of full self-driving.
[00:04:20] And I genuinely would pay, you know, $100 or $200 to use this thing all the time.
[00:04:27] Just have access to it whenever without full self-driving.
[00:04:30] But when you consider the CyberCab needs to back into a space in order to charge, you know,
[00:04:38] over this charging pad, the Autopark feature makes a lot of sense when it does that.
[00:04:44] Like, it just doesn't nose into the parking spot.
[00:04:46] It actually backs into the parking spot.
[00:04:48] So, maybe this isn't groundbreaking to everybody else.
[00:04:52] But I was like, oh, well, that makes a lot of sense.
[00:04:55] Again, I could be the last person figuring this out.
[00:04:58] So, I might sound really foolish right now.
[00:05:02] But anyway, there you go.
[00:05:04] The next little bit about CyberCab and wireless charging is Marquez Brownlee made a little video.
[00:05:11] Well, not a little video.
[00:05:12] His videos are always huge.
[00:05:14] But he made a video talking about the inefficiencies when it comes to wireless charging and vehicles.
[00:05:21] And he hypothesized that Tesla is probably getting somewhere around 75% efficiency with their wireless charging system.
[00:05:29] Which is to say, you know, to make this really easy, say the max they could get is 10 kilowatts.
[00:05:38] And with 75% efficiency, that would drop it down to 7.5 kilowatts.
[00:05:46] Tesla did respond to his video and claimed that their efficiency on their wireless charger is well over 90%.
[00:05:55] Now, the proof is in the pudding.
[00:05:58] Once these actually come out and are deployed in the world, we'll see if that's really true or if that only works in Tesla's test labs.
[00:06:06] But 90% is a pretty good number.
[00:06:08] So, we shall see.
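The back-of-the-envelope math here is easy to sketch in code. Note the 10-kilowatt pad rating is just the round number used in the episode for illustration, not a published Tesla spec:

```python
# Delivered power from a wireless charging pad at a given efficiency.
# The 10 kW pad rating is a hypothetical round number from the episode,
# not a published Tesla spec.

def delivered_kw(pad_kw: float, efficiency: float) -> float:
    """Power actually reaching the battery, in kilowatts."""
    return pad_kw * efficiency

PAD_KW = 10.0

# MKBHD's hypothesized ~75% vs. Tesla's claimed "well over 90%".
for eff in (0.75, 0.90):
    print(f"{eff:.0%} efficient: {delivered_kw(PAD_KW, eff):.1f} kW delivered")
```

At the hypothetical 10 kW, the gap between the two claims is about 1.5 kW of charging power lost to heat.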
[00:06:10] All right.
[00:06:11] Let's move on to RoboVan.
[00:06:14] Franz von Holzhausen, if you don't know, he is Tesla's lead designer.
[00:06:20] He was on the Kilowatts podcast, not this podcast, the Kilowatts podcast.
[00:06:27] And he was interviewed at the WeRobot event.
[00:06:30] And they were actually in the RoboVan, which is pretty cool.
[00:06:34] And the host, Ryan, asked if the RoboVan could fit into a boring tunnel.
[00:06:40] And Franz kind of smiled.
[00:06:42] His eyes lit up.
[00:06:43] And he smiled.
[00:06:43] And he said, possibly.
[00:06:45] And almost certainly it would. You know, if you're trying to move people in a significant way around places that are really congested, like L.A., and you happen to have access to a tunnel, why wouldn't you do this?
[00:07:01] It makes a lot of sense.
[00:07:03] I did watch the entire interview.
[00:07:06] And I'm going to be honest with you.
[00:07:08] It was not great.
[00:07:10] And let me be clear on this.
[00:07:11] It was not the fault of host Ryan Levinson.
[00:07:14] It was not his fault.
[00:07:15] He did fine.
[00:07:17] It was just Franz did not give any more information that was significant.
[00:07:22] It was surface level only.
[00:07:25] And Ryan even, you know, talks about the event and how there wasn't much information a little bit later.
[00:07:32] But if you want to go back and watch it, I'll put a link in the show notes.
[00:07:35] And, you know, even though our names are similar, I have zero animosity towards Ryan and what he does.
[00:07:43] I think what he does is great.
[00:07:44] So go watch his YouTube channel.
[00:07:48] All right.
[00:07:51] Elon posted a picture of RoboVan on X with the caption saying,
[00:07:56] futuristic Art Deco bus.
[00:07:59] He then later confirmed that the RoboVan's design inspiration was taken from Art Deco trains.
[00:08:06] And if you look up trains from the 1930s on Google, that makes sense to me.
[00:08:12] That jives.
[00:08:13] I don't know if it's true.
[00:08:15] We'll talk about that here in a second.
[00:08:17] But it makes total sense that it does kind of have that look and feel.
[00:08:24] I guess that leads us to a few people in the movie industry that are unhappy with Tesla's event.
[00:08:32] Producers of Blade Runner 2049 are not happy with Tesla and Elon for using Blade Runner-inspired visualizations at the We, Robot event.
[00:08:43] It does sound like Tesla asked permission to use some Blade Runner inspired visuals.
[00:08:49] And they were denied.
[00:08:51] And Tesla did it anyway.
[00:08:53] And the producers are claiming that Tesla used AI generated images that were nearly identical to images from the film.
[00:09:02] So the producers are suing Tesla and Elon.
[00:09:05] I'm not sure if they'll win.
[00:09:06] It'll be a while before we find out if this goes anywhere or becomes a big deal.
[00:09:12] But, yeah.
[00:09:13] I haven't seen the new Blade Runner.
[00:09:16] But I did see the original way back when.
[00:09:19] And I always thought it was weird that Elon wants to model, and he's said this in the past, his vehicles off of these, you know, movies that have a very bleak outlook for our future.
[00:09:34] Now, I will grant you that there was cool technology in that world.
[00:09:40] But it wasn't exactly a utopia.
[00:09:42] It was kind of sad.
[00:09:44] And there was a lot of problems.
[00:09:46] I don't...
[00:09:47] Why would we want to manifest that in our real life?
[00:09:50] I'm not really sure.
[00:09:52] Now, speaking of sci-fi movies, We, Robot is clearly a reference to the movie I, Robot.
[00:10:00] And, you know, you could do a side-by-side comparison of the Optimus robots with the I, Robot robots, or the Robovan with the transport van in I, Robot.
[00:10:12] I have seen that movie.
[00:10:13] And even the CyberCab looks kind of like the Audi.
[00:10:17] I think it was an Audi that Will Smith was driving in that movie.
[00:10:20] There's a lot of design cues that seem like they're taken from that movie.
[00:10:24] Even, like we said, the Robovan's design inspiration came from the Art Deco trains.
[00:10:32] But if you look at them side-by-side, it looks like maybe some of that inspiration also came from I, Robot.
[00:10:42] But I don't think the I, Robot people are going to do anything about this, but they are calling out the similarities.
[00:10:48] And a lot of people are as well.
[00:10:49] And there's a lot of funny memes that also call out the similarities.
[00:10:53] Because if you've seen the movie, great.
[00:10:56] If you haven't, the robots go off script and start hurting people.
[00:11:05] The movie's old.
[00:11:06] If that's too much of a spoiler for you, don't email me.
[00:11:09] It's old.
[00:11:10] And then that brings us to the Optimus robots.
[00:11:14] I mentioned that the Optimus robots were serving drinks and handing out gifts and, you know, walking among the meatbags in the crowd.
[00:11:27] As a matter of fact, each Optimus bot had its own meatbag assistant that made sure that nobody abused the Optimus bot, which is good.
[00:11:37] But there may have been another purpose there as well.
[00:11:40] Because each meatbag minder had a device in their hand that allowed the minder to interact with the Optimus bot in some way.
[00:11:49] We don't know exactly what it was doing.
[00:11:52] But they had some device in their hand and they were clicking buttons.
[00:11:55] So, I'm not sure if everything was totally AI like, I think, Elon was leading us to believe.
[00:12:04] Now, I don't believe that he specifically said that it was 100% AI.
[00:12:08] The robots were interacting this way.
[00:12:10] Some people were able to have conversations with Optimus.
[00:12:14] And a lot of people were impressed with the conversations.
[00:12:19] Like one person asked about their shoes, and the Optimus made a joke that it was just their feet.
[00:12:26] I don't know if this was real.
[00:12:28] It did not seem like it was real to me, but I don't know.
[00:12:33] So, I'm just going to quote somebody who probably does know.
[00:12:36] Adam Jonas of Morgan Stanley said,
[00:12:39] It is our understanding that these robots were not operating entirely autonomously,
[00:12:44] but relied on tele-ops, human intervention,
[00:12:48] so it was more of a demonstration of degrees of freedom and agility.
[00:12:52] So, my question to you is, do you think that there was some human intervention when it came to the Optimus robots?
[00:13:01] Honestly, Elon has been known to exaggerate FSD capabilities in the past and got caught,
[00:13:09] you know, editing videos to make it seem as if FSD worked better than it did at the time.
[00:13:16] Do you think there was some human interaction going on with these Optimus robots?
[00:13:23] Send me an email.
[00:13:23] Bodie, B-O-D-I-E at 918digital.com.
[00:13:28] And here's another question, because Tesla had a disclaimer about forward-looking statements at that time.
[00:13:35] Do you think that if Tesla did lie about anything, or knowingly lie about anything at the event,
[00:13:43] do you think that violates any sort of SEC rules?
[00:13:49] So, yeah.
[00:13:51] Send me an email.
[00:13:52] Bodie, B-O-D-I-E at 918digital.com.
[00:13:57] All right.
[00:13:58] So, we're done with that section.
[00:13:59] Let's talk about full self-driving, supervised and unsupervised, and robo-taxi.
[00:14:05] And let's start off with the basics, right?
[00:14:07] Right now, Tesla in regulatory filings has said that, you know, full self-driving is level two.
[00:14:16] So, that's where we are at the moment as of today.
[00:14:19] We have access to.
[00:14:21] They use Tesla Vision.
[00:14:24] So, it's only cameras.
[00:14:25] Instead of using radar, LIDAR, ultrasonic sensors, and cameras, Tesla only uses cameras.
[00:14:31] They turned off the ultrasonic sensors and the radars on the cars that had them,
[00:14:35] and they stopped including them, you know, on other cars when they made that change.
[00:14:42] Now, armed with that knowledge, let's go ahead and talk about some things related to robo-taxi and full self-driving.
[00:14:50] We're going to talk about it a little bit from an investor's perspective, from the journalist's perspective,
[00:14:58] and then kind of from regulators' perspectives.
[00:15:01] On what they need to prove in order to, you know, be granted the ability to operate a robo-taxi,
[00:15:09] even if it's just for testing.
[00:15:11] And then maybe some challenges that they're going to have.
[00:15:15] So, let's start off with the investors.
[00:15:19] After the event, the next day, on Friday, the 11th, Tesla's stock took a 9% hit.
[00:15:26] So, Elon lost $15 billion in net worth, which is a massive loss for most people,
[00:15:35] but he is still the richest man with an updated net worth of $240 billion.
[00:15:40] And I should say, richest man tracked.
[00:15:43] It's kind of thought that Vladimir Putin was probably the richest man in the world,
[00:15:47] and maybe some of the other folks that maybe don't have their money tracked quite as closely as folks,
[00:15:54] other folks do.
[00:15:56] Let's just say it that way.
[00:15:57] But richest man on record, Elon Musk.
[00:16:01] One of the reasons given for the dip in stock price is Tesla's lack of detail.
[00:16:08] You know, they didn't really give a plan, share a plan on how all of this was going to come together.
[00:16:14] And then there was some skepticism when it comes to Tesla's timeline.
[00:16:19] Last episode, we talked about, you know, Tesla's departures,
[00:16:24] but we also talked about some people who got promoted at Tesla, right?
[00:16:28] And there was some talk right before the event of having newly promoted Tesla VPs,
[00:16:34] Milan Kovac and Ashok Elluswamy speak at the event and provide much-needed details.
[00:16:41] And it was reported that Elon said no,
[00:16:45] because Elon wanted to keep the presentation at a high level
[00:16:49] because he knew that competitors watched Tesla so closely.
[00:16:54] There are also some people out there saying that this was another way for Tesla to,
[00:17:00] you know, just raise capital.
[00:17:03] Tesla doesn't need capital that bad.
[00:17:05] So I don't think that that was the case with the event.
[00:17:09] But I'll also say that, you know,
[00:17:13] if you're worried that competitors are going to copy you,
[00:17:17] those competitors already have a bunch of stuff in motion.
[00:17:21] They're probably not talking about it like you are,
[00:17:23] but they already have a bunch of stuff in motion.
[00:17:25] A lot of the stuff that they're working on
[00:17:28] relies on technologies like LiDAR and radar and ultrasonic sensors and cameras
[00:17:34] and things that you're doing differently.
[00:17:37] So, yes, could they have stolen some stuff or gotten some inspiration,
[00:17:43] let's say, from the We, Robot presentation?
[00:17:46] Absolutely.
[00:17:47] It might get them to move in a specific direction,
[00:17:50] but it probably wouldn't be done in a very timely manner.
[00:17:53] And they'd have to abandon a bunch of things that they've already been working on.
[00:17:57] So I can see Elon's point in that we don't want to give too much away.
[00:18:04] However, the people working in this field,
[00:18:08] in the autonomous driving field,
[00:18:11] they already know a lot of this stuff.
[00:18:14] I genuinely don't think that anything that Tesla's working on
[00:18:18] would be considered surprising to any of these engineers
[00:18:25] and researchers that are working in the area of autonomous driving.
[00:18:29] Now, I could be wrong.
[00:18:30] Maybe Tesla's doing something in such a unique way
[00:18:33] that it takes them by surprise.
[00:18:36] But also, FYI, a lot of the Tesla engineers
[00:18:40] that worked on some of this stuff are now working other places.
[00:18:43] So it's not as if they're going to give up Tesla's secret sauce,
[00:18:49] but they come with knowledge.
[00:18:51] And maybe they don't do it exactly the way Tesla does it,
[00:18:54] but they can do something similar that is adjacent to this.
[00:18:57] What I'm saying is Tesla probably could have given a lot more detail
[00:19:02] without giving away the store, right?
[00:19:07] I think the argument that we're going to keep this real simple
[00:19:12] so that our competitors don't get any ideas,
[00:19:16] I don't think that's a great argument
[00:19:17] because these folks already know.
[00:19:20] Whatever you're going to tell them that is going to sound new and amazing
[00:19:25] to bozos like me
[00:19:27] is going to be old hat to people who actually research this stuff
[00:19:31] and work in this space.
[00:19:35] So I think that's a cop-out, to be honest with you.
[00:19:41] But I am sure we're going to hear more on all of this
[00:19:45] when it comes to Tesla's earnings call,
[00:19:47] which is actually tomorrow as I record this.
[00:19:51] Again, I wanted to get this episode out way sooner than I have.
[00:19:57] It's just, it was a big episode to write, so forgive me.
[00:20:02] Let's see here.
[00:20:05] Let's move on to journalists
[00:20:07] because investors are interesting, right?
[00:20:11] And what the stock price does is interesting,
[00:20:13] but it's not something we really cover here on our show.
[00:20:17] But we do, you know, get our news from journalists.
[00:20:20] So let's start with that.
[00:20:22] From the articles that I've read
[00:20:24] and I try to get my information
[00:20:27] from a variety of different news outlets,
[00:20:30] there's a lot of skepticism
[00:20:32] that Tesla will be able to accomplish
[00:20:34] their robo-taxi goal
[00:20:36] in the timeframe that Elon laid out.
[00:20:39] And we're going to go over all the reasons why
[00:20:42] here in a moment.
[00:20:43] But I will say that this is based on
[00:20:45] the journalists' professional opinions.
[00:20:48] It's based on where we are currently
[00:20:50] with full self-driving.
[00:20:53] It's based on Elon's inability
[00:20:56] to deliver in a timely manner
[00:20:59] on certain promises.
[00:21:01] Like, I think Elon delivers more
[00:21:02] than people give him credit for,
[00:21:04] but it's always very late.
[00:21:06] And then it's based on
[00:21:09] like current laws and regulations.
[00:21:12] And again, we're going to get into that.
[00:21:14] So I just want to make that very clear
[00:21:16] before we actually start talking
[00:21:17] about this stuff.
[00:21:18] So what we're basing this on
[00:21:21] is opinions.
[00:21:23] There's some experts
[00:21:24] that the journalists interviewed
[00:21:26] that will be in here.
[00:21:27] It's their opinion as well.
[00:21:29] So none of this is set in stone.
[00:21:34] You know, there's a presidential election
[00:21:35] coming up that could change the way
[00:21:37] that this works for, you know,
[00:21:39] the positive for Tesla and Elon
[00:21:42] and for maybe a negative.
[00:21:44] Who knows?
[00:21:46] So what I'm saying is
[00:21:48] this information comes with some caveats.
[00:21:50] All right.
[00:21:52] Let's start with
[00:21:54] is full self-driving ready for CyberCab?
[00:21:56] And the short answer to that is no,
[00:21:58] but we're going to start with a quote
[00:22:00] from Ars Technica.
[00:22:02] Tesla's controversial full self-driving
[00:22:05] is now capable of some
[00:22:06] quite advanced driving,
[00:22:08] but can breed undeserved complacency
[00:22:12] according to independent testing.
[00:22:14] The partially automated driving system
[00:22:17] exhibited dangerous behavior
[00:22:19] that required human intervention
[00:22:21] more than 75 times
[00:22:23] over the course
[00:22:24] of more than 1,000 miles
[00:22:27] or 1,600 kilometers
[00:22:28] of driving in Southern California,
[00:22:30] averaging one intervention
[00:22:32] every 13 miles.
[00:22:34] So we actually talked about this,
[00:22:36] if this sounds familiar,
[00:22:37] a couple of weeks ago,
[00:22:39] AMCI did some testing
[00:22:40] and this quote was based
[00:22:42] off of their testing.
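AMCI's headline number falls out of simple division. A minimal sketch using the approximate figures quoted; AMCI reported "more than" both, so this is only a rough average:

```python
# Rough miles-per-intervention from AMCI's reported FSD test figures.
# 1,000 miles and 75 interventions are the approximate numbers quoted;
# AMCI reported "more than" both, so treat this as an illustrative average.

def miles_per_intervention(miles: float, interventions: int) -> float:
    """Average miles driven between required human interventions."""
    return miles / interventions

rate = miles_per_intervention(1000, 75)
print(f"about one intervention every {rate:.1f} miles")  # ~13.3 miles
```

Which rounds down to the "one every 13 miles" figure in the quote.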
[00:22:44] The disengagements
[00:22:45] could be something silly
[00:22:46] that the system should know how to handle,
[00:22:50] or it could be an edge case.
[00:22:52] And the edge cases,
[00:22:54] you know,
[00:22:55] are harder to deal with
[00:22:56] when it comes to,
[00:22:57] you know,
[00:22:59] autonomous driving systems.
[00:23:00] And I'm hoping to get a representative
[00:23:02] from AMCI on the show.
[00:23:04] I'm actually kind of trying
[00:23:06] to work that out right now.
[00:23:07] So I'll keep you posted on that
[00:23:09] as we kind of go through the process.
[00:23:11] But we also have Fred Lambert
[00:23:14] who did his own testing.
[00:23:17] Fred Lambert is of Electrek.
[00:23:17] I should let you know
[00:23:18] that Fred Lambert is,
[00:23:21] I don't want to say
[00:23:22] he's not a fan of Tesla,
[00:23:24] but he's been very critical of Tesla.
[00:23:26] I happen to agree
[00:23:27] with a lot of things that he says.
[00:23:29] I don't agree with everything
[00:23:29] that he says,
[00:23:30] but I think he is fair
[00:23:32] when it comes to Tesla.
[00:23:33] But Tesla, or Elon specifically, I'm not really sure which, is not a fan of his takes, for sure.
[00:23:39] So Fred recently took a 200 mile
[00:23:42] or 350 kilometer trip
[00:23:44] and he used FSD 12.5.4.2
[00:23:50] nearly the entire trip.
[00:23:52] Like I said,
[00:23:53] I think Fred is fair.
[00:23:55] He had some good things to say
[00:23:56] about FSD and some bad things.
[00:23:58] I'm going to start with the bad things.
[00:24:00] And I will say that Fred lives in Canada
[00:24:04] and I'm not familiar with,
[00:24:06] he gave the cities,
[00:24:07] but I'm not familiar
[00:24:09] with the road trip that he's taking.
[00:24:11] You know,
[00:24:11] if someone was like,
[00:24:13] I'm going from LA to Vegas,
[00:24:15] I'm very familiar with that trip.
[00:24:17] I'm familiar with Boise, Idaho
[00:24:19] to Seattle, Washington.
[00:24:20] You know,
[00:24:21] I've done those trips tons of times.
[00:24:22] I have no idea
[00:24:23] what he encountered on his trip
[00:24:26] through Canada, right?
[00:24:27] Through the 200 miles
[00:24:28] or 350 kilometers that he took.
[00:24:31] So I am going off of his article alone.
[00:24:36] I do not know how congested the roads are,
[00:24:40] what type of roads he was on,
[00:24:41] you know,
[00:24:42] for long periods of time.
[00:24:45] Just putting that out there.
[00:24:46] So we'll start with the bad
[00:24:49] and we'll start with the kind of the minor bad.
[00:24:51] There was a large number of interventions.
[00:24:55] Now an intervention is not a full disengagement,
[00:24:59] but the driver needs to interact with the system
[00:25:01] in some way or another.
[00:25:02] One of the examples he gave
[00:25:05] was passing a slower car.
[00:25:07] So it's a two lane road.
[00:25:10] You're,
[00:25:10] you're,
[00:25:11] let's say we're going east.
[00:25:12] You got somebody slow in front of you.
[00:25:13] The car passes on the left
[00:25:16] and then doesn't move back into the right lane
[00:25:19] after making that pass,
[00:25:21] right?
[00:25:23] this is a very specific example
[00:25:25] and I'm not entirely sure
[00:25:27] if there's some sort of Canadian,
[00:25:29] you know,
[00:25:30] rule that,
[00:25:31] or law that means
[00:25:32] you can only travel in the left lane
[00:25:34] if you're passing.
[00:25:35] Here in the United States,
[00:25:36] people are slow in the fast lane
[00:25:39] and slow in the slow lane.
[00:25:41] So,
[00:25:41] um,
[00:25:43] I don't,
[00:25:43] I don't know if this is just a preference
[00:25:45] that Fred has
[00:25:46] or if this is some sort of law.
[00:25:48] Like here,
[00:25:49] I don't think it would be that big of a deal,
[00:25:52] in Arizona.
[00:25:52] But again,
[00:25:53] I'm not that familiar with Canada
[00:25:55] and their road systems.
[00:25:57] I have driven in Canada.
[00:25:58] It was like 30 years ago.
[00:26:01] So another intervention
[00:26:03] that he listed was
[00:26:05] he was coming up
[00:26:07] to a flashing green light
[00:26:08] and the car was hesitating too long,
[00:26:11] didn't know what to do
[00:26:13] or it wasn't clear
[00:26:14] if the car knew what to do
[00:26:15] and he was holding up traffic.
[00:26:18] So he pushed on the accelerator
[00:26:19] so that the car would go
[00:26:21] through the intersection.
[00:26:22] That makes sense.
[00:26:23] I've kind of had stuff like that
[00:26:24] happen before
[00:26:25] where the car was hesitant.
[00:26:26] So that totally makes sense to me.
[00:26:30] It apparently didn't disengage
[00:26:31] the autopilot system
[00:26:33] or the full self-driving system.
[00:26:34] It was just an intervention.
[00:26:36] Now let's go through
[00:26:37] the actual disengagements.
[00:26:39] He said he had five disengagements
[00:26:41] over 200 miles
[00:26:43] and he claimed
[00:26:44] some of those were dangerous.
[00:26:46] So one example of a non-dangerous
[00:26:49] disengagement was
[00:26:51] there was sun glare
[00:26:52] on the cameras
[00:26:53] and it disengaged
[00:26:54] because the car
[00:26:54] didn't know what to do.
[00:26:55] That actually seems like a safety disengagement.
[00:27:00] but in one instance,
[00:27:02] Fred felt like the car
[00:27:04] was going to run a red light.
[00:27:06] Now,
[00:27:06] he wasn't sure
[00:27:08] if it was going to run
[00:27:09] a red light or not
[00:27:11] because the car
[00:27:11] doesn't give you
[00:27:12] any indication
[00:27:13] that it's going to stop
[00:27:16] for that red light.
[00:27:16] It gives you an indication
[00:27:17] that it sees the red light,
[00:27:19] but it doesn't tell you
[00:27:19] it's going to stop.
[00:27:21] And he makes this very clear
[00:27:25] that at one point in time
[00:27:27] through FSD,
[00:27:28] the car would decelerate
[00:27:30] a little bit sooner
[00:27:31] when approaching a stoplight.
[00:27:33] But now it maintains
[00:27:34] the higher speed for longer
[00:27:36] and is more aggressive
[00:27:37] in braking
[00:27:38] when it comes to that stop.
[00:27:39] And I've experienced this recently too.
[00:27:42] Like I said,
[00:27:42] I got the full self-driving trial
[00:27:45] and I'm going to talk about that
[00:27:46] here in a little bit.
[00:27:47] So I have experienced
[00:27:49] that same thing
[00:27:50] or something similar,
[00:27:51] I guess,
[00:27:52] with FSD and my car,
[00:27:54] which has hardware three.
[00:27:56] So I understand why, when you're barreling up toward a red light and the car is not indicating that it's going to stop, you would actually put the brake on
[00:28:07] and make sure
[00:28:09] that you're not going
[00:28:09] to run through that red light
[00:28:10] and hurt yourself
[00:28:11] or somebody else.
[00:28:12] So that makes total sense to me.
[00:28:13] But all in all,
[00:28:14] five disengagements
[00:28:15] in 200 miles,
[00:28:17] I'm going to talk
[00:28:18] about my own story.
[00:28:18] I had three
[00:28:20] in the 30 some miles
[00:28:21] that I had to drive to work.
[00:28:24] So it doesn't seem that bad.
[00:28:26] This seems pretty good.
[00:28:27] Um,
[00:28:27] other than,
[00:28:28] you know,
[00:28:29] possibly running a red light.
[00:28:31] And then let's talk about,
[00:28:33] uh,
[00:28:33] the good.
[00:28:34] Fred says the car feels
[00:28:36] more natural
[00:28:36] when it comes to merging
[00:28:37] and lane changes
[00:28:38] and roundabouts.
[00:28:41] And in one instance,
[00:28:42] it gave the pedestrians
[00:28:43] the right of way.
[00:28:44] So it stopped the car
[00:28:45] so pedestrians could
[00:28:46] cross the street.
[00:28:46] That's great.
[00:28:47] And he said it was better
[00:28:49] at dealing with intersections.
[00:28:50] He also said
[00:28:51] there was notable improvement
[00:28:53] with city driving.
[00:28:54] And I would say
[00:28:55] similar to my experience,
[00:28:57] but he did say
[00:28:58] that there is a scary part
[00:29:00] of his full self-driving experience.
[00:29:02] And that is
[00:29:03] that it can build complacency,
[00:29:05] which is what
[00:29:08] the AMCI testing
[00:29:10] said as well:
[00:29:12] it can build complacency
[00:29:14] because it's really good
[00:29:15] at a lot of things,
[00:29:16] but it's not so good
[00:29:17] at those edge cases.
[00:29:19] So if the person
[00:29:20] responsible
[00:29:21] for the vehicle,
[00:29:22] the person in the driver's seat
[00:29:23] isn't paying attention
[00:29:26] and you come up
[00:29:27] against one of those
[00:29:28] edge cases,
[00:29:29] you know,
[00:29:30] you could get in an accident
[00:29:33] and hopefully nobody gets hurt.
[00:29:35] You could get hurt yourself.
[00:29:36] You could hurt somebody else.
[00:29:37] You could die.
[00:29:37] You could kill somebody.
[00:29:39] So that complacency
[00:29:41] in Fred's mind
[00:29:43] is dangerous.
[00:29:45] I will say,
[00:29:46] and Fred does acknowledge this,
[00:29:48] that the driver monitoring system,
[00:29:51] the camera that's looking at you
[00:29:53] while you're driving
[00:29:54] does a decent job
[00:29:55] of making sure
[00:29:56] you're paying attention.
[00:29:58] I will say one of the things
[00:29:59] that does kind of
[00:30:00] irritate me about that feature
[00:30:02] or annoy me
[00:30:03] is when I'm looking off to the side.
[00:30:06] Like, I don't think that the mirrors
[00:30:07] on the Tesla
[00:30:07] are all that great.
[00:30:09] Okay.
[00:30:09] I think they're terrible
[00:30:10] to be honest with you.
[00:30:13] It feels like there's a ton
[00:30:14] of different blind spots.
[00:30:15] I do like the fact
[00:30:16] that there's a camera
[00:30:17] and I use the mirror
[00:30:18] plus the camera
[00:30:19] and I think that does
[00:30:21] a good job.
[00:30:21] But when you're in
[00:30:22] full self-driving mode,
[00:30:24] I don't want to turn
[00:30:26] on my blinker
[00:30:27] because I don't know
[00:30:28] if there's a car
[00:30:29] in that left lane
[00:30:30] or that right lane.
[00:30:31] So I'll look
[00:30:32] and I'll make sure
[00:30:33] and I make sure to look
[00:30:34] multiple times
[00:30:35] before I make a lane change
[00:30:38] because I have been
[00:30:40] surprised before,
[00:30:43] especially here in Phoenix
[00:30:44] that a car
[00:30:45] happened to be there.
[00:30:46] And so I'm a little
[00:30:47] paranoid about that.
[00:30:48] And when I'm looking
[00:30:49] to that right mirror
[00:30:50] specifically,
[00:30:51] the car will say,
[00:30:52] Hey,
[00:30:53] pay attention.
[00:30:53] And it'll give me
[00:30:54] the little alert,
[00:30:54] pay attention,
[00:30:55] pay attention.
[00:30:55] And I am paying
[00:30:56] attention.
[00:30:56] I'm just making
[00:30:57] sure before I make
[00:30:59] a request to
[00:31:01] change lanes
[00:31:03] that there's
[00:31:04] nobody there.
[00:31:05] Now I understand
[00:31:05] that the car is
[00:31:06] not supposed to
[00:31:06] go into that lane
[00:31:08] if there is somebody
[00:31:08] there,
[00:31:09] but I still like
[00:31:10] to make sure.
[00:31:12] So, yeah,
[00:31:13] that monitoring system
[00:31:16] does a pretty good job.
[00:31:20] So we've talked a little
[00:31:21] bit about full
[00:31:22] self-driving.
[00:31:23] Let's talk about
[00:31:24] some other concerns
[00:31:25] about unleashing
[00:31:26] cyber cabs on the
[00:31:27] world.
[00:31:28] Some folks are
[00:31:28] worried that
[00:31:31] releasing all
[00:31:32] these cyber cabs
[00:31:33] at scale would be
[00:31:34] very dangerous
[00:31:35] and we're at least
[00:31:36] a decade away
[00:31:37] from a self-driving
[00:31:38] taxi service.
[00:31:39] Now,
[00:31:40] that second part
[00:31:41] might be true.
[00:31:43] The first part,
[00:31:44] I'm not so sure.
[00:31:46] Waymo and GM Cruise,
[00:31:49] though GM Cruise is not
[00:31:50] a good example,
[00:31:51] are both self-driving
[00:31:53] taxi services.
[00:31:54] If Tesla had to
[00:31:57] follow in the
[00:31:57] footsteps of these
[00:31:58] two companies,
[00:31:59] then this is kind
[00:32:00] of how the process
[00:32:01] would go.
[00:32:02] First,
[00:32:03] they're going to be
[00:32:05] testing in geofenced
[00:32:06] areas with a trained
[00:32:08] driver and they're
[00:32:09] going to be doing
[00:32:10] this for a while.
[00:32:12] No passengers,
[00:32:13] trained driver,
[00:32:15] making sure that
[00:32:16] the system operates
[00:32:18] the way that they
[00:32:19] said it does.
[00:32:20] And,
[00:32:20] you know,
[00:32:21] they might even have
[00:32:22] specific hours that
[00:32:24] they're able to
[00:32:25] operate.
[00:32:25] They may not be
[00:32:26] able to operate
[00:32:26] between,
[00:32:27] let's say,
[00:32:28] five and seven or
[00:32:29] three and seven here
[00:32:30] in Phoenix because
[00:32:31] that's the busy time
[00:32:33] and they don't want,
[00:32:33] you know,
[00:32:34] self-driving
[00:32:36] Teslas,
[00:32:37] uh,
[00:32:37] clogging up the
[00:32:38] roads even though
[00:32:38] there's a driver
[00:32:39] there.
[00:32:39] The second phase
[00:32:41] would be you
[00:32:43] keep the driver,
[00:32:44] you keep the
[00:32:44] geofenced area,
[00:32:46] but you add
[00:32:46] passengers to
[00:32:47] this.
[00:32:48] So Tesla can
[00:32:49] start charging
[00:32:50] for,
[00:32:51] you know,
[00:32:52] uh,
[00:32:53] moving people from
[00:32:54] point A to point B
[00:32:55] as long as it's
[00:32:55] within this geofenced
[00:32:56] area.
[00:32:57] And then it goes
[00:32:58] to the third
[00:32:59] phase,
[00:33:00] which is you're
[00:33:01] still in the same
[00:33:02] geofenced area
[00:33:03] that you've proven
[00:33:04] that you can
[00:33:04] operate in safely.
[00:33:06] There's no driver,
[00:33:08] but there are also
[00:33:08] no passengers
[00:33:10] and you have to,
[00:33:11] you know,
[00:33:12] test out for a
[00:33:12] while.
[00:33:13] And then you'll
[00:33:14] move on to
[00:33:16] the fourth
[00:33:17] phase,
[00:33:17] which is no
[00:33:18] drivers,
[00:33:20] geofenced area.
[00:33:21] You're still in that
[00:33:21] geofenced area
[00:33:22] and now you have
[00:33:24] passengers.
[00:33:24] And then it's just
[00:33:25] going to be rinse
[00:33:26] and repeat for
[00:33:27] different areas
[00:33:28] that you want to
[00:33:29] operate in.
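The phased rollout described above can be written down as a simple checklist. This is a hypothetical sketch of the Waymo/Cruise-style progression as laid out in this episode, and the field names are my own shorthand, not any regulator's terminology:

```python
# Hypothetical checklist of the Waymo/Cruise-style rollout phases described
# above. (safety_driver, passengers) flags per phase; every phase stays
# inside the same proven geofenced area.
PHASES = [
    {"phase": 1, "safety_driver": True,  "passengers": False},  # trained driver only
    {"phase": 2, "safety_driver": True,  "passengers": True},   # paid rides begin
    {"phase": 3, "safety_driver": False, "passengers": False},  # driverless validation
    {"phase": 4, "safety_driver": False, "passengers": True},   # driverless service
]

for p in PHASES:
    print(f"Phase {p['phase']}: safety driver={p['safety_driver']}, "
          f"passengers={p['passengers']}")
# ...then the whole sequence repeats for each new geofenced area.
```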
[00:33:30] Like, I don't
[00:33:32] think, and we're
[00:33:33] going to talk
[00:33:33] about regulations
[00:33:34] here in a moment,
[00:33:34] but I don't think
[00:33:35] regulators are going
[00:33:36] to be like, oh,
[00:33:38] go nuts, drive
[00:33:39] wherever you want.
[00:33:39] You're good.
[00:33:40] I don't think that's
[00:33:41] going to happen.
[00:33:42] I think they're
[00:33:42] going to have to
[00:33:43] follow the same
[00:33:44] path that GM Cruise
[00:33:46] and Waymo
[00:33:47] are currently
[00:33:48] following.
[00:33:48] And, you know,
[00:33:49] GM Cruise had some
[00:33:51] setbacks for sure,
[00:33:53] but just because
[00:33:54] Tesla has really
[00:33:56] good marketing
[00:33:57] around their
[00:33:57] full self-driving
[00:33:58] does not mean they have
[00:34:01] the data that
[00:34:05] regulators want
[00:34:06] to see before
[00:34:07] they just
[00:34:08] completely release
[00:34:09] them.
[00:34:09] But in that same
[00:34:10] way, it also
[00:34:11] doesn't mean that
[00:34:12] these vehicles
[00:34:12] have to be
[00:34:13] perfect.
[00:34:14] They just need
[00:34:15] to demonstrate
[00:34:15] that they can
[00:34:16] operate safely
[00:34:17] on roads.
[00:34:20] That's it.
[00:34:21] Like, if they
[00:34:23] were perfect,
[00:34:25] there would be
[00:34:25] no reason to
[00:34:26] test them, but
[00:34:28] they're not
[00:34:28] perfect.
[00:34:29] And this is how
[00:34:30] these companies
[00:34:32] learn what those
[00:34:34] edge cases look
[00:34:35] like.
[00:34:36] And, you know,
[00:34:37] if they screw
[00:34:38] up, like GM
[00:34:40] did, then they
[00:34:41] should be held
[00:34:42] accountable.
[00:34:42] In a lot of
[00:34:43] ways, I am more
[00:34:45] worried about a
[00:34:46] 16-year-old
[00:34:48] driver, a brand
[00:34:49] new 16-year-old
[00:34:50] driver, driving
[00:34:51] around my
[00:34:51] neighborhood than
[00:34:52] I am about a
[00:34:53] Waymo van,
[00:34:54] which also drives
[00:34:55] around my
[00:34:55] neighborhood.
[00:34:56] Like, there was
[00:34:57] a Waymo van in
[00:34:57] front of my
[00:34:58] house just
[00:34:59] today, picking
[00:35:01] one of the neighbors up.
[00:35:02] Like, my
[00:35:03] friends, they
[00:35:05] use Waymo
[00:35:06] vans all the
[00:35:07] time.
[00:35:08] They have two
[00:35:09] cars.
[00:35:09] They could
[00:35:10] easily take a
[00:35:11] car to where
[00:35:12] they want to
[00:35:12] go, but they're
[00:35:13] just like, they
[00:35:14] think the
[00:35:15] technology is
[00:35:16] cool, and,
[00:35:17] you know, if they
[00:35:19] go out to dinner
[00:35:19] or whatever, they
[00:35:20] want to have a
[00:35:20] drink, they want
[00:35:21] to be able to
[00:35:22] safely ride
[00:35:23] home and not
[00:35:24] have to leave
[00:35:24] their car
[00:35:25] somewhere, and
[00:35:27] Waymo is a
[00:35:28] good option for
[00:35:29] them.
[00:35:29] My oldest is
[00:35:31] 28 years old.
[00:35:31] When she lived
[00:35:32] in downtown
[00:35:33] Phoenix, she
[00:35:34] used Waymo
[00:35:35] vans all the
[00:35:36] time.
[00:35:36] She has a
[00:35:37] car, and she
[00:35:38] doesn't even
[00:35:38] drink that
[00:35:39] much.
[00:35:39] She just
[00:35:40] used them
[00:35:40] because they
[00:35:41] were really
[00:35:41] convenient, and
[00:35:42] she doesn't
[00:35:42] like driving.
[00:35:45] But anyway, back
[00:35:46] to, you know,
[00:35:47] satisfying
[00:35:48] regulators, you
[00:35:49] know, they just
[00:35:50] need to prove
[00:35:51] that they can
[00:35:51] operate safely
[00:35:52] on these roads.
[00:35:54] It's a very
[00:35:55] simplified idea,
[00:35:58] but they have
[00:35:59] to prove a
[00:36:00] bunch of stuff
[00:36:01] to regulators,
[00:36:02] and we're going
[00:36:02] to kind of go
[00:36:03] through some of
[00:36:03] that right now.
[00:36:06] So right now,
[00:36:07] laws and
[00:36:07] regulations in
[00:36:08] the U.S.
[00:36:08] are not really
[00:36:10] conducive to
[00:36:12] Tesla's
[00:36:13] robo-taxi
[00:36:13] timeline, unless
[00:36:15] there's some
[00:36:15] major reforms
[00:36:16] in a very
[00:36:17] short period
[00:36:17] of time, which
[00:36:19] would take an
[00:36:20] act of Congress,
[00:36:21] and that's not
[00:36:22] a figure of speech,
[00:36:23] like, oh, that's
[00:36:24] never going to happen,
[00:36:24] that would take an act
[00:36:25] of Congress.
[00:36:26] No, it literally
[00:36:27] needs an act
[00:36:28] of Congress to
[00:36:29] change these
[00:36:31] rules and laws.
[00:36:33] To further that
[00:36:34] point, if Tesla
[00:36:35] wants to remove
[00:36:35] the steering
[00:36:36] wheel and pedals
[00:36:37] and go total
[00:36:38] no driver,
[00:36:40] then they will
[00:36:40] have to obtain
[00:36:41] approval from the
[00:36:43] National Highway
[00:36:44] Traffic Safety
[00:36:45] Administration.
[00:36:46] And the
[00:36:47] National Highway
[00:36:48] Traffic
[00:36:48] Safety Administration
[00:36:49] only allows
[00:36:51] 2,500
[00:36:52] autonomous vehicles
[00:36:54] per year on
[00:36:55] a temporary
[00:36:56] exemption.
[00:36:58] So that's
[00:37:02] not a lot.
[00:37:03] You might be
[00:37:04] saying to
[00:37:04] yourself, well,
[00:37:05] are there 2,500
[00:37:06] new autonomous
[00:37:07] vehicles per year
[00:37:08] operating in the
[00:37:10] United States?
[00:37:10] I have no
[00:37:11] idea, but
[00:37:12] right now we
[00:37:12] have three
[00:37:14] companies that
[00:37:15] want to do
[00:37:15] it, GM,
[00:37:16] Waymo, and
[00:37:17] Tesla, and
[00:37:19] Volvo's looking
[00:37:21] to enter the
[00:37:21] market as well
[00:37:22] to test their
[00:37:23] system.
[00:37:24] And Mercedes
[00:37:24] is doing some
[00:37:26] level 3
[00:37:26] testing, although
[00:37:28] I don't think
[00:37:28] that qualifies
[00:37:29] under this
[00:37:30] particular
[00:37:31] exemption.
[00:37:32] But yeah,
[00:37:35] 2,500 may
[00:37:37] have been a
[00:37:38] lot a couple
[00:37:38] of years ago,
[00:37:39] but it's
[00:37:40] quickly becoming
[00:37:41] not enough in
[00:37:43] terms of
[00:37:45] autonomous vehicle
[00:37:46] testing.
[00:37:47] If it's only
[00:37:48] 2,500 a year,
[00:37:49] that's not
[00:37:50] a lot.
[00:37:51] So we talked
[00:37:51] a little bit
[00:37:52] about federal
[00:37:52] law.
[00:37:53] Let's move on
[00:37:53] to state law
[00:37:54] and regulations.
[00:37:55] First of all,
[00:37:57] no two states
[00:37:58] are the same.
[00:37:58] You know,
[00:37:59] I think,
[00:38:00] I think Tesla
[00:38:01] said that they
[00:38:02] were going to
[00:38:02] start testing
[00:38:03] cyber cab
[00:38:04] in California
[00:38:05] and Texas.
[00:38:07] Texas has
[00:38:08] a pretty loose
[00:38:09] set of rules
[00:38:11] and laws
[00:38:11] when it comes
[00:38:12] to
[00:38:14] autonomous
[00:38:14] taxi services.
[00:38:18] And when I say
[00:38:19] loose,
[00:38:19] I'm not
[00:38:20] implying that
[00:38:21] they're unsafe,
[00:38:21] but they're
[00:38:23] not as
[00:38:23] stringent as
[00:38:24] California's,
[00:38:25] for instance.
[00:38:26] So being able
[00:38:27] to satisfy
[00:38:28] 50 different
[00:38:29] states and
[00:38:29] 50 different
[00:38:30] laws
[00:38:31] seems like
[00:38:33] a challenge,
[00:38:34] if I'm being
[00:38:34] honest with
[00:38:35] you.
[00:38:36] And there's
[00:38:37] really no uniformity
[00:38:38] in the laws
[00:38:39] between the
[00:38:40] different states
[00:38:41] and the
[00:38:42] federal government.
[00:38:42] So this
[00:38:44] is going
[00:38:45] to be
[00:38:45] challenging.
[00:38:46] And we're
[00:38:47] pretty far
[00:38:47] away from
[00:38:48] what Elon
[00:38:49] said at
[00:38:50] the We
[00:38:51] Robot event,
[00:38:52] which is,
[00:38:53] this is Elon's
[00:38:54] direct quote:
[00:38:55] we'll move
[00:38:56] from supervised
[00:38:57] full self-driving
[00:38:58] to unsupervised
[00:39:00] full self-driving
[00:39:00] where you can
[00:39:01] fall asleep
[00:39:02] and wake
[00:39:02] up at
[00:39:02] your
[00:39:02] destination.
[00:39:03] That's what
[00:39:04] he said
[00:39:04] at the
[00:39:04] event.
[00:39:05] I don't
[00:39:07] think we're
[00:39:08] there.
[00:39:08] There's going
[00:39:08] to need to
[00:39:09] be a lot
[00:39:10] more convincing
[00:39:13] data given
[00:39:14] to regulators.
[00:39:15] And by the
[00:39:16] way,
[00:39:18] testing and
[00:39:19] the data do
[00:39:19] not include
[00:39:20] Tesla fanboys
[00:39:22] who own stock
[00:39:22] in the company
[00:39:23] and have a
[00:39:24] financial interest
[00:39:26] in showing you
[00:39:27] videos on
[00:39:28] YouTube or X
[00:39:29] where full self-driving
[00:39:30] works nearly
[00:39:31] perfectly.
[00:39:33] And look,
[00:39:39] I watch those
[00:39:39] videos too,
[00:39:40] but that's not a
[00:39:42] representation of
[00:39:43] what the system
[00:39:43] can do.
[00:39:44] It is what
[00:39:45] those YouTubers
[00:39:46] and, you know,
[00:39:47] quite frankly,
[00:39:48] Tesla did this
[00:39:49] too,
[00:39:49] what they want
[00:39:50] you to see.
[00:39:51] You know,
[00:39:52] a 20-minute
[00:39:52] video showing
[00:39:53] a flawless
[00:39:54] full self-driving
[00:39:55] trip is
[00:39:58] statistically
[00:39:58] insignificant,
[00:39:59] according to
[00:40:00] Phil Koopman,
[00:40:01] a professor
[00:40:02] of electrical
[00:40:03] and computer
[00:40:04] engineering at
[00:40:04] Carnegie Mellon
[00:40:05] University.
[00:40:06] Like,
[00:40:07] they need to
[00:40:08] have good
[00:40:10] data.
[00:40:10] And I believe
[00:40:11] that they will
[00:40:12] have good
[00:40:12] data.
[00:40:13] Tesla is,
[00:40:14] you know,
[00:40:14] full of
[00:40:15] data scientists.
[00:40:16] You know,
[00:40:17] I want to
[00:40:17] stop here.
[00:40:19] I don't want
[00:40:20] to give
[00:40:25] the impression
[00:40:26] that I don't
[00:40:27] think this is
[00:40:28] possible.
[00:40:28] I just
[00:40:28] think it's
[00:40:29] hard.
[00:40:31] You know,
[00:40:33] Waymo's been
[00:40:33] operating in my
[00:40:34] area for a
[00:40:35] really long
[00:40:36] time.
[00:40:36] And they
[00:40:37] recently had
[00:40:38] an issue
[00:40:38] where the car
[00:40:39] didn't know
[00:40:40] what to do
[00:40:40] in a
[00:40:41] construction
[00:40:42] zone.
[00:40:42] And it was
[00:40:43] big news.
[00:40:44] And you know
[00:40:44] what the car
[00:40:45] did?
[00:40:45] It drove into
[00:40:46] a parking lot
[00:40:47] and just
[00:40:47] stopped because
[00:40:49] it didn't
[00:40:49] know what
[00:40:49] to do,
[00:40:50] which is
[00:40:50] what it
[00:40:50] should do,
[00:40:51] to be honest
[00:40:51] with you.
[00:40:53] It's the
[00:40:54] safest thing.
[00:40:54] If it doesn't
[00:40:55] know what to
[00:40:55] do, rather
[00:40:57] than continuing
[00:40:58] on and
[00:40:59] learning as it
[00:40:59] goes and
[00:41:00] hurting somebody,
[00:41:01] it drove into
[00:41:01] a parking lot
[00:41:02] when it was
[00:41:02] safe and it
[00:41:03] just pulled
[00:41:03] over.
[00:41:04] So I
[00:41:06] don't think
[00:41:06] that this is
[00:41:07] impossible.
[00:41:08] I just
[00:41:08] think that
[00:41:10] the way that
[00:41:11] it is
[00:41:11] presented
[00:41:12] by Elon,
[00:41:13] and I'm
[00:41:14] not even
[00:41:14] going to say
[00:41:14] by Tesla
[00:41:15] because this
[00:41:15] is all
[00:41:15] Elon,
[00:41:16] it's not
[00:41:17] as
[00:41:21] easy as
[00:41:22] he's making
[00:41:22] it sound.
[00:41:24] He's making
[00:41:24] it sound
[00:41:25] like the
[00:41:27] regulator is
[00:41:28] going to
[00:41:28] flip a switch
[00:41:28] and the
[00:41:29] car is going
[00:41:29] to be able
[00:41:29] to drive
[00:41:30] wherever they
[00:41:30] want without
[00:41:31] a driver.
[00:41:31] I don't
[00:41:32] think that's
[00:41:32] going to happen.
[00:41:33] And to
[00:41:34] further that
[00:41:35] point,
[00:41:36] things that
[00:41:36] are going
[00:41:36] to give
[00:41:37] regulators
[00:41:37] pause,
[00:41:38] there are
[00:41:39] a ton of
[00:41:40] different
[00:41:40] investigations
[00:41:41] that the
[00:41:41] National Highway
[00:41:42] Traffic
[00:41:42] Safety
[00:41:43] Administration
[00:41:43] is doing
[00:41:44] into
[00:41:45] different
[00:41:45] parts
[00:41:45] of
[00:41:46] full
[00:41:46] self-driving.
[00:41:48] Most
[00:41:49] recently,
[00:41:49] there is
[00:41:51] an
[00:41:51] investigation
[00:41:52] over a
[00:41:54] few
[00:41:54] different
[00:41:54] collisions
[00:41:55] when
[00:41:56] visibility
[00:41:56] has been
[00:41:57] reduced,
[00:41:57] and these
[00:41:58] collisions
[00:41:59] resulted in
[00:41:59] one pedestrian
[00:42:01] fatality
[00:42:01] and one
[00:42:02] reported
[00:42:03] injury.
[00:42:04] So,
[00:42:04] according to
[00:42:05] the article,
[00:42:05] reduced
[00:42:06] visibility
[00:42:06] includes
[00:42:08] sun glare,
[00:42:09] which is
[00:42:09] one of the
[00:42:10] issues that
[00:42:11] Fred Lambert
[00:42:11] had,
[00:42:12] airborne
[00:42:13] dust,
[00:42:14] and I
[00:42:14] think,
[00:42:14] I didn't
[00:42:14] write it
[00:42:15] down,
[00:42:15] but I
[00:42:15] think fog
[00:42:16] was the
[00:42:16] third one,
[00:42:17] but I
[00:42:17] would imagine
[00:42:18] rain,
[00:42:18] snow,
[00:42:19] anything that
[00:42:20] interferes
[00:42:20] with the
[00:42:21] visibility
[00:42:21] of the
[00:42:22] cameras
[00:42:22] would fall
[00:42:23] under that
[00:42:24] definition.
[00:42:25] So,
[00:42:26] I'm going
[00:42:26] to read
[00:42:26] this as
[00:42:27] it was
[00:42:27] written
[00:42:28] from the
[00:42:29] National
[00:42:30] Highway
[00:42:30] Traffic
[00:42:30] Safety
[00:42:31] Administration.
[00:42:32] Here we
[00:42:33] go.
[00:42:34] The ability
[00:42:35] of FSD's
[00:42:36] engineering
[00:42:37] controls to
[00:42:38] detect and
[00:42:38] respond to
[00:42:40] reduced roadway
[00:42:41] visibility
[00:42:41] conditions,
[00:42:42] whether any
[00:42:43] other
[00:42:44] similar FSD
[00:42:45] crashes have
[00:42:46] occurred in
[00:42:47] reduced roadway
[00:42:48] visibility
[00:42:48] conditions,
[00:42:49] and if so,
[00:42:50] the contributing
[00:42:51] circumstances for
[00:42:52] these crashes.
[00:42:53] This is what
[00:42:53] they're
[00:42:53] investigating.
[00:42:55] Any updates
[00:42:56] or modifications
[00:42:57] from Tesla
[00:42:58] to the FSD
[00:42:59] system that
[00:43:00] may affect the
[00:43:00] performance of
[00:43:01] FSD in
[00:43:03] reduced roadway
[00:43:04] visibility
[00:43:04] conditions.
[00:43:05] In particular,
[00:43:06] this review will
[00:43:07] assess the
[00:43:08] timing,
[00:43:08] purpose,
[00:43:09] and capabilities
[00:43:10] of such
[00:43:10] updates,
[00:43:11] as well as
[00:43:12] Tesla's
[00:43:13] assessment of
[00:43:14] their safety
[00:43:15] impact.
[00:43:16] The probe
[00:43:18] is actually
[00:43:19] covering 2.4
[00:43:20] million Teslas,
[00:43:22] which include
[00:43:23] the Cybertruck,
[00:43:24] Model 3,
[00:43:26] years 2017
[00:43:27] to 2024,
[00:43:29] Model Y,
[00:43:30] years 2020
[00:43:31] to 2024,
[00:43:33] Model S and
[00:43:34] X,
[00:43:35] 2016 to
[00:43:36] 2024.
[00:43:38] So,
[00:43:38] according to
[00:43:39] the National
[00:43:41] Highway Traffic
[00:43:42] Safety Administration
[00:43:43] Office of
[00:43:44] Defects
[00:43:44] Investigation,
[00:43:46] Tesla has
[00:43:46] labeled full
[00:43:48] self-driving
[00:43:48] as a partial
[00:43:49] driving automation
[00:43:51] system,
[00:43:52] or Level 2.
[00:43:54] And I'm going to
[00:43:55] read this again
[00:43:56] as it is written.
[00:43:57] The Office of
[00:43:58] Defects
[00:43:58] Investigation
[00:44:00] has identified
[00:44:01] four standing
[00:44:02] general order
[00:44:03] reports in which
[00:44:04] Tesla vehicles
[00:44:04] experienced a crash
[00:44:06] after entering
[00:44:06] an area of
[00:44:07] reduced
[00:44:09] roadway visibility
[00:44:10] conditions with
[00:44:11] FSD beta,
[00:44:12] FSD supervised,
[00:44:13] or collectively
[00:44:15] FSD engaged.
[00:44:16] In these crashes,
[00:44:18] the reduced
[00:44:18] roadway visibility
[00:44:19] arose from
[00:44:21] conditions such as
[00:44:22] sun glare,
[00:44:22] fog,
[00:44:23] or airborne dust.
[00:44:24] In one of
[00:44:25] these crashes,
[00:44:26] the Tesla
[00:44:26] vehicle fatally
[00:44:27] struck a
[00:44:28] pedestrian.
[00:44:29] One additional
[00:44:30] crash in
[00:44:32] these conditions
[00:44:33] involved a
[00:44:34] reported injury,
[00:44:35] stated the
[00:44:36] National Highway
[00:44:37] Traffic Safety
[00:44:38] Administration.
[00:44:39] So,
[00:44:40] yeah,
[00:44:41] I mean,
[00:44:43] if they have
[00:44:43] to approve
[00:44:44] you removing
[00:44:46] your pedals
[00:44:47] and steering
[00:44:48] wheel and
[00:44:48] going no
[00:44:49] driver,
[00:44:50] they have
[00:44:51] an awful
[00:44:51] lot of data
[00:44:52] showing that,
[00:44:54] you know,
[00:44:55] there could be
[00:44:56] concern.
[00:44:57] I'm not saying
[00:44:57] there is.
[00:44:58] The investigation
[00:44:59] is not to
[00:45:00] prove guilt.
[00:45:02] The investigation
[00:45:02] is to find out
[00:45:03] whether or not
[00:45:04] there is an
[00:45:06] issue and
[00:45:07] let's correct
[00:45:07] it,
[00:45:08] right?
[00:45:08] It's not a
[00:45:10] gotcha type
[00:45:11] system,
[00:45:12] as far as I
[00:45:13] can tell anyway.
[00:45:14] It's more of a,
[00:45:15] hey,
[00:45:16] if there's some
[00:45:17] area lacking
[00:45:18] in this,
[00:45:19] let's correct
[00:45:20] it and make
[00:45:21] sure it doesn't
[00:45:21] happen again.
[00:45:22] Right?
[00:45:23] I'm sure
[00:45:23] that there
[00:45:24] might be
[00:45:24] fines or
[00:45:25] penalties or
[00:45:26] whatever
[00:45:26] that come
[00:45:27] along with
[00:45:27] this,
[00:45:28] but the
[00:45:29] goal is
[00:45:30] to keep
[00:45:31] everybody on
[00:45:32] the road
[00:45:32] safe.
[00:45:33] It's not
[00:45:33] to be
[00:45:34] some
[00:45:37] stick
[00:45:38] to beat
[00:45:39] companies over
[00:45:40] the head
[00:45:40] with.
[00:45:41] It's
[00:45:41] truly there
[00:45:42] as best
[00:45:43] as I can
[00:45:44] tell anyway,
[00:45:45] as long as
[00:45:45] I've been
[00:45:46] doing this
[00:45:46] podcast,
[00:45:47] to make
[00:45:48] sure that
[00:45:48] automakers
[00:45:49] are,
[00:45:49] one,
[00:45:50] following the
[00:45:50] rules,
[00:45:51] and two, that
[00:45:51] on these
[00:45:52] weird cases
[00:45:54] that we're
[00:45:54] not super
[00:45:54] familiar with,
[00:45:55] like full
[00:45:55] self-driving
[00:45:56] or autonomous
[00:45:56] driving,
[00:45:59] if there are
[00:46:01] mistakes made
[00:46:02] or holes in
[00:46:04] the system,
[00:46:06] we correct
[00:46:07] them.
[00:46:07] And I think
[00:46:10] that's reasonable.
[00:46:11] All right.
[00:46:12] I have been
[00:46:13] talking for
[00:46:14] 46 minutes
[00:46:15] at this point.
[00:46:15] So what
[00:46:16] I'd like to
[00:46:16] do,
[00:46:16] if I could,
[00:46:17] is take the
[00:46:18] last three
[00:46:18] or four
[00:46:19] minutes here,
[00:46:19] it might not
[00:46:20] even be that
[00:46:21] long,
[00:46:22] and just tell
[00:46:22] you about
[00:46:23] my experience
[00:46:24] with full
[00:46:25] self-driving
[00:46:26] trial.
[00:46:28] Again,
[00:46:29] I didn't buy
[00:46:29] it,
[00:46:30] so this is
[00:46:30] just a
[00:46:31] 30-day
[00:46:31] trial or
[00:46:32] however long
[00:46:33] Tesla decides
[00:46:33] to let me
[00:46:34] use it.
[00:46:34] I got it
[00:46:35] last Friday,
[00:47:37] so I decided
[00:47:37] to test it
[00:47:38] out right away
[00:47:42] on the road.
[00:47:43] I engaged
[00:46:44] full self-driving
[00:46:45] and,
[00:46:46] you know,
[00:46:47] there's like,
[00:46:49] I don't know,
[00:46:50] 700-800 feet
[00:46:51] of road and
[00:46:52] then there's a
[00:46:53] right turn
[00:46:53] where I live.
[00:46:55] So the car
[00:46:56] is going.
[00:46:57] One of the
[00:46:58] things that I
[00:46:58] don't love
[00:46:59] about full
[00:47:00] self-driving
[00:47:00] is the speed
[00:47:01] limit is
[00:47:02] technically 25
[00:47:03] and it gets
[00:47:04] to 25
[00:47:05] very fast.
[00:47:06] And when
[00:47:07] you make
[00:47:07] that right
[00:47:08] turn,
[00:47:09] it wanted
[00:47:10] to keep the speed
[00:47:10] at 25.
[00:47:11] And at
[00:47:12] that particular
[00:47:13] corner,
[00:47:14] there are young
[00:47:15] kids,
[00:47:15] so I'm
[00:47:15] usually pretty
[00:47:16] slow in
[00:47:18] driving that
[00:47:19] direction because
[00:47:20] I don't want
[00:47:20] to hit anybody.
[00:47:22] But I'm
[00:47:23] letting the
[00:47:23] car do its
[00:47:23] thing.
[00:47:24] I don't see
[00:47:24] the kids
[00:47:25] anywhere.
[00:47:26] And it's
[00:47:26] a school
[00:47:26] day and
[00:47:27] I know
[00:47:27] that they
[00:47:28] go to
[00:47:28] school
[00:47:29] earlier than
[00:47:30] when I was
[00:47:30] driving,
[00:47:31] so I felt
[00:47:31] comfortable.
[00:47:32] But when I
[00:47:33] came up on
[00:47:33] that corner,
[00:47:34] there was a
[00:47:34] man and his
[00:47:35] dog standing
[00:47:35] in the middle
[00:47:36] of the road.
[00:47:36] For what
[00:47:37] reason?
[00:47:37] I don't
[00:47:38] know.
[00:47:38] There's
[00:47:38] sidewalks on
[00:47:39] both sides,
[00:47:41] but there he was.
[00:47:42] I disengaged
[00:47:42] immediately because I
[00:47:45] didn't want to hurt
[00:47:45] the man or the
[00:47:46] dog, for that
[00:47:47] matter.
[00:47:47] And I didn't
[00:47:48] trust that full
[00:47:49] self-driving was
[00:47:50] going to not
[00:47:51] hit them because
[00:47:52] it didn't seem
[00:47:53] like it was
[00:47:54] going to make
[00:47:54] any move.
[00:47:56] It wasn't braking,
[00:47:57] you know, and I
[00:47:58] wasn't particularly
[00:47:58] close to them, but
[00:47:59] I also didn't want
[00:48:01] to make them feel
[00:48:01] uncomfortable and
[00:48:02] let the car do
[00:48:03] whatever it was
[00:48:03] going to do.
[00:48:04] It just didn't
[00:48:04] seem like a good
[00:48:05] idea.
[00:48:05] So I disengaged
[00:48:07] and then I was
[00:48:09] getting on the
[00:48:09] freeway and there
[00:48:11] were construction
[00:48:12] cones.
[00:48:14] FSD navigated
[00:48:15] through these
[00:48:15] construction cones
[00:48:16] just fine, but I
[00:48:17] thought it was a
[00:48:18] little bit too
[00:48:19] fast.
[00:48:19] Like technically
[00:48:20] the speed limit
[00:48:20] was whatever it
[00:48:22] was, 45, 35
[00:48:23] through the
[00:48:23] construction zones
[00:48:24] and it was just
[00:48:25] kind of weaving
[00:48:27] through, I
[00:48:28] wouldn't say
[00:48:28] weaving, I had
[00:48:29] to get in the
[00:48:29] far left lane so
[00:48:30] I could get on
[00:48:31] the freeway.
[00:48:31] So it went
[00:48:32] through the
[00:48:33] construction cones
[00:48:34] to get me into
[00:48:35] the far left
[00:48:35] lane.
[00:48:37] It could have been
[00:48:39] more intentional
[00:48:40] and slower.
[00:48:40] It didn't need
[00:48:41] to be that
[00:48:41] fast, a little
[00:48:42] bit more
[00:48:43] cautious, you
[00:48:44] know, when
[00:48:45] cones are
[00:48:46] around.
[00:48:48] But I didn't
[00:48:49] disengage at
[00:48:49] that time.
[00:48:50] I just was a
[00:48:50] little uncomfortable
[00:48:51] with the speed
[00:48:52] and aggressiveness
[00:48:53] and I have the
[00:48:54] car on chill
[00:48:55] mode.
[00:48:56] So I was just a
[00:48:58] little bit
[00:48:59] concerned with the
[00:49:00] aggressiveness
[00:49:00] there.
[00:49:01] I wouldn't have
[00:49:01] done that if it
[00:49:02] was me.
[00:49:03] And then I
[00:49:06] needed to
[00:49:06] transition from
[00:49:07] one freeway to
[00:49:08] another, right?
[00:49:09] And there is a
[00:49:09] line of cars
[00:49:10] that goes
[00:49:11] back miles and
[00:49:13] miles and miles
[00:49:13] and I'm letting
[00:49:14] full self-driving
[00:49:15] do its thing.
[00:49:16] But there are
[00:49:17] people waiting
[00:49:18] for miles to
[00:49:19] get on this
[00:49:21] transition.
[00:49:22] And I was
[00:49:24] within half a
[00:49:25] mile and the
[00:49:26] car did not
[00:49:27] seem like it
[00:49:28] was going to
[00:49:28] try and get
[00:49:29] into this lane.
[00:49:30] And I had a
[00:49:31] meeting I had
[00:49:32] to go to.
[00:49:32] So when there
[00:49:35] was an opportunity,
[00:49:35] I disengaged
[00:49:37] and slid
[00:49:37] into a spot,
[00:49:39] probably much
[00:49:40] to the frustration
[00:49:40] of everybody
[00:49:41] behind me.
[00:49:42] But I slid
[00:49:43] into a spot
[00:49:43] and I was able
[00:49:44] to make the
[00:49:44] transition and
[00:49:45] then I turned
[00:49:46] it back on
[00:49:46] and it did
[00:49:47] fine through
[00:49:47] most of the
[00:49:50] driving to
[00:49:51] the other
[00:49:52] stuff that I
[00:49:53] needed to do
[00:49:53] that day.
[00:49:54] It still won't change lanes into or out of the HOV lane, because in Arizona, and I don't know how it is everywhere else, but in Arizona it's a solid white line. It sees that solid white line and will not make that transition, and that's fine. I'll disengage for that. But overall I thought it was pretty good.
[00:50:14] Much better than the last time I had it, and I did not experience a lot of the concerns that I brought up last time. I won't go into those now, because we are now 50 minutes into this podcast. But yeah, overall I'm pretty happy with Full Self-Driving. Would I trust it implicitly? Absolutely not. Obviously, I think I've demonstrated that. But for most of the time I was driving it, I was quite impressed with it.
[00:50:45] All right, everybody, that is it for me. I'm going to end it here. Thank you so much for listening. I hope you all have a wonderful day. Our next episode will be Tesla's 2024 Q3 earnings call, and that is actually happening tomorrow, so there will probably be an episode out immediately after this one.
[00:51:03] And by the way, I was trying to clear out my news queue, but there's too much news coming in. That's not going to happen. So we're going back to two episodes a week, maybe three, because Tesla's earnings call is kind of an easier episode for me to do. We'll see. Two for sure.
[00:51:25] All right, everybody, thank you so much for listening. If you need to email me, it's Bodie, B-O-D-I-E, at 918digital.com. You can find me on Twitter at 918digital. I hope you all have a wonderful week, and I will talk to you sometime before Friday.
