Description:
In this episode, I speak with Guy Mangiamele from AMCI Testing, an organization vital to independent automotive evaluations. We discuss their role in assessing autonomous driving, focusing on Tesla's Full Self-Driving (FSD) system. Guy reveals that driver interventions were required every 11 miles in New York City and every 13 miles in Southern California, highlighting the need for ongoing supervision despite advancements. We also explore societal expectations of autonomy compared to technological realities. This conversation underscores the importance of independent testing in shaping automotive safety and transparency.
Support the Show:
Other Podcasts:
Links:
*ART PROVIDED BY DALL·E
Support this show http://supporter.acast.com/kilowatt.
Hosted on Acast. See acast.com/privacy for more information.
Hosted by Simplecast, an AdsWizz company. See pcm.adswizz.com for information about our collection and use of personal data for advertising.
[00:00:00] Hello everyone and welcome to Kilowatt, a podcast about electric vehicles, renewable energy,
[00:00:05] autonomous driving and much, much more. My name is Bodhi and I am your host and on today's
[00:00:09] episode we are going to talk to Guy Mangiamele. And I'm going to be honest with you, I had to
[00:00:16] practice Guy's last name several times. Guy says it in such a beautiful accent when he says his last
[00:00:22] name. Anyway, Guy works for AMCI testing. A couple of weeks ago, actually a couple of months ago, we
[00:00:28] talked about AMCI testing doing some third party testing on Tesla's full self-driving. I love it
[00:00:38] when third party testing organizations take the time to test this kind of stuff and release the
[00:00:43] data that they've found. And honestly, I want to encourage more companies to do this, but I reached
[00:00:49] out to Guy and I said, hey, would you like to come on the show? And he agreed to come on and chat with us
[00:00:54] about it. Join me in welcoming Guy Mangiamele to the show. Hello, Bodhi. How are you? Good. Good. How are
[00:01:02] you? Excellent. Great to be here. Excellent. I am super excited to talk to you. I reached out to you on
[00:01:07] LinkedIn and you were kind enough to actually respond and we're able to make this happen. So
[00:01:14] I just wanted to kind of walk through some initial things like what is AMCI testing? Not everybody
[00:01:22] might be familiar with that organization. So what is it? So AMCI testing has been in business
[00:01:30] more than 40 years now. Basically, we serve all the automotive manufacturers, the OEMs,
[00:01:38] and have for those four decades, four plus decades. They come to us basically when they have
[00:01:45] a new product or updating a product and they want to know how their product stacks up with the competition.
[00:01:55] They want an unbiased, independent, third-party look at the work that their engineers have done
[00:02:02] and the kind of product that they have to sell. So let me just say a bit about process. So, you know,
[00:02:14] creating a vehicle is a very complicated endeavor. The engineers that are on that project are given a
[00:02:21] brief. They are typically very strongly motivated to produce the best product possible. But, you know,
[00:02:31] in the three, four, five years, sometimes gestation period that there is between the time they're given
[00:02:37] the brief and the vehicle comes to fruition, a lot of things have changed. The competition has moved on.
[00:02:45] Buying habits have changed. And so at the end of that tunnel, there is a sort of realignment
[00:02:51] that's necessary. And oftentimes that happens in the vehicle's marketing department where they say at the
[00:02:59] end of this process, okay, what do we really have to sell here? And who are we selling against?
[00:03:04] And how are we going to be compared both by the media and in the showroom? And so it's our job,
[00:03:13] my team's job, to give them an independent assessment of what that product is, that new product,
[00:03:20] how well it succeeded at the various strengths and weaknesses that they were aiming for at the
[00:03:26] beginning of the brief and where they currently sit. So we find what the most natural marketing messages
[00:03:33] are for that vehicle. We can predict kind of what the media is going to say about it so that our
[00:03:39] clients can respond accordingly. Excellent. And how did you get involved? This is a really interesting
[00:03:45] career. How did you get involved with that? Well, we have a pretty eclectic staff.
[00:03:49] Some of our staff are retired race drivers, retired engineers in various fields of engineering,
[00:03:57] everything from automotive to fluid dynamics, all kinds of different stuff. I actually was an
[00:04:04] automotive journalist. And so I came to it from that end. It looks like you're celebrating 25 years
[00:04:10] there soon, too. Yes, it's been a long haul. I never would have imagined that I would be there that
[00:04:17] long, but it was a great fit. And the work is always fascinating. A lot of people say,
[00:04:23] if you can work in automotive, you can work anywhere because the environment is just so
[00:04:28] challenging. And there's so many variables changing all the time that really makes it exciting.
[00:04:36] Yeah. No, I've been doing this podcast for eight years and it is, I will say in the little sliver
[00:04:44] that I have, that I've carved out for myself. It is, there's always something new to learn. And no
[00:04:49] matter what you think, the next time you talk about a subject, there'll be something in there that
[00:04:55] might change your mind or it might tweak the way you think about something. And it's usually, honestly,
[00:04:59] what we're going to talk about today is full self-driving.
[00:05:01] Well, and that's one of the various reasons why, you know, right now is perhaps one of the most
[00:05:09] intense moments in this industry. I mean, the speed of change, expectations, international markets,
[00:05:21] international, you know, tastes and demands, regulation. I mean, it's, everything is hitting
[00:05:26] at once. And, you know, OEMs just can't afford to make a mistake right now. And there are plenty that
[00:05:34] are. So that brings us to what we're going to talk about today. What, what, what prompted you to test
[00:05:40] full self-driving? Well, so, you know, as I mentioned at the, at the opening, the vast majority of the work
[00:05:49] that we do is for major manufacturers. And almost all of that work is covered by very strict
[00:05:55] non-disclosure agreements, NDAs, because it's information that those clients are using internally
[00:06:02] to, to create new product, to revise old product, and to make decisions, decisions about everything
[00:06:12] from, you know, where they're going to produce something to how many units they're going to
[00:06:14] produce and, and what the content is going to be. So we, as a company, AMCI testing are sometimes in a
[00:06:22] rather odd position of doing all of our work behind the scenes and not being able to talk about any of
[00:06:29] it. And so, because we can't talk about any of it or very little of it, it's very hard for us to
[00:06:36] market ourselves. And so once in a while, when something appears to be very timely, we take it upon
[00:06:43] ourselves to create a project around it that we can publicly discuss. And, and oftentimes, and this is a
[00:06:52] particular case this time, we feel we pick that project because we think it does a public service. So
[00:06:59] autonomous driving, the safety aspects behind that have huge implications for all of society. And we feel like
[00:07:09] we have, you know, a very original outlook on that original perspective based on the clients that we've
[00:07:17] got and what is going on, you know, in the marketplace. And so we decided, well, we're going to take this on
[00:07:25] ourselves. And so that's basically what we did. We decided we would take a look at FSD because,
[00:07:33] you know, it's, it's the only camera only system out there. Tesla has made a big deal about removing
[00:07:40] any other, you know, sensors from their vehicles. They're made the decision to go all camera. And you
[00:07:50] have other companies like Waymo that are, you know, currently, you know, in market who are doing,
[00:07:56] you know, extensive mapping of, of streets and have, you know, LiDAR and radar on their vehicles.
[00:08:03] And claim that it's necessary. So it seems like, it seemed very timely to do this now, especially
[00:08:11] as the new software version 12.5 was coming out that, you know, Tesla was touting as so much more
[00:08:17] capable than previous ones. And we thought, well, let's, let's get to the bottom of it, see what
[00:08:23] they're, what, how far they've come, see what's left to do and whether or not this is something that can
[00:08:30] ever be undertaken with the kind of safety that people really should expect.
[00:08:37] That's, that's great. I am, I am a big proponent and I know this, this costs money and in the grand
[00:08:43] scheme of the budget, I don't know how much it costs, but I'm a big proponent of third-party
[00:08:49] testing companies, testing this stuff outside of the marketing teams at these big companies.
[00:08:55] I'm not going to single out Tesla exclusively. They don't really have a marketing team, but I'm
[00:09:00] just saying like, I want to know where these things fall short, where they can buy somebody much
[00:09:06] smarter than me. I have a Tesla. I have the trial version of full self-driving right now. I think it's
[00:09:11] great, but I don't think it's perfect. You know, I want somebody who's smarter than me looking at this
[00:09:16] stuff and evaluating it on in a non-biased way and, and providing feedback. And hopefully that feedback
[00:09:23] goes back to those engineers. And that is that the positive outcome comes from it. So I love that you
[00:09:30] guys are doing this. And I just want to underline, because maybe I didn't hit that hard enough
[00:09:34] in what I just said, but this, this work that we're talking about here and that the recent press releases
[00:09:40] and videos have, have, have pertained to, this was not for any paying client. Sure. And we have no
[00:09:49] interest in either Tesla or Waymo. We are not, you know, stock owners in any of those companies. We have,
[00:09:57] we have no iron in the fire there. We're really a completely independent company. That's just looking
[00:10:05] to, and we're a data-driven company. So basically we, we don't express ourselves in opinions generally.
[00:10:13] And we're going to need to talk a little bit about that regarding real world testing, which is kind of
[00:10:18] what this is. But in general, the, the, the data that we supply our clients is completely numerical.
[00:10:27] And because of that it's, you know, it doesn't, it doesn't include anything that's opinion related.
[00:10:35] It's strictly factual. Now, when we talk about doing something that is real world, like this program was,
[00:10:45] there are certain variables that you cannot really get to the bottom of in a completely clean,
[00:10:54] comparative way. In other words, if you were trying to test these two systems and hold every variable
[00:11:04] constant, you could do that in a proving ground setting, but you, there wouldn't be the same variables
[00:11:14] that you find in real world traffic. Oftentimes our real world testing, and in this case as well,
[00:11:22] we're talking about, you know, testing vehicles for all kinds of things in real world scenarios in which,
[00:11:31] you know, we would put vehicles nose to tail, rotate drivers, rotate the order of the vehicles
[00:11:36] to try and tease out all of those potential problems. But even if you put multiple cars front to back
[00:11:47] in this real world scenario with autonomous driving, neither of those cars are going to,
[00:11:53] are going to encounter the same issues in the same moment, in the same lane. So the way we engineered
[00:12:03] this particular test on FSD was to look at it in terms of required driver interventions,
[00:12:11] because there isn't really any other way to think about it. You try to maximize the number of miles,
[00:12:18] the number of scenarios you put the system in through, and document every time that the driver needs to take
[00:12:28] control of the vehicle. And there's lots to discuss there, obviously. But from our standpoint,
[00:12:35] we took control of the vehicle and tagged it as a driver required intervention. Anytime public safety
[00:12:46] or the safety of other road users was at risk. So a couple of possible scenarios to just to tease that out.
[00:12:58] So if the vehicle is crossing a double yellow line, and there's oncoming traffic, we're going to take control
[00:13:05] of the vehicle and call that a driver required intervention. If the vehicle is braking hard in traffic
[00:13:13] for no reason, we're going to call that an intervention. If it's trying to turn left in front
[00:13:20] of oncoming traffic, we're not going to wait and ride it into the actual accident. There's no reason to do
[00:13:26] that. We can see what's happening. We can look at the FSD display, and we can see the path that the
[00:13:32] computer is painting, and what its intentions are. And at that point, we'll take control and call it an
[00:13:40] intervention. And so that's basically what we've done. And you can see that on average, over the more
[00:13:48] than 1,300 miles now that we've accumulated in various scenarios, both in Southern California and in New
[00:13:56] York City traffic, the average distance between interventions was 13 miles in Southern California traffic
[00:14:04] and 11 miles when we did this in New York. Did you differentiate between driver intervention,
[00:14:15] like accelerating when it was slowing down and it didn't need to, like shadows sometimes on the freeway
[00:14:21] will do this, or critical interventions where you just had to completely disengage FSD? Or was that
[00:14:27] all included in the interventions? No, the only time we would tag an intervention was when we had to
[00:14:34] intervene, when we were forced to intervene. So no, if it makes small errors or does something that
[00:14:39] the driver wouldn't necessarily do, or you would think, huh, that's a strange thing. No,
[00:14:44] that's not a reason for intervening. No, not at all. Let's say it misses an exit and goes to the next
[00:14:52] exit. That's not a reason to intervene. Well, I will. Okay. So there's a spot on the 202 when I'm
[00:14:59] driving and it goes from the 202 to the 10, right? And every single time it slows down. We're not taking
[00:15:07] the 10. We're staying in the lane. It doesn't seem to matter what lane I'm in unless I'm in the far
[00:15:11] left lane. It always slows down as if it wants to take that, even though it's not supposed to,
[00:15:16] and it doesn't show that it's going to, but always slows down five to seven miles an hour. I don't know
[00:15:21] why in that space. I don't know why it does it. I'll just put my foot on the accelerator. I'll get
[00:15:27] past that. But it's not, I didn't disengage full self-driving to do that, but I did have to intervene
[00:15:33] to get it to, you know, stay up with the speed of traffic.
[00:15:38] Right. Yeah.
[00:15:39] Yeah. We would not, we would not really consider that.
[00:15:43] Gotcha.
[00:15:45] Yeah. Okay.
[00:15:46] So again, just to, just to reiterate. So basically, our interventions
[00:15:52] typically seem to be every 13 or every 11 miles on average. We were averaging every 11 miles in New
[00:16:01] York city traffic, and we were averaging every 13 miles in Southern California traffic over the course
[00:16:07] of about 1300 miles. And further, I could say that that testing occurred over a wide range of,
[00:16:16] of traffic scenarios. We intentionally planned it that way. So we weren't just trying to accumulate
[00:16:23] miles as quickly as possible on the freeway, but you know, we would do about one quarter of the miles
[00:16:29] on the freeway, about one quarter in the city, about one quarter on mountain roads, and about one quarter
[00:16:35] of the miles on two-lane rural highways. Gotcha.
[00:16:41] So, you know, it's important because in order to really understand, and as a test driver, this is
[00:16:51] what you're trying to do, understand how the system thinks and whether or not the processes that the system
[00:16:59] is going through are uniform and predictable. You have to do a large number of miles over a wide array of
[00:17:12] scenarios. You can't just limit yourself to one because for instance, full self-driving overall, and many people
[00:17:21] probably would think it's exactly the opposite, but it tends to do quite well in the city. City traffic,
[00:17:29] where it's slower, even though there are, you know, arguably more obstacles and pedestrians and things,
[00:17:36] it's where it's somehow at its best. And, you know, this has to do with how it's been programmed,
[00:17:45] how many hours have been spent programming it in certain scenarios, what, you know, the depth of that
[00:17:50] programming. We're not programmers. We just know what its end behavior is, and that's really the only
[00:17:57] important thing, you know, to a buyer or even a regulator who is assessing, you know, the system's
[00:18:07] safety margin. So it's important to look at it over a wide swath of scenarios.
[00:18:17] And I will say, I think it does pretty good. This version does pretty good in the city driving where
[00:18:24] I live. I live in Tempe and I don't know if you've ever been here, but we have massive, we almost have
[00:18:30] thoroughfares for city streets. So it does fine. Even my left, I used to, when I would make a left-handed
[00:18:37] turn out of my neighborhood, because it is four lanes and a turning lane, it would, it would have a
[00:18:43] really hard time with that, but it's, it's doing much better. So I will say that that's, that's good
[00:18:48] for sure that the improvement is there from my point of view. Um, when Oceanside has pretty
[00:18:55] consistent weather, you might have some fog and whatever, but in the morning, but you have pretty
[00:19:00] consistent weather. Did you notice anything in New York or with the changing traffic conditions?
[00:19:05] Cause I, the East coast roads versus West coast roads can be different. I've never been to New York,
[00:19:10] New York city. So I can't really compare apples to apples there, but did you notice any difference
[00:19:15] between the two locations? The one, the one, um, out of our control, uh, weakness so far in the
[00:19:23] protocol that we have executed is that it's all been fair weather testing for whatever reason.
[00:19:31] Um, you know, starting in the summertime here in Southern California, it's been extremely dry
[00:19:36] and all of the days that we operated in New York city environs. It was also, the weather was also
[00:19:43] fair. Um, sometimes cloudy we've been through, you know, we we've tested through, you know,
[00:19:49] setting suns where the sun is, is, is pointed directly into the Tesla's cameras and, you know,
[00:19:56] and at night, you know, probably 40 to 45% of the time at night. Um, so we have, we've done that,
[00:20:03] but, uh, as for weather, rain, snow, that we have yet to broach, and we will eventually, but the seasons
[00:20:11] just have not yet allowed it. But worth saying that in spite of that fact, even though all of our
[00:20:20] testing has been fair weather, uh, the system has succumbed to a significant number of challenges
[00:20:27] in that time. Now I do want to, I'm fond of saying, you know, give credit where credit is due.
[00:20:36] Um, Tesla's engineers have done a rather amazing job with 12.5. It has to be said. I mean,
[00:20:48] they have gotten this system to a point where I think many, many pundits would have said,
[00:20:56] uh, they would never get to. So it does do things and does behave in a very humanoid way to an extent
[00:21:05] that would surprise many people. And in effect, this is sort of ironically, the problem with FSD.
[00:21:18] If you get into it as a first time driver who has never experienced something like this,
[00:21:24] and you drive the first three miles, the first five miles, maybe even the first 15 miles,
[00:21:30] and you see that the system doesn't make a single error through confirmation bias in your own mind,
[00:21:39] you begin to roll back the safeguards in your, in your head about policing this autonomous system.
[00:21:50] And your brain, we are, we are wired as humans to think this way, that the future is predicated on
[00:22:00] the past. And so when you see past performance of the system, your brain assumes that it's going to
[00:22:08] continue to perform that way in the future. And so first of all, your brain will say, well, I've seen
[00:22:14] it do, you know, 30 successful lane changes. There's no reason to think that the 31st won't be.
[00:22:21] And then it'll be, I've seen it do, you know, 50 freeway exits. No reason to think that the next
[00:22:28] one won't be handled fine. I've seen it go through, you know, 60 crosswalks with pedestrians standing there
[00:22:36] and never had it even, you know, move towards any of those people, but that doesn't mean it won't do
[00:22:43] it the next time. And the number of variables that these systems are dealing with through their
[00:22:54] interpretation of camera imagery in real time. If you imagine this, a computer looking at
[00:23:07] 30 or 60 frame per second feed, analyzing those images, how many objects there are in those images,
[00:23:16] what the implied next action of those objects are, and then translating that into movements of the
[00:23:24] throttle, steering, car placement, all of that stuff in real time. You know, it's an extraordinary ask.
[00:23:33] And that's what's amazing is how well it does much of the time. Much of the time, it does a really
[00:23:42] good job of it. And sometimes surprisingly good job of it, which is what makes its failures when it does
[00:23:50] fail, just all the more breathtaking, surprising, and potentially dangerous.
[00:23:56] Do you know if your vehicle had hardware three or hardware four?
[00:24:01] We're always testing on a hardware four vehicle. And yes, with, so our testing has spanned 12.5.1,
[00:24:09] 12.5.3, and 12.5.4 on hardware four vehicles.
[00:24:14] Gotcha. Gotcha. Yeah.
[00:24:16] And okay. So we've, we've talked about some of the things that it does really well. We've talked
[00:24:21] about some of the, the failures. My next question to you, and I'm not going to ask this, but I'm
[00:24:27] going to let the audience know what it was. Because based on this conversation, I don't think it's fair
[00:24:31] to ask, but you know, how close do you think we are to achieving full self-driving? Even if it's Tesla,
[00:24:35] it could be GM, Ford, Waymo, you can, you don't have to answer that if you, if you don't want to.
[00:24:44] Well, I think here's, here's the way I talk about it. I say, I think there are parallels between what
[00:24:53] we're trying to accomplish with, with autonomous driving and what has been accomplished in aviation
[00:25:00] since the 1970s. If you think about autopilot systems, I mean, real autopilot is not what Tesla
[00:25:08] called autopilot, right? But they were, they were modeling it verbally on aviation related autopilots.
[00:25:15] In the mid-1970s, um, autoland was pioneered for large jets, which was a system that allowed
[00:25:25] the aircraft to be landed in zero, zero visibility. The pilot can see nothing out of the windows of the
[00:25:32] airplane through touchdown and rollout on a runway. If certain circumstances are met, you know,
[00:25:39] certain, certain certified runways, certain certified, uh, aircraft, but those systems were
[00:25:45] certified to a per occurrence failure rate of, you know, one in 150,000. So essentially they never
[00:25:56] fail and being in the air, they have many fewer variables to deal with than an autonomous driving
[00:26:07] system. There are, everything is basically known. The system knows the aircraft speed. It knows the
[00:26:15] air speeds around the aircraft. It knows the, you know, the, the slope of the glide slope on the
[00:26:21] approach to that runway. It knows the length of the runway. It knows how far down the runway it can
[00:26:26] land in order to have enough room to brake. It knows the wind speed. It knows wind direction,
[00:26:32] knows all those. And it seems like a lot, but it's nothing compared to the number of variables
[00:26:37] that a car is dealing with, with traffic and people and bicycles and motorcycles moving around it in
[00:26:45] unpredictable ways. In, in the air, all of those things are roughly predictable and, you know,
[00:26:52] the computing demands are high, but they're not insurmountable. The question is whether or not,
[00:26:59] you know, we can at a reliable cost get to that sort of acceptable, societally acceptable failure rate,
[00:27:12] uh, in autonomous driving. Yeah. That, and that is a question that I think has yet to be answered.
[00:27:19] Certainly the Tesla system is nowhere near there. I would say that if you were, if you're trying to
[00:27:26] come up with a, uh, you know, acute way of encapsulating it, it's quite impressive. Uh,
[00:27:34] but my daughter who's 17 and who I'm teaching how to drive, she can be quite impressive on some days as
[00:27:41] well, but I wouldn't get in the back seat. Right. I wouldn't, I wouldn't, I wouldn't trust her to drive
[00:27:48] me 40 miles. And because I know that in the next three or four miles, some mistake that I need to
[00:27:56] correct is going to happen. Right. And she's been driving for probably 10, 15, 20 hours now,
[00:28:03] and she's pretty good. I'd put her up against anybody who's been driving that number of hours,
[00:28:08] but I wouldn't get in the back seat. And that's kind of where this system is. You just, you know,
[00:28:14] you never know when it's going to make some kind of error that you'd say, how is it that it's,
[00:28:21] it can do all these other things so well, and yet it makes such a basic error as this.
[00:28:29] And I think like along the same lines of what you're saying, I think the, the, and I won't use any
[00:28:37] specific company or CEOs of specific companies for this, but I think the, the, um, what autonomous
[00:28:47] driving could be has been super inflated to the point where, uh, people have just been inundated
[00:28:55] with like all of this information and they're like, well, we can do this. And it's pretty good.
[00:28:59] Most of the time. The problem with that is, uh, it's not right in this, in the scenarios
[00:29:07] you're talking about where you had the critical fails, where the driver had to intervene.
[00:29:12] So you avoided an accident. I still have friends who I make fun of my friends on the show and say
[00:29:19] they're, they're idiots. They're dumb. They're troglodytes, but in reality, they are very smart.
[00:29:23] I still have friends who think that my car can drive itself to work in the morning. Um, and they're,
[00:29:29] they're not like, they're not dumb. They're, they are educated people. They just like, they see
[00:29:34] something on the news or in social media and that's what gets put in their head. I think if
[00:29:40] autonomous driving would have been from the onset, put in a more practical, um, this is what it can do.
[00:29:47] This is what it can't do type of, and granted that's not going to sell cars, but if it was in a
[00:29:52] more practical, like look at all these things from where we were in 2016 to where we are now
[00:29:59] in autonomous driving is nothing short of a miracle. Like this is so impressive. I'll get in a Waymo
[00:30:06] and it'll take me to a Trader Joe's and it still won't make, it'll go around a neighborhood. So it
[00:30:12] doesn't have to make a left turn. Now I haven't tested that in a couple of months, but in the last
[00:30:16] time I got in a Waymo, uh, the, the, every single time I get in the Waymo to go to a Trader Joe's,
[00:30:21] it won't make a left turn. It'll go around neighborhoods until it can make a right turn
[00:30:26] into the parking lot. I don't understand why, but it's still pretty darn amazing that it would
[00:30:31] even do that. It picks me up in my house. Granted, it picks me up in my neighbor's house,
[00:30:35] but it's, it's pretty amazing that these things are driving around my neighborhood
[00:30:39] on a, on a regular basis, you know, and in the early days people would, you know, try to run out in
[00:30:44] front of it and some people shot at them and all this stuff that doesn't really happen anymore.
[00:30:48] They're, they're kind of just blended into the scenery at this point. We have so many.
[00:30:53] Um, but I think from the onset, if we would have been, uh, if the information provided would have
[00:30:59] been a little more clear of what it could actually do, not what it will be able to do in a year's time,
[00:31:06] two years time, even though that wasn't true either. Uh, I think we would be in a better spot as far
[00:31:11] as people getting in the car and being fooled, if fooled is the right word, but kind of, um,
[00:31:17] you know, lulled into this, this feeling that the car is going to just take care of it.
[00:31:23] It's not. Yeah. Complacency. Um, I, I do think, cause again, what we have in the cars right now is
[00:31:31] really amazing. And it, we went from nothing a few years ago or relatively close to nothing a few years
[00:31:39] ago compared to what it can do now. And, uh, the, the, the negative press that it gets when,
[00:31:45] when something bad happens, um, is warranted in one way, but you know, it's also like,
[00:31:52] we have this amazing technology where we're, we should work harder to spread. And this is,
[00:32:00] I think what you're doing is like realistic information of what these systems can do.
[00:32:05] So I I'm very impressed. I, when I saw your, your first YouTube video, I was like,
[00:32:11] got to have them on. Oh, thank you. Well, you know, the, here's the other,
[00:32:18] I think overview point though, on the system on, on Tesla system and where it currently is.
[00:32:25] And we're talking about full self-driving supervised. They have called it supervised because
[00:32:31] they've needed to do that. Um, it's really at this point, a bit of a toy. In other words,
[00:32:43] it's not, it, you need to supervise it in such a way and to, uh, such an extent that it does not,
[00:32:57] at this point alleviate the stress of driving. In fact, it adds to it, it complicates it.
[00:33:06] When our team gets out of a, an FSD test after eight hours, you're wiped out. It's not like you
[00:33:14] have driven, you know, eight hours from LA to San Francisco or something, you know, uh, it's,
[00:33:22] it's a, it's totally different because you, you need to approach it with the constant
[00:33:31] understanding that it is about to make a mistake that you need to anticipate. And, you know,
[00:33:39] a licensed driver, someone who's been driving for more than two years, you drive with the driving
[00:33:47] tasks running in the background of your mind. You're not really thinking about what you're doing
[00:33:52] most of the time. You, you can listen to the radio, you can, you know, dictate text messages,
[00:33:58] you can do other things and still competently drive. For most people, it doesn't require your full
[00:34:05] and complete attention. It runs in the background in your brain. When you have full self-driving
[00:34:11] engaged and you're allowing it to run in the background in your brain, you're on the way to an accident.
[00:34:21] In fact, the system will sometimes make mistakes so quickly and so errantly that if you don't have
[00:34:30] your hands cradling the wheel, you won't be able to respond in time. You see videos of people driving
[00:34:37] with their hands clasped to their chest or hands in their lap, and
[00:34:43] that is a very unwise thing to do, because within probably not too many hours behind the wheel
[00:34:54] you will come upon an instance where you needed to have your hands closer. It was that timely an event.
[00:35:02] And it requires that sort of riveted attention to the job of driving, which most people are not
[00:35:10] used to giving the task. So it's a bit of a game. It's like a video game.
[00:35:22] It's kind of fun to see what it can do, and isn't that amazing? But you have to be
[00:35:30] ready to catch it. And always recall that the responsibility for the final behavior of that car,
[00:35:38] and anything it does to hurt anyone else out there, is the driver's and the driver's alone.
[00:35:44] I agree a hundred percent. So having said that, how do you think we should be looking at this?
[00:35:51] Because these things are only going to progress, right? More and more cars are going to get this
[00:35:55] type of technology. So how do we teach driving to newer folks? And then how do we
[00:36:04] give older folks an updated version of this, so that it isn't running
[00:36:10] in the back of their head? It's like you're almost running on autopilot:
[00:36:15] you're running all this stuff subconsciously, and then it goes even deeper
[00:36:20] when you're using full self-driving. How do we bring that forward?
[00:36:24] It's a challenge, Bodhi. I think it's a real challenge. And I think it's further complicated by
[00:36:32] a habit that goes across every age group, but
[00:36:37] younger drivers in particular: the distraction that portable devices pose,
[00:36:44] the amount of time that we're addicted to spending on our phones. And
[00:36:52] full self-driving, or other similar systems, will be seen almost invariably as a gateway to more time,
[00:37:04] more attention on those devices. And you'll want to believe
[00:37:11] that this system will allow you to turn your attention to that thing you would really rather
[00:37:17] be doing, being on some application or texting your friends or on Instagram or whatever
[00:37:24] it is. And it can't. It's not there yet. Now, the Mercedes-Benz system, which is
[00:37:32] operating in Europe, right? It has lights flashing on it when it's engaged, so the police can
[00:37:40] know that the system is engaged. I believe it's speed limited to 90 kilometers an hour,
[00:37:48] or maybe 85 kilometers an hour, whatever the equivalent of 35 miles an hour is, I think.
[00:37:54] I think it's more than that. I think it's more than that. But
[00:38:00] you can completely take your attention away from the car, and Mercedes-Benz is liable for what
[00:38:11] the car does, not the driver. But that car does not operate on any street like FSD does.
[00:38:22] It operates only on certain mapped routes, specific streets, and with specific
[00:38:30] weather limitations and traffic limitations.
[00:38:38] So it's a totally different take, much more limited than what we have now.
[00:38:46] And would you be willing, as a German driver, to travel at only 45 to 50 miles
[00:38:53] per hour in Germany, with the way Germans drive, just to have the luxury of only
[00:39:01] paying attention to your device and not paying attention to the road?
[00:39:04] How useful is that? I don't know.
[00:39:09] Yeah, that's a good point. I'm a firefighter. I drive a fire truck for a living,
[00:39:15] and we are allowed to go 10 miles over the speed limit when conditions are perfect
[00:39:24] and traffic allows, right? We can't just blaze through at, say, 55 miles an hour when the speed limit's 45
[00:39:32] and it's raining outside. We can only do it when it's safe to
[00:39:38] do so. Right. And I look at Level 3, or what Mercedes-Benz has going, because they're the
[00:39:44] only ones currently operating at Level 3, I believe. I look at that in a very similar
[00:39:49] way: in perfect conditions, they can operate Level 3 and the driver can watch Netflix or
[00:39:53] whatever. I did look it up. It is 95 kilometers per hour, which is 59 miles per hour, but that's a
[00:40:00] change. It used to be 60 kilometers per hour, which was 37. So it looks like we got a nice
[00:40:07] bump there. And honestly, 37 miles per hour is good. It's maybe not the most useful, but
[00:40:14] it's useful in the city, maybe not on the freeway or whatever. Right. Or the Autobahn.
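The speed figures quoted above can be sanity-checked with a quick conversion sketch (1 km ≈ 0.621371 miles; the function name here is just an illustration):

```python
# Convert the Mercedes-Benz Level 3 speed limits discussed above
# from km/h to mph (1 km is approximately 0.621371 miles).
KM_TO_MILES = 0.621371

def kmh_to_mph(kmh: float) -> float:
    """Convert kilometers per hour to miles per hour."""
    return kmh * KM_TO_MILES

print(round(kmh_to_mph(95)))  # current limit: 59 mph
print(round(kmh_to_mph(60)))  # original limit: 37 mph
```

This confirms the numbers quoted in the conversation: 95 km/h rounds to 59 mph, and the original 60 km/h limit rounds to 37 mph.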
[00:40:22] Excellent. And I can tell you some of the kinds of situations in which it's impressive
[00:40:29] and some in which it has failed on us. Full self-driving? Yeah. Yes. Okay.
[00:40:36] I mean, if you want to go there. Yeah. Yeah. So, for instance, when I talked
[00:40:42] before about it being particularly strong in the city: it is particularly courteous to
[00:40:50] pedestrians. It seems to even respond to where pedestrians are looking. If you are
[00:41:02] approaching a crosswalk with FSD engaged, and it's a crosswalk that has a crossing light,
[00:41:09] and there are, actually, this is, I think, in one of the videos that we have up there,
[00:41:14] and you see that there is somebody who's standing at the edge of the crosswalk and the light for the
[00:41:21] car is green. If that person standing at the edge of the crosswalk is facing into the crosswalk as if
[00:41:30] they're about to cross, the car will brake to a stop. If the person turns 180 degrees and looks
[00:41:38] away from the crosswalk, the car will continue. Interesting.
[00:41:43] Now that's astonishing. Astonishing. The granularity of assessment that that system
[00:41:57] is going through to make those decisions is very impressive. And so you think about something like
[00:42:04] that, and then you imagine: how could it have another sort of failure?
[00:42:11] Here's another one. It has only done this once in all of the miles that we've accumulated,
[00:42:19] but, like many current drivers, who seem to, when they come to a stop in traffic,
[00:42:27] leave a lot of space ahead of their cars, sometimes a car length...
[00:42:32] this is a new thing that's taking hold. I'm not really sure why it's happening,
[00:42:37] but FSD tends to emulate that. It leaves rather a lot of space in front of the car when it stops,
[00:42:45] in traffic, like at a red light. And on this one occasion, all of a sudden, it started to kind
[00:42:53] of edge up on the car ahead of it. And we thought, huh, look at that. Maybe an update overnight:
[00:42:59] it seemed to have figured out that maybe that's too much space to leave.
[00:43:04] And all of a sudden it completely let off the brakes and started to accelerate into the car that was
[00:43:12] ahead of us. And the ADAS warning went off, which says "brake,"
[00:43:19] right in the dash, bright. And we realized from that moment, because the car did not
[00:43:27] brake, that those two systems are not coupled together, which is fascinating: that full self-driving
[00:43:36] would not be coupled to the regular ADAS warnings that the car's other sensor suite is looking at.
[00:43:45] It only did that once. We had one time when, on a two-lane road at twilight, we were passed on
[00:43:56] the right by two motorcycles that were riding next to each other in
[00:44:05] the right-hand lane, and they came into our lane immediately ahead of our car at about 50 miles
[00:44:12] an hour. The right-hand lane ends, and then there's a gas station driveway coming up.
[00:44:21] They are slowing down to turn into the gas station driveway, and our Tesla is not. And we let it go
[00:44:30] until we were like 20 feet from the back of these motorcycles, at a closing speed of 20 miles an hour,
[00:44:36] and had to intervene. The motorcycles' taillights were on, their brake lights were
[00:44:41] illuminated because they were braking to make that turn, and the car was not slowing. Now,
[00:44:46] there are plenty of times when we've seen it respond correctly to motorcycles, but that time
[00:44:53] it didn't. And there were two. Was it because there were two separate motorcycles and their lights
[00:44:59] were too far apart to be the taillights of a car? Hard to know. But again, these are the sort of scenarios
[00:45:06] that can happen. And someone who was not paying attention could have easily put
[00:45:13] those motorcycles onto the front of the car. No question. And 100% of the driver's attention
[00:45:21] must be devoted to supervising the system, just as the name suggests in its current state. You just can't
[00:45:30] take your mind off it. It's fun to watch it, and it's amazing to see what it can do through image-based
[00:45:40] assessment, but it's still not reliable for use. So I'm going to guess, because, I mean,
[00:45:48] Tesla says that they're going to have FSD 13 out this month, maybe they'll have it out by January,
[00:45:55] but I'm going to guess that you'll continue testing as these releases are coming out.
[00:46:01] Yes, we will. Yeah. As our schedule allows, absolutely.
[00:46:04] Yeah. And then, you know, I don't know if this is something you can answer, because you're working with
[00:46:13] OEMs, just OEMs in general, I guess, because it's not third-party.
[00:46:19] But if you're working with these folks, you may or may not be able to say: are you working on any of
[00:46:24] these other autonomous systems, doing similar testing to what you're doing with Tesla, like Ford's
[00:46:30] BlueCruise or GM's Cruise?
[00:46:32] Yeah, I can't say that. Unfortunately, that would all be covered by NDA. But we have experience in
[00:46:38] basically all of the systems that are out there. We're constantly trying to increase the depth of
[00:46:44] that experience. And, you know, also because they're constantly being updated and their behavior is
[00:46:50] changing all the time. So it's a moving target. These things are all moving targets. And, you know,
[00:46:56] to stay up to date and up to speed on them is a real job. But it's an important one. It's an
[00:47:04] important one.
[00:47:05] Yeah, I agree. I agree. All right. So thank you very much, Guy, for coming on and being so generous
[00:47:13] with your time.
[00:47:15] Where would people find information about AMCI testing? Maybe if they wanted to, you know,
[00:47:23] subscribe on the socials or YouTube or anything you want to promote as well, throw it in there.
[00:47:29] So if you go to amcitesting.com, you'll see a bit about our company, but you'll also see
[00:47:37] links to all of our press releases, both on our full self-driving testing and on something that we call
[00:47:47] MP6, miles per six minutes of charge. That's another project we've undertaken, in which we try
[00:47:54] to explain to the public a new way of thinking about charging speed versus the real importance of
[00:48:00] range in electric vehicles, and that charging speed may actually be, for most people in most scenarios,
[00:48:08] the most important factor to consider. And so we have tested a wide range of vehicles there
[00:48:14] to say: how many miles of range can you add in the time of a normal fuel stop?
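As a rough sketch of how a "miles per six minutes of charge" style figure could be computed, here is an illustrative calculation. The formula and sample numbers are assumptions for demonstration only, not AMCI Testing's published methodology or data:

```python
# Illustrative sketch of a "miles added per six minutes of charging" metric.
# The figures below are hypothetical examples, not AMCI Testing's data.

def miles_per_six_minutes(avg_charge_power_kw: float, miles_per_kwh: float) -> float:
    """Estimate miles of range added during a six-minute charging stop.

    avg_charge_power_kw: average charging power sustained over the stop (kW)
    miles_per_kwh: the vehicle's real-world efficiency (miles per kWh)
    """
    hours = 6 / 60  # six minutes as a fraction of an hour
    energy_added_kwh = avg_charge_power_kw * hours
    return energy_added_kwh * miles_per_kwh

# Hypothetical example: a car averaging 150 kW while getting 3.5 mi/kWh
# adds 150 * 0.1 = 15 kWh, i.e. 15 * 3.5 = 52.5 miles in six minutes.
print(miles_per_six_minutes(150, 3.5))  # 52.5
```

The point of framing it this way, as described above, is that sustained charging power over a short, fuel-stop-length window can matter more to most drivers than total battery range.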
[00:48:21] And you can also see, you get links to our YouTube page where we have six FSD videos currently up,
[00:48:30] as well as topics about other vehicles that we've tested, performance vehicles that we've tested,
[00:48:37] tire testing, all kinds of really interesting, I think, topics to explore. So you can kind of get an
[00:48:46] idea and confirm, I believe, that we really are completely unbiased in our approach to things.
[00:48:54] If we weren't unbiased, our work would be of no real value to our clients.
[00:49:03] There'd be no reason for any OEM to engage us to just pat them on the back and tell them what a great
[00:49:10] job they'd done. You know, they can do that themselves. They need a critical eye. They need
[00:49:16] someone to be the other side of the coin, the other side of the argument for them,
[00:49:22] and to be able to tell them where their product really stands. And that's what we do.
[00:49:27] And that's the only reason why we're still in business after more than four decades.
[00:49:32] Excellent. Well, Guy, thank you so much for coming on and sharing this with us.
[00:49:36] No, you bet. Thanks again. I hope to do it again soon.
[00:49:40] I would like to thank Guy for being so generous with his time and coming on the show and sharing
[00:49:45] everything that he learned with us. I thought it was a great interview and I learned a ton. I hope
[00:49:50] you learned a ton too. I'll put links to AMCI testing in the show notes. If you go to their website,
[00:49:56] they have a page with all the YouTube videos up there. So you can just go and peruse them
[00:50:01] at your leisure. They're not very long. They're, you know, a couple minutes long each.
[00:50:05] It just gives you some extra context on what Guy was talking about. So check the show notes if you
[00:50:12] are interested in that. And if you want to email me, you can do so. It's bodie, B-O-D-I-E at 918digital.com.
[00:50:19] You can also find me on X at 918digital. And I've got some other things going on, you know,
[00:50:26] trying to find different ways for people to interact and support the show. And
[00:50:31] I haven't quite figured it out yet. As soon as I do, I'll let you know. But just so you know,
[00:50:36] I'm working on it. But yeah, yeah, this Thursday, by the way, we are going to have our annual
[00:50:41] Thanksgiving episode, which is basically a trivia show, sort of like Jeopardy. And I had on Alison
[00:50:51] Sheridan, my kid, Sierra, and Steve Sheridan, Alison's husband. And we had a great time. Now,
[00:50:59] I will tell you about a funny little technical problem that I had. When I do an
[00:51:07] interview with Kilowatt, I don't record the video, because this isn't a video podcast. Although
[00:51:11] that's probably going to change a little bit with the interviews. I'll probably start releasing those
[00:51:16] as videos. But I have the settings in the program that I use set not to record video. I thought I had
[00:51:24] changed that. I did not. So I have video of the game board, which is great. But I do not have video
[00:51:34] of Alison, Sierra, or Steve, because I foolishly did not turn on the
[00:51:44] video. So if you want, you can go to YouTube and watch that episode, which will come out
[00:51:51] on Thanksgiving Day. You can watch it on YouTube if you want, you'll see the game board there.
[00:51:56] Unfortunately, there won't be any video of the contestants, which is part of the fun because we
[00:52:01] kind of tease each other. I always tell everybody that I want it to be a little bit of a roast. I
[00:52:06] want everybody to tease each other and have a good time in a fun way, not in a cruel way. Unfortunately,
[00:52:13] you won't be able to see that because I didn't record the video. So yeah, but the game board will be
[00:52:19] on there. You'll see all the questions and you can play along if you would like to or you can always
[00:52:23] listen to the audio version as well. All right, everybody. Thank you so much for listening
[00:52:28] to this episode of Kilowatt. I hope you all have a wonderful week, and I will talk to you on Thursday.
[00:52:34] See you next time.
