We’ve been promised a future with self-driving cars for a long time, but the technology available today is finally starting to pay off on that 100-year-old promise. There’s an incredible number of companies racing towards the autonomous vehicle future, which I’ve talked about before. Tesla’s approach to that future is based on radar and computer vision, which is what powers Autopilot, enhanced summon, and at some point, more self-driving features. But there are some other pioneers in the space making interesting progress on autonomy from a completely different angle. Waymo was kind enough to invite me out for an event to meet some of their team, learn about their self-driving technology, what motivates them, and ride in one of the Waymo One taxis operating in the Phoenix, Arizona area. My takeaway from the experience surprised me.
A Quick History
Taking a step back and looking at the history of self-driving cars is kind of amazing. This isn’t a new idea at all and has been floating around in our collective imagination for nearly 100 years.
In the 1920s, Houdina Radio Control showed off a radio-controlled car called “American Wonder” in New York City.1
In the 1930s, General Motors sponsored Norman Bel Geddes’s Futurama exhibition, which featured a radio-controlled electric car guided by electromagnetic fields.
Jumping to the 1950s, RCA Labs demonstrated a car guided by wires and circuits embedded in the roadway.
In the 1960s, the United Kingdom’s Transport and Road Research Laboratory tested a Citroën DS on a road with embedded magnetic cables.
By the 1980s, we started to see vision-guided systems from Mercedes-Benz, and the DARPA-funded Autonomous Land Vehicle (ALV) in the United States used LiDAR, computer vision, and robotic controls.
In the 1990s, those technologies continued with tests across the United States, like Carnegie Mellon University’s Navlab project, which drove cross-country 98.2% autonomously.
In 2009, companies like Google jumped into the fray with the X lab’s self-driving car project. In 2016, that project was spun off as its own company, Waymo, under Alphabet.
When it comes to self-driving, the NHTSA provides guidelines for the different levels of automation:2
- Level 0 – No Automation. This is the majority of cars on the road.
- Level 1 – Driver Assistance. The vehicle can assist with steering or braking, but not both at the same time. Think adaptive cruise control.
- Level 2 – Partial Automation. The vehicle can assist with steering and braking at the same time, but still requires the driver’s full attention. This is where most modern cars with some kind of “automation” fall today: essentially lane assist plus adaptive cruise control.
- Level 3 – Conditional Automation. A driver is still required, but they don’t have to keep their eyes on the road. The car handles almost everything.
- Level 4 – High Automation. This is where companies like Waymo currently operate. A driver is only required in certain circumstances; if the conditions are right, the car can completely drive itself.
- Level 5 – Full Automation. Exactly what you’d expect: no human driver is required at any point.
So I’ve been an avid user of Autopilot on my Tesla, which is a next-level driver-assist feature. It’s not self-driving; on the levels of automation, it sits somewhere around level 2 or 3.3 But Waymo has been operating at level 4 autonomy for some time now, running a taxi service called Waymo One in the Phoenix area. When Waymo asked if I’d be interested in coming out to meet some of their team and experience their technology firsthand, it was an instant yes. And to be clear, even though Waymo provided the trip, that in no way has colored my perception of what I saw. My opinions are my own.
A common misperception is that Waymo is operated by Google. I thought that myself for a long time, but Waymo isn’t part of Google at all anymore. Since being spun off in 2016, it’s been a completely independent company under the Alphabet umbrella. The name “Waymo” comes from its mission statement: “a new way forward in mobility.” And after my meetings and conversations with employees, that mission really seems to be ingrained in their culture. There is a genuine passion and excitement around changing transportation to make it safer and more accessible. It was apparent to me how much they believe in the mission and how moved they are to see their technology impacting people’s lives, such as the first blind person to ride on their own in a self-driving car. You can also see it in how the Waymo One app has been designed around accessibility. When the car comes to pick you up, there’s a button in the app to honk the horn, helping riders with low vision find the car.
Where Tesla relies entirely on radar and computer vision for the self-driving features on its fleet, Waymo has gone down the path of computer vision, radar, and LiDAR. Pair that with the high-resolution mapping they do for the areas in which they operate, and you have a car that can achieve level 4 autonomy today. Tesla is relying heavily on perfecting its machine learning models to achieve full level 5 autonomy at some point in the future, which means its cars are around level 2 or 3 today. Waymo’s path has pushed them to level 4 very quickly and reliably, but, just like Tesla, they’re now refining and developing their models to hit level 5 at some point in the future.
The big difference between the two is that Waymo is operating a fully functional taxi fleet today within the zones where they’ve created high-resolution maps. I’m not trying to stir up controversy, but Elon Musk has been very vocal that LiDAR is a crutch for true level 5 autonomy. So is it a negative that Waymo relies on LiDAR and high-resolution mapping? I don’t think so. Far from it. There are multiple paths to the autonomous vehicle solution, and what Waymo is doing is extremely impressive.
Before the trip, I knew that Waymo had been operating for a while in the Phoenix area, as well as branching out to test in other cities in limited zones. But the scale of the operation was much larger than I was expecting. They have hundreds of cars in their fleet running 24 hours a day, seven days a week. They’re also testing cars in other parts of the country to stress-test the technology against extreme weather: right now they’re doing rain testing in areas of Florida, and they’ve been doing winter weather testing in Michigan.
For the actual ride in the car, we went on a 20–25 minute trip around the Scottsdale area. My brother, Sean, went with me and was just as excited as I was to experience it. The route took us through a wide variety of environments, from office parks and residential streets to multi-lane roads with heavy traffic.
I have a little more experience riding in a car that’s driving itself than my brother does, but even I was a little anxious for the first couple of minutes of the ride. Pulling up to a stop sign and then pulling out into traffic had me a little on edge at first. But something really strange happened to both of us after those first couple of minutes. The car drove exactly like a person would in every situation we encountered. It was like a switch flipped, and the experience went from something novel and crazy to something pretty mundane. It was kind of crazy how normal it felt. As my brother put it, “Okay … it’s just a car that’s driving.”
As a UI/UX designer, I immediately focused on the passenger screens in the back seat. I was completely blown away by how well they were designed, and how much thought, testing, and iteration must have gone into them. During our conversations with the team, they talked about how much time they put into understanding how to make people comfortable with a self-driving car. This UI design was proof of that effort and time. It communicated exactly what the car was doing at every moment, as well as what the car was seeing.
The screen would show a ping-like effect every couple of seconds with the LiDAR dots of what the car was seeing in the surrounding environment, which included everything from parked cars, to people, to vegetation. When the car was about to speed up, the route line would get a pulse of brighter green. If a stop light was coming up, a small stop light would appear in the upper left corner of the screen, showing that the car recognized it as red, yellow, or green. Every piece of the UI was carefully constructed to show you what the car was seeing and why it was doing what it was doing. I was blown away by that. And I really hope Tesla takes a look at that UI, because it’s incredible. I’d love to see them take cues from it for the Autopilot UI.
With the high-resolution mapping they’ve done, the car also took speed bumps and dips in the road like a champ. Near the end of our route, the car had to turn left onto a very busy multi-lane road. It slowly edged its way into the center of the intersection, waited for the light to turn yellow, and waited for a break in oncoming traffic before making the turn. It took that turn exactly like I would have. The only issue I had was how aggressively it pumped the brakes as it worked its way into the intersection. Other than that, it was a flawless ride.
My one big takeaway from this weekend was that autonomous vehicles aren’t something in the distant future; they’re here today, just limited in where you can experience them. This isn’t something a year or two away from widespread use, but something we’re going to see more and more of over the coming decade. Waymo’s current fleet of modified Chrysler Pacificas is about to be joined by their next-generation car, built on the Jaguar I-PACE. They’re also testing semi-trucks decked out with their self-driving technology, doing test runs with empty trailers to fine-tune that system. It’s a good example of how this technology can be adapted to a very wide assortment of vehicles.
These types of autonomous systems have more awareness of their surroundings than we do. They can process that information much faster than we can, and they have quicker reaction times than we’re capable of. In the end, these systems will be better and safer drivers than the rest of us. There’s no doubt in my mind that this technology will eventually be ubiquitous and that this is the future of transportation. It’s something we’ve been promised for decades, but the technology had too many compromises to make it viable. Now, technologies like LiDAR, computer vision, radar, and machine learning are bringing that promise much closer to reality. It was Waymo’s thoughtful user experience design that impressed me the most, though. For autonomous cars to be accepted, it’s important to have a system that’s designed from the ground up to be useful for those with special needs; to address people’s fears and anxiety around getting into a car without a human driver; and to focus not on the technology alone, but on how this technology can and should be integrated into our lives.
This isn’t to say that I don’t have reservations about self-driving technology; I do. As excited as I am, I’m also concerned about the ramifications of drivers losing their jobs. What this will mean for our daily lives and the future of transportation is going to be profound. It has the potential to be a paradigm shift and to change a lot of things we’ve accepted as immutable. I think there are far more pros than cons to this shift, but it’s something we need to think about and address as self-driving becomes more widespread.