Tesla Doubles Down on the Deep Learning Behind Autopilot
Tesla's custom silicon and the latest updates to the deep neural networks behind Autopilot have impressed the experts.
I despise driving. Dual carriageways and traffic jams. Manoeuvres and motorways. Reversing and roundabouts (I’m European — North Americans call them “traffic circles,” although I don’t believe I ever saw one in the four years I lived in Canada). And let’s not omit emissions. Ugh.
Unfortunately, it is a necessary part of 21st-century life, so when I bought a new car recently, I chose one which would do some of the driving work for me, and which didn’t create any emissions. I am now a fully paid-up EV owner, and apart from some ongoing range anxiety, I have embraced my Nissan LEAF, which has taken some of the drudgery out of driving (particularly parallel parking).
Here in the UK, the LEAF is the most popular plug-in EV. Although it doesn’t offer the same level of autonomy as Tesla Autopilot, I would have needed to be much more patient to join the waiting list for a Tesla, not to mention wealthier.
While Tesla’s self-driving capabilities are currently out of my reach, it is still interesting to keep track of progress on the Autopilot system in the hope that, one day, more affordable manufacturers will offer something similar. (Of course, Tesla is just one of many companies working on autonomy, but it is the one whose system is most widely available to the general driving population. Waymo — Google’s self-driving car project — and others have fleets under test, but these are not yet available to the public.)
Indeed, you could say that Tesla has the largest public deployment of robots worldwide, with over 250,000 on the road, regularly in autonomous driving mode. If you’ve not already seen it, here’s a recent video of the Tesla Autopilot system in action.
The Tesla Autopilot system feeds camera data into a deep neural net that provides powerful computer vision capabilities. We don’t know all the details, but a recent report suggests that the latest V9 update to the Autopilot software introduces some impressive new technology. The update improves object tracking by drawing on all eight cameras around the car rather than just the front-facing ones, so the system can follow vehicles on every side and distinguish between motorbikes, cars, and trucks.
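To make the multi-camera idea a little more concrete, here is a minimal PyTorch sketch of one common way eight camera feeds can share a convolutional backbone before a single head classifies vehicle type. Tesla has not published its architecture, so the model name, layer sizes, and class list below are illustrative assumptions, not a description of the real V9 network.

```python
# Illustrative only: Tesla's real V9 architecture is not public.
# A shared backbone processes each of the eight camera frames, the per-camera
# features are concatenated, and a small head scores the vehicle type.
import torch
import torch.nn as nn

NUM_CAMERAS = 8                                   # a Tesla carries eight cameras
VEHICLE_CLASSES = ["motorbike", "car", "truck"]   # classes mentioned in the article


class MultiCameraVehicleNet(nn.Module):
    def __init__(self, feature_dim: int = 256):
        super().__init__()
        # One backbone shared across all cameras (weight sharing keeps the model small).
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(64, feature_dim),
        )
        # Fuse the eight per-camera feature vectors, then score each vehicle class.
        self.head = nn.Sequential(
            nn.Linear(NUM_CAMERAS * feature_dim, 128), nn.ReLU(),
            nn.Linear(128, len(VEHICLE_CLASSES)),
        )

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (batch, camera, channels, height, width)
        batch, cams, c, h, w = frames.shape
        feats = self.backbone(frames.view(batch * cams, c, h, w))
        feats = feats.view(batch, cams * feats.shape[-1])
        return self.head(feats)


if __name__ == "__main__":
    model = MultiCameraVehicleNet()
    dummy = torch.randn(1, NUM_CAMERAS, 3, 128, 128)  # one 128x128 frame per camera
    print(model(dummy).shape)  # torch.Size([1, 3]): one score per vehicle class
```

The real system does far more than classify, of course; the point is only that a single network can consume all eight feeds at once rather than treating each camera in isolation.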
A deep learning expert with access to the V9 software has analyzed the code and describes the implications of the update as a “mind-boggling expansion of raw capacity.” He estimates that the V9 camera network is ten times larger and requires 200 times more computation than earlier incarnations, making it “likely that V9 is straining the compute capability,” and he goes on to estimate that it probably takes at least thousands, and perhaps millions, of times more data to fully train.
“This network is far larger than any vision NN I’ve seen publicly disclosed and I’m just reeling at the thought of how much data it must take to train it. I sat on this estimate for a long time because I thought that I must have made a mistake. But going over it again and again I find that it’s not my calculations that were off, it’s my expectations that were off...Is Tesla using semi-supervised training for V9? They've gotta be using more than just labeled data - there aren't enough humans to label this much data. I think all those simulation designers they hired must have built a machine that generates labeled data for them, but even so.”
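To get a feel for how a network “only” ten times larger can demand hundreds of times more computation, some back-of-envelope arithmetic helps. The numbers below are made up for illustration and are not taken from the analysis: convolutional cost grows roughly with the number of camera feeds processed, the pixels per frame, and (roughly quadratically) the channel width of the layers, so several modest increases compound quickly.

```python
# Purely illustrative arithmetic with made-up numbers, not figures from the analysis.
# Convolutional compute scales roughly with cameras x pixels x (channel width)^2,
# so a few modest increases multiply into a very large overall factor.
cameras_old, cameras_new = 1, 8                   # front camera only vs. all eight
pixels_old, pixels_new = 640 * 480, 1280 * 960    # hypothetical frame resolutions
width_old, width_new = 1.0, 3.0                   # hypothetical relative layer width

compute_ratio = (
    (cameras_new / cameras_old)
    * (pixels_new / pixels_old)
    * (width_new / width_old) ** 2                # conv cost ~ quadratic in width
)
print(f"Relative compute: ~{compute_ratio:.0f}x")  # ~288x with these made-up numbers
```

Training data requirements scale with model capacity far less predictably, which is presumably why the data estimate balloons even faster than the compute estimate.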
Tesla has designed its own silicon to ensure that its cars have processors that can handle the demands of the deep neural networks it is developing. Elon Musk said recently that the new silicon is an order of magnitude faster than the NVIDIA chipset the company was using previously. Tesla has gone down to the “bare metal,” designing its compute and memory circuits from scratch instead of stacking up GPUs and optimizing them for high-speed data transfer.
If you want to find out more about what is happening “under the hood,” other Tesla enthusiasts have analyzed Autopilot technology and reported on it extensively. There is also a useful 30-minute video from Tesla’s Director of AI, Andrej Karpathy, about the evolution of self-driving vehicles and computer vision (skip to the 15-minute point for the start of the description of Tesla’s approach to computer vision for its Autopilot solution).
Are you a Tesla driver? Any comments on the recent upgrade? Perhaps, like me, you’re observing from the sidelines. Let me know your thoughts in the comments!
Opinions expressed by DZone contributors are their own.