Self-driving cars are (will be) the pinnacle of consumer product automation – robot vacuum cleaners, smart fridges, and TVs are just toys compared to self-driving cars. That's true both in terms of technology and in terms of impact. We aren't yet at Level 5 autonomy, but it's just around the corner.
But as software engineers, we know how fragile software is. And self-driving cars are basically software, so we can see all the risks involved in putting our lives in the hands of developers who are anonymous (from our point of view) and of processes and quality standards that are unknown (to us). One may argue that this has been the case for every consumer product ever, but with software it's different – software is way more complex than anything else.
So I have an outrageous proposal — self-driving cars should be open source. We have to be able to verify and trust the code that’s navigating our helpless bodies around the highways. Not only that, but we have to be able to verify that it is indeed that code that is currently running in our car, and not something else.
In fact, let me extend that — all cars should be open source. Before you say “but that will ruin the competitive advantage of manufacturers and will be deadly for business”, I don’t actually care how they trained their neural networks, or what their datasets are. That’s the secret sauce of the self-driving car and, in my view, it can remain proprietary and closed. What I’d like to see open-sourced is everything else. (Under what license? I’d be fine even with a restrictive, source-available license that isn’t “real” open source, but that’s a separate discussion.)
CAN is a low-level protocol and does not support any security features intrinsically. There is also no encryption in standard CAN implementations, which leaves these networks open to man-in-the-middle packet interception. In most implementations, applications are expected to deploy their own security mechanisms; e.g., to authenticate incoming commands or the presence of certain devices on the network. Failure to implement adequate security measures may result in various sorts of attacks if an attacker manages to insert messages on the bus. While passwords exist for some safety-critical functions, such as modifying firmware, programming keys, or controlling antilock brake actuators, these systems are not implemented universally and have a limited number of seed/key pairs.
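To make the "no security features" point concrete, here is a minimal sketch of packing a classic CAN frame in the layout SocketCAN uses on Linux. The door-unlock ID and payload are entirely made up for illustration – the point is what the frame format *lacks*: there is no sender identity, no signature, and no authentication field anywhere in it.

```python
import struct

# Layout of a classic SocketCAN frame (linux/can.h): a 32-bit identifier,
# a one-byte data length code, 3 padding bytes, and up to 8 data bytes.
# Note what is missing: no sender field, no signature, no MAC.
CAN_FRAME = struct.Struct("<IB3x8s")

def build_frame(can_id: int, data: bytes) -> bytes:
    """Pack a raw CAN frame. Any node on the bus can emit any ID."""
    assert len(data) <= 8, "classic CAN payloads are at most 8 bytes"
    return CAN_FRAME.pack(can_id, len(data), data.ljust(8, b"\x00"))

# A hypothetical door-unlock command: the ID and payload are invented,
# but on a real bus the receiving ECU typically trusts them as-is.
frame = build_frame(0x2F0, b"\x01\x00")
print(len(frame))  # 16-byte frame, none of it devoted to authentication
```

Receiving ECUs act on the identifier alone, which is why "inserting messages on the bus" is all an attacker needs once they have any foothold on the network.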
I don’t know in what world it makes sense to even have a link between the entertainment system and the low-level network that operates the physical controls. As apparent from the talk, the two systems are supposed to be air-gapped, but in reality, they aren’t.
Rookie mistakes abounded – an unauthenticated “execute” method, services running as root, unsigned firmware, hard-coded passwords, and so on. How do we know there aren’t tons of those in all the cars out there right now, and in the self-driving cars of the future (which will likely reuse the same legacy technologies as current cars)? Recently, I heard a negative comment about the source code of one of the self-driving car “players”, and I’m pretty sure it contains many of those rookie mistakes.
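Unsigned firmware in particular has a well-understood fix: refuse any update whose cryptographic tag doesn't verify. Here is a toy sketch of the shape of that check. It uses an HMAC with a made-up shared secret purely for brevity – a shared secret baked into the ECU is itself a hard-coded credential, so a real update mechanism would use asymmetric signatures, where the device stores only a public verifying key.

```python
import hmac
import hashlib

# Toy sketch only: SECRET is a hypothetical shared signing key.
# Real firmware signing uses asymmetric signatures (the ECU holds
# only a public key), precisely to avoid hard-coded secrets like this.
SECRET = b"factory-signing-key"

def sign_firmware(image: bytes) -> bytes:
    return hmac.new(SECRET, image, hashlib.sha256).digest()

def verify_firmware(image: bytes, tag: bytes) -> bool:
    # compare_digest avoids leaking information through timing
    return hmac.compare_digest(sign_firmware(image), tag)

image = b"\x7fELF...firmware-blob"   # stand-in for a real image
tag = sign_firmware(image)
print(verify_firmware(image, tag))          # True: genuine image accepted
print(verify_firmware(image + b"!", tag))   # False: tampered image rejected
```

The logic fits in a dozen lines; that it was absent from a shipping car is exactly the kind of rookie mistake public scrutiny tends to catch.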
Why is this even more risky for self-driving cars? I’m not an expert in car programming, but it seems like the attack surface is bigger. I might be completely off target here, but on a typical car, you’d have to “just” properly isolate the CAN bus. With self-driving cars, the autonomous system that watches the surroundings and decides what to do next has to be connected to the CAN bus. With Tesla being able to send updates over the air, the attack surface is even bigger (although that’s actually a good feature – being able to patch all cars immediately once a vulnerability is discovered).
Of course, one approach would be to introduce legislation that regulates car software. It might work, but it would rely on governments doing proper testing, which won’t always be the case.
The alternative is to open-source it and let all the white hats find your issues, so that you can close them before the car hits the road. Not only that, but consumers like me would feel safer, and geeks would be able to verify that the car is really running the software it claims to run by checking the fingerprints.
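A minimal sketch of what "checking the fingerprints" could look like: hash the image the car reports it is running and compare it against the hash of the binary built from the published source. The byte strings below are placeholders; the scheme also assumes reproducible builds, so that anyone compiling the open source tree gets a bit-identical image.

```python
import hashlib

def fingerprint(image: bytes) -> str:
    """SHA-256 fingerprint of a firmware image."""
    return hashlib.sha256(image).hexdigest()

# Hypothetical: the image dumped from the car vs. the image built
# independently from the published source (assumes reproducible builds).
dumped_from_car   = b"open-source release v1.2.3"
built_from_source = b"open-source release v1.2.3"

print(fingerprint(dumped_from_car) == fingerprint(built_from_source))  # True
```

Without the source, this comparison is impossible: there is nothing independent to hash the car's image against, and you simply have to trust the vendor.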
Richard Stallman might be seen as a fanatic when he advocates against closed source software, but in cases like… cars, his concerns seem less extreme.
“But the Jeep vulnerability was fixed”, you may say. And that might be seen as just the way things are – vulnerabilities appear, they get fixed, life goes on. No one was injured because of the bug, right? Well, not yet. And “gaining control” is the extreme scenario – there are other pretty bad scenarios, like being able to track a car through its GPS, or causing panic by controlling the entertainment system. The entry point might be Wi-Fi, or GPRS, or even physical tampering, like an inserted flash drive. Is open source immune to those issues? No, but it has proven to be more resilient.
One industry where the problem of proprietary software on a product the customer owns is already playing out is… tractors. It turns out farmers are hacking their tractors because of numerous issues and the vendor’s inability to resolve them in a timely manner. This is likely to happen to cars soon, once only authorized repair shops are allowed to touch anything on the car. And with unauthorized repair shops in the picture, the attack surface becomes even bigger.
In fact, I’d prefer open source not just for cars, but for all consumer products. The software in a smart fridge or a security camera is trivial; open-sourcing it would rarely mean sacrificing a competitive advantage. But refrigerators get hacked, security cameras are active parts of botnets, and the “internet of shit” is becoming ubiquitous. A huge share of these issues are dumb, beginner mistakes. We have the right to know what we are running – in our fridges, DVRs, and, ultimately, cars.
Your fridge may soon be spying on you, and your vacuum cleaner may threaten your pet while demanding “ransom”. The terrorists of the future may crash planes without being armed, crash vans into crowds without being in the van, and “explode” home equipment without being in the home. And that’s not just a hypothetical.
Will open source magically solve the issue? No. But it will definitely make things better and safer, as it has done with operating systems and web servers.