Should automobile software be open-sourced?

Back in the late 90s, many newspapers reported this apocryphal exchange between Microsoft CEO Bill Gates and General Motors:

At a recent COMDEX, Bill Gates reportedly compared the computer industry with the auto industry and stated: “If GM had kept up the technology like the computer industry has, we would all be driving twenty-five dollar cars that got 1,000 miles per gallon.”

Recently General Motors addressed this comment by releasing the statement: “Yes, but would you want your car to crash twice a day?”

I was abruptly put in mind of this old joke when reading the latest news about Toyota’s crash-prone cars, because some of the fatal problems now appear to be based not in the mechanics of the cars, but in their software. The New York Times today reported on the story of 77-year-old Guadalupe Alberto, who died when her 2005 Camry accelerated out of control and crashed into a tree; “the crash is now being looked at as a possible example of problems with the electronic system that controls the throttle and engine speed in Toyotas.”

The point is, cars have gradually employed more and more software as control systems, to the point where, as the Times notes …

The electronic systems in modern cars and trucks — under new scrutiny as regulators continue to raise concerns about Toyota vehicles — are packed with up to 100 million lines of computer code, more than in some jet fighters.

“It would be easy to say the modern car is a computer on wheels, but it’s more like 30 or more computers on wheels,” said Bruce Emaus, the chairman of SAE International’s embedded software standards committee.

Maybe we shouldn’t be surprised if Toyota winds up wrestling with bug-caused crashes. Once software grows really huge, its creators are often unable to vouch that it’s bug-free — that it’ll work as intended in all situations. Automakers have a vested and capitalistic interest in making sure their cars don’t crash, so I’m sure they’re pretty careful. But it’s practically a law of nature that when code gets huge, bugs multiply; the software becomes such a sprawling ecosystem that no single person can ever visualize how it works and what might go wrong. Worse, it’s even harder to guarantee a system’s behavior when it’s in the hands of millions of users, all behaving in idiosyncratic ways. They are the infinite monkeys bashing at the keyboard, and if there’s a bug in there somewhere, they’ll uncover it — and, if they do so while traveling 50 miles an hour, possibly kill themselves.

The problems of automobile software remind me of the problems I saw two years ago while writing about voting-machine software for the New York Times Magazine. As with cars, you’ve got software that is performing mission-critical work — executing democracy! — and it’s in the hands of millions of users doing all sorts of weird, unanticipated stuff (like double- or triple-touching touchscreens that are only designed for single-touch). Let’s leave aside the heated question of whether a manufacturer or hacker could throw an election by tampering with the software. The point is, even without recourse to that sort of skulduggery, what I found is that the machines so frequently crash, bug out, or just do head-scratchingly weird stuff that it’s no wonder so many people refuse to trust them.

So what’s the solution? Well, in the world of election software, many have suggested open-sourcing the code. If thousands of programmers worldwide could scrutinize voting-machine software, they’d find more bugs than the small number of programmers currently working on it under trade secrecy. And theoretically this could improve public confidence.

Would the same process improve automobile code? Should the software in our cars be open-sourced?

I’d say “yes.” But the truth is, open-sourcing doesn’t solve all of your problems when you’re dealing with software that’s regulated by the government.

Consider the case of voting machines. Every time Diebold or ES&S or whatever issued a “patch” to fix problems in their software, the patch had to be submitted to the federal authorities — a process of scrutiny that could take several months. Why? Well, they naturally want to make sure that the patch doesn’t make things worse. Again, I’m leaving aside the question of whether the federal authority regulating voting machines does its job competently; back when I was researching this, it certainly didn’t look like it. But let’s assume for the sake of argument that the regulator is top-notch. It’s still going to take weeks and maybe months to verify that a new software patch doesn’t b0rk an existing system.

So the upshot is, if you buy a piece of voting-machine software with even middling complexity, you’re probably going to discover bugs that need patching in the months and years to come. Some of those might be really, really bad bugs that can screw up elections. But the public’s interest in regulating the industry makes it understandably slow to fix problems when they emerge. Keep in mind, you’re going to have this problem even with open-source software; Linux and Apache and Firefox are patched all the time when security bugs are discovered. Indeed, one benefit of open-source software is precisely that you can discover more bugs, more quickly.

Now, I don’t know much about the regulatory regime for automobile software. Maybe it’s more flexible, faster, and able to authorize patches more quickly. (If anyone reading this knows, please comment!) But then again, do we want software patches rushed more quickly into cars, when they might do more damage than good? My big takeaway from the voting-machine story was: Man, you really ought to keep the software incredibly simple and stripped down from the get-go, to minimize bugs in the first place. Because holy moses can it get harder and harder to fix things if they spin out of control.

Of course, the libertarian response might be simply to not regulate automobiles or voting machines at all; if the regulations are getting in the way of fixing the code, aren’t the regulations the problem? And sure, that solution has at least the benefit of elegance. But the historical record of unregulated industries engaged in work critical to the public weal is not all that great, so I’m gonna go with the need for some regulations here.

But it’s interesting to see how tangled things get as complex software takes control of life-or-death matters, eh? I suspect we’ll hear a lot more about this in the months to come, at least as far as Toyota is concerned.

UPDATE: My friend Tom Igoe, a physical-computing pioneer at New York University, DM’d me on Twitter to point out a flaw in my argument here:

you’re making some assumptions about code bloat that aren’t always true for embedded systems code. Most embedded systems have no OS. So the problems are still relevant, but different than what you’re inferring. It’d make for a good article explaining the difference

He’s right … I should research that and write it!

(That photo above is from the New York Times’ photo essay on Guadalupe Alberto, which is really well-done, and worth checking out!)


Collision Detection: A Blog by Clive Thompson