Wednesday, October 30, 2013

Unintended Acceleration, Software, and Sadness

A few years ago I became concerned about reports of sudden unintended acceleration in Toyota vehicles, especially when some of my family members started driving new Toyotas. At first I was skeptical of the reports, but they kept coming. In time, a friend of a friend had a terrible accident, and I was only two trustworthy people removed from a firsthand experience.

I started paying more attention to the reports, and I developed a very strong suspicion that software was to blame. Three simple facts led me to this suspicion:

  1. The engine throttle was controlled by software.
  2. The brakes were controlled by software.
  3. Nobody knows how to make software without bugs.

The first fact surprised me a little. The second surprised me a lot. The third is common knowledge to anyone who has ever developed software, but it may be surprising to those who haven't.

When I first learned how to program in BASIC as a child, I was taught that computers don't make errors; people do. If you write a perfect program, the computer will do exactly what you expect. This is a tantalizingly optimistic view, and it helped me challenge myself to become a better programmer. Unfortunately it is not true.

During those early years my programs were small and simple. Sometimes I could write an entire program in a single page of text. It seemed very possible that programs could be perfectly correct, but somehow they never were. There was always a bug, and almost every time the bug was my own fault. As time went on, I started working on larger and larger software projects, and it became clear to me, as it should to any software developer, that the likelihood of bugs increases when software complexity increases.
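Even a one-page textbook program can hide a defect for years. A famous example: the naive midpoint computation `(lo + hi) / 2` in binary search can overflow for very large arrays, a bug that sat unnoticed in widely used library code for decades. A minimal sketch of the repaired version (illustrative, not from any particular library):

```c
#include <stddef.h>

/* Classic subtle bug: (lo + hi) / 2 can overflow for large indices.
   This midpoint computation avoids the overflow entirely. */
static size_t midpoint(size_t lo, size_t hi)
{
    return lo + (hi - lo) / 2;  /* safe, since hi >= lo */
}

/* Binary search over a sorted int array; returns index or -1. */
static long bsearch_int(const int *a, size_t n, int key)
{
    size_t lo = 0, hi = n;
    while (lo < hi) {
        size_t mid = midpoint(lo, hi);
        if (a[mid] < key)
            lo = mid + 1;
        else if (a[mid] > key)
            hi = mid;
        else
            return (long)mid;
    }
    return -1;  /* not found */
}
```

If a four-line search routine can be wrong in a way that escapes review for decades, imagine what lurks in hundreds of thousands of lines of engine control code.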

This is a Big Problem. It is so big that many of the greatest minds in computer science have devoted their lives to it. Some very interesting progress has been made, but it is still largely an unsolved problem in real-world systems. It is hard to create a program that correctly implements a specification. It is hard to create a correct specification. It is hard to implement a programming language correctly. It is hard to build correct interfaces to other programs. It is hard to build computers that reliably execute programs correctly, especially in environments with high levels of electrical noise like the engine compartment of a car.

By the way, I often use "is hard" to mean "might be impossible".

Not all software is equally buggy, of course. It is possible to create computer systems that are more reliable than others (consider the hardware and software on spacecraft, for example), but it is difficult to do. It is a very different problem from that of building reliable mechanical systems.

The problem of software bugs is probably the biggest reason that computer security is so awful. We don't know how to make software without bugs, and bugs tend to undermine security. This is why people in the information security community seem to understand and expect bugs more than most other people; we spend our lives discovering, analyzing, exploiting, and fixing bugs. We find bugs that others miss. We break things that are supposedly unbreakable.

To me, the unintended acceleration reports smelled like buggy software from the very beginning. Few of the reports were identical, but all of them involved the inability of the driver to influence a computer that controls the engine throttle.

Some of the reports agreed on a particular point: Pressing harder on the brake pedal did nothing. This is terrifying to imagine. Your car accelerates rapidly even while your foot is on the brake pedal. You press harder and harder until the pedal is at the floor. Maybe you have time to switch off the ignition or shift into neutral, but how long would it take you to think of that? It might take only a second of unintended acceleration to cause a fatal accident.

At first Toyota denied the problem. Then they recalled floor mats. At the time, I thought that was a pretty stupid response to what seemed like a software bug. Then they recalled pedals. Then they blamed the drivers. They repeatedly said that they couldn't recreate the problem when testing the software (but any software developer knows that an inability to reproduce an error rarely means that a bug doesn't exist).

I started wondering: Have any information security professionals audited the software? Has anyone actually skilled at finding bugs looked for bugs? As far as I could determine, the only people who had tested the software were automotive engineers employed by Toyota. Automotive engineers might not know anything about finding bugs, but they should at least know something about fail-safe design.

To me, the most troubling part of the whole thing was that the brakes and all fail-safe mechanisms were also under computer control. Really? You would make a car with software throttle and also give it software brakes? Don't you know that an automobile is a lethal weapon? Have you never seen software fail? How about a traditional brake system just in case, even if it is only activated when the brake pedal is fully depressed? How about a mechanical linkage that limits the throttle when the driver slams on the brakes?
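The core of such a fail-safe fits in a few lines of logic. The sketch below is purely illustrative; the signal names and threshold are my own invention, not anything from an actual Toyota ECU. The idea is simply that when the driver is braking firmly, the brake wins: the throttle request is ignored and forced to zero, no matter what the rest of the software thinks.

```c
/* Hypothetical brake-override logic; NOT real ECU code.
   Assumed convention: pedal positions in percent, 0-100. */
#define BRAKE_OVERRIDE_THRESHOLD 20  /* percent of brake pedal travel */

/* Returns the throttle command to actually apply. If the driver is
   braking firmly, the throttle request is discarded and forced to zero. */
static int throttle_with_override(int throttle_request_pct, int brake_pedal_pct)
{
    if (brake_pedal_pct >= BRAKE_OVERRIDE_THRESHOLD)
        return 0;                 /* brake wins over throttle */
    return throttle_request_pct;  /* normal operation */
}
```

Of course, a software override is only as trustworthy as the software it runs in, which is exactly why a mechanical backstop matters too.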

I can't imagine any engineering culture within Toyota that would fail to consider such things unless it is simply a case of automotive engineers putting too much trust in software because they don't understand software failures. Maybe they tested the things ten thousand times, unaware that they should have tested ten trillion different conditions.

As I became more and more convinced that a software bug was to blame and that nobody was properly looking for it, I started planning a blog post. I considered trying to reverse engineer a car. Even better, perhaps I could convince someone more skilled than me to try to find the bug.

Then the unexpected happened: Tin whiskers were implicated as a cause of unintended acceleration in Toyota vehicles. I had convinced myself that software must be to blame, but suddenly a seemingly plausible alternative arose. I understood tin whiskers well enough to believe that they could explain at least a portion of the failures, yet tin whiskers were just mysterious enough that I didn't question whether or not they might explain all of the failures.

Then I failed. I stopped paying attention after I heard about the tin whiskers. I didn't consider the likelihood of software bugs vs. failures due to tin whiskers. I didn't follow through on making recommendations for mechanical fail-safe (which could prevent fatal accidents regardless of the root cause of the problem). I didn't notice when Toyota denied that tin whiskers caused unintended acceleration. I never went back and reviewed the notably weak software analysis results of the NASA report that first implicated tin whiskers. I ignored the fact that the United States government stopped investigating the problem.

This week I read that a court of law found Toyota's faulty software to blame in a case of unintended acceleration. A software audit for the plaintiff revealed that coding standards for safety-critical software were not followed and that the software is buggy and incredibly complex. The audit even identified a particular failure mode in which a driver could press harder on the brake pedal with no effect, which is as close to a "smoking gun" as we could hope to see. The case clearly indicates negligent software development and deployment practices on the part of Toyota.
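The kind of defensive practice the audit reportedly found missing is easy to illustrate. Safety-critical coding guidelines often call for "mirroring" critical variables: store a redundant inverted copy alongside the value and verify the pair on every read, so that a single memory corruption is detected and handled instead of silently acted upon. A hypothetical sketch of the technique (my own, not Toyota's code):

```c
#include <stdbool.h>
#include <stdint.h>

/* A critical value stored with its bitwise complement as a mirror. */
typedef struct {
    uint16_t value;
    uint16_t mirror;  /* always ~value while the pair is intact */
} mirrored_u16;

static void mirrored_write(mirrored_u16 *m, uint16_t v)
{
    m->value  = v;
    m->mirror = (uint16_t)~v;
}

/* Returns true and sets *out if the pair is consistent; returns
   false if corruption is detected, so the caller can fail safe. */
static bool mirrored_read(const mirrored_u16 *m, uint16_t *out)
{
    if ((uint16_t)(m->value ^ m->mirror) != 0xFFFFu)
        return false;  /* corruption detected: do not use the value */
    *out = m->value;
    return true;
}
```

A throttle target protected this way can never silently take on a corrupted value; the worst case becomes a detected fault with a defined safe response.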

This shouldn't have happened if the automotive engineers were appropriately skeptical of software. This shouldn't have happened if the executives were appropriately skeptical of software. This shouldn't have happened if the software engineers were appropriately skeptical of software.

At the very least, the software engineers should have known better. If I were developing software that could kill someone in an error condition, I would feel a moral obligation to tell people about the potential for error. However, as everyone in the information security community knows, developers tend to overestimate the quality of their own code, and very few software developers are skilled bug hunters.

Unfortunately the software source code still has not been made available to the public. We have to trust the analysis of the plaintiff's expert witness (or trust Toyota) to understand how the software works. The details from the expert witness that have been reported, however, seem very credible to me. The jury found in favor of the plaintiff, so Toyota failed to effectively argue against the analysis.

I'm pretty confident in agreeing with the analysis, but it would be nice to be able to verify. If the software were open source, that would be possible. In fact, if the software were open source, others could have done the same analysis years ago and likely would have been able to fix bugs and save lives. How many people will have to die before we decide that open source is as important for safety as seat belts?

I am deeply sad for the people who died in automobile accidents for years before Toyota's negligence was revealed, I am sad for the people who will die in future accidents, and I am sad and ashamed that I never followed through on my own suspicions about the bugs at the heart of the problem.

11 comments:

Redbeard said...

My favorite bit about this from the past:

“The jury is back, The verdict is in. There is no electronic-based cause for unintended high-speed acceleration in Toyotas. Period.” - Ray LaHood, United States Secretary of Transportation

Anonymous said...


Modern cars need 'flight data recorders' for diagnostics and investigation of anomalous crashes and near-misses. It's not even hard: run 'candump' on the multiple CAN buses in the vehicle.


.... *Therac-25* ....anyone?! .... “Those who cannot remember the past are condemned to repeat it.”

Perry E. Metzger said...

One note: formal methods are no longer beyond usability for this sort of thing. We now have a formally verified C compiler (CompCert), a formally verified microkernel (seL4), and other similar gadgets. Ten or twenty years ago formal verification of large systems was inconceivable, but now it is actually quite doable thanks to massive progress in the technologies to assist people in doing the proofs.

Anonymous said...

Assuming the accelerator and brake systems run on CPUs built on something like 1 square centimeter of silicon, in aggregate Toyotas could represent one of the world's larger cosmic ray detectors.

- Marsh

Anonymous said...

Never underestimate the power of an expert witness to uncover the truth. It must be a very rewarding job indeed.

Robert Graham said...

The number one cause of unintended acceleration is that drivers panic and press the gas pedal instead of the brake. It's a constant across all manufacturers and all models. It's not even clear that the problem was statistically more prevalent in Toyotas.

Omer Ansari said...

Michael, I can almost feel the conviction with which you wrote this post. What stood out to me was your statement about how open source can greatly enhance safety and security. I never thought of it that way.

Thanks for the great post. And by the way, waiting intently for HackRF general shipment :)

Omer Ansari

mrsstomperhere said...

Anyone know anything about experimental remote access interface devices? I seem to be finding evidence that there is an experimental Bluetooth device accessing my systems. I was having serious invasions through a router and modem owned by Michael Shurer, owner of 'MICHAEL'S AUDIO AND VIDEO' in Fraser, CO. He also owned an apartment building where I was a renter, and since June of last year, when I saw the "?" between my PCs and his router, apparently I have 3 interfaces, and in Wireshark I was shown indication of an experimental device. REALLY! Since I moved from Shurer's apartment building, the crap has gotten worse; I fight for use of my device every damn time I try to use it. The fact that I had caught invasion of my satellite dish from employees of Michael's Audio and Video should be reason for an investigation, and I am trying to figure out all the ways these devices are being accessed, and I find experimental Bluetooth devices? I even found files for using my speakers as microphones and adding files remotely, and it just gets worse from there... Anyone with any info as to why I am being shown these things in scans, call 970-887-9695 and help me find these damn hackers. They gotta be within Bluetooth range.

mrsstomperhere said...

now-I just gotta find this thing in my home- any ideas as to the distance the hackers have to be to access a Bluetooth device?
