Warnings and Lessons from Hawaii’s Mistaken Alarm

The record shows it is arguably a miracle that humanity has so far averted widespread catastrophe from nuclear weapons.

“This is not a drill,” announced the emergency alert, and for 37 minutes hundreds of thousands of Hawaiians and tourists were left to contemplate the possibility that an incoming missile might soon end their lives.

Now that we know the “ballistic missile threat inbound” warning was in error, there are urgent lessons to review. As a former secretary of defense, my advice is that we treat the Hawaii incident not as a false alarm but as a real one. It highlighted an emphatically genuine risk: that human error or technological failure—or some fatal combination of the two—could result in a horrific nuclear catastrophe.

From what we have learned about Hawaii—a single person clicked the wrong item on a computer drop-down menu—it is clear that the state’s alert system needs basic improvements. Most obviously, it should require a two-person rather than a one-person decision. A two-person system, long established in comparable military settings, would significantly reduce (but not eliminate) the probability of such a mistake happening again. But if that is the only change made, we will have learned the wrong lesson from this wake-up call.

[…]

What happened in Hawaii is a new manifestation of an old problem. For decades, Cold War policymakers worried that a false report of missiles flying might prompt a leader to launch real missiles in retaliation. As a Pentagon official in 1979, I was awakened in the middle of the night by an Air Force watch officer who reported that his screen showed hundreds of Soviet missiles on the way. For a terrible moment, I thought a nuclear Armageddon was at hand. He quickly reassured me that it was an unexplained technological error. But that incident shaped my thinking for decades. What if exactly that same error had happened in October 1962, when as an intelligence consultant during the Cuban missile crisis I returned to my Washington hotel room each night convinced that nuclear war was imminent? We survived multiple Cold War close calls through a combination of good management and—to a troubling degree—plain good luck.

[…]

The immediate consequence in Hawaii was that people were terrified. They were terrified not only because they thought that they and their families were going to die, but because they had no idea of “what to do.” That terror could have caused heart attacks or automobile accidents, though, happily, no such results have been reported so far.

Not knowing what to do is fundamental to a nuclear attack, especially if the missile is carrying a hydrogen bomb. One hydrogen bomb could kill essentially everyone in a city like Honolulu or Hilo, even if the residents took cover. So the “what to do” has to happen before the missile is fired. The way to save yourself and your family from being killed in a nuclear war is to keep such a war from happening. Once the missiles are launched, it is too late. That is one important lesson we can learn from the Hawaii false alert.

But there is also a second lesson. If the attack alert came from our military warning system, the president would be faced immediately with an existential decision. He would have 5 to 10 minutes to decide whether to launch our ICBMs before they were destroyed in their silos.

If he decides to launch them, and it is a false alert (which could arise for any of the reasons given above), there will be no way to call them back or abort them in flight. He will have mistakenly started World War III, a war likely to destroy our civilization.

The U.S. military has been aware of this existential problem for many decades, and it has taken heroic actions to lower the probability of a false alert. Still, there have been three false alerts, one of which looked very real and could very well have led to a launch decision. It is to the great credit of our military leaders and the system they have put in place that we have never had a mistaken launch order. But the danger is actually greater now than during the Cold War, with the emergence of malicious hackers and government-directed hackers.

So the primary lesson from the Hawaii false alert is that even with the highly capable system our military now has in place, we are still vulnerable to such an alert. And the consequence of a mistaken launch order is no less than the end of our civilization as we know it.

The article does not mention reducing the sheer number of nuclear weapons, but that, of course, is also an important initiative.