The price of liberty is eternal vigilance. It’s a pithy phrase, often attributed to Thomas Jefferson; even if he did not coin it, it’s pretty clear he believed it. What October 7th reminds us, the latest in a near-endless string of reminders, is that it’s really hard to maintain vigilance. Forever is a long, long time. Eventually, our vigilance weakens and we become vulnerable.
In The Black Swan, Nassim Nicholas Taleb writes (adapted from a parable of Bertrand Russell):
Consider a turkey that is fed every day. Every single feeding will firm up the bird’s belief that it is the general rule of life to be fed every day by friendly members of the human race “looking out for its best interests,” as a politician would say. On the Wednesday afternoon before Thanksgiving, something unexpected will happen to the turkey. It will incur a revision of belief.
Until that fateful Wednesday afternoon, every passing day provides evidence that the turkey is not just safe but under the compassionate care of the farmer. All of the evidence, in fact, confirms the mistaken hypothesis that the farmer is the turkey’s friend. Evidence can be consistent with multiple hypotheses. What happens to the turkey is not a surprise to the butcher who has a different, more accurate model of the underlying process that generates the data.
And as Taleb has pointed out, the perception of safety is highest the instant before the black swan materializes. The string of protective success is longest at that point—there is no evidence that it is about to fail. The longer the worst outcome fails to materialize, the more confident we become that disaster has been avoided. Inevitably, we let down our guard.
Until the morning of October 7th, Israel received daily evidence that its plan to contain Hamas was succeeding and that its system to monitor the Gaza border was foolproof. That system had two key components: highly sophisticated electronic monitoring devices and human observers at the border providing backup. As is the case in perhaps all calamities, there were warnings.
Supposedly, the Israeli government received a detailed copy of the plan for Hamas’s attack. The human observers at the border watched and reported Hamas’s drills as they prepared for the attack. Their warnings were dismissed as “fantasies.” According to a New York Times story, Israel’s security service, the Shin Bet, assumed what was being observed in the early hours of October 7 was a nighttime exercise:
Their judgment that night might have been different had they been listening to traffic on the hand-held radios of Hamas militants. But Unit 8200, Israel’s signals intelligence agency, had stopped eavesdropping on those networks a year earlier because they saw it as a waste of effort.
One reason the evidence was ignored is that there is always some evidence in advance of an attack. It is often an electronic needle in a digital haystack—part of a swarm of intel and information that is hard to assess. Such was the case with Pearl Harbor and 9/11. But I think equally important, if not more so, is the psychological burden of eternal vigilance. It’s too hard to keep paying attention. The use of technology—even something as primitive in today’s world as the Maginot Line—induces overconfidence. The daily drumbeat of silence—the technology is working—convinces those in charge that the system is indeed foolproof.
In the run-up to the financial crisis of 2008, banks used a technique called VAR (Value At Risk) to assess the riskiness of their portfolios. The imperfection of that technique was widely understood by serious practitioners. But with each passing day without massive losses, risk officers at those banks were lulled into a false sense of security. Obviously, VAR was working well enough. And as I explored in my brief book on the financial crisis, Gambling with Other People’s Money, the incentive of banks to be vigilant was blunted by the potential of being bailed out, a potential that was in fact realized for almost all of them. On October 7th, the intelligence failure had no similar excuse—nothing softened the blow that fell on Israel that day.
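VAR’s blind spot can be seen in a few lines of code. Below is a minimal sketch of one common flavor, historical-simulation value at risk, run on made-up calm-market returns; the names and numbers are purely illustrative, not any bank’s actual model:

```python
import random

# Hypothetical daily returns: 1,000 calm days, with no crash in the window.
random.seed(0)
calm_returns = [random.gauss(0.0005, 0.01) for _ in range(1000)]

def historical_var(returns, confidence=0.99):
    """One-day value at risk by historical simulation: the loss exceeded
    on only a (1 - confidence) fraction of the sampled past days."""
    worst_first = sorted(returns)                  # most negative returns first
    cutoff = int(len(returns) * (1 - confidence))  # e.g. the 10th-worst of 1,000
    return -worst_first[cutoff]                    # report loss as a positive number

var_99 = historical_var(calm_returns)
# The estimate is built entirely from the observed window: a crash absent
# from the data cannot raise it, no matter how likely it has become.
```

The 99% VaR computed this way is only as pessimistic as the worst days in the lookback window. A disaster that hasn’t yet happened contributes nothing to the estimate, which is exactly how a long quiet stretch makes the number look reassuring.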
Someone in the financial risk business once conceded to me that VAR was indeed imperfect, but said it was the best tool available. What was the alternative, he asked incredulously, to assessing risk mathematically? Going with your gut? My response was that we are particularly vulnerable to overconfidence when we think we are using “science,” and VAR had lots of math behind it that made it look scientific. One advantage of going with your intuition is that you might be more likely to doubt your intuition than a scientistic tool, and that humility might protect you from catastrophe. Cutting-edge tools can lead to hubris.
High-tech surveillance tools can play a similar role in inducing a lack of vigilance about an attack from Gaza—they lull the worrier into overconfidence because, like VAR, they are state of the art. But in the case of October 7th, it wasn’t the tools that failed. It was the decision-makers who had convinced themselves, based on day after day of nothing bad happening, that Hamas was all roar but no teeth. What a tragic misperception that turned out to be.
Once the war is over, there will be a commission here in Israel to assess just how badly Israel’s intelligence agencies, military, and political leaders underestimated Hamas and failed to reckon with what was coming over the horizon. They had one job—security—and they failed. The verdict of that commission, and of history, will be brutal, as it should be. Great leadership requires eternal vigilance. And while that may be too hard a standard to meet, it doesn’t hurt to be aware of the difficulty of the task. Sometimes that may be enough to remind us of the need for humility in facing the potential for catastrophe.
POSTSCRIPT: Here is an addendum I’ve written on this issue—the tragic irony of Iron Dome in the challenge of eternal vigilance.
In this context, I wonder about another Taleb phrase: anti-fragile. How can Israel be anti-fragile given that it lacks strategic depth, is surrounded by enemies, and has a small population relative to the number of people who resent its success?
Edward Luttwak, an expert in military strategy, wrote a critique of that NYT article and gave his compelling view of how October 7 happened: no conspiracy, simply the reality of an army like the IDF in the continual state of low-level conflict, with occasional spikes, that Israel faces. Here’s an excerpt from Luttwak’s piece:
“Had Israeli intelligence analysis, or the arrival of a complete war plan sold by an enterprising operative revealed Hamas’s plan for an attack on October 7, the Israelis would have sent much stronger forces to guard the Gaza perimeter. Instead of the lone Merkava tank whose capture by dozens of Hamas fighters was shown again and again in news videos, there would have been a company of 10 tanks in that position, which would have massacred the attackers with their machine-gun fire. As for the single mechanised infantry company with fewer than 100 soldiers that guarded a critical hinge position, there would have been a battalion or even two that would have crushed the attackers.
But then, of course, Hamas spotters would have seen Israeli troops ready to defeat them — and they would have called off the attack altogether. There is worse: once an attack warning is received and reinforcements are deployed so that the enemy calls off its planned attack, the intelligence indicators that got it right will be discredited as false alarms, while the intelligence officers who failed to heed the signs will be the ones everyone listens to the next time around.”