“The overriding conclusion was that the government’s principal failure in 9/11 was a failure to ‘connect the dots’.” –Brookings Institution
As with 9/11, evidence suggests that intelligence failures occurred before Pearl Harbor, the Boston Marathon bombing, the bombing of the USS Cole, multiple active assailant attacks, and other incidents. These failures carry far greater concern and consequence when they occur at the national level. Often associated with such events is the criticism that there was a “failure to connect the dots.” Most recently, some have questioned whether there was an intelligence failure (i.e., a failure to connect the dots) before the October 7, 2023, Hamas attack in Israel. According to the National Counterterrorism Center, “HAMAS – the acronym for Harakat al-Muqawama al-Islamiya (Islamic Resistance Movement) – is the largest and most capable militant group in the Palestinian territories and one of the territories’ two major political parties.” The United States and several other countries have designated Hamas as a terrorist organization.
Erik Dahl, an assistant professor of National Security Affairs at the Naval Postgraduate School’s Center for Homeland Defense and Security, posits in “Intelligence and Surprise Attack: Failure and Success from Pearl Harbor to 9/11 and Beyond” that connecting the dots means understanding the importance of signals and warnings in the available information (i.e., intelligence), parsing out what is most relevant and connected, and assimilating it into a product for decision-makers who are willing to act on its content. The enormous amount of available data often compounds this challenge. Dahl cites Roberta Wohlstetter’s book Pearl Harbor: Warning and Decision, which noted that the ratio of noise to relevant signals made data analysis onerous before the attack on December 7, 1941.
In an October 30, 2023, edition of the podcast Overheard, entitled “Surprise Attack: Understanding the Challenges of Intelligence Analysis,” Philip Wasielewski stated one commonality of surprise attacks has been preconceptions, which lead to ingrained biases. The idea that terrorists could fly hijacked airliners into the World Trade Center or that an irregular Hamas terrorist force could thwart the seemingly impregnable Israeli border defenses was inconceivable to many analysts and decision-makers. Wasielewski’s guest, former Central Intelligence Agency (CIA) analyst Nate Dietrich, reinforced this viewpoint by stating that preconceptions and biases engender unalterable beliefs, which may lead to inaction.
Examples From the 9/11 and Hamas Attacks
Before the Palestinian militant group Hamas attacked Israel on October 7, 2023, signs of an impending attack were apparent, including training with paragliders, military-style drills, and a professional-quality video of attacks on mock Israeli targets. However, the “dots” were not collected and analyzed sufficiently to connect them.
According to a New York Post article, Hamas hid its preparations for the October 7 attack in the open. In that article, Michael Milshtein, a former Israeli Army intelligence officer, stated he was aware of Hamas’ preparations but never conceived of its ability to coordinate such an ambitious, large-scale operation. Milshtein’s observation appears to mirror that of the broader Israeli intelligence apparatus: much of the thinking focused on what the adversary was, not on what it could become. In addition, recent reports suggest that Israeli intelligence officials dismissed intelligence information, including a copy of the Hamas attack plan, obtained well in advance of the attack because the information was considered “aspirational.”
The 9/11 Commission Report similarly concluded after the 2001 terrorist attacks that the U.S. intelligence enterprise was still squarely on a Cold War footing against an adversary (i.e., the Soviet Union) that no longer existed. As a result, U.S. intelligence failed to give credence to the many data points indicating the emergence of a new threat (e.g., al-Qaeda). Siloed thinking and institutional (i.e., bureaucratic) policies that limit intelligence sharing are two impediments to preventing surprise attacks. Without information sharing across intelligence services, generating a compilation of data points to indicate a potential incident, especially a surprise attack, is challenging at best.
Conclusions from The 9/11 Commission Report led to the creation of the Office of the Director of National Intelligence to coordinate data collection and analysis among U.S. agencies. In The Conversation, Javed Ali of the Gerald R. Ford School of Public Policy postulates Israel’s intelligence agencies – Shin Bet, Mossad, and its military intelligence agency – would benefit from a similar coordinating entity in their ongoing national defense. However, even if the abovementioned obstacles are overcome, some, including Dahl, suggest surprise attacks are inevitable.
The Challenge of Connecting the Dots
Failing to make connections between intelligence gathered from various sources can have catastrophic impacts. Although the question is often asked, determining the root cause of why the dots were not connected is not a simple task, as many “whys” ultimately contribute to how catastrophic surprise attacks continue to happen despite robust intelligence and military systems. As such, post-incident reviews and analyses cannot cease after answering the first “why.” For example:
- Military commanders at Pearl Harbor decided not to accept warnings of an impending attack because the Japanese code had not been completely deciphered;
- The CIA did not hand off information regarding the 9/11 hijackers to the Federal Bureau of Investigation for follow-up;
- Israeli intelligence officials did not recognize the threat contained in pre-attack intelligence.
In addition, an Arab country, purportedly Egypt, reportedly warned Israel before October 7 that Palestinian anger was reaching a dangerous point. Yet, it appears Israel did not take significant action to address this warning. The public impression that Hamas was unwilling to enter a large-scale confrontation with Israel may have played a role in the dots not being connected before the October 7 attack. This perception extended beyond Israel, as a national security advisor in the Biden Administration noted just days before the attack “that the Middle East was the calmest it had been in two decades.” Some, such as retired Lieutenant General William G. “Jerry” Boykin, contend that internal issues within the Israeli Knesset not only created a strategic weakness their enemies could exploit but also resulted in Israel missing opportunities to connect intelligence leads.
Reevaluating to Avoid Repeating the Cycle
Indefinitely maintaining a maximum readiness level in security and special operations is impractical from a financial or staffing perspective. As society moves farther away from a critical incident, it becomes even more challenging to maintain support for maximum readiness. That being said, when dealing with an adversary or adversaries possessing an avowed hatred that calls for eliminating people or nations, there is no room for complacency. As such, the “it won’t happen here” mentality must change.
It is easy to analyze an incident afterward and point out what an organization failed to do to prevent a surprise attack. Rather than criticize or pass judgment after an incident, there must be a concerted effort to glean the root causes and consistencies across surprise attacks, whether local, regional, national, or international. It is short-sighted to conclude that the only contributing factors to be addressed are decision-makers failing to recognize commonalities (i.e., pre-incident indicators), siloed information sharing, failing to take a threat seriously, or complacency that it cannot happen here.
Key Takeaways and Recommendations
The following are the authors’ recommendations for addressing the common themes evidenced in surprise attacks:
- Do not lose sight of the adversary’s overall goals and intent. The original Hamas charter called for Israel’s elimination. Although Hamas modified its charter in 2017, the intent has not changed, as evidenced by the October 7 attack on Israel.
- Create a culture of cooperation, not competition, among intelligence services. For nations like the United States and Israel that maintain robust intelligence operations across multiple agencies and organizations, competition and information silos can occur due to existing bureaucracies. Culture and laws can create bureaucracies that silo intelligence (e.g., the CIA is responsible for international intelligence operations, the FBI is the lead agency for domestic intelligence operations). Even with conscious efforts to share information, having multiple agencies with different intelligence responsibilities delays information- and intelligence-sharing processes.
- Expect the unexpected when it comes to asymmetrical warfare. Agencies should consider “red teaming” to identify vulnerabilities in their defenses. With the approval of an organization’s leadership, red teaming for a nation-state threat could involve a group (i.e., a red team) acting as an opposing force to attempt a physical or cybersecurity intrusion. The red team then reports the vulnerabilities it identified back to the organization’s leadership to address. Think tanks exist for many topical areas, but the number of organizations engaging in red teaming for threats and vulnerabilities is assuredly low for many reasons (e.g., cost, perception, safety concerns). Yet, those limiting factors seem minor when a surprise attack occurs and lives are lost.
- Reduce or eliminate personal, institutional, and political biases when prioritizing security vulnerabilities. Unalterable beliefs engendered by biases about an adversary’s capabilities can undermine effective measures to prevent an attack.
- Review the successful steps employed in previous interdictions to prevent future surprise attacks. This includes identifying which commonalities gave the connected dots sufficient credibility to equip decision-makers with a clear enough picture of the looming threat to initiate preventive measures.
Whether the actions outlined in this article or other steps are taken to mitigate future surprise attacks, one thing is certain: the consequences of repeating the cycle of failing to connect the dots are too high to ignore. Communities, friends, and families expect more, and rightfully so.
Robert Leverone
Robert Leverone, M.A., retired as a lieutenant from the Massachusetts State Police (MSP) after thirty-one years of service. He was the commander of the MSP’s Special Emergency Response Team, an arm of the agency tasked with crowd control and homeland security-related missions. Robert holds a Bachelor of Science degree in Business Administration from Northeastern University, a Master of Science degree in Criminal Justice from Westfield State University, and a Master of Arts degree in Security Studies (Homeland Security and Defense) from the Naval Postgraduate School, where he authored his thesis, Crowds as Complex Adaptive Systems: Strategic Implications for Law Enforcement. Robert is the owner and president of Crowd Operations Dynamix, Inc., specializing in training and consulting for law enforcement and private industry organizations in crowd management and control issues.
Darren E. Price
Darren E. Price, M.A., retired in 2020 after over 34 years of government service. He currently consults for public and private sector clients on various homeland security-related projects across the United States. In addition, Darren serves as an adjunct professor in the Homeland Security/Emergency Management Program at Idaho State University and Mount Vernon Nazarene University. He is a graduate of the Naval Postgraduate School’s Center for Homeland Defense and Security Master’s Program with a Master of Arts degree in Security Studies (Homeland Security and Defense). Darren is also a U.S. Army veteran who served as an intelligence analyst in Germany and the United States.