There is no question that analyzing mountains of information and determining what is important, urgent, and worthy of follow-up (three separate and distinct categories) is a daunting task in any organization.
Are we sharing all of our knowledge yet?
When the organization is the United States Federal Government and the amount of information that has to be addressed daily dwarfs what most people can conceptualize, lives may be at stake when an individual or system fails to connect the dots.
Such a failure occurred on December 25, 2009, but it need not have.
The tools to manage information on a massive scale do indeed exist, and it is clear that the U.S. government is either not deploying the right ones or not using them correctly.
The National Counterterrorism Center, created in 2004 following recommendations of the 9/11 Commission, has a mission to break “the older mold of national government organizations” and serve as a center for joint operational planning and joint intelligence. In other words, various intelligence agencies were ordered to put aside decades-long rivalries and share what they know and whom they suspect. Unfortunately, while this sounds good in theory, in practice this mission may not yet be close to being fully carried out.
In addition to the fact that old habits die hard (such as a disdain for inter-agency information sharing), it appears that the folks at the NCTC failed to grasp a basic tenet of knowledge sharing: to be effective, search needs to be federated and contextual, that is, it needs to query multiple data stores simultaneously and present the results in a coherent manner.
Discrete searches in separate databases yield far different results from a federated search that spans multiple databases. All reports indicate that intelligence agencies were still looking at discrete pieces of information in separate and distinct databases, and the agencies themselves were not sharing all that they knew.
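The difference between discrete and federated search can be sketched in a few lines. In this toy illustration, one query fans out to several independently maintained stores at once and the hits are merged into a single result set; the store names and records are hypothetical, invented purely to mirror the scenario described here, and a real system would of course query remote services rather than in-memory lists.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical, separately maintained databases (for illustration only).
DATA_STORES = {
    "embassy_reports": [
        {"name": "Abdulmutallab", "note": "father warned embassy"},
    ],
    "signals_intel": [
        {"name": "unknown Nigerian", "note": "Yemen plot intercept"},
    ],
    "visa_records": [
        {"name": "Abdulmutallab", "note": "UK visa renewal refused"},
    ],
}

def search_store(store_name, query):
    """A discrete search: one query against one store."""
    hits = [r for r in DATA_STORES[store_name]
            if query.lower() in r["name"].lower()]
    return [(store_name, hit) for hit in hits]

def federated_search(query):
    """Run the same query against every store in parallel, merge results."""
    with ThreadPoolExecutor() as pool:
        result_lists = pool.map(lambda s: search_store(s, query), DATA_STORES)
    return [hit for hits in result_lists for hit in hits]

# A discrete search of any single store sees at most one fragment;
# the federated search assembles fragments from every store at once.
for source, record in federated_search("Abdulmutallab"):
    print(f"{source}: {record['note']}")
```

The point of the sketch is that no single store contains the whole picture; only the merged, cross-store view does.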
In this case, much was known about Umar Farouk Abdulmutallab, the Nigerian man accused of trying to blow up Northwest Flight 253. In May, Britain put him on a watch list and refused to renew his visa. In August, the National Security Agency overheard Al Qaeda leaders in Yemen discussing a plot involving a Nigerian man. In November, the accused’s father warned the American Embassy (and a CIA official) in Abuja that his son was a potential threat. As a result, the son was put on a watch list that flagged him for future investigation. He bought his plane ticket to Detroit with cash and boarded the flight with no luggage. Yet, almost unbelievably, no one saw a pattern emerge here.
Shouldn’t a system somewhere have put the pieces of this puzzle together and spit out “Nigerian, Abdulmutallab, Yemen, visa, plot, cash ticket purchase, no luggage = DANGER!”?
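The kind of correlation that question imagines can be illustrated with a trivial rule-based score: individually weak signals about one traveler are weighted and summed, and a combined score above a threshold raises an alert. The signals, weights, and threshold below are entirely invented for illustration and bear no relation to any real screening system.

```python
# Hypothetical signal weights and threshold (illustration only).
SIGNAL_WEIGHTS = {
    "on_watch_list": 3,
    "visa_refused": 3,
    "matches_intercept_description": 4,
    "cash_ticket_purchase": 2,
    "no_checked_luggage": 1,
}
ALERT_THRESHOLD = 8

def risk_score(signals):
    """Sum the weights of all signals observed for one traveler."""
    return sum(SIGNAL_WEIGHTS[s] for s in signals)

def should_alert(signals):
    """Alert when the combined score crosses the threshold."""
    return risk_score(signals) >= ALERT_THRESHOLD

# No single signal crosses the threshold, but together they do.
observed = ["on_watch_list", "visa_refused",
            "cash_ticket_purchase", "no_checked_luggage"]
print(risk_score(observed), should_alert(observed))  # 9 True
```

Each signal on its own is unremarkable; the alert only fires when the pieces are scored together, which is precisely what appears not to have happened.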
Information Overload is partly to blame as well. Given the vast amount of intelligence the government receives every day on suspected terrorists and plots, it could very well be that analysts were simply overwhelmed and did not notice the pattern. Far from being immune to the problem, the government, given the sheer quantity of information it deals with, is a poster child for it.
Regardless of what comes out of the numerous investigations of the Christmas Day terrorism plot and the information-sharing failures of the various intelligence agencies, one thing was abundantly clear by Boxing Day: the Federal Government needs to greatly improve its ability to leverage the intelligence it gathers and connect the dots.
Clearly, many changes need to occur to improve security, but one relatively simple way for the government to proceed is to take the first steps to reduce Information Overload and raise the signal-to-noise ratio so that critical information can rise to the top.
Jonathan B. Spira is CEO and Chief Analyst at Basex.