It was the antiquated, largely paper-based method with which the FBI managed its data that prevented the Bureau from connecting the dots that could have prevented 9/11. At the time, one department wondered why so many foreigners were taking flying lessons, another reported the entry of various Al Qaeda operatives into the country, and a third raised suspicions about a specific individual. Unfortunately, this information remained isolated, and no analyst was able to access all of it. According to the 9/11 Commission, the FBI’s IT system was inadequate for the task. Since then there have been various attempts to solve the problem, including Virtual Case File, which cost $170 million: not a single line of code from that system was ever used. Then, in 2005, came Sentinel, a $541 million project that was supposed to be up and running by 2009. In 2010, Jeff Johnson was called in to fix the situation: $405 million had already been spent, only half the program had been developed, and an estimated six to eight more years of work plus a further $350 million would be needed to get it off the ground.
The quality of the work done thus far was not to blame; the way of working was. The project had been organised according to the waterfall method, built around the charts first drawn up in 1910 by their creator, Henry Gantt. Gantt charts were first used during the Great War, to supply troops for General Crozier. Basically, we went from war trenches to war drones, and yet we expected to build the FBI’s IT system with an early-20th-century method.