According to Occam’s Razor, the best explanation of a phenomenon is the one requiring the fewest assumptions. In the case of the World Trade Center towers, this criterion would certainly favor the analysis that the towers were weakened by the aircraft impacts, weakened further by fire, and ultimately fell in a gravity-driven progressive collapse.
A variety of evidence indicates that the Twin Towers were indeed struck by large commercial (or militarized) jet aircraft, and that the wing structures of those aircraft were massive and strong enough to slice like a knife through the exterior columns (and perhaps even some interior columns) of the buildings, leaving each tower intact, to be sure, but in a structurally precarious condition. The ensuing fires would certainly have caused some further progressive weakening of the structures. In this situation, it seems reasonable that some interior floors might have partially or completely collapsed, or disconnected from the exterior columns. As the unsupported height of the exterior columns increased, a sudden buckling around the perimeter of the building does not seem impossible, especially if the core columns were already weakened by fire and aircraft damage. And this is exactly what is seen in the videos of the collapse: without any sign of explosive cutter charges going off, the exterior columns appear to yield in a process of smooth and uniform crushing, almost as if they simply melted away.
After the collapse initiation, the videos appear to show a “zone of destruction” accelerating downwards at almost free-fall speed, with the intact upper stories of the towers pounding against the stationary lower stories in a spectacular orgy of mutual destruction, dust and structural fragments thrown outwards in massive volume, until finally nothing can be seen except the billowing cloud, which seems to descend, hang in the air, and explode upwards all at the same time. An explanation for this phenomenon was provided in a paper by Bazant & Zhou that was allegedly written and rushed through peer review in a matter of days after 9/11: the amount of energy required to buckle and break the columns of the WTC towers is trivial compared to the gravitational potential energy of the towers, provided that the dynamics of the collapse apply the appropriate leverage.
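To see the scale of that energy argument, consider the rough comparison sketched below: the kinetic energy gained by an upper block of stories falling through a single story height is set against an assumed energy capacity for buckling that story’s columns. The block mass, story height, and per-story buckling capacity are illustrative assumptions in the spirit of Bazant & Zhou’s argument, not figures taken from their paper.

```python
# Rough energy-balance sketch in the spirit of Bazant & Zhou's argument.
# All numbers below are illustrative assumptions, not values from their paper.

G = 9.81            # gravitational acceleration, m/s^2
STORY_HEIGHT = 3.7  # m, assumed typical WTC story height

upper_block_mass = 58e6      # kg, assumed mass of the stories above the impact zone
column_dissipation = 0.5e9   # J, assumed plastic energy absorbed buckling one story's columns

# Kinetic energy gained if the upper block falls freely through one story
kinetic_energy = upper_block_mass * G * STORY_HEIGHT

ratio = kinetic_energy / column_dissipation
print(f"KE after one-story drop:    {kinetic_energy / 1e9:.1f} GJ")
print(f"Assumed buckling capacity:  {column_dissipation / 1e9:.1f} GJ")
print(f"Overload ratio:             {ratio:.1f}x")
# With these assumptions the impact energy exceeds the buckling capacity several
# times over, which is the core of the "trivial by comparison" claim; critics
# dispute both the assumed capacity and the assumption of a free one-story drop.
```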
A more imponderable factor is the possibility of accelerated combustion during such a collapse process. As the collapse initiated, the contents of the smoldering office fires would have been swept downwards into a turbulent maelstrom. Could the flaming embers have been fanned to hotter temperatures by this turbulent, compressed air, quickly igniting more flammable material encountered at each new floor on the way down? Any such process would be limited by the amount of air contained in the towers as the collapse began; but if as much as 50% of this oxygen had been consumed in such an accelerated process of combustion during the 10 seconds it took each building to collapse, the energy released would have been equivalent to roughly 145 tons of TNT.[1] Such an energy source could go a long way towards accounting for the explosive appearance of the collapses, and for the expansion of the dust clouds.
However credible this scenario might seem, it has been challenged in virtually every detail by many investigators well aware of the long history of false-flag operations perpetrated by governments preparing their people for war. The evidence is overwhelming, from many lines of argument, that 9/11 was just such a false-flag inside job; the collapse of the World Trade Center must be evaluated in light of that basic fact. In that context, the perpetrators’ need for an operationally successful event must have taken the highest possible precedence. For that reason alone, considering the many imponderables of the collapse process, it seems unlikely that the perpetrators would have left the collapse of the towers to chance. A partial or asymmetrical collapse, or no collapse at all, would have been catastrophic to their plans. Thus, it seems highly probable that the perpetrators planned some technological means to ensure the outcome.
Moving along to more specific arguments: firstly, it seems unlikely that the fires in the buildings were intense enough to have weakened the structure substantially beyond the initial damage caused by the aircraft impacts. A substantial portion of the fuel on board the aircraft would have flashed in the fireball seen on impact, while any remaining fuel would have either burned off quickly or drained to lower floors. After the jet fuel was gone, the remaining fire would have been simply a typical office fire, as indicated by the thick black smoke emerging from the flames. Such fires do not burn hot enough even to significantly weaken structural steel, much less cause a collapse; and, as has often been pointed out, no other modern steel-framed skyscraper has ever been destroyed by such a fire.
The collapse initiation scenario of Bazant & Zhou has been disputed by Gordon Ross, who argues that the entire building, both above and below the collapse zone, should have acted like a resilient system of springs and dampers, absorbing and dissipating some of the energy of impact and thereby stopping or at least slowing the initial collapse. Moreover, there is no reason why all the columns on the initiating floor should buckle at the same time and offer no further resistance, as postulated by Bazant & Zhou; if a single column failed in a buckling mode, the surrounding web of beams, columns and floors would undergo both elastic and inelastic deflections, redistributing the load while absorbing kinetic and potential energy, and slowing or halting the onset of the collapse.
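Ross’s objection can also be framed as an energy budget: if the structure surrounding the impacted story absorbs more energy, through elastic and plastic deformation, than the falling block delivers, the collapse arrests. The sketch below tallies such a budget; every figure in it is an illustrative assumption, not a value taken from Ross’s analysis.

```python
# Illustrative energy budget for collapse arrest, in the spirit of Gordon Ross's
# critique. Every figure below is an assumption made for the sake of the sketch.

G = 9.81
STORY_HEIGHT = 3.7          # m, assumed
upper_block_mass = 58e6     # kg, assumed

kinetic_energy = upper_block_mass * G * STORY_HEIGHT  # ~2.1 GJ delivered

# Assumed energy sinks on and around the impacted story (all illustrative):
sinks = {
    "plastic buckling of impacted-story columns":   0.6e9,
    "elastic strain in columns above and below":    0.3e9,
    "plastic deformation of beams and floor slabs": 0.6e9,
    "crushing and pulverization of concrete":       0.4e9,
    "momentum transfer to newly accreted mass":     0.4e9,
}

total_absorbed = sum(sinks.values())
print(f"Kinetic energy delivered: {kinetic_energy / 1e9:.2f} GJ")
print(f"Total energy absorbed:    {total_absorbed / 1e9:.2f} GJ")
print("Collapse continues" if kinetic_energy > total_absorbed
      else "Collapse arrested (under these assumptions)")
# Whether the collapse continues or arrests hinges entirely on which set of
# assumed numbers one believes; that is precisely the point in dispute.
```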
However plausible the “accelerated combustion” scenario might (or might not) be, it is very difficult to see how such a process could generate the extremely high, directed and explosive pressures observed. The intensely energetic nature of the collapses has been extensively documented through analysis of video footage and other photographic records, interviews with eyewitnesses, first responders and cleanup crews, forensic reports on the dust and debris, and examination of seismographic recordings and infrared thermal imaging reports. The results of these investigations, as well as the basic questions about the plausibility of the gravity-driven collapse scenario, have led to extensive speculation as to whether the collapse of the Twin Towers was actually caused by some sort of intentional demolition, and if so, how this could have been accomplished.
Derrick Grimmer suggested in 2003 that the core columns could have been quietly destroyed by thermite. He estimated that a core column could be melted by a coating of thermite approximately 2 inches thick, and that 11 metric tons of thermite should be more than sufficient to supply the energy needed to drive the expansion of the pyroclastic cloud of hot gas and dust that enveloped lower Manhattan as the towers collapsed, as calculated by Jim Hoffman. Grimmer suggested that this modest amount of thermite could have been installed at the time of the towers’ construction, or applied as an “insulation coating” during routine maintenance.
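As a rough plausibility check on the two-inch figure, the sketch below compares the heat such a coating could deliver against the heat needed to melt the steel it covers, per metre of column. The column cross-section, plate thickness, packed thermite density and heat-transfer efficiency are all assumptions chosen for illustration; they are not Grimmer’s numbers.

```python
# Order-of-magnitude check on melting a core box column with a 2-inch thermite
# coating. Geometry, densities and efficiency are assumptions for illustration.

# Assumed column geometry (per metre of column height)
perimeter = 2 * (0.40 + 0.90)      # m, assumed box section 400 mm x 900 mm
wall_thickness = 0.05              # m, assumed plate thickness
steel_density = 7850               # kg/m^3
steel_mass = perimeter * wall_thickness * steel_density   # kg per metre

# Heat required to bring that steel to its melting point and melt it
# (assumed mean specific heat 600 J/kg.K, melting point ~1510 C, latent 270 kJ/kg)
heat_to_melt = steel_mass * (600 * (1510 - 20) + 270e3)   # J

# Heat available from a 2-inch (0.0508 m) thermite coating over the same surface
coating_thickness = 0.0508         # m
thermite_density = 3000            # kg/m^3, assumed packed density
thermite_energy = 3.9e6            # J/kg, approximate aluminothermic reaction energy
transfer_efficiency = 0.5          # assumed fraction of reaction heat reaching the steel
thermite_mass = perimeter * coating_thickness * thermite_density
heat_delivered = thermite_mass * thermite_energy * transfer_efficiency

print(f"Steel mass per metre:    {steel_mass:.0f} kg")
print(f"Heat needed to melt it:  {heat_to_melt / 1e9:.2f} GJ")
print(f"Thermite mass per metre: {thermite_mass:.0f} kg")
print(f"Heat delivered:          {heat_delivered / 1e9:.2f} GJ")
# With these assumptions the two figures are of the same order, so the 2-inch
# estimate is at least dimensionally plausible; the verdict swings with the
# assumed wall thickness and heat-transfer efficiency.
```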
However, in Spring 2005, an anonymous “Finnish Military Expert” argued that there were some aspects of the WTC event that could not be explained by any conventional thermal or explosive technology, but could only be explained by the use of a pure fusion device similar to what I have suggested above. The Finnish expert cited the predominance of extremely small particles < 300 microns in size in the dust (typical of nuclear molecular decomposition rather than any chemical process); a brown shade of color in the air; selectively burned cars in the parking areas and other effects resembling EMP (electromagnetic pulse) induction; superheated pools of molten steel, including core columns melted completely to a height of 65 feet; elevated tritium levels in the WTC debris; a blast wave whose onset was very soon after the first collapse began, which apparently destroyed WTC6 (the customs building) before it was struck by falling debris, and which ejected steel columns outwards to a distance of up to 175 meters. This Finnish expert mentioned the thermite theory as well as the possibility that explosive cutting charges were used, and he denied that either of these methods could account for all the effects.
Recognizing that any nuclear explosives used must have had fairly low yield and precise effects, the Finnish Military Expert wrote:
While looking for a bomb with a small size and a strong effect, a pure hydrogen bomb was an obvious solution. When no atomic device is needed for igniting, the size of the hydrogen bomb gets even smaller and the yield (effect) can be set within a wide range, for example between from 1 to 100. This succeeded in the 1980’s, as well as the neutron bomb, which kills only living things and leaves most material untouched.
A remarkable claim, yet casually asserted as if it were an indisputable matter of fact. In 2007, the Expert published an addendum to explain his claim that such weapons exist:
With almost unlimited funds and the sharpest minds of the known scientific community, do you believe all the universities researching this have continuously failed for some 50 years? There are some 10 ways to ignite such a fusion ball besides the atomic bomb which is the only way the audience is generally aware of. Half of these could be useful in weapons applications. [….] 4th generation thermonuclear weapons and directed energy. Google (Swiss researcher) Andre Gsponer and read all his weapons-related material you can get.
If Gsponer’s work is the only source of this information, then of course the Finnish Expert’s “10 ways to ignite such a fusion ball” can only be viewed as highly speculative. As we have discussed above, Gsponer never claimed any inside knowledge of whether the technical possibilities he discussed had actually come to fruition. Nevertheless, the evidence and argument that the Finnish Military Expert presented proved influential enough that, in the aftermath of its publication, another sort of explosion ensued, disrupting and ultimately fissioning the so-called “911 Truth” movement.
The Finnish Military Expert’s views won influential support from Dr. Ed Ward (MD), who noted the many cancers that were beginning to emerge among first responders, and wrote:
The spectrum and percentages of cancer are massive. There are at least 4 classifications of blood-cell cancers: leukemia, lymphoma, Hodgkin’s and myeloma. There are many more classifications of soft tissue cancers. There is brain cancer. There is breast cancer. For most of these there are subclassifications of many different types of specific cancer in each, so far not publicly disclosed. There are huge percentages of respiratory distress and loss of function. Multiple reports of ‘irregular cycles’ (miscarriages?). Most likely there will be several more types of cancer to follow. In particular, responders should be checked for thyroid cancer and function. There has been no noting of birth defects which also needs to be done. There is one thing and only one thing that can cause all these cancers and problems – RADIATION.
Ward agreed with the view that a small pure-fusion weapon was probably responsible. However, William Tahil, a British engineering consultant, published a book-length analysis titled “Ground Zero” in 2006, countering that well-known fission technology was more likely responsible. Tahil focused on the issue of the superheated pools of molten metal found in the debris pile, reasoning that these long-lived underground heat sources could only be accounted for by the meltdown of nuclear reactors, which he claimed had been built into the basements of the WTC towers. Tahil reviewed a USGS study of the dust from the towers and found highly elevated and correlated levels of barium and strontium. Unfortunately, the USGS study did not reveal whether these were the radioactive isotopes of barium and strontium. Tahil argued that the barium and strontium in the dust must indeed have been highly radioactive fission products, and that the elevated level of zinc must have come from a special jacket designed to enhance the deadliness of the radiation.
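Tahil’s claim that the barium and strontium levels were “correlated” is a statistical one: across the dust samples, the two concentrations rise and fall together. The sketch below shows how such a correlation coefficient would be computed; the sample values are hypothetical placeholders, not the USGS measurements.

```python
# How a barium/strontium correlation across dust samples would be computed.
# The concentrations below (ppm) are illustrative placeholders, NOT the USGS data.
from statistics import correlation  # Python 3.10+

barium_ppm    = [420, 530, 610, 480, 700, 390, 560]    # hypothetical samples
strontium_ppm = [720, 910, 1050, 830, 1190, 660, 980]  # hypothetical samples

r = correlation(barium_ppm, strontium_ppm)
print(f"Pearson correlation, Ba vs Sr: r = {r:.3f}")
# A strong positive r (near 1.0) is what Tahil's argument rests on. Note that a
# correlation alone cannot distinguish fission products from, say, paint pigments
# and concrete aggregate that happen to co-occur in the same building materials.
```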
Another blogger, going by the handle of “The Anonymous Physicist”, attempted to debunk both Tahil and the “Finnish Expert”, claiming that small, tactical-style nuclear weapons were the explosive of choice. Such weapons would have been readily available from any major nuclear nation’s arsenal, and thus would not have required the exotic development that the “Finnish Expert” postulated, nor the generations of advance planning and secrecy that Tahil’s model implied. “Anonymous Physicist” was very critical of Tahil’s dust analysis, noting that the strontium could easily have been a component of the limestone aggregate in the concrete, while barium and zinc are common components of white paint. However, “Anonymous Physicist” acknowledged that even in his scenario there should have been easily detectable levels of radioactivity throughout Manhattan after 9/11, and he claimed that results to the contrary must have been faked.
Meanwhile, Steven Jones, famous for his earlier work on “cold fusion”, emerged in 2005 as an advocate of the thermite hypothesis. Many 9/11 activists enthusiastically welcomed Jones into the movement because of his prominent status as a full professor at Brigham Young University. Jones initially worked together with Dr. Jim Fetzer, another full professor, famous (or infamous) for his work on the JFK assassination, to create the “Scholars for 911 Truth” organization and its “Journal of 911 Studies”, a venue for the publication of “Peer Reviewed” scientific papers.
Jones and Fetzer quickly came into conflict, however, as Fetzer was willing to entertain a wide variety of theories about the causes of the WTC tower collapses, including not only the various nuclear theories, but also Judy Wood’s view that exotic directed-energy technology was involved; while Jones insisted that any such views were incompatible with the ‘scientific method’ as he saw it. Fetzer, for his part, argued that thermite and other conventional explosives couldn’t possibly have accounted for all the effects seen on 9/11.
As the conflict became heated, Fetzer attempted to dismiss Jones and his allies from their positions within the “Scholars” group, and in late 2006, Fetzer threatened legal action to take sole control of the group’s web sites. Jones founded an alternative group, “Scholars for 911 Truth and Justice”, which is said to have attracted a majority of the membership of the original group. Fred Burks brokered a truce in the legal battle, in which the original website became a disambiguation site with links to the alternative “Scholars” homepages. Jones’ faction kept control of the “Journal of 911 Studies”.
Over time, both Fetzer and Jones have displayed “Jekyll and Hyde”-like behavior, appearing very reasonable at times, and fierce or seemingly insane at others. This has led to accusations that both men are “controlled opposition”, or “Lifetime Actors”. One website critical of Fetzer noted that he lists his work on the JFK assassination and 9/11 on his academic curriculum vitae as “applied philosophical research”, and speculates:
When Dr. Fetzer meets up with his friends or old colleagues, he doesn’t need to play the role of a crackpot whose cognitive powers are in decline, leading him to fall for preposterous hypotheses whilst developing an irresistible urge to broadcast his weird beliefs to the world. He simply tells them that his “applied philosophical research” includes his role in infiltrating “conspiracy theorists”.
Jones, for his part, has developed an interest in “over-unity” research, premised on the implausible belief that simple electronic oscillator circuits can somehow draw significant amounts of power from the quantum foam, or some other unknown and mysterious source. Much earlier, Jones’ top-notch research on muon-catalyzed cold fusion took place against the background of his far less credible research purporting to show that Jesus visited America.
The net effect of Fetzer and Jones’ antics, however, has been to convey the impression that even the best-qualified 9/11 conspiracy researchers can safely be dismissed as nut-cases — while at the same time consistently drawing attention away from the possibility that a pure fusion weapon was used.
As incredible as it may seem from a viewpoint limited to technology in the public domain, pure fusion is the only explanation that is fully compatible with the data that has been released from credible and/or official sources. Fetzer and others in the conventional nuclear explosives camp are forced to ignore or deny the data indicating very low or negligible levels of residual radioactivity in the debris, beginning with the earliest measurements taken; while Jones and his followers are forced to obfuscate or evade the many lines of evidence pointing towards nuclear effects, including the evidence of extraordinarily high temperature reactions, the “dustification” of so much of the debris, and the pattern of radiation-induced cancers in first responders.
[1] Given an air density of 1.22 kg per cubic meter, and an estimated volume of air of 1.5e6 cubic meters per tower, the mass of air in each tower would be about 1.8e6 kg. With a stoichiometric ratio of 15 kg of air for each 1 kg of fuel, that air could burn roughly 120 tons of fuel. If the energy available in the building contents is 10 gigajoules per ton, we would obtain approximately 1200 gigajoules of energy, or about 340,000 kWh (equivalent to roughly 290 tons of TNT). This is an approximate upper limit on the energy that could have been released by accelerated combustion, but the real figure is probably much lower: stoichiometric combustion is an ideal that can only be approached in highly controlled conditions. A more reasonable estimate might be in the range of 20% to 50% of this upper bound.
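The footnote’s arithmetic can be reproduced directly from its stated inputs, as in the short script below.

```python
# Reproduces the footnote's accelerated-combustion upper bound from its stated
# inputs (air density, air volume, stoichiometric ratio, heat content).

AIR_DENSITY = 1.22        # kg/m^3
TOWER_AIR_VOLUME = 1.5e6  # m^3, footnote's estimate for one tower
AIR_FUEL_RATIO = 15       # kg of air per kg of fuel (idealized stoichiometry)
FUEL_ENERGY = 10e9        # J per metric ton of combustible contents
KWH = 3.6e6               # J per kilowatt-hour
TON_TNT = 4.184e9         # J per ton of TNT

air_mass = AIR_DENSITY * TOWER_AIR_VOLUME      # kg of air per tower
fuel_tons = air_mass / AIR_FUEL_RATIO / 1000   # metric tons of fuel burnable
energy = fuel_tons * FUEL_ENERGY               # J, upper bound

print(f"Air mass per tower:  {air_mass:.2e} kg")
print(f"Fuel burnable:       {fuel_tons:.0f} t")
print(f"Upper-bound energy:  {energy / 1e9:.0f} GJ"
      f" = {energy / KWH:,.0f} kWh"
      f" = {energy / TON_TNT:.0f} t TNT")
print(f"At 50% of the bound: {0.5 * energy / TON_TNT:.0f} t TNT")
```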