YIELD/EFFICIENCY
Processes > Bomb Testing and Weapon Effects
The yield of an atomic bomb is the amount of energy released by an explosion, including blast, thermal radiation, and nuclear radiation.
Yield is usually measured in terms of the amount of conventional explosives (TNT) that would be required to produce a similar amount of energy.
The units used are equivalent kilotons (KT—where one kiloton is the energy released by one thousand tons of TNT) or megatons
(MT—one million tons of TNT).
Fission weapons and smaller fusion weapons
are generally measured in kilotons, while larger fusion weapons are measured in megatons. The Trinity test had a yield of about 20 kilotons, as did the bomb dropped
on Nagasaki; the Hiroshima bomb yielded about 15 kilotons. The
first hydrogen bomb, Ivy Mike, was detonated on November 1, 1952 with a yield of
10 megatons, while the largest bomb ever tested was a Soviet weapon detonated in 1961 and nicknamed Tsar Bomba ("King of Bombs"), with a yield of about 58 megatons.
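The units above can be made concrete with a short sketch. It assumes the standard convention that one kiloton of TNT equivalent is defined as 4.184 × 10¹² joules; the function names are illustrative, not from any particular source.

```python
# Yield unit conversions.
# Assumption: 1 kiloton of TNT equivalent is defined as 4.184e12 joules.
KILOTON_J = 4.184e12          # joules per kiloton (by definition)
MEGATON_J = 1000 * KILOTON_J  # 1 megaton = 1,000 kilotons

def kilotons_to_joules(kt):
    """Convert a yield in kilotons to joules."""
    return kt * KILOTON_J

def megatons_to_kilotons(mt):
    """Convert a yield in megatons to kilotons."""
    return mt * 1000

# The ~20-kiloton Trinity test released roughly 8.4e13 joules;
# a 58-megaton weapon is equivalent to 58,000 kilotons.
print(kilotons_to_joules(20))
print(megatons_to_kilotons(58))
```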
The yield of a given amount of fissionable material is never equivalent to the total energy that could potentially be released.
The efficiency of an atomic bomb is defined as the ratio of the actual yield to the theoretical maximum yield. The fission of one pound of
uranium or plutonium will release the same amount of explosive energy as about 8,000 tons of TNT. In a 20-kiloton nuclear weapon,
2.5 pounds of material undergo fission. The actual weight of uranium or plutonium in such a weapon, however, is greater than this amount.
In a fission weapon, only part of the nuclear material undergoes fission, so the efficiency is less than 100 percent.
The material that has not undergone fission remains in the weapon residues after the explosion. The Trinity device used about 13.5 pounds
of plutonium.
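The figures above are enough for a back-of-the-envelope efficiency estimate. The sketch below assumes, per the text, that one pound of fully fissioned material yields about 8 kilotons (8,000 tons of TNT equivalent); the function is illustrative.

```python
# Estimating fission-weapon efficiency from yield and fissile mass.
# Assumption (from the text): 1 lb of fully fissioned material ~ 8 kilotons.
KILOTONS_PER_POUND = 8.0

def efficiency(yield_kt, fissile_mass_lb):
    """Fraction of the fissile material that actually fissioned."""
    fissioned_lb = yield_kt / KILOTONS_PER_POUND
    return fissioned_lb / fissile_mass_lb

# Trinity: ~20 kilotons from about 13.5 pounds of plutonium,
# i.e. 2.5 pounds fissioned, or an efficiency of roughly 18.5 percent.
print(efficiency(20, 13.5))
```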
Efficiency is highly dependent on the design of the bomb. Proper use of explosives in
assembling the critical mass can increase bomb efficiency, as can the use
of tampers, which hold the core together so the chain reaction does not
end prematurely, and initiators, which ensure the reaction begins at the optimal moment.