By the start of World War II, Japan, Germany, and the Soviet Union all had active atomic weapons programs. The United States simply had the will, the money, and the industrial base to make it happen first.
It took the U.S. government only two years to build an industrial manufacturing system larger than the U.S. automobile industry for the purpose of creating atomic bombs. It built three cities from scratch. It employed more than 130,000 people. Uranium fission was first demonstrated in the last month of 1938. On July 16, 1945—barely six and a half years later—the first atomic bomb was detonated in a test in New Mexico. It cost about two billion dollars—equivalent to about twenty-six billion dollars today.
And though the Manhattan Project was classified, the basic science that made it all possible was not. The American government was running scared, fearful that the Germans or the Japanese might develop the bomb first. By the end of the war, the U.S. government had no illusions about being able to keep the nuclear genie confined to its bottle. The necessary knowledge was widespread. In fact, the U.S. monopoly on the atomic bomb lasted only four years, with the Soviet Union setting off its first atomic bomb in 1949.
The sort of atomic bomb that was dropped on Hiroshima, Japan, in 1945 is a remarkably simple device—which is why it is impossible to keep other nations from building one. But while the bomb itself is simple, obtaining the necessary explosive nuclear material is not.
Uranium is common in the Earth’s crust. Most of it is Uranium-238, which cannot sustain an explosive chain reaction. Mixed up with it, however, is a small fraction—about 0.7 percent—of the fissile isotope Uranium-235.
The separation of U-235 from U-238 is incredibly difficult and expensive. The percentage of U-235 you can get in your uranium determines whether you have created something that works well as fuel for a nuclear reactor to generate electricity or material that can become a bomb. If you can get it from its natural level of less than 1 percent up to, say, 20 percent, you can fuel nuclear power plants. A nuclear power plant cannot explode like a bomb. For an explosion, you need much greater purity—at least 80 percent or so—and you need a lump at least as big as a cantaloupe. This is hard to do.
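Part of why enrichment is so expensive is simple arithmetic: almost all of the mined uranium ends up discarded. A minimal sketch of the mass balance in Python, assuming a natural feed assay of 0.72 percent and a tails (waste) assay of 0.25 percent—both commonly published figures—with a function name of my own invention:

```python
def feed_per_kg_product(product_assay, feed_assay=0.0072, tails_assay=0.0025):
    """Kilograms of natural uranium feed needed per kilogram of enriched
    product, from a simple U-235 mass balance:
        F * x_f = P * x_p + (F - P) * x_w
    solved for F with P = 1 kg."""
    return (product_assay - tails_assay) / (feed_assay - tails_assay)

# Reactor-grade (20 percent) versus bomb-grade (90 percent) product:
print(round(feed_per_kg_product(0.20)))  # ~42 kg of natural uranium per kg
print(round(feed_per_kg_product(0.90)))  # ~191 kg of natural uranium per kg
```

This counts only the raw material; the separative work—the actual running of the centrifuges, and the bulk of the cost—grows faster still as the target purity rises.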
A big enough lump of bomb-grade uranium is called a critical mass.
And that is why I say an atomic bomb itself is a relatively simple device. Once you have enough bomb-grade uranium packed together, it can’t help but explode. The match needed to light the candle, as it were, is simply having enough of it in one spot.
The Hiroshima bomb was just a gun barrel packed with cordite and loaded with a bullet aimed at a target. The bullet was a lump of U-235 that was “sub-critical”; that is, it wasn’t big enough to explode on its own. At the end of the barrel was another sub-critical lump of U-235, the “target.” To make the bomb blow up, the two lumps simply had to be brought together. The moment the two pieces met, the combined lump became critical, setting off a cascading, very quick, extremely violent chain reaction.
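How quick is “very quick”? A back-of-the-envelope sketch in Python, assuming each fission generation doubles the number of fissions (the real multiplication factor depends on the geometry of the lump) and a generation time of roughly ten nanoseconds, a commonly cited figure:

```python
import math

ATOMS_PER_KG = 1000 / 235 * 6.022e23  # ~2.6e24 U-235 atoms in one kilogram
K = 2.0                               # assumed doubling per fission generation
GENERATION_TIME = 1e-8                # ~10 nanoseconds per generation (assumed)

# Generations needed for the cascade to consume a full kilogram of U-235
generations = math.ceil(math.log(ATOMS_PER_KG, K))
elapsed = generations * GENERATION_TIME

print(generations)  # ~82 doublings
print(elapsed)      # well under a microsecond
```

Roughly eighty doublings, over in less than a millionth of a second—which is why, once the two lumps meet, nothing can stop the explosion.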
That’s why barely functioning societies such as North Korea can build and successfully detonate a nuclear bomb. That’s why the fear of a terrorist organization building a bomb is not unreasonable: if they can get enough fissile material, nothing stands in the way of building a bomb.
And that’s why the world worries so much about Iran and its nuclear program. Iran has the technology to separate U-235 from U-238. If the Iranians stop purifying at 20 percent, then they merely have the fuel needed for nuclear reactors to create electricity, and such material cannot by itself explode. However, the fact that they can “distill” U-235 to the purity needed to generate electricity means they can move from that purity to bomb-grade uranium with relative ease. So have they really stopped at the making-electricity level? And can their assurances be trusted?