Guest post: Did bombing during second world war cool global temperatures?
While the atomic bombs dropped on Hiroshima and Nagasaki – on 6 and 9 August 1945, respectively – have gone down in history as the first use of nuclear weapons in warfare, what is less well known is that they were part of a larger bombing campaign by US B-29 Superfortress bombers.
Between 3 February and 9 August 1945, these air raids burned an area of 461 square kilometers across 69 Japanese cities, including Hiroshima and Nagasaki, killing 800,000 civilians. The smoke produced by Hiroshima and Nagasaki made up less than 5% of the total.
Although our results could not formally detect a cooling signal from second world war smoke, this does not invalidate the nuclear winter theory that the much more massive smoke emissions from a nuclear war would cause large-scale climate change and impacts on agriculture.
There are many analogues that support parts of nuclear winter theory – not least the way in which major volcanic eruptions create long-lasting clouds of particles in the stratosphere, cooling the Earth and reducing rainfall. The 1815 Tambora eruption in Indonesia, for example, caused the "Year Without a Summer" in 1816, bringing crop failures and food shortages across the northern hemisphere.
Since the end of the Cold War in the early 1990s, the global nuclear arsenal has been reduced by a factor of four. The world currently possesses about 14,000 nuclear weapons, distributed among nine nations – the US, Russia, France, the UK, China, India, Pakistan, Israel and North Korea.
Yet our climate model simulations show that these would still be enough to produce a nuclear winter – and that even 1% of them could cause climate change unprecedented in recorded human history.