The job of defence editor has many perks: flying in a Gripen fighter jet, prowling the Black Sea on an American destroyer and roaming the corridors of the Pentagon. The latest is getting to watch “Oppenheimer” on its release day. On Friday I’ll participate in a panel discussion on nuclear weapons after a special screening of Christopher Nolan’s keenly awaited biopic of the nuclear scientist who played a key role in the creation of the atomic bomb.
To prepare, I have been re-reading my worn copies of Richard Rhodes’s “The Making of the Atomic Bomb” and Kai Bird and Martin Sherwin’s biography “American Prometheus”. Oppenheimer, the director of the Los Alamos laboratory during the second world war, became a prominent campaigner for a ban on nuclear weapons and against the development of a hydrogen bomb.
These themes have obvious contemporary echoes. Just as many nuclear scientists, such as Andrei Sakharov, a Soviet physicist, turned against the bomb, so too are pioneers of artificial intelligence expressing concerns over the safety of the technology they have developed. Both subjects—nukes and AI—raise the question of existential risk and how we measure it. My colleague Arjun Ramani has reported on a fascinating new study which asks why “superforecasters”—those with a track record of accurate predictions about the future—typically express less concern than subject-matter experts over the prospect of an apocalypse caused by nuclear weapons, AI or pathogens.
Oppenheimer himself sought international controls on nuclear weapons, expressing sympathy with the idea of a global government. “The basic idea of security through international co-operative development has proven its extraordinary and profound vitality,” he wrote in an otherwise gloomy essay in Foreign Affairs in 1948. Leaders in AI have long found inspiration in nuclear analogies and the role of the International Atomic Energy Agency (IAEA). Could, for instance, an AI agency monitor computer-processing power in the same way that the IAEA scrutinises fissile material?
Yet the analogies are not especially encouraging. North Korea launched another nuclear-capable missile on July 12th. The war in Ukraine has not been great for the nuclear order. In recent weeks, prominent Russian political scientists like Sergey Karaganov have urged the use of nukes against America (though others have pushed back). Arms control between America and Russia, which was shaky pre-war, is breaking down more quickly.
And, as Oliver Carroll explained in his excellent dispatch from southern Ukraine, there are also worries over the safety of the Zaporizhia nuclear power plant. It is occupied by Russia but is in the path of a Ukrainian counter-offensive, which is being slowed down by minefields. The plant is more secure in its construction than Chernobyl, but Ukraine is worried that Russia might manufacture a disaster. At the NATO summit last week, I heard Ben Wallace, Britain’s defence secretary, compare the threat to a “dirty bomb”.
Finally, no discussion of “Oppenheimer” would be complete without a mention of “Barbie”, whose release on the same day has unleashed a flurry of “Barbenheimer” memes. The two movies offer a stark choice, as we explore in our Culture section: realism or escapism? But I confess I never thought that the movie about the pink doll would prove more contentious than the one about nuclear weaponry. Vietnam has banned “Barbie” in the (mistaken) belief that a map shown on screen depicts the “nine-dash line”, which demarcates China’s claim to the South China Sea. Republican senators are now involved; this morning I even saw a law professor weigh in on the debate. I stand ready to chair any post-screening panels on this vital subject.
Thank you for reading. We are keen to hear your thoughts on “Barbenheimer” and any other feedback. You can reach us at: thewarroom@economist.com.
Shashank Joshi
Defence editor