Extreme Risk
What an Officer of the Soviet Strategic Missile Forces Taught Us About Extreme Risk
Martin Bartels
9 August 2020
Almost nobody noticed what happened on 26 September 1983
In 1983 the Cold War had reached a new peak. The great nuclear powers had found no way out of the logic of mutual threat: complete destruction, and not only of the two countries themselves. Both superpowers followed the "MAD" principle ("Mutually Assured Destruction").
https://www.britannica.com/topic/mutual-assured-destruction
That meant that both of them had the capability to obliterate the other side and could launch a powerful second strike even after the destruction of their own missile bases. This equilibrium meant that security was guaranteed as long as nothing unforeseen upset the balance of power. The Soviet Union relied on "Oko", a satellite system designed to detect the launch of ballistic missiles from US territory. On 26 September 1983 Lieutenant Colonel Stanislav Yevgrafovich Petrov
https://commons.wikimedia.org/wiki/File:Stanislaw-jewgrafowitsch-petrow-2016.jpg?uselang=ru
was the duty officer of the Serpukhov-15 bunker command centre.
https://en.wikipedia.org/wiki/Serpukhov-15
Shortly after midnight the Oko system reported the launch of first one and then four more intercontinental ballistic missiles from bases in Montana.
Once a nuclear attack was detected, the Soviet leadership had 28 minutes to decide on a counterattack. The logical consequence of a confirmed nuclear attack against the USSR would have been the launch of the country's entire land-based arsenal against the United States and NATO members, which would immediately have triggered lethal NATO counter-strikes from submarines.
The officer suspected a false alarm, but he did not have enough information to be certain. He decided to deviate from military protocol: he reported a warning-system failure to the Soviet leadership rather than his professional assessment.
His report ensured that there was no "retaliation". The report also turned out to be factually correct: the warning system had misinterpreted solar reflections on clouds near Malmstrom Air Force Base as missile launches.
The incident became known to the public only in 1998. Stanislav Petrov was surprised by the numerous honours and expressions of deep gratitude from all over the world. He lived modestly in Fryazino (Moscow Oblast) until his death on 19 May 2017.
https://www.youtube.com/watch?v=quM5obcn8R0
Petrov’s criteria
Several interviews conducted many years later show clearly the considerations that guided Stanislav Petrov during the decisive 28 minutes:
- He was aware that a nuclear exchange between the US and the USSR would quickly wipe out vast parts of the planet and contaminate the rest.
- The firing of only five missiles from Montana seemed implausible to him.
- He had doubts about the reliability of the Oko system.
- It was clear to him that reporting uncertainty about the warning system's alarm signal (i.e. the truth) would irrevocably push the leadership towards an unpredictable decision. At the top stood Yuri Andropov, who was already terminally ill.
- It was clear to him that by filing a report that was, strictly speaking, incorrect at the time, because it stated his assessment as a fact, he alone would predetermine the military decision-making process, and there would be no Soviet nuclear strike.
The statements Stanislav Petrov made years later are available on the Internet. They reveal a professional with a strong sense of responsibility whose human integrity was beyond doubt.
What they do not reveal is the extreme tension this man endured during those 28 minutes while still managing to make rational decisions.
https://www.youtube.com/watch?v=quM5obcn8R0&t=219s
Heroism
Heroes appear in critical situations such as wars or natural disasters. They fight for ideals, right or wrong, often for years. Some sacrifice their lives and are revered afterwards. Sometimes reputations are re-evaluated post mortem because societies' ideals change.
A decision-making process of 28 minutes that saves humanity does not fit the classic model. Stanislav Petrov saw himself as a rational decision-maker, not as a hero. The need to avert danger from humanity was the criterion that took precedence in his mind when he filed a report contrary to his military duty. His transgression saved us all.
Stanislav Petrov’s action has not been and will not be re-evaluated in the future.
A look into the uncertainty abyss
The Oko risk was unique in its magnitude. However, other hazardous situations can also reach enormous scale. They involve a conflict between processes established beforehand, with great authority and on the basis of comprehensive risk analysis, and factors that nobody had taken into account. These factors can be remarkably banal.
An example is the incident at the Forsmark (Sweden) Nuclear Power Plant on 25 July 2006, where a short circuit in the electrical system, too simple to have been included in the planned risk scenarios, triggered an initially uncontrollable chain of events. The engineers prevented a meltdown, but only with difficulty.
Once again, the rescuers were professionals who, despite a carefully formulated set of rules supposedly covering all eventualities, brought the situation under control and prevented the meltdown just in time.
https://en.wikipedia.org/wiki/Forsmark_Nuclear_Power_Plant
History shows that in many cases the causes of the greatest dangers arise at a surprisingly low technical level.
https://www.history.com/news/historys-worst-nuclear-disasters
However, the small number of documented cases does not allow us to conclude that such "absurd causes" follow any statistical regularity.
Does human discretion make the world safer?
Let us start with the simple side of this question: when a series of numbers is so long and convincing that even the most vigilant statistician would classify it as solid, we can compare degrees of certainty under different conditions.
A contemporary example is the question of the safety of autonomous driving: for almost every country we have plenty of data on the frequency and causes of car accidents under existing circumstances (car types, technical equipment, road conditions, weather, road signs, drunk driving, the driver's state of health . . .). These figures make it possible to devise measures that correct the factors identified as dangerous and so reduce the number of accidents. This works well.
The idea of letting go of the steering wheel and putting a computer in charge of the car may make some of us shiver. However, as soon as enough statistical data is available on the new technical environment (e.g. the failure rate of fast Internet connections), we approach the point at which autonomous driving becomes statistically safer than steering by licensed human drivers. Carefully compiled statistics make it possible to deal with autonomous driving responsibly. The pragmatic outcome may be that we hand the steering wheel over to the computer and intervene only in certain exceptional situations.
At the other end of the spectrum lie the enormous risks. Here there can never be enough statistical data to reduce the uncertainty to the point where we can truly sleep easily. Certainly, it improves safety to analyse the known technical correlations and to document the presumably correct processes in such a way that the technicians in charge are less exposed to error.
On the other hand, such rulebooks can also pose an additional danger: we humans have a strange tendency to trust rules too easily and too willingly. Even if (or because) they are very complicated and even incomprehensible, we give too much credit to those who wrote them, and we easily develop an unfounded feeling of security.
The belief in the wisdom and protective power of sophisticated rulebooks and hierarchies is a refutable belief. It has been refuted time and again. And it will be refuted again.
People with reasonable knowledge and experience, a sound ethical awareness and robust nerves, who can quickly identify measurement errors and implausible technical sequences of events and who understand the relative value of predefined processes must have the authority to intervene. This increases security considerably but does not guarantee it.
Better stay away completely
We must not rely on the hope that there will always be a Stanislav Petrov ready to intervene. Functionally, hope is the expectation of something desirable that is unlikely; it therefore has no role to play in risk management. If a risk can only be mastered with luck or the exceptional skills of specially qualified and ethically strong human beings, we fare best if we do not take it at all.
To set the stage
Lao Tzu’s words sum up a dramatic contemporary scenario: While in some parts of the world people are increasingly affected by water scarcity, others face the growing threat of too much water due to extremely heavy rainfall and rising sea levels.
While the poem captures the ambivalence of water perfectly, the words "soft and weak" also seem to describe the way modern civilisations have responded to it. Their foggy perception and sluggish action are just as dangerous as the threats themselves.
Why Water?
The focus of this essay is to use the prominent example of water to help identify concrete approaches for dealing rationally with the issue of climate change. Climate change affects us in many ways, including the expansion of deserts, forest fires, the salinisation of soils, landslides, extreme weather events, agricultural crop losses, loss of biodiversity, spread of disease and human and wildlife migration.
Scientists and engineers have laid the foundations for our prosperity. And only these elites can show us the way to overcome the harmful externalities of these very engines of our wealth. This article supports the thesis that we are technologically and organisationally in a position to successfully meet these challenges, step by step.
One obstacle to the mobilisation of existing resources lies in the fact that the general public has only a vague understanding of the issue. They do not realise that, unless we make controlled sacrifices, nature will impose uncontrollable sacrifices on us.
We urgently need to overcome the human tendency to trivialise, and to understand with our minds and hearts what will happen if we do not listen to the guidance of our scientists and engineers. However, while these experts hold the keys to the right strategies, they are trained to communicate only with other scientists. The result is misunderstanding and, in turn, a lack of adequate action.
Blurred perception of facts
Every day, we are all exposed to an overdose of reports about minor and major disasters in all forms of media. We more or less defend ourselves against this by ignoring some news, i.e. reducing the strain on our nerves by filtering information. It is human nature to rely on the mostly correct assumption that unpleasant developments will eventually end and change for the better. In the case of climate change, however, looking away and hoping things resolve themselves doesn’t appear to be a winning strategy.
A wealth of scientific analyses on climate change is available to everyone, but these are mostly comprehensible only for other scientists.
We should openly acknowledge that most people in the northern hemisphere have a sense of empathy for people "in the south" who are plagued by overpowering rains, flooded lowlands, islands disappearing into the water, eroding coastlines or droughts. However, the geographical distance and lack of awareness of the frequency of such disasters dilute solidarity. Collective psychological repression can set in quickly.
Most people in the northern hemisphere do not consider an increase in average temperatures of a few degrees alarming. Many even express relief that winters are often milder than in the past. Loud protests by campaigners are experienced by most citizens as a disturbance, or perhaps as exaggerated fearmongering.
At the level of policy, scientifically informed decision-makers attend international conferences on climate change, where they negotiate with other decision-makers on action plans that have no teeth but are presented as hard-won progress. And they are increasingly supporting “green” sectors of the economy. However, they are often reluctant to share the full extent of their knowledge about the problem because they do not want to jeopardise their recognition by “rocking the boat”.
The factual impact level is decisive for citizens
There is controversy about the interplay of causes of climate warming (industrial emissions, volcanic activity, ocean currents, etc.). We don't want to debate that here. What is more relevant are the changes in global average temperatures and their trends, as determined by scientific methods.
Instantaneous interruption or reversal of a climatic process?
Changes in climate are not new in human history, and certain events have triggered drops in temperature. A striking example of a break in climatic development is the eruption of an Icelandic volcano in 536 CE, whose dust made the atmosphere in the northern hemisphere so opaque to sunlight that temperatures fell drastically for decades (the "Late Antique Little Ice Age").
Recently, it has been hypothesised that ice ages were triggered by asteroids.
It may be tempting to pin our hopes on such events mitigating climate change, but events of this kind are rare and unpredictable; while we cannot rule them out, we must not include them in projections. It would be absurd to count on random external causes interrupting or stopping the progress of global warming. While hope is a human propensity, it is not suitable for contingency planning.
Our real bottleneck
What is preventing us from taking appropriate action to minimise and reverse the rise in average temperatures?
Citizens' perception of the nature and dimension of the threat is inevitably blurred, because daily media reports are mostly unstructured and incomprehensible to non-scientists. They do not allow us to recognise the essentials.
Citizens need an overview that is communicated in an honest, understandable and clearly structured way. Only when citizens have realised the nature and scale of the problem will decision-makers have the courage to take action with determination. In essence, it is about legitimising protection strategies that are considered unpopular today.
Given that citizens do not have access to graspable knowledge, we have a translation problem. It can be overcome if science presents the overall scenario from a certain distance. Figuratively speaking, it is not about describing every pixel of an image, but about showing the image as a whole. This holistic representation deviates from the usual approach of scientists, because each of them is professionally bound to focus on the "pixels" of their own specialisation. That is the only way science makes progress, but it is not what is needed here.
The contours of the hologram can be communicated in an understandable way using, for example, the key points mentioned above:
Where the effect of a detail is hard to read, the presentation of the measurement can be improved. In particular, the exponential impact of very small changes in average atmospheric temperatures runs counter to human intuition. We can compensate for this perceptual disadvantage: instead of expressing temperature changes in degrees Celsius, we should consistently communicate them in basis points, i.e. in hundredths of a degree Celsius. For example, labelling a temperature rise as "32 basis points" would be just as correct as "0.32 °C" but would make the difference easier to comprehend. This is common practice in the financial industry, where the same method of representation helps raise awareness that a small change can have massive implications.
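The proposed conversion is simple enough to automate. Here is a minimal sketch; the function name is illustrative, not an established convention:

```python
def celsius_to_basis_points(delta_celsius: float) -> int:
    """Express a temperature change in basis points,
    i.e. hundredths of a degree Celsius."""
    # round() guards against floating-point artefacts
    # (e.g. 0.32 * 100 evaluates to 32.000000000000004).
    return round(delta_celsius * 100)

# A 0.32 °C rise reads as "32 basis points".
print(f"{celsius_to_basis_points(0.32)} basis points")
print(f"{celsius_to_basis_points(1.5)} basis points")
```

The point of the representation is purely psychological: "32" registers as a substantial number where "0.32" reads as negligible, even though both denote the same physical change.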
Comparing our planet with the human body helps us comprehend the effect of changes in temperature: if your body temperature rises by 1 °C, you have a fever and are not feeling well. If it rises by 1.5 or even 2 °C, you are very ill and hardly able to work. It is similar with our planet: increases in average temperature of this magnitude are the symptoms of a "serious illness". However, this "fever" does not go away after a few days.
A truthful and comprehensible holographic description will work like a call to action, because sensible citizens will refuse to accept that their lives, or those of their children and grandchildren, will be exposed to significant and unparalleled danger.
Here is a simple illustration of a call to action: the onset of toothache does not necessarily trigger an immediate reaction. Perhaps we still hope it will go away on its own, but at some point we turn to the dentist for help. We may later find the dentist's bill stressful, but the relief of having the problem solved outweighs it. To take the initiative, we have to anticipate the greater pain that lies ahead.
Governments will only act vigorously when informed citizens demand it vigorously. There has been pressure from sections of the population for a long time, but its direction has always been vague and therefore not sufficiently effective.
And like a dentist, a government cannot act for free, but will send bills to taxpayers. The later the comprehensive strategy is implemented, the higher the bill.
Defensive and offensive measures
The necessary government action plans are not the subject of this article. It should only be mentioned that defensive measures are necessary first, e.g., improved meteorological warning systems, raising and strengthening of dams and dykes on the sea coast and rivers, preparation for the abandonment of non-defensible areas. In addition, measures are needed to halt the dangerous trend and then slowly reverse it. These essentially consist of avoiding emissions and removing greenhouse gases from the atmosphere.
Desperate measures?
The keyword for desperate actions is "geoengineering". This covers approaches such as making the atmosphere or the oceans absorb less sunlight or bind more CO2. While these approaches sound exciting, they are not fully developed and run the risk of causing irreversible damage. As such, it is unlikely they will be used.
Sabotage of the communication of scientific work
There are two groups working against open and fair communication between science and the citizens.
The first group are the refuseniks, who are not interested in facts. They are used to believing their own feelings and those of their friends on social networks. There should be no discussion with them, because deviations from their assumptions only act as fuel. Science will not lead them out of their dream worlds.
Then there are the sceptics, who may have expert knowledge but select only those parts of it that seem to support their rejection of action. They are a dangerous species, because "expert" sceptics can claim some credibility and successfully disrupt societal communication. The only way to weaken them is to persistently ask for better, well-founded alternatives. Then they must either provide verifiable answers or quietly hoist the white flag.
Acknowledgements:
My heartfelt thanks go to Professor Reinhard Gast. As a practising geologist and experienced researcher, he has helped me to grasp the exponential impact of seemingly minimal changes in the temperature of our atmosphere, similar to our own bodies, and the uniqueness of the current situation.
Authorship disclosure:
Fully human generated