Nuclear Power Accidents & Our Ability To Predict Peak Oil Impacts
by Charles Cresson Wood   
26 March 2011
Image: Three Mile Island
In response to the recent tsunamis, the resulting nuclear power plant breakdowns, and the ensuing environmental releases of radioactive materials, one Japanese government official claimed that contingency plans “failed to anticipate the scale of the disaster.” In 2001, Australian nuclear engineer Tony Wood indicated that probabilistic risk assessments (PRAs) failed to anticipate the events that led to the world’s worst nuclear disaster, at Chernobyl in 1986 in what is now Ukraine. He also indicated that PRAs did not anticipate the worst reactor accident in the UK (at Windscale in 1957), nor the worst nuclear accident in the USA (at Three Mile Island, Pennsylvania, in 1979). There appears to be a pattern here.

Probabilistic risk assessment involves estimating the probability that a serious threatening scenario will actually take place. Although some related ISO (International Organization for Standardization) standards have been released, there are no generally agreed-upon standards for conducting PRAs. Likewise, there is no requirement that completed PRAs be updated in light of new information — such as the problems encountered in Japan. Furthermore, PRAs do not need to be accurate, and according to a 2002 report written by the (US) Nuclear Regulatory Commission (NRC), the quality of these risk assessments varies considerably from one nuclear power plant licensee to another.

Typically, PRAs are based on many assumptions and subjective estimates, and the combination of all these factors, not surprisingly, can be way off the mark. Even credible sources can come up with unbelievably optimistic estimates. For example, a 2003 multi-disciplinary study done at the Massachusetts Institute of Technology (MIT) estimated that the risk of an accident damaging the core of a nuclear reactor in the US was about 1/10,000 per reactor per year. The Japanese nuclear disaster reminds us that the likelihood is in fact much greater than this study indicates.
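Even taken at face value, a figure like 1/10,000 per reactor per year is less reassuring than it sounds once it is multiplied across a whole fleet of reactors and several decades of operation. The short calculation below is a back-of-the-envelope illustration only; the fleet size and time horizon are assumptions chosen for this sketch, not numbers from the MIT study or from this article.

# Back-of-the-envelope check of what a core-damage frequency of
# 1/10,000 per reactor per year implies. The fleet size and time
# horizon are illustrative assumptions, not figures from the MIT study.

p_per_reactor_year = 1.0 / 10_000   # estimate cited in the text above
reactors = 440                      # assumed worldwide reactor fleet
years = 40                          # assumed operating window

reactor_years = reactors * years
expected_events = reactor_years * p_per_reactor_year

# Treat each reactor-year as an independent trial; common-cause events
# such as the Japanese tsunami make even this an optimistic assumption.
p_at_least_one = 1.0 - (1.0 - p_per_reactor_year) ** reactor_years

print(f"Expected core-damage events: {expected_events:.2f}")
print(f"Probability of at least one event: {p_at_least_one:.1%}")

Under these assumed inputs the arithmetic still yields nearly two expected core-damage events over the period, and a probability of more than 80 percent that at least one occurs; the actual record of Three Mile Island, Chernobyl, and Fukushima suggests the true rate is higher still.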

So why are these PRA estimates so wildly optimistic? There are a number of serious problems with this risk assessment approach, but this author specifically calls out five problems below. It is of note that all of these problems also apply to peak oil, and to the disastrous consequences that we are all on track to experience unless businesses, non-profits, and governments wake up to the very serious dangers that peak oil poses. These peak oil dangers include: a precipitous fall-off in demand for products and services, unexpected supplier bankruptcies, dramatic stock market crashes, financial system lock-ups, widespread unemployment, localized famines, and serious civil unrest.

Image: BP Deepwater Horizon

The first of these problems has to do with who actually conducts a PRA. In many cases, employees or consultants paid by the organization promoting a nuclear power plant are the ones who conduct the risk assessment. A bias toward their benefactor is no doubt built into the assessments performed by these analysts. The pressure is for the risk assessment to serve as a marketing tool, rather than an objective review of the actual risks involved. The way to get around this bias is to have independent third parties, such as government regulators, either perform the analyses themselves or hire independent expert risk assessment consultants to do the work. A process to establish true independence, rather than sham independence, also needs to be in place.

The second problem involves groupthink, where established organizational biases color the way that the analysts look at various threat events such as a nuclear accident. It should not be surprising that the chosen risk analysts are often insiders, and/or are known by, and accepted by, those who would be assessed. Since it may adversely affect their careers, these insiders are loath to “rock the boat,” and loath to be the messengers bringing bad news. The fact that nuclear plants are run by utilities, which are for-profit operations, means they are under great pressure to keep costs down, and this too may cause the operators to choose insiders, particularly those who can offer the lowest-cost deliverables. Thus the groupthink, augmented by efforts to minimize costs, will in turn lead to cutting corners whenever possible. Cutting corners in a PRA is particularly dangerous because the result is likely to be that top management is not aware of what they don’t know. This insider approach can lead to “surprising events,” where top management claims that they couldn’t imagine that something like this would happen (as was apparently the case with the Japanese official mentioned at the beginning of this article).

A third problem involves the increased variability surrounding the occurrence of rarely encountered events. The world is now going into a phase of increasing volatility in many sectors. The financial meltdown of 2007-2008 showed that the economic world is becoming more variable in its ups and downs. Hurricane Katrina revealed the climate variability that we are increasingly experiencing around the globe. The recent revolutions in Egypt and Tunisia indicate how the public in many countries has become increasingly unpredictable in its acceptance of governmental control. The increasing incidence of ocean-going piracy, now taking place off the coast of Somalia, indicates that the delivery of oil to major oil-consuming nations like China is becoming increasingly unpredictable. Wars fought over oil, such as the US invasion of Iraq, likewise indicate that the supplies of fossil fuels are increasingly tenuous, and that the availability of these fuels will in the future be more variable than has been the case over the last few decades. These and many other types of variability need to be more directly incorporated into PRAs — that is, if the PRAs are going to be anywhere close to accurate.

A fourth problem that has caused PRAs to be wildly off the mark is a great faith in technology, the belief that it will save the day. When we play with dangerous technologies, such as nuclear power, or for that matter oil drilling (don’t forget the Gulf of Mexico oil spill), or the still more dangerous practice of natural gas drilling (fracking has its own serious environmental side effects), then in order to act responsibly we must accurately predict, in advance, the downsides and the side effects that come along with these powerful and dangerous technologies. In the nuclear power realm, these downsides and side effects include, for example, the need to safeguard nuclear waste for hundreds of thousands of years. To be more accurate, risk assessments should be performed by, or at least involve the active participation of, technology skeptics and cynics. To get a more balanced PRA, at least some of the risk analysts should seriously doubt the merits of complex technology, and they should have diligently studied the historical experience with the side effects of the technologies involved.

A fifth problem with many PRAs involves the failure to adequately consider the systemic interactions that go along with an accident, an attack, or a natural disaster. Disasters like the one recently taking place in Japan involve disruptions caused by the failure of multiple centralized systems, such as those providing water and electrical power. These systems are unfortunately often interdependent and linked by feedback loops. Because the interactions surrounding these systems are difficult to understand, this level of analysis is often left out of PRAs, or at least unduly truncated, but this multi-level complexity must be closely examined and modeled. For example, if nuclear reactors need water cooling in order to remain safe, and if water cooling systems require electrical grid power in order to operate, what happens when the grid is down due to a natural disaster such as an earthquake? How will water cooling systems continue to run? Perhaps diesel-powered generators will do the trick — but only until they consume the fuel stored on-site. What then? What if the roads are blocked due to an earthquake? How will more fuel come in? And what if the cooling pipes are broken by an explosion or an earthquake? Much more thought needs to go into the analysis of multiple system failures, and how we will deal with these simultaneous multiple system failures, as the sketch below suggests. Of course this is going to take more money, time, and expertise.
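To make the point concrete, here is a toy Monte Carlo sketch of the cooling-system dependency chain just described. All of the event probabilities and the dependency structure are invented for illustration; they are not drawn from any real PRA or from the Japanese plants. The sketch simply compares a naive model that treats grid loss, blocked roads, and broken pipes as independent events against one in which a single earthquake can cause all of them at once.

# Toy Monte Carlo model of the cooling-system dependency chain described
# above. All probabilities are made-up illustrative numbers, not data
# from any real PRA or plant.
import random

TRIALS = 100_000

def cooling_fails(correlated):
    """Simulate one year; return True if reactor cooling is lost."""
    quake = random.random() < 0.01              # assumed chance of a major earthquake
    if correlated:
        # Common-cause failures: the same earthquake can knock out the
        # grid, block the roads, and break the cooling pipes.
        grid_down     = quake and random.random() < 0.8
        roads_blocked = quake and random.random() < 0.5
        pipes_broken  = quake and random.random() < 0.1
    else:
        # Naive independence assumption, with matching marginal rates.
        grid_down     = random.random() < 0.008
        roads_blocked = random.random() < 0.005
        pipes_broken  = random.random() < 0.001

    if pipes_broken:
        return True
    if not grid_down:
        return False                            # grid power keeps the pumps running
    # Grid is down: fall back to on-site diesel generators, which need
    # fuel deliveries by road once the on-site supply runs out.
    generators_start = random.random() < 0.95
    fuel_resupplied  = not roads_blocked
    return not (generators_start and fuel_resupplied)

for label, correlated in (("independent", False), ("common-cause", True)):
    losses = sum(cooling_fails(correlated) for _ in range(TRIALS))
    print(f"{label:>12}: cooling lost in {losses / TRIALS:.3%} of simulated years")

With the same marginal failure rates, the common-cause version of this toy model typically loses cooling several times more often than the independent version, which is exactly the kind of understatement that results when systemic interactions are left out of a PRA.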

Based on the current Japanese nuclear situation and many other technologically induced disasters, it is clear that our collective ability to accurately predict problems through PRAs is somewhere between weak and non-existent. Unless we markedly upgrade the way we are doing risk assessment, and take the process much more seriously, this deficient situation will continue to cause extended business interruptions, unnecessarily large financial losses, needless deaths, and avoidable public health issues.

It is time we came right out and said: “It is simply not believable for top management to claim that they couldn’t have imagined that certain serious problems could take place.” The Japanese nuclear accidents, and many other technological disasters, were of course imaginable, and the PRAs that management paid for should have seriously examined and planned for these occurrences.

If modern societies are going to use dangerous and powerful technology — such as nuclear power, or for that matter petroleum — then there is a significant price to be paid, a price that is currently not being adequately paid. This price includes the increased cost of performing accurate risk assessments, assessments that honestly estimate the probability of various serious attacks, accidents, and mishaps. This price additionally includes doing extensive up-front research that examines the downsides and side effects of the proposed technologies. This price furthermore includes adding more safeguards and controls, so that the downsides and side effects are dealt with as part of the initial design, not added later. What we don’t need now is still more “build it now and deal with the consequences later” approaches to the deployment of complex technologies.

* * * * *

International Organization for Standardization: iso.org

Charles Cresson Wood is a technology risk management consultant with Post-Petroleum Transportation, in Mendocino, California. He is the author of the book Kicking The Gasoline & Petro-Diesel Habit: A Business Manager’s Blueprint For Action. See his website to obtain the book and to view the original posting of this article, dated March 24, 2011.

Image: Charles Cresson Wood

Further Reading:

Peak Oil & Deficiencies In Risk Assessment Methodologies by Charles Cresson Wood, March 6, 2011.

Comments
The article deals with weaknesses in probabilistic risk assessment (PRA). It does not, however, question the validity of the PRA as an effective tool. Taleb, in his books on Black Swan events, provides a sound basis for not using probability as a measure of what may occur in many circumstances. It is, for example, very doubtful that seismologists prepare PRAs to give an indication of when earthquakes will occur, even though they can use their expertise to indicate the regions where they are most likely. Management can easily be misled by the assertion that PRA is an effective tool, even when they appreciate some of the embedded uncertainty. Doubtless many scientists and mathematicians strive to do rigorous PRAs. It is the concept that is questionable.
Denis Frith

