Tuesday, March 15, 2011

The Risks of Nuclear


The recent explosion and now likely meltdown at the Fukushima nuclear power plant run by TEPCO has generated a lot of press and re-sparked the debate on nuclear safety. This is an especially important discussion at present, as many countries throughout the world are assessing which alternatives to natural gas could serve as feasible replacements.

The reactor at Fukushima is a BWR-type design which uses light water as a coolant. While it shares some basic features with the notorious Chernobyl RBMK plant, it also has important differences. In addition to the RBMK's many mis-features which led to the famous accident, the RBMK also suffered from an almost complete lack of passive safety systems. It even lacked a primary containment vessel for the reactor. When it exploded it shot flaming graphite and radioactive products into the atmosphere, which spread over a wide area - a seriously catastrophic event.

Because of the existence of a primary containment vessel, under normal conditions the design at Fukushima should keep even a complete meltdown from causing a serious radiation danger to the public, much in the same way the Three Mile Island reactor was able to do. However, the conditions that the reactor has so far encountered are not particularly normal. After a 9.0 earthquake, it's very difficult to be sure if your design is going to act in the way you intended.

Which leads us to the primary difficulty that has plagued the Fukushima reactor: the design relies critically on an active coolant system. I have read in several places people wondering why the reactor wasn't scrammed (scramming means implementing emergency shutdown procedures). In fact it was scrammed. The problem is that even after a scram the core continues to generate decay heat and takes a long time to cool down. During this entire cool-off period, coolant needs to keep flowing past the core to avoid a meltdown. Unfortunately the pump systems were unable to function because there was no power to run them. Without coolant the core melts, and the problem becomes much more complicated and dangerous. In the worst case, a complete liquefaction of the core could even lead to a return to criticality. This would be similar to the reactor core turning back on, except this time without the designed geometry - essentially an uncontrolled, and very difficult to control, reactor.

The assessment of safety for the Fukushima units was based on the idea that redundancy would provide sufficient safety. However, they neglected the risk of a common-cause failure - that the same event which cut off external electrical power would also knock out the backup generators.

As this event shows, a passive coolant safety system should likely have been a requirement for any reactor design. Reactors such as the Economic Simplified Boiling Water Reactor would not have been affected by a generator failure, and would have been able to provide passive cooling for the period needed to cool the core and avoid meltdown. This would presumably lead to a greater margin of safety.

However, we should still wonder whether it would be safe enough. The fact that a coolant failure could lead to a meltdown, and consequently a return to criticality, should give pause. A worst-case scenario becomes very bad indeed.

There are many questions we need to contemplate in evaluating the safety of various technologies. Nuclear designs as they currently stand are somewhat peculiar compared to most of our other fuel technologies. Per TWh, nuclear has proved to be extremely safe compared with other power generation technologies such as natural gas. In Europe, nuclear is on the order of 10 to 1000 times safer*, counted in deaths per TWh from all causes, than natural gas.

Should we count Chernobyl into our calculations? How do we assess risk from cataclysmic events? The assessment of risk from low probability but potentially massive events is very difficult. Very low risks are very difficult to measure accurately, since their frequency is so low that our estimates tend to be dominated by guesses.
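
To make this concrete, here is a toy calculation in Python, with purely hypothetical figures, of how a deaths-per-TWh estimate swings with the guessed frequency of a rare catastrophe. The routine toll is assumed to be well measured; the catastrophe frequency is the part we can only guess at, and it ends up dominating the answer.

```python
# A minimal sketch (hypothetical numbers throughout) of why rare catastrophes
# dominate the uncertainty in a deaths-per-TWh estimate.

ROUTINE_DEATHS_PER_TWH = 0.05   # assumed: well-measured routine toll
CATASTROPHE_DEATHS = 30_000     # assumed: toll of one Chernobyl-scale event
ANNUAL_GENERATION_TWH = 900     # assumed: yearly output of the reactor fleet

def deaths_per_twh(events_per_century: float) -> float:
    """Expected deaths/TWh given a guessed catastrophe frequency."""
    catastrophe_deaths_per_year = CATASTROPHE_DEATHS * events_per_century / 100
    return ROUTINE_DEATHS_PER_TWH + catastrophe_deaths_per_year / ANNUAL_GENERATION_TWH

for guess in (0.01, 0.1, 1.0, 10.0):   # one event per 10,000 years ... per 10 years
    print(f"{guess:5.2f} events/century -> {deaths_per_twh(guess):6.3f} deaths/TWh")
```

Varying the guessed frequency over a few orders of magnitude moves the final figure over a couple of orders of magnitude too, which is exactly the problem with these estimates.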

In addition we need to compare the safety against other replacement technologies, or against the possibility of abandoning the technology's niche altogether. In the case of nuclear power, this would be a search for baseload power replacements.

When we begin to look at technologies in comparison, we find that even in this tragic and improbable event in Japan, natural gas has itself not been free from problems. Many people in Japan were incinerated by natural gas explosions. There were also 1800 homes washed away by a dam failure. It's not clear how many died from that, but the number is likely to be very substantial. Which energy source turns out to be more deadly under such extreme conditions will have to wait until the scale of the nuclear threat is fully understood.

Yet nuclear power continues to drive more public fear. Some of this may have to do with the difficulty of providing an accurate risk assessment, leaving us to guess exactly how bad things can get. When people look to the nuclear experts for an opinion, the best they seem able to do is say something along the lines of: we expect it will not be as bad as Chernobyl. Such statements are hardly reassuring.

The character of the particular technology itself is not irrelevant to our calculations. To take a rather less charged subject than nuclear power, we can turn to hydro power. Summed over the entire world, hydro power's deaths per TWh make it one of the worst offending technologies - worse even than natural gas or coal. However, almost all of the problems with hydro occurred in impoverished third world countries. A single catastrophe in 1975 at the Banqiao Dam in China left over 20,000 dead directly from drowning and somewhere around 100,000 dead from famine and disease.

No such legacy haunts Europe's dams. They have proved to be both safe and stable, and hydro power deaths per TWh in Europe are effectively zero if we exclude eastern Europe. A similar truth holds for nuclear power.

Now we can perhaps say that large dams in Europe should be avoided on the off chance that some typhoon or earthquake hits - an event that, while it may seem improbable, is not impossible. Since the potential death tolls would be tremendous, it's not totally unreasonable to overestimate the probability in order to provide some buffer of safety for ourselves.

However, this same reasoning should not cause us to avoid micro hydro power, since a massive disaster from a small water turbine is impossible to imagine (though some deaths would not be impossible). Similarly, we should not reject all nuclear power based on specific applications of the technology in specific circumstances. Evaluations of worst-case scenarios need to be made on the basis of the implementation.

In order to understand nuclear safety, or the lack thereof, it helps to go back a bit in time to the creation of the US nuclear programme to see why we have the reactors that we do.

Light water reactors are not by any means the only type of reactor. During the course of development a large range of reactor types were tested. The set of types now in operation is much less diverse than when nuclear power was in its infancy.

One might suppose that this was because we have settled on what are effectively the safest and most reliable nuclear reactors with the best characteristics. Unfortunately, to assume this would be to assume wrongly.

The development of nuclear power has been closely coupled with the desire to develop nuclear weapons. Without understanding this fact it's impossible to understand the direction of nuclear development.

Several designs for nuclear reactors, including one of the first, the AHR (Aqueous Homogeneous Reactor), and a later design based on similar ideas, the MSR (Molten Salt Reactor), were dropped despite having achieved potential commercial viability similar to that of the now popular LWR (Light Water Reactor). Some of these designs were considered so safe that universities were given licenses to operate them for the generation of isotopes or neutron flux for experiments.

These reactors had many potential advantages, including intrinsic passive safety features. They allowed designs ranging from the truly tiny, around 0.05 MW, up to large scale reactors, around 1 GW. These designs allowed cheaper fuel production, since they used a fuel slurry, liquid or aqueous suspension, rather than complicated metal-clad fuel pellets. Most surprisingly, they also allowed arbitrarily high burnup of the nuclear fuel.

In a standard LWR, one can expect somewhere around 5% of the fissile material to be used. In some of the most sophisticated high temperature reactors that have been operated, solid core configurations can reach 20%. The end result of these low burnups is a high production of waste and a low efficiency in the use of fuel. If you can exceed 99%, then you are potentially producing very little waste.
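
As a rough illustration of what burnup means for waste, the sketch below leans on the common rule of thumb that fissioning about one gram of heavy metal yields roughly one megawatt-day of thermal energy (about 24 TWh of heat per tonne actually fissioned). The burnup fractions are the ones quoted above; everything else is simple arithmetic, not a claim about any particular reactor.

```python
# Hypothetical, order-of-magnitude sketch: discharged fuel per unit of thermal
# energy scales inversely with the fraction of the loaded fuel actually burned.

THERMAL_TWH_PER_TONNE_FISSIONED = 24.0   # ~1 MW-day per gram fissioned

def discharged_tonnes_per_twh(burnup_fraction: float) -> float:
    """Tonnes of fuel loaded (and later discharged) per thermal TWh produced."""
    energy_per_tonne_loaded = THERMAL_TWH_PER_TONNE_FISSIONED * burnup_fraction
    return 1.0 / energy_per_tonne_loaded

for label, burnup in [("LWR, ~5% burnup", 0.05),
                      ("HTR, ~20% burnup", 0.20),
                      ("liquid fuel, >99%", 0.99)]:
    print(f"{label:18s} -> {discharged_tonnes_per_twh(burnup):5.2f} t per TWh(th)")
```

On these assumptions the spent fuel stream shrinks by roughly a factor of twenty going from 5% to near-complete burnup, which is the sense in which high burnup means very little waste.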


Liquid reactors are also able to evacuate xenon-135 by bubbling it out of the core. The Chernobyl accident was exacerbated by a lack of primary containment; however, the initial instability was due to a build up of the neutron poison Xe-135. This isotope stops the neutrons of the chain reaction, as its absorption cross-section is enormous compared to anything else in the core. Xe-135 is produced largely by the decay of the fission product iodine-135, so it can build up in a solid fuel after a power change, leading to a sudden drop in available neutrons. When the Xe-135 then decays, one can find a sudden return of neutrons and a consequent heating of the reactor. Xe-135 is a major difficulty in the operation of solid fuel reactors, since they are not able to evacuate it, but must wait for it to decay.
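
The dynamic can be sketched with the standard two-step chain I-135 → Xe-135 → Cs-135 (half-lives of roughly 6.6 and 9.1 hours). The initial inventories below are made-up relative numbers, assumed purely for illustration, but the shape of the transient after a shutdown - xenon rising for several hours before it finally decays away - is the point.

```python
import numpy as np

# Xe-135 after shutdown: I-135 keeps decaying into Xe-135, while xenon burn-off
# by neutron absorption has stopped, so the xenon inventory rises before falling.
HALF_LIFE_I135_H = 6.57    # hours
HALF_LIFE_XE135_H = 9.14   # hours
LAM_I = np.log(2) / HALF_LIFE_I135_H
LAM_XE = np.log(2) / HALF_LIFE_XE135_H

I0, XE0 = 3.0, 1.0         # assumed inventories at shutdown, relative units

def xe135(t_hours):
    """Bateman solution for the Xe-135 inventory t hours after shutdown (no flux)."""
    feed = LAM_I * I0 / (LAM_XE - LAM_I) * (np.exp(-LAM_I * t_hours) - np.exp(-LAM_XE * t_hours))
    return XE0 * np.exp(-LAM_XE * t_hours) + feed

t = np.linspace(0, 48, 481)
xe = xe135(t)
print(f"Xe-135 peaks about {t[np.argmax(xe)]:.1f} h after shutdown, "
      f"at roughly {xe.max() / XE0:.1f}x its shutdown level")
```

With these made-up starting inventories the poison peaks around eight hours after shutdown at nearly twice its shutdown level, which is why a solid-fuelled reactor that has just been shut down or throttled back can be so awkward to bring back to power.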

If that weren't enough, these reactor types could also use Thorium as a fuel. Thorium is much more prevalent in the Earth's crust than Uranium, and much more evenly distributed.

So why didn't the Atomic Energy Commission forge ahead with these reactor designs? As Kirschenbaum, who worked on the AHR, related, the design was rejected as early as 1944, when it was realised it would not produce Plutonium as quickly as the AEC wanted. The use of Thorium turns out to have been scratched for similar reasons: there is no good production pathway for Plutonium from Thorium.

The AEC was dedicated not to finding the most efficient fuel source, as the "Atoms for Peace" moniker might falsely lead one to believe, but to the production of weapons grade plutonium. As such it was completely committed to the "Plutonium economy", which included an array of LWRs and fast breeder reactors that would allow the production of large quantities of plutonium for the nuclear weapons program. LWRs were to become dominant despite their lack of inherent safety features.

During the 1960s, one of the great nuclear scientists, and lifelong proponent of nuclear power, Alvin Weinberg, was asked by the AEC to do safety assessments of LWR type reactors. What Weinberg and his team found in their assessments caused them some distress. The LWR designs indeed had very serious safety deficiencies. Weinberg then began attempting to warn the industry and the AEC about the shortcomings in the designs.

Eventually, Weinberg was sidelined. US Senator Chet Holifield, a proponent of the "Plutonium Economy", famously said: "Alvin, if you are concerned about the safety of reactors, then I think it might be time for you to leave nuclear energy."

Whether nuclear power should take centre stage, be a bit player, or not even make the cut is a question that can't be answered easily. As for myself, I'm sympathetic towards nuclear power as a fuel source for a world that will need ever more energy. The question, of course, requires a careful evaluation of the options and their associated costs.

In the last analysis however, more important even than this careful analysis of our options, are the following two points:

There is only one all-important factor in which energy source we use, and that is humans. It isn't how much the plant costs, and it isn't about the strict conversion efficiencies of thermal energy to electric or any other such technical parameter. What matters is simply whether it will improve or worsen our lives compared to not using it.

Lastly, what makes the most sense from this perspective is irrelevant if we haven't the power to make it happen. As we see clearly with the choice to develop LWR technology, those with the power call the shots. If we want the overriding factor to be how things impact people, then the people are going to need a lot more power.

* Figures for deaths per TWh are from ExternE, modified to include some of the most pessimistic estimates for Chernobyl.

Tuesday, March 8, 2011

Knowledge Production as a Public Good


Recently, I've read through a number of proposals regarding systematic attempts to allocate labour in a post-capitalist society. Most of these share the common feature that they don't attempt to look in detail at true public goods. With the knowledge economy becoming an extremely large part of our overall productivity, I think this is an oversight which should be corrected.

In addition, there is a belief by many that open-source approaches can directly solve the problem even within a capitalist system. However, open-source suffers from a number of deficiencies. It has not demonstrated the ability to support the labour of the people involved by providing them with livelihoods. It fails to provide the necessary resources for more capital intensive knowledge production - chemistry, genetics, hardware manufacture or even cinema, for instance. It is also weak at signalling when labour is widely desired. This leads to a tendency to be hobbyist focused, supported as it is only as a recreation, rather than focused on providing the greatest public good.

A perfect public good is non-rival and non-excludable. A non-rival good means I can use it, and you can use it, and neither of us experiences any loss at the other's use. Television and radio are examples of perfect non-rival goods. The internet tends towards being imperfectly non-rival, as do roads and so on. Non-excludable means that it's not possible to keep you from using it. Street lights [1] are good examples of something which is very hard to exclude people from using. Knowledge naturally fits into this category, provided we drop things like copyright and patent. Copyright and patent are designed to make a non-rival good behave like a rival good by generating exclusion through the use of legal force.

The use of exclusionary force is purely a drag on the efficiency of the entire system. The drag is partly due to the fact that enforcement requires labour - a judicial system, legal teams, police, methods of tracking use, incarceration or the levying of fines, the generation of DRM technologies, including software and specialised hardware - all of which do nothing useful (in fact they have negative use-value). In addition, this enforcement has the extremely deleterious effect of reducing the free spread of useful information and concepts which could make production processes more efficient. In software and hardware there are huge levels of redundant research and "clean-room" designs done for no other purpose than to avoid patent suits. A new, more efficient process will be kept intentionally limited in application in order to derive monopoly rents. Just looking at this list puts me in awe at the absurd inefficiencies of the capitalist system.

It's much more sensible in a post-capitalist society to treat these goods very differently. Since there is no (sensible) rivalry, it doesn't make sense to try to charge some price for them. Still, in the immediate future it's not going to be possible for everyone to devote all their labour time to poetry or films. If these types of knowledge production draw voluntary labour to such an extent that the production of other basic goods is not taking place, we need some way to see that this is happening.

Even if all labour were allocated voluntarily, it would be exceedingly useful to see where labour was most appreciated by society - so unless we really and truly get to a post-scarcity society, it makes sense to worry about this.

The amount of resources that should be allocated for a piece of software, a film, research and development, or some other information based good is insanely hard to calculate. It requires knowing its labour cost divided by its total popularity over all time - which is essentially impossible. We can however guess that the labour equivalent for a Michael Jackson song should probably be something like a microsecond of labour from each member of Michael's fan base. However, at the time of production it's entirely impossible to know this, since there is no way to know the amount of labour society would eventually like to devote. Indeed, as time passes Michael Jackson's music may not reduce in popularity. Perhaps even more extreme, what value would we assign to Newton's research into forces in physics?
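
A toy calculation makes the difficulty concrete. Amortising a hypothetical labour cost over cumulative consumption gives a per-use figure that keeps shrinking for as long as the good stays in use, so any figure fixed at the moment of production is necessarily a guess. The numbers below are invented for illustration.

```python
# Toy sketch: the labour-hours attributable to each use of an information good
# depend on how many uses it eventually gets - a quantity unknowable up front.

LABOUR_HOURS = 2_000   # assumed: hours spent producing a song, film or library

def labour_per_use(cumulative_uses: int) -> float:
    """Labour-hours attributable to each use, given the uses counted so far."""
    return LABOUR_HOURS / cumulative_uses

for year, uses in [(1, 100_000), (5, 20_000_000), (30, 500_000_000)]:
    seconds = labour_per_use(uses) * 3600
    print(f"after year {year:2d}: {uses:>11,d} uses -> {seconds:8.4f} labour-seconds per use")
```

The "correct" per-use value in year one differs from the year-thirty value by several orders of magnitude, even though nothing about the good itself has changed.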

If we want these sorts of endeavours to be supported beyond recreational labour and easily acquired resources*, then it makes sense to fund them socially. Past performance is no guarantee of future success, but it is some indicator. Social allocation could be guided by looking at such performance.

Publicly funded information production is often done in a very monolithic fashion (but then so is private funding of films and bands, for the most part). However, this need not be the case. The National Science Foundation, for instance, gives out grants to various institutions on the basis of evaluated past performance. It is conceivable that we could structure an arts council or a software council to do likewise.

The allocation of public funding itself might not be dictated by a board of experts as is done with the NSF. It might be handled by a delegated ministry of art, software and so on, or it might even be possible to have a vote-style infrastructure - which would allow people to describe how the portion of their socially devoted production should be allotted to various social goods.
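
One way such a vote-style scheme could work - sketched here with made-up fund names and numbers - is to normalise each person's ballot and then split the available labour budget in proportion to the pooled weights.

```python
from collections import defaultdict

def allocate(budget_hours: float, ballots: list[dict[str, float]]) -> dict[str, float]:
    """Split a labour budget across funds in proportion to pooled ballot weights."""
    pooled: dict[str, float] = defaultdict(float)
    for ballot in ballots:
        total = sum(ballot.values())
        for fund, weight in ballot.items():
            pooled[fund] += weight / total   # each ballot counts as one voice
    grand_total = sum(pooled.values())
    return {fund: budget_hours * w / grand_total for fund, w in pooled.items()}

# Hypothetical ballots: each person weights the funds they care about.
ballots = [
    {"arts": 3, "software": 1},
    {"software": 2, "research": 2},
    {"arts": 1, "research": 1},
]
print(allocate(12_000, ballots))
# -> {'arts': 5000.0, 'software': 3000.0, 'research': 4000.0}
```

Nothing here settles which goods are worthwhile; it only shows that expressing each person's socially devoted production as normalised weights makes the bookkeeping straightforward.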

The output of such an enterprise would not have to be policed in terms of consumption, but would literally be free to access. This should make it easier to institute methods of tracking consumption, as there is little incentive to avoid being counted. A post-capitalist youtube, for instance, would give good information about the number and multiplicity of views of a music video. Though it's impossible to account perfectly, and there are ways of avoiding such tracking, there is little incentive on the part of consumers to do so.

Because resources for institutions would be in some way tied to a reputation based on consumption, there *would* be some incentive for individuals who wanted to inflate their social importance to mislead. However, since there is no longer any reason for public funding of infrastructure like cinemas, youtube, or software repositories to have any connection with the content producers themselves, it's likely that it would be institutionally difficult to do so.

It's important to remember that individuals would be seeking the resources for necessary capital infrastructure and labour time, not pursuing actual profit. The profit motive wouldn't be a driving motivation in this scenario, even if it would likely drive certain individuals towards the reproduction of their status as reliable producers.

There are many possible ways of arranging knowledge production more cooperatively that could be explored as long as we keep in mind some basic facts:

1) Public goods are very difficult to value accurately even in a system of perfect information, as they require knowledge from the future. Therefore no systematic approach is going to be perfect.

2) Public goods should not be treated like rival goods in almost any conceivable system of accounting. We should not create rival goods from non-rival goods by wasting resources simply so that they look like other goods.

When we work with knowledge, we should keep in mind that the model needs to be cut to fit the reality rather than the reverse.

* Think of the amount of time and physical resources devoted to Avatar or Waterworld, for instance, and you can see the difficulty of arranging some types of knowledge production on an entirely ad hoc basis.

[1] Street Lamps were mentioned as a non-excludable public good by César De Paepe in his arguments with the Proudhonists.