Radioactive Data Tsunami

Why the best response to disaster is transparency.

By Neno Duplan

The public’s fear of releases from the Fukushima Daiichi nuclear plant is understandable, given the lack of information about radionuclide contamination. To date, the focus of the crisis rightfully has been on taming the reactors and containing the leaks. After some initial setbacks, the situation at the plant has come closer to being under control. Very soon, attention will shift to characterizing the effects on human health and the environment and to long-term monitoring and stewardship. At this point, an opportunity exists for the Japanese government and TEPCO, the electric utility that owns and operates the plant, to become as transparent as possible about the evolving conditions.

Difficulties at the plant first emerged on March 11 at reactor No. 1, prompting the Japanese government to declare a “nuclear emergency status.” Since then, the remaining five reactors have experienced issues with leaks and cooling. The good news is that all six have now been reconnected to the power grid, which could enable engineers to reactivate the cooling systems, depending on the extent of the damage.

As with similar disasters in history, authorities initially denied that the situation posed an immediate threat. Soon thereafter, as the scope of the problems at the plant became more apparent, the same authorities called for enhanced radiation monitoring around the plant. Eventually, residents within a 20-kilometer zone around the plant were evacuated. About four weeks after the tsunami, contamination appeared to be widespread both around and beneath the plant. Moreover, large quantities of moderately radioactive water had been purposely discharged into the Pacific to free up storage for more highly contaminated water. In mid-April, Japanese officials reclassified the accident to the highest possible level. The partial meltdown of three reactors and at least two spent fuel pools, along with multiple hydrogen explosions at the site, rated a 7 on the International Nuclear Event Scale, a level previously assigned only to the single-reactor disaster at Chernobyl.

The variability of the reported concentrations has been striking. Samples of groundwater taken beneath the No. 1 reactor’s turbine building on April 1, for example, contained radioactive iodine at 10,000 times the legal threshold. On March 30, Japan’s Nuclear and Industrial Safety Agency found that levels of iodine-131 in the seawater near the plant were 4,385 times the maximum level permitted under law. Several days later, this number soared to 7.5 million times the legal limit after the aforementioned release of contaminated water.

In the coming years, authorities will collect an immense amount of data on the contamination. Samples will be taken of air, soil, groundwater, seawater, and various biota, including crops and fish, and analyzed for a range of radionuclides, each with its own half-life. The data will need to be evaluated for possible impacts on humans and the environment. This can best be accomplished if all relevant data is brought together and stored in a centralized information management system that is accessible to all stakeholders.
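To make the idea concrete, here is a minimal sketch, in Python, of how such sample records might be structured and how a measured activity could be projected forward using the standard first-order decay law A(t) = A0 · exp(−ln 2 · t / t½). The field names and identifiers are hypothetical illustrations, not part of any actual system; the half-lives shown (about 8.02 days for iodine-131, about 30.17 years for cesium-137) are the only figures taken from physics rather than from this article.

    import math
    from dataclasses import dataclass
    from datetime import datetime

    # Illustrative half-lives expressed in days.
    HALF_LIFE_DAYS = {"I-131": 8.02, "Cs-137": 30.17 * 365.25}

    @dataclass
    class SampleResult:
        """One radionuclide measurement from a single field sample."""
        sample_id: str           # hypothetical identifier assigned in the field
        medium: str              # "air", "soil", "groundwater", "seawater", "biota"
        isotope: str             # e.g. "I-131"
        activity_bq_per_l: float # measured activity concentration
        collected_at: datetime   # when the sample was taken
        collected_by: str        # person or organization that took it
        lab_method: str          # how it was analyzed

    def decay_corrected(result: SampleResult, as_of: datetime) -> float:
        """Project a measured activity forward in time with first-order decay:
        A(t) = A0 * exp(-ln(2) * t / t_half)."""
        elapsed_days = (as_of - result.collected_at).total_seconds() / 86400.0
        t_half = HALF_LIFE_DAYS[result.isotope]
        return result.activity_bq_per_l * math.exp(-math.log(2) * elapsed_days / t_half)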

Deploying such a centralized environmental data management system to the “cloud” would allow all interested parties to know where samples have been taken, who collected them, how they were analyzed, what the measured levels of radionuclides were, and what the legal limits and long-term effects of each isotope are. Most members of the general public will not dig into such data themselves; that work will fall to those experienced in statistics, modeling, risk assessment, or health physics.
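As a rough illustration of the kind of question a shared database could answer, the snippet below builds on the hypothetical SampleResult record sketched above: it filters the results for a given isotope and reports how far each exceeds its regulatory limit. The limit value is a placeholder for illustration only, not an official figure.

    # Placeholder regulatory limit (Bq/L), for illustration only;
    # real limits vary by isotope, medium, and jurisdiction.
    LEGAL_LIMIT_BQ_PER_L = {"I-131": 40.0}

    def exceedances(results: list[SampleResult], isotope: str) -> list[tuple[str, float]]:
        """Return (sample_id, ratio-to-limit) for every result of the given
        isotope that exceeds its limit, sorted worst-first."""
        limit = LEGAL_LIMIT_BQ_PER_L[isotope]
        hits = [(r.sample_id, r.activity_bq_per_l / limit)
                for r in results
                if r.isotope == isotope and r.activity_bq_per_l > limit]
        return sorted(hits, key=lambda pair: pair[1], reverse=True)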

There is, however, one unassailable benefit to TEPCO and the Japanese government in making the information as public as possible. Listening to members of the Japanese public interviewed after the incidents began, one could sense a declining level of trust in the authorities and a growing frustration. With a public-facing database, accessible even to its critics, TEPCO can greatly reduce accusations that it is withholding information. Soviet and Russian authorities never did this after Chernobyl, and the general public still does not know the full extent of that disaster’s effects on human health and the environment.

Software systems to accomplish this already exist at some American nuclear utility fleets and at nuclear weapons sites such as Los Alamos National Laboratory, and they have served those organizations well. TEPCO and Japan should take a lesson from them and lose no time putting a similar system in place, because the requirements of long-term monitoring will generate a data tsunami of their own, one that may prove as challenging to manage as the real one unless the proper information management system is in place.

Neno Duplan is the founder, president, and CEO of Silicon Valley-based Locus Technologies. The company organizes environmental, energy, and radionuclide information in the cloud for U.S. nuclear utilities, DOE nuclear weapons sites, and other industries.

Posted June 14, 2011 in Environment