RACHEL'S ENVIRONMENT & HEALTH NEWS #820
June 23, 2005
ISN'T IT TIME WE REGULATED CHEMICALS?
By Tim Montague*
If you read almost any newspaper these days, you learn the following kinds of information:
** Many plastic toys contain chemicals that can interfere with the sexual development of laboratory animals and are now thought capable of doing the same in baby boys.
** Most of the rivers and streams in the U.S. are contaminated with low levels of chemicals that can disrupt the sexual development of fish and can interfere with reproduction in animals that feed on fish.
** Dozens of toxic chemicals have recently been measured in household dust, indicating that common consumer products are contaminating our homes with toxicants.
You might ask yourself, isn't the government regulating dangerous chemicals? Unfortunately, the answer is No, not in any effective way.
About 1700 new chemicals are put into commercial use each year, almost entirely untested for their effects on humans and the natural world.
Only after a chemical causes enough harm for someone to take notice does the government conduct a numerical risk assessment (also called a quantitative risk assessment) on that individual chemical. The point of a numerical risk assessment is to learn how much of a chemical is "safe" to eat, drink, and breathe. Then the government may try to regulate releases of that chemical. But fewer than 1% of all chemicals are currently regulated. (See Rachel's #815.)
A scientist at the University of Oregon has described why numerical risk assessment doesn't work, and has suggested other ways we could control chemical hazards. Dr. Joe Thornton -- a biologist -- explains that numerical risk assessment is a fundamentally inappropriate way to control persistent pollutants (such as heavy metals and chemicals containing chlorine) for two reasons:
1) It assumes that we can learn all the ways that every individual chemical can cause harm in humans and in the natural environment -- but there aren't enough scientists in the world to do this.
2) Many industrial chemicals tend to stick around for a long time and move from place to place in ways that are impossible to predict, so often we don't even know what we're looking for.
Thornton proposes we adopt four new ways of regulating chemicals -- zero discharge, clean production, reverse onus, and phasing out entire classes of persistent chemicals -- because the old way (regulating one chemical at a time at the end of the discharge pipe) simply doesn't work.
Risk assessment assumes that damage is local, short-lived, and predictable. But organisms and the environment are complex, interconnected, and only partly understood (to put it mildly). Therefore, we cannot predict cause-and-effect in any reliable way. In the face of these insurmountable difficulties, we can take a precautionary stance: when we have good reason to suspect harm, yet we have scientific uncertainty, we can err on the side of caution. Faced with choices, we can give the benefit of the doubt to public health and to nature.
Thornton's four principles begin to clarify how the precautionary principle can work in the real world. These principles are:
ZERO DISCHARGE -- Persistent and bioaccumulative toxicants are incompatible with ecological processes, and no amount of their release into the environment is acceptable.
CLEAN PRODUCTION -- We can consider alternative technologies up front and avoid the use of known toxicants in manufacturing. Finding alternatives rather than approving pollutants becomes the focus. For example, in dry cleaning, we can replace perchloroethylene (perc) with CO2 and water-based methods.
REVERSE ONUS -- Apply the same logic used in drug safety: give manufacturers the responsibility to show that a product is reasonably safe for use before it can be released into the environment. This shifts the burden of proof from society to the chemical companies to provide information about their products, to monitor for harmful effects and to come clean about their findings.
EMPHASIS ON LARGE CLASSES OF CHEMICALS -- Faced with the impossibilities of measuring the impacts of individual chemicals, simply phase out entire classes of compounds that are clearly problematic. PCBs, CFCs and lead compounds are all examples of classes of chemicals that have been phased out because of their hazards.
Thornton gives six reasons why the current risk paradigm is so flawed:
1. ACCUMULATION OF PERSISTENT POLLUTANTS
Risk-based approaches assume that nature and living things can absorb and assimilate synthetic chemicals, breaking them down and digesting them. This may be true for sewage, oil, and other naturally occurring substances. But persistent organic pollutants (POPs) like pesticides, solvents, refrigerants, etc. often resist natural breakdown and can persist for years, decades or centuries. (See Rachel's #284, #505, #611.)
Many POPs and metallic pollutants are fat-soluble and thus bioaccumulate as they move up the food chain. Top predators like humans, bears, and big fish can accumulate chemical concentrations that are tens of millions of times greater than typical environmental levels.
Persistence and bioaccumulation mean that even very small discharges of synthetic chemicals can build up to dangerous levels in our bodies over time. The general public's average body burden for some of the best studied pollutants is already at or near the range at which health impacts have been found in laboratory animals. To avoid this problem, we can declare that there is no level of acceptable discharge for chemicals that persist or magnify in the food chain -- in other words, we can adopt a zero-discharge policy.
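The compounding arithmetic of bioaccumulation can be sketched in a few lines. The food-chain length and per-step transfer factor below are illustrative assumptions, not measured values; real factors vary widely by chemical and ecosystem:

```python
# Illustrative sketch of biomagnification up a food chain.
# ASSUMPTIONS (not from the article): a 5-step food chain and a
# 30-fold increase in tissue concentration at each trophic step.

water_concentration_ppt = 1.0   # hypothetical level in water, parts per trillion
transfer_factor = 30            # assumed concentration increase per trophic step
trophic_steps = 5               # e.g., plankton -> insect -> minnow -> fish -> predator

concentration = water_concentration_ppt
for step in range(trophic_steps):
    concentration *= transfer_factor

magnification = concentration / water_concentration_ppt
print(f"Overall magnification: {magnification:,.0f}x")  # 30**5 = 24,300,000x
```

Even with these modest assumed numbers, the top predator ends up with a concentration tens of millions of times the level in the surrounding water, which is the scale of magnification the article describes.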
2. CUMULATIVE GLOBAL POLLUTION
Numerical risk assessment oversimplifies the real world and considers environmental risks to be local in time and space. Once a chemical disperses beyond some horizon, it is assumed to do no further harm. So industry is encouraged to dot the landscape with sources of pollution that collectively begin to overwhelm the biosphere but which are individually within acceptable limits. As a result the entire planet has become polluted.
3. TOXICOLOGICAL COMPLEXITY
The science of numerical risk assessment is based on the premise that we can calculate a chemical's impact on the health of living things. Risk assessors do this by measuring the toxicity of individual chemicals on individual species -- usually rats, mice and other small mammals. There are at least 70,000 synthetic chemicals being used in commerce today (up from 40,000 in 1991). Risk assessment considers the toxicity of an individual pollutant acting alone -- when in reality each chemical is acting in concert with a myriad of other chemicals in the environment.
This poses a huge problem -- studying multiple chemical exposures is costly and time-consuming. Testing every possible combination of just 25 chemicals on a single species over a short period (13 weeks) would require about 33 million experiments (2^25 combinations). A similar study of just 1% of the 70,000 chemicals in commerce would require roughly 10^210 experiments -- a 1 followed by 210 zeros. Trillions upon trillions upon trillions of experiments -- science as we know it is simply not equipped to tackle this problem. The risk-assessment solution, on the other hand, is easy: just ignore multiple chemical exposures.
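The combinatorial arithmetic behind those figures can be checked directly. The sketch below assumes the experiment counts come from testing every possible subset of chemicals (2^n mixtures for n chemicals), which matches the numbers in the text:

```python
# Checking the combinatorial explosion of mixture testing.
# Assumption: one experiment per possible subset (mixture) of chemicals,
# giving 2**n combinations for n chemicals.

n_small = 25
combos_small = 2 ** n_small
print(f"{n_small} chemicals: {combos_small:,} experiments")  # 33,554,432 (~33 million)

n_large = 70_000 // 100          # 1% of the ~70,000 chemicals in commerce
combos_large = 2 ** n_large      # Python integers have arbitrary precision
digits = len(str(combos_large))  # 2**700 has 211 digits, i.e. ~5 x 10**210
print(f"{n_large} chemicals: about 10**{digits - 1} experiments")
```

Doubling the candidate list doesn't double the work; it squares it, which is why screening chemicals one mixture at a time can never catch up.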
4. INADEQUATE DATA
Industry's capacity for inventing new chemicals has overwhelmed the regulatory system's ability to study their potential harms. The chemical industry introduces at least 1,700 new chemicals into commerce each year, while the U.S. National Toxicology Program assesses just 10 to 20 substances per year. At that pace, each passing year adds roughly 85 to 170 more years of unfinished testing. A 1997 study by the National Research Council concluded that we lack even minimal toxicity information for 70% of the most worrisome chemicals -- those manufactured in high volume and already suspected of harming the environment.
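The backlog arithmetic is simple division. This sketch uses the article's own figures (1,700 new chemicals per year; 10 to 20 assessments per year); the exact backlog depends on which end of the assessment range you assume:

```python
# How fast the testing backlog grows, using the article's figures.
new_chemicals_per_year = 1700
assessments_low, assessments_high = 10, 20  # NTP assessments per year

# Years of assessment work added with each calendar year, at each pace:
backlog_fast = new_chemicals_per_year / assessments_high  # 85 years
backlog_slow = new_chemicals_per_year / assessments_low   # 170 years
print(f"Each year adds {backlog_fast:.0f} to {backlog_slow:.0f} years of backlog")
```

Even at the faster assessment pace, every calendar year adds about 85 years of unfinished testing, which is the scale of the article's rough 90-year figure.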
The risk-assessment solution: If you don't have data on the toxicity of a substance, assume the risk is ZERO. Just ignore the problem. Here, reverse onus plays an important role in putting the burden of proof on industry to collect and reveal data on new chemicals prior to their general release or manufacture.
5. FORMATION OF CHEMICAL MIXTURES AND BYPRODUCTS
The nature of industrial chemistry is messy. When you mix chemicals under diverse industrial circumstances you inevitably produce new and unexpected byproducts. Joe Thornton gives three examples of how we are flying blind:
a) Paper manufacturing. The effluent from pulp mills contains over 300 organochlorine byproducts, including dioxins, furans, phenols, benzenes, thiophenes, methyl-sulfones, methanes, ethanes, acids and PCBs. We have identified only 3 to 10% of the organically bound chlorine in pulp effluent; in other words, we are 90 to 97% ignorant of what is coming out of the pipe.
b) Incineration. Incinerator emissions are estimated to contain over 1,000 products of incomplete combustion (complete combustion would reduce the fuel to carbon dioxide and water). Yet we have identified only 40-60% of these chemical effluents.
c) Pesticide manufacture. Byproducts account for almost 20% of DDT manufacture by weight. Many of these byproducts have never even been identified.
We don't know the names, structures or toxicity of many of the chemical byproducts formed in industrial processes. Even though we phased out purposeful manufacture of PCBs they -- and dioxins, an unwanted byproduct -- are still being introduced into the environment as side-effects of chlorine chemistry. To prevent global contamination with dioxin, we would need to phase out the whole class of organochlorines.
6. POLLUTION CONTROL AND DISPOSAL
End-of-pipe pollution control and disposal technologies do little to prevent global environmental contamination. If you manufacture a substance that breaks down slowly and tends to accumulate in living things, it will eventually spread throughout the living world. Scrubbers, filters, precipitators, incinerators, and landfills are all just ways of temporarily moving a substance from one location or form to another (a shell-game). In the end, everything that persists will disperse into the air, water, land and living things and people will be affected. Landfills leak, incinerators generate toxic ash and gas, and even the best pollution controls are never 100% effective.
Thornton helps us realize that we are foolish to try to control chemicals with the end-of-the-pipe risk-assessment approach. Instead, we can use the precautionary principle and acknowledge that:
a) Some chemicals don't belong in the environment (zero discharge) and are best regulated away as entire classes of compounds;
b) With the right combination of carrots and sticks as motivation, industry can find clean technologies (clean production); and
c) The burden of proof (aka "reverse onus") can be placed on the industries that want to introduce new chemicals -- to show that they have done their best to understand the consequences of their actions -- thus motivating them to innovate and develop clean technologies. No data? No market.
* Tim Montague is Associate Director of Environmental Research Foundation. He holds an M.S. degree in ecology from University of Wisconsin-Madison and lives in Chicago.
 Joe Thornton, "Beyond Risk: An Ecological Paradigm to Prevent Global Chemical Pollution" INTERNATIONAL JOURNAL OF OCCUPATIONAL AND ENVIRONMENTAL HEALTH Vol. 6 (2000) pgs. 318-330. Available at
And see Rachel's #704, which reviews Thornton's book, Pandora's Poison.
 International Joint Commission. FIFTH BIENNIAL REPORT ON GREAT LAKES WATER QUALITY. Windsor, ON, Canada, 1990. Available here: http://www.ijc.org/php/publications/pdf/ID603.pdf
 Mary O'Brien, MAKING BETTER ENVIRONMENTAL DECISIONS; AN ALTERNATIVE TO RISK ASSESSMENT (Cambridge, Mass.: MIT Press, 2000). ISBN: 0262650533
RACHEL'S ENVIRONMENT & HEALTH NEWS
Environmental Research Foundation
P.O. Box 160
New Brunswick, N.J. 08903
Fax (732) 791-4603; E-mail: email@example.com