Center for Progressive Reform



CPR Perspective: Toxicogenomics

The Implications of the Genomics Revolution for Environmental Policy
 
 
Background

The Issue
How will new discoveries in toxicogenomics affect the nature, scope, and resources devoted to research on the adverse effects of toxic chemicals?

Despite years of debate and scientific effort, only a tiny fraction of the approximately 75,000 chemicals in commercial production have been subjected to even rudimentary toxicity testing. This lack of information persists in large part because the testing methods available to federal regulators are both costly and subject to large uncertainties. The limitations of chemical risk assessment methods, and particularly toxicology, have been the subject of countless reports, articles, and studies.

Scientists have a new sense of hope that the uncertainties limiting chemical risk assessment may soon be overcome. This surge of optimism has been inspired by the apparent success of genomics in the biomedical sciences. The guiding faith of those optimistic about change is that a fundamental paradigm shift in the science of risk assessment is now achievable through the application of genomics methods to environmental toxicology, a specialized field often referred to as "toxicogenomics."

Genomics methods are portrayed as providing a new generation of simple, low-cost screening methods for determining whether a chemical is toxic, whether an individual is sensitive to certain toxins, or whether someone has been exposed to or harmed by a toxic substance. Proponents claim that genomics methods will improve the precision of toxicology and take the guesswork out of the risk assessment methods used to establish environmental standards.

The central innovation in toxicogenomics is the ability to measure the activity of thousands of genes simultaneously. Its advocates maintain that the degree to which a gene is activated (i.e., producing the protein for which it codes) reflects the biological reactions that arise in response to a chemical exposure. For example, if a class of chemicals (e.g., solvents) causes direct damage to DNA or interferes with hormonal regulators (e.g., endocrine disruptors), measuring gene-activity levels following exposure can identify aberrant effects among the genes vulnerable to these chemicals and locate genetic defects associated with heightened susceptibility to such exposures. Scientists believe that by allowing them to monitor dynamic biological responses, these methods will enable them to understand the basic mechanisms of chemical toxicity and to develop accurate tests for genetic susceptibilities and harmful chemical exposures.
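The screening logic is simple to illustrate. The sketch below is hypothetical in every particular (the data, sample counts, and fold-change threshold are invented for illustration; real studies rely on replicate experiments and formal statistics). It flags genes whose measured activity shifts markedly between exposed and control samples:

    # Illustrative sketch only: hypothetical data, counts, and thresholds.
    # Flags genes whose measured activity differs markedly between exposed
    # and control samples; real analyses use replicates and formal
    # statistics rather than a bare fold-change cutoff.
    import numpy as np

    rng = np.random.default_rng(0)
    n_genes = 1000

    control = rng.normal(100, 10, (5, n_genes))  # 5 control samples
    exposed = rng.normal(100, 10, (5, n_genes))  # 5 exposed samples
    exposed[:, :25] *= 3.0  # simulate 25 genes induced by the exposure

    # Fold change: mean activity after exposure relative to controls.
    fold_change = exposed.mean(axis=0) / control.mean(axis=0)

    # Flag genes whose activity roughly doubles or halves after exposure.
    responsive = np.where((fold_change > 2.0) | (fold_change < 0.5))[0]
    print(f"{responsive.size} genes flagged as exposure-responsive")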

The emergence of toxicogenomics represents a major development in environmental toxicology. The Food and Drug Administration and the Environmental Protection Agency have adopted preliminary policies supporting toxicogenomic methods. At the same time, the National Institute of Environmental Health Sciences (NIEHS) within the National Institutes of Health has initiated a major toxicogenomics program. In 1998, NIEHS launched the Environmental Genome Project (EGP), which will "investigate how genetic variation affects responses to environmental exposures" by characterizing genes linked to human disease.

 

What People are Fighting About
 

What's at Stake
The hype surrounding the study of genes and toxic exposures may cause an unwarranted and badly focused commitment of resources to the development of toxicogenomics and a concomitant, ill-advised withdrawal of resources from well-established, essential research.

The debate over toxicogenomics centers on the true promise of genomics methods. Both sides of the debate agree that toxicogenomics will improve understanding of toxicology and, in certain cases, provide new tools for environmental standard setting. The claims of proponents, however, go far beyond advancing scientific knowledge. They maintain that toxicogenomics will resolve many of the scientific uncertainties that have hampered toxics regulation. Skeptics argue that this view ignores important biological constraints and that hopes for simple scientific fixes have led environmental policy astray in the past. Skeptics point out that, notwithstanding huge public and private investments, simple methods for studying human health remain elusive. The complex nature of biology, they argue, ought to make us question these high-flying claims.

The optimism generated by toxicogenomics is premised on toxicology being reducible to its presumed genetic origins, that is, on identifying genes "for" specific human traits implicated in harmful chemical exposures. Two assumptions typically underlie this view: (1) human disease is primarily a matter of genetic susceptibility, and (2) most genetic traits are simple (i.e., involve a small number of strongly influential genes).

Skeptics assert that both assumptions are false. First, despite the intense interest in genomics, a broad scientific consensus holds that most common diseases are more strongly linked to human-made and natural environmental exposures (e.g., diet, pollutants) than to genetic susceptibilities. Scientists estimate that eliminating certain environmental factors could reduce cancer rates by as much as 80 to 90 percent. By contrast, diseases with a strong genetic component are estimated to account for less than 5 percent of major cancers and coronary heart disease.

Second, genetic traits are complex and influenced by an individual's surrounding genetic makeup; genes do not function in isolation. As a result, each gene associated with a toxic response or susceptibility will have a small, often variable effect, making detection of its causal role far more challenging. In essence, one is looking for a needle in a haystack with only a vague understanding of what the needle in question looks like.

Skeptics also argue that the intuitive appeal of linking gene-activity levels to toxicological responses obscures the limitations of these methods. Biologists know, for example, that changes in gene activity can signal either benign or harmful effects, and that substantial additional information is often needed to determine which is the case. Conversely, the absence of a change in gene activity cannot rule out harmful effects, because some chemical toxins do not affect gene activity at all (some alter genes themselves without systematically changing their activity levels).

Finally, skeptics point out that toxicogenomic methods cannot detect all types of toxicological impacts, because meaningful changes in gene-expression levels will often be impossible to detect. For example, the pain reliever acetaminophen causes liver damage through non-specific (i.e., random) modifications of important proteins. Such non-specific toxicity causes gene-activity levels to vary unpredictably and rules out discovering a "signature" gene-activity pattern to associate with exposure to the chemical. In such cases, lacking a defining sign of the toxic effect, toxicogenomic methods will be ineffectual.
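The signature idea, and why non-specific toxicity defeats it, can be seen in a toy comparison. In this sketch (all data hypothetical), an observed gene-activity profile is correlated against a stored reference signature; a specific toxin reproduces its signature run after run, while non-specific toxicity yields profiles with no stable pattern to match:

    # Illustrative sketch only (hypothetical data): signature matching
    # correlates an observed gene-activity profile against a reference
    # profile recorded for a known toxin.
    import numpy as np

    rng = np.random.default_rng(1)
    n_genes = 200

    signature = rng.normal(0, 1, n_genes)  # reference profile for a known toxin

    # Specific toxicity: the observed profile tracks the signature plus noise.
    specific = signature + rng.normal(0, 0.5, n_genes)

    # Non-specific toxicity: activity varies unpredictably, so the observed
    # profile bears no relation to any stored signature.
    nonspecific = rng.normal(0, 1, n_genes)

    def match(profile, reference):
        return np.corrcoef(profile, reference)[0, 1]

    print(f"specific toxin vs. signature:     r = {match(specific, signature):.2f}")
    print(f"non-specific toxin vs. signature: r = {match(nonspecific, signature):.2f}")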

Proponents of toxicogenomics believe that these obstacles can be overcome. In essence, they argue that studying simple toxicological processes will allow them to gain the experience and knowledge necessary to tackle more complex problems. They are also beginning to acknowledge that gene-activity testing is not sufficient on its own and have expanded the scope (and complexity) of their research considerably. This shift amounts to an acknowledgement that the science underlying toxicogenomics is more challenging than scientists had initially thought and that significant developments with practical regulatory benefits are likely to take years to mature and become integrated into regulatory programs.

 

CPR's Perspective
 

CPR supports efforts to develop reliable toxicogenomics methods. However, CPR believes that policymakers must have a realistic understanding of toxicogenomic methods to forge sound research and regulatory policies. Despite the promise of toxicogenomics, it would be premature and unwise to allow this field of research to be viewed as a substitute for established testing methods. The primary benefit of toxicogenomics is likely to be indirect mechanistic information, and to the extent that regulatory testing methods are developed, they are likely to be limited to a small subset of chemicals.

These limitations imply that before committing significant public funds to toxicogenomics, policymakers must commit to maintaining adequate funding for other established toxicology research and testing programs. Further, the decision on whether and how much to support toxicogenomics should turn on whether it adds important new classes of techniques for understanding chemical toxicity. Because insights will accrue incrementally, it will be essential to reevaluate periodically the appropriate level of commitment and the focus of government resources.

Beyond its value for regulatory science, toxicogenomics research stands to transform the structure of regulatory standards. Currently, government regulators presume that variation in individual susceptibility to disease or harm from chemical exposures across the population is relatively limited. A growing number of studies reveal that this presumption is false. Recent studies, for example, indicate that the efficacy of the biological processes involved in neutralizing the effects of toxic exposures varies by as much as 85- to 500-fold across the U.S. population, with correspondingly high variability in cancer risk. Treating the U.S. population as uniform thus exposes susceptible subgroups to unacceptable levels of risk. Toxicogenomic methods provide a new suite of tools for studying this population-wide variation and should prompt changes in how regulatory standards are set.

If the insights of toxicogenomics research are incorporated into regulatory standards, another important policy question will be the treatment of simple and complex genetic susceptibilities. The Center believes that regulatory standards should address simple and complex genetic susceptibilities separately. Subgroups of individuals with simple genetic susceptibilities to toxic exposures (e.g., harm from exposure to beryllium) will be, relatively speaking, easier to define and identify. Thus, an approach based on determining the susceptibility of "identifiable subgroups" is viable for simple genetic susceptibilities.

Establishing legal rules for complex genetic susceptibilities will be especially difficult because variation in susceptibility is unpredictable. An approach based on identifying specific population subgroups is therefore bound to fail. The best approach for dealing with complex genetic susceptibilities will be to incorporate a safety factor into estimates of chemical toxicity to ensure that, to the extent possible, the entire population, with its considerable variation, is adequately protected. A safety factor would lower the acceptable level of exposure, for example by a factor of ten, to compensate for scientific uncertainty and to ensure that we are "better safe than sorry." Safety factors for complex susceptibilities should be roughly commensurate with the estimated variability in individual toxic susceptibility across the U.S. population as a whole. Initially, while data are limited, a single default factor (like that used in the Food Quality Protection Act) should be applied, or variable safety factors could be developed (e.g., based on the nature of the populations most affected). As scientists' mechanistic understanding of chemical toxicity improves and better epidemiological data are collected, default safety factors would be replaced with more direct estimates of the population-wide variation in susceptibility.
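To make the arithmetic concrete, the short sketch below works through a hypothetical example (the exposure value and both factors are invented for illustration; actual standards combine several uncertainty factors):

    # Hypothetical arithmetic only: how a safety factor lowers an acceptable
    # exposure level. All numbers are invented for illustration.
    noael_mg_per_kg_day = 5.0     # no-observed-adverse-effect level (hypothetical)
    default_safety_factor = 10.0  # tenfold default, as with the FQPA's extra factor

    acceptable = noael_mg_per_kg_day / default_safety_factor
    print(f"default acceptable exposure: {acceptable} mg/kg/day")

    # If later data showed, say, a 100-fold variation in susceptibility
    # across the population, the default would give way to that estimate.
    measured_variability = 100.0
    refined = noael_mg_per_kg_day / measured_variability
    print(f"refined acceptable exposure: {refined} mg/kg/day")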

Toxicogenomics stands to have important indirect effects as well. In particular, it has the potential to alert policymakers and the general public to the modest effect that genetics has on human health relative to environmental factors. Alerting the public to the importance of environmental factors is particularly important at a time when public resources are being heavily committed to high-tech, genomics-oriented medicine. A critical benefit of recognizing the limits of genomics would be a refocusing of attention on public health research. Indeed, the benefits of robust public health research are demonstrated by the fact that the most significant improvements in aggregate human health have historically followed advances in basic public health, such as improvements in sanitation, reductions in environmental exposures, and improvements in living conditions, rather than major technological advances.

A significant liability of the current infatuation with toxicogenomics is its potential to distort research priorities in the environmental health sciences, which are already encumbered by inadequate support. The Center believes that it is essential for policymakers and scientists alike to promote an accurate understanding of toxicogenomic methods and to foster a carefully reasoned, scientifically grounded approach to integrating toxicogenomics into environmental regulatory science and policy.
