What if we could predict whether an individual will have an adverse reaction to everyday chemicals, such as those in laundry detergent or perfume? Or whether a particular chemical factory worker will fall ill upon exposure to a particular substance? With the publication of a study from a team of researchers that included NCATS experts, science is one step closer to such a scenario.
More than 80,000 chemical compounds are registered for use in the U.S., and for the vast majority of them, there has been no toxicity testing in humans to inform us about their effects on health. Just as genetic differences make each of us more or less susceptible to developing conditions such as heart disease, these differences also could determine how sensitive we are to chemicals in the environment. To establish safe levels of chemicals for human use, regulatory officials traditionally have used animal toxicology data to predict human responses to chemicals. This imperfect practice highlights a critical translational science and public health problem: until now, there has been no way to accurately measure human differences in sensitivity to the chemicals in our environment. A new study, published in the Jan. 13, 2015, issue of Environmental Health Perspectives, introduces a way around this obstacle, using NCATS’ robotic screening capabilities to test the cells of more than 1,000 individuals with different genetic backgrounds for sensitivity to 179 chemicals.
“To fully achieve the promise of precision medicine, we will not only have to understand the extent and genetic basis for human variation in response to therapeutics, but also for sensitivity to chemical toxicity and adverse drug reactions,” said John R. Bucher, Ph.D., associate director of the National Toxicology Program at the National Institute of Environmental Health Sciences (NIEHS), one of the collaborating institutions in the study. “This work provides a possible approach using technologies already in hand.”
Addressing a Knowledge Gap
Several years ago, experts in the Toxicology in the 21st Century (Tox21) program recognized a critical scientific need not yet addressed by the initiative. Tox21 is a collaboration among researchers from NCATS, NIEHS, the Environmental Protection Agency, and the Food and Drug Administration. Tox21 scientists use a robotic system located at NCATS’ laboratories in Rockville, Maryland, to perform automated tests, or assays, which expose cells and proteins to thousands of chemicals in a short time. This process is called high-throughput screening (HTS).
Tox21 experts, including Raymond Tice, Ph.D., now retired from NIEHS, knew that the initiative’s approach yielded valuable information about cell functions affected by chemical toxicity. However, the tested cells all came from only a few cell lines, so the results revealed little about the range of sensitivity to a chemical’s toxicity across the human population or about which genes affect that variability. The team suspected that adapting the Tox21 HTS platform to test specific chemicals with known toxicity, using cells from hundreds or thousands of people, could produce that missing information.
To get started, the Tox21 team partnered with scientists at the University of North Carolina at Chapel Hill — toxicology researcher Ivan Rusyn, M.D., Ph.D., now at Texas A&M University, and statistician Fred Wright, Ph.D., now at North Carolina State University, both authors on the paper — who had experience with small-scale testing of cells for genetic variations in toxicity.
To draw reliable conclusions about how genes affect chemical sensitivity, the group needed cells representing a large number of genetically diverse people. Fortunately, they were able to acquire lymphoblastoid cells, a type of engineered white blood cell, together with related genetic information, from 1,086 people across nine ethnic populations as part of the publicly funded 1000 Genomes Project. That initiative is an international effort to catalog human genetic variation by sequencing (i.e., determining the sequence of “letters” in a person’s DNA) the genomes of more than 1,000 people. With these resources, the researchers were ready to test chemicals on the cells grown in culture (also called in vitro) on a large scale.
Making the Most of NCATS’ Robotic Capabilities
With help from automation experts in NCATS’ Division of Pre-Clinical Innovation, the Tox21 team tested the toxicity of 179 chemicals using the cells of those 1,086 people and a technique called quantitative HTS. This innovative method, developed at NCATS, enabled the researchers to run each compound through the assay at eight different concentrations. This approach is more likely to capture the full range of responses from the most to the least sensitive individual. The chemicals tested were substances to which people regularly are exposed, including pesticides, industrial chemicals, food additives and drugs.
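To see why testing at multiple concentrations matters: responses at eight doses trace out a full concentration-response curve for each individual, from which a sensitivity value (such as the concentration producing a half-maximal effect) can be estimated. The sketch below is purely illustrative, not the study's actual analysis: it evaluates an assumed Hill-model response at eight concentrations and recovers the half-maximal concentration by log-linear interpolation; the model parameters and concentration range are invented.

```python
import numpy as np

def hill(conc, ec50, top=100.0, bottom=0.0, n=1.5):
    """Hill concentration-response model: percent cell viability."""
    return bottom + (top - bottom) / (1.0 + (conc / ec50) ** n)

def ec50_by_interpolation(conc, response, level=50.0):
    """Estimate the concentration giving `level` response by log-linear
    interpolation between the two bracketing test concentrations."""
    logc = np.log10(conc)
    for i in range(len(conc) - 1):
        lo, hi = response[i], response[i + 1]
        if (lo - level) * (hi - level) <= 0:  # the two points bracket `level`
            frac = (level - lo) / (hi - lo)
            return 10 ** (logc[i] + frac * (logc[i + 1] - logc[i]))
    return None  # curve never crosses `level` in the tested range

# Eight concentrations spanning four orders of magnitude (values invented)
concs = np.logspace(-3, 1, 8)
obs = hill(concs, ec50=0.5)  # simulated noise-free responses
print(ec50_by_interpolation(concs, obs))
```

A single-concentration test would give only one point on this curve; eight points let the analysis locate each individual's sensitivity wherever it falls in the range.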
The study, a massive logistical undertaking, was unprecedented in its scope. “It is the largest population-based, in vitro test with the largest number of cell lines ever done,” Rusyn said.
The team found that for about half of the chemicals tested, the range of variability among individual responses was larger than previously assumed. When making regulatory decisions about safe levels of chemicals, environmental experts generally assume that reducing exposure by an extra 10-fold is sufficient to protect people who may be more sensitive to the toxic effects of a given chemical. Specifically, exposure is reduced first by a “toxicokinetic” factor of 3.2, which accounts for differences in how a chemical reaches cells in the body, and then by a “toxicodynamic” factor of 3.2, which accounts for differences in biological responses after a chemical interacts with cells in the body. Although some approaches have addressed the first factor, few researchers have evaluated the second factor, and only at a much smaller scale. This larger study’s findings suggest that the standard toxicodynamic factor is generally applicable, but for about half of the chemicals examined, a larger factor — in some cases greater than 10 — may be more appropriate. This result paves the way for a strategy to develop more precise, chemical-specific exposure factors, rather than the currently used “one-size-fits-all” approach.
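To make the arithmetic concrete: the default 10-fold margin splits into two factors of √10 ≈ 3.2 (since 3.2 × 3.2 ≈ 10), and a chemical-specific toxicodynamic factor can be estimated from population data as the ratio between a typical (median) individual's sensitivity and that of a highly sensitive individual, e.g., at the 1st percentile. The sketch below is an illustration of that idea only, with an invented lognormal spread of individual sensitivity (EC10) values; the spread is not a result from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical EC10 values (micromolar) for 1,086 individuals; the
# lognormal spread (sigma) is invented for illustration.
ec10 = rng.lognormal(mean=0.0, sigma=0.8, size=1086)

# Toxicodynamic variability factor: how much more sensitive the
# 1st-percentile (most sensitive) individual is than the median individual.
tdvf01 = np.median(ec10) / np.percentile(ec10, 1)

default_td_factor = 10 ** 0.5  # ~3.2, half of the default 10-fold margin
print(round(default_td_factor, 2), round(tdvf01, 1))
```

When a chemical's population ratio exceeds 3.2, as it did for about half the chemicals in the study, the default toxicodynamic factor under-protects the most sensitive individuals, which is the argument for chemical-specific factors.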
The researchers also examined the relationship between gene variations (polymorphisms) and cells’ sensitivity to toxicity. They discovered sensitivity-related polymorphisms in several genes involved in transporting substances across the cell membrane. The study was the first to highlight the role of this gene family in susceptibility differences for a range of chemicals, a finding that may point to mechanisms by which chemicals affect human health.
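The kind of genotype-sensitivity association described above can be illustrated with a toy additive model: regress each individual's log sensitivity on the number of variant alleles that individual carries. Everything in this sketch (allele frequency, effect size, noise level) is invented for illustration and is not the study's statistical method.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1086  # number of individuals, matching the study's cell-line count

# Hypothetical data: genotype = copies of a variant allele (0, 1 or 2);
# log EC10 shifts by an invented 0.3 per allele, plus random noise.
genotype = rng.binomial(2, 0.3, size=n)
log_ec10 = 0.3 * genotype + rng.normal(0.0, 0.5, size=n)

# Simple additive-model association: slope of log EC10 on allele count.
slope, intercept = np.polyfit(genotype, log_ec10, 1)
print(round(slope, 2))
```

A nonzero slope indicates that carrying the variant shifts an individual's sensitivity; scanning many variants this way, with appropriate multiple-testing corrections, is how polymorphisms such as those in membrane-transport genes can surface.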
Disseminating a New Approach to Protecting Public Health
Prior to the paper’s publication, the team released portions of the data to the international scientific community in the form of a data challenge called the DREAM Toxicogenetics Challenge. The competition harnessed crowdsourcing, inviting scientists to use Tox21 data to develop innovative models for predicting chemical toxicity across populations.
The results from this paper, as well as from crowdsourced analyses, could help regulators devise a more accurate way of determining safe levels of environmental chemicals. They also may enable the identification of people who are especially sensitive to certain chemicals.
“This work personifies the NCATS 3Ds,” said Christopher P. Austin, M.D., NCATS director. “Our team collaboratively developed a new way to address the problem of individual differences in sensitivity to chemicals; through this paper, we demonstrated that our method works, and we disseminated the results so that scientists can generate their own hypotheses and use the techniques in their own toxicological studies. This approach to studying personalized effects of environmental exposures, like that of precision medicine, embodies NCATS’ mission of finding translational solutions to improve human health.”
Posted March 2015