
Journal of Exposure Science and Environmental Epidemiology (2009) 19, 536–543 © 2009 Nature Publishing Group All rights reserved 1559-0631/09/$32.00

www.nature.com/jes

Meeting Report

Twenty-first century approaches to toxicity testing, biomonitoring, and risk assessment: perspectives from the global chemical industry

RICHARD D. PHILLIPSa, TINA BAHADORIb, BRENDA E. BARRYb, JAMES S. BUSc, TIMOTHY W. GANTd, JANET M. MOSTOWYe, CLAUDIA SMITHf, MARC WILLUHNf AND ULRIKE ZIMMERg

aExxonMobil Petroleum and Chemical, Machelen, Belgium
bAmerican Chemistry Council Long-Range Research Initiative, Arlington, Virginia, USA
cDow Chemical Company, Midland, Michigan, USA
dUniversity of Leicester, Medical Research Council, Leicester, UK
eBayer Material Science, Pittsburgh, Pennsylvania, USA
fEuropean Chemical Council, Brussels, Belgium
gVerband der Chemischen Industrie, Frankfurt, Germany

The International Council of Chemical Associations’ Long-Range Research Initiative (ICCA-LRI) sponsored a workshop, titled Twenty-First Century Approaches to Toxicity Testing, Biomonitoring, and Risk Assessment, on 16 and 17 June 2008 in Amsterdam, The Netherlands. The workshop focused on interpretation of data from the new technologies for toxicity testing and biomonitoring, and on understanding the relevance of the new data for assessment of human health risks. Workshop participants articulated their concerns that scientific approaches for interpreting and understanding the emerging data in a biologically relevant context lag behind the rapid advancements in the new technologies. Research will be needed to mitigate these lags and to develop approaches for communicating the information, even in a context of uncertainty. A collaborative, coordinated, and sustained research effort is necessary to modernize risk assessment and to significantly reduce current reliance on animal testing. In essence, this workshop was a call to action to bring together the intellectual and financial resources necessary to harness the potential of these new technologies towards improved decision making. Without investment in the science of interpretation, it will be difficult to realize the potential that the advanced technologies offer to modernize toxicity testing, exposure science, and risk assessment. Journal of Exposure Science and Environmental Epidemiology (2009) 19, 536–543; doi:10.1038/jes.2009.38

Keywords: toxicity testing, high-throughput screening, Tox21, risk assessment, biomonitoring, animal testing alternatives.

Introduction

New developments and innovations in technologies for toxicity testing and biomonitoring present an invaluable opportunity to improve current risk assessment methodologies and their contribution to regulatory decision-making processes. Recent reports from the National Academy of Sciences (NAS, 2006, 2007a, b) describe the potential for a true paradigm shift in toxicity testing that takes advantage of recent revolutionary changes in technologies for biology and biotechnology. These new technologies provide sensitive approaches for detecting genetic and metabolic changes in cells and tissues as well as for detecting the presence of low levels of compounds in biological samples. However, important questions have emerged concerning the extensive data sets that these new technologies can generate. How can data primarily from in vitro assay systems be appropriately interpreted in the absence of interactions with the complex biological systems from which the samples originated? How do changes in these model test systems relate to apical end points that have traditionally been used in toxicity testing? How can relevant exposure and dose information be collected so that clear links can be drawn between the biological observations and the environmental exposure conditions? How can this information be applied to improve our understanding of risk and the decisions related to the regulatory process?

A workshop, titled Twenty-First Century Approaches to Toxicity Testing, Biomonitoring and Risk Assessment, was organized to stimulate interactions and discussions among a variety of stakeholders regarding implementation of the new technologies towards improved assessment of human health risks, with an emphasis on research, development, and advancement of the new methods. A clear benefit of such improvements in risk assessment would be to better inform the regulatory decision-making process.

1. Address all correspondence to: Tina Bahadori, American Chemistry Council Long-Range Research Initiative, 1300 Wilson Boulevard, Arlington, VA 22209, USA. E-mail: [email protected]
Received 14 June 2009; accepted 18 June 2009

Workshop organization

This workshop, among the first international meetings focusing on this topic, was held on 16–17 June 2008 in Amsterdam, the Netherlands. It was organized and sponsored by the ICCA-LRI, which comprises the Long-Range Research Initiatives (LRIs) of the American Chemistry Council, Cefic (the European Chemical Industry Council), and the Japanese Chemical Industry Council. The workshop was part of the ICCA-LRI’s global research strategy to strengthen the scientific foundation for public policy and decision making through support of quality research regarding the effects of chemicals on human health and the environment.

More than 150 participants attended the workshop, including scientists, public health/public policy professionals, industry representatives, and communicators from academia and governmental/non-governmental organizations in the United States, Canada, Europe, and Japan.

This report summarizes the discussions and recommendations that emerged from the workshop that helped the ICCA-LRI plan its next generation of research programs. It does not represent a consensus document among the workshop attendees. A complete report can be found in the 2008 ICCA-LRI Workshop Report.

Context for the workshop

The primary motivation for this workshop was the rapidly emerging shift in thinking about the use of new technologies to evaluate human health and environmental risks in the twenty-first century. Science, technology, and innovative thinking are all essential tools for addressing the variety of global concerns regarding risks. Enabling this paradigm shift will require significant capacity building as well as dialog and collaborations between scientists and policy experts from all sectors, including industry, academia, and governmental and non-governmental organizations.

The new technologies present an exciting opportunity to extend beyond traditional risk assessment approaches and to better understand the effects of chemicals on humans and the environment. They are particularly relevant for clarifying our understanding of potential adverse effects from exposures to the low levels of chemicals present in our environment. The new technologies include toxicogenomics, which combines the tools of traditional toxicology with those of genomics, as well as bioinformatics and high-throughput screening (HTS) assays that can be used to evaluate changes in gene, protein, and metabolite profiles in cell and tissue samples (Waters and Fostel, 2004; Hayes and Bradfield, 2005; Kavlock, 2008). Many of these new technologies were originally developed by the pharmaceutical industry for rapid toxicity screening of potential drug candidates. Numerous governmental and academic organizations were also involved in the development and refinement of these technologies. This knowledge base provides an unprecedented leverage to develop related approaches for rapidly assessing the potential toxicities of chemical compounds in the environment.

At present, determination of human health risks from exposures to chemicals is based primarily on results from animal model toxicity studies that involve exposures at doses that can far exceed everyday human exposure levels. These results are then adjusted with a variety of factors to extrapolate from high-dose to low-dose effects, to account for species differences between animals and humans, and to determine levels at which no effects or minimal effects are likely to occur. Several drawbacks of these traditional toxicity studies are that they usually take years to complete, they are very costly, and they can use a large number of animals. An additional drawback is that the adjustment factors used may not fully account for metabolic differences between humans and animals that can alter toxicological outcomes.

A majority of scientists in the fields of risk assessment and toxicology acknowledge that the long-term, high-dose exposure regimens used for animal models yield results that may not be biologically relevant to the potential adverse effects and risks from real-world exposures to chemicals for humans (Conolly et al., 1999). The new technologies, such as toxicogenomics, can in weeks to months provide extensive information about modifications in cells and tissues following exposures to chemicals at a variety of concentrations. However, the key current question is how best to interpret the volume of data documenting these in vitro modifications in a way that is relevant for predicting potential adverse changes/outcomes in whole organisms and that is relevant to the changes observed in the longer term animal exposure models. One example is a current lack of understanding regarding changes in gene-expression profiles and their potential downstream consequences for cells, tissues, and organisms. Meaningful interpretation of the data from the new technologies will be essential for using these new technologies to improve the regulatory decision-making process.

The 2007 report from the NAS, titled Toxicity Testing in the 21st Century: A Vision and a Strategy, which describes the new technologies for evaluating environmental health risks, was a major focus for the workshop (National Academy of Sciences, 2007a). The basis for the new paradigm is use of the new technologies to detect activation of toxicity pathways, which the report generally defines as cellular response pathways that, when sufficiently perturbed, can result in adverse pathological outcomes. Changes in gene, protein, and metabolite profiles following exposures to chemical compounds or environmental stressors can be identified with the new technologies and used to identify

cellular and tissue changes that may portend activation of toxicity pathways. One intended goal for these new analytical approaches is to begin to use the data in place of traditional apical end points, such as gross physiological and pathological changes including tissue damage and tumor formation, from the longer term and more costly studies. However, as noted in the NAS report, a critical prerequisite for meaningful use of the data from the new technologies will be the ability to distinguish between the changes representing perturbations in cells or tissues that may resolve themselves through normal homeostatic mechanisms and those changes indicating true activation of toxicity pathways that may lead to true pathology. A further need is to determine whether all perturbations are adverse, whether or not they resolve when the causative agent is removed.

The benefits envisioned through use of the new technologies to generate toxicity test data and improve the risk assessment and decision-making processes are numerous. The computational or in silico methods that are the core of the approach have the potential to increase the number of chemicals and end points that can be evaluated while decreasing both the time and costs required to complete the assays and the number of animals used for testing. The results can be a source of detailed mechanistic and dose information relevant to human health risk assessment. The report acknowledges that a substantial commitment of resources will be required to implement the vision that has been outlined. In addition, successful implementation of this new paradigm in toxicity testing will require support from the scientific community, regulators, law-makers, industry, and the public. Last, but not least, effective communication among all of these parties regarding the development and ultimate value of these new methodologies will be a key to its success.

European perspectives on the use and potential value of the new technologies for addressing current issues regarding the evaluation of chemical toxicity in the European Union (EU) were an integral part of the workshop discussions. Two direct applications include compliance with the testing requirements under the new legislation for chemicals, called the Registration, Evaluation, Authorization and Restriction of Chemicals Regulation, and the recent EU-wide ban on animal use for cosmetics development. It is anticipated that the in vitro methods that constitute the core of the new technologies can accelerate completion of required testing while reducing costs and animal usage. However, the challenge for the new technologies will be to meet the requirements for international validation.

A number of emerging toxicogenomic testing projects and programs were discussed at the workshop. Several in vitro test projects that are in use or under development in Europe, such as Predictomics, ReProTect and TOXDROP, employ novel techniques for determining chemical toxicity. Another European collaborative project, carcinoGENOMICS, is using several ’omic platforms from genomics to metabonomics with novel bioinformatics and test systems, such as stem cells, to assess the potential of these methods to screen for genotoxic and carcinogenic properties of chemical compounds in vitro. This alternative to rodent bioassays can test genotoxic and non-genotoxic carcinogens and ultimately would be used to develop a battery of organ-specific genomics-based in vitro assays that will be submitted to the European Centre for the Validation of Alternative Methods. In the area of biomonitoring, a food quality and safety program is aimed toward identifying biomarkers of exposure for chemicals of relevance to human health and disease. In addition, NewGeneris is an in vivo regime that will employ biomarkers of dietary exposure to genotoxic and immunotoxic chemicals, as well as biomarkers of early effects, using mother-child birth cohorts and biobanks. All of these test regimes support the EU’s progression towards the use and application of the new technologies of transcriptomics, proteomics, metabonomics, and bioinformatics.

In the United States, two major research efforts are focusing on in vitro approaches as alternatives to traditional animal toxicity testing. In 2007, the US EPA’s National Center for Computational Toxicology (NCCT) launched ToxCast to develop a cost-effective in vitro approach for prioritizing toxicity testing for a large number of chemicals in a short period of time (Dix et al., 2007). In response to the release of the 2007 NRC report, two NIH institutes and the EPA NCCT formed a collaboration, called Tox21, that has been designed to (1) identify mechanisms of chemically induced biological activity, (2) prioritize chemicals for more extensive toxicological evaluation, and (3) develop more predictive models of in vivo biological response (Austin et al., 2008; Collins et al., 2008). Tox21 is an innovative approach to build complementary capacity across different governmental agencies, and to broaden stakeholder collaborations for addressing emerging scientific needs. Tox21 is expected to deliver biological activity profiles that are predictive of in vivo toxicities for thousands of substances of importance to regulatory authorities in the United States, as well as in many other countries.

Biomonitoring and its applications for public health issues within an evolving risk framework were another major focus of the workshop. Biomonitoring data reflect ongoing or previous exposures to chemicals based on measurements of the chemicals themselves or their metabolites in fluid and tissue samples, such as human blood and urine (National Academy of Sciences, 2006). Some of the current challenges for biomonitoring include proper identification of exposure sources, understanding the impacts of sampling methods on quantification of exposure, determining the potential effects of mixed exposures, and a lack of relevant toxicity values to interpret the data, particularly among susceptible individuals, such as children and the elderly.

Through its biomonitoring research program, the ICCA-LRI has explored the linkages between biomonitoring

data and environmental exposures. The relationships among biomonitoring, real-world exposures, and recent innovations in technology for toxicity testing could be synergized through a global coordination effort by the LRI. Such a coordinated effort could provide great value to the chemical industry by improving the understanding of the potential effects of chemicals at environmentally relevant exposure levels.

Biomonitoring assay methods now have exquisite sensitivity to detect chemicals and metabolites in samples at the parts per billion and parts per trillion levels. Key questions that emerge related to these enhanced detection limits are whether and how the measured levels potentially link to predictions for potential adverse health outcomes and what the relevance of the detection limits is for characterizing true population exposures.

A current trend in biomonitoring research is the use of existing risk-based approaches as a context for the measurement results, such as converting a measured level to a dose level in humans or animals using physiologically based pharmacokinetic (PBPK) modeling. These PBPK models use computer-based approaches to combine information about the physiology and anatomy of an animal or the human body to understand the biochemistry and metabolism of the chemical or chemicals of interest following an exposure. Future directions for improved biomarkers include their use in epidemiological studies to define biomonitoring-response relationships and to help set research priorities. Another future focus for biomonitoring studies is to extend beyond measurements of body burden to applications that will facilitate the understanding of health end points and exposure sources.

Parallel sessions

Three parallel sessions were convened following the workshop’s opening plenary sessions to provide an opportunity for more detailed interactions and discussions among speakers and participants regarding three major themes for the workshop: biomonitoring, advanced technologies, and risk assessment.

Human biomonitoring

This session explored the links between biomarkers of exposures and environmental exposures and how advancements in technologies, such as biological or environmental monitoring and modeling, can facilitate these connections and subsequent interpretation of human biomonitoring data. The aim was to showcase new developments in quantitative and qualitative interpretation and application of biomonitoring data and to present opportunities that would advance their application to public health issues. Three relevant topics included: (1) how biomonitoring data are currently being used in large population studies; (2) reports of new biomonitoring data as well as information on new trends and planning efforts; and (3) current thinking on modeling and statistical approaches to improve dose and exposure estimates.

Several large ongoing biomonitoring studies by public health organizations demonstrate both the benefits and complexities inherent in biomonitoring studies. A number of the large ongoing studies were discussed as examples. The German Health Interview and Examination Survey for Children and Adolescents (KiGGS) study has evaluated more than 17,000 German children up to the age of 17 years (Robert Koch Institute (RKI), 2005). These collected data include objective measures of physical and mental health as well as parental or self-reported information about subjective health status, health behavior, health care utilization, social and migrant status, living conditions, and environmental determinants of health. The German Environmental Survey on Children (GerES) has examined exposures and exposure pathways to environmental pollutants, including metals, pesticides, and selected chemical compounds, in a subset of children 3 to 14 years of age who are part of the KiGGS study (Umweltbundesamt (UBA), 2008). Data from both of these studies now allow evaluation of relationships between environmental conditions and the health of children in Germany.

Two major biomonitoring initiatives by the Centers for Disease Control and Prevention, the National Reports on Human Exposure to Environmental Chemicals and the National Health and Nutrition Examination Survey, were reviewed as examples of large biomonitoring studies in the United States (CDC, 2007).

Discussions about these biomonitoring studies highlighted the challenges they present, including the specificity of any given biomarker, its persistence in the body, the limits of analytical sensitivity for detecting a biomarker, intra-person and inter-person variabilities in the results, and ultimately how to link the results to actual exposures. Differences between reference values, which are statistically derived and widely accepted but not health-based, and human biomonitoring (HBM) values, which are health-based, were reviewed. The discussants noted that reference values are derived as average values for a sample population, whereas HBM values are used to identify concentrations above and below which adverse health effects may occur.

Reverse dosimetry is a method for estimating exposure and risk from human biomonitoring data (Clewell et al., 2008). The method involves probabilistic dose reconstruction at the population level and human pharmacokinetic modeling to describe the relationship between the biomarker and external dose. The method may also require information about the nature of the exposure, such as source, frequency, and duration. It can be a useful approach in the absence of a direct link between a biomarker and health outcomes, but a confounding issue is the uncertainty and variability in human exposures.
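The reverse dosimetry idea just described can be illustrated with a minimal, deterministic sketch. This is not a method from the workshop report: it assumes steady state and simple first-order kinetics, and every chemical-specific value below is a hypothetical placeholder, whereas real applications use full PBPK models.

```python
# Illustrative sketch only: back-calculating an external dose from a
# urinary biomarker measurement. All parameter values are hypothetical.

def estimated_daily_intake(
    urine_conc_mg_per_l: float,              # measured biomarker concentration
    urine_output_l_per_day: float = 1.5,     # assumed adult urinary output
    fraction_excreted_in_urine: float = 0.7, # hypothetical mass fraction excreted as biomarker
    body_weight_kg: float = 70.0,
) -> float:
    """Steady-state reverse dosimetry: reconstructed dose in mg/kg-day."""
    daily_excretion_mg = urine_conc_mg_per_l * urine_output_l_per_day
    return daily_excretion_mg / (fraction_excreted_in_urine * body_weight_kg)

# Compare the reconstructed dose with a (hypothetical) reference dose.
measured = 0.05   # mg/L in urine
rfd = 0.01        # mg/kg-day, hypothetical guidance value
dose = estimated_daily_intake(measured)
ratio = rfd / dose  # a ratio > 1 suggests exposure below the reference dose
print(f"reconstructed dose: {dose:.5f} mg/kg-day; RfD/dose ratio: {ratio:.2f}")
```

A one-compartment calculation like this conveys the direction of the inference (biomarker level back to external dose) but none of the physiological detail a PBPK model supplies.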


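Because the dose reconstruction described above is probabilistic at the population level, the same back-calculation can be wrapped in a simple Monte Carlo loop that propagates variability in the inputs. Again, this is only a sketch, and all of the distributions below are hypothetical placeholders rather than values from the report.

```python
# Illustrative sketch only: propagating population variability through a
# steady-state reverse dosimetry calculation. Distributions are hypothetical.
import random

random.seed(0)

def reconstructed_dose(urine_conc, urine_output, fraction_excreted, body_weight):
    """Steady-state reverse dosimetry: external dose in mg/kg-day."""
    return urine_conc * urine_output / (fraction_excreted * body_weight)

doses = []
for _ in range(10_000):
    conc = random.lognormvariate(-3.0, 0.5)  # mg/L, hypothetical population spread
    output = random.gauss(1.5, 0.3)          # L/day urinary output
    fue = random.uniform(0.5, 0.9)           # fraction excreted in urine
    bw = random.gauss(70.0, 12.0)            # body weight, kg
    doses.append(reconstructed_dose(conc, max(output, 0.1), fue, max(bw, 30.0)))

doses.sort()
median = doses[len(doses) // 2]
p95 = doses[int(0.95 * len(doses))]
print(f"median dose: {median:.2e} mg/kg-day; 95th percentile: {p95:.2e}")
```

The resulting dose distribution, rather than a single point estimate, is what would be compared against guidance values in a population-level assessment.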
Another recent development is biomonitoring equivalents (BE). The BE approach can use human biomonitoring data for a variety of compounds and then draw upon existing risk assessment information for those compounds to provide a tool for the interpretation of biomonitoring data in a health risk assessment context. BE values are based on transformation of human tolerable daily intake values, which have been derived from existing risk assessments and include animal toxicity data, using pharmacokinetic modeling approaches (Hays et al., 2007, 2008). BEs developed for specific chemicals that may be present in blood, urine, or other human samples are designed to be consistent with existing exposure guidance values, such as reference doses and reference concentrations. They can serve as a communication tool for placing population-based biomonitoring results in a public health-risk context and for informing chemical prioritization decisions regarding the need for further risk assessment or for risk management decisions.

The discussions during the session identified a number of challenges for the use and interpretation of biomonitoring data. To use biomonitoring data for effective intervention or risk management strategies, a link has to be established between external sources and the pathways for exposure. This presents a particular challenge for cumulative exposures that include multiple chemicals and the potential for interactions among these chemicals. More specific biomarkers of exposure and improved designs for both exposure and epidemiological studies that will elucidate these linkages are required. Biomonitoring data should be considered among the full suite of information gathered for exposure analysis studies, and to that end, collection of relevant peripheral data through questionnaires and personal/environmental monitoring is also essential. Biological specimens stored under proper conditions could be banked for future analyses that would use new methods as they are developed. The absence of quality standards for most biomarkers and uncertainty in laboratory analysis also present limitations for biomonitoring methods. However, further development and refinement of approaches that more clearly link biomonitoring results back to exposure, such as reverse dosimetry and BEs, will be valuable elements for improving risk assessment and fostering more effective risk management decisions for the general public.

Advanced technologies

Discussions about the advanced technologies presented an opportunity to review recent developments that utilize toxicogenomics and HTS assays as well as a systems biology approach. Two objectives that have been identified for these new technologies are to provide toxicity (hazard) screening information for a large number of chemicals and to provide mode of action information to guide understanding of risk assessment (National Academy of Sciences, 2007a). The session focused on understanding the similarities and differences among ongoing studies that are utilizing the new technologies, identifying areas for cooperation, maximizing value from the programs, and identifying a role for the ICCA-LRI to address potentially underemphasized areas. Discussions included the types of information that these various technological approaches can produce and how they might be useful for risk assessment. As noted previously, a key element of the paradigm shift for the use of these new technologies is to move the focus from thinking about changes in single genes towards impacts on multiple genes that constitute toxicity pathways. Related to the recent NAS (2007b) report on applications of toxicogenomic technologies to predictive toxicology and risk assessment, a potential benefit of these new technologies would be evaluation of the shape of dose–response curves for chemicals at levels relevant to real-world human exposures. A key question is whether genomic technologies can be successfully used to identify and characterize key event changes in cells and tissues in this low-dose region.

The Tox21 program uses quantitative HTS to identify biological activities for large chemical libraries. Traditional HTS approaches used by the pharmaceutical industry for drug discovery have the capacity to evaluate more than 100,000 chemicals per day. However, only one dose per compound is generally evaluated, and high rates of false positives and false negatives can occur. Tox21 uses 15 different concentrations for each compound and produces robust biological activity profiles for human and animal cells; these include evaluation of viability, enzymatic pathways, and nuclear receptor assays that have low rates of false positives and false negatives.

In Europe, the EU Framework Programme 6 PredTox Project used an integrated toxicogenomics approach for mechanistic biomarker identification. The project focus was an in vivo rat study that evaluated toxicities in the liver and kidneys caused by selected drug candidates using samples of liver, kidney, blood, and urine. The study protocol incorporated conventional end points, such as clinical chemistry and histopathology, with transcriptomics, proteomics, and metabonomics. The objective of the study was to explore the sensitivity of “omic” methods as predictive systems with short and low-dose exposures. These data would be applied to increase the understanding of toxicity mechanisms and to evaluate use of the new technologies to identify potential biomarkers in tissue samples and fluids.

A current challenge for the use of genomics in toxicology lies in our ability to use these tools together with other relevant molecular approaches to determine whether changes in gene expression can be used as indicators of adverse biological responses to a toxic exposure. It is anticipated that specific mechanisms of toxicity will elicit specific patterns of gene expression; whether these patterns are conserved across species is an additional question that remains to be answered. For predictive toxicology, the question remains whether use

of “omics” data can improve understanding of mechanisms of toxicity. However, which toxicity parameters should be evaluated in depth and whether genomics can improve the performance of in vitro models are areas for continued research. One concern is that while the costs of generating the data will likely decrease, the costs of interpreting the data are likely to increase due to their quantity and complexity. As the function of more genes, and consequently more networks, is understood, reevaluation of previously generated “omics” data will also be needed. Although these directions suggest future scientific and financial challenges, they should not deter generation of data, but only emphasize the importance of relevant data interpretation.

It also remains to be determined whether the new technologies can inform current operational assumptions in toxicity testing and risk assessment. Examples of potential tests of these assumptions include examination of the current routine application of a linear, no-threshold approach for risk assessments of genotoxic substances and determination of whether traditional maximum-tolerated-dose approaches that use pathology and/or organ and body weight effects can be refined to align with effects identified with toxicogenomics. The new approaches are also drivers for devising new toxicological terms, such as the no observed transcriptional effect level; however, the application of such concepts to future risk assessment will require further consideration. In addition, the workshop participants noted an opportunity to apply toxicogenomic approaches to further explore and refine the use of Toxic Equivalency Quantity and Toxic Equivalency Factor defaults that are currently used to characterize risks from chemical mixtures. Additional opportunities remain to reexamine the current use of multiple uncertainty factors in risk assessments and to replace the uncertainty defaults with data-based information.

Risk assessment

The new technologies present challenges to risk assessment practitioners to determine how to translate the large volumes of data into meaningful information that can be considered in public health decision making. A companion challenge is how to effectively communicate these data to the medical community, the public, and the media. To remain relevant, risk assessment approaches will need to be modified to be more dynamic and responsive to this oncoming deluge of information. Workshop participants discussed the need for a significant paradigm shift in risk assessment approaches.

The NAS (2007a) report highlighted population-level exposure data as a critical element to inform toxicity testing, describe risk, and place risk into a real-world context. Implementation of this aspect of the vision will require a systems-based incorporation of exposure and its interaction with effect (Cohen Hubal et al., 2008; Edwards and Preston, 2008; Sheldon and Cohen Hubal, 2009). New predictive tools and databases will need to be developed to characterize exposure and dose and then to link this information to toxicity data.

Looking at a path forward, a possible first step would be to use a problem formulation approach and the available data sources as anchors for that process. Such sources could include the large databases of the physical and chemical properties of chemicals as well as existing use and exposure information. The large volumes of available data that can serve as a bridge to future understanding include structure–activity relationships and the extensive databases on human health, such as epidemiology data for a number of chemicals, animal toxicity pathways, and mode of action information. Questions do remain whether elements of the existing risk assessment framework can inform or be used to develop a new framework and what first steps would be needed to create such a framework. One key to success will be to invest the time needed to better understand the potential complexities of the toxicity pathways discussed in the NAS (2007a) report; this step will be critical to establish a valid bridge from existing animal data to toxicity pathways in humans.

Perspectives and path forward

Participation at the workshop by academics, regulators, and industry and government representatives was invaluable for obtaining input and feedback from a variety of perspectives. This blend of expertise enhanced the process of identifying research gaps and of defining research directions that will lead to better understanding of the potential impact of chemicals on human health and the environment. The workshop underscored the chemical industry’s commitment to research for developing 21st century approaches to toxicity testing, exposure science, and risk assessment. One clear outcome of the workshop was support from across the range of attendees for future research efforts to advance exposure science to improve the ability to characterize exposures in an environmentally relevant context. To complement the advances in toxicity testing and risk assessment, corresponding advances in exposure science will be needed to characterize relevant exposure with an efficiency and scale commensurate with the toxicity testing initiatives.

A number of observations emerged from the workshop that helped shape the path forward for ICCA-LRI research planning. They are summarized here:

- New science and technology can be instrumental for creating a paradigm shift in risk assessment. To engage in a rational debate about a new paradigm, a commitment must be made to fill the information gap.
- Public perception of risk involves many personal beliefs, judgments, attitudes, and feelings. Increased access to data may improve public perceptions of risk, but it also carries the potential for misuse and misinterpretation of the data and for decreased societal receptivity to innovation.

• Timely and relevant communication strategies will be essential for effectively distributing information throughout the network of stakeholders.
• The move from identifying hazards to identifying risks will require targeted studies to understand and map toxicity pathways. Essential elements include comprehensive in vitro studies, preferably with human cells; computational maps of the toxicity pathways; and underpinning the analysis of pathways with support from basic science to elucidate and analyze perturbations in those pathways.
• The balance of financial commitment to the new technologies will shift from data generation to data interpretation as more high-throughput technologies are employed.
• Understanding the potential effects of exposures to chemicals in early life and the influence of genetic susceptibility, including polymorphisms, is critical. The emerging technology enables characterization of genetic polymorphisms; however, proper interpretation and understanding of the data will require much research, such as full evaluation over the complete range of dose–response.
• Fundamental to improving risk assessment with the new technologies are the needs to relate in vitro doses to in vivo exposures and to characterize dose–response relationships. Gene expression may identify a hazard, but a true assessment of risk requires relevant dose–response data and a mechanistic understanding of the relevant toxicity pathways, particularly at low doses.
• A key question is how the new toxicity information will be translated to assess the potential for real-world human health risks. A paradigm shift in risk assessment requires a priori and commensurate consideration of the totality of the paradigm and an understanding of the real-world context of population exposures. Population-based data and human exposure information will be critical for guiding development and use of toxicity information.
• Requisite components of population information include: information on host susceptibility and background exposures to interpret and extrapolate in vitro test results; human exposure data to select doses for toxicity testing so that hazard information on environmentally relevant effects can be developed; and biomonitoring and biomarker data to relate real-world human exposures to the concentrations that perturb toxicity pathways, to identify potential exposures.
• To realize the vision of the NAS (2007a) report, a strategic and fully integrated systems approach is required. This approach can be actualized through a coordinated and sustained investment, not only to develop technologies but also to interpret the emerging data in the context of population health.
• Scientific and regulatory acceptance of the new toxicity testing approaches should be fostered through peer-reviewed research, established test protocols, and validated models. Importantly, this acceptance should emerge from a robust, multi-stakeholder dialog, as evidenced at this workshop, with its focus on the design, interpretation, and application of these complex but informative technologies to risk assessment.

A specific aim of the workshop was to stimulate a discussion on the scientific basis for policy-making, to understand how these new technologies can have a role in risk assessment, and to build a consensus to move the risk assessment process forward. The recent emergence of these new technologies offers an unprecedented opportunity to advocate effectively and proactively for science-based decision making, to formulate better decisions about potential effects from exposures to chemicals, and to speak to the safety of chemicals from a stronger scientific basis. The perspectives and observations from the workshop that are summarized above can serve as a starting point for ICCA-LRI research to synergize the use of these new technologies toward those ends.

References

Austin C., Kavlock R., and Tice R. Tox21: Putting a lens on the vision of Toxicity Testing in the 21st Century. Available at http://www.alttox.org/ttrc/overarching-challenges/way-forward/austin-kavlock-tice/; last updated August 19, 2008.
Centers for Disease Control and Prevention (CDC). National Health and Nutrition Examination Survey (NHANES) Web site. National Center for Health Statistics, 2007. Available at http://www.cdc.gov/nchs/nhanes.htm.
Clewell H.J., Tan Y.M., Campbell J.L., and Andersen M.E. Quantitative interpretation of human biomonitoring data. Toxicol Appl Pharmacol 2008: 231: 122–133.
Cohen Hubal E.A., Richard A.M., Imran S., Gallagher J., Kavlock R., Blancato J., and Edwards S. Exposure science and the US EPA National Center for Computational Toxicology. J Exp Sci Environ Epidemiol 2008. Available online: 5 Nov 2008, doi:10.1038/jes.2008.70.
Collins F.S., Gray G.M., and Bucher J.R. Transforming environmental health protection. Science 2008: 319: 906–907.
Conolly R.B., Beck B.D., and Goodman J.I. Stimulating research to improve the scientific basis of risk assessment. Toxicol Sci 1999: 49: 1–4.
Dix D.J., Houck K.A., Martin M.T., Richard A.M., Setzer W., and Kavlock R.J. The ToxCast program for prioritizing toxicity testing of environmental chemicals. Toxicol Sci 2007: 95: 5–12.
Edwards S.W., and Preston R.J. Systems biology and mode of action based risk assessment. Toxicol Sci 2008: 106: 312–318.
Hays S.M., Aylward L.L., LaKind J.S., et al. Guidelines for the development of biomonitoring equivalents: report from the biomonitoring equivalents expert workshop. Regul Toxicol Pharmacol 2008: 51: S4–S15.
Hays S.M., Becker R., Leung H.-W., Aylward L.L., and Pyatt D.W. Biomonitoring equivalents: a screening approach for interpreting biomonitoring results from a public health risk perspective. Regul Toxicol Pharmacol 2007: 47: 96–109.
Hayes K.R., and Bradfield C.A. Advances in toxicogenomics. Chem Res Toxicol 2005: 18: 403–414.
Kavlock R.J., Ankley G., Blancato J., Breen M., Conolly R., Dix D., Houck K., Hubal E., Judson R., Rabinowitz J., Richard A., Setzer R.W., Shah I., Villeneuve D., and Weber E. Computational toxicology: a state of the science mini review. Toxicol Sci 2008: 103: 14–27.
National Academy of Sciences. Board on Environmental Studies and Toxicology. Human Biomonitoring for Environmental Chemicals. The National Academies Press, Washington, DC, 2006.

National Academy of Sciences. Board on Environmental Studies and Toxicology. Toxicity Testing in the 21st Century: A Vision and a Strategy. The National Academies Press, Washington, DC, 2007a.
National Academy of Sciences. Board on Environmental Studies and Toxicology. Applications of Toxicogenomic Technologies to Predictive Toxicology and Risk Assessment. The National Academies Press, Washington, DC, 2007b.
RKI (Robert Koch Institute). KiGGS: The German health survey for children and adolescents, 2005. Available at http://www.kiggs.de/experten/downloads/dokumente/kiggs_engl.pdf.
Sheldon L.S., and Cohen Hubal E.A. Exposure as part of a systems approach for assessing risk. Environ Health Perspect 2009. Available online: 8 April 2009, doi:10.1289/ehp.0800407.
Umweltbundesamt (the German Federal Environment Agency) (UBA). The health and environmental hygiene German environmental survey 2003/06 (GerES IV) for children Web site, 2008. Available at http://www.umweltbundesamt.de/gesundheit-e/survey/us03/uprog.htm.
Waters M.D., and Fostel J.M. Toxicogenomics and systems toxicology: aims and prospects. Nat Rev Genet 2004: 5: 936–948.
