
Toxicity Testing in the 21st Century: A Vision and a Strategy




Journal of Toxicology and Environmental Health, Part B, 13:51–138, 2010
Copyright © Taylor & Francis Group, LLC
ISSN: 1093-7404 print / 1521-6950 online
DOI: 10.1080/10937404.2010.483176

TOXICITY TESTING IN THE 21ST CENTURY: A VISION AND A STRATEGY

Daniel Krewski1,2, Daniel Acosta, Jr.3, Melvin Andersen4, Henry Anderson5, John C. Bailar III6, Kim Boekelheide7, Robert Brent8, Gail Charnley9, Vivian G. Cheung10, Sidney Green, Jr.11, Karl T. Kelsey12, Nancy I. Kerkvliet13, Abby A. Li14, Lawrence McCray15, Otto Meyer16, Reid D. Patterson17, William Pennie18, Robert A. Scala19, Gina M. Solomon20, Martin Stephens21, James Yager22, Lauren Zeise23, and Staff of Committee on Toxicity Testing and Assessment of Environmental Agents

1R. Samuel McLaughlin Centre for Population Health Risk Assessment, Institute of Population Health, University of Ottawa, Ottawa, Ontario, Canada
2Department of Epidemiology and Community Medicine, Faculty of Medicine, University of Ottawa, Ottawa, Ontario, Canada
3University of Cincinnati, Cincinnati, Ohio, USA
4The Hamner Institutes for Health Sciences, Research Triangle Park, North Carolina, USA
5Wisconsin Division of Public Health, Madison, Wisconsin, USA
6University of Chicago, Chicago, Illinois, USA
7Brown University, Providence, Rhode Island, USA
8Thomas Jefferson University, Wilmington, Delaware, USA
9HealthRisk Strategies, Washington, DC, USA
10University of Pennsylvania, Philadelphia, Pennsylvania, USA
11Howard University, Washington, DC, USA
12Harvard University, Boston, Massachusetts, USA
13Oregon State University, Corvallis, Oregon, USA
14Exponent, Inc., San Francisco, California, USA
15Massachusetts Institute of Technology, Cambridge, Massachusetts, USA
16The National Food Institute, Technical University of Denmark, Søborg, Denmark
17Reid Patterson Consulting, Inc., Elkhorn, Wisconsin, USA
18Pfizer, Inc., Groton, Connecticut, USA
19Exxon Biomedical Sciences (Ret.), Tucson, Arizona, USA
20Natural Resources Defense Council, San Francisco, California, USA
21The Humane Society of the United States, Washington, DC, USA
22Johns Hopkins University, Baltimore, Maryland, USA
23California Environmental Protection Agency, Oakland, California, USA

With the release of the landmark report Toxicity Testing in the 21st Century: A Vision and a Strategy, the U.S. National Academy of Sciences, in 2007, precipitated a major change in the way toxicity testing is conducted. It envisions increased efficiency in toxicity testing and decreased animal usage by transitioning from current expensive and lengthy in vivo testing with qualitative endpoints to in vitro toxicity pathway assays on human cells or cell lines using robotic high-throughput screening with mechanistic quantitative parameters. Risk assessment in the exposed human population would focus on avoiding significant perturbations in these toxicity pathways. Computational systems biology models would be implemented to determine the dose-response models of perturbations of pathway function. Extrapolation of in vitro results to in vivo human blood and tissue concentrations would be based on pharmacokinetic models for the given exposure condition. This practice would enhance the human relevance of test results and would allow broader coverage of test agents than traditional toxicological testing strategies. As all the tools necessary to implement the vision are currently available or in an advanced stage of development, the key prerequisites to achieving this paradigm shift are a commitment to change in the scientific community, which could be facilitated by a broad discussion of the vision, and obtaining the resources needed to enhance current knowledge of pathway perturbations and pathway assays in humans and to implement computational systems biology models. Implementation of these strategies would result in a new toxicity-testing paradigm firmly based on human biology.

This article is based on the report Toxicity Testing in the 21st Century: A Vision and a Strategy, prepared by the Committee on Toxicity Testing and Assessment of Environmental Agents. The report was originally published by the U.S. National Research Council in 2007. It is reproduced here with permission of the National Academies Press to serve as the basis for a discussion of progress made toward the implementation of the recommendations in the report, as reflected in the invited papers included in this volume. D. Krewski is the Natural Sciences and Engineering Research Council of Canada Chair in Risk Science at the University of Ottawa.

Address correspondence to Dr. Daniel Krewski, McLaughlin Centre for Population Health Risk Assessment, University of Ottawa, 1 Stewart Street, Ottawa, Ontario, Canada K1N 6N5. E-mail: [email protected]

Toxicity testing is approaching a pivotal point where it is poised to take advantage of the revolution in biology and biotechnology. The current system is the product of an approach that has addressed advances in science by incrementally expanding test protocols or by adding new tests without evaluating the testing system in light of overall risk-assessment and risk-management needs. That approach has led to a system that is somewhat cumbersome with respect to the cost of testing, the use of laboratory animals, and the time needed to generate and review data. In combination with varied statutory requirements for testing, it has also resulted in a system in which there are substantial differences in chemical testing, with many chemicals not being tested at all despite potential human exposure to them. Furthermore, the data that are generated might not be ideal for answering questions regarding risk to human health. Accordingly, the U.S. Environmental Protection Agency (EPA) recognized that the time had come for an innovative approach to toxicity testing and asked the National Research Council (NRC) to develop a long-range vision and strategy for toxicity testing. In response to the U.S. EPA's request, the NRC convened the Committee on Toxicity Testing and Assessment of Environmental Agents, which prepared a report; this document is based on that report.

HISTORICAL PERSPECTIVE OF REGULATORY TOXICOLOGY

To gain an appreciation of current toxicity-testing strategies, it is helpful to examine how they evolved, why differences arose among and within federal agencies, and who contributed to the process. The current strategies have their foundation in the response to a tragedy that occurred in 1937 (Gad & Chengelis, 2001). At that time, few laws prevented the sale of unsafe food or drugs. A labeling law prohibited the sale of "misbranded" food or drugs, but the law could be enforced only on the basis of criminal charges that arose after sale of a product. During fall 1937, the Massengill Company marketed a drug labeled "Elixir of Sulfanilamide," which was a solution of sulfanilamide in diethylene glycol. From the recognition of the drug's toxicity to its removal from the market by the Food and Drug Administration (FDA), it caused at least 73 deaths. The tragedy revealed the inadequacy of the existing law. FDA was able to act only because the drug had been mislabeled; at that time, an elixir was defined as a product that contained alcohol. If the company had labeled the drug "Solution of Sulfanilamide," FDA would not have been able to act.

As a result of the sulfanilamide tragedy, Congress passed the Food, Drug, and Cosmetic Act (FDCA) of 1938, which required evidence (that is, from toxicity studies in animals) of drug safety before marketing (Gad & Chengelis, 2001). Major amendments to the FDCA in 1962, known as the Kefauver–Harris Amendments, strengthened the original law and required proof not only of drug safety but of drug efficacy. More extensive clinical trials were required, and FDA had to indicate affirmative approval of a drug before it could be marketed. The approval process thus changed from one based on premarket notification to one based on premarket approval.


The FDCA also dealt with food-safety issues and was amended in 1958 to require manufacturers to demonstrate the safety of food additives (Frankos & Rodricks, 2001). FDA was given authority to develop toxicity studies for assessing food additives and to specify criteria to be used in assessing safety. As a result of the need for scientific safety assessments, toxicologists in FDA, academe, and industry developed the first modern protocols in toxicology during the 1950s and 1960s (see, for example, FDA, 1959). Those protocols helped to shape the toxicity-testing programs that are in use today.

Differences in testing strategies between drugs and foods arose in FDA because of differences in characteristics and regulatory requirements (Frankos & Rodricks, 2001). Drugs are chemicals with intended biologic effects in people, whereas food additives—such as antioxidants, emulsifiers, and stabilizers—have intended physical and chemical effects in food. Thus, a drug manufacturer must demonstrate the desired biologic effect, and a food-additive manufacturer must demonstrate the absence of measurable biologic effect. Regarding regulatory requirements, the FDCA requires clinical trials in humans for drug approval; there is no such requirement for food additives. FDA considers risks and benefits when approving a drug but considers only safety when approving a food additive. Thus, differences in approaches to food and drug testing have evolved.

The public has long been concerned about the safety of intentional food additives and drugs. By the late 1960s, concern about exposure to chemical contaminants in the environment was also growing. In 1970, the U.S. EPA was established "to protect human health and to safeguard the natural environment—air, water, and land—upon which life depends" (U.S. EPA, 2005a). Over the years, U.S. EPA has developed toxicity-testing strategies to evaluate pesticides and industrial chemicals that may eventually appear as food residues or as environmental contaminants.

The 1947 Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA) required the registration of pesticides before marketing in interstate or foreign commerce (Conner et al., 1987). The statute was first administered by the U.S. Department of Agriculture, but authority was transferred to the U.S. EPA when it was created. FIFRA has been amended several times, but the 1972 amendments transformed FIFRA and gave U.S. EPA new powers, such as classification of pesticides and regulation of pesticide residues on raw agricultural commodities. Although registration remained the centerpiece of the act, one amendment required proof that the pesticide did not cause "unreasonable adverse effects" on humans or the environment (Conner et al., 1987). That amendment was largely responsible for the testing strategy that eventually emerged in U.S. EPA.

The other critical pieces of legislation that helped to shape the current toxicity-testing strategy for pesticides were amendments to the FDCA. In 1954, the Miller Amendment "required that a maximum acceptable level (tolerance) be established for pesticide residues in foods and animal feed" (Conner et al., 1987). The Food Quality Protection Act of 1996 amended the FDCA (and FIFRA) and "fundamentally changed the way EPA regulates pesticides" (U.S. EPA, 2005b). Some of the most important changes were the establishment of a risk-based standard for pesticide residues on all foods, the requirement that U.S. EPA "consider all non-occupational sources of exposure . . . and exposure to other pesticides with a common mechanism of toxicity when setting tolerances," the requirement that U.S. EPA set tolerances that would ensure safety for infants and children, and the requirement that U.S. EPA develop and implement an endocrine-disruptor screening program (U.S. EPA, 2006a).

FIFRA, the FDCA, and the amendments to them are responsible for the current toxicity-testing strategy for pesticides, which typically requires extensive testing before a pesticide can be marketed. The strategy for evaluating industrial chemicals is different. The Toxic Substances Control Act (TSCA) was passed in 1976 to address control of new and existing industrial chemicals not regulated by other statutes (Kraska, 2001). Although manufacturers are required to submit premanufacturing notices—which include such information as chemical identity, intended use, manufacturing process, and expected exposure—no specific toxicity testing is required [see the NRC (2006a) interim report for more information on the extent of chemical testing under TSCA]. Instead, the strategy for evaluating industrial chemicals relies heavily on the use of structure–activity relationships.

FDA’s drug and food-additive testingprograms and U.S. EPA’s pesticide testing pro-gram represent strategies designed to supportsafety evaluations of chemicals before specifieduses. Other testing can occur in response toregulatory concerns regarding environmentalagents. For example, U.S. EPA sponsors sometoxicity testing, epidemiologic studies, and testdevelopment to support its regulatory man-dates, such as those under the Safe DrinkingWater Act. The Health Effects Institute (HEI), ajoint U.S. EPA- and industry-sponsored organi-zation, funds toxicity studies to inform regulatorydecisions on air pollutants. As regulatory con-cerns arise, industry may initiate testing to eval-uate further dose-response relationships ofimportant environmental contaminants. TheNational Toxicology Program (NTP)—whichwas created in 1978 to “coordinate toxicologytesting programs within the federal govern-ment[,] . . . strengthen the science base in toxi-cology[,] . . . develop and validate improvedtesting methods[,] . . . [and] provide informationabout potentially toxic chemicals to health,regulatory, and research agencies, scientificand medical communities, and the public”(NTP, 2005a)—performs toxicity tests onagents of public-health concern. For example,its chronic bioassay has become the gold stan-dard for carcinogenicity testing. The NTP hasbeen instrumental in the acceptance and inte-gration of new tests or approaches in toxicity-testing strategies. It has initiated developmentof medium- and high-throughput tests toaddress the ever-growing number of newlyintroduced chemicals and the existing chemicalsand breakdown products that have not beentested [The NTP’s general approach asdescribed in its Roadmap for the Future is

reviewed in the NRC (2006a) report.] Testsproposed by NTP and others that are alternativesto standard protocols are formally reviewed byan interagency authority, the Interagency Coor-dinating Committee on the Validation of Alter-native Methods (ICCVAM), to ensure that theyhave value in regulatory decision making.

Another organization that has influenced toxicity-testing programs in the United States is the Organization for Economic Cooperation and Development (OECD). OECD is an organization that "provides a setting where governments can compare policy experiences, seek answers to common problems, identify good practice and co-ordinate domestic and international policies" (OECD, 2006, p. 7). OECD's broad interests include health and the environment. OECD has been instrumental in developing internationally accepted, or harmonized, toxicity-testing guidelines. The goal of the harmonization program is to reduce the repetition of similar tests conducted by member countries to assess the toxicity of a given chemical. Other OECD programs that have influenced toxicity-testing approaches or strategies include those to define the tests required for a minimal data set for a chemical and to determine the approach to screening endocrine disruptors.

RISK ASSESSMENT

The toxicity data generated by the strategies and programs just described are most often used in a process called risk assessment to evaluate the risk associated with exposure to an agent. The 1983 NRC report Risk Assessment in the Federal Government: Managing the Process, which presented a systematic and organized paradigm, set a standard for risk assessment. The report outlined a three-phase process in which scientific data are moved from the laboratory or the field into the risk-assessment process and then on to decision makers to determine regulatory options.

The research phase is marked by data generation and method development, including basic research and routine testing. For any particular risk assessment, the data used may have many sources, including studies of laboratory animals, clinical tests, epidemiologic studies, and studies of animal and human cells in culture. The data may be reported in peer-reviewed publications, in the general scientific literature and government reports, and in unpublished reports of specific tests undertaken for an assessment.

In the risk-assessment phase, selected data are interpreted and used to evaluate a potential risk to human health and the environment. The 1983 NRC report described this phase in terms of four components: hazard identification (analysis of the available data to describe qualitatively the nature of the response to toxic chemicals, such as tumors, birth defects, and neurologic effects); dose-response analysis (quantification of the relationship between exposure and the response observed in studies used to identify hazard); exposure assessment (quantification of expected exposure to the agent among the general population and differently exposed groups); and risk characterization (synthesis and integration of the analyses in the three other components to estimate the likelihood and scope of risk among the general, sensitive, and differently exposed populations). Although risk assessment is based on scientific data, the process is characterized by gaps in data and fundamental scientific knowledge, and it relies on models, extrapolation, and other inference methods. The process turns to science policies—choice of mathematical models, safety factors, and assumptions—to fill in data and knowledge gaps. Science policies used in risk assessment are distinct from the regulatory policies developed for risk-management decisions described below.
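
The interplay of the four components is easiest to see with numbers. The following minimal sketch (in Python) works through a conventional non-cancer risk characterization of the kind the 1983 paradigm describes; the NOAEL, uncertainty factors, and exposure estimates are hypothetical illustrations, not values drawn from the report.

```python
# Minimal sketch of the four 1983 NRC components for a non-cancer
# endpoint, using the conventional reference-dose (RfD) approach.
# All values are hypothetical illustrations, not numbers from the report.

def reference_dose(noael_mg_kg_day: float, uncertainty_factor: float) -> float:
    """Dose-response analysis: derive an RfD from a no-observed-adverse-
    effect level (NOAEL) and a composite uncertainty factor (for example,
    10 for interspecies differences x 10 for human variability)."""
    return noael_mg_kg_day / uncertainty_factor

def hazard_quotient(exposure_mg_kg_day: float, rfd_mg_kg_day: float) -> float:
    """Risk characterization: ratio of estimated exposure to the RfD;
    a value above 1 flags potential concern."""
    return exposure_mg_kg_day / rfd_mg_kg_day

# Hazard identification (hypothetical): animal studies identify liver
# toxicity as the critical effect, with a NOAEL of 5 mg/kg-day.
rfd = reference_dose(noael_mg_kg_day=5.0, uncertainty_factor=100.0)

# Exposure assessment (hypothetical): two differently exposed groups.
for group, exposure in [("general population", 0.001),
                        ("high-exposure subgroup", 0.08)]:
    print(f"{group}: HQ = {hazard_quotient(exposure, rfd):.2f}")
```

The example also shows where science policies enter: the uncertainty factor of 100 is an assumption chosen to fill knowledge gaps, not a measured quantity.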

Risk management moves the original data—now synthesized and integrated in the form of a risk characterization—to those responsible for making regulatory decisions. The decision makers consider the products of the risk assessment with data from other fields (for example, economics), societal and political issues, and interagency and international factors to decide whether regulation is needed and, if so, its nature and scope.

The 1983 NRC report and later reports (NRC, 1993, 1996; U.S. EPA, 1998a) recognized a planning and scoping stage in which a host of scientific and societal issues are considered in advance of research and risk assessment. That activity includes examining the expected scope of the problem, available data and expected data needs, cost and time requirements, legal considerations, and community-related issues. The present report identifies some of those considerations and other, public-health considerations as "risk contexts" and underlines their important role in decisions related to toxicity testing (see discussion under "The Committee's Second Task and Approach" in this section).

Reviews and critiques of the 1983 NRC paradigm have for the most part focused on the risk-assessment module and its four components. A review of the literature shows considerably less attention to the research module and the risk-management module. The present report focuses on the research module, in which testing is conducted; however, it ventures into some risk-assessment considerations.

The Committee’s First Task and Key Points From its Interim Report

Anticipating the impact of the many scientific advances and the changing needs of the assessment process, U.S. EPA recognized the need to review existing strategies and develop a long-range vision for toxicity testing and assessment. The committee that was formed in response to U.S. EPA's request and convened in March 2004 includes experts in developmental toxicology, reproductive toxicology, neurotoxicology, immunology, pediatrics and neonatology, epidemiology, biostatistics, in vitro methods and models, molecular biology, pharmacology, physiologically based pharmacokinetic and pharmacodynamic models, genetics, toxicogenomics, cancer hazard assessment, and risk assessment.

As a first task, the committee was asked to review several relevant reports by U.S. EPA and others and to comment on aspects pertaining to new developments in toxicity testing and proposals to modify current approaches. Accordingly, the committee reviewed the 2002 U.S. EPA evaluation of its reference-dose and reference-concentration process (U.S. EPA, 2002), the International Life Sciences Institute Health and Environmental Sciences Institute draft reports on a tiered toxicity-testing approach for agricultural-chemical safety evaluations (ILSI-HESI, 2004a, 2004b, 2004c), the 2004 European Union report on the REACH (Registration, Evaluation and Authorisation of Chemicals) program, and the 2004 report on the near-term and long-term goals of the National Toxicology Program (NTP, 2004). The committee's interim report, released in December 2005, fulfilled the first part of the study.

As discussed in its interim report (NRC, 2006a), the committee's review of current toxicity-testing strategies revealed a system that had reached a turning point. Agencies typically have responded to scientific advances and emerging challenges by simply altering individual tests or adding tests to existing regimens. That patchwork approach has not provided a fully satisfactory solution to the fundamental problem—the difficulty in meeting four objectives simultaneously: depth, providing the most accurate, relevant information possible for hazard identification and dose-response assessment; breadth, providing data on the broadest possible universe of chemicals, endpoints, and life stages; animal welfare, causing the least animal suffering possible and using the fewest animals possible; and conservation, minimizing the expenditure of money and time on testing and regulatory review.

The committee identified several recurring themes and questions in the various reports that it was asked to review. The recurring themes included the following:

• The inherent tension between breadth, depth, animal welfare, and conservation and the challenge to address one of these issues without worsening another.

• The importance of distinguishing between testing protocols and testing strategies while considering modifications of current testing practices.

• The possible dangers in making tests so focused that they evaluate only one endpoint in one species and thus provide no overlap to verify results.

• The need for both chemical-specific tailored testing to enhance understanding of a particular chemical's mode of action and uniform testing protocols and strategies to enhance comparability.

• The importance of recognizing that toxicity testing for regulatory purposes should be conducted primarily to serve the needs of risk management.

The recurring questions that arose during the committee's review included the following: Which environmental agents should be tested? How should priorities for testing chemicals be set? What strategies for toxicity testing are the most useful and effective? How can toxicity testing generate data that are more useful for human health risk assessment? How can toxicity testing be applied to a broader universe of chemicals, life stages, and health effects? How can environmental agents be screened with minimal use of animals and efficient expenditure of time and other resources? How should tests and testing strategies be evaluated?

In considering those questions, the committee came to several important conclusions. First, the intensity and depth of testing should be based on practical needs, including the use of the chemical, the likelihood of human exposure, and the scientific questions that testing must answer to support a reasonable science-policy decision. Fundamentally, the design and scope of a toxicity-testing approach need to reflect risk-management needs. Thus, the goal is to focus resources on the evaluation of the more sensitive adverse effects of exposures of greatest concern rather than on full characterization of all adverse effects irrespective of relevance for risk-assessment and risk-management needs. Second, priority setting should be a component of any testing strategy that is designed to address a large number of chemicals. Chemicals to which people are more likely to be exposed or to which some segment of the population might receive relatively high exposures should undergo more in-depth testing, and this concept is embedded in several existing and proposed strategies. Third, there are major gaps in current toxicity-testing approaches. The importance of the gaps is a matter of debate and depends on whether effects of public-health importance are being missed by current approaches. Testing every chemical for every possible health effect over all life stages is impractical; however, the emerging technologies hold great promise for screening chemicals more rapidly. Fourth, testing strategies will need to be evaluated with respect to the value of information that they provide in light of the four objectives already discussed—depth, breadth, animal welfare, and conservation. In evaluating new tests, there remains the difficult question of what should serve as the gold standard for performance. Simply comparing the outcomes of new tests with the outcomes of currently used tests might not be the best approach; determining whether it is will depend on the reliability and relevance of the current tests.

The Committee’s Second Task and Approach

For the second part of the study, the committee's statement of task was to build on the work presented in the first report and develop a long-range vision and strategic plan to advance the practices of toxicity testing and human health assessment of environmental contaminants. The committee was directed to consider the following specific issues:

• Improvements in the assessment of key exposures (for example, potential susceptibility of specific life stages and groups in the general population) and toxicity outcomes (for example, endocrine disruption and developmental neurotoxicity).

• Incorporation of state-of-the-science testing and assessment procedures, methods, and approaches, such as genomics, proteomics, transgenics, bioinformatics, and pharmacokinetics.

• Methods for increasing efficiency in experimental design and reducing the use of laboratory animals.

• Potential uses and limitations of new or alternative testing methods.

• Application of emerging computational and molecular techniques in risk assessment. Issues to be considered included the data necessary to validate the techniques, the limitations of the techniques, the use of such methods to identify plausible mechanisms or pathways of toxicity, and the use of mechanistic insights in risk assessments or testing decisions.

To prepare its final report, the committee held six meetings from April 2005 to June 2006. Three of the meetings included public sessions during which the committee heard presentations by staff members of several U.S. EPA offices, including the Office of Prevention, Pesticides and Toxic Substances, the Office of Children's Health Protection, the Office of Water, the Office of Solid Waste and Emergency Response, and the Office of Air and Radiation. The committee also heard presentations by persons in other government agencies, industry, and academe.

To develop its long-range vision, the committee identified a variety of scenarios for which toxicity-testing information would be needed to make a decision. Some common scenarios, defined by the committee as "risk contexts" for which toxicity testing is used to generate information needed for decision making, are outlined next.

• Evaluation of new environmental agents. This category covers chemicals that have the potential to appear as environmental contaminants. It includes pesticides; industrial chemicals; chemicals that are destined for use in, for example, consumer products; and chemicals that might be emitted by the combustion of new fuels or new manufacturing processes. It would also include their breakdown products. Because of the large number of new agents that are introduced each year, a mechanism is needed to test the agents rapidly for potential toxicity. Questions have been raised about the safety of and risk posed by new categories of potential environmental agents, such as those introduced through nanotechnology and biotechnology. This category would also include those substances.

• Evaluation of existing environmental agents. Many substances already in the environment have not been evaluated for toxicity. In some cases, a need to evaluate specific existing environmental agents may arise from the discovery of a new source or exposure pathway or from a better understanding of human exposure on the basis of, for example, biomonitoring data. In other cases, scrutiny may be necessary when toxicity is newly recognized, such as toxicity in a worker population. In addition, the backlog of untested chemicals in commerce requires assessment to ensure that the chemicals in use today do not pose unacceptable risks at current exposures. Thus, toxicity testing for existing environmental agents requires a variety of testing approaches, from basic screening of a huge set of chemical agents to use of specific data generated by new exposure or health-effects information.

• Evaluation of a site. In many areas, soil or water has been contaminated by, for example, former industrial, military, or power-generation activities. If a new use, such as the building of a school or office building, is proposed for such a site, a primary goal would be to protect the health of future users of the site. Other goals could include evaluating the risks to neighbors posed by such a site or determining the degree and type of cleanup needed. Sites that are in use also might need evaluation, such as sites of industrial workplaces, schools, or office buildings. Those evaluations almost always involve concerns about exposures to site-specific chemical mixtures.

• Evaluation of potential environmental contributors to a specific disease. Many diseases are suspected of having an etiology that is, at least in part, environmental. A higher prevalence of a disease in one geographic area than in another might require decision makers to consider the role of environmental agents in the disparity. Understanding the role of environmental agents in a prevalent disease can also help to target actions that need to be taken. For example, asthma, which has seen an increase in prevalence over the last two decades in Western societies, is now known to be induced or aggravated by air pollutants. That understanding has allowed decision makers to take action against some pollutants, but other causes or triggers of asthma could yet be discovered.

• Evaluation of the relative risks posed by environmental agents. A risk manager might need to choose between different manufacturing processes or different solvents. Consumers might wish to distinguish between products on the basis of their potential risks to children. A proponent of a new chemical or process might wish to show that it has a lower risk in some ways than the current chemical or process. Such decisions might require less complex risk characterizations if they focus on the possible outcomes or exposures to be compared rather than requiring an in-depth understanding of the risks associated with each possible choice. This scenario emphasizes the need for toxicity-testing information to be directly comparable, standardized, and quantifiable so that such comparisons can be made.

Thus, a primary goal of the committee was to develop a flexible toxicity-testing strategy that would be responsive to the different toxicity-testing needs of the various risk contexts outlined above. Another goal of the committee was to consider the powerful new technologies that have become available and will continue to evolve. For example, bioinformatics, which applies computational approaches to describe and predict biologic function at the molecular level, and systems biology, which is a powerful approach to describing and understanding fundamental mechanisms by which biologic systems operate, have pushed biologic understanding into a new realm. Moreover, genomics, proteomics, and metabolomics offer great potential and are being used to study human disease and to evaluate the safety of pharmaceutical products. Those and other tools are considered to be important in any future toxicity-testing strategy.


ORGANIZATION OF THE DOCUMENT

This document is organized into five more sections. In the section titled "Vision," the limitations of the current toxicity-testing system, the design goals for a new system, and the options considered by the committee are discussed. An overview of the new long-range vision for toxicity testing of environmental agents is also presented. Each component of the new vision is discussed in greater detail in the section "Components of the Vision." "Tools and technologies" that might be used in the future toxicity-testing paradigm are described in the subsequent section. Implementation of the new vision over the course of several decades is considered in the section "Developing the Science Base and Assays to Implement the Vision." In the final section, "Prerequisites for Implementing the Vision in Regulatory Contexts," the committee considers the implications of the long-range vision given the current regulatory framework.

VISION

Make no little plans. They have no magic to stir men's blood and probably themselves will not be realized. Make big plans; aim high in hope and work, remembering that a noble, logical diagram once recorded will never die, but long after we are gone will be a living thing, asserting itself with ever-growing insistency. (Daniel Hudson Burnham, Architect, Designer of the 1893 Chicago World's Fair)

The goal of toxicity testing is to develop data that can ensure appropriate protection of public health from the adverse effects of exposures to environmental agents. Current approaches to toxicity testing rely primarily on observing adverse biologic responses in homogeneous groups of animals exposed to high doses of a test agent. However, the relevance of such animal studies for the assessment of risks to heterogeneous human populations exposed at much lower concentrations has been questioned. Moreover, the studies are expensive and time-consuming and can use large numbers of animals, so only a small proportion of chemicals have been evaluated with these methods. Adequate coverage of different life stages, of endpoints of public concern, such as developmental neurotoxicity, and of mixtures of environmental agents is a continuing concern. Current tests also provide little information on modes and mechanisms of action, which are critical for understanding interspecies differences in toxicity, and little or no information for assessing variability in human susceptibility. Thus, the committee looked to recent scientific advances to provide a new approach to toxicity testing.

A revolution is taking place in biology. At its center is the progress being made in the elucidation of cellular-response networks. Those networks are interconnected pathways composed of complex biochemical interactions of genes, proteins, and small molecules that maintain normal cellular function, control communication between cells, and allow cells to adapt to changes in their environment. A familiar cellular-response network is signaling by estrogens, in which initial exposure results in enhanced cell proliferation and growth of specific tissues or in proliferation of estrogen-sensitive cells in culture (Frasor et al., 2003). In that type of network, initial interactions between a signaling molecule and various cellular receptors result in a cascade of early, midterm, and late responses to achieve a coordinated response that orchestrates normal physiologic functions (Landers & Spelsberg, 1992; Thummel, 2002; Rochette-Egly, 2003).

Bioscience is rapidly enhancing our knowledge of cellular-response networks and allowing scientists to begin to uncover the manner in which environmental agents perturb pathways to cause toxicity. Pathways that can lead to adverse health effects when sufficiently perturbed are termed toxicity pathways. Responses of cells to oxidative stress caused by exposure to diesel exhaust particles (DEP) constitute an example of toxicity pathways within a cellular-response network (Xiao et al., 2003). In a dose-related fashion, in vitro exposures to DEP lead to activation of a hierarchic set of pathways. First, cell antioxidant signaling is increased. As the dose increases, inflammatory signaling is enhanced; finally, at higher doses, there is activation of cell-death (apoptosis) pathways (Nel et al., 2006). Thus, in the cellular-response network dealing with oxidative stress, the antioxidant pathways activated by DEP are normal adaptive signaling pathways that assist in maintaining homeostasis; however, they are also toxicity pathways in that they lead to adverse effects when oxidant exposure is sufficiently high. The committee capitalizes on the recent advances in elucidating and understanding toxicity pathways and proposes a new approach to toxicity testing based on them.

New investigative tools are providing knowledge about biologic processes and functions at an astonishing rate. In vitro tests that evaluate activity in toxicity pathways are elucidating the modes and mechanisms of action of toxic substances. Quantitative high-throughput assays can be used to expand the coverage of the universe of new and existing chemicals that need to be evaluated for human health risk assessment (Roberts, 2001; Inglese, 2002; Inglese et al., 2006; Haney et al., 2006). The new assays can also generate enhanced information on dose-response relationships over a much wider range of concentrations, including those representative of human exposure. Pharmacokinetic and pharmacodynamic models promise to provide more accurate extrapolation of tissue dosimetry linked to cellular and molecular endpoints. The application of toxicogenomic technologies and systems-biology evaluation of signaling networks will permit genomewide scans for genetic and epigenetic perturbations of toxicity pathways. Thus, changes in toxicity pathways, rather than apical endpoints from whole-animal tests, are envisioned as the basis of a new toxicity-testing paradigm for managing the risks posed by environmental agents.
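
To illustrate the pharmacokinetic step that links such assays to human exposures, the following sketch converts an in vitro pathway-assay potency into a human oral dose under the simplest possible assumption, a steady-state one-compartment model (C_ss = dose rate / clearance). The chemical and all parameter values are hypothetical; actual implementations would use measured clearance, protein binding, and more detailed models.

```python
# Minimal reverse-dosimetry sketch linking an in vitro toxicity-pathway
# assay to a human oral dose. It assumes a steady-state, one-compartment
# model (C_ss = dose rate / clearance); all values are hypothetical.

def oral_equivalent_dose(ac50_uM: float, mol_weight_g_mol: float,
                         clearance_L_day_kg: float) -> float:
    """Daily oral dose (mg/kg-day) producing a steady-state blood
    concentration equal to the assay's activity concentration (AC50)."""
    ac50_mg_L = ac50_uM * mol_weight_g_mol / 1000.0  # umol/L -> mg/L
    return ac50_mg_L * clearance_L_day_kg            # mg/L * L/day-kg

# Hypothetical assay hit: AC50 of 3 uM, molecular weight 250 g/mol,
# total clearance of 12 L/day per kg body weight.
dose = oral_equivalent_dose(3.0, 250.0, 12.0)
print(f"Oral equivalent dose: {dose:.1f} mg/kg-day")  # -> 9.0
```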

This section provides an overview of the committee's vision but first discusses the limitations of current toxicity-testing strategies, the design goals for a new system, and the options that the committee considered. Key terms used throughout this report are listed and defined in Table 1.

TABLE 1. Key Terms Used in the Report

• Apical endpoint. An observable outcome in a whole organism, such as a clinical sign or pathologic state, that is indicative of a disease state that can result from exposure to a toxicant.

• Cellular-response network. Interconnected pathways composed of the complex biochemical interactions of genes, proteins, and small molecules that maintain normal cellular function, control communication between cells, and allow cells to adapt to changes in their environment.

• High-throughput assays. Efficiently designed experiments that can be automated and rapidly performed to measure the effect of substances on a biologic process of interest. These assays can evaluate hundreds to many thousands of chemicals over a wide concentration range to identify chemical actions on gene, pathway, and cell function.

• Mechanism of action. A detailed description, often at the molecular level, of the means by which an agent causes a disease state or other adverse effect.

• Medium-throughput assays. Assays that can be used to test large numbers of chemicals for their ability to perturb more integrated cellular responses, such as cell proliferation, apoptosis, and mutation. Because of assay complexity, fewer agents can be evaluated in the same period than with high-throughput assays.

• Mode of action. A description of key events or processes by which an agent causes a disease state or other adverse effect.

• Systems biology. The study of all elements in a biologic system and their interrelationships in response to exogenous perturbation (Stephens & Rung, 2006).

• Toxicity pathway. Cellular-response pathways that, when sufficiently perturbed in an intact animal, are expected to result in adverse health effects (see Figure 2).


Limitations of Current Testing Strategies

The exposure-response continuum shown in Figure 1 effectively represents the current approach to toxicologic risk assessment. It focuses primarily on adverse health outcomes as the endpoints for assessing the risk posed by environmental agents and establishing human exposure guidelines. Although intermediate biologic changes and mechanisms of action are considered in the paradigm, they are viewed as steps along the pathway to the ultimate induction of an adverse health outcome.

Traditional toxicity-testing strategies undertaken in the context of the preceding paradigm have evolved and expanded over the last few decades to reflect increasing concern about a wider variety of toxic responses, such as subtle neurotoxic effects and adverse immunologic changes. The current system, which relies primarily on a complex set of whole-animal-based toxicity-testing strategies for hazard identification and dose-response assessment, has difficulty in addressing the wide variety of challenges that toxicity testing must meet today. Toxicity testing is under increasing pressure to meet several competing demands:

• Test large numbers of existing chemicals, many of which lack basic toxicity data.

• Test the large number of new chemicals and novel materials, such as nanomaterials, introduced into commerce each year.

• Evaluate potential adverse effects with respect to all critical endpoints and life stages.

• Evaluate potential toxicity in the most vulnerable members of the human population.

• Minimize animal use.

• Reduce the cost and time required for chemical safety evaluation.

• Acquire detailed mechanistic and tissue-dosimetry data needed to assess human risk quantitatively and to aid in regulatory decision making.

The current approach relies primarily on in vivo mammalian toxicity testing and is unable to meet those competing demands adequately. In 1979, about 62,000 chemicals were in commerce (Government Accountability Office [GAO], 2005). Today, there are 82,000, and about 700 are introduced each year (GAO, 2005). The large volume of new and current chemicals in commerce is not being fully assessed (see the committee's interim report, NRC, 2006a). One reason for the testing gaps is that the current testing is so time-consuming and resource-intensive. Furthermore, only limited mechanistic information is routinely developed to understand how most chemicals are expected to produce adverse health effects in humans. Those deficiencies limit the ability to predict toxicity in human populations that are typically exposed to much lower doses than those used in whole-animal studies. They also limit the ability to develop predictions about similar chemicals that have not been similarly tested. The following sections describe several limitations of the current system and describe how a system based on toxicity pathways would help to address them.

Low-Dose Extrapolation From High-Dose Data

FIGURE 1. The exposure-response continuum underlying the current paradigm for toxicity testing.

Traditional toxicity testing has relied on administering high doses to animals of nearly identical susceptibility to generate data for identifying critical endpoints for risk assessment. Historically, exposing animals to high doses was justified by a need for sufficient statistical power to observe high incidences of toxic responses in small test populations with relatively short exposures. In many cases, daily doses in animal toxicity tests are orders of magnitude greater than those expected in human exposures. Thus, the use of high-dose animal toxicity tests for predicting risks of specific apical human endpoints has remained challenging and controversial. Inferring effects at lower doses is difficult because of inherent uncertainty in the nature of dose-response relationships. Effects at high doses may result from metabolic processes that contribute negligibly at lower doses or may arise from biologic processes that do not occur with treatment at lower doses. In contrast, high doses may cause overt toxic responses that preclude the detection of biologic interactions between the chemical and various signaling pathways that lead to subtle but important adverse effects. The vision proposed in this report offers the potential to obtain direct information on toxic effects at exposures more relevant to those experienced by human populations.
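
The extrapolation problem can be made concrete with a toy calculation. The sketch below, with fabricated parameters rather than data from any study, compares a sigmoidal (Hill) model and a linear-no-threshold model that agree in the observable high-dose range but diverge sharply at doses closer to human exposures.

```python
# Toy comparison of two dose-response models that agree at high doses
# but diverge at low doses. Parameters are fabricated for illustration.

def hill(dose: float, top: float = 1.0, kd: float = 50.0, n: float = 3.0) -> float:
    """Sigmoidal (Hill) dose-response: response saturates at `top`."""
    return top * dose**n / (kd**n + dose**n)

def linear(dose: float, slope: float = 0.01) -> float:
    """Linear-no-threshold model anchored to the high-dose range."""
    return slope * dose

for dose in [100.0, 50.0, 1.0, 0.01]:
    print(f"dose {dose:>6}: hill = {hill(dose):.2e}   linear = {linear(dose):.2e}")

# At 100 and 50 (doses a bioassay might use) the models roughly agree;
# at 0.01 they differ by about seven orders of magnitude, which is why
# inference below the observed range is so uncertain.
```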

Animal-to-Human Extrapolation

Other concerns arise about the relationship between the biology of the test species and the heterogeneous human population. Animals have served as models of human response for decades because the biology of the test animals is, in general, similar to that of humans (NRC, 1977). However, although the generality holds true, there are several examples of idiosyncratic responses in test animals and humans in which chemicals do not have a specific toxic effect in a test species but do in humans and vice versa. A classic example is thalidomide: Rats are resistant, and human fetuses are sensitive. The committee envisions a future in which tests based on human cell systems can serve as better models of human biologic responses than apical studies in different species. The committee therefore believes that, given a sufficient research and development effort, human cell systems have the potential to largely supplant testing in animals.

Mixtures

Current toxicity-testing approaches have been criticized because of their failure to consider co-exposures that commonly occur in human populations. Because animal toxicity tests are time-consuming and resource-intensive and result in the sacrifice of animals, it is difficult to use them for substantial testing of chemical mixtures (NRC, 1988; Cassee et al., 1998; Feron et al., 1995; Lydy et al., 2004; Bakand et al., 2005; Pauluhn, 2005; Teuschler et al., 2005). Furthermore, without information on how chemicals exert their biologic effects, testing of mixtures is a daunting task. For example, testing of mixtures in animal assays could involve huge numbers of combinations of chemicals and the use of substantial resources in an effort of uncertain value. In contrast, testing based on toxicity pathways could allow grouping of chemicals according to their effects on key biologic pathways. Combinations of chemicals that interact with the same toxicity pathway could be tested over broad dose ranges much more rapidly and inexpensively. The resulting data could allow an intelligent and focused approach to the problem of assessing risk in human populations exposed to mixtures.
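
One way such pathway-based grouping could work in practice is concentration addition, a common default for chemicals that share a mode of action: each exposure is expressed as a fraction of the chemical's pathway potency and the fractions are summed. The sketch below is illustrative only; the potencies and tissue concentrations are hypothetical.

```python
# Sketch of pathway-based mixture assessment by concentration addition,
# a common default for chemicals sharing a mode of action. Potencies
# and tissue concentrations are hypothetical.

# Pathway potency (AC50-style, in uM) from a shared in vitro assay.
potency_uM = {"chem_A": 2.0, "chem_B": 10.0, "chem_C": 40.0}

# Estimated human tissue concentrations (uM) for the same chemicals.
exposure_uM = {"chem_A": 0.05, "chem_B": 0.5, "chem_C": 4.0}

# Concentration addition: sum each chemical's exposure as a fraction of
# its potency. A sum near or above 1 suggests the mixture could perturb
# the pathway even when no single component would on its own.
toxic_units = sum(exposure_uM[c] / potency_uM[c] for c in potency_uM)
print(f"Mixture toxic units = {toxic_units:.3f}")  # 0.025 + 0.05 + 0.1 = 0.175
```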

Design Criteria for a New Toxicity Testing Paradigm

The committee discussed the design criteria that should be considered in developing a strategy for toxicity testing in the future. As discussed in the committee's interim report (NRC, 2006a), which did much to frame those criteria, the goal is to improve toxicity testing by accomplishing the following objectives:

• Provide broader coverage of chemicals and their mixtures, endpoints, and life-stage vulnerabilities.

• Reduce the cost and time of testing, increase efficiency and flexibility, and make it possible to reach a decision more quickly.

• Use fewer animals and cause minimal suffering to animals that are used.

• Develop a more robust scientific basis of risk assessment by providing detailed mechanistic and dosimetry information and by encouraging the integration of toxicologic and population-based data.

The committee considered those objectives as it weighed various options. The following section discusses some of the options considered by the committee.


Options for a New Toxicity-Testing Paradigm

In developing its vision for toxicity testing, the committee explored four options, as presented in Table 2. The baseline option (Option I) applies current toxicity-testing principles and practices. Accordingly, it would use primarily in vivo animal toxicity tests to predict human health risks. The difficulties in interpreting animal data obtained at high doses with respect to risks in the heterogeneous human population would not be circumvented. Moreover, because whole-animal testing is expensive and time-consuming, the number of chemicals addressed would continue to be small. The continued use of relatively large numbers of animals for toxicity testing also raises ethical issues and is inconsistent with the emphasis on reduction, replacement, and refinement of animal use (Russell & Burch, 1959). Overall, the current approach does not provide an adequate balance among the four objectives of toxicity testing identified in the committee's interim report: depth of testing, breadth of testing, animal welfare, and conservation of testing resources.

The committee extensively considered the expanded use of tiered testing (Option II) to alleviate some of the concerns with present practice. The tiered approach to toxicity testing entails a stepwise process for screening and evaluating the toxicity of agents that still relies primarily on test results in whole animals. The goal of tiered testing is to generate pertinent data for more efficient assessment of potential health risks posed by an environmental agent, taking into consideration available knowledge on the chemical and its class, its modes or mechanisms of action, and its intended use and estimated exposures (Carmichael et al., 2006). Those factors are used to refine testing priorities to focus first on areas of greatest concern in early tiers and then to move judiciously to advanced testing in later tiers as needed. In addition, an emphasis on pharmacokinetic studies in tiered approaches has been considered in recent discussions of improving toxicity testing of pesticides (Carmichael et al., 2006; Doe et al., 2006).

Tiered testing has been recommended in evaluating the toxicity of agricultural products (Doe et al., 2006), in screening for endocrine disruptors (Charles, 2004), and in assessing developmental toxicity (Spielman, 2005) and carcinogenicity (Stavanja et al., 2006) of chemicals and products. A tiered-testing approach also has the promise to include comparative genomic studies to help to identify genes, transcription-factor motifs, and other putative control regions that are involved in tissue responses (Ptacek & Sell, 2005). The increasing complexity of biologic information—including genomic, proteomic, and cell-signaling information—has encouraged the use of a more systematic multilevel approach in toxicity screening (Yokota et al., 2004).

The systematic development of tiered, decision-tree selection of more limited suites of animal tests could conceivably provide toxicity-testing data nearly equivalent to those currently obtained but without the need to conduct tests for as many apical endpoints.

TABLE 2. Options for Future Toxicity-Testing Strategies

Option I (in vivo): animal biology; high doses; low throughput; expensive; time-consuming; use of relatively large numbers of animals; based on apical endpoints.

Option II (tiered in vivo): animal biology; high doses; improved throughput; less expensive; less time-consuming; use of fewer animals; based on apical endpoints; some screening using computational and in vitro approaches; more flexibility than current methods.

Option III (in vitro and in vivo): primarily human biology; broad range of doses; high and medium throughput; less expensive; less time-consuming; use of substantially fewer animals; based on perturbations of critical cellular responses; screening using computational approaches possible; limited animal studies that focus on mechanism and metabolism.

Option IV (in vitro): primarily human biology; broad range of doses; high throughput; less expensive; less time-consuming; use of virtually no animals; based on perturbations of critical cellular responses; screening using computational approaches.


The use of appropriately chosen computational models and in vitro screens might also permit sound risk-management decisions in some cases without the need for in vivo testing. Both types of tiered-testing strategies offer the potential of reducing animal use and toxicity-testing costs and allowing flexibility in testing based on risk-management information needs. Although the committee recognized the potential for incremental improvement in toxicity testing through a tiered approach, Option II still represents only a small step in improving coverage, reducing costs and animal use, and increasing mechanistic information in risk assessment. It still relies on whole-animal testing and is geared mainly toward deciding which animal tests are required in risk assessment for any specific agent. Although tiered testing might be pursued more formally in a transition to a more comprehensive toxicity-testing strategy, it does not meet most of the design criteria discussed earlier.

In the committee's view, a more transformative paradigm shift is needed to achieve the objectives for toxicity testing set out in its interim report, represented by Options III and IV in Table 2. The committee's vision is built on the identification of biologic perturbations of toxicity pathways that can lead to adverse health outcomes under conditions of human exposure. The use of a comprehensive array of in vitro tests to identify relevant biologic perturbations with cellular and molecular systems based on human biology could eventually eliminate the need for whole-animal testing and provide a stronger, mechanistically based approach for environmental decision making. Computational models could also play a role in the early identification of environmental agents potentially harmful to humans, although further testing would probably be needed. This new approach would be less expensive and less time-consuming than the current approach and result in much higher throughput. Although reliance on in vitro results lacks the whole-organism integration provided by current tests, toxicologic assessments would be based on biologic perturbations of toxicity pathways that can reasonably be expected to lead to adverse health effects. Understanding of the role of such perturbations in the induction of toxic responses would be refined through toxicologic research. With the further development of in vitro test systems of toxicity pathways and the tools for assessing the dose-response characteristics of the perturbations, the committee believes that its vision for toxicity testing will meet the four objectives set out in its interim report.

Full implementation of the high-throughput, fully human-cell-based testing scheme represented by Option IV in Table 2 would face a number of scientific challenges. Major concerns relate to ensuring adequate testing of metabolites and the potential difficulties of evaluating novel chemicals, such as nanomaterials and biotechnology products, with in vitro tests. Those challenges require maintenance of some whole-animal tests into the foreseeable future, as indicated in Option III, which includes specific in vivo studies to assess formation of metabolites and some mechanistic studies of target-organ responses to environmental agents and leaves open the possibility that more extensive in vivo toxicity evaluations of new classes of agents will be needed. Like Option IV, Option III emphasizes the development and application of new in vitro assays for biologic perturbations of toxicity pathways. Thus, although the committee notes that Option IV embodies the ultimate goal for toxicity testing, the committee's vision for the next 10–20 years is defined by Option III.

The committee is mindful of the methodologic developments that will be required to orchestrate the transition from current practices toward its vision. During the transition period, there will be a need to continue the use of many current test procedures, including whole-animal tests, as the tools needed to implement the committee's vision fully are developed. The steps that need to be taken to achieve the committee's vision are discussed further in the section “Developing the Science Base and Assays to Implement the Vision.”

The committee notes that European approaches to improve toxicity testing emphasize the replacement of animal tests with in vitro methods (Gennari et al., 2004). However, a major goal of the European approaches is to develop in vitro batteries that can predict the outcome of high-dose testing in animals.


The committee distinguishes those in vitro tests from the ones noted in Options III and IV, which promise to provide more mechanistic information and to allow more extensive and more rapid determinations of biologic perturbations that are directly relevant to human biology and exposures.

Overview of the Committee’s Long-Range Vision for Toxicity Testing

The framework outlined in Figure 2 forms the basis of the committee's vision for toxicity testing in the 21st century. The figure indicates that the initial perturbations of cell-signaling motifs, genetic circuits, and cellular-response networks are obligatory changes related to chemical exposure that might eventually result in disease. The consequences of a biologic perturbation depend on the magnitude of the perturbation, which is related to the dose, the timing and duration of the perturbation, and the susceptibility of the host. Accordingly, at low doses, many biologic systems may function normally within their homeostatic limits. At somewhat higher doses, clear biologic responses occur. They may be successfully handled with adaptation, although some susceptible people may respond. A more intense or persistent perturbation may overwhelm the capacity of the system to adapt and may lead to tissue injury and possibly to adverse health effects.

In this framework, the goals of toxicity testing are to identify critical pathways that, when perturbed, can lead to adverse health outcomes and to evaluate host susceptibility to understand the effects of perturbations on human populations. To implement the new toxicity-testing approach, toxicologists will need to evolve a comprehensive array of test procedures that will allow the reliable identification of important biologic perturbations in key toxicity pathways. And epidemiologists and toxicologists will need to develop approaches to understand the range of host susceptibility within populations. Viewing toxic responses in that manner shifts the focus away from the apical endpoints emphasized in the traditional toxicity-testing paradigm, toward biologic perturbations that can be identified more efficiently without the need for whole-animal testing and toward characterizing host vulnerability to provide the context for assessing the implications of test results.

FIGURE 2. Biologic responses viewed as results of an intersection of exposure and biologic function. The intersection results in perturbation of biologic pathways. When perturbations are sufficiently large or when the host is unable to adapt because of underlying nutritional, genetic, disease, or life-stage status, biologic function is compromised, and this leads to toxicity and disease. Source: Adapted from Andersen et al., 2005a. Reprinted with permission; copyright 2005, Trends in Biotechnology.



Figure 3 illustrates the major components of the committee's proposed vision: chemical characterization, toxicity testing, and dose-response and extrapolation modeling. Each component is discussed in further detail in the next section, and the tools and technologies that might play some role in the future paradigm are discussed in the section “Tools and Technologies.”

Chemical characterization involves consideration of physicochemical properties, environmental persistence, bioaccumulation potential, production volumes, concentration in environmental media, and exposure data. Computational tools, such as quantitative structure–activity relationship models and bioinformatics, may eventually be used to categorize chemicals, predict likely toxicity and metabolic pathways, screen for relative potency with predictive models, and organize large databases for analysis and hypothesis generation.

Toxicity testing in the committee's vision seeks to identify the perturbations in toxicity pathways that are expected to lead to adverse effects. The focus on biologic perturbations rather than apical endpoints is fundamental to the committee's vision. If adopted, the vision will lead to a major shift in emphasis away from whole-animal testing toward efficient in vitro tests and greater human surveillance. Targeted testing is also used to identify or explore functional endpoints associated with adverse health outcomes and may include in vivo metabolic or mechanistic studies.

Dose-response modeling is used to describe the relationship between biologic perturbations and dose in quantitative and optimally mechanistic terms; extrapolation modeling is used to make predictions of possible effects in human populations at prevailing environmental exposure concentrations. Computational modeling of the toxicity pathways evaluated with specific high-throughput tests will itself be a key tool for establishing dose-response relationships. Pharmacokinetic models, such as physiologically based pharmacokinetic models, will assist in extrapolating from in vitro to in vivo conditions by relating concentrations active in toxicity-test systems in vitro to human blood concentrations.

FIGURE 3. The committee's vision for toxicity testing is a process that includes chemical characterization, toxicity testing, and dose-response and extrapolation modeling. At each step, population-based and human exposure data are considered, as is the question of what data are needed for decision making.


At each step, population-based data and human-exposure information should be considered. For example, human biomonitoring and surveillance can provide data on exposure to environmental agents, host susceptibility, and biologic change that will be key for dose-response and extrapolation modeling. Throughout, the information needs for risk-management decision making must be borne in mind because they will to a great extent guide the nature of the testing required. Thus, the population-based data and exposure information and the risk contexts are shown to encircle the core toxicity-testing strategy in Figure 3.

The components of the toxicity-testing paradigm are semi-autonomous but interrelated modules, each containing specific sets of underlying technologies and capabilities. Some chemical evaluations may proceed stepwise from chemical characterization to toxicity testing to dose-response and extrapolation modeling, but that sequence might not always be followed. A critical feature of the new vision is consideration of risk context at each step and the ability to exit the strategy at any point whenever enough data have been generated to inform the decision that needs to be made. Also, the proposed vision emphasizes the generation and use of population-based data and exposure estimates when possible. The committee notes that the development of surveillance systems for chemicals newly introduced into the market will be important. The new vision encourages the collection of such data on important existing chemicals from biomonitoring, surveillance, and molecular epidemiologic studies. Finally, flexibility is needed in the testing of environmental agents to encourage the development and application of novel tools and approaches. The evolution of the toxicity-testing process, as envisioned here, must retain flexibility to encourage incorporation of new information and new methods as they are developed and found to be useful for evaluating whether a given exposure poses a risk to humans. That will require formal procedures for phasing in or phasing out standard testing methods. Indeed, that process is attuned to the need for efficient testing of all chemicals in a timely, cost-effective fashion.

The committee envisions a reconfiguration of toxicity testing through the development of in vitro medium- and high-throughput assays. The in vitro tests would be developed not to predict the results of current apical toxicity tests but rather as cell-based assays that are informative about mechanistic responses of human tissues to toxic chemicals. The committee is aware of the implementation challenges that the new toxicity-testing paradigm would face. For example, toxicity testing must be able to address the potential adverse health effects both of chemicals in the environment and of the metabolites formed when the chemicals enter the body. Much research will be needed to ensure that the new system evaluates the effects of the chemicals and their metabolites fully. Moreover, as the focus shifts from apical endpoints to perturbations in toxicity pathways, there will be a need to develop an appropriate science base to support risk-management actions based on the perturbations. Implementation of the vision and the possible challenges are discussed in the section “Developing the Science Base and Assays to Implement the Vision.”

COMPONENTS OF THE VISION

The committee foresees pervasive changes in toxicity testing and in interpretive risk-assessment activities. The current approach to toxicity testing focuses on predicting adverse effects in humans on the basis of studies of apical endpoints in whole-animal tests. In the committee's vision, in vitro mechanistic tests provide rapid evaluations of large numbers of chemicals, greatly reduced live-animal use, and results potentially more relevant to human biology and human exposures. As discussed in the previous section, “Vision,” toxicity testing can be increasingly reconfigured with the accrual of better understanding of the biologic pathways perturbed by toxicants and of the signaling networks that control activation of the pathways. The use of systems-biology approaches that integrate responses over multiple levels from molecules to organs will enable a more holistic view of biologic processes, including an understanding of the relationship between perturbations in toxicity pathways and consequences for cell and organism function.


The central premise of the committee's vision is that toxicant-induced responses can be quantified with appropriate cellular assays and that empirical or mechanistic models of pathway perturbations can be used as the basis of environmental decision making. Combining a fundamental understanding of cellular responses to toxicants with knowledge of tissue dosimetry in cell systems and in exposed human populations will provide a suite of tools to permit more accurate predictions of the conditions under which humans are expected to show pathway perturbations from toxicant exposure. The institutional and infrastructural changes required to achieve the committee's vision will include changes in the types of tests that support toxicity testing and in how toxicity, mechanistic information, and epidemiologic data are used in regulatory decision making. The regulatory transition from the current emphasis on apical-endpoint toxicity tests to reliance on perturbations of toxicity pathways will raise many issues. The challenges to implementation and a strategy to implement the vision are discussed in the section “Developing the Science Base and Assays to Implement the Vision.”

This section discusses the individual components of the vision: chemical characterization (component A), toxicity testing (component B), dose-response and extrapolation modeling (component C), population-based and human exposure data (component D), and risk contexts (component E). Component B is composed of a toxicity-pathway component and a limited targeted-testing component. The toxicity-pathway component will become increasingly dominant as more and more high-throughput toxicity-pathway assays are developed and validated. Surveillance and biomonitoring data will be needed to understand the effects of toxicity-pathway perturbations on humans. Finally, the overall success of the new paradigm will depend on ensuring that toxicity testing meets the information needs of environmental decision making given the risk contexts.

Component A: Chemical Characterization

An overview of component A is provided in Figure 4. Chemical characterization is meant to address key questions, including the compound's stability in the environment, the potential for human exposure, the likely routes of exposure, the potential for bioaccumulation, the likely routes of metabolism, and the likely toxicity of the compound and possible metabolites based on chemical structure or physical or chemical characteristics.

FIGURE 4. Overview of chemical characterization component.


Thus, data would be collected on physical and chemical properties, use characteristics, possible environmental concentrations, possible metabolites and breakdown products, initial molecular interactions of compounds and metabolites with cellular components, and possible toxic properties. A variety of computational methods might be used to predict those properties when data are not available. Decisions could be made after chemical characterization about further testing that might or might not be required. For example, if a chemical were produced in such a manner that it would never reach the environment or if it were sufficiently persistent and biologically reactive, further toxicity evaluation might not be necessary for regulatory decision making. Moreover, computational tools for estimating biologic activities and potency could be useful in assessing characteristics of compounds during their development or in a premanufacturing scenario to rule out development or introduction of compounds that are expected to lead to biologically important perturbations in toxicity pathways. In most cases, chemical characterization alone is not expected to be sufficient to reach decisions about the toxicity of an environmental agent.

The tools for chemical characterization will include a variety of empirical and computational methods. As outlined in the committee's first report (NRC, 2006a), computational approaches that can and most likely will be used fall into the following categories: tools to calculate physical and chemical properties, models that predict metabolism and metabolic products of a chemical, structure–activity relationship (SAR) and quantitative SAR (QSAR) models that predict biologic activity from molecular structure, and models that predict specific molecular interactions, such as protein–ligand binding, tissue binding, and tissue solubility. An array of computational tools is available to calculate physical and chemical properties (Volarath et al., 2004; Olsen et al., 2006; Grimme et al., 2007; Balazs, 2007). Tools for assessing metabolic fate and biologic activity are continually evolving, and many of the more accurate and refined examples rely on proprietary technology or proprietary databases. Databases that support the most predictive tools may therefore end up being proprietary and substantially different from those available in the public domain. The committee urges the U.S. Environmental Protection Agency (EPA) to consider taking a lead role in ensuring public access to the data sets that are developed for predictive modeling and in providing the resources necessary for the continual evolution of methods to develop SAR, QSAR, and other predictive modeling tools.

Many models used to predict hazard are based only on structure and physical and chemical properties and rely on historical data sets. Their reliability is limited by the relevant data sets, which are continually evolving and increasing in size and accessibility. That is, the predictive value of structure–activity rules will depend on the chemicals in the data set from which they are derived—their prevalence, their structures, and whether they have the toxic activity of interest (see, for example, Battelle, 2002). Computational approaches for predicting toxicity and molecular interactions are available for only a small number of endpoints, such as estrogen-receptor binding, and their predictive value can be low (Battelle, 2002). As approaches improve with time and experience and as the data sets available for model development become larger and more robust, computational tools should become much more useful for chemical characterization, predicting activity in toxicity pathways, and early-stage decision making.
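To make the QSAR concept concrete, the following minimal sketch in Python shows how a statistical model can relate precomputed molecular descriptors to activity in a hypothetical assay. The descriptor values, labels, and feature choices are assumptions made only for illustration; practical QSAR models are built from large curated data sets, richer descriptor sets, and careful validation.

```python
# Minimal QSAR-style sketch (hypothetical data). Each row holds
# precomputed molecular descriptors, here [molecular weight, logP,
# number of H-bond donors]; labels mark whether a training chemical
# was active in a given toxicity assay.
import numpy as np
from sklearn.linear_model import LogisticRegression

X_train = np.array([
    [180.2, 1.2, 2],   # hypothetical inactive chemical
    [320.4, 4.8, 0],   # hypothetical active chemical
    [150.1, 0.3, 3],
    [410.9, 5.6, 1],
    [275.3, 3.9, 0],
    [199.7, 2.1, 2],
])
y_train = np.array([0, 1, 0, 1, 1, 0])  # 1 = active in the assay

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Probability that an untested chemical is active, given its descriptors.
x_new = np.array([[305.0, 4.1, 1]])
print("Predicted probability of activity:", model.predict_proba(x_new)[0, 1])
```

As the surrounding text notes, the predictive value of any such model is bounded by the chemicals in its training set.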

Component B: Toxicity Testing of Compounds and Metabolites

The long-term vision makes the development of predictive toxicity-pathway-based assays the central component of a broad toxicity-testing strategy for assessing the biologic activity of new or existing compounds. The assays will be conducted primarily with cells or cell lines, optimally with human cells or cell lines, and as time passes, the need for traditional apical animal tests will be greatly reduced and optimally eliminated. The overview of component B provided in Figure 5 indicates that toxicity testing will include both pathway testing and targeted testing, which are discussed further in the following subsections.



A period of transition is inevitable because of the need to develop the full suite of toxicity-pathway tests that will be required for a comprehensive assessment of toxicity. Challenges related to the transition from the current paradigm oriented to apical endpoints to the one outlined here are addressed separately in the section “Developing the Science Base and Assays to Implement the Vision.”

Toxicity Pathways The committee's vision focuses on toxicity pathways. Toxicity pathways are simply normal cellular-response pathways that are expected to result in adverse health effects when sufficiently perturbed. For example, in early studies of cancer biology, specific genes that were associated with malignant growth and transformation were called oncogenes (those promoting unrestrained cell replication) and tumor-suppressor genes (those restricting replication). Both oncogenes and tumor-suppressor genes were later found to code for proteins that play important roles in normal biology. For example, oncogenes were involved in cell replication, and suppressor-gene products normally halted some key part of the replication process. However, mutations (such as those which can be induced by some environmental agents) were found to make oncogenes constitutively active or to cause a great reduction in or loss of activity of suppressor genes.

It is the ability of otherwise normal cellular-response pathways to be targets of environmental agents that leads to their designation as toxicity pathways. Perturbations of toxicity pathways can be evaluated with a variety of assays.

FIGURE 5. Toxicity-testing component, which includes toxicity-pathway testing in cells and cell lines and targeted testing in whole animals.


These include relatively straightforward biochemical assays, such as receptor binding or reporter-gene expression, and more integrated cellular-response assays, such as assays that evaluate proliferation of an estrogen-responsive cell line after treatment with environmental agents. Cellular responses can be broadly dichotomized as those requiring recognition of the structure of an environmental agent and those occurring because of the reactivity of the environmental agent. In the first case, the three-dimensional structure is recognized by macromolecular receptors, as with estrogenic compounds. Accordingly, tests for structurally mediated responses could be based on binding assays or on integrated cellular-response events, such as proliferation, induction of new proteins, or alteration of the phosphorylation status of cells after exposure to environmental agents. In the second case, reactivity-driven responses, the compound or a metabolite reacts with and damages cellular structures. Reactive compounds can be much more promiscuous in their cellular targets, and the initial stress responses to tissue reactivity with these agents may also trigger adaptive changes to maintain homeostasis in the face of increased cellular stress (see Figure 2).

Biologic systems from single cells to complex plant and animal organisms have evolved many mechanisms to respond to and counter stressors in their environment. Many responses are mediated through coordinated changes in the expression of genes in specific patterns, which result in new operational characteristics of affected cells (Ho et al., 2006; Schilter et al., 2006; Singh & DuMond, 2007). Many stress-response pathways—such as those regulated by hsp90-mediated regulation of chaperone proteins, by Nrf2-mediated antioxidant-element control of cellular glutathione, or by steroid-hormone-family (for example, PPAR, CAR, and PXR) receptor-mediated induction of xenobiotic-metabolizing enzymes—are conserved across many vertebrate species (Aranda & Pascual, 2001; Handschin & Meyer, 2005; Westerheide & Morimoto, 2005; Kobayashi & Yamamoto, 2006). Initial responses to stressors represent adaptation to maintain normal function. When stressors are applied at increasingly high concentrations, in combination with other stressors, in sensitive hosts, or during sensitive life stages, adaptation fails, and adverse effects occur in the cell and organism (see Figure 2).

As stated, the committee's long-range vision capitalizes on the identification and use of toxicity pathways as the basis of a new approach to toxicity testing and dose-response modeling. An important question for toxicity-testing strategies concerns the number of pathways that might need to be examined as primary targets of chemical toxicants. For example, in the case of reproductive and developmental toxicity, the National Research Council Committee on Developmental Toxicology listed 17 primary intracellular and intercellular signaling pathways that were then known to be involved in normal development (NRC, 2000). Those pathways and the various points for toxic interaction with them are potential targets of chemicals whose structures mimic or disrupt portions of them. Some of the pathways are also important at other life stages, and biologically significant perturbations of them might result in long-lasting effects or effects that are manifested later in life. As discussed in the section “Developing the Science Base and Assays to Implement the Vision,” considerable effort will be required to determine which pathways ultimately to include in the suite of toxicity pathways for testing and what patterns and magnitudes of perturbations will lead to adverse effects.

Some examples of toxicity pathways that could be evaluated with high-throughput methods are listed next, along with the consequences of pathway activation. Most tests are expected to use high-throughput methods, but others could include medium-throughput assays of more integrated cellular responses, such as cytotoxicity, cell proliferation, and apoptosis. Simpler assays, such as assays of receptor binding or of the reactivity of compounds with targets (for example, tests of inhibition of cholinesterase activity), also could be used as needed.


• Nrf2 antioxidant-response pathway (McMahon et al., 2006; Zhang, 2006). The activation of antioxidant-response-element signaling occurs through oxidation of sentinel sulfhydryls on the protein Keap1. Some agents, such as chlorine, activate Nrf2 signaling in vitro, and the resulting oxidative stress is likely the cause of irritation and toxicity in the respiratory tract.

• Heat-shock-response pathway (Maroni et al., 2003; Westerheide & Morimoto, 2005). The activation of protein synthesis by HSF1 transcription-factor signaling maintains cellular proteins in an active folded configuration in response to stressors that cause unfolding and denaturation.

• PXR, CAR, PPAR, and AhR response pathways (Waxman, 1999; Handschin & Meyer, 2005; Hillegass et al., 2006; Timsit & Negishi, 2006; Li et al., 2006). The activation of xenobiotic-metabolizing pathways by transcriptional activation reduces the concentrations of some biologically active xenobiotics and enhances their elimination from the body as metabolites (Nebert, 1994); it can also increase the activation of other xenobiotics to more toxic forms. The toxicity and carcinogenicity of some agents, such as polyaromatic hydrocarbons, occur because of the production of mutagenic metabolites by inducible oxidative enzymes.

• Hypo-osmolarity-response pathway (Subramanya & Mensa-Wilmot, 2006). Cellular stressors damage the integrity of cellular membranes and activate p38 MAP kinase-mediated pathways to counter them (Van Wuytswinkel et al., 2000). The p38 MAP kinase functionality for the stress responses is conserved across eukaryotes.

• DNA-response pathways (Nordstrand et al., 2007). Damage to DNA structures induces repair enzymes that act through GADD45 (Sheikh et al., 2000) and other proteins. Unrepaired damage increases the risk of mutation during cell division and increases the risk of cancer.

• Endogenous-hormone-response pathways (NRC, 1999; Harrington et al., 2006). Enhancement or suppression of the activity of transcriptionally active hormone receptors—including estrogen, androgen, thyroid, and progesterone receptors (Aranda & Pascual, 2001)—leads to altered homeostasis and alteration in biologic functions that are controlled by the receptors.

The biologic revolution now making its way into toxicity testing sets the stage for the design of mechanistic cell-based assays that can be evaluated primarily with high-throughput approaches to testing. The promise of the novel cell-system assays is becoming apparent in advances in several areas: genomic studies of cellular signaling networks affected by chemical exposures, identification of common toxicity pathways that regulate outcomes in diverse tissues, and understanding of networks that control cell responses to external stressors. To ensure the value of results for use in environmental decision making, the toxicity-pathway assays should be amenable to measurements of dose-response relationships over a broad range of concentrations. Chemical concentrations should be measured directly in the media used in the toxicity-pathway assays when administered concentrations might not represent the concentrations in vitro (for example, in the case of volatile compounds).

Finding new assays for assessing the dose-response characteristics of the toxicity pathways will have high priority for research and standardization. Environmental agents on which animal, human, and cellular evidence consistently demonstrates increased risk of adverse health outcomes could serve as positive controls for the evaluation of toxicity-pathway assays. Those controls would serve as standards for evaluating the ability of other compounds to perturb the assayed toxicity pathways. Negative controls would also be needed to evaluate the specificity of responses for the key toxicity pathways. For risk implications in specific populations, interpretation of the studies would consider the results of the assays coupled with information on host susceptibility from other human cell or tissue assays and population-based studies. The research needed to implement the toxicity-pathway approach is discussed further in the section “Developing the Science Base and Assays to Implement the Vision.”
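As an illustration of how positive and negative controls anchor such an assay, the short sketch below normalizes hypothetical test-well signals to the control means and computes the Z'-factor of Zhang et al. (1999), one widely used measure of control separation in screening assays. The data and the 0.5 threshold are illustrative assumptions, not values from this report.

```python
# Sketch: using positive and negative controls to normalize a
# toxicity-pathway assay and to gauge assay quality (hypothetical data).
import numpy as np

pos = np.array([95.0, 98.2, 96.5, 97.1])        # positive-control signal
neg = np.array([2.1, 3.4, 2.8, 1.9])            # negative-control signal
raw = np.array([5.0, 12.3, 35.9, 61.2, 88.4])   # test-chemical signal

# Percent of the positive-control response, anchored to the controls.
percent_response = 100 * (raw - neg.mean()) / (pos.mean() - neg.mean())
print("Normalized responses (%):", np.round(percent_response, 1))

# Z'-factor (Zhang et al., 1999): values above about 0.5 are commonly
# taken to indicate good separation between the two controls.
z_prime = 1 - 3 * (pos.std() + neg.std()) / abs(pos.mean() - neg.mean())
print("Z'-factor:", round(z_prime, 2))
```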


Targeted Testing As discussed in the section “Vision,” an integral part of the committee's vision is targeted testing, which would complement toxicity-pathway testing and be used in the following circumstances:

• To clarify substantial uncertainties in the interpretation of toxicity-pathway data.

• To understand the effects of representative prototype compounds from classes of materials, such as nanoparticles, that may activate toxicity pathways not included in a standard suite of assays.

• To refine a risk estimate when the targeted testing can reduce uncertainty and when a more refined estimate is needed for decision making.

• To investigate the production of possibly toxic metabolites of new compounds.

• To fill gaps in the toxicity-pathway testing strategy to ensure that critical toxicity pathways and endpoints are adequately covered.

One of the challenges of developing an in vitro test system to evaluate toxicity is the current inability of cell assays to mirror the metabolism of a whole animal (Coecke et al., 2006). For the foreseeable future, any in vitro strategy will need to include a provision to assess likely metabolites with whole-animal testing. The metabolites would also need to be tested in a suite of in vitro assays. For very reactive metabolites, the suite of assays should include cell models that have the biotransformation enzymes required for metabolism. Although it may become possible to make comprehensive predictions of the metabolism of environmental agents, any plan to implement the vision described here will probably have to rely on some metabolite-identification studies in whole animals. Another challenge is the adequate development of in vitro assays to reliably identify toxicity pathways that are causally related to neurodevelopment and other physiologic processes that depend on the timing and patterns of exposure and the interactions of multiple pathways. In the near term, targeted in vivo testing will most likely be needed to address those types of toxicities.

Targeted testing might be conducted in vivo or in vitro, depending on the conditions and the toxicity tests available. In the case of metabolite studies, one approach might be to dose small groups of animals with radiolabeled compound, to separate and characterize the excreted radioactivity with modern analytic techniques, and to compare the metabolite structure with known chemistries to determine the need for testing specific metabolites. Similar studies might be conducted in tissue bioreactors, especially a liver bioreactor or cocultures of cells from human liver and other tissues that might make the studies more applicable to human metabolism. Concerns raised in evaluations of metabolism could necessitate synthesis of specific metabolites that would then be tested in the main toxicity-pathway assays. In the development of the European Centre for the Validation of Alternative Methods, there has been extensive discussion of the challenges of capturing the possible toxicity of metabolites so as not to miss the ultimate toxicities of substances with in vitro testing (Coecke et al., 2005, 2006).

Although targeted tests could be based on existing toxicity-test systems, they will probably differ from traditional tests in the long term. They could use transgenic species, isogenic strains, new animal models, or other novel test systems (see the committee's interim report [NRC, 2006a] for further discussion) and could include a toxicogenomic evaluation of tissue responses over wide dose ranges. Whatever system is used, testing protocols would maximize the amount of information gained from whole-animal toxicity testing. For example, routinely used whole-animal toxicity-testing protocols could provide mode-of-action information on toxicity pathways and target tissues in short-term repeat studies. They could emphasize measurement of metabolite formation and applications of transcriptomics and bioinformatics; future designs might include other “-omic” approaches as the technologies mature and the costs of such studies decrease. Toxicogenomic studies of 14–30 days could provide tissues for microarray analysis and information on pathology. A suite of major tissues would be harvested, mRNA analysis would be performed, and bioinformatics analysis would be conducted to evaluate dose-response relationships in connection with changes in genes and groups of related genes.


mRNA from tissues with evidence of pathologic alterations at high doses might also be examined along with the major tissues. Thus, the targeted testing in the committee's vision will not necessarily resemble the standard whole-animal assays now conducted, either in the protocol used or in the information gained.

Component C: Dose-Response and Extrapolation Modeling

The committee's vision includes dose-response and extrapolation-modeling modules, which are discussed next; an overview of this component is provided in Figure 6.

Empirical Dose-Response Modeling As they are currently used in toxicity testing with apical endpoints, empirical dose-response (EDR) models often describe a relationship between the incidence of the endpoint and either the dose given to the animal or the concentration of the environmental agent or its metabolite in the target tissue. In the long-range vision, the committee believes that EDR models will be developed for environmental agents primarily on the basis of data from the in vitro, mechanistically based assays described in component B. The EDR models would describe the relationship between the concentration in the test medium and the degree of in vitro response; in some cases, they would provide an estimate of some effective concentration at which a specified level of response occurs. The effective concentration could describe, for example, a percentage of maximal response or a statistical increase above background for a more integrated assay, such as an enhanced-cell-proliferation assay. Considerations in the interpretation of in vitro response metrics would include responses in positive and negative controls, their statistical variability, background historical data, and the experimental dose-response data on the test substance. In general, the toxicity-pathway evaluations require consideration of increases in continuous rather than dichotomous responses.

Dose measures in targeted-testing studies conducted in whole animals could also be expressed in relation to a measure of tissue or plasma concentrations of the parent compound or a metabolite in the organism, such as blood concentration, area under a concentration–time curve, and rate of metabolism.

FIGURE 6. Overview of dose-response and extrapolation modeling component.


Preferably, the concentrations would be based on empirical measurements rather than on predictions from pharmacokinetic models. The main reason for insisting that the in vivo studies include a measure of tissue concentration is to permit comparison with the results of the in vitro assays.
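For illustration, the area under a concentration-time curve mentioned above can be computed from sampled blood concentrations with the trapezoidal rule; the sampling times and concentrations below are hypothetical.

```python
# Sketch: area under a concentration-time curve (AUC), a common
# internal dose measure, from hypothetical blood-sampling data.
import numpy as np

t = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0, 24.0])   # hours after dosing
c = np.array([0.0, 4.2, 6.8, 5.1, 2.9, 1.2, 0.1])    # mg/L in blood

auc = np.trapz(c, t)   # trapezoidal rule; units of mg*h/L
print(f"AUC(0-24 h) ~ {auc:.1f} mg*h/L")
```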

In some risk contexts, an EDR model based on in vitro assay results might provide adequate data for a risk-management decision, for example, if the host-susceptibility factors of a compound in humans are well understood and human biomonitoring provides good information about its tissue or blood concentrations and about other exposures that affect the toxicity pathway in a human population. Effective concentrations in the suite of in vitro mechanistic assays could be adjusted for host susceptibility and then compared with the human biomonitoring data. In the absence of detailed biomonitoring data and host-susceptibility information, predictions of human response to a toxicant will require building on the data provided by the in vitro EDR models and using physiologically based pharmacokinetic (PBPK) models and perhaps host-susceptibility information on related compounds.
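The comparison just described reduces to a simple calculation once the numbers are in hand. The sketch below, with entirely hypothetical values, adjusts an in vitro effective concentration by an assumed host-susceptibility factor and compares it with a biomonitored blood concentration; such a ratio is meaningful only when the in vitro concentration can be equated with blood concentration, which is what the PBPK models discussed next are meant to provide.

```python
# Sketch: crude margin between an adjusted in vitro effective
# concentration and a biomonitored exposure (all values hypothetical).
ec10_uM = 0.85                 # effective concentration from an in vitro assay
blood_uM = 0.004               # measured blood concentration in a population
susceptibility_factor = 10.0   # assumed adjustment for sensitive subgroups

margin = (ec10_uM / susceptibility_factor) / blood_uM
print(f"Margin between adjusted in vitro activity and exposure: {margin:.0f}x")
```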

Extrapolation Modeling Extrapolation modeling encompasses the analytic tools required to predict exposures that might result in adverse effects in human populations, primarily on the basis of the results of hazard testing completed in component B. In the committee's vision, extrapolation modeling would most likely include PBPK modeling to equate tissue-media concentrations from toxicity testing with tissue doses expected in humans; toxicity-pathway modeling that provides an understanding of the biologic components that control the toxicity-pathway response in vitro; and consideration of human data on host susceptibility and background exposure that provide the context for interpreting the modeling results. As stated in the committee's interim report (NRC, 2006a), the computational approaches must be validated, adequately explained, and made accessible to peer review to be valuable for risk assessment. Models not accessible for review may be useful for many scientific purposes but are not appropriate for regulatory use.

Toxicity-Pathway Dose-Response Models Models of toxicity-pathway perturbations need to be developed to interpret results from toxicity tests in a mechanistic rather than simply empirical manner; they should be achievable in the near future. Toxicity-pathway models should be more readily configured than models of organism-level toxicity because they describe only the toxicity pathway itself and the initial chemical-related perturbations that are believed to be obligatory but not necessarily sufficient for causing the overt adverse health effect.

Several models of normal signaling pathways have been developed, for example, for heat-shock response (El-Samad et al., 2005; Rieger et al., 2005), platelet-derived growth-factor signaling (Bhalla et al., 2002), and nuclear factor kappa-B-mediated inflammatory signaling in response to cytokines, such as tumor necrosis factor-alpha (Hoffmann et al., 2002; Cho et al., 2003). Also, a screen for anticancer drugs has been developed by using the Nrf2 antioxidant-response pathway (Wang et al., 2006a), and a preliminary Nrf2 oxidative-stress model has been developed (Zhang, 2006) to examine chlorine as an oxidative stressor and to evaluate both adaptive and overtly toxic responses of cells in culture. Toxicity-pathway dose-response models optimally would describe the interaction of chemicals with the cell constituents that activate or repress the pathway (that is, control it) and describe the cellular consequences of activation (that is, the cellular responses, usually altered gene expression, to changes in normal control). Table 3 and Figure 7 illustrate these concepts in terms of the activation of the Nrf2 antioxidant stress-response pathway.

Although the toxicity-pathway models are discussed here as part of component C of the vision, creation of the models would occur as a natural extension of developing and validating the in vitro toxicity-pathway tests discussed in component B.


In other words, the committee envisions that the models would be developed for many assays in component B. The committee recognizes that in the near term there will be continued reliance on default approaches for low-dose extrapolation, such as the linear dose-response model and the application of uncertainty factors to benchmark doses or no-observed-adverse-effect levels. The application of uncertainty and adjustment factors to precursor biologic responses from perturbations will not necessarily involve the same factors as currently used in U.S. EPA risk assessments for noncancer endpoints.

The committee emphasizes the important distinction between models for toxicity-pathway perturbations and biologically based dose-response (BBDR) models for apical responses. Approaches to BBDR modeling for complex apical responses—such as cancer (Moolgavkar & Luebeck, 1990; Conolly et al., 2003), developmental toxicity (Leroux et al., 1996), and cytotoxicity (Reitz et al., 1990; el-Masri et al., 1996)—have focused on integrated processes, such as proliferation, apoptosis, necrosis, and mutation. Experimental studies and biologic and toxicologic research are still required to guide the development and validation of such models. Although toxicity-testing strategies would be enhanced by the availability of quantitative BBDR models for apical responses, this type of modeling is still in its infancy and probably will not be available for risk-assessment applications in the near future.

TABLE 3. Example of Components of Signaling Pathway That Could Be Modeled

In nontoxic environments, antioxidant genes are repressed through inactivation of the transcriptional regulator Nrf2. The cytoplasmic protein Keap-1 binds Nrf2 and sequesters it in the cytoplasm, where it cannot activate transcription of antioxidant genes (see Figure 7). Nrf2 bound to Keap-1 is then quickly degraded through the Cul3-based E3 ligase system (Kobayashi et al., 2004). In toxic environments, some oxidants interact with thiol groups on Keap-1, causing Nrf2 to be released and translocated to the nucleus. Once in the nucleus, Nrf2 heterodimerizes with a small Maf protein and binds to antioxidant-response elements; this leads to expression of antioxidant-stress proteins and phase 2 detoxifying enzymes (Motohashi & Yamamoto, 2004).

The negative-feedback response loop has two major portions, each of which could be the target of model development. First, the inactivation of Keap-1 by oxidants and the later formation of the Nrf2-Maf heterodimer are response circuits that can be mathematically modeled to predict low-dose toxic responses. Second, the expression of antioxidant-stress proteins and phase 2 detoxifying enzymes can also be modeled to predict low-dose toxic responses.

FIGURE 7. Nrf2 antioxidant-response pathway schematic. Adapted from Motohashi and Yamamoto (2004), with permission from Trends in Molecular Medicine.
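To illustrate how the feedback circuit in Table 3 could be translated into a computable dose-response model, the following cartoon ODE sketch couples oxidant-dependent Keap-1 inactivation, Nrf2 accumulation, and enzyme-mediated oxidant clearance. The three-variable structure and all rate constants are hypothetical simplifications for illustration, not the published Nrf2 models cited above.

```python
# Sketch: a cartoon ODE model of the Nrf2 negative-feedback loop.
# All parameter values are hypothetical and chosen only to show how
# such a pathway model can be simulated and probed over dose.
import numpy as np
from scipy.integrate import solve_ivp

def nrf2_loop(t, y, oxidant_input):
    nrf2, enzyme, oxidant = y
    keap1_activity = 1.0 / (1.0 + oxidant)       # oxidants inactivate Keap-1
    d_nrf2 = 1.0 - 5.0 * keap1_activity * nrf2   # Keap-1-dependent degradation
    d_enzyme = 0.5 * nrf2 - 0.1 * enzyme         # Nrf2 induces antioxidant enzymes
    d_oxidant = oxidant_input - 2.0 * enzyme * oxidant - 0.1 * oxidant
    return [d_nrf2, d_enzyme, d_oxidant]

# Constant oxidative stress applied to a cell starting at its unstressed
# steady state; the loop adapts by raising Nrf2 and enzyme levels.
sol = solve_ivp(nrf2_loop, (0.0, 50.0), [0.2, 1.0, 0.0],
                args=(1.0,), t_eval=np.linspace(0.0, 50.0, 6))
for t, (n, e, s) in zip(sol.t, sol.y.T):
    print(f"t={t:5.1f}  Nrf2={n:.2f}  enzyme={e:.2f}  oxidant={s:.2f}")
```

Running the simulation over a range of oxidant inputs traces the adaptive and overwhelmed regimes sketched in Figure 2.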


Progress in developing the models will rely heavily on biologic studies of disease processes in whole animals and mathematical descriptions of the processes. The committee sees BBDR-model development for apical endpoints as part of a much longer range research program and does not see routine development of the models from toxicity-pathway testing data in the foreseeable future.

Physiologically Based Pharmacokinetic Modeling

PBPK models assist in extrapolations of dosimetry among doses, dose routes, animal species, and classes of similar chemicals (Clark et al., 2004). They also support risk assessment, aid in designing and interpreting the results of biomonitoring studies (Clewell et al., 2005), and facilitate predictions of human body burden based on use and exposure patterns in specific populations. The development of PBPK models requires variable investment, depending on the chemical. For well-studied classes of compounds, PBPK-model development might require collection of compound-specific characteristics or statistical analysis to incorporate descriptions of human variability and to describe uncertainty (see, for example, Bois et al., 1996; Fouchecourt et al., 2001; Poulin & Theil, 2002; Theil et al., 2003). For less well-studied classes of chemicals, model development might require collection of time-course data on tissue concentrations (see, for example, Sarangapani et al., 2002). Validation of existing models is an important consideration. The possibility of studying the pharmacokinetics of low concentrations in environmentally or occupationally exposed humans provides many opportunities for checking the validity of PBPK models, and advances in analytic chemistry now permit kinetic studies at the extremely low doses that such validation requires.
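A minimal flow-limited PBPK sketch is shown below: blood, liver, and a lumped rest-of-body compartment for an intravenous dose, with metabolic clearance occurring in the liver. All parameter values are hypothetical placeholders; real PBPK models use measured physiological data, chemical-specific constants, and many more compartments.

```python
# Sketch: a minimal flow-limited PBPK model (blood, liver, rest of body)
# for an intravenous dose. Parameter values are hypothetical placeholders.
import numpy as np
from scipy.integrate import solve_ivp

V_b, V_l, V_r = 0.02, 0.01, 0.20   # compartment volumes, L
Q_l, Q_r = 0.8, 2.0                # blood flows to liver and rest, L/h
P_l, P_r = 4.0, 1.5                # tissue:blood partition coefficients
CL_int = 0.5                       # hepatic intrinsic clearance, L/h

def pbpk(t, y):
    c_b, c_l, c_r = y  # concentrations in blood, liver, rest of body (mg/L)
    dc_b = (Q_l * c_l / P_l + Q_r * c_r / P_r - (Q_l + Q_r) * c_b) / V_b
    dc_l = (Q_l * (c_b - c_l / P_l) - CL_int * c_l / P_l) / V_l
    dc_r = Q_r * (c_b - c_r / P_r) / V_r
    return [dc_b, dc_l, dc_r]

dose_mg = 1.0
y0 = [dose_mg / V_b, 0.0, 0.0]     # IV bolus mixed into blood at t = 0
sol = solve_ivp(pbpk, (0.0, 24.0), y0, t_eval=[0.5, 2.0, 8.0, 24.0])
for t, (c_b, c_l, c_r) in zip(sol.t, sol.y.T):
    print(f"t={t:4.1f} h  blood={c_b:.3f}  liver={c_l:.3f}  rest={c_r:.3f} mg/L")
```

The same structure, run in reverse, supports the in vitro-to-in vivo extrapolation described earlier: given a medium concentration active in an assay, the model estimates the external dose producing an equivalent blood concentration.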

In the future, QSAR should allow estimation of such parameters as blood-tissue partitioning, metabolic rate constants, and tissue binding and could give rise to predictive PBPK models validated with a minimal research investment in targeted studies in test animals. The goal of developing predictive PBPK models dates back to efforts to develop in vitro tools to measure model parameters or to develop QSAR models to predict model parameters on the basis of physical and chemical characteristics or properties (Gargas et al., 1988, 1989).

Component D: Population-Based and Human Exposure Data

Population-based and human exposure data will be crucial components of the new toxicity-testing strategy. They will be critical for selecting doses in in vitro and targeted in vivo testing, for interpreting and extrapolating from high-throughput test results, for identifying and understanding toxicity pathways, and for identifying toxic chemical hazards. Figure 8 provides an overview of component D, and the following subsections discuss how population-based and exposure data can be integrated with toxicity testing.

Population-Based Data and the Toxicity-Testing Strategy The new toxicity-testing strategy emphasizes the collection of data on the fundamental biologic events involved in the activation of toxicity pathways after exposure to environmental agents. The collection of mechanistic data on fundamental biologic perturbations will provide new opportunities for greater integration of toxicity testing and population-based studies. In some cases, coordination of the tests will be required; interpretation of toxicity-test results will require an understanding of how human susceptibility factors and background exposures affect the toxicity pathway and how those factors and exposures vary among people.

Genetic epidemiology provides an excellent example of the integration of information from toxicity testing in the long-range vision and population-based studies. It seeks to determine the relationship between specific genes in the population and disease. The finding of genetic loci associated with susceptibility potentially can inform biologists of important cellular proteins that affect disease and can uncover novel disease pathways. Toxicity-testing assays can then be designed to investigate and evaluate the finding and the effects of exogenous chemicals on the disease pathways.


For example, human studies have provided information on DNA damage in arsenic-exposed people and motivated laboratory studies on cultured human cells to determine the specific DNA-repair pathways affected by arsenic (Andrew et al., 2006).

Conversely, as understanding of toxicity pathways grows, specific genetic polymorphisms that increase or decrease susceptibility to adverse effects of exposure to environmental agents can be more accurately predicted. For example, genetic polymorphisms in some DNA-repair and detoxification genes result in higher levels of chromosomal and genomic damage, based on the micronuclear centromere content in tissue samples, in welders occupationally exposed to welding fumes (Iarmarcovai et al., 2005). Although a substantial amount of normal genetic variation has been identified, only a small fraction of the variation may play a substantive role in influencing differences in human susceptibility. Understanding the biology of the toxicity pathways provides insight into how genetic susceptibility may play an important role. Specifically, a toxicity-testing strategy with a mechanistic focus should define pathways and indicate points that are rate-limiting or are critical signaling nodes in cellular-response systems. Identifying those nodes will allow the potential effects of genotypic variation to be better determined and integrated into chemical-toxicity assessments.

FIGURE 8. Overview of population-based and human exposure data component.

Another example of the interplay between toxicity testing and epidemiology is the generation of potentially important data on biomarkers. The committee's vision emphasizes studies conducted in human cells that indicate how environmental agents can affect human biologic response. The studies will suggest biomarkers of early biologic effects that could be monitored in human populations (NRC, 2006b). Studying the markers in a variety of cellular systems will help to determine the biomarkers that are best for systematic testing and for use in population-based studies.

Population-health surveillance may indicate human health risks that were not detected in toxicity tests. For example, although pharmaceutical products are subject to extensive toxicologic and clinical testing before their introduction into the marketplace, pharmacovigilance programs have identified adverse health outcomes that were not detected in preclinical and clinical testing (Lexchin, 2005; IOM, 2007). Food-flavoring agents provide another illustrative example. In 2000, several cases of bronchiolitis obliterans, a severe and rare pulmonary disorder, were described in former workers at a microwave-popcorn plant (Akpinar-Elci et al., 2002). Exposure to vaporized flavoring agents used in the production process was associated with decreased lung function (Kreiss et al., 2002). Flavoring-associated respiratory disease was also documented among food-product workers and among workers in facilities that manufactured the flavoring agents (Lockey et al., 2002). Although the toxicity of the flavoring agents was confirmed in animal studies (Hubbs et al., 2002), their inhalation hazards during manufacture and food production were not recognized at the time of product approval. Situations in which toxicity testing is not adequately conducted or fails to identify an important human health risk emphasize the need to integrate population-based studies into any toxicity-testing paradigm and the need to collect human data in a structured manner so that they can be used effectively by the toxicology community.

Human-Exposure Data and the Toxicity-Testing Strategy Human-exposure data may prove to be pivotal as toxicity testing shifts from the current apical endpoint whole-animal testing to cell-based testing. Several types of information will be useful. The first is information collected by manufacturers, users, agencies, or others on exposures of employees in the workplace or on environmental exposures of the population at large. Such exposure information would be considered in the setting of dose ranges for in vitro toxicity testing and of doses for collecting data in targeted pharmacokinetic studies and in selecting concentrations to use in human PBPK models.

Other valuable information will come from biomonitoring surveys of the population that measure environmental agents or their metabolites in blood, urine, or other tissues. New sensitive analytic tools that allow measurement of low concentrations of chemicals in cells, tissues, and environmental media enable tracking of biomarkers in the human population and the environment (Weis et al., 2005; NRC, 2006b). Comparison of the concentrations of agents that activate toxicity pathways with the concentrations of those agents in biologic media in human populations will help to identify populations that may be overexposed, to guide the setting of human exposure guidelines, and to assess the cumulative impact of chemicals that influence the same toxicity pathway. The ability to make such comparisons will be greatly strengthened by a deeper understanding of the pharmacokinetic processes that govern the absorption, distribution, metabolism, and elimination of environmental agents by biologic systems. The enhanced ability to identify media concentrations that can evoke biologic responses will help to reduce the uncertainties associated with a focus on apical effects observed at high doses in animal testing.
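
One simple form of such a comparison can be sketched as follows: estimate what fraction of a population's biomarker concentrations exceeds a concentration found to activate a toxicity pathway in vitro. The lognormal distribution and every numerical value below are hypothetical placeholders, not data from any survey.

```python
# Sketch: compare a hypothetical biomonitoring distribution with an
# in vitro pathway-activation concentration. All numbers are invented.
import numpy as np
from scipy.stats import lognorm

# Hypothetical blood-biomarker distribution: geometric mean 0.8 ug/L,
# geometric standard deviation 2.5, as if from a population survey.
gm, gsd = 0.8, 2.5
dist = lognorm(s=np.log(gsd), scale=gm)

activating_conc = 5.0  # ug/L at which the pathway assay shows activation

frac_above = dist.sf(activating_conc)  # survival function = P(X > threshold)
print(f"estimated fraction above activating concentration: {frac_above:.2%}")

# A margin-of-exposure-style summary: ratio of the activating
# concentration to the 95th percentile of population levels.
p95 = dist.ppf(0.95)
print(f"95th percentile: {p95:.2f} ug/L, margin = {activating_conc / p95:.1f}")
```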

The importance of biomonitoring data emphasizes the need to support and expand such programs as the National Biomonitoring Program conducted by the Centers for Disease Control and Prevention (CDC, 2001, 2003, 2005). Those programs have greatly increased the understanding of human population exposure and have provided valuable information to guide toxicity testing. In time, biomonitoring will enable assessment of the status of toxicity-pathway activation in the population. That information will be critical in understanding the implications of high-throughput results for the population and for identifying susceptible populations.


Component E: Risk Contexts

Toxicity testing is valuable only if it can be used to make more informed and more efficient responses to public-health concerns faced by regulators, industry, and the public. Early in this article, the committee identified five broad risk contexts requiring decisions about environmental agents, which are listed in Figure 9. Each decision-making context creates a need for toxicity-testing information that, if fulfilled, can help to identify the most effective ways to reduce or eliminate health risks posed by environmental agents.

Some of the risk contexts require rapid screening of environmental agents numbering in the tens of thousands. Others require highly refined dose-response information on effects at environmental concentrations, the ability to test chemical mixtures, or the use of focused assays targeted to specific toxicity pathways or endpoints. Some risk contexts may require the use of population-based approaches, including population health surveillance and biomonitoring.

The committee believes that its vision for a new toxicity-testing paradigm will help to respond to decision-making needs, whether regulatory or nonregulatory, and will allow evaluation of all substances of concern, whatever their origin might be. Specific implications of the vision for risk management can be illustrated by considering the five risk contexts identified in the first section.

FIGURE 9. Overview of risk contexts component.

• Evaluation of new environmental agents. Two issues arise in the testing of new chemicals or products. First, emerging technologies might require novel testing approaches. For example, nanotechnology, which focuses on materials in the nanometer range, will present challenges in toxicity testing that might not be easily addressed with existing approaches (Institute of Medicine [IOM], 2005; Borm et al., 2006; Gwinn & Vallyathan, 2006; Nel et al., 2006; Powell & Kanarek, 2006). Specifically, the toxic properties of a nanoscale material will probably depend on its physical characteristics, not on the toxic properties of the substance or element itself (such as titanium or carbon) that makes up the material. The nanoscale material might be evaluated with new in vitro tests specially designed to identify biologic perturbations that might be expected from exposure to it. As discussed earlier in this section, nanoscale materials may require some targeted whole-animal testing to ensure that all biologically significant effects are identified. Second, because many new commercial chemicals are developed each year, there is a need for a mechanism to screen them rapidly for potential toxicity. With an emphasis on high- and medium-throughput screens, the committee's vision for toxicity testing accommodates screening a large number of chemicals.

• Evaluation of existing environmental agents. Two issues arise in the testing of existing environmental agents. For widespread and persistent environmental agents that cannot be easily removed from the human environment and can have potentially significant health effects, an in-depth evaluation of toxic properties is important. The committee's vision, with its emphasis on toxicity-pathway analysis, will provide the deep understanding needed for refined evaluation of the potential human health effects and risks. As in the evaluation of new environmental agents, there is a need for effective screening methods so that the potential toxicity of the tens of thousands of agents already in the environment can be evaluated. The committee's toxicity-testing strategy, with high-throughput toxicity-pathway assays, should permit greater coverage of the existing environmental agents that have not been adequately tested for toxicity.

• Evaluation of a site. Sites invariably contain a mixture of chemical agents. Evaluation of mixtures has proved to be difficult in the existing toxicity-testing strategy (see the section "Vision"). High-throughput assays, as emphasized by the committee, may be the best approach for toxicity assessment of mixtures because they are more easily used to assess combinations of chemicals. Biomonitoring data—whose collection is highlighted in the committee's vision—can be especially useful in site investigations to identify problematic exposures.

• Evaluation of potential environmental contributors to a specific disease. Public-health problems, such as clusters of cancer cases or outbreaks of communicable diseases, can have an environmental component. Asthma has distinct geographic, temporal, and demographic patterns that strongly suggest environmental contributions to its incidence and severity (Woodruff et al., 2004) and provides an excellent illustration of how the committee's vision could help to elucidate the environmental components of a disease. First, animal models of asthma have been plagued by important species differences, which limit the utility of standard toxicity-testing approaches (Pabst, 2002; Epstein, 2004). Second, substantial data are available on toxicity pathways involved in asthma (Maddox & Schwartz, 2002; Pandya et al., 2002; Lutz & Sulkowski, 2004; Lee et al., 2005; Chan et al., 2006; Nakajima & Takatsu, 2006; Abdala-Valencia et al., 2007); the pathways should be testable with high-throughput assays, which could permit the evaluation of many environmental agents for a potential etiologic role in the induction or exacerbation of asthma. Third, environmental agents that raise concern in the high-throughput assays could have high priority in population-based studies for evaluation of their potential link to asthma in human populations, such as workers. The high-throughput assays that are based on evaluation of toxicity pathways can survey large numbers of environmental agents and identify those which operate through a mechanism that may be relevant to a disease of interest, as in the case of asthma, and may help to generate useful hypotheses that can then be examined in population-based studies.

• Evaluation of the relative risks posed by environmental agents. It is often useful to assess the relative risks associated with different environmental agents, such as pesticides or pharmaceutical products, that could have been developed for the same purpose. The new toxicity-testing paradigm will provide information on relative potencies established by computational toxicology, toxicity-pathway analysis, dose-response analysis, and targeted testing.

The future toxicity-testing strategy envisioned by the committee will be well suited to providing the relevant data needed to make the critical risk-management decisions required in the long term.

Toxicity-Testing Strategies in Practice

To illustrate how the results of the tests envisioned by the committee may be applied in specific circumstances, two hypothetical examples of environmental agents that may pose risks to human health are considered. The first example is an irritant gas, and the second is an environmental agent that acts by interactions with estrogen receptors. The committee emphasizes that these examples are intended not to recommend definitive procedures for conducting human health risk assessment but simply to show how assessment might be approached. As the research discussed in the section "Developing the Science Base and Assays to Implement the Vision" is conducted, much will be learned, and new tests and methods to incorporate results into assessments will emerge.

Example 1: Irritant Gas

Toxicity testing and empirical dose-response analysis
• Among a larger group of gases tested in multiple high-throughput assays, the agent caused dose-related responses in test assays for glutathione depletion, Nrf2 oxidative-stress pathway activation, inflammatory pathway responses, and general cytotoxicity. Most other human toxicity-pathway tests had negative results, but the test gas was routinely cytotoxic in systems in which gases were easily tested. Nrf2 pathway activation proved to be the most sensitive endpoint, with an EC10 of 10 ppm and a lower bound on the EC10 of 6.5 ppm, where EC10 or ED10 is the concentration or dose that causes a 10% increase in the response or effect over that of the control (a minimal curve-fitting sketch of the EC10 calculation follows this list).

• A known hydrolysis product of the test gas—one produced in stoichiometric equivalents on hydrolysis of the gas—produced similar responses in vitro when tested over a thousand-fold concentration range (0.001–1 mM). The test provided a lower bound ED10 of 0.12 mM for Nrf2 pathway activation. The hydrolysis product was tested in a broad suite of toxicity pathways and showed little evidence of pathway-specific responses but consistently showed toxic responses at concentrations much above 1.0 mM.

• At nontoxic concentrations, the compound showed no evidence of mutagenicity.
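
The EC10 used above can be made concrete with a small curve fit: the sketch below fits a Hill model to invented concentration-response data and reads off the concentration giving a 10% response. A lower bound like the 6.5 ppm cited above would in practice come from bootstrap or profile-likelihood methods, which are omitted here for brevity.

```python
# Sketch: estimate an EC10 from concentration-response data by fitting a
# Hill model. The data points are invented for illustration.
import numpy as np
from scipy.optimize import curve_fit

def hill(c, top, ec50, n):
    """Fractional pathway activation above control."""
    return top * c**n / (ec50**n + c**n)

conc = np.array([1, 3, 10, 30, 100, 300], dtype=float)   # ppm
resp = np.array([0.02, 0.07, 0.22, 0.55, 0.83, 0.95])    # fraction of max

(top, ec50, n), _ = curve_fit(hill, conc, resp, p0=[1.0, 30.0, 1.0])

# EC10: concentration where the response is 10% of the fitted maximum,
# obtained by inverting the Hill equation: c = EC50 * (0.1 / 0.9)^(1/n).
ec10 = ec50 * (0.1 / 0.9) ** (1.0 / n)
print(f"fitted EC50 = {ec50:.1f} ppm, Hill n = {n:.2f}, EC10 = {ec10:.1f} ppm")
```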

Extrapolation
• Low dose. With positive-control oxidants, low-dose behavior of the Nrf2 pathway was shown to be nonlinear because of high gain in the feedback loops that control activation of this adaptive stress-response pathway. A concentration of one-tenth the lower bound on the EC10 would not be expected to cause substantial pathway activation. That concentration would serve as a starting point for consideration of susceptibility factors, preexisting disease in the human population, and possible coexposures to similarly acting compounds.

• In vitro to in vivo. Extrapolation from the in vitro system used a human pharmacokinetic model derived from a computational fluid-dynamics approach. Model inputs derived partially from SAR included reaction rates of the gas in tissues and species-specific breathing rates. The pharmacokinetic dosimetry model was used to calculate the exposure concentrations that would yield 0.012 mM hydrolysis product (that is, 0.12 mM/10) in the nose and lungs during a continuous human inhalation exposure. The pharmacokinetic model, run in Markov-chain Monte Carlo fashion to account for variability and uncertainty, provided lower bound estimates of 2.5–0.6 ppm for the lungs and 15–3 ppm for the nose. Sensitivity analysis of the combined toxicity-pathway dosimetry model indicated key biologic and pharmacokinetic factors that had important roles in dose delivery and the circuitry governing Keap1 and Nrf2 signaling. (A simplified sampling sketch of this reverse-dosimetry step follows this list.)

• Susceptibility. Susceptibility would depend heavily on polymorphisms in critical portions of the Nrf2 pathway. People with higher than average Keap1 or lower than average Nrf2 could fail to have an adaptive response to oxidative stressors and could progress to toxicity at lower exposure concentrations. The observed polymorphisms in the human population and sensitivity with preexisting diseases suggest that estimates arising from the dose-response analysis should be reduced by a factor of 10.
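
The reverse-dosimetry step in the list above can be caricatured with plain Monte Carlo sampling; a full Markov-chain treatment, as named in the text, would update the parameter distributions against data, which is omitted here. The two-parameter steady-state model and both distributions are invented stand-ins for the computational fluid-dynamics PBPK model described above.

```python
# Sketch of Monte Carlo reverse dosimetry: find the air concentration (ppm)
# expected to yield a target tissue concentration, propagating parameter
# uncertainty. The steady-state model and distributions are hypothetical.
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

target_tissue_mM = 0.012  # one-tenth of the lower-bound ED10 (0.12 mM / 10)

# Hypothetical uncertain parameters: an uptake factor linking ppm in air to
# tissue delivery, and a first-order clearance of the hydrolysis product.
uptake = rng.lognormal(mean=np.log(2e-3), sigma=0.4, size=n)    # mM per ppm-h
clearance = rng.lognormal(mean=np.log(0.5), sigma=0.3, size=n)  # 1/h

# Steady state: C_tissue = uptake * ppm / clearance  =>  solve for ppm.
ppm = target_tissue_mM * clearance / uptake

lo, med = np.percentile(ppm, [5, 50])
print(f"median exposure estimate: {med:.1f} ppm; 5th percentile: {lo:.1f} ppm")
```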

Risk-assessment guidance
• The exposure concentration derived from the high-throughput toxicity-pathway screens and the associated interpretive tools could be used in setting reference standards. The assessment would indicate that the concentration should ensure that an exposure would not lead to biologically significant responses to the compound. In addition, the risk narrative would state that this exposure limit should be protective of other downstream responses—such as respiratory-tract toxicity—that might be of concern at higher concentrations, because even adaptive, precursor responses are being avoided.

• Estimates of cumulative risk should be considered for situations with simultaneous exposures to the irritant gas and other gases that affect Nrf2 signaling.

Human surveillance
• Surveillance studies of workers or other human populations potentially exposed to the irritant gas could test for evidence of Nrf2 oxidative-stress pathway activation and inflammatory pathway responses, possibly using induced sputum samples. To evaluate the results, any increases in activation in the exposed population could be compared with pathway activation in control human populations.

Example 2: Estrogenic Agonist

Toxicity testing and empirical dose-response analysis
• Members of a large group of commercial chemicals were tested in multiple high-throughput in vitro assays. One of them triggered dose-related activation of estrogenic signaling in receptor-binding assays and increased DNA replication—indicative of cell proliferation—in human breast-cancer cells in vitro. Binding assays for this compound had the lowest ED10 values; assay indicators of gene transcription and DNA replication occurred at much higher concentrations. QSAR methods also predicted an estrogenic effect on the basis of a library of tested compounds. All other human toxicity-pathway tests were negative or showed responses at much higher concentrations. The test compound had low cytotoxicity in most screens and produced estrogen-receptor activation at concentrations one-tenth of those that produce signs of cell toxicity.

• A short-term, mechanistic in vivo study with ovariectomized female rats confirmed mild estrogenic action in vivo and moderate evidence of gene-expression responses in uterine or breast tissues. Predicted conjugated metabolites of the compound were without activity in those assays.

Extrapolation
• Experience with estrogen and other estrogenic chemicals indicates the existence of susceptible populations—such as pubescent girls, fetuses, and infants—that require additional protection and attention. In addition, chemicals that bind to and activate the estrogen receptor may act additively with one another. The extrapolation needs to consider the compound uses, subpopulations that are likely to be exposed to it, other background exposures to estrogenic agents in these subpopulations, and the estimated tissue dose in pregnant and nonpregnant women, fetuses, and infants.

• Research on estrogen and estrogen agonists reveals that if receptor occupancy in the most sensitive tissues in susceptible humans is increased by less than x% by this exposure or any combined exposure to estrogenic compounds, an appreciable activation of downstream responses or a biologically significant increase in their activation would be unlikely. An alternative assessment would be based on a functional response in a toxicity-pathway assay, such as transcriptional activation.

• Human PBPK models for the compound would be used to model absorption, distribution to sensitive tissues, and elimination of active parent compound. The models (for example, Markov-chain Monte Carlo PBPK models) would be designed to account for human variability in pharmacokinetics and modeling uncertainty. The PBPK models could generate a point-of-departure exposure concentration or a daily intake at which there would be less than x% increase in receptor occupancy or less than x% change in transcriptional activation in susceptible populations (for example, fetuses) and in 95% to 99% of the exposed general population. The PBPK models could also provide the blood concentration associated with the change in receptor occupancy or transcriptional activation. That blood concentration could be expressed in units of "estrogen equivalence" to simplify comparisons with estrogen and similarly acting estrogen agonists. Also, on the basis of estrogen equivalence, the models could be used to assess the effects of cumulative exposure to exogenous estrogenic compounds and could be checked against biologic monitoring data in the human population for validity and to ensure that the point of departure is not overestimated. (A mass-action sketch of the occupancy calculation follows this list.)
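
A fragment of that calculation follows directly from the law of mass action: fractional receptor occupancy for a single ligand is C/(C + Kd), and an "estrogen equivalence" for a weak agonist follows, in the low-dose limit, from the ratio of affinities. The Kd values and concentrations below are invented for illustration.

```python
# Sketch: receptor occupancy and estrogen-equivalent concentration under
# simple mass-action binding. All constants are hypothetical.

KD_ESTRADIOL = 0.2   # nM, assumed affinity of the reference estrogen
KD_TEST = 400.0      # nM, assumed (much weaker) affinity of the test compound

def occupancy(conc_nM: float, kd_nM: float) -> float:
    """Fractional receptor occupancy for a single ligand."""
    return conc_nM / (conc_nM + kd_nM)

def estrogen_equivalents(conc_nM: float) -> float:
    """Estradiol concentration giving the same occupancy, in the low-dose
    limit where occupancy scales with C / Kd."""
    return conc_nM * KD_ESTRADIOL / KD_TEST

blood_conc = 50.0  # nM, hypothetical modeled blood level of the compound
added = occupancy(blood_conc, KD_TEST)
print(f"added occupancy from test compound: {added:.1%}")
print(f"estrogen-equivalent concentration: {estrogen_equivalents(blood_conc):.3f} nM")

# Screening-style check against an 'x%' occupancy criterion.
x_fraction = 0.01
print("below criterion:", added < x_fraction)
```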

Risk-assessment guidance
• Reference doses and concentrations used in decision making could be based on a point of departure derived as already described. The reference dose would consider factors, such as susceptibility, that could be altered by polymorphisms in critical portions of downstream estrogen-response pathways or in the conjugating enzymes that clear the compound before it reaches the systemic circulation.

Human surveillance
• Human surveillance of workers exposed to the compound could detect subtle indications of early effects in humans if they were to occur.

Toxicity Testing and Risk Assessment A major application of the results of toxicity testing is in the risk assessment of environmental agents. As illustrated in Figure 10, the committee's vision for toxicity testing is consistent with the risk-assessment paradigm originally put forward by the National Research Council in 1983. Chemical characterization and toxicity-pathway evaluation would be involved in hazard identification. Pharmacokinetic models would be used to calibrate in vitro and human dosimetry and thereby facilitate the translation of dose in cellular systems to dose in human organs and tissues. Population-based studies would be used to confirm or explore effects observed in cellular systems, to suggest biologic perturbations that require clarification in in vitro tests, and to interpret findings of in vitro studies in the context of human populations. All would work together to permit establishment of human exposure guidelines based on risk avoidance, which could be used to enforce scientifically based regulatory standards or support nonregulatory risk-management strategies.

Mode-of-action information is important for informing the dose-response component of the risk-assessment paradigm. A deep understanding of mode of action involves studying the mechanistic pathways by which toxic effects are induced, including the key molecular and other biologic targets in the pathways. Thus, the committee's vision, outlined in the sections "Vision" and "Components of the Vision" of this report, is a shift away from traditional toxicity testing that focuses on demonstrating adverse health effects in experimental animals toward a deeper understanding of biologic perturbations in key toxicity pathways that lead to adverse health outcomes. The committee believes that its vision of toxicity testing would better inform the assessment of the potential human health risks posed by exposure to environmental agents and ensure efficient testing methods.

FIGURE 10. Risk assessment components. End product is development of one or more indicators of risk, such as a reference dose or concentration.

TOOLS AND TECHNOLOGIES

The committee provided an overview of its vision for toxicity testing and described the main components of the vision in previous sections. Here, tools and technologies that might be used to apply the committee's vision are briefly discussed. The tools and technologies will evolve and mature over time, but many are already available. The committee emphasizes that technologies are evolving rapidly, and new molecular technologies will surely be available in the near future for mapping toxicity pathways, assessing their functions, and measuring dose-response relationships.

Tools and Technologies for Chemical Characterization

Various computational methods are available for chemical characterization. The discussion here focuses on structure–activity relationship (SAR) analyses, which use physical and chemical properties to predict the biologic activity, potential toxicity, and metabolism of an agent of concern. All are conceptually based on the similar-property principle—that is, that chemicals with similar structure are likely to exhibit similar activity (Tong et al., 2003). Accordingly, biologic properties of new chemicals are often inferred from properties of similar existing chemicals whose hazards are already known. Specifically, SAR analysis involves building mathematical models and databases that use physical properties (such as solubility, molecular weight, dissociation constant, ionization potential energies, and melting point) and chemical properties (such as steric properties, presence or absence of chemical moieties or functional groups, and electrophilicity) to predict biologic or toxicologic activity of chemicals. SAR analyses can be qualitative (for example, recognition of structural alerts, that is, chemical functional groups and substructures) or quantitative (for example, use of mathematical modeling to link physical, chemical, and structural properties with biologic or toxic endpoints) (Benigni, 2004). Key factors in the successful application of SAR methods include proper representation and selection of structural, physical, and chemical molecular features; appropriate selection of the initial set of compounds (that is, the "training set") and methods of analysis; the quality of the biologic data; and knowledge of the mode or mechanism of toxic action (McKinney et al., 2000).

Current applications of SAR analyses include soft drug design, which involves improving the therapeutic index of a drug by manipulating its steric and structural properties (Bodor, 1999); design and testing of chemotherapeutic agents (van den Broek et al., 1989); nonviral gene and targeted-gene delivery (Congiu et al., 2004); creating predictive models of carcinogenicity to replace animal models (Benigni, 2004); predicting the toxicity of chemicals, particularly pesticides and metals (Walker et al., 2003a); and predicting the environmental fate and ecologic effects of industrial chemicals (Walker et al., 2003b). Among the available predictive-toxicity systems, the most widely used are statistically based correlative programs (such as CASE/MultiCASE and TOPKAT) and rule-based expert systems (such as DEREK and ONCOLOGIC) (McKinney et al., 2000).

There are many examples of successful applications of SAR and quantitative SAR (QSAR) analysis. One successful application of SAR analysis in risk assessment is the modeling of Ah-receptor-binding affinities of dioxin-like compounds, including the structurally related polychlorinated dioxins, dibenzofurans, and biphenyls. Specifically, SAR methods were used to establish a common mechanism of action for toxic effects and in the further development of toxic equivalency factors in risk assessments involving exposure to complex mixtures of those compounds (van den Berg et al., 1998). Other successful applications have examined how structural alterations influence toxicity. For example, toxic effects of nonpolar anesthetics are mediated by a nonspecific action on cell membranes and have been shown to be directly correlated to their log octanol–water partition coefficient (log Kow). However, the polar anesthetics—which include such chemicals as phenols, anilines, pyridines, nitrobenzenes, and aliphatic amines—generally show an anesthetic potency 5–10 times higher than expected on the basis of their log Kow alone (Soffers et al., 2001).
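
The baseline relationship just described is a one-descriptor QSAR: log(1/EC50) regressed on log Kow. The sketch below fits that line to invented training data and flags a compound whose measured potency sits well above the baseline, the signature of the polar anesthetics mentioned in the text; none of the numbers are from the cited studies.

```python
# Sketch: one-descriptor baseline QSAR, log(1/EC50) vs. log Kow.
# Training values are invented for illustration.
import numpy as np

log_kow = np.array([0.5, 1.2, 2.0, 2.8, 3.5, 4.1])
log_inv_ec50 = np.array([0.4, 1.0, 1.7, 2.4, 3.0, 3.6])  # hypothetical potencies

# Ordinary least squares for intercept + slope.
X = np.column_stack([np.ones_like(log_kow), log_kow])
(intercept, slope), *_ = np.linalg.lstsq(X, log_inv_ec50, rcond=None)

def baseline_potency(lk):
    return intercept + slope * lk

# A hypothetical polar anesthetic: a measured potency about one log unit
# above the baseline prediction (~10-fold) would signal excess toxicity.
lk_new, measured = 1.5, 2.3
excess = measured - baseline_potency(lk_new)
print(f"baseline prediction: {baseline_potency(lk_new):.2f}, "
      f"excess: {excess:.2f} log units (~{10**excess:.0f}-fold)")
```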

Much effort has been directed toward the modeling and prediction of specific toxicities, particularly mutagenicity and carcinogenicity, because of the importance of these endpoints, the cost and length of full rodent assays for carcinogenesis, and the availability of high-quality data for modeling purposes. Experimental observation has led to the identification of several structural alerts that can cause both mutation and cancer, including carbonium ions (alkyl, aryl, and benzylic), nitrenium ions, epoxides and oxonium ions, aldehydes, polarized double bonds (alpha- and beta-unsaturated carbonyls or carboxylates), peroxides, free radicals, and acylating intermediates (Benigni & Bossa, 2006). The structural alerts for mutagenicity and carcinogenicity have been incorporated into expert systems for predicting toxic effects of chemicals (Simon-Hettich et al., 2006).

A number of structural alerts also have been associated with developmental toxicity. They were identified on the basis of known developmental responses to environmental agents, such as valproic acids, hydrazides, and carbamates (Schultz & Seward, 2000; Cronin, 2002; Walker et al., 2004). Studies have demonstrated that the presence of a hydroxyl group is required for estrogenic activity of biphenyls; symmetric derivatives are 10 times more active than nonsymmetric ones (Schultz et al., 1998). The relationship between the size and shape of the nonphenolic moiety and estrogenic potency among para-substituted phenols demonstrated the trend of increasing estrogenicity with increased molecular size (Schultz & Seward, 2000). Thus, although predictive models for some toxic endpoints, such as mutagenicity, already exist, more mechanistically complex endpoints—such as acute, chronic, or organ toxicity—are more difficult to predict (Schultz & Seward, 2000; Simon-Hettich et al., 2006).

One final application of SAR analysis is in predicting absorption, distribution, metabolism, and excretion. Qualitative SARs, QSARs, and the related quantitative structure–property relationships have been successfully used to estimate such key properties as permeability, solubility, biodegradability, and cytochrome P-450 metabolism (Feher et al., 2000; Bugrim et al., 2004); to predict drug half-life values (Anderson, 2002); and to describe penetration of the blood-brain barrier (Bugrim et al., 2004).

As indicated earlier, the predictive ability of different models depends on selecting the correct molecular descriptors for the particular toxic endpoints, the appropriate mathematical approach and analysis, and a sufficiently rich set of experimental data. The ability to adapt existing models continuously by building on larger and higher quality data sets is crucial for the improvement and ultimate success of these approaches.

Mapping Toxicity Pathways

As discussed in the sections "Vision" and "Components of the Vision," the key component of the committee's vision is the evaluation of perturbations in toxicity pathways. Many tools and technologies are available that can aid in the identification of biologic signaling pathways and the development of assays to evaluate their function. Recent advances in cellular and molecular biology, "-omics" technologies, and computational analysis have contributed considerably to the understanding of biologic signaling processes (Daston, 1997; Ekins et al., 2005). Within the last 15 years, multiple cellular response pathways have been evaluated in increasing depth, as is evidenced by the progress in the basic knowledge of cellular and molecular biology (Fernandis & Wenk, 2007; Lewin et al., 2007). Moreover, systems biology constitutes a powerful approach to describing and understanding the fundamental mechanisms by which biologic systems operate. Specifically, systems biology focuses on the elucidation of biologic components and how they work together to give rise to biologic function. A systems approach can be used to describe the fundamental biologic events involved in toxicity pathways and to provide evolving biologic modeling tools that describe cellular circuits and their perturbations by environmental agents (Andersen et al., 2005a). A longer term goal of systems biology is to create mathematical models of biologic circuits that predict the behavior of cells in response to environmental agents qualitatively and quantitatively (Lander & Weinberg, 2000). Progress in that regard is being made in developmental biology (Cummings & Kavlock, 2005; Slikker et al., 2005). The subsections that follow outline tools and technologies that will most likely be used to elucidate the critical toxicity pathways and to develop assays to evaluate them.

In Vitro Tests The committee foresees that in vitro assays will make up the bulk of the toxicity tests in its vision. In vitro tests are currently used in traditional toxicity testing and demonstrate the feasibility of developing and using in vitro assays (Goldberg & Hartung, 2006). In vitro tests include the 3T3 neutral red uptake phototoxicity assay (Spielman & Liebsch, 2001), cytotoxicity assays (O'Brien & Haskins, 2007), skin-corrosivity tests, and assays measuring vascular injury using human endothelial cells (Schleger et al., 2004). Many tests have been validated by the European Centre for the Validation of Alternative Methods. The committee notes that the current in vitro tests originated as alternatives to or replacements of other toxicity tests. In the committee's vision, in vitro assays will evaluate biologically significant perturbations in toxicity pathways and thus are not intended to serve as direct replacements of existing toxicity tests.

The committee envisions the use of human cell lines for the in vitro assays. Cell lines have been used for a long time in experimental toxicology and pharmacology. Human cell lines are readily available from tissue-culture banks and laboratories and are particularly attractive because they offer the possibility of working with a system that maintains several phenotypic and genotypic characteristics of the human cells in vivo (Suemori, 2006). Differentiated functions, functional markers, and metabolic capacities may be altered or preserved in cell lines, depending on culture conditions, thereby allowing testing of a wide array of agents in different experimental settings. Other possibilities include using animal cells that are transfected to express human genes and proteins.


For example, various cell lines—such as V79, CHO, COS-7, NIH3T3, and HEPG2—have been transfected with complementary DNA (cDNA, DNA synthesized from mature mRNA) coding for human enzymes and used in mutagenesis and drug-metabolism studies (Potier et al., 1995). Individual enzymes have also been stably expressed to identify the major human isoenzymes, such as cytochromes P-450 and UDP-glucuronosyltransferases, responsible for the metabolism of potential therapeutic and environmental agents. The metabolic in vitro screens with human enzymes are usually conducted as a prelude to clinical studies.

A major limitation of using human cell lines is the difficulty of extrapolating data from the simple biologic system of single cells to the complex interactions in whole animals. Questions have also been raised concerning the stability of cell lines over time, the reproducibility of responses over time, and the ability of cell lines to account for genetic diversity of the human population. Nonetheless, cell lines have been used as key tools in the initial screening and evaluation of toxic agents, in the characterization of properties of cancer cells (Suzuki et al., 2005), and in gene profiling with microarrays (Wang et al., 2006b). The high-throughput methods now becoming more common will allow the expansion of the methods to larger numbers of endpoints, wider dose ranges, and mixtures of agents (Inglese, 2002; Inglese et al., 2006).

High-Throughput Methods A critical feature of the committee's vision is the use of high-throughput methods that will allow economical screening of large numbers of chemicals in a short period. The pharmaceutical industry provides an example of the successful use of high-throughput methods. Optimizing drug-candidate screening is essential for timely and cost-effective development of new pharmaceuticals. Without effective screening methods, poor drug candidates might not be identified until the preclinical or clinical phase of the drug-development process, and this could lead to high costs and low productivity for the pharmaceutical industry (Lee & Dordick, 2006). Pharmaceutical companies have turned to high-throughput screening, which allows automated simultaneous testing of thousands of chemical compounds under conditions that model key biologic mechanisms (Fischer, 2005). Such technologies as hybridization, microarrays, real-time polymerase chain reaction, and large-scale sequencing are some of the high-throughput methods that have been developed (Waring & Ulrich, 2000). High-throughput assays are useful for predicting several important characteristics related to the absorption, distribution, metabolism, excretion, and toxicity of a compound (Gombar et al., 2003). They can predict the interaction of a compound with enzymes, the metabolic degradation of the compound, the enzymes involved in its biotransformation, and the metabolites formed (Masimirembwa et al., 2001). That information is integral for selecting compounds to advance to the next phase of drug development, especially when many compounds may have comparable pharmacologic properties but differing toxicity profiles (Pallardy et al., 1998). High-throughput assays are also useful for rapid and accurate detection of genetic polymorphisms that could dramatically influence individual differences in drug response (Shi et al., 1999).
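
At the analysis end, a high-throughput screen reduces to calling "hits" from thousands of wells, and a common first pass is a robust z-score against the plate's own distribution. The sketch below applies that logic to simulated plate data; the well count, planted actives, and threshold are all illustrative assumptions, not a prescribed workflow.

```python
# Sketch: hit-calling in a high-throughput screen with robust z-scores.
# Simulated values stand in for a real 384-well plate readout.
import numpy as np

rng = np.random.default_rng(0)
signal = rng.normal(loc=100.0, scale=8.0, size=384)  # simulated plate
signal[[17, 141, 300]] -= 45.0                       # three planted 'actives'

# Robust center/scale: median and MAD resist distortion by true hits.
median = np.median(signal)
mad = np.median(np.abs(signal - median)) * 1.4826  # ~sigma for normal data

z = (signal - median) / mad
hits = np.flatnonzero(z < -3.0)  # wells with strong signal loss
print("hit wells:", hits.tolist())
```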

Microarrays Microarray technologies have allowed the development of the field of toxicogenomics, which evaluates changes in genetic response to environmental agents or toxicants. These technologies permit genome-wide assessments of changes in gene expression associated with exposure to environmental agents. The identification of responding genes can provide valuable information on cellular response and some information on toxicity pathways that might be affected by environmental agents. Some of the tools and technologies are described next.

Microarrays are high-throughput analytic devices that provide comprehensive genome-scale expression analysis by simultaneously monitoring quantitative transcription of thousands of genes in parallel (Hoheisel, 2006). The Affymetrix GeneChip Human Genome U133 Plus 2.0 Array provides comprehensive analysis of genome-wide expression of the entire transcribed human genome on a single microarray (Affymetrix Corporation, 2007). Whole-genome arrays are also available for the rat and mouse. The use of the rat arrays will probably increase as the relationships between specific genes and markers on the arrays become better understood.

Protein microarrays potentially offer the ability to evaluate all expressed proteins in cells or tissues. Protein-expression profiling would allow some understanding of the relationship between transcription (the suite of mRNAs in the cell) and the translational readout of the transcripts (the proteins). Protein microarrays have diverse applications in biomedical research, including profiling of disease markers and understanding of molecular pathways, protein modifications, and protein activities (Zangar et al., 2005). However, whole-cell or tissue profiling of expressed proteins is still in the developmental stage. These techniques remain expensive, and the technology is in flux.

Differential gene-expression experiments use comparative microarray analysis to identify genes that are upregulated or downregulated in response to experimental conditions. The large-scale investigation of differential gene expression attaches functional activity to structural genomics. Whole-genome-expression experiments involve hundreds of experimental conditions in which patterns of global gene expression are used to classify disease specimens and discover gene functions and toxicogenomic targets (Peeters & Van der Spek, 2005). Gene-expression profiling will have a role in identifying toxicity pathways in whole-animal studies but is not expected to be the staple technology for identifying and mapping the pathways.
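
A minimal version of such a differential-expression analysis is a per-gene test with multiple-testing control, since thousands of genes are tested at once. The sketch below runs Welch t-tests on simulated expression values and applies a Benjamini-Hochberg false-discovery-rate correction; the gene counts, group sizes, and effect sizes are invented.

```python
# Sketch: per-gene differential expression with Benjamini-Hochberg FDR
# control, on simulated log-scale expression data (all values invented).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_genes, n_per_group = 2000, 8
ctrl = rng.normal(8.0, 1.0, size=(n_genes, n_per_group))
trt = rng.normal(8.0, 1.0, size=(n_genes, n_per_group))
trt[:40] += 2.5  # first 40 genes truly upregulated by the 'exposure'

# Welch t-test for each gene (rows), treated vs. control.
_, p = stats.ttest_ind(trt, ctrl, axis=1, equal_var=False)

# Benjamini-Hochberg step-up: largest k with p_(k) <= (k / m) * q.
q = 0.05
order = np.argsort(p)
passes = p[order] <= q * np.arange(1, n_genes + 1) / n_genes
n_sig = (np.max(np.nonzero(passes)[0]) + 1) if passes.any() else 0
called = np.sort(order[:n_sig])
print(f"{n_sig} genes called at FDR {q}; "
      f"{np.sum(called < 40)} of the 40 true positives recovered")
```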

High-Throughput Functional Genomics Functional genomics should be distinguished from toxicogenomics. Toxicogenomics is a broad field combining expertise in toxicology, genetics, molecular biology, and environmental health and includes genomics, proteomics, and metabonomics, whereas functional genomics as described here is a specialized discipline that attempts to understand the functions of genes within cellular networks.

Large-scale evaluations of the status of gene expression and protein concentrations in cells allow understanding of the integrated biologic activities in tissues and can be used to catalog changes after in vivo or in vitro treatment with environmental agents. However, evaluation of the organization and interactions among genes in toxicity pathways requires approaches referred to as functional genomics, which encompass a different suite of molecular tools (Brent, 2000). The tools are designed to catalog the full suite of genes that are required for optimal activity of a toxicity pathway. The evaluation of the readout of those functional screens with bioinformatic analysis provides key data about the organization of toxicity pathways and guides computational methods that model the consequences of perturbation of the pathways by environmental agents.

Functional analysis requires a cell-based assay that provides a convenient, automated cell-based measure of functioning of a toxicity pathway (Akutsu et al., 1998; Michiels et al., 2002; Chanda et al., 2003; Lum et al., 2003; Berns et al., 2004; Huang et al., 2004) and requires the ability to automate treatment of the cells with individual cDNAs or small interfering RNAs (siRNAs), which are relatively short RNA oligomers that appear to play important roles in inhibiting gene expression (Hannon, 2002; Meister & Tuschl, 2004; Mello & Conte, 2004; Hammond, 2005). Treatment of the cells with a particular cDNA causes overexpression of the gene (and presumably the protein) that is coded by it. In contrast, treatment with gene-specific siRNA causes knockdown of specific proteins by enhancing degradation of the mRNA from the gene. High-throughput methods permit automation of such cell-based assays by the use of robots and libraries of cDNAs and siRNAs. The screens show which genes increase and which decrease activity of the toxicity pathway.
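
Downstream of such a screen, the readout can be reduced to classifying each gene by how its knockdown shifts a pathway reporter. The sketch below does this with a simple fold-change cutoff on simulated reporter values; the gene names, cutoffs, and data are all hypothetical, and a real screen would add replicates and statistics.

```python
# Sketch: classify genes from a (simulated) siRNA knockdown screen by the
# effect of each knockdown on a toxicity-pathway reporter. Hypothetical data.

control_reporter = 1.0  # normalized pathway-reporter activity, no siRNA

# Hypothetical genes -> mean reporter activity after knockdown.
knockdown = {
    "GENE_A": 0.21,  # knockdown silences the reporter
    "GENE_B": 1.05,
    "GENE_C": 2.60,  # knockdown boosts the reporter
    "GENE_D": 0.92,
}

def classify(activity, lo=0.5, hi=2.0):
    """Pathway component if knockdown drops activity; suppressor if it
    raises activity; otherwise no call. Cutoffs are arbitrary choices."""
    fold = activity / control_reporter
    if fold < lo:
        return "pathway component (activity falls on knockdown)"
    if fold > hi:
        return "pathway suppressor (activity rises on knockdown)"
    return "no call"

for gene, act in knockdown.items():
    print(f"{gene}: {classify(act)}")
```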

Computational Biology Computational biology uses computer techniques and mathematical modeling to understand biologic processes. It is a powerful tool to cope with the ever-increasing quantity and quality of biologic information on genomics, proteomics, gene expression, gene varieties, genotyping techniques, and protein and cell arrays (Kriete & Eils, 2006). Computational tools are used in data analysis, data mining, data integration, network analysis, and multiscale modeling (Kitano, 2005). Computational biology is particularly useful for systems biology in understanding structural, regulatory, and kinetic models (Barabasi & Oltvai, 2004); in modeling signal transduction (Eungdamrong & Iyengar, 2004); and in analyzing genome information and its structural and functional properties (Snitkin et al., 2006). Furthermore, computational biology is used to predict toxic effects of chemical substances (Simon-Hettich et al., 2006), to understand the toxicokinetics and toxicodynamics of xenobiotics (Ekins, 2006), to determine gene-expression profiling of cancer cells (Katoh & Katoh, 2006), to help in the development of genomic biomarkers (Ginsburg & Haga, 2006), and to design virtual experiments to replace or reduce animal testing (Vedani, 1999). In drug design and discovery, novel computational technologies help to create chemical libraries of structural motifs relevant to target proteins and their small molecular ligands (Balakin et al., 2006; O'Donoghue et al., 2006).

Cellular signaling circuits handle an enormous variety of functions. Apart from replication and other functions of individual cells, signaling circuits must implement the complex logic of development and function of multicellular organisms. Computer models are helpful in understanding that complexity (Bhalla et al., 2002). Recent studies have extended such models to include electrical, mechanical, and spatial details of signaling (Bhalla, 2004a, 2004b). The mitogen-activated protein kinase (MAPK) pathway is one of the most important and extensively studied signaling pathways; it governs growth, proliferation, differentiation, and survival of cells. Widely varied mathematical models of the MAPK pathway have led to novel insights and predictions as to how it functions (Orton et al., 2005; Santos et al., 2007).
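
To give a taste of such models, the sketch below integrates a drastically reduced three-tier kinase cascade (MAPKKK activates MAPKK, which activates MAPK) with mass-action activation and first-order deactivation. Real MAPK models track dual phosphorylation cycles, phosphatases, and feedback; the rate constants here are invented solely to show signal amplification down the tiers.

```python
# Sketch: three-tier kinase cascade, a drastic caricature of MAPK signaling.
# Rate constants and inputs are invented to illustrate amplification.
from scipy.integrate import solve_ivp

def cascade(t, y, signal):
    k3, k2, k1 = y  # active fractions: MAPKKK, MAPKK, MAPK
    dk3 = signal * (1.0 - k3) - 1.0 * k3      # input activates tier 1
    dk2 = 3.0 * k3 * (1.0 - k2) - 1.0 * k2    # tier 1 activates tier 2
    dk1 = 3.0 * k2 * (1.0 - k1) - 1.0 * k1    # tier 2 activates tier 3
    return [dk3, dk2, dk1]

for signal in (0.01, 0.05, 0.2, 1.0):
    sol = solve_ivp(cascade, (0.0, 50.0), [0.0, 0.0, 0.0], args=(signal,))
    k3, k2, k1 = sol.y[:, -1]
    print(f"input {signal:<4}: MAPKKK* {k3:.3f}  MAPKK* {k2:.3f}  MAPK* {k1:.3f}")
```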

Predictive computational models derived from experimental studies have been developed to describe receptor-mediated cell communication and intracellular signal transduction (Sachs et al., 2005). Physicochemical models attempt to describe biomolecular transformations, such as covalent modification and intermolecular association, with physicochemical equations. The models make specific predictions and work mostly with pathways that are better understood. They can be viewed as translations of familiar pathway maps into mathematical forms (Aldridge et al., 2006). Integrated mechanistic and data-driven modeling for multivariate analysis of signaling pathways is a novel approach to understanding multivariate dependence among molecules in complex networks and potentially can be used to identify combinatorial targets for therapeutic interventions and toxicity-pathway targets that lead to adverse responses (Hua et al., 2006).

In Vivo Tests As discussed in two previous sections, in vivo tests will most likely be used in the foreseeable future to evaluate the formation of metabolites and some mechanistic aspects of target-organ responses to environmental agents, including genome-wide evaluation of gene expression. The previous section "Components of the Vision" noted that careful design of those studies could substantially increase the value of information obtained. For example, evaluation of cellular transcriptomic patterns from tissues of animals receiving short-term exposures may provide clues to cellular targets of environmental agents and assist in target-tissue identification. (See the section "Components of the Vision" for further discussion of protocol changes that could increase the value of toxicity tests.) Moreover, technologic advances in detection and imaging have the potential for improving in vivo testing. For example, positron-emission tomography (PET) is an imaging tool that can determine biochemical and physiologic processes in vivo by monitoring the activity of radiolabeled compounds (Paans & Vaalburg, 2000). Because PET can detect the activity of an administered compound at the cellular level, its use in animal models can result in the incorporation of mechanistic processes and an understanding of the pathologic effects of a candidate compound (Rehmann & Jayson, 2005).

Tools and Technologies for Dose-Response and Extrapolation Modeling As discussed in the sections "Vision" and "Components of the Vision," two types of modeling will be critical for implementing the committee's vision: physiologically based pharmacokinetic (PBPK) models and dose-response models of perturbations of toxicity pathways. PBPK models will allow dose extrapolation from in vitro conditions used for assessing toxicity-pathway perturbations to projected human exposures in vivo. Mechanistic models of perturbations of toxicity pathways should aid in developing low-dose extrapolation models that consider the biologic structure of the cellular circuitry controlling pathway activation.

Physiologically Based Pharmacokinetic Models Assessing the risk associated with human chemical exposure has traditionally relied on the extrapolation of data from animal models to humans, from one route of exposure to another, and from high doses to low doses. Such extrapolation attempts to relate the extent of external exposure to a toxicant to the internal dose in the target tissue of interest. However, differences in biotransformation and other pharmacokinetic processes can introduce error and uncertainty into the extrapolation of toxicity from animals to humans (Kedderis & Lipscomb, 2001).

PBPK models provide a physiologic basis for extrapolating between species and routes of exposure and thus allow estimation of the active form of a toxicant that reaches the target tissue after absorption, distribution, and biotransformation (Watanabe et al., 1988). However, PBPK results can differ significantly in the hands of different modelers (Hattis et al., 1990), and improved modeling approaches for parameter selection and uncertainty analysis are under discussion. PBPK models might also be useful for estimating the effect of exposure at different life stages, such as pregnancy, critical periods of development, and childhood growth (Barton, 2005). Interindividual differences can be incorporated into PBPK models by integrating quantitative information from in vitro biotransformation studies (Bois et al., 1995; Kedderis & Lipscomb, 2001).

The more pervasive use of PBPK approaches in the new strategy for toxicity testing will be in basing dosimetry extrapolations on estimates of partitioning, metabolism, and interactions among chemicals derived from in vitro measurements or perhaps even from SAR or QSAR techniques. Those extrapolations will require some level of validation that might require data from kinetic studies in volunteers or from biomonitoring studies in human populations. In the committee's vision for toxicity testing, the development of PBPK models from SAR predictions of partitioning and metabolism would decrease animal use, and continued improvements in the in vitro to in vivo extrapolations of kinetics will support the translation from test-tube studies of perturbations to predictions.
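
One concrete instance of parameterizing a PBPK model from in vitro measurements is scaling intrinsic clearance measured in hepatocytes up to whole-liver clearance with the well-stirred liver model. The scaling factors and measured value below are order-of-magnitude placeholders, not data for any particular chemical.

```python
# Sketch: in vitro -> in vivo extrapolation of hepatic clearance using the
# well-stirred liver model. All numeric inputs are illustrative assumptions.

# Hypothetical in vitro measurement.
clint_ul_min_per_million = 12.0   # uL/min per 1e6 hepatocytes

# Commonly used human scaling factors (order-of-magnitude values).
hepatocytes_per_g_liver = 120e6
liver_g_per_kg_body = 21.0
q_h = 1.45                        # hepatic blood flow, L/h per kg body weight
fu = 0.1                          # assumed fraction unbound in blood

# Scale intrinsic clearance to whole body (L/h/kg).
clint = (clint_ul_min_per_million / 1e6      # -> L/min per 1e6 cells
         * hepatocytes_per_g_liver / 1e6     # -> L/min per g liver
         * liver_g_per_kg_body * 60.0)       # -> L/h per kg body weight

# Well-stirred model: CL_h = Q_h * fu * CLint / (Q_h + fu * CLint).
cl_h = q_h * fu * clint / (q_h + fu * clint)
print(f"scaled CLint = {clint:.2f} L/h/kg, hepatic CL = {cl_h:.2f} L/h/kg")
```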

Dose-Response Models of Toxicity Pathways Dose-response modeling of toxicity pathways involves the integration of mechanistic and dosimetric information about the toxicity of a chemical into descriptive mathematical terms to provide a quantitative model that allows dose and interspecies extrapolation (Conolly, 2002). New techniques in molecular biology, such as functional genomics, will play a key role in the development of such models because they provide more detailed information about the organization of toxicity pathways and the dose-response relationships of perturbations of toxicity pathways by environmental agents. Dose-response models have been developed for cell-signaling pathways and used in risk assessment (Andersen et al., 2002). They have found important applications in studying chemical carcinogenesis (Park & Stayner, 2006). In particular, models of cancer formation have been developed to describe the induction of squamous-cell carcinomas of the nasal passage in rats exposed to formaldehyde by inhalation, taking into account both tissue dosimetry and the nonlinear effects of cellular proliferation and formation of DNA–protein cross-links (Slikker et al., 2004a, 2004b; Conolly et al., 2004). However, alternative implementations of the formaldehyde model gave substantially different results (Subramaniam et al., 2006). Emerging developments in systems biology allow modeling of cellular and molecular signaling networks affected by chemical exposures and thereby produce an integrated modeling approach capable of predicting dose-response relationships of pathway perturbations by developmental and reproductive toxicants (Andersen et al., 2005b).

In the next decades, the dose-response modeling tools for perturbations should progress relatively rapidly to guide low-dose extrapolations of initial interactions of toxic compounds with biologic systems. The quantitative linkage of early perturbations with apical responses is likely to develop more slowly. For the foreseeable future, the continued refinement of biologic models of signaling circuitry should guide the extrapolation approaches necessary for conducting risk assessment with the toxicity-pathway tests as the cornerstone of toxicity-testing methods.

DEVELOPING THE SCIENCE BASE AND ASSAYS TO IMPLEMENT THE VISION

Rapid advances in the understanding of the organization and function of biologic systems provide the opportunity to develop innovative mechanistic approaches to toxicity testing. In comparison with the current system, the new approaches should provide wider coverage of chemicals of concern, reduce the time needed for generating toxicity-test data required for decision making, and use animals to a far smaller extent. Accordingly, the committee has proposed development of a testing structure that evaluates perturbations in toxicity pathways and relies on a mix of high- and medium-throughput assays and targeted in vivo tests as the foundation of its vision for toxicity testing. This section discusses the kinds of applied and basic research needed to support the new toxicity-testing approach, the institutional resources required to support and encourage it, and the valuable products that can be expected during the transition from the current apical endpoint testing to a mechanistically based in vivo and in vitro test system.

Most tests in the committee’s vision wouldbe unlike current toxicity tests, which generatedata on apical endpoints. The mix of tests inthe vision include in vitro tests that assess criti-cal mechanistic endpoints involved in theinduction of overt toxic effects rather than theeffects themselves and targeted in vivo teststhat ensure adequate testing of metabolitesand coverage of endpoints. The move towarda mechanism-oriented testing paradigm poseschallenges. Implementation will require (1)the availability of suites of in vitro tests—pref-erably based on human cells, cell lines, orcomponents—that are sufficiently compre-hensive to evaluate activity in toxicity path-ways associated with the broad array ofpossible toxic responses; (2) the availability oftargeted tests to complement the in vitro testsand ensure overall adequate data for decisionmaking; (3) models of toxicity pathways tosupport application of in vitro test results topredict general-population exposures thatcould potentially cause adverse perturbations;(4) infrastructure changes to support the basicand applied research needed to develop thetests and the pathway models; (5) validation oftests and test strategies for incorporation intochemical-assessment guidelines that will pro-vide direction on interpreting and drawingconclusions from the new assay results; and(6) acceptance of the idea that the results oftests based on perturbations in toxicity path-ways are adequately predictive of adverseresponses and can be used in decision making.Development of the new assays and therelated basic research—the focus of thissection—requires a substantial researchinvestment over quite a few years. Institutionalacceptance of the new tests and the requisitenew risk-assessment approaches—the focus ofthe next section, “Prerequisites for Imple-menting the Vision in Regulatory Contexts”—also require careful planning. They cannotoccur overnight.

Ultimately, the time required to conduct the research needed to support large-scale incorporation of the new mechanistic assays into a test strategy that can adequately and rapidly address large numbers of agents depends on the institutional will to commit resources to support the changes. The committee believes that with a concerted research effort, over the next 10 years high-throughput test batteries could be developed that would substantially improve the ability to identify toxicity hazards caused by a number of mechanisms of action. Those results in themselves would be a considerable advance. The time for full realization of the new test strategy, with its mix of in vitro and in vivo test batteries that can rapidly and inexpensively assess large numbers of substances with adequate coverage of possible endpoints, could be 20 or more years.

This section starts by discussing basic research that will provide the foundation for assay development. It then outlines a research strategy and milestones. It concludes by discussing the scientific infrastructure that will support the basic and applied research required to develop the high-throughput and targeted testing strategy envisioned by the committee.

Scope of Scientific Knowledge, Methods, and Assay Development

This section outlines the scientific inquiry required to develop the efficient and effective testing strategy envisioned by the committee. Several basic-research questions need to be addressed to develop the knowledge base from which toxicity-pathway assays and supporting testing technologies can be designed. The discussion here is intended to provide a broad overview, not a detailed research agenda. The committee recognizes the challenges and effort involved in addressing some of these research questions.

Knowledge Development

Knowledge critical for the development of high-throughput assays is emerging from biologic, medical, and pharmaceutical research. Further complementary, focused research will be needed to address fully the key questions that, when answered, will support toxicity-pathway assay development. Those questions are outlined in Table 4 and elaborated next.

• Toxicity-pathway identification. The key pathways that, when sufficiently perturbed, will result in toxicity will be identified primarily from future, current, and completed studies in the basic biology of cell-signaling motifs. Identification will involve the discovery of the protein components of toxicity pathways and how the pathways are altered by environmental agents. Many pathways are under investigation with respect to the basic biology of cellular processes. For example, the National Institutes of Health (NIH) has a major program under way to develop high-throughput screening (HTS) assays based on important biologic responses in in vitro systems. HTS has the potential to identify chemical probes of genes, pathways, and cell functions that may ultimately lead to characterization of the relationship between chemical structure and biologic activity (Inglese et al., 2006). Determining the number and nature of toxicity pathways involved in human disease and impairment is an essential component of the committee’s vision for toxicity testing.

TABLE 4. Key Research Questions in Developing Knowledge to Support Pathway Testing

Toxicity-pathway identification—What are the key pathways whose perturbations result in toxicity?
Multiple pathways—What alteration in response can be expected from simultaneous perturbations of multiple toxicity pathways?
Adversity—What adverse effects are linked to specific toxicity-pathway perturbations? What patterns and magnitudes of perturbations are predictive of adverse health outcomes?
Life stages—How can the perturbations of toxicity pathways associated with developmental timing or aging be best captured to enable the advancement of high-throughput assays?
Effects of exposure duration—How are biologic responses affected by exposures of different duration?
Low-dose response—What is the effect on a toxicity pathway of adding small amounts of toxicants in light of preexisting endogenous and exogenous human exposures?
Human variability—How do people differ in their expression of toxicity-pathway constituents and in their predisposition to disease and impairment?


• Multiple pathways. Adverse biologic change can occur from simultaneous perturbations of multiple toxicity pathways. Environmental agents typically affect more than one toxicity pathway. Although the committee envisions the design of a suite of toxicity tests that will provide broad coverage of biologic perturbations in all key toxicity pathways, biologic perturbations in different pathways may lead to synergistic interactions with important implications for human health. For some adverse health effects, an understanding of the interplay of multiple pathways involved may be important. For others, the research need will be to identify the pathway affected at the lowest dose of the environmental agent.

• Adversity. An understanding of possible diseases or functional losses that may result from specific toxicity-pathway perturbations will support the use of pathway perturbations for decision making. Current risk assessments rely on toxicity tests that demonstrate apical adverse health effects, such as disease or functional deficits, that are at various distances downstream of the toxicity-pathway perturbations. In the committee’s vision, the assessment of potential human health impact will be based on perturbations in toxicity pathways. For example, activation of estrogenic action to abnormal levels during pregnancy is associated with undescended testes and, in later life, testicular cancer. Research will be needed to understand the patterns and magnitudes of the perturbations that will lead to adverse effects. As part of the research, biomarkers of effect that can be monitored in humans and studied in whole animals will be useful.

• Life stages. An understanding of how pathways associated with developmental timing or aging can be adversely perturbed and lead to toxicity will be needed to develop high-throughput assays that can capture and adequately cover developmental and senescing life stages. Many biologic functions require coordination and integration of a wide array of cellular signals that interact through broad networks that contribute to biologic function at different life stages. That complexity of pathway interaction holds for reproductive and developmental functions, which are governed by parallel and sequential signaling networks during critical periods of biologic development. Because of the complexity of such pathways, the challenge will be to identify all important pathways that affect such functions to ensure adequate protection against risks to the fetus and infant. That research will involve elucidating temporal changes in key toxicity pathways that might occur during development and the time-dependent effects of exposure on these pathways.

• Effects of exposure duration. The dose of and response to exposure to a toxicant in the whole organism depend on the duration of exposure. Thus, conventional toxicity testing places considerable emphasis on characterizing risks associated with exposures of different duration, from a few days to the test animal’s lifetime. The ultimate goal in the new paradigm is to evaluate conditions under which human cells are likely to respond and to ensure that these conditions do not occur in exposures of human populations. Research will be needed to understand how the dose-response relationships for perturbations might change with the duration of exposure and to understand pathway activation under acute, subchronic, and chronic exposure conditions. The research will involve investigating the differential responses of cells of various ages and backgrounds to a toxic compound and possible differences in responses of cells between people of different ages.

• Low-dose response. The assessment of the potential for an adverse health effect from a small environmental exposure involves an understanding of how the small exposure adds to preexisting exposures that affect the same toxicity pathways and disease processes. For the more common human diseases and impairments, a myriad of exposures from food, pharmaceuticals, the environment, and endogenous processes have the potential to perturb underlying toxicity pathways. Understanding how a specific environmental exposure contributes, with the other exposures, to modulate a toxicity pathway is critical for the understanding of low-dose response. Because the toxicity tests used in the committee’s long-range vision are based largely on cellular assays involving sensitive biomarkers of alterations in biologic function, it will be possible to study the potential for adverse human health effects at doses lower than is possible with conventional whole-animal tests. Given the cost-effectiveness of the computational methods and in vitro tests that form the core of the toxicity testing, it will be efficient to evaluate effects at multiple doses and so build a basis of detailed dose-response research. (A toy numerical illustration of this additivity-to-background behavior appears just after this list.)

• Human variability. People differ in their expression of toxicity-pathway constituents and consequently in their predisposition to disease and impairment. An understanding of differences among people in the level of responsiveness of particular toxicity pathways is needed to interpret the importance of small environmental exposures. The comprehensive mapping of toxicity pathways provides an unprecedented opportunity to identify gene loci and other determinants of human sensitivity to environmental exposures. That research will support the development of biomarkers of exposure, effect, and susceptibility for surveillance in the human population, and these discoveries in turn will support an assessment of host susceptibility for use in extrapolating results from the in vitro assays to the general population and susceptible groups. The enhanced ability to characterize interindividual differences in sensitivity to environmental exposures will provide a firmer scientific basis for the establishment of human exposure guidelines that can protect susceptible subpopulations.
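
The low-dose-response question above can be made concrete with a toy calculation. In the hedged Python sketch below (all numbers invented for illustration), a pathway already carries substantial background activation from endogenous and other exposures; because the background places the system on the rising part of a sigmoidal response curve, a small added dose produces an approximately proportional increment in response, which is the quantitative core of the additivity-to-background argument.

```python
def pathway_response(dose, ac50=10.0, n=2.0):
    """Sigmoidal (Hill) pathway activation; parameters are hypothetical."""
    return dose**n / (ac50**n + dose**n)

background = 5.0                 # preexisting effective dose (arbitrary units)
for added in (0.01, 0.1, 1.0):   # small environmental increments on top of background
    delta = pathway_response(background + added) - pathway_response(background)
    print(f"added dose {added:5.2f} -> incremental response {delta:.5f}")
# The increments scale nearly linearly with the added dose even though the
# curve from zero dose is sigmoidal: the background exposure has already
# moved the pathway onto the rising portion of the curve.
```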

Research on most, or all, of the subjects just described is going on in the United States and internationally. It is taking place in academe, industry, and government institutions and is funded by foundations and the federal government, mainly to understand the basis of human disease and treatment. Private firms, such as pharmaceutical and biotechnology companies, conduct the research for product development. However, efforts directed specifically toward developing toxicity-testing systems are small.

Test and Analytic Methods Development

The research just described will provide the foundation for the development of toxicity tests and comprehensive testing approaches. The categories of toxicity tests and methods to be developed are outlined next, and the primary questions to be answered in their development are presented in Table 5.

TABLE 5. Main Questions in Developing Tests and Methods

Methods to predict metabolism—How can adequate testing for metabolites in the high-throughput assays be ensured?
Chemical-characterization tools—What computational tools can best predict chemical properties, metabolites, xenobiotic–cellular and molecular interactions, and biologic activity?
Assays to uncover cell circuitry—What methods will best facilitate the discovery of the circuitry associated with toxicity pathways?
Assays for large-scale application—Which assays best capture the elucidated pathways and best reflect in vivo conditions? What designs will ensure adequate testing of volatile compounds?
Suite of assays—What mix of pathway-based high- and medium-throughput assays and targeted tests will provide adequate coverage? What targeted tests should be developed to complement the toxicity-pathway assays? What are the appropriate positive and negative controls that should be used to validate the assay suite?
Human-surveillance strategy—What surveillance is needed to interpret the results of pathway tests in light of variable human susceptibility and background exposures?
Mathematical models for data interpretation and extrapolation—What procedures should be used to evaluate whether humans are at risk from environmental exposures?
Test-strategy uncertainty—How can the overall uncertainty in the testing strategy be best evaluated?

• Methods to predict metabolism. A key issue to address at an early phase will be development of methods to ensure adequate testing for metabolites in high-throughput assays. Understanding the range of metabolic products and the variation in metabolism among humans and being able to simulate human metabolism as needed in test systems are critical for developing valid toxicity-pathway assays. Without such methods, targeted in vivo assays will be needed to evaluate metabolism.

• Chemical-characterization tools. In addition to metabolism, further development of tools to support chemical characterization will be important. The tools will include computational and structure–activity relationship (SAR) methods to predict chemical properties, potential initial interactions of a chemical and its metabolites with cellular molecules, and biologic activity. A National Research Council report (NRC, 2000) indicated that early cellular interactions are important in understanding potential toxicity and include receptor–ligand interactions, covalent binding with DNA and other endogenous molecules, peroxidation of lipids and proteins, interference with sulfhydryl groups, DNA methylation, and inhibition of protein function. Good predictive methods for chemical characterization will reduce the need for targeted testing and enhance the efficiency of the testing.

• Assays to uncover cell circuitry. Development of methods to facilitate the discovery of the circuitry associated with toxicity pathways will involve functional genomic techniques for integrating and interpreting various data types and for translating dose-response relationships from simple to complex biologic systems, for example, from the pathway to the tissue level. It will most likely require improved methods in bioinformatics, systems biology, and computational toxicology. Some advances in overexpression with complementary DNA (cDNA) and gene knockdown with small inhibitory RNAs are likely to allow improved pathway mapping and will also lead to studies with cells or cell lines that are more readily transfectable.

• Assays for large-scale application. Several substantive issues will need to be considered in developing assays for routine application in a testing strategy. First, as pathways are identified, medium- and high-throughput assays that adequately evaluate pathways and human biology will be developed, including new, preferably human, cell-based cultures for assessment of perturbations. Second, the assay designs that best capture the elucidated pathways and can be applied for rapid large-scale testing of chemicals will need to be identified. Third, an important design criterion for assays will be that they are adequately reflective of the in vivo cellular environment. For any given assay, that will involve an understanding of the elements of the human cellular environment that must be simulated and of culture conditions that affect response. Fourth, the molecular evolution of cell lines during passage in culture and related interlaboratory differences that can result will have to be controlled for. Fifth, approaches for the testing of volatile compounds will require early attention in the development of high-throughput assays; this has been a challenge for in vitro test systems in general. Sixth, assay sensitivity (the probability that the assay identifies the phenomenon that it is designed to identify) and assay specificity (the probability that the assay does not identify a phenomenon as occurring when it does not) will be important considerations in assay design. Individual assays and test batteries should have the capability to predict accurately the effects that they are designed to measure without undue numbers of false positives and false negatives. And seventh, it will be important to achieve flexibility to expand or contract the suites of assays as more detailed biologic understanding of health and disease states emerges from basic research studies.

• Suite of assays. An important criterion for the development of a suite of assays for assessing the potential for a substance to cause a particular type of disease or group of toxicities will be adequate coverage of causative mechanisms, affected cell types, and susceptible individuals. Ensuring the right mix of pathway-based high-throughput assays and targeted tests will involve research. For diseases for which toxicity pathways are not fully understood, targeted in vivo or other tests may be included to ensure adequate coverage.


• Human-surveillance strategy. Human data on the fundamental biologic events involved in the activation of toxicity pathways will aid the interpretation of the results of high-throughput assays. They will provide the basis of understanding of determinants of human susceptibilities related to a toxicity pathway and of background exposures to compounds affecting the pathway. Research will be needed to assess how population-based studies can best be designed and conducted to complement high-throughput testing and provide the information necessary for data interpretation.

• Mathematical models for data interpretation and extrapolation. Procedures for evaluating the impact of human exposure concentrations will involve pharmacokinetic and other modeling methods to relate cell media concentrations to human tissue doses and biomonitoring data and to account for exposure patterns and interindividual variabilities. To facilitate interpretation of high-throughput assay results, models of toxicity pathways (see the section “Components of the Vision”) and other techniques will be needed to address differences among people in their levels of activation of particular response pathways. Although it is not a key aspect of the current vision, in the distant future research may enable the development of biologically based dose-response models of apical responses for risk prediction. (A minimal numerical sketch of the pharmacokinetic extrapolation step appears after this list.)

• Test-strategy uncertainty. Methods to evaluate the overall uncertainty in a possible testing strategy will assist the validation and evolution of the new methods. Formal methods could be developed that use systematic approaches to evaluate uncertainty in predicting from the test battery results the doses that should be without biologic effect in human populations. These uncertainty evaluations can be used in the construction and selection of testing strategies.
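
One widely discussed simplification of the pharmacokinetic step in the mathematical-models item above is steady-state reverse dosimetry. The Python sketch below is a minimal illustration under strong assumptions (one-compartment linear kinetics, complete oral absorption, invented parameter values); it is not a procedure prescribed by the committee.

```python
def oral_equivalent_dose(ac50_um, mol_weight, clearance_l_per_kg_day):
    """Daily oral intake (mg/kg-day) predicted to produce a steady-state
    plasma concentration equal to an in vitro activity concentration,
    assuming one-compartment linear kinetics and complete absorption.

    ac50_um                : in vitro activity concentration (micromolar)
    mol_weight             : molecular weight (g/mol)
    clearance_l_per_kg_day : total clearance (L per kg body weight per day)
    """
    css_mg_per_l = ac50_um * mol_weight / 1000.0  # convert uM to mg/L
    # At steady state, intake rate equals clearance times plasma concentration.
    return clearance_l_per_kg_day * css_mg_per_l

# Hypothetical chemical: AC50 = 3 uM, MW = 250 g/mol, clearance = 5 L/kg-day.
print(f"oral equivalent dose: {oral_equivalent_dose(3.0, 250.0, 5.0):.2f} mg/kg-day")
```

Comparing such an oral equivalent dose with biomonitoring-based exposure estimates is one way in vitro results could be placed in a human exposure context; real applications would add interindividual variability and far more realistic kinetics.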

Whether the testing strategy will detect and predict harmful exposures will depend on whether the major toxicity pathways are addressed by the high-throughput assays or covered by the targeted in vivo and other tests. To ensure that the test system is adequate, the committee envisions a multipronged approach that includes the following components:

• A continuing research and evaluation program to develop, improve, and assess the testing program.

• Adequate validation of the assays, including examination of false-negative and false-positive rates, by applying the assays to sufficient numbers of chemicals of known toxicity.

• A robust program of biomonitoring, human health surveillance, and molecular epidemiology to assess exposures and early indicators of toxicity, to aid in interpretation of high-throughput assay results, and to monitor exposures to ensure that toxic ones are not missed.

Aspects of those endeavors are discussed in the following subsections.

Strategy for Knowledge and Assay Development and Validation

The research strategy to develop the computational tools, suites of in vitro assays, and complementary targeted tests envisioned by the committee will likely involve contributions on multiple fronts, including the following:

• Basic biologic research to obtain the requisite knowledge of toxicity pathways and the potential human health impacts when the pathways are perturbed.

• Science and technology milestones that ensure timely achievement of assay and tool development for the new paradigm.

• Phased basic and applied research to demonstrate success in the transition to the testing emphasis on toxicity pathways.

The basic-research effort will be directed at discovering and mapping toxicity pathways that are the early targets of perturbation by environmental agents and at understanding how agents cause the perturbations. That will be followed by research focused on the design of assays that can be used to determine, first, whether an agent has the potential to perturb the pathway and, if so, the levels and durations of exposure required. The scientific inquiry will involve research at multiple levels of biologic organization, that is, understanding the nature of toxicity pathways at the molecular and cellular levels and how toxicity-pathway alterations may translate to disease processes in tissues, organs, and the whole organism. Some of the tools and technologies that enable this research are described in the previous section “Tools and Technologies.”

In each broad field of toxicity testing, such as neurotoxicology and reproductive and developmental toxicity, systematic approaches to assay development, assay validation, and generalized acceptance of the assays will be organized and pursued. As the research questions presented in the previous section are answered, milestones would be achieved in an orderly manner. Some important milestones to move from pathway research through assay development to validated test strategies are presented in broad brushstrokes in Table 6. The committee recognizes that the implementation of its recommendations would entail extensive planning and expert deliberation; through those processes, the important milestones would be subdivided, elaborated, reshaped, and perhaps even replaced.

The research would progress in sequential phases, whose timelines would overlap. The committee finds that four phases would evolve as follows:

Phase I: Toxicity-pathway elucidation

A focused research effort is pursued first to understand the toxicity pathways for a select group of health effects (that is, apical endpoints) or molecular mechanisms. Early in this first phase, a data-storage, -access, and -management system would be established to enable broad use of the data being generated to facilitate the understanding of the toxicity pathways and research and knowledge development in later phases. A third element of this phase would involve developing standard practices for research methods and reporting of results so that they are understandable and accessible to a broad audience of researchers and to facilitate consistency and validity in the research methods used. Research in this phase would also focus on developing tools for predicting metabolism, characterizing chemicals, and planning a strategy for human surveillance and biomonitoring of exposure, susceptibility, and effect markers associated with the toxicity-pathway perturbations.

Phase II: Assay development and validation

High- and medium-throughput assays would be developed for toxicity pathways and points for chemical perturbation in the pathways organized for assay development. During this phase, attempts would be pursued to develop biologic markers of exposure, susceptibility, and effect for use in surveillance and biomonitoring of human populations where these toxicity pathways might be activated.

Phase III: Assay relevance and validity trial

The third phase would explore assay use, usually in parallel with traditional apical tests. It would screen chemicals that would not otherwise be tested and would begin the biomonitoring and surveillance of human populations.

Phase IV: Assembly and validation of test batteries

Suites of assays would be proposed and validated for use in place of identified apical tests.

TABLE 6. Some Science and Technology Milestones in Developing Toxicity-Pathway Tests as the Cornerstone of Future Toxicity-Testing Strategies

Develop rapid methods and systems to enable in vitro dosing with chemical stressors (including important metabolites and volatile compounds).
Create and adapt human, human-gene-transfected rodent, and other cell lines and systems, with culture medium conditions, to have an adequate array of in vitro human cell and tissue surrogates.
Adapt and develop technologies to enable the full elucidation of critical toxicity pathways causing the diseases by the mechanisms selected for pilot-project study.
Develop toxicity-pathway assays that fully explore the possible effects of exogenous chemical exposure on the diseases and mechanisms selected for a pilot-project study, thereby demonstrating proof of concept.
Establish efficient approaches for validating suites of high-throughput assays.
Develop the infrastructure for data management, assay standardization, and reporting to enable broad data sharing across academic, government, industry, and nongovernmental-organization sectors and institutions.

Some of the key science and technology development activities for the phases are listed in Figure 11, and some of the critical aspects are described next. All phases would include research on toxicity pathways. Progression through the phases would involve exploring the research questions outlined in Table 4.

FIGURE 11. Progression of some important science and technology activities during assay development.

Phase I: Toxicity-Pathway Elucidation

Research to understand toxicity pathways

Phase I research would develop pathway knowledge from which assays for health effects would emerge. Systems-biology approaches—including molecular profiling microarrays, pathway mining, and other high-resolution techniques—would reveal key molecular interactions. Mechanistic understanding provides the basis for identifying the key molecular “triggers” or mechanisms of interactions that can alter biologic processes and ultimately cause toxicity after an environmental exposure. Those nodal triggers or interactions would be modeled in vitro and computationally to provide a suite of appropriate assays for detecting toxicity-pathway perturbations and the requisite tools for describing dose-response relationships.

Early efforts would explore possible toxicity pathways for health effects where there is fairly advanced knowledge of mechanisms of toxicity, molecular signaling, and interactions. As a case study, the following sketches out how knowledge development might begin for toxic responses that are associated with estrogenic signaling alterations caused by agonists and antagonists of estrogen function.

Even our current appreciation of the number of potential toxicity pathways highlights the breadth of responses that might be evaluated in various high-throughput assays. Consideration of adverse responses at the level of the intact organism that might be associated with altered signaling through estrogen-receptor-mediated responses illustrates some of the challenges. Xenobiotic-caused alteration in estrogen signaling can occur or be measured at a number of points in the various processes that affect estrogen actions, including steroidogenesis, hormone transport and elimination, receptor binding and alteration in numbers of receptors, and changes in nuclear translocation. Those pathways may also be evaluated at different levels of organization—ligand binding, receptor translocation, transcriptional activation, and integrated cellular responses. Some of the processes are outlined here.

• Estrogen steroidogenesis. Upstream alterations in steroidogenesis pathways or other independently regulated pathways that affect endocrine signaling would be explored. Knowledge development would focus on understanding of enzymatic function for key steroidogenesis pathways and the interactions of the pathways with each other and on understanding of how key elements of the pathways might be altered, including alterations of precursors, products, and metabolites when pathway dysregulation occurs. The research might involve quantitative assessment of key enzyme functions in in vitro and in vivo systems, analytic techniques to measure various metabolites, and modeling to understand the target and key steps that undergo estrogen-related dysregulation. Other assays would develop SAR information on compounds already associated with altered steroidogenesis in other situations.

• Estrogen–receptor interactions. Much is known about the molecular interactions between xenobiotics and estrogen receptors (ERs), for example, direct xenobiotic interaction with ERs, including differential interaction with specific ER subtypes, such as ER-α and ER-β; xenobiotic interactions with discrete receptor domains that give rise to different biologic consequences, such as interactions with the ligand-binding domain that could cause conformational changes that activate or inhibit signaling; and direct xenobiotic interactions with other components of the ER complex, including accessory proteins, coactivators, and other coregulatory elements. Most responses associated with altered estrogen signaling would be more easily evaluated in assays that evaluated a larger scale function, such as receptor activation of estrogen-mediated transcription of reporter genes or estrogen-mediated cell responses (for example, cell proliferation of estrogen-sensitive cells in vitro).

• Processes that lead to estrogenic transgenerational epigenetic effects. Assay development to address estrogen-induced transgenerational epigenetic effects would involve understanding how early-life exposures to estrogenic compounds permanently alter transcriptional control of genes, understanding how such early-life exposures might be priming events for later-life alterations in reproductive competence or the development of cancer, and understanding how such exposures may produce transgenerational effects. Specific approaches in this research might include genomewide methods to analyze the patterns of DNA methylation with and without estrogenic exposure, quantification of histone modifications, measurements of microRNAs, and the dissection and mechanistic understanding of hormonal inputs to the epigenetic regulatory phenomena.

Those are just a few examples of the kinds of research on estrogenic compounds that would support assay development. The approaches include relatively small-scale research efforts for processes that are fairly well understood (such as direct ligand–receptor interactions) and larger endeavors for the yet-to-be-explained processes (such as the epigenetic and transgenerational effects of early-life estrogenic-compound exposure). A holistic understanding of estrogenic and other pathways and signaling in humans would be derived incrementally by building on studies in a wide variety of species and tissues. New information from basic studies in biology is likely to lead to improved assays for testing specific toxicity pathways.

The identified estrogenic pathways and signaling processes, once understood, would serve as the substrate for further pathway mining to highlight the critical events that could be tested experimentally in assay systems, that is, events that are obligatory for downstream, apical responses and occur at the lowest exposure of a biologic system to an environmental agent. With studies on the organization of response circuitry controlling the toxicity-pathway responses, a dose-response model would be based on key, nodal points in the circuits that control perturbations, rather than on the overall detail of all steps in the signaling process.

Assessing validity of pathway knowledge and linkage to adversity at the organism level

The next step in pathway elucidation would be the assessment of the validity of the pathway knowledge, which would proceed in two steps and involve the broader scientific community.

First, the validity would be tested by artificially modulating the pathways to establish that predicted downstream molecular consequences are consistent and measurable. The perturbations could take place, for example, with the use of standard reference compounds, such as 17β-estradiol, or discrete molecular probes, such as genetically modified test systems, knockout models, or other interventions with siRNA or small-molecule inhibitors of key enzymes or other cellular factors.

Second, the consequences of pathway disruption for the organism—the linkage of molecular events to downstream established biologic effects considered to be adverse or human disease—would be assessed. For the case of perturbations of estrogen signaling, this assessment may include linkage with results from short-term in vivo assays, such as an increase in uterine weight in rats in the uterotrophic assay. The link between the toxicity pathways and adverse effects at the level of the whole organism would be assessed in a variety of in vivo and in vitro experiments.

Development of data-storage, data-access, and data-management systems

Very early in Phase I, data-storage, -access, and -management systems should be developed and standardized. As the altered-estrogen-signaling case study indicates, the acquisition of the knowledge to develop high-throughput testing assays would involve the discovery of toxicity pathways and networks from vast amounts of data from studies of biologic circuitry and interactions of environmental agents with the circuitry. Organization of that knowledge would require data analysis and exploration by interdisciplinary teams of scientists. Understanding the relationships of pathways to adverse endpoints would also involve large-volume data analysis, as would the design of test batteries and their validation. Those efforts could be stymied without easy and wide public access to databases of results from a broad array of research studies: high-throughput assays, quantitative-SAR model development, protein and DNA microarrays, pharmacokinetic and metabolomic experiments, in vivo apical tests, and human biomonitoring, clinical, and population-based studies. Central repositories for “-omics” data are under development and exist to a small extent for some in vivo toxicity data. The scale of data storage and access envisioned by the committee is much larger.

The data should be available, regardless of whether they were generated by industry, academe, federal institutions, or foundations. However, the data-management system must also be able to accommodate confidential data but allow for data sharing of confidential components of the database among parties that agree to the terms of confidentiality. The data-management system would also provide procedures and guidelines for adequate quality control. Central storage efforts would need to be coordinated and standardized as appropriate to ensure usefulness.
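
As a minimal illustration of the kind of standardized, shareable record such a system might hold (the field names here are invented for illustration and are not a proposed standard), a single assay observation could carry its provenance, quality-control status, and confidentiality flag:

```python
from dataclasses import dataclass

@dataclass
class AssayRecord:
    """One toxicity-pathway assay observation with the metadata needed for
    quality control, cross-laboratory comparison, and controlled sharing.
    All field names are illustrative, not a proposed standard."""
    chemical_id: str         # e.g., a CAS registry number
    assay_id: str            # standardized assay/platform identifier
    pathway: str             # toxicity pathway targeted by the assay
    concentration_um: float  # test concentration (micromolar)
    response: float          # normalized assay response
    protocol_version: str    # supports standardization across laboratories
    laboratory: str          # source laboratory, for reproducibility checks
    qc_passed: bool          # quality-control status
    confidential: bool       # gates access under agreed data-sharing terms

# Hypothetical record for an estrogen-receptor transactivation assay.
record = AssayRecord("50-28-2", "ER-trans-01", "estrogen-receptor signaling",
                     0.1, 0.82, "v1.3", "lab-A", True, False)
```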

Standardization of research assays and results

With the development of data-management systems, processes for standardizing platforms would have to be developed. Currently, there is little standardization of microarrays, although such efforts are moving more quickly with the Minimum Information About a Microarray Experiment formats now in use (Brazma et al., 2001). Too much standardization can stifle innovation, so approaches to identifying and using the appropriate level of standardization would be needed. Bioinformatics should proceed jointly with the development of assay-platform technology. Data-management systems would have to evolve flexibly to accommodate new data forms and assay platforms.

Phase II: Assay Development and Validation

After the Phase I validity assessment, pathways would be selected for assay development. The focus would be on critical toxicity pathways that lead reliably to adverse effects for the organism and that are not secondary consequences of other biologic perturbations. The first subsection of this section outlined some of the technical issues that would require research to support assay development.

The case-study example of altered estrogen signaling already given indicates how assays may follow from toxicity-pathway identification. Understanding the direct gene-regulation consequences of modulated ER-mediated transcriptional activation would lead to specific assays for quantitative assessment of transcription (RNA), translation (protein), metabolite markers, and altered function. Rapid assays to evaluate function on the scale of receptor activation of estrogen-mediated transcription of reporter genes or even estrogen-mediated cell responses, such as cell proliferation of estrogen-sensitive cells in vitro, could be developed to assess altered estrogen signaling.

Also important for assessing the potential for perturbations in estrogen signaling would be reliable assays for detecting estrogen-receptor interactions rapidly. Specific assays that might be developed include ligand–receptor binding assays and more sophisticated computational structural models of ligand interactions with receptor and receptor-complex conformational changes. Further sets of assays would be needed to address the wide variety of toxicity pathways by which estrogenic compounds can operate. In this phase, biomarkers of effect, susceptibility, and exposure would be developed for use in human biomonitoring and surveillance.

Demonstrating that a test is reliable and relevant for a particular purpose is a prerequisite for its routine use for regulatory acceptance. But establishing the validity of any new toxicity assay can be a formidable process—expensive, time-consuming, and logistically and technically challenging. Development of efficient approaches for validating the new mechanistically based assays would add to the challenge. How can the assays come into use within a reasonable time and be sufficiently validated to be used with confidence? That question is discussed by considering first the relevant existing guidance on validation and then the challenges faced in validating the new tests. Finally, some general suggestions are made regarding validation of the new tests. In making its suggestions, the committee acknowledges the considerable work going on in institutions in the United States and Europe to improve validation methods.

Existing validation guidance

Guidelines on the validation of new and revised methods for regulatory acceptance have been developed by both regulatory agencies and consortia (ICCVAM/NICEATM, 2003; OECD, 2005). Such guidelines focus on multifactorial aspects of a test, which cover the following elements:

• Definition of test rationale, test components, and assay conduct and the provision of details on the test protocol.

• Consideration of the relationship of the test-method endpoints to the biologic effect of interest.

• Characterization of reproducibility in and among laboratories, transferability among laboratories, sources of variability, test limits, and other factors related to the reliability of test measurements (sometimes referred to as internal validity).

• Demonstrated biologic performance of the test with reference chemicals, comparison of the performance with that of the tests it is to replace, and description of test limitations (sometimes referred to as external validity).

• Availability, peer review, and good-laboratory-practices status of the data supporting the validation of the test method.

• Independent peer review of the methods and results of the test and publication in the peer-reviewed literature.

Criteria for regulatory acceptance of new test methods have also been published (ICCVAM/NICEATM, 2003). They cover some of the subjects noted already and include criteria related to robustness (insensitivity to minor changes in protocol), time and cost effectiveness, capability of being harmonized and accepted by agencies and international groups, and capability of generating useful information for risk assessment.

Validation of a new test method typically is a prerequisite for regulatory acceptance but is no guarantee of acceptance. It establishes the performance characteristics of a test method for a particular purpose. Different regulatory agencies may decide that they have no need for a test intended for a given purpose, or they may set their criteria of acceptable performance higher or lower than other agencies. To minimize problems associated with acceptance, the Organization for Economic Cooperation and Development (OECD, 2005) recommends that validation and peer-review processes take place before a test is considered for acceptance as an OECD test guideline. OECD recognizes, however, that factors beyond the technical performance of an assay may be viewed differently by different regulatory authorities.

Challenges in validating mechanistically based assays

Validation of the mechanistically based tests envisioned by the committee may be especially challenging for several reasons. First, the tests in the new paradigm that are based on nonapical findings depart from current practice used by regulatory agencies in setting health advisories and guidelines based on apical outcomes. Relevant policy and legal issues are discussed at length in the next section, “Prerequisites for Implementing the Vision in Regulatory Contexts,” and are not covered here except to note that scientific acceptance of a test and its relationship to disease is a critical component of establishment of the validity of the test for regulatory purposes.

Second, the new “-omics” and related technologies will need to be standardized and refined before specific applications can be validated for regulatory purposes (Corvi et al., 2006). Such preliminary work could be seen as an elaborate extension of the routine step of test-method optimization or prevalidation leading to validation of conventional in vivo or in vitro assays. The committee also notes earlier in this report that some degree of standardization will be necessary early to promote understanding and use of assay findings by researchers for knowledge development.

Third, because “-omics” and related technologies are evolving rapidly, the decision to halt optimization of a particular application and begin a formal validation study will be somewhat subjective. Validation and regulatory acceptance of a specific test do not preclude incorporating later technologic advances that would enhance its performance. If it is warranted, the effects of such modifications on performance can be evaluated through an expedited validation that avoids the burdens of a second full-blown validation.

Fourth, the committee envisions that a suite of new tests typically will be needed to replace an individual in vivo test, given that apical findings can be triggered by multiple mechanisms. Consequently, although it is current practice to validate a single test against the corresponding conventional test and then to look for one-to-one correspondence, the new paradigm would routinely entail validation of test batteries and would use multivariate comparisons.

Fifth, existing validation guidelines focus on concordance between the results of the new and the existing assays. In practice, that often means comparing results from cell-based in vitro assays with in vivo data from animals. One of the challenges of validating the medium- and high-throughput assays in the new vision—with its emphasis on human-derived cells, cell lines, and cellular components—will be to identify standards of comparison for assessing their relevance and predictiveness while aiming for a transformative paradigm shift that emphasizes human biology, mechanisms of toxicity, and initial, critical perturbations of toxicity pathways.

Sixth, it is anticipated that virtually all xenobiotics will perturb signaling pathways to some degree, so a key challenge will be to determine when a perturbation leads to downstream toxicity and when it does not. Thus, specificity may be a bigger challenge than sensitivity.

Assay validation under new toxicity-testing paradigm

Validation should not be viewed as an inflexible process that proceeds sequentially through a fixed series of steps and is then judged according to unvarying criteria. For example, because validation assesses fitness for purpose, such exercises should be judged with the specific intended purpose in mind. A test’s intended purpose may vary from use as a preliminary screening tool to use as the definitive test. Similarly, a new test may be intended to model one or a few toxicity mechanisms for a given apical endpoint but not the full array of mechanisms. Given that the new paradigm would emerge gradually, it would be important to consider validating incremental gains, while recognizing their current strengths and weaknesses.

Consequently, applying a one-size-fits-all approach to validation is not conducive to the rapid incorporation of emerging science or technology into regulatory decision making. A more flexible approach to assay validation would facilitate the evolution of testing toward a more mechanistic understanding of toxicity endpoints; the form the validation should take is a point of discussion and deliberation (Balls et al., 2006; Corvi et al., 2006). For nonregulatory use of assays, such as preliminary data gathering and exploration of mechanisms, at a minimum some general guidance on assay performance appears warranted. For assays to be used routinely, more rigorous performance standards would have to be met and relevance demonstrated.

Returning to the case study on estrogen signaling, the validation sequence involves the development of specific assays that track the key molecular triggers linked to human estrogenic effects. This validation component is largely focused first on validating that the assay components recapitulate the key molecular interactions already described here and then on the traditional approach of looking at assay performance in terms of reproducibility and relevance.

Assessing intralaboratory and interlaboratory reproducibility is more straightforward than assessing relevance, which is sometimes labeled accuracy. To assess relevance, assays would be formally linked to organism-level adverse health effects. For example, they would provide the basis of evaluating the level of molecular change that potentially corresponds to an adverse effect. In addition, reference compounds would be used to determine the assays’ positive and negative predictive value. Ideally, substances known to cause and substances known not to cause the effect in humans would be used as the reference agents for positive and negative predictivity. In the absence of adequate numbers of xenobiotics known to be positive and negative in humans, animal data may have to be used in validation. For the assays based on human cell lines, that could be problematic, and some creativity and flexibility in the validation process would be desirable. For example, rodent-based cell assays comparable with the human assay could be used to establish relevance and to support the use of the human cell-based assay.
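
Computing predictive values from reference compounds is itself straightforward once the calls are in hand; the difficulty lies in assembling trustworthy reference sets. The Python sketch below (with invented chemicals and calls) shows the bookkeeping for sensitivity, specificity, and positive and negative predictive value on a reference set.

```python
def predictive_performance(calls, truth):
    """calls, truth: dicts mapping chemical -> bool (assay call, known activity).
    Returns sensitivity, specificity, PPV, and NPV over the reference set."""
    tp = sum(calls[c] and truth[c] for c in truth)
    fp = sum(calls[c] and not truth[c] for c in truth)
    fn = sum(not calls[c] and truth[c] for c in truth)
    tn = sum(not calls[c] and not truth[c] for c in truth)
    return {"sensitivity": tp / (tp + fn), "specificity": tn / (tn + fp),
            "ppv": tp / (tp + fp), "npv": tn / (tn + fn)}

# Hypothetical reference set: two known positives, two known negatives.
truth = {"chemA": True, "chemB": True, "chemC": False, "chemD": False}
calls = {"chemA": True, "chemB": True, "chemC": False, "chemD": True}
print(predictive_performance(calls, truth))
# -> sensitivity 1.0, specificity 0.5, PPV ~0.67, NPV 1.0 on this toy set
```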

Phase III: Assay Relevance and Validity Trial

Once assays are developed and formally validated, they would become available for use. The committee suggests three distinct strategies that could aid in the assessment of test validity and relevance and could further the development of improved assays.

First, research entities, such as the National Toxicology Program (NTP), should further develop and run the experimental high-throughput assays, some before they are fully validated, on chemicals that have already been extensively tested with standard or other toxicity tests. The NTP has, for example, initiated mechanistic high-throughput assays on at least 500 chemicals that have already been tested using NTP cancer and reproductive and developmental toxicity studies, and, in collaboration with the NIH Molecular Library Initiative, further developed and applied cell-based screening assays that can be automated (NTP, 2006). The U.S. Environmental Protection Agency (EPA) National Center for Computational Toxicology (NCCT) also has an initiative to screen numerous pesticides and some industrial chemicals in high-throughput tests. Those processes would be essential for validating the new assays and for learning more about which health effects can be predicted from specific perturbations of toxicity pathways.

Second, new validated assays should be conducted in parallel with existing toxicity tests for chemicals, such as pesticides and pharmaceuticals, that will be undergoing or have recently undergone toxicity testing under regulatory programs. This research testing, which would be conducted by research entities, would help to foster the evolution of the assays into cell-based test batteries to eventually replace current tests. The testing would also help to gauge the positive and negative predictive values of the various assays and thereby help to avoid (or at least begin to quantify) the risks associated with missing important toxicities with the new assays or incorporating a new assay that detects meaningless physiologic alterations that are not useful for predicting human risk.

Third, as the new assays are developed further and validated, they should be deployed as screens for evaluation of chemicals that would not currently undergo toxicity testing, such as existing high-production-volume chemicals that have not been tested or have been evaluated only with the screening information data set, or new chemicals that are not currently subject to test requirements. Used as screens for chemicals that would otherwise not be tested or be subject only to little testing, the assays could begin to help to set priorities for testing and could also help to guide the focus of any testing that may be required. Eventually they could provide the basis of an improved framework for addressing chemicals for which testing is limited or not done at all. This is illustrated in Figure 12.

Resources will be required to implement the three approaches: testing of chemicals with large and robust data sets of apical tests, parallel research testing of chemicals subject to existing regulatory testing requirements, and applying high-throughput screens to chemicals that are currently not tested. In making those suggestions, the committee is not recommending expanding test requirements for pesticides or pharmaceuticals. Rather, it notes that the tests developed will be a national resource of wide benefit and worthy of funding by federal research programs. Voluntary testing by industry using validated new assays should also be encouraged. The three approaches are anticipated to pay off substantially in the longer term as scientists, regulators, and stakeholders develop enough familiarity and comfort with the new assays that they begin to replace current apical endpoint tests and as mechanistic indicators are increasingly used in environmental decision making.

In addition to the high-throughput testing by NTP and U.S. EPA of chemicals with robust data sets already described here, the committee notes the increasing use of mechanistic assays, primarily for further evaluation of chemicals that have demonstrated toxicity in standard apical assays. The mechanistic studies are done to evaluate further a tailored subset of toxicity pathways, such as those involving the peroxisome proliferator-activated receptor, the aryl hydrocarbon receptor, and thyroid and sex hormones. Some companies are also using high-throughput assays to guide internal decision making in new chemical development, but their results typically are not publicly available.

A recent example of how the high-throughput assays could play out in the near term is the risk assessment of perchlorate. The data on perchlorate include standard subchronic- and chronic-toxicity tests and developmental-neurotoxicity tests, but risk assessments and regulatory decisions have been based on inhibition of iodide uptake—the known toxicity pathway through which perchlorate has its effects (U.S. EPA, 2006b; NRC, 2006c). If a new chemical were found to inhibit iodide uptake, standard toxicity tests would not be necessary to demonstrate the predictable effects on thyroid hormone and neurodevelopment. Regulatory decisions could be based on the dose-response relationship for iodide-uptake inhibition. The new data on perchlorate-susceptible subpopulations (for example, those with low iodide) emerging from biomonitoring would also be considered (see Blount et al., 2006). Such a chemical would need to undergo a full battery of toxicity-pathway testing to ascertain that no other important pathways that might have effects at lower doses were disrupted.
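
As a purely hypothetical sketch of how such a pathway-based decision input might be computed (the data points and the 5% benchmark below are invented and are not values from the perchlorate assessments), one could fit a dose-response curve to iodide-uptake-inhibition measurements and read off the dose producing a small, defined degree of inhibition:

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(dose, imax, d50, n):
    """Fractional iodide-uptake inhibition as a Hill function of dose."""
    return imax * dose**n / (d50**n + dose**n)

# Invented example data: dose (mg/kg-day) versus fractional inhibition.
dose = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0])
inhibition = np.array([0.02, 0.06, 0.15, 0.35, 0.60, 0.78])

(imax, d50, n), _ = curve_fit(hill, dose, inhibition, p0=[0.9, 0.3, 1.0])

# Dose producing 5% inhibition: one possible point of departure.
benchmark = 0.05
pod = d50 * (benchmark / (imax - benchmark)) ** (1.0 / n)
print(f"estimated dose at 5% inhibition: {pod:.3f} mg/kg-day")
```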

FIGURE 12. Screening of chemicals that would otherwise not be tested or be subject to only limited testing. The results of the screening tests would be used to decide the nature of further testing needed, if any.


In the long run, using upstream indicators of toxicity from high-throughput assays based on toxicity pathways can be more sensitive and hence more protective of public health than using apical-endpoint observations from assays in small numbers of live rodents. However, while the new assays are under development, there will be a long period of uncertainty during which the false-positive and false-negative rates of the testing battery will remain unclear, and the ability of the battery to adequately predict effects in susceptible subpopulations or during susceptible life stages will also be unclear. During the phase-in period and afterward, there will be a need to pay close attention to whether important toxicities are being missed or are being exaggerated by the toxicity-pathway screening battery. The concern about missing important toxic endpoints is one of the main reasons for the committee’s recommendation for a long phase-in period during which the new assays are run in parallel with existing assays and tested on chemicals on which there are already large robust data sets of apical findings. Parallel testing will allow identification of toxicities that might be missed if the new assays were used alone and will compel the development of assays to address these gaps.

Many additional issues would need to be considered during the interim phase of assay development. For example, technical issues, such as cell-culture conditions, and selective pressures that cause molecular evolution of cell lines over time and across laboratories could raise problems that could be addressed only with experience and careful review of assay results. Parallel use of new assays and current tests would probably continue for some time before the adoption of the new assays as first-tier screens or as definitive tests of toxicity.

Phase IV: Assembly and Validation of Test Batteries Once toxicity pathways are elucidated and translated into high-throughput assays for a broad field of toxicity testing, such as neurotoxicology, a progressively more comprehensive suite of validated medium- to high-throughput tests would become available to cover the field. Single assays would not be comprehensive or predictive in isolation but would be assembled into suites with targeted tests that would cover the field. The suite or "panel" of assays and the scoring of the assays would need to be assessed. This may involve a computational assessment of multivariate endpoints. Turning again to the estrogen-signaling case study, known estrogen modulators should register as positive in one or more assays. Confidence in the suite of assays can come from the knowledge that all known mechanisms of estrogenic-signaling alteration are modeled.
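As a minimal illustration of scoring such a panel, the sketch below calls a chemical positive for estrogenic-signaling alteration if any mechanism-specific assay in the suite exceeds a threshold. The assay names, scores, and cutoff are hypothetical; a real panel would likely rely on a more formal multivariate model.

```python
# Minimal sketch of scoring a multi-assay "panel" (hypothetical assays
# and thresholds): a chemical registers positive if any mechanism-specific
# assay in the suite exceeds its cutoff.
from typing import Dict

# Normalized activity scores for one chemical across a hypothetical
# estrogen-signaling panel; each assay covers a known mechanism.
panel_scores: Dict[str, float] = {
    "ER_alpha_binding": 2.8,
    "ER_beta_binding": 0.4,
    "ERE_transactivation": 3.1,
    "aromatase_inhibition": 0.2,
}
THRESHOLD = 2.0  # illustrative cutoff in z-score units

hits = [assay for assay, score in panel_scores.items() if score >= THRESHOLD]
panel_call = "positive" if hits else "negative"
print(f"panel call: {panel_call}; contributing assays: {hits}")
```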

The development and assessment of batteries and the overall testing strategy would be facilitated by a formal uncertainty evaluation. For the different risk contexts and decisions to be made (see the section "Components of the Vision"), the preferred test batteries may differ in sensitivity, in this context the probability that the battery identifies as harmful a dose that is harmful, and specificity, the probability that a test battery identifies as not harmful a dose that is not harmful. In screening, the effect of a false-negative finding of no harm at a given dose can be far more costly than a false-positive finding of harm (see, for example, Lave & Omenn, 1986). The ability to characterize the specificity and sensitivity of the test battery would aid the consideration of the cost-effectiveness and value of the information to be obtained from the test battery (Lave & Omenn, 1986; Lave et al., 1988) and ultimately help to identify preferred test strategies.
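The kind of formal evaluation described here can be sketched as an expected-cost comparison: given a battery's sensitivity and specificity, the prevalence of truly harmful doses, and asymmetric costs of false negatives and false positives, compute the expected cost per chemical screened. All numbers below are illustrative assumptions, not values from the cited studies.

```python
# Minimal sketch of the uncertainty/value-of-information idea: compare two
# hypothetical batteries by expected cost per chemical when a false
# negative is far more costly than a false positive (illustrative numbers).

def expected_cost(sensitivity, specificity, prevalence,
                  cost_fn, cost_fp, cost_test):
    """Expected cost per chemical screened with a given battery."""
    p_fn = prevalence * (1.0 - sensitivity)          # harmful but passed
    p_fp = (1.0 - prevalence) * (1.0 - specificity)  # benign but flagged
    return cost_test + p_fn * cost_fn + p_fp * cost_fp

# Battery 1: more sensitive; Battery 2: more specific (hypothetical).
c1 = expected_cost(0.95, 0.80, prevalence=0.10,
                   cost_fn=1_000_000, cost_fp=50_000, cost_test=2_000)
c2 = expected_cost(0.85, 0.95, prevalence=0.10,
                   cost_fn=1_000_000, cost_fp=50_000, cost_test=2_000)
print(f"battery 1: ${c1:,.0f}  battery 2: ${c2:,.0f} per chemical")
```

With these illustrative costs, the more sensitive battery has the lower expected cost, consistent with the point that false negatives dominate the screening calculus.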

Although considerable effort would be directed at the construction of high-throughput batteries, targeted tests would probably also be needed in routine testing strategies to address particular risk contexts (for example, registration of a pesticide for food uses). Still, the endpoint-focused targeted assays should by no means remain static. Instead, they should evolve to incorporate new refinements. For example, the rapid developments in imaging technologies have offered scientists important new tools to enhance the collection of information from animal bioassays. Promising new assays that use nonmammalian models, such as Caenorhabditis elegans, are in development.


Combined mammalian assays that incorporate a broader array of sensitive endpoints in a more efficient manner have been developed. The committee assumes that development of those approaches will continue, and it encourages their development and validation for targeted testing. As newer targeted-testing approaches become available, older apical approaches should be retired.

Intermediate Products of Assay-Development Research

One important benefit of the research described is that it could add public-health protection and refinement to current regulatory testing. For example, in some risk contexts, particularly widespread human exposure to existing chemicals, the dose-response data from toxicity-pathway tests could help to refine quantitative relationships between adverse effects identified in the apical tests and perturbations in toxicity pathways and improve the evaluation of perturbations at the low end of the dose-response curve. The results of the toxicity-pathway tests could provide data to aid in interpreting the results of apical tests on a given substance and may guide the selection of further follow-up tests or epidemiologic surveillance. The mechanistic assays would also help to permit the extrapolation of toxicity findings on a chemical under study to other chemicals that target the same mechanism. Additional benefits and research products anticipated for use in the near term include the following:

• A battery of inexpensive medium- and high-throughput screening assays that could be incorporated into tiered-testing schemes to identify the most appropriate tests or to provide preliminary results for screening risk assessments. With experience, the assays would support the phase-out of apical endpoint tests.

• Early cell-based replacements for some in vivo tests, such as those for acute toxicity.

• Work to develop consensus approaches for DNA-reactivity and mutagenicity assays and strategies for using mechanistic studies in cancer risk assessment.

• Online libraries of results of medium- and high-throughput screens for use in toxicity prediction and improving SAR models. For classes of chemicals well studied in apical endpoint tests, the comparison of results from high-throughput studies with those from whole-animal studies could provide the basis of extrapolating toxicity to untested chemicals in the class (a minimal read-across sketch follows this list).

• Elucidation of the mechanisms of toxicity of chemicals well studied in high-dose apical endpoint tests. Research to achieve the vision must include the study of perturbations of toxicity pathways of well-studied chemicals, many of which have widespread human exposure. Such research would bring about better understanding of the mechanisms of toxicity of the chemicals and improve risk assessment. Chemicals with known adverse effects and mechanisms well elucidated with respect to toxicity pathways would be good candidates to serve as positive controls in the high-throughput assays. Such studies would help to distinguish between exposures that result in impaired function and disease and exposures that result in adaptation and normal biologic function (see Figure 2).

• Indicators of toxicity-pathway activation in the human population. This knowledge could be used to understand the extent to which a single chemical might contribute to disease processes and would be critical for realistic dose-response modeling and extrapolation.

• Refined analytic tools for assessing the pharmacokinetics of environmental agents in humans exposed at low concentrations. Such evaluations could be used directly in risk assessments based on apical endpoint tests and could aid in design and interpretation of in vitro screens.

• Improvements in targeted human disease surveillance and exposure biomonitoring.
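As a minimal sketch of the read-across idea in the screening-library bullet above, an untested chemical can inherit the apical finding of its nearest neighbor in assay-profile space. The profiles, labels, and distance metric below are hypothetical simplifications of what such a library-based prediction might look like.

```python
# Minimal read-across sketch: an untested chemical inherits the apical
# finding of its nearest neighbor in high-throughput assay-profile space.
# Profiles and labels are hypothetical.
import numpy as np

# Rows: library chemicals; columns: normalized responses in a screen panel.
library_profiles = np.array([
    [0.9, 0.1, 0.8, 0.0],   # class member with apical data
    [0.8, 0.2, 0.7, 0.1],
    [0.1, 0.9, 0.0, 0.8],
])
apical_labels = ["thyroid-toxic", "thyroid-toxic", "negative"]

untested = np.array([0.85, 0.15, 0.75, 0.05])

# Euclidean distance to each library profile; the smallest wins.
distances = np.linalg.norm(library_profiles - untested, axis=1)
nearest = int(np.argmin(distances))
print(f"nearest library chemical: #{nearest}, "
      f"predicted apical outcome: {apical_labels[nearest]}")
```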

Building a Transformative Research Program

Instituting Focused Research A long-term, large-scale concerted effort is needed to bring the new toxicity-testing paradigm to fruition.


A critical element is the conduct of transformative research to provide the scientific basis for creating the new testing tools and to understand the implications of test results and how they may be applied in risk assessments used in environmental decision making.

What type of institutional structure would be most appropriate for conducting and managing the research effort? It is beyond the committee's charge and expertise to make specific recommendations either to change or to create government institutions or to alter their funding decisions. The committee will simply sketch its thoughts on an appropriate institutional structure for implementing the vision. Other approaches may also be appropriate.

The committee notes that an institutional structure should be selected with the following considerations in mind:

• The realization of the vision will entail considerable research over many years and require substantial funding—hundreds of millions of dollars.

• Much of the research will be interdisciplinary and consequently, to be most effective, should not be dispersed among discipline-specific laboratories.

• The research will need high-level coordination to tackle the challenges presented in the vision efficiently.

• The research should be informed by the needs of the regulatory agencies that would adapt and use the emerging testing procedures, but the research program should be insulated from the short-term orientation and varied mandates of the agencies.

Interdisciplinarity, Adaptability, and Timeline The need for an institutional structure that encourages and coordinates the necessarily multidisciplinary research cannot be overstated, and a spirit of interdisciplinarity should infuse the research program. Accordingly, the effort would need to draw on a variety of technologies and a number of disciplines, including basic biology, bioinformatics, biostatistics, chemistry, computational biology, developmental biology, engineering, epidemiology, genetics, pathology, structural biology, and toxicology. Good communication and problem solving across disciplines are a must, as is leadership adept at fostering interdisciplinary efforts. The effort will have to be monitored continually, with the necessary cross-interactions engineered, managed, and maintained.

The testing paradigm would be progressively elaborated over many years or decades as experience and successes accumulate. It should continue to evolve with scientific advances. Its evolution is likely to entail midcourse changes in the direction of research as breakthroughs in technology and science open more promising leads. Neither this committee nor any other constituted committee will be able to foresee the full suite of possibilities or potential limitations of new approaches that might arise with increasing biologic knowledge. The research strategy just outlined provides a preview of the future and suggests general steps needed to arrive at a new toxicity-testing paradigm. Some of the suggested steps would need to be reconsidered as time passes and experience is developed with new cell-based assays and interpretive tools, but no global change in the vision, which the committee regards as robust, is expected.

The transition from existing tests to the new tests would require active management, involvement of the regulatory agencies, and coherent long-range planning that invests in the creation of new knowledge while refining current testing and, correspondingly, stimulating changes in risk-assessment procedures and guidelines. Over time, the research expertise and infrastructure involved in testing regimes could be transformed in important ways as the need for animal testing decreases and pathway-related testing increases.

The committee envisions that the new knowledge and technology generated from the proposed research program will be translated to noticeable changes in toxicity-testing practices within 10 years. Within 20 years, testing approaches will more closely reflect the proposed vision than current approaches.


That projection assumes adequate and sustained funding. As in the Human Genome Project, progress is expected to be nonlinear, with the pace increasing as technologic and scientific breakthroughs are applied to the effort.

Cross-Institution and Sector Linkages The research to describe cellular-response networks and toxicity pathways and to develop the complementary human biomonitoring and surveillance strategy would be part of larger current efforts in medicine and biotechnology. Funding of that research is substantial in medical schools and other academic institutions, some U.S. federal and European agencies, and pharmaceutical, medical, and biotechnology industries. Links among different elements in the research community involved in relevant research will be needed to capitalize on the new knowledge, technologies, and analytic tools as they develop. Mechanisms for ensuring sustained communication and collaboration, such as data sharing, will also be needed.

Some form of participation by industry and public-interest groups should be ensured. Firms have a long-term interest in the new paradigm, and most stand to gain from more efficient testing requirements. Public-health and environmental interest groups, as well as those promoting alternatives to animal testing, should also be engaged.

Funding A large-scale, long-term research program is needed to elucidate the cellular-response networks and individual toxicity pathways within them. Given the scientific challenges and knowledge development required, moderately large funding will be needed. The committee envisions a research and test-development program similar in scale to the NTP or the Institute for Systems Biology in Seattle, WA.

The success of the project will depend on attracting the best thinkers to the task, and the endeavor would compete with related research programs in medicine, industry, and government for these researchers. Attracting the best researchers in turn would depend on an adequately funded and managed venture that appears well placed to succeed.

Institutional Framework The committee concludes that an appropriate institutional structure for the proposed vision is a research institute that fosters multidisciplinary research intramurally and extramurally. A strong intramural research program is essential. The effort cannot succeed merely by creating a virtual institution to link and integrate organizations that are performing relevant research and by dispersing funding on relevant research projects. A mission-oriented, intramural program with core multidisciplinary programs to answer the critical research questions can foster the kind of cross-discipline activity essential for the success of the initiative. There would be far less chance of success within a reasonable period if the research were dispersed among different locations and organizations without a core integrating and organizing institute. A collocated, strong intramural research initiative will enable the communication and problem solving across disciplines required for the research and assay development.

Similarly, a strong, well-coordinated, targeted extramural program will leverage the expertise that already exists within academe, pharmaceutical companies, the biotechnology sector, and elsewhere and foster research that complements the intramural program. Through its intramural and highly targeted extramural activities, the envisioned research institute would provide the nexus through which the new testing tools would be conceived, developed, validated, and incorporated into coherent testing schemes.

The committee sees the research institute funded and coordinated primarily by the federal government, given the scale of the necessary funding, the multiyear nature of the project, and links to government regulatory agencies. That does not mean that there will be no role for other stakeholders. Biotechnology companies, for example, could co-fund specific projects. Academic researchers could conduct research with the program's extramural funds.


Moreover, researchers in industry and academe will continue making important progress in fields related to the proposed vision independently of the proposed projects.

The key institutional question is where to house the government research institute that carries out the intramural program of core multidisciplinary research and manages the extramural program of research. Should it be an existing entity, such as the National Institute of Environmental Health Sciences (NIEHS), or a new entity devoted exclusively to the proposed vision? The committee notes that the recognized need for research and institutional structures that transcend disciplinary boundaries to address critical biomedical research questions has spawned systems-biology institutes and centers at biomedical firms and several leading universities in the country. However, the committee found few examples in the government sector. The Department of Energy (DOE) Genomics GTL Program seeks to engineer systems for energy production, site remediation, and carbon sequestration based on systems-biology research on microorganisms. In its review of this DOE program, NRC (2006c) found collocated, integrated vertical research to be essential to its success.

If one were to place the proposed research program into an existing government entity, a possible choice would be the NTP, a multiagency entity administered and housed in NIEHS. The NTP has several features that suggest it as a possible institutional home for the research program envisioned here, including its mandate to develop innovative testing approaches, its multiagency character, the similarities between its Vision and Roadmap for the Future and what is envisioned here, and its expertise in validating new tests through the NTP Interagency Center for the Evaluation of Alternative Toxicological Methods and its sister entity, the Interagency Coordinating Committee on the Validation of Alternative Methods, and in "-omics" testing at its Center for Toxicogenomics. It is conceivable that the NTP could absorb the research mandate outlined here if its efforts were dramatically scaled up to accommodate the focused program envisioned.

If it were placed in the NTP, structures would have to be in place to ensure that the day-to-day technical focus on short-term problems of high-volume chemical testing would not impede progress in evolving testing strategies. As the new test batteries and strategies are developed and validated, they would be moved out of the research arm and be made available for routine application.

The committee considered housing the proposed research institute in a regulatory agency and notes that this could be problematic. The science and technology budgets of regulatory agencies have been under considerable stress and appear unlikely to sustain such an effort. Although U.S. EPA's NCCT has initiated important work in this field, the scale of the endeavor envisioned by the committee is substantially larger and could not be sufficiently supported if recent trends in congressional budgeting for the U.S. EPA continue. For example, U.S. EPA's science and technology research budget has been suboptimal and decreasing in real dollars for a number of years (U.S. EPA, 2006b, 2007).

The research portfolio entailed by the committee's vision will also require active management to maintain relevance and the scientific focus needed for knowledge development. Although sufficient input from regulatory agencies is needed, insulation of the institute from the short-term orientation of regulatory-agency programs that depend on the results of toxicologic testing is important.

In the end, the committee noted that wherever the institute is housed, it should be structured along the lines of the NTP, with intramural and focused extramural components and interagency input but with its own focused mission and funding stream.

Scientific Surprises and the Need for Midcourse Corrections Research often brings surprises, and today's predictions concerning the promise of particular lines of research are probably either pessimistic or optimistic in some details. For example, the committee's vision of toxicity testing stands on the presumption that a relatively small number of pathways can provide sufficiently broad coverage to allow a moderately sized set of high- and medium-throughput assays to be developed for the scientific community to use with confidence and that any important gaps in coverage can be addressed with a relatively small set of targeted assays.


That presumption may be found to be incorrect. Furthermore, the establishment of links between perturbations and apical endpoints may prove especially challenging for some endpoints. Thus, as the research proceeds and learning takes place, adjustments in the vision and the research focus can be anticipated.

In addition to program oversight noted above, the research program should be assessed every 3–5 years by well-recognized scientific experts independent of vested interests in the public and private sectors. The assessment would weigh practical progress, the promise of methods on the research horizon, and the place of the research in the context of other research, and it would recommend midcourse corrections.

Concluding Remarks for Developing the Science Base and Assays to Implement the Vision

In the traditional approach to toxicity testing, the whole animal provides for the integration and evaluation of many toxicity pathways. Yet each animal study is time-consuming and expensive and results in the use of many animals. In addition, many animal studies need to be done to evaluate different endpoints, life stages, and exposure durations. The new approach may require individual assays for hundreds of relevant toxicity pathways. Despite that apparent complexity, emerging methods allow testing of many pathways extremely rapidly and efficiently (for example, in microarrays or wells). If positive signals from the assays can be used with confidence to guide risk management, the new approach will ultimately prove more efficient than the traditional one.

It is clear, however, that much development and refinement will be needed before a new and efficient system could be in place. For some kinds of toxicity, such as developmental toxicity and neurotoxicity, the identification of replacement toxicity-pathway assays might be particularly challenging, and some degree of targeted testing might continue to be necessary. In addition, the validation process may uncover unexpected and challenging technical problems that will require targeted testing. Finally, the parallel interim process may discover that some categories of chemicals or of toxicity cannot yet be evaluated with toxicity-pathway testing. Nonetheless, the committee envisions the steady evolution of toxicity testing from apical endpoint testing to a system based largely on toxicity-pathway batteries in a manner mindful of information needs and of the capacity of the test system to provide information.

In the long term, the committee expects toxicity pathways to become sufficiently well understood and calibrated for batteries of high-throughput assays to provide a substantial fraction of the toxicity-testing data needed for environmental decision making. Exposure monitoring, human surveillance for early perturbations of toxicity-response pathways, and epidemiologic studies should provide an additional layer of assurance that early indications of adverse effects would be detected if they occurred. The research conducted to realize the committee's vision would support a series of substantial improvements in toxicity testing in the relatively near term.

PREREQUISITES FOR IMPLEMENTING THE VISION IN REGULATORY CONTEXTS

The committee’s vision sets the stage fortransformative changes in toxicity testing in theregulatory agencies and the larger scientificcommunity. Although advances in the state ofthe science are indispensable to realization ofthe vision, corresponding institutional changesare also important. The changes will promoteacceptance of the principles and methodsenvisioned. Acceptance will depend on severalfactors, some having scientific origins. Forexample, the new testing requirements will beexpected to reflect the state of the science andto be founded on peer-reviewed research,established protocols, validated models, caseexamples, and other scientific features. Other


Other factors stem from administrative procedures associated with rulemaking, such as documenting scientific sources; providing opportunities for scientific experts, stakeholders, and the interested public to participate; and consulting with sister agencies and international organizations.

This section explores the conditions required for using the new testing strategy for regulatory purposes. It focuses on the federal agencies and identifies institutional outlooks and orientation—both tangible, such as budget and staffing, and intangible, such as leadership and commitment—that can determine the pace and degree to which the vision is incorporated into agency culture and practice. The section also addresses the fundamental issues related to the use and the validity of the new concepts, technologies, and resulting data for the specific purpose of developing federal regulations.

The committee’s vision anticipates continualchange over the next two to three decades.Beyond the scientific and procedural consider-ations summarized in this section, the state ofthe economy, changing environmental condi-tions and social perspectives, and otherdynamics that shape the political climate willinfluence legislative changes and federal budgetsthat, in turn, will determine the future of toxicitytesting in the regulatory context.

Institutional Change to Meet the Vision

Attitudes and Expectations Full realization of the vision depends on the promotion of new testing principles and methods in the scientific community at large. As in the past, some changes will originate outside the regulatory agencies and work their way into agency practice, and others will originate in the agencies and work their way into the larger scientific community. In both cases, far-reaching shifts in orientation and perception will be critical. For risk assessors and researchers, the shifts will be from familiar types of studies and established procedures involving overt effects in laboratory animals and cross-species extrapolation to new approaches that focus on how chemicals, both endogenous and exogenous, interact in human disease processes (Lieber, 2006). Many analysts in and outside the agencies will have to apply their expertise in new ways.

The need for a change in attitude and orientation extends far beyond risk assessors and the toxicity-testing community. Most difficult, perhaps, will be the new level of scientific understanding needed to enable many participants, especially nonscientists, to become sufficiently informed to engage in discussion of the new methods. Lawmakers who determine policy and appropriate funds, federal executives who determine research priorities, politically accountable managers and decision makers who use data-based risk assessment for making regulatory decisions, courts that review those decisions, and the public, which has an interest in the need for and nature of regulations, will need to become acquainted with new terminology and concepts.

Nonscientists will grasp some aspects of the new science—such as having regulations based on data derived from human cells, cell lines, and tissues rather than on laboratory animals—more easily than other aspects, such as the molecular basis of chemical changes that lead to adverse health effects. Ideally, individual or institutional "champions" will emerge to foster and guide the implementation process.

Developing and Cultivating Expertise Effective implementation depends on competent scientists and informed agency management. Those factors are crucial: Agency progress depends on the expertise and experience of the technical staff and a supportive management structure. Incorporating new tests and testing strategies into risk-assessment practices and agency testing guidelines will go no further or faster than staffing permits.

For several decades, academic institutions have prepared scientists for toxicity testing and risk analysis through training in chemistry, biology, toxicology, pharmacology, and the related medical and engineering disciplines. Agency scientists receive their basic undergraduate and postgraduate education and training from external institutions and bring their training to bear on their work for the agencies. For many, pre-agency experience includes postdoctoral fellowships, internships, or first jobs at universities, industry laboratories, consulting laboratories, and other outside organizations.


The kind of expertise currently available in the agencies therefore reflects in large measure expertise in the larger scientific community. That tradition has contributed to a large and stable cadre of well-trained scientists in the federal agencies that have science-based responsibilities. Thus, implementing the vision will require an infusion of new scientists who have education and experience in the new technologies and special training for current scientific staff and managers.

Scientists in academe, industry, and consulting laboratories and organizations have had a productive exchange with those in regulatory agencies through professional conferences and workshops, joint research projects, and peer-review activities. Fostering and accelerating those activities will be critical for implementing the vision and will require congressional and management support of targeted investment in developing and sustaining agency expertise. Scientists gravitate to attractive, well-funded, and well-staffed programs. To hire and retain high-caliber scientists in the numbers and disciplines needed, agencies will need congressional and management support of the vision reflected in budget allocations and hiring authorizations.

Policies to Foster Development and Use of New Tests Institutional change does not come easily. The history of toxicity testing indicates that the pace and extent of change will depend in part on policies and incentives. Some policies and incentives to encourage the use and development of the new tests by agencies are discussed here.

First, continued progress in the use of the new technologies constitutes the greatest incentive to reconfiguring agency testing programs in line with the vision. Policies to support and reward effective use of new testing concepts and methods should be implemented. Apart from historical high-visibility examples, such as the Human Genome Project, current broad-scale examples include the development and use of mechanistic data and the expanding list of "-omics" applications.

Second, policies to encourage the use of data generated with the new testing paradigm in chemical assessments by the agencies will be important. That will involve the evolution of agencies' risk-assessment methods and guidelines as the new tests are developed and used. For decades, the federal agencies have promulgated formal risk-assessment guidelines, based in part on consultation with outside scientists and the public, that codify generally accepted concepts and methods to be followed in assessing the hazards, dose-response relationships, exposures, and risks related to environmental agents (for example, U.S. EPA, 1991, 1996, 1998b, 2005c). Policies to include the new technologies in agency assessments can foster and accelerate their acceptance and institutionalization.

Third, congressional funding of agencies to implement the vision is essential to support relevant research and staffing, encourage collaboration with scientists outside the agencies, recognize accomplishments by scientists and their management, and support other policies to promote change.

Fourth, dependence of market access on the conduct of specific toxicity tests can be a policy incentive. For example, the European Union's Registration, Evaluation and Authorisation of Chemicals (REACH) program requires generation of a basic set of toxicity data on new industrial chemicals before the chemicals can enter the market; the program also sets deadlines for receipt of basic toxicity data on existing industrial chemicals. Another example is the registration of pesticides in the United States.

Fifth, scientific progress in toxicity testing depends on work in academic and private-sector laboratories and in the federal sector. Congressional and agency policies and activities must ensure that sufficiently informative data generated from effective new methods are used in the regulatory process and that the large expenditures of money are not in vain.

Sixth, policies designed to overcome tendencies to resist novel approaches and maintain the status quo will be important.


Implementing the vision requires periodic re-examination of testing programs and strategies in each agency and possibly a return to Congress to address outdated and ineffective programs that might impede implementation of novel tests and improved testing strategies.

Regulatory Use of New Methods

The committee’s vision sets the stage fortransformative change in developing data tomeet regulatory objectives codified in lawspassed by Congress. Although the term toxicitytesting rarely, if ever, appears in the major stat-utes administered by the U.S. EnvironmentalProtection Agency (EPA), the availability of reli-able data on “adverse effects” and health orenvironmental “risk” is an underlying assump-tion in them. The Clean Water Act, the CleanAir Act, the Toxic Substances Control Act(TSCA), and pesticide and Superfund legislationare based on the availability of data for riskassessment and regulatory decision making forchemicals in their jurisdictions.

The data can have several sources. Some statutes—such as the Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA), the Food Quality Protection Act, and TSCA—authorize U.S. EPA to require the producers of some chemicals to develop and submit specific categories of data to the agency. Other statutes—such as the Clean Air Act, the Clean Water Act, and the Safe Drinking Water Act—require toxicity data to be considered but depend mainly on information available in the scientific literature or government laboratory reports (in some cases, these statutes authorize U.S. EPA to apply TSCA and FIFRA testing requirements to chemicals in their jurisdiction). Regardless of the statute or the data source, toxicity data are indispensable for well-reasoned conclusions on the nature and dimensions of risk and for well-grounded decisions on the necessity of regulation to protect the public health or the environment and on the nature and scope of any such regulations.

As discussed in previous sections, the committee's vision will result in the generation of data on perturbations in toxicity pathways with the use of high- and medium-throughput assays.

A few of the test methods considered in this report have a long history and a place in the current regulatory testing programs and current risk-assessment guidelines and practices. Others are in early stages of development and have yet to be considered for regulatory use. Still others that will be used eventually are not yet on the drawing board or even imagined. Debate on the scientific validity of nonapical test methods and the application of the resulting data should be expected, and controversy could stall or bar the use of new test methods by regulatory agencies.

The discussion here addresses the prospect of controversy and focuses on the validity and defensibility of the new approaches. The primary measure of validity for regulatory purposes is scientific validity. Evidence of reliability and credibility that satisfies established scientific criteria is the principal basis for adopting and adapting new testing concepts and methods for regulatory use. Validity in this sense does not require de novo testing or further confirmation of previously validated scientific tests (see the section "Developing the Science Base and Assays to Implement the Vision"). Rather, it involves producing documentary evidence that the tests have been validated consistently with standard scientific criteria. The objective is to avoid bringing unproven tests and the resulting data into the regulatory system. However, there are also policy and procedural aspects to validation, so the discussion also addresses administrative policies and procedures and other nonscientific considerations related to promulgating and defending government testing practices and requirements. New data and data categories developed in line with the proposed changes in testing can be expected to affect many aspects of risk assessment and risk management. This section comments mainly on testing requirements.

Scientific Prerequisites of Validity

The federal agencies have a 75-year history of developing and promulgating toxicity-testing requirements for external entities, such as pesticide and drug manufacturers, and internal guidance for government laboratories (see introductory sections of this report).


Documenting the validity, reliability, and relevance of test methods to the satisfaction of the scientific community has been and will continue to be an essential first step in identifying appropriate methods for use in the regulatory context. That documentation can also provide information and a tutorial for decision makers, the public, and the courts.

Individual agency testing requirements do not arise de novo. For example, U.S. EPA's Office of Pesticide Programs promulgates test guidelines and requirements only after a comprehensive development and review process involving public comment, harmonization with other international organizations, and peer review by experts in the field (see, for example, 63 Fed. Reg. 41845–41848 (1998) and U.S. EPA, 2006c). Documentary evidence of validity has many sources and takes several forms. It includes evidence that customary criteria of scientific acceptance, such as peer review and publication in scholarly journals, have been satisfied. Use by other laboratories, other government agencies, or international organizations, such as the Organization for Economic Cooperation and Development, is an indication of scientific acceptability. As new methods emerge, case studies and peer-reviewed testing guidelines, standardized operating procedures, and practice can be used to document validity.

Establishing and documenting the validity of the new nonapical test methods and the validity of markers of adverse responses corresponding to perturbations of toxicity pathways will be important milestones in implementing the committee's vision for regulatory use. Some considerations for accomplishing this are discussed next.

Adopting and Adapting New Test Systems and Methods The vision prompts questions regarding the extent to which scientific progress using primarily human cells, cell lines, and cellular components in vitro can replace and, ideally, surpass in vivo mammalian systems as predictors of toxic effects in humans. Testing with cellular systems derived from human tissue and from nonmammalian systems is backed by an impressive scientific literature and has a long history that includes major contributions to cancer research and the Human Genome Project.

Regulatory agencies also use in vitro systems for toxicity testing and risk assessment. In vitro mode-of-action data were central elements when U.S. EPA proposed revisions to the cancer guidelines more than 10 years ago and in the final guidelines (U.S. EPA, 2005c). Mode-of-action data are featured in a wide array of risk assessments in U.S. EPA, other government institutions, and the private sector (for example, Meek et al., 2003; CalEPA, 2004; NTP, 2005b; IARC, 2006). U.S. EPA's exploration of mode-of-action approaches illustrates the use of information on biologic perturbations involved in key toxicity pathways.

With few exceptions, such studies are used in the regulatory context mainly to supplement or complement data from in vivo studies. As a result, despite the established value of in vitro systems for many purposes, increased reliance on them for regulatory testing may require further evidence of validity. As discussed in this report, a particularly important aspect of establishing validity concerns metabolism. Many of the issues are highlighted in the following statement:

Several major problems are encountered in studying metabolism-related toxicity in vitro: (a) modeling human metabolism . . . ; (b) maintaining tissue-specific function in vitro; (c) selecting an appropriate xenobiotic metabolizing system; (d) keeping enzyme activity stable over time; and (e) the adverse effects to toxicity-indicator cells of subcellular metabolizing fractions. . . . Two further problems [are] the testing of mixtures of chemicals that might require different enzyme systems . . . and . . . the inactivation of exogenous biotransformation systems, due to exposure to certain solvents and test substance. (Coecke et al., 2006, 57)

Unresolved scientific issues of that type are potential barriers to full validation and acceptance of some new concepts and methods for use in the regulatory context. Such issues show that although the vision conforms to the current movement from in vivo to in vitro test systems, a new set of scientific and related issues may replace interspecies extrapolation as a source of controversy.


For example, using human cell lines in culture instead of laboratory animals to identify early perturbations in a cellular-response network avoids the uncertainties associated with the customary animal-to-human extrapolation. But such human-to-human methods introduce new issues and related uncertainties, such as extrapolation from isolated cells in tissue culture to intact humans and from the genetic backgrounds of the cultured cells to the genetic backgrounds of individuals or populations of interest for risk-assessment purposes.

Incorporation of emerging methods depends in part on the status of the new methods in the scientific community, which in turn depends on the reliability of new test systems in identifying compounds with known biologic activities. The generic question is "readiness" for regulatory use. Methods still under development are not necessarily barred, but until they are fully tested and documentable, questions regarding extrapolation, relevance, and possible controversy with respect to use for regulatory purposes can be expected.

Identifying and Defining Markers and Indicators of Adverse Responses The vision calls for replacing current tests for apical endpoints, such as tumors and birth defects, with mechanistically based testing that identifies early markers of disease and potential risk. The new tests focus on perturbations that are expected to produce adverse responses. This aspect of the vision presents validation issues that require two kinds of documentation, one scientific and one policy-related.

As discussed earlier, assessment of scientific validity will require evidence, such as peer-reviewed publications and other indicators of acceptance in the scientific community. Similar documentation will be required for other new endpoint categories identified as early indicators of perturbations of critical pathways that have the potential to cause toxic effects.

The policy question is an old one: What constitutes an adverse effect? The regulatory trigger for many statutes administered by U.S. EPA is an adverse effect or some variation.

For example, the Safe Drinking Water Act calls for establishing contaminant concentrations at which "no known or anticipated adverse effects on the health of persons occur and which allows an adequate margin of safety." A FIFRA provision calls for preventing "unreasonable adverse effects on the environment," a phrase that includes nontarget animals as well as humans. As a result, identifying adverse effects is the objective of many current testing practices and regulations and will be critical for the use of new test methods and data.

Historically, both in legislation and in practice, testing and regulation have focused on apical endpoints, particularly clinically, anatomically, or histopathologically observable endpoints, such as tumors, birth defects, and neurologic impairments. That precedent could provide a basis of resistance to a move from traditional apical endpoints to perturbations of toxicity pathways. However, despite the historical emphasis, scientific and regulatory sources make clear that adverse effects embrace a wide array of endpoint categories. Table 7 provides some definitions that are consistent with the vision's approach to toxicity testing.

In this case, establishing validity for regulatory purposes involves documenting (1) sources that justify a broad interpretation of adverse effects as a concept and (2) published papers and other materials that show the relationship between responses in toxicity pathways and disease. Case studies that link specific chemicals, mechanistic endpoints, and disease would be useful.

Policy and Procedural Prerequisites of Validity Ideally, new test systems and agency guidelines that incorporate them will coevolve. In that regard, opportunities for public participation are as important as scientific measures of validity. For the courts, in laboratories subject to government testing requirements, and in the public forum, the perceived legitimacy of new testing approaches depends also on nonscientific factors.

Establishing a Record For any of the components of the vision, documentary evidence of scientific validity as reviewed earlier in this report makes up the substantive portion of the record, but evidence of public participation is also important.


Current U.S. EPA practice often includes extensive discussion with scientists in universities, industry, advocacy groups, and other government agencies at public conferences and workshops. Informal or formal notice-and-comment rulemaking procedures and external peer review are critical steps in the development and issuance of new testing and risk-assessment guidance (U.S. EPA, 1998c, 2005c).

Audience and Communication Issues The committee's vision is the product of extensive scientific thought supported by a substantial body of scientific evidence. The scientific principles and methods involved in the implementation of the committee's vision are well known in the scientific community, a major constituency in the discussion of the scientific validity of data derived from toxicity tests for regulatory use. Scientists have long recognized the importance of effective communication of scientific results to a wide variety of stakeholders in toxicity testing, including other scientists, regulatory authorities, industry, the mass media, nongovernment organizations, and the public (NRC, 1989; Leiss, 2001; Krewski et al., 2006; ATSDR, 2007). However, because of the transformative nature of the committee's vision for toxicity testing, communication of the scientific basis of the vision and its implications for risk assessment of environmental agents will be challenging.

Here, there is a need for clarity in communicating the essence of the committee's vision to affected parties. The nature and scientific complexity of the unfamiliar and more sophisticated methods promoted in the vision may require new communication approaches. The scientific community may be best positioned to understand the scientific basis on which the committee's vision rests but may need time to appreciate its implications fully. Acceptance of the committee's vision in the scientific community will require further elaboration of the technical details of its implementation and generation of new scientific evidence to support the move away from apical endpoints to perturbations of toxicity pathways. The broad participation of the scientific community in the elaboration of the committee's vision for toxicity testing is essential for its success.

Even more challenging will be the nonscientists' understanding and acceptance of the committee's vision. Regulatory authorities will need to consider how current risk-assessment practices can be adapted to make use of the types of toxicity-testing data underlying the committee's vision to arrive at human exposure guidelines for environmental agents judged, on the basis of the new test results, to have toxic potential. Lawmakers will need to determine whether the regulatory statutes that form the basis of such guidelines need to be modified to reflect the greater reliance on indicators of toxicity-pathway perturbations than on overt health outcomes.

TABLE 7. Definitions of Adverse Effect

"Adverse effect: A biochemical change, functional impairment, or pathologic lesion that affects the performance of the whole organism, or reduces an organism's ability to respond to an additional environmental challenge." (IRIS, 2007)

"Adverse effect: Change in the morphology, physiology, growth, development or life span of an organism, system or (sub)population that results in an impairment of functional capacity, an impairment of the capacity to compensate for additional stress, or an increase in susceptibility to other external influences." (Renwick et al., 2003)

"Adverse effects are changes that are undesirable because they alter valued structural or functional attributes of the entities of interest . . . . The nature and intensity of effects help distinguish adverse changes from normal . . . variability or those resulting in little or no significant change." (Sergeant, 2002)

"The spectrum of undesired effects of chemicals is broad. Some effects are deleterious and others are not . . . . [Regarding drugs], some side effects . . . are never desirable and are deleterious to the well-being of humans. These are referred to as the adverse, deleterious, or toxic effects of the drug." (Klaassen & Eaton, 1991)

"All chemicals produce their toxic effects via alterations in normal cellular biochemistry and physiology . . . . It should also be recognized that most organs have a capacity for function that exceeds that required for normal homeostasis, sometimes referred to as functional reserve capacity." (Klaassen & Eaton, 1991)


For regulatory and legal experts to support the implementation of the committee's vision, it is essential that the fundamental biologic tenets underlying it be clearly articulated and reinforced by the development of the scientific data needed to support the shift away from a focus on apical outcomes to biologic perturbations of key toxicity pathways. The communication challenge will be to portray the benefits of adopting the committee's vision in scientifically valid terms without obscuring the vision in intricate scientific detail.

Adoption of the committee’s vision willrequire acceptance by politicians and the publicalike. There will undoubtedly be a lack of sup-port for its implementation if the scientificessence of the vision (the notion of toxicitypathways and the effects of perturbing them) isnot communicated in understandable terms.Data will need to be generated to demonstratethat avoidance of such perturbations will pro-vide a level of protection against the potentialhealth risks posed by environmental agents atleast as great as the current level. It will also beimportant to demonstrate that adoption of thecommittee’s vision will permit an assessment ofthe potential risks associated with many moreagents than is possible with current toxicity-testing practices and that this expanded coverageof the universe of environmental agents can beachieved cost-effectively.

The vision for toxicity testing in the 21st century articulated here represents a paradigm shift from the use of experimental animals and apical endpoints toward the use of more efficient in vitro tests and computational techniques. Implementation of the vision, which will provide much broader coverage of the universe of environmental agents that warrant our attention from a risk-assessment perspective, will require a concerted effort on the part of the scientific community. A substantial commitment of resources will be required to generate the scientific data needed to support that paradigm shift, which can be achieved only with the steadfast support of regulators, lawmakers, industry, and the general public. Their support will be garnered only if the essence of the committee's vision can be communicated to all stakeholders in understandable terms.

CONCLUSIONS

Change often involves a pivotal event that builds on previous history and opens the door to a new era. Pivotal events in science include the discovery of penicillin, the elucidation of the DNA double helix, and the development of computers. All were marked by inauspicious beginnings followed by unheralded advances over a period of years but ultimately resulted in a pharmacopoeia of lifesaving drugs, a map of the human genome, and a personal computer on almost every desk in today's workplace.

Toxicity testing is approaching such a scientific pivot point. It is poised to take advantage of the revolutions in biology and biotechnology. Advances in toxicogenomics, bioinformatics, systems biology, epigenetics, and computational toxicology could transform toxicity testing from a system based on whole-animal testing to one founded primarily on in vitro methods that evaluate changes in biologic processes using cells, cell lines, or cellular components, preferably of human origin. Anticipating the impact of recent scientific advances, the U.S. Environmental Protection Agency (EPA) asked the National Research Council (NRC) to develop a long-range vision for toxicity testing and a strategic plan for implementing the vision.

This article is based on the NRC Committee's report on Toxicity Testing and Assessment of Environmental Agents, prepared in response to U.S. EPA's request, and envisions a major campaign in the scientific community to advance the science of toxicity testing and put it on a forward-looking footing. The potential benefits are clear. Fresh thinking and the use of emerging methods for understanding how environmental agents affect human health will promote beneficial changes in testing of these agents and in the use of data for decision making. The envisioned change is expected to generate more robust data on the potential risks to humans posed by exposure to environmental agents and to expand capabilities to test chemicals more efficiently.


A stronger scientific foundation offers the prospect of improved risk-based regulatory decisions and possibly greater public confidence in and acceptance of the decisions.

With those goals in mind, the committee presents in this report a vision for mobilizing the scientific community and marshalling scientific resources to initiate and sustain new approaches, some available and others yet to be developed, to toxicity testing. This report speaks to scientists in all sectors—government, public interest, industry, university, and consulting laboratories—who design and conduct toxicity tests and who use test results to evaluate risks to human health. The report also seeks to inform and engage decision makers and other leaders who shape the nature and scope of government regulations and who establish budgetary priorities that will determine progress in advancing toxicity testing in the future. The full impact of the committee's wide-ranging recommendations can be achieved only if both scientists and nonscientists work to advance the objectives set forth in the vision.

The Vision

The current approach to toxicity testing relies primarily on a complex array of studies that evaluate observable outcomes in whole animals, such as clinical signs or pathologic changes, that are indicative of a disease state. Partly because that strategy is so time-consuming and resource-intensive, it has had difficulty in meeting many challenges encountered today, such as evaluating various life stages, numerous health outcomes, and large numbers of untested chemicals. The committee debated several options for improving the current system but concluded that a transformative paradigm shift is needed to achieve the design criteria set out in the committee's interim report: (1) to provide broad coverage of chemicals, chemical mixtures, outcomes, and life stages, (2) to reduce the cost and time of testing, (3) to use fewer animals and cause minimal suffering in the animals used, and (4) to develop a more robust scientific basis for assessing health effects of environmental agents. For a further discussion of the options considered by the committee, see the section "Options for a New Toxicity-Testing Paradigm."

The committee considered recent scientific advances in defining a new approach to toxicity testing. Substantial progress is being made in the elucidation of cellular-response networks—interconnected pathways composed of complex biochemical interactions of genes, proteins, and small molecules that maintain normal cellular function, control communication between cells, and allow cells to adapt to changes in their environment. For example, one familiar cellular-response network is signaling by estrogens, in which initial exposure results in enhanced cell proliferation and tissue growth in specific tissues. Bioscience is enhancing our knowledge of cellular-response networks and allowing scientists to begin to uncover how environmental agents perturb pathways in ways that lead to toxicity. Cellular-response pathways that, when sufficiently perturbed, are expected to result in adverse health effects are termed toxicity pathways. The committee envisions a new toxicity-testing system that evaluates biologically significant perturbations in key toxicity pathways by using new methods in computational biology and a comprehensive array of in vitro tests based on human biology.

Components of the Vision

Figure 3 illustrates the major components of the committee's vision: chemical characterization, toxicity testing, and dose-response and extrapolation modeling. The components of the vision, which are described in the subsections that follow, are distinct but interrelated modules involving specific sets of technologies and scientific capabilities. Some chemical evaluations may proceed in a stepwise manner—from chemical characterization to toxicity testing to dose-response and extrapolation modeling—but such a sequential evaluation need not always be followed in practice. A critical feature of the new vision is consideration of the risk context (the decision-making context that creates the need for toxicity-testing information) at each step and the ability to exit the strategy at any point when sufficient data have been generated for decision making. The vision emphasizes the generation and use of population-based and human exposure data where possible for interpreting test results and encourages the collection of such data on important chemicals with biomonitoring, surveillance, and epidemiologic studies. Population-based and human exposure data, along with the risk context, will play a role in both guiding and using the toxicity information that is produced. Finally, the vision anticipates the development of a formal process to phase in and phase out test methods as scientific understanding of toxicity-testing methods expands. That process addresses the need for efficient testing of all chemicals in a timely, cost-effective fashion.

Chemical Characterization. Chemical characterization is meant to provide insights into key questions, including a compound's stability in the environment, the potential for human exposure, the likely routes of exposure, the potential for bioaccumulation, possible routes of metabolism, and the likely toxicity of the compound and possible metabolites based on chemical structure or physical or chemical characteristics. Thus, data would be collected on physical and chemical properties, use, possible environmental concentrations, metabolites and breakdown products, initial molecular interactions of compounds and metabolites with cellular components, and possible toxic properties. A variety of computational methods might be used to predict those properties and characteristics. After chemical characterization, decisions might be made about what further testing is required or whether it is needed at all. In most cases, chemical characterization alone is not expected to be sufficient to reach decisions about the toxicity of an environmental agent.

Toxicity Testing. In the vision proposed (Figure 3), toxicity testing has two components: toxicity-pathway assays and targeted testing. The committee expects that when the vision is achieved, predictive, pathway-based assays will serve as the central component of a broad toxicity-testing strategy for assessing the biologic activity of new and existing compounds. Targeted testing will serve to complement the assays and support evaluation.

Toxicity Pathways. Figure 2 illustrates the activation of a toxicity pathway. The initial perturbations of cell-signaling motifs, genetic circuits, and cellular-response networks are obligatory changes resulting from chemical exposure that might eventually result in disease. The consequences of a biologic perturbation depend on its magnitude, which is related to the dose, the timing and duration of the perturbation, and the susceptibility of the host. Accordingly, at low doses, many biologic systems may function normally within their homeostatic limits. At somewhat higher doses, clear biologic responses occur. They may be successfully handled by adaptation, although some susceptible people may respond. More intense or persistent perturbations may overwhelm the capacity of the system to adapt and lead to tissue injury and possible adverse health effects.
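
To make the dose-dependent transitions described above concrete, the following minimal sketch (illustrative only; the fold-change thresholds are hypothetical and are not values from the committee's report) classifies an assay readout into the homeostatic, adaptive, and adverse regimes:

    # Illustrative sketch; threshold values are hypothetical, not from the report.
    def classify_perturbation(fold_change, adaptive_threshold=1.5, adverse_threshold=3.0):
        """Map a pathway readout (fold change over control) to the regimes
        described in the text: homeostatic, adaptive, or potentially adverse."""
        if fold_change < adaptive_threshold:
            return "homeostatic"   # within normal operating limits
        if fold_change < adverse_threshold:
            return "adaptive"      # clear response, handled by adaptation
        return "adverse"           # adaptive capacity may be overwhelmed

    for fc in (1.1, 2.0, 4.5):
        print(fc, classify_perturbation(fc))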

The committee's vision capitalizes on the identification and use of toxicity pathways as the basis of new approaches to toxicity testing and dose-response modeling. Accordingly, the vision emphasizes the development of suites of predictive, high-throughput assays (high-throughput assays are efficiently designed experiments that can be automated and rapidly performed to measure the effect of substances on a biologic process of interest—these assays can evaluate hundreds to many thousands of chemicals over a wide concentration range to identify chemical actions on gene, pathway, and cell function) that use cells or cell lines, preferably of human origin, to evaluate relevant perturbations in key toxicity pathways. Those assays may measure relatively simple processes, such as binding of environmental agents with cellular proteins and changes in gene expression caused by that binding, or they may measure more integrated responses, such as cell division and cell differentiation. Although the majority of toxicity tests in the vision are expected to use high-throughput methods, other tests could include medium-throughput assays of more integrated cellular responses, such as cytotoxicity, cell proliferation, and apoptosis. Over time, the need for traditional animal testing should be greatly reduced and possibly even eliminated.
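
As a purely illustrative sketch of the screening step, the following code normalizes raw assay signals against plate controls and flags chemicals for concentration-response follow-up; the chemical names, signal values, and 30% activity cutoff are invented for this example:

    import statistics

    def percent_activity(signal, neg_mean, pos_mean):
        """Express a raw well signal as a percentage of the positive-control response."""
        return 100.0 * (signal - neg_mean) / (pos_mean - neg_mean)

    raw = {"chem_A": 820.0, "chem_B": 145.0, "chem_C": 510.0}  # raw well signals (hypothetical)
    neg = statistics.mean([100.0, 110.0, 95.0])     # vehicle-control wells
    pos = statistics.mean([1000.0, 980.0, 1020.0])  # reference-chemical wells

    hits = {name: round(percent_activity(s, neg, pos), 1)
            for name, s in raw.items()
            if percent_activity(s, neg, pos) >= 30.0}
    print(hits)  # chemicals flagged for concentration-response follow-up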

Targeted Testing. Targeted testing would be used to complement toxicity-pathway tests and to ensure adequate evaluation. It would be used (1) to clarify substantial uncertainties in the interpretation of toxicity-pathway data; (2) to understand effects of representative prototype compounds from classes of materials, such as nanoparticles, that may activate toxicity pathways not included in a standard suite of assays; (3) to refine a risk estimate when the targeted testing can reduce uncertainty and a more refined estimate is needed for decision making; (4) to investigate the production of possibly toxic metabolites; and (5) to fill gaps in the toxicity-pathway testing strategy to ensure that critical toxicity pathways and endpoints are adequately covered. One of the challenges of developing an in vitro test system to evaluate toxicity is the current inability of cell assays to mirror metabolism in the integrated whole animal. For the foreseeable future, any in vitro strategy will need to include a provision to assess likely metabolites through whole-animal testing.

Targeted testing might be conducted in vivo or in vitro, depending on the toxicity tests available. Although targeted tests could be based on existing toxicity-test systems, they will probably differ from traditional tests in the future. They could use transgenic species, isogenic strains, new animal models, or other novel test systems and could include a toxicogenomic evaluation of tissue responses over wide dose ranges. Whatever system is used, testing protocols would maximize the amount of information gained from whole-animal toxicity testing.

Dose-Response and Extrapolation Modeling. In the vision proposed (Figure 3), dose-response models would be developed for environmental agents primarily on the basis of data from mechanistic, in vitro assays as described in the toxicity-testing component. The dose-response models would describe the relationship between concentration in the test medium and degree of in vitro response. In some risk contexts, a dose-response model based on in vitro results might provide adequate data to support a risk-management decision. An example could involve compounds for which host-susceptibility factors in humans are well understood and for which human biomonitoring provides good information about tissue or blood concentrations of the compound and other related exposures that affect the toxicity pathway in a human population.
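
Dose-response models of this kind are commonly summarized with a Hill function fitted to the in vitro concentration-response data. The sketch below, using invented data points and assuming the NumPy and SciPy libraries, estimates the concentration producing half-maximal response (AC50):

    import numpy as np
    from scipy.optimize import curve_fit

    def hill(conc, top, ac50, n):
        """Hill model relating test-medium concentration to in vitro response."""
        return top * conc**n / (ac50**n + conc**n)

    conc = np.array([0.01, 0.1, 1.0, 10.0, 100.0])  # concentration in test medium (uM)
    resp = np.array([1.0, 8.0, 38.0, 82.0, 95.0])   # % of maximal response (hypothetical)

    (top, ac50, n), _ = curve_fit(hill, conc, resp, p0=[100.0, 1.0, 1.0])
    print(f"AC50 = {ac50:.2f} uM, Hill slope = {n:.2f}")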

Extrapolation modeling estimates the environmental exposures or human intakes that would lead to human tissue concentrations similar to those associated with perturbations of toxicity pathways in vitro and would account for host-susceptibility factors. In the vision proposed, extrapolation modeling has three primary components. First, a toxicity-pathway model would provide a quantitative, mechanistic understanding of the dose-response relationship for the perturbations of the pathways by environmental agents. Second, physiologically based pharmacokinetic modeling would then be used to predict human exposures that lead to tissue concentrations that could be compared with the concentrations that caused perturbations in vitro. Third, human data would provide information on background chemical exposures and disease processes that would affect the same toxicity pathway and provide a basis for addressing host susceptibility quantitatively.
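
The pharmacokinetic step can be caricatured with a one-compartment, steady-state relationship; a full physiologically based model would resolve individual tissues and routes of exposure. The sketch below inverts C_ss = F × intake rate / CL to estimate the daily intake corresponding to an in vitro pathway-activating concentration, with every parameter value a hypothetical placeholder:

    # Minimal reverse-dosimetry sketch; all parameter values are hypothetical
    # placeholders, not data from the committee's report.
    def daily_intake_for_css(c_ss_mg_per_L, clearance_L_per_h, bioavailability):
        """Daily intake (mg/day) giving a target steady-state blood level,
        from the one-compartment relation C_ss = F * rate / CL."""
        rate_mg_per_h = c_ss_mg_per_L * clearance_L_per_h / bioavailability
        return rate_mg_per_h * 24.0

    # Suppose an assay shows pathway perturbation at 2 uM and the chemical's
    # molecular weight is 200 g/mol, i.e., a target blood level of 0.4 mg/L.
    target_css = 2e-6 * 200.0 * 1000.0  # mol/L * g/mol * mg/g -> mg/L

    print(daily_intake_for_css(target_css, clearance_L_per_h=5.0,
                               bioavailability=0.8))  # 60.0 mg/day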

Population-Based and Human Exposure Data. Population-based and human exposure data are important components of the committee's toxicity-testing strategy (Figure 3). Those data can help to inform each component of the vision and ensure the integrity of the overall testing strategy. The shift toward the collection of more mechanistic data on fundamental biologic perturbations in human cells will require greater use of biomonitoring and human-surveillance studies for data interpretation. Moreover, the interaction between population-based studies and toxicity tests will improve the design of each study type for answering questions about the importance of molecular, cellular, and genetic factors that influence individual and population-level health risks. Because the vision emphasizes studies conducted in human cells that indicate how environmental agents can affect human biologic responses, the studies will suggest biomarkers (indicators of human exposure, effect, or susceptibility) that can be monitored and studied in human populations.

As toxicity testing shifts to cell-based studies, human exposure data from biomonitoring studies [such as those recommended in the National Research Council report Human Biomonitoring for Environmental Chemicals (NRC, 2006b)] may prove pivotal. Such data can be used to select doses for toxicity testing that can provide information on biologic effects at environmentally relevant exposures. More important, comparison of concentrations that activate toxicity pathways with concentrations of agents in blood, urine, or other tissues from human populations will help to identify potentially important exposures and to ensure an adequate margin of safety in setting human exposure guidelines.
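
Such a comparison amounts to computing a margin between the lowest pathway-activating concentration observed in vitro and the concentration measured in human samples. The following sketch ranks two invented chemicals by that margin; the numbers carry no empirical meaning:

    def margin(activating_conc_uM, biomonitored_conc_uM):
        """Ratio of the in vitro pathway-activating concentration to the
        biomonitored human concentration; smaller margins flag exposures
        that deserve closer attention."""
        return activating_conc_uM / biomonitored_conc_uM

    chemicals = {"chem_X": (5.0, 0.002),  # (activating, biomonitored), uM
                 "chem_Y": (0.8, 0.15)}

    for name, (ac, bio) in sorted(chemicals.items(), key=lambda kv: margin(*kv[1])):
        print(f"{name}: margin = {margin(ac, bio):.0f}")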

Risk Context. Toxicity testing is ultimately useful only if it can be used to facilitate more informed and efficient responses to the public-health concerns of regulators, industry, and the public. Common scenarios, defined by the committee as "risk contexts," for which toxicity testing is used to make decisions include evaluation of potential environmental agents, existing environmental agents, sites of environmental contamination, environmental contributors to a human disease, and the relative risk of different environmental agents. Some risk contexts require rapid screening of tens of thousands of environmental agents; some require highly refined dose-response data, extending down to environmentally relevant exposure concentrations; and some require the ability to test chemical mixtures or to use assays focused on specific mechanisms. Some risk contexts might require the use of population-based approaches, including population health surveillance and biomonitoring. With its emphasis on high-throughput assays that use human cells, cell lines, and components to evaluate biologically significant perturbations in key toxicity pathways, the vision presented here will assist the decision-making process in each risk context.

Implementation of the Vision

Implementation of the vision will require (1) the availability of suites of in vitro tests—preferably based on human cells, cell lines, or components—that are sufficiently comprehensive to evaluate activity in toxicity pathways associated with the broad array of possible toxic responses; (2) the availability of targeted tests to complement the in vitro tests and ensure an adequate toxicity database for risk-management decision making; (3) computational models of toxicity pathways to support application of in vitro test results to predict exposures in the general population that could potentially lead to adverse changes; (4) infrastructure changes to support the basic and applied research needed to develop the tests and the pathway models; (5) validation of tests and test strategies for incorporation into chemical-assessment guidelines that will provide direction in interpreting and drawing conclusions from the new assay results; and (6) evidence justifying that the results of tests based on perturbations in toxicity pathways are adequately predictive of adverse health outcomes to be used in decision making.

A substantial and focused research effort will be needed to meet those requirements. The research will need to develop both new scientific knowledge and new toxicity-testing methods. Key questions that need to be addressed regarding knowledge and method development are highlighted in Table 4 and Table 5, respectively.

The research and development needed to implement the vision would progress in phases whose timelines would overlap. Phase I would focus on elucidating toxicity pathways; developing a data-storage, -access, and -management system; developing standard protocols for research methods and reporting; and planning a strategy for human surveillance and biomonitoring to support the toxicity-pathway testing approach. Phase II would involve development and validation of toxicity-pathway assays and identification of markers of exposure, effect, and susceptibility for use in surveillance and biomonitoring of human populations. Phase III would evaluate assays by running them in parallel with traditional toxicity tests, on chemicals with large data sets, and on chemicals that would not otherwise be tested, as a screening process. Parallel testing will allow identification of toxicities that might be missed if the new assays were used alone and will compel the development of assays to address these gaps. Surveillance and biomonitoring of human populations would also begin during Phase III. Finally, the validated assays would be assembled into panels in Phase IV for use in place of identified traditional toxicity tests.

Validation will be a critical component of the research and development phases. Establishing the validity of any new toxicity assay can be a formidable process—expensive, time-consuming, and logistically and technically demanding. For several reasons, validation will be especially challenging for the mechanistically based tests envisioned by the committee. First, the test results to be generated in the new paradigm depart from the traditional data used by regulatory agencies to set health advisories and guidelines. Second, the many new technologies developed will need to be standardized and refined before specific applications are validated for regulatory purposes. Third, because new technologies are evolving rapidly, the decision to halt optimization of a particular application and begin a formal validation study will be somewhat subjective. Fourth, the committee envisions that a suite of new tests will typically be needed to replace a specific traditional test. Fifth, existing guidelines focus on concordance between the results of new and existing assays; the difficulty will be to find standards for comparison that can assess the relevance and predictivity of the new assays. Sixth, because virtually all environmental agents will perturb signaling pathways to some degree, a key challenge will be to determine when such perturbations are likely to lead to toxic effects and when they are not.

A long-term, large-scale concerted effort is needed to bring the committee's vision for toxicity testing to fruition. A critical factor for success is the conduct of the transformative research to establish the scientific basis of new toxicity-testing tools and to understand the implications of test results and their application in risk assessments used in decision making. The committee concludes that an appropriate institutional structure that fosters multidisciplinary intramural and extramural research is needed to achieve the vision. The effort will not succeed merely by creating a virtual institution to link and integrate organizations that perform relevant research and by dispersing funding on relevant research projects. Mission-oriented intramural and extramural programs with core multidisciplinary activities within the institute to answer the critical research questions listed earlier in this report can foster the kind of interdisciplinary activity essential for the success of the initiative. There would be far less chance of success within a reasonable time if the research were dispersed among different locations and organizations without a core integrating and organizing institute to enable the communication and problem solving required across disciplines.

Research frequently brings surprises, and today's predictions about the promise of lines of research might prove to be too pessimistic or too optimistic in some details. Therefore, the committee recommends that an independent scientific assessment of the research program supporting implementation of the vision be conducted every 3 to 5 years to provide advice for midcourse corrections. The interim assessments would weigh progress, evaluate the promise of new methods on the research horizon, and refine the committee's vision in light of the many scientific advances that are expected to occur in the near future.

Regulatory acceptance of the new toxicity-testing strategy will depend on several factors. New testing requirements will be expected to reflect the state of the science and be founded on peer-reviewed research, established test protocols, validated models, and case studies. Other factors affecting regulatory acceptance stem from administrative procedures associated with rulemaking, such as documenting scientific sources; providing opportunities for scientific experts, stakeholders, and the interested public to participate; and consulting with sister agencies and international organizations. Implementing the vision will require improvements and focused effort over a period of decades. However, given the political will and the availability of funds to adapt the current regulatory system to take advantage of the best possible scientific approaches to toxicity testing in the future, the committee foresees no insurmountable obstacles to implementing the vision presented here.

Resources are always limited, and current toxicity-testing practices are long established and deeply ingrained in some sectors. Thus, some resistance to the vision proposed by this committee is expected. However, the vision takes full advantage of current and expected scientific advances to enhance our understanding of how environmental agents can affect human health. It has the potential to greatly reduce the cost and time of testing and to lead to much broader coverage of the universe of environmental agents. Moreover, the vision will lead to a marked reduction in animal use and focus on doses that are more relevant to those experienced by human populations. The vision for toxicity testing in the twenty-first century articulated here is a paradigm shift that will not only improve the current system but transform it into one capable of overcoming current limitations and meeting future challenges.

REFERENCES

Abdala-Valencia, H., Earwood, J., Bansal, S., Jansen, M., Babcock, G., Garvy, B., Wills-Karp, M., and Cook-Mills, J. M. 2007. Non-hematopoietic NADPH oxidase regulation of lung eosinophilia and airway hyperresponsiveness in experimentally-induced asthma. Am. J. Physiol. Lung Cell Mol. Physiol. 292:L1111–L1125.

Affymetrix Corporation. 2007. GeneChip Arrays. Affymetrix Corporation. Available at http://www.affymetrix.com/products/arrays/specific/hgu133plus.affx.

Akpinar-Elci, M., Kanwal, R., and Kreiss, K. 2002. Bronchiolitis obliterans syndrome in popcorn plant workers. Am. J. Respir. Crit. Care Med. 165:A526.

Akutsu, T., Kuhara, S., Maruyama, O., and Miyano, S. 1998. A system for identifying genetic networks from gene expression patterns produced by gene disruption and overexpressions. Genome Inform. Ser. Workshop Genome Inform. 9:151–160.

Aldridge, B. B., Burke, J. M., Lauffenburger, D. A., and Sorger, P. K. 2006. Physicochemical modeling of cell signaling pathways. Nat. Cell Biol. 8:1195–1203.

Andersen, M. E., Yang, R. S., French, C. T., Chubb, L. S., and Dennison, J. E. 2002. Molecular circuits, biological switches, and nonlinear dose-response relationships. Environ. Health Perspect. 110(suppl. 6):971–978.

Andersen, M. E., Dennison, J. E., Thomas, R. S., and Conolly, R. B. 2005a. New directions in incidence dose-response modeling. Trends Biotechnol. 23:122–127.

Andersen, M. E., Thomas, R. S., Gaido, K. W., and Conolly, R. B. 2005b. Dose-response modeling in reproductive toxicology in the systems biology era. Reprod. Toxicol. 19:327–337.

Anderson, S. 2002. The state of the world's pharmacy: A portrait of the pharmacy profession. J. Interprof. Care 16:391–404.

Andrew, A. S., Burgess, J. L., Meza, M. M., Demidenko, E., Waugh, M. G., Hamilton, J. W., and Karagas, M. R. 2006. Arsenic exposure is associated with decreased DNA repair in vitro and in individuals exposed to drinking water arsenic. Environ. Health Perspect. 114:1193–1198.

Aranda, A., and Pascual, A. 2001. Nuclear hormone receptors and gene expression. Physiol. Rev. 81:1269–1304.

ATSDR (Agency for Toxic Substances and Disease Registry). 2007. A primer on health risk communication principles and practices. Atlanta, GA: U.S. Department of Health and Human Services, Agency for Toxic Substances and Disease Registry. Available at http://www.atsdr.cdc.gov/risk/riskprimer/index.html [accessed March 20, 2007].

Bakand, S., Winder, C., Khalil, C., and Hayes, A. 2005. Toxicity assessment of industrial chemicals and airborne contaminants: Transition from in vivo to in vitro test methods: A review. Inhal. Toxicol. 17:775–787.

Balakin, K. V., Kozintsev, A. V., Kiselyov, A. S., and Savchuk, N. P. 2006. Rational design approaches to chemical libraries for hit identification. Curr. Drug Discov. Technol. 3:49–65.

Balazs, A. C. 2007. Modeling self-assembly and phase behavior in complex mixtures. Annu. Rev. Phys. Chem. 58:211–233.

Balls, M., Amcoff, P., Bremer, S., Casati, S., Coecke, S., Clothier, R., Combes, R., Corvi, R., Curren, R., Eskes, C., Fentem, J., Gribaldo, L., Halder, M., Hartung, T., Hoffmann, S., Schectman, L., Scott, L., Spielmann, H., Stokes, W., Tice, R., Wagner, D., and Zuang, V. 2006. The principles of weight of evidence validation of test methods and testing strategies. The report and recommendations of ECVAM workshop 58. Altern. Lab. Anim. 34:603–620.

Barabasi, A. L., and Oltvai, Z. N. 2004. Network biology: Understanding the cell's functional organization. Nat. Rev. Genet. 5:101–113.

Barton, H. A. 2005. Computational pharmacokinetics during developmental windows of susceptibility. J. Toxicol. Environ. Health A 68:889–900.

Battelle. 2002. Evaluation of SAR predictions of estrogen receptor binding affinity. EPA/68-W-01-023. Work Assignment 2-3. Prepared for U.S. Environmental Protection Agency, Columbus, OH.

Benigni, R. 2004. Chemical structure of mutagens and carcinogens and the relationship with biological activity. J. Exp. Clin. Cancer Res. 23:5–8.

Benigni, R., and Bossa, C. 2006. Structure–activity models of chemical carcinogens: State of the art, and new directions. Ann. Ist. Super. Sanita 42:118–126.

Berns, K., Hijmans, E. M., Mullenders, J., Brummelkamp, T. R., Velds, A., Heimerikx, M., Kerkhoven, R. M., Madiredjo, M., Nijkamp, W., Weigelt, B., Agami, R., Ge, W., Cavet, G., Linsley, P. S., Beijersbergen, R. L., and Bernards, R. 2004. A large-scale RNAi screen in human cells identifies new components of the p53 pathway. Nature 428:431–437.

Bhalla, U. S. 2004a. Signaling in small subcellular volumes. I. Stochastic and diffusion effects on individual pathways. Biophys. J. 87:733–744.

Bhalla, U. S. 2004b. Signaling in small subcellular volumes. II. Stochastic and diffusion effects on synaptic network properties. Biophys. J. 87:745–753.

Bhalla, U. S., Ram, P. T., and Iyengar, R. 2002. MAP kinase phosphatase as a locus of flexibility in a mitogen-activated protein kinase signaling network. Science 297:1018–1023.

Blount, B. C., Pirkle, J. L., Osterloh, J. D., Valentin-Blasini, L., and Caldwell, K. L. 2006. Urinary perchlorate and thyroid hormone levels in adolescent and adult men and women living in the United States. Environ. Health Perspect. 114:1865–1871.

Bodor, N. 1999. Recent advances in retrometabolic design approaches. J. Control. Release 62:209–222.

Bois, F. Y., Krowech, G., and Zeise, L. 1995. Modeling human interindividual variability in metabolism and risk: The example of 4-aminobiphenyl. Risk Anal. 15:205–213.

Bois, F. Y., Gelman, A., Jiang, J., Maszle, D. R., Zeise, L., and Alexeef, G. 1996. Population toxicokinetics of tetrachloroethylene. Arch. Toxicol. 70:347–355.

Borm, P. J., Robbins, D., Haubold, S., Kuhlbusch, T., Fissan, H., Donaldson, K., Schins, R., Kreyling, W., Lademann, J., Krutmann, J., Warheit, D., and Oberdorster, E. 2006. The potential risks of nanomaterials: A review carried out for ECETOC. Part. Fibre Toxicol. 3:11.

Brazma, A., Hingamp, P., Quackenbush, J., Sherlock, G., Spellman, P., Stoeckert, C., Aach, J., Ansorge, W., Ball, C. A., Causton, H. C., Gaasterland, T., Glenisson, P., Holstege, F. C., Kim, I. F., Markowitz, V., Matese, C., Parkinson, H., Robinson, A., Sarkans, U., Schulze-Kremer, S., Stewart, J., Taylor, R., Vilo, J., and Vingron, M. 2001. Minimum information about a microarray experiment (MIAME)—toward standards for microarray data. Nat. Genet. 29:365–371.

Brent, R. 2000. Genomic biology. Cell 100:169–183.

Bugrim, A., Nikolskaya, T., and Nikolsky, Y. 2004. Early prediction of drug metabolism and toxicity: Systems biology approach and modeling. Drug Discov. Today 9:127–135.

California Environmental Protection Agency. 2004. Public health goal for arsenic in drinking water. Office of Environmental Health Hazard Assessment, California Environmental Protection Agency. Available at http://www.oehha.ca.gov/water/phg/pdf/asfinal.pdf.

Carmichael, N. G., Barton, H. A., Boobis, A. R., Cooper, R. L., Dellarco, V. L., Doerrer, N. G., Fenner-Crisp, P. A., Doe, J. E., Lamb IV, J. C., and Pastoor, T. P. 2006. Agricultural chemical safety assessment: A multisector approach to the modernization of human safety requirements. Crit. Rev. Toxicol. 36:1–7.

Cassee, F. R., Groten, J. P., van Bladeren, P. J., and Feron, V. J. 1998. Toxicological evaluation and risk assessment of chemical mixtures. Crit. Rev. Toxicol. 28:73–101.

Centers for Disease Control and Prevention. 2001. National report on human exposure to environmental chemicals. Atlanta, GA: CDC. Available at http://www.noharm.org/details.cfm?ID=745&type=document.

Centers for Disease Control and Prevention. 2003. Second national report on human exposure to environmental chemicals. Atlanta, GA: U.S. Department of Health and Human Services, Centers for Disease Control and Prevention. Available at http://www.serafin.ch/toxicreport.pdf.

Centers for Disease Control and Prevention. 2005. Third national report on human exposure to environmental chemicals. Atlanta, GA: U.S. Department of Health and Human Services, Centers for Disease Control and Prevention. Available at http://www.cdc.gov/exposurereport/pdf/thirdreport.pdf.

Chan, R. C., Wang, M., Li, N., Yanagawa, Y., Onoe, K., Lee, J. J., and Nel, A. E. 2006. Pro-oxidative diesel exhaust particle chemicals inhibit LPS-induced dendritic cell responses involved in T-helper differentiation. J. Allergy Clin. Immunol. 118:455–465.

Chanda, S. K., White, S., Orth, A. P., Reisdorph, R., Miraglia, L., Thomas, R. S., DeJesus, P., Mason, D. E., Huang, Q., Vega, R., Yu, D. H., Nelson, C. G., Smith, B. M., Terry, R., Linford, A. S., Yu, Y., Chirn, G. W., Song, C., Labow, M. A., Cohen, D., King, F. J., Peters, E. C., Schultz, P. G., Vogt, P. K., Hogenesch, J. B., and Caldwell, J. S. 2003. Genome-scale functional profiling of the mammalian AP-1 signaling pathway. Proc. Natl. Acad. Sci. USA 100:12153–12158.

Charles, G. D. 2004. In vitro models in endocrine disruptor screening. ILAR J. 45:494–501.

Cho, K. H., Shin, S. Y., Lee, H. W., and Wolkenhauer, O. 2003. Investigations into the analysis and modeling of the TNFα-mediated NF-κB-signaling pathway. Genome Res. 13:2413–2422.

Clark, L. H., Setzer, R. W., and Barton, H. A. 2004. Framework for evaluation of physiologically-based pharmacokinetic models for use in safety or risk assessment. Risk Anal. 24:1697–1717.

Clewell, H. J., Gentry, P. R., Kester, J. E., and Andersen, M. E. 2005. Evaluation of physiologically based pharmacokinetic models in risk assessment: An example with perchloroethylene. Crit. Rev. Toxicol. 35:413–433.

Coecke, S., Blaauboer, B. J., Elaut, G., Freeman, S., Freidig, A., Gensmantel, N., Hoet, P., Kapoulas, V. M., Ladstetter, B., Langley, G., Leahy, D., Mannens, G., Meneguz, A., Monshouwer, M., Nemery, B., Pelkonen, O., Pfaller, W., Prieto, P., Proctor, N., Rogiers, V., Rostami-Hodjegan, A., Sabbioni, E., Steiling, W., and van de Sandt, J. J. 2005. Toxicokinetics and metabolism. Altern. Lab. Anim. 33(suppl. 1):147–175.

Coecke, S., Ahr, H., Blaauboer, B. J., Bremer, S., Casati, S., Castell, J., Combes, R., Corvi, R., Crespi, C. L., Cunningham, M. L., Elaut, G., Eletti, B., Freidig, A., Gennari, A., Ghersi-Egea, J. F., Guillouzo, A., Hartung, T., Hoet, P., Ingelman-Sundberg, M., Munn, S., Janssens, W., Ladstetter, B., Leahy, D., Long, A., Meneguz, A., Monshouwer, M., Morath, S., Nagelkerke, F., Pelkonen, O., Ponti, J., Prieto, P., Richert, L., Sabbioni, E., Schaack, B., Steiling, W., Testai, E., Vericat, J. A., and
Worth, A. 2006. Metabolism: A bottleneck in in vitro toxicological test development. The report and recommendations of ECVAM workshop 54. Altern. Lab. Anim. 34:49–84.

Congiu, A., Pozzi, D., Esposito, C., Castellano, C., and Mossa, G. 2004. Correlation between structure and transfection efficiency: A study of DC-Chol-DOPE/DNA complexes. Colloids Surf. B Biointerfaces 36:43–48.

Conner, J. D., Ebner, L. S., O'Connor, C. A., Volz, C., and Weinstein, K. W. 1987. Pesticide regulation handbook. New York: Executive Enterprises Publication.

Conolly, R. B. 2002. The use of biologically based modeling in risk assessment. Toxicology 181–182:275–279.

Conolly, R. B., Kimbell, J. S., Janszen, D., Schlosser, P. M., Kalisak, D., Preston, J., and Miller, F. J. 2003. Biologically motivated computational modeling of formaldehyde carcinogenicity in the F344 rat. Toxicol. Sci. 75:432–447.

Conolly, R. B., Kimbell, J. S., Janszen, D., Schlosser, P. M., Kalisak, D., Preston, J., and Miller, F. J. 2004. Human respiratory tract cancer risks of inhaled formaldehyde: Dose-response predictions derived from biologically-motivated computational modeling of a combined rodent and human dataset. Toxicol. Sci. 82:279–296.

Corvi, R., Ahr, H. J., Albertini, S., Blakey, D. H., Clerici, L., Coecke, S., Douglas, G. R., Gribaldo, L., Groten, J. P., Haase, B., Hamernik, K., Hartung, T., Inoue, T., Indans, I., Maurici, D., Orphanides, G., Rembges, D., Sansone, S. A., Snape, J. R., Toda, E., Tong, W., van Delft, J. H., Weis, B., and Schechtman, L. M. 2006. Meeting report: Validation of toxicogenomics-based test systems: ECVAM-ICCVAM/NICEATM considerations for regulatory use. Environ. Health Perspect. 114:420–429.

Cronin, M. T. 2002. The current status and future applicability of quantitative structure–activity relationships (QSARs) in predicting toxicity. Altern. Lab. Anim. 30(suppl. 2):81–84.

Cummings, A., and Kavlock, R. 2005. A systems biology approach to developmental toxicology. Reprod. Toxicol. 19:281–290.

Daston, G. P. 1997. Advances in understanding mechanisms of toxicity and implications for risk assessment. Reprod. Toxicol. 11:389–396.

Doe, J. E., Boobis, A. R., Blacker, A., Dellarco, V., Doerrer, N. G., Franklin, C., Goodman, J. I., Kronenberg, J. M., Lewis, R., McConnell, E. E., Mercier, T., Moretto, A., Nolan, C., Padilla, S., Phang, W., Solecki, R., Tilbury, L., van Ravenzwaay, B., and Wolf, D. C. 2006. A tiered approach to systemic toxicity testing for agricultural chemical safety assessment. Crit. Rev. Toxicol. 36:37–68.

Ekins, S. 2006. Systems-ADME/Tox: Resources and network applications. J. Pharmacol. Toxicol. Methods 53:38–66.

Ekins, S., Nikolsky, Y., and Nikolskaya, T. 2005. Techniques: Applications of systems biology to absorption, distribution, metabolism, excretion and toxicity. Trends Pharmacol. Sci. 26:202–209.

el-Masri, H. A., Thomas, R. S., Sabados, G. R., Phillips, J. K., Constan, A. A., Benjamin, S. A., Andersen, M. E., Mehendale, H. M., and Yang, R. S. 1996. Physiologically based pharmacokinetic/pharmacodynamic modeling of the toxicology interaction between carbon tetrachloride and kepone. Arch. Toxicol. 70:704–713.

El-Samad, H., Kurata, H., Doyle, J. C., Gross, C. A., and Khammash, M. 2005. Surviving heat shock: Control strategies for robustness and performance. Proc. Natl. Acad. Sci. USA 102:2736–2741.

Epstein, M. M. 2004. Do mouse models of allergic asthma mimic clinical disease? Int. Arch. Allergy Immunol. 133:84–100.

Eungdamrong, N. J., and Iyengar, R. 2004. Computational approaches for modeling regulatory cellular networks. Trends Cell Biol. 14:661–669.

Feher, M., Sourial, E., and Schmidt, J. M. 2000. A simple model for the prediction of blood-brain partitioning. Int. J. Pharm. 201:239–247.

Fernandis, A. Z., and Wenk, M. R. 2007. Membrane lipids as signaling molecules. Curr. Opin. Lipidol. 18:121–128.

Feron, V. J., Groten, J. P., Jonker, D., Cassee, F. R., and van Bladeren, P. J. 1995. Toxicology
of chemical mixtures: Challenges for today and the future. Toxicology 105:415–427.

Fischer, H. P. 2005. Towards quantitative biology: Integration of biological information to elucidate disease pathways and to guide drug discovery. Biotechnol. Annu. Rev. 11:1–68.

Food and Drug Administration. 1959. Appraisal of the safety of chemicals in foods, drugs and cosmetics. Staff of the Division of Pharmacology, Food and Drug Administration, Department of Health, Education and Welfare. Austin, TX: Association of Food and Drug Officials of the United States.

Fouchecourt, M. O., Beliveau, M., and Krishnan, K. 2001. Quantitative structure–pharmacokinetic relationship modeling. Sci. Total Environ. 274:125–135.

Frankos, V. H., and Rodricks, J. V. 2001. Food additives and nutrition supplements. In Regulatory toxicology, ed. S. C. Gad, pp. 133–166. London: Taylor & Francis.

Frasor, J., Danes, J. M., Komm, B., Chang, K. C. N., Lyttle, C. R., and Katzenellenbogen, B. S. 2003. Profiling of estrogen up- and down-regulated gene expression in human breast cancer cells: Insights into gene networks and pathways underlying estrogenic control of proliferation and cell phenotype. Endocrinology 144:4562–4574.

Gad, S. C., and Chengelis, C. P. 2001. Human pharmaceutical products. In Regulatory toxicology, ed. S. C. Gad, pp. 9–69. London: Taylor & Francis.

Gargas, M. L., Seybold, P. G., and Andersen, M. E. 1988. Modeling the tissue solubilities and metabolic rate constant (Vmax) of halogenated methanes, ethanes, and ethylenes. Toxicol. Lett. 43:235–256.

Gargas, M. L., Burgess, R. J., Voisard, D. E., Cason, G. H., and Andersen, M. E. 1989. Partition coefficients of low-molecular-weight volatile chemicals in various liquids and tissues. Toxicol. Appl. Pharmacol. 98:87–99.

Gennari, A., van den Berghe, C., Casati, S., Castell, J., Clemedson, C., Coecke, S., Colombo, A., Curren, R., Dal Negro, G., Goldberg, A., Gosmore, C., Hartung, T., Langezaal, I., Lessigiarska, I., Maas, W., Mangelsdorf, I., Parchment, R., Prieto, P., Sintes, J. R., Ryan, M., Schmuck, G., Stitzel, K., Stokes, W., Vericat, J. A., and Gribaldo, L. 2004. Strategies to replace in vivo acute systemic toxicity testing. The report and recommendations of ECVAM Workshop 50. Altern. Lab. Anim. 32:437–459.

Ginsburg, G. S., and Haga, S. B. 2006. Translating genomic biomarkers into clinically useful diagnostics. Expert Rev. Mol. Diagn. 6:179–191.

Goldberg, A. M., and Hartung, T. 2006. Protecting more than animals. Sci. Am. 294:84–91.

Gombar, V. K., Silver, I. S., and Zhao, Z. 2003. Role of ADME characteristics in drug discovery and their in silico evaluation: In silico screening of chemicals for their metabolic stability. Curr. Top. Med. Chem. 3:1205–1225.

Government Accounting Office. 2005. Chemical regulation: Options exist to improve EPA's ability to assess health risks and manage its chemical review program. GAO-05-458. Washington, DC: U.S. Government Accounting Office. Available at http://www.gao.gov/new.items/d05458.pdf.

Grimme, S., Steinmetz, M., and Korth, M. 2007. How to compute isomerization energies of organic molecules with quantum chemical methods. J. Org. Chem. 72:2118–2126.

Gwinn, M. R., and Vallyathan, V. 2006. Nanoparticles: Health effects—pros and cons. Environ. Health Perspect. 114:1818–1825.

Hammond, S. M. 2005. Dicing and slicing: The core machinery of the RNA interference pathway. Fed. Eur. Biochem. Soc. Lett. 579:5822–5829.

Handschin, C., and Meyer, U. A. 2005. Regulatory network of lipid-sensing nuclear receptors: Roles for CAR, PXR, LXR, and FXR. Arch. Biochem. Biophys. 433:387–396.

Haney, S. A., LaPan, P., Pan, J., and Zhang, J. 2006. High-content screening moves to the front of the line. Drug Discov. Today 11:889–894.

Hannon, G. J. 2002. RNA interference. Nature 418:244–251.

Harrington, W. R., Kim, S. H., Funk, C. C., Madak-Erdogan, Z., Schiff, R., Katzenellenbogen, J. A., and Katzenellenbogen, B. S. 2006. Estrogen dendrimer conjugates that preferentially activate extranuclear, nongenomic versus genomic pathways of estrogen action. Mol. Endocrinol. 20:491–502.

Hattis, D., White, P., Marmorstein, L., and Koch, P. 1990. Uncertainties in pharmacokinetic modeling for perchloroethylene. I. Comparison of model structure, parameters, and predictions for low-dose metabolism rates for models derived by different authors. Risk Anal. 10:449–458.

Hillegass, J. M., Murphy, K. A., Villano, C. M., and White, L. A. 2006. The impact of aryl hydrocarbon receptor signaling on matrix metabolism: Implications for development and disease. Biol. Chem. 387:1159–1173.

Ho, S. M., Tang, W. Y., Belmonte de Frausto, J., and Prins, G. S. 2006. Developmental exposure to estradiol and bisphenol A increases susceptibility to prostate carcinogenesis and epigenetically regulates phosphodiesterase type 4 variant 4. Cancer Res. 66:5624–5632.

Hoffmann, A., Levchenko, A., Scott, M. L., and Baltimore, D. 2002. The IκB-NF-κB signaling module: Temporal control and selective gene activation. Science 298:1241–1245.

Hoheisel, J. D. 2006. Microarray technology: Beyond transcript profiling and genotype analysis. Nat. Rev. Genet. 7:200–210.

Hua, F., Hautaniemi, S., Yokoo, R., and Lauffenburger, D. A. 2006. Integrated mechanistic and data-driven modeling for multivariate analysis of signaling pathways. J. R. Soc. Interface 3:515–526.

Huang, Q., Raya, A., DeJesus, P., Chao, S. H., Quon, K. C., Caldwell, J. S., Chanda, S. K., Izpisua-Belmonte, J. C., and Schultz, P. G. 2004. Identification of p53 regulators by genome-wide functional analysis. Proc. Natl. Acad. Sci. USA 101:3456–3461.

Hubbs, A. F., Battelli, L. A., Goldsmith, W. T., Porter, D. W., Frazer, D., Friend, S., Schwegler-Berry, D., Mercer, R. R., Reynolds, J. S., Grote, A., Castranova, V., Kullman, G., Fedan, J. S., Dowdy, J., and Jones, W. G. 2002. Necrosis of nasal and airway epithelium in rats inhaling vapors of artificial butter flavoring. Toxicol. Appl. Pharmacol. 185:128–135.

Iarmarcovai, G., Sari-Minodier, I., Chaspoul, F., Botta, C., De Meo, M., Orsiere, T., Berge-Lefranc, J. L., Gallice, P., and Botta, A. 2005. Risk assessment of welders using analysis of eight metals by ICP-MS in blood and urine and DNA damage evaluation by the comet and micronucleus assays; influence of XRCC1 and XRCC3 polymorphisms. Mutagenesis 20:425–432.

IARC. 2006. Cobalt in hard metals and cobalt sulfate, gallium arsenide, indium phosphide and vanadium pentoxide. IARC Monogr. Eval. Carcinogen. Risks Hum. 86. Lyon, France: IARC Press.

Inglese, J. 2002. Expanding the HTS paradigm. Drug Discov. Today 7(suppl. 18):S105–S106.

Inglese, J., Auld, D. S., Jadhav, A., Johnson, R. L., Simeonov, A., Yasgar, A., Zheng, W., and Austin, C. P. 2006. Quantitative high-throughput screening: A titration-based approach that efficiently identifies biological activities in large chemical libraries. Proc. Natl. Acad. Sci. USA 103:11473–11478.

Institute of Medicine. 2005. Implications of nanotechnology for environmental health research. Washington, DC: National Academies Press.

Institute of Medicine. 2007. The future of drug safety: Promoting and protecting the health of the public. Washington, DC: National Academies Press.

Integrated Risk Information System (IRIS). 2007. Glossary of IRIS terms. Integrated Risk Information System, U.S. Environmental Protection Agency. Available at http://www.epa.gov/IRIS/help_gloss.htm.

Interagency Coordinating Committee on the Validation of Alternative Methods and National Toxicology Program Interagency Center for the Evaluation of Alternative Toxicological Methods. 2003. ICCVAM guidelines for the nomination and submission of new, revised, and alternative test methods. NIH publication no. 03-4508. Research Triangle Park, NC: National Institute of
Environmental Health Sciences, National Institutes of Health.

International Life Sciences Institute Health and Environmental Sciences Institute. 2004a. Systemic toxicity white paper. Systemic Toxicity Task Force, Technical Committee on Agricultural Chemical Safety Assessment, ILSI Health Sciences Institute, Washington, DC.

International Life Sciences Institute Health and Environmental Sciences Institute. 2004b. Life stages white paper. Life Stages Task Force, Technical Committee on Agricultural Chemical Safety Assessment, ILSI Health Sciences Institute, Washington, DC.

International Life Sciences Institute Health and Environmental Sciences Institute. 2004c. The acquisition and application of absorption, distribution, metabolism, and excretion (ADME) data in agricultural chemical safety assessments. ADME Task Force, Technical Committee on Agricultural Chemical Safety Assessment, ILSI Health Sciences Institute, Washington, DC.

Katoh, M., and Katoh, M. 2006. Bioinformatics for cancer management in the post-genome era. Technol. Cancer Res. Treat. 5:169–175.

Kedderis, G. L., and Lipscomb, J. C. 2001. Application of in vitro biotransformation data and pharmacokinetic modeling to risk assessment. Toxicol. Ind. Health 17:315–321.

Kitano, H. 2005. International alliance for quantitative modeling in systems biology. Mol. Syst. Biol. 1:2005.0007. Available at http://www.nature.com/msb/journal/v1/n1/pdf/msb4100011.pdf.

Klaassen, C. D., and Eaton, D. L. 1991. Principles of toxicology. In Casarett and Doull's toxicology: The basic science of poisons, 4th ed., eds. M. O. Amdur, J. Doull, and C. D. Klaassen, pp. 12–49. New York: Pergamon Press.

Kobayashi, A., Kang, M. I., Okawa, H., Ohtsuji, M., Zenke, Y., Chiba, T., Igarashi, K., and Yamamoto, M. 2004. Oxidative stress sensor Keap1 functions as an adaptor for Cul3-based E3 ligase to regulate proteasomal degradation of Nrf2. Mol. Cell. Biol. 24:7130–7139.

Kobayashi, M., and Yamamoto, M. 2006. Nrf2-Keap1 regulation of cellular defense mechanisms against electrophiles and reactive oxygen species. Adv. Enzyme Regul. 46:113–140.

Kraska, R. C. 2001. Industrial chemicals: Regulation of new and existing chemicals (the Toxic Substances Control Act and similar worldwide chemical control laws). In Regulatory toxicology, ed. S. C. Gad, pp. 244–276. London: Taylor & Francis.

Kreiss, K., Gomaa, A., Kullman, G., Fedan, K., Simoes, E. J., and Enright, P. L. 2002. Clinical bronchiolitis obliterans in workers at a microwave-popcorn plant. N. Engl. J. Med. 347:330–338.

Krewski, D., Lemyre, L., Turner, M. C., Lee, J. E. C., Dallaire, C., Bouchard, L., Brand, K., and Mercier, P. 2006. Public perception of population health risks in Canada: Health hazards and sources of information. Hum. Ecol. Risk Assess. 12:626–644.

Kriete, A., and Eils, R. 2006. Introducing computational systems biology. In Computational systems biology, eds. A. Kriete and R. Eils, pp. 1–14. Boston: Elsevier Academic Press.

Lander, E. S., and Weinberg, R. A. 2000. Genomics: Journey to the center of biology. Science 287:1777–1782.

Landers, J. P., and Spelsberg, T. C. 1992. New concepts in steroid hormone action: Transcription factors, proto-oncogenes, and the cascade model for steroid regulation of gene expression. Crit. Rev. Eukaryot. Gene Expr. 2:19–63.

Lave, L. B., and Omenn, G. S. 1986. Cost-effectiveness of short-term tests for carcinogenicity. Nature 324:29–34.

Lave, L. B., Ennever, F. K., Rosenkranz, H. S., and Omenn, G. S. 1988. Information value of the rodent bioassay. Nature 336:631–633.

Lee, C. T., Ylostalo, J., Friedman, M., and Hoyle, G. W. 2005. Gene expression profiling in mouse lung following polymeric hexamethylene diisocyanate exposure. Toxicol. Appl. Pharmacol. 205:53–64.

Lee, M. Y., and Dordick, J. S. 2006. High-throughput human metabolism and toxicity analysis. Curr. Opin. Biotechnol. 17:619–627.

Lieber, M. M. 2006. Towards an understanding of the role of forces in carcinogenesis: A perspective with therapeutic implications. Riv. Biol. 99:131–160.

Leiss, W., ed. 2001. In the chamber of risks: Understanding risk controversies. Montreal: McGill-Queen's University Press.

Leroux, B. G., Leisenring, W. M., Moolgavkar, S. H., and Faustman, E. M. 1996. A biologically-based dose-response model for developmental toxicology. Risk Anal. 16:449–458.

Lewin, B., Cassimeris, L., Lingappa, V. R., and Plopper, G., eds. 2007. Cells. Sudbury, MA: Jones and Bartlett.

Lexchin, J. 2005. Drug withdrawals from the Canadian market for safety reasons, 1963–2004. Can. Med. Assoc. J. 172:765–767.

Li, T., Chen, W., and Chiang, J. Y. 2006. PXR induces CYP27A1 and regulates cholesterol metabolism in the intestine. J. Lipid Res. 48:373–384.

Lockey, J., McKay, R., Barth, E., Dahlsten, J., and Baughman, R. 2002. Bronchiolitis obliterans in the food flavoring manufacturing industry. Am. J. Respir. Crit. Care Med. 165:A461.

Lum, L., Yao, S., Mozer, B., Rovescalli, A., Von Kessler, D., Nirenberg, M., and Beachy, P. A. 2003. Identification of Hedgehog pathway components by RNAi in Drosophila cultured cells. Science 299:2039–2045.

Lutz, W., and Sulkowski, W. J. 2004. Vagus nerve participates in regulation of the airways: Inflammatory response and hyperreactivity induced by occupational asthmogens. Int. J. Occup. Med. Environ. Health 17:417–431.

Lydy, M., Belden, J., Wheelock, C., Hammock, B., and Denton, D. 2004. Challenges in regulating pesticide mixtures. Ecol. Society 9:1–6.

Maddox, L., and Schwartz, D. A. 2002. The pathophysiology of asthma. Annu. Rev. Med. 53:477–498.

Maroni, P., Bendinelli, P., Tiberio, L., Rovetta, F., Piccoletti, R., and Schiaffonati, L. 2003. In vivo heat-shock response in the brain: Signaling pathway and transcription factor activation. Brain Res. Mol. Brain Res. 119:90–99.

Masimirembwa, C. M., Thompson, R., and Andersson, T. B. 2001. In vitro high throughput screening of compounds for favorable metabolic properties in drug discovery. Comb. Chem. High Throughput Screen. 4:245–263.

McKinney, J. D., Richard, A., Waller, C., Newman, M. C., and Gerberick, F. 2000. The practice of structure activity relationships (SAR) in toxicology. Toxicol. Sci. 56:8–17.

McMahon, M., Thomas, N., Itoh, K., Yamamoto, M., and Hayes, J. D. 2006. Dimerization of substrate adaptors can facilitate cullin-mediated ubiquitylation of proteins by a "tethering" mechanism: A two-site interaction model for the Nrf2–Keap1 complex. J. Biol. Chem. 281:24756–24768.

Meek, M. E., Bucher, J. R., Cohen, S. M., Dellarco, V., Hill, R. N., Lehman-McKeeman, L. D., Longfellow, D. G., Pastoor, T., Seed, J., and Patton, D. 2003. A framework for human relevance analysis of information on carcinogenic modes of action. Crit. Rev. Toxicol. 33:591–653.

Meister, G., and Tuschl, T. 2004. Mechanisms of gene silencing by double-stranded RNA. Nature 431:343–349.

Mello, C. C., and Conte, D. Jr. 2004. Revealing the world of RNA interference. Nature 431:338–342.

Michiels, F., van Es, H., van Rompaey, L., Merchiers, P., Francken, B., Pittois, K., van der Schueren, J., Brys, R., Vandersmissen, J., Beirinckx, F., Herman, S., Dokic, K., Klaassen, H., Narinx, E., Hagers, A., Laenen, W., Piest, I., Pavliska, H., Rombout, Y., Langemeijer, E., Ma, L., Schipper, C., Raeymaeker, M. D., Schweicher, S., Jans, M., van Beeck, K., Tsang, I. R., van de Stolpe, O., Tomme, P., Arts, G. J., and Donker, J. 2002. Arrayed adenoviral expression libraries for functional screening. Nat. Biotechnol. 20:1154–1157.

Moolgavkar, S. H., and Luebeck, G. 1990. Two-event model for carcinogenesis: Biological, mathematical, and statistical considerations. Risk Anal. 10:323–341.

Motohashi, H., and Yamamoto, M. 2004. Nrf2-Keap1 defines a physiologically important
stress response mechanism. Trends Mol. Med. 10:549–557.

Nakajima, H., and Takatsu, K. 2006. Role of cytokines in allergic airway inflammation. Int. Arch. Allergy Immunol. 142:265–273.

Nebert, D. W. 1994. Drug-metabolizing enzymes in ligand-modulated transcription. Biochem. Pharmacol. 47:25–37.

National Research Council. 1977. Drinking water and health, Vol. 1. Washington, DC: National Academy Press.

National Research Council. 1983. Risk assessment in the federal government: Managing the process. Washington, DC: National Academy Press.

National Research Council. 1988. Complex mixtures: Methods for in vivo toxicity testing. Washington, DC: National Academy Press.

National Research Council. 1989. Improving risk communication. Washington, DC: National Academy Press.

National Research Council. 1993. Issues in risk assessment. Washington, DC: National Academy Press.

National Research Council. 1996. Understanding risk: Informing decisions in a democratic society. Washington, DC: National Academy Press.

National Research Council. 1999. Hormonally active agents in the environment. Washington, DC: National Academy Press.

National Research Council. 2000. Scientific frontiers in developmental toxicology and risk assessment. Washington, DC: National Academy Press.

National Research Council. 2006a. Toxicity testing for assessment of environmental agents: Interim report. Washington, DC: The National Academies Press.

National Research Council. 2006b. Human biomonitoring for environmental chemicals. Washington, DC: The National Academies Press.

National Research Council. 2006c. Review of the Department of Energy's Genomics: GTL program. Washington, DC: The National Academies Press.

National Toxicology Program. 2004. The NTP vision: Toxicology in the 21st century: The role of the National Toxicology Program. National Toxicology Program, National Institute for Environmental Health Sciences, Research Triangle Park, NC. Available at http://ntp.niehs.nih.gov/ntp/main_pages/NTPVision.pdf.

National Toxicology Program. 2005a. History of NTP. National Toxicology Program. Available at http://ntp-server.niehs.nih.gov.

National Toxicology Program. 2005b. Report on carcinogens, 11th ed. U.S. Department of Health and Human Services, Public Health Service, National Toxicology Program. Available at http://ntp.niehs.nih.gov/ntp/roc/toc11.html.

National Toxicology Program. 2006. Current directions and evolving strategies. Research Triangle Park, NC: National Toxicology Program, National Institute of Environmental Health Sciences, National Institutes of Health. Available at http://ntp.niehs.nih.gov/files/NTP_CurrDir20061.pdf.

Nel, A., Xia, T., Madler, L., and Li, N. 2006. Toxic potential of materials at the nanolevel. Science 311:622–627.

Nordstrand, L. M., Ringvoll, J., Larsen, E., and Klungland, A. 2007. Genome instability and DNA damage accumulation in gene-targeted mice. Neuroscience 145:1309–1317.

O'Brien, P., and Haskins, J. R. 2007. In vitro cytotoxicity assessment. Methods Mol. Biol. 356:415–425.

O'Donoghue, S. I., Russell, R. B., and Schafferhans, A. 2006. Three-dimensional structures in target drug discovery and validation. In In silico technologies in drug target identification and validation, 6th ed., eds. D. Leon and S. Markel, pp. 285–308. Boca Raton, FL: CRC Press.

Olsen, L., Rydberg, P., Rod, T. H., and Ryde, U. 2006. Prediction of activation energies for hydrogen abstraction by cytochrome P450. J. Med. Chem. 49:6489–6499.

Organization for Economic Cooperation and Development. 2005. Guidance document on the validation and international acceptance of new or updated test methods for hazard assessment. OECD Series on Testing and Assessment No. 34. ENV/JM/Mono(2005)14.
Paris: Organisation for Economic Co-operationand Development. Available at http://appli1.oecd.org/olis/2005doc.nsf/linkto/env-jm-mono(2005)14.

Organization for Economic Cooperation and Development. 2006. The OECD: Organization for Economic Co-operation and Development. Organization of Economic Co-operation and Development. Available at http://www.oecd.org/dataoecd/15/33/34011915.pdf.

Orton, R. J., Sturm, O. E., Vyshemirsky, V., Calder, M., Gilbert, D. R., and Kolch, W. 2005. Computational modeling of the receptor-tyrosine-kinase-activated MAPK pathway. Biochem. J. 392(pt. 2):249–261.

Paans, A. M., and Vaalburg, W. 2000. Positron emission tomography in drug development and drug evaluation. Curr. Pharm. Des. 6:1583–1591.

Pabst, R. 2002. Animal models for asthma: Controversial aspects and unsolved problems. Pathobiology 70:252–254.

Pallardy, M., Kerdine, S., and Lebrec, H. 1998. Testing strategies in immunotoxicology. Toxicol. Lett. 102–103:257–260.

Pandya, R. J., Solomon, G., Kinner, A., and Balmes, J. R. 2002. Diesel exhaust and asthma: Hypotheses and molecular mechanisms of action. Environ. Health Perspect. 110(suppl. 1):103–112.

Park, R. M., and Stayner, L. T. 2006. A search for thresholds and other nonlinearities in the relationship between hexavalent chromium and lung cancer. Risk Anal. 26:79–88.

Pauluhn, J. 2005. Overview of inhalation exposure techniques: Strengths and weaknesses. Exp. Toxicol. Pathol. 57(suppl. 1):111–128.

Peeters, J. K., and Van der Spek, P. J. 2005. Growing applications and advancements in microarray technology and analysis tools. Cell Biochem. Biophys. 43:149–166.

Potier, M., Lakhdar, B., Merlet, D., and Cambar, J. 1995. Interest and limits of human tissue and cell use in pharmacotoxicology. Cell Biol. Toxicol. 11:133–139.

Poulin, P., and Theil, F. P. 2002. Prediction of pharmacokinetics prior to in vivo studies. II. Generic physiologically based pharmacokinetic models of drug disposition. J. Pharm. Sci. 91:1358–1370.

Powell, M. C., and Kanarek, M. S. 2006. Nanomaterial health effects, Part 1: Background and current knowledge. Wisconsin Med. J. 105:16–20.

Ptacek, T., and Sell, S. M. 2005. A tiered approach to comparative genomics. Brief. Funct. Genomic. Proteomic. 4:178–185.

Rehmann, S., and Jayson, G. C. 2005. Molecular imaging of antiangiogenic agents. Oncologist 10:92–103.

Reitz, R. H., Mendrala, A. L., Corley, R. A., Quast, J. F., Gargas, M. L., Andersen, M. E., Staats, D. A., and Conolly, R. B. 1990. Estimating the risk of liver cancer associated with human exposures to chloroform using physiologically based pharmacokinetic modeling. Toxicol. Appl. Pharmacol. 105:443–459.

Renwick, A. G., Barlow, S. M., Hertz-Picciotto, I., Boobis, A. R., Dybing, E., Edler, L., Eisenbrand, G., Greig, J. B., Kleiner, J., Lambe, J., Muller, D. J., Smith, M. R., Tritscher, A., Tuijtelaars, S., van den Brandt, P. A., Walter, R., and Kroes, R. 2003. Risk characterization of chemicals in food and diet. Food Chem. Toxicol. 41:1211–1271.

Rieger, T. R., Morimoto, R. I., and Hatzimanikatis, V. 2005. Mathematical modeling of the eukaryotic heat-shock response: Dynamics of the hsp70 promoter. Biophys. J. 88:1646–1658.

Roberts, S. A. 2001. High-throughput screening approaches for investigating drug metabolism and pharmacokinetics. Xenobiotica 31:557–589.

Rochette-Egly, C. 2003. Nuclear receptors: Integration of multiple signaling pathways through phosphorylation. Cell. Signal. 15:355–366.

Russell, W. M. S., and Burch, R. L. 1959. The principles of humane experimental technique. London: Methuen.

Sachs, K., Perez, O., Pe'er, D., Lauffenburger, D. A., and Nolan, G. P. 2005. Causal protein-signaling networks derived from multiparameter single-cell data. Science 308:523–529.

Santos, S. D., Verveer, P. J., and Bastiaens, P. I. 2007. Growth factor-induced MAPK network topology shapes Erk response determining PC-12 cell fate. Nat. Cell Biol. 9:324–330.

Sarangapani, R., Teeguarden, J., Plotzke, K. P., McKim, J. M., Jr., and Andersen, M. E. 2002. Dose-response modeling of cytochrome P450 induction in rats by octamethylcyclotetrasiloxane. Toxicol. Sci. 67:159–172.

Schilter, B., Marin-Kuan, M., Delatour, T., Nestler, S., Mantle, P., and Cavin, C. 2006. Ochratoxin A: Potential epigenetic mechanisms of toxicity and carcinogenicity. Food Addit. Contam. 22(suppl. 1):88–93.

Schleger, C., Platz, S. J., and Deschl, U. 2004. Development of an in vitro model for vascular injury with human endothelial cells. ALTEX 21(suppl. 3):12–19.

Schultz, T. W., and Seward, J. R. 2000. Health effects related structure–toxicity relationships: A paradigm for the first decade of the new millennium. Sci. Total Environ. 249:73–84.

Schultz, T. W., Sinks, G. D., and Bearden, A. P. 1998. QSAR in aquatic toxicology: A mechanism of action approach comparing toxic potency to Pimephales promelas, Tetrahymena pyriformis, and Vibrio fischeri. In Comparative QSAR, ed. J. Devillers, pp. 51–110. London: Taylor & Francis.

Sergeant, A. 2002. Ecological risk assessment: History and fundamentals. In Human and ecological risk assessment: Theory and practice, ed. D. J. Paustenbach, pp. 369–442. New York: John Wiley and Sons.

Sheikh, M. S., Hollander, M. C., and Fornace, A. J., Jr. 2000. Role of Gadd45 in apoptosis. Biochem. Pharmacol. 59:43–45.

Shi, M. M., Bleavins, M. R., and de la Iglesia, F. A. 1999. Technologies for detecting genetic polymorphisms in pharmacogenomics. Mol. Diagn. 4:343–351.

Simon-Hettich, B., Rothfuss, A., and Steger-Hartmann, T. 2006. Use of computer-assisted prediction of toxic effects of chemical substances. Toxicology 224:156–162.

Singh, K. P., and DuMond, J. W., Jr. 2007. Genetic and epigenetic changes induced by chronic low dose exposure to arsenic of mouse testicular Leydig cells. Int. J. Oncol. 30:253–260.

Slikker, W. Jr., Andersen, M. E., Bogdanffy, M. S., Bus, J. S., Cohen, S. D., Conolly, R. B., David, R. M., Doerrer, N. G., Dorman, D. C., Gaylor, D. W., Hattis, D., Rogers, J. M., Setzer, R. W., Swenberg, J. A., and Wallace, K. 2004a. Dose-dependent transitions in mechanisms of toxicity: Case studies. Toxicol. Appl. Pharmacol. 201:226–294.

Slikker, W. Jr., Andersen, M. E., Bogdanffy, M. S., Bus, J. S., Cohen, S. D., Conolly, R. B., David, R. M., Doerrer, N. G., Dorman, D. C., Gaylor, D. W., Hattis, D., Rogers, J. M., Setzer, R. W., Swenberg, J. A., and Wallace, K. 2004b. Dose-dependent transitions in mechanisms of toxicity. Toxicol. Appl. Pharmacol. 201:203–225.

Slikker, W., Xu, Z., and Wang, C. 2005. Application of a systems biology approach to developmental neurotoxicology. Reprod. Toxicol. 19:305–319.

Snitkin, E. S., Gustafson, A. M., Mellor, J., Wu, J., and DeLisi, C. 2006. Comparative assessment of performance and genome dependence among phylogenetic profiling methods. BMC Bioinformatics 7:420.

Soffers, A. E., Boersma, M. G., Vaes, W. H., Vervoort, J., Tyrakowska, B., Hermens, J. L., and Rietjens, I. M. 2001. Computer-modeling-based QSARs for analyzing experimental data on biotransformation and toxicity. Toxicol. In Vitro 15:539–551.

Spielmann, H. 2005. Predicting the risk of developmental toxicity from in vitro assays. Toxicol. Appl. Pharmacol. 207(suppl. 2):375–380.

Spielmann, H., and Liebsch, M. 2001. Lessons learned from validation of in vitro toxicity test: From failure to acceptance into regulatory practice. Toxicol. In Vitro 15:585–590.

Stavanja, M. S., Ayres, P. H., Meckley, D. R., Bombick, E. R., Borgerding, M. F., Morton, M. J., Garner, C. D., Pence, D. H., and Swauger, J. E. 2006. Safety assessment of high fructose corn syrup (HFCS) as an ingredient added to cigarette tobacco. Exp. Toxicol. Pathol. 57:267–281.

Stephens, S. M., and Rung, J. 2006. Advances in systems biology: Measurement, modeling and representation. Curr. Opin. Drug Discov. Devel. 9:240–250.

Subramaniam, R. P., Crump, K. S., Chen, C., White, P., Van Landingham, C., Fox, J. F., Schlosser, P., Covington, T. R., DeVoney, D., Vandenberg, J. J., Preuss, P., and Whalan, J. 2006. The role of mutagenicity in describing formaldehyde-induced carcinogenicity: Possible inferences using the CIIT model. Presented at the Society for Risk Analysis Annual Meeting, December 3–6, Baltimore, MD.

Subramanya, S., and Mensa-Wilmot, K. 2006. Regulated cleavage of intracellular glycosylphosphatidylinositol in a trypanosome: Peroxisome-to-endoplasmic reticulum translocation of a phospholipase C. Fed. Eur. Biochem. Soc. J. 273:2110–2126.

Suemori, H. 2006. Establishment and therapeutic use of human embryonic stem cell lines. Hum. Cell 19:65–70.

Suzuki, N., Higashiguchi, A., Hasegawa, Y., Matsumoto, H., Oie, S., Orikawa, K., Ezawa, S., Susumu, N., Miyashita, K., and Aoki, D. 2005. Loss of integrin alpha3 expression is associated with acquisition of invasive potential by ovarian clear cell adenocarcinoma cells. Hum. Cell 18:147–155.

Teuschler, L., Klaunig, J., Carney, E., Chambers, J., Conolly, R., Gennings, C., Giesy, J., Hertzberg, R., Klaassen, C., Kode, R., Paustenbach, D., and Yang, R. 2005. Support of science-based decisions concerning the evaluation of the toxicology of mixtures: A new beginning. Charting the future: Building the scientific foundation for mixtures joint toxicity and risk assessment, Atlanta, GA, Society of Toxicology sponsored meeting: Contemporary concepts in toxicology. Available at http://www.toxicology.org/ai/meet/MixturesWhitePapers.doc.

Theil, F. P., Guentert, T. W., Haddad, S., and Poulin, P. 2003. Utility of physiologically based pharmacokinetic models to drug development and rational drug discovery candidate selection. Toxicol. Lett. 138:29–49.

Thummel, C. S. 2002. Ecdysone-regulated puff genes 2000. Insect Biochem. Mol. Biol. 32:113–120.

Timsit, Y. E., and Negishi, M. 2006. CAR and PXR: The xenobiotic-sensing receptors. Steroids 72:231–246.

Tong, W., Welsh, W. J., Shi, L., Fang, H., and Perkins, R. 2003. Structure–activity relationship approaches and applications. Environ. Toxicol. Chem. 22:1680–1695.

U.S. Environmental Protection Agency. 1991. Guidelines for developmental toxicity risk assessment. EPA/600/FR-91/001. Washington, DC: Risk Assessment Forum, U.S. Environmental Protection Agency. Available at http://www.epa.gov/NCEA/raf/pdfs/devtox.pdf.

U.S. Environmental Protection Agency. 1996. Guidelines for reproductive toxicity risk assessment. EPA/630/R-96/009. Washington, DC: Risk Assessment Forum, U.S. Environmental Protection Agency. Available at http://www.epa.gov/ncea/raf/pdfs/repro51.pdf.

U.S. Environmental Protection Agency. 1998a. Guidelines for ecological risk assessment. EPA/630/R-95/002F. Washington, DC: Risk Assessment Forum, U.S. Environmental Protection Agency. Available at oaspub.epa.gov/eims/eimscomm.getfile?p_download_id=36512.

U.S. Environmental Protection Agency. 1998b. Guidelines for neurotoxicity risk assessment. EPA/630/R-95/001F. Washington, DC: Risk Assessment Forum, U.S. Environmental Protection Agency. Available at http://www.epa.gov/ncea/raf/pdfs/neurotox.pdf.

U.S. Environmental Protection Agency. 1998c. Health effects test guidelines: OPPTS 870.4300 combined chronic toxicity/carcinogenicity. EPA 712-C-98-212. Washington, DC: Office of Prevention, Pesticides and Toxic Substances, U.S. Environmental Protection Agency. Available at http://www.epa.gov/opptsfrs/publications/OPPTS_Harmonized/870_Health_Effects_Test_Guidelines/Series/870-4300.pdf.

U.S. Environmental Protection Agency. 2002. A review of the reference dose and reference concentration processes. Final report. EPA/630/P-02/002F. Washington, DC: Risk Assessment Forum, U.S. Environmental Protection Agency. Available at http://www.epa.gov/IRIS/RFD_FINAL%5B1%5D.pdf.

U.S. Environmental Protection Agency. 2005a. EPA commemorates its history and celebrates its 35th anniversary. U.S. Environmental Protection Agency. Available at http://www.epa.gov/history/.

U.S. Environmental Protection Agency. 2005b. Food Quality Protection Act (FQPA) of 1996. U.S. Environmental Protection Agency, Office of Pesticides. Available at http://www.epa.gov/opp00001/regulating/laws/fqpa/index.htm.

U.S. Environmental Protection Agency. 2005c. Guidelines for carcinogen risk assessment. EPA/630/P-03/001F. Washington, DC: Risk Assessment Forum, U.S. Environmental Protection Agency. Available at http://www.epa.gov/raf/publications/pdfs/CANCER_GUIDELINES_FINAL_3-25-05.PDF.

U.S. Environmental Protection Agency. 2006a. Highlights of the Food Quality Protection Act of 1996. Office of Pesticides, U.S. Environmental Protection Agency. Available at http://www.epa.gov/pesticides/regulating/laws/fqpa/fqpahigh.htm.

U.S. Environmental Protection Agency. 2006b. Science and research budgets for the U.S. Environmental Protection Agency for fiscal year 2007: An advisory report by the Science Advisory Board. EPA-SAB-ADV-06-003. Washington, DC: U.S. Environmental Protection Agency. Available at http://yosemite.epa.gov/sab/sabproduct.nsf/36a1ca3f683ae57a85256ce9006a32d0/0EDAAECA1096A5B0852571450072E33E/$File/sab-adv-06-003.pdf.

U.S. Environmental Protection Agency. 2006c. Pesticides: Science and policy. Office of Pesticides, U.S. Environmental Protection Agency. Available at http://www.epa.gov/pesticides/science/index.htm.

U.S. Environmental Protection Agency. 2007. Comments on EPA's strategic research directions and research budget for FY 2008: An advisory report of the U.S. Environmental Protection Agency Science Advisory Board. EPA-SAB-ADV-07-004. Washington, DC: U.S. Environmental Protection Agency. Available at http://yosemite.epa.gov/sab/sabproduct.nsf/997517EFA5FC48798525729F0073B4D4/$File/sab-07-004.pdf.

van den Broek, L. A., Lazaro, E., Zylicz, Z., Fennis, P. J., Missler, F. A., Lelieveld, P., Garzotto, M., Wagener, D. J., Ballesta, J. P., and Ottenheijm, H. C. 1989. Lipophilic analogues of sparsomycin as strong inhibitors of protein synthesis and tumor growth: A structure–activity relationship study. J. Med. Chem. 32:2002–2015.

Van den Berg, M., Birnbaum, L., Bosveld, A. T., Brunstrom, B., Cook, P., Feeley, M., Giesy, J. P., Hanberg, A., Hasegawa, R., Kennedy, S. W., Kubiak, T., Larsen, J. C., van Leeuwen, F. X., Liem, A. K., Nolt, C., Peterson, R. E., Poellinger, L., Safe, S., Schrenk, D., Tillitt, D., Tysklind, M., Younes, M., Waern, F., and Zacharewski, T. 1998. Toxic equivalency factors (TEFs) for PCBs, PCDDs, PCDFs for humans and wildlife. Environ. Health Perspect. 106:775–792.

Van Wuytswinkel, O., Reiser, V., Siderius, M., Kelders, M. C., Ammerer, G., Ruis, H., and Mager, W. H. 2000. Response of Saccharomyces cerevisiae to severe osmotic stress: Evidence for a novel activation mechanism of the HOG MAP kinase pathway. Mol. Microbiol. 37:382–397.

Vedani, A. 1999. Replacing animal testing by virtual experiments: A challenge in computational biology. Chimia 53:227–228.

Volarath, P., Wang, H., Fu, H., and Harrison, R. 2004. Knowledge-based algorithms for chemical structure and property analysis. Conf. Proc. IEEE Eng. Med. Biol. Soc. 4:3011–3014.

Walker, J. D., Enache, M., and Dearden, J. C. 2003a. Quantitative cationic-activity relationships for predicting toxicity of metals. Environ. Toxicol. Chem. 22:1916–1935.

Walker, J. D., Jaworska, J., Comber, M. H., Schultz, T. W., and Dearden, J. C. 2003b. Guidelines for developing and using quantitative structure–activity relationships. Environ. Toxicol. Chem. 22:1653–1665.

Walker, J. D., ed. 2004. Quantitative structure–activity relationships for pollution prevention, toxicity screening, risk assessment, and web applications (QSAR II). Pensacola, FL: SETAC Press.

Wang, X. J., Hayes, J. D., and Wolf, C. R. 2006a. Generation of a stable antioxidant response element-driven reporter gene cell line and its use to show redox-dependent activation of Nrf2 by cancer chemotherapeutic agents. Cancer Res. 66:10983–10994.

Wang, S. L., Lan, F. H., Zhuang, Y. P., Li, H. Z., Huang, L. H., Zheng, D. Z., Zeng, J., Dong, L. H., Zhu, Z. Y., and Fu, J. L. 2006b. Microarray analysis of gene-expression profile in hepatocellular carcinoma cell, BEL-7402, with stable suppression of hLRH-1 via a DNA vector-based RNA interference. Yi Chuan Xue Bao 33:881–891.

Waring, J. F., and Ulrich, R. G. 2000. The impact of genomics-based technologies on drug safety evaluation. Annu. Rev. Pharmacol. Toxicol. 40:335–352.

Watanabe, P. G., Schumann, A. M., and Reitz, R. H. 1988. Toxicokinetics in the evaluation of toxicity data. Regul. Toxicol. Pharmacol. 8:408–413.

Waxman, D. J. 1999. P450 gene induction by structurally diverse xenochemicals: Central role of nuclear receptors CAR, PXR, and PPAR. Arch. Biochem. Biophys. 369:11–23.

Weis, B. K., Balshaw, D., Barr, J. R., Brown, D., Ellisman, M., Lioy, P., Omenn, G., Potter, J. D., Smith, M. T., Sohn, L., Suk, W. A., Sumner, S., Swenberg, J., Walt, D. R., Watkins, S., Thompson, C., and Wilson, S. H. 2005. Personalized exposure assessment: Promising approaches for human environmental health research. Environ. Health Perspect. 113:840–848.

Westerheide, S. D., and Morimoto, R. I. 2005. Heat shock response modulators as therapeutic tools for diseases of protein conformation. J. Biol. Chem. 280:33097–33100.

Woodruff, T. J., Axelrad, D. A., Kyle, A. D., Nweke, O., Miller, G. G., and Hurley, B. J. 2004. Trends in environmentally related childhood illnesses. Pediatrics 113(suppl. 4):1133–1140.

Xiao, G. G., Wang, M., Li, N., Loo, J. A., and Nel, A. E. 2003. Use of proteomics to demonstrate a hierarchical oxidative stress response to diesel exhaust particle chemicals in a macrophage cell line. J. Biol. Chem. 278:50781–50790.

Yokota, F., Gray, G., Hammitt, J. K., and Thompson, K. M. 2004. Tiered chemical testing: A value of information approach. Risk Anal. 24:1625–1639.

Zangar, R. C., Varnum, S. M., and Bollinger, N. 2005. Studying cellular processes and detecting disease with protein microarrays. Drug Metab. Rev. 37:473–487.

Zhang, D. D. 2006. Mechanistic studies of the Nrf2-Keap1 signaling pathway. Drug Metab. Rev. 38:769–789.
