Risk Analysis, Vol. 29, No. 4, 2009 DOI: 10.1111/j.1539-6924.2009.01222.x

From the Editors

Toxicity Testing in the 21st Century

This month’s lead is a summary and review of the National Research Council’s “Toxicity Testing in the 21st Century: A Vision and A Strategy.”(1) In this article, Daniel Krewski, Melvin Andersen, Ellen Mantus, and Lauren Zeise summarize a vision to advance toxicity testing and human health assessment of environmental agents. They describe how scientific advances can transform toxicity testing to allow additional assessments of potentially toxic chemicals by using more timely and more cost-effective methods, including high- and medium-throughput in vitro screening assays, computational toxicology and systems biology, along with other emerging high-content testing methodologies, such as functional genomics and transcriptomics.

Suresh Moolgavkar, our Area Editor for Health Risk Assessment, asked six experts with different perspectives to comment on the paper. Each praises the vision and offers suggestions for making it more useful. Rory Connolly argues that if we expect risk assessment to maintain high throughput and be accurate, then there is a need to address the issues of microdosimetry, adaptive responses, and homeostasis. E. Donald Elliott focuses on the regulatory perspective, wondering why a regulator would ever take the political and legal risk to be the first to base an actual regulatory decision on the new model, and whether a judge would uphold a regulatory decision based on the new vision. Elliott argues for a legally sophisticated group or institution to take up the issues where the NAS Committee left off and fill in the gaps so that Toxicity Testing in the 21st Century can actually be used by regulatory authorities. Dale Hattis notes that while high-throughput testing may ultimately be of substantial value, its findings will not be sufficient for higher-profile decisions on major agents in commerce involving complex tradeoffs of risks and economic impacts for different policy options. For these, a system that quantitatively assesses actual health risks and the large associated uncertainties will be essential.

A commentary by Robert H. Kavlock and colleagues acknowledges the challenges laid out in the Krewski et al. perspective, and describes the NIH/EPA collaboration called Tox21. With four focus groups devoted to different components of the NRC vision, the Tox21 consortium constitutes a concerted, long-term effort to identify mechanisms of chemically-induced biological activity, prioritize chemicals for more extensive evaluation, and develop more predictive models of in vivo biological response. Lorenz R. Rhomberg urges Risk Analysis readers to read the full NRC report, focusing on a careful consideration of the ways that risk assessment will have to change to deal with the new testing approaches. He highlights his view that the new vision consists of more than new testing technologies, but is based on a change in the questions that toxicology addresses, that is, a shift toward identifying causes and then inferring possible effects. The final commentary by Joyce Tsuji stresses the difficulties of developing in vitro assays that can predict in vivo outcomes with adequate sensitivity and specificity and discusses challenges for public health decision-makers in dealing with uncertainty.

Krewski et al. briefly reply to each commentary and encourage us to use their paper and the accompanying commentaries as a starting point for thinking about a more complete evaluation of the future directions for toxicity testing as set out in the full NRC report. We’re pleased with this set of papers and hope that you will consider proposing similar sets of papers to us.

The other papers in this issue examine terrorism, food contamination, endangered species, and other risk-related challenges. Yacov Haimes, our Area Editor for Engineering, discussed the meaning of “vulnerability” in a 2006 article in Risk Analysis.(2) His perspective article in this issue examines what we mean by “resilience.” He considers existing definitions and arrives at one that will prove useful to practitioners. Terje Aven and Ortwin Renn, funded by Norway’s Research Council, consider the utility of traditional quantitative risk assessments for complex risk issues, such as terrorism, in which the circumstances are both ambiguous and uncertain. They conclude that traditional Bayesian and other approaches provide an incomplete picture of risk. Instead, they suggest approaches grounded in qualitative methods that characterize the breadth of uncertainty and in scenarios validated through consistency, psychological empathy with the main players, congruence with past trends, and plausibility. Their scenario-generating approach includes processes aimed at enhancing imagination, using game-theoretical experiments to simulate interactive variables, and applying role playing to stimulate empathy. While favoring a more qualitative approach, they acknowledge that quantitative methods have a legitimate role to play in broader risk characterization and scenario approaches, as long as the limits to their validity and reliability are noted.

Fear of food contamination is common, and we present four papers that address these increasingly high-profile risks. Funded by ZonMW and the Dutch Ministry of Public Health, Esther van Asselt et al. focus on cross-contamination and undercooking. They observed home preparation of chicken-curry salad, finding a wide range of microbial contamination levels in the final salad, caused by various cross-contamination practices and widely varying heating times. Model predictions indicated that cooking times should be at least 8 minutes and cutting boards need to be changed after cutting raw chicken in order to obtain safe bacterial levels in the final salad. The model predicted around 75% of the variance in cross-contamination. The authors suggest that model accuracy can further be improved by including other cross-contamination routes besides hands, cutting boards, and knives, and that the results be used as a worst-case estimate for evaluating cross-contamination in the home.
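
To make the kind of calculation behind such a model concrete, here is a minimal Monte Carlo sketch of combined cross-contamination and undercooking exposure. All names and parameter values (initial load, transfer fraction, D-value, heating times) are illustrative assumptions, not the estimates or the model of van Asselt et al.

```python
# Minimal sketch of a cross-contamination/undercooking exposure calculation,
# in the spirit of the model described above. Every parameter value here is a
# hypothetical placeholder, not one estimated by van Asselt et al.
import numpy as np

rng = np.random.default_rng(0)
n_sim = 10_000

# Pathogen load on raw chicken (log10 CFU per portion), assumed distribution.
log_load_raw = rng.normal(loc=4.0, scale=1.0, size=n_sim)

# Fraction transferred to the salad via hands/cutting board/knife (assumed).
transfer_fraction = rng.beta(a=0.5, b=20.0, size=n_sim)

# Survival on the chicken after cooking, using a simple first-order (D-value)
# model: each D-value of heating gives a 1-log10 reduction (assumed kinetics).
cooking_time_min = rng.uniform(2.0, 12.0, size=n_sim)  # observed heating times (assumed range)
d_value_min = 0.5                                       # minutes per log10 reduction (assumed)
log_reduction = cooking_time_min / d_value_min

# Final contamination of the salad: cross-contamination plus undercooking survivors.
cfu_cross = 10 ** log_load_raw * transfer_fraction
cfu_survivors = 10 ** (log_load_raw - log_reduction)
cfu_salad = cfu_cross + cfu_survivors

print(f"P(final salad contains > 1 CFU) = {np.mean(cfu_salad > 1):.2f}")
print(f"median log10 CFU in salad = {np.median(np.log10(cfu_salad + 1e-12)):.2f}")
```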

The first case of bovine spongiform encephalopathy (BSE) was detected in the United Kingdom in 1986, and corporate and international ramifications are still unfolding. Funded by the Dutch Ministry of Agriculture, Nature, and Food Quality, Clazien J. de Vos and Lourens Heres note that the ban on the use of meat and bone meal (MBM) in livestock feed reduced the spread of BSE. But now that the BSE epidemic is fading out, should there be a partial lifting of the MBM ban? Developing a simulation model that considers three infection pathways, they find that cross-contamination in the feed mill is the riskiest pathway. Combining model results, they conclude that the risk of using some MBM is very low.

Ides Boone et al. focus on the quality of parameters and the probability of health risks associated with eating contaminated food. They select 101 parameters spanning the life cycle, beginning with primary production and including transport, holding, and the slaughterhouse; followed by post-processing, distribution, and storage; and ending with meal preparation and consumption.

Xiao-Wei Lin et al. construct a risk model to predict the diffusion of foot-and-mouth disease (FMD) caused by passengers carrying meat products from cloven-hoofed animals across international boundaries. Employing recent data from an international airport in Taiwan, they build a mathematical model to simulate the probability distributions of disease prevalence and of FMD virus remaining in meat products after processing, and they estimate virus survival. Notably, they report that illegal transport varies by season and that the odds of interception by trained animals are higher than those by customs agents.
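
A model of this kind essentially multiplies a chain of conditional probabilities. The sketch below shows that structure; every probability and count in it is an assumed placeholder, not a figure from Lin et al.

```python
# Schematic of the multiplicative risk chain behind an FMD-introduction model:
# P(introduction per carried product) = P(source animal infected)
#   x P(virus survives processing) x P(virus viable on arrival) x P(not intercepted).
# All values below are illustrative assumptions, not the paper's estimates.
prevalence = 0.02             # disease prevalence in source herds (assumed)
p_survive_processing = 0.10   # virus survives meat processing (assumed)
p_survive_transport = 0.50    # virus still viable on arrival (assumed)
p_intercept_dogs = 0.60       # interception by trained detector animals (assumed)
p_intercept_customs = 0.20    # interception by customs agents (assumed)

p_not_intercepted = (1 - p_intercept_dogs) * (1 - p_intercept_customs)
p_intro_per_item = (prevalence * p_survive_processing
                    * p_survive_transport * p_not_intercepted)

items_per_season = 500        # illegally carried meat items per season (assumed)
expected_intros = items_per_season * p_intro_per_item
print(f"P(introduction per carried item) = {p_intro_per_item:.6f}")
print(f"Expected contaminated items entering per season = {expected_intros:.2f}")
```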

The remaining papers in this issue consider a variety of important risk-related issues and tools. Endangered species protection is a risk management issue in North America, and the precautionary principle is often cited as an ethical basis for protecting species. Funded by the U.S. National Science Foundation and Fisheries and Oceans Canada, Robin Gregory and Graham Long use a structured decision-making (SDM) approach to bound the management problem, define objectives and performance measures, develop precautionary management alternatives, and evaluate the consequences of management actions. They highlight how strategy tables were used by a stakeholder committee charged with protection of endangered Cultus Lake salmon on the Canadian west coast.

Tony Cox examines chronic obstructive pulmonary disease (COPD), which is the fourth leading cause of death worldwide. COPD is associated with smoking, but many who smoke do not develop the disease, and the disease continues to progress in those who stop smoking. Funded by Philip Morris, Cox builds a risk model based on the assumption of a protease-antiprotease imbalance in the lung, leading to ongoing proteolysis (digestion) of lung tissue by excess proteases.

Funded partly by NIOSH, Robert Noble, A. John Bailer, and Robert Park point out that model averaging in risk assessments has become more common. Their paper tests a protocol for selecting the models that best fit the data. The authors focus on evaluating the impact of coal dust on lung function in a cohort of exposed miners. After eliminating nearly all the models and combinations of models as not fitting their criteria, they observe that the remaining models yield benchmark concentrations that differ by a factor of 2 to 9, depending on the concentration metric and covariates. Their approach is a useful strategy for addressing model uncertainty.
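
As a rough illustration of what model averaging of this sort involves, the sketch below fits a few candidate dose-response curves to synthetic data and combines their predictions with Akaike weights. The candidate models, weighting scheme, and data are assumptions for illustration; they are not the protocol, selection criteria, or coal-dust data of Noble, Bailer, and Park.

```python
# Minimal sketch of model averaging for a dose-response analysis.
# Candidate models, AIC weighting, and synthetic data are illustrative only.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)
conc = np.linspace(0.5, 10, 40)                             # exposure concentration (assumed units)
response = 100 - 3.0 * conc + rng.normal(0, 2, conc.size)   # synthetic lung-function decrement

models = {
    "linear":    lambda x, a, b: a + b * x,
    "quadratic": lambda x, a, b, c: a + b * x + c * x ** 2,
    "power":     lambda x, a, b, c: a + b * x ** c,
}

def aic(y, yhat, k):
    """Simple RSS-based AIC approximation."""
    n = y.size
    rss = np.sum((y - yhat) ** 2)
    return n * np.log(rss / n) + 2 * k

fits, aics = {}, {}
for name, f in models.items():
    k = f.__code__.co_argcount - 1                 # number of fitted parameters
    params, _ = curve_fit(f, conc, response, p0=np.ones(k), maxfev=10_000)
    fits[name] = (f, params)
    aics[name] = aic(response, f(conc, *params), k)

# Akaike weights -> model-averaged prediction at a target concentration.
delta = np.array([aics[m] - min(aics.values()) for m in models])
weights = np.exp(-0.5 * delta) / np.sum(np.exp(-0.5 * delta))
target = 5.0
preds = np.array([f(target, *p) for f, p in fits.values()])

print("Akaike weights:", dict(zip(models, np.round(weights, 3))))
print(f"Model-averaged prediction at conc={target}: {np.dot(weights, preds):.2f}")
```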

Hospitals are no longer considered safe places by many people. Funded by the French National Research Agency, Mathieu Emily et al. develop a coefficient that quantifies the risk associated with hospital departments, permitting comparisons of similar departments. The adjustment coefficient characterizes the tail of the distribution of the total patient length of stay in a department before the first disease event occurs. They provide an approximation for the tail of the distribution and illustrate it by examining the risk associated with a standard patient safety indicator in 20 southeastern French hospitals.
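
For readers unfamiliar with the term, “adjustment coefficient” is borrowed from ruin theory, where it is defined by a moment condition and governs an exponential bound on a tail probability. The generic statement below illustrates that idea; it is not claimed to be the exact formulation used by Emily et al.

```latex
% Generic adjustment coefficient in the ruin-theory sense (illustrative only).
% For a random walk with i.i.d. increments X_i of negative mean, the adjustment
% coefficient \gamma > 0 is the positive root of the moment condition below, and
% Lundberg's inequality bounds the tail of the all-time maximum S of the walk.
\[
  \mathbb{E}\!\left[e^{\gamma X_1}\right] = 1
  \qquad\Longrightarrow\qquad
  \Pr(S > s) \;\le\; e^{-\gamma s}, \quad s \ge 0.
\]
```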

Finally, Natarajan Krishnamurthy reviews Risk: Philosophical Perspectives (2007), edited by Tim Lewens, and recommends the collection of essays for those interested in the philosophical and psychological underpinnings of risk assessment and management.

Michael Greenberg
Karen Lowrie

REFERENCES

1. National Research Council. Toxicity Testing in the 21st Century: A Vision and a Strategy. Washington, DC: National Academy Press, 2007.

2. Haimes YY. On the definition of vulnerabilities in measuring risks to infrastructures. Risk Analysis, 2006; 26(2):293–296.