

Science in the Administrative Process

Annotated Recommendations – March 4, 2013

Wendy Wagner, U. of Texas School of Law

This document identifies the most significant differences between the Staff revised

recommendations and the recommendations in the report. Passages in red and underlined are

new text that is outside the scope of the report.

I. Best Practices

1. Transparent Ex Ante Design of Risk Assessment: At an early stage in their regulatory

processes, agencies should articulate the specific policy questions that may be informed by

science; specify how the risk assessment will integrate available information, including the

criteria to be used in reviewing and weighing existing studies and approaches; identify

other a priori analytical choices; assess the available evidence bearing on these policy-

relevant questions; apply the evidence to the policy questions at issue (with robust

statements of material uncertainties and assumptions); and identify plausible scientific

approaches to inform policy alternatives. Agencies should maintain a clear distinction

between assessment of risks and review of risk management alternatives.

[The original recommendation in the report reads:

All significant science-policy choices made by an agency in

reaching a decision should be identified and explained in clear and

understandable terms. In order to provide this heightened level of

transparency, agencies should consider following an analytical process

that: a) identifies the policy-relevant questions that can be informed by

science; b) identifies in advance a study design, such as criteria for

weighting individual studies, as well as identifying other a priori analytical

choices, like stopping rules; c) provides a synthesis of the available

evidence and relevant literature guided by this study design; and d)

identify other significant assumptions, choices of analytical techniques,

and remaining uncertainties and how different plausible choices might

change the resulting policy decision. If possible, the agency should also

follow the model of the NAAQS policy assessment in bridging science

and policy in a final report, although this final step will likely involve

more effort and experimentation.

Making these analytical steps explicit may not be practicable in

some science-policy decisions and may not be practicable in other

regulatory settings. This recommendation simply encourages agencies to

consider this staged approach in their processes. Ultimately, with

experience, this analytical approach may develop into a best practice.

Comment [W1]: The original recommendation in the report draws the analytical process out of the NAAQS process, with some elaboration from feedback at the NAS workshop. This revised recommendation reframes and adds to these steps in ways that no longer track the research in the report. Based on NAS and committee feedback, this recommendation was also proposed in the report as simply worthy of consideration, rather than a “best practice”.


Until then, agencies are strongly encouraged to consider this analytical

approach in conducting their work.]

2. Conducting Systematic Review: In conducting the systematic review process, agencies

should observe the following practices:

(a) assemble a team of experts with the appropriate skills required to conduct the review;

(b) adopt standard protocols for evidence identification that contain clear and precise

criteria for including or excluding studies on the basis of relevance and scientific

merit1;

(c) in interpreting and synthesizing different studies: (i) articulate one or more

hypotheses about how a hazard causes, directly or indirectly, an endpoint of concern,

including an explanation of related phenomena that are expected to be observable if

the hypotheses are indeed true; (ii) review all relevant research studies for their

implications regarding these hypotheses and summarize such evidence; (iii) evaluate

evidence supporting the hypotheses; and (iv) infer, based on the weight of evidence,

whether the potential hazard has a causal effect on the endpoint of concern,

explaining why this conclusion is more likely than the alternative;

(d) use a database to capture study information and relevant quantitative data and prepare

standardized evidence tables that will capture the key features of each study;

(e) determine whether the available information is adequate to address each problem

statement, identify gaps where additional studies would be useful, and determine

whether the agency’s stakeholders or others can generate any missing data;

(f) adopt uniform approaches to evaluate the weight of evidence of each study and

thoroughly and transparently identify any weaknesses, uncertainties, or variability in

the data or analysis; and

(g) evaluate the strength of contrary and negative evidence, in addition to the primary

studies used by the agency, and consider the use of multiple studies in quantitative

analysis.

1 In this connection, agencies should consider using the standards articulated in the following sources: Austin Bradford Hill, The Environment and Disease: Association or Causation?, PROCEEDINGS OF THE ROYAL SOC’Y OF MED. 58, 295-300 (1965); H.J. Klimisch, M. Andreae & U. Tillmann, A Systematic Approach for Evaluating the Quality of Experimental Toxicological and Ecotoxicological Data, REGUL. TOXICOL. PHARMACOL. 25, 1-5 (1997).

Comment [W2]: This recommendation is outside the scope of the report.


Agencies should ensure transparency in designing and conducting systematic reviews.

The agency should prepare an internal document describing its general process for

conducting systematic reviews and should post that document on its website. In addition,

in each instance in which it conducts such a review, the agency should catalogue its efforts

to comply with the above steps in a publicly accessible document.
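
For illustration only, and not drawn from the report or the revised recommendations: the Python sketch below shows one hypothetical way the study database and standardized evidence tables contemplated in items (d) and (f) above might be structured. All field names and the reliability scale are assumptions made for the example.

from dataclasses import dataclass, asdict, fields
import csv

# Hypothetical study record; the fields and the 1-3 reliability scale are
# illustrative assumptions, not requirements drawn from the recommendation.
@dataclass
class StudyRecord:
    citation: str           # full bibliographic reference
    endpoint: str           # health or environmental endpoint examined
    study_type: str         # e.g., epidemiological, in vivo, in vitro
    effect_estimate: str    # key quantitative result, as reported
    reliability_score: int  # uniform weight-of-evidence score (1 = high, 3 = low)
    limitations: str        # weaknesses, uncertainties, or variability noted

def write_evidence_table(records, path):
    # Prepare a standardized evidence table capturing key features of each study.
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=[fld.name for fld in fields(StudyRecord)])
        writer.writeheader()
        for record in records:
            writer.writerow(asdict(record))

# Placeholder entry, used only to show the intended table format.
write_evidence_table(
    [StudyRecord(
        citation="Hypothetical et al. (2010), J. Example Toxicol. 12:345-60",
        endpoint="respiratory irritation",
        study_type="epidemiological",
        effect_estimate="odds ratio 1.4 (95% CI 1.1-1.8)",
        reliability_score=1,
        limitations="possible exposure misclassification",
    )],
    "evidence_table.csv",
)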

3. Disclosure of Underlying Studies and Data: Consistent with the Information Quality Act

(IQA) guidelines issued by the Office of Management and Budget and its own IQA guidelines,

each agency should ensure that qualified members of the public can, within the time limits

provided for public comment, fully reproduce the agency’s analytical results. This generally

requires that agencies identify and make publicly available the scientific literature, raw

data, models reviewed, and its research results, including the results it obtained but on

which it did not rely. Scientific and technical literature, whether utilized or rejected, should

be posted online to afford the public the opportunity to evaluate and comment on it,2 unless

subject to copyright, in which case clear instructions should be provided concerning how to

obtain it consistent with copyright law.

[The original recommendation in the report reads:

In supporting its science-based regulatory decision, an agency

should identify and make publicly available a list of the scientific

literature it consulted, including even the literature it rejected when it is

material to the scientific analysis, as well as the literature it relied upon.

This reference list should be posted online whenever possible.

When an agency relies on studies that are not published, it should

post the studies on its website as soon as is practicable, subject to

copyright and other legal restrictions. When this public transparency is

not possible, these restrictions should be explained in the agency’s

individual analyses and possibly more generally in describing its

regulatory program for the public.]
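
For illustration only, and not drawn from the report: the Python sketch below shows one hypothetical way an agency might publish a machine-readable manifest of the raw data and model files underlying an analysis, with checksums, so that commenters can confirm they are reproducing the agency's results from the same inputs. The file names and manifest fields are assumptions made for the example.

import hashlib
import json
from pathlib import Path

def sha256sum(path):
    # Digest of a file so the public can verify it matches the agency's posted input.
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def build_manifest(file_paths, description):
    # Assemble a manifest describing the data and model files underlying an analysis.
    return {
        "description": description,
        "files": [
            {"name": str(p), "sha256": sha256sum(p), "size_bytes": Path(p).stat().st_size}
            for p in file_paths
        ],
    }

# Hypothetical file names (the files must exist locally); used only to show the output format.
manifest = build_manifest(
    ["raw_exposure_data.csv", "dose_response_model.py"],
    "Inputs underlying the draft risk assessment",
)
Path("reproducibility_manifest.json").write_text(json.dumps(manifest, indent=2))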

4. Checkpoints and Explanations: Particularly in cases when they are not bound by

judicially enforceable deadlines, agencies should generally establish explicit checkpoints

for regulatory projects. These checkpoints should address both when agencies will close

their consideration of research or debate in order to reach a decision and when they might

reopen that consideration. External peer review bodies may be particularly useful to

agencies in establishing scientifically credible checkpoints. In any case, agencies should

explain their decisions to initiate, stop, or reopen consideration of research or debate.

2 Administrative Conference of the United States, Recommendation 2011-1, Legal Considerations in E-Rulemaking, ¶ 4, 76 Fed. Reg. 48789, 48789 (Aug. 9, 2011).

Comment [W3]: Red passages are outside the scope of the report.


Such explanations should particularly reference any relevant ongoing research or external

deliberations.

5. Identification of Future Research Projects: For science-intensive rules, agencies should use

the results of uncertainty analysis to identify specific types of future research projects that

will best advance understanding of the regulatory issue (for example, through the use of

value of information analysis). This identification of research questions and priorities

should influence the agencies’ research agendas as well as provide a basis for establishing

future checkpoints.
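
For illustration only, and not drawn from the report: the Python sketch below works through a stylized expected value of perfect information (EVPI) calculation of the kind used in value of information analysis. The scenario probabilities and net-benefit figures are invented solely to show the arithmetic.

# Stylized expected value of perfect information (EVPI) calculation.
# The scenario probabilities and net-benefit figures are invented for illustration.
scenarios = {"hazard_is_severe": 0.3, "hazard_is_mild": 0.7}
net_benefits = {
    "strict_standard":  {"hazard_is_severe": 90, "hazard_is_mild": -20},
    "lenient_standard": {"hazard_is_severe": -60, "hazard_is_mild": 30},
}

def expected_value(option):
    # Probability-weighted net benefit of an option across scenarios.
    return sum(p * net_benefits[option][s] for s, p in scenarios.items())

# Deciding now, without further research: pick the option with the best expected net benefit.
ev_without_info = max(expected_value(option) for option in net_benefits)

# With perfect information: in each scenario, the best option for that scenario is chosen.
ev_with_info = sum(
    p * max(net_benefits[option][s] for option in net_benefits)
    for s, p in scenarios.items()
)

# EVPI is an upper bound on what further research on this uncertainty could be worth.
print("EVPI =", ev_with_info - ev_without_info)

Under these invented figures, uncertainties with a high EVPI relative to the cost of resolving them would be natural candidates for the agency's research agenda and for future checkpoints.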

6. Agency Staff Authorship Rights: Agency staff play an important role in producing their

respective agencies’ scientific analyses. Agency managers should consider providing

staff with some form of consensual authorship right or attribution for reports or analyses

to which they contribute in a significant way. Such rights should be acknowledged for all

staff authors who contribute significantly to a technical or scientific report,

including economists, lawyers, and other nonscientists. In a similar vein, reviewers and

other contributors should also be identified by name and general contribution.

7. Dissent Rights: Agencies should encourage vigorous debate among agency scientists,

and should explore ways of incorporating the diversity of that debate in any resulting

work product. One such policy would allow agency staff to dissent or express their non-

concurrence on a technical issue in a document to which they contributed. In cases where

written dissent or non-concurrence is permitted, agency managers should take seriously a

staff member’s request to place a dissent or non-concurrence into the public record.

Dissenting employees should also be allowed and encouraged to publish these dissenting

positions in the peer reviewed literature, provided that confidential governmental

deliberations are not compromised. In all cases and regardless of the public availability

of these discussions, dissenting staff members should be protected from reprisals.

8. Transparency for External Review: Agencies should comply with section 6(a)(3)(E) of

Executive Order 12866,3 or any successor Executive Order, and identify all substantive

changes in regulations between the draft submitted to the Office of Information and

Regulatory Affairs (OIRA) for review and the action subsequently announced.

9. Sharing of Agency Best Practices through Central Executive Branch Coordinator: OSTP, an

interagency group headed by OSTP, or another body designated by the President should be

responsible for identifying and publicizing the innovations developed by agencies for

transparently incorporating science into their regulatory decisions.

3 Exec. Order No. 12,866, 3 C.F.R. 638 (1993), reprinted as amended in 5 U.S.C. § 601 (2006).

Comment [W4]: Red passages are outside the scope of the report. This recommendation was proposed in the report as worthy of consideration, rather than a “best practice”.

Comment [W5]: Red passages are outside the scope of the report. Based on NAS and Committee feedback, this recommendation was proposed in the report as worthy of consideration, rather than a “best practice”.

Comment [W6]: Based on NAS and Committee feedback, this recommendation was proposed in the report as worthy of consideration, rather than a “best practice”.

Comment [W7]: Based on NAS and Committee feedback, this recommendation was proposed in the report as worthy of consideration, rather than a “best practice”.

Comment [W8]: The report identified other compliance lapses by agencies with the EO. See deleted recommendation #3 at the end of the report.


10. Elimination of Legal Barriers to Transparent Decisionmaking: Agencies should identify legal

barriers that impede public access to the scientific information underlying agency analyses

or otherwise block the agencies’ development of scientifically robust decision-making

processes. Agencies should recommend to the Executive Office of the President appropriate revisions in existing law to eliminate such impediments. OSTP or another centralized

entity should serve as a forum for identifying concerns affecting multiple agencies and

urging appropriate changes in law.

II. Agency Disclosure to Enhance the Transparency of Research

11. Data Disclosure: To the maximum extent practicable, agencies should voluntarily

comply with the Shelby Amendment4 and OMB Circular A-1105 in circumstances to which Circular A-110 does not literally apply. In particular, agencies should seek to

provide disclosure of data underlying federally-funded or non-federally funded research,

including from government contracts. Where the owners of such data will not provide

such access, the agency should note that fact, explain why it used the results if it chooses to do so, and may assign less weight to such research.

The original recommendation in the report reads:

“Studies used in the formulation of regulation should be subject to data

access requirements equivalent to those under the Data Access Act

(Shelby Amendment) and its implementing circular (OMB Circular A-

110) regardless of who funded the study. If a study is used by an agency to

inform the development of a regulation, then the same kinds of

information about that study should be available upon request, regardless

of whether the study was funded by the federal government, industry, or

some other entity.”

12. Legal Restrictions on Disclosure: Public transparency of scientific information may not

be possible because of legal restrictions. These may be based on personal privacy6 or

because the owner of information claims it to be protected from disclosure as trade secret

or other confidential business information.7 Agencies should explain these restrictions in

the agency’s individual analyses and indicate whether any such restricted information

was relied upon and, if so, for what conclusions. Agencies should publish non-restricted

summaries of such information and consider procedures to provide for the sharing of CBI

4 Omnibus Consolidated and Emergency Supplemental Appropriations Act, Pub. L. No. 105-277, 112 Stat. 2681, 2780, 3176 (1998).
5 Uniform Administrative Requirements for Grants and Agreements with Institutions of Higher Education, Hospitals, and Other Non-Profit Organizations (OMB Circular A-110), 2 C.F.R. § 215 (2004).
6 The Privacy Act of 1974, 5 U.S.C. § 552a (1974).
7 Trade Secrets Act, 18 U.S.C. § 1905 (2012); Freedom of Information Act, 5 U.S.C. § 552(b)(4) (2012).

Comment [W9]: This requirement is weaker than the recommendation adopted by the BPC, which was adopted verbatim in the report.

Comment [W10]: This requirement is considerably weaker than the recommendation in the report.


with outside parties in ways that do not compromise confidentiality (e.g., user

agreements).

The original recommendation in the report reads:

“Confidential Business Information (CBI) claims can . . . make it difficult

for the interested public to evaluate studies that contribute to regulatory

policy.”8 Agencies that provide CBI protections for studies or data that

inform regulation should ensure that the CBI claims are justified. Given

the strong incentives to regulated parties for overclaiming CBI protection

and the resultant costs from this overclaiming to public health protection

and research, it is important that the agencies’ CBI programs not provide a

safe haven for unjustified suppression of relevant regulatory research.9

To that end and as a first step, the agencies should review their CBI

programs to ensure that there is rigorous oversight of CBI and related

trade secret claims on health and environmental research. Agencies

should, where possible, penalize those CBI claims that, upon review,

appear unjustified.

13. Financial Interests Disclosure: Agencies should require financial interest disclosures on

all research submitted to inform an agency’s licensing, regulatory, or other decision-

making process. This disclosure should be similar to the financial interest disclosure

required by scientific journals.10

The regulatory financial interest disclosure should also,

where possible, identify whether the experimenter or author had the legal right to design

the research, collect the data, interpret the data, and author, publish or otherwise

disseminate the resulting report without approval of the sponsor of the research. Finally,

agencies and scientific advisory committees should be skeptical of those studies wherein

a party other than the principal investigator (e.g., the study sponsor or funder) had control

over the design or publication of the study.

[This sentence in the original recommendation read:

The regulatory conflict of interest disclosure should also, where possible,

identify whether a sponsor reserved the right to participate in the design of

the research; the collection of data; the interpretation of data; the writing

or disseminating the report, or any other material features of the research.]

8 Bipartisan Policy Center Report at 43.
9 Id.
10 Uniform Requirements for Manuscripts Submitted to Biomedical Journals: Manuscript Preparation and Submission: Preparing a Manuscript for Submission to a Biomedical Journal, INTERNATIONAL COMMITTEE OF MEDICAL JOURNAL EDITORS, http://www.icjme.org/manuscript_1prepare.html.

Comment [W11]: This sentence was reworded in a way that provides more limited insights into sponsor influence. The original rec sought information about sponsor rights to participate in the study design, etc.; the revised rec only solicits information on whether the author has the right to publish. The original recommendation in the report is more consistent w/ the ICJME requirement that requires disclosure of “financial relationships with entities in the bio-medical arena that could be perceived to influence, or that give the appearance of potentially influencing, what you wrote in the submitted work.” http://www.icmje.org/coi_disclosure.pdf


III. Use of Peer Review for Agency Science

14. External Peer Review Panels: Agencies should finalize the charge/questions submitted to a

peer review committee before choosing reviewers. In constructing peer review panels

consisting of outside experts, agencies should select panel members based primarily upon

their expertise and experience as well as their ability to contribute to the panel’s

deliberations without conflict of interest or undue bias or pre-disposition. This applies to

all potential members, whether hailing from government, academia, or the private or non-

profit sectors. Agencies should carefully delineate the difference between financial conflicts

of interest and bias or pre-disposition. Agencies should avoid financial conflicts of interest

to the greatest extent feasible. Insofar as virtually all potential panel members possess

certain general views or pre-dispositions based on background and training, agencies

should select members that represent a range of respected perspectives and fields of

expertise. Indeed, panels with a healthy range of perspectives are more likely to engage in a

robust review of critical scientific issues.

In constructing peer review panels, agencies should observe the following principles:

(a) Agencies should develop guidelines for implementing peer review procedures,

including identification of issues that warrant additional procedures and issues

that warrant external review (rather than internal review).11

(b) For the most significant scientific work products, agencies should consider

employing recognized third-party institutions (e.g., NAS) to conduct the

relevant peer review.

(c) When agencies use contractors to organize peer review panels, they should

require the contractors to apply to prospective and actual members of such

panels the same ethics requirements that would apply if such individuals were

special government employees.

11 See OFFICE OF MGMT. AND BUDGET, PEER REVIEW BULLETIN 4-6 (2004), available at http://www.whitehouse.gov/sites/default/omb/memoranda/fy2005/m05-03.pdf.

Comment [W12]: This section is outside the scope of the report.


(d) Agencies should provide a meaningful and timely opportunity for the public

to provide input into the peer review process before the pre-review

commences. This normally would include an opportunity to comment on the

scope of review (i.e., problem formulation), peer review charge/questions, and

proposed peer reviewers. The public should have the opportunity to nominate

proposed peer reviewers and needed areas of expertise and to submit oral and

written comments in sufficient time for review by panel members.

(e) Agencies should permit public participation in at least some meetings of every

panel, and such sessions should be conducted in a fair, balanced manner that

gives adequate time to all views. For particularly significant peer reviews,

agencies should encourage and provide sufficient time for the peer reviewers

to engage in dialogue with public commenters.

(f) Where panel members draft comments for discussion at panel meetings, the comments should be disclosed sufficiently far in advance of each meeting to ensure a robust discussion of the individual comments and to allow adequate time for public review of the comments before the meeting.

(g) Agencies should consider establishing independent ombudsmen to ascertain

whether the agencies have adequately responded to peer review and public

comments.12

(h) Agencies should publish a response to significant peer review and public

comments on the peer review process at the completion of the review, in order

to allow responses from panel members or from the public prior to the

publication of the final document under review.

(i) Agencies should draw an appropriate balance between the competing

concerns of paperwork burdens on panel members, privacy, and transparency

in their review for possible conflicts or bias. Panel members should update

their disclosures over the life of the panel if there are material changes

to the information presented.

(j) Agencies should make available electronic records of peer review meetings,

including transcripts, within 30 days of such meetings on the appropriate

agency website.

12 ENVT’L PROT. AGENCY, SCI. ADVISORY BD. & ORD BD. OF SCIENTIFIC COUNSELORS, IMPLEMENTATION OF ORD STRATEGIC RESEARCH PLANS: A JOINT REPORT OF THE SCIENCE ADVISORY BOARD AND ORD BOARD OF SCIENTIFIC COUNSELORS 36 (2012) available at http://www.epa.gov/osp/bosc/pdf/120928rpt.pdf.


15. Internal Review: Consistent with President Obama’s scientific integrity directive,13

agencies should seek expert review of scientific analyses wherever possible, even if this

review occurs wholly within the agency. Agencies should explain in the final rule how

they ensured rigorous review of the scientific research underlying each regulatory project.


[The original recommendation read:

Consistent with President Obama’s directive, agencies should be

encouraged, not impeded, from having their scientific analyses reviewed

by other experts, even if this oversight occurs wholly inside the agency.

Any limitations on an agency’s ability to have scientific work reviewed by

scientific experts should be actively discouraged. Additionally and when

possible, agencies should endeavor to explain how they ensured the

rigorous review of their scientific products for each regulatory project.]

13 Memorandum from the Admin. of Barack H. Obama for the Heads of Executive Departments & Agencies on Scientific Integrity (Mar. 9, 2009), available at http://www.gpo.gov/fdsys/pkg/DCPD-200900137/pdf/DCPD-200900137.pdf.

Comment [W13]: The original recommendation focused on removing impediments to external review. That thrust is completely lost in the rewritten recommendation.


Deleted Recommendations from the Original Report

(The full set of the report recommendations is reproduced in the Executive Summary.)

1. Agencies should provide the public with an accessible description of the process that they

utilize for integrating science into their decisions for each of their science-intensive

programs. This includes a statement of how an agency evaluates the scientific

information used in its analysis; how the agency makes that information available to

reviewers and the public; how the analysis is reviewed by experts and interested parties;

and how the agency ensures that the final decision can be compared against the scientific

record.

2. Agencies should resist applying deliberative process protections to documents and

communications that influenced the development of science-based regulatory projects.

To the extent agencies do invoke the deliberative process privilege, they should justify doing so with respect to each document that is withheld from the public. Draft science-

policy analyses, such as draft papers, can be made public with the disclaimer that they do

not necessarily represent the policy or scientific position of the agency. Agencies should

prepare an administrative record that advances this transparency goal by ensuring that the

documents, meetings, and other deliberations that resulted in potentially significant

changes to scientific assumptions or interpretations are made part of the administrative

record. These administrative records should be posted on the internet when possible.

3. Under Sections 2(a) and 6(a) of Executive Order 12866, the agencies are responsible for

interpreting and complying with Section 6(a). The agencies’ compliance under Section

6(a) should include at the very least:

1) documentation of the major changes made at the suggestion or

recommendation of OIRA at any point in the lifecycle of the regulation as required by

Section 6(a)(3)(E)(iii) and 6(a)(3)(F). If there are no major changes, then the agency

should provide a statement to that effect;

2) an identification of all substantive changes made between the draft submitted to

OIRA for review and the action subsequently announced in compliance with Section

6(a)(3)(F)(ii). This includes but is not limited to a red-lined version of the document

undergoing OIRA review;

3) for both 1) and 2), the agencies should provide a “complete, clear, and simple”

identification and explanation of each major change in “plain, understandable language”

for the public. Explication of these major changes should be accessible to the public (through, for example, a cover memorandum) and not buried in hundreds of pages of red-

lined documents. Although the Executive Order technically requires this accessible

explication of all changes (and not simply the major changes) made at the suggestion of

OIRA, a disclosure of the major changes is considerably less burdensome and appears

consistent with the thrust of the Executive Order;

4) a complete library of all documents exchanged between OIRA and the agency

throughout the life cycle of the regulatory action to ensure that the agency is in full

compliance with Section 6(a)(3)(E)(ii) and (iii).


5) centralized public access to the information specified above to ensure practical,

as opposed to merely theoretical, compliance with the general requirements of Section

6(a)(3)(E) and (F). Both reginfo.gov and regulations.gov should link to or provide the

public with document libraries that enable simple access to and searching of documents

required under Section 6.

Agencies should apply these same requirements of Section 6(a)(3)(E), as

interpreted above, to all significant science-intensive regulatory actions, including agency

guidances and other standards and policies, whether or not they are published in the

Federal Register, as well as to all significant, supporting studies and projects that inform

science-intensive agency rules, guidances, policies and related products. The

requirements should also apply to all rules that are withdrawn, whether OIRA has

reviewed them or not.

4. The agency should disclose material changes made at the suggestion or recommendation

of White House offices or other agencies, consistent with Section 6(a)(3)(E)(iii), when

doing so does not impair the deliberative processes.

Timeline for Report and Recommendations: Science in Regulation

Committee Meets on Draft recs: March 2012
NAS meeting: Sept. 2012
Committee Meets on Revision outline: Sept. 2012
Final Draft Report and Recs: To ACUS
Drafting Subcommittee rewrite recs*: Reeve Bull, Jamie Conrad, Kevin Bromberg
ACUS solicits comments on revised recs*: Susan Dudley, Alan Morrison
George Washington Workshop: Oct. 2012
SBA Workshop: Dec. 2012

* Denotes outside of consultant process