
HIT Policy Committee

____________________

November 7, 2012

#HITPolicy

AGENDA

HIT Policy Committee

November 7, 2012

9:30 a.m. to 2:45 p.m. [Eastern Time]

Omni Shoreham Hotel, 2500 Calvert Street, NW, Washington, DC 20008

9:30 a.m. CALL TO ORDER – MacKenzie Robertson, Office of the National Coordinator for

Health Information Technology

9:35 a.m. Remarks – Farzad Mostashari, National Coordinator

9:50 a.m. Review of the Agenda – Paul Tang, Vice Chair of the Committee

9:55 a.m. Update from CMS

– Robert Anthony, CMS

10:10 a.m. Final Walkthrough of Meaningful Use Stage 3 Request for Comment (RFC)

Meaningful Use – Paul Tang, Chair, and George Hripcsak, Co-Chair, MUWG

Quality Measures – David Lansky, Chair QMWG

Privacy and Security – Deven McGraw, Chair P&STT

ONC RFC – Farzad Mostashari, National Coordinator

12:10 p.m. Public Comment

12:25 p.m. LUNCH BREAK

1:15 p.m. Healtheway, Inc. and the eHealth Exchange

– Mariann Yeager, Interim Executive Director, Healtheway, Inc.

1:45 p.m. ONC Update

– Jodi Daniel, ONC

– Carol Bean, ONC

2:30 p.m. Public Comment

2:45 p.m. Adjourn

HIT Policy Committee

Members Organization

Chair

Farzad Mostashari HHS/ONC

Vice Chair

Paul Tang Palo Alto Medical Foundation

Members

David Bates Brigham and Women’s Hospital

Christine Bechtel National Partnership for Women & Families

Christopher Boone American Heart Association

Neil Calman The Institute for Family Health

Richard Chapman Kindred Healthcare

Arthur Davidson Denver Public Health Department

Connie Delaney University of Minnesota/School of Nursing

Paul Egerman Businessman/Entrepreneur

Judith Faulkner Epic Systems Corporation

Gayle Harrell Florida State Legislator

Charles Kennedy Aetna

David Lansky Pacific Business Group on Health

Deven McGraw Center for Democracy & Technology

Frank Nemec Gastroenterology Associates, Inc.

Marc Probst Intermountain Healthcare

Joshua Sharfstein Department of Health & Mental Hygiene, Maryland

Latanya Sweeney Harvard University

Scott White 1199 SEIU Training & Employment Fund

Federal Ex officios

Madhulika Agarwal Department of Veterans Affairs

Patrick Conway Centers for Medicare & Medicaid Services, HHS

Thomas Greig Department of Defense

Robert Tagalicod Centers for Medicare & Medicaid Services, HHS

HIT Policy Committee 10-03-2012 DRAFT Meeting Summary Page 1

HIT Policy Committee

DRAFT

Summary of the October 3, 2012 Meeting

ATTENDANCE The following Committee members were in attendance at this meeting:

Farzad Mostashari

Paul Tang

Madhulika Agarwal

David Bates

Christine Bechtel

Christopher Boone

Neil Calman

Arthur Davidson

Connie White Delaney

Judith Faulkner

Thomas Greig

Gayle Harrell

Deven McGraw

Marc Probst

Joshua Sharfstein

Scott White

The following Committee members did not attend this meeting:

Richard Chapman

Patrick Conway

Paul Egerman

Charles Kennedy

David Lansky

Frank Nemec

Latanya Sweeney

Robert Tagalicod

KEY TOPICS

1. Call to Order

MacKenzie Robertson, Office of the National Coordinator (ONC), welcomed participants to the

41st Health Information Technology Policy Committee (HITPC) meeting. She reminded the

group that this was a Federal Advisory Committee (FACA) meeting being conducted with two

opportunities for public comment, and that a transcript will be posted on the ONC website. She

conducted roll call and reminded members to identify themselves for the transcript before

speaking. She turned the meeting over to HITPC Chair Farzad Mostashari.

2. Remarks


Farzad Mostashari, National Coordinator, gave extensive remarks on the recent news coverage of

HIT. He acknowledged that he was hurt by a Wall Street Journal opinion piece that referred to

ONC as a cheerleader. The piece reportedly referred to the few cost benefits and called HIT a

misguided effort and a waste of money, and criticized the government for not setting standards for documentation of blood pressure and the problem list. He reported that the academic

community and others had responded with evidence and opinions to the contrary. According to

Mostashari, 27 of the 31 published cost-benefit analyses show either cost reduction or health

benefits. Such studies have a narrow focus, while HIT is an infrastructure that, once in place, can do much at marginal cost.

health benefits and cost savings. He mentioned an article in the American Journal of Managed

Care that described a project in Minnesota to incorporate prior notification and authorization

requirements into clinical decision support. Findings showed improved health outcomes, savings

in providers’ time, and a decrease in unnecessary tests.

He continued his rebuttal, noting that although it was interesting that the WSJ encouraged more

regulation, its assertion about the lack of standards is untrue. In Stage 2, 42 distinct standards are

required. Next, he talked about a series of articles in The Washington Post and The New York

Times on billing. Data from the decade before the meaningful use program show an association

between the increase in severity of codes and their distribution across hospitals. According to

Mostashari, the articles imply that the incentive to adopt electronic health records (EHRs)

contributed to the increase in severity. He offered several alternative explanations, one being that

the more severe codes may have been the correct codes and not necessarily an indicator of fraud.

Only a few cost aspects were considered. In fact, the total costs for Medicare and Medicaid

were comparatively lower over the past few years. The purpose of EHRs is to refocus on

managing population costs. Coordination rather than documentation is the issue. There is a shift

away from payment on volume. Information is essential to manage care. Moreover, EHRs can be

useful tools for audit and enforcement. He pointed out that the Robert Wood Johnson

Foundation’s Open Notes project found that 99% of respondents wanted to continue to see their

providers’ notes. Patients can be made partners in detecting fraud.

He announced that for Stage 3, he wants the HITPC to look at policies around documentation of

patient care. A great deal of information is carried forward that may not be necessary. He

said that he wants to work with vendors and providers on functionality of EHRs to, for example,

look at audit logs and provider authentication and fraud. Stage 3 will focus on quality, outcomes

and population management. He referred to a recent article in the Annals of Internal Medicine

that describes increases in quality of care. He reminded the members that several years are

required for the results of change to be visible.

3. Review of the Agenda

Paul Tang, Vice Chairperson, reminded members that the meeting was dedicated to the request

for comment (RFC) for Stage 3 in which everything will be directed to improving outcomes. He

referred to the summary of the September 2012 meeting, which was circulated with the meeting

materials, and asked for a motion to approve the summary. The motion was made and seconded

and a voice vote resulted in unanimous approval.


Action item #1: The summary of the September 2012 HITPC meeting was

approved as circulated.

4. Meaningful Use Stage 3 Request for Comment (RFC) Input from the Meaningful Use

Workgroup

Tang, in his role as chair of the Meaningful Use Workgroup, explained that the workgroup

members and Michele Nelson, ONC, deliberated over the comments on the preliminary

recommendations presented to the HITPC in August and made a number of changes. He

reviewed the RFC schedule leading up to May 2013 when the HITPC is scheduled to transmit final Stage 3 recommendations to HHS. Once again, he reviewed the guiding principles:

• Supports new model of care (e.g., team-based, outcomes-oriented, population

management)

• Addresses national health priorities (e.g., NQS, Million Hearts)

• Broad applicability (since MU is a floor)

−Provider specialties (e.g., primary care, specialty care)

−Patient health needs

−Areas of the country

• Promotes advancement -- Not "topped out" or not already driven by market forces

• Achievable -- mature standards widely adopted or could be widely adopted by 2016

• Reasonableness/feasibility of products or organizational capacity

−Prefer to have standards available if not widely adopted

−Don’t want standards to be an excuse for not moving forward

Next, he referred members to the recommendations grid, which was both included in the

previously distributed meeting materials and shown on slides. The slides showed each objective

for Stages 1 and 2 and proposed for Stage 3, as well as a recommended placeholder for Stage 4. Red items indicated changes from Stage 1 to Stage 2, blue items changes from Stage 2 to the Stage 3 recommendations, and green items updates made following the August 1, 2012 HITPC meeting.

Beginning with the domain Improve Quality, Safety, Efficiency, and Reduce Health Disparities, he methodically reviewed SGRP101 through SGRP121, concentrating on the

reasons for the most recent changes in Stage 3 recommendations (Stage 4 placeholders not listed

here):

101 - CPOE for medications includes DDI checking for “never” combinations as

determined by an externally vetted list.

Certification Criteria: EHR must be able to consume an externally supplied list of

“never” DDIs, using RxNorm and NDF-RT standards along with a TBD DDI reactions

value set.

Certification Only for EPs

• EHRs must also have the ability to identify abnormal test results and track when

results are available.

• EHR must have the ability to transmit lab orders using the lab order and results

Interface guidelines produced by the S&I Framework Initiative.

RFC: Are the existing standards for laboratory orders adequate to support including this

certification criterion?

104 - Remove objective because topped out and ensure used in CQMs for disparities.


Certification criteria:

• Occupation and industry codes

• Sexual orientation, gender identity (optional fields)

• Disability status

Differentiate between patient reported and medically determined

Need to continue standards work

105 - Certification criteria only: Use of lab test results, medications, and vital signs (BP, ht, wt, BMI) to support clinicians’ maintenance of up-to-date, accurate problem lists. Vendors utilize rules to help providers improve the problem list (e.g., Method for Assigning Priority Levels). System provides prompts about additions, edits, and deletions for clinicians’ review and action.

RFC: How to incorporate into certification criteria for pilot testing?

106 - System provides prompts about additions, edits, and deletions for clinicians’ review.

RFC: How to incorporate into certification criteria for pilot testing?

107 - System provides prompts about additions, edits, and deletions for clinicians’ review and action.

113 - The 15 CDS interventions should include one or more interventions in each of the

following areas, as applicable to the EP's specialty:

• Preventative care (including immunizations)

• Chronic disease management (e.g., diabetes, hypertension, coronary artery

disease)

• Appropriateness of lab and radiology orders

• Advanced medication-related decision support* (e.g., renal drug dosing)

115 - EP Objective: Generate lists of patients for multiple specific conditions and present

near real-time (vs. retrospective reporting) patient-oriented dashboards to use for quality

improvement, reduction of disparities, research, or outreach reports. Dashboards are incorporated into the EHR’s clinical workflow for the care coordinator or the provider. They are actionable, not retrospective reports.

117 - 2) Mismatches (situations in which a provider dispenses a medication and/or dosing

that is not intended) are tracked and used for quality improvement.

Discussion of Revisions

Judy Faulkner asked about (slide 6) transitions with orders: What about transitions without

orders? Tang responded that the point is to encourage orders with all transitions and referrals; the

measure (denominator) is based on orders. Complaining about the difficulty of reading the small

print, she inquired about (slide 11) maintaining the medication list. Tang explained, giving an

example that if an item appears on the problem list without a corresponding med (or vice versa),

an alert is generated. This is a spotting mechanism only. Faulkner said that (slide 14) the use of

the word “intervention” should be reconsidered. Tang replied that its use is consistent with the

NPRM language. Faulkner suggested “guidance.”


Gayle Harrell referred to slide 9 and argued that some of the content in the placeholder could

be moved back to Stage 3. Most states have Prescription Drug Monitoring Programs (PDMP)

and all states have Medicaid formulary lists, both of which should be incorporated in and used in

Stage 3. Florida has been doing something similar since 2003, which resulted in very significant

cost savings. Available formularies should be used. Tang reported that according to the Health

Information Technology Standards Committee (HITSC), the necessary standards are not broadly

available. He agreed to investigate the suggestion about the Medicaid formularies.

Jodi Daniel, ONC, reported on a pilot project to make the Prescription Drug Monitoring

Program (PDMP) readily available to providers. Seven small pilots are underway. A toolkit will

be designed to make it easier to get to PDMP via EHRs. She invited members to send

suggestions to her.

Responding to a member’s concern about the absence of disability status on slide 10, Tang

pointed it out to her and said that the question is whether disability status should be patient

reported or medically reported or determined.

Marc Probst indicated that he was impressed with the detail of the recommendations. He

wondered whether the workgroup had categorized the requirements for the recommended

changes. Do we fully understand what we are asking? Tang referred to the principle of

reasonableness and feasibility and said that a matrix could be constructed to categorize the

objectives and criteria.

Tang moved to the domain Engage Patients and Families and explained each of the most recent

changes:

204A – MENU item: Automated Transmit (builds on the “Automated Blue Button Project”): Before issuing final recommendations in May, the workgroup will review the results of the Automated Blue Button pilots.

204B - MENU: Provide 10% of patients with the ability to submit patient-generated

health information to improve performance on high priority health conditions, and/or to

improve patient engagement in care (e.g., patient experience, pre-visit information, patient-created health goals, etc.). This could be accomplished through semi-structured

questionnaires.

Based upon feedback from HITSC this can be a MENU item.

Need RFC language to describe the rationale for this function (contributes to health outcomes improvement, QI goals, and care efficiency).

206 - Add language support: For the top 5 non-English languages spoken nationally,

provide 80% of patient-specific education materials in at least one of those languages

based on the EP’s or EH’s local population, where publicly available.

Discussion

Harrell expressed concern about the privacy and security of downloads. Is the goal for

the patient to be able to download a record and incorporate it into a PHR for transmission and if


so, what is the physician’s liability? Tang pointed out that provider-to-provider transmission is

covered under HIPAA. When a patient transmits her data to someone, she is responsible. Joy

Pritts, ONC, confirmed that once information is transmitted to the patient, it becomes the

responsibility of the patient. Discussion ensued. Someone commented on the distinction

between a patient giving directions on transmission once or at specific visits. Pritts reported that

she is working with others in HHS to determine the best approach to this new policy issue.

Considering HIPAA and HITECH (and the NPRM on the latter), under the existing rule there is no restriction on the format in which the record is provided. In some states, laws may

restrict transmission. She indicated a lack of clarity as to whether the provider’s liability ends

with the receipt of permission to transmit. It is possible that the provider’s liability may extend to

the receipt of the transmitted information by the other party. Harrell said that people are very

concerned with the liability issue. Pritts noted that concerns commonly surface with the

introduction of new technology. Tang acknowledged that the issue should be explored in the

RFC. Christine Bechtel talked about a recommendation that patients receive a warning prior to

confirmation of a download. A pilot project is underway on auto-transmission of summaries.

Tang said that the RFC preamble will first describe what is known, followed by the questions.

Faulkner said that it is tricky to determine to what the patient has agreed. Difficulties include the education of individual clinicians, the need for signature pads, paper overload, and the risk that too much authorization impedes patient access to services. Pritts reported that a pilot on obtaining e-consent will launch this month, using tablets and videos. Faulkner expressed concern about a

complicated process with unnecessary steps.

George Hripcsak, Co-Chair, Meaningful Use Workgroup, presented the most recent changes to

the recommendations for the Care Coordination domain in Stage 3.

302 - SC&C Recommendation: Standards work needs to be done to adapt and further

develop existing standards to define the nature of reactions for allergies (i.e. severity).

303 - Certification Criteria: Inclusion of data sets being defined by S&I Longitudinal

Coordination of Care WG, which are expected to complete HL7 balloting for

inclusion in the C-CDA by Summer 2013:

1) Consultation Request (Referral to a consultant or the ED)

2) Transfer of Care (Permanent or long-term transfer to a different facility, different

care team, or Home Health Agency)

304 - Certification Criteria: Develop standards for a shared care plan, as being

defined by S&I Longitudinal Coordination of Care WG. Some of the data elements

in the shared care plan overlap content represented in the CDA. Adopt standards for

the structured recording of other data elements, such as patient goals and related

interventions.

305 - Certification Criteria: Include data set defined by S&I Longitudinal

Coordination of Care WG and expected to complete HL7 balloting for inclusion in

the C-CDA by Summer 2013: Shared Care Encounter Summary (Consultation

Summary, Return from the ED to the referring facility, Office Visit)


308 - IE workgroup recommendation (IF provider directories exist and are operational):

EH OBJECTIVE: The EH/CAH will send electronic notification of a significant

healthcare event in a timely manner to key members of the patient’s care team, such

as the primary care provider, referring provider or care coordinator, with the

patient’s consent if required.

EH MEASURE: For 10% of patients with a significant healthcare event (arrival at

an Emergency Department (ED), admission to a hospital, admission to a long-term care facility, discharge from an ED or hospital, or death), the EH/CAH will send an

electronic notification to at least one key member of the patient’s care team, such as

the primary care provider, referring provider or care coordinator, with the patient’s

consent if required, within two hours of when the event occurs.

Discussion

Someone brought up the issue of what an EHR system can do for a practice beyond improving

care for the individual patient. Hripcsak acknowledged that the emphasis to date has been the

patient. But some functions, such as generating lists by conditions, can be used for population

management. Lists can be used for action. To do something such as examine a pattern of

unexpected ED visits or hospital admissions will require more advanced analytics.

David Bates talked about a survey that his organization conducted for the National Quality

Forum (NQF). Currently, the use of EHRs is very primitive. Tools are needed to enable more

comprehensive analyses, but they have yet to be developed. Someone suggested adding a

question to the RFC to that effect. Tang indicated that a question on dashboards may cover it.

Hripcsak continued with the domain of Population Health. Only a minor change was made to

the threshold in 405.

Discussion

Joshua Sharfstein suggested having the capability to receive an urgent message from the state

health agency. He referred to a pilot project in Maryland in which this was successfully done.

Hripcsak replied that the workgroup examined ways to receive alerts for specific patients in Stage

4. However, the workgroup members had not thought about a general message. Art Davidson

said that such a concept was embedded in slide 37. The Centers for Disease Control and

Prevention (CDC) sends health alert messages. Sharfstein pointed out a need for something

below the level of national messages. Davidson said standards for that function are not currently

available but that the Direct provider directory can possibly be used for messages.

Tang said that the recommendations will be revised per the comments.

5. Meaningful Use Stage 3 Request for Comment Input from the Information Exchange

Workgroup

Larry Garber, Member, Information Exchange Workgroup, presented three recommendations:

Query for Patient Information (EHR Certification Criteria) Proposed Criteria for the next

phase of EHR Certification:


1. The EHR must be able to query another entity* for outside records and respond to such

queries. This query may consist of three transactions:

• Patient query based on demographics and other available identifiers, as well as the

requestor and purpose of request.

• Query for a document list for an identified patient

• Request a specific set of documents from the returned document list

*The outside entity may be another EHR system, a health information exchange, or an entity on the NwHIN, for example.

2. When receiving an inbound patient query, the EHR must be able to:

• Tell the querying system whether patient authorization is required to retrieve the

patient’s records and where to obtain the authorization language**. (E.g. if authorization

is already on file at the record-holding institution it may not be required).

• At the direction of the record-holding institution, respond with a list of the patient’s

releasable documents based on patient’s authorization

• At the direction of the record-holding institution, release specific documents with

patient’s authorization

3. The EHR initiating the query must be able to query an outside entity** for the

authorization language to be presented to and signed by the patient or her proxy in order

to retrieve the patient’s records. Upon the patient signing the form, the EHR must be able

to send, based on the preference of the record-holding institution, either:

• A copy of the signed form to the entity requesting it

• An electronic notification attesting to the collection of the patient’s signature

Query Provider Directory (EHR Certification Criteria)

Proposed Criteria for the next phase of EHR Certification: The EHR must be able to

query a Provider Directory external to the EHR to obtain entity-level addressing

information (e.g. push or pull addresses).

Request for Comment for EHR Certification with these additional questions:

• Are there sufficiently mature standards in place to support this criterion? What implementations of these standards are in place, and what has the experience been?

Data Portability Between EHR Vendors (RFC) Request for Comment for EHR

Certification:

• What criteria should be added to the next phase of EHR Certification to further facilitate

healthcare providers’ ability to switch from using one EHR to another vendor’s EHR?

Discussion

Deven McGraw acknowledged her support for including the Information Exchange Workgroup’s

recommendations in the RFC. She assured the committee members that the Privacy and Security

Tiger Team will examine the use cases prior to the finalization of the recommendations.

6. Meaningful Use Stage 3 Request for Comment Input from the Privacy and Security

Tiger Team

McGraw, Co-Chair, Privacy and Security Tiger Team, presented questions in three areas:

The Health IT Policy Committee has already recommended that provider users of EHRs

be authenticated at National Institute of Standards and Technology (NIST) “Level of


Assurance” (LoA) 3 for remote access (e.g., more than user name and password required

to authenticate) by Stage 3 of MU.

1. Should the next phase of EHR certification criteria include capabilities to authenticate

provider users at LoA 3? If so, how would the criterion/criteria be described?

2. What is an appropriate MU measure for ensuring provider users authenticate at LoA 3

for remote access? Under what other circumstances (if any) should authentication at LoA

3 be required to meet MU?

3. NIST establishes guidance for authentication of individuals interacting remotely with

the federal government. What, if any, modifications to this guidance are recommended

for provider EHR users?

In Stages 1 and 2 of MU, EPs/EHs/CAHs are required to attest to completing a HIPAA

security risk assessment (and addressing deficiencies), and, in Stage 2, attesting to

specifically addressing encryption of data at rest in CEHRT.

1. What, if any, security risk issues (or HIPAA Security Rule provisions) should be

subject to MU attestation in Stage 3?

2. For example, the requirement to make staff aware of the HIPAA Security Rule and to

train them on Security Rule provisions is one of the top 5 areas of Security Rule

noncompliance identified by the HHS Office for Civil Rights over the past 5 years. In

addition, entities covered by the Security Rule must also send periodic security reminders

to staff. The Tiger Team is considering requiring EPs/EHs/CAHs to attest to

implementing Security Rule provisions regarding staff outreach & training and sending

periodic security reminders; we seek feedback on this proposal.

Accounting for disclosures, surveillance for unauthorized access or disclosure and

incident investigation associated with alleged unauthorized access is a responsibility of

organizations that operate EHRs and other clinical systems. Currently the 2014 Edition

for Certified EHR Technology specifies the use of ASTM E-2147-01. This specification

describes the contents of audit file reports but does not specify a standard format to

support multiple-system analytics with respect to access. The Tiger Team requests

feedback on the following questions:

1. Is it feasible to certify the compliance of EHRs based on the prescribed standard?

2. Is it appropriate to require attestation by meaningful users that such logs are created

and maintained for a specific period of time?

3. Is there a requirement for a standard format for the log files of EHRs to support

analysis of access to health information across multiple EHRs or other clinical systems in

a healthcare enterprise?

4. Are there any specifications for audit log file formats that are currently in widespread

use to support such applications?

Discussion

Members had no comments. McGraw invited members to attend the virtual hearing on patient

identification and authentication on October 29.

7. Public Comment

Robertson announced that comments were limited to three minutes.


Kelly Emerick of the Secure ID Coalition commented that she supported requiring LoA 3, but

that she preferred LoA 3 or higher. How do we know records are sent and received as intended?

She encouraged the HITPC to take a hard look at how to ensure secure transmission.

Joanne Lynn, Altarum Institute, complained about the movement of care plan from Stage 3 to

Stage 4 (slide 13) without explanation. A care plan is the most important aspect of care for the

seriously ill. She declared that care plan is not controversial and said that she had a similar issue

with slide 13 and advance care plan. Patients could carry a PDF with their advance directives.

Patients need mobility of this information. She probed for an explanation from Tang. Robertson

thanked her for her comments.

Amari, self-identified as both a patient and a representative of the American Medical Association, thanked ONC

for providing two comment opportunities. Referring to p. 11, she acknowledged that as a

consumer she would not necessarily notice errors in her record. How would this happen? Would

the doctor ask, “Did I make an error?” HIPAA already allows for corrections. Regarding

certification, she asked for protection against misuse. She reported that she had previously

submitted comments to CMS requesting an evaluation of the meaningful use program. Providers

could be queried about the hardest measures to obtain. The requirements for meeting objectives

should be scaled back to allow more flexibility.

Diane Jones, American Hospital Association, commented on the timing of Stage 3. She

recommended a delay in the RFC release date, which is scheduled to occur at the same time that

many providers are focused on attestation for Stage 1 or implementing Stage 2. She

recommended extension of the comment period to 60 days.

Faulkner observed that it would be beneficial for the members themselves to have time to

discuss public comments. Additionally, meeting materials should be distributed on a set schedule

in order to ensure sufficient preparation time. She also wondered about Stage 4: How would it be

funded and how would it work? She went on to say that a single national standard for

immunizations is needed. Robertson responded that she intended to institute administrative

meetings to deal with meeting issues.

Harrell had additional comments. She requested information on the outcomes of the many ONC

pilots mentioned during the meeting. Tang told her that the provision of such information was

the purpose of the monthly ONC updates. She declared that she wanted a list and summary of the

pilot projects. Neil Calman inquired about the criteria for applying for and selecting pilot sites.

Daniel responded that the pilots are often carried out via contracts. Contractors establish the

criteria (with ONC approval) and select the sites. She indicated that she is open to suggestions

for increasing transparency.

Robertson introduced new member Chris Boone, American Heart Association, who then

mentioned several consumer projects undertaken by his organization.

8. Meaningful Use Stage 3 Request for Comment Input from the Quality Measures

Workgroup


James Walker, member, Quality Measures Workgroup, prefaced his recommendations by explaining that the workgroup had tried to rethink the purpose and form of quality measures. The measures should leverage data routinely captured in the EHR and PHR during the process of care. Support for clinical quality measure (CQM) calculations should be flexible and adaptive to future requirements, which may include new measures or changes to measure definitions at minimal cost. Providers should be able to configure the CQM calculation to use data elements appropriate to local workflow. An end goal is to shift quality measurement and reporting from sampled, retrospective human chart reviews and accounting records to concurrent, machine-automated improvement support, while recognizing that there will remain a place for human-abstracted quality measurement. He recommended the following questions for the RFC:

• Is a shift away from retooling legacy paper-based CQMs in exchange for designing e-CQMs de novo a reasonable and desirable course of action?

• Is there an evidence basis for clinical population management platform use? Is there a business case?

• What are the technological challenges to widespread release and adoption? Can the HITPC encourage technology in this area without being prohibitively prescriptive?

Walker said that to leverage CQM innovation from health systems and professional societies, the workgroup discussed a proposal to allow EPs or EHs to submit a locally developed CQM as a menu item in partial fulfillment of requirements (in lieu of one of the existing measures specified in the meaningful use program). Health care organizations choosing this optional menu track would be required to use a brief submission form that describes some of the evidence that supports their measure and how the measure was used in their organization to improve care. Two non-mutually exclusive approaches are proposed. A conservative approach might allow “Certified Development Organizations” to develop, release, and report proprietary CQMs. An alternate approach might open the process to any EP or EH but constrain allowable e-CQMs via measure design software (e.g., the Measure Authoring Tool). The RFC question is what constraints should be in place.

Discussion

Tang said that there is potential for better tools. The source of data can be reconsidered and the

burden of data collection reduced.

Harrell talked about making measures appropriate and meaningful to individual providers; the

specialties have not been addressed. She wondered who would vet new measures, a process that

would require the time and participation of many interest groups.

Jessie James, ONC, reported that ONC staff had located good measures used by DoD and VA

that did not go through the NQF process. Vendors have commented on their interest in making

measures relevant to specialties and subspecialties. A process for vetting has yet to be defined.

He suggested soliciting comments on a vetting structure. Walker said that both testing and

vetting must be done.

Kevin Larsen, ONC, pointed out the need for a question on certification: How do products enable development of new measures?


McGraw referred to a request by the Agency for Healthcare Research and Quality (AHRQ) for comments on quality measures. An ONC staff member said that ONC and AHRQ staffs were working together. Comments are expected to be synthesized by January. A report will be made available to the HITPC.

Harrell said that as the payment model changes, quality measures will become more important.

They need to be specific to the individual provider.

9. CMS Update

Elizabeth Holland, CMS, presented the monthly update. September 2012 data are very preliminary. Registration is up, the largest number to date. Forty-nine states have launched for Medicaid, and Puerto Rico recently launched; Hawaii is the 50th state. CMS staff is trying to educate providers on Stage 2 and conduct outreach for Stage 1. The latter has been much affected by a restricted travel budget. The goal is for everyone to be a meaningful user. Twelve hospitals came in on October 1 for their second year. Regarding attestation, there was a slight increase overall from August: 94,782 EPs have attested, 94,538 successfully, and 1,895 hospitals have successfully attested. Staff plans more webinars on Stage 2 and is working on FAQs and, with ONC, on technical specification sheets for vendors. Guidance on how to apply for payment adjustment hardship exceptions is forthcoming.

Regarding the objectives, on average all thresholds were greatly exceeded, but every threshold has some providers on the borderline. Drug formulary, immunization registries, and patient lists are the most frequently selected menu objectives for EPs. EHs most frequently select advance directives, drug formulary, and clinical lab test results. The least frequently selected menu objectives were transition of care and reportable lab results for hospitals, and transition of care summary and patient reminders for EPs. There was little difference among specialties in performance, but exclusions and deferrals varied.

Q&A

Harrell complained about the absence of percentages on penetration and participation in

Holland’s report. She emphasized that she wanted percentages, not simply pie charts.

10. ONC Update

Daniel informed the group that there is one vacancy on the HITPC and several on the HITSC. She repeated an earlier comment that a website will be used to solicit members for new workgroups on patient engagement. She hopes to get more diverse workgroup memberships. Staff is updating the consumer health strategy, security, and safety sections of the ONC strategic plan. FDA, FCC, and ONC staffs are preparing a congressionally mandated report on risk-based management; public hearings will be held, and their announcements will be published in the Federal Register. The upcoming Health 2.0 conference will be another opportunity to obtain feedback on that topic. ONC awarded a contract to RAND to evaluate risk management of HIT errors. Another contract was awarded to design checklists. CQM specifications are in process.

Daniel repeated comments that she made earlier in the meeting (agenda item #4). ONC is examining how HIT can contribute to the prevention of prescription drug abuse. Prescription drug monitoring programs (PDMPs) are a great resource at the state level, but providers do not routinely use them. ONC awarded a contract to look at standards and policy to increase timely use by providers, emergency departments, and pharmacies. The contractor convened a workgroup, which generated recommendations. The recommendations are being piloted in seven sites in Indiana, Michigan, Washington and another state. A tool kit on the use of PDMPs is being designed. ONC is open to suggestions. She invited members to let her know on which projects they wish additional information.

Q&A

Harrell expressed her support for the use of PDMPs. She referred to a hearing on safety in 2011 and said that she wanted the hearing report. Daniel responded that ONC will publish its safety and strategic plans and invite comments. She offered to make a presentation to the committee or a workgroup upon request. She reminded Harrell that all hearings are open to the public. Tang said that he wanted a report made to the HITPC on safety and the update to the strategic plan.

11. Public Comment

Robin Raiford, The Advisory Board Company, reported that the CIO of a hospital in Joplin,

Missouri, told her that he qualified for meaningful use in spite of the tornado.

SUMMARY OF ACTION ITEMS:

Action item #1: The summary of the September 2012 HITPC meeting was

approved as circulated.

Meeting Materials

Agenda

Summary of September 2012 meeting

Presentation slides: Privacy and Security Tiger Team, Meaningful Use Workgroup, Information Exchange Workgroup, Quality Measures Workgroup, ONC, CMS

Meaningful Use Workgroup RFC approach and questions

HIT Policy Committee

October 2012

http://www.cms.gov/EHRIncentivePrograms/


For final CMS reports, please visit: http://www.cms.gov/EHRIncentivePrograms/56_DataAndReports.asp


Eligible Hospitals (pie chart): 5,011 total; 4,057 registered; 3,044 paid

Registered Eligible Professionals (pie chart): 521,600 total EPs; 208,331 registered Medicare EPs (39.94%); 94,741 registered Medicaid EPs (18.16%); 41.90% not yet registered

Paid Eligible Professionals (pie chart): 521,600 total EPs; 82,535 Medicare EPs paid (15.82%); 60,208 Medicaid EPs paid (11.54%); 72.63% not yet paid

• Over 60% of all eligible hospitals have received an EHR incentive payment for either MU or AIU

• Over 60% have made a financial commitment to put an EHR in place

• Approximately 22%, or 1 out of every 5, Medicare EPs are meaningful users of EHRs

• Approximately 1 out of every 4 Medicare and Medicaid EPs have made a financial commitment to an EHR

• 58% of Medicare EPs receiving incentives are specialists (non-primary care)

Providers Paid (Oct-12 / Life-to-Date):

Medicare EPs [ESTIMATED]: 5,250 / 87,750

Medicaid EPs (AIU + MU) [ESTIMATED]: 3,800 / 64,000

Medicaid/Medicare Hospitals** [ESTIMATED]: 400 / 4,600

Total Number of Providers Paid: 9,450 / 156,350

Payments (Oct-12 / Life-to-Date):

Medicare EPs [ESTIMATED]: $95,000,000 / $1,509,000,000

Medicaid EPs [ESTIMATED]: $70,000,000 / $1,313,000,000

Medicaid/Medicare Hospitals (Medicare Pymt) [ESTIMATED]: $340,000,000 / $2,905,000,000

Medicaid/Medicare Hospitals (Medicaid Pymt) [ESTIMATED]: $140,000,000 / $2,442,000,000

Total: $645,000,000 / $8,358,000,000

• On average all thresholds were greatly exceeded, but every threshold had some providers on the borderline

• Drug formulary, immunization registries, and patient lists are the most popular menu objectives for EPs

• Advance directives, drug formulary, and clinical lab test results for hospitals

• Transition of care summary and patient reminders were the least popular menu objectives for EPs

• Transition of care and reportable lab results for hospitals

• Little difference between EPs and hospitals

• Little difference among specialties in performance, but differences in exclusions and deferrals

This data-only analysis shows our earliest adopters who have attested, but does not inform us on barriers to attestation.

At the time of the analysis:

• 105,750 EPs had attested

• 105,514 successfully

• 236 unsuccessfully (208 EPs have resubmitted successfully)

• 2,238 hospitals had attested, all successfully

Objective Performance Exclusion Deferral

Problem List 97% N/A N/A

Medication List 97% N/A N/A

Medication Allergy List 97% N/A N/A

Demographics 91% N/A N/A

Vital Signs 91% 9% N/A

Smoking Status 90% 0.4% N/A


Objective Performance Exclusion Deferral

CPOE 83% 17% N/A

Electronic prescribing 81% 21% N/A

Incorporate lab results 92% 4% 36%

Drug-formulary checks N/A 14% 15%

Patient lists N/A N/A 25%

Send reminders to patients 61% 0.5% 81%


Objective Performance Exclusion Deferral

E – Copy of Health Information 97% 70% N/A

Office visit summaries 79% 2% N/A

Patient Education Resources 49% N/A 50%

Timely electronic access 72% 1% 67%


Objective Performance Exclusion Deferral

Medication reconciliation 90% 3% 54%

Summary of care at transitions 90% 3% 84%


Objective Performance* Exclusion Deferral

Immunizations 35% 44% 20%

Syndromic Surveillance 6% 26% 69%

*Performance is percentage of attesting providers who conducted test


Objective Performance Exclusion Deferral

Problem List 95% N/A N/A

Medication List 98% N/A N/A

Medication Allergy List 98% N/A N/A

Demographics 96% N/A N/A

Vital Signs 92% N/A N/A

Smoking Status 93% 0.4% N/A


Objective Performance Exclusion Deferral

CPOE 85% N/A N/A

Advance directives 96% 0.3% 9%

Incorporate lab results 95% N/A 16%

Drug-formulary checks N/A N/A 16%

Patient lists N/A N/A 40%


Objective Performance Exclusion Deferral

E – copy of health information 96% 69% N/A

E – copy of discharge Instructions

95% 65% N/A

Patient education resources 73% N/A 58%


Objective Performance Deferral

Medication reconciliation 85% 72%

Summary of care at transitions 83% 91%


Objective Performance* Exclusion Deferral

Immunizations 49% 14% 37%

Reportable Lab Results 16% 6% 78%

Syndromic Surveillance 21% 3% 76%


*Performance is percentage of attesting providers who conducted test

eHealth Exchange Update: HIT Policy Committee, November 7, 2012

Mariann Yeager, Interim Executive Director

Healtheway, Inc., 703-623-1924

[email protected]

Discussion Topics

• Context

• eHealth Exchange

• Transition

• Data Use and Reciprocal Support Agreement (DURSA)

• Healtheway

• Discussion


eHealth Exchange

• eHealth Exchange: A community of exchange partners who share information under a common trust agreement, common set of technical and policy requirements and testing process

• Based upon national standards, NwHIN and others

• Started as an NwHIN program initiative (NwHIN Exchange); production since 2009

• Transitioned to public-private partnership, operated by Healtheway and rebranded as eHealth Exchange

• Healtheway: Non-profit organization chartered to support the eHealth Exchange and focused on cross-industry collaboration to advance HIE implementation


eHealth Exchange Strategic Road Map

Phase 1 – Initial Rollout (2009–2011): early adopters; federal business cases; shared infrastructure; early lessons learned; success / viability

Phase 2 – Mature, Grow, Scale (2012–2013): ONC program pilot concludes; production ramps up; transition to public-private model; grow participation and transactions; refine and scale; expand value cases; align with national strategy; implement sustainability model

Phase 3 – Sustainability (2014): continued growth in participants and transactions; revenue model sustains operations; nationwide deployment; interoperable exchange among private entities

eHealth Exchange Vision / Mission

• Vision

• To serve as a critical element of the nationwide health information infrastructure; and

• To improve the health and welfare of all Americans through health information exchange that is trusted, that scales, and enhances quality of care and health outcomes by supporting comprehensive longitudinal health records.

• Mission

• To expand trusted, secure and interoperable exchange of health information across the nation by providing shared governance and necessary shared service to public and private organizations who wish to interconnect as a network of networks.


eHealth Exchange Participants

• Alabama One Health Record

• Centers for Medicare and Medicaid Services (CMS)

• Children's Hospital of Dallas

• Community Health Information Collaborative (CHIC)

• Conemaugh Health System

• Department of Defense (DOD)

• Department of Veterans Affairs

• Dignity Health

• Douglas County Individual Practice Association (DCIPA)

• Eastern Tennessee Health Information Network (etHIN)

• EHR Doctors

• HealthBridge

• HEALTHeLINK (Western New York)

• Idaho Health Data Exchange

• Inland Northwest Health Services (INHS)

• Kaiser Permanente

• Lancaster General Health

• Marshfield Clinic

• Medical University of South Carolina (MUSC)

• MedVirginia

• MultiCare Health System

• National Renal Administrators Association (NRAA)

• New Mexico Health Information Collaborative (NMHIC)

• North Carolina Healthcare Information and Communications Alliance, Inc. (NCHICA)

• OCHIN

• Quality Health Network

• Regenstrief Institute

• Social Security Administration (SSA)

• South Carolina Health Information Exchange (SCHIEx)

• South East Michigan Health Information Exchange (SEMHIE)

• Strategic Health Intelligence

• University of California, San Diego

• Utah Health Information Network (UHIN)

• Wright State University

Technology Solution Providers who Support eHealth Exchange

• Alere / Wellogic

• ApeniMED

• Aurion

• Axolotl

• CareEvolution

• CGI

• Cogon

• CONNECT

• CSC

• Epic

• Harris

• K Force

• MEDecision

• MedFX

• Medicity

• Mirth

• MobileMD

• Northrop Grumman

• OneHealthPort

• Orion Health

• SAIC

• Talis

• Thompson

• Vangent

• Wright State Research Institute

Value Proposition

• Implement once, exchange with many

• Recognition as part of trusted community

• Expanded connectivity

• Cost effective and efficient

• Enforced compliance and accountability

• Functional and scalable shared services

Transition Status

• Healtheway assumed responsibility for operations, effective 10/1/12

• Onboarding process (i.e. receipt / processing of applications)

• Participation Testing (facilitated by CCHIT as testing body)

• Digital certificates

• Service registry

• Coordinating Committee support

• DURSA and operating policies and procedures

• Implementation specifications

• Preparing to pilot and launch revamped testing program with CCHIT

• Implementing growth and sustainability strategy

Prior State / Current State

Previously

• ONC initiative - NwHIN Exchange

• Coordinating Committee*

• DURSA*

• Onboarding & testing facilitated by ONC

• Operations supported / funded by ONC

• Services provided to participants for free

Today

• Public-private initiative - eHealth Exchange

• Coordinating Committee*

• DURSA*

• Testing facilitated by testing body designated by CC

• Operations supported / funded by Healtheway

• Participants begin paying for services in the future

* Unchanged


Exchange Trust Framework Unchanged

• DURSA remains in full force and effect

• Coordinating Committee retains all authorities as specified in the DURSA

• Healtheway board does not have any oversight responsibilities with respect to Exchange, but will operate under an agreement with the Coordinating Committee


eHealth Exchange CC – Healtheway Board Functions

[Diagram: Federal, state, and private participants make up the eHealth Exchange Coordinating Committee, which appoints three CC representatives to the Healtheway Board (3 CC reps plus up to 9 elected members, with government liaisons); the Board engages the Executive Director.]

Coordinating Committee functions: oversee eHealth Exchange participation; approve specs, test guides, policies; enforce DURSA; handle disputes / breaches; approve changes to DURSA; designate Healtheway to support Exchange operations.

Healtheway Board functions: assure the corporation is effectively supporting and providing value to its customers (e.g., Exchange); make financial decisions (e.g., annual budget, membership program, funding); engage and oversee the Executive Director / staff; guide business strategy and oversee the business (e.g., programs, marketing, partnerships); set strategic direction for HIE implementation collaborative efforts.

Healtheway membership: any organization that wishes to be a part of the Healtheway community collaboration (e.g., HIE, vendor, payer, non-profit, academic institution).

Revamped eHealth Exchange Participation Testing Program

• Moves testing from being an internal function of the eHealth Exchange to an accredited testing lab

• Consistent with national strategy

• Raised the bar significantly, with a focus on interoperability testing

• Streamlined, efficient and scalable

• Significantly less costly for the eHealth Exchange to support

• Cost-effective for technology providers and applicants

• eHealth Exchange participation testing required as a condition of joining the eHealth Exchange

For more details regarding the strategy and test guides: http://exchange-iwg.wikispaces.com/

Healtheway Mission

• To provide the infrastructure to support the safe and secure exchange of data which eHealth Exchange Participants use to further their respective missions to:

• Improve clinical decision making and coordination, quality and affordability of care;

• Support meaningful use;

• Enhance disease surveillance, support preparedness and routine public health missions to improve public health; and

• Realize efficiencies and expedite provision of funding and services to individuals to support their care and well-being.

• To support programs for Healtheway corporate members to advance implementation of HIE:

• Testing program announced and will be launched with the EHR HIE Interoperability Workgroup

DATA USE AND RECIPROCAL SUPPORT AGREEMENT

Overview and Lessons Learned

Data Use and Reciprocal Support Agreement

• A comprehensive, multi-party trust agreement that will be signed by all eligible entities who wish to exchange data among eHealth Exchange Participants

• Eliminates the need for “point-to-point” agreements

• Requires signatories to abide by a common set of terms and conditions that establish Participants' obligations, responsibilities and expectations

• The obligations, responsibilities and expectations create a framework for safe and secure health information exchange, and are designed to promote trust among Participants and protect the privacy, confidentiality and security of the health data that is shared

• Assumes that each Participant has trust relationships in place with its agents, employees and data connections (end users, systems, data suppliers, networks, etc.)

• As a living document, the agreement will be modified over time

Key Provisions

• Participants in production

• Applicable law

• HIPAA privacy and security

• Duty to respond

• Future use of data

• Autonomy principle

• Breach notification

• Mandatory, non-binding dispute resolution

• Allocation of liability risk

• Coordinating Committee composition, authority and duties

• Permitted purposes

• Identity proofing and truthful assertions

• Disclosing participant required to meet its respective legal requirements before releasing data

• Authorization

• Compliance with technical specifications and validation plan

• Voluntary suspension

• Change process for specifications, validation plan, operating policies and procedures and DURSA

Lessons Learned

• Few questions / issues regarding the agreement itself

• Eliminates duplicative point-to-point agreements

• Realized return on investment to date

• Focus is on executing and implementing the agreement to assure “chain of trust”

• Determine who should sign

• Implement “flow-down” provisions via contracts or written policies with a participant's agents, users, etc.

• Agreement has core elements needed for day-to-day exchange of health information

• DURSA and operating policies and procedures referenced by Coordinating Committee on routine basis

• Processes vetted and refined over time

Discussion

For more information: www.healthewayinc.org

ONC HIT Certification Program Update

Carol Bean, Ph.D.

Director, Office of Certification

Office of the National Coordinator for Health IT

HIT Policy Committee

November 7, 2012

Overview

• ONC HIT Certification Program

• 2014 Test Method Development

• 2014 Test Method Public Review and Comment

• Technical Workshop

• 2014 Test Scenarios

ONC HIT Certification Program

ONC HIT Certification Program Transition

• Name Change: Permanent Certification Program = ONC HIT Certification Program

• New Structure: ONC, NVLAP, ANSI, ATLs, and ACBs

• Separate testing and certification entities

• No impact on Temporary Program certifications

On October 4, 2012, the Temporary Certification Program sunset and the ONC HIT Certification Program began.

ONC HIT Certification Program – Participants

ONC: The Office of the National Coordinator (ONC), Office of Certification, manages the ONC HIT Certification Program.

NVLAP: The National Voluntary Laboratory Accreditation Program (NVLAP), administered by the National Institute of Standards and Technology (NIST), accredits Accredited Testing Laboratories (ATLs).

ONC-AA: The ONC-Approved Accreditor (ONC-AA) accredits and oversees ONC-Authorized Certification Bodies (ONC-ACBs). Note: There is only one ONC-AA at a time.

ATL: An NVLAP Accredited Testing Laboratory (ATL) tests Health IT (HIT), including Complete EHRs and/or EHR Modules. Note: There can be multiple ATLs.

ONC-ACB: An ONC-Authorized Certification Body (ONC-ACB) certifies HIT, including Complete EHRs and/or EHR Modules. Note: There can be multiple ACBs.

Developer/Vendor: Creator(s) of HIT, including Complete EHRs and/or EHR Modules.

ONC HIT Certification Program – ATLs and ACBs

NVLAP – Accredits Testing Laboratories (ATLs)

• Certification Commission for HIT (CCHIT)

• Drummond Group, Inc.

• ICSA Laboratories, Inc.

• InfoGard Laboratories, Inc.

• SLI Global Solutions

ANSI – Accredits Certification Bodies (ACBs)

• Certification Commission for HIT (CCHIT)

• Drummond Group, Inc.

• ICSA Laboratories, Inc.

• InfoGard Laboratories, Inc.

• Orion Register, Inc.

ONC HIT Certification Program – Office of the National Coordinator

[Diagram: ONC at the head of the program structure.]

ONC HIT Certification Program – NVLAP and ONC-AA (ANSI)

[Diagram: ONC approves the ONC-Approved Accreditor (ANSI); NIST's NVLAP accredits testing laboratories under ISO/IEC 17011.]

ONC HIT Certification Program – ONC-AA Accredits ONC-ACBs

[Diagram: the ONC-AA (ANSI) accredits certification bodies under ISO/IEC Guide 65.]

ONC HIT Certification Program – ONC Authorizes ONC-ACBs

[Diagram: ONC authorizes accredited certification bodies as ONC-Authorized Certification Bodies (ONC-ACBs).]

ONC HIT Certification Program – NVLAP Accredits ATLs

[Diagram: NVLAP accredits testing laboratories (ATLs) under ISO/IEC 17025, NIST Handbook 150, and NIST Handbook 150-31.]

ONC HIT Testing and Certification – Developers and Vendors

[Diagram: the developer/vendor submits HIT to an ATL, which performs testing against the criteria; an ONC-ACB certifies tested products.]

ONC HIT Testing and Certification – ATL Tests EHR Technology

[Diagram: the ATL performs testing against the criteria, and the product successfully passes testing.]

ONC HIT Testing and Certification – ONC-ACB Certifies EHR Technology

[Diagram: the ONC-ACB certifies the tested product, and the product successfully achieves certification.]

ONC HIT Testing and Certification – ONC Posts CEHRT to CHPL

[Diagram: ONC reviews the certified product and posts it to the Certified Health IT Product List (CHPL).]

Certified Health IT Product List – Overview

[Chart: monthly counts of Total Products, Total Unique Products, Total Unique Products Estimate, and Total Vendors from 2/18/2011 through 10/18/2012, on a scale of 0 to 3,000.]

2014 Test Method Timeline (DRAFT)

• Public comment period: opened 8/23/2012

• Draft test procedures posted for public review in waves: W1 9/7–9/21, W2 9/14–9/28, W3 9/21–10/5, W4 9/28–10/12, W5 10/18–11/5, W6 11/2–11/14, W7 TBD

• Test procedures revised: 9/21–11/30

• Test tool demos (ATL/ACB): 10/2–10/31

• Technical Workshop & Training: 11/13–11/15

• National Coordinator approval: 12/3–12/7

• 2014 test procedures posted: 12/14/2012

• Technical Workshop: 12/14/2012

• Test scenarios reviewed/revised: 12/7, ongoing

• CHPL 3.0: requirements gathering begins 10/1; development begins 11/1; testing begins 12/1; release to follow

• 2014 scope expansion for ACBs and ATLs; ACBs and ATLs set up operations

• 2014 certification begins: January 2013

Test Method Development – 2014 Edition Technical Requirements

Final Rule Publication (September 4, 2012):

• Published in the Federal Register

• Defines the 2014 Edition Certification Criteria

• Aligns with CMS' Stage 2 MU

Test Method Development – Test Method Draft Development

Draft Test Method (September – November 2012):

• Evaluates conformance to the 2014 Edition Certification Criteria

• Overseen by ONC's Office of Certification

Test Method Development – Public Review and Feedback

Public Review (September 7 – November 14, 2012):

• Request feedback/input from the public

• Draft Test Method posted on ONC's website in waves

Test Method Development – Revision and Training

Test Method Update (September – November 2012):

• Analyze public comments

• Update Test Method per public review

• ATL and ACB training and evaluation

Test Method Development – Approval by National Coordinator

Final Test Method (Mid-December 2012):

• National Coordinator approves final 2014 Edition Test Method

• Federal Register Notice

• Final Test Method posted on ONC's website

Test Method Development – Implementation

22

Implementation (December 2012)
• NVLAP and ANSI expand scope to the 2014 Edition
• ONC authorizes ACBs
• ATLs and ACBs begin operational implementation
• Testing and certification begin in January 2013

Draft Test Method Review Process – Wave 1

23

Wave 1 (14 Test Procedures): 9/7/12 – 9/21/12

170.314(a)(1) Computerized provider order entry
170.314(a)(4) Vital signs, body mass index, and growth charts
170.314(a)(5) Problem list
170.314(a)(6) Medication list
170.314(a)(7) Medication allergy list
170.314(a)(10) Drug formulary checks
170.314(a)(11) Smoking status
170.314(a)(15) Patient-specific education resources
170.314(a)(17) Inpatient setting only—advance directives
170.314(d)(5) Automatic log-off
170.314(d)(8) Integrity
170.314(d)(9) Optional—accounting of disclosures
170.314(f)(1) Immunization information
170.314(f)(2) Transmission to immunization registries

Draft Test Method Review Process – Wave 2

24

Wave 2 (7 Test Procedures): 9/14/12 – 9/28/12

170.314(a)(3) Demographics
170.314(a)(9) Electronic notes
170.314(a)(13) Family health history
170.314(a)(14) Patient list creation
170.314(d)(6) Emergency access
170.314(f)(5) Ambulatory setting only—cancer case information
170.314(f)(6) Ambulatory setting only—transmission to cancer registries

Draft Test Method Review Process – Wave 3

25

Wave 3 (6 Test Procedures): 9/21/12 – 10/5/12

170.314(a)(2) Drug-drug, drug-allergy interaction checks
170.314(a)(8) Clinical decision support
170.314(a)(12) Image results
170.314(a)(16) Inpatient setting only—eMAR
170.314(b)(4) Clinical information reconciliation
170.314(e)(2) Ambulatory setting only—clinical summary

Draft Test Method Review Process – Wave 4

26

Wave 4 (9 Test Procedures): 9/28/12 – 10/12/12

170.314(b)(7) Data portability
170.314(d)(1) Authentication, access control, and authorization
170.314(d)(2) Auditable events and tamper-resistance
170.314(d)(3) Audit reports
170.314(d)(4) Amendments
170.314(d)(7) End-user device encryption
170.314(e)(3) Ambulatory setting only—secure messaging
170.314(f)(3) Transmission to public health agencies—syndromic surveillance
170.314(g)(3) Safety-enhanced design

Draft Test Method Review Process – Wave 5

27

Wave 5 (4 Test Procedures): 10/18/12 – 11/1/12

170.314(b)(2) Transitions of care—create and transmit transition of care/referral summaries
170.314(b)(3) Electronic prescribing
170.314(e)(1) View, download, and transmit to 3rd party
170.314(f)(4) Inpatient setting only—transmission of reportable lab tests and values/results

Draft Test Method Review Process – Wave 6

28

Wave 6 (5 Test Procedures): 11/2/12 – 11/16/12

170.314(c)(1) Clinical quality measures—capture and export
170.314(c)(2) Clinical quality measures—import and calculate
170.314(c)(3) Clinical quality measures—electronic submission
170.314(g)(1) Automated numerator recording
170.314(g)(2) Automated measure calculation

Draft Test Method Review Process – Wave 7

29

Wave 7* (4 Test Procedures): TBD

170.314(b)(1) Transitions of care—receive, display, and incorporate summary care records
170.314(b)(5) Incorporate lab tests and values/results
170.314(b)(6) Inpatient setting only—transmission of electronic lab tests and values/results to ambulatory providers
170.314(g)(4) Quality management system

*Represents the remaining Test Procedures. Wave number and content may change.

2014 Edition Test Tools

30

• C-CDA Validation Tool (via Direct and SOAP)
• Lab Results Interface (LRI) Validation Tool
• Transport Testing Tool
• Cypress
• Direct Certificate Discovery Tool (DCDT)
• HL7 v2 Validation Tool
• ePrescribing Conformance Tool
• HL7 Cancer Registry Report Validation Tool
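As a rough illustration of the kind of structural conformance check the validators above automate, the sketch below inspects a C-CDA document's XML skeleton. The checks chosen and the sample fragment are illustrative assumptions only; this is not the logic of any actual ONC test tool.

```python
# Illustrative sketch only: NOT the logic of any ONC test tool.
import xml.etree.ElementTree as ET

CDA_NS = "urn:hl7-org:v3"  # C-CDA documents are CDA R2 XML in this namespace

def basic_ccda_checks(xml_text):
    """Return a list of structural problems found in a C-CDA string."""
    problems = []
    root = ET.fromstring(xml_text)
    if root.tag != f"{{{CDA_NS}}}ClinicalDocument":
        problems.append("root element is not ClinicalDocument")
    if root.find(f"{{{CDA_NS}}}templateId") is None:
        problems.append("missing templateId")
    if root.find(f"{{{CDA_NS}}}code") is None:
        problems.append("missing document-type code")
    return problems

# Hypothetical fragment: carries a header templateId but no <code> element.
sample = (
    '<ClinicalDocument xmlns="urn:hl7-org:v3">'
    '<templateId root="2.16.840.1.113883.10.20.22.1.1"/>'
    "</ClinicalDocument>"
)
print(basic_ccda_checks(sample))  # ['missing document-type code']
```

The real tools go much further (vocabulary bindings, transport over Direct and SOAP), but the pattern is the same: parse, check required structures, report each failure.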

Technical Training & Workshop

31

Technical Workshop (public webinar)

• Presenters: ONC, NVLAP, ANSI, test and tool developers

• Date: November 13, 2012

Technical Training for ATLs and ACBs

• Participants: ONC, NVLAP, ANSI, ATLs, ACBs

• Date: November 14-15, 2012

• Location: DC

Topics

• Program timeline

• Test Procedures

• Overview of public review

• Test Tool demos

• CHPL 3.0

Test Scenario Development – Unit-Based Testing

32

Unit-Based Testing
• Minimum requirement
• Independent tests
• Individual test data and results
• Currently employed for the 2011 Edition Test Procedures

Test Scenario Development – Scenario-Based Testing

33

Scenario-Based Testing
• Alternative to unit-based testing
• Dependent tests
• Dependent test data and results
• Can remove an individual test from the sequence (e.g., if test 1 is not applicable)
– Reflects a typical clinical workflow in multiple care settings
– Allows persistence of data elements (i.e., a model for data threading)
– Maintains testing flexibility (e.g., add/remove "unit tests")
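The contrast between the two styles can be sketched in a few lines. The steps and the patient record below are invented for illustration; they are not actual ONC test procedures.

```python
# Illustrative sketch only: invented steps, not real certification tests.

def enter_vitals(record):
    record["vitals"] = (120, 80)   # this step records new data
    return "vitals" in record

def reconcile_meds(record):
    # Depends on data threaded in by an earlier step.
    if "vitals" not in record:
        return False
    record["meds"] = ["lisinopril"]
    return True

# Unit-based testing: independent tests, each with its own isolated data.
unit_results = [enter_vitals({}), reconcile_meds({"vitals": (120, 80)})]

# Scenario-based testing: one shared record persists across a dependent
# sequence (data threading), and a step can be dropped when not applicable.
def run_scenario(steps, record):
    return {name: step(record) for name, step in steps}

full = [("vitals", enter_vitals), ("meds", reconcile_meds)]
trimmed = [s for s in full if s[0] != "vitals"]  # remove a "unit test"

print(run_scenario(full, {}))     # both pass: meds sees the threaded vitals
print(run_scenario(trimmed, {}))  # meds fails: its dependency was removed
```

Removing a step from the sequence is exactly the flexibility the slide describes, and the failing dependent step shows why scenario content has to be reevaluated whenever a "unit test" is added or removed.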

Test Scenario Development – Approach

34

Approach
– Reflects a typical clinical workflow in multiple care settings
– Allows persistence of data elements (i.e., a model for data threading)
– Maintains testing flexibility (e.g., add/remove "unit tests")

Test Scenario Development – Process

35

Process
– Develop a clinically plausible workflow
– Initial development based on the 2011 Edition Certification Criteria
– Reevaluate against the 2014 Edition Certification Criteria

Test Scenario Development – Types

36

Types
– Medication Management
– Emergency Department
– Interoperability
– Outpatient
– Inpatient

ONC HIT Certification Program

37

Questions?