
2015 Draft Test Method Review / Discussion

Implementation, Certification, and Testing (ICT) Workgroup

June 17, 2015

Liz Johnson, co-chair
Cris Ross, co-chair


Current Membership

Name - Organization

Cris Ross, co-chair - Mayo
Liz Johnson, co-chair - Tenet Healthcare Corporation
Sarah Corley - QSI NextGen Healthcare
David Kates - The Advisory Board Company
Udayan Mandavia - iPatientCare
Kyle Meadors - Drummond Group Inc.
Rick Moore - National Committee for Quality Assurance
Andrey Ostrovsky - Care at Hand
Danny Rosenthal - Inova Health System
John Travis - Cerner Corp.
Steve Waldren - American Academy of Family Physicians
Zabrina Gonzaga - Lantana
Kevin Brady, Federal Ex officio - National Institute of Standards and Technology
Brett Andriesen, staff lead - Office of the National Coordinator for Health IT


Test Procedure Assignments

Criteria Number and Description - Review Lead

§170.315(a)(10) Clinical Decision Support - Sarah Corley
§170.315(b)(1) Transitions of Care - John Travis
§170.315(b)(2) Clinical Information Reconciliation & Incorporation - John Travis
§170.315(e)(1) View, Download and Transmit - Sarah Corley
§170.315(b)(6) Data Portability - David Kates
§170.315(b)(3) Electronic Prescribing - John Travis
§170.315(c)(1) Clinical Quality Measures – record and export - John Travis
§170.315(g)(6) Consolidated CDA Creation Performance - David Kates
§170.315(a)(19) Patient Health Information Capture - Sarah Corley
§170.315(a)(2) CPOE – laboratory - Sarah Corley
§170.315(a)(20) Implantable Device List - David Kates
§170.315(g)(7) Application Access to Common Clinical Data Set - David Kates


§170.315(a)(10) Clinical Decision Support - Sarah Corley

• Please provide links to the exact standards cited as well as plain English descriptions of what those standards are

• Clarifications needed for types of CDS (therapeutic / diagnostic and others)

• Clarifications needed for end user action tracking (active versus passive results)
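
To illustrate the active-versus-passive distinction raised above, the following is a minimal sketch, in Python, of how an EHR might record end-user responses to CDS interventions for later review by a tester or auditor. The event fields, mode labels, and action values are hypothetical and are not drawn from the test procedure.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class CdsActionEvent:
    """Hypothetical audit record for a single CDS intervention."""
    intervention_id: str
    mode: str                          # "active" (interruptive alert) vs. "passive" (e.g., reference link)
    presented_at: datetime
    user_action: Optional[str] = None  # e.g., "accepted", "overridden"; None if the user never responded
    acted_at: Optional[datetime] = None

def record_user_action(event: CdsActionEvent, action: str) -> CdsActionEvent:
    """Capture the end user's response so it can be shown during testing or audit."""
    event.user_action = action
    event.acted_at = datetime.now(timezone.utc)
    return event

# An active (interruptive) alert the user overrides, and a passive suggestion never acted on.
active_alert = CdsActionEvent("ddi-001", "active", datetime.now(timezone.utc))
record_user_action(active_alert, "overridden")
passive_link = CdsActionEvent("reference-042", "passive", datetime.now(timezone.utc))

for event in (active_alert, passive_link):
    print(event.intervention_id, event.mode, event.user_action or "no user action recorded")
```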


§170.315(b)(1) Transitions of Care - John Travis

• As written, the test procedure is difficult to comment on without also having access to the test data and test tools in context. The test procedure does not include specifics about the test tools to be used for interoperability / protocol validation.

• The criteria/procedure as proposed includes a variety of C-CDA templates that (a) are not required for, and cannot be used for, the related Stage 3 transition of care objective, and (b) may relate to other criteria to which vendors need not certify for meaningful use or even for other programs at present (e.g., 170.315(b)(9) Care Plans). These C-CDA templates should therefore not all be required to validate this criterion for purposes of satisfying the CEHRT definition; they should be removed, made optional, or made conditional on testing against the other criteria that implicate them.

• The procedure as proposed does not include test data for validating valid and invalid C-CDAs or for C-CDA creation (a sketch of the kind of structural validation such test data would exercise follows this list).

• Lastly, step 1.3 appears to duplicate the test procedure and purpose of 170.315(g)(6), C-CDA Creation Performance. The C-CDA is tested much more extensively there, and that testing is mandatory when 170.315(b)(1) is also being tested. We recommend that these efforts be consolidated and leveraged so that one testing tool and method is used wherever they overlap, and that the scope of the 170.315(b)(1) test procedure be limited to transport through the edge protocol only.
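
As referenced in the test-data comment above, here is a minimal sketch of the kind of structural validation that valid and invalid C-CDA samples would exercise. It assumes the CDA R2 schema (CDA.xsd) and two sample files are available locally; full C-CDA conformance testing also requires the Schematron rules applied by the official validation tools, which this sketch does not attempt.

```python
from lxml import etree

def validate_cda(xml_path: str, xsd_path: str = "CDA.xsd") -> list[str]:
    """Return schema errors; an empty list means the document is schema-valid."""
    schema = etree.XMLSchema(etree.parse(xsd_path))   # CDA R2 schema, assumed to be available locally
    document = etree.parse(xml_path)
    if schema.validate(document):
        return []
    return [f"line {error.line}: {error.message}" for error in schema.error_log]

# Run one known-good and one deliberately broken sample, mirroring the valid and
# invalid C-CDA test data the procedure should supply (file names are placeholders).
for sample in ("ccd_valid_sample.xml", "ccd_invalid_sample.xml"):
    errors = validate_cda(sample)
    print(sample, "PASS" if not errors else f"FAIL ({len(errors)} schema errors)")
```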


§170.315(b)(2) Clinical Information Reconciliation & Incorporation - John Travis

• Suggest that the reconciliation process can be satisfied in whatever manner the EHR product presents the clinical summary for provider review.

• We suggest that confirmation of the reconciled list need not be per item but may cover the entire list at the time the provider completes the reconciliation activity (see the sketch following this list).

• We suggest that the reconciliation process does not require all reconciliation activities to occur in a single system function; they may be performed in more than one function. Most EHR systems have medication reconciliation components that are distinct from their allergy and problem list components, and EHR vendors should be able to provide reconciliation either in a consolidated manner or through separate functions for the different clinical data types covered by this criterion, as appropriate to the functionality of their products.
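
A minimal sketch of the whole-list confirmation reading suggested above: the local and incoming lists are merged, and a single confirmation event covers the entire reconciled list rather than one event per item. The data shapes, field names, and codes are illustrative only.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class ListItem:
    code: str          # e.g., an RxNorm or SNOMED CT code (values below are illustrative only)
    description: str
    source: str        # "local" or "incoming C-CDA"

def reconcile(local: list[ListItem], incoming: list[ListItem]) -> list[ListItem]:
    """Merge the local and incoming lists, de-duplicating on code."""
    merged = {item.code: item for item in local}
    for item in incoming:
        merged.setdefault(item.code, item)
    return list(merged.values())

def confirm_reconciled_list(items: list[ListItem], provider: str) -> dict:
    """One confirmation event covering the entire reconciled list, not one event per item."""
    return {
        "provider": provider,
        "confirmed_at": datetime.now(timezone.utc).isoformat(),
        "item_codes": [item.code for item in items],
    }

local_meds = [ListItem("med-001", "Lisinopril 10 mg tablet", "local")]
incoming_meds = [
    ListItem("med-001", "Lisinopril 10 mg tablet", "incoming C-CDA"),
    ListItem("med-002", "Metformin 500 mg tablet", "incoming C-CDA"),
]
reconciled = reconcile(local_meds, incoming_meds)
print(confirm_reconciled_list(reconciled, provider="Dr. Example"))
```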


§170.315(e)(1) View, Download and Transmit - Sarah Corley

• Link to the standards and provide a plain English description of each. If a standard is proprietary, consider obtaining a limited license to allow linking. Include a link to the 42 CFR requirements as well as a plain English description of the required CLIA data elements.

• Please provide a link to each of the testing tools, and specify the exact name and version of each tool.

• Under the activity history log, please clarify that the only action and information that must be stored for an API call is the request for access by the API (a sketch of such a log entry follows this list).

• Please clarify how testing of an API will occur. Is a test tool being developed? Is there an expectation that the testing labs will develop a product to make an API call for each vendor that they certify? Is ONC or NIST going to develop these products? Vendors would need immediate access to these products to begin testing.

• For criteria that use testing tools to validate output from the EHR, no visual inspection should be required. The testing tools should produce output that validates that the standard was applied properly, and vendors should be able to provide that testing tool output as attestation of successful validation.
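
A minimal sketch of the narrow activity-history-log reading suggested above, in which only the access request for an API call is recorded. The entry fields and log format are hypothetical.

```python
import json
from datetime import datetime, timezone

def log_api_access_request(patient_id: str, client_id: str) -> str:
    """Append a single audit entry; no payload contents or downstream actions are stored."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "patient_id": patient_id,
        "requesting_client": client_id,
        "action": "API access requested",
    }
    line = json.dumps(entry)
    with open("activity_history.log", "a", encoding="utf-8") as log_file:
        log_file.write(line + "\n")
    return line

print(log_api_access_request(patient_id="12345", client_id="third-party-app-007"))
```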


§170.315(b)(6) Data Portability - David Kates

• Create Transitions of Care Documents – Cognitive Status / Functional Status

• Reason for referral and discharge instructions are only required for the CCD

• Some patient demographic fields are only conditionally required

• In addition to the time periods stipulated in the test criteria, there should be a criterion for generating a CCD that relates to the current encounter only. It is not clear that this should be a user configuration option; it should instead be an administrative option.

• There should also be an option for a C-CDA to be created when new results are captured (e.g., lab results received in an ambulatory practice after the encounter note is signed). A sketch of both options follows below.
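
A minimal sketch of the two options above: document scope as an administrative configuration rather than an end-user choice, and generation of a new document when results arrive after the note is signed. The configuration keys and function names are hypothetical, and the document-generation call is a placeholder.

```python
from typing import Optional

# Administrative configuration rather than an end-user option (keys are hypothetical).
ADMIN_CONFIG = {
    "ccd_scope": "current_encounter",   # versus a configured date range per the stated time periods
    "generate_on_new_result": True,
}

def generate_ccd(patient_id: str, scope: str, encounter_id: Optional[str] = None) -> str:
    """Placeholder for the real document-generation call."""
    return f"<ClinicalDocument scope='{scope}' patient='{patient_id}' encounter='{encounter_id}'/>"

def on_result_received(patient_id: str, encounter_id: str, note_signed: bool) -> Optional[str]:
    """If enabled, produce an updated document even when the encounter note is already signed."""
    if not ADMIN_CONFIG["generate_on_new_result"]:
        return None
    # note_signed does not block generation: late-arriving results still produce a document.
    return generate_ccd(patient_id, ADMIN_CONFIG["ccd_scope"], encounter_id)

# A lab result arriving after the ambulatory encounter note is signed still yields a document.
print(on_result_received("12345", "enc-789", note_signed=True))
```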


§170.315(b)(3) Electronic Prescribing - John Travis

• The test procedures, including test data, tooling, and expected outcomes, are insufficiently detailed to be considered a viable first draft. As written, the procedures simply repeat the criteria in the NPRM and do not reflect the detail needed to evaluate ONC's and NIST's expectations for criteria interpretation and validation. Because the proposed timeline requires these requirements to be complete at the time of the Final Rule, we request that the test procedures, including test data, test tooling, and expected outcomes, be released in reviewable draft form for substantive comment prior to the final rule.

• The test procedure should explicitly call out which messages require visual confirmation and which require validation through a test tool. There should be instructions on any required pre-testing setup to properly test features such as transactions for Medication History and Fill Status Notification

• What will be the Test Pharmacy system sending RxFill and other messages? This may require coordination with Surescripts.

• How are vendors getting these pharmacy-initiated messages into their systems?


§170.315(c)(1) Clinical Quality Measures – record and export - John Travis

• The test procedures, including test data, tooling, and expected outcomes, are insufficiently detailed to be considered a viable first draft. As written, the procedures simply repeat the criteria in the NPRM and do not reflect the detail needed to evaluate ONC's and NIST's expectations for criteria interpretation and validation. Because the proposed timeline requires these requirements to be complete at the time of the Final Rule, we request that the test procedures, including test data, test tooling, and expected outcomes, be released in reviewable draft form for substantive comment prior to the final rule. Delay in providing a reviewable draft may inadvertently delay the availability of CEHRT and further complicate providers' quality implementations and their ability to achieve program outcomes.

• The test procedure for "capture" proposes that the vendor enter all data for each and every CQM. This is a highly inefficient and ineffective test methodology and we recommend substantial change.

• The QRDA conformance tool used for ONC's certification testing (i.e., Cypress) and the QualityNet tool need to be aligned. Our past experience indicates that files that validate successfully in Cypress are not always accepted by QualityNet for test submission.
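
A minimal sketch of the alignment check implied above: run the same QRDA files through both validation paths and flag any file accepted by one but rejected by the other. Both validator hooks are placeholders; neither Cypress nor QualityNet is actually invoked here.

```python
from typing import Callable

def validate_with_cypress(qrda_path: str) -> bool:
    raise NotImplementedError("wire up the local Cypress validation result here")

def validate_with_qualitynet(qrda_path: str) -> bool:
    raise NotImplementedError("wire up the QualityNet test-submission result here")

def find_discrepancies(files: list[str],
                       check_a: Callable[[str], bool] = validate_with_cypress,
                       check_b: Callable[[str], bool] = validate_with_qualitynet) -> list[str]:
    """Return the files where the two validation paths disagree."""
    return [path for path in files if check_a(path) != check_b(path)]

# Usage with stubbed results, since the real hooks are not wired up in this sketch.
stub_results = {
    "qrda_cat1_patient_a.xml": (True, False),   # passes the first tool, rejected by the second
    "qrda_cat1_patient_b.xml": (True, True),
}
disagreements = find_discrepancies(
    list(stub_results),
    check_a=lambda path: stub_results[path][0],
    check_b=lambda path: stub_results[path][1],
)
print("Files needing reconciliation between the two tools:", disagreements)
```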


§170.315(g)(6) Consolidated CDA Creation Performance - David Kates

• While the C-CDA 2.0 standard offers significant advantages in terms of additional structured data elements and specificity to improve interoperability, it may not be prudent to adopt it until it moves from a draft standard for trial use (DSTU) to an accredited standard.

• The document templates listed represent a significantly expanded scope from MU2. There would be benefit to the industry if EHRs could all consistently produce and consume all of these documents, but this may be a significant reach. The CCD, Discharge Summary, and Progress Note might be considered the highest priority if the scope were reduced.

• Change EHR references to HIT


§170.315(a)(19) Patient Health Information Capture - Sarah Corley

• It is not completely clear whether this criterion is limited to types of health information that might be created externally or also covers documentation completed within the EHR, since that too is health information documentation. The mention of "record" suggests internally created documentation. If the requirement applies only to documentation created outside the EHR, that should be stated explicitly. If it applies to internally created documentation, the criterion should specify whether only a subset of information must be labeled or whether every piece of documentation must be able to be labeled.

• Please clarify for testers and auditors that there are many ways that health information could be labeled.

• Clarify if narrative can be free text anywhere in the HIT system


§170.315(a)(2) CPOE – laboratory - Sarah Corley

• Commercial labs are not using any standards for their Directory of Services, nor are they using LOINC for the subset of orderable tests that have been assigned a LOINC code, so no real-world testing would be possible for this measure. We suggest rewording this requirement to remove the word "electronic" before "Directory of Services." We suggest selecting several of the national lab vendors (such as Quest or LabCorp) to determine the orderable codes, the ask-at-order-entry (AOE) questions, and the test collection requirements, and then allowing the developer to select the data set for whichever lab vendors they choose (a sketch of this approach follows this list).

• Testing should be done with order compendia from major national labs such as Quest or LabCorp. Preliminary discussions suggest that Quest will be supporting this while LabCorp has not started development

• LOI is still a DSTU; consideration should be given to using an existing ordering interface with a national lab vendor.
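
As referenced above, a minimal sketch of the developer-selected compendium approach: load a lab vendor's orderable-test compendium and resolve an order to its code, AOE questions, and collection requirements. The compendium rows, order codes, and LOINC codes are illustrative only and are not taken from any vendor's actual Directory of Services.

```python
import csv
import io

# Illustrative compendium rows only; real data would come from the selected lab vendor.
COMPENDIUM_CSV = """order_code,description,loinc,aoe_questions,collection
CBC001,Complete Blood Count,58410-2,,Lavender-top EDTA tube
GTT075,Glucose Tolerance 2 hr,,Fasting?;Weeks gestation?,Gray-top tube
"""

def load_compendium(text: str) -> dict:
    reader = csv.DictReader(io.StringIO(text))
    return {row["order_code"]: row for row in reader}

def describe_order(compendium: dict, order_code: str) -> dict:
    """Resolve an orderable test to its code, AOE questions, and collection requirements."""
    row = compendium[order_code]
    return {
        "description": row["description"],
        "loinc": row["loinc"] or None,   # many orderables still lack an assigned LOINC code
        "aoe_questions": [q for q in row["aoe_questions"].split(";") if q],
        "collection": row["collection"],
    }

compendium = load_compendium(COMPENDIUM_CSV)
print(describe_order(compendium, "GTT075"))
```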


§170.315(a)(20) Implantable Device List - David Kates

• The test criteria are consistent with the certification requirements. The dependency is that this information is readily available from implantable devices. For test purposes, sample data should be provided to the CEHRT under test to verify compliance, and it should be verified that inappropriate data (e.g., missing required fields or incorrectly formatted dates) produces errors, to ensure applicable validity checking is performed.
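
A minimal sketch of the validity checking described above: confirm that a supplied implantable-device record contains its required fields and a correctly formatted date, and that bad sample data produces errors. The field names, date format, and sample values are illustrative; this does not implement UDI parsing itself.

```python
from datetime import datetime

REQUIRED_FIELDS = ("udi", "device_description", "implant_date")

def validate_device_record(record: dict) -> list[str]:
    """Return error messages for missing required fields or a badly formatted implant date."""
    errors = [f"missing required field: {name}" for name in REQUIRED_FIELDS if not record.get(name)]
    implant_date = record.get("implant_date")
    if implant_date:
        try:
            datetime.strptime(implant_date, "%Y-%m-%d")
        except ValueError:
            errors.append(f"implant_date not in YYYY-MM-DD format: {implant_date!r}")
    return errors

# Illustrative sample records: one well-formed, one with missing and misformatted data.
good_record = {"udi": "sample-udi-string", "device_description": "Hip prosthesis", "implant_date": "2015-06-17"}
bad_record = {"udi": "", "device_description": "Hip prosthesis", "implant_date": "06/17/2015"}

for label, record in (("good sample", good_record), ("bad sample", bad_record)):
    print(label, validate_device_record(record) or "no errors")
```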


§170.315(g)(7) Application Access to Common Clinical Data Set - David Kates

• The API that will be used to meet these requirements needs to be stipulated. If the HL7 FHIR standard is accredited by the time these certification test criteria are established, it is the most likely candidate. If not, there will need to be a process for the vendor to define the API that will be used and its characteristics, which would presumably be RESTful web services, ideally returning JSON. It would also be advisable to stipulate the requirements for "registering" an application, including permissioning and authentication, which may be provided by OAuth 2.0 (a sketch of this style of interaction follows at the end of this list). There will be SIGNIFICANTLY lower value in having the industry establish a requirement for EHRs to expose an API if each vendor adopts a proprietary API.

• Ideally, a test fixture would be established that exercises the API against the test criteria and is made accessible to EHR vendors to verify the functionality stipulated in the test criteria. This should include verification that valid data for the MU data set elements are returned, including presence of required fields, type checking, and terminology verification.

• While it is likely not feasible to have one standard API that covers every possible HIT system, it is prudent for ONC to set standards for how each vendor must publish its API protocols (e.g., data dictionary, functions, etc.) to ensure other HIT vendors can easily build interfaces to the various HIT modules required for interoperability.
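
A minimal sketch of the style of API interaction described in the first bullet: a FHIR-like RESTful endpoint returning JSON, with application registration and authentication handled by an OAuth 2.0 client-credentials flow. The base URL, token endpoint, client credentials, scope, and resource choice are all hypothetical, since the rule does not yet name the API.

```python
import requests

# Hypothetical endpoints and credentials; a real test fixture would supply its own.
TOKEN_URL = "https://ehr.example.org/oauth2/token"
FHIR_BASE = "https://ehr.example.org/fhir"

def get_access_token(client_id: str, client_secret: str) -> str:
    """Authenticate the registered application via an OAuth 2.0 client-credentials grant."""
    response = requests.post(TOKEN_URL, data={
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": "patient/*.read",
    })
    response.raise_for_status()
    return response.json()["access_token"]

def get_medications(patient_id: str, token: str) -> dict:
    """Fetch one Common Clinical Data Set element (medications) as JSON from a FHIR-style resource."""
    response = requests.get(
        f"{FHIR_BASE}/MedicationStatement",
        params={"patient": patient_id},
        headers={"Authorization": f"Bearer {token}", "Accept": "application/json"},
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    token = get_access_token("test-app", "test-secret")
    bundle = get_medications("12345", token)
    print(bundle.get("resourceType"), len(bundle.get("entry", [])), "entries returned")
```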