

CDAR2_IG_QRDA_R1_U1_2012MAY

Quality Reporting Document Architecture
Implementation Guide for CDA Release 2
(US Realm)
Based on HL7 CDA Release 2.0

First Ballot
Pre-publication Preview Version

July 2012

© 2012 Health Level Seven, Inc. Ann Arbor, MI. All rights reserved.

Primary Editor:

Gaye Dolin, MSN RN, Lantana Consulting Group, [email protected]

Co-Editor:

Floyd Eisenberg, MD, National Quality Forum, [email protected]

Co-Chair/Co-Editor:

Robert H. Dolin, MD, Lantana Consulting Group, [email protected]

Co-Editor:

Chad Bennett, Telligen, [email protected]

Co-Chair/Co-Editor:

Brett Marquard, Lantana Consulting Group, [email protected]

Co-Editor:

Feliciano Yu, MD, St. Louis Children's Hospital, [email protected]

Co-Chair:

Calvin Beebe, Mayo Clinic, [email protected]

Co-Editor:

Crystal Kallem, Lantana Consulting Group, [email protected]

Co-Chair:

Austin Kreisler, SAIC Consultant to CDC/NHSN, [email protected]

Co-Editor:

Srinivas [email protected]

Co-Chair:

Grahame Grieve, Kestral Computing Pty Ltd, [email protected]

Co-Editor:

Patty Craig, The Joint Commission, [email protected]

Co-Editor:

Sarah Gaunt, Lantana Consulting Group, [email protected]

Co-Editor:

Joy Kuhl, Optimal [email protected]

Co-Editor:

Yan Heras, Lantana Consulting Group, [email protected]

Technical Editor:

Susan Hardy, Lantana Consulting Group, [email protected]

Co-Editor:

Jingdong Li, Lantana Consulting Group, [email protected]

Technical Editor:

Diana Wright, Lantana Consulting Group, [email protected]

Co-Editor:

Liora Alschuler, Lantana Consulting Group, [email protected]

Current Work Group: David Stumpf, Julie Steele, John Roberts, Fred Miller, Juliet Rubini, Jonathan Ives, Emma Jones, Rute Martins, Jeff James, Karen Nakano, Dan Green, Anne Smith, Lura Daussat, Linda Hyde, Savithri Devaraj, Todd Harpster, Vinayak Kulkarni, Stan Rankins, Brian Peck, Fred Rahmanian, Debbie Krauss, Rob Samples, Kim Schwartz, Robert Wardon, Mary Pratt, Saul Kravitz

Acknowledgments

This implementation guide was produced and developed under the sponsorship of the Office of Clinical Standards and Quality of the Centers for Medicare and Medicaid Services (CMS). Throughout the development of this guide, multiple industry stakeholders provided input and translation of business and technical requirements. Advisors included representatives from the Joint Commission, National Quality Forum (NQF), National Committee for Quality Assurance (NCQA), and American Medical Association (AMA).

Release 1.0 of this guide was produced and developed through the efforts of the Quality Reporting Document Architecture (QRDA) Project, supported by the Child Health Corporation of America (CHCA) to develop and support a standard for quality reporting. The QRDA committee comprised representatives from the American Health Information Management Association (AHIMA), Integrating the Healthcare Enterprise (IHE), CHCA, the Collaborative for Performance Measure Integration with EHR Systems, MedAllies, and the Nationwide Health Information Network (NHIN).

This specification is a set of constraints on existing work, and the extent to which it can accommodate the expressive requirements of quality reporting over time is a function of the richness of the model on which it is built, the Health Level Seven (HL7) Reference Information Model (RIM) and the RIM document standard, and the Clinical Document Architecture Release 2 (CDA R2). We thank all those who have worked for over a decade to produce these fundamental specifications; we especially thank the HL7 Structured Documents committee for their support of this project.

This material contains content from SNOMED CT® (http://www.ihtsdo.org/snomed-ct/). SNOMED CT is a registered trademark of the International Health Terminology Standards Development Organisation (IHTSDO).

This material contains content from LOINC® (http://loinc.org). The LOINC table, LOINC codes, and LOINC panels and forms file are copyright © 1995-2010, Regenstrief Institute, Inc. and the Logical Observation Identifiers Names and Codes (LOINC) Committee. All are available at no cost under the license at http://loinc.org/terms-of-use.

Revision History

Rev | Date     | By Whom | Changes
1.0 | 4/12/12  | Lantana | Created and posted draft CDAR2 QRDA R2 Implementation Guide for May 2012 ballot.
1.1 | 7/1/2012 | Lantana | Post-ballot reconciliation, tracked-changes version.

Table of Contents

1 Introduction
1.1 Purpose
1.2 Audience
1.3 Approach
1.4 CDA R2
1.5 Templated CDA
1.6 Background
1.6.1 QRDA Category I – Single Patient Report
1.6.2 QRDA Category II – Patient List Report
1.6.3 QRDA Category III – Calculated Report
1.7 Relationship to Health Quality Measures Format: eMeasures
1.8 Current Project
1.9 Scope
1.10 Organization of This Guide
1.11 Conformance Conventions Used in This Guide
1.11.1 Templates Not Open for Comment
1.11.2 Templates and Conformance Statements
1.11.3 Open and Closed Templates
1.11.4 Keywords
1.11.5 Cardinality
1.11.6 Vocabulary Conformance
1.11.7 Null Flavor
1.11.8 Asserting an Act Did Not Occur
1.11.9 Data Types
1.12 XML Conventions Used in This Guide
1.12.1 XPath Notation
1.12.2 XML Examples and Sample Documents
1.13 Rendering Header Information for Human Presentation
1.14 Content of the Package
2 QRDA Category I Framework
2.1 Measure Section
2.2 Reporting Parameters Section
2.3 Patient Data Section
3 Quality Data Model-Based QRDA IG
3.1 Introduction
3.2 QDM-Based QRDA Category I Construction Rules
3.2.1 How Many QRDA Documents Should be Created?
3.2.2 Generate a QRDA for Which Patients?
3.2.3 How Many Data Should be Sent?
3.2.4 What if There are No Data in the EHR?
3.3 Generating a QDM-Based QRDA Category I Instance from a QDM-Based eMeasure
3.4 QDM-Based QRDA Category I Instance Validation
4 General Header Template
4.1 US Realm Header
4.1.1 RecordTarget
4.1.2 Author
4.1.3 DataEnterer
4.1.4 Informant
4.1.5 Custodian
4.1.6 InformationRecipient
4.1.7 LegalAuthenticator
4.1.8 Authenticator
4.1.9 Participant (Support)
4.1.10 InFulfillmentOf
4.1.11 DocumentationOf/serviceEvent
4.1.12 Authorization/consent
4.1.13 componentOf
4.2 US Realm Address (AD.US.FIELDED)
4.3 US Realm Date and Time (DT.US.FIELDED)
4.4 US Realm Date and Time (DTM.US.FIELDED)
4.5 US Realm Patient Name (PTN.US.FIELDED)
4.6 US Realm Person Name (PN.US.FIELDED)
5 Document-Level Templates
5.1 QRDA Category I Framework
5.1.1 ClinicalDocument/templateId
5.1.2 ClinicalDocument/code
5.1.3 ClinicalDocument/title
5.1.4 RecordTarget
5.1.5 ClinicalDocument/participants
5.1.6 QRDA Category I Framework Body Constraints
5.2 QDM-Based QRDA
6 Section-Level Templates
6.1 Measure Section
6.1.1 Measure Section QDM
6.2 Patient Data Section
6.2.1 Patient Data Section QDM
6.3 Reporting Parameters Section
7 Entry-Level Templates
7.1 Act Intolerance or Adverse Event Observation
7.1.1 Diagnostic Study Adverse Event
7.1.2 Diagnostic Study Intolerance
7.1.3 Intervention Adverse Event
7.1.4 Intervention Intolerance
7.1.5 Laboratory Test Adverse Event
7.1.6 Laboratory Test Intolerance
7.1.7 Procedure Adverse Event
7.1.8 Procedure Intolerance
7.2 Age Observation
7.3 Allergy Status Observation
7.4 Assessment Scale Observation
7.4.1 Risk Category Assessment
7.5 Caregiver Characteristics
7.6 Communication from Patient to Provider
7.7 Communication from Provider to Patient
7.8 Communication from Provider to Provider
7.9 Deceased Observation
7.9.1 Patient Characteristic Expired
7.10 Discharge Medication - Active Medication
7.11 Drug Vehicle
7.12 Encounter Activities
7.12.1 Encounter Active
7.12.2 Encounter Performed
7.13 Family History Organizer
7.13.1 Diagnosis Family History
7.14 Functional Status Result Observation
7.14.1 Functional Status Result
7.15 Health Status Observation
7.16 Immunization Medication Information
7.17 Incision Datetime
7.18 Indication
7.19 Instructions
7.20 Laboratory Test Performed
7.21 Measure Reference
7.21.1 eMeasure Reference QDM
7.22 Medication Activity
7.22.1 Medication Active
7.23 Medication Administered
7.24 Medication Dispense
7.24.1 Medication Dispensed
7.25 Medication Information
7.26 Medication Supply Order
7.27 Non-Medicinal Supply Activity
7.28 Patient Care Experience
7.29 Patient Characteristic Clinical Trial Participant
7.30 Patient Characteristic Estimated Date of Conception
7.31 Patient Characteristic Gestational Age
7.32 Patient Characteristic Observation Assertion
7.33 Patient Characteristic Payer
7.34 Patient Preference
7.35 Plan of Care Activity Act
7.35.1 Intervention Order
7.35.2 Intervention Recommended
7.36 Plan of Care Activity Encounter
7.36.1 Encounter Order
7.36.2 Encounter Recommended
7.37 Plan of Care Activity Observation
7.37.1 Care Goal
7.37.2 Diagnostic Study Order
7.37.3 Diagnostic Study Recommended
7.37.4 Functional Status Order
7.37.5 Functional Status Recommended
7.37.6 Laboratory Test Order
7.37.7 Laboratory Test Recommended
7.37.8 Physical Exam Order
7.37.9 Physical Exam Recommended
7.38 Plan of Care Activity Procedure
7.38.1 Procedure Order
7.38.2 Procedure Recommended
7.39 Plan of Care Activity Substance Administration
7.39.1 Medication Order
7.39.2 Substance Recommended
7.40 Plan of Care Activity Supply
7.40.1 Device Order
7.40.2 Device Recommended
7.40.3 Medication Supply Request
7.41 Precondition for Substance Administration
7.42 Problem Observation
7.42.1 Diagnosis Active
7.42.2 Diagnosis Inactive
7.42.3 Diagnosis Resolved
7.42.4 Symptom Active
7.42.5 Symptom Assessed
7.42.6 Symptom Inactive
7.42.7 Symptom Resolved
7.43 Problem Status
7.43.1 Problem Status Active
7.43.2 Problem Status Inactive
7.43.3 Problem Status Resolved
7.44 Procedure Activity Act
7.44.1 Intervention Performed
7.44.2 Intervention Result
7.45 Procedure Activity Observation
7.45.1 Diagnostic Study Performed
7.45.2 Functional Status Performed
7.45.3 Physical Exam Performed
7.46 Procedure Activity Procedure
7.46.1 Device Applied
7.46.2 Procedure Performed
7.46.3 Procedure Result
7.47 Product Instance
7.48 Provider Care Experience
7.49 Provider Preference
7.50 Radiation Dosage and Duration
7.51 Reaction Observation
7.51.1 Reaction
7.52 Reason
7.53 Reporting Parameters Act
7.54 Result Observation
7.54.1 Diagnostic Study Result
7.54.2 Laboratory Test Result
7.54.3 Physical Exam Finding
7.54.4 Result
7.55 Service Delivery Location
7.56 Severity Observation
7.57 Status
7.58 Substance or Device Allergy - Intolerance Observation
7.58.1 Allergy - Intolerance Observation
7.58.2 Device Adverse Event
7.58.3 Device Allergy
7.58.4 Device Intolerance
7.59 Tobacco Use
8 Supporting Templates
8.1 Facility Location
8.2 Transfer From
8.3 Transfer To
9 References
APPENDIX A — Acronyms and Abbreviations
APPENDIX B — HQMF QDM Data Type to CDA Mapping Tables
APPENDIX C — Change Log (R1 vs R2)
APPENDIX D — Template IDs Used in This Guide
APPENDIX E — Previously Published Templates
APPENDIX F — Extensions to CDA R2
APPENDIX G — QRDA Category II Report Draft
Header Constraints
Header Attributes
Participants
Body Constraints
Section Constraints
Reporting Parameters Section
Measure Section
APPENDIX H — QRDA Category III Report Draft
Header Constraints
Header Attributes
Participants
Body Constraints
Section Constraints
Reporting Parameters Section
Measure Section

Table of Figures

Figure 1: Templated CDA
Figure 2: Overview of quality framework
Figure 3: Constraints format example
Figure 4: Constraints format – only one allowed
Figure 5: Constraints format – only one like this allowed
Figure 6: Constraint binding to a single code
Figure 7: XML expression of a single-code binding
Figure 8: Translation code example
Figure 9: nullFlavor example
Figure 10: Attribute required
Figure 11: Allowed nullFlavors when element is required (with XML examples)
Figure 12: nullFlavor explicitly disallowed
Figure 12: Asserting an act did not occur
Figure 13: XML document example
Figure 14: XPath expression example
Figure 15: ClinicalDocument example
Figure 16: Prototypic quality data element
Figure 17: Relationship between QDM, eMeasure, and QRDA
Figure 18: Quality Data Element representation in a QRDA Category I instance
Figure 20: US realm header example
Figure 21: effectiveTime with timezone example
Figure 22: recordTarget example
Figure 23: Person author example
Figure 24: Device author example
Figure 25: dataEnterer example
Figure 26: Informant with assignedEntity example
Figure 27: Custodian example
Figure 28: informationRecipient example
Figure 29: legalAuthenticator example
Figure 30: Authenticator example
Figure 31: Participant example for a supporting person
Figure 32: Consent example
Figure 33: documentationOf/serviceEvent example with identifiers
Figure 35: Measure Section QDM example
Figure 34: Diagnostic study adverse event example
Figure 35: Diagnostic study intolerance example
Figure 36: Intervention adverse event example
Figure 37: Intervention intolerance example
Figure 38: Laboratory test adverse event example
Figure 39: Laboratory test intolerance example
Figure 40: Procedure adverse event example
Figure 41: Procedure intolerance example
Figure 42: Age observation example
Figure 149: Risk category assessment example
Figure 47: Communication from patient to provider example
Figure 48: Communication from provider to patient example
Figure 49: Communication from provider to provider example
Figure 50: Patient characteristic expired example
Figure 49: Discharge medication – active medication entry example
Figure 55: Drug vehicle entry example
Figure 57: Encounter activities example
Figure 58: Encounter active example
Figure 59: Encounter performed example
Figure 60: Family history organizer example
Figure 61: Diagnosis Family History
Figure 145: Functional status result example
Figure 62: Health status observation example
Figure 63: Immunization medication information example
Figure 64: Incision datetime example
Figure 65: Indication entry example
Figure 66: Instructions entry example
Figure 68: Laboratory test performed example
Figure 69: Measure reference example
Figure 56: eMeasure reference QDM example
Figure 70: Medication activity example
Figure 71: Medication active example
Figure 72: Medication administered example
Figure 73: Medication dispense example
Figure 74: Medication dispensed example
Figure 75: Medication information example
Figure 76: Medication supply order example
Figure 77: Patient care experience example
Figure 129: Patient characteristic clinical trial participant example
Figure 73: Patient Characteristic - Estimated Date of Conception entry example
Figure 74: Patient characteristic - gestational age entry example
Figure 75: Patient characteristic observation assertion entry example
Figure 106: Patient characteristic payer example
Figure 78: Patient preference example
Figure 79: Plan of care activity act example
Figure 80: Intervention order example
Figure 81: Intervention recommended example
Figure 82: Plan of care activity encounter example
Figure 83: Encounter order example
Figure 84: Type example
Figure 85: Plan of care activity observation example
Figure 86: Care goal example
Figure 87: Diagnostic study order example
Figure 89: Functional status order example
Figure 90: Functional status recommended example
Figure 91: Laboratory test order example
Figure 92: Laboratory test recommended example
Figure 93: Physical exam order example
Figure 94: Physical exam recommended example
Figure 95: Plan of care activity procedure example
Figure 96: Procedure order example
Figure 97: Procedure recommended example
Figure 98: Plan of care activity substance administration example
Figure 99: Medication order example
Figure 100: Substance recommended example
Figure 101: Plan of care activity supply example
Figure 102: Device order example
Figure 103: Device recommended example
Figure 104: Medication supply request example
Figure 107: Precondition for substance administration example
Figure 108: Problem observation example
Figure 109: Problem observation with specific problem not observed
Figure 110: Problem observation for no known problems
Figure 111: NullFlavor example
Figure 112: Diagnosis active example
Figure 113: Diagnosis inactive example
Figure 114: Diagnosis resolved example
Figure 115: Symptom active example
Figure 116: Symptom assessed example
Figure 117: Symptom inactive example
Figure 118: Symptom resolved example
Figure 119: Problem status example
Figure 120: Problem status active example
Figure 121: Problem status inactive example
Figure 122: Problem status resolved example
Figure 123: Procedure activity act example
Figure 125: Intervention result example
Figure 126: Procedure activity observation example
Figure 127: Diagnostic study performed example
Figure 128: Functional status performed example
Figure 130: Physical exam performed example
Figure 131: Procedure activity procedure example
Figure 132: Device applied example
Figure 133: Procedure performed example
Figure 134: Procedure result example
Figure 136: Product instance example
Figure 137: Provider care experience example
Figure 138: Provider preference example
Figure 139: Radiation dosage and duration example
Figure 140: Reaction observation example
Figure 141: Reaction example
Figure 142: Reason example
Figure 143: Result observation example
Figure 144: Diagnostic study result example
Figure 146: Laboratory test result example
Figure 148: Result example
Figure 150: Service delivery location example
Figure 151: Severity observation example
Figure 152: Status example
Figure 43: Allergy – intolerance observation example
Figure 44: Medication adverse effect example
Figure 45: Medication allergy example
Figure 46: Medication intolerance example
Figure 51: Device adverse event example
Figure 52: Device allergy example
Figure 53: Device intolerance example
Figure 155: Tobacco Use entry example using sdtc:valueSet extension
Figure 153: Facility location example
Figure 154: Transfer from example
Figure 155: Transfer to example
Figure 156: realmCode Category II example
Figure 157: ClinicalDocument/templateId Category II example
Figure 158: Null flavor recordTarget Category II example
Figure 159: AssignedAuthor as a processing entity Category II example
Figure 160: Informant Category II example
Figure 161: Custodian Category II example
Figure 162: legalAuthenticator Category II example
Figure 163: Category II/III use of Measure Set and Measure sections
Figure 164: Sample QRDA Category II Patient List Report
Figure 165: Reporting parameters section Category II example
Figure 166: Measure section Category II example
Figure 167: realmCode Category III example
Figure 168: ClinicalDocument/templateId Category III example
Figure 169: Null flavor recordTarget Category III example
Figure 170: AssignedAuthor as a processing entity Category III example
Figure 171: Informant Category III example
Figure 172: Custodian Category III example
Figure 173: legalAuthenticator Category III example
Figure 174: Sample Category III QRDA Calculated Summary Report
Figure 175: Reporting parameters section Category III example
Figure 176: Act/performer Category III example representing a provider with which to group data
Figure 177: Location Category III example representing a clinic with which to group data
Figure 178: entryRelationship Category III example referring to the reporting parameters
Figure 179: entryRelationship Category III observation of an integer value as a numerator example

Table of Tables

Table 1: Content of the Package
Table 3: QDM HQMF Pattern to CDA Mapping Table
Table 4: Basic Confidentiality Kind Value Set
Table 5: Language Value Set (excerpt)
Table 6: Telecom Use (US Realm Header) Value Set
Table 7: Administrative Gender (HL7) Value Set
Table 8: Marital Status Value Set
Table 9: Religious Affiliation Value Set (excerpt)
Table 10: Race Value Set (excerpt)
Table 11: Ethnicity Value Set
Table 12: Personal Relationship Role Type Value Set (excerpt)
Table 13: State Value Set (excerpt)
Table 14: Postal Code Value Set (excerpt)
Table 15: Country Value Set (excerpt)
Table 16: Language Ability Value Set
Table 17: Language Ability Proficiency Value Set
Table 18: IND Role classCode Value Set
Table 19: PostalAddressUse Value Set
Table 20: EntityNameUse Value Set
Table 21: EntityPersonNamePartQualifier Value Set
Table 22: QRDA Category I Framework Contexts
Table 23: Participant Scenarios
Table 3: QDM-Based QRDA Contexts
Table 4: QDM-Based QRDA Constraints Overview
Table 25: Measure Section Contexts
Table 8: Measure Section Constraints Overview
Table 9: Measure Section QDM Contexts
Table 10: Measure Section QDM Constraints Overview
Table 27: Patient Data Section Contexts
Table 28: Patient Data Section QDM Contexts
Table 29: Reporting Parameters Section Contexts
Table 30: Act Intolerance or Adverse Event Observation Contexts
Table 31: Act Intolerance or Adverse Event Observation Constraints Overview
Table 32: Allergy/Adverse Event Type Value Set
Table 33: Diagnostic Study Adverse Event Contexts
Table 34: Diagnostic Study Adverse Event Constraints Overview
Table 35: Diagnostic Study Intolerance Contexts
Table 36: Diagnostic Study Intolerance Constraints Overview
Table 37: Intervention Adverse Event Contexts
Table 38: Intervention Adverse Event Constraints Overview
Table 39: Intervention Intolerance Contexts
Table 40: Intervention Intolerance Constraints Overview
Table 41: Laboratory Test Adverse Event Contexts
Table 42: Laboratory Test Adverse Event Constraints Overview
Table 43: Laboratory Test Intolerance Contexts
Table 44: Laboratory Test Intolerance Constraints Overview
Table 45: Procedure Adverse Event Contexts
Table 46: Procedure Adverse Event Constraints Overview
Table 47: Procedure Intolerance Contexts
Table 48: Procedure Intolerance Constraints Overview
Table 49: Age Observation Contexts
Table 50: Age Observation Constraints Overview
Table 51: AgePQ_UCUM Value Set
Table 52: Allergy Status Observation Contexts
Table 53: Allergy Status Observation Constraints Overview
Table 54: Assessment Scale Observation Contexts
Table 55: Assessment Scale Observation Constraints Overview
Table 56: Risk Category Assessment Contexts
Table 57: Risk Category Assessment Constraints Overview - Differential
Table 58: Caregiver Characteristics Contexts
Table 59: Caregiver Characteristics Constraints Overview
Table 60: Communication from Patient to Provider Contexts
Table 61: Communication from Patient to Provider Constraints Overview
Table 62: Communication from Provider to Patient Contexts
Table 63: Communication from Provider to Patient Constraints Overview
Table 64: Communication from Provider to Provider Contexts
Table 65: Communication from Provider to Provider Constraints Overview
Table 66: Deceased Observation Contexts
Table 67: Deceased Observation Constraints Overview
Table 68: Patient Characteristic Expired Contexts
Table 69: Patient Characteristic Expired Constraints Overview - Differential
Table 70: Discharge Medication - Active Medication Contexts
Table 71: Discharge Medication - Active Medication Constraints Overview
Table 72: Drug Vehicle Contexts
Table 73: Drug Vehicle Constraints Overview
Table 76: Encounter Activities Contexts
Table 77: Encounter Activities Constraints Overview
Table 78: Encounter Type Value Set
Table 79: Encounter Active Contexts
Table 80: Encounter Active Constraints Overview - Differential
Table 81: Encounter Performed Contexts
Table 82: Encounter Performed Constraints Overview - Differential
Table 83: Family History Organizer Contexts
Table 84: Family History Organizer Constraints Overview
Table 85: Family History Related Subject Value Set (excerpt)
Table 86: Diagnosis Family History Constraints Overview - Differential
Table 87: Functional Status Result Observation Contexts
Table 88: Functional Status Result Observation Constraints Overview
Table 89: Functional Status Result Contexts
Table 90: Functional Status Result Constraints Overview - Differential
Table 91: Health Status Observation Contexts
Table 92: Health Status Observation Constraints Overview
Table 93: HealthStatus Value Set
Table 94: Immunization Medication Information Contexts
Table 95: Immunization Medication Information Constraints Overview
Table 96: Vaccine Administered (Hepatitis B) Value Set (excerpt)
Table 97: Incision Datetime Contexts
Table 98: Incision Datetime Constraints Overview
Table 99: Indication Contexts
Table 100: Indication Constraints Overview
Table 101: Instructions Contexts
Table 102: Instructions Constraints Overview
Table 103: Patient Education Value Set
Table 104: Laboratory Test Performed Contexts
Table 105: Laboratory Test Performed Constraints Overview
Table 87: Measure Reference Contexts
Table 88: Measure Reference Constraints Overview
Table 89: eMeasure Reference QDM Contexts
Table 90: eMeasure Reference QDM Constraints Overview
Table 108: Medication Activity Contexts
Table 109: Medication Activity Constraints Overview
Table 110: MoodCodeEvnInt Value Set
Table 111: Medication Route FDA Value Set (excerpt)
Table 112: Body Site Value Set (excerpt)
Table 113: Medication Product Form Value Set (excerpt)
Table 114: Unit of Measure Value Set (excerpt)
Table 115: Medication Active Contexts
Table 116: Medication Active Constraints Overview - Differential
Table 117: Medication Administered Contexts
Table 118: Medication Administered Constraints Overview
Table 119: Medication Dispense Contexts
Table 120: Medication Dispense Constraints Overview
Table 121: Medication Fill Status Value Set
Table 122: Medication Dispensed Contexts
Table 123: Medication Dispensed Constraints Overview - Differential
Table 124: Medication Information Contexts
Table 125: Medication Information Constraints Overview
Table 126: Medication Supply Order Contexts
Table 127: Medication Supply Order Constraints Overview
Table 128: Non-Medicinal Supply Activity Contexts
Table 129: Non-Medicinal Supply Activity Constraints Overview
Table 130: Patient Care Experience Contexts
Table 131: Patient Care Experience Constraints Overview
Table 109: Patient Characteristic Clinical Trial Participant Contexts
Table 110: Patient Characteristic Clinical Trial Participant Constraints Overview
Table 132: Patient Characteristic Estimated Date of Conception Contexts
Table 133: Patient Characteristic Estimated Date of Conception Constraints Overview
Table 134: Patient Characteristic Gestational Age Contexts
Table 135: Patient Characteristic Gestational Age Constraints Overview
Table 136: DayWkPQ Value Set
Table 137: Patient Characteristic Observation Assertion Contexts
Table 138: Patient Characteristic Observation Assertion Constraints Overview
Table 115: Patient Characteristic Payer Contexts
Table 116: Patient Characteristic Payer Constraints Overview
Table 143: Source of Payment Typology Value Set
Table 140: Patient Preference Contexts
Table 141: Patient Preference Constraints Overview
Table 142: Plan of Care Activity Act Constraints Overview
Table 143: Plan of Care moodCode (Act/Encounter/Procedure)
Table 144: Intervention Order Contexts
Table 145: Intervention Order Constraints Overview - Differential
Table 146: Intervention Recommended Contexts
Table 147: Intervention Recommended Constraints Overview - Differential
Table 148: Plan of Care Activity Encounter Constraints Overview
Table 149: Encounter Order Contexts
Table 150: Encounter Order Constraints Overview - Differential
Table 151: Encounter Recommended Contexts
Table 152: Encounter Recommended Constraints Overview
Table 153: Plan of Care Activity Observation Constraints Overview
Table 154: Plan of Care moodCode (Observation) Value Set
Table 155: Care Goal Contexts
Table 156: Care Goal Constraints Overview
Table 157: Diagnostic Study Order Contexts
Table 158: Diagnostic Study Order Constraints Overview - Differential
Table 159: Diagnostic Study Recommended Contexts
Table 160: Diagnostic Study Recommended Constraints Overview
Table 161: Functional Status Order Contexts
Table 162: Functional Status Order Constraints Overview - Differential
Table 163: Functional Status Recommended Contexts
Table 164: Functional Status Recommended Constraints Overview
Table 165: Laboratory Test Order Contexts
Table 166: Laboratory Test Order Constraints Overview
Table 167: Laboratory Test Recommended Contexts
Table 168: Laboratory Test Recommended Constraints Overview - Differential
Table 169: Physical Exam Order Contexts
Table 170: Physical Exam Order Constraints Overview
Table 171: Physical Exam Recommended Contexts
Table 172: Physical Exam Recommended Constraints Overview
Table 173: Plan of Care Activity Procedure Constraints Overview
Table 174: Procedure Order Contexts
Table 175: Procedure Order Constraints Overview
Table 176: Procedure Recommended Contexts
Table 177: Procedure Recommended Constraints Overview
Table 179: Plan of Care Activity Substance Administration Constraints Overview
Table 180: Plan of Care moodCode (SubstanceAdministration/Supply) Value Set
Table 181: Medication Order Contexts
Table 182: Medication Order Constraints Overview - Differential
Table 183: Substance Recommended Contexts
Table 184: Substance Recommended Constraints Overview - Differential
Table 185: Plan of Care Activity Supply Constraints Overview
Table 186: Device Order Contexts
Table 187: Device Order Constraints Overview - Differential
Table 188: Device Recommended Contexts
Table 189: Device Recommended Constraints Overview
Table 190: Medication Supply Request Contexts
Table 191: Medication Supply Request Constraints Overview
Table 198: Precondition for Substance Administration Contexts
Table 199: Precondition for Substance Administration Constraints Overview
Table 200: Problem Observation Contexts
Table 201: Problem Observation Constraints Overview
Table 202: Problem Type Value Set
Table 203: Problem Value Set (excerpt)
Table 204: Diagnosis Active Contexts
Table 205: Diagnosis Active Constraints Overview
Table 206: Diagnosis Inactive Contexts
Table 207: Diagnosis Inactive Constraints Overview
Table 208: Diagnosis Resolved Contexts
Table 209: Diagnosis Resolved Constraints Overview
Table 210: Symptom Active Contexts
Table 211: Symptom Active Constraints Overview
Table 212: Symptom Assessed Contexts
Table 213: Symptom Assessed Constraints Overview
Table 214: Symptom Inactive Contexts
Table 215: Symptom Inactive Constraints Overview
Table 216: Symptom Resolved Contexts
Table 217: Symptom Resolved Constraints Overview
Table 218: Problem Status Contexts
Table 219: Problem Status Constraints Overview
Table 220: Problem Status Active Contexts
Table 221: Problem Status Active Constraints Overview
Table 222: Problem Status Inactive Contexts
Table 223: Problem Status Inactive Constraints Overview
Table 224: Problem Status Resolved Contexts
Table 225: Problem Status Resolved Constraints Overview
Table 226: Procedure Activity Act Contexts
Table 227: Procedure Activity Act Constraints Overview
Table 228: Procedure Act Status Code Value Set
Table 229: Act Priority Value Set
Table 230: Intervention Performed Contexts
Table 231: Intervention Performed Constraints Overview
Table 232: Intervention Result Contexts
Table 233: Intervention Result Constraints Overview
Table 234: Procedure Activity Observation Contexts
Table 235: Procedure Activity Observation Constraints Overview
Table 236: Diagnostic Study Performed Contexts
Table 237: Diagnostic Study Performed Constraints Overview
Table 238: Functional Status Performed Contexts
Table 239: Functional Status Performed Constraints Overview
Table 242: Physical Exam Performed Contexts

Table 243: Physical Exam Performed Constraints Overview378

Table 244: Procedure Activity Procedure Contexts381

Table 245: Procedure Activity Procedure Constraints Overview381

Table 246: Device Applied Contexts388

Table 247: Device Applied Constraints Overview388

Table 248: Procedure Performed Contexts391

Table 249: Procedure Performed Constraints Overview392

Table 250: Procedure Result Contexts394

Table 251: Procedure Result Constraints Overview394

Table 252: Product Instance Contexts397

Table 253: Product Instance Constraints Overview398

Table 254: Provider Care Experience Contexts399

Table 255: Provider Care Experience Constraints Overview399

Table 256: Provider Preference Contexts401

Table 257: Provider Preference Constraints Overview403

Table 258: Radiation Dosage and Duration Contexts404

Table 259: Radiation Dosage and Duration Constraints Overview404

Table 260: Reaction Observation Contexts406

Table 261: Reaction Observation Constraints Overview407

Table 262: Reaction Contexts410

Table 263: Reaction Constraints Overview410

Table 169: Reason Contexts412

Table 170: Reason Constraints Overview413

Table 266: Reporting Parameters Act Contexts414

Table 267: Reporting Parameters Act Constraints Overview415

Table 268: Result Observation Constraints Overview416

Table 269: Diagnostic Study Result Contexts418

Table 270: Diagnostic Study Result Constraints Overview419

Table 269: Laboratory Test Result Contexts421

Table 270: Laboratory Test Result Constraints Overview421

Table 271: Physical Exam Finding Contexts424

Table 272: Physical Exam Finding Constraints Overview424

Table 273: Result Contexts426

Table 274: Result Constraints Overview426

Table 277: Service Delivery Location Contexts428

Table 278: Service Delivery Location Constraints Overview428

Table 279: HealthcareServiceLocation Value Set (excerpt)429

Table 280: Severity Observation Contexts430

Table 281: Severity Observation Constraints Overview431

Table 282: Problem Severity Value Set432

Table 283: Status Contexts433

Table 284: Status Constraints Overview434

Table 285: Substance or Device Allergy - Intolerance Observation Contexts435

Table 286: Substance or Device Allergy - Intolerance Observation Constraints Overview436

Table 51: Allergy Observation Contexts440

Table 52: Allergy Observation Constraints Overview441

Table 53: Allergy/Adverse Event Type Value Set445

Table 54: Medication Brand Name Value Set (excerpt)445

Table 55: Medication Clinical Drug Value Set (excerpt)446

Table 56: Medication Drug Class Value Set (excerpt)447

Table 57: Ingredient Name Value Set (excerpt)447

Table 255: Medication Adverse Effect Contexts448

Table 256: Medication Adverse Effect Constraints Overview449

Table 257: Medication Allergy Contexts452

Table 258: Medication Allergy Constraints Overview453

Table 259: Medication Intolerance Contexts456

Table 260: Medication Intolerance Constraints Overview457

Table 261: Device Adverse Event Contexts460

Table 262: Device Adverse Event Constraints Overview461

Table 263: Device Allergy Contexts464

Table 264: Device Allergy Constraints Overview465

Table 265: Device Intolerance Contexts468

Table 266: Device Intolerance Constraints Overview469

Table 275: Tobacco Use Contexts472

Table 276: Tobacco Use Constraints Overview473

Table 287: Facility Location Contexts475

Table 288: Facility Location Constraints Overview475

Table 289: Transfer From Contexts477

Table 290: Transfer From Constraints Overview477

Table 291: Transfer To Contexts478

Table 292: Transfer To Constraints Overview478

Table 293: HQMF QDM Pattern to CDA Template Mapping Table482

Table 294: HQMF QDM Attribute Patterns to CDA Elements in Specific Templates Mapping Table488

Table 295: HQMF QDM Attribute Patterns to CDA Elements Mapping Table489

Table 296: HQMF QDM Data Type Patterns Not Mappable490

Table 297: Alphabetical List of Template IDs in This Guide492

Table 298: Template Containment in This Guide496

Table 299: Previously Published Templates (Closed for Ballot)508

Table 300: QRDA Category II/III Participant Scenarios516

Introduction

"If you cannot measure it, you cannot improve it."

Lord Kelvin (1824-1907)

Purpose

This document describes constraints on the Clinical Document Architecture Release 2 (CDA R2) header and body elements for Quality Reporting Document Architecture (QRDA) documents. The Institute of Medicine (IOM) definition of quality is: “The degree to which health services for individuals and populations increase the likelihood of desired health outcomes and are consistent with current professional knowledge.”[footnoteRef:1] For care quality to be evaluated, it must be standardized and communicated to the appropriate organizations. [1: http://www.iom.edu/Global/News Announcements/Crossing-the-Quality-Chasm-The-IOM-Health-Care-Quality-Initiative.aspx]

QRDA is a document format that provides a standard structure with which to report quality measure data to organizations that will analyze and interpret the data. Quality measurement in health care is complex. Accurate, interpretable data, efficiently gathered and communicated, are key to correctly assessing whether quality care is delivered.

Audience

The audience for this document includes software developers and implementers with reporting capabilities within their electronic health record (EHR) systems; developers and analysts in receiving institutions; and local, regional, and national health information exchange networks that wish to create and/or process CDA reporting documents created according to this specification.

Approach

Overall, the approach taken here is consistent with balloted implementation guides for CDA. These publications view the ultimate implementation specification as a series of layered constraints. CDA itself is a set of constraints on the Health Level Seven (HL7) Reference Information Model (RIM). Implementation guides such as this add constraints to CDA through conformance statements that further define and restrict the sequence and cardinality of CDA objects and the vocabulary sets for coded elements.

This implementation guide is release 2 (R2) of the QRDA Draft Standard for Trial Use (DSTU). The Background and Current Project sections describe the development of the DSTU.

CDA R2

CDA R2 is “… a document markup standard that specifies the structure and semantics of ‘clinical documents’ for the purpose of exchange” [CDA R2, Section 1.1; see References]. Clinical documents, according to CDA, have six characteristics:

Persistence

Stewardship

Potential for authentication

Context

Wholeness

Human readability

CDA defines a header for classification and management and a document body that carries the clinical record. While the header metadata are prescriptive and designed for consistency across all instances, the body is highly generic, leaving the designation of semantic requirements to implementation guides such as this one.

Templated CDA

CDA R2 can be constrained by mechanisms defined in the “Refinement and Localization”[footnoteRef:2] section of the HL7 Version 3 Interoperability Standards. The mechanism most commonly used to constrain CDA is referred to as “templated CDA”. In this approach, a library of modular CDA templates is constructed so that the templates can be reused across any number of CDA document types, as shown in the following figure. [2: http://www.hl7.org/v3ballot/html/infrastructure/conformance/conformance.htm]

Figure 1: Templated CDA

There are many different kinds of templates that might be created. Among them, the most common are:

· Document-level templates: These templates constrain fields in the CDA header, and define containment relationships to CDA sections. For example, a History-and-Physical document-level template might require that the patient’s name be present, and that the document contain a Physical Exam section.

· Section-level templates: These templates constrain fields in the CDA section, and define containment relationships to CDA entries. For example, a Physical-exam section-level template might require that the section/code be fixed to a particular LOINC code, and that the section contain a Systolic Blood Pressure observation.

· Entry-level templates: These templates constrain the CDA clinical statement model in accordance with real-world observations and acts. For example, a Systolic-blood-pressure entry-level template defines how the CDA Observation class is constrained (how to populate observation/code, how to populate observation/value, etc.) to represent the notion of a systolic blood pressure.

A CDA Implementation Guide (such as this one) includes reference to those templates that are applicable. On the implementation side, a CDA instance populates the templateId field where it wants to assert conformance to a given template. On the receiving side, the recipient can then not only test the instance for conformance against the CDA XML schema, but can also test the instance for conformance against asserted templates.
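The recipient-side checking described above can be sketched in a few lines of Python. Everything here is an illustrative stand-in, not part of the standard: the rule registry and the single (deliberately partial) Severity Observation check are invented for the example; only the dispatch-on-asserted-templateId pattern is the point.

```python
import xml.etree.ElementTree as ET

NS = {"cda": "urn:hl7-org:v3"}

# Hypothetical rule for one template; real checks would come from the guide's
# conformance statements (e.g., CONF:7345/7346 for Severity Observation).
def check_severity_observation(obs):
    errors = []
    if obs.get("classCode") != "OBS":
        errors.append("observation/@classCode SHALL be 'OBS' (CONF:7345)")
    if obs.get("moodCode") != "EVN":
        errors.append("observation/@moodCode SHALL be 'EVN' (CONF:7346)")
    return errors

# Registry mapping an asserted templateId root to its rule (illustrative only).
RULES = {"2.16.840.1.113883.10.20.22.4.8": check_severity_observation}

def validate(xml_text):
    root = ET.fromstring(xml_text)
    errors = []
    # Every element that asserts conformance via templateId gets its rule run.
    for elem in root.iter():
        for tid in elem.findall("cda:templateId", NS):
            rule = RULES.get(tid.get("root"))
            if rule:
                errors.extend(rule(elem))
    return errors

doc = """<observation xmlns="urn:hl7-org:v3" classCode="OBS" moodCode="RQO">
  <templateId root="2.16.840.1.113883.10.20.22.4.8"/>
</observation>"""
print(validate(doc))  # the RQO moodCode violates the asserted template
```

Schema validation alone would accept this instance; only the template-aware pass catches the moodCode violation.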

Background

In early pilots of the QRDA initiative, participating organizations confirmed the feasibility of using the HL7 Clinical Document Architecture (CDA) as the foundation for the QRDA specification. The participants concluded that CDA provided the technical underpinnings for communicating pediatric and adult quality measures for both inpatient and ambulatory care settings.

In later pilots, the HL7 Child Health Work Group and the Structured Documents Work Group developed a QRDA DSTU, Release 1 (R1), first published in September 2008.

The QRDA DSTU R1 defined three categories of quality reporting: a QRDA Category I – Single Patient Report, a QRDA Category II – Patient List Report, and a QRDA Category III – Calculated Report. Only the QRDA Category I report was balloted, while the sections of the DSTU that define QRDA Category II and Category III reports were for comment only. The Release 1 report types are described below.

QRDA Category I – Single Patient Report

A QRDA Category I report is an individual-patient-level quality report. Each report contains quality data for one patient for one or more quality measures, where the data elements in the report are defined by the particular measure(s) being reported on. A QRDA Category I report contains raw applicable patient data. When pooled and analyzed, each report contributes the quality data necessary to calculate population measure metrics.

QRDA R1 defined the CDA framework for quality reports and a method for referencing a quality measure. The DSTU recommended the re-use of Continuity of Care Document (CCD)[footnoteRef:3] clinical statements to send measure data elements. Two measure-specific implementation guides were created as part of the guide. [3: HL7 Implementation Guide: CDA Release 2 - Continuity of Care Document CCD April 1, 2007. http://www.hl7.org/implement/standards/product_brief.cfm?product_id=6]

QRDA Category II – Patient List Report

A QRDA Category II report is a multi-patient-level quality report. Each report contains quality data for a set of patients for one or more quality measures, where the data elements in the report are defined by the particular measure(s) being reported on.

Whereas a QRDA Category I report contains only raw applicable patient data, a QRDA Category II report includes flags for each patient indicating whether the patient qualifies for a measure’s numerator, denominator, exclusion, or other aggregate data element. These qualifications can be pooled and counted to create the QRDA Category III report.

The QRDA Category II Draft appendix contains the header, body, and section constraints for the Patient List Report from QRDA R1 (March 2009).

QRDA Category III – Calculated Report

A QRDA Category III report is an aggregate quality report. Each report contains calculated summary data for one or more measures for a specified population of patients within a particular health system over a specific period of time.

Data needed to generate QRDA Category II and QRDA Category III reports must be included in the collected QRDA Category I reports, as the processing entity will not have access to additional data sources.

The QRDA Category III Draft appendix contains the header, body, and section constraints for the Calculated Report from the QRDA R1 (March 2009).

Relationship to Health Quality Measures Format: eMeasures

The HL7 Health Quality Measures Format (HQMF) is a standard for representing a health quality measure as an electronic document. A quality measure is a quantitative tool that provides an indication of the performance of an individual or organization in relation to a specified process or outcome via the measurement of an action, process or outcome of clinical care. Quality measures are often derived from clinical guidelines and are designed to determine whether the appropriate care has been provided given a set of clinical criteria and an evidence base. Quality measures are also often referred to as performance measures or quality indicators. A quality measure expressed in HQMF format is referred to as an "eMeasure".

Measure developers, drawing upon available evidence, devise measurable parameters to gauge the quality of care in a particular area. These measurable parameters are assembled into quality measures, which are then expressible as eMeasures. eMeasures guide collection of EHR data and other data, which are then assembled into QRDA quality reports and submitted to quality or other organizations. This relationship is summarized in the Overview of quality framework figure.

While there is no prerequisite that a QRDA document must be generated based on an eMeasure, the QRDA standard is written to tightly align with HQMF.

Figure 2: Overview of quality framework

Current Project

Since the creation of QRDA R1, there has been an increase in understanding of the end-to-end electronic quality reporting process. HL7 created a standard called eMeasure: Representation of the Health Quality Measures Format (HQMF).[footnoteRef:4] Using this standard, quality measures are redefined using HL7 RIM semantics, thus expressing the measures using a well-vetted model. This process is called “re-tooling.” Formally expressed criteria within an eMeasure can be converted into queries expressed against an EHR. [4: eMeasure Health Quality Measure Format, Release 1 HL7 Version 3 September 2009 Ballot Site. http://archive.hl7.org/v3ballotarchive/v3ballot2009sep/html/welcome/environment/index.htm]

The current project simplifies the QRDA framework and correlates the QRDA CDA with the HQMF standard. QRDA R2 re-uses the US Realm header from the Consolidated CDA implementation guide[footnoteRef:5]. QRDA R2 does not require the nesting of sections and no longer recommends a measure set section. At the time of development of QRDA R1, measures were referenced using an act/code. QRDA R2 references the measure through reference to an externalDocument. The externalDocument/ids and version numbers are used to reference the measure. [5: HL7 Implementation Guide for CDA Release 2.0, Consolidated CDA Templates, December 2011. http://www.hl7.org/implement/standards/product_brief.cfm?product_id=258]
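Reading the referenced measure back out of an instance is then a simple tree walk to the externalDocument ids. A sketch using Python's standard library; the fragment's id root and extension are made-up placeholders, not real measure identifiers:

```python
import xml.etree.ElementTree as ET

NS = {"cda": "urn:hl7-org:v3"}

# Fragment shaped like a QRDA measure reference via externalDocument.
entry = ET.fromstring("""<act xmlns="urn:hl7-org:v3" classCode="ACT" moodCode="EVN">
  <reference typeCode="REFR">
    <externalDocument classCode="DOC" moodCode="EVN">
      <id root="1.2.3.4.5" extension="measure-1"/>
    </externalDocument>
  </reference>
</act>""")

# Collect (root, extension) pairs for every referenced external document.
measure_ids = [(i.get("root"), i.get("extension"))
               for ext in entry.findall(".//cda:externalDocument", NS)
               for i in ext.findall("cda:id", NS)]
print(measure_ids)
```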

The QRDA R2 framework provides guidance such that conformant measure-specific QRDA implementation guides can be developed through the HL7 process by quality organizations, provider organizations, and other quality stakeholders.

In addition to updating the QRDA Category I DSTU, this ballot also introduces a specific QRDA Category I DSTU Implementation Guide, designed to carry data based on Meaningful Use Stage 2 quality measures expressed in HQMF format.

This specific implementation guide deprecates the use of the CCD sections in the body and instead defines CDA templates based on the National Quality Forum (NQF) Quality Data Model (QDM)[footnoteRef:6], the same model used in the construction of Meaningful Use Stage 2 quality measures. We call this specific guide the "Quality Data Model-Based QRDA Implementation Guide". Rather than a specific implementation guide for each measure or set of measures, reporting organizations will be able to dynamically generate QRDA instances based on the corresponding eMeasure(s). This is described in detail in the Quality Data Model-Based QRDA IG. [6: NQF Quality Data Model. http://www.qualityforum.org/QualityDataModel.aspx]

Scope

This implementation guide is a conformance profile, as described in the “Refinement and Localization”[footnoteRef:7] section of the HL7 Version 3 Interoperability Standards. The base standard for this implementation guide is the HL7 Clinical Document Architecture, Release 2.0.[footnoteRef:8] As defined in that document, this implementation guide is both an annotation profile and a localization profile. It does not describe every aspect of CDA. Rather, it defines constraints on the base CDA. [7: http://www.hl7.org/v3ballot/html/infrastructure/conformance/conformance.htm] [8: HL7 Clinical Document Architecture (CDA Release 2). http://www.hl7.org/implement/standards/cda.cfm]

This guide defines additional constraints on CDA used in a QRDA document in the US realm. Additional optional CDA elements, not included here, can be included and the result will be compliant with the specifications in this guide.

Organization of This Guide

This guide includes a set of CDA Templates and prescribes their use within a QRDA document. The main chapters are:

Chapter 2. QRDA Category I Framework describes the QRDA Framework changes. Release 2 simplifies the QRDA framework and clarifies quality measure referencing.

Chapter 3. Quality Data Model-Based QRDA IG describes the relationship between the National Quality Forum’s (NQF) Quality Data Model (QDM), HQMF, and QRDA. It also describes the concept of dynamic generation of QDM QRDA documents.

Chapter 4. General Header Templates defines the document header constraints that apply to QRDA Category I documents.

Chapter 5. Document-Level Templates defines the document constraints that apply to QRDA Category I documents.

Chapter 6. Section-Level Templates defines the section templates in a QRDA Category I document.

Chapter 7. Entry-Level Templates defines the entry templates in a QDM approach to a QRDA Category I document. For every HQMF QDM pattern there is a QRDA template.

Conformance Conventions Used in This Guide

Templates Not Open for Comment

This implementation guide includes several templates that have already been balloted and published and are, therefore, closed for comment in this ballot. These templates are indicated throughout this guide with a notation such as [Closed for comments; published December 2011] after the template name. The Previously Published Templates appendix lists these templates. If errors are found or enhancements are desired in these templates, please document them on the Consolidated CDA Templates Errata or Consolidated CDA Templates Suggested Enhancements HL7 Structured Documents WIKI pages.

Templates and Conformance Statements

Conformance statements within this implementation guide are presented as constraints from a Template Database (Tdb). An algorithm converts constraints recorded in a Templates Database to a printable presentation. Each constraint is uniquely identified by an identifier at or near the end of the constraint (e.g., CONF:7345). These identifiers are persistent but not sequential.

Bracketed information following each template title indicates the template type (section, observation, act, procedure, etc.), the templateId, and whether the template is open or closed.

Each section and entry template in the guide includes a context table. The "Used By" column indicates which documents or sections use this template, and the "Contains Entries" column indicates templates contained within this template. Each entry template also includes a constraint overview table to summarize the constraints following the table. For templates that are children ("Conforms to") of parent templates, the constraint overview tables are labeled “differential” to indicate that rows duplicating the parent constraints have been removed. Value set tables, where applicable, and brief XML example figures are included with most explanations.

A typical template, as presented in this guide, is shown in the Constraints format example figure. The next sections describe specific aspects of conformance statements—open vs. closed statements, conformance verbs, cardinality, vocabulary conformance, containment relationships, and null flavors.

Figure 3: Constraints format example

Severity Observation

[observation: templateId 2.16.840.1.113883.10.20.22.4.8(open)]

Table xxx: Severity Observation Contexts

Used By:                      Contains Entries:
Reaction Observation
Allergy Observation

This clinical statement represents the severity of the reaction to an agent. A person may manifest many symptoms …

Table yyy: Severity Observation Constraints Overview

Name          XPath        Card.  Verb   Data Type  CONF#  Fixed Value
observation[templateId/@root = '2.16.840.1.113883.10.20.22.4.8']
              @classCode   1..1   SHALL             7345   2.16.840.1.113883.5.6 (HL7ActClass) = OBS

SHALL contain exactly one [1..1] @classCode="OBS" Observation (CodeSystem: 2.16.840.1.113883.5.6 HL7ActClass) STATIC (CONF:7345).

SHALL contain exactly one [1..1] @moodCode="EVN" Event (CodeSystem: 2.16.840.1.113883.5.1001 ActMood) STATIC (CONF:7346).

SHALL contain exactly one [1..1] templateId (CONF:7347) such that it

SHALL contain exactly one [1..1] @root="2.16.840.1.113883.10.20.22.4.8" (CONF:10525).

SHALL contain exactly one [1..1] code="SEV" Severity Observation (CodeSystem: 2.16.840.1.113883.5.4 ActCode) STATIC (CONF:7349).

SHOULD contain zero or one [0..1] text (CONF:7350).

This text, if present, SHOULD contain zero or one [0..1] reference/@value (CONF:7351).

This reference/@value SHALL begin with a '#' and SHALL point to its corresponding narrative (using the approach defined in CDA Release 2, section 4.3.5.1) (CONF:7378).

SHALL contain …

Open and Closed Templates

In open templates, all of the features of the CDA R2 base specification are allowed except as constrained by the templates. By contrast, a closed template specifies everything that is allowed and nothing further may be included. Templates in a QRDA document are open.

Keywords

The keywords shall, should, may, need not, should not, and shall not in this document are to be interpreted as described in the HL7 Version 3 Publishing Facilitator's Guide[footnoteRef:9]: [9: http://www.hl7.org/v3ballot/html/help/pfg/pfg.htm]

shall: an absolute requirement for the particular element. Where a SHALL constraint is applied to an XML element, that element must be present in an instance, but may have an exceptional value (i.e., may have a nullFlavor), unless explicitly precluded. Where a SHALL constraint is applied to an XML attribute, that attribute must be present, and must contain a conformant value.

shall not: an absolute prohibition against inclusion

should/should not: best practice or recommendation. There may be valid reasons to ignore an item, but the full implications must be understood and carefully weighed before choosing a different course

may/need not: truly optional; can be included or omitted as the author decides with no implications

Cardinality

The cardinality indicator (0..1, 1..1, 1..*, etc.) specifies the allowable occurrences within a document instance. The cardinality indicators are interpreted with the format “m..n”, where m represents the least and n the most:

0..1 zero or one

1..1 exactly one

1..* at least one

0..* zero or more

1..n at least one and not more than n
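These cardinality indicators can be checked mechanically. A minimal sketch (the helper names are invented for the example):

```python
def parse_card(card):
    """Parse an 'm..n' cardinality string into (min, max); max None = unbounded."""
    lo, hi = card.split("..")
    return int(lo), None if hi == "*" else int(hi)

def occurs_ok(card, count):
    """True when `count` occurrences of an element satisfy the indicator."""
    lo, hi = parse_card(card)
    return count >= lo and (hi is None or count <= hi)

print(occurs_ok("1..1", 0))  # False: exactly one required, none present
print(occurs_ok("1..*", 3))  # True: at least one
print(occurs_ok("0..1", 2))  # False: at most one allowed
```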

When a constraint has subordinate clauses, the scope of the cardinality of the parent constraint must be clear. In the next figure, the constraint says exactly one participant is to be present. The subordinate constraint specifies some additional characteristics of that participant.

Figure 4: Constraints format – only one allowed

1. SHALL contain exactly one [1..1] participant (CONF:2777).

a. This participant SHALL contain exactly one [1..1] @typeCode="LOC" (CodeSystem: 2.16.840.1.113883.5.90 HL7ParticipationType) (CONF:2230).

In the next figure, the constraint says only one participant “like this” is to be present. Other participant elements are not precluded by this constraint.

Figure 5: Constraints format – only one like this allowed

1. SHALL contain exactly one [1..1] participant (CONF:2777) such that it

a. SHALL contain exactly one [1..1] @typeCode="LOC" (CodeSystem: 2.16.840.1.113883.5.90 HL7ParticipationType) (CONF:2230).

Vocabulary Conformance

The templates in this document use terms from several code systems. These vocabularies are defined in various supporting specifications and may be maintained by other bodies, as is the case for the LOINC® and SNOMED CT® vocabularies.

Note that value-set identifiers (e.g., ValueSet 2.16.840.1.113883.1.11.78 Observation Interpretation (HL7) DYNAMIC) do not appear in CDA instances; they tie the conformance requirements of an implementation guide to the allowable codes for validation.

Value-set bindings adhere to HL7 Vocabulary Working Group best practices, and include both a conformance verb (shall, should, may, etc.) and an indication of dynamic vs. static binding. Value-set constraints can be static, meaning that they are bound to a specified version of a value set, or dynamic, meaning that they are bound to the most current version of the value set. A simplified constraint, used when the binding is to a single code, includes the meaning of the code.
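The static/dynamic distinction above can be illustrated with a toy validator. The value-set versions and code lists below are invented for the example (the OID is the Observation Interpretation set mentioned earlier); only the binding logic is the point:

```python
# Value-set snapshots keyed by version; contents are made up for illustration.
VALUE_SET_VERSIONS = {
    "2.16.840.1.113883.1.11.78": {
        "2010-05": {"N", "A", "H", "L"},
        "2012-04": {"N", "A", "H", "L", "HH", "LL"},
    },
}

def code_allowed(vs_oid, code, binding, static_version=None):
    versions = VALUE_SET_VERSIONS[vs_oid]
    if binding == "STATIC":
        allowed = versions[static_version]   # bound to one frozen version
    else:
        allowed = versions[max(versions)]    # DYNAMIC: most current version
    return code in allowed

# A code added in a later version fails a STATIC binding to the old version
# but passes a DYNAMIC binding.
print(code_allowed("2.16.840.1.113883.1.11.78", "HH", "STATIC", "2010-05"))  # False
print(code_allowed("2.16.840.1.113883.1.11.78", "HH", "DYNAMIC"))            # True
```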

Figure 6: Constraint binding to a single code

1. … code/@code="11450-4" Problem List (CodeSystem: 2.16.840.1.113883.6.1 LOINC).

In this example, the notation conveys the actual code (11450-4), the code’s displayName (Problem List), the OID of the codeSystem from which the code is drawn (2.16.840.1.113883.6.1), and the codeSystemName (LOINC).

HL7 Data Types Release 1 requires the codeSystem attribute unless the underlying data type is “Coded Simple” or “CS”, in which case it is prohibited. The displayName and the codeSystemName are optional, but often useful to include in an instance.

The above example would be properly expressed as follows.

Figure 7: XML expression of a single-code binding

<code code="11450-4"
      codeSystem="2.16.840.1.113883.6.1"
      codeSystemName="LOINC"
      displayName="Problem List"/>

A full discussion of the representation of vocabulary is outside the scope of this document; for more information, see the HL7 Version 3 Interoperability Standards, Normative Edition 2010[footnoteRef:10] sections on Abstract Data Types and XML Data Types R1. [10: http://www.hl7.org/memonly/downloads/v3edition.cfm#V32010 (must be a member to view)]

There is a discrepancy in the implementation of translation code versus the original code between HL7 Data Types R1 and the convention agreed upon for this implementation guide. The R1 data type requires the original code in the root. This implementation guide specifies the standard code in the root, whether it is original or a translation. This discrepancy is resolved in HL7 Data Types R2.

Figure 8: Translation code example

<code code='206525008'
      displayName='necrotizing enterocolitis'
      codeSystem='2.16.840.1.113883.6.96'
      codeSystemName='SNOMED CT'>
  <translation code='NEC-1'
               displayName='necrotizing enterocolitis'
               codeSystem='2.16.840.1.113883.19'/>
</code>

Null Flavor

Information technology solutions store and manage data, but sometimes data are not available; an item may be unknown, not relevant, or not computable or measurable. In HL7, a flavor of null, or nullFlavor, describes the reason for missing data.

Figure 9: nullFlavor example

<birthTime nullFlavor="UNK"/>

Use null flavors for unknown, required, or optional attributes:

NI No information. This is the most general and default null flavor.

NA Not applicable. Known to have no proper value (e.g., last menstrual period for a male).

UNK Unknown. A proper value is applicable, but is not known.

ASKU Asked, but not known. Information was sought, but not found (e.g., the patient was asked but did not know).

NAV Temporarily unavailable. The information is not available, but is expected to be available later.

NASK Not asked. The patient was not asked.

MSK Masked. There is information on this item available but it has not been provided by the sender due to security, privacy, or other reasons. There may be an alternate mechanism for gaining access to this information.

OTH Other. The actual value is not and will not be assigned a standard coded value. An example is the name or identifier of a clinical trial.

The above list contains the null flavors that are commonly used in clinical documents. For the full list and descriptions, see the nullFlavor vocabulary domain in the CDA normative edition.[footnoteRef:11] [11: HL7 Clinical Document Architecture (CDA Release 2). http://www.hl7.org/implement/standards/cda.cfm]

Any SHALL conformance statement may use nullFlavor, unless the attribute is required or the nullFlavor is explicitly disallowed. SHOULD and MAY conformance statements may also use nullFlavor.
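A receiver therefore has to check a required element for either a populated value or a nullFlavor. A minimal sketch for effectiveTime (the helper name is invented):

```python
import xml.etree.ElementTree as ET

def effective_time_status(xml_text):
    """Report either the populated value or the stated reason it is missing."""
    el = ET.fromstring(xml_text)
    null_flavor = el.get("nullFlavor")
    if null_flavor is not None:
        return "missing: " + null_flavor
    return "value: " + el.get("value")

print(effective_time_status('<effectiveTime value="20120501"/>'))  # value: 20120501
print(effective_time_status('<effectiveTime nullFlavor="UNK"/>'))  # missing: UNK
```

Both instances satisfy a plain SHALL on effectiveTime; only a constraint that explicitly disallows nullFlavor would reject the second.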

Figure 10: Attribute required

1. SHALL contain exactly one [1..1] code/@code="11450-4" Problem List (CodeSystem: LOINC 2.16.840.1.113883.6.1) (CONF:7878)

or

2. SHALL contain exactly one [1..1] effectiveTime/@value (CONF:5256).

Figure 11: Allowed nullFlavors when element is required (with xml examples)

1. SHALL contain at least one [1..*] id

2. SHALL contain exactly one [1..1] code

3. SHALL contain exactly one [1..1] effectiveTime

<entry>
  <observation classCode="OBS" moodCode="EVN">
    <id nullFlavor="NI"/>
    <code nullFlavor="OTH">
      <originalText>New Grading system</originalText>
    </code>
    <statusCode code="completed"/>
    <effectiveTime nullFlavor="UNK"/>
    <value xsi:type="CD" nullFlavor="OTH">
      <originalText>Spiculated mass grade 5</originalText>
    </value>
  </observation>
</entry>

Figure 12: nullFlavor explicitly disallowed

1. SHALL contain exactly one [1..1] effectiveTime (CONF:5256).

a. SHALL NOT contain [0..0] @nullFlavor (CONF:52580).

Asserting an Act Did Not Occur

The negationInd attribute, if true, specifies that the act indicated was observed to not have occurred (which is subtly but importantly different from having not been observed). NegationInd='true' is an acceptable way to make a clinical assertion that something did not occur, for example, "no gestational diabetes".

A nested reason for the act not being done can be represented through the use of an entryRelationship clinical statement with an actRelationship type of “RSON”.

Figure 12: Asserting an act did not occur

<entry>
  <observation classCode="OBS" moodCode="EVN" negationInd="true">
    <id root="..."/>
    <code code="ASSERTION"
          codeSystem="2.16.840.1.113883.5.4"
          codeSystemName="HL7ActCode"/>
    <statusCode code="completed"/>
    <value xsi:type="CD"
           code="11687002"
           codeSystem="2.16.840.1.113883.6.96"
           codeSystemName="SNOMED CT"
           displayName="gestational diabetes mellitus"/>
    <entryRelationship typeCode="RSON">
      <observation classCode="OBS" moodCode="EVN">
        <code code="..."
              codeSystem="2.16.840.1.113883.6.96"
              codeSystemName="SNOMED CT"
              displayName="reason"/>
        ...
      </observation>
    </entryRelationship>
  </observation>
</entry>

Data Types

All data types used in a CDA document are described in the CDA R2 normative edition.[footnoteRef:12] All attributes of a data type are allowed unless explicitly prohibited by this specification. [12: HL7 Clinical Document Architecture (CDA Release 2). http://www.hl7.org/implement/standards/cda.cfm]

XML Conventions Used in This Guide

XPath Notation

Instead of the traditional dotted notation used by HL7 to represent Reference Information Model (RIM) classes, this document uses XML Path Language (XPath) notation[footnoteRef:13] in conformance statements and elsewhere to identify the Extensible Markup Language (XML) elements and attributes within the CDA document instance to which various constraints are applied. The implicit context of these expressions is the root of the document. This notation provides a mechanism that will be familiar to developers for identifying parts of an XML document. [13: XML Path Language (XPath) Version 1.0. http://www.w3.org/TR/xpath/]

XPath statements appear in this document in a monospace font.

XPath syntax selects nodes from an XML document using a path containing the context of the node(s). The path is constructed from node names and attribute names (prefixed by an ‘@’), concatenated with a ‘/’ symbol.

Figure 13: XML document example

<author>
  <assignedAuthor>
    ...
    <code codeSystem='2.16.840.1.113883.6.96' codeSystemName='SNOMED CT'
          code='17561000' displayName='Cardiologist'/>
    ...
  </assignedAuthor>
</author>

In the above example, the code attribute of the code element can be selected with the XPath expression shown in the next figure.

Figure 14: XPath expression example

author/assignedAuthor/code/@code
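To see how such an expression behaves in practice, the following sketch evaluates the path against a stripped-down fragment using Python's standard library. Note that a real CDA instance declares the urn:hl7-org:v3 namespace, which must be supplied to the query; it is omitted here to keep the illustration short.

```python
import xml.etree.ElementTree as ET

# Minimal fragment mirroring the example above (no CDA namespace,
# for illustration only).
doc = ET.fromstring(
    "<ClinicalDocument>"
    "<author><assignedAuthor>"
    "<code code='17561000' displayName='Cardiologist'/>"
    "</assignedAuthor></author>"
    "</ClinicalDocument>"
)

# ElementTree's limited XPath selects the element; the @code
# attribute is then read with get().
code = doc.find("author/assignedAuthor/code").get("code")
print(code)  # 17561000
```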

XML Examples and Sample Documents

Extensible Markup Language (XML) examples appear in figures in this document in this monospace font. Portions of the XML content may be omitted for brevity, marked by an ellipsis (...) as shown in the example below.

Figure 15: ClinicalDocument example

<ClinicalDocument xmlns="urn:hl7-org:v3">
  ...
</ClinicalDocument>

Within the narrative, XML element names (code, assignedAuthor, etc.) and attribute values (SNOMED CT, 17561000, etc.) also appear in this monospace font.

This package includes complete sample documents as listed in the Content of the Package table below.

Rendering Header Information for Human Presentation

Metadata carried in the header may already be available for rendering from EHRs or other sources external to the document; therefore, there is no strict requirement to render directly from the document header. An example of this would be a doctor using an EHR that already contains the patient’s name, date of birth, current address, and phone number. When a CDA document is rendered within that EHR, those pieces of information may not need to be displayed since they are already known and displayed within the EHR’s user interface.

Good practice would recommend that the following information be present whenever the document is viewed:

Document title and document dates

Service and encounter types, and date ranges as appropriate

Names of all persons along with their roles, participations, participation date ranges, identifiers, address, and telecommunications information

Names of selected organizations along with their roles, participations, participation date ranges, identifiers, address, and telecommunications information

Date of birth for recordTarget(s)

Content of the Package

The following files comprise this package.

Table 1: Content of the Package

Filename | Description | Ballot Applicability
CDAR2_IG_QRDA_R1_U1_2012MAY | This guide | Normative
QRDA_CAT_I_Sample.xml | Sample QRDA Category I | Informative
cda.xsl | Stylesheet for display of CDA instances | Informative

QRDA Category I Framework

A QRDA Category I report is an individual-patient-level quality report. Each report contains quality data for one patient for one or more quality measures, where the data elements in the report are defined by the particular measure(s) being reported on. A QRDA Category I report contains raw applicable patient data. When pooled and analyzed, each report contributes the quality data necessary to calculate population measure metrics.

Measure Section

The Measure Section template contains explicit reference to the measure or measures being reported. The standard allows a QRDA Category I instance to contain data for any number of measures.

Reporting Parameters Section

The Reporting Parameters Section provides information about the reporting time interval, and may contain other information that provides context for the patient data being reported. The receiving organization may tell the reporting organizations what information they want in this section.
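The general shape of such a section might look like the following. This is only an illustrative sketch, not the normative Reporting Parameters template: the act structure, typeCode, and dates are assumptions for illustration.

```xml
<section>
  <title>Reporting Parameters</title>
  <text>Reporting period: 01 Jan 2012 - 31 Mar 2012</text>
  <entry typeCode="DRIV">
    <act classCode="ACT" moodCode="EVN">
      <effectiveTime>
        <low value="20120101"/>   <!-- start of the reporting period -->
        <high value="20120331"/>  <!-- end of the reporting period -->
      </effectiveTime>
    </act>
  </entry>
</section>
```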

Patient Data Section

The Patient Data Section conveys all the patient data elements expected in the measure(s) stated in the measure section. A patient data element is information about a particular person (as opposed to a population). Examples include: individual’s test results, individual’s encounter location, and an individual’s date of birth.

In the Quality Data Model-Based QRDA IG, corresponding template patterns exist for every QDM Quality Data Type and thus a very specific structure is required. This is not a requirement of the QRDA framework. However, where possible, data elements in a QRDA framework instance should be communicated with entry-level templates from the Implementation Guide for CDA Release 2.0 Consolidated CDA Templates (US Realm) December 2011. In many cases these templates will require further constraint to convey the exact data elements required by a measure or set of measures. Data elements should always be sent with a date/time stamp.

Quality Data Model-Based QRDA IG

Introduction

This section introduces the National Quality Forum (NQF) Quality Data Model (QDM)[footnoteRef:14], and describes how it is used to construct both QDM-based eMeasures and corresponding QDM-based QRDA documents as defined in this guide. [14: NQF Quality Data Model. http://www.qualityforum.org/QualityDataModel.aspx]

From HL7's perspective, the QDM is a domain analysis model that defines concepts recurring across quality measures. The figure below illustrates components of the QDM relevant to understanding how that model guides the construction of CDA templates in QRDA documents.

Figure 16: Prototypic quality data element

The QDM breaks a concept down into a "Quality Data Type", "Quality Data Attributes", and a "Value Set". A "Quality Data Element" is a Quality Data Type, along with its Quality Data Attributes and associated Value Set. Quality Data Types (e.g., Medication Administered, Diagnosis Active) represent a clinical category (e.g., Medication, Diagnosis) coupled with a "state" (e.g., Administered, Active)[footnoteRef:15]. Quality Data Attributes represent various components of the Quality Data Type, such as timing, or category-specific characteristics, as shown above in the Medication Administered example. [15: For the purposes of this document, we make reference to Quality Data Types, and make no further reference to clinical categories or states.]

The next figure illustrates the relationship between QDM and QRDA (and eMeasure).

Figure 17: Relationship between QDM, eMeasure, and QRDA

Quality Data Types have been manually converted into HL7 Reference Information Model (RIM)-derived XML patterns that, when coupled with value sets, become Quality Data Elements that can be used as data criteria within a QDM-based eMeasure.

Each Quality Data Type pattern is assigned a unique ID, which is present in the eMeasure, and which is mapped to a corresponding CDA template.

A Quality Data Element further constrains a Quality Data Type pattern via vocabulary binding to a value set. The linked CDA template can be further constrained through a corresponding value set reference. This linkage allows one to construct a QDM-based QRDA, given a QDM-based eMeasure.

The further constraint on a CDA template through value set referencing results in a new CDA template. To retain flexibility in defining new QDM-based eMeasures (e.g., assigning new value sets to existing Quality Data Types, thereby creating new Quality Data Elements) without the need to update this guide, we introduce the notion of “dynamic CDA template creation”. In essence, this guide has predefined CDA templates corresponding to Quality Data Types, and provides a mechanism for referencing arbitrary value sets, thereby allowing a QRDA instance to conform to a Quality Data Element.

The actual value set referencing mechanism is shown in the following figure. In the figure, the Quality Data Type is identifiable via the template ID, whereas the value set is referenced by the sdtc:valueSet attribute. The CDA template ID in a QRDA instance contains the templateId from the library (in templateId/@root), concatenated with the value set ID from the eMeasure (in templateId/@extension).

Figure 18: TemplateId construction in a QRDA Category I instance
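The referencing mechanism described above can be sketched as follows, assuming the sdtc namespace (urn:hl7-org:sdtc) is declared on the document. The template and value set OIDs are taken from the mapping tables later in this section; the observation wrapper and the elided code and codeSystem values are placeholders.

```xml
<observation classCode="OBS" moodCode="EVN">
  <!-- @root carries the Quality Data Type template ID from the library;
       @extension carries the value set ID taken from the eMeasure -->
  <templateId root="2.16.840.1.113883.10.20.24.3.11"
              extension="2.16.840.1.113883.3.600.0001.18"/>
  ...
  <value xsi:type="CD" code="..." codeSystem="..."
         sdtc:valueSet="2.16.840.1.113883.3.600.0001.18"/>
</observation>
```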

This guide describes how to construct a QDM-based QRDA document for any QDM-based eMeasure rather than giving prescriptive rules for QRDA construction on an eMeasure-by-eMeasure basis.

QDM-Based QRDA Category I Construction Rules

This section provides guidelines on when to create QRDA documents and what data to include.

How Many QRDA Documents Should be Created?

A QDM-based QRDA Category I instance contains data on a single patient for one or more QDM-based eMeasures. Each eMeasure for which data is included shall be referenced in a QRDA Measure Section that adheres to the Measure Section QDM template.

As a result of this rule, a QRDA Category I instance may contain data elements for multiple measures, thus it is very important to include date/time stamps for the various objects (e.g., to know that a particular medication was administered during a particular encounter). This guide, therefore, further constrains those CDA templates used elsewhere, such as in the Implementation Guide for CDA Release 2.0 Consolidated CDA Templates (US Realm), December 2011, e.g. to require date/time stamps.

Generate a QRDA for Which Patients?

As noted above, a QDM-based QRDA Category I instance will reference one or more eMeasures in the QRDA Measure Section. Some quality programs may require that a QDM-based QRDA Category I instance reference only those eMeasures for which the corresponding Initial Patient Population (IPP) criteria have been met. Other quality programs may require that a QDM-based QRDA Category I instance reference a pre-negotiated set of measures.

A QRDA document should be created for each patient meeting at least one of the Initial Patient Population (IPP) criteria of the referenced eMeasure(s). No QRDA document should be created for patients that fail to meet any of the IPP criteria.

Where a QRDA document references multiple measures, the patient should have met at least one of the referenced IPP criteria. For instance, at a hospital reporting on three acute myocardial infarction measures (AMI-1, AMI-2, AMI-3), where the quality program requires that all three eMeasures be referenced in the QRDA Category I instance, if a patient meets the IPP for AMI-1 and AMI-2, but not for AMI-3, then a QRDA for that patient will be generated. As described below, that QRDA would contain data for all referenced eMeasures.

Oftentimes, a quality program implementing QRDA will provide prescriptive guidelines that define the exact rules for which eMeasures to reference in a QDM-based QRDA Category I instance, or the exact triggers for sending a QRDA document. Where such prescriptive guidelines exist, they take precedence over the more general guidance provided here.

How Much Data Should be Sent?

QDM-based QRDA adheres to a "scoop and filter" philosophy, whereby data from an EHR are scooped up and filtered, and the remaining content is packaged into the instance.

When the recipient of the instance has access to no other EHR data, it is important that the instance include data elements relevant to computing eMeasure criteria, as well as the other data elements defined in an eMeasure (for stratification, risk adjustment, etc.). Every data element present in the EHR that is required by the referenced eMeasure(s), not just those needed to compute criteria, shall be included in the QRDA document.

The EHR may have more data than are relevant to the referenced eMeasure(s) and more data than are needed to compute the criteria. For instance, a patient who has been in the Intensive Care Unit undergoing continuous blood pressure monitoring will have reams of blood pressure observations. QDM-based QRDA adheres to a "smoking gun" philosophy where, at a minimum, the conclusive evidence needed to confirm that a criterion was met shall be included in the instance.

At the very least, the QRDA document should include:

Smoking gun data that offers confirmatory proof, where a patient has met a criterion

Data elements that confirm IPP inclusion for each referenced eMeasure (since, by definition, the patient has met the IPP criteria for each referenced eMeasure)

For disjunctive criteria (e.g., where a criterion can be satisfied by either of two data elements), all relevant data elements present in the EHR

Stratification variables, supplemental data elements, risk adjustment variables, and any other data element specified in the referenced eMeasure(s)

A quality program implementing QRDA will often provide prescriptive guidelines that define additional data, outside the smoking gun, that may or must be sent (such as the complete problem or medication list). Where such prescriptive guidelines exist, those take precedence over the more general guidance provided here. In other words, the "smoking gun" heuristic ensures that the bare minimum is present in the QRDA, and does not preclude inclusion of additional data.

What if There are No Data in the EHR?

Other than the IPP data elements, which by definition will be known, data elements defined within the referenced eMeasures may not be present in the EHR. Following from the scoop and filter philosophy, a QDM-based QRDA document will not contain data elements that aren't present in the source system. For instance, if an eMeasure has a criterion "patient is in the Numerator if they have blue eyes" and the patient doesn't have eye color captured in the source system, then the corresponding QRDA document will not contain an eye color observation for that patient.

Whereas a QDM-based QRDA defines this consistent approach to missing data, the QDM-based eMeasure defines the logical processing of missing data (e.g. how to classify a patient into various populations in the absence of an eye color observation). In other words, the eMeasure addresses how it factors in missing data when calculating criteria, and it is the job of the QDM-based QRDA to include relevant data that was present in the EHR, and to not include data that was missing from the EHR.

Each eMeasure will address exactly how it factors in missing data when calculating criteria. For instance, one measure may assume that absence of evidence equals evidence of absence (e.g., patient doesn't have eye color in the EHR, therefore is not in the Numerator), whereas another measure may differentiate between data that is known to be true, data that is known to be false, and data that isn't known. Where an eMeasure requires positive evidence of absence (e.g., "no known allergies"), that eMeasure will include a corresponding data element in its logic. For data elements of an eMeasure, other than those needed to compute the IPP, that are not present in the EHR, the corresponding QRDA will contain nothing.

Generating a QDM-Based QRDA Category I Instance from a QDM-Based eMeasure

This guide does not give prescriptive rules for QRDA construction on an eMeasure-by-eMeasure basis; rather, it describes how to construct a QDM-based QRDA instance for any QDM-based eMeasure. This section walks through an illustrative QRDA generation process in detail, showing how to construct a QDM-based QRDA for one or more QDM-based eMeasures. Detailed mapping tables that allow an implementer to determine which templates to include in a QDM-based QRDA Category I instance, based on the criteria and other data elements in a QDM-based eMeasure, are included in Appendix “HQMF QDM DATA TYPE TO CDA MAPPING TABLES”.

There are many ways to generate a QDM-based QRDA document. This section is illustrative, focusing on how to use an eMeasure to figure out what templates need to be included in the QRDA document and how they are to be instantiated. This section does not address other important aspects of QRDA document generation, such as how to extract relevant data from the EHR.

When a QRDA document references multiple measures, the patient shall have met the IPP criteria for each of them. If a provider wants to report on three myocardial infarction measures (AMI-1, AMI-2, AMI-3) for a population of patients and if a patient meets the IPP for AMI-1 and AMI-2, but not for AMI-3, then the QRDA for that patient will only reference AMI-1 and AMI-2 and will only carry data for those referenced eMeasures.

Illustrative steps to construct a QDM-based QRDA for a given patient in this scenario:

1. Identify those eMeasures to be included in the QRDA document. The patient shall have met the IPP for each of them. List each of these eMeasures in the Measure Section QDM section.

2. For all eMeasures in the Measure Section QDM section, take the union of Quality Data Elements. Include all data elements, including those needed to calculate population criteria, those needed for stratification, those needed for risk adjustment, etc. This list may have the same Quality Data Type multiple times, each associated with a different value set. In other words, take the union of Quality Data Elements (Quality Data Type + value set). You'll wind up with a table, looking something like this:

Table 2: Union of Quality Data Types from eMeasures of Interest

Quality Data Element | Quality Data Type Pattern ID | Value Set Name | Value Set ID
Diagnosis, Active: Pregnancy | 2.16.840.1.113883.3.560.1.2 | Pregnancy Grouping Value Set | 2.16.840.1.113883.3.600.0001.18
Medication, Administered: Aspirin | 2.16.840.1.113883.3.560.1.14 | Aspirin RxNorm Value Set | 2.16.840.1.113883.3.666.05.626
Medication, Administered: Beta Blocker | 2.16.840.1.113883.3.560.1.14 | Beta Blocker RxNorm Value Set | 2.16.840.1.113883.3.117.35
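The union in step 2 can be sketched as a set operation over (Quality Data Type pattern ID, value set ID) pairs. The OIDs below are the ones from Table 2; which data elements belong to which eMeasure is a hypothetical assignment for illustration.

```python
# Each Quality Data Element is a (pattern ID, value set ID) pair.
ami_1 = {
    ("2.16.840.1.113883.3.560.1.2", "2.16.840.1.113883.3.600.0001.18"),   # Diagnosis, Active: Pregnancy
    ("2.16.840.1.113883.3.560.1.14", "2.16.840.1.113883.3.666.05.626"),   # Medication, Administered: Aspirin
}
ami_2 = {
    ("2.16.840.1.113883.3.560.1.14", "2.16.840.1.113883.3.666.05.626"),   # Aspirin again (shared element)
    ("2.16.840.1.113883.3.560.1.14", "2.16.840.1.113883.3.117.35"),       # Medication, Administered: Beta Blocker
}

# Union across the referenced eMeasures: the same Quality Data Type
# pattern appears twice, each time with a different value set.
union = ami_1 | ami_2
print(len(union))  # 3
```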

3. For each Quality Data Type identified, find the corresponding CDA template in this guide. The QDM to QRDA Mapping Table in the appendix maps from HQMF Quality Data Type Pattern ID to the corresponding CDA template ID. You'll wind up with a table, looking something like this:

Table 3: QDM HQMF Pattern to CDA Mapping Table

Quality Data Element | Quality Data Type Pattern ID | Value Set Name | Value Set ID | CDA Template Name | CDA Template Library ID
Diagnosis, Active: Pregnancy | 2.16.840.1.113883.3.560.1.2 | Pregnancy Grouping Value Set | 2.16.840.1.113883.3.600.0001.18 | Diagnosis Active | 2.16.840.1.113883.10.20.24.3.11
Medication, Administered: Aspirin | 2.16.840.1.113883.3.560.1.14 | Aspirin RxNorm Value Set | 2.16.840.1.113883.3.666.05.626 | Medication Administered |