Vol. 2 Issue 4 July-August 2004 REPRINTED WITH PERMISSION OF HOMELAND FIRST RESPONSE. © 2004


BY LUCIA K. MARKS & MICHAEL POTTER

Imagine that you are required, at great expense, to annually test your car's air bag to determine its effectiveness in saving your life.

Now imagine that your air bag's performance is based solely on the visual observation and opinion of your mechanic rather than a quantifiable measure of its ability to inflate in a fraction of a second. Finally, imagine that you are asked to compare the mechanic's observations and opinions from last year to those of a different mechanic this year and determine if the air bag's performance has changed. Sound tricky? We're facing similar assessment challenges in the world of emergency response drill evaluation.

Emergency response organizations at every level are planning and conducting preparedness exercises at considerable expense, yet these agencies have no objective system for evaluating the results once these drills are over. When speaking to the American Hospital Association in April about the importance of conducting frequent drills, Department of Homeland Security (DHS) Secretary Tom Ridge said, "It's the only way to strengthen our preparedness."

The Office of Domestic Preparedness's (ODP) Homeland Security Exercise and Evaluation Program, Volume I: Overview and Doctrine describes a well-designed and executed exercise as one that:

• Tests and validates policies, plans, procedures, training, equipment and interagency agreements;
• Clarifies and trains personnel in roles and responsibilities;
• Improves interagency coordination and communications;
• Identifies gaps in resources;
• Improves individual performance; and
• Identifies opportunities for improvement.

Access at www.ojp.usdoj.gov/odp/docs/HSEEPv1.pdf

These exercises should not only bring critical players together as they would during a real crisis, but also measure the ability of the entities being exercised to perform their required functions. In an


Drilling for Results: The quest for objective exercise evaluations

Photo courtesy of FEMA


effort to obtain this measurement, all organizations that conduct exercises using DHS/ODP funds are now subject to the following requirements:

• All tabletop exercises (TTXs), drills, functional exercises (FEs) and full-scale exercises (FSEs) will be evaluated and performance-based;
• An After Action Report (AAR) will be prepared and submitted to DHS/ODP following every TTX, drill, FE and FSE;
• An Improvement Plan (IP) will be developed, submitted to DHS/ODP and implemented to address findings and recommendations identified in the AAR; and
• Periodic exercise scheduling and improvement implementation data will be reported to DHS/ODP.

THE CHALLENGES

In this context, agencies and organizations that perform DHS/ODP-funded emergency response exercises and drills face two major challenges: 1) the current lack of an objective means of measuring performance during preparedness exercises, and 2) the development of the exercises themselves.

Evaluation is the cornerstone of exercises. It clearly documents the strengths and opportunities for improvement in a jurisdiction's preparedness and is the first step in the improvement process. However, current exercise evaluations are almost entirely subjective and provide very little quantitative evidence of performance capabilities. TOPOFF2 (Top Officials Exercise Series), a Congressionally mandated, large-scale exercise conducted in May 2003, involved more than 8,500 people from 100 federal, state and local agencies, the American Red Cross and the Canadian government at an expense of more than $16 million. Although DHS said the exercise was a success and "provided a tremendous learning experience," the results recorded in the After Action Report (AAR) were almost entirely subjective. (Access the TOPOFF2 AAR at www.dhs.gov/interweb/assetlibrary/T2_Report_Final_Public.doc.)

This lack of quantitative evidence is a critical problem for three reasons:

• First, there is no solid basis for understanding if a given entity is capable of meeting critical performance objectives during a real crisis;
• Second, there is no measure of improved performance over time from previous exercises; and
• Third, federal agencies are relying much more heavily on the states to provide quantitative evidence that they are improving their preparedness capabilities when they consider future funding initiatives.

DEMAND FOR EVIDENCE

States are experiencing a demand for evidence of increased preparedness. The vast majority of federal funding agencies, including the Centers for Disease Control and Prevention (CDC), ODP, Health Resources and Services Administration (HRSA) and Federal Emergency Management Agency (FEMA), are requiring states to create preparedness evaluation plans in their 2005 grant proposals. On a similar front, such organizations as the Columbia University School of Nursing Center for Health Policy are working with the CDC to create competencies for all public health workers as a way to create benchmarks and assess preparedness among public health professionals. The CDC is also refining its assessment tool, titled Public Health Preparedness and Response Capacity Inventory, for local and regional public health offices.

The second major challenge facing agencies and organizations is the development of the exercises themselves. For the preparedness exercises to have the greatest return on investment, the evaluation must be an integral part of the exercise itself, and in addition, this evaluation must be quantifiable or objective. To ensure realistic, objective interactions occur within the context of the homeland security, public health or medical aspects of exercises, subject matter experts in these areas must be involved in designing and implementing the evaluation methodology. Through the development of objective measures for the public health and medical aspects of preparedness exercises, and the implementation of those measures in actual preparedness exercises, agencies and organizations will have a truly effective tool for both measurement and training.

DOQ IN DEVELOPMENT

In March, the Texas Institute for Health Policy Research (TIHPR), a nonprofit health policy think-tank in Austin, in collaboration with Altarum, a nonprofit research and innovation institute in San Antonio, began developing quantifiable exercise metrics for the Texas Department of Health (TDH). The Drill Outcome Quantification (DOQ) project will help maximize TDH's return on its investment in preparedness exercises. The completed project could prove fruitful for other agencies seeking a detailed set of quantitative measures for evaluating emergency response exercises. TIHPR developed a phased approach for supporting TDH to achieve the goals described below.

Phase I

Under Phase I of the DOQ program, TIHPR developed a detailed set of quantitative measures, or metrics, to use in conjunction with preparedness exercises for assessing public health and medical response to chemical, biological, radiological, nuclear and explosive (CBRNE) scenarios. TIHPR conducted an initial assessment of current exercise evaluation procedures within the civilian and military communities to ensure that their subsequent work is well informed, and that the subsequent metric development activities take into account any measures that already exist. Throughout this process,


[Photo: A FEMA employee monitors the TOPOFF2 exercise, which involved more than 8,500 people at a cost of more than $16 million. Was it a success? The results recorded in the After Action Report were almost entirely subjective.]

The evaluation process for exercises should comprise the following components:

• A formal exercise evaluation;
• Integrated analysis;
• A comprehensive AAR including both qualitative and quantitative data summarizing what has occurred and analyzing the performance of critical tasks and capacities; and
• Improvement planning that facilitates converting the lessons learned into concrete, measurable steps that result in improved response capabilities.


TIHPR worked collaboratively with TDH to ensure that the metrics being developed are in concert with federal guidelines. (At press time, the metrics were not available for publication.)
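Since the actual DOQ metrics were not available for publication, the sketch below only illustrates the general idea of a time-based drill metric: timestamped observations recorded by evaluators, reduced to a number that can be compared against a target and against prior exercises. The event names and the 15-minute target are illustrative assumptions, not TIHPR's measures.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class DrillEvent:
    """A single timestamped observation recorded during an exercise."""
    name: str
    timestamp: datetime

def elapsed_minutes(events, start_name, end_name):
    """Minutes between the first occurrence of two named events."""
    times = {e.name: e.timestamp for e in events}
    return (times[end_name] - times[start_name]).total_seconds() / 60.0

# Illustrative events from a hypothetical chemical-release drill.
events = [
    DrillEvent("agent_detected", datetime(2004, 5, 1, 9, 0)),
    DrillEvent("eoc_notified", datetime(2004, 5, 1, 9, 12)),
    DrillEvent("first_decon_complete", datetime(2004, 5, 1, 9, 47)),
]

# One quantitative metric: notification time against an assumed 15-minute target.
notify_min = elapsed_minutes(events, "agent_detected", "eoc_notified")
print(f"EOC notification time: {notify_min:.0f} min "
      f"({'met' if notify_min <= 15 else 'missed'} 15-min target)")
```

Because the result is a number rather than an opinion, the same metric computed in next year's drill can be compared directly, which is exactly what the mechanic's-eye evaluation in the opening analogy cannot do.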

Phase II

In Phase II of the DOQ program, TIHPR will "operationalize" the metrics and dashboard developed in the first phase. The support provided under this task will involve the design and development of a relational database for capturing the quantitative data gathered during preparedness exercises.

This capability will be further enhanced in two ways. First, the "front-end" or interface of the database will be programmed to support exercise data collection activities using hand-held devices. Second, the dashboard developed under Phase I will be programmed to be dynamically populated by the database. This will allow for on-site performance reporting during and immediately following exercises. This immediate feedback will provide the greatest impact on those participating in exercises.
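The article does not publish the DOQ schema, but a relational store of the kind Phase II describes can be sketched with Python's built-in sqlite3: evaluators insert timestamped observations as they collect them, and a dashboard query aggregates the data on demand. All table and column names here are assumptions for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for the exercise database
conn.executescript("""
CREATE TABLE exercise (id INTEGER PRIMARY KEY, name TEXT, date TEXT);
CREATE TABLE observation (
    id INTEGER PRIMARY KEY,
    exercise_id INTEGER REFERENCES exercise(id),
    metric TEXT,          -- e.g. 'notification_minutes'
    value REAL,           -- the quantitative measurement
    recorded_by TEXT      -- evaluator, possibly on a hand-held device
);
""")

# Observations as they might stream in from evaluators in the field.
conn.execute("INSERT INTO exercise VALUES (1, 'CBRNE FSE', '2004-05-01')")
conn.executemany(
    "INSERT INTO observation (exercise_id, metric, value, recorded_by) "
    "VALUES (1, ?, ?, ?)",
    [("notification_minutes", 12.0, "evaluator_a"),
     ("decon_throughput_per_hr", 40.0, "evaluator_b"),
     ("decon_throughput_per_hr", 36.0, "evaluator_c")],
)

# Dashboard-style rollup: average value per metric for the exercise.
for metric, avg in conn.execute(
        "SELECT metric, AVG(value) FROM observation "
        "WHERE exercise_id = 1 GROUP BY metric ORDER BY metric"):
    print(f"{metric}: {avg:g}")
```

Because the dashboard is just a query over the same tables the collectors write to, it reflects new observations immediately, which is what makes on-site reporting during and right after an exercise possible.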

Phase III

Under Phase III of the DOQ program, TIHPR will develop a constructive simulation of the public health/medical aspects of emergency response. A constructive simulation is a computer model where simulated people operate in a simulated environment. Actual people, the users, will provide input to the model and may affect model outcomes via those inputs. However, users will not directly determine the outcomes. The model may be designed to prompt users for input and thereby attain some desired level of interactivity.

Constructive simulations have the ability to demonstrate the downstream implications of decision-making that are rarely attainable during live exercises. They also present a viable, cost-effective alternative to expensive, live field exercises.

VALUE-ADDED

Agencies and organizations have invested, and will continue to invest, significant resources in conducting preparedness exercises. The development of objective metrics will make it possible for agencies at all levels to:

• Provide a more realistic assessment of readiness and capability;
• Ensure that independent, appropriately related expertise and guidance are brought to bear when designing exercises;
• Force the exercises themselves into a more objective realm, increasing the return on investment; and
• Provide quantitative support in acquiring future funding, particularly as federal agencies demand to see some objective measurement of improvement made through prior funding.

With the creation and implementation of quantifiable exercises, drills will be transformed from subjective to objective sources of information and increase their value as both preparedness and educational tools.

TIHPR and Altarum welcome input from state and federal agencies. Please email any questions or comments to Lucia Marks at [email protected].

Lucia Marks is the director of disaster response for the Texas Institute for Health Policy Research (TIHPR) in Austin. TIHPR is a statewide, nonprofit, non-partisan organization serving as a catalyst for improvement in the health of Texans through education in health policy options and grassroots, community-based health solutions. In addition to the Drill Outcome Quantification (DOQ) project, the institute developed, in cooperation with the Texas Department of Health, Bioterrorism Preparedness Planning for Texas Hospitals, a manual to aid all Texas hospitals in their preparation for response to a bioterrorist act, contagious disease outbreak or other public health threat or emergency. For more information on TIHPR, visit www.healthpolicyinstitute.org.

Michael Potter is the director of Texas operations for the Altarum Institute, a nonprofit research and innovation institution that researches, develops and deploys advanced information technology and decision-support systems solutions. With offices in Ann Arbor, Mich., Alexandria, Va., and San Antonio, Texas, Altarum serves government and commercial customers in the national security, health care, energy, environment and transportation sectors. For more information, visit www.altarum.org.


[Photo: A FEMA Emergency Support Team member participates in TOPOFF2 from the agency's headquarters in Washington, D.C.]


Photo courtesy of FEMA