
Local health department electronic reportable disease surveillance practice and costs, North Carolina, 2009
OR: Proving it out

E. Samoff MPH PhD, A. T. Fleischauer MSPH PhD, L. DiBiase MS, M. Davis MPH, A. Waller ScD, P. D. M. MacDonald MPH PhD

This research was carried out by the North Carolina Preparedness and Emergency Response Research Center (NCPERRC), which is part of the UNC Center for Public Health Preparedness at the University of North Carolina at Chapel Hill’s Gillings School of Global Public Health, and was supported by the Centers for Disease Control and Prevention (CDC) Grant 1PO1 TP 000296. The contents are solely the responsibility of the authors and do not necessarily represent the official views of CDC. Additional information can be found at http://cphp.sph.unc.edu/ncperrc/

North Carolina Preparedness & Emergency Response Research Center (NCPERRC)

Background

• All states now use an electronic disease surveillance system

• What we know
  – Increases the speed of initial notification to public health and the number of cases reported
  – Facilitates data capture and review

• What we don’t know
  – Does electronic disease surveillance improve public health surveillance practice?
  – Does it support public health interventions?
  – Is it more efficient or cost-effective?
  – Does it improve population health?

Background

• Project: To evaluate North Carolina’s electronic disease surveillance system

• Project objectives
  – Describe workforce resources used for the electronic disease surveillance system
  – Describe impact on case reporting and surveillance practice
  – Identify best practices for electronic disease surveillance

Background

• North Carolina Electronic Disease Surveillance System (NC EDSS)
  – Highly customized off-the-shelf Maven system
  – Implemented in 2008
  – All reportable diseases except syphilis and HIV

• Case data entered by
  – LHD staff
  – Laboratories via ELR (≈33% of cases)
  – State staff

• System offers additional surveillance capacities

Methods

• Random sample: 30/100 counties

• Interviews
  – NC Electronic Disease Surveillance System (NC EDSS) lead
    • Staff numbers and hours
    • Use of the NC EDSS system
    • Use of surveillance data
  – CD nurse
    • Case management
    • Use of the NC EDSS system

• Cost
  – $29/hour ($60,552/year) salary (see the cost sketch below)
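A minimal sketch of how a county's salary cost per reported case could be derived from the flat $29/hour ($60,552/year) salary assumption above. The function and variable names are illustrative, not the study's actual calculations, and the example inputs are hypothetical.

```python
# Sketch only: salary cost per reported case under the $60,552/year FTE
# salary assumption from the methods. Names and example inputs are
# illustrative; this is not the study's actual code.

ANNUAL_FTE_SALARY = 60_552.0   # assumed annual salary for one FTE, in dollars

def salary_cost_per_case(ftes_on_nc_edss: float, cases_reported_per_year: float) -> float:
    """Salary dollars spent on NC EDSS work per case reported in a county."""
    annual_staff_cost = ftes_on_nc_edss * ANNUAL_FTE_SALARY
    return annual_staff_cost / cases_reported_per_year

# Hypothetical county: 1.2 FTEs on NC EDSS, 400 cases reported in a year
print(round(salary_cost_per_case(1.2, 400), 2))  # 181.66
```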

Methods

• NC EDSS system data
  – All VPD, STD, and other CD cases
  – Number of cases
  – Timeliness: % of cases reported to the state within 30 days
  – Accuracy: % of cases returned by the state to the LHD
  – Currently ignored cases: % of cases never handled that are >45 days old

Methods

• Composite score for indicators of good reporting practice, with 1 point assigned for each of the following (see the scoring sketch after this list):
  – Timeliness (>79% of completed cases submitted to the state in <30 days)
  – Accuracy (<17% of cases returned to the LHD for corrections)
  – Ignored cases (<1% of total cases ignored after 45 days)

• High/low comparison: High (2 or 3 points) vs. Low (0 or 1 point)

• County size: Small <55,654; Medium 55,655-107,427; Large >107,427
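A minimal sketch of the composite score and the county groupings described above, assuming the three indicator percentages have already been computed from NC EDSS data. Thresholds are taken from the slide; the function names and example values are illustrative.

```python
# Sketch of the 0-3 point composite reporting-practice score and the
# county groupings described above. Thresholds come from the slide;
# everything else (names, example inputs) is illustrative.

def reporting_score(pct_timely: float, pct_returned: float, pct_ignored: float) -> int:
    """One point per indicator that meets its threshold."""
    score = 0
    if pct_timely > 79:    # >79% of completed cases submitted to state in <30 days
        score += 1
    if pct_returned < 17:  # <17% of cases returned to LHD for corrections
        score += 1
    if pct_ignored < 1:    # <1% of total cases ignored after 45 days
        score += 1
    return score

def performance_group(score: int) -> str:
    """High = 2 or 3 points; Low = 0 or 1 point."""
    return "High" if score >= 2 else "Low"

def county_size(population: int) -> str:
    """Cut points from the slide: <55,654 Small; 55,655-107,427 Medium; >107,427 Large."""
    if population < 55_654:
        return "Small"
    if population <= 107_427:
        return "Medium"
    return "Large"

# Hypothetical county: 85% timely, 10% returned, 0.5% ignored, population 60,000
s = reporting_score(85, 10, 0.5)
print(s, performance_group(s), county_size(60_000))  # 3 High Medium
```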

Results: Respondent Profile

• May-August 2010
• 28 counties
• Broad geographical distribution
• Broad population distribution
  – Populations of 8,888-923,944
  – 10 small, 8 medium, 10 large counties

Results: Cases

• Total number of cases reported: 10,809

[Figure: Number of cases by county size. Y-axis: number of cases (0-2,500); x-axis: counties ordered from smaller to larger.]

Results: Staff using electronic disease surveillance

• Total staff using NC EDSS
  – 136 employees, 34.5 FTEs
  – Average of 4.8 employees and 1.2 FTEs per county

• Type of staff
  – CD nurses/supervisors
  – Administrative staff
  – DIS
  – Laboratory personnel

Results: Staff time

• 69% of employees using NC EDSS spent <12 hours per week of their work time on the system

[Pie chart: Proportion of work time spent on NC EDSS, in categories <12 hours per week (69%), 12-23 hours per week, and ≥24 hours per week.]

Results: Staff expenditure (FTEs)

[Figure: Number of FTEs per 100,000 population by county size. Y-axis: FTEs per 100,000 population (0-3.5); x-axis: counties ordered from smaller to larger.]

Results: Cases reported per FTE

• Average of 68 cases reported per FTE per month

[Figure: Number of cases reported per FTE per month by county size. Y-axis: cases/FTE/month (0-180); x-axis: counties ordered from smaller to larger.]

Results: Salary cost per case reported

[Figure: Cost per case reported, by county size. Y-axis: salary cost per case in dollars (0-1,000); x-axis: counties ordered from smaller to larger.]

Results: Salary cost per case reported

[Figure: Cost per case reported, by county size. Y-axis: salary cost per case in dollars (0-350); x-axis: counties ordered from smaller to larger.]

Results: Impact on case reporting and surveillance practice

• NC EDSS leads:
  – 68% (19/28) reported changes in case management
  – 89% (17/19) of those reporting changes described them as improvements

• CD nurses:
  – 57% (12/21) reported changes in case management
  – 75% (9/12) of those reporting changes described them as improvements

• Reasons cited:
  – Increased timeliness
  – Easier to know what to do/ask
  – Easier to access case-patient data
  – More thorough documentation

Results: Impact on case reporting and surveillance practice

• Counties using >5 NC EDSS capacities are more likely to:
  – Report using surveillance data for decisions about public health program management
  – Report providing surveillance data to policy-makers
  – Report including surveillance data in annual reports
  – Report using data from the extended surveillance form for disease intervention

Results: Reporting performance rank

• Rank based on:
  – Timeliness (>82% of cases submitted to NC DPH within 30 days)
  – Accuracy (<17% of cases returned to LHD)
  – Incomplete cases (<1% of cases incomplete for longer than 45 days)

[Figure: Reporting performance rank (0-3) by county size group (Small, Medium, Large).]

Mean cost per case by reporting practice rank

[Figure: Mean cost per case, in dollars (0-250), by reporting practice rank (0, 1, 2, and 3 points).]

Mean cost per case by rank and county size

[Figure: Mean cost per case, in dollars (0-350), by county size group (Small, Medium, Large), for low-rank (0/1 points) vs. high-rank (2/3 points) counties.]

Good surveillance costs less. How do we get there?

Practices associated with high reporting performance

• Can look at incoming cases daily (P=.11)
• 6 staff or fewer using NC EDSS (P=.02)
• Use surveillance data for program evaluation (P=.13)
• >5% of cases classified as "Not a case" (P=.22)
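The slides do not state which statistical test produced these p-values. As a rough illustration only, a yes/no practice could be compared between high-rank and low-rank counties with Fisher's exact test on a 2x2 table; the counts in this sketch are invented for the example.

```python
# Illustration only: Fisher's exact test on a hypothetical 2x2 table comparing
# a yes/no practice between high-rank and low-rank counties. The counts are
# invented; the study's actual test is not stated on the slides.
from scipy.stats import fisher_exact

#        practice present, practice absent
table = [[10, 4],   # high-rank counties (hypothetical counts)
         [ 5, 9]]   # low-rank counties (hypothetical counts)

odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")
```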

Limitations

• FTE data are reported by interviewees
  – Not verified by the electronic system
  – Based on current user lists

• Does not represent multi-county LHDs as well
• Interviewer bias

Conclusions

• Resources used per case reported differ across the state
  – Good surveillance costs less

• Perceived improvement in case management and disease surveillance

• Electronic surveillance system is supporting key surveillance activities

• Daily use of electronic surveillance system by a focused user group supports good reporting practice

Acknowledgements

UNC Gillings School of Global Public Health

Pia MacDonald MPH PhD
Carol Gunther-Mohr MA
Meredith Davis MPH
Lauren Dibiase MPH
Heidi Soeters MPH
Erika Samoff MPH PhD

UNC School of Information and Library Science

Stephanie W. Haas PhD

Carolina Center for Health Informatics / UNC Dept of Emergency Medicine

Anna Waller ScD
Amy Ising MSIS

CDC/NC Division of Public Health
Aaron Fleischauer PhD

Contact

Erika Samoff
[email protected]

NC PERRC
cphp.sph.unc.edu/ncperrc