


Chapel JM, Wang G. Stroke & Vascular Neurology 2019;0. doi:10.1136/svn-2019-000301

Open access

Understanding cost data collection tools to improve economic evaluations of health interventions

John M Chapel, Guijing Wang

Division for Heart Disease and Stroke Prevention, CDC, Atlanta, Georgia, USA

Correspondence to: Dr Guijing Wang; gbw9@cdc.gov

To cite: Chapel JM, Wang G. Understanding cost data collection tools to improve economic evaluations of health interventions. Stroke & Vascular Neurology 2019;0. doi:10.1136/svn-2019-000301

► Additional material is published online only. To view, please visit the journal online (http://dx.doi.org/10.1136/svn-2019-000301).

Received 9 November 2019
Accepted 12 November 2019

Review

© Author(s) (or their employer(s)) 2019. Re-use permitted under CC BY-NC. No commercial re-use. See rights and permissions. Published by BMJ.

ABSTRACT
Micro-costing data collection tools often used in the literature include standardised comprehensive templates, targeted questionnaires, activity logs, on-site administrative databases, and direct observation. These tools are not mutually exclusive and are often used in combination. Each tool has unique merits and limitations, and some may be more applicable than others under different circumstances. Proper application of micro-costing tools can produce quality cost estimates and enhance the usefulness of economic evaluations to inform resource allocation decisions. A common method to derive both fixed and variable costs of an intervention involves collecting data from the bottom up for each resource consumed (micro-costing). We scanned economic evaluation literature published in 2008–2018 and identified the micro-costing data collection tools used. We categorised the identified tools and discuss their practical applications in an example study of health interventions, including their potential strengths and weaknesses. Sound economic evaluations of health interventions provide valuable information for justifying resource allocation decisions, planning for implementation, and enhancing the sustainability of the interventions. However, the quality of intervention cost estimates is seldom addressed in the literature. Reliable cost data forms the foundation of economic evaluations, and without reliable estimates, evaluation results, such as cost-effectiveness measures, could be misleading. In this project, we identified data collection tools often used to obtain reliable data for estimating costs of interventions that prevent and manage chronic conditions and considered practical applications to promote their use.

BACKGROUND
Healthcare expenditures in the USA have been rising rapidly in recent decades and are higher on a per capita basis than in all other comparable high-income countries.1–3 A primary driver of healthcare expenditures in the USA is the growing burden of chronic health conditions.4 Developing, identifying and scaling interventions that efficiently prevent and manage chronic conditions is an important factor for controlling healthcare costs, and decision makers—faced with limited public resources—are increasingly requesting information on the economic costs and benefits of health interventions to make evidence-based programming and resource allocation decisions.5 6 Sound economic evaluations, such as cost-effectiveness analysis and cost-benefit analysis of health interventions, can provide valuable information that supports allocation decisions.7 8

Many types of economic evaluations are available to provide valuable information for resource allocation decisions, and each approach offers different measures of the economic aspects of a health intervention (figure 1). Regardless of the type of analysis chosen, reliable intervention cost data forms the foundation of all economic evaluations. Without reliable cost estimates, the evaluation results may be misleading.7 9 Moreover, intervention cost information is essential for informing the replication planning and scale-up of interventions found to be effective and efficient. A detailed understanding of the inputs and costs of an intervention is needed to move from more controlled research settings to real-world implementation contexts.10

Despite the importance of health intervention cost estimates, the inclusion and quality of these estimates are frequently lacking and seldom addressed in the literature.9 11–13 Considerable effort and focus are often placed on the rigorous measurement of health outcomes of interventions, whereas methods for measuring intervention costs have been relatively neglected in comparison.

Several guidelines, checklists and standards have been developed for conducting, reporting and reviewing economic evaluations.5 14 However, few have provided adequate focus on methods for intervention cost estimation,15–17 and none has been developed with sufficient detail for thorough direct measurement techniques such as micro-costing,18 the most accurate and precise costing method.7 While defined consensus standards and guidelines have yet to be developed for conducting micro-costing studies, the principles are well established and discussed in the literature.7 8 10 19 20 Even so, discussion of the available data collection tools and methods used to conduct micro-costing studies has been limited and tends to focus on clinical settings and health technology,17 21 with less attention placed on public health interventions such as those for the prevention and management of chronic conditions.10 19

Figure 1 Economic evaluations of health interventions: types and components.

Stroke Vasc Neurol: first published as 10.1136/svn-2019-000301 on 1 December 2019. Downloaded from http://svn.bmj.com/ on April 12, 2020 by guest. Protected by copyright.

This study adds to the field by scanning literature published in the last decade to identify and describe data collection tools often used to obtain reliable data for estimating the costs of health interventions. Another objective is to discuss considerations for the practical application of cost data collection tools and methods to promote their use for better intervention cost estimates.

INTERVENTION COSTS AND COSTING METHODS
Components of an economic evaluation of a health intervention can consist of the costs of the intervention and the consequences that result from the intervention.7 8 Intervention costs refer to the cost of inputs required in the development and implementation process of the intervention (in other words, the costs of resources consumed in the process of developing, implementing, operating and delivering the intervention). The consequences of the intervention are those that result from the consumption of the intervention, such as changes in health utility, or increased consumption or savings of resources that occur as a result of the intervention. For this paper, we are primarily focused on the intervention costs, although the intervention consequences (outcomes) are an equally important component of economic evaluation. Several terms and concepts may help to understand and assess intervention costs (table 1).

Costing methods in economic evaluation generally fall on a spectrum between a bottom-up, micro-costing approach and a top-down, gross costing approach,7 22 each with trade-offs between accuracy, precision and the burden of research.15 The choice of method determines the cost estimates and may change the results of an economic evaluation considerably.23 24 In gross costing, a highly aggregated cost of service is used, such as cost per inpatient stay or lump sum of funding provided for a programme. A more accurate and detailed method is to estimate intervention costs using a micro-costing approach. Micro-costing can provide the most precise method of deriving intervention costs because it involves direct enumeration and costing of each intervention input. Commonly, a hybrid of approaches is found to be appropriate under given feasibility restraints, using micro-costing to estimate the intervention input costs and gross costing to estimate the cost consequences.8 We focus on micro-costing to estimate the costs of intervention inputs, although the methods we describe below could potentially be applied to measuring cost consequences as well.

At its core, micro-costing entails enumerating each input unit used in an intervention and deriving the total cost by assigning a unit cost value to the inputs and aggregating them. Inputs could include labour inputs such as nurse or pharmacist time and capital inputs such as facilities space


or programme educational materials. Units of input can be defined based on the intervention and could include, for example, an hour of nurse time or a percentage of a nurse's total time for labour, square foot for facilities space and individual booklet for programme educational materials. Micro-costing comprises five primary steps (other investigators have itemised methods from 3 to 6 steps,10 19 20 25 but the major principles remain the same): 1) define the intervention production processes and the study perspective; 2) identify the intervention inputs; 3) quantify the units of each input; 4) assign a cost value to each input unit and aggregate; and 5) conduct sensitivity analysis. Under the umbrella of micro-costing methods, different approaches with varying degrees of precision exist. The measurement and valuation of inputs can involve bottom-up or top-down micro-costing,24 and costs can be measured at different levels of analysis (eg, cost per patient served, intervention session or site).26

Table 1 Common terms in cost assessment of health interventions

Fixed costs (FC): costs that do not vary with the quantity of output in the short run (eg, 1 year) and may vary with time, rather than the level of output. Could include:
- Development costs: intervention development planning (eg, meetings, consulting); intervention materials development (eg, creation of educational videos).
- Start-up costs: facilities and infrastructure set-up (eg, equipment and technology installation, furniture); hiring costs; initial training.
- Operational costs: labour (salary, wages, fringe benefits of staff, eg, administrative support, operations management, supervision, programme monitoring); volunteer time; training; equipment; depreciation of durable goods.

Variable costs (VC): costs that vary with the level of output. Could include:
- Operational costs: labour (salary, wages, fringe benefits of staff, eg, intervention delivery staff, service labour); volunteer time; materials and supplies (eg, testing kits, postage, educational materials); travel (mileage, gas).

Total costs (TC): total intervention cost (ie, the sum of all cost categories during the study time horizon): TC = FC + (VC × Q).

Average cost (AC): average cost per unit of output, such as cost per person served (ie, total costs divided by the total units of output): AC = TC / Q.

Marginal cost (MC): cost of producing an additional unit of output (ie, total cost after producing an extra unit of output minus the original total cost): MC = (TC at Q+1) − (TC at Q).

Opportunity cost: cost of forgoing the next best alternative use of a resource; the value of the resource had it been used for another productive activity.

Q, quantity of output, such as persons served.
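The bottom-up aggregation described above, together with the cost formulas in table 1, can be sketched in a few lines of code. All input names, quantities and unit prices below are hypothetical, chosen only to illustrate the arithmetic:

```python
# Hypothetical micro-costing sketch: enumerate inputs (step 2), quantify
# units (step 3), assign unit costs and aggregate (step 4). All figures
# are invented for illustration only.

fixed_inputs = {
    # input name: (quantity of units, unit cost in $)
    "curriculum development (hours)": (80, 50.00),
    "equipment set-up (items)":       (10, 300.00),
}
variable_inputs_per_person = {
    "nurse time (hours)":   (2.0, 45.00),
    "educational booklets": (1,   3.25),
}

def cost(inputs):
    """Aggregate quantity x unit cost across the enumerated inputs."""
    return sum(quantity * unit_cost for quantity, unit_cost in inputs.values())

FC = cost(fixed_inputs)                # fixed costs
VC = cost(variable_inputs_per_person)  # variable cost per person served
Q = 500                                # persons served

TC = FC + VC * Q               # total cost:    TC = FC + (VC x Q)
AC = TC / Q                    # average cost:  AC = TC / Q
MC = (FC + VC * (Q + 1)) - TC  # marginal cost: MC = TC(Q+1) - TC(Q)

print(f"FC={FC}, VC={VC}, TC={TC}, AC={AC}, MC={MC}")
```

A sensitivity analysis (step 5) would then rerun this calculation while varying the uncertain quantities or unit costs.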

Micro-costing is also a foundational principle of activity-based costing (ABC).26 In ABC, resource use is identified and quantified for each of a defined set of mutually exclusive and exhaustive activities to determine how costs are allocated across the activities. For the purposes of this paper, we use an inclusive definition of micro-costing that covers these variations. The data collection tools and discussions that follow are applicable to them all. An evaluator may judge the balance of precision and burden—and thus which variation or combination of variations of micro-costing—may be appropriate for their intervention and the purpose of their study.
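The ABC allocation described above can be sketched as follows. The activity set, hours and hourly rate are hypothetical; the point is only that each logged hour belongs to exactly one activity, so costs can be split across the activity set:

```python
# Hypothetical activity-based costing (ABC) sketch: hours are recorded
# against a mutually exclusive, exhaustive set of activities, then costed
# to show how resources are allocated across activities. All figures
# are invented.

HOURLY_RATE = 40.0  # fully loaded staff cost per hour ($), hypothetical

activity_hours = {
    "intervention delivery":    600,
    "participant recruitment":  150,
    "programme administration": 250,
}

total_hours = sum(activity_hours.values())
activity_cost = {a: h * HOURLY_RATE for a, h in activity_hours.items()}
activity_share = {a: h / total_hours for a, h in activity_hours.items()}

for a in activity_hours:
    print(f"{a}: ${activity_cost[a]:,.0f} ({activity_share[a]:.0%} of labour cost)")
```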

IDENTIFYING COST DATA COLLECTION TOOLS
We scanned literature published from 2008 through 2018 to identify types of data collection tools that are often used to conduct micro-costing studies of public health and preventive interventions. Details of the literature selection process are available online (supplementary appendix figure 1). Briefly, we searched databases including MEDLINE, PsycINFO and EconLit for cost studies of interventions to prevent and manage chronic conditions, including mental and behavioural health risk factors. The searches returned 2082 records, and 306 studies remained after screening out reviews, commentaries and studies that did not address non-infectious chronic conditions, prevention and management (evaluations of clinical operations, treatments or medicines were not included) or intervention cost estimation through micro-costing. Of those, we included 93 economic evaluation studies that focused on examining intervention costs and provided sufficient detail to identify the specific processes by which cost data for at least one input category were collected.

We reviewed the 93 studies and abstracted the original descriptions of the cost data collection tools used in each study, as well as information on how each tool was used (eg, mode, main user) when available. The described cost data collection tools were then listed and combined into emergent categories and subcategories. Information on the study type, study perspective, intervention setting, intervention type and health targets was also abstracted and subsequently combined into emergent categories. Counts of the number of studies that employed each type of tool were then used to summarise the frequency with which they had been used overall (table 2) and by study and intervention characteristics categories (online supplementary appendix).

RESULTS
Data collection methods employed in the recent literature varied widely. We identified five major types of data collection tools (table 2):

1. Standardised comprehensive templates (eg, web-based cost assessment tool, standardised data collection instrument). These templates are comprehensive in that they are used to collect resource unit quantity and unit cost data for most or all aspects of an intervention. They are standardised in the sense that, while they are often developed specifically for or in conjunction with a study, they have been (or can be) generalised to be made publicly available or used for multiple studies. For example, in the field of substance abuse there are multiple publicly available standardised comprehensive templates that have been used in original or adapted form, such as the Drug Abuse Treatment Cost Analysis Program (www.datcap.org). The templates are often completed retrospectively by a lead user (eg, operations manager, researcher) who might use a variety of data sources to fill out the template, such as financial records, programme documents and consultation with other staff. The templates can be administered via interview (ie, completed by the researcher by interviewing staff and reviewing records) or via a survey (eg, email, web-based) for a representative from the intervention being studied to complete.

2. Targeted questionnaires and interviews (eg, survey of participants, manager's survey, staff interviews). Targeted questionnaires are similar to the templates described above but are more limited in scope to target specific cost categories, are study specific or are less formal/standardised. This category can refer to questionnaires conducted via survey or via interviews (structured or unstructured). For example, a questionnaire could be created to survey intervention staff in order to estimate the amount of time they spent on intervention activities. Or a researcher could conduct staff interviews, with a structured or unstructured interview guide, to estimate this information. Questionnaires are often administered retrospectively once or periodically (eg, every 6 months) during a study.

3. Activity logs (eg, staff time sheet, time diary). Activity logs are logs or forms completed prospectively by intervention staff or participants. They are most often used by intervention staff to prospectively record their time spent on intervention-related activities. For example, a staff member could carry a form in which they record the various activities that represent work time for each increment (eg, 15 min periods) of the work day, as illustrated in an example form provided by Findorff et al.27 However, they can include the collection of materials or supply costs as well.25 Logs can be developed and administered as paper-based forms, computer-based forms or through a smart phone app or other hand-held electronic device. Similar logs can also be administered to intervention participants to prospectively record time or resource use; in these cases, they are often described as diaries.

4. Direct observation (time and motion study). Direct observation involves researchers or trained observers prospectively recording intervention resource use through in-person observation of the intervention processes. For example, a researcher could follow a provider of the intervention throughout the day to observe and record quantities of all resources used and the time spent on each intervention-related activity by the provider.

5. On-site databases and routine records (eg, intervention database, cost accounting system, financial records). On-site databases refer to data systems housed on-site to collect resource use information specific to the site (not to be confused with broader databases such as an insurer's administrative database). Databases can be set up specifically for data collection for a study (study-specific databases), or they may already exist for a site's normal operations (in-place databases), such as a cost-accounting system for a hospital cost centre.21 They can be developed or customised to record information specific to a study's interests, such as options to identify the specific activity a certain resource was devoted to. In this category, we include records that are already routinely being collected (other routine records), even if they are not specifically described as being housed in a specific database, such as 'programme records' or 'financial records' that are not described further.
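As a rough illustration of how the increment-based activity logs described in tool 3 might be tallied into per-activity staff time, consider the sketch below. The log entries and the 15 min increment size are hypothetical:

```python
# Hypothetical tally of a staff activity log recorded in 15 min increments
# (as described for activity logs above). Each entry is the activity
# recorded for one increment of the workday; the entries are invented.

from collections import Counter

INCREMENT_HOURS = 0.25  # each log entry covers one 15 min increment

log_entries = [
    "intervention delivery", "intervention delivery", "documentation",
    "intervention delivery", "travel", "travel", "documentation",
    "intervention delivery",
]

increments = Counter(log_entries)
hours_by_activity = {a: n * INCREMENT_HOURS for a, n in increments.items()}

for activity, hours in sorted(hours_by_activity.items()):
    print(f"{activity}: {hours:.2f} h")
```

Multiplying the tallied hours by a wage rate would then yield the labour cost per activity, in the same way as the micro-costing aggregation described earlier.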

The tools defined above are not mutually exclusive. Frequently, multiple methods are used in combination in a study. In our 93 reviewed studies, 51 (55%) described the use of just one category of tool, while 42 (45%) described the use of two or more. Nor is the above list necessarily exhaustive; other, less frequently used methods may exist.

In our sample of surveyed literature, on-site administrative databases and records were the most commonly used tools. However, the high frequency of their use was driven by studies that did not describe the process by which records had been collected, instead simply referring to 'programme records' or 'financial records', which were commonly used as a data source in conjunction with other data collection tools. Specifically defined electronic databases that were created to collect data for the study or which were already in place were only described in 9% and 8% of studies, respectively. Activity logs, targeted questionnaires and interviews and standardised comprehensive templates were the second, third and fourth most


Table 2 Cost data collection tools used in 93 studies surveyed in literature published in 2008–2018

1) Standardised comprehensive templates. Description: collect cost data for most or all aspects of an intervention; can be generalised to be made publicly available or used for multiple studies. Typical collection design: retrospective. Common modes of use: computer-based (eg, Excel), paper-based. Frequency of use*: 31%.
1a) Via interview. Description: used by a researcher to collect information via interview. Typical collection design: retrospective. Common modes of use: telephone, in-person. Frequency of use*: 9%.
1b) Via survey. Description: used by intervention staff to complete in survey form. Typical collection design: retrospective. Common modes of use: email, website-based. Frequency of use*: 24%.
2) Targeted questionnaires. Description: similar to the templates described above but more limited in scope, study specific or less formal/standardised. Typical collection design: retrospective. Common modes of use: computer-based, paper-based. Frequency of use*: 34%.
2a) Via survey. Description: used as a survey questionnaire sent to intervention staff. Typical collection design: retrospective. Common modes of use: email. Frequency of use*: 13%.
2b) Via interview. Description: used by researcher to collect information via interview (structured or unstructured). Typical collection design: retrospective. Common modes of use: telephone, in-person. Frequency of use*: 20%.
3) Activity logs. Description: intervention staff complete logs prospectively to track time or materials used for intervention activities; can also be used by participants (cost diaries). Typical collection design: prospective. Common modes of use: paper-based, computer-based (eg, Excel), smart-phone. Frequency of use*: 38%.
4) Direct observation. Description: researchers or trained staff observe intervention processes and record time or materials used during intervention activities. Typical collection design: prospective. Common modes of use: in-person with paper-based or computer-based form. Frequency of use*: 10%.
5) On-site databases or records. Description: data systems housed on-site to collect resource use information specific to the site. Typical collection design: prospective/retrospective. Common modes of use: electronic database, paper/hard copy records. Frequency of use*: 41%.
5a) Study-specific database. Description: database created specifically for the purpose of collecting data for a study; used by staff to prospectively track resources. Typical collection design: prospective. Common modes of use: computer-based (eg, Microsoft Access). Frequency of use*: 9%.
5b) In-place database. Description: existing electronic database in place at a site to routinely collect information. Typical collection design: retrospective†. Common modes of use: electronic records (eg, accounting system, electronic health record). Frequency of use*: 8%.
5c) Other routine records. Description: other existing records (electronic or hard copy) that are routinely collected related to an intervention. Typical collection design: retrospective†. Common modes of use: computer or paper records (eg, receipts, attendance records). Frequency of use*: 32%.

*The percentages represent the proportion of the 93 studies that used a particular cost data collection tool. Studies' use of tools was not mutually exclusive; 51 (55%) studies described using only one category of data collection tools, and 42 (45%) described using two or more categories of tools. Therefore, the percentages do not sum to 100%.
†Depending on the database or records, the original primary data could be recorded prospectively. But the data collection for a study usually consists of retrospective examination of such data.


Table 3  Cost data collection tools employed in a sample study

Study: Mirambeau et al28

Intervention description: Community health worker (CHW) programme implemented at the Northeastern Vermont Regional Hospital service area in rural Vermont. CHWs are integrated into team-based care to help patients access social and economic services to improve their conditions.

Study purpose: Assess programme costs from a public health perspective to determine the cost to set up and operate the programme. Costs were estimated for October 2010–September 2011.

Intervention inputs: Labour: CHWs; supervisory staff; volunteers; administrative support. Capital: furniture; equipment (computers, telephones); materials (eg, promotional materials; office supplies); training; transportation; office space.

Data collection tools used, description of use, and data collected:
- Standardised comprehensive template | The hospital administrator used the template to compile the data by examining financial records from the hospital's administrative database and speaking to relevant staff. The completed template was reviewed by the authors and any discrepancies were resolved. | Data collected: collated all cost data for the study, including unit quantities and unit costs.
- Activity logs | For a 2-week period, each CHW used the form to record the category representing their activity for each 30 min increment of the workday. | Data collected: personnel time allocated to intervention activities.
- On-site databases or records | Researchers examined financial records from the hospital's administrative database to determine the resources consumed in the intervention and gather unit cost information. | Data collected: unit quantities for all non-labour inputs; unit costs for labour, materials, supplies and equipment.

common tools, respectively, and were used at similar rates (31%–38%). Direct observation was the least common category (10%). Standardised comprehensive templates were more commonly administered via survey than via interview, whereas targeted questionnaires were more often in interview form than survey form. Targeted questionnaires via interview frequently were not well described and often referred to as ‘staff interviews’ with little to no detail on the development or content of the questions being asked.

Case example: cost analysis of a community health worker programme in rural Vermont

To provide more detailed examples of the identified cost data collection tools in practical use, we summarise a case example of a well-conducted costing study in the field of prevention and public health that reasonably balances rigorous data collection against feasible research burden (table 3).

Mirambeau et al28 conducted a cost analysis of a community health worker (CHW) programme for the Northeastern Vermont Regional Hospital service area in rural Vermont and estimated the fixed and variable costs of implementing the programme for 1 year. The researchers conducted a 2-day site visit, administered in-person and telephone interviews with staff and reviewed programme documents and literature to inform the development of a standardised cost collection template, which they used to collate cost data for 1 year in 2010–2011 from the public health perspective. The hospital administrator used the template to compile the data by examining records from the hospital's administrative database and speaking to relevant staff. To allocate labour costs to the programme, the researchers created an activity log to track CHWs' time spent on the programme. For a 2-week period, each CHW used the form to prospectively record the activity that reflected their time spent for each 30 min increment of their workday.

A strength of this study was the authors' use of an activity log to collect data used to allocate labour cost to the programme. By administering the activity logs prospectively for a 2-week period with multiple staff members, the authors were able to generate estimates of the labour costs associated with the programme that were likely more accurate than if the CHWs or a supervisor were to estimate their time retrospectively. Although administering the activity log for a 2-week period is fairly strong, an alternative method of using the tool for a sample of time periods across the length of the 1-year study period could improve confidence that the time estimates are representative of the average work-time. This alternative approach could have helped ensure the time estimates reflected any potential variations in labour use over time resulting from changes in levels of 'production' (eg, patient load), which might not be captured in a single 2-week period.
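The time-share arithmetic behind this kind of activity-log allocation can be sketched in a few lines. This is an illustrative sketch only, not code from the study: the activity categories, log length and salary figure are all assumptions for the example.

```python
# Hypothetical allocation of annual labour cost from 30 min activity-log
# increments; categories and dollar amounts are illustrative assumptions.

INTERVENTION_CATEGORIES = {"client visit", "care coordination", "documentation"}

def intervention_time_share(log_entries):
    """Fraction of logged 30 min increments spent on intervention work."""
    if not log_entries:
        raise ValueError("empty activity log")
    on_intervention = sum(1 for category in log_entries
                          if category in INTERVENTION_CATEGORIES)
    return on_intervention / len(log_entries)

def allocated_labour_cost(log_entries, annual_salary_and_benefits):
    """Scale a worker's annual labour cost by their logged time share."""
    return intervention_time_share(log_entries) * annual_salary_and_benefits

# One CHW's 2-week log: 80 increments (40 h/week x 2 weeks),
# 60 of them spent on intervention activities.
log = ["client visit"] * 60 + ["unrelated duties"] * 20
cost = allocated_labour_cost(log, annual_salary_and_benefits=45_000)
# 60/80 = 0.75 of work time -> 0.75 x $45,000 allocated to the programme
```

Keeping the time share and the salary as separate inputs mirrors the unit-quantity and unit-cost separation discussed later, so either can be varied in a sensitivity analysis.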

A second strength of this study was the information collected to inform the development of a standardised template. The detailed study of the intervention production process prior to data collection helped ensure the data collected represented the entirety of costs involved to operate the intervention. The authors were able to distinguish between non-labour start-up costs and ongoing operational costs in their presentation of data, as well as present a sensitivity analysis with different cost scenarios, which can assist future planning (cost analysis results table available in online supplementary appendix). The study could have been improved by identifying which ongoing operational and personnel costs were fixed and which were variable. Additionally, the authors could have incorporated activity-based costing in their data collection tools to describe how the costs were allocated in the programme and identified the primary cost drivers.

DISCUSSION

Numerous data collection tools are available to collect and estimate fixed and variable costs of health interventions. Researchers should take into account a number of considerations when deciding on the process they should use to collect cost data for their evaluation, including the size and scale of the intervention, setting, time horizon and purpose of the study. One important concern is the balance between the required precision of the data collected for a study and the acceptable level of research burden.

The investigator should aim to collect all cost data prospectively during a study. However, prospective cost collection approaches can have feasibility constraints. In an ideal scenario, existing data systems (in-place database) could be used to track resource use prospectively with minimal additional burden to staff. But many preventive interventions are set in community-based settings and might not have the data infrastructure found in healthcare settings, which could limit the feasibility of using existing data systems to track resource use. Moreover, many prevention interventions are delivered in multiple sites with varying contexts that can influence costs,29 which further necessitates a standard primary data collection strategy. Direct observation could produce very accurate and precise data as trained observers watch the intervention processes and consistently record the resources used. However, the research cost could be fairly high: the observer's time would need to be dedicated to research activities throughout the observation period, and they would most likely require training or re-training to ensure observation recording is consistent. Additionally, staff or patients might find the presence of observers intrusive in some cases.26 We suspect the potentially high research cost of direct observation is a primary reason it was by far the least commonly used data collection method in our sample of cost studies.

Activity logs can be a well-balanced option to collect detailed, prospective data with a reasonable level of research burden, but potential limitations remain. While not as high cost as direct observation, activity logs still come with additional research burden: staff must complete the logs during their workday, when they may already be working at capacity, and could require training for the logs' use. And because the logs rely on multiple staff to input reliable data, results depend on individuals' agreement to participate in the data collection and their level of buy-in and effort, which can lead to missing data.29

The research burden for prospective methods can be mitigated by using sample time periods for data collection rather than prospectively collecting data throughout the study period, as mentioned above. Data collection could occur during a one-time sample period (eg, every day for two consecutive weeks) or during a sample of multiple time points across the study period (eg, 1 week during the first half of the study period and 1 week in the second half, or a random sample of days across the study period). Careful consideration should be given to choosing a sample period that will be representative of the intervention as a whole. For instance, if there is a learning curve in the uptake of new protocols, choosing a period at the beginning of the intervention might overestimate time costs.26 27 Similarly, an intervention could have varying levels of production across time that should be considered when selecting a sampling period.
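A random sample of collection days across a study year, one of the sampling designs mentioned above, can be drawn reproducibly in a few lines. This is a hypothetical sketch; the date range, sample size and fixed seed are assumptions for illustration.

```python
# Illustrative sketch: draw a reproducible random sample of data collection
# days across a 1-year study period instead of logging every day.
import random
from datetime import date, timedelta

def sample_collection_days(start, end, n_days, seed=0):
    """Randomly sample n_days distinct days between start and end (inclusive)."""
    span = (end - start).days + 1
    if n_days > span:
        raise ValueError("sample larger than study period")
    rng = random.Random(seed)  # fixed seed so the sample can be re-derived
    offsets = rng.sample(range(span), n_days)
    return sorted(start + timedelta(days=o) for o in offsets)

# Hypothetical study year matching the case example's October-September window.
days = sample_collection_days(date(2010, 10, 1), date(2011, 9, 30), n_days=10)
# -> 10 distinct dates drawn from across the study year
```

Spreading the sampled days over the whole period, rather than one block, is what guards against the learning-curve and production-variation biases noted above.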

Prospective data collection methods can ideally be used to track all resources used in an intervention. But in contexts where this might not be feasible for all cost categories and prioritisations need to be made (eg, limited research budget, time, site buy-in), some costs might be more important to track in detail than others. For example, interventions to prevent and manage chronic conditions tend to be heavily labour intensive, and thus labour cost is typically a key category for understanding programme costs.29 30 Community-based interventions employ labour from the setting in which they are delivered (eg, school, community centre), and staff members may have multiple job duties, related and unrelated to the intervention, meaning a detailed understanding of how their time is spent is essential to allocate an appropriate amount of their time to the intervention and avoid overestimating its costs. Additionally, variable costs such as intervention materials may be less predictable and harder to track than fixed costs such as site rent and utilities, which are often readily available from an examination of existing records. Therefore, we recommend prioritising labour and other variable costs for detailed prospective data collection.

When prospective data collection is not feasible, or deemed not necessary for certain cost categories, retrospective approaches (standardised comprehensive templates, targeted questionnaires, retrospective examination of records) can provide lower burden options for data collection. Standardised comprehensive templates and targeted questionnaires can be used in a survey format to collect data mainly at one time rather than prospectively throughout the intervention and thus reduce burden for the intervention staff. However, using them in a one-time, retrospective survey format can create issues with accuracy resulting from response biases (eg, recall bias). In addition, a survey format with these tools means the accuracy of the data collected will depend on the respondents' level of effort to provide accurate data.26 31

The mode of a questionnaire can have important influences on data accuracy. For example, questionnaires conducted via in-person interviews might encourage more complete reporting, impose a lower cognitive burden on the respondent or mitigate recall bias, but they can also require additional coordination and could introduce bias (eg, interviewer bias). Conversely, questionnaires via self-administered surveys can be easier to conduct, especially if a large sample is required; but they tend to have a lower response rate and completeness of information and a higher cognitive burden.32 Similarly, for retrospective surveys, the recall period can affect the accuracy of the data.33–35

Collecting detailed micro-costing data provides a number of benefits. Most notably, as discussed above, direct measurement of intervention costs through micro-costing is likely to produce the most accurate and precise measure of an intervention's costs.7 The detailed accounting of resource unit quantities also facilitates sensitivity analyses and the translation of intervention costs to other contexts. Because data are collected at a micro level to document the quantities of each resource used, sensitivity analyses can easily be employed to examine how intervention costs could change depending on contextual differences in resource use and resource unit costs.10 For example, costs for inputs such as rent for facilities or certain costs of labour can easily be adjusted to account for potential differences in rents or wages across regions.28 Researchers can also examine cost differences that may result from substituting inputs, such as capital for certain types of labour or using lower cost labour for certain activities, to help optimise efficiency.10 35 36 In general, when intervention costs are presented, resource unit quantities and unit costs should be presented separately to enhance the usefulness of results to inform future planning for replication and scale up by allowing other investigators to conduct similar types of sensitivity analysis.10 21 37 An emphasis on this type of transparency and assessment of contextual factors can help a cost analysis align with and contribute to broader evaluation and planning priorities focused on facilitating the successful translation and dissemination of effective interventions, such as those described in the Reach, Effectiveness, Adoption, Implementation, Maintenance framework.36 38 39
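The unit-quantity and unit-cost bookkeeping that makes this kind of sensitivity analysis straightforward can be sketched minimally. All resource names, quantities and prices below are illustrative assumptions, not figures from any study cited here.

```python
# Minimal sketch of separating resource unit quantities from unit costs,
# so costs can be re-estimated under different regional prices.
# Every quantity and price is a made-up illustrative value.

def total_cost(quantities, unit_costs):
    """Sum quantity x unit cost over every resource category."""
    return sum(q * unit_costs[item] for item, q in quantities.items())

quantities = {"chw_hours": 1500, "office_rent_months": 12, "supplies_units": 300}
baseline = {"chw_hours": 20.0, "office_rent_months": 800.0, "supplies_units": 5.0}

# Re-price the same quantities under a hypothetical higher-wage, higher-rent region.
higher_cost_region = dict(baseline, chw_hours=26.0, office_rent_months=1100.0)

print(total_cost(quantities, baseline))            # 41100.0
print(total_cost(quantities, higher_cost_region))  # 53700.0
```

Because the quantities dictionary is untouched, swapping in a different unit-cost dictionary is the whole sensitivity analysis, which is exactly why reporting quantities and unit costs separately helps other investigators adapt the estimates.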

Additionally, studies should describe their data collection tools and methods in sufficient detail for readers to assess the data collection process and judge the accuracy, or potential inaccuracy, of the resulting estimates. A standardised taxonomy for micro-costing data collection tools and methods used in public health and prevention science could improve the transparency of, and confidence in, intervention cost estimates. For example, Ridyard et al have similarly proposed a taxonomy for resource use measures employed in clinical trials in a UK and European context.40 The studies we reviewed rarely provided descriptions of their tools and methods at a sufficient level of detail, indicating work is still needed in this area. Moreover, while scanning the literature to identify data collection tools used, we found many studies described using a micro-costing method but provided no detail on the process with which their data were collected.

We scanned the recent literature to identify the types of tools commonly used in economic evaluations of chronic disease prevention and management interventions and provided a snapshot of how often they are used. However, this literature scan was not meant to provide a full systematic review of this costing literature, and caution should be taken when interpreting the frequencies of tool use reported here. A rigorous systematic and critical review of micro-costing studies could provide useful information to assess topics such as the quality of current methods,21 and future research should address this need.41 Similarly, the development of a standardised checklist for the conduct, reporting and appraisal of micro-costing studies, such as the one previously proposed by Ruger and Reiff,18 could be of great benefit to promote the standardisation of such methods and improve the comparability of estimates from different studies. There have also been calls for future research to directly compare the use of various tools and examine their comparative accuracy, reliability or validity.21 Although this type of research would be of great benefit to the field, none has been conducted and published to our knowledge.

CONCLUSIONS

Researchers who want to estimate the intervention costs of public health and preventive interventions focused on chronic conditions can apply the tools we have identified. The considerations we have discussed require careful forethought, and proper application can produce quality cost estimates, which in turn will enhance the usefulness of economic evaluations to inform resource allocation decisions, planning and sustainability for effective preventive health interventions. Future research can address the standardisation, validity and reporting of such tools, which can further improve the confidence in and utility of intervention cost estimates in practice.

Contributors JMC completed the draft and revised the manuscript. GW provided guidance and modifications.

Funding The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.

Competing interests None declared.

Patient consent for publication Not required.

Provenance and peer review Not commissioned; internally peer reviewed.

Open access This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made indicated, and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/.

ORCID iDs
John M Chapel http://orcid.org/0000-0003-1543-9449
Guijing Wang http://orcid.org/0000-0002-5511-3788

References
1 Catlin A, Cowan CA. History of health spending in the United States, 1960-2013. Rockville, MD: Centers for Medicare & Medicaid Services, 2015.
2 Anderson GF, Reinhardt UE, Hussey PS, et al. It's the prices, stupid: why the United States is so different from other countries. Health Aff 2003;22:89–105.
3 Squires D, Anderson C. U.S. health care from a global perspective: spending, use of services, prices, and health in 13 countries. The Commonwealth Fund, 2015.
4 Buttorff C, Ruder T, Bauman M. Multiple chronic conditions in the United States. Santa Monica, CA: RAND Corporation, 2017.
5 Crowley DM, Dodge KA, Barnett WS, et al. Standards of evidence for conducting and reporting economic evaluations in prevention science. Prev Sci 2018;19:366–90.
6 National Academies of Sciences, Engineering, and Medicine; Division of Behavioral and Social Sciences and Education; Board on Children, Youth, and Families. Advancing the power of economic evidence to inform investments in children, youth, and families. Washington, DC: National Academies Press, 2016.
7 Drummond MF, Sculpher MJ, Claxton K, et al. Methods for the economic evaluation of health care programmes. 4th edn. New York, NY: Oxford University Press, 2015.
8 Neumann P, Sanders G, Russell L, et al. Cost-effectiveness in health and medicine. 2nd edn. New York, NY: Oxford University Press, 2017.
9 Graves N, Walker D, Raine R, et al. Cost data for individual patients included in clinical studies: no amount of statistical analysis can compensate for inadequate costing methods. Health Econ 2002;11:735–9.
10 Foster EM, Porter MM, Ayers TS, et al. Estimating the costs of preventive interventions. Eval Rev 2007;31:261–86.
11 Jefferson T, Demicheli V, Vale L. Quality of systematic reviews of economic evaluations in health care. JAMA 2002;287:2809–12.
12 Kruper L, Kurichi JE, Sonnad SS. Methodologic quality of cost-effectiveness analyses of surgical procedures. Ann Surg 2007;245:147–51.
13 Ungar WJ, Santos MT. Quality appraisal of pediatric health economic evaluations. Int J Technol Assess Health Care 2005;21:203–10.
14 Langer A. A framework for assessing Health Economic Evaluation (HEE) quality appraisal instruments. BMC Health Serv Res 2012;12:253.
15 Barnett PG. An improved set of standards for finding cost for cost-effectiveness analysis. Med Care 2009;47:S82–8.
16 Jacobs P, Ohinmaa A, Brady B. Providing systematic guidance in pharmacoeconomic guidelines for analysing costs. Pharmacoeconomics 2005;23:143–53.
17 Ridyard CH, Hughes DA. Methods for the collection of resource use data within clinical trials: a systematic review of studies funded by the UK health technology assessment program. Value Health 2010;13:867–72.
18 Ruger JP, Reiff M. A checklist for the conduct, reporting, and appraisal of microcosting studies in health care: protocol development. JMIR Res Protoc 2016;5:e195.
19 Charles JM, Edwards RT, Bywater T, et al. Micro-costing in public health economics: steps towards a standardized framework, using the incredible years toddler parenting program as a worked example. Prev Sci 2013;14:377–89.
20 Xu X, Yonkers KA, Ruger JP. Costs of a motivational enhancement therapy coupled with cognitive behavioral therapy versus brief advice for pregnant substance users. PLoS One 2014;9:e95264.
21 Frick KD. Microcosting quantity data collection methods. Med Care 2009;47:S76–81.
22 Mogyorosy Z, Smith P. The main methodological issues in costing health care services: a literature review. York, UK: University of York Centre for Health Economics, 2005.
23 Clement Nee Shrive FM, Ghali WA, Donaldson C, et al. The impact of using different costing methods on the results of an economic evaluation of cardiac care: microcosting vs gross-costing approaches. Health Econ 2009;18:377–88.
24 Tan SS, Rutten FFH, van Ineveld BM, et al. Comparing methodologies for the cost estimation of hospital services. Eur J Health Econ 2009;10:39–45.
25 Ruger JP, Emmons KM, Kearney MH, et al. Measuring the costs of outreach motivational interviewing for smoking cessation and relapse prevention among low-income pregnant women. BMC Pregnancy Childbirth 2009;9:46.
26 Smith MW, Barnett PG. Direct measurement of health care costs. Med Care Res Rev 2003;60:74S–91S.
27 Findorff MJ, Wyman JF, Croghan CF, et al. Use of time studies for determining intervention costs. Nurs Res 2005;54:280–4.
28 Mirambeau AM, Wang G, Ruggles L, et al. A cost analysis of a community health worker program in rural Vermont. J Community Health 2013;38:1050–7.
29 Corso PS, Ingels JB, Kogan SM, et al. Economic analysis of a multi-site prevention program: assessment of program costs and characterizing site-level variability. Prev Sci 2013;14:447–56.
30 Honeycutt AA, Khavjou OA, Bradley C, et al. Intervention costs from communities putting prevention to work. Prev Chronic Dis 2016;13:E98.
31 Smith MW, Barnett PG, Phibbs CS, et al. Microcost methods for determining VA healthcare costs. Menlo Park, CA: Health Economics Resource Center, 2010.
32 Bowling A. Mode of questionnaire administration can have serious effects on data quality. J Public Health 2005;27:281–91.
33 Short ME, Goetzel RZ, Pei X, et al. How accurate are self-reports? Analysis of self-reported health care utilization and absence when compared with administrative data. J Occup Environ Med 2009;51:786–96.
34 Dalziel K, Li J, Scott A, et al. Accuracy of patient recall for self-reported doctor visits: is shorter recall better? Health Econ 2018;27:1684–98.
35 Ritzwoller DP, Sukhanova A, Gaglio B, et al. Costing behavioral interventions: a practical guide to enhance translation. Ann Behav Med 2009;37:218–27.
36 Ritzwoller DP, Sukhanova AS, Glasgow RE, et al. Intervention costs and cost-effectiveness for a multiple-risk-factor diabetes self-management trial for Latinas: economic analysis of ¡Viva Bien! Transl Behav Med 2011;1:427–35.
37 Ingels JB, Walcott RL, Wilson MG, et al. A prospective programmatic cost analysis of fuel your life: a worksite translation of DPP. J Occup Environ Med 2016;58:1106–12.
38 Glasgow RE, Estabrooks PE. Pragmatic applications of RE-AIM for health care initiatives in community and clinical settings. Prev Chronic Dis 2018;15:E02.
39 Estabrooks PA, Wilson KE, McGuire TJ, et al. A quasi-experiment to assess the impact of a scalable, community-based weight loss program: combining reach, effectiveness, and cost. J Gen Intern Med 2017;32:24–31.
40 Ridyard CH, Hughes DA, DIRUM Team. Taxonomy for methods of resource use measurement. Health Econ 2015;24:372–8.
41 Xu X, Grossetta Nardini HK, Ruger JP. Micro-costing studies in the health and medical literature: protocol for a systematic review. Syst Rev 2014;3:47.
