
The Gemini Project to Redesign the

Consumer Expenditure Survey:

Redesign Proposal

Jennifer Edgar, Dawn V. Nelson, Laura Paszkiewicz, and Adam Safir

Gemini Design Team

Bureau of Labor Statistics

June 26, 2013

Executive Summary

In 2009, the Bureau of Labor Statistics (BLS) Consumer Expenditure Survey (CE) initiated the

multi-year Gemini Project for the purpose of researching, developing, and implementing an

improved survey design. The objective of the redesign is to improve the quality of the survey

estimates through a verifiable reduction in measurement error.

This report, prepared by the Gemini Design Team, presents a comprehensive proposal for the

redesigned survey, based on three years of information gathering, inquiry, and synthesis. The

design team relied on a broad set of input documents to develop a redesign proposal which meets

the Office of Prices and Living Conditions’ (OPLC) survey requirements and addresses three key

factors believed to affect the survey’s ability to collect high quality data, specifically,

measurement error, environmental changes, and flexibility. The proposed design is budget

neutral for ongoing operations.

The proposed design includes two waves of data collection set 12 months apart. Each wave

contains the same interview structure consisting of two visits and a 1-week diary. Visit 1 is an in-

person interview made up of two parts. The first part identifies the roster of the household, while

the second is a recall interview that collects large, easily-recalled household expenditures.

Additionally, Visit 1 incorporates instructions to collect relevant expenditure records for the

Visit 2 records-based interview, as well as training for and placement of the electronic,

individual diaries. Following Visit 1, the electronic web-based diary (accessible via PC,

smartphone, or other mobile device) is maintained for one week by all household members 15

years old and older. Visit 2 is an in-person, records-based expenditure interview on household

expenditures that can reasonably be found in records such as receipts, utility bills, and bank

statements.

The proposed incentive structure for the new design includes a $2 prepaid cash incentive per

household sent with an advance letter, a $20 household incentive (debit card) provided after

Visit 1, a $20 individual incentive (debit card) for each member who completes the diary, and a

$20 or $30 household incentive (debit card) after Visit 2. Pending further research and

discussion, the Visit 1 $20 household incentive may be provided with the advance letter and

activated upon completion of the Visit 1 interview.

As part of the redesign process, representatives from the Division of Consumer Prices and Price

Indexes (CPI), the Division of Consumer Expenditure Survey (DCES), and the Division of Price

and Index Number Research (DPINR) created a set of reconciled survey requirements. These

reconciled requirements specified a minimum set of expenditure and non-expenditure data

elements that must be collected from each consumer unit (CU),¹ the need for annual expenditure

estimates at an aggregate level of total household spending, the need for month of expenditure(s)

for each expenditure category, and the need for data to be collected at a minimum of two points

in time, one year apart, for each CU, allowing for analysis of change over time.

¹ A consumer unit is defined as members of a household who are related by blood, marriage, or other legal arrangement, OR members of a household who make joint financial decisions. Throughout this report, the terms “consumer unit” and “household” are used interchangeably.

Table of Contents

1. Gemini Background
   a. Redesign Motivation
   b. Report Overview
   c. Redesign Evaluation
   d. Redesign Process
2. Proposal Development Process
   a. Internal Inputs
   b. External Inputs
   c. Guiding Principles
   d. Survey Requirements
   e. Proposal Review
3. Proposed Design
   a. Design Overview
   b. Survey Content
   c. Data Collection Methodology
   d. Sample Design and Proposed Sample Size
   e. Expected Cost
   f. Future Decisions
4. Future Enhancements
   a. Adaptive Design
   b. Gold Standard Interviews
   c. Split Questionnaire Design
   d. Increased Use of Technology
   e. Self-Administered Interviews
   f. Extended Longitudinal Component
5. Next Steps
Appendix 1. Proposed Survey Content and Source
Appendix 2. Comparison of Current and Proposed Surveys
Appendix 3. Design Research, Discussion, and Outreach Events
Appendix 4. Project Team Membership


1. Gemini Background

The mission of the Consumer Expenditure Survey program (CE) is to collect, produce, and

disseminate information that presents a statistical picture of consumer spending for the

Consumer Price Index (CPI), government agencies, and private data users. The mission

encompasses analyzing CE data to produce socio-economic studies of consumer spending and

providing CE data users with assistance, education, and tools for working with the data. CE

supports the mission of the Bureau of Labor Statistics (BLS), and therefore CE data must be of

consistently high statistical quality, relevant, timely, and must protect respondent confidentiality.

In an effort to improve data quality, the BLS began the Gemini Project in 2009 to redesign the

CE survey with the overall mission to improve data quality through a verifiable reduction in

measurement error, with a particular focus on under-reporting. The effort to reduce

measurement error is to be explored in a manner consistent with combating further declines in

response rates (Gemini Project Vision Document, 2012).

a. Redesign Motivation

The design of the CE survey must be updated on an as-needed basis to address the effect that

changes in society, technology, new consumer products, and spending methods have on survey

estimates. Not making these updates presents a risk that the CE survey will not be able to

continue producing high quality expenditure estimates for users.

Surveys today face well-known challenges that affect response rates, such as increasingly busy

respondents, confidentiality and privacy concerns, competing surveys, controlled-access

residences, and non-English-speaking respondents. In addition to these response rate challenges,

the CE survey also faces challenges that are related to, or that directly affect, the quality of the data

collected. Presented in order of importance, the three most challenging issues the CE survey

faces are (1) evidence of measurement error, (2) environmental changes related to new

technology and consumption behaviors, and (3) the need for greater flexibility in the mode of

data collection and ability to update data collection strategies. Addressing these issues comprises

the main objectives of the survey redesign.


b. Report Overview

To address the challenges facing the CE, the Gemini Design Team presents in this report a

redesign proposal that

is expected to reduce measurement error, reflect currently

available technology and consumer behaviors, and

incorporate flexibility needed to regularly update data

collection strategies and modes. The design team has

reviewed, evaluated, and synthesized numerous reports

including recommendations from the National Academies’

Committee on National Statistics (CNSTAT) (Dillman

and House, 2012) and Westat (Cantor et al., 2013), as

well as several internal reports on defining data quality

(Gonzalez et al., 2009) and the data users’ needs

(Henderson et al., 2010). The design team used input

from these reports to develop a design that uses

technology to encourage real-time data capture, individual diaries to reduce proxy reporting,

shortened interview length and record use to improve data quality, and incentives to address

respondent motivation. The design team is confident the proposed design will increase the CE’s

ability to collect high quality data, while maintaining current response rates and costs.

c. Redesign Evaluation

CE will evaluate the success of the Gemini Project by the ability of the new survey design to

meet the redesign objectives – that is, to (1) result in a quantifiable reduction in measurement

error, (2) account for environmental changes, including changes in available technology and

consumer spending behaviors, and (3) support a greater operational flexibility, including changes

in data collection mode and in data collection strategies. Regarding the first objective, the CE has

initiated the development of an ongoing data quality profile and a measurement error evaluation

process to track changes in overall data quality and measurement error resulting from the

redesign’s implementation. The CE will evaluate the success of the second and third objectives

by comparing the time necessary to incorporate specific data collection protocol changes before

and after the redesign’s implementation.


The net analytic value of CE data for researchers and policymakers should also be maintained or

improved, both for microeconomic and macroeconomic data users, while improving the overall

quality of the data provided. It is also important that the design aims to be reasonably robust to

future changes in societal and technological factors and variability in budgetary levels. The

design team believes that the survey resulting from this redesign proposal will meet those

objectives.

d. Redesign Process

Since its inception in 2009, the Gemini Project has been managed by a project leader and

steering team, with guidance from an executive management group. The Gemini Project

includes three stages. Stage 1, which concludes with this report, focused on information

gathering and synthesis leading to this redesign proposal. Stage 1 included developing a

research project tracking system to make relevant existing research more easily accessible,

defining data quality for CE, and coordinating a series of information gathering events (see

Appendix 3 for complete list).

Stage 2 of the project will focus on developing a roadmap for testing, evaluation, development,

and implementation of the redesign. As part of stage 2, BLS will conduct outreach to data users

on the new proposed redesign with the goal of publicizing the proposals and mitigating any large

impacts of the new design identified by current CE users. These efforts are expected to continue

through 2013, with a final report summarizing the feedback to be completed in early 2014.

Stage 3 comprises the actual implementation of the redesign. It will include feasibility testing,

evaluation, piloting, systems development, field staff training, and conducting a bridge test of the

new survey (alongside the current survey). At this point, some of the details and intricacies of

the design will be addressed as further research is conducted and implementation approaches.

Stage 3 is expected to take 10 years to complete, relying heavily on the availability of funding.

With a start date in 2014, full implementation of the redesign is expected in 2023.

2. Proposal Development Process

The Gemini Design team compiled numerous reports and input to create a cohesive redesign

proposal that would meet the needs of CE and the goals of the redesign project. Both internal

and external inputs informed the decisions made and guided the development of the design.

a. Internal Inputs

During stage 1 of the Gemini Project, the Steering Team

chartered subteams to research various topics and organize

information to provide essential input for making redesign

decisions. A Research Tracking System Team developed a

database for tracking CE method research findings. A Data

Quality Definition Team was formed to develop a

definition of data quality to be used by the survey. The

OPLC Requirements Team reconciled data requirements of

internal BLS stakeholders (CPI, CE, and internal researchers) into one set of reconciled

expenditure categories and data requirements. The first Gemini sampling team estimated sample

sizes required for the CE Diary Survey (CED) and CE Quarterly Interview Survey (CEQ) based

on the reconciled expenditure categories. The Gemini Technology Team reviewed the existing

literature and made recommendations about the most promising technologies. The design team

synthesized this information to develop a final redesign proposal.

In addition to the contributions from these teams, redesign ideas and feedback were solicited

from BLS and Census staff. Regular meetings were held with the Gemini Steering Team and

Gemini Executive Management Group where redesign ideas were shared. The design team also

conducted a web survey to ask CE staff for their input on features to incorporate into the

redesign. Field representative (FR) focus groups were conducted on the proposed design to

solicit feedback from the interviewers. The design team considered all of these ideas and

feedback when finalizing the proposed design.

b. External Inputs

To complement the internal initiatives listed above, BLS recognized the need for external

expertise in the redesign process. The Gemini Project hosted several events aimed at collecting

information from external sources, survey producers, and experts for use during the redesign.

These included a survey redesign panel discussion (January 2010), data capture technology

forum (March 2010), respondent records panel discussion (May 2010), data user needs forum

(June 2010), and methods workshop (December 2010). The majority of the promising features

identified from the CNSTAT report, Westat proposal, and internal CE survey review have been

incorporated into this design proposal.

Additionally, to obtain an independent perspective on the

redesign, the Gemini Project contracted with two external

groups to provide expert recommendations on the redesign

proposal and research process.

The first contract was to the National Academies’

Committee on National Statistics (CNSTAT), who convened

an expert panel to collect additional information about the

issues facing the CE and propose redesign options. The

panel consisted of experts from a variety of relevant fields, including survey methodology,

statistics, psychology, economics, and public policy. The panel reviewed CE documentation and

past research, hosted two events with external experts, including other household survey

producers (June 2011) and survey methodologists with recommendations on redesign options

(October 2011). CNSTAT also contracted with two external vendors to provide independent

redesign proposals, which the panel used in conjunction with its own expert judgment to create a

report documenting three redesign options for the CE.

The output of the panel was a report that synthesized information gathered through the BLS data

user needs forum, BLS methods workshop, CNSTAT household survey data producer workshop,

CNSTAT CE redesign options workshop, and independent papers into comprehensive design

recommendations. The panel was charged with presenting a “menu of comprehensive design

options,” rather than a single design, and the final report, “Measuring What We Spend: Toward a

New Consumer Expenditure Survey,” met the charge fully. The report validates CE concerns

about issues affecting survey data quality, affirms the need for a major redesign, provides expert

recommendations for improving the survey, and establishes an imprimatur for CE’s redesign

plans.


The CNSTAT final report contained three individual redesign proposals, each with elements that

could be incorporated as presented, or in different combinations (Dillman and House, 2012). Of

the designs proposed, the Gemini Design team’s proposal is most similar to “Design B,”

including features such as a single sample design, increased use of technology and records, and

use of incentives. The design team used the CNSTAT report as a key input for the redesign

proposal, as noted in footnotes throughout this report, and their careful evaluation of design

options will be used throughout the next stage of the redesign process.

The second contract, for an independent redesign proposal, was awarded to Westat and

utilized the primary methodologists who had worked on the CNSTAT-commissioned proposals. The

contract was for a single, cohesive, budget-neutral survey design addressing the key issues

impacting CE data quality (underreporting, changes in technology and society, and flexibility of

data collection mode). The contractors based their design on a careful review of the literature

and past CE research as well as their own extensive experience in survey design. They also

worked to build their current design on the feedback received from their original designs

proposed to CNSTAT. BLS and Westat worked together to ensure the final proposal was

realistic given a five-year development and testing schedule. The Gemini Design team’s

proposed design reflects the Westat design in many ways, as noted in footnotes throughout the

report, including two waves of data collection, increased use of technology and records,

individual diaries, reduced overlap in content between the diary and interview, and use of

incentives.

In addition to the redesign proposal, Westat used budget figures from the Census Bureau along

with their data collection experience to create a cost estimate for ongoing data collection.

Finally, they provided a plan to evaluate the design elements. The design development was

iterative, with BLS providing feedback on options throughout the process. The output of this

contract was three presentations (methodology, cost and evaluation, and a final comprehensive

presentation) and a report detailing the proposed design, estimated budget, and evaluation plan

(Cantor, Mathiowetz, Schneider and Edwards, forthcoming). The Gemini Process will use the

Westat final report as a key input for the next stage of the redesign, as the research roadmap is

developed.


c. Guiding Principles

Borrowing the approach from CNSTAT, the Gemini Design team developed a set of principles

to use when making decisions. These principles helped the team weigh various design features,

and provided criteria to use when there was no obvious choice. The seven guiding principles

were:

1. Keep it simple

2. Make sure the design works for all respondents (e.g., have technology options)

3. Increase flexibility in content, modes, and technology

4. Meet OPLC requirements

5. Reduce measurement error

6. Keep costs neutral

7. Have no negative effect on response rates

d. Survey Requirements

As noted above, one of the activities in the redesign was to define a set of minimum survey

requirements that meet the needs of the CE, the CPI, and the Division of Price and Index

Number Research program offices. The complete requirements can be found in the forthcoming

“OPLC Requirements for the Consumer Expenditure Survey” report, to be available for

download from the BLS website (Passero et al., 2013). The design detailed in this report meets

all the requirements specified in the document. Specifically, the design will allow for annual

expenditure estimates of total household spending at the aggregate level. It will capture the full

list of expenditure and non-expenditure items needed from each consumer unit at the specified

level of detail. At the CU level, the design will allow for three months of expenditure data collected

at two points in time, a year apart, allowing for analysis of change. Additionally, the month of

the expense will be collected to meet CPI requirements. Not all data required for all types of

analyses are accommodated with the proposed redesign. For example, there is no provision for

the additional data needed for poverty thresholds or for analysis of total yearly income and

yearly expenditures from a single consumer unit.

Both CPI cost-weights and CE publication tables require all expenditure data for a given

calendar quarter. For any given CU, Visit 1 (recall interview) and Visit 2 (records-based


interview) reflect expenditures for the three months prior to the interview, while the diary

expenditures reflect the current week. To obtain a complete picture of expenditures in the

proposed design, data will be used from different CUs: Visit 1 and Visit 2 reports will be used

from one set of CUs, while Diary expenditures will be used from a different set of CUs from the

previous quarter, to match the same time period. In the example shown in table 1, CPI and CE

Publications would use one month of February expenditures from the recall and records-based

interviews conducted with CUs interviewed in March, April, and May, and one week of

February expenditures from diaries completed in February (by different CUs).

Microdata users have different needs, requiring all expenditure categories from the same time

period to come from a single CU. To do this, interview data for the past three months would be

combined with global questions covering the diary expenditure categories (collected in the recall

interview and covering the same three-month time period). This approach would provide a

complete picture of spending for each CU for the past three months.

Looking back to table 1, focusing on CUs interviewed in April, microdata users would receive

January, February, and March expenditure data from the recall interview, the records-based

interview, and the global categories covering the diary expenditures. This would account for a

total picture of spending, covering the past three months, from each CU. The global questions

data could be augmented with diary data collected in the subsequent time period. Although the

time periods would differ, microdata users would be able to take advantage of the additional

detailed information, informing their use of the global questions data.


Table 1. Expenditure time period coverage based on interview month

                                          Expenditures occurring in
CUs interviewed in                        Jan              Feb              Mar              Apr              May
January (Oct.–Dec. reference period)      Diary            –                –                –                –
February (Nov.–Jan. reference period)     Recall/Records   Diary            –                –                –
March (Dec.–Feb. reference period)        Recall/Records   Recall/Records   Diary            –                –
April (Jan.–Mar. reference period)        Recall/Records   Recall/Records   Recall/Records   Diary            –
May (Feb.–Apr. reference period)          –                Recall/Records   Recall/Records   Recall/Records   Diary
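The coverage pattern in table 1 follows a simple rule: the Visit 1 recall and Visit 2 records-based interviews cover the three calendar months preceding the interview month, while the diary covers a week in the interview month itself. That rule can be sketched as follows (the helper below is purely illustrative and hypothetical, not part of any BLS or Census production system):

```python
# Illustrative sketch of the table 1 coverage rule (hypothetical helper,
# not part of any production survey system). Visit 1 (recall) and Visit 2
# (records) cover the three calendar months prior to the interview month;
# the diary covers a week in the interview month itself.

MONTHS = ["Jan", "Feb", "Mar", "Apr", "May", "Jun",
          "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"]

def coverage(interview_month: str) -> dict:
    """Map each covered month to the instruments that cover it for one CU."""
    i = MONTHS.index(interview_month)
    reference = [MONTHS[(i - k) % 12] for k in (3, 2, 1)]  # three prior months
    cells = {m: ["Recall", "Records"] for m in reference}
    cells[interview_month] = ["Diary"]
    return cells

# A CU interviewed in April reports January-March in both visits and
# keeps a diary during April, matching the April row of table 1.
print(coverage("Apr"))
```

A CU interviewed in January likewise reports October–December in both visits while keeping its diary in January, matching the first row of the table.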

e. Proposal Review

As the design team developed this redesign proposal, continuous feedback was obtained from a

variety of sources. The design team shared a preliminary draft of the redesign proposal with

senior BLS and Census staff, and held several meetings with the Gemini Project executive

management team. Additionally, three focus groups were

held with Census FRs to collect their reactions to the

redesign proposal, and identify any elements they deemed

not feasible in a production setting. Overall, reactions to

the proposal were positive, with each stakeholder group

agreeing that the design was promising and would likely

collect higher quality data while maintaining response

rates. In particular, the FRs had a positive reaction to the

design. They found many features, such as incentives,

having only two waves, and use of technology very

exciting. While there were some questions and concerns


about a few design elements, the design team did not consider any of the noted concerns

“show stoppers,” and has considered changes to address them (Edgar, 2013). With

the completion of this report, the Gemini project will conduct several additional outreach events

through 2013 aimed at sharing the redesign proposal and collecting feedback from other key

stakeholders.

3. Proposed Design

a. Design Overview

As shown in figure 1, the proposed design calls for a single sample of consumer units²,³

participating in two waves of data collection,⁴ 12 months apart. Each wave comprises two

visits with one CU respondent as well as a one-week⁵ electronic diary⁶ for every CU member 15

years old and older. A multi-mode survey leverages interviewer skills at collecting certain types

of data and building a rapport with the respondent, while also relying on self-administration and

proactive diary keeping to obtain other types of information. The two visits will be household-

level interviews with reduced content that are shorter in length than the current CEQ, which the

design team hopes will maximize survey cooperation.

Visit 1

During the Visit 1 recall interview, the FR will collect the household roster and household-level

expenditures that the CU respondent can reasonably recall using a Computer Assisted Personal

Interview (CAPI) instrument. The CAPI instrument will allow either the CU respondent or the

FR to modify the order in which expenditures are collected.

² “All three [CNSTAT] prototypes … feature … a single sample.” (Dillman and House, 2012, p. 98)

³ “To meet the challenges of minimizing respondent burden while also providing data users with the ability to examine change within CUs, [Westat] recommend[s] using a single sample design that incorporates two waves of in-person interview data collection with a single week of diary maintenance after the first wave of CE interview data collection.” (Cantor et al., 2013, p. 6)

⁴ “By limiting the panel to two interviews, [the] proposed design reduces burden but also reduces the costs of the second interview (due to having already established rapport) as well as potentially improving reporting by either encouraging record maintenance between the first and second interview or encouraging collection of records prior to the second interview.” (Cantor et al., 2013, p. 10)

⁵ “A move to a single week rather than a two-week diary period is based on the limited empirical literature demonstrating a ‘fall off’ in reporting as the length of the diary period increases.” (Cantor et al., 2013, p. 19)

⁶ “Research in other data collection projects suggests that replacing paper-based data collection methods with electronic methods may potentially improve data quality.” (Cantor et al., 2013, p. 20)


All expenditure categories will have a three month reference period, collecting data for the three

months prior to the interview month. Depending on the category, questions may ask about the

previous quarter, with the month of purchase identified, or they may ask about each of three

previous months individually. In addition to the expenditure categories captured during the

recall interview, a select number of global questions covering all expenditure categories to be

collected from the diary will be included. Global questions will ask respondents to report total

spending at a higher level (e.g., clothing) than the more detailed questions likely to be included

in the interviews (e.g., pants, shirts, shoes).

After conducting the recall interview, the FR will ask the respondent to collect records⁷,⁸ for

select expenditures (e.g., expenditures respondents typically do not accurately know, such as

utilities) for the three months prior to the interview to be used in the Visit 2 records-based

interview. Lastly, the FR will perform electronic diary training and placement with the CU

respondent(s) to prepare for the expenditure diary task. The CU respondent will be expected to

train any other CU members who were not present on how to use the electronic diary. The entire Visit 1

interview is intended to average 45 minutes. Upon successfully completing Visit 1, the FR will

give the CU respondent a $20 incentive in the form of a debit card.

Diary Week

Beginning on the first day following the Visit 1 interview and continuing for seven days, all CU

members 15 years of age and older will record their individual expenditures using an electronic

diary. A $20 individual incentive (in the form of a debit card) will be given to every CU member

completing the one-week diary.

Respondents will be encouraged to use their own device to access the electronic diary form.9,10

CU members without their own device will be given a basic mobile device (smartphone or

7 “The use of records is extremely important to reporting expenditures and income accurately. The use of records on the current CE is far less than optimal and varies across the population. A redesigned CE would need to include features that maximize the use of records where at all feasible…” (Dillman and House, 2012, pg. 6)
8 “Encourage respondents to refer to records for the interview… The rationale for asking respondents to refer to records is simply that records are likely to contain precise, accurate information.” (Cantor et al., 2013, pg. 17)
9 “[Westat] recommend[s] allowing respondents to use their own smart phones, tablets, laptop computers, or desktop computers to enter their expenditure data on the web-based diary.” (Cantor et al., 2013, pg. 29)
10 “All three of the proposed designs recommend the use of a [mobile] device” (Dillman and House, 2012, pg. 150)


tablet)11 with the ability to connect to the internet to use for the duration of the diary reference

week. At the Visit 2 interview, the devices will be picked up. Any CU member unable or

unwilling to complete an electronic diary will be given a paper diary form. This multi-mode

diary will allow respondents to use the technology most comfortable to them.

The respondents’ electronic diary entries will be monitored during the reference week by Census

headquarters or Regional Office staff,12 who will notify FRs of any issues the CU members are

experiencing with the electronic diary. The FR will follow up with the CU during the reference

week to (1) recognize and further encourage those who are reporting expenditures to continue to

do so, (2) encourage respondents who are not reporting their expenditures to begin to do so, and

(3) address any issues that may be negatively affecting respondents’ proper use of the diary. The

FR will serve as the primary support contact for the respondents, and a centralized technology

help desk will be available for any technical issues they cannot resolve.13 Any technical issues

will be logged with case information so the FR will be able to view issues and see the resolution.

Keeping track of technical issues will also allow for continuous improvement.
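As an illustration only, the issue log described above might be structured as follows; the field names and values are assumptions for this sketch, not an actual Census Bureau schema.

```python
# Illustrative structure for the technical-issue log described above.
# Field names are hypothetical, not an actual Census Bureau schema.
from dataclasses import dataclass
from typing import Optional

@dataclass
class DiaryIssue:
    case_id: str                 # ties the issue to the CU's case information
    reported_by: str             # e.g., "help desk" or "FR"
    description: str
    resolution: Optional[str] = None  # filled in once the issue is resolved

    def resolve(self, note: str) -> None:
        self.resolution = note

# A running log lets the FR review open issues and their resolutions.
issue_log = []
issue = DiaryIssue("case-0001", "help desk", "cannot log in to web diary")
issue_log.append(issue)
issue.resolve("password reset; respondent confirmed access")
```

Retaining resolved entries, rather than deleting them, is what supports the continuous-improvement review mentioned in the text.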

An advantage of individual diaries is that requiring all CU members to keep a one-week diary

may elevate the salience of the diary-keeping task, increasing overall expenditure reporting

while at the same time reducing the diary-keeping burden on any single household

respondent.14 The need for every CU member to

complete an individual diary could reduce response, but the individual incentives awarded for

successful completion of the diary should counteract this risk.

11 “Respondents can then use their own technology or borrow a [mobile] device to access the diary.” (Cantor et al., 2013, pg. 5)
12 “The centralized facility would be able to monitor and intervene if households do not enter data regularly or if there is evidence of poor data quality.” (Dillman and House, 2012, pg. 103)
13 “The field representatives’ role will still be important in directly collecting data, but their role [should] grow to also provide support in additional ways.” (Dillman and House, 2012, pg. 92) and “The field representative’s role changes radically…, from being the prime interviewer and data recorder to being the facilitator who encourages, trains, and monitors in support of the respondent’s thoughtful and accurate data entry.” (Dillman and House, 2012, pg. 104)
14 “A limited body of empirical research [e.g. Grootaert 1986; Edgar, et al. 2006] suggests that reducing reliance on a single CU reporter for the diary may improve reports of expenditures.” (Cantor et al., 2013, pg. 34)


Visit 2

The Visit 2 interview will occur after the diary week is complete, one week after the Visit 1

interview. During the Visit 2 records-based interview, the FR will review the diaries and check

with the CU respondent to capture any additional expenses that were missed by members during

the diary week. Then the FR will again use a CAPI instrument to collect expenditures that are

best collected via the records that the respondent was asked to collect. The Visit 2 interview will

collect the expenditures covering the three months prior to the Visit 1 interview month. Asking

for specific records a week in advance should increase their use during this second interview,

resulting in improved data quality. However, we recognize the task of gathering records may

have the adverse effect of reducing response rates at Visit 2.

As with Visit 1, Visit 2 is intended to average 45 minutes, and the CU respondent or the FR can

modify the order in which expenditures are collected. Also, as with Visit 1, a $20 household

debit card incentive is given upon successful completion of the CAPI interview unless the

respondent used records to report expenditure amounts, in which case the incentive is increased

to $30.15

Wave 2

Twelve months later, each CU will repeat the full process. The second wave of data collection

will follow the same structure as the first, with a Visit 1 recall interview, individual one-week

diaries, and a Visit 2 records-based interview. The incentive structure will also remain the same

with $2 cash included in the advance letter, a $20 debit card given after Visit 1, a $20 debit card

for each member who completes the diary, and a $20 or $30 debit card after Visit 2.

When compared to the status quo, a clear advantage of a design comprising only two waves

(each 12 months apart) is a reduction in overall household level burden; however, after a year,

there is a risk of losing the rapport that was developed by the FR during the first wave. To help

offset this, the CU household will be sent a “Respondent Engagement” mailing following the

first wave to help maintain rapport and encourage their participation in the second wave of data

collection. These mailings are envisioned to be a postcard or e-mail thanking them for their

15 The criteria by which the respondent will qualify for the increased incentive will be explained in advance of the interview.


participation, reiterating the importance of their participation, and reminding them of the next

wave.

Incentives

Each wave begins with an advance mailing that describes the survey and explains the importance

of the household’s participation (similar to the Wave 1 advance letters in the current CE design). The letter will

also include a token $2 cash incentive and information about the promised incentives for

successfully completing the household interview and diary components.16,17

The incentives will be individual-level and performance-based.18 A single household CU

respondent receives $20 for completing the Visit 1 recall interview and $20 or $30 for the

completion of the Visit 2 records-based interview, depending on whether or not records were

used. Each CU member, including the household CU respondent, will receive $20 for

successfully completing the one-week diary. All monetary incentives are repeated for Wave 2,

and the final incentive is a non-monetary “Respondent Experience” brochure package, awarded

after the second wave of data collection.

After the second wave is completed, a non-monetary “Respondent Experience” incentive will be

mailed within one week to the CU household respondent. This brochure package will contain a

personalized graph showing the household’s Wave 1 expenditures compared to national

averages, and an information sheet listing helpful government websites.

16 “Another key element of the prototypes is the use of incentives to motivate respondents to complete data collection and provide accurate data. The panel recommends an appropriate incentive program be a fundamental part of the future CE program.” (Dillman and House, 2012, pg. 10)
17 “Incorporate incentives at all stages of the CE data collection – prepaid small incentives in the advance letter, for the CE interview respondent, and for every diarist in the CU. Utilize incentives to both improve response rates and motivate reporting.” (Cantor et al., 2013, pg. 36)
18 The debit card will be given at the start of the task and the PIN provided at the successful completion of the task.


Figure 1. Proposed design


b. Survey Content

To address the OPLC data requirements, the design team reviewed the full list of required

expenditure and non-expenditure data elements and determined which were best collected via

recall, records, or a diary.19 The design team identified 36 expenditure categories to be collected

via recall during the first in-person interview, 68 categories to be collected in the diary, and the

remaining 36 categories to be collected via records during the second in-person interview. These

designations will be tested and evaluated to ensure the correct collection method is selected.

Consult appendix 1 for the full list of 140 expenditure and non-expenditure categories classified

by data collection approach.20

Although the design team has proposed a data collection approach for each required element, the

team acknowledges additional work will be needed before finalizing the approach used to collect

each data element. Research will also need to address the level of aggregation at which each

question should be asked in order to collect data that fully cover the required categories.

For example, the category for education expenses may be asked as a single question: What did

your household spend on education expenses in the past X months?

or as multiple questions: In the past X months, what did your household spend …

on college tuition?

on tutoring and test preparation?

on educational books and supplies?

The level of aggregation required to collect some expenditure categories may result in some

lower-level questions collected in the recall interview and some in the records-based interview,

spreading the category across both surveys.

The guidelines the design team used to decide the best data collection approach

for each expenditure category follow:

19 “Both [recall and records interviewing] have real drawbacks, and a new design will need to draw from the best (or least problematic) aspects of both methods.” (Dillman and House, 2012, pg. 4)
20 Appendix 1 is based on an initial list, dated October 31, 2012, of reconciled expenditure categories. The exact survey content will be based on research using the final list of expenditure categories provided in the OPLC Requirements Document (Passero et al., 2013).


Visit 1 Recall Interview

- Big ticket items that can be easily recalled (e.g., vehicles, major appliances)

- Items that respondents would be able to report for other household members (e.g., college tuition, catered affairs)

- Infrequent purchases not likely to be collected with sufficient sample size in a one-week diary (e.g., professional medical services)

- Average survey duration of 45 minutes

Diary

- Small, frequent expenses not easily recalled (e.g., food)

- Items that people may be more willing to report privately (e.g., alcohol, tobacco)

- Items an individual proxy respondent would not know for all household members (e.g., recreational goods)

Visit 2 Records-Based Interview

- Items that respondents likely do not accurately know but could easily obtain from records (e.g., payroll deductions, mortgage interest and charges)

- Items that respondents may know but may be more accurately reported using records (e.g., electricity, water, sewer, and trash collection)

- Average survey duration of 45 minutes

c. Data Collection Methodology

To collect the data described above, a variety of methods will be used. CAPI instruments will be

created for the Visit 1 and Visit 2 interviews. They will be developed to maximize flexibility;

interviewers will be able to move through the expenditure sections in the order that best suits the

respondents (e.g., in order of the records they have available, in order of respondent interest).

Recognizing that keeping a diary is a burdensome task, this design attempts to overcome that

challenge by reducing the reporting task from two weeks to one week, offering incentives, and

simplifying the reporting process as much as possible. The reporting tasks will be divided


among the individual household members, minimizing the burden for any one respondent. Using

a mobile-optimized web diary for the individual diaries will make data entry by respondents

simpler. The electronic diary will be designed with a new simplified format (open-ended to

capture all expenditures) and will be accessible from multiple points (i.e., personal computer,

personal mobile device, or government-provided mobile device). This should help respondents

enter expenditures in real time, as purchases are made. To capture data and create records in the

diary, respondents will be encouraged to make use of available device functionality such as

cameras, receipt scanners, bar code scanners, or handwriting-to-text software. An additional

benefit to web-based collection is the ability to monitor results in real time, allowing the FRs to

target CUs that need encouragement or assistance in reporting expenditures.

d. Sample Design and Proposed Sample Size

As mentioned above, the proposed design calls for a single rolling sample of CUs participating in

two waves of data collection. The proposed design also calls for the same sample size across all

expenditure categories with every household receiving the same survey content. An advantage of

this design is that it will be straightforward to program and implement.

The CE sample size requirement states that CE must be able to produce “reliable annual

estimates for the required level of expenditure detail both nationally and for standard one-way

tables currently produced” (Passero et al., 2013). To identify the minimum sample size needed

to meet this requirement, the design team worked with CE Statistical Methods Division (SMD)

staff. The criterion was that, for the smallest published subpopulation,21 a coefficient of variation22

(CV) of less than 25 percent be obtained for a majority of the expenditure

categories intended for publication.

Using three years of CE integrated data from 2009–2011 and combinations of existing

expenditure categories, simulations were run to identify the percent of reconciled expenditure

categories that met the 25-percent CV threshold at various sample sizes (see figure 2).

21 The Asian subpopulation.
22 For this analysis, the CV was calculated by dividing the standard error by the mean.
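The screening rule can be sketched numerically; the category means and standard errors below are invented for illustration and are not CE estimates.

```python
# Illustrative sketch of the 25-percent CV screening rule described above.
# The expenditure means and standard errors below are invented, not CE data.

def coefficient_of_variation(std_error, mean):
    """CV as defined in footnote 22: standard error divided by the mean."""
    return std_error / mean

# (mean, standard error) for a hypothetical subpopulation, by category
estimates = {
    "Airline Fares": (410.0, 120.0),
    "Appliances": (260.0, 50.0),
    "Child Care": (1800.0, 510.0),
}

THRESHOLD = 0.25
meeting = [name for name, (mean, se) in estimates.items()
           if coefficient_of_variation(se, mean) < THRESHOLD]
print(meeting)                        # categories meeting the threshold
print(len(meeting) / len(estimates))  # share meeting the threshold
```

The simulations described in the text effectively repeat this calculation at many candidate sample sizes, tracking the share of reconciled categories that clear the threshold.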


Figure 2. The number of reconciled expenditure categories, intended to be published, meeting

the 25-percent CV threshold for the Asian subpopulation at given sample sizes

The design team evaluated the findings and recommends a sample size of 28,000 completed

cases per year. That is the point where additional cases do not significantly increase the number

of categories meeting the threshold. It is also important to note that even with a much larger

sample size than recommended (e.g., 42,000, more than double the current production sample

size), a number of categories do not meet the required threshold. This mimics the

current production design, in which a number of collected expenditure categories do not meet a

25-percent CV threshold.

The CPI has independent sample size requirements, also documented in the OPLC Requirements

Document (Passero et al., 2013). They require that the sample size be sufficient to “produce reliable

monthly expenditure estimates as needed for the C-CPI-U” and “allow the CPI to replace its cost

weights every two years for national and sub-national indexes.”


Work is ongoing to identify the minimum sample sizes needed to meet the CPI requirements.

When available, the CE required sample size and CPI required sample size will be reconciled,

likely using the larger of the two to ensure all requirements are met. The final sample size is

expected to be in the range of the current sample size, which means: (1) as is currently the case,

the desired level of accuracy may not be reached for some expenditure categories; and (2) also as

with the current sample, there may be a larger sample size than needed for some expenditure

categories.

e. Expected Cost

The start-up costs of the redesigned survey should be reasonable because of the use of currently

available technology, a straightforward design, and the ability to leverage previous Census Bureau

development work (e.g., the American Community Survey web-based data

collection instrument). As for ongoing costs, there are some elements of the proposed design

that will increase costs, while other elements will decrease cost. For example, the increased use

of technology will likely result in new costs (e.g., technology help desk for respondents) and the

individual diaries may increase processing costs (adding a new requirement to integrate the

individual diaries); however, there should be overall cost savings from the use of a single sample

with fewer waves as well as a reduction in the keying of diary data.

The use of monetary incentives will be expensive (with a maximum of $132 per wave for a four-

person household), but research suggests (Lavrakas, 2008) that incentives will reduce the cost of

the first contact. Further, the use of a debit card (with pin given when respondent successfully

completes the interview or diary) may result in cost savings from any uncashed cards.
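The $132 maximum can be checked from the incentive amounts stated earlier in the text; the calculation below simply restates that arithmetic.

```python
# Arithmetic behind the $132 maximum per-wave incentive for a four-person
# household, using the amounts stated in the text.
ADVANCE_CASH = 2            # $2 cash in the advance letter
VISIT1_DEBIT = 20           # $20 debit card after Visit 1
DIARY_PER_MEMBER = 20       # $20 per CU member completing the diary
VISIT2_WITH_RECORDS = 30    # $30 after Visit 2 when records are used

def max_wave_incentive(cu_members):
    """Maximum incentive cost for one wave of data collection."""
    return (ADVANCE_CASH + VISIT1_DEBIT
            + cu_members * DIARY_PER_MEMBER + VISIT2_WITH_RECORDS)

print(max_wave_incentive(4))  # 132
```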

In a January 18, 2013 email to the Census Bureau, the design team provided an overview of the

basic design, including the chart in figure 1, and requested a cost estimate for the proposal. The

overall survey and sample design, amount and structure of incentives, survey mode, expected

response rate (70 percent), and technology (basic smartphone with wireless technology) were

provided. For cost estimation purposes, a sample size of 35,000 was specified. The Census

Bureau estimate for ongoing data collection of this design was in line with current production

costs, meeting one of the goals of the redesign.


Additionally, as part of their contract, Westat estimated costs for conducting a survey with a very

similar design and sample size, and also concluded that the ongoing collection cost would be in

line with current production costs (Cantor et al., forthcoming 2013).

f. Future Decisions

There remain a number of design decisions that have not yet been addressed, most due to the

need for additional information or research. The following topics will need to be addressed

during the redesign research development process as more information becomes available:

i. Mobile diary device application type: Whether to use a mobile-optimized web survey

or a native application (app). For the purposes of this report, a native app is being

defined as a local application designed to run on a specific device, versus a web

application that is run within a browser. The advantages of a mobile-optimized web

option over a native app include ease of version maintenance, reduced respondent

security concerns, respondent access without installing, and access on a desktop

computer. The advantages of using a native app over a mobile-optimized web option

include a more consistent appearance, offline access, and greater speed.

ii. Use of records as data input: Explore options to use an app or other software to

capture and code information directly from records and input that information into the

diary and/or interview. Interviewers could then identify missing or unclear data

elements and interview only on those as needed in Visit 2.

iii. Acceptance of annotated grocery receipts: Explore issues associated with allowing

respondents to provide grocery receipts in lieu of recording those expenses in the

diary.

iv. Incentive amounts and structure: The current Census protocol and logistical issues

associated with individual-level and performance-based incentives should be

explored. Additionally, the effectiveness of all planned incentives will be evaluated.


v. Exact survey content: The division of content between the recall interview, records-

based interview, and diary needs to be developed. This also includes the use of

global questions, the focus of the diary, and householder versus individual diaries.

vi. Government provided technology: The goal of providing government technology is

to increase the number of respondents using the electronic diary, to improve data

quality, and to reduce data entry costs. During the development period, the percent of

respondents without their own access to the internet will be weighed against the cost

of providing technology and any other risks or logistical issues to determine the

benefit of providing technology in lieu of simply using a paper diary for respondents

without internet access.

vii. Length of Visit 1 and Visit 2: As the survey content and field procedures (e.g., diary

placement, use of records) are finalized, the amount of time an interviewer is with the

respondent in each visit will be estimated. The goal is to have each visit average 45

minutes. Careful evaluation of task length will have to be done.

viii. Respondent Experience mailing: The current proposal recommends sending the

Respondent Experience brochure after Wave 2. The advantage of this timing is the

confidence that the mailing does not change respondents’ spending behavior, and

ensures the Wave 1 data have been processed and can be included in the mailing.

The disadvantage is that the promise of the brochure is likely to be less effective in

combating Wave 2 attrition than the actual material would be.

ix. Collection of outlet information: Explore collecting information required for the

CPI’s Telephone Point of Purchase Survey (TPOPS) in the CE. For example, the

TPOPS requires outlet (i.e., store) name, location, and price. This information could

potentially be collected from records and receipts during the Visit 2 records-based

interview, and/or via the diary.


4. Future Enhancements

With the “Keep It Simple” guiding principle in mind, the design team aimed to create a survey

design that will allow for development and implementation in the short term, and ongoing

improvement in the long term, after implementation. Below is a list of design features that the

team identified as promising, including adaptive design, gold standard interviews, split

questionnaire methods, increased use of technology, self-administered interviews, extended

longitudinal component, and collection of outlet information. The list can be viewed as a

starting point for the prioritization of future research and development efforts.

a. Adaptive Design

Adaptive survey design attempts to break up methods and processes for survey data collection into manageable, observable, trackable pieces. It considers different indicators of progress (frame data, effort measures, cost measures, response data, and response propensity estimates), measures them continuously, and tailors data collection approaches to case characteristics. In short, adaptive design is the use of empirical data to facilitate intelligent business decisions prior to and during data collection.

Potential adaptive design benefits for CE are:

- Improvement in response through tailored field procedures based on contact history or respondent characteristics.

- Reduction of respondent burden, without negatively affecting key estimates, through tailoring the collection of select expenditure categories (e.g., not asking questions for expenditures where we have already achieved sufficient sample).
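The second benefit can be made concrete with a small sketch; the sufficiency target and the report counts are invented for illustration.

```python
# Hedged sketch of one adaptive-design benefit noted above: skip questions
# for expenditure categories where sufficient sample has already been
# achieved. The target and the counts are invented for illustration.
SUFFICIENT_SAMPLE = 1000  # assumed per-category target, not an actual CE figure

completed_reports = {"vehicles": 1250, "appliances": 640, "child care": 1010}

def categories_to_ask(report_counts, target=SUFFICIENT_SAMPLE):
    """Return only the categories still short of the target sample."""
    return [c for c, n in report_counts.items() if n < target]

print(categories_to_ask(completed_reports))  # ['appliances']
```

In an actual adaptive design the counts would be updated continuously from incoming response data, and the decision rule would be considerably more nuanced than a single threshold.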

b. Gold Standard Interviews

In many federal surveys, one can assess the quality of data by comparisons with other sources of

information. One of the difficulties in evaluating the quality of CE data is that there is no “gold

standard” with which to compare the estimates. Instead of comparing to an external source, one

could conduct an intensive interview with a subsample of the main sample. This intensive

interview could employ techniques such as the extensive use of records, financial software, and

budget balancing to establish accurate measurements of expenditures, income, and assets at the

household level. It could be conducted as part of an ongoing research sample or as a one-time

effort.

Data from a gold standard interview could be used in several ways:

1. Adjust data

2. Evaluate measurement error in the base sample

3. Measure the extent and organization of household record keeping to inform how better to collect expenditures in the ongoing survey

4. Provide an opportunity to collect data for more demanding research needs

c. Split Questionnaire Design

Split questionnaire design is a form of matrix sampling of survey questions in which several distinct

forms of the survey are constructed (modules are sampled, rather than individual questions) and

respondents are randomly, although not necessarily with equal probability, assigned to one

survey form in order to reduce respondent burden by shortening the survey. Split questionnaire

designs reduce the length of the survey while collecting the necessary information from at least

some of the sample members, but they also result in missing data. The goal is to minimize the

amount of information lost relative to the complete questionnaire. Appropriate decisions must be

made at various phases of the survey process to aid optimal implementation and estimation.

The BLS has devoted considerable attention to the potential use of split questionnaire design and

preliminary simulation results indicate that split questionnaire designs for the CE can reduce

survey length by at least 50 percent, with the impact on variances varying depending on the

type of expenditure category (Gonzalez, 2012). However, additional research is needed to

determine the optimal length of a shortened survey, composition of questionnaire splits (in terms

of their statistical properties and impact on respondent processes/errors), and dataset construction

and analysis methods.
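The assignment mechanism can be sketched as follows; the forms, modules, and weights are invented for illustration and are not proposed CE splits.

```python
# Minimal sketch of split-questionnaire assignment as described above:
# whole modules are sampled, and respondents are randomly (not necessarily
# equiprobably) assigned one survey form. Forms and weights are invented.
import random

FORMS = {
    "A": ["housing", "utilities", "vehicles"],
    "B": ["housing", "apparel", "entertainment"],
    "C": ["housing", "food", "healthcare"],
}
FORM_WEIGHTS = {"A": 0.4, "B": 0.3, "C": 0.3}  # unequal assignment probabilities

def assign_form(rng):
    """Randomly pick one form for a respondent, honoring the weights."""
    names = list(FORMS)
    return rng.choices(names, weights=[FORM_WEIGHTS[n] for n in names])[0]

rng = random.Random(42)  # fixed seed so the sketch is reproducible
assignments = [assign_form(rng) for _ in range(1000)]
```

Note that a "core" module (here, housing) can appear on every form, so key estimates are collected from the full sample while the remaining modules contribute the missing data the text describes.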


d. Increased Use of Technology

One goal of the redesign was to improve flexibility of the design to allow for updates as society

or technology changes. This proposal uses a smartphone or other mobile device as the data

collection device, but we recognize that over the next decade technology will advance and other

options may become more feasible, efficient, cost effective, or otherwise attractive. This design

allows for the incorporation of technology, either in place of the smartphone, or in addition to it,

as technology advances.

In addition to data collection, technology may be added to the survey design in other ways.

Mobile devices could be given to the respondent during the interview to view the questions,

information booklet or other survey aids, in conjunction with questions asked by the interviewer.

e. Self-Administered Interviews

A potential avenue for cost savings is incorporating more self-administration into the interview

process since personal visit interviews are the most costly mode of data collection. There have

been attempts to administer complex questionnaires in self-administered formats. For example,

the National Institutes of Health has been developing an instrument for conducting a 24-hour

dietary recall (e.g., Thompson et al., 2010; Subar et al., 2010). This is a computerized interview

available on the internet that is designed to conduct large-scale nutritional studies. Going forward

with self-administration would require additional research to identify (1) protocols to promote

recall (e.g., cues, frames of reference, visual displays), (2) composition of self-administered

modules, and (3) interviewer follow-up procedures when self-administered responses indicate

cursory effort or a lack of effort (e.g., missing data; low levels of expenditures; sections

completed in atypically short lengths of time).

f. Extended Longitudinal Component

Budget permitting, additional interview waves could be added across years (for example, three

interview waves 12 months apart) or within years (for example, three interview waves, four

months apart) to increase the data’s value to researchers interested in longitudinal analysis.


5. Next Steps

With this proposed design, the Gemini Project has met a significant milestone, but a large

volume of redesign-related activity remains. In July 2013, the redesign proposal will be

presented publicly at the CE Methods Symposium. That will begin the task of sharing the

design with stakeholders and data users and identifying user impact. As the design team

concludes, new teams will take its place to contribute to the ongoing work necessary to complete

the project; for example, the Data User Impact team and Survey Content team will be chartered.

CE will be doing outreach with data users throughout 2013 and will consider their feedback in

developing the next iteration of the redesign. This is an iterative process and user feedback and

future research may result in changes to the proposal.

Research required to resolve outstanding design decisions and to finalize the design itself will

take place over a five-year period, beginning in 2014. During this period, the design and exact

survey content will be finalized. In parallel to the research related to the design, CE will

continue efforts on developing metrics to evaluate measurement error. CE will continue to share

information with users during this period; for example, documents detailing what data will and

will not be available in the new survey. The subsequent five-year period will see logistical

details, materials, and codebooks finalized and systems established. CE’s goal is to implement a

fully redesigned CE survey by 2023.


References

Bureau of Labor Statistics. 2008. Consumer Expenditure Survey Program Review Team Final Report. BLS Internal Document.

Bureau of Labor Statistics. “Chapter 16. Consumer Expenditures and Income.” Handbook of Methods. http://www.bls.gov/opub/hom/pdf/homch16.pdf (accessed June 26, 2013).

Cantor, D., Mathiowetz, N., Schneider, S., and Edwards, B. Forthcoming 2013. Westat Redesign Options for the Consumer Expenditure Survey.

Dillman, D. and House, C., Editors. 2012. Measuring what we spend: Towards a new Consumer Expenditure Survey. Panel on Redesigning the BLS Consumer Expenditure Surveys; Committee on National Statistics; Division of Behavioral and Social Sciences and Education; National Research Council.

Edgar, J., McBride, B., and Safir, A. Forthcoming 2013. Research Highlights for the Consumer Expenditure Survey Redesign. Monthly Labor Review.

Edgar, J. 2013. Gemini Design Team FR Focus Group Results. BLS Internal Document.

Gemini Project Steering Team. 2012. Gemini Project Vision Document. http://www.bls.gov/cex/ovrvwgeminivision.pdf (accessed June 26, 2013).

Gonzalez, J., Hackett, C., To, N., and Tan, L. 2009. Definition of Data Quality for the Consumer Expenditure Survey: A Proposal. http://www.bls.gov/cex/ovrvwdataqualityrpt.pdf

Henderson, S., Brady, V., Brattland, J., Creech, B., Garner, T., and Tan, L. 2010. Consumer Expenditure Survey Data Users’ Needs Final Report. BLS Internal Document.

Lavrakas, P. J. (Ed.). 2008. Encyclopedia of Survey Research Methods. Thousand Oaks, CA: Sage Publications, Inc.

Passero, B., Ryan, J., Cage, R., Stanley, S., and Garner, T. Forthcoming 2013. OPLC Requirements for the Consumer Expenditure Survey.

Westat. 2011. Data Capture Technologies and Financial Software for Collecting Consumer Expenditure Data Report. http://www.bls.gov/cex/ceother2011westat.pdf (accessed June 26, 2013).


Appendix 1. Proposed Survey Content and Source

The division of content among the recall interview, the records-based interview, and the diary needs further study and development. The diary is designed to be open-ended, with the intent of obtaining sufficient responses to accurately estimate expenditures for the categories listed in the table.

OPLC Requirement Category [23] | Recall Interview | Diary | Records Interview
Airline Fares | X | |
Appliances | X | |
Boats, RVs, and Other Recreational Vehicles | X | |
Catered Affairs | X | |
CE All Other Gifts | X | |
CE Support for College Students | X | |
Child Care | X | |
College Tuition | X | |
Floor Coverings | X | |
Funeral Expenses | X | |
Furniture | X | |
Household Operations | X | X |
Legal Fees and Financial Expenses | X | |
Lodging Away from Home | X | |
Maintenance and Repair Supplies | X | X |
Meals as Pay | X | |
Medical Equipment and Supplies | X | X |
Motor Vehicle Fees | X | X |
Motor Vehicle Maintenance and Repair | X | X |
Movie, Theater, Sporting Event and Other Admissions | X | X |
Occupational Expenses | X | X |
Other Food Away from Home | X | X |
Other Household Equipment and Furnishings | X | X |
Other Housing Expenses | X | |
Other Medical Services | X | |
Other School Tuition, Tutoring, and Test Preparation | X | |
Owners' Equivalent Rent of Primary Residence | X | |
Owners' Equivalent Rent of Vacation/Second Residence | X | |
Pets, Pet Products and Services | X | X |
Photography | X | X |
Professional Medical Services | X | |
Recreation Services | X | X |
Rent as Pay | X | |
School Meals | X | X |
Unpriced Maintenance and Repair Services | X | X |
Video and Audio | X | X | X
Amount Car or Truck Sold or Reimbursed | | | X
Amount Motorcycle Sold or Reimbursed | | | X
Amount Recreational Vehicle Sold or Reimbursed | | | X
CE Alimony, Child Support and Support for College Students | | | X
CE Assets and Liabilities | | | X
CE Cash Contrib. to Charities, Educ. and Other Organizations | | | X
CE Deductions for Social Security | | | X
CE Finance Charges Excl. Mortgage and Vehicle | | | X
CE Life and Other Personal Insurance | | | X
CE Non-Payroll Deductions to Retirement Plans | | | X
CE Payroll Deductions to Retirement Plans | | | X
CE Vehicle Finance Charges | | | X
Commercial Health Insurance | | | X
Electricity | | | X
Fuel Oil and Other Fuels | | | X
Hospital and Related Services | | | X
Household Insurance | | | X
Internet Services | | | X
Leased and Rented Vehicles | | | X
Long Term Care Insurance | | | X
Medicare Payments | | | X
Mortgage Interest and Charges | | | X
Motor Vehicle Insurance | | | X
Motorcycles | | | X
Natural Gas | | | X
New Cars and Trucks | | | X
Personal Property Taxes | | | X
Property Taxes | | | X
Rent of Primary Residence | | | X
Telephone Services | | | X
Trade In Allowance, Leased Vehicles | | | X
Trade In Allowance, New Cars and Trucks | | | X
Trade In Allowance, New Motorcycles | | | X
Used Cars and Trucks | | | X
Water, Sewer and Trash Collection Services | | | X
Adult Care | | X |
Bakery Products | | X |
Beef and Veal | | X |
Beer, Wine and Other Alcoholic Beverages Away from Home | | X |
Beer, Wine, and Other Alcoholic Beverages at Home | | X |
Beverage Materials Including Coffee | | X |
Breakfast, Brunch, Snacks and Nonalcoholic Beverages Away from Home | | X |
Cereal and Cereal Products | | X |
Children's Apparel | | X |
Dairy and Related Products | | X |
Dinner Away from Home | | X |
Educational Books and Supplies | | X |
Eggs | | X |
Fats and Oils | | X |
Fish and Seafood | | X |
Food Prepared by CU on Out of Town Trips | | X |
Footwear | | X |
Fresh Fruits | | X |
Fresh Vegetables | | X |
Household Textiles | | X |
Housekeeping Supplies | | X |
Infant and Toddler Apparel | | X |
Information and Information Processing Other Than Internet Services | | X |
Jewelry and Watches | | X |
Juices and Nonalcoholic Drinks | | X |
Lunch Away from Home | | X |
Medicare Prescription Drug Payments | | X |
Medicinal Drugs | | X |
Men's Apparel | | X |
Misc. Personal Services | | X |
Miscellaneous [24] | | X |
Miscellaneous Personal Goods | | X |
Motor Fuel | | X |
Motor Vehicle Parts and Equipment | | X |
Other Apparel Services | | X |
Other Foods | | X |
Other Household Products | | X |
Other Meats | | X |
Other Public Transportation | | X |
Other Recreational Goods | | X |
Personal Care Products | | X |
Personal Care Services | | X |
Pork | | X |
Postage and Delivery Services | | X |
Poultry | | X |
Processed Fruits and Vegetables | | X |
Recreational Reading Materials | | X |
Sewing | | X |
Sporting Goods | | X |
Sugar and Sweets | | X |
Tobacco and Smoking Products | | X |
Tools, Hardware, Outdoor Equipment and Supplies | | X |
Women's Apparel | | X |
Number of Categories Collected | 36 | 68 | 36

[23] Appendix 1 is based on an initial list, dated October 31, 2012, of reconciled expenditure categories. The exact survey content will be based on research using the final list of expenditure categories provided in the OPLC Requirements Document (Passero et al., 2013).

[24] Includes miscellaneous fees; lotteries and pari-mutuel losses; and maintenance, repairs, and utilities for other properties.


Appendix 2. Comparison of Current and Proposed Surveys

Sample Design

Current:
- Two independent samples for interview and diary
- Rolling sample implementation

Proposed:
- Single sample; all households complete both the interview and the diary
- Rolling sample implementation

Interview

Current:
- Single interview designed to collect large and recurring expenditures, but in practice collecting a broadly comprehensive range of categories
- All categories use a three-month reference period

Proposed:
- Recall interview: large, easily recalled expenditures; respondent or FR able to change section order
- Records-based interview: expenditure categories for which records are likely available; respondent able to report for different periods to match records

Diary

Current:
- Single paper diary per household
- Open-ended, collecting four main categories of expenditures
- In-person placement and pickup

Proposed:
- Electronic one-week diary with paper backup
- Individual diary for all household members 15 years old and older
- Open-ended, collecting all expenses
- In-person placement and pickup
- Respondents may use their device's functionality to capture data (such as photos or voice messages), which will create a record in the diary for later completion

Waves

Current:
- 5 waves, three months apart

Proposed:
- 2 waves, one year apart

Sample Size

Current:
- 28,000 completed interviews per year
- 14,000 completed diaries per year
- Sample size the same across all categories

Proposed:
- Goal of 28,000 completed cases (interviews and diaries) per year
- Sample size the same across all categories

Modes

Current:
- In-person and telephone interviews
- Paper diaries

Proposed:
- In-person interview survey
- Multi-mode diary: web, mobile device, and paper (backup)

Records

Current:
- Record use encouraged, with variable success

Proposed:
- Records-based interview at Visit 2: specific records (e.g., mortgage statement, paystub, utility bills) referred to during Visit 2, after the respondent receives FR instructions regarding records at the end of Visit 1

Content

Current:
- Overlapping content between diary and interview due to open-ended diary design
- Interview is relatively comprehensive

Proposed:
- Overlapping content between diary and interview due to open-ended diary design
- Recall and records interviews limited in content to reduce overlap with the diary
- See Appendix 1 for the division of content

Role of Interviewer

Current:
- Conduct interviews
- Interviewer as a trainer for diary placement
- Place and pick up diaries, with some mid-week calling to check on the respondent

Proposed:
- Conduct interviews as done currently, for Visit 1 and Visit 2
- Interviewer as a trainer for diary placement and records request
- Interviewer contacts respondent to encourage reporting as necessary, as prompted by Census headquarters or regional office reports
- Interviewer serves as the main support contact; the interviewer will contact the main tech help desk for technical issues

Incentives

Current:
- None

Proposed:
- Monetary incentives ($132 total per wave for a four-person household)
- Household-level non-monetary "Respondent Experience" incentive


Appendix 3. Design Research, Discussion, and Outreach Events

2013

July – CE Survey Methods Symposium

2012

December – FESAC Meeting

October – CNSTAT Report Workshop

July – CE Survey Methods Symposium

2011

October – Redesign Options Workshop

June – Household Survey Producers Workshop

February – 1st CNSTAT Panel Meeting

2010

December – Methods Workshop

June – Data Users' Needs Forum

May – AAPOR Panel Presentations: Utilizing Respondent Records

March – Data Capture Technology Forum

January – WSS/DC-AAPOR Survey Redesign Panel

2009

July – CRIW-NBER Conference on Improving Consumption Measurement


Appendix 4. Project Team Membership

Gemini Steering Team

Kathy Downey, formerly at BLS

Jennifer Edgar

Steve Henderson

Bill Passero

Laura Paszkiewicz

Carolyn Pickering, formerly at BLS

Adam Safir

David Swanson

Jay Ryan

Gemini Executive Management

Rob Cage

Bob Eddy

John Eltinge

Thesia Garner

Mike Horrigan

John Layng

Jay Ryan

Gemini Design Team

Kathy Downey, formerly at BLS

Jennifer Edgar

Dawn V. Nelson

Laura Paszkiewicz

Adam Safir