Web Design Issues in a Business Establishment Panel Survey
Third International Conference on Establishment Surveys (ICES-III)
June 18-21, 2007, Montréal, Québec, Canada
Kerry Levin & Jennifer O’Brien, Westat



Overview of Presentation

• A brief review of the web design system and its origins

• Design issues we encountered

• Opportunities for experimental investigation


Background

• The Advanced Technology Program (ATP) at the National Institute of Standards and Technology (NIST) is a partnership between government and private industry to conduct high-risk research

• Since 1990, ATP’s Economic Assessment Office (EAO) has performed rigorous and multifaceted evaluations to assess the impact of the program and estimate the returns to the taxpayer. One key feature of ATP’s evaluation program is the Business Reporting System (BRS).


General Description of the BRS

• Unique series of online reports that gather regular data on indicators of business progress and future economic impact of ATP projects

• ATP awardees must complete four BRS reports per calendar year: three short quarterly reports and one long annual report


General Description of the BRS

• There are several different types of instruments (each with a profit and nonprofit version):

1. Baseline
2. Annual
3. Closeout
4. Quarterly

• The BRS instruments are hybrid survey/progress reports that ask respondents attitudinal questions as well as items designed to gather information on project progress.

• The Baseline, Annual, and Closeout reports are between 70 and 100 pages in length. Given this length and complexity, web administration is the most logical data collection mode.


Design issues: Online logic checks vs. back-end logic checks

1. Examples of online logic checks (i.e., hard edits):
   • Sum checking
   • Range checks

2. Examples of back-end logic checks:
   • Frequency reviews
   • Evaluation of outliers
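The two online hard edits can be sketched as simple validation functions. This is an illustrative Python sketch, not the BRS implementation; the item names and limits are hypothetical.

```python
def check_sum(components, reported_total):
    """Hard edit: component amounts must add up to the reported total."""
    return sum(components) == reported_total

def check_range(value, low, high):
    """Hard edit: the entered value must fall within an allowed range."""
    return low <= value <= high

# Hypothetical item: three expenditure categories must sum to the reported total
assert check_sum([120_000, 55_000, 25_000], 200_000)

# Hypothetical range check: employee count must be between 0 and 500,000
assert check_range(42, 0, 500_000)
assert not check_range(-1, 0, 500_000)
```

In a web instrument, a failed hard edit like these would block submission of the page until the respondent corrects the entry.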


Online sum checking: Example


Online range checking: Example


Back-end checking: Frequency reviews and outlier evaluations

• At the close of each cycle of data collection, the data for each instrument are carefully reviewed for anomalies

• Frequency reviews are conducted to ensure that the skip patterns in the online instrument executed without error

• Although the BRS includes range checks for certain variables, the ranges are sometimes quite large; an evaluation of outliers is therefore a regular part of our data review procedures
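Both back-end checks can be sketched with standard-library tools. This is a minimal illustration of the idea, not the BRS review code; the two-standard-deviation rule is an assumed example cutoff.

```python
from collections import Counter
import statistics

def frequency_review(responses):
    """Tabulate answer frequencies; an unexpected or impossible category
    can reveal a skip-logic error in the instrument."""
    return Counter(responses)

def flag_outliers(values, k=2.0):
    """Flag values more than k standard deviations from the mean
    (an illustrative rule; reviewers would inspect flagged cases by hand)."""
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    return [v for v in values if abs(v - mean) > k * sd]

print(frequency_review(["Yes", "No", "Yes"]))   # Counter({'Yes': 2, 'No': 1})
print(flag_outliers([10, 12, 11, 9, 10, 500]))  # [500]
```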


Use of pre-filled information in the BRS

The BRS instruments make use of two types of pre-filled information:

1. Pre-filled information from sources external to the instrument (i.e., information gathered in previous instruments or information provided by ATP such as issued patents)

2. Pre-filled information from sources internal to the instrument (i.e., information provided by the respondent in earlier sections of the current report)
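The two sources can be sketched as a simple lookup with a precedence rule. The field names and the rule (internal answers override external data) are illustrative assumptions, not the BRS schema.

```python
# Hypothetical data sources; field names are illustrative only.
external = {"company_name": "Acme Corp", "patents_issued": 3}   # prior instruments / ATP records
current_report = {"project_revenue_q1": 150_000}                # entered earlier in this report

def prefill(field, external, current):
    """Return a pre-filled value for a field: answers given earlier in the
    current report take precedence over externally supplied data."""
    if field in current:
        return current[field]
    return external.get(field)

assert prefill("company_name", external, current_report) == "Acme Corp"     # external source
assert prefill("project_revenue_q1", external, current_report) == 150_000   # internal source
```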


Pre-filled information: External source example


Pre-filled information: Internal source example


Required items

While most items in the BRS instruments are not required, the few that are fall into two categories:

1. Items required for accurate skips later in the instrument

2. Items deemed critical by ATP staff
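The two categories of required items can be sketched as follows. The item names and the routing rule are hypothetical; the point is that the first category exists purely so later skip logic can branch correctly.

```python
# Hypothetical item names, illustrative only.
REQUIRED_FOR_SKIPS = {"received_other_funding"}   # gates a later section
REQUIRED_CRITICAL = {"project_status"}            # deemed critical by sponsor staff

def missing_required(answers):
    """Return required items the respondent has not yet answered."""
    required = REQUIRED_FOR_SKIPS | REQUIRED_CRITICAL
    return sorted(f for f in required if answers.get(f) in (None, ""))

def next_section(answers):
    """Example skip: show the funding-detail section only if the gate
    item was answered 'yes'."""
    return "funding_detail" if answers["received_other_funding"] == "yes" else "next_topic"

answers = {"received_other_funding": "yes", "project_status": "active"}
assert missing_required(answers) == []
assert next_section(answers) == "funding_detail"
```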


Required items: Example item important for skip pattern


Required items: Example item critical to ATP


Unique Design: Financial items


Administration issues in the BRS: Multiple respondents

• Each ATP-funded project has multiple contacts associated with it

• It is rarely the case that a single respondent can answer all items in the survey. However, Westat provides only one access ID per report, so respondents are responsible for managing who at their organization is given access to the BRS online system


Experimental investigations using the BRS


Reducing item nonresponse: The Applicant Survey

• The ATP’s Applicant Survey is not one of the BRS instruments, but is regularly administered via the web to companies and organizations that applied for ATP funding

• In 2006, Westat embedded an experiment within the Applicant Survey to test which of two different types of nonresponse prompting would result in reduced item nonresponse


Reducing item nonresponse: The Applicant Survey

904 respondents were randomly assigned to one of three conditions:

1) Prompt for item nonresponse appeared (if applicable) at the end of the survey;

2) Prompt for item nonresponse appeared (if applicable) after each section;

3) No prompt (control group).
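A random assignment of the 904 respondents to the three conditions can be sketched as below. The condition labels and fixed seed are illustrative assumptions, not the procedure actually used.

```python
import random

CONDITIONS = ["prompt_at_end", "prompt_per_section", "no_prompt"]

def assign(respondent_ids, seed=2006):
    """Randomly assign each respondent to one of the three conditions.
    A fixed seed makes the assignment reproducible for documentation."""
    rng = random.Random(seed)
    return {rid: rng.choice(CONDITIONS) for rid in respondent_ids}

groups = assign(range(904))
assert len(groups) == 904
assert set(groups.values()) <= set(CONDITIONS)
```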


Reducing item nonresponse: The Applicant Survey

Side-by-side screenshots: prompt at end of survey; prompt after each section


Reducing item nonresponse: The Applicant Survey

Both prompts for item nonresponse appeared effective, and to an equal degree.

Group                        Completed surveys with    Mean missing items per
                             missing data (%)          completed survey
Prompt at end of survey      23.0                      0.9
Prompt after each section    23.3                      1.1
No prompt (control)          39.8                      2.2
                             p < .02                   p < .008


Boosting response rates: The days of the week experiment

• Literature suggests that there are optimal call times for telephone surveys. But are there also optimal days of the week to email survey communications?

• The optimal day to email was measured by two criteria:

• The overall response rate

• The time it takes to respond


Boosting response rates: The days of the week experiment

• Three different experimental conditions:

1) Monday cohort 2) Wednesday cohort 3) Friday cohort

• The invitation email and up to 3 reminders were all sent on the same day, either Monday, Wednesday, or Friday.


Boosting response rates: The days of the week experiment

Experimental Group    Total Eligible    Total Completes    Response Rate (%)
Monday                161               141                87.6
Wednesday             159               140                88.1
Friday                152               141                92.8


Time to Complete the Survey

Cumulative response rates (%) after each email contact:

             Email 1    Email 2    Email 3    Email 4
Monday       14.9       32.3       65.2       87.6
Wednesday    15.7       37.7       67.3       88.1
Friday       18.4       38.8       67.1       92.8


Boosting response rates: The days of the week experiment

• Friday cohort trends toward higher response rates, but all cohorts require the same amount of effort to achieve their respective response rates

• Overall, there is some evidence that the day of the week does matter


Conclusion

• The BRS has presented us with various design and administration challenges

• We have had the chance to fine-tune the system and address a variety of issues that have come to our attention

• As researchers encounter new issues in the administration of web surveys, the BRS offers a place to study them