
Jen Sweet
Office for Teaching, Learning & Assessment

DePaul University

Shannon Milligan
Faculty Center for Ignatian Pedagogy

Loyola University Chicago

ADVANCED SURVEY DESIGN

Tuesday, December 8, 2015, 2:00 – 3:30pm

Workshop Outline

• Introductions
• Conflicting Recommendations for Survey Design?
• Cognition and Survey Design
• Recommendations to Reduce Cognitive Load
• ACTIVITY!
• Affective Assessment and Surveys
• ACTIVITY!
• Survey Design & Distribution Tools
• Analyses of your Survey Instrument

Workshop Outcomes

By the end of this workshop, participants will be able to:

• Apply knowledge of the cognitive processes students use to respond to surveys to design effective survey items and instruments.

• Identify non-cognitive variables that can be assessed with surveys.

• Use the tools available to them at their respective institutions to design and distribute surveys.

• Identify a variety of methods available for the analysis of survey data.

Conflicting Recommendations for Survey Design?

Recommendations for Survey Design that seem to Conflict

Examples: Neutral Point
• Always include?
• Never use?
• Sometimes yes; sometimes no?

Number of Scale Points to Include
• 2?
• 3?
• 4?
• 5?
• 6?
• 7?
• 9?
• The more the better?

All of This is in the Literature!

So, What’s Up with the Literature?

All of these recommendations may be appropriate depending on the specific context:

• respondent attributes
• nature of the items in the survey
• length of the survey
• etc.

Generally looking at things like:
• Reliability
• Validity
• Survey Outcomes
• Response Rate (high)
• Use of Response Sets (low)

Survey Design is as Much Art as Science!

“There is always a well-known solution to every human problem - neat, plausible, and wrong.”
H. L. Mencken

Cognition and Survey Design

Cognitive Load (Paas & Van Merrienboer, 1994)

The amount of cognitive effort (or thinking) students need to exert to respond to a survey item.

If cognitive load exceeds the student’s working memory capacity, they will take some sort of shortcut (Paas & Van Merrienboer, 1994), or satisfice (Krosnick, 1991):

• Read questions less carefully (skim)

• Use a response set

• Give same response for all questions, regardless of content

• Overuse neutral or N/A response option

• Skip the question (provide no response)

• Respond randomly

• Decide not to complete the survey

Cognitive Steps Used to Respond to Surveys (Tourangeau, 1984)

1. Interpretation
2. Retrieval
3. Judgment
4. Response

Recommendations to Reduce Cognitive Load

Step 1: Interpretation
• Use language that is clear and familiar to survey respondents.
• Avoid cognitively taxing wording.
• Avoid unfamiliar words and phrasing.
• Avoid jargon and acronyms.
• Ensure that question stems are clear and explicit.
• Do not use concepts that are unclear or unfamiliar to respondents.
• Avoid complex sentence structures.
• Ask about only one concept in each stem; avoid double-barreled questions.
• Use questions that do not make assumptions.
• Ask for information in a direct manner by avoiding double negatives.
• Ensure question stems are succinct, including only as much information as is necessary for respondents to properly interpret what is being requested of them.

Interpretation (Continued)
• Include clear instructions that clarify the purpose of the survey instrument, and provide respondents with expected procedures for responding to the survey instrument.
• Ensure that every portion of a survey instrument is visible without the need for additional action by the respondent.
• Use radio buttons instead of drop-down boxes to display response options.
• Do not “hide” definitions respondents may need to interpret and respond to survey items.
• Use easy-to-read font size and type.
• Use high-contrast font and background colors.

Step 2: Retrieval
• Use stems that request information with which respondents have primary experience, and avoid asking for second-hand information (i.e., information that the respondent has heard about, but not experienced personally) or hypothetical information.
• Group conceptually similar items together.

Step 3: Judgment
• Use the smallest number of response options necessary to encompass all meaningful divisions of what you are asking about.
• General guideline: four or five response options, depending on whether or not there will be a neutral option.
• Include a neutral option if you reasonably expect participants to have no opinion; otherwise, neutral options should be avoided.
• Neutral responses can be difficult to interpret.
• Offering a neutral option may encourage satisficing.

Step 4: Response
• Use the smallest number of response options necessary to encompass all meaningful divisions of what you are asking about.
• Label the scale options.
• You may only need to label the most extreme options.
• Include a neutral option if you reasonably expect participants to have no opinion; otherwise, neutral options should be avoided.
• Neutral responses can be difficult to interpret.
• Offering a neutral option may encourage satisficing.

Activity!

Practice Evaluating Survey Items!

Individually: Complete the worksheet.

In Groups: Compare responses.
• Did everyone identify the same items for improvement?
• Are there differences in the ways you edited items?

Surveys and the Rise of Affective Assessment

Growing Emphasis on Non-Cognitive Abilities
• Grit
• Growth
• Social-emotional development
• Self-awareness/management/efficacy
• General affect
• Engagement
• Mattering
• Climate

• Research shows strong relationships between these variables and overall success

From NPR: “Nonacademic Skills Are Key to Success. But What Should We Call Them?” (May 28, 2015)

The Role of Surveys

From NPR: “To Measure What Tests Can’t, Some Schools Turn to Surveys” (December 2, 2015)

From that article: “A growing battery of school leaders, researchers and policymakers think surveys are the best tool available right now to measure important social and emotional goals for schools and students.”

Why?
• Easy to administer
• Easier to collect and analyze than reflection papers (and the like)
• Many surveys/survey questions already exist
• Faster data sharing = faster decision-making/implementation (maybe)

Activity!

Non-Cognitive Assessment and You

Individually:
• What non-cognitive variables might be of interest to your program?
• Are you already assessing any of these variables?

In Groups:
• Share what you are assessing and what you might be interested in assessing.
• What variables might be important to look at at an institutional level?

Survey Design/Distribution Tools

Three Main Tools Used at DePaul

1. Qualtrics

2. Google Forms

3. Survey Monkey

Qualtrics
Advantages:
• Free to DePaul faculty and staff
• Supported by Information Services
• Very flexible and comprehensive system
• Lots of features

Disadvantages:
• Reporting features aren’t great
• Steeper learning curve than other systems

Google Forms
Advantages:
• Free to everyone
• Data is collected in Excel format
• Easier to learn than Qualtrics

Disadvantage:
• Not nearly as many features as Qualtrics

Survey Monkey
Advantages:
• ?

Disadvantages:
• Most advanced features cost $
• To get all the features available in Qualtrics, cost is $780/year
• Limited to 10 questions and 100 responses on the free version

Overview of Survey Analysis

Survey Analysis
• For when you want to:
• Group survey respondents
• Group survey items
• Make predictions/observe relationships
• Analyze “fit” of respondents and items

• Which to use, based on:
1. Purpose
2. Audience

http://www.edmeasurement.net/5244/SPSS%20survey%20data.pdf

Basic Survey Analysis
• Sometimes all you need are:

1. Means

2. Correlations

3. Frequency tables/graphs

Survey Analysis: Group Respondents
• Cluster Analysis
• What it does: Creates groups or “clusters” of respondents based on similar responses to a set of survey questions
• Kind of intuitive because: Clustering is part of organizing (e.g., organization of a produce section, medical symptoms)
• What it answers: What do members of each cluster have in common? (Ex.: First-generation students tend to have less familiarity with research opportunities)
• Useful for: Marketing, outreach, cohort creation
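The clustering idea can be sketched with a minimal k-means loop in pure Python. Everything here is invented for illustration (scores, starting centers, k=2); a real analysis would cluster on many items at once with a library such as scikit-learn.

```python
# Minimal k-means sketch (k=2) clustering respondents by one score.
# Data and starting centers are invented; chosen so the result is stable.
scores = [1.0, 1.5, 2.0, 4.0, 4.5, 5.0]
centers = [1.0, 5.0]

for _ in range(10):  # alternate assignment and update steps
    # Assignment: each respondent joins the nearest center's cluster.
    clusters = [[], []]
    for s in scores:
        nearest = min(range(len(centers)), key=lambda i: abs(s - centers[i]))
        clusters[nearest].append(s)
    # Update: each center moves to the mean of its cluster.
    centers = [sum(c) / len(c) for c in clusters]

print(centers)   # [1.5, 4.5] -> low scorers vs. high scorers
print(clusters)  # [[1.0, 1.5, 2.0], [4.0, 4.5, 5.0]]
```

The two resulting clusters are the “groups of respondents with similar responses” the slide describes; profiling each cluster answers “what do its members have in common?”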

Survey Analysis: Group Survey Items
• Factor Analysis
• What it does: Groups statistically related survey items into a number of “factors”
• Kind of intuitive because: Groupings can be done based on preconceived ideas or based on the data analysis. Also, largely correlation-based
• What it answers: Can items be removed from the survey? Do the factors relate to constructs that make sense (e.g., identified learning outcomes)?
• Useful for: Survey refinement, naming constructs

Example: Do our questions aimed at measuring a certain outcome actually seem to measure that outcome?

Survey Analysis: Make Predictions
• Regression
• What it does: Determines the relationship between multiple predictor variables and a dependent variable
• Kind of intuitive because: Similar to correlation (determining the relationship between 2 variables), but with the ability to control for other variables
• What it answers: What item(s) are significant predictors of an outcome? Is there a positive or negative relationship between the predictor variable and outcome variable? How well do our variables explain the observed outcome?
• Useful for: Looking at relationships between responses to certain questions (or factors) and outcomes. Determination of resource allocation. Possible survey refinement

Example: Do students who report greater library use have a higher GPA?

Survey Analysis: Analyze “Fit” of Respondents and Items
• Rasch Analysis and Item Response Theory (IRT)
• What it does: Provides information about “ability” of respondents and “difficulty” of survey items
• But they differ in: Rasch only considers respondent ability and item difficulty. IRT can also account for guessing and greater differences between high/low ability respondents
• What it answers: What item(s) are too easy or too difficult? Is the survey measuring a single variable? Are there trends in responses between respondent groups?
• Useful for: Survey refinement, detecting bias in survey items, determining the number of levels needed in a rating scale

Example: Is a survey item written in such a way that it is interpreted differently across student groups?
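The Rasch model's core formula shows how “ability” and “difficulty” interact: the probability of endorsing an item depends only on the gap between respondent ability (theta) and item difficulty (b). The values below are illustrative; estimating theta and b from real data requires dedicated software.

```python
from math import exp

def rasch_probability(theta: float, b: float) -> float:
    """Rasch model: P(endorse) = exp(theta - b) / (1 + exp(theta - b))."""
    return exp(theta - b) / (1 + exp(theta - b))

print(rasch_probability(0.0, 0.0))  # ability equals difficulty -> 0.5
print(rasch_probability(2.0, 0.0))  # able respondent, easy item -> ~0.88
print(rasch_probability(0.0, 2.0))  # hard item -> ~0.12
```

An item is “too easy” when nearly all respondents endorse it regardless of ability, and a fitted b far below the thetas makes that visible; bias checks compare fitted difficulties across respondent groups.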

Contact Information

Jen Sweet
DePaul University
Office for Teaching, Learning & Assessment
jsweet2@depaul.edu

Shannon Milligan
Loyola University Chicago
Faculty Center for Ignatian Pedagogy
smilligan@luc.edu