Using Student Engagement Survey Data to Inform Schoolwide Programs and Strategies
Why Student Engagement Matters and How to Measure It
September 10, 2014, Lexington, Kentucky



Page 1: September 10, 2014 Lexington, Kentucky

Using Student Engagement Survey Data to Inform Schoolwide Programs and Strategies

Why Student Engagement Matters and How to Measure It

September 10, 2014, Lexington, Kentucky

Page 2

Welcome and Overview

Lydotta Taylor
Research Alliance Lead, REL Appalachia

Page 3

What is a REL?

• A REL is a Regional Educational Laboratory.
• There are 10 RELs across the country.
• The REL program is administered by the U.S. Department of Education, Institute of Education Sciences (IES).
• A REL serves the education needs of a designated region.
• The REL works in partnership with the region’s school districts, state departments of education, and others to use data and research to improve academic outcomes for students.


Page 4

What is a REL?


Page 5

REL Appalachia’s mission

• Meet the applied research and technical assistance needs of Kentucky, Tennessee, Virginia, and West Virginia.

• Conduct empirical research and analysis.
• Bring evidence-based information to policymakers and practitioners:
– Inform policy and practice for states, districts, schools, and other stakeholders.
– Focus on high-priority, discrete issues and build a body of knowledge over time.

http://www.RELAppalachia.org

Follow us! @REL_Appalachia


Page 6

Overview of today’s workshop

• Clarifying the Student Engagement Construct.
• Best Practices for Survey Data Collection and General Data Management.
• Selecting an Instrument for Measuring Student Engagement.
• Action Planning for Next Steps.
• Wrap-up and Closing Remarks.
• Stakeholder Feedback Survey.


Page 7

Workshop goals

• Understand the student engagement construct.
• Select an appropriate instrument for measuring student engagement.
• Gain an understanding of best practices for survey data collection and general data management.


Page 8

What is Student Engagement?

Jerry Johnson, Ed.D.
Associate Professor
University of North Florida

Page 9

Discussion questions

• What does the term “student engagement” mean to you?
• Is student engagement a concern in your school or district?
• In what ways and to what extent is your school or district working to promote student engagement?
• What more would you like to know about student engagement, both generally and within your student population?


Page 10

What is student engagement?

• Student engagement has been defined variously by different researchers and theorists, but there is consistency around key ideas.

• A broad conceptual definition that reflects those varied perspectives:

Student engagement is a measure of the extent to which a student willingly participates in schooling activities. (Appleton, Christenson, & Furlong, 2008)

• There is consensus among researchers and theorists that student engagement is a multidimensional construct with four elements (Fredricks et al., 2011):

– Academic engagement.

– Affective engagement.

– Behavioral engagement.

– Cognitive engagement.


Page 11

What does research say about student engagement?

• Student engagement is closely associated with desirable schooling outcomes (higher attendance, higher academic achievement, fewer disciplinary incidents, lower dropout rates, higher graduation rates).

(Appleton et al., 2008; Finn, 1989, 1993; Fredricks, Blumenfeld, & Paris, 2004; Jimerson, Campos, & Greif, 2003; Jimerson, Renshaw, Stewart, Hart, & O’Malley, 2009; Shernoff & Schmidt, 2008)

• Student engagement is closely associated with general measures of well-being (lower rates of health problems, lower rates of high-risk behaviors).

(Carter, McGee, Taylor, & Williams, 2007; McNeely, Nonnemaker, & Blum, 2002; Patton et al., 2006)

• Student engagement levels can be effectively influenced through school-based interventions.

(Appleton, Christenson, Kim, & Reschly, 2006; Christenson et al., 2008; Fredricks et al., 2004)


Page 12

How are student engagement measures used?

• Research purposes:
– Research on motivation and cognition.
– Research on dropping out.
• Evaluation of interventions.
• Diagnosis and monitoring:
– Student level.
– Teacher, school, or district level.
• Needs assessment.

(Fredricks et al., 2011)


Page 13

Basics of Noncognitive Measures

Page 14

Overview

• Noncognitive measures.
• Basics of measurement:

– “Poor” versus “good” measures.

– Reliability.

– Validity.


Page 15

Noncognitive measures: Purpose

• Cognitive measures test what people know or can do.

• Noncognitive measures capture an individual’s attitude, behavior, characteristic, value, or interest.

Cognitive example:

Radiocarbon dating CANNOT be used on the remains of:

A. Plants
B. Minerals
C. Warm-blooded animals
D. Cold-blooded animals

Noncognitive example:

I enjoy studying science:

A. Strongly disagree
B. Disagree
C. Neutral
D. Agree
E. Strongly agree


Page 16

Noncognitive measures: An example

• Objectives:
– (Cognitive objective) Increase knowledge in a science course: give a test; score responses as right or wrong.
– (Noncognitive objective) Increase positive attitudes toward science: administer a survey measuring student attitudes toward math and science.
• Cognitive and noncognitive objectives call for different types of measures.


Page 17

Noncognitive measures: How they are constructed

• Likert responses:
– Typically 5 to 7 response options.
– Cannot be scored as correct or incorrect.
• Use a total score or subscale scores:
– Overall attitudes toward science.
– Attitudes toward each type of science (biology, chemistry, physics).
• Some items will be reverse scored:
– “I am tired of learning about science.”
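Reverse scoring is mechanical once the scale maximum is known. A minimal sketch, assuming a 5-point scale; the item and responses are invented for illustration and are not part of the workshop materials:

```python
SCALE_MAX = 5  # assumed 5-point Likert scale (1 = Strongly disagree ... 5 = Strongly agree)

def reverse_score(response, scale_max=SCALE_MAX):
    """Flip a negatively worded item so 1 <-> 5, 2 <-> 4, and the midpoint stays put."""
    return (scale_max + 1) - response

# Hypothetical responses to "I am tired of learning about science."
raw = [1, 2, 3, 5]
rescored = [reverse_score(r) for r in raw]  # [5, 4, 3, 1]
```

After reverse scoring, all items point in the same direction, so a total or subscale score is simply the sum or mean of the rescored items.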


Page 18

Noncognitive measures: Potential problems

• Response set:
– ABCDABCDABCD…
• Social desirability:
– “I would enjoy studying science.”
• Difficulty defining the construct:
– Are we really measuring attitudes?
– Maybe I respond positively to the above items because I enjoy studying anything, and not necessarily science in particular.
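Response sets such as straight-lining (choosing the same option for every item) can be flagged mechanically during data cleaning. An illustrative sketch; the respondent IDs, data, and the simple one-rule check are assumptions, not a validated screening procedure:

```python
def is_straightliner(responses):
    """Flag a respondent who chose the same option for every item."""
    return len(set(responses)) == 1

# Hypothetical respondents and their Likert responses.
respondents = {
    "s01": [4, 4, 4, 4, 4, 4],  # identical answer to every item: suspicious
    "s02": [4, 2, 5, 3, 4, 2],
}
flagged = [sid for sid, r in respondents.items() if is_straightliner(r)]  # ["s01"]
```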


Page 19

Basics of measurement

I have objectives and know where to find instruments… Now what?

• Examine instruments:
– Do they measure your objectives?
– Are any suitable for your purposes?
– Do they cost money to administer or score?
– Are they difficult to administer?
– How long do they take to complete?
– How many items do they contain?
– Is there any reliability or validity information?

• Any issues with the items?
– “I like science when it’s easy.”
– “I don’t like science now, but I probably will in the future.”
– “Science uses the scientific method.”
– “I like science, just not biology.”


Page 20

Basics of measurement

“Poor” versus “good” measures


Page 21

Basics of measurement: Example of a poor measure

• Scientific Attitudes Inventory II:
– 40 items.
– 6 subscales.
– Response scale from 1 (“Strongly disagree”) to 6 (“Strongly agree”).


Page 22

Basics of measurement: Example of a poor measure

Please answer the following items (R indicates a reverse-scored item):

1. A scientist must have a good imagination to create new ideas.

2. R A major purpose of science is to produce new drugs and save lives.

3. Ideas are the important result of science.

4. R Science tries to explain how things happen.

5. R Electronics are examples of the really valuable products of science.

6. A major purpose of science is to help people live better.


Page 23

Basics of measurement: Example of a good measure

• Definitions: Engagement versus Disaffection with Learning.
– Engagement: enthusiastic and emotionally positive in interactions.
– Disaffection: apathetic withdrawal and frustrated alienation.
• Sample items:
– Behavioral Engagement: “When I’m in class, I listen very carefully.”
– Behavioral Disaffection: “When I’m in class, I just act like I’m working.”
– Emotional Engagement: “I enjoy learning new things in class.”
– Emotional Disaffection: “When we work on something in class, I feel discouraged.”

(Skinner et al., 2009)


Page 24

Basics of measurement: Reliability of a measure

• Consistency of measurement:
– Test-retest reliability.
– Internal consistency reliability.
– Alternate or parallel forms reliability.
• Cronbach’s alpha as a measure of internal consistency:
– Ranges from 0.00 to 1.00.
– The higher the value, the greater the reliability.
• Reliability depends on:
– The sample of respondents.
– The length of the measure (number of items).
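As a concrete illustration of internal consistency, Cronbach's alpha can be computed directly from a respondents-by-items score matrix using the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal sketch; the Likert data below are made up:

```python
def cronbach_alpha(scores):
    """Cronbach's alpha for a list of respondents, each a list of k item responses."""
    k = len(scores[0])

    def sample_var(xs):  # sample variance (ddof = 1)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = sum(sample_var([row[i] for row in scores]) for i in range(k))
    total_var = sample_var([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical 5-point Likert responses (5 respondents, 3 items).
data = [[4, 5, 4], [2, 2, 3], [5, 5, 5], [1, 2, 1], [3, 3, 3]]
alpha = cronbach_alpha(data)  # about 0.97: highly consistent responding
```

Note how alpha depends on the number of items k and on how much respondents vary, matching the bullet above: the same items given to a more homogeneous sample can yield a different alpha.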


Page 25

Basics of measurement: Validity of a measure

• Accuracy of measurement: Is it measuring what it claims to measure?
– Face validity as a first step.
– Construct validity: how well the item or items capture the underlying concept.
– When single items are sufficient.
– When summated scales are necessary.
• Additional subtypes of validity:
– Predictive validity.
– Convergent validity.


Page 26

Basics of measurement: Reliability and validity

• Item responses are reliable, and item responses are valid.
• Item responses are reliable, but item responses are NOT valid: we’re consistently measuring something, but what is it?
• Item responses are NOT reliable, and item responses are NOT valid.


Page 27

Basics of measurement: Validity versus reliability

• Validity as accuracy.
• Reliability as precision.
• Ideally, a measure should be valid and reliable.


Page 28

Administering the Student Engagement Survey

Page 29

Preparing to administer the survey

• Identify a district point of contact (POC).
• Identify a school survey administrator (this could be the same person as the POC).


Page 30

Data collection

• REL Appalachia staff will upload the selected instrument to an online survey system and send the URL to the district POC.
– The survey will include a field for student name or district student ID (district preference).
• The district POC forwards the survey URL to the school survey administrator.
• Schools administer the survey to students.
• The district POC adds student background characteristics to the completed survey data file using the student name or district student ID.
– REL Appalachia will provide a list of requested student background characteristics (e.g., race/ethnicity, grade level, limited English proficiency, IEP status).
• The district POC removes the student name or district student ID and sends the survey data and corresponding background characteristics to REL Appalachia.


Page 31

Why are student background data important?

• Without student background data, student engagement can only be reported at the school level.
• Student background data are needed to disaggregate results by key student subgroups:
– Grade level.
– Race/ethnicity.
– English proficiency.
– IEP status.
• With subgroup results, student engagement programs or interventions can be tailored to particular types of students.
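Disaggregation itself is a group-by computation once background characteristics have been merged onto the survey records. A hedged sketch with invented grade levels and scores, not the alliance's actual data pipeline:

```python
from collections import defaultdict

# Hypothetical merged records: (grade level, a student's mean engagement score).
records = [("6", 3.8), ("6", 3.2), ("7", 2.9), ("7", 3.1), ("8", 3.5)]

by_grade = defaultdict(list)
for grade, score in records:
    by_grade[grade].append(score)

# Mean engagement score for each subgroup.
subgroup_means = {grade: sum(s) / len(s) for grade, s in by_grade.items()}
# {"6": 3.5, "7": 3.0, "8": 3.5}
```

The same pattern applies to any background variable (race/ethnicity, English proficiency, IEP status): group on the characteristic, then summarize within each group.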


Page 32

Survey follow-up

• Data analyses will be conducted by REL Appalachia staff.
• REL Appalachia staff will prepare data memos summarizing the results.
• Debriefing session on survey results and implications:
– Review results disaggregated by student subgroups.
– Review schoolwide strategies and programs for promoting student engagement (as reported in the extant literature).
– Develop school- and/or district-level action plans.


Page 33

Survey ethics: Consent and assent forms

• Parental consent forms:
– Send home with students.
– Must be obtained before data collection.
– Do not allow a student to fill out the survey without parental consent.
– Assign an alternate computer-based activity during survey administration.
• Student assent forms:
– Read the script aloud to students.
– The script will also appear on the landing page of the online survey.


Page 34

Missing data: Why response rates matter

• Non-response bias:
– Item- and instrument-level non-response.
– Occurs when the answers of respondents differ from the way non-respondents might have answered.
• Monitoring response rates:
– Labels remaining on sheets sent to schools.
– The “Analyze Data” tab in SurveyMonkey.
– The goal is a response rate of 80% or higher.
• Incentives for participation.
• Distinguishing between diligent follow-up and harassment of respondents.
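Monitoring against the 80% goal is a simple calculation once roster and completion counts are in hand. The school names and counts below are invented for illustration:

```python
TARGET_RATE = 0.80  # the stated response-rate goal

roster = {"School A": 120, "School B": 95}     # students eligible to take the survey
completed = {"School A": 101, "School B": 68}  # surveys actually completed

for school, eligible in roster.items():
    rate = completed[school] / eligible
    status = "meets goal" if rate >= TARGET_RATE else "needs follow-up"
    print(f"{school}: {rate:.0%} ({status})")
```

A check like this, run while the survey window is still open, tells the POC where follow-up effort should go before non-response bias becomes unfixable.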


Page 35

Break

Page 36

Selecting an Instrument for Measuring Student Engagement

Page 37

IES Issues & Answers Report (available online)

http://ies.ed.gov/ncee/edlabs/regions/southeast/pdf/REL_2011098.pdf


Page 38

Overview of the report

• Purpose: “to describe 21 instruments for measuring student engagement in upper elementary through high school as identified through a literature review.”

• Content and structure:
– Definitions, instrument types, psychometric properties.
– Instrument abstracts.
– Tables for comparing instrument attributes (e.g., developer/availability, engagement dimensions assessed, intended purposes/uses).
• Potential uses for stakeholders:
– Introducing stakeholders to the instruments that are available.
– Assisting stakeholders in determining the appropriateness of various available measures.


Page 39

How is student engagement measured?

• Three primary data collection strategies:
– Student self-reports.
– Teacher reports on students.
– Observational measures.
• Three dimensions of student engagement:
– Behavioral: the student’s involvement in academic, social, and extracurricular activities.
– Affective/emotional: the extent of the student’s positive [and negative] reactions to teachers, classmates, academics, and school.
– Cognitive: the student’s level of investment in his or her learning.

Note: Academic engagement is measured using traditional outcome data, such as student achievement results.


(Fredricks et al., 2011)

Page 40

Questions to guide the selection process

1. What type(s) of student engagement do you want to better understand?
– Behavioral?
– Affective/emotional?
– Cognitive?
– All, or a combination of the above?

2. In what specific ways do you hope to use the information gained from this process?
– For monitoring of individual students?
– For needs assessments at the classroom, grade, school, or district level?
– Both?


Page 41

Questions to guide the selection process (continued)

3. What type of data collection strategy appeals to you?
– Student self-report survey?
– Teacher report survey?
– Observation using a protocol?

4. How much time are you willing to allocate to data collection?

5. What human resources are available in your school/district to facilitate the data collection process?


Page 42


Two example surveys – Handout 1

• Student Engagement Instrument (SEI)
Appleton, J. J., Christenson, S. L., Kim, D., & Reschly, A. L. (2006). Measuring cognitive and psychological engagement: Validation of the Student Engagement Instrument. Journal of School Psychology, 44, 427–445.

• High School Survey of Student Engagement

Center for Evaluation and Education Policy, Indiana University

Page 43

Activity #1: Select a student engagement survey

• Break into small groups of 2-3.
• Generate responses to the five “Questions to Guide the Selection Process” in Handout 2.
• Each small group will then share and explain its responses, allowing for questions.
• Finally, as a whole group, we will determine which instrument best addresses the needs and objectives of the group as a whole.


Page 44

Lunch

Page 45

Action Planning for Next Steps

Page 46

Activity #2: Plan for survey administration

• Work in a school or district team (or as an individual representative of your school or district). Please see Handout 3 and complete the table to develop a tentative plan of action for activities associated with data collection.

The planning table has columns for Action Step, Time Frame, Resources Needed, Responsible Person(s), and Progress Status. Its action steps are:

– Identifying participating schools and districts; securing administrative approval/support.
– Creating/obtaining identification numbers.
– Securing parental consent/student assent.
– Distributing data collection materials to school sites.
– Collecting and shipping materials to REL Appalachia.


Page 47

What happens next

• Workshop attendees:
– Steps for survey administration are summarized in Handout #4.
– Please contact Michael Flory at [email protected] to let us know you will participate in this project.
• REL Appalachia:
– Provide the URL for the survey instrument to the district POC.
– Analyze survey data following administration by schools.
– Prepare a data memo summarizing results.
– Discuss results and implications.


Page 48

Wrap-Up and Closing Remarks
Stakeholder Feedback Survey

Lydotta Taylor
Research Alliance Lead, REL Appalachia

Page 49

Connect With Us!

www.relappalachia.org

@REL_Appalachia

Patty Kannapel

[email protected]

Michael Flory
[email protected]


Page 50


Sources cited

Appleton, J. J., Christenson, S. L., & Furlong, M. J. (2008). Student engagement with school: Critical conceptual and methodological issues of the construct. Psychology in the Schools, 45, 369–386.

Appleton, J. J., Christenson, S. L., Kim, D., & Reschly, A. L. (2006). Measuring cognitive and psychological engagement: Validation of the Student Engagement Instrument. Journal of School Psychology, 44, 427–445.

Carter, M., McGee, R., Taylor, B., & Williams, S. (2007). Health outcomes in adolescence: Associations with family, friends, and school engagement. Journal of Adolescence, 30, 51-62.

Christenson, S. L., Reschly, A. L., Appleton, J. J., Berman, S., Spangers, D., & Varro, P. (2008). Best practices in fostering student engagement. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 1099–1120). Washington, DC: National Association of School Psychologists.

Finn, J. D. (1989). Withdrawing from school. Review of Educational Research, 59, 117–142.

Finn, J. D. (1993). School engagement and students at risk. Washington, DC: National Center for Education Statistics.

Fredricks, J. A., Blumenfeld, P. C., & Paris, A. H. (2004). School engagement: Potential of the concept, state of the evidence. Review of Educational Research, 74(1), 59–109.

Fredricks, J., McColskey, W., Meli, J., Mordica, J., Montrosse, B., & Mooney, K. (2011). Measuring student engagement in upper elementary through high school: A description of 21 instruments (Issues & Answers Report, REL 2011–No. 098). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Southeast. Retrieved September 13, 2013, from http://ies.ed.gov/ncee/edlabs.

Jimerson, S., Campos, E., & Greif, J. (2003). Toward an understanding of definitions and measures of school engagement and related terms. The California School Psychologist, 8, 7–27.

Jimerson, S., Renshaw, T., Stewart, K., Hart, S., & O’Malley, M. (2009). Promoting school completion through understanding school failure: A multi-factorial model of dropping out as a developmental process. Romanian Journal of School Psychology, 2, 12–29.

Page 51


Sources cited (continued)

McNeely, C.A., Nonnemaker, J.M., & Blum, R.W. (2002). Promoting school connectedness: Evidence from the National Longitudinal Study of Adolescent Health. Journal of School Health, 72, 138-146.

Patton, G.C., Bond, L., Carlin, J.B., Thomas, L., Butler, H., Glover, S., Catalano, R., & Bowes, G. (2006). Promoting social inclusion in schools: A group-randomized trial of effects on student health risk behavior and well-being. American Journal of Public Health, 96, 1582-1587.

Shernoff, D., & Schmidt, J. (2008). Further evidence of an engagement-achievement paradox among U.S. high school students. Journal of Youth and Adolescence, 37(5), 564-580.

Skinner, E.A., Kindermann, T.A., & Furrer, C.J. (2009). A motivational perspective on engagement and disaffection: Conceptualization and assessment of children’s behavioral and emotional participation in academic activities in the classroom. Educational and Psychological Measurement, 69(3), 493-525.