Page 1

Best Practices in Classroom Peer Review

Edward F. Gehringer, Department of Computer Science, North Carolina State University

The Expertiza project has been funded by the National Science Foundation. Please visit our Web site: http://tinyurl.com/expertiza-site

Page 2

Credits …
  • Arlene Russell, Calibrated Peer Review
  • Chris Schunn, SWoRD
  • Steve Joordens & Dwayne Paré, Peer Scholar
  • Eric Ford & Dmytro Babik, Mobius SLIP
  • Helen Hu & David McNaughton, uJudge

Page 3

Outline
  • What’s good about peer review?
  • F2F vs. online peer review
  • Rubrics
  • Rating vs. ranking
  • Formative vs. summative
  • Quality control
  • Who reviews whom?
  • Online apps for peer review

Page 4

Advantages of peer review?

Page 5

Some advantages of peer review
  • Feedback is more extensive, quicker, and scalable
  • Can’t blame the reader!
  • Forces students to think metacognitively

Page 6

Face-to-face vs. online peer review

Page 7

Why face-to-face peer review?
  • Easier to set up
  • Communicate more interactively
  • Exchange non-verbal cues
  • Instructor can intervene

Page 8

Why online peer review?

  • Doesn’t consume class time
  • Read/write more reflectively
  • Easier to get multiple reviews
  • Easier for author to refer back
  • Can be used summatively
  • Can be perused in deciding on grade
  • Rubric can perhaps be more detailed

Page 9

Rubrics
Why use a rubric?
  • Tell students what to look for
  • “Fairness” in assessment
  • Students can help create the rubric
How detailed?
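As a concrete illustration of the “how detailed?” question, here is one possible way to represent a rubric in code. The criteria, weights, and scale below are invented for this sketch; they are not taken from any particular course or tool.

    # One possible representation of a peer-review rubric (illustrative only;
    # the criteria, weights, and 5-point scale are invented for this sketch).

    rubric = [
        {"criterion": "Thesis is clearly stated",         "weight": 0.3, "max": 5},
        {"criterion": "Claims are supported by evidence", "weight": 0.4, "max": 5},
        {"criterion": "Writing is clear and organized",   "weight": 0.3, "max": 5},
    ]

    def weighted_score(responses):
        """Combine one reviewer's per-criterion scores into a single 0-1 value."""
        return sum(item["weight"] * responses[item["criterion"]] / item["max"]
                   for item in rubric)

    print(weighted_score({
        "Thesis is clearly stated": 4,
        "Claims are supported by evidence": 3,
        "Writing is clear and organized": 5,
    }))  # about 0.78

A coarser rubric would simply have fewer, broader criteria; a more detailed one would add criteria or per-level descriptors for reviewers to choose among.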

Page 10

Rubric advice

Page 11

Rating vs. ranking
Should students rate others’ work on a Likert scale, or rank students against each other?

Rating
  • Easier to rate than rank using a rubric
  • Can give 2 students the same rating

Ranking
  • May not be compatible with F2F review
  • More robust when reviewers are not experts
  • Can use a slider to show nearness
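To make the rating-vs.-ranking contrast concrete, here is a minimal sketch that aggregates the same hypothetical reviews two ways: by averaging Likert ratings and by a Borda-style rank count. The data and function names are invented for illustration.

    # Illustrative contrast between rating and ranking aggregation.
    # The review data and function names are invented for this example.
    from collections import defaultdict
    from statistics import mean

    # Each reviewer rates three submissions on a 1-5 Likert scale.
    ratings = {
        "reviewer1": {"essayA": 4, "essayB": 4, "essayC": 2},
        "reviewer2": {"essayA": 5, "essayB": 3, "essayC": 3},
        "reviewer3": {"essayA": 4, "essayB": 5, "essayC": 2},
    }

    def aggregate_by_rating(ratings):
        """Average the Likert scores; two submissions can tie."""
        scores = defaultdict(list)
        for per_reviewer in ratings.values():
            for submission, score in per_reviewer.items():
                scores[submission].append(score)
        return {s: mean(v) for s, v in scores.items()}

    def aggregate_by_ranking(ratings):
        """Borda count: use only the order of each reviewer's scores, which is
        less sensitive to reviewers who are uniformly harsh or lenient."""
        points = defaultdict(int)
        for per_reviewer in ratings.values():
            ranked = sorted(per_reviewer, key=per_reviewer.get)  # worst ... best
            for position, submission in enumerate(ranked):
                points[submission] += position
        return dict(points)

    print(aggregate_by_rating(ratings))   # mean Likert score per submission
    print(aggregate_by_ranking(ratings))  # Borda points per submission

Because ranking discards the absolute level of each reviewer’s scores, a uniformly harsh or lenient reviewer shifts the averages but not the rankings.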

Page 12

Mobius SLIP’s approach to ranking

Page 13

Formative vs. summative peer review
  • Formative: text feedback
  • Summative: Likert scale
Should peer review be used summatively?

Page 14

Quality control
You can’t take review quality for granted. Approaches:
  • Metareviewing
  • Calibration
  • Reputation system

Page 15

Metareviewing
“Review the reviewer,” “rate the rater.”
Who performs metareviews? 3 choices:
  • Author?
  • Instructor?
  • 3rd party?
Can we automate the process?
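As a toy illustration of what automating metareviews might look like, the sketch below scores a review comment on its length and on whether it contains suggestion language. This heuristic is an assumption made for this example; it is not how CPR, SWoRD, or Expertiza actually evaluate reviews.

    # Toy metareview heuristic, for illustration only.  Real systems use far
    # richer signals (e.g., relevance, tone, and specificity of the text).

    SUGGESTION_WORDS = {"should", "could", "consider", "suggest", "try"}

    def metareview_score(review_text: str) -> float:
        """Return a rough 0-1 quality estimate for a peer-review comment."""
        words = review_text.lower().split()
        length_credit = min(len(words) / 50, 1.0)   # full credit at ~50 words
        has_suggestion = any(w.strip(".,!?") in SUGGESTION_WORDS for w in words)
        return 0.7 * length_credit + 0.3 * (1.0 if has_suggestion else 0.0)

    print(metareview_score("Good job."))   # short, no suggestions: low score
    print(metareview_score(
        "The introduction is clear, but you should cite sources for the "
        "claims in paragraph two and consider adding a worked example."))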

Page 16

Calibration
Basic idea: a training course for reviewers. How they do determines how much credence they get.
Before students review peers, they get 3 works to review:
  • 1 is exemplary
  • The others have known defects
Their agreement with the instructor yields a Reviewer Competency Index.
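A minimal sketch of how calibration could feed a Reviewer Competency Index: compare a student’s scores on the three calibration works with the instructor’s scores, and map the average disagreement to a 0-1 weight. The formula is an assumption made for illustration, not the index used by CPR or any other system named here.

    # Illustrative Reviewer Competency Index (RCI).  The formula is assumed
    # for this sketch; it is not the one used by CPR or Expertiza.

    def competency_index(student_scores, instructor_scores, max_score=10):
        """Map agreement with the instructor on calibration works to 0-1."""
        assert len(student_scores) == len(instructor_scores)
        gaps = [abs(s - i) for s, i in zip(student_scores, instructor_scores)]
        avg_disagreement = sum(gaps) / len(gaps)
        return max(0.0, 1.0 - avg_disagreement / max_score)

    # Three calibration works: one exemplary, two with known defects.
    instructor = [10, 4, 6]
    student    = [9, 5, 6]    # close agreement with the instructor
    print(competency_index(student, instructor))   # about 0.93: high credence

A reviewer’s later peer-review scores can then be weighted by this index when grades are computed.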

Page 17

Reputation algorithm

s1 gets the same scores from reviewers in both situations. Should it get the same grade?

Situation 1:
      s1    s2    s3    s4    s5
r1   0.6   0.6   0.6    —     —
r2   0.3    —     —    0.4    —
r3   0.3    —     —     —    0.4
r4    —    0.3   0.3   0.4   0.4
r5    —    0.3   0.3   0.4   0.4

Situation 2:
      s1    s2    s3    s4    s5
r1   0.6   0.4   0.4    —     —
r2   0.3    —     —    0.2    —
r3   0.3    —     —     —    0.2
r4    —    0.4   0.4   0.5   0.5
r5    —    0.4   0.4   0.5   0.5

Page 18

Reputation algorithm, cont.

r2 and r3 agree with their co-reviewers …


Page 19

Reputation algorithm, cont.

In Situation 1, r2 and r3 agree with their co-reviewers, while r1 gives higher scores. So s1’s grade may be inflated.


Page 20

Reputation algorithm, cont.

In Situation 2, r1 agrees with his co-reviewers, while r2 and r3 give lower scores.

So in this case, s1 was reviewed by “harder” graders, and thus deserves a higher grade.

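One way to turn this intuition into an algorithm is sketched below: estimate each reviewer’s leniency as the average gap between their score and the co-reviewer consensus on shared submissions, then subtract that leniency before averaging a submission’s scores. This is an illustrative formulation only, not the specific reputation algorithm used in Expertiza.

    # Sketch of a leniency-corrected reputation scheme.  The formulation is
    # assumed for illustration; it is not the algorithm implemented in Expertiza.
    from statistics import mean

    def leniency(reviews, reviewer):
        """Average signed gap between this reviewer's score and the mean score
        each shared submission received from its other reviewers."""
        gaps = []
        for scores in reviews.values():
            if reviewer in scores and len(scores) > 1:
                others = [v for r, v in scores.items() if r != reviewer]
                gaps.append(scores[reviewer] - mean(others))
        return mean(gaps) if gaps else 0.0

    def adjusted_grade(reviews, submission):
        """Average each reviewer's score after removing that reviewer's leniency."""
        scores = reviews[submission]
        return mean(v - leniency(reviews, r) for r, v in scores.items())

    # Situation 1 from the tables above: r1 is lenient, r2 and r3 track consensus.
    situation1 = {
        "s1": {"r1": 0.6, "r2": 0.3, "r3": 0.3},
        "s2": {"r1": 0.6, "r4": 0.3, "r5": 0.3},
        "s3": {"r1": 0.6, "r4": 0.3, "r5": 0.3},
        "s4": {"r2": 0.4, "r4": 0.4, "r5": 0.4},
        "s5": {"r3": 0.4, "r4": 0.4, "r5": 0.4},
    }

    print(adjusted_grade(situation1, "s1"))   # about 0.35, below the raw mean of 0.4

With this toy scheme, s1’s adjusted grade in Situation 1 falls below the raw mean of 0.4, while the same calculation on Situation 2 pushes it above 0.4, matching the intuition that scores from “harder” graders should be adjusted upward.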

Page 21

Reputation systems: how reliable?
Two studies on a Coursera MOOC [2013]:
  • Piech et al.: ≥ 26% of grades ± 5% from “ground truth.”
  • Kulkarni et al.: 40% of grades off by 1 letter grade!
But … there was no calibration and no metareviewing; this was, after all, a MOOC.

Page 22

Who reviews whom?
  • Simplest: each student reviews k other students
  • Reviewing in groups (case study, etc.)
  • Individuals review teams
  • Dynamic assignment, to make sure all get reviewed (sketched below)
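The dynamic-assignment idea can be sketched as a simple policy (the details below are assumed for illustration): when a reviewer requests work, give them the submission that currently has the fewest completed reviews, excluding anything they authored.

    # Illustrative dynamic-assignment policy: review the least-reviewed
    # submission that is not your own.  Details are assumed for this sketch.

    def next_assignment(review_counts, reviewer, authored_by):
        """Pick the eligible submission with the fewest reviews so far."""
        candidates = [s for s in review_counts if authored_by[s] != reviewer]
        if not candidates:
            return None
        return min(candidates, key=lambda s: review_counts[s])

    review_counts = {"subA": 2, "subB": 0, "subC": 1}
    authored_by   = {"subA": "alice", "subB": "bob", "subC": "carol"}

    print(next_assignment(review_counts, "alice", authored_by))  # subB (fewest reviews)
    print(next_assignment(review_counts, "bob", authored_by))    # subC (can't review own subB)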

Page 23

The PR app landscape
Most widely used: CPR (Calibrated Peer Review)
  • Sharable assignments for many disciplines
  • But you probably want to adapt them.

Page 24

SWoRD
Perhaps the most-researched system, from Pitt’s Learning Research and Development Center.

Page 25

Page 26

Peer Scholar

  • Came from U. of Toronto
  • Now sold by Pearson in Canada
  • Free (for now) in the US
  • Supports (& recommends) revision and resubmission

Page 27

Mobius SLIP

  • Origin in case-study courses
  • Based on ranking
  • “Double loop”

Page 28

Expertiza
“Reusable learning objects through peer review”
  • Supports signing up for topics/parts of a project
  • Students (or instructor) form teams
  • Individuals review teams
  • Teammates review each other

Page 29

Signup sheet

Page 30

Viewing results

Page 31

Summary
  • Reasons for doing peer review
  • F2F vs. online peer review
  • Rubrics are important
  • Rating vs. ranking
  • Formative vs. summative
  • Quality control
  • Who reviews whom?
  • Online apps for peer review
