
active learning 2013crlte.engin.umich.edu/.../2013/06/W13ActiveLearningPres.pdf · 2019-09-10 · A practical introduction to active learning for busy skeptics April 9, 2013 Michael



A practical introduction to active learning for busy skeptics April 9, 2013

Michael Prince

Education is what happens to the other person, not what comes out of the mouth of the educator. (Myles Horton)

A Practical Introduction to Active Learning for the Busy Skeptic

Michael Prince

Professor of Chemical Engineering

Bucknell University

Reminder

Why I’m here

What I hope to do

What I’m not trying to do


Goals: At the end of the workshop participants should be able to:

Define active learning and explain the distinctions between the different active learning methods

Explain when the different methods make sense to use

Develop and incorporate activities from each of the different instructional methods

Defining Active Learning

Any instructional method that actively engages students in the learning process.

Requires students to do meaningful learning activities that promote intellectual engagement.

http://www.cs.unt.edu/~garlick/teaching/1110/1110.html

The Active Learning Continuum (instructor-focused to student-focused):

Active Learning: make the lecture active

Collaborative Learning: informal group activities

Cooperative Learning: structured team activities

Problem-Based Learning: problems drive the course


Split-Class Brainstorm Activity

Possible benefits of active learning

Possible faculty concerns about using it.

Deconstructing Active Learning

Key Elements

Introducing Activities in the Classroom

Promoting Student Engagement

What does the data say? Even if you are fascinating… people only remember the first 15 minutes of what you say.

[Figure: percent of students paying attention vs. time from start of lecture, 0 to 60 minutes.]


One Structure to Address This

Thinking Together: Collaborative Learning in the Sciences – Harvard University – Derek Bok Center – www.fas.harvard.edu/~bok_cen/

Effectiveness of Short Activities

                     With Pause                    Without Pause
Short-term recall    108 correct facts recalled    80 correct facts recalled
Long-term recall     Average exam score = 84.9     Average exam score = 76.7

Stretch Break


Quiz

The 2nd Element of Active Learning: Student Engagement

Lots of evidence supports the importance of student engagement (no surprise there)

The magnitude of reported improvements might be a surprise

Sample Data Supporting Engagement

Average college and university results:

[Figure: percent of students understanding the concepts of velocity, acceleration, and force, measured before instruction, after traditional instruction, and after new methods.]


Predict Outcomes

Case 1: Experienced, highly rated instructor using primarily lecture and demonstrations. Clickers at end of class.

Case 2: Graduate student using all active learning (“clickers”, no lectures, student questions & instructor feedback)

Deslauriers et al., Science, May 13, 2011

Results: The graduate-student-taught section showed:

20% increased attendance

4× more student engagement (observer-rated)

More than twice the learning gains

It’s the method, not the person!

Simple and Easy Active Learning Exercises

One minute paper

Brainstorm applications of class material

Summarize (clarify) lecture notes

Have students generate test questions or homework problems from lecture material

Non-rhetorical questions:

Predict what will happen if…

Explain in your own words, why…

What if, in the real world, the theory doesn’t work?


Common Faculty Questions

How much time does it take to prepare? Often very little.

Can I still cover the syllabus? YES!!

Common Mistakes with Active Learning

1. Always calling on volunteers

2. Making activity too long

3. Making activity trivial

Group Discussion

Jot down questions/concerns/comments you have about using active learning in your classes. (the more, the better)

Instructor will call on individuals at the end of the activity


You might use active learning if…

You want to increase student engagement and attendance

You want to increase students’ retention of the subject matter

You want to uncover and address students’ misconceptions

Any Final Questions About Active Learning?

?

Collaborative Learning

Active learning where students interact with one another as they learn and apply course material

Focus on students’ exploration of course material, not on the instructor’s presentation of it

Interactions are generally unstructured


The Active Learning Continuum (instructor-focused to student-focused):

Active Learning: make the lecture active

Collaborative Learning: informal group activities

Cooperative Learning: structured team activities

Problem-Based Learning: problems drive the course

Key Element of Collaborative Learning

Working together rather than promoting learning as an entirely individual activity

Why foster collaboration?

Hundreds of studies show that collaboration:

Improves academic achievement

Improves student attitudes towards their studies

Improves student retention in academic programs


Why teams promote learning

Individual students get stuck, groups keep going

More and better question generation, less fear in class

Students see and learn different perspectives and strategies

Students, like professors, learn best what they teach!

Examples of Collaborative Learning

Think-pair-share

Jigsaw

Paired writing assignments or homework

Pairing students for any of the activities listed under active learning:

Brainstorming

Clarifying lecture notes

Answering stimulating questions

Collaborative Activity

Read “How About a Quick One?”

Discuss ideas from article, previous slides and your own experience

Pick your favorite example and explain how you might use it in one of your classes

Be prepared to share with the class


In summary, you might use collaborative learning if…

You want to add variety to your list of possible class activities

You want to acclimate the students to working together

You want to get students talking in large classes

You want to see the benefits reported in the literature with respect to academic achievement, attitudes and retention

Questions about Collaborative Learning?

?


HOW ABOUT A QUICK ONE?

Of all instructional methods, lecturing is the most common, the easiest, and the least effective. Unless the instructor is a real spellbinder, most students cannot stay focused throughout a lecture: after about 10 minutes their attention begins to drift, first for brief moments and then for longer intervals; they find it increasingly hard to catch up on what they missed while their minds were wandering; and eventually they switch the lecture off altogether like a bad TV show. McKeachie [1] cites a study indicating that immediately after a lecture students recalled 70% of the information presented in the first ten minutes and only 20% of that from the last ten minutes.

There are better ways. Actively involving students in learning instead of simply lecturing to them leads to improved attendance, deeper questioning, higher grades, and greater lasting interest in the subject [1,2]. A problem with active instructional methods, however, is that they sound time-consuming. Whenever I describe in workshops and seminars the proven effectiveness of in-class problem-solving, problem-formulation, trouble-shooting or brainstorming exercises, I can always count on someone in the third row asking---usually sincerely, sometimes belligerently---"If I do all that, how am I supposed to get through the syllabus?"

I have a variety of answers I trot out on such occasions, depending on my mood and the tone of my questioner, but they mostly amount to "So what if you don't?" Syllabi are usually made up from the standpoint of "What do I want to cover" rather than the much more pertinent "What do I want the students to be able to do"; when the latter approach is adopted, it often turns out that large chunks of the syllabus serve little educational purpose and can be excised with no great loss to anyone. But never mind: let's accept---for the remainder of this column, at least---the principle that it is critically important to get through the syllabus. Can I (asks my friend in the third row) use any of those allegedly powerful teaching techniques and still cover it all?

Yes (I reply), you can. Here are two techniques for doing it.

In-class group problem-solving

As you lecture on a body of material or go through a problem solution, instead of just posing questions to the class as a whole and enduring the subsequent embarrassing and time-wasting silences, occasionally assign a task and give the class one or two minutes to work on it in groups of three to five at their seats. For example:

Sketch and label a flow chart (schematic, force diagram, differential control volume) for this system.

Sketch a plot of what the problem solution should look like.

Give several reasons why you might need or want to know the solution.

What's the next step?

What's wrong with what I just wrote?

How could I check this solution?

What question do you have about what we just did?

Suppose I run some measurements in the laboratory or plant and the results don't agree with the formula I just derived. Think of as many reasons as you can for the discrepancy.

What variations of this problem might I put on the next test? (This and the last one are particularly instructive.)

You don't have to spend a great deal of time on such exercises; one or two lasting no more than five minutes in a 50-minute session can provide enough stimulation to keep the class with you for the entire period. The syllabus is safe!

Warning, however. The first time you assign group work, the introverts in the class will hang back and try to avoid participating. Don't be surprised or discouraged---it's a natural response. Just get their attention---walk over to them if necessary---and remind them good-naturedly that they're supposed to be working together. When they find out that you can see them(!) they'll do it, and by the time you've done three or four such exercises most of the class will need no extra prodding. Granted, there may be a few who continue to hold out, but look at it this way: in the usual lecture approach, 5% of the students (if that many) are actively involved and 95% are not. If you can do something that reverses those percentages or comes close to it, you've got a winner.

In-class reflection and question generation

The one-minute paper is an in-class assignment in which students nominate the most important and/or the most confusing points in the lecture just concluded [3,4]. Variations of this device can be used to powerful effect. About two minutes from the end of a class, ask the students---working individually or in small groups---to write down and turn in anonymous responses to one or two of the following questions:

What are the two most important points brought out in class today (this week, in the chapter we just finished covering)? Examination of the responses will let you know immediately whether the students are getting the essential points. Also, when the students know beforehand that this question is coming they will tend to watch for the main points as the class unfolds, with obvious pedagogical benefits.

What were the two muddiest points in today's class (this week's classes, this section of the course)? Rank the responses in order of their frequency of occurrence and in the next class go over the ones that came up most often.

The responses to this question will surprise you. What you would have guessed to be the most difficult concepts may not show up on many papers, if they show up at all; what will appear are concepts you take for granted, which you skimmed over in your lecture but which are unfamiliar and baffling to the students.
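As a side note, the ranking step described here (tally the responses, order by frequency of occurrence) is trivial to automate if the responses come in electronically. A minimal sketch in Python; the response strings are hypothetical examples, not data from the column:

```python
from collections import Counter

# Hypothetical muddiest-point responses turned in at the end of one class
responses = [
    "log-mean temperature difference",
    "units of the heat transfer coefficient",
    "log-mean temperature difference",
    "boundary conditions",
    "log-mean temperature difference",
    "units of the heat transfer coefficient",
]

# Rank the responses in order of their frequency of occurrence,
# most frequent first, so the top items can be revisited next class
for topic, count in Counter(responses).most_common():
    print(f"{count:2d}  {topic}")
```

With free-text responses you would first normalize wording (or bin them by hand), but the ranking itself stays a one-liner.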

What would make this material clearer to you? You also never know what you'll get in response to this one---perhaps requests for worked-out examples of solution procedures or concrete applications of abstract material, or pleas for you to write more clearly on the board, speak more slowly, or stop some annoying mannerism that you weren't aware you were doing. Responses to this question can provide valuable clues about what you could do to make your teaching more effective.

Make up a question about an everyday phenomenon that could be answered using material presented in class today (this week). (Optional:) One or two of your questions will show up on the next test.

I used the last exercise---including the zinger about the next test---at the end of a course segment on convective heat transfer and got back a wonderful series of questions about such things as why you feel much colder in water at 20 degrees Celsius than in air at the same temperature; why you feel a draft when you stand in front of a closed window on a cold day; why a fan cools you on a hot day and why a higher fan speed cools you even more; why a car windshield fogs up during the winter and how a defogger works; and why you don't get burned when you (a) move your hand right next to (but not quite touching) a pot of boiling water; (b) touch a very hot object very quickly; (c) walk across hot coals. I typed up the questions (sneaking a few additional ones onto the list) and posted them outside my office---and in the days preceding the test I had a great time watching the students thinking through all the questions and speculating on which one I would put on the test. (I used the one about the fan.)

There are other short, easy, and effective instructional methods, but these should do for starters. Check them out and let me know how they work for you. If I collect some good testimonials (positive or negative) I'll report them in a future column.

References

1. McKeachie, W.J., Teaching Tips, 8th Edn., Lexington, MA, D.C. Heath & Co. (1986).

2. Bonwell, C.C., and J.A. Eison, Active Learning: Creating Excitement in the Classroom, ASHE-ERIC Higher Education Report No. 1, Washington, DC, George Washington University (1991).

3. Wilson, R.C., "Improving Faculty Teaching: Effective Use of Student Evaluations and Consultants," J. Higher Ed., 57, 196-211 (1986).

4. Cross, K.P., and T.A. Angelo, Classroom Assessment Techniques: A Handbook for Faculty, Ann Arbor, National Center for Research to Improve Postsecondary Teaching and Learning (1988).


Chemical Engineering Education, 28(3), 174–175 (1994)

ANY QUESTIONS?

Richard M. Felder
Department of Chemical Engineering
North Carolina State University
Raleigh, NC 27695

Most questions asked in engineering classes follow one of two models:

1. “If a first-order reaction A → B with specific reaction rate 3.76 min⁻¹ takes place in an ideal continuous stirred-tank reactor, what volume is required to achieve a 75% reactant conversion at steady state if the throughput rate is 286 liters/s?”

2. “Do you have any questions?”

While these may be important questions to ask, they don't exactly stimulate deep thought. “What's the volume?” has only one possible correct answer, obtained by mechanically substituting values into a formula. “Do you have any questions?” is even less productive: the leaden silence that usually follows makes it clear that the answer for most students is always “No,” whether or not they understand the material.
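To see just how mechanical question 1 is: for a first-order reaction in an ideal CSTR at steady state, the standard design equation reduces to V = v0·X / (k·(1 − X)). A sketch of the plug-and-chug, using the values from the question (the formula is the textbook result, but check it against your own course notes):

```python
# Plug-and-chug behind "What's the volume?" for an ideal CSTR at steady
# state with a first-order reaction A -> B:  V = v0 * X / (k * (1 - X))
k = 3.76 / 60.0   # specific reaction rate, converted from 1/min to 1/s
v0 = 286.0        # volumetric throughput, liters/s
X = 0.75          # desired reactant conversion

V = v0 * X / (k * (1 - X))   # required reactor volume, liters
print(f"V = {V:.0f} L = {V / 1000:.1f} m^3")  # roughly 13.7 m^3
```

Exactly one correct answer, reached without any reasoning about the physical system, which is precisely the column's complaint.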

Questions lie at the heart of the learning process. A good question raised during class or on a homework assignment can provoke curiosity, stimulate thought, illustrate the true meaning of lecture material, and trigger a discussion or some other form of student activity that leads to new or deeper understanding. Closed (single-answer) questions that require only rote recitation or substitution don't do much along these lines, and questions of the “Any questions?” variety do almost nothing.

Following are some different things we can ask our students to do which can get them thinking in ways that “Given this, calculate that” never can.

Define a concept in your own words

• Using terms a bright high school senior (a chemical engineering sophomore, a physics major, your grandmother) could understand, briefly explain the concept of vapor pressure (viscosity, heat transfer coefficient, ideal solution).¹

Explain familiar phenomena in terms of course concepts

• Why do I feel comfortable in 65°F still air, cool when a 65°F wind is blowing, freezing in 65°F water, and even colder when I step out of the water unless the relative humidity is close to 100%?

• A kettle containing boiling water is on a stove. If you put your finger right next to the kettle but not touching it, you'll be fine, but if you touch the kettle for more than a fraction of a second you'll burn yourself. Why?

¹Warning: Don't ask your students to give a comprehensible definition of something like τxx or entropy or temperature or mass unless you're sure you can do it.


Predict system behavior before calculating it

• Without using your calculator, estimate the time it will take for half of the methanol in the vessel to drain out (for all the water in the kettle to boil off, for half of the reactant to be converted).

• What would you expect plots of C_B vs. t to look like if you ran the reactor at two different temperatures? Don't do any calculations—just use logic. Explain the shapes of your plots.

• An open flask containing an equimolar mixture of two miscible species is slowly heated. The first species has a normal boiling point of 75°C and the second boils at 125°C. You periodically measure the temperature, T, and the height of the liquid in the flask, h, until all of the liquid is gone. Sketch plots of T and h vs. time, labeling the temperatures at which abrupt changes in system behavior occur.²

Think about what you've calculated

• Find three different ways to verify that the formula we just derived is correct.

• Suppose we build and operate the piping system (heat exchanger, absorption column, VLE still, tubular reactor) exactly as specified, and lo and behold, the throughput rate (heat duty, solvent recovery, vapor phase equilibrium composition, product yield) is not what we predicted. What are at least 10 possible reasons for the disparity?³

• Why would an intermediate reactor temperature be optimal for this pair of reactions? (Put another way, what are the drawbacks of very low and very high temperature operation?)

• The computer output says that we need a tank volume of 3.657924×10⁶ m³. Any problems with this solution?
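One way to dramatize the last question for students: translate the reported volume into the side length of an equivalent cube. This framing is our illustration, not the column's; a minimal sketch:

```python
# Sanity check on the computer output V = 3.657924e6 m^3: how big is that?
V = 3.657924e6         # reported tank volume, m^3
side = V ** (1 / 3)    # edge length of an equivalent cube, m
print(f"equivalent cube: about {side:.0f} m on a side")
```

A cube roughly 150 m on a side is obviously not a tank, and that is before anyone asks about the seven spurious significant figures.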

Brainstorm

• What separation processes might work for a mixture of benzene and acetone? Which one would you be tempted to try first? Why?

• What are possible safety (environmental, quality control) problems we might encounter with the process unit we just designed? You get double credit for an answer that nobody else thinks of. The longest list gets a three-point bonus on the next test.

Once a list of problems has been generated, you might follow up by asking the students to prioritize the problems in terms of their potential impact and to suggest ways to minimize or eliminate them.

Formulate questions

• What are three good questions about what we covered today?

• Make up and solve a nontrivial problem about what we covered in class this week (about what we covered in class this month and what you covered in your organic chemistry class this month). Memory and plug-and-chug problems won't be worth much—for full credit, the problem should be both creative and challenging.

• A problem on the next test will begin with the sentence, “A first-order reaction A → B with specific reaction rate 3.76 min⁻¹ takes place in an ideal continuous reactor.” Generate a set of questions that might follow. Your questions should be both qualitative and quantitative, and should involve every topic the test covers. I guarantee that I will use some of the questions I get on the test.

²You will be amazed and depressed by how many of your students—whether they're sophomores or seniors—say the level remains constant until T = 75°C, and then the liquid boils.

³Be sure to provide feedback the first few times you ask this critically important question, so that the students learn to think about both assumptions they have made and possibilities for human error.

I could go on, but you get the idea.

Coming up with good questions is only half the battle; the other half is asking them in a way that has the greatest positive impact on the students. I have not had much luck with the usual approaches. If I ask the whole class a question and wait for someone to volunteer an answer, the students remain silent and nervously avoid eye contact with me until one of them (usually the same one) pipes up with an answer. On the other hand, if I call on individual students with questions, I am likely to provoke more fear than thought. No matter how kindly my manner and how many eloquent speeches I make about the value of wrong answers, most students consider being questioned in class as a setup for them to look ignorant in public—and if the questions require real thought, their fear may be justified.

I find that a better way to get the students thinking actively in class is to ask a question, have the students work in groups of 2–4 to generate answers, and then call on several of the groups to share their results. I vary the procedure occasionally by having the students formulate answers individually, then work in pairs to reach consensus. For more complex problems, I might then have pairs get together to synthesize team-of-four solutions. Another effective strategy is to put questions like those listed above into homework assignments and pre-test study guides, promising the students that some of the questions will be included on the next test, and then include them. If such questions only show up in class, many students tend to discount them; however, if the questions also routinely appear in homework and on tests, the students take them seriously. It's a good idea to provide feedback on their initial efforts and give examples of good responses, since this is likely to be a new game for most of them and so at first they won't know exactly what you're after. After a while they'll start to get it, and some of them may even turn out to be better at it than you are. This is not a bad problem to have.4

⁴For more information on helping students develop creative problem-solving abilities, see R.M. Felder, “On Creating Creative Engineers,” Engineering Education, 77(4), 222 (1987), <http://www.ncsu.edu/felder-public/Papers/Creative_Engineers.pdf>; “The Generic Quiz,” Chem. Eng. Education, 19(4), 176 (1985); and Chapter 5 of P.C. Wankat and F.S. Oreovicz, Teaching Engineering, New York, McGraw-Hill, 1993.



Improved Learning in a Large-Enrollment Physics Class

Louis Deslauriers,¹,² Ellen Schelew,² Carl Wieman*†‡

We compared the amounts of learning achieved using two different instructional approaches under controlled conditions. We measured the learning of a specific set of topics and objectives when taught by 3 hours of traditional lecture given by an experienced highly rated instructor and 3 hours of instruction given by a trained but inexperienced instructor using instruction based on research in cognitive psychology and physics education. The comparison was made between two large sections (N = 267 and N = 271) of an introductory undergraduate physics course. We found increased student attendance, higher engagement, and more than twice the learning in the section taught using research-based instruction.

The traditional lecture approach remains the prevailing method for teaching science at the postsecondary level, although there are a growing number of studies indicating that other instructional approaches are more effective (1–8). A typical study in the domain of physics demonstrates how student learning is improved from one year to the next when an instructor changes his or her approach, as measured by standard concept-based tests such as the Force Concept Inventory (9) or the instructor's own exams. In our studies of two full sessions of an advanced quantum mechanics class taught either by traditional or by interactive learning style, students in the interactive section showed improved learning, but both sections, interactive and traditional, showed similar retention of learning 6 to 18 months later (10). Here, we compare learning produced by two contrasting instructional methods in a large-enrollment science course. The control group was lectured by a motivated faculty member with high student evaluations and many years of experience teaching this course. The experimental group was taught by a postdoctoral fellow using instruction based on research on learning. The same selected learning objectives were covered by both instructors in a 1-week period.

The instructional design for the experimental section was based on the concept of “deliberate practice” (11) for the development of expertise. The deliberate practice concept encompasses the educational ideas of constructivism and formative assessment. In our case, the deliberate practice takes the form of a series of challenging questions and tasks that require the students to practice physicist-like reasoning and problem solving during class time while provided with frequent feedback.

The design goal was to have the students spend all their time in class engaged in deliberate practice at “thinking scientifically” in the form of making and testing predictions and arguments about the relevant topics, solving problems, and critiquing their own reasoning and that of others. All of the activities are designed to fit together to support this goal, including moving the simple transfer of factual knowledge outside of class as much as possible and creating tasks and feedback that motivate students to become fully engaged. As the students work through these tasks, they receive feedback from fellow students (12) and from the instructor. We incorporate multiple “best instructional practices,” but we believe the educational benefit does not come primarily from any particular practice but rather from the integration into the overall deliberate practice framework.

This study was carried out in the second term of the first-year physics sequence taken by all undergraduate engineering students at the University of British Columbia. This calculus-based course covers various standard topics in electricity and magnetism. The course enrollment was 850 students, who were divided among three sections. Each section had 3 hours of lecture per week. The lectures were held in a large theater-style lecture hall with fixed chairs behind benches grouping up to five students. The students also had weekly homework assignments, instructional laboratories, and tutorials and recitations where they solved problems; this work was graded. There were two midterm exams and a final exam. All course components were common across all three sections, except for the lectures, which were prepared and given independently by three different instructors.

During week 12, we studied two sections whose instructors agreed to participate. For the 11 weeks preceding the study, both sections were taught in a similar manner by two instructors (A and B), both with above-average student teaching evaluations and many years' experience teaching this course and many others. Both instructors lectured using PowerPoint slides to present content and example problems and also showed demonstrations. Meanwhile, the students took notes. "Clicker" (or "personal response system") questions (average 1.5 per class, range 0 to 5) were used for summative evaluation (which was characterized by individual testing without discussion or follow-up other than a summary of the correct answers). Students were given participation credit for submitting answers.

Before the experiment, a variety of data were collected on the students in the two sections

1Carl Wieman Science Education Initiative, University of British Columbia, Vancouver, BC, Canada. 2Department of Physics and Astronomy, University of British Columbia, Vancouver, BC, Canada.

*On leave from the University of British Columbia and the University of Colorado.
†To whom correspondence should be addressed. E-mail: [email protected]
‡This work does not necessarily represent the views of the Office of Science and Technology Policy or the United States government.

Table 1. Measures of student perceptions, behaviors, and knowledge.

Measure                                      Control section   Experimental section
Number of students enrolled                  267               271
Mean BEMA score (13) (week 11)               47 ± 1%           47 ± 1%
Mean CLASS score (14) (start of term,
  agreement with physicist)                  63 ± 1%           65 ± 1%
Mean midterm 1 score                         59 ± 1%           59 ± 1%
Mean midterm 2 score                         51 ± 1%           53 ± 1%
Attendance before experiment*                55 ± 3%           57 ± 2%
Attendance during experiment                 53 ± 3%           75 ± 5%
Engagement before experiment*                45 ± 5%           45 ± 5%
Engagement during experiment                 45 ± 5%           85 ± 5%

*Average value of multiple measurements carried out in a 2-week interval before the experiment. Engagement also varies over location in the classroom; numbers given are spatial and temporal averages.

13 MAY 2011 VOL 332 SCIENCE www.sciencemag.org862


(Table 1). Students took two midterm exams (identical across all sections). In week 11, students took the Brief Electricity and Magnetism Assessment (BEMA), which measures conceptual knowledge (13). At the start of the term, students took the Colorado Learning Attitudes about Science Survey (CLASS) (14), which measures a student's perceptions of physics. During weeks 10 and 11, we measured student attendance and engagement in both sections. Attendance was measured by counting the number of students present, and engagement was measured by four trained observers in each class using the protocol discussed in the supporting online material (SOM) (15). The results show that the two sections were indistinguishable (Table 1). This in itself is interesting, because the personalities of the two instructors are rather different, with instructor A (control section) being more animated and intense.

The experimental intervention took place during the 3 hours of lecture in the 12th week. Those classes covered the unit on electromagnetic waves. This unit included standard topics such as plane waves and energy of electromagnetic waves and photons. The control section was taught by instructor A using the same instructional approach as in the previous weeks, except they added instructions to read the relevant chapter in the textbook before class. The experimental section was taught by two instructors who had not previously taught these students. The instructors were the first author of this paper, L.D., assisted by the second author, E.S. Instructor A and L.D. had agreed to make this a learning competition. L.D. and instructor A agreed beforehand what topics and learning objectives would be covered. A multiple-choice test (see SOM) was developed by L.D. and instructor A that they and instructor B agreed was a good measure of the learning objectives and physics content. The test was prepared at the end of week 12. Most of the test questions were clicker questions previously used at another university, often slightly modified. Both sections were told that they would receive a bonus of 3% of the course grade for the combination of participating in clicker questions, taking the test, and (only in the experimental section) turning in group task solutions, with the apportionment of credit across these tasks left unspecified.

In contrast to instructor A, the teaching experience of L.D. and E.S. had been limited to serving as teaching assistants. L.D. was a postdoctoral researcher working in the Carl Wieman (third author of this paper) Science Education Initiative (CWSEI) and had received training in physics education and learning research and methods of effective pedagogy while assisting with the teaching of six courses. E.S. had a typical physics graduate student background except for having taken a seminar course in physics education.

The instructional approach used in the experimental section included elements promoted by CWSEI and its partner initiative at the University of Colorado: preclass reading assignments, preclass reading quizzes, in-class clicker questions with student-student discussion (CQ), small-group active learning tasks (GT), and targeted in-class instructor feedback (IF). Before each of the three 50-min classes, students were assigned a three- or four-page reading, and they completed a short true-false online quiz on the reading. To avoid student resistance, at the beginning of the first class, several minutes were used to explain to students why the material was being taught this way and how research showed that this approach would increase their learning.

A typical schedule for a class was the following: CQ1, 2 min; IF, 4 min; CQ2, 2 min; IF, 4 min; CQ2 (continued), 3 min; IF, 5 min; Revote CQ2, 1 min; CQ3, 3 min; IF, 6 min; GT1, 6 min; IF with a demonstration, 6 min; GT1 (continued), 4 min; and IF, 3 min. The time duration for a question or activity includes the amount of time the students spent discussing the problem and asking numerous questions. There was no formal lecturing; however, guidance and explanations were provided by the instructor throughout the class. The instructor responded to student-generated questions, to results from the clicker responses, and to what the instructor heard by listening in on the student-student discussions. Students' questions commonly expanded upon and extended the material covered by the clicker questions or small-group tasks. The material shown on the slides used in class is given in the SOM, along with some commentary about the design elements and preparation time required.
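As a quick check, the listed segments account for essentially the whole 50-min class. A minimal sketch (the segment labels and durations are simply transcribed from the schedule above; the variable names are ours):

```python
# Tally the sample class schedule: (activity, minutes) pairs transcribed
# from the text. Labels follow the paper's abbreviations: CQ = clicker
# question, IF = instructor feedback, GT = small-group task.
schedule = [
    ("CQ1", 2), ("IF", 4), ("CQ2", 2), ("IF", 4), ("CQ2 cont.", 3),
    ("IF", 5), ("Revote CQ2", 1), ("CQ3", 3), ("IF", 6), ("GT1", 6),
    ("IF + demo", 6), ("GT1 cont.", 4), ("IF", 3),
]

total = sum(minutes for _, minutes in schedule)
feedback = sum(m for label, m in schedule if label.startswith("IF"))
print(total)     # 49 -- of the 50 scheduled minutes
print(feedback)  # 28 -- minutes of targeted instructor feedback
```

The tally shows 49 of 50 minutes scheduled, with more than half the class time given to instructor feedback interleaved with student activity rather than delivered as a block lecture.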

At the beginning of each class, the students were asked to form groups of two. After a clicker question was shown to the class, the students discussed the question within their groups (which often expanded to three or more students) and submitted their answer using clickers. When the voting was complete, the instructor showed the results and gave feedback. The small-group tasks were questions that required a written response. Students worked in the same groups but submitted individual answers at the end of each class for participation credit. Instructor A observed each of these classes before teaching his own class and chose to use most of the clicker questions developed for the experimental class. However, instructor A used these only for summative evaluation, as described above.

L.D. and E.S. together designed the clicker questions and small-group tasks. L.D. and E.S. had not taught this class before and were not familiar with the students. Before the first class, they solicited two volunteers enrolled in the course to pilot-test the materials. The volunteers were asked to think aloud as they reasoned through the planned questions and tasks. Results from this testing were used to modify the clicker questions and tasks to reduce misinterpretations and adjust the level of difficulty. This process was repeated before the second class with one volunteer.

During the week of the experiment, engagement and attendance remained unchanged in the control section. In the experimental section, student engagement nearly doubled and attendance increased by 20% (Table 1). The reason for the attendance increase is not known. We hypothesize that of the many students who attended only part of a normal class, more of them were captured by the happenings in the experimental section and decided to stay and to return for the subsequent classes.

The test was administered in both sections in the first class after the completion of the 3-hour unit. The control section had covered the material related to all 12 of the questions on the test. The experimental section covered only 11 of the 12 questions in the allotted time. Two days before the test was given, the students in both sections were reminded of the test and given links to the postings of all the material used in the experimental section: the preclass reading assignments and quizzes; the clicker questions; and the group tasks, along with answers to all of these. The students were encouraged by e-mail and in class to try their best on the test and were told that it would be good practice for the final exam, but their performance on the test did not affect their course grade. Few students in either section finished in less than 15 min, with the average being about 20 min.

The test results are shown in Fig. 1. For the experimental section, 211 students attended class to take the test, whereas 171 did so in the control section. The average scores were 41 ± 1% in the control section and 74 ± 1% in the experimental section. Random guessing would produce a score of 23%, so the students in the experimental section did more than twice as well on this test as those in the control section.

Fig. 1. Histogram of student scores for the two sections.

The test score distributions are not normal (Fig. 1). A ceiling effect is apparent in the experimental section. The two distributions have little overlap, demonstrating that the differences in learning between the two sections exist for essentially the entire student population. The standard deviation calculated for both sections was about 13%, giving an effect size for the difference between the two sections of 2.5 standard deviations. As reviewed in (4), other science and engineering classroom studies report effect sizes less than 1.0. An effect size of 2, obtained with trained personal tutors, is claimed to be the largest observed for any educational intervention (16).
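The 2.5-standard-deviation figure follows directly from the reported numbers: (74% − 41%) / 13% ≈ 2.5. A minimal check, using only values stated in the text:

```python
# Effect size = difference in section means divided by the (roughly
# common) standard deviation; all values are as reported in the text.
mean_experimental = 0.74
mean_control = 0.41
standard_deviation = 0.13  # about the same in both sections

effect_size = (mean_experimental - mean_control) / standard_deviation
print(round(effect_size, 1))  # 2.5
```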

This work may obtain larger effect sizes than in this previous work because of the design and implementation that maximized productive engagement. The clicker questions and group tasks were designed not only to require explicit expert reasoning but also to be sufficiently interesting and personally relevant to motivate students to fully engage. Another factor could be that previous work primarily used end-of-term tests, and the results on those tests reflect all the learning that students do inside and outside of class, for example, the learning that takes place while doing homework and studying for exams. In our intervention, the immediate low-stakes test more directly measured the learning achieved from preclass reading and class itself, in the absence of subsequent study.

We are often asked about the possible contributions of the Hawthorne effect, where any change in conditions is said to result in improved performance. As discussed in citations in the SOM, the original Hawthorne plant data actually show no such effect, nor do experiments in educational settings (17).

A concern frequently voiced by faculty as they consider adopting active learning approaches is that students might oppose the change (18). A week after the completion of the experiment and exam, we gave students in the experimental section an online survey (see SOM); 150 students completed the survey.

For the survey statement "I really enjoyed the interactive teaching technique during the three lectures on E&M waves," 90% of the respondents agreed (47% strongly agreed, 43% agreed) and only 1% disagreed. For the statement "I feel I would have learned more if the whole physics 153 course would have been taught in this highly interactive style," 77% agreed and only 7% disagreed. Thus, this form of instruction was well received by students.

In conclusion, we show that use of deliberate practice teaching strategies can improve both learning and engagement in a large introductory physics course as compared with what was obtained with the lecture method. Our study compares similar students, and teachers with the same learning objectives and the same instructional time and tests. This result is likely to generalize to a variety of postsecondary courses.

References and Notes
1. R. J. Beichner et al., in Research-Based Reform of University Physics, E. F. Redish, P. J. Cooney, Eds. (American Association of Physics Teachers, College Park, MD, 2007).
2. C. H. Crouch, E. Mazur, Am. J. Phys. 69, 970 (2001).
3. J. E. Froyd, "White paper on promising practices in undergraduate STEM education" [Commissioned paper for the Evidence on Promising Practices in Undergraduate Science, Technology, Engineering, and Mathematics (STEM) Education Project, The National Academies Board on Science Education, 2008]. www7.nationalacademies.org/bose/Froyd_Promising_Practices_CommissionedPaper.pdf
4. J. E. Froyd, "Evidence for the efficacy of student-active learning pedagogies" (Project Kaleidoscope, 2007). www.pkal.org/documents/BibliographyofSALPedagogies.cfm
5. R. R. Hake, Am. J. Phys. 66, 64 (1998).
6. J. K. Knight, W. B. Wood, Cell Biol. Educ. 4, 298 (2005).
7. M. Prince, J. Eng. Educ. 93, 223 (2004).
8. L. Springer, M. E. Stanne, S. S. Donavan, Rev. Educ. Res. 69, 21 (1999).
9. D. Hestenes, M. Wells, G. Swackhamer, Phys. Teach. 30, 141 (1992).
10. L. Deslauriers, C. Wieman, Phys. Rev. ST Phys. Educ. Res. 7, 010101 (2011).
11. K. A. Ericsson, R. Krampe, C. Tesch-Romer, Psychol. Rev. 100, 363 (1993).
12. M. K. Smith et al., Science 323, 122 (2009).
13. L. Ding, R. Chabay, B. Sherwood, R. Beichner, Phys. Rev. ST Phys. Educ. Res. 2, 010105 (2006).
14. W. K. Adams et al., Phys. Rev. ST Phys. Educ. Res. 2, 010101 (2006).
15. Materials and methods are available as supporting material on Science Online.
16. B. Bloom, Educ. Res. 13, 4 (1984).
17. R. H. Bauernfeind, C. J. Olson, Phi Delta Kappan 55, 271 (1973).
18. G. K. Allen, J. F. Wedman, L. C. Folk, Innovative High. Educ. 26, 103 (2001).

Acknowledgments: This work was supported by the University of British Columbia through the Carl Wieman Science Education Initiative.

Supporting Online Material
www.sciencemag.org/cgi/content/full/332/6031/862/DC1
Materials and Methods
SOM Text
References

16 December 2010; accepted 5 April 2011
10.1126/science.1201783


July 2004 Journal of Engineering Education 223

MICHAEL PRINCE
Department of Chemical Engineering
Bucknell University

ABSTRACT

This study examines the evidence for the effectiveness of active learning. It defines the common forms of active learning most relevant for engineering faculty and critically examines the core element of each method. It is found that there is broad but uneven support for the core elements of active, collaborative, cooperative and problem-based learning.

Keywords: active, collaborative, cooperative, problem-based learning

I. INTRODUCTION

Active learning has received considerable attention over the past several years. Often presented or perceived as a radical change from traditional instruction, the topic frequently polarizes faculty. Active learning has attracted strong advocates among faculty looking for alternatives to traditional teaching methods, while skeptical faculty regard active learning as another in a long line of educational fads.

For many faculty there remain questions about what active learning is and how it differs from traditional engineering education, since this is already "active" through homework assignments and laboratories. Adding to the confusion, engineering faculty do not always understand how the common forms of active learning differ from each other, and most engineering faculty are not inclined to comb the educational literature for answers.

This study addresses each of these issues. First, it defines active learning and distinguishes the different types of active learning most frequently discussed in the engineering literature. A core element is identified for each of these separate methods in order to differentiate between them, as well as to aid in the subsequent analysis of their effectiveness. Second, the study provides an overview of relevant cautions for the reader trying to draw quick conclusions on the effectiveness of active learning from the educational literature. Finally, it assists engineering faculty by summarizing some of the most relevant literature in the field of active learning.

II. DEFINITIONS

It is not possible to provide universally accepted definitions for all of the vocabulary of active learning since different authors in the field have interpreted some terms differently. However, it is possible to provide some generally accepted definitions and to highlight distinctions in how common terms are used.

Active learning is generally defined as any instructional method that engages students in the learning process. In short, active learning requires students to do meaningful learning activities and think about what they are doing [1]. While this definition could include traditional activities such as homework, in practice active learning refers to activities that are introduced into the classroom. The core elements of active learning are student activity and engagement in the learning process. Active learning is often contrasted to the traditional lecture where students passively receive information from the instructor.

Collaborative learning can refer to any instructional method in which students work together in small groups toward a common goal [2]. As such, collaborative learning can be viewed as encompassing all group-based instructional methods, including cooperative learning [3–7]. In contrast, some authors distinguish between collaborative and cooperative learning as having distinct historical developments and different philosophical roots [8–10]. In either interpretation, the core element of collaborative learning is the emphasis on student interactions rather than on learning as a solitary activity.

Cooperative learning can be defined as a structured form of group work where students pursue common goals while being assessed individually [3, 11]. The most common model of cooperative learning found in the engineering literature is that of Johnson, Johnson and Smith [12, 13]. This model incorporates five specific tenets, which are individual accountability, mutual interdependence, face-to-face promotive interaction, appropriate practice of interpersonal skills, and regular self-assessment of team functioning. While different cooperative learning models exist [14, 15], the core element held in common is a focus on cooperative incentives rather than competition to promote learning.

Problem-based learning (PBL) is an instructional method where relevant problems are introduced at the beginning of the instruction cycle and used to provide the context and motivation for the learning that follows. It is always active and usually (but not necessarily) collaborative or cooperative using the above definitions. PBL typically involves significant amounts of self-directed learning on the part of the students.

III. COMMON PROBLEMS INTERPRETING THE LITERATURE ON ACTIVE LEARNING

Before examining the literature to analyze the effectiveness of each approach, it is worth highlighting common problems that engineering faculty should appreciate before attempting to draw conclusions from the literature.

Does Active Learning Work? A Review of the Research

A. Problems Defining What Is Being Studied
Confusion can result from reading the literature on the effectiveness of any instructional method unless the reader and author take care to specify precisely what is being examined. For example, there are many different approaches that go under the name of problem-based learning [16]. These distinct approaches to PBL can have as many differences as they have elements in common, making interpretation of the literature difficult. In PBL, for example, students typically work in small teams to solve problems in a self-directed fashion. Looking at a number of meta-analyses [17], Norman and Schmidt [18] point out that having students work in small teams has a positive effect on academic achievement while self-directed learning has a slight negative effect on academic achievement. If PBL includes both of these elements and one asks if PBL works for promoting academic achievement, the answer seems to be that parts of it do and parts of it do not. Since different applications of PBL will emphasize different components, the literature results on the overall effectiveness of PBL are bound to be confusing unless one takes care to specify what is being examined. This is even truer of the more broadly defined approaches of active or collaborative learning, which encompass very distinct practices.

Note that this point sheds a different light on some of the available meta-analyses that are naturally attractive to a reader hoping for a quick overview of the field. In looking for a general sense of whether an approach like problem-based learning works, nothing seems as attractive as a meta-analysis that brings together the results of several studies and quantitatively examines the impact of the approach. While this has value, there are pitfalls. Aggregating the results of several studies on the effectiveness of PBL can be misleading if the forms of PBL vary significantly in each of the individual studies included in the meta-analysis.
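To see why aggregation can mislead, consider a deliberately simplified, hypothetical meta-analysis (the effect sizes below are made up for illustration, not taken from any study): pooling PBL variants that emphasize different components can wash out large, opposite effects.

```python
# Hypothetical effect sizes from four studies of different PBL variants:
# two emphasizing small-team work (positive effects) and two emphasizing
# self-directed learning (slightly negative effects). Illustrative only.
team_focused = [0.6, 0.5]
self_directed = [-0.2, -0.1]

studies = team_focused + self_directed
pooled_mean = sum(studies) / len(studies)
print(round(pooled_mean, 2))  # 0.2 -- a "small" effect describing neither variant
```

The pooled average of 0.2 suggests a weak, uniform benefit, when in fact one component helps substantially and the other slightly hurts, which is exactly the pattern Norman and Schmidt describe.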

To minimize this problem, the analysis presented in Section IV of this paper focuses on the specific core elements of a given instructional method. For example, as discussed in Section II, the core element of collaborative learning is working in groups rather than working individually. Similarly, the core element of cooperative learning is cooperation rather than competition. These distinctions can be examined without ambiguity. Furthermore, focusing on the core element of active learning methods allows a broad field to be treated concisely.

B. Problems Measuring "What Works"
Just as every instructional method consists of more than one element, it also affects more than one learning outcome [18]. When asking whether active learning "works," a broad range of outcomes should be considered, such as measures of factual knowledge, relevant skills, student attitudes, and pragmatic items such as student retention in academic programs. However, solid data on how an instructional method impacts all of these learning outcomes are often not available, making comprehensive assessment difficult. In addition, where data on multiple learning outcomes exist, they can include mixed results. For example, some studies on problem-based learning with medical students [19, 20] suggest that clinical performance is slightly enhanced while performance on standardized exams declines slightly. In cases like this, whether an approach works is a matter of interpretation, and both proponents and detractors can comfortably hold different views.

Another significant problem with assessment is that many relevant learning outcomes are simply difficult to measure. This is particularly true for some of the higher level learning outcomes that are targeted by active learning methods. For example, PBL might naturally attract instructors interested in developing their students' ability to solve open-ended problems or engage in life-long learning, since PBL typically provides practice in both skills. However, problem solving and life-long learning are difficult to measure. As a result, data are less frequently available for these outcomes than for standard measures of academic achievement such as test scores. This makes it difficult to know whether the potential of PBL to promote these outcomes is achieved in practice.

Even when data on higher-level outcomes are available, it is easy to misinterpret reported results. Consider a study by Qin et al. [21] that reports that cooperation promotes higher quality individual problem solving than does competition. The result stems from the finding that individuals in cooperative groups produced better solutions to problems than individuals working in competitive environments. While the finding might provide strong support for cooperative learning, it is important to understand what the study does not specifically demonstrate. It does not necessarily follow from these results that students in cooperative environments developed stronger, more permanent and more transferable problem solving skills. Faculty citing the reference to prove that cooperative learning results in individuals becoming generically better problem solvers would be over-interpreting the results.

A separate problem determining what works is deciding when an improvement is significant. Proponents of active learning sometimes cite improvements without mentioning that the magnitude of the improvement is small [22]. This is particularly misleading when extra effort or resources are required to produce an improvement. Quantifying the impact of an intervention is often done using effect sizes, which are defined to be the difference in the means of a subject and control population divided by the pooled standard deviation of the populations. An improvement with an effect size of 1.0 would mean that the test population outperformed the control group by one standard deviation. Albanese [23] cites the benefits of using effect sizes and points out that Cohen [24] arbitrarily labeled effect sizes of 0.2, 0.5 and 0.8 as small, medium and large, respectively. Colliver [22] used this fact and other arguments to suggest that effect sizes should be at least 0.8 before they be considered significant. However, this suggestion would discount almost every available finding since effect sizes of 0.8 are rare for any intervention and require truly impressive gains [23]. The effect sizes of 0.5 or higher reported in Section IV of this paper are higher than those found for most instructional interventions. Indeed, several decades of research indicated that standard measures of academic achievement were not particularly sensitive to any change in instructional approach [25]. Therefore, reported improvements in academic achievement should not be dismissed lightly.
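The definition above (difference in group means divided by the pooled standard deviation) can be written as a small function. This is a sketch following the paper's wording; the function name and the exam scores are made up for illustration.

```python
from statistics import mean, stdev

def effect_size(treatment, control):
    """Effect size (Cohen's d): difference in group means divided by
    the pooled standard deviation, as defined in the text."""
    n_t, n_c = len(treatment), len(control)
    s_t, s_c = stdev(treatment), stdev(control)
    # Pooled standard deviation weights each group's variance by its
    # degrees of freedom (n - 1).
    pooled_sd = (((n_t - 1) * s_t**2 + (n_c - 1) * s_c**2)
                 / (n_t + n_c - 2)) ** 0.5
    return (mean(treatment) - mean(control)) / pooled_sd

# Illustrative (fabricated) exam scores for a test and a control section:
active = [72, 80, 68, 77, 83, 75]
lecture = [65, 70, 62, 71, 66, 68]
d = effect_size(active, lecture)
print(round(d, 2))
```

On Cohen's labels, a value near 0.2 would be "small" and one near 0.8 "large"; the fabricated data here yield a d well above 0.8, far larger than is typical for real instructional interventions.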

Note that while effect sizes are a common measure of the magnitude of an improvement, absolute rather than relative values are sometimes more telling. There can be an important difference between results that are statistically significant and those that are significant in absolute terms. For this reason, it is often best to find both statistical and absolute measures of the magnitude of a reported improvement before deciding whether it is significant.

As a final cautionary note for interpreting reported results, some readers dismiss reported improvements from nontraditional instructional methods because they attribute them to the Hawthorne effect, whereby the subjects knowingly react positively to any novel intervention regardless of its merit. The Hawthorne effect is generally discredited, although it retains a strong hold on the popular imagination [26].


C. Summary
There are pitfalls for engineering faculty hoping to pick up an article or two to see if active learning works. In particular, readers must clarify what is being studied and how the authors measure and interpret what "works." The former is complicated by the wide range of methods that fall under the name of active learning, but can be simplified by focusing on core elements of common active learning methods. Assessing "what works" requires looking at a broad range of learning outcomes, interpreting data carefully, quantifying the magnitude of any reported improvement and having some idea of what constitutes a "significant" improvement. This last will always be a matter of interpretation, although it is helpful to look at both statistical measures such as effect sizes and absolute values for reported learning gains.

No matter how data are presented, faculty adopting instructional practices with the expectation of seeing results similar to those reported in the literature should be aware of the practical limitations of educational studies. Educational studies tell us what worked, on average, for the populations examined, and learning theories suggest why this might be so. However, claiming that faculty who adopt a specific method will see similar results in their own classrooms is simply not possible. Even if faculty master the new instructional method, they cannot control all other variables that affect learning. The value of the results presented in Section IV of the paper is that they provide information to help teachers "go with the odds." The more extensive the data supporting an intervention, the more a teacher's students resemble the test population and the bigger the reported gains, the better the odds are that the method will work for a given instructor.

Notwithstanding all of these problems, engineering faculty should be strongly encouraged to look at the literature on active learning. Some of the evidence for active learning is compelling and should stimulate faculty to think about teaching and learning in nontraditional ways.

IV. THE EVIDENCE FOR ACTIVE LEARNING

Bonwell and Eison [1] summarize the literature on active learning and conclude that it leads to better student attitudes and improvements in students' thinking and writing. They also cite evidence from McKeachie that discussion, one form of active learning, surpasses traditional lectures for retention of material, motivating students for further study and developing thinking skills. Felder et al. [27] include active learning in their recommendations for teaching methods that work, noting among other things that active learning is one of Chickering and Gamson's "Seven Principles for Good Practice" [28].

However, not all of this support for active learning is compelling. McKeachie himself admits that the measured improvements of discussion over lecture are small [29]. In addition, Chickering and Gamson do not provide hard evidence to support active learning as one of their principles. Even studies addressing the research base for Chickering and Gamson's principles come across as thin with respect to empirical support for active learning. For example, Sorcinelli [30], in a study aimed at presenting the research base for Chickering and Gamson's seven principles, states that, "We simply do not have much data confirming beneficial effects of other (not cooperative or social) kinds of active learning."

Despite this, the empirical support for active learning is extensive. However, the variety of instructional methods labeled as active learning muddles the issue. Given the differences in the approaches labeled as active learning, it is not always clear what is being promoted by broad claims supporting the adoption of active learning. Perhaps it is best, as some proponents claim, to think of active learning as an approach rather than a method [31] and to recognize that different methods are best assessed separately.

This assessment is done in the following sections, which look at the empirical support for active, collaborative, cooperative, and problem-based learning. As previously discussed, the critical elements of each approach are singled out rather than examining the effectiveness of every possible implementation scheme for each of these distinct methods. The benefits of this general approach are twofold. First, it allows the reader to examine questions that are both fundamental and pragmatic, such as whether introducing activity into the lecture or putting students into groups is effective. Second, focusing on the core element eliminates the need to examine the effectiveness of every instructional technique that falls under a given broad category, which would be impractical within the scope of a single paper. Readers looking for literature on a number of specific active learning methods are referred to additional references [1, 6, 32].

A. Active Learning

We have defined the core elements of active learning to be introducing activities into the traditional lecture and promoting student engagement. Both elements are examined below, with an emphasis on empirical support for their effectiveness.

1) Introducing student activity into the traditional lecture: On the simplest level, active learning is introducing student activity into the traditional lecture. One example of this is for the lecturer to pause periodically and have students clarify their notes with a partner. This can be done two or three times during an hour-long class. Because this pause procedure is so simple, it provides a baseline to study whether short, informal student activities can improve the effectiveness of lectures.

Ruhl et al. [33] show some significant results of adopting this pause procedure. In a study involving 72 students over two courses in each of two semesters, the researchers examined the effect of interrupting a 45-minute lecture three times with two-minute breaks during which students worked in pairs to clarify their notes. In parallel with this approach, they taught a separate group using a straight lecture and then tested short- and long-term retention of lecture material. Short-term retention was assessed by a free-recall exercise in which students wrote down everything they could remember in three minutes after each lecture, and results were scored by the number of correct facts recorded. Short-term recall with the pause procedure averaged 108 correct facts, compared to 80 correct facts recalled in classes with straight lecture. Long-term retention was assessed with a 65-question multiple-choice exam given one and a half weeks after the last of the five lectures used in the study. Test scores were 89.4 with the pause procedure compared to 80.9 without the pause for one class, and 80.4 with the pause procedure compared to 72.6 with no pause for the other class. Further support for the effectiveness of pauses during the lecture is provided by Di Vesta [34].

Many proponents of active learning suggest that the effectiveness of this approach has to do with student attention span during lecture. Wankat [35] cites numerous studies suggesting that student attention span during lecture is roughly fifteen minutes. After that, Hartley and Davies [36] found, the number of students paying attention begins to drop dramatically, with a resulting loss in retention of lecture material. The same authors found that immediately after the lecture students remembered 70 percent of the information presented in the first ten minutes and 20 percent of the information presented in the last ten minutes. Breaking up the lecture might work because students' minds start to wander, and activities provide the opportunity to start fresh again, keeping students engaged.

July 2004 Journal of Engineering Education 225

2) Promoting Student Engagement: Simply introducing activity into the classroom fails to capture an important component of active learning. The type of activity, for example, influences how much classroom material is retained [34]. In “Understanding by Design” [37], the authors emphasize that good activities develop deep understanding of the important ideas to be learned. To do this, the activities must be designed around important learning outcomes and must promote thoughtful engagement on the part of the student. The activity used by Ruhl, for example, encourages students to think about what they are learning. Adopting instructional practices that engage students in the learning process is the defining feature of active learning.

The importance of student engagement is widely accepted, and there is considerable evidence to support the effectiveness of student engagement on a broad range of learning outcomes. Astin [38] reports that student involvement is one of the most important predictors of success in college. Hake [39] examined pre- and post-test data for over 6,000 students in introductory physics courses and found significantly improved performance for students in classes with substantial use of interactive-engagement methods. Test scores measuring conceptual understanding were roughly twice as high in classes promoting engagement as in traditional courses. Statistically, this was an improvement of two standard deviations above that of traditional courses. Other results supporting the effectiveness of active-engagement methods are reported by Redish et al. [40] and Laws et al. [41]. Redish et al. show that the improved learning gains are due to the nature of active engagement and not to extra time spent on a given topic. Figure 1, taken from Laws et al., shows that active-engagement methods surpass traditional instruction for improving conceptual understanding of basic physics concepts. The differences are quite significant. Taken together, the studies of Hake, Redish et al., and Laws et al. provide considerable support for active-engagement methods, particularly for addressing students' fundamental misconceptions. The importance of addressing student misconceptions has recently been recognized as an essential element of effective teaching [42].

In summary, considerable support exists for the core elements of active learning. Introducing activity into lectures can significantly improve the recall of information, while extensive evidence supports the benefits of student engagement.

B. Collaborative Learning

The central element of collaborative learning is collaborative versus individual work, and the analysis therefore focuses on how collaboration influences learning outcomes. The results of existing meta-studies on this question are consistent. In a review of 90 years of research, Johnson, Johnson, and Smith found that cooperation improved learning outcomes relative to individual work across the board [12]. Similar results were found in an updated study by the same authors [13] that looked at 168 studies between 1924 and 1997. Springer et al. [43] found similar results looking at 37 studies of students in science, mathematics, engineering, and technology. Reported results for each of these studies are shown in Table 1, using effect sizes to show the impact of collaboration on a range of learning outcomes.

What do these results mean in real terms instead of effect sizes, which are sometimes difficult to interpret? With respect to academic achievement, the lowest of the three studies cited would move a student from the 50th to the 70th percentile on an exam. In absolute terms, this change is consistent with raising a student's grade from 75 to 81, given classical assumptions about grade distributions.* With respect to retention, the results suggest that collaboration reduces attrition in technical programs by 22 percent, a significant finding when technical programs are struggling to attract and retain students. Furthermore, some evidence suggests that collaboration is particularly effective for improving the retention of traditionally underrepresented groups [44, 45].

Figure 1. Active-engagement vs. traditional instruction for improving students' conceptual understanding of basic physics concepts (taken from Laws et al., 1999)

Table 1. Collaborative vs. individualistic learning: Reported effect size of the improvement in different learning outcomes.
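The percentile and grade conversions above follow directly from the footnote's assumptions (an effect size of 0.5, an exam mean of 75, and normally distributed grades with the top 10 percent at 90 or above). A minimal Python sketch of that arithmetic, for readers who want to apply it to other effect sizes:

```python
from statistics import NormalDist

# Assumptions from the paper's footnote: normally distributed exam scores
# with mean 75, and the top 10 percent of students scoring 90 or higher.
# The latter pins down the standard deviation.
z90 = NormalDist().inv_cdf(0.90)      # z-score of the 90th percentile (~1.28)
SD = (90 - 75) / z90                  # ~11.7 points

def shifted_score(effect_size, mean=75.0):
    """Exam score of a formerly average student after a shift of `effect_size` SDs."""
    return mean + effect_size * SD

def new_percentile(effect_size):
    """Percentile reached by a student who started at the 50th percentile."""
    return 100 * NormalDist().cdf(effect_size)

print(round(shifted_score(0.5)))    # 75 -> 81, the collaborative-learning example
print(round(new_percentile(0.5)))   # 50th -> ~69th percentile (the text rounds to 70th)
print(round(shifted_score(0.88)))   # 75 -> 85, the cooperative-learning example
```

The same two-line conversion reproduces the 0.88 cooperative-learning figure discussed later, so the "classic" example generalizes to any reported effect size under these distributional assumptions.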

A related question of practical interest is whether the benefits of group work improve with frequency. Springer et al. looked specifically at the effect of incorporating small, medium, and large amounts of group work on achievement and found the positive effect sizes associated with low, medium, and high amounts of time in groups to be 0.52, 0.73, and 0.53, respectively. That is, the highest benefit was found for a medium amount of time in groups. In contrast, more time spent in groups did produce the highest effect on promoting positive student attitudes, with low, medium, and high amounts of time in groups having effect sizes of 0.37, 0.26, and 0.77, respectively. Springer et al. note that the attitudinal results were based on a relatively small number of studies.

In summary, a number of meta-analyses support the premise that collaboration "works" for promoting a broad range of student learning outcomes. In particular, collaboration enhances academic achievement, student attitudes, and student retention. The magnitude, consistency, and relevance of these results strongly suggest that engineering faculty promote student collaboration in their courses.

C. Cooperative Learning

At its core, cooperative learning is based on the premise that cooperation is more effective than competition among students for producing positive learning outcomes. This premise is examined in Table 2.

The reported results are consistently positive. Indeed, looking at high-quality studies with good internal validity, the already large effect size of 0.67 shown in Table 2 for academic achievement increases to 0.88. In real terms, this would increase a student's exam score from 75 to 85 in the "classic" example cited previously, though of course this specific result depends on the assumed grade distribution. As seen in Table 2, cooperation also promotes interpersonal relationships, improves social support, and fosters self-esteem.

Another issue of interest to engineering faculty is that cooperative learning provides a natural environment in which to promote effective teamwork and interpersonal skills. For engineering faculty, the need to develop these skills in their students is reflected in the ABET engineering criteria. Employers frequently identify team skills as a critical gap in the preparation of engineering students. Since practice is a precondition of learning any skill, it is difficult to argue that individual work in traditional classes does anything to develop team skills.

Whether cooperative learning effectively develops interpersonal skills is another question. Part of the difficulty in answering that question stems from how one defines and measures team skills. Still, there is reason to think that cooperative learning is effective in this area. Johnson et al. [12, 13] recommend explicitly training students in the skills needed to be effective team members when using cooperative learning groups. It is reasonable to assume that the opportunity to practice interpersonal skills, coupled with explicit instruction in these skills, is more effective than traditional instruction, which emphasizes individual learning and generally includes no explicit instruction in teamwork. There is also empirical evidence to support this conclusion. Johnson and Johnson report that social skills tend to increase more within cooperative than within competitive or individualistic situations [46]. Terenzini et al. [47] show that students report increased team skills as a result of cooperative learning. In addition, Panitz [48] cites a number of benefits of cooperative learning for developing the interpersonal skills required for effective teamwork.

In summary, there is broad empirical support for the central premise of cooperative learning: that cooperation is more effective than competition for promoting a range of positive learning outcomes. These results include enhanced academic achievement and a number of attitudinal outcomes. In addition, cooperative learning provides a natural environment in which to enhance interpersonal skills, and there are rational arguments and evidence showing the effectiveness of cooperation in this regard.

D. Problem-Based Learning

As mentioned in Section II of this paper, the first step in determining whether an educational approach works is clarifying exactly what the approach is. Unfortunately, while there is agreement on the general definition of PBL, implementation varies widely. Woods et al. [16], for example, discuss several variations of PBL:

“Once a problem has been posed, different instructional methods may be used to facilitate the subsequent learning process: lecturing, instructor-facilitated discussion, guided decision making, or cooperative learning. As part of the problem-solving process, student groups can be assigned to complete any of the learning tasks listed above, either in or out of class. In the latter case, three approaches may be adopted to help the groups stay on track and to monitor their progress: (1) give the groups written feedback after each task; (2) assign a tutor or teaching assistant to each group; or (3) create fully autonomous, self-assessed ‘tutorless’ groups.”

*Calculated using an effect size of 0.5, a mean of 75, and a normalized grade distribution where the top 10 percent of students receive a 90 or higher (an A) and the bottom 10 percent receive a 60 or lower (an F).

Table 2. Cooperative vs. competitive learning: Reported effect size of the improvement in different learning outcomes.

The large variation in PBL practices makes the analysis of its effectiveness more complex. Many studies comparing PBL to traditional programs are simply not talking about the same thing. For meta-studies of PBL to show any significant effect compared to traditional programs, the signal from the common elements of PBL would have to be greater than the noise produced by differences in the implementation of both PBL and the traditional curricula. Given the huge variation in PBL practices, not to mention differences in traditional programs, readers should not be surprised if no consistent results emerge from meta-studies that group together different PBL methods.

Despite this, there is at least one generally accepted finding that emerges from the literature: PBL produces positive student attitudes. Vernon and Blake [19], looking at 35 studies from 1970 to 1992 for medical programs, found that PBL produced a significant effect size (0.55) for improved student attitudes and opinions about their programs. Albanese and Mitchell [20] similarly found that students and faculty generally prefer the PBL approach. Norman and Schmidt [18] argue that "PBL does provide a more challenging, motivating and enjoyable approach to education. That may be a sufficient raison d'etre, providing the cost of the implementation is not too great." Note that these and most of the results reported in this section come from studies of medical students, for whom PBL has been widely used. While PBL has been used in undergraduate engineering programs [49, 50], there is very little data available on its effectiveness with this population of students.

Beyond producing positive student attitudes, the effects of PBL are less generally accepted, though other supporting data do exist. Vernon and Blake [19], for example, present evidence of a statistically significant improvement from PBL in students' clinical performance, with an effect size of 0.28. However, Colliver [22] points out that this result is influenced strongly by one outlying study with a positive effect size of 2.11, which skews the data. There is also evidence that PBL improves the long-term retention of knowledge compared to traditional instruction [51–53]. Evidence also suggests that PBL promotes better study habits among students. As one might expect from an approach that requires more independence from students, PBL has frequently been shown to increase library use, textbook reading, class attendance, and studying for meaning rather than simple recall [19, 20, 53, 54].

We have already discussed the problems with meta-studies that compare non-uniform and inconsistently defined educational interventions. Such studies are easily prone to factors that obscure results. The approach for handling this difficulty with active, collaborative, and cooperative learning was to identify the central element of the approach and to focus on this rather than on implementation methods. That is more difficult to do with PBL, since it is not clear that one or two core elements exist. PBL is active, engages students, and is generally collaborative, all of which are supported by our previous analysis. It is also inductive, generally self-directed, and often includes explicit training in necessary skills. Can one or two elements be identified as common or decisive?

Norman and Schmidt [18] provide one way around this difficulty by identifying several components of PBL in order to show how they impact learning outcomes. Their results are shown in Table 3, taken directly from Norman and Schmidt using the summary of meta-studies provided by Lipsey and Wilson [17]. The measured learning outcome for all educational studies cited by Lipsey and Wilson was academic achievement.

Norman and Schmidt present this table to illustrate how different elements of PBL have different effects on learning outcomes. However, the substantive findings of Table 3 are also worth highlighting for faculty interested in adopting PBL, because there seems to be considerable agreement on what works and what does not work in PBL.

Looking first at the negative effects, there is a significant negative effect size for using PBL with non-expert tutors. This finding is consistent with some of the literature on helping students make the transition from novice to expert problem solvers. Research comparing experts to novices in a given field has demonstrated that becoming an expert is not just a matter of "good thinking" [42]. Instead, research has demonstrated the necessity for experts to have both a deep and broad foundation of factual knowledge in their fields. The same appears to be true for tutors in PBL.

There is also a small negative effect associated with both self-paced and self-directed learning. This result is consistent with the findings of Albanese and Mitchell [20] on the effect of PBL on test results. In seven out of ten cases, they found that students in PBL programs scored lower than students in traditional programs on tests of basic science. However, in three out of ten cases, PBL students actually scored higher. Albanese and Mitchell note that these three PBL programs were more "directive" than others, indicating that this element might be responsible for the superior exam performance of students in those programs. Therefore, faculty might be advised to be cautious about the amount of self-direction required of students in PBL, at least with regard to promoting academic achievement as measured by traditional exams.

Table 3. Effect sizes for academic achievement associated with various aspects of problem-based learning.

Looking at what seems to work, there are significant positive effect sizes associated with placing students in small groups and using cooperative learning structures. This is consistent with much of the literature cited previously in support of cooperative learning. While PBL and cooperative learning are distinct approaches, there is a natural synergy that instructors should consider exploiting. That is, real problems of the sort used in PBL require teams to solve effectively. At the same time, the challenge provided by realistic problems can provide some of the mutual interdependence that is one of the five tenets of cooperative learning.

Table 3 also shows that positive results come from instruction in problem solving. This is consistent with much of the advice given by proponents of problem-based learning [55]. While practice is crucial for mastering skills such as problem solving, greater gains are realized through explicit instruction in problem-solving skills. However, traditional engineering courses do not generally teach problem-solving skills explicitly. Table 3 suggests that faculty using PBL consider doing just that.

In conclusion, PBL is difficult to analyze because no one or two core elements can be clearly identified with student learning outcomes. Perhaps the closest candidates for core elements would be inductive or discovery learning. These have been shown by meta-studies to have only weakly positive effects on student academic achievement [56, 57] as measured by exams. This might explain why PBL similarly shows no improvement in student test scores, the most common measure of academic achievement.

However, while no evidence proves that PBL enhances academic achievement as measured by exams, there is evidence to suggest that PBL "works" for achieving other important learning outcomes. Studies suggest that PBL develops more positive student attitudes, fosters a deeper approach to learning, and helps students retain knowledge longer than traditional instruction. Further, just as cooperative learning provides a natural environment in which to promote interpersonal skills, PBL provides a natural environment for developing problem-solving and life-long learning skills. Indeed, some evidence shows that PBL develops enhanced problem-solving skills in medical students and that these skills can be improved further by coupling PBL with explicit instruction in problem solving. Similarly, supporting arguments can be made about PBL and the important ABET engineering outcome of life-long learning. Since self-directed learning and meta-cognition are common to both PBL and life-long learning, a logical connection exists between these desired learning outcomes and PBL instruction, something often not true when trying to promote life-long learning through traditional teaching methods.

V. CONCLUSIONS

Although the results vary in strength, this study has found support for all forms of active learning examined. Some of the findings, such as the benefits of student engagement, are unlikely to be controversial, although the magnitude of the improvements resulting from active-engagement methods may come as a surprise. Other findings challenge traditional assumptions about engineering education, and these are most worth highlighting.

For example, students will remember more content if brief activities are introduced into the lecture. Contrast this with the prevalent content tyranny that encourages faculty to push through as much material as possible in a given session. Similarly, the support for collaborative and cooperative learning calls into question the traditional assumptions that individual work and competition best promote achievement. The best available evidence suggests that faculty should structure their courses to promote collaborative and cooperative environments. The entire course need not be team-based, as seen in the evidence of Springer et al. [43], nor must individual responsibility be absent, as seen in the emphasis on individual accountability in cooperative learning. Nevertheless, extensive and credible evidence suggests that faculty consider a nontraditional model for promoting academic achievement and positive student attitudes.

Problem-based learning presents the most difficult method to analyze because it includes a variety of practices and lacks a dominant core element to facilitate analysis. Rather, different implementations of PBL emphasize different elements, some more effective for promoting academic achievement than others. Based on the literature, faculty adopting PBL are unlikely to see improvements in student test scores, but are likely to positively influence student attitudes and study habits. Studies also suggest that students will retain information longer and perhaps develop enhanced critical-thinking and problem-solving skills, especially if PBL is coupled with explicit instruction in these skills.

Teaching cannot be reduced to formulaic methods, and active learning is not the cure for all educational problems. However, there is broad support for the elements of active learning most commonly discussed in the educational literature and analyzed here. Some of the findings are surprising and deserve special attention. Engineering faculty should be aware of these different instructional methods and make an effort to have their teaching informed by the literature on "what works."

ACKNOWLEDGMENTS

The author would like to thank Richard Felder for his thoughtful critique of this work and for many similar pieces of advice over the past several years. The National Science Foundation, through Project Catalyst (NSF 9972758), provided financial support for this project.

REFERENCES

[1] Bonwell, C.C., and J.A. Eison, "Active Learning: Creating Excitement in the Classroom," ASHE-ERIC Higher Education Report No. 1, George Washington University, Washington, DC, 1991.

[2] Online Collaborative Learning in Higher Education, <http://clp.cqu.edu.au/glossary.htm>, accessed 12/3/2003.

[3] Millis, B., and P. Cottell, Jr., Cooperative Learning for Higher Education Faculty, American Council on Education, ORYX Press, 1998.

[4] Smith, B., and J. MacGregor, "What is Collaborative Learning?," in Goodsell, A., M. Maher, V. Tinto, B.L. Smith, and J. MacGregor (Eds.), Collaborative Learning: A Sourcebook for Higher Education (pp. 9–22), University Park, PA: National Center on Postsecondary Teaching, Learning and Assessment, 1992.


[5] Cuseo, J., "Collaborative & Cooperative Learning in Higher Education: A Proposed Taxonomy," Cooperative Learning and College Teaching, Vol. 2, No. 2, 1992, pp. 2–4.

[6] Bean, J., Engaging Ideas: The Professor's Guide to Integrating Writing, Critical Thinking, and Active Learning in the Classroom, Jossey-Bass: San Francisco, 1996.

[7] Felder, R., R. Brent, and J. Stice, "National Effective Teaching Institute: Workshop Materials," 2002 American Society for Engineering Education Annual Conference, Montreal, Quebec, Canada, 2002.

[8] Bruffee, K., "Sharing Our Toys: Cooperative Learning Versus Collaborative Learning," Change, January/February 1995, p. 12.

[9] Panitz, T., "Collaborative Versus Cooperative Learning: A Comparison of the Two Concepts Which Will Help Us Understand the Underlying Nature of Interactive Learning," <http://capecod.net/~tpanitz/tedsarticles/coopdefinition.htm>, accessed 12/2/2003.

[10] <http://www.wcer.wisc.edu/nise/CL1/CL/question/TQ13.htm>, accessed 12/3/2003.

[11] Feden, P., and R. Vogel, Methods of Teaching: Applying Cognitive Science to Promote Student Learning, McGraw-Hill Higher Education, 2003.

[12] Johnson, D., R. Johnson, and K. Smith, Active Learning: Cooperation in the College Classroom, 2nd ed., Interaction Book Co., Edina, MN, 1998.

[13] Johnson, D., R. Johnson, and K. Smith, "Cooperative Learning Returns to College: What Evidence is There That it Works?," Change, Vol. 30, No. 4, July/Aug. 1998, pp. 26–35.

[14] Stahl, R., "The Essential Elements of Cooperative Learning in the Classroom," ERIC Digest ED370881, 1994, available online at <http://www.ericfacility.net/ericdigests/ed370881.html>.

[15] Slavin, R., "Cooperative Learning. Research on Teaching Monograph Series," ERIC Digest ED242707, 1983.

[16] Woods, D., R. Felder, A. Rugarcia, and J. Stice, "The Future of Engineering Education. III. Developing Critical Skills," Chemical Engineering Education, Vol. 34, No. 2, 2000, pp. 108–117.

[17] Lipsey, M.W., and D.B. Wilson, "The Efficacy of Psychological, Educational and Behavioral Treatment: Confirmation from Meta-Analysis," American Psychologist, Vol. 48, No. 12, 1993, pp. 1181–1209.

[18] Norman, G., and H. Schmidt, "Effectiveness of Problem-Based Learning Curricula: Theory, Practice and Paper Darts," Medical Education, Vol. 34, 2000, pp. 721–728.

[19] Vernon, D., and R. Blake, "Does Problem-Based Learning Work? A Meta-Analysis of Evaluative Research," Academic Medicine, Vol. 68, No. 7, July 1993.

[20] Albanese, M., and S. Mitchell, "Problem-Based Learning: A Review of Literature on Its Outcomes and Implementation Issues," Academic Medicine, Vol. 68, No. 1, January 1993.

[21] Qin, Z., D. Johnson, and R. Johnson, "Cooperative Versus Competitive Efforts and Problem Solving," Review of Educational Research, Vol. 65, No. 2, Summer 1995, p. 129.

[22] Colliver, J., "Effectiveness of Problem-Based Learning Curricula," Academic Medicine, Vol. 75, 2000, p. 259.

[23] Albanese, M., "Problem-Based Learning: Why Curricula Are Likely to Show Little Effect on Knowledge and Clinical Skills," Medical Education, Vol. 34, No. 9, 2000, p. 729.

[24] Cohen, J., Statistical Power Analysis for the Behavioral Sciences, Revised Edition, Englewood Cliffs, NJ: Erlbaum, 1977.

[25] Dubin, R., and T. Taveggia, "The Teaching-Learning Paradox: A Comparative Analysis of College Teaching Methods," University of Oregon: Center for the Advanced Study of Educational Administration, 1968.

[26] Bracey, G., "Tips for Readers of Research: Beware the 'Classic Study'," Phi Delta Kappan, Vol. 83, No. 8, April 2002, p. 642.

[27] Felder, R., D. Woods, J. Stice, and A. Rugarcia, "The Future of Engineering Education: II. Teaching Methods that Work," Chemical Engineering Education, Vol. 34, No. 1, 2000, pp. 26–39.

[28] Chickering, A., and Z. Gamson, "Seven Principles for Good Practice," AAHE Bulletin, Vol. 39, ED 282 491, March 1987, pp. 3–7.

[29] McKeachie, W., "Research on College Teaching," Educational Perspectives, Vol. 11, No. 2, May 1972, pp. 3–10.

[30] Sorcinelli, M., "Research Findings on the Seven Principles," in A.W. Chickering and Z.F. Gamson (Eds.), Applying the Seven Principles for Good Practice in Undergraduate Education, New Directions in Teaching and Learning, No. 47, San Francisco: Jossey-Bass, 1991.

[31] <http://trc.ucdavis.edu/trc/active/defini.html>, accessed 12/3/03.

[32] MacGregor, J., J. Cooper, K. Smith, and P. Robinson (Eds.), "Strategies for Energizing Large Classes: From Small Groups to Learning Communities," New Directions for Teaching and Learning, Vol. 81, Jossey-Bass, 2000.

[33] Ruhl, K., C. Hughes, and P. Schloss, "Using the Pause Procedure to Enhance Lecture Recall," Teacher Education and Special Education, Vol. 10, Winter 1987, pp. 14–18.

[34] Di Vesta, F., and D. Smith, "The Pausing Principle: Increasing the Efficiency of Memory for Ongoing Events," Contemporary Educational Psychology, Vol. 4, 1979.

[35] Wankat, P., The Effective Efficient Professor: Teaching, Scholarshipand Service, Allyn and Bacon: Boston, MA, 2002.

[36] Hartley, J., and Davies, I., “Note Taking: A Critical Review,”Programmed Learning and Educational Technology, Vol. 15, 1978,pp. 207–224.

[37] Wiggins, G., and J. McTighe, “Understanding by Design,”Merrill Education/ASCD College Textbook Series, ASCD, Alexandria,Virginia, 1998.

[38] Astin, A., What Matters in College?; Four Critical Years Revisited,Josey-Bass: San Francisco, CA, 1993.

[39] Hake, R., “Interactive-Engagement vs. Traditional Methods: ASix-Thousand-Student Survey of Mechanics Test Data for IntroductoryPhysics Courses,” American Journal of Physics, Vol. 66, No. 1, 1998, p. 64.

[40] Redish, E., J. Saul, and R. Steinberg, “On the Effectiveness ofActive-Engagement Microcomputer-Based Laboratories,” AmericanJournal of Physics, Vol. 65, No. 1, 1997, p. 45.

[41] Laws, P., D. Sokoloff, and R. Thornton, “Promoting ActiveLearning Using the Results of Physics Education Research,” UniServeScience News, Vol. 13, July 1999.

[42] Bransford, J., A. Brown, and R. Cocking, (Commission on Be-havioral and Social Science and Education, National Research Council),“How People Learn: Body, Mind, Experience and School,” NationalAcademy Press, Washington D.C., 2000. Available online at�http://www.nap.edu/html/howpeople1/�.

[43] Springer, L., M. Stanne, and S. Donovan, “Effects of Small-Group Learning on Undergraduates in Science, Mathematics, Engineer-ing and Technology: A Meta-Analysis,” Review of Educational Research,Vol. 69, No. 1, 1999, pp. 21–52.

[44] Fredericksen, E., “Minority Students and the Learning Commu-nity Experience: A Cluster Experiment,” U.S.: Texas 1998-0400,ED423533, 1998.

[45] Berry, Jr., L., “Collaborative Learning: A Program for Improvingthe Retention of Minority Students, U.S.: Virginia, 1991-00-00,ED384323, 1991.

230 Journal of Engineering Education July 2004



AUTHOR’S BIOGRAPHY

Dr. Michael Prince is a professor in the Department of Chemical Engineering at Bucknell University, where he has been since receiving his Ph.D. from the University of California at Berkeley in 1989. He is the author of several education-related papers for engineering faculty and gives faculty development workshops on active learning. He is currently participating in Project Catalyst, an NSF-funded initiative to help faculty re-envision their role in the learning process.

Address: Department of Chemical Engineering, Bucknell University, Lewisburg, PA 17837; telephone: 570-577-1781; e-mail: [email protected]



Chemical Engineering Education, 45(2), 131-132 (Spring 2011)

Richard M. Felder is Hoechst Celanese Professor Emeritus of Chemical Engineering at North Carolina State University. He is co-author of Elementary Principles of Chemical Processes (Wiley, 2005) and numerous articles on chemical process engineering and engineering and science education, and regularly presents workshops on effective college teaching at campuses and conferences around the world. Many of his publications can be seen at <www.ncsu.edu/effective_teaching>.

Random Thoughts . . .

HANG IN THERE! Dealing with Student Resistance to Learner-Centered Teaching

Richard M. Felder

Dear Dr. Felder,

What can I do about low teaching evaluations from students I teach actively when what they clearly want is much more traditional (passive ride, smooth highway please)? I'm about ready to give up and return to just lecturing, as I am sure students will evaluate my courses higher if I do. Thank you for your time and consideration.

Sincerely, _____________

* * *

Dear ____________,

Before I respond to your question, let me assure you that I get it. Learner-centered teaching methods like active and cooperative and problem-based learning make students take more responsibility for their learning than traditional teacher-centered methods do, and the students are not necessarily thrilled about it. All college instructors who have tried the former methods have experienced student resistance, and if they were getting high evaluations when they taught traditionally, their ratings may have dropped when they made the switch. As you've discovered, it doesn't feel good when that happens, so it will be understandable if you decide to go back to teaching classes where you just lecture and the students just listen (or text or surf or daydream or sleep).

Please think about a couple of things before you make your decision, however. An important part of our job as teachers is equipping as many of our students as possible with high-level problem-solving and thinking skills, including critical and creative thinking. If there's broad agreement about anything in educational research, it's that well-implemented learner-centered instruction is much more effective than traditional lecture-based instruction at promoting those skills. (If you'd like to check the research for yourself, the attached bibliography suggests some good starting points.) It's true that many students want us to simply tell them up front in our lectures everything they need to know for the exam rather than challenging them to figure any of it out for themselves. If we give them that, though, we are failing those who have an aptitude for high-level thinking and problem solving but might not develop those skills without the guidance, practice, and feedback learner-centered methods provide. That failure is a high price for us to pay to get better student ratings, and we might not even get them by staying traditional. Teachers whose evaluations are not all that high to begin with commonly see their ratings increase when they adopt a more learner-centered approach.

I don’t know what your institution is like, but here’s the way things go at the universities and colleges I’ve visited. Most instructors teach traditionally but there are quite a few who use active learning and other learner-centered methods, including some of the best teachers on the campus—the ones who routinely get excellent performance and high ratings from their students, teaching awards, and wedding invitations and birth announcements from their former students. At some point another faculty member may decide to try, say, active

© Copyright ChE Division of ASEE 2011

Page 30: active learning 2013crlte.engin.umich.edu/.../2013/06/W13ActiveLearningPres.pdf · 2019-09-10 · A practical introduction to active learning for busy skeptics April 9, 2013 Michael

Chemical Engineering Education132

All of the Random Thoughts columns are now available on the World Wide Web athttp://www.ncsu.edu/effective_teaching and at http://che.ufl.edu/~cee/

learning, perhaps after attending a workshop or reading a paper or constantly hearing about the superb student responses their gifted colleague always enjoys. He or she tries it and it doesn’t go well—the evaluations are mediocre and some students grumble that their professor made them do all the work instead of teaching them.* Instructors in this situation can easily conclude that the nontraditional methods caused their poor ratings. What that conclusion doesn’t explain, however, is how that talented colleague of theirs can use the same methods on the same students and get good performance and glowing reviews.

Whenever I’ve explored this issue with instructors dis-tressed by it, I have invariably found that the teaching method they were trying was not the real problem. It was either that they were making one or more mistakes in implementing the method, or something else was troubling the students and the method was a convenient scapegoat. So, if you’ve used a learner-centered method, didn’t like the outcomes, and would like to do some exploring, you might start with these questions:

• In your student evaluations, were complaints limited to the method, or did they also relate to other things such as the length of your assignments and exams, the clarity of your lecturing, or your lack of availability and/or respect for students? If they did, consider addressing those complaints before abandoning the method.

• Did you explain to the students why you were using the method? If you tell them you’re doing it because research has shown that it leads to improved learning, greater acquisition of skills that potential employers consider valuable, and higher grades, most will set aside their objections long enough to find that you’re telling the truth. (See Reference 2 in the bibliography.)

• Did you use the new method long enough to overcome the learning curve associated with it? It can take most of a semester to become comfortable with and adept at active learning, and if you're using a more complex technique such as cooperative or problem-based learning and you're not being mentored by an expert, it might take several years.

• If you got unsatisfactory student ratings, did you check references on the method to see if you were doing something wrong? For example, did you assign small-group activities in class that lasted for more than 2–3 minutes or call for volunteers to respond every time? (See Reference 4 to find out how both practices can kill the effectiveness of active learning.) The bibliography suggests references you might consult for each of the most common learner-centered methods.

• In your midterm evaluations, did you specifically ask the students whether they thought active learning (or whatever you were doing) was (a) helping their learning, (b) hindering their learning, or (c) neither helping nor hindering? If you do this, you may find that the students objecting vigorously to the method are only a small minority of the class. If that's so, announce the survey results in the next class session. Students who complain about learner-centered methods often imagine that they are speaking for most of their classmates. Once they find out that very few others feel the way they do, the grumbling tends to disappear immediately.
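Turning that three-option midterm survey into announceable percentages is a trivial tally; here is an illustrative sketch (the response data and letter codes are invented for the example, not drawn from the column):

```python
from collections import Counter

# Hypothetical midterm survey responses, one letter per student:
# "h" = the method is helping my learning, "x" = hindering, "n" = neither
responses = ["h", "h", "n", "x", "h", "n", "h", "h", "x", "h", "n", "h"]

tally = Counter(responses)
total = len(responses)
for code, label in [("h", "helping"), ("x", "hindering"), ("n", "neither")]:
    print(f"{label}: {tally[code]}/{total} ({100 * tally[code] / total:.0f}%)")
```

In this made-up class only 2 of 12 students report being hindered, which is exactly the kind of result worth announcing to defuse the vocal minority.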

If your answers to any of those questions suggest that making some changes in your approach to the method and trying again might be worthwhile, consider doing it. If you conclude, however, that you've done all you can and going back to traditional teaching is your only viable course of action, then so be it. I hope you choose the first option, but it's totally your call.

Best regards, and good luck, Richard Felder

BIBLIOGRAPHY

1. Bullard, L.G., and R.M. Felder, "A Learner-Centered Approach to Teaching Material and Energy Balances. 1. Course Design," Chem. Eng. Ed., 41(2), 93 <http://www.ncsu.edu/felder-public/Papers/StoichPap-pt1.pdf>; "2. Course Instruction and Assessment," Chem. Eng. Ed., 41(3), 167 <http://www.ncsu.edu/felder-public/Papers/StoichPap-pt2.pdf> (2007)

2. Felder, R.M., "Sermons for Grumpy Campers," Chem. Eng. Ed., 41(3), 183 <http://www.ncsu.edu/felder-public/Columns/Sermons.pdf> (2007)

3. Felder, R.M., and R. Brent, "Cooperative Learning," in P.A. Mabrouk, ed., Active Learning: Models from the Analytical Sciences, ACS Symposium Series 970, Chapter 4, 34–53, Washington, DC: American Chemical Society <http://www.ncsu.edu/felder-public/Papers/CL-Chapter.pdf> (2007)

4. Felder, R.M., and R. Brent, "Active Learning: An Introduction," ASQ Higher Education Brief, 2(4) <http://www.ncsu.edu/felder-public/Papers/ALpaper(ASQ).pdf> (2009)

5. Prince, M.J., "Does Active Learning Work? A Review of the Research," J. Eng. Ed., 93(3), 223 <http://www.ncsu.edu/felder-public/Papers/Prince_AL.pdf> (2004)

6. Prince, M.J., and R.M. Felder, "Inductive Teaching and Learning Methods: Definitions, Comparisons, and Research Bases," J. Eng. Ed., 95(2), 123 <http://www.ncsu.edu/felder-public/Papers/InductiveTeaching.pdf> (2006) (Inductive methods include inquiry-based, problem-based, and project-based learning.)

* My favorite student evaluation came from someone who wrote “Felder really makes us think!” It was on his list of the three things he disliked most about the course.


Resistance Paper

http://www4.ncsu.edu/unity/lockers/users/f/felder/public/Papers/Resist.html

NAVIGATING THE BUMPY ROAD TO STUDENT-CENTERED INSTRUCTION

Richard M. Felder
Department of Chemical Engineering
North Carolina State University
Raleigh, NC 27695-7905

Rebecca Brent
School of Education
East Carolina University
Greenville, NC 27858

Authors' note: An abridged version of this paper was published in College Teaching, 44, 43-47 (1996).

INTRODUCTION

In the traditional approach to higher education, the burden of communicating course material resides primarily with the instructor. In student-centered instruction (SCI), some of this burden is shifted to the students. SCI is a broad approach that includes such techniques as substituting active learning experiences for lectures, holding students responsible for material that has not been explicitly discussed in class, assigning open-ended problems and problems requiring critical or creative thinking that cannot be solved by following text examples, involving students in simulations and role-plays, assigning a variety of unconventional writing exercises, and using self-paced and/or cooperative (team-based) learning. In traditional instruction, the teacher's primary functions are lecturing, designing assignments and tests, and grading; in SCI, the teacher still has these functions but also provides students with opportunities to learn independently and from one another and coaches them in the skills they need to do so effectively. In recent decades, the education literature has described a wide variety of student-centered instructional methods and offered countless demonstrations that properly implemented SCI leads to increased motivation to learn, greater retention of knowledge, deeper understanding, and more positive attitudes toward the subject being taught (Bonwell and Eison 1991; Johnson, Johnson, and Smith 1991a,b; McKeachie 1986; Meyers and Jones 1993).

We use student-centered instruction extensively in our courses and discuss it in teaching workshops we present to faculty members and graduate teaching assistants. The workshop participants generally fall into two categories. On the one hand are the skeptics, who come up with all sorts of creative reasons why student-centered methods could not possibly work. On the other hand are the converts, who are sold on SCI and can't wait to try it. We know the fears teachers have about the instructional methods we advocate, having had most of them ourselves, and we can usually satisfy most of the skeptics that some of the problems they anticipate will not occur and the others are solvable. We worry more about the enthusiasts who leave the workshop ready to plunge right in, imagining that the spectacular results promised by the literature will show up immediately.

The enthusiasts may be in for a rude shock. It's not that SCI doesn't work when done correctly; it does, as both the literature and our personal experience in two strikingly different disciplines richly attest. The problem is that while the promised benefits are real, they are neither immediate nor automatic. The students, whose teachers have been telling them everything they needed to know from the first grade on, don't necessarily appreciate having this support suddenly withdrawn. Some students view the approach as a threat or as some kind of game, and a few may become sullen or hostile when they find they have no choice about playing. When confronted with a need to take more responsibility for their own learning, they may grouse that they are paying tuition, or their parents are paying taxes, to be taught, not to teach themselves. If cooperative learning is a feature of the instruction, they may gripe loudly and bitterly about other team members not pulling their weight or about having to waste time explaining everything to slower teammates. Good lecturers may feel awkward when they start using student-centered methods and their course-end ratings may initially drop. It's tempting for instructors to give up in the face of all that, and many unfortunately do.

Giving up is a mistake. SCI may impose steep learning curves on both instructors and students, and the initial instructor awkwardness and student hostility are both common and natural. The key for the instructors is to understand how the process works, take some precautionary steps to smooth out the bumps, and wait out the inevitable setbacks until the payoffs start emerging.

TRADITIONAL STUDENTS IN A NONTRADITIONAL CLASS: A PAINFUL ODYSSEY

Woods (1994) observes that students forced to take major responsibility for their own learning go through some or all of the steps psychologists associate with trauma and grief:

1. Shock: "I don't believe it - we have to do homework in groups and she isn't going to lecture on the chapter before the problems are due?"

2. Denial: "She can't be serious about this - if I ignore it, it will go away."

3. Strong emotion: "I can't do it - I'd better drop the course and take it next semester" or "She can't do this to me - I'm going to complain to the department head!"

4. Resistance and withdrawal: "I'm not going to play her dumb games - I don't care if she fails me."

5. Surrender and acceptance: "OK, I think it's stupid but I'm stuck with it and I might as well give it a shot."

6. Struggle and exploration: "Everybody else seems to be getting this - maybe I need to try harder or do things differently to get it to work for me."

7. Return of confidence: "Hey, I may be able to pull this off after all - I think it's starting to work."

8. Integration and success: "YES! This stuff is all right - I don't understand why I had so much trouble with it before."

Just as some people have an easier time than others in getting through the grieving process, some students may immediately take to whichever SCI method you're using and short-circuit many of the eight steps, while others may have difficulty getting past the negativity of Steps 3 and 4. The point is to remember that the resistance you encounter from some students is a natural part of their journey from dependence to intellectual autonomy (see Kloss 1994). If you provide sufficient structure and guidance along the way, by the end of the course most of them will reach satisfactory levels of both performance and acceptance of responsibility for their own learning.

In the remainder of this paper, we list common faculty concerns about student-centered instructional methods and offer responses. Much of the discussion involves issues associated with cooperative learning, the method that in our experience occasions the most vehement student resistance.

FACULTY CONCERNS

If I spend time in class on active learning exercises, I'll never get through the syllabus.

You don't have to spend that much time on in-class work to have a significant impact with it. Simply ask questions occasionally and give the students a short time to come up with solutions and answers, working either individually or in small groups. Then collect answers from several randomly selected individuals or groups. One or two such exercises that take a total of 5-10 minutes can keep a class relatively attentive for an entire period.

On a broader note, much of what happens in most classes is a waste of everyone's time. It is neither teaching nor learning. It is stenography. Instructors recite their course notes and transcribe them onto the board, the students do their best to transcribe as much as they can into their notebooks, and the information flowing from one set of notes to the other does not pass through anyone's brain. A more productive approach is to put substantial portions of the course notes - lengthy prose, detailed derivations, complex diagrams - in handouts or coursepaks, leaving gaps to be filled in and sprinkling questions and instructions like "Prove," "Justify," "Verify," "Explain" throughout the presentation. Spend class time only on the most critically important and conceptually difficult parts of the notes, leaving the students to cover the rest for themselves. The many hours of class time you will save by doing this should be more than sufficient for all the active learning exercises you might want to use. Your classes will be more lively and effective, you will still cover the syllabus, and you might even be able to augment it to include topics you never had time to cover before. Moreover, if you announce that some of the gaps and exercises in the handouts will be the subject of test questions and then keep your promise, the students will even read the handouts - at least after the first test.

If I don't lecture I'll lose control of the class.

That's one way to look at it. Another is that several times during a class period your students may become heavily involved in working on or arguing about what you're trying to get them to learn, and it may take a few seconds (never longer once you get the hang of it) to bring their attention back to you. There are worse problems!

I assign readings but many of my students don't read them and those who do seem unable to understand the material independently.

In our experience, the only reliable way to compel most students to read the assigned material is to test them on it without covering all of it in class. Some instructors use short quizzes at the beginning of every period for this purpose; others who don't want to spend that much class time giving and grading quizzes prefer to include questions on the readings in their regularly scheduled examinations. In either case, the instructors soon learn that testing students on material not explicitly covered in class inevitably leads to vigorous protests. There are several ways to ease the students' transition from reliance on the instructor to self-reliance. Create graphic organizers that visually illustrate the structures and key points of the readings (Bellanca 1990) and later ask the students to do so. Prepare study guides that summarize critical questions answered by the readings and then include some of the questions on the exams. Give brief or extended writing assignments that call on the students to explain portions of the readings in their own words. Well-constructed writing assignments compel students to process material actively, identifying important points or connecting the material to their prior knowledge (Brent and Felder 1992).

Some of my students just don't seem to get what I'm asking them to do - they keep trying to find "the right answer" to open-ended problems, they still don't have a clue about what a critical question is, and the problems they make up are consistently trivial.

An essential feature of any skill development program is practice and feedback. Most students have never been taught to solve open-ended problems or think critically or formulate problems, so the first time you assign such an exercise they will probably do it poorly. Collect their products and provide constructive comments. In addition, reproduce several products (perhaps slipping in one of your own as well), hand them out without attribution, go over some of them in class to illustrate the sort of thing you're looking for, and suggest ways to make good products even better. Modeling of this type helps students understand the process they need to go through to improve their own work. After several similar assignments and feedback sessions, students will start giving you the kind of results you're looking for and they will also begin giving one another meaningful feedback in group work. This approach serves a double purpose: the students gain more skill and confidence and you gain a classroom of teaching assistants who can help each other learn. By the end of the course some of them may be performing at a surprisingly high level.

When I tried active learning in one of my classes, many of the students hated it. Some refused to cooperate and made their hostility to the approach and to me very clear.

Instructors who set out to try student-centered instruction in a class for the first time are often unpleasantly surprised by the fierce negativity of some responses. Many who don't anticipate such reactions get discouraged when they encounter them, give up, and go back to more comfortable but less effective methods.



To minimize resistance to any student-centered method, try to persuade the students from the outset that you are neither playing a game nor performing an experiment, but teaching in a way known to help students learn more and understand better. You can reinforce your point about the effectiveness of SCI by offering variations on one or more of the following observations:

You've all had the experience of sitting through a good lecture, believing that you understood it, and then later when you tried to do the homework you realized that you didn't get it at all. By putting you to work in class I'm giving you a jump start on understanding the material and doing the homework efficiently.

Unless you're a Zen monk, you can't sit still and keep your mind focused on one thing for more than a few minutes. In lectures your attention drifts, first for short intervals, then for longer ones, and by the end of a straight 50-minute lecture you're probably getting less than 20% of what's being said. Doing something active from time to time during the lecture substantially increases the amount of information you actually get. It also cuts way down on boredom.

When you go out to work, I guarantee you'll be working in teams. When companies fill out surveys asking them what skills they want their new employees to have, teamwork skills are usually ranked either first or second. Since working in teams is what you're going to be doing on your job, you may as well start learning how to do it now.

(To students complaining about being slowed down by having to explain material they understand to slower teammates.) If you ask any professor, "When did you really learn thermodynamics (or structural analysis or medieval history)?" the answer will almost always be "When I had to teach it." Suppose you're trying to explain something and your partner doesn't get it. You may try to put it in another way, and then think of an example, then another one. After a few minutes of this your partner may still not get it, but you sure will.

In our experience, most students bright enough to complain about being held back by their classmates are also bright enough to recognize the truth of the last argument.

I'm having a particularly hard time getting my students to work in teams. Many of them resent having to do it and a couple of them protested to my department head about it.

Cooperative learning tends to be the hardest student-centered method to sell initially, especially to high academic achievers and strong introverts. The points given above about the prevalence of teamwork on most jobs, the importance of teamwork skills to most employers, and the fact that we learn best what we teach, can help. Perhaps the most effective selling point for cooperative learning (unfortunately) involves grades. Many research studies have demonstrated that students who learn cooperatively get higher grades than students who try to learn the same material individually (Johnson et al. 1991b). Before assigning group work for the first time, we may mention a study (Tschumi 1991) in which an instructor taught an introductory computer science course three times, once with the students working individually and twice using group work, with common examinations in the first two classes. In the first class, only 36% of the students earned grades of C or better, while in the classes taught cooperatively, 58% and 65% of the students did so. Those earning A's in the course included 6.4% (first offering) and 11.5% (second offering) of those who worked cooperatively and only 3% of those who worked individually. There was some student resentment about group work in the first cooperative offering and almost none in the second one, presumably because the instructor was more skilled in the method the second time and possibly because the students in the second cooperative class knew about the results from the first class.

Persuading students that group work is in their interest is only the first step in making this instructional approach work effectively. The instructor must also structure group exercises to promote positive interdependence among team members, assure individual accountability for all work done, facilitate development of teamwork skills, and provide for periodic self-assessment of group functioning. Techniques for achieving these goals are suggested by Johnson et al. (1991a), Felder and Brent (1994), and many other books and articles in the recent education literature. Instructors new to cooperative learning are advised to have several such references handy when planning activities and assignments and dealing with problems.

Resistance Paper

http://www4.ncsu.edu/unity/lockers/users/f/felder/public/Papers/Resist.html[11/3/2011 10:01:00 AM]

If I assign homework, presentations, or projects to groups, some students will "hitchhike," getting credit for work in which they did not actively participate.

This is always a danger, although students determined to get a free ride will usually find a way whether the assignments are done individually or in groups. In fact, cooperative learning that includes provisions to assure individual accountability, such as individual tests on the material in the group assignments, cuts down on hitchhiking (Johnson et al. 1991a,b). Students who don't actually participate in the homework will generally fail the tests, especially if the assignments are challenging (as they always should be if they are assigned to groups) and the tests truly reflect the skills involved in the assignments. If the group work counts for only a small fraction of the overall course grade (say, 10-20%), hitchhikers can get high marks on the homework and still fail the course.

One way to detect and discourage hitchhiking is to have team members individually or collectively distribute the total points for an assignment among themselves in proportion to the effort each one put in. Students want to be nice to one another and so may agree to put names on assignments of teammates who barely participated, but they are less likely to credit them with high levels of participation. Another technique is to call randomly on individual team members to present sections of project reports or partial solutions to problems, with everyone in the group getting a grade based on the selected student's response. The best students will then make it their business to see that their teammates all understand the complete solutions, and they will also be less inclined to put a hitchhiker's name on the written product and risk having him or her be the designated presenter.
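The point-distribution idea above can be reduced to simple arithmetic. The sketch below is one hypothetical way to implement it (the function name, the rating scale, and the proportional formula are illustrative assumptions, not a scheme prescribed in this paper): each member receives effort ratings from teammates, and the team's total point pool is divided in proportion to each member's average rating.

```python
def distribute_points(team_score, effort_ratings):
    """Split (team_score x team size) points among team members in
    proportion to the average effort rating each member received.

    team_score     -- the single score the team earned on the assignment
    effort_ratings -- dict mapping each member to the list of ratings
                      received from teammates (e.g. on a 1-10 scale)
    """
    n = len(effort_ratings)
    # Average the ratings each member received from teammates.
    averages = {
        member: sum(ratings) / len(ratings)
        for member, ratings in effort_ratings.items()
    }
    total = sum(averages.values())
    pool = team_score * n  # total points available to the whole team
    # Each member's share is proportional to his or her average rating.
    return {member: pool * avg / total for member, avg in averages.items()}

# Illustrative data: "Carla" barely participated and is rated low.
ratings = {
    "Ana":   [10, 9, 10],
    "Ben":   [9, 10, 9],
    "Carla": [4, 5, 5],
}
shares = distribute_points(team_score=90, effort_ratings=ratings)
```

With this scheme the total credit awarded equals the team score times the team size, so a low-effort member's loss is exactly the active members' gain, which is what makes the ratings consequential rather than cosmetic.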

Many of the cooperative teams in my class are not working well: their assignments are superficial and incomplete, and some team members keep complaining to me about others not participating.

The interpersonal challenges of cooperative learning may be severe. Students have widely varying intellectual abilities, work ethics, and levels of sensitivity to criticism, and a substantial part of the cooperative learning experience is learning how to confront and work through the conflicts that inevitably arise from these variations.

One way to get groups off to a good start is to have them formulate and write out a set of team standards and expectations, sign it, make copies for themselves, and turn in the original to you. As the course proceeds, have them periodically evaluate how well they are working as a team to meet those standards and what they might do to work more effectively. You may invite teams with serious problems to have a session in your office. If they do, try to help them find their own solutions rather than telling them what they should do.

Taking a few minutes in class to focus on critical teamwork skills can make a major difference in how groups function. Periodically select an important activity like brainstorming or resolving conflicts and offer tips in class on effective ways to carry out the activity. An effective technique is to present a short scenario describing a common problem and brainstorm solutions with the class.

You may also give teams the last-resort option of firing uncooperative members after giving them at least two warnings, and you may give individuals carrying most of the workload the option of joining another group after giving their uncooperative teammates at least two warnings. In our experience, teams almost invariably find ways of working things out themselves before these options have to be exercised.

Teams working together on quantitative problem assignments may always rely on one or two members to get the problem solutions started. The others may then have difficulties on individual tests, when they must begin the solutions themselves.

This is a legitimate concern. An effective way to minimize it is for each team member to set up and outline each problem solution individually, and then for the team to work together to obtain the complete solutions. If the students are instructed in this strategy and are periodically reminded of it, most of them will discover its importance and effectiveness and adopt it. There is also merit in assigning some individual homework problems to give the students practice in the problem-solving mode they will encounter on the tests.

I teach a class containing students from minority populations that tend to be at risk academically. Does active, cooperative learning work in this kind of setting?

In fact, the most frequently cited cooperative learning success story comes from the minority education literature. Beginning in the mid-1970s, Uri Treisman, a mathematics professor then at the University of California-Berkeley, established a group-based calculus honors program, reserving two-thirds of the places for minority students whose entering credentials suggested that they were at risk. The students who participated in this program ended with a higher retention rate after three years than the overall average for all university students, while minority students in a control population were mostly gone after three years. Treisman's model has been used at many institutions with comparable success (Fullilove and Treisman 1990). In another study, George (1994) tested several cooperative learning techniques on a predominantly African-American psychology class and compared their performance with that of a control group taught noncooperatively. She found that group work led to significant improvements in both academic achievement and attitudes toward instruction.

When using cooperative learning in classes that include minority students (ethnic minorities, or women in engineering and other nontraditionally female fields), try to avoid groups in which the minority students are isolated. Felder et al. (1995) report a study of cooperative learning in a sequence of engineering courses. Women responded to group work with overwhelming approval, but many indicated that they tended to assume less active roles in group discussions, and some reported that their ideas tended to be devalued or discounted within their teams. The likelihood of these occurrences is reduced if a team contains more than one member of the minority population.

Even though I've done everything the experts recommend, some of my students still complain that they don't like the student-centered approach I'm using and they would have learned more if they had taken a "normal" class.

They could be right. Students have a variety of learning styles, and no instructional approach can be optimal for everyone (Claxton and Murrell 1987; Felder 1993; Grasha 1990, 1994). In the end, despite our best efforts, some students fail and some who pass continue to resent our putting so much of the burden of their learning on their shoulders. One of our students once wrote in a course-end evaluation, "Felder really makes us think!" It was on the list of things he disliked. On the other hand, for all their complaints about how hard we are on them, our students on average do better work than they ever did when we just lectured, and many more of them now tell us that after getting through one of our courses they feel confident that they can do anything. So you may lose some, but you can expect to win a lot more.

In short, we are convinced that the benefits of properly implemented student-centered instruction more than compensate for any difficulties that may be encountered when implementing it. Instructors who follow recommended SCI procedures when designing their courses, who are prepared for initially negative student reactions, and who have the patience and the confidence to wait out these reactions will reap their rewards in more and deeper student learning and more positive student attitudes toward their subjects and toward themselves. It may take an effort to get there, but it is an effort well worth making.

REFERENCES

Bellanca, J. 1990. The cooperative think tank: Graphic organizers to teach thinking in the cooperative classroom. Palatine, IL: Skylight Publishing.

Bonwell, C.C., and J.A. Eison. 1991. Active learning: Creating excitement in the classroom. ASHE-ERIC Higher Education Report No. 1. Washington, DC: George Washington University.

Brent, R., and R.M. Felder. 1992. Writing assignments - Pathways to connections, clarity, creativity. College Teaching, 40(2), 43-47.

Claxton, C.S., and P.H. Murrell. 1987. Learning styles: Implications for improving educational practice. ASHE-ERIC Higher Education Report No. 4. Washington, DC: George Washington University.

Felder, R.M. 1993. Reaching the second tier: Learning and teaching styles in college science education. J. Coll. Science Teaching, 23(5), 286-290.

-, and R. Brent. 1994. "Cooperative learning in technical courses: Procedures, pitfalls, and payoffs." ERIC Document Reproduction Service Report ED 377 038.

-, G.N. Felder, M. Mauney, C.E. Hamrin, Jr., and E.J. Dietz. 1995. A longitudinal study of engineering student performance and retention. III. Gender differences in student performance and attitudes. J. Engr. Education, 84(2), 151-174.

Fullilove, R.E., and P.U. Treisman. 1990. Mathematics achievement among African American undergraduates at the University of California, Berkeley: An evaluation of the mathematics workshop program. J. Negro Education, 59(3), 463-478.

George, P.G. 1994. The effectiveness of cooperative learning strategies in multicultural university classrooms. J. Excellence in Coll. Teaching, 5(1), 21-30.

Grasha, A.F. 1990. The naturalistic approach to learning styles. College Teaching, 38(3), 106-113.

-. 1994. A matter of style: The teacher as expert, formal authority, personal model, facilitator, and delegator. College Teaching, 42(4), 142-149.

Johnson, D.W., R.T. Johnson, and K.A. Smith. 1991a. Active learning: Cooperation in the college classroom. Edina, MN: Interaction Book Company.

-. 1991b. Cooperative learning: Increasing college faculty instructional productivity. ASHE-ERIC Higher Education Report No. 4. Washington, DC: George Washington University.

Kloss, R.J. 1994. A nudge is best: Helping students through the Perry scheme of intellectual development. College Teaching, 42(4), 151-158.

McKeachie, W. 1986. Teaching tips, 8th Edition. Lexington, MA: Heath & Co.

Meyers, C., and T.B. Jones. 1993. Promoting active learning: Strategies for the college classroom. San Francisco: Jossey-Bass.

Tschumi, P. 1991. 1991 ASEE Annual Conference Proceedings, pp. 1987-1990. Washington, DC: Am. Society for Engr. Education.

Woods, D.R. 1994. Problem-based learning: How to gain the most from PBL. Waterdown, Ontario: Donald R. Woods.
