Heuristic Training and Performance in Elementary Mathematical Problem Solving
Author(s): Robert L. Hohn and Bruce Frey
Source: The Journal of Educational Research, Vol. 95, No. 6 (Jul.-Aug., 2002), pp. 374-380
Published by: Taylor & Francis, Ltd.
Stable URL: http://www.jstor.org/stable/27542403
Accessed: 21/10/2013 20:52
This content downloaded from 129.8.242.67 on Mon, 21 Oct 2013 20:52:03 PM. All use subject to JSTOR Terms and Conditions.
Heuristic Training and Performance in Elementary Mathematical Problem Solving

ROBERT L. HOHN and BRUCE FREY
University of Kansas
ABSTRACT The processes of understanding and solving word problems proceed through the phases of problem translation, problem integration, solution planning, solution execution, and solution monitoring. The authors developed a heuristic strategy (SOLVED) to explain these phases in language appropriate to third-, fourth-, and fifth-grade students. Children were trained over several lessons to use it to solve different types of mathematical problems. Results of 2 experiments involving 223 elementary students indicated that SOLVED was more effective in aiding both short-term and delayed problem solving than traditional problem-solving instruction. Accuracy in problem solving was significantly correlated with metacognitive processing. Third-grade students used SOLVED more rapidly and effectively than did older students, and no interaction of learning rate with ability or gender occurred.

Key words: elementary mathematics instruction, heuristics in mathematics, mathematical problem solving
Efforts to reform mathematics instruction have emphasized greater understanding of the meaning of word problems, including representational, as well as solution, skills (Grouws, 1992; Mayer, 1992; National Council of Teachers of Mathematics, 1989). If educators are to enhance mathematical understanding, they may need to use instructional procedures that include heuristic training. Following a heuristic procedure is assumed to provide a more systematic or planned approach to problem solving (Polya, 1973).

The processes of understanding and solving word problems can be divided into several sequential phases. The first phase is problem representation, which can be subdivided into two substages: problem translation (in which factual information in the problem statement is interpreted) and problem integration (in which knowledge of problem types is used to form an integrated structure of the problem's relationships; Lewis, 1989). The next phase is solution planning, in which the learner selects a solution procedure. Solution execution follows, in which necessary computations are carried out. Solution monitoring, in which the problem solver reviews the computations to detect errors, completes the sequence (Brenner et al., 1997; Mayer, 1989). Failure to engage effectively in the necessary metacognitive activity at any one of these phases is likely to lead to poor problem-solving performance.

On the basis of the foregoing analysis of the metacognitive phases of mathematical problem solving, we created a training program for this study as an attempt to help elementary mathematics students remember and engage in these processes. We based the program on the following assumptions:
1. Third-, fourth-, and fifth-grade students who are in the early stage of developing their mathematical competence can benefit from a heuristic that provides an organized series of steps for them to follow when they attack word problems;
2. The program should be constructed so that when students follow it, they will be required to use representational and symbolic manipulation skills;
3. Steps suggested in the program should be supported by mathematical problem-solving research;
4. Instruction throughout the program should be incorporated into the daily mathematics lessons that students encounter, as directed by their classroom teacher.
The training program is entitled SOLVED, which stands for State the problem, Options to use, Links to the past, Visual aid, Execute your answer, and Do check back. Each letter cues a concept or procedure that, if followed, would help the learner to consider the necessary phases of problem solving. State the problem aids in representation by clarifying the givens and goals of the problem statement (Mayer, 1992). Options to use suggests to the learner that identifying the problem type may aid in understanding it (Riley, Greeno, & Heller, 1983). Links to the past reminds the student to recall similar problems completed earlier that might suggest a solution plan (Silver, 1987). Visual aid assists both representation and solution planning in that diagrams or schematic representations often clarify problem relationships and suggest possible routes to solution (Hegarty & Kozhevnikov, 1999; Hembree, 1991). Execute your answer reminds the student to perform the necessary calculations only after earlier representation and planning. Research indicates that learners are more likely to consider procedures and to select the proper one if representation has been complete (Siegler & Jenkins, 1989). Do check back emphasizes the solution-monitoring phase of problem solving (Brenner et al., 1997).

Address correspondence to Robert L. Hohn, Department of Psychology and Research in Education, 1122 W. Campus Boulevard, University of Kansas, Lawrence, KS 66045-3101. (E-mail: [email protected])
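The correspondence between the mnemonic's letters and the problem-solving phases described above can be sketched as a small lookup table (a sketch; the dictionary name and phase labels are ours, paraphrasing the text, not the authors' materials):

```python
# Sketch: each SOLVED step cues one of the problem-solving phases
# described in the text. Names and labels are ours, for illustration.
SOLVED_STEPS = {
    "S": ("State the problem", "problem representation (translation)"),
    "O": ("Options to use", "problem representation (integration)"),
    "L": ("Links to the past", "solution planning"),
    "V": ("Visual aid", "representation and solution planning"),
    "E": ("Execute your answer", "solution execution"),
    "D": ("Do check back", "solution monitoring"),
}

def checklist():
    """Return the cue phrases in order, as a student checklist."""
    return [cue for cue, _phase in SOLVED_STEPS.values()]

print("".join(SOLVED_STEPS))  # SOLVED
print(checklist()[0])         # State the problem
```

Dictionaries preserve insertion order in Python 3.7+, so the keys spell out the mnemonic.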
Some students may already possess the domain-specific knowledge, metacognitive skills, and a heuristic approach to mathematical problem solving that are included in new instructional programs (DeCorte, Greer, & Verschaffel, 1996). The differential effect of new programs on high- and low-ability students is therefore important to consider. In addition, boys make greater use of personally invented procedures in mathematics, whereas girls rely more on teacher-supplied aids (Fennema, Carpenter, Jacobs, Franke, & Levi, 1998). Girls may therefore benefit more than boys from a program in which students are instructed directly by their teacher to use a specific mathematical heuristic, like the one developed for this study. Analysis of the interaction of gender differences with treatment condition on performance appeared relevant to this study.

We conducted two experiments. We designed the first experiment to compare training in the SOLVED program with a textbook-based approach to problem solving. We undertook the second experiment to analyze strategy use, to determine its relationship to performance, to investigate possible interactive effects of gender and ability, and to investigate whether problem-solving performance would be maintained after a 2-week delay, with no prompting present.
EXPERIMENT 1
Method
Participants
Participants in Experiment 1 were 31 third-grade students, 37 fourth-grade students, and 35 fifth-grade students in two intact classes at each grade level. The participating school was part of a professional development consortium in which teachers had agreed that improvement in mathematical problem-solving instruction was an important goal. All the teachers involved in this experiment were women, with at least 3 years of classroom experience. One class at each grade level was selected randomly to receive training in the SOLVED method; the other class served as a control. The school had a policy of not placing students in a particular class according to ability level, so groups were presumed to be relatively heterogeneous.
Materials
The materials for this study consisted of two 4-problem quizzes presented prior to and after training and 12 sample problems used during training, at each grade level. We developed problems for each of four required topics for each grade. Teachers identified topics as those that students had some difficulty with previously and that were mandated by the school district's curriculum plan. Thirty-five of the 60 problems for all grades had been used on earlier tests, and we determined that they averaged 41% in item difficulty. The rest of the problems were constructed in parallel form. For example, for the third-grade problem "Jamal has 15 basketball cards. Eric has 13 cards. Do the boys have more than 30 cards?" a parallel problem was "Jeffery has 17 baseball cards. Todd has 21 cards. Do the boys have more than 30 cards?" Table 1 lists the topics for each grade level and a sample problem for each topic.
Procedure
Teachers in the SOLVED group observed a sample SOLVED lesson; we taught them to present the lesson following a six-step procedure:

1. Distribute a copy of the SOLVED checklist to accompany the lesson.
2. Demonstrate how SOLVED could be applied to a sample problem.
3. Wherever possible, simplify terms to explain the application of each of the six steps in SOLVED. For example, a fourth-grade problem was "The Statue of Liberty stands on a square base. Each side of the base is 65 feet long. What is its perimeter?" The teacher might explain the problem in the following way:

   Perimeter means how far it is all the way around an area. I have to figure out how far it is all the way around the statue. If I start at one point, and go all the way around it until I get back, I will know it (State). Since it's a square, I could add all 4 sides up or multiply one side by 4 (Options). I remember I did a problem like this last week when we measured our bedrooms, but it wasn't square (Links). I could draw a picture, but I know what a square looks like (Visualize). I am ready to do the math now: I'll multiply 4 x 65 = 260 (Execute). That looks right, but to check I'll add up the four sides. Still 260. All right! (Do check back). (Inserts added to demonstrate the link between teacher comments and SOLVED steps)

4. Provide two similar problems for individual practice.
5. After practice, pair students with a partner to discuss how they applied the mnemonic and what solution was derived.
6. Require each pair to explain how they used SOLVED in one of the problems.
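The Execute and Do-check-back steps of the sample lesson amount to computing the perimeter one way and verifying it another. A minimal sketch (function names are ours, for illustration):

```python
def perimeter_by_multiplication(side):
    # Options/Execute: since the base is a square, multiply one side by 4.
    return 4 * side

def perimeter_by_addition(side):
    # Do check back: verify by adding the four sides instead.
    return side + side + side + side

side = 65  # each side of the Statue of Liberty's base, in feet
answer = perimeter_by_multiplication(side)
# "That looks right but to check I'll add up the four sides. Still 260."
assert answer == perimeter_by_addition(side)
print(answer)  # 260
```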
Teachers who used the control method did not follow the first three steps of the SOLVED procedure, but they followed the prescribed instructions for explaining the various mathematical problems as detailed in the "Teacher's Guide" of the classroom text for their grade level. They did follow Steps 4, 5, and 6.

Table 1. Grade Level, Mathematical Topics, and Sample Problems (Experiment 1)

Grade 3
  Place value: "Stonehill School cost $9,170,869 to build. What number is in the ten thousands place?"
  Two-digit addition: "Jamal has 15 basketball cards. Eric has 13 cards. Do the boys have more than 30 cards?"
  Two-digit subtraction: "Joan and Krista are flying kites. Joan's kite flew 50 minutes. Krista's stayed up 30 minutes. How much longer did Joan's kite fly?"
  Time: "It takes Paul 12 minutes to walk to school. The bell rings at 8:00. What is the latest time that Paul can leave home and still not be late?"

Grade 4
  Perimeter measurement: "The Statue of Liberty stands on a square base. Each side of the base is 65 feet long. What is its perimeter?"
  Multiplication: "It costs Sam $9.25 to buy gas for his truck each week. How much does Sam spend for gas in 10 weeks?"
  Three-digit subtraction: "Maria has $1.25 to buy candy. A Snickers bar costs 69 cents. If she buys the Snickers bar, how much money will she have left?"
  Single-digit division: "A farm stand sells apples at 2 for 18 cents, apricots at 5 for 50 cents, and plums at 3 for 15 cents. If Sasha wants one of each fruit, how much money will she need?"

Grade 5
  Two-digit division: "Rosanne must put 420 rocks into boxes. If she puts 21 rocks into each box, how many boxes will she need?"
  Time: "Roberto pays 10 cents a minute to call his friend Manuel in Los Angeles. If Roberto calls Manuel at 7:52 P.M. and talks until 8:09 P.M., how much does the phone call cost?"
  Measurement area: "Your classroom is 50 feet long and 30 feet wide. What is its area?"
  Logic: "The snack menu has pretzels, popcorn, chips, and trail mix. Joe and John each chooses two snacks, but neither chooses the same one. Joe never eats pretzels. John hates popcorn. Both boys agree that pretzels and chips do not go together. Which snacks did each boy choose?"
We randomly observed all six teachers in both conditions during two practice lessons. Lesson content consisted of mathematics topics encountered in the normal curriculum sequence of the school year. We rated the teachers on how well they followed the six-step procedure with a Likert-type scale ranging from 1 to 5. A 5 indicated that the teacher performed all six steps consecutively and accurately, 4 indicated one error, 3 indicated two errors, 2 indicated three errors, and 1 indicated more than four errors. After each observation of each teacher, we provided an explanation of their ratings and discussed the teacher's performance. Ratings on the second observations for all teachers averaged 4.70; interjudge reliability was .95.

After the pretest, all students were taught four mathematics lessons; each one covered one of the four grade-appropriate topics. Lessons were presented every 2 days. One day after the fourth lesson was completed, the teachers administered a four-problem quiz consisting of parallel problems. As on the pretest, students were told to "show all their work." All instruction, practice, and testing activities were conducted in the students' classrooms with their regular teacher.
Scoring
Accuracy of problem responses was scored on a 0-2 continuum; 0 reflected a completely erroneous response, 1 reflected a partially correct response, and 2 reflected a completely correct response. A completely erroneous response occurred when there was no response, a response that followed an incorrect algorithm (the student subtracted when the problem required addition), or a major calculation error such as "$9.25 x 10 = $19.25." A partially correct response revealed appropriate representation of the problem, but there was a minor error in calculation. An example is "Roberto pays 10 cents a minute to call his friend Manuel in Los Angeles. If Roberto calls Manuel at 7:52 P.M. and talks until 8:09 P.M., how much does the phone call cost? Answer: 8:09 - 7:52 = 17 minutes x 10 cents = $1.60." Two judges used the 0-2 scale to rate four problems completed by each of the 103 students on the pretest. Interjudge reliability was .93.
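The 0-2 accuracy rubric can be read as a simple decision rule. A sketch (the function and its parameters are our paraphrase of the rubric, not the authors' scoring code):

```python
def accuracy_score(representation_ok, calculation):
    """Score one response on the 0-2 continuum used in Experiment 1.

    representation_ok: the response follows a correct algorithm.
    calculation: "correct", "minor_error", or "major_error".
    """
    if not representation_ok or calculation == "major_error":
        # No response, wrong algorithm (subtracting when the problem
        # requires addition), or a major slip like "$9.25 x 10 = $19.25".
        return 0
    if calculation == "minor_error":
        # Appropriate representation, minor calculation error,
        # e.g., "17 minutes x 10 cents = $1.60".
        return 1
    return 2  # completely correct response

print(accuracy_score(True, "minor_error"))  # 1
```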
Table 2. Mean Pre- and Posttest Performance Scores for Treatment Group, by Experimental Condition, for Three Grade Levels (Experiment 1)

                  SOLVED                          Control
            Pretest      Posttest           Pretest      Posttest
Grade   n     M    SD      M    SD      n     M    SD      M    SD
3      17   1.71  1.31   3.59  2.09    14   2.64  1.50   2.71  1.77
4      20   1.70  1.66   4.20  2.04    17   1.94  1.30   2.72   .92
5      20   3.90  1.71   4.90  1.92    15   4.20  2.08   4.07  1.49
Results and Discussion
Table 2 presents pre- and posttest means, standard deviations, and sample sizes, by experimental condition, for the three grades.

To correct for initial group differences, we conducted analysis of covariance (ANCOVA) for third and fifth graders, using pretest scores as the covariate. For third graders, the ANCOVA was statistically significant, F(1, 28) = 10.87, p = .003. For fifth graders, the ANCOVA was also statistically significant, F(1, 32) = 23.83, p = .002.

The data for fourth graders failed the test for equality of variances, Levene's F(1, 29) = 7.37, p = .01, as well as the homogeneity-of-slopes test, Pretest x Group interaction, F(3, 27) = 4.16, p = .02, both of which are required assumptions of ANCOVA. Consequently, we used independent t tests to compare posttest scores for fourth graders. Because pretest means were not significantly different between groups, t(35) = .49, p = .63, we could directly compare posttest means using the t test for independent means, adjusting degrees of freedom to account for unequal variances. The independent t test comparing fourth-grade experimental and control groups on the posttest was statistically significant, t(27) = 2.94, p = .007.
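Adjusting the degrees of freedom for unequal variances is the Welch-Satterthwaite procedure, and the reported fourth-grade result can be approximately recovered from the Table 2 summary statistics. A sketch (our reconstruction; the authors do not name the exact software or procedure, and the published means and SDs are rounded):

```python
import math

def welch_t(m1, s1, n1, m2, s2, n2):
    """Welch's t statistic and Welch-Satterthwaite degrees of freedom."""
    v1, v2 = s1 ** 2 / n1, s2 ** 2 / n2
    t = (m1 - m2) / math.sqrt(v1 + v2)
    df = (v1 + v2) ** 2 / (v1 ** 2 / (n1 - 1) + v2 ** 2 / (n2 - 1))
    return t, df

# Fourth-grade posttest means/SDs/ns from Table 2: SOLVED vs. control.
t, df = welch_t(4.20, 2.04, 20, 2.72, 0.92, 17)
print(round(t, 2), round(df))  # close to the reported t(27) = 2.94
```

The recomputed statistic (about 2.91 on roughly 27 degrees of freedom) matches the reported value to within rounding of the published means.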
Results clearly indicated greater improvement for those in the SOLVED condition over time when compared with those in the control treatment for all three grades. Mean improvement for SOLVED ranged from 1.00 to 2.50 points, whereas control-group improvement ranged from -.13 to .77 points. Although performance levels remained modest on the immediate posttest, students did seem to profit from SOLVED.
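The improvement ranges quoted above follow directly from Table 2 (posttest mean minus pretest mean for each grade); any discrepancy in the second decimal reflects rounding in the published means. A quick check:

```python
# (pretest mean, posttest mean) per grade, taken from Table 2
solved = {3: (1.71, 3.59), 4: (1.70, 4.20), 5: (3.90, 4.90)}
control = {3: (2.64, 2.71), 4: (1.94, 2.72), 5: (4.20, 4.07)}

def gains(group):
    """Posttest-minus-pretest gain for each grade, rounded to 2 decimals."""
    return {g: round(post - pre, 2) for g, (pre, post) in group.items()}

print(gains(solved))   # {3: 1.88, 4: 2.5, 5: 1.0}   -> range 1.00 to 2.50
print(gains(control))  # {3: 0.07, 4: 0.78, 5: -0.13} -> range -.13 to ~.77
```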
EXPERIMENT 2
Method
We designed the second study to include analysis of student use of the SOLVED technique and its relation to performance, and to determine whether performance levels would be maintained after a longer delay. In addition, Experiment 2 included analysis of potential interactions with gender and mathematics ability.
Participants
Participants in Experiment 2 were 38 third graders, 37 fourth graders, and 45 fifth graders in two different intact classes at each grade level from the same elementary school as in Experiment 1. All the teachers of these classes were women who had at least 2 years of teaching experience. National and state standardized assessments were not available to determine mathematical ability for many students. However, teachers possessed final classroom test scores from the previous semester for approximately 65% of the students. We used scores above the median on those tests to identify those students who were defined as above average in mathematics ability, whereas scores below the median defined those who were below average. The remaining students were rated by teachers on the basis of recent class performance.
Materials
Testing materials consisted of 3 five-problem quizzes presented immediately after each of three review lessons and a five-problem quiz administered 2 weeks after the final lesson. Four of the problem types were the same as in Experiment 1, and the problems used to compose these tests were drawn from those quiz items, as well as from sample problems. The combined item difficulty level of those items used in Experiment 1 quizzes was .51. One new problem type was introduced for each grade. These were added because the new set of teachers nominated them as reflecting important concepts for that grade's curriculum. All new items used for the four tests were rated as equivalent in difficulty by pairs of teachers at each grade level.
Procedure
All students were taught three lessons, each reviewing the five grade-appropriate topics. Students had been exposed to these topics in work done earlier in the school year. Lessons were taught following the same direct instruction format used in Experiment 1; the SOLVED method was used in each lesson. Teachers in Experiment 2 participated in the same SOLVED training and feedback procedure as that employed in Experiment 1. After each lesson, the 5-item quiz was administered to students, who were reminded to show all their work. Two weeks after these lessons were concluded, the students completed an additional 5-item quiz covering the five topics addressed.
Scoring
Problems were judged on two dimensions: accuracy and metacognitive processing (MP). Accuracy was rated following the same system as in Experiment 1. Interrater reliability for judging accuracy scores was .97.

MP was rated separately on a 0-2 continuum; a 0 indicated no apparent use, 1 indicated some evidence of processing, and 2 reflected clear evidence. For example, one of the third-grade problems was "It takes Pat 12 minutes to walk to school. The late bell rings at 8:00. What is the latest time that Pat can leave home and still not be late?" A score of 2 would be assigned to a response in which the student drew clock face(s) with hands indicating the two critical times and some attempt to designate the minutes between the two times. This response would be considered evidence of representation. A score of 1 would be assigned to a response in which the student drew a clock face, but some element of the problem was not included. A score of 0 would be recorded if no evidence of representation, solution planning, solution execution, or solution monitoring was apparent. Interrater reliability in judging MP was .91.
Results and Discussion
Table 3 presents mean accuracy scores, standard deviations, and sample sizes for the three training trials and the delayed test for the three grade levels.

We conducted a one-way, within-subjects analysis of variance (ANOVA); the main factor was the time between the four test administrations, and the dependent variable was the test score. There were three between-subjects factors in the analysis: grade (3, 4, or 5), student gender, and student mathematics ability (above or below the median). The results indicated a large main effect of time (Wilks's lambda = .81), F(3, 104) = 8.28, p < .001, multivariate η² = .19. We found a statistically significant interaction between time and grade (Wilks's lambda = .75), F(6, 208) = 5.29, p < .001, multivariate η² = .13. The interaction terms tested involving gender and mathematics ability were not significant.

Because the interaction between time and grade was statistically significant, we examined the effect of time for each grade separately. To control for Type I errors across the three simple main effects, we set the alpha for each effect at .017 (.05/3). We found statistically significant simple main effects for third graders (Wilks's lambda = .61), F(3, 35) = 7.55, p < .001, multivariate η² = .39, and for fifth graders (Wilks's lambda = .44), F(3, 42) = 17.80, p < .001, multivariate η² = .56, but not for fourth graders. Although fourth graders appeared to improve over time, particularly between the third trial and the delayed test, the overall trend did not approach significance. Fifth graders improved throughout the training trials, but there was some loss on the delayed test.
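The multivariate η² values reported alongside each Wilks's lambda are consistent with the standard conversion η² = 1 − Λ^(1/s), where s is the smaller of the number of dependent variates and the effect's degrees of freedom (s = 1 for the time effects, s = 2 for the Time x Grade interaction). A sketch of the conversion (our reconstruction, not the authors' code):

```python
def multivariate_eta_sq(wilks_lambda, s=1):
    """Multivariate eta-squared from Wilks's lambda: 1 - lambda**(1/s)."""
    return 1 - wilks_lambda ** (1 / s)

# The reported effect sizes are reproduced to two decimals:
print(round(multivariate_eta_sq(0.81, s=1), 2))  # 0.19 (time main effect)
print(round(multivariate_eta_sq(0.75, s=2), 2))  # 0.13 (Time x Grade)
print(round(multivariate_eta_sq(0.44, s=1), 2))  # 0.56 (fifth graders)
```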
Table 4 presents MP scores, standard deviations, and
sample sizes for the three training trials and the delayed test
for the three grade levels.
We conducted an additional one-way, within-subjects ANOVA; MP was the dependent variable. The results for this ANOVA indicated a significant time effect (Wilks's lambda = .87), F(3, 104) = 5.33, p < .002, multivariate η² = .13. We found a statistically significant interaction between time and grade (Wilks's lambda = .84), F(6, 208) = 3.28, p < .004, multivariate η² = .09. Because the interaction between time and grade was statistically significant, we examined the effect of time for each grade separately. We found statistically significant simple main effects for third graders (Wilks's lambda = .65), F(3, 35) = 6.25, p < .002, multivariate η² = .35, and for fifth graders (Wilks's lambda = .64), F(3, 42) = 7.74, p < .001, multivariate η² = .36, but not for fourth graders.
Correlations between accuracy and MP scores included the following: third graders, r(36) = .43, p < .05; fourth graders, r(35) = .29, p > .05; and fifth graders, r(43) = .41, p < .05. These results indicate that there was steady improvement for third and fifth graders over trials and that processing gains continued 2 weeks later without prompting. MP was related positively to problem-solving performance for these two groups. For fourth graders, although improvement in accuracy and MP over trials did not reach statistically significant levels, we observed gains in both areas. The MP scores of fourth- and fifth-grade students were relatively low when compared with third-grade students.

Table 3. Mean Accuracy Scores, by Trials, for Three Grade Levels (Experiment 2)

             Trial 1       Trial 2       Trial 3      Delayed test
Grade   n     M    SD       M    SD       M    SD       M    SD
3      38   4.95  1.93    5.58  3.04    5.95  2.17    6.63  2.68
4      37   4.39  2.49    4.50  3.08    4.67  2.75    5.61  2.79
5      45   4.98  2.81    6.93  3.00    7.42  2.48    6.96  2.90

Table 4. Total MP Scores, by Trials, for Three Grade Levels (Experiment 2)

             Trial 1       Trial 2       Trial 3      Delayed test
Grade   n     M    SD       M    SD       M    SD       M    SD
3      38   4.79  2.77    5.00  1.93    6.61  2.19    5.11  2.81
4      37   2.81  2.16    2.58  2.45    3.75  2.44    3.22  2.97
5      45   1.93  1.80    3.07  3.33    2.80  4.22    3.64  3.84
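The accuracy-MP correlations (and the interrater reliabilities reported earlier) are Pearson product-moment coefficients. A self-contained sketch on hypothetical score pairs (the data shown are illustrative, not the study's):

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical accuracy and MP totals for five students:
accuracy = [2, 4, 5, 7, 8]
mp = [1, 3, 2, 5, 6]
print(round(pearson_r(accuracy, mp), 2))  # 0.94
```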
As a further check on the relationship between problem-solving accuracy and metacognitive processing, we briefly interviewed 10 randomly selected students at each grade level after testing. They were asked how they used the SOLVED approach during problem solving. Of the 25 meaningful responses generated by the 30 students, 10 were rated by two independent judges as reflecting the solution-execution phase, for example, "I remembered to multiply by 2." Eight responses reflected the solution-monitoring phase, for example, "I knew I should look back over my work." Seven responses indicated attempts at problem representation, such as, "I knew I should put the times on a clock face that I drew." Students did not mention using SOLVED to help them remember similar problems or to plan their approach.
Ability in mathematics did not interact with performance at any of the three grade levels. This finding suggests that both below- and above-average students profited equally from SOLVED training. Further research in which standardized ability scores are used is necessary before more definitive conclusions can be made.

Gender also failed to interact with performance at each of the three grade levels. These findings do not necessarily conflict with those of Fennema et al. (1998), however. In this study, students were taught to use only the mnemonic provided them and were not encouraged to invent their own approach, as in the Fennema et al. study.
Conclusions and Implications
The results suggest that elementary school students can be taught to utilize a simple heuristic strategy and that its use is associated with improved problem-solving skill. Acquisition of the heuristic led to a superior learning rate when compared with a more traditional textbook approach. Improvement continued beyond initial learning trials and was evident during long-term retention.

We believe that the success of SOLVED is best explained by its structure and how it was taught. SOLVED was designed to reflect the phases of problem solving in a manner easily understood by young students. Traditional models of the problem-solving process have not always been as procedural, nor as sequential, as they should be when they are explained to novice problem solvers. Moreover, the heuristic was taught and practiced over several trials, following suggestions that "strategy instruction follow a powerful model of teaching carefully meshed with the curriculum" (Pressley & Associates, 1990, p. 181).

Several issues remain to be resolved before one can recommend that SOLVED or other heuristics be widely adopted. First, evidence of strategic effort found in students' work primarily reflected the solution execution phase of problem solving, although some representational attempts were made. This finding has two possible explanations. Students may be less inclined to engage in significant reflective thought about problems before jumping to the solution phase; this is characteristic of novice problem solvers in general. Increased emphasis on representational skills will need to be included in teachers' instructional methods for this tendency to be overcome. An alternative explanation is that deeper analysis of students' metacognitive processing is necessary. Future research, including use of interview procedures similar to those developed by Fennema et al. (1998) or Mevarech and Kramarski (1997) to more fully examine students' thoughts, seems warranted.
Second, the more rapid rate of acquisition and higher use of metacognitive processing demonstrated by third graders is of interest. These learners appeared eager to use SOLVED in the first trial, whereas older students began to use it effectively only in later trials. Third graders, who are just beginning to work with mathematical word problems, may be more open than fourth and fifth graders to suggestions about processing procedures from their teachers. Fourth and fifth graders may struggle to integrate a new approach with procedures that were taught previously or developed through their own efforts, no matter how effective. Third grade is perhaps the optimal time to begin instruction in the use of metacognitive processes in mathematical problem solving.

Third, further analysis of variability in accuracy and metacognitive processing according to different problem types needs to be conducted. A classification system such as that recommended by Riley et al. (1983), rather than problem selection based on sequential curricular organization, may yield more useful information about the factors that influence metacognitive processing at each phase.
Finally, we made an attempt to keep lesson length, quiz length, and number of review lessons taught comparable to normal expectations in third-, fourth-, and fifth-grade classrooms. This was intended to maintain students' motivation to perform at their best. The number of problems in each test and the number of lessons taught could be increased in future replications as a check on the reliability and validity of these results.

This study lends support to efforts to use a more process-oriented approach to mathematics instruction. Although other attempts to enhance mathematical problem solving are worthy of attention, educators must ensure that students tackle new problems with skill and the confidence that they are able to represent and solve them.
REFERENCES
Brenner, M. E., Mayer, R. E., Moseley, B., Brar, T., Duran, R., Reed, B. S., et al. (1997). Learning by understanding: The role of multiple representations in learning algebra. American Educational Research Journal, 34(4), 663-689.
DeCorte, E., Greer, B., & Verschaffel, L. (1996). Mathematics teaching and learning. In D. C. Berliner & R. C. Calfee (Eds.), Handbook of educational psychology (pp. 491-549). New York: Macmillan.
Fennema, E., Carpenter, T. P., Jacobs, V. A., Franke, M., & Levi, L. (1998). New perspectives on gender differences in young children's mathematical thinking. Educational Researcher, 27(5), 6-11.
Hegarty, M., & Kozhevnikov, M. (1999). Types of visual-spatial representations and mathematical problem solving. Journal of Educational Psychology, 91(4), 684-689.
Hembree, R. (1991). Experiments and relational studies in problem solving: A meta-analysis. Journal for Research in Mathematics Education, 23(2), 242-273.
Lewis, A. B. (1989). Training students to represent arithmetic word problems. Journal of Educational Psychology, 81(4), 521-531.
Mayer, R. E. (1989). Introduction to cognition and instruction in mathematics. Journal of Educational Psychology, 81(4), 452-456.
Mayer, R. E. (1992). Thinking, problem solving and cognition (2nd ed.). New York: W. H. Freeman.
Mevarech, Z. R., & Kramarski, B. (1997). IMPROVE: A multidimensional method for teaching mathematics in heterogeneous classrooms. American Educational Research Journal, 34(2), 365-394.
National Council of Teachers of Mathematics. (1989). Curriculum and evaluation standards for school mathematics. Reston, VA: Author.
Polya, G. (1973). How to solve it (2nd ed.). Princeton, NJ: Princeton University Press.
Pressley, M., & Associates. (1990). Cognitive strategy instruction that really improves children's academic performance. Cambridge, MA: Brookline.
Riley, M. S., Greeno, J. G., & Heller, J. I. (1983). Development of children's problem-solving ability. In H. P. Ginsburg (Ed.), The development of mathematical thinking (pp. 153-196). New York: Academic Press.
Siegler, R. S., & Jenkins, E. (1989). How children discover new strategies. Hillsdale, NJ: Erlbaum.
Silver, E. A. (1987). Foundations of cognitive theory and research for mathematics problem-solving instruction. In A. Schoenfeld (Ed.), Cognitive science and mathematics education (pp. 33-60). Hillsdale, NJ: Erlbaum.