

Distributed Scaffolding: Helping Students Learn Science from Design

Sadhana Puntambekar
University of Connecticut

Janet L. Kolodner
Georgia Institute of Technology

Under review, Cognition and Instruction.

Please do not circulate.


In this paper, we describe our efforts at designing and implementing scaffolding that allows students to take advantage of the affordances of a Learning-by-Design classroom. We describe two studies in which we iteratively designed the scaffolding. We began by integrating into the classroom a paper-and-pencil tool, design diaries, to scaffold students’ design-related activities. Our experiences with using the diaries in middle school classrooms in study 1 helped us understand that in the dynamic, complex environment of the classroom, not all of the scaffolding can be provided by any one tool. Scaffolding needs to be distributed across the various agents that play a role in learning (for example, the teacher, peers, and software and paper-and-pencil tools). We found that a system of scaffolding is required, one that integrates the activities that students have to carry out. In study 2, we implemented such distributed scaffolding. When we compared students’ depth of science understanding across the two studies, we found that students did better in study 2.

In recent years there has been an upsurge of interest in design activities as a means to promote

science learning (e.g., Harel, 1991; Kafai, 1994; Lehrer & Romberg, 1996; Kolodner, 1997;

Baumgartner & Reiser, 1998; Hmelo, Holton & Kolodner, 2000; Kolodner, Crismond, Gray,

Holbrook, & Puntambekar, 1998; Kolodner, Crismond, Fasse, Gray, Holbrook, Puntambekar,

submitted). Design challenges provide a motivating context for students to learn science content,

and, as students engage in cycles of designing, evaluating, and redesigning, they also have the

opportunity to confront their understandings and misunderstandings of science concepts.

Design is a rich paradigm for learning science for many reasons. (1) Design provides a

meaningful and engaging context for students to learn science. It provides a natural environment for

discovering reasons why science concepts need to be learned, for seeing the contexts in which those

concepts are put to use, and for relating those concepts to experiences both in and outside of school.

(2) Design serves as an umbrella for mental skills (Lehrer and Romberg, 1996) and integrates

activities such as analysis, synthesis, evaluation, and revision that are often fragmented across

the curriculum. It therefore has the potential for enhancing both content knowledge and skills. (3)

Design activities that require construction of a working artifact provide ongoing feedback for

students as they confront their understanding of concepts by trying to put them into practice. (4)

Good design problems have multiple solutions, and thus have the potential to encourage students to

evaluate alternative solutions. (5) Design is an iterative process. It affords opportunities for students

to incrementally construct, evaluate, discuss and revise both the models they are designing and their


conceptions, thus encouraging students to engage in metacognitive activities such as planning and

monitoring.

In one of our Learning by Design™ (LBD™) units (Kolodner, 1997; Kolodner et al., 1998;

Kolodner, et al., submitted; Hmelo, Holton & Kolodner, 2000), for example, middle-school students

were presented with the Jekyll Island challenge. We wanted them to learn about coastal erosion and

accretion by having them design and model means of combating the erosion that is threatening this

coastal island's existence. As students began to try to address the challenge, they raised questions

about erosion, waves, and currents. They followed this up with investigations – reading print material

and constructing and running models in stream tables to learn more about the effects of currents,

tides and waves on erosion activity. Based on the results of their investigations, they designed

solutions to Jekyll Island’s erosion problems and then constructed models of those solutions in their

stream tables. Testing and evaluation of their design solutions helped students to build on what they

had learned through investigation and to understand some of the interactions between natural systems

and the artificial ones they were designing.

Learning from design activities affords rich opportunities for learning. With proper facilitation,

students have the opportunity to construct and generate rich meanings (Perkins, 1986) as they build,

test, and manipulate their artifacts. In addition, the classroom affords a rich social context

(Vygotsky, 1978; Bruner, 1986) to augment the cognitive affordances of design activities. Numerous

opportunities to articulate what they are learning and justify their claims arise as students work

together in small groups and engage in whole-class presentations and discussions. Individual and small-

group learning, whole-class discussions, presentations, and critiques provide a whole host of

opportunities for reflecting on, articulating, critiquing, and refining ideas. Such an environment holds

rich supports for both individual and social processes (Driver, Asoko, Leach, Mortimer, and Scott,

1994).

While designing provides students with motivation for engaging in scientific inquiry and rich

affordances for learning and applying science content, it is not always easy for middle-school

students and teachers to participate in and learn successfully from design activities. Design is a

complex process, encompassing many skills and activities. Students need support to successfully

execute the various activities involved in designing -- analyzing the situation to understand the

problems and issues that need to be addressed, gathering information, generating alternative solutions,

generating criteria to evaluate solutions, thinking about trade-offs, and justifying choices. Additional

complexity comes from design’s cyclical nature; designing is not a linear process, but rather the

solution is often ‘emergent’ (Gargarian, 1996), coming from a complex interplay between refining

one’s understanding of the design challenge and in parallel refining one’s solution ideas. Novice

designers frequently concentrate on single moves and combine them in an additive manner, and they

often fail to make connections between activities within the design process.


But making and managing such connections is essential both to good design and to learning from

design activities. Our understanding of learning from design activities (Kolodner, Hmelo, &

Narayanan, 1996; Kolodner, 1997; Hmelo, et al., 2000) tells us that students need to go through

several cycles of trial, analysis, and refinement to develop a deep understanding of science. Each

cycle allows application of what they know about the scientific principles that they are learning,

testing the application, analyzing the results and the conceptions that led to their design, and refining

their conceptions so as to be able to come up with a better solution.

The Learning by Design™ project at Georgia Tech aims to learn how to best orchestrate

learning-from-design activities in middle-school classrooms such that students will productively and

deeply learn science content and practices and such that a large range of teachers will be able to

successfully carry out its suggestions (Kolodner, 1997; Kolodner et al., 1998; Kolodner et al.,

submitted; Hmelo et al., 2000). To that end, we have designed and written materials for

approximately a year’s worth of physical and earth science units. Each unit is framed by a significant

design challenge, one that requires learning and application of science targeted by the science

standards (AAAS, 1993; NRC, 1996) for middle-school students. Learning by Design is informed by

the cognitive model that comes from Case-Based Reasoning (Kolodner, 1993; Schank, 1982; Schank,

Berman & Macpherson, 1999), and it was influenced greatly by the practices of Problem-Based

Learning (PBL) (Barrows, 1985; Koschmann et al., 1994). Everybody in the class is given the same

design challenge, and students work in teams of 3 or 4 students, each team attempting its best design

solution. As in PBL, teachers take on the roles of facilitators, orchestrating the movement from

teamwork to whole-class discussions and presentations, managing discussions, and helping students

summarize important aspects of the issues being raised. Whole-class discussions focus on identifying

what students already know, issues they need to learn more about, ideas for addressing the challenge,

and plans for moving forward, as well as discussions of the science being learned and how it might be

applied. Notes from these discussions are recorded on PBL-inspired white-boards that are posted for

all students to see. Doing and reflecting are interleaved with each other, and frequent returns to the

whiteboard help students maintain their perspective on the challenge and all its parts.

Our earliest LBD implementations (Gertzman & Kolodner, 1996; Hmelo et al., 2000) showed us

the range of scaffolding middle-school students needed to productively learn from design activities.

Indeed, they needed scaffolding for nearly all of the tasks listed above. For example, in an early pilot

study, students participated in a two-week unit about arthropods1. Their challenge was to design a

useful household robot with arthropod features (e.g., claws for grabbing, hinged legs for jumping) that

could perform tasks that a person finds difficult to do (e.g., reaching something from a high shelf,

picking up a heavy object). They needed to build a working model (from whatever materials they thought appropriate) of some important arthropod-related function of their robot. They were asked to think about the features of arthropods and the usefulness of some of them for helping us with our work, the function that their robot would serve, who would use it, how they could model its functions, and so on. They were also given handouts containing information on making moving limbs, joints, etc.

1 An arthropod is an invertebrate with a segmented body and an exoskeleton, i.e., a hard shell on the outside with hinges so that it can move.

While the challenge was engaging, the handouts we provided gave much information about

constructing models, and the PBL whiteboard provided rich context for achieving the challenge, we

found that student teams needed explicit guidance with many things. They needed help with using the

information on the whiteboards and in the handouts, and they did not use any of it unless they were

asked to do so. Even with PBL-style facilitation, the range of decisions they needed to make to

move forward overwhelmed them, and they had difficulties with the reasoning involved in learning

from design. They needed help with understanding the problem, carrying out the research, relating

the science they had learned through investigation to their designs, understanding what to include in

their models, and constructing and evaluating their working models (Gray, Young & Newstetter,

1997). While they developed a deeper understanding of the science content as they progressed

through several cycles of design and evaluation, they needed help evaluating their designs, identifying

their pros and cons, explaining what was wrong when something wasn’t as good as it should have

been, and applying that back in another iteration (Hmelo, Allen, Holton, & Gertzman, 1997). They

needed reminders to engage in each of these activities, and they needed help engaging productively.

On the other hand, when a researcher or the teacher could work with them and provide the help they

needed, they engaged enthusiastically and learned much about arthropods.

These studies helped us understand, at a high level, the range of scaffolding that middle-school

students need to successfully engage in design activities for science learning. They showed us, as well,

some of the areas where teachers tend to have difficulties managing the classroom. And they

suggested some of the specific scaffolding and orchestration required to allow middle-school teachers

and students to take full advantage of design challenges as a context for learning science. In the

studies we present here, we attempted to answer three questions aimed at verifying those suggestions

and supplementing them with specific guidelines for designing scaffolding:

What specific kinds of scaffolding do middle-school children need to productively learn science

in a design environment?

What tools might be used to provide such scaffolding?

How should the scaffolding be distributed across tools, agents, and activities in the learning-

science-from-design environment?

In this paper, we present two design experiments (Brown, 1992) that helped us address these

issues. Both design experiments were run in the context of the Jekyll Island challenge. Middle

schoolers were asked to design ways of managing the erosion on Jekyll Island, a coastal island in


Georgia. They needed to learn about coastal erosion and accretion and their causes and wave and

current actions and their effects to achieve the challenge.

We began by integrating into the classroom a paper-and-pencil tool, design diaries

(Puntambekar, 1997, Puntambekar & Kolodner, 1998), to scaffold students’ design-related activities.

Based on our understanding of the processes that comprise designing, the diaries scaffolded the

students’ understanding and implementation of the stages in the design process. We learned two

important things from this first design experiment. First, we refined our understanding of the

processes involved in designing, the ways we might present those processes to students, and the

specific kinds of scaffolding we needed to provide for each. We concluded that we needed to help

students structure their work within each phase of designing, that we needed to provide how-to hints

within each phase. We also learned that students needed metacognitive hints to help them decide

what to do next or what to refer to as they were working, and that we needed to provide examples as

models. Second, we observed that in the dynamic, complex environment of the classroom, not all of

the scaffolding could be provided with any one tool or agent. Scaffolding, we concluded, needs to be

distributed across the agents that play a role in learning (for example - the teacher, peers, the

challenge itself and materials available, software, and paper and pencil tools). We found that a system

of scaffolding is required that integrates the activities that students have to carry out. In our second

design experiment, the following school year, we put distributed scaffolding into practice and were

able to see gains in students’ ability to learn and apply science to generate better design products. Our

analysis of these two design experiments provides details about both the content needed in

scaffolding for learning science productively from design activities and how to distribute scaffolding

in a design-based science classroom.

AN INTRODUCTION TO SCAFFOLDING

The notion of scaffolding comes from a socio-constructivist model of learning (Vygotsky,

1978; Wertsch, McNamee, McLane & Budwig, 1980) in which learning is believed to occur in the

context of social interactions in which a more knowledgeable person guides a learner's emerging

understanding. This support provided to a student is referred to as scaffolding, described by Wood,

Bruner, and Ross (1976) as consisting “of the adult controlling those elements of the task that are

essentially beyond the learner’s capacity, thus permitting him to concentrate upon and complete

only those elements that are within his range of competence.” One of the most crucial aspects of

successful scaffolding is that the student is working within what Vygotsky called the Zone of

Proximal Development (ZPD), defined by Wertsch (1985) as the distance between the child’s actual

developmental level as determined by independent problem solving and the higher level of potential


development as determined through problem solving under adult guidance and in collaboration with

more capable peers. Enabling the learner to bridge this gap between the actual and the potential

depends on the resources or the kinds of support provided. The best scaffolding will eventually lead

the learner to internalize the processes he/she is being helped to accomplish (Rogoff, 1990).

According to Roehler (1997), successfully scaffolded interventions require a balance between support

and challenge such that "support is provided through scaffolding, and challenge is provided through

learner interest in completing the task."

Based on the Vygotskian notion of making abstract processes more visible, scaffolding can take

a variety of forms. For example, a teacher or a more capable peer could help by modeling strategies

and providing suggestions (as in reciprocal teaching, Brown and Palincsar, 1987). Scaffolding can also

be provided in the form of prompts and questions that help students understand the processes

involved in learning (Scardamalia and Bereiter, 1985; Bell and Davis, 1996; Jackson, Krajcik,

Soloway, 1998; Guzdial, 1995). Jackson et al.’s approach placed an emphasis on the learner and

provided three types of scaffolding – supportive (in support of doing the task), reflective (in support

for thinking about the task) and intrinsic (support that changes the task itself). Guzdial’s EMILE

showed how to embed several kinds of scaffolding in software, as does Bell and Davis’s work on KIE.

More recently, Guzdial’s Co-web has been designed based on the concept of dynamic scaffolding, an

approach in which the prompts are not pre-defined, but rather students and teacher/s collaboratively

co-construct knowledge and provide scaffolding for each other dynamically ‘in response to a

situation’ (Guzdial, submitted). Luckin (1998) distinguished between learner-focussed scaffolding,

which varies the content of activities as well as the help offered, and task-focussed scaffolding, which

varies the activities according to their difficulty and complexity.

The original notion of scaffolding assumed that a single more knowledgeable person would help

an individual learner, providing him or her with exactly the help he/she needed to move forward. But

the modern classroom does not allow that privilege. Rather, a single teacher is often providing

scaffolding for up to 35 students at the same time, usually basing her help not on what any individual

requires at the moment, but rather on what she believes the majority of the class needs in order to be

successful. Scaffolding can be more individualized if it is provided in a paper or software tool that

individuals interact with, or if classroom activities are redefined so that peers can provide scaffolding

for peers and/or the teacher can have a chance to work with individuals as the rest of the class

engages in activities without the teacher. Thus, orchestration of the class is a big issue in the design

of scaffolding that will work in school.

In order to provide successful scaffolding, whether for an individual or a class, through a human

or a tool, it is essential to understand the learner in terms of his/her current understanding and the

difficulties that he/she will face in successfully completing the learning activity. While the original

notion of scaffolding suggested that it was necessary to understand specifics about the individual


learner being helped, notions of providing scaffolding that will work for everybody among a group of

individuals points us in a slightly different direction – towards understanding, in general, the kinds of

understandings and difficulties learners have as they engage in a learning activity (Jackson et al., 1998; Guzdial, 1995).

Our initial pilot studies helped us understand those aspects of the design process that were

particularly difficult for middle-school students and some of the help that teachers had to supply so

that students could successfully learn science through design. The studies detailed in this paper helped

us to better understand the needs of middle schoolers engaging in learning science from design, the

kinds of scaffolding they needed, tools that can provide some of that scaffolding, and how to

orchestrate a classroom so that the full set of scaffolding needs middle-schoolers have in a design

environment can be provided for them. It has implications more generally, as well, for project-based

and inquiry approaches to learning.

STUDY 1 – INTRODUCING DESIGN DIARIES TO LEARNING BY DESIGN™

Based on the early pilot studies, and on teachers’ suggestions about how to provide individual

students with guidance while they were designing, our first approach to providing scaffolding focused

on helping students through the activities and reasoning processes involved in designing. We wanted

to provide help that would allow the students to achieve a design challenge productively without

being overwhelmed by its complexity and without overwhelming the teacher. We also wanted

students to recognize what they had been doing so that they would be able to articulate and reflect on

their activities to learn from them.

Processes involved in designing

Our first approach to scaffolding took the form of externalizing the activities (Collins, Brown

and Newman, 1989) involved in designing and in learning through design, by making clear to students

the range of activities that they could carry out and suggestions about what they might do next. We

articulated a simplified view of the processes involved in designing that divides it into four phases

(Figure 1), each of which has several component activities and products – understanding the problem,

gathering information, generating a solution, and evaluation.

Understanding the challenge is an important first stage in designing (Atman & Bursic, 1996). To

clarify their understanding of the challenge, we wanted students to think about sub-problems they

would need to address and to remember their own prior experiences that might give them hints about

how to go about addressing each. We then wanted them to identify questions they needed to answer


or topics they needed to learn more about (learning issues) that would enable them to understand the

challenge better and move forward with addressing it. During this phase, we also wanted them to

generate initial ideas for addressing the challenge. (This is essentially the reasoning that happens

around a whiteboard in PBL at the beginning of each new problem.)

Figure 1: A simplified view of the design process. [The figure depicts four interconnected phases, each with its component activities: problem understanding (restating the problem, questions, initial ideas, learning issues); research (use resources, second set of ideas); generating a solution (generating criteria, generating alternative solutions, choosing a solution); and build/redesign (evaluate design/test, optimize/rebuild).]

The second phase, research and information gathering, happened after questions were generated.

For the Jekyll Island challenge, information gathering involved reading print material, looking at

available videos and laser-disk resources, and modeling different erosion management approaches in

stream tables. Based on our initial studies we understood that students would need help with each of


these. This phase also included generating a second set of ideas and perhaps new questions, based on

better understanding of the topic.

Generating a solution was the third phase in this design cycle. We identified three important

subgoals in this stage. First was to generate criteria that could be used to evaluate potential solutions.

Second was to come up with two or three possible solutions and weigh them with respect to the

criteria generated. Third was to select the best solution from among those being considered. This is

the one that would be constructed and iteratively refined. Choosing between potential solutions

requires deep understanding of scientific concepts being learned, as it requires applying criteria to

potential solutions and making predictions about how they apply.

During the evaluation stage, solution ideas are constructed and tested, with explanations

generated about why something did not work as predicted and decisions made about whether to refine

the current solution or to redesign. When something doesn’t work as predicted, a designer has to

explain why. It is possible that her predictions were wrong due to a misunderstanding of the

underlying science or that some of the features of her design need to be improved. Science concepts

must be applied to generate an explanation, and science misconceptions are often identified in this

phase. Our pilot studies showed us that students certainly needed help to generate such explanations.

As can be seen in Figure 1, these processes are not linear. If a constructed design doesn’t work as

expected during evaluation, more questions might be generated and more investigation needed, or more solutions might need to be generated and a new possibility chosen, or the challenge might come

to be better understood. Each requires returning to a different phase of designing. Note that

construction and evaluation drive design decisions – the opportunity to try out ideas followed by the

need to explain results is critical to motivating a need and desire to revisit conceptions and ideas and

refine them.
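Viewed computationally, the process in Figure 1 is a cyclic graph rather than a pipeline. The sketch below is purely illustrative (our own rendering of Figure 1, not part of the curriculum materials); it encodes the four phases and the returns that evaluation can trigger:

```python
# Phases of the simplified design process (Figure 1) and the allowed
# transitions between them; evaluation can return a designer to any
# phase, so the process is a cycle, not a pipeline.
PHASES = {
    "problem_understanding": ["research"],
    "research": ["generate_solution", "problem_understanding"],
    "generate_solution": ["build_evaluate", "research"],
    # Testing can trigger new questions, new solutions, or a
    # re-reading of the challenge itself.
    "build_evaluate": ["problem_understanding", "research",
                       "generate_solution", "build_evaluate"],
}

def can_move(from_phase: str, to_phase: str) -> bool:
    """True if the model allows moving directly between two phases."""
    return to_phase in PHASES[from_phase]

assert can_move("build_evaluate", "research")        # new learning issues
assert not can_move("problem_understanding", "build_evaluate")  # no skipping ahead
```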

Given this model of the activities that are part of designing, we identified seven “products”

associated with seven subactivities of the design process that we could ask students to create in

addition to their final physical product: (1) their statement of their understanding of the challenge,

(2) a first set of learning issues, (3) initial ideas, (4) a second and more refined set of learning issues,

(5) a second and more refined set of ideas, (6) descriptions of solutions, and (7) criteria for evaluating

solutions. The products would serve two purposes. They would help students keep records essential t o

making good design decisions and engaging well in designing and learning, and they would provide us

with documentation of students’ thinking, knowledge, and capabilities.

Tools for scaffolding: The Design Diaries

We created a tool that we call a Design Diary to scaffold the design process. Design Diaries are a

paper-and-pencil-based tool with pages associated with the major activities and products of the design


process described above. Each page in the Design Diary had prompts to help students carry out its

associated design step and write down important information. For example, during problem

understanding, we asked students to restate the problem in their own words. During information

gathering, we asked them to restate questions that had been derived on the whiteboards. Prompts for

choosing between alternative solutions asked them to identify the criteria against which they would

evaluate possible solutions and to state why they thought these criteria were important. We also

made available, on some pages, examples of good and not-so-good responses as models of what we

wanted them to think about and articulate. Table 1 shows the prompts from several diary pages.

Appendix A shows the full set of prompts we provided.

Design phase | Activity supported | Sample prompts
Problem understanding | Generating questions to understand the problem | This is what I understand of the problem (Please restate the problem in your own words). What questions do I need to ask in order to understand the problem better?
Generating solution | Evaluating alternative solutions | What are the problem requirements? What are the criteria? Which criteria do each of your solutions meet? Which do they not meet? What are the positive features of each of the solutions? What are the limitations of each of the solutions?
Generating solution | Coming up with criteria | What are the criteria against which you will evaluate possible solutions? Why do you think these criteria are important?

Table 1: Sample prompts from design diary pages

The diary pages served many functions. Based on the notion that “making covert, abstract

processes visible, public and manipulable, serves as a necessary catalyst for reflective metacognitive

activity” (Derry, Tookey & Chiffy, 1994), the diaries served as a vehicle for providing hints. They

helped students to recognize what phase of designing they were in and to record ideas and knowledge

relevant to that phase of designing. They also provided prompts to help students decide how to move

forward. Thus they made thinking visible and recorded students’ journey through the design process.

Design diary hints provided guidance for students both in carrying out design activities and

reflecting on them in order to learn from them. Diaries prompted students to make critical decisions,

such as what function they would model, what materials they would use and why, what they needed to

learn to model the function they chose, how they would evaluate potential solutions, whether they


needed to revise their models, and so on. Sometimes they provided hints about how to make those

decisions.

The diaries were used by students as an individual planning and reflection tool. Before engaging

in a design activity, they worked at home or in class planning for it and writing down their thoughts;

sometimes after engaging in an activity, they would update their diaries. The intention was that by

thinking alone before a group design, planning, or investigation activity, they would be better ready

to engage thoughtfully in the activity keeping the science they were learning in mind, and they would

be better able to contribute their ideas to small-group discussions. When using the diary after a hands-

on activity, the diary would encourage them to think about and articulate what they had done and

why.

One of the important issues to consider while designing scaffolding, as Luckin (1998) has

pointed out, is to guard against making it too prescriptive or too exclusive. The diaries had

suggestions in them that students could use, and they also contained examples. However, the

suggestions were presented on the side, in such a way that students could read and use them or not, as

they saw fit. The teacher prompted the students to read the suggestions in order to carry out the

design activities.

The classroom context

The diaries were used during the implementation of the Jekyll Island challenge. Jekyll Island is a barrier

island off the coast of Georgia. The island has a problem in that it is eroding at the northern end and

accreting at the southern end. Also, a man-made canal (channel) in a nearby river is causing the sand

to accumulate at the southern end. The challenge is to come up with a means of managing the

erosion. Students were asked to come up with a physical means, but they could come up with long-

term plans as well.

The challenge was carried out in a mid-SES suburban Atlanta middle school. Students were in 8th

grade (14 years old). There were up to thirty students in each class, with a range of abilities. Their teacher was, by that time, a seasoned PBL

facilitator, but this was her first full implementation of the Jekyll Island problem.

Students in the four classes carried out the set of activities that have since become typical in an

LBD environment (Kolodner et al., submitted) – they brainstormed their initial ideas, generated

learning issues, generated and evaluated their solutions, and built working models of their designs.

Investigations, construction of working models, and explanation of results were carried out by small groups

of three to four students. The whole class generated ideas and learning issues, and small groups

reported on results of investigations and of modeling activities to the whole class.

The classroom that they were in was arranged in such a way that there were five stream tables at

one end. Students used the stream tables to build models that would allow them to investigate the


results of several different erosion management methods they came across and to try out their

solution ideas. Students could also go back to their desks and work on their diaries or they could look

for information on the Internet. There was also a video disk player, and students could view video

clips on erosion and related topics to help them with their investigations and designs. A trip to Jekyll

Island was also an important part of this three-week unit.

As students went through the phases in the design process, they used the stream tables to test

their design ideas. Students used sand, stones, and water to simulate waves and currents and the way their

designs would work to help stop erosion. They worked in groups, and the teacher helped each group

with their work. Students used the diaries either during class or as homework to generate learning

issues, solution ideas and criteria. They would come back to class each day and discuss in small groups

what they wanted to do next and how they would carry it out.

RESULTS

Students kept records of their design activities in their diaries all through the three-week period.

These were collected at the end of the unit. Student responses in the diaries were analyzed for

evidence of science learning and for understanding of the design process. In addition to the diaries, a

researcher (the first author) was present in the class taking down notes and observations throughout

the three-week period. These observations were important to understanding how the teacher

managed and facilitated the activities and the ways in which students moved between the activities.

Coding of student entries in the design diaries

To analyze depth of science understanding, the seven interim products the students recorded in

their diaries were analyzed using a four-point coding scheme. For each, a score of 0 was given to responses that showed no scientific thought, and a score of 1 to responses with some science

that was general and not concrete enough. Responses with some scientific explanation (but not deep

enough) were given a score of 2, and a score of 3 was given to responses that included a well-justified

scientific explanation. Table 2 shows an example of the coding categories for analyzing students’

alternative solutions.
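Our coding was done by hand, but the bookkeeping is easy to state precisely. The following is a minimal sketch, with hypothetical scores and names of our own invention (e.g., score_percentages), of how 0-3 codes on the seven interim products could be tallied into per-category percentages of the kind reported in Figure 2:

```python
from collections import Counter

# The seven interim products coded from each student's design diary.
CATEGORIES = ["U", "Q", "I1", "LI", "I2", "S", "C"]

# Hypothetical coded diaries: one dict per student, mapping each
# product to its 0-3 score (0 = no science ... 3 = well-justified).
coded_diaries = [
    {"U": 1, "Q": 1, "I1": 0, "LI": 1, "I2": 2, "S": 1, "C": 2},
    {"U": 1, "Q": 2, "I1": 1, "LI": 2, "I2": 1, "S": 2, "C": 1},
    {"U": 0, "Q": 1, "I1": 1, "LI": 1, "I2": 1, "S": 1, "C": 2},
]

def score_percentages(diaries, category):
    """Percentage of students receiving each score (0-3) in one category."""
    counts = Counter(d[category] for d in diaries)
    n = len(diaries)
    return {score: round(100 * counts.get(score, 0) / n) for score in range(4)}

for cat in CATEGORIES:
    print(cat, score_percentages(coded_diaries, cat))
```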

The first of our seven coding categories was students' understanding of the challenge (U).

Students were asked to elaborate and analyze the situation to extract issues related to the challenge

and problems they needed to address. Table 3a shows examples of students' responses in this

category. A score of 1 meant that the response was a repetition of the problem statement without

any elaboration. Responses that showed an analysis of the situation on Jekyll by listing the causes of


erosion on the island, such as the effect of the sea wall on the north end and the problems caused by

dredging a channel (see Table 3a), scored a 3.

Score | Coding category | Sample response
0 | Irrelevant response | Put houses on stilts.
1 | Some solution ideas, but not concrete enough | A sea wall with rocks on both sides of the wall; put wall at an angle.
2 | Solutions with some explanation of science, but not concrete enough | Build a wall that blocks the river from letting it erode the north end, pushing it.
3 | Solutions with explanations in terms of the science (e.g., redirecting the long-shore current) | Make a wall coming out of the eastern side that will collect sand and build beaches that the sea walls have destroyed. Also, a sea wall on the north side will cause the long-shore current to bounce off. Another wall on the northwestern side will direct waves away from the island.

Table 2: Coding categories for analyzing students’ solutions and solution ideas

Coding category | Score | Sample response
Not understood, or a vague analysis | 0 | I understood that we are working on Jekyll problem and we are working on it now.
Repeat problem verbatim | 1 | Try to stop the erosion on Jekyll.
Some elaboration (e.g., we have to tackle erosion, some mention of currents) | 2 | We have to find a way of stopping the movement of sand from the northern end to the southern end.
Elaboration with emphasis on explaining the problem (mention currents, channel) | 3 | This is happening because the sea wall is causing the sand to move from the north end to the south end. Also, a deep man-made channel is causing problems.

Table 3a: Coding problem understanding (U)

A second part of understanding a situation is to raise questions or learning issues to clarify the

problem. This was done during the Jekyll Island Challenge early on before investigations (Q) and

again after the first round of investigations (LI). As illustrated in table 3b, responses that raised non-

scientific questions regarding the cost of the designs or about the aesthetic aspects of design, for

example “how much will it cost?” or “will it look ugly?” were given a score of 1. Although these

were important aspects to consider, our main aim was to enable students to learn the science. As


such, responses that failed to focus on science got scores of 0 and 1 while those that focused on

science were given a score of 2 or 3. A score of 2 meant that students raised questions that were scientific but broad, such as “How do we build a sea wall?” or “What is causing erosion?” A

score of 3 was given when students raised questions that required explanatory answers, such as “How

are the currents affecting erosion?” and “What are long-shore currents, and how do they affect what

is happening on Jekyll?”

Coding category | Score | Sample response
No questions, or irrelevant questions | 0 | We have to solve this problem.
General questions (residents, ‘how’ questions, but very general, such as ‘how can you stop erosion’) | 1 | How much will it cost? Will it look ugly?
Some science questions (regarding what erosion is and how it happens) | 2 | What is making the sand move? What is causing the erosion?
Deeper science questions (causes of the problem; mention channel, currents, and in general the ‘how’ questions) | 3 | How are the currents affecting erosion? What are long-shore currents and how do they affect what is happening on Jekyll?

Table 3b: Coding of learning issues/questions (Q and LI)

Three sets of solution ideas were scored and analyzed. The initial set of ideas (I1) were the ones

that students generated before they had had a chance to learn much about erosion and its causes. The

second set (I2) were generated after students collected information and researched their learning

issues. During this time, student groups might also have tried out some of their erosion management

ideas in the stream tables. The third set of ideas (S) were their proposed solutions. They were

generated after students had tested earlier ideas in the stream tables, evaluated which few were most

likely to yield good results, and modeled their proposed solutions in the stream tables. In all of these

cases, ideas that showed a deeper understanding of science scored a 3. As shown in Table 3c, students’

understanding of the science ranged from a shallow “a sea wall with rocks on both sides of the wall”

to “currents can cause a lot of damage, rocks work well on slowing waves down, putting the seawall at

an angle helps” (scores 2 and 3 respectively).

Coding category | Score | Example response
No response, or an impractical response | 0 | Make a tube to bring the sand to the north end of the island.
Some solution ideas, but not concrete enough | 1 | A sea wall with rocks on both sides of the wall.
Solutions without explaining the science | 2 | A sea wall with rocks on both sides of the wall.
Solutions with explanations (e.g., redirect the long-shore current) | 3 | Currents can cause a lot of damage; rocks work well on slowing waves down; putting the seawall at an angle helps.

Table 3c: Examples of students' ideas about solving the problem (I1, I2 and S)

Criteria (C) are subgoals a designer aims to achieve. For any given design solution, there are

often aesthetic, cost, and scientific and engineering criteria. In the case of the Jekyll Island

Challenge, the teacher asked students to focus on earth-science-related criteria (and this was well-

discussed in class). In our coding, too, we were most interested in the earth-science-related criteria

students generated. Thus, criteria such as “cost”, “will it last?”, and “what will be its effect on

environment” were given scores of 1. Criteria such as “stop erosion” were given scores of 2, because

although they referred to the problem broadly, they were not specific enough. Criteria that specified how those general “global” criteria would be achieved, such as “slow down waves,” were given scores of 3. Students tended to score 1’s and 2’s on this item; not many were very specific about the criteria.

Coding category | Score | Example response
General criteria | 0 | Save Jekyll.
Design related, but not science related | 1 | Save the environment; cost.
Science related, but global (stop erosion) | 2 | Stop erosion; save as much of the beaches as possible.
Specific science-related criteria (redirect current) | 3 | Slow down waves.

Table 3d: Criteria generated by students (C)

Students' responses in the diaries and their progress over time

The results indicated some growth in student learning as students progressed through their design

activities, with somewhat better articulation of science towards the final phases of the design cycle.

In the initial stages, students' responses were quite general (0’s and 1’s). Later, the percentage of 0

scores went down and the percentage of 2’s and 3’s went up slightly, suggesting that the class as a

whole was showing some deeper understanding of science, but not as much as we would have liked or

expected.

Figure 2 shows the breakdown of students’ responses across the whole set of coding categories: problem understanding (i.e., analyzing the situation) (U), initial questions (Q), initial design ideas (I1), learning issues after researching the topic (LI), solution ideas after researching the topic (I2), solutions (S), and criteria (C). On the whole, the majority of student responses were towards the

lower end (scores of 0 and 1), except for criteria (C), where scores were mid-range (1 and 2).


To start with, 75% of the students repeated the design problem verbatim without elaborating it

(U). The first set of questions they asked about the challenge to clarify their understanding were also

weak, with 75% of students scoring 0 or 1. As many of the students did not relate the problems on

Jekyll to science, their initial solution ideas were rather vague and impractical, with 75% scoring 0 or

1 on their initial design ideas (I1). This seems natural since they did not have the required knowledge

to come up with good solution ideas. Students' solution ideas later on, after they had had an

opportunity to study what was happening on Jekyll (through experiments on stream tables, resources

provided in the class such as laser discs, discussions in the class, and a visit to the island) improved a

little, but not significantly. Looking at their second set of questions (LI), we see that now only 65%

scored 0 or 1; in their second set of ideas (I2) and solutions (S), only 70% scored 0’s and 1’s. But we

had expected much more growth, with a higher percentage of responses scoring 3, reflecting

significantly increasing understanding of the science involved as students progressed in their design

activities. We also found that students often got fixated in the first solution idea that they thought of

and did not come up with alternatives.

Figure 2: Student responses in the first implementation of the Jekyll Island Problem (percentage of responses at each score, by coding category):

Score | U | Q | I1 | LI | I2 | S | C
0 | 15 | 36 | 24 | 22 | 22 | 26 | 6
1 | 75 | 42 | 50 | 46 | 50 | 49 | 42
2 | 8 | 9 | 12 | 19 | 12 | 12 | 46
3 | 1 | 5 | 7 | 9 | 13 | 9 | 0

Students’ understanding over time

Students spent over three weeks on the project. One of the most important aspects of the design

process is that students engage in multiple cycles of designing and evaluating. For example, their first


ideas (I1) were a part of the initial stage of the design process, where they came up with any number

of ideas to solve the problem. As they progressed through the design process they refined their ideas

(I2Q) and then narrowed the ideas down to two or three possible solutions (S). The results reported

above show that students developed a slightly better understanding of science between the start and

end of the challenge, but certainly not as much as we would have liked or expected. But what about

for each student? Did their solution ideas reflect a better understanding of science towards later stages

of the design cycle? Perhaps by looking at how individual students progressed, we could find out

something about the ways their ideas and understanding of science were progressing – something that

would help us understand whether we were on the right track, what their particular difficulties were,

and what other scaffolding or orchestration needed to be provided. We set out to study whether

students refined their ideas in terms of the science that they were learning as they progressed through

the design process.

We used the Wilcoxon signed-rank test to analyze the progression of students’ individual scores.

For each student, we compared the quality of initial ideas (I1), generated during the first week of the

challenge, with ideas they generated after their investigations (I2) (Table 4), and we compared those ideas with their final ideas, represented in their solutions (S) (Table 5). A negative rank signified that there was a shift from a higher score to a lower score (e.g., from 3 to 0, 1, or 2). A positive rank

meant a positive change – from a lower to a higher score.
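For readers who want to reproduce this kind of analysis, here is a minimal sketch using scipy with hypothetical scores (not our data). The default zero_method='wilcox' drops tied pairs before ranking, which corresponds to the ties being reported separately in Tables 4 and 5:

```python
import numpy as np
from scipy.stats import wilcoxon

# Hypothetical 0-3 diary scores for the same twelve students at two
# points in the unit (I1 = initial ideas, I2 = ideas after research).
i1 = np.array([1, 0, 2, 1, 1, 0, 2, 1, 3, 1, 0, 1])
i2 = np.array([2, 1, 1, 1, 2, 0, 2, 2, 2, 1, 1, 1])

diff = i2 - i1
neg_ranks = int(np.sum(diff < 0))  # I2 < I1: articulation got worse
pos_ranks = int(np.sum(diff > 0))  # I2 > I1: articulation improved
ties = int(np.sum(diff == 0))      # I2 = I1: no change

# zero_method='wilcox' (the default) discards tied pairs before ranking.
stat, p = wilcoxon(i1, i2)
print(f"negative ranks={neg_ranks}, positive ranks={pos_ranks}, ties={ties}")
print(f"Wilcoxon W={stat}, two-sided p={p:.3f}")
```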

When all ranks were compared, the results supported those reported above; there were no

significant trends over time. Although the number of students who showed a better understanding of science increased from I1 to I2, the difference was not significant (Table 4). There were more positive ranks

than negative ranks, meaning that some students did improve in terms of their science understanding,

but there were almost as many cases where students scored more poorly on I2 than on I1, and half of

the ranks were ties, showing no change in science articulation from I1 to I2.

When we compared changes from I2 to S ( Table 5), results were similar, showing little change,

but, interestingly, a significant number of the students (z = -2.085, p = .037) gave less scientific answers

in their solutions (S) than they had when articulating their second set of ideas (I2). Solutions included

the best two or three ideas that students narrowed down to after they had tested some of their ideas

and learned more about the science of erosion. Twenty-two students did worse on solutions; 11 did

better, and the other 65 were ties.

Ranks | N
Negative ranks (I2 < I1) | 23
Positive ranks (I2 > I1) | 30
Ties (I2 = I1) | 46
Total | 99
Z = -1.597, Sig = .110

Table 4: Comparison of I1 and I2

Ranks | N
Negative ranks (S < I2) | 22
Positive ranks (S > I2) | 11
Ties (S = I2) | 65
Total | 98
Z = -2.085, Sig = .037*

Table 5: Comparison of I2 and S

Connecting science to designed artifacts: Is this where the problem was?

Clearly, students were either having problems making connections between their project

solutions and the science they were learning, or they were not writing down as much science as they

were taking into account. When we looked at the design cognition literature, we found evidence suggesting that the first was the problem. Novices tend to neglect making predictions about behavior that would help them debug their models (e.g., Murayama, 1994). If students were focusing on structure and neglecting

function and behavior, then this would mean they were not doing the kinds of reasoning that would

afford connections between their designs and the science they were learning, which would explain why they

were not articulating the science they knew, as function and behavior tend to be more inherently

connected to scientific knowledge and reasoning. We wanted to find out if this was the difficulty

students were having, this time coding student responses to get at the aspects of their designs that

they were attending to. We did this using a coding scheme based on the SBF (Structure-Behavior-Function) model of representing devices (Goel, Garza, Grue, Murdock & Recker, 1996). The SBF

representation provides definitions of structure (the component parts), function (the functions of

those parts), and behavior (the causal relationships between the functions of the components that

result in actions), and we used those definitions to code the student responses. An SBF model of a

device specifies the structure and the functions of the device, and also its behaviors that specify how

the device functions are composed from the functions of the structural components (Goel et al.,

1996). As we suspected, we found that students concentrated heavily on the components (Structure)

of the design that they were working on. As can be seen from Figure 3, a large percentage of student

responses concentrated on the structures of their designs, in all of the seven design diary pages that


they recorded. Although there was a gradual increase in their responses in the function category

towards the end of the design cycle, the gains were not big. The learning issues and the criteria that

they came up with reflected their understanding of the components their designs needed to have.

Typical were suggestions such as 'build a sea wall'. But there was little evidence in the design diaries

that students attended to the functions their solutions were supposed to fulfill, e.g., “How do sea walls

work”, “How do high tides affect erosion?”. This is consistent with our first coding scheme. Indeed,

there was evidence that students found it difficult to make connections between the parts of their

designs and their functions and to understand how the parts worked together to achieve the necessary

function/s. For example, very few students explicitly elaborated the problem statement to indicate

that they needed to understand the role of tides and currents as causes of erosion. Most students only

showed a shallow understanding of the problem, such as “we need to do something about people

losing their properties on Jekyll”. Many students were concerned about effects their designs would

have on the environment, but even those students failed to express functionality they needed to

address when they derived their learning issues.
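To make the distinction we were coding for concrete, here is a minimal sketch of an SBF-style description of one erosion-control device. It is our own illustrative rendering, not Goel et al.'s formalism, and the device details are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class Component:
    name: str      # Structure: a physical part of the device
    function: str  # Function: what that part is for

@dataclass
class Device:
    name: str
    structure: list = field(default_factory=list)
    behavior: list = field(default_factory=list)  # causal chain of events

sea_wall = Device(
    name="angled sea wall with rock apron",
    structure=[
        Component("wall set at an angle to the shore",
                  "redirect the long-shore current away from the beach"),
        Component("rocks piled on the seaward side",
                  "absorb wave energy and slow incoming waves"),
    ],
    # Behavior: how the component functions compose into the device's
    # overall function of slowing erosion.
    behavior=[
        "waves strike the rocks and lose energy",
        "slowed water drops sand instead of carrying it away",
        "the angled wall deflects the long-shore current offshore",
    ],
)

# A structure-only student response ("build a sea wall") names the
# parts but supplies none of the function or behavior entries above.
print(f"{sea_wall.name}: {len(sea_wall.structure)} components, "
      f"{len(sea_wall.behavior)} behavior steps")
```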

Figure 3: Students’ understanding of Structure, Function and Behavior (percentage of responses in each SBF category, by coding category):

SBF category | U | Q | I1 | LI | I2 | S | C
Structure | 79 | 96 | 86 | 83 | 84 | 80 | 63
Function | 22 | 4 | 13 | 13 | 16 | 19 | 34
Behavior | 0 | 1 | 2 | 5 | 1 | 2 | 4

Observations

Why were students not getting to function and behavior? Were they learning science content

but not using it? If so, why were they not using the science they were learning? For this, we turned to

ethnographic observations we had been making in the classrooms. Our classroom observations helped

us to explain some of these results and to clarify the role that the diaries and the other agents in the


classroom (teacher, peers, resources) were playing. From this, we have been able to suggest roles each

should play in an environment where students are learning science from design activities.

The first author collected our classroom observations from all of the four participating classes.

On average, every class was observed at least three times a week. This gave us the opportunity to

observe students through all the phases of the design cycle. Observations focused on noting student

uses of design diaries and observing small groups as they discussed their solution ideas, as they

consulted resources to explore their questions, and as they tried out their solutions. We also observed

group presentations of their designs and the discussions that followed. We note here those trends that

we think provide some explanation for why student results were not more positive.

In general, students wrote in their diaries as homework, usually in preparation for what they

would do the next day. When they got back together in their small groups the next day, they had a

short planning discussion and then began the work they were to do that day – generating ideas,

investigating, building and testing models in the stream table, and so on. Students worked well with

each other – that is, they listened to each other and figured things out together; they divided up work

well; and they were truly engaged – intellectually and emotionally – in what they were doing. The

teacher walked from group to group to make sure students were on task and to listen in on their

discussions and help them out of quandaries. She also used the video-disks as a resource to answer

student questions. The teacher had also put together a set of internet resources that she encouraged

the students to explore. In addition, she spent some time at the end of each class to discuss the

logistics for the next day. A casual observer would have concluded that these were students who were

enthusiastically engaged and comfortably working together to learn.

However, there were many signs that the full affordances of designing for learning science were

not being grasped and taken advantage of. Most of the student groups iteratively refined their design

solutions by trial and error without systematically reasoning about why their designs weren’t working

as well as expected and without systematically reasoning about the changes they might make to their

designs and what might result. When they tested their designs, they often did not refer back to the

criteria that they had set in their diaries. If they thought that their designs failed, most did not go

back to the diary pages on learning issues or evaluation to reason about what else they might have

needed to learn. In other words, they were not relating their design activities to the science that they had been exploring as they read and watched videos.

As a result, many opportunities for connecting science to design were missed. While each

student had something to contribute to group discussion as a result of using diaries to prepare for

group work, students did not, in general, formally refer back to the diaries as they worked together.

Nor were they encouraged to go back and annotate diary pages after their group work. There was

little sharing of the work that they did in the diaries; the questions and ideas that students raised in

their diaries were not followed up in small-group or whole-class discussions. As there was no formal


follow-up on their work in the diaries, many of the interesting issues that they addressed in their

diaries and the misunderstandings they had about the science issues were not discussed. Because the

teacher was not formally checking the diaries each day, she was not fully aware of the lack of depth

in what the students were writing and discussing in their small groups. In essence, diary work was not

fully integrated with the other activities in the classroom.

Nor was there advantage taken of the links between the various activities that students

undertook -- reading, modeling possibilities in the stream table, coming up with hybrid solutions, and

so on. Students seemed to perceive the process as being linear and without links between different

activities. They concentrated on single moves (Christiaans and Dorst, 1992) and worked on solving

parts of the problem without consideration for the whole. Whole-class discussions were rare, and the

teacher was trying hard to allow the students to work independently. Thus, the students had little

help connecting their investigations to the planning, building, and testing of their designs. And

without class discussions and diary pages to promote it, students had no real help with reflecting on

what they learned at each stage.

We analyzed our observational data, as well, to pull out particular difficulties students were

having. We saw that they had a particularly hard time using the resources available to gather specific

information they needed to move forward with their designing. Often, they had trouble extracting

useful information from the resources provided, copying whole paragraphs verbatim from the

resources rather than summarizing the content. They were also quite unsystematic in their

information gathering, concentrating heavily on finding interesting facts rather than on attempting

to answer the questions they had raised earlier and recorded on whiteboards. Their fact-finding was

often serendipitous and not related to the purpose that their designs were to fulfill.

The presentations that students made after completing their designs showed that students had

trouble justifying their design ideas. Students used sand, wood, stones, etc. to build walls, then tested

their designs in stream tables (by pouring water and simulating wave patterns), worked on some

revisions, and then, when satisfied with their end product, demonstrated their solutions to the class.

We had expected that they would have justified their designs in terms of the wave actions and

currents they had been using to test them, but they did not, nor were they able to when asked. This

was not revealed until they presented their solutions.

DISCUSSION

Clearly, students needed a lot more scaffolding than the diaries alone were providing. There

seemed to be two big problems. First, the design diaries lacked some of the scaffolding that they perhaps should have provided, in particular scaffolding that would encourage more justification,

reflection, making of connections, and specifying of function. Second, as a beginner in facilitating a


project-based classroom with iteration as a key ingredient, the teacher gave the students more

independence than they could handle and failed to recognize the role she had to play in helping them

plan and reflect and tie together the different aspects of what they were doing. The diaries needed to be integrated better with what students were doing, and the teacher needed to be providing some specific kinds of help that the diaries weren't providing. We needed to redesign the scaffolding, thinking about how it might be distributed between the teacher and the diaries, and to help the teacher

learn how to scaffold and sequence activities better, taking the teacher’s role in scaffolding and

orchestrating the classroom far more seriously than we had done previously.

Need for more specific prompts

One of the reasons why students failed to reason about and justify their designs in terms of the

science they were learning may have been that the diaries did not do a good enough job of

encouraging and supporting justification in terms of science issues. Without such support, it seems,

students will not solve problems with the deliberate intention of understanding and using science. This

is similar to findings by Baumgartner and Reiser (1998), in which students focused on

optimizing the performance of their models and not necessarily justifying their results. Analysis of

student responses in the diaries helped us understand that we needed to provide prompts that would

‘elicit’ students’ understanding of science issues and help them tie the science to making design

decisions. Specific prompts encouraging students to explore and use the science during each of the

design phases might help with this. Also, we believe prompting must be added to explicitly ask

students to reason about the functions of their devices.

Results indicated that students did not make much progress in their solution ideas as they went

further in the design process. The Wilcoxon signed rank test showed that there was no significant growth in

their use of science as students progressed through the design process. One of the reasons why

students did not make much progress in their solution ideas may have been that they did not formally

refer back to their ideas, criteria, and learning issues when they were doing their building and testing.

The kind of monitoring and metacognitive understanding that is required to learn from design was not evident in students' responses or in our classroom observations.

We saw the need, as well, to add scaffolding that would help students use resources well so as to

extract science relevant to the design challenge and the learning issues they had generated earlier.

Need for distributed scaffolding in the classroom

But scaffolding in the design diaries was not the only reason students were having difficulties. It

was difficult for individual students, or even small groups, to be able to fully debug their solutions and

ideas. We had thought that using the science to justify design ideas would provide students with

affordances for connecting science to design. But, we found, it is difficult for students to become


proficient at justifying decisions using science if they are not asked to regularly justify and then debug

their justifications. And, a parallel investigation using software had shown us that when small groups

were encouraged to talk to other groups, they were more likely to engage in discussion about function

(Puntambekar, Nagel, Hübscher, Guzdial and Kolodner, 1997).

That parallel investigation suggested to us that we could encourage talk about function (and

hence more use of science) and provide students with opportunities for attempting to justify their

decisions if we added to the classroom activities times for publicly sharing their design ideas and

justifying them. Students would then have a chance to practice justifying and explaining why their

solutions fulfilled their purposes and why the criteria they chose were the most important ones for

judging their designs, making the tying of science to designing a theme of classroom activities. If

they did this between some of their iterations, with the teacher specifically allocating time for

multiple presentations and whole-class discussions, students would have more opportunity for talking

science and seeing how others were applying it (Tabak and Reiser, 1997).

Another change needed in classroom orchestration, our results suggested, was that individual,

small-group, and whole-class opportunities for learning be more seamlessly integrated with each

other. This would mean orchestrating the classroom in ways that would ensure that individual work

students do in their design diaries be formally used in small-group and whole-class discussions, and that

there be more opportunities for small groups to share with the class the ideas they are formulating

and the results of testing their design ideas. These whole-class discussions might happen during

information gathering, to help the whole class stay on track and share with each other what they are

finding, after making a first pass at coming up with a design solution after those initial investigations,

and between some iterations.

Finally, we saw the need to specifically link testing their models to science. It was difficult to

interrupt students for whole-class discussion of their ‘hands-on’ activities, but if we made interim

presentations of results a formal part of classroom activities (as suggested in the previous paragraph)

and provided students with help explaining their results, perhaps we could help them link the science

better to their modeling and testing.

In other words, it would be important, we thought, to provide students with adequate public

practice at the skills we wanted them to learn; to add more classroom activities that provided

opportunities for that practice, for debugging the practices, and for debugging science conceptions; to

promote more metacognitive reasoning in the design diaries; to promote thinking about function and

behavior during designing; and to integrate the scaffolding provided in the design diaries

synergistically with classroom activities such that each of these important practices be supported in a

variety of different ways.

We have coined the notion of distributed scaffolding to refer to such scaffolding. Our results

indicated that the scaffolding needed to operate at two levels for students to take advantage of the


rich learning opportunities afforded by such an environment. At the individual level, the diaries

needed specific prompts to help students reason about the science they were learning. At the social

level, interactions with peers and teacher-led discussions could provide scaffolding enabling students

to justify and reason about their designs. In distributed scaffolding, both individual and social

processes involved in learning are taken into account, and scaffolding at both levels is synergistically

interleaved. Scaffolding is distributed across the tools and context in which learning is happening.

Each extends the other.

STUDY II - IMPLEMENTING DISTRIBUTED SCAFFOLDING

We implemented three significant changes in our second implementation of the Jekyll Island Challenge to

see if these hypotheses about distributed scaffolding would hold up. First, we refined the design diaries

based on an analysis of the design process more in keeping with the needs we saw for connecting its

different phases to each other. Second, we added scaffolding to the diaries that would encourage

students to think about the structure, function and behavior of the devices they were designing and

modeling. Third, and perhaps most important, we put into place new classroom activities that would

encourage more student-student and student-teacher interactions and we helped the teacher learn how

to facilitate those activities.

The design process

Based on those analyses of designing and on the needs of our students that we identified in our

first study, we kept the four phases we had presented to students earlier, but we made three important

changes in our presentation of the process to the students:

(1) We made the specifics of each phase more visible. (2) We made connections between the phases more visible. (3) We specifically provided scaffolding for thinking about function.

As Figure 4 shows, we described the design process as consisting of four main phases at the

macro level – analysis (called problem understanding previously), exploration (called

research/information gathering previously), solution generation, and evaluation. Each phase was

presented to the teacher and students as consisting of numerous activities at the micro level. Both

can be seen in the diagram. Analysis consists of exploratory activities such as identifying the

objectives, identifying the challenge’s sub-problems, brainstorming to come up with solution ideas,

identifying issues for research, etc. Exploration involves further understanding of the challenge

through literature search, refining the problem specification, establishing criteria for evaluation and

other activities that help in clarifying the situation. Solution generation was presented as the stage in


which a solution is chosen and a prototype is implemented. Evaluation involves evaluating the design

with respect to criteria and identifying aspects for improvement and redesign.

Because the stages are not linear, we also provided a metacognitive level to the students in the

design diaries to help them draw connections between the phases, see the cyclical nature of design,

and make connections between science and design. Within each of the design steps, we provided

support, on the diary pages, for students to reason about the function and behavior of their designs.

For example, suppose a student built a prototype and was evaluating her model. While evaluating, the

diaries would point her back to the criteria generated during exploration that would help her to

evaluate her artifact. Using those criteria affords judging whether her prototype fulfills the

function(s) that were proposed (during analysis) and, if not, whether the structure needs to be changed. These connections are shown in more detail in Figure 4.

[Figure 4 diagram: the four macro-level phases – Analysis, Exploration, Solution Generation, and Evaluation – with their micro-level activities, including understanding the problem (subproblems, context/use, questions); exploration (learning issues, research, specification, tinkering, ideas); solution generation (possible solutions, generate criteria, choose solution); and evaluation (build/test, evaluate, revise/modify), together with supporting activities such as explain/justify, hypothesis generation, prediction, observation, choose material, run tests, revise, decision making, weigh trade-offs, and justify choice.]

Figure 4: Model of the design process - second iteration


Improved version of the diaries

Based on the design process described above, we developed a new version of the design diaries to

add more support for the activities involved in designing as well as for helping students make

connections between science and design. The diaries provided students with three levels of prompts.

At the highest level were the macro-prompts, designed to help students reason about the phase of

design that they were working on. Next were the micro-prompts, designed to help students carry out

the activities within each design phase. Both the macro- and the micro-prompts encouraged students

to reason about the purpose of their designs right from the start, thus having them think about the

science issues that they should be addressing. Apart from the prompts that helped students reason

during each of the phases of design, the new version of the diaries also included metacognitive

prompts which were intended to help students to actively monitor their learning (Brown, 1988,

1994). The metacognitive prompts were designed to help students to monitor their learning by

encouraging them to go back to what they had already written in the diaries, and by helping them

understand the cyclical nature of design. Thus, for example, they were asked to read their problem

specification again before they generated criteria, so that their criteria addressed what they set out to

accomplish in their problem specification. Metacognitive prompts also helped students to reason and

justify as they were making their design decisions. In addition, the diaries also addressed the need for

students to make connections between the scientific principles that they were learning and their

activities by including prompts that helped students to link the structure of their design to its

function, and to reason about the behavior of their models. Table 6 shows examples of the kinds of

prompts that were included in the new version of the diaries.

Macro-level
Type of prompts: high-level prompts about the design phase; help students to think about the overall goal of the phase.
Example prompt: "A specification should only state what is required to solve the problem - not how to solve it."

Micro-level
Type of prompts: prompts to help students with the sub-tasks within each phase; prompts to help students reason about the scientific process, such as making predictions; prompts to help students focus on function.
Example prompts: "Things that you might think about: What will be the function of the device? Who will use it? Where will it be used? Are there any other constraints that you need to consider (e.g., safety, cost, ease of use)?"

Metacognitive level
Type of prompts: prompts to help students relate a particular phase and activity to their overarching design challenge; prompts to help students see the design process as cyclical and the inter-relatedness of phases and activities.
Example prompts: "You might want to refer to your research notes while writing the specification." "Come back to this page as you learn more about what is needed. Remember to date your entries."

Table 6: Types of Scaffolding in the second version of the diaries
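To make the three-level scheme concrete, the sketch below shows one way such prompts could be organized in software. It is purely our illustration under assumed names (the class PhasePrompts is hypothetical; the example strings are drawn from Table 6); the diaries themselves were paper-based, and the authors describe no such implementation.

    from dataclasses import dataclass, field

    @dataclass
    class PhasePrompts:
        """Prompts attached to one design phase, split by scaffolding level."""
        phase: str
        macro: list = field(default_factory=list)          # overall goal of the phase
        micro: list = field(default_factory=list)          # sub-task support
        metacognitive: list = field(default_factory=list)  # cross-phase links

    # Example prompts taken from Table 6 (hypothetical organization, not the authors').
    specification = PhasePrompts(
        phase="Specification",
        macro=["A specification should only state what is required to solve "
               "the problem - not how to solve it."],
        micro=["What will be the function of the device?", "Who will use it?"],
        metacognitive=["You might want to refer to your research notes while "
                       "writing the specification."],
    )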

In addition to the new types of prompts, there were other important improvements in this

version of the diaries. We included additional pages for students to write a 'specification'. This design

diary page was developed to enable students to think carefully about what the purpose of their design

was, so that they might learn to reason about what is to be achieved and how the science that they

are learning could help them achieve this. We also included pages to help them hypothesize about

how their models would work when they tested them and whether their predictions came true. If their

predictions came true, the prompts in the diaries encouraged them to explain why, and if they did

not, the prompting encouraged students to think about what design features they needed to improve

to achieve their goals. The pages for generating criteria were also enhanced with better prompts.

Table 7 shows the new prompts for several of the phases whose prompts were shown in Table 2 and

some of the new categories of scaffolding and the prompts that went with them. Appendix B shows

the full set of prompts as well as the examples that could help guide students’ design decisions for the

second version of the design diaries.

Design phase: Problem understanding
Activity supported: Generating questions to understand the problem
Sample prompts from version 1: "This is what I understand of the problem (Please restate the problem in your own words). What questions do I need to ask in order to understand the problem better?"
Sample prompts from version 2: "Write a brief statement of the problem emphasizing the purpose of the design. Questions you might answer in your statement: What is the purpose / function of the design? Where will it be used? Who will use it? What exactly are you supposed to design? A statement of the problem should not be too vague or too detailed. [Example given in the diary.]"

Design phase: Generating a solution
Activity supported: Evaluating alternative solutions
Sample prompts from version 1: "What are the problem requirements? What are the criteria? Which criteria do each of your solutions meet? Which do they not meet? What are the positive features of each of the solutions? What are the limitations of each of the solutions?"
Sample prompts from version 2: "For each solution write: What is good about it (you should include the criteria that the solution satisfies, from the criteria page; you may want to list other criteria as well, and may also want to update the criteria page). What is not so good about it (you might use the criteria that the solution does not satisfy, from the criteria page; you may want to list other criteria as well, and may also want to update the criteria page). What is interesting about it (you might include what is unique or interesting or special about this solution)."

Design phase: Generating a solution
Activity supported: Coming up with criteria
Sample prompts from version 1: "What are the criteria against which you will evaluate possible solutions? Why do you think these criteria are important?"
Sample prompts from version 2: "Discuss possible problems and advantages, such as: can it achieve its purpose? What characteristics does it need to have to achieve its purpose? How long does it take to make? How durable is it? How does it work in the environment - is it safe, is it easy to use? Use this list to set up evaluation criteria. Use the evaluation criteria to judge your design on the next page. Read your specification again to help generate other criteria."

Table 7: Comparison of the first and second version of the diaries

Social scaffolds

We added two kinds of social scaffolds to the set of scaffolding tools for the classroom: pin-up

sessions in which students would show off their ideas, models, and designs in progress, and whole-class

discussions inserted at strategic times in the orchestration. For each, we worked with the teacher to

decide on appropriate points in the sequencing when they should be held and the kinds of

presentations and discussions that should be their focus.


Pin-up sessions

As discussed earlier, we hypothesized from Study 1 that if students were provided with a variety

of opportunities to justify their design decisions in terms of the scientific phenomena that they were

learning, they would become more adept at justifying and at understanding what a good justification

is. In particular, we wanted groups to present their solution ideas to each other. Reviewing designs of

peer groups and presenting one’s own designs in the classroom, we thought, would provide students

with opportunities to consider alternative designs and also help them in the skills of questioning,

justifying, and critical reasoning, which are important aspects of science learning. We took the notion

of the “pin-up session” from the architecture studio as a format for these presentations. In an

architecture studio, students periodically present their design ideas and sketches to the rest of the

class and to their teacher by creating a poster, pinning it to the wall, and then explaining to the class

their intentions and how they plan to achieve them. Creation of a presentation encourages students

to think through their ideas deeply and make their reasoning clear. Hearing the ideas of others

provides grist for students to learn what makes for good justifications. This was our idea in the middle

school classroom. We believed, as well, that pin-up sessions would give groups that were having a

hard time a chance to come up to the level of the rest of the class and that it would give the entire

class a chance to consider additional alternatives besides the ones they had considered in their small

groups. During a pin-up session, students would present their design ideas to the class, and for each

idea, they were to provide justification. Justification, in general, would come from the reading or

video disks that students viewed, or it would come from the results and analysis of their modeling

activities in the stream tables. After presentation, the teacher and peers would question presenters

about their ideas and justifications, thus encouraging them to think hard about relating their designs to the science they were learning.

Whole-class discussions

We proposed several points for whole-class discussions as well – opportunities for small groups

to share their thinking that would allow the teacher to become aware of common misconceptions and

common difficulties and that would allow students to experience more than what they were doing in

their small groups. To enable sharing of learning experiences across all the groups in the classroom,

whole-class discussions facilitated by the teacher were integrated into Study II. The teacher used

introductory time each day to help students reflect on what they had worked on the day before, to

introduce students to the day’s activities, and to help the class summarize what they had learned

during their previous activities. She also set aside time after pin-up sessions -- while students were

engrossed in designing, building, and testing models in stream tables – for looking across the different

ideas proposed by peers and to draw out what might be learned from them. Helping students “come

up for air” while in that construction and testing phase, we thought, would be critical in a design-based

science classroom, as we saw that students engrossed in designing would easily forget to apply science


to what they were doing in favor of trial-and-error iterative activities. We hoped that integrating

whole-class discussions regularly into classroom activities would encourage students to reflect on and

explain their designs in terms of the science they were learning. Web-SMILE (Puntambekar et al.,

1997; Guzdial et al., 1997), a tool for facilitating electronic discussions between groups in the

teacher's four classes, was also used in this study.

DISTRIBUTING SCAFFOLDING ACROSS THE SCAFFOLDING TOOLS AND ACTIVITIES

An important feature of the second study of the Jekyll Island Challenge was that the tools and

activities that provided different levels of scaffolding were distributed across the activities and agents

in the environment (illustrated in Table 8). Each tool had an important role to play in supporting

different design activities.

Tools and activities: Diaries
Practices supported: practices that are part of designing – macro, micro and meta levels
Use of the tool or activity: by individuals, as homework or during reflection time
How the tool or activity supported learning the practices: macro-, micro- and meta-cognitive prompts and examples

Tools and activities: Pin-up sessions and electronic discussions
Practices supported: justifying solution ideas, generating criteria
Use of the tool or activity: by the community, after investigations and after coming up with possible solutions
How the tool or activity supported learning the practices: teacher and peer questions (situational prompts) and explanations; teacher and peer modeling

Tools and activities: Whole-class discussions and presentations
Practices supported: sharing solution ideas, asking questions across classes
Use of the tool or activity: by the community, during solution generation and evaluation
How the tool or activity supported learning the practices: teacher and peer questions (situational prompts) and explanations; teacher and peer modeling

Table 8: Distributing scaffolding across agents, tools, and activities

Thus, while the diaries were designed to provide scaffolding for the cognitive and metacognitive

aspects of each of the phases of design, the pin-up sessions and whole class discussions were designed

to provide a social environment that would make clear what particular help students needed and allow

any of the agents in the environment who could provide that help to provide it. Scaffolding in the

diaries was set based on our analysis of design processes, but scaffolding in the pin-up sessions and

whole class discussions was ‘dynamic’ (Guzdial, submitted). It changed based on the situation and

student needs at that time. In particular, the pin-up sessions, we believed, would help students justify

their solution ideas and evaluation criteria and provide opportunities for questioning their peers;

whole-class discussions would help students share their learning experiences and help the teacher to


ask for and also provide explanations for the scientific phenomena that students were learning;

electronic discussions would support the exchange of ideas and solutions across classes with whom

they did not have the opportunity to interact in real time; and student presentations at the end of

the challenge would help extend and generalize students’ science understanding.

The classroom environment

Study II was carried out the year following Study I, in the same school, with the same teacher, and using the same Jekyll Island Challenge in an eighth-grade Earth Science class. The classroom

had the same layout, with stream tables, a video disk player, and a computer with an internet

connection. What was different this time was that we used the revised design diaries, and we helped

the teacher learn when to introduce the social scaffolds and how to facilitate each. Students worked in

small groups as in the first study, but because of the pin-up sessions and whole-class discussions, they

were given numerous opportunities to make their design decisions public, to review and justify their

designs, and to comment on their classmates' designs.

As in the first study, the challenge began with the teacher presenting it to the students and the

students and teacher working around a PBL whiteboard to generate issues they needed to learn more

about to achieve the challenge and initial ideas. Students then worked individually or in groups

gathering information from available resources in an attempt to answer the questions generated. This

time, however, when investigation was complete, the teacher called the class together for discussion

of what they had found out. She helped them clarify their science understanding, offered

explanations, and asked questions. Small groups then proceeded to come up with their refined design

ideas based on investigations that had been done (I2). This was followed by a pin-up session, where

students presented their ideas and the class publicly discussed the pros and cons of each. Groups tried

out ideas in the stream tables and attempted to evaluate them; this was followed by another pin-up

session. Groups narrowed their ideas again after that; this was followed by another pin-up session.

Finally, groups presented their solutions to each other and showed how they would work in the

stream tables, followed by whole-class discussion comparing and contrasting across the different

proposed solutions.

Interestingly, this new orchestration not only created more opportunities for groups to share

ideas, but our observations show that it promoted far more small-group discussion about ideas and

their justifications than had happened in Study I. Individual, small group and whole class learning

seemed to be integrated, and scaffolding was distributed across activities and agents.


RESULTS

As in Study I, we analyzed diary responses using the seven-part coding scheme. We also collected

classroom observations.

Students’ responses in diaries

Figure 5 shows the results of our analysis of design diaries. Comparing this figure to Figure 2, it is

evident that students performed better this time. This time, students mentioned waves and tides (the

causes of coastal erosion) right from the start of the design process. They identified the problem

they needed to address better – slowing down erosion – and their ideas, learning issues, and criteria

showed that they were keeping that problem in mind as they designed. Students' responses showed a

marked improvement in terms of justifying their designs by using the science that they were learning.

We found that students now had better insight into the causes of the problem, which made them

reason better about the solutions. They understood the role that the tides and currents played and

how a man-made canal was worsening the erosion on one end of the island. They also had a more

realistic understanding of how to tackle erosion. One student, for example, wrote, “Is the channel

still dredged today? I know that it was dredged and sand was taken out periodically and the sand from

the long shore current was stopped and dredged out, because it was put into the channel and ‘pulled’

out”.

Figure 5: Student responses in the second implementation of the Jekyll Island Problem

In particular, we found this time that students’ restatement of the problem (U) and their first set

of questions (Q) were significantly better than in Study 1, with 22% of the students scoring a 2 or 3.

Figure 5 data (percentage of students receiving each score, by diary category):

            U    Q    I1   LI   I2   S    C
Score - 0   10   32   17   21   9    5    1
Score - 1   68   52   56   36   41   45   33
Score - 2   16   14   16   27   25   26   51
Score - 3   6    4    11   16   25   24   16
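For readers interested in how such distributions are tabulated, the following is a minimal sketch of ours, with hypothetical scores and an assumed data layout; it is not the authors' analysis code.

    from collections import Counter

    # Hypothetical coded diary scores: category -> one 0-3 score per student.
    coded = {"U": [1, 1, 2, 0, 1, 3, 1, 2],
             "Q": [0, 1, 1, 2, 0, 1, 3, 1]}

    for category, scores in coded.items():
        counts = Counter(scores)
        # Percentage of students receiving each score, as plotted in Figure 5.
        row = {score: round(100 * counts[score] / len(scores)) for score in range(4)}
        print(category, row)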


Their restatements of the problem and analysis of the situation showed a better understanding of the

role of ‘currents’. This time, 20% of the students raised science-related questions (Q), such as, “I

need to learn more about the currents and how they affect erosion,” in contrast to more general

questions such as, “will it work?” Their solution ideas (I1, I2, and S) were also quite a bit more

elaborated with science. This time solution ideas reflecting consideration of science, such as,

“Erosion is caused by waves constantly hitting the beach and pulling sand out to sea, move channel

further out to the sea, which will make the sand move less” and “If we build a tidal wall to the side of

the island and build sand bar at the top of the island the water currents wouldn’t erode the north &

the wall would stop the south end from gaining island,” were much more common, with 20% scoring

2’s and 3’s on I1, 50% on I2, and 50% on S. The most outstanding change was seen in the criteria

(C) generated by students. Forty-four percent of students came up with criteria that related to the

science they were learning (2’s and 3’s). This ability probably helped students evaluate their solution

ideas and come up with more sophisticated solutions than in Study I. While in initial stages, students’

ideas were simple and often impractical (e.g., “use dumptrucks to move sand back to the north”), in

the later stages, they showed use of the targeted science content (e.g., “build walls to slow down the

longshore current” and “the jetties will block the sand but will let the water pass”). Fifty percent of

the students scored 2 or 3 points for their second set of ideas (I2), indicating a better and deeper

understanding of the science that they were learning, and 50% of their final solution ideas (S) reflected an

understanding of the causes of erosion. In addition, 67% of the students' responses in the most

important phase of design, that of generating criteria, showed that students were now thinking about

the function that their designs had to fulfill. The criteria that they generated were more relevant to

the science of erosion and its causes.

Students’ progress through the design process

Figure 5 shows that students progressed more knowledgeably through the design process than

they had in Study 1. To find out how individual students contributed to the class trend, we again used

the Wilcoxon signed rank test. As in Study I, we compared students' solution ideas as they progressed

through the design process (Figure 5, tables 8 and 9). We found that the difference between their

initial ideas (I1) and the second set of ideas (I2) was significant (z = -.308, p < .001). As shown in Table

8, the number of positive ranks (34) (those whose use of science increased) was four times the

number of negative ranks (8). However, there was no significant change in individuals’ use of science

from I2 to their solutions (S) (Table 9), consistent with the class trend as seen in Figure 5.


                  N       z       Sig
Negative Ranks    8 (a)   -.308   .000**
Positive Ranks    34 (b)
Ties              39 (c)
Total             81
(a) I2 < I1   (b) I2 > I1   (c) I2 = I1

Table 8: Comparison of I1 and I2

                  N       z       Sig
Negative Ranks    17 (a)  -.308   .758
Positive Ranks    17 (b)
Ties              47 (c)
Total             81
(a) S < I2   (b) S > I2   (c) S = I2

Table 9: Comparison of I2 and S
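As a methodological aside, a comparison like the one in Tables 8 and 9 can be reproduced with standard statistical software. The sketch below is our illustration with hypothetical 0-3 scores, not the authors' code; scipy's wilcoxon drops zero differences by default, which corresponds to the Ties rows in the tables.

    from scipy.stats import wilcoxon

    # Hypothetical paired scores: each position is one student's I1 and I2 code.
    i1 = [1, 0, 2, 1, 1, 3, 0, 1, 2, 1]
    i2 = [2, 1, 2, 2, 1, 3, 1, 2, 2, 2]

    # Pairs with zero difference (ties) are discarded by the default zero_method,
    # so only students whose scores changed contribute to the test.
    stat, p = wilcoxon(i1, i2)
    print(f"W = {stat}, p = {p:.3f}")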

Reasoning about devices

When we analyzed student responses for structure, function, and behavior, there were differences

this time, as well. Responses reflected more thought about function than in Study 1 – from the start.

Questions (Q) indicated that students were thinking about what factors were contributing

to the island’s erosion and then reasoning about how they could tackle these issues. For example, the

response, "What exactly are artificial fills and revetments? Will that stop erosion completely?"

refers to the student raising questions about the functions of the various methods of combating

erosion -- artificial fills and revetments. Another response, "How effective will they be against long

shore current?" looks into the role of sea walls and their effectiveness against long shore currents. In

another example, "Currents can cause a lot of damage, rocks work well on slowing waves down,

putting the seawall at an angle helps", we see that the student is reasoning about the function (slowing

down current) and relating that to the structure (sea wall at an angle). While the typical first idea (I1)

in Study 1 was “build a sea wall,” the typical first idea (I1) now was "build a sea wall to slow down the

currents." When they tested their solution ideas (S), students' responses once again showed that they

were indeed thinking about the functions of their designs, as illustrated in the response, "I thought

that the sand was going to build up against the jetties and rock them over. But after testing I saw that


the sand didn’t rock the jetties over…. The jetties were good and held the sand but let the water

pass".

Figure 6 summarizes the responses in the Structure, Function and Behavior categories. Clearly, students had a better understanding of the function and behavior of their designs in Study II. This was seen across all seven activities of the design process and also increased as students progressed in

their design activities.

Figure 6: Students’ understanding of Structure, Function and Behavior

Observations

Classroom observations also were more satisfying. We found that students spent more time on

the stream tables trying out alternative solution ideas before they chose an optimal solution. They

also questioned each other's solution ideas during the pin-up sessions and electronic discussions.

Common questions included, "Why do you think this solution will work?" and "Will it work during

high tides?", both questions that showed the questioners were thinking about the function of the

designs that were being presented and that encouraged presenters to think about function if they

hadn’t already. Presentations of final solutions were more refined than in Study 1, and students

justified their choices by referring to other ideas that they had tried but which had failed. Students

raised questions about the causes of erosion during whole-class discussions, and the teacher often

showed clippings from the video disks to the whole class to help them understand difficult issues.

Figure 6 data (percentage of responses coded in each category):

            U    Q    I1   LI   I2   S    C
Structure   57   55   55   51   47   40   33
Function    31   35   34   37   38   42   45
Behavior    12   10   11   12   15   18   22


HOW MUCH BETTER DID THE DISTRIBUTED SCAFFOLDING WORK?

COMPARING STUDENT LEARNING IN STUDY I AND STUDY II

Our analyses of student responses in the diaries across the two studies showed students better able

to articulate and use the science they were being exposed to in Study II than they could in Study I. To

find out how much better students were doing in Study II, we set out to compare student responses

across the two studies. To do this, we focused on design ideas -- the initial ideas (I1), second set of

ideas (I2) and possible solutions (S) – comparing students from Study I (the diaries-only group) to

students from study II (distributed scaffolding group), using the Mann-Whitney U test. This test is

the non-parametric version of the independent samples t-test. Table 10 shows the comparisons of

the frequency of scores in the two groups.

Students in Study II tended to score more highly across the categories. But upon focusing on the

three ideas categories, an interesting story emerged. Comparing students on I1, initial design ideas, we

found that the two groups were essentially equal; that is, they began the challenge at roughly the

same level of understanding. But when we looked at I2, a week later in the design process, we found

that students in the distributed scaffolding group significantly outperformed the students in the diaries

group. The percentage of students who scored 2 or 3 points was twice as large as that of students

Study I. This difference was significant (z = -3.86, p <.001). The same was true of the solutions (S)

that students came up with. The number of students who scored 2 or 3 was 21% in study I and 50% in

study II. This difference too was significant (z = -4.84; p <.001). The criteria that students came up

with also showed that the second group outperformed the first. In the first, 49% of the students

scored 2 or 3, but in the second, this rose to 67%.

            Ideas 1 (I1)        Ideas 2 (I2)        Solutions (S)
            Study 1   Study 2   Study 1   Study 2   Study 1   Study 2
Score - 0   26        17        23        9         28        5
Score - 1   54        56        52        41        51        45
Score - 2   13        16        12        25        12        26
Score - 3   7         11        13        25        9         24

Table 10: Frequency of scores for I1, I2 and solutions in study I and study II
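The between-study comparison can be sketched the same way (again our illustration with hypothetical scores, not the authors' code): because the two studies involved independent groups of students, the Mann-Whitney U test is applied to the two sets of 0-3 scores rather than to paired differences.

    from scipy.stats import mannwhitneyu

    # Hypothetical I2 scores for two independent groups of students.
    study1_i2 = [0, 1, 1, 2, 1, 0, 3, 1]   # diaries-only group (Study I)
    study2_i2 = [1, 2, 2, 3, 2, 1, 3, 2]   # distributed scaffolding group (Study II)

    u, p = mannwhitneyu(study1_i2, study2_i2, alternative="two-sided")
    print(f"U = {u}, p = {p:.3f}")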

DISCUSSION

Results obtained in Study II, the distributed-scaffolding implementation of the Jekyll Island

Challenge, indicated that on the whole students showed a deeper understanding of the usefulness and


applicability of the science they were learning. The improved version of the diaries was possibly one

of the reasons for their improved performance. Design diary prompts in Study II were specifically

designed to encourage students to reason about their design decisions using the science that they were

learning. Prompts also incorporated aspects of the scientific process such as making predictions,

figuring out whether the predictions were correct or incorrect and why, revisiting questions and criteria,

refining their specifications, and so on. Diary scaffolding encouraged students to revisit earlier

entries, and indeed, students did that – refining their criteria, refining their specifications, and adding

potential solutions. Bransford et al. (1999) have argued that ideas are best introduced when

students see a need or a reason for their use – this helps them see relevant uses of knowledge to make

sense of what they are learning. The diaries provided students with support to help them connect

their design activities to the science that they were learning. The prompts in the diaries helped them

to reason about their designs, put their ideas into practice, and use the science to propose and

evaluate solutions.

Another reason why students might have done better in Study II could have been that they had

several opportunities to explain and justify their design decisions and to question the reasoning of

other students during pin-up sessions that occurred before they made their I2 entries in their diaries.

Presenting their ideas, answering their peers’ questions, and considering the ideas of others gave

students a chance to think about and refine their ideas before writing them down. There was no

significant difference between students’ use of science to justify their second sets of ideas (I2) and to

justify their solutions (S), indicating that the scaffolding, whole-class discussions, and pin-up sessions

held early on were critical for learning science content and practices.

In Study I, students seemed to view design as a linear process – one went on to the next stage

once finished with a previous one and never went back. Metacognitive prompts in the design diaries in Study

II were designed to help them see across phases and appreciate the need to revisit earlier decisions.

And indeed, in Study II, students seemed to have a better understanding of the cyclical nature of

designing, evident in the fact that they revisited older diary pages as they advanced in the design

cycle.

But not all students improved. Some students did worse during the solutions phase than when

they presented their second set of ideas; perhaps it was only chance that they used science well in

justifying their second set of ideas and the scaffolding did not help those students sufficiently. Further

study may show that we need to refine earlier science scaffolding or add to the scaffolding for use of

science in later phases.

But whatever scaffolding we add to the diaries, there will always be some scaffolding that is

missing for some students. Scaffolding in a paper-and-pencil or electronic tool can only take into

account the kinds of learning needs that are common and obvious enough to be identified and might

not be adaptable to the specific needs of every member of the community. When designing paper-


and-pencil or electronic scaffolding for a community, this will always happen, as one cannot

anticipate every problem students will have. With a distributed scaffolding scheme, however, this does

not have to remain problematic. The teacher can fill in where the designed scaffolding is deficient.

Schofield’s analysis of the social structures in classrooms where cognitive tutors were introduced (Schofield & Britt, 1991), for example, showed that when much of the work students do can be done

independently, the teacher is freed to provide such individual help.

CONCLUSIONS AND RECOMMENDATIONS

Our results showed that learning science from design activities requires scaffolding at several

different levels, with scaffolding distributed across the available tools, activities, and agents and

integrated in a way that admits multiplicity. In a complex project-based environment, it can be

difficult to align all the affordances in such a way that every student can recognize and take

advantage of all the affordances. When scaffolding is distributed, integrated, and multiple, there are

more chances for students to notice and take advantage of the environment’s and activity’s

affordances. For example, recognizing the need to reflect is particularly difficult when students are

working hard on a hands-on activity (indeed, taking time to reflect is hard for anybody in the flow of

working on an exciting hands-on activity). Paper-and-pencil and electronic scaffolding can’t help

students recognize that need; rather, they seem to need to be interrupted from their activities to

think about what they are doing. When scaffolding is distributed across tools and agents in the

environment in a systematic way, such difficulties can be dealt with from a variety of perspectives.

Multiple opportunities are important too. Students who fail to take a design diary prompt into

account have another opportunity to be scaffolded during a pin-up session when a peer asks the same

question that is in a prompt but uses different words or during a whole-class discussion when another

student explains how he or she accomplished some task.

Hogan and Pressley (1997) raise questions regarding an optimal blend of scaffolding and other

instructional strategies such as, for example, direct explanation. The distributed scaffolding

environment in Study II provided opportunities for teachers to make public what students were

learning in small groups. It also provided opportunities for teachers to clarify issues, to help students

work towards better conceptions, and to provide explanations when necessary. As is clear from the

data, the distributed scaffolding group benefited from an environment that included ways of

identifying the many different kinds of help they needed.

Important aspects of scaffolding are adaptability and fadability (Jackson et al., 1998). We could

not achieve this in the design diaries; they were the same for everybody. But adding the social

scaffolds provided additional help beyond the set scaffolding and allowed the teacher to identify

additional help some of the students needed. An important future direction will be to design


scaffolding for design-based classrooms in such a way that we can give students more individualized

support. This will require a better understanding of the individual difficulties students have when

learning from design activities and a more dynamic way of making the scaffolding available. Design

diaries were on paper; making them electronic would allow that more easily.

Our aim was to understand the effect of several types of scaffolding on the process of learning

from design activities. We tried to design our scaffolding in such a way that students could successfully

utilize the numerous opportunities in the complex and rich learning-by-design environment. In future

studies, it will be important to study how well students have learned the science through the design

activities by looking into pre- and post-test results.

As the learning sciences community moves towards designing broadly-applicable approaches to

promoting learning (e.g., curriculum units, general-purpose learning tools), the lessons from this

study will become more and more important. There are limitations to the amount of scaffolding that

any one tool can provide. There is a limit to the scaffolding one can provide electronically – limits

to individualization, limits due to the passivity of the machine. In addition, it is also difficult for

teachers to provide help, both with regard to content and process, to groups of students working on

different aspects of their design. Classroom approaches that can be used by the broad range of

teachers will have to take those constraints into account, and it will be important to design those

approaches taking into account the full range of needs of learners, the several different tools for

scaffolding that might be available, and the range of agents available to provide help. We suspect that

distributing the scaffolding across available agents and venues, integrating it seamlessly, providing

complementary roles to agents and duplicating functions across agents will all be essential to such

design.

REFERENCES

Atman, C. J. & Bursic, K. M. (1996). Teaching engineering design: Can reading a textbook make a difference? Research in Engineering Design, Vol. 7, pp. 1-7.

Barrows, H. S. (1985). How to design a problem based curriculum for the preclinical years. Springer-Verlag: NY.

Baumgartner, E. & Reiser, B. (1998). Strategies for supporting student inquiry in design tasks. Presented at the Annual Meeting of the American Educational Research Association, San Diego, California.

Bell, P. & Davis, E. A. (1996, April). Designing an Activity in the Knowledge Integration Environment. Paper presented at the 1996 Annual Meeting of the American Educational Research Association, New York, NY.

Bransford, J. D., Brown, A. L., & Cocking, R. R. (1999). How People Learn. National Academy Press, Washington, DC.


Brown, A. L., & Palincsar, A. S. (1987). Reciprocal Teaching of comprehension strategies: A natural history of one program for enhancing learning. In J. D. Day & J. G. Borkowski (Eds.), Intelligence and Exceptionality: New directions for theory, assessment, and instructional practice. Norwood, NJ: Ablex.

Brown, A. L. (1988). Motivation to learn and understand: On taking charge of one’s own learning. Cognition andInstruction, 5(4), 311-322.

Brown, A. L. (1992). Design Experiments: Theoretical and methodological challenges in creating complex interventions in classroom settings. Journal of the Learning Sciences, 2(2), 141-178.

Brown, A. L. (1994). The advancement of learning. Educational Researcher, Vol. 23, No. 8, pp. 4-12.

Bruner, J. (1986). Actual minds, possible worlds. Harvard University Press.

Collins, A., Brown, J. S., & Newman, S. E. (1989). Cognitive Apprenticeship: Teaching the crafts of reading, writing and mathematics. In L. B. Resnick (Ed.), Knowing, Learning and Instruction: Essays in Honor of Robert Glaser. Hillsdale, NJ: Erlbaum.

Christiaans, H. H. C. M., & Dorst, K. H. (1992). Cognitive models in industrial design engineering: A protocol study. Design Theory and Methodology, 42, pp. 131-140.

Derry, S., Tookey, K. & Roth, B. (1994). The effect of collaborative interaction and computer tool use on the problem-solving processes of lower ability students. Presented at the Annual Meeting of the American Educational Research Association, Atlanta, GA.

Driver, R., Asoko, H., Leach, J., Mortimer, E., & Scott, P. (1994). Constructing scientific knowledge in the classroom. Educational Researcher, 23(7), pp. 5-12.

Gargarian, G. (1996). The art of design. In Y. B. Kafai & M. Resnick (Eds.), Constructionism in practice. Erlbaum, NJ.

Gertzman, A. & Kolodner, J. L. (1996). A Case Study of Problem-Based Learning in a Middle-School Science Class: Lessons Learned. Proceedings of the Second International Conference on the Learning Sciences, Evanston/Chicago, IL, pp. 91-98.

Goel, A., Garza, G., Grue, N., Murdock, W. & Recker, M. (1996). Exploratory interface in interactive design environments. In J. S. Gero & F. Sudweeks (Eds.), Artificial intelligence in design, 387-405.

Gray, J., Young, J. & Newstetter, W. (1997). Learning science by designing robots: Knowledge acquisition about arthropods and collaborative skills development by middle school students. Presented at the annual meeting of the American Educational Research Association, Chicago.

Guzdial, M. (1995). Software-realized scaffolding to facilitate programming for science learning. Interactive Learning Environments, 4(1), pp. 1-44.

Hübscher, R., Puntambekar, S. & Guzdial, M. (1997). A Scaffolded Learning Environment Supporting Learning and Design Activities. Paper presented at the annual meeting of the American Educational Research Association, Chicago, IL.

Guzdial, M. (submitted). Collaborative websites to support an authoring community on the Web. Manuscript submitted for publication.

Harel, I. (1991). Children designers. Ablex: New York.

Hmelo, C., Allen, J., Holton, D. & Kolodner, J. L. (1997). Designing for understanding: Children's lung models. In Proceedings of the Annual Meeting of the Cognitive Science Society, pp. 298-303.


Hmelo, C. E., Holton, D. L., & Kolodner, J. L. (2000). Designing to Learn about Complex Systems. Journal of the Learning Sciences, Vol. 9, No. 3, pp. 247-298.

Hogan, K., & Pressley, M. (1997). Becoming a scaffolder of students' learning. In K. Hogan & M. Pressley (Eds.), Scaffolding student learning. Brookline Books, Cambridge, MA.

Jackson, S. L., Krajcik, J. & Soloway, E. (1998). The design of guided learner-adaptable scaffolding in interactive learning environments. In Proceedings of the Conference on Human Factors in Computing Systems (CHI 98), pp. 187-194.

Kafai, Y. B. (1994). Minds in play: Computer game design as a context for children's learning. Erlbaum: HillsdaleNJ.

Kolodner, J.L. (Ed.), (1993). Case-Based Learning. Kluwer Academic Publishers, Dordrecht, Netherlands.

Kolodner, J.L. (1997). Educational Implications of Analogy: A View from Case-Based Reasoning. AmericanPsychologist, Vol. 52, No. 1, pp. 57-66.

Kolodner, J. L., Hmelo, C. E., & Narayanan, N. H. (1996). Problem-based Learning Meets Case-based Reasoning. In D. C. Edelson & E. A. Domeshek (Eds.), Proceedings of ICLS '96, Charlottesville, VA: AACE, pp. 188-195.

Kolodner, J. L., Crismond, D., Gray, J., Holbrook, J., & Puntambekar, S. (1998). Learning by Design from Theory to Practice. Proceedings of the International Conference of the Learning Sciences '98, pp. 16-22.

Kolodner, J. L., Crismond, D., Fasse, B., Gray, J., Holbrook, J., & Puntambekar, S. (submitted). Putting a Student-Centered Learning-by-Design Curriculum into Practice: Lessons Learned. Submitted to Journal of the Learning Sciences, March 8, 2000; under revision.

Koschmann, T. D., Myers, A. C., Feltovich, P. J. & Barrows, H. S. (1994). Using technology to assist in realizing effective learning and instruction: A principled approach to the use of computers in collaborative learning. Journal of the Learning Sciences, 3(3), pp. 227-264.

Lehrer, R. & Romberg, T. (1996). Exploring children's data modeling. Cognition and Instruction, 14(1), pp. 69-108.

Luckin, R. (1998). Knowledge construction in the zone of collaboration: Scaffolding the learner to productive interactivity. In A. Bruckman, M. Guzdial, J. Kolodner & A. Ram (Eds.), Proceedings of the International Conference of the Learning Sciences, pp. 188-194.

Piaget, J. (1954). The construction of reality in the child. New York: Basic Books.

Perkins, D. N. (1986). Knowledge as design. Hillsdale, NJ: Erlbaum.

Puntambekar, S. (1997). Supporting the design process by using design diaries in the 'Learning by Design' environment. Paper presented at the AERA annual meeting, March 24-28, Chicago.

Puntambekar, S., Nagel, K., Hübscher, R., Guzdial, M., & Kolodner, J. L. (1997). Intragroup and Intergroup: An exploration of learning with complementary collaboration tools. In Proceedings of the Computer Supported Collaborative Learning Conference, Toronto, December 10-14.

Puntambekar, S. & Kolodner, J. L. (1998). Distributed scaffolding: Helping students learn in a learning by design environment. In A. S. Bruckman, M. Guzdial, J. L. Kolodner, & A. Ram (Eds.), ICLS 1998, Proceedings of the International Conference of the Learning Sciences, pp. 35-41.

Tabak, I., & Reiser, B. (1997). Complementary Roles of Software-based Scaffolding and Teacher-Student Interactions in Inquiry Learning. In R. Hall, N. Miyake & N. Enyedy (Eds.), Proceedings of the CSCL Conference, pp. 289-298.


Rogoff, B. (1990). Apprenticeship in thinking: Cognitive development in social context. Oxford, UK: Oxford University Press.

Roehler, L. R. & Cantlon, D. J. (1997). Scaffolding: A powerful tool in social constructivist classrooms. In K. Hogan & M. Pressley (Eds.), Scaffolding student learning. Brookline Books, Cambridge, MA.

Schank, R. (1982). Dynamic memory: A theory of learning in computers and people. Cambridge: Cambridge University Press.

Schank, R. C., Berman, T. R., & Macpherson, K. A. (1999). Learning by doing. In C. M. Reigeluth (Ed.), Instructional-Design Theories and Models: A New Paradigm of Instructional Theory, pp. 161-181. Hillsdale, NJ: Erlbaum.

Schofield, J. W., Britt, C. L., & Eurich-Fulcer, R. (1991). Computer-Tutors and Teachers: The Impact of Computer-Tutors on Classroom Social Processes. Paper presented at the annual meeting of the American Educational Research Association, Chicago.

Vygotsky, L. S. (1978). Mind in Society: The development of higher psychological processes. Cambridge: HarvardUniversity Press.

Wertsch, J. V. (1985). Vygotsky and the social formation of mind. Cambridge, MA: Harvard University Press.

Wertsch, J., McNamee, G., McLane, J., & Budwig, N. (1980). The adult-child dyad as a problem solving system. Child Development, 51, pp. 1215-1221.

Wood, D., Bruner, J. S. & Ross, G. (1976). The role of tutoring in problem solving. Journal of Child Psychology and Psychiatry, Vol. 17, pp. 89-100.

ACKNOWLEDGMENTS

This research has been supported in part by the National Science Foundation, the McDonnell Foundation, and the EduTech Institute (with funding from the Woodruff Foundation).


APPENDIX A


Figuring it out!


Understanding the problem

This is what I understand of the problem (Please restate the problem in your own words)

What questions do I need to ask in order to understand the problem better?

What are the different parts of the problem?


Ideas

These are some of my first ideas about how I might solve the problem.

These ideas are based on what you already know... Think about what else you need to know.

Suggestions
Reread the problem. Think about what I need to learn.


Learning Issues


Fleshing it out!


What did I learn?

Suggestions

I need to know more: Reread the problem. Reread the learning issues. Carry out more research.

I can move forward: Proceed to specify requirements and work towards a solution.


Requirements

Suggestions

I need to know more: Reread the problem. Reread the learning issues. Carry out more research. Rethink the requirements.

I can move forward: Proceed to work towards a solution.


Choosing a solution!


Ideas

Suggestions

I need to know more: Reread the problem. Reread the learning issues. Carry out more research. Rethink the requirements.

I can move forward: Proceed to work towards a solution.


Possible Solutions

Suggestions

I need more information, think some more: Reread the problem and your restatement. Carry out more research. Rethink the requirements. Rethink the solutions.

I can move forward: Proceed to choose a solution.


Choosing a solution

Keep these questions in front of you as you choose your solution.

What are the problem requirements?
What are the criteria?
Which criteria does each of your solutions meet?
Which do they not meet?
What are the positive features of each of the solutions?
What are the limitations of each of the solutions?


What are the criteria against which you will evaluate possible solutions?
Why do you think these criteria are important?


The solution satisfies the following criteria because:

Criteria | Reasons

The solution does not satisfy the following criteria because:

Criteria | Reasons


Problem Solved!


Write a report on the experiences you had in solving this problem. The report should have two sections as detailed below. In each section, please address the questions given.

What did we accomplish?
What problems did we have?
How did we solve the problems?
With whose help did we solve the problems?
Lessons we learned.



APPENDIX B


IDEAS

Write your ideas on this page. Ideas can be about many things relating to your designs. See suggestions.

Suggestions

Your ideas can be about any aspect of your design, such as
• ideas about solutions
• materials
• subproblems
• testing your designs
• criteria
• uniqueness of your design

Make sections on this page for the different types of ideas if you wish.

This page can be used at any time to jot down ideas.

Come back to this page any time you have new ideas, or to refine or use the ideas you have written here.


SPECIFICATION

Suggestions

A specification should only state what is required to solve the problem - not how to solve it.

Things that you might think about:
What will be the function of the device?
Who will use it?
Where will it be used?
Are there any other constraints that you need to consider (e.g. safety, cost, ease of use)?

You might want to refer to your research notes while writing the specification.

Come back to this page as you learn more about what is needed. Remember to date your entries.

Example

The device must either carry or assist the person up and down the stairs.
It must be very easy to use and not have any complicated controls.
It must be completely safe for the user, and any persons or animals standing nearby.
It must not obstruct the normal use of stairs.
It must support the weight of the person.


EXPLORATION

(PROBLEM UNDERSTANDING)


PROBLEM UNDERSTANDING

RESTATING THE PROBLEM

Write a brief statement of the problem emphasizing the purpose of the design.

Suggestions

Questions you might answer in your statement:
What is the purpose / function of the design?
Where will it be used?
Who will use it?
What exactly are you supposed to design?

A statement of the problem should not be too vague or too detailed. See example.

Example

Good analysis

Climbing up and down the stairs can be difficult and often frightening for some elderly people. The problem is made worse when they try to carry things as well. We think there is a need for some kind of lift device or other aid to help them to go up and down the stairs safely.

Vague analysis

We need to design a lift device for the elderly people.


PROBLEM UNDERSTANDING

What do I already know that will help me design?

Suggestions

List the things that you know that you will take into account as you design.
Your list may be longer than the one in the example.
You will want to include things you learned as you tinkered with materials.

Example

Things I know...
The device has to be light and safe.
I can probably use aluminum to make it light.

What do I still need to learn to be able to come up with a good design?

List the things that you still need to learn about to be able to design.

Example

Things I need to know...
I have to learn more about aluminum and other light materials.
I have to learn about safety of such devices.

You will want to update these lists as you understand the problem better.


INVESTIGATION

(RESEARCH, SPECIFICATION)


LEARNING ISSUES

Moving forward - what do we need to learn more about?

Suggestions

List your issues in the form of questions.
• Your questions might be about the parts your models will have, the functions of those parts, and how the parts work together.
• How does the information relate to your model?
• What more do you need to know?
• Do you need to read your observations again?

For each question, you might like to list how you will find out more:
• Look up in library
• Ask teacher or expert to explain
• Use books / resources provided by teacher
• Conduct an experiment
• Vary the model / tinker. How?

Example

What are the types of devices for lifting up the stairs?
How do such devices operate?
What holds the chair / device to the stairs?
How does the sliding mechanism operate?
How much weight can it take and still slide smoothly and be safe?
What materials can I use to make it light and safe?
Does the device need seat belts?


AFTER RESEARCH - WHOLE CLASS DISCUSSION

NOTES

Suggestions

You might like to:
Prepare yourself to answer questions about what you learned and how you will use it in your models.
Write down questions that you want to find answers to.
Decide which information you need right away and what you can put off until the testing.
Make a list of questions that you will be able to find answers to after trials with your designs.


SOLUTION GENERATION

(ALTERNATIVES, CRITERIA, CHOOSING A SOLUTION)


POSSIBLE SOLUTIONS

List at least one possible idea for a solution.

Suggestions

Remember
• not to get too engrossed in the first idea that comes to your mind
• to think about at least three different ways of solving the problem
• to read the specification again to understand better what exactly is required

Example


GENERATE CRITERIA

How can you judge which is the best of your ideas? What are the criteria against which you will evaluate possible solutions? Why do you think these criteria are important?

Suggestions

Discuss possible problems and advantages, such as
• can it achieve its purpose?
• what characteristics does it need to have to achieve its purpose?
• how long it takes to make
• how durable it is
• how does it work in the environment - is it safe, is it easy to use?

Use this list to set up evaluation criteria.
• Use the evaluation criteria to judge your design on the next page.
• Read your specification again to help generate the criteria.

Example

Safety of the passenger
Ease of use
Durability
Easy to operate
Sturdy but not heavy
Should support the weight of the person

EVALUATION OF MY SOLUTION IDEA


Solution _____________________________

Advantages - Satisfies the following criteria²

CRITERIA | REASONS

Solution _____________________________

Disadvantages - Does not satisfy the following criteria

SHARING OUR SOLUTION IDEAS

² NOTE: If you find an advantage or disadvantage not listed on your criteria page, you might want to update your criteria list.


Suggestions

For each solution write:

What is good about it
• you should include the criteria that the solution satisfies (from the criteria page)
• you may want to list other criteria as well, and may also want to update the criteria page

What is not so good about it
• you might use the criteria that the solution does not satisfy (from the criteria page)
• you may want to list other criteria as well, and may also want to update the criteria page

What is interesting about it
• you might include what is unique or interesting or special about this solution

SOLUTION _____________________________________

SOLUTION _____________________________________

SOLUTION _____________________________________

SOLUTION _____________________________________

WHICH IDEA IS THE BEST?


From each of your criteria sheets, select the most important criteria that your group wants to use to compare and contrast each of your group members' design ideas. Then select the design that best fits the evaluation criteria. You can combine features from different designs to create an even better design.

CRITERIA | SOLUTION 1 | SOLUTION 2 | SOLUTION 3 | SOLUTION 4 | SOLUTION 5 | SCALE WE USED


BUILD AND TEST


PREDICTION BEFORE EACH TEST

PREDICTIONS

Mention what you think will happen before each trial. Come back to this page before every trial.

Suggestions

You might like to mention:
What do you think will happen when you test your design?
Will it fulfill the function for which it is designed? Why?

Example

First trial

Second trial

Third trial


OBSERVATIONS DURING TESTING

Note your observations here, for each trial.

Suggestions

You might like to mention:
What you did during the trial
How this is different from an earlier trial
What you observed / what happened during the trial
Your explanation of what you observed
Specific questions that you might have relating to your model
What happened during the trial? Did your prediction come true?
What features of your model do you think contributed to the results you obtained? Explain why.
What features of your design will you modify as a result of this trial?


OBSERVATIONS DURING TESTING... CONTINUED

Note your observations here, for each trial. (Suggestions as on the previous page.)


EVALUATION


EXPLAIN YOUR MODELS

Write a short description of the model you have built. Explain the features and advantages of your designs. You can draw diagrams to illustrate your design.

Suggestions

You might like to mention:
The parts of your model
The functions of the parts
How the parts work together
How the model works and why
What is nifty about your model and why


RESULTS AND EXPLANATION

RESULTS OF FINAL TESTING

Explain your final results.

Suggestions

You might like to mention:
whether your design fulfills the specs that you started with
if yes, explain why
if no, can you think of the reasons?

Be prepared to justify your results with respect to your designs.

Use this page as a place to jot down your explanations and justifications as preparation for the pin-up sessions.

Example


REDESIGN

How will we redesign our models? What do we need to change?

Suggestions

You might like to think about:
Did your design accomplish the function that it was supposed to perform?
Do you need to redesign your models?
Why do you need to redesign your models?
What are the things that you will take into account while building your models again?
Do you need to know more in order to improve your models?
What do you need to know?


PIN-UP SESSIONS


PIN-UP SESSION

PIN-UP - NOTES

Suggestions

You might include in your notes:
Issues that you don't know much about
Questions that you were asked
Questions that you asked others
What's confusing, what's new to you
Anything interesting that you want to remember
