Woolf, B.P., Murray, T., Marshall, D., Dragon, T., Kohler, K., Mattingly, M., Bruno, M., Murray, D., Sammons, J., Critical Thinking Environments for Science Education. Proceedings of the 12th International Conference on AI and Education, July 2005, Amsterdam.

Critical Thinking Environments for Science Education

Beverly Park WOOLF,1 Tom MURRAY,1 David MARSHALL,1 Toby DRAGON,1

Kevin KOHLER,1 Matt MATTINGLY,1 Merle BRUNO,2 Dan MURRAY,3

Jim SAMMONS3

Department of Computer Science,1 University of Massachusetts, Amherst, MA.
School of Natural Science,2 Hampshire College, Amherst, MA.

Department of Geology,3 University of Rhode Island, R.I.

Abstract. We have developed a range of critical thinking environments for science education that span several academic content areas, including human biology, geology and forestry. All environments share a methodology, infrastructure and sets of assumptions and tools, which allows each to leverage the accomplishments and insights of the others. These tutors support a student on the Web to be active and engaged, track that student's critical thinking and reason both about her knowledge and about their own teaching strategies. An Inquiry Notebook provides a way to sort, filter and categorize data and justifications, and an Argument Editor supports argument formation. Students drag and drop evidence to support or refute each argument. A Coach provides helpful feedback guided by a database of expert rules, which forms the basis for content-specific analysis of the student's argument.

1. Introduction

We are engaged in several projects to support critical thinking in science education; these projects have both shared and individual goals. The overarching shared goal is to involve students in scientific reasoning, critical thinking and hypothesis generation and thereby generate more responsive and active learning. Individual goals focus on teaching specific academic content knowledge in human biology, geology and forestry. Additionally, each tutor employs consistent elements across disciplines, utilizes common tools and supports intersecting development. This paper describes two inquiry tutors built with this infrastructure and discusses the research approach behind the work.

The inquiry environment, called Rashi,1 immerses students in problem-based cases and asks them to observe phenomena, reason about them, posit theories and recognize when data does or does not support their hypotheses [1, 2, 3, 4, 5]. Each teaching environment tracks student investigations (e.g., questions, hypotheses, data collection and inferences) and helps the student articulate how evidence and theories are related.

Generic tools, common to all the environments, guide students through ill-structured problem spaces, helping them to formulate questions, frame hypotheses, gather evidence and construct arguments. Tools such as the Inquiry Notebook and the Hypothesis Editor are used across domains. Domain-specific tools, including the Exam Room and Interview Tools (for human biology) or the Field Guide (for forestry), fully engage students in knowledge integration within a specific domain.

1 Rashi homepage is http://ccbit.cs.umass.edu/Rashihome/

Existing inquiry software presents cases and provides rich simulation-based learning environments and tools for gathering, organizing, visualizing and analyzing information during inquiry [6, 7, 8, 9, 10]. These systems support authentic inquiry in the classroom and knowledge sharing, and several track and analyze student data selections and hypotheses. The contribution of this research is to carefully track the reasoning behind student arguments and to critique the student's use of supporting and refuting evidence. The tutor helps students identify weaknesses in their arguments and guides them in how to strengthen those arguments during critical thinking. The next two sections describe the Human Biology Inquiry Tutor and then the Geology Tutor.

2. Human Biology Inquiry Tutor

The first domain described is human biology, in which Rashi supports students in evaluating a patient and generating hypotheses about a possible diagnosis.2 The patient's complaints form an initial set of data from which students begin the diagnostic process, by "interviewing" the patient about symptoms and examining her, Figure 1. Some data is made visible by student action, e.g., asking for chest x-rays, prescribing a certain drug or using a measurement tool. Some data is interpreted for the student (e.g., "x-ray normal"); other data provides raw material that the student interprets to draw her own conclusions. Six biology cases have been developed, including hyperthyroidism, lactose intolerance, food poisoning, diarrhea, mold allergy and iron deficiency anemia. Hundreds of introductory biology students have used this tutor.

Rashi does not enforce a particular order of student activity, allowing students to move opportunistically from one phase to another. Students read a case description and use tools such as the Examination Lab and Laboratory Examination, Figure 1, to identify the patient's signs and symptoms. They might interview the patient about her complaints and organize physiological signs, medical history or patient examinations in the Inquiry Notebook. They sort, filter and categorize data according to predefined categories and ones that they invent. The site of the observation, e.g., "Interview Room" or "Examination Lab," is recorded automatically in the Inquiry Notebook. Notebook 'pages' allow students to create separate spaces for data, as scientists do on separate pages of lab notebooks. A "scratch pad" allows a student to record open questions and hypotheses and to identify data that may reveal flaws in a hypothesis. Students search the web for diagnostic material, definitions and interpretations of laboratory results.

Students posit several hypotheses (and other inferences) in the Argument Editor, bottom right, Figure 1. They drag and drop data from the Inquiry Notebook into the Argument Editor to link evidence that supports or refutes each argument. Arguments can be several levels deep. Structured prompts, reminders and help are student-initiated at various stages of inquiry. The student can ask "What do I need to work on?" or "Why is this the wrong hypothesis?" Coaching is based on rules that look for certain conditions in the student's work and provide hints if the rules are not met, see Section 4. Currently, the tutor does not interrupt the user with reminders, because that is seen as obtrusive and might slow down the student.
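The nested argument structure described above can be sketched as a small tree: a hypothesis node holds links to evidence (or sub-inferences), each tagged as supporting or refuting. This is a hypothetical sketch; Rashi's actual data model is not published, and all class and method names here are invented.

```java
// Hypothetical sketch of an Argument Editor node: a statement with
// supporting/refuting links, nestable several levels deep.
import java.util.ArrayList;
import java.util.List;

public class ArgumentNode {
    public enum Relation { SUPPORTS, REFUTES }

    public static class Link {
        final ArgumentNode child;   // evidence or sub-inference dragged in
        final Relation relation;
        Link(ArgumentNode child, Relation relation) {
            this.child = child;
            this.relation = relation;
        }
    }

    final String statement;        // e.g. "diagnosis: mononucleosis"
    final List<Link> links = new ArrayList<>();

    public ArgumentNode(String statement) { this.statement = statement; }

    public void attach(ArgumentNode child, Relation r) {
        links.add(new Link(child, r));
    }

    // Depth of the argument tree (arguments "can be several levels deep").
    public int depth() {
        int max = 0;
        for (Link l : links) max = Math.max(max, l.child.depth());
        return 1 + max;
    }

    public long count(Relation r) {
        return links.stream().filter(l -> l.relation == r).count();
    }
}
```

A coach rule could then, for example, flag any hypothesis whose `count(SUPPORTS)` is zero.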

At some point each student makes a final electronic report supporting one hypothesis as the "best." This submission, sent electronically to the teacher, includes data, inferences, hypotheses, justifications, competing hypotheses and arguments from the Inquiry Notebook and Argument Editor. We are working on a community-centered version of the tutor, in which students work in remote groups to brainstorm a list of predictions to resolve a case, and each student separately types in possible causes for the observed phenomena.

2 Human Biology Inquiry Tutor: http://ccbit.cs.umass.edu/Rashihome/projects/biology/index.html

Figure 1. Human Biology Inquiry Tutor. Diagnosis of the patient begins with an examination and lab tests (left). Examination and interview facts are organized (sometimes automatically) into the Inquiry Notebook (top right) and hypotheses are entered into the Argument Editor (bottom right). In this example, the student has postulated three hypotheses (mono, diabetes and pregnancy) and supported or refuted each with evidence and observations.

Client-server software supports storing individual student data. A simple database houses text entered by the student as well as the Inquiry Notebook and Argument Editor objects. Intelligence is distributed between the server and the client. Java is used to implement visual activities and graphical user interfaces and to reason about the student. This student database is used both to display student work and to reason about it. The reasoning element in Rashi receives data from the student database and compares it with the expert argument entered by faculty through authoring tools, see Section 5. Rashi searches over both databases to analyze the argument and match student text entries to database objects from the stored expert argument. The server communicates these results back to the client. The database and the algorithms for performing the analysis reside in the application, and the server is contacted mainly to store student data. The client side does not have a database in any formal sense, though it is primarily the side that analyzes the student's argument, see Section 4; some portions of the Coach reside on the server side.
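Matching a student's free-text entry to an object in the stored expert argument, as described above, could be done in many ways; Rashi's actual matching algorithm is not published. The following is a minimal sketch under the assumption that simple keyword overlap is sufficient; the class and method names are invented.

```java
// Hypothetical sketch: match a student's free-text entry to the
// closest statement in the authored expert argument by token overlap.
import java.util.HashSet;
import java.util.List;
import java.util.Optional;
import java.util.Set;

public class ExpertMatcher {
    private final List<String> expertStatements;

    public ExpertMatcher(List<String> expertStatements) {
        this.expertStatements = expertStatements;
    }

    private static Set<String> tokens(String s) {
        Set<String> out = new HashSet<>();
        for (String t : s.toLowerCase().split("[^a-z0-9]+"))
            if (!t.isEmpty()) out.add(t);
        return out;
    }

    // Return the expert statement sharing the most tokens with the
    // student's entry, or empty if nothing overlaps at all.
    public Optional<String> match(String studentEntry) {
        Set<String> student = tokens(studentEntry);
        String best = null;
        int bestOverlap = 0;
        for (String expert : expertStatements) {
            Set<String> overlap = new HashSet<>(tokens(expert));
            overlap.retainAll(student);
            if (overlap.size() > bestOverlap) {
                bestOverlap = overlap.size();
                best = expert;
            }
        }
        return Optional.ofNullable(best);
    }
}
```

A real system would likely add stemming, synonyms and domain vocabulary, but the overall shape — search both databases, align student text with expert objects — is the same.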

3. Geology Inquiry Tutor


This same Rashi inquiry infrastructure supports students using the Geology Tutor to explore a geologic phenomenon and reason about past or future events.3 In the Fault Recognition Module, Figure 2, students predict where the next earthquake might occur. The module opens with images of a series of large and recurring earthquakes in the San Andreas area of California, U.S.A., bottom Figure 2. The student is asked to relocate a road destroyed by an earthquake, left, Figure 2. Three possible routes are suggested (A, B, or C), each of which passes through combinations of four suspicious areas. As project geologists, students evaluate the area and prepare an engineering report with a best-route recommendation. Students from introductory geology courses have used this tutor.

Figure 2. Geology Earthquake Fault Detection Module. The Case Statement indicates possible routes for a replacement road, left. Students navigate in any direction through footage of earthquakes, images of fault lines or features such as slicken rock, top right. Observations are noted in the Inquiry Notebook, bottom left, arguments are made in the Argument Editor, bottom right, and a final report is submitted, bottom right.

After a student observes an image or activity, she might enter a feature (e.g., lineament or slickenside) into the Inquiry Notebook. Elsewhere she might enter inferences (interpretations) of this observation along with supporting reasoning. For example, she might infer that a lineament was an active fault and then support that inference with multiple citations. Hotspots on images provide information, such as: a line of springs parallel to the lineament; a fence offset by 1.3 meters; or a data set showing that the area was seismographically active. The student is expected to use classroom materials, e.g., to find the relationship between faults and hydrology, and to write such observations in her notebook. Finally, the student makes a recommendation (conclusion) of the best place to locate the road, supported by observations and inferences. At any point she may ask the Coach for help in deciding what to do next, or to analyze work done so far.

4. Coaching

The Coach analyzes a student's argument, compares it to that of the expert and provides useful feedback. Expert knowledge, encoded by a faculty member using the authoring tool, provides a database of expert rules that encapsulates a cohesive argument for each hypothesis. Once an author has made a well-formed expert argument, the Coach works effectively to create content-specific analysis of a student's argument.

3 The Geology Tutor is at http://ccbit.cs.umass.edu/Rashihome/projects/geology/index.html

Remain Domain Independent. Both Rashi and the Coach are domain independent and extensible. Expert rules are not specific to a case or domain and do not contain hard-coded domain knowledge. Rules are of two types: either they 1) support well-formed arguments, e.g., identify a logical flaw in the student's argument, circular logic, a weakness (lack of supporting data, missing intermediate inferences, etc.) or a missing piece of data (a location was not visited or a lab test not done); or they 2) shore up weak arguments, e.g., "You argue that high TSH supports a diagnosis of mononucleosis. I suggest you reconsider this argument." This second rule type may seem domain dependent, yet it only uses the expert argument from the authoring tool to identify a hypothesis and its evidence. Using the same rule, the Coach can ask the geology student "What supports your hypothesis that route C is preferred?"
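The domain independence of the second rule type can be illustrated with a sketch: the rule itself knows nothing about biology or geology, and only consults the authored table of evidence-hypothesis relations. This is an illustrative reconstruction, not Rashi's published rule format; all names are invented.

```java
// Hypothetical sketch of a domain-independent coach rule: flag a
// student's evidence link when it disagrees with (or is absent from)
// the expert argument authored for the case.
import java.util.HashMap;
import java.util.Map;

public class CoachRule {
    public enum Relation { SUPPORTS, REFUTES }

    // Expert table: (evidence, hypothesis) -> relation, loaded from the
    // authoring tool; no domain knowledge is hard-coded in the rule.
    private final Map<String, Relation> expert = new HashMap<>();

    private static String key(String evidence, String hypothesis) {
        return evidence + "->" + hypothesis;
    }

    public void addExpertRelation(String evidence, String hypothesis, Relation r) {
        expert.put(key(evidence, hypothesis), r);
    }

    // Returns a hint if the student's link disagrees with the expert,
    // or null when the link matches the authored argument.
    public String check(String evidence, String hypothesis, Relation studentRelation) {
        Relation expected = expert.get(key(evidence, hypothesis));
        if (expected == null || expected != studentRelation) {
            return "You argue that " + evidence + " "
                 + studentRelation.name().toLowerCase() + " " + hypothesis
                 + ". I suggest you reconsider this argument.";
        }
        return null;
    }
}
```

The same `check` works unchanged whether the authored table concerns diagnoses or road routes, which is the point of the design.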

Teach good argument structure. By referring to the expert system, the Coach teaches students how to make successful inquiry arguments; it guides them to support each argument with sufficient data, making sure an argument is well structured and that no important intermediate steps are skipped. The Coach tells a student when to provide proper evidence for an existing hypothesis or when to make a hypothesis/inference for which there is already sufficient data. It shows the student how to support hypotheses with data collected, or to collect more data to possibly eliminate hypotheses.

Be Non-Intrusive. Rashi does not enforce a default order for making an argument. If a student is motivated and has a way of doing things, the Coach does not intervene. However, if the student is not posing hypotheses or evidential support, the Coach will make suggestions and assist her to flesh out details. It will check the Inquiry Notebook to see whether the student has support for an argument and will look in the Argument Editor to see that this support is properly connected. The Coach will ask a student to connect an argument with its support or to find support if it is missing (e.g., indicate a screen where support can be found).
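The cross-check described above — data collected in the Inquiry Notebook but never connected in the Argument Editor — can be sketched as a simple set difference over the expert's relevance table. The names and the exact check are assumptions for illustration, not Rashi's published implementation.

```java
// Hypothetical sketch of the Notebook/Argument Editor cross-check:
// report notebook data the expert marks as relevant to a hypothesis
// that the student has not yet connected in the Argument Editor.
import java.util.ArrayList;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

public class SupportCheck {
    // Which notebook items the expert says bear on which hypotheses.
    private final Map<String, Set<String>> relevant = new HashMap<>();

    public void expertRelevant(String hypothesis, String datum) {
        relevant.computeIfAbsent(hypothesis, k -> new HashSet<>()).add(datum);
    }

    // notebook: data the student has collected; connected: data already
    // linked to this hypothesis in the Argument Editor.
    public List<String> unconnected(String hypothesis,
                                    Set<String> notebook, Set<String> connected) {
        List<String> out = new ArrayList<>();
        for (String d : relevant.getOrDefault(hypothesis, Set.of()))
            if (notebook.contains(d) && !connected.contains(d))
                out.add(d);
        return out;
    }
}
```

A non-empty result would prompt the Coach's suggestion to connect the argument with its support, without interrupting a student who is already making progress.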

5. Evaluation and Scaling Up

These inquiry cases have all been evaluated at Hampshire College and the Universities of Massachusetts and Rhode Island, with undergraduates as well as middle school science teachers. The biology tutor was evaluated several times in a large (500-student) university lecture-based classroom. However, as there was only time to use a few short cases, we consider this evaluation a pilot study to test the evaluation instruments. Nevertheless, the results were very encouraging: students quickly learned the software and were able to pose open-ended and authentic questions, plan queries and engage in on-line research.

A new evaluation instrument was developed to be sensitive to the small pre-post skill gains that result from short learning interventions and to be more easily scored, see [11]. This instrument measures two types of student learning: 1) content questions ask students in human biology to identify several diagnoses for a set of symptoms and to suggest blood and urine tests; and 2) inquiry questions ask students to critique a set of statements regarding inquiry reasoning and a hypothetical report on a Rashi-like case. The instrument is item-, recognition- and difference-based. We have only preliminary data on Rashi use in these cases; it appears that students at the small colleges evidenced gains in content knowledge and no gain in inquiry, while students in the large classes showed an increase in inquiry skills and a drop in content performance.


Please rate how well you were able to:        % Well/Very Well
Create hypotheses                             53%
Become comfortable with the system            53%
Learn the content material                    47%
Find needed information                       47%
Understand the rules for using the program    47%
Use the notebook to organize information      47%
Perform tests                                 40%
Find the process enjoyable                    40%

Table 1. Student reaction to the Human Biology Tutor, Fall '03.

We have also noted significant correlations between a student's inquiry skill level and some of the Rashi use metrics [11]. In particular, there were significant positive correlations between inquiry skill level and the number of hypotheses posed by a student, the number of arguments, the number of items in the notebook, the number of explanations entered by students, the use of notebook organizing tools and the overall use of Rashi tools. As this is what one would expect, it adds some credence to the ecological validity of the pre-post instrument. As in past formative evaluations of Rashi, the survey did not indicate any significant problems with the software. We interpret these results as supporting the usability of the software and its perceived usefulness.

Interviews, surveys, essay questions, group discussions, and pre-post essay activities have shown that participants were enthusiastic and impressed with the potential of Rashi as an educational tool. Interactivity was seen as a very positive attribute, with the Patient Examination feature in biology cited as one of the better components. Students' perception of learning the inquiry process was favorable, Table 1. Half the students felt the experience had taught them how to better approach a comparable inquiry problem.

Since this project is multi-disciplinary and multi-institutional, we need to scale up the usage and coverage of the software. Thus, issues of authoring tool usability and power are critical and perennial. Our experience is that several stakeholders, e.g., faculty and undergraduates, have been able to use our authoring tools to develop new cases in a few weeks after training, see [3]. Experts specify content (images, text, numeric values, etc.), evidential relationships (supports, refutes, etc.) between hypotheses and data, and indicate which hypothesis or hypotheses are reasonable conclusions for each case. In one instance, an undergraduate was able to build a biology case in a few weeks as an independent project. She suggested the case and developed medical diagnosis rules and patient signs/symptoms. The case was used with her fellow students the next semester.
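What an author specifies — content items, supports/refutes relations, and which hypotheses are reasonable conclusions — can be captured in a small case specification. This is a sketch of that shape, with an invented well-formedness check; the actual authoring tool format is described in [3], not here, and all names are assumptions.

```java
// Hypothetical sketch of an authored case: hypotheses, evidential
// relations between data and hypotheses, and the hypotheses marked as
// reasonable conclusions for the case.
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class CaseSpec {
    public static final class Evidence {
        final String datum;
        final String hypothesis;
        final boolean supports;   // true = supports, false = refutes
        Evidence(String datum, String hypothesis, boolean supports) {
            this.datum = datum;
            this.hypothesis = hypothesis;
            this.supports = supports;
        }
    }

    final List<String> hypotheses = new ArrayList<>();
    final List<Evidence> relations = new ArrayList<>();
    final Set<String> reasonableConclusions = new HashSet<>();

    public void addHypothesis(String h) { hypotheses.add(h); }
    public void relate(String datum, String h, boolean supports) {
        relations.add(new Evidence(datum, h, supports));
    }
    public void markReasonable(String h) { reasonableConclusions.add(h); }

    // A minimal well-formedness check an authoring tool might run before
    // release: every reasonable conclusion needs at least one supporting datum.
    public boolean wellFormed() {
        for (String h : reasonableConclusions) {
            boolean supported = relations.stream()
                .anyMatch(e -> e.hypothesis.equals(h) && e.supports);
            if (!supported) return false;
        }
        return true;
    }
}
```

A check like this matters because, as Section 4 notes, the Coach only works effectively once the authored expert argument is well formed.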

6. The Inquiry Research Strategy

Supporting active learning for students who have grown up with computer and video systems requires leveraging technology and multimedia to teach domain content and support scientific thinking. We followed a consistent set of learning and pedagogical principles during development of these tutors, as described in this section.

Learning principles. Four learning principles have guided development of this work.

Role-oriented. Students assume roles, e.g., medical personnel responsible for a diagnosis or engineers deciding which location is most secure. Through practice and repetition, students learn the skill of a master in each situation.

Action-oriented. These environments are designed for action and exploration, with experiences structured so that students actively search for a solution. Neither pre-planned knowledge nor explicit course material is delivered.

Goal-oriented. Students are given a goal to pursue while working in a media-rich environment, requiring them to encounter questions and barriers and to generate hypotheses and evidence. Students both learn and rehearse the techniques behind critical thinking and also experience the need for domain knowledge while solving a problem.

Interactive and exploratory. Students pursue their own path through the environment. No fixed traversal is enforced. Thus, each learner is self-directed and free to explore and construct knowledge in her own way.

Pedagogical principles. Four pedagogical principles of educational environments have guided this research, based on principles identified in Bransford [12] and expressed in a National Academy of Sciences (NAS) report, How People Learn [13]. These principles support delivery of complex learning, domain content and scientific thinking within authentic and customizable environments. They include:

Knowledge-centered. The tutor knows about domain and student knowledge and can reason about expert rules and a student's arguments.

Learner-centered. The tutor tracks each student's work and responds in the context of that student's reasoning. Students are not treated as blank slates with respect to goals, opinions, knowledge and time.

Assessment-centered. The tutor indicates whether student reasoning is consistent with that of the expert. The Coach makes a student's thinking visible and provides chances for the student to review her learning. Assessment is also provided to teachers, in the form of a final report delivered by e-mail, to inform them about student progress.

Community-centered. Currently, teams of students work together on a single computer to solve cases. Ultimately, people at remote sites will be able to use the tutor to support student collaboration. This latter feature has not been fully implemented.

Producing solid educational material for the Web requires great effort and substantial resources. Stakeholders, including students, teachers, parents and industry, play a critical role in developing that material, with a view toward saving time and resources, as described in this project. All participants need to question the very nature and content of instruction provided on the Web. If the Web is to be worthy of the large investment of time and resources required to impact education, it must provide authentic, flexible and powerful teaching that is responsive to individual students and easy to reproduce and expand.

The set of tutors described in this paper provides a first step in that direction, supporting environments in which students and teachers are involved in authentic problem solving. One of the original dreams behind development of the Web was to provide a realistic mirror (or in fact the primary embodiment) of the ways in which people work, play and socialize [14]. This dream can also be applied to education; the Web will become a primary source and environment for education once sufficient intelligent and adaptive teaching materials are available to make education universal and a realistic mirror of anytime, anyplace instruction.

7. Conclusion

This paper described Rashi, a Web-based infrastructure shared by a number of tutors, allowing each to leverage the accomplishments and insights of the others. Rashi supports active and engaging learning on the Web, tracks each student's critical thinking, and reasons about her knowledge and its own teaching strategies, while remaining open to other resources (Web sites) and other people (on-line communities). This tutor was not rooted in extensions of what already exists in education, such as lectures or bulletin boards. This paper discussed the shared methodology, infrastructure and tool set.


We observed that students often do not have a great understanding of the inquiry process, but do seem to understand the "scientific method" or a structured method of inquiry learning. Rashi helps students learn the inquiry process, though it doesn't teach it; the tutor provides an environment where inquiry learning is easy to do and intuitive. The student is placed in a situation where she is encouraged to make observations, collect coherent thoughts about these observations and come up with possible solutions to the questions or problems posed. The Coach helps a student learn the inquiry process, not by teaching about the process itself, but by helping the student take part in it. The Coach supports students in making hypotheses, finding data and using that data to support or refute hypotheses. In sum, Rashi teaches content by providing a problem that requires knowledge of an academic domain to solve. It teaches the inquiry process by involving students in the inquiry process.

8. Acknowledgements

Research on Rashi was funded in part by the U.S. Department of Education, "Expanding a General Model of Inquiry Learning," Fund for the Improvement of Post Secondary Education, Comprehensive Program, #P116B010483, B. Woolf, P.I., and by the National Science Foundation under grant DUE-0127183, "Inquiry Tools for Case-based Courses in Human Biology," M. Bruno, P.I. and B. Woolf, Co-P.I., and NSF CCLI #0340864, "On-line Inquiry Learning in Geology," D. Murray, P.I., B. Woolf, Co-P.I.

Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the funding agencies.

9. References

[1] Woolf, B. P., Marshall, D., Mattingly, M., Lewis, J., Wright, S., Jellison, M. & Murray, T. (2003). Tracking student propositions in an inquiry system. In U. Hoppe, F. Verdejo & J. Kay (Eds.), Artificial Intelligence in Education, Proceedings of AIED 2003, World Conference, IOS Press, pp. 21-28.

[2] Woolf, B. P., Reid, J., Stillings, N., Bruno, M., Murray, D., Reese, P., Peterfreund, A. & Rath, K. (2002). A General Platform for Inquiry Learning. Proceedings of the 6th Int'l Conference on Intelligent Tutoring Systems, Lecture Notes in Computer Science 2363, 681-697, France.

[3] Murray, T., Woolf, B. & Marshall, D. (2004). Lessons Learned from Authoring for Inquiry Learning: A tale of three authoring tools. The International Conference on Intelligent Tutoring Systems, Brazil.

[4] Bruno, M. (2000). Student-active learning in a large classroom. Presented at Project Kaleidoscope 2000 Summer Institute, Keystone, Colorado. http://carbon.hampshire.edu~mbruno/PKAL2000.html

[5] Bruno, M.S. & Jarvis, C. D. (2001). It's Fun, But Is It Science? Goals and Strategies in a Problem-Based Learning Course. The Journal of Mathematics and Science: Collaborative Explorations, 4(1): 25-42.

[6] Aleven, V. & Ashley, K. D. (1997). Teaching Case-Based Argumentation Through a Model and Examples: Empirical Evaluation of an Intelligent Learning Environment. In B. du Boulay & R. Mizoguchi (Eds.), Artificial Intelligence in Education, Proceedings of AI-ED 97 World Conference, 87-94. Amsterdam: IOS Press.

[7] Krajcik, J., Blumenfeld, P., Marx, R., Bass, K., Fredricks, J. & Soloway, E. (1998). Inquiry in project-based science classrooms: Initial attempts by middle school students. The Journal of the Learning Sciences, 7(3&4), 313-350.

[8] White, B., Shimoda, T. & Frederiksen, J. (1999). Enabling students to construct theories of collaborative inquiry and reflective learning: computer support for metacognitive development. International J. of Artificial Intelligence in Education, 10, 151-182.

[9] Suthers, D., Toth, E. & Weiner, A. (1997). An integrated approach to implementing collaborative inquiry in the classroom. Proceedings of the 2nd Int'l Conference on Computer Supported Collaborative Learning.

[10] Alloway, G., Bos, N., Hamel, K., Hammerman, T., Klann, E., Krajcik, J., Lyons, D., Madden, T., Margerum-Leys, J., Reed, J., Scala, N., Soloway, E., Vekiri, I. & Wallace, R. (1996). Creating an Inquiry-Learning Environment Using the World Wide Web. Proceedings of the Int'l Conference of the Learning Sciences.


[11] Murray, T., Rath, K., Woolf, B., Marshall, D., Bruno, M., Dragon, T. & Kohler, K. (2005). Evaluating Inquiry Learning through Recognition-Based Tasks. International Conference on AIED, Amsterdam.

[12] Bransford, J.D. (2004). Toward the Development of a Stronger Community of Educators: New Opportunities Made Possible by Integrating the Learning Sciences and Technology. Vision Quest, Preparing Tomorrow's Teachers to Use Technology. http://www.pt3.org/VQ/html/bransford.html

[13] Bransford, J. D., Brown, A. & Cocking, R. (Eds.) (1999). How People Learn: Brain, Mind, Experience, and School. National Academy Press, Washington, D.C.

[14] Berners-Lee, T. (1996). The World Wide Web: Past, Present and Future. IEEE Computer 29(10), 69-77.http://www.w3.org/People/Berners-Lee/1996/ppf.html