
Using educational software to support collective thinking and test hypotheses in the computer science curriculum




Giorgos E. Panselinas & Vassilis Komis

Published online: 16 October 2009
© Springer Science + Business Media, LLC 2009

Abstract This study analyses the discourse among the teacher and the students, members of three (3) small groups, who learn in the environment of a stand-alone computer. Two educational environments are examined: the first one, a “virtual laboratory” (Virtual Scale-DELYS) and the second one, a computer modeling environment (ModelsCreator). The ‘Virtual Scale’ environment provides users with curriculum-focused feedback and in that sense it can be categorized as directive. The ModelsCreator environment provides users merely with a representation of their own conception of curriculum concepts, so it can be categorized as an open-ended environment. The goal of this research is to exemplify the way the two educational software environments support (a) the development of collective thinking in peer- and teacher-led discussion and (b) students’ autonomy. The software tools of the ‘Virtual Scale’, along with the resources provided for the problem solving, created an educational framework of hypothesis testing. This framework did not limit the students’ contributions by directing them to give short answers. Moreover, it supported the students’ initiatives by providing tools, representations and procedures that offered educationally meaningful feedback. Based on the above results, we discuss a new educationally important structure of software mediation and describe the way the two software activities resourced collective thinking and students’ initiatives. Finally, for each type of software environment, we propose certain

Educ Inf Technol (2011) 16:159–182
DOI 10.1007/s10639-009-9107-y

G. E. Panselinas (*)
Terma elpidos Atsalenio, 71306 Iraklio, Crete, Greece
e-mail: [email protected]
e-mail: [email protected]

V. Komis
Department of Educational Sciences and Early Childhood Education, University of Patras,
26500 Rio-Patras, Greece
e-mail: [email protected]

hypotheses for future research regarding the support of collaborative problem solving.

Keywords Collaborative learning . Secondary education . Educational software . Thinking together . Discourse analysis . Computer science

1 Introduction

In schools, educational activities that are based on or supported by educational software take place mainly in small student groups (Crook 1994; Holmboe and Scott 2005). These are group activities among students learning by using computers, and among students and a teacher who usually organizes and supports or facilitates the overall process (Mercer et al. 2004). The discourse that is thereby generated has certain educational goals. The students use software tools1 in order to solve a problem or, more generally, meet the requirements of an educational activity. The students use these tools of their own free will, at the direct or indirect (worksheet) prompt of a teacher, or at the prompt of the software itself. The use of artifacts is supposed to transform the cognitive and communicative requirements of human activities and the way in which collaborative action is organised (Crook 1994; Saljo 1998). Computer feedback and interaction between user and computer mediate and structure the evolving discourse (Littleton 1998). Depending on the software environment and the general pedagogical frame in which the software environment is used, different types of discourse and dialogue are created among the members of the educational group (teacher, students) (Fisher 1993; Crook 1994; Wegerif 1996; Mercer et al. 2004). These types of dialogue and discourse have different educational value. Their educational value is crucially determined by (a) whether students are given the opportunity to take control of their own learning process, heading towards student autonomy (Panselinas and Komis 2008; Rasku-Puttonen et al. 2003), and (b) by the emergence of collective thinking (Mercer 2000) in the conversation among the students and the teacher. In the same way, the use of the tools and representations of the educational software is rendered educationally productive if it serves the above purposes, aiming at conceptual understanding.
Many researchers have conducted, and argued for the importance of, studies that identify the interaction structure of discourse in front of the computer screen with regard to computer feedback and user-computer interaction (Fisher 1993; Crook 1994; Wegerif 1996; Wegerif et al. 1998; Anderson et al. 1999; Wegerif et al. 2003; Mercer et al. 2004; Murphy 2007).

Following the above line of thought, the present study analyzes the structure and evaluates the quality of the educational discourse in two different computer environments, as it is formed through the use of the software tools in the educational framework created by the teacher.

The first computer-based educational environment is a “virtual laboratory” (Virtual Scale-DELYS) and the second one a computer modeling environment

1 The term ‘software tools’ denotes the symbolic applications on the user interface of the educational software, which mediate the user’s instructions and set in motion representation- and simulation-making procedures.


(ModelsCreator). ‘Virtual Scale’ is designed to pursue learning of numeral systems concepts (the binary and decimal systems), and the particular educational scenario in the ModelsCreator environment is designed to pursue learning of computer processor features. Both subjects are among the basic subjects of the secondary computer science curriculum in Technical Vocational Education (YPEPTH/ΥΠΕΠΘ 1999a, b).

The ModelsCreator educational software environment allowed us to organize inquiry modeling activities (see Appendix) to pursue inquiry learning, which is suggested to offer a scientific and authentic way of doing and learning science (Lohner et al. 2005). Both the DELYS and ModelsCreator environments gave us the opportunity to organize hypothesis-testing activities/tasks that “have been promoted as a powerful context for supporting knowledge acquisition in science” (Howe et al. 2000, p. 262). Thus, the tasks in both environments are supposed to create a computer-based learning context appropriate for supporting students’ “thinking together” (see also Wegerif 2004).

The ‘Virtual Scale’ (DELYS) environment provides users with curriculum-focused feedback and in that sense it can be categorized as directive. The ModelsCreator environment actually provides users merely with a representation of their own conception of curriculum concepts, so it can be categorized as an open-ended environment. Directive educational software, though not ‘virtual laboratories’, has been associated with the transmission model of learning (Fisher 1993; Crook 1994; Cazden 2001; Wegerif 2004). ‘Open’ computer modeling educational software has been associated with constructivist approaches (Dimitracopoulou and Komis 2005), the “augmentation effect” on student learning (Angeli 2008) and the discovery model of learning (Wegerif 2004).

We suggest that those categorizations are not adequate to exemplify the way each educational environment supports educational discourse and students’ autonomy in authentic school settings (Howe et al. 2000; Wegerif et al. 2003; Wegerif 2004). Having the students use or build a computer visualization and/or simulation to model the subject area does not necessarily provide them with the tools to learn how to solve the problem. “The literature about using the computer as a tool sometimes blurs the distinction between using external cognitive tools, e.g. computers, and developing internal cognitive tools, e.g. thinking skills. These are not the same things.” (Wegerif 2002, pp. 24–25).

In this study, the classroom teacher of the corresponding curriculum courses was one of the researchers. He selected (DELYS) and designed (ModelsCreator) hypothesis-testing activities/tasks and worksheets for both computer educational environments. He designed the task in ModelsCreator according to the software design and the “modeling” type of software, and planned his interventions in both environments according to the ‘guided construction of knowledge’ pedagogy (see Mercer 1995, 2000; Wegerif 2004).

Therefore, given the ‘guided construction of knowledge’ pedagogy applied by the teacher, our aim is to identify the properties of the computer-based educational activities (software tools, other resources for problem solving) that shape the distinctive way each educational environment supports

(a) The development of collective thinking in peer- and teacher-led discussion, and
(b) Students’ autonomy


The structure of this paper is as follows. Firstly, the “guided construction of knowledge” pedagogy is explained with regard to (a) the emergence of collective thinking in peer- and teacher-led discussion and (b) the mediation of the stand-alone computer in collaborative problem solving. Next, we present the setting and the research method of the study itself, followed by its relevant findings. We conclude the study by (a) discussing a new educationally important structure of software mediation, (b) describing the way the two software activities resourced collective thinking and students’ initiatives, and (c) proposing certain hypotheses for future research regarding the support of collaborative problem solving.

1.1 Collective thinking in peer discussion (pD) and teacher-led discussion (tlD)

Mercer (2000) introduced the notion of collective thinking to describe talk in a joint problem-solving activity where participants share relevant past experience and information, creating context for the joint activity, and work with each other’s ideas in order to use language as a tool to transform the given information into new understanding. In particular, “exploratory talk is an effective way of using language to think collectively” (Mercer 2000, p. 153) and cumulative talk “can be very usefully applied for getting joint work completed” with participants “using language to think together” (Mercer 2000, pp. 31–32).

Exploratory talk is the kind of talk in which partners engage critically but constructively with each other’s ideas. Statements and suggestions are offered for joint consideration. These may be challenged and counter-challenged, but if so, reasons are given and alternative hypotheses are offered. Agreement is sought as a basis for joint progress. Knowledge is made publicly accountable and reasoning is visible in the talk (Mercer 2000).

Cumulative talk is the kind of talk in which “speakers build on each other’s contributions, add information of their own and in a mutually supportive, uncritical way construct shared knowledge and understanding” (Mercer 2000, p. 31).

Exploratory talk, cumulative talk and disputational talk were detected in discourse among peer students (symmetric interactions) (peer discussion (pD)) (Mercer 1995). As Mercer has pointed out, ‘exploratory’, ‘cumulative’ and ‘disputational’ are not meant to be descriptive categories into which all observed speech can be neatly and separately coded. They are analytic categories, typifications of ways in which students have been observed to talk together. In practice, excerpts of cumulative talk are likely to be part of a more extended exploratory talk episode. However, there are also cases in which cumulative talk is neither integrated into more extended episodes of exploratory talk nor evolves into exploratory talk.

On the other hand, the teacher often supports the discourse among students (teacher-led discussion (tlD)). He/she therefore acts as an experienced person who uses verbal language to initiate students into the community of discourse of a particular discipline (Lemke 1990; Wells 1992; Mercer 1995). He/she may encourage students to make explicit their thoughts, reasons and knowledge and share them with the class. He/she may ‘model’ useful ways of using language that children can appropriate (Rojas-Drummond and Mercer 2003). He/she may also


focus students’ attention on the details and important aspects of the task (Mercer 1995). Thus, in successful cases, the teacher supports students’ individual and collective thinking with his/her verbal contributions (Panselinas and Komis 2009).

1.2 Educational discourse at a stand-alone computer using directive and open-ended software: Analysis based on teacher-student interaction (I-R-F), student-computer interaction (I-R-F), and peer and teacher-led discussion (pD, tlD)

According to some researchers, the difference between an “open” non-directive and a directive software environment affects students’ contributions to the overall discourse, the length of the discourse and the quality of the discourse (Fisher 1993; Tolmie et al. 1993; Crook 1994; Wegerif 1996; Wegerif et al. 1998; Anderson et al. 1999; Mercer et al. 2004).

Fisher (1993) and Crook (1994) argued that the structure of discourse in a directive software environment often simulates the pseudo-discourse or “triadic dialogue” (Lemke 1990) between a teacher and their students (Teacher Initiation (tI) — Student Response (sR) — Teacher Feedback/Follow up (tF)). In a directive software environment, the initiation derives from the software (Computer Initiation (cI)), the response derives from the students (Student Response (sR)) and the feedback (Computer Feedback (cF)) or the next movement (Computer Follow up (cF)) derives from the software. According to the above researchers, in such a context, students’ contributions to knowledge construction through discourse are limited to short answers.

On the other hand, Fisher (1993) proposes the structure “Student Initiation (sI) — Computer Response (cR) — Student Follow up (sF)” in the case of an “open” non-directive software environment. The students take their own initiatives to use the software tools (sI), which in turn transform the students’ initiatives into representations on the computer screen (cR). The students make the next move (sF). Fisher suggests that this type of environment is more likely to promote dialogue among students. Yet, she cautions that “open-ended software can lead to equally formalized talk, depending on how it is used and how children perceive the aims of the task” (p. 112). Similarly, Mercer (1994) argues that the effectiveness of “open-ended” software for generating extended discussion among students depends on how the tasks using the specific software are incorporated into the overall school activities.

Wegerif (1996), analysing some of the discourses of the Slant project, traced instances of peer discussion (pD) among students who were trying to reach a common decision in order to respond to the computer (sR) in an adventure game (Viking England). That adventure game is considered directive since it instructs the students by setting problems for them (cI) while providing feedback on their actions (cF). The existence of exploratory talk (pD) was explained by the fact that the students had received lessons from their teacher on how they should conversationally behave before reaching a common decision. Therefore, Wegerif proposed the “cI-pD-sR-cF” structure in the case of a directive computer environment that functions inside a pedagogical framework which cultivates exploratory talk among students. Later studies highlight the duality of the computer, acting at times as a subject (instructing role) while being an object (passive role) (Wegerif et al. 2003; Wegerif 2004; Mercer et al. 2004). In this way, they


explained the different psycho-pedagogical environment created when the question and the feedback come from the software and not from the teacher.

Studies by Howe, Tolmie and colleagues (Tolmie et al. 1993; Howe et al. 2000) in teaching Science showed that, in both computational and non-computational environments, the structure of the activity importantly determines the quality of the educational discourse that develops. The above researchers suggest “hypothesis testing” activities in order to achieve both conceptual and procedural knowledge in Science education. In these activities “(a) pupils first debate their conceptual understanding and reach a consensus about the hypothesis to be investigated, (b) next subject their consensual positions to expert guidance (by a teacher) about how to pursue a practical controlled investigation of their hypothesis, (c) perform the investigation and (d) discuss the outcomes together to draw conclusions” (Howe et al. 2000; Mercer et al. 2004). Wegerif, Mercer and their colleagues applied the above structure of teaching-and-learning activities in order to pursue goals of the Science curriculum in computer-based educational environments (Wegerif et al. 1998; Wegerif et al. 2003; Wegerif 2004; Mercer et al. 2004). The structure “cI-pD-sR-cF” is integrated into those hypothesis-testing activities as part of the overall interaction structure. This four-part structure is called ‘guided construction of knowledge’ and offers a third way to model the teaching-and-learning process, different from the models of transmission or discovery of knowledge. In two later studies, the role of the teacher interpreting and contextualizing the correct answer for the students appears as a fifth part of the above-mentioned structure (tlD) (Wegerif 2004; Mercer et al. 2004). The teacher’s support of contextualizing the whole activity is developed after the simulation stage (cI-pD-sR-cF-tlD).

In the above patterns of computer mediation, the computer assumes the role of “subject” (Computer Initiation (cI) ... Student Response (sR) — Computer Follow up (cF)) or the role of “object” (Student Initiation (sI) — Computer Response (cR)) (see Crook 1994). We suggest that these patterns also differ in terms of the initiative of using the tools of the software. In the former case, students use the software tools as a response to another’s (software, teacher) initiation (sR), whereas in the latter, students use the software tools on their own initiative (sI).

2 Method

“Computer Science” in Greek secondary education is mainly taught in vocational schools. Therefore, we selected an evening vocational technical school to study the pursuit of the computer science curriculum in secondary education.

2.1 Participants

2.1.1 Students

Three (3) groups of students of a Greek evening vocational technical upper secondary school (two dyads and one triad) participated in the study along with their teacher-researcher of the “Computer Hardware” and “Basic Maths in Computer Science” curricular subjects. The seven students, four males and three females, were


first-year students aged from 17 to 30. All three groups that participated in the study consisted of students with similar ages (peer groups). The students of the study had low to moderate learning capacities. They had almost no experience of educational software tasks (a 10-min familiarization with the software before the main activities of the study in hand). Yet, they had some experience of general application software tasks (word processors etc.) and they knew each other, as they worked in the same small group in the computer lab.

2.1.2 Teacher

The classroom teacher can be described as a “Computer Science” teacher who, at the time of the data collection, had already had 8 years of teaching experience and had been continually developing his professional competence by studying and researching educational sciences. What is more, his pedagogical thinking had been influenced by neo-Vygotskian approaches to learning (e.g. Crook 1994; Mercer 2000; Wegerif 2004).

During the activities, the teacher tried to scaffold students’ performance until the students solved the problem with his help (Panselinas and Komis 2009). He also told the students that, in doing the activities, they had to cooperate and talk to each other in order to reach a consensus. He stimulated discussion about elements of the solution process, avoiding direct answers. Instead, he invited students to come up with their own solutions and redirected their individual answers to the group (see also Hoek and Seegers 2005). He set low to moderate learning goals and he intervened to instigate peer discussion whenever it did not develop. He acted this way in order to elicit problem-solving contributions on behalf of the students towards knowledge construction and problem solution. In most of the activities, he generally succeeded in having the students provide evidence of assisted performance within 35–60 min.

2.2 Greek educational context

In Greek classrooms, groupwork and especially cooperative and collaborative learning are limited. This is due to the lack of substantial initial and in-service teacher training that takes into consideration teachers’ professional needs and beliefs (Gravani and John 2005). Yet, ICT courses in computer labs brought in task-oriented and project-oriented learning as well as the setting of dyads and triads working together at a computer screen (Panselinas 2002). However, due to top-down imposed policy decisions on ICT integration in schools and the lack of an appropriate teacher training system that meets teachers’ needs and takes into account both classroom culture and educational theory and research, teacher-centered pedagogy practically prevails (Jimoyiannis and Komis 2007). Therefore, Greek teachers are not trained in ways to evolve a groupwork activity into a collaborative learning activity in which the teacher coaches students’ “thinking together” (see Mercer et al. 2004).

Thus, the students of the research in question were used not to talking to each other while learning in the presence of a teacher, but rather to addressing the teacher whenever prompted to give an answer, set a hypothesis, draw a conclusion or generally contribute to the discussion. In this way, the teacher’s prompts to “talk to


each other” were actually prompts to adopt a communication behaviour they were not familiar with, at least when the teacher was present.

2.3 Software

DELYS consists of a collection of complementary microworlds regarding different subjects of Computer Science (Dagdilelis et al. 2003). In the present research, the microworld of the ‘virtual scale’ is used (Fig. 1). It is a virtual laboratory that was designed for students to understand the relationship between the decimal and the binary numeral systems and to investigate basic principles of various numeral systems. The user interface of the software contains a pair of scales where the users can put weights on the left tray representing the numeral weight units of the decimal system (1, 10, 100, etc.) and weights on the right tray representing the numeral weight units of the binary system (1, 2, 4, 8, etc.). The scales balance when the user “unlocks” the system and the represented numbers on the left and right trays have the same “weight”. At the same time, apart from the representation using weights for the binary and decimal systems, there is also a representation with digits, for example 12 in the decimal system or 1100 in the binary system, as shown in Fig. 1. Using the software, the students are able to change either of the two representations and the software updates the other representation automatically.
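The link between the two representations can be made concrete with a short sketch. The following Python fragment is an illustration written for this article, not part of DELYS: it greedily decomposes a decimal number into the binary weight units (1, 2, 4, 8, ...) that would balance it on the right tray, alongside the digit representation the software displays next to the weights.

```python
def binary_weights(n: int) -> list[int]:
    """Binary weight units (powers of two) that balance a decimal number n."""
    weights = []
    w = 1
    while w * 2 <= n:          # find the largest power of two not exceeding n
        w *= 2
    while n > 0:
        if w <= n:
            weights.append(w)  # "place" this weight on the binary tray
            n -= w
        w //= 2
    return weights

def to_binary_digits(n: int) -> str:
    """The digit representation displayed alongside the weights."""
    return bin(n)[2:]

# The example from Fig. 1: 12 on the decimal tray balances 8 + 4
print(binary_weights(12), to_binary_digits(12))    # [8, 4] 1100
# The number the students convert in Episode 1
print(binary_weights(167), to_binary_digits(167))  # [128, 32, 4, 2, 1] 10100111
```

The greedy subtraction here parallels student B's "subtract 128 from 167" strategy in Episode 1.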

“ModelsCreator” is a computer-modeling environment that allows students to create and test models representing different aspects and phenomena of the natural world (Fig. 2). Testing the models’ behaviour is carried out by direct manipulation and on the basis of the multiple representations available within the environment of

Fig. 1 The user interface of DELYS (virtual scale)


ModelsCreator (simulation, bar charts, graphs, tables, etc.) (Dimitracopoulou and Komis 2005). “ModelsCreator” contains objects that have a mediating role, helping students to manipulate abstract entities and concepts (properties of objects). Properties of the same entity or of other entities can be connected with qualitative, semi-quantitative or quantitative relations. “ModelsCreator” integrates semi-quantitative models, which meet the requirements of many curriculum subjects, permitting interdisciplinary use of the modeling process. The educational scenario we designed, deployed and tested concerns the teaching-and-learning of Hardware concepts (Panselinas et al. 2005).
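To give the flavour of the kind of semi-quantitative relation modeled in the Hardware scenario, here is a minimal quantitative counterpart, written in Python for this article (the numbers and the fixed-workload assumption are illustrative, not taken from the study): for a workload of a fixed number of clock cycles, execution time decreases as processor frequency increases.

```python
def execution_time(workload_cycles: float, frequency_hz: float) -> float:
    """Time to run a fixed workload: the higher the frequency, the shorter the time."""
    return workload_cycles / frequency_hz

workload = 3e9                            # hypothetical application: 3 billion cycles
t_slow = execution_time(workload, 1e9)    # 3.0 s at 1 GHz
t_fast = execution_time(workload, 2e9)    # 1.5 s at 2 GHz
assert t_fast < t_slow                    # the relation: frequency up, time down
```

ModelsCreator expresses the same relation only semi-quantitatively ("as X increases, Y decreases"), without committing to a formula.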

2.4 Tasks, procedure and data collection

The tasks were integrated into students’ school “computer science” courses, as the classroom teacher of the corresponding curriculum courses was one of the researchers. The task in Virtual Scale aims at teaching concepts regarding the decimal and binary systems and enabling the students to make conversions and predictions regarding those numeral systems. The task in the ModelsCreator environment aims at students identifying the processor features and estimating their importance for the efficiency of the computer system. The activities/tasks were based on worksheets. Parts of them are cited in the Appendix.

Fig. 2 The user interface of ModelsCreator: modeling the relation “as long as the processor frequency increases the application software execution time decreases”. (On-screen areas: Entities; Library; Models activity space; Semi-quantitative relations; Tools for alternative representations of models; Tools for running and testing models)


The three (3) groups of students (two dyads and one triad) worked in the two educational software environments at a stand-alone computer in the computer laboratory, with only the teacher present and not the other two groups. The rest of their schoolmates were attending a different course in a different classroom. The six (6) activities were captured on video so as to capture not only verbal but also non-verbal interaction and activity. The discourse of the participants was recorded in annotated transcripts.

2.5 Method of analysis

In order to pursue the purpose of our study, we detect the interaction structure of the educational discourse in terms of (a) student-computer interaction and students’ initiatives to use the software tools for hypothesis testing and (b) the emergence of collective thinking in peer- and teacher-led discussion. We suggest that different software tools and resources for problem solving generate different structures of educational discourse. The structures of educational discourse differ in terms of whether or not they incorporate collective thinking and whether or not they support students’ initiatives. Thus, the pursuit of the purpose of the study is operationalized as discourse structure detection.

We employ a discourse analysis that segments the whole talk of each task into parts. Each part contains the talk of one sub-task where there are sub-tasks (the “Basic Maths in Computer Science” task); otherwise, where there are no sub-tasks, the whole talk of the task is the one and only segment (the “Computer Hardware” task). Next, the analysis of the talk and action in each segment (sub-task) is based on the teacher-student interaction (I-R-F), the student-computer interaction (I-R-F), the discussion developed among students (pD) and the supporting discussion among the teacher and the students (tlD) (Table 1). In effect, we use the same method of analysis employed in the previous works presented in Section 1.2.
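The coding scheme of Table 1 can be viewed as a small vocabulary over which discourse structures are strings of moves. As a hypothetical sketch (the study's coding was done by hand on transcripts, not by software), a coded segment might be represented and labelled like this:

```python
# The interaction-move codes of Table 1
CODES = {"tI", "tF", "sI", "sR", "sF", "cI", "cR", "cF", "pD", "tlD"}

def structure(moves: list[str]) -> str:
    """Join a sequence of coded moves into a structure label like 'cI-pD-sR-cF'."""
    for m in moves:
        if m not in CODES:
            raise ValueError(f"unknown code: {m}")
    return "-".join(moves)

# The directive-environment structure discussed in Section 1.2
print(structure(["cI", "pD", "sR", "cF"]))  # cI-pD-sR-cF
```

This is only a notational convenience; the paper's notation additionally marks optional stages with parentheses, e.g. tI-(pD)-sR-cF-(tlD).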

Table 1 Categorization of segments of talk and student-computer interaction

Abbreviation Category

tI Teacher Initiation

tF Teacher Feedback/Follow up

sI Student Initiation

sR Student Response

sF Student Follow up

cI Computer Initiation

cR Computer Response

cF Computer Feedback/Follow up

pD Peer Discussion

tlD Teacher-led Discussion

(Teacher Responses to students’ questions are considered integrated into teacher-led discussion (tlD) and coded accordingly because they are not the focus of the present study)


3 Results

3.1 Teaching computer science with the “Virtual Scale” of DELYS

In this section, we study the discourse that took place around the “Virtual Scale” during the teaching-and-learning activities with the three groups. The activity was structured by a worksheet, selected from the software manual (part of it is in the Appendix), which divided the whole task into sub-tasks.

Some sub-tasks created a discussion of the structure type “tI-sR-cF-tlD” (Sub-task 1). In this case, the teacher and the worksheet initiate (tI), and the initiation actually acts as a prompt for the students to use the software in a certain predefined way. Afterwards, the software transforms (cF) the students’ response (sR) and there follows a discussion supported by the teacher (tlD).

The worksheet also contains sub-tasks (e.g. Sub-task 2) which set problems and questions (tI) that students have to resolve through peer discussion (pD). The students reach a consensus and respond to the software (sR), which in turn offers the appropriate feedback (digital representation and balancing of the scales) (cF) (Episode 1). After the software’s feedback, the teacher’s mediation might be needed in order to contextualize the activity and create some kind of recapitulation supporting the students’ collective reflection (tlD). In this way, a “tI-(pD)-sR-cF-(tlD)”2 structure is created.

In Episode 1, in the peer discussion stage (pD), the teacher intervenes just to provide the students with pen and paper and encourage one of them to explain his way of thinking to the other. After the software response (cF), there is some kind of acknowledgement (Ack) of the correctness of the students’ response (sR).

Additionally, the worksheet contains questions which give students the opportunity to use the software tools in order to answer them (Sub-task 3). Within these activities, students have the opportunity to set up a virtual experiment on their own in order to test the truth of a hypothesis, which, in turn, will help them to solve the problem. In this way, the teacher poses a problem (tI) and there follows peer or teacher-led discussion (D is either pD or tlD), wherein the students have the opportunity to set a hypothesis. Next, the students set up a virtual experiment and test the hypothesis using the software tools (sI-cR). This experiment creates an interaction structure of the “sI-cR-D” type (Fisher 1993). In this case, the computer environment assumes a more “passive” role, waiting for the students to use the software tools the way they see fit. Thus, the decision-making and the control of action are passed on to the students. Finally, the use of software tools (sI-cR) and the discourse among the members of the educational group (D) lead the students to solve the problem (sR). Therefore, the developed structure is: tI-(D)-(sI-cR-D)...(sI-cR-D)-sR (Episode 2).

In Episode 2, the conversation regarding the testing of the hypothesis of Sub-task 3 is not cited in its entirety; only the first part is shown. That is why the sR part is not the final response of the students but a temporary one.

² The stages in parentheses are optional.

Educ Inf Technol (2011) 16:159–182, 169

Episode 1 Sub-task 2 Group B_S

(Transcription conventions: E - teacher; B, S - students; "/" - pause in discourse less than 2 s; "//" - pause in discourse more than 2 s; "[" - overlapping utterances; (text) - non-verbal activity and comments; () - unintelligible speech; "..." - reading from the worksheet; commas are not used)

tI B: "lock the Scales" / locked / "place weight units on the right tray so as the scales balance once you unlock them"

E: here we are

B: here we are

E: place weight units on the right tray to balance the weight units on the left tray once you unlock them.

pD Β: 167

S: 128 130 (S counts pointing at the screen)

Β: can we use a calculator?

S: No. Come on

Ε: I can give you pen and paper

S: there is no problem

Ε: here you are (E gives them) / you will need a pen and paper at some point.

S: [130 162 (he counts watching the screen)

Β: so wait

Ε: don’t erase anything

S: come on / come / I found it / leave it there

Ε: It’s ok / but I want both of you to understand/ not only one of you

Β: [167 (B uses pen and paper)

Ε: [please explain it to your fellow student]

Β: Subtract 128 from 167

S: see what I have written / what I've put there (B is watching)

Β: ok / we convert the decimal form of 167 into the binary form.

S: How much is this (pointing at the screen) is it correct?

Β: 128 plus [32 equals 130 160

S:[calculate (points at the paper)

Β: stop 160 164 167 168

S: why is it 167 man? [One hundred and sixty

Β: [one hundred (both point at screen indications) wait 128 32 / wait 32 equals 130 160

S: yes!

Β: Ok?

S: and 7

Β: 165 166 167 ok! If we unlock it now

sR (The students have placed weight units on the right tray and unlock the Scales)

cF (The scales show their indication)

Ack S: it’s in balance

Β: it is going to be exactly in balance

S: here it is

Ε: ok, it looks correct / let’s continue
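The arithmetic the students in Episode 1 carry out by hand, decomposing 167 into the available powers-of-two weight units, amounts to a greedy decomposition. A minimal sketch (the function is our illustration, not part of the DELYS software):

```python
def weight_units(n, units=(128, 64, 32, 16, 8, 4, 2, 1)):
    """Greedily pick powers-of-two weight units that sum to n,
    mirroring how the binary representation of n is read off."""
    chosen = []
    for u in units:
        if u <= n:
            chosen.append(u)
            n -= u
    if n != 0:
        raise ValueError("n exceeds the range of the given units")
    return chosen

print(weight_units(167))  # [128, 32, 4, 2, 1] - the units that balance 167
print(weight_units(201))  # [128, 64, 8, 1] - the units listed in Sub-task 1
```

Each chosen unit corresponds to a '1' in the binary representation, which is exactly the conversion B refers to ("we convert the decimal form of 167 into the binary form").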


Episode 2 Sub-task 3 Group T_Z

tI Ε: "is there any sum of weight units on the left tray which is not possible to balance?" / that is to say, if I put a number here (E handles the mouse and points at the left tray on the screen) is it likely not to be able to balance the scales placing weights on the right tray?

tlD Τ: no, you will be able in all cases

Ζ: after 128 we place two hundred.../ 120 plus 120 / 100 200 (Z calculates 128 plus 128)

Ε: you can talk now I am not testing you

Ζ: 256

Ε: What was that?

Ζ: after 128 can we use 256?

Ε: yes / you can use 256 but why are you saying this? / That is to say if.. / Can I find a number to place here (left tray) and then not be able to place weights there (right tray) to get the scales in balance? Can I do this? / So can I put a number here / Okay? / in order not to be able to balance the scales? Can I do this?

Ζ: no, you can’t

Ε: come on communicate with each other

Τ: until the number 254

Ε: until two hundred?

Τ: fifty four

Ε: why are you saying this?

Ζ: ()

Τ: the binary system goes up to this number

Ε: so, am I not able to have numbers greater than 254?

Τ: in these scales here (means the software’s representation)?

Ε: generally can I?

Τ: no, you can’t

Ε: so the binary numbers go up to 254?

Τ: we are talking about binary now

Ε: Ζ do you agree?

sI Τ: here if we put everything (T places all the corresponding weights on the left tray)

Ε: yes/ let’s see/ if we put everything in there what can happen let’s see

cR (All the corresponding weight units are placed on the left tray and the digital representation of the number in the decimal system is formed)

tlD Ε: What number is this?

Ζ: 100

Τ: 299

Ζ: 299

Ε: nice

sI (T places all the corresponding weights on the right tray)

cR (All the corresponding weight units are placed on the right tray and the digital representation of the number in the binary system is formed)

tlD Ε: What number is this? 8 aces

Τ: 254

Ε: two hundred? How did you find 254?

Ζ: you sum everything

Τ: In the binary system it goes up to the number 254

Educ Inf Technol (2011) 16:159–182 171171

Ε: it is not 254 / do the math

Τ: it is (makes the calculation on paper) / 210..255!

Ε: it is 255

Τ: 255

Ε: so can't I write the number 299 into binary?

Τ: no I can't

Ε: if I add.. Z do you believe this? Can't I write the number 299 in binary?

Ζ: no we can't

Ε: if I add an ace?

sR Τ: if you add an ace you can

tlD Ε: if I have 9 bits if I have 9 digits

Τ: with 9 bits you can

Ε: can I?

sR Τ: yes yes

Generally, we detected the following three discourse structures in all three groups that participated in the activities with the 'Virtual Scale' of DELYS:

Structure 1: tI-sR-cF-tlD. Structure 1 was detected in (a) sub-tasks with the aim of familiarizing students with the software environment and (b) sub-tasks (Sub-task 1) where students are prompted to conduct a predefined experiment. In these activities, students are not supposed to use the software tools on their own initiative.

Structure 2: tI-(pD)-sR-cF-(tlD). Structure 2 was detected in sub-tasks where the teacher or the worksheet posed a problem to the students. The nature of the problem limits the students to using the software tools solely to represent and test the solution they came up with (Episode 1, Sub-task 2).

Structure 3: tI-(D)-(sI-cR-D)...(sI-cR-D)-sR. Structure 3 was also detected in sub-tasks where the teacher or the worksheet posed a problem to the students. However, in this case, students solved the problem by setting and testing their own hypotheses. The students, aiming to test these hypotheses, conduct a virtual experiment on their own initiative (the 'sI' part in Episode 2). Students of all three groups took such initiatives. This structure of educational activity supports the students' initiatives by providing meaningful feedback. At the same time, it does not limit students' initiatives by directing them to use the software tools merely to represent and test their response to the problem posed, as Structure 2 does.

In this way, the Virtual Scale constitutes a 'virtual laboratory' with tools and settings that provide informative representations and feedback. The software acts as a 'passive' environment where the students use the software tools with the aim of being guided by them. The computer functions as a lab providing meaningful indications, which support the students' thinking in order to reach a common decision. Therefore, it constitutes an environment which does not direct students' initiatives, while it provides feedback to them. Students' initiatives in using the software tools aim at setting up an 'experiment' which will give an answer to the hypothesis-question raised in the process of investigating the solution to the problem posed. In this way, the environment of the Virtual Scale presents characteristics which are likely to support collective thinking developed in the discourse among students but also between students and their teacher.
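The hypothesis the students contest in Episode 2, namely whether the eight weight units top out at 254 or 255 and whether 299 is representable, reduces to a basic property of binary numbers that a short sketch can verify (our illustration, not part of the DELYS software):

```python
# Largest number representable with the eight weight units (bits 2^0 .. 2^7)
max_8_bits = sum(2**i for i in range(8))
print(max_8_bits)          # 255, the value T arrives at on paper

# 299 exceeds 255, so a ninth weight unit ("if you add an ace") is needed
print((299).bit_length())  # 9 bits are required for 299
```

This is the fact the teacher elicits at the end of the episode: with 8 bits the maximum is 2⁸ − 1 = 255, while 299 needs 9 bits.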

3.2 Teaching computer science with the open-ended software of ModelsCreator

In this section, we study the discourse generated in the ModelsCreator environment during the teaching-and-learning activities with the three groups. A worksheet was delivered to the students from the activity book accompanying the software package (Appendix). The author of the activity book is one of the researchers. The one and only task (no sub-tasks) formed the structure of the discourse in the same way for all three groups.

In the ModelsCreator environment, the computer acts as an 'open-ended' teaching-and-learning context, providing tools which can be used by students in order to create and test models of a 'computer system'. The creation and the testing of the models that took place in the particular activities did not offer feedback that supported the users' (students', teacher's) thinking. This is partly due to the 'open' and 'non-directive' character of the software and partly due to the weakness of the software scenario in creating familiar and meaningful representations and simulations (Panselinas et al. 2005). Students construct a model based on their current perception of the phenomenon. The software provides students with a representation of the model as well as a simulation obtained by running the model. Neither the representation nor the simulation provides students with an acknowledgement of the correctness of the model created or with feedback that supports the reconstruction of a correct one. Yet, the educational scenarios followed the structure of 'hypothesis testing' (Howe et al. 2000). In the ModelsCreator case, the responsibility for supporting the students' construction of a correct model has been passed solely to the teacher. The goal of the educational scenario is the conceptual understanding of the computer processor's features and their importance for computer performance. This goal was supposed to be achieved through the students' collective activity and discussion on the concepts, relations and attributes of the related phenomenon. The students' collective activity would take place with reference to the user interface of ModelsCreator, which would function as a shared learning workspace. As a result of the collective activity, the students should be able to set a hypothesis which, with the support of the software environment and mainly of the teacher, would be tested.

Thus, a model creation problem is posed to the students (tI), who are called upon to select a semi-quantitative relation between the 'internal synchronization frequency of the processor' and the 'system performance'. Students might generate a peer discussion among themselves (pD) in order to reach a consensus (sR) regarding the setting of a hypothesis, that is to say the selection of the proper semi-quantitative relation. This consensus leads to the students' use of software tools for the construction and testing of the model. This use of software tools creates representations and simulations on the computer screen (cF). For instance, if students chose a relation of inverse analogy, there would be a representation on the screen as depicted in Fig. 2. Next, the students can run the model, creating a simulation of the relation 'as the frequency of the processor increases, the application execution time decreases' (Fig. 2). There follows a discussion among the members of the educational group, supported by the teacher, in order for the students to construct and share the meaning that will lead them to conceptual understanding (tlD). If necessary, the students would reconstruct the model in order to reflect their newly constructed conceptual understanding and would retest its behaviour (sR-cF). Consequently, the activity has the following structure: 'tI-(pD)-sR-cF-tlD-(sR-cF)'. The parentheses around pD (students' peer discussion) mean that students' collective thinking is not guaranteed. Peer discussion is favoured or not by the general educational framework: in particular, the software environment, the nature of the educational activity and the pedagogical context where the activity takes place. On the other hand, the teacher's supporting discussion (tlD) is necessary, as the software environment is not of a directive type.

In the educational scenario that was applied and tested in our study, there was no discussion among students that incorporated collective thinking (pD). The students reached a correct (Episode 3) or incorrect consensus (Episode 4) without developing collective thinking.

In Episode 3, the students are asked to choose a semi-quantitative relationship between the 'internal synchronization frequency of the processor' and the 'execution time of a software application'. Only part of the discussion among the students is presented. The students draw a correct conclusion without thinking collectively. The teacher intervenes in order for the students to get familiarised with the symbol system used by the software.

Episode 3: "Processor frequency and execution time of application" - Group B_S

Ε: what is this? What you have already placed means that as long as the one (processor frequency) increases the other (application execution time) increases too

S: B / it's not right

Β: okay I got it

S: let us put this (points at a relation on the computer's screen)

Ε: reverse analogy

Β: as the processor increases the execution time decreases

S: yes, we will put this

Β: as the processor's frequency increases the execution time decreases (B corrects his aforementioned hypothesis)

Ε: let's go to the next one

S: yes, is this correct? (towards E)

Ε: I don't know we are going to run it and then we will see

In Episode 4, the students try to choose a semi-quantitative relationship between the 'kind of software application' and its 'execution time'. Only part of the discussion is presented. Despite the teacher's prompt to the students to go back and ground their thinking in their common knowledge and experience, they were not able to generate any kind of collective thinking. Finally, the teacher postponed his scaffolding intervention until the discussion stage (tlD) after the construction and testing of the students' initial model.

Episode 4: “Kind of application and execution time” — Group F_A_S

Ε: you will put a relation that links both of them / here is the kind of the application and there is the execution time (E points at the screen)

Α: does it stay the same?

F: this one (F points at the screen)

Α: does the one increase and the other stay the same?

S: up here (A handles the mouse) / it's three isn't it?

Ε: I don't know / we will check at the end // don't look at the ones below

Α: aha!

Α: and which one is it?

F: this one (F points at the screen)

Ε: before you choose just think what actually happens in real computer systems

S: yes / the processor..

Α: I don't know

Ε: put any relation now and we will discuss it in a moment.

In other words, the students in their first attempt to create a model reached a common decision, but in most cases they did not explain how they reached it. This fact might constitute a weakness of the teaching-and-learning process (Howe et al. 2000).

Also, the students and the teacher, during the supporting discourse among them (tlD), neither used nor referred to the software tools and the representations on the computer screen in order to support their thinking and construct meaning. They were led to conceptual understanding solely through the use of their common knowledge regarding the concepts and phenomena studied. That occurred through a structured teaching-and-learning process created by the 'Let's think' stage of the worksheet (Episode 5).

In Episode 5, the members of the group try to develop mutual understanding and answer the question in the worksheet that concerns the relation of the internal synchronization frequency of a processor to the execution time of an application. Episode 5 took place in the 'Let's think' stage after the construction and testing of the students' initial model.

Episode 5: "Processor's frequency and application execution time" after the construction and testing of the students' initial model - "Let's think" stage - Group B_S

S: "In which way do changes of the internal frequency of a processor affect the execution time of an application running in a computer system? / How do you explain it? / Higher internal frequency means more.." // (B and S are supposed to fill in the sentence they read with a word or phrase)

Ε: Higher internal frequency / what does it mean?

Β: more [ticks

S: [ticks (B and S mean the "ticks" or cycles of the synchronization clock of the processor)

Ε: so if you go up to 800 what is it going to happen? (E means the clock frequency of 800 MHz)

S: we will have more..

Ε: if we go from 800 up to 900 what is it going to happen?

Β: we will have more ticks

Ε: that's right!

Β: more tasks

("Task" is a concept that developed through the discussion among the members of the educational group and refers to the single system function that takes place in the processor each time a cycle of the processor clock is completed. A "task" means the execution of a number of instructions of the application that depends on the generation of the processor; for example, one "task" in a Pentium III corresponds to three (3) instructions per cycle.)

S: tasks / not tasks

Ε: more cycles of the processor's clock which means more tasks okay? / So what are you going to write down there?

Β: more cycles

S: more cycles

Ε: of the clock

S: of the clock

Ε: when you have these kinds of questions you should read the whole sentence including the phrase after the blank

S: Yes / in order to complete the sentence properly / "of the synchronization clock per second / consequently"

E: "more.." / "per second"

Β: [tasks per second

S: [tasks yeah tasks

Ε: That's right, tasks or instructions / both of them are correct / more tasks mean more instructions

S: "therefore, applications software execution time.." // is

Ε: decreases? [increases? or remains the same?

S: [getting less
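The causal chain the group builds in Episode 5 (higher clock frequency, hence more clock cycles per second, hence more tasks or instructions per second, hence shorter execution time) can be illustrated numerically. The workload and instructions-per-cycle figures below are our own assumptions; the three-instructions-per-cycle value only echoes the Pentium III example mentioned in the transcript:

```python
def execution_time(instructions, instructions_per_cycle, clock_hz):
    """Seconds needed to run a fixed workload: time falls as frequency rises."""
    cycles_needed = instructions / instructions_per_cycle
    return cycles_needed / clock_hz

# Hypothetical workload: 2.4 billion instructions at 3 instructions per cycle
t_800 = execution_time(2.4e9, 3, 800e6)   # at 800 MHz
t_900 = execution_time(2.4e9, 3, 900e6)   # at 900 MHz
print(t_800, t_900)  # t_800 = 1.0 s, t_900 is shorter: raising the frequency
                     # from 800 to 900 MHz reduces the execution time
```

For a fixed workload, execution time is inversely proportional to clock frequency, which is exactly the semi-quantitative relation the students were asked to model.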

The educational scenario of the worksheet formed the structure of the discourse in the same way for all three groups that took part in the study.

Structure 4: tI-(pD)-sR-cF-tlD-(sR-cF). Structure 4 concerns the testing of a hypothesis (sR), which is set as an answer to a problem posed by the teacher (tI). The absence of tools, representations and processes incorporated into the software that could offer meaningful feedback to the students obliges the teacher-led discourse (tlD) to find the appropriate grounding in other means (i.e. worksheets, the common knowledge of the group). In this way, there were no students' initiatives regarding the use of software tools and representations. The discussion among the students (pD) is not supported by the computer but presumes a suitable pedagogical framework and support with other means.

4 Discussion - Conclusion

We first discuss a new, educationally important structure of software mediation that is derived from the study. Next, we describe the way the two software activities resourced collective thinking and students' initiatives. Finally, for each type of software environment, we propose certain hypotheses for future research regarding the support of collaborative problem solving.

4.1 A computer-based educational environment for supporting collective thinking, students' autonomy and self-regulation

In the 'tI-(pD)-sR-cF-(tlD)' structure detected in the Virtual Scale activity (Sub-task 2), as well as in Wegerif and colleagues' studies (Wegerif 1996; Wegerif et al. 2003; Mercer et al. 2004), students seem to use the software tools solely to represent and test the solution they came up with in the discussion. However, in the 'tI-(D)-(sI-cR-D)...(sI-cR-D)-sR' structure (Sub-task 3), the students use the software tools on their own initiative in order to test their own hypotheses (sI-cR-D). We suggest that this is different and even more educationally important (see also Lohner et al. 2005), because it (a) can be related to procedural knowledge that constitutes a means to conceptual knowledge (Howe et al. 2000) and (b) constitutes an indication that students are moving towards autonomy and self-regulation (Panselinas and Komis 2008; Boekaerts 1997).

This finding adds to the theory (Wegerif 1996, 2004; Wegerif et al. 2003; Mercer et al. 2004) by showing that the categorization of educational software as 'directive', in terms of providing guidance through feedback, is not necessarily related to the transmission model of learning. It is the type of feedback and software tools, as well as the educational framework provided by the teacher, that shape the quality of the educational support provided by the software.

4.2 Resources for collaborative problem solving at a stand-alone computer

The significance of resources for collaborative problem solving has been stressed in earlier research (Crook 1994, 1998; Wegerif 1997, 2002) and there is a contemporary strand of CSCL research that studies this issue (Zumbach et al. 2005; Fischer 2007). Yet, few studies have given empirical evidence of how resources affect collaboration in the context of a stand-alone computer (Wegerif 1997; Schilter et al. 1998; Lohner et al. 2005).

Given the same pedagogical context in which both hypothesis-testing activities took place, we suggest that the properties of the computer-based educational activities that shape the distinctive way each environment supports (a) the development of collective thinking in peer- and teacher-led discussion and (b) students' autonomy are the following:

1. The existence of software tools which trigger simulations and representations offering meaningful information and feedback to the students.

2. The provision of other resources (discussion, text, subject-related processes³) that enable students to utilize information taken from the environment in order to discuss and reach a common decision.

4.2.1 Open-ended modeling software

In the case of the open-ended modeling software environment, the first prerequisite might have been satisfied if (a) the external representations and simulations on the user interface presented a visual linkage with the concepts and phenomena they stand for, or (b) it was possible for the students to identify a conceptual or procedural analogy between them.

In the ModelsCreator environment, the information, as well as the processes for utilizing it, is encapsulated in the 'Let's think' text of the worksheet. The teacher supports the students' conceptual understanding through a structured inductive teaching process that uses both teacher-led discussion and teaching material, but not the software tools.

Generally, in an 'open-ended' software environment aiming at certain curriculum goals, the teaching is not done by the software. However, a question that arises is: 'What does an open-ended modeling environment of this type offer in terms of a teaching-and-learning environment?' Trying to answer this question, we suggest that an open-ended modeling environment of this type offers the development of a new symbol and representation system, that is to say a new 'language' for the description of phenomena. For instance, Fig. 2 depicts a new representation, or even a modeling, of the hypothesis 'as the internal frequency of the processor increases, the application software execution time decreases'. This 'language' should first be learned by students in order for them to be able to use it to represent their perceptions of the phenomena studied. These perceptions can be presented with a variety of representations on the computer screen and may function as a common reference in the discussion, since they are in common view. However, we should be cautious regarding misunderstandings when students use software tools, because software tools use symbols and representations instead of the real phenomena (O'Malley 1992; Panselinas 2004; Wegerif 2002).

4.2.2 Virtual labs

On the other hand, 'virtual laboratories' such as the Virtual Scale can function as 'passive' educational environments (object) with the ability to provide feedback with educational meaning whenever students use the software tools (subject). More importantly, it seems that this feedback does not limit but supports students' initiatives to 'set up' hypothesis-testing virtual experiments.

³ 'Numeral systems'-related processes: the process of forming a number by summing the corresponding weight units, and the process of converting the binary representation of a number into its decimal representation and conversely.

In this type of software, the options for the creation of a wide range of 'virtual experiments' may vary. Do the students have the opportunity to create or select from a wide range of experiments or not? Do the students have the opportunity to involve a variety of parameters of the phenomenon being studied? A research question that arises from the present study (Structure 3) is whether the opportunities that 'virtual laboratories' offer for the involvement of multiple parameters and the creation of a variety of 'virtual experiments' affect students' initiatives and the generation of collective thinking regarding both procedural and conceptual knowledge. If, for example, the software offers a limited number of predefined experiments, the need for collective thinking among students might be critically limited. On the other hand, the presence of a variety of experiments and parameters might require the students to think collectively in order to set up the suitable experiment that would give an answer to the question they have posed.

Appendix

Part of the worksheet accompanying the ‘virtual scale’

Sub-task 1

Empty both trays of the Scale. Place 201 weight units on the left tray.

1. Lock the Scales.
2. Place the weight units 128, 64, 8, 1 on the right tray.
3. Unlock the Scales.
4. What do you see? Can you explain it?

Sub-task 2

Empty the Scales. Place 167 weight units on the left tray.

1. Lock the Scales.
2. Place weight units on the right tray so that the scales balance once you unlock them.
3. Unlock the Scales.

Sub-task 3

Is there any sum of weight units on the left or the right tray which it is not possible to balance? In other words, if I put a number on one tray, might it be impossible to balance the scales by placing weights on the other tray?


Part of the worksheet: constructing and testing a model for the 'processor's properties and application execution time'

How does the internal frequency affect application software execution time in a personal computer? Create a model that explains that influence.

1. Create a model by dragging and dropping the appropriate entities into the model's activity space.
2. Select the appropriate properties and a relation.
3. Test the model's behaviour manually by moving the bar, automatically with the "play" button, or using the "step by step" button.

Let’s think

In which way do changes in the internal frequency of a processor affect application software execution time? How would you explain that?

Higher internal clock frequency means more (a)________________________ per second, consequently more (b)______________ per second; therefore, application software execution time (c)____________.

It is possible that the answers you have given to those questions have led you to change or add something to your model. Integrate your new ideas into the model you have already created and re-test its behaviour.

References

Anderson, A., McAtteer, E., Tolmie, A., & Demissie, A. (1999). The effect of software type on the qualityof talk. Journal of Computer Assisted Learning, 15(15), 28–40.

Angeli, C. (2008). Distributed cognition: a framework for understanding the role of computers inclassroom teaching and learning. Journal of Research on Technology in Education, 40(3), 271–279.

Boekaerts, M. (1997). Self-regulated learning: a new concept embraced by researchers, policy makers,educators, teachers, and students. Learning and Instruction, 7(2), 161–186.

Cazden, C. (2001). Classroom discourse: The language of teaching and learning (2nd ed.). London:Heinemann.

Crook, C. (1994). Computers and the collaborative experience of learning. Routledge.Crook, C. (1998). Children as computer users: the case of collaborative learning. Computers and

Education, 30(3/4), 237–247.Dagdilelis, V., Evangelidis, G., Saratzemi, M., Efopoulos, V., & Zagouras, C. (2003). DELYS: a novel

microworld-based educational software for teaching computer science subjects. Computers andEducation, 40(4), 307–325.

Dimitracopoulou, A., & Komis, V. (2005). Design principles for the support of modelling andcollaboration in a technology based learning environment. International Journal ContinuingEngineering Education and Lifelong Learning, 15(1/2), 30–55.

Fischer, G. (2007). Designing socio-technical environments in support of meta-design and social creativity. InProceedings of Conference on Computer Supported Collaborative Learning (CSCL ’2007), pp. 1–10.

Fisher, E. (1993). Characteristics of children’s talk at the computer and its relationship to the computersoftware. Language and Education, 7(2), 97–114.

Gravani, M., & John, P. (2005). ‘Them and us’: Teachers’ and tutors’ experiences of a ‘new’ professionaldevelopment course in Greece. Compare: A Journal of Comparative and International Education, 35(3), 303–319.

180 Educ Inf Technol (2011) 16:159–182

Hoek, D., & Seegers, D. (2005). Effects of instruction on verbal interactions during collaborative problemsolving. Learning Environments Research, 8, 19–39.

Holmboe, C., & Scott, P. (2005). Characterising individual and social concept development incollaborative computer science classrooms. Journal of Computers in Mathematics and ScienceTeaching, 24(1), 89–115.

Howe, C., Tolmie, A., Duchak-Tanner, V., & Rattray, C. (2000). Hypothesis testing in science: groupconsensus and the acquisition of conceptual and procedural knowledge. Learning and Instruction, 10(4), 361–391.

Jimoyiannis, A., & Komis, V. (2007). Examining teachers’ beliefs about ICT in education: implications ofa teacher preparation programme. Teacher Development, 11(2), 149–173.

Lemke, J. L. (1990). Talking science: Language, learning, and values. Ñorwood, NJ: Ablex.Littleton, K. (1998). Productivity through interaction: An overview. In K. Littleton & P. Light (Eds.),

Learning with computers: Analysing productive interactions. Routledge.Lohner, S., van Joolingen, W. R., Savelsbergh, E. R., & van Hout-Wolters, B. (2005). Students’ reasoning

during modeling in an inquiry learning environment. Computers in Human Behavior, 21(3), 441–461.Mercer, N. (1994). The quality of talk in children’s joint activity at the computer. Journal of computer

assisted learning, 10, 24–32.Mercer, N. (1995). The guided construction of knowledge: Talk amongst teachers and learners.

Multilingual Matters.Mercer, N. (2000). Words and minds: How we use language to think together. Routledge.Mercer, N., Dawes, L., Wegerif, R., & Sams, C. (2004). Reasoning as a scientist: ways of helping children

to use language to learn science. British Educational Research Journal, 30(3), 359–377.Murphy, P. (2007). Reading comprehension exercises on line: the effects of feedback, proficiency and

interaction. Language Learning and Technology, 11(3), 107–129.O’Malley, C. (1992). Designing computer systems to support peer learning. European Journal of

Psychology of Education, 7(4), 339–352.Panselinas, G. (2002). Groupwork activities and guided construction of knowledge in general applications

software environment. In Proceedings of 3rd Hellenic conference with international participation:“Information and Communication Technologies in Education”, Vol. B, pp. 275–285, Rhodes, Greece(In Greek).

Panselinas, G. (2004). Computer mediation in cognitive interactions amongst students: the case of generalapplications software. Themes in education, Vol. 5, 1–3, pp. 133–148 (In Greek).

Panselinas, G., Komis, V., Politis, P. (2005). Modelling activities with educational software with regard tocomputer operation. In Proceedings of 2nd International conference “Hands-on science: Science in aChanging Education”, pp. 70–75. Rethymno, Greece.

Panselinas, G., & Komis, V. (2008). Students' initiatives to test hypotheses by using educational software:an aspect of self-regulation. In Proceedings of 4th conference on Didactics of Informatics, Patras,Greece (In Greek).

Panselinas, G., & Komis, V. (2009). 'Scaffolding' through talk in groupwork learning. Thinking skills andCreativity 4(2), 86-103.

Rasku-Puttonen, H., Eteläpelto, A., Arvaja, M., & Häkkinen, P. (2003). Is successful scaffolding anillusion? — Shifting patterns of responsibility and control in teacher-student interaction during a long-term learning project. Instructional Science, 31, 377–393.

Rojas-Drummond, S., & Mercer, N. (2003). Scaffolding the development of effective collaboration and learning. International Journal of Educational Research, 39, 99–111.

Säljö, R. (1998). Learning as the use of tools: A sociocultural perspective on the human-technology link. In K. Littleton & P. Light (Eds.), Learning with computers: Analysing productive interactions. Routledge.

Schilter, D. G., Perret, J-F., Perret-Clermont, A-N., & De Guglielmo, F. (1998). Sociocognitive interactions in a computerised industrial task: Are they productive for learning? In K. Littleton & P. Light (Eds.), Learning with computers: Analysing productive interactions. Routledge.

Tolmie, A., Howe, C. J., Mackenzie, M., & Greer, K. (1993). Task design as an influence on dialogue and learning: primary school group work with object flotation. Social Development, 2, 183–201.

Wegerif, R. (1996). Using computers to help coach exploratory talk across the curriculum. Computers and Education, 26(1–3), 51–60.

Wegerif, R. (1997). Factors affecting the quality of children’s talk at computers. In R. Wegerif & P. Scrimshaw (Eds.), Computers and talk in the primary classroom. Multilingual Matters.

Wegerif, R. (2002). Report 2: Literature review in thinking skills, technology and learning. NESTA Futurelab Series.

Wegerif, R. (2004). The role of educational software as a support for teaching and learning conversations. Computers and Education, 43, 179–191.

Wegerif, R., Mercer, N., & Dawes, L. (1998). Software design to support discussion in the primary curriculum. Journal of Computer Assisted Learning, 14(3), 199–211.

Wegerif, R., Littleton, K., & Jones, A. (2003). Stand-alone computers supporting learning dialogues in primary classrooms. International Journal of Educational Research, 39, 851–860.

Wells, G. (1992). The centrality of talk in education. In K. Norman (Ed.), Thinking voices: The work of the National Oracy Project. London: Hodder and Stoughton.

YPEPTH/ΥΠΕΠΘ (1999a). ‘Basic Mathematics in Computer Science’ curriculum for Vocational Technical Secondary Education. Retrieved February 9, 2009, from http://pi-schools.sch.gr/download/lessons/tee/computer/PS/BASIC-INFORMATICS-CONCEPT.ZIP (in Greek).

YPEPTH/ΥΠΕΠΘ (1999b). ‘Computer Hardware’ curriculum for Vocational Technical Secondary Education. Retrieved February 9, 2009, from http://pi-schools.sch.gr/download/lessons/tee/computer/PS/yliko.zip (in Greek).

Zumbach, J., Schonemann, J., & Reimann, P. (2005). Analyzing and supporting collaboration in cooperative computer-mediated communication. In T. Koschmann, D. Suthers & T. W. Chan (Eds.), Computer supported collaborative learning 2005: The next ten years. Mahwah: Lawrence Erlbaum Associates.