Using Network Tools to Enable a Distributed Research Project: Reflections on the first two years of “Learning how to Learn”
Patrick Carmichael1, Richard Procter1 and Leslie Honour2
1University of Cambridge Faculty of Education, 2University of Reading Institute of Education
Paper presented at the Annual Conference of the British Educational Research Association, Heriot-Watt University, Edinburgh, 11th -13th September 2003
Abstract
The ‘Learning how to Learn’ Project of the ESRC Teaching and Learning Research Programme makes use of a range of electronic tools to provide information, enable communication and collaboration, and maintain oversight of the progress of the project as a whole. In this paper we review the contribution of these tools to the project and discuss a number of analytical frameworks within which electronic tools for project management and community building may be discussed. We then describe how some specific elements of the website have developed in response to the project design and user needs, and offer some preliminary analysis of their use. This suggests that different groups involved in the project use different sets of electronic tools and use the functions the website offers in a range of ways, some of which may reflect differences in their perceptions of the project and their role within it. We identify a number of areas for further investigation and some outstanding research questions that may be addressed by the project in future.
The Learning how to Learn Project and its Website
‘Learning how to Learn’ is a project in the second phase of the Economic and Social Research
Council’s Teaching and Learning Research Programme and involves staff from four Universities
(Cambridge, King’s College London, Reading and the Open University) and over forty schools spread
across five Local Education Authorities and a Virtual Education Action Zone (VEAZ). The project has
developed and deployed a range of electronic tools in order to enable the development and
management of the project and the ‘real world’ research community associated with it.
Central to these is the project website (http://www.learntolearn.ac.uk)1 which has evolved over the past
two years to fulfil a number of roles (Carmichael, 2002) including:

o the provision of information about the nature and scope of the project to both participants and the wider public;
o the provision of classroom and staff development materials for use by LEA and school coordinators involved in the project;
o data collection and storage;
o the provision of project management and data analysis tools for use by the research team;
o the exemplification of best practice in classroom assessment.

This paper is based on the work of ‘Learning How to Learn - in classrooms, schools and networks'. This is a four-year development and research project funded from January 2001 to March 2005 by the UK Economic and Social Research Council (ref: L139 25 1020) as part of Phase II of the Teaching and Learning Research Programme (see http://www.tlrp.org). The Project is directed by Mary James (University of Cambridge) and co-directed by Robert McCormick (Open University). Other members of the team are: Carmel Burgess, Patrick Carmichael, David Frost, John MacBeath, David Pedder, Sue Swaffield and Richard Procter (University of Cambridge), Paul Black, Bethan Marshall and Joanna Swann (King's College London), Leslie Honour (University of Reading) and Alison Fox (Open University). Past members of the team are Geoff Southworth, University of Reading (until March 2002), Colin Conner, University of Cambridge (until April 2003) and Dylan Wiliam, King's College London (until August 2003). Further details are available at: http://www.learntolearn.ac.uk.
Central to these roles is an innovative content management system that builds web pages and other types
of outputs from data components held on a web server: a typical page viewed by a user might actually
represent many small components embedded in a template. These components represent the basic units
of communication within the project; the largest are characteristically documents constructed in other
applications (such as MS Word documents and PowerPoint presentations) with the smallest being
single bibliographical references, dates and times or email addresses. Content and formatting
information are rigorously separated, allowing customisation of the web pages in response to user
needs and reuse of components in a variety of output formats. Each component is accompanied by a
small metadata packet which is used to locate it within the site and which may include information
about authorship, relevance and sufficient descriptive information for it to be incorporated into the
index of the site’s integrated search facility. The navigation system and internal hyperlinks that appear
on the web pages are generated dynamically from the metadata and the underlying file system in which
all components are contained.
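As a rough illustration of this approach (a sketch only: the project’s actual system was written in Perl, and the file layout, metadata fields and template shown here are assumptions), a component-based page builder might look like this:

```python
# Sketch of a component-based page builder: each stored component couples a
# small content fragment with a metadata packet, and pages are assembled by
# embedding many components in a template. All names here are illustrative.
import xml.etree.ElementTree as ET

TEMPLATE = "<html><head><title>{title}</title></head><body>{body}</body></html>"

def load_component(xml_text):
    """Parse one component: content plus the metadata used for search and navigation."""
    root = ET.fromstring(xml_text)
    meta = {m.tag: m.text for m in root.find("metadata")}
    return {"meta": meta, "content": root.findtext("content")}

def build_page(title, components):
    """Embed components in the template, keeping content and formatting separate."""
    body = "".join(f"<div>{c['content']}</div>" for c in components)
    return TEMPLATE.format(title=title, body=body)

raw = ("<component>"
       "<metadata><author>PC</author><description>welcome text</description></metadata>"
       "<content>Welcome to the project site.</content>"
       "</component>")

page = build_page("Learning how to Learn", [load_component(raw)])
```

Because formatting lives only in the template, the same components can be re-emitted in other output formats without change.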
The electronic tools used by the Learning how to Learn project have been custom built by members of
the team to reflect the needs of a distributed research project; we recognise that not all educational
research projects are in a position to do this, and the project’s appointment of a full-time project officer
with responsibility for technical development has contributed significantly to the capabilities of the
project as a whole. The time taken to develop these tools is offset by the fact that they closely match the aims and design of the project; at the same time, some of the tools are sufficiently generic in character that they could be adapted for use in other research projects and by other stakeholders, including LEAs and groups of schools who wish to continue to develop the work of
Carmichael, Procter and Honour: Using Network Tools to Enable a Distributed Research Project: Page 2
the project at the end of its funding. All the software used is freely available and makes only minimal
technical demands of users, a fact that we hope increases its potential to be used in other settings2.
There is a wide range of interactions between members of the universities team, with instant
messaging, one-to-one email and one-to-many email lists being used not only to help manage the
project but also to share information, solicit advice and feedback and to enable sub-groups with specific
roles or interests to function within the broader project framework. At the same time, the location of
the project within the Teaching and Learning Research Programme means that individuals within the
project also interact within the broader network of the Programme and Research Capacity Building
Network as well as within broader educational networks. Members of the project have other
institutional affiliations and belong to other electronic and real-world networks; when they receive relevant information from these networks, they ‘forward’ it to individuals or to the whole universities
team for information or action. Other patterns of interaction have emerged whereby communication
initially takes place within a small group or between individuals before the resulting work –
instrumentation, data analysis or potential publications – is more widely circulated.3
The use of the website and associated tools to address a range of project needs has increasingly led to a
view of electronic networks not simply as a means of information dissemination but rather as an
environment across which data, analytical tools and users are distributed; and of the website as what
Clark (1996) calls a ‘communication space’. The purpose of the electronic tools we use and develop, then, is both to support the existing patterns of interaction required by the day-to-day practicalities of the project and demanded by the research design (James, Black and McCormick, 2003), and to enable and encourage new patterns of interaction within the
developing research community. In this respect it closely resembles the relationship between
technology and community described by Verwijs et al. (2002, pp. 60-61), in which they argue that while
initially at least “technology support should match the primary activities of the community, its size,
goals, and the preferences of community members”, the availability of new technologies “may inspire
[community members] to explore new interaction styles, thus improving the performance or efficiency
of the community”.
In the course of the selection and development of the electronic tools used by the project, we have not
only integrated information, communication and collaboration tools into the website, but have also
borrowed tools, techniques and approaches from fields outside education research, including financial
services, software development and news provision. Other educational communities use a wider range
of tools - in support of distance learning, conference management and to enable peer-review for
publications, for example, and there are some interesting developments in the area of collaborative or
networked qualitative data analysis (Muhr, 2000; Ford, Oberski and Higgins, 2000; Carmichael, 2001)
– but the picture presented by Swaak, Verwijs and Mulder (2000), in their review of European
electronic communities across several domains, is of educational research communities
characteristically using web-sites as document repositories, with most interaction taking place via
telephone and email, and face-to-face meetings being the primary means by which ideas were
integrated and exchanged (p. 30). Mann and Stewart’s (2000) discussion of the role of electronic tools in the conduct of education research emphasises email and synchronous conferencing and presents the development of customised electronic tools as technically demanding (p. 71), stressing instead the role of new technologies as “means of data collection through active engagement with informants” (Hine, 2001).
Classifying Electronic Tools: Three Frameworks
Electronic tools designed to support community building have been classified on a number of bases by,
amongst others, Steuer (1998), who divides them into synchronous and asynchronous on the one hand
and one-to-one, one-to-many and many-to-many on the other; Slagter, Verwijs and Mulder (2001), who
provide a survey of the features provided by a range of collaborative software applications; and
Wenger, who locates a wide range of electronic tools in a framework informed by his work on
‘Communities of Practice’ (Wenger, 1999; Wenger, McDermott and Snyder, 2002). Each of these
approaches provides a useful analytical framework within which the electronic tools used by the project
may be described – although none provide a definitive description.
Steuer’s emphasis on patterns of interaction encourages us to look at how the project website provides not only one-to-many, asynchronous services such as web pages, email distribution lists and newsletters, but also a rationale for one-to-one communication between research team members and school co-ordinators using email, which is ubiquitous, familiar and easily accessible to most users. Email messages can be ‘cc’d’ to the project website and then appear in the
record of interactions between participants. Steuer’s distinction between ‘private’ and ‘public’
communications, however, is complicated by the fact that, while the ‘logs’ to which participants
contribute appear ‘private’, their purpose is to provide research data about the progress and ultimately
the effectiveness of the project to a larger audience: initially, the research team and in the longer term, a
potentially wider ‘public’. It has been important to make clear to participants in the Learning how to
Learn project that provision of a customised, individual interface or conversation space does not
necessarily mean that all communications taking place within it are ‘private’.4
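A minimal sketch of this ‘cc’ mechanism (the address, field names and storage shown are assumptions, not the project’s actual implementation) might be:

```python
# Hedged sketch: how a message cc'd to a project address might be turned
# into an entry in the record of interactions. The address, field names
# and the sample message below are all illustrative.
from email import message_from_string

RAW = """From: coordinator@example.sch.uk
To: critical.friend@example.ac.uk
Cc: logs@learntolearn.ac.uk
Subject: Arranging the next staff workshop
Date: Mon, 07 Jul 2003 09:15:00 +0100

Could we fix a date for the assessment-for-learning workshop?
"""

def to_log_entry(raw):
    """Extract the header fields and body needed for the interaction record."""
    msg = message_from_string(raw)
    return {
        "from": msg["From"],
        "subject": msg["Subject"],
        "date": msg["Date"],
        "body": msg.get_payload().strip(),
    }

entry = to_log_entry(RAW)
```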
Slagter, Verwijs and Mulder’s (2001) approach is to analyse facilities offered by a number of CSCW
(computer supported collaborative work) environments that they identify as “currently important tools
to support online communities” in that they provide “technical support for communities, that allows
and stimulates a large group of community members to share experiences and knowledge, while
fostering community cohesion”. They divide facilities into those concerned with ‘Information’ (such
as participant profiles and “frequently asked questions”); ‘Communication’ (such as email lists,
discussion boards and chat facilities) and ‘Collaboration’ (version control for documents, access
control systems, peer-reviewing and annotation), and then assess the extent to which the various CSCW
environments offer these. The Learning how to Learn website and associated services offer a range of the facilities they identify, but differ from the applications they describe in that these are supplemented by a range of facilities specifically geared to the needs of the project and its members, including the online ‘logs’ kept by participants. It is also important to recognise that a single “web-based interface” is used to address all these functions, allowing information, communication and collaboration functions to be addressed without the need for any downloads or browser ‘plug-ins’; some online community tools make demands which may be beyond the capabilities of individual users or those based in schools.
Slagter, Verwijs and Mulder’s analysis is also relevant in that it differentiates communities from ‘task
groups’, characterised by a fixed structure “derived from existing hierarchies”, a goal of, or orientation
towards, task completion, and a set timescale (also described by Swaak, Verwijs and Mulder, 2000, p. 18). The latter also characterise the electronic tools selected to support task groups as being more concerned with task-related information exchange than with the provision of enduring
communication and collaboration channels. We shall return to the relevance of this distinction within
the project in due course.
Wenger’s review of ‘community-oriented technologies’ (2001) differs from the previous review in that
it seeks to locate existing computer software in a framework informed by his prior work on
‘communities of practice’ (Wenger, 1999; Wenger, McDermott and Snyder, 2002). Wenger classifies
electronic tools and functions in a framework linked to activities he describes as being characteristic of
community building, identifying the following as common elements of electronically-enabled
communities of practice:
o a home page to assert their existence and describe their domain and activities;
o a conversation space for on-line discussions of a variety of topics;
o a facility for floating questions to the community or a subset of the community;
o a directory of membership with some information about their areas of expertise in the domain;
o a shared workspace for synchronous electronic collaboration, discussion, or meeting;
o a document repository for their knowledge base;
o a search engine good enough for them to retrieve things they need from their knowledge base;
o community management tools, mostly for the coordinator but sometimes also for the community at large, including the ability to know who is participating actively, which documents are downloaded, how much traffic there is, which documents need updating, etc.;
o the ability to spawn subcommunities, subgroups, and project teams (Wenger, 2001, p. 8)
Analysis of the features and functions provided by the project website reveals that most of these
features are provided, but that users do not enjoy the same levels of access to some of these. We can
identify four broad levels of access: project members with administrator rights on the website; project
team members based in Universities; school and LEA coordinators; and other teachers and visitors to
the site. Table 1 shows how these differ in the website as it is currently (Summer 2003) operating.
What we see is a broadly hierarchical structure in which proximity to the technological ‘hub’ of the
project equates with greater access not only to information but also to the range of facilities available.
Some of the differences (sub-community creation, email lists) have come about as a result of demands
which have arisen in the course of the project, whereas others, such as differentiated access to shared workspaces, information about participants and access to documentation, are derived from the original
project design and reflect the roles of the project participants as originally construed, issues such as
privacy and anonymity and the project’s declared intent to share findings and resources as widely as
possible within the lifetime of the project.
Table 1: Project Roles and Access to ‘Community-oriented technologies’ provided by the Project

Facility | Administrators (n=2) | Project team (n=19) | Coordinators (n=c.50) | Others (n=large)
Home Page | Yes | Yes | Yes | Yes
Conversation Space | Instant Messenger | Email Lists | One-to-one email | One-to-one email
Floating Questions | Email Lists | Email Lists | One-to-one email | No
Directory of Membership | Yes | Yes | Yes (limited scope) | Yes (limited scope)
Shared Workspace | Yes | Yes | Yes (limited scope) | No
Document Repository | Yes | Yes | Yes | Yes
Search Engine | Yes (full scope) | Yes (full scope) | Yes (limited scope) | Yes (limited scope)
Management Tools | Yes | Yes | Only in relation to own school/LEA | No
Sub-community creation | Yes | Yes (by arrangement with administrators) | Yes (potentially at end of project) | No
The insights provided by these three analyses go beyond simply listing features and functions of the
project website. If, as all the authors listed suggest, different kinds of communities may make use of
different combinations of electronic tools, then the selection and deployment of particular tools may provide an indication of the nature of the community, either as it already exists or the directions in which it might develop. The use of the tools provided, then, may serve not only as a source of data but also as a ‘barometer’ for the progress of the community-building exercise. Analysis of the patterns
of use of electronic tools (some of which may be interpreted as ‘successful’ or otherwise) may provoke
and enable critical analysis of project objectives and design and may point up differences in
stakeholders’ and participants’ perceptions of the project and their role within it.
The Project Website – Information Provision, Project Management and Community Resource
In the remainder of this paper, we will describe some of the ways in which the project website has
developed. In the course of two years of continuous development, the original information provision
role has been supplemented by communication and latterly collaboration functions. We will describe
two areas of the site that have developed and discuss the impact of these developments and the extent
to which they provide insights into the development of the project and its associated communities.
Data has emerged from discussions within the project team; from contact with schools both mediated
through project ‘critical friends’ and through face-to-face meetings with school coordinators; from
analysis of contributions to the shared workspaces or logs of the website; and from analysis of web
traffic5. The first of these areas is concerned with the provision of information for school coordinators
and classroom teachers about the range of practical strategies which research evidence suggests may
represent effective practice in formative assessment; the second is the collection of data as to the
progress of the project in participating schools.
Information about Classroom Strategies
Part of the initial project intervention in participating schools is a presentation, generally made by a
member of the universities team, defining ‘assessment for learning’, reviewing research evidence, and
offering teachers a range of practical classroom strategies designed to respond to Black and Wiliam’s
recommendation that teachers be provided with ‘a variety of living examples of implementation’ (1998, p. 13). This is then used as the basis of an ‘audit’ process in which schools identify which of the strategies they (collectively or individually) already use, which they might adopt, and what kinds of support they might require from the project in order to design and implement a successful change in policy and practice.

1 The project website at http://www.learntolearn.ac.uk provides access both to information in the public domain and to a range of resources and facilities to which access is restricted. Logging in as a ‘guest’ provides access to only a small part of the entire website, which now extends to hundreds of web pages and thousands of smaller data components.

2 For the technically-inclined: the vast majority of the data is stored in XML-RDF (Extensible Markup Language / Resource Description Framework) and almost all programming is in Perl 5.6, with extensive use of the XML::Parser and RDF::Parser libraries. The site is hosted with a commercial hosting agency on a Red Hat Linux system running the Apache web server.

3 Even a cursory analysis of email traffic within the project reveals the range of interactions taking place. 61 items of project-related email received by one of us (Carmichael) during June 2003 were analysed according to audience (individual; small group; the entire project team; wider audience such as all members of the TLRP) and purpose (project administration; technical advice; project development such as instrumentation and data analysis; information about events, publications and sources; discussion of strategy and of draft materials; and circulation of completed papers and policy documents).

Audience | Admin | Technical | Development | Information | Discussion | Circulation | Total
Individual | 9 | 12 | 2 | 3 | 0 | 0 | 26
Small Groups | 2 | 2 | 9 | 3 | 1 | 1 | 18
Project Team | 4 | 0 | 0 | 1 | 4 | 4 | 13
Wider Audience | 0 | 0 | 0 | 4 | 0 | 0 | 4
Total (n = 61) | 15 | 14 | 11 | 11 | 5 | 5 | 61

This does not include interactions taking place via the website logs, face-to-face meetings, phone calls, faxes and text messages, but a number of patterns emerge nonetheless. Interactions relating to ‘technical’ and ‘development’ activities characteristically take place between individuals or within small groups, with the whole team receiving materials which are either at a later stage in their development or which are concerned with project administration – although the pattern shown by the latter, along with that of ‘information’, does suggest that, as the project has matured, individual roles and interests have become more clearly differentiated.

What is also significant here is that even though there are small groups within the project which might outwardly appear to be ‘task groups’, the nature of the interactions between them goes beyond that associated with task groups by Swaak, Verwijs and Mulder (2000). Closer inspection of the content of the emails in the sample reveals that those in the ‘discussion’ and ‘information’ categories, while addressed to a small group with a specific project role, were often concerned with longer-term issues and related to broader educational and technological agendas than the immediate ‘tasks’.

4 A good analogy for this is the way customer data is used by websites like http://www.amazon.com. Individual customers receive a personalised view of the website depending on their declared preferences and past viewing and purchasing profile; however, while some personal information (addresses, credit-card details etc.) is ‘private’, other data are used (albeit anonymised and aggregated) so as to allow features like reviews, ratings and “customers who bought this item also bought …” lists.
The content of the presentation is contained in a Microsoft PowerPoint presentation of approximately 40 slides, accompanied by notes. This was originally circulated to all project members by email, and was then stored as a downloadable resource on the website, in an area that was password-protected. Almost immediately, however, project coordinators in schools who had taken part in the initial presentation requested copies to review further, to present to staff who had been absent from the original event, and
to integrate into their own training materials. The downloadable resource, together with other materials
for use by participating schools was then moved into an area of the website which was still password-
protected, but which now allowed access by school coordinators.
The next significant development was a major redesign of the website as a whole, and the installation
of the new content management system. At this point (May 2003), the decision was taken by the
project management to release all project materials into the public domain; this was informed by
increasing numbers of requests from schools for materials to be more widely available (and not
password-protected) so as to allow staff to work independently of school coordinators, and by interest
in the work of the project from other agencies and individuals in the UK and elsewhere. Over the
following three months, web statistics6 demonstrated that the presentation was downloaded over 150
times; in some cases by project members who were making presentations, but more significantly by
school and LEA users inside and outside the project.
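The kind of log filtering behind such statistics (described in note 6 as a set of custom Perl scripts) can be sketched as follows, here in Python; the Apache ‘combined’ log format is standard, but the bot patterns, maintenance hosts and filename are illustrative assumptions:

```python
# Sketch of download counting with filtering: exclude requests from
# search-engine robots and from site-maintenance hosts, then count
# requests for one resource. Patterns and the .ppt path are assumptions.
import re

BOT_AGENTS = re.compile(r"googlebot|slurp|crawler|spider", re.IGNORECASE)
MAINTENANCE_HOSTS = {"192.0.2.10"}  # e.g. the developers' own machines

# Apache "combined" format: host, identd, user, [date], "request", status,
# bytes, "referer", "user-agent".
LOG_LINE = re.compile(
    r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST) (\S+)[^"]*" \d+ \S+ "[^"]*" "([^"]*)"'
)

def count_downloads(lines, path):
    n = 0
    for line in lines:
        m = LOG_LINE.match(line)
        if not m:
            continue
        host, url, agent = m.groups()
        if host in MAINTENANCE_HOSTS or BOT_AGENTS.search(agent):
            continue  # maintenance traffic or robot: not a real user
        if url == path:
            n += 1
    return n

sample = [
    '203.0.113.5 - - [14/Jul/2003:10:02:11 +0100] "GET /resources/afl.ppt HTTP/1.0" 200 512000 "-" "Mozilla/4.0"',
    '192.0.2.10 - - [14/Jul/2003:10:03:02 +0100] "GET /resources/afl.ppt HTTP/1.0" 200 512000 "-" "Mozilla/4.0"',
    '198.51.100.7 - - [14/Jul/2003:10:04:40 +0100] "GET /resources/afl.ppt HTTP/1.0" 200 512000 "-" "Googlebot/2.1"',
]

downloads = count_downloads(sample, "/resources/afl.ppt")
```

Of the three sample requests, only the first survives filtering; the second comes from a maintenance host and the third from a robot.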
Aware that the presentation was a key resource, the project team put two further developments in place in July 2003: a web-based (HTML) version of the presentation, relieving users of the need to download the large PowerPoint file; and a new section of the website presenting the practical classroom strategies as the basis of a collaborative activity in which teachers were invited to respond via email with their own experiences of using the strategies in their classrooms, interesting practice and experience being subsequently incorporated into the website.

6 The website is hosted by Webfusion (http://www.webfusion.com), part of the HostEurope group. As part of the hosting package used by the project, the hosts provide full logs of access to the site. These are analysed using ‘Webalizer’, a log-analysis package (also provided), and a number of custom Perl scripts that filter out log entries generated by site maintenance and testing or originating in automated requests from search engines.

In Steuer’s (1998) terms, this represents a move from ‘private’ to ‘public’ access; in those of Slagter, Verwijs and Mulder
this represents a move from ‘private’ to ‘public’ access; in those of Slagter, Verwijs and Mulder
(2001), the development of an ‘information’ resource into the basis of communication and
collaboration. If we consider these developments in the framework informed by Wenger (and set out in
Table 1) then we can see the development of an initially private document repository into a public
‘conversation space’ albeit one – at the present time at least - moderated and mediated by the website
administrators and members of the universities team. A natural progression of this development would
be to present users of the website with an opportunity to submit examples of practice, comments,
annotations and questions directly to a conversation space to which members of the University team
also contribute, replicating their roles as critical friends to schools in the ‘real world’ within the
electronic environment.
Logging Project Progress in Schools
A second facility offered via the website is a dedicated ‘conversation space’ to which school and LEA
coordinators, critical friends and researchers can contribute reports and reflections on the progress of
the project in the settings in which they work. This was initially conceived of as being an online
equivalent and replacement for paper or word-processed logs that would be completed as the project
developed and which would intermittently be collected for analysis by researchers. In addition to
providing data about the progress of the project, these were also designed to allow reflection and
dialogue on the part of the participants. A set of management tools was also built so that any of the
participants could gain a quick overview of progress to date (including which parts of the project
intervention had taken place, what data had been collected and dates of future visits and meetings) and
project managers could obtain an oversight of progress across a whole LEA or across the project as a
whole. A web interface was designed to ensure that key information (respondent, role, date of event or
events described, attendees) was collected along with two areas for free text entry – one specifically
requesting an account of events and the other for reflective comments, annotations and ‘issues arising’
from the events (see Figure 1). We expected from the outset that some log entries would be limited in
both length and scope, simply reporting on arrangements for meetings, delivery of resources and
administration of research instruments, but expected that at other times the logs would provide a forum
in which the school, LEA and University-based team members could reflect, interact and exchange
information. Meetings for School and LEA coordinators within the project have taken place
approximately every six months, and the question of how and what to ‘log’ has featured at several of
these meetings; in addition, critical friends often demonstrate the website and the online logs during early visits to schools, and a ‘practice’ school log, to which new users can contribute postings until they are familiar with the system, has been installed on the site7.
Figure 1: The Log Web Interface
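A log entry of the kind collected through this interface, together with the sort of quick per-school overview the management tools provide, might be represented as follows (a sketch only: the field names are assumptions based on the description above, and the ‘Sunnydale’ sample data borrows the placeholder school from note 7):

```python
# Hedged sketch of a log-entry structure and a simple management overview.
# The project's actual schema is not reproduced here; all names are illustrative.
from dataclasses import dataclass, field
from collections import Counter

@dataclass
class LogEntry:
    school: str
    respondent: str
    role: str            # e.g. "school coordinator", "critical friend", "researcher"
    event_date: str
    attendees: list = field(default_factory=list)
    account: str = ""    # free-text account of events
    reflection: str = "" # reflective comments and 'issues arising'

def entries_per_school(entries):
    """The kind of quick overview of progress the management tools provide."""
    return Counter(e.school for e in entries)

logs = [
    LogEntry("Sunnydale", "R. Giles", "school coordinator", "2003-06-12",
             attendees=["staff"], account="Staff meeting on questioning."),
    LogEntry("Sunnydale", "P. Carmichael", "researcher", "2003-06-19",
             account="Interviews with three teachers."),
]

overview = entries_per_school(logs)
```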
At the time of writing (July 2003), over 600 log entries have been submitted from the 43 project
schools, with the highest number relating to a single school being 48. Wide variations in the number of
logs reflect the different starting dates and different patterns of project development across the project.
Analysis of nearly 500 log entries undertaken at the end of June 2003 revealed that, at that stage in the
project, the logs were still being used more by university-based critical friends and researchers to
record their activities in schools – and, in some cases, documented protracted discussions with schools
in order to negotiate access, dates for visits and the terms under which the project would operate. At
the broadest level of categorisation, the content of log entries was as shown in Table 2.
Table 2: Log Entries – General Description, July 2002-June 2003
Contributor | Purpose of Log Entry | Number (n=484) | %
Critical Friends | Descriptions of and reflections on training, in-service sessions and workshops | 44 | 9.1
Critical Friends | Other critical friend interactions – negotiation of access, arrangement of meetings, offering advice, circulating information | 143 | 29.5
Researchers | Accounts of research visits – data collection, interviews, observation | 170 | 35.1
School Coordinators | Accounts of activity in schools or attendance at meetings by school coordinators | 106 | 21.9
Various | Other – technical advice, general discussions about the conduct of the project, notification of staff changes | 21 | 4.3
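The percentages in Table 2 are simple proportions of the 484 categorised entries. As an illustrative check (not part of the original analysis), they can be recomputed from the raw counts as follows:

```python
# Recompute the Table 2 percentages from the raw category counts (n = 484).
counts = {
    "Critical Friends (training)": 44,
    "Critical Friends (other interactions)": 143,
    "Researchers (research visits)": 170,
    "School Coordinators": 106,
    "Various (other)": 21,
}

total = sum(counts.values())
assert total == 484  # matches the n reported in Table 2

for contributor, n in counts.items():
    # Each percentage is the category count as a share of all entries,
    # rounded to one decimal place as in the table.
    print(f"{contributor}: {n} ({100 * n / total:.1f}%)")
```

Each recomputed figure agrees with the corresponding percentage column in Table 2.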
In some cases, very short log entries were made by users simply selecting items from the ‘drop-down
menus’ shown in Figure 1; researchers characteristically entered these when they had conducted
interviews or other data collection. Other log entries comprised literal accounts of activities – reports of
development activities, meetings and other events; in some cases school coordinators pasted minutes
of meetings held in school, or ‘action plan’ documents, directly into the logs, and critical friends did the
same with email messages and other documents they had collected. A subset of these entries also included
interpretative and reflective comments (see Table 3 for a breakdown of the log entries by level of
elaboration).
5 At this point it is important to stress that we are cautious about ascribing too much significance to ‘raw’ usage statistics, for example, which may be inaccurate measures of meaningful engagement by users, and we recognize that, in some cases, the suite of tools available to projects may be limited and unrelated to project design. The deployment of online ‘discussion boards’, for example, is, in our opinion, commonplace more because Internet Service Providers and hosts provide them than because of any clear rationale for discussion or engagement with users, which may well be better provided using different tools and strategies – including ‘non-electronic’ ones.
7 This is named ‘Sunnydale’ after the school attended by Buffy Summers in the TV series ‘Buffy the Vampire Slayer’. The school name is now used as a ‘placeholder’ school name in sample data across the project. Fans will be pleased to hear that, despite his busy schedule, Rupert Giles acts as Learning how to Learn’s school coordinator at Sunnydale.
Table 3: Log Entries – Level of Elaboration
Contributor | Total | Notification only | Notification and description | Notification, description and reflective comments
Critical Friends (reporting on training) | 44 | 1 (2%) | 19 (44%) | 24 (55%)
Critical Friends (other interactions) | 143 | 1 (<1%) | 97 (68%) | 45 (31%)
Researchers | 170 | 40 (24%) | 79 (46%) | 51 (30%)
School Coordinators | 106 | 2 (2%) | 32 (30%) | 72 (68%)
Various | 21 | 0 (0%) | 12 (57%) | 9 (43%)
Analysis of the nature and level of elaboration of the log entries by contributor revealed a number of
interesting patterns. While they frequently contributed reflective comments, the project researchers
were more likely than other groups to use the logs (where appropriate) simply to flag the fact that a
particular data collection activity had taken place (24% of records). Since these activities included interviews and
questionnaires that were to be analysed later, reflective comments were not always necessary,
unless issues arose immediately about the deployment of research instruments, the conduct of the data
collection activity, or other matters of concern to the researcher or the respondents. For example:
“Several issues emerged at the end of the session related to absentees [and]…Learning support assistants who would be invited to complete as a separate group. It was recognised that they might need terms to be explained, but that it will be important not to do this in a way that leads them to answer in a particular way. … a number of new members of staff starting work at the school next term…” (Log Entry by Researcher, School DWH, 17-06-2002)
“[the teacher] felt extremely uneasy about my filming the lesson. After the experience however she felt more relaxed about the observation and has allowed me to film her on my next visit in the summer term 2003” (Log Entry by Researcher, School MAN, 26-03-2003)
Members of the universities team acting as critical friends almost always offered some kind of
descriptive comment on events in schools – numbers of attendees at meetings, the substance of
discussions, the response of staff to workshop materials. They often made reflective comments about
the school and staff, and about the implications of their experiences for the project as a whole,
although this was not as common as we had expected, perhaps because many of the log entries
early in the project were concerned with management of the project and comprised literal accounts of
phone calls, visits and meetings. Examples that go beyond literal description are often concerned with
the implications of school-based activity, such as in-service sessions and workshops, for the broader
project aims:
“I was acutely aware that this is an infants school and the examples I had to hand were drawn from secondary practice. However, teachers were very ready to offer analogous examples from their own pupils. For example misunderstanding that children experience over the different meanings of words … The two examples of children's work brought by teachers …may be worth capturing as examples for the L2L website and to provide examples from work with very young children in relation to this workshop.” (Log entry by Critical Friend, School KNG, 30-09-2002)
School coordinators responded in a wide variety of ways to the opportunity to contribute to the online
logs. Of the different groups of users, they contributed the highest number of reflective comments, but
these ranged from additional literal accounts of activity in the school, through questions to the
universities team (rather as an alternative to email) and notes and aide-memoires as to ‘where next’, to
reflective and self-critical commentary. On the whole, school coordinators’ comments, while
considered and in some cases very detailed, were less concerned with reflecting on existing practice
and on the impact of the project on its development, and more concerned with the day-to-day
conduct of the project. The range of these entries is illustrated by this small selection.
“School Coordinator still to be decided. Handbook has not been received - could we have one please?”(Log Entry by School Coordinator, School LDW, 12-07-2002)
“Meetings with Eng. Dept have resulted in discussion and review of Dept. Marking policy and, as a result, a revised marking policy, where pupils have greater control over the marking - peer marking implemented and trialed. The decision was made to not grade every piece but to give formative comments instead. Finally, KS3 pupils were given marking policy and key to have displayed and to use in front of books and a target sheet, to record targets set as a result of formative comment, to be kept in the back of the exercise books. Pupils only will be given a National Curriculum grade on a termly basis - this was a compromise after discussion with school.” (Log Entry by School Coordinator, School SEK, 18-10-2002)
[Workshop] was generally very successful. However, some WPS staff still need to be made familiar with the basics of Assessment for Learning. Questionnaires will be done here within the next fortnight, probably after OFSTED … (Log Entry by School Coordinator, School WHE, 06-01-2003)
“I found describing the class quite difficult and wondered why this was. Perhaps I think of them too much as 'levels' of learning due to the focus on assessment. We have had this discussion in the staff room. Are we thinking too much about children as a 1c or 2a? Also there are so many individuals within a class which you are focusing on as you teach. Do I ever think in
'whole class' terms? I'll keep a note of when I do.” (Log Entry by School Coordinator, School KNG, 21-05-2003)
In this final example, the school coordinator continued her log entry with comments on the conduct of
the classroom observation; at the current stage of project development, such reflective contributions are
atypical on the part of school coordinators, and those relating the teachers’ own learning to that of the
children involved in the project are very rare.
“So used to receiving feedback and discussing the lesson - found it strange just to give my opinion of how it went. I wanted to know how it went as far as [the researcher] was concerned and what [they were] focusing on. I wonder if children feel like this if we do not comment on their work.” (Log Entry by School Coordinator, School KNG, 21-05-2003)
The question of whether contributing to online environments necessarily encourages reflective thought
– or even accounts of reflective practice – remains unresolved, and the existing literature presents a
confused picture. Maxwell et al. (2001) analyse the content of postings in a forum designed to support
teacher-researchers distributed over a wide geographical area in rural Australia; while the forum was on
balance judged to be effective in supporting its participants, most of the online interactions were concerned with
project design and management, and although there was some evidence of university-based staff and other
teachers developing ‘person-orientated’ critical friend roles in the online discussion space, there was
‘little evidence of the critical analysis associated with the notion of “critical friend”’
(Maxwell et al., 2001; p.10). In other cases (for example, Hoel and Gudmundsdottir (1999), reporting
on their ‘REFLECT’ project in Norway), where the purpose and parameters of self-criticism and
analysis were, perhaps, more clearly defined, the process of preparing, then reflecting upon and
responding to, writing for online submission seems to have been more effective at encouraging
reflective writing and critical input from mentors.
Some Conclusions and Areas for Future Study
In addition to its original declared intention to aid project communication and data collection, the
continued use of a range of online tools within the project presents opportunities not only to document
their specific affordances but also to relate their use (or the use of sets of tools) to the
different roles of individuals and groups within the project. While there may be questions as to the
reliability of the online logs, their use, and the access they provide to school-based participants’ thinking,
have proved a useful ‘barometer’ – not only of the project’s progress but also of the different ways in
which the roles of those involved in the project are construed.
Bampton and Cowton (2002; paras. 13-16) discuss how conducting research online restricts the
‘register’ of interaction, and lack of visual and auditory cues makes it difficult to gauge the level of
engagement of respondents. At the time of writing, the project researchers are beginning work with
school coordinators and ‘focal teachers’ which will allow the construction of a more complete picture
of activities within project schools. Comparison of this potentially richer data with that presented in
the online logs presents an opportunity to critically evaluate the effectiveness of the latter and to assess
what elements of data are enhanced, diminished or perhaps lost altogether by the use of electronic
media.
We are conscious of Illingworth’s warning in relation to the use of electronic tools to collect data in
research – that “the advantages of speed, 'friendliness' and instantaneous dialogue combined with
research unrestricted by geographical location and transcription costs, masks a number of issues which
affect the research relationship” (2001; para 13.2) – and see this project as an opportunity to document in
detail the impact of working online on data collection, on research relationships and on the long-term
impact of the project on its various stakeholders. Of the research relationships Illingworth alludes to,
perhaps the most critical in the ‘Learning how to Learn’ project is that between members of the
universities team and the school coordinators. Within the project, the role of school coordinator is
perhaps the most problematic and demanding, and we are conscious of differing interpretations (both
from universities team members and from the coordinators themselves) as to what that role involves.
On the one hand, they have a role in managing and promoting the project within their schools; on the
other they are members (albeit peripheral ones, in many cases) of a broader research community rooted
in the Universities. It is they as much as the university-based critical friends who have to make
decisions as to whether the deployment of the project in their particular school should be directed by
the school management or derived from a consensus amongst staff; and the degree to which supporting
research evidence should inform staff in the development of classroom practice. The online tools
offered by the project will allow us to engage over an extended period with a sample of school
managers involved in the facilitation of educational innovation, and will aid us in constructing images
of agents of change in a variety of educational settings.
We foresee opportunities to use data collected from the various electronic tools, together with richer
qualitative accounts, to illuminate differences between project stakeholders in their perceptions of roles
and structures, their conceptualisations of project processes and their attitudes towards educational
innovation more generally. We would also hope to establish firmer relationships between these aspects and
the use of the available network tools, and to offer advice as to how educational change and capacity
building can best be supported and enhanced through their use. In this respect, we will also seek to
clarify the notions of ‘community of practice’ and ‘task group’ as outlined earlier in this paper, and to
explore the extent to which participation in electronic ‘communities’ and ‘task groups’ relates to wider
‘real-world’ activity. We are already very cautious about suggesting that there is any necessary
relationship between physical location or project role and participation in ‘community’ rather than
‘tasks’. To represent the school coordinators and other teachers involved in the project as being
management-directed and lacking interest in the broader issues surrounding the project is to do them a
disservice; clearly some school coordinators are fully engaged with the project and would be only too
happy to take advantage of additional ‘community-building’ opportunities, and we have had to adjust
our own provision for this in the light of their requests and emerging patterns in project data.
There is increasing recognition that individuals respond to demands to fulfil multiple and often
conflicting organizational roles, and develop imaginative ways of reconciling interest in and loyalty to
broader communities with more immediate demands of workplace tasks. Zabusky and Barley (1997)
describe how membership of more than one community can cause tension as individuals seek to reconcile the
demands and norms of different groups and Teigland (2000) describes how collaborative communities
of computer programmers share problems and innovative solutions despite their being employed in
organizations in direct competition with each other and in which ‘on-time delivery’ is seen as being
more important than creativity. What we would hope is that the range of electronic tools we provide
not only fosters the widest range of interactions and development but also allows better understanding
of the processes at work when teachers in schools interact with and participate in educational research
and development.
Bibliography
Bampton, R. and Cowton, C.J. (2002) ‘The E-Interview’ Forum: Qualitative Sozialforschung: Special Edition on Using Technology in the Qualitative Research Process 3(2), May 2002. Online at: http://www.qualitative-research.net/fqs-texte/2-02/2-02bamptoncowton-e.htm
Black, P. and Wiliam, D. (1998) ‘Assessment and Classroom Learning’ Assessment in Education 5(1) 7-71.
Carmichael, P. (2002) ‘Extensible Markup Language and Qualitative Data Analysis’ Forum: Qualitative Sozialforschung: Special Edition on Using Technology in the Qualitative Research Process 3(2), May 2002. Online at: http://www.qualitative-research.net/fqs-texte/2-02/2-02carmichael-e.htm
Carmichael, P. (2002) ‘Learning how to Learn: Using Internet Technologies to Support Distributed Research’ Paper presented at ESRC-TLRP Conference, Cambridge, 23-24 September 2002. Online at: http://www.learntolearn.ac.uk/home/009_public_papers/carmichael_tlrp_2002.doc
Clark, H. (1996) Using Language. Cambridge: Cambridge University Press.
Ford, K., Oberski, I. and Higgins, S. (2000) ‘Computer-aided qualitative analysis of interview data: Some Recommendations for Collaborative Working’ The Qualitative Report 4(3/4), March 2000. Online at: http://www.nova.edu/ssss/QR/QR4-3/oberski.html
Hine, C. (2001) ‘Review of: Mann, C. and Stewart, F. Internet Communication and Qualitative Research: A Handbook for Researching Online’ Sociological Research Online 6(2). Online at: http://www.socresonline.org.uk/6/2/mann.html
Hoel, T. and Gudmundsdottir, S. (1999) ‘The REFLECT Project in Norway: interactive pedagogy using email’ Journal of Information Technology for Teacher Education 8(1), pp. 89-110.
Illingworth, N. (2001) 'The Internet Matters: Exploring the Use of the Internet as a Research Tool' Sociological Research Online 6(2), Online at: http://www.socresonline.org.uk/6/2/illingworth.html
James, M., Black, P. and McCormick, R. (2003) ‘Deepening capacity through innovative research design: researching learning how to learn in classrooms, schools and networks’ Paper presented at the 2003 Annual Meeting of the American Educational Research Association, Chicago, in the BERA-1 Symposium, ‘The UK’s Teaching and Learning Research Programme: responding to challenges for Educational Research’.
Mann, C. and Stewart, F. (2000) Internet Communication and Qualitative Research: A Handbook for Researching Online (Sage Publications: London)
Maxwell, T.W., Reid, J., McLoughlin, C., Clarke, C. and Nicholls, R. (2001) ‘Online support for Action Research in a Teacher Education Internship in rural Australia’ paper presented at the Society for the Provision of Education in Rural Australia Annual Conference, Wagga Wagga, 8-11 July, 2001. Online at: fehps.une.edu.au/f/s/edu/tMaxwell/Spera_paper_Final.pdf
Muhr, T. (2000) ‘Increasing the Reusability of Qualitative Data with XML’ Forum: Qualitative Sozialforschung 1(3), December 2000. Online at: http://www.qualitative-research.net/fqs-texte/3-00/3-00muhr-e.htm
Slagter, R., C. Verwijs & I. Mulder (2001) Survey of community tools. Enschede: Telematica Instituut, The Netherlands. Online at: http://cscw.telin.nl/communitytools
Steuer, J. (1998) ‘Tilling the Soil: tools for building a web community’ Web Techniques 3(1) January 1998; 44-49.
Carmichael, Procter and Honour: Using Network Tools to Enable a Distributed Research Project: Page 19
Swaak, J., Verwijs, C. & Mulder, I. (2000) Task groups and communities compared: Can results from task groups be transferred to communities? Enschede: Telematica Instituut, The Netherlands. Online at: https://doc.telin.nl/dscgi/ds.py/Get/File-10580/
Verwijs, C., Mulder, I., Slagter, R., & Moelaert, F. (2002) ‘Leveraging communities with new technologies’ In J. H. E. Andriessen, M. Soekijad, & H. J. Keasberry (Eds.) Support for knowledge sharing in communities pp. 57-70. (Delft, The Netherlands: Delft University Press)
Wenger, E. (1999) Communities of Practice: Learning, Meaning, and Identity. New York: Cambridge University Press.
Wenger, E. (2001). Supporting communities of practice: A survey of community-oriented technologies. Online at: http://www.ewenger.com/tech
Wenger, E. McDermott, R. and Snyder, W. (2002) Cultivating Communities of Practice (Harvard Business School Press)
Zabusky, S.E. and Barley, S.R. (1997) ‘You can’t be a stone if you’re cement: reevaluating the emic identities of scientists in organisations’ In: L.L. Cummings and B.M. Staw (eds.) Research in Organizational Behaviour 19, pp. 361-404. (Greenwich: JAI).