Learning Analytics:
Concepts, methods, tools, achievements, research directions and perspectives
[Synthesis and critical analysis of a new interdisciplinary field]
Angelique Dimitracopoulou
Learning Technology and Educational Engineering Lab
University of the Aegean
Master Program: NTCL, D.CIS, Technological University of Cyprus, 19 March 2015
LEARNING TECHNOLOGY AND EDUCATIONAL ENGINEERING LABORATORY www.LTEE.gr/adimitr 2
Field definition, field sources, dimensions, purposes
Ethical and legal issues
The field's main concepts
The achievements of the L.A. field: examples
- Developments from the Academic Analytics direction
- Developments from the reflection-oriented direction
Learning Analytics debate and critical issues
Learning Analytics perspectives
Analytics definition: Analytics is the discovery and communication of meaningful patterns in data. Especially valuable in areas rich with recorded information, analytics relies on the simultaneous application of statistics, computer programming and operations research to quantify performance. Analytics often favors data visualization to communicate insight. [Wikipedia]

Analytics is a term used in business and science to refer to computational support for capturing digital data to help inform decision making. [UNESCO LA Policy Briefs, 2012]
Learning Analytics Definition (1/5)
Learning Analytics Definition v.1: L.A. refers to the interpretation of a wide range of data produced by, and gathered on behalf of, students in order to assess academic progress, predict future performance and spot potential issues. The goal of L.A. is to enable teachers and schools to tailor educational opportunities to each student's level of need and ability in close-to-real time. [NMC Horizon H.E. Report, 2012 & EDUCAUSE]
Learning Analytics Definition (2/5)
Learning Analytics Definition v.2: L.A. is an educational application of web analytics, a science commonly used by businesses to analyse commercial activities, identify spending trends, and predict consumer behavior. [Horizon Report, Schools 2014]
Learning Analytics Definition (3/5)
Learning Analytics Definition v.3: The field of L.A. focuses on tracking learning activities and the context in which these activities occur, to promote awareness and reflection through algorithmic analysis (as in educational data mining) or information visualisation. [E. Duval et al., 2014]
Learning Analytics Definition (4/5)
Learning Analytics Definition v.4. L.A. is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs. [SOLAR 2010, LAK 2011]
Learning Analytics Definition (5/5)
Learning Analytics: a new field in its infancy. Emerging => decade of 2000; exploding => 2010
SoLAR: Society for Learning Analytics Research (www.solaresearch.org)
Annual international conference => Learning Analytics and Knowledge (LAK), 1st LAK 2011 -> 5th LAK 2015
Journal special issues
Reports of institutions { UNESCO, EDUCAUSE, NMC Horizon, EC, … }
Master programs and courses: North Carolina State University (http://analytics.ncsu.edu), Northwestern (http://analytics.northwestern.edu)
Learning Analytics Discipline State
A field emerging during the last decade from various and different research problematics / questions / interests:
[1] The Interaction Analysis field (exploring social & collaborative learning; supporting students in reflecting on their learning process; supporting teachers in orchestrating social learning activities) [LTEE lab, 2002->]
[2] Web analytics (and also how to organise ourselves)
[3] Distance-learning corporate institutes [business-oriented preoccupations] (the data explosion and the capture of a wide range of data & trails; the rise of online education based on VLEs, CMSs, LMSs, MISs)
[4] MOOCs [2010] (statistics on hundreds of thousands or millions of students)
Learning Analytics Discipline State (1/2)
Learning Analytics Disciplines Sources
LEARNING ANALYTICS sits at the intersection of two families of disciplines:

LEARNING SCIENCES: Cognitive Sciences, Educational Psychology, Pedagogy, Didactics [Science Education, Mathematics Education, etc.], Sociology, Educational Technology, Economics of Education, …

ANALYTICS: Statistics, Business Intelligence, Operational Research, Data Mining, Information Visualization, Artificial Intelligence, Bibliometrics, Scientometrics, Social Network Analysis

=> L.A. is an interdisciplinary field
Cooper, A. (2012). A Brief History of Analytics. Vol. 1, No. 9, CETIS Analytics Series (JISC CETIS)
Learning Analytics Directions

LEARNING ANALYTICS
• Academic Analytics; Governance Analytics
• Automated Assessment; Recommender Systems; Predictive Models
• Reflection; Activity Analytics; Assessment Analytics
• Teaching Orchestration; Research on Learning
• Embedded analytics / Extracted analytics
Learning Analytics Directions & Purposes

LEARNING ANALYTICS
• Academic Analytics; Governance Analytics => decision making, organisational management, publicity
• Predictive Models => identification of at-risk students
• Automated Assessment; Recommender Systems => content units / resources recommendations
• Self-monitoring; Activity Analytics => adaptation, personalisation, intervention
• Teaching Orchestration; Research on Learning
• Embedded analytics / Extracted analytics
Learning Analytics & Academic Analytics: levels of analysis

Micro-level: regarding individuals & groups of students and teachers. Benefits:
• identify at-risk students and provide interventions
• provide learners with insight into their learning outputs, give recommendations for improvements, etc.

Meso-level: operates at institutional level (integrate data silos into enterprise warehouses, optimize workflows, etc.). Benefits:
• improve administrative decision making and organisational resource allocation
• support holistic decision making by viewing the impact of different variables
• help leaders determine the hard (e.g. patents, research) and soft (e.g. reputation, profile, quality of teaching) value generated

Macro-level of academic analysis: enables cross-institutional analytics (e.g. through maturity surveys of current institutional practices, standardised assessment data, etc.)

Scale of analysis | Objective | Who benefits
Institutional | learner profiles, performance, academic knowledge flow | administrators, funders, marketing
Regional | comparison between systems | funders, administration
National and international | | national governments, education authorities
Who is interested in Learning Analytics?
Learning Analytics: L.A. tools users
LEARNING ANALYTICS STAKEHOLDERS: institutional leaders, policy makers, organisational administrators, technologists, researchers in education, educational practitioners, instructional designers, product vendors, learners
Learning Analytics Concepts
Data, Traces, Trails, Dashboard, Indicator, Analytics Methods, …
DATA & TRACES

Examples: answers to queries and questionnaires; access to learning resources; posts of assignments; contributions to shared documents or wikis; actions in a learning environment; votes in lecture response systems; Twitter posts; Facebook-related actions; time on pages (in every kind of resource); posts in discussion fora; replies to posts; websites' and portals' visits; logins to learning management systems; ID data on registrations; location of the device used to access a course; sounds around the digital device; bio-physical data

Learning Analytics Concepts: data categories
Data are categorised as explicit, tacit or unintended, and as digital or physical.
A trace (what: event, data) has to be recorded together with the information:
who (acts, leaves the trace); when (timestamp of the trace/trail); where (which tool, which learning environment, etc.: part of the context of the trace)
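The who / when / where requirement amounts to a small record schema. A minimal sketch in Python (the class and field names are hypothetical, not taken from any particular L.A. platform):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Trace:
    """One recorded learning-activity event (hypothetical schema)."""
    what: str       # the event/data, e.g. "post_message"
    who: str        # the actor who leaves the trace
    when: datetime  # timestamp of the trace/trail
    where: str      # tool / learning environment (context of the trace)
    payload: dict = field(default_factory=dict)  # any extra event data

trace = Trace(what="post_message",
              who="student_42",
              when=datetime(2015, 3, 19, 10, 30, tzinfo=timezone.utc),
              where="discussion_forum")
```

Keeping the context (`where`) on every trace is what later allows indicators to be computed per tool or per learning environment.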
MAIN DATA SOURCES__________________
Artifacts produced individually and/or by a group
Exercises - test results
Resource use
Reports (+ reports on dispositions, motivation, emotions)
Social interaction
Time spent
Bio-physical data
Learning Analytics Concepts: Lifecycle

Learning Analytics lifecycle: main phases
• Data capture (explicit / tacit; digital / physical; …)
• Data storage
• Data selection
• Data analysis
• Data operationalisation
• L.A. output communication (e.g. via an L.A. dashboard)
• Interpretation / action
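Read as a processing pipeline, the phases chain naturally. A toy sketch (all function names and the dict-based storage are invented for illustration, not from an actual system):

```python
# Minimal sketch of the L.A. lifecycle as a chain of phase functions.

def capture(events):            # data capture (explicit/tacit, digital/physical)
    return list(events)

def store(traces):              # data storage
    return {"traces": traces}

def select(db, actor=None):     # data selection (e.g. one learner's traces)
    return [t for t in db["traces"] if actor is None or t["who"] == actor]

def analyse(traces):            # data analysis -> indicator values
    return {"n_events": len(traces)}

def communicate(indicators):    # output communication (e.g. a dashboard line)
    return f"dashboard: {indicators['n_events']} events"

events = [{"who": "s1", "what": "post"}, {"who": "s2", "what": "read"}]
view = communicate(analyse(select(store(capture(events)), actor="s1")))
print(view)  # dashboard: 1 events
```

The final interpretation/action phase stays with the human reader of the dashboard, which is why it closes the cycle rather than appearing as code here.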
Learning Analytics Methods

Analytic methods (main categories):
• Actions/activities analysis (quantitative)
• Content analysis (of students' products)
• Discourse analysis (meaningful data on language qualities)
• Social learning analytics (exploring the role of social interaction in learning)
• Disposition analytics (data regarding students' dispositions toward their own learning, and the relationship of these to their learning)
• Statistics, metrics and visualisation methods
• Predictive algorithms
Face2face settings: Awareness/reflection

(1) A tool providing online feedback to speakers regarding the agreement or disagreement of the audience. The system detects individuals' attitudes in the audience by tracking head nods, head shakes and voice (using the camera and microphone of their desktop PCs). [Yu Y-C, You S-CD, Tsai D-R, 2012]
(2) The Backstage tool engages students during lectures by visualising their Twitter activity on the lecturer's slide presentation. Tweets are compared among students (average activity; positive ratings received on tweets, from green over yellow to red in decreasing order). [Pohl A, Bry F, Schwarz J, Gottstein M, 2012]
L.A. Examples: Awareness/Reflection
L.A. Tools supporting f2f lectures
L.A. tools supporting f2f lectures: BackStage L.A. tool functions & indicators
• Users' presence (online, offline, busy)
• Followership between students
• Displaying degree of involvement via the Activity Aggregator for students
• Reputation of users
Workspace awareness information:
1) categorical distribution of posts as an indicator of the prevailing backchannel discourse
2) explicit notification of activities related to a user's posts
3) rating and ranking of posts as a means to estimate relevance
4) annotation of slides
5) filter boxes to increase the lecturer's awareness of the backchannel
LEARNING TECHNOLOGY AND EDUCATIONAL ENGINEERING LABORATORY www.LTEE.gr/adimitr
A cross-platform framework for educational applications on phones, tablets and tablet PCs. Provides functionality to create pen- and touch-enabled applications for active, collaborative learning. [Wade, Fagen, Sam, Kamin, 2012]
Face2face settings Awareness/reflection
L.A. Examples: Awareness/Reflection
L.A. Tools supporting f2f Lectures->Workshop
Functionalities: • shared drawing on slides • makes it easier to point to places in code, make suggestions, etc. • allows the presenter to explain a program using pictures • independent navigation by students
[Gutiérrez Rojas I., Crespo García R.M. (2012)] Class-On: this tool integrates solutions to the efficiency problems detected in the orchestration of sessions:
• providing awareness mechanisms for the teacher, improving the orchestration, distribution of feedback and support, and efficiency of the session;
• providing awareness mechanisms for the students, for monitoring their progress and supporting self-reflection;
• alleviating the labor of the teacher by facilitating the identification of trivial or general questions;
• collecting evidence for summative assessment.
Face2face settings: Awareness/reflection
L.A. Examples: Awareness/Reflection
L.A. Tools supporting f2f lectures -> lab
Face2face settings
Awareness/reflection & Teachers Orchestration
ModellingSpace L.A. tools: teachers & lab model. Collocated students use the L.A. tools on the fly, and afterwards, during synchronous collaborative modelling activities.
[LTEE lab: Petrou, Fessakis, Dimitracopoulou, 2004; Petrou PhD, 2005]
ModellingSpace: students' settings in the lab
L.A. Examples: Awareness/Reflection
“Activity Analysis Quantitative overview”
[00:04:53][Kyriakos] You must go to page 8 and do what it says. [00:06:23][Kyriakos] What's going on? Why are you doing nothing? [00:07:18][Rodoula] I can't put the relationship, I would like some guidance. If you want, ask for the key and do it. [00:07:26][Kyriakos] OK. [KYRIAKOS TOOK THE KEY] [00:07:38][Teacher] Kyriako please, don't pressure Rodoula!
“Annotated Playback”
“COPRET” “CAF: Collaborative Activity Function”
Tabletop concept mapping (for 4 people), touch screen; action analysis and sound/verbal contributions. [Cuendet S., Dillenbourg P. (2013), CSCL Conference]
• Participation Radar: provides a mirror of learners' actions, both verbal and physical.
• Contribution Chart: gives an indication of the extent of each learner's contribution to the group artifact.
• Evolution Diagram: depicts the building process of the artifact, relating it to a master artifact and to each participant's contribution.
Blended Learning or Online Learning Awareness/reflection
L.A. Examples: Awareness/Reflection
SNA of a learning community: awareness and self/group assessment of e-mail interactions, in the frame of an e-learning educational program for in-service teacher education on ICTs (community = 80 persons). [LTEE: Hlapanis & Dimitracopoulou, JTPE 2007]
I. Social Network Analysis (SNA) visualisation
Nodes: each participant. Links: interactions among the participants. Weight of a link: the more intense the interaction, the closer the nodes are to each other.

II. Interaction analysis indicators:
• Density of network: registered interactions / the maximum possible interactivity
• Concentration: 0% = all communicate with all the others; 100% = all communicate with only one person
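Both indicators can be computed directly from a list of directed interactions. The formulas below are a plain reading of the two bracketed definitions above, not the exact metrics of the cited study:

```python
from collections import Counter

def density(interactions, n):
    """Registered interacting pairs over the maximum possible n*(n-1)."""
    pairs = {(a, b) for a, b in interactions if a != b}
    return len(pairs) / (n * (n - 1))

def concentration(interactions):
    """Share of interactions aimed at the single most-contacted member:
    1.0 when everyone communicates with only one person."""
    targets = Counter(b for _, b in interactions)
    return max(targets.values()) / len(interactions)

# 4 participants; everyone e-mails participant "d" only
mails = [("a", "d"), ("b", "d"), ("c", "d")]
print(density(mails, 4))     # 3 observed pairs / 12 possible = 0.25
print(concentration(mails))  # 1.0: maximally concentrated
```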
[SNA diagrams of Group Maths 1 at the 2nd week vs the 4th week: isolated communication groups and linear (flat) communication early on]
Via the SNA, the participant can recognise: (a) his/her 'position' within the group, (b) the situation of the group as a whole (without paying attention to the exact values of the underlying indicators)
SNA of a learning community: forum analysis [LTEE: Hlapanis & Dimitracopoulou, JTPE 2007]
• SNA for forum moderators
• SNA for the researcher (the community's main coordinator), providing information on when a community is created:
- no isolated groups exist; communication is not flat
- many different communication cells exist
- a large communication cell exists, embodying nearly all members
SNAPP forum analysis tool (SNA based)
SNAPP (Social Network Analysis & Pedagogical Practices): a forum analysis tool for the Sakai Collaborative Learning Environment (v2.8 & v2.9), by Marist College. It provides an opportunity for teachers to rapidly identify patterns of user behavior at any stage of course progression. SNAPP extracts all user interactions from various commercial and open-source learning management systems (LMS) such as Blackboard (including the former WebCT), Moodle and now Sakai. SNAPP is compatible with both Mac and PC and operates in Chrome and Firefox.

The nodes correspond to the students in the discussion and the edges in the graph represent the interactions between them. Node scaling uses the PageRank algorithm to denote the level of a student's participation within the forum.
The SNA diagram of students' forum-based online discussions can:
• identify disconnected (at-risk) students;
• identify key information brokers within your class;
• identify potentially high- and low-performing students, so you can plan interventions before you even mark their work;
• indicate the extent to which a learning community is developing in your class;
• provide a "before and after" snapshot of what kinds of interactions happened before and after you intervened or changed your learning activity design (useful to see what effect your changes have had on student interactions, and for demonstrating reflective teaching practice, e.g. through a teaching portfolio);
• allow your students to benchmark their performance without the need for marking.
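The node-scaling idea (PageRank over the reply graph) and the detection of disconnected students can be sketched generically; this is a plain power-iteration PageRank over a toy reply list, not SNAPP's actual implementation:

```python
def pagerank(edges, nodes, d=0.85, iters=50):
    """Plain power-iteration PageRank over a directed reply graph."""
    out = {n: [] for n in nodes}
    for a, b in edges:
        out[a].append(b)
    pr = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iters):
        nxt = {n: (1 - d) / len(nodes) for n in nodes}
        for a in nodes:
            if out[a]:
                share = d * pr[a] / len(out[a])
                for b in out[a]:
                    nxt[b] += share
            else:  # dangling node: spread its mass evenly
                for b in nodes:
                    nxt[b] += d * pr[a] / len(nodes)
        pr = nxt
    return pr

students = ["ann", "bob", "cat", "dan"]
replies = [("ann", "bob"), ("cat", "bob"), ("bob", "ann")]  # dan never interacts
isolated = [s for s in students if all(s not in e for e in replies)]
scores = pagerank(replies, students)
```

A student who never appears in any reply edge surfaces immediately as disconnected, which is exactly the at-risk signal SNAPP visualises.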
An independent IA tool for all users, incorporating a variety of indicators.
DIAS (Discussion Interaction Analysis System): for asynchronous discussion fora [LTEE: Bratitsis & Dimitracopoulou, 2007]
Indicators selection interface (group IA): the DIAS IA tool user can select sets of indicators (~65 indicators) for individual students, for groups, and for teachers.
A. Indicators addressed to individuals
(a) User-type messages per week
(b) Tree structure
(c) Messages posted by user in the community
(d) Message length per user
[Forum list: 2: 1st thematic discussion; 3-10: working groups 1-8; 11: ICTs in pre-primary education; 12: PC lab spaces; 13: technical management; 14: school models; 15: innovation initiatives in schools]
A. Indicators addressed to individuals
(a) Activity Indicator
(b) Contribution Indicator
(c) Classification Indicator
B. Indicators addressed to participants groups
(d) SNA answer indicator
(e) SNA read indicator
Activity indicator: X-axis: number of messages written by a member; Y-axis: number of messages read by that member; the circle grows with the number of types of messages.
Contribution indicator: the distance of the circle from the circumference is proportional to the member's activity (total number of messages); discussion initiations are subsidised; the circle size grows with the number of types of messages.
Classification indicator: X-axis: the amount of contribution (messages written as a percentage of the total number of messages); Y-axis: the amount of interaction (messages read as a percentage of the available number of messages).
SNA Answer indicator: messages written by user B that user A has answered.
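The classification indicator's two axes are simple percentages; a minimal sketch with invented counts (not data from the DIAS study):

```python
def classification_indicator(written, total_msgs, read, available):
    """(x, y) position on DIAS-style axes: contribution % and interaction %."""
    x = 100.0 * written / total_msgs  # messages written / all messages
    y = 100.0 * read / available      # messages read / messages available
    return x, y

x, y = classification_indicator(written=12, total_msgs=200,
                                read=150, available=188)
# x = 6.0  -> modest contributor; y ~ 79.8 -> heavy reader
```

Plotting each member at (x, y) makes over-writers, over-readers and balanced members visible at a glance.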
SNA Reads indicator: messages written by user B that user A has read (dissemination of messages).
Interpretation example. By inspecting:
• the Classification indicator, the moderator may see how active member X is, i.e. whether X shows extreme or balanced behavior [arrogant member: writes many messages but does not read other messages; passive member: reads many messages but does not write enough];
• the SNA Answers indicator, whether X is isolated or holds a central position. If X seems active in message writing (classification indicator), SNA shows whether X interacts with others or not, by posting answers and receiving answers. If not: low argumentative value of X's messages, off-topic writing, arrogant behaviour, lack of knowledge regarding the topic, etc.;
• the SNA Reads indicator: if X holds a central position here but appears isolated in SNA Answers, then X writes messages which are read by many others but not answered: X could be a discussion coordinator, or could face participation problems.
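The interpretation logic above is effectively a small rule set. A toy sketch (the thresholds are invented for illustration; real moderation would tune them per community):

```python
def member_profile(written, read, answers_received):
    """Toy rule set mirroring the interpretation example; thresholds invented."""
    if written > 20 and read < 5:
        return "possibly arrogant (writes a lot, reads little)"
    if read > 20 and written < 5:
        return "passive (reads a lot, writes little)"
    if written > 20 and answers_received == 0:
        return "active but isolated (posts are not answered)"
    return "balanced"

print(member_profile(written=30, read=2, answers_received=4))
# possibly arrogant (writes a lot, reads little)
```

Note that DIAS itself deliberately leaves such interpretation to the human moderator; the rules here only illustrate the reasoning pattern.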
(f) User Time Read indicator: the vertices are colored according to the time user X read the message; messages unread by X are colored black; messages written by X are represented by rectangles. => Shows whether member X is active mostly in earlier or later phases of the discussion activity.
DIAS Interaction Analysis Indicators are classified along three dimensions:
• Interpretation level: high level / low level
• Point of view: individual perspective; group (differentiated); group (undifferentiated); community perspective
• Assistance level: awareness / assessment
DIAS L.A. tool: central design principles
• Take into account the totality of the members involved in a 'learning activity', as well as the cognitive systems they may form: students as individuals (in various roles) but also as members of one or more groups or even communities; teachers in different roles according to the category of learning activity; etc.
• Provide a rich range of interaction analysis indicators: the analysis of interactions in terms of indicators offers different points of view on the learning activity process, its quality, and its product. Different indicators may be more appropriate during different time periods of learning, for different learning tasks, and for different profiles of forum participants.
• Create a flexible, customizable and interoperable system: fora are tools that can be used in a number of contexts and for a variety of discussion-based learning activities. Furthermore, forum participants take various roles and have different needs according to their discussion subjects, the available time, etc. It is therefore important to create customizable, flexible and interoperable systems.
Goal-oriented visualisation dashboard [Santos, Govaerts, Verbert, Duval (2011, LAK Conference)]
The time spent with different tools, websites and Eclipse IDE documents is tracked by RescueTime and the Rabbit Eclipse plug-in. Students can filter by different criteria, such as course goals and dates.
LEARNING TECHNOLOGY AND EDUCATIONAL ENGINEERING LABORATORY www.LTEE.gr/adimitr
Blended Learning or Online Learning Awareness/reflection
L.A. Examples: Awareness/Reflection
Goal oriented Visualisation Dashboard [ Santos, Govaerts, Verbert, Duval (2011, LAK Conf)
The time spent with different tools, websites and Eclipse IDE documents are tracked by RescueTime4 and the Rabbit Eclipse plug-in5. Students can filter by different criteria, such as course goals and dates.
Initial question: how can we support individual participants in learning activities, or groups of them, in taking control of their own activity when acting in highly interactive environments?
Awareness/reflection
L.A. Examples: Awareness/Reflection
Main reason: the need to support participants at a metacognitive level.
In learning environments (for stand-alone and social use):
Working in technology-based learning environments is more complex than working with paper and pencil: it is difficult to be aware of «what we have done».
Working in social or CSCL systems is much more complex than working individually.
Students cannot create an ‘image’ of their own activity, or of that of other students/collaborators (as individuals, as a group, or as a whole community). For teachers, it is very hard to understand and take into account what happens in stand-alone systems, or to manage activities in collaborative environments, due to the very complex interactions that occur.
In working groups and scientific communities, similar problems arise: a need to support awareness of the actions and interactions of different actors or groups, intervening with different roles and needs.
L.A. Examples: Awareness/Reflection
Theoretical Considerations
L.A. Awareness/Reflection Direction
Theoretical Consideration C1:
A consideration of all agents and cognitive systems involved in social learning settings
[Diagram: nested cognitive systems, from the Individual through Group(s) and Class of Groups to Community and Society, with Teacher(s) in different roles]
Theoretical Considerations
L.A. Awareness/Reflection Direction
Theoretical Consideration C2:
A consideration of the control of the learning activity process as distributed among all the agents:
Learners’ cognitive systems: self-regulation (via metacognitive support)
Teacher: supervision and/or evaluation, and also self-assessment
System: advisors (‘pedagogical agents’) and/or environment adaptations activated by the system
a few words… about Indicators
L.A. Awareness/Reflection Direction
[Diagram: a space of indicators organised along three questions]
What? (nature): Affective; Awareness; Estimation-Judgment; Evaluation; Cognitive: processes/mode; Cognitive: product/content; Social [collaboration]
Whom? (point of view): the individual (student/teacher); the individuals of a group; the group; the community
Level of assistance?
Examples: ‘Discussion depth’ (Gerrosa, 2005); ‘Initiative’ (Barros, 2000)
Podgorelec V., Kuhar S. (2011)
Blended & Online Learning
Assessment in LMS like systems
L.A. Examples: Assessment/Recommenders
Moodle Learning Analytics Features / Moodle Dashboard
Blended & Online Learning
Assessment, Alarm systems & Recommenders Systems
L.A. Examples: Assessment/Recommenders
LOCO-Analyst for Learning Management Systems (LMS)
[Ali, Hatala, Gasevic & Jovanovic, 2012]
LOCO-Analyst aims “at helping instructors rethink the quality of learning content and learning design of the course.”
o It is a generic feedback-generation tool that can be connected to LMS systems
o The feedback it provides is based on analyses of the context data collected in the learning environment
LOCO-Analyst informs instructors about:
 The activities the learners performed and/or participated in
 The usage of the learning content
 The peculiarities of the interactions among the members of the course
The use of learner analytics through the application of Course Signals to difficult courses has shown great promise with regard to the success of first and second year students, as well as their overall retention to the University. To date, over 23,000 students across 100 courses have been impacted by Course Signals and over 140 instructors have utilized the system.
….Increasing “Student Success”
Blended & Online Learning
Assessment, Alarm systems & Recommenders Systems
Course-Signals // Purdue University
L.A. Examples: Assessment/Recommenders
Key Takeaways Purdue Univ. created Signals, a system that gives students early and frequent performance notifications and helps faculty members steer students toward additional campus resources as needed. The student success algorithm considers not only demographic and performance data but also data on student effort and engagement. Results thus far show that students who have engaged with Course Signals have higher average grades and seek out help resources at a higher rate than other students.
Course Signals detects early warning signs and provides interventions to students who may not be performing to the best of their abilities before they reach a critical point. The Course Signals program works best when instructors want to provide individual feedback but don’t have a good way to do so, especially in large enrollment classes. With Course Signals, students are provided with customized feedback that can help them make informed decisions about their academic performance early in the semester
….Increasing “Student Success”
Blended & Online Learning
Assessment, Alarm systems & Recommenders Systems
Course-Signals // Purdue University
Course Signals is a student success system built on a predictive analytic model (see figure) that combines elements from the academic technologies and the student information system. The model is behaviorally based and considers student performance (grades in the course so far), effort (time on task), and characteristics (student profile). The algorithm runs on real-time data and produces a risk indicator for each student. Using this risk indicator (a red, yellow, or green traffic signal) as a formative guide, faculty members can give students in their courses meaningful feedback, suggesting behaviors that students might wish to change to improve their chances of success, thus placing the emphasis squarely on action.
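Purdue’s actual algorithm is not public; the following is only a toy sketch of the idea described above — combining performance, effort and student characteristics into a red/yellow/green signal. The weights and thresholds are invented for illustration:

```python
def risk_signal(grade_pct, effort_pct, prior_gpa):
    """Toy traffic-light indicator in the spirit of Course Signals.

    grade_pct:  course performance so far, 0-100
    effort_pct: time-on-task relative to peers, 0-100
    prior_gpa:  student profile proxy, 0-4 scale
    Weights and cutoffs below are illustrative only, not Purdue's.
    """
    score = 0.5 * grade_pct + 0.3 * effort_pct + 0.2 * (prior_gpa / 4 * 100)
    if score >= 70:
        return "green"   # on track
    if score >= 50:
        return "yellow"  # some warning signs
    return "red"         # early intervention warranted
```

In a real deployment the weights would be fitted per course and the signal would trigger the kinds of instructor feedback described above.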
L.A. Examples: Assessment/Recommenders
Blended & Online Learning
Assessment, Alarm systems & Recommenders Systems
OLI, Carnegie Mellon University, Open Learning Initiative
L.A. Examples: Assessment/Recommenders
Blended & Online Learning
Assessment, Alarm systems & Recommenders Systems
OLI, Carnegie Mellon University, Open Learning Initiative
L.A. Examples: Assessment/Recommenders
Screenshot of the Learning Dashboard main page for Module 7, “Statically Equivalent Loads”. Based on an analysis of students’ performance on activities targeting a given learning objective, the system computes and displays graphically the Estimated Learning Levels for each of the learning objectives of the module. In particular, the proportion of the class at each learning level (red=low, yellow=moderate, green=high) corresponds to the respective portion of the bar, with gray representing students who have completed too few activities to enable a prediction. To obtain more information about students’ progress and performance on a learning objective, the instructor clicks on that bar for a more detailed view. Further clicking on the dots displays the names of the students in each of the above categories.
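A minimal sketch of the aggregation such a bar encodes, assuming each student’s estimated level for one objective has already been computed (the band names and function are hypothetical, not the OLI implementation):

```python
def level_proportions(levels):
    """Given each student's estimated learning level for one objective
    ('high', 'moderate', 'low', or None when too few activities were
    completed to predict), return the proportion of the class in each
    band, i.e. the relative widths drawn in the dashboard bar."""
    n = len(levels)
    bands = ["high", "moderate", "low", None]
    return {b: sum(1 for x in levels if x == b) / n for b in bands}

# Hypothetical class of five students for one learning objective.
class_levels = ["high", "high", "moderate", "low", None]
```

Clicking through to the detailed view would then simply filter the student list by band.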
Blended & Online Learning
Assessment, Alarm systems & Recommenders Systems
OLI, Carnegie Mellon University, Open Learning Initiative
L.A. Examples: Assessment/Recommenders
The STUDENT INSPECTOR consists of three major components: a browser to explore data; an admin module to manage students and student groups and to assign tasks to them; and an AI-based analyser to perform more sophisticated data processing.
Performance, misconceptions, topic coverage: an overview of the most frequent errors and misconceptions, as identified by ActiveMath.
Blended & Online Learning
Assessment, Alarm systems & Recommenders Systems
STUDENT INSPECTOR [Scheuer O. & Zinn C., 2011]
L.A. Examples: Assessment/Recommenders
Blended & Online Learning
Assessment, Alarm systems & Recommenders Systems
SocialLearn platform [Open University, UK]
L.A. Examples: Assessment/Recommenders
Blended & Online Learning
Assessment, Alarm systems & Recommenders Systems
SocialLearn platform [Open University, UK]
L.A. Examples: Assessment/Recommenders
The SocialLearn architecture includes a Recommendation Engine: a pipeline that processes data and outputs its analysis in a form defined by recommendation web services. Its second component is the Identity Server, which supplies data (with the students’ informed consent) to the Recommendation Engine: learners’ profiles, learners’ activities in SocialLearn, and selected elements of their activities and interactions on social media sites such as LinkedIn, Twitter, and sites employing OpenID. The final element of SocialLearn is the Delivery Channel (sites for both input and output), linked with the “SocialLearnBackup” browser toolbar, which also presents the recommendations and other visualisations.
SocialLearn is designed to work in a wide range of contexts and for a wide range of user profiles/roles: learners, teachers, group leaders, administrators.
Blended & Online Learning
Assessment, Alarm systems & Recommenders Systems
SocialLearn platform [Open University, UK]
L.A. Examples: Assessment/Recommenders
Key characteristics of exploratory dialogue (N. Mercer, 2007) in a forum include:
Challenge (“alternative”, “but if”, “I don’t believe”)
Evaluation (“good point”, “important”, “how much”)
Reasoning (“does that mean”, “my understanding”, “take your point”)
Extension (“next step”, “it’s like”, “relates to”)
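As a toy illustration (not the SocialLearn implementation), cue phrases like these can drive a naive keyword tagger for forum posts; the cue table and function below are hypothetical:

```python
# Cue phrases adapted from Mercer's categories above;
# matching is deliberately naive substring search.
CUES = {
    "challenge": ["alternative", "but if", "i don't believe"],
    "evaluation": ["good point", "important", "how much"],
    "reasoning": ["does that mean", "my understanding", "take your point"],
    "extension": ["next step", "it's like", "relates to"],
}

def dialogue_markers(post):
    """Return the exploratory-dialogue categories whose cue phrases
    occur in a forum post (alphabetically sorted)."""
    text = post.lower()
    return sorted(c for c, phrases in CUES.items()
                  if any(p in text for p in phrases))
```

Real discourse analytics would of course go beyond substring matching (negation, context, learned classifiers), but the sketch shows how such indicators are bootstrapped from cue phrases.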
Blended & Online Learning
Assessment, Alarm systems & Recommenders Systems
SocialLearn platform [Open University, UK]
L.A. Examples: Assessment/Recommenders
ELLI: Effective Lifelong Learning Inventory. Implementing social learning disposition analytics (Buckingham Shum, LAK 2012 Conference). [Theory: deep engagement in learning is a function of a complex combination of learners’ identities, dispositions, values, attitudes and skills.] The ELLI spider is fed by self-report data, and by activity and interaction data provided by the Identity Server and the Delivery Channel.
Red triangle: no action in a dimension
Yellow squares: some activity
Green circles: very active
Blended & Online Learning
Assessment, Alarm systems & Recommenders Systems
SocialLearn platform [Open University, UK]
L.A. Examples: Assessment/Recommenders
Implementing social learning context analytics: the Identity Server and the Delivery Channel provide data about a learner’s current context, including goals, activities, group memberships and learning roles. A future SocialLearn app will make use of these data, adding geolocation, to produce recommendations tailored to the learner.
L.A. Academic Analytics / Components
[Diagram: Academic Analytics, main components]
L.A. Learning Analytics Critical Dimensions
[Diagram: five dimensions arranged around LEARNING ANALYTICS]
DATA: open / protected; physical / digital; explicit / implicit
PURPOSE: reflection; adaptation; prediction; recommendation
DASHBOARD: interactivity; usability; visualisation / metaphors
USERS / STAKEHOLDERS: learners, teachers, staff, institution
PROCESS INSTRUMENTS: theories, technologies, visualisations, algorithms
Cross-cutting: consent, anonymity, customisation, governance
Data are not neutral (every step of the LA lifecycle, from data to analytics, to insight, to intervention, is infused with human judgment). We have to work in the field of big data with caution:
Bigger data are not always better data
Not all data are equivalent
Just because it is accessible doesn’t make it ethical
Think about what the technology renders visible and leaves invisible
Automating research changes the definition of knowledge
Limited access to big data creates new digital divides
Claims to objectivity and accuracy are misleading
[Boyd D. & Crawford K. (2011). Six Provocations for Big Data. Oxford Internet Institute, UK]
Learning Analytics Research Ethics
Ethics issues are involved at many levels, for instance:
Lack of accountability: when predictions are created without verification of the statistical methods, or of the accuracy and completeness of the data
Visualization: lack of connection among context, individual and data in many cases
Anonymity: lack of privacy if data may be harvested without permission and can be de-anonymised through implied relationships
Interactivity: when systems work one-way (institution to students); lack of interactivity if artifacts are not interactive or do not allow feedback
Dashboard design: which design or visual elements of dashboards are persuasive, and how might they affect the viewer? Where is meaning created beyond the dashboard’s label of “at-risk”? How might the student respond to being labeled without context?
An ethical literacy should include:
Functional literacy: using software to create visualizations, or to access and navigate a learning analytics dashboard
Critical literacy: understanding data limitations in predictive modeling and visualizations/dashboards as limitations of context, accountability, privacy and interactivity
Rhetorical literacy: analysing visualizations/dashboards rhetorically (producing knowledge) in terms of persuasion, power, and ownership
Learning Analytics Research Ethics
Remember that personal data of analytical value may range from formal transactions to involuntary data exhaust (such as building access, system logins, keystrokes, click streams).
Data can be derived from a range of systems:
Recorded activity: student records, attendance, assignments, researcher information
System interactions, card transactions, VLE, library/repository search
Feedback mechanisms, surveys, customer care
External systems that offer reliable identification, such as sector and shared services and social networks
Ethics Macro Level - Data Macro Level: Academic / Governance Level : Educational Systems, Schools, Universities
A number of higher education institutions have policy frameworks in place, in response to (inter)national legislative contexts, to regulate and govern intellectual property, safeguard data privacy and regulate access to data.
These policy frameworks may not always be sufficient to address the specific ethical challenges in the harvesting, analysis and protection of big data.
Study of HE examples*: Open University, UK (OU Student Community Charter); University of South Africa (UNISA)
“The review of those institutions’ policy frameworks highlights the irregularity of learning analytics where the institution is the only role player with decision-making power, determining the scope, definition and use of educational data without the input of other stakeholders”
Example: Amazon.co.uk provides clear explanations about consent, security, the information collected (including the use of cookies), the involvement of third parties and the users’ ability to access information about them. However, this does not solve the problem!
Ethics Macro Level - Data Macro Level: Academic / Governance Level : Educational Systems, Schools, Universities
Prinsloo P., Slade S. (2011) An evaluation of policy frameworks for addressing ethical considerations in Learning Analytics, LAK2011
A few institutions have defined or published their ethical policy and guidelines. Institutions must develop context-specific and appropriate guidelines and policy frameworks.
Ethics Macro Level - Data
Academic Analytics lacks convention and operationalisation of ethical principles, and an understanding of what is actually going on and where the line is drawn in grey areas.
Defining and addressing ethical issues in learning analytics depends on a number of epistemological and ideological assumptions.
Macro Level: Academic / Governance Level: Educational Systems, Schools, Universities
Basic Ethical Principles
Nuremberg Guidelines [1946]
Helsinki Declaration [1964]
Belmont Report [1979]
American Psychological Association Principles and Code of Ethics
Ethics Macro Level - Data Macro Level: Academic / Governance Level : Educational Systems, Schools, Universities
Law and Ethics
In January 2012, the European Commission proposed a reform of the EU’s 1995 data protection rules to strengthen online privacy rights; it has not yet been concluded. The practice of analytics (under UK law), especially with reference to personal data, should be considered under the headings of:
Data Protection; Confidentiality & Consent; Freedom of Information; Intellectual Property Rights; Licensing for Reuse
Ethical Dimensions. Some general common principles of good practice:
Clarity: open definition of purpose, scope and boundaries, even if that is broad and in some respects open-ended
Comfort and care: consideration for both the interests and the feelings of the data subject, and vigilance regarding exceptional cases
Choice and consent: informed individual opportunity to opt out or opt in
Consequence and complaint: recognition that there may be unforeseen consequences, and therefore providing mechanisms for redress
Basic considerations of institutions (HE, schools) related to a data ethics policy:
Who benefits, and under what conditions
Conditions for consent, de-identification and opting out
Vulnerability and harm
Collection, analysis, access to and storage of data
Ethics Macro Level - Data Macro Level: Academic / Governance Level : Educational Systems, Schools, Universities
Related Concepts
Ownership (of data) & Control (of analytic processes)
Consent (consent conditions, consent for third parties, right to opt out, right to withdraw, right to anonymity, right to access one’s own data, …)
Transparency (awareness of data collection and use by all stakeholders, …)
Privacy (out-of-scope data, tracking location, unintentional creation of sensitive data, risk of re-identification, …)
Validity (minimisation of inaccurate data, minimisation of incomplete data, spurious correlations, …)
Access (student access to their data, data formats, right to correct inaccurate data, …)
Action (conflicting interventions, need for human intervention, …)
Adverse impact (labelling bias, reinforcing discrimination, gaming the system, …)
Ethical, legal and logistical issues are all implicated.
Ethics Macro Level - Data Macro Level: Academic / Governance Level : Educational Systems, Schools, Universities
* LACE project - www.laceproject.eu - work in progress
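As one concrete (hypothetical) illustration of the anonymity and re-identification concerns listed above: student identifiers are often pseudonymised with a keyed hash before analysis. This reduces, but does not eliminate, re-identification risk, since records can still be linked across datasets:

```python
import hashlib
import hmac

def pseudonymise(student_id, secret_key):
    """Replace a student identifier with a keyed hash (HMAC-SHA256).

    Unlike a plain unsalted hash, the secret key prevents trivial
    dictionary attacks on the identifier space. Note, however, that
    the mapping is deterministic, so linkage across datasets using the
    same key (an 'implied relationship') can still enable de-anonymisation.
    """
    digest = hmac.new(secret_key, student_id.encode(), hashlib.sha256)
    return digest.hexdigest()[:16]  # truncated for readability
```

The determinism is exactly what makes longitudinal analytics possible and what keeps re-identification on the ethics agenda.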
Simple guidelines for analytics on a limited range of data and in a small group of people:
Clarity, transparency, interactivity, anonymity preservation
Tell all people: what, why, and for whom
Ask them: what do they want to know and use
Supply: what the analytics outputs are, and what they can do with them
Micro-Level: Classes, Learning Sessions, Communities of Practice
Ethics Micro Level - Data
L.A. Learning Analytics Debate
Learning Analytics is a rapidly growing research field, as well as a rapidly growing and robust commercial market.
The promise of Learning Analytics, and especially of Academic Analytics, is to transform educational research into a data-driven science, and educational institutions into organisations that make evidence-based decisions.
Critical debate is needed on:
The educational paradigms that learning analytics promote
The limits of computational modelling
The ethics of analytics
…
L.A. (self)Reflection Direction Critique
Some points of critique on Learning Analytics researchers producing tools supporting awareness and self-reflection:
Mostly driven by what is available as data
Mostly driven by what is technically easy to produce (as indicators, as visualisations)
Often initiated more out of enthusiasm for the possibilities and advances of technologies than from theories, theoretical preoccupations and constructs
L.A. Academic Analytics Critique
Academic Analytics has nothing to do with learning!
Automated assessment is not learning; organisational management is not learning
There is a gap between LA research and the practice of LA software companies and vendors
LA companies advance very quickly, shaping the LA field
Every LA application is determined by obvious or hidden assumptions, and by implicit or explicit underlying learning theories
The majority of Academic Analytics products are characterised by behaviorism as the dominant underlying theory
What kind of learning can the current technological applications of Academic Analytics in LMS-like environments capture?
Different kinds of analytics for different kinds of pedagogies
L.A. Learning Analytics Critique
Observing, selecting, measuring, describing, classifying, sorting, ordering, ranking… but also visualising: these processes of meaning making are never wholly neutral and objective; they are fraught with problems and compromises, biases and omissions.
Data are not neutral !
There is not one step of LA lifecycle that is neutral !
L.A. Learning Analytics Critique
Analytics profoundly shape education! [Buckingham Shum S., Keynote Speech, EDMedia 2014 Conference: “Learning Analytics: welcome to the future of assessment?”]
Politically (authority & power): analytics reports (at the organisational and macro level, up to the national scale) come with consequences along different dimensions, sometimes in a punitive rather than a constructive way.
Ontologically: what data, concepts and relationships do the designers apply to build the model (exam reports, speech, discussion)? Classification systems provide both a warrant and a tool for forgetting (the unit of analysis, whatever is coded out of a given schema).
Semiotically: visualisation modes characterise attainment and progress; ranking modes characterise students with badges (‘award systems’). What meaning making do the visualisations, as well as the interaction design of LA dashboards, encourage?
Algorithmically: algorithms govern education and influence the teaching or learning process when automated interventions are produced by them (e.g. in recommender systems). What thresholds, scales, relationships and patterns do the algorithms use? Deterministic or probabilistic algorithms: mathematics, technology, or an imposed epistemology? Algorithmic processes tend towards secrecy, obscurity and unaccountability, hence the need for algorithmic transparency.
Systemically: LA applications in institutions change the system dynamics. They modify in specific ways the curriculum, the assessment methods and contents, the personnel/staff requirements, etc., and they speed up the feedback loops for policy makers, instructional designers, administrators and leaders. In almost all cases there is no distribution of power between institution leaders, the L.A. committee and the university or school community (professors, students, staff, etc.).
LA Perspective: Reflection systems
General design aim: every learning environment should include, or be linked with, an interaction analysis (IA) component/tool.
[Agents: learners’ cognitive systems, teacher, system]
Perspective: exploit the complementarities between the existing approaches applying L.A.:
Create enriched and/or adaptive interfaces
Provide LA tools’ outputs for self-regulation
Intelligent systems advising students or even teachers
Perspective: apply complementary approaches based on LA
[Diagram mapping agents to LA-based functions]
System: enrich the interface; adapt LO/curricula; advise or guide users
Teacher: knowing students; advising students; assessing students
Learners’ cognitive systems: self-assess/regulate behavior; regulate individual behavior; regulate group behavior
LA Perspective: Reflection systems
LEARNING TECHNOLOGY AND EDUCATIONAL ENGINEERING LABORATORY www.LTEE.gr/adimitr 93
L.A. Perspectives: Research Road Map
Design of interaction analysis tools/components & functions:
- More profound LA indicators and more complete LA models
- Define interpretative schemas for specific conditions/contexts
- Tailored indicators, as well as indicator sets (models), for various actor profiles and roles
Development of LA tools "for all" learning environments:
- Interoperable tools, open systems & customisable components
Research on LA tools' users:
- Empirical results on LA tools' effects on users: (i) users' requirements, (ii) tools' usages, (iii) the significance of IA for the users, (iv) tools' effects
- Conceive & apply an appropriate research design
- Emphasis on "ethical" aspects and "community" social rules
…a new research field to work on
- Data appropriateness, usefulness, completeness (e.g. also using paper data, physical data)
- Analysis methods for all kinds of data (focusing more on content analysis, discourse analysis, dispositions, emotions)
- Apply an appropriate research design so as to: experiment in situ; take into account and study the context of use and its dominant culture (also collect detailed human interaction data around the computer)
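As a minimal illustration of what "analysis methods for all kinds of data" can mean beyond click counting, the sketch below codes forum messages into content categories with simple keyword rules. The categories and cue lists are invented for the example; real content or discourse analysis would use validated coding schemes and trained coders or NLP.

```python
# Fabricated coding scheme: category -> keyword cues.
CODES = {
    "question":  ("why", "how", "what if", "?"),
    "agreement": ("i agree", "good point", "exactly"),
    "emotion":   ("frustrated", "confused", "excited", "worried"),
}

def code_message(text):
    """Return the set of coding categories whose cues appear in the text."""
    lower = text.lower()
    return {code for code, cues in CODES.items()
            if any(cue in lower for cue in cues)}

messages = [
    "Why does the dashboard flag me as at risk?",
    "I agree, but I'm a bit confused by that indicator.",
]
for m in messages:
    print(sorted(code_message(m)))
```

Even this crude rule-based coder shows the shift of unit: from counting events to classifying what participants actually say.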
L.A. Perspective: Concluding Position
Preconditions for the prosperity of the Learning Analytics field. In order to offer richer learning environments, we need:
- A real link between "Learning" and "Analytics": working in a more profound way with theories of learning, teaching, cognition and knowledge
- Working more on supporting the learning and teaching process than on evaluating/assessing teaching and learning
- Working more on what the learning actors need than on what is available
- Elaborating and respecting ethical frameworks, with detailed guidelines on all underlying dimensions. Ethics plays an enormously important role; all stakeholders must be involved!
- Building trust and confidence is primordial for the existence and prosperous future of the LA discipline.
Learning Analytics tools list – tools that are starting to be used:
- CAM (Contextualized Attention Metadata): collects/combines data from different tools (browsers, office tools, multimedia, social networks, etc.) – http://mitarbeiter.fit.fraunhofer.de/~hcschmitz/publications/schmitz-et-al-cam-2011-prepublished.pdf
- LOCO-Analyst (Learning Object Context Ontologies): feedback on the learning process – http://jelenajovanovic.net/LOCO-Analyst/loco.html
- SMILI (Student Models that Invite the Learner In) Open Learner Modelling Framework: a method for describing, analysing and designing open learner models – http://www.iaied.org/pub/1115/file/1115_Kay07.pdf
- SNAPP (Social Networks Adapting Pedagogical Practice): developed to analyse interaction patterns in courses – http://www.ascilite.org.au/conferences/auckland09/procs/bakharia-poster.pdf
- Gephi: filtering, clustering, navigation and manipulation of network data – http://gephi.org/
- Sense.us: asynchronous collaboration, graphical annotation and view sharing – http://vis.stanford.edu/papers/senseus , http://vis.berkeley.edu/papers/sense.us/video/
- Signals: encourages or prompts users to take action ("nudge analytics") – http://www.itap.purdue.edu/learning/tools/signals/
- …
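The interaction-pattern analysis performed by SNAPP-style tools can be illustrated with a small in-degree computation over a forum reply network. The reply log below is fabricated for the example; none of this code belongs to the listed tools.

```python
from collections import Counter

# Fabricated forum reply log: (author, replied_to) pairs.
replies = [
    ("anna", "kostas"), ("kostas", "anna"), ("maria", "anna"),
    ("anna", "maria"), ("kostas", "maria"), ("nikos", "anna"),
]

participants = {p for pair in replies for p in pair}

# In-degree: how many replies each participant receives -- a simple
# proxy for that participant's centrality in the discussion network.
in_degree = Counter(target for _source, target in replies)

most_central = max(participants, key=lambda p: in_degree[p])
isolated = sorted(p for p in participants if in_degree[p] == 0)

print(most_central)  # anna: replied to by kostas, maria and nikos
print(isolated)      # ['nikos']: nobody replies to nikos
```

In a real deployment the reply pairs would be extracted from LMS forum data, and richer network metrics (betweenness, clustering) would typically be computed with a library such as networkx, or explored visually in Gephi.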
Biblio: Learning Analytics – a new research field to study and work on
List of active labs worldwide working on Learning Analytics:
LTEE lab, Aegean Univ., GR, [ www.ltee.gr/adimitr ]
CoCo Lab (P. Reimann) [ http://coco.edfac.usyd.edu.au/]
Collide, Lab, (U. Hoppe) Univ. of Duisburg, [ www.collide.info ]
GSIC, Univ. Valladolid, (Y. Dimitriadis) [http://gsic.tel.uva.es/]
EPFL/CRAFT (P. Dillenbourg) [http://craft.epfl.ch/]
Erik Duval, HCI unit, Computer Science Dept., Katholieke Universiteit Leuven [https://erikduval.wordpress.com/about/]
Simon Buckingham Shum, Learning Informatics, University of Technology Sydney, Director of the Connected Intelligence Centre (CIC) [http://simon.buckinghamshum.net/]
George Siemens, Associate Director, Technology Enhanced Knowledge Research Institute, Athabasca University, Canada [http://www.educause.edu/members/george-siemens ]
Rebecca Ferguson, Institute of Educational Technology, Open University, UK [http://iet.open.ac.uk/people/r.m.ferguson ]
Biblio (0)
ELI (2011). "Seven Things You Should Know About First Generation Learning Analytics". EDUCAUSE Learning Initiative Briefing.
Long, P. and Siemens, G. (2011). "Penetrating the fog: analytics in learning and education". EDUCAUSE Review Online, 46(5), 31–40.
Buckingham Shum, Simon (2012). Learning Analytics Policy Brief. UNESCO.
Ferguson, R. (2012). The State of Learning Analytics in 2012: A Review and Future Challenges. Technical Report. Knowledge Media Institute, The Open University, UK.
Rienties B., Alden Rivers B., (2014) Measuring and Understanding Learner Emotions: Evidence and Prospects. Learning Analytics Review 1.
Learning Analytics Community Exchange, EU funded Project [http://www.laceproject.eu]
Biblio (I)
Fesakis G., Dimitracopoulou A., Palaiodimos A. (2011). The impact of interaction analysis graphs on groups during online collaboration through blogs according to the "learning by design" scenario. CSCL2011 Conference Proceedings.
Dimitracopoulou A. & Bruillard E. (2006). Interfaces de Forum enrichies par la visualization d’analyses automatiques des interactions et du contenu. (Guest Editors) E.Bruillard & G.-L. Baron,. Special Issue on ‘Forum des Discussions Asynchrones’, Sciences et Techniques Educatives, Vol.13,
Hlapanis, G. & Dimitracopoulou A. (2007). A School-Teachers’ Learning Community: Matters of Communication Analysis, (Guest Editors) P. Kirschner & K.-W. Lai, Special Issue on Online Communities of Practice in Education, Journal of Technology, Pedagogy & Education, Vol 16, Iss 1.
Bratitsis T. & Dimitracopoulou A. (in press 2007a). Monitoring and Analysing Group Interactions in Asynchronous Discussions with the DIAS system. Special Issue, International Journal of e-Collaboration (IJeC).
Bratitsis T. & Dimitracopoulou A. (2007b). Interactions Analysis in asynchronous discussions: Lessons learned on the learners' perspective, using the DIAS system. International Congress, Computer Supported Collaborative Learning, CSCL 2007, Mice, Minds and Society, July 21-26, 2007, New Jersey, USA.
Bratitsis T. & Dimitracopoulou A. (2007c). Collecting and Analyzing Interaction Data in Computer-Based Group Learning Discussions: An overview. The 11th International Conference on User Modeling, Workshop on Personalisation in e-learning environments at individual and group level , Corfu, Greece, 25-29 June, 2007
Dimitracopoulou, A et al. (2005): State of the art of interaction analysis for Metacognitive Support & Diagnosis. Interaction Analysis (IA) JEIRP Deliverable D.31.1.1. Kaleidoscope NoE, (IST, e-learning) December 2005. (pp.119) Available online at: www.telearn.noe-kaleidoscope.org,
Hulshof P. & Dimitracopoulou A. (2005). Synthesis and Research Agenda on Computer Based
Interaction Analysis. Interaction Analysis (IA) JEIRP Deliverable D.31.1.3. Kaleidoscope NoE, December 2005. (pp.109) Available online at: www.telearn.noe-kaleidoscope.org,
Petrou A. & Dimitracopoulou A. (under submission). Teachers' participation during Synchronous Collaborative Activities in every-day educational practice supported by Interactions' Analysis Tools. Computers & Education.
Fessakis G., Petrou A. & Dimitracopoulou A. (2004) Collaboration Activity Function: An interaction analysis’ tool for Computer Supported Collaborative Learning activities, In Kinshuk et al (Eds) Proceedings of 4th IEEE International Conference on Advanced Learning Technologies (ICALT 2004), August 30 - Sept 1, 2004, Joensuu, Finland, IEEE Computer Society Editions, pp. 196-201
Petrou A. & Dimitracopoulou A. (2003). Is synchronous computer mediated collaborative problem solving ‘justified’ only when by distance? Teachers’ point of views and interventions with co-located groups during every day class activities. In (Eds) B. Wasson, S. Ludvigsen & U. Hoppe, Computer Support for Collaborative Learning: Designing for Change in Networked Learning Environments, CSCL 2003 congress: 14-18 June 2003, Bergen, Norway, Kluwer Academic Publishers, pp. 441-450.
Biblio (II)
BARROS B., VERDEJO F. M. (2000). Analyzing student interaction processes in order to improve collaboration: The DEGREE approach. International Journal of Artificial Intelligence in Education, 11, 221-241.
BIUK-AGHAI R., SIMOFF S. (2001). An integrated framework for Knowledge extraction in Collaborative Virtual Environments. In ACM-2001, Group’01, Sept.30-Oct.3, 2001, Boulder, Colorado, USA.
CHENG R., VASSILEVA J. (2005). Adaptive rewarding mechanism for sustainable online learning community. Artificial Intelligence in Education. C.-K. Looi et al. (Eds). IOS Press.
DILLENBOURG P., OTT D., WEHRLE T., BOURQUIN Y., JERMANN P., CORTI D., SALO P. (2002). The socio-cognitive functions of community mirrors. In F. Fluckiger, C. Jutz, P. Schulz and L. Cantoni (eds). Proceedings of the 4th Int. Conference on New Educational Environments. Lugano, May 8-11.
GEROSA M.A., PIMENTEL G.P., FUKS H., LUCENA C. (2005). No need to read messages right now: helping mediators to steer educational forums using statistical and visual information. In T. Koschmann, D. Suthers, T.-W.Chan (eds), Proceedings of CSCL2005: The Next Ten Years! Taipei, May 30-June 4, 2005, Taiwan, ISLS.
Harrer, A., Bollen, L. & Hoppe, U. (2004). Processing and Transforming Collaborative Learning Protocols for Learner's Reflection and Tutor's Evaluation. In: Proc. of Workshop on Artificial Intelligence in CSCL at the European Conference on Artificial Intelligence ECAI 2004, edited by E. Gaudioso and L. Talavera, Valencia.
JERMANN P. (2004). Computer Support for Interaction Regulation in Collaborative Problem Solving, PhD Thesis, University of Geneva.
MARTÍNEZ A., DIMITRIADIS Y., RUBIA B., GÓMEZ E., & DE LA FUENTE P. (2003b). Combining qualitative and social network analysis for the study of social aspects of collaborative learning, Computers and Education, 41(4), 353-368.
LEARNING TECHNOLOGY AND EDUCATIONAL ENGINEERING LABORATORY www.LTEE.gr/adimitr 102
MOCHIZUKI T., KATO H., HISAMATSU S., YAEGASHI K., FUJITANI S., NAGATA T., NAKAHARA J., NISHIMORI T., SUZUKI M. (2005). Promotion of Self-Assessment for Learners in Online Discussion Using the Visualization Software. In T. Koschmann, D. Suthers, T.-W. Chan (eds), Proceedings of CSCL2005: The Next Ten Years! Taipei, May 30-June 4, 2005, Taiwan.
NAKAHARA J., YAEGASHI K., HISAMATSU S., YAMAUCHI Y. (2005). iTree: Does the mobile phone encourage learners to be more involved in collaborative learning? In (Eds) T. Koschmann, D. Suthers, T.-W. Chan, Proceedings of CSCL 2005: The Next Ten Years! Taipei, May 30-June 4, 2005, Taiwan, ISLS.
REIMANN P. (2003). How to support groups in learning: More than problem solving. (keynote talk) in Aleven et al. (ed.), Artificial Intelligence in Education (AIED 2003). Supplementary Proceedings Sydney: University of Sydney, p. 3-16.
REYES P., TCHOUNIKINE P. (2005). Mining learning groups' activities in Forum-type tools. In T. Koschmann, D. Suthers, T.-W.Chan (eds), Proceedings of Computer Supported Collaborative Learning 2005: The Next Ten Years! Taipei, May 30-June 4, 2005, Taiwan, ISLS.
SCHUMMER T., STRIJBOS J-W., BERKEL T. (2005). A new direction for log-files analysis in CSCL: Examples with a spatio-temporal metric. In T. Koschmann, D. Suthers, T.-W.Chan (eds), Proceedings of CSCL 2005: The Next Ten Years! Taipei, May 30-June 4, 2005, Taiwan, ISLS.
VASSILEVA J., CHENG R., SUN L., HAN W. (2005). Designing Mechanisms to Stimulate Contributions in Collaborative Systems for Sharing Course-Related Materials, ITS 2004, Workshop on Computational Models of Collaborative learning. Maceio, Alagoas, Brazil, August 30 - September 3, 2004.
ZUMBACH J., SCHÖNEMANN J., REIMANN P. (2005). Analyzing and Supporting Collaboration in Cooperative Computer-Mediated Communication. In T. Koschmann, D. Suthers, & T. W. Chan (Eds.), CSCL 2005: The Next 10 Years! (pp. 758-767). Mahwah, NJ: Lawrence Erlbaum.