Considering Learning Analytics: SpeakApps and the Application of a Learning Analytics Reference Model Mairéad Nic Giolla Mhichíl Dublin City University www.speakapps.org



Description: Learning analytics in SpeakApps and in Higher Education Institutions; ethical issues


Page 1: Krakow presentation speak_appsmngm_final

Considering Learning Analytics: SpeakApps and the Application of a Learning Analytics

Reference Model

Mairéad Nic Giolla Mhichíl, Dublin City University

www.speakapps.org

Page 2: Krakow presentation speak_appsmngm_final

Project Overview: SpeakApps (speakapps.eu)

Lifelong Learning Programme Nov 2013 – Oct 2014

KA2 LANGUAGES, Accompanying Measures

Development of tools and pedagogical tasks for oral production and spoken interaction

Partners and Associated Partners:
• Institut Obert de Catalunya
• University of Southern Denmark
• University of Nice
• University of Jyväskylä
• Ruhr-Universität Bochum
• Polskie Towarzystwo Kulturalne "Mikołaj Kopernik"
• Fundació Pere Closa

Page 3: Krakow presentation speak_appsmngm_final

SpeakApps...

• Lifelong Learning Project, 2011–2012 & 2013–2014

• Partners and Associate Partners:
– Universitat Oberta de Catalunya, Catalan, English

– Rijksuniversiteit Groningen, Dutch

– University of Jyväskylä, Swedish, Finnish

– Jagiellonian University Krakow, Polish

– Dublin City University, Irish

• Development of tools and pedagogical tasks for oral production and spoken interaction

• Open Educational Resource using CEFR

• SpeakApps Moodle-based platform

Page 4: Krakow presentation speak_appsmngm_final

Introduction to Analytics

The pervasiveness of technology has facilitated the collection of data and the creation of a variety of data sets ("big data")

The application of analytics has spread across domains, prompting business and societal uses

Google Analytics – AdWords etc. (http://www.web2llp.eu/)
Smart cities – policing using predictive models to prevent crime

(Santa Cruz police department - http://edition.cnn.com/2012/07/09/tech/innovation/police-tech/)

The ultimate aim is to inform decision-making, from resource allocation to improved services etc.

How? By using a variety of data mining techniques for the discovery of patterns and/or the validation of hypotheses/claims

Page 5: Krakow presentation speak_appsmngm_final

Educational Analytics

Data available in education from a variety of sources:
LMS
Institutional systems
Google for Education
User-generated content, social networks

Ferguson (2012)* provides a useful overview of the educational analytics field and suggests the following divergence in focus between:

Educational data mining focuses on the technical challenge: How can we extract value from these big sets of learning-related data?

Learning analytics focuses on the educational challenge: How can we optimise opportunities for online learning?

Academic analytics focuses on the political/economic challenge: How can we substantially improve learning opportunities and educational results at national or international levels?


Page 6: Krakow presentation speak_appsmngm_final

Educational Analytics…

Long and Siemens (2011: 32) aptly describe the challenge:

But using analytics requires that we think carefully about what we need to know and what data is most likely to tell us what we need to know.

(http://net.educause.edu/ir/library/pdf/ERM1151.pdf)

* See: Ferguson, Rebecca (2012). Learning analytics: drivers, developments and challenges. International Journal of Technology Enhanced Learning, 4(5/6) pp. 304–317.

Page 7: Krakow presentation speak_appsmngm_final

Demographics
• Age, home/term address, commuting distance, socio-economic status, family composition, school attended, census information, home property value, sibling activities

Online Behaviour
• Mood and emotional analysis of Facebook, Twitter, Instagram activities, friends and their actual social network, access to the VLE (Moodle)

Physical Behaviour
• Library access, sports centre, clubs and societies, network access yielding co-location with others and peer groupings, lecture/lab attendance…

Student or Learner Data**

Academic Performance
• Second-level performance, university exams, course preferences, performance relative to peers in school

Slide reproduced and adapted with permission from presentation Glynn, M. (2014) Using the Data from eAssessment, eAssessment 2014: Final Answer: The question of summative eAssessment, 05th September 2014, University of Dundee Scotland available from: http://www.slideshare.net/enhancingteaching/optimising-knowledge-assessment-data

Page 8: Krakow presentation speak_appsmngm_final

Descriptive: What has happened?

Diagnostic: Why did this happen?

Predictive: What will happen?

Prescriptive: What should be done?

Levels / Objectives of Analytics**

Slide reproduced and adapted with permission from presentation Glynn, M. (2014) Using the Data from eAssessment, eAssessment 2014: Final Answer: The question of summative eAssessment, 05th September 2014, University of Dundee Scotland available from: http://www.slideshare.net/enhancingteaching/optimising-knowledge-assessment-data

Page 9: Krakow presentation speak_appsmngm_final

• Learning analytics is a moral practice which should align with core organisational principles

• The purpose and boundaries regarding the use of learning analytics should be well defined and visible

• Students should be engaged as active agents in the implementation of learning analytics

• The organisation should aim to be transparent regarding data collection and provide students with the opportunity to update their own data and consent agreements at regular intervals

• Modelling and interventions based on analysis of data should be free from bias and aligned with appropriate theoretical and pedagogical frameworks wherever possible

• Students are not wholly defined by their visible data or our interpretation of that data

• Adoption of learning analytics within the organisation requires broad acceptance of the values and benefits (organisational culture) and the development of appropriate skills

• The organisation has a responsibility to all stakeholders to use and extract meaning from student data for the benefit of students where feasible


Open University Analytics Principles**

Slide reproduced and adapted with permission from presentation Glynn, M. (2014) Using the Data from eAssessment, eAssessment 2014: Final Answer: The question of summative eAssessment, 05th September 2014, University of Dundee Scotland available from: http://www.slideshare.net/enhancingteaching/optimising-knowledge-assessment-data

Page 10: Krakow presentation speak_appsmngm_final

Analytics Process*

Establish the objective or claim of the LA exercise

Three-stage iterative process:

1. Data collection and pre-processing
Data preparation and cleaning, removal of redundant data, time stamps etc.; application of established and evolving data mining techniques

2. Analytics and action
Explore/analyse the data to discover patterns; data visualisation and representation

3. Post-processing
Adding new data from additional data sources; refining the data set; identifying new indicators/metrics; modifying the variables of analysis; choosing a new analytics method

* See: Chatti, M.A., Dyckhoff, A.L., Schroeder, U. and Thüs, H. (2012) 'A reference model for learning analytics', Int. J. Technology Enhanced Learning, Vol. 4, Nos. 5/6, pp.318–331
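The three-stage iterative process above can be sketched in code. This is a minimal illustration, not the SpeakApps implementation: the log records and field names (`user`, `action`, `time`) are invented for the example.

```python
# Sketch of the three-stage analytics process on hypothetical Moodle-style
# log records (the data and field names are assumptions for illustration).
from datetime import datetime

raw_logs = [
    {"user": "s1", "action": "view", "time": "2014-03-01T09:00:00"},
    {"user": "s1", "action": "view", "time": "2014-03-01T09:00:00"},  # redundant duplicate
    {"user": "s2", "action": "submit", "time": "2014-03-02T11:30:00"},
    {"user": "s1", "action": "submit", "time": "2014-03-05T14:15:00"},
]

# 1. Data collection and pre-processing: remove redundant rows, parse time stamps.
def preprocess(logs):
    seen, clean = set(), []
    for row in logs:
        key = (row["user"], row["action"], row["time"])
        if key not in seen:
            seen.add(key)
            clean.append({**row, "time": datetime.fromisoformat(row["time"])})
    return clean

# 2. Analytics and action: explore the data for simple behavioural patterns.
def actions_per_user(clean):
    counts = {}
    for row in clean:
        counts[row["user"]] = counts.get(row["user"], 0) + 1
    return counts

# 3. Post-processing: refine the data set with a new indicator, e.g. active days.
def active_days(clean):
    days = {}
    for row in clean:
        days.setdefault(row["user"], set()).add(row["time"].date())
    return {u: len(d) for u, d in days.items()}

clean = preprocess(raw_logs)
print(actions_per_user(clean))  # {'s1': 2, 's2': 1}
print(active_days(clean))       # {'s1': 2, 's2': 1}
```

The post-processing stage would feed back into stage 1 — e.g. adding survey data as a new source, then re-running the analysis.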

Page 11: Krakow presentation speak_appsmngm_final

What? What kind of data does the system gather, manage, and use for the analysis?

Who? Who is targeted by the analysis?

Why? Why does the system analyse the collected data?

How? How does the system perform the analysis of the collected data?

* See: Chatti, M.A., Dyckhoff, A.L., Schroeder, U. and Thüs, H. (2012) 'A reference model for learning analytics', Int. J. Technology Enhanced Learning, Vol. 4, Nos. 5/6, pp.318–331


Learning Analytics Reference Model*

Page 12: Krakow presentation speak_appsmngm_final

What? Data and Environment: Which systems? Structured and/or unstructured data

Who? Stakeholder: Teachers, Students, Instructional designers, Institutional stakeholders


Learning Analytics Reference Model*

Page 13: Krakow presentation speak_appsmngm_final

Why? Objective
Monitoring and analysis
Prediction and intervention
Tutoring and mentoring
Assessment and feedback
Adaptation
Personalisation and recommendation
Reflection

Challenge: to identify the appropriate indicators/metrics


Learning Analytics Reference Model*

Page 14: Krakow presentation speak_appsmngm_final

How? Method

Statistics: most LMSs produce statistics based on behavioural data

Data mining techniques and others (a long list):

Classification (categories known in advance): many different techniques from data mining

Clustering (categories not known in advance; similar data clustered together based on similar attributes)

Association rule mining: discovery of interesting associations and correlations within the data

Social Network Analysis…


Learning Analytics Reference Model*
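To make the clustering method concrete, here is a minimal one-dimensional k-means sketch grouping students by weekly LMS logins into low- and high-engagement clusters. The login counts and initial centroids are invented for illustration; this is not the SpeakApps or DCU implementation.

```python
# Minimal 1-D k-means: cluster students by weekly LMS logins (invented data).
logins = {"s1": 2, "s2": 3, "s3": 25, "s4": 22, "s5": 1}

def kmeans_1d(values, centroids, iterations=10):
    """Iteratively assign each value to its nearest centroid, then
    move each centroid to the mean of its assigned values."""
    for _ in range(iterations):
        clusters = {c: [] for c in centroids}
        for v in values:
            nearest = min(centroids, key=lambda c: abs(v - c))
            clusters[nearest].append(v)
        centroids = [sum(m) / len(m) if m else c for c, m in clusters.items()]
    return centroids

centroids = kmeans_1d(list(logins.values()), [0.0, 30.0])

# Assign each student to the nearer centroid: 0 = low, 1 = high engagement.
groups = {s: min(range(2), key=lambda i: abs(v - centroids[i]))
          for s, v in logins.items()}
print(groups)  # {'s1': 0, 's2': 0, 's3': 1, 's4': 1, 's5': 0}
```

The key point from the slide holds here: the two groups were not known in advance — they emerge from the similarity of the data, unlike classification, where the categories are fixed beforehand.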

Page 15: Krakow presentation speak_appsmngm_final


Dublin City University's Moodle Analytics*

What? What kind of data does the system gather, manage, and use for the analysis?

Using the generic Moodle log data, i.e. behavioural data

Who? Who is targeted by the analysis?

Students and Lecturers

Why? Why does the system analyse the collected data?

Students – adapt their behaviour
Teachers – review the course
Institutional – monitor engagement, particularly of first-year students

How? How does the system perform the analysis of the collected data?

- Model created from c. six years of user data on participation within courses; identifies which modules are suitable, i.e. rules must possess a strong confidence level (c. 0.6–0.7)

- Students who are not engaging with the module are provided with feedback on their engagement – student-centred
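The confidence level mentioned above can be illustrated with a tiny association-rule sketch: the confidence of a rule "logged in weekly → passed the module" is the fraction of weekly-logging students who passed. The student records and field names below are invented for illustration, not DCU's actual model.

```python
# Confidence of an association rule on hypothetical student records.
students = [
    {"weekly_login": True,  "passed": True},
    {"weekly_login": True,  "passed": True},
    {"weekly_login": True,  "passed": False},
    {"weekly_login": False, "passed": False},
    {"weekly_login": True,  "passed": True},
]

def confidence(records, antecedent, consequent):
    """P(consequent | antecedent): of the records where the antecedent
    holds, the fraction where the consequent also holds."""
    with_a = [r for r in records if r[antecedent]]
    if not with_a:
        return 0.0
    return sum(1 for r in with_a if r[consequent]) / len(with_a)

conf = confidence(students, "weekly_login", "passed")
print(conf)         # 0.75
print(conf >= 0.6)  # True: would clear a 0.6-0.7 confidence threshold
```

A module whose rules fall below the threshold would be excluded from the model, since weak rules would generate unreliable feedback to students.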

Page 16: Krakow presentation speak_appsmngm_final

DCU Moodle Analytics

Slide reproduced and adapted with permission from presentation Glynn, M. (2014) Using the Data from eAssessment, eAssessment 2014: Final Answer: The question of summative eAssessment, 05th September 2014, University of Dundee Scotland available from: http://www.slideshare.net/enhancingteaching/optimising-knowledge-assessment-data

Page 17: Krakow presentation speak_appsmngm_final

The claim is that student and teacher oral & video recordings should be time-limited to maintain the attention of the listener.

Currently we recommend a maximum of one minute for learner recordings and two minutes for teacher recordings.

At present this claim is based on experience. Evidence is needed to support decision-making, which will impact:
Resource allocation to refine the tool (time limitation)
Learner agency
Instructional and task design


SpeakApps Pilot
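One way to move this claim from experience to evidence would be to compare how much of each recording listeners actually play back, grouped by recording length. The sketch below uses invented playback records (length in seconds, fraction played); it is a shape for the analysis, not pilot data.

```python
# Hypothetical playback data: does listener attention drop for longer recordings?
from statistics import mean

plays = [  # (recording length in seconds, fraction of it played back)
    (45, 0.95), (55, 0.90), (60, 0.92),    # recordings of up to one minute
    (110, 0.70), (150, 0.55), (180, 0.40), # longer recordings
]

short = [f for secs, f in plays if secs <= 60]
long_ = [f for secs, f in plays if secs > 60]

print(round(mean(short), 2))  # 0.92
print(round(mean(long_), 2))  # 0.55
```

If real pilot data showed a completion gap like this invented one, it would support keeping the one-minute limit; if not, the resource allocation for the time-limitation feature could be reconsidered.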

Page 18: Krakow presentation speak_appsmngm_final


SpeakApps Pilot & LA Reference Model

What? What kind of data does the system gather, manage, and use for the analysis?

LMS data, i.e. technical information (device, browser, versions etc.) and behavioural data (time stamps, click tracking), plus user-generated content such as surveys and peer feedback

Who? Who is targeted by the analysis?

Students, Teachers, Instructional Designers and Developers

Why? Why does the system analyse the collected data?

Students – adapt behaviour
Teachers – tutoring
Instructional designers – adapt tasks
Developers – interface adaptation

How? How does the system perform the analysis of the collected data?

Statistics based on behavioural data and the analysis of user-generated data – possible qualitative follow-up

Page 19: Krakow presentation speak_appsmngm_final

Data Types and Sources

Aggregate and integrate data produced by students from multiple sources

Challenge to source, combine and manipulate data from a wide variety of sources and in many formats

Over-reliance on behavioural data from the LMS; varied data sources needed:

Structured data, i.e. data from the LMS, other institutional systems, connected devices

Unstructured data, i.e. other sources of user-generated content/data, e.g. Facebook (social network modelling), online dictionaries, translation tools, thesauri etc.


Concluding Remarks

Page 20: Krakow presentation speak_appsmngm_final

Student Agency in LA

Students as active agents – voluntarily collaborate in providing and accessing data

Designing interventions (if appropriate in the context) and the agency of the student:
Student at the centre of interpretation
Data representation to facilitate interpretation
Requires specific skills of interpretation


Concluding Remarks…

Page 21: Krakow presentation speak_appsmngm_final

Ethical and Educational Concerns

Use of data based on transparent opt-in permission of students, following established research principles

Students understand that data is collected about them and actively buy in

Privacy and stewardship of data

Emphasis on learning as a moral practice resulting in understanding rather than measuring (Reeves, 2011)


Concluding Remarks…

Page 22: Krakow presentation speak_appsmngm_final

Challenging to realise the specific objectives of stakeholders:
Teachers v instructional designers in online education
Institutions v funders

Designing and focusing indicators

Necessary skills for interpreting and communicating outcomes

Representation in clear and usable formats for stakeholders

DCU to research the impact on students


Concluding Remarks…