
Presentation in the research group seminar, Institute of Informatics, Tallinn University, 7 March 2012. Based on the following publication: Põldoja, H., Väljataga, T., Tammets, K., & Laanpere, M. (2011). Web-based Self- and Peer-assessment of Teachers' Educational Technology Competencies. In H. Leung, E. Popescu, Y. Cao, R. Lau, & W. Nejdl (Eds.), Advances in Web-Based Learning – ICWL 2011: 10th International Conference, Hong Kong, China, December 2011. Proceedings (pp. 122–131). Berlin / Heidelberg: Springer. http://www.springerlink.com/content/e3t2042568271213/


Web-based Self- and Peer Assessment of Teachers' Digital Competences

Mart Laanpere, Hans Põldoja

This work is licensed under the Creative Commons Attribution-ShareAlike 3.0 Unported License. To view a copy of this license, visit http://creativecommons.org/licenses/by-sa/3.0/ or send a letter to Creative Commons, 444 Castro Street, Suite 900, Mountain View, California, 94041, USA.


Research context

• Importance of educational technology competencies

• Generic ICT competency frameworks (e.g. ICDL) do not cover all the competencies needed for educational use of ICT

• Educational Technology Competency Model (ETCM) for Estonian teachers

• DigiMina project for assessing teachers’ educational technology competencies

Research problem

To what extent, and how, could teachers' educational technology competencies be assessed using a Web-based tool?

Existing competency frameworks

• International Computer Driving License

• UNESCO ICT Competency Framework for Teachers

• ISTE National Educational Technology Standards for Teachers (NETS-T)

• ECDL / ICDL

• Used in 148 countries

• Focused on ICT usage

• Standardized testing

• UNESCO ICT Competency Framework for Teachers

• Launched 2008, revised 2011

• Guidelines for creating national competency models

• 6 subdomains

• ISTE NETS-T 2008

• 20 competencies in 5 competency groups

• Used in Norway, Netherlands, Finland, etc.

Design challenges

• How to define performance indicators of all competencies in ETCM?

• How to select appropriate methods and instruments for assessing competencies?

• How to implement selected assessment methods in a Web-based tool?

Educational Technology Competency Model for Estonian Teachers

• Based on ISTE NETS-T 2008

• 5-level assessment rubric developed by a local expert group

Measuring Educational Technology Competencies

• Assessment methodology and instruments must be reliable, valid, and flexible, but also affordable in terms of time and cost.

• Four levels of measuring competencies (Miller, 1990):

• knows

• knows how

• shows how

• does

Web-based Assessment of Competencies

• Five-dimensional framework for authentic assessment (Gulikers et al., 2004):

• tasks: meaningful, relevant, typical, complex, ownership of problem and solution space;

• physical context: similar to professional work space and time frame, professional tools;

• social context: similar to social context of professional practice (incl. decision making);

• form: demonstration and presentation of professionally relevant results, multiple indicators;

• criteria: used in professional practice, related to realistic process/product, explicit

Design and development of DigiMina

Research-based design methodology

Adapted from Leinonen et al. (2008)

Contextual Inquiry

Personas

Participatory Design

Scenario-based design

Participatory design sessions

Concept mapping

Product Design

User stories

Paper prototyping

Information architecture

High-fidelity prototyping

Software Prototype as Hypothesis

Agile sprints

Personas

• Teacher training master student

• Novice teacher

• Experienced teacher

• Educational technologist of a school

• Training manager (in a national organization)

(Cooper et al., 2007)

Scenarios

• Master student is evaluating her educational technology competencies

• Peer assessment of problem solving tasks

• Educational technologist of a school is getting an overview of teachers’ educational technology competencies

• Training manager is compiling a training group with a sufficient level of competencies

(Carroll, 2000)

Scenario 2: Peer assessment of competencies

Kaisa is a young mathematics teacher in her first year at school. During the teacher induction year she is expected, among other things, to map her competencies. Kaisa has assessed her educational technology competencies once before, while at university, but now at school she has been able to apply her knowledge and skills in practice.

On entering the DigiMina environment she sees the competency assessment test she took last year. She can choose whether to start the competency test from last year's level or from the very beginning. She remembers that last time solving all the tasks took quite a lot of time, so this time she decides to start from last year's level. This time, in addition to multiple-choice questions, there are also problem-solving tasks for which she has to write her own solution in free form.

At the end of the test a competency profile diagram is displayed, but some competencies are missing from it. Kaisa reads from a note next to the diagram that free-form answers have to be assessed by other DigiMina users and that complete results will only be shown once her answers have been assessed.

Back on the front page, Kaisa is shown a notice that she can take part in assessing the problem-solving solutions submitted by other users. Curious about how others solved the given problems, she decides to assess one competency test. When assessing a task, Kaisa has to add her own comment on the proposed solution and choose a score from a menu. To make the assessment easier, the assessment criteria are also displayed. After the first answer has been assessed, she decides to assess a couple more. Kaisa cannot see whose answers she assessed, but she guesses that they may have been other teachers in their induction year.

A couple of days later Kaisa receives an e-mail saying that her answers have been assessed by two DigiMina users. After logging in she sees the short comments written about her answers and can access the summary diagram. In the summary she can also compare her competency level with the average level of induction-year teachers and with the average level of all users.

Participatory design sessions

• 2 sessions

• Discussing the scenarios

• Drawing the sketches

Main concepts

Competency Test

• The competency test can be taken several times to measure advancement

• Usability issue: large number of tasks (20 competencies × 5 levels)

• Solutions:

• Can be saved and continued later

• Setting the starting level with self-evaluation
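The resume-and-start-level mechanics listed above can be made concrete with a minimal sketch. All names here (TestSession, starting_level_from_self_evaluation) are hypothetical illustrations, not the actual DigiMina data model.

from dataclasses import dataclass, field

@dataclass
class TestSession:
    """One attempt at the competency test; it can be saved and resumed later."""
    user_id: str
    starting_level: int                          # 1..5, derived from the self-evaluation
    answers: dict = field(default_factory=dict)  # task_id -> answer, filled in over several sittings
    completed: bool = False

def starting_level_from_self_evaluation(self_ratings):
    """One conservative option: start the test at the lowest level (1..5)
    the teacher claims for herself across the competencies she rated."""
    return max(1, min(self_ratings))

# A teacher rating herself at levels [3, 3, 2, 4] would start the test at level 2.
session = TestSession(user_id="kaisa",
                      starting_level=starting_level_from_self_evaluation([3, 3, 2, 4]))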

Tasks

• Task types:

• automatically assessed self-test items

• peer assessment task

• self-reflection task

• Need to increase the number of competencies that can be assessed with a self-test

• Peer assessment requires a blind review from a user at the same or a higher competency level
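To make the blind-review rule above concrete, here is a minimal sketch of how eligible reviewers could be selected. The function name and the user data structure are assumptions for illustration, not the DigiMina implementation.

import random

def pick_blind_reviewers(competency, answer_level, author_id, users, n=2):
    """Choose peer reviewers for a free-form answer.

    A reviewer must not be the author and must hold the same or a higher
    level in the competency being assessed. Reviewers are drawn at random,
    and the author never learns who they are (blind review).
    `users` is a list of dicts: {"id": ..., "levels": {competency_id: level}}.
    """
    eligible = [u["id"] for u in users
                if u["id"] != author_id
                and u["levels"].get(competency, 0) >= answer_level]
    return random.sample(eligible, min(n, len(eligible)))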

Competency Profile

• Competency levels are displayed as a diagram

• User can compare her competency level with the average level of various groups

• Privacy settings (private, group members, public)

• Can be linked to or embedded in external web pages
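The comparison with group averages mentioned above amounts to averaging levels per competency across the profiles the viewer is allowed to see. A minimal sketch, assuming each profile is a plain {competency_id: level} mapping (an assumption, not the DigiMina schema):

from statistics import mean

def group_average_profile(profiles):
    """Average level per competency over a group of competency profiles.

    Competencies a member has not yet been assessed on are simply missing
    from that member's profile and are skipped in the average.
    """
    competencies = {c for p in profiles for c in p}
    return {c: mean(p[c] for p in profiles if c in p) for c in competencies}

# The teacher's own profile can then be drawn next to group_average_profile(...)
# computed for her school group and for all DigiMina users.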

Group

• Typically created for a school or a group of teacher training students

• Group owner can see competency profiles of all members

• Anybody can create a group

• Groups can be set up as private or public

Competency Requirements

• A large number of competency profiles would make DigiMina a valuable planning tool

• Competency requirements can be created by the training manager, teacher trainer and group owner

• Will be implemented in a later phase
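Although competency requirements are planned for a later phase, the matching rule they imply is simple enough to sketch. The following is an illustration only; meets_requirements and the mapping format are hypothetical, not part of the current prototype.

def meets_requirements(profile, requirements):
    """Check whether a competency profile satisfies a set of requirements.

    Both arguments are mappings {competency_id: level}; the profile must
    reach at least the required level for every listed competency.
    """
    return all(profile.get(c, 0) >= level for c, level in requirements.items())

# A training manager could then filter candidates for a training group, e.g.
# [u for u in members if meets_requirements(u.profile, course_requirements)]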

Current prototype

Task development

Assessment rubric example: 3.1. Demonstrate fluency in technology systems and the transfer of current knowledge to new technologies and situations

Level 1: Creates a user account in a web-based system and creates/uploads resources; uses common software/web environments/hardware with the help of a user manual; uses presentation tools and a printer; saves/copies files to an external drive.

Level 2: Manages access rights to resources published on the web.

Level 3: Independently solves problems that occur during the use of ICT tools (using help, manuals, FAQs or forums when needed); combines different tools; changes the settings of a web-based system.

Level 4: Transfers working methods from a known web environment/software to an unknown environment.

Level 5: Chooses (compares, evaluates) the most suitable tool for a given task.

Example self-test task

• Screen recording of a teacher joining the national educational portal Koolielu and making several errors in the process

• Multiple-response question about the errors made

Example peer review task

• Teacher must adapt a given study guide to her own working context (age range, subject area, software)

Example self-reflection task

• Competency 4.1, level 1: participates as a partner in a multicultural project

Development of test items

• Test items are authored using TATS, an IMS QTI-compatible test authoring tool (Tomberg & Laanpere, 2011)

• Three IMS QTI question types are currently supported:

• choiceInteraction (multiple choice)

• choiceInteraction (multiple response)

• extendedTextInteraction
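The three supported item types differ only in the interaction element inside the QTI item XML. Below is a minimal sketch of telling them apart using the standard QTI 2.1 element names; the classify_item function itself is illustrative and not part of TATS or DigiMina.

import xml.etree.ElementTree as ET

QTI_NS = "{http://www.imsglobal.org/xsd/imsqti_v2p1}"

def classify_item(item_xml):
    """Classify an IMS QTI 2.1 assessment item by its interaction element:
    choiceInteraction with maxChoices="1" (the QTI default) is multiple choice,
    any other choiceInteraction is multiple response, and
    extendedTextInteraction is a free-form task that goes to peer assessment."""
    root = ET.fromstring(item_xml)
    choice = root.find(f".//{QTI_NS}choiceInteraction")
    if choice is not None:
        return "multiple-choice" if choice.get("maxChoices", "1") == "1" else "multiple-response"
    if root.find(f".//{QTI_NS}extendedTextInteraction") is not None:
        return "free-form"
    raise ValueError("unsupported interaction type")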

Future work

Development of DigiMina

• Visualizations and analytics

• Support for additional QTI question types: orderInteraction and associateInteraction

• Competency requirements

• Integration with Koolielu portal

References

• Gulikers, J. T. M., Bastiaens, T. J., & Kirschner, P. A. (2004). A Five-Dimensional Framework for Authentic Assessment. Educational Technology Research & Development, 52(3), 67–86.

• Miller, G. E. (1990). The assessment of clinical skills/competence/performance. Academic Medicine, 65(9), S63–S67.

• Leinonen, T., Toikkanen, T., & Silfvast, K. (2008). Software as Hypothesis: Research-Based Design Methodology. In Proceedings of the Tenth Anniversary Conference on Participatory Design 2008 (pp. 61–70). Indianapolis, IN: Indiana University.

• Cooper, A., Reimann, R., & Cronin, D. (2007). About Face 3: The Essentials of Interaction Design. Indianapolis, IN: Wiley Publishing, Inc.

• Carroll, J. M. (2000). Making Use: Scenario-Based Design of Human-Computer Interactions. Cambridge, MA: The MIT Press.

• Tomberg, V., & Laanpere, M. (2011). Implementing Distributed Architecture of Online Assessment Tools Based on IMS QTI ver.2. In F. Lazarinis, S. Green, & E. Pearson (Eds.), Handbook of Research on E-Learning Standards and Interoperability: Frameworks and Issues (pp. 41–58). IGI Global.