
Page 1:

“The development of a quick, easy to use, cross-disciplinary information evaluation matrix”

Mike Leigh, Department of Informatics, De Montfort University, [email protected]

Lucy Mathers, Department of Media Technology, De Montfort University, [email protected]

Kaye Towlson, Library Services, De Montfort University, [email protected]

Page 2:

Session Outline

• Project Background
• Project Strategy
• Methodology – Action Research Approach
• Data Collection Methods
• Information Source Evaluation Matrix (ISEM)
• Learning Issues
• Teaching Issues
• Future Work
• Conclusions

Page 3:

Project Background

• Student (mis)perception of their information evaluation skills
  – Previous study results
  – Observed wiki/blog postings
• Existing online tutorials largely ignored
  – Intute Internet Detective, OU Safari, QUICK
• Prototype ISEM developed within the Faculty of Technology (Leigh et al., 2009)
• Current project aims to evaluate utilisation of the ISEM by students with diverse needs within different academic disciplines

Page 4:

Methodology – Action Research Approach

[Diagram adapted from Allen (2001)]

Page 5:

Data Collection Methods

To understand the effectiveness of this diverse usage of the ISEM, data were collected through the following mechanisms:
• A deeper study within the Faculty of Technology.
• A broader study across the University, covering a wider range of disciplines and different levels of study.
• Use within transition project packs provided to sixth-form students beginning their first year of study at De Montfort University.
• Ad-hoc usage – Library publication.
• External interest.

Page 6:

Information Source Evaluation Matrix

Source/Reference:
Task/Question:

Evaluation criteria (descriptors for scores 1–5; one score per criterion):

Who? – is the author
  1: Author background is unknown
  2: Some evidence author works in this area, but few articles
  3: Evidence of some publications in this area by author
  4: Author has several published works in this area
  5: Author is a known authority in this area

What? – is the relevance of points made
  1: Content and arguments of little or no relevance to the task
  2: Only of peripheral/little relevance to the task being undertaken
  3: Some of the content is relevant to task requirements
  4: Several points made are of relevance to the task
  5: Content and arguments closely match your needs

Where? – context for points made
  1: Situation to which the author applies points is different to that of the task
  2: Minimal similarity between the author’s context and the task context
  3: Author’s situation and that of the task have some similarity
  4: Reasonable similarity between the author’s and the task context
  5: Author’s context and that of the task very similar

When? – was the source published
  1: Date is unknown, or the source is more than 20 years old
  2: Old reference – between 10 and 20 years old
  3: Reference is between 5 and 10 years old
  4: Recent reference – 2 to 5 years old, or a known important paper
  5: Up-to-date source – published in the last two years, or a seminal paper

Why? – author’s reason/purpose for writing the article
  1: No apparent motivation seen in the article
  2: Newspaper (or online) article opinion – not evidenced
  3: Trade magazine/commercial paper – might have some bias
  4: Book source/conference paper, or subject-interest forum/blog
  5: Academic journal paper – peer reviewed

Notes:

Total score:
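As the per-criterion scores and the total suggest, applying the matrix is simple arithmetic: each of the five criteria receives a score from 1 to 5, and the scores are summed to a maximum of 25. A minimal sketch of that calculation in Python (purely illustrative; the slides do not prescribe any implementation, and the function and variable names are our own):

```python
# Illustrative only: the ISEM is a paper/online rubric, not code.
# Criterion names follow the matrix above; all identifiers are assumptions.
ISEM_CRITERIA = ("who", "what", "where", "when", "why")

def isem_total(scores: dict) -> int:
    """Sum the five 1-5 criterion scores (maximum 25)."""
    for criterion in ISEM_CRITERIA:
        if not 1 <= scores[criterion] <= 5:
            raise ValueError(f"{criterion} must be scored 1-5")
    return sum(scores[c] for c in ISEM_CRITERIA)

# Example: a recent, peer-reviewed paper by a known authority,
# but with only partly relevant content and context.
print(isem_total({"who": 5, "what": 3, "where": 3, "when": 5, "why": 5}))  # 21
```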

Page 7:

Learning Issues

Q: I’ve got enough to do already, why should I use this?
Q: But this is common sense, what benefits do I get?
Q: I don’t understand the criteria, how can I find out what each one means?
Q: If a paper gets a low score but I still think it’s useful, should I use it?

Page 8:

Teaching Issues

Q: Why should I use this?
A: To help students understand for themselves why Wikipedia is not the fount of all knowledge!
Q: How much extra time will this take?
Q: How can I make my students use it?
Q: The criteria or descriptors don’t appear to be relevant to my subject/task area; how can I adapt the ISEM?

Page 9:

Future Work

• Synthesis of ISEM for other CS evaluations
  – Software Libraries
  – Software Development Patterns
• Investigate adaptation of ISEM for other media sources (images, video, audio, etc.)
• Online tutorials to support matrix use
• Develop a self-evaluation skills survey
• ISEM widget to be available
  – User-chosen variable weightings (see the sketch below)
  – Dynamic free-text area
  – Enhancement of appearance?
  – Caveat: demand for this was very small
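The slides name user-chosen weightings as a widget feature but give no design, so the following Python sketch is entirely an assumption about how such weightings might work: each criterion’s 1–5 score is multiplied by a user-supplied weight, normalised so that equal weights reproduce the plain total.

```python
# Hypothetical weighting scheme for the proposed ISEM widget;
# the slides specify the feature, not this (or any) formula.
def weighted_isem_total(scores: dict, weights: dict) -> float:
    """Weighted ISEM total, scaled so that equal weights give the plain sum."""
    scale = len(scores) / sum(weights.values())
    return sum(scores[c] * weights[c] for c in scores) * scale

scores = {"who": 5, "what": 3, "where": 3, "when": 5, "why": 5}
weights = {"who": 1.0, "what": 2.0, "where": 1.0, "when": 0.5, "why": 1.0}  # emphasise relevance
print(round(weighted_isem_total(scores, weights), 1))  # 19.5 (plain total is 21)
```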

Page 10:

Software Libraries Evaluation Matrix – Prototype

Software Source:
Required Functionality:

Evaluation criteria (scored 1–5; descriptors for intermediate scores 2–4 elided on the slide):

Who? – provenance of software code (source – author)
  1: Author background is unknown – code possibly ‘hacked’
  2–4: . . .
  5: Code is from a reputable source – vendor/author

What? – is the functionality provided by the software
  1: Does not provide required functionality but is adaptable
  2–4: . . .
  5: Provides an exact match to required functionality

What? – is the interface? How easy to integrate?
  1: Not coded in target language, can be adapted with effort
  2–4: . . .
  5: Written in required language, can be plugged in

What? – is the license agreement? Can code be used freely?
  1: Open source code can be used for whatever purpose
  2–4: . . .
  5: Strict licensing agreements to be adhered to

Where? – context for use of code – is the software proven for the required situation
  1: Situation for which the code is needed is different to that of the requirement
  2–4: . . .
  5: Context and that of the requirement highly similar – proven code for the domain

When? – how established is the software
  1: Beta version – not yet fully developed
  2–4: . . .
  5: Well used, reliable and robust software

Why? – reason/purpose for writing the code
  1: Written for single use in a specific environment – sparse documentation
  2–4: . . .
  5: Written to be a reusable component – well documented and reviewed

Total score:
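Both CS prototypes (this one and the pattern matrix on the next slide) reuse the ISEM’s shape: a titled set of criterion questions, 1–5 descriptors, and a summed total. One way to capture that shared structure, as a hedged Python sketch (the class and field names are illustrative assumptions, not anything the slides define):

```python
from dataclasses import dataclass

@dataclass
class Criterion:
    question: str     # e.g. "Who? - provenance of software code"
    descriptors: dict # score (1-5) -> descriptor text; 2-4 may be elided

@dataclass
class EvaluationMatrix:
    title: str
    criteria: list    # list of Criterion

    def total(self, scores: list) -> int:
        """Sum one 1-5 score per criterion."""
        assert len(scores) == len(self.criteria), "one score per criterion"
        assert all(1 <= s <= 5 for s in scores), "scores must be 1-5"
        return sum(scores)

# Instantiating the sketch with the first row of the libraries matrix above.
libraries_matrix = EvaluationMatrix(
    title="Software Libraries Evaluation Matrix - Prototype",
    criteria=[Criterion("Who? - provenance of software code",
                        {1: "Author background is unknown - code possibly 'hacked'",
                         5: "Code is from a reputable source - vendor/author"})],
)
print(libraries_matrix.total([4]))  # 4
```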

Page 11:

Software Development Pattern Evaluation Matrix – Prototype

Pattern name:
Problem requiring resolution:

Evaluation criteria (scored 1–5; descriptors for intermediate scores 2–4 elided on the slide):

Who? – wrote the pattern – what is its provenance?
  1: Author(s) background is unknown – pattern possibly ‘flawed’
  2–4: . . .
  5: Pattern is from a reputable source – organisation/author

What? – is the design problem addressed by the pattern
  1: Does not directly solve the problem but is adaptable with effort
  2–4: . . .
  5: Provides an exact match to the problem being resolved

Where? – context in which the pattern is a proven solution
  1: Problem situation for which the pattern is needed is different to that of the pattern
  2–4: . . .
  5: Context and that of the pattern highly similar – proven solution for the domain

What? – are the forces that need to be resolved? Will the pattern resolve them?
  1: Some of the forces will be resolved but not others – some conflict seen
  2–4: . . .
  5: Forces seen in the problem will be resolved by the pattern with few difficulties

When? – how established is the software pattern?
  1: Pattern has been used in a small number of cases – not yet established
  2–4: . . .
  5: Widely applied and proven in many different domains – well-known solution

Why? – reason/purpose for writing the pattern
  1: Written for use in a specific specialist environment – transferability uncertain
  2–4: . . .
  5: Written to be widely applicable – well documented and reviewed

Total score:

Page 12:

Conclusions

• Students need guidance on ISEM usage
  – Positive feedback from students where a face-to-face introduction was given
  – Clear written guidance needed to clarify criteria/descriptors
• Student views highlighted:
  – Facilitating the quick evaluation of sources
  – The ease of use of the ISEM
  – The effectiveness of how sources may be evaluated
  – The raising of awareness of evaluation criteria
• Suggested modifications concerned:
  – Disagreement with the given criteria
  – The need for more guidance on the use of the matrix
  – Improvements that could be made to the matrix layout
  – Issues concerning the access mechanism

Page 13:

References

ALLEN, W.J. (2001) The role of action research in environmental management. In: Working together for environmental management: the role of information sharing and collaborative learning. Ph.D. (Development Studies), Massey University. Available from: http://learningforsustainability.net/research/thesis/thesis_ch3.html (accessed 28 May 2010).

BLANCHETT, H., POWIS, C. and WEBB, J. (2010) A guide to teaching information literacy: 101 tips. Facet Publishing.

LEIGH, M., MATHERS, E.L. and TOWLSON, K. (2009a) Using face-to-face sessions and focus groups to develop online support to enhance student information evaluation skills in VLE learning communities. SOLSTICE Conference: It’s all in the blend?, 4 June 2009. Edge Hill University, U.K.

LEIGH, M., MATHERS, E.L. and TOWLSON, K. (2009b) A tool for developing information evaluation skills in a Web 2.0 environment. HEA ICS Workshop: Implications of HE in a Web 2.0 World Report, 18 November 2009. De Montfort University, Leicester, U.K.

MATHERS, E.L. and LEIGH, M. (2008) Facilitators and barriers to developing learning communities. The Higher Education Academy Annual Conference: Transforming the student experience, 1–3 July 2008. Harrogate, U.K.

TOWLSON, K., LEIGH, M. and MATHERS, L. (2009) The Information Source Evaluation Matrix: a quick, easy and transferable content evaluation tool. SCONUL Focus, 47, pp. 15–18.