
Fleeing from Frankenstein and meeting Kafka on the way: Algorithmic decision-making in higher education


Page 1: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

Image credit: https://www.flickr.com/photos/haydnseek/2534088367

By Paul Prinsloo (@14prinsp), University of South Africa (Unisa)

Presentation at NUI Galway, 22 September 2016

Fleeing from Frankenstein and meeting Kafka on the way: Algorithmic decision-making in higher education

Page 2: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

• This presentation provides some of my thoughts in a submission to the Special Issue of E-Learning and Digital Media with the theme “Learning in the age of algorithmic cultures.” Editors: Petar Jandrić, Jeremy Knox, Hamish Macleod and Christine Sinclair.

• I don’t own the copyright of any of the images in this presentation. I therefore acknowledge the original copyright and licensing regime of every image used.

This presentation (excluding the images) is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License

Acknowledgements

Page 3: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

My own positionality/location

Image credit: https://upload.wikimedia.org/wikipedia/commons/b/bf/Maze_01.svg

Page 4: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education


Image credit: http://www.northsideservices.com/wp-content/uploads/2015/11/Revolving_door-base.jpg

Persisting concerns that we have not solved the student departure/attrition question

Page 5: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education


Image credit: https://pixabay.com/en/stress-anxiety-depression-stressed-1084525/

Increased reporting to an ever-growing number of stakeholders (the ‘audit’ society), the need for evidence and the necessity to ensure the effective allocation of resources

Page 6: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education


Image credit: https://s-media-cache-ak0.pinimg.com/736x/5c/58/07/5c58072b5f003d7a69b129cb6f8055b6.jpg

Page 7: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

Student data as the ‘new black’, as oil, as a resource to be mined

Image credit: http://fpif.org/wp-content/uploads/2013/01/great-oil-swindle-peak-oil-world-energy-outlook.jpg

Page 8: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education


Image credit: https://en.wikipedia.org/wiki/Triage
Web page: http://www.chronicle.com/article/Are-Struggling-College/235311

Page 9: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education


How can we use algorithmic decision-making in higher education to ensure, on the one hand, caring, appropriate, affordable and effective learning experiences and, on the other hand, that we do so in a transparent, accountable and ethical way?

Page 10: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

Overview of this presentation

• The social imaginaries surrounding algorithms
• Mapping the potential of and issues surrounding algorithmic decision-making in the context of higher education, student success, data and the role of data scientists
• Exploring two frameworks (Danaher, 2015; Knox, 2010) for making sense of algorithmic decision-making in higher education
• Considering pointers for a way forward
• (In)conclusions

Page 11: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education


Image credits:
Frankenstein - https://www.amazon.com/Frankenstein-Mary-Shelley/dp/1512308056/ref=sr_1_4?s=books&ie=UTF8&qid=1474302559&sr=1-4&keywords=frankenstein
Mary Shelley - https://en.wikipedia.org/wiki/Frankenstein
The Trial - https://www.amazon.com/Trial-Franz-Kafka/dp/1612931030/ref=sr_1_1?ie=UTF8&qid=1474302647&sr=8-1&keywords=the+trial+kafka

Page 12: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

In the social imaginary, algorithms simultaneously repel and attract us

Image credit: https://pixabay.com/en/binary-code-man-display-dummy-face-1327512/

Page 13: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

While Artificial Intelligence (AI) “tools are producing compelling advances in complex tasks, with dramatic improvements in energy consumption, audio processing, and leukemia detection”, we are also faced with the reality that “AI systems are already making problematic judgements that are producing significant social, cultural, and economic impacts in people’s everyday lives” (Crawford and Whittaker, 2016, par. 1).

Page 14: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education


“Just as we learn our biases from the world around us, AI will learn its biases from us” (Collins, 2016)

Page 15: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

“Technology has no ethics” (Brin, 2016)

“Adapting to a new technology is like a love affair… The devices, apps and tools seduce us … and any doubts or fears we had melt away” (Ellen Ullman as quoted by Miller, 2013)

Page 16: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

Algorithms

“… encode human prejudice, misunderstanding, and bias into automatic systems that increasingly manage our lives. Like gods, these mathematical models are opaque, their workings invisible to all but the highest priests in their domain: mathematicians and computer scientists. Their verdicts, even when wrong or harmful, are beyond dispute or appeal. And they tend to punish the poor and the oppressed in our society, while making the rich richer” (O’Neil, 2016a, par. 14).

Page 17: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

Algorithmic imaginaries emerge from “a very specific economic and innovative culture” associated with Silicon Valley technology companies, and they privilege their originators’ “techno-euphoric interpretations of Internet technologies as driving forces for economic and social progress” (Mager, 2015, pp. 5-6)

Image credit: https://upload.wikimedia.org/wikipedia/commons/2/23/Firmen_im_Silicon_Valley.jpg

Page 18: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education


Image credits: Amazon.com

Page 19: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education


Image credits: Amazon.com

Page 20: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education


Image credits: Amazon.com

Page 21: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education


[Diagram: higher education in a changing, fluid landscape – student retention and success, data, data scientists, algorithms – set against a continuum from no ICTs, to individual and social well-being related to ICTs, to individual and social well-being dependent on ICTs]

Page 22: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education


The “in-betweenness” of technology (Danaher, 2016; Floridi, 2014)

[Diagram: data, data scientists and algorithms mapped against a continuum from no ICTs, to individual and social well-being related to ICTs, to individual and social well-being dependent on ICTs – corresponding to Humanity - Technology - Nature, Humanity - Technology - Technology, and Technology - Technology - Technology]

Page 23: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education


• Rationalisation, commercialisation and outsourcing
• Funding constraints – funding follows performance rather than preceding it
• Evidence-based management and the rise of the administrative university
• Our quantification fetish and obsession with data
• Increasingly online and digital, increasingly dependent on ICTs


Page 24: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education


The iron triangle in education (access – cost – quality):
• Impact on increased access
• Cost of teaching/support/care
• Quality

Page 25: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

What are the potential, challenges and ethical implications of using algorithms to address issues of cost, quality, access and care?

Page 26: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education


• Selecting students who applied for access to higher education – admission requirements. Criteria?

• Ensuring optimum ‘fit’ between individual student characteristics, background, aspirations and chosen program and/or courses

• Allocating resources appropriately, efficiently and ethically to individual students. Criteria?

• Ensuring completion in the ‘quickest’ possible time and return on investment

Page 27: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

What are the potential, challenges and ethical implications of using algorithms to admit students, choose ‘fit’, allocate resources and enable the most affordable, safest (and quickest) route to success?

Image credit: http://www.yourtango.com/201168184/facebook-relationship-status-what-does-its-complicated-mean

Page 28: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education


• Our data fetish – the more, the better
• The belief that data are neutral and our samples are [n=all]
• The belief that it is sufficient to spot patterns and not to understand/investigate the ‘why’
• The use of data points and variables as proxies and mistaking correlation for causation
• Our assumption that our students’ digital lives and clicks provide us with a holistic/complete picture. What about their non-digital lives? What about what we cannot measure?

Page 29: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education


• Data analysis as a “black art” (Floridi, 2012)
• Data scientists as the “high-priests of algorithms” (Dwoskin, 2014), the “engineer[s] of the future” (Van der Aalst, 2014), “rock stars and gods” (Harris, Murphy & Vaisman, 2013) and the “hottest” job title (Chatfield, Shlemoon & Redublado, 2014, p. 2)

Page 30: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education


Who are they … really? (Harris, Murphy and Vaisman, 2015) [sample 250]
• Data developers (strong on machine learning, programming, good overall)
• Data researchers (disproportionally strong in statistics)
• Data creatives (all-rounders – statistics, machine learning, programming)
• Data businesspersons (disproportionally strong in business, then stats)

Page 31: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education


Data analysis as séance, with the data scientist as interlocutor, scientist, charlatan, oracle and/or medium (Prinsloo, 2016)

Image credit: http://3.bp.blogspot.com/-VontR5jZTEE/U7Ga6mUcRvI/AAAAAAAAA7s/qNq_toHlh34/s1600/mesas-girantes.jpg

Page 32: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education


Image credit: https://encrypted-tbn1.gstatic.com/images?q=tbn:ANd9GcREGZ_tw83Gm-Ma-WC9-SVq8H4a20jJ6gKyLw3dyySJciCF52YC

Page 33: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

For each of four tasks – seeing, processing, acting and learning – ask “Yes or No?” at each of four levels of algorithmic involvement:
(1) Humans perform the task
(2) The task is shared with algorithms
(3) Algorithms perform the task under human supervision
(4) Algorithms perform the task with no human input

Danaher, J. (2015). How might algorithms rule our lives? Mapping the logical space of algocracy. [Web log post]. Retrieved from http://philosophicaldisquisitions.blogspot.com/2015/06/how-might-algorithms-rule-our-lives.html

Human-algorithm interaction in the collection, analysis and use of student data
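Read as a checklist, Danaher's matrix lends itself to a simple data structure: record, for each of the four tasks, which of the four levels of algorithmic involvement applies, and flag where humans have dropped out entirely. The Python sketch below is a purely illustrative addition, not part of the original slides; the level codes, the `flag_unsupervised` function and the example pipeline are assumptions made for this illustration.

```python
# Hypothetical sketch (not from the slides): Danaher's (2015) matrix as an audit checklist.
# Involvement levels: 1 = humans perform the task, 2 = task shared with algorithms,
# 3 = algorithms perform the task under human supervision,
# 4 = algorithms perform the task with no human input.

TASKS = ("seeing", "processing", "acting", "learning")

def flag_unsupervised(profile: dict) -> list:
    """Return the tasks that a pipeline delegates entirely to algorithms (level 4)."""
    return [task for task in TASKS if profile.get(task, 1) == 4]

# Assumed example: data collection and modelling are fully automated, interventions
# ("acting") stay with human advisers, and models are retrained under supervision.
pipeline = {"seeing": 4, "processing": 4, "acting": 2, "learning": 3}

if __name__ == "__main__":
    print("Tasks with no human input:", flag_unsupervised(pipeline))
```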

Page 34: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

Preliminary seven dimensions of surveillance

(Knox, 2010)

1. Automation
2. Visibility
3. Directionality
4. Assemblage
5. Temporality
6. Sorting
7. Structuring

Page 35: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

1. Automation

Key questions and dimensional intensity (from low to high):
• What is the timing of the collection? From intermittently/infrequently to continuous
• Locus of control? From human to machine
• Can it be turned on and off (and by whom)? From all the monitoring can be turned on/off to none of the monitoring can be turned off

Page 36: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

2. Visibility

Key questions and dimensional intensity (from low to high):
• Is the surveillance apparent and transparent? From all parts (collection, storage, processing and viewing) are visible to none of the monitoring is visible
• Ratio of self-to-surveillant knowledge? From the subject knows everything the surveillant knows to the subject does not know anything that the surveillant knows

Page 37: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

3. Directionality

Key questions and dimensional intensity (from low to high):
• What is the relative power of surveillant to subject? From subjects hold all the power to the surveillant holds all the power
• Who has access to monitoring/recording/broadcasting functions? From subjects to the surveillant

Page 38: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

4. Assemblage

Key questions and dimensional intensity (from low to high):
• Medium of surveillance? From a single medium (e.g. text) to multimedia
• Are the data stored? From no to yes
• Who stores the data? From the subject or collector to a third party

Page 39: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

5. Temporality

Key questions and dimensional intensity (from low to high):
• When does the monitoring occur? From confined to the present to combining the present with the past
• How long is the monitoring frame? From one isolated, relatively short frame (e.g. a test) to long periods, or indefinitely
• Does the system attempt to predict future behavior/outcomes? From no (only assessment of the present) to the present and past used to predict the future
• When are the data available? From all of the data available only after the event is completed to available in real time and experienced as instantaneous

Page 40: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

6. Sorting

Key question and dimensional intensity (from low to high):
• Are subjects’ data compared with other data – other individuals/groups/abstract configurations/state mandates? From none to other data are used as basis for comparison

Page 41: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

7. Structuring

Key questions and dimensional intensity (from low to high):
• Are data used to alter the environment (i.e. treatment, experience, etc.)? From not used to used to alter the environment of all subjects
• Are data used to target the subject for different treatment than they would otherwise receive? From no data are used as basis for differing treatment to treatment is prescribed based on the data
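As a hypothetical illustration only (this is not from Knox, 2010), the seven dimensions above can be treated as a rubric: rate each dimension from 0 (low intensity) to 1 (high intensity) along the poles in the tables, then aggregate the ratings to compare learning-analytics deployments. The dimension names come from the slides; the scoring scheme, the `surveillance_intensity` function and the example scores below are assumptions.

```python
# Hypothetical rubric sketch (not from Knox, 2010): rate each dimension of surveillance
# from 0.0 (low intensity, e.g. visible, subject-controlled, present-only) to
# 1.0 (high intensity, e.g. continuous, invisible, predictive, third-party stored).

DIMENSIONS = ("automation", "visibility", "directionality", "assemblage",
              "temporality", "sorting", "structuring")

def surveillance_intensity(scores: dict) -> float:
    """Average the per-dimension ratings into a single 0-1 intensity indicator."""
    missing = set(DIMENSIONS) - set(scores)
    if missing:
        raise ValueError(f"Missing dimension scores: {sorted(missing)}")
    return sum(scores[d] for d in DIMENSIONS) / len(DIMENSIONS)

# Assumed example: an always-on LMS analytics dashboard that students cannot switch off.
lms_dashboard = {"automation": 0.9, "visibility": 0.7, "directionality": 0.8,
                 "assemblage": 0.6, "temporality": 0.8, "sorting": 0.7, "structuring": 0.5}

if __name__ == "__main__":
    print(f"Estimated surveillance intensity: {surveillance_intensity(lms_dashboard):.2f}")
```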

Page 42: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education


[Diagram repeated: higher education in a changing, fluid landscape – student retention and success, data, data scientists, algorithms]

Page 43: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

The way forward: some considerations

• Human oversight over algorithmic design and application: ‘straddling’ “the boundary between resistance and accommodation” (Danaher, 2016, p. 19)

• Epistemic enhancement of humans
• Sousveillance
• Individual (integrative and non-integrative) partnerships with algorithms
(Danaher, 2016)

Page 44: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education


“A simplistic over-reliance on algorithms is heavily flawed.”

Web page: http://www.nature.com/polopoly_fs/1.20653!/menu/main/topColumns/topLeftColumn/pdf/537449a.pdf

Page 45: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

The way forward: some considerations

• Does the idea violate the human rights of anyone involved?

• Does this idea substitute human relationships with machine relationships?

• Does this idea put efficiency over humanity?

• Does this idea put economics and profits over the most basic human ethics?

• Does this idea automate something that should not be automated? (Leonhard, 2016)

Page 46: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education


• If “technology has no ethics” (Brin, 2016), then we should not rely on ethics, considering human history and “taking perspective from the long ages of brutal, feudal darkness endured by our ancestors.”

• It is not that ethics are unimportant or that frameworks serve no purpose, but to what extent do ethics and frameworks affect “the worst human predators, parasites and abusers?”

• “Ethics are the mirror in which we evaluate ourselves and hold ourselves accountable” (emphasis added). Holding actors and humans accountable still works “better than every single other system ever tried.”

The way forward: some considerations

Page 47: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

The way forward (some pointers)

• “Solid values and self-regulation rein in only the scrupulous” (p. 206)

• We need laws but we must “reevaluate our metrics of success…. What should we be counting?” (p. 206). What are the hidden costs and non-numerical values?

• Any data collected must be based on opt-in
• We need to question the assumptions, limitations and implications of the proxies we use (e.g. zip codes)
• The role and scope of transparency, openness, accountability and audits. Also audit fairness…
• There are some things we cannot and should not attempt to model (e.g. teacher effectiveness)

Page 48: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

The way forward (some pointers)

• While we have to be concerned about efficiency, it is not the only question we should ask, and most probably not the most important. We also need to consider whether a curriculum, pedagogy and assessment are appropriate (Biesta, 2007, 2010)

• Education is an open and recursive system (Biesta 2007, 2010), where student success is the result of a dynamic interaction between context, students, institutional efficiencies, epistemological access, resources and support at a particular moment in time (Subotzky & Prinsloo, 2011)

• “Who benefits from algorithmic education technology? How? Whose values and interests are reflected in its algorithms?” (Watters, 2016; emphasis added)

Page 49: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

The way forward (some pointers)

• Rule 1: Do no harm.
• Rule 2: Read rule 1.
• Students have a right to know who designs our algorithms, for what purposes, using what data, how they are affected, and to make an informed decision to opt in
• Provide students access to the information and data held about them, to verify and/or question the conclusions drawn and, where necessary, provide context
• Provide access to a neutral ombudsperson
• Ethical oversight? Accountability?

(See Prinsloo & Slade, 2015; Slade & Prinsloo, 2013; Willis, Slade & Prinsloo 2016)

Page 50: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

(In)conclusions

Who benefits, under what conditions, what are the (un)intended consequences, and how and who will keep us accountable and transparent?

Page 51: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

Thank you

Paul Prinsloo
Research Professor in Open Distance Learning (ODL)
College of Economic and Management Sciences, Office number 3-15, Club 1, Hazelwood, P O Box 392, Unisa, 0003, Republic of South Africa
T: +27 (0) 12 433 4719 (office)
T: +27 (0) 82 3954 113 (mobile)
[email protected]

Personal blog: http://opendistanceteachingandlearning.wordpress.com
Twitter profile: @14prinsp

Page 52: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

References and additional reading

Ball, N. (2013, November 11). Big Data follows and buries us in equal measure. [Web log post]. Retrieved from http://www.popmatters.com/feature/175640-this-so-called-metadata/

Beauchamp T. L., & Childress J.F. (2001). Principles of Biomedical Ethics. (5th ed). Oxford: Oxford University Press.

Bergstein, B. (2013, February 20). The problem with our data obsession. MIT Technology Review. Retrieved from https://www.technologyreview.com/s/511176/the-problem-with-our-data-obsession/

Bertolucci, J. (2014, July 28). Deep data trumps Big Data. Information Week. Retrieved from http://www.informationweek.com/big-data/big-data-analytics/deep-data-trumps-big-data/d/d-id/1297588

Biesta, G. (2007). Why “what works” won’t work: evidence-based practice and the democratic deficit in educational research, Educational Theory, 57(1), 1–22. DOI: 10.1111/j.1741-5446.2006.00241.x

Page 53: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

References and additional reading (cont.)

Biesta, G. (2010). Why ‘what works’ still won’t work: from evidence-based education to value-based education, Studies in Philosophy of Education, 29, 491–503. DOI 10.1007/s11217-010-9191-x.

Booth, M. (2012, July 18). Learning analytics: the new black. EDUCAUSEreview, [online]. Retrieved from http://www.educause.edu/ero/article/learning-analytics-new-black

boyd, D., & Crawford, K. (2013). Six provocations for Big Data. Retrieved from http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1926431

Citron, D.K., & Pasquale, F. (2014). The scored society: Due process for automated predictions. http://ssrn.com/abstract=2376209

Collins, N. (2016, September 1). Artificial Intelligence will be as biased and prejudiced as its human creators. Pacific Standard. Retrieved from https://psmag.com/artificial-intelligence-will-be-as-biased-and-prejudiced-as-its-human-creators-38fe415f86dd#.p15q3xmow

Crawford, K. (2014, May 30). The anxieties of Big Data. The New Inquiry. Retrieved from http://thenewinquiry.com/essays/the-anxieties-of-big-data

Page 54: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

References and additional reading (cont.)

Crawford, K., & Whittaker, M. (2016, September 12). Artificial intelligence is hard to see. Why we urgently need to measure AI’s societal impacts. [Web log post]. Medium. Retrieved from https://medium.com/@katecrawford/artificial-intelligence-is-hard-to-see-a71e74f386db#.wi7sq5l3a

Danaher, J. (2014, January 6). Rule by algorithm? Big Data and the threat of algocracy.[Web log post]. Retrieved from http://philosophicaldisquisitions.blogspot.com/2014/01/rule-by-algorithm-big-data-and-threat.html

Danaher, J. (2015, June 15). How might algorithms rule our lives? Mapping the logical space of algocracy. [Web log post]. Retrieved from http://philosophicaldisquisitions.blogspot.co.za/2015/06/how-might-algorithms-rule-our-lives.html

Dascalu, M. I., Bodea, C. N., Mihailescu, M. N., Tanase, E. A., & Ordoñez de Pablos, P. (2016). Educational recommender systems and their application in lifelong learning. Behavior & Information Technology, 35(4), 290-297.

Page 55: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education


References and additional reading (cont.)

de Freitas, S., Gibson, D., Du Plessis, C., Halloran, P., Williams, E., Ambrose, M., Dunwell, I., & Arnab, S. (2015). Foundations of dynamic learning analytics: Using university student data to increase retention. British Journal of Educational Technology, 46(6), 1175-1188.

Diakopoulos, N. (2014). Algorithmic accountability. Digital Journalism. DOI: 10.1080/21670811.2014.976411

Diefenbach, T. (2007). The managerialistic ideology of organisational change management, Journal of Organisational Change Management, 20(1), 126 — 144.

Doctorow, C. (2016, September 15). Rules for trusting "black boxes" in algorithmic control systems. Retrieved from http://boingboing.net/2016/09/15/rules-for-trusting-black-box.html

Domingos, P. (2015). The master algorithm. How the quest for the ultimate learning machine will remake our world. New York, NY: Perseus Books.

Drachsler, H., Hummel, H. G., & Koper, R. (2008). Personal recommender systems for learners in lifelong learning networks: the requirements, techniques and model. International Journal of Learning Technology, 3(4), 404-423.

Page 56: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

References and additional reading (cont.)

Eubanks, V. (2014, January 15). Want to predict the future of surveillance? Ask poor communities. The American Prospect. Retrieved from http://prospect.org/article/want-predict-future-surveillance-ask-poor-communities

Feldstein, M. (2012, May 6) What is machine learning good for? [Web log post]. Retrieved from http://mfeldstein.com/what-is-machine-learning-good-for/

Ferguson, R., Brasher, A., Clow, D., Griffiths, D., & Drachsler, H. (2016). Learning analytics: visions of the future. Paper delivered at the 6th International Learning Analytics and Knowledge (LAK) Conference, 25-29 April, Edinburgh, Scotland. Retrieved from http://oro.open.ac.uk/45312/

Fleming, (2016, April 1). Artificial intelligence and machine learning in education – a glimpse of what that might mean. Microsoft. Retrieved from https://blogs.msdn.microsoft.com/education/2016/04/01/how-will-your-staff-or-students-use-this/

Floridi, L. (2012). Big data and their epistemological challenge. Philosophy & Technology, 1-3.

Gitelman, L. (ed.). (2013). “Raw data” is an oxymoron. London, UK: MIT Press.

Page 57: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

References and additional reading (cont.)

Grosz, B.J., Altman, R., Horvitz, E., Mackworth, A., Mitchell, T., Mulligan, D., & Shoham, Y. (2016). One hundred year study on Artificial Intelligence. Artificial Intelligence and life in 2030. Stanford University. Retrieved from https://ai100.stanford.edu/sites/default/files/ai_100_report_0831fnl.pdf

Hartfield, T. (2015, May 12 ). Next generation learning analytics: Or, how learning analytics is passé. [Web log post]. Retrieved from http://timothyharfield.com/blog/2015/05/12/next-generation-learning-analytics-or-how-learning-analytics-is-passe/

Hartley, D. (1995). The ‘McDonaldisation’ of higher education: food for thought? Oxford Review of Education, 21(4), 409—423.

Henman, P. (2004). Targeted!: Population segmentation, electronic surveillance and governing the unemployed in Australia. International Sociology, 19, 173-191

Howells, C. (2016, February 15). Can algorithms replace academics? Insead Knowledge. Retrieved from http://knowledge.insead.edu/operations/can-algorithms-replace-academics-4518

Page 58: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

References and additional reading (cont.)

Johnson, J.A. (2015, October 7). How data does political things: The processes of encoding and decoding data are never neutral. [Web log post]. Retrieved from http://blogs.lse.ac.uk/impactofsocialsciences/2015/10/07/how-data-does-political-things/

Joynt, G.M., & Gomersall, C.D. (2005). Making moral decisions when resources are limited – an approach to triage in ICU patients with respiratory failure. South African Journal of Critical Care (SAJCC), 21(1), 34—44. Retrieved from http://www.ajol.info/index.php/sajcc/article/view/35543

Kitchin, R. (2013). Big data and human geography: opportunities, challenges and risks. Dialogues in Human Geography, 3, 262-267. DOI: 10.1177/2043820613513388

Kitchin, R. (2014). The data revolution. London, UK: SAGE.

Kitchin, R., & McArdle, G. (2016). What makes Big Data, Big Data? Exploring the ontological characteristics of 26 datasets. Big Data & Society, January-June, 1-10. DOI: 10.1177/2053951716631130

Knox, D. (2010). Spies in the house of learning: a typology of surveillance in online learning environments. Paper presented at Edge, Memorial University of Newfoundland, Canada, 12-15 October.

Page 59: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

References and additional reading (cont.)

Lagoze, C. (2014). Big Data, data integrity, and the fracturing of the control zone. Big Data & Society (July-December), 1-11.

Leonhard, G. (2016). Technology vs. humanity: The coming clash between man and machine. Fast Future Publishing Ltd.

Mager, A. (2012). Algorithmic ideology: How capitalist society shapes search engines. Information, Communication & Society, 15(5), 769-787.

Mager, A. (2015). Glocal search: Search technology at the intersection of global capitalism and local socio-political cultures. Vienna: Institute of Technology Assessment (ITA), Austrian Academy of Sciences. Retrieved from http://www.astridmager.net/wp-content/uploads/2015/11/Abschlussbericht-OeNB_Mager.pdf

Mayer-Schönberger, V. (2009). Delete. The virtue of forgetting in the digital age. Princeton, NJ: Princeton University Press.

Mayer-Schönberger, V., & Cukier, K. (2013). Big data. London, UK: Hachette.

Miller, C.C. (2013, August 24). Addicted to apps. The New York Times. Retrieved from http://www.nytimes.com/2013/08/25/sunday-review/addicted-to-apps.html

Page 60: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

References and additional reading (cont.)

Miller, C. C. (2015, July 9). When algorithms discriminate. The New York Times. Retrieved from http://www.nytimes.com/2015/07/10/upshot/when-algorithms-discriminate.html

Morozov, E. (2013a, October 23). The real privacy problem. MIT Technology Review. Retrieved from http://www.technologyreview.com/featuredstory/520426/the-real-privacy-problem/

Morozov, E. (2013b). To save everything, click here. London, UK: Penguin Books.

Napoli, P. (2013). The algorithm as institution: Toward a theoretical framework for automated media production and consumption. In Media in Transition Conference (pp. 1–36). DOI: 10.2139/ssrn.2260923

Nissenbaum, H. (2015). Respecting context to protect privacy: Why meaning matters. Science and engineering ethics. Retrieved from http://link.springer.com/article/10.1007/s11948-015-9674-9

Open University. (2014). Policy on ethical use of student data for learning analytics. Retrieved from http://www.open.ac.uk/students/charter/essential-documents/ethical-use-student-data-learning-analytics-policy

Page 61: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

References and additional reading (cont.)

Manning, C. (2012, March 14). Educational triage. [Web log post]. Retrieved from http://colinmcit.blogspot.co.uk/2012/03/educational-triage.html.

Markoff, J. (2015). Machines of loving grace: The quest for common ground between humans and robots. New York, NY: HarperCollins Publishing.

Merceron, A., Blikstein, P., & Siemens, G. (2016). Learning analytics: from Big Data to meaningful data. Journal of Learning Analytics, 2(3), 4-8.

Muñoz, C., Smith, M., & Patil, D.J. (2016, May). Big data: A report on algorithmic systems, opportunity, and civil rights. Executive Office of the President. Retrieved from https://www.whitehouse.gov/sites/default/files/microsites/ostp/2016_0504_data_discrimination.pdf

O’Neil, C. (2016a, September 1). How algorithms rule our working lives. The Guardian. Retrieved from https://www.theguardian.com/science/2016/sep/01/how-algorithms-rule-our-working-lives

O’Neil, C. (2016b). Weapons of math destruction. How big data increases inequality and threatens democracy. UK: Allen Lane.

Page 62: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

References and additional reading (cont.)

Pardo, A., & Siemens, G. (2014). Ethical and privacy principles for learning analytics. British Journal of Educational Technology, 45(3), 438-450.

Pasquale, F. (2015, October 14). Scores of scores: how companies are reducing consumers to single numbers. The Atlantic. Retrieved from http://www.theatlantic.com/business/archive/2015/10/credit-scores/410350/

Pasquale, F. [FrankPasquale]. (2016, February 19). "We know where you are. We know where you’ve been. We can more or less know what you're thinking about. http://www.theatlantic.com/technology/archive/2016/02/google-cute-evil/463464/ … #Jigsaw [Tweet]. Retrieved from https://twitter.com/FrankPasquale/status/700473628605947904

Pasquale, F. (2015). The black box society. Cambridge, MA: Harvard University Press.

Perrotta, C., & Williamson, B. (2016). The social life of Learning Analytics: cluster analysis and the ‘performance’ of algorithmic education. Learning, Media and Technology, 1-14.

Prinsloo, P. (2009). Modelling throughput at Unisa: The key to the successful implementation of ODL. Retrieved from http://uir.unisa.ac.za/handle/10500/6035

Page 63: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

References and additional reading (cont.)

Prinsloo (2016). Evidence-based decision making as séance: implications for learning and student support. In Jan Botha & Nicole Muller (eds.), Institutional Research in support of evidence-based decision-making in Higher Education in Southern Africa. Stellenbosch, South Africa: SUN Media. In press.

Prinsloo, P., Archer, E., Barnes, G., Chetty, Y., & Van Zyl, D. (2015). Big (ger) data as better data in open distance learning. The International Review of Research in Open and Distributed Learning, 16(1).

Prinsloo, P., & Slade, S. (2014). Educational triage in higher online education: walking a moral tightrope. International Review of Research in Open Distributed Learning (IRRODL), 14(4), pp. 306-331. http://www.irrodl.org/index.php/irrodl/article/view/1881.

Prinsloo, P., & Slade, S. (2015, March). Student privacy self-management: implications for learning analytics. In Proceedings of the Fifth International Conference on Learning Analytics And Knowledge (pp. 83-92). ACM. Retrieved from http://dl.acm.org/citation.cfm?id=2723585

Prinsloo, P., & Slade, S. (2016a). Student vulnerability, agency, and learning analytics: an exploration. Journal of Learning Analytics, 3(1), 159-182.

Page 64: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

References and additional reading (cont.)

Prinsloo, P., & Slade, S. (2016b). Here be dragons: Mapping student responsibility in learning analytics, in Mark Anderson and Collette Gavan (eds.), Developing Effective Educational Experiences through Learning Analytics (pp. 174-192). Hershey, Pennsylvania: ICI-Global.

Sclater, N., Peasgood, A., & Mullan, J. (2016). Learning analytics in higher education. JISC. Retrieved from https://www.jisc.ac.uk/sites/default/files/learning-analytics-in-he-v3.pdf

Selwyn, N. (2014). Distrusting educational technology. Critical questions for changing times. New York, NY: Routledge

Slade, S., & Prinsloo, P. (2013). Learning analytics: ethical issues and dilemmas. American Behavioral Scientist, 57(1) pp. 1509–1528.

Slade, S., & Prinsloo, P. (2015). Student perspectives on the use of their data: between intrusion, surveillance and care. European Journal of Open, Distance and Elearning. (pp.16-28). Special Issue. http://www.eurodl.org/materials/special/2015/Slade_Prinsloo.pdf

Subotzky, G., & Prinsloo, P. (2011). Turning the tide: a socio-critical model and framework for improving student success in open distance learning at the University of South Africa. Distance Education, 32(2), 177-193.

Page 65: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

References and additional reading (cont.)

Tene, O. & Polonetsky, J. (2013). Judged by the Tin Man: Individual rights in the age of Big Data. J. on Telecomm. & High Tech. L., 11, 351.

Totaro, P., & Ninno, D. (2014). The concept of algorithm as an interpretive key of modern rationality. Theory Culture Society 31, pp. 29—49. DOI: 10.1177/0263276413510051

Uprichard, E. (2013, October 1). Big data, little questions. Discover Society. Retrieved from http://discoversociety.org/2013/10/01/focus-big-data-little-questions/

Vander Ark, T. (2015, November 25). 8 ways machine learning will improve education. [Web log post]. Retrieved from http://blogs.edweek.org/edweek/on_innovation/2015/11/8_ways_machine_learning_will_improve_education.html

Wang, T. (2013, January 20). Why Big Data needs thick data. Medium. Retrieved from https://medium.com/ethnography-matters/why-big-data-needs-thick-data-b4b3e75e3d7#.4jbatgurh

Watters, A. (2013, October 13). Student data is the new oil: MOOCs, metaphor, and money. [Web log post]. Retrieved from http://www.hackeducation.com/2013/10/17/student-data-is-the-new-oil/

Watters, A. (2014). Social justice. [Web log post]. Retrieved from http://hackeducation.com/2014/12/18/top-ed-tech-trends-2014-justice

Page 66: Fleeing from Frankenstein and meeting Kafka on  the way: Algorithmic  decision-making in  higher education

References and additional reading (cont.)

Wigan, M.R., & Clarke, R. (2013). Big data’s big unintended consequences. Computer, (June), 46-53.

Williamson, B. (2016). Silicon startup schools: technocracy, algorithmic imaginaries and venture philanthropy in corporate education reform. Critical Studies in Education, 1-19.

Willis, J. E., Slade, S., & Prinsloo, P. (2016). Ethical oversight of student data in learning analytics: A typology derived from a cross-continental, cross-institutional perspective. Educational Technology Research and Development. DOI: 10.1007/s11423-016-9463-4 Retrieved from http://link.springer.com/article/10.1007/s11423-016-9463-4