1
Quality Assurance in Italian Universities
Gianfranco Rebora, University Cattaneo – LIUC (Italy)
Matteo Turri, University of Milan (Italy)
EURAM 2010
Back to the Future
19-22 May 2010 – Tor Vergata, Rome
2
This paper aims:
• to review the main events in the development of evaluation activities in Italian universities from 1993 to the present day
• to explain the reasons for the poor results of a substantial collective effort
• to identify relevant evidence, also from the perspective of public management and governance
3
QA / Evaluation
• Not a very specific concept…
• … it encompasses different approaches, methodologies and practices relating to the definition, development and assessment of quality, aiming to improve the ability of institutions, staff and students to meet HE goals (which are debated, not fixed beforehand)
4
2007-2009: Bologna Process Stocktaking Report 2009
• Stage of development of external QA system: 3
• Level of student participation in QA: 2
• Level of international participation in QA: 2
Report from working groups appointed by the Bologna Follow-up Group to the Ministerial Conference in Leuven/Louvain-la-Neuve, 28-29 April 2009
5
Why do results seem poor? (QA in the Italian experience)
1993-2009:
• Evaluation and QA activities and practices have been growing at a fast rate
But results seem poor:
- Accountability?
- Improvement?
6
Conceptual framework
[Diagram] Components of an evaluation system — Idea, Methods, Bodies, Use — with institutional and organisational impact (knowledge/learning, improvement, accountability), subject to inertia, opportunism and unforeseen or undesired effects.
7
The critical point: how evaluation is used
• The "actual use", according to Patton (1997, p. 20), is the best way to understand the value of evaluation activities and the efforts dedicated to it. We considered:
• Generating knowledge
• Improvement
• Accountability
• Non-use, given that in some cases no use is made of evaluation output
8
The Italian QA story: three stages
• 1993-1998: start up
• 1999 – 2006: a surfeit of information and very few results
• 2007-2009: stalemate
9
Bodies and kind of use over the three stages (1993-1998 start up; 1999-2006 information diffused; 2006-2009 stalemate):

Evaluation Units
- 1993-1998: Instituted with the aim of verifying "the administration of state funding, productivity in teaching and research"
- 1999-2006: Charged with developing the assessment techniques established by the CNVSU and with producing an annual report
- 2006-2009: Continuing activities

OVSU / CNVSU / ANVUR
- 1993-1998: OVSU collects extensive information
- 1999-2006: CNVSU replaced the OVSU: collecting data, consultancy and assistance with ministerial requirements
- 2006-2009: Merger with CIVR; ANVUR instituted by law in 2006, with its start-up delayed to 2010

CIVR
- 1993-1998: –
- 1999-2006: In charge of research evaluation (exercise 2001-03)
- 2006-2009: Starts new exercise 2004-08 (in 2010)
10
How QA has been used: an overview
Kind of use across the three stages (1993-1998 start up; 1999-2006 information diffused; 2006-2009 stalemate):
• Knowledge: growing and diffused knowledge
• Accountability: this use of evaluation "causes embarrassment" because it provides justification for university government actions that cannot be taken
• Improvement: the path of improvement has been interrupted (in the research field)
• No use: no use of evaluation prevails – risk is avoided
11
1999-2006: a surfeit of information but very few results
Universities interpreted this greater autonomy by increasing the number of degree courses and by adopting opportunistic behaviour, such as multiplying the competitions for lecturing positions and at times handling career advancement in an underhand way
13
2007-2009: stalemate
• 2007: a law decided the merger of the CNVSU and CIVR and the setting up of the ANVUR
• 2008: the new government modified the set-up of ANVUR
• 2009: the new rules are nearing final approval, but the Agency has not yet begun its activity
• 2010: a new research evaluation exercise (2004-2008) is now starting; CIVR is still in charge of it
• Universities must now manage significant cuts in their budgets, with the risk that future rules about evaluation will increase the pressure for compliance and conformity
14
Lessons learned
1. Emphasising (stressing) the USE of evaluation helps to understand things
2. The field of higher education is open/very receptive to evaluation:
   1. International (European) drive
   2. Availability of core competences
   3. Culture/past experience of research
3. Governance matters:
   1. University system governance
   2. Institution level
4. Two different visions are in conflict:
   1. Administrative/bureaucratic
   2. Professional
15
3. Governance problems have a negative impact on the use of evaluation:
– University system level
– HE institutions level
16
At University system level
• An administrative/bureaucratic approach prevails over a more substantial and professional one
• Minimum quality requisites are given a very formal interpretation
• Lack of resources/budget and political events affect the continuity of QA practices: e.g. CIVR/VTR after 2006
17
At HEIs level
A weak type of governance like the one in Italian universities is uninterested in evaluation, because evaluation involves making decisions that no one has the strength to make:
• Several rectors saw evaluation as a stimulus and tool for governance, but then had difficulty in finding the consensus necessary for re-election
• A sudden reduction in the autonomy of evaluation units is, however, quite common after the election of a new rector
(Minelli E., Rebora G., Turri M. (2008). How can evaluation fail? The case of Italian universities. Quality in Higher Education, vol. 14)
18
Mismatch between macro-level initiatives in QA and needs experienced at the micro level
We can remove this mismatch by:
– establishing a threshold in order to prevent the diffusion of weak higher education initiatives
– abolishing rules that impose specific organisational patterns and limit strategies of differentiation
– providing rules that university leaders (rectors, deans and the various coordinators of teaching activities) can use to validate and strengthen their strategic choices and their governance structure
– promoting the autonomy and a more competent, professional approach of the central bodies and agencies operating in the field of HE
19
How QA has been used: an overview
Kind of use over the three stages (1993-1998 start up; 1999-2006 information diffused; 2006-2009 stalemate):

Knowledge
- 1993-1998: Predominant
- 1999-2006: Extended; most frequent use
- 2006-2009: Most frequent use

Accountability
- 1993-1998: –
- 1999-2006: The attempt to use quantitative data to influence decision-making and the allocation of funds has failed
- 2006-2009: Decisions are postponed

Improvement
- 1993-1998: Lack of involvement of universities in self-evaluation procedures that could generate learning
- 1999-2006: VTR stimulated various disciplines to discuss and define their criteria of excellence for research
- 2006-2009: Lack of involvement continues; the VTR was interrupted

No use
- 1999-2006 and 2006-2009: Abundance of bulky and generic documents which are not used, as most teaching staff are unaware that they exist