01/12/2014
Mutual Learning Exercise on Performance-Based Funding Systems
Clemente López Bote, Chief of the Coordination, Evaluation and Monitoring Unit
Spanish Agency for Research
106€/year
58 public universities, 30 private universities
[Organisational chart – SECRETARÍA DE ESTADO DE I+D+I:
• SG Coordinación de OPIS
• SECRETARÍA GENERAL DE CIENCIA E INNOVACIÓN
– DG Política de I+D+I: SG Coordinación, Planificación y Seguimiento; SG Relaciones Internacionales; SG Grandes Instalaciones Científico-Técnicas; SG Promoción de la Competitividad; SG Fomento de la Innovación
• AGENCIA
• CDTI]
[Organisational chart of the Agency – DIRECTOR:
• División de Programación y Gestión Económica y Administrativa: financial management; management of European-fund grants; administrative planning and management
• División de Coordinación, Evaluación y Seguimiento Científico y Técnico: ex ante evaluation; ex post evaluation (project grants); ex post evaluation (international, strength); monitoring and justification of grants
• Secretaría General]
Distribution of resources (100%) across areas: Life; Technology & Communication; Social Sciences/Humanities; Environment (35%, 23%, 31%, 12%)
State Program: 27 areas, 52 sub-areas; scientific managers (3-year term)
[Flow diagram – evaluation and selection process:
1-2. AEI (administrative unit) / evaluation unit: administrative review
3-4. Scientists: remote, highly specialised review of each proposal → individual report
5. Scientific manager: remote review of 20-30 projects of similar subject
6-7. Evaluation panel (comisión de evaluación): individual report with scores per section; prioritised list with proposed funding and proposed predoctoral contracts → ACTA-1 and PRP
8. Publication of results
9-10. Appeals (alegaciones): a working group delegated by the evaluation panel examines the appeals → ACTA
11. ACTA-2 and PRD; priority list communicated to the institution]
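The selection flow above – remote specialised reviews, a scientific manager's comparative review of 20-30 similar projects, and an evaluation panel producing a prioritised list – can be illustrated as a score-merging step. The score scales, the 50/50 weighting and the project data below are illustrative assumptions, not the Agency's actual rules.

```python
# Illustrative sketch of the selection flow: each project receives
# remote specialised review scores, the scientific manager adds a
# comparative score across 20-30 similar projects, and the panel ranks
# projects by a combined score. Scales and weighting are assumptions.
def combined_score(remote_scores, manager_score, w_remote=0.5):
    remote_avg = sum(remote_scores) / len(remote_scores)
    return w_remote * remote_avg + (1 - w_remote) * manager_score

def priority_list(projects):
    """projects: dict name -> (remote_scores, manager_score)."""
    return sorted(
        projects,
        key=lambda name: combined_score(*projects[name]),
        reverse=True,
    )

ranked = priority_list({
    "P1": ([8, 7], 9),
    "P2": ([9, 9], 6),
    "P3": ([5, 6], 7),
})
print(ranked)  # ['P1', 'P2', 'P3']
```

A cutting line ("financing the GOOD projects") would then be drawn at some position in this ranked list, depending on the available budget.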
[Chart: Spanish scientific production – number of documents and % of world total, 2003-2012]
[Chart: 15 most cited countries over 5 years (2009-13), n=9]
[Chart: success rate (% éxito), 2008-14, by area: AGL 743; SAF 684; ENE 216; CTQ 633; CGL 713; CTM 273; BFU 577]
Critical mass of active, high-quality researchers
How do we manage to reward the BEST according to their potential, and to select and stimulate the AVERAGE and/or YOUNG scientist?
1. Independent evaluation of projects from young scientists (within the same panel).
2. Limit participation of scientists in live national projects (incompatibility while the project is running).
3. Cutting line: financing the GOOD projects.
Competitive system: 40-50% acceptance
Scientific evaluation / selection process: excellence vs critical mass of research groups
[Funnel: 6,500 → 1,600 → 520 → 400 research projects]
Comisión Asesora de Investigación Científica y Técnica (CAICYT, 1958); Comisión Interministerial de Ciencia y Tecnología (CICYT, 1987)
National R&D and R&D&I Plans: 1988-1991, 1992-1995, 1996-1999, 2000-2003, 2004-2007, 2008-2011; Plan Estatal de Investigación Científica y Técnica y de Innovación 2013-2016
[Chart: EUROPEAN PATENTS – European patents of Spanish origin, and % of Spanish patents over total European patents]
Translational research: Crossing the valley of death. Nature 453, 840-842 (2008)
Generation of basic knowledge → creation and transfer of technology → technological development; investment and social effort → economic benefit, economic and social development (…… applied)
"In Spain we have many (and good) project results, but very few of them end up being used"
The Death Valley – http://www.sebbm.com/ (Jun 2014):
• El retorno social de la investigación científica (The social return of scientific research) – Fernández Lucio
• El nuevo contrato social de la ciencia (The new social contract of science) – Jordi Molas
• Investigación translacional e innovación médica (Translational research and medical innovation) – Oscar Llopis and Pablo D'Este
XXXVII Congreso
[Diagram: companies – society – knowledge generation – transfer – patents – applied research]
Tina Seelig
Challenges:
Research evaluation in Italy
Marco Malgarini
Senior Manager for Research Evaluation, ANVUR
PSF – Mutual Learning Exercise on Performance Based Funding Systems
Brussels, September 7th, 2016
• Many EU Member States have introduced PBFS in order to increase the effectiveness and performance of public research
• PBFS are expected to:
– Improve research performance
– Concentrate resources in the best organisations
• The evaluation of research underlying PBFS can be:
– Based on pure peer evaluation
– Indicator-based
– Based on informed peer review
Performance-based funding systems in general
• In Italy, public funding is mostly allocated at the organisational level (project funding plays a small role)
• Since 2011, organisational-level funding has been allocated competitively (previously as block funding), on the basis of a large-scale, ex post national evaluation exercise (VQR) administered by ANVUR
• Up to 30% of total research funding is allocated on the basis of the VQR
• An evaluation exercise was first carried out for the period 2000-2003, with limited impact on funding allocation
• The situation changed with the VQR 2004-10 and the new VQR 2011-14
Performance-based funding systems in Italy
• The main characteristics of the VQR 2011-2014 are the following:
– Evaluation is performed for 16 research Areas by groups of independent experts (GEV), which in turn appoint external reviewers to assess research quality
– Evaluation is based on the combined use of peer review and bibliometrics
– Evaluation makes use of bibliometric information concerning both the article and the journal in which it is published
– Evaluation concerns Universities and Departments, NOT individual researchers
– Individual evaluation results are NOT published
– ANVUR publishes the list of referees, but not the association between a referee and the publications she has assessed
– Evaluation results are used for funding purposes
Main features
Italian Scientific Areas of Research
• The 16 Areas according to which the evaluation is organised are:
• Bibliometric Areas:
– Mathematics (Area 1)
– Physics (Area 2)
– Chemistry (Area 3)
– Earth Sciences (Area 4)
– Biology (Area 5)
– Medicine (Area 6)
– Agricultural and Veterinary Sciences (Area 7)
– Civil Engineering (Area 8a)
– Industrial and Communication Engineering (Area 9)
– Psychology (Area 11a)
• Humanities and Social Sciences:
– Architecture (Area 8b)
– Humanities (Area 10)
– History and Philosophy (Area 11b)
– Law (Area 12)
– Economics and Statistics (Area 13) – partially bibliometric
– Social Sciences (Area 14)
Criteria against which research products are evaluated:
a) Originality: the degree to which the publication introduces a new way of thinking about the object of the research;
b) Methodological rigour: the degree to which the publication adopts an appropriate methodology and is able to present its results to peers;
c) Actual or potential impact: the level of influence – current or potential – that the research exerts on the relevant scientific community.
Evaluation criteria
• Each university professor presents 2 research products published in the period 2011-2014
• Each researcher in a Public Research Organisation (PRO) presents 3 research products
• We have received 96,066 products for evaluation
• Products admitted for evaluation are:
– Books
– Articles and review essays
– Other scientific publications, including compositions, designs, projects (architecture), performances, exhibitions, art objects, databases and software
– Patents
Publications to be presented
• Every publication is attributed a quality profile:
– Excellent (weight 1): first decile (top 10%) of the international distribution in the Area
– Good (weight 0.7): 10-30% segment of the international distribution in the Area
– Fair (weight 0.4): 30-50% segment of the international distribution in the Area
– Acceptable (weight 0.1): 50-80% segment of the international distribution in the Area
– Limited (weight 0): 80-100% segment of the international distribution in the Area
– Impossible to evaluate (weight 0): missing publications, or publications which are impossible to evaluate
• Quality profiles will be aggregated at the University/Department level in order to assess the overall research quality of the Institution
Quality profiles
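The weighted profiles can be aggregated in a straightforward way: sum the weights of all products an institution submitted. This is a minimal sketch of that aggregation idea, not ANVUR's exact funding formula; the example products are invented.

```python
# Minimal sketch: aggregate VQR quality profiles into an institution
# score. Weights follow the profiles listed above; the example data
# below are invented for illustration.
WEIGHTS = {
    "excellent": 1.0,
    "good": 0.7,
    "fair": 0.4,
    "acceptable": 0.1,
    "limited": 0.0,
    "not_evaluable": 0.0,
}

def institution_score(profiles):
    """Sum of product weights, plus the score normalised by the
    maximum attainable (all products rated excellent)."""
    total = sum(WEIGHTS[p] for p in profiles)
    return total, total / len(profiles)

score, normalised = institution_score(
    ["excellent", "good", "good", "fair", "limited"]
)
print(round(score, 2), round(normalised, 2))  # 2.8 0.56
```

The normalised form makes institutions of different sizes comparable, which matters when up to 30% of funding follows these scores.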
• In HSS, peer review prevails (with the only exception of Economics and Statistics)
• In the "hard" sciences, life sciences and engineering we use bibliometric evaluation, based on citations and impact factor, broken down by:
– Subject category
– Year of publication
– Document type (article & letter; review; proceeding)
• Groups of experts are responsible for choosing the exact final allocation to a quality profile, on the basis of the separate classification of articles according to IF and citations
• To avoid results differing widely among areas depending on the weight assigned to citations and impact, the bibliometric algorithm is calibrated on the basis of the joint distribution of citations and impact factor
• In this way, the 10-20-20-30-20 distribution at the basis of the definition of quality profiles is respected in each area
• The final evaluation of Departments and Universities depends on the results obtained with a combination of peer review, informed peer review and bibliometric methods
The evaluation method
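One way to picture the calibration step: each article gets a citation percentile and a journal-impact percentile within its subject category and year, and the combined value is mapped onto the 10-20-20-30-20 class boundaries. The sketch below simply averages the two percentiles; that combination rule is an illustrative assumption, since the actual rule is chosen and calibrated per Area by the GEV.

```python
# Illustrative sketch: map an article's citation and journal-impact
# percentiles (0 = bottom, 100 = top within its subject category and
# year) to a quality class using the 10-20-20-30-20 cut-offs.
# Averaging the two percentiles is an assumption, not the GEV rule.
CUTOFFS = [  # (minimum combined percentile, class)
    (90, "excellent"),   # top 10%
    (70, "good"),        # next 20%
    (50, "fair"),        # next 20%
    (20, "acceptable"),  # next 30%
    (0, "limited"),      # bottom 20%
]

def quality_class(citation_pct, impact_pct):
    combined = (citation_pct + impact_pct) / 2
    for minimum, label in CUTOFFS:
        if combined >= minimum:
            return label
    return "limited"

print(quality_class(95, 88))  # excellent: highly cited, high-impact journal
print(quality_class(40, 30))  # acceptable
```

Because the cut-offs are applied to percentiles computed within each subject category and year, every area reproduces the same 10-20-20-30-20 shape by construction.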
• Towards the definition of more homogeneous areas of evaluation (ERC)
• How to combine information about citations and journal impact
• How to treat self-citations (currently they are counted, and a signal is issued if they are above a certain threshold)
• Is it advisable to consider the number of authors? If so, how should one correct for it?
• Is there a systematic difference between the results obtained with peer and bibliometric evaluation?
• Is there any method, alternative to peer review, that can be used to support or substitute for the evaluation of research outputs in HSS?
– Journal classification
– The evaluation of books
Critical points for future evaluations
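The self-citation signal mentioned above can be sketched as a simple threshold rule. The author-overlap definition of a self-citation, the 20% threshold, and the example names are illustrative assumptions; the VQR only raises a signal, it does not automatically discount the citations.

```python
# Sketch of a self-citation signal: count citing papers whose author
# list overlaps the cited paper's authors, and flag the paper when
# that share exceeds a threshold. The 20% threshold and the author
# names are illustrative assumptions.
def self_citation_share(paper_authors, citing_author_lists):
    paper = set(paper_authors)
    self_cites = sum(1 for authors in citing_author_lists
                     if paper & set(authors))
    return self_cites / len(citing_author_lists)

def flag_self_citations(paper_authors, citing_author_lists, threshold=0.2):
    """True when the self-citation share exceeds the threshold."""
    return self_citation_share(paper_authors, citing_author_lists) > threshold

share = self_citation_share(
    ["Rossi", "Bianchi"],
    [["Rossi", "Verdi"], ["Neri"], ["Esposito"], ["Bianchi"]],
)
print(share)  # 0.5 - above the 20% threshold, so a signal would be issued
```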