Bibliometric Indicators of Research Excellence: Their Uses and Limitations
IREG Forum 2017 on Academic Rankings, Doha, Qatar, 12-14 March 2017
Malgorzata Krasowska, Government Proposition Manager
Bibliometric Indicators for Research Performance: Some History
How Has Excellence in Research Been Judged? Peer Review – From the 17th Century to Today
• 1665: Henry Oldenburg creates Philosophical Transactions, but no referee system is used.
• 1699: France’s Royal Academy of Sciences is authorized to approve books for publication.
• 1752: The Royal Society establishes a committee to vote on what to publish in Philosophical Transactions.
• 1831: Cambridge University Professor William Whewell convinces the Royal Society to commission public reports on manuscripts.
Alex Csiszar, “Peer Review: Troubled from the Start,” Nature, 532 (7599): 306-308, 19 April 2016
-- Richard Smith, former Editor, British Medical Journal
A Different Approach to Assessing the Impact and Importance of Research Findings: Citation Measures
“If you can measure that of which you speak, and can express it by a number, you know something of your subject; but if you cannot measure it, your knowledge is meager and unsatisfactory.”
— William Thomson, Lord Kelvin (1824-1907)
“By using authors' references in compiling the citation index, we are in reality utilizing an army of indexers, for every time an author makes a reference he is in effect indexing that work from his point of view.”
— Eugene Garfield (1925-2017), pioneer of citation indexing for the scientific and scholarly literature and creator of the Science Citation Index (now the Web of Science), which revolutionized information retrieval
Research Evaluation Combining Qualitative and Quantitative Views: Peer Review and Citation Analysis
Peer Review
• Small-scale, ground-up view
• Absolute counts; size measures color perceptions and judgments
• Affected by work done long ago

Publication & Citation Analysis
• Global, top-down view
• Weighted and relative measures
• Can reveal more recent contributions
Measuring Meaningfully: Different Indicators for Different Perspectives of a Multidimensional Activity
Bibliometric Basics: Measures of Output and Impact, as well as Size-dependent and Size-independent Indicators
Ludo Waltman, “A Review of the Literature on Citation Impact Indicators,” Journal of Informetrics, 10 (2): 365-391, 2016, Table 2, page 371.
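The distinction Waltman's taxonomy draws can be illustrated with a small sketch (not from the slides; the university names and citation counts are invented): a size-dependent indicator such as total citations grows with publication output, while a size-independent indicator such as citations per paper normalizes for it, so the two can rank the same institutions differently.

```python
# Toy illustration of size-dependent vs. size-independent citation indicators.
# The institutions and citation counts below are hypothetical.
citations = {
    "Univ A": [0, 2, 5, 1, 40],  # 5 papers
    "Univ B": [3, 8, 12],        # 3 papers
}

for name, counts in citations.items():
    total = sum(counts)              # size-dependent: grows with output
    per_paper = total / len(counts)  # size-independent: normalized by output
    print(f"{name}: total citations = {total}, citations per paper = {per_paper:.2f}")
```

Here Univ A leads on the size-dependent measure (48 vs. 23 total citations) while Univ B leads on the size-independent one (about 7.67 vs. 9.60 per paper reversed: A's average is inflated by one highly cited paper), which is why evaluations are urged to use multiple, appropriate indicators rather than a single number.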
Bibliometrics Basics: Striving for Best Practices and Thoughtful Applications
David A. Pendlebury, White Paper: Using Bibliometrics in Evaluating Research, Thomson Reuters, 2008 http://wokinfo.com/media/mtrp/UsingBibliometricsinEval_WP.pdf
University Rankings: Special Case with Particular Issues
Yves Gingras, University of Quebec in Montreal
Extending Assessments of University Research Performance: I. Participation at the Leading Edge of Science
Co-citation Analysis Yields Research Fronts, Specialties Representing the Leading Edge of Science
Henry Small, “Co-Citation in the Scientific Literature: A New Measure of the Relationship Between Two Documents,” Journal of the American Society for Information Science, 24(4): 265-69, July/August 1973
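Small's co-citation measure can be sketched in a few lines (a toy example with invented document IDs, not data from the slides): two documents are co-cited when a later paper cites both, and the number of papers citing the pair together measures how strongly the pair is related. Clusters of strongly co-cited documents form the research fronts mapped above.

```python
from collections import Counter
from itertools import combinations

# Hypothetical reference lists: each citing paper lists the documents it cites.
citing_papers = [
    {"doc1", "doc2", "doc3"},
    {"doc1", "doc2"},
    {"doc2", "doc3"},
    {"doc1", "doc2", "doc4"},
]

# Co-citation strength of a pair = number of papers citing both documents.
cocitations = Counter()
for refs in citing_papers:
    for pair in combinations(sorted(refs), 2):
        cocitations[pair] += 1

# The most strongly co-cited pairs seed the clusters called research fronts.
print(cocitations.most_common(3))
```

In this toy data, doc1 and doc2 are co-cited by three papers, the strongest link; real research-front analyses apply thresholds and clustering to millions of such pair counts.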
Example: Beijing Institute of Technology and its Participation in Contemporary Research Fronts, Large and Small, Hot and Mature
Total Research Fronts, October 2016: 8,575
Research Fronts with Beijing Institute of Technology Participation: 49
• Professor Hu Xiaosong, He Hongwen, and Xiong Rui, Department of Mechanical Engineering, National Engineering Lab for Electric Vehicles
• Professor Liu Xiangdong (刘向东教授), Vice Dean, School of Automation
• Professor Yao Yugui (姚裕贵教授), School of Physics; Yangtze Scholar Distinguished Professor (长江学者特聘教授)
• Professor Wu Feng (吴峰教授), School of Materials Science and Engineering
• Professor Wei Yi-Ming (魏一鸣教授), School of Management and Economics
Extending Assessments of University Research Performance: II. Innovation Activity
R&D Funding, Patents, and Surveys: Measurements of Innovative Activity and Success
Traditional Indicators of Innovation
• R&D expenditures
• Patent-based indicators
• Private and public surveys
• Expert appraisal
[Diagram: the innovation pipeline]
• Human capital; public and private research funding; infrastructure; experience (knowledge base)
• Graduates (degrees); data collections; publications; patents; tech transfer and spin-outs; standards and policies
• Skilled workforce; new knowledge; economic, health, and environmental benefits; social and cultural benefits; legislation and policy
Reuters Indicators for World’s Most Innovative Universities
David Ewalt, “Reuters Top 100: The World's Most Innovative Universities – 2016,” 28 September 2016, http://www.reuters.com/article/amers-reuters-ranking-innovative-univers-idUSL2N1C406D
Robert J.W. Tijssen, Alfredo Yegros-Yegros, Jos J. Winnink, “University-industry R&D linkage metrics: validity and applicability in world university rankings,” Scientometrics, 109 (2): 677-696, November 2016
The Limitations of Bibliometric Data… and Issues with their Interpretation
What Bibliometric Analyses and Rankings Do Not Reflect
• Human capital: student achievement on entrance; success of graduates; internationality and diversity
• Infrastructure: information resources; plant and equipment; sustainability
• Teaching
• Finances: endowment and alumni contributions; R&D funding and overhead income
• Productivity (in terms of efficiency)
• Innovation not published or cited
• Engagement and service to society
• Reputation, both global and regional
Beware: Authority of Numbers, Distinctions without a Difference, False Precision, and other Logical Fallacies
Indicators of Activity vs. Activity Driven by Indicators: The Danger of Unintended Negative Consequences from Evaluation Systems that Create Perverse Incentives or that Pursue Scores and Ranks instead of Real Progress toward Research Excellence
• Formulaic use of data for evaluation, especially relying on single measures, is not best practice
• Multiple and appropriate measures are required, and they should supplement, not replace, peer review
• Goodhart’s Law: “Any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes.”
• Use bibliometric indicators to follow research, not as a goal: Citations will follow excellence