

Unbiased evaluation of ranking metrics reveals consistent performance in science and technology citation data
Institution:1. Institute of Fundamental and Frontier Sciences, University of Electronic Science and Technology of China, Chengdu 610054, PR China; 2. URPP Social Networks, University of Zurich, 8050 Zurich, Switzerland; 3. Alibaba Research Center for Complexity Sciences, Hangzhou Normal University, 311121 Hangzhou, PR China; 4. Department of Radiation Oncology, Inselspital, Bern University Hospital and University of Bern, 3010 Bern, Switzerland; 5. Department of Physics, University of Fribourg, 1700 Fribourg, Switzerland
Abstract: Despite the increasing use of citation-based metrics for research evaluation, we do not yet know which metrics best deliver on their promise to gauge the significance of a scientific paper or a patent. We assess 17 network-based metrics by their ability to identify milestone papers and patents in three large citation datasets. We find that traditional information-retrieval evaluation metrics are strongly affected by the interplay between the age distribution of the milestone items and the age biases of the evaluated metrics. The outcomes of these evaluation metrics are therefore not representative of the metrics' true ranking ability. We argue in favor of a modified evaluation procedure that explicitly penalizes biased metrics and reveals performance patterns that are consistent across the datasets. PageRank and LeaderRank turn out to be the best-performing ranking metrics when their age bias is suppressed by a simple transformation of the scores that they produce, whereas other popular metrics, including citation count, HITS, and Collective Influence, produce significantly worse rankings.
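The age-bias suppression mentioned in the abstract can be illustrated with a minimal sketch (an assumption-laden illustration, not the authors' exact procedure): compute raw PageRank on the citation network, then rescale each paper's score against papers of similar age, e.g. as a z-score within a sliding window of similarly aged papers. The `pagerank` and `age_rescaled` functions below are hypothetical helpers written for this sketch.

```python
import numpy as np

def pagerank(adj, alpha=0.85, tol=1e-10):
    """Power-iteration PageRank. adj[i, j] = 1 means paper i cites paper j,
    so score flows from citing papers to cited papers."""
    n = adj.shape[0]
    out_degree = adj.sum(axis=1)
    p = np.full(n, 1.0 / n)
    while True:
        new = np.full(n, (1.0 - alpha) / n)  # teleportation term
        for i in range(n):
            if out_degree[i] > 0:
                new += alpha * p[i] * adj[i] / out_degree[i]
            else:
                new += alpha * p[i] / n  # dangling node: spread uniformly
        if np.abs(new - p).sum() < tol:
            return new
        p = new

def age_rescaled(scores, ages, window=3):
    """Suppress age bias by replacing each score with its z-score computed
    over papers of similar age (a window of neighbors in age order)."""
    order = np.argsort(ages)          # papers sorted oldest to newest
    sorted_scores = scores[order]
    n = len(scores)
    rescaled = np.empty(n)
    for k, idx in enumerate(order):
        lo, hi = max(0, k - window), min(n, k + window + 1)
        peers = sorted_scores[lo:hi]  # similarly aged papers (incl. self)
        mu, sd = peers.mean(), peers.std()
        rescaled[idx] = (scores[idx] - mu) / sd if sd > 0 else 0.0
    return rescaled
```

A rescaled score then measures how a paper's centrality compares to its contemporaries rather than to the whole corpus, which removes the systematic advantage that older (or, for some metrics, newer) papers enjoy in the raw rankings.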
Keywords: Citation networks; Network ranking metrics; Node centrality; Metrics evaluation; Milestone scientific papers and patents
This article is indexed in ScienceDirect and other databases.