EconPapers
On evaluating the quality of a computer science/computer engineering conference

Orestis-Stavros Loizides and Polychronis Koutsakis

Journal of Informetrics, 2017, vol. 11, issue 2, 541-552

Abstract: The Peer Reputation (PR) metric was recently proposed in the literature as a way to judge a researcher’s contribution through the quality of the venue in which the researcher’s work is published. PR, proposed by Nelakuditi et al., ties the selectivity of a publication venue to the reputation of the first author’s institution. By computing PR for a percentage of the papers accepted at a conference or in a journal, a more solid indicator of a venue’s selectivity than the paper Acceptance Ratio (AR) can be derived. In recent work we explained why we agree that PR offers substantial information that is missing from AR; however, we also pointed out several limitations of the metric. These limitations make PR inadequate, if used on its own, to give a solid evaluation of a researcher’s contribution. In this work, we present our own approach for judging the quality of a Computer Science/Computer Engineering conference venue and thus, implicitly, the potential quality of a paper accepted at that conference. Driven by our previous findings on the adequacy of PR, as well as our belief that an institution does not necessarily “make” a researcher, we propose a Conference Classification Approach (CCA) that takes into account a number of metrics and factors in addition to PR, namely the paper’s impact and the authors’ h-indexes. We present and discuss our results, based on data gathered from close to 3000 papers from 12 top-tier Computer Science/Computer Engineering conferences in different research fields. To evaluate CCA, we compare our conference rankings against multiple publicly available rankings based on evaluations from the Computer Science/Computer Engineering community, and we show that our approach produces a very comparable classification.
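The abstract names the ingredients of CCA (PR, paper impact, author h-indexes) but not its weighting scheme, which is detailed only in the full paper. The sketch below is purely illustrative: the function names, the [0, 1] normalization, and the equal-ish weights are assumptions, not the authors' actual method.

```python
def acceptance_ratio(accepted: int, submitted: int) -> float:
    """AR as commonly defined: fraction of submissions accepted.

    A lower AR suggests a more selective venue, though the abstract
    argues AR alone is a weak selectivity indicator.
    """
    return accepted / submitted


def conference_score(peer_reputation: float,
                     paper_impact: float,
                     author_h_indexes: list[float],
                     weights: tuple[float, float, float] = (0.4, 0.3, 0.3)) -> float:
    """Combine the three factors the abstract lists into one score.

    All inputs are assumed pre-normalized to [0, 1]; the weights are
    hypothetical placeholders, not the paper's CCA coefficients.
    """
    w_pr, w_imp, w_h = weights
    avg_h = sum(author_h_indexes) / len(author_h_indexes)
    return w_pr * peer_reputation + w_imp * paper_impact + w_h * avg_h
```

Under this (assumed) linear-combination reading, a venue's classification would come from aggregating such per-paper scores across a sample of its accepted papers, which is consistent with the abstract's description of computing metrics over ~3000 papers from 12 conferences.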

Keywords: Conference evaluation; Paper impact; Author affiliations; h-index
Date: 2017
Citations: 3 (tracked in EconPapers)

Downloads: (external link)
http://www.sciencedirect.com/science/article/pii/S1751157716301808
Full text for ScienceDirect subscribers only

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.


Persistent link: https://EconPapers.repec.org/RePEc:eee:infome:v:11:y:2017:i:2:p:541-552

DOI: 10.1016/j.joi.2017.03.008


Journal of Informetrics is currently edited by Leo Egghe

More articles in Journal of Informetrics from Elsevier
Bibliographic data for series maintained by Catherine Liu.

Page updated 2025-03-19
Handle: RePEc:eee:infome:v:11:y:2017:i:2:p:541-552