A new method for measuring the originality of academic articles based on knowledge units in semantic networks
Jianhua Hou,
Dongyi Wang and
Jing Li
Journal of Informetrics, 2022, vol. 16, issue 3
Abstract:
Research on evaluating the quality of academic papers is attracting increasing attention from scholars in scientometrics. However, most previous research assessed paper quality using external indicators, such as citations, which fail to account for the content of the research. To that end, this paper proposed a new method for measuring a paper's originality. The method was based on knowledge units in semantic networks, focusing on the relationships and semantic similarity among different knowledge units. Connectivity and path similarity between different content elements in these networks were used as indicators of originality. This study used papers published between 2014 and 2018 in three categories (i.e., Library & Information Science, Educational Psychology, and Carbon Nanotubes) and divided their content into three parts (i.e., research topics, research methods, and research results). Originality in all categories was found to increase each year. Furthermore, a comparison of the new method with previous models based on citation network analysis and knowledge combination analysis showed that it outperforms those methods in measuring originality.
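The abstract does not spell out the exact similarity formula, but the core idea of scoring path similarity between knowledge units in a semantic network can be sketched as follows. This is a minimal illustration under assumed conventions, not the authors' implementation: the example knowledge units, the adjacency-map graph, and the inverse-path-length similarity (1 / (1 + shortest path length)) are all hypothetical choices for demonstration.

```python
from collections import deque

# Hypothetical semantic network: nodes are knowledge units extracted from
# papers (topics, methods, results); edges link semantically related units.
EDGES = [
    ("citation analysis", "bibliometrics"),
    ("bibliometrics", "scientometrics"),
    ("scientometrics", "originality measurement"),
    ("semantic network", "originality measurement"),
]

def build_graph(edges):
    """Build an undirected adjacency map from an edge list."""
    graph = {}
    for u, v in edges:
        graph.setdefault(u, set()).add(v)
        graph.setdefault(v, set()).add(u)
    return graph

def shortest_path_length(graph, src, dst):
    """Breadth-first search; returns None when no path exists."""
    seen, queue = {src}, deque([(src, 0)])
    while queue:
        node, dist = queue.popleft()
        if node == dst:
            return dist
        for nb in graph.get(node, ()):
            if nb not in seen:
                seen.add(nb)
                queue.append((nb, dist + 1))
    return None

def path_similarity(graph, u, v):
    """Assumed similarity measure: 1 / (1 + shortest path length).
    Disconnected units get similarity 0.0; tightly linked units score
    close to 1.0, so low similarity to the existing network would
    indicate higher originality."""
    d = shortest_path_length(graph, u, v)
    return 0.0 if d is None else 1.0 / (1.0 + d)

graph = build_graph(EDGES)
# "citation analysis" is three hops from "originality measurement":
print(path_similarity(graph, "citation analysis", "originality measurement"))  # 0.25
```

Under this toy scoring, a paper whose knowledge units sit far from (or disconnected from) the units already in the field's semantic network would receive a higher originality score.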
Keywords: Originality measurement; Semantic network; Natural language processing (NLP); Knowledge unit
Date: 2022
Citations: View citations in EconPapers (3)
Downloads: (external link)
http://www.sciencedirect.com/science/article/pii/S175115772200058X
Full text for ScienceDirect subscribers only
Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.
Persistent link: https://EconPapers.repec.org/RePEc:eee:infome:v:16:y:2022:i:3:s175115772200058x
DOI: 10.1016/j.joi.2022.101306
Journal of Informetrics is currently edited by Leo Egghe
More articles in Journal of Informetrics from Elsevier
Bibliographic data for series maintained by Catherine Liu.