Transformer-Based Patent Novelty Search by Training Claims to Their Own Description
Michael Freunek and André Bodmer
Applied Economics and Finance, 2021, vol. 8, issue 5, 37-46
Abstract:
In this paper we present a method to concatenate patent claims to their own description. By applying this method, a bidirectional encoder representations from transformers (BERT) model learns suitable descriptions for claims. Such a trained BERT model could identify novelty-relevant descriptions for patents. In addition, we introduce a new scoring scheme, the relevance score (or novelty score), to interpret the output of BERT. We test the method on patent applications by training BERT on the first claims of patents and their corresponding descriptions. The output is processed according to the relevance score, and the results are compared with the X documents cited in the search reports. The test shows that BERT scores some of the cited X documents as highly relevant.
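The claim-description pairing described in the abstract can be sketched as below. Note this is a minimal illustration only: the pairing step mirrors the setup outlined above, while the score is a hypothetical token-overlap stand-in, not the fine-tuned BERT relevance score the paper actually uses.

```python
# Sketch of pairing first claims with candidate descriptions and ranking
# them by a relevance score. The Jaccard overlap below is a simple
# hypothetical placeholder for the BERT-based score in the paper.

def make_pairs(claims, descriptions):
    """Concatenate each first claim with every candidate description."""
    return [(c, d) for c in claims for d in descriptions]

def relevance_score(claim, description):
    """Hypothetical stand-in: Jaccard overlap of lowercased tokens."""
    a, b = set(claim.lower().split()), set(description.lower().split())
    return len(a & b) / len(a | b) if a | b else 0.0

claims = ["A device comprising a sensor and a processor."]
descriptions = [
    "The device includes a sensor coupled to a processor.",
    "A method of brewing coffee with a filter.",
]

pairs = make_pairs(claims, descriptions)
scores = [relevance_score(c, d) for c, d in pairs]
best = max(range(len(scores)), key=scores.__getitem__)
print(best)  # index of the highest-scoring description
```

In the paper's setting, the placeholder score would be replaced by the output of a BERT model fine-tuned on claim-description pairs, and the ranking would be read through the relevance score to flag novelty-relevant documents.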
Date: 2021
Downloads: (external link)
https://redfame.com/journal/index.php/aef/article/download/5182/5588 (application/pdf)
https://redfame.com/journal/index.php/aef/article/view/5182 (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:rfa:aefjnl:v:8:y:2021:i:5:p:37-46