The benefits of asset tokenisation within securitisation
Gonçalo Lima,
Robert Barnes and
Charles Kerrigan
Additional contact information
Gonçalo Lima: R3, 2 London Wall Place, UK
Robert Barnes: 33 Cavendish Square, UK
Charles Kerrigan: CMS, Cannon Place, UK
Journal of Securities Operations & Custody, 2024, vol. 16, issue 4, 366-384
Abstract:
Securitisation has allowed banks to move from an originate-to-hold to an originate-to-distribute model. While it is widely accepted that this helped banks to achieve higher profitability and diversification, it is also regarded as the main cause of the 2007–08 global financial crisis. The lack of transparency about the link between the securities issued and the performance of the underlying loans led to extreme risk-taking and amplified the impact once the loans started to underperform. This paper explores asset tokenisation, which can bring benefits similar to those of securitisation while enabling more effective risk management thanks to the traceability and immutability of distributed ledger technology (DLT). The paper argues that tokenisation of assets has now progressed beyond the experimentation phase and is being adopted by major commercial banks, central banks and financial market infrastructures (FMIs). It also describes the regulatory tailwinds encouraging market participants to deploy and use the technology that makes tokenisation possible. While tokens and securities are both claims on assets, tokenisation’s additional capabilities of traceability and programmability enable the terms of a claim to be modified programmatically under specific circumstances, for example through a smart contract. A further positive attribute of tokenisation is that it can significantly improve and compress the workflows of existing and new securities, bringing considerable benefits from both operational and cost perspectives. The paper goes on to argue that generalised adoption of DLT, together with harmonised standards, interoperability and integration, is among the key requirements for tokenisation on which market participants and technology providers are actively working. Finally, the paper makes the point that cryptographically proven data also acts as a stepping-stone for high-quality artificial intelligence (AI) implementations, which can continue to expand productivity and profitability for regulated financial institutions.
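To make the abstract's point about programmability concrete, the following is a minimal, hypothetical Python sketch (not taken from the paper or any particular DLT platform) of a tokenised claim whose terms adjust automatically when the performance of the underlying loan pool crosses a defined threshold, with every change kept in an append-only history to reflect traceability. All names, thresholds and the step-up rule are invented for illustration.

    # Hypothetical illustration: a tokenised claim with a programmed rule
    # that changes its terms when the underlying pool underperforms.
    from dataclasses import dataclass, field

    @dataclass
    class TokenisedClaim:
        """A claim on a pool of securitised loans, represented as a token."""
        principal: float           # outstanding principal of the claim
        coupon_rate: float         # annual coupon rate, e.g. 0.05 for 5%
        default_rate: float = 0.0  # latest observed default rate of the pool
        history: list = field(default_factory=list)  # append-only audit trail

        def report_pool_performance(self, default_rate: float) -> None:
            """Record new performance data and apply the programmed rule."""
            self.default_rate = default_rate
            self.history.append(("performance_report", default_rate))
            # Illustrative programmed condition: if defaults exceed 3%,
            # step the coupon up by 50 basis points for token holders.
            if default_rate > 0.03:
                self.coupon_rate += 0.005
                self.history.append(("coupon_step_up", self.coupon_rate))

    # Usage: the claim's terms adjust once the condition is met, and the
    # full sequence of changes remains visible in the history.
    claim = TokenisedClaim(principal=1_000_000, coupon_rate=0.05)
    claim.report_pool_performance(0.02)   # below threshold: no change
    claim.report_pool_performance(0.045)  # above threshold: coupon steps up
    print(claim.coupon_rate)              # ~0.055
    print(claim.history)

On a production DLT platform the equivalent logic would live in a smart contract and the history in the ledger itself; this sketch only shows the shape of a programmable claim.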
Keywords: distributed ledger technology; tokenisation; securitisation; post-trade workflow acceleration; bank profitability; risk; regulatory capital; liquidity; cost of funding
JEL-codes: E5 G2 K22
Date: 2024
Downloads:
https://hstalks.com/article/8691/download/ (application/pdf)
https://hstalks.com/article/8691/ (text/html)
Requires a paid subscription for full access.
Persistent link: https://EconPapers.repec.org/RePEc:aza:jsoc00:y:2024:v:16:i:4:p:366-384