Attention is all you need: An interpretable transformer-based asset allocation approach
Tian Ma, Wanwan Wang and Yu Chen
International Review of Financial Analysis, 2023, vol. 90, issue C
Abstract:
Deep learning is being rapidly adopted in financial market settings. Using a large data set from the Chinese stock market, we propose a return-risk trade-off strategy based on a new transformer model. The empirical findings show that architectural features such as the self-attention mechanism improve the use of time-series information on returns and volatility, increase predictability, and capture larger economic gains than other nonlinear models, such as LSTM. We employ Shapley additive explanations (SHAP) to measure “economic feature importance” and tabulate which features matter at different stages of the prediction process. Finally, we document several economic explanations for the transformer (TF) model. This paper sheds light on the burgeoning field of asset allocation in the age of big data.
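The abstract highlights the self-attention mechanism as the key architectural feature behind the model's use of time-series information. The paper's actual architecture is not reproduced here; the following is a minimal, hypothetical sketch of scaled dot-product self-attention over a window of daily return/volatility features, with all dimensions and weight matrices chosen purely for illustration.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over T time steps of d features.

    Each output row is a weighted mix of the whole history, with weights
    learned from query-key similarity -- the mechanism the abstract credits
    for exploiting time-series information on returns and volatility.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])          # (T, T) similarity matrix
    scores -= scores.max(axis=1, keepdims=True)     # numerically stable softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)   # rows sum to 1
    return weights @ V                              # (T, d) attended features

# Hypothetical toy input: 20 daily observations of 8 return/volatility features.
rng = np.random.default_rng(0)
T, d = 20, 8
X = rng.standard_normal((T, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (20, 8)
```

Unlike an LSTM, which must propagate information step by step through a recurrent state, every time step here attends directly to every other step, which is one intuition for why attention-based models can extract long-range return and volatility patterns more effectively.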
Keywords: Transformer model; Asset allocation; SHAP; Chinese stock market
Date: 2023
Citations: 2 (tracked in EconPapers)
Downloads: http://www.sciencedirect.com/science/article/pii/S1057521923003927 (full text for ScienceDirect subscribers only)
Persistent link: https://EconPapers.repec.org/RePEc:eee:finana:v:90:y:2023:i:c:s1057521923003927
DOI: 10.1016/j.irfa.2023.102876
International Review of Financial Analysis is currently edited by B.M. Lucey
Bibliographic data for this series is maintained by Catherine Liu.