A hybrid self-attentive linearized phrase-structured transformer-based RNN for financial sentence analysis with sentence-level explainability.

Journal: Scientific Reports
Published Date:

Abstract

As financial institutions increasingly demand transparency and accountability from their automated systems, understanding model decisions has become a central concern in financial text analysis. In this study, we propose xFiTRNN, a hybrid model that integrates self-attention mechanisms, linearized phrase structure, and a contextualized transformer-based Recurrent Neural Network (RNN) to improve both predictive performance and explainability in financial sentence prediction. The model captures subtle contextual information from financial texts while remaining explainable. By incorporating established explainability techniques such as LIME (Local Interpretable Model-agnostic Explanations) and Anchors, xFiTRNN provides transparent, sentence-level insights into its predictions. Extensive evaluations on benchmark financial datasets demonstrate that xFiTRNN achieves strong predictive performance while improving explainability for the financial sector. This work highlights the potential of hybrid transformer-based RNN architectures for building more accountable and understandable Artificial Intelligence (AI) applications in finance.
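To make the abstract's two ingredients concrete, the sketch below pairs a toy hybrid classifier (a self-attention encoder followed by a GRU and a linear head) with a sentence-level LIME explanation. This is a minimal illustration only, not the authors' xFiTRNN: the architecture sizes, the hash-based whitespace tokenizer, the class names, and the example sentence are all assumptions introduced here; only LimeTextExplainer and its explain_instance call come from the real lime library.

    import numpy as np
    import torch
    import torch.nn as nn
    from lime.lime_text import LimeTextExplainer

    # Illustrative hybrid classifier: self-attention encoder + recurrent (GRU) head.
    # Not the paper's xFiTRNN; dimensions and structure are placeholder choices.
    class HybridAttentiveRNN(nn.Module):
        def __init__(self, vocab_size=5000, embed_dim=64, num_heads=4,
                     hidden_dim=64, num_classes=3):
            super().__init__()
            self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
            layer = nn.TransformerEncoderLayer(d_model=embed_dim, nhead=num_heads,
                                               batch_first=True)
            self.encoder = nn.TransformerEncoder(layer, num_layers=1)  # self-attention
            self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)  # recurrent pass
            self.head = nn.Linear(hidden_dim, num_classes)

        def forward(self, token_ids):
            x = self.encoder(self.embedding(token_ids))  # contextualized token states
            _, h = self.rnn(x)                           # GRU summary of the sentence
            return self.head(h[-1])                      # class logits

    model = HybridAttentiveRNN()
    model.eval()

    def predict_proba(texts):
        # Hash-based whitespace "tokenizer" (illustrative only) plus softmax wrapper,
        # so LIME can query the model on perturbed copies of the input sentence.
        batch = [[hash(w) % 5000 for w in t.split()][:32] or [0] for t in texts]
        max_len = max(len(s) for s in batch)
        padded = torch.tensor([s + [0] * (max_len - len(s)) for s in batch])
        with torch.no_grad():
            return torch.softmax(model(padded), dim=-1).numpy()

    # Sentence-level explanation with LIME.
    explainer = LimeTextExplainer(class_names=["negative", "neutral", "positive"])
    exp = explainer.explain_instance(
        "Quarterly revenue fell short of analyst expectations.",
        predict_proba, num_features=6)
    print(exp.as_list())  # (token, weight) pairs for label index 1, LIME's default

In the same spirit, an Anchors-style explanation would report rule-like conditions (e.g., which tokens must be present for the prediction to hold); the paper applies both techniques at the sentence level, whereas the wrapper above only demonstrates the LIME side under the stated assumptions.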

Authors

  • Md Tanzib Hosain
    Department of Computer Science and Engineering, American International University-Bangladesh, 408/1, Kuratoli, Khilkhet, Dhaka, 1229, Bangladesh.
  • Md Kishor Morol
    Department of Computing and Information Science, Cornell University, 616 Thurston Ave, Ithaca, NY, 14853, USA.
  • Md Jakir Hossen
    Center for Advanced Analytics (CAA), COE for Artificial Intelligence, Faculty of Engineering & Technology (FET), Multimedia University, Jalan Ayer Keroh Lama, Bukit Beruang, 75450, Melaka, Malaysia. jakir.hossen@mmu.edu.my.
