EconPapers

Mutual Information-Based Generalisation Gap Analysis Using Deep Learning Model

Hemanta Kumar Bhuyan, Bhuvan Unhelkar, S. Siva Shankar and Prasun Chakrabarti
Additional contact information
Hemanta Kumar Bhuyan: Muma College of Business, University of South Florida, 8350 N. Tamiami Trail, Sarasota, FL, USA
Bhuvan Unhelkar: Muma College of Business, University of South Florida, 8350 N. Tamiami Trail, Sarasota, FL, USA
S. Siva Shankar: Department of Computer Science and Engineering, KG Reddy College of Engineering and Technology, Hyderabad, Telangana, India
Prasun Chakrabarti: Department of Computer Science and Engineering, Sir Padampat Singhania University, Udaipur 313601, Rajasthan, India

Journal of Information & Knowledge Management (JIKM), 2025, vol. 24, issue 01, 1-31

Abstract: Most deep learning models face difficulties in analysing image information because of information bottlenecks and the methodologies built around them. An information bottleneck discards redundant data while retaining as much task-relevant information as possible, but managing the bottleneck during the learning process is challenging. Although convolutional neural networks are well suited to small-scale, local processing, their inductive bias makes contextual features difficult to learn. We therefore develop a theoretical learning model that justifies the advantages of the information bottleneck in deep learning and incorporate a fundamental information bottleneck into a vision transformer. A channel density module filters the task-related data, while local connections in cumulative local transformer blocks encourage diversity in the collected image representations. We use encoder and decoder methods to analyse information bottleneck techniques in the deep learning model. This paper presents a rigorous learning theory that mathematically links the information bottleneck to generalisation error, demonstrating the usefulness of information bottlenecks in deep learning. Our analysis suggests that limiting the information bottleneck is crucial for controlling error in deep learning techniques. We conducted experiments across various mathematical models and learning environments to test the validity of these new mathematical insights. In many cases, generalisation error corresponds to unwanted information at hidden layers. We derive bounds using various scaling parameters and dimensions for the degree of information bottleneck. Comparing estimated loss and error under different correlation approaches with generalisation-gap methods, the Spearman correlation gives a loss of 0.86 and an error of 0.758, whereas the Pearson correlation gives a loss of 0.85 and an error of 0.76. We also report model compression metrics and analyse them through comparative performance.
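
The abstract invokes the standard information bottleneck trade-off, which compresses I(X;Z) while preserving task relevance I(Z;Y) (minimising I(X;Z) - beta*I(Z;Y)), and compares Spearman and Pearson correlations between mutual information and the generalisation gap. The sketch below is not the authors' implementation; it only illustrates that kind of analysis, using a histogram-based mutual information estimate over synthetic data, with the model count, noise scales, and coefficients chosen as arbitrary placeholders.

    # Minimal sketch (assumed, not the paper's code): estimate mutual information
    # at a hidden layer by binning activations, then correlate it with the
    # generalisation gap across several trained models, as in the abstract's
    # Spearman/Pearson comparison. All data here are synthetic placeholders.
    import numpy as np
    from scipy.stats import pearsonr, spearmanr

    def binned_mutual_information(x, z, bins=30):
        """Histogram estimate of I(X; Z) in nats for 1-D summaries x and z."""
        joint, _, _ = np.histogram2d(x, z, bins=bins)
        pxz = joint / joint.sum()                # joint distribution p(x, z)
        px = pxz.sum(axis=1, keepdims=True)      # marginal p(x)
        pz = pxz.sum(axis=0, keepdims=True)      # marginal p(z)
        nz = pxz > 0                             # skip zero cells to avoid log(0)
        return float((pxz[nz] * np.log(pxz[nz] / (px @ pz)[nz])).sum())

    rng = np.random.default_rng(0)
    mi_per_model, gap_per_model = [], []
    for k in range(10):                          # ten hypothetical trained models
        x = rng.normal(size=5000)                # input summary statistic
        noise = 0.1 + 0.3 * k                    # stronger compression -> lower MI
        z = x + rng.normal(scale=noise, size=x.shape)  # hidden-layer summary
        mi_per_model.append(binned_mutual_information(x, z))
        # stand-in generalisation gap, assumed to grow with retained information
        gap_per_model.append(0.02 * mi_per_model[-1] + rng.normal(scale=0.002))

    rho_s, _ = spearmanr(mi_per_model, gap_per_model)
    rho_p, _ = pearsonr(mi_per_model, gap_per_model)
    print(f"Spearman: {rho_s:.3f}  Pearson: {rho_p:.3f}")

Under these assumptions, both coefficients come out strongly positive, mirroring the abstract's observation that less retained information at hidden layers accompanies a smaller generalisation gap.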

Keywords: Healthcare image; local features; transformer; convolutional neural network (CNN); adversarial generative model; information bottleneck
Date: 2025

Downloads: (external link)
http://www.worldscientific.com/doi/abs/10.1142/S0219649225500017
Access to full text is restricted to subscribers

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.

Persistent link: https://EconPapers.repec.org/RePEc:wsi:jikmxx:v:24:y:2025:i:01:n:s0219649225500017

Ordering information: This journal article can be ordered from the publisher, World Scientific Publishing Co. Pte. Ltd.

DOI: 10.1142/S0219649225500017

Journal of Information & Knowledge Management (JIKM) is currently edited by Professor Suliman Hawamdeh.

More articles in Journal of Information & Knowledge Management (JIKM) from World Scientific Publishing Co. Pte. Ltd.
Bibliographic data for series maintained by Tai Tone Lim.

 
Handle: RePEc:wsi:jikmxx:v:24:y:2025:i:01:n:s0219649225500017