The Effective Depth Paradox: Topology and Trainability in Deep CNNs

Manfred M. Fischer and Joshua Pitts

No 2, Working Papers in Regional Science from WU Vienna University of Economics and Business

Abstract: This paper investigates the relationship between convolutional neural network (CNN) topology and image recognition performance through a comparative study of the VGG, ResNet, and GoogLeNet architectural families. Utilizing a unified experimental framework, the study isolates the impact of depth from confounding implementation variables. A formal distinction is introduced between nominal depth (Dnom), representing the physical layer count, and effective depth (Deff), an operational metric quantifying the expected number of sequential transformations. Empirical results demonstrate that architectures utilizing identity shortcuts or branching modules maintain optimization stability by decoupling Deff from Dnom. These findings suggest that effective depth serves as a superior framework for predicting scaling potential and practical trainability, ultimately indicating that architectural topology, rather than sheer layer volume, is the primary determinant of gradient health in deep learning models.
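The Dnom/Deff distinction in the abstract can be illustrated with a minimal sketch. The assumptions here are not from the paper: following the common "unraveled" view of residual networks, suppose a forward signal through a stack of residual blocks traverses each block's transform branch independently with probability p and its identity shortcut otherwise, so the number of blocks on a sampled path is Binomial(n, p). Nominal depth then counts every physical layer, while effective depth is the expected number of transformations actually composed.

```python
# Hypothetical sketch of nominal vs. effective depth for a residual stack.
# Assumption (not from the paper): each of n residual blocks contributes
# its transform branch with probability p, independently; otherwise the
# identity shortcut is taken.
from math import comb


def nominal_depth(n_blocks: int, layers_per_block: int = 2) -> int:
    """Physical layer count D_nom: every block's layers are counted."""
    return n_blocks * layers_per_block


def effective_depth(n_blocks: int, p: float = 0.5,
                    layers_per_block: int = 2) -> float:
    """Expected sequential transformations D_eff along a sampled path.

    The number of active blocks is Binomial(n_blocks, p), whose mean
    is n_blocks * p, so D_eff = n_blocks * p * layers_per_block.
    """
    return n_blocks * p * layers_per_block


def path_length_pmf(n_blocks: int, p: float = 0.5) -> list[float]:
    """P(path passes through exactly k of the n blocks), k = 0..n."""
    return [comb(n_blocks, k) * p ** k * (1 - p) ** (n_blocks - k)
            for k in range(n_blocks + 1)]


# A 50-block stack: D_nom grows with every added block, but under this
# model D_eff grows only half as fast -- the decoupling the abstract
# attributes to identity shortcuts.
print(nominal_depth(50))    # 100
print(effective_depth(50))  # 50.0
```

A plain (VGG-style) chain has no shortcut, i.e. p = 1, so Deff equals Dnom and every added layer lengthens the gradient path; with p < 1 the two quantities diverge, which is the decoupling the abstract describes.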

Date: 2026

Downloads: (external link)
https://research.wu.ac.at/en/publications/39474f89-6ee9-4bcd-a3db-c49fdf2b0cd1 original version (application/pdf)



Persistent link: https://EconPapers.repec.org/RePEc:wiw:wus046:81798935


More papers in Working Papers in Regional Science from WU Vienna University of Economics and Business, Welthandelsplatz 1, 1020 Vienna, Austria.
Bibliographic data for this series maintained by WU Library.

 
Page updated 2026-05-03
Handle: RePEc:wiw:wus046:81798935