When Does Convolutional Depth Matter? Nominal versus Effective Depth in VGG, ResNet, and GoogLeNet
Manfred M. Fischer and Joshua Pitts
No 2, Working Papers in Regional Science from WU Vienna University of Economics and Business
Abstract:
Increasing convolutional depth has been central to advances in image recognition, yet deeper networks do not uniformly yield higher accuracy, stable optimization, or efficient computation. We present a controlled comparative study of three canonical convolutional neural network architectures — VGG, ResNet, and GoogLeNet — to isolate how depth influences classification performance, convergence behavior, and computational efficiency. By standardizing training protocols and explicitly distinguishing between nominal and effective depth, we show that the benefits of depth depend critically on architectural mechanisms that constrain its effective manifestation during training, rather than on nominal depth alone. Whereas plain deep networks exhibit early accuracy saturation and optimization instability, residual and inception-based architectures consistently translate additional depth into improved accuracy at lower effective depth and favorable accuracy–compute trade-offs. These findings demonstrate that effective depth, not nominal depth, is the operative quantity governing depth’s role as a productive scaling dimension in convolutional networks.
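The contrast between nominal and effective depth can be illustrated with a minimal sketch (not from the paper; the 50-layer setting, weight scale, and NumPy stand-in for convolutional layers are assumptions for illustration). When residual weights are small, each ResNet-style block is near the identity, so the signal survives many nominal layers, while the same layers stacked plainly attenuate it:

```python
import numpy as np

rng = np.random.default_rng(0)
dim, depth = 64, 50          # nominal depth: 50 layers (assumed values)
x0 = rng.standard_normal(dim)

def layer(x, W):
    """ReLU affine layer, a stand-in for a convolutional layer."""
    return np.maximum(W @ x, 0.0)

# Small-magnitude weights mimic early training, where each residual
# branch contributes little and the block stays near the identity.
Ws = [0.05 * rng.standard_normal((dim, dim)) for _ in range(depth)]

x_plain, x_res = x0.copy(), x0.copy()
for W in Ws:
    x_plain = layer(x_plain, W)      # plain (VGG-style) stacking
    x_res = x_res + layer(x_res, W)  # residual (ResNet-style) skip

# The plain signal collapses toward zero after 50 layers; the
# residual signal retains the same order of magnitude as the input,
# i.e. its effective depth is far below its nominal depth.
print(np.linalg.norm(x_plain), np.linalg.norm(x_res), np.linalg.norm(x0))
```

Under these assumed settings the plain stack's output norm underflows toward zero while the residual stack's stays comparable to the input, which is one concrete reading of the abstract's claim that skip connections constrain the effective manifestation of depth during training.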
Date: 2026
Downloads: https://research.wu.ac.at/en/publications/39474f89-6ee9-4bcd-a3db-c49fdf2b0cd1 (original version, application/pdf)
Persistent link: https://EconPapers.repec.org/RePEc:wiw:wus046:81798935