Master's thesis presentation. Hanqi is advised by Prof. Dr. Felix Dietrich.
Hanqi Huo: Transformations between fully connected and convolutional neural networks
SCCS Colloquium
This thesis explores the mathematical equivalence between convolutional neural networks
(CNNs) and fully connected neural networks (FCNNs), focusing on transforming convolutional layers into fully connected layers. By expressing convolution operations as specific forms of matrix multiplication, I establish a formal equivalence that enables the conversion
of a CNN into an FCNN with a specially structured weight matrix. The constructed matrix
Q captures the sparsity and weight-sharing mechanisms inherent in CNNs within a fully
connected framework (a minimal construction of such a matrix is sketched below). Additionally, I investigate the behavior of wide neural networks under prior assumptions on their weights, demonstrating that their outputs converge in distribution to a Gaussian
process as the network width approaches infinity. Through theoretical analysis and numerical
experiments, I validate the univariate and multivariate normality of the outputs of wide CNNs
and FCNNs (a toy normality check is sketched below). These findings highlight fundamental connections between different layer types and contribute to the understanding of neural networks' behavior in the infinite-width limit.
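The conversion the abstract describes can be illustrated with a small sketch. The following is not the thesis's actual construction of Q, only a minimal NumPy illustration assuming a single input channel, stride 1, and no padding; `conv_as_matrix` and the shapes used are hypothetical names chosen for this example:

```python
import numpy as np

def conv_as_matrix(kernel, in_h, in_w):
    """Build Q so that Q @ x.ravel() equals the 'valid' cross-correlation
    of an (in_h, in_w) input x with `kernel` (single channel, stride 1)."""
    k_h, k_w = kernel.shape
    out_h, out_w = in_h - k_h + 1, in_w - k_w + 1
    Q = np.zeros((out_h * out_w, in_h * in_w))
    for i in range(out_h):
        for j in range(out_w):
            row = i * out_w + j
            for a in range(k_h):
                for b in range(k_w):
                    # weight sharing: every row reuses the same kernel
                    # entries; sparsity: all other columns stay zero
                    Q[row, (i + a) * in_w + (j + b)] = kernel[a, b]
    return Q

# sanity check against a direct sliding-window computation
rng = np.random.default_rng(0)
x = rng.standard_normal((5, 5))
k = rng.standard_normal((3, 3))
Q = conv_as_matrix(k, 5, 5)
direct = np.array([[(x[i:i + 3, j:j + 3] * k).sum() for j in range(3)]
                   for i in range(3)])
assert np.allclose(Q @ x.ravel(), direct.ravel())
```

Each row of Q contains the same kernel entries (weight sharing) and zeros everywhere else (sparsity), which is exactly the structure the abstract attributes to the constructed matrix.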
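The claimed Gaussian behavior of wide-network outputs can likewise be probed empirically. Below is a minimal sketch, assuming i.i.d. standard-normal priors with 1/sqrt(fan-in) scaling and a single hidden layer; the thesis's exact architectures, priors, and statistical tests may differ:

```python
import numpy as np
from scipy import stats

def wide_fcnn_output(x, width, rng):
    """One-hidden-layer FCNN with i.i.d. N(0, 1) weights and
    1/sqrt(fan_in) scaling; returns a scalar output."""
    d = x.shape[0]
    W1 = rng.standard_normal((width, d)) / np.sqrt(d)
    w2 = rng.standard_normal(width) / np.sqrt(width)
    return w2 @ np.tanh(W1 @ x)

rng = np.random.default_rng(1)
x = rng.standard_normal(10)
# resample the weights many times at a fixed input; in the wide limit
# the resulting output distribution should look Gaussian
samples = np.array([wide_fcnn_output(x, width=4096, rng=rng)
                    for _ in range(2000)])
# D'Agostino-Pearson test: a large p-value is consistent with normality
print(stats.normaltest(samples))
```

With the 1/sqrt(width) output scaling, the output is a normalized sum of width i.i.d. terms, so the central limit theorem pushes it toward a Gaussian as the width grows; the multivariate case evaluates the network at several inputs jointly and checks joint normality.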