Simulating Posterior Bayesian Neural Networks with Dependent Weights

2025-09-22 19:00 GMT · aimagpro.com

arXiv:2507.22095v2 Announce Type: replace
Abstract: In this paper we consider posterior Bayesian fully connected, feedforward deep neural networks with dependent weights. In particular, when the likelihood is Gaussian, we identify the distribution of the wide-width limit and provide an algorithm to sample from the network. In the shallow case we explicitly compute the distribution of the conditional output, proving that it is a Gaussian mixture. All theoretical results are numerically validated.
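The abstract states that in the shallow case the conditional output follows a Gaussian mixture. As a minimal illustrative sketch (not the paper's algorithm), the standard two-step procedure for sampling from such a mixture is shown below; the component weights, means, and standard deviations here are hypothetical placeholders, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical mixture parameters (placeholders, not from the paper):
# K components with mixing probabilities pi_k, means mu_k, std devs sigma_k.
weights = np.array([0.3, 0.5, 0.2])   # must sum to 1
means = np.array([-1.0, 0.0, 2.0])
stds = np.array([0.5, 1.0, 0.3])

def sample_gaussian_mixture(n):
    """Draw n samples: first pick a mixture component for each draw
    according to the mixing probabilities, then draw from that
    component's Gaussian."""
    components = rng.choice(len(weights), size=n, p=weights)
    return rng.normal(means[components], stds[components])

samples = sample_gaussian_mixture(10_000)
```

The empirical mean of the samples should approach the mixture mean, here 0.3·(−1) + 0.5·0 + 0.2·2 = 0.1, as the sample size grows.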