Entropy-Informed Weighting Channel Normalizing Flow for Deep Generative Models

arXiv:2407.04958v2
Abstract: Normalizing Flows (NFs) are widely used in deep generative models for their exact likelihood estimation and efficient sampling.
However, they require substantial memory because the latent space has the same dimension as the input.
Multi-scale architectures address this by progressively reducing latent dimensions while preserving reversibility.
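For intuition, the standard multi-scale trick (as in RealNVP- and Glow-style flows) factors out half of the channels at each scale. Below is a minimal PyTorch sketch of such a static split; the function name is our illustrative choice, not from the paper:

```python
import torch

def static_split(z: torch.Tensor):
    """Static channel-wise split used in vanilla multi-scale flows:
    the first half of the channels is factored out as a latent block,
    the second half continues through the deeper scales. Since this is
    just an index selection, it is trivially invertible (concatenate
    the two halves to undo it)."""
    c = z.shape[1]
    z_factored, z_continue = z[:, : c // 2], z[:, c // 2 :]
    return z_factored, z_continue
```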
Existing multi-scale architectures use simple, static channel-wise splitting, which limits expressiveness. To improve on this, we introduce a regularized, feature-dependent $\mathtt{Shuffle}$ operation and integrate it into the vanilla multi-scale architecture.
This operation adaptively generates channel-wise weights and shuffles latent variables before splitting them.
We observe that this operation guides the variables to evolve in the direction of increasing entropy; hence we refer to NFs with the $\mathtt{Shuffle}$ operation as \emph{Entropy-Informed Weighting Channel Normalizing Flow} (EIW-Flow).
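To make the idea concrete, here is a minimal, hypothetical PyTorch sketch of a feature-dependent channel shuffle followed by a split. The module name, the pooling-plus-linear weight generator, and the hard argsort permutation are our illustrative assumptions, not the paper's exact (regularized) construction:

```python
import torch
import torch.nn as nn

class FeatureDependentShuffle(nn.Module):
    """Hypothetical sketch of a feature-dependent channel Shuffle:
    per-channel weights are predicted from the activations, and the
    channels are reordered by weight before the multi-scale split.
    The actual EIW-Flow parameterization may differ."""

    def __init__(self, channels: int):
        super().__init__()
        # Tiny weight-generating network: pooled features -> channel scores.
        self.score = nn.Linear(channels, channels)

    def forward(self, z: torch.Tensor):
        # z: (batch, channels, height, width)
        pooled = z.mean(dim=(2, 3))                  # global average pooling
        weights = torch.softmax(self.score(pooled), dim=1)
        order = torch.argsort(weights, dim=1, descending=True)
        idx = order[:, :, None, None].expand_as(z)
        z_shuffled = torch.gather(z, 1, idx)         # per-sample channel permutation
        c = z.shape[1]
        # Split: high-weight channels are factored out, the rest go deeper.
        return z_shuffled[:, : c // 2], z_shuffled[:, c // 2 :]
```

In an actual flow, the permutation indices would have to be recorded so the inverse pass can scatter the channels back into their original order; a channel permutation itself contributes no log-determinant term.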
Extensive experiments on CIFAR-10, CelebA, ImageNet, and LSUN demonstrate that EIW-Flow achieves state-of-the-art density estimation and competitive sample quality for deep generative modeling, with minimal computational overhead.