Advancements in Generative Models: Bayesian Flow Networks

Large-scale neural networks have revolutionized generative modeling, enabling models to capture complex relationships among many variables. While diffusion models have excelled at image generation, they still lag behind autoregressive models on discrete data. To address this, Alex Graves and his research team have introduced an innovative alternative: Bayesian Flow Networks (BFNs).

BFNs operate on the parameters of a data distribution rather than on noisy data directly, which is what makes them a natural fit for discrete data and addresses this shortcoming of diffusion models. The researchers frame BFNs as a transmission scheme: the parameters of an input distribution are fed into a neural network, which outputs the parameters of a second distribution, the "output distribution."
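The parameter-in, parameter-out shape of this step can be sketched as follows. This is a toy stand-in, not the paper's architecture: the class count `K`, the time input `t`, and the fixed random linear layer are all illustrative assumptions in place of a trained network.

```python
import numpy as np

K = 4  # number of classes for a single discrete variable (illustrative)

def toy_network(theta, t):
    """Stand-in for the neural network at the heart of a BFN: it maps the
    input distribution's parameters (plus a time variable t) to the
    parameters of the output distribution. Here it is just a fixed random
    linear layer followed by a softmax -- purely illustrative."""
    rng = np.random.default_rng(1)
    W = rng.normal(size=(K, K + 1))    # hypothetical, untrained weights
    logits = W @ np.append(theta, t)
    z = np.exp(logits - logits.max())  # numerically stable softmax
    return z / z.sum()

theta_in = np.full(K, 1.0 / K)        # uniform categorical input distribution
p_out = toy_network(theta_in, t=0.5)  # output distribution over the K classes
```

The point is only the signature: distribution parameters go in, distribution parameters come out, so the whole step stays continuous and differentiable even when the data itself is discrete.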

To construct the "receiver distribution," noise is first added to the data, yielding a "sender distribution." The receiver distribution is then obtained by convolving the output distribution with that same noise: for each possible value of the data, the sender distribution that value would produce is weighted by the value's probability under the output distribution, and the weighted distributions are summed. A sample is drawn from the sender distribution and used, via the rules of Bayesian inference, to update the input distribution. The process then repeats: the updated input distribution's parameters are fed back into the network to obtain new output-distribution parameters.
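The sender-and-update part of this loop can be sketched for a single K-class variable. The Gaussian sender and the multiplicative posterior update below follow the discrete-data construction in the paper, but the accuracy `alpha`, the step count, and the absence of any trained network are illustrative assumptions, so this is a toy of the Bayesian update only, not the full algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
K = 4        # number of classes
alpha = 2.0  # per-step "accuracy" of the noisy channel (assumed value)

def sender_sample(x, alpha):
    """One noisy transmission of the true class x: a Gaussian centred on
    alpha * (K * one_hot(x) - 1), following the discrete-data sender."""
    e_x = np.eye(K)[x]
    return rng.normal(alpha * (K * e_x - 1), np.sqrt(alpha * K))

def bayesian_update(theta, y):
    """Posterior over classes after observing sender sample y; for the
    Gaussian sender above this reduces to theta_k * exp(y_k), normalised."""
    post = theta * np.exp(y - y.max())  # subtract max for stability
    return post / post.sum()

theta = np.full(K, 1.0 / K)  # uniform prior: the initial input distribution
x_true = 2                   # the class actually being transmitted
for _ in range(10):          # repeated noisy transmissions sharpen theta
    theta = bayesian_update(theta, sender_sample(x_true, alpha))
```

After a few transmissions, `theta` concentrates on the true class: each noisy sample nudges the input distribution toward the data, which is exactly the role the update plays inside a BFN's generative loop.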

Because they operate on distribution parameters rather than directly on noisy data, BFNs offer a fully continuous and differentiable generative process. This makes them applicable to discrete data, a distinct advantage over diffusion models.

Empirical studies conducted by the research team demonstrate that BFNs outperform known discrete diffusion models on character-level language modeling benchmarks such as text8. The team hopes their work will inspire new perspectives and encourage further research in generative modeling.

For more detailed information, the paper “Bayesian Flow Networks” by Alex Graves and his team is available on arXiv.
