Training Latent Diffusion Models with Interacting Particle Algorithms
Tim Y. J. Wang, Juan Kuntz, O. Deniz Akyildiz
We introduce a novel particle-based algorithm for end-to-end training of latent diffusion models. We reformulate the training task as minimizing a free energy functional and obtain a gradient flow that does so. By approximating the latter with a system of interacting particles, we obtain the algorithm, which we underpin theoretically by providing error guarantees. The novel algorithm compares favorably in experiments with previous particle-based methods and its variational inference analogues.
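As a rough illustration of the general approach (a sketch of the standard free-energy/particle construction for latent variable models, not necessarily the exact functional or updates used in this paper), consider a model $p_\theta(x, z)$ with data $x$ and latents $z$. One minimizes the free energy over the parameters $\theta$ and a distribution $q$ on the latents,
\[
F(\theta, q) = \int q(z)\,\log\frac{q(z)}{p_\theta(x, z)}\,\mathrm{d}z
             = \mathrm{KL}\big(q \,\|\, p_\theta(\cdot \mid x)\big) - \log p_\theta(x),
\]
by following a gradient flow that is Euclidean in $\theta$ and Wasserstein in $q$. Approximating $q$ with $N$ interacting particles $Z^1, \dots, Z^N$ and discretizing with step size $h$ yields updates of the form
\[
\theta_{k+1} = \theta_k + \frac{h}{N} \sum_{i=1}^{N} \nabla_\theta \log p_{\theta_k}(x, Z_k^i),
\qquad
Z_{k+1}^i = Z_k^i + h\, \nabla_z \log p_{\theta_k}(x, Z_k^i) + \sqrt{2h}\,\varepsilon_k^i,
\quad \varepsilon_k^i \sim \mathcal{N}(0, I),
\]
where the particles play the role that the variational posterior plays in variational inference analogues.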