Linear Convergence of Black-Box Variational Inference: Should We Stick the Landing?
Kyurae Kim, Yian Ma, and Jacob R. Gardner
We prove that black-box variational inference (BBVI) with control variates, particularly the sticking-the-landing (STL) estimator, converges at a geometric (traditionally called "linear") rate under perfect variational family specification. In particular, we prove a quadratic bound on the gradient variance of the STL estimator, one which encompasses misspecified variational families. Combined with previous works on the quadratic variance condition, this directly implies convergence of BBVI with the use of projected stochastic gradient descent. For the projection operator, we consider a domain with triangular scale matrices, on which the projection can be computed in Θ(d) time, where d is the dimensionality of the target posterior. We also improve the existing analysis of the regular closed-form entropy gradient estimator, which enables a comparison against the STL estimator, providing explicit non-asymptotic complexity guarantees for both.
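As background (not part of the abstract above, and with notation that is ours rather than the paper's), the STL estimator of Roeder et al. (2017) can be sketched as follows. With a reparameterized variational family z = t_\lambda(\varepsilon), \varepsilon \sim \varphi, the STL gradient evaluates the score of q with the variational parameters held fixed (a "stop-gradient"):

\hat{g}_{\mathrm{STL}}(\lambda) = \nabla_\lambda \log \pi\!\left(t_\lambda(\varepsilon)\right) - \left[\nabla_z \log q_\nu(z)\right]_{\nu=\lambda,\, z=t_\lambda(\varepsilon)} \nabla_\lambda t_\lambda(\varepsilon).

This drops the zero-mean score term of the standard reparameterization estimator; under perfect specification, where q_{\lambda^*} = \pi, the integrand \log \pi - \log q_{\lambda^*} vanishes identically, so the estimator has zero variance at the optimum, which is the property that makes a geometric ("linear") rate attainable.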