Machine Learning & Scientific Computing Series
Information Theory with Kernel Methods
I will consider the analysis of probability distributions through their associated covariance operators from reproducing kernel Hilbert spaces. In this talk, I will show that the von Neumann entropy and relative entropy of these operators are intimately related to the usual notions of Shannon entropy and relative entropy, and share many of their properties. These quantities come with efficient estimation algorithms given various oracles on the probability distributions. I will also present how these new notions of relative entropy lead to new upper bounds on log-partition functions, which can be combined with convex optimization within variational inference methods, yielding a new family of probabilistic inference methods (based on https://arxiv.org/pdf/2202.08545.pdf).
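As a concrete illustration (not part of the announcement itself), the von Neumann entropy of a unit-trace covariance operator Σ is −tr(Σ log Σ). For a kernel with k(x, x) = 1, the empirical covariance operator built from n samples has unit trace and shares its nonzero eigenvalues with the normalized kernel matrix K/n, so the entropy can be computed from that finite matrix. The sketch below assumes an RBF kernel and i.i.d. samples; the function names and bandwidth are illustrative choices, not from the paper.

```python
import numpy as np

def rbf_kernel(X, Y, sigma=1.0):
    # Gaussian (RBF) kernel matrix; note k(x, x) = 1, so tr(K/n) = 1.
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2 * X @ Y.T
    return np.exp(-sq / (2 * sigma**2))

def von_neumann_entropy(X, sigma=1.0):
    """Plug-in estimate of -tr(Sigma log Sigma) for the empirical
    kernel covariance operator, computed via the eigenvalues of K/n."""
    n = X.shape[0]
    K = rbf_kernel(X, X, sigma) / n
    eig = np.linalg.eigvalsh(K)
    eig = eig[eig > 1e-12]  # drop numerically zero (or tiny negative) eigenvalues
    return float(-np.sum(eig * np.log(eig)))

# Example: entropy of the kernel covariance for standard Gaussian samples.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 2))
print(von_neumann_entropy(X, sigma=1.0))
```

A narrower bandwidth makes K/n closer to (1/n)I and pushes the estimate toward its maximum log n, while a wider bandwidth concentrates the spectrum and lowers the entropy, mirroring the smoothing role the kernel plays in these operator-based entropies.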
Contact: Diana Bohler at 626-395-1768 or dbohler@caltech.edu
Join via Zoom: https://caltech.zoom.us/j/86735762559?pwd=ckdPRmlnWEVtdk0rVm9Rbk5IMTExQT09