An Equivalence Between Private Classification and Online Prediction
Abstract: We prove that every concept class with finite Littlestone dimension can be learned by an (approximate) differentially-private algorithm. The converse direction was shown in recent work of Alon, Livni, Malliaris, and Moran, STOC '19. Together these two results show that a class of functions is privately learnable if and only if it is learnable in the mistake-bound model of online learning. To establish our result, we introduce "global stability," a new notion of algorithmic stability for learning algorithms that we show can always be satisfied when learning classes of finite Littlestone dimension.
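To make the notion of Littlestone dimension concrete: for a finite concept class over a finite domain, it can be computed by brute-force recursion on the two restrictions of the class at each domain point. The sketch below is purely illustrative and not from the paper; the function name `ldim` and the encoding of hypotheses as tuples of labels are my own choices, and the recursion is exponential-time, so it is only usable for tiny classes.

```python
def ldim(H, domain):
    """Littlestone dimension of a finite class H, brute force.

    H is a set of hypotheses, each encoded as a tuple of 0/1 labels
    indexed by position in `domain`. Uses the recursive characterization:
    Ldim(H) >= d+1 iff some point x splits H into two nonempty
    subclasses (labeling x with 0 resp. 1), each of Ldim >= d.
    """
    if not H:
        return -1          # empty class: no mistake tree at all
    if len(H) == 1:
        return 0           # a single hypothesis never forces a mistake
    best = 0
    for i in range(len(domain)):
        H0 = {h for h in H if h[i] == 0}
        H1 = {h for h in H if h[i] == 1}
        if H0 and H1:      # point i genuinely splits the class
            best = max(best, 1 + min(ldim(H0, domain), ldim(H1, domain)))
    return best

# Example: thresholds on the 3-point domain {0, 1, 2}.
# There are 4 threshold functions; their Littlestone dimension is 2.
thresholds = {(1, 1, 1), (0, 1, 1), (0, 0, 1), (0, 0, 0)}
print(ldim(thresholds, (0, 1, 2)))
```

In the mistake-bound model, an optimal online learner for a class of Littlestone dimension d makes at most d mistakes; the paper's result says exactly these classes also admit approximate differentially-private learners.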
To watch the talk:
- Watching the live stream. At the announced start time of the talk (or a minute before), a live video stream will appear on our "next talk" page; just connect to the page and enjoy the talk. No webcam or registration is needed. Questions and comments during the talk are welcome (text only, unfortunately): post them below the live video stream on YouTube.
- Watching the recorded talk offline. The recording will be available on our YouTube page shortly after the talk ends. (Please leave a comment if you enjoyed it!)
Contact: Bonnie Leung firstname.lastname@example.org