Spring 2021
This term the Information Theory, Machine Learning and Statistics Seminar focuses on generalization in machine learning and divergence measures, some of the topics studied last term. The seminar consists of a series of two-hour talks aimed at graduate students and researchers with a basic knowledge of information theory and machine learning (e.g., as covered in the mini-courses last term).
The seminar runs bi-weekly, taking place every other Friday from 9 am to 11 am (CDT). The sessions are held via Zoom; please request the access link from Mario Diaz.
List of Speakers
Date: 05/March
Speaker: Mario Diaz (Universidad Nacional Autónoma de México)
Title: Generalization in Machine Learning via (Conditional) Mutual Information

Date: 19/March
Speaker: Tyler Sypherd (Arizona State University)
Title: Synthesizing Classification-Calibration and Rademacher Complexity Generalization with alpha-loss

Date: 16/April
Speaker: Mahdi Haghifam (University of Toronto)
Title: Sharpened Generalization Bounds based on Conditional Mutual Information and an Application to Noisy, Iterative Algorithms

Date: 30/April
Speaker: Borja Rodríguez (KTH Royal Institute of Technology)
Title: Tighter Expected Generalization Error Bounds via Wasserstein Distance

Date: 14/May
Speaker: James Melbourne (CIMAT)
Title: On Discrete Analogs of the Entropy Power Inequality

Date: 28/May
Speaker: Alvaro Díaz (Max Planck Institute for the Physics of Complex Systems)
Title: A Phase Transition in Terms of the Shannon Entropy
Last update: May 22, 2021