EE Seminar: On the Information Complexity of Learning

April 8, 2019, 15:00
Room 011, Kitot-Hashmal Building

(The talk will be given in English)

 

Speaker:     Ido Nachum
                     Department of Mathematics, Technion

 

Monday, April 8th, 2019
15:00 - 16:00

Room 011, Kitot Bldg., Faculty of Engineering

 

On the Information Complexity of Learning
 

Abstract

This talk will focus on the relationship between two notions: learning and compression. Under an information-theoretic setting, these two notions are indeed equivalent.
a) Compression implies learning. We will show that learning algorithms that retain only a small amount of information from their input sample generalize (one standard formalization of this statement is sketched below).
b) Learning implies compression. We will show that under an average-case analysis, every hypothesis class of finite VC dimension (a characterization of learnable classes) has empirical risk minimizers (ERM) that do not reveal a lot of information.
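
To make part (a) concrete, here is a minimal sketch of one standard way to formalize "low information implies generalization", in the spirit of the mutual-information generalization bound of Xu and Raginsky (2017); the exact quantitative statement used in the talk may differ.

% Sketch, not necessarily the talk's exact theorem: a mutual-information generalization bound.
% S = (Z_1, ..., Z_n) is an i.i.d. sample from \mu, W = A(S) is the learner's output,
% and the loss \ell(w, z) is assumed \sigma-subgaussian under \mu for every w.
\[
  \Bigl| \mathbb{E}\bigl[ L_\mu(W) - L_S(W) \bigr] \Bigr|
  \;\le\; \sqrt{\frac{2\sigma^{2}\, I(S; W)}{n}},
\]
where $L_\mu(w) = \mathbb{E}_{Z \sim \mu}[\ell(w, Z)]$ is the population risk,
$L_S(w) = \tfrac{1}{n}\sum_{i=1}^{n} \ell(w, Z_i)$ is the empirical risk, and
$I(S; W)$ is the mutual information between the input sample and the output hypothesis.

In words: a learner that retains only a few bits about its sample (small $I(S; W)$) has a small expected generalization gap.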

If time permits, we will discuss a worst-case lower bound by presenting a simple concept class for which every empirical risk minimizer (possibly randomized) must reveal a lot of information. We will also observe connections with well-studied notions such as sample compression schemes, Occam's razor, PAC-Bayes, and differential privacy.
