EE Seminar: Everything Old is New Again: The Return of Gradient-Based Optimization Methods


(The talk will be given in English)

 

Speaker:  Prof. Marc Teboulle
          School of Mathematical Sciences, TAU

 

Monday, May 27th, 2019
15:00 - 16:00

Room 011, Kitot Bldg., Faculty of Engineering

 

Everything Old is New Again: The Return of Gradient-Based Optimization Methods
 

Abstract

The gradient method, forged by Cauchy about 170 years ago, has seen a strong revival of interest in modern optimization over the last decade through its many variants and relatives, known as first-order methods (FOM). This renewed interest in FOM has emerged from the current high demand for solving optimization problems arising in a wide spectrum of modern applications, e.g., in signal processing, image science, machine learning, and physics. These applied problems are often ill-posed, nonsmooth, convex or nonconvex, and typically very large or even huge in scale. This rules out sophisticated algorithms (e.g., Newton-type schemes involving matrix inversion), which often become prohibitively expensive. Elementary first-order methods, which use only function values and gradient/subgradient information, then often remain our best alternative for tackling such large-scale optimization problems. In turn, this rich collection of applied problems is providing fresh perspectives on optimization algorithms, leading to new fundamental research with challenging theoretical and computational questions in the field.
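
To make this concrete, here is a minimal sketch of Cauchy's gradient method in Python with NumPy; the least-squares instance, its dimensions, and the step-size choice are illustrative assumptions, not taken from the talk. Each iteration uses only one gradient evaluation and no matrix inversion, which is exactly why such schemes remain viable at very large scale:

    import numpy as np

    # Illustrative least-squares instance: f(x) = 0.5 * ||A x - b||^2
    rng = np.random.default_rng(0)
    A = rng.standard_normal((200, 50))
    b = rng.standard_normal(200)

    # Step size 1/L, where L = ||A||_2^2 is the Lipschitz constant of the gradient
    L = np.linalg.norm(A, 2) ** 2

    x = np.zeros(50)
    for _ in range(500):
        grad = A.T @ (A @ x - b)  # gradient of f at x: the only information used
        x = x - grad / L          # the classical gradient step

    print("final objective:", 0.5 * np.linalg.norm(A @ x - b) ** 2)

The constant step size 1/L is one standard choice that guarantees convergence for smooth convex problems; for huge-scale instances, L can also be estimated on the fly by backtracking rather than computed exactly.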

We discuss recent advances in the design and analysis of gradient-based algorithms that we have developed and applied successfully in various areas. In particular, we highlight the ways in which mathematical structure and data information can be beneficially exploited to design and analyze simple convex and nonconvex optimization methods. The talk is intended for a wide audience, and we will assume (almost) no prior knowledge of continuous optimization.
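
As one illustration of exploiting structure (a standard textbook example, not necessarily one of the schemes discussed in the talk): when the objective splits into a smooth term plus a simple nonsmooth term, as in l1-regularized least squares, the proximal gradient method handles the nonsmooth term through its closed-form proximal operator while using gradients of the smooth part only. A minimal Python sketch, with illustrative data and parameters:

    import numpy as np

    def soft_threshold(v, t):
        # Closed-form proximal operator of t * ||.||_1
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 50))
    b = rng.standard_normal(200)
    lam = 0.1                      # l1 regularization weight (illustrative)
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the smooth part's gradient

    x = np.zeros(50)
    for _ in range(500):
        grad = A.T @ (A @ x - b)                   # gradient of the smooth term only
        x = soft_threshold(x - grad / L, lam / L)  # proximal gradient (ISTA) step

    obj = 0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.abs(x).sum()
    print("final objective:", obj)

The soft-thresholding step is what makes the nonsmooth l1 term tractable: it is evaluated exactly and coordinate-wise, at negligible cost per iteration.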

Short Bio
Marc Teboulle is the Eric and Sheila Samson Chair and Professor of Optimization at the School of Mathematical Sciences of Tel Aviv University. He received his D.Sc. from the Technion, Israel Institute of Technology. Teboulle's research interests are in the area of continuous optimization, including its theoretical foundations and algorithmic analysis, as well as its applications to many areas of science and engineering. Prior to joining Tel Aviv University in 1996, Teboulle was an applied mathematician at Israel Aircraft Industries, a post-doctoral fellow at Dalhousie University, Canada, and a Professor of Mathematics at the University of Maryland (USA).
Teboulle has published widely on optimization theory, algorithms, and applications. He is a co-author of the book "Asymptotic Cones and Functions in Optimization and Variational Inequalities" (Springer Monographs in Mathematics, 2003, with A. Auslender). He is a Fellow of SIAM, the Society for Industrial and Applied Mathematics. He served as Area Editor for Continuous Optimization of Mathematics of Operations Research from 2013 to 2018. He currently serves as Corresponding Editor for the Optimization area of COCV (Control, Optimization, and Calculus of Variations), and is on the editorial boards of several other leading journals, including the SIAM Journal on Optimization and the SIAM Journal on Mathematics of Data Science.

 
