Dr. Koby Todros - Robust parameter estimation based on the -divergence

EE Systems Seminar (Systems Department Seminar)

March 4, 2024, 15:00
Electrical Engineering-Kitot Building 011 Hall  

(The talk will be given in English)

 

Speaker:     Dr. Koby Todros

School of Electrical and Computer Engineering, Ben Gurion University


Monday, March 4th, 2024

15:00 - 16:00

Robust parameter estimation based on the -divergence

Abstract

The maximum likelihood estimator (MLE) is a well-established tool that operates by minimizing the empirical Kullback-Leibler divergence (KLD) between the underlying data-generating distribution and a presumed parametric class. When the presumed probability model is correctly specified, the MLE is usually asymptotically efficient. In practice, however, the presumed probability model is often misspecified due to off-nominal measurements, called outliers, which can arise from impulsive noise, sensor faults, or adversarial data corruption. The MLE may be highly susceptible to such misspecification. This sensitivity stems from the fact that the KLD assigns non-negligible weight to off-nominal low-density components of the data-generating distribution, thereby making them influential in the estimation process.
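The outlier sensitivity described above is easy to see in the simplest case: for a Gaussian location model, the MLE of the mean is the sample mean, which a handful of off-nominal measurements can drag arbitrarily far. A minimal sketch (the data and outlier values are illustrative, not from the talk):

```python
import numpy as np

rng = np.random.default_rng(0)
clean = rng.normal(0.0, 1.0, 200)                    # nominal N(0, 1) measurements
corrupted = np.concatenate([clean, np.full(10, 50.0)])  # ~5% gross outliers

# For a Gaussian location model, the MLE of the mean is the sample mean.
# On clean data it sits near the true value 0; a few outliers at 50
# shift it markedly, because each sample gets equal weight.
print(np.mean(clean))
print(np.mean(corrupted))
```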

In this presentation, we shall introduce a novel robust generalization of the KLD, called the -divergence, that addresses this limitation. The -divergence incorporates a weighted version of the presumed log-likelihood function. To de-emphasize low-density regions associated with outliers, the weight function is defined as a convolution between the data-generating density and a strictly positive smoothing kernel function. Consequently, the resulting minimum -divergence estimator (MDE), which operates by minimizing the empirical -divergence w.r.t. the vector parameter of interest, leverages Parzen's non-parametric kernel density estimator to effectively mitigate outliers. We demonstrate that this approach offers notable theoretical and practical benefits over competing estimators based on alternative robust divergences, such as Hellinger's distance and the  and  divergences. Throughout the presentation, we shall discuss the asymptotic performance of the MDE and address the important problem of selecting the kernel's bandwidth parameter. The MDE will be illustrated for robust source localization in the presence of intermittent jamming and for robust Gaussian mixture modeling with application to anomaly detection.
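The weighting idea can be sketched in a few lines: a Parzen kernel density estimate of the data (the empirical distribution convolved with a Gaussian kernel) assigns small values to isolated outliers, and using those values as per-sample weights in the likelihood down-weights them. This is only an illustration of the mechanism under a Gaussian location model with a fixed, hand-picked bandwidth, not the speaker's exact estimator or bandwidth-selection rule:

```python
import numpy as np

def parzen_kde(x, data, h):
    # Gaussian-kernel Parzen estimate: empirical distribution convolved
    # with a strictly positive smoothing kernel of bandwidth h
    z = (x[:, None] - data[None, :]) / h
    return np.mean(np.exp(-0.5 * z ** 2), axis=1) / (h * np.sqrt(2.0 * np.pi))

rng = np.random.default_rng(0)
clean = rng.normal(0.0, 1.0, 200)
data = np.concatenate([clean, np.full(10, 50.0)])  # inject gross outliers

h = 0.5                          # bandwidth: a key tuning parameter
w = parzen_kde(data, data, h)    # low density at the outliers -> small weights

# For a Gaussian location model, minimizing the weighted negative
# log-likelihood sum_i w_i * (x_i - mu)^2 gives the weighted mean,
# which the down-weighted outliers barely influence:
mu_weighted = np.sum(w * data) / np.sum(w)
mu_mle = np.mean(data)           # plain MLE, pulled toward the outliers
print(mu_mle, mu_weighted)
```

Here the weighted estimate stays much closer to the true location 0 than the plain sample mean; the choice of `h` governs how aggressively low-density regions are suppressed, which is the bandwidth-selection problem mentioned above.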

Short Bio:
Dr. Todros is a faculty member at the School of Electrical and Computer Engineering at Ben-Gurion University of the Negev. His research interests include statistical signal processing and machine learning, with a focus on semi-parametric detection and estimation, adaptive filtering, sensor array and multichannel signal processing, blind source separation, and biomedical signal processing. Dr. Todros is an IEEE Senior Member and serves as an associate editor of the IEEE Signal Processing Letters and of the Elsevier Signal Processing journal.

 

Seminar attendance grants listening credit, based on registering your full name and ID number on the attendance form that will be circulated in the hall during the seminar.

