Machine-Learning the Hidden Universal Semantics of Natural Languages
Dr. Omri Abend - University of Edinburgh
The field of Natural Language Processing (NLP) has recently been pivotal in producing important language technologies such as machine translation and question answering. Such technologies are based on elaborate structural representations of text, induced by statistical methods. However, common approaches to structural representation are language-specific or even domain-specific, limiting the applicability of NLP tools and models. How to represent both the idiosyncrasies of specific domains and languages and the commonalities they share remains an open question.
In my talk I will address this question and propose an approach for discovering a level of representation shared by all languages, using latent variable models for structured prediction. Under this approach, learning starts from universally applicable coarse-grained logical structure, which is used to bootstrap the learning of finer-grained semantic distinctions, as well as the learning of the specifics of individual languages. I will discuss the value of universal semantic structures both for the computational modeling of child language acquisition and for leading NLP applications, focusing on machine reading of web data and machine translation.
Joint work with Ari Rappoport, Shay Cohen and Mark Steedman.
Omri Abend is a postdoctoral researcher in the School of Informatics at the University of Edinburgh, working in the fields of Natural Language Processing and Computational Linguistics in Mark Steedman's lab. Omri earned his PhD from the Hebrew University under the supervision of Ari Rappoport. During his PhD studies, he was a member of the Azrieli Fellows Program for promoting academic excellence and leadership. Prior to that, Omri studied mathematics and cognitive sciences at the Hebrew University (BSc, summa cum laude).
The talk will take place on Thursday, 18.12.14, at 13:00 in Room 206, Wolfson Engineering Building, Faculty of Engineering, Tel Aviv University.