2015 Joint Telematics Group/IEEE Information Theory Society Summer School
on Signal Processing, Communications and Networks.
IISc Bangalore, July 20-23, 2015.
IISc Bangalore, July 24, 2015.
Yihong is an assistant professor of Electrical and Computer Engineering at the University of Illinois at Urbana-Champaign. Previously he was a postdoctoral fellow with the Statistics Department of the Wharton School, University of Pennsylvania, working with Prof. Tony Cai on high-dimensional statistical inference and decision theory. He received his Ph.D. degree in Electrical Engineering from Princeton University in 2011, with his thesis supervised by Prof. Sergio Verdú. He received his B.E. from the Department of Electronic Engineering at Tsinghua University in 2006. He spent the summer of 2010 as an intern in the Information Theory Research Group at HP Labs.
Gerhard is an Alexander von Humboldt Professor and Head of the Institute for Communications Engineering at the Technische Universität München (TUM). He received the B.Sc. and M.Sc. degrees in electrical engineering from the University of Manitoba, Winnipeg, MB, Canada, in 1991 and 1992, respectively, and the Dr. sc. techn. (Doktor der technischen Wissenschaften) degree from ETH Zürich, Switzerland, in 1998. From 1998 to 2000, he was with Endora Tech AG, Basel, Switzerland, as a communications engineering consultant. From 2000 to 2008, he was with the Math Center, Bell Labs, Alcatel-Lucent, Murray Hill, NJ, as a Member of Technical Staff. He joined the University of Southern California (USC), Los Angeles, CA, as a Professor of Electrical Engineering in 2009. He joined TUM in 2010.
In the modern big-data era, technological innovations have enabled the collection and storage of data at a previously unimaginable scale; at the same time, the growth of the data is constantly outpaced by that of the features, which makes statistical inference highly non-trivial and computationally challenging. Accordingly, the area of high-dimensional statistics, which concerns problems in which the ambient dimension is finite but comparable to or substantially larger than the sample size, has become the focus of increasing attention in contemporary data-analytic applications. The holy grail is to design statistical procedures that are both computationally efficient and information-theoretically optimal.
The interplay between information theory and statistics is a constant theme in the development of both fields. In this lecture series I will illustrate how techniques rooted in information theory play a key role in understanding the fundamental limits of high-dimensional statistical problems. We will discuss foundational information-theoretic methods, such as Fano's inequality, Le Cam's method, metric entropy and volume methods, and aggregation, as well as their applications to specific problems, such as sparse linear regression, estimation of high-dimensional matrices, principal component analysis, functional estimation, large-alphabet problems, and community detection. I will also discuss the recent trend of combining the statistical and computational perspectives, and the computational barriers that arise in a series of statistical problems on large matrices and random graphs.
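To give a flavor of how such lower bounds are obtained (a standard textbook argument, sketched here for illustration rather than taken from the lectures themselves): if $\{\theta_1,\dots,\theta_M\}\subseteq\Theta$ is a $2\delta$-separated set under a metric $d$, and $\theta$ is drawn uniformly from it, then Fano's inequality bounds the minimax risk of any estimator $\hat\theta$ based on the data $X$:

\[
\inf_{\hat\theta}\ \sup_{\theta\in\Theta}\ \mathbb{P}\big[d(\hat\theta,\theta)\ge\delta\big]\ \ge\ 1-\frac{I(\theta;X)+\log 2}{\log M},
\]

so a large packing set over which the data carry little mutual information forces a non-vanishing probability of error for every estimator, no matter how computationally powerful.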