### Latest Announcements

• Homework 1 posted. Due in class on September 11.
• First class on Monday, August 7, at 2:00pm in ECE 1.08.

### Time & Venue

• Class: Monday and Wednesday 2pm-3:30pm in ECE 1.08

### Agenda

(Notes for some of the lectures are provided below. They have been lifted mostly verbatim from the course textbook or from other resources mentioned in class, though the arrangement of topics may differ slightly. They are made available here only to supplement the reading material for the students. The notes were prepared jointly with Aditya Gopalan. However, HT must be given the credit for all the typos, spelling mistakes, and other Freudian slips.)
• Lecture 1: Introduction to the course: law of large numbers, central limit theorem (with Berry-Esseen correction), large deviations; Markov and Chebyshev inequalities; course overview; numerical evidence of concentration.
##### 1. Cramér-Chernoff method for tail bounds
• Lecture 2: The Cramér-Chernoff method; properties of Cramér transform; Examples: Gaussian, Poisson, Bernoulli, Chi squared.
• Lecture 3: Sums of iid rvs; Application: Johnson-Lindenstrauss Lemma; sub-Gaussian rvs; variance of bounded rvs and Hoeffding inequality.
##### 2. Martingale differences and bounded differences
• Lecture 4: Concentration bounds for sums of iid rvs: Hoeffding inequality (covered in Lecture 3), Bennett inequality, Bernstein inequality, sums of a multiplicative family and Azuma's inequality; concentration for the maximum of martingale noise; the bounded difference property and McDiarmid's inequality. [pdf]
• Lecture 5: Applications 1: a bound on the maximum of sub-Gaussian random variables; longest common subsequence; chromatic number of the Erdős-Rényi random graph; balls and bins setup -- number of empty bins. [pdf]
• Lecture 6: Applications 2: empirical processes and uniform convergence; Glivenko-Cantelli theorem; VC bound for the deviation of empirical processes. [pdf]
• Lecture 7: Applications 2 (continued): chaining method to bound the deviation of empirical processes; introduction to the Efron-Stein inequality. [pdf]
• Coming up ...
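As a quick numerical companion to the Chernoff-bound lectures above, here is a minimal sketch (not part of the course notes; all parameter choices are illustrative) comparing the empirical upper-tail probability of the mean of bounded iid random variables with the Hoeffding bound exp(-2nt^2) for [0,1]-valued variables:

```python
import numpy as np

rng = np.random.default_rng(0)

n, trials, t = 100, 50_000, 0.1
# X_i iid Uniform[0, 1]: bounded in [0, 1] with mean 1/2.
samples = rng.random((trials, n))
deviations = samples.mean(axis=1) - 0.5

# Empirical estimate of P(sample mean - 1/2 >= t).
empirical = np.mean(deviations >= t)
# Hoeffding's inequality for [0,1]-valued rvs: P(...) <= exp(-2 n t^2).
hoeffding = np.exp(-2 * n * t**2)

print(f"empirical tail : {empirical:.5f}")
print(f"Hoeffding bound: {hoeffding:.5f}")
assert empirical <= hoeffding
```

For uniform samples the empirical tail falls far below the bound, which is expected: Hoeffding uses only boundedness, not the actual variance. Variance-aware refinements such as the Bennett and Bernstein inequalities (Lecture 4) tighten this gap.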

### Course description

It is well known that functions of large numbers of random quantities tend to behave rather predictably and less 'randomly' than their constituents. For instance, the laws of large numbers tell us that the average of many independent random variables converges asymptotically to the expected value; higher-order refinements such as the central limit theorem and large deviations techniques uncover the asymptotic rate at which this reduction in randomness takes place. However, if one is interested in sharper estimates of the probability of deviation from the typical value, for a fixed number of observations, for functions other than the average, or for functions of dependent random variables, one must take recourse to more specific measure concentration bounds. Perhaps the most basic, nontrivial examples in this regard are the Markov and Chebyshev inequalities, which are encountered in a first course on probability. This graduate-level course on concentration inequalities will cover the basic material on this classic topic as well as introduce several advanced topics and techniques. The utility of the inequalities derived will be illustrated by drawing on applications from electrical engineering, computer science and statistics.
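The gap between the Chebyshev bound and the true deviation probability of an average can be seen numerically. The following sketch (our illustration, not from the course materials) estimates P(|sample mean - mu| >= t) for averages of Uniform[0,1] variables and compares it with the Chebyshev bound Var(X)/(n t^2):

```python
import numpy as np

rng = np.random.default_rng(1)
mu, var, t = 0.5, 1.0 / 12.0, 0.05   # Uniform[0,1]: mean 1/2, variance 1/12

for n in (10, 100, 1000):
    # Empirical estimate of P(|mean of n samples - mu| >= t).
    means = rng.random((10_000, n)).mean(axis=1)
    empirical = np.mean(np.abs(means - mu) >= t)
    # Chebyshev: P(|mean - mu| >= t) <= Var(mean) / t^2 = var / (n t^2).
    chebyshev = min(1.0, var / (n * t * t))
    print(f"n={n:5d}  empirical={empirical:.4f}  Chebyshev bound={chebyshev:.4f}")
```

The Chebyshev bound decays only as 1/n, while the empirical tail probability falls off exponentially in n; quantifying that exponential decay is exactly what the Chernoff-type bounds in this course provide.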

1. Introduction & motivation: Limit results and concentration bounds
2. Chernoff bounds: Hoeffding’s inequality, Bennett’s inequality, Bernstein’s inequality
3. Variance bounds: Efron-Stein inequality, Poincaré inequality
4. The entropy method and bounded difference inequalities
5. Log-Sobolev inequalities and hypercontractivity
6. Isoperimetric inequalities (Talagrand's convex distance inequality)
7. The transportation method
8. Influences and threshold phenomena

Textbook: Stéphane Boucheron, Gábor Lugosi, and Pascal Massart, "Concentration Inequalities," Oxford University Press, 2013.

### Prerequisites

A course on probability, random processes, or measure theory; basic mathematical maturity and working familiarity with probability calculations.