 Complete the proof of Theorem 19 by showing that $(1-\tfrac{k+1}{n} +\tfrac{k}{n^2})^{1/2} \geq 1-2k/n$ for all $1 \leq k \leq n/2$.
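A quick numerical sanity check (not a proof, and with the helper name `holds` chosen just for illustration) confirms the inequality on a grid of small $n$:

```python
# Numerically spot-check (1 - (k+1)/n + k/n^2)^(1/2) >= 1 - 2k/n
# for all 1 <= k <= n/2.  A sanity check only, not a proof.
def holds(n, k):
    lhs = (1 - (k + 1) / n + k / n**2) ** 0.5
    rhs = 1 - 2 * k / n
    return lhs >= rhs

assert all(holds(n, k) for n in range(2, 201) for k in range(1, n // 2 + 1))
```

For the proof itself, it helps to notice that the radicand factors as $(1-\tfrac{1}{n})(1-\tfrac{k}{n})$.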
 Using just the facts that $\mathbf{Stab}_\rho[\mathrm{Maj}_n] \to \frac{2}{\pi} \arcsin \rho$ for all $\rho \in [-1,1]$ and that $\mathbf{Stab}_\rho[\mathrm{Maj}_n] = \sum_{k \geq 0} \mathbf{W}^{k}[\mathrm{Maj}_n] \rho^k$, deduce that $\lim_{n \to \infty} \mathbf{W}^{k}[\mathrm{Maj}_n] = [\rho^k](\frac{2}{\pi} \arcsin \rho)$ for all $k \in {\mathbb N}$. (Hint: by induction on $k$, always taking $\rho$ “small enough”.)

 For $0 \leq j \leq m$ integers, show that $\hat{\lVert} \mathrm{Maj}_{2m+1}^{=2j+1} \hat{\rVert}_1 = \binom{m}{j} \frac{1}{2j+1} \cdot \frac{2m+1}{2^{2m}}\binom{2m}{m}$.
 Deduce that $\hat{\lVert} \mathrm{Maj}_{2m+1} \hat{\rVert}_1 = \mathop{\bf E}\left[\frac{1}{2\boldsymbol{X}+1}\right] \cdot \frac{2m+1}{2^{m}}\binom{2m}{m}$, where $\boldsymbol{X} \sim \mathrm{Binomial}(m,1/2)$.
 Deduce that $\hat{\lVert} \mathrm{Maj}_n \hat{\rVert}_1 \sim \frac{2}{\sqrt{\pi}} \frac{1}{\sqrt{n}} 2^{n/2}$.
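For small $m$ these identities can be verified by brute force over all of $\{-1,1\}^{2m+1}$; the function names below are illustrative, and the check is only a numerical confirmation of the stated formula:

```python
from itertools import combinations, product
from math import comb

def maj_fourier_onenorm(m, j):
    # Brute-force sum of |f^(S)| over |S| = 2j+1 for f = Maj_{2m+1}.
    n = 2 * m + 1
    total = 0.0
    for S in combinations(range(n), 2 * j + 1):
        coeff = 0.0
        for x in product([-1, 1], repeat=n):
            f = 1 if sum(x) > 0 else -1    # Maj = sign of the sum (n odd)
            chi = 1
            for i in S:
                chi *= x[i]
            coeff += f * chi
        total += abs(coeff) / 2**n
    return total

def formula(m, j):
    # The closed form claimed in the exercise.
    return comb(m, j) / (2 * j + 1) * (2 * m + 1) / 2**(2 * m) * comb(2 * m, m)

for m in range(3):
    for j in range(m + 1):
        assert abs(maj_fourier_onenorm(m, j) - formula(m, j)) < 1e-9
```

For $m = 2$ the total Fourier $1$-norm works out to $3.5$, matching $\mathop{\bf E}\bigl[\frac{1}{2\boldsymbol{X}+1}\bigr] \cdot \frac{5}{4}\binom{4}{2} = \frac{7}{15} \cdot \frac{15}{2}$.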

 Show that for each odd $k \in {\mathbb N}$, \[ \left(\tfrac{2}{\pi}\right)^{3/2} k^{-3/2} \leq [\rho^k] (\tfrac{2}{\pi} \arcsin \rho) \leq \left(\tfrac{2}{\pi}\right)^{3/2} k^{-3/2}(1+O(1/k)). \] (Hint: Stirling’s approximation.)
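Using the Taylor series $\arcsin \rho = \sum_{j \geq 0} \frac{\binom{2j}{j}}{4^j(2j+1)} \rho^{2j+1}$, the two-sided bound can be spot-checked numerically. The factor $1 + 1/k$ below is one admissible stand-in for the $1+O(1/k)$ term, not a sharp constant:

```python
from math import comb, pi

def arcsin_coeff(k):
    # [rho^k] arcsin(rho) = C(2j, j) / (4^j * (2j+1)) for odd k = 2j + 1.
    j = (k - 1) // 2
    return comb(2 * j, j) / (4**j * (2 * j + 1))

for k in range(1, 200, 2):
    val = (2 / pi) * arcsin_coeff(k)
    low = (2 / pi) ** 1.5 * k ** (-1.5)
    # "1 + 1/k" is an illustrative instance of the 1 + O(1/k) factor.
    assert low <= val <= low * (1 + 1 / k)
```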
 Prove Corollary 20. (Hint: for the second statement you’ll need to approximate the sum $\sum_{\text{odd } j > k} \left(\tfrac{2}{\pi}\right)^{3/2} j^{-3/2}$ by an integral.)
 For integer $0 \leq k \leq n$, define $\mathcal{K}_k : \{-1,1\}^n \to {\mathbb R}$ by $\mathcal{K}_k(x) = \sum_{|S| = k} x^S$. Since $\mathcal{K}_k$ is symmetric, the value $\mathcal{K}_k(x)$ depends only on the number $z$ of $-1$’s in $x$; or equivalently, on $\sum_{i=1}^n x_i$. Thus we may define $K_k : \{0, 1, \dots, n\} \to {\mathbb R}$ by $K_k(z) = \mathcal{K}_k(x)$ for any $x$ with $\sum_i x_i = n - 2z$.
 Show that $K_k(z)$ can be expressed as a degree-$k$ polynomial in $z$. It is called the Kravchuk (or Krawtchouk) polynomial of degree $k$. (The dependence on $n$ is usually implicit.)
 Show that $\sum_{k=0}^n \mathcal{K}_k(x) = 2^n \cdot 1_{(1, \dots, 1)}(x)$.
 Show for $\rho \in [-1,1]$ that $\sum_{k=0}^n \mathcal{K}_k(x)\rho^k = 2^n\mathop{\bf Pr}[\boldsymbol{y} = (1, \dots, 1)]$, where $\boldsymbol{y} = N_\rho(x)$.
 Deduce the generating function identity $K_k(z) = [\rho^k]\bigl((1-\rho)^z(1+\rho)^{n-z}\bigr)$.
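The generating-function identity can be cross-checked numerically by expanding $(1-\rho)^z(1+\rho)^{n-z}$ via the binomial theorem and comparing with a brute-force evaluation of $\mathcal{K}_k$; all names below are illustrative:

```python
from itertools import combinations
from math import comb

def K_via_generating(n, k, z):
    # [rho^k] (1-rho)^z (1+rho)^(n-z), expanded with the binomial theorem.
    return sum((-1)**a * comb(z, a) * comb(n - z, k - a) for a in range(k + 1))

def K_via_definition(n, k, z):
    # Evaluate sum over |S| = k of x^S at some x with z coordinates = -1
    # (by symmetry, any such x gives the same value).
    x = [-1] * z + [1] * (n - z)
    total = 0
    for S in combinations(range(n), k):
        p = 1
        for i in S:
            p *= x[i]
        total += p
    return total

n = 6
for k in range(n + 1):
    for z in range(n + 1):
        assert K_via_generating(n, k, z) == K_via_definition(n, k, z)
```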
 Prove Proposition 21.
 Prove Proposition 22 using the Central Limit Theorem. (Hint for $\mathbf{W}^{1}[f_n]$: use symmetry to show it equals the square of $\mathop{\bf E}\bigl[f_n({\boldsymbol{x}}) \cdot \tfrac{1}{\sqrt{n}}\sum_i {\boldsymbol{x}}_i\bigr]$.)
 Consider the setting of Theorem 13. Let $\boldsymbol{S} = \sum_i a_i {\boldsymbol{x}}_i$ where ${\boldsymbol{x}} \sim \{-1,1\}^n$, and let $\boldsymbol{Z} \sim \mathrm{N}(0,1)$.
 Show that $\mathop{\bf Pr}[|\boldsymbol{S}| \geq t]$, $\mathop{\bf Pr}[|\boldsymbol{Z}| \geq t] \leq 2\exp(-t^2/2)$ for all $t \geq 0$.
 Recalling $\mathop{\bf E}[\boldsymbol{Y}] = \int_0^\infty \mathop{\bf Pr}[\boldsymbol{Y} \geq t]\,dt$ for any nonnegative random variable $\boldsymbol{Y}$, use the Berry–Esseen Theorem (and Remark 12, Exercise 16) to show \[ \Bigl|\mathop{\bf E}[|\boldsymbol{S}|] - \mathop{\bf E}[|\boldsymbol{Z}|]\Bigr| \leq O(\epsilon T + \exp(-T^2/2)) \] for any $T \geq 1$.
 Deduce $\bigl|\mathop{\bf E}[|\boldsymbol{S}|] - \sqrt{2/\pi}\bigr| \leq O(\epsilon\sqrt{\log(1/\epsilon)})$.
 Improve $O(\epsilon\sqrt{\log(1/\epsilon)})$ to the bound $O(\epsilon)$ stated in Theorem 13 by using the nonuniform Berry–Esseen Theorem, which states that the bound $B\beta$ in the Berry–Esseen Theorem can be improved to $C\beta\cdot\frac{1}{1+|u|^3}$ for some constant $C$, where $u$ is the point at which the two CDFs are compared.
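The Gaussian half of the tail bound in the first part of this exercise can be spot-checked numerically (the bound for $\boldsymbol{S}$ is Hoeffding's inequality, which a numerical check cannot certify over all choices of the $a_i$):

```python
from statistics import NormalDist
from math import exp

N = NormalDist()
for i in range(81):
    t = i / 10
    tail = 2 * (1 - N.cdf(t))          # Pr[|Z| >= t] for Z ~ N(0, 1)
    # The small additive slack absorbs floating-point round-off.
    assert tail <= 2 * exp(-t**2 / 2) + 1e-15
```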
 Consider the sequence of LTFs defined in Proposition 22. Show that \[ \lim_{n \to \infty} \mathbf{Stab}_\rho[f_n] = \Lambda_\rho(\mu). \] Here $\mu = \overline{\Phi}(t)$ and $\Lambda_\rho(\mu)$ is the Gaussian quadrant probability defined by $\Lambda_\rho(\mu) = \mathop{\bf Pr}[\boldsymbol{z}_1 > t, \boldsymbol{z}_2 > t]$, where $\boldsymbol{z}_1, \boldsymbol{z}_2$ are standard Gaussians with correlation $\mathop{\bf E}[\boldsymbol{z}_1\boldsymbol{z}_2] = \rho$.
 In this exercise you will complete the justification of Theorem 14 using the following multidimensional Berry–Esseen Theorem:
Theorem 33 Let $\boldsymbol{X}_1, \dots, \boldsymbol{X}_n$ be independent ${\mathbb R}^d$-valued random vectors, each having mean zero. Write $\boldsymbol{S} = \sum_{i=1}^n \boldsymbol{X}_i$ and assume $\Sigma = \mathop{\bf Cov}[\boldsymbol{S}]$ is invertible. Let $\boldsymbol{Z} \sim \mathrm{N}(0,\Sigma)$ be a $d$-dimensional Gaussian with the same mean and covariance matrix as $\boldsymbol{S}$. Then for all convex sets $U \subseteq {\mathbb R}^d$, \[ \bigl|\mathop{\bf Pr}[\boldsymbol{S} \in U] - \mathop{\bf Pr}[\boldsymbol{Z} \in U]\bigr| \leq C d^{1/4} \beta, \] where $C$ is a universal constant, $\beta = \sum_{i=1}^n \mathop{\bf E}[\|\Sigma^{-1/2} \boldsymbol{X}_i\|_2^3]$, and $\|\cdot\|_2$ denotes the Euclidean norm on ${\mathbb R}^d$.
 Let $\Sigma = \begin{bmatrix} 1 & \rho \\ \rho & 1 \end{bmatrix}$ where $\rho \in (-1,1)$. Show that \[ \Sigma^{-1} = \begin{bmatrix} 1 & -\rho \\ 0 & 1 \end{bmatrix}\begin{bmatrix} 1 & 0 \\ 0 & (1-\rho^2)^{-1} \end{bmatrix} \begin{bmatrix} 1 & 0 \\ -\rho & 1 \end{bmatrix}. \]
 Compute $y^\top \Sigma^{-1} y$ for $y = \begin{bmatrix} \pm a \\ \pm a \end{bmatrix} \in {\mathbb R}^2$.
 Complete the proof of Theorem 14.
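The two linear-algebra steps above can be checked with plain $2 \times 2$ arithmetic. The closed forms $2a^2/(1+\rho)$ (equal signs) and $2a^2/(1-\rho)$ (opposite signs) are my own computation of the quadratic form, so treat them as a claim to verify rather than as given:

```python
# Check the factorization of Sigma^{-1} and the quadratic form
# y^T Sigma^{-1} y for y = (±a, ±a), using plain 2x2 arithmetic.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

rho, a = 0.3, 1.7                       # arbitrary sample values
Sigma = [[1, rho], [rho, 1]]
L = [[1, -rho], [0, 1]]
D = [[1, 0], [0, 1 / (1 - rho**2)]]
U = [[1, 0], [-rho, 1]]
Sigma_inv = matmul(matmul(L, D), U)

# Sigma * Sigma_inv should be the identity.
I = matmul(Sigma, Sigma_inv)
assert all(abs(I[i][j] - (i == j)) < 1e-12 for i in range(2) for j in range(2))

def quad(y):
    return sum(y[i] * Sigma_inv[i][j] * y[j] for i in range(2) for j in range(2))

# Equal signs give 2a^2/(1+rho); opposite signs give 2a^2/(1-rho).
assert abs(quad([a, a]) - 2 * a**2 / (1 + rho)) < 1e-12
assert abs(quad([-a, a]) - 2 * a**2 / (1 - rho)) < 1e-12
```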
 Show that Theorem 30 is essentially optimal by exhibiting functions $f : \{-1,1\}^n \to \{-1,1\}$ with $\widehat{f}(1) = 1-\delta/2$ and $\mathbf{W}^{1}[f] \geq 1 - \delta + \Omega(\delta^2 \log(1/\delta))$, for a sequence of $\delta$ tending to $0$.
 Prove Corollary 29.
 Fill in the details of the proof of Theorem 30.
 From Theorem 31 we know that the statement “$\mathbf{I}[f] \leq O_k(1) \sqrt{n}$ for all $f \in \mathrm{PTF}_{n,k}$” implies the statement “$\mathbf{NS}_{\delta}[f] \leq O_k(1) \sqrt{\delta}$ for all $f \in \bigcup_n \mathrm{PTF}_{n,k}$”. Show also the converse implication. (Hint: Exercise 2.37.)
 Estimate carefully the asymptotics of $\mathbf{I}[f]$, where $f \in \mathrm{PTF}_{n,k}$ is as in the strongest form of the Gotsman–Linial Conjecture.
 Let $A, B \subseteq \{-1,1\}^n$ have cardinality $\alpha 2^n$ each, where $\alpha \leq 1/2$. Thinking of $\{-1,1\}^n \subset {\mathbb R}^n$, let $\mu_A, \mu_B \in {\mathbb R}^n$ be the centers of mass of $A$ and $B$. Show that these centers are close in Euclidean distance: $\|\mu_A - \mu_B\|_2 \leq O(\sqrt{\log(1/\alpha)})$.
 Show that the Gaussian isoperimetric function satisfies $\mathcal{U} \cdot \mathcal{U}'' = -1$ on $(0,1)$. Deduce that $\mathcal{U}$ is concave.
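Recalling $\mathcal{U} = \varphi \circ \Phi^{-1}$ (with $\varphi$, $\Phi$ the standard Gaussian pdf and cdf), the ODE can be spot-checked by finite differences:

```python
from statistics import NormalDist

N = NormalDist()

def U(alpha):
    # Gaussian isoperimetric function: U = phi o Phi^{-1}.
    return N.pdf(N.inv_cdf(alpha))

h = 1e-5
for alpha in (0.1, 0.25, 0.5, 0.7, 0.9):
    # Central second difference approximates U''(alpha).
    U2 = (U(alpha + h) - 2 * U(alpha) + U(alpha - h)) / h**2
    assert abs(U(alpha) * U2 + 1) < 1e-3
```

Concavity then follows since $\mathcal{U} > 0$ on $(0,1)$ forces $\mathcal{U}'' = -1/\mathcal{U} < 0$.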
 Fix $\alpha \in (0, 1/2)$. Let $f : \{-1,1\}^n \to [-1,1]$ satisfy $\mathop{\bf E}[f] \leq \alpha$ and $|\widehat{f}(i)| \leq \epsilon$ for all $i \in [n]$. Show that $\mathbf{W}^{1}[f] \leq \mathcal{U}(\alpha) + C \epsilon$, where $\mathcal{U}$ is the Gaussian isoperimetric function and where the constant $C$ may depend on $\alpha$. (Hint: you will need the nonuniform Berry–Esseen Theorem from Exercise 32.)
 In this exercise you will show by induction on $k$ that $\mathbf{I}[f] \leq 2 n^{1-1/2^k}$ for all $f : \{-1,1\}^n \to \{-1,1\}$ that are degree-$k$ PTFs. The $k = 0$ case is trivial. So for $k > 0$, suppose $f = \mathrm{sgn}(p)$ where $p : \{-1,1\}^n \to {\mathbb R}$ is a degree-$k$ polynomial that is never $0$.
 Show for $i \in [n]$ that $\mathop{\bf E}[f({\boldsymbol{x}}) {\boldsymbol{x}}_i \mathrm{sgn}(\mathrm{D}_i p({\boldsymbol{x}}))] = \mathbf{Inf}_i[f]$. (Hint: first use the decomposition $f = x_i \mathrm{D}_i f + \mathrm{E}_i f$ to reach $\mathop{\bf E}[\mathrm{D}_i f \cdot \mathrm{sgn}(\mathrm{D}_i p)]$; then show that $\mathrm{D}_if = \mathrm{sgn}(\mathrm{D}_i p)$ whenever $\mathrm{D}_i f \neq 0$.)
 Conclude that $\mathbf{I}[f] \leq \mathop{\bf E}\bigl[\bigl|\sum_i {\boldsymbol{x}}_i\, \mathrm{sgn}(\mathrm{D}_i p({\boldsymbol{x}}))\bigr|\bigr]$. Remark: when $k = 2$ and thus each $\mathrm{sgn}(\mathrm{D}_i p)$ is an LTF, it is conjectured that this bound is still $O(\sqrt{n})$.
 Apply Cauchy–Schwarz and deduce \[ \mathbf{I}[f] \leq \sqrt{n + \sum_{i \neq j} \mathop{\bf E}[{\boldsymbol{x}}_i {\boldsymbol{x}}_j \mathrm{sgn}(\mathrm{D}_i p({\boldsymbol{x}})) \mathrm{sgn}(\mathrm{D}_j p({\boldsymbol{x}}))]}. \]
 Use Exercise 2.15 and the AM–GM inequality to obtain $\mathbf{I}[f] \leq \sqrt{n + \sum_i \mathbf{I}[\mathrm{sgn}(\mathrm{D}_i p)]}$.
 Complete the induction.
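As a sanity check on the final bound, one can compute the total influence of a small degree-$2$ PTF by brute force and compare it with $2n^{1-1/2^k}$. The particular polynomial below is an arbitrary choice (the $+0.5$ shift keeps it nonzero):

```python
from itertools import product

def total_influence(f, n):
    # I[f] = sum_i Pr[f(x) != f(x with coordinate i flipped)].
    total = 0
    for x in product([-1, 1], repeat=n):
        fx = f(x)
        for i in range(n):
            y = list(x)
            y[i] = -y[i]
            if f(tuple(y)) != fx:
                total += 1
    return total / 2**n

n, k = 8, 2

def f(x):
    # A sample degree-2 PTF; the 0.5 shift keeps the polynomial nonzero.
    p = sum(x[i] * x[j] for i in range(n) for j in range(i + 1, n)) + 0.5
    return 1 if p > 0 else -1

assert total_influence(f, n) <= 2 * n ** (1 - 1 / 2**k)
```

For this particular $f$ (which equals $1$ exactly when $|\sum_i x_i| \geq 4$) the total influence comes out to $2.625$, comfortably below $2 \cdot 8^{3/4} \approx 9.51$.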