
Chapter 2 exercises, continued
 Show that $\mathrm{T}_\rho$ is positivity-preserving for $\rho \in [-1,1]$; i.e., $f \geq 0 \Rightarrow \mathrm{T}_\rho f \geq 0$. Show that $\mathrm{T}_\rho$ is positivity-improving for $\rho \in (-1,1)$; i.e., $f \geq 0, f \neq 0 \Rightarrow \mathrm{T}_\rho f > 0$.
 Show that $\mathrm{T}_\rho$ satisfies the semigroup property: $\mathrm{T}_{\rho_1} \mathrm{T}_{\rho_2} = \mathrm{T}_{\rho_1 \rho_2}$.
 Use Exercises 1.1(e),(f) to deduce the formulas $\mathrm{E}_i f = \sum_{S \not \ni i} \widehat{f}(S)\, \chi_S$ and $\mathrm{T}_\rho f = \sum_S \rho^{|S|} \widehat{f}(S)\,\chi_S$.
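These Fourier formulas make properties of $\mathrm{T}_\rho$, such as the semigroup property above, easy to check by brute force for small $n$. Below is a quick sketch (an illustration, not part of the exercises) that computes $\mathrm{T}_\rho f$ both from the noise-operator definition and from the Fourier formula $\mathrm{T}_\rho f = \sum_S \rho^{|S|} \widehat{f}(S)\chi_S$, using $\mathrm{Maj}_3$ as the test function:

```python
from itertools import product
from math import prod

n = 3
cube = list(product([-1, 1], repeat=n))
subsets = list(product([0, 1], repeat=n))  # indicator vectors of S subseteq [n]

def chi(S, x):
    return prod(x[i] for i in range(n) if S[i])

def fourier(f):
    # \hat{f}(S) = E_x[f(x) chi_S(x)]
    return {S: sum(f[x] * chi(S, x) for x in cube) / 2**n for S in subsets}

def T(rho, f):
    # Fourier formula: T_rho f = sum_S rho^{|S|} \hat{f}(S) chi_S
    fh = fourier(f)
    return {x: sum(rho**sum(S) * fh[S] * chi(S, x) for S in subsets) for x in cube}

def T_direct(rho, f):
    # Definition: T_rho f(x) = E_{y ~ N_rho(x)}[f(y)], each bit kept w.p. (1+rho)/2
    def pr(x, y):
        return prod((1 + rho) / 2 if a == b else (1 - rho) / 2 for a, b in zip(x, y))
    return {x: sum(pr(x, y) * f[y] for y in cube) for x in cube}

f = {x: 1 if sum(x) > 0 else -1 for x in cube}  # Maj_3
# Fourier formula agrees with the definition:
assert all(abs(T(0.6, f)[x] - T_direct(0.6, f)[x]) < 1e-9 for x in cube)
# Semigroup property: T_{rho1} T_{rho2} = T_{rho1 rho2}
assert all(abs(T(0.6, T(-0.3, f))[x] - T(-0.18, f)[x]) < 1e-9 for x in cube)
```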
 Show that $|\mathrm{T}_\rho f| \leq \mathrm{T}_{|\rho|} |f|$ pointwise for any $f : \{-1,1\}^n \to {\mathbb R}$.
 Show that $\mathbf{Stab}_{-\rho}[f] = -\mathbf{Stab}_{\rho}[f]$ if $f$ is odd and $\mathbf{Stab}_{-\rho}[f] = \mathbf{Stab}_{\rho}[f]$ if $f$ is even.
 For each function $f$ in Exercise 1.1, compute $\mathbf{Stab}_\rho[f]$.
 Compute $\mathbf{Stab}_\rho[\mathrm{Tribes}_{w,s}]$.
 Suppose $f : \{-1,1\}^n \to \{-1,1\}$ has $\min(\mathop{\bf Pr}[f = 1], \mathop{\bf Pr}[f = -1]) = \alpha$. Show that $\mathbf{NS}_\delta[f] \leq 2\alpha$ for all $\delta \in [0,1]$.
 Verify Fact 52.
 Fix $f : \{-1,1\}^n \to {\mathbb R}$. Show that $\mathbf{Stab}_\rho[f]$ is a convex function of $\rho$ on $[0,1]$.
 Let $f : \{-1,1\}^n \to \{-1,1\}$. Show that $\mathbf{NS}_\delta[f] \leq \delta \mathbf{I}[f]$ for all $\delta \in [0,1]$.
 Define the average influence of $f : \{-1,1\}^n \to {\mathbb R}$ to be $\pmb{\mathscr{E}}[f] = \frac{1}{n} \mathbf{I}[f]$. Now for $f : \{-1,1\}^n \to \{-1,1\}$, show \[ \pmb{\mathscr E}[f] = \mathop{\bf Pr}_{\substack{{\boldsymbol{x}} \sim \{-1,1\}^n \\ \boldsymbol{i} \sim [n]}}[f({\boldsymbol{x}}) \neq f({\boldsymbol{x}}^{\oplus \boldsymbol{i}})] \quad \text{and} \quad \tfrac{1-e^{-2}}{2} \pmb{\mathscr E}[f] \leq \mathbf{NS}_{1/n}[f] \leq \pmb{\mathscr E}[f]. \]
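These bounds are easy to test numerically. The sketch below (an illustration, not part of the exercises) checks $\mathbf{NS}_\delta[f] \leq \delta\,\mathbf{I}[f]$ and the $\mathbf{NS}_{1/n}$ sandwich for $f = \mathrm{Maj}_3$ by exact enumeration:

```python
from itertools import product
from math import prod, exp

n = 3
cube = list(product([-1, 1], repeat=n))
f = {x: 1 if sum(x) > 0 else -1 for x in cube}   # Maj_3

def noise_sensitivity(f, delta):
    # NS_delta[f] = Pr[f(x) != f(y)], where y flips each bit of x indep. w.p. delta
    total = 0.0
    for x in cube:
        for flips in product([0, 1], repeat=n):
            p = prod(delta if b else 1 - delta for b in flips)
            y = tuple(-a if b else a for a, b in zip(x, flips))
            total += p / 2**n * (f[x] != f[y])
    return total

def total_influence(f):
    return sum(
        sum(f[x] != f[x[:i] + (-x[i],) + x[i+1:]] for x in cube) / 2**n
        for i in range(n)
    )

I = total_influence(f)                               # = 3/2 for Maj_3
for k in range(1, 11):
    d = k / 10
    assert noise_sensitivity(f, d) <= d * I + 1e-9   # NS_delta <= delta * I

avg = I / n                                          # average influence
ns = noise_sensitivity(f, 1 / n)
assert (1 - exp(-2)) / 2 * avg <= ns <= avg          # the sandwich above
```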
 Suppose $f_1, \dots, f_s : \{-1,1\}^n \to \{-1,1\}$ satisfy $\mathbf{NS}_\delta[f_i] \leq \epsilon_i$. Let $g : \{-1,1\}^s \to \{-1,1\}$ and define $h : \{-1,1\}^n \to \{-1,1\}$ by $h = g(f_1, \dots, f_s)$. Show that $\mathbf{NS}_\delta[h] \leq \sum_{i=1}^s \epsilon_i$.
 Complete the proof of Proposition 53 by showing that $(1-\delta)^{k-1} k \leq 1/\delta$ for all $0 < \delta \leq 1$ and $k \in {\mathbb N}^+$. (Hint: compare both sides with $1 + (1-\delta) + (1-\delta)^2 + \cdots + (1-\delta)^{k-1}$.)
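As a quick numerical sanity check of this inequality (not a proof): each term $(1-\delta)^j$ of the hint's geometric series is at least $(1-\delta)^{k-1}$, and the full series sums to at most $1/\delta$.

```python
# Check k*(1-d)^(k-1) <= 1/d on a grid of k and delta.
ok = all(
    k * (1 - d) ** (k - 1) <= 1 / d + 1e-9
    for k in range(1, 201)
    for d in (j / 100 for j in range(1, 101))
)
assert ok
```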
 Fixing $f : \{-1,1\}^n \to {\mathbb R}$, show the following Lipschitz bound for $\mathbf{Stab}_\rho[f]$ when $0 \leq \rho-\epsilon \leq \rho < 1$: \[ \bigl|\mathbf{Stab}_{\rho}[f] - \mathbf{Stab}_{\rho-\epsilon}[f]\bigr| \leq \epsilon \cdot \frac{1}{1-\rho} \cdot \mathop{\bf Var}[f]. \] (Hint: use the Mean Value Theorem and the previous exercise.)
 Suppose that $\mathop{\bf F}$ is a functional on functions $f : \{-1,1\}^n \to {\mathbb R}$ expressible as $\mathop{\bf F}[f] = \sum_{S} c_S \widehat{f}(S)^2$ where $c_S \geq 0$ for all $S \subseteq [n]$. (Examples include $\mathop{\bf Var}$, $\mathbf{W}^{k}$, $\mathbf{Inf}_i$, $\mathbf{I}$, $\mathbf{Inf}^{(1-\delta)}_i$, and $\mathbf{Stab}_\rho$ for $\rho \geq 0$.) Show that $\mathop{\bf F}$ is convex, meaning $\mathop{\bf F}[\lambda f + (1-\lambda) g] \leq \lambda \mathop{\bf F}[f] + (1-\lambda) \mathop{\bf F}[g]$ for all $f$, $g$, and $\lambda \in [0,1]$.
 Extend the FKN Theorem as follows: Suppose $f : \{-1,1\}^n \to \{-1,1\}$ has $\mathbf{W}^{\leq 1}[f] \geq 1-\delta$. Show that $f$ is $O(\delta)$-close to a $1$-junta. (Hint: consider $g(x_0, x) = x_0 f(x_0 x)$.)
 Compute the precise probability of a Condorcet winner (under impartial culture) in a $3$candidate, $3$voter election using $f = \mathrm{Maj}_3$.
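For a brute-force answer, one can enumerate all $6^3$ preference profiles. The sketch below (an illustration, not the intended pen-and-paper solution) encodes each voter's ranking as an NAE triple $(x_i, y_i, z_i)$ of pairwise preferences and counts how often the three $\mathrm{Maj}_3$ outcomes are not all equal:

```python
from itertools import product

# A ranking of candidates a, b, c, encoded by pairwise preferences:
# x = +1 iff a > b, y = +1 iff b > c, z = +1 iff c > a.
# Valid rankings are exactly the NAE (not-all-equal) triples.
rankings = [t for t in product([-1, 1], repeat=3) if len(set(t)) > 1]
assert len(rankings) == 6

def maj3(bits):
    return 1 if sum(bits) > 0 else -1

winner_count = 0
for profile in product(rankings, repeat=3):   # impartial culture: 6^3 profiles
    x, y, z = zip(*profile)                   # the three pairwise elections
    outcome = (maj3(x), maj3(y), maj3(z))
    if len(set(outcome)) > 1:                 # NAE outcome <=> Condorcet winner
        winner_count += 1

probability = winner_count / 6**3
```

The enumeration gives $204/216 = 17/18$: the only profiles without a Condorcet winner are the $12$ "cyclic" ones.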

 Arrow’s Theorem for $3$ candidates is slightly more general than what we stated: it allows for three different unanimous functions $f, g, h : \{-1,1\}^n \to \{-1,1\}$ to be used in the three pairwise elections. But show that if using $f$, $g$, $h$ always gives rise to a Condorcet winner then $f = g = h$. (Hint: first show $g(x) = -f(-x)$ for all $x$ by using the fact that $x$, $y = -x$, and $z = (f(x), \dots, f(x))$ is always a valid possibility for the votes.)
 Extend Arrow’s Theorem to the case of Condorcet elections with more than $3$ candidates.
 The polarizations of $f : \{-1,1\}^n \to {\mathbb R}$ (also known as compressions, downshifts, or two-point rearrangements) are defined as follows. For $i \in [n]$, the $i$-polarization of $f$ is the function $f^{\sigma_i} : \{-1,1\}^n \to {\mathbb R}$ defined by \[ f^{\sigma_i}(x) = \begin{cases} \max \{f(x^{(i \mapsto +1)}), f(x^{(i \mapsto -1)})\} & \text{if } x_i = +1, \\ \min \{f(x^{(i \mapsto +1)}), f(x^{(i \mapsto -1)})\} & \text{if } x_i = -1. \end{cases} \]
 Show that $\mathop{\bf E}[f^{\sigma_i}] = \mathop{\bf E}[f]$ and $\|f^{\sigma_i}\|_p = \|f\|_p$ for all $p$.
 Show that $\mathbf{Inf}_{j}[f^{\sigma_i}] \leq \mathbf{Inf}_{j}[f]$ for all $j \in [n]$.
 Show that $f^{\sigma_i}$ is monotone in the $i$th direction (recall Exercise 5). Further, show that if $f$ is monotone in the $j$th direction for some $j \in [n]$ then $f^{\sigma_i}$ is still monotone in the $j$th direction.
 Let $f^* = f^{\sigma_1 \sigma_2 \cdots \sigma_n}$. Show that $f^*$ is monotone, $\mathop{\bf E}[f^*] = \mathop{\bf E}[f]$, and $\mathbf{Inf}_j[f^*] \leq \mathbf{Inf}_j[f]$ for all $j \in [n]$.
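A brute-force check of these claims on a random function (a sketch, not part of the exercise; `polarize` follows the definition above and is applied in the order $\sigma_1, \dots, \sigma_n$):

```python
from itertools import product
import random

n = 4
cube = list(product([-1, 1], repeat=n))

def polarize(f, i):
    # f^{sigma_i}: put the larger of each pair on the x_i = +1 side
    g = {}
    for x in cube:
        hi, lo = x[:i] + (1,) + x[i+1:], x[:i] + (-1,) + x[i+1:]
        g[x] = max(f[hi], f[lo]) if x[i] == 1 else min(f[hi], f[lo])
    return g

def influence(f, j):
    return sum(f[x] != f[x[:j] + (-x[j],) + x[j+1:]] for x in cube) / 2**n

def is_monotone(f):
    return all(
        f[x[:j] + (-1,) + x[j+1:]] <= f[x[:j] + (1,) + x[j+1:]]
        for x in cube for j in range(n)
    )

random.seed(0)
f = {x: random.choice([-1, 1]) for x in cube}
fstar = f
for i in range(n):                  # f* = f^{sigma_1 sigma_2 ... sigma_n}
    fstar = polarize(fstar, i)

assert is_monotone(fstar)
assert sum(fstar.values()) == sum(f.values())            # E[f*] = E[f]
assert all(influence(fstar, j) <= influence(f, j) + 1e-9 for j in range(n))
```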
 The Hamming distance $\mathrm{Dist}(x,y) = \#\{i : x_i \neq y_i\}$ on the discrete cube $\{-1,1\}^n$ is an example of an $\ell_1$ metric space. For $D \geq 1$, we say that the discrete cube can be embedded into $\ell_2$ with distortion $D$ if there is a mapping $F : \{-1,1\}^n \to {\mathbb R}^m$ for some $m \in {\mathbb N}$ such that: \begin{align*} \|F(x) - F(y)\|_2 &\geq \mathrm{Dist}(x,y) \text{ for all $x$, $y$;} \tag{“no contraction”}\\ \|F(x) - F(y)\|_2 &\leq D \cdot \mathrm{Dist}(x,y) \text{ for all $x$, $y$.} \tag{“expansion at most $D$”} \end{align*} In this exercise you will show that the least distortion possible is $D = \sqrt{n}$.
 Recalling the definition of $f^\mathrm{odd}$ from Exercise 1.9, show that for any $f : \{-1,1\}^n \to {\mathbb R}$ we have $\|f^\mathrm{odd}\|_2^2 \leq \mathbf{I}[f]$ and hence \[ \mathop{\bf E}_{{\boldsymbol{x}}}[(f({\boldsymbol{x}}) - f(-{\boldsymbol{x}}))^2] \leq \sum_{i = 1}^n \mathop{\bf E}_{{\boldsymbol{x}}}\Bigl[\bigl(f({\boldsymbol{x}}) - f({\boldsymbol{x}}^{\oplus i})\bigr)^2\Bigr]. \]
 Suppose $F : \{-1,1\}^n \to {\mathbb R}^m$, and write $F(x) = (f_1(x), f_2(x), \dots, f_m(x))$ for functions $f_i : \{-1,1\}^n \to {\mathbb R}$. By summing the above inequality over $i \in [m]$, show that any $F$ with no contraction must have expansion at least $\sqrt{n}$.
 Show that there is an embedding $F$ achieving distortion $\sqrt{n}$.
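One natural candidate (an assumption of this sketch, not spelled out in the exercise) is the scaled identity map $F(x) = \frac{\sqrt{n}}{2}\,x$, for which $\|F(x) - F(y)\|_2 = \sqrt{n}\sqrt{\mathrm{Dist}(x,y)}$. The check below confirms it has no contraction and expansion at most $\sqrt{n}$:

```python
from itertools import product
from math import dist, sqrt   # math.dist: Euclidean distance (Python >= 3.8)

n = 4
cube = list(product([-1, 1], repeat=n))
F = {x: tuple(sqrt(n) / 2 * xi for xi in x) for x in cube}   # F(x) = (sqrt(n)/2) x

def hamming(x, y):
    return sum(a != b for a, b in zip(x, y))

for x in cube:
    for y in cube:
        d = hamming(x, y)
        e = dist(F[x], F[y])              # = sqrt(n) * sqrt(d)
        assert e >= d - 1e-9              # no contraction (since d <= n)
        assert e <= sqrt(n) * d + 1e-9    # expansion at most sqrt(n)
```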
 Give a Fourier-free proof of the Poincaré inequality by induction on $n$.
 Let $V$ be a vector space with norm $\| \cdot \|$ and fix $w_1, \dots, w_n \in V$. Define $g : \{-1,1\}^n \to {\mathbb R}$ by $g(x) = \left\|\sum_{i=1}^n x_i w_i\right\|$.
 Show that $\mathrm{L} g \leq g$ pointwise. (Hint: triangle inequality.)
 Deduce $2\mathop{\bf Var}[g] \leq \mathop{\bf E}[g^2]$ and thus the following Khintchine–Kahane inequality: \[ \mathop{\bf E}_{{\boldsymbol{x}}}\left[\left\|\mathop{{\textstyle \sum}}_{i=1}^n {\boldsymbol{x}}_i w_i\right\|\right] \geq \frac{1}{\sqrt{2}} \cdot \mathop{\bf E}_{{\boldsymbol{x}}}\left[\left\|\mathop{{\textstyle \sum}}_{i=1}^n {\boldsymbol{x}}_i w_i\right\|^2\right]^{1/2}. \] (Hint: Exercise 24.)
 Show that the constant $\frac{1}{\sqrt{2}}$ above is optimal, even if $V = {\mathbb R}$.
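For intuition, with $V = {\mathbb R}$ the ratio $\mathop{\bf E}\bigl|\sum_i {\boldsymbol{x}}_i w_i\bigr| \big/ \mathop{\bf E}\bigl[(\sum_i {\boldsymbol{x}}_i w_i)^2\bigr]^{1/2}$ can be computed exactly by enumeration. The sketch below (with some arbitrary weights of my choosing) checks the inequality and confirms that $w = (1,1)$ attains the constant $\frac{1}{\sqrt{2}}$:

```python
from itertools import product
from math import sqrt

def kk_ratio(ws):
    # E|sum x_i w_i| / E[(sum x_i w_i)^2]^(1/2) over uniform signs (V = R)
    n = len(ws)
    vals = [abs(sum(x * w for x, w in zip(xs, ws)))
            for xs in product([-1, 1], repeat=n)]
    m1 = sum(vals) / 2**n
    m2 = sqrt(sum(v * v for v in vals) / 2**n)
    return m1 / m2

assert kk_ratio([3.0, 1.0, 2.0]) >= 1 / sqrt(2) - 1e-9
# w = (1, 1) attains the constant: E|x1 + x2| = 1, E[(x1 + x2)^2]^(1/2) = sqrt(2)
assert abs(kk_ratio([1.0, 1.0]) - 1 / sqrt(2)) < 1e-9
```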
 In the correlation distillation problem, a source chooses ${\boldsymbol{x}} \sim \{-1,1\}^n$ uniformly at random and broadcasts it to $q$ parties. We assume that the transmissions suffer from some kind of noise, and therefore the parties receive imperfect copies $\boldsymbol{y}^{(1)}, \dots, \boldsymbol{y}^{(q)}$ of ${\boldsymbol{x}}$. The parties are not allowed to communicate, and despite having imperfectly correlated information they wish to agree on a single random bit. In other words, the $i$th party will output a bit $f_i(\boldsymbol{y}^{(i)}) \in \{-1,1\}$, and the goal is to find functions $f_1, \dots, f_q$ that maximize the probability that $f_1(\boldsymbol{y}^{(1)}) = f_2(\boldsymbol{y}^{(2)}) = \cdots = f_q(\boldsymbol{y}^{(q)})$. To avoid trivial deterministic solutions, we insist that $\mathop{\bf E}[f_j(\boldsymbol{y}^{(j)})]$ be $0$ for all $j \in [q]$.
 Suppose $q = 2$, $\rho \in (0,1)$, and $\boldsymbol{y}^{(j)} \sim N_\rho({\boldsymbol{x}})$ independently for each $j$. Show that the optimal solution is $f_1 = f_2 = \pm \chi_i$ for some $i \in [n]$. (Hint: you’ll need Cauchy–Schwarz.)
 Show the same result for $q = 3$.
 Let $q = 2$ and $\rho \in (\frac12, 1)$. Suppose that $\boldsymbol{y}^{(1)} = {\boldsymbol{x}}$ exactly, but $\boldsymbol{y}^{(2)} \in \{-1, 0, 1\}^n$ has erasures: it’s formed from ${\boldsymbol{x}}$ by setting $\boldsymbol{y}^{(2)}_i = {\boldsymbol{x}}_i$ with probability $\rho$ and $\boldsymbol{y}^{(2)}_i = 0$ with probability $1-\rho$, independently for all $i \in [n]$. Show that the optimal success probability is $\tfrac{1}{2} + \tfrac{1}{2} \rho$ and there is an optimal solution in which $f_1 = \pm \chi_i$ for any $i \in [n]$. (Hint: eliminate the source, and introduce a fictitious party $1′$…)
 Consider the previous scenario but with $\rho \in (0,\frac12)$. Show that if $n$ is sufficiently large then the optimal solution does not have $f_1 = \pm \chi_i$.
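A simulation of the erasure scenario with $\rho > \frac12$ (a sketch; the strategy I give party $2$ — copy $\boldsymbol{y}^{(2)}_i$ when it survives, otherwise guess a fair coin — is one natural choice, and it matches the optimal value $\frac12 + \frac12\rho$ claimed above):

```python
import random

random.seed(1)
rho, n, trials = 0.8, 5, 200_000
i = 0                    # f_1 = chi_i: party 1 outputs its i-th bit

hits = 0
for _ in range(trials):
    x = [random.choice([-1, 1]) for _ in range(n)]
    y2 = [b if random.random() < rho else 0 for b in x]     # erasure channel
    out1 = x[i]
    out2 = y2[i] if y2[i] != 0 else random.choice([-1, 1])  # guess on erasure
    hits += (out1 == out2)

success = hits / trials
assert abs(success - (0.5 + 0.5 * rho)) < 0.01              # = 1/2 + rho/2
```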

 Let $g : \{-1,1\}^n \to {\mathbb R}^{\geq 0}$ have $\mathop{\bf E}[g] = \delta$. Show that for any $\rho \in [0,1]$, \[ \rho \sum_{j=1}^n \widehat{g}(j) \leq \delta + \sum_{k = 2}^n \rho^k \|g^{=k}\|_\infty. \] (Hint: Exercise 26.)
 Assume further that $g : \{-1,1\}^n \to \{0,1\}$. Show that $\|g^{=k}\|_\infty \leq \sqrt{\delta} \sqrt{\binom{n}{k}}$. (Hint: first bound $\|g^{=k}\|_2^2$.) Deduce $\rho \sum_{j=1}^n \widehat{g}(j) \leq \delta + 2\rho^2 \sqrt{\delta}\, n$, assuming $\rho \leq \frac{1}{2\sqrt{n}}$.
 Show that $\sum_{j=1}^n \widehat{g}(j) \leq 2\sqrt{2} \delta^{3/4} \sqrt{n}$ (assuming $\delta \leq 1/4$). Deduce $\mathbf{W}^{1}[g] \leq 2\sqrt{2} \cdot \delta^{7/4} \sqrt{n}$ (hint: show $\widehat{g}(j) \leq \delta$ for all $j$).
 Suppose $f : \{-1,1\}^n \to \{-1,1\}$ is monotone and $\mathbf{MaxInf}[f] \leq \delta$. Show $\mathbf{W}^{2}[f] \leq \sqrt{2} \cdot \delta^{3/4} \cdot \mathbf{I}[f] \cdot \sqrt{n}$.
 Suppose further that $f$ is unbiased. Show that $\mathbf{MaxInf}[f] \leq o(n^{-2/3})$ implies $\mathbf{I}[f] \geq 3 - o(1)$; conclude $\mathbf{MaxInf}[f] \geq \frac{3}{n} - o(1/n)$. (Hint: extend Exercise 25.) Use Exercise 45 to remove the assumption that $f$ is monotone for these statements.
 Let $V$ be a vector space with norm $\| \cdot \|_V$. If $f : \{-1,1\}^n \to V$ we can define its Fourier coefficients $\widehat{f}(S) \in V$ by the usual formula $\widehat{f}(S) = \mathop{\bf E}_{{\boldsymbol{x}} \in \{-1,1\}^n}[f({\boldsymbol{x}}) {\boldsymbol{x}}^S]$. We may also define $\|f\|_p = \mathop{\bf E}_{{\boldsymbol{x}} \in \{-1,1\}^n}[\|f({\boldsymbol{x}})\|_V^p]^{1/p}$. Finally, if the norm $\| \cdot \|_V$ arises from an inner product $\langle \cdot, \cdot \rangle_V$ on $V$ we can define an inner product on functions $f, g : \{-1,1\}^n \to V$ by $\langle f, g \rangle = \mathop{\bf E}_{{\boldsymbol{x}} \in \{-1,1\}^n}[\langle f({\boldsymbol{x}}), g({\boldsymbol{x}})\rangle_V]$. The material developed so far in this book has used $V = {\mathbb R}$ with $\langle \cdot, \cdot \rangle_V$ being multiplication. Explore the extent to which this material extends to the more general setting.


in #48, probably f should be g?
Fixed, thanks.
In #33, it’s too hard to prove using just the material given so far. It may be right for $\delta \in [0,1]$, but I wonder whether the question should say $\delta \in [0,1/2]$, since $\mathbf{NS}_\delta$ is increasing on that range.
It should be correct still for $\delta \in [1/2,1]$. Hint: union bound.
Thank you. I get it.
In 49 (56 in the book), it looks like a typo: $\mathop{\bf E}[f_i(\boldsymbol{y}^{(j)})]=0$. (It doesn’t make sense in (c), where the $\boldsymbol{y}^{(j)}$ have different domains.)
Yes, thanks. Evidently I decided to change $i$ to $j$ at some point and didn’t change it in enough places!