
Chapter 7 exercises
 Suppose there is an $r$-query local tester for property $\mathcal{C}$ with rejection rate $\lambda$. Show that there is a testing algorithm which, given inputs $0 < \epsilon, \delta \leq 1/2$, makes $O(\frac{r \log(1/\delta)}{\lambda \epsilon})$ (nonadaptive) queries to $f$ and satisfies the following:
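The construction is plain repetition, and the query count comes from the following arithmetic (a sketch of my own with illustrative parameter values; `repetitions_needed`, `lam`, `eps` are hypothetical names): one run rejects an $\epsilon$-far input with probability at least $\lambda\epsilon$, so $k$ independent runs all accept with probability at most $(1-\lambda\epsilon)^k \leq e^{-\lambda\epsilon k}$.

```python
import math

# Amplification-by-repetition arithmetic (my own sketch, not from the text):
# one run of the local tester rejects an eps-far input with probability at
# least lam*eps, so k independent runs all accept with probability at most
# (1 - lam*eps)^k <= exp(-lam*eps*k); picking k = ceil(ln(1/delta)/(lam*eps))
# drives this below delta, for k*r = O(r*log(1/delta)/(lam*eps)) total queries.

def repetitions_needed(lam, eps, delta):
    return math.ceil(math.log(1 / delta) / (lam * eps))

lam, eps, delta = 0.5, 0.1, 0.01
k = repetitions_needed(lam, eps, delta)      # number of independent runs
assert (1 - lam * eps) ** k <= delta         # soundness error amplified to delta
```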
 If $f \in \mathcal{C}$ then the tester accepts with probability $1$.
 If $f$ is $\epsilon$-far from $\mathcal{C}$ then the tester accepts with probability at most $\delta$.
 Let ${\mathcal M} = \{(x, y) \in \{0,1\}^{2n}: x = y\}$, the property that a string's first half matches its second half. Give a $2$-query local tester for ${\mathcal M}$ with rejection rate $1$. (Hint: locally test that $x \oplus y = (0, 0, \dots, 0)$.)
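For intuition, a brute-force check of the natural tester (my own code and helper names): pick a uniformly random $i \in [n]$ and reject unless $x_i = y_i$, i.e. check one coordinate of $x \oplus y$ against $0$.

```python
from itertools import product

# Sanity check (my own helper names, not from the text): the tester that
# compares a random coordinate pair rejects with probability
# (# mismatched pairs)/n, which is at least dist(w, M).

def reject_prob(x, y):
    n = len(x)
    return sum(xi != yi for xi, yi in zip(x, y)) / n

def dist_to_M(x, y):
    # one bit-flip repairs each mismatched pair, out of 2n bits total
    n = len(x)
    return sum(xi != yi for xi, yi in zip(x, y)) / (2 * n)

n = 4
for x in product([0, 1], repeat=n):
    for y in product([0, 1], repeat=n):
        assert reject_prob(x, y) >= dist_to_M(x, y)
```

In fact the rejection probability is exactly $2 \cdot \mathrm{dist}(w, {\mathcal M})$, so rejection rate $1$ holds with room to spare.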
 Reduce the proof length in Example 15 to $n-2$.
 Verify the claim from Example 12 regarding the $2$-query tester for the property that a string has all its coordinates equal. (Hint: use $\pm 1$ notation.)
 Let ${\mathcal O} = \{w \in {\mathbb F}_2^n : w \text{ has an odd number of } 1\text{'s}\}$. Let $T$ be any $(n-1)$-query string testing algorithm which accepts every $w \in {\mathcal O}$ with probability $1$. Show that $T$ in fact accepts every string $v \in {\mathbb F}_2^n$ with probability $1$ (even though $\mathrm{dist}(w,{\mathcal O}) = \frac{1}{n} > 0$ for half of all strings $w$). Thus locally testing ${\mathcal O}$ requires $n$ queries.
 Let $T$ be a $2$-query testing algorithm for functions $\{-1,1\}^n \to \{-1,1\}$. Suppose that $T$ accepts every dictator with probability $1$. Show that it also accepts $\mathrm{Maj}_{n'}$ with probability $1$ for every odd $n' \leq n$. This shows that there is no $2$-query local tester for dictatorship assuming $n > 2$. (Hint: you'll need to enumerate all predicates on up to $2$ bits.)
 For every $\alpha < 1$, show that there is no $(\alpha,1)$-Dictator-vs.-No-Notables test using Max-E$3$-Lin predicates. (Hint: consider large odd parities.)

 Consider the following $3$-query testing algorithm for $f : \{0,1\}^n \to \{0,1\}$. Let ${\boldsymbol{x}}, \boldsymbol{y} \sim \{0,1\}^n$ be independent and uniformly random, define $\boldsymbol{z} \in \{0,1\}^n$ by $\boldsymbol{z}_i = {\boldsymbol{x}}_i \wedge \boldsymbol{y}_i$ for each $i \in [n]$, and accept if $f({\boldsymbol{x}}) \wedge f(\boldsymbol{y}) = f(\boldsymbol{z})$. Let $p_k$ be the probability that this test accepts a parity function $\chi_S : \{0,1\}^n \to \{0,1\}$ with $|S| = k$. Show that $p_0 = p_1 = 1$ and that in general $p_k \leq \tfrac{1}{2} + 2^{-k}$. In fact, you might like to show that $p_k = \tfrac{1}{2} + (\frac{3}{4} - \frac{1}{4}(-1)^k)2^{-k}$. (Hint: it suffices to consider $k = n$ and then compute the correlation of $\chi_{\{1,\dots, n\}} \wedge \chi_{\{n+1, \dots, 2n\}}$ with the bent function $\mathrm{IP}_{2n}$.)
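The formula for $p_k$ can be confirmed by exhaustive enumeration for small $k$ (my own verification sketch; since only the coordinates in $S$ matter, we take $S = [k]$ and $n = k$):

```python
from itertools import product

# Exhaustive check of the claimed formula for p_k (my own code).

def parity(bits):
    return sum(bits) % 2

def p(k):
    # exact acceptance probability of the AND test on the k-bit parity
    total = hits = 0
    for x in product([0, 1], repeat=k):
        for y in product([0, 1], repeat=k):
            z = tuple(a & b for a, b in zip(x, y))
            total += 1
            hits += (parity(x) & parity(y)) == parity(z)
    return hits / total

for k in range(7):
    predicted = 1/2 + (3/4 - (1/4) * (-1) ** k) * 2 ** (-k)
    assert abs(p(k) - predicted) < 1e-12
```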
 Show how to obtain a $3$-query local tester for dictatorship by combining the following subtests: (i) the Odd BLR Test; (ii) the test from part (a).
 Obtain the largest explicit rejection rate in Theorem 7 that you can. Can you improve your bound by doing the BLR and NAE Tests with probabilities other than $1/2, 1/2$?

 Say that $A$ is an $(\alpha,\beta)$-distinguishing algorithm for Max-$\mathrm{CSP}(\Psi)$ if it outputs `YES' on instances with value at least $\beta$ and outputs `NO' on instances with value strictly less than $\alpha$. (On each instance with value in $[\alpha, \beta)$, algorithm $A$ may have either output.) Show that if there is an efficient $(\alpha,\beta)$-approximation algorithm for Max-$\mathrm{CSP}(\Psi)$ then there is also an efficient $(\alpha,\beta)$-distinguishing algorithm for Max-$\mathrm{CSP}(\Psi)$.
 Consider Max-$\mathrm{CSP}(\Psi)$, where $\Psi$ is a class of predicates closed under restrictions (to nonconstant functions); e.g., Max-$3$-Sat. Show that if there is an efficient $(1,1)$-distinguishing algorithm then there is also an efficient $(1,1)$-approximation algorithm. (Hint: try out all labels for the first variable and use the distinguisher.)

 Let $\phi$ be a CNF of size $s$ and width $w \geq 3$ over variables $x_1, \dots, x_n$. Show that there is an ``equivalent'' CNF $\phi'$ of size at most $(w-2)s$ and width $3$ over the variables $x_1, \dots, x_n$ plus auxiliary variables $\Pi_1, \dots, \Pi_\ell$, with $\ell \leq (w-3)s$. Here ``equivalent'' means that for every $x$ such that $\phi(x) = \mathsf{True}$ there exists $\Pi$ such that $\phi'(x,\Pi) = \mathsf{True}$; and, for every $x$ such that $\phi(x) = \mathsf{False}$ we have $\phi'(x,\Pi) = \mathsf{False}$ for all $\Pi$.
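The counts come from the classic clause-splitting chain. The sketch below (my own code and naming conventions, with literals encoded as signed integers) splits one width-$w$ clause into $w-2$ width-$3$ clauses using $w-3$ auxiliaries, and brute-force checks the ``equivalent'' condition:

```python
from itertools import product

# Clause-splitting sketch (my own conventions): (l1 v ... v lw) becomes
# (l1 v l2 v a1), (-a1 v l3 v a2), ..., (-a_{w-3} v l_{w-1} v l_w),
# i.e. w-2 width-3 clauses over w-3 fresh auxiliary variables.

def split_clause(lits, next_aux):
    # lits: literals as nonzero ints (+i means x_i, -i means NOT x_i)
    w = len(lits)
    if w <= 3:
        return [list(lits)]
    aux = list(range(next_aux, next_aux + w - 3))
    out = [[lits[0], lits[1], aux[0]]]
    for j in range(w - 4):
        out.append([-aux[j], lits[j + 2], aux[j + 1]])
    out.append([-aux[-1], lits[-2], lits[-1]])
    return out

def sat_lit(lit, assign):            # assign maps variable index -> bool
    return assign[abs(lit)] if lit > 0 else not assign[abs(lit)]

w = 5
orig = [1, -2, 3, 4, -5]             # one width-5 clause
clauses = split_clause(orig, next_aux=6)
aux_vars = [6, 7]
for bits in product([False, True], repeat=w):
    assign = dict(zip(range(1, w + 1), bits))
    clause_true = any(sat_lit(l, assign) for l in orig)
    # does SOME setting of the auxiliaries satisfy all width-3 clauses?
    ok = any(all(any(sat_lit(l, {**assign, **dict(zip(aux_vars, abits))})
                     for l in c) for c in clauses)
             for abits in product([False, True], repeat=len(aux_vars)))
    assert ok == clause_true
```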
 Extend the above so that every clause in $\phi'$ has width exactly $3$ (the size may increase by $O(s)$).
 Suppose there exists an $r$-query PCPP reduction $\mathcal{R}_1$ with rejection rate $\lambda$. Show that there exists a $3$-query PCPP reduction $\mathcal{R}_2$ with rejection rate at least $\lambda/(r2^r)$. The proof length of $\mathcal{R}_2$ should be at most $r2^r\cdot m$ plus the proof length of $\mathcal{R}_1$ (where $m$ is the description size of $\mathcal{R}_1$'s output) and the predicates output by the reduction should all be logical ORs applied to exactly three literals. (Hint: Exercises 4.1, 11.)

 Give a polynomial-time algorithm $R$ which takes as input a general Boolean circuit $C$ and outputs a width-$3$ CNF formula $\phi$ with the following guarantee: $C$ is satisfiable if and only if $\phi$ is satisfiable. (Hint: introduce a variable for each gate in $C$.)
 The previous exercise in fact formally justifies the following statement: ``$(1,1)$-distinguishing Max-$3$-Sat is $\mathsf{NP}$-hard''. (See Exercise 10 for the definition of $(1,1)$-distinguishing.) Argue that indeed, if $(1,1)$-distinguishing (or $(1,1)$-approximating) Max-$3$-Sat is in polynomial time then so is Circuit-Sat.
 Prove Theorem 34. (Hint: Exercise 7.11(b).)
 Describe an efficient $(1,1)$-approximation algorithm for Max-Cut.

 Let $H$ be any subspace of ${\mathbb F}_2^n$ and let ${\mathcal H} = \{\chi_\gamma : {\mathbb F}_2^n \to \{-1,1\} \mid \gamma \in H^\perp\}$. Give a $3$-query local tester for ${\mathcal H}$ with rejection rate $1$. (Hint: similar to BLR, but with $\langle \varphi_H * f, f * f \rangle$.)
 Generalize to the case that $H$ is any affine subspace of ${\mathbb F}_2^n$.
 Let $A$ be any affine subspace of ${\mathbb F}_2^n$. Construct a $3$-query, length-$2^n$ PCPP system for $A$ with rejection rate a positive universal constant. (Hint: given $w \in {\mathbb F}_2^n$, the tester should expect the proof $\Pi \in \{-1,1\}^{2^n}$ to encode the truth table of $\chi_w$. Use Exercise 15 and also a consistency check based on local correcting of $\Pi$ at $e_{\boldsymbol{i}}$, where $\boldsymbol{i} \in [n]$ is uniformly random.)

 Give a $3$-query, length-$O(n)$ PCPP system (with rejection rate a positive universal constant) for the class $\{w \in {\mathbb F}_2^n : \mathrm{IP}_{n}(w) = 1\}$, where $\mathrm{IP}_{n}$ is the inner product mod $2$ function ($n$ even).
 Do the same for the complete quadratic function $\mathrm{CQ}_n$ from Exercise 1.1.
 In this exercise you will prove Theorem 20.
 Let $D \in {\mathbb F}_2^{n \times n}$ be a nonzero matrix and suppose ${\boldsymbol{x}}, \boldsymbol{y} \sim {\mathbb F}_2^n$ are uniformly random and independent. Show that $\mathop{\bf Pr}[\boldsymbol{y}^\top D {\boldsymbol{x}} \neq 0] \geq \frac14$.
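This can be confirmed exhaustively for small $n$ (a check of my own; the bound is tight for rank-$1$ matrices $D = \gamma\delta^\top$, for which $\boldsymbol{y}^\top D {\boldsymbol{x}} = (\gamma \cdot \boldsymbol{y})(\delta \cdot {\boldsymbol{x}})$ is nonzero with probability exactly $\frac14$):

```python
from itertools import product

# Exhaustive verification over all nonzero D in F_2^{n x n} for small n
# (my own code): returns the minimum of Pr_{x,y}[y^T D x != 0].

def min_nonzero_prob(n):
    vecs = list(product([0, 1], repeat=n))
    best = 1.0
    for flat in product([0, 1], repeat=n * n):
        if not any(flat):
            continue                  # skip D = 0
        D = [flat[i * n:(i + 1) * n] for i in range(n)]
        hits = sum(
            sum(y[i] * D[i][j] * x[j] for i in range(n) for j in range(n)) % 2
            for x in vecs for y in vecs)
        best = min(best, hits / len(vecs) ** 2)
    return best

assert min_nonzero_prob(2) == 0.25    # the rank-1 matrices achieve exactly 1/4
```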
 Let $\gamma \in {\mathbb F}_2^n$ and $\Gamma \in {\mathbb F}_2^{n \times n}$. Suppose ${\boldsymbol{x}}, \boldsymbol{y} \sim {\mathbb F}_2^n$ are uniformly random and independent. Show that $\mathop{\bf Pr}[(\gamma^\top {\boldsymbol{x}})(\gamma^\top \boldsymbol{y}) = \Gamma \bullet ({\boldsymbol{x}} \boldsymbol{y}^\top)]$ is $1$ if $\Gamma = \gamma \gamma^{\top}$ and is at most $\frac34$ otherwise. Here we use the notation $B \bullet C = \sum_{i,j} B_{ij} C_{ij}$ for matrices $B, C \in {\mathbb F}_2^{n \times n}$.
 Suppose you are given query access to two functions $\ell : {\mathbb F}_2^n \to {\mathbb F}_2$ and $q : {\mathbb F}_2^{n \times n} \to {\mathbb F}_2$. Give a $4$-query testing algorithm with the following two properties (for some universal constant $\lambda > 0$): (i) if $\ell = \chi_\gamma$ and $q = \chi_{\gamma \gamma^\top}$ for some $\gamma \in {\mathbb F}_2^n$, the test accepts with probability $1$; (ii) for all $0 \leq \epsilon \leq 1$, if the test accepts with probability at least $1 - \lambda\cdot \epsilon$ then there exists some $\gamma \in {\mathbb F}_2^n$ such that $\ell$ is $\epsilon$-close to $\chi_\gamma$ and $q$ is $\epsilon$-close to $\chi_{\gamma \gamma^\top}$. (Hint: apply the BLR Test to $\ell$ and $q$; and, use part (b) with local correcting on $q$.)
 Let $L$ be a list of homogeneous degree-$2$ polynomial equations over variables $w_1, \dots, w_n \in {\mathbb F}_2$. (Each equation is of the form $\sum_{i,j=1}^n c_{ij} w_i w_j = b$ for constants $b, c_{ij} \in {\mathbb F}_2$; we remark that $w_i^2 = w_i$.) Define the string property ${\mathcal L} = \{w \in {\mathbb F}_2^n : w \text{ satisfies all equations in } L\}$. Give a $4$-query, length-$(2^n + 2^{n^2})$ PCPP system for ${\mathcal L}$ (with rejection rate a positive universal constant). (Hint: the tester should expect the truth tables of $\chi_w$ and $\chi_{ww^\top}$. You will need part (c) as well as Exercise 15 applied to ``$q$''.)
 Complete the proof of Theorem 20. (Hints: given $w \in \{0,1\}^n$, the tester should expect a proof consisting of all gate values $\bar{w} \in \{0,1\}^{\mathrm{size}(C)}$ in $C$'s computation on $w$, as well as truth tables of $\chi_{\bar{w}}$ and $\chi_{\bar{w}\bar{w}^\top}$. Show that $\bar{w}$ being a valid computation of $C$ is encodable with a list of homogeneous degree-$2$ polynomial equations. Add a consistency check between $w$ and $\bar{w}$ using local correcting, and reduce the number of queries to $3$ using Exercise 12.)
 Verify the connection between $\mathrm{Opt}({\mathcal P})$ and $C$'s satisfiability stated in the proof sketch of Theorem 36. (Hint: every string $w$ is $1$-far from the empty property.)
 A randomized assignment for an instance ${\mathcal P}$ of a CSP over domain $\Omega$ is a mapping $\boldsymbol{F}$ which labels each variable in $V$ with a probability distribution over domain elements. Given a constraint $(S,\psi)$ with $S = (v_1, \dots, v_r)$, we write $\psi(\boldsymbol{F}(S)) \in [0,1]$ for the expected value of $\psi(\boldsymbol{F}(v_1), \dots, \boldsymbol{F}(v_r))$. This is simply the probability that $\psi$ is satisfied when one actually draws from the domain distributions assigned by $\boldsymbol{F}$. Finally, we define the value of $\boldsymbol{F}$ to be $\mathrm{Val}_{\mathcal P}(\boldsymbol{F}) = \mathop{\bf E}_{(\boldsymbol{S}, \boldsymbol{\psi}) \sim {\mathcal P}}[\boldsymbol{\psi}(\boldsymbol{F}(\boldsymbol{S}))]$.
 Suppose that $A$ is a deterministic algorithm which produces a randomized assignment of value $\alpha$ on a given instance ${\mathcal P}$. Show a simple modification to $A$ which makes it a randomized algorithm which produces a (normal) assignment whose value is $\alpha$ in expectation. (Thus, in constructing approximation algorithms we may allow ourselves to output randomized assignments.)
 Let $A$ be the deterministic Max-E$3$-Sat algorithm which on every instance outputs the randomized assignment which assigns the uniform distribution on $\{0,1\}$ to each variable. Show that this is a $(\frac78,\beta)$-approximation algorithm for any $\beta$. Show also that the same algorithm is a $(\frac12,\beta)$-approximation algorithm for Max-$3$-Lin.
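The two constants come from counting satisfying assignments of a single constraint under the uniform distribution (a quick check of mine):

```python
from itertools import product
from fractions import Fraction

# Counting check (my own code): a width-3 OR over distinct literals is
# satisfied by 7 of the 8 uniform settings of its variables, and a
# 3-variable F_2-linear equation by exactly 4 of the 8.

def expected_value(pred):
    vals = [pred(x) for x in product([0, 1], repeat=3)]
    return Fraction(sum(vals), len(vals))

or3 = lambda x: int(x[0] or x[1] or x[2])            # an E3Sat clause
lin3 = lambda x: int((x[0] + x[1] + x[2]) % 2 == 1)  # a 3Lin equation

assert expected_value(or3) == Fraction(7, 8)
assert expected_value(lin3) == Fraction(1, 2)
```

Negating literals or changing the equation's right-hand side just permutes the $8$ settings, so every constraint, not only these representatives, has the same expectation.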
 When the domain $\Omega$ is $\{-1,1\}$, we may model a randomized assignment as a function $f : V \to [-1,1]$; here $f(v) = \mu$ is interpreted as the unique probability distribution on $\{-1,1\}$ which has mean $\mu$. Now given a constraint $(S,\psi)$ with $S = (v_1, \dots, v_r)$, show that the value of $f$ on this constraint is in fact $\psi(f(v_1), \dots, f(v_r))$, where we identify $\psi : \{-1,1\}^r \to \{0,1\}$ with its multilinear (Fourier) expansion. (Hint: Exercise 1.5.)
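The identity can be spot-checked numerically (my own sketch; `psi` is a randomly generated predicate): the Fourier expansion evaluated at the means agrees with the direct expectation over independent $\pm 1$ bits with those means.

```python
from itertools import product, combinations
import random

# Numerical spot-check (mine, not from the text) of: value of a randomized
# +/-1 assignment on a constraint = multilinear extension of psi at the means.

def prod(it):
    out = 1
    for t in it:
        out *= t
    return out

r = 3
random.seed(0)
psi = {x: random.randint(0, 1) for x in product([-1, 1], repeat=r)}

def fourier_coeff(S):      # psi_hat(S) = E_x[psi(x) * prod_{i in S} x_i]
    return sum(v * prod(x[i] for i in S) for x, v in psi.items()) / 2 ** r

mu = [0.3, -0.5, 0.8]      # the randomized assignment's means, in [-1,1]

multilinear = sum(fourier_coeff(S) * prod(mu[i] for i in S)
                  for k in range(r + 1) for S in combinations(range(r), k))

# Pr[b = x] = prod_i (1 + x_i*mu_i)/2 for independent mean-mu_i bits
direct = sum(v * prod((1 + x[i] * mu[i]) / 2 for i in range(r))
             for x, v in psi.items())

assert abs(multilinear - direct) < 1e-12
```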
 Let $\Psi$ be a collection of predicates over domain $\{-1,1\}$. Let $\nu = \min_{\psi \in \Psi} \{\widehat{\psi}(\emptyset)\}$. Show that outputting the randomized assignment $f \equiv 0$ is an efficient $(\nu, \beta)$-approximation algorithm for Max-$\mathrm{CSP}(\Psi)$.
 Let $\boldsymbol{F}$ be a randomized assignment of value $\alpha$ for CSP instance ${\mathcal P}$ (as in Exercise 20). Give an efficient deterministic algorithm which outputs a usual assignment $F$ of value at least $\alpha$. (Hint: try all possible labellings for the first variable and compute the expected value that would be achieved if $\boldsymbol{F}$ were used for the remaining variables. Pick the best label for the first variable and repeat.)
 Given a local tester for functions $f : \{-1,1\}^n \to \{-1,1\}$, we can interpret it also as a tester for functions $f : \{-1,1\}^n \to [-1,1]$; simply view the tester as a CSP and view the acceptance probability as the value of $f$ when treated as a randomized assignment (as in Exercise 20(c)). Equivalently, whenever the tester ``queries'' $f(x)$, imagine that what is returned is a random bit $\boldsymbol{b} \in \{-1,1\}$ whose mean is $f(x)$. This interpretation completes Definition 38 of Dictator-vs.-No-Notables tests for functions $f : \{-1,1\}^n \to [-1,1]$ (see Remark 39). Given this definition, verify that the Håstad$_\delta$ Test is indeed a $(\tfrac{1}{2}, 1-\delta)$-Dictator-vs.-No-Notables test. (Hint: show that the formula for the probability that the Håstad$_\delta$ Test accepts $f$ still holds for functions $f : \{-1,1\}^n \to [-1,1]$. There is only one subsequent inequality which uses that $f$'s range is $\{-1,1\}$, and it still holds with range $[-1,1]$.)
 Let $\Psi$ be a finite set of predicates over domain $\Omega = \{-1,1\}$ which is closed under negating variables. (An example is the scenario of Max-$\psi$ from Remark 24.) In this exercise you will show that Dictator-vs.-No-Notables tests using $\Psi$ may assume $f : \{-1,1\}^n \to [-1,1]$ is odd without loss of generality.
 Let $T$ be an $(\alpha,\beta)$-Dictator-vs.-No-Notables test using predicate set $\Psi$ which works under the assumption that $f : \{-1,1\}^n \to [-1,1]$ is odd. Modify $T$ as follows: whenever it is about to query $f(x)$, with probability $\tfrac{1}{2}$ let it use $f(x)$ and with probability $\tfrac{1}{2}$ let it use $-f(-x)$. Call the modified test $T'$. Show that the probability $T'$ accepts an arbitrary $f : \{-1,1\}^n \to [-1,1]$ is equal to the probability $T$ accepts $f^\mathrm{odd}$.
 Prove that $T'$ is an $(\alpha,\beta)$-Dictator-vs.-No-Notables test using predicate set $\Psi$ for functions $f : \{-1,1\}^n \to [-1,1]$.
 This problem is similar to Exercise 23 in that it shows you may assume that Dictator-vs.-No-Notables tests are testing ``smoothed'' functions of the form $\mathrm{T}_{1-\delta} g$ for $g : \{-1,1\}^n \to [-1,1]$, so long as you are willing to lose $O(\delta)$ in the probability that dictators are accepted.
 Let $U$ be an $(\alpha,\beta)$-Dictator-vs.-No-Notables test using an arity-$r$ predicate set $\Psi$ (over domain $\{-1,1\}$) which works under the assumption that the function $f : \{-1,1\}^n \to [-1,1]$ being tested is of the form $\mathrm{T}_{1-\delta} g$ for $g : \{-1,1\}^n \to [-1,1]$. Modify $U$ as follows: whenever it is about to query $f(x)$, let it draw $\boldsymbol{y} \sim N_{1-\delta}(x)$ and use $f(\boldsymbol{y})$ instead. Call the modified test $U'$. Show that the probability $U'$ accepts an arbitrary $g : \{-1,1\}^n \to [-1,1]$ is equal to the probability $U$ accepts $\mathrm{T}_{1-\delta} g$.
 Prove that $U'$ is an $(\alpha,\beta - r\delta/2)$-Dictator-vs.-No-Notables test using predicate set $\Psi$.
 Give a slightly alternate proof of Theorem 43 by using the original BLR Test analysis and applying Exercises 23, 24.
 Show that when using Theorem 41, it suffices to have a ``Dictators-vs.-No-Influentials test'', meaning replacing $\mathbf{Inf}_i^{(1-\epsilon)}[f]$ in Definition 38 with just $\mathbf{Inf}_i[f]$. (Hint: Exercise 24.)
 For $q \in {\mathbb N}^+$, Unique-Games($q$) refers to the arity-$2$ CSP with domain $\Omega = [q]$ in which all $q!$ ``bijective'' predicates are allowed; here $\psi$ is ``bijective'' if there is a bijection $\pi : [q] \to [q]$ such that $\psi(i,j) = 1$ iff $\pi(j) = i$. Show that $(1,1)$-approximating Unique-Games($q$) can be done in polynomial time. (The Unique Games Conjecture of Khot [Kho02] states that for all $\delta > 0$ there exists $q \in {\mathbb N}^+$ such that $(\delta, 1-\delta)$-approximating Unique-Games($q$) is $\mathsf{NP}$-hard.)
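The $(1,1)$-approximation is just constraint propagation: in a satisfiable instance, fixing the label of one vertex per connected component forces every other label along the bijections. A sketch (my own code and data conventions — constraints are $(u, v, \pi)$ triples meaning $F(u) = \pi(F(v))$):

```python
from collections import deque

# Sketch (my own conventions) of the standard (1,1)-approximation for
# Unique-Games(q): try all q labels for one root per connected component
# and propagate forced labels by BFS.

def solve_unique_games(n, q, constraints):
    adj = [[] for _ in range(n)]
    for u, v, pi in constraints:
        inv = [0] * q
        for j in range(q):
            inv[pi[j]] = j
        adj[u].append((v, inv))      # label(v) is forced to pi^{-1}[label(u)]
        adj[v].append((u, pi))       # label(u) is forced to pi[label(v)]
    label = [None] * n
    for root in range(n):
        if label[root] is not None:
            continue
        for guess in range(q):       # try each possible root label
            trial, ok, queue = {root: guess}, True, deque([root])
            while queue and ok:
                u = queue.popleft()
                for v, f in adj[u]:
                    forced = f[trial[u]]
                    if v not in trial:
                        trial[v] = forced
                        queue.append(v)
                    elif trial[v] != forced:
                        ok = False   # contradiction: wrong root guess
                        break
            if ok:
                for v, l in trial.items():
                    label[v] = l
                break
        if label[root] is None:
            return None              # component unsatisfiable: value < 1
    return label

# a satisfiable toy instance over q = 3
cons = [(0, 1, [1, 2, 0]), (1, 2, [2, 0, 1]), (0, 2, [0, 1, 2])]
labels = solve_unique_games(3, 3, cons)
assert all(labels[u] == pi[labels[v]] for u, v, pi in cons)
```

If some component admits no consistent labelling, the instance has value strictly less than $1$, so a $(1,1)$-approximation is allowed to output anything there.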
 In this problem you will show that Corollary 44 actually follows directly from Corollary 45.
 Consider the ${\mathbb F}_2$-linear equation $v_1 + v_2 + v_3 = 0$. Exhibit a list of $4$ clauses (i.e., logical ORs of literals) over the variables such that if the equation is satisfied then so are all $4$ clauses, but if the equation is not satisfied then at most $3$ of the clauses are. Do the same for the equation $v_1 + v_2 + v_3 = 1$.
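For readers who want to check a candidate gadget, here is the standard one, verified by enumeration (my own encoding — a literal is a pair $(i, \text{negated?})$ — and a spoiler for the exercise): each clause rules out exactly one of the four assignments violating the equation, and for the $v_1 + v_2 + v_3 = 1$ case one can simply negate $v_1$ in every clause.

```python
from itertools import product

# Each clause kills exactly one of the four assignments with odd parity,
# i.e. the assignments violating v1 + v2 + v3 = 0 (mod 2).

clauses_eq0 = [[(0, True), (1, False), (2, False)],   # kills (1,0,0)
               [(0, False), (1, True), (2, False)],   # kills (0,1,0)
               [(0, False), (1, False), (2, True)],   # kills (0,0,1)
               [(0, True), (1, True), (2, True)]]     # kills (1,1,1)

def num_sat(clauses, x):
    # literal (i, neg) is true iff x[i] != neg
    return sum(any(x[i] != neg for i, neg in c) for c in clauses)

for x in product([0, 1], repeat=3):
    assert num_sat(clauses_eq0, x) == (4 if sum(x) % 2 == 0 else 3)
```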
 Suppose that for every $\delta > 0$ there is an efficient algorithm for $(\frac78 + \delta, 1 - \delta)$-approximating Max-E$3$-Sat. Give, for every $\delta > 0$, an efficient algorithm for $(\frac12 + \delta, 1 - \delta)$-approximating Max-E$3$-Lin.
 Alternatively, show how to transform any $(\alpha,\beta)$-Dictator-vs.-No-Notables test using Max-E$3$-Lin predicates into a $(\frac34 + \frac14 \alpha, \beta)$-Dictator-vs.-No-Notables test using Max-E$3$-Sat predicates.
 In this exercise you will prove Theorem 42.
 Recall the predicate $\mathrm{OXR}$ from Exercise 1.1. Fix a small $0 < \delta < 1$. The remainder of the exercise will be devoted to constructing a $(\frac34 + \delta/4,1)$-Dictator-vs.-No-Notables test using Max-$\mathrm{OXR}$ predicates. Show how to convert this to a $(\frac78 + \delta/8,1)$-Dictator-vs.-No-Notables test using Max-E$3$-Sat predicates. (Hint: similar to Exercise 28(c).)
 By Exercise 23, it suffices to construct a $(\frac34 + \delta/4,1)$-Dictator-vs.-No-Notables test using the $\mathrm{OXR}$ predicate assuming $f : \{-1,1\}^n \to [-1,1]$ is odd. Håstad tests $\mathrm{OXR}(f({\boldsymbol{x}}), f(\boldsymbol{y}), f(\boldsymbol{z}))$ where ${\boldsymbol{x}}, \boldsymbol{y}, \boldsymbol{z} \in \{-1,1\}^n$ are chosen randomly as follows: For each $i \in [n]$ (independently), with probability $1-\delta$ choose $({\boldsymbol{x}}_i, \boldsymbol{y}_i, \boldsymbol{z}_i)$ uniformly subject to ${\boldsymbol{x}}_i\boldsymbol{y}_i\boldsymbol{z}_i = -1$, and with probability $\delta$ choose $({\boldsymbol{x}}_i, \boldsymbol{y}_i, \boldsymbol{z}_i)$ uniformly subject to $\boldsymbol{y}_i\boldsymbol{z}_i = -1$. Show that the probability this test accepts an odd $f : \{-1,1\}^n \to [-1,1]$ is \begin{equation} \label{eqn:oxrbound} \tfrac34 - \tfrac14 \mathbf{Stab}_{-\delta}[f] - \tfrac14\sum_{S \subseteq [n]} \widehat{f}(S)^2 \mathop{\bf E}_{\boldsymbol{J} \subseteq_{1-\delta} S}[(-1)^{|\boldsymbol{J}|}\widehat{f}(\boldsymbol{J})], \end{equation} where $\boldsymbol{J} \subseteq_{1-\delta} S$ denotes that $\boldsymbol{J}$ is a $(1-\delta)$-random subset of $S$. In particular, show that dictators are accepted with probability $1$.
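A tiny exact check of the test's conditioning (my own code; convention: $-1$ encodes True, so $\mathrm{OXR}(a,b,c) = a \vee (b \oplus c)$ reads as below): every triple allowed by either case satisfies $\mathrm{OXR}$, so a dictator $f = \chi_i$, whose three query bits form exactly such a triple in coordinate $i$, is accepted with probability $1$ regardless of $\delta$.

```python
# Convention: -1 encodes True; OXR(a, b, c) = a OR (b XOR c).

def oxr(a, b, c):
    return a == -1 or b != c

triples = [(x, y, z) for x in (-1, 1) for y in (-1, 1) for z in (-1, 1)]
# case occurring with probability 1 - delta in each coordinate: x*y*z = -1
assert all(oxr(x, y, z) for (x, y, z) in triples if x * y * z == -1)
# case occurring with probability delta in each coordinate: y*z = -1
assert all(oxr(x, y, z) for (x, y, z) in triples if y * z == -1)
```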
 Upper-bound \eqref{eqn:oxrbound} by \[ \tfrac34 + \delta/4 + \tfrac14 \sqrt{(1-\delta)^t} + \tfrac14 \sum_{|S| \leq t} \widehat{f}(S)^2 \mathop{\bf E}_{\boldsymbol{J} \subseteq_{1-\delta} S}[\widehat{f}(\boldsymbol{J})], \] or something stronger. (Hint: Cauchy–Schwarz.)
 Complete the proof that this is a $(\frac34 + \delta/4, 1)$-Dictator-vs.-No-Notables test, assuming $f$ is odd.
 In this exercise you will prove Theorem 41. Assume there exists an $(\alpha,\beta)$-Dictator-vs.-No-Notables test $T$ using predicate set $\Psi$ over domain $\{-1,1\}$. We define a certain efficient algorithm $R$, which takes as input an instance $\mathcal{G}$ of Unique-Games($q$) and outputs an instance ${\mathcal P}$ of Max-$\mathrm{CSP}(\Psi)$. For simplicity we refer to the variables $V$ of the Unique-Games instance $\mathcal{G}$ as ``vertices'' and its constraints as ``edges''. We also assume that when $\mathcal{G}$ is viewed as an undirected graph, it is regular. (By a result of Khot–Regev [KR08] this assumption is without loss of generality for the purposes of the Unique Games Conjecture.) The Max-$\mathrm{CSP}(\Psi)$ instance ${\mathcal P}$ output by algorithm $R$ will have variable set $V \times \{-1,1\}^{q}$ and we write assignments for it as collections of functions $(f_v)_{v \in V}$, where $f_v : \{-1,1\}^q \to \{-1,1\}$. The draw of a random constraint for ${\mathcal P}$ is defined as follows:
 Choose $\boldsymbol{u} \in V$ uniformly at random.
 Draw a random constraint from the test $T$; call it $\boldsymbol{\psi}(f({\boldsymbol{x}}^{(1)}), \dots, f({\boldsymbol{x}}^{(\boldsymbol{r})}))$.
 Choose $\boldsymbol{r}$ random ``neighbours'' $\boldsymbol{v}_1, \dots, \boldsymbol{v}_{\boldsymbol{r}}$ of $\boldsymbol{u}$ in $\mathcal{G}$, independently and uniformly. (By a neighbour of $\boldsymbol{u}$, we mean a vertex $v$ such that either $(\boldsymbol{u},v)$ or $(v,\boldsymbol{u})$ is the scope of a constraint in $\mathcal{G}$.) Since $\mathcal{G}$'s constraints are bijective, we may assume that the associated scopes are $(\boldsymbol{u}, \boldsymbol{v}_1), \dots, (\boldsymbol{u}, \boldsymbol{v}_{\boldsymbol{r}})$ with bijections $\boldsymbol{\pi}_{1}, \dots, \boldsymbol{\pi}_{\boldsymbol{r}} : [q] \to [q]$.
 Output the constraint $\boldsymbol{\psi}(f_{\boldsymbol{v}_1}^{\boldsymbol{\pi}_1}({\boldsymbol{x}}^{(1)}), \dots, f_{\boldsymbol{v}_{\boldsymbol{r}}}^{\boldsymbol{\pi}_{\boldsymbol{r}}}({\boldsymbol{x}}^{(\boldsymbol{r})}))$, where we use the permutation notation $f^\pi$ from Exercise 1.29.
 Suppose $\mathrm{Opt}(\mathcal{G}) \geq 1-\delta$. Show that there is an assignment for ${\mathcal P}$ with value at least $\beta - O(\delta)$ in which each $f_v$ is a dictator. (You will use regularity of $\mathcal{G}$ here.) Thus $\mathrm{Opt}({\mathcal P}) \geq \beta - O(\delta)$.
 Given an assignment $F = (f_v)_{v \in V}$ for ${\mathcal P}$, introduce for each $u \in V$ the function $g_u : \{-1,1\}^q \to [-1,1]$ defined by $g_u(x) = \mathop{\bf E}_{\boldsymbol{v}}[f_{\boldsymbol{v}}^{\boldsymbol{\pi}}(x)]$, where $\boldsymbol{v}$ is a random neighbour of $u$ in $\mathcal{G}$ and $\boldsymbol{\pi}$ is the associated constraint's permutation. Show that $\mathrm{Val}_{{\mathcal P}}(F) = \mathop{\bf E}_{\boldsymbol{u} \in V}[\mathrm{Val}_{T}(g_{\boldsymbol{u}})]$ (using the definition from Exercise 22).
 Fix an $\epsilon > 0$ and suppose that $\mathrm{Val}_{{\mathcal P}}(F) \geq \alpha + 2\lambda(\epsilon)$, where $\lambda$ is the ``rejection rate'' associated with $T$. Show that for at least a $\lambda(\epsilon)$-fraction of vertices $u \in V$, the set $\text{NbrNotable}_u = \{i \in [q] : \mathbf{Inf}_i^{(1-\epsilon)}[g_u] > \epsilon\}$ is nonempty.
 Show that for any $u \in V$ and $i \in [q]$ we have $\mathop{\bf E}[\mathbf{Inf}^{(1-\epsilon)}_{\boldsymbol{\pi}^{-1}(i)}[f_{\boldsymbol{v}}]] \geq \mathbf{Inf}_{i}^{(1-\epsilon)}[g_u]$, where $\boldsymbol{v}$ is a random neighbour of $u$ and $\boldsymbol{\pi}$ is the associated constraint's permutation. (Hint: Exercise 2.41.)
 For $v \in V$, define also the set $\text{Notable}_v = \{i \in [q] : \mathbf{Inf}_i^{(1-\epsilon)}[f_v] \geq \epsilon/2\}$. Show that if $i \in \text{NbrNotable}_u$ then $\mathop{\bf Pr}_{\boldsymbol{v}}[\boldsymbol{\pi}^{-1}(i) \in \text{Notable}_{\boldsymbol{v}}] \geq \epsilon/2$, where $\boldsymbol{v}$ and $\boldsymbol{\pi}$ are as in the previous part.
 Show that for every $u \in V$ we have $|\text{Notable}_u \cup \text{NbrNotable}_u| \leq O(1/\epsilon^2)$. (Hint: Proposition 2.53.)
 Consider the following randomized assignment for $\mathcal{G}$ (see Exercise 20): for each $u \in V$, give it the uniform distribution on $\text{Notable}_u \cup \text{NbrNotable}_u$ (if this set is nonempty; otherwise, give it an arbitrary labelling). Show that this randomized assignment has value $\Omega(\lambda(\epsilon) \epsilon^5)$.
 Conclude Theorem 41, where ``UG-hard'' means ``$\mathsf{NP}$-hard assuming the Unique Games Conjecture''.
 Technically, Exercise 30 has a small bug: since a Dictator-vs.-No-Notables test using predicate set $\Psi$ is allowed to use duplicate query strings in its predicates (see Remark 39), the reduction in the previous exercise does not necessarily output instances of Max-$\mathrm{CSP}(\Psi)$, because our definition of CSPs requires that each scope consist of distinct variables. In this exercise you will correct this bug. Let $M \in {\mathbb N}^+$ and suppose we modify the algorithm $R$ from Exercise 30 to a new algorithm $R'$, producing an instance ${\mathcal P}'$ with variable set $V \times [M] \times \{-1,1\}^{q}$. We now think of assignments to ${\mathcal P}'$ as $M$-tuples of functions $f_v^1, \dots, f_v^M$, one tuple for each $v \in V$. Further, thinking of ${\mathcal P}$ as a function tester, we have ${\mathcal P}'$ act as follows: whenever ${\mathcal P}$ is about to query $f_v(x)$, we have ${\mathcal P}'$ instead query $f_v^{\boldsymbol{j}}(x)$ for a uniformly random $\boldsymbol{j} \in [M]$.
 Show that $\mathrm{Opt}({\mathcal P}) = \mathrm{Opt}({\mathcal P}')$.
 Show that if we delete all constraints in ${\mathcal P}'$ for which the scope contains duplicates, then $\mathrm{Opt}({\mathcal P}')$ changes by at most $1/M$.
 Show that the deleted version of ${\mathcal P}'$ is a genuine instance of Max-$\mathrm{CSP}(\Psi)$. Since the constant $1/M$ can be made arbitrarily small, this corrects the bug in Exercise 30's proof of Theorem 41.


In Ex. 31(b), I think we should replace $1/M$ by something like $r^2/M$, where $r$ is the maximum arity of the predicates in $\Psi$. This is like the birthday paradox: among $r$ independently chosen copies, some pair collides with probability on the order of $r^2/M$. Maybe this isn't tight, but if $M < r$, we can certainly cook up situations in which *every* constraint in $\mathcal{P}'$ has repeats in its scope, and so we can't expect any nontrivial guarantee in this regime.