## §4.4: Håstad’s Switching Lemma and the spectrum of DNFs

Let’s further investigate how random restrictions can simplify DNF formulas.

[...]

## §4.3: Random restrictions

In this section we describe the method of random restrictions, a very “Fourier-friendly” way of simplifying a boolean function.

[...]
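For concreteness, here is a minimal Python sketch of drawing a $\delta$-random restriction: each coordinate is left free independently with probability $\delta$, and is otherwise fixed to a uniformly random value in $\{-1, 1\}$. (The function name and the use of `None` to mark free coordinates are illustrative choices, not notation from the text.)

```python
import random

def delta_random_restriction(n, delta, rng=random):
    """Sample a delta-random restriction on coordinates 0..n-1.

    Each coordinate stays free (marked None) independently with
    probability delta; otherwise it is fixed to a uniformly random
    bit in {-1, +1}.
    """
    return [None if rng.random() < delta else rng.choice((-1, 1))
            for _ in range(n)]
```

On average $\delta n$ coordinates remain free, which is what lets a random restriction drastically simplify a formula while preserving enough of the function to reason about.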

## §4.2: Tribes

In this section we study the tribes DNF formulas, which serve as important examples and counterexamples in the analysis of boolean functions. Perhaps the most notable feature of the tribes function is that (for a suitable choice of parameters) it is essentially unbiased and yet all of its influences are quite tiny.

[...]

## §4.1: DNF formulas

Perhaps the commonest way of representing a boolean function $f : \{0,1\}^n \to \{0,1\}$ is by a DNF formula:

[...]
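To make the representation concrete, here is a minimal Python sketch of evaluating a DNF formula, an OR of terms where each term is an AND of literals. (Encoding a literal as a pair `(i, b)`, meaning “bit $x_i$ must equal $b$”, is an illustrative choice of ours.)

```python
def eval_dnf(terms, x):
    """Evaluate a DNF formula at input x in {0,1}^n.

    `terms` is a list of terms; each term is a list of literals
    (i, b) with b in {0, 1}, read as "x[i] must equal b".
    The formula outputs 1 iff some term has all its literals satisfied.
    """
    return int(any(all(x[i] == b for (i, b) in term) for term in terms))
```

For example, the formula $(x_1 \wedge \overline{x_2}) \vee x_3$ would be encoded as `[[(0, 1), (1, 0)], [(2, 1)]]`.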

## §3.5: Highlight: the Goldreich–Levin Algorithm

We close this chapter by briefly describing a topic which is in some sense the “opposite” of learning theory: cryptography. At the highest level, cryptography is concerned with constructing functions which are computationally easy to compute but computationally difficult to invert. Intuitively, think about the task of encrypting secret messages: you would like a [...]

## §3.4: Learning theory

Computational learning theory is an area of algorithms research devoted to the following task: given a source of “examples” $(x, f(x))$ from an unknown function $f$, compute a “hypothesis” function $h$ which is good at predicting $f(y)$ on future inputs $y$. We will focus on just one possible formulation of the task:

[...]
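A basic primitive in Fourier-based learning algorithms is estimating a single coefficient $\hat{f}(S) = \mathbf{E}[f(\boldsymbol{x})\chi_S(\boldsymbol{x})]$ from labeled examples by an empirical average. A minimal Python sketch in the $\pm 1$ convention (the function names are ours):

```python
def chi(S, x):
    """The character chi_S(x) = prod_{i in S} x_i, for x in {-1,1}^n."""
    p = 1
    for i in S:
        p *= x[i]
    return p

def estimate_coefficient(S, examples):
    """Estimate hat{f}(S) = E[f(x) chi_S(x)] by the empirical average
    over labeled examples (x, f(x)), x drawn uniformly from {-1,1}^n."""
    return sum(y * chi(S, x) for x, y in examples) / len(examples)
```

By a Chernoff bound, $O(\log(1/\delta)/\epsilon^2)$ examples suffice to estimate any single coefficient to within $\pm\epsilon$ except with probability $\delta$; this is the kind of subroutine the learning algorithms in this chapter are built from.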

## §3.3: Restrictions

A common operation on boolean functions $f : \{-1,1\}^n \to {\mathbb R}$ is restriction to subcubes. Suppose $[n]$ is partitioned into two sets, $J$ and $\overline{J} = [n] \setminus J$. If the input bits in $\overline{J}$ are fixed to constants, the result is a function $\{-1,1\}^J \to {\mathbb R}$. For example, if we [...]
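The restriction operation described above can be sketched in a few lines of Python (the interface, passing the fixed coordinates as a dict, is an illustrative choice of ours):

```python
def restrict(f, n, fixed):
    """Restrict f : {-1,1}^n -> R to a subcube.

    `fixed` is a dict {i: value} giving constants for the coordinates
    in J-bar; the remaining coordinates J stay free, so the result is
    a function {-1,1}^J -> R (free coordinates listed in index order).
    """
    J = [i for i in range(n) if i not in fixed]
    def g(y):  # y supplies one value per free coordinate, in order of J
        x = [0] * n
        for i, v in fixed.items():
            x[i] = v
        for i, yi in zip(J, y):
            x[i] = yi
        return f(x)
    return g
```

For instance, restricting $f(x) = x_1 x_2 + x_3$ by fixing $x_2 = -1$ yields the two-variable function $-x_1 + x_3$ on the free coordinates $J = \{1, 3\}$.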