boofun.analysis.hypercontractivity
Hypercontractivity tools for Boolean function analysis.
Hypercontractivity is a key technique from Chapter 9 of O’Donnell’s Analysis of Boolean Functions, providing strong bounds on the L_q norms of low-degree polynomials.
Key results:
- Bonami’s Lemma: bounds on L_q norms of low-degree polynomials
- KKL Theorem: at least one variable has influence Ω(log n / n)
- Friedgut’s Junta Theorem: low total influence implies closeness to a junta

Mathematical Background:
The noise operator T_ρ acts on functions by:
    T_ρ f(x) = E_y[f(y)], where y is ρ-correlated with x
In the Fourier domain: (T_ρ f)^(S) = ρ^|S| · f̂(S)
Bonami’s Lemma: for q > 2, ‖T_ρ f‖_q ≤ ‖f‖_2 whenever ρ ≤ 1/√(q-1).
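As a concrete illustration of the Fourier-side formula, here is a small self-contained NumPy sketch (it does not go through the boofun API) that applies T_ρ by damping each Fourier coefficient by ρ^|S|, using 3-bit majority as the example:

    import itertools
    import numpy as np

    points = list(itertools.product([-1, 1], repeat=3))
    subsets = [S for k in range(4) for S in itertools.combinations(range(3), k)]

    def chi(S, x):
        # Character chi_S(x) = prod_{i in S} x_i (empty product = 1).
        return np.prod([x[i] for i in S])

    def noise_operator_fourier(f_vals, rho):
        # (T_rho f)^(S) = rho^|S| * fhat(S); re-expand the damped series at every x.
        fhat = {S: np.mean([f_vals[x] * chi(S, x) for x in points]) for S in subsets}
        return {x: sum(rho ** len(S) * fhat[S] * chi(S, x) for S in subsets)
                for x in points}

    # 3-bit majority: fhat({i}) = 1/2 for each i, fhat({0,1,2}) = -1/2, rest 0.
    maj = {x: float(np.sign(sum(x))) for x in points}
    print(noise_operator_fourier(maj, rho=0.5)[(1, 1, 1)])   # 3*(0.5*0.5) - 0.125*0.5 = 0.6875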
Functions

| Function | Description |
| --- | --- |
| bonami_lemma_bound(f, q, rho) | Verify Bonami’s Lemma: ‖T_ρ f‖_q ≤ ‖f‖_2. |
| friedgut_junta_bound(total_influence, epsilon) | Friedgut’s Junta Theorem bound. |
| hypercontractive_inequality(f, rho, p, q) | Verify the hypercontractive inequality. |
| junta_approximation_error(f, junta_vars) | Compute the error of approximating f by its projection onto junta variables. |
| kkl_lower_bound(total_influence, n) | KKL theorem lower bound on maximum influence. |
| level_d_inequality(f, d, q) | Level-d inequality (O’Donnell Lemma 9.23). |
| lq_norm(f, q) | Compute the L_q norm ‖f‖_q of f. |
| max_influence_bound(f) | Compute max influence and compare with the KKL lower bound. |
| noise_operator(f, rho) | Apply the noise operator T_ρ to a Boolean function. |
- boofun.analysis.hypercontractivity.noise_operator(f: BooleanFunction, rho: float) → np.ndarray
Apply the noise operator T_ρ to a Boolean function.
- The noise operator is defined as:
(T_ρ f)(x) = E_y[f(y)]
where y is obtained by independently keeping each bit of x with probability (1+ρ)/2 and flipping it with probability (1-ρ)/2.
In Fourier domain: (T_ρ f)^(S) = ρ^|S| · f̂(S)
- Parameters:
f – BooleanFunction to apply noise to
rho – Correlation parameter in [-1, 1]
- Returns:
Array of (T_ρ f)(x) values for all x
Note
Returns real values in [-1, 1], not Boolean values.
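To make the averaging definition concrete, here is a minimal self-contained sketch (independent of the boofun implementation) that computes T_ρ f directly from that definition; for 2-bit parity the result matches the Fourier formula ρ^|S| · f̂(S) exactly:

    import itertools
    import numpy as np

    def noise_operator_direct(f_vals, n, rho):
        # (T_rho f)(x) = E_y[f(y)], where each bit of x is independently kept
        # with probability (1 + rho)/2 and flipped with probability (1 - rho)/2.
        keep = (1 + rho) / 2
        out = {}
        for x in itertools.product([-1, 1], repeat=n):
            total = 0.0
            for flips in itertools.product([0, 1], repeat=n):
                prob = np.prod([keep if b == 0 else 1 - keep for b in flips])
                y = tuple(-xi if b else xi for xi, b in zip(x, flips))
                total += prob * f_vals[y]
            out[x] = total
        return out

    # Parity f(x) = x1*x2 has its only Fourier weight on S = {1, 2},
    # so T_rho f = rho**2 * x1*x2; the direct average agrees.
    parity = {x: x[0] * x[1] for x in itertools.product([-1, 1], repeat=2)}
    print(noise_operator_direct(parity, 2, rho=0.6)[(1, 1)])   # 0.36 (= 0.6**2)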
- boofun.analysis.hypercontractivity.lq_norm(f: BooleanFunction, q: float) → float
Compute the L_q norm of f: ‖f‖_q = (E[|f(x)|^q])^{1/q}.
- Parameters:
f – BooleanFunction to analyze
q – Norm parameter (q ≥ 1)
- Returns:
L_q norm of f
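The norm is a one-line computation over the truth table; a small sketch, assuming f is given simply as an array of its values on {±1}^n rather than as a BooleanFunction:

    import numpy as np

    def lq_norm(values, q):
        # ||f||_q = (E[|f(x)|^q])^(1/q), with E the uniform average over inputs.
        values = np.asarray(values, dtype=float)
        return float(np.mean(np.abs(values) ** q) ** (1.0 / q))

    print(lq_norm([1, -1, -1, 1], q=4))            # 1.0: every L_q norm of a +-1-valued f is 1
    print(lq_norm([0.5, -0.25, 0.75, -1.0], q=2))  # ~0.685
    print(lq_norm([0.5, -0.25, 0.75, -1.0], q=4))  # ~0.767: for real-valued f, norms grow with q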
- boofun.analysis.hypercontractivity.bonami_lemma_bound(f: BooleanFunction, q: float, rho: float) → Tuple[float, float]
Verify Bonami’s Lemma: ‖T_ρ f‖_q ≤ ‖f‖_2.
Bonami’s Lemma (O’Donnell Theorem 9.22): For any f: {±1}^n → ℝ, if 1 ≤ q ≤ 2, or if q > 2 and ρ ≤ 1/√(q-1), then:
‖T_ρ f‖_q ≤ ‖f‖_2
- Parameters:
f – BooleanFunction to analyze
q – Norm parameter
rho – Noise parameter
- Returns:
Tuple of (‖T_ρ f‖_q, ‖f‖_2) for verification
Note
For Boolean functions, ‖f‖_2 = 1 always.
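A self-contained numerical check of the inequality this function verifies (not a call into boofun): for a multilinear polynomial whose monomials all have degree 2, applying T_ρ just multiplies the function by ρ², so both sides are easy to compute:

    import itertools
    import numpy as np

    def lq(vals, q):
        return np.mean(np.abs(vals) ** q) ** (1.0 / q)

    # f(x) = x1*x2 + 0.5*x2*x3 on {+-1}^3: every monomial has degree 2.
    points = list(itertools.product([-1, 1], repeat=3))
    f = np.array([x[0] * x[1] + 0.5 * x[1] * x[2] for x in points], dtype=float)

    q = 4.0
    rho = 1.0 / np.sqrt(q - 1)     # the largest rho allowed by Bonami's Lemma
    T_rho_f = rho ** 2 * f         # T_rho scales each degree-k monomial by rho**k

    print(lq(T_rho_f, q), "<=", lq(f, 2))   # ~0.42 <= ~1.12: the lemma holds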
- boofun.analysis.hypercontractivity.kkl_lower_bound(total_influence: float, n: int) → float
KKL theorem lower bound on maximum influence.
KKL Theorem (O’Donnell Theorem 9.24): For any Boolean f: {±1}^n → {±1} that is not constant:
max_i Inf_i[f] ≥ Ω(log n / n) · Var[f]
- More precisely, there is a universal constant c > 0 such that:
max_i Inf_i[f] ≥ c · Var[f] · log n / n
- For balanced functions (Var[f] = 1):
max_i Inf_i[f] ≥ c · log n / n for some constant c
- Parameters:
total_influence – Total influence I[f]
n – Number of variables
- Returns:
Lower bound on maximum influence
Note
The constant c ≈ 0.57 in the precise statement.
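A quick sketch of the scale of the bound. The constant 0.57 (taken from the note above) and the use of the natural logarithm are assumptions here, not necessarily what kkl_lower_bound itself computes:

    import math

    def kkl_bound(variance, n, c=0.57):
        # c * Var[f] * log(n) / n: only a log factor above the trivial
        # average influence I[f] / n.
        return c * variance * math.log(n) / n

    for n in (10, 100, 1000, 10000):
        print(n, round(kkl_bound(1.0, n), 5))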
- boofun.analysis.hypercontractivity.max_influence_bound(f: BooleanFunction) → Tuple[float, float, float]
Compute max influence and compare with KKL lower bound.
- Parameters:
f – BooleanFunction to analyze
- Returns:
Tuple of (max_influence, kkl_bound, total_influence)
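The quantities in the returned tuple can be reproduced by hand for a small example. A self-contained sketch for 5-bit majority (comparing against log(n)/n is an assumption about how the KKL bound is scaled):

    import itertools
    import numpy as np

    def influences(f, n):
        # Inf_i[f] = Pr_x[f(x) != f(x with coordinate i flipped)] for +-1-valued f.
        points = list(itertools.product([-1, 1], repeat=n))
        infs = []
        for i in range(n):
            flip_i = lambda x: x[:i] + (-x[i],) + x[i + 1:]
            infs.append(np.mean([f(x) != f(flip_i(x)) for x in points]))
        return infs

    n = 5
    maj = lambda x: int(np.sign(sum(x)))
    infs = influences(maj, n)
    # Majority: each Inf_i = C(4,2)/2**4 = 0.375, comfortably above log(n)/n ~ 0.32.
    print(max(infs), sum(infs), np.log(n) / n)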
- boofun.analysis.hypercontractivity.friedgut_junta_bound(total_influence: float, epsilon: float) → int
Friedgut’s Junta Theorem bound.
Friedgut’s Theorem (O’Donnell Theorem 9.40): If f: {±1}^n → {±1} has total influence I[f], then f is ε-close to a k-junta where k ≤ 2^{O(I[f]/ε)}.
- Parameters:
total_influence – Total influence I[f]
epsilon – Distance threshold
- Returns:
Upper bound on junta size k
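The bound itself is a one-line exponential in I[f]/ε; the constant C in the exponent below is a placeholder assumption, since the theorem only guarantees 2^{O(I[f]/ε)}:

    import math

    def friedgut_junta_size(total_influence, epsilon, C=1.0):
        # k <= 2**(C * I[f] / epsilon); C = 1.0 is a placeholder constant.
        return math.ceil(2 ** (C * total_influence / epsilon))

    # A dictator has I[f] = 1; the guaranteed junta size blows up quickly
    # as the total influence grows or epsilon shrinks.
    print(friedgut_junta_size(1.0, 0.1))   # 1024
    print(friedgut_junta_size(3.0, 0.1))   # ~1.07e9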
- boofun.analysis.hypercontractivity.junta_approximation_error(f: BooleanFunction, junta_vars: List[int]) → float
Compute the error of approximating f by its projection onto junta variables.
- The projection of f onto variables J is:
f_J(x) = E[f | x_J], i.e., f averaged over the coordinates outside J
The error is dist(f, f_J) = Pr[f(x) ≠ f_J(x)].
- Parameters:
f – BooleanFunction to analyze
junta_vars – List of variable indices to keep
- Returns:
Approximation error (probability of disagreement)
Note
This is related to the sum of influences of non-junta variables: dist(f, f_J) ≤ Σ_{i ∉ J} Inf_i[f]
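A self-contained sketch of the projection and the disagreement probability. It rounds the real-valued average back to ±1 with sign; whether boofun rounds the same way (and how it breaks ties) is an assumption here:

    import itertools
    import numpy as np

    def junta_projection_error(f, n, J):
        # dist(f, f_J), where f_J(x) = sign(E[f | x_J]) averages over the non-J bits.
        points = list(itertools.product([-1, 1], repeat=n))
        others = [i for i in range(n) if i not in J]
        disagreements = 0
        for x in points:
            total = 0.0
            for z in itertools.product([-1, 1], repeat=len(others)):
                y = list(x)
                for i, zi in zip(others, z):
                    y[i] = zi
                total += f(tuple(y))
            f_J = 1 if total >= 0 else -1      # round back to +-1, ties broken to +1
            disagreements += (f(x) != f_J)
        return disagreements / len(points)

    # Majority on 3 bits restricted to J = {0, 1}: it errs exactly when x0 != x1.
    maj3 = lambda x: int(np.sign(sum(x)))
    print(junta_projection_error(maj3, 3, J={0, 1}))   # 0.25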
- boofun.analysis.hypercontractivity.level_d_inequality(f: BooleanFunction, d: int, q: float = 4) → Tuple[float, float]
Level-d inequality (O’Donnell Lemma 9.23).
- For the degree-d part of f, denoted f^{=d}:
‖f^{=d}‖_q ≤ (q-1)^{d/2} · ‖f^{=d}‖_2
This is a consequence of Bonami’s lemma.
- Parameters:
f – BooleanFunction to analyze
d – Degree level
q – Norm parameter (q > 2)
- Returns:
Tuple of (‖f^{=d}‖_q, (q-1)^{d/2} · ‖f^{=d}‖_2)
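A sketch that extracts the degree-d part from the truth table and checks the inequality, again without going through the boofun API:

    import itertools
    import numpy as np

    points = list(itertools.product([-1, 1], repeat=3))

    def lq(vals, q):
        return np.mean(np.abs(vals) ** q) ** (1.0 / q)

    def degree_part(f_vals, n, d):
        # f^{=d}(x) = sum over |S| = d of fhat(S) * chi_S(x).
        out = np.zeros(len(points))
        for S in itertools.combinations(range(n), d):
            chi = np.array([np.prod([x[i] for i in S]) for x in points])
            out += np.mean(f_vals * chi) * chi
        return out

    maj = np.array([np.sign(sum(x)) for x in points], dtype=float)
    d, q = 1, 4.0
    fd = degree_part(maj, 3, d)
    # Level-1 part of majority: (x1 + x2 + x3)/2.
    print(lq(fd, q), "<=", (q - 1) ** (d / 2) * lq(fd, 2))   # ~1.07 <= ~1.5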
- boofun.analysis.hypercontractivity.hypercontractive_inequality(f: BooleanFunction, rho: float, p: float = 2, q: float = 4) → Tuple[float, float, bool]
Verify the hypercontractive inequality.
Hypercontractive Inequality (O’Donnell Theorem 9.21): For 1 < p ≤ q < ∞ and 0 ≤ ρ ≤ √((p-1)/(q-1)):
‖T_ρ f‖_q ≤ ‖f‖_p
- Parameters:
f – BooleanFunction to analyze
rho – Noise parameter
p – Input norm parameter
q – Output norm parameter
- Returns:
Tuple of (‖T_ρ f‖_q, ‖f‖_p, inequality_satisfied)
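An end-to-end numerical check of the (p, q) inequality at the critical noise rate, for a pair other than the Bonami case (p = 1.5, q = 3); a self-contained sketch that recomputes T_ρ from the Fourier expansion rather than calling boofun:

    import itertools
    import numpy as np

    points = list(itertools.product([-1, 1], repeat=3))
    subsets = [S for k in range(4) for S in itertools.combinations(range(3), k)]

    def chi(S, x):
        return np.prod([x[i] for i in S])

    def lq(vals, q):
        return np.mean(np.abs(vals) ** q) ** (1.0 / q)

    def apply_T_rho(f_vals, rho):
        # Damp each Fourier coefficient by rho^|S| and re-expand.
        fhat = {S: np.mean([v * chi(S, x) for x, v in zip(points, f_vals)]) for S in subsets}
        return np.array([sum(rho ** len(S) * fhat[S] * chi(S, x) for S in subsets)
                         for x in points])

    p, q = 1.5, 3.0
    rho = np.sqrt((p - 1) / (q - 1))   # critical noise rate sqrt((p-1)/(q-1)) = 0.5
    maj = np.array([np.sign(sum(x)) for x in points], dtype=float)
    lhs, rhs = lq(apply_T_rho(maj, rho), q), lq(maj, p)
    print(lhs, "<=", rhs, lhs <= rhs)   # ~0.47 <= 1.0 True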