
Lecture 9: DNFs & Random Restrictions¶

Topics: DNF formulas, Random restrictions, Switching Lemma preview

O'Donnell Chapters: 3.3, 4.1, 4.3
Based on lecture notes by: Qinggao Hong
Notebook by: Gabriel Taboada


Key Concepts¶

| Term | Definition |
|---|---|
| DNF | Disjunctive Normal Form (OR of ANDs) |
| Width | Maximum clause size in the DNF |
| Random restriction | Fix a random subset of variables; leave the rest "free" |
| $p$-random restriction | Each variable is left free independently with probability $p$ |

Main result: Width-$w$ DNFs have total influence $\leq w$ (Theorem 9.3)
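The width bound can be checked by brute force on a tiny example. Below is a minimal pure-Python sketch (independent of boofun; the two-term DNF is a made-up example), computing total influence directly from its definition as the sum over variables of the probability that flipping that variable changes the output:

```python
from itertools import product

# A width-2 DNF on 4 variables: f(x) = (x0 AND x1) OR (x2 AND x3)
w, n = 2, 4

def f(x):
    return int((x[0] and x[1]) or (x[2] and x[3]))

# Total influence: sum over i of Pr_x[f(x) != f(x with bit i flipped)]
total_inf = 0.0
for i in range(n):
    flips = 0
    for x in product((0, 1), repeat=n):
        y = list(x)
        y[i] ^= 1
        flips += f(x) != f(tuple(y))
    total_inf += flips / 2**n

print(f"Total influence: {total_inf}")  # -> 1.5, consistent with Inf[f] <= w
```

Each variable here has influence 3/8 (it is pivotal when its partner in the clause is 1 and the other clause is 0), giving 4 · 3/8 = 1.5 ≤ 2.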

In [1]:
# Install/upgrade boofun (required for Colab)
!pip install --upgrade boofun -q

import boofun as bf
print(f"BooFun version: {bf.__version__}")
BooFun version: 1.1.1
In [2]:
import numpy as np
import boofun as bf
from boofun.analysis import SpectralAnalyzer
from boofun.analysis.restrictions import (
    random_restriction,
    apply_restriction,
    restriction_shrinkage,
)

import warnings
warnings.filterwarnings('ignore')

1. DNFs and Fourier Concentration¶

Tribes is the canonical (nearly) balanced DNF: an OR of $k$ disjoint "tribes", each an AND of $w$ variables.

In [3]:
# Tribes function is a canonical DNF
tribes = bf.tribes(3, 9)  # 3 tribes of 3 variables each
print(f"Tribes(3,9): {tribes.n_vars} variables (3 tribes x 3 vars each)")
print(f"Total Influence: {tribes.total_influence():.4f}")

# Spectral weight by degree
weights = tribes.spectral_weight_by_degree()
print(f"\nFourier weight by degree (W_k):")
for k, w in weights.items():
    bar = '#' * int(w * 40)
    print(f"  k={k}: {w:.4f} {bar}")
Tribes(3,9): 9 variables (3 tribes x 3 vars each)
Total Influence: 1.7227

Fourier weight by degree (W_k):
  k=0: 0.1155 ####
  k=1: 0.3297 #############
  k=2: 0.3499 #############
  k=3: 0.1507 ######
  k=4: 0.0349 #
  k=5: 0.0151 
  k=6: 0.0035 
  k=7: 0.0005 
  k=8: 0.0001 
  k=9: 0.0000 

2. Random Restrictions¶

Definition 9.6: A restriction $(J, z)$ fixes variables outside $J$ to values $z$, leaving $J$ "free".
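The definition can be sketched in plain Python (the names below are illustrative, not the boofun API): a restriction fixes the coordinates outside $J$ to the values in $z$ and produces a function of the free coordinates only.

```python
# Plain-Python sketch of Definition 9.6: restrict f to the free set J,
# with the remaining coordinates fixed to the values in z.

def restrict(f, n, J, z):
    """Return f_{J,z} as a function of the free variables (sorted J)."""
    free = sorted(J)
    def f_restricted(y):
        x = dict(z)               # fixed coordinates
        x.update(zip(free, y))    # free coordinates take the input y
        return f(tuple(x[i] for i in range(n)))
    return f_restricted

# Example: Majority-3 with x2 fixed to 1 and {x0, x1} free
maj3 = lambda x: int(x[0] + x[1] + x[2] >= 2)
g = restrict(maj3, 3, J={0, 1}, z={2: 1})
print([g((a, b)) for a in (0, 1) for b in (0, 1)])  # OR of the free bits: [0, 1, 1, 1]
```

With $x_2 = 1$, Majority-3 needs only one more 1, so the restricted function is exactly OR of the two free bits.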

In [4]:
# Demonstrate random restrictions
f = bf.majority(7)
print(f"Original: Majority-7 ({f.n_vars} variables)")
print(f"Total influence: {f.total_influence():.4f}")

rng = np.random.default_rng(42)
p = 0.5
rho = random_restriction(f.n_vars, p, rng)
print(f"\nRestriction (p={p}): {rho}")
print(f"  Fixed vars: {dict(rho.fixed)}")
print(f"  Free vars: {sorted(rho.free)}")

f_restricted = apply_restriction(f, rho)
n_free = f_restricted.n_vars if f_restricted.n_vars else 0
print(f"\nRestricted function: {n_free} free variables")
if n_free > 0:
    print(f"Restricted total influence: {f_restricted.total_influence():.4f}")
else:
    print("Function became constant!")
Original: Majority-7 (7 variables)
Total influence: 2.1875

Restriction (p=0.5): 1*10*11
  Fixed vars: {0: 1, 2: 1, 3: 0, 5: 1, 6: 1}
  Free vars: [1, 4]

Restricted function: 2 free variables
Restricted total influence: 0.0000

(Four of the five fixed coordinates were set to 1, so Majority-7 is already determined: the restricted function is constant even though two variables remain free, hence zero influence.)

3. The "Low-Pass Filter" Effect¶

Random restrictions act like a low-pass filter in Fourier analysis:

  • High-degree coefficients get attenuated: They depend on correlations among many variables, and fixing random variables "breaks" these correlations
  • Low-degree coefficients are preserved: They depend on fewer variables, so they're more likely to survive
  • The function gets "smoothed" toward lower frequencies
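For parity the filtering is exact: restricting the degree-$n$ character $\chi_{[n]}$ yields (up to sign) parity of the free set $J$, so all its Fourier weight moves down to degree $|J| \sim \text{Binomial}(n, p)$. A small Monte Carlo sketch (pure NumPy, not the boofun API):

```python
import numpy as np

# Under a p-random restriction, each of the n variables survives
# independently w.p. p, so the degree of restricted parity is
# Binomial(n, p) -- mean degree drops from n to p*n.
rng = np.random.default_rng(0)
n, p, trials = 9, 0.5, 100_000

free_sizes = (rng.random((trials, n)) < p).sum(axis=1)
print(f"mean surviving degree: {free_sizes.mean():.2f} (theory: {n * p})")
```

The degree-9 character becomes, on average, a degree-4.5 character: the highest frequency present gets pushed toward low degree, which is the low-pass effect in its purest form.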
In [5]:
def print_weight_bars(weights, n_vars, title, max_bars=30):
    print(f"\n{title} (n={n_vars})")
    print("-" * 50)
    max_w = max(weights.values()) if weights else 1
    for k in sorted(weights.keys()):
        w = weights[k]
        bar_len = int((w / max_w) * max_bars) if max_w > 0 else 0
        print(f"  k={k}: {w:.4f} {'#' * bar_len}")

tribes = bf.tribes(3, 9)
rng = np.random.default_rng(42)

print("LOW-PASS FILTER EFFECT")
print("=" * 50)

orig_weights = tribes.spectral_weight_by_degree()
print_weight_bars(orig_weights, tribes.n_vars, "ORIGINAL Tribes(3,9)")

# Average over many restrictions
p = 0.5
avg_weights = {}
total_free = 0
valid_samples = 0
num_samples = 50

for _ in range(num_samples):
    rho = random_restriction(tribes.n_vars, p, rng)
    f_rho = apply_restriction(tribes, rho)
    n_free = f_rho.n_vars if f_rho.n_vars else 0
    if n_free > 0:
        total_free += n_free
        valid_samples += 1
        for k, w in f_rho.spectral_weight_by_degree().items():
            avg_weights[k] = avg_weights.get(k, 0) + w / num_samples

avg_free = total_free / valid_samples if valid_samples > 0 else 0
print_weight_bars(avg_weights, f"~{avg_free:.1f}", f"AVERAGE after p={p} restriction")

print(f"\n-> Degree-0 weight: {orig_weights.get(0,0):.4f} -> {avg_weights.get(0,0):.4f} (increased!)")
print(f"-> Degree-2 weight: {orig_weights.get(2,0):.4f} -> {avg_weights.get(2,0):.4f} (decreased!)")
print("-> Random restrictions attenuate high-degree structure (low-pass filter)!")
LOW-PASS FILTER EFFECT
==================================================

ORIGINAL Tribes(3,9) (n=9)
--------------------------------------------------
  k=0: 0.1155 #########
  k=1: 0.3297 ############################
  k=2: 0.3499 ##############################
  k=3: 0.1507 ############
  k=4: 0.0349 ##
  k=5: 0.0151 #
  k=6: 0.0035 
  k=7: 0.0005 
  k=8: 0.0001 
  k=9: 0.0000 

AVERAGE after p=0.5 restriction (n=~4.5)
--------------------------------------------------
  k=0: 0.3660 ##########################
  k=1: 0.4141 ##############################
  k=2: 0.1832 #############
  k=3: 0.0300 ##
  k=4: 0.0056 
  k=5: 0.0010 
  k=6: 0.0001 
  k=7: 0.0000 
  k=8: 0.0000 

-> Degree-0 weight: 0.1155 -> 0.3660 (increased!)
-> Degree-2 weight: 0.3499 -> 0.1832 (decreased!)
-> Random restrictions attenuate high-degree structure (low-pass filter)!

4. Verifying Lemma 9.9¶

Lemma 9.9: $\mathbb{E}_{(J,z) \sim R_p}[\text{Inf}[f_{J,z}]] = p \cdot \text{Inf}[f]$

Total influence scales linearly with $p$!
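For small $n$ the lemma can be verified exactly, not just by sampling. The sketch below (pure Python, independent of boofun) enumerates every restriction of Majority-3 with its $R_p$ probability weight: each variable is free with probability $p$ and fixed to 0 or 1 with probability $(1-p)/2$ each.

```python
from itertools import product

# Exact check of Lemma 9.9 for Majority-3 by enumerating all restrictions.
maj3 = lambda x: int(x[0] + x[1] + x[2] >= 2)
n, p = 3, 0.5

def restricted_influence(z, free):
    """Total influence of maj3 restricted by (free, z), by brute force."""
    free = sorted(free)
    m = len(free)
    def g(y):
        x = dict(z)
        x.update(zip(free, y))
        return maj3(tuple(x[i] for i in range(n)))
    inf = 0.0
    for j in range(m):
        flips = sum(g(y) != g(y[:j] + (1 - y[j],) + y[j + 1:])
                    for y in product((0, 1), repeat=m))
        inf += flips / 2**m
    return inf

expected = 0.0
for states in product(('free', 0, 1), repeat=n):  # per-variable state
    weight, free, z = 1.0, set(), {}
    for i, s in enumerate(states):
        if s == 'free':
            weight *= p
            free.add(i)
        else:
            weight *= (1 - p) / 2
            z[i] = s
    expected += weight * restricted_influence(z, free)

inf_f = 1.5  # Inf[Majority-3]: each variable is pivotal w.p. 1/2
print(f"E[Inf[f_rho]] = {expected:.4f},  p * Inf[f] = {p * inf_f:.4f}")
```

The two numbers agree exactly, as the lemma promises: the identity holds in expectation, not just approximately.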

In [6]:
print("Verifying Lemma 9.9: E[Inf[f_{J,z}]] = p * Inf[f]")
print("=" * 65)

functions = {
    "Tribes(3,9)": bf.tribes(3, 9),
    "Majority-9": bf.majority(9),
    "Parity-8": bf.parity(8),
}

p = 0.5
num_samples = 200
rng = np.random.default_rng(42)

print(f"\np = {p}, samples = {num_samples}")
print(f"{'Function':<15} {'Inf[f]':<10} {'p*Inf[f]':<12} {'E[Inf]':<12} {'Ratio':<8}")
print("-" * 65)

for name, f in functions.items():
    orig_inf = f.total_influence()
    expected = p * orig_inf
    
    total_restricted_inf = 0
    valid = 0
    for _ in range(num_samples):
        rho = random_restriction(f.n_vars, p, rng)
        f_rho = apply_restriction(f, rho)
        n_free = f_rho.n_vars if f_rho.n_vars else 0
        if n_free > 0:
            total_restricted_inf += f_rho.total_influence()
            valid += 1
        # When n_free = 0, influence is 0 (constant function)
    
    # Include the constant cases (influence = 0) in the average
    empirical = total_restricted_inf / num_samples
    ratio = empirical / expected if expected > 0 else 0
    
    print(f"{name:<15} {orig_inf:<10.4f} {expected:<12.4f} {empirical:<12.4f} {ratio:<8.2f}")

print("\n-> Ratio ~ 1.0 confirms Lemma 9.9!")
Verifying Lemma 9.9: E[Inf[f_{J,z}]] = p * Inf[f]
=================================================================

p = 0.5, samples = 200
Function        Inf[f]     p*Inf[f]     E[Inf]       Ratio   
-----------------------------------------------------------------
Tribes(3,9)     1.7227     0.8613       0.8642       1.00    
Majority-9      2.4609     1.2305       1.2690       1.03    
Parity-8        8.0000     4.0000       4.1900       1.05    

-> Ratio ~ 1.0 confirms Lemma 9.9!

5. Decision Tree Shrinkage¶

The Switching Lemma (Lecture 10) shows that, with high probability over a $p$-random restriction, a DNF collapses to a shallow decision tree.
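The depth measurement itself can be sketched from scratch (illustrative brute force, not the boofun internals): the optimal decision-tree depth of a truth table is 0 on a constant subcube, and otherwise the best first query minimizes the worse of its two branches.

```python
from functools import lru_cache
from itertools import product

# Brute-force optimal decision-tree depth of a truth table on n
# variables (exponential time: small n only).
def dt_depth(table, n):
    @lru_cache(maxsize=None)
    def depth(fixed):  # fixed: frozenset of (variable, value) pairs
        assigned = dict(fixed)
        rows = [x for x in product((0, 1), repeat=n)
                if all(x[i] == b for i, b in assigned.items())]
        if len({table[x] for x in rows}) == 1:
            return 0  # constant on this subcube: no queries needed
        return min(1 + max(depth(fixed | {(i, 0)}), depth(fixed | {(i, 1)}))
                   for i in range(n) if i not in assigned)
    return depth(frozenset())

# The width-2 DNF (x0 AND x1) OR (x2 AND x3) needs full depth 4 ...
f = {x: int((x[0] and x[1]) or (x[2] and x[3])) for x in product((0, 1), repeat=4)}
print("DT depth before restriction:", dt_depth(f, 4))  # -> 4

# ... but fixing x3 = 0 kills the second term, and the depth shrinks
f_restricted = {x: f[x + (0,)] for x in product((0, 1), repeat=3)}
print("DT depth after fixing x3 = 0:", dt_depth(f_restricted, 3))  # -> 2
```

Fixing a single variable collapses a whole clause, which is the mechanism the Switching Lemma exploits quantitatively.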

In [7]:
print("Decision Tree Depth Shrinkage")
print("=" * 55)

for name, f in functions.items():
    stats = restriction_shrinkage(f, p=0.5, num_samples=100)
    
    print(f"\n{name}:")
    print(f"  Original DT depth: {stats['original_dt_depth']}")
    print(f"  Avg restricted DT depth: {stats['avg_restricted_dt_depth']:.2f}")
    print(f"  Shrinkage factor: {stats['depth_shrinkage_factor']:.2f}x")
    print(f"  Became constant (depth=0): {stats['constant_fraction']*100:.0f}%")
Decision Tree Depth Shrinkage
=======================================================
Tribes(3,9):
  Original DT depth: 9
  Avg restricted DT depth: 2.53
  Shrinkage factor: 0.28x
  Became constant (depth=0): 0%
Majority-9:
  Original DT depth: 9
  Avg restricted DT depth: 4.35
  Shrinkage factor: 0.48x
  Became constant (depth=0): 1%

Parity-8:
  Original DT depth: 8
  Avg restricted DT depth: 3.65
  Shrinkage factor: 0.46x
  Became constant (depth=0): 1%

Summary¶

| Result | Statement |
|---|---|
| Thm 9.3 | Width-$w$ DNF has $\text{Inf}[f] \leq w$ |
| Lemma 9.9 | $\mathbb{E}[\text{Inf}[f_{J,z}]] = p \cdot \text{Inf}[f]$ |
| Switching Lemma | A DNF becomes a shallow decision tree after a random restriction |

boofun API¶

from boofun.analysis.restrictions import random_restriction, apply_restriction

rho = random_restriction(n, p)      # Generate p-random restriction
f_rho = apply_restriction(f, rho)   # Apply to get restricted function
f.spectral_weight_by_degree()       # W_k[f] for each degree k
restriction_shrinkage(f, p)         # Statistics on shrinkage