
What survives noise?

Every message moves through a noisy channel — a phone line, a radio wave, a hard drive, a synapse. Some bits arrive wrong. Claude Shannon's 1948 question: how much noise can a channel take before communication becomes impossible?

The answer is surprising. There's a sharp ceiling. Below it, you can recover anything. Above it, you can't. Five figures.


Figure 1 of 5

A bit is a coin.

A coin that always lands heads carries no information — you knew the answer in advance. A fair coin carries one full bit. Shannon's entropy H(p) measures how surprising each flip is, on average.
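Written out, the entropy of a coin that lands heads with probability p is

H(p) = −p log₂ p − (1 − p) log₂ (1 − p)

with 0 log₂ 0 taken as 0: zero bits at p = 0 or 1, one full bit at p = 0.5.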

drag the slider — the dot rides the entropy curve H(p)

Honest: entropy measures uncertainty, not "amount of stuff." A heavily biased coin is informative in advance — you can predict heads — so each flip surprises you less. Maximum surprise lives at p = 0.5.
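A minimal sketch of that curve in plain Python (the function name is mine, not the figure's):

```python
import math

def binary_entropy(p: float) -> float:
    """H(p) in bits per flip, with 0 * log2(0) taken as 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))  # 1.0    -- fair coin, maximum surprise
print(binary_entropy(0.9))  # ~0.469 -- biased coin, mostly predictable
print(binary_entropy(1.0))  # 0.0    -- always heads, no information
```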


Figure 2 of 5

A message is a stream.

Send a string of bits down a channel that flips each one independently with probability p. Drag the slider. Watch the errors arrive.

This is the binary symmetric channel — Shannon's simplest example. Most real channels are messier; this one is a clean lab.

source → channel (flip rate p) → received
orange outlines mark flipped bits · resamples every 1.4s

Honest: a real channel doesn't deliver exactly p·N errors — it delivers a sample from a Binomial(N, p) distribution. The expected count is what you can plan against; any single transmission varies.
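A sketch of the same experiment, assuming nothing beyond the standard library (seed, message length, and flip rate are arbitrary picks, not the figure's):

```python
import random

def bsc(bits, p, rng):
    """Binary symmetric channel: flip each bit independently with probability p."""
    return [b ^ (rng.random() < p) for b in bits]

rng = random.Random(0)
N, p = 1000, 0.1
source = [rng.randint(0, 1) for _ in range(N)]
received = bsc(source, p, rng)

errors = sum(s != r for s, r in zip(source, received))
print(errors, "errors; expected p*N =", p * N)
# the count is a Binomial(N, p) draw: ~100 on average, different every run
```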


Figure 3 of 5

Noise breaks the message.

Bits aren't abstract. Take a phrase, push it through ASCII into bits, push the bits through the channel, then read them back. At p = 0.05 the phrase mostly survives. At p = 0.20, it's gibberish.

sent: SHANNON 1948 → channel (flip rate p) → received
characters in orange differ from the original · resamples every 1.6s

Honest: this is the naive setup — no encoding. Real systems never send raw bits through noisy channels; they encode first. That's the next figure.
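The naive pipeline of Figure 3, sketched the same way; I assume 8 bits per character here, which the figure may or may not match:

```python
import random

def text_to_bits(s):
    return [int(b) for c in s for b in format(ord(c), "08b")]

def bits_to_text(bits):
    return "".join(chr(int("".join(map(str, bits[i:i + 8])), 2))
                   for i in range(0, len(bits), 8))

rng = random.Random(1)
bits = text_to_bits("SHANNON 1948")
noisy = [b ^ (rng.random() < 0.20) for b in bits]   # no encoding, raw bits
print(repr(bits_to_text(noisy)))  # mostly gibberish at p = 0.20
```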


Figure 4 of 5

Encoding fights back.

The simplest defense: send each source bit three times. At the receiver, take the majority. A single flip per group is now corrected. You pay 3× the bandwidth for it.

Toggle the rate. Watch the error rate fall — and the bandwidth cost rise.

source bits → encode → channel → decode → decoded bits (green = recovered correctly)
readouts: residual error rate · effective rate (bits/symbol)
repetition trades bandwidth for reliability — but not at infinite ratio

Honest: repetition is the worst possible code. Real codes (Hamming, Reed-Solomon, LDPC, Turbo, Polar) achieve far better reliability for the same bandwidth. Repetition is here as the simplest code, not the best.
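A sketch of the rate-1/3 scheme; the closed-form residual error falls out of the majority rule, since a group decodes wrongly exactly when two or three of its copies flip:

```python
import random

def encode(bits):
    """Rate 1/3: send each source bit three times."""
    return [b for b in bits for _ in range(3)]

def decode(coded):
    """Majority vote over each group of three."""
    return [int(sum(coded[i:i + 3]) >= 2) for i in range(0, len(coded), 3)]

rng = random.Random(2)
p, N = 0.10, 30_000
source = [rng.randint(0, 1) for _ in range(N)]
noisy = [b ^ (rng.random() < p) for b in encode(source)]
decoded = decode(noisy)

residual = sum(s != d for s, d in zip(source, decoded)) / N
closed_form = 3 * p**2 * (1 - p) + p**3   # P(two or three flips in a group)
print(residual, closed_form)  # both near 0.028: down from 0.10, at a third the rate
```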


Figure 5 of 5

Shannon's theorem.

The capacity of a binary symmetric channel is:

C(p) = 1 − H(p)

Shannon proved: for any rate R below C, codes exist that drive the error probability arbitrarily close to zero. For any rate R above C, no such codes exist. The boundary is sharp.

Drag the dot. The plane is split: below the curve, reliable communication is possible. Above it, impossible.

example readout: noise p = 0.10 · rate R = 0.40 · capacity C(p) = 0.531 · FEASIBLE — reliable codes exist at this rate
drag the dot anywhere · green region: reliable codes exist · red region: provably impossible

Honest: Shannon's theorem is existential — it proves codes exist below capacity. Finding them took fifty more years (Hamming 1950, Reed-Solomon 1960, Turbo 1993, LDPC rediscovered 1996, Polar 2009). The theorem doesn't tell you how; it tells you how much.
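The split plane, as a check you can run (binary_entropy as in the Figure 1 sketch):

```python
import math

def binary_entropy(p):
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def capacity(p):
    """BSC capacity: C(p) = 1 - H(p) bits per channel use."""
    return 1.0 - binary_entropy(p)

def feasible(rate, p):
    """Shannon's dichotomy: reliable codes exist iff rate < capacity."""
    return rate < capacity(p)

print(round(capacity(0.10), 3))  # 0.531, matching the figure's readout
print(feasible(0.40, 0.10))      # True:  below the curve
print(feasible(0.60, 0.10))      # False: above it, provably impossible
```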