Browsing through Keynes’ “A Treatise on Probability,” I came across a pretty nugget which Keynes credits to Laplace. Suppose you want to make a fair decision via coin flip, but are afraid the coin is slightly biased. Flip two coins (or the same coin twice), and call it “heads” if the flips match, “tails” otherwise. This procedure is practically guaranteed to have very small bias. In fact, if we call ${b_i = P_i(H)-P_i(T)}$ the bias of flip ${i}$, a quick calculation shows that the bias of the double flip is ${b_1 b_2}$, so that a 1% bias on each coin would become a near-negligible 0.01%.

I noticed that we can extend this: consider using ${n}$ flips, and calling the outcome “heads” if the number of tails is even, “tails” if it is odd. An easy induction shows that the bias of this procedure is ${b_1 b_2 \cdots b_n}$, which of course goes to 0 very quickly even if each coin is quite biased. Here also is a nice direct calculation: consider expanding the product

${b_1 b_2 \cdots b_n = (P_1(H) - P_1(T)) \cdots (P_n(H) - P_n(T))}$

The magnitude of each of the ${2^n}$ terms in the expansion is the probability of a particular sequence of flips; the sign is positive or negative according to whether that sequence has an even or odd number of tails. Summing, the product equals ${P(\text{even \# of tails}) - P(\text{odd \# of tails})}$, which is exactly the bias of the procedure. Done.
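The product formula is easy to check numerically. Here is a small sketch (the biases are made up for illustration) that brute-forces the bias of the parity procedure over all ${2^n}$ flip sequences and compares it to the product ${b_1 b_2 \cdots b_n}$:

```python
import math
from itertools import product

# Made-up biases for illustration; each coin can be quite unfair.
biases = [0.3, -0.5, 0.2, 0.4]              # b_i = P_i(H) - P_i(T)
p_heads = [(1 + b) / 2 for b in biases]     # P_i(H)

# Bias of the parity procedure: enumerate all 2^n flip sequences and
# call the outcome "heads" iff the number of tails is even.
bias = 0.0
for seq in product((0, 1), repeat=len(biases)):   # 1 = tails
    pr = 1.0
    for tail, ph in zip(seq, p_heads):
        pr *= (1 - ph) if tail else ph
    bias += pr if sum(seq) % 2 == 0 else -pr

print(bias, math.prod(biases))   # the two agree
```

Each iteration of the outer loop is exactly one term of the expanded product: its magnitude is the probability of that flip sequence, its sign the parity of the tails count.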

I can hardly believe it of such a simple observation, but this actually feels novel to me personally (not to intellectual history, obviously). Not surprising exactly, but novel. I suppose examples like the following are very intuitive: the last (binary) digit of a large integer such as “number of voters for candidate X in a national election” is essentially uniformly random, even if we know nothing about the underlying processes determining each vote other than independence (or just independence of a large subset of the votes).