Everything Becomes an 8
When AI models train recursively on their own output, something irreversible happens. Edges erode. The unusual disappears first. What survives is the average — a blob that looks like everything and nothing at once.
This piece uses MNIST digit classification as a concrete lens for understanding model collapse — the phenomenon by which AI systems, trained on AI-generated data, progressively lose the variance that makes them useful. It is also a meditation on what this means for human creativity in an AI-saturated environment.
"The model has lost the signal and kept the certainty. That is worse."
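The collapse dynamic described above can be sketched with a toy simulation (my own illustration, not from the essay): fit a simple Gaussian model to data, sample the next generation's training set from that fit, and repeat. Because each generation sees only the previous generation's output, estimation noise compounds and the spread of the data drifts toward zero.

```python
import random
import statistics

# Toy model-collapse loop (illustrative assumption, not the essay's code):
# each "model" is just a fitted Gaussian; each generation trains only on
# samples drawn from the previous generation's model.
random.seed(0)
SAMPLES_PER_GEN = 10   # small training sets make the collapse visible quickly
GENERATIONS = 300

data = [random.gauss(0.0, 1.0) for _ in range(SAMPLES_PER_GEN)]  # gen 0: real data

variances = []
for _ in range(GENERATIONS):
    mu = statistics.fmean(data)      # fit the model to the current data
    sigma = statistics.stdev(data)
    variances.append(sigma ** 2)
    # the next generation's "training data" is purely model output
    data = [random.gauss(mu, sigma) for _ in range(SAMPLES_PER_GEN)]

print(f"variance at generation 0:   {variances[0]:.4f}")
print(f"variance at generation {GENERATIONS - 1}: {variances[-1]:.2e}")
```

Run with these parameters, the variance shrinks by many orders of magnitude: the tails vanish first, and what remains clusters ever more tightly around the mean — the statistical skeleton of "everything becomes an 8."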