Understanding the Weird Parts (2027)
Weirdness is often the result of simplified mental models. The beginner's model of arithmetic (addition as repeated counting) fails for negative numbers because it is a special case. The expert's model (addition as the group operation on the ring of integers) handles all cases uniformly. Reading the ECMAScript specification, the Python data model documentation, or Euclid's axioms as recast in modern set theory is the work of moving from folk understanding to formal understanding.
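The arithmetic example can be made concrete in code. Here is a minimal sketch (the function names are my own, for illustration) contrasting the beginner's counting model, which silently fails for negative addends, with a general model that treats a negative addend as repeated subtraction:

```javascript
// Beginner's model: "addition is repeated counting up".
// Only correct when b is non-negative.
function addByCounting(a, b) {
  let result = a;
  for (let i = 0; i < b; i++) result += 1;
  return result;
}

console.log(addByCounting(3, 4));  // 7 — works
console.log(addByCounting(3, -4)); // 3 — wrong: the loop never runs

// General model: the integers form a group under +, so adding a
// negative b is just adding its inverse (repeated counting down).
function addAsGroupOp(a, b) {
  if (b >= 0) return addByCounting(a, b);
  let result = a;
  for (let i = 0; i > b; i--) result -= 1;
  return result;
}

console.log(addAsGroupOp(3, -4)); // -1 — all cases handled uniformly
```

The point is not the implementation but the shape of the fix: the expert model does not patch the special case, it replaces the definition with one under which the old behavior is an instance.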
Language, too, is a patchwork of weird parts. English spelling is notoriously irregular (“ghoti” could theoretically be pronounced “fish” if you take “gh” from “tough,” “o” from “women,” and “ti” from “nation”). Grammatical quirks like the “double negative” in standard English (“I don’t have none” means “I have some” in some dialects but is proscribed in standard English) show how different communities resolve the same weirdness in opposite ways. Understanding these requires moving beyond prescriptive rules to descriptive linguistics: language is not a logically designed system but an evolved, negotiated, living artifact. Given that every nontrivial domain has its weird parts, what approach leads to genuine understanding rather than rote memorization?
Why do such parts exist? Often, because formal systems grow organically. Programming languages evolve from practical needs, accruing edge cases and legacy behaviors. Mathematics expands by generalization, sometimes producing results that contradict earlier intuitions (e.g., the Banach-Tarski paradox). Human cognition itself is a patchwork of evolutionary shortcuts, leading to systematic biases. The weird parts are not bugs in the universe; they are features of systems that were never designed from scratch with perfect foresight.

Perhaps no field offers a richer collection of weird parts than software engineering. Consider JavaScript’s type coercion: [] + [] evaluates to an empty string, [] + {} becomes "[object Object]", but {} + [] typed at a REPL yields 0, because the leading {} is parsed as an empty statement block rather than an object literal. The explanation involves the language’s implicit type conversion rules, the distinction between statement and expression contexts, and the + operator’s overloaded behavior. At first glance, this seems arbitrary. But after studying the specification (how the ToPrimitive abstract operation works, how valueOf and toString are called), the weirdness becomes understandable. It is still surprising, but no longer mysterious.
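The coercion behavior described above can be checked directly. A short sketch (run with Node.js; the object literal at the end is my own illustration of how ToPrimitive consults valueOf before toString for the + operator's default hint):

```javascript
// Both arrays convert to "" via toString, so the result is "" + "".
console.log([] + []);   // ""

// "" + "[object Object]" — the plain object's default toString.
console.log([] + {});   // "[object Object]"

// In *expression* position, {} is an object, so this is symmetric:
console.log(({} + [])); // "[object Object]"
// The famous 0 appears only in *statement* position, where {} parses
// as an empty block and what remains is the unary expression +[]:
console.log(+[]);       // 0 — ToNumber("") is 0

// For +, ToPrimitive tries valueOf first, then toString:
const obj = {
  valueOf() { return 42; },
  toString() { return "never reached for +"; },
};
console.log(obj + 1);   // 43
```

Tracing each line back to the ToPrimitive steps in the specification is exactly the move from folk understanding ("JavaScript is random") to formal understanding ("these are the conversion rules, applied consistently").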