I love to browse old reference books, from 19th-century dancing guides to the hulking set of 1937 Encyclopaedia Britannicas that compete for space with other estate-sale finds in my parents-in-law’s home. I imagine all the schoolkids who must have once used them for reports. And then I think of how, for all their density, the volumes are light on information that the 21st-century reader can trust. (Or, to put it another way, there are plenty of facts inside. I’m just not sure which ones are still true.)
But according to Samuel Arbesman, author of The Half-Life of Facts: Why Everything We Know Has an Expiration Date (Current/Penguin), information actually decays in a systematic manner.
Arbesman is an applied mathematician and an expert in scientometrics (the science of science), a fellow at Harvard, and a senior scholar at the Ewing Marion Kauffman Foundation in Kansas City, Missouri.
In a New Scientist article adapted from his new book, he writes: “In the aggregate there are regularities to the changes, and we can even identify how fast facts will decay over time. This means we don’t have to be all at sea in a world of changing knowledge.”
Using the metaphor of radioactive material, Arbesman explains how long it takes for half the information in a particular field to be disproved or replaced by new data. For example, surgery information has a “half-life” of 45 years, while physics has a “half-life” of a decade. (There are various ways to measure obsolescence, from getting experts to review the factual content of papers and books, to recording how long it takes journal articles to stop getting cited.)
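The radioactive-decay metaphor can be made concrete with a little arithmetic. Assuming facts in a field decay exponentially, like unstable atoms, the fraction still standing after a given number of years is 0.5 raised to (years ÷ half-life). A minimal sketch (the function name and the exponential-decay assumption are illustrative, not from Arbesman's book):

```python
def surviving_fraction(years: float, half_life: float) -> float:
    """Fraction of a field's facts still considered true after `years`,
    assuming exponential decay with the given half-life (in years)."""
    return 0.5 ** (years / half_life)

# Surgery, with a 45-year half-life: after 45 years, half the facts remain.
print(surviving_fraction(45, 45))              # 0.5

# Physics, with a 10-year half-life: after those same 45 years,
# under 5% of the original facts would survive.
print(round(surviving_fraction(45, 10), 3))    # 0.044
```

On this simple model, a field's half-life compounds quickly: a physics fact from 1937 would have passed through several half-lives by now, which is roughly why those old encyclopedias feel so unreliable.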
Arbesman goes on to discuss how we might situate information on a spectrum, from the most rapidly expiring (think stock market movements) to those with the longest shelf life (“pretty much everything the ancient Greeks wrote about geometry”). So next time I’m at my in-laws’, maybe I should flip open volume E, for Euclid.