Facts change all the time. Smoking has gone from doctor recommended to deadly. We used to think the Earth was the center of the universe and that the brontosaurus was a real dinosaur. In short, what we know about the world is constantly changing.
Samuel Arbesman shows us how knowledge in most fields evolves systematically and predictably, and how this evolution unfolds in a fascinating way that can have a powerful impact on our lives.
He takes us through a wide variety of fields, from those that change over the course of a few years to those that shift over the span of centuries.
Way back in 2012, I discovered Netgalley and requested a few books. But somehow, I lost track of my acceptances and eventually forgot all about Netgalley. My ARC of The Half-Life of Facts lay neglected, forgotten, and unread. The years went by, and eventually, I rediscovered Netgalley, only to find that I had a black mark on my record: I'd never read or reviewed The Half-Life of Facts. Although I've been slowly repairing my reviewing ratio, I only discovered recently that I could actually go back and right the wrongs of 2012. And so I purchased The Half-Life of Facts, and I'm glad I did.
As can be inferred from the title, The Half-Life of Facts deals with the idea that facts--the concepts generally accepted by scientists of the day--are constantly in flux. Arbesman is essentially introducing scientometrics, that is, the scientific study of scientific studies. Concepts such as mathematical theorems are inherently true and thus unchanging. However, the "facts" of chemistry, physics, and biology are constantly being overturned. For example, there's a whole collection of "Lazarus taxa," that is, species "known" to be extinct but which have since been rediscovered. As our technology improves, so does our ability to approximate the truth, and so does the rate at which we overturn old "facts." According to Arbesman, the surviving population of facts undergoes exponential decay with respect to time. This decay can be observed in a variety of ways, such as by examining paper citation rates over time or by studying the rate at which medical facts are disproved. I don't accept all of the theories presented--to my mind, many of them rest upon correlation rather than causality--but his theories are interesting nonetheless.
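For readers who want to see the model itself, here is a minimal sketch of the exponential-decay idea the review describes. The 45-year half-life below is only an illustrative figure, not a number taken from the book, and the function names are my own.

```python
import math

def surviving_fraction(t_years: float, half_life_years: float) -> float:
    """Fraction of a body of facts still considered valid after t_years,
    assuming simple exponential decay with the given half-life."""
    return 0.5 ** (t_years / half_life_years)

def decay_constant(half_life_years: float) -> float:
    """Equivalent rate form: N(t) = N0 * exp(-lam * t), with lam = ln(2) / half-life."""
    return math.log(2) / half_life_years

if __name__ == "__main__":
    # Illustration only: a field whose facts have a 45-year half-life.
    for t in (10, 45, 90):
        print(t, "years:", round(surviving_fraction(t, 45.0), 3))
```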
Possibly because the main idea is so very simple, Arbesman quickly moves on to other related areas of scientometrics. For example, he also talks about innovation and "S-curve theory," which hypothesizes that various innovations in a field can be plotted as a series of connected logistic curves which together can be smoothed to look exponential. He also talks about the importance of population density, which he claims can cause superlinear growth of technology. In one of the most interesting sections of the book, he discusses how mutation-tracking techniques from evolutionary genetics can be used to determine the age and origins of papers and books. Some of the studies using these techniques seem a bit suspect to me; for instance, one researcher uses them to assert that almost all written works from the early Middle Ages have survived; however, he uses the works of the Venerable Bede to calculate the half-life of these documents, and it's safe to assume that holy relics may have been treated a little differently than other documents. Other uses of the technique are quite compelling, though; for example, according to a study that Arbesman cites (and presumably read) which used these techniques, researchers read only 20% of the works that they cite in their texts. This has led to amusing situations such as the fictional James Moriarty from the Sherlock Holmes stories being cited in real-life texts.
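As a toy illustration of that stacked S-curve idea (all the midpoints, rates, and ceilings here are invented), a handful of logistic curves with successively higher ceilings can sum to something that looks roughly exponential from a distance:

```python
import math

def logistic(t, midpoint, rate, ceiling):
    """One S-curve: slow start, rapid middle, saturation at `ceiling`."""
    return ceiling / (1.0 + math.exp(-rate * (t - midpoint)))

def stacked_capability(t):
    """Successive innovations, each a logistic curve with a higher ceiling.
    Their sum can look smoothly exponential when viewed from a distance."""
    generations = [(10, 0.8, 1.0), (25, 0.8, 4.0), (40, 0.8, 16.0)]
    return sum(logistic(t, m, r, c) for m, r, c in generations)

if __name__ == "__main__":
    for t in range(0, 55, 5):
        print(t, round(stacked_capability(t), 2))
```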
The most shocking section for me was on the issue of multiple independent discovery. While facts do propagate, they often don't propagate fast or far enough, leading to independent studies of the same facts. In school, I learned about Barabási and Albert's famous paper on preferential attachment. Little did I know that preferential attachment had been discovered at least four times before. Similarly, the Erdős-Rényi random graph I learned about and used had actually been examined twenty years earlier. Arbesman puts it like this:
As Stigler's Law of Eponymy states: "No scientific law is named after its discoverer." Naturally, Stephen Stigler attributes this law to Robert Merton.
Arbesman also discusses the other major reason for the propagation of false facts: publication bias. In a landmark paper, "Why Most Published Research Findings Are False," John P. A. Ioannidis discussed how the academic setup is a recipe for false facts. Researchers gain acclaim through the discovery of interesting phenomena, which can lead to a pattern that xkcd eloquently lampooned. To put it another way, if you flip five coins a thousand times, and each time you wear a different coloured shirt, you're practically guaranteed that at least one run will come up all heads. If you publish on that one run, you can claim "statistical significance" between the shirt colour and the five heads--as long as you don't mention the other 999 trials. It isn't even always intentional: researchers want results badly, and they don't always consider experiment repetition in their calculations. Sadly, the academic environment isn't great at self-correcting. Researchers don't gain fame and fortune from replicating other people's experiments, whether they find the original paper to be true or false. They're far more likely to use the first paper as "weak evidence" for their own tenuous findings.
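Just to make the reviewer's arithmetic concrete, here is a quick back-of-the-envelope check (plain Python, nothing from the book itself):

```python
# 1,000 independent trials of five fair coins: an "all heads" trial is almost
# guaranteed to appear somewhere, even though any single trial is unlikely.
p_all_heads = 0.5 ** 5                           # 1/32 per trial
p_at_least_one = 1 - (1 - p_all_heads) ** 1000
print(p_at_least_one)                            # about 1 - 1.6e-14, effectively certain

# Reported in isolation, that one trial looks "significant" at the usual threshold:
print(p_all_heads)                               # 0.03125 < 0.05
```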
If you're looking for a light, casual introduction to the basics of scientometrics, then Half-Life of Facts is definitely worth a look. Just make sure to read it before the book's own half-life has run out.
The Half-Life of Facts is an engaging, popular introduction to several topics within information science. The title comes from the notion that one can measure the rate at which papers within a discipline become obsolete, assigning a half-life to each field. The author explicitly uses 'fact' in a loose sense; he means the things we understand to be true about the universe.
The book is filled with anecdotes and is highly readable. On the down side, the author never presents an overarching framework for understanding information science, and lacking that from another source, I've had no way to retain most of what he said. Also, many of the analogies and anecdotes are woefully imprecise - Arbesman frequently opts for the most entertaining story, rather than an example that accurately illustrates the process or relationships he's trying to convey.
One omission struck me as particularly odd throughout the book. I realize that not everyone is interested in sustainability. But, much of Arbesman's discussion of the progress of scientific and technical knowledge sidesteps the question of what happens to the trajectory of knowledge when a material culture is unsustainable. It gives this book some of the same flavor of economic analyses that assume technological fixes will always eventually substitute for scarce materials, so even a society that looks currently unsustainable will self-correct over the long term. Arbesman doesn't say this, and it may be he is saving this discussion for another work, but for me it meant that much of the book felt disconnected from reality.
This was a good book, but the writing left something to be desired.
Overall, the ideas the author presents are sound, and he uses a plethora of examples to show the reader what he is talking about, which I liked very much. However, his style is somewhat jumpy, bouncing between three or four related topics per chapter. I think his point was to try and connect these related ideas in a meaningful way, but I felt he could have done this better.
The content of the book is obviously where this book shines, as it gives a bit of a crash course in street statistics and probability. This information is spread throughout the book and is encountered when it is most pertinent. The author's examples are clear and give a wide representation of the phenomena he talks about, and there were quite a few points when I laughed out loud.
One thing to note is that the author does not include a chapter of bullet points on how to be an intelligent consumer of information (facts), a choice I actually prefer. If you can't think critically about these kinds of things without line-by-line instructions, you may not appreciate the range of tools he gives for dealing with the ever-changing current of knowledge the world presents.
Probably made more sense when it was published. Solid recap of a bunch of disparate, interesting ideas, but lacks connective tissue that would make it hit.
Arbesman presents a very broad and entertaining survey of the current state of science. The book touches on a huge array of issues that concern scientific work, like falsification, reproducibility, the combination of different fields and the process of generating new knowledge. Similarly, these topics are illustrated with anecdotes from throughout science. Arbesman tackles behavioral issues, astronomy, paleontology, physics, chemistry and many more topics. His digressions tend to be brief, and they build together towards a general argument quite nicely.
Unfortunately, the book is very brief and will probably strike readers as shallow. While the author very clearly illustrates how both knowledge and technology change all the time, this isn't exactly an eye-opening revelation for most information savvy people. Most of us have already seen massive transformations in thought and technology over the last few years.
While this group would seem to be the most likely audience for Arbesman's book, he seems to be targeting a less informed, generally more disconnected audience. He makes a clear case that many people will only update their knowledge over generations, as their children bring new facts home. This group will be shocked to hear that dinosaurs had feathers. And Arbesman wants to help them bridge this gap by being more active in their consumption of information. It's a nice sentiment, but the people who don't have the time to stay updated on general facts are probably the same people who won't take the time to read Arbesman's book.
A better understanding of Arbesman's audience should have led him to focus more on tools like Mendeley, DEVONthink and EndNote, which are clearly beneficial to information-savvy people looking to get more use out of their personal reading and research. The same can be said of his discussion of information aggregators and other warehouses of contemporary facts. Since these end up as only short asides, Arbesman's book feels like a lost opportunity.
1. The Half-Life of Facts: facts, in terms of current knowledge, have a half-life.
2. The Pace of Discovery: the growth of knowledge is exponential; a 1947 Lehman study suggested doubling times ranging from 20 years in grand opera to 87 years in medicine.
3. The Asymptote of Truth: knowledge decays exponentially, in the sense of being overturned and becoming obsolete.
4. Moore's Law of Everything: a discussion of the exponential growth of technology; technology expands among larger groups but decays in smaller ones; Kremer hypothesized that technological growth should be proportional to population, and Merton augmented this to account for the population's make-up (i.e., scientists versus others).
5. The Spread of Facts: social networks and the spread of facts; a number of examples of known errors that were not readily corrected by the spread of facts; a Simkin and Roychowdhury study concluded that only about 20% of scientists who cite an article have actually read that paper.
6. Hidden Knowledge: knowledge known only to a few, or knowledge that must be connected to other facts to reveal its utility; competitions for solutions to problems as a method of reaching out to the social network; CoPub Discovery, where a massive database of articles is mined to find relationships between diseases and other knowledge, uncovering potential treatments.
7. Fact Phase Transitions: fact phase transitions are large-scale shifts in knowledge; on predicting them.
8. Mount Everest and the Discovery of Error: on measurement, and error in measurement; the p-value gives the probability that a result is simply due to chance; results published in scientific journals generally require a p-value < 0.05; corollary: roughly one in 20 published results is incorrect.
9. The Human Side of Facts: how human perceptions alter our knowledge of facts.
10. At the Edge of What We Know: musings on the rapidly increasing knowledge base and where it goes in the future.
I actually thought this would be more about social media and the ultra-fast news cycle, but it was still an OK read. I take issue with the very loose definition of "facts" used in the book: there's a big difference between discovering that the Earth revolves around the sun, figuring out the one-billionth digit of pi, measuring a value with increased accuracy and precision, and using meta-analysis to increase the statistical significance of a study. According to the author, an underlying "fact" has changed in each of those instances, but the first is a game-changing event that makes us see the world differently, while the latter examples are an almost inevitable progression of knowledge. It was interesting to see how well-known techniques in one field can be used to solve problems in unrelated fields; it makes me wonder how that might affect patentability if a computer algorithm can form new combinations of prior-art techniques that yield unexpected results, but without a human "flash of genius."
A range of topics centered on the lifespan of facts: why they go out of date faster than we think, and how we as humans handle this. Arbesman looks at various scientific and technological "facts" - the different types of facts, their certainty, how they are measured, and how they are transmitted.
If you are interested in the way knowledge spreads, or in how we are handling the current explosion in technical and factual knowledge, this book is well worth the read. My takeaways:
* If you don't want facts to go stale, stop memorizing them and look them up as needed.
* Facts are created and go stale in predictable, mathematically modelable ways.
* For fields outside our expertise, we often let facts go into stasis - and that remains our reality until something major (education, our kids, the news) forces us to update it. Even when we do update, we do so with any number of biases, which means we really hold only a relatively few self-selected facts.
I wish I thought that I would be able to remember everything I learned from this book, though Arbesman did give me an out by suggesting that the time has come to outsource our memories to the cloud because that's the best way of getting the latest information. But already I'm forgetting some of the mechanisms of error that humans are prone to, lapsing back into them. I can feel my sharp edges blurring.
This is a wonderful book! Turns out that nearly everything can be quantified in ways I never dreamed before, and there are new and exciting ways to combine information to generate fresh discoveries that are just beginning to be explored. This book makes a person want to stand up and cheer.
A good book; it covers many topics related to facts in the broad sense. It shakes up a few myths and shows the need to keep checking constantly: even the scientific facts we take as confirmed change, and the rate at which knowledge changes can be predicted.
Was drawn to this book due to the relevance of "alternative facts". It discusses the transfer, turnover rate, and origins of knowledge. Highly recommended for anyone interested in how what we know as "fact" actually changes over time.
Facts change over time. Some, we expect to change. One hundred years ago, the answer to “How many billion people on earth?” was: two. When I was at school that changed from four to five. Recently, it became seven. Others, like "How many fingers on a human hand?”, we expect to remain constant (at least for a very long time). Sometimes even these sorts of facts, however, change unexpectedly. From 1912 to 1956 scientists were certain there were 48 chromosomes in a human cell. Some even had to abandon research when they found they could only account for 46 of the 48 they knew had to be there.
Other facts, like “How many elements in the periodic table?”, change just slowly enough for those of us who aren’t paying attention to be surprised when the answer turns out to be about 10% higher than when we last looked.
Arbesman calls these sorts of facts — those that change over years or decades, rather than days or millennia — mesofacts. And he argues that how they change is actually fairly predictable. For this he uses the analogy of radioactivity: A single atom of uranium is highly unpredictable: you can’t know whether it might decay in the next minute, or last for another million years. But a chunk of uranium, made up of trillions of such atoms, becomes much more manageable, with a predictable half-life. Similarly, we may not know when any specific fact might be supplanted, but how a body of knowledge, in the aggregate, changes over time, can be measured and understood scientifically.[1]
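A tiny simulation makes that aggregate-versus-individual point concrete. Nothing here comes from the book; the 45-year half-life is an arbitrary stand-in.

```python
import math
import random

def simulate_decay(n_items: int, half_life: float, t: float) -> float:
    """Give each 'atom' (or fact) a random exponential lifetime and return the
    fraction still surviving at time t. Any single lifetime is unpredictable,
    but a large population tracks the deterministic 0.5 ** (t / half_life)."""
    lam = math.log(2) / half_life
    survivors = sum(1 for _ in range(n_items) if random.expovariate(lam) > t)
    return survivors / n_items

if __name__ == "__main__":
    random.seed(0)
    print(simulate_decay(10, 45.0, 45.0))         # noisy: only a handful of items
    print(simulate_decay(1_000_000, 45.0, 45.0))  # very close to 0.5
```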
At first it seemed like the book was going to expand much more on why this is important. In the first chapter he notes that it's certainly practically useful, for example, to know that many areas of medical knowledge 'decay' in under 50 years, making it worthwhile to check semi-regularly whether the facts you've based (say) your exercise and diet regimes on are still true[2]. But he also notes that the subtler point is even more important: simply being aware of how knowledge itself works, at a meta-level, is important for making sense of the world, and for anticipating — and planning for — flaws in our knowledge. I had hoped for more expansion on this idea, but instead the book takes off on a rather disjointed tour of lots of semi-related knowledge-based themes. Mostly this is anecdote driven, and while the author appears to really want to be Malcolm Gladwell, he can't quite pull it off.
I did find a couple of these areas to be quite fascinating, though:
One is to do with how many previous trials researchers tend to cite, as a proxy for how deeply they study what has come before, prior to jumping into their Shiny New Research. Unsurprisingly the answer is "Not very many" — on average only about 25% of papers that should be cited are (and with a heavy bias towards the most recent ones). One particularly striking example of why this can be important is the research into treating heart attacks with the drug streptokinase. There were over 30 published trials before it was shown to be effective. However, a follow-up cumulative meta-analysis found that if each of these trials had not only looked at its own results, but combined them with those of each of the previous trials, a statistically significant result could have been found 15 years earlier.
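For anyone curious what a cumulative meta-analysis actually does, here is a minimal fixed-effect (inverse-variance) sketch. The trial numbers are invented to show the mechanism; they are not the streptokinase data.

```python
import math

def cumulative_meta_analysis(trials):
    """Fixed-effect (inverse-variance) cumulative meta-analysis.

    `trials` is a list of (year, effect, standard_error) tuples, e.g. log odds
    ratios from successive studies. After each new trial, all trials to date
    are pooled; return the first year the pooled estimate crosses the usual
    95% significance threshold (|z| > 1.96), or None if it never does.
    """
    weight_sum = 0.0
    weighted_effect_sum = 0.0
    for year, effect, se in trials:
        w = 1.0 / (se * se)
        weight_sum += w
        weighted_effect_sum += w * effect
        pooled = weighted_effect_sum / weight_sum
        pooled_se = math.sqrt(1.0 / weight_sum)
        if abs(pooled / pooled_se) > 1.96:
            return year, round(pooled, 3)
    return None

# Invented data (year, log odds ratio, SE): no single trial is significant on
# its own, but the pooled evidence is by the fifth one.
fake_trials = [(1965, -0.30, 0.40), (1968, -0.25, 0.35), (1971, -0.35, 0.30),
               (1974, -0.20, 0.30), (1977, -0.30, 0.25)]
print(cumulative_meta_analysis(fake_trials))   # -> (1977, -0.282)
```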
The other is the concept of "undiscovered public knowledge" — where, for example, someone has shown that A implies B, and someone else has shown that B implies C, but no one knows both these things, and therefore "A implies C" lies hidden in the literature as an unknown fact. In a classic example of this, Don Swanson combined two previously unrelated sets of scientific articles — one describing poor blood circulation in patients with Raynaud's Syndrome; the other showing that dietary fish oil could improve blood circulation — to suggest (with no background in medicine or biology, and based on no research other than pulling together previously published information) that fish oil might be useful as a treatment for Raynaud's: a finding subsequently backed up in trials.
It seems that this area has become significantly more automated in recent years, with massive databases now monitoring the co-occurrence of terms in published papers to look for potential links, having successfully discovered previously unknown links between genes and diseases (e.g. Graves' disease), as well as other potential drug treatments like Swanson's.
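A toy sketch of that Swanson-style A-B-C idea, with invented corpora and terms (this is not any real tool's API, just an illustration of co-occurrence bridging):

```python
from collections import defaultdict

def bridging_terms(literature_a, literature_c):
    """Find 'B' terms that co-occur with topic A in one set of papers and with
    topic C in a disjoint set, suggesting an unexamined A -> C link via B.
    Each literature is a list of (paper_id, set_of_terms)."""
    papers_with_term_a = defaultdict(set)
    papers_with_term_c = defaultdict(set)
    for paper, terms in literature_a:
        for term in terms:
            papers_with_term_a[term].add(paper)
    for paper, terms in literature_c:
        for term in terms:
            papers_with_term_c[term].add(paper)
    shared = set(papers_with_term_a) & set(papers_with_term_c)
    # Rank candidate bridges by their weakest support across the two literatures.
    return sorted(shared, reverse=True,
                  key=lambda t: min(len(papers_with_term_a[t]), len(papers_with_term_c[t])))

# Invented toy corpora echoing the Raynaud's / fish-oil story.
raynauds_papers = [("r1", {"raynauds", "blood viscosity"}),
                   ("r2", {"raynauds", "vasoconstriction"}),
                   ("r3", {"raynauds", "blood viscosity", "platelet aggregation"})]
fish_oil_papers = [("f1", {"fish oil", "blood viscosity"}),
                   ("f2", {"fish oil", "blood viscosity", "platelet aggregation"}),
                   ("f3", {"fish oil", "triglycerides"})]
print(bridging_terms(raynauds_papers, fish_oil_papers))
# -> ['blood viscosity', 'platelet aggregation']
```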
Unfortunately, however, the book disappointed. The high-level concept of framing information as having a half-life is certainly appealing, and some of the sections helped frame certain concepts with a little extra clarity, but largely it failed to engage with any of the areas sufficiently well.
—
[1] One simple but effective method for measuring this is simply to get a group of experts in a particular field to re-examine a large number of historic papers and categorise them as still factual, substantially correct but out of date, or now disproven. When this was done, for example, with almost 500 articles about liver disease, a strikingly clear graph of knowledge decay became apparent (with a half-life in this particular field of about 45 years). This approach, however, is rather time-consuming, so instead you can make the first-order approximation of measuring how long any given work continues to be cited by others. (A rough sketch of fitting a half-life to data like this follows these notes.)
[2] I had also hoped (in vain) for a discussion on the flip-side of this: comparing (for example) the rate of "Key New Breakthrough”-type articles to the half-life of information in the field, as a proxy measurement for bad journalism.
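As promised above, a rough sketch of estimating a half-life from expert-categorised data. The data points are invented (loosely shaped around the ~45-year figure), and the log-linear fit through the origin is just one simple way to do it.

```python
import math

def estimate_half_life(years, fraction_still_valid):
    """Fit ln(fraction) = -lam * t by least squares through the origin, where
    fraction_still_valid[i] is the share of sampled papers judged still correct
    after years[i] have elapsed. Returns the implied half-life, ln(2) / lam."""
    logs = [math.log(f) for f in fraction_still_valid]
    lam = -sum(t * y for t, y in zip(years, logs)) / sum(t * t for t in years)
    return math.log(2) / lam

# Invented data points, NOT the liver-disease study's numbers.
years = [10, 20, 30, 40, 50]
fraction = [0.86, 0.73, 0.63, 0.54, 0.46]
print(round(estimate_half_life(years, fraction), 1))   # ~44.8 years
```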
My bottom line is that this is an interesting work and the reader will learn from it, but there were far too many poorly worded questionable passages and disastrous sentence structures in this book for either author or editor to be praised. = = = = = = = = = = = = = = Maybe author Samuel Arbesman should have made his best points earlier. He almost lost this reviewer along the way.
To declare early, "I am choosing to use 'fact' in a loose fashion," sounds tantamount to "playing loose with the facts." Why would a serious reader tolerate ambiguity in cold hard facts in a book titled for those very elements? To later say "Errors are especially pernicious facts" served to challenge the fingernails on the ledge as I tried to hang on with the book until it developed its message.
I gave the author an opportunity to make his case, which is about "The Half-Life" of something he loosely defined. Sadly, he disappointed again, playing loose with the second major element of the title: his version of half-life is far less rigorous than the scientific definition most readers will be familiar with, since his case is presented only for the first halving of some fact, and not necessarily repeated for the half that is left and the half after that. Carbon dating by half-life would be difficult, if not impossible, if it had the loose properties of Samuel Arbesman's half-life.
If one is able to swim beyond the flotsam left in the early pages, however, there is significant highly interesting clarity in the waves of information that flow through this work.
Who wouldn’t become entrapped by a discussion of life spans increasing at an increasing rate that could theoretically make man immortal by achieving "actuarial escape velocity?" Now he had my attention in a positive way.
I was finally fully caught up by a section that made quick work, with virtual bullet points, of good sense:
- "This problem of how to disseminate facts effectively will only become more acute, as our general body of knowledge increases …"
- "It's a lot easier to spread the first … fact that sounds correct …"
- "We often don't track our sources"
- "Bad information can spread fast … credible information or news spreads faster"
- "Be critical before spreading information and examine it to see what is true."
More positives followed. Arbesman gives us reason to care about his topic, declaring, "As knowledge changes more rapidly, the resulting change in society can be drastic." His comparison of the scientific "phase transition" to the more social/technical term "tipping points" is creative and useful.
Additionally, the author provides plenty of hope for us to expect we can deal with our world of facts, as well as some insight into tools and methods, better practices, and positive inroads being made in areas that allow knowledge to be solidified around true facts.
Arbesman uses examples within and outside science to show how facts change and teases the reader that "Fast changes in facts, like everything else we’ve seen, have an order to them. One that is measurable and predictable." (Just as poor sentence structure was predictable in this work!)
He calls out a notable misuse of probability and fact with the story of a doomsayer declaring "a one in two chance" that a black hole capable of destroying the earth would be created alongside a particular scientific advance. The doomsayer's logic: either (a) the world would end, or (b) it would not; therefore it must be 50-50!
Arbesman wisely devotes more than a chapter to the dependency of 'facts' on how our brains have been conditioned and how they process information. Facts are not just a matter of technical science, but social science as well. He further connects disciplines by drawing a parallel for physical and social/technical sciences: "Phase transitions have been used to understand all types of rapid change, from ecological models … to tipping points for how fads and fashions spread."
Credit the author for a few quotable points, including:
- Addressing the willingness of people to change: "We are often like objects being dragged through mud. We change, but slowly, and with the residue of where we came from upon us."
- Regarding the credit given to published scientific research: "Citations are the coin of the scientific [journal] realm."
- About error, Arbesman quoting Kathryn Schulz from her work Being Wrong: "This is the pivotal insight of the Scientific Revolution: that the advancement of knowledge depends on current theories collapsing in the face of new insights and discoveries. In this model of progress, errors do not lead us away from the truth. Instead, they edge us incrementally toward it."
Additional issues I had:
- The word "infinite" should have been used more sparingly, for the benefit of comprehension.
- Please let it not be written, or left unedited, that "... our general body of knowledge increases, in general." [sic]
- And also explain why science "should take into account everything that has come before it." The author means that new science should address prior theories, but the thought is poorly delivered.
My bottom line is that this is an interesting work and the reader will learn from it, but there were far too many poorly worded questionable passages and disastrous sentence structures in this book for either author or editor to be praised.
People should acknowledge that facts do change as science advances. Our knowledge tends to solidify at the end of formal schooling, except in our professional field, and is then updated generationally when we teach our children. Some other solutions were presented as well. Epochal shifts (phase changes) in knowledge were discussed and then dismissed as accumulations that finally crossed a tipping point. Phase changes used to happen every hundred years or so; they now happen multiple times in a lifetime, so humans are having a harder time adapting.
Many areas with quantitative data about information were fitted to exponential curves. However, no confidence measures were given for how accurate these fits are. Human knowledge may seem to stretch back a long time, but most of the data comes from only the last few centuries. There is a lot more randomness in it than fitting everything to an exponential curve suggests. Instead of looking for confirmatory examples, the author should have looked for counter-facts, or at least presented the data in more depth.
Not only are our politics fact challenged, our science is as well. Biggest difference? Science has a self-correcting mechanism built in. Wonderful wonky book on statistical reviews of how facts change over time.
This is a bit dry and mathematically minded, but still a good discussion of change in truth as we know it over time. It covers corrections in scientific understanding (e.g. the nature of the atom) as well as things you would expect to shift, such as the biggest cities in the world or the best way to treat a nosebleed. Samuel Arbesman approaches this mostly from the meta-science angle of how knowledge changes, where it turns out that changes in facts frequently behave predictably in the aggregate, when you look at all facts in a certain category. There's also a lot of historical discussion of how people came to think about how science is done.
Towards the end there is a decent amount of discussion of "facts" that were never really true in the first place -- this book mostly covers what I was looking for in Bad Science: Quacks, Hacks, and Big Pharma Flacks. In addition to noting that repeatability of experiments is a fundamental principle that is mostly neglected today, Arbesman notes that experiments are often run enough times that at least one of the results is likely to come out well simply due to chance, and then that one can be published. It's clear the author doesn't want to disparage the accepted scientific community, but at the same time he has to admit that when only 25% of publishing scientists read the relevant papers in their own specialty, it doesn't reflect well on the work being done.
The author is not immune to human error in judgement himself, for example when he relies on the sole example of someone's analysis of the works of the Venerable Bede to estimate how much knowledge civilization may have lost over the centuries. In the Venerable Bede's case the answer is "not much", and Arbesman reassures us we're unlikely to have lost much generally either. Except that the Venerable Bede was a revered figure even shortly after his life, so I suspect his works were much more likely to be preserved than those of others who weren't admired so quickly. The author, too, sometimes tells stories to suggest a point he can't substantiate. However, nobody is immune to those flaws, so you just do your best to recognize when it happens.
After I saw Samuel Arbesman speak at Tedx Kansas City a few weeks ago, I knew I had to read his book. The premise of his talk and his book is that facts are not really information set in stone, the way we usually think about them. The world is constantly changing and nothing is for certain forever. I was floored by the notion that what my kids are learning in school may contradict what I learned in school. For some reason, that notion had never occurred to me!
The Half-Life of Facts is easily understood by a lay person. I found it very readable and I don't have a head for science at all. Each chapter outlines a different reason why facts may either change or be found to be untrue. Arbesman uses examples throughout, all of which I found fascinating. I would love to read even more stories about which facts have changed over time and why.
I was surprised by some of the facts that are no longer true. For instance, did you know that there really isn't a dinosaur called a Brontosaurus? I had no idea and both of my boys have been through dinosaur obsessions within the past few years. The Brontosaurus was found to be a type of Apatosaurus over a hundred years ago. However, once something is out in the ether, it's really hard to circulate information modifying or correcting the original assertion.
I appreciated that not only does Arbesman discuss the various ways in which untruths persist and facts change over time, he also offers suggestions of how to keep current without getting information overload.
I love that, in keeping with the spirit of The Half-Life of Facts, Arbesman's website has an Errata and Updates section for the book. There is already one case listed in which Arbesman unknowingly perpetuated a myth about how spinach became known for its high iron content.
It's very rare that I read a non-fiction book that I have a hard time putting down. The Half-Life of Facts is one of those rare riveting works of non-fiction. I highly recommend it to all.
This book didn't really deliver. While it did provide estimates of the turnover of facts in various areas of knowledge (though based on very doubtful methodologies), it didn't provide much guidance on how to deal with that turnover, and the guidance it did provide was little more than "look it up on the internet". Oddly, effectively outsourcing our memories to the cloud was touted as a good thing. The problem is that this will ultimately lead to the death of innovation in our society, since new ideas almost invariably come about through an unusual combination of existing ideas, and for that to work you have to have ideas in your head in the first place. Instead of making us smarter, what I suspect is happening (from watching the performance of Gen-Yers on quiz shows) is that the next generation is becoming increasingly ignorant of some very basic facts.
Another failing of the book is that many of the facts the author discusses are ones that obviously change (such as the population of the world), where pretty much everyone is already aware that such facts are dynamic. A section on language change was similarly obvious.
So all up not a terribly useful book to read.
I recommend instead "Wrong: Why Experts Keep Failing Us" for a much better guide to misinformation masquerading as fact, and to the woeful inadequacies of medical research in particular.
I tried very hard to like this book. I want to like it a lot more than I did. I also want to remember everything I read, because it is interesting. I know that won't happen, but the takeaway is that things change. Things that can be taken as fact change, so we should make sure to question what a fact is and where it came from, and to understand the life cycle of "facts". He presents a lot of great ideas and uses many good examples and diagrams. I just had such a hard time staying engaged for long periods (or even short periods) of time. I found it a struggle to finish this book, even though it has such great potential and great material!
I would recommend it, but only to someone who is genuinely ambitious about the subject. I thought I was, but I had such a hard time finishing it.
The first chapter explains that the author employs a very loose definition of the word fact; in any number of other places, phrases like 'creation of facts' are used. In other words, the book consists of a massive reversal of consciousness and existence.
The two stars are for the only value the book has, which is to give insight as to how a lot of people see knowledge these days.
Reminiscent of Malcolm Gladwell - lots of entertaining anecdotes loosely woven together around a common theme, but not too deep and little "so what" - more descriptive than prescriptive. Strongest in the middle, beginning and conclusion both weaker. Definitely worth reading, particularly good complement to The Signal and the Noise.
An interesting discussion of how knowledge (facts, scientific discoveries, technology) changes over time. Occasionally it strays too far into "Did you know ... (interesting fact) ...?" territory, which is annoying, but I'm feeling generous, so I'll give it 4 stars instead of 3. In sum, a good start on an interesting topic, but doesn't quite go into enough depth to be really great.
Arbesman, Samuel (2012). The Half-Life of Facts: Why Everything We Know Has an Expiration Date. New York: Current. ISBN 9781101595299. 256 pages. €14.10.
The book's underlying idea is original and stimulating: the facts and notions we learn, which together make up our knowledge, have a limited lifespan, not definable for any specific item but statistically predictable, much as with the decay of radioactive atoms. Arbesman articulates and argues this idea at length and in depth, starting from the discipline in which he is an expert, scientometrics, but then drifts into fields that are perhaps fairly close to the central argument and topic, yet not strictly connected to it. My final impression is that the book thereby loses effectiveness, and that the author misses the chance to hand us an original and memorable idea. I fear a good share of the responsibility for this outcome rests on the shoulders of Arbesman's literary agent, Max Brockman, son and partner of the more famous John Brockman, about whom we have already had occasion to write (here, for example, but also here, here, here, albeit through an unforgivable error, and most recently here).
In short, I came away a little disappointed in the end, not because of specific shortcomings but because of a certain lack of cohesion in the book. What I mean will become clearer by walking through the table of contents, chapter by chapter:
The Half-life of Facts: the chapter devoted to the book's central theme, the idea that knowledge behaves like radioactivity: predicting when a single uranium atom will decay is practically impossible, but predicting when half the atoms in a block of uranium will have decayed is possible in the aggregate (704 million years is the answer). The same, Arbesman argues convincingly, is true of the facts that make up knowledge as a whole or within a single discipline.
The Pace of Discovery: it follows a pattern of exponential, not linear, growth. From here Arbesman moves on to recount the birth of scientometrics and its main results.
The Asymptote of Truth: the half-life-of-facts idea is refined by illustrating the cumulative character of scientific discoveries.
Moore's Law of Everything: the chapter first illustrates the idea that a result like the one implicit in Moore's law emerges from a succession of growth phenomena that each obey a logistic law, and then generalizes the idea to science and technology.
The Spread of Facts: information does not spread instantaneously, and it follows paths that can be studied with network analysis.
Hidden Knowledge: the long tail of expertise and the birth of InnoCentive, and much, much more... In short, one of the chapters richest in ideas but also among the most scattered.
Fact Phase Transitions: apparently in contradiction with the half-life-of-facts idea, the notion that innovations can appear suddenly is in fact a product of the same logic that distinguishes micro behaviors (unpredictable and cumulative) from the emergence of macro ones (the half-life itself, but also phase transitions).
Mount Everest and the Discovery of Error: what facts are and how they change is often a question of measurement (a fascinating theme, though not a new one for a statistician).
The Human Side of Facts: among our various human limits, the chapter focuses on shifting baseline syndrome, which makes us perceive change only against the state of the world at the time of our birth, or at the moment we first became aware of a phenomenon.
At the Edge of What We Know: does our brain, too, have a carrying capacity? Does the whole body of knowledge also follow a logistic curve? (Or is Ray Kurzweil right, and the singularity is near?)
I wouldn't want to leave you with only my not-entirely-convinced review. So, before moving on to the usual anthology of quotations, a word for the defense.
* * *
Let's start with two video appearances by Sam Arbesman himself. Both are presentations of his book, the first at TED:
The second was filmed for the Kauffman Foundation, where Arbesman works:
The third video is an amusing animation:
* * *
And now, a few reviews found around the web:
Antonio Sgobba's review (La vita mortale dei fatti) in la Lettura, Il Corriere della Sera, September 30, 2012.
Daniel Engber, "Truth Decay: A network scientist examines the lifespan of a fact," Slate, October 5, 2012.
Bora Zivkovic's interview in Scientific American (August 10, 2012).
The interview in The Economist, November 28, 2012.
* * *
And here are the usual quotations (referenced, as always, by Kindle location). I recommend giving them at least a glance, because they help explain why, despite the structural flaws noted above, this remains a book worth reading for the many ideas and prompts it contains.
Facts are how we organize and interpret our surroundings. [103]
[W]hen people thought the earth was flat, they were wrong. When people thought the earth was spherical, they were wrong. But if you think that thinking the earth is spherical is just as wrong as thinking the earth is flat, then your view is wronger than both of them put together. [605. The quotation is from: Asimov, Isaac, "The Relativity of Wrong," The Skeptical Inquirer 14, no. 1 (1989): 35-44]
Larger groups of interacting people can maintain skills and innovations, and in turn develop new ones. A small group doesn’t have the benefit of specialization and idea exchange necessary for any of this to happen. [969: Density is destiny]
Viewed this way, a city is then a place where people can easily interact. [1067]
Simkin and Roychowdhury conclude, using some elegant math, that only about 20 percent of scientists who cite an article have actually read that paper. [1483]
In 1771, a French academy offered a prize for finding a vegetable that would provide adequate nutrition during a time of famine. The prize was won two years later by Antoine Parmentier for his suggestion of the potato. [1656]
In 1999, Albert-László Barabási and Réka Albert wrote a celebrated paper that was published in Science, one of the world’s premier scientific journals, about a process they termed preferential attachment. The process is responsible for creating a certain pattern of connections in networks — also known as a long tail of popularity — by the simple rule of the rich getting richer, or in this case, connections begetting more connections. For example, on Twitter there are a few individuals with millions of followers, while most users have only a handful. This paper shows how, by assuming a simple rule that newcomers look at everyone in the network and are more likely to connect with the most popular people, you can explain why you get the properties of the entire network — in Twitter or elsewhere — that we see. Using a wide variety of datasets and some mathematics, they showed this rigorous result. Unfortunately, they weren’t the first. Derek Price, the father of scientometrics, had written a paper in the 1970s showing that one can get this same pattern by invoking a similar rule with respect to how scientific papers cite one another. But Barabási and Albert didn’t know about Price. Price wasn’t the first either. Herbert Simon, a renowned economist, had developed the same idea in the fifties. Which also happened to be the same concept that Udny Yule had published several decades earlier. The general concept of preferential attachment is actually known by many names. It’s known as the Matthew effect, as Robert Merton coined it, in sociology, and is related to something known as Gibrat’s Law when it comes to understanding how cities and firms grow. [1665-1669]
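As a concrete illustration of the rich-get-richer rule described in that quotation, here is a minimal preferential-attachment sketch. It is not the Barabási-Albert model as published (each newcomer here attaches to just one existing node), only the core mechanism:

```python
import random
from collections import Counter

def preferential_attachment(n_nodes: int, seed: int = 0) -> Counter:
    """Grow a network one node at a time; each newcomer links to an existing
    node chosen with probability proportional to its current degree.
    Returns the degree of every node."""
    random.seed(seed)
    # `endpoints` holds one entry per edge endpoint, so a uniform draw from it
    # is a draw proportional to degree ('the rich get richer').
    endpoints = [0, 1]          # start with two connected nodes
    for new_node in range(2, n_nodes):
        target = random.choice(endpoints)
        endpoints.extend([new_node, target])
    return Counter(endpoints)

if __name__ == "__main__":
    degrees = preferential_attachment(10_000)
    print("biggest hubs:", degrees.most_common(3))             # a few huge hubs
    print("typical degree:", sorted(degrees.values())[5_000])  # typically 1
```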
To understand that sort of thing, or any other system for which we want to explain a certain phenomenon, we need to create much simpler models. These don’t make any claims for verisimilitude. Instead, they go to the other extreme and claim the following: We can make an extremely basic model that even with all the complexity of real life stripped away still has certain features of our complicated world. And if we can capture these features of our world, maybe we can understand why they occur. In our case, the question is whether a simple model can be made that exhibits phase transitions. [2008]
For example, the larger the population of a city, the smaller the number of gas stations that are necessary per capita; gas stations might be indicative of energy usage of the city as a whole, and it seems that larger cities are more efficient consumers of energy. This is similar to how larger organisms are more energy efficient than smaller ones. [2193]
Precision refers to how consistent one’s measurements are from time to time. [2407] Accuracy refers to how similar one’s measurements are to the real value. [2411]
“Statistics is the science that lets you do twenty experiments a year and publish one false result in Nature.” [2475. The quotation is attributed to John Maynard Smith]
Shifting baseline syndrome was first identified and named by Daniel Pauly to refer to what happened with fish populations throughout the world. [2774]
Alan Kay, a pioneering computer scientist, defined technology as “anything that was invented after you were born.” [2786. The quotation is from: Kelly, Kevin, What Technology Wants, New York: Viking, 2010, p. 235]
While we are nowhere near the end of science — the sum of what we don’t know is staggering — we might very well be in a logistic curve of ever-changing knowledge as well, rather than one of exponential growth. One of the reasons I believe this could be true is simple: demographics. It seems unlikely that the rapid population growth will continue growing faster and faster. Whenever a country has become industrialized, its development has gone hand in hand with a drop in birth rate. Therefore, as the world as a whole advances technologically, population will cease to grow at the frenetic pace of previous decades and centuries. Combined with energy constraints — we are nowhere near our limits, but our energy resources are certainly not unbounded — exponential knowledge growth cannot continue forever. On the other hand, as computational power advances, computer-aided scientific discovery could push this slowdown far off into the future. [3277]
Interesting perspective on how scientific knowledge has an expiry date, though draggy at times. The bit about exponential growth propelling innovation seems to jar against the notion of sustainability, though allusions to logistic curves and carrying capacities alleviate that somewhat. tl;dr - don't be so sure of what you know.
___ Perhaps more derivative fields (e.g. medicine) move more slowly compared to the basic areas of knowledge on which they depend.
By their 40s, Nobel laureates are first authors on only 26% of their papers, as compared to their less accomplished contemporaries (56%). Nicer people are indeed more creative, successful, and likely to win Nobel prizes.
If you uttered the statement "80% of all the scientists who have ever lived are alive today" nearly anytime in the past 300 years, you'd be right. (exponential growth of people doing science)
When someone develops a new innovation, it is largely untested. It might be better than what is currently in use, but it is clearly a work in progress. Thus the new technology is initially only a little better. As it becomes refined (the bit that distinguishes engineering and practical application from basic science), people begin to realise the potential of this new innovation.
Science is about understanding the origins, nature, and behaviour of the universe and all it contains: engineering is about solving problems by rearranging the stuff of the world to make new things. Science modifies what we know about the world, technology modifies what we can do in the world.
Many economists argue that population growth has grown hand in hand with innovation and the development of new facts (cities as hotbeds of innovation).
Berlin's expanse grew according to a simple rule of thumb: the distance reachable in 30 minutes or less. A city can be said to be a place where people can easily interact.
Facts spread by social networks. And medium strength ties are the most important for such spread. They are the happy medium between weak ties that don't spread anything, and strong ties that don't spread new information (informationally inbred).
Hidden knowledge has many forms. At its most basic level it can consist of pieces of information that are unknown, or are known only to a few, and for all practical purposes, still need to be revealed. Other times it includes facts that are part of undiscovered public knowledge, when bits of knowledge need to be connected to other pieces of information in order to yield new facts.
Revolutions in science have often been preceded by revolutions in measurement.
Atomic weights vary: which country a sample is taken from, or even what type of water the element is found in, can give a different isotope mixture.
As the saying among doctors goes: hurry up and use a new drug while it still works.
The smaller the effect sizes in a scientific field, the less likely it is the research findings are to be true. If an effect is small, we could simply be measuring noise.
Scientists rarely perform confirmatory replications of experiments. "I've got my own science to do".
John Maynard Keynes: When the facts change, I change my mind. What do you do, sir?
A surefire way of adhering to a certain viewpoint: have a close relative take the opposite position.
Whichever bias we are subject to, factual inertia permeates our entire lives.
We have a tendency to reject anything newer than our own childhood.
Science is also subject to our baser instincts: Data is hoarded, scientists refuse to collaborate, and grudges can play a role in peer review.
By not relying on our own memories, we become more likely to be up-to-date in our facts, because the newest knowledge is more likely to be online than in our own heads.
Errors do not lead us away from the truth. They edge us incrementally toward it.
An intriguing book that didn't go anywhere useful.
Samuel Arbesman seems a brilliant academic and knows about many different fields of science and mathematics. He uses that knowledge to show how quickly what we "know" to be true changes and knowledge advances (always through science from his world view). One interesting point he made is how quickly that process is changing in the modern world. Where it used to take centuries or decades for basic knowledge to change, in our world that pace of change is down to years or months.
The charm of his book is Arbesman's illustrative storytelling from the history of science or math to make his point. The book is full of interesting little vignettes from which one is likely to draw bits of wisdom even beyond what Arbesman intended.
But, other than describing how quickly facts change, Arbesman never really draws any conclusions or offers any wisdom about what it means or what one should do about it. In a final chapter, he acknowledges there may be limits to the amount we can know and to the pace of change, but he doesn't wrestle with what that means or how to recognize that we've hit those limits. Likewise, he attempts to show we shouldn't fear change--even very rapid change--but in making his point he compares the rapid growth of Portuguese dominance of the global seas, and the rapid increase in the accuracy of our measurement of time, to the potential creation of superhuman artificial intelligence. This seems an unhelpful comparison to me.
The book is useful to remind ourselves that what we think we know isn't necessarily so (and to remind us that today's "fact checkers" are quite often mistaken or will be proven so). It is charming for its stories. But it doesn't challenge the reader to any real action or change.