Statistics Done Wrong Quotes

Statistics Done Wrong: The Woefully Complete Guide by Alex Reinhart
1,080 ratings, 4.19 average rating, 120 reviews
Statistics Done Wrong Quotes Showing 1-16 of 16
“Misconceptions are like cockroaches: you have no idea where they came from, but they’re everywhere—often where you don’t expect them—and they’re impervious to nuclear weapons.”
Alex Reinhart, Statistics Done Wrong: The Woefully Complete Guide
“The first principle is that you must not fool yourself, and you are the easiest person to fool. — Richard P. Feynman”
“Your eyeball is not a well-defined statistical procedure.”
“two experiments with different designs can produce identical data but different p values because the unobserved data is different. Suppose I ask you a series of 12 true-or-false questions”
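The quote above is truncated, but it sets up a classic example (the function names and numbers below are my own illustration, not the book's code): a test-taker answers 9 of 12 true-or-false questions correctly, and the same data produces different one-sided p values depending on the unstated stopping rule.

```python
from math import comb

def p_fixed_n(correct=9, n=12):
    """Binomial design: exactly n questions were planned in advance.
    p = P(at least `correct` right out of n) under pure guessing (p = 0.5)."""
    return sum(comb(n, k) for k in range(correct, n + 1)) / 2**n

def p_stop_at_wrong(correct=9, wrong=3):
    """Negative-binomial design: keep asking until `wrong` answers are missed.
    p = P(it takes at least correct + wrong questions to finish), i.e.
    at most wrong - 1 misses in the first correct + wrong - 1 questions."""
    n = correct + wrong - 1
    return sum(comb(n, k) for k in range(wrong)) / 2**n

print(p_fixed_n())        # ~0.0730, not significant at the 0.05 level
print(p_stop_at_wrong())  # ~0.0327, significant at the 0.05 level
```

Identical answers, different unobserved data (the outcomes that *could* have happened under each design), and hence p values on opposite sides of 0.05.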
“There’s no mathematical tool to tell you whether your hypothesis is true or false; you can see only whether it’s consistent with the data. If the data is sparse or unclear, your conclusions will be uncertain.”
“As psychologist Paul Meehl complained,

Meanwhile our eager-beaver researcher, undismayed by logic-of-science considerations and relying blissfully on the “exactitude” of modern statistical hypothesis-testing, has produced a long publication list and been promoted to a full professorship. In terms of his contribution to the enduring body of psychological knowledge, he has done hardly anything. His true position is that of a potent-but-sterile intellectual rake, who leaves in his merry path a long train of ravished maidens but no viable scientific offspring.”
“You can likely get good advice in exchange for some chocolates or a beer.”
“Raw data has to be edited, converted to other formats, and linked with other datasets; statistical analysis has to be performed, sometimes with custom software; and plots and tables have to be created from the results. This is often done by hand, with bits of data copied and pasted into different data files and spreadsheets—a tremendously error-prone process.”
“There’s a common misconception that statistics is boring and monotonous. Collect lots of data; plug numbers into Excel, SPSS, or R; and beat the software with a stick until it produces colorful charts and graphs. Done!”
“Remember that “statistically insignificant” does not mean “zero.” Even if your result is insignificant, it represents the best available estimate given the data you have collected. “Not significant” does not mean “nonexistent.””
“When a study claims to have detected a large effect with a relatively small sample, your first reaction should not be “Wow, they’ve found something big!” but “Wow, this study is underpowered!””
“Statistical significance is often a crutch, a catchier-sounding but less informative substitute for a good confidence interval.”
“Rather than taking the time to understand the interesting parts of scientific research, armchair statisticians snipe at news articles, using the vague description of the study regurgitated from some overenthusiastic university press release to criticize the statistical design of the research.”
“Correlation doesn’t imply causation, but it does waggle its eyebrows suggestively and gesture furtively while mouthing, “Look over there.””
“But most other fields do not use protocol registration, and researchers have the freedom to use whatever methods they feel appropriate. For example, in a survey of academic psychologists, more than half admitted to deciding whether to collect more data after checking whether their results were significant, usually concealing this practice in publications.”
“Notice that the p value works by assuming there is no difference between your experimental groups. This is a counterintuitive feature of significance testing: if you want to prove that your drug works, you do so by showing the data is inconsistent with the drug not working. Because of this, p values can be extended to any situation where you can mathematically express a hypothesis you want to knock down.

But p values have their limitations. Remember, p is a measure of surprise, with a smaller value suggesting that you should be more surprised. It’s not a measure of the size of the effect. You can get a tiny p value by measuring a huge effect — “This medicine makes people live four times longer” — or by measuring a tiny effect with great certainty. And because any medication or intervention usually has some real effect, you can always get a statistically significant result by collecting so much data that you detect extremely tiny but relatively unimportant differences. As Bruce Thompson wrote,

Statistical significance testing can involve a tautological logic in which tired researchers, having collected data on hundreds of subjects, then conduct a statistical test to evaluate whether there were a lot of subjects, which the researchers already know, because they collected the data and know they are tired. This tautology has created considerable damage as regards the cumulation of knowledge.”
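The "so much data that you detect extremely tiny differences" point can be made concrete with a minimal sketch (my own illustration, not from the book): fix a tiny true effect and watch the expected two-sided p value of a one-sample z-test collapse as the sample grows.

```python
from math import erfc, sqrt

def expected_p(effect, sd, n):
    """Two-sided normal p value at the *expected* z-statistic for a
    one-sample z-test of mean 0, given a true mean of `effect`,
    standard deviation `sd`, and sample size `n`.
    z = effect / sd * sqrt(n); p = erfc(|z| / sqrt(2))."""
    z = effect / sd * sqrt(n)
    return erfc(abs(z) / sqrt(2))

# A true effect of 0.01 standard deviations: trivial in practice,
# but "statistically significant" once n is large enough.
for n in (100, 10_000, 1_000_000):
    print(f"n = {n:>9}: expected p = {expected_p(0.01, 1.0, n):.4g}")
```

At n = 100 the expected p is about 0.92; at n = 10,000 it is about 0.32; at n = 1,000,000 it is astronomically small, even though the effect itself never changed. Significance here measures the sample size, not the importance of the finding.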