Bayesian Cognitive Modeling Quotes
Bayesian Cognitive Modeling: A Practical Course
by Michael D. Lee
“Thus, a posterior distribution describes our uncertainty with respect to a parameter of interest, and the posterior is useful—or, as a Bayesian would have it, necessary—for probabilistic prediction and for sequential updating.”
― Bayesian Cognitive Modeling: A Practical Course
“This again contrasts with orthodox inference, in which inference for sequential designs is radically different from that for non-sequential designs (for a discussion, see, for example, Anscombe, 1963).”
― Bayesian Cognitive Modeling: A Practical Course
“Instead of “integrating over the posterior,” orthodox methods often use the “plug-in principle.” In this case, the plug-in principle suggests that we predict solely based on θ̂, the maximum likelihood estimate. Why is this generally a bad idea? Can you think of a specific situation in which this may not be so much of a problem?”
― Bayesian Cognitive Modeling: A Practical Course
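The contrast the quote asks about can be illustrated numerically. The specifics below are assumptions, not part of the quote: the book's opening example of 9 correct responses out of 10 questions with a uniform prior, which gives a Beta(10, 2) posterior on the rate θ and an MLE of 0.9. The sketch compares predictions for 5 new questions made by plugging in the MLE against predictions that integrate over the posterior:

```python
import numpy as np

rng = np.random.default_rng(1)
n_new = 5

# Plug-in principle: fix theta at the maximum likelihood estimate 9/10.
plug_in = rng.binomial(n_new, 0.9, size=100_000)

# Bayesian alternative: draw a fresh theta from the posterior Beta(10, 2)
# for every simulated prediction, so uncertainty about theta propagates.
theta = rng.beta(10.0, 2.0, size=100_000)
bayes = rng.binomial(n_new, theta)

# The plug-in predictive is too narrow: it ignores uncertainty about theta.
print("plug-in sd:          ", plug_in.std())
print("posterior predictive sd:", bayes.std())
```

This also suggests an answer to the quote's second question: with a large sample the posterior concentrates near the MLE, the drawn θ values barely vary, and the two predictive distributions nearly coincide, so the plug-in principle is least harmful when data are plentiful.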
“In other words, how can we use the posterior distribution p(θ | data)—which, after all, represents everything that we know about θ from the old set—to predict the number of correct responses out of the new set of questions? The mathematical solution is to integrate over the posterior, p(k′ | data) = ∫ p(k′ | θ) p(θ | data) dθ, where k′ is the predicted number of correct responses out of the additional set of 5 questions. Computationally, you can think of this procedure as repeatedly drawing a random value θ from the posterior, and using that value every time to determine a single k′. The end result is p(k′ | data), the posterior predictive distribution of the possible number of correct responses in the additional set of 5 questions. The important point is that by integrating over the posterior, all predictive uncertainty is taken into account.”
― Bayesian Cognitive Modeling: A Practical Course
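The computational recipe described in this quote—draw θ from the posterior, simulate a prediction, repeat—can be sketched in a few lines of Python. The Beta(10, 2) posterior below is an assumption, matching the book's example of 9 correct responses out of 10 questions with a uniform Beta(1, 1) prior:

```python
import numpy as np

rng = np.random.default_rng(0)

# Posterior on the rate theta after 9 correct out of 10 with a
# uniform Beta(1, 1) prior: Beta(1 + 9, 1 + 1) = Beta(10, 2).
theta = rng.beta(10.0, 2.0, size=100_000)

# For each posterior draw, simulate the number correct out of 5 new questions.
predicted = rng.binomial(5, theta)

# Relative frequencies approximate the posterior predictive distribution
# over 0..5 correct responses.
predictive = np.bincount(predicted, minlength=6) / predicted.size
print(predictive)
```

Because each simulated count uses its own posterior draw of θ, the resulting distribution is wider than any single-θ binomial: exactly the "all predictive uncertainty is taken into account" point of the quote.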
“On his blog, prominent Bayesian Andrew Gelman wrote (March 18, 2010): “Some probabilities are more objective than others. The probability that the die sitting in front of me now will come up ‘6’ if I roll it …that’s about 1/6. But not exactly, because it’s not a perfectly symmetric die. The probability that I’ll be stopped by exactly three traffic lights on the way to school tomorrow morning: that’s, well, I don’t know exactly, but it is what it is.” Was de Finetti wrong, and is there only one clearly defined probability of Andrew Gelman encountering three traffic lights on the way to school tomorrow morning?”
― Bayesian Cognitive Modeling: A Practical Course
“To understand why de Finetti wrote this, consider the following situation: someone tosses a fair coin, and the outcome will be either heads or tails. What do you think the probability is that the coin lands heads up? Now suppose you are a physicist with advanced measurement tools, and you can establish relatively precisely both the position of the coin and the tension in the muscles immediately before the coin is tossed in the air—does this change your probability? Now suppose you can briefly look into the future (Bem, 2011), albeit hazily. Is your probability still the same?”
― Bayesian Cognitive Modeling: A Practical Course
“Bayesian x% credible interval that extends from the ((100 − x)/2)th to the ((100 + x)/2)th percentile of the posterior distribution. For the posterior distribution in Figure 1.1, a 95% Bayesian credible interval for θ extends from 0.59 to 0.98. In contrast to the orthodox confidence interval, this means that one can be 95% confident that the true value of θ lies in between 0.59 and 0.98.”
― Bayesian Cognitive Modeling: A Practical Course
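For a sample-based posterior, the percentile interval described above is essentially a one-liner. A sketch, assuming the Beta(10, 2) posterior behind the book's Figure 1.1 example (the quote itself only reports the resulting interval):

```python
import numpy as np

rng = np.random.default_rng(2)

# Draws from the posterior on theta (Beta(10, 2) in the Figure 1.1 example).
theta = rng.beta(10.0, 2.0, size=200_000)

# A 95% credible interval runs from the 2.5th to the 97.5th percentile,
# i.e. x = 95 in the (100 - x)/2 and (100 + x)/2 formulas above.
lo, hi = np.percentile(theta, [2.5, 97.5])
print(round(lo, 2), round(hi, 2))  # close to the 0.59 and 0.98 in the quote
```

The direct "95% confident that θ lies in the interval" reading is what distinguishes this from an orthodox confidence interval, whose 95% refers to the long-run coverage of the procedure, not to any one realized interval.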
“reduced our uncertainty about the value of θ, as shown by the posterior distribution being narrower than the prior distribution.”
― Bayesian Cognitive Modeling: A Practical Course
“marginal likelihood (i.e., the probability of the observed data) does not involve the parameter θ, and is given by a single number that ensures that the area under the posterior distribution equals 1.”
― Bayesian Cognitive Modeling: A Practical Course
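A grid approximation makes the quote's point concrete: the marginal likelihood is a single number, obtained by integrating likelihood times prior over θ, and dividing by it leaves a posterior with area 1. The 9-out-of-10 data and uniform prior below are assumptions matching the book's opening example:

```python
import math
import numpy as np

theta = np.linspace(0.0, 1.0, 10_001)
dtheta = theta[1] - theta[0]

prior = np.ones_like(theta)                              # uniform prior
likelihood = math.comb(10, 9) * theta**9 * (1 - theta)   # 9 correct of 10

# Marginal likelihood: probability of the data, integrated over theta.
# Note that theta has been summed out: the result is a single number.
marginal = np.sum(likelihood * prior) * dtheta

posterior = likelihood * prior / marginal
print(marginal)                    # the normalizing constant
print(np.sum(posterior) * dtheta)  # area under the posterior: 1
```

For this example the integral can also be done exactly, giving 10 · B(10, 2) = 1/11 ≈ 0.091, which the grid sum reproduces to several decimal places.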
“The mode of the posterior distribution for θ is 0.9, equal to the maximum likelihood estimate (MLE), and the 95% credible interval extends from 0.59 to 0.98.”
― Bayesian Cognitive Modeling: A Practical Course
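The agreement between posterior mode and MLE can be checked analytically, assuming (as in the book's example) 9 correct out of 10 and a uniform Beta(1, 1) prior, so that the posterior is Beta(10, 2); with a non-uniform prior the two would generally differ:

```python
# Posterior Beta(a, b) after k = 9 correct in n = 10 with a Beta(1, 1) prior.
k, n = 9, 10
a, b = 1 + k, 1 + (n - k)               # Beta(10, 2)

posterior_mode = (a - 1) / (a + b - 2)  # mode of Beta(a, b), valid for a, b > 1
mle = k / n                             # maximum likelihood estimate k/n

print(posterior_mode, mle)              # both 0.9
```

With a uniform prior the posterior is proportional to the likelihood, so its mode must sit at the MLE; the match at 0.9 is a consequence of that choice of prior, not a general property.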
“Chapter 14: Multinomial processing trees • Matzke, D., Dolan, C. V., Batchelder, W. H., & Wagenmakers, E.-J. (in press). Bayesian estimation of multinomial processing tree models with heterogeneity in participants and items. Psychometrika.”
― Bayesian Cognitive Modeling: A Practical Course
“Chapter 6: Latent-mixture models • Ortega, A., Wagenmakers, E.-J., Lee, M. D., Markowitsch, H. J., & Piefke, M. (2012). A Bayesian latent group analysis for detecting poor effort in the assessment of malingering. Archives of Clinical Neuropsychology, 27, 453–465.”
― Bayesian Cognitive Modeling: A Practical Course
“Wagenmakers, E.-J., Lodewyckx, T., Kuriyal, H., & Grasman, R. (2010). Bayesian hypothesis testing for psychologists: A tutorial on the Savage–Dickey method.”
― Bayesian Cognitive Modeling: A Practical Course
