The standard multi-armed bandit problem assumes that the probabilities with which the arms pay off are fixed over time. But that’s not necessarily true of airlines, restaurants, or other contexts in which people have to make repeated choices.
Algorithms to Live By: The Computer Science of Human Decisions
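
The quote contrasts the textbook bandit, whose payoff probabilities never change, with the "restless" real world where they drift. Below is a minimal sketch of that contrast, assuming a Bernoulli bandit whose arm probabilities wander by a small random walk and an epsilon-greedy player that uses a constant step size so recent payoffs count for more; all names and parameters here are illustrative, not taken from the book.

```python
# Sketch (not from the book): a fixed-probability bandit vs. a "restless" one
# whose payoff probabilities drift over time. Parameter names are assumptions.
import random

def pull(probabilities, arm):
    """Return 1 with the arm's current payoff probability, else 0."""
    return 1 if random.random() < probabilities[arm] else 0

def drift(probabilities, drift_rate=0.01):
    """Nudge each arm's payoff probability by a small random walk, clamped to [0, 1]."""
    return [min(1.0, max(0.0, p + random.uniform(-drift_rate, drift_rate)))
            for p in probabilities]

def epsilon_greedy(n_arms=3, n_rounds=10_000, epsilon=0.1, step_size=0.1, restless=True):
    """Play the bandit; constant step-size estimates adapt when probabilities move."""
    probabilities = [random.random() for _ in range(n_arms)]
    estimates = [0.0] * n_arms
    total_reward = 0
    for _ in range(n_rounds):
        if random.random() < epsilon:
            arm = random.randrange(n_arms)          # explore a random arm
        else:
            arm = estimates.index(max(estimates))   # exploit the current best estimate
        reward = pull(probabilities, arm)
        estimates[arm] += step_size * (reward - estimates[arm])
        total_reward += reward
        if restless:
            probabilities = drift(probabilities)    # the world changes under our feet
    return total_reward

if __name__ == "__main__":
    print("reward with restless arms:", epsilon_greedy(restless=True))
    print("reward with fixed arms:   ", epsilon_greedy(restless=False))
```

With `restless=False` the sketch matches the standard assumption the quote describes; with `restless=True` it illustrates why strategies that keep exploring, rather than locking in on one arm, remain useful when restaurants, airlines, and other options change underneath repeated choices.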