The rationale behind fitting a model to the whole training dataset after k-fold cross-validation is that providing more training samples to a learning algorithm usually results in a more accurate and robust model. Since k-fold cross-validation is a resampling technique without replacement, the advantage of this approach is that each sample point will be used for training and validation (as part of a test fold) exactly once, which yields a lower-variance estimate of the model performance than the holdout method.
Python Machine Learning: Machine Learning and Deep Learning with Python, scikit-learn, and TensorFlow
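A minimal sketch of this workflow, assuming scikit-learn; the breast-cancer dataset, the logistic-regression pipeline, and the choice of k=10 are illustrative assumptions, not the book's own example:

from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import StratifiedKFold, cross_val_score, train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

# Toy dataset; hold out a separate test set before any cross-validation
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=1)

pipe = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))

# k-fold CV on the training set: each training sample lands in
# exactly one test fold, giving a lower-variance performance estimate
scores = cross_val_score(pipe, X_train, y_train,
                         cv=StratifiedKFold(n_splits=10))
print(f"CV accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")

# After model selection, refit on the *whole* training set so the
# final model benefits from all available training samples
pipe.fit(X_train, y_train)
print(f"Test accuracy: {pipe.score(X_test, y_test):.3f}")

Note that the refit at the end uses every training sample, which is exactly the rationale in the passage above: the cross-validation folds are only for estimating performance, while the deployed model is trained on all of the data.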