Perhaps the most important feature of the k-NN algorithm is that it's a so-called nonparametric model. Cast your mind back to the perceptron. Once you have trained a model on some initial training dataset, the perceptron is characterized entirely by its weight vector, w. The number of elements of this vector equals the number of parameters that define the perceptron, and it does not depend on the amount of training data. You could train the perceptron on one hundred instances of data or a million, but at the end of the training session the hyperplane would still be defined by w.
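A minimal sketch can make this contrast concrete. The helper names below (`train_perceptron`, `fit_knn`) are hypothetical, not from any particular library: the perceptron's parameter count is fixed by the input dimension, while "fitting" a k-NN model just stores the training data, so its size grows with the dataset.

```python
def train_perceptron(data, epochs=20, lr=0.1):
    """Perceptron training: the model is a fixed-size weight vector."""
    dim = len(data[0][0])
    w = [0.0] * (dim + 1)  # weights plus bias: size depends only on dim
    for _ in range(epochs):
        for x, y in data:
            xb = list(x) + [1.0]  # append bias input
            pred = 1 if sum(wi * xi for wi, xi in zip(w, xb)) > 0 else 0
            if pred != y:  # classic perceptron update on mistakes
                for i in range(len(w)):
                    w[i] += lr * (y - pred) * xb[i]
    return w

def fit_knn(data):
    """'Training' a k-NN model is just memorizing the instances."""
    return list(data)

small = [([0.0, 0.0], 0), ([1.0, 1.0], 1)] * 50   # 100 instances
large = small * 100                                # 10,000 instances

# Parameter count is fixed by the input dimension, not the data size.
assert len(train_perceptron(small)) == len(train_perceptron(large)) == 3

# The k-NN "model," by contrast, grows with the training set.
print(len(fit_knn(small)), len(fit_knn(large)))
```

The point of the sketch is only the size comparison: both perceptrons have three parameters (two weights and a bias), whereas the stored k-NN model is a hundred times larger for the larger dataset.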

