We know that in 3D space a large fraction of a cube's volume is taken up by its inscribed sphere. This fraction shrinks as we move up in dimensions: as the number of dimensions tends to infinity, the volume of the unit hypersphere tends to zero. In other words, the share of the hypercube's interior occupied by the inscribed hypersphere vanishes, most of the hypercube's volume ends up near its corners, and every vertex sits equally far from the center. What's all this got to do with the k-NN algorithm and machine learning? Well, let's say that data points
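As a quick sanity check on that shrinking fraction, we can compute it directly (a minimal sketch, not from the original text): a ball of radius 1/2 inscribed in the unit cube [0,1]^d has volume π^(d/2) / Γ(d/2 + 1) · (1/2)^d, while the cube's volume stays 1.

```python
import math

def inscribed_ball_fraction(d):
    # Fraction of the unit hypercube [0,1]^d occupied by its inscribed
    # ball of radius 1/2, using the d-dimensional ball volume formula:
    # V(r) = pi^(d/2) / Gamma(d/2 + 1) * r^d
    return math.pi ** (d / 2) / math.gamma(d / 2 + 1) * 0.5 ** d

for d in (2, 3, 10, 50):
    print(f"d = {d:>2}: fraction = {inscribed_ball_fraction(d):.2e}")
```

At d = 2 the fraction is π/4 ≈ 0.785 and at d = 3 it is π/6 ≈ 0.524, but by d = 10 it has already dropped below 0.3%, and at d = 50 it is effectively zero: nearly all of the cube's volume lies outside the ball, out toward the corners.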

