An Empirical Study on The Properties of Random Bases for Kernel Methods: Implementation and Extensions
We compare the performance of four types of increasingly adapted basis methods for kernel approximation on toy data and on a real-world dataset of Boston housing prices. We compare the methods’ predictive performance, their ability to replicate the true kernel, and their behavior on out-of-distribution data. We observe that the improvement in performance with increasing basis adaptation noted by Alber et al. is less pronounced in our toy-data regression tasks than in the image classification tasks of the original paper. Although our results suggest that the performance gains of adapted kernel methods are not always worth their added complexity, the work of Alber et al. provides an enlightening illustration of the connection between Gaussian processes and neural networks.
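The least-adapted end of the spectrum studied here is the classical random-feature construction, in which fixed random bases approximate a kernel. As a minimal sketch (not the paper's implementation; the names `W`, `b`, and `phi` are illustrative, and a unit-bandwidth Gaussian kernel is assumed), random Fourier features approximate an RBF kernel by an inner product of cosine features:

```python
import numpy as np

# Sketch of random Fourier features approximating the RBF kernel
# k(x, y) = exp(-||x - y||^2 / 2). Hypothetical names; unit bandwidth assumed.
rng = np.random.default_rng(0)
d, D = 5, 2000                        # input dimension, number of random features

W = rng.standard_normal((d, D))       # random frequencies ~ N(0, I)
b = rng.uniform(0.0, 2.0 * np.pi, D)  # random phases ~ Uniform[0, 2*pi)

def phi(X):
    """Map rows of X to the random feature space: sqrt(2/D) * cos(X W + b)."""
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

x = rng.standard_normal(d)
y = rng.standard_normal(d)

exact = np.exp(-np.linalg.norm(x - y) ** 2 / 2.0)   # true kernel value
approx = float(phi(x[None, :]) @ phi(y[None, :]).T)  # random-feature estimate
```

With `D = 2000` features the Monte Carlo error of `approx` is typically on the order of a few percent; the adapted variants compared in this study instead tune such bases to the data rather than drawing them once at random.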