Description
We place ourselves in the setting of high-dimensional statistical inference, where the number of variables p in a dataset of interest is of the same order of magnitude as the number of observations n. We consider the spectrum of certain kernel random matrices, in particular n x n matrices whose (i,j)-th entry is f(X_i'X_j/p) or f(||X_i - X_j||^2/p), where the X_i are independent data vectors in dimension p and f is assumed to be a locally smooth function. The study is motivated by questions arising in statistics and computer science, where these matrices are used to perform, among other things, non-linear versions of principal component analysis. Surprisingly, we show that in high dimensions, and for the models we analyze, the problem becomes essentially linear, which is at odds with heuristics sometimes used to justify the use of these methods. The analysis also highlights certain peculiarities of models widely studied in random matrix theory and raises questions about their relevance as tools for modeling the high-dimensional data encountered in practice.
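To make the setting concrete, here is a minimal numerical sketch (Python with NumPy; the Gaussian data model, the choice f = exp, and the exact form of the surrogate matrix M below are illustrative assumptions, not the paper's statements). It builds an inner-product kernel matrix with entries f(X_i'X_j/p) and compares it, in operator norm, with a linear surrogate suggested by a Taylor expansion of f around 0, since the off-diagonal inner products X_i'X_j/p are of order 1/sqrt(p) in this regime.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 500, 500                  # high-dimensional regime: p of the same order as n
X = rng.standard_normal((n, p))  # rows X_i ~ N(0, I_p): an illustrative data model

f = np.exp                       # a locally smooth f (illustrative choice)
f0, df0, d2f0 = 1.0, 1.0, 1.0    # f(0), f'(0), f''(0) for f = exp
f1 = np.e                        # f(1), since ||X_i||^2/p concentrates around 1 here

G = X @ X.T / p                  # Gram matrix of inner products X_i'X_j / p
K = f(G)                         # kernel random matrix K_ij = f(X_i'X_j / p)

# Linear surrogate from expanding f around 0 on the off-diagonal entries,
# with a correction on the diagonal (where K_ii = f(||X_i||^2/p) ~ f(1)):
ones = np.ones((n, n))
M = (f0 + d2f0 / (2 * p)) * ones + df0 * G + (f1 - f0 - df0) * np.eye(n)

# A small relative gap illustrates the linearization phenomenon
rel_err = np.linalg.norm(K - M, 2) / np.linalg.norm(K, 2)
print(f"relative operator-norm gap ||K - M|| / ||K||: {rel_err:.4f}")
```

The random part of M is just a scaled Gram (sample-covariance-type) matrix, an object of classical random matrix theory; this is one way to see the sense in which, despite the non-linear f, the spectral problem becomes essentially linear in high dimensions.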