Minimax Rates For Cost-sensitive Learning On Manifolds With Approximate Nearest Neighbours | Awesome Similarity Search Papers


Henry W. J. Reeve, Gavin Brown · Algorithmic Learning Theory 2017 · 2018

We study the approximate nearest neighbour method for cost-sensitive classification on low-dimensional manifolds embedded within a high-dimensional feature space. We determine the minimax learning rates for distributions on a smooth manifold in a cost-sensitive setting, generalising a classic result of Audibert and Tsybakov. Building upon recent work of Chaudhuri and Dasgupta, we prove that these minimax rates are attained by the approximate nearest neighbour algorithm, where neighbours are computed in a randomly projected low-dimensional space. In addition, we give a bound on the number of dimensions required for the projection which depends solely upon the reach and dimension of the manifold, together with the regularity of the marginal.
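The scheme described in the abstract can be sketched as follows: project the data down with a Gaussian random projection, find the k nearest neighbours in the projected space, and make a cost-sensitive decision by comparing the neighbours' positive fraction against a cost-derived threshold. This is a minimal illustrative sketch, not the authors' exact algorithm; all names and parameters (`d_proj`, `cost_fp`, `cost_fn`) are assumptions.

```python
import numpy as np

def projected_knn_predict(X_train, y_train, X_query, d_proj,
                          k=5, cost_fp=1.0, cost_fn=1.0, rng=None):
    """Cost-sensitive k-NN where neighbours are computed after a
    Gaussian random projection to d_proj dimensions (a sketch of the
    approach in the abstract; notation is illustrative).

    y_train is binary (0/1). Predict 1 when the fraction of positive
    neighbours exceeds the cost threshold cost_fp / (cost_fp + cost_fn).
    """
    rng = np.random.default_rng(rng)
    D = X_train.shape[1]
    # Gaussian random projection; 1/sqrt(d_proj) scaling roughly
    # preserves distances (Johnson-Lindenstrauss style).
    P = rng.normal(size=(D, d_proj)) / np.sqrt(d_proj)
    Z_train, Z_query = X_train @ P, X_query @ P
    threshold = cost_fp / (cost_fp + cost_fn)
    preds = []
    for z in Z_query:
        dists = np.linalg.norm(Z_train - z, axis=1)
        nn = np.argsort(dists)[:k]  # k approximate nearest neighbours
        preds.append(int(y_train[nn].mean() > threshold))
    return np.array(preds)
```

With equal costs the threshold is 1/2 and this reduces to ordinary majority-vote k-NN in the projected space; unequal costs shift the decision boundary toward the cheaper error.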
