Large width nearest prototype classification on general distance spaces
In this paper we consider the problem of learning nearest-prototype classifiers in any finite distance space; that is, in any finite set equipped with a distance function. An important advantage of a distance space over a metric space is that the triangle inequality need not be satisfied, which makes our results potentially very useful in practice. We consider a family of binary classifiers for learning nearest-prototype classification on distance spaces, building on the concept of large-width learning, which we introduced and studied in earlier works. Nearest-prototype classification is a more general version of the ubiquitous nearest-neighbor classifier: a prototype may or may not be a sample point. One advantage of the approach taken in this paper is that the error bounds depend on a 'width' parameter, which can be sample-dependent and thereby yield a tighter bound.
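As a rough illustration of the setting (not the paper's algorithm or bounds), the sketch below shows nearest-prototype classification over a finite distance space: the distance is just a lookup table over pairs of points, it need not satisfy the triangle inequality, and prototypes need not be sample points. All names and the toy distance table are hypothetical.

```python
def nearest_prototype_label(x, prototypes, labels, dist):
    """Return the label of the prototype closest to x under dist.

    x           -- a point of the finite space
    prototypes  -- points chosen as prototypes (need not be sample points)
    labels      -- binary labels (+1/-1), one per prototype
    dist        -- dict mapping (point, point) pairs to non-negative distances
    """
    best_label, best_dist = None, float("inf")
    for p, y in zip(prototypes, labels):
        d = dist[(x, p)]
        if d < best_dist:
            best_dist, best_label = d, y
    return best_label


# Toy distance table that violates the triangle inequality:
# d(a, c) = 5.0 > d(a, b) + d(b, c) = 3.0.
points = ["a", "b", "c", "d"]
dist = {
    ("a", "b"): 1.0, ("b", "c"): 2.0, ("a", "c"): 5.0,
    ("a", "d"): 2.0, ("b", "d"): 0.5, ("c", "d"): 0.5,
}
# Make the table total: symmetric entries and zero self-distances.
for (u, v), d in list(dist.items()):
    dist[(v, u)] = d
for p in points:
    dist[(p, p)] = 0.0

prototypes, labels = ["a", "c"], [+1, -1]
print(nearest_prototype_label("b", prototypes, labels, dist))  # +1: b is nearer to a
print(nearest_prototype_label("d", prototypes, labels, dist))  # -1: d is nearer to c
```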
| Field | Value |
|---|---|
| Item Type | Article |
| Copyright holders | © 2018 Elsevier B.V. |
| Departments | LSE > Academic Departments > Mathematics |
| DOI | 10.1016/j.tcs.2018.04.045 |
| Date Deposited | 27 Apr 2018 |
| Acceptance Date | 24 Apr 2018 |
| URI | https://researchonline.lse.ac.uk/id/eprint/87680 |
Explore Further
- http://www.lse.ac.uk/Mathematics/people/Martin-Anthony.aspx (Author)
- https://www.scopus.com/pages/publications/85046809141 (Scopus publication)
- https://www.sciencedirect.com/journal/theoretical-... (Official URL)