Improving the Accuracy of Nearest-Neighbor Classification Using Principled Construction and Stochastic Sampling of Training-Set Centroids
A conceptually simple way to classify images is to directly compare test-set data and training-set data. The accuracy of this approach is limited by the method of comparison used, and by the extent to which the training-set data cover configuration space. Here we show that this coverage can be substantially increased using coarse-graining (replacing groups of images by their centroids) and stochastic sampling (using distinct sets of centroids in combination). We use the MNIST and Fashion-MNIST data sets to show that a principled coarse-graining algorithm can convert training images into fewer image centroids without loss of accuracy of classification of test-set images by nearest-neighbor classification. Distinct batches of centroids can be used in combination as a means of stochastically sampling configuration space, and can classify test-set data more accurately than can the unaltered training set. On the MNIST and Fashion-MNIST data sets this approach converts nearest-neighbor classification from a mid-ranking to an upper-ranking member of the set of classical machine-learning techniques.
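The abstract does not spell out the coarse-graining algorithm or the rule for combining centroid batches, so the sketch below only illustrates the general idea under stated assumptions: per-class k-means (scikit-learn's MiniBatchKMeans) stands in for the principled coarse-graining, distinct random seeds produce the distinct centroid batches, and a majority vote over each batch's 1-nearest-centroid predictions stands in for the combination step. The parameters `n_centroids_per_class` and `n_batches` are illustrative, not values from the paper.

```python
# Minimal sketch of the abstract's idea, not the paper's exact algorithm:
# per-class MiniBatchKMeans stands in for the "principled coarse-graining",
# and majority voting over distinct centroid batches stands in for the
# unspecified combination rule. Parameter values are illustrative only.
import numpy as np
from sklearn.cluster import MiniBatchKMeans
from sklearn.datasets import fetch_openml
from sklearn.neighbors import KNeighborsClassifier


def make_centroid_batch(X, y, n_centroids_per_class=50, seed=0):
    """Coarse-grain each class into a set of centroids (stand-in method)."""
    centroids, labels = [], []
    for label in np.unique(y):
        Xc = X[y == label]
        km = MiniBatchKMeans(n_clusters=n_centroids_per_class,
                             n_init=3, random_state=seed).fit(Xc)
        centroids.append(km.cluster_centers_)
        labels.append(np.full(n_centroids_per_class, label))
    return np.vstack(centroids), np.concatenate(labels)


# MNIST: 70,000 28x28 images flattened to 784 features, labels 0-9.
X, y = fetch_openml("mnist_784", version=1, return_X_y=True, as_frame=False)
X, y = X / 255.0, y.astype(int)
X_train, y_train, X_test, y_test = X[:60000], y[:60000], X[60000:], y[60000:]

# Build several distinct centroid batches (stochastic sampling of
# configuration space); each batch classifies the test set by
# 1-nearest-centroid and casts one vote per test image.
n_batches = 5
votes = np.array([
    KNeighborsClassifier(n_neighbors=1)
    .fit(*make_centroid_batch(X_train, y_train, seed=s))
    .predict(X_test)
    for s in range(n_batches)
])

# Majority vote across batches gives the ensemble prediction.
final = np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)
print("ensemble accuracy:", (final == y_test).mean())
```

Pooling the centroids of all batches into a single nearest-neighbor search is an equally plausible combination scheme under this reading of the abstract; the voting form above simply keeps each batch's prediction separate.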
- Sponsoring Organization:
- USDOE; USDOE Office of Science (SC), Basic Energy Sciences (BES)
- Grant/Contract Number:
- AC02-05CH11231
- OSTI ID:
- 1762549
- Alternate ID(s):
- OSTI ID: 1816061
- Journal Information:
- Entropy, Vol. 23, Issue 2; ISSN 1099-4300
- Publisher:
- MDPI AG
- Country of Publication:
- Switzerland
- Language:
- English