
Accelerated Approximate Nearest Neighbors Search Through Hierarchical Product Quantization

Ameer M.S. Abdelhadi, Christos-Savvas Bouganis, George A. Constantinides
Department of Electrical and Electronic Engineering, Imperial College London, London SW7 2AZ, United Kingdom
International Conference on Field-Programmable Technology (ICFPT ’19), 2019

@inproceedings{abdelhadi2019accelerated,
   title={Accelerated Approximate Nearest Neighbors Search Through Hierarchical Product Quantization},
   author={Abdelhadi, Ameer M. S. and Bouganis, Christos-Savvas and Constantinides, George A.},
   booktitle={International Conference on Field-Programmable Technology (ICFPT)},
   year={2019}
}

A fundamental recurring task in many machine learning applications is the search for the nearest neighbor in high-dimensional metric spaces. To answer queries in large-scale problems, state-of-the-art methods employ Approximate Nearest Neighbors (ANN) search, a search that returns the nearest neighbor with high probability, as well as techniques that compress the dataset. Product-Quantization (PQ) based ANN search methods have demonstrated state-of-the-art performance in several problems, including classification, regression, and information retrieval. The dataset is encoded into a Cartesian product of multiple low-dimensional codebooks, enabling faster search and higher compression. Being intrinsically parallel, PQ-based ANN search approaches are amenable to hardware acceleration. This paper proposes a novel Hierarchical PQ (HPQ) based ANN search method as well as an FPGA-tailored architecture for its implementation that outperforms current state-of-the-art systems. HPQ gradually refines the search space, reducing the number of data comparisons and enabling a pipelined search. The mapping of the architecture on a Stratix 10 FPGA device demonstrates speedups of over 250× compared to current state-of-the-art systems, opening the space for addressing larger datasets and/or improving the query times of current systems.
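To make the abstract's description of product quantization concrete, the following is a minimal sketch of plain (non-hierarchical) PQ with asymmetric distance computation, not the authors' HPQ method or FPGA architecture. All function names, the codebook sizes `M` and `K`, and the use of a simple NumPy k-means are illustrative assumptions; production systems typically use optimized k-means and larger codebooks.

```python
import numpy as np

def train_codebooks(data, M=4, K=16, iters=10, seed=0):
    """Train one K-centroid codebook per subspace with plain k-means.
    M = number of subspaces, K = centroids per codebook (toy sizes)."""
    rng = np.random.default_rng(seed)
    d = data.shape[1] // M  # subvector dimensionality
    codebooks = []
    for m in range(M):
        sub = data[:, m * d:(m + 1) * d]
        centroids = sub[rng.choice(len(sub), K, replace=False)]
        for _ in range(iters):
            # assign each subvector to its nearest centroid
            dists = ((sub[:, None, :] - centroids[None]) ** 2).sum(-1)
            assign = dists.argmin(1)
            for k in range(K):
                pts = sub[assign == k]
                if len(pts):
                    centroids[k] = pts.mean(0)
        codebooks.append(centroids)
    return codebooks

def encode(data, codebooks):
    """Compress each vector to M small codebook indices (here one byte each)."""
    M = len(codebooks)
    d = data.shape[1] // M
    codes = np.empty((len(data), M), dtype=np.uint8)
    for m, cb in enumerate(codebooks):
        sub = data[:, m * d:(m + 1) * d]
        codes[:, m] = ((sub[:, None, :] - cb[None]) ** 2).sum(-1).argmin(1)
    return codes

def search(query, codes, codebooks):
    """Asymmetric distance computation: one M-by-K table of partial
    distances, then each database point costs only M table lookups."""
    M = len(codebooks)
    d = len(query) // M
    table = np.stack([((query[m * d:(m + 1) * d] - cb) ** 2).sum(-1)
                      for m, cb in enumerate(codebooks)])
    approx = table[np.arange(M), codes].sum(1)  # approximate squared distances
    return int(approx.argmin())
```

The table-lookup structure of `search` is what makes PQ intrinsically parallel: the M partial distances per point are independent, which is the property the paper's hardware mapping exploits.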
