TR2015-070
Universal Embeddings For Kernel Machine Classification
- "Universal Embeddings for Kernel Machine Classification", International Conference on Sampling Theory and Applications (SampTA), DOI: 10.1109/SAMPTA.2015.7148902, May 2015, pp. 307-311.
@inproceedings{Boufounos2015may,
  author    = {Boufounos, P.T. and Mansour, H.},
  title     = {Universal Embeddings for Kernel Machine Classification},
  booktitle = {International Conference on Sampling Theory and Applications (SampTA)},
  year      = {2015},
  pages     = {307--311},
  month     = may,
  publisher = {IEEE},
  doi       = {10.1109/SAMPTA.2015.7148902},
  url       = {https://www.merl.com/publications/TR2015-070}
}
Abstract:
Visual inference over a transmission channel is increasingly becoming an important problem in a variety of applications. In such applications, low latency and bit-rate consumption are often critical performance metrics, making data compression necessary. In this paper, we examine feature compression for support vector machine (SVM)-based inference using quantized randomized embeddings. We demonstrate that embedding the features is equivalent to using the SVM kernel trick with a mapping to a lower-dimensional space. Furthermore, we show that universal embeddings, a recently proposed quantized embedding design, approximate a radial basis function (RBF) kernel, commonly used for kernel-based inference. Our experimental results demonstrate that quantized embeddings achieve a 50% rate reduction while maintaining the same inference performance. Moreover, universal embeddings achieve a further reduction in bit-rate over conventional quantized embedding methods, validating the theoretical predictions.
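The core construction behind universal embeddings can be sketched as a dithered random projection followed by a 1-bit periodic quantizer: nearby points map to bit strings with a small Hamming distance, and the complement of the normalized Hamming distance behaves like a (shift-invariant) RBF-style kernel. The following is a minimal illustrative sketch, not the paper's exact implementation; the quantizer period `delta`, the embedding dimension `m`, and all numeric values are assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def universal_embedding(X, A, w, delta):
    # 1-bit periodic ("universal") quantizer on dithered random projections:
    # bit_i = floor((a_i^T x + w_i) / delta) mod 2
    return (np.floor((X @ A.T + w) / delta).astype(int) % 2)

d, m, delta = 32, 2048, 1.0           # ambient dim, embedding dim, quantizer period (illustrative)
A = rng.standard_normal((m, d))       # random Gaussian measurement matrix
w = rng.uniform(0.0, delta, size=m)   # uniform dither

x = rng.standard_normal(d)
y = x + 0.02 * rng.standard_normal(d)  # a nearby point
z = rng.standard_normal(d)             # an unrelated, distant point

fx = universal_embedding(x[None, :], A, w, delta)[0]
fy = universal_embedding(y[None, :], A, w, delta)[0]
fz = universal_embedding(z[None, :], A, w, delta)[0]

# The normalized Hamming distance concentrates around a saturating function
# of ||x - y||: small for nearby points, near 1/2 for distant ones.
near = np.mean(fx != fy)
far = np.mean(fx != fz)
print(near, far)
```

For nearby points the Hamming distance stays well below 1/2, while for distant points it saturates near 1/2, which is what gives the embedding its distance-preserving (and hence kernel-approximating) behavior within a bounded radius.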