TR2020-093
Representation Learning via Adversarially-Contrastive Optimal Transport
- "Representation Learning via Adversarially-Contrastive Optimal Transport", International Conference on Machine Learning (ICML), Daumé, H. and Singh, A., Eds., July 2020, pp. 10675-10685.BibTeX TR2020-093 PDF Software
@inproceedings{Cherian2020jul,
  author = {Cherian, Anoop and Aeron, Shuchin},
  title = {Representation Learning via Adversarially-Contrastive Optimal Transport},
  booktitle = {International Conference on Machine Learning (ICML)},
  year = 2020,
  editor = {Daumé, H. and Singh, A.},
  pages = {10675--10685},
  month = jul,
  url = {https://www.merl.com/publications/TR2020-093}
}
MERL Contact: Anoop Cherian
Research Areas:
Abstract:
In this paper, we study the problem of learning compact (low-dimensional) representations for sequential data that capture its implicit spatiotemporal cues. To maximize extraction of such informative cues from the data, we set the problem within the context of contrastive representation learning and to that end propose a novel objective via optimal transport. Specifically, our formulation seeks a low-dimensional subspace representation of the data that jointly (i) maximizes the distance of the data (embedded in this subspace) from an adversarial data distribution under the optimal transport, a.k.a. the Wasserstein distance, (ii) captures the temporal order, and (iii) minimizes the data distortion. To generate the adversarial distribution, we propose a novel framework connecting Wasserstein GANs with a classifier, allowing a principled mechanism for producing good negative distributions for contrastive learning, which is currently a challenging problem. Our full objective is cast as a subspace learning problem on the Grassmann manifold and solved via Riemannian optimization. To empirically study our formulation, we provide experiments on the task of human action recognition in video sequences. Our results demonstrate competitive performance against challenging baselines.
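To make the ingredients in the abstract concrete, the following is a minimal, NumPy-only sketch of this style of objective. It is not the authors' implementation: the Sinkhorn routine stands in for the optimal-transport (Wasserstein) term, the temporal-order and distortion terms are simplified surrogates, the adversarial negatives are random placeholders rather than outputs of the WGAN-plus-classifier framework described above, and every name and hyperparameter (sinkhorn_cost, objective, grassmann_step, reg, alpha, beta, lr) is an illustrative assumption.

import numpy as np

def sinkhorn_cost(X, Y, reg=0.1, n_iters=200):
    # Entropy-regularized optimal-transport (Sinkhorn) cost between two point clouds.
    C = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)   # pairwise squared distances
    C = C / (C.max() + 1e-12)                            # rescale costs for numerical stability
    K = np.exp(-C / reg)
    a = np.full(len(X), 1.0 / len(X))
    b = np.full(len(Y), 1.0 / len(Y))
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]                      # approximate transport plan
    return (P * C).sum()

def objective(U, X, X_adv, alpha=1.0, beta=1.0):
    # Contrastive subspace objective (minimized) for a sequence X (T x d), adversarial
    # negatives X_adv (T x d), and a subspace basis U (d x k, orthonormal columns):
    #  - push the projected sequence away from the projected negatives (OT term),
    #  - crude temporal-order surrogate: consecutive projected frames should keep moving,
    #  - penalize reconstruction distortion of the original features.
    Z, Z_adv = X @ U, X_adv @ U
    ot_term = sinkhorn_cost(Z, Z_adv)
    order = np.maximum(0.0, 1.0 - np.linalg.norm(np.diff(Z, axis=0), axis=1)).sum()
    distortion = np.linalg.norm(X - Z @ U.T) ** 2
    return -ot_term + alpha * order + beta * distortion

def grassmann_step(U, grad, lr=1e-2):
    # One Riemannian gradient step: project the Euclidean gradient onto the tangent
    # space at U, take a step, and retract back to the manifold via QR.
    rgrad = grad - U @ (U.T @ grad)
    Q, _ = np.linalg.qr(U - lr * rgrad)
    return Q

def numerical_grad(f, U, eps=1e-5):
    # Finite-difference gradient, used only to keep this sketch dependency-free;
    # a real implementation would use automatic differentiation.
    G = np.zeros_like(U)
    for idx in np.ndindex(*U.shape):
        E = np.zeros_like(U)
        E[idx] = eps
        G[idx] = (f(U + E) - f(U - E)) / (2 * eps)
    return G

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    T, d, k = 20, 16, 3                                  # frames, feature dim, subspace dim
    X = rng.normal(size=(T, d))                          # per-frame features of one sequence
    X_adv = rng.normal(size=(T, d))                      # random stand-in for WGAN negatives
    U, _ = np.linalg.qr(rng.normal(size=(d, k)))         # random orthonormal starting basis
    for _ in range(30):
        U = grassmann_step(U, numerical_grad(lambda V: objective(V, X, X_adv), U))
    print("objective after optimization:", objective(U, X, X_adv))

The grassmann_step routine illustrates the Riemannian-optimization ingredient: the Euclidean gradient is projected onto the tangent space of the orthonormal-basis manifold and a QR retraction keeps the iterate on the manifold. A practical implementation would rely on automatic differentiation and a manifold-optimization library rather than the finite-difference gradient used here.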
Software & Data Downloads
Related News & Events
NEWS: MERL researchers presenting three papers at ICML 2020
Date: July 12, 2020 - July 18, 2020
Where: Vienna, Austria (virtual this year)
MERL Contacts: Anoop Cherian; Devesh K. Jha; Daniel N. Nikovski
Research Areas: Artificial Intelligence, Computer Vision, Data Analytics, Dynamical Systems, Machine Learning, Optimization, Robotics
Brief: MERL researchers are presenting three papers at the International Conference on Machine Learning (ICML 2020), which is being held virtually this year from July 12-18. ICML is one of the top-tier conferences in machine learning, with an acceptance rate of 22%. The MERL papers are:
1) "Finite-time convergence in Continuous-Time Optimization" by Orlando Romero and Mouhacine Benosman.
2) "Can Increasing Input Dimensionality Improve Deep Reinforcement Learning?" by Kei Ota, Tomoaki Oiki, Devesh Jha, Toshisada Mariyama, and Daniel Nikovski.
3) "Representation Learning Using Adversarially-Contrastive Optimal Transport" by Anoop Cherian and Shuchin Aeron.