TR2020-072
Collaborative Motion Prediction via Neural Motion Message Passing
- Hu, Y., Chen, S., Zhang, Y., Gu, X., "Collaborative Motion Prediction via Neural Motion Message Passing", IEEE Conference on Computer Vision and Pattern Recognition (CVPR), DOI: 10.1109/CVPR42600.2020.00635, June 2020, pp. 6318-6327.
@inproceedings{Hu2020jun,
  author = {Hu, Yue and Chen, Siheng and Zhang, Ya and Gu, Xiao},
  title = {Collaborative Motion Prediction via Neural Motion Message Passing},
  booktitle = {IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  year = 2020,
  pages = {6318--6327},
  month = jun,
  doi = {10.1109/CVPR42600.2020.00635},
  url = {https://www.merl.com/publications/TR2020-072}
}
Research Areas: Artificial Intelligence, Computer Vision, Machine Learning
Abstract:
Motion prediction is essential and challenging for autonomous vehicles and social robots. One challenge of motion prediction is to model the interaction among traffic actors, which may cooperate with each other to avoid collisions or form groups. To address this challenge, we propose neural motion message passing (NMMP) to explicitly model the interaction and learn representations for directed interactions between actors. Based on the proposed NMMP, we design motion prediction systems for two settings: the pedestrian setting and the joint pedestrian and vehicle setting. Both systems share a common pattern: an individual branch models the behavior of a single actor and an interactive branch models the interaction between actors, with different wrappers to handle the varied input formats and characteristics. The experimental results show that both systems outperform the previous state-of-the-art methods on several existing benchmarks. In addition, we provide interpretability for interaction learning. Code is available at https://github.com/PhyllisH/NMMP
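To make the abstract's two-branch pattern concrete, below is a minimal, hypothetical sketch of the idea: per-actor trajectory embeddings (individual branch) exchange messages through directed edge embeddings over all actor pairs (interactive branch), and the two are fused to decode future positions. All module names, dimensions, and the single round of message passing are illustrative assumptions, not the authors' implementation; refer to https://github.com/PhyllisH/NMMP for the official code.

```python
# Hypothetical sketch of the NMMP pattern described in the abstract (not the
# authors' implementation): an individual branch per actor plus an interactive
# branch built from directed actor-to-actor messages.
import torch
import torch.nn as nn


class NMMPSketch(nn.Module):
    def __init__(self, obs_len=8, pred_len=12, hidden=64):
        super().__init__()
        self.encoder = nn.Linear(obs_len * 2, hidden)       # individual branch: per-actor trajectory encoder
        self.node_to_edge = nn.Linear(2 * hidden, hidden)   # directed edge embedding for each (sender, receiver) pair
        self.edge_to_node = nn.Linear(hidden, hidden)       # aggregate incoming messages back to each actor
        self.decoder = nn.Linear(2 * hidden, pred_len * 2)  # fuse individual + interactive features, decode future xy

    def forward(self, obs_traj):
        # obs_traj: (num_actors, obs_len, 2) observed xy positions
        n = obs_traj.size(0)
        h = torch.relu(self.encoder(obs_traj.flatten(1)))    # (n, hidden) individual embeddings

        # Interactive branch: one round of message passing over all directed pairs i -> j, i != j.
        src = h.unsqueeze(1).expand(n, n, -1)                # sender embedding at [i, j]
        dst = h.unsqueeze(0).expand(n, n, -1)                # receiver embedding at [i, j]
        edges = torch.relu(self.node_to_edge(torch.cat([src, dst], dim=-1)))  # (n, n, hidden)
        mask = 1.0 - torch.eye(n, device=obs_traj.device).unsqueeze(-1)       # drop self-edges
        messages = (edges * mask).sum(dim=0) / max(n - 1, 1)  # mean incoming message per receiving actor

        interactive = torch.relu(self.edge_to_node(messages))               # (n, hidden)
        out = self.decoder(torch.cat([h, interactive], dim=-1))             # fuse the two branches
        return out.view(n, -1, 2)                                           # (n, pred_len, 2) predicted positions


# Example usage: predict 12 future steps for 3 actors from 8 observed steps.
pred = NMMPSketch()(torch.randn(3, 8, 2))
print(pred.shape)  # torch.Size([3, 12, 2])
```

In the paper's two settings, the wrappers around this shared core differ (e.g., handling pedestrian trajectories versus joint pedestrian and vehicle inputs), while the individual/interactive branch structure sketched above stays the same.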
Related News & Events
NEWS: MERL researchers presenting four papers and organizing two workshops at CVPR 2020 conference. Date: June 14, 2020 - June 19, 2020
MERL Contacts: Anoop Cherian; Michael J. Jones; Toshiaki Koike-Akino; Tim K. Marks; Kuan-Chuan Peng; Ye Wang
Research Areas: Artificial Intelligence, Computer Vision, Machine Learning
Brief: MERL researchers are presenting four papers (two oral papers and two posters) and organizing two workshops at the IEEE/CVF Computer Vision and Pattern Recognition (CVPR 2020) conference.
CVPR 2020 Orals with MERL authors:
1. "Dynamic Multiscale Graph Neural Networks for 3D Skeleton Based Human Motion Prediction," by Maosen Li, Siheng Chen, Yangheng Zhao, Ya Zhang, Yanfeng Wang, Qi Tian
2. "Collaborative Motion Prediction via Neural Motion Message Passing," by Yue Hu, Siheng Chen, Ya Zhang, Xiao Gu
CVPR 2020 Posters with MERL authors:
3. "LUVLi Face Alignment: Estimating Landmarks’ Location, Uncertainty, and Visibility Likelihood," by Abhinav Kumar, Tim K. Marks, Wenxuan Mou, Ye Wang, Michael Jones, Anoop Cherian, Toshiaki Koike-Akino, Xiaoming Liu, Chen Feng
4. "MotionNet: Joint Perception and Motion Prediction for Autonomous Driving Based on Bird’s Eye View Maps," by Pengxiang Wu, Siheng Chen, Dimitris N. Metaxas
CVPR 2020 Workshops co-organized by MERL researchers:
1. Fair, Data-Efficient and Trusted Computer Vision
2. Deep Declarative Networks.