Software & Data Downloads — CAZSL

Context-Aware Zero-Shot Learning (CAZSL): learning models that can generalize to different parameters or features of the interacting objects.

Learning accurate models of the physical world is required for many robotic manipulation tasks. However, during manipulation, robots are expected to interact with previously unseen workpieces, so predictive models that generalize across such objects are highly desirable. We provide code for context-aware zero-shot learning (CAZSL) models, an approach that uses a Siamese network architecture, embedding-space masking, and regularization based on context variables, allowing us to learn a model that can generalize to different parameters or features of the interacting objects. We evaluate the proposed learning algorithm on the recently released Omnipush dataset, which enables testing of meta-learning capabilities on low-dimensional data. The code also allows comparison of the proposed method with several baseline techniques. The method will be presented at IROS 2020.
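The core idea described above, a context vector masking an embedding so the same shared network (as in a Siamese setup) adapts to different object parameters, can be sketched as follows. This is a minimal illustrative NumPy sketch, not the released CAZSL implementation; the class name, dimensions, and random-weight initialization are all hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


class ContextMaskedEmbedder:
    """Toy context-aware embedder (hypothetical, for illustration only):
    a context vector produces a soft mask in (0, 1) that gates the
    embedding of the input, so the model's features can adapt to
    different object parameters. Weights are random placeholders,
    standing in for learned parameters."""

    def __init__(self, in_dim, ctx_dim, emb_dim):
        self.W_in = rng.standard_normal((in_dim, emb_dim)) * 0.1
        self.W_ctx = rng.standard_normal((ctx_dim, emb_dim)) * 0.1

    def __call__(self, x, ctx):
        emb = np.tanh(x @ self.W_in)      # base embedding of the input state
        mask = sigmoid(ctx @ self.W_ctx)  # context-dependent soft mask
        return emb * mask                 # masked (context-adapted) embedding


# Siamese-style usage: the SAME embedder (shared weights) processes the
# same input under two different contexts, yielding different embeddings.
embedder = ContextMaskedEmbedder(in_dim=4, ctx_dim=2, emb_dim=8)
x = rng.standard_normal(4)
z_a = embedder(x, np.array([1.0, 0.0]))  # context of object A
z_b = embedder(x, np.array([0.0, 1.0]))  # context of object B
print(z_a.shape)
```

In the paper's actual method, the mask and embedding networks are trained jointly with a regularizer on the context variables; the sketch only shows the masking mechanism itself.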