Hallucinating Humans for Learning Object Affordances

Robot Learning Lab, Computer Science Department, Cornell University.

We bear in mind that the object being worked on is going to be ridden in, sat upon, looked at, talked into, activated, operated, or in some other way used by people individually or en masse. –Dreyfuss (1955).

In fact, humans are the primary reason our environments exist in the first place. In this work, we show that modeling human-object relationships (i.e., object affordances) gives a more compact way of modeling contextual relationships than modeling object-object relationships: with N objects, pairwise object-object context grows quadratically, whereas human-object affordances require only one relationship per object, expressed relative to a human pose.

One key aspect of our work is that the humans may not even be seen by our algorithm! Given a large dataset containing only objects, our method treats human poses as a latent variable and uses infinite topic models to learn the relationships. We apply this to the robotic task of arranging a cluttered house.
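To make the idea concrete, the toy Python sketch below hallucinates candidate human positions in a room and scores object locations by a simple Gaussian human-object affordance over distance. This is only a simplified illustration of scoring placements against hallucinated humans, not the infinite latent CRF / topic model used in the papers; the room size, object categories, and affordance parameters are hypothetical.

```python
import numpy as np

# Toy sketch (not the paper's model): hallucinate human positions and score
# candidate object locations by a Gaussian affordance over human-object distance.

rng = np.random.default_rng(0)

ROOM = (4.0, 3.0)  # hypothetical room size in meters

# Hypothetical affordance parameters: preferred distance (m) from a human pose
# and a tolerance, one entry per object category.
AFFORDANCES = {
    "laptop":     {"mu": 0.4, "sigma": 0.2},  # used at arm's length
    "tv_remote":  {"mu": 0.5, "sigma": 0.3},
    "floor_lamp": {"mu": 1.5, "sigma": 0.5},  # kept farther away
}

def hallucinate_poses(n=200):
    """Sample candidate human positions uniformly over the room."""
    return rng.uniform([0.0, 0.0], ROOM, size=(n, 2))

def affordance_score(obj, obj_xy, poses):
    """Score an object location by its best match to any hallucinated pose."""
    p = AFFORDANCES[obj]
    dists = np.linalg.norm(poses - obj_xy, axis=1)
    scores = np.exp(-((dists - p["mu"]) ** 2) / (2 * p["sigma"] ** 2))
    return scores.max()

def best_placement(obj, candidates, poses):
    """Pick the candidate location with the highest affordance score."""
    scores = [affordance_score(obj, np.asarray(c), poses) for c in candidates]
    return candidates[int(np.argmax(scores))], max(scores)

if __name__ == "__main__":
    poses = hallucinate_poses()
    candidates = [(0.5, 0.5), (2.0, 1.5), (3.8, 2.9)]
    for obj in AFFORDANCES:
        loc, s = best_placement(obj, candidates, poses)
        print(f"{obj:10s} -> place at {loc}, score {s:.2f}")
```

In the actual work, the hallucinated poses are latent variables whose affordance parameters are themselves learned from object-only data rather than hand-specified as above.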

Publications

  1. Infinite Latent Conditional Random Fields for Modeling Environments through Humans, Yun Jiang, Ashutosh Saxena. In Robotics: Science and Systems (RSS), 2013. [PDF] [Supplementary material]
  2. Hallucinated Humans as the Hidden Context for Labeling 3D Scenes, Yun Jiang, Hema S. Koppula, Ashutosh Saxena. In Computer Vision and Pattern Recognition (CVPR), 2013 (oral). [PDF]
  3. Learning Object Arrangements in 3D Scenes using Human Context, Yun Jiang, Marcus Lim, Ashutosh Saxena. In International Conference on Machine Learning (ICML), June 2012. [PDF]
  4. Hallucinating Humans for Learning Robotic Placement of Objects, Yun Jiang, Ashutosh Saxena. In International Symposium on Experimental Robotics (ISER), May 2012. [PDF]

Data and Code

Download data and code and find more details here.

Related Project

Placing and Arranging Objects (robot manipulation).

People

Yun Jiang yunjiang at cs.cornell.edu

Marcus Lim

Ashutosh Saxena asaxena at cs.cornell.edu

Personal Robotics team

Sponsors

Microsoft Faculty Fellowship, 2012.

