Intelligent autonomous systems need detailed models of their environment to achieve sophisticated tasks. Vision sensors provide rich information and are broadly used to obtain these models; in particular, indoor scene understanding has been widely studied. A common initial step toward solving this problem is estimating the 3D layout of the scene. This work addresses the problem of propagating the scene layout along a video sequence. We use a Particle Filter framework to propagate the layout obtained with a state-of-the-art technique on the initial frame, and we propose how to generate, evaluate and sample new layout hypotheses at each frame. Our intuition is that propagation yields a better layout estimate at each frame than running the estimation independently on each image. The experimental validation shows promising results for the presented approach.
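The generate / evaluate / sample loop described above follows the standard particle-filter recipe. The sketch below is only an illustration of that generic scheme, not the paper's implementation: the layout state is reduced to a single scalar, and `propagate`, `evaluate`, and the observation values are hypothetical stand-ins for the paper's layout-hypothesis generation and image-based scoring.

```python
import random

def propagate(particle, noise=0.05):
    """Perturb a layout hypothesis to model frame-to-frame change (toy motion model)."""
    return particle + random.gauss(0.0, noise)

def evaluate(particle, observation):
    """Score a hypothesis against the current frame (toy likelihood)."""
    return 1.0 / (1.0 + (particle - observation) ** 2)

def resample(particles, weights):
    """Draw a new particle set with probability proportional to the weights."""
    return random.choices(particles, weights=weights, k=len(particles))

def particle_filter(initial_layout, observations, n_particles=100):
    # Initialise all particles from the layout estimated on the first frame.
    particles = [initial_layout] * n_particles
    estimates = []
    for obs in observations:
        particles = [propagate(p) for p in particles]      # generate hypotheses
        weights = [evaluate(p, obs) for p in particles]    # evaluate against frame
        particles = resample(particles, weights)           # sample survivors
        estimates.append(sum(particles) / len(particles))  # per-frame estimate
    return estimates
```

The point of propagating rather than re-estimating from scratch is visible in the structure: each frame's particle set starts from the previous frame's posterior, so the estimate stays temporally consistent.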
Publication Type: Conference Paper
Publication: International Conference on Image Analysis and Recognition (ICIAR), Springer (2014)