Inverse compositional estimation of 3D pose and lighting in dynamic scenes
Published in: IEEE Transactions on Pattern Analysis and Machine Intelligence (1979-), Vol. 30 (2008), No. 7, 12 July, pp. 1300-1307
Main author:
Other authors:
Format: Online article
Language: English
Published: 2008
Collection: IEEE Transactions on Pattern Analysis and Machine Intelligence
Subjects: Journal Article; Research Support, U.S. Gov't, Non-P.H.S.
Abstract: In this paper, we show how to estimate, accurately and efficiently, the 3D motion of a rigid object and time-varying lighting in a dynamic scene. This is achieved in an inverse compositional tracking framework with a novel warping function that involves a 2D → 3D → 2D transformation. This also allows us to extend traditional two-frame inverse compositional tracking to a sequence of frames, leading to even higher computational savings. We prove the theoretical convergence of this method and show that it leads to a significant reduction in computational burden. Experimental analysis on multiple video sequences shows an impressive speed-up over existing methods while retaining a high level of accuracy.
Description: Date completed 10.07.2008; date revised 13.06.2008; published in print; citation status: MEDLINE
ISSN: 1939-3539
DOI: 10.1109/TPAMI.2008.81
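The abstract builds on the inverse compositional alignment framework. As a point of reference only, the following is a minimal Python (NumPy/SciPy) sketch of classic inverse compositional alignment for a pure 2D translation warp: it is an illustrative toy under that assumption, not the paper's 2D → 3D → 2D warping function or its multi-frame extension, and the function name and parameters are hypothetical.

```python
# Minimal sketch of inverse compositional (IC) alignment for a 2D translation
# warp. Illustrates the general IC idea referenced in the abstract (precompute
# template gradients and the Gauss-Newton Hessian once, then compose inverse
# incremental warps); it is NOT the paper's 2D -> 3D -> 2D warp.
import numpy as np
from scipy.ndimage import map_coordinates

def ic_align_translation(template, image, p0=(0.0, 0.0), iters=50, tol=1e-4):
    """Estimate a translation p so that image(x + p) ~= template(x)."""
    p = np.asarray(p0, dtype=float)
    h, w = template.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)

    # Precomputation, done once per template (the key saving of the IC form):
    gy, gx = np.gradient(template)                      # template gradients
    J = np.stack([gx.ravel(), gy.ravel()], axis=1)      # steepest-descent images (N x 2)
    H_inv = np.linalg.inv(J.T @ J)                      # inverse Gauss-Newton Hessian

    for _ in range(iters):
        # Warp the current image back onto the template grid with parameters p.
        warped = map_coordinates(image, [ys + p[1], xs + p[0]], order=1)
        err = (warped - template).ravel()

        # Solve for the incremental warp of the template.
        dp = H_inv @ (J.T @ err)

        # Inverse compositional update: for a translation warp, composing with
        # the inverted increment reduces to a subtraction.
        p = p - dp
        if np.linalg.norm(dp) < tol:
            break
    return p
```

Because the gradients, Jacobian, and Hessian depend only on the template, they are computed once rather than at every iteration; extending this composition across a sequence of frames, as the abstract describes, amortizes that cost further.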