Motion analysis of articulated objects from monocular images


Detailed Description

Bibliographic Details
Published in: IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 28 (2006), No. 4, 11 Apr., pp. 625-636
First author: Zhang, Xiaoyun (Author)
Other authors: Liu, Yuncai; Huang, Thomas S.
Format: Article
Language: English
Published: 2006
Parent work: IEEE Transactions on Pattern Analysis and Machine Intelligence
Subjects: Evaluation Study; Journal Article
Description
Abstract: This paper presents a new method of motion analysis of articulated objects from feature point correspondences over monocular perspective images without imposing any constraints on motion. An articulated object is modeled as a kinematic chain consisting of joints and links, and its 3D joint positions are estimated within a scale factor using the connection relationship of two links over two or three images. Then, twists and exponential maps are employed to represent the motion of each link, including the general motion of the base link and the rotation of other links around their joints. Finally, constraints from image point correspondences, which are similar to those of the essential matrix in rigid motion, are developed to estimate the motion. In the algorithm, a characteristic of articulated motion, i.e., motion correlation among links, is applied to decrease the complexity of the problem and improve robustness. A point pattern matching algorithm for articulated objects is also discussed in this paper. Simulations and experiments on real images show the correctness and efficiency of the algorithms.
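The twist/exponential-map representation mentioned in the abstract can be illustrated with the standard closed-form SE(3) exponential from screw theory (the Rodrigues-style formula). This is a generic sketch of that representation, not the authors' estimation algorithm; all function names here are illustrative.

```python
import numpy as np

def hat(w):
    """so(3) hat operator: the skew-symmetric matrix of a 3-vector."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def twist_exp(xi, theta):
    """Exponential map of a twist xi = (v, w) scaled by theta.

    Returns a 4x4 rigid transform in SE(3). Assumes ||w|| == 1 for a
    rotational twist; w == 0 denotes a pure translation.
    """
    v, w = xi[:3], xi[3:]
    T = np.eye(4)
    if np.allclose(w, 0.0):            # prismatic (pure translation) case
        T[:3, 3] = v * theta
        return T
    W = hat(w)
    # Rodrigues formula for the rotation part
    R = np.eye(3) + np.sin(theta) * W + (1.0 - np.cos(theta)) * (W @ W)
    # Closed-form translation part of the screw motion
    p = (np.eye(3) - R) @ (W @ v) + np.outer(w, w) @ v * theta
    T[:3, :3] = R
    T[:3, 3] = p
    return T
```

In this representation, the base link carries a general twist, while each subsequent link in the kinematic chain contributes a rotation about its joint axis, so the pose of a distal link is a product of such exponentials.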
Description: Date Completed 18.04.2006
Date Revised 10.12.2019
Published: Print
Citation Status: MEDLINE
ISSN: 0162-8828