Motion recovery for a class of movements under perspective observation

Xinkai Chen, Hiroyuki Kano

Research output: Contribution to conference › Paper › peer-review

Abstract

This paper considers the recovery of motion for a class of movements in space using the perspective observation of a single point. The motion equation covers a wide class of practical movements in space. The proposed algorithm simultaneously estimates the position and the motion parameters, all of which are time-varying. The formulated problem is converted into the observation of a dynamical system with nonlinearities, and the proposed observer is based on the second method of Lyapunov. First, the parameters relating to the rotation of the motion are identified, for which only one camera is needed. Then the position of the moving object is identified, for which stereo vision is necessary. In the third step, the parameters relating to the straight (translational) movement are identified. The assumptions about the perspective system are reasonable, and the convergence conditions are intuitive with clear physical interpretations. The proposed method requires little a priori knowledge about the system and can cope with a much more general class of perspective systems. Furthermore, the algorithm is modified to deal with the occlusion phenomenon.
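To make the setting concrete, the sketch below simulates the kind of system the abstract describes: a point moving under an affine motion equation dx/dt = A(t)x + b(t), where a skew-symmetric A collects the rotation parameters and b the straight-movement parameters, while the camera observes only the perspective projection (x1/x3, x2/x3). This is a minimal illustration of the observation model, not the paper's observer; all numeric values and function names are illustrative assumptions.

```python
import numpy as np

def motion_step(x, A, b, dt):
    """One Euler step of the affine motion equation dx/dt = A x + b."""
    return x + dt * (A @ x + b)

def perspective(x):
    """Perspective observation of one point: image coordinates (x1/x3, x2/x3)."""
    return np.array([x[0] / x[2], x[1] / x[2]])

# Skew-symmetric A models rotation about the optical (z) axis;
# b is a constant translational drift. Values are illustrative only.
omega = 0.5
A = np.array([[0.0, -omega, 0.0],
              [omega,  0.0, 0.0],
              [0.0,    0.0, 0.0]])
b = np.array([0.0, 0.0, 0.1])

x = np.array([1.0, 0.5, 4.0])   # true 3-D position, x3 > 0 (in front of the camera)
dt = 0.01
trajectory = []                  # what a single camera actually measures
for _ in range(100):
    x = motion_step(x, A, b, dt)
    trajectory.append(perspective(x))

print(trajectory[-1])
```

An estimator in the spirit of the paper would be driven only by the `trajectory` samples; recovering the time-varying A, b, and the depth x3 from them is exactly the observation problem the abstract formulates.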

Original language: English
Pages: 328-333
Number of pages: 6
Publication status: Published - 2004 Dec 1
Event: Proceedings of the 2004 IEEE International Symposium on Intelligent Control - 2004 ISIC - Taipei, Taiwan, Province of China
Duration: 2004 Sept 2 - 2004 Sept 4

Conference

Conference: Proceedings of the 2004 IEEE International Symposium on Intelligent Control - 2004 ISIC
Country/Territory: Taiwan, Province of China
City: Taipei
Period: 04/9/2 - 04/9/4

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Modelling and Simulation
  • Computer Science Applications
  • Electrical and Electronic Engineering

