Binocular fixation on wide-angle foveated vision system contour-based feature generation from space-variant image using DFT

Sota Shimizu, Hao Jiang, Shinsuke Shimojo, Joel Burdick

Research output: Contribution to conference › Paper › Peer-review

Abstract

This paper introduces a novel interactive vision system suitable for cooperative work between humans and computers. The system acquires human-like binocular wide-angle foveated (WAF) information from a stereo camera head with special wide-angle optical lenses and provides the processed video signals both to computers and to the user's sight simultaneously; the user observes this unique information on a 3D head-mounted display (HMD). The developed vision system is well suited to human brain and vision research such as psychophysics, because it contains two kinds of loops, biological and computational. In the system implementation, binocular fixation provides well-fused 3D images to the user. This paper proposes to carry out this fixation based on features extracted from contour images, and examines scale, rotation and translation (SRT) invariant features generated from WAF space-variant images using the Discrete Fourier Transform (DFT).
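
The sketch below (not the authors' implementation) illustrates the general idea behind DFT-based SRT-invariant features from a space-variant image: in a log-polar (space-variant) representation, rotation becomes a circular shift along the angle axis and scaling approximately a shift along the log-radius axis, so the DFT magnitude of that representation is largely insensitive to both; translation invariance is assumed to come from fixating the fovea on the object beforehand. All function names, grid sizes, and parameters here are illustrative assumptions.

```python
import numpy as np

def log_polar_resample(contour_img, n_rings=64, n_wedges=128):
    """Nearest-neighbour log-polar resampling around the image centre
    (a simple stand-in for a WAF space-variant image)."""
    h, w = contour_img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    max_r = min(cx, cy)
    # log-spaced radii and uniformly spaced angles
    radii = np.exp(np.linspace(0.0, np.log(max_r), n_rings))
    thetas = np.linspace(0.0, 2.0 * np.pi, n_wedges, endpoint=False)
    rr, tt = np.meshgrid(radii, thetas, indexing="ij")
    ys = np.clip(np.round(cy + rr * np.sin(tt)).astype(int), 0, h - 1)
    xs = np.clip(np.round(cx + rr * np.cos(tt)).astype(int), 0, w - 1)
    return contour_img[ys, xs]              # shape (n_rings, n_wedges)

def srt_invariant_feature(contour_img):
    """DFT-magnitude feature of the log-polar contour image."""
    lp = log_polar_resample(contour_img.astype(float))
    # The 2-D DFT magnitude ignores circular shifts along both axes,
    # i.e. rotation (angle axis) and, approximately, scale (log-radius axis).
    spectrum = np.abs(np.fft.fft2(lp))
    return spectrum / (spectrum.max() + 1e-12)   # normalised feature map
```

Under these assumptions, a rotated or rescaled contour of the fixated object yields a feature map close to that of the original, which is the property the paper exploits for matching during binocular fixation.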

Original language: English
Pages: 366-372
Number of pages: 7
Publication status: Published - 2005
Externally published: Yes
Event: Proceedings of the 2005 IEEE/ASME International Conference on Advanced Intelligent Mechatronics, AIM 2005 - Monterey, CA, United States
Duration: 2005 Jul 24 - 2005 Jul 28

Conference

Conference: Proceedings of the 2005 IEEE/ASME International Conference on Advanced Intelligent Mechatronics, AIM 2005
Country/Territory: United States
City: Monterey, CA
Period: 05/7/24 - 05/7/28

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Software
  • Computer Science Applications
  • Electrical and Electronic Engineering
