
A Probabilistic Criterion to Detect Rigid Point Matches Between Two Images and Estimate the Fundamental Matrix

Published: 05 May 2004

Abstract

The perspective projections of n physical points on two views (stereovision) are constrained as soon as n ≥ 8. However, to prove in practice the existence of a rigid motion between two images, more than 8 point matches are desirable in order to compensate for the limited accuracy of the matches. In this paper, we propose a computational definition of rigidity and a probabilistic criterion to rate the meaningfulness of a rigid set as a function of both the number of pairs of points (n) and the accuracy of the matches. This criterion yields an objective way to compare, say, precise matches of a few points and approximate matches of many points. It gives a yes/no answer to the question: “could this rigid point correspondence have occurred by chance?”, since it guarantees that the expected number of meaningful rigid sets found by chance in a random distribution of points is as small as desired. It also yields absolute accuracy requirements for rigidity detection in the case of non-matched points, and optimal values of n, depending on the expected accuracy of the matches and on the proportion of outliers. We use it to build an optimized random sampling algorithm that is able to detect a rigid motion and estimate the fundamental matrix when the set of point matches contains up to 90% outliers, outperforming the best currently known methods such as M-estimators, LMedS, classical RANSAC and Tensor Voting.
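
The criterion and search strategy described above can be made concrete with a short sketch. The code below is a minimal, assumed-form illustration of an a-contrario (number-of-false-alarms, NFA) variant of random sampling for fundamental matrix estimation: random 8-point samples are fitted with the normalized 8-point algorithm, and each candidate inlier set is rated by a meaningfulness score that depends jointly on the number of matches and on the worst epipolar residual among them, rather than on a fixed inlier threshold. The function names (eight_point, log_nfa, rigidity_search) and the background probability model inside log_nfa are illustrative assumptions, not the paper's exact formulas or implementation.

```python
# Minimal sketch (NOT the authors' implementation) of an a-contrario,
# RANSAC-style search for the fundamental matrix. The NFA-style score below
# is a simplified stand-in for the paper's probabilistic criterion.

import numpy as np
from math import lgamma, log10

def normalize(pts):
    """Hartley normalization: centroid at the origin, mean distance sqrt(2)."""
    c = pts.mean(axis=0)
    d = np.sqrt(((pts - c) ** 2).sum(axis=1)).mean()
    s = np.sqrt(2.0) / d
    T = np.array([[s, 0.0, -s * c[0]],
                  [0.0, s, -s * c[1]],
                  [0.0, 0.0, 1.0]])
    return (pts - c) * s, T

def eight_point(p1, p2):
    """Normalized 8-point estimate of F from >= 8 correspondences (Nx2 arrays)."""
    q1, T1 = normalize(p1)
    q2, T2 = normalize(p2)
    A = np.column_stack([q2[:, 0] * q1[:, 0], q2[:, 0] * q1[:, 1], q2[:, 0],
                         q2[:, 1] * q1[:, 0], q2[:, 1] * q1[:, 1], q2[:, 1],
                         q1[:, 0], q1[:, 1], np.ones(len(q1))])
    _, _, Vt = np.linalg.svd(A)
    F = Vt[-1].reshape(3, 3)
    U, S, Vt = np.linalg.svd(F)                      # enforce rank 2
    F = U @ np.diag([S[0], S[1], 0.0]) @ Vt
    return T2.T @ F @ T1                             # undo the normalization

def epipolar_distance(F, p1, p2):
    """Symmetric point-to-epipolar-line distance of each match, in pixels."""
    h1 = np.column_stack([p1, np.ones(len(p1))])
    h2 = np.column_stack([p2, np.ones(len(p2))])
    l2 = h1 @ F.T                                    # epipolar lines in image 2
    l1 = h2 @ F                                      # epipolar lines in image 1
    d2 = np.abs((h2 * l2).sum(1)) / np.hypot(l2[:, 0], l2[:, 1])
    d1 = np.abs((h1 * l1).sum(1)) / np.hypot(l1[:, 0], l1[:, 1])
    return np.maximum(d1, d2)

def log_nfa(n_total, k, eps, width, height):
    """log10 of an NFA-style score: C(n_total, k) times the probability that
    k independent random matches all fall within eps of their epipolar lines.
    The per-match probability is approximated by the relative area of a band
    of half-width eps across the image (an assumption, not the paper's formula)."""
    diag = np.hypot(width, height)
    p = min(1.0, 2.0 * eps * diag / (width * height))
    log_binom = (lgamma(n_total + 1) - lgamma(k + 1)
                 - lgamma(n_total - k + 1)) / np.log(10.0)
    return log_binom + k * log10(max(p, 1e-300))

def rigidity_search(p1, p2, width, height, iters=1000, seed=0):
    """Draw random 8-point samples and keep the F whose best inlier subset has
    the lowest log10(NFA); the match set is declared rigid if that value < 0."""
    rng = np.random.default_rng(seed)
    n = len(p1)
    best = (np.inf, None, 0)
    for _ in range(iters):
        idx = rng.choice(n, size=8, replace=False)
        F = eight_point(p1[idx], p2[idx])
        d = np.sort(epipolar_distance(F, p1, p2))
        for k in range(9, n + 1):                    # candidate inlier counts
            score = log_nfa(n, k, d[k - 1], width, height)
            if score < best[0]:
                best = (score, F, k)
    return best                                      # (log10 NFA, F, #inliers)
```

Under a score of this kind, a few very precise matches can beat many loose matches, which is exactly the comparison the criterion is meant to make objective; declaring the correspondence rigid only when log10(NFA) is negative bounds the expected number of rigid sets detected by chance in random data.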

References

[1] Boufama, B. and Mohr, R. 1995. Epipole and fundamental matrix estimation using virtual parallax. In Proc. Int. Conf. Vision, Boston, pp. 1030-1036.
[2] Desolneux, A., Moisan, L., and Morel, J.-M. 2000. Meaningful alignments. Int. J. of Computer Vision, 40(1):7-23.
[3] Faugeras, O., Lustman, F., and Toscani, G. 1987. Motion and structure from motion from point and line matches. In Proc. 1st Int. Conf. on Computer Vision, pp. 25-34.
[4] Faugeras, O. and Luong, Q.-T. 2001. The Geometry of Multiple Images. MIT Press.
[5] Fischler, M. and Bolles, R. 1981. Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography. Communications of the ACM, 24:381-385.
[6] Hartley, R.I. 1997. Kruppa's equations derived from the fundamental matrix. IEEE Trans. on Pattern Analysis and Machine Intelligence, 19(2):133-135.
[7] Hartley, R.I. 1997. Self-calibration of stationary cameras. Int. J. of Computer Vision, 22(1):5-24.
[8] Longuet-Higgins, H. 1981. A computer algorithm for reconstructing a scene from two projections. Nature, 293:133-135.
[9] Lowe, D. 1985. Perceptual Organization and Visual Recognition. Kluwer Academic Publishers.
[10] Luong, Q.-T. and Faugeras, O. 1996. The fundamental matrix: Theory, algorithms and stability analysis. Int. J. of Computer Vision, 17(1):43-76.
[11] McReynolds, D.P. and Lowe, D.G. 1996. Rigidity checking of 3D point correspondences under perspective projection. IEEE Trans. on Pattern Analysis and Machine Intelligence, 18(12):1174-1185.
[12] Oliensis, J. and Genc, Y. 2001. Fast and accurate algorithms for projective multi-image structure from motion. IEEE Trans. on Pattern Analysis and Machine Intelligence, 23(6):546-559.
[13] Press, W.H., Teukolsky, S.A., Vetterling, W.T., and Flannery, B.P. 1988. Numerical Recipes in C. Cambridge University Press.
[14] Rissanen, J. 1983. A universal prior for integers and estimation by Minimum Description Length. Annals of Statistics, 11(2):416-431.
[15] Rissanen, J. 1989. Stochastic Complexity in Statistical Inquiry. World Scientific Press.
[16] Salvi, J., Armangué, X., and Pages, J. 2001. A survey addressing the fundamental matrix estimation problem. In IEEE Int. Conf. on Image Processing, Thessaloniki, Greece.
[17] Stewart, C.V. 1995. MINPRAN: A new robust estimator for computer vision. IEEE Trans. on Pattern Analysis and Machine Intelligence, 17:925-938.
[18] Tang, C.K., Medioni, G., and Lee, M.S. 2001. N-dimensional tensor voting and application to epipolar geometry estimation. IEEE Trans. on Pattern Analysis and Machine Intelligence, 23(8):829-844.
[19] Torr, P.H.S. and Murray, D.W. 1997. The development and comparison of robust methods for estimating the fundamental matrix. Int. J. of Computer Vision, 24(3):271-300.
[20] Torr, P.H.S., Zisserman, A., and Murray, D.W. 1995. Motion clustering using the trilinear constraint over three views. In Workshop on Geometrical Modeling and Invariants for Computer Vision, Xidian University Press.
[21] Zhang, Z., Deriche, R., Faugeras, O., and Luong, Q.-T. 2001. Estimating the fundamental matrix by transforming image points in projective space. Computer Vision and Image Understanding, 82:174-180.
[22] Zhang, Z. 1998. Determining the epipolar geometry and its uncertainty: A review. Int. J. of Computer Vision, 27(2):161-195.
[23] Zhang, Z., Deriche, R., Faugeras, O., and Luong, Q.-T. 1994. A robust technique for matching two uncalibrated images through the recovery of the unknown epipolar geometry. AI Journal, 78:87-119.

Published In

International Journal of Computer Vision, Volume 57, Issue 3
May-June 2004, 73 pages

Publisher

Kluwer Academic Publishers, United States

Author Tags

  1. fundamental matrix
  2. meaningful event
  3. point matches
  4. rigidity detection
  5. stereovision
  6. structure from motion
