(with Allan Jepson)
This work involves a new approach for tracking rigid and articulated objects using a view-based representation. The approach builds on and extends work on eigenspace representations, robust estimation techniques, and parameterized optical flow estimation. First, we note that the least-squares image reconstruction of standard eigenspace techniques has a number of problems, and we reformulate the reconstruction problem as one of robust estimation. Second, we define a "subspace constancy assumption" that allows us to exploit techniques for parameterized optical flow estimation to simultaneously solve for the view of an object and the affine transformation between the eigenspace and the image. To account for large affine transformations between the eigenspace and the image, we define an EigenPyramid representation and a coarse-to-fine matching strategy. Finally, we use these techniques to track objects over long image sequences in which the objects simultaneously undergo both affine image motions and changes of view. In particular, we use this "EigenTracking" technique to track and recognize the gestures of a moving hand.
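The robust reconstruction step above can be illustrated with a small sketch. This is not the authors' code: it replaces the least-squares projection onto an eigenspace with an iteratively reweighted fit under the Geman-McClure error norm (one common choice of robust norm), so that outlier pixels such as an occluder do not corrupt the recovered subspace coefficients. All names, array sizes, and the `sigma` value are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "training views": 20 flattened 16x16 images; keep a 5-D basis.
train = rng.normal(size=(256, 20))
U, _, _ = np.linalg.svd(train, full_matrices=False)
B = U[:, :5]                        # orthonormal eigenspace basis

# A novel view lying in the subspace, corrupted by a simulated occlusion.
c_true = rng.normal(size=5)
img = B @ c_true
img[:40] += 5.0                     # outlier pixels (the "occluder")

# Standard least-squares reconstruction is biased by the occlusion.
c_ls = B.T @ img

# IRLS with Geman-McClure weights w = sigma^2 / (sigma^2 + r^2)^2
# downweights large residuals and recovers the coefficients robustly.
sigma = 0.5
c = c_ls.copy()
for _ in range(50):
    r = img - B @ c                 # per-pixel residuals
    w = sigma**2 / (sigma**2 + r**2) ** 2
    BW = B * w[:, None]             # row-weighted basis
    c = np.linalg.solve(B.T @ BW, BW.T @ img)   # weighted normal equations

assert np.linalg.norm(c - c_true) < np.linalg.norm(c_ls - c_true)
```

In the paper this robust fit is further coupled, via the subspace constancy assumption, with an affine warp of the image region, solved coarse-to-fine over the EigenPyramid; the sketch above shows only the fixed-view coefficient estimation.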
Two 100-image sequences are provided: data/gesture1/crop-*.pgm contains the training data, while data/gesture2/*.pgm contains the sequence used for tracking.
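The frames above are stored in the standard PGM format. A minimal reader can be sketched as follows; `load_pgm` is our own illustrative helper (not part of any distributed code), and it assumes 8-bit binary (P5) files. The self-check at the end uses a tiny synthetic frame, since the real frames live under data/gesture1/ and data/gesture2/.

```python
import re
import tempfile
import numpy as np

def load_pgm(path):
    """Read a binary (P5) 8-bit PGM image into a NumPy array.
    Header comments (lines starting with '#') are skipped."""
    with open(path, "rb") as f:
        data = f.read()
    # Header: magic, width, height, maxval, whitespace-separated.
    header, pos = [], 0
    while len(header) < 4:
        m = re.match(rb"\s*(#[^\n]*\n|\S+)", data[pos:])
        pos += m.end()
        if not m.group(1).startswith(b"#"):
            header.append(m.group(1))
    width, height, maxval = (int(t) for t in header[1:])
    assert header[0] == b"P5" and maxval == 255
    pos += 1  # exactly one whitespace byte separates maxval from the raster
    img = np.frombuffer(data[pos:pos + width * height], dtype=np.uint8)
    return img.reshape(height, width)

# Self-check on a synthetic 3x2 frame.
with tempfile.NamedTemporaryFile(suffix=".pgm", delete=False) as f:
    f.write(b"P5\n# comment\n3 2\n255\n" + bytes(range(6)))
    name = f.name
frame = load_pgm(name)
assert frame.shape == (2, 3) and frame[1, 2] == 5
```

To load a whole sequence, one would apply this to a sorted glob of the frame paths, e.g. `sorted(glob.glob("data/gesture2/*.pgm"))`.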
Black, M. J. and Jepson, A., EigenTracking: Robust matching and tracking of articulated objects using a view-based representation, International Journal of Computer Vision, 26(1), pp. 63-84, 1998. Also: Xerox PARC Technical Report P95-000515, Feb. 1996.
Black, M. J. and Jepson, A., EigenTracking: Robust matching and tracking of articulated objects using a view-based representation, Proc. Fourth European Conf. on Computer Vision, ECCV'96, B. Buxton and R. Cipolla (Eds.), Springer Verlag, LNCS 1064, Cambridge, England, April 1996, pp. 329-342.