Fragment-based variational visual tracking
Abstract
We propose a Bayesian tracking algorithm based on adaptive fragmentation and variational approximation. Using the gradient cue, we fragment the target into disconnected rectangles, which reduces confusion with the background. To handle the uncertainties that arise in real tracking scenarios, we adopt a Bayesian framework with a variational implementation. The parameters of the variational inference are updated according to the observations and the weights of the voting candidates. Experimental results show that our tracker outperforms direct search and particle filtering. Furthermore, owing to its computational simplicity, the proposed method can be applied to real-time surveillance systems.
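To make the idea concrete, the following is a minimal sketch, not the authors' implementation: it assumes a grayscale frame, splits the target window into a grid and keeps the sub-rectangles with the strongest gradient energy (the gradient cue), scores candidate positions by fragment similarity, and moment-matches a Gaussian "variational" posterior over the target position from the weighted candidate votes. All function names, the grid size, and the similarity measure are illustrative assumptions.

```python
# Illustrative sketch only (assumed details, not the paper's method):
# gradient-based fragment selection plus a Gaussian posterior update
# driven by weighted candidate votes.
import numpy as np

def select_fragments(patch, grid=(4, 4), keep=6):
    """Split the target patch into grid cells and keep the `keep`
    rectangles with the highest gradient energy."""
    gy, gx = np.gradient(patch.astype(float))
    energy = np.hypot(gx, gy)
    h, w = patch.shape
    ch, cw = h // grid[0], w // grid[1]
    cells = []
    for i in range(grid[0]):
        for j in range(grid[1]):
            e = energy[i * ch:(i + 1) * ch, j * cw:(j + 1) * cw].sum()
            cells.append((e, (i * ch, j * cw, ch, cw)))  # (score, rect offsets)
    cells.sort(key=lambda c: -c[0])
    return [rect for _, rect in cells[:keep]]

def fragment_score(frame, topleft, fragments, template):
    """Score a candidate window (given by its top-left corner) as the negative
    mean absolute difference between its fragments and the template fragments."""
    ty, tx = topleft
    score = 0.0
    for (oy, ox, fh, fw) in fragments:
        cand = frame[ty + oy:ty + oy + fh, tx + ox:tx + ox + fw].astype(float)
        ref = template[oy:oy + fh, ox:ox + fw].astype(float)
        score -= np.abs(cand - ref).mean()
    return score

def variational_update(candidates, weights):
    """Moment-matched Gaussian over the target position from weighted votes:
    soft-max the candidate scores, then take the weighted mean and covariance."""
    w = np.asarray(weights, float)
    w = np.exp(w - w.max())
    w /= w.sum()
    c = np.asarray(candidates, float)
    mu = w @ c
    diff = c - mu
    cov = (w[:, None] * diff).T @ diff + 1e-3 * np.eye(2)
    return mu, cov
```

In use, one would sample candidate top-left positions around the current posterior mean at each frame, compute their fragment scores as vote weights, and call `variational_update` to obtain the new mean and covariance for the next frame; this per-fragment, closed-form update is what keeps the per-frame cost low.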