In this paper, a multitarget tracking system for collocated video and acoustic sensors is presented. We formulate the tracking problem using a particle filter based on a state-space approach. We first discuss the acoustic state-space formulation, whose observations use a sliding window of direction-of-arrival estimates. We then present the video state space that tracks a target's position on the image plane based on online adaptive appearance models. For the joint operation of the filter, we combine the state vectors of the individual modalities and also introduce a time-delay variable to handle the acoustic-video data synchronization issue caused by acoustic propagation delays. A novel particle filter proposal strategy for joint state-space tracking is introduced, which places the random support of the joint filter where the final posterior is likely to lie. By using the Kullback-Leibler divergence measure, it is shown that the joint operation of the filter decreases the worst-case divergence of the individual modalities. The resulting joint tracking filter is quite robust against video and acoustic occlusions due to our proposal strategy. Computer simulations are presented with synthetic and field data to demonstrate the filter's performance.
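To make the state-space particle filtering framework concrete, the following is a minimal sketch of a generic bootstrap particle filter for a scalar state, in the hedged spirit of the abstract. All model choices here (random-walk motion model, Gaussian likelihood, the `particle_filter` function name and its parameters) are illustrative assumptions; the paper's actual filter uses joint audio-video state vectors, adaptive appearance models, a time-delay variable, and a custom proposal strategy not shown here.

```python
# Illustrative bootstrap particle filter sketch -- NOT the paper's joint
# audio-video filter. Assumes a random-walk state model x_t = x_{t-1} + v_t
# and a Gaussian observation model y_t = x_t + w_t.
import numpy as np

rng = np.random.default_rng(0)

def particle_filter(observations, n_particles=500,
                    process_std=1.0, obs_std=0.5):
    """Return per-step posterior-mean estimates of a scalar state."""
    particles = rng.normal(0.0, 1.0, n_particles)  # samples from the prior
    estimates = []
    for y in observations:
        # Propagate particles through the (assumed) motion model.
        particles = particles + rng.normal(0.0, process_std, n_particles)
        # Weight each particle by the observation likelihood.
        weights = np.exp(-0.5 * ((y - particles) / obs_std) ** 2)
        weights /= weights.sum()
        estimates.append(float(np.dot(weights, particles)))
        # Resample so the random support concentrates where the
        # posterior mass lies (the idea behind proposal placement).
        particles = rng.choice(particles, n_particles, p=weights)
    return estimates

# Noisy observations of a target drifting from 0 toward 10.
truth = np.linspace(0.0, 10.0, 50)
obs = truth + rng.normal(0.0, 0.5, 50)
est = particle_filter(obs)
```

The resampling step is what keeps the particle cloud near the high-probability region of the posterior; the paper's contribution is a proposal strategy that does this jointly across the acoustic and video modalities.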