Target Tracking Using a Joint Acoustic Video System

In this paper, a multitarget tracking system for collocated video and acoustic sensors is presented. We formulate the tracking problem using a particle filter based on a state-space approach. We first discuss the acoustic state-space formulation, whose observations use a sliding window of direction-of-arrival estimates. We then present the video state space, which tracks a target's position on the image plane based on online adaptive appearance models. For the joint operation of the filter, we combine the state vectors of the individual modalities and also introduce a time-delay variable to handle the acoustic-video data synchronization issue caused by acoustic propagation delays. A novel particle filter proposal strategy for joint state-space tracking is introduced, which places the random support of the joint filter where the final posterior is likely to lie. Using the Kullback-Leibler divergence measure, it is shown that the joint operation of the filter decreases the worst-case divergence of the individual modalities. Owing to our proposal strategy, the resulting joint tracking filter is quite robust against video and acoustic occlusions. Computer simulations with synthetic and field data are presented to demonstrate the filter's performance.
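As a rough illustration of the joint state-space idea described above (a sketch, not the authors' implementation), the toy particle filter below combines a position/velocity state with a time-delay variable, so that the acoustic measurement is matched against the position one propagation delay ago while the video measurement is matched against the current position. All noise levels, priors, and observation models here are assumed purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
SPEED_OF_SOUND = 343.0  # m/s; assumed constant for the delay model

# Toy joint state per particle: [x position (m), x velocity (m/s), acoustic delay (s)]
N = 2000
particles = np.zeros((N, 3))
particles[:, 0] = rng.normal(0.0, 5.0, N)    # position prior
particles[:, 1] = rng.normal(0.0, 1.0, N)    # velocity prior
particles[:, 2] = rng.uniform(0.0, 0.3, N)   # delay prior
weights = np.full(N, 1.0 / N)

def predict(p, dt=0.1):
    """Constant-velocity motion; delay tracks range / c plus jitter."""
    p = p.copy()
    p[:, 0] += p[:, 1] * dt + rng.normal(0.0, 0.1, len(p))
    p[:, 1] += rng.normal(0.0, 0.05, len(p))
    p[:, 2] = np.abs(p[:, 0]) / SPEED_OF_SOUND + rng.normal(0.0, 0.005, len(p))
    return p

def update(p, w, z_video, z_acoustic, sigma_v=0.5, sigma_a=0.5):
    """Fuse both modalities: video observes the position now,
    acoustics observes the position one delay ago."""
    lik_video = np.exp(-0.5 * ((z_video - p[:, 0]) / sigma_v) ** 2)
    pred_acoustic = p[:, 0] - p[:, 1] * p[:, 2]  # back-propagate by the delay
    lik_acoustic = np.exp(-0.5 * ((z_acoustic - pred_acoustic) / sigma_a) ** 2)
    w = w * lik_video * lik_acoustic
    return w / w.sum()

def resample(p, w):
    """Multinomial resampling to refresh the particle support."""
    idx = rng.choice(len(p), size=len(p), p=w)
    return p[idx], np.full(len(p), 1.0 / len(p))

# Simulate a target moving at constant velocity and run the filter.
true_x, true_v = 2.0, 0.5
for _ in range(30):
    true_x += true_v * 0.1
    z_video = true_x + rng.normal(0.0, 0.5)
    z_acoustic = true_x + rng.normal(0.0, 0.5)
    particles = predict(particles)
    weights = update(particles, weights, z_video, z_acoustic)
    particles, weights = resample(particles, weights)

estimate = particles[:, 0].mean()
print(f"estimated position: {estimate:.2f} (true: {true_x:.2f})")
```

The paper's actual proposal strategy additionally steers the particle support toward where the joint posterior is likely to lie; the sketch above uses a plain bootstrap proposal for brevity.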