PROJECT TITLE:
Optimizing HTTP-Based Adaptive Streaming in Vehicular Environment Using Markov Decision Process
Hypertext Transfer Protocol (HTTP) is the fundamental mechanism supporting web browsing on the Internet. An HTTP server stores massive volumes of content and delivers specific pieces to clients when requested. There is a recent move to use HTTP for video streaming as well, which promises seamless integration of video delivery into existing HTTP-based server platforms. This is achieved by segmenting the video into many small chunks and storing these chunks as separate files on the server. For adaptive streaming, the server stores different quality versions of the same chunk in different files, allowing real-time quality adaptation of the video in response to the network bandwidth variation experienced by a client. Which quality version to download for each chunk of the video thus becomes a major decision-making challenge for the streaming client, especially in a vehicular environment with significant uncertainty in mobile bandwidth. In this paper, we demonstrate that for such decision making, the Markov Decision Process (MDP) is superior to previously proposed non-MDP solutions. Using publicly available video and bandwidth datasets, we show that the MDP achieves up to a 15x reduction in playback deadline misses compared to a well-known non-MDP solution when the MDP has prior knowledge of the bandwidth model. We also consider a model-free MDP implementation that uses Q-learning to gradually learn the optimal decisions by continuously observing the outcomes of its decision making. We find that the MDP with Q-learning significantly outperforms the MDP that uses bandwidth models.
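To make the model-free approach concrete, the following is a minimal, hypothetical sketch of Q-learning applied to per-chunk quality selection. The state space (discretized buffer occupancy), reward shaping (quality level minus a stall penalty), chunk sizes, and the toy random bandwidth trace are all illustrative assumptions, not the paper's actual setup or datasets.

```python
# Hypothetical sketch: model-free Q-learning for adaptive-bitrate chunk
# selection. All parameters below are illustrative assumptions.
import random

QUALITIES = [0, 1, 2]            # quality versions (0 = lowest bitrate)
CHUNK_BITS = [1.0, 2.0, 4.0]     # megabits per chunk at each quality
CHUNK_SEC = 2.0                  # seconds of playback per chunk
BUFFER_LEVELS = 11               # buffer discretized into 0..10 seconds

def discretize(buffer_sec):
    """Map buffer occupancy (seconds) to a discrete state index."""
    return max(0, min(BUFFER_LEVELS - 1, int(buffer_sec)))

def q_learning(episodes=2000, alpha=0.1, gamma=0.9, eps=0.1, seed=0):
    """Learn a buffer-level -> quality policy by trial and error."""
    rng = random.Random(seed)
    # Q-table: state (buffer level) x action (quality version)
    Q = [[0.0] * len(QUALITIES) for _ in range(BUFFER_LEVELS)]
    for _ in range(episodes):
        buffer_sec = 4.0                       # start with a half-full buffer
        state = discretize(buffer_sec)
        for _ in range(30):                    # chunks per episode
            # epsilon-greedy action selection
            if rng.random() < eps:
                action = rng.randrange(len(QUALITIES))
            else:
                action = max(range(len(QUALITIES)), key=lambda a: Q[state][a])
            bw = rng.uniform(0.5, 3.0)         # toy vehicular bandwidth (Mbps)
            dl = CHUNK_BITS[action] / bw       # download time for this chunk
            # buffer drains during the download, then gains one chunk
            stall = max(0.0, dl - buffer_sec)  # playback deadline miss (s)
            buffer_sec = max(0.0, buffer_sec - dl) + CHUNK_SEC
            # reward: prefer higher quality, heavily penalize stalls
            reward = action - 10.0 * stall
            nxt = discretize(buffer_sec)
            Q[state][action] += alpha * (
                reward + gamma * max(Q[nxt]) - Q[state][action])
            state = nxt
    return Q

Q = q_learning()
# Greedy policy per buffer level; with a fuller buffer the agent can
# afford to pick higher qualities.
policy = [max(range(len(QUALITIES)), key=lambda a: Q[s][a])
          for s in range(BUFFER_LEVELS)]
print(policy)
```

The key design point mirrored from the abstract is that no bandwidth model is assumed: the client only observes the outcome (download time and any stall) of each decision and updates its Q-table accordingly.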