In this paper, the performance of the ALOHA and CSMA MAC protocols is analyzed in spatially distributed wireless networks. The main system objective is correct reception of packets, so the analysis is performed in terms of outage probability. In our network model, packets belonging to specific transmitters arrive randomly in space and time according to a 3-D Poisson point process, and are then transmitted to their intended destinations using a fully-distributed MAC protocol. A packet transmission is considered successful if the received SINR is above a predefined threshold for the duration of the packet. Accurate bounds on the outage probabilities are derived as a function of the transmitter density, the number of backoffs and retransmissions, and, in the case of CSMA, also the sensing threshold. The analytical expressions are validated by simulation results. For continuous-time transmissions, CSMA with receiver sensing (which requires adding a feedback channel to the conventional CSMA protocol) is shown to yield the best performance. Moreover, the sensing threshold of CSMA is optimized. It is shown that introducing sensing is not beneficial at lower densities (i.e., in sparse networks), while at higher densities (i.e., in dense networks) an optimized sensing threshold provides significant gains.
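The success criterion above (received SINR exceeding a threshold, with interferers drawn from a spatial Poisson point process) can be illustrated with a small Monte Carlo sketch. This is not the paper's analysis or simulator; all parameter values (density, path-loss exponent, SINR threshold, noise power, link distance) are illustrative assumptions, and a simple slotted-ALOHA-style model with one interference snapshot per slot is assumed.

```python
import numpy as np

# Hedged sketch: estimate the outage probability of a "typical" link when
# interferers form a 2-D Poisson point process in each slot (slotted-ALOHA-
# style snapshot model). All parameter values are illustrative assumptions.

rng = np.random.default_rng(0)

lam = 0.1      # interferer density (nodes per unit area), assumed
alpha = 4.0    # path-loss exponent, assumed
theta = 3.0    # SINR threshold for successful reception, assumed
r = 1.0        # distance of the tagged transmitter-receiver link, assumed
noise = 1e-3   # receiver noise power, assumed
R = 30.0       # radius of the simulated disk (approximates the plane)
trials = 20000

def outage_probability():
    """Fraction of slots in which the tagged link's SINR falls below theta."""
    failures = 0
    for _ in range(trials):
        # Number of interferers in the disk is Poisson(lam * pi * R^2).
        n = rng.poisson(lam * np.pi * R**2)
        # Interferer distances to the tagged receiver, uniform over the disk
        # (sqrt transform gives the correct radial distribution).
        d = R * np.sqrt(rng.random(n))
        interference = np.sum(d ** (-alpha)) if n else 0.0
        sinr = r ** (-alpha) / (noise + interference)
        failures += sinr < theta
    return failures / trials

print(f"estimated outage probability: {outage_probability():.3f}")
```

Sweeping `lam` in such a sketch reproduces the qualitative trade-off the abstract discusses: outage grows with transmitter density, which is what motivates carrier sensing in dense networks.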