DIMACS Series in Discrete Mathematics
and Theoretical Computer Science
Stochastic Modeling of a Single TCP/IP Session over a
Random Loss Channel
Al-Hussein A. Abou-Zeid, Murat Azizoglu, and Sumit Roy
Abstract. In this paper, we present an analytical framework for modeling the
performance of a single TCP session in the presence of random packet loss.
This framework may be applicable to communication channels that cause random
packet loss modeled by appropriate statistics of the inter-loss duration.
It is shown that the analytical model predicts the throughput for LANs and
WANs with low and high bandwidth-delay products with reasonable accuracy, as
measured against the throughput obtained by simulation. Random loss is found
to severely affect the network throughput; higher-speed channels are found to
be more vulnerable to random loss than slower channels, especially for
moderate to high loss rates.
TCP/IP has been designed for reliable networks in which packet losses occur
primarily due to network congestion. An important aspect of TCP is its
window-based congestion avoidance mechanism [6]. In TCP/IP, when a node suc-
cessfully receives a packet, it sends an acknowledgment (ACK) back to the source.
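The window-based congestion avoidance behavior referenced above can be sketched as the familiar additive-increase/multiplicative-decrease (AIMD) window evolution: the congestion window grows on each ACK and shrinks when a loss is detected. The following is a minimal illustrative sketch, not the paper's analytical model; the function name, parameters, and event representation are assumptions chosen for the example.

```python
def evolve_cwnd(events, initial_cwnd=1.0, max_cwnd=64.0):
    """Return the congestion-window trajectory for a sequence of events,
    where each event is 'ack' (successful delivery) or 'loss'.

    Illustrative AIMD sketch, not the paper's model; parameter values
    are assumptions for the example.
    """
    cwnd = initial_cwnd
    trajectory = [cwnd]
    for event in events:
        if event == "ack":
            # Congestion avoidance: grow by about one packet per
            # round-trip time, i.e. 1/cwnd per ACK (additive increase).
            cwnd = min(cwnd + 1.0 / cwnd, max_cwnd)
        elif event == "loss":
            # Multiplicative decrease: halve the window on a loss,
            # never dropping below one packet.
            cwnd = max(cwnd / 2.0, 1.0)
        trajectory.append(cwnd)
    return trajectory

# Four successful round trips, one random loss, then recovery.
traj = evolve_cwnd(["ack"] * 4 + ["loss"] + ["ack"] * 2)
```

On a channel with random (non-congestion) losses, each loss still halves the window even though the network is not congested, which is the mechanism behind the throughput degradation studied in the paper.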