Reducing TCP Retransmissions By Using Machine Learning For More Accurate RTT Estimation
When computers communicate via a connection-oriented protocol such as TCP, the recipient sends an acknowledgment back to the sender upon receipt of each segment of information. This enables two important traits of connection-oriented protocols: knowing which segments have been lost in transfer, and measuring round-trip times (RTT). By observing previous RTT values, algorithms estimate what future retransmission time-out (RTO) values should be set to; i.e., if an acknowledgment is not received within the RTO window after a segment is sent, the segment is considered lost and resent. This paper proposes a neural network approach to predicting RTTs. When compared against Jacobson's algorithm, both an RNN-LSTM and a CNN-LSTM were shown to provide better RTT estimates. Replacing the predictor used by Jacobson's algorithm with a neural network predictor reduced the number of segment retransmissions by more than 90%.
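For context, the classical baseline the abstract refers to can be sketched as follows. This is a minimal illustration of Jacobson's algorithm as standardized in RFC 6298 (smoothed RTT plus a variance term), not the thesis's own code; the function name and the `min_rto` parameter are choices made here for the example.

```python
def make_jacobson_estimator(alpha=1/8, beta=1/4, k=4, min_rto=1.0):
    """Return an update function implementing Jacobson's algorithm
    (RFC 6298): feed it RTT samples, get back the next RTO."""
    state = {"srtt": None, "rttvar": None}

    def update(rtt_sample):
        if state["srtt"] is None:
            # First RTT measurement (RFC 6298, Section 2.2)
            state["srtt"] = rtt_sample
            state["rttvar"] = rtt_sample / 2
        else:
            # Subsequent measurements (RFC 6298, Section 2.3):
            # RTTVAR <- (1 - beta) * RTTVAR + beta * |SRTT - RTT|
            # SRTT   <- (1 - alpha) * SRTT + alpha * RTT
            state["rttvar"] = ((1 - beta) * state["rttvar"]
                               + beta * abs(state["srtt"] - rtt_sample))
            state["srtt"] = (1 - alpha) * state["srtt"] + alpha * rtt_sample
        # RTO = SRTT + K * RTTVAR, clamped to a floor (1 s in the RFC)
        return max(min_rto, state["srtt"] + k * state["rttvar"])

    return update
```

The thesis's contribution is to replace this exponentially weighted moving-average predictor with a learned one (RNN-LSTM or CNN-LSTM) while keeping the surrounding RTO logic intact.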
Neural networks, machine learning, Jacobson's algorithm, TCP, SCTP, congestion control
Dasgupta, B. (2019). <i>Reducing TCP retransmissions by using machine learning for more accurate RTT estimation</i> (Unpublished thesis). Texas State University, San Marcos, Texas.