Packet Loss Rate is measured as the number of packets lost per million packets sent (ppm), counting traffic from both clients and servers.
TCP packet loss is calculated as the sum of timer-based retransmitted packets, fast retransmitted packets, retransmitted SYN packets, and retransmitted FIN packets.
UDP packet loss is calculated as the difference between packets sent and packets received.
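The two loss counts and the ppm rate above can be sketched as small helper functions. This is a minimal illustration, not the tool's actual implementation; the function names and counter arguments are assumptions chosen to mirror the definitions in the text.

```python
def tcp_packets_lost(timer_retrans: int, fast_retrans: int,
                     syn_retrans: int, fin_retrans: int) -> int:
    """TCP loss inferred from the four retransmission counters."""
    return timer_retrans + fast_retrans + syn_retrans + fin_retrans


def udp_packets_lost(packets_sent: int, packets_received: int) -> int:
    """UDP loss is the gap between packets sent and packets received."""
    return packets_sent - packets_received


def loss_rate_ppm(packets_lost: int, packets_sent: int) -> float:
    """Packet loss rate in parts per million of packets sent."""
    if packets_sent == 0:
        return 0.0
    return packets_lost / packets_sent * 1_000_000
```

For example, 25 retransmissions out of one million packets sent gives `loss_rate_ppm(25, 1_000_000)` = 25.0 ppm.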
Packet loss occurs when one or more packets traveling across a system fail to reach their destination. For a firewall, packet loss is caused by system congestion. Packet loss indirectly reduces throughput and goodput, because TCP interprets loss as a sign of congestion and lowers its transmission rate to avoid congestive collapse. Packet loss also increases TCP round-trip latency because of the additional time needed for retransmission.
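The way TCP backs off in response to loss can be illustrated with classic additive-increase/multiplicative-decrease (AIMD), the textbook Reno congestion-avoidance rule. This is a simplified sketch of that general behavior, not a model of any specific firewall or TCP stack:

```python
def aimd_step(cwnd: float, loss_detected: bool, mss: float = 1.0) -> float:
    """One round-trip of AIMD congestion avoidance (TCP Reno, simplified).

    cwnd is the congestion window in segments. Each RTT without loss
    grows the window by one segment (additive increase); a detected
    loss halves it (multiplicative decrease), never below one segment.
    """
    if loss_detected:
        return max(cwnd / 2.0, mss)  # multiplicative decrease on loss
    return cwnd + mss                # additive increase per loss-free RTT
```

A window of 10 segments grows to 11 after a loss-free round trip but drops to 5 when a loss is detected, which is how even a small loss rate translates into reduced throughput.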