Getting Jittery
There are three major reasons for implementing QoS:
Latency is the amount of time it takes for a packet to get from one place to another. High latency may be caused by bandwidth saturation, a lack of resources (CPU or RAM) on a network device, distance, or the type of connection. You can reduce latency only so much; there is no way, for example, to send a message and receive a reply over a geostationary satellite link in less than about 500 ms.
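To put a number on latency, the short Python sketch below times a TCP handshake to a host. The target host and port are placeholders of my choosing, and a dedicated tool such as ping or mtr would be the usual way to measure this; the sketch only illustrates what "round-trip latency" means.

```python
import socket
import time


def tcp_rtt(host: str, port: int = 443, timeout: float = 2.0) -> float:
    """Estimate round-trip latency (in ms) by timing a TCP handshake."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # connect() returns once the SYN/SYN-ACK round trip completes
    return (time.perf_counter() - start) * 1000.0


if __name__ == "__main__":
    # "example.com" is a placeholder target, not one taken from the article.
    print(f"RTT: {tcp_rtt('example.com'):.1f} ms")
```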
Jitter refers to the variation in latency from packet to packet. Unless two nodes are on the same switch, latency will vary somewhat from packet to packet, and when network bandwidth is saturated, jitter climbs sharply. For applications such as file downloads or Web browsing, this is not usually a big deal; streaming video and VoIP, however, suffer greatly from high jitter. QoS can help even out jitter by giving streaming traffic a higher bandwidth priority. Another partial remedy is to increase the receiver's buffer (a jitter buffer), at the cost of some added delay.
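Because jitter is just a statistic over repeated latency measurements, it is easy to compute from a handful of samples. The Python sketch below, using made-up sample values, reports jitter as the standard deviation of round-trip times; real-time protocols such as RTP use a smoothed estimator instead (RFC 3550), but the underlying idea is the same.

```python
import statistics


def summarize_latency(rtt_ms: list[float]) -> dict[str, float]:
    """Summarize a series of round-trip measurements (milliseconds).

    Jitter is reported here simply as the standard deviation of the
    samples, i.e. how much latency varies around its mean.
    """
    return {
        "min": min(rtt_ms),
        "mean": statistics.fmean(rtt_ms),
        "max": max(rtt_ms),
        "jitter": statistics.pstdev(rtt_ms),
    }


if __name__ == "__main__":
    # Hypothetical samples: a link that is mostly steady but spikes under load.
    samples = [21.0, 22.5, 20.8, 95.3, 23.1, 21.7, 130.4, 22.2]
    for key, value in summarize_latency(samples).items():
        print(f"{key:>6}: {value:6.1f} ms")
```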
Random packet loss, which occurs when networks or devices are oversaturated, causes clipping in streaming media, reset and dropped connections, and other transaction difficulties. Worse, with reliable protocols such as TCP, dropped packets must be retransmitted, adding still more traffic to an already congested link. QoS methods can limit the amount of bandwidth a given protocol or connection uses, preventing or at least limiting that oversaturation.
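Bandwidth limiting of this kind is commonly built on a token bucket: traffic may burst briefly, but the sustained rate is capped. Real QoS tools (Linux's tc with its token bucket filter, for instance) do this in the kernel; the Python sketch below, with class and parameter names of my own invention, only illustrates the mechanism.

```python
import time


class TokenBucket:
    """Token-bucket shaper: a send is allowed only while tokens remain.

    rate_bps    - sustained rate in bytes per second
    burst_bytes - how much may be sent at once before throttling kicks in
    """

    def __init__(self, rate_bps: float, burst_bytes: float):
        self.rate = rate_bps
        self.capacity = burst_bytes
        self.tokens = burst_bytes
        self.last = time.monotonic()

    def consume(self, nbytes: int) -> bool:
        """Return True if nbytes may be sent now, False if it must wait."""
        now = time.monotonic()
        # Refill tokens for the time elapsed since the last check.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= nbytes:
            self.tokens -= nbytes
            return True
        return False


if __name__ == "__main__":
    # Cap a hypothetical sender at 125 kB/s (~1 Mbit/s) with an 8 kB burst.
    bucket = TokenBucket(rate_bps=125_000, burst_bytes=8_192)
    sent = deferred = 0
    for _ in range(1_000):
        if bucket.consume(1_500):   # one full-size Ethernet frame
            sent += 1
        else:
            deferred += 1
            time.sleep(0.01)        # back off instead of oversaturating the link
    print(f"sent {sent} frames, deferred {deferred} times")
```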