Tools such as NetIQ Corp.'s Vivinet Assessor can analyze enterprise environments and traffic patterns and assess the impact of voice traffic, using codecs such as G.711. In the future, such tools will evaluate the impact of video traffic.
Taking the QoS route is only one way enterprises can actively manage their IP traffic. Content delivery networks and multicasting can also reduce the bandwidth requirements for convergence. These technologies are critical for enterprises that reuse content both in-house and over the Internet to generate revenue, maintain content supply lines to a remote workforce, and reduce the costs in corporate communications.
As Web pages grow fat with embedded audio and video content, the network pipes that deliver that content are starved for bandwidth. Most enterprises still use T1 lines as connecting points for WANs and the Internet, and those points can become bottlenecks as customers and employees request content from a central location. For example, 10 branch employees watching a training video streamed at 150 Kbps from a central server generate 1.5 Mbps of traffic, nearly the full 1.544-Mbps capacity of a T1 line, and can saturate the link to the branch office. Likewise, high loads from customers can make your public Web site unreachable.
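The arithmetic behind that scenario is worth making explicit. A quick back-of-the-envelope check, using the same figures as above (the constants are illustrative, not from any vendor tool):

```python
# Unicast demand vs. T1 capacity for the branch-office example above.
T1_CAPACITY_KBPS = 1544      # T1 line rate: 1.544 Mbps
STREAM_RATE_KBPS = 150       # per-viewer video encoding rate
VIEWERS = 10                 # branch employees watching simultaneously

demand_kbps = STREAM_RATE_KBPS * VIEWERS       # each viewer gets a separate copy
utilization = demand_kbps / T1_CAPACITY_KBPS   # fraction of the link consumed

print(f"Demand: {demand_kbps} Kbps, utilization: {utilization:.0%}")
```

With unicast delivery, every additional viewer adds another full copy of the stream to the WAN link, which is exactly the cost that caching and multicasting avoid.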
By positioning content near the network's edge, public CDNs (content-delivery networks), such as those offered by Akamai Technologies and Speedera Networks, can reduce content-delivery latency through caching. Using cache devices or servers strategically located in ISP PoPs (points of presence), public CDNs mirror or preposition original content from the enterprise and make it available close to customers and end users. The result is more accessible content and faster delivery. In the enterprise, eCDNs (enterprise CDNs), such as Volera's Velocity CDN and its Excelerator server cache, provide similar functions.
Enterprise cache servers are positioned close to original content servers or near end users on remote WAN links to speed content to end users. The servers dynamically pull content from origin servers based on end-user requests and maintain that content in cache for later use. Cache servers (also called proxy caches, since they act on the origin servers' behalf) also store prepositioned content copied or mirrored from central servers. Prepositioning reduces the amount of information that traverses costly links, improves application response time and reduces overall network latency.
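The two behaviors described above, pulling content on demand and prepositioning it ahead of demand, can be sketched in a few lines. This is a minimal illustration, not any vendor's implementation; the `fetch_from_origin` callable and method names are hypothetical:

```python
# Sketch of a pull-through proxy cache. On a miss, content is pulled from
# the origin server over the WAN once and kept for later requests;
# prepositioned content is loaded into the same store before anyone asks.
class ProxyCache:
    def __init__(self, fetch_from_origin):
        self._fetch = fetch_from_origin   # hypothetical WAN fetch callable
        self._store = {}                  # local copies, keyed by URL

    def preposition(self, url, content):
        """Mirror content from the central server ahead of demand."""
        self._store[url] = content

    def get(self, url):
        """Serve from cache; on a miss, pull from the origin and keep a copy."""
        if url not in self._store:
            self._store[url] = self._fetch(url)   # one WAN round trip
        return self._store[url]                   # later hits stay on the LAN
```

However many users request the same object, the costly WAN link carries it at most once, which is the source of the bandwidth and response-time savings the article describes.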
Proxy caches can also integrate with directory and authentication services to provide access control to cache servers and content. They support a variety of file types, including content from FTP servers and from streaming media servers such as Apple's QuickTime, Microsoft's Windows Media Technologies and RealNetworks'. Caches can serve streaming media on demand or live, and can limit bandwidth consumption over narrow pipes. For live events, caches support stream splitting, in which a single streaming file is received over a WAN link and split over the LAN to downstream users. This one-to-many delivery mechanism leverages multicasting in the enterprise.
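Stream splitting reduces to a simple fan-out: one copy of each chunk arrives over the WAN, and the cache replicates it to every subscribed LAN client. A minimal sketch, with illustrative names (`Splitter`, `subscribe`, `on_wan_chunk`) that stand in for whatever a real cache product exposes:

```python
# One-to-many stream splitting: a single WAN stream is fanned out
# to all subscribed LAN clients, chunk by chunk.
class Splitter:
    def __init__(self):
        self._subscribers = []   # downstream delivery callbacks

    def subscribe(self, deliver):
        """Register a LAN client's delivery callback for the live stream."""
        self._subscribers.append(deliver)

    def on_wan_chunk(self, chunk):
        """One chunk arrives over the WAN; copy it to every LAN client."""
        for deliver in self._subscribers:
            deliver(chunk)
```

For the branch-office example, 10 viewers of a live event would cost one 150-Kbps stream on the T1 link instead of ten, with the replication happening on the comparatively cheap LAN.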