John Linton
Exetel recently signed an agreement with a P2P caching provider and last week we received the large number of caching servers and switches needed to implement this service. Over the next 4 - 5 weeks we will implement this new service on a trial basis and, if the trial is successful, we will put the service into production in early 2008.
The decision to implement P2P caching was made almost a year after we implemented P2P 'filtering' to better balance the usage of our rapidly increasing ingress/egress and customer connectivity bandwidth. Our installation of the P2P 'filtering' technology was highly effective and met all of our objectives, moving approximately 400 mbps of peak bandwidth from the 8 pm to midnight period (the most heavily used period of each day) to the midnight to 8 am period (the most lightly used period of each day). This saved almost $100,000 a month in bandwidth expenditure and improved overall bandwidth usage efficiency by almost 15%.
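In outline, that 'filtering' approach amounts to time-of-day traffic shaping: cap P2P hard during the evening peak and open it up overnight. A minimal sketch of the idea — the window boundaries and rate figures here are hypothetical illustrations, not Exetel's actual configuration:

```python
from datetime import time

# Evening peak window: 8 pm to midnight (hypothetical, matching the
# periods described above; the caps below are invented for illustration).
PEAK_START = time(20, 0)

def p2p_rate_limit_kbps(now: time) -> int:
    """Return a per-user P2P cap (kbit/s) for the given time of day."""
    if now >= PEAK_START:
        # 8 pm - midnight: throttle P2P tightly so it completes overnight.
        return 64
    # Midnight - 8 pm: generous cap, bandwidth is cheap and plentiful.
    return 1024
```

The effect is that long-running P2P transfers started in the evening mostly complete in the lightly-used overnight window, which is the traffic shift the paragraph above describes.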
This was a really good financial result for Exetel (and without it I think we would have struggled to remain in business), but it was achieved at the cost of seriously annoying some customers who felt inconvenienced by their P2P downloads taking longer to complete. That in itself was not a total negative, in that the relatively small number of customers who moved away from Exetel included almost all of the heaviest 'free time' users. As was obviously going to happen (whether those users who left us realised it or not), EVERY ISP in Australia who could afford the cost of a P2P filtering solution would do what Exetel did - our openness in telling our customers in advance about what we were doing only accelerated this event (I must ask our provider of P2P filtering solutions for commission on the nine sales they made in Australia immediately following Exetel's successful implementation of their products).
However, the P2P filtering 'solution' was only ever regarded as a first step in a three year plan to deal with the very significant increases in peak time bandwidth usage that had been escalating for the past two years and showed every sign of continuing. One of the most notable indications has been the 'jump' in the utilisation of the peering point connectivity between Exetel and PIPE - this has reached 350 mbps at its peak, compared with less than 50 mbps two years ago. The most obvious reason for this rapid, and continuing, growth is that P2P provides download 'targets' that are increasingly hosted locally rather than internationally. The other reason is that Akamai established its Sydney download point in the PIPE data centre, and Akamai now distributes software for Microsoft and many other major software houses from a Sydney server rather than a US based server.
In the latest edition of New Scientist there is an article that, in part, points very strongly to the likelihood of companies such as Microsoft using P2P in the future to distribute their games programs and the updates to those programs:
http://technology.newscientist.com/article/mg19626256.700-p2p-growth-creates-battle-for-bandwidth.html
I also read an article last week in which the most popular of the P2P software developers, BitTorrent, was reported to be actively pushing its products to Microsoft and other major online distribution users.
While we are not super-bright in looking into the future, we saw (probably along with everyone else who has an interest in managing bandwidth resources) that this would almost certainly happen.
The Akamai success around the world demonstrates that companies like Microsoft and Adobe and many other producers of software are prepared to pay big money to dedicated distribution companies (just as, for instance, modem and other hardware companies outsource the delivery of their products to end users via fulfilment companies). However, Microsoft et al. didn't get to be big companies by not controlling costs and making savings where they could.
So... using some form of P2P, if the financial side of this methodology can be worked out (not very difficult), is a very logical way of distributing software titles and software updates, including the two big items - operating systems and games. If this eventuates, and that appears to be a highly likely scenario, then a very high percentage of legitimate HTTP traffic will become legitimate P2P traffic used by commercial organisations as well as by individuals.
I have no idea how to calculate what the percentage of file downloads via P2P will become once/if this happens but my guess would be well over 80%.
We decided that Exetel would need a more effective way of dealing with P2P's constantly increasing share of total traffic and, after almost a year of looking at the various options available to an ISP of our size, we selected a P2P caching solution that, if it is successful, will increase the speed of delivery of P2P files and reduce Exetel's cost of delivering them. It will also, if the BitTorrent press releases can actually be implemented to the complete satisfaction of companies such as Microsoft, Sony, Warner Bros etc, allow the signature detection of 'legitimate' P2P traffic. In other words, the P2P traffic will carry an ability to tell the equipment used for identifying P2P traffic that the owner of the copyright of the data has authorised its distribution via P2P processes.
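Conceptually, a P2P cache works like any content cache: pieces are keyed on a content hash (in BitTorrent's case, derived from the torrent's infohash), served locally on a hit, and fetched from the swarm on a miss. A toy sketch of that idea, with an authorised-hash whitelist standing in for the copyright-holder 'signature' described above - the class and method names are invented for illustration, and this is not a description of any vendor's actual product:

```python
import hashlib


class P2PCache:
    """Toy in-memory P2P cache keyed by content hash.

    Hypothetical illustration only: a real ISP cache stores pieces on
    disk, speaks the BitTorrent wire protocol, and sits transparently
    in the traffic path.
    """

    def __init__(self, authorised_hashes=None):
        self.store = {}  # content hash -> cached data
        # Hashes the rights holder has flagged as legitimate for P2P
        # distribution (the 'signature detection' idea above).
        self.authorised = set(authorised_hashes or [])

    @staticmethod
    def content_id(data: bytes) -> str:
        """Derive a content key; SHA-1 mirrors BitTorrent's piece hashing."""
        return hashlib.sha1(data).hexdigest()

    def is_authorised(self, content_hash: str) -> bool:
        return content_hash in self.authorised

    def fetch(self, content_hash: str, upstream_fetch):
        """Serve from cache if present (local delivery, no upstream
        bandwidth cost); otherwise fetch from the swarm and cache it."""
        if content_hash in self.store:
            return self.store[content_hash], "cache-hit"
        data = upstream_fetch(content_hash)
        self.store[content_hash] = data
        return data, "cache-miss"
```

The first request for a given hash is a miss that pays the upstream cost once; every subsequent request is served from the cache at LAN speed, which is exactly the "faster delivery at lower cost" outcome the trial is meant to measure.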
All that, of course, is in the future... Exetel's principal reason for starting this trial is to determine whether P2P caching will significantly speed the delivery of downloads to an end user and whether that increased speed can be delivered at a lower cost to Exetel.
Just one more complication to delivering what, in "the old days", was a pretty simple service.