Napster, the first easy-to-use P2P file sharing network, was all the rage in the late 1990s before the record industry cracked down and drove it into bankruptcy. While the morality of the Napster system is still a topic of big debate, its role in the proliferation of broadband cannot be denied. Consumers, tired of slow downloads, switched en masse to arguably faster cable and DSL connections. Thank you, Shawn Fanning, for helping the carriers come out of their financial doldrums. Napster, legal issues aside, was the first application that showed consumers what was possible with broadband.
Napster’s legacy will be that it taught AOLers how to consume digital media – that it was okay to download music (and eventually video) instead of going to record stores or renting movies at the local Blockbuster. The illegality of the service put an end to the business, but not to the habit of digital consumption. Since then quite a few variants have come to market – Kazaa, Audio Galaxy, Earth Station, BitTorrent – and they have only reinforced the message. I had a chance to catch up with Andrew Parker, chief technology officer of Cambridge, UK-based CacheLogic. The company studies traffic patterns on the Internet, and has often come up with interesting data.
Parker was in town promoting his report on the state of the P2P nation, along with a new monitoring service called Streamsight: an array of CacheLogic appliances spread worldwide that will collect information on the types of network traffic, which carriers everywhere can then use to get a better idea of what’s flowing on their pipes. Parker, a reserved Englishman on the best of days, was sluggish because of a pesky wisdom tooth that had been taking its time coming out of hiding. Despite the pain, we got into a spirited discussion, and came to a not-so-pleasant conclusion: P2P is driving consumer broadband demand… and broadband is driving P2P uptake.
The symbiotic relationship between the two is reflected in the accompanying network traffic pattern graphic. It leads me to a few conclusions:
+ The service providers have little or no reason to block P2P traffic in the near term, because it drives growth. And since most service providers are in growth mode, well, you know… ehm!
+ In the long term, however, P2P traffic, if not managed properly, is going to become a big problem.
+ The explosion in P2P traffic is going to have an impact on people who don’t use P2P services as well.
+ Due to P2P’s symmetrical nature, on average 80% of upstream capacity is consumed by P2P traffic.
Parker told me that many old-line television companies are experimenting with P2P technologies for video distribution. BBC and Sky are the most public about their plans, but there are others looking to use P2P to get more viewers for their content. Over the longer term, I think this is an interesting situation, and it brings up some niggling questions about Silicon Valley’s concept of the moment: The Long Tail. As niche content finds its footing, one has to wonder who is really footing the bill for the distribution.
I mean, be it P2P or iTunes or Rhapsody, we are simply shifting the cost of distribution over to the “pipe owners,” who are (whether they like it or not) being reduced to “mere conduits,” or utilities. For instance, the distribution cost of a record used to be pressing the CDs and getting them into the stores, which the record label paid for. Now, if you take a song, put it on a server, and start selling it, the distribution cost is really the “IP transit,” which someone has to pay for.
And as the debate continues, one thing is becoming increasingly certain: P2P has become the driver of broadband, and for now nothing else even comes close.