Thursday, July 10, 2008

Weapons of Mass Congestion & Weapons of Mass Distribution

Yesterday, I heard Richard French, the very brilliant and usually convincing former Bell senior executive, CRTC Commissioner, Quebec Cabinet Minister and man of many other hats and skills, defend internet throttling at the University of Ottawa.

It seems that we must accept as a premise that there is network congestion; he didn’t offer any evidence or citations to elevate this beyond a mere premise. Somewhat like “weapons of mass congestion” (my term) caused by “weapons of mass distribution” (my term), namely P2P applications (his explanation).

Mr. French said that less than 20% of internet users are using more than 80% of the capacity, that throttling based upon applications (i.e. P2P) is not only justifiable but necessary, and that other alternatives such as usage-based pricing are unrealistic.

He simply dismissed any suggestion that throttling is being done to satisfy copyright owners. There is, however, evidence that suggests otherwise. For example, events such as this.

Net neutrality (and its antithesis, throttling) is going to become a very important issue. And I mean VERY.

The future of the internet is at stake. We are at risk that it is going to turn into a new version of cable and pay television, controlled by the usual suspects.



  1. I completely fail to understand why he would suggest that usage-based pricing is unrealistic. It seems to make sense to me; why is it never considered an option? Didn't they get themselves into this mess by selling "unlimited" bandwidth in the first place, bandwidth that they couldn't actually provide?

  2. It's a matter of misinformation to obtain "control" of the marketplace. This debate could go either way:

    1) An open marketplace on the net that everyone has a stake in and can compete in (media-wise)

    2) A "selected" few given control, effectively wiping out any competition in the media marketplace.

    #1 has evidence to support it, and to show how beneficial it would be. #2 would be based on "theory" alone, with no evidence, if it were ever to happen.

    You can't gain credibility in this debate now without coming out with facts. We need to debate the facts, not the spin or theories. Once the facts are out, weigh them equally and come up with a decision. There are no facts to support a system of “control”.

    If the facts are presented, we will have an open marketplace, something these media giants are back on their heels about right now and desperately don't want to happen. They are quickly figuring out they don't have a leg to stand on, so they throw smoke and mirrors at the politicians, hoping they look past the facts.

    I have every trust that the Canadian people won't let our politicians be blindsided. They are watching this closely.

  3. Are you seriously arguing that there is, in fact, no congestion? That backbone providers light fibre because they're too dumb to realise they needn't bother? That the IP layer is not a shared medium in which TCP flooding is effective? That content distribution providers like Akamai should fire all of their customers and return all of their fees because there is no need for them? That Google should turn down all its distributed server farms in the Americas, Europe, and Asia because they're completely irrelevant?

    It's one thing to argue that traffic should be managed in a way that is transparent, consistent, and non-discriminatory vis-a-vis subscribers. It's quite another to wish away the Internet's basic plumbing.

  4. Perhaps the biggest problem in the P2P throttling argument is that the most arguably efficient P2P protocol, BitTorrent, is not only used by personal file sharers, but also by numerous Open Source projects to distribute their software (something the traditional software companies would like to see impeded), as well as by a number of smaller media companies for distributing their product in a cost-effective way. Of course the bigger media players don't want to have to compete, so once again shutting down P2P is effective for them, with congestion advanced as a scapegoat. It takes just as much or more bandwidth for 50 people to download a gigabyte of data via FTP or HTTP as it does for the same 50 people to do so via BitTorrent; the connections are just more widely distributed, saving individual sites bandwidth (reducing their fees to the ISP) and marginally increasing the speed to the end user.

  5. The following seemed to get dropped from my previous post - keyboard problem here I think.


    " not only used by personal file sharers, but "


    by numerous Open Source projects to distribute their software (something the traditional software companies would like to see impeded) as well as a number of smaller media companies for distributing their product in a cost effective way.

  6. Evidence? Who needs evidence?

    Read this funny article about the internal data Bell released about their alleged network congestion:
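The aggregate-bandwidth point in comment 4 (50 people, one gigabyte each) can be sketched with back-of-the-envelope arithmetic. The 50-peer and 1 GB figures come from the comment; the seeding ratio is purely an illustrative assumption:

```python
# Where the bytes flow when 50 peers each fetch a 1 GB file:
# client-server (FTP/HTTP) vs. swarm (BitTorrent-style).
# The seeding ratio is an assumed, illustrative number, not a measurement.

PEERS = 50
FILE_GB = 1.0

# Client-server: the origin server uploads every copy itself.
server_upload_gb = PEERS * FILE_GB           # 50 GB from one site
total_transfer_gb = PEERS * FILE_GB          # 50 GB cross the network either way

# Swarm: suppose the original seeder uploads the file only twice (assumed
# seeding ratio) and the peers redistribute the rest among themselves.
SEED_COPIES = 2.0
seeder_upload_gb = SEED_COPIES * FILE_GB     # 2 GB from the origin site
peer_upload_gb = total_transfer_gb - seeder_upload_gb  # 48 GB spread over peers

print(f"client-server: origin uploads {server_upload_gb:.0f} GB")
print(f"swarm: origin uploads {seeder_upload_gb:.0f} GB, "
      f"peers collectively upload {peer_upload_gb:.0f} GB")
```

The total bytes moved are the same either way (slightly more for BitTorrent once protocol overhead is counted); only the distribution of the upload burden changes, which is exactly the comment's point.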