
Botnets and Emissions Trading

Many of the customers I engage with at work have been struggling with how to identify and handle botnet drones. I am going to assume that everyone who reads or stumbles upon this page has some understanding of botnets and their impact. Over the past several weeks, Estonians have become very familiar with the effects botnet-enabled DDoS attacks can have on everyday life, and these networks are also the prime source of spam. There is common agreement that yes, botnets are a problem and yes, they need to go away. But who should actually bear the burden of de-fanging these networks?

Disarming the actors behind these attacks involves dismantling the botnets themselves, which is an increasingly challenging problem. Older-style bots used IRC servers as a central command-and-control mechanism, making them vulnerable to decapitation attacks by security personnel. Newer systems use P2P-style C&C protocols adapted from guerrilla file-sharing systems that are notoriously difficult to disrupt. Apart from traffic and content mitigation, which several organizations have shown to be extremely effective, the remaining option is to take down botnets node by node.

So who should eliminate botnets? End users don't feel responsible, or even recognize that there is a problem; all they know is that they are using their computer when someone comes along and tells them they are infected with a virus. Service providers (telephone and cable companies) with infected customers don't consider themselves responsible either; they pay for the problem through outbound bandwidth charges and outbound MTA capacity, a relatively minor cost compared to what the targets of the attacks bear. Operating system vendors aren't responsible, because once they sell the product to the customer, they are no longer liable for if, when, or how the customer becomes compromised. Ultimately, the people who bear the largest cost are the ones least capable of remediating the source of the spam: the service providers of the attack recipients. These actors have to pay for bandwidth for inbound attacks, storage for spam, and support calls from customers asking why their computer is slow when it is, in reality, a botted system.

In many ways, we have a classic Tragedy of the Commons-type issue. The communal grazing areas, shared resources that were critical to the working class's ability to make a living, have been replaced by today's fiber lines. Currently the "tragedy" is solved by bandwidth providers through embargoes of one another: if one service provider gets out of line, the others will block all mail originating from the offender. Recently I have been pondering another possible solution, one based upon financial mechanisms.

While it would likely be impossible to implement, a Cap-and-Trade-style trading system seems extremely appropriate. Similar to carbon trading schemes, a cap-and-trade system for malicious content established between providers would create economic incentives to correctly monitor and reduce the volume of unwanted content that flows between their networks. The system would involve a cap on how much malicious content the parties deem acceptable to send to one another. Providers who better control their outbound malicious traffic, through expenditures on personnel and products, can recoup those costs through the sale of credits representing the difference between their level of outbound malicious content and the agreed-upon cap. Providers who don't police their traffic are forced to buy credits from those who do, which in turn puts a price on their lack of responsibility.
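To make the mechanics concrete, here is a minimal sketch of how such a settlement might be computed each period. Everything in it is an illustrative assumption: the cap, the flat credit price, and the provider names are made up, and a real market would obviously set prices through trading rather than a constant.

```python
# Hypothetical cap-and-trade settlement sketch. CAP and PRICE_PER_CREDIT
# are illustrative assumptions, not values from any real scheme.

CAP = 1000               # agreed-upon cap on outbound malicious messages per period
PRICE_PER_CREDIT = 0.10  # assumed flat clearing price per message of headroom

def settle(outbound_bad_by_provider):
    """Providers under the cap sell their headroom as credits;
    providers over the cap must buy credits to cover the excess.
    Returns each provider's cash balance for the period."""
    balances = {}
    for name, outbound_bad in outbound_bad_by_provider.items():
        surplus = CAP - outbound_bad          # positive: credits to sell
        balances[name] = surplus * PRICE_PER_CREDIT
    return balances

# A diligent provider earns money; a negligent one pays.
print(settle({"CleanISP": 200, "SloppyISP": 1800}))
# → {'CleanISP': 80.0, 'SloppyISP': -80.0}
```

The point of the sketch is simply that the sign of each balance flips at the cap: spending on abuse-desk staff and filtering moves a provider from the paying side to the earning side of the ledger.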

Eventually, the provider may choose to expose this cost of security to the end user, with rebates or special offers extended to users who keep their systems clean and never cause a problem. The end users in turn are incented to keep their machines clean, the Internet would return to the pre-fall-from-eden utopia that it once was, and the world would be a happy place once again.*

* Having providers buy into this concept, building a monitoring infrastructure, setting prices, assembling a market, and maintaining a clearinghouse for credit trades would be pretty damned hard. While I don't think this is a practical idea, it does make for a fun thought experiment.

Comments (1)


This is a good concept, and I can think of a couple of ways to make it practical. A QoS or rate-limiting based peering system might mitigate some of the problems of a centralized cap'n'trade approach. ISPs could agree to pass N megs of waste-data from each other before it gets rate-limited; that may be an option.
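The N-megs-then-limit peering idea the comment describes amounts to a per-peer traffic allowance. A minimal sketch, where the class name, the cap value, and the reset-per-period behavior are all illustrative assumptions rather than any existing ISP mechanism:

```python
class PeerAllowance:
    """Hypothetical per-peer allowance: each peer may send up to
    cap_mb of classified waste-data per period before being rate-limited."""

    def __init__(self, cap_mb):
        self.cap_mb = cap_mb
        self.used_mb = 0.0

    def admit(self, size_mb):
        """Return True while the peer is still under its allowance."""
        if self.used_mb + size_mb > self.cap_mb:
            return False          # over the cap: rate-limit this traffic
        self.used_mb += size_mb
        return True

    def reset(self):
        """Called at the start of each measurement period."""
        self.used_mb = 0.0

peer = PeerAllowance(cap_mb=100)
print(peer.admit(60))   # True: within the 100 MB allowance
print(peer.admit(60))   # False: would exceed the allowance
```

This is the same bookkeeping a token bucket does, just at peering granularity: the open question, as the comment notes below, is why a peer would agree to be metered this way at all.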

Current IDS/IPS tools can catch the bulk of waste-data or net-noise, and the amount that actually gets through will be marginal. This approach is a technical solution, but why an ISP would voluntarily expose itself to having its traffic limited because of the behaviour of user machines that are not its responsibility remains to be seen.

If some of the larger networks implemented such a system, it would compel the others to participate, but there would have to be a demonstrable benefit to the initial ISPs in rate-limiting the waste-data from a peer, as (perhaps arguably) there aren't any significant constraints on core-network bandwidth that would make it worthwhile.



This page contains a single entry from the blog posted on May 31, 2007 5:01 PM.

