Clear Standards for Reasonable Network Management

This is the second in a series of posts by Chris Riley, Free Press Policy Counsel, to summarize the primary policy recommendations made in recent comments submitted to the Federal Communications Commission in its open Internet proceeding. Today’s topic: reasonable network management.

In my last post, I discussed Free Press’ position on nondiscrimination, and why a clear and comprehensive rule without loopholes is essential to protect consumers, competition and innovation. But are there times when discrimination is beneficial and should be allowed? Although the ISPs’ network problems are exaggerated, the answer is yes – in the right contexts, when done in the right way. We support the FCC’s general idea of allowing for “reasonable network management,” but as we discuss in our filing for the agency’s NPRM on open Internet rules, we remain highly critical of the vagueness of the definition proposed.

The discussion of network controls centers on the issue of congestion, which is often discussed but less often understood. Congestion occurs as a result of very heavy network use, when many users are simultaneously sharing a network resource that has been designed to serve only a few at a time. When congestion occurs, not all of the data offered to a link can be carried at once. Congestion sometimes lasts for only a fraction of a second, but sometimes it lasts much longer. Although the Internet was designed to handle congestion without the network faltering, poorly engineered Internet applications, or applications that depend on high performance, can be temporarily disrupted. The higher the utilization within a network, the more frequent and severe the congestion, and the greater its impact on Internet use.

Previously, network operators dealt with high utilization by increasing network capacity. If too many people used the network at the same time, the network was expanded to accommodate the demand. This approach has worked well throughout the history of the Internet, and it’s still working well today.

Delusional data usage

A big talking point of network gatekeepers in the FCC’s proceeding on open Internet rules is the idea that Internet use is growing out of control, producing unprecedented levels of traffic and severe congestion. Allegedly, network gatekeepers need to impose similarly unprecedented controls to deal with this congestion. This idea is commonly known as the “exaflood,” and it’s a delusion. Data usage has been growing steadily for years; network engineers have always been able to accommodate the rapid pace of growth; and there is no evidence that this pattern is changing. Talk about a solution in search of a problem!

That said: Even a properly engineered network will experience sporadic, mild periods of high utilization and congestion. So appropriate network controls to deal with periods of congestion can be reasonable. Similarly, network controls that deal with spam or viruses or denial of service attacks can certainly be reasonable. And the FCC should ensure that network operators’ ability to impose “reasonable” network controls can coexist with consumer protections in this proceeding.

Clear guidance from the FCC

So, where does that leave us? The FCC needs to offer clear guidance as to what “reasonable” means, so that network operators can better understand what actions might get them in trouble, and consumers can be assured that the “reasonable” framework won’t simply rubber-stamp whatever a network operator wants. But rather than offering clarity, the FCC proposed a circular definition: “Reasonable network management consists of practices which are reasonable.”

Admittedly, this isn’t an easy question. The FCC has two goals in tension: be clear, but leave room for beneficial practices. Yet the right solution isn’t that hard. It’s similar to frameworks already adopted, either voluntarily or by regulation, in Japan and Canada. To be considered reasonable, network controls must have a good reason for their existence, and they must not cause unnecessary harm. Put in other terms, this is a two-part test for “purpose” and “means.”

Public interest purpose

The first part of the test is: What is the purpose of the practice, and why should we consider it valuable? In our comments, we propose that the purpose be a “public interest purpose” – something which on balance serves the public interest, not merely short-term parochial interests. The purpose should also be real, not hypothetical. For example, congestion management can be a public interest purpose, but only if the network operator can demonstrate that congestion is occurring or at least likely because of high utilization. Going back to my first post: Any discrimination is inherently harmful to some Internet traffic. Thus, if the purpose isn’t real, the discrimination at issue is unwarranted and unreasonable. Practices should not be approved absent data showing that the intended purpose is not merely hypothetical.

Time, geography and proportion

If the purpose is real and valuable, the next question is: What means are used to achieve that purpose? We break this question down into three parts: time, geography and proportion, and I will use congestion management as my hypothetical purpose. “Time” says that if you demonstrate that your network experiences high utilization between 7 and 9 p.m. in a local area, you should not be using a congestion management practice at noon. “Geography” says that if you demonstrate high utilization in a service area somewhere in downtown San Francisco, you should not be using a congestion management practice in Boston. And “proportion” says that you can’t block all uses of an application just because some of those uses are contributing to congestion. You can’t discriminate simply because it’s the easy answer when a more appropriate remedy is available.

With a good two-part purpose-and-means test in place, the FCC can evaluate network controls that violate the open Internet rules on a case-by-case basis, separating the bad actors from the good. Clear standards for what counts as “reasonable” will limit the FCC’s arbitrary discretion and give reviewing courts (and the public and Congress) something concrete to evaluate. But a definition that says “reasonable is that which is reasonable” is nothing but Swiss cheese.