Adam Thierer's Blog

April 18, 2013

Internet Analogies: Remember When the Internet Was the Information Superhighway? (Part 2)

Why did the government impose a completely different funding mechanism on the Internet than on the Interstate Highway System? There is no substantive distinction between the shared use of local infrastructure by commercial “edge” providers on the Internet and shared use of the local infrastructure by commercial “edge” providers (e.g., FedEx) on the highways.



In Part 1 of this post, I described the history of government intervention in the funding of the Internet, which has been used to exempt commercial users from paying for the use of local Internet infrastructure. The most recent intervention, known as “net neutrality”, was ostensibly intended to protect consumers, but in practice, requires that consumers bear all the costs of maintaining and upgrading local Internet infrastructure while content and application providers pay nothing. This consumer-funded commercial subsidy model is the opposite of the approach the government took when funding the Interstate Highway System: The federal government makes commercial users pay more for their use of the highways than consumers. This fundamental difference in approach is why net neutrality advocates abandoned the “information superhighway” analogy promoted by the Clinton Administration during the 1990s.



The Interstate Highway System was authorized by the Federal Aid Highway Act of 1956, which created the Highway Trust Fund (HTF) to finance the new “superhighway.” The HTF is a user-supported fund that derives hypothecated tax revenues from excise taxes on motor fuels and heavy commercial vehicles, which are the primary source of revenue for federal-aid highways. When it was designing this funding mechanism, the government recognized that the additional congestion and road damage caused by the commercial trucking industry imposes additional costs on highway infrastructure. Although all users contribute to the HTF through fuel taxes, the commercial trucking industry pays higher excise taxes than consumer users. Diesel fuel, which is used primarily by the commercial trucking industry, is taxed at a higher rate than gasoline (diesel is taxed at 24.3 cents per gallon, gasoline is taxed at 18.3 cents per gallon). The HTF also receives revenues produced by excise taxes imposed exclusively on tires used for heavy vehicles, the retail sale of heavy highway vehicles (e.g., semi-trucks), and from the heavy vehicle use tax. These taxes are intended to “better reflect the cost responsibility of heavy trucks” for shared use of the highway infrastructure.
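The cost-responsibility point can be made concrete with a little arithmetic. Here is a minimal sketch using the excise rates cited above; the mileage and fuel-economy figures are illustrative assumptions, not data from the post:

```python
# Sketch of the HTF fuel-tax differential described above, using the federal
# excise rates cited in the text (24.3 cents/gal diesel, 18.3 cents/gal gas).
# Mileage and fuel-economy figures are invented for illustration.

GAS_TAX = 0.183     # dollars per gallon of gasoline (consumer cars)
DIESEL_TAX = 0.243  # dollars per gallon of diesel (commercial trucks)

def annual_fuel_tax(miles_per_year, miles_per_gallon, tax_per_gallon):
    """Hypothecated fuel tax paid into the HTF over one year."""
    gallons = miles_per_year / miles_per_gallon
    return gallons * tax_per_gallon

# Assumed: a car driven 12,000 mi/yr at 25 mpg; a semi at 100,000 mi/yr, 6 mpg.
car_tax = annual_fuel_tax(12_000, 25, GAS_TAX)
truck_tax = annual_fuel_tax(100_000, 6, DIESEL_TAX)
```

Even before the tire, retail, and heavy vehicle use taxes are counted, the heavy commercial user's fuel-tax contribution dwarfs the consumer's, which is the cost-responsibility principle the HTF was built around.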



If the theory of the ESP exemption and the net neutrality payment exemption were applied to the highways, commercial users wouldn’t pay any hypothecated taxes for their use of the Interstate Highway System. In net neutrality terms, FedEx uses the highways to offer an “edge” service to consumers who ask FedEx to deliver packages to their home using a shared highway infrastructure that FedEx doesn’t own or operate. If the government treated FedEx the same way it treats “over the top” Internet companies, the government would eliminate taxes on diesel fuel and, rather than charge a heavy vehicle use tax, the government would provide a heavy vehicle use exemption. As a result, consumers would have to pay higher gasoline taxes to make up for the funding lost when shipping companies stopped paying hypothecated taxes (similar to the way telephone subscribers paid for the ESP exemption with their phone bills). The higher gasoline taxes would impact every consumer who uses the highways – even consumers who never use FedEx. Any suggestion that FedEx pay its fair share for use of the highways would be deemed a “plot to block highway freedom” by threatening the “commercial model” of the “open highways” (the terms used by FCC Chairman Julius Genachowski to describe any suggestion that would eliminate the net neutrality payment exemption in the Internet context).



Why did the government impose a completely different “commercial model” on the Internet than on the Interstate Highway System? There is no substantive distinction between the shared use of local infrastructure by commercial “edge” providers on the Internet and shared use of the local infrastructure by commercial “edge” providers (e.g., FedEx) on the highways. The difference in treatment is a historical anomaly resulting from the initially “temporary” ESP exemption that has morphed into a desire to permanently subsidize the profits of “over the top” Internet companies in order to “preserve” the historical payment models of the Internet. In its order adopting net neutrality rules, the FCC exempted “edge” providers from paying for their use of local Internet infrastructure because ISPs “may have incentives to increase revenues by charging edge providers,” which the FCC believed would reduce incentives for edge providers to invest by reducing “the potential profit that an edge provider would expect to earn from developing new offerings.” (Emphasis added.) Excise taxes also reduce the potential profits of FedEx and other users of heavy commercial vehicles on the highways, but the federal government has not exempted them from the ordinary costs of doing business to encourage investment in new shipping offerings. To the contrary, a brochure released by the Department of Transportation (DOT) asks, “What can be done to enhance [heavy vehicle use tax] revenues?” The DOT views the heavy vehicle use tax as a way to “level the playing field” for consumers “by ensuring that operators of heavy trucks pay a little more for the highway network.” Of course, the FCC says net neutrality, which exempts commercial users from paying anything for their use of the local Internet, creates a “level playing field” for consumers too.



In an economic system based on capitalism, companies are not routinely exempted from the ordinary costs of doing business, including the use of shared infrastructure. Though the Interstate Highway System is not a free market, the government has at least attempted to correlate usage and costs. When FedEx uses heavy vehicles to deliver packages, it pays more for its use of the highways than consumers, even when consumers have requested FedEx deliveries. This has the effect of reducing the potential profits of FedEx – the “harm” to “edge” providers the FCC relied on to justify the net neutrality payment exemption on the Internet – but it also has the effect of encouraging FedEx to innovate and invest in more efficient methods of package delivery that cause less congestion and harm to the highways.



The FCC took the opposite approach with net neutrality. Its rules are designed to maximize the profits of commercial “edge” providers on the Internet while reducing their incentives to use bandwidth more efficiently. As a result, Internet consumers who never watch a video on the Internet nevertheless share a portion of the cost of upgrading local Internet infrastructure to deliver high definition video while “over the top” Internet companies – no matter how large or successful they become – pay nothing. No wonder net neutrality advocates have stopped talking about the “information superhighway.” If policymakers were to examine the analogy too closely, they might realize that net neutrality isn’t intended to “level the playing field” for consumers – it’s intended to protect the profits of commercial “edge” providers at the expense of consumers.




Published on April 18, 2013 06:58

April 17, 2013

USPTO should step up review of software patents

The US Patent and Trademark Office is starting to recognize that it has a software patent problem and is soliciting suggestions for how to improve software patent quality. A number of parties, such as Google and EFF, have filed comments.



I am on record against the idea of patenting software at all. I think it is too difficult for programmers, as they are writing code, to constantly check whether they are violating existing software patents, which are not, after all, easy to identify. Furthermore, any complex piece of software is likely to violate hundreds of patents owned by competitors, which makes license negotiation costly and far from straightforward.



However, given that the abolition of software patents seems unlikely in the medium term, there are some good suggestions in the Google and EFF briefs. They both note that the software patents granted to date have been overbroad, equivalent to patenting headache medicine in general rather than patenting a particular molecule for use as a headache drug.



This argument highlights one significant problem with patent systems generally, that they depend on extremely high-quality review of patent applications to function effectively. If we’re going to have patents for software, or anything else, we need to take the review process seriously. Consequently, I would favor whatever increase in patent application fees is necessary to ensure that the quality of review is rock solid. Give USPTO the resources it needs to comply with existing patent law, which seems to preclude such overbroad patents. Simply applying patent law consistently would reduce some of the problems with software patents.



Higher fees would also function as a Pigovian tax on patenting, disincentivizing patent protection for minor innovations. This is desirable because the licensing cost of these minor innovations is likely to exceed the social benefits the patents generate, if any.
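The screening logic behind the Pigovian-fee argument can be illustrated with a toy model: an inventor files only when the expected private value of a patent exceeds the filing fee, so raising the fee filters out low-value patents. All dollar values here are invented for the example:

```python
# Toy model of fee-based screening: patents are filed only when their expected
# private value exceeds the filing fee. Values and fees are hypothetical.

def patents_filed(private_values, fee):
    """Return the subset of patent values still worth filing at a given fee."""
    return [v for v in private_values if v > fee]

values = [500, 2_000, 10_000, 75_000, 400_000]  # assumed private values ($)

low_fee = patents_filed(values, 1_000)    # minor innovations still get filed
high_fee = patents_filed(values, 20_000)  # only higher-value inventions remain
```

Under the higher fee, only the inventions whose private value plausibly exceeds their social licensing cost are patented, which is exactly the screening effect described above.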



While it remains preferable to undertake major patent reform, many of the steps proposed by Google and EFF are good marginal policy improvements. I hope the USPTO considers these proposals carefully.




Published on April 17, 2013 11:40

Spectrum allocation: Time to get on with it

My new policy brief urges the Federal Communications Commission to get on with the business of allocating the necessary spectrum to meet the burgeoning demand for wireless services.



The paper was finished before Chairman Julius Genachowski announced his resignation last month. At the risk of sounding harsh, that might be addition by subtraction. One of the big disappointments of Genachowski’s tenure was the lack of significant movement to get spectrum freed up and auctioned. In fairness, there were the interests of a number of powerful constituencies to be balanced: the wireless companies, the broadcasters, and the federal government itself, which is sitting on chunks of prime spectrum and refuses to budge.



But that’s the job Congress specifically delegated to the FCC. We’d be closer to a resolution, and the public would have been better served, had the FCC put its energies into crafting a viable plan for spectrum trading and re-assignment instead of hand-wringing over how to handicap bidders with neutrality conditions and giving regulatory favors to developers of unproven technologies such as Super WiFi. Instead of managing the spectrum process, the FCC got sidetracked trying to pick winners and losers.



A new chairman brings an opportunity for a new direction. Spectrum relief should go to the top of the agenda. And as I say in the policy brief, just do it.




Published on April 17, 2013 11:32

CISPA’s Vast Overreach

Last summer at an AEI-sponsored event on cybersecurity, NSA head General Keith Alexander made the case for information sharing legislation aimed at improving cybersecurity. His response to a question from Ellen Nakashima of the Washington Post (starting at 54:25 in the video at the link) was a pretty good articulation of how malware is identified and blocked using algorithmic signatures. In his longish answer, he made the pitch for access to key malware information for the purpose of producing real-time defenses.



What the antivirus world does is it maps that out and creates what’s called a signature. So let’s call that signature A. … If signature A were to hit or try to get into the power grid, we need to know that signature A was trying to get into the power grid and came from IP address x, going to IP address y.



We don’t need to know what was in that email. We just need to know that it contained signature A, came from there, went to there, at this time.





[I]f we know it at network speed we can respond to it. And those are the authorities and rules and stuff that we’re working our way through.





[T]hat information sharing portion of the legislation is what the Internet service providers and those companies would be authorized to share back and forth with us at network speed. And it only says: signature A, IP address, IP address. So, that is far different than that email that was on it coming.



Now it’s interesting to note, I think—you know, I’m not a lawyer but you could see this—it’s interesting to note that a bad guy sent that attack in there. Now the issue is what about all the good people that are sending their information in there, are you reading all those. And the answer is we don’t need to see any of those. Only the ones that had the malware on it. Everything else — and only the fact that that malware was there — so you didn’t have to see any of the original emails. And only the ones that had the malware on it did you need to know that something was going on.



It might be interesting to get information about who sent malware, but General Alexander said he wanted to know attack signatures, originating IP address, and destination. That’s it.
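The rule General Alexander describes amounts to a simple filter: match known signatures against traffic, and report only the signature label and flow metadata, never the message body. A minimal sketch of that rule, with made-up signature bytes, addresses, and timestamps:

```python
# Minimal sketch of the signature-based sharing described above: detection
# keys on a known malware byte pattern, and the report carries only the
# signature label plus flow metadata -- never the payload itself.
# The signature bytes, IPs, and timestamps here are invented for illustration.

KNOWN_SIGNATURES = {
    "sig-A": b"\xde\xad\xbe\xef",  # hypothetical byte pattern ("signature A")
}

def inspect(payload, src_ip, dst_ip, timestamp):
    """Return a shareable report if a known signature matches, else None."""
    for label, pattern in KNOWN_SIGNATURES.items():
        if pattern in payload:
            # Shared: signature label, endpoints, time. Payload is excluded.
            return {"signature": label, "src": src_ip,
                    "dst": dst_ip, "time": timestamp}
    return None

clean = inspect(b"hello world", "10.0.0.1", "10.0.0.2", 100)
hit = inspect(b"...\xde\xad\xbe\xef...", "203.0.113.5", "198.51.100.7", 101)
```

Note how narrow the shared record is: a clean message produces nothing at all, and a match produces only "signature A, IP address, IP address" plus a time, which is the scope Alexander claimed to need.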



Now take a look at what CISPA, the Cyber Intelligence Sharing and Protection Act (H.R. 624), allows companies to share with the government provided they can’t be proven to have acted in bad faith:



information directly pertaining to—

(i) a vulnerability of a system or network of a government or private entity or utility;



(ii) a threat to the integrity, confidentiality, or availability of a system or network of a government or private entity or utility or any information stored on, processed on, or transiting such a system or network;



(iii) efforts to deny access to or degrade, disrupt, or destroy a system or network of a government or private entity or utility; or



(iv) efforts to gain unauthorized access to a system or network of a government or private entity or utility, including to gain such unauthorized access for the purpose of exfiltrating information stored on, processed on, or transiting a system or network of a government or private entity or utility.


That’s an incredible variety of subjects. It can include vast swaths of data about Internet users, their communications, and the files they upload. In no sense is it limited to attack signatures and relevant IP addresses.



What is going on here? Why has General Alexander’s claim to need attack signatures and IP addresses resulted in legislation that authorizes wholesale information sharing and that immunizes companies that violate privacy in the process? One can only speculate. What we know is that CISPA is a vast overreach relative to the problem General Alexander articulated. The House is debating CISPA Wednesday and Thursday this week.




Published on April 17, 2013 07:30

Internet Analogies: Remember When the Internet Was the Information Superhighway? (Part 1)

Many net neutrality advocates would prefer that the FCC return to the regulatory regime that existed during the dial-up era of the Internet. They have fond memories of the artificially low prices charged by the dial-up ISPs of that era, but have forgotten that those artificially low prices were funded by consumers through implied subsidies embedded in their monthly telephone bills.



Remember when the Internet was the “information superhighway”? As recently as 2009, the Federal Communications Commission (FCC) still referred to the broadband Internet as, “the interstate highway of the 21st century.” Highways remain a close analogy to the Internet, yet by 2010, net neutrality advocates had replaced Internet highway analogies with analogies to waterworks and the electrical grid. They stopped analogizing the Internet to highways when they realized their approach to Internet regulation is inconsistent with government management of the National Highway System, which has always required commercial users of the highways to pay more for their use than ordinary consumers. In contrast, net neutrality is only the latest in a series of government interventions that have exempted commercial users from paying for the use of local Internet infrastructure.



Vice President Gore popularized the term “information superhighway” in the 1990s as a way of garnering support for the Clinton Administration’s technology initiatives, including its Agenda for Action to promote the Internet (then dubbed the “National Information Infrastructure”). Gore believed the Internet would become the “Interstate Highway System of the 21st Century,” and he frequently analogized the Internet to a “network of highways – much like the Interstates begun in the ‘50s.” He envisioned networks as diverse as the nation’s roadways:



These are highways carrying information rather than people or goods. And I’m not talking about just one eight-lane turnpike. I mean a collection of Interstates and feeder roads made up of different materials in the same way that roads can be concrete or macadam – or gravel. Some highways will be made up of fiber optics. Others will be built out of coaxial or wireless.


Though the Administration recognized their similarities, it chose to fund the “21st Century technology infrastructure” of the Internet differently than highways. The Administration recognized the “fundamental fact” that the private sector was already investing approximately $50 billion annually in telecommunications infrastructure compared to the one to two billion the federal government was contributing. Based on this fact, the Administration determined that its first principle for government action should be to “promote private sector investment” in Internet infrastructure through tax incentives and communications reform legislation that would “encourage innovation and promote long-term investment.” The reform legislation referred to in the Agenda for Action became the Telecommunications Act of 1996.



As implemented by the FCC, however, the 1996 Act discouraged investment in local Internet infrastructure by shifting costs caused by “over the top” Internet service providers to consumers who subscribed to plain old telephone services. The 1996 Act distinguishes between:




“Telecommunications carriers,” which provide switched voice telephone service and are subject to common carrier regulation, and
“Information service providers,” which provide data and Internet communications services and are not subject to common carrier regulation.


The FCC concluded that these categories were equivalent to an existing FCC distinction between “basic” and “enhanced” services (with “telecommunications” equaling “basic” and “information” equaling “enhanced”) that was developed in the early 1970s when computing capabilities were new and the telephone system was still a monopoly.



When the telephone monopoly was dismantled in 1983, the FCC required that interstate carriers (e.g., long distance telephone companies and cellular carriers) pay “access charges” to local carriers to maintain the infrastructure of local telephone exchanges (which had previously been maintained through monopoly rents). The FCC temporarily exempted “enhanced service providers” from paying access charges to avoid a “bill shock” to data users. See MTS and WATS Market Structure, FCC 83-356 (1983). This “ESP exemption” was intended to be temporary, because it “forced [telephone subscribers] to bear a disproportionate share of the local [telephone] exchange costs that access charges [were] designed to cover.” See ESP Exemption Order, FCC 88-151 (1988). The FCC extended the ESP exemption permanently in the ESP Exemption Order – despite its discriminatory impact on telephone subscribers who didn’t use data services (which were mostly used by big businesses at that time) – because the market for data services was still emerging. The FCC concluded that, “to the extent the exemption for enhanced service providers may be discriminatory, it remains, for the present, not an unreasonable discrimination.”



After the 1996 Act was passed, the FCC converted the “ESP” exemption into the information service provider (or “ISP”) exemption, which exempted independent “dial-up” Internet service providers from paying access charges and the per-minute rates applicable to interstate telecommunications services (i.e., long distance telephone calls). See Access Charge Reform, FCC 97-158 (1997). The FCC treated “over the top” dial-up ISPs as local “end user” customers and permitted them to lease lines from telephone companies at the significantly lower, flat monthly rates applicable to business lines used for local calls. Because dial-up ISPs could pay a flat monthly rate for unlimited data traffic rather than the per-minute charges that were then applicable to long distance telephone calls, ISPs offered unlimited dial-up Internet access to consumers at flat monthly rates that were artificially low in comparison to the rates charged for telephone service. As a result, consumers who subscribed to telephone services paid “subscriber line charges” and higher per-minute long distance rates to cover costs to local exchange networks that were caused by dial-up ISPs and their subscribers. Even telephone subscribers who were not using Internet services were in effect required by law to subsidize dial-up ISPs.
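The scale of the arbitrage is easy to sketch. Assuming hypothetical rates (a $30 flat monthly business line versus 5 cents per minute in interstate charges; the real tariffs varied), a single line in heavy use illustrates the gap between what dial-up ISPs paid and what the same minutes would have cost at long-distance rates:

```python
# Rough arithmetic behind the ISP exemption described above: a dial-up ISP
# paying a flat local business-line rate versus what the same minutes would
# cost at per-minute interstate rates. All dollar figures are hypothetical.

FLAT_BUSINESS_LINE = 30.00    # assumed flat monthly rate per local line ($)
PER_MINUTE_INTERSTATE = 0.05  # assumed per-minute interstate rate ($)

def monthly_cost_flat(lines):
    return lines * FLAT_BUSINESS_LINE

def monthly_cost_per_minute(minutes):
    return minutes * PER_MINUTE_INTERSTATE

# One line carrying 8 hours of modem traffic a day for 30 days:
minutes = 8 * 60 * 30                       # 14,400 minutes
flat = monthly_cost_flat(1)
metered = monthly_cost_per_minute(minutes)
```

Under these assumed numbers the metered charge is more than twenty times the flat rate, and the shortfall did not vanish: it was recovered from telephone subscribers through subscriber line charges and higher long distance rates, as described above.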



Although treating dial-up Internet traffic as “local” meant that ISPs didn’t have to pay access charges (which apply only to interstate calls), the 1996 Act introduced a new payment type – “reciprocal compensation” – that was designed to apply to the exchange of local calls. The states, which have jurisdiction only over local calls, interpreted this provision as requiring that dial-up ISPs pay reciprocal compensation for their share of the costs involved in maintaining local telephone exchanges. The FCC quickly issued an order preempting the states on jurisdictional grounds by concluding that dial-up ISP-bound traffic is inherently interstate. See Inter-Carrier Compensation for ISP-Bound Traffic, FCC 99-38 (1999). The FCC concluded that the Internet could not be separated into an “intrastate telecommunications service” (the call from the consumer to the dial-up ISP’s local server) and an “interstate information service” (the Internet access provided by the ISP’s local server), because the definition of information services in the 1996 Act “recognizes the inseparability, for purposes of jurisdictional analysis, of the information service and the underlying telecommunications.” The FCC thus required states to treat dial-up ISP traffic as local for pricing purposes and as interstate (i.e., long distance) for jurisdictional purposes. The FCC justified this absurd result by noting the “strong federal interest in ensuring that regulation does nothing to impede the growth of the Internet – which has flourished to date under our ‘hands off’ regulatory approach – or the development of competition.” Of course, dictating that local telephone companies lease their lines at regulated prices to dial-up ISPs was not a “hands off” approach that any free market economist would recognize.



The FCC didn’t adopt a truly “hands-off” regulatory approach to the Internet until it classified broadband Internet access services as information services during the Bush Administration, a classification that prevented or eliminated mandatory wholesale requirements and government price regulations on cable modem (2002), DSL and Fiber (2005), broadband over power line (2006), and wireless broadband (2007). In its Cable Modem Order, the FCC relied on the same rationale used during the Clinton Administration to preempt the states from imposing reciprocal compensation on dial-up ISPs: The FCC concluded that broadband Internet access was an integrated service with “no separate offering of telecommunications service.” Although this rationale was consistent with the jurisdictional premise adopted by the FCC when it preempted the states to preserve the ISP exemption (not to mention the FCC’s tentative conclusion in the 1980s that enhanced service providers should be required to pay access charges), the transition of the Internet to a free market in which “over the top” Internet companies might have to pay their fair share for the use of local Internet infrastructure became the rallying cry for “net neutrality.”



Although the FCC claimed that the net neutrality rules it adopted in 2010 were intended to protect consumers, net neutrality is actually the intellectual descendant of the ESP exemption – a “temporary” exemption that became a permanent subsidy paid by consumers for the benefit of “over the top” Internet companies. Many net neutrality advocates would prefer that the FCC return to the regulatory regime that existed during the dial-up era of the Internet. They have fond memories of the artificially low prices charged by the dial-up ISPs of that era, but have forgotten that those artificially low prices were funded by consumers through implied subsidies embedded in their monthly telephone bills. Due to the drastic decline in telephone subscriptions, that subsidy model is no longer viable, which in part explains why the FCC issued a policy statement embracing net neutrality principles on the same day it deregulated broadband Internet access provided by telephone companies: Net neutrality became the alternative mechanism for forcing consumers to subsidize “over the top” providers of Internet services.



In many ways, the FCC’s net neutrality proceeding in 2010 was a replay of earlier proceedings involving the ESP exemption. In the ESP exemption proceeding, telephone companies, state public utility commissions and attorneys general (with the notable exception of California), and consumer groups (e.g., National Consumers League) generally supported eliminating the ESP exemption, whereas enhanced service providers, device manufacturers (e.g., Apple), and data users (primarily large enterprises at that time) opposed paying access charges to use local telephone networks. States and consumer groups argued that all users of the local telephone network should “pay a fair share of the costs of the local network,” including Internet companies, and that the ESP exemption resulted in telephone consumers subsidizing big data companies. Enhanced service providers argued that the exemption was necessary to promote further development of the “fragile” data services market that was still in its “infancy.” Similar players made similar arguments in the net neutrality proceeding (with the exception that many consumer groups switched sides), and the FCC used the same rationale for adopting net neutrality rules that it relied on in the ISP exemption proceeding. And, similar to the impact of the ESP and ISP exemptions, the FCC’s net neutrality rules have had the effect of spreading costs caused by some “over the top” Internet services to all Internet access subscribers – including those who don’t use the most data intensive services – by prohibiting the owners of local Internet infrastructure from charging fees to content and application providers that use local Internet infrastructure to reach consumers.



Although the Interstate Highway System is user funded, it has no analog to the consumer-funded commercial subsidy model that has supported US-based Internet companies since 1983. When the federal government funded the Interstate Highway System, it embraced the principle that all users of a shared resource should pay for its use and that heavy users should pay the most. This contradiction between the funding policies of the Internet and the Interstate Highway System is why net neutrality advocates had to abandon the “information superhighway” analogy.



Part 2 of this post describes the funding mechanisms used to build and maintain the Interstate Highway System and compares them to net neutrality in more detail.




Published on April 17, 2013 04:27

April 16, 2013

Marc Hochstein on bitcoin


Marc Hochstein, Executive Editor of American Banker, a leading media outlet covering the banking and financial services community, discusses bitcoin.



According to Hochstein, bitcoin has made its name as a digital currency, but the truly revolutionary aspect of the technology is its dual function as a payment system competing against companies like PayPal and Western Union. While bitcoin has been in the news for its soaring exchange rate lately, Hochstein says the actual price of bitcoin is really only relevant for speculators in the short-term; in the long-term, however, the anonymous, decentralized nature of bitcoin has far-reaching implications.



Hochstein goes on to talk about the new market in bitcoin futures and some of bitcoin’s weaknesses—including the volatility of the bitcoin market.






Related Links


VIDEO: Why Banks Should Care About Bitcoin, Hochstein
Why Bitcoin Matters: It’s the Payment System, Stupid!, Hochstein
Bitcoin vs. Big Government: How the virtual currency undermines government authority, Brito
Online Cash Bitcoin Could Challenge Governments, Banks, Brito



Published on April 16, 2013 03:00

April 15, 2013

An Internet ‘Free from Government Control’: A Worthy Principle

On Wednesday, April 10, a bill “to Affirm the Policy of the United States Regarding Internet Governance” was marked up in the U.S. House of Representatives. The bill is an attempt to put a formal policy statement into statute law. The effective part says simply:



It is the policy of the United States to promote a global Internet free from government control and to preserve and advance the successful multistakeholder model that governs the Internet.



Yet this attempt to formulate a clear principle and make it legally binding policy has become controversial. This has happened because the bill brings to a head the latent contradictions and elisions that characterize U.S. international Internet policy. In the process it has driven a wedge between what was once a unified front by U.S. Democrats and Republicans against incursions into Internet governance by intergovernmental organizations such as the ITU.



The problem, it seems, is that the Democratic side of the aisle can’t bring itself to say that it is against ‘government control’ per se. Indeed, the bill has forced people linked to the Obama administration to come out and openly admit that ‘government control’ of the internet is OK when we exercise it; it’s just those other countries and international organizations that we need to worry about.





The U.S. has been deeply enmeshed in this contradiction ever since the World Summit on the Information Society in 2003-5, when it fended off criticisms of the U.S.-controlled ICANN while claiming to oppose ‘government control.’ In the meantime various US government agencies have (largely unconscious of, or independent of, the Internet freedom rhetoric) cast global shadows of hierarchy over various aspects of the Internet, seeking extraterritorial domain name takedowns, ACTA, restricted online gambling, cyber-weapons, and so on.



Until now, the contradiction has remained latent, a sotto voce muttering that the emperor has no clothes. Only a few hyper-critical academics (like us) were willing to articulate the argument, generally irritating everyone in the process. But now it’s out in the open. The double standard is humorously evident in this video showing the testimony of Rep. Eshoo, a Democrat of California, in the markup hearings. Rep. Eshoo says:



“…the expert agencies have expressed concern with the term, quote, ‘government control,’ unquote. One diplomat suggested that the use of this term could actually undermine existing Internet governance institutions such as ICANN because of its, uh, uh, close relationship with, uh, our government. Foreign countries frequently cite the close coordination between ICANN and US Dept of Commerce as an example of US quote ‘control’ over the internet.”



Well, yes, Rep. Eshoo, other countries do look at ICANN as a form of global Internet control exercised by one government. Are they wrong? ICANN gets its policy making authority over the DNS root directly from a contract with the U.S. government, and in exchange for receiving that contract ICANN has to stay in the U.S. and conform to various policies. This is not ‘close coordination;’ it’s control. Not even the slipperiest politician can plausibly deny this.



A similar double standard was raised in the response of Public Knowledge (PK), a U.S. public interest group. PK happily collected grants to join the U.S.-led charge against ‘government control of the Internet’ in the renegotiation of the ITU’s International Telecommunication Regulations. It joined in the anti-government rhetoric about how the Internet had to be left alone. Now it wants to clarify its position a bit:



we fear that the broad language of the proposed bill may intrude on areas of consumer protection, competition policy, law enforcement and cybersecurity long considered appropriate for national policy formulated by governments with input from civil society, business and the technical community.



Like Rep. Eshoo, PK is forced to distinguish between government control at home (the good kind) and government control that involves the rest of the world (the scary kind). Note that PK also tacitly accepts the description of different roles for government and civil society that the authoritarian states put into the WSIS Tunis Agenda: governments formulate policy and the rest of us just provide input.



Remember, at the end of the WCIT negotiations we were being told that an indirect reference to spam (“unsolicited bulk electronic communications”) in the ITRs opened the door to systematic content regulation on a global basis. Now PK is forced to admit that:



Although we opposed the ITU resolution to require countries to limit spam, the United States protects its citizens from spam through the CAN-SPAM Act.



Indeed. And why are domestic spam laws fine and international ones (that would have to be enforced by and consistent with those same domestic laws, and ratified by the same national legislature that passed the domestic laws) a threat to the very basis of free expression? According to PK,



Our opposition to ceding authority to the ITU to decide how to balance consumer protection and free expression is not because we see no role for government in protecting consumers or promoting competition. Rather, we believe those matters are best decided here at home, by a Congress accountable to the people and enforced by a government constrained by the Constitution.



So has PK gone cyber-nationalist? Like the Chinese, the Russians, the Saudis and the Iranians, does it want a balkanized Internet governed by a separate and distinct series of national sovereigns? If so, what, exactly, is wrong with the ITU as a venue for negotiating governance? The ITU is a global governance institution founded on the principles of national sovereignty.



We think it’s high time to call the bluff of American politicians and advocacy groups that play with this double standard. If they cannot bring themselves to embrace a principle of “a global Internet free from government control,” it’s time to ask them what they do stand for.



Defending the legitimate rights of consumers to be protected against fraud or monopolies is not “government control” of the Internet, by any serious definition. By protecting individual rights to privacy, by challenging coercive and collusive monopolies and by prosecuting fraud, governments are maintaining individual freedom, not exerting control. It is worrisome, therefore, that allegedly liberal groups such as PK want to maintain an option for ‘government control’ at the level of broad principle.



PK’s reversion to cyber-nationalism is both intellectually flawed and politically disturbing. Its attempt to distinguish between national laws and international ones falls apart completely when examined. Laws that overreach and over-regulate occur at both levels; PK simultaneously underestimates the dangers of government control at home (which is odd, given its involvement in issues such as CISPA) and overstates the dangers of international laws (which typically have to be ratified domestically and are subject to reservations).



Whether you are talking about China, Russia or the USA, you can’t have a free Internet and a national Internet. As a virtual space constructed out of a globally interconnected infrastructure, cyberspace realizes its highest potential when it is not artificially bounded by jurisdiction or hierarchically imposed filters. Right now, the biggest threats to internet freedom are from national governments. And while there are indeed aspects of communications that can and should be left to domestic regulation, any regulation that is too scary to be implemented at the international level probably poses many of the same dangers when enacted at the national level. The idea that we only have to worry about ‘government control’ when we are talking about foreign governments is obviously wrong.



The House bill articulates a worthy principle that can be and should be globally applicable to the Internet. Not controlling the Internet does not mean that there is no role for laws or regulations that safeguard individual rights; it means that national governments should recognize the Internet’s transnational nature and refrain from trying to suppress the rights to free expression and free association that have emerged in the context of a decentralized Internet not under the control of any sovereign.




Published on April 15, 2013 06:10

April 11, 2013

“Internet Freedom”: A Short Reading List

Following up on Eli’s earlier post (“Does CDT believe in Internet freedom?”), I thought I’d just point out that we’ve spent a great deal of time here through the years defending real Internet freedom, which is properly defined as “freedom from state action; not freedom for the State to reorder our affairs to supposedly make certain people or groups better off or to improve some amorphous ‘public interest.’” All too often these days, “Internet freedom,” like the term “freedom” more generally, is defined as a set of positive rights/entitlements complete with corresponding obligations on government to deliver the goods and tax/regulate comprehensively to accomplish it. Using “freedom” in that way represents a grotesque corruption of language and one that defenders of human liberty must resist with all our energy.



I’ll be writing more about this in upcoming columns, but here’s a short list of past posts on Internet freedom, properly defined:




The Problem with the “Declaration of Internet Freedom” & the “Digital Bill of Rights” – by Adam Thierer (July 2, 2012)
A Note to Congress: The United Nations Isn’t a Serious Threat to Internet Freedom—but You Are – by Jerry Brito & Adam Thierer (The Atlantic, June 19, 2012)
Does the Internet Need a Global Regulator? – by Adam Thierer (Forbes, May 6, 2012)
More Confusion about Internet “Freedom” – by Adam Thierer (Mar. 1, 2011)
Internet Freedom–Real vs Imagined – by Adam Thierer (Dec. 12, 2007)
A Response to Andrew McLaughlin on Net Neutrality & “Freedom” – by Adam Thierer (July 9, 2011)
Web 2.0, Section 230, and Nozick’s “Utopia of Utopias” – by Adam Thierer (Jan. 13, 2009)
Cyber-Libertarianism: The Case for Real Internet Freedom – by Adam Thierer & Berin Szoka (Aug. 12, 2009)
Broadband as a Human Right (and a short list of other things I am entitled to on your dime) – by Adam Thierer (Oct. 14, 2009)
“Internet Freedom”: How Statists Corrupt Our Language – by Berin Szoka (Oct. 27, 2009)



Published on April 11, 2013 18:09

Does CDT believe in Internet freedom?

Last year, in advance of the World Conference on International Telecommunication, Congress passed a concurrent resolution stating its sense that US officials should promote and articulate the clear and unequivocal “policy of the United States to promote a global Internet free from government control and preserve and advance the successful multistakeholder model that governs the Internet today.” This language sailed through the House on a bipartisan basis with broad support from basically everyone in US civil society.



Now that WCIT is over, and the World Telecommunication/ICT Policy Forum looms, Congress is considering a law that reads:



It is the policy of the United States to promote a global Internet free from government control and to preserve and advance the successful multistakeholder model that governs the Internet.


And suddenly it’s controversial. Democrats are concerned that language about freedom “from government control” would apply to—gasp—the US government.



As Rep. Walden says,



Last Congress, we “talked the talk” and passed a resolution defending a global Internet free from government control. This Congress we must “walk the walk” and make it official U.S. policy. If this is a principle we truly believe in, there is no downside to stating so plainly in U.S. law.


I could not agree more.



I am especially disappointed by our friends at CDT. They are coming out against the bill, with both blog post and letter barrels blazing, after having supported the exact same language last year. Apparently, in CDT’s world: US government regulation of the Internet good, foreign government regulation of the Internet bad.



This episode shows the prescience of my colleagues Jerry Brito and Adam Thierer. As they wrote last year when Congress was considering the joint resolution:



The most serious threat to Internet freedom is not the hypothetical specter of United Nations control, but the very real creeping cyber-statism at work in the legislatures of the United States and other nations.


CDT gets this exactly backwards. Here’s hoping they change their minds yet again.




Published on April 11, 2013 07:38

April 10, 2013

Begun the next crypto wars have

The mid-to-late 90s saw the crypto wars, probably the Internet’s first major victory against government attempts to control information online. At stake was the public’s right to use strong encryption, which facilitates commerce and allows individuals to maintain their personal privacy, but which government feared would allow “drug lords, spies, terrorists and even violent gangs to communicate about their crimes and their conspiracies with impunity,” as FBI Director Louis Freeh told the Senate Judiciary Committee in 1997. In the end, popular opinion overwhelmed government efforts to include back doors in publicly available encryption, which might as well have been no encryption at all.
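The claim that back-doored encryption "might as well have been no encryption at all" can be made concrete with a toy sketch: under key escrow, the government holds a copy of the key, so the escrow holder decrypts with exactly as little effort as the intended recipient. The XOR construction, key names, and messages below are invented for illustration; this is not a real cipher or any scheme actually proposed in the 90s debates.

```python
import hashlib
from itertools import count

def keystream(key: bytes):
    # Derive an endless byte stream by hashing key || counter.
    # Toy construction for illustration only -- not a real cipher.
    for i in count():
        yield from hashlib.sha256(key + i.to_bytes(8, "big")).digest()

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # XOR is its own inverse, so the same function encrypts and decrypts.
    return bytes(b ^ k for b, k in zip(data, keystream(key)))

user_key = b"chosen by the user"
escrowed_key = user_key  # key escrow: the authorities hold an exact copy

ciphertext = xor_cipher(user_key, b"attack at dawn")

# The intended recipient decrypts...
assert xor_cipher(user_key, ciphertext) == b"attack at dawn"
# ...but so does anyone holding the escrowed copy, with no cryptanalysis at all.
assert xor_cipher(escrowed_key, ciphertext) == b"attack at dawn"
```

Against the escrow holder, the "encryption" provides no protection whatsoever, which is exactly the point the crypto-wars opponents of back doors were making.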



Leading the charge in the crypto wars were the cypherpunks, many of whom were radical libertarians who predicted that privacy and anonymity powered by strong encryption would fundamentally shift the balance of power between individuals and the state. For example, in this paper (also from 1997) Tim May, one of the cypherpunks’ founders, describes some of the social implications of “untraceable digital cash”:




Some of these “marginal” uses are terrible to consider. Extortion, kidnapping, and even murder contracts become easier to set up. Extortion, for example, becomes almost unstoppable at the usual place: the collection of a payoff and/or the spending of the payoff money. The extortionist makes his threat from the safety of his home PC, using networks of remailers and message pools, and demands payment in untraceable digital cash… .



Similar to extortion are markets for kidnappings (riskier, due to the physical act), and even untraceable markets for murders. For murder contracts, the usual risk is in setting up the hit—asking around is almost a guaranteed way of getting the FBI involved, and advertising in traceable ways is a similar invitation. This risk is largely removed when anonymous contact and payment methods are used. To ensure the job is completed, third party escrow services—anonymous, of course, but with an established cyberspatial reputation—hold the digital cash until completion.




The thing is, untraceable digital cash has not been a reality until now. Over at Reason, I write that while much of the discussion about Bitcoin is focused on whether the virtual currency has all the attributes of money and whether it can ever be a viable alternative to state-backed fiat currency, its real revolutionary potential is as untraceable digital cash.




Time will tell whether the gold bugs or the skeptics are right, but what’s being overlooked is that it doesn’t matter whether Bitcoin makes it as a store of value or a unit of account for it to work as a medium of exchange. Even if the Bitcoin market remains volatile and never pans out as a good store of value or unit of account, one can imagine users converting their dollars or euros to bitcoins for just long enough to make a transaction; perhaps just minutes. And as long as it works as a medium of exchange, it is the true digital cash that was missing from the cypherpunks’ predictions.



With a little bit of effort, today you can purchase bitcoins anonymously with physical cash. You could then do all sorts of things the government doesn’t want you to do. You could buy illegal drugs on the notorious Silk Road, an encrypted website that has been operating with impunity for the past two years facilitating annual sales estimated at almost $15 million. You could gamble at various casinos or prediction markets, buy contraband Cuban cigars, or even give money to WikiLeaks. Dissidents in Iran or China can use Bitcoin to buy premium blogging services from WordPress, which now accepts payment in the currency. Perhaps more importantly, Bitcoin makes the cypherpunks’ predictions of markets for stolen secret information and even assassinations feasible.




I predict that we will soon see another round of the crypto wars. Now that Bitcoin has broken through to at least some public notice, I suspect we will see greater use of the currency and with it greater illicit use. I also suspect we will see the intelligence community, law enforcement, and child safety advocates take greater notice of Bitcoin as an anonymous payment processor. (Indeed, you can glean from this speech by the director of the Financial Crimes Enforcement Network that they see decentralized virtual currencies like Bitcoin as “emerging payment systems.”) And I suspect that traditional payment processors, who might be in competition with Bitcoin, will take notice as well. If these stars align, I imagine we will see public calls to “do something” about Bitcoin.



Although Bitcoin’s decentralized nature makes it difficult to regulate, its ecosystem (and even the network itself) is not impervious to attack. Those of us who see the benefits, and not just the costs, of digital cash should begin preparing for this likely confrontation.




Published on April 10, 2013 10:08
