Adam Thierer's Blog, page 83

October 11, 2012

Pandora vs. FM: A case of picking winners and losers

A few weeks ago I wrote an intentionally provocative post comparing copyright to Solyndra. My argument was that just as Congress has a knowledge problem and a public choice problem when it picks which technologies to subsidize, it has the same problems when it picks winners and losers in setting out the contours of copyright.



I’m grateful for all the wonderful feedback I got on that post, and I agree with those who pointed out that a problem with my analogy was that unlike subsidies to Solyndra, copyright doesn’t pick particular politically connected individuals or companies to privilege. I think it’s much more accurate to say that Congress can use copyright to privilege certain classes of well-organized industries or companies.



A case in point that shows how Congress picks winners and losers is being debated right now: the framework that governs digital music broadcasting royalties for satellite radio and internet radio stations like Pandora. Today you pay a different royalty rate for playing a sound recording (like the latest LMFAO opus) depending on what kind of radio station you are. Satellite radio stations pay 6 to 8 percent of their gross revenues each year in royalties. Pandora, however, pays around 50 percent, and it will likely be more next year. Meanwhile, traditional AM and FM radio stations pay nothing, zip, zero, zilch.
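To make the disparity concrete, here is a toy calculation of what three hypothetical stations, each with the same gross revenue, would pay for the very same input (sound recordings) under the three regimes. The revenue figure is invented; the rates are the ones cited above.

```python
# Hypothetical stations, each with $1M in annual gross revenue, paying for
# the same input under the three regimes described above.
GROSS_REVENUE = 1_000_000

royalty_rates = {
    "AM/FM broadcast": 0.00,   # pays nothing for sound recordings
    "satellite radio": 0.07,   # roughly 6-8% of gross revenues
    "internet radio": 0.50,    # Pandora pays around 50%
}

royalties = {station: GROSS_REVENUE * rate
             for station, rate in royalty_rates.items()}

for station, paid in royalties.items():
    print(f"{station}: ${paid:,.0f}")
```

Same song, same listener, wildly different price tags depending on the delivery technology.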



I won’t get into the public choice problems that may have led to this situation, but the fact is that one legacy industry is not just being subsidized with free access to an essential input (sound recordings), but it is also protected. That protection comes at the expense of a new and innovative industry–internet radio–that is being charged punishing rates for the same essential input.



My colleague Matt Mitchell recently published an excellent paper entitled The Pathology of Privilege: The Economic Consequences of Government Favoritism that catalogs the different ways government has favored particular industries. He also shows how this behavior “misdirects resources, impedes genuine economic progress, breeds corruption, and undermines the legitimacy of both the government and the private sector.” The uneven playing field in the digital music space fits right in with the type of privilege he discussed.



Conservatives and libertarians who are wary of such government extensions of privilege should keep their eye on copyright as the source of many such imbalances.




Published on October 11, 2012 13:29

October 10, 2012

Announcing TechWire, a new tech policy news and views aggregator

For some time now I’ve been trying to teach myself how to program. I’m proficient in HTML and CSS, and I could always tinker around the edges of PHP, but I really couldn’t code something from scratch to save my life. Well, there’s no better way to learn than by doing, at least when it comes to programming, so I gave myself a project to complete and, by golly, I did it. It’s called TechWire. It’s a tech policy news aggregator and I’m making it available on the web because I think it might be useful to other tech policy nerds.





Quite simply, it’s a semi-curated, reverse-chronological list of the most recent tech policy news with links to the original sources. And it’s not just news stories. You’ll also see opinion columns, posts from the various policy shops around town, new scholarly papers, and new books on tech policy. You can also drill down into a single category to see just the latest news, opinions, or papers. Leave the page open in a tab and the site auto-refreshes as new items come in. Alternatively, there is a Twitter feed at @TechWireFTW with the latest.
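TechWire’s actual source isn’t posted here, but the core behavior described above — merging items from several categorized feeds into one reverse-chronological list, with per-category drill-down — can be sketched in a few lines of Python. The items and categories below are invented for illustration.

```python
from datetime import datetime

# Toy stand-in for an aggregator's item store: entries from several
# categorized feeds (news, opinion, papers). All items here are made up.
items = [
    {"title": "FCC proposes new rules", "category": "news",
     "published": datetime(2012, 10, 10, 9, 30)},
    {"title": "Why data caps make sense", "category": "opinion",
     "published": datetime(2012, 10, 10, 11, 0)},
    {"title": "Usage-based pricing working paper", "category": "papers",
     "published": datetime(2012, 10, 9, 8, 0)},
]

def front_page(items, category=None):
    """Return items newest-first; optionally filter to one category."""
    selected = [i for i in items
                if category is None or i["category"] == category]
    return sorted(selected, key=lambda i: i["published"], reverse=True)

for item in front_page(items):
    print(item["published"], item["title"])
```

The real site presumably polls RSS feeds and re-renders on a timer; this sketch only shows the merge-and-filter step at the heart of that loop.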



Other features include the ability to look up the news for a particular day in the past, as well as to click on a story to see what other stories are related. That’s especially useful if you want to see how different outlets are covering the same issue or how an issue has developed over time. Just click the linked timestamp at the end of a story to see related posts.



I hope TechWire (at http://techwireftw.com/) is as useful to you as it was fun for me to code. I’d like to thank some folks who really helped me along the way: Pete Snyder and Eli Dourado for putting up with my dumb questions, Cord Blomquist for great hosting and serene patience, Adam Thierer for being the best QA department I could wish for, and my wife Kathleen for putting up with me staring at the computer for hours.




Published on October 10, 2012 10:12

October 9, 2012

Important new paper on data caps and usage-based pricing by Daniel Lyons



Today the Mercatus Center at George Mason University has released a new working paper by Boston College Law School Professor Daniel Lyons entitled, “The Impact of Data Caps and Other Forms of Usage-Based Pricing for Broadband Access.”



There’s been much hand-wringing about fixed and mobile broadband services increasingly looking to move to usage-based pricing or to impose data caps. Some have even suggested an outright ban on the practice. As Adam Thierer has catalogued in these pages, the ‘net neutrality’ debate has in many ways been leading to this point: pricing flexibility vs. price controls.



In his new paper, Lyons explores the implications of this trend toward usage-based pricing. He finds that data caps and other forms of metered consumption are not inherently anti-consumer or anticompetitive.




Rather, they reflect different pricing strategies through which a broadband company may recover its costs from its customer base and fund future infrastructure investment. By aligning costs more closely with use, usage-based pricing may effectively shift more network costs onto those consumers who use the network the most. Companies can thus avoid forcing light Internet users to subsidize the data-heavy habits of online gamers and movie torrenters. Usage-based pricing may also help alleviate network congestion by encouraging customers, content providers, and network operators to use broadband more efficiently.
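The cross-subsidy point in the passage above can be illustrated with a back-of-the-envelope calculation. All of the numbers here are invented; the point is only the direction of the transfer under each pricing scheme.

```python
# Under flat-rate pricing, light and heavy users pay the same bill even
# though heavy users drive most of the network cost. Numbers are
# illustrative only.
COST_PER_GB = 0.25          # assumed network cost per gigabyte delivered
usage_gb = {"light user": 10, "average user": 60, "heavy user": 400}

total_cost = sum(gb * COST_PER_GB for gb in usage_gb.values())
flat_bill = total_cost / len(usage_gb)          # everyone splits the bill
metered_bill = {u: gb * COST_PER_GB for u, gb in usage_gb.items()}

for user, gb in usage_gb.items():
    print(f"{user} ({gb} GB): flat ${flat_bill:.2f} "
          f"vs. metered ${metered_bill[user]:.2f}")
```

Both schemes recover the same total cost; metering simply moves the bill toward the users who generate it.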



Opponents of usage-based pricing have noted that data caps may be deployed for anticompetitive purposes. But data caps can be a problem only when a firm with market power exploits that power in a way that harms consumers. Absent a specific market failure, which critics have not yet shown, broadband providers should be free to experiment with usage-based pricing and other pricing strategies as tools in their arsenal to meet rising broadband demand. Public policies allowing providers the freedom to experiment best preserve the spirit of innovation that has characterized the Internet since its inception.




Lyons does a magnificent job of walking the reader through every aspect of the usage-based pricing issue, its benefits as a cost-recovery and congestion management tool, and its potential anticompetitive effects. “Ultimately, data caps and other pricing strategies are ways that broadband companies can distinguish themselves from one another to achieve a competitive advantage in the marketplace,” he concludes. “When firms experiment with different business models, they can tailor services to niche audiences whose interests are inadequately satisfied by a one-size-fits-all flat-rate plan. Absent anticompetitive concerns, public policy should encourage companies to experiment with different pricing models as a way to compete against one another.”




Published on October 09, 2012 09:09

Scott Shackelford on cybersecurity and polycentric governance


Scott Shackelford, assistant professor of business law and ethics at Indiana University, and author of the soon-to-be-published book Managing Cyber Attacks in International Law, Business, and Relations: In Search of Cyber Peace, explains how polycentric governance could be the answer to modern cybersecurity concerns.



Shackelford originally began researching collective action problems in physical commons, including Antarctica, the deep seabed, and outer space, where he discovered the efficacy of polycentric governance in addressing these issues. Noting the similarities between these communally owned resources and the Internet, Shackelford was drawn to the idea of polycentric governance as a solution to the collective action problems he identified in the online realm, particularly when it came to cybersecurity.



Shackelford contrasts the bottom-up form of governance characterized by self-organization and networking regulations at multiple levels to the increasingly state-centric approach prevailing in forums like the International Telecommunication Union (ITU).  Analyzing the debate between Internet sovereignty and Internet freedom through the lens of polycentric regulation, Shackelford reconceptualizes both cybersecurity and the future of Internet governance.









Related Links


Managing Cyber Attacks in International Law, Business, and Relations: In Search of Cyber Peace, by Shackelford

Toward Cyber Peace: Managing Cyber Attacks Through Polycentric Governance, by Shackelford

Is There a Cybersecurity Market Failure?, by Eli Dourado




Published on October 09, 2012 03:00

October 2, 2012

Book Review: Christopher Yoo’s “The Dynamic Internet”

Looking for a concise overview of how Internet architecture has evolved and a principled discussion of the public policies that should govern the Net going forward? Then look no further than Christopher Yoo’s new book, The Dynamic Internet: How Technology, Users, and Businesses are Transforming the Network. It’s a quick read (just 140 pages) and is worth picking up. Yoo is a Professor of Law, Communication, and Computer & Information Science at the University of Pennsylvania and also serves as the Director of the Center for Technology, Innovation & Competition there. For those who monitor ongoing developments in cyberlaw and digital economics, Yoo is a well-known and prolific intellectual who has established himself as one of the giants of this rapidly growing policy arena.



Yoo makes two straightforward arguments in his new book. First, the Internet is changing. In Part 1 of the book, Yoo offers a layman-friendly overview of the changing dynamics of Internet architecture and engineering. He documents the evolving nature of Internet standards, traffic management and congestion policies, spam and security control efforts, and peering and pricing policies. He also discusses the rise of peer-to-peer applications, the growth of mobile broadband, the emergence of the app store economy, and what the explosion of online video consumption means for ongoing bandwidth management efforts. Those are the supply-side issues. Yoo also outlines the implications of changes on the demand side of the equation, such as changing user demographics and rapidly evolving demands from consumers. He notes that these new demand-side realities of Internet usage are resulting in changes to network management and engineering, further reinforcing changes already underway on the supply side.



Yoo’s second point in the book flows logically from the first: as the Internet continues to evolve in such a highly dynamic fashion, public policy must as well. Yoo is particularly worried about calls to lock in standards, protocols, and policies from what he regards as a bygone era of Internet engineering, architecture, and policy. “The dramatic shift in Internet usage suggests that its founding architectural principles from the mid-1990s may no longer be appropriate today,” he argues. (p. 4) “[T]he optimal network architecture is unlikely to be static. Instead, it is likely to be dynamic over time, changing with the shifts in end-user demands,” he says. (p. 7) Thus, “the static, one-size-fits-all approach that dominates the current debate misses the mark.” (p. 7)



Yoo makes a particularly powerful case for flexible network pricing policies. His outstanding chapter on “The Growing Complexity of Internet Pricing” offers an excellent overview of the changing dynamics of pricing in this arena and explains why experimentation with different pricing methods and business models must be allowed to continue. Getting pricing right is essential, Yoo notes, if we hope to ensure ongoing investment in new networks and services. He also notes how foolish it is to expect the government to come in and save the day through massive infrastructure investment to cover the hundreds of billions of dollars needed to continue to build out high-speed services:



Most industry and political observers believe that the federal government will not be in a position to allocate that amount of money to upgrade our nation’s broadband infrastructure for the foreseeable future. The next-generation network will thus be built by private enterprise. But private corporations cannot be expected to undertake such investments unless they have a reasonable prospect of recovering their upfront costs from consumers who are using the increased bandwidth and other enhancements to the existing network. (p. 102)


Again, that’s why flexible pricing policies and ongoing experimentation with various business models are vital. This insight is particularly timely in light of the recent renewed interest in data caps. A lot of people who don’t know a lick about economics and have never run a real business in their lives are seemingly obsessed with telling private operators how to run theirs. If the Net neutrality wars devolve into a battle over price controls — exactly as I predicted they would 7 years ago this month — then we could be headed for a day when federal policymakers derail the advances in broadband we’ve seen in recent years by substituting mandates for markets.



Throughout the second half of his book, Yoo explains why that would be a disaster for consumers and high-tech innovation. To most of us, the arguments Yoo advances here are perfectly logical, but to many Ivory Tower intellectuals who dominate Net policy debates today, it will all be considered apostasy of the very highest order. Those who elevate Net neutrality and so-called “public interest” regulation to quasi-religious concepts will likely be constructing Christopher Yoo voodoo dolls and attempting to sew his mouth shut. Yet the policy standard Yoo is advancing here is perfectly logical. In essence, he’s trying to counter the gradual growth of a Precautionary Principle mindset for Internet policy. Here’s how he puts it:



Just as engineers must design structures that preserve room for experimentation, so must regulators. In particular, regulators should avoid promulgating policies that foreclose certain technical approaches or require industry actors to obtain advance approval before they can experiment with new technological solutions. The benefits of most practices will remain ambiguous before they are deployed, and placing the burden on industry actors to prove consumer benefit before implementation would chill experimentation and effectively prevent ambiguous practices from ever being deployed. This in turn would prevent engineers from obtaining the real-world experience they need to evaluate different technological solutions and eliminate the breathing room on which technological progress depends. In the face of uncertainty, policymakers should not attempt to predict which particular network solution will ultimately prevail; rather, they ought to focus on creating regulatory structures that give industry participants the freedom to pursue a wide range of business strategies and allow consumers to decide which one (or ones, if consumer demand is sufficiently diverse to support multiple business models targeted at different market niches) ultimately proves to be the best. (p. 8)


In other words, public policy must not restrict experimentation based on conjectural fears and boogeyman scenarios. Public policy should generally seek to avoid ex ante forms of preemptive, prophylactic Internet regulation and instead rely on an ex post approach when and if things go wrong. As I have argued here many times before, as a general rule, our policymakers should embrace “techno-agnosticism” toward ongoing debates over standards, protocols, business models, pricing methods, and so on. Lawmakers should not be preemptively tilting the balance in one direction or the other or, worse yet, restricting experimentation that can help us find superior solutions. Here’s how Yoo articulates this same principle of techno-agnosticism:



network engineering is inherently an exercise in tradeoffs that does not lend itself to broad generalizations. There is no such thing as a perfect, inherently superior architecture. Instead, the optimal infrastructure for any particular network depends on the nature of the flows passing through the network as well as the costs of the technologies comprising the network. This perspective stands in stark contrast to the categorical tone that has dominated debates over Internet policy for the past five years. (p. 138)


Indeed it does. If you read through books by Zittrain, Lessig, Wu, van Schewick, Frischmann, and others, you will notice the consistent assertion that we already have the magic formula for the Internet and all networks, for that matter. It almost always comes down to what I have referred to as an ideology of “openness at any cost” or “neutrality uber alles.” In this religion, everything is subservient to openness and neutrality, no matter what the cost (and no matter how defined, even if that is much trickier than those academics let on). But for all the reasons Yoo lays out in his book, we should reject neutrality uber alles as the basis of public policy. “The shifts in the technological and economic environment surrounding the network should remind everyone involved in Internet policy of the importance of embracing change.” (p. 139).  Again, that counsels techno-agnosticism and light-touch, responsive regulation — not a preemptive Precautionary Principle for Internet decision-making. As Yoo states in his conclusion:



Perhaps the best means for creating such an environment is to create a regulatory-enforcement regime that evaluates any charges of improper behavior on a case-by-case basis after the fact… So long as the burden of proof is placed on the party challenging the practice, such a regime should provide sufficient breathing room for industry participants to experiment with new solutions for emerging problems while simultaneously safeguarding consumers against any anticompetitive practices. (p. 139).


And even under that regime, Yoo makes it clear throughout the book that there should be a very high bar established before regulation is pursued. This is particularly true because of the First Amendment values at stake when the government attempts to regulate speech platforms. In Chapter 9 of the book, Yoo walks the reader through all the relevant case law on this front and makes it clear how “the Supreme Court has repeatedly recognized that the editorial discretion exercised by intermediaries serves important free speech values.” (p. 120). Yoo also makes the case that a certain degree of intermediation helps serve consumer needs by helping them more easily find the content and services they desire. Law should not seek to constrain that and, under current Supreme Court First Amendment jurisprudence, it probably cannot.



So, in conclusion, I strongly encourage everyone to pick up a copy of Christopher Yoo’s Dynamic Internet. It strikes just the right balance for Net governance and public policy in the information age. It all comes down to flexibility and freedom.  If the Internet and all modern digital technologies are to thrive, we must reject the central planner’s mindset that dominated the analog era and forever bury all the static thinking it entailed.



Additional Reading:




The Real Net Neutrality Debate: Pricing Flexibility Versus Pricing Regulation (Oct 2005)
More on Net Neutrality, the Importance of Business Model Experimentation & Pricing Flexibility (May 2012)
Smartphones & Usage-Based Pricing: Are Price Controls Coming? (July 2011)
Netflix Falls Prey to Marginal Cost Fallacy & Pleads for a Broadband Free Ride (July 8, 2011)
Why Congestion Pricing for the iPhone & Broadband Makes Sense (October 7, 2009)
The (Un)Free Press Calls for Internet Price Controls: “The Broadband Internet Fairness Act” (June 17, 2009)
Free Press Hypocrisy over Metering & Internet Price Controls (June 18, 2009)
Mueller’s “Networks and States” = Classical Liberalism for the Information Age (Nov. 2010)
Doctorow’s Definition of “Techno-Optimism” Is Full of Fear & False Choices (May 2011)
review of Zittrain’s “Future of the Internet” (March 2008)
Code, Pessimism, and the Illusion of “Perfect Control” [review of Lessig's "Code" at 10th anniversary] (May 2009)



Published on October 02, 2012 11:13

Dan Provost on indie capitalism


Designer Dan Provost, co-founder of the indie hardware and software company Studio Neat, and co-author of It Will Be Exhilarating: Indie Capitalism and Design Entrepreneurship in the 21st Century, discusses how technological innovation helped him build his business. Provost explains how he and his co-founder Tom Gerhardt were able to rely on crowdfunding to finance their business. Avoiding loans or investors, he says, has allowed them to more freely experiment and innovate. Provost also credits 3D printing for his company’s success, saying their hardware designs–very popular tripod mounts for the iPhone and a stylus for the iPad–would not have been possible without the quick-prototyping technology.






Related Links


Studio Neat

Kickstarter

It Will Be Exhilarating: Indie Capitalism and Design Entrepreneurship in the 21st Century , by Provost and Gerhardt




Published on October 02, 2012 03:00

September 25, 2012

Vinton Cerf on U.N. regulation of the internet



Vinton Cerf, one of the “fathers of the internet,” discusses what he sees as one of the greatest threats to the internet—the encroachment of the United Nations’ International Telecommunication Union (ITU) into the internet realm. ITU member states will meet this December in Dubai to update international telecommunications regulations and consider proposals to regulate the net. Cerf argues that, as the face of telecommunications is changing, the ITU is attempting to justify its continued existence by expanding its mandate to include the internet. Cerf says that the business model of the internet is fundamentally different from that of traditional telecommunications, and as a result, the ITU’s regulatory model will not work. In place of top-down ITU regulation, Cerf suggests that open multi-stakeholder processes and bilateral agreements may be better solutions to the challenges of governance on the internet.






Related Links


Keep the Internet Open, by Cerf
Information about the World Conference on International Telecommunications (WCIT), Internet Society
WCITLeaks.Org



Published on September 25, 2012 08:52

September 24, 2012

Event Tomorrow: Bronwyn Howell on Regulating Broadband Networks

Tomorrow the Information Economy Project at George Mason University will present the latest installment of its Tullock Lecture series, featuring Dr. Bronwyn Howell of the New Zealand Institute for the Study of Competition and Regulation. Here is the notice:



Dr. Bronwyn Howell – Tuesday, Sept. 25, 2012
New Zealand Institute for the Study of Competition and Regulation
4:00 to 5:30 pm @ Founder’s Hall Room 111, GMU School of Law, 3301 Fairfax Drive, Arlington, Va.
Reception to follow in the Levy Atrium, 5:30-6:30 pm.
Admission is free but seating is limited.



“Regulating Broadband Networks: The Global Data for Evidence-Based Public Policy:” Policy makers in the U.S. and around the world are wrestling with “the broadband problem” – how to get advanced forms of Internet access to businesses and consumers. A variety of regulatory approaches have been used, some focusing on incentives to drive deployment of rival networks, others on network sharing mandates or government subsidies. Despite a wealth of diverse experience, there seems to be a great deal of confusion about what the data actually suggest. Few people have studied these data more carefully, however, than New Zealand economist Bronwyn Howell, who will frame the lessons of the global broadband marketplace. Prof. Howell will be introduced by Dr. Scott Wallsten, Senior Fellow at the Technology Policy Institute, who served as Economics Director for the FCC’s National Broadband Plan. RSVP online here or by email to iep.gmu@gmail.com.




Published on September 24, 2012 09:57

September 23, 2012

Sorry broadcasters, I have little sympathy for your copyright claims

Ryan Radia recently posted an impassioned and eminently reasonable defense of copyright with which I generally agree, especially since he acknowledges that “our Copyright Act abounds with excesses and deficiencies[.]” However, Ryan does this in the context of defending broadcaster rights against internet retransmitters, such as ivi and Aereo, and I have a bone to pick with that. He writes,



[Copyright] is why broadcasters may give their content away for free to anybody near a metropolitan area who has an antenna and converter box, while simultaneously preventing third parties like ivi from distributing the same exact content (whether free of charge or for a fee). At first, this may seem absurd, but consider how many websites freely distribute their content on the terms they see fit. That’s why I can read all the Techdirt articles I desire, but only on Techdirt’s website. If copyright protection excluded content distributed freely to the general public, creators of popular ad-supported content would soon find others reproducing their content with fewer ads.



I think what Ryan is missing is that copyright is not why broadcasters give away their content for free over the air. The real reason is that they are required to do so as a condition of their broadcast license. In exchange for free access to one of the main inputs of their business–spectrum–broadcasters agree to make their signal available freely to the public. Also, the fact that TV stations broadcast to metro areas (and not regionally or nationally) is not the product of technical limitations or business calculus, but because the FCC decided to only offer metro-sized licenses in the name of “localism.” That’s not a system I like, but it’s the system we have.



So, if what the public gets for giving broadcasters free spectrum is the right to put up an antenna and grab the signals without charge, why does it matter how they do it? To me a service like Aereo is just an antenna with a very long cable to one’s home, just like the Supreme Court found about CATV systems in Fortnightly. What broadcasters are looking to do is double-dip. They want free spectrum, but then they also want to use copyright to limit how the public can access their over-the-air signals. To address Ryan’s analogy from above, Techdirt is not like a broadcaster because it isn’t getting anything from the government in exchange for a “public interest” obligation.



Ideally, of course, spectrum would be privatized. In that world I think we’d see little if any ad-supported broadcast TV because there are much better uses for the spectrum. If there was any broadcast TV, it would be national or regional as there is hardly any market for local content. And the signal would likely be encrypted and pay-per-view, not free over-the-air. In such a world the copyright system Ryan favors makes sense, but that’s not the world we live in. As long as the broadcasters are getting free goodies like spectrum and must-carry, their copyright claims ring hollow.




Published on September 23, 2012 10:25

