Adam Thierer's Blog, page 84
September 21, 2012
The Problem with API Neutrality
I’ve been hearing more rumblings about “API neutrality” lately. This idea, which originated with Jonathan Zittrain’s book, The Future of the Internet–And How to Stop It, proposes to apply Net neutrality to the code/application layer of the Internet. A blog called “The API Rating Agency,” which appears to be written by Mehdi Medjaoui, posted an essay last week endorsing Zittrain’s proposal and adding some meat to the bones of it. (My thanks to CNet’s Declan McCullagh for bringing it to my attention).
Medjaoui is particularly worried about some of Twitter’s recent moves to crack down on 3rd party API uses. Twitter is trying to figure out how to monetize its platform and, in a digital environment where advertising seems to be the only business model that works, the company has decided to establish more restrictive guidelines for API use. In essence, Twitter believes it can no longer be a perfectly open platform if it hopes to find a way to make money. The company apparently believes that some restrictions will need to be placed on 3rd party uses of its API if the firm hopes to be able to attract and monetize enough eyeballs.
While no one is sure whether that strategy will work, Medjaoui doesn’t even want the experiment to go forward. Building on Zittrain, he proposes the following approach to API neutrality:
Absolute data non-discrimination for third parties: all content, data, and views are distributed equally across the third-party ecosystem. Even a competitor could use an API under the same conditions as everyone else, with no restrictions on re-use of the data.
Limited discrimination without tiering: if you don’t pay specific fees for quality of service, you cannot get better quality of service (rate limits, quotas, SLAs) than anyone else in the API ecosystem. If you do pay for a higher level of quality of service, you benefit from that higher level, but on the same terms as any other customer paying the same fee.
First come, first served: no enqueuing of API calls from paying third-party applications ahead of free third parties that stay within their rate limits.
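To see what the second and third principles would demand of a provider, here is a minimal sketch, in Python, of a strictly “neutral” API gateway: every key in a given pricing tier gets an identical rate limit, and pending calls are served strictly in arrival order, with no priority lane for paying applications. The tier names, limits, and class are illustrative assumptions of mine, not anything Medjaoui or Twitter has actually specified.

```python
import time
from collections import deque

# Hypothetical tier definitions: every key in a tier gets the identical ceiling
# ("limited discrimination without tiering"). Numbers are illustrative only.
TIER_LIMITS = {"free": 15, "paid": 180}  # max calls per 15-minute window


class NeutralGateway:
    def __init__(self):
        self.windows = {}     # api_key -> (window_start, call_count)
        self.queue = deque()  # pending calls, strictly first come, first served

    def allow(self, api_key, tier, now=None):
        """Return True if this call fits within the tier's uniform rate limit."""
        now = time.time() if now is None else now
        start, count = self.windows.get(api_key, (now, 0))
        if now - start >= 900:            # reset the 15-minute window
            start, count = now, 0
        if count >= TIER_LIMITS[tier]:
            return False                  # same ceiling for every key in the tier
        self.windows[api_key] = (start, count + 1)
        return True

    def enqueue(self, api_key, tier, request):
        # No priority lanes: a paying caller waits behind earlier free callers.
        self.queue.append((api_key, tier, request))

    def next_request(self):
        return self.queue.popleft() if self.queue else None
```

The point of the sketch is how little discretion it leaves the provider: no special terms for partners, no throttling of competitors, no premium queue to sell. That is exactly the discretion Twitter is now trying to exercise, and exactly what a neutrality mandate would take away.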
Before I critique this, let’s go back and recall why Zittrain suggested we might need API neutrality for certain online services or digital platforms. Although Zittrain does not label it as such, API neutrality assumes the platform or device in question is a sort of public utility or common carrier. Zittrain is concerned that the absence of API neutrality could imperil “generativity,” his term for technologies or networks that invite or allow tinkering and all sorts of creative secondary uses. Primary examples include general-purpose personal computers (PCs) and the traditional “best efforts” Internet. By contrast, Zittrain contemptuously refers to “tethered, sterile appliances,” or digital technologies or networks that discourage or disallow tinkering. Zittrain’s primary examples are proprietary devices like Apple’s iPhone or the TiVo, or online walled gardens like the old AOL and current cell phone networks. Such “take it or leave it” devices or platforms earn Zittrain’s wrath. He argues that we run the risk of seeing the glorious days of generative devices and the open Internet give way to those tethered appliances and closed networks. He fears that most users will flock to tethered appliances in search of stability or security, and he worries that, because those tethered appliances are less “open” and more “regulable,” they allow easier control by either large corporate intermediaries or government officials. In other words, the “future of the Internet” Zittrain is hoping to “stop” is a world dominated by tethered digital appliances and walled gardens, because they are too easily controlled by other actors. He argues:
If there is a present worldwide threat to neutrality in the movement of bits, it comes not from restrictions on traditional Internet access that can be evaded using generative PCs, but from enhancements to traditional and emerging appliancized services that are not open to third-party tinkering.
Because he fears the rise of “walled gardens” and “mediated experiences,” Zittrain goes on to wonder, “Should we consider network neutrality-style mandates for appliancized systems?” He responds to his own question as follows:
The answer lies in that subset of appliancized systems that seeks to gain the benefits of third-party contributions while reserving the right to exclude it later. . . . Those who offer open APIs on the Net in an attempt to harness the generative cycle ought to remain application-neutral after their efforts have succeeded, so all those who built on top of their interface can continue to do so on equal terms. (p. 183-4)
While many would agree that API neutrality represents a fine generic norm for online commerce and interactions, Zittrain implies it should be a legal standard to which online providers are held. He even alludes to the possibility of applying the common law principle of adverse possession more broadly in these contexts. He notes that adverse possession “dictates that people who openly occupy another’s private property without the owner’s explicit objection (or, for that matter, permission) can, after a lengthy period of time, come to legitimately acquire it.” (p. 183) He does not make it clear when that principle would be triggered as it pertains to digital platforms or social media APIs. But it would seem clear that his API neutrality rule would eventually regulate the major information providers and platforms of our day, including: Apple, Google, Twitter, Facebook, and many others.
As I argued in my paper, “The Perils of Classifying Social Media Platforms as Public Utilities,” API neutrality regulation is a dangerous notion. There are many problems with the logic of Zittrain’s API neutrality proposal and with the application of adverse possession to social media platforms or digital applications. What follows below is my critique of the notion that appeared in that paper, and it also explains why Medjaoui’s new formulation and clarification of the principle is equally problematic.
First, most developers who offer open APIs are unlikely to close them later because they do not want to incur the wrath of “those who built on top of their interfaces,” to use Zittrain’s parlance. Social media services make themselves more attractive to users and advertisers by providing platforms with plentiful opportunities for diverse interactions and innovations. The “walled gardens” of the Internet’s first generation are largely things of the past. Thus, a powerful self-correcting mechanism is at work in this space. If social media operators were to lock down their platforms or applications in a highly restrictive fashion, both application developers and average users would likely revolt. Moreover, a move to foreclose or limit generative opportunities could spur more entry and innovation as other application (“app”) developers and users seek out more open, pro-generative alternatives.
Consider an example involving Apple and the iPhone. Shortly after the iPhone’s release, Apple reversed itself and opened its iPhone platform to third-party app developers. The result was an outpouring of innovation. Customers in more than 123 countries had downloaded more than eighteen billion apps from Apple’s App Store at a rate of more than 1 billion apps per month as of late 2011.
But what if Apple decides to suddenly shut its App Store and prohibit all third-party contributions, after initially allowing them? There is no obvious incentive for Apple to do so, and there are plenty of competitive reasons for Apple not to close off third-party development, especially as its application dominance is a key element of Apple’s success in the smartphone and tablet sectors. Under Zittrain’s proposed paradigm, regulators would treat the iPhone as the equivalent of a commoditized common carriage device and force the App Store to operate on regulated, public utility–like terms without editorial or technological (and perhaps interoperability) control by Apple itself. But if Apple were to open the door to developers only to slam it shut a short time later, the company would likely lose those developers and customers to alternative platforms. Google, Amazon, Microsoft, and others would be only too happy to take Apple’s business by offering a wealth of stores and devices that allow users greater freedom. Market choices, not regulatory edicts such as mandatory API neutrality, should determine the future of the Internet.
The same logic indicates the likely counterproductive effects of efforts to impose API neutrality on Twitter. Until recently, Twitter had a voluntary open access policy in that it allowed nearly unlimited third-party reuse and modification of its API. It is now partially abandoning that policy by taking greater control over the uses of its API. This policy reversal will, no doubt, lead to claims that the company is acting like one of Tim Wu’s proverbial “information empires” and that perhaps Zittrain’s API neutrality regime should be put in place as a remedy. Indeed, Zittrain has already referred to Twitter’s move as a “bait-and-switch” and recommended an API neutrality remedy. Zittrain’s actions could foreshadow more pressure from academics and policymakers that will first encourage Twitter to continue open access, but then potentially force the company to grant nondiscriminatory access to its platform on regulated terms. Nondiscriminatory access would represent a step toward the forced commoditization of the Twitter API and the involuntary surrender of the company’s property rights to some collective authority that will manage the platform as a common carrier or essential facility.
Yet again, innovation and competitive entry remain possible in this arena. There is nothing stopping other microblogging or short-messaging services from offering alternatives to Twitter. Some people would decry the potential lack of interoperability among competing services at first, but innovators would quickly find work-arounds. A decade ago, similar angst surrounded AOL’s growing power in the instant-messaging (IM) marketplace. Many feared AOL would monopolize the market and exclude competitors by denying interconnection. Markets evolved quickly, however. Today, anyone can download a free chat client like Digsby or Adium to manage IM services from AOL, Yahoo!, Google, Facebook, and just about any other company, all within a single interface, essentially making it irrelevant which chat service your friends use. These innovations occurred despite a mandate in the conditions of the AOL-Time Warner merger that the post-merger firm provide for IM interoperability. The provision was quietly sunset as irrelevant just three years later.
A similar market response could follow Twitter’s attempt to exert excessive control over its APIs. In web 2.0 markets—that is, markets built on pure code—the fixed costs of investment are orders of magnitude less than they were with the massive physical networks of pipes and towers from the era of analog broadcasting and communications. Thus, major competition for Twitter is more than possible, and it is likely to come from sources and platforms we cannot currently imagine, just as few of us could have imagined something like Twitter developing.
Even if some social media platform owners did want to abandon previously open APIs and move to a sort of walled garden, there is no reason to classify such a move as anticompetitive foreclosure or leveraging of the platform. Marketplace experimentation in search of a sustainable business model should not be made illegal. Since most social media sites such as Twitter do not charge for the services they provide, some limited steps to lock down their platforms or APIs might help them earn a return on their investments by monetizing traffic on their own platforms. If a social media provider had to live under a strict version of Zittrain’s API neutrality principle, however, it might be extremely difficult to monetize traffic and grow its business, since the company would be forced to share its only valuable intellectual property.
In sum, if the government were to forcibly apply API neutrality or adverse possession principles through utility-like regulation, it would send a signal to social media entrepreneurs that their platforms are theirs in name only and could be coercively commoditized once they are popular enough. Such a move would constitute a serious disincentive to future innovation and investment. “API neutrality” would upend the way much of the modern digital economy operates and cripple many of America’s most innovative companies and sectors. In the long run, such changes could sacrifice America’s current role as a global information technology leader. For these reasons, API neutrality mandates should be rejected.
Additional Reading
Twitter, the Monopolist? Is this Tim Wu’s “Threat Regime” In Action?
A Vision of (Regulatory) Things to Come for Twitter?
Review of Zittrain’s “Future of the Internet”







Your Privacy and FCC Broadband Measurement: What You Need to Know About Your Personal Data
Consumers should be aware that “government transparency” also applies to the data consumers voluntarily provide to the FCC when they participate in a government-run broadband measurement program.
The most egregious aspect of these broadband measurement programs, however, is that the FCC kept the public in the dark for more than a year by failing to disclose that its mobile testing apps were collecting user locations (by latitude and longitude) and unique handset identification numbers that the FCC’s contractors can make available to the public.
The Federal Communications Commission (FCC) recently announced a new program to measure mobile broadband performance in the United States. The FCC believes it is “difficult” for consumers to get detailed information about their mobile broadband performance, and that “transparency on broadband speeds drives improvement in broadband speeds.” The FCC does not, however, limit transparency to broadband speeds. Consumers should be aware that “government transparency” also applies to the data consumers voluntarily provide to the FCC when they participate in a government-run broadband measurement program. Information collected by the FCC about individual consumers may be “routinely disclosed” to other federal agencies, states, or local agencies that are investigating or prosecuting a civil or criminal violation. Some personal information, including individual IP address, mobile handset location data, and unique handset identification numbers, may be released to the public.
This blog post describes the FCC’s broadband measurement programs and highlights the personal data that may be disclosed about those who participate in them.
Consumers who wish to participate in an FCC testing program should first read all of the applicable privacy policies to understand how the government and its third-party vendors will use their data. The FCC has not yet determined how it will implement its new mobile broadband measurement program, and is seeking public input about appropriate methodologies for testing mobile performance at an open meeting, which will be held today from 9:30 AM to 11:00 AM Eastern in the FCC’s Meeting Room (TW-C305) at 445 12th Street SW, Washington, DC 20554. The meeting will also be live streamed here. Given the FCC’s poor track record describing the information it actually collects, consumers should consider raising questions regarding the FCC’s privacy policies and testing methodologies at this open meeting.
The FCC’s Broadband Measurement Programs
The FCC’s new mobile testing initiative expands its ongoing “Measuring Broadband America” program. The current program measures the performance of residential wired and wireless broadband service in the United States using several different tests.
Under the “FCC SamKnows Broadband Community” program, the agency tests the broadband connections of residential consumers who volunteer to use a wireless router running custom software provided by SamKnows, a British company retained under contract by the FCC. The wireless routers, known as “White Boxes,” began shipping to U.S. consumers in September 2010. The FCC released its first report on wireline broadband performance in August 2011, and a second report in July 2012. All network traffic generated by participating consumers flows through the White Boxes, which continuously monitor personal consumer data until the participant affirmatively opts out of the program. SamKnows says its “goal is to embed [its] software suite into internet [sic] enabled devices . . . globally.”
In March 2010, the FCC made available a software-based broadband speed test (i.e., a test that does not require hardware supplied by SamKnows) for both wired and wireless Internet access. These tests are collectively known as the “Consumer Broadband Test.” The wireline test is still available in its “beta” version and allows consumers to choose between two testing companies: Ookla and M-Lab. The mobile version, which was developed by Ookla and appears substantially similar to Ookla’s own Speedtest.net app, is available for both the Google Android and Apple iOS operating systems and can be downloaded from their app stores and the federal government’s official web portal. These apps monitor, collect, and report a consumer’s mobile data rates, latency, and user location when initiated on the handset.
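For readers who have never looked under the hood of a speed-test app, here is a rough sketch, in Python using only the standard library, of the two core measurements (latency and downstream data rate) and the kind of record such an app could report. The test URL, field names, and report layout are my own illustrative assumptions, not the FCC’s or Ookla’s actual implementation; note that the raw measurement itself needs none of the location or handset-identifier fields, which ride along as additional metadata.

```python
import time
import urllib.request

# Hypothetical test endpoint; real testing apps use their own measurement servers.
TEST_URL = "https://example.com/100KB.bin"


def measure_latency(url=TEST_URL, tries=3):
    """Approximate round-trip latency by timing small HEAD requests."""
    samples = []
    for _ in range(tries):
        req = urllib.request.Request(url, method="HEAD")
        start = time.perf_counter()
        urllib.request.urlopen(req, timeout=10).close()
        samples.append((time.perf_counter() - start) * 1000)
    return min(samples)  # milliseconds


def measure_download(url=TEST_URL):
    """Approximate downstream data rate by timing a fixed-size download."""
    start = time.perf_counter()
    payload = urllib.request.urlopen(url, timeout=30).read()
    elapsed = time.perf_counter() - start
    return (len(payload) * 8 / 1_000_000) / elapsed  # megabits per second


def build_report(latitude=None, longitude=None, device_id=None):
    # The measurement itself needs only the first three fields; location and a
    # device identifier are extra metadata of the kind the FCC's filings say
    # its mobile apps collect.
    return {
        "latency_ms": round(measure_latency(), 1),
        "download_mbps": round(measure_download(), 2),
        "timestamp": time.time(),
        "latitude": latitude,
        "longitude": longitude,
        "device_id": device_id,
    }
```

Everything beyond those few numbers in the FCC’s system of records is there because the program chooses to record it, which is why the disclosure failures discussed below matter.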
Finally, consumers who do not have broadband Internet access available at their home can submit a “Broadband Dead Zone Reporting Form” to the FCC that includes their home address. The FCC is apparently using these reports to create a “Broadband Dead Zone Registry.”
Why Is the FCC Asking You to Help It Measure Broadband Speeds?
The FCC asserts that its tests are necessary because information regarding broadband performance is not readily available to consumers and could be inaccurate. Now that the FCC has gotten into the broadband speed testing market, however, other providers of these services have begun to question the accuracy of the FCC’s tests and whether there is a need for government testing.
When the FCC announced that it had contracted with SamKnows to measure wireline broadband speeds, Ookla stated that the FCC’s plan to conduct expensive tests that gather small samples of data similar to that which is already widely available in the market “offers little added insight into the discussion of the speed, quality, and availability of broadband connections across the nation and is an unacceptable waste of taxpayer money.” Ookla offers free broadband speed testing for wired and wireless connections at speedtest.net, free broadband quality testing at pingtest.net, and monthly snapshots of global broadband performance at Net Index. In fact, the FCC’s current mobile measurement apps are based on Ookla’s free speedtest.net app.
There are many other commercial sources of broadband performance data available as well. For example, PC Magazine performs annual field tests of mobile broadband speeds in 30 cities nationwide using three vehicles over a three-week period. In its most recent tests, it completed 60,000 test cycles and published test results by region, carrier, and technology (3G and 4G).
Given the readily available commercial alternatives, there is no compelling reason for consumers to participate in the FCC’s broadband measurement programs. There is at least one very good reason, however, for consumers to avoid participating in the FCC’s tests. The tests collect your personal data, and once you’ve volunteered to provide it to the government, it can do virtually anything it wants with it.
The Privacy Act and Government Data Sharing
In its most recent Privacy Act filing, the FCC describes the personal information it maintains about consumers who participate in its broadband measurement programs:
The street address, city, state, and zip code, of each individual who elects to participate in the Broadband Dead Zone Report survey and each individual who participates in both the wired and wireless versions of the Consumer Broadband Test;
The Internet Protocol (IP) address of each individual who elects to participate in both the wired and wireless versions of the Consumer Broadband Test;
The unique handset identification number of each individual’s smartphone used to access the mobile Consumer Broadband Test; and
The location (in latitude and longitude) reported by each user’s handset at the moment the user initiates the mobile Consumer Broadband Test.
The use of this data is not, however, limited to the FCC.
The Privacy Act, 5 U.S.C. § 552a, governs the collection, use, and dissemination of personal information by the federal government and its contractors. The Privacy Act generally prohibits federal agency disclosure of records maintained on individuals, but that prohibition is subject to a number of exceptions. One exception allows a federal agency to share personal data with federal, state, or local agencies for civil or criminal law enforcement activities. Another exception, known as “routine use,” allows the use of personal data for “a purpose [that] is compatible with the purpose for which [the personal data] was created.”
The FCC’s privacy policy and system of records for its broadband measurement programs reveals that the agency makes liberal use of these exceptions. The FCC is routinely sharing this personal data with federal, state, or local agencies whenever “there is an indication of a violation or potential violation of a statute, regulation, rule, or order,” and with the Department of Justice when it is relevant to litigation. There are virtually no limits on the use or disclosure of personal data by the FCC’s contractors.
IP addresses, mobile handset location, and unique handset identification numbers may be shared with FCC software partners as part of the Consumer Broadband Test application. These partners may publish the IP address, mobile handset location, unique handset identification numbers, and broadband performance data, or otherwise make this information available to the public (but the IP address is not associated with a street address).
The FCC is thus allowing its contractors to publicly disclose personal information that the FCC itself cannot publish.
The most egregious aspect of these broadband measurement programs, however, is that the FCC kept the public in the dark for more than a year by failing to disclose that its mobile testing apps were collecting user locations (by latitude and longitude) and unique handset identification numbers that the FCC’s contractors can make available to the public. The FCC first proposed its system of records for broadband measurement on December 30, 2009, when the program was known as the “Broadband Unavailability Survey and Broadband Quality Test.” On April 7, 2010, after it launched the “Consumer Broadband Test” (including its mobile testing apps), the FCC revised its system of records to reflect the broadband measurement program it had actually implemented, with one critical exception: The FCC’s system of records did not disclose that its mobile testing apps collect user locations and unique handset identification numbers. The FCC also failed to flag the collection of this data in its Privacy Threshold Analysis, which was conducted in May 2010.
Although the FCC finally disclosed the collection of this data in a formal filing on July 14, 2011, the FCC’s privacy policy still fails to mention the collection of mobile handset locations and unique handset identification numbers. This information is also unavailable in the Android and iOS app stores or on USA.gov, where consumers are most likely to download the mobile apps.
Representative Ed Markey, the co-chairman of the Congressional Privacy Caucus, believes “Consumers should know and have the choice to say no to software on their mobile devices that is transmitting their personal and sensitive information.” He introduced a bill this month that would require private companies to disclose the type of information mobile monitoring software would collect, where it would be sent, and how it would be used. The bill would also require that companies obtain consent from consumers before using sensitive information and – ironically – that such agreements be filed with the FCC and FTC. Markey may want to reconsider whether the FCC should be given authority to oversee mobile privacy agreements given its own failures to disclose the type of data it collects and its acquiescence to its vendors’ demands to publicly disclose sensitive information.
Consumers who volunteer to participate in a government program deserve accurate and transparent disclosures regarding the personal information that will be collected and the protection that information will receive. The FCC hasn’t met that obligation in its broadband measurement programs. Consumers who are considering participation in the FCC’s new mobile measurement program should demand accountability from the FCC and its third-party contractors before agreeing to provide sensitive data.







September 20, 2012
Let The Music Play: Critics Of Universal-EMI Merger Are Singing Off-Key
There are a lot of inaccurate claims – and bad economics – swirling around the Universal Music Group (UMG)/EMI merger, currently under review by the US Federal Trade Commission and the European Commission (and approved by regulators in several other jurisdictions including, most recently, Australia). Regulators and industry watchers should be skeptical of analyses that rely on outmoded antitrust thinking and are out of touch with the real dynamics of the music industry.
The primary claim of critics such as the American Antitrust Institute and Public Knowledge is that this merger would result in an over-concentrated music market and create a “super-major” that could constrain output, raise prices and thwart online distribution channels, thus harming consumers. But this claim, based on a stylized, theoretical economic model, is far too simplistic and ignores the market’s commercial realities, the labels’ self-interest and the merger’s manifest benefits to artists and consumers.
For market concentration to raise serious antitrust issues, products have to be substitutes. This is in fact what critics argue: that if UMG raised prices now it would be undercut by EMI and lose sales, but that if the merger goes through, EMI will no longer constrain UMG’s pricing power. However, the vast majority of EMI’s music is not a substitute for UMG’s. In the real world, there simply isn’t much price competition across music labels or among the artists and songs they distribute. Their catalogs are not interchangeable, and there is so much heterogeneity among consumers and artists (“product differentiation,” in antitrust lingo) that relative prices are a trivial factor in consumption decisions: No one decides to buy more Lady Gaga albums because the Grateful Dead’s are too expensive. The two are not substitutes, and assessing competitive effects as if they are, simply because they are both “popular music,” is not instructive.
Given these factors, a larger catalog won’t lead to abuse of market power. This is precisely why the European Union cleared the Sony/EMI music publishing merger, concluding that “Customers usually select a song or certain musical works and not a [label] or a [label’s] catalog… In the event that a customer is wedded to a particular song…or a catalog of songs…, even a small [label] would have pricing power over these particular musical works. The merger would not affect this situation (since the size of the catalog does not matter).”
A second popular criticism is that a combined UMG/EMI would control 51 of 2011’s Billboard Hot 100 songs. But this assertion ignores the ever-changing nature of musical output and consumer tastes – not to mention that “top-selling songs of 2011” is hardly a relevant antitrust market (and neither is “top-selling songs of the last 10 years”). A label’s ownership of 51 songs that were popular in 2011 is not suggestive of its ability to price its full catalog of several million songs in negotiations with an online music service. Meanwhile, by other measures (this year independent artists garnered over 50% of Grammy nominations and won 44% of the awards) the major labels are hardly the only purveyors of valuable songs, and competition from Indie labels and artists is significant.
Edgar Bronfman, a director and former CEO and chairman of Warner Music Group, recently testified in Congress against the merger, arguing that a combined UMG/EMI could decide “what digital services live and what digital services die.” But Bronfman himself has elsewhere acknowledged that labels can’t prosper if they can’t sell their music. As chairman of Universal in 2001 he told Congress that, “for us to effectively market and distribute…albums, they are going to have to be on as many different online music sites as possible…. Frankly, if we lock away our catalog, we aren’t generating value for our artists or our shareholders or our fans.” As a competitor of UMG, Bronfman may have changed his tune, but his earlier point is even more true today with digital sales exceeding 50% of the market.
Far from wanting to constrain supply or hamstring distribution channels, labels have an incentive to make music widely and easily accessible. In fact, power buyers like Apple may have greater control over the marketplace than the labels. As UMG’s CEO Lucian Grainge bluntly noted, “[i]f Apple stops selling our music, we go out of business. Apple does not.” Critics downplay the role of power buyers in disciplining prices, but that assertion goes against the evidence.
Dismissive attitudes about piracy as a constraint on prices also miss the mark. For many consumers, a marginal price increase will indeed induce some piracy. More positively, the opposite also holds true: Increased consumer access to inexpensive and accessible legal content reduces piracy. Given the ravages of pirated music since Napster, it’s no wonder that labels – including both Universal and EMI – are now licensing their music to so many legal digital music services like Spotify. UMG’s incentives to continue to do so can only increase following the merger.
Finally, antitrust reviews must consider the benefits of the merger. Bringing together Universal and EMI could create substantial operating efficiencies. More efficient A&R and production should benefit artists (and fans) directly. And with a larger catalog UMG’s opportunities for pairing similar artists for marketing and concert promotion would increase, helping new and less-popular artists reach larger audiences. And UMG is in a position to breathe new life into EMI’s catalog with investment in human capital and artists’ careers that EMI simply can’t muster.
Claims of this merger’s anticompetitive effects are not supported either by antitrust analysis or the realities of this market. Regulators should let the music play.
[Crossposted from Forbes.com]







September 19, 2012
On Copyright and Business Models: Why ivi Deserved to Be Shut Down
Imagine a service that livestreams major broadcast television channels over the Internet for $4.99 a month — no cable or satellite subscription required. For an extra 99 cents a month, the service offers DVR functionality, making it possible to record, rewind, and pause live broadcast television on any broadband-equipped PC.
If this service sounds too good to be true, that’s because it is. But for a time, it was the business model of ivi. Cheaper than a cable/satellite/fiber subscription and more reliable than an over-the-air antenna, ivi earned positive reviews when it launched in September 2010.
Soon thereafter, however, a group of broadcast networks, affiliates, and content owners sued ivi in federal court for copyright infringement. The court agreed with the broadcasters and ordered ivi to cease operations pending the resolution of the lawsuit.
ivi appealed this ruling to the 2nd Circuit, which affirmed the trial court’s preliminary injunction earlier this month in an opinion (PDF) by Judge Denny Chin. The appeals court held as follows:
The rights holders would likely prevail on their claim that ivi infringed on their performance rights, as ivi publicly performed their copyrighted programs without permission;
ivi is not a “cable system” eligible for the Copyright Act’s compulsory license for broadcast retransmissions, as ivi distributes video over the Internet, rather than its own facilities;
Allowing ivi to continue operating would likely cause irreparable harm to the rights holders, as ivi’s unauthorized distribution of copyrighted programs diminishes the works’ market value, and ivi would likely be unable to pay damages if it loses the lawsuit;
ivi cannot be “legally harmed by the fact that it cannot continue streaming plaintiffs’ programming,” thus tipping the balance of hardships in plaintiffs’ favor;
While the broad distribution of creative works advances the public interest, the works streamed by ivi are already widely accessible to the public.
As much as I enjoy a good statutory construction dispute, to me, the most interesting question here is whether ivi caused “irreparable harm” to rights holders.
Writing on Techdirt, Mike Masnick is skeptical of the 2nd Circuit’s holding, criticizing its “purely faith-based claims … that a service like ivi creates irreparable harm to the TV networks.” He argues that even though ivi “disrupt[s] the ‘traditional’ way that [the broadcast television] industry’s business model works … that doesn’t necessarily mean that it’s automatically diminishing the value of the original.” Citing the VCR and DVR, two technologies that disrupted traditional methods of monetizing content, Mike concludes that “[t]here’s no reason to think” ivi wouldn’t “help [content owners'] business by increasing the value of shows by making them more easily watchable by people.”
Mike has a point. Perhaps many ivi subscribers previously didn’t watch much, if any, broadcast television. But thanks to ivi, some of these viewers may get hooked on hit network shows like American Idol, NCIS, or Person of Interest. Some ivi subscribers might even go on to buy seasons of their favorite shows on Blu-ray or DVD. If these assumptions hold true, ivi might actually increase the market value of the television programs it streams. So why aren’t rights holders applauding ivi — or emulating it — instead of trying to shut it down?
Perhaps it’s because the rights holders worry that ivi could attract a large audience of “cord cutters” who previously bought season passes to their favorite shows from Internet media stores such as iTunes or Amazon Instant Video. Rights holders might also worry that ivi could accelerate cord cutting by inducing people to cancel their basic cable or satellite television service. Why pay a cable company $16.50 a month for local broadcast channels when you can get them from ivi for less than a third of the price of cable?
Broadcasters might worry about ivi undercutting their advertising revenues. Because television ad rates are largely based on viewership statistics — as determined by audience measurement companies like Nielsen — each person who unplugs his antenna or cancels his cable subscription for ivi is one fewer Nielsen viewer. (Although ivi is reportedly interested in cutting a deal with Nielsen to ensure its ratings reflect ivi’s audience, it appears no deal was in place when ivi launched.) From the broadcasters’ perspective, it doesn’t matter if lots of ivi subscribers actually watch television ads, as advertisers typically aren’t willing to pay for eyeballs they can’t measure.
Adding insult to injury, ivi streamed the local channels of two markets, Seattle and New York City, to subscribers worldwide. Because broadcast affiliates typically sell ad slots to local businesses, every person who uses ivi but doesn’t reside in Seattle or NYC amounts to one fewer set of eyeballs for a local affiliate.
So ivi could be helping rights holders, hurting them, or doing some of both. To determine ivi’s net impact on content owners’ bottom line, we need to know whether the revenue gained from the first type of viewer discussed above (those who spend more on content because of ivi) exceeds the revenue lost from the second type (those who substitute ivi viewing for media store purchases, over-the-air viewing, or pay-TV subscriptions). Unfortunately, we lack the data to answer that question with confidence.
Nevertheless, there are good reasons to assume that ivi subscribers who generate less revenue for content owners after signing up for ivi vastly outnumber those who generate more revenue.
Consider ivi’s natural subscriber base: people who already pay for network television content — via pay-TV, Internet media stores, or streaming services like Hulu Plus — or watch for free via authorized, ad-supported sources. For many of these viewers, ivi presents a compelling alternative to other sources of network television content.
But what about ivi’s potential to deliver networks a new, untapped audience? Well, at $60 per year, ivi won’t likely play well with casual viewers who aren’t even sure if network television is worth watching. These viewers are far more likely to test the network TV waters by streaming recently-aired shows for free on Hulu or network websites.
ivi is also unlikely to attract many network television lovers who’d otherwise miss out on it because they lack the cash. That’s because most low-income television junkies already tune in — perhaps via free, over-the-air network television (which nearly all TV owners can already access with nothing more than a $20 antenna and $30 converter box). From the content owners’ perspective, each user who switches from an over-the-air antenna to an ivi subscription is basically a wash.
At best, ivi may attract some viewers who can’t afford or aren’t willing to pay for basic cable, and live too far away from an urban area to receive an over-the-air signal. But do these viewers outnumber the many TV junkies who want cheaper, more convenient access to network television content? I highly doubt it. And, had ivi tried to persuade the 2nd Circuit that its service actually benefits rights holders, I suspect the Court wouldn’t have bought the argument unless ivi could marshal data that probably doesn’t exist.
Business Models, Innovation, and Incentives
If ivi is such an attractive alternative to “legacy” business models, why don’t the broadcast networks simply follow ivi’s lead by offering a comparable service? It seems like a no-brainer; after all, networks and affiliates already have established relationships with advertisers, and enjoy immediate access to perfect digital copies of their content. A joint venture of the major networks, perhaps in collaboration with their affiliates, would surely dominate ivi (assuming both services were comparably priced).
What explains the broadcasters’ and rights holders’ reluctance to embrace this business model? Perhaps they’re too stupid or lazy to see the green in front of them. Maybe they’re too attached to obsolete business models to monetize their content in a rational, profit-maximizing manner.
But the rights holders could also be acting perfectly rationally. Maybe the $6 monthly fee ivi charges isn’t the profit-maximizing price at which to charge consumers for high definition, live, recordable, rewindable network television content. Perhaps the business strategy currently employed by broadcasters and creators — complex and confusing as it may be to most people — captures more income for the creation and distribution of television shows than alternative business models.
I don’t know whether content owners ought to shun or embrace ivi’s business model. Neither does Mike Masnick — or, for that matter, anyone. At best, armed with extensive economic data and market research, we’d still only be able to make an educated guess as to how content owners should structure their businesses. Modern consumers’ preferences are simply too opaque, divergent, and dynamic for any producer to systematically squeeze out every last drop of profits or surplus.
Even under uncertainty, however, decisions must still be made. In the market for creative works, their creators (and their assignees) are empowered by the Copyright Act with an exclusive, but limited, right to decide how to monetize their works. So it is that broadcasters and affiliates may dictate how television shows are distributed, and decide how much to charge for them, for a limited time and with certain exceptions.
This is why broadcasters may give their content away for free to anybody near a metropolitan area who has an antenna and converter box, while simultaneously preventing third parties like ivi from distributing the same exact content (whether free of charge or for a fee). At first, this may seem absurd, but consider how many websites freely distribute their content on the terms they see fit. That’s why I can read all the Techdirt articles I desire, but only on Techdirt’s website. If copyright protection excluded content distributed freely to the general public, creators of popular ad-supported content would soon find others reproducing their content with fewer ads. Between Hulu — with its several minutes of ads per episode — and a competing service offering the same content, but with nothing more than a few text ads, many viewers would prefer the latter option.
Of course, the Copyright Act is no guarantee that a particular business model will succeed, or that a content creator will make a profit. It simply vests in each rights holder the power to decide among business models for monetizing their content.
Why let creators and their assignees make these decisions? Even if we believe that public policy ought to “promote the Progress of … useful Arts” — an admittedly controversial belief that is beyond the scope of this essay — why give content creators exclusive rights to copy, distribute, perform, transmit, and sell their expressive works? There are, after all, plenty of other ways government could encourage people to create movies, books, music, video games, and other socially valuable expressions.
For instance, we could award monetary prizes to creators of popular works, perhaps by measuring how often they’re viewed or experienced. We could create a federal Department of Creative Expression and hire 50,000 of the nation’s most talented writers, artists, and musicians to create books, movies, television shows, and songs all day. We could also give individuals and companies generous, refundable tax credits for income derived from expressive works.
But, for the most part, we don’t do these things. Of the many ways our government could foster the creation of expressive works, we chose copyrights — as have many other governments over the years.
So why copyright? Two reasons: knowledge and incentives.
In an ever-changing world, the best way to discover how to monetize creative works is through trial-and-error. By empowering lots of individual creators and companies to experiment with different ways of distributing content, knowledge emerges through spontaneous order, as rights holders mimic their successful competitors while constantly trying to figure out an even smarter way to make money. Instead of relying on a centralized bureaucracy or a small group of lawmakers to decide how much to charge for creative works, the institution of copyright disperses such decisions, harnessing the wisdom of the crowd for a better outcome.
If decentralized decision-making works so well, why limit it to creators? Surely if everybody could monetize creative works, we’d enjoy even more innovative distribution strategies. But this would push the price of creative works down to their marginal cost, which is zero. While it’s still possible to make money by distributing free content, as Mike has explained as comprehensively as anyone, it’s not necessarily the best way for creators to make money. If it were, everybody would already be doing it!
Therefore, we give content creators and their assignees a limited, exclusive right — a temporary monopoly, as it’s often described — over their works. They get to decide not only what to create, but how to distribute it. Whether they reap vast rewards or lose their shirts depends solely on the decisions they make.
To be sure, our Copyright Act abounds with excesses and deficiencies, many of which we’ve discussed on these pages over the years. (For instance, it lacks registration or renewal requirements, imposes draconian criminal penalties on noncommercial infringement, and confers copyrights on too broad a range of subject matter.) Despite these problems, however, the exclusive right to monetize expressive works — a right that ivi flagrantly violates — is at the core of copyright. If there’s one exclusive right that copyright laws should secure for content creators, it’s the right to sell complete copies of newly-produced creative works made for the purpose of private commercial gain.
If ivi doesn’t violate this right, I don’t know what does.







“It’s just more reliable than YouTube, ok?”
That was the response of a friend currently in Rwanda who had issued a Facebook plea for someone to upload the weird “Innocence of Muslims” video to Dropbox.
“Oh, where is the stupid internet in Rwanda?????” she exclaimed.
In typical snark, I had asked, “What do you connect to Dropbox with? Tin-can on string?”
She actually has Internet access, but she finds YouTube so much less reliable than other platforms that she asks friends to upload YouTube videos elsewhere.
I anecdotally find YouTube videos to be clunky downloads compared to others. Quite naturally, I watch fewer videos on YouTube and more on other platforms. I don’t know, but I suspect, that Google has made some decision to economize on video downloads—a high percentage of people probably watch only the first third of any video, so why send them the whole thing right away?—and that its imperfect implementation has me routinely watching the spinning “pause” wheel (or playing “snake”) whenever a YouTube offering looks interesting.
Would the Google of five years ago have allowed that? It’s well known that Google recognizes speed as an important element of quality service on the Internet.
And this is why antitrust action against Google is unwarranted. When companies get big, they lose their edge, as I’m guessing Google is losing its edge in video service. This opens the door to competitors as part of natural economic processes.
Just the other week, I signed up with Media.net and I’ll soon be running tests on whether it gets better results for me on WashingtonWatch.com than Google AdSense. So far so good. A human customer service representative navigated me through the (simple) process of opening an account and getting their ad code.
These are anecdotes suggesting Google’s competitive vulnerability. But you can get a more systematic airing of views at TechFreedom’s event September 28th: “Should the FTC Sue Google Over Search?”







September 18, 2012
Ryan Radia on the constitutionality of net neutrality
Ryan Radia, associate director of technology studies at the Competitive Enterprise Institute, discusses the amicus brief he helped author in the case of Verizon v. Federal Communications Commission, now before the D.C. Circuit Court of Appeals. Radia analyzes the case, which will determine the fate of the FCC’s net neutrality rule. While Verizon is arguing that the FCC does not have the authority to issue such rules, Radia says that the constitutional implications of the net neutrality rule are more important. He explains that the amicus brief outlines both First and Fifth Amendment arguments against the rule, stating that net neutrality impinges on the speech of Internet service providers and constitutes an illegal taking of their private property.
Related Links
Brief of Amici Curiae, by CEI, Cato et al.
Video: Why Net Neutrality Regulations are Infirm, by Radia
Verizon slams the FCC’s net neutrality rules as unconstitutional, The Next Web







September 14, 2012
Real Lawyers Read the Footnotes, but Cite Them Only When Relevant: A Response to Harold Feld on the FCC SpectrumCo Order
By Geoffrey Manne, Matt Starr & Berin Szoka
“Real lawyers read the footnotes!”—thus did Harold Feld chastise Geoff and Berin in a recent blog post about our CNET piece on the Verizon/SpectrumCo transaction. We argued, as did Commissioner Pai in his concurrence, that the FCC provided no legal basis for its claims of authority to review the Commercial Agreements that accompanied Verizon’s purchase of spectrum licenses—and that these agreements for joint marketing, etc. were properly subject only to DOJ review (under antitrust).
Harold insists that the FCC provided “actual analysis of its authority” in footnote 349 of its Order. But real lawyers read the footnotes carefully. That footnote doesn’t provide any legal basis for the FCC to review agreements beyond a license transfer; indeed, the footnote doesn’t even assert such authority. In short, we didn’t cite the footnote because it is irrelevant, not because we forgot to read it.
First, a reminder of what we said:
The FCC’s review of the Commercial Agreements accompanying the spectrum deal exceeded the limits of Section 310(d) of the Communications Act. As Commissioner Pai noted in his concurring statement, “Congress limited the scope of our review to the proposed transfer of spectrum licenses, not to other business agreements that may involve the same parties.” We (and others) raised this concern in public comments filed with the Commission. Here’s the agency’s own legal analysis — in full: “The Commission has authority to review the Commercial Agreements and to impose conditions to protect the public interest.” There’s not even an accompanying footnote.
Even if Harold were correct that footnote 349 provides citations to possible sources of authority for the FCC to review the Commercial Agreements, it remains irrelevant to our claim: The FCC exceeded its authority under 310(d) and asserted its authority under 310(d) without any analysis or citation. Footnote 349 begins with the phrase, “[a]side from Section 310(d)….” It is no surprise, then, that the footnote contains no analysis of the agency’s authority under that section.
The FCC’s authority under 310(d) is precisely what is at issue here. The question was raised and argued in several submissions to the Commission (including ours), and the Commission is clearly aware of this. In paragraph 142 of the Order, the agency notes the parties’ objection to its review of the Agreements: “Verizon Wireless and the Cable Companies respond that the Commission should not review the Commercial Agreements because… the Commission does not have authority to review the agreements.” That objection, rooted in 310(d), is to the Commission extending its transaction review authority (unquestionably arising under only 310(d)) beyond that section’s limits. The Commission then answers the parties’ claim in the next paragraph with the language we quoted: “The Commission has authority to review the Commercial Agreements and to impose conditions to protect the public interest.” By doing so without reference to other statutory language, it seems clear that the FCC’s unequivocal, unsupported statement of authority is a statement of authority under 310(d).
This is as it should be. The FCC’s transaction review authority is limited to Section 310(d). Thus if the agency were going to review the Commercial Agreements as part of the transfer, the authority to do so must come from 310(d) alone. But 310(d) on its face provides no authority to review anything beyond the transfer of spectrum. If the Commission wanted to review the Commercial Agreements, it needed to provide analysis on how exactly 310(d), despite appearances, gives it the authority to do so. But the Commission does nothing of the sort.
But let’s be charitable, and consider whether footnote 349 provides relevant analysis of its authority to review the Commercial Agreements under any statute.
The Commission did cite to several other sections of the Communications Act in the paragraph (145) that includes footnote 349. But that paragraph relates not to the review of the transaction itself (or even the ability of the parties to enter into the Commercial Agreements) but to the Commission’s authority to ensure that Verizon complies with the conditions imposed on the transaction, and to monitor the possible effects the Agreements have on the market after the fact. Three of the four statutes cited in the footnote (47 U.S.C. §§ 152, 316, & 548) don’t appear to give the Commission authority for anything related to this transaction. Only 47 U.S.C. § 201 is relevant. But having authority to monitor a wireless provider’s post-transaction business practices is far different from having the authority to halt or condition the transaction itself before its completion because of concerns about ancillary agreements. The FCC cites no statutes to support this authority—because none exist.
This is not simply a semantic distinction. By claiming authority to review ancillary agreements in the course of reviewing license transfers, the Commission gains further leverage over companies seeking license transfer approvals, putting more of the companies’ economic interests at risk. This means companies will more likely make the “voluntary” concessions (with no opportunity for judicial scrutiny) that they would not otherwise have made—or they might not enter into deals in the first place. As we (Geoff and Berin) said in our CNET article, “the FCC has laid down its marker, letting all future comers know that its bargaining advantage extends well beyond the stack of chips Congress put in front of it.” In merger reviews, the house has a huge advantage, and it is magnified if the agency can expand the scope of activity under its review.
Thus Harold is particularly off-base when he writes that “[g]iven that there is no question that the FCC has authority to entertain complaints going forward, and certainly has authority to monitor how the markets under its jurisdiction are developing, it is hard to understand the jurisdictional argument even as the worship of empty formalism.” This misses the point entirely. The difference between the FCC reviewing the Commercial Agreements in deciding whether to permit the license transfer (or demand concessions) and regulating the Agreements after the fact is no mere “formalism.”
Regardless, if the FCC were actually trying to rely on these other sections of the Communications Act for authority to review the Commercial Agreements, it would have cited them in Paragraph 143, where it asserted that authority—not two paragraphs later in a footnote supporting the agency’s order assigning post-transaction monitoring tasks to the Wireline Competition Bureau. Moreover, none of these alleged assertions of authority amounts to an analysis of the FCC’s jurisdiction. Given the debate that took place in the record over the issue, a simple list of statutes purporting to confer jurisdiction would be utterly insufficient in response. Not as insufficient as an unadorned, conclusory statement of authority without even such a list of statutes (what the FCC actually did) — but awfully close.
We stand by our claim that the Commission failed to cite — let alone analyze — its authority to review the Commercial Agreements in this transaction. The FCC’s role in transaction reviews has been hotly contested, at least partially inspiring the FCC Process Reform Act that passed this spring in the House. Given the controversy around the issue, the Commission should have gone out of its way to justify its assertion of authority, citing precedent and making a coherent argument — in other words, engaging in legal analysis. At least, that’s what “real lawyers” would do.
But as a matter of realpolitik, perhaps it was naïve of us to expect more analysis from the agency that tried to justify net neutrality regulation by pointing to a deregulatory statute aimed at encouraging the deployment of broadband and claiming that somewhere in there, perhaps, hidden between the lines, was the authority the agency needed—but which Congress never actually gave it.
When the FCC plays fast and loose with the law in issuing regulations, someone will likely sue, thus forcing the FCC to justify itself to a court. On net neutrality, the D.C. Circuit seems all but certain to strike down the FCC’s Open Internet Order for lacking any firm legal basis. But when the FCC skirts legal limits on its authority in merger review, the parties to a merger have every incentive to settle and keep their legal qualms to themselves; even when the FCC blocks a merger, the parties usually calculate that it isn’t worth suing or trying to make a point about principle. Thus, through merger review, the FCC gets away with regulation by stealth—footnotes about legal authority be damned. Groups like the Electronic Frontier Foundation rightly worry about the FCC’s expansive claims of authority as a “Trojan Horse,” even when they applaud the FCC’s ends. We know Harold doesn’t like this transaction, but why doesn’t he worry about where the FCC is taking us?







September 13, 2012
Executive Branch Makes Power Grab to Create a New Spectrum Architecture without Congress
The House Energy and Commerce Committee’s Subcommittee on Communications and Technology is holding a hearing this morning to examine how federal agencies and commercial wireless companies might benefit from more efficient government use of spectrum. The hearing is intended to address a report issued by the President’s Council of Advisors on Science and Technology (PCAST) that rejects the Constitutional role of Congress in managing our nation’s spectrum resources and neuters the FCC. The issues raised in the PCAST report should be subject to further study and not implemented through an unconstitutional Presidential memorandum. Only Congress can delegate this authority.
It appears President Obama is engaged in an aggressive strategy to increase presidential authority. After the Republican Party gained control of the House of Representatives in 2010, Obama responded by directing his staff to “push the envelope in finding things we can do on our own.” According to a White House official, “The president isn’t going to be stonewalled by politics,” which is a polite way of saying he plans to ignore Congress and the separation of powers in the U.S. Constitution.
The President began by refusing or selectively enforcing laws that he doesn’t like, such as laws restricting illegal immigration and advancing educational achievement. This has been an effective tactic in instances when Congress refuses to change existing laws, but is unavailing when the President believes a new law should be enacted. Emboldened by his early success with selective enforcement of existing laws, Obama has expanded his Congressional nullification strategy to the creation of new laws by presidential fiat.
A recent example of this new, more imperialistic approach to the presidency involves spectrum – the electromagnetic waves that empower the mobile Internet. The National Broadband Plan issued by the Federal Communications Commission (FCC) in 2010 recommended that 300 MHz of spectrum be made available for mobile Internet use by 2015. The plan also asked Congress to grant the FCC authority to conduct incentive auctions to allocate additional spectrum to mobile use.
There was bipartisan support for incentive auctions in Congress, but House Republicans and Senate Democrats disagreed on the best way to allocate and assign additional spectrum. The Senate preferred an approach that would allocate substantial amounts of spectrum for shared use on an unlicensed basis. The House preferred a licensed approach that would assign all reallocated spectrum through auctions. FCC Chairman Genachowski lobbied publicly against the House bill on this issue and lost. The incentive auction legislation adopted by Congress in February 2012 preserved the FCC’s authority to allocate some spectrum on an unlicensed basis, but required that the bulk of reallocated spectrum be licensed.
The incentive auction legislation also amended the Commercial Spectrum Enhancement Act, which provides auction funding for the relocation of federal wireless systems when federal spectrum is reallocated for commercial use on a licensed basis. Congress also required the National Telecommunications and Information Administration (NTIA), a unit of the Department of Commerce, to submit a report to the President identifying 15 MHz of spectrum between 1675 and 1710 MHz for reallocation from federal to commercial use, which the FCC is required to auction within three years. These provisions make clear that Congress expected federal spectrum would continue to be reallocated for commercial use on a licensed basis.
The Administration, however, decided to create its own spectrum policies by establishing a new process through PCAST. In July 2012, PCAST issued a report finding that “the traditional practice of clearing government-held spectrum of Federal users and auctioning it for commercial use is not sustainable.” The report recommends the President issue a memorandum stating, “it is the policy of the U.S. government to share underutilized Federal spectrum to the maximum extent possible.” It should be no surprise that PCAST “believe[s] this shift in direction will also require increased White House involvement” and recommends that the Executive Branch manage this “wholly new approach to Federal spectrum architecture.” The Secretary of Commerce would determine the “conditions of use” that would apply to all users of shared federal spectrum, including commercial users.
The findings and recommendations of the PCAST described above are an obvious attempt by the Administration to usurp Congressional authority and muscle it out of its constitutional jurisdiction over commercial spectrum use. Although the report frequently mentions the FCC, the independent agency would have no meaningful role in this “new architecture.” Downsizing the role of the FCC – and by extension, Congress – in the management of federal spectrum would be a win for this Administration, FCC Chairman Genachowski, and all federal agencies that use spectrum.
The Administration wins by giving significant supporters access to additional spectrum on an unlicensed basis. It’s no coincidence that the only industry representatives on the PCAST, Google and Microsoft, were also the biggest proponents of unlicensed spectrum in the dispute between the House and Senate during the incentive auction debate. The recommendations in the PCAST report would turn their loss before Congress into a win before the President.
FCC Chairman Genachowski wins by pursuing his preferred spectrum policies without having to subject them to a vote of a bi-partisan panel of Congressionally-confirmed Commissioners. The PCAST report recommends that an Executive Branch agency — the Commerce Department — decide how commercial companies would share federal spectrum, and the Chairman would be the FCC’s sole representative on spectrum matters before the Executive Branch. The Chairman could also count shared federal spectrum toward achieving the spectrum promises made in the National Broadband Plan.
Federal spectrum users win by ensuring that they won’t have to vacate their spectrum ever again. The PCAST report assumes commercial users would be required to work around existing and future federal systems.
If the PCAST report’s recommendations were implemented as is, the biggest losers would be Congress, the FCC Commissioners, and the American people. The PCAST report’s rejection of the “traditional” approach to federal spectrum management as “unsustainable” should be a shock to Congress, who mandated a specific policy approach only a few months before the PCAST report was released. It’s certainly surprising that experts recommended the President ignore Congressional legislation and constitutional authority. And one would expect that some in Congress would be downright angry that the Chairman of the FCC, an independent agency, is supporting a Presidential power grab.
Congress is the governmental branch with constitutional authority to manage our nation’s spectrum. Congress delegated exclusive jurisdiction over non-federal use of spectrum to the FCC. (47 U.S.C. § 303) Congress delegated jurisdiction to the Executive branch over the federal use of spectrum only. (47 U.S.C. § 305) Although the Assistant Secretary of NTIA and the Chairman of the FCC meet biannually to conduct “joint spectrum planning,” Congress specified that the purpose of these joint planning meetings is to determine (1) whether spectrum licenses can be auctioned for commercial use, (2) future spectrum requirements, (3) future spectrum allocations, and (4) actions to promote the efficient use of spectrum, including shared use of spectrum to increase commercial access. (47 U.S.C. § 922) Congress thus requires the agencies to consider the reallocation of federal spectrum for licensed use as well as spectrum sharing on a biannual basis. When the agencies recommend significant changes to existing spectrum allocations and assignments, however, Congress has traditionally enacted legislation to implement the recommendations.
The recommendations of the PCAST report would abolish this statutory scheme through a single Presidential memorandum. Whatever its merits, the Constitution requires, and the American people should expect, that the PCAST proposal be subjected to open debate in Congress. The people did not choose the PCAST experts, but they did vote for their Congressional representatives. That still means something, doesn’t it?







September 11, 2012
Christopher Steiner on algorithms
Christopher Steiner, author of Automate This: How Algorithms Came to Rule the World, discusses his new book. Steiner originally set about studying the prevalence of algorithms in Wall Street stock trading but soon found they were everywhere. Stock traders were the first to use algorithms as a substitute for human judgment to make trades automatically, allowing for much faster trading. But now algorithms are used to diagnose illnesses, interpret legal documents, analyze foreign policy, and write newspaper articles. Algorithms have even been used to infer a person’s personality and mental state from the way they form sentences, so that customer service agents can deal better with upset callers. Steiner discusses the benefits–and risks–of algorithmic automation and how it will change the world.
Related Links
“Automate This: How Algorithms Came to Rule the World”, by Steiner
“In Knight Capital fiasco, a flurry of rule changes”, CNN Money
“David Cope: ‘You pushed the button and out came hundreds and thousands of sonatas’”, Guardian







September 10, 2012
How Google Fiber is not like Verizon FiOS
In a recent post, Tim Lee does a good job of explaining why facilities-based competition in broadband is difficult. He writes,
As Verizon is discovering with its FiOS project, it’s much harder to turn a profit installing the second local loop; both because fewer than 50 percent of customers are likely to take the service, and because competition pushes down margins. And it’s almost impossible to turn a profit providing a third local loop, because fewer than a third of customers are likely to sign up, and even more competition means even thinner margins.
Tim thus concludes that
the kind of “facilities-based” competition we’re seeing in Kansas City, in which companies build redundant networks that will sit idle most of the time, is extremely wasteful. In a market where every household has n broadband options (each with its own fiber network), only 1/n local loops will be in use at any given time. The larger n is, the more resources are wasted on redundant infrastructure.
I don’t understand that conclusion. You would imagine that redundant infrastructure would be built only if it is profitable to its builder. Tim is right that we probably should not expect more than a few competitors, but I don’t see how more than one pipe is necessarily wasteful. If laying down a second set of pipes is profitable, shouldn’t we welcome the competition? The question is whether that second pipe is profitable without government subsidy.
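To make the arithmetic behind Tim’s claim concrete, here is a minimal sketch in Python. It is my illustration, not Tim’s model: it assumes each of n overlapping networks passes every home, every household subscribes to exactly one of them, and subscribers split evenly, so each network’s local loops run at 1/n utilization.

# Illustrative only: quantifies the "1/n local loops in use" premise.
def idle_share(n: int) -> float:
    """Fraction of all installed local loops sitting idle with n overlapping networks."""
    return 1 - 1 / n

for n in (1, 2, 3, 4):
    print(f"{n} network(s): {idle_share(n):.0%} of loops idle")

# 1 network(s): 0% of loops idle
# 2 network(s): 50% of loops idle
# 3 network(s): 67% of loops idle
# 4 network(s): 75% of loops idle

The numbers merely quantify Tim’s premise; they say nothing about who bears the cost of those idle loops, which is the real point in dispute.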
That brings me to a larger point: I think what Tim is missing is what makes Google Fiber so unique. Tim is assuming that all competitors in broadband will make their profits from the subscription fees they collect from subscribers. As we all know, that’s not how Google tends to operate. Google’s primary business model is advertising, and that’s likely where it expects its return to come from. One of Google Fiber’s price points is free, so we might expect greater adoption of the service. That’s disruptive innovation that could sustainably increase competition and bring down prices for consumers–without a government subsidy.
Kansas City sadly gave Google all sorts of subsidies, like free power and rackspace for its servers as Tim has pointed out, but it also cut serious red tape. For example, there is no build-out requirement for Google Fiber, a fact now bemoaned by digital divide activists. Such requirements, I would argue, are the true cause of the unused and wasteful overbuilding that Tim laments.
So what matters more? The in-kind subsidies or the freedom to build only where it’s profitable? I think that’s the empirical question we’re really arguing about. It’s not a foregone conclusion of broadband economics that there can be only one. And do we want to limit competition in part of a municipality in order to achieve equity for the whole? That’s another question over which “original recipe” and bleeding-heart libertarians may have a difference of opinion.






