Adam Thierer's Blog, page 133

April 26, 2011

Threat Inflation in Cybersecurity Policy

Today my colleague Tate Watkins and I are releasing a new working paper on cybersecurity policy. Please excuse my patently sleep-deprived mug while I describe it here:







Over the past few years there has been a steady drumbeat of alarmist rhetoric coming out of Washington about potential catastrophic cybersecurity threats. For example, at a Senate Armed Services Committee hearing last year, Chairman Carl Levin said that "cyberweapons and cyberattacks potentially can be devastating, approaching weapons of mass destruction in their effects." Proposed responses include increased federal spending on cybersecurity and the regulation of private network security practices.



The rhetoric of "cyber doom" employed by proponents of increased federal intervention, however, lacks clear evidence of a serious threat that can be verified by the public. As a result, the United States may be witnessing a bout of threat inflation.



Threat inflation, according to Thrall and Cramer, is a concept in political science that refers to "the attempt by elites to create concern for a threat that goes beyond the scope and urgency that a disinterested analysis would justify." Different actors—including members of Congress, defense contractors, journalists, policy experts, academics, and civilian, military, and intelligence officials—will each have their own motives for contributing to threat inflation. When a threat is inflated, the marketplace of ideas on which a democracy relies to make sound judgments—in particular, the media and popular debate—can become overwhelmed by fallacious information. The result can be unwarranted public support for misguided policies.



The run-up to the Iraq War illustrates the dynamic of threat inflation. After 9/11, the Bush Administration decided to invade Iraq to oust Saddam Hussein. Lacking any clear casus belli, the administration sought popular and congressional support for war by promoting several rationales that ultimately proved baseless.



Over the past two years, there has been a drive for increased federal involvement in cybersecurity. This drive is evidenced by the introduction of several comprehensive cybersecurity bills in Congress, the initiation of several regulatory proceedings related to cybersecurity by the Federal Communications Commission and Commerce Department, and increased coverage of the issue in the media. The official consensus seems to be that the United States is facing a grave and immediate threat that only quick federal intervention can address. This narrative has gone largely unchallenged by members of Congress or the press, and it has inflated the threat.



There is very little verifiable evidence to substantiate the claimed threats, and the most vocal proponents of the threat engage in rhetoric that can only be characterized as alarmist. Cyber threat inflation parallels what we saw in the run-up to the Iraq War.



Additionally, a cyber-industrial complex is emerging, much like the military-industrial complex of the Cold War. This complex may serve not only to supply cybersecurity solutions to the federal government, but also to drum up demand for them.



In our new working paper, Tate Watkins and I draw a parallel between today's cybersecurity debate and the run-up to the Iraq War and look at how an inflated public conception of the threat we face may lead to unnecessary regulation of the Internet. We also draw a parallel between the emerging cybersecurity establishment and the military-industrial complex of the Cold War and look at how unwarranted external influence can lead to unnecessary federal spending. Finally, we survey several federal cybersecurity proposals and present a framework for policy makers to analyze the cybersecurity threat.



Over the next few days I'll be excerpting the paper here and would love your thoughts and reactions.




Published on April 26, 2011 07:12

Jane Yakowitz on tragedy of the data commons


On the podcast this week, Jane Yakowitz, a visiting assistant professor at Brooklyn Law School, discusses her new paper about data anonymization and privacy regulation, Tragedy of the Data Commons. Citing privacy concerns, legal scholars and privacy advocates have recently called for tighter restrictions on the collection and dissemination of public research data. Yakowitz first explains why these concerns are overblown, arguing that scholars have misinterpreted the risks of anonymized data sets. She then discusses the social value of the data commons, noting the many useful studies that likely wouldn't have been possible without a data commons. She finally suggests why the data commons is undervalued, citing disparate reactions to similar statistical releases by OkCupid and Facebook, and offers a few policy recommendations for the data commons.



Related Links


Tragedy of the Data Commons, by Yakowitz
Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization, by Paul Ohm
"Race and Romance: An Uneven Playing Field for Black Women," Freakonomics
"Facebook digs through user data and graphs U.S. happiness," LA Times
"Ok Trends: Dating Research from Ok Cupid"
"Jane Yakowitz on How Privacy Regulation Threatens Research & Knowledge," by Adam Thierer


To keep the conversation around this episode in one place, we'd like to ask you to comment at the web page for this episode on Surprisingly Free. Also, why not subscribe to the podcast on iTunes?




Published on April 26, 2011 05:00

April 25, 2011

White House Takes Wrong Side on Police GPS Tracking

It is disappointing that the Obama administration, which campaigned against George W. Bush's poor record on civil liberties protection, is pursuing a course that aims to limit Fourth Amendment rights when it comes to the use of location tracking technology.



The Washington Post reported yesterday that the Obama administration has petitioned the U.S. Supreme Court to overturn a ruling last year by the U.S. Court of Appeals for the D.C. Circuit that forces police to obtain a warrant before tracking the movements of a suspect using a global positioning device.



The petition is significant because lower courts conflict on the procedure, and the Supreme Court, if it takes the case, could establish a lasting rule. In the case at hand, United States v. Jones, the D.C. Circuit sided with the defendant, overturning the conviction of Jones, who was accused of being a major cocaine dealer, on the grounds that D.C. police violated the Fourth Amendment by using a GPS device to track Jones' movements for one month without a warrant. Appellate courts in New York and California, on the other hand, have ruled in favor of police in similar cases.





The case also comes as location-tracking technology becomes more common. This itself is fueling an ongoing debate about the balance between utility and privacy. Witness last week's revelation that Apple iPhones and Google Android smartphones by default track their users' locations, which are then transmitted back to the respective companies and stored.



A valid concern in all this is how aggressively law enforcement will seek access to this data now that it exists. The D.C., New York and California cases all involved direct use of tracking technology by law enforcement agencies—police affixed the GPS transmitters to suspects' vehicles. These instances, as we see, already raise questions of due process. The next step will likely see police deputizing commercial companies to do the tracking work for them. This has already happened post-9/11, with U.S. government agencies demanding that phone companies turn over calling records without a warrant. As the law stands now, there is enough wiggle room for police and prosecutors to claim they don't need warrants at all. Much of this search-and-seizure abuse could be corrected by specific legislation, such as extending the safeguards of the Electronic Communications Privacy Act to personal data stored by third parties.



The other element in U.S. v. Jones is "reasonable expectation of privacy." True, one does not have such an expectation on public streets, but, as Judge Douglas Ginsburg, in a reasoned application of the concept, wrote in his decision (as per the Post), "the whole of a person's movements over the course of a month is not actually exposed to the public because the likelihood a stranger would observe all those movements…is essentially nil."



This provides a real-world response to Chief Judge David Sentelle's dissenting argument. "A person's reasonable expectation of privacy while traveling on public highways is zero," Sentelle wrote, and "the sum of an infinite number of zero-value parts is also zero."



But there's a whiff of sophistry here, recalling for me Zeno's Dichotomy Paradox. The ancient Greek philosopher, perhaps wryly, asserted that because, mathematically speaking, infinitely many points lie between points A and B (i.e., wherever you are, you always have half the remaining distance to go), a traveler setting out from point A would, in theory at least, never reach point B. But the infinitely many half-steps sum to a finite distance, so Zeno's Paradox does not hold.
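The mathematical resolution can be checked directly: the half-steps form a geometric series whose partial sums converge to the full distance, so the traveler does reach point B. A quick numerical sketch, purely illustrative:

```python
# Zeno's halving steps: 1/2 + 1/4 + 1/8 + ...
# Each step covers half the remaining distance; the partial sums
# approach 1 (the full A-to-B distance) without ever exceeding it.

def partial_sum(n):
    """Distance covered after the first n half-steps."""
    return sum(0.5 ** k for k in range(1, n + 1))

for n in (1, 2, 10, 50):
    print(n, partial_sum(n))  # climbs toward 1.0
```

After fifty steps the traveler is already within 2^-50 of point B, and the infinite series sums exactly to 1. An infinite number of diminishing parts can have a finite, nonzero sum, which is exactly the flaw in adding up "zero-value parts."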



Likewise, while we may not expect privacy when we drive down to the local grocery store–at any point along the way we may be seen by a neighbor out walking the dog–we have a reasonable expectation of protection from round-the-clock observation. Sentelle's point is undone by the fact that if observation is constant, aggressive and/or obnoxious enough, courts consider it to be harassment. That's why we have restraining orders.



Politics being what they are, the Obama administration picked as unsavory a defendant as it could find to set up as a test case—no one wants to go out on a limb to speak for the rights of a drug kingpin. But it would do us good to remember the principle, not the man. With personal information being recorded, stored and processed as much as it is, the correct policy is to strengthen Fourth Amendment protections, not petition for their dilution.




Published on April 25, 2011 13:46

Revisiting the Bitcoin bubble



Here is a chart of the Bitcoin-dollar exchange rate for the past six months. The arrow notes the date my column on the virtual currency was published at TIME.com. The day after that piece was published, the Bitcoin exchange rate reached an all-time high of $1.19. Yesterday, just over a week later, it was pushing $2.



A wiser fella than myself once said that correlation is not causation, and no doubt my article was at most a contributing factor in Bitcoin's recent run-up. Bitcoin is simply getting more and more mainstream attention, and with it more speculators and more speculation about mainstream adoption. The chart above lends a lot of credence to Tim Lee's bubble critique, so I wanted to make sure I wasn't giving that argument short shrift.



There may well be a Bitcoin bubble, and it may even be likely. But again, I think that misses the greater point about what Bitcoin represents. Bitcoin may be tulips and the bubble may burst, but the innovation—distributed, anonymous payments—is here to stay. Napster went bust, but its innovation presaged BitTorrent, which is here to stay. Could the Bitcoin project itself go bust? Certainly, but the innovation, the solution to the double-spending problem I've been talking about, will be taken up and improved by others, just as others picked up and ran with Napster's innovation.
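For context on why that innovation matters: the double-spending problem is that a purely digital coin can be copied and spent twice unless something rejects the second spend. Bitcoin's contribution was doing this with no central ledger keeper, via a shared, consensus-maintained transaction history. The toy sketch below shows only the core check; the class and names are hypothetical, not Bitcoin's actual data model:

```python
class Ledger:
    """Toy double-spend check: remembers which coin IDs are spent.
    Bitcoin achieves the same effect with no central keeper, by
    having every node maintain this history via consensus."""

    def __init__(self):
        self.spent = set()

    def apply(self, coin_id):
        """Accept a spend only if the coin has not been spent before."""
        if coin_id in self.spent:
            return False  # double-spend attempt: reject
        self.spent.add(coin_id)
        return True

ledger = Ledger()
assert ledger.apply("coin-1") is True   # first spend accepted
assert ledger.apply("coin-1") is False  # second spend of same coin rejected
```

The hard part, of course, is getting thousands of mutually distrustful nodes to agree on one such history without a trusted intermediary; that is what the block chain and proof-of-work supply.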



I want to start thinking through the practical and legal implications of that innovation. If you don't think the innovation could ever allow for a useful store of value, then mine is a fool's errand. I guess I'm betting on the success of a censorship resistant currency.




Published on April 25, 2011 12:13

A Smarter Way to Tax Internet Sales

Consumers are buying more and more stuff from online retailers located out-of-state, and state and local governments aren't happy about it. States argue that this trend has shrunk their brick and mortar sales tax base, causing them to lose out on tax revenues. (While consumers in most states are required by law to annually remit sales taxes for goods and services purchased out of state, few comply with this practically unenforceable rule).



CNET's Declan McCullagh recently reported that a couple of U.S. Senators are pushing for a bill that would require many Internet retailers to collect sales taxes on behalf of states in which they have no "nexus" (physical presence).



In his latest Forbes.com column, "The Internet Tax Man Cometh," Adam Thierer argues against this proposed legislation. He points out that while cutting spending should be the top priority of state governments, the dwindling brick and mortar tax base presents a legitimate public policy concern. However, Thierer suggests an alternative to "deputizing" Internet retailers as interstate sales tax collectors:



The best fix might be for states to clarify tax sourcing rules and implement an "origin-based" tax system. Traditional sales taxes are already imposed at the point of sale, or origin. If you buy a book in a Seattle bookstore, the local sales tax rate applies, regardless of where you "consume" it. Why not tax Net sales the same way? Under an origin-based sourcing rule, all sales would be sourced to the principal place of business for the seller and taxed accordingly.
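The two sourcing rules can be illustrated with a toy calculation; the rates and state abbreviations below are hypothetical, chosen only to show the mechanics:

```python
# Hypothetical sales tax rates, for illustration only.
RATES = {"WA": 0.095, "OR": 0.0, "CA": 0.0875}

def tax(price, seller_state, buyer_state, sourcing):
    """Sales tax owed under origin- or destination-based sourcing."""
    state = seller_state if sourcing == "origin" else buyer_state
    return round(price * RATES[state], 2)

# A $20 book sold by a Seattle (WA) retailer to an Oregon buyer:
print(tax(20.0, "WA", "OR", "origin"))       # taxed at the seller's rate
print(tax(20.0, "WA", "OR", "destination"))  # taxed at the buyer's rate
```

Under the origin rule the seller applies one rate, its own, to every sale; under the destination rule it must look up and remit at the rate of every buyer's jurisdiction, which is where the multi-jurisdiction accounting burden comes from.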


Origin-based taxation is a superb idea, as my CEI colleague Jessica Melugin explained earlier this month in the San Jose Mercury News in an op-ed critiquing California's proposed affiliate nexus tax:



An origin-based tax regime, based on the vendor's principal place of business instead of the buyer's location, will address the problems of the current system and avoid the drawbacks of California's plan. This keeps politicians accountable to those they tax. Low-tax states will likely enjoy job creation as businesses locate there. An origin-based regime will free all retailers from the accounting burden of reporting to multiple jurisdictions. Buyers will vote with their wallets, "choosing" the tax rate when making decisions about where to shop online and will benefit from downward pressure on sales taxes. Finally, brick-and-mortar retailers would have the "even playing field" they seek.

Congress should exercise its authority over interstate commerce and produce legislation to fundamentally reform sales taxes to an origin-based regime. In the meantime, California legislators should resist the temptation to tax those beyond their borders. Might we suggest an awards show tax?


Origin-based sourcing is not without its detractors, but the arguments against it are weak. R. David L. Campbell, for instance, responds to Thierer's Forbes.com column by claiming that origin-based taxation amounts to "taxation without representation," because it would result in some consumers paying sales taxes despite having no say over the elected officials who established such taxes.



That's true, but so what? Consumers who buy from retailers located out-of-state are already impacted by laws in those states. DC residents cannot order wine and have it shipped to them from any Pennsylvania-based retailer due to that state's laws, even though the District of Columbia has fairly permissive laws regarding direct-to-consumer wine shipments from out-of-state. Besides, consumers who buy online pay all sorts of indirect taxes. Consider that major electronics retailer Newegg.com, which is incorporated in California, paid $22m in state corporate income taxes in 2008. A big chunk of that $22m was passed on to out-of-state consumers who have no say over California tax rates. While most Newegg.com customers can't vote in California, many of the firm's thousands of employees can. The company is also better positioned than thousands of dispersed citizens to lobby state legislators for a favorable business climate.



Campbell also brings up the SSTP (Streamlined Sales Tax Project), an effort launched back in 2000 by a group of states to establish a cooperative sales tax regime. While the project's objective to "streamline" sales taxes is laudable in theory, it turns out — unsurprisingly — that getting dozens of state governments behind a simple, uniform, reciprocal sales tax regime is quite challenging in practice.



Joseph Henchman, the Tax Foundation's Vice President of Legal & State Projects, discussed the project's massive shortcomings in his 2009 testimony before the Maryland Legislature [emphasis in original]:



The SSTP already abandoned the notion of taxing like transactions alike when they adopted "destination sourcing" for online sales, but permitted states to adopt "origin sourcing" for intrastate sales. This in effect requires Internet companies to collect sales taxes based on where their customer is located, but allows brick-and-mortar stores to collect sales taxes based on where the store is located. In this way, the SSTP prevents a level playing field between Internet business and brick-and-mortar businesses.




Coupled with the SSTP's non-worry about reducing the number of jurisdictions . . . full implementation of the SSTP at this time, without serious reforms, could result in a serious and inequitable burden on e-commerce. . . . The SSTP has not accomplished its mission. The SSTP should look again at serious simplification efforts before declaring themselves a success and seeking to expand state taxing power. . . . Neither the wholesale adoption nationwide of uniform sales tax statutes, nor the development of a working alternative that provides the certainty needed for long-term investment, are likely in the foreseeable future.


While the SSTP has made some progress in the last couple of years, it continues to encounter resistance from state governments, and sales taxes remain exceedingly complex.



Congress could address the issue in a far simpler manner by enacting legislation that provides for origin-based taxation. Dr. Michael S. Greve, the John G. Searle Scholar at the American Enterprise Institute (and the Chairman of the Competitive Enterprise Institute) wrote a superb study in 2003, Sell Globally, Tax Locally, in which he articulates the case for origin-based taxation in painstaking detail. Greve discusses the importance of tax competition and dismisses the "race to the bottom" argument on pages 26 to 28:



By rendering sellers indifferent to the local tax, destination-based taxation minimizes tax competition. Under an origin-based regime, in contrast, sellers in a low-tax jurisdiction enjoy a competitive advantage. States and countries will seek to attract firms by offering a low tax rate. As jurisdictions attempt to stem the flight of business firms into low-tax jurisdictions, sales taxes will spiral downward. If sellers are perfectly mobile and transaction costs (such as shipping cost) are negligible, the equilibrium tax rate—all else equal—is zero. This "race to the bottom" argument is the sum and substance of the case for destination-based taxation and the true reason why governments consistently and vociferously oppose origin-based taxation. But the argument is unpersuasive.

First of all, all else is not in fact equal. We would probably see the zero-tax equilibrium if sellers were entirely free to designate their home state, or to designate their place of incorporation as their home state. The principal-place-of-business rule, in contrast, disciplines the sellers' choices. As already suggested, sales taxes are one element in a bundle of services and obligations that are offered by each jurisdiction. A jurisdiction that provides an educated labor force, an excellent infrastructure, a favorable regulatory environment, a sensible and efficient judicial system, or sufficient "quality of life" benefits may be able to exact a sales tax or its economic equivalent. . . . An unattractive jurisdiction that drives up the cost of doing business, meanwhile, will be unable to compensate those self-inflicted disadvantages by becoming a "sales tax haven."

More fundamentally, one cannot assume that the downward pressure on tax competition necessarily translates into a race to the bottom. Under certain (heroic) assumptions, tax competition may compromise local governments' ability to finance public goods; in that event, the race is to the bottom. But . . . it is equally plausible . . . to welcome tax competition as a much-needed discipline and countervailing force to local rent-seeking and interest group exploitation. Under these more realistic assumptions, tax competition reduces the "political residuum" that is available to local politicians for purposes of redistribution—without, at the same time, compromising local governments' abilities to levy taxes, akin to user fees, to finance public goods.

It is true that destination-based systems also curtail some tax competition. The local tax mix, including the sales tax, will be a factor in the citizens' (though not firms') locational decisions. . . . In many cases, though, firms may be more responsive to changes in the local tax structure—and to advantageous changes in "foreign" jurisdictions—than are individual citizens. A 2-percent local sales tax hike may not induce an individual to move. . . . That same increase, though, may have a rather dramatic effect on firms' locational decisions.

States compete for citizens and firms on any number of margins—environmental regulation, labor regulation, business and income taxes. All elements of the regulatory and tax environment operate as factors for local firms. Countless government decisions provide firms with competitive advantages or disadvantages and, at the margin, shape business decisions to locate in a given state or locality.


Amen.




Published on April 25, 2011 10:43

And the Winner Is . . . !

Melissa Yu is the winner of first prize in the middle school category of C-SPAN's StudentCam 2011 competition. Her video, "Net Neutrality: The Federal Government's Role in Our Online Community," is an eight-minute look at the push for regulation of Internet service, with an emphasis, appropriate for students, on how the three branches of government have each been involved in the story up to now.



Many TLF readers already know the story and the key players, but if you haven't been following along, or if you want a refresher, here's a better video than I could have produced in eighth grade. Or now. Congratulations, Melissa Yu!






Published on April 25, 2011 07:28

Event: The FCC's Wireless Competition Report: A Preview

Every year since 1995, the Federal Communications Commission has released a report on the state of competition in the wireless market, and it will soon release the fifteenth. Last year's report was the first not to find the market "effectively competitive." As a result, expectations are high for the new annual report. How it determines the state of competition in the wireless market could affect regulatory policy and how the Commission looks at proposed mergers.



Join the Mercatus Center at George Mason University's Technology Policy Program for a discussion of these issues, including:




What does a proper analysis of wireless competition look like?
What should we expect from the FCC's report this year?
How should the FCC address competition in the future?


Our panel will feature Thomas W. Hazlett, Professor of Law & Economics, George Mason University School of Law; Joshua D. Wright, Assistant Professor of Law, George Mason University School of Law; Robert M. Frieden, Professor of Telecommunications & Law, Penn State University; and Harold Feld, Legal Director, Public Knowledge.



When: Wednesday, May 18, 2011, 4 – 5:30 p.m. (with a reception to follow)



Where: George Mason University's Arlington Campus, just ten minutes from downtown Washington. (Founders Hall, Room 111, 3351 N. Fairfax Drive, Arlington, VA)



To RSVP for yourself and your guests, please contact Megan Gandee at 703-993-4967 or mmahan@gmu.edu no later than May 16, 2011. If you can't make it to the Mercatus Center, you can watch this discussion live online at mercatus.org.




Published on April 25, 2011 07:15

April 22, 2011

Video: Debating Privacy & Online Advertising on the Stossel Show

On this week's John Stossel show on Fox Business Network, I debated Internet privacy, advertising, and data collection issues with Michael Fertik of Reputation.com. In the few minutes we had for the segment, I tried to reiterate a couple of key points that we've hammered on repeatedly here in the past:




There's no free lunch. All the free sites and services we enjoy online today are powered by advertising and data collection. [see this op-ed]
In most cases there is no clear harm, and what some argue is harm can also have many benefits that are rarely discussed. [see this paper.]
There's little acknowledgement of the trade-offs involved in having government create an information control regime for the Internet. [see this filing and these three essays: 1, 2, 3.]
The ultimate code of "fair information practices" is the First Amendment, which favors free speech, openness, and transparency over secrecy and information control. [see this piece.]
"Hands Off the Net" is a policy that has served us well. There are dangerous ramifications for our economy and long-term Internet freedoms if we continue down the road of "European-izing" privacy law here in the States. [see this essay and this filing.]
At some point, personal responsibility needs to come into the equation. With so many privacy-enhancing empowerment tools already on the market, it raises the question: if consumers don't take steps to use those tools, why should government intervene and take action for them?


Anyway, here's the 7-min video of the debate between Fertik and me:






Published on April 22, 2011 20:41

For The Last Time: The Bell System Monopoly Is Not Being Rebuilt

Believe it or not, this argument is being trotted out as part of the pressure from consumer activist groups against AT&T's proposed acquisition of T-Mobile. The title of a Senate Judiciary hearing on the merger, scheduled for May 11, even asks, "Is Humpty Dumpty Being Put Back Together Again?"



It seems that because the deal would leave AT&T and Verizon as the country's two leading wireless service providers, the blogosphere is aflutter with worries that we are returning to the bad old days, when AT&T pretty much owned all of the country's telecom infrastructure.



It is true that AT&T and Verizon trace their history back to the six-year antitrust case brought by the Nixon Justice Department, which ended in the 1984 divestiture of then-AT&T's 22 local telephone operating companies, which were regrouped into seven regional holding companies.



Over the last 28 years, there has been gradual consolidation, each time accompanied by an uproar that the Bell monopoly days were returning. But those claims miss the essential goal of the Bell break-up, and why, even though those seven "Baby Bell" companies have been integrated into three, there's no going back to the pre-divestiture AT&T.





The Bell System monopoly was vertically integrated. Not only did it have a monopoly on local services, it operated the only long-distance company, it handled all incoming and outgoing international calls, and, most important, its wholly-owned subsidiaries, Bell Labs and Western Electric, developed, manufactured and sold all network equipment from the switches and cable to the phone in your home and office.



The claim that the AT&T and T-Mobile merger will remake the Bell System is undone by recalling why AT&T was broken up in the first place. It had little to do with its being a monopoly provider of residential telephone service. Remember, under the final judgment, the seven spin-off companies retained their local monopolies.



The problem was that AT&T was its own supply chain. As such, in the 1960s and '70s, as the computer industry went through massive upheaval because of the rapid and disruptive strides being made by semiconductor companies, AT&T remained insulated in a bubble. Since AT&T bought only from AT&T, it could dictate the pace of telecom technology evolution, say from mechanical switches to electronic and then digital ones. This was virtually the opposite of what was happening in computers and data networking, where central mainframe-based architectures were being disintermediated by distributed computing. IBM and Sperry were giving way to Digital Equipment Corp. and Wang, and ultimately to Microsoft and Apple.



It's arguable, at least, that the demand for faster data networking, driven by the trend toward distributed intelligence, created the policy pressure for the Bell break-up and for competitive telecom in general. MCI, which provided the first long-distance alternative for businesses, appeared on the scene in the late '70s. At the same time, spurred by the 1968 Carterfone decision that permitted end users to attach their own terminal equipment to the AT&T network, intense competition broke out for office phone systems, especially those that could integrate data networking. More and more, it seemed as if AT&T's ironclad grip on the U.S. public network was an obstacle to innovation, not the enabler it had purported itself to be for decades (and one of the legs on which it rested its whole "natural monopoly" argument).



In fact, conventional wisdom at the time was that the government would force AT&T to divest Western Electric and Bell Labs in order to create a competitive market for network infrastructure. Divestiture accomplished this somewhat, because it separated local exchange infrastructure from AT&T's control. Ironically, it was market forces that accomplished what regulators had hoped for when, in 1996, AT&T divested Western Electric: by then AT&T itself was the Bell companies' biggest competitor, and that rivalry was straining the equipment unit's ability to sell into that segment.



So while the divested Bell companies have re-merged, even to the point of consuming their former parent, they have no control over the supply chain, and therefore, cannot control prices or product development the way AT&T did pre-break-up.



Keeping things in the context of wireless for now, as that's what's driving the AT&T and T-Mobile deal, it's clear consumers are impatient for the latest smartphone models. Even as a merged unit, AT&T and T-Mobile could not arbitrarily choose when and where to release new technology the way the Bell System once could. Quite the opposite: there continues to be an ongoing race over which company can deliver the most popular phones on the best terms. Case in point was the hoopla surrounding Verizon's introduction of the iPhone earlier this spring, which was accompanied by price cuts as well as hints of the new iPhone model expected in the fall. In the meantime, a geek war has broken out over the utility and relative benefits of Apple's iOS-based iPhone and Google's Android operating system. Certainly the debate gets confusing, overwrought and tiresome, but that's because consumers can vote with their pocketbooks. In the Bell System days, there were no such dialogues because there was no such choice.



Most other arguments fall apart, too. Size by itself is not an antitrust argument. Nor is duopoly or triopoly. It takes a certain level of capital and heft to operate a nationwide network, and the fact that post-merger, there will still be three national companies competing alongside regional players speaks to the competitiveness of the industry. AT&T and Verizon have similar market share numbers, and although Sprint lags, it has a healthy share of the government sector. It is not as weak as the media suggests.



Market share is also an imprecise measure of competition and consumer harm. A company with 80 percent market share may be doing nothing illegal; it may hold that share because low prices and innovative products yield loyal customers. Cisco Systems, which makes Internet routers and switches, is a good example: it dominates its segment, yet does so through aggressive innovation, quality products, and strong customer support.



At the same time, we have seen companies whose market shares pundits deemed unassailable wilt in the face of a newcomer that could provide more utility or expose a weakness. Witness Firefox against Internet Explorer, Facebook against MySpace, and, in wireless devices, Apple against Nokia.



Others have raised the customer service issue: AT&T consistently ranks low in customer service surveys. But this metric cannot by itself establish "customer harm," because there is no predicting how the rankings might change with the addition of T-Mobile, which has good customer satisfaction ratings. The measurements are also subjective; everyone complains about the phone company. In AT&T's case, the equally measurable popularity of the iPhone seems to offset those complaints. It also undermines the market share argument, raising the question of why a company whose service is reportedly so inferior poses such a threat to competition. And just to be snarky: if antitrust approval hinged on customer service, United Airlines would never have been allowed to merge with Continental.



A valid antitrust case must show that the merger will allow AT&T to illegally or unfairly limit options for consumers. In U.S. antitrust law, this usually means determining whether a dominant company can use its size to undermine or drive out otherwise healthy competitors by controlling access to other parts of the supply chain, such as manufacturing, transportation, or distribution. In modern antitrust jurisprudence, leveraging size to speed innovation, respond to market needs, or attract investment dollars is not seen as unfair or illegal. This distinction guards against the use of the courts to protect or prop up uncompetitive companies. (European antitrust, however, is a different animal.)



The AT&T/T-Mobile merger is a sign of a maturing market, not the reconstitution of the monopoly that existed 30 years ago in an environment very different from today's. Bottom line: economies of scale could not sustain seven regional telecommunications companies. Far from "unthinkable," as former FCC Chairman Reed Hundt once declared, their consolidation was inevitable. A few prescient analysts, including Victor Schnee and Allan Tumolillo in the landmark study "Taking Over Telephone Companies," predicted this very thing as far back as 1990.



Broadly speaking, we are entering a new phase of service provision, in which wireless stands to be a much more competitive "last mile" technology for broadband. This will reshuffle the players and the stakes again. The former AT&T companies dominate wireless, to be sure, but on the wireline side, cable companies have the competitive advantage. The right approach to this merger is to view it in the context of the evolving broadband market. From this perspective, AT&T and T-Mobile won't be one wireless company among three, but one national broadband player among six or seven.



Let the competition grow.




Published on April 22, 2011 11:26

April 20, 2011

DC event on Internet governance

"Global Internet Governance: Research and Public Policy Challenges for the Next Decade" is the title for a conference event held May 5 and 6 at the American University School of International Service in Washington. See the full program here.



Featured will be a keynote by the head of the NTIA, Assistant Secretary of Commerce Lawrence Strickling. TLF-ers may be especially interested in the panel on the market for IP version 4 addresses that is emerging as the Regional Internet Registries and ICANN deplete their free pool of IP addresses. The panel, "Scarcity in IPv4 addresses," will feature representatives of the American Registry for Internet Numbers (ARIN) and Addrex/Depository, Inc., the new company that brokered the address deal between Nortel and Microsoft. There will also be debates about WikiLeaks and the future of the Internet Governance Forum. Academic research papers on ICANN's Affirmation of Commitments, the role of national governments in ICANN, the role of social media in the Middle East/North Africa revolutions, and other topics will be presented on the second day. The event was put together by the Global Internet Governance Academic Network (GigaNet). Attendance is free of charge, but you are asked to register in advance.




Published on April 20, 2011 14:35
