Adam Thierer's Blog, page 147

January 28, 2011

Egyptian Government Attacks Egypt's Internet

In response to civil unrest, the Egyptian government appears to have ordered service providers to shut down all international connections to the Internet. According to the blog post at the link just above, Egypt's four main ISPs have cut off their connections to the outside world. Specifically, their "BGP routes were withdrawn." The Border Gateway Protocol is what most Internet service providers use to establish routing between one another, so that Internet traffic flows among them. I anticipate we might have comments here that dig deeper into specifics.
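To make the mechanism concrete: withdrawing BGP routes means other networks simply lose any path to the withdrawn address space. The toy sketch below (hypothetical prefix and AS names, not Egypt's actual routing data) shows the effect on a simplified routing table:

```python
# Toy illustration of BGP route withdrawal (hypothetical prefixes and AS
# names, not real routing data): once an ISP's routes are withdrawn, the
# rest of the Internet has no path to its networks.

class ToyRoutingTable:
    def __init__(self):
        self.routes = {}  # prefix -> next-hop AS

    def announce(self, prefix, next_hop_as):
        # A BGP UPDATE announcing a route makes the prefix reachable.
        self.routes[prefix] = next_hop_as

    def withdraw(self, prefix):
        # A BGP UPDATE listing the prefix as withdrawn removes the path.
        self.routes.pop(prefix, None)

    def lookup(self, prefix):
        # Returns the next-hop AS, or None if no route exists.
        return self.routes.get(prefix)

table = ToyRoutingTable()
table.announce("196.200.0.0/16", "AS-EXAMPLE-ISP")
print(table.lookup("196.200.0.0/16"))  # AS-EXAMPLE-ISP: reachable

table.withdraw("196.200.0.0/16")
print(table.lookup("196.200.0.0/16"))  # None: traffic has nowhere to go
```

The real protocol is far richer (path attributes, policy, propagation), but the core point holds: no announced route, no connectivity.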



An attack on BGP is one of few potential sources of global shock cited by an OECD report I noted recently. The report almost certainly imagined a technical attack by rogue actors but, assuming current reporting to be true, the source of this attack is a government exercising coercion over Internet service providers within its jurisdiction.



That is far from an impossibility in the United States. The U.S. government has proposed both directly and indirectly to centralize control over U.S. Internet service providers. C|Net's Declan McCullagh reports that an "Internet kill switch" proposal championed by Sens. Joseph Lieberman (I-Conn.) and Susan Collins (R-Maine) will be reintroduced in the new Congress very soon. The idea is to give "kill switch" authority to the government for use in responding to some kind of "cyberemergency." We see here that a government with "kill switch" power will use it when the "emergency" is a challenge to its authority.



When done in good faith, flipping an Internet "kill switch" would be stupid and self-destructive, tantamount to an auto-immune reaction that compounds the damage from a cybersecurity incident. The more likely use of "kill switch" authority would be bad faith, as the Egyptian government illustrates, to suppress speech and assembly rights.



In the person of the Federal Communications Commission, the U.S. government has also proposed to bring Internet service providers under a regulatory umbrella that it could turn to censorship or protest suppression in the future. Larry Downes has a five-part analysis of the government's regulatory plan here on TLF (1, 2, 3, 4, 5). The intention of its proponents is in no way to give the government this kind of authority, but government power is not always used as intended, and there is plenty of scholarship to show that government agencies use their power to achieve goals that are non-statutory and even unconstitutional.



The D.C. area's surfeit of recent weather caused the cancellation yesterday of a book event I was to participate in, discussing Evgeny Morozov's The Net Delusion: The Dark Side of Internet Freedom. I don't know that he makes the case overwhelmingly, but Morozov argues that governments are ably using the Internet to stifle freedom movements. (See Adam's review, hear Jerry's podcast.)



Events going on here in the United States right now could position the U.S. government to exercise the kind of authority we might look down our noses at Egypt for practicing. The lesson from the Egypt story—what we know of it so far—is that eternal vigilance is the price of freedom.




 •  0 comments  •  flag
Share on Twitter
Published on January 28, 2011 06:58

January 27, 2011

Data Privacy Day is Upon Us

Data Privacy Day is January 28. And as Steve DelBianco writes at the NetChoice blog, now is an opportune time for it as Congress, the Commerce Department, and the Federal Trade Commission each have proposed new rights and rules for data privacy.



To appreciate Data Privacy Day you must first ignore the Euro-babble description of what Data Privacy Day is ("an international celebration of the dignity of the individual expressed through personal information") and take it for what it really is: a prodding for Internet users to take a critical look at how they share and communicate information online.



Importantly, this is not a day for governments, but for users. As Steve writes, "the role for government should be in areas where users and business cannot act alone, including law enforcement, international data flows, and pre-empting a patchwork of state laws. Government should use its powers to pursue online fraud and criminal misuse of data, not to create rules that narrowly prescribe what and how data should be used."



Also, check out the tech-friendly quotes from Obama's State of the Union in Steve's post.




Published on January 27, 2011 19:29

January 26, 2011

Badges? We don't need no stinking badges: Reading the FCC's Net Neutrality Order (Part V)

(Follow the links for Part I, Part II, Part III and Part IV.)



In this final post on the FCC's Dec. 23, 2010 Open Internet Report and Order, I'll look briefly at the problematic legal foundation on which the FCC has built its new regulations on broadband Internet access.  That discussion need only be brief largely because the extended legal analysis has already been admirably detailed by FCC Commissioner Robert McDowell.  His dissent (see pages 145-177 of the Report and Order) calmly and systematically dismantles the case made by the majority (see ¶¶ 115-150).



This is no theoretical discussion of statutory interpretation.  Even before the rules have been published in the Federal Register, two broadband providers—Verizon and then MetroPCS—have already filed lawsuits in the D.C. Circuit Court of Appeals challenging the FCC's authority to regulate.  (See Jim DeLong's definitive deciphering of Verizon's efforts to secure exclusive jurisdiction in the D.C. Circuit.)  The arguments sketched out in Commissioner McDowell's dissent are likely to mirror the complainants' briefs in these and likely other Petitions for Review of the Order.





The Need for Authorization





Nate Anderson of Ars Technica, who did a great service in running side-by-side the provisions in the FCC's final Order and the terms of Verizon-Google's proposed legislative framework, nonetheless misses the mark in his conclusion.  Why, he wonders, is Verizon challenging rules that so closely track with the ones it proposed?   Or, as Anderson puts it, "Why is Verizon suing over net neutrality rules it once supported?"



I wouldn't and didn't (see Part III) go as far as Anderson, who concludes "The similarity here is astonishing…."  Both the final rules and the Verizon-Google proposal closely tracked, with important differences, the original order the FCC proposed in October, 2009.  And there are material differences between what Verizon-Google proposed and what the FCC ultimately voted on, notably in the treatment of mobile broadband.



But those details aside, there is one crucial difference that Anderson acknowledges but doesn't really credit.  As he writes, "the Verizon/Google proposal did make one other suggestion: it should be passed by Congress, not the FCC…."



That might seem like a small enough difference.  Rules are rules, what difference if the FCC passed them under its rulemaking authority or if Congress had put them into a new statute, such as the Internet Freedom Preservation Act, which would have naturally given the FCC authority to enforce them anyway?



But in fact that procedural difference embodies the principal objection not only to the Report and Order but to the process by which it was completed. Put simply, Congress alone has the power to regulate; the FCC can only act on authority delegated to it by Congress.  Any rulemaking undertaken without authority is not only dangerous, but also unconstitutional.



And Congress, it's clear, has not delegated authority to the FCC to regulate broadband Internet access.  What Verizon and others are most concerned with is that if the FCC somehow gets away with passing new rules anyway, the agency will have established a dangerous precedent.  Any time in the future that the FCC or any other federal independent agency wants to extend its power, it need only deputize itself.



That is the feature of the Open Internet Report and Order that has most alarmed the communications industry, members of Congress, and advocates of limited government.  And that is principally why the House has promised to reverse the ruling, even as Verizon and others challenge it in court.  In short, the text of the rules aside, it very much matters that the FCC, and not Congress, took up elements of the framework proposed by Verizon-Google.



Regulatory Overreach is not a New Problem





The problem of regulatory overreach goes far beyond net neutrality.  Under a novel and somewhat fragile arrangement that was worked out during the New Deal, independent federal regulatory agencies can exercise considerable authority that the Constitution, on its face, reserves to the Legislative and Judicial branches.  Indeed, the early New Deal Supreme Court overturned much of FDR's regulatory agenda under the so-called "nondelegation doctrine."



After FDR threatened to "pack the court" with more sympathetic Justices, a key swing Justice changed sides, saving the Court and the New Deal.  (The so-called "switch in time that saved nine," which few people realize is a pun on the sewing parable of a "stitch in time saves nine.")



But even so, federal regulators operate under strict controls that ensure they do not become, to use the Supreme Court's word for earlier FCC power grabs, "untethered" in their authority.  FCC Commissioners are appointed by the President and confirmed by the Senate, and can only be removed from office by impeachment.  At least two of the five Commissioners must be members of a party different from the President's.



Both the rulemaking (legislative) and adjudicatory (judicial) powers of the agency are strictly limited by implementing statutes passed by Congress.  If the agency hasn't been given explicit powers to regulate, then regardless of the appearance or reality of significant market failures, only Congress can delegate additional powers.  And the courts, in the checks-and-balances system, are the final determinants of what powers have and have not been granted to an agency.



So the FCC has a problem.  It wants to regulate broadband Internet providers to ensure the "level playing field" it believes essential to the success of the Internet.  But Congress has never given it that authority, and has failed since 2004 to pass new legislation that would grant it.



The FCC has actually lost ground during the rulemaking process.  An effort to enforce its Open Internet policy statement through adjudication against Comcast was rejected in April, 2010, further limiting the wiggle room the agency might have had to go forward with the formal rulemaking it began in October 2009.   (The rulemaking was, in some sense, an effort to formalize the policy statements.)



What's the problem?  Briefly:  Under the Telecommunications Act of 1996, and consistent with earlier versions of the FCC's implementing statute, the agency was given broad authority over common carrier telephone service (Title II of the Act) but almost no authority over information services or what used to be known as "enhanced" or "ancillary" services (pre-Internet access, these included call waiting and other supplements to telephone service) (Title I of the Act).  The one exception was Internet access provided by dial-up modems, which of course is no longer a significant source of access.



The Comcast case, in line with several earlier D.C. Circuit and Supreme Court cases, made clear that Title I simply did not delegate authority over broadband access.



There was nothing new in that.  The FCC has made numerous efforts to attach otherwise unauthorized regulations to Title I's so-called "ancillary jurisdiction," but the courts frequently reject these efforts as overreaching.



For example, in 2005 the D.C. Circuit rejected regulations the FCC approved that would have required consumer products manufacturers to include "broadcast flag" technology in any device capable of receiving a television signal—a regulation that was grounded in ancillary jurisdiction over television broadcasters.  But while the agency had unquestioned authority over broadcasters, it could not require non-broadcasters to comply with rules aimed at helping the broadcasters control unauthorized home taping.



At oral argument, the judges nearly laughed the FCC out of court.  "You're out there in the whole world, regulating. Are washing machines next?" asked Judge Harry Edwards. Judge David Sentelle added, "You can't regulate washing machines. You can't rule the world."



The result in the Comcast case was much the same.  And the October, 2009 NPRM had grounded its authority to proceed solely in Title I.  With that avenue all but foreclosed to the agency by Comcast, the Chairman found himself in one of several corners he inhabited over the last year.  Congress was unlikely to move on any of the net neutrality bills floating around committees (and indeed, did not do so), but Genachowski was committed to the rulemaking.



The FCC's "Very Smart Lawyers" Try Again





What to do?  One option was to undertake a "reclassification" of broadband Internet to categorize it as a telephone service subject to Title II, a section of the law that comes with fifty-plus years of baggage from the regulation of the former telephone monopoly.  The Commission (for now) has wisely avoided taking that step, which itself would have been subject to substantial legal challenges.



The authority stalemate seemed to doom the net neutrality proceeding.  But then in late Fall FCC Chairman Julius Genachowski told the audience at the Web 2.0 Summit that the FCC's "very smart lawyers" had figured out a way to get around the Title I/Title II problem.  The net neutrality faithful and faithless waited, with breath held.



In the final Report and Order, however, all we really got was a rerun of the argument that had failed in the Comcast case, with only minor tweaking.  Again, Commissioner McDowell's detailed dissent explains the weakness of the argument without the need for much added commentary.



The courts have consistently told the FCC that to invoke ancillary jurisdiction, a rulemaking must be reasonably related to a specific delegated power elsewhere in the Communications Act.  It has to be "ancillary" to some other authority the Commission already has, in other words.  Title I gives no powers on its own over "information services."  In the Comcast case, the FCC listed off several provisions in hopes that at least one of them would stick, but the court rejected all of them.



In the Order (¶¶ 124-137), the FCC tries several new provisions.  Obviously the best bets were already exhausted in the Comcast case, so the provisions invoked here provide even weaker bases for ancillary authority over broadband Internet than the laundry list rejected by the court in Comcast.  Most get only perfunctory explanation.  The FCC knows it is on thin ice.



Instead, the Order relies principally on a new and unconvincing reading of Section 706 of the Act.  (See ¶¶ 117-123)  Section 706 had formed the principal argument in Comcast as well, but there the agency argued that Section 706 was the provision that enabled it to use ancillary authority over Title I Information Services.  The court rejected that argument.



The revised Section 706 argument is that that provision in and of itself provides sufficient authority for the FCC to implement the Open Internet rules.  Well, here it is:



SEC. 706. ADVANCED TELECOMMUNICATIONS INCENTIVES.



(a) IN GENERAL-The Commission and each State commission with regulatory jurisdiction over telecommunications services shall encourage the deployment on a reasonable and timely basis of advanced telecommunications capability to all Americans (including, in particular, elementary and secondary schools and classrooms) by utilizing, in a manner consistent with the public interest, convenience, and necessity, price cap regulation, regulatory forbearance, measures that promote competition in the local telecommunications market, or other regulating methods that remove barriers to infrastructure investment.



(b) INQUIRY-The Commission shall, within 30 months after the date of enactment of this Act, and regularly thereafter, initiate a notice of inquiry concerning the availability of advanced telecommunications capability to all Americans (including, in particular, elementary and secondary schools and classrooms) and shall complete the inquiry within 180 days after its initiation. In the inquiry, the Commission shall determine whether advanced telecommunications capability is being deployed to all Americans in a reasonable and timely fashion. If the Commission's determination is negative, it shall take immediate action to accelerate deployment of such capability by removing barriers to infrastructure investment and by promoting competition in the telecommunications market.



(c) DEFINITIONS- For purposes of this subsection:



(1) ADVANCED TELECOMMUNICATIONS CAPABILITY- The term `advanced telecommunications capability' is defined, without regard to any transmission media or technology, as high-speed, switched, broadband telecommunications capability that enables users to originate and receive high-quality voice, data, graphics, and video telecommunications using any technology.



On its face, neither 706(a) nor 706(b) would appear to give the FCC power to put regulatory constraints of any kind on how broadband Internet access providers operate.  The goal of this section is to encourage the FCC to promote broadband by "regulating methods that remove barriers to infrastructure investment," including forbearance from use of its existing powers.  The history of this provision, as Commissioner McDowell explains, was aimed at removing regulations of Title II telephone carriers that hindered their ability to provide advanced telecommunications capability.



The reliance on Section 706(b) is even stranger, and deeply cynical.  It requires the FCC to issue a regular report on broadband deployment and, if it finds such deployment is not taking place in a "reasonable and timely manner," to take "immediate action to accelerate deployment" by "removing barriers" to investment.



Again, as Commissioner McDowell notes, the 706(b) Reports have consistently found broadband deployment to be proceeding at a rapid pace, confirming what everyone already knows.  Americans are signing on to the Internet faster than with any previous information technology, whether through wireline or, increasingly, wireless broadband.



That is, until July, 2010, a few short months after the Comcast decision.  For the first time ever, the 706(b) Report found that "broadband deployment to all Americans is not reasonable and timely."  (The Report, along with the Open Internet Order, was approved on a party-line 3-2 vote of the Commission.)  This despite the fact that broadband availability grew from 15% of Americans in 2003 to 95% in 2010 (data made available in the National Broadband Plan as well).



The negative 706(b) Report was clearly a pretext to give the agency the ability to trigger the "immediate action" language of Section 706(b), but even then, see above, the action the FCC is supposed to take is in the nature of deregulating broadband, not adding additional regulations.  How will rules that limit the operational flexibility of broadband providers "accelerate deployment"?  The majority argues simply (¶ 123) that "Section 706(b) provides express authority for the pro-investment, pro-competition rules we adopt today."  Hardly.



The effort to connect Section 706 to the Open Internet rules is, charitably, flimsy at best.  But there's yet another problem.  The FCC has already foreclosed that connection.  The agency has long rejected the view it now adopts that Section 706 provides any explicit authority for rulemaking, whether on its own (the new argument) or as a hook for ancillary jurisdiction under Title I.



As the D.C. Circuit noted in the Comcast case (Slip. Op. at 30-31), "In an earlier, still-binding order, the Commission ruled that section 706 'does not constitute an independent grant of authority.'  Instead, the Commission explained, section 706 'directs the Commission to use the authority granted in other provisions . . . to encourage the deployment of advanced services.'"  So Section 706 doesn't give the agency any regulatory authority, just guidance on how to apply (or not) other provisions in the Act.  That, at least, has long been the FCC's own view of the law, a view courts will give considerable deference.



In dispensing with the Section 706 argument in Comcast, the court concluded that "Because the Commission has never questioned, let alone overruled, that understanding of section 706, and because agencies 'may not . . . depart from a prior policy sub silentio,' the Commission remains bound by its earlier conclusion that section 706 grants no regulatory authority."  (citations omitted)



That last sentence seemed to leave the door open just a crack for the FCC to "depart from its prior policy" in an explicit way.  And, it's possible to read the Report and Order as doing just that.  (See ¶ 122 for the majority's hilarious explanation for why it had never before noticed that Section 706 granted explicit authority.)



But not so fast.  While agencies have broad discretion to overrule earlier decisions, there must be some rational basis for doing so. There must be some changed circumstances, some evidence, some explanation that passes the sniff test. A reviewing court will look to see if there is some external evidence that justifies the changed interpretation of Section 706.



And there's nothing here that meets even that minimal standard.  Again, to quote Commissioner McDowell (Report at 148), "This move is arbitrary and capricious and is not supported by the evidence in the record or a change of law."  Losing the Comcast case is not reason enough, but that seems to be all that's happened to justify this surprising new understanding of a 15-year-old provision in the FCC's implementing statute.



Preserving which Internet again?





The rest of the FCC's "Authority" section, as noted, throws in the rest of the kitchen sink, largely provisions of Title II, that Comcast didn't already dispose of.  The connection between the Open Internet rules and the Commission's regulatory powers over telephone service, television and radio broadcasting, cable TV, and spectrum management is just too tenuous to be convincing to a reviewing court.  If that authority is close enough to support net neutrality, it's close enough to support anything, including, for example, the broadcast flag rules already overturned.



There's more.  Tying the net neutrality rules to problems of VoIP, IP-based television broadcasting, IP radio, and other video and audio services, as one of my law professors used to say, proves too much.  It actually undermines the FCC's position by bringing the agency's real problem here into sharp focus.



Since Congress last reviewed the agency's authority in 1996, the Internet's packet-switching protocols have quickly and surprisingly taken over as the super-dominant technology for all forms of communications, traditional and new.  The worlds of television, radio, and computing have changed completely, leaving little of the world the 1996 Act gave the FCC authority to regulate.  Even the "Internet" as we knew it in 1996 looks little like the robust ecosystem of digital life that we enjoy today.



Which brings us squarely back to the problem of "nostalgia" I described in the previous post.  The FCC is operating under a statute that has its origins in the 1930's, and which was last updated (poorly) fifteen years ago, when dial-up consumer Internet was still very much in its infancy.  The communications, computing and entertainment industries operated in silos with little overlap.  Each had its own established players and long histories of regulatory intervention.



But these and other related industries have all undergone nearly complete transformation in the intervening years, largely outside the FCC's notice or authority to intervene.  Device and content convergence is a reality.  Consumers now use far more computing resources than do businesses.



Meanwhile, those aspects of the industry still under strict FCC control—including Plain Old Telephone Service (POTS) and over-the-air television and radio—have gone into deep decline.  They've become legacy businesses that their owners can't even exit, because no one is interested in the dwindling assets.



That's no coincidence.  Those businesses (in some cases parts of companies whose unregulated operations are thriving), thanks to the regulatory environment in which they operate, are simply unable to respond quickly to rapidly evolving new technologies, applications, and consumer demands.   They suffer from a regulatory disease closely related to the Innovator's Dilemma.  They can't adapt, even if they had the will to do so.



Continued efforts, including this one, to fit round regulations into square statutory pegs underscore that the FCC has no authority over what has evolved to be our new and magical communications platform.   They have no authority because Congress hasn't given them any.  Period.



Moreover, invocations (incantations?) of outmoded, obsolete, and inapplicable provisions of the old communications law also remind us how much progress has been made during the period when the FCC has been unable or unwilling to intervene in the evolution of that platform.



Probably not the conclusion the FCC was hoping to have drawn from its nearly 200-page Report.  But there you have it.




Published on January 26, 2011 18:13

At Game::Business::Law 2011 – Livetweeting & Talking About Privacy

Hosted by SMU's Guildhall video game law graduate program, the Game::Business::Law summit is the leading conference in the field. Follow the discussion on the #GBL2011 hashtag. Here's the make-up of my privacy panel:



Moderator

Professor Xuan-Thao Nguyen, SMU Dedman School of Law

Speakers
Jennifer Archie, Partner, Latham & Watkins LLP

Andrew S. Ehmke, Partner, Haynes and Boone, LLP

Dr. Joshua Fairfield, Washington & Lee School of Law

Berin Szoka, Founder, TechFreedom


This is an all-star cast. Prof. Nguyen is a big name in the video game law field; I had the privilege to work with Jennifer Archie on Internet law when I practiced at Latham; and Josh Fairfield is one of the few law professors I find myself in perfect philosophical harmony with. Check out this summary of his excellent 2009 paper Virtual Parentalism. I only met Andy last night at the reception, but he's a solid thinker on the law of gaming. As they say on postcards: Wish you were here!




Published on January 26, 2011 08:02

Thoughts on the Future of Online Video Regulation

Last week, it was my great honor to speak at the 2011 State of the Net event, where I participated in a panel discussion about the future of the online video marketplace.  In an earlier essay, I mentioned how some of the discussion that day revolved around the Comcast-NBCU merger, which had just been approved by the FCC, but with unprecedented strings attached.  The heart of the panel discussion, however, was a debate about the future of online video and regulation of the video marketplace more generally. Also joining me on the panel were Susan Crawford of Cardozo Law School, William Lehr of MIT, Marvin Ammori of Nebraska Law School, and Richard Bennett of ITIF.





During my response time on the panel, which begins around 28:45 of the video, I made a couple of key points:




We're living in the golden age of video. In considering the state of the video marketplace, we need to put things in some historical context. We should appreciate just how far we've come from the "age of scarcity," in which we only had access to a handful of VHF and UHF broadcast channels in most communities, compared to present day. Indeed, we are blessed today to live in a world of information abundance. By the FCC's last count, 565 cable or satellite channels exist today and those channels and programs are available over more platforms (cable, satellite, telco, online, mail, etc.) than ever before.
Deregulation (or light-touch) rules helped. Video distribution and program diversity thrived as the FCC gradually loosened the regulatory chains or forbore from regulating emerging video platforms or programs.  By contrast, in the highly-regulated past, innovation, competition, and diversity were stagnant.
"Gatekeeper" control fears are bunk. Content continues to flow over multiple platforms in an unprecedented manner. That only makes sense since content creators and distributors have every incentive to get as much content pushed out on as many platforms as possible in order to make money! No one ever got rich in this space by locking up all their content. Moreover,  vertical integration of programming by MVPDs is at its lowest point in the past 20 years. The percentage of channels owned by video distributors has fallen from 50% in 1990 to around 15% today.
Youngsters today don't "watch TV" anymore. They watch YouTube, Hulu, Netflix, Apple TV, Google TV, Amazon, XBox Live, PlayStation, Roku, etc.  The video market is highly dynamic and subject to seemingly constant disruptive technological change.
Level the playing field in favor of more freedom. To the extent there is a regulatory asymmetry at work between the old media marketplace and the online or Internet video world, and to the extent policymakers are looking to "level the regulatory playing field" between them, I argued we should level the playing field in favor of freedom.
Clean up the old mess now. Therefore, the old rules need to go. Those rules would include must-carry mandates and other carriage requirements / compulsory licensing rules, retransmission consent rules, "localism" and other program content mandates, set-top box regs, advertising limitations, etc.
Or, at least don't extend old mess to new world. If lawmakers refuse to get rid of the old rules, however, we should erect a high and tight firewall between the old and new worlds and not muck up the new online video ecosystem with rules and regulations that would stifle the wonderful developments and diversity we are witnessing today.


See the entire State of the Net 2011 panel on YouTube here.




Published on January 26, 2011 07:58

January 25, 2011

What I Learned About Wireless Broadband Watching the State of the Union Coverage

In previous posts, I've criticized the Federal Communications Commission for arbitrarily jacking up the speed in its definition of broadband (to 4 Mbps download / 1 Mbps upload) so that third-generation wireless does not count as broadband. This makes broadband markets appear less competitive.  It also expands the "need" for universal service subsidies for broadband, since places that have 3G wireless but not wired broadband get counted as not having broadband.



The FCC's definition is based on the speed necessary to support streaming video.  I rarely watch video on my computer. But tonight I had a chance to test the wisdom of the FCC's definition.  I'm in rural southern Delaware with broadband access only via a 3G modem. I wanted to watch more State of the Union coverage than the broadcast channels out here carried. So, I fired up the old PC and watched things on CNN.com.  The video showed up fine and smooth, and it didn't even burp when I opened another window to start working on this post.
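The anecdote above can be sanity-checked with a little arithmetic. The sketch below uses illustrative numbers I'm assuming for 3G throughput and a standard-definition web stream (they are not FCC measurements): a connection can fall well short of the FCC's 4 Mbps threshold and still comfortably carry streaming video.

```python
# Back-of-the-envelope check (illustrative numbers, not FCC data):
# can a typical 3G connection sustain an SD video stream even though
# it falls short of the FCC's 4 Mbps broadband definition?

FCC_BROADBAND_DOWN_MBPS = 4.0   # FCC's 2010 broadband threshold (download)
TYPICAL_3G_DOWN_MBPS = 1.5      # assumed typical 3G downlink throughput
SD_STREAM_MBPS = 1.0            # assumed bitrate of an SD web video stream

def counts_as_broadband(down_mbps):
    """Does this connection meet the FCC's download-speed definition?"""
    return down_mbps >= FCC_BROADBAND_DOWN_MBPS

def can_stream(down_mbps, stream_mbps):
    """Can the connection sustain the stream's bitrate?"""
    return down_mbps >= stream_mbps

print(counts_as_broadband(TYPICAL_3G_DOWN_MBPS))         # False
print(can_stream(TYPICAL_3G_DOWN_MBPS, SD_STREAM_MBPS))  # True
```

In other words, under these assumed numbers the connection "isn't broadband" by the FCC's definition yet passes the very streaming-video test the definition is supposedly based on.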



So now I have not just analysis that questions the FCC's definition of broadband, but that most precious of commodities in Washington regulatory debates: AN ANECDOTE!!!




Published on January 25, 2011 20:14

Video of Next Digital Decade & TechFreedom Launch Event Now Available

TechFreedom launched last week with a half-day symposium dedicated to our first publication, The Next Digital Decade: Essays on the Future of the Internet, including a fireside chat with FCC Commissioner Robert McDowell, three panels, and a conversation about TechFreedom and its mission. Santa Clara Law Professor Eric Goldman, who has three essays in the book, provides a detailed write-up of the discussion on his blog.



Read a summary of the book here, or our Manifesto for TechFreedom. You can watch or download video from the event below (download links are at the bottom).





Fireside Chat: FCC Cmr. Robert McDowell & CNET's Declan McCullagh



Welcome & Interview: Declan McCullagh & Berin Szoka



Panel 1: Internet Optimism, Pessimism & the Future of Online Culture

Berin Szoka, TechFreedom (Moderator)
Andrew Keen, author of Cult of the Amateur
Adam Thierer, Mercatus Center
Prof. Ann Bartow, South Carolina School of Law
Prof. Frank Pasquale, Seton Hall Law School



Panel 2: Internet Exceptionalism & Intermediary Deputization

Adam Thierer, Mercatus Center (Moderator)
Prof. Eric Goldman, Santa Clara School of Law
Josh Goldfoot
Prof. H. Brian Holland, Texas Wesleyan School of Law
Prof. Mark MacCarthy, Georgetown University
Prof. Frank Pasquale, Seton Hall Law School



Panel 3: Who Will Govern the Net in 2020?

Berin Szoka, TechFreedom (Moderator)
Prof. David Johnson, New York Law School
Prof. Milton Mueller, Syracuse University
Shane Tews, VeriSign
Chris Wolf, Hogan Lovells



Download Links 

You can download video of these panels (in MPEG-4 h.264/AAC format) from the links below:



Fireside chat (54.3 MB)
Welcome (16.6 MB)
Panel 1 (129.6 MB)
Panel 2 (111.4 MB)
Panel 3 (133.4 MB)


Published on January 25, 2011 10:06

Sean Lawson tempers cyber doom


On this week's podcast, Sean Lawson, an assistant professor in the Department of Communication at the University of Utah and a contributor to the Forbes.com security blog, The Firewall, discusses his new Mercatus Center working paper, Beyond Cyber-Doom: Cyberattack Scenarios and the Evidence of History. Cybersecurity may be the new black, but it has been a significant policy issue since the 1980s. Lawson talks about the current cybersecurity discourse, addressing the conflation of diverse threats, the overemphasis on hypothetical doom scenarios, and the resulting effects on policy proposals. He then looks to the history of disasters, including blackouts, the attacks of 9/11, and Hurricane Katrina, to help estimate the impacts of potential cyber disasters. Lawson also discusses incorrect doomsday predictions about WWII aerial bombardment, and he offers a few conclusions and policy recommendations based on his research.



Related Links


Beyond Cyber-Doom: Cyberattack Scenarios and the Evidence of History
Cyberwar Discourse Project, University of Utah College of Humanities
"How Non-Geek Government Can Make Cyber Policy", by Lawson
"WikiLeaks And The Ongoing Beat Of The Cyberwar Drums", by Lawson


To keep the conversation around this episode in one place, we'd like to ask you to comment at the web page for this episode on Surprisingly Free. Also, why not subscribe to the podcast on iTunes?




Published on January 25, 2011 05:00

January 22, 2011

2/22 DCBar Event: "Video Convergence: Bringing the Internet to the Television & Putting Pay TV Programming Online"

The DCBar's Computer and Telecommunications Law Section, on whose Steering Committee I sit, is co-hosting a fascinating brown bag lunch on February 22, 12:15 p.m. to 1:45 p.m., at the District of Columbia Bar, 1101 K Street N.W., Conference Center, Washington, D.C. 20005.



Online content on your television and your mobile handset; cord cutting; televisions that surf the web and store your family photos…. Are we there?  Do we want to be there?  This panel will explore the latest marketplace developments as well as the legal and policy challenges surrounding this convergence of formerly distinct realms.  Among other issues, the panelists will discuss:  how consumers today are experiencing video and how they are likely to do so in the near future; what technical and legal issues will affect these ongoing marketplace developments; and what actions the FCC is likely to take – and should or should not take – to facilitate competition and choice for consumers, including its planned "AllVid" rulemaking.


Speakers:




Rick Chessen, Senior Vice President, Law and Regulatory Policy, National Cable & Telecommunications Association
Stacy Fuller, Vice President of Regulatory Affairs, DIRECTV
Michael Petricone, Senior Vice President of Government Affairs, Consumer Electronics Association
Gigi Sohn, President, Public Knowledge


Moderators:




Matthew Brill, Partner, Latham & Watkins LLP
Lynn Charytan, Vice President, Legal Regulatory Affairs, Comcast Corp.



Published on January 22, 2011 08:55

January 21, 2011

OECD: "Cyberwar" Overhyped

(HT: Schneier) Here's a refreshingly careful report on cybersecurity from the Organization for Economic Cooperation and Development's "Future Global Shocks" project. Notably: "The authors have concluded that very few single cyber-related events have the capacity to cause a global shock." There will be no cyber-"The Day After."



Here are a few cherry-picked top lines:



Catastrophic single cyber-related events could include: a successful attack on one of the underlying technical protocols upon which the Internet depends, such as the Border Gateway Protocol which determines routing between Internet Service Providers, or a very large-scale solar flare which physically destroys key communications components such as satellites, cellular base stations and switches. For the remainder of likely breaches of cybersecurity such as malware, distributed denial of service, espionage, and the actions of criminals, recreational hackers and hacktivists, most events will be both relatively localised and short-term in impact.


The vast majority of attacks about which concern has been expressed apply only to Internet-connected computers. As a result, systems which are stand-alone or communicate over proprietary networks or are air-gapped from the Internet are safe from these. However these systems are still vulnerable to management carelessness and insider threats.


Analysis of cybersecurity issues has been weakened by the lack of agreement on terminology and the use of exaggerated language. An "attack" or an "incident" can include anything from an easily-identified "phishing" attempt to obtain password details, a readily detected virus or a failed log-in to a highly sophisticated multi-stranded stealth onslaught. Rolling all these activities into a single statistic leads to grossly misleading conclusions. There is even greater confusion in the ways in which losses are estimated. Cyberespionage is not a "few keystrokes away from cyberwar", it is one technical method of spying. A true cyberwar is an event with the characteristics of conventional war but fought exclusively in cyberspace.


The hyping of "cyber" threats—bordering on hucksterism—should stop. Many different actors have a good deal of work to do on securing computers, networks, and data. But there is no crisis, and the likelihood of any cybersecurity failure causing a crisis is extremely small.




Published on January 21, 2011 10:29
