Tyler Cowen's Blog, page 348
June 14, 2013
Finance and the Common Good
That is a new paper by E. Glen Weyl, and here is one excerpt:
Using relatively crude methods to translate publicly-available data into income estimates, I estimate that approximately 40% of income of authors in both fields comes from consulting activities, roughly consistent with self-reported income figures for the broader profession in the National Center for Education Statistics’s National Study of Postsecondary Faculty. Second, I show that consulting activity in industrial organization is primarily policy-oriented while consulting work in finance is primarily geared toward private interests. Together these two facts are weakly suggestive that material incentives are at least complementary with research focus.
But this is not Inside Job either:
…the view I put forward is not that financial economists defended the interests of the firms for which they worked in regulatory disputes; this is precisely what I believe industrial organization economists, in contrast to financial economists, often did. Instead it is that financial economists were simply not interested in such disputes, instead focusing on aiding private accumulation of wealth in markets rather than pushing public policy in one direction or the other.
Glen predicts that future work in finance will become more like research in industrial organization and move in a more legal and regulatory and policy-oriented direction, as the real world shifts toward greater regulation of finance.
June 13, 2013
Assorted links
1. Indian ferris wheel (video, recommended; at least five different aspects of this material fascinate me).
2. A critical view of seasteading.
3. Japanese underground bicycle parking, more interesting than it sounds.
4. John Nye on James Buchanan.
5. 12 earth scars.
6. The economics of Netflix expansion.
Breakthrough with Honduran charter cities
Written reports from Central America often require Straussian skills, but at least on the surface it would appear that Honduras will go forward with some version of the free city/charter city idea. A translation passing through Google, Tom Bell, and Lotta Moberg (not holding any one of them responsible for it, but to my eye it appears acceptably close) indicates:
“The Law complements the amendments to Articles 294, 303 and 329 of the Constitution which paved the way for the creation of these special areas. [Those amendments fixed the problems that caused the Honduran S.Ct. to strike down the earlier version of the statute, which aimed to establish REDE.] The ZEDE legislation authorizes the establishment of courts with exclusive jurisdiction, which may adopt legal systems and traditions of other parts of the world, provided that they ensure equal or better protection of constitutional human rights protected under Honduran law.”
The legislation was hardly crammed down the legislature’s throat. As I mentioned, the Honduran S.Ct. struck down an earlier version of the statute. The ZEDE legislation sparked “a fierce debate because several municipalities fear losing their autonomy and tax collection.” (The answer to those objections, in floor debate: You can arrange annexation by the ZEDE, winning the same legal status.)
Interested in moving there? “The ZEDE may establish coexistence agreements with people who wish to live or reside freely within their jurisdiction.”
There is a Honduran Spanish-language link here (it doesn’t work in every browser, but experiment). It starts with this, which in translation seems clear enough:
The special organic law that will regulate the Zones for Employment and Economic Development (ZEDE), the new version of the “model cities,” was approved yesterday by the National Congress in its final debate, which leaves the door open for foreign businesses to invest in specific regions with rules different from those in the rest of the country and with their own autonomy.
Developing…
And for the pointer I thank Lotta Moberg.
Austrian markets in everything, rising inequality edition
An Austrian hotel is advertising for a modern-day court fool, who is communicative, extroverted, musical, creative and imaginative.
Applicants are asked to bring — and play — their musical instrument during the job interview. Also welcome: creative costumes. The successful candidate will earn 1,400 euros — around $1,900 — a month.
Hotel director Melanie Franke says those interested should not think they’re on a fool’s errand in applying. She says the idea is to treat guests like royalty, noting that “jesters were a luxury that royal families indulged themselves in.”
Here is a little bit more.

How sticky are wages anyway?
On the front of this new Elsby, Shin, and Solon paper (pdf) it reads “Preliminary and incomplete,” but if anything that is a better description of the pieces which have come before theirs. They have what I consider to be the holy grail of macroeconomics, namely a worker-by-worker micro database of nominal wage stickiness under adverse economic conditions, including the great recession, with over 40,000 workers drawn from the Current Population Survey.
Here are a few results:
1. When looking at the distribution of nominal wage changes, there is always a spike at zero.
2. That said, the spike, ranging from six to twenty percent, isn’t as big as one might expect.
3. The fraction of hourly workers reporting a nominal wage reduction always exceeds ten percent, and the fraction of non-hourly workers reporting a nominal wage reduction always exceeds twenty percent.
3b. In 2007-2008, 37.1% of U.S. workers in the non-hourly sample experienced negative nominal wage changes. That’s a lot. In the following years that figure was over thirty percent. See Table 6 on p.24 (and the sketch after this list for how such shares are computed).
4. These figures are for workers who stay with the same employer for a year or more, and thus they are from sectors where nominal stickiness is especially likely. Overall nominal stickiness is probably considerably smaller than those figures indicate, as the broader pool of workers includes temps, those on commissions, those with short-term jobs, and so on.
5. If you compare the great recession to earlier downturns, “…initial evidence appears to be weak for a simple story in which the combination of downward stickiness in nominal wages and low inflation has generated high unemployment through excessive rates of job loss.” If it were primarily a story of sticky nominal wages, we should have expected layoff rates to be even higher than they were.
6. Overall wages are less sticky in the UK than in the U.S.; for instance “the proportion [of measured UK workers] experiencing nominal wage cuts regularly has run in the neighborhood of 20 percent.” (And here are some recent related results.)
7. Other studies with true microdata also find strongly procyclical real wages, often mediated through changes in nominal wages, including nominal wage declines.
8. The slowdown in real wage growth for U.S. women, during the great recession, follows puzzling patterns.
9. None of these figures include wage changes which take the form of changes in the quality of working conditions, chances of promotion, fringe benefits, and so on.
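To make the calculations behind points 1 through 3b concrete, here is a minimal sketch in Python (using pandas) of how such shares are tabulated, assuming a tidy hypothetical panel of matched worker-year observations; the column names and toy numbers are mine, not the paper’s, which works from matched Current Population Survey records:

```python
import pandas as pd

# Hypothetical panel: one row per worker-year, for workers who stayed
# with the same employer (the paper matches CPS records across years).
panel = pd.DataFrame({
    "worker_id": [1, 1, 2, 2, 3, 3],
    "year":      [2007, 2008, 2007, 2008, 2007, 2008],
    "wage":      [20.0, 20.0, 15.0, 14.2, 30.0, 31.5],
})

# Year-over-year nominal wage change for each worker.
panel = panel.sort_values(["worker_id", "year"])
panel["wage_change"] = panel.groupby("worker_id")["wage"].diff()
changes = panel["wage_change"].dropna()

# The statistics discussed above: the spike at exactly zero, and the
# shares of workers taking nominal cuts and getting nominal raises.
share_frozen = (changes == 0).mean()
share_cut    = (changes < 0).mean()
share_raise  = (changes > 0).mean()
print(f"frozen: {share_frozen:.0%}, cut: {share_cut:.0%}, raise: {share_raise:.0%}")
```

The “spike at zero” in point 1 is simply the mass of this change distribution at exactly zero; the cut shares in points 3 and 3b are the corresponding left-tail masses.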
NB: This paper does not show nominal wages to be fully flexible, nor does it show that observed nominal wage changes were “enough” to re-equilibrate labor markets. Still, this paper should serve as a useful corrective to excess reliance on the sticky nominal wage hypothesis. Nominal wage stickiness is a matter of degree and perhaps we need to turn the dial back a bit on this one.
Note also that this paper need not discriminate against neo-Keynesian and monetarist theories, though it will point our attention toward “zero marginal revenue product” versions of the argument, in which case the flexibility of nominal wages simply doesn’t help much. Note also that such versions of the argument may have somewhat different analytic and policy conclusions than what we are used to expecting.
Addendum: Also from Solon, this time with Martins and Thomas, is this paper about Portugal (pdf), showing considerable nominal flexibility for entry wages in labor markets.

June 12, 2013
21-minute video interview with Thomas Schelling
Assorted links
1. Nick Rowe on Japan and interest rates, and he asks whether Japan is already dead.
2. Parlor game, island, Japanese Chamber of Commerce, stir and mix.
3. Eichengreen essay on Robert Fogel (jstor).
4. The culture that is Vietnam: festival of killing inner insects, and TNR resurrects its “Plank Blog.”
5. Dan Drezner on Albert Hirschman.
6. Turkish Jugaad, and basketball in an arbitrage economy.
7. Disputes over the economic benefits of the human genome project, be careful not to measure inputs!

What I’ve been reading
1. James Salter, All That Is. Excellent set pieces from a strong writer with a cult literary following, but for me the story as a whole didn’t add up to much interesting. I did finish it, however.
2. Mason B. Williams, City of Ambition: FDR, LaGuardia, and the Making of Modern New York. A useful historical look at how fiscal stimulus gets translated into actual urban policies on the ground, well documented and also surprisingly readable.
3. Garry Wills, Nixon Agonistes: The Crisis of the Self-Made Man. For such a long book about a topic I don’t wish to read any more about, this is compelling. It has many excellent sentences, such as “Nixon is a Market ascetic, and politics is his business. On it he lavishes an intensity of dedication that is literally consuming.” Every President should have a book this good about him.
4. Thane Gustafson, Wheel of Fortune: The Battle for Oil and Power in Russia. A very detailed, readable, highly useful, and economically sophisticated account of how they got from the mess they had back then to the mess they have right now.
5. The Fragrance of Guava, Conversations with Gabriel García Márquez. This book gives a very good sense of how the author sees his life’s work as fitting together, and why the short fiction and Autumn of the Patriarch are important.

Peter Thiel is to write a book
Peter A. Thiel, a co-founder of PayPal, has struck a deal to write a book about how to build companies of the future, his publisher said on Tuesday.
The book, “Zero to One,” will be published in March 2014 by Crown Business, an imprint of the Crown Publishing Group at Random House.
Here is a little more, hat tip to Michelle Dawson.

A New FDA for the Age of Personalized, Molecular Medicine
In a brilliant new paper (pdf) (html) Peter Huber draws upon molecular biology, network analysis and Bayesian statistics to make some very important recommendations about FDA policy. Consider the following drugs (my list):
Drug A helps half of those to whom it is prescribed but it causes very serious liver damage in the other half. Drug B works well at some times but when administered at other times it accelerates the disease. Drug C fails to show any effect when tested against a placebo but it does seem to work in practice when administered as part of a treatment regime.
Which of these drugs should be approved and which rejected? The answer is that all of them should be approved; that is, all of them should be approved if we can target each drug to the right patient at the right time and with the right combination of other drugs. Huber argues that Bayesian adaptive testing, with molecular biology and network analysis providing priors, can determine which patients should get which drugs when and in what combinations. But we can only develop the data to target drugs if the drugs are actually approved and available in the field. The current FDA testing regime, however, is not built for adaptive testing in the field.
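To give a feel for what Bayesian adaptive testing might look like, here is a minimal sketch of one standard adaptive design, Thompson sampling with Beta priors, in which each new patient is assigned the drug currently believed best for his or her molecular subtype and the posteriors update as outcomes arrive. This illustrates the general idea, not Huber’s specific proposal; the subtypes, drugs, and response rates are all hypothetical:

```python
import random

random.seed(0)

SUBTYPES = ["A", "B"]           # molecular subtypes (hypothetical)
DRUGS = ["drug_1", "drug_2"]    # candidate drugs (hypothetical)

# True response rates, unknown to the trial: drug_1 helps subtype A,
# drug_2 helps subtype B; pooled over subtypes, both look mediocre.
TRUE_RATE = {("A", "drug_1"): 0.8, ("A", "drug_2"): 0.2,
             ("B", "drug_1"): 0.2, ("B", "drug_2"): 0.8}

# One Beta(1, 1) prior per (subtype, drug) pair; informative priors
# could instead come from molecular biology or network analysis.
posterior = {key: [1, 1] for key in TRUE_RATE}

for _ in range(2000):
    subtype = random.choice(SUBTYPES)
    # Thompson sampling: draw a plausible response rate for each drug
    # from its posterior and assign the patient the apparent winner.
    drug = max(DRUGS, key=lambda d: random.betavariate(*posterior[(subtype, d)]))
    responded = random.random() < TRUE_RATE[(subtype, drug)]
    posterior[(subtype, drug)][0 if responded else 1] += 1

for (subtype, drug), (a, b) in sorted(posterior.items()):
    print(f"subtype {subtype}, {drug}: posterior mean {a / (a + b):.2f}, "
          f"{a + b - 2} patients")
```

After enough patients, assignments concentrate on the better drug within each subtype, which is the sense in which the data needed to target drugs can only accumulate once the drugs are actually available in the field.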
The current regime was built during a time of pervasive ignorance when the best we could do was throw a drug and a placebo against a randomized population and then count noses. Randomized controlled trials are critical, of course, but in a world of limited resources they fail when confronted by the curse of dimensionality. Patients are heterogeneous and so are diseases. Each patient is a unique, dynamic system and at the molecular level diseases are heterogeneous even when symptoms are not. In just the last few years we have expanded breast cancer into first four and now ten different types of cancer and the subdivision is likely to continue as knowledge expands. Match heterogeneous patients against heterogeneous diseases and the result is a high dimension system that cannot be well navigated with expensive, randomized controlled trials. As a result, the FDA ends up throwing out many drugs that could do good:
Given what we now know about the biochemical complexity and diversity of the environments in which drugs operate, the unresolved question at the end of many failed clinical trials is whether it was the drug that failed or the FDA-approved script. It’s all too easy for a bad script to make a good drug look awful. The disease, as clinically defined, is, in fact, a cluster of many distinct diseases: a coalition of nine biochemical minorities, each with a slightly different form of the disease, vetoes the drug that would help the tenth. Or a biochemical majority vetoes the drug that would help a minority. Or the good drug or cocktail fails because the disease’s biochemistry changes quickly but at different rates in different patients, and to remain effective, treatments have to be changed in tandem; but the clinical trial is set to continue for some fixed period that doesn’t align with the dynamics of the disease in enough patients.
Or side effects in a biochemical minority veto a drug or cocktail that works well for the majority. Some cocktail cures that we need may well be composed of drugs that can’t deliver any useful clinical effects until combined in complex ways. Getting that kind of medicine through today’s FDA would be, for all practical purposes, impossible.
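A rough way to see the force of the “biochemical minorities” point: if a drug produces a large benefit in one of ten equally sized subtypes and nothing elsewhere, an unstratified trial sees only a tenth of the effect, and the sample needed to detect it blows up roughly a hundredfold. A back-of-the-envelope calculation, with all numbers purely illustrative:

```python
# A drug that raises the response rate substantially in one of ten
# equally sized biochemical subtypes, and does nothing elsewhere.
subtypes = 10
effect_in_group = 0.30  # absolute improvement in the responsive subtype
pooled_effect = effect_in_group / subtypes  # what an unstratified trial sees

# Required sample size for a two-arm trial scales roughly with 1/effect^2,
# so diluting the effect tenfold inflates the needed trial ~100-fold.
inflation = (effect_in_group / pooled_effect) ** 2
print(f"pooled effect: {pooled_effect:.2f}, sample-size inflation: ~{inflation:.0f}x")
```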
The alternative to the FDA process is large collections of data on patient biomarkers, diseases and symptoms all evaluated on the fly by Bayesian engines that improve over time as more data is gathered. The problem is that the FDA is still locked in an old mindset when it refuses to permit any drugs that are not “safe and effective” despite the fact that these terms can only be defined for a large population by doing violence to heterogeneity. Safe and effective, moreover, makes sense only when physicians are assumed to be following simple, A to B, drug to disease, prescribing rules and not when they are targeting treatments based on deep, contextual knowledge that is continually evolving:
In a world with molecular medicine and mass heterogeneity, the FDA’s role will change from applying the single yes-no rule that fits no one to certifying biochemical pathways:
By allowing broader use of the drug by unblinded doctors, accelerated approval based on molecular or modest—and perhaps only temporary—clinical benefits launches the process that allows more doctors to work out the rest of the biomarker science and spurs the development of additional drugs. The FDA’s focus shifts from licensing drugs, one by one, to regulating a process that develops the integrated drug-patient science to arrive at complex, often multidrug, prescription protocols that can beat biochemically complex diseases.
…As others take charge of judging when it is in a patient’s best interest to start tinkering with his own molecular chemistry, the FDA will be left with a narrower task—one much more firmly grounded in solid science. So far as efficacy is concerned, the FDA will verify the drug’s ability to perform a specific biochemical task in various precisely defined molecular environments. It will evaluate drugs not as cures but as potential tools to be picked off the shelf and used carefully but flexibly, down at the molecular level, where the surgeon’s scalpels and sutures can’t reach.
In an important section, Huber notes that some of the biggest successes of the drug system in recent years occurred precisely because the standard FDA system was implicitly bypassed by orphan drug approval, accelerated approval and off-label prescribing (see also The Anomaly of Off-Label Prescribing).
But for these three major licensing loopholes, millions of people alive today would have died in the 1990s. Almost all the early HIV- and AIDS-related drugs—thalidomide among them—were designated as orphans. Most were rushed through the FDA under the accelerated-approval rule. Many were widely prescribed off-label. Oncology is the other field in which the orphanage, accelerated approval, and off-label prescription have already played a large role. Between 1992 and 2010, the rule accelerated patient access to 35 cancer drugs used in 47 new treatments. For the 26 that had completed conventional followup trials by the end of that period, the median acceleration time was almost four years.
Together, HIV and some cancers have also gone on to demonstrate what must replace the binary, yes/no licensing calls and the preposterously out-of-date Washington-approved label in the realm of complex molecular medicine.
Huber’s paper has a foreword by Andrew C. von Eschenbach, former commissioner of the FDA, who concludes:
For precision medicine to flourish, Congress must explicitly empower the agency to embrace new tools, delegate other authorities to the NIH and/or patient-led organizations, and create a legal framework that protects companies from lawsuits to encourage the intensive data mining that will be required to evaluate medicines effectively in the postmarket setting. Last but not least, Congress will also have to create a mechanism for holding the agency accountable for producing the desired outcomes.

