Geoffrey R. Stone's Blog, page 3

February 24, 2016

The Supreme Court Vacancy and the Constitutional Responsibilities of the Senate

Republican members of the Senate Judiciary Committee announced that they will not consider any nominee put forward by our nation's president. This is unconscionable. If they carry through on this threat, it will be directly incompatible with their solemn responsibilities under the U.S. Constitution.
Published on February 24, 2016 08:43

December 22, 2015

ISIS, Fear, and the Freedom of Speech

In recent weeks, two of the legal scholars I most admire -- Cass Sunstein and Eric Posner -- have independently called for possible limitations on the scope of First Amendment protection in light of the dangers posed to the United States by online radicalization messages directed at Americans. See http://www.bloombergview.com/articles... and http://www.slate.com/articles/news_an...



Although I certainly understand the concerns driving these suggestions, it is essential that we resist the temptation to restrict our most fundamental freedoms in a moment of panic. This is not to say that our nation's security is not important or that preventing terrorist attacks is not a critical goal. But it is to say that this is not an appropriate way to protect ourselves.



At the core of these concerns is the fear that if ISIS supporters are free to encourage others to join their ranks and to launch terrorist attacks against the United States, we will be less safe than if we could make it a crime for individuals to promote such messages. This is a credible fear. But a credible fear is not a sufficient justification for jettisoning hard-bought constitutional rights.



We have a long history in the United States of compromising our First Amendment freedoms in the face of perceived danger and then later recognizing that we had overreacted, often with dire consequences for individual freedom and for our democracy.



Less than a decade after we adopted the First Amendment, which provides that "Congress shall make no law . . . abridging the freedom of speech, or of the press," Congress enacted the Sedition Act of 1798, which effectively made it a crime for any person to criticize the President, the Congress, or the government of the United States. The purported justification for the legislation was fear of a possible war with France, then the world's leading military power. The rationale was that, if citizens could criticize the government, then the government would be less able to protect the nation in time of war.



The war never came, but government officials relentlessly prosecuted their critics, in no small part in an unsuccessful effort to retain their power in the 1800 elections. In later years, Congress repealed the Act, the government released those who had been convicted under it, and the Supreme Court declared that the Sedition Act of 1798 had in fact been unconstitutional.



A similar situation arose during World War I, when the federal government enacted first the Espionage Act of 1917 and then the Sedition Act of 1918. As interpreted and applied, these laws once again made it a crime for any person to criticize the government, the war, the draft, the military, or the flag of the United States. In another episode driven by fear, some 2,000 Americans were prosecuted, convicted, and sentenced to prison terms for as long as ten or twenty years for doing nothing more than questioning the morality, wisdom or legality of government policy during the war.



Once again, after all the dust settled, those who were convicted were released from prison and pardoned, and the government eventually acknowledged that its actions had been driven by an illegitimate combination of panic and political expediency. Although the Supreme Court upheld the constitutionality of these prosecutions and convictions at the time, in later years the Court recognized that they had violated the First Amendment.



More recently, during the 1950s Red Scare and the era of McCarthyism, governments at the federal, state, and local levels prosecuted, blacklisted, and jailed tens of thousands of Americans because they had once been members of the Communist Party or affiliated organizations. The premise of these prosecutions was that these individuals were disloyal and posed a threat to the security of the United States because the Communist Party advocated the violent overthrow of government.



Once again, with the passage of time, the nation came to the realization that it had panicked and had persecuted these individuals without justification. And once again, the Supreme Court, which had initially upheld these prosecutions and blacklists, later came to its senses and declared these actions unconstitutional.



Finally, in 1969, in its landmark decision in Brandenburg v. Ohio, the Supreme Court, building upon the powerful dissenting opinions of Justices Oliver Wendell Holmes and Louis Brandeis in the earlier eras, declared in no uncertain terms that in the United States the government cannot constitutionally punish individuals for expressing their views, even when those views call for the use of violence, unless the government can demonstrate that such speech is likely to trigger imminent violence. Short of that, the Court held, the only proper response, even to expression we fear and despise, is not suppression, but counter-speech. Of course, this is not without risk, but the price of freedom is always a degree of risk.



Given our grim history in periods of perceived or real crisis, and given how long it has taken us to attain the wisdom and insight we have gained through painful national experience, this is definitely not the time to turn back the clock and to revert to long discredited doctrines that served us so poorly in the past. The temptation is certainly understandable, but the better part of wisdom is not to toss away our hard-bought freedoms in the absence of truly compelling necessity.



This holds true, by the way, not only for free speech, but for all of our freedoms. Once we discard our free speech rights, what is to keep us from then discarding other rights as well? Suppose, for example, some nut-case politician were to call for the internment of all Muslim Americans? After all, better safe than sorry.



Of course, we already did the equivalent of this during World War II, when the United States interned 120,000 persons of Japanese descent, two-thirds of whom were American citizens, men, women, and children. In one of the most tragic decisions in American history, the Supreme Court, in the midst of the war, upheld the constitutionality of this program. In later years, the United States government formally apologized for this travesty and provided restitution to those whose lives had been devastated by this painful example of wartime panic.



The long and short of it is this: In the free speech arena, we have struggled for more than two hundred years to get to the right place. We should not throw that wisdom away in a panic. If we do, we will once again deeply - and rightly - regret our actions.
Published on December 22, 2015 12:19


December 15, 2015

Justice Scalia, Affirmative Action and the Perils of Oral Argument

Ever since the oral argument last week in the Supreme Court in Fisher v. University of Texas, which involves the constitutionality of the University of Texas' affirmative action program, Justice Antonin Scalia has been castigated and excoriated by commentators, mostly on the left, for asking the attorney for the University of Texas about the so-called "mismatch" objection to affirmative action. In Justice Scalia's words: "There are those who contend that it does not benefit African-Americans to get them into the University of Texas where they do not do well, as opposed to having them go to a less-advanced school, a less -- a slower-track school where they do well."



Although I often disagree with Justice Scalia, and although I emphatically disagree with him about the constitutionality of affirmative action, the outrage and condemnation sparked by this comment is completely unwarranted. Justice Scalia's comment, which asked about the merits of an argument frequently made against affirmative action, and which was made specifically in briefs before the Supreme Court in this very case, was perfectly appropriate. As is often the case, Justice Scalia might have helped himself by framing his comment in a more sensitive manner. But the plain and simple fact is that his question gave the attorney for the University of Texas an opportunity to respond to one of the central arguments made against the constitutionality of affirmative action.



The "mismatch" argument runs more or less as follows. Colleges and universities generally admit students based on their academic achievements and potential, as reflected in their standardized test scores, their high school GPAs, and the nature of the courses they took in high school. These criteria have been shown to be reasonably good at predicting academic success at the college level.



When a college employs affirmative action, it typically admits some students who would not otherwise be admitted based on their academic credentials because their presence would add diversity to the institution and to the student body. Predictably, those students generally do less well academically in college, on average, than most students who are admitted solely on the basis of academic potential. This phenomenon may be exacerbated by a variety of factors, including a sometimes less than congenial or supportive atmosphere for minority students on campus, but it is a perfectly predictable consequence of admitting students who would not otherwise be admitted on the basis of academic potential, whether because of affirmative action, because they are good football players, because they play the oboe, or because they are the children of potentially generous donors.



In the affirmative action context, as in the other settings, this raises the question whether the college is exploiting the students for its own ends -- to achieve diversity -- at the expense of the students' own best interests. Put simply, is a student better off graduating in, say, the bottom 20 percent of a first tier college or in the top 20 percent of a second tier college?



This is not an easy question. If we were discussing your own kid, what would you think?



As a former Dean and Provost at the University of Chicago, I have had this conversation many times over the years with friends and former students who want advice about how to advise their own children. Most often, I've had this conversation with wealthy individuals who know they can get their kids into a top tier college (one that hopes someday to receive a large gift in appreciation), but who know from their kids' SAT scores and high school records that, purely on the basis of academic potential, their kids wouldn't get into a top tier college.



What they worry about is the impact on their kids of "being in over their heads." They worry in part about how good an education their kids will actually get if they're in over their heads academically, and they worry in part about the effect of a mediocre college performance on their kids' sense of self-confidence. As highly successful people themselves, they understand the importance of self-confidence. They worry that, if their kids barely get by in college, they will have their self-confidence beaten out of them. It's often a tough call.



This is essentially what Justice Scalia was asking about. A number of social scientists have studied the "mismatch" theory and compared the experience of minority students who are the beneficiaries of affirmative action with those who have not benefited from affirmative action. That is, the idea is to compare the experiences of college students with more or less equivalent academic potential, some of whom attend first tier and some of whom attend second tier colleges.



The results of these studies suggest that the concern that the intended beneficiaries of affirmative action are actually being harmed rather than helped is largely unfounded. In comparing these two groups, the data suggest that the students who attend the first tier schools do less well in terms of academic performance in college, but that they are as likely to graduate, as likely to have a satisfactory college experience, and as likely -- indeed, more likely -- to get good jobs upon graduation than their peers at second tier schools.



But, of course, there are also social scientists who disagree. Although the weight of authority at the moment appears to be on the side of those who find that affirmative action does, indeed, benefit its intended beneficiaries, the matter is still open to debate. Moreover, even if the matter were resolved with respect to the average student entering a college in an affirmative action program, this does not mean that every such student benefits from the experience. Depending on the background, self-confidence, and makeup of the student, the experience can either be a good one, or a not so good one. Indeed, that's precisely what my rich friends worry about with respect to their own kids.



Now, in my own view, none of this has anything to do with the constitutionality of affirmative action. Rather, these are interesting details that should be taken into account by individual students and their families in making individual decisions for themselves. But my view of affirmative action, unfortunately, is not the view of the Supreme Court. In the Supreme Court's view, it is unconstitutional for public institutions of higher education to take race into account in making admissions decisions unless they have a compelling interest for doing so. Even the possibility that the mismatch theory is correct, at least for some students, might be sufficient, under that standard, to invalidate affirmative action programs. That is bad constitutional law, but as long as it is the law of the land it is perfectly appropriate and sensible for a justice to ask about this.



It is time that we stopped condemning each other for asking hard questions, however much we might not like them.
Published on December 15, 2015 17:58


November 21, 2015

Woodrow Wilson, Princeton University, and the Battles We Choose to Fight

As part of their recent thirty-two-hour sit-in outside the office of Princeton University's president Chris Eisgruber, members of one of Princeton's student organizations, the Black Justice League, demanded that Eisgruber remove all images of Woodrow Wilson from all of Princeton's public spaces and erase Wilson's name from Princeton's internationally acclaimed Woodrow Wilson School of Public and International Affairs. Eisgruber, who I'm proud to say was one of my students several decades ago at the University of Chicago Law School, is mulling it over.



For more than a century, Princeton has had a special place in its heart for Woodrow Wilson. In part, this was because Wilson served as president of the university from 1902 to 1910. During his presidency of Princeton, Wilson renewed and reinvigorated the institution. In only eight years, he increased the size of the faculty from 112 to 174, paying special attention to both teaching and scholarly excellence.



Wilson also made progressive innovations in the curriculum, raised admissions standards to move Princeton away from its historic image as an institution dedicated only to students from the upper crust, and took strides to invigorate the university's intellectual life by replacing the traditional norm of the "gentleman's C" with a course of serious and rigorous study. As Wilson told alumni, his goal was "to transform thoughtless boys . . . into thinking men."



Wilson also attempted (unsuccessfully, because of the resistance of alumni) to curtail the influence of social elites by abolishing the upper-class eating clubs, appointed the first Jew and the first Catholic to the faculty, and helped liberate the university's board of trustees from the grip of tradition-bound and morally conservative Presbyterians. Given that record of achievement, it's easy to understand why Princeton has chosen to recognize Woodrow Wilson as one of its greatest and most influential presidents.



Of course, Princeton has also chosen to honor Wilson because of his later service as President of the United States. During his tenure as President, Wilson was one of the nation's most effective leaders of the progressive movement. Shortly after assuming office, he expressly called to account some of the most powerful industrial and financial leaders in the nation for what he deemed their malpractices in business affairs.



As President, Wilson oversaw the passage of a range of progressive legislation previously unparalleled in American history. Among the bills he signed into law were the Federal Reserve Act, the Federal Trade Commission Act, the Clayton Antitrust Act, the Adamson Act, which for the first time imposed a maximum eight-hour day for railroad workers, and the Keating-Owen Act, which (before it was held unconstitutional by the then-very-conservative Supreme Court) curtailed child labor. Samuel Gompers, the most visible labor leader of the time, described Wilson's achievements as a "Magna Carta" for the rights of the workingman.



Among his other accomplishments, Wilson, over bitter opposition from anti-Semites, appointed the first Jewish member of the Supreme Court, Louis Brandeis, and offered his Fourteen Points and his strong support of the League of Nations in the hope of promoting international peace and averting future world wars.



Wilson was not without his flaws, however. During World War I, he, like Presidents John Adams and Abraham Lincoln before him, supported the aggressive suppression of dissent in wartime in a way that seriously damaged the core principles of American democracy.



More to the point of the current controversy at Princeton, Wilson also ordered the segregation of federal government offices, and his War Department drafted hundreds of thousands of African-Americans into the army, gave them equal pay with whites, but -- in accord with military policy from the Civil War through the Second World War -- assigned them to all-black units with white officers. When a delegation of African-Americans protested this policy, Wilson told them that "segregation is not a humiliation" and ought not "to be so regarded by you gentlemen."



Like his suppression of dissent during the war, Wilson's support of racial segregation was deplorable. But it is important to understand that at the time such segregation was legal, was consistent with the views of most Americans, and was part of the public policy in many, perhaps most, states in the nation. Indeed, in some quarters such segregation was even considered a "progressive" reform insofar as it limited the opportunities for interracial disputes that could trigger white violence.



It would, of course, have been great if Woodrow Wilson, like some others of his generation, had directly challenged the morality of racial segregation. It would have been great if he had not believed in the principle of white supremacy. But, like all of us, he was a man of his own time, and he should be judged accordingly.



All in all, Woodrow Wilson is almost universally regarded as one of the greatest presidents in Princeton's history and, despite his serious shortcomings, one of the greatest presidents our nation has ever known. Wilson was in almost all respects a progressive champion who, like many other progressive champions of his era, was morally obtuse on the issue of racial justice. Thus, when all is said and done, Wilson should be judged by Princeton, as he has been judged by historians, not only by the moral standards of today, but by his achievements and his values in the setting of his own time.



After all, if Woodrow Wilson is to be obliterated from Princeton because his views about race were backward and offensive by contemporary standards, then what are we to do with George Washington, Thomas Jefferson, James Madison, James Monroe, and Andrew Jackson, all of whom actually owned slaves? What are we to do with Abraham Lincoln, who declared in 1858 that "I am not, nor ever have been, in favor of bringing about in any way the social and political equality of the white and black races," and that "I am not, nor ever have been, in favor of making voters or jurors of negroes, nor of qualifying them to hold office, nor to intermarry with white people"?



What are we to do with Franklin Roosevelt, who ordered the internment of 120,000 persons of Japanese descent? With Dwight Eisenhower, who issued an Executive Order declaring homosexuals a serious security risk? With Bill Clinton, who signed the Defense of Marriage Act? With Barack Obama and Hillary Clinton, both of whom opposed the legalization of same-sex marriage?



And what are we to do with Supreme Court Justice Oliver Wendell Holmes, who once opined in a case involving compulsory sterilization that "three generations of imbeciles is enough"? With Leland Stanford, after whom Stanford University is named, who, as governor of California, lobbied for the restriction of Chinese immigration, explaining to the state legislature in 1862 that "the presence of numbers of that degraded and distinct people would exercise a deleterious effect upon the superior race"?



And what are we to do with all of the presidents, politicians, academic leaders, industrial leaders, jurists, and social reformers who at one time or another in American history denied women's right to equality, opposed women's suffrage, and insisted that a woman's proper place was "in the home"? And on and on and on.



Not having any personal connection to Princeton (other than my affection and respect for its current president), I don't really care one way or the other whether Princeton erases Woodrow Wilson from its history, except to the extent that such an action would inevitably invite an endless array of similar claims that would both fundamentally distort the realities of our history and distract attention from the real issues of deeply rooted injustice in our contemporary society that we need to take seriously today. This, quite frankly, is not one of them.
Published on November 21, 2015 06:14

November 11, 2015

Understanding the Free Speech Issues at Missouri and Yale

How should we think about the free speech issues in the recent controversies at the University of Missouri and Yale? In my view, universities have a deep obligation to protect and preserve the freedom of expression. That is, most fundamentally, at the very core of what makes a university a university.



This has not always been true. Throughout history, colleges and universities have limited freedom of expression in all sorts of ways. In the nineteenth century, they often forbade the expression of any views that were inconsistent with Christian religious doctrine, including of course the doctrine of evolution. In the twentieth century, they often forbade the expression of any views that were seen as unpatriotic during World War I or as communistic during the McCarthy era. Today, the battle is primarily over expression that makes students feel uncomfortable or unsafe. The principles, though, are the same.



Last year, I chaired a faculty committee at the University of Chicago that was charged with the task of drafting a formal Statement of Principles for the University on freedom of expression. That statement, which can be found here, has since been adopted by a number of other institutions, including Princeton, Purdue, and American University.



Drawing on the principles articulated in that Statement, but speaking only for myself, I would offer the following thoughts about the events at Missouri and Yale:



First, should students be permitted to wear Halloween costumes that might offend or upset other students (for example, wearing sombreros or blackface or dressing as aborted fetuses) or to use language that might offend or upset other students (for example, kike, fag, spic, dyke, nigger, slut, etc.)?



The answer clearly is "yes." The robust freedom of speech that must be guaranteed by a university must include the freedom to express thoughts, opinions, and views that others find odious, hateful, distasteful, and offensive. The use of such costumes and words, however uncivil and provocative, enables the expression of particular views in an especially powerful and emotional manner. However offensive such speech might be to others, it is clearly part of the freedom of expression we all must tolerate.



Second, should students who are offended by such expression be permitted to condemn those who engage in such behavior as ignorant, racist, hateful, and despicable? Of course. Toleration does not imply acceptance or agreement. The freedom to speak does not give one the right not to be condemned and despised for one's speech. That is the whole point of the "marketplace of ideas."



Third, should students who are offended by such expression be permitted to demand that the university discipline students who engage in such behavior? Of course. Although the university should resist such demands, students are perfectly within their rights to try to persuade the institution to change its policies to address such behavior. As should be evident, in my view the institution should not change its policies in this regard, but such issues are always open to debate and deliberation.



Fourth, should the university discourage students from expressing themselves in ways that might offend, upset, annoy, or demean other students or make them feel disrespected, insulted, or unsafe? This depends on the context.



In my view, a university should not itself take positions on substantive issues. A university should not declare, for example, that abortion is moral, that undocumented immigrants have a right to remain in the United States, that the United States should abandon Israel, or that a flat tax is the best policy. It is for the faculty and students of the institution to debate those issues for themselves, and the university as an institution should not intrude in those debates by purporting to decide on the "correct" point of view.



On the other hand, a university can promote certain values both to educate its students and to foster an intellectual environment that is most conducive to the achievement of the institution's larger educational goals. To that end, a university can appropriately encourage a climate of civility and mutual respect. It can do this in a variety of ways, as long as it stops short of censorship. More specifically, a university can legitimately educate students about the harms caused by the use of offensive, insulting, degrading, and hurtful language and behavior and encourage them to express their views, however offensive or hurtful they might be, in ways that are not unnecessarily disrespectful or uncivil.



This, of course, leaves many questions unanswered. But it is a start.
Published on November 11, 2015 09:47

October 21, 2015

In the Name of Decency...

Ibrahim Parlak is a Kurd who was born in a small farming village in southeast Turkey in 1962. As a minority ethnic and religious group, Kurds have historically been subjected to vicious discrimination, oppression and violence by the Turks. As a high school student, Ibrahim was imprisoned for three months in a military prison in Turkey for participating in humanitarian activities designed to help his people. After his release, he left Turkey to continue his education in Germany.



Seven years later, he became involved with the Kurdish separatist movement, known as the PKK. He re-entered Turkey and wound up in a PKK firefight with Turkish soldiers in which two Turks were killed. Ibrahim was later captured by the Turks. He was tortured and threatened in heinous ways. After he revealed the location of a hidden cache of PKK weapons, he was released, but he was now seen as an enemy by both the Turkish government and the PKK.



Ibrahim then managed to escape Turkey and enter the United States in 1991. He applied for and was granted asylum in Chicago. Ibrahim became a model immigrant. He settled in a small town in southwestern Michigan, opened a highly successful restaurant, married, became a much-respected member of his community, had an American-born daughter (who is now in college), and applied for naturalization in 1998. As Ibrahim has observed, America provided him "with the opportunity to become someone." America is a place where "if you live by the rules and work hard, ... dreams can come true."



It is a heart-warming story.



Not quite. The events of 9/11 changed everything. Because the United States had designated the PKK a terrorist organization in 1997 -- six years after Ibrahim had come to the United States -- the Bush administration denied his naturalization petition and initiated deportation proceedings against him. In 2004, he was imprisoned without bail, awaiting deportation. His friends and neighbors rallied to his support by the hundreds, and after ten months in prison a federal judge declared his detention unconstitutional.



The Bush administration, though, continued to press for deportation, and although the Obama administration has at times hesitated, the government's campaign to throw Ibrahim out of our nation continues to this day. With the fervent support of his many friends and admirers -- I proudly include myself among them -- Ibrahim has continued to fight against what another federal judge has described as "a sad remnant of an era of paranoid, overzealous, error-riddled and misguided anti-terrorism and immigration enforcement that has now gone by the wayside."



Sadly, Ibrahim is now at the end of his rope. Despite the efforts of federal officials like former Senator Carl Levin (D-Michigan), former U.S. Attorney John Smietanka, and former FBI counter-intelligence and terrorism lawyer Anne Buckleitner, all of whom have been dedicated supporters of Ibrahim's cause, the Department of Homeland Security has now ordered Ibrahim to apply for residency in some other country.



In the name of decency and human rights, it is time to bring this absurd and, indeed, unjust campaign of persecution to an end. It is time for President Obama to issue a presidential pardon to Ibrahim, and let this good and decent man who has lived a peaceful and lovely life in our nation for more than twenty years live, finally, in peace.
Published on October 21, 2015 12:39

September 7, 2015

Kim Davis and the Freedom of Religion

The Kim Davis situation raises interesting questions about the meaning and practical effect of the freedom of religion. Although, for reasons that I will explain, the issue today is one of public policy, rather than constitutional law, the evolution of constitutional principles in this realm is illuminating.



The First Amendment forbids government to make any law "prohibiting the free exercise" of religion. At its core, this guarantee forbids government from intentionally interfering with the freedom of individuals to practice their religion. Thus, the free exercise principle at least presumptively forbids the government to enact laws expressly prohibiting Muslim women to wear burkas, expressly prohibiting Jews to circumcise their male children, or expressly forbidding Catholics to use sacramental wine. Such laws are paradigmatic violations of the free exercise principle, because their very purpose is to restrict the religious practices of particular faiths.



The Kim Davis situation is different. Consider, for example, a law that prohibits anyone to use peyote. Does such a law violate the free exercise principle when it is applied to a member of a religious group that uses peyote as a sacrament? Or consider a law that compels military service. Does such a law violate the free exercise principle when it is applied to a member of a religion that teaches pacifism? Or consider a law that prohibits discrimination on the basis of sexual orientation. Does such a law violate the free exercise principle when it is applied to an elevator operator in a government building who refuses to allow in "his" elevator people he regards as sodomites?



The difference between the paradigmatic violations of the free exercise principle and these latter examples is that in the paradigmatic situations the laws at issue are expressly and purposely directed at the free exercise of religion, whereas the latter examples involve what is commonly referred to as the "incidental effects" problem.



In the incidental effects situation, a law that is not otherwise problematic on its face is claimed to be impermissible because, in application, it incidentally interferes with some individual's asserted "rights." This happens all the time.



Consider, for example, the freedom of speech. A law prohibiting speeding has nothing to do with free speech. But suppose an individual who is arrested for speeding insists that he should be exempt from the anti-speeding law because he was rushing to a lecture. Or, suppose an individual who is arrested for littering by dropping thousands of leaflets from a helicopter maintains that he should be exempt from the anti-littering law because this was an efficient way for him to communicate his views to others. Or, suppose a reporter who is arrested for burglary for breaking into a person's home asserts that he should be exempt from the law against burglary because this was a good way for him to get a story.



As common sense would suggest, in all of these examples the individual demanding the exemption will lose. Put simply, and except in truly extraordinary circumstances, the Supreme Court does not look kindly on incidental effects claims in the free speech context.



Why is this so? There are at least four pretty good reasons. First, once one opens the door to such claims, every Tom, Dick, and Harry will assert them. "I robbed that guy so I could give money to my favorite presidential candidate -- Donald Trump." "I parked illegally so I could go to the bookstore." "I was naked at the beach so I could protest laws against nudity." And so on.



Second, in such circumstances, it would be next to impossible for courts to determine in each instance which free speech claims are sincere and which are a sham. It all rests in the mind of the actor, and everyone who does anything illegal would be tempted to cook up a free speech explanation. Sorting it out would be a fact-finding nightmare.



Third, even if a particular claim is in fact sincere, and even if the individual did in fact violate the law in order to communicate, it would be daunting, indeed, to figure out in each instance whether the individual's free speech interest is sufficiently important to override the legitimate reason for the law. Is getting to a lecture sufficiently "important" to justify speeding? Is being naked on a beach to protest nudity laws sufficiently "important" to justify the nudity?



Fourth, in most of these situations, with just a little creativity, the individual claiming the exemption could have expressed herself in some other manner, without violating the law. In such circumstances, the denial of an exemption does not significantly interfere with free speech.



For all of these reasons, the Supreme Court has consistently looked skeptically on claims that incidental effects on free speech violate the First Amendment. Such exemptions are granted only in truly extraordinary circumstances.



For similar reasons, the Supreme Court has generally -- though not always -- been similarly skeptical of incidental effects claims in the free exercise context. In 1878, for example, George Reynolds, a Mormon, claimed that a federal anti-polygamy law infringed his right to the free exercise of his religion, because Mormonism called for polygamous marriage.



The Court denied the challenge, explaining in no uncertain terms that to exempt individuals from criminal prohibitions on account of their religious beliefs would make "the professed doctrines of religious belief superior to the law of the land, and in effect ... permit every citizen to become a law unto himself." In short, laws having incidental effects on religious freedom, like laws having incidental effects on free speech, were presumptively constitutional.



That remained the law until 1963, when the Warren Court reconsidered the question. Seventh Day Adventist Adell Sherbert claimed that her free exercise rights had been violated when she was denied unemployment benefits because her religious beliefs forbade her to work on Saturdays. The Court held that the government could not require Sherbert to work on Saturdays in order to be eligible for unemployment compensation, because the state's interest in applying the law to her was not sufficiently weighty to overcome her right to the free exercise of her religion.



This doctrine, which revolutionized the law, wobbled around inconsistently for roughly twenty-five years as courts struggled to apply the new doctrine to a range of factual situations that weren't quite as neat as Sherbert's. In some cases, there were doubts about whether the religion really required the individual's behavior; in some, there were doubts about whether the claimant really held the belief; in some, there were doubts about whether the claimant could have satisfied her religious beliefs in other ways; in some, there were difficult questions about whether the government's interest in not granting an exemption was sufficiently weighty to override the religious claim. In short, it was messy.



Then, in 1990, in Employment Division v. Smith, the Supreme Court reversed course once again. Alfred Smith was fired from his job because he had ingested peyote for sacramental purposes at a ceremony of the Native American Church. He was then denied unemployment compensation because he had been terminated for work-related "misconduct." He maintained that the denial of unemployment benefits to him violated his rights under the Free Exercise Clause.



In an opinion by Justice Antonin Scalia, the Court jettisoned its earlier doctrine and held that Smith was not entitled to an exemption. In effect, the Court reinstated the incidental effects doctrine as it had first been enunciated by the Court more than a century earlier in the polygamy case.



Whatever one thinks of Smith as a constitutional decision, it does not resolve the larger social question. Even if something is not unconstitutional, it might still be bad public policy. Indeed, in the years after Smith, a broad coalition of groups as diverse as the ACLU and the National Association of Evangelicals came together to enact legislation at both the federal and state levels that presumptively prohibits government from disadvantaging an individual for acting on his religious beliefs even when the applicable law has only an incidental effect on his behavior.



The ultimate question is whether this policy can in fact be pursued in a manner that accommodates sincerely-held religious beliefs without creating so many problems of implementation, and so many incentives for false claims, that the costs of the policy outweigh its benefits.



Although I am not a religious person, I believe that, as a matter of sound public policy, we should bend over backwards not to penalize individuals for the honest exercise of their sincerely-held religious beliefs. Why penalize someone for acting out of a sincerely-held religious belief unless there is at least a reasonable justification for doing so?



The answer, of course, is that for the reasons noted earlier, the doctrine might just be too complicated to implement in a predictable, sensible, even-handed, and fair-minded manner. The plain and simple fact is that the doctrine invites all sorts of abuse and sham claims, and compels judges and other government officials to make complex and potentially offensive judgments about the content of religious belief, both for the religion and for the individual.



On the other hand, the virtue of such a doctrine might be not only that it respects sincere religious beliefs, to the extent it can discern them, but also that it can protect members of minority religions. In the real world, the members of mainstream religious groups will rarely, if ever, find themselves in need of this principle, because those who make the laws will almost always carve out explicit exceptions for them.



As an example, compare Prohibition, which carved out an express exemption for the use of sacramental wine, with the laws prohibiting the use of peyote, which did not carve out an express exemption for the Native American Church. Or, think of the Seventh Day Adventists. Predictably, the unemployment compensation law did not require individuals to work on Sunday, thus privileging mainstream religions. But those who enacted the unemployment compensation law did not carve out a similar exemption for Seventh Day Adventists.



A doctrine that permits courts to recognize exemptions even for minority religions thus offers an important safeguard of equality and even-handedness in an otherwise predictably unequal and unjust system of explicit legislative exemptions.



In the end, then, it is fair to say that, as a general matter, there is no obvious "right" answer. This is a difficult issue. There is a "right" answer, however, in the Kim Davis situation. Indeed, her case is not even a hard one. A public official, who acts as an agent of the government, simply cannot place her own religious beliefs above the constitutional obligations of the state and the constitutional rights of our citizens. Davis should have found a way to reconcile her personal religious beliefs with her official responsibilities, or she should have resigned.



Davis is the moral equivalent of the elevator operator in a government building who, for her own religious reasons, refuses to let gays and lesbians ride in "her" elevator, which is the only one in the building. This, quite simply, she cannot do.
Published on September 07, 2015 21:21

August 25, 2015

Academic Freedom and the Meaning of Courage

Sometimes, it takes courage to stand up for academic freedom.



Three months ago I posted an article addressing academic freedom issues that had arisen at Northwestern University. In that piece, I related an incident involving Alice Dreger, William Peace, and an issue of the journal Atrium.



As I then reported, Atrium is a journal published by Northwestern University's Medical Humanities and Bioethics Program. Each issue focuses on a different theme, and each contributor is expected to explore the theme "in different, thought-provoking ways." The Winter 2014 issue of Atrium, which was edited by Professor Alice Dreger, included a series of lively articles on the theme of "Bad Girls."



One of the articles, written by William Peace, then the 2014 Jeannette K. Watson Distinguished Visiting Professor in the Humanities at Syracuse University, was titled "Head Nurses." In this essay, Peace, who is disabled, told the story of how 36 years earlier a young woman nurse, with whom he had grown close, provided oral sex to him during rehabilitation in order to address his deep concerns that, after a severe health problem left him paralyzed, he could no longer be sexually active.



Peace's essay, which was written and edited in a responsible, mature, and thoughtful manner, so upset the authorities at Northwestern University's Feinberg School of Medicine that they ordered the story removed from the online version of Atrium. This act of blatant censorship, in direct contravention of any plausible understanding of academic freedom, remained in place for fourteen months, over the continued objections of Peace and Dreger.



Northwestern finally reversed course only after Peace and Dreger made clear that they would take the matter public if the university did not relent. Presumably, the university's concern was that the inclusion of such an "offensive" article in Atrium might put off some of the university's donors and the hospital's patrons, either because of its acknowledgement of oral sex or because it might be construed as demeaning to women. Neither concern is a justification for censorship. The journal, the issue, and the essay were all squarely within the bounds of academic freedom, and Northwestern University should have stood proudly in support of that principle.



For the last three months, Dreger has been trying to get Northwestern to state unequivocally that its action was inappropriate. She wanted an assurance that no similar action would occur in the future. Although officials at Northwestern have affirmed the institution's commitment to the general principle of academic freedom, they have not been willing to admit that the act of taking down the article was incompatible with academic freedom. In such circumstances, Dreger came to question whether she could continue her relationship with Northwestern.



Most professors in this situation would have patted themselves on the back for having managed to get the article back online and then turned their attention back to their usual work of teaching and scholarship. But Dreger has made a career out of defending academic freedom. Her most recent book, Galileo's Middle Finger: Heretics, Activists, and the Search for Justice in Science, is a brilliant account of the importance of academic freedom in situations in which researchers got in trouble for putting forth challenging ideas about sex. She believes deeply in academic freedom. She refused to go back to business as usual.



Although grateful to those "university leaders" at Northwestern University who had defended her academic freedom in the past "when they received often sharp criticisms of my work," on Monday she wrote to Northwestern's Provost that "I no longer work at that institution. I no longer work at a university that fearlessly defends academic freedom in the face of criticism, controversy, and calls for censorship. Now I work at a university at which my own dean thinks he has the authority to censor my work. An institution in which the faculty are afraid to offend the dean is not an institution where I can in good conscience do my work. Such an institution is not a 'university,' in the truest sense of that word."



And, with that, Dreger resigned her position at Northwestern.



Few individuals would have the courage to take that step. Few individuals would sacrifice themselves in this way in the name of integrity, honesty, and academic freedom. It was not easy for Dreger to take this step. But with stunning clarity, Northwestern University has now been given Alice Dreger's Middle Finger.
Published on August 25, 2015 14:07
