Tuesday, June 20, 2017

The Real Reason You Can’t Understand Why Black Americans Are Furious

Reposting an excellent article by Clay Rivers (5 min read):

https://medium.com/clay-rivers/the-real-reason-you-cant-understand-why-black-americans-are-furious-8a9993f1540d

Sunday, June 18, 2017

The History Test - How Should the Courts Use History?

by Jill Lepore

Constitutional interpretation reaches back to the dawn of time.


On the night of April 9, 1931, James M. Kiley, thirty-nine, was shot with a .32-calibre pistol at a gas station in Somerville, Massachusetts, during a botched holdup. Kiley, the night manager, had twenty-four dollars in his pocket; the cash in the register was untouched. Herman Snyder, nineteen, was found guilty of first-degree murder and sentenced to death. “Well, that’s that,” Snyder said, when the jury delivered the verdict. But that wasn’t that. Snyder filed an appeal arguing that his constitutional rights had been violated: during his trial, when the judge, the jury, lawyers for both sides, and a court stenographer visited the gas station, the judge refused to allow Snyder to go along. Even Lizzie Borden had been offered a chance to go with the jury to the crime scene, Snyder’s lawyers pointed out, and so had Sacco and Vanzetti.

In the summer of 1933, Snyder’s lawyers went to see Louis Brandeis, the Supreme Court Justice, at his summer home, on Cape Cod; Brandeis, in an extraordinary gesture from the highest court, issued a stay of execution. The Court agreed to hear the appeal, and, in January, 1934, upheld Snyder’s conviction in a 5–4 opinion that proposed a standard for measuring the weight of tradition in fundamental-rights cases, a standard sometimes known as the history test.

Some rights, like freedom of religion, are written down, which doesn’t always make them easier to secure; and some, like the right to marry, aren’t, which doesn’t mean that they’re less fundamental. The Constitution, as originally drafted, did not include a bill of rights. At the time, a lot of people thought that listing rights was a bad idea because, in a republic, the people retain all the rights not specifically granted to the government and because anything written down is both limited and open to interpretation. “What is the liberty of the press?” Alexander Hamilton asked. “Who can give it any definition which would not leave the utmost latitude for evasion?” These were excellent questions, but Hamilton lost the argument. The Bill of Rights was ratified in 1791.

Past the question of which rights there remained the question of whose rights. In 1857, in Dred Scott, the Supreme Court asked whether any “negro whose ancestors were imported into this country and sold as slaves” is “entitled to all the rights, and privileges, and immunities” guaranteed in the Constitution. Relying on “historical facts,” the Court answered no, arguing that, at the time of the framing, black people “had for more than a century before been regarded as beings of an inferior order, and altogether unfit to associate with the white race either in social or political relations, and so far inferior that they had no rights which the white man was bound to respect.”

After Emancipation, the Fourteenth Amendment, ratified in 1868, cast off the shackles of history with this guarantee: “No state shall make or enforce any law which shall abridge the privileges or immunities of citizens of the United States; nor shall any state deprive any person of life, liberty, or property, without due process of law; nor deny to any person within its jurisdiction the equal protection of the laws.” Then, in a series of cases in the early twentieth century, the courts began applying parts of the Bill of Rights to the states, mainly by way of the Fourteenth Amendment.

Yet how would judges decide what rights fall under the definition of due process and equal protection? There seemed to be two possibilities: precedent and reasonable judgment. In Snyder v. Massachusetts, Snyder’s attorneys argued that Snyder had a fundamental right to go on the trip to the gas station, under the due-process clause. But Justice Benjamin Cardozo, writing for the majority, said that the question turned not only on a reasonable reading of the Fourteenth Amendment or on precedent but also on whether refusing to bring a defendant with the jury to the crime scene “offends some principle of justice so rooted in the traditions and conscience of our people as to be ranked as fundamental.” He then recited instances, going back to 1747, to show that what Snyder had been denied did not meet this standard.

History, in one fashion or another, has a place in most constitutional arguments, as it does in most arguments of any kind, even those about whose turn it is to wash the dishes. Generally, appeals to tradition provide little relief for people who, historically, have been treated unfairly by the law. You can’t fight segregation, say, by an appeal to tradition; segregation was an entrenched American tradition. In 1896, Plessy v. Ferguson, essentially reprising Dred Scott, cited the “established usages, customs, and traditions of the people” in affirming the constitutionality of Jim Crow laws. In 1954, to challenge such laws, Brown v. Board of Education disavowed historical analysis and cited, instead, social science: empirical data. Meanwhile, Snyder was chiefly cited in appeals of murder convictions involving defendants who claimed that their rights had been violated. In 1945, Justice William O. Douglas cited Snyder in a 5–4 decision reversing the conviction of a Georgia sheriff who had arrested a young black man for stealing a tire and then beaten him to death. The killing was “shocking and revolting,” Douglas wrote, but it was impossible to know whether the victim’s civil rights had been violated. In a fierce dissent, Justice Frank Murphy argued that the reversal was absurd: “Knowledge of a comprehensive law library is unnecessary for officers of the law to know that the right to murder individuals in the course of their duties is unrecognized in this nation.”

But, in recent decades, the history test applied in cases like Snyder has quietly taken a special place; it has been used to help determine the constitutionality of everything from assisted suicide to deportation, by the unlikely route of judicial decisions about sex. History’s place in American jurisprudence took a turn in 1973, in Roe v. Wade, when the Court dusted off its incunabula and looked into what “history reveals about man’s attitudes toward the abortion procedure over the centuries,” as Justice Harry Blackmun explained. Abortion had not been a crime in Britain’s North American colonies, nor was it a crime in most parts of the United States until after the Civil War. “It perhaps is not generally appreciated that the restrictive criminal abortion laws in effect in a majority of States today are of relatively recent vintage,” Blackmun wrote. In turning back the hands of time, he didn’t stop there. “We are told that, at the time of the Persian Empire, abortifacients were known, and that criminal abortions were severely punished. We are also told, however, that abortion was practiced in Greek times as well as in the Roman Era, and that ‘it was resorted to without scruple.’ ” Roe overturned laws passed by state legislatures by appealing to ancient history. William Rehnquist, in his dissent, cited Snyder: “The fact that a majority of the States reflecting, after all, the majority sentiment in those States, have had restrictions on abortions for at least a century is a strong indication, it seems to me, that the asserted right to an abortion is not ‘so rooted in the traditions and conscience of our people as to be ranked as fundamental.’ ”

Not coincidentally, liberals began applying the history test to fundamental-rights cases at the very moment that women and minorities were entering the historical profession and writing history that liberal-minded judges might be able to cite. Conservatives, meanwhile, defined a new historical method: originalism, a method with roots in the kind of analysis made in Dred Scott. Originalism is essentially a very tightly defined history test. Snyder’s invocation of “the traditions and conscience of our people” is like a reader’s pass to the library stacks. There is virtually no end of places in the historical record to look for the traditions and conscience of our people, especially when “our people” is everyone. Originalism, a term coined in 1980, asks judges to read only the books on a single shelf in the library: the writings of delegates to the Constitutional Convention and the ratifying conventions, the Federalist Papers, and a handful of other newspapers and pamphlets published between 1787 and 1791 (and, occasionally, public records relating to debates over subsequent amendments, especially the Fourteenth). Even more narrowly, some originalists insist on consulting only documents that convey the “public understanding” of the writings of these great men. “If someone found a letter from George Washington to Martha telling her that what he meant by the power to lay taxes was not what other people meant,” Robert Bork once wrote, “that would not change our reading of the Constitution in the slightest.”

Roe, along with a series of civil-rights decisions made by the Warren Court, fuelled the growth of a conservative legal movement. The Federalist Society, founded in a number of law schools in 1982, developed an intellectual tradition, promoted scholarship, and sought to place its members on the courts. (Justices Samuel Alito and Clarence Thomas, along with Neil Gorsuch, who has been nominated to join them, are affiliated with the Federalist Society.) Within five years of its founding, the society had chapters at more than seventy law schools.

In 1985, in a speech to the Federalist Society, Ronald Reagan’s Attorney General, Edwin Meese, announced that “the Administration’s approach to constitutional interpretation” was to be “rooted in the text of the Constitution as illuminated by those who drafted, proposed, and ratified it.” He called this a “jurisprudence of original intention,” and contrasted it with the “misuse of history” by jurists who saw, in the Constitution’s “spirit,” things like “concepts of human dignity,” with which they had turned the Constitution into a “charter for judicial activism.” Meese’s statement met with a reply from Justice William Brennan, who said that anyone who had ever studied in the archives knew better than to believe that the records of the Constitutional Convention and the ratifying conventions offered so certain, exact, and singular a verdict as that which Meese expected to find there. (Obama’s Supreme Court nominee Merrick B. Garland clerked for Brennan.) Brennan called the idea that modern judges could discern the framers’ original intention “little more than arrogance cloaked as humility.”

In opposing fundamental-rights arguments, though, the Reagan-era Court used not only originalist arguments but also the history test. In June, 1986, the Court ruled, 5–4, in Bowers v. Hardwick, that the right to engage in homosexual sex was not rooted in tradition; instead, prohibitions on homosexual sex were rooted in tradition. Justice Byron White, writing for the majority, said that these prohibitions had “ancient roots.” In a concurring opinion, Justice Lewis Powell wrote, “I cannot say that conduct condemned for hundreds of years has now become a fundamental right.” Blackmun, in his dissent, argued against this use of history: “I cannot agree that either the length of time a majority has held its convictions or the passions with which it defends them can withdraw legislation from this Court’s scrutiny.”

Antonin Scalia joined the Court in the next term. And, soon afterward, in 1987, Reagan had the opportunity to appoint another Justice, and named Robert Bork. Less than an hour after the nomination was announced, Senator Edward M. Kennedy called for Democrats to resist what he described as Reagan’s attempt to “impose his reactionary vision of the Constitution on the Supreme Court and on the next generation of Americans.” Laurence Tribe, the Harvard law professor, testified in opposition to Bork’s nomination. But concerns about Bork’s vantage on history were not limited to liberal legal scholars. His most determined critics included the federal judge Richard Posner, who wrote of Bork’s views, “There are other reasons for obeying a judicial decision besides the Court’s ability to display, like the owner of a champion airedale, an impeccable pedigree for the decision, connecting it to its remote eighteenth-century ancestor.” In retrospect, the way this debate reached the public was mostly a distraction. The press generally reduced the disagreement to a stubbornly partisan battle in which conservatives and the past squared off against liberals and the future, and missed most of what was at stake: the relationship between history and the law.

Scalia was the Court’s most determined and eloquent originalist, but he also frequently invoked tradition. In 1989, writing for the majority in Michael H. v. Gerald D., a case involving the assertion of parental visitation rights, he argued that finding rights “rooted in history and tradition” required identifying the “most specific” tradition; Brennan, in his dissent, questioned Scalia’s method, writing that the opinion’s “exclusively historical analysis portends a significant and unfortunate departure from our prior cases and from sound constitutional decisionmaking.” As he had in his debate with Meese, Brennan charged Scalia with something between ignorance and duplicity. “It would be comforting to believe that a search for ‘tradition’ involves nothing more idiosyncratic or complicated than poring through dusty volumes on American history,” Brennan wrote, but history is more complicated than that, “because reasonable people can disagree about the content of particular traditions, and because they can disagree even about which traditions are relevant.” Even more fundamentally, Brennan argued that the appeal to tradition essentially nullifies the Fourteenth Amendment, whose whole point was to guarantee constitutional protections to those Americans who had not been protected by the traditions and consciences of other Americans.

If less carefully observed than the debate over originalism, the debate over the history test has influenced judicial nominations for decades. “A core question is whether, in examining this nation’s history and tradition, the Court will protect only those interests supported by a specific and long-lasting tradition, or whether the Court will not so constrict its analysis,” Senator Joseph Biden said during hearings on David Souter’s nomination, in 1990. (Biden had been coached by Tribe.) Souter’s answer—“It has got to be a quest for reliable evidence, and there may be reliable evidence of great generality”—satisfied Democrats. Liberal legal scholars, meanwhile, had grown increasingly alarmed by Scalia’s use of history: in a 1990 case, for example, he cited a book written in 1482 to narrow the definition of due process, and in a 1991 case he cited punishments imposed during the reign of James II to uphold a mandatory life sentence without the possibility of parole for the possession of six hundred and fifty grams of cocaine. The legal scholar Erwin Chemerinsky argued that conservatives on the Court had turned to history-test historicism because originalism is so patently flawed as a mode of constitutional interpretation. (The framers weren’t originalists; Brown v. Board can’t be squared with originalism; originalism can’t be reconciled with democratic self-government.) “The constant use of history to justify conservative results leads to the cynical conclusion that the country has a seventeenth century Court as it enters the twenty-first century,” Chemerinsky wrote in 1993. “It is not enough to make one want to take all the history books out of the Supreme Court’s library, but it makes one come close.”

Or you could write new history books. Geoffrey R. Stone, a distinguished professor and a former dean of the University of Chicago Law School, is a past chairman of the American Constitution Society, which was founded, in 2001, as an answer to the Federalist Society. His new book, “Sex and the Constitution: Sex, Religion, and Law from America’s Origins to the Twenty-first Century” (Liveright), locates “America’s origins” in antiquity. Applying the history test to the regulation of sex, Stone begins his inquiry in the sixth century B.C.E., and expands into a learned, illuminating, and analytical compendium that brings together the extraordinary research of a generation of historians in service of a constitutional call to arms.

Stone started working on the book about a decade ago, not long after the Court reversed Bowers. In Lawrence v. Texas, in 2003, Justice Anthony Kennedy, writing for the majority, overturned state sodomy laws by rejecting the history presented as evidence in Bowers. Colonial anti-sodomy laws did exist, Kennedy acknowledged, but they applied to everyone, not just to men who had sex with men; also, they were hardly ever enforced, and “it was not until the 1970’s that any State singled out same-sex relations for criminal prosecution, and only nine States have done so.” In short, Kennedy wrote, “the historical grounds relied upon in Bowers are more complex than the majority opinion and the concurring opinion by Chief Justice Burger indicate.”

The tables had turned. Between Bowers and Lawrence, academic historians had produced a considerable body of scholarship about the regulation of sexuality, on which the Court was able to draw. Scalia, in an uncharacteristically incoherent dissent, mainly fumed about this, arguing that “whether homosexual sodomy was prohibited by a law targeted at same-sex sexual relations or by a more general law prohibiting both homosexual and heterosexual sodomy, the only relevant point is that it was criminalized—which suffices to establish that homosexual sodomy is not a right ‘deeply rooted in our Nation’s history and tradition.’ ” Scalia, in effect, accused the majority of doing too much historical research.

The inconsistency is perhaps best explained by the Court’s wish to pretend that it is not exercising judicial discretion. One legal scholar has suggested that the history test is like Dumbo’s feather. Dumbo can fly because he’s got big ears, but he doesn’t like having big ears, so he decides he can fly because he’s got a magic feather. The Court has got big, activist ears; it would rather believe it’s got a magical history feather.

Lately, the field of argument, if not always of battle, in many fundamental-rights cases has moved from the parchment pages of the Constitution to the clay of Mesopotamia. In Obergefell v. Hodges, the 2015 Supreme Court decision that overturned state bans on same-sex marriage, Justice Kennedy, writing for the majority, reached back almost to the earliest written records of human societies. “From their beginning to their most recent page, the annals of human history reveal the transcendent importance of marriage,” he said. “Since the dawn of history, marriage has transformed strangers into relatives, binding families and societies together.” He cited Confucius. He quoted Cicero. The states that wanted to ban same-sex marriage described its practice as a betrayal of that history, but Kennedy saw it as a continuation, a testament to “the enduring importance of marriage.” Marriage is an institution with “ancient origins,” Kennedy said, but that doesn’t mean it’s changeless. Scalia, in a heated dissent, called Kennedy’s opinion “silly” and “pretentious.” As a matter of historical analysis, Scalia mostly confined himself to the past century and a half. “When the Fourteenth Amendment was ratified in 1868, every State limited marriage to one man and one woman, and no one doubted the constitutionality of doing so,” he said. “That resolves these cases.”

Liberal legal scholars disagree, and Stone’s “Sex and the Constitution” is an attempt to pull together all their evidence, for the sake of court battles to come. Ancient Greeks, Romans, and Jews believed that sex was natural and didn’t have a lot of rules about it, Stone argues. Early Christians, influenced by Augustine of Hippo, who in the fifth century decided that Adam and Eve had been thrown out of the Garden of Eden because of lust, decided that sex was a sin, and condemned all sorts of things, including masturbation. Stone speculates that the medieval church’s condemnation of same-sex sex, a concern that emerged in the eleventh century and that became pronounced in the writings of Thomas Aquinas, was a consequence of a new requirement: clerical celibacy. According to Stone, Aquinas argued that the sins of mutual masturbation, oral sex, and anal sex were worse if they involved two members of the same sex, a position that became church dogma in the sixteenth century.

During the Reformation, Protestants redeemed one kind of sex: intercourse between a married man and woman. (Martin Luther argued that sex was as “necessary to the nature of man as eating and drinking.”) Protestants also rejected the Catholic Church’s condemnation of contraception. But they believed that governments ought to regulate sexual behavior for the sake of public order. In the seventeenth century, most of England’s American colonies had an established religion, an arrangement that, a revolution later, they abandoned.

Enlightenment philosophers rejected Christian teachings about sex, and, believing in the pursuit of happiness, they believed, too, in the pursuit of pleasure. The Constitution and the Bill of Rights say nothing about sex, of any kind, with anyone, under any circumstances. Nor do any of the original state constitutions. Nor did any laws in any of the states, at the time of the founding, forbid sexual expression, or abortion before quickening, and sodomy laws were seldom enforced. That changed in the first half of the nineteenth century, when a religious revival led states to pass new laws, including the first law against obscenity. A campaign against the long-standing practice of abortion began, followed by a crusade against contraception and, at the turn of the twentieth century, the persecution of homosexuals. The cases from Roe to Lawrence to Obergefell, Stone suggests, constitute a revolution, not a turning away but a turning back, toward the Enlightenment.

History written to win a legal argument has a different claim to authority than history written to find out what happened. In a study of sex, Stone might have been interested in any number of practices, but he has confined his investigation to matters that are sources of ongoing constitutional and political debate in the United States today: abortion, contraception, obscenity, and sodomy or homosexuality. Practices that were once crimes, like fornication and adultery, or that are still crimes, like incest, infanticide, and rape, generally lie outside the scope of his concern. This has the effect of obscuring the relationship between things he’s interested in and things he’s not interested in, and it introduces a circularity: he has defined the scope of his study by drawing a line between what’s criminal and what’s not, when how that line came to be drawn is the subject of his study.

The history of the regulation of sexuality, especially the parts he’s chosen to gloss over—which happen to be parts that particularly concern the vulnerability of women and children—is a chronicle of a staggeringly long reign of sanctioned brutality. That reign rests on a claim on the bodies of women and children, as a right of property, made by men. “The page of history teems with woman’s wrongs,” Sarah Grimké wrote in 1837. Stone only skimmed that page. Or consider this page, from the Congressional Record in 1866, during the debate over the Fourteenth Amendment. Jacob Howard, a Republican senator from Michigan, explained that the amendment “protects the black man in his fundamental rights as a citizen with the same shield which it throws over the white man.” Howard assured his audience that the amendment did not guarantee black men the right to vote, even though he wished that it did, and here he quoted James Madison, who’d written that “those who are to be bound by laws, ought to have a voice in making them,” at which point Reverdy Johnson, a Democrat from Maryland, wondered how far such a proposition could be extended, especially given the amendment’s use of the word “person”:
Mr. Johnson: Females as well as males?
Mr. Howard: Mr. Madison does not say anything about females.
Mr. Johnson: “Persons.”
Mr. Howard: I believe Mr. Madison was old enough and wise enough to take it for granted that there was such a thing as the law of nature which has a certain influence even in political affairs, and that by that law women and children are not regarded as the equals of men.

History isn’t a feather. It’s an albatross.

Last year, Neil Gorsuch delivered a memorial tribute to Scalia, in which he said that the Justice’s greatest contribution to jurisprudence was his commitment to historical inquiry. Gorsuch said that Scalia had reminded legal scholars that, rather than contemplating the future, “judges should instead strive (if humanly and so imperfectly) to apply the law as it is, focusing backward, not forward.”

Scalia spent much of his career arguing for the importance of history in the interpretation of the law. “If ideological judging is the malady,” Scalia said in 2010, “the avowed application of such personal preferences will surely hasten the patient’s demise, and the use of history is far closer to being the cure than being the disease.”

Gorsuch’s account of this debate is more measured. Whose history? How far back? “In due process cases, the Supreme Court has frequently looked not only to this nation’s history, but also to English common law,” Gorsuch has written. “But why stop there? Why not examine Roman or Greek or some other ancient precedent as, say, Justice Blackmun did in his opinion for the Court in Roe v. Wade? And what about contemporary experience in other Western countries?” His book on assisted suicide contains a chapter, called “The Debate Over History,” that applies the history test to the question of the right to die. He began his survey with Plato, hopscotched across the centuries, and decided that, while a consensus had grown “that suicide is essentially a medical problem,” the historical record offers, at best, limited support for the idea of a right to assisted suicide and euthanasia. Gorsuch, an eloquent and candid writer, has his doubts about the history test. He writes, “The history test, for all its promise of constraining judicial discretion, carries with it a host of unanswered methodological questions and does not always guarantee the sort of certainty one might perhaps hope for.”

Gorsuch may be dubious about the history test, but he happens to be a particularly subtle scholar of precedent. (He’s a co-author of a new book, “The Law of Judicial Precedent”; Scalia had been meant to write the foreword.) And he’s written powerfully about the relationship between history and the law. In 2015, Gorsuch wrote an opinion in a case that concerned Alfonzo Deniz Robles. Deniz, a Mexican citizen, twice entered the United States illegally. He married an American citizen, and had four children. In 2005, the Tenth Circuit court ruled that an immigrant in Deniz’s position was grandfathered into a lapsed program that allowed him to pay a fine and apply for residency, so Deniz applied for a visa. The government held up his application for years, and by the time it was reviewed the Board of Immigration Appeals, an executive agency, had overruled the court, requiring him to leave the country for ten years before applying for residency. (“It was, like, Today you can wear a purple hat but tomorrow you can’t,” Deniz’s wife, Teresa, told me. “It was mind-boggling.”) Deniz appealed, on the ground that his rights to due process had been violated.

The appeal reached Gorsuch’s court in 2014, at which point immigration services told Deniz, as Gorsuch explained, “that he’d have to start the decade-long clock now even though if he’d known back in 2005 that this was his only option, his wait would be almost over.” Writing for the court, Gorsuch explained that judicial reasoning is always backward-looking, while legislation is forward-looking; he cited a thirteenth-century English jurist to establish that the presumption against retroactive legislation is nearly as old as common law, and the retrospective effect of judicial decisions, he said, has been established for almost a thousand years. But what about acts of the executive branch? Gorsuch said that if an executive agency is acting like a judge its rulings are retroactive, but if it’s acting like a legislature its rulings are prospective. That is, if the Board of Immigration Appeals makes a new policy, it can’t apply it to people who made choices under the old policy. The Tenth Circuit ruled in favor of Deniz. He still doesn’t have a green card. That will likely take years.

The chain of cases that interests Stone in “Sex and the Constitution” will be revisited by a newly constituted Supreme Court, once Scalia’s replacement finally takes a seat. More immediately, though, the Court will be asked to rule on the due-process and equal-protection claims made in opposition to President Trump’s early executive orders, as a matter of federal law. “A temporary absence from the country does not deprive longtime residents of their right to due process,” eighteen state attorneys general and others argued in a brief challenging the Trump Administration’s travel ban. Gorsuch’s several rulings urging restraint of the executive branch carry a particular weight in this new political moment, in which the history test is already being applied to those orders. “The framers worried that placing the power to legislate, prosecute, and jail in the hands of the Executive would invite the sort of tyranny they experienced at the hands of a whimsical king,” Gorsuch wrote in a dissent from 2015. A lot of people are still worried about that.

Alfonzo and Teresa Deniz, who live in Wyoming with their kids, have so far spent more than forty thousand dollars on legal fees. They’ve got another court date, on March 21st, the day after the Senate Judiciary Committee begins hearings on Gorsuch’s nomination. The law keeps changing. “You hear a lot of things,” Teresa told me. “It’s scary.” She’s terrified that her children will lose their father. I asked Teresa if she and her husband had ever met Neil Gorsuch. She said no. She didn’t know that he’d been nominated to the Supreme Court. I asked her if she had a message for the Court. “Look at the families,” she said. She began to cry. She said, “I just hope that they can come up with something that is justice.”

Jill Lepore is a staff writer at The New Yorker and a professor of history at Harvard. “The Secret History of Wonder Woman” is her latest book.

Is Socially Responsible Capitalism Losing?

No More Mr. Nice Guy by Sheelah Kolhatkar


In December, 2015, a new startup called Juno entered the ride-hailing market in New York City with a simple proposition: it was going to treat its drivers better than its competitors, notably Uber, did theirs—and do “something that was socially responsible,” as one of Juno’s co-founders, Talmon Marco, told me last fall. In practice, that meant drivers would keep a bigger part of their fares and be eligible for a form of stock ownership in the company. But, on April 26th, when an Israeli company named Gett announced that it was buying Juno for two hundred million dollars, that changed. The merged company is dropping the restricted stock plan for drivers, and those who already hold stock are being offered small cash payments, reportedly in the hundred-dollar range, in exchange.

Juno’s founders had adopted the language of a doing-well-by-doing-good philosophy that has spread in the business world in recent years. Some call it conscious or socially responsible capitalism, but the basic idea is that any business has multiple stakeholders—not just owners but employees, consumers, and also the community—and each of their interests should be taken into account. The idea arose in response to an even more powerful principle: the primacy of investor rights. In a new book, “The Golden Passport,” the journalist Duff McDonald lays much of the blame for that thinking at the feet of a Harvard Business School professor named Michael Jensen, whose “agency theory,” developed in the nineteen-seventies, sought to align the interests of managers with those of the company’s investors. (Gordon Gekko spoke eloquently on its behalf in the movie “Wall Street.”) This alignment led to huge stock-option pay packages for top corporate managers and, McDonald argues, provided an intellectual framework that justifies doing anything (within the law) to increase a company’s stock price, whether that be firing workers or polluting the environment.

In this philosophical tension, the investors-above-all doctrine seems to have triumphed over the more inclusive approach. “I think what’s recent is maybe being so completely blatant about it,” Peter Cappelli, a professor and labor economist at Wharton, said. When American Airlines agreed to give raises to its pilots and flight attendants in April, analysts at a handful of investment banks reacted bitterly. “This is frustrating,” a Citigroup analyst named Kevin Crissey wrote in a note that was sent to the bank’s clients. “Labor is being paid first again. Shareholders get leftovers.” Jamie Baker, of JPMorgan, also chimed in: “We are troubled by AAL’s wealth transfer of nearly $1 billion to its labor groups.”

Those comments were mocked online, but similar sentiments are everywhere in the financial establishment. Both Costco and Whole Foods—whose C.E.O., John Mackey, wrote the book “Conscious Capitalism”—have been criticized by Wall Street investors and analysts for years for, among other things, their habit of paying workers above the bare minimum. Paul Polman, who, as C.E.O. of the Anglo-Dutch conglomerate Unilever, has made reducing the company’s carbon footprint a priority, recently fought off a takeover bid from Kraft Heinz, which is known for its ruthless cost-cutting.

Newer platform companies have also encountered the phenomenon. An app called Maple, which made the nearly unheard-of decision to offer health benefits and employee status to its food-delivery people, folded in recent months. Etsy, which allows craftspeople to sell their goods online, and which became known for its employee perks, has lost most of its stock-market value since it went public, in 2015; hedge-fund investors have been pushing the company to reduce its costs and to lay off employees. In the case of Juno, according to a person familiar with its operations, the founders sold the company and agreed to cut its driver stock awards because they couldn’t find new investors to finance its growth. “They were stuck from an expansion perspective, and this was what had to give,” I was told. “It came with some huge compromises.”

Many factors contributed to the troubles of these companies, but Cappelli notes how “vociferously the investment community seems to object to being nice to employees. It’s a reminder that, in the corporate world, things are constantly yielding to the finance guys—whether they know what they’re doing or not.”

This fixation on short-term stock gains is inherently unstable, Cappelli said. “The interesting thing is always to ask them, ‘What’s the value proposition for employees? Why should these people work only for the interest of the shareholders? How are you going to get people to work hard?’” He went on, “I don’t think they have an answer.”

When I called a Juno driver named Salin Sarder to ask about the latest developments, he was surprised to learn that the Juno stock-grant program had been cancelled, and blamed his ignorance on the fact that he hadn’t checked his e-mail. (The company has not made a public statement and did not respond to my inquiries.) He was, on the other hand, pleased to learn that the new Juno-Gett would be honoring the favorable commission rate Juno had been offering, at least for a few months. He also had a few thoughts about the app-economy business model favored by Silicon Valley investors. “If you are a millionaire and all around you is poor, you have no safety,” Sarder, who comes from Bangladesh, said. “Happiness is there when everyone has happiness.” 

This article appears in other versions of the June 5 & 12, 2017, issue of The New Yorker, with the headline “No More Mr. Nice Guy.”

The Work You Do, The Person You Are by Toni Morrison


All I had to do for the two dollars was clean Her house for a few hours after school. It was a beautiful house, too, with a plastic-covered sofa and chairs, wall-to-wall blue-and-white carpeting, a white enamel stove, a washing machine and a dryer—things that were common in Her neighborhood, absent in mine. In the middle of the war, She had butter, sugar, steaks, and seam-up-the-back stockings.

I knew how to scrub floors on my knees and how to wash clothes in our zinc tub, but I had never seen a Hoover vacuum cleaner or an iron that wasn’t heated by fire.

Part of my pride in working for Her was earning money I could squander: on movies, candy, paddleballs, jacks, ice-cream cones. But a larger part of my pride was based on the fact that I gave half my wages to my mother, which meant that some of my earnings were used for real things—an insurance-policy payment or what was owed to the milkman or the iceman. The pleasure of being necessary to my parents was profound. I was not like the children in folktales: burdensome mouths to feed, nuisances to be corrected, problems so severe that they were abandoned to the forest. I had a status that doing routine chores in my house did not provide—and it earned me a slow smile, an approving nod from an adult. Confirmations that I was adultlike, not childlike.

In those days, the forties, children were not just loved or liked; they were needed. They could earn money; they could care for children younger than themselves; they could work the farm, take care of the herd, run errands, and much more. I suspect that children aren’t needed in that way now. They are loved, doted on, protected, and helped. Fine, and yet . . .

Little by little, I got better at cleaning Her house—good enough to be given more to do, much more. I was ordered to carry bookcases upstairs and, once, to move a piano from one side of a room to the other. I fell carrying the bookcases. And after pushing the piano, my arms and legs hurt so badly. I wanted to refuse, or at least to complain, but I was afraid She would fire me, and I would lose the freedom the dollar gave me, as well as the standing I had at home—although both were slowly being eroded. She began to offer me her clothes, for a price. Impressed by these worn things, which looked simply gorgeous to a little girl who had only two dresses to wear to school, I bought a few. Until my mother asked me if I really wanted to work for castoffs. So I learned to say “No, thank you” to a faded sweater offered for a quarter of a week’s pay.

Still, I had trouble summoning the courage to discuss or object to the increasing demands She made. And I knew that if I told my mother how unhappy I was she would tell me to quit. Then one day, alone in the kitchen with my father, I let drop a few whines about the job. I gave him details, examples of what troubled me, yet although he listened intently, I saw no sympathy in his eyes. No “Oh, you poor little thing.” Perhaps he understood that what I wanted was a solution to the job, not an escape from it. In any case, he put down his cup of coffee and said, “Listen. You don’t live there. You live here. With your people. Go to work. Get your money. And come on home.”

That was what he said. This was what I heard:

1. Whatever the work is, do it well—not for the boss but for yourself.

2. You make the job; it doesn’t make you.

3. Your real life is with us, your family.

4. You are not the work you do; you are the person you are.

I have worked for all sorts of people since then, geniuses and morons, quick-witted and dull, bighearted and narrow. I’ve had many kinds of jobs, but since that conversation with my father I have never considered the level of labor to be the measure of myself, and I have never placed the security of a job above the value of home.

Sunday, June 11, 2017

The Wolf Who Cried Boy

"Cosby is hoping to bank on this brutal history and use it as cover to cast doubt on his own crimes. I mourn that decision because of the confusion and chaos it will cause for black people who are actually innocent, who were actually falsely accused, whose innocence will be doubted even further because a clearly guilty person has misused a historical reality for his own benefit — a historical reality he spent a good chunk of public speeches denying."

https://medium.com/@SonofBaldwin/the-wolf-who-cried-boy-2341c6cd47ba