Did Mass Immigration Destroy Israel’s Institutions?

The pro-institution case for increased immigration continues to get better. A new working paper by Benjamin Powell and others looks to Israel as a natural experiment. From the abstract:

The relaxation of emigration restrictions in the Soviet Union and the State’s subsequent collapse led to a large exogenous shock to Israel’s immigrant flows because Israel allows unrestricted immigration for world-wide Jews. Israel’s population increased by 20 percent in the 1990s due to immigration from the former Soviet Union. These immigrants did not bring social capital that eroded the quality of Israel’s institutional environment. We find that high quality political institutions were maintained while economic institutions improved substantially over the decade. Our case study finds that the immigrants played an active role in this institutional evolution and we also employ a synthetic control to verify that it is likely that the institutions improvement would not have occurred to the same degree without the mass migration.

The authors conclude,

This finding in no way proves that in every case unrestricted migration would not harm destination country institutions. However, as a complement to Clark et al. (2015), which found in a cross-country empirical analysis that existing stocks and flows of immigrants were associated with improvements in economic institutions, it should increase our skepticism of claims that unrestricted migration would necessarily lead to institutional deterioration that would destroy the estimated “trillion dollar bills” that the global economy could gain through much greater migration flows (pgs. 25-26).

The Garden of Enid: AuthorCast with Scott Hales

This is part of the DR Book Collection.

While Bill Watterson’s Calvin & Hobbes touched on childhood and life experience more generally, cartoonist Scott Hales delves into the details and nuances of Mormonism’s unique and somewhat odd culture while capturing a similar kind of magic. His new graphic novel–The Garden of Enid: Adventures of a Weird Girl, Part One–follows the thoughts and experiences of Enid: a witty, contemplative, socially-awkward (“weird”) 15-year-old Mormon girl. The hilarity of the strips stems from the portrayals of embarrassingly familiar situations faced by young Mormons: stake dances, boring teachers, YW camp, EFY, etc. Reading them feels like being in on an inside joke. Their depth, however, emerges from the moments of loneliness, uncertainty, reflection, and flickers of human connection. For me, the heart of the graphic novel is summed up in Enid’s exchange with her McConkie-loving seminary teacher who dismisses her “weird questions” in favor of a supposedly “simple”, “black and white” gospel. By contrast, the God Enid believes in is a “colorful” one “who likes weird questions.” Similarly, life is not “black and white.” It’s not even gray. It’s vibrant.

The Garden of Enid is what it is to be an American Mormon in microcosm. Even though the main character is a Mia Maid, Enid’s experiences can resonate with Mormons of all ages and genders. For me, Enid is that ward member that you have an unexpected but incredibly moving moment with; the member who totally “gets it” when you’re unable to put on a smile at church. But she also–like Calvin–can model what not to do and how to cut oneself off from others. Like the best comic strips, Enid allows you to both laugh and reflect. And it’s a nice reminder that not only is God colorful, but so is life.

You can see my full review (from which the above is taken) at Worlds Without End. You can listen to cartoonist Scott Hales interviewed on Greg Kofford Books’ AuthorCast here.

Climate Change and Economic Growth

Philosopher Joseph Heath has an enlightening working paper on the economics and ethics of climate change. Heath is emphatic that his goal is

not to make a case for the importance of economic growth, but merely to expose an inconsistency in the views held by many environmental ethicists. Part of my reason for doing so is to narrow the gap somewhat, between the discussion about climate change that occurs in philosophical circles and the one that is occurring in policy circles, about the appropriate public response to the crisis. One of the major differences is that the policy debate is conducted under the assumption of ongoing economic growth, as well as an appreciation of the importance of growth for raising living standards in underdeveloped countries. The philosophical discussion, on the other hand, is dominated by the view that ongoing economic growth is either impossible or undesirable, leading to widespread acceptance of the steady-state view. This view is, however, a complete non-starter as far as the policy debate is concerned, because it is too easily satisfied. As a result, its widespread acceptance among philosophers (and environmentalists) has led to their large-scale self-marginalization (pg. 31).

Drawing on research by economists Nicholas Stern and William Nordhaus, Heath proceeds to point out how misleading language often distorts and exaggerates the negative impact of climate change:

Stern adopts a similar mode of expression when he suggests that “in the baseline-climate scenario with all three categories of economic impact, the mean cost to India and South-East Asia is around 6% of regional GDP by 2100, compared to a global average of 2.6%.” The casual reader could be forgiven for thinking that the reference, when he speaks of “loss in GDP per capita,” is to present GDP. What he is talking about, however, is actually the loss of a certain percentage of expected future GDP. In some cases, he states this more clearly: “The cost of climate change in India and South East Asia could be as high as 9-13% loss in GDP by 2100 compared with what could have been achieved in a world without climate change.” The last clause is of course crucial – under this scenario, GDP will not be 9-13% lower than it is right now, but rather lower than it might have been, in 2100, had there not been any climate change…In other words, what Stern is saying is that climate change stands poised to depress the rate of growth. This type of ambiguity has unfortunately become common in the literature. An important recent paper in Nature by Marshall Burke, Solomon M. Hsiang and Edward Miguel, estimating the anticipated costs of climate change, presents its conclusions in the same misleading way. The abstract of the paper states that “unmitigated climate change is expected to reshape the global economy by reducing average global incomes by roughly 23% by 2100.” The paper itself, however, states the finding in a slightly different way: “climate change reduces projected global output by 23% in 2100, relative to a world without climate change.” Again, that last qualifying clause is crucial, yet it was the unqualified version of the claim found in the abstract that made its way into the headlines, when the study was published (pgs. 15-16).

Heath acknowledges that

these potential losses are enormous, and they call for a strong policy response in the present. At the same time, what these economists are describing is not a “broken world,” in which “each generation is worse off than the last.” On the contrary, they are describing a world in which the average person is vastly better off than the average person is now – just not as well off as he or she might have been, had we been less profligate in our greenhouse gas emissions. It is important, in this context to recall that annual rate of real per capita GDP growth in India, at the time of writing is 6.3%, and so what Stern is describing is, at worst, the loss of approximately two years worth of growth. At the present rate of growth, living standards of the average person in India are doubling every 12 years. There are fluctuations from year to year, but the mean expectation of several studies, calculated by William Nordhaus, suggests that the GDP of India will be about 40 times larger in 2100 than it was in the year 2000 (which implies an average real growth rate of 3.8%). The 9-13% loss, due to climate change, is calculated against the 40-times-larger 2100 GDP, not the present one (pg. 16-17).
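The growth arithmetic in this passage is easy to verify. Here is a quick sketch in Python; nothing below comes from Heath’s paper beyond the quoted figures (6.3% growth, 40-times-larger GDP by 2100):

```python
import math

# If India's GDP is 40 times larger in 2100 than in 2000, the implied
# average annual real growth rate over those 100 years is:
implied_growth = 40 ** (1 / 100) - 1
print(f"{implied_growth:.1%}")  # ~3.8%, matching the figure Heath cites

# At the 6.3% growth rate quoted for India, living standards double
# every ln(2) / ln(1.063) years:
doubling_years = math.log(2) / math.log(1.063)
print(f"{doubling_years:.1f} years")  # ~11.3 years, i.e. roughly "every 12 years"
```

Which is just to say the numbers hang together: a 9-13% loss against a baseline 40 times larger than today is a very different claim than a 9-13% loss against present GDP.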

The full paper has more details and additional arguments. But this is the kind of serious cost/benefit analysis we need to be having about climate change.

PolitEcho and Difficult Run: Our Echo Chambers Examined

A week or two ago, I saw something interesting and new going around on Facebook: PolitEcho. It’s a cool idea. The app[ref]It’s a Google Chrome extension.[/ref] analyzes the politics of your friends on Facebook–and your feed–and then answers the question, “is your news feed a bubble?”

So I thought it would be fun to ask the Difficult Run editors to run the analysis on their own Facebook profiles and send me the results so that we could publish a little post that showed the respective bubbles of the folks who write for Difficult Run.

Now, before we get to the results, I have to lower expectations just a bit. Like a lot of data visualization projects, PolitEcho doesn’t really live up to its guiding concept. The way it analyzes political affiliation is very, very rudimentary. Instead of doing anything cool like using the Moral Foundations Word Count tool to conduct sentiment analysis on the things your friends actually post[ref]This is just one example of what I’d love to do, given time and resources that I do not have.[/ref], PolitEcho just looks to see whether your friends have “liked” a variety of pre-screened news sources on Facebook. If they like Breitbart, for example, they’re conservative. If they like DailyKos, they’re liberal. In other words, don’t read too much into this.
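To give a sense of just how rudimentary this kind of heuristic is, here’s a toy sketch of a likes-based leaning score. This is not PolitEcho’s actual code; the page list and weights are made up for illustration:

```python
# Hypothetical pre-screened pages: +1 leans conservative, -1 leans liberal.
PAGE_LEANINGS = {"Breitbart": +1.0, "DailyKos": -1.0}

def leaning_score(liked_pages):
    """Average the leanings of any pre-screened pages a friend has liked.
    Returns None if none of their likes are on the pre-screened list."""
    matched = [PAGE_LEANINGS[p] for p in liked_pages if p in PAGE_LEANINGS]
    return sum(matched) / len(matched) if matched else None

print(leaning_score(["Breitbart", "Knitting Weekly"]))  # 1.0
print(leaning_score(["Knitting Weekly"]))               # None
```

Note that friends who never “like” news pages can’t be scored at all, which is one more reason not to read too much into the results.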

That said–and mostly for fun–here are what our political bubbles look like.

Nathaniel

Monica

Walker

Allen

So, that’s what our social networks look like, from one particularly naive political viewpoint.

How about yours?

Can Identity Politics Defend Liberty?

When Russell Fox highly praised an article with the headline The Defense of Liberty Can’t Do Without Identity Politics, I knew I had to read it. Between his praise and the tagline of the site–“moderation in pursuit of justice”–I was fascinated to see what a fusion of classical liberalism and identity politics might look like. As it turns out, however, it’s not an alliance that I can see any hope for.

The first indication that things were going awry was author Jacob Levy’s dismissal of the Trump win as not even really needing an explanation:

Donald Trump received a smaller share of the popular vote than Mitt Romney did in 2012, but his Electoral College victory was so unexpected that it seems to call forth explanation after explanation.

The idea is that Trump’s apparently overwhelming victory is basically a figment of the peculiar nature of our voting system. In reality, it was about 80,000 votes in three states [ref]Pennsylvania, Michigan, and Wisconsin[/ref] that proved decisive, and such a small number of votes “is susceptible of almost endless plausible explanations.”

All of which is true. And all of which misses the point entirely. When someone as objectionable as Donald Trump performs about as well as your typical Republican candidate, that is not a reason to wave your hands dismissively. That is a matter for serious reflection, because Trump was far, far from a typical Republican politician. Levy seems to be saying that Trump did more or less as well–plus or minus an insignificant fraction of total voters–as anybody else would have: Jeb Bush, Marco Rubio, Mitt Romney, whatever. But that’s not an explanation of anything. It actually happens to be the very fact which demands an explanation!

The second misstep is fundamentally misconstruing the nature of political correctness. To hear Levy tell it, identity politics is basically indistinguishable from civility and common moral rectitude. Thus, Levy insists that Trump’s low points, such as the attack on Judge Curiel, the attack on the Khan family, or the Billy Bush video, were all instances of Trump violating political correctness. Ergo, Trump did not rise when he contravened identity culture, but rather fell, and so you can’t credit any kind of anti-PC sentiment for his victory.

But to categorize these mistakes as exclusively or even primarily about political correctness makes little sense. The Khan family is a Gold Star family, and when has concern for the military ever been associated with political correctness or identity politics? Yes, the Khan family is also Muslim, but there’s no way to describe this as only or even mainly about political correctness. The same goes for the Billy Bush tapes, where–once again–Trump’s foul language and outright criminal behavior violated not only the norms of political correctness (for being misogynistic) but also of–as I said earlier–basic decency. The attack on Curiel was the only one that could fairly be categorized as substantially about political correctness and little else, and so it’s a fundamental mistake to draw the conclusion that whenever Trump violated PC norms his poll numbers fell. On the contrary, his penchant for trampling on political correctness was a defining attribute of his campaign.[ref]According to Pew, “By a ratio of about five-to-one (83% to 16%), more Trump supporters say too many people are easily offended. Among Clinton supporters, 59% think people need to exercise caution in speaking to avoid offending others, while 39% think too many are easily offended.” And that’s just one example.[/ref]

Speaking more broadly, however, Levy’s dismissive attitude towards the excesses of political correctness and identity politics fundamentally misapprehends what that movement is already about. On the issue of college campuses, he writes:

It turns out that 18-year-olds seized of the conviction of their own righteousness are prone to immoderation and simplistic views. (Who knew?)

But–as amusing as those stories are–he neglects the part where people lose their jobs as a result of these temper tantrums and how this very real threat has led to a climate of fear and paranoia.[ref]This is especially true when the protests and repercussions spill outside of college campuses. Go back to Brendan Eich and start from there.[/ref] Nor is that just a matter of anecdotes. In “Political diversity will improve social psychological science” a team of researchers[ref]José L. Duarte, Jarret T. Crawford, Charlotta Stern, Jonathan Haidt, Lee Jussim, and Philip E. Tetlock[/ref] substantiate the following claims:

  1. Academic psychology once had considerable political diversity, but has lost nearly all of it in the last 50 years.
  2. This lack of political diversity can undermine the validity of social psychological science via mechanisms such as the embedding of liberal values into research questions and methods, steering researchers away from important but politically unpalatable research topics, and producing conclusions that mischaracterize liberals and conservatives alike.
  3. Increased political diversity would improve social psychological science by reducing the impact of bias mechanisms such as confirmation bias, and by empowering dissenting minorities to improve the quality of the majority’s thinking.
  4. The underrepresentation of non-liberals in social psychology is most likely due to a combination of self-selection, hostile climate, and discrimination.

So much for the “kids will be kids” approach Levy favors. Undermining free speech and open inquiry on college campuses and beyond strikes me as a legitimately concerning trend, not just a kind of cute overzealousness.

Levy also characterizes identity politics as starting and ending with concern for particular groups of individuals. Anti-sodomy laws used to discriminate against gays can be written in apparently neutral terms (with regard to sexuality), and America’s racist criminal justice system is ostensibly colorblind. In order to reform these discriminatory systems, Levy insists, we have to have identity-conscious politics that refuse to stop at the most superficial veneer of impartiality.

I agree with Levy on this point,[ref]See my review of The New Jim Crow for more info on why.[/ref] but what we agree on and what identity politics constitutes are two different things. Take the criminal justice system, for example. The inequality is evident in statistics that indicate blacks and whites use drugs at roughly equivalent levels, but that blacks are more likely to be arrested, charged with more serious offenses, and convicted. It is entirely possible to oppose this because you want blacks and whites to be treated identically. This is nothing new. It’s the same spirit–broadened and expanded–as “all men are created equal.”

But what does this have to do with the doctrine of intersectionality, a political idea rooted in the fundamental alienation of people based on categories of race, gender, and sexuality? How can a universal view of humanity where we’re all fundamentally alike–and should be treated that way–possibly coexist with a doctrine that takes as axiomatic the mutual incomprehensibility of our lives based on identity categories?[ref]And, even more sinister perhaps, seems to imply that the experiences of individuals within designated identity categories are fungible.[/ref]

What does this have to do with the stubborn insistence of contemporary social justice warriors–many of whom come from extremely privileged backgrounds (as their prevalence on elite college campuses renders obvious)–that we check our racial privilege while ignoring other forms of privilege that are at least as relevant but would indict them as well? (I’m looking at you, socio-economic class.)

Or, to expand things a bit more, let’s consider critical race theory which–according to its own proponents–“rejects the traditions of liberalism and meritocracy.”[ref]UCLA School of Public Affairs: What is Critical Race Theory?[/ref] Levy argues that “the defense of liberty can’t do without identity politics,”[ref]I’m assuming he wrote the title, but even if an editor came up with it, it reflects his argument accurately enough.[/ref] but it turns out that the actual practitioners of identity politics think they can get along without liberty (at least: classical liberalism) just fine, thanks very much. Levy might think he’s on the side of the politically correct and the social justice advocates of identity politics, but I’m pretty sure the feeling’s not mutual.[ref]For more on the gap between identity politics and conventional notions of justice, see: When Social Justice Isn’t About Justice.[/ref]

Aside from these particular ideological incompatibilities between classical liberalism and identity politics, we also have research from Bradley Campbell and Jason Manning delving into the rise of “victimhood culture” as something genuinely new and unique, using the same kinds of instances Levy dismisses as insignificant to illustrate “large-scale moral change” and the rise of a distinct victimhood culture. The only other two moral cultures they identify are honor culture and dignity culture, so it’s not like we get a new one of these every decade or so. This shift is seismic.

In the end, Levy believes that identity culture and classical liberalism can be allies. And, insofar as what he means is that “the progress of freedom depends on those who know where the shoe chafes,”[ref]That’s the caption of the article’s photo, and I’m not sure if Levy wrote it or not.[/ref] then I agree. The trouble is that the identity politics of today–however they started–are effectively a method of entrenching socio-economic inequality by diverting attention away from the privileges of wealth and elite education with an emphasis on race, gender, and sexuality that–while sometimes vital in specific cases–becomes in its myopic form a tool of oppression rather than of freedom. Fundamentally, the project of contemporary identity politics is both historically unique and essentially anti-liberal.

The coalition we need to build is not one between libertarianism and identity politics. On the contrary, the coalition we need today is one of those who reject identity politics (whether they lean to the left or to the right) against those who embrace it (whether they lean to the left or to the right).[ref]See also: Victimhood Culture Metastasizes[/ref] This coalition will not bring about a happy utopia, because vital partisan differences will remain, but it will forestall the widening division and social dissolution that have wrought so much dysfunction and destruction on our political and social institutions in recent years.

Why Trump “Tortured” Romney

According to Roger Stone (a Trump adviser):

Donald Trump was interviewing Mitt Romney for secretary of State in order to torture him… To toy with him.[ref]Via The Hill[/ref]

That might be all there is to it. Far be it from me to put pure pettiness past Trump. But whatever the motives, the move comes with an important fringe benefit for Trump. From now on, whenever Romney criticizes Trump, the Trump team can spin it as nothing more than sour grapes.

I’m loath to give Trump credit as some kind of political savant just because he won. I’m more inclined to give credit to larger political forces and sheer dumb luck. But that kind of sabotage doesn’t seem outside the realm of possibility. It’s a petty, but useful, way to neutralize the most visible and respected #NeverTrump Republican.[ref]The fact that a lot of credulous people depicted Romney’s carefully-worded statements as sucking up to Trump is just another way to undermine a potent critic.[/ref]

The Cost of the Death Taboo

This post is part of the General Conference Odyssey.

Two thoughts from two different talks.

In “Blessed Are the Peacemakers”, Elder Burton said that:

We forget that we are not, and cannot be, totally independent of one another either in thought or action. We are part of a total community. We are all members of one family, as Paul reminded the Greeks at Athens when he explained that God “hath made of one blood all nations of men to dwell on all the face of the earth.” (Acts 17:26)

Although Elder Burton went in a different direction, that thought made me think about the talk before his, Elder LeGrand Richards’ What After Death?

I thought today that I would like to direct what I have to say to those parents who have lost children in death before they reached maturity and could enter into the covenant of marriage and have their own children here upon this earth. I reckon that there aren’t many families who haven’t had that experience.

Elder Richards was born in 1886. I wondered what childhood mortality rates looked like for him, so I checked a great site (Our World in Data), but data for the United States only goes back to 1933.

I added in the United Kingdom and then France to get an older data set.[ref]You can see from the graph that, while the lines are not identical, they follow a similar trend.[/ref] So, using France as a proxy, the kind of child mortality that Elder Richards would have been familiar with was between 250 and 225 children per 1,000 dying before the age of 5.

By the time of this conference in 1974, the rate was down to about 20. For the most recent data (2013) the rate is about 4. In other words, the chances that a given newborn would die before age 5 have fallen from 25% to 2% to 0.4% from the time that Elder Richards was born to the time when he gave his talk to the time we are alive today. For a family with four small children, the chance that none would die before the age of 5 was only 32% when Elder Richards was born. It was 92% in 1974. It is 98% today.
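Those family-level figures come from compounding the per-child rates; here’s a quick sketch, assuming four children (my assumption) and treating each child’s risk as independent:

```python
# Probability that none of n children dies before age 5, given the
# under-5 mortality rate per child (independence assumed).
def all_survive(under5_mortality, n_children=4):
    return (1 - under5_mortality) ** n_children

# Rates: ~250/1,000 circa 1886, ~20/1,000 in 1974, ~4/1,000 in 2013.
for year, rate in [(1886, 0.25), (1974, 0.02), (2013, 0.004)]:
    print(year, f"{all_survive(rate):.0%}")  # 32%, 92%, 98%
```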

When he said, “I reckon that there aren’t many families who haven’t had that experience,” he was absolutely correct for his time, but the world has changed substantially since then.

The reason that I connect the two talks is that Elder Burton reminds us of how integral family is to our identity. As the saying goes: we’re social animals. And the first society is the family. This is a vital truth about who we are as human beings. I don’t think anything could drive that lesson home more powerfully than the unimaginable tragedy of losing a young child and having that family circle broken, at least temporarily.

I say “unimaginable” because to me it is. In my lifetime, having all your children survive to adulthood isn’t the exception; it’s the rule. But Elder Richards didn’t have to imagine it. As he discussed in his talk, two of his children died before they were old enough to be married.[ref]I realize my definition—dying before age 5—and Elder Richards’ definition—dying before being old enough to have married and have children—are not the same. I hope you can forgive the inaccuracy; I just went with the data I could quickly find.[/ref]

Right now I am reading The Clockwork Universe: Isaac Newton, the Royal Society, and the Birth of the Modern World. The author—Edward Dolnick—is at great pains to show how different the world of the 17th century was from the world of today. Back then, for example, no one knew what caused disease and nobody could do anything about it. From the Great Fire of London (1666) to a resurgence of Black Death (1665), the men and women who lived at that time lived their entire lives under the shadow of inexplicable, uncontrollable death.

One thing Dolnick doesn’t understand, however, is how recently that has changed. Modernity may have dawned in the 17th and 18th centuries but—as the childhood mortality figures show—disease and accident continued to make death a common, everyday experience well into the 20th century. Not long ago I read Samuel Brown’s incredible book, Through the Valley of Shadows. Although it’s a technical book in many ways, Brown sets up his main discussion (of living wills, advance directives, and intensive care units) with a discussion of “the dying of death.”

Before the Dying of Death, death was part of everyday experience. Death was recognized as horrifying, but people were able to understand it as part of the overall meaning of life and knew how to prepare for it when the time came. The understanding of death was broad enough to cross religious boundaries… By the end of the Dying of Death, Americans had contained the terror of death by simply ignoring it until the moment of crisis, but the sanctity of death had disappeared along with its menacing presence. People found themselves newly unprepared when they came to die. Where many generations of humans had spent most of their lives preparing for their deathbed, modern Americans spent only hours to at most days, right in their death agony, trying to come to terms with what was once called the King of Terrors… Since twentieth-century Americans had not generally spent their lives in the shadow of death, when they came to approach Death, as every human being inevitably does, they discovered just how culturally defenseless they were before its terrible power.[ref]Through the Valley of Shadows, page 27[/ref]

These changes occurred during the late 19th and early 20th centuries, when basic understanding of the germ theory of disease led to incredible advances in public health, but at that time there was still effectively nothing doctors could do to combat most diseases once they took hold. I was surprised at how recently this transition occurred, but according to Brown, “physicians were mostly bad for your health until the recent past. The Baby Boomers are really the first generation born under the aegis of modern medicine.”[ref]Through the Valley of Shadows, page 32[/ref]

So, prior to the 1960s, doctors really couldn’t do anything at all to actively intervene in a wide variety of life-threatening medical emergencies. Since that time, however, our ability to postpone death has grown tremendously, to the point where ICUs frequently perform medical miracles. So, what has this newfound power achieved for us?

Well, it hasn’t all been good. Brown observes that “A major problem in contemporary society is that we combine our distaste for struggle or pain or disability with an unspeakable fear of death.”[ref]Through the Valley of Shadows, page 29[/ref] We have, in effect, stigmatized dying. As a result, “The dying–once celebrated as people with special wisdom who deserved the rapt attention of family and even strangers—[have] become America’s dirty secret.”[ref]Through the Valley of Shadows, page 29[/ref]

Additionally, ICUs—the frontlines in modern America’s war on death—have become places of trauma: “Many people leave the ICU with emotional scars as severe as those carried by combat veterans. Only a minority skate by without anxiety, depression, PTSD, or some combination of the three.”[ref]Through the Valley of Shadows, page 167[/ref] This trauma is often the result of delusions, and rape delusions in particular: “it’s common for female patients to have memories of rape from urinary bladder catheters”[ref]Through the Valley of Shadows, page 140[/ref]. There are others, however:

The rape delusions associated with bladder catheters are haunting enough, but they don’t exhaust the list of terrible memories people often acquire in the ICU. Most of these frightening delusions relate to imprisonment, capture, or torture. Some feature aliens or homicidal doctors and nurses. More than a few incorporate the famous Capgras delusion, in which the important people in a person’s life are replaced by evil duplicates. These interpretations likely derive from the intense, paranoid attention that comes with high stress coupled with acute pain. The distressed brain tries to weave a meaningful narrative to explain why familiar faces (or people in professional gear and lab coats) are poking and prodding you as you are tied to a bed… some are frankly horrifying.

Let me explain why I’ve taken us on this long, long tangent. What I’m trying to explain is that as we’ve grown in our power to confront death, we have rediscovered an ancient truth: power brings responsibility. This isn’t just about superheroes. It’s about ordinary men and women with no medical training and no preparation suddenly being told by doctors that it’s up to them to determine whether a beloved parent, or spouse, or child should live or die. But—precisely because death is so remote and even taboo—we’re completely and totally unprepared to shoulder this burden. As a result, many are crushed underneath it.

The majority of patients and families [emphasis added] come out of the ICU with post-traumatic stress, anxiety, or depression. They are more shell-shocked than combat veterans, according to an array of recent studies.[ref]Through the Valley of Shadows, page 5[/ref]

When it’s not about individuals staggering under the weight of responsibility they have no preparation for, it’s monstrous institutional inertia instead:

A friend’s elderly father, a devout Catholic, received his last rites in a hospital. He struggled against the wrist restraints to create the sign of the cross in response to the priest’s gentle ministrations. The restraints, intended to keep him from dislodging any medical equipment, obstructed his desperate hunger to participate in the healthful rituals of the deathbed. He died later that day. It never occurred to the nurses and doctors to release the restraints for this final interaction with his priest. My friend and his family still remember that angry straining for divine connection, stymied by medical handcuffs.[ref]The Valley of Shadows, pages 137-138[/ref]

I share all this because if I just said, “Gee, now our children don’t die, and that’s weakened our appreciation for family,” it would sound banal (at best) or monstrously cruel (at worst). That’s not what I want to say. But I do want to illustrate how our medical prowess—despite absolutely being a blessing we should never surrender[ref]I don’t want any confusion on that point.[/ref]—has nonetheless presented us with fresh sets of problems we did not have to confront before.

When we stood powerless before death we had a kind of innocence. Now death seems to be far more contained, striking not children and spouses in their homes but the elderly in hospitals and hospices, and so we are all the less prepared to deal with it when it comes, as it surely must. That innocence is gone. Before we didn’t have to choose. Now—collectively and often individually—we do.

I feel like I need to say it again, and so I will one more time: I do not want to turn back the clock. I do not want to live in a world where having four children means probably having to watch at least one of them die in my arms. I want to live in a world where we can cure diseases and heal the sick. I thank God daily that my children are healthy and safe.

But this is a world that presents new and strange challenges. Elder Richards knew the pain of burying his own children, and this cemented in him a conviction of the importance of family relationships and the reality of life after death. He paid a high, high price for these blessings, one no parent would willingly pay.

The questions we have to ask are these: How are we going to acquire the wisdom and understanding to shoulder the responsibilities of technologically sophisticated modern medicine? How do we hold onto a fundamental understanding of the vital importance of family relationships in a world where—because death is so rare—we so seldom have to learn through the painfully direct method of heartbreaking loss? How do we find the kind of life-sustaining, bedrock faith of Elder Richards without paying that staggeringly high cost?

I don’t know.

But I do believe that the best place to start is by understanding and cherishing the words and experiences of those who have paid that price before us, and who then bequeathed their words and testimonies to us who follow.

Check out the other posts from the General Conference Odyssey this week and join our Facebook group to follow along!

Illiberal Reformers: An Interview with Thomas Leonard

This is part of the DR Book Collection.

A few years ago, I took an interest in the history of the Progressive Era. This interest was piqued by conservative author Jonah Goldberg’s polemic Liberal Fascism and moved toward more academic research during my undergrad. I studied the history of labor unions and the words and ideas of major progressive icons. One scholar whose work I came into contact with and continued to follow over the years was Princeton economist Thomas Leonard. I’ve known for the last few years that Leonard was working on a book that explored the relationship between progressive reformers’ economic agendas and their enthusiastic support of eugenics. Finally, his Illiberal Reformers: Race, Eugenics, and American Economics in the Progressive Era was published this year through Princeton University Press.

The book meticulously demonstrates that the progressive impulse toward inflating the administrative state was driven largely by self-promotion (i.e., the professionalization of economists), racist ideologies (i.e., the fear of race suicide),[ref]Even seemingly good things like national parks had racist overtones.[/ref] and an unwavering faith in science. Not only should the “undesirables” of the gene pool be sterilized, but they should be crowded out of the labor force as well. Those considered “unfit” for the labor market included blacks, immigrants, and women. In order to artificially raise the cost of employing the “unfit,” progressives sought to implement minimum-wage laws (often argued to be a “tariff” on immigrant labor), maximum-hours rules, and working-standards legislation.

There is far more in Leonard’s book, which not only provides keen insights into progressive economics but also offers an excellent historical overview of race and eugenics in the Progressive Era. Check out his interview on the podcast Free Thoughts below.

When Trigger Warnings Don’t Work

“Trigger warnings” have been all the rage lately. They’ve sparked a national discussion, but what have they really accomplished? “What is a trigger warning?” asks Mariah Flynn, the Education Program Coordinator for the Greater Good Science Center.

The term, often used interchangeably with “content warning,” is a heads up that readers may encounter distressing content—and in recent years, trigger or content warnings have become controversial. To some, like University of Chicago administrators, such warnings keep students from being challenged or engaging with provocative course materials. Others feel that such warnings are useful tools that keep learners from having a strong emotional response to certain kinds of content, usually depicting physical or emotional violence.

For all of the excitement around trigger warnings, they’re actually quite rare. In an effort to gather more information about their use on college campuses, the National Coalition Against Censorship conducted a survey of over 800 educators from the Modern Language Association and the College Art Association—and found that only one percent reported that their institutions had adopted a policy on trigger warnings. Moreover, only fifteen percent of respondents said that students had asked for warnings.

In many respects, framing content warnings as a “censorship” or “free speech” issue is not helpful to professors or students. There is no evidence that they lead to the widespread suppression of troubling material or class discussion. At worst, warnings are merely gratuitous for a majority of students. At their best, however, content warnings can actually help students engage with course material and develop a caring relationship with their teachers.

So while some students may claim they are too “triggered” to read classical mythology, actual policies regarding trigger warnings are rare (even if campus politics are not). Yet Flynn points out that

[a]bout three-fourths of us will experience trauma over the course of our lifetime. About ten percent of those people will develop post-traumatic stress disorder (PTSD), experiencing symptoms like flashbacks, memory gaps, depression, or hyper-vigilance.

Avoiding triggering topics—a very common strategy for people with PTSD—isn’t the best way to process traumatic events. Avoidance of triggers is a symptom of PTSD, not a cure. In fact, exposure therapy (a specific type of cognitive behavioral therapy where patients are exposed to physical or mental reminders of their trauma) is not only the most common method for treating PTSD; it’s also one of the most effective.

This research might lead some to suggest that perhaps we don’t need to be so concerned about students’ exposure to triggering content, if exposure is the best way for them to process past traumatic events. However, exposure therapy works best under the care of a trained therapist. Even though exposure is an effective way to deal with PTSD, instructors aren’t therapists, and the classroom is not an appropriate place for such therapy.

Trigger warnings are also challenging to implement, because identifying potential triggers isn’t easy. Individuals with past trauma are often triggered by seemingly neutral things that have nothing to do with the content an instructor might present in class—the scent of a certain type of cologne or hearing a song associated with the traumatic event they experienced. Educators won’t always know what might trigger a student who is a victim of trauma and can’t possibly provide a warning for everything that might be a trigger.

Flynn suggests three ways of tackling the issue:

  1. Be upfront about what students can expect from your course.
  2. Consider alternative readings or activities.
  3. Offer information on other coping strategies and self-care.

There are ways to be sensitive to the experiences and mental health of students. Implementing trigger warnings doesn’t seem to be the most effective means of doing so, especially when the policy is hijacked by political agendas.

Historians vs. Economists: The History of Slavery

Chiwetel Ejiofor as Solomon Northup in 2013’s ’12 Years a Slave’

A new article over at The Chronicle of Higher Education provides an excellent review of a controversy that has been brewing over the past couple of years and that should be of interest to those who care about history and economics. The controversy surrounds the new history of slavery and capitalism, marked by books like Johnson’s River of Dark Dreams, Beckert’s Empire of Cotton, and especially Baptist’s The Half Has Never Been Told. The main claim among these historians is that slavery was essential to American capitalism and the emergence of the Industrial Revolution. Economists and other social scientists are not convinced. “Most economic historians,” the article states,

have argued that “cotton textiles were not essential to the Industrial Revolution,” and that cotton production did not necessarily depend on slavery, according to [Dartmouth economist] Douglas A. Irwin…Summarizing economists’ thinking…Irwin points out that cotton was grown elsewhere in the world without slaves. Cotton production continued to rise in the United States even after slavery was abolished. “In this view, the economic rise of the West was not dependent on slavery,” Irwin says, “but came about as a result of an economic process described by Adam Smith in his book The Wealth of Nations — a process that depended on free enterprise, exchange, and the division of labor.”

Economists see the problem with the new histories on slavery as

stem[ming] in part from how the discipline of history has developed. In the ’60s and ’70s, historians and economists battled over economic history. But as historians turned toward culture, and economists became more quantitative, economic history increasingly became just a subfield of economics. For a variety of reasons, including the 2008 crisis, historians are turning their attention back to financial matters. But they “did not build up their tools in order to understand the material world,” says Rhode. “And they carry along certain ideological positions which they hold fervently and are not willing to test.” Historians, he says, “can’t be making stuff up.”

Historians, however, see economic history as too reductive:

“The problem is the economists left history for statistical model building,” says Eric Foner, a historian of 19th-century America at Columbia University. “History for them is just a source of numbers, a source of data to throw into their equations.” Foner considers counterfactuals absurd. A historian’s job is not to speculate about alternative universes, he says. It’s to figure out what happened and why. And, in the history that actually took place, cotton was extremely important in the Industrial Revolution.

Some economists who attack the new slavery studies are “champion nitpickers,” adds Foner…”They’re barking up the wrong tree. They’re so obsessed with detail that they don’t really confront the broader dynamics of the interpretations. Yes, I’m sure there are good, legitimate criticisms of the handling of economic data. But in some ways I think it’s almost irrelevant to the fundamental thrust of these works.”

The article is an excellent introduction to an important controversy in historical scholarship. Check it out.