Silence: Book & Film

This is part of the DR Book Collection.

I bought Shūsaku Endō’s classic Silence in early 2016 when I discovered that Martin Scorsese would be bringing it to the big screen toward the end of the year. I’d heard nothing but praise for the novel. However, given that I’m not much of a fiction reader, the book sat on my shelf until December. But once I cracked it open, it struck me as the kind of fiction that Christians need to read. Much of what is labeled as “Christian fiction” (whether in print or film) is superficial fluff reminiscent of God’s Not Dead or the Left Behind series. But Silence tackles subjects like faith vs. doubt, discipleship vs. orthodoxy, and the problem of evil. In essence, it’s what lived religion looks like.

The novel was powerful, as was its recent film adaptation by Scorsese (in my view, one of the best films of the year). You can see a trailer for the movie below.

The Soul of the World: Interview with Roger Scruton

This is part of the DR Book Collection.

I’ve been a fan of British philosopher Roger Scruton ever since I stumbled on his BBC documentary Why Beauty Matters as an undergrad. I picked up his Oxford-published Beauty at the UNT library soon after. I found Scruton’s ideas to be powerful and provocative, even if the writing was at times difficult or lackluster compared to the concepts.

I felt the same way about his Princeton-published The Soul of the World. As The Wall Street Journal summarizes,

Viewed through the lens of scientific reductionism, all existence is fundamentally the bouncing around of various material particles, some arranged in the form of gene-perpetuating machines we call humans. Mr. Scruton almost agrees—we are, in fact, gene-perpetuating machines, and the finer, higher aspects of human existence emerge from, and rest upon, biological machinery. As he points out, though, it’s a long jump from this acknowledgment to the assertion that “this is all there is.” The jump, according to Mr. Scruton, lands us in “a completely different world, and one in which we humans are not truly at home.” A truly human outlook involves the intuition of intangible realities that find no place in even our most sensitive systems of biology, chemistry or physics.

For Scruton, this reductionism “overlook[s] the aspect of our mental states that is most important to us, and through which we understand and act upon each other’s motives, namely, their intentionality or ‘aboutness’” (pg. 4). The theism presented by Scruton is a kind of general classical theism, one that could be embraced by someone outside a specific religious tradition. His last few chapters discussing art, music, and the faces of others are especially enlightening. Just as no one would reduce the Mona Lisa to mere pixels on a canvas or the music of Bach to random sounds, to reduce humans and nature to their mechanical functions is to misunderstand reality and suppress our own daily experiences of the world. Thought-provoking stuff.

You can hear a brief snippet of an interview with Roger Scruton below.

“The Eucatastrophe of Man’s History”


In his famous essay “On Fairy Stories,” The Lord of the Rings author and Oxford professor J.R.R. Tolkien coined the term “eucatastrophe.” The word was meant to portray the opposite of tragedy and embody the “Consolation of the Happy Ending”:

The consolation of fairy-stories, the joy of the happy ending: or more correctly of the good catastrophe, the sudden joyous “turn” (for there is no true end to any fairy-tale): this joy, which is one of the things which fairy-stories can produce supremely well, is not essentially “escapist,” nor “fugitive.” In its fairy-tale—or otherworld—setting, it is a sudden and miraculous grace: never to be counted on to recur. It does not deny the existence of dyscatastrophe, of sorrow and failure: the possibility of these is necessary to the joy of deliverance; it denies (in the face of much evidence, if you will) universal final defeat and in so far is evangelium, giving a fleeting glimpse of Joy, Joy beyond the walls of the world, poignant as grief. It is the mark of a good fairy-story, of the higher or more complete kind, that however wild its events, however fantastic or terrible the adventures, it can give to child or man that hears it, when the “turn” comes, a catch of the breath, a beat and lifting of the heart, near to (or indeed accompanied by) tears, as keen as that given by any form of literary art, and having a peculiar quality (pgs. 22-23).

Being a devout Catholic and key figure in C.S. Lewis’ conversion to Christianity, Tolkien concluded his essay by writing,

I would venture to say that approaching the Christian Story from this direction, it has long been my feeling (a joyous feeling) that God redeemed the corrupt making-creatures, men, in a way fitting to this aspect, as to others, of their strange nature. The Gospels contain a fairy-story, or a story of a larger kind which embraces all the essence of fairy-stories. They contain many marvels—peculiarly artistic, beautiful, and moving: “mythical” in their perfect, self-contained significance; and among the marvels is the greatest and most complete conceivable eucatastrophe. But this story has entered History and the primary world; the desire and aspiration of sub-creation has been raised to the fulfillment of Creation. The Birth of Christ is the eucatastrophe of Man’s history. The Resurrection is the eucatastrophe of the story of the Incarnation. This story begins and ends in joy. It has preeminently the “inner consistency of reality.” There is no tale ever told that men would rather find was true, and none which so many sceptical men have accepted as true on its own merits. For the Art of it has the supremely convincing tone of Primary Art, that is, of Creation. To reject it leads either to sadness or to wrath…But this story is supreme; and it is true. Art has been verified. God is the Lord, of angels, and of men—and of elves. Legend and History have met and fused (pgs. 23-24).

Merry Christmas everyone.


Climate Change and Economic Growth

Philosopher Joseph Heath has an enlightening working paper on the economics and ethics of climate change. Heath is emphatic that his goal is

not to make a case for the importance of economic growth, but merely to expose an inconsistency in the views held by many environmental ethicists. Part of my reason for doing so is to narrow the gap somewhat, between the discussion about climate change that occurs in philosophical circles and the one that is occurring in policy circles, about the appropriate public response to the crisis. One of the major differences is that the policy debate is conducted under the assumption of ongoing economic growth, as well as an appreciation of the importance of growth for raising living standards in underdeveloped countries. The philosophical discussion, on the other hand, is dominated by the view that ongoing economic growth is either impossible or undesirable, leading to widespread acceptance of the steady-state view. This view is, however, a complete non-starter as far as the policy debate is concerned, because it is too easily satisfied. As a result, its widespread acceptance among philosophers (and environmentalists) has led to their large-scale self-marginalization (pg. 31).

Drawing on the work of economists Nicholas Stern and William Nordhaus, Heath proceeds to point out how misleading language often distorts and exaggerates the negative impact of climate change:

Stern adopts a similar mode of expression when he suggests that “in the baseline-climate scenario with all three categories of economic impact, the mean cost to India and South-East Asia is around 6% of regional GDP by 2100, compared to a global average of 2.6%.” The casual reader could be forgiven for thinking that the reference, when he speaks of “loss in GDP per capita,” is to present GDP. What he is talking about, however, is actually the loss of a certain percentage of expected future GDP. In some cases, he states this more clearly: “The cost of climate change in India and South East Asia could be as high as 9- 13% loss in GDP by 2100 compared with what could have been achieved in a world without climate change.” The last clause is of course crucial – under this scenario, GDP will not be 9-13% lower than it is right now, but rather lower than it might have been, in 2100, had there not been any climate change…In other words, what Stern is saying is that climate change stands poised to depress the rate of growth. This type of ambiguity has unfortunately become common in the literature. An important recent paper in Nature by Marshall Burke, Solomon M. Hsiang and Edward Miguel, estimating the anticipated costs of climate change, presents its conclusions in the same misleading way. The abstract of the paper states that “unmitigated climate change is expected to reshape the global economy by reducing average global incomes by roughly 23% by 2100.” The paper itself, however, states the finding in a slightly different way: “climate change reduces projected global output by 23% in 2100, relative to a world without climate change.” Again, that last qualifying clause is crucial, yet it was the unqualified version of the claim found in the abstract that made its way into the headlines, when the study was published (pgs. 15-16).

Heath acknowledges that

these potential losses are enormous, and they call for a strong policy response in the present. At the same time, what these economists are describing is not a “broken world,” in which “each generation is worse off than the last.” On the contrary, they are describing a world in which the average person is vastly better off than the average person is now – just not as well off as he or she might have been, had we been less profligate in our greenhouse gas emissions. It is important, in this context to recall that annual rate of real per capita GDP growth in India, at the time of writing is 6.3%, and so what Stern is describing is, at worst, the loss of approximately two years worth of growth. At the present rate of growth, living standards of the average person in India are doubling every 12 years. There are fluctuations from year to year, but the mean expectation of several studies, calculated by William Nordhaus, suggests that the GDP of India will be about 40 times larger in 2100 than it was in the year 2000 (which implies an average real growth rate of 3.8%). The 9-13% loss, due to climate change, is calculated against the 40-times-larger 2100 GDP, not the present one (pg. 16-17).
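
Heath’s arithmetic is easy to check. Below is a minimal back-of-the-envelope sketch in Python (not from Heath’s paper; it simply plugs in the 6.3% growth rate, the 40x projection, and the 9-13% loss figures quoted above) showing why a loss measured against the projected 2100 baseline is very different from a loss measured against present GDP:

```python
import math

# Figures quoted above (Heath, citing Stern and Nordhaus); this is just a rough check.
growth_rate = 0.063        # India's real per capita GDP growth at the time of writing
projected_multiple = 40    # projected ratio of India's 2100 GDP to its 2000 GDP

# Doubling time at 6.3% growth: ln(2) / ln(1 + r) -- roughly the "every 12 years" figure
doubling_time = math.log(2) / math.log(1 + growth_rate)
print(f"Doubling time at 6.3% growth: {doubling_time:.1f} years")

# Average annual growth rate implied by a 40x increase over the century 2000-2100
implied_rate = projected_multiple ** (1 / 100) - 1
print(f"Implied average growth for 40x over a century: {implied_rate:.1%}")

# The 9-13% climate loss is taken against that 40x baseline, not against present GDP
for loss in (0.09, 0.13):
    remaining = projected_multiple * (1 - loss)
    print(f"A {loss:.0%} loss still leaves 2100 GDP at about {remaining:.1f}x the 2000 level")
```

Run as written, it reproduces the rough numbers in the quote: a doubling time of about 11-12 years, an implied average growth rate of roughly 3.8%, and a 2100 GDP that is still about 35-36 times the 2000 level even after the projected climate loss.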

The full paper has more details and additional arguments. But this is the kind of serious cost/benefit analysis we need to be having about climate change.

Revitalizing Liberalism

“[M]aking the case for liberalism is a Sisyphean task,” writes Will Wilkinson at the Niskanen Center. “If the old truths are not updated for each new age, they will slip from our grasp and lose our allegiance. The terms in which those truths have been couched will become hollow, potted mottoes, will fail to galvanize, inspire, and move us. The old truths will remain truths, but they’ll be dismissed and neglected as mere dogma, noise. And the liberal, open society will again face a crisis of faith…Intellectual and moral infrastructure depreciates. There are ongoing costs of maintenance. You can’t successfully defend liberalism once and for all, just like you can’t install a sewage system and flush happily ever after without giving it another thought.” The fading of liberal ideas is captured in the recent work of political scientist Yascha Mounk, as reported last week in The New York Times.

“If we fail to constantly refurbish the case for and commitment to liberalism,” Wilkinson continues,

reinforcing it against the specific damage of the age, our institutions will drift toward generalized opportunistic corruption and declining popular legitimacy. Our culture will drift toward defensive avidity and mutual distrust. Our politics will drift toward primal zero-sum tribal conflict. All of which creates a fat political opening for would-be despots and ends-justifies-the-means zealots. Fascism and communism arose in liberal Europe because the liberal elites got complacent, nobody fixed the pipes, and then awful people with awful ideas rose to power promising to fix damage—and then caused massively more damage.

He concludes,

The fact that liberalism has become rote is central to our problem. Academic left-liberalism is doggedly utopian—and stale. Democratic Party liberalism is incoherent—and stale. Orthodox libertarianism is dogmatically blinkered—and stale. The “classical liberalism” of conservative-libertarian fusionism is phony—and stale. Each of our legacy liberalisms is, in its own way, corrupt. It’s all part of our pitted, pocked, cracked and creaking liberal cultural infrastructure. It doesn’t help to replace rotten wood with rotten wood, rusty pipe with rusty pipe…An effective defense of the open society must begin with an empirically-minded account of its complex inner workings and its surpassing value. Liberal political order is humanity’s greatest achievement. That may sound like hype, but it’s the cold, hard truth. The liberal state, and the global traffic of goods, people, and ideas that it has enabled has led to the greatest era of peace in history, to new horizons of practical knowledge, health, wealth, longevity, and equality, and massive decline in desperate poverty and needless suffering. It’s clearer than ever that the multicultural, liberal-democratic, capitalist welfare state is far-and-away the best humanity has ever done. But people don’t know this! We are dangerously oblivious to the nature of our freedom and good fortune, and seem poised to snatch defeat from the jaws of victory.

I can only hope that some of my posts here at Difficult Run have contributed to this project of making liberalism great again.

Two Stupids Don’t Make a Smart

Wikipedia: "Symphony of the Stones carved by the Goght River at Garni Gorge in Armenia is an example of an emergent natural structure." Released by WOWARMENIA for Wikimedia under Creative Commons Attribution-Share Alike license
Wikipedia: “Symphony of the Stones carved by the Goght River at Garni Gorge in Armenia is an example of an emergent natural structure.”
Released by WOWARMENIA for Wikimedia under Creative Commons Attribution-Share Alike license

I didn’t get a chance to make that pithy observation in a Facebook exchange this morning because my interlocutor gave me the boot. That’s OK, I may have been blocked from somebody’s Facebook feed for thinking bad thoughts, but I can’t get blocked from my own blog! You can’t stop this signal, baby.

So, just as two wrongs don’t make a right, let’s use this Columbus Day to talk about two stupids that don’t make a smart.

Bad Idea 1: The Noble Savage

There’s a school of thought which holds, essentially, that everything was fine and dandy in the Americas until the Europeans came along and ruined it. The idea, seen in Disney and plenty of other places, is that “native” peoples lived in harmony with the Earth, appreciating the fragile balance of their precious ecosystems and proactively maintaining it. This idea is bunk. The reality is that in almost all cases the only limit on the extent to which any culture restricts its exploitation of natural resources is technological. Specifically, humanity has an unambiguous track record of killing everything edible in sight as they spread across the globe, leading to widespread extinctions from Australia to the Americas and upending entire ecosystems. If our ancient ancestors didn’t wipe a species out, the reason was either that it didn’t taste good or that they couldn’t. As Yuval Noah Harari put it in Sapiens:

Don’t believe the tree-huggers who claim that our ancestors lived in harmony with nature. Long before the Industrial Revolution, homo sapiens held the record among all organisms for driving the most plant and animal species to their extinctions. We have the dubious distinction of being the deadliest species in the annals of biology.

Harari specifically describes how the first humans to discover Australia not only wiped out species after species, but–in so doing–converted the entire continent into (pretty much) the desert it is today:

The settlers of Australia–or, more accurately, its conquerors–didn’t just adapt, they transformed the Australian ecosystem beyond recognition. The first human footprint on a sandy Australian beach was immediately washed away by the waves, yet, when the invaders advanced inland, they left behind a different footprint. One that would never be expunged.

Matt Ridley, in The Origins of Virtue, lists some of the animals that no longer exist thanks to hungry humans:

Soon after the arrival of the first people in Australia, possibly 60,000 years ago, a whole guild of large beasts vanished — marsupial rhinos, giant diprotodons, tree fellers, marsupial lions, five kinds of giant wombat, seven kinds of short-faced kangaroos, eight kinds of giant kangaroo, a two-hundred-kilogram flightless bird. Even the kangaroo species that survived shrank dramatically in size, a classic evolutionary response to heavy predation.

And that pattern was repeated again and again. Harari again:

Mass extinctions akin to the archetypal Australian decimation occurred again and again in the ensuing millennia whenever people settled another part of the outer world.

Have you ever wondered why the Americas don’t have the biodiversity of large animals that Africa does? We’ve got some deer and bison, but nothing like the hippos, giraffes, elephants, and other African megafauna. Why not? Because the first humans to get here killed and ate them all, that’s why not. There’s even a name for what happened: the Pleistocene overkill. Back to Ridley:

Coincident with the first certain arrival of people in North America, 11,500 years ago, 73% of the large mammal genera quickly died out… By 8,000 years ago, 80% of the large mammal genera in South America were also extinct — giant sloths, giant armadillos, giant guanacos, giant capybaras, anteaters the size of horses.

In Madagascar, he notes that “at least 17 species of lemurs (all the diurnal ones larger than 10 kg in weight, one as big as a gorilla), and the remarkable elephant birds — the biggest of which weighed 1,000 pounds — were dead within a few centuries of the island’s first colonization by people in about 500 A.D.” In New Zealand, “the first Maoris sat down and ate their way through all 12 species of the giant moa birds… Half of all New Zealand’s indigenous land birds are extinct.” The same thing happened in Hawaii, where at least half of the 100 unique Hawaiian birds were extinct shortly after humans arrived. “In all, as the Polynesians colonized the Pacific, they extinguished 20% of all the bird species on earth.”

Ridley’s myth-busting doesn’t end there. He cites four different studies of Amazon Indians “that have directly tested their conservation ethic.” The results? “All four rejected the hypothesis [that the tribes had a conservation ethic].” Moving up to North America, he writes that “There is no evidence that the ‘thank-you-dead-animal’ ritual was a part of Indian folklore before the 20th century,” and cites Nicanor Gonsalez, “At no time have indigenous groups included the concepts of conservation and ecology in their traditional vocabulary.”

This might all sound a little bit harsh, but it’s important to be realistic. Why? Because these myths–no matter how good the intentions behind them–are corrosive. The idea of the Noble Savage is intrinsically patronizing. It says that “primitive” or “native” cultures are valuable to the extent that they are also virtuous. That’s not how human rights should work. We are valuable–all of us–intrinsically. Not “contingent on passing some test of ecological virtue” (as Ridley puts it).

Let me take a very brief tangent. Ridley’s argument here (as it relates to conservation) is exactly parallel to John McWhorter’s linguistic arguments and Steven Pinker’s psychological arguments. In The Language Hoax, John McWhorter takes down the Sapir-Whorf Hypothesis, which is the trendy linguistic theory that what you think is determined by the language you think it in. Just like the Noble Savage, this idea was originally invented by Westerners on behalf of, well, everybody else. The idea is that “primitive” people were more in contact with the timeless mysteries of the cosmos because (for example) they spoke in a language that didn’t use tense. Not only did this turn out to be factually incorrect (they just marked tense differently, or implied it in other cases, as many European languages also do), but it’s an intrinsically bad idea. McWhorter:

In the quest to dissuade the public from cultural myopia, this kind of thinking has veered into exotification. The starting point is, without a doubt, I respect that you are not like me. However, in a socio-cultural context in which that respect is processed as intellectually and morally enlightened, inevitably, to harbor that respect comes to be associated with what it is to do right and to be right as a person. An ideological mission creep thus sets in. Respect will magnify into something more active and passionate. The new watchcry becomes, “I like that you are not like me,” or alternately, “What I like about you is that you are not like me.” That watchcry signifies, “What’s good about you is that you are not like me.” Note however, the object of that encomium, has little reason to feel genuinely praised. His being not like a Westerner is neither what he feels as his personhood or self-worth, nor what we should consider it to be, either explicitly or implicitly.

The cute stories about the languages primitive peoples speak, and the unique and special ways of seeing the world those languages supposedly enable, end up being nothing but a particularly subtle form of cultural imperialism: our values are being used to determine the value of their culture. All we did was change up the values by which we pass judgement on others. Thus: “our characterization of indigenous people in this fashion is more for our own benefit than theirs.”

The underlying premise of Harari, Ridley, and McWhorter is what Steven Pinker’s The Blank Slate tackles directly: the universality of human nature. We can best avoid the bigotry and discrimination that has marred our history not by a counter-bigotry that holds up other cultures as special or superior (either because they’re in magical harmony with nature or possess unique linguistic insights) but by reaffirming the fact that there is such a thing as a universal, underlying human nature that unites all cultures.

Universal human nature is not a byproduct of political wishful thinking, by the way. Steven Pinker includes as an appendix to The Blank Slate a List of Human Universals compiled by Donald E. Brown in 1989. It is a long list, organized alphabetically. To give a glimpse of the sorts of behaviors and attributes common to all human cultures, here are the first few and last few items from the list:

  • abstraction in speech and thought
  • actions under self-control distinguished from those not under control
  • aesthetics
  • affection expressed and felt
  • age grades
  • age statuses
  • age terms
  • vowel contrasts
  • weaning
  • weapons
  • weather control (attempts to)
  • white (color term)
  • world view

The list also includes lots of stuff about binary gender, which is exactly why you haven’t heard of the list and why Steven Pinker is considered a rogue iconoclast. These days, one does not simply claim that gender is binary.


I’ve spent a lot of time on the idea of the Noble Savage as it relates to ecology, but of course it’s a broader concept than that. I was once yelled at quite forcefully by a presenter trying to teach us kids that warfare did not exist among pre-Columbian Native Americans. I was only 11 or 12 at the time, but I knew that was bs and said so.

The point is that the whole notion of a mosaic of Native Americans living in peace and prosperity until the evil Christopher Columbus showed up and ruined everything is a bad idea. It’s stupid number 1.

Bad Idea 2: Christopher Columbus is Just Misunderstood

So, this is the claim that started the discussion that got me blocked by somebody on Facebook today. The argument, such as it was, goes something like this: Columbus looks very bad from our 21st century viewpoint, but that’s an unfair, anachronistic standard. By the standards of his day, he was just fine, and those are the standards by which he should be measured.

The problem with this idea is that, like the first, it’s simply not true. One of the best popular accounts of why comes from The Oatmeal. In this comic, Matthew Inman contrasts Columbus with a contemporary: Bartolomé de las Casas. While Columbus and his ilk were off getting up to various hijinks, including (but not limited to) child sex slavery and using dismemberment to motivate slaves to gather more gold, de las Casas was busy arguing that indigenous people deserved rights and that slavery should be abolished. Yes, at the time of Columbus.

The argument that if we judge Columbus by the standards of his day he comes out OK does not hold up. We can find plenty of people at that time–not just de las Casas–who were abolitionists or (if they didn’t go that far) were critical of the excessive cruelty of Columbus and many like him. Keep in mind that slavery had been a thing in Europe for thousands of years until the Catholics finally stamped it out around the 10th century. So it’s not like opposition to slavery is a modern invention. When slavery was restarted in Africa and then the Americas, many in the Catholic clergy opposed it once again, but were unable to stop it. So the idea that–by the standards of his day–Columbus was just fine and dandy doesn’t work. He’s a pretty bad guy in any century.

Two Stupids Don’t Make a Smart

I understand the temptation to respond to Noble Savage-type denunciations of Christopher Columbus by trying to defend the guy. You see somebody making a bad argument, and you want to argue that they’re wrong.

But that isn’t how logic actually works. A broken clock really is right twice a day, and a bad argument can still have a true conclusion. If I tell you that 2+2 = 4 because Mars is in the House of the Platypus, my argument is totally wrong, but my conclusion is still true.

The Noble Savage is a bad bit of cultural luggage we really should jettison, but Columbus is still a bad guy no matter how you slice it. Using one stupid idea to combat another stupid idea doesn’t actually enlighten anyone.

The Bell Curve of Extremism

There are basically two kinds of moderates / independents: the ignorant and the wise. It really is a sad twist of fate to stick the two together, but nobody honest ever said life was fair.

To illustrate, let me introduce you to a concept I’ll call the Bell Curve of Extremism:


To flesh this out, I’ll use some examples from voting.

A person on the left side of the curve doesn’t know who they’re going to vote for because they don’t know much of anything at all. They may not even know who’s running or who’s already in office. This doesn’t mean they’re stupid, necessarily. They could be brilliant, but just pay no attention to politics.

A person in the middle of the curve knows exactly who they’re voting for, and it’s never really been in question. What’s more, they can give you a very long list of the reasons they are voting for that person, along with all the terrible, horrible things about the leading contender that make him or her totally unfit for office and a threat to truth, justice, and the American Way. This is the kind of person who consumes a lot of news, but probably from a narrow range of sources, like DailyKos or RedState. They’re not bad people, but their high motivation tends to lead to an awful lot of research that is heavily skewed by confirmation bias.

A person on the right side of the curve may also be unsure of how they’re going to cast their vote, but it’s not because they don’t know what’s going on. The problem is they do, and this knowledge has led them (as often as not) to fall right off the traditional left/right axis. I called myself a radical moderate when I was in high school. At the time, it was mostly because I was on the far left of this curve but I wanted to sound cool. Later on in life I found myself near the peak of the bell curve, a die-hard conservative with all the answers who was half-convinced that liberals were undermining the country. But then I went to graduate school to study economics (one of the areas where I was staunchly conservative) and lo and behold: things got complicated. I fell off the peak and I’ve been sliding down the slope ever since. And what do you know, but I found out recently that radical moderates are actually a thing. They even include some of my very favorite thinkers, like John McWhorter (cited above) and Jonathan Haidt (cited in a lot of my posts). I’ve come full circle, from know-nothing moderate to know-that-I-know-nothing radical moderate.

It’s kind of lonely and depressing over here, to be honest, and we don’t often find an awful lot to shout about. Which is why the conversation tends to be dominated by peak-extremists who know just enough to be dangerous. About the only banner you’ll see us waving is the banner of epistemic humility. And really, how big of a parade can you expect to line up behind “People probably don’t know as much as they think they do (including us!)”?

But one thing that I can share with some conviction is this post, and the idea that–when it comes to ideas–fighting fire with fire just burns the whole house down. There is validity to the idea that things were better before Christopher Columbus showed up. There was a helpful lack of measles and smallpox, for example. But blaming the transmission of those diseases (except in the rare cases when it was intentional) and the resulting humanitarian catastrophe on Columbus doesn’t make any sense. He did a lot of really evil things, but intentional germ warfare was not among them. Pinning it on him because the numbers are so big is lazy. There is also validity to the idea that Columbus lived in a different time. Many of the most compassionate Westerners were motivated not by a modern sense of equal rights but by a more feudal-tinged idea of noblesse oblige. De las Casas himself, for example, first suggested making things easier on Caribbean slaves by importing more African slaves before later deciding that all slavery was a bad idea. And if you fast-forward to the 19th-century abolitionist movements, you’ll find plenty of what counts as racism in the 21st century among the abolitionists who were motivated (in some cases) by ideas of “civilizing the savages.” Racial politics are complicated enough in the 21st century alone; of course we can’t bring in perspectives from six centuries ago and expect all the good guys to neatly align on bullet points of focus-group-vetted talking points!

So yes: I see validity to both sides of the fight. If your goal is to win in the short term, then the most useful thing to do is double-down on your strongest arguments and cherry-pick the other side’s weakest points. This is the strategy of two stupids making a smart, and it doesn’t work.

If your goal is to win in the long term, then you have to undergo a fundamental transformation of perspective. The short-term model isn’t just short-term. It’s ego-centric. The fundamental conceit of the idea of winning is the idea of being right, as an individual. Your view is the correct one, and the idea is to have your idea colonize other people’s brains. It is unavoidably an ego-trip.

The long-term model isn’t just about the long-term. It’s also about seeing the whole that is more than the sum of the parts. In this view, the likeliest scenario is that nobody is right because, on any particular suitably complex question, we are like the world before Newton and the world before Einstein: waiting for a new solution no one has thought of. And, even if somebody does have the right solution to the problem we face now, that will almost certainly not be the right answer to the problem we will face tomorrow. In that case, it’s not about having the right ideas in the heads of the right people; it’s about having a culture and a society capable of supporting a robust ecosystem of idea-creation. The focus begins to shift away from the “I” and towards the “we.”

In this model, your job is not to be the one, singular, heroic Messiah who tells everyone the answer to their problem. Your job is to play your part in a larger collective. Maybe that means you should be the lone voice calling from the wilderness, the revolutionary prophet like Newton or Einstein. But more probably it means your job is to simply be one more ant carrying one more grain of sand to build the collective pile of human knowledge and maybe–through conversations with friends and family–shift the center of gravity infinitesimally in a better direction.

I’m not a relativist. I’m a staunch realist in the sense that I believe in an objective, underlying reality that is not dependent on social construction or individual interpretation. But I’m also a realist in the sense of acknowledging that the last living human being to have ever understood the entire domain of mathematics was Carl Friedrich Gauss, and he died in 1855. No living person today understands all mathematical theory. And that’s just math. What about physics and history and chemistry and psychology? And that’s just human knowledge. What about the things nobody knows or has thought of yet? An individual is tiny, and so is their sphere of knowledge. The idea that the answers to really big questions fall within that itty-bitty radius seems correspondingly remote. In short: the truth is out there, but you probably don’t have it and you probably can’t find it. It may very well be, keeping this metaphor going, that the answers to some of our questions are too complex for any one person to hold in their brain, even if someone could discover them.

I’m not giving up on truth. I am giving up on atomic individualism, on the idea that the end of our consideration with regard to truth is the question of how much of it we can fit into our individual skulls. That seems very small-minded, if you’ll pardon the pun. Instead, I’m much more interested in ways in which individuals can do their part to contribute to building a society that may understand more than its constituent individuals do, or (since that seems a bit speculative, even to me) that at a minimum provides ample opportunity for individuals to create, express, and compare ideas in the hope of discovering something new.

Two stupids can’t make a smart. The oversimplification and prejudice necessary to play that strategy is not worth the cost. Winning debates is not the ultimate goal. We can aim for something higher, but we have to be willing to lay down our own egos in the process and contribute to something bigger.

Stop Engaging “The Culture”


So says a thought-provoking article in Christianity Today. According to the author, engaging “the culture” simply “causes us to stab blindly in the dark” and “miss our actual cultural responsibility and opportunity”:

A nation of 300 million people, especially one as gloriously diverse as the United States, does not have one monolithic “culture.” It has neighborhoods and cities, ethnic groups and affinity groups, political parties and religious denominations. There is a shared national ethos, to be sure. But that ethos is constantly being contested, challenged, and reimagined by different groups within the nation, and ignored or actively resisted by others.

Even the idea of “the culture,” in the way we now use the phrase, is fairly new. The New Testament, especially the Gospel of John, prefers the term “the world” (cosmos in Greek) for what we might call “the culture,” especially systems of ideology and influence that operate independent of God. But it also speaks of “nations” or “peoples” (ethne in Greek—today we might call them “ethnolinguistic groups”). We are called to resist being “conformed to this world” (Rom. 12:2, ESV), and to make disciples of all ethne, in the hope that they all will join in the multinational, multilingual, multicultural chorus around the throne of the Lamb (Rev. 7:9).

In short,

Instead of preoccupying ourselves with the cosmos, we are called to the ethne. Rather than engaging in largely imaginary relationships with the world system…we are called to real people in a real place. With those real people, we reflect on the concrete possibilities and limitations of the time and place we share (including, to be sure, the ways the world system presses in on us). We learn to care for what is lasting and valuable in our particular time and place, and begin to create alternatives to things that are inadequate and broken. 

The more we do this—the more fully human we become, entwined in relationships of empowering mutual dependence—the less bound and tempted we will be by “the culture.” And the less bound we are by “the culture,” the more we are able to actually influence culture around us, even sometimes up to very large scales—because we are creating and sustaining real alternatives to it.

We are to be like Paul, who didn’t seek to “engage ‘Rome,’” but instead “wrote a letter to actual Romans.” Similarly, “our mission is not primarily to ‘engage the culture’ but to ‘love our neighbor.’ Our neighbor is not an abstract collective noun, but a real person in a real place.”

Something to remember.

So Proud of My Humility

A recent Aeon article highlights the importance of what Nathaniel has called “epistemic humility” and the dangers of overconfidence:

The internet and digital media have created the impression of limitless knowledge at our fingertips. But, by making us lazy, they have opened up a space that ignorance can fill. On the Edge website, the psychologist Tania Lombrozo of the University of California explained how technology enhances our illusions of wisdom. She argues that the way we access information about an issue is critical to our understanding – and the more easily we can recall an image, word or statement, the more likely we’ll think we’ve successfully learned it, and so refrain from effortful cognitive processing.

This lack of intellectual humility often leads to the trolling we see in online discussions:

Intellectually humble people don’t repress, hide or ignore their vulnerabilities, like so many trolls…People who are humble by nature tend to be more open-minded and quicker to resolve disputes, since they recognise that their own opinions might not be valid. The psychologist Carol Dweck at Stanford University in California has shown that if you believe intelligence can be developed through experience and hard work, you’re likely to make more of an effort to solve difficult problems, compared with those who think intelligence is hereditary and unchangeable.

One of the more exciting portions of the article is its discussion of the

Thrive Center for Human Development in California, which seeks to help young people turn into successful adults, is funding a series of major studies about intellectual humility. Their hypothesis is that humility, curiosity and openness are key to a fulfilling life. One of their papers proposes a scale for measuring humility by examining questions such as whether people are consistently humble or whether it depends on circumstances. Acknowledging that our opinions (and those of others) vary by circumstance is, in itself, a significant step towards reducing our exaggerated confidence that we are right.

The key takeaway, however, is the following:

Intellectual humility relies on the ability to prefer truth over social status. It is marked primarily by a commitment to seeking answers, and a willingness to accept new ideas – even if they contradict our views. In listening to others, we run the risk of discovering that they know more than we do. But humble people see personal growth as a goal in itself, rather than as a means of moving up the social ladder. We miss out on a lot of available information if we focus only on ourselves and on our place in the world (bold mine).

Most of my anxieties when it comes to research, writing, and learning are ultimately anxieties over my place in the social pecking order. I have a long way to go when it comes to learning intellectual humility.

How (Not) To Be Secular: A Lecture by James K.A. Smith

Years back, Catholic philosopher Charles Taylor published his 800+-page tome A Secular Age. I actually checked it out from the library once, got about 15 pages into it, and didn’t pick it up again until I had to return it. I realized that it was something I’d have to spend a lot of time not only reading, but chewing on. Given that I was still a newly-married undergrad, I decided to revisit it at another time.

I still haven’t tackled Taylor’s book, but I did recently complete James K.A. Smith’s How (Not) To Be Secular: Reading Charles Taylor. Smith’s book acts as a summarized walkthrough of Taylor’s, illuminating and at times taking issue with some of the ideas presented. By reading Smith’s book first, I feel prepared to take on the entirety of Taylor. In short, Smith and Taylor argue that the Western world has become a disenchanted one in which belief in God is just one option among many:

A society is secular insofar as religious belief or belief in God is understood to be one option among others, and thus contestable (and contested). At issue here is a shift in “the conditions of belief.” As Taylor notes, the shift to secularity “in this sense” indicates “a move from a society where belief in God is unchallenged and indeed, unproblematic, to one in which it is understood to be one option among others, and frequently not the easiest to embrace”…It is in this sense that we live in a “secular age” even if religious participation might be visible and fervent.

Shifts towards secularization led us to see ourselves as free agents closed off to external meaning, influences, and forces. Social ties and hierarchies were no longer seen as being grounded in higher, sacred orders. Reality was no longer a cosmos full of meaning and purpose, but merely a universe full of chaos and chance. The secularity of the modern age is inescapable even for the most ardent believer. But this isn’t a subtraction story (i.e., the loss of superstition) as much as it is a change in sensibilities; a change in the water we swim in, so to speak.

I’m certainly not doing the book justice in my brief summary, so I’ll just say this: anyone interested in making sense of our secular age, but hesitant to read 800 pages on the subject, should check out Smith’s book. You can see him lecturing at BYU’s Wheatley Institution below.

Soul on Fire: Appreciating Elie Wiesel

Elie Wiesel departed this world at the age of eighty-seven. He has had a tremendous influence on my life, though I never met or corresponded with him. His books were always in the house when I was growing up, and I remember my mother retelling for me the plot of Dawn, but I cannot remember how old I was. Perhaps I was nine. The details have faded but the memory remains and comes to mind quite often. Honestly, it was sobering and a bit frightening to realize that no one is exempt from life’s horrors, that even I might be forced at some point to choose between two ugly outcomes. I still hope that I never will, but it was Elie Wiesel who forced me to acknowledge the possibility that it could happen.

I have not rushed to post on Wiesel’s death because I have been picking up his works again and pondering his life. He deserves as much. I confess that it has been at least a year since I last read something of his. Wiesel himself resisted tidy conclusions. Still, something that I have noticed while following  media coverage is just how much is misunderstood about Wiesel. He had his flaws and failings, of course, and valid criticisms can (or is it should) be leveled at him. There were even survivors with more compelling views on the universality of the Holocaust than his own, and Wiesel sometimes clashed with them, but he was a powerful voice for good nonetheless. Then there was the disgraceful spectacle of people like Max Blumenthal, who possess the moral stature of a Chihuahua, publishing tweet after tweet vilifying Wiesel not long after his death was announced.

Something that I can see even in many valid criticisms is that Wiesel is being judged by our own image of what a Holocaust survivor and champion of human rights should be rather than by what Wiesel actually was. To understand Wiesel we must set aside such grand images as citizen of the world and its conscience, and start with the Elie Wiesel who was deported to the kingdom of night, as he would put it: a shy but ardent Hasidic youth who viewed everything through a spiritual lens. His parents had to force him to set some time aside for secular studies, such was his religious fervor. Then came the Holocaust, an outburst of the forces of evil so intense that it destroyed his ability to believe as he once did. Wiesel always wanted to recover that simple faith, but could not. This is the thread running throughout his works, the source of the enigmatic laughter and silences that fill his stories.

Night is a powerful novel; it really needs no introduction. Dawn is not as well known, but, as noted above, it is perhaps more compelling and troubling because it deals with the internal struggle of a Holocaust survivor faced with making an awful choice. If Night is about surviving in a kingdom where God does not act, Dawn looks at the choices one must make when man, rather than God, acts in history. Night will make you weep, and Dawn will chill you, but if you want to get at the man behind Wiesel’s public persona, then read Souls on Fire. Souls is a collection of sayings, stories, and character sketches of several 18th-19th century Hasidic masters, leaders of a Jewish revivalist movement in Eastern Europe. Wiesel has written quite a bit on this or that Hasidic master. It is a prominent topic in his writings; though I have never bothered to quantify this, I would not be surprised if he has written more frequently and directly about Hasidism than about the Holocaust. Yet everything that he wrote eventually touched upon his experience in the camps.

The Hasidic master, or Rebbe, acts as a bridge between his followers, the Hasidim, and God. The Rebbe was central to how they approached the world, so telling stories about these masters practically became a sacred duty. These were stories about hidden saints and holy beggars, miracles and prophecies, uplifting the poor and downtrodden, intense longing for the Messiah as if he were due any minute, putting God on trial for neglecting his children, and a host of other colorful episodes, but most of all about the soul and how to mend it. Sometimes cryptic and paradoxical, they all share a love of truth. These stories were used to draw man closer to God rather than simply to entertain. Wiesel did that, too. “I don’t believe the aim of literature is to entertain, to distract, to amuse.”

Hasidism, then, was the world of Wiesel’s innocence, where God was close, always ready to intervene on behalf of those who loved him, a world filled with warm memories of conversations with his grandfather, a devout Hasid. It was he who taught Wiesel his first stories and embodied their virtues. Hasidism was about faith: not mental affirmation, but an attitude of trust and devotion.

In the chapter entitled Disciples IV, Elie Wiesel relates a Hasidic legend of how Satan protested the birth of a particular Rebbe so holy that he would draw enough followers closer to God so as to destroy Satan’s kingdom effortlessly. The heavenly court recognized the unfairness of that scenario, and decided to send a rival – a counterfeit Rebbe – whom no one would suspect of serving God’s rival.

How is one to know? How does one recognize purity in a man? And how can one be sure? I remember putting this question to my grandfather. He chuckled and his eyes twinkled when he answered: “But one is never sure; nor should one be. Actually, it all depends on the Hasid; it is he who, in the final analysis, must justify the Rebbe.”

It is hard not to see this as really being about God, about Wiesel’s relationship with him. This answer to a childhood question, I think, lies behind the anger in Night, and behind the moral calculus in Dawn. Like the old chestnut, show me your friends and I will show you who you are. With Wiesel, though, there is never a simple affirmation of man’s moral superiority to God. That is a subtle nuance which even as fine a film as God on Trial (inspired by a Wiesel experience and story) misses. Man is responsible for affirming his devotion to truth through his actions and choices, perhaps even to transform his master through them. His failure to do so can have acute repercussions because God and man are inseparably linked.

“You’ll grow up, you’ll see,” my grandfather had said. “You’ll see that is more difficult, more rare to find a Hasid than a Rebbe. To induce others to believe is easier than to believe…”

Another story is shared of a Rebbe scolding God for keeping an old man like him waiting all his life for the Messiah; then comes Wiesel’s own memory of his grandfather blessing him to see the Messiah end evil, and how that memory caused him to tremble in Auschwitz for his grandfather. A story about a holy dance invites Wiesel to wonder how his grandfather died. For him, it is all connected. He expressed what he experienced in the camps in terms of these Hasidic tales and sayings.

One of Hasidism’s finest tales relates that the movement’s founder went to a certain spot in the woods to perform a ritual and utter a prayer to avert a disaster. His successor could not remember the ritual, but knew the spot and the prayer. The next Rebbe knew only the spot, and, finally, only the story remained. This must suffice. Or can it? Wiesel suggests that we might be past that stage.

The proof is that the threat has not been averted. Perhaps we are no longer able to tell the story. Could all of us be guilty? Even the survivors? Especially the survivors?

That last question alone opens up a world of anguish that the trite and easy phrase “survivor’s guilt” can never fathom. It also lends urgency to the task of storytelling. There are no easy answers to any of these questions, which occupied Wiesel his entire life.

Two sayings of Hasidic masters are given in the chapter with no commentary. “To pronounce useless words is to commit murder,” and, “Nothing and nobody down here frightens me… But the moaning of a beggar makes me shudder.” Both of these encapsulate Wiesel’s approach as author and witness: waste no words on things that do not teach truth, and fear nothing as much as another’s suffering.

There is much more that could be discussed. Instead, read Souls on Fire, especially the moving postscript describing why he wrote it. Your time will be greatly rewarded.

To end like I began, on a personal note: I was surprised to feel no sorrow at Wiesel’s passing. In fact, I almost felt happy. I typically get very emotional thinking about the Holocaust at any length. Why not now? In Jewish thought death is often considered a passage from the world of illusion to the world of truth. Wiesel loved truth but was haunted by it. He was truly a soul on fire, so perhaps now he will be able to see things as they really are, and meet with God to reconcile differences and finally have his questions answered. A chance, I feel, to regain his childhood faith.