Tell Your Children When You’re Wrong

Man kisses baby he's holding
By @kellysikkema via Unsplash

In one of the most poignant scenes from The Adam Project, Ryan Reynolds's character meets Jennifer Garner's character by chance in a bar. Garner is Reynolds's mom, but she doesn't recognize him because he has time-traveled into the past. He overhears her talking to the bartender about the difficulties she's having with her son (Reynolds's younger self) in the wake of her husband's death. Reynolds interjects to comfort his mom (which is tricky, since he's a total stranger, as far as she knows). If this is hard to follow, just watch the clip. It's less than three minutes long.

The line that really stuck with me was Reynolds telling Garner, "The problem with acting like you have it all together is he believes it."

When I talked about this scene with Liz and Carl and Ben for a recent episode of Pop Culture on the Apricot Tree, I realized that there’s an important connection between this line and apologizing to our kids. And it’s probably not the reason you’re thinking.

The reason you're thinking of has to do with demonstrating to our kids how they should react to making a bad choice. This is also a good reason to apologize to your children. Admitting mistakes is hard. Guilt doesn't feel good, and it takes practice to handle it positively. You don't want to allow healthy guilt to turn into unhealthy shame (going from "I did bad" to "I am bad"). And you don't want to allow the pain of guilt to cause you to lash out even more, like a wounded animal defending itself when there's no threat. So yeah, apologize to your kids for your mistakes so they know how to apologize for theirs.

And, while we’re on the subject, don’t assume that apologizing to your kids will undermine your authority as a parent. It is important for parents to have authority. Your job is to keep your kids safe and teach them. This only works if you’re willing to set rules they don’t want to follow and teach lessons they don’t want to learn. That requires authority. But the authority should come from the fact that you love them and know more than they do. Or, in the words of the Lord to Joseph Smith, authority should only be maintained “by persuasion, by long-suffering, by gentleness and meekness, and by love unfeigned; By kindness, and pure knowledge” (D&C 121:41-42). If your authority comes from kindness and knowledge, then it will never be threatened by apologizing in those cases when you get it wrong.

In fact, this takes us to that second reason. The one I don’t think as many folks have thought of. And it’s this: admitting that you made a mistake is an important way of showing your kids how hard you’re trying and, by extension, demonstrating your love for them.

The only way to never make a mistake is to never push yourself. Only by operating well within your capabilities can you have a flawless record over the long run. Think about an elite performer like an Olympic gymnast or figure skater. They are always chasing perfection, rarely finding it, and that's in events that are very short (from a few minutes to a few seconds). If you saw a gymnast doing a flawless, 30-minute routine every day for a month, you would know that that routine was very, very easy (for them, at least).

You can't tell whether someone makes mistakes because they're at the limits of their capacity or because they're just careless. But if someone never makes mistakes, then you know that whatever they are doing isn't much of a challenge.

So what does that tell your kid if you never apologize? In effect, it tells them that you have it all together. That, for you at least, parenting is easy.

That’s not the worst message in the world, but it’s not a great one either. Not only is it setting them up for a world of hurt when they become parents one day, but you’re missing an opportunity to tell them the truth: that parenting is the hardest thing you’ve ever done, and you’re doing it for them.

Don’t take this too far. You don’t want to weigh your kids down with every worry and fear and stress you have. You don’t want to tell them every day how hard your job is. That’s a weight no child should carry. But it’s OK to let them know—sparingly, from time to time—that being a parent is really hard. What better, healthier way to convey that than to frankly admit when you make a mistake?

“The problem with acting like you have it all together is he believes it. Maybe he needs to know that you don’t. It’s OK if you don’t.”

Anti-Provincial Provincialism and Fighting Monsters

Part 1: Anti-American Americanism

I ran across a humorous meme in a Facebook group that got me thinking about anti-provincial provincialism. Well, not the meme, but a response to it. Here’s the original meme:

Now check out this (anonymized) response to it (and my response to them):

What has to happen, I wondered, for someone to assert that English is the most common language "only in the United States"? Well, they have to be operating from a kind of anti-Americanism so potent that it has managed to swing all the way back around to being an extreme America-centrism. After all, this person was so eager to take the US down a peg (I am assuming) that they managed to inadvertently erase the entire Anglosphere. The only people who exclude entire categories of countries from consideration are rabid America Firsters and rabid America Lasters. The commonality? They're both only thinking about America.

It is a strange feature of our times that so many folks seem to become the thing they claim to oppose. The horseshoe theory is having its day.

The conversation got even stranger when someone else showed up to tell me that I’d misread the Wikipedia article that I’d linked. Full disclosure: I did double check this before I linked to it, but I still had an “uh oh” moment when I read their comment. Wouldn’t be the first time I totally misread a table, even when specifically checking my work. Here’s the comment:

Thankfully, dear reader, I did not have to type out the mea culpa I was already composing in my mind. Here’s the data (and a link):

My critic had decided to focus only on the first language (L1) category. The original point about “most commonly spoken” made no such distinction. So why rely on it? Same reason, I surmise, as the “only in the US” line of argument: to reflexively oppose anything with the appearance of American jingoism.

Because we can all see that's the subtext here, right? To claim that English is the "most common language" when it is also the language (most) Americans speak is to appear to be making some kind of rah-rah 'Murica statement. Except that… what happens if it's just objectively true?

And it is objectively true. English has the greatest number of total speakers in the world by a wide margin. Even more tellingly, English L2 (second-language) speakers outnumber Chinese L2 speakers by more than 5-to-1. This means that when someone chooses a language to study, they pick English 5 times more often than Chinese. No matter how you slice it, the fact that English is the most common language is just a fact about the reality we currently inhabit.

Not only that, but the connection of this fact to American chauvinism is historically ignorant. Not only is this a discussion about the English language and not the American one, but the linguistic prevalence of English predates the rise of America as a great power. If you think millions of Indians conduct business in English because of America, then you need to open a history book. The Brits started this state of affairs back when the sun really did never set on their empire. We just inherited it.

I wonder if there’s something about opposing something thoughtlessly that causes you to eventually, ultimately, become that thing. Maybe Nietzsche’s aphorism doesn’t just sound cool. Maybe there’s really something to it.

This image doesn't include enough of the quote, which is: "Beware that, when fighting monsters, you yourself do not become a monster… for when you gaze long into the abyss, the abyss gazes also into you." But it's cute. Original from Instagram.

Part 2: Anti-Provincial Provincialism

My dad taught me the phrase “anti-provincial provincialism” when I was a kid. We were talking about the tendency of some Latter-day Saint academics to over-correct for the provincialism of their less-educated Latter-day Saint community and in the process recreate a variety of the provincialism they were running away from. Let me fill this in a bit.

First, a lot of Latter-day Saints can be provincial.

This shouldn’t shock anyone. Latter-day Saint culture is tight-knit and uniform. For teenagers when I was growing up, you had:

  • Three hours of Church on Sunday
  • About an hour of early-morning seminary in the church building before school Monday – Friday
  • Some kind of 1-2 hour youth activity in the church building on Wednesday evenings

This is plenty of time to tightly assimilate and indoctrinate the rising generation, and for the most part this is a good thing. I am a strong believer in liberalism, which sort of secularizes the public square to accommodate different religious traditions. This secularization isn’t anti-religious, it is what enables those religions to thrive by carving out their own spaces to flourish. State religions have a lot of power, but this makes them corrupt and anemic in terms of real devotion. Pluralism is good for all traditions.

But a consequence of the tight-knit culture is that Latter-day Saints can grow up unable to clearly differentiate between general cultural touchstones (Latter-day Saints love Disney and The Princess Bride, but so do lots of people) and unique cultural touchstones (like Saturday's Warrior and Johnny Lingo).

We have all kinds of arcane knowledge that nobody outside our culture knows or cares about, especially around serving two-year missions. Latter-day Saints know what the MTC is (even if they mishear it as “empty sea” when they’re little, like I did) and can recount their parents’ and relatives’ favorite mission stories. They also employ some theological terms in ways that non-LDS (even non-LDS Christians) would find strange.

And the thing is: if nobody tells you, then you never learn which things are things everyone knows and which things are part of your strange little religious community alone. Once, when I was in elementary school, I called my friend on the phone and his mom picked up. I addressed her as "Sister Apple" because Apple was their last name and because at that point in my life the only adults I talked to were family, teachers, or members of my church. Since she wasn't family or a teacher, I defaulted to addressing her as I was taught to address the adults in my church.

As I remember it today, her reaction was quite frosty. Maybe she thought I was in a cult. Maybe I’d accidentally raised the specter of the extremely dark history of Christians imposing their faith on Jews (my friend’s family was Jewish). Maybe I am misremembering. All I know for sure is I felt deeply awkward, apologized profusely, tried to explain, and then never made that mistake ever again. Not with her, not with anyone else.

I had these kinds of experiences–experiences that taught me to see clearly the boundaries between Mormon culture and other cultures–not only because I grew up in Virginia but also because (for various reasons) I didn't get along very well with my LDS peer group for most of my teen years. I had very few close LDS friends from the time that I was about 12 until I was in my 30s. Lots of LDS folks, even those who grew up outside of Utah, didn't have these kinds of experiences. Or had fewer of them.

So the dynamic you can run into is this: a Latter-day Saint without this kind of awareness trips over some of the (to them) invisible boundaries between Mormon culture and the surrounding culture. If they do this in front of another Latter-day Saint who does know where the boundaries are, then the one who's in the know has a tendency to cringe.

This is where you get provincialism (the Latter-day Saint who doesn't know any better) and anti-provincial provincialism (the Latter-day Saint who is too invested in knowing better). After all, why should one Latter-day Saint feel so threatened by a social faux pas of another Latter-day Saint unless they are really invested in that group identity?

My dad was frustrated, at the time, with Latter-day Saint intellectuals who liked to discount their own culture and faith. They were too eager to write off Mormon art or culture or research that was amenable to faithful LDS views. They thought they were being anti-provincial. They thought they were acting like the people around them, outgrowing their culture. But the fact is that their fear of being seen as or identified with Mormonism made them just as obsessed with Mormonism as the most provincial Mormon around. And twice as annoying.

Part 3: Beyond Anti

Although I should have known better, given what my parents taught me growing up, I became one of those anti-provincial provincials for a while. I had a chip on my shoulder about so-called "Utah Mormons". I felt that the Latter-day Saints in Utah looked down on us out in the "mission field," so I turned the perceived slight into a badge of honor. Yeah, maybe this was the mission field, and if so, that meant we out here doing the work were better than Utah Mormons. We had more challenges to overcome, couldn't be lazy about our faith, etc.

And so, like an anti-Americanist who becomes an Americanist, I became an anti-provincial provincialist. I carried that chip on my shoulder into my own mission where, finally meeting a lot of Utah Mormons on neutral territory, I got over myself. Some of them were great. Some of them were annoying. They were just folks. There are pros and cons to living in a religious majority or a minority. I still prefer living where I'm in the minority, but I'm no longer smug about it. It's just a personal preference. There are tradeoffs.

One of the three or four ideas that’s had the most lasting impact on my life is the idea that there are fundamentally only two human motivations. Love, attraction, or desire on the one hand. Fear, avoidance, or aversion on the other.

Why is it that fighting with monsters turns you into a monster? I suspect the lesson is that how and why you fight your battles is as important as what battles you choose to fight. I wrote a Twitter thread about this on Saturday, contrasting tribal reasons for adhering to a religion and genuine conversion. The thread starts here, but here's the relevant Tweet:

If you’re concerned about American jingoism: OK. That’s a valid concern. But there are two ways you can stand against it. In fear of the thing. Or out of love for something else. Choose carefully. Because if you’re motivated by fear, then you will–in the end–become the thing your fear motivates you to fight against. You will try to fight fire with fire, and then you will become the fire.

If you're concerned about Mormon provincialism: OK. Those are valid concerns. Being able to see outside your culture and build bridges with other cultures is a good thing. But, here again, you have to ask if you're more afraid of provincialism or more in love with building bridges. Because if you're afraid of provincialism, well… that's how you get anti-provincial provincialism. And no bridges, by the way.

I might rewrite my pinned Tweet one day.

It’s not two truths. It’s just one. You want something? Fight for it. Fighting against things only gets you nothing in the end.
2 Timothy 1:7, KJV

Acceptance, Humility, and Goal Setting

In theory, setting goals should be simple. Start with where you are by measuring your current performance. Decide where you want to be. Then map out a series of goals that will get you from Point A to Point B with incremental improvements. Despite the simple theory, my lived reality was a nightmare of failure and self-reproach.

It took decades and ultimately I fell into the solution more by accident than by design, but I’m at a point now where setting goals feels easy and empowering. Since I know a lot of people struggle with this—and maybe some even struggle in the same ways that I did—I thought I’d write up my thoughts to make the path a little easier for the next person.

The first thing I had to learn was humility.

For most of my life, I set goals by imagining where I should be and then making that my goal. I utterly refused to begin with where I currently was. I refused to even contemplate it. Why? Because I was proud, and pride always hides insecurity. I could accept that I wasn’t where I should be in a fuzzy, abstract sense. That was the point of setting the goal in the first place. But to actually measure my current achievement and then use that as the starting point? That required me to look in the mirror head-on, and I couldn’t bear to do it. I was too afraid of what I’d see. My sense of worth depended on my achievement, and so if my achievements were not what they should be then I wasn’t what I should be. The first step of goal setting literally felt like an existential threat.

This was particularly true because goal-setting is an integral part of Latter-day Saint culture. Or theology. The line is fuzzy. Either way, the worst experiences I had with goal setting were on my mission. The stakes felt incredibly high. I was the kind of LDS kid who always wanted to go on a mission. I grew up on stories of my dad's mission and other mission stories. And I knew that serving a mission was like this singular opportunity to prove myself. So on top of the general religious pressure there was this additional pressure of "now or never". If I wasn't a good missionary, I'd regret it for the rest of my life. Maybe for all eternity. No pressure.

So when we had religious leaders come and say things like, "If you all contact at least 10 people a day, baptisms in the mission will double," I was primed for an entirely dysfunctional and traumatizing experience. Which is exactly what I got.

(I’m going to set aside the whole metrics-approach-to-missionary-work conversation, because that’s a topic unto itself.)

I tried to be obedient. I set the goal. And I recorded my results. I think I maybe hit 10 people a day once. Certainly never for a whole week straight (we planned weekly). When I failed to hit the goal, I prayed and I fasted and I agonized and I beat myself up. But I never once—not once!—just started with my total from last week and tried to do an incremental improvement on that. That would mean accepting that the numbers from last week were not some transitory fluke. They were my actual current status. I couldn’t bear to face that.

This is one of life's weird little contradictions. To improve yourself—in other words, to change away from who you currently are—the first step is to accept who you currently are. Accepting reality—not accepting that it's good enough, but just that it is—will paradoxically make it much easier to change reality.

Another one of life's little paradoxes is that pride breeds insecurity and humility breeds confidence. Or maybe it's the other way around: insecurity breeds pride and confidence breeds humility. When I was a missionary, from ages 19 to 21, I still didn't really believe that I was a child of God. That I had divine worth. That God loved me because I was me. I thought I had to earn worth. I didn't realize it was a gift. That insecurity fueled pride as a coping mechanism. I wanted to be loved and valuable. I thought those things depended on being good enough. So I had to think of myself as good enough.

But once I really accepted that my value is innate—as it is for all of us—that confidence meant I no longer had to keep up an appearance of competence and accomplishment for the sake of protecting my ego. Because I had no ego left to protect.

Well, not no ego. It’s a process. I’m not perfectly humble yet. But I’m getting there, and that confidence/humility has made it possible for me to just look in the mirror and accept where I am today.

The second thing is a lot simpler: keeping records.

The whole goal setting process as I’ve outlined it depends on being able to measure something. This is why I say I sort of stumbled into all of this by accident. The first time I really set good goals and stuck to them was when I was training for my first half marathon. Because it wasn’t a religious goal, the stakes were a lot lower, so I didn’t have a problem accepting my starting point.

More importantly, I was in the habit of logging my miles for every run. So I could look back for a few weeks (I’d just started) and easily see how many miles per week I’d been running. This made my starting point unambiguous.

Next, I looked up some training plans. These showed how many miles per week I should be running before the half marathon. They also had weekly plans, but to make the goals personal to me, I modified a weekly plan so that it started with my current miles-per-week and ramped up gradually in the time I had to the goal miles-per-week.
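To make the method concrete, here's a minimal sketch of that kind of ramp in Python. All the specific numbers are hypothetical, made up for illustration; only the method itself (start from your measured weekly mileage and step up gradually to the plan's target) comes from what I actually did.

```python
# A minimal sketch of the ramp-up idea described above. All of the
# specific numbers here are hypothetical, for illustration only.

def weekly_mileage_plan(current: float, goal: float, weeks: int) -> list[float]:
    """Ramp linearly from the measured current weekly mileage to the goal."""
    step = (goal - current) / weeks
    return [round(current + step * week, 1) for week in range(1, weeks + 1)]

# Hypothetical example: measured at 10 miles/week, 12 weeks of training,
# and a published plan that peaks at 25 miles/week.
for week, miles in enumerate(weekly_mileage_plan(10.0, 25.0, 12), start=1):
    print(f"Week {week:2d}: {miles} miles")
```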

I didn't recognize this breakthrough for what it was at the time, but—without even thinking about it—I started doing a similar thing with other goals. I tracked my words when I wrote, for example, and that was how I had some confidence that I could do the Great Challenge. I signed up to write a short story every week for 52 weeks in a year because I knew I'd been averaging about 20,000 words / month (roughly 5,000 words a week), which was within range of 4-5 short stories / month.

And then, just a month ago, I plotted out how long I wanted it to take for me to get through the Book of Mormon attribution project I’m working on. I’ve started that project at least 3 or 4 times, but never gotten through 2 Nephi, mostly because I didn’t have a road map. So I built a road map, using exactly the same method that I used for the running plan and the short stories plan.

And that was when I finally realized I was doing what I’d wanted to do my whole life: setting goals. Setting achievable goals. And then achieving them. Here’s my chart for the Book of Mormon attribution project, by the way, showing that I’m basically on track about 2 months in.

The y-axis is characters, if you’re curious. There are about 1.4m characters in the BoM.
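For the curious, the road map uses the same arithmetic, just with cumulative targets instead of weekly ones. Here's a sketch; the timeline below is hypothetical, and only the ~1.4 million character total comes from the project itself.

```python
# A sketch of the road-map arithmetic for a cumulative goal. The 18-month
# timeline is hypothetical; only the ~1.4M character total is from the post.

TOTAL_CHARS = 1_400_000  # approximate character count of the Book of Mormon
MONTHS = 18              # hypothetical timeline

monthly_target = TOTAL_CHARS / MONTHS
for month in range(1, MONTHS + 1):
    print(f"Month {month:2d}: {round(month * monthly_target):,} cumulative characters")
```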

I hope this helps someone out there. It’s pretty basic, but it’s what I needed to learn to be able to start setting goals. And setting goals has enabled me to do two things that I really care about. First, it lets me take on bigger projects and improve myself more / faster than I had in the past. Second, and probably more importantly, it quiets the demon of self-doubt. Like I said: I’m not totally recovered from that “I’m only worth what I accomplish” mental trap. When I look at my goals and see that I’m making consistent progress towards things I care about, it reminds me what really matters. Which is the journey.

I guess that’s the last of life’s little paradoxes for this blog post: you have to care about pursuing your goals to live a meaningful, fulfilling life. But you don’t actually have to accomplish anything. Accomplishment is usually outside your control. It’s always subject to fortune. The output is the destination. It doesn’t really matter.

But you still have to really try to get that output. Really make a journey towards that destination. Because that’s the input. That’s what you put in. That does depend on you.

I believe that doing my best to put everything into my life is what will make my life fulfilling. What I get out of it? Well, after I’ve done all I can, then who cares how it actually works out. That’s in God’s hands.

The Real Social Dilemma

The Social Dilemma is a newly available documentary on Netflix about the perils of social networks.

The documentary does a decent job of introducing some of the ways social networks (Facebook, Twitter, Pinterest, etc.) are negatively impacting society. If this is your entry point to the topic, you could do worse.

But if you're looking for a really thorough analysis of what is going wrong or for possible solutions, then this documentary will leave you wanting more. Here are four specific topics–three small and one large–where The Social Dilemma fell short.

AI Isn’t That Impressive

I published a piece in June on Why I’m an AI Skeptic and a lot of what I wrote then applies here. Terms like “big data” and “machine learning” are overhyped, and non-experts don’t realize that these tools are only at their most impressive in a narrow range of circumstances. In most real-world cases, the results are dramatically less impressive. 

The reason this matters is that a lot of the oomph of The Social Dilemma comes from scaring people, and AI just isn’t actually that scary. 

Randall Munroe’s What If article is technically about a robot apocalypse, but the gist of it applies to AI as well.

I don’t fault the documentary for not going too deep into the details of machine learning. Without a background in statistics and computer science, it’s hard to get into the details. That’s fair. 

I do fault them for sensationalism, however. At one point Tristan Harris (one of the interviewees) makes a really interesting point that we shouldn’t be worried about when AI surpasses human strengths, but when it surpasses human weaknesses. We haven’t reached the point where AI is better than a human at the things humans are good at–creative thinking, language, etc. But we’ve already long since passed the point where AI is better than humans at things humans are bad at, such as memorizing and crunching huge data sets. If AI is deployed in ways that leverage human weaknesses, like our cognitive biases, then we should already be concerned. So far this is reasonable, or at least interesting.

But then his next slide (they’re showing a clip of a presentation he was giving) says something like: “Checkmate humanity.”

I don't know if the sensationalism is in Tristan's presentation or The Social Dilemma's editing, but either way I had to roll my eyes.

All Inventions Manipulate Us

At another point, Tristan tries to illustrate how social media is fundamentally unlike other human inventions by contrasting it with a bicycle. “No one got upset when bicycles showed up,” he says. “No one said…. we’ve just ruined society. Bicycles are affecting society, they’re pulling people away from their kids. They’re ruining the fabric of democracy.”

Of course, this isn’t really true. Journalists have always sought sensationalism and fear as a way to sell their papers, and–as this humorous video shows–there was all kinds of panic around the introduction of bicycles.

Tristan's real point, however, is that bicycles were a passive invention. They don't actively badger you to get you to go on bike rides. They just sit there, benignly waiting for you to decide to use them or not. In this view, you can divide human inventions into everything before social media (inanimate objects that obediently do our bidding) and after social media (animate objects that manipulate us into doing their bidding).

That dichotomy doesn’t hold up. 

First of all, every successful human invention changes behavior individually and collectively. If you own a bicycle, then the route you take to work may very well change. In a way, the bike does tell you where to go. 

To make this point more strongly, try to imagine what 21st century America would look like if the car had never been invented. No interstate highway system, no suburbs or strip malls, no car culture. For better and for worse, the mere existence of a tool like the car transformed who we are both individually and collectively. All inventions have cultural consequences like that, to a greater or lesser degree.

Second, social media is far from the first invention that explicitly sets out to manipulate people. If you believe the argumentative theory, then language and even rationality itself evolved primarily as ways for our primate ancestors to manipulate each other. It’s literally what we evolved to do, and we’ve never stopped.

Propaganda, disinformation campaigns, and psy-ops are one obvious category of examples with roots stretching back into prehistory. But, to bring things closer to social networks, all ad-supported broadcast media have basically the same business model: manipulate people to captivate their attention so that you can sell them ads. That’s how radio and TV got their commercial start: with the exact same mission statement as GMail, Google search, or Facebook. 

So much for the idea that you can divide human inventions into before and after social media. It turns out that all inventions influence the choices we make and plenty of them do so by design.

That's not to say that nothing has changed, of course. The biggest difference between social networks and broadcast media is that your social networking feed is individualized.

With mass media, companies had to either pick and choose their audience in broad strokes (Saturday morning for kids, prime time for families, late night for adults only) or try to address two audiences at once (inside jokes for the adults in animated family movies marketed to children). With social media, it’s kind of like you have a radio station or a TV studio that is geared just towards you.

Thus, social media does present some new challenges, but we’re talking about advancements and refinements to humanity’s oldest game–manipulating other humans–rather than some new and unprecedented development with no precursor or context. 

Consumerism is the Real Dilemma

The most interesting subject in the documentary, to me at least, was Jaron Lanier. When everyone else was repeating that cliché about “you’re the product, not the customer” he took it a step or two farther. It’s not that you are the product. It’s not even that your attention is the product. What’s really being sold by social media companies, Lanier pointed out, is the ability to incrementally manipulate human behavior.

This is an important point, but it raises a much bigger issue that the documentary never touched. 

This is the amount of money spent in the US on advertising as a percent of GDP over the last century:

Source: Wikipedia

It’s interesting to note that we spent a lot more (relative to the size of our economy) on advertising in the 1920s and 1930s than we do today. What do you think companies were buying for their advertising dollars in 1930 if not “the ability to incrementally manipulate human behavior”?

Because if advertising doesn't manipulate human behavior, then why spend the money? If you couldn't manipulate human behavior with a billboard or a movie trailer or a radio spot, then nobody would ever spend money on any of those things.

This is the crux of my disagreement with The Social Dilemma. The poison isn’t social media. The poison is advertising. The danger of social media is just that (within the current business model) it’s a dramatically more effective method of delivering the poison.

Let me stipulate that advertising is not an unalloyed evil. There’s nothing intrinsically wrong with showing people a new product or service and trying to persuade them to pay you for it. The fundamental premise of a market economy is that voluntary exchange is mutually beneficial. It leaves both people better off. 

And you can't have voluntary exchange without people knowing what's available. Thus, advertising is necessary to human commerce and is part of an ecosystem of flourishing, mutually beneficial exchanges and healthy competition. You could not have modern society without advertising of some degree and type.

That doesn't mean the amount of advertising–or the kind of advertising–that we accept in our society is healthy. As with basically everything, the difference between poison and medicine is found in the details of dosage and usage.

There was a time, not too long ago, when the Second Industrial Revolution led to such dramatically increased levels of production that economists seriously theorized about ever shorter work weeks with more and more time spent pursuing art and leisure with our friends and families. Soon, we’d spend only ten hours a week working, and the rest developing our human potential.

And yet in the time since then, we've seen productivity skyrocket (we can make more and more stuff with the same amount of time) while hours worked have remained roughly steady. The simplest reason for this? We're addicted to consumption. Instead of holding production basically constant (and working fewer and fewer hours), we've tried to maximize consumption by keeping as busy as possible. This addiction to consumption, not necessarily to having stuff but to acquiring it, manifests in some really weird cultural anomalies that, if we witnessed them from an alien perspective, would probably strike us as dysfunctional or even pathological.

I’ll start with a personal example: when I’m feeling a little down I can reliably get a jolt of euphoria from buying something. Doesn’t have to be much. Could be a gadget or a book I’ve wanted on Amazon. Could be just going through the drive-thru. Either way, clicking that button or handing over my credit card to the Chick-Fil-A worker is a tiny infusion of order and control in a life that can seem confusingly chaotic and complex. 

It’s so small that it’s almost subliminal, but every transaction is a flex. The benefit isn’t just the food or book you purchase. It’s the fact that you demonstrated the power of being able to purchase it. 

From a broader cultural perspective, let’s talk about unboxing videos. These are videos–you can find thousands upon thousands of them on YouTube–where someone gets a brand new gizmo and films a kind of ritualized process of unpacking it. 

This is distinct from a product review (a separate and more obviously useful genre). Some unboxing videos have little tidbits of assessment, but that’s beside the point. The emphasis is on the voyeuristic appeal of watching someone undress an expensive, virgin item. 

And yeah, I went with deliberately sexual language in that last sentence because it’s impossible not to see the parallels between brand newness and virginity, or between ornate and sophisticated product packaging and fashionable clothing, or between unboxing an item and unclothing a person. I’m not saying it’s literally sexual, but the parallels are too strong to ignore.

These do not strike me as the hallmarks of a healthy culture, and I haven't even touched on the vast amounts of waste. Of course there's the literal waste, both from all that aforementioned packaging and from replacing consumer goods (electronics, clothes, etc.) at an ever-faster pace. There's also the opportunity cost, however. If you spend three or four or ten times more on a pair of shoes to get the right brand and style than you would on a pair of equally serviceable shoes without the right branding, well… isn't that waste? You could have spent the money on something else or, better still, saved it or even worked less.

This rampant consumerism isn’t making us objectively better off or happier. It’s impossible to separate consumerism from status, and status is a zero-sum game. For every winner, there must be a loser. And that means that, as a whole, status-seeking can never make us better off. We’re working ourselves to death to try and win a game that doesn’t improve our world. Why?

Advertising is the proximate cause. Somewhere along the way advertisers realized that instead of trying to persuade people directly that this product would serve some particular need, you could bypass the rational argument and appeal to subconscious desires and fears. Doing this allows for things like "brand loyalty." It also detaches consumption from need. You can have enough physical objects, but can you ever have enough contentment, or security, or joy, or peace?

So car commercials (to take one example) might mention features, but most of the work is done by stoking your desires: for excitement if it’s a sports car, for prestige if it’s a luxury car, or for competence if it’s a pickup truck. Then those desires are associated with the make and model of the car and presto! The car purchase isn’t about the car anymore. It’s about your aspirations as a human being. 

The really sinister side-effect is that when you hand over the cash to buy whatever you've been persuaded to buy, what you're actually hoping for is not a car or ice cream or a video game system. What you're actually seeking is the fulfillment of a much deeper desire for belonging or safety or peace or contentment. Since no product can actually meet those deeper desires, advertising simultaneously stokes longing and redirects us away from avenues that could potentially fulfill it. We're all like Dumbledore in the cave, drinking poison that only makes us thirstier and thirstier.

One commercial will not have any discernible effect, of course, but life in 21st century America is a life saturated by these messages. 

And if you think it’s bad enough when the products sell you something external, what about all the products that promise to make you better? Skinnier, stronger, tanner, whatever. The whole outrage of fashion models photoshopped past biological possibility is just one corner of the overall edifice of an advertising ecosystem that is calculated to make us hungry and then sell us meals of thin air. 

I developed this theory that advertising fuels consumerism, which sabotages our happiness at an individual and social level, when I was a teenager in the 1990s. There was no social media back then.

So, getting back to The Social Dilemma, the problem isn't that life was fine and dandy and then social networking came and destroyed everything. The problem is that we already lived in a sick, consumerist society where advertising inflamed desires and directed them away from any hope of fulfillment, and then social media made it even worse.

After all, everything that social media does has been done before. 

News feeds are tweaked to keep you scrolling endlessly? Radio stations have endlessly fiddled with their formulas for placing advertisements to keep you from changing that dial. TV shows were written around advertising breaks to make sure you waited for the action to continue. (Watch any old episode of Law and Order to see what I mean.) Social media does the same thing; it's just better at it. (Partially through individualized feeds and AI algorithms, but also through effectively crowd-sourcing the job: every meme you post contributes to keeping your friends and family ensnared.)

Advertisements bypassing objective appeals to quality or function and appealing straight to your personal identity, your hopes, your fears? Again, this is old news. Consider the fact that you immediately picture in your mind different stereotypes for the kind of person who drives a Ford F-150, a Subaru Outback, or a Honda Civic. Old-fashioned advertisements were already well on the way to fracturing society into "image tribes" that defined themselves and each other at least in part in terms of their consumption patterns. Social media just doubled down on that trend by allowing increasingly smaller and more homogeneous tribes to find and socialize with each other (and be targeted by advertisers).

So the biggest thing that was missing from The Social Dilemma was the realization that social media isn't some strange new problem. It's an old problem made worse.

Solutions

The final shortcoming of The Social Dilemma is that it offered no solutions. This is an odd gap, because at least one potential solution is pretty obvious: stop relying on ad-supported products and services. If you paid $5 / month for your Facebook account and that was their sole revenue stream (no ads allowed), then a lot of the perverse incentives around manipulating your feed would go away.

Another solution would be stricter privacy controls. As I mentioned above, the biggest differentiator between social media and older, broadcast media is individualization. I've read (can't remember where) about the idea of privacy collectives: groups of consumers could band together, withhold their data from social media companies, and then dole it out in exchange for revenue (why shouldn't you get paid for the advertisements you watch?) or just refuse to participate at all.

These solutions have drawbacks. It sounds nice to get paid for watching ads (nicer than the alternative, anyway) and to have control over your data, but there are some fundamental economic realities to consider. “Free” services like Facebook and Gmail and YouTube can never actually be free. Someone has to pay for the servers, the electricity, the bandwidth, the developers, and all of that. If advertisers don’t, then consumers will need to. Individuals can opt out and basically free-ride on the rest of us, but if everyone actually did it then the system would collapse. (That’s why I don’t use ad blockers, by the way. It violates the categorical imperative.)

And yeah, paying $5/month to Twitter (or whatever) would significantly change the incentives to manipulate your feed, but it wouldn't actually make them go away. They'd still have every incentive to keep you as highly engaged as possible, to make sure you never canceled your subscription and that you enlisted all your friends to sign up, too.

Still, it would have been nice if The Social Dilemma had spent some time talking about specific possible solutions.

On the other hand, here’s an uncomfortable truth: there might not be any plausible solutions. Not the kind a Netflix documentary is willing to entertain, anyway.

In the prior section, I said “advertising is the proximate cause” of consumerism (emphasis added this time). I think there is a deeper cause, and advertising–the way it is done today–is only a symptom of that deeper cause.

When you stop trying to persuade people to buy your product directly–by appealing to their reason–and start trying to bypass their reason to appeal to subconscious desires you are effectively dehumanizing them. You are treating them as a thing to be manipulated. As a means to an end. Not as a person. Not as an end in itself. 

That’s the supply side: consumerism is a reflection of our willingness to tolerate treating each other as things. We don’t love others.

On the demand side, the emptier your life is, the more susceptible you become to this kind of advertising. Someone who actually feels belonging in their life on a consistent basis isn’t going to be easily manipulated into buying beer (or whatever) by appealing to that need. Why would they? The need is already being met.

That’s the demand side: consumerism is a reflection of how much meaning is missing from so many of our lives. We don’t love God (or, to be less overtly religious, feel a sense of duty and awe towards transcendent values).

As long as these underlying dysfunctions are in place, we will never successfully detoxify advertising through clever policies and incentives. There’s no conceivable way to reasonably enforce a law that says “advertising that objectifies consumers is illegal,” and any such law would violate the First Amendment in any case. 

The difficult reality is that social media is not intrinsically toxic any more than advertising is intrinsically toxic. What we’re witnessing is our cultural maladies amplified and reflected back through our technologies. They are not the problem. We are.

Therefore, the one and only way to detoxify our advertising and social media is to overthrow consumerism at the root. Not with creative policies or stringent laws and regulations, but with a fundamental change in our cultural values. 

We have the template for just such a revolution. The most innovative inheritance of the Christian tradition is the belief that, as children of God, every human life is individually and intrinsically valuable. An earnest embrace of this principle would make manipulative advertising unthinkable and intolerable. Christianity–like all great religions, but perhaps with particular emphasis–also teaches that a valuable life is found only in the service of others, service that would fill the emptiness in our lives and make us dramatically less susceptible to manipulation in the first place.

This is not an idealistic vision of Utopia. I am not talking about making society perfect. Only making it incrementally better. Consumerism is not binary. The sickness is a spectrum. Every step we could take away from our present state and towards a society more mindful of transcendent ideals (truth, beauty, and the sacred) and more dedicated to the love and service of our neighbors would bring a commensurate reduction in the sickness of manipulative advertising that results in tribalism, animosity, and social breakdown. 

There's a word for what I'm talking about, and the word is: repentance. Consumerism, the underlying cause of the toxic advertising that is the kernel of the destruction wrought by social media, is the cultural incarnation of our pride and selfishness. We can't jury-rig an economic or legal solution to a fundamentally spiritual problem.

We need to renounce what we’re doing wrong, and learn–individually and collectively–to do better.

Are Americans Religiously Literate?

I've written a lot about political knowledge (or the lack thereof). A recent Pew study tests Americans' religious knowledge, and the results aren't exactly inspiring. When it comes to Christian or biblical basics, the majority of the population answers correctly. Less so when it comes to Islam, but still a majority.

Most Americans are familiar with key elements of Christianity, terminology of nonbelief, basics of Islam

When you start to move beyond these religions, however, the knowledge drops drastically.

Three-in-ten or fewer Americans know when Jewish Sabbath begins, that Rosh Hashana is the Jewish New Year

One-in-five Americans know Protestantism (not Catholicism) traditionally teaches that salvation comes through faith alone

The next bit reminded me of an incident at church a few years back. During a lesson, the teacher made a comment about how he “never asks about other people’s religion, but they always ask me about mine.” He took this as evidence of the “truthfulness” of Mormonism. I did not hesitate to point out that their curiosity likely had less to do with the Church’s “truthfulness” and more to do with us being a supposedly weird, polygamous cult with a different book. I then noted that learning about other religions improves interfaith dialogue by allowing us to better communicate with those of different faith backgrounds.

As the data below demonstrate, that teacher was not alone: Mormons are some of the most well-versed when it comes to the Bible and Christianity, but some of the least knowledgeable regarding other religions.

Evangelical Protestants get the most questions right about Christianity; Jews are most well-versed in world religions

Overall, it appears that American religious literacy is pretty meh.

Some Thoughts on the Tolkien Movie

I could have sworn there was a quick image of a cross in one scene, but I couldn’t find it online. So here’s a generic screen shot from the movie instead.

I saw Tolkien last week, and I really enjoyed it. This was surprising to me, because religion was absolutely essential to J. R. R. Tolkien’s life, to his motivations for inventing Middle Earth and all that went with it, and to the themes and characters of all the works he wrote in Middle Earth. Hollywood, on the other hand, is utterly incapable of handling religion seriously. So, how did Tolkien manage to be a good film anyway?

By basically ignoring religion.

Don’t get me wrong. They do mention that he’s Catholic, depict his relationship with the priest who was his caretaker after his mother died, and talk about the tension when Tolkien–a Catholic–wanted to pursue a relationship with Edith, who wasn’t Catholic.

You might think that's religion, but it's not, any more than Romeo and Juliet coming from different houses was about religion. His Catholicism is treated as a kind of immutable faction that he was born into and so is stuck with.

Now, if the movie tried to explain anything deep about Tolkien’s character or his work while omitting religion, it would have failed utterly. It succeeds because–after redacting religion from Tolkien’s life–it also studiously avoids trying to say anything deep about his life or his life’s work.

As far as this film is concerned, all you need to understand how Tolkien's life led him to create Middle Earth is a series of simplistic and primarily visual references. Tolkien left behind a boyhood home of rolling green hills. That's the shire. Once, he saw the shadows of bare tree branches on the ceiling of his childhood room at night. That's the ents.

And of course there’s World War I. German flamethrowers attacking a British trench became the balrog. Shattered and broken human corpses mired in a denuded wasteland reduced to mud and water-filled craters became the Dead Marshes. And a kind of generic sense of impending, invisible doom became a dragon and also Sauron’s all-seeing eye.

As for the most famous aspect of Tolkien’s writing–the fact that he invented entire languages–that’s basically written off as a kind of personal obsession. Some people juggle geese. What are you going to do?

None of this is wrong, and that’s why the movie is so enjoyable. It’s fun to see the visual references, even if they are a bit heavy-handed. The rise-from-ashes, boyhood camaraderie and romantic plotlines are all moving. But for the most part the movie avoids all the really deep stuff and just tells a light, superficial story about Tolkien’s circle of friends growing up. And I’m fine with all of that.

Not everything has to go deep, and a movie is far from the best way to investigate what Tolkien meant by “subcreation” or a “secondary world” and all the theology that goes with that, anyway. Even if Hollywood could do religion. Which, seeing as how they can’t, just makes me grateful that in this film they didn’t try. That saves it from ruin and makes it a perfectly fun movie that every fan of J. R. R. Tolkien should see.

Does Religion Lead to Good Sex?

Drawing on a new IFS study, David French writes in the National Review,

How many happy, sexually vibrant religious married couples have you seen on popular television shows or movies — even in this era of fragmented, targeted entertainment? Now, compare that number (which is very, very close to zero) with the number of times you've seen liberation from religion portrayed as the key to sexual fulfillment.

How many times, amid the celebrations of sexuality on college campuses, do you hear the speakers at the various “sex weeks” say something like, “If you really want to improve your odds of enjoying a sexually satisfying life with a faithful partner, you might want to check out church”? Or how many wonkish progressives — the very people most likely to share charts and graphs about the effects of public policies or to pass around the latest social science about race, gender, and gender identity — will dwell on charts such as these, from the invaluable Institute for Family Studies:


He continues:

The global data reflected the U.S. reality. Highly religious couples “enjoy higher-quality relationships and more sexual satisfaction” compared with mixed or entirely secular couples. Moreover, in the global study, religion has an increasingly positive influence on fertility. Religious couples had “0.27 more children than those who never, or practically never, attend.”

Sadly, however, religious practice was “not protective against domestic violence.” There was no statistically significant difference in risk between secular and religious couples.

The IFS study doesn’t just explode progressive cultural stereotypes of unhappy, sexless religious prudes. Conservatives often think of feminists (especially secular feminists) as angry and joyless. But the study indicates otherwise. There was a “J-Curve in overall relationship quality for women.” It turns out that women in “shared secular, progressive relationships enjoy comparatively high levels of relationship quality.” They were surpassed only by “women in highly religious relationships, especially traditionalists.”

Less sex may also be contributing to less happiness. "IFS senior fellow Bradford Wilcox and IFS research fellow Lyman Stone followed Julian's work by examining whether the sex recession was related to the measurable decline of happiness in America's young adults. They concluded that 'changes in sexual frequency can account for about one-third of the decline in happiness since 2012 and almost 100 percent of the decline in happiness since 2014.'" In short, the sexual revolution has brought about

its own brand of unhappiness, including — ironically enough — sexlessness… Sexual liberation has all too often brought neither sex nor liberation, and thanks to the work of the IFS, we can respond to felt need with real data. Are you seeking love in this life? The church doors are always open, and while matchmaking isn't its purpose, the connection to a holy God carries with it connection to his flawed people, and in those connections you can find profound joy.

Stuff I Say at School – Part V: Tocqueville and Social Capital

This is part of the Stuff I Say at School series.

The Assignment

Alexis de Tocqueville argues that the active involvement of American citizens in civil society distinguishes America from Europe and helps to prevent American government from becoming overcentralized. In fact, civil society not only prevents Big Government from taking over, but enlarges each citizen's life, helping them overcome the natural tendency of democratic citizens to isolate from each other. Contemporary social observers, like Robert Putnam and Marc Dunkelman, have seen trends of disengagement from civil society in their recent studies (and more engagement in virtual communities via technology). Discuss the significance of civil society from Tocqueville's perspective and whether these recent trends of disengagement should be viewed as a cause of some alarm.

The Stuff I Said

Tocqueville's view of civil society is very organic: a kind of pre-state network guided by cultural norms and both individual and communal pursuits. The bottom-up, arguably emergent nature of Tocqueville's perception is likely why many classical liberal writers quote him so favorably. The ability of private individuals to organize to advance societal goals, rather than relying on the coercion of the state, appears to be deeply encouraged by Tocqueville. This makes public engagement a necessity to avoid "despotism," which in turn makes the decline in social capital potentially problematic.

However, there are a few points worth noting about the claims of social capital decline and the march toward despotism:

First and foremost, government has grown significantly since the mid 1800s. Democracy in America was written 20-30 years prior to the outbreak of the Civil War. My own state of Texas had not even been annexed yet. For all we know, Tocqueville might think we’ve been in the era of Big Government for over a century.

Next, economists Dora Costa and Matthew Kahn find that declines in social capital (i.e., volunteering and organization membership, entertainment of friends and relatives at home) between 1952 and 1998 were largely among women due to their increased participation in the labor force. Other contributors were income inequality and increasing ethnic heterogeneity. While income inequality can be a problem (it tends to erode trust), increasing diversity and female labor participation are, in my view, not negative developments.

Parents also appear to be spending more time with their children. For example, a 2016 study of 11 Western countries found that “the mean time the average mother in the 11 countries spent daily on child care in 1965 was calculated to be about 54 minutes, it increased to a predicted 104 minutes by 2012. For fathers, the estimates increased from a scant 16 minutes daily in 1965 to 59 minutes in 2012” (pg. 1090). Engaged parenting results in better child outcomes. So while parents may not be entertaining friends or bowling with buddies as much, they are giving their kids more attention. Considering Tocqueville’s focus on family, I think he would find this a plus (especially in the midst of the family fragmentation that has occurred over the last few decades).

But even with these declines, a majority of Americans still participate in various organizations. Drawing on the 2007 Baylor National Religious Survey, sociologist Rodney Stark finds that while 41% of Americans have no membership in non-church organizations, 48% had 1-3 memberships and 11% had 4-5 memberships. “About six Americans out of ten belong to at least one voluntary organization. Add in church organizations and the number rises to more than seven out of ten, and the median becomes two memberships” (pg. 122-123).

Finally, the labor market was dominated by agriculture (76.2% in 1800; 53.6% in 1850) during the period in which Tocqueville wrote. By the turn of the 20th century, however, most of the labor force could be found in the manufacturing (35.8%) and service (23.6%) sectors. By the 21st century, services had come to dominate the labor market (73% in 1999). While social capital in the form of organizational participation may have declined over the last half century, the kind of work we do has changed drastically, and with it our workplace experience. We now have co-workers we spend hours cooperating with each day and customers we are obligated to respect day in and day out. The relationships (and social capital) we establish through the workplace are very different from those of 19th-century farms or even industrial-era factories. The late Peter Drucker believed that today’s business institutions “are increasingly the means through which individual human beings find their livelihood, find their access to social status, to community and to individual achievement and satisfaction” (pg. 16). I don’t think we should underestimate the long-run impact of commerce on social capital. Numerous studies find that markets foster socially desirable traits like trust, cooperation, and tolerance.[ref]Another classmate pointed out that it’s likely too soon to tell whether internet-based communities can fulfill the same civic functions as older forms. I think his point about the internet is really important. The concern over “echo chambers” may in fact be overblown. Granted, there is evidence suggesting that social media does increase things like political polarization. But we may be underselling the benefits of greater connectivity via technology (especially through social media and mobile phones).[/ref]

In short, I think Tocqueville might find our over-reliance on government somewhat distasteful, but overall he would be impressed with how incredibly adaptive the American people have been over the course of nearly two centuries of rapid change and development. That adaptiveness would confirm many of the observations he made about the underlying mores of American civil society.

Stuff I Say at School – Part I: Tocqueville and the “Nones”

This is part of the Stuff I Say at School series.

I started my MA program in Government at Johns Hopkins University this past month. Homework is therefore going to take up a lot of my time and cut into my blogging. Instead of admitting defeat, I’ve decided to share excerpts from various assignments in a kind of series. I was inspired by the Twitter feed “Sh*t My Dad Says.” While “Sh*t I Say at School” is a funnier title, I’ll go the less vulgar route and name it “Stuff I Say at School.” Some of this material will be familiar to DR readers, but presenting it in a new context will hopefully keep it fresh. So without further ado, let’s dive in.

The Assignment

A recent Pew study showed that millennials are less religiously affiliated than any previous cohort of Americans (sometimes called the rise of the “nones”). Given the emphasis Tocqueville places on the role religion plays in creating a culture that helps to keep democracy in America anchored, analyze these developments through Tocqueville’s viewpoint[.]

The Stuff I Said

Tocqueville would likely have a strong affinity for Baylor sociologist Rodney Stark’s research on religion. Stark’s sociological analysis of religion takes an approach similar to Tocqueville’s, acknowledging that the religious competition and pluralism (i.e., the religious free market) that resulted from religion’s uncoupling from the state produce a robust, dynamic religious environment. He puts it bluntly in his book The Triumph of Faith: “the more religious competition there is within a society, the higher the overall level of individual participation” (pg. 56). It is the state sponsorship of churches, he claims, that has contributed to Europe’s religious decline.

I was struck by the claim in the lecture that 95% of Americans attended church weekly in the mid-19th century because it contradicts the data collected by Stark and Finke:

On the eve of the Revolution only about 17 percent of Americans were churched. By the start of the Civil War this proportion had risen dramatically, to 37 percent. The immense dislocations of the war caused a serious decline in adherence in the South, which is reflected in the overall decline to 35 percent in the 1870 census. The rate then began to rise once more, and by 1906 slightly more than half of the U.S. population was churched. Adherence rates reached 56 percent by 1926. Since then the rate has been rather stable although inching upwards. By 1980 church adherence was about 62 percent (pg. 22).

Tocqueville might also be more optimistic about the state of America’s religious pulse than today’s declinist narratives suggest. For example, Stark has criticized the narrative that often accompanies the “rise of the nones”:

The [Pew] findings would seem to be clear: the number of Americans who say their religious affiliation is “none” has increased from about 8 percent in 1990 to about 22 percent in 2014. But what this means is not so obvious, for, during this same period, church attendance did not decline and the number of atheists did not increase. Indeed, the percentage of atheists in America has stayed steady at about 4 percent since a question about belief in God was first asked in 1944. In addition, except for atheists, most of the other “nones” are religious in the sense that they pray (some pray very often) and believe in angels, in heaven, and even in ghosts. Some are also rather deeply involved in “New Age” mysticisms.

So who are these “nones,” and why is their number increasing–if it is? Back in 1990 most Americans who seldom or never attended church still claimed a religious affiliation when asked to do so. Today, when asked their religious preference, instead of saying Methodist or Catholic, now a larger proportion of nonattenders say “none,” by which most seem to mean “no actual membership.” The entire change has taken place within the nonattending group, and the nonattending group has not grown.

In other words, this change marks a decrease only in nominal affiliation, not an increase in irreligion. So whatever else it may reflect, the change does not support claims for increased secularization, let alone a decrease in the number of Christians. It may not even reflect an increase in those who say they are “nones.” The reason has to do with response rates and the accuracy of surveys (pg. 190).

Finally, Tocqueville was right to recognize the benefits of religion to society. As laid out by Stark in his America’s Blessings (pg. 4-5), religious Americans, compared to their irreligious counterparts, are:

  • Less likely to commit crimes.
  • More likely to contribute to charities, volunteer their time, and be active in civic affairs (a recent Pew study provides support for this last one).
  • Happier, less neurotic, less likely to commit suicide.
  • Living longer.
  • More likely to marry, stay married, have children, and be more satisfied in their marriage.
  • Less likely to abuse their spouse or children.
  • Less likely to cheat on their spouse.
  • Performing better on standardized tests.
  • More successful in their careers.
  • Less likely to drop out of school.
  • More likely to consume “high culture.”
  • Less likely to believe in occult and paranormal phenomena (e.g., Bigfoot, UFOs).

Overall, I think Tocqueville would be pleased to see data back up his observations.

More Stuff

A classmate pointed to a recent study claiming that when one controls for social desirability bias, the number of atheists in America may rise to over a quarter of the population. The study is certainly interesting, though I wonder if the result would hold up in other countries. Based on Stark’s The Triumph of Faith, here are the average percentages of atheists across various world regions:

  • Latin America: 2.5%
  • Western Europe: 6.7%
  • Eastern Europe: 4.6%
  • Islamic Nations: 1.1%
  • Sub-Saharan Africa: 0.7%
  • Asia: 11.3%
  • Other (Australia, Canada, Iceland, New Zealand): 8.4%

As for the unaffiliated Millennials, unchurched and irreligious are two different things. A Pew study from last year found that 72% of the “nones” believe in some kind of higher power, with 17% believing in the “God of the Bible.” Even 67% of self-identified agnostics believe in a higher power, with 3% believing in the “God of the Bible.” But unchurching can lead to other forms of spirituality. The Baylor Religion Survey has found, perhaps surprisingly to some, that traditional forms of religion and high church attendance have strong negative effects on belief in the occult and paranormal. In other words, a regular church-goer is less likely than a non-attendee to believe in things like Atlantis, haunted houses, UFOs, mediums, New Age movements, alternative medicine, etc. This is probably why Millennials are turning to astrology, alternative medicine, healing crystals, and the like.

Inclusive Institutions and the Church

A few months ago, I posted about a new working paper exploring the origins of WEIRD psychology. A brand new job market paper builds on this research:

Political institutions, ranging from autocratic regimes to inclusive, democratic ones, are widely acknowledged as a critical determinant of economic prosperity (e.g. Acemoglu and Robinson 2012, North, Wallis, and Weingast 2009). They create incentives that foster or inhibit economic growth. Yet, the emergence and global variation of growth-enhancing, inclusive political institutions in which people broadly participate in the governing process and the power of the elite is constrained, are not well understood. Initially, inclusive institutions were largely confined to the West. How and why did those institutions emerge in Europe?

This article contributes to the debate on the formation and global variation of inclusive institutions by combining and empirically testing two long-standing hypotheses. First, anthropologist Jack Goody (1983) hypothesized that, motivated by financial gains, the medieval Catholic Church implemented marriage policies—most prominently, prohibitions on cousin marriage—that destroyed the existing European clan-based kin networks. This created an almost unique European family system where, still today, the nuclear family dominates and marriage among blood relatives is virtually absent. This contrasts with many parts of the world, where first- and second-cousin marriages are common (Bittles and Black 2010). Second, several scholars have hypothesized that strong extended kin networks are detrimental to the formation of social cohesion and affect institutional outcomes (Weber, 1958; Todd, 1987; Augustine, 1998). Theologian Augustine of Hippo (354–430) pointed out that marrying outside the kin group enlarges the range of social relations and “should thereby bind social life more effectively by involving a greater number of people in them” (Augustine of Hippo, 354-430 / 1998, p. 665). More recently, Greif (2005), Greif and Tabellini (2017), Mitterauer (2010), and Henrich (forthcoming) combined these two hypotheses and emphasized the critical role of the Church’s marriage prohibitions for Europe’s institutional development (pg. 2).

His findings?

The analysis demonstrates that already before the year 1500 AD, Church exposure and its marriage regulations are predictive of the formation of communes—self-governed cities that put constraints on the executive. The difference-in-difference analysis does not reveal pre-trends and results are robust to many specifications. They hold within historic political entities addressing concerns that the relation is driven by other institutional factors and when exploiting quasi-natural experiments where Church exposure was determined by the random outcomes of medieval warfare. Moreover, exploiting regional and temporal variation in marriage regulations suggests that the dissolution of kin networks was decisive for the formation of communes.

The study also empirically establishes a robust link between Church exposure and dissolution of extended kin networks at the country, ethnicity and European regional level. A language-based proxy for cousin marriage—cousin terms—offers a window into the past and rules out that the dissolution was driven by more recent events like the Industrial Revolution or modernization. Moreover, the study reports a robust link between kin networks, civicness and inclusive institutions. The link between kin networks and civicness holds within countries and—getting closer to causality—among children of immigrants, who grew up in the same country but vary in their vertically transmitted preference for cousin marriage. Kin networks predict regional institutional failure within Italy, ethnicities’ local-level democratic traditions and modern-day democratic institutions at the country level. Measures for the strength of pre-industrial kin networks rule out contemporary reverse causality or the possibility that the estimates are driven by contemporary omitted variables. The analysis also demonstrates that the association between kin networks and the formation of inclusive institutions holds universally—both within Europe and when excluding Europe and countries with a large European ancestry. This universal link strengthens the hypothesis that the Church’s marriage regulations, and not some other Church-related factor, were decisive for European development.

Underlying these early institutional developments was most likely a psychology that, as a consequence of dissolved kin networks, reflects greater individualism and a more generalized, impartial morality (Schulz et al. 2018). This is a building block not only for inclusive institutions but also for economic development more generally. For example, transmission of knowledge across kin networks and the shift away from a collectivistic culture toward an individualistic one, a culture of growth, may have further contributed to Europe’s economic development (Mokyr, 2016; de la Croix, 2018).

…To build strong, functional, inclusive institutions and to foster democracy, the potentially deleterious effect of dense kin networks must be considered. Also, simply exporting established formal institutions to other societies without considering existing kin networks will likely fail. Policies that foster cooperation beyond the boundaries of one’s kin group, however, have a strong potential to successfully diminish the fractionalization of societies. These can be policies that encourage marriages across kin groups. More generally, policies that foster interactions that go beyond the boundaries of in-groups such as family, close friends, social class, political affiliation or ethnicity are likely to increase social cohesion (pg. 41-42).