Tell Your Children When You’re Wrong

Man kisses baby he’s holding. Photo by @kellysikkema via Unsplash.

In one of the most poignant scenes from The Adam Project, Ryan Reynolds’s character meets Jennifer Garner’s character by chance in a bar. Garner plays Reynolds’s mom, but she doesn’t recognize him because he has time-traveled into the past. He overhears her talking to the bartender about the difficulties she’s having with her son (Reynolds’s younger self) in the wake of her husband’s death. Reynolds interjects to comfort his mom (which is tricky, since he’s a total stranger, as far as she knows). If this is hard to follow, just watch the clip. It’s less than three minutes long.

The line that really stuck with me was Reynolds telling Garner, “The problem with acting like you have it all together is he believes it.”

When I talked about this scene with Liz and Carl and Ben for a recent episode of Pop Culture on the Apricot Tree, I realized that there’s an important connection between this line and apologizing to our kids. And it’s probably not the reason you’re thinking.

The reason you’re thinking of has to do with demonstrating to our kids how they should react to making a bad choice. This is also a good reason to apologize to your children. Admitting mistakes is hard. Guilt doesn’t feel good, and it takes practice to handle it positively. You don’t want to allow healthy guilt to turn into unhealthy shame (going from “I did bad” to “I am bad”). And you don’t want to allow the pain of guilt to cause you to lash out even more, like a wounded animal defending itself when there’s no threat. So yeah, apologize to your kids for your mistakes so they know how to apologize for theirs.

And, while we’re on the subject, don’t assume that apologizing to your kids will undermine your authority as a parent. It is important for parents to have authority. Your job is to keep your kids safe and teach them. This only works if you’re willing to set rules they don’t want to follow and teach lessons they don’t want to learn. That requires authority. But the authority should come from the fact that you love them and know more than they do. Or, in the words of the Lord to Joseph Smith, authority should only be maintained “by persuasion, by long-suffering, by gentleness and meekness, and by love unfeigned; By kindness, and pure knowledge” (D&C 121:41-42). If your authority comes from kindness and knowledge, then it will never be threatened by apologizing in those cases when you get it wrong.

In fact, this takes us to that second reason. The one I don’t think as many folks have thought of. And it’s this: admitting that you made a mistake is an important way of showing your kids how hard you’re trying and, by extension, demonstrating your love for them.

The only way to never make a mistake is to never push yourself. Only by operating well within your capabilities can you have a flawless record over the long run. Think about an elite performer like an Olympic gymnast or figure skater. They are always chasing perfection, rarely finding it, and that’s in events that are very short (from a few minutes to a few seconds). If you saw a gymnast doing a flawless, 30-minute routine every day for a month you would know that that routine was very, very easy (for them, at least).

You can’t tell if someone makes mistakes because they’re at the limits of their capacity or just careless. But if someone never makes mistakes, then you know that whatever they are doing isn’t much of a challenge.

So what does that tell your kid if you never apologize? In effect, it tells them that you have it all together. That, for you at least, parenting is easy.

That’s not the worst message in the world, but it’s not a great one either. Not only is it setting them up for a world of hurt when they become parents one day, but you’re missing an opportunity to tell them the truth: that parenting is the hardest thing you’ve ever done, and you’re doing it for them.

Don’t take this too far. You don’t want to weigh your kids down with every worry and fear and stress you have. You don’t want to tell them every day how hard your job is. That’s a weight no child should carry. But it’s OK to let them know—sparingly, from time to time—that being a parent is really hard. What better, healthier way to convey that than to frankly admit when you make a mistake?

“The problem with acting like you have it all together is he believes it. Maybe he needs to know that you don’t. It’s OK if you don’t.”

Anti-Provincial Provincialism and Fighting Monsters

Part 1: Anti-American Americanism

I ran across a humorous meme in a Facebook group that got me thinking about anti-provincial provincialism. Well, not the meme, but a response to it. Here’s the original meme:

Now check out this (anonymized) response to it (and my response to them):

What has to happen, I wondered, for someone to assert that English is the most common language “only in the United States”? Well, they have to be operating from a kind of anti-Americanism so potent it has managed to swing all the way back around to being an extreme America-centric view. After all, this person was so eager to take the US down a peg (I am assuming) that they managed to inadvertently erase the entire Anglosphere. The only people who exclude entire categories of countries from consideration are rabid America Firsters and rabid America Lasters. The commonality? They’re both only thinking about America.

It is a strange feature of our times that so many folks seem to become the thing they claim to oppose. The horseshoe theory is having its day.

The conversation got even stranger when someone else showed up to tell me that I’d misread the Wikipedia article that I’d linked. Full disclosure: I did double check this before I linked to it, but I still had an “uh oh” moment when I read their comment. Wouldn’t be the first time I totally misread a table, even when specifically checking my work. Here’s the comment:

Thankfully, dear reader, I did not have to type out the mea culpa I was already composing in my mind. Here’s the data (and a link):

My critic had decided to focus only on the first language (L1) category. The original point about “most commonly spoken” made no such distinction. So why rely on it? Same reason, I surmise, as the “only in the US” line of argument: to reflexively oppose anything with the appearance of American jingoism.

Because we can all see that’s the subtext here, right? To claim that English is the “most common language” when it is also the language (most) Americans speak is to appear to be making some sort of rah-rah ’Murica statement. Except that… what happens if it’s just objectively true?

And it is objectively true. English has the greatest number of total speakers in the world by a wide margin. Even more tellingly, English L2 (second-language) speakers outnumber Chinese L2 speakers by more than 5-to-1. This means that when someone chooses a language to study, they pick English 5 times more often than Chinese. No matter how you slice it, the fact that English is the most common language is just a fact about the reality we currently inhabit.

Not only that, but the connection of this fact to American chauvinism is historically ignorant. Not only is this a discussion about the English language and not the American one, but the linguistic prevalence of English predates the rise of America as a great power. If you think millions of Indians conduct business in English because of America then you need to open a history book. The Brits started this state of affairs back when the sun really did never set on their empire. We just inherited it.

I wonder if there’s something about opposing something thoughtlessly that causes you to eventually, ultimately, become that thing. Maybe Nietzsche’s aphorism doesn’t just sound cool. Maybe there’s really something to it.

This image doesn’t include enough of the quote, which is: “Beware that, when fighting monsters, you yourself do not become a monster… for when you gaze long into the abyss, the abyss gazes also into you.” But it’s cute. Original from Instagram.

Part 2: Anti-Provincial Provincialism

My dad taught me the phrase “anti-provincial provincialism” when I was a kid. We were talking about the tendency of some Latter-day Saint academics to over-correct for the provincialism of their less-educated Latter-day Saint community and in the process recreate a variety of the provincialism they were running away from. Let me fill this in a bit.

First, a lot of Latter-day Saints can be provincial.

This shouldn’t shock anyone. Latter-day Saint culture is tight-knit and uniform. For teenagers when I was growing up, you had:

  • Three hours of Church on Sunday
  • About an hour of early-morning seminary in the church building before school Monday – Friday
  • Some kind of 1-2 hour youth activity in the church building on Wednesday evenings

This is plenty of time to tightly assimilate and indoctrinate the rising generation, and for the most part this is a good thing. I am a strong believer in liberalism, which sort of secularizes the public square to accommodate different religious traditions. This secularization isn’t anti-religious; it is what enables those religions to thrive by carving out their own spaces to flourish. State religions have a lot of power, but this makes them corrupt and anemic in terms of real devotion. Pluralism is good for all traditions.

But a consequence of the tight-knit culture is that Latter-day Saints can grow up unable to clearly differentiate between general cultural touchstones (Latter-day Saints love Disney and The Princess Bride, but so do lots of people) and unique cultural touchstones (like Saturday’s Warrior and Johnny Lingo).

We have all kinds of arcane knowledge that nobody outside our culture knows or cares about, especially around serving two-year missions. Latter-day Saints know what the MTC is (even if they mishear it as “empty sea” when they’re little, like I did) and can recount their parents’ and relatives’ favorite mission stories. They also employ some theological terms in ways that non-LDS (even non-LDS Christians) would find strange.

And the thing is: if nobody tells you, then you never learn which things are things everyone knows and which things are part of your strange little religious community alone. Once, when I was in elementary school, I called my friend on the phone and his mom picked up. I addressed her as “Sister Apple” because Apple was their last name and because at that point in my life the only adults I talked to were family, teachers, or in my church. Since she wasn’t family or a teacher, I defaulted to addressing her as I was taught to address the adults in my church.

As I remember it today, her reaction was quite frosty. Maybe she thought I was in a cult. Maybe I’d accidentally raised the specter of the extremely dark history of Christians imposing their faith on Jews (my friend’s family was Jewish). Maybe I am misremembering. All I know for sure is I felt deeply awkward, apologized profusely, tried to explain, and then never made that mistake ever again. Not with her, not with anyone else.

I had these kinds of experiences–experiences that taught me to see clearly the boundaries between Mormon culture and other cultures–not only because I grew up in Virginia but also because (for various reasons) I didn’t get along very well with my LDS peer group for most of my teen years. I had very few close LDS friends from the time that I was about 12 until I was in my 30s. Lots of LDS folks, even those who grew up outside of Utah, didn’t have these kinds of experiences. Or had fewer of them.

So here’s the dynamic you can run into: a Latter-day Saint without this kind of awareness trips over some of the (to them) invisible boundaries between Mormon culture and the surrounding culture. If they do this in front of another Latter-day Saint who does know, then the one who’s in the know has a tendency to cringe.

This is where you get provincialism (the Latter-day Saint who doesn’t know any better) and anti-provincial provincialism (the Latter-day Saint who is too invested in knowing better). After all, why should one Latter-day Saint feel so threatened by a social faux pas of another Latter-day Saint unless they are really invested in that group identity?

My dad was frustrated, at the time, with Latter-day Saint intellectuals who liked to discount their own culture and faith. They were too eager to write off Mormon art or culture or research that was amenable to faithful LDS views. They thought they were being anti-provincial. They thought they were acting like the people around them, outgrowing their culture. But the fact is that their fear of being seen as or identified with Mormonism made them just as obsessed with Mormonism as the most provincial Mormon around. And twice as annoying.

Part 3: Beyond Anti

Although I should have known better, given what my parents taught me growing up, I became one of those anti-provincial provincials for a while. I had a chip on my shoulder about so-called “Utah Mormons”. I felt that the Latter-day Saints in Utah looked down on us out in the “mission field,” so I turned the perceived slight into a badge of honor. Yeah maybe this was the mission field, and if so that meant we out here doing the work were better than Utah Mormons. We had more challenges to overcome, couldn’t be lazy about our faith, etc.

And so, like an anti-Americanist who becomes an Americanist, I became an anti-provincial provincialist. I carried that chip on my shoulder into my own mission where, finally meeting a lot of Utah Mormons on neutral territory, I got over myself. Some of them were great. Some of them were annoying. They were just folk. There are pros and cons to living in a religious majority or a minority. I still prefer living where I’m in the minority, but I’m no longer smug about it. It’s just a personal preference. There are tradeoffs.

One of the three or four ideas that’s had the most lasting impact on my life is the idea that there are fundamentally only two human motivations. Love, attraction, or desire on the one hand. Fear, avoidance, or aversion on the other.

Why is it that fighting with monsters turns you into a monster? I suspect the lesson is that how and why you fight your battles is as important as what battles you choose to fight. I wrote a Twitter thread about this on Saturday, contrasting tribal reasons for adhering to a religion and genuine conversion. The thread starts here, but here’s the relevant Tweet:

If you’re concerned about American jingoism: OK. That’s a valid concern. But there are two ways you can stand against it. In fear of the thing. Or out of love for something else. Choose carefully. Because if you’re motivated by fear, then you will–in the end–become the thing your fear motivates you to fight against. You will try to fight fire with fire, and then you will become the fire.

If you’re concerned about Mormon provincialism: OK. There are valid concerns. Being able to see outside your culture and build bridges with other cultures is a good thing. But, here again, you have to ask if you’re more afraid of provincialism or more in love with building bridges. Because if you’re afraid of provincialism, well… that’s how you get anti-provincial provincialism. And no bridges, by the way.

I might rewrite my pinned Tweet one day.

It’s not two truths. It’s just one. You want something? Fight for it. Fighting against things only gets you nothing in the end.
2 Timothy 1:7, KJV

Acceptance, Humility, and Goal Setting

In theory, setting goals should be simple. Start with where you are by measuring your current performance. Decide where you want to be. Then map out a series of goals that will get you from Point A to Point B with incremental improvements. Despite the simple theory, my lived reality was a nightmare of failure and self-reproach.

It took decades and ultimately I fell into the solution more by accident than by design, but I’m at a point now where setting goals feels easy and empowering. Since I know a lot of people struggle with this—and maybe some even struggle in the same ways that I did—I thought I’d write up my thoughts to make the path a little easier for the next person.

The first thing I had to learn was humility.

For most of my life, I set goals by imagining where I should be and then making that my goal. I utterly refused to begin with where I currently was. I refused to even contemplate it. Why? Because I was proud, and pride always hides insecurity. I could accept that I wasn’t where I should be in a fuzzy, abstract sense. That was the point of setting the goal in the first place. But to actually measure my current achievement and then use that as the starting point? That required me to look in the mirror head-on, and I couldn’t bear to do it. I was too afraid of what I’d see. My sense of worth depended on my achievement, and so if my achievements were not what they should be then I wasn’t what I should be. The first step of goal setting literally felt like an existential threat.

This was particularly true because goal-setting is an integral part of Latter-day Saint culture. Or theology. The line is fuzzy. Either way, the worst experiences I had with goal setting were on my mission. The stakes felt incredibly high. I was the kind of LDS kid who always wanted to go on a mission. I grew up on stories of my dad’s mission and other mission stories. And I knew that serving a mission was this singular opportunity to prove myself. So on top of the general religious pressure there was this additional pressure of “now or never”. If I wasn’t a good missionary, I’d regret it for the rest of my life. Maybe for all eternity. No pressure.

So when we had religious leaders come and say things like, “If you all contact at least 10 people a day, baptisms in the mission will double,” I was primed for an entirely dysfunctional and traumatizing experience. Which is exactly what I got.

(I’m going to set aside the whole metrics-approach-to-missionary-work conversation, because that’s a topic unto itself.)

I tried to be obedient. I set the goal. And I recorded my results. I think I maybe hit 10 people a day once. Certainly never for a whole week straight (we planned weekly). When I failed to hit the goal, I prayed and I fasted and I agonized and I beat myself up. But I never once—not once!—just started with my total from last week and tried to do an incremental improvement on that. That would mean accepting that the numbers from last week were not some transitory fluke. They were my actual current status. I couldn’t bear to face that.

This is one of life’s weird little contradictions. To improve yourself—in other words, to change away from who you currently are—the first step is to accept who you currently are. Accepting reality—not accepting that it’s good enough, but just that it is—will paradoxically make it much easier to change reality.

Another one of life’s little paradoxes is that pride breeds insecurity and humility breeds confidence. Or maybe it’s the other way around. When I was a missionary at age 19-21, I still didn’t really believe that I was a child of God. That I had divine worth. That God loved me because I was me. I thought I had to earn worth. I didn’t realize it was a gift. That insecurity fueled pride as a coping mechanism. I wanted to be loved and valuable. I thought those things depended on being good enough. So I had to think of myself as good enough.

But once I really accepted that my value is innate—as it is for all of us—that confidence meant I no longer had to keep up an appearance of competence and accomplishment for the sake of protecting my ego. Because I had no ego left to protect.

Well, not no ego. It’s a process. I’m not perfectly humble yet. But I’m getting there, and that confidence/humility has made it possible for me to just look in the mirror and accept where I am today.

The second thing is a lot simpler: keeping records.

The whole goal setting process as I’ve outlined it depends on being able to measure something. This is why I say I sort of stumbled into all of this by accident. The first time I really set good goals and stuck to them was when I was training for my first half marathon. Because it wasn’t a religious goal, the stakes were a lot lower, so I didn’t have a problem accepting my starting point.

More importantly, I was in the habit of logging my miles for every run. So I could look back for a few weeks (I’d just started) and easily see how many miles per week I’d been running. This made my starting point unambiguous.

Next, I looked up some training plans. These showed how many miles per week I should be running before the half marathon. They also had weekly plans, but to make the goals personal to me, I modified a weekly plan so that it started with my current miles-per-week and ramped up gradually, over the time I had, to the goal miles-per-week.
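In case it helps to see the arithmetic, here’s a minimal sketch (in Python, with made-up numbers) of what that kind of road map amounts to: take your measured starting point, take the target, and interpolate week by week.

```python
# A toy version of the ramp-up plan: linear interpolation from the
# measured starting point to the goal. All numbers are hypothetical.

def build_ramp(current: float, goal: float, weeks: int) -> list[float]:
    """Return a weekly mileage target for each week, rising linearly."""
    step = (goal - current) / weeks
    return [round(current + step * (week + 1), 1) for week in range(weeks)]

# E.g., running 10 miles/week now, aiming for 25/week, with 12 weeks to go:
for week, miles in enumerate(build_ramp(10.0, 25.0, 12), start=1):
    print(f"Week {week:2d}: {miles} miles")
```

The same three inputs (where I am, where I want to be, how long I have) are all the short-story plan and the Book of Mormon road map needed, too.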

I didn’t recognize this breakthrough for what it was at the time, but—without even thinking about it—I started doing a similar thing with other goals. I tracked my words when I wrote, for example, and that was how I had some confidence that I could do the Great Challenge. I signed up to write a short story every week for 52 weeks in a year because I knew I’d been averaging about 20,000 words / month, which was within range of 4-5 short stories / month.

And then, just a month ago, I plotted out how long I wanted it to take for me to get through the Book of Mormon attribution project I’m working on. I’ve started that project at least 3 or 4 times, but never gotten through 2 Nephi, mostly because I didn’t have a road map. So I built a road map, using exactly the same method that I used for the running plan and the short stories plan.

And that was when I finally realized I was doing what I’d wanted to do my whole life: setting goals. Setting achievable goals. And then achieving them. Here’s my chart for the Book of Mormon attribution project, by the way, showing that I’m basically on track about 2 months in.

The y-axis is characters, if you’re curious. There are about 1.4m characters in the BoM.

I hope this helps someone out there. It’s pretty basic, but it’s what I needed to learn to be able to start setting goals. And setting goals has enabled me to do two things that I really care about. First, it lets me take on bigger projects and improve myself more / faster than I had in the past. Second, and probably more importantly, it quiets the demon of self-doubt. Like I said: I’m not totally recovered from that “I’m only worth what I accomplish” mental trap. When I look at my goals and see that I’m making consistent progress towards things I care about, it reminds me what really matters. Which is the journey.

I guess that’s the last of life’s little paradoxes for this blog post: you have to care about pursuing your goals to live a meaningful, fulfilling life. But you don’t actually have to accomplish anything. Accomplishment is usually outside your control. It’s always subject to fortune. The output is the destination. It doesn’t really matter.

But you still have to really try to get that output. Really make a journey towards that destination. Because that’s the input. That’s what you put in. That does depend on you.

I believe that doing my best to put everything into my life is what will make my life fulfilling. What I get out of it? Well, after I’ve done all I can, then who cares how it actually works out. That’s in God’s hands.

I Want to be Right

Not long ago I was in a Facebook debate and my interlocutor accused me of just wanting to be right. 

Interesting accusation.

Of course I want to be right. Why else would we be having this argument? But, you see, he wasn’t accusing me of wanting to be right but of wanting to appear right. Those are two very different things. One of them is just about the best reason for debate and argument you can have. The other is just about the worst. 

Anyone who has spent a lot of time arguing on the Internet has asked themselves what the point of it all is. The most prominent theory is the spectator theory: you will never convince your opponent, but you might convince the folks watching. There’s merit to that, but it also rests on a questionable assumption, which is that the default purpose is to win the argument by persuading the other person and (when that fails) we need to find some alternative. OK, but I question whether we’ve landed on the right alternative.

I don’t think the primary importance of a debate is persuading spectators. The most important person for you to persuade in a debate is yourself.

It’s a truism these days that nobody changes their mind, and we all like to one-up each other with increasingly cynical takes on human irrationality and intractability. The list of cognitive biases on Wikipedia is getting so long that you start to wonder how humans manage to reason at all. Moral relativism and radical non-judgmentalism are grist for yet more “you won’t believe this” headlines, and of course there’s the holy grail of misanthropic cynicism: the argumentative theory. As Haidt summarizes one scholarly article on it:

Reasoning was not designed to pursue the truth. Reasoning was designed by evolution to help us win arguments. That’s why they call it The Argumentative Theory of Reasoning. So, as they put it, “The evidence reviewed here shows not only that reasoning falls quite short of reliably delivering rational beliefs and rational decisions. It may even be, in a variety of cases, detrimental to rationality. Reasoning can lead to poor outcomes, not because humans are bad at it, but because they systematically strive for arguments that justify their beliefs or their actions. This explains the confirmation bias, motivated reasoning, and reason-based choice, among other things.”

Jonathan Haidt in “The Righteous Mind”

Reasoning was not designed to pursue truth.

Well, there you have it. Might as well just admit that Randall Munroe was right and all pack it in, then, right?

Not so fast.

This whole line of research has run away with itself. We’ve sped right past the point of dispassionate analysis and deep into sensationalization territory. Case in point: the backfire effect. 

According to RationalWiki “the effect is claimed to be that when, in the face of contradictory evidence, established beliefs do not change but actually get stronger.” The article goes on:

The backfire effect is an effect that was originally proposed by Brendan Nyhan and Jason Reifler in 2010 based on their research of a single survey item among conservatives… The effect was subsequently confirmed by other studies.

Entry on RationalWiki

If you’ve heard of it, it might be from a popular post by The Oatmeal. Take a minute to check it out. (I even linked to the clean version without all the profanity.)

Click the image to read the whole thing.

Wow. Humans are so irrational that not only can you not convince them with facts, but if you present facts they believe the wrong stuff even more.

Of course, it’s not really “humans” that are this bad at reasoning. It’s some humans. The original research was based on conservatives and the implicit subtext behind articles like the one on RationalWiki is that they are helplessly mired in irrational biases but we know how to conquer our biases, or at the very least make some small headway that separates us from the inferior masses. (Failing that, at least we’re raising awareness!) But I digress.

The important thing isn’t that this cynicism is always covertly at least a little one-sided; it’s that the original study has been really hard to replicate. From an article on Mashable:

[W]hat you should keep in mind while reading the cartoon is that the backfire effect can be hard to replicate in rigorous research. So hard, in fact, that a large-scale, peer-reviewed study presented last August at the American Political Science Association’s annual conference couldn’t reproduce the findings of the high-profile 2010 study that documented backfire effect.

Uh oh. Looks like the replication crisis–which has been just one part of the larger we-can’t-really-know-anything fad–has turned to bite the hand that feeds it. 

This whole post (the one I’m writing right now) is a bit weird for me, because when I started blogging my central focus was epistemic humility. And it’s still my driving concern. If I have a philosophical core, that’s it. And epistemic humility is all about the limits of what we (individually and collectively) can know. So, I never pictured myself being the one standing up and saying, “Hey, guys, you’ve taken this epistemic humility thing too far.” 

But that’s exactly what I’m saying.

Epistemic humility was never supposed to be a kind of “we can never know the truth for absolute certain so may as well give up” fatalism. Not for me, anyway. It was supposed to be about being humble in our pursuit of truth. Not in saying that the pursuit was doomed to fail so why bother trying.

I think even a lot of the doomsayers would agree with that. I quoted Jonathan Haidt on the argumentative theory earlier, and he’s one of my favorite writers. I’m pretty sure he’s not an epistemological nihilist. RationalWiki may get a little carried away with stuff like the backfire effect (they gave no notice on their site that other studies have failed to replicate the effect), but evidently they think there’s some benefit to telling people about it. Else, why bother having a wiki at all?

Taken to its extreme, epistemic humility is just as self-defeating as subjectivism. Subjectivism–the idea that truth is ultimately relative–is incoherent because if you say “all truth is relative” you’ve just made an objective claim. That’s the short version. For the longer version, read Thomas Nagel’s The Last Word.

The same goes for all this breathless humans-are-incapable-of-changing-their-minds stuff. Nobody who does all the hard work of researching and writing and teaching can honestly believe that in their bones. At least, not if you think (as I do) that a person’s actions are the best measure of their actual beliefs, rather than their own (unreliable) self-assessments.

Here’s the thing: if you agree with the basic contours of epistemic humility–with most of the cognitive biases and even the argumentative hypothesis–you end up at a place where you think human belief is a reward-based activity like any other. We are not truth-seeking machines that automatically and objectively crunch sensory data to manufacture beliefs that are as true as possible given the input. Instead, we have instrumental beliefs. Beliefs that serve a purpose. A lot of the time that purpose is “make me feel good,” as in “rationalize what I want to do already” or “help me fit in with this social clique.”

I know all this stuff, and my reaction is: so what?

So what if human belief is instrumental? Because you know what, you can choose to evaluate your beliefs by things like “does it match the evidence?” or “is it coherent with my other beliefs?” Even if all belief is ultimately instrumental, we still have the freedom to choose to make truth the metric of our beliefs. (Or, since we don’t have access to truth, surrogates like “conformance with evidence” and “logical consistency”.)

Now, this doesn’t make all those cognitive biases just go away. This doesn’t disprove the argumentative theory. Let’s say it’s true. Let’s say we evolved the capacity to reason to make convincing (rather than true) arguments. OK. Again I ask: so what? Who cares why we evolved the capacity? Now that we have it, we get to decide what to do with it. I’m pretty sure we did not evolve opposable thumbs for the purpose of texting on touch-screen phones. Yet here we are and they seem adequate to the task.

What I’m saying is this: epistemic humility and the associated body of research tell us that humans don’t have to conform their beliefs to truth, that we are incapable of conforming our beliefs perfectly to truth, and that it’s hard to conform our beliefs even mostly to truth. OK. But nowhere is it written that we can make no progress at all. Nowhere is it written that we cannot try or that–when we try earnestly–we are doomed to make absolutely no headway at all.

I want to be right. And I’m not apologizing for that. 

So how do Internet arguments come into this? One way that we become right–individually and collectively–is by fighting over things. It’s pretty similar to the theory behind our adversarial criminal justice system. Folks who grow up in common law countries (of which the US is one) might not realize that’s not the way all criminal justice systems work. The other major alternative is the inquisitorial system (which is used in countries like France and Italy).

In an inquisitorial system, the court is the one that conducts the investigation. In an adversarial system the court is supposed to be neutral territory where two opposing camps–the prosecution and the defense–lay out their case. That’s where the “adversarial” part comes in: the prosecutors and defenders are the adversaries. In theory, the truth arises from the conflict between the two sides. The court establishes rules of fair play (sharing evidence, not lying) and–within those bounds–the prosecutors’ and defenders’ job is not to present the truest argument but the best argument for their respective side. 

The analogy is not a perfect one, of course. For one thing, we also have a presumption of innocence in the criminal justice system because we’re not evaluating ideas, we’re evaluating people. That presumption of innocence is crucial in a real criminal justice system, but it has no exact analogue in the court of ideas.

For another thing, we have a judge to oversee trials and enforce the rules. There’s no impartial judge when you have a debate with randos on the Internet. This is unfortunate, because it means that if we don’t police ourselves in our debates, then the whole process breaks down. There is no recourse.

When I say I want to be right, what am I saying, in this context? I’m saying that I want to know more at the end of a debate than I did at the start. That’s the goal. 

People like to say you never change anyone’s mind in a debate. What they really mean is that you never reverse someone’s mind in a debate. And, while that’s not literally true, it’s pretty close. It’s really, really rare for someone to go into a single debate as pro-life (or whatever) and come out as pro-choice (or whatever). I have never seen someone make a swing that dramatic in a single debate. I certainly never have.

But it would be absurd to say that I never “changed my mind” because of the debates I’ve had about abortion. I’ve changed my mind hundreds of times. I’ve abandoned bad arguments and adopted or invented new ones. I’ve learned all kinds of facts about law and history and biology that I didn’t know before. I’ve even changed my position many times. Just because the positions were different variations within the theme of pro-life doesn’t mean I’ve never “changed my mind”. If you expect people to walk in with one big, complex set of ideas that are roughly aligned with a position (pro-life, pro-gun) and then walk out of a single conversation with a whole new set of ideas that are aligned under the opposite position (pro-choice, anti-gun), then you’re setting that bar way too high.

But all of this only works if the folks having the argument follow the rules. And–without a judge to enforce them–that’s hard.

This is where the other kind of wanting to “be right” comes in. One of the most common things I see in a debate (whether I’m having it or not) is that folks want to avoid having to admit they were wrong.

First, let me state emphatically that if you want to avoid admitting you were wrong you don’t actually care about being right in the sense that I mean it. Learning where you are wrong is just about the only way to become right! People who really want to “be right” embrace being wrong every time it happens because those are the stepping stones to truth. Every time you learn a belief or a position you took was wrong, you’re taking a step closer to being right.

But–going back to those folks who want to avoid appearing wrong–they don’t actually want to be right. They just want to appear right. They’re not worried about truth. They’re worried about prestige. Or ego. Or something else.

If you don’t care about being right and you only care about appearing right, then you don’t care about truth either. And these folks are toxic to the whole project of adversarial truth-seeking. Because they break the rules. 

What are the rules? Basic stuff like don’t lie, debate the issue not the person, etc. Maybe I’ll come up with a list. There’s a whole set of behaviors that can make your argument appear stronger while in fact all you’re doing is peeing in the pool for everyone who cares about truth. 

If you care about being right, then you will give your side of the debate your utmost. You’ll present the best evidence, use the tightest arguments, and throw in some rhetorical flourishes for good measure. But if you care about being right, then you will not break the rules to advance your argument (No lying!) and you also won’t just abandon your argument in midstream to switch to a new one that seems more promising. Anyone who does that–who swaps their claims mid-stream whenever they see one that shows a more promising temporary advantage–isn’t actually trying to be right. They’re trying to appear right. 

They’re not having an argument or a debate. They’re fighting for prestige or protecting their ego or doing something else that looks like an argument but isn’t actually one. 

I wrote this partially to vent. Partially to organize my feelings. But also to encourage folks not to give up hope, because if you believe that nobody cares about truth and changing minds is impossible then it becomes a self-fulfilling prophecy.

And you want to know the real danger of relativism and post-modernism and any other truth-averse ideology? Once truth is off the table as the goal, the only thing remaining is power.

As long as people believe in truth, there is a fundamentally cooperative aspect to all arguments. Even if you passionately think someone is wrong, if you both believe in truth then there is a sense in which you’re playing the same game. There are rules. And, more than rules, there’s a common last resort you’re both appealing to. No matter how messy it gets and despite the fact that nobody ever has direct, flawless access to truth, even the bitterest ideological opponents have that shred of common ground: they both think they are right, which means they both think “being right” is a thing you can, and should, strive to be.

But if you set that aside, then you sever the last thread between opponents and become nothing but enemies. If truth is not a viable recourse, all that is left is power. You have to destroy your opponent. Metaphorically at first. Literally if that fails. Nowhere does it say on the packaging of relativism “May lead to animosity and violence”. It’s supposed to do the opposite. It’s advertised as leading to tolerance and non-judgmentalism, but by taking truth off the table it does the opposite.

Humans are going to disagree. That’s inevitable. We will come into conflict. With truth as an option, there is no guarantee that the conflict will be non-violent, but non-violence is always an option. The conflict can even exist in an environment of friendship, respect, and love. It’s possible for people who like and admire each other to have deep disagreements and to discuss them sharply but in a context of that mutual friendship. It’s not easy, but it’s possible.

Take truth off the table, and that option disappears. This doesn’t mean we go straight from relativism to mutual annihilation, but it does mean the only thing left is radical partisanship where each side views the other as an alien “other”. Maybe that leads to violence, maybe not. But it can’t lead to friendship, love, and unity in the midst of disagreement.

So I’ll say it one more time: I want to be right.

I hope you do, too.

If that’s the case, then there’s a good chance we’ll get into some thundering arguments. We’ll say things we regret and offend each other. Nobody is a perfect, rational machine. Biases don’t go away and ego doesn’t disappear just because we are searching for truth. So we’ll make mistakes and, hopefully, we’ll also apologize and find common ground. We’ll change each other’s minds and teach each other things and grudgingly earn each other’s respect. Maybe we’ll learn to be friends long before we ever agree on anything.

Because if I care about being right and you care about being right, then we already have something deep inside of us that’s the same. And even if we disagree about every single other thing, we always will.

How and Why to Rate Books and Things

Here’s the image that inspired this post:


Now, there’s an awful lot of political catnip in that post, but I’m actually going to ignore it. So, if you want to hate on Captain Marvel or defend Captain Marvel: this is not the post for you. I want to talk about an apolitical disagreement I have with this perspective.

The underlying idea of this argument is that you should rate a movie based on how good or bad it is in some objective, cosmic sense. Or at least based on something other than how you felt about the movie. In this particular case, you should rate the movie based on some political ideal or in such a way as to promote the common good. Or something. No, you shouldn’t. All of these approaches are bad ideas.

That's not how this works

The correct way to rate a movie–or a book, or a restaurant, etc.–is to just give the rating that best reflects how much joy it brought you. That’s it!

Let’s see if I can convince you.

To begin with, I’m not saying that such a thing as objective quality doesn’t exist. I think it probably does. No one can really tell where subjective taste ends and objective quality begins, but I’m pretty sure that “chocolate or vanilla” is a matter of purely personal preference but “gives you food poisoning or does not” is a matter of objective quality.

So I’m not trying to tell you that you should use your subjective reactions because that’s all there is to go on. I think it’s quite possible to watch a movie and think to yourself, “This wasn’t for me because I don’t like period romances (personal taste), but I can recognize that the script, directing, and acting were all excellent (objective quality) so I’m going to give it 5-stars.”

It’s possible. A lot of people even think there’s some ethical obligation to do just that. As though personal preferences and biases were always something to hide and be ashamed of. None of that is true.

The superficial reason I think it’s a bad idea has to do with what I think ratings are for. The purpose of a rating–and by a rating I mean a single, numeric score that you give to a movie or a book, like 8 out of 10 or 5 stars–is to help other people find works that they will enjoy and avoid works that they won’t enjoy. Or, since ratings can be used that way too, to help people specifically look for works that will challenge them and that they might not like, and maybe pass up a book that will be too familiar. You can do all kinds of things with ratings. But only if the ratings are simple and honest. Only if the ratings encode good data.

The ideal scenario is a bunch of people leaving simple, numeric ratings for a bunch of works. This isn’t Utopia, it’s Goodreads. (Or any of a number of similar sites.) What you can then do is load up your list of works that you’ve liked / disliked / not cared about and find other people out there who have similar tastes. They’ve liked a lot of the books you’ve liked, they’ve disliked a lot of the books you’ve disliked, and they’ve felt meh about a lot of the books you’ve felt meh about. Now, if one of those people has read a book you haven’t read and they gave it 5-stars: BAM! You’ve quite possibly found your next great read.

You can do this manually yourself. In fact, it’s what all of us instinctively do when we start talking to people about movies. We compare notes. If we have a lot in common, we ask that person for recommendations. It’s what we do in face-to-face interactions. When we use big data sets and machine learning algorithms to automate the process, we call them recommender systems. (What I’m describing is the collaborative filtering approach as opposed to content-based filtering, which also has its place.)
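Here’s a toy sketch of that user-based collaborative filtering idea in Python. Every name and rating is invented, and real systems use far more robust similarity measures, but it shows the basic mechanic: find your rating twin, then borrow their favorites.

```python
# Toy user-based collaborative filtering: find the rater most similar
# to "me", then surface their top-rated books I haven't read yet.
from math import sqrt

ratings = {
    "me":    {"Dune": 5, "Gone Girl": 2, "Hyperion": 5},
    "alice": {"Dune": 5, "Gone Girl": 1, "Hyperion": 4, "Blindsight": 5},
    "bob":   {"Dune": 2, "Gone Girl": 5, "Hyperion": 1, "Big Sleep": 5},
}

def similarity(a: dict, b: dict) -> float:
    """Cosine similarity over the books both users have rated."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    dot = sum(a[x] * b[x] for x in shared)
    norms = sqrt(sum(a[x] ** 2 for x in shared)) * sqrt(sum(b[x] ** 2 for x in shared))
    return dot / norms

me = ratings["me"]
nearest = max((u for u in ratings if u != "me"),
              key=lambda u: similarity(me, ratings[u]))
picks = [book for book, stars in ratings[nearest].items()
         if book not in me and stars == 5]
print(nearest, picks)  # alice ['Blindsight']
```

Honest, subjective ratings are exactly what makes a scheme like this work: the similarity scores only mean something if the numbers encode how people actually felt.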

This matters a lot to me for the simple reason that I don’t like much of what I read. So, it’s kind of a topic that’s near and dear to my heart. 5-star books are rare for me. Most of what I read is probably 3-stars. A lot of it is 1-star or 2-star. In a sea of entertainment, I’m thirsty. I don’t have any show that I enjoy watching right now. I’m reading a few really solid series, but they come out at a rate of 1 or 2 books a year, and I read more like 120 books a year. The promise of really deep collaborative filtering is really appealing if it means I can find more of what I’d find valuable.

But if you try to be a good citizen and rate books based on what you think their objective quality is, the whole system breaks down.

Imagine a bunch of sci-fi fans and a bunch of mystery fans who each read a mix of both genres. The sci-fi fans enjoy the sci-fi books more (and the mystery fans enjoy the mystery books more), but they try to be objective in their ratings. The result of this is that the two groups disappear from the data. You can no longer go in and find the group that aligns with your interests and then weight their recommendations more heavily. Instead of having one clear population that gives high marks to the sci-fi stuff and another that gives high marks to the mystery stuff, you just have one amorphous group that gives high (or maybe medium) marks to everything.

How is this helpful? It is not. Not as much as it could be, anyway.

In theoretical terms, you have to understand that your subjective reaction to a work is complex. It incorporates the objective quality of the work, your subjective taste, and then an entire universe of random chance. Maybe you were angry going into the theater, and so the comedy didn’t work for you the way it would normally have worked. Maybe you just found out you got a raise, and everything was ten times funnier than it might otherwise have been. This is statistical noise, but it’s unbiased noise. This means that it basically goes away if you have a high enough sample.
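A quick simulation shows why (a sketch with invented numbers): model each rating as the work’s “true” score plus zero-mean mood noise, and the average converges on the true score as the sample grows.

```python
# Unbiased noise washes out: the average of many noisy ratings
# approaches the underlying score. All numbers here are made up.
import random

random.seed(42)
TRUE_QUALITY = 3.6

def noisy_rating() -> float:
    # Mood, luck, the day you had: zero-mean noise around the true score.
    return TRUE_QUALITY + random.gauss(0, 1.2)

for n in (10, 100, 10_000):
    average = sum(noisy_rating() for _ in range(n)) / n
    print(f"{n:6d} ratings -> average {average:.2f}")
```

With 10 ratings the average still wobbles noticeably; with 10,000 it typically sits within a few hundredths of 3.6.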

On the other hand, if you try to fish out the objective components of a work from the stew of subjective and circumstantial components, you’re almost guaranteed to get it wrong. You don’t know yourself very well. You don’t know for yourself where your objective assessment ends and your subjective taste begins. You don’t know for yourself what unconscious factors were at play when you read that book at that time of your life. You can’t disentangle the objective from the subjective, and if you try you’re just going to end up introducing error into the equation that is biased. (In the Captain Marvel example above, you’re explicitly introducing political assessments into your judgment of the movie. That’s silly, regardless of whether your politics make you inclined to like it or hate it.)

What does this all mean? It means that it’s not important to rate things objectively (you can’t, and you’ll just mess it up), but it is helpful to rate things frequently. The more people we have rating things in a way that can be sorted and organized, the more use everyone can get from those ratings. In this sense, ratings have positive externalities.

Now, some caveats:

Ratings vs. Reviews

A rating (in my terminology, I don’t claim this is the Absolute True Definition) is a single, numeric score. A review is a mini-essay where you get to explain your rating. The review is the place where you should try to disentangle the objective from the subjective. You’ll still fail, of course, but (1) it won’t dirty the data and (2) your failure to be objective can still be interesting and even illuminating. Reviews–the poor man’s version of criticism–are a different beast, and they play by different rules.

So: don’t think hard about your ratings. Just give a number and move on.

Do think hard about your reviews (if you have time!). Make them thoughtful and introspective and personal.

Misuse of the Data

There is a peril to everyone giving simplistic ratings, which is that publishers (movie studios, book publishers, whatever) will be tempted to try and reverse-engineer guaranteed money makers.

Yeah, that’s a problem, but it’s not like they’re not doing that anyway. The reason that movie studios keep making sequels, reboots, and remakes is that they are already over-relying on ratings. But they don’t rely on Goodreads or Rotten Tomatoes. They rely on money.

This is imperfect, too, given the different timing of digital vs. physical media channels, etc., but the point is that adding your honest ratings to Goodreads isn’t going to make traditional publishing any more likely to try and republish last year’s cult hit. They’re going to do that anyway, and they already have better data (for their purposes) than you can give them.

Ratings vs. Journalism

My advice applies to entertainment. I’m not saying that you should just rate everything without worrying about objectivity. This should go without saying but, just in case, I said it.

You shouldn’t apply this reasoning to journalism because one vital function of journalism for society is to provide a common pool of facts that everyone can then debate about. One reason our society is so sadly warped and full of hatred is that we’ve lost that kind of journalism.

Of course, it’s probably impossible to be perfectly objective. The term may even be meaningless. Human beings do not passively receive input from our senses. Every aspect of learning–from decoding sounds into speech to the way vision works–is an active endeavor that depends on biases and assumptions.

When we say we want journalists to be objective, what we really mean is that (1) we want them to stick to objectively verifiable facts (or at least not do violence to them) and (2) we would like them to embody, insofar as possible, the common biases of the society they’re reporting to. There was a time when we, as Americans, knew that we had certain values in common. I believe that for the most part we still do. We’re suckers for underdogs, we value individualism, we revere hard work, and we are optimistic and energetic. A journalistic establishment that embraces those values is probably one that will serve us well (although I haven’t thought about it that hard, and it still has to follow rule #1 about getting the facts right). That’s bias, but it’s a bias that is positive: a bias towards truth, justice, and the American way.

What we can’t afford, but we unfortunately have to live with, is journalism that takes sides within the boundaries of our society.

Strategic Voting

There are some places other than entertainment where this logic does hold, however, and one of them is voting. One of the problems of American voting is that we use majority-take-all voting, which is like the horse-and-buggy era of voting technology. Majority-take-all voting is probably even worse for us than the 2-party system it fosters, because it encourages strategic voting.

Just like rating Captain Marvel higher or lower because your politics make you want it to succeed or fail, strategic voting is where you vote for the candidate that you think can win rather than the candidate that you actually like the most.

There are alternatives that (mostly) eliminate this problem, the most well-known of which is instant-runoff voting. Instead of voting for just one candidate, you rank the candidates in the order that you prefer them. This means that you can vote for your favorite candidate first even if he or she is a longshot. If they don’t win, no problem. Your vote isn’t thrown away. In essence, it’s automatically moved to your second-favorite candidate. You don’t actually need to have multiple run-off elections. You just vote once with your full list of preferences and then it’s as if you were having a bunch of runoffs.
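If the mechanics are hard to picture, here’s a bare-bones sketch of the counting procedure in Python (the ballots are invented for illustration, and real elections add tie-breaking and other rules):

```python
# Minimal instant-runoff count: each round, every ballot counts for its
# highest-ranked surviving candidate; the last-place candidate is
# eliminated until someone holds a majority of the live ballots.
from collections import Counter

def instant_runoff(ballots: list[list[str]]) -> str:
    """Return the winner. Each ballot is a ranked list, best first."""
    candidates = {name for ballot in ballots for name in ballot}
    while True:
        choices = (next((c for c in ballot if c in candidates), None)
                   for ballot in ballots)
        tallies = Counter(c for c in choices if c is not None)
        leader, votes = tallies.most_common(1)[0]
        if votes * 2 > sum(tallies.values()):   # majority of live ballots
            return leader
        candidates.remove(min(tallies, key=tallies.get))  # drop last place

votes = [["C", "A"], ["C", "B"], ["A", "B"], ["A", "B"], ["B", "A"]]
print(instant_runoff(votes))  # A
```

In the example, C leads on first choices (2 of 5) but has no majority; B is eliminated, B’s ballot transfers to A, and A wins 3-2.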

There are other important reasons why I think it’s better to vote for simple, subjective evaluations of the state of the country instead of trying to figure out who has the best policy choices, but I’ll leave that discussion for another day.

Limitations

The idea of simple, subjective ratings is not a cure-all. As I noted above, it’s not appropriate for all scenarios (like journalism). It’s also not infinitely powerful. The more people you have and the more things they rate (especially when lots of diverse people are rating the same thing), the better. If you have 1,000 people, maybe you can detect who likes what genre. If you have 10,000 people, maybe you can also detect sub-genres. If you have 100,000 people, maybe you can detect sub-genres and other characteristics, like literary style.

But no matter how many people you have, you’re never going to be able to pick up every possible relevant factor in the data because there are too many and we don’t even know what they are. And, even if you could, that still wouldn’t make predictions perfect because people are weird. Our tastes aren’t just a list of items (spaceships: yes, dragons: no). They are interactive. You might really like spaceships in the context of gritty action movies and hate spaceships in your romance movies. And you might be the only person with that tic. (OK, that tic would probably be pretty common, but you can think of others that are less so.)

This is a feature, not a bug. If it were possible to build a perfect recommendation engine, it would also be possible (at least in theory) to build an algorithm to generate optimal content. I can’t think of anything more hideous or dystopian. At least, not as far as artistic content goes.

I’d like a better set of data because I know that there are an awful lot of books out there right now that I would love to read. And I can’t find them. I’d like better guidance.

But I wouldn’t ever want to turn over my reading entirely to a prediction algorithm, no matter how good it is. Or at least, not a deterministic one. I prefer my search algorithms to have some randomness built in, like simulated annealing.

I’d say about 1/3rd of what I read is fiction I expect to like, about 1/3rd is non-fiction I expect to like, and 1/3rd is random stuff. That random stuff is so important. It helps me find stuff that no prediction algorithm could ever help me find.

It also helps the system overall, because it means I’m not trapped in a little clique with other people who are all reading the same books. Reading books outside your comfort zone–and rating them–is a way to build bridges between fandoms.

So, yeah. This approach is limited. And that’s OK. The solution is to periodically shake things up a bit. So those are my rules: read a lot, rate everything you read as simply and subjectively as you can, and make sure that you’re reading some random stuff every now and then to keep yourself out of a rut and to build bridges to people with different tastes than your own.

Stuff I Say at School – Part II: Self-Interested Politicians

This is part of the Stuff I Say at School series.

The Assignment

After listening to [Benjamin] Ginsberg‘s lecture, do you agree with his assessment that politics is all about interests and power?

The Stuff I Said


Kevin Simler and Robin Hanson’s recent book The Elephant in the Brain demonstrates that these underlying desires for power and status inform many of our decisions and behaviors in everyday life. Politicians certainly do not transcend these selfish motives by virtue of their office. I would actually add a subcategory to “status”: moral grandstanding. We want to paint ourselves as “good people” by signaling to others our superior moral quality. This allows us to enjoy the social capital that comes along with the improved reputation. We not only gain status, but we can also think of ourselves as do-gooders: crusaders who fight the good fight. Unsurprisingly, evidence suggests that we have inflated views of our own moral character and that acts of moral outrage are largely self-serving. What’s unfortunate is that social media may be exacerbating moral outrage by making signaling both easier and less costly to the individual.

I think the rise of populism in both America and Europe is a timely example of interests at play. While various elements contribute to the populist mindset, economic insecurity is the water it swims in. And this insecurity has been exploited by politicians of more extreme ideologies across multiple countries. For example, the Great Recession eroded European trust in mainstream political parties: a one percentage point increase in unemployment was associated with a 2 to 4 percentage point increase in the populist vote. A 2016 study looked at the political results of financial crises in Europe from 1870 to 2014 and found that far-right parties were the typical outcome. In America, President Trump made “Make America Great Again” his rallying cry, feeding off the public’s distrust of “the Establishment” during the post-crisis years. In doing so, he advocated protectionism and tighter borders. Oddly enough, you find comparable populist sentiments on the Left: Bernie Sanders has been very anti-trade and iffy on liberalized immigration (open borders is “a Koch Brothers proposal“), all in the name of helping the American worker. One of his former campaign organizers–the newly-elected Congresswoman Ocasio-Cortez–has also expressed similar concerns over trade deals (especially NAFTA). This is why The Economist sees less of a left/right divide today and more of an open/close divide. Skepticism of trade and immigration wrapped in “power to the people” sentiments may be invigorating in rhetoric, but it’s asinine in practice. And it’s doing nothing more than riding the wave of voter anxiety. What’s worse, it’s hiding these politicians’ accumulation of power, attainment of status, and moral self-aggrandizement behind what Ginsberg so aptly calls “the veneer of public spiritedness.”

More Stuff

A classmate asked if I believed that politicians always acted in self-interest or if there were moral lines that some would not cross. In response, I pointed out that Simler and Hanson are largely arguing against what they see as the tendency for people to tiptoe around hidden motives and self-deception. It’s not that we’re motivated only by selfishness; we just tend to gloss over our selfish motives. But they are deeply embedded. Failing to acknowledge them has not only personal consequences, but public ones as well (their chapter on medicine is especially on point). I think we should assess moral motivations through all the means available to us, including life experience and behavior. However, I think a healthy dose of skepticism is necessary. It can certainly help protect us against intentional deception. But perhaps more importantly, it helps protect us against unintentional deception. It’s easy to give more weight to life experience, moral principles, and the like when it’s a politician on “our side,” all while harshly judging those on “the other side” as unscrupulous. Political skepticism or cynicism can aid in keeping our own selfish motives and emotional highs in check. And it can lead us to seek out more information, improve our understanding, and refine our beliefs. Otherwise, we end up being consumed by our own good intentions and moral principles without actually learning how to implement those principles.


My classmate also put forth a hypothetical to get a feel for my position: if legislative districts were redrawn so that legislators now represented districts with a different ideological makeup, how many would change their positions on issues just to stay in power? Personally, I think we would see a fair number of politicians shift their position because it is more advantageous. However, there is considerable evidence that political deliberation with ideological opposites actually backfires. Political philosopher Jason Brennan reviews the evidence in chapter 3 of his book Against Democracy and finds that political deliberation:

  • Undermines cooperation
  • Exacerbates conflict when groups are different sizes
  • Avoids debates about facts and is instead driven by status-seeking and positions of influence
  • Uses language in biased and manipulative ways, usually by painting the opposition as intrinsically bad
  • Avoids controversial topics and sticks to safe subjects
  • Amplifies intellectual biases

There’s more, but that should make my point. So even if some politicians did not flip-flop in their newly-drawn districts, the above list should give us pause before we conclude that their doubling down is proof that they are uninterested in status or moral grandstanding.

I certainly believe that people have moral limits and lines they will not cross. My skepticism (which I prefer to the word cynicism, but I’m fine with interchanging them) is largely about honest self-examination and the examination of others. For example, consider something that is generally of no consequence: Facebook status updates. My Facebook feed is often full of political rants, social commentaries, and cultural critiques. Why do we do that? Why post a political tract as a status? It can’t be because of utility. A single Facebook status isn’t going to fix Washington or shift the course of society. It’s unlikely to persuade the unsaved among your Facebook friends. In fact, it’s probably counterproductive given our tendency for motivated reasoning. When we finally rid ourselves of the high-minded rationales that make next to zero sense, we find that it boils down to signaling: we are signaling our tribe. And that feels good. We get “Likes.” We get our worldview confirmed by others. We gain more social capital as a member of the group. We even get to grandstand morally in the face of that friend or two who hold (obviously) wrong, immoral beliefs. Sure, some of it may be about moral conviction and taking a stand. That certainly sounds and feels better. But I think we will all be better off if we realize what those behaviors are really about: sounding and feeling good. And I think our politics will be better off if we apply a similar lens to it.

And More Stuff

A classmate drew on Dan Ariely’s work to argue that people–including politicians–have a “personal fudge factor”: most people will cheat a little bit without feeling they’ve compromised their sense that they are a “good person.” When people are reminded of moral values (in the case of the experiments, the honor code or 10 Commandments), they don’t cheat, including atheists. So while politicians may compromise their values here and there, they still have a moral sense of self that they are unlikely to violate.

In response, I pointed out that a registered replication report last year was unable to reproduce Ariely’s results. That doesn’t mean his results were wrong, just that we need to be cautious in drawing any strong conclusions from them.

When discussing his priming with the 10 Commandments on pg. 635, Ariely references Shariff and Norenzayan’s well-known 2007 study. It found that people behave more prosocially (in this case, generosity in experimental economic games) when primed with religious concepts. The authors offered a couple of explanations for this. One hypothesis suggested that “the religious prime aroused an imagined presence of supernatural watchers… Generosity in cooperative games has been shown to be sensitive to even minor changes that compromise anonymity and activate reputational concerns” (pg. 807). They then cite studies (which later studies confirm) that found people behaving more prosocially in the presence of eye images. “In sum,” the authors write, “we are suggesting that activation of God concepts, even outside of reflective awareness, matches the input conditions of an agency detector and, as a result, triggers this hyperactive tendency to infer the presence of an intentional watcher. This sense of being watched then activates reputational concerns, undermines the anonymity of the situation, and, as a result, curbs selfish behavior” (pg. 807-808). In short, religious priming makes us think someone upstairs is watching us. This has more to do with being seen as good than with being good.

However, religious priming obviously doesn’t account for the honor code portion. Yet Shariff and Norenzayan’s other explanation is actually quite helpful in this regard: “the activation of perceptual conceptual representations increases the likelihood of goals, plans, and motor behavior consistent with those representations… Irrespective of any attempt to manage their reputations, subjects may have automatically behaved more generously when these concepts were activated, much as subjects are more likely to interrupt a conversation when the trait construct ‘rude’ is primed, or much as university students walk more slowly when the ‘elderly’ stereotype is activated (Bargh et al., 1996)” (pg. 807). Being primed with the “honorable student” stereotype, students were more likely to behave honorably (or honestly).

In short, I think Ariely’s study shows a mix of motivations when it comes to behaving morally: (1) maintaining our self-concept as a good person, (2) fear of being caught and having our reputation (and the benefits that come along with it) damaged, and (3) our susceptibility to outside influence.

My point about moral grandstanding is not that we should interpret all behaviors by politicians through the lens of self-delusion and status seeking. But being aware of it can help us cut through a lot of nonsense and avoid being swept up in collective self-congratulation. To quote Tosi and Warmke, “thinking about grandstanding is a cause for self-reflection, not a call to arms. An argument against grandstanding shouldn’t be used as a cudgel to attack people who say things we dislike. Rather, it’s an encouragement to reassess why and how we speak to one another about moral and political issues. Are we doing good with our moral talk? Or are we trying to convince others that we are good?” And as philosopher David Schmidtz is said to have quipped, if your main goal is to show that your heart is in the right place, then your heart is not in the right place.

Building a Life Story

This post is part of the General Conference Odyssey.

The first time I wrote in my journal was in the days immediately after my baptism when I was 8 years old. I still have the pages somewhere in a box, including the hand-drawn map of the different routes I could take when I walked back and forth from school.

I have started and stopped journals countless times since then because it’s one of those things that, as Elder Groberg reminded us in “Writing Your Personal and Family History,” good Mormons are supposed to do.

As much as I enjoy writing, there’s always been one big thing inhibiting me from keeping a journal more reliably, and it is this: I don’t know what the real story is. This isn’t some weird post-modern hang-up, so much as it is (as far as I can tell) a weird psychological hang-up. I never know how I feel about things. Interrogating my true feelings about the things that are going on in my life is like collecting mist with a butterfly net. I can record the brute facts of my life—I can draw the map and label the streets—but I can’t tell you what those facts mean. Not even, and perhaps most especially, to me.

My inner life is an optical illusion. It is a collection of lines that looks like the inside of a cube one moment or the outside of a cube the next. It is a picture of a rabbit for a blink, and then it is a picture of a duck. It is two faces; it is a chalice. It is an old lady; it is a young woman.

This is why I spend almost no time at all thinking about my past. My friends and family all remember so much more of the things that I’ve been through than I do. For me, the past is like a crime scene, and I am afraid to contaminate the evidence. I have a superstitious belief that there is a true story, an objective reality, and I’m afraid that if I try too hard to find it then I will only erase it.

I have a couple of binders somewhere that contain all the letters that I sent home while I was serving my mission in Hungary and all of the letters that people sent to me. I think the binders were a gift when I got home, but I’m not sure. I’ve never opened them. I’m not sure where they are. I don’t even like to look at the binders, let alone consider reading the pages inside. Because my mission was the one time in my life when I acted like I knew what was going on and when I told everyone how I felt about things, and I’m afraid that it was all lies. It was the hardest time of my young life, and I have vague recollections of writing relentlessly optimistic and happy letters despite feeling so depressed that it felt like physical pain on most days. The whole thing is wildly embarrassing to me. I acted like I knew what was going on. I had no idea. I have lived almost as many years after my mission as I lived before it, and I still have no idea what was going on or why it was so hard for me.

If writing a journal is about writing the real story of my feelings, then I can’t write a journal for the simple reason that I don’t know my own story.

And yet, I should. Write a journal, that is. Like Elder Groberg says, writing a journal “helps immeasurably in gaining a true, eternal perspective of life” and “should be a great motivation to do what is right.” I know that’s accurate: reflecting on my life in writing has helped me put things into perspective.

Maybe that’s the point?

I’m teaching the Old Testament in Gospel Doctrine this year, and it’s a mess. We just made the transition from Joshua to Judges, and I taught about how all the mass slaughter that supposedly happened in Joshua is pretty flatly contradicted by Judges. On the bright side: you don’t have to believe in a genocidal God. On the downside: it’s hard to make sense of all the contradictions. In Deuteronomy, we’re told a Moabite will never enter the assembly of the Lord until the 10th generation. Ruth, the hero of the Book of Ruth, is a Moabite, and that makes King David 1/8th Moabite. And, while we’re on the topic, how do we reconcile the apparent gap between the miracle-laden Exodus story and the miracle-free story of Ruth and Boaz?

The one encouraging thing is that, as I read Elder Groberg’s talk, I realize that the Old Testament is a mess in a lot of the same ways that my own life story is a mess.

There may be one, true, ultimate truth about everything. Not just the objective facts of life, but the subjective ones as well. Maybe there is an absolutely true narrative. But if there is, we will never know it in this life. In this life, stories are things we make up. Fictional stories are based on imaginary facts. And real stories—including history—are made up based on true facts. But they are both made up.

I’m not sure if I have that right or not, but it sounds promising. At the very least, it’s worth a shot. I’m going to try writing in my journal again, and this time I’m not going to try to find a life story. I’m going to use the raw materials of my experiences to build one.

Check out the other posts from the General Conference Odyssey this week and join our Facebook group to follow along!

Speaking on campus and the ctrl-left

Update: the full text of the speech is now online on its own post.

My university is home to a controversial Confederate War memorial.

It is a bronze sculpture of a college student carrying a rifle, commemorating the students at my university who left their studies to fight for the Confederacy in the American Civil War. On the base are three inscriptions; the middle one shows the student in class, hearing the call of a woman who represents duty and urges him to fight. The side inscription speaks of honor and duty.

The statue has always been controversial, but recent events have brought the controversy back.

The university is holding an open panel, inviting the general public to share their thoughts. You just had to register, and the first 25 to sign up get to speak.

Well, I have some thoughts on the monument, and I wanted to share them. So I signed up, and I wrote a speech (exactly 3 minutes in length), and I’ve been practicing it. On Wednesday, I anticipate getting to deliver the speech.

I would be pretty foolish to not be worried. Actually, on an issue this incendiary, I am pretty foolish to want to speak out at all.

For starters, there’s a chance my talk could anger white supremacist groups.

I am a white man with pale skin and reddish/blondish hair. I am married to a beautiful woman from Costa Rica, with caramel skin and these gorgeous black eyes you can just get lost in. We don’t have children yet, but we are both excited to meet them. I know they will be beautiful, like their mother. I hope my daughters look like her, with her dark skin and dark eyes and her raven black hair.

If you listen to what white supremacist groups actually say these days, then you’d know this is their raison d’être. They refer to it by the moronic title “white genocide” — the “diluting” of the “white race” through marriage of white people with people of other races.

My family and I are the main thing that white supremacists march against.

In the speech I have planned, I think I make it clear that I consider the cause of the Confederacy in the American Civil War to be an unworthy cause — it was certainly not worth the lives of the men who died for it.

That might anger white supremacists, who would already have reason to despise my family.

But I’m not afraid of angering white supremacists; they’re evil, but they don’t frighten me, because I know they are a powerless group of isolated and outcast individuals with little to no social standing in their own communities, who have resorted to anonymous online forums for human contact. They are pathetic, and I’m not enough of a coward to shrink away from shadows in a basement.

White supremacy is, of course, evil. It costs me nothing to say that, and it means nothing when I do say it, as everyone either agrees with it already or is a white supremacist and doesn’t care what society thinks of them.

White supremacy is also stupid. It is lazy thinking. It is the kind of mental shortcut that the feeble-minded rely on. It is the sort of excuse that the weak-willed hide behind, lacking the testicular fortitude to face their own inadequacies. It’s the kind of pseudo-intellectualism the internet is famous for, citing poorly analyzed statistics, when all it would take is meeting one normal, middle-class African American to see the fatuity of it all — that blacks and whites are the same race, because there is only the one race of Adam.

My comments might make them mad, but what are they going to do? Make memes about me?

There is also a chance my speech could anger Progressives on the ctrl-left. Actually, probably a much bigger chance. And that does scare me.

It scares me so much that I’m actually considering whether I even want to speak at all. I have a speech written, and I’ve been practicing it, and I’ve shopped it around to a number of friends, and I’ve made edits and timed it perfectly. But I’m thinking of not doing it at all.

I’m afraid of what the ctrl-left could do to me.

What is the ctrl-left? The label is a take on the alt-right designation, though the ctrl-left have been around for a lot longer. Maybe since the Bush administration. They are a political activist class — that is, they are a class of people with nothing else to do but be politically active. They are employed in universities, shutting down conservative voices. They are employed in news stations, selectively editing narratives and choosing which stories to give press time. They are employed at online opinion magazines, and spend all day opining on politics and culture. They are employed in Starbucks, and then spend 14 hours a day on Twitter and Tumblr investigating the lives of people they disagree with, trying to have them removed from their jobs, or to have their YouTube, Facebook, or Twitter accounts shut down to prevent them from sharing in electronic public forums. They are employed in tech companies enforcing “community standards” with bans and post removals, which on platform after platform seems to conveniently mean removing opinions on the right of American politics.

The ctrl-left, in essence, want to control what you are allowed to say, and punish you when you say what you are not.

The most recent explosion of this movement has been in antifa, the group of emotional children using acts of literal street violence to suppress and silence dissident voices in the public sphere — which is to say, they are a group of fascists. These jackbooted thugs have been taking to the streets, punching people in the face, smashing up their campuses in temper tantrums, setting fires, and generally acting exactly like the goosestepping authoritarians they are in order to stop people from saying anything that they don’t think people should be able to say anymore.

This latest expression of the ctrl-left doesn’t particularly worry me. I can take physical violence. I can take being punched in the face, or maced, or beaten with a club. I would consider it an honor, actually. Make my day.

What does worry me are the online Social Justice Warriors in the ctrl-left who have nothing better to do with their lives, apparently, than to seek new ways to punish people for wrongthink.

I work in academia. Tenured professors cannot get fired for refusing to attend their own classes for two years, but tenured professors have been fired for daring to injure the precious emotions of the ctrl-left. I’m a mere, lowly teaching assistant. I could lose my job, or be dismissed from school. I could be made unhirable in colleges and tech companies.

If my speech offends the wrong person, they could look to dig up all kinds of stuff on me.

It wouldn’t even be very hard to dig up stuff on me. For most of my life, I was a pretty terrible jerk. Just ask anyone who knew me in high school. Since high school, I have been slightly tolerable. If you had nothing to do but look for reasons to say crap about me, you could find crap to say about me. And the ctrl-left has absolutely nothing else to do.

But even if they can’t find dirt on me, the very act of disagreeing with their orthodoxy is a firable offense. They have power in universities and companies to crush whoever displeases them; and not only do they have it, but they use it.

I know this, so I generally go about my day and just grit my teeth and keep my mouth shut. My fellow students don’t have to keep their mouths shut, because they affirm the accepted dogmata of our thought guardians.

I let them talk, express opinions I disagree with, laugh at people who think the exact things I think, and endorse ideologies I completely reject, and I say nothing, because I just want to get out of here alive, get my PhD, and maybe once I have a job I can rely on, maybe then I’ll be able to breathe again.

And the crux of the story is that I’m just sick of it. I am sick and tired of shutting up. I am done with being expected to receive with full docility the ramblings of this Tumblr magisterium. I’m tired of feeling like I can’t speak my mind without retaliation and blowback, while others can express their politics unafraid.

I’m done. I’m done being shut up.

Realistically, I can probably expect nothing. I’m probably over-worrying. It’s unlikely anyone will really take notice. It’s an indoor event with a few dozen speakers, and who really wants to attend a meeting like that unless you’re speaking? Local news might pick it up, and they might run two seconds of my three-minute speech (probably selectively edited to make it sound like I’m saying the complete opposite of what I mean), and then that’s probably it. Maybe some person I know might notice and say something, maybe a student would say they heard I spoke or something, but that’s about it.

In a rational universe, maybe that’s all there needs to be about it. I can just say what I think, people can hear it and agree or disagree with it, we can have back-and-forth, and then we go on our merry ways.

But this is not a rational universe, so who knows what I can expect.

(Author’s Note and General Disclaimer: These are not the only two groups of people with opinions in this country. There are people opposed to the monument who are not part of the ctrl-left and who want civil dialogue and peaceful protest to lead the change. There are people in favor of the monument who are neither white supremacists nor part of the alt-right, and who want all people to be treated with the dignity due all human individuals. There are people on the left who also champion free speech, such as the ACLU, because free speech is not a partisan concern but the birthright of humanity. I know these people exist, because I know them; they are my family and friends and neighbors. With all of these people, I hope to see the American spirit of passionate but nonviolent engagement in the marketplace of ideas continue to drive political discourse. To the ctrl-left and alt-right, I pray that God has mercy on you and grants you repentance from your hatred, violence, and folly.)

An Antidote for Smugness

Suppose Frank and Joe get into a Facebook debate, and suppose Frank knows a lot more about the issue over which they’re disagreeing. Neither one of them is really an expert, but Frank has read a lot more and maybe even has some sort-of relevant background. The longer the discussion goes, the more he realizes that Joe doesn’t really know what he’s talking about.

Let’s give Frank the benefit of the doubt. He’s not just a victim of confirmation bias. Joe really doesn’t know that much about the issue, it’s evident in what he’s written, and Frank’s assessment on that score is accurate.

So, naturally, this can lead Frank to feel a little smug, and smugness is toxic. It’s a poison that clouds our thinking, alienates us from people who could be our friends, and fuels arrogance and pride.

Frank is self-aware enough to realize that he’s having this reaction, but–as it turns out–there’s more to good character than just being able to recognize your own bad behavior. It’s good to suppress an angry outburst, for example, but it’s even better to overcome the anger itself. Controlling behavior is nice, but shaping character is better. Unfortunately, character can’t be shaped directly. We have to come at it sideways and ambush our own bad character traits when they least suspect it. We have to–wherever possible–cheat.

So here’s an idea.

The reason that Frank knows more than Joe about this issue is that Frank took the time to research it. He read dozens of articles. Joe didn’t, and so Joe doesn’t know as much. Instead of attributing his superiority in this one realm to some kind of personal attribute, Frank should ask himself: “What was Joe doing with the time I used to study this issue?”

Maybe Joe is lazy and was just watching reality TV reruns. Maybe Joe is actually very curious and diligent, and was using the same amount of time studying some totally unrelated topic which–if they discussed it–would quickly demonstrate to Frank what it feels like to be the one who doesn’t really get it. Or maybe Joe wasn’t studying at all, but uses his time volunteering to make his neighborhood a better place.

It doesn’t really matter, because–in practice–Frank will never know. The point of the question is to ask it, because asking it reframes the context of Frank’s smugness. It’s not about some kind of overall, general superiority of intellect. It’s about the simple fact that Frank spent time studying a particular issue, and Joe didn’t. This is a smugness antidote. On top of dispelling the person-to-person comparison, it raises questions for Frank, such as: Was studying this particular issue really that wise an expenditure of his finite time and energy? Maybe it was, but maybe it wasn’t, all things considered. This should make Frank a little uneasy. That’s healthy. Certainly far healthier than smugness, at any rate.

I am not a very good person. This isn’t a statement of false humility. I suspect, all things considered, that I’m probably about average by most comparisons with others, although I’ll never really be sure.[ref]Assuming that I’m average seems a good rule of thumb, all things considered.[/ref] But that’s not the point. I don’t care about comparing myself with others; I care about the gap between who I am and who I’d like to be. And the person I’d like to be doesn’t have to devise strategies for decency or play tricks on himself to mimic virtue. That’s what I mean when I say I’m not a very good person.

But we don’t get to choose the kind of person we are. Not in an instant, anyway. We come into this world with a load of genetic and environmental baggage that, by the time we get around to being thinking, self-aware little human creatures, is already more than we could ever hope to sort through in a lifetime. All we can ever do is start where we are. Hopefully we make incremental steps in the right direction, but human character can’t be perfected in a lifetime. There’s the old expression, “fake it till you make it.” We’re never going to make it. So we just have to keep faking it. Play-acting at being a good person–when it’s done out of a sincere desire to learn to be good–is the best we can hope for.

This is one technique I try to remind myself to use in that game, and I thought I’d share it.

Top Ten Teen Albums: Walker Edition

There’s a new Internet/Facebook list going around: “10 albums that made a lasting impression on you as a teenager.” I thought it’d be fun to give you a glimpse into the musical tastes of my teenage self, which largely continue today. Attempting to think of whole albums was a little difficult because this was the age of mix CDs. I had a ton of mix CDs with various artists. I also had a lot of “Greatest Hits” and “The Best of…” albums (I wore out The Cream of Clapton as well as The Best of Bond…James Bond), which I’ve decided not to count. I’ve also limited the list to one album per artist. Otherwise, my list would likely be made up of two bands. It should also be noted that my musical tastes were largely seen through the eyes of a budding guitar player. Virtually everything was interpreted through the filter of, “How can this affect my guitar playing?” So, without further ado, here are my top ten teen albums (in no particular order):

Blink 182 – Enema of the State: I went through a huge Blink 182 phase through middle school and into my freshman year of high school. Aside from some radio play (“Dammit” was actually the first song I ever heard by them), my first proper introduction to them was at scout camp one summer. One of my best friends at the time had Enema of the State with him (I want to say on cassette) and he let me listen to some of the songs as we made our way to different merit badge sessions. This led to mixtapes featuring songs from Enema, Dude Ranch, and even Buddha. My parents weren’t particularly thrilled when they finally heard some of the crude themes and coarse language on these tapes, but that didn’t stop me from getting my secret stash of Blink CDs. It was this album that made me want to play an instrument. I decided I wanted to play bass because (1) everyone and their mother plays guitar and (2) Mark–the Blink bassist–was in my eyes the coolest member of the band[ref]Check out his bass intro in “Carousel.”[/ref] (though now I know it’s definitely the drummer Travis Barker). My parents opted for a guitar instead and I’ve never once regretted it. Even though I prefer their 2001 album Take Off Your Pants and Jacket (see what they did there?), Enema is the one that started my musical journey. Below is the highly immature video for the single “What’s My Age Again?”

Incubus – Make Yourself: During one of our family vacations, my older sister Tori let me listen to her copy of Make Yourself by Incubus. I’d heard some of their hits on the radio, but this was the first time going through the entire album. I fell in love with it to the point that my sister just gave it to me. Brandon Boyd’s vocals and the somewhat unique twist on 90s alternative rock stood out to me, as did its ability to capture my various teenage emotions, from angst to puppy love to a desire for self-direction. Their follow-up albums during my high school years–Morning View and A Crow Left of the Murder–received even more constant rotation than Make Yourself, but it was this album that began my still ongoing love affair with Incubus and their talent for both capturing my emotional states and transporting me to new ones. Below is the video for their song “Stellar.”

Metallica – Ride the Lightning: I had heard Metallica growing up. Who hasn’t heard “Enter Sandman” or “Nothing Else Matters”? But I only started paying attention to them after hearing their live album S&M in the weight room at my high school freshman year. It was my first year of guitar playing and my initial thought was, “If they are this good live, what are they like on their records?” As I started downloading Metallica songs, I saw them perform one I hadn’t heard before on VH1 (trivia: it was bassist Jason Newsted’s last performance before he left the band). The song was “Fade to Black” and it was found on Ride the Lightning. I bought that album soon after and brought it with me on a family vacation to Washington, D.C. I listened to it non-stop and decided that I wanted to be able to play like guitarists James Hetfield and Kirk Hammett. This was my pathway to metal. I ended up with all of the Metallica albums, as well as a large chunk of Megadeth, Pantera, Ozzy Osbourne, and Dream Theater albums. I probably spent far more time listening to these metal bands than anything else, but it was Ride the Lightning that started it all. My guitar chops improved as did my musical taste because of it. You can see the VH1 performance of “Fade to Black” that ignited the flame below.

Pink Floyd – Dark Side of the Moon: My brother-in-law JC has been a guitar player since he was a teenager (at least). Whenever we would visit my sister, he would always go through his ginormous digital collection of music in hopes of educating me out of my Blink 182 phase and moving me beyond Metallica. He first used David Gilmour’s ending solo in “Comfortably Numb” to pique my interest in Pink Floyd. I ended up getting Echoes: The Best of Pink Floyd (on my own or as a gift, I can’t remember), but Dark Side of the Moon was the first actual album that I listened to heavily (followed by The Wall). The musical style was so different from what I was used to: a kind of progressive, psychedelic rock. Gilmour’s less-is-more melodic playing was such a contrast to the shredding I was accustomed to from metal bands. Plus, the idea of a concept album was pretty new to me. Composing songs that bled into each other as they told a coherent story or relayed similar themes was a new level of creativity for me. Dark Side taught me to slow my playing down and showed me that emotion and melody were key to a good lead. You can see what is, in my estimation, the best version of “Money” from the concert Delicate Sound of Thunder below, which I watched over and over again as a teenager.

Led Zeppelin – Led Zeppelin II: I originally bought Led Zeppelin II for my dad for his birthday(?) one year at the suggestion of my mom. I wasn’t very familiar with Led Zeppelin at the time and even though I thought “Whole Lotta Love” was pretty cool, it didn’t pique my interest all that much. However, after picking up the guitar and shifting away from pop punk bands, I started “borrowing” (i.e., making a permanent part of my personal collection) Led Zeppelin II from my dad. While I acquired The Best of Led Zeppelin: Early & Latter Days, Vol. 1 & 2, it was this album that made me really appreciate the bluesy elements of rock. It felt like a bridge between the old and the new, between traditional blues and modern rock. And everyone was amazing: Plant’s vocals, Page’s guitar, Jones’ bass, and Bonham’s drums. It’s hard to find a band in which every member is of the highest caliber. And yes: I still prefer it to Led Zeppelin IV. You can see them performing “Whole Lotta Love” live, from the Led Zeppelin DVD I watched consistently in my later years of high school, below.

Pearl Jam – Ten: I used to hate Pearl Jam. I remember revealing my dislike of them to a bass player friend my freshman/sophomore year and he was flabbergasted that a guitar player would not like them. “But they’re such good musicians!” he protested. I don’t know what it was about them. Maybe it was Eddie Vedder’s voice (a co-worker of mine once described him as sounding like a man singing in a freezer). Maybe it was the flannel. My suspicion is that I just had not been properly exposed to them beyond “Jeremy” (which is an awesome song, mind you). I started downloading a number of Pearl Jam songs in my later years of high school and found myself appreciating them more and more. I finally caved and bought Ten.[ref]That led to Vs. Since then I’ve acquired Pearl Jam and Lightning Bolt. I bought their double-disc compilation Rearview Mirror (Greatest Hits 1991-2003) on my mission because every missionary should enjoy a little Pearl Jam late at night.[/ref] The album was (and is) phenomenal. There isn’t a song on it that isn’t top-notch. You can see them on full display with “Alive” below.

Rush – Moving Pictures: My first introduction to Rush was their video for “Time Stand Still” early one weekday morning on VH1. The video was ridiculous, but there was something about the band that I really liked. I stumbled on them again when “Test For Echo” came on one of those satellite music channels that I had playing in the background one day. I recognized the vocals and the band name and once again found myself being drawn to their style. On another fateful weekday morning, I saw their video for “Limelight” on VH1. The video was incredibly dated, but the song blew me away. Their mastery of the instruments was incredible and I was sold on Alex Lifeson’s whammy-heavy solo. I had to have that song. I ended up buying Moving Pictures soon after. I couldn’t believe that a trio could create that kind of sound. Typically, I focused solely on the guitar playing, but Rush made it impossible to ignore Lee’s bass playing or Peart’s drumming. This opened the floodgates: virtually every album and a couple concerts (one of which was as recent as 2015) later, I still consider them one of my favorites. You can see the video for “Limelight” below.

Alice in Chains – Dirt: Lucky for me, my YM/Scout leader for the longest time was also the dad of one of my best friends. While I was listening to Blink 182, my friend (due to his dad’s influence) was listening to the likes of Led Zeppelin, Pink Floyd, and even the chainsaw-wielding Jackyl. One Wednesday night, as I caught a ride with my friend, his dad popped in one of his many CDs. Suddenly, a chugging, metal chord progression filled the car, along with a jolting scream and eerie harmonies. I was caught off guard, but thoroughly entranced. About halfway through, the guitarist ripped into a headbanging solo. My ears perked up. The song, unfortunately, came to an end after only a couple minutes. When I asked what this was, my friend’s dad answered (with a smile), “Alice in Chains.” The album was Dirt and the song was “Them Bones.” I borrowed the album, ripped it, and became an AIC fan from then on. Jerry Cantrell, the guitarist and co-vocalist, provided a blues-based, melodic metal I could rock out to.[ref]I even have Jerry Cantrell’s solo albums Boggy Depot and Degradation Trip.[/ref] More importantly, he provided a type of playing that seemed achievable: not because his playing was sub-par, but because it evidenced a moderate partaking of the best rock music had to offer. Cantrell was not a shredder, a blues master, or a progressive rock composer (he still isn’t). But he was and is a fine guitar player, lyricist, and all-around musician. He instilled me with confidence and inspiration in my first few years of playing and remains influential even today. You can see the video for “Them Bones” below.

Fleetwood Mac – Rumours: For Christmas one year I received a year-long subscription to Guitar World magazine. In one of the issues, it featured a kind of boxing bracket for guitarists, rating them on a 0-5 scale on things like chops, influence, creativity, etc. Unfortunately, my mother trashed all of my Guitar World issues while I was on my mission (I’m still not sure why), so I’m unable to reference it properly. But at the time, I used the bracket to learn about guitarists I had never heard of before. At one point, I came across the name Lindsey Buckingham with something like a 3.7 in chops. When I discovered that he was the guitarist/co-vocalist of Fleetwood Mac, I remembered that my mom had Rumours in her van. I promptly borrowed the album and began soaking in Buckingham’s fingerpicked style. The album remained in constant rotation along with their (then) new album Say You Will. The mix of male and female vocals gave it a more diverse sound than I was used to, and my enjoyment of Rumours’ more pop-oriented style helped expand my musical palate. You can see their performance of “The Chain” from their live album The Dance (which I also spent a fair amount of time listening to) below.[ref]The Dance’s renditions of “Rhiannon” and “Big Love” are my favorites.[/ref]

Stevie Ray Vaughan and Double Trouble – Texas Flood: In my first year of guitar playing (and therefore still in my pop punk phase), I had a Sunday School teacher who recognized that my fellow Blinkophile friend and I were “big into music.” One day, she held us after class and gave each of us a copy of SRV’s Texas Flood. Because we were guitar players, she knew we would appreciate SRV’s skills. In actuality, we both had a bit of an aversion to the album: it was straight blues and that just wasn’t us. We were “punk rockers” and SRV was definitely not that. Fast-forward a year or so. I was going through my CD collection and pulled out Texas Flood to give it another listen. I was floored: the tone, the bends, the precision. It was beautiful. While other artists (see Zeppelin and Floyd above) opened the door to a bluesier style, it was this album that solidified the blues in my book. It paved the way for my embrace of other blues guitarists like B.B. King, Albert King, Buddy Guy, Joe Bonamassa, Robben Ford, and others. Texas Flood is the reason that I earned the name “Blues Man” from a co-worker due to my Pandora picks. You can see SRV & Double Trouble performing my favorite track off the album–“Lenny”–below.

Here are a few honorable mentions with a brief explanation:

Megadeth – Rust in Peace: After Metallica, Megadeth was the next biggest metal band I listened to (Dave Mustaine was a member of Metallica before they kicked him out). Rust in Peace was the first album of theirs I bought and it is still my favorite.

Tool – Lateralus: A friend of mine had a select few bands that he insisted were required listening. One of them was Tool, and he made me promise to listen to Lateralus all the way through without stopping. If I loved it, he would burn me the rest of their albums. I did and he did.

Prince – Purple Rain: I probably listened to “The Very Best of…” more, but Purple Rain put Prince’s skills on full display. The talent of the man was almost sickening.

Les Miserables: Original Broadway Cast: My older sister Nicole was a big theatre geek, so Les Miserables became a staple of my growing up. I still listen to it fairly often and I wept like a baby at the end of the 2012 film.

Phantom of the Opera: Original London Cast: Ditto. I dragged my high school girlfriend to the 2004 film. She kept trying to get friendly in the theater and I kept telling her to leave me alone so I could watch the movie. Priorities.

There you have it: my top ten teen albums.