I Want to be Right

Not long ago I was in a Facebook debate and my interlocutor accused me of just wanting to be right. 

Interesting accusation.

Of course I want to be right. Why else would we be having this argument? But, you see, he wasn’t accusing me of wanting to be right but of wanting to appear right. Those are two very different things. One of them is just about the best reason for debate and argument you can have. The other is just about the worst. 

Anyone who has spent a lot of time arguing on the Internet has asked themselves what the point of it all is. The most prominent theory is the spectator theory: you will never convince your opponent, but you might convince the folks watching. There’s merit to that, but it also rests on a questionable assumption, which is that the default purpose is to win the argument by persuading the other person and (when that fails) we need to find some alternative. OK, but I question whether we’ve landed on the right alternative.

I don’t think the primary importance of a debate is persuading spectators. The most important person for you to persuade in a debate is yourself.

It’s a truism these days that nobody changes their mind, and we all like to one-up each other with increasingly cynical takes on human irrationality and intractability. The list of cognitive biases on Wikipedia is getting so long that you start to wonder how humans manage to reason at all. Moral relativism and radical non-judgmentalism are grist for yet more “you won’t believe this” headlines, and of course there’s the holy grail of misanthropic cynicism: the argumentative theory. As Haidt summarizes one scholarly article on it:

Reasoning was not designed to pursue the truth. Reasoning was designed by evolution to help us win arguments. That’s why they call it The Argumentative Theory of Reasoning. So, as they put it, “The evidence reviewed here shows not only that reasoning falls quite short of reliably delivering rational beliefs and rational decisions. It may even be, in a variety of cases, detrimental to rationality. Reasoning can lead to poor outcomes, not because humans are bad at it, but because they systematically strive for arguments that justify their beliefs or their actions. This explains the confirmation bias, motivated reasoning, and reason-based choice, among other things.”

Jonathan Haidt in “The Righteous Mind”

Reasoning was not designed to pursue truth.

Well, there you have it. Might as well just admit that Randall Munroe was right and all pack it in, then, right?

Not so fast.

This whole line of research has run away with itself. We’ve sped right past the point of dispassionate analysis and deep into sensationalization territory. Case in point: the backfire effect. 

According to RationalWiki “the effect is claimed to be that when, in the face of contradictory evidence, established beliefs do not change but actually get stronger.” The article goes on:

The backfire effect is an effect that was originally proposed by Brendan Nyhan and Jason Reifler in 2010 based on their research of a single survey item among conservatives… The effect was subsequently confirmed by other studies.

Entry on RationalWiki

If you’ve heard of it, it might be from a popular post by The Oatmeal. Take a minute to check it out. (I even linked to the clean version without all the profanity.)


Wow. Humans are so irrational that not only can you not convince them with facts, but if you present facts they believe the wrong stuff even more!

Of course, it’s not really “humans” that are this bad at reasoning. It’s some humans. The original research was based on conservatives and the implicit subtext behind articles like the one on RationalWiki is that they are helplessly mired in irrational biases but we know how to conquer our biases, or at the very least make some small headway that separates us from the inferior masses. (Failing that, at least we’re raising awareness!) But I digress.

The important thing isn’t that this cynicism is always covertly at least a little one-sided, it’s that the original study has been really hard to replicate. From an article on Mashable:

[W]hat you should keep in mind while reading the cartoon is that the backfire effect can be hard to replicate in rigorous research. So hard, in fact, that a large-scale, peer-reviewed study presented last August at the American Political Science Association’s annual conference couldn’t reproduce the findings of the high-profile 2010 study that documented backfire effect.

Uh oh. Looks like the replication crisis–which has been just one part of the larger we-can’t-really-know-anything fad–has come back to bite the hand that feeds it.

This whole post (the one I’m writing right now) is a bit weird for me, because when I started blogging my central focus was epistemic humility. And it’s still my driving concern. If I have a philosophical core, that’s it. And epistemic humility is all about the limits of what we (individually and collectively) can know. So, I never pictured myself being the one standing up and saying, “Hey, guys, you’ve taken this epistemic humility thing too far.” 

But that’s exactly what I’m saying.

Epistemic humility was never supposed to be a kind of “we can never know the truth for absolute certain so may as well give up” fatalism. Not for me, anyway. It was supposed to be about being humble in our pursuit of truth. Not in saying that the pursuit was doomed to fail so why bother trying.

I think even a lot of the doomsayers would agree with that. I quoted Jonathan Haidt on the argumentative theory earlier, and he’s one of my favorite writers. I’m pretty sure he’s not an epistemological nihilist. RationalWiki may get a little carried away with stuff like the backfire effect (they gave no notice on their site that other studies have failed to replicate the effect), but evidently they think there’s some benefit to telling people about it. Else, why bother having a wiki at all?

Taken to its extreme, epistemic humility is just as self-defeating as subjectivism. Subjectivism–the idea that truth is ultimately relative–is incoherent because if you say “all truth is relative” you’ve just made an objective claim. That’s the short version. For the longer version, read Thomas Nagel’s The Last Word.

The same goes for all this breathless humans-are-incapable-of-changing-their-minds stuff. Nobody who does all the hard work of researching and writing and teaching can honestly believe that in their bones. At least, not if you think (as I do) that a person’s actions are the best measure of their actual beliefs, rather than their own (unreliable) self-assessments.

Here’s the thing: if you agree with the basic contours of epistemic humility–with most of the cognitive biases and even the argumentative hypothesis–you end up at a place where you think human belief is a reward-based activity like any other. We are not truth-seeking machines that automatically and objectively crunch sensory data to manufacture beliefs that are as true as possible given the input. Instead, we have instrumental beliefs. Beliefs that serve a purpose. A lot of the time that purpose is “make me feel good,” as in “rationalize what I want to do already” or “help me fit in with this social clique.”

I know all this stuff, and my reaction is: so what?

So what if human belief is instrumental? Because you know what, you can choose to evaluate your beliefs by things like “does it match the evidence?” or “is it coherent with my other beliefs?” Even if all belief is ultimately instrumental, we still have the freedom to choose to make truth the metric of our beliefs. (Or, since we don’t have access to truth, surrogates like “conformance with evidence” and “logical consistency”.)

Now, this doesn’t make all those cognitive biases just go away. This doesn’t disprove the argumentative theory. Let’s say it’s true. Let’s say we evolved the capacity to reason to make convincing (rather than true) arguments. OK. Again I ask: so what? Who cares why we evolved the capacity, now that we have it we get to decide what to do with it. I’m pretty sure we did not evolve opposable thumbs for the purpose of texting on touch-screen phones. Yet here we are and they seem adequate to the task. 

What I’m saying is this: epistemic humility and the associated body of research tell us that humans don’t have to conform their beliefs to truth, that we are incapable of conforming our beliefs perfectly to truth, and that it’s hard to conform our beliefs even mostly to truth. OK. But nowhere is it written that we can make no progress at all. Nowhere is it written that we cannot try or that–when we try earnestly–we are doomed to make absolutely no headway at all.

I want to be right. And I’m not apologizing for that. 

So how do Internet arguments come into this? One way that we become right–individually and collectively–is by fighting over things. It’s pretty similar to the theory behind our adversarial criminal justice system. Folks who grow up in common law countries (of which the US is one) might not realize that’s not the way all criminal justice systems work. The other major alternative is the inquisitorial system (which is used in countries like France and Italy).

In an inquisitorial system, the court is the one that conducts the investigation. In an adversarial system the court is supposed to be neutral territory where two opposing camps–the prosecution and the defense–lay out their case. That’s where the “adversarial” part comes in: the prosecutors and defenders are the adversaries. In theory, the truth arises from the conflict between the two sides. The court establishes rules of fair play (sharing evidence, not lying) and–within those bounds–the prosecutors’ and defenders’ job is not to present the truest argument but the best argument for their respective side. 

The analogy is not a perfect one, of course. For one thing, we also have a presumption of innocence in the criminal justice system because we’re not evaluating ideas; we’re evaluating people. That presumption of innocence is crucial in a real criminal justice system, but it has no exact analogue in the court of ideas.

For another thing, we have a judge to oversee trials and enforce the rules. There’s no impartial judge when you have a debate with randos on the Internet. This is unfortunate, because it means that if we don’t police ourselves in our debates, the whole process breaks down. There is no recourse.

When I say I want to be right, what am I saying, in this context? I’m saying that I want to know more at the end of a debate than I did at the start. That’s the goal. 

People like to say you never change anyone’s mind in a debate. What they really mean is that you never reverse someone’s mind in a debate. And, while that’s not literally true, it’s pretty close. It’s really, really rare for someone to go into a single debate as pro-life (or whatever) and come out as pro-choice (or whatever). I have never seen someone make a swing that dramatic in a single debate. I certainly never have.

But it would be absurd to say that I never “changed my mind” because of the debates I’ve had about abortion. I’ve changed my mind hundreds of times. I’ve abandoned bad arguments and adopted or invented new ones. I’ve learned all kinds of facts about law and history and biology that I didn’t know before. I’ve even changed my position many times. Just because the positions were different variations within the theme of pro-life doesn’t mean I’ve never “changed my mind”. If you expect people to walk in with one big, complex set of ideas that are roughly aligned with a position (pro-life, pro-gun) and then walk out of a single conversation with a whole new set of ideas that are aligned under the opposite position (pro-choice, anti-gun), then you’re setting that bar way too high.

But all of this only works if the folks having the argument follow the rules. And–without a judge to enforce them–that’s hard.

This is where the other kind of wanting to “be right” comes in. One of the most common things I see in a debate (whether I’m having it or not) is that folks want to avoid having to admit they were wrong.

First, let me state emphatically that if you want to avoid admitting you were wrong you don’t actually care about being right in the sense that I mean it. Learning where you are wrong is just about the only way to become right! People who really want to “be right” embrace being wrong every time it happens because those are the stepping stones to truth. Every time you learn a belief or a position you took was wrong, you’re taking a step closer to being right.

But–going back to those folks who want to avoid appearing wrong–they don’t actually want to be right. They just want to appear right. They’re not worried about truth. They’re worried about prestige. Or ego. Or something else.

If you don’t care about being right and you only care about appearing right, then you don’t care about truth either. And these folks are toxic to the whole project of adversarial truth-seeking. Because they break the rules. 

What are the rules? Basic stuff like don’t lie, debate the issue not the person, etc. Maybe I’ll come up with a list. There’s a whole set of behaviors that can make your argument appear stronger while in fact all you’re doing is peeing in the pool for everyone who cares about truth. 

If you care about being right, then you will give your side of the debate your utmost. You’ll present the best evidence, use the tightest arguments, and throw in some rhetorical flourishes for good measure. But if you care about being right, then you will not break the rules to advance your argument (No lying!) and you also won’t just abandon your argument in midstream to switch to a new one that seems more promising. Anyone who does that–who swaps their claims mid-stream whenever they see one that shows a more promising temporary advantage–isn’t actually trying to be right. They’re trying to appear right. 

They’re not having an argument or a debate. They’re fighting for prestige or protecting their ego or doing something else that looks like an argument but isn’t actually one. 

I wrote this partially to vent. Partially to organize my feelings. But also to encourage folks not to give up hope, because if you believe that nobody cares about truth and changing minds is impossible then it becomes a self-fulfilling prophecy.

And you want to know the real danger of relativism and post-modernism and any other truth-adverse ideology? Once truth is off the table as the goal, the only thing remaining is power.

As long as people believe in truth, there is a fundamentally cooperative aspect to all arguments. Even if you passionately think someone is wrong, if you both believe in truth then there is a sense in which you’re playing the same game. There are rules. And, more than rules, there’s a common last resort you’re both appealing to. No matter how messy it gets and despite the fact that nobody ever has direct, flawless access to truth, even the bitterest ideological opponents have that shred of common ground: they both think they are right, which means they both think “being right” is a thing you can, and should, strive to be.

But if you set that aside, then you sever the last thread between opponents and become nothing but enemies. If truth is not a viable recourse, all that is left is power. You have to destroy your opponent. Metaphorically at first. Literally if that fails. Nowhere does it say on the packaging of relativism “May lead to animosity and violence”. It’s supposed to do the opposite. It’s advertised as leading to tolerance and non-judgmentalism, but by taking truth off the table it does the opposite.

Humans are going to disagree. That’s inevitable. We will come into conflict. With truth as an option, there is no guarantee that the conflict will be non-violent, but it’s always an option. It can even be a conflict that exists in an environment of friendship, respect, and love. It’s possible for people who like and admire each other to have deep disagreements and to discuss them sharply but in a context of that mutual friendship. It’s not easy, but it’s possible. 

Take truth off the table, and that option disappears. This doesn’t mean we go straight from relativism to mutual annihilation, but it does mean the only thing left is radical partisanship where each side views the other as an alien “other”. Maybe that leads to violence, maybe not. But it can’t lead to friendship, love, and unity in the midst of disagreement.

So I’ll say it one more time: I want to be right.

I hope you do, too.

If that’s the case, then there’s a good chance we’ll get into some thundering arguments. We’ll say things we regret and offend each other. Nobody is a perfect, rational machine. Biases don’t go away and ego doesn’t disappear just because we are searching for truth. So we’ll make mistakes and, hopefully, we’ll also apologize and find common ground. We’ll change each other’s minds and teach each other things and grudgingly earn each other’s respect. Maybe we’ll learn to be friends long before we ever agree on anything.

Because if I care about being right and you care about being right, then we already have something deep inside of us that’s the same. And even if we disagree about every single other thing, we’ll always have that.

How and Why to Rate Books and Things

Here’s the image that inspired this post:

Now, there’s an awful lot of political catnip in that post, but I’m actually going to ignore it. So, if you want to hate on Captain Marvel or defend Captain Marvel: this is not the post for you. I want to talk about an apolitical disagreement I have with this perspective.

The underlying idea of this argument is that you should rate a movie based on how good or bad it is in some objective, cosmic sense. Or at least based on something other than how you felt about the movie. In this particular case, you should rate the movie based on some political ideal or in such a way as to promote the common good. Or something. No, you shouldn’t. All of these approaches are bad ideas.


The correct way to rate a movie–or a book, or a restaurant, etc.–is to just give the rating that best reflects how much joy it brought you. That’s it!

Let’s see if I can convince you.

To begin with, I’m not saying that such a thing as objective quality doesn’t exist. I think it probably does. No one can really tell where subjective taste ends and objective quality begins, but I’m pretty sure that “chocolate or vanilla” is a matter of purely personal preference but “gives you food poisoning or does not” is a matter of objective quality.

So I’m not trying to tell you that you should use your subjective reactions because that’s all there is to go on. I think it’s quite possible to watch a movie and think to yourself, “This wasn’t for me because I don’t like period romances (personal taste), but I can recognize that the script, directing, and acting were all excellent (objective quality) so I’m going to give it 5-stars.”

It’s possible. A lot of people even think there’s some ethical obligation to do just that. As though personal preferences and biases were always something to hide and be ashamed of. None of that is true.

The superficial reason I think it’s a bad idea has to do with what I think ratings are for. The purpose of a rating–and by a rating I mean a single, numeric score that you give to a movie or a book, like 8 out of 10 or 5 stars–is to help other people find works that they will enjoy and avoid works that they won’t enjoy. Or, if you prefer, to help people specifically look for works that will challenge them and that they might not like, and maybe pass up a book that will be too familiar. You can do all kinds of things with ratings. But only if the ratings are simple and honest. Only if the ratings encode good data.

The ideal scenario is a bunch of people leaving simple, numeric ratings for a bunch of works. This isn’t Utopia, it’s Goodreads. (Or any of a number of similar sites.) What you can then do is load up your list of works that you’ve liked / disliked / not cared about and find other people out there who have similar tastes. They’ve liked a lot of the books you’ve liked, they’ve disliked a lot of the books you’ve disliked, and they’ve felt meh about a lot of the books you’ve felt meh about. Now, if this person has read a book you haven’t read and they gave it 5-stars: BAM! You’ve quite possibly found your next great read.

You can do this manually yourself. In fact, it’s what all of us instinctively do when we start talking to people about movies. We compare notes. If we have a lot in common, we ask that person for recommendations. It’s what we do in face-to-face interactions. When we use big data sets and machine learning algorithms to automate the process, we call them recommender systems. (What I’m describing is the collaborative filtering approach as opposed to content-based filtering, which also has its place.)
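Since I brought up collaborative filtering, here’s a rough sketch of the idea in Python. The readers, books, and ratings are all invented, and real recommender systems use fancier similarity measures, but the mechanics are the same: find the reader whose ratings look most like yours and borrow their favorites.

```python
# Toy user-based collaborative filtering. All names and scores are made up.
ratings = {
    "me":    {"Dune": 5, "Gone Girl": 2, "Hyperion": 5},
    "alice": {"Dune": 5, "Gone Girl": 1, "Hyperion": 4, "Blindsight": 5},
    "bob":   {"Dune": 2, "Gone Girl": 5, "Hyperion": 1, "In Cold Blood": 5},
}

def similarity(a, b):
    """Agreement on shared books: negative mean absolute rating difference."""
    shared = ratings[a].keys() & ratings[b].keys()
    if not shared:
        return float("-inf")
    return -sum(abs(ratings[a][t] - ratings[b][t]) for t in shared) / len(shared)

def recommend(user):
    """Highest-rated unread book from the most similar other reader."""
    others = [u for u in ratings if u != user]
    nearest = max(others, key=lambda u: similarity(user, u))
    unread = {t: r for t, r in ratings[nearest].items() if t not in ratings[user]}
    return max(unread, key=unread.get)

print(recommend("me"))  # alice's tastes match mine, so: "Blindsight"
```

Sites like Goodreads do this over millions of readers instead of three, but the honest-ratings requirement is the same: the similarity step only works if each rating reflects how much the reader actually enjoyed the book.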

This matters a lot to me for the simple reason that I don’t like much of what I read. So, it’s kind of a topic that’s near and dear to my heart. 5-star books are rare for me. Most of what I read is probably 3-stars. A lot of it is 1-star or 2-star. In a sea of entertainment, I’m thirsty. I don’t have any show that I enjoy watching right now. I’m reading a few really solid series, but they come out at a rate of 1 or 2 books a year, and I read more like 120 books a year. The promise of really deep collaborative filtering is really appealing to me, because it could help me find the books I’d actually love.

But if you try to be a good citizen and rate books based on what you think their objective quality is, the whole system breaks down.

Imagine a bunch of sci-fi fans and a bunch of mystery fans who each read a mix of both genres. The sci-fi fans enjoy the sci-fi books more (and the mystery fans enjoy the mystery books more), but they try to be objective in their ratings. The result is that the two groups disappear from the data. You can no longer go in and find the group that aligns with your interests and then weight their recommendations more heavily. Instead of having one clear population that gives high marks to the sci-fi stuff and another that gives high marks to the mystery stuff, you just have one amorphous group that gives high (or maybe medium) marks to everything.
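Here’s a toy version of that disappearing act, with invented readers, books, and scores. When the fans rate honestly, the genre clusters are visible in the data; when everyone converges on the same “objective” score, the distance between any two readers collapses to nothing.

```python
# Honest ratings (1-5 scale) from two sci-fi fans and a mystery fan.
# All readers, titles, and numbers are invented for illustration.
honest = {
    "scifi_fan_1": {"Dune": 5, "Hyperion": 5, "Gone Girl": 2},
    "scifi_fan_2": {"Dune": 5, "Hyperion": 4, "Gone Girl": 2},
    "mystery_fan": {"Dune": 2, "Hyperion": 2, "Gone Girl": 5},
}
# The same readers trying to rate "objective quality" instead:
objective = {u: {"Dune": 4, "Hyperion": 4, "Gone Girl": 4} for u in honest}

def distance(data, a, b):
    """Mean absolute rating difference over shared books."""
    shared = data[a].keys() & data[b].keys()
    return sum(abs(data[a][t] - data[b][t]) for t in shared) / len(shared)

# With honest ratings, the sci-fi fans are close to each other
# and far from the mystery fan:
print(distance(honest, "scifi_fan_1", "scifi_fan_2"))  # small
print(distance(honest, "scifi_fan_1", "mystery_fan"))  # large
# With "objective" ratings, everyone is identical and the signal is gone:
print(distance(objective, "scifi_fan_1", "mystery_fan"))  # 0.0
```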

How is this helpful? It is not. Not as much as it could be, anyway.

In theoretical terms, you have to understand that your subjective reaction to a work is complex. It incorporates the objective quality of the work, your subjective taste, and then an entire universe of random chance. Maybe you were angry going into the theater, and so the comedy didn’t work for you the way it would normally have worked. Maybe you just found out you got a raise, and everything was ten times funnier than it might otherwise have been. This is statistical noise, but it’s unbiased noise. That means it basically goes away if you have a large enough sample.
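If you want to see that noise wash out, here’s a quick simulation. The “true appeal” number and the noise level are made up; the point is only that unbiased noise averages toward zero as the number of raters grows.

```python
import random

random.seed(0)

# Each rating = a book's "true" appeal for readers like you (4.0 here,
# an invented number) plus unbiased circumstantial noise: bad days,
# good days, raises, traffic jams.
def mean_rating(n_raters, true_appeal=4.0):
    noisy = [true_appeal + random.gauss(0, 1.0) for _ in range(n_raters)]
    return sum(noisy) / n_raters

print(abs(mean_rating(10) - 4.0))       # with 10 raters: could be way off
print(abs(mean_rating(100_000) - 4.0))  # with 100k raters: very close to zero
```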

On the other hand, if you try to fish out the objective components of a work from the stew of subjective and circumstantial components, you’re almost guaranteed to get it wrong. You don’t know yourself very well. You don’t know for yourself where your objective assessment ends and your subjective taste begins. You don’t know for yourself what unconscious factors were at play when you read that book at that time of your life. You can’t disentangle the objective from the subjective, and if you try you’re just going to end up introducing error into the equation that is biased. (In the Captain Marvel example above, you’re explicitly introducing political assessments into your judgment of the movie. That’s silly, regardless of whether your politics make you inclined to like it or hate it.)

What does this all mean? It means that it’s not important to rate things objectively (you can’t, and you’ll just mess it up), but it is helpful to rate things frequently. The more people we have rating things in a way that can be sorted and organized, the more use everyone can get from those ratings. In this sense, ratings have positive externalities.

Now, some caveats:

Ratings vs. Reviews

A rating (in my terminology, I don’t claim this is the Absolute True Definition) is a single, numeric score. A review is a mini-essay where you get to explain your rating. The review is the place where you should try to disentangle the objective from the subjective. You’ll still fail, of course, but (1) it won’t dirty the data and (2) your failure to be objective can still be interesting and even illuminating. Reviews–the poor man’s version of criticism–are a different beast, and they play by different rules.

So: don’t think hard about your ratings. Just give a number and move on.

Do think hard about your reviews (if you have time!). Make them thoughtful and introspective and personal.

Misuse of the Data

There is a peril to everyone giving simplistic ratings, which is that publishers (movie studios, book publishers, whatever) will be tempted to try and reverse-engineer guaranteed money makers.

Yeah, that’s a problem, but it’s not like they’re not doing that anyway. The reason that movie studios keep making sequels, reboots, and remakes is that they are already over-relying on ratings. But they don’t rely on Goodreads or Rotten Tomatoes. They rely on money.

This is imperfect, too, given the different timing of digital vs. physical media channels, etc., but the point is that adding your honest ratings to Goodreads isn’t going to make traditional publishing any more likely to try and republish last year’s cult hit. They’re going to do that anyway, and they already have better data (for their purposes) than you can give them.

Ratings vs. Journalism

My advice applies to entertainment. I’m not saying that you should just rate everything without worrying about objectivity. This should go without saying but, just in case, I said it.

You shouldn’t apply this reasoning to journalism because one vital function of journalism for society is to provide a common pool of facts that everyone can then debate about. One reason our society is so sadly warped and full of hatred is that we’ve lost that kind of journalism.

Of course, it’s probably impossible to be perfectly objective. The term is meaningless. Human beings do not passively receive input from our senses. Every aspect of learning–from decoding sounds into speech to the way vision works–is an active endeavor that depends on biases and assumptions.

When we say we want journalists to be objective, what we really mean is that (1) we want them to stick to objectively verifiable facts (or at least not do violence to them) and (2) we would like them to embody, insofar as possible, the common biases of the society they’re reporting to. There was a time when we, as Americans, knew that we had certain values in common. I believe that for the most part we still do. We’re suckers for underdogs, we value individualism, we revere hard work, and we are optimistic and energetic. A journalistic establishment that embraces those values is probably one that will serve us well (although I haven’t thought about it that hard, and it still has to follow rule #1 about getting the facts right). That’s bias, but it’s a bias that is positive: a bias towards truth, justice, and the American way.

What we can’t afford, but we unfortunately have to live with, is journalism that takes sides within the boundaries of our society.

Strategic Voting

There are some places other than entertainment where this logic does hold, however, and one of them is voting. One of the problems of American voting is that we go with majority-take-all voting, which is like the horse-and-buggy era of voting technology. Majority-take-all voting is arguably even worse for us than the 2-party system it produces, because it encourages strategic voting.

Just like rating Captain Marvel higher or lower because your politics make you want it to succeed or fail, strategic voting is where you vote for the candidate that you think can win rather than the candidate that you actually like the most.

There are alternatives that (mostly) eliminate this problem, the most well-known of which is instant-runoff voting. Instead of voting for just one candidate, you rank the candidates in the order that you prefer them. This means that you can vote for your favorite candidate first even if he or she is a longshot. If they don’t win, no problem. Your vote isn’t thrown away. In essence, it’s automatically moved to your second-favorite candidate. You don’t actually need to have multiple run-off elections. You just vote once with your full list of preferences and then it’s as if you were having a bunch of runoffs.
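For the curious, here’s a bare-bones sketch of an instant-runoff count. The candidate names and ballot counts are invented; real election rules also have to handle ties and exhausted ballots more carefully.

```python
from collections import Counter

def instant_runoff(ballots):
    """Each ballot ranks candidates from most to least preferred.
    Repeatedly eliminate the last-place candidate and transfer those
    ballots to their next choice until someone has a majority."""
    ballots = [list(b) for b in ballots]  # work on a copy
    while True:
        tallies = Counter(b[0] for b in ballots if b)
        total = sum(tallies.values())
        leader, votes = tallies.most_common(1)[0]
        if votes * 2 > total:  # strict majority of remaining ballots
            return leader
        loser = min(tallies, key=tallies.get)
        ballots = [[c for c in b if c != loser] for b in ballots]

# Invented example: the longshot is eliminated first, and those votes
# flow to the voters' second choice instead of being thrown away.
ballots = (
    [["Longshot", "Moderate", "Frontrunner"]] * 20
    + [["Moderate", "Frontrunner", "Longshot"]] * 35
    + [["Frontrunner", "Moderate", "Longshot"]] * 45
)
print(instant_runoff(ballots))  # "Moderate"
```

Note that under plain majority-take-all, "Frontrunner" wins this election with 45% while the Longshot voters' ballots accomplish nothing, which is exactly the pressure toward strategic voting.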

There are other important reasons why I think it’s better to vote for simple, subjective evaluations of the state of the country instead of trying to figure out who has the best policy choices, but I’ll leave that discussion for another day.


The idea of simple, subjective ratings is not a cure-all. As I noted above, it’s not appropriate for all scenarios (like journalism). It’s also not infinitely powerful. The more people you have and the more things they rate (especially when lots of diverse people are rating the same thing), the better. If you have 1,000 people, maybe you can detect who likes what genre. If you have 10,000 people, maybe you can also detect sub-genres. If you have 100,000 people, maybe you can detect sub-genres and other characteristics, like literary style.

But no matter how many people you have, you’re never going to be able to pick up every possible relevant factor in the data because there are too many and we don’t even know what they are. And, even if you could, that still wouldn’t make predictions perfect because people are weird. Our tastes aren’t just a list of items (spaceships: yes, dragons: no). They are interactive. You might really like spaceships in the context of gritty action movies and hate spaceships in your romance movies. And you might be the only person with that tic. (OK, that tic would probably be pretty common, but you can think of others that are less so.)

This is a feature, not a bug. If it were possible to build a perfect recommender, it would also be possible (at least in theory) to build an algorithm to generate optimal content. I can’t think of anything more hideous or dystopian. At least, not as far as artistic content goes.

I’d like a better set of data because I know that there are an awful lot of books out there right now that I would love to read. And I can’t find them. I’d like better guidance.

But I wouldn’t ever want to turn over my reading entirely to a prediction algorithm, no matter how good it is. Or at least, not a deterministic one. I prefer my search algorithms to have some randomness built in, like simulated annealing.
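As a sketch of what “randomness built in” might look like (the book titles, scores, and the one-third exploration rate are all invented for illustration), you can think of it as an epsilon-greedy rule: usually take the top prediction, but every so often grab something at random.

```python
import random

def pick_next_read(scored_books, epsilon=0.33, rng=random):
    """With probability epsilon, ignore the recommender's scores and
    explore at random; otherwise exploit the highest-scored book."""
    if rng.random() < epsilon:
        return rng.choice(list(scored_books))        # explore
    return max(scored_books, key=scored_books.get)   # exploit

# Hypothetical predicted ratings from some recommender:
predicted = {"Safe Bet": 4.8, "Known Quantity": 4.5, "Wild Card": 2.1}
print(pick_next_read(predicted))
```

Simulated annealing does something similar but smarter, accepting “worse” picks often early on and rarely later; the shared idea is that a little deliberate randomness keeps you from getting stuck in a local optimum of your own tastes.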

I’d say about 1/3rd of what I read is fiction I expect to like, about 1/3rd is non-fiction I expect to like, and 1/3rd is random stuff. That random stuff is so important. It helps me find stuff that no prediction algorithm could ever help me find.

It also helps the system overall, because it means I’m not trapped in a little clique with other people who are all reading the same books. Reading outside your comfort zone–and rating what you read–is a way to build bridges between fandoms.

So, yeah. This approach is limited. And that’s OK. The solution is to periodically shake things up a bit. So those are my rules: read a lot, rate everything you read as simply and subjectively as you can, and make sure that you’re reading some random stuff every now and then to keep yourself out of a rut and to build bridges to people with different tastes than your own.

Stuff I Say at School – Part II: Self-Interested Politicians

This is part of the Stuff I Say at School series.

The Assignment

After listening to [Benjamin] Ginsberg‘s lecture, do you agree with his assessment that politics is all about interests and power?

The Stuff I Said


Kevin Simler and Robin Hanson’s recent book The Elephant in the Brain demonstrates that these underlying desires for power and status inform many of our decisions and behaviors in everyday life. Politicians certainly do not transcend these selfish motives by virtue of their office. I would actually add a subcategory to “status”: moral grandstanding. We want to paint ourselves as “good people” by signaling to others our superior moral quality. This allows us to enjoy the social capital that comes along with the improved reputation. We not only gain status, but we can also think of ourselves as do-gooders, crusaders who fight the good fight. Unsurprisingly, evidence suggests that we have inflated views of our own moral character and that acts of moral outrage are largely self-serving. What’s unfortunate is that social media may be exacerbating moral outrage by making signaling both easier and less costly to the individual.

I think the rise of populism in both America and Europe is a timely example of interests at play. While various elements contribute to the populist mindset, economic insecurity is the water it swims in. And this insecurity has been exploited by politicians of more extreme ideologies across multiple countries. For example, the Great Recession eroded European trust in mainstream political parties: a one percentage point increase in unemployment was associated with a 2 to 4 percentage point increase in the populist vote. A 2016 study looked at the political results of financial crises in Europe from 1870 to 2014 and found that far-right parties were the typical outcome. In America, President Trump made “Make America Great Again” his rallying cry, feeding off the public’s distrust of “the Establishment” during the post-crisis years. In doing so, he advocated protectionism and tighter borders. Oddly enough, you find comparable populist sentiments on the Left: Bernie Sanders has been very anti-trade and iffy on liberalized immigration (open borders is “a Koch Brothers proposal“), all in the name of helping the American worker. One of his former campaign organizers–the newly-elected Congresswoman Ocasio-Cortez–has also expressed similar concerns over trade deals (especially NAFTA). This is why The Economist sees less of a left/right divide today and more of an open/close divide. Skepticism of trade and immigration wrapped in “power to the people” sentiments may be invigorating in rhetoric, but it’s asinine in practice. And it’s doing nothing more than riding the wave of voter anxiety. What’s worse, it’s hiding these politicians’ accumulation of power, attainment of status, and moral self-aggrandizement behind what Ginsberg so aptly calls “the veneer of public spiritedness.”

More Stuff

A classmate asked if I believed that politicians always acted in self-interest or if there were moral lines that some would not cross. In response, I pointed out that Simler and Hanson are largely arguing against what they see as the tendency for people to tiptoe around hidden motives and self-deception. It’s not that we’re only motivated by selfish motives. We just tend to gloss over them. But they are deeply embedded. Failing to acknowledge them not only has personal consequences, but public ones as well (their chapter on medicine is especially on point). I think we should consider moral motivations through all possible means available, including life experience and behavior. However, I think a healthy dose of skepticism is necessary. It can certainly help protect us against intentional deception. But perhaps more importantly, it helps protect us against unintentional deception. It’s easy to give more weight to life experience, moral principles, and the like when it’s a politician on “our side,” all while harshly judging those on “the other side” as unscrupulous. Political skepticism or cynicism can aid in keeping our own selfish motives and emotional highs in check. And it can lead us to seek out more information, improve our understanding, and refine our beliefs. Otherwise, we end up being consumed by our own good intentions and moral principles without actually learning how to implement these principles.


My classmate also put forth a hypothetical to get a feel for my position: if legislative districts were redrawn so that legislators now represented districts with a different ideological makeup, how many would change their positions on issues just to stay in power? Personally, I think we would see a fair number of politicians shift their position because it is more advantageous. However, there is considerable evidence that political deliberation with ideological opposites actually backfires. Political philosopher Jason Brennan reviews the evidence in chapter 3 of his book Against Democracy and finds that political deliberation:

  • Undermines cooperation
  • Exacerbates conflict when groups are different sizes
  • Avoids debates about facts and is instead driven by status-seeking and positions of influence
  • Uses language in biased and manipulative ways, usually by painting the opposition as intrinsically bad
  • Avoids controversial topics and sticks to safe subjects
  • Amplifies intellectual biases

There’s more, but that should make my point. So even if some politicians did not flip flop in their newly-drawn districts, the above list should give us pause before we conclude that their doubling down is proof of disinterest in status or moral grandstanding.

I certainly believe that people have moral limits and lines they will not cross. My skepticism (which I prefer to the word cynicism, but I’m fine with interchanging them) is largely about honest self-examination and the examination of others. For example, consider something that is generally of no consequence: Facebook status updates. My Facebook feed is often full of political rants, social commentaries, and cultural critiques. Why do we do that? Why post a political tract as a status? It can’t be because of utility. A single Facebook status isn’t going to fix Washington or shift the course of society. It’s unlikely to persuade the unsaved among your Facebook friends. In fact, it’s probably counterproductive given our tendency for motivated reasoning. When we finally rid ourselves of the high-minded rationales that make next to zero sense, we find that it boils down to signaling: we are signaling our tribe. And that feels good. We get “Likes.” We get our worldview confirmed by others. We gain more social capital as a member of the group. We even get to moral grandstand in the face of that friend or two who hold (obviously) wrong, immoral beliefs. Sure, some of it may be about moral conviction and taking a stand. That certainly sounds and feels better. But I think we will all be better off if we realize that’s really what those behaviors are about: sounding and feeling good. And I think our politics will be better off if we apply a similar lens to it. 

And More Stuff

A classmate drew on Dan Ariely’s work to argue that people–including politicians–have a “personal fudge factor“: most people will cheat a little bit without feeling they’ve compromised their sense that they are a “good person.” When people are reminded of moral values (in the case of the experiments, the honor code or 10 commandments), they don’t cheat, including atheists. So while politicians may compromise their values here and there, they still have a moral sense of self that they are unlikely to violate.

In response, I pointed out that a registered replication report last year was unable to reproduce Ariely’s results. That doesn’t mean his results were wrong, just that we need to be cautious in drawing any strong conclusions from them.

When discussing his priming with the 10 Commandments on pg. 635, Ariely references Shariff and Norenzayan’s well-known 2007 study. This found that people behave more prosocially (in this case, generosity in experimental economic games) when primed with religious concepts. They offered a couple explanations for this. One hypothesis suggested that “the religious prime aroused an imagined presence of supernatural watchers…Generosity in cooperative games has been shown to be sensitive to even minor changes that compromise anonymity and activate reputational concerns” (pg. 807). They then cite studies (which later studies confirm) that found people behaving more prosocially in the presence of eye images. “In sum,” the authors write, “we are suggesting that activation of God concepts, even outside of reflective awareness, matches the input conditions of an agency detector and, as a result, triggers this hyperactive tendency to infer the presence of an intentional watcher. This sense of being watched then activates reputational concerns, undermines the anonymity of the situation, and, as a result, curbs selfish behavior” (pg. 807-808). In short, religious priming makes us think someone upstairs is watching us. This has more to do with being seen as good.

However, religious priming obviously doesn’t work for the honor code portion. Yet, Shariff and Norenzayan’s other explanation is actually quite helpful in this regard: “the activation of perceptual conceptual representations increases the likelihood of goals, plans, and motor behavior consistent with those representations…Irrespective of any attempt to manage their reputations, subjects may have automatically behaved more generously when these concepts were activated, much as subjects are more likely to interrupt a conversation when the trait construct ‘‘rude’’ is primed, or much as university students walk more slowly when the ‘‘elderly’’ stereotype is activated (Bargh et al., 1996)” (pg. 807). Being primed with the “honorable student” stereotype, students were more likely to behave honorably (or honestly). 

In short, I think Ariely’s study shows a mix of motivations when it comes to behaving morally: (1) maintaining our self-concept as a good person, (2) fear of being caught and having our reputation (and the benefits that come along with it) damaged, and (3) our susceptibility to outside influence.

My point about moral grandstanding is not that we should interpret all behaviors by politicians through the lens of self-delusion and status seeking. But being aware of it can help us cut through a lot of nonsense and avoid being swept up in a collective self-congratulation. To quote Tosi and Warmke, “thinking about grandstanding is a cause for self-reflection, not a call to arms. An argument against grandstanding shouldn’t be used as a cudgel to attack people who say things we dislike. Rather, it’s an encouragement to reassess why and how we speak to one another about moral and political issues. Are we doing good with our moral talk? Or are we trying to convince others that we are good?” And as philosopher David Schmidtz is said to have quipped, if your main goal is to show that your heart is in the right place, then your heart is not in the right place.

Building a Life Story

This post is part of the General Conference Odyssey.

The first time I wrote in my journal was in the days immediately after my baptism when I was 8 years old. I still have the pages somewhere in a box, including the hand-drawn map of the different routes I could take when I walked back and forth from school.

I have started and stopped journals countless times since then because it’s one of those things that, as Elder Groberg reminded us in Writing Your Personal and Family History, good Mormons are supposed to do.

As much as I enjoy writing, there’s always been one big thing inhibiting me from keeping a journal more reliably, and it is this: I don’t know what the real story is. This isn’t some weird post-modern hang-up, so much as it is (as far as I can tell) a weird psychological hang-up. I never know how I feel about things. Interrogating my true feelings about the things that are going on in my life is like collecting mist with a butterfly net. I can record the brute facts of my life—I can draw the map and label the streets—but I can’t tell you what those facts mean. Not even, and perhaps most especially, to me.

My inner life is an optical illusion. It is a collection of lines that looks like the inside of a cube one moment or the outside of a cube the next. It is a picture of a rabbit for a blink, and then it is a picture of a duck. It is two faces; it is a chalice. It is an old lady; it is a young woman.

This is why I spend almost no time at all thinking about my past. My friends and family all remember so much more of the things that I’ve been through than I do. For me, the past is like a crime scene, and I am afraid to contaminate the evidence. I have a superstitious belief that there is a true story, an objective reality, and I’m afraid that if I try too hard to find it then I will only erase it.

I have a couple of binders somewhere that contain all the letters that I sent home while I was serving my mission in Hungary and all of the letters that people sent to me. I think the binders were a gift when I got home, but I’m not sure. I’ve never opened them. I’m not sure where they are. I don’t even like to look at the binders, let alone consider reading the pages inside. Because my mission was the one time in my life when I acted like I knew what was going on and when I told everyone how I felt about things, and I’m afraid that it was all lies. It was the hardest time of my young life, and I have vague recollections of writing relentlessly optimistic and happy letters despite feeling so depressed that it felt like physical pain on most days. The whole thing is wildly embarrassing to me. I acted like I knew what was going on. I had no idea. I have lived almost as many years after my mission as I lived before it, and I still have no idea what was going on or why it was so hard for me.

If writing a journal is about writing the real story of my feelings, then I can’t write a journal for the simple reason that I don’t know my own story.

And yet, I should. Write a journal, that is. Like Elder Groberg says, writing a journal “helps immeasurably in gaining a true, eternal perspective of life” and “should be a great motivation to do what is right.” I know that’s accurate: the reflection of writing about my life has helped me put things into perspective.

Maybe that’s the point?

I’m teaching the Old Testament in Gospel Doctrine this year, and it’s a mess. We just made the transition from Joshua to Judges, and I taught about how all the mass slaughter that supposedly happened in Joshua is pretty flatly contradicted by Judges. On the bright side: you don’t have to believe in a genocidal God.  On the downside: it’s hard to make sense of all the contradictions. In Deuteronomy, we’re told a Moabite will never enter the assembly of the Lord until the 10th generation. Ruth, the hero of the Book of Ruth, is Moabite and that makes King David 1/8th Moabite. And, while we’re on the topic, how do we reconcile the apparent gap between the miracle-laden Exodus story and the miracle-free story of Ruth and Boaz?

The one encouraging thing is that, as I read Elder Groberg’s talk, I realize that the Old Testament is a mess in a lot of the same ways that my own life story is a mess.

There may be one, true, ultimate truth about everything. Not just the objective facts of life, but the subjective ones as well. Maybe there is an absolutely true narrative. But if there is, we will never know it in this life. In this life, stories are things we make up. Fictional stories are based on imaginary facts. And real stories—including history—are made up based on true facts. But they are both made up.

I’m not sure if I have that right or not, but it sounds promising. At the very least, it’s worth giving a shot. I’m going to try writing in my journal again, and this time I’m not going to try and find a life story. I’m going to use the raw materials of my experiences to build one.

Check out the other posts from the General Conference Odyssey this week and join our Facebook group to follow along!

Speaking on campus and the ctrl-left

Update: the full text of the speech is now online on its own post.

My university is home to a controversial Confederate War memorial.

It is a bronze sculpture of a college student carrying a rifle, commemorating the students at my university who left their studies and went to fight in the American Civil War for the Confederacy. On the base are three inscriptions, the middle of which shows the student in class, hearing the call of a woman representing duty urging him to fight. The side inscription speaks of honor and duty.

The statue has always been controversial, but recent events have brought the controversy back.

The university is holding an open panel, inviting the general public to share their thoughts. You just have to register, and the first 25 get to go.

Well, I have some thoughts on the monument, and I wanted to share them. So I signed up, and I wrote a speech (exactly 3 minutes in length), and I’ve been practicing it. On Wednesday, I anticipate getting to deliver the speech.

I would be pretty foolish to not be worried. Actually, on an issue this incendiary, I am pretty foolish to want to speak out at all.

For starters, there’s a chance my talk could anger white supremacist groups.

I am a white man with pale skin and reddish/blondish hair. I am married to a beautiful woman from Costa Rica, with caramel skin and these gorgeous black eyes you can just get lost in. We don’t have children yet, but we are both excited to meet them. I know they will be beautiful, like their mother. I hope my daughters look like her, with her dark skin and dark eyes and her raven black hair.

If you listen to what white supremacist groups actually say these days, then you’d know this is their raison d’être. They refer to it by the moronic title “white genocide” — the “diluting” of the “white race” through marriage of white people with people of other races.

My family and I are the main thing that white supremacists march against.

In the speech I have planned, I think I make it clear that I consider the cause of the Confederacy in the American Civil War to be an unworthy cause — it was certainly not worth the lives of the men who died for it.

That might anger white supremacists, who would already have reason to despise my family.

But, I’m not afraid of angering white supremacists; they’re evil, but they don’t frighten me. Because I know they are a powerless group of isolated and outcast individuals with little to no social standing in their own communities, who have resorted to anonymous online forums for human contact. They are pathetic, and I’m not enough of a coward to shrink away from shadows in a basement.

White supremacy is, of course, evil. It costs me nothing to say that, and it means nothing when I do say it, as everyone either agrees with it already or is a white supremacist and doesn’t care what society thinks about them.

White supremacy is also stupid. It is lazy thinking. It is the kind of mental shortcut that the feeble-minded rely on. It is the sort of excuse that the weak-willed cower to, lacking the testicular fortitude to face their own inadequacies. It’s the kind of pseudo-intellectualism the internet is famous for, citing poorly analyzed statistics, when all it would take is meeting one normal, middle-class African American to see the fatuity of it all — that blacks and whites are the same race, because there is only the one race of Adam.

My comments might make them mad, but what are they going to do? Make memes about me?

There is also a chance my speech could anger Progressives on the ctrl-left. Actually, probably a much bigger chance. And that does scare me.

It scares me so much that I’m actually considering if I even want to speak at all. I have a speech written, and I’ve been practicing it, and I’ve shopped it with a number of friends, and I’ve made edits and timed it perfectly. But I’m thinking of not doing it at all.

I’m afraid of what the ctrl-left could do to me.

What is the ctrl-left? The label is a take on the alt-right designation, though the ctrl-left have been around for a lot longer. Maybe since the Bush administration. They are a political activist class — that is, they are a class of people with nothing else to do but be politically active. They are employed in universities, shutting down conservative voices. They are employed in news stations, selectively editing narratives and choosing which stories to give press time. They are employed at online opinion magazines, and spend all day opining on politics and culture. They are employed in Starbucks, and then spend 14 hours a day on twitter and tumblr investigating the lives of people they disagree with, trying to have them removed from their jobs, or shut down their youtube, facebook, or twitter to prevent them from sharing in electronic public forums. They are employed in tech companies enforcing “community standards” with bans and post removals, which on platform after platform seems to conveniently mean removing opinions on the right of American politics.

The ctrl-left, in essence, want to control what you are allowed to say, and punish you when you say what you are not.

The most recent explosion of this movement has been in antifa, the group of emotional children using acts of literal street violence to suppress and silence dissident voices in the public sphere — which is to say, they are a group of fascists. These jackbooted thugs have been taking to the streets, punching people in the face, smashing up their campuses in temper tantrums, setting fires, and generally acting exactly like the goosestepping authoritarians they are in order to stop people from saying anything that they don’t think people should be able to say anymore.

This latest expression of the ctrl-left doesn’t particularly worry me. I can take physical violence. I can take being punched in the face, or maced, or beaten with a club. I would consider it an honor, actually. Make my day.

What does worry me are the online Social Justice Warriors in the ctrl-left who have nothing better to do with their lives, apparently, than to seek new ways to punish people for wrongthink.

I work in academia. Tenured professors cannot get fired for refusing to attend their own classes for two years, but tenured professors have been fired for daring to injure the precious emotions of the ctrl-left. I’m a mere, lowly teaching assistant. I could lose my job, or be dismissed from school. I could be made unhirable in colleges and tech companies.

If my speech offends the wrong person, they could look to dig up all kinds of stuff on me.

It wouldn’t even be very hard to dig up stuff on me. For most of my life, I was a pretty terrible jerk. Just ask anyone who knew me in high school. Since high school, I have been slightly tolerable. If you had nothing to do but look for reasons to say crap about me, you could find crap to say about me. And the ctrl-left has absolutely nothing else to do.

But even if they can’t find dirt on me, the very act of disagreeing with their orthodoxy is a firable offense. They have power in universities and companies to crush whoever displeases them; and not only do they have it, but they use it.

I know this, so I generally go about my day and just grit my teeth and keep my mouth shut. My fellow students don’t have to keep their mouths shut, because they affirm the accepted dogmata of our thought guardians.

I let them talk and express opinions I disagree with and laugh at people who think the exact things I think and endorse ideologies I completely reject and say nothing, because I just want to get out of here alive, get my PhD, and maybe once I have a job I can rely on, maybe then I’ll be able to breathe again.

And the crux of the story is that I’m just sick of it. I am sick and tired of shutting up. I am done with being expected to receive with full docility the ramblings of this tumblr magisterium. I’m tired of feeling like I can’t speak my mind without retaliation and blowback, while others can express their politics unafraid.

I’m done. I’m done being shut up.

Realistically, I can probably expect nothing. I’m probably over-worrying myself. It’s unlikely anyone will really take notice. It’s an indoor event with a few dozen speakers, and who really wants to attend a meeting like that unless you’re speaking? Local news might pick it up, and they might run two seconds of my three minute speech (probably selectively edited to make it sound like I’m saying something completely opposite of what I’m saying), and then that’s probably it. Maybe some person I know might notice and say something, maybe a student would say they heard I spoke or something, but that’s about it.

In a rational universe, maybe that’s all there needs to be about it. I can just say what I think, people can hear it and agree or disagree with it, we can have back-and-forth, and then we go on our merry ways.

But this is not a rational universe, so who knows what I can expect.

(Author’s Note and General Disclaimer: These are not the only two groups of people with opinions in this country. There are people opposed to the monument who are not part of the ctrl-left and who want civil dialogue and peaceful protest to lead the change. There are people in favor of the monument who are neither white supremacists nor part of the alt-right, and who want all people to be treated with the dignity due all human individuals. There are people on the left who also champion free speech, such as the ACLU, because free speech is not a partisan concern but the birthright of humanity. I know these people exist, because I know them; they are my family and friends and neighbors. With all of these people, I hope to see the American spirit of passionate but nonviolent engagement in the marketplace of ideas continue to drive political discourse. To the ctrl-left and alt-right, I pray that God has mercy on you and grants you repentance from your hatred, violence, and folly.)

An Antidote for Smugness

Suppose Frank and Joe get into a Facebook debate, and suppose Frank knows a lot more about the issue over which they’re disagreeing. Neither one of them is really an expert, but Frank has read a lot more and maybe even has some sort-of relevant background. The longer the discussion goes, the more he realizes that Joe doesn’t really know what he’s talking about.

Let’s give Frank the benefit of the doubt. He’s not just a victim of confirmation bias. Joe really doesn’t know that much about the issue, it’s evident in what he’s written, and Frank’s assessment on that score is accurate.

So, naturally, this can lead Frank to feel a little smug, and smugness is toxic. It’s a poison that clouds our thinking, alienates us from people who could be our friends, and fuels arrogance and pride.

Frank is self-aware enough to realize that he’s having this reaction, but–as it turns out–there’s more to good character than just being able to recognize your own bad behavior. It’s good to suppress an angry outburst, for example, but it’s even better to overcome the anger itself. Controlling behavior is nice, but shaping character is better. Unfortunately, character can’t be shaped directly. We have to come at it sideways and ambush our own bad character traits when they least suspect it. We have to–wherever possible–cheat.

So here’s an idea.

The reason that Frank knows more than Joe about this issue is that Frank took the time to research it. He read dozens of articles. Joe didn’t, and so Joe doesn’t know as much. Instead of attributing his superiority in this one realm to some kind of personal attribute, Frank should ask himself: “What was Joe doing with the time I used to study this issue?”


Maybe Joe is lazy, and was just watching reality TV show reruns. Maybe Joe is actually very curious and diligent, and was using the same amount of time studying some totally unrelated topic which–if they discussed it–would quickly demonstrate to Frank what it feels like to be the one who doesn’t really get it. Or maybe Joe wasn’t studying at all, but spends his time volunteering to make his neighborhood a better place.

It doesn’t really matter, because–in practice–Frank will never know. The point of the question is to ask it, because asking it reframes the context of Frank’s smugness. It’s not about some kind of overall, general superiority of intellect. It’s about the simple fact that Frank spent time studying a particular issue, and Joe didn’t. This is a smugness antidote. On top of dispelling the person-to-person comparison, it raises questions for Frank, such as: Was studying this particular issue really that wise an expenditure of his finite time and energy? Maybe it was, but maybe it wasn’t, all things considered. This should make Frank a little uneasy. That’s healthy. Certainly far healthier than smugness, at any rate.

I am not a very good person. This isn’t a statement of false humility. I suspect, all things considered, that I’m probably about average by most comparisons with others, although I’ll never really be sure. But that’s not the point. I don’t care about comparing myself with others; I care about the gap between who I am and who I’d like to be. And the person I’d like to be doesn’t have to devise strategies for decency or play tricks on himself to mimic virtue. That’s what I mean when I say I’m not a very good person.

But we don’t get to choose the kind of person we are. Not in an instant, anyway. We come into this world with a load of genetic and environmental baggage that, by the time we get around to being thinking, self-aware little human creatures, is already more than we could ever hope to sort through in a lifetime. All we can ever do is start where we are. Hopefully we make incremental steps in the right direction, but human character can’t be perfected in a lifetime. There’s the old expression, “fake it ’till you make it.” We’re never going to make it. So we just have to keep faking it. Play-acting at being a good person–when it’s done out of a sincere desire to learn to be good–is the best we can hope for.

This is one technique I try to remind myself to use in that game, and I thought I’d share it.

Top Ten Teen Albums: Walker Edition

There’s a new Internet/Facebook list going around: “10 albums that made a lasting impression on you as a teenager.” I thought it’d be fun to give you a glimpse into the musical tastes of my teenage self, which largely continue today. Attempting to think of whole albums was a little difficult because this was the age of mix CDs. I had a ton of mix CDs with various artists. I also had a lot of “Greatest Hits” and “The Best of…” albums (I wore out The Cream of Clapton as well as The Best of Bond…James Bond), which I’ve decided not to count. I’ve also limited the list to one album per artist. Otherwise, my list would likely be made up of two bands. It should also be noted that my musical tastes were largely seen through the eyes of a budding guitar player. Virtually everything was interpreted through the filter of, “How can this affect my guitar playing?” So, without further ado, here are my top ten teen albums (in no particular order):

Blink 182 – Enema of the State: I went through a huge Blink 182 phase through middle school and into my freshman year of high school. Aside from some radio play (“Dammit” was actually the first song I ever heard by them), my first proper introduction to them was at scout camp one summer. One of my best friends at the time had Enema of the State with him (I want to say on cassette) and he let me listen to some of the songs as we made our way to different merit badge sessions. This led to mixtapes featuring songs from Enema, Dude Ranch, and even Buddha. My parents weren’t particularly thrilled when they finally heard some of the crude themes and coarse language on these tapes, but that didn’t stop me from getting my secret stash of Blink CDs. It was this album that made me want to play an instrument. I decided I wanted to play bass because (1) everyone and their mother plays guitar and (2) Mark–the Blink bassist–was in my eyes the coolest member of the band (though now I know it’s definitely the drummer Travis Barker). My parents opted for a guitar instead and I’ve never once regretted it. Even though I prefer their 2001 album Take Off Your Pants and Jacket (see what they did there?), Enema is the one that started my musical journey. Below is the highly immature video for the single “What’s My Age Again?”

Incubus – Make Yourself: During one of our family vacations, my older sister Tori let me listen to her copy of Make Yourself by Incubus. I’d heard some of their hits on the radio, but this was the first time going through the entire album. I fell in love with it to the point that my sister just gave it to me. Brandon Boyd’s vocals and the somewhat unique twist on 90s alternative rock stood out to me as did its ability to capture my various teenage emotions, from angst to puppy love to a desire for self-direction. Their follow-up albums during my high school years–Morning View and A Crow Left of the Murder–received even more constant rotation than Make Yourself, but it was this album that began my still ongoing love affair with Incubus and their talent for both capturing my emotional states and transporting me to new ones. Below is the video for their song “Stellar.”

Metallica – Ride the Lightning: I had heard Metallica growing up. Who hasn’t heard “Enter Sandman” or “Nothing Else Matters”? But I only started paying attention to them after hearing their live album S&M in the weight room my freshman year of high school. It was my first year of guitar playing and my initial thought was, “If they are this good live, what are they like on their records?” As I started downloading Metallica songs, I saw them perform one I hadn’t heard before on VH1 (trivia: it was bassist Jason Newsted’s last performance before he left the band). The song was “Fade to Black” and it was found on Ride the Lightning. I bought that album soon after and brought it with me on a family vacation to Washington, D.C. I listened to it non-stop and decided that I wanted to be able to play like guitarists James Hetfield and Kirk Hammett. This was my pathway to metal. I ended up with all of the Metallica albums, as well as a large chunk of Megadeth, Pantera, Ozzy Osbourne, and Dream Theater albums. I probably spent far more time listening to these metal bands than anything else, but it was Ride the Lightning that started it all. My guitar chops improved as did my musical taste because of it. You can see the VH1 performance of “Fade to Black” that ignited the flame below.

Pink Floyd – Dark Side of the Moon: My brother-in-law JC has been a guitar player since he was a teenager (at least). Whenever we would visit my sister, he would always go through his ginormous digital collection of music in hopes of educating me out of my Blink 182 phase and moving me beyond Metallica. He first used David Gilmour’s ending solo in “Comfortably Numb” to pique my interest in Pink Floyd. I ended up getting Echoes: The Best of Pink Floyd (on my own or as a gift, I can’t remember), but Dark Side of the Moon was the first actual album that I listened to heavily (followed by The Wall). The musical style was so different from what I was used to: a kind of progressive, psychedelic rock. Gilmour’s less-is-more melodic playing was such a contrast to the shredding I was accustomed to from metal bands. Plus, the idea of a concept album was pretty new to me. Composing songs that bled into each other as they told a coherent story or relayed similar themes was a new level of creativity for me. Dark Side taught me to slow my playing down and told me that emotion and melody were key to a good lead. You can see what in my estimation is the best version of “Money” from the concert Delicate Sound of Thunder below, which I watched over and over again as a teenager.

Led Zeppelin – Led Zeppelin II: I originally bought Led Zeppelin II for my dad for his birthday(?) one year at the suggestion of my mom. I wasn’t very familiar with Led Zeppelin at the time and even though I thought “Whole Lotta Love” was pretty cool, it didn’t pique my interest all that much. However, after picking up the guitar and shifting away from pop punk bands, I started “borrowing” (i.e., making a permanent part of my personal collection) Led Zeppelin II from my dad. While I acquired The Best of Led Zeppelin: Early & Latter Days, Vol. 1 & 2, it was this album that made me really appreciate the bluesy elements of rock. It felt like a bridge between the old and the new, between traditional blues and modern rock. And everyone was amazing: Plant’s vocals, Page’s guitar, Jones’ bass, and Bonham’s drums. Hard to find a band in which every member is of the highest caliber. And yes: I still prefer it to Led Zeppelin IV. You can see them performing “Whole Lotta Love” live from the Led Zeppelin DVD I watched consistently in my later years of high school.

Pearl Jam – Ten: I used to hate Pearl Jam. I remember revealing my dislike of them to a bass player friend my freshman/sophomore year and he was flabbergasted that a guitar player would not like them. “But they’re such good musicians!” he protested. I don’t know what it was about them. Maybe it was Eddie Vedder’s voice (a co-worker of mine once described him as sounding like a man singing in a freezer). Maybe it was the flannel. My suspicion is that I just had not been properly exposed to them beyond “Jeremy” (which is an awesome song, mind you). I started downloading a number of Pearl Jam songs in my later years of high school and found myself appreciating them more and more. I finally caved and bought Ten. The album was (and is) phenomenal. There isn’t a song on it that isn’t top-notch. You can see them on full display with “Alive” below.

Rush – Moving Pictures: My first introduction to Rush was their video for “Time Stand Still” early one weekday morning on VH1. The video was ridiculous, but there was something about the band that I really liked. I stumbled on them again when “Test For Echo” came on one of those satellite music channels that I had playing in the background one day. I recognized the vocals and the band name and once again found myself being drawn to their style. On another fateful weekday morning, I saw their video for “Limelight” on VH1. The video was incredibly dated, but the song blew me away. Their mastery of the instruments was incredible and I was sold on Alex Lifeson’s whammy-heavy solo. I had to have that song. I ended up buying Moving Pictures soon after. I couldn’t believe that a trio could create that kind of sound. Typically, I focused solely on the guitar playing, but Rush made it impossible to ignore Lee’s bass playing or Peart’s drumming. This opened the floodgates: virtually every album and a couple concerts (one of which was as recent as 2015) later, I still consider them one of my favorites. You can see the video for “Limelight” below.

Alice in Chains – Dirt: Lucky for me, my YM/Scout leader for the longest time was also the dad of one of my best friends. While I was listening to Blink 182, my friend (due to his dad’s influence) was listening to the likes of Led Zeppelin, Pink Floyd, and even the chainsaw-wielding Jackyl. One Wednesday night, as I caught a ride with my friend, his dad popped in one of his many CDs. Suddenly, a chugging, metal chord progression filled the car, along with a jolting scream and eerie harmonies. I was caught off guard, but thoroughly entranced. About halfway through, the guitarist ripped into a headbanging solo. My ears perked up. The song, unfortunately, came to an end after only a couple minutes. When I asked what this was, my friend’s dad answered (with a smile), “Alice in Chains.” The album was Dirt and the song was “Them Bones.” I borrowed the album, ripped it, and became an AIC fan from then on. Jerry Cantrell, the guitarist and co-vocalist, provided a blues-based, melodic metal I could rock out to. More importantly, he provided a type of playing that seemed achievable: not because his playing was sub-par, but because it evidenced a moderate partaking of the best rock music had to offer. Cantrell was not a shredder, a blues master, or a progressive rock composer (he still isn’t). But he was and is a fine guitar player, lyricist, and all-around musician. He instilled me with confidence and inspiration in my first few years of playing and remains influential even today. You can see the video for “Them Bones” below.

Fleetwood Mac – Rumours: For Christmas one year I received a year-long subscription to Guitar World magazine. One of the issues featured a kind of boxing bracket for guitarists, rating them on a 0-5 scale on things like chops, influence, creativity, etc. Unfortunately, my mother trashed all of my Guitar World issues while I was on my mission (I’m still not sure why), so I’m unable to reference it properly. But at the time, I used the bracket to learn about guitarists I had never heard of before. At one point, I came across the name Lindsey Buckingham with something like a 3.7 in chops. When I discovered that he was the guitarist/co-vocalist of Fleetwood Mac, I remembered that my mom had Rumours in her van. I promptly borrowed the album and began soaking in Buckingham’s fingerpicked style. The album remained in constant rotation along with their (then) new album Say You Will. The mix of male and female vocals gave it a more diverse sound than I was used to and my enjoyment of Rumours’ more pop-oriented style helped expand my musical palate. You can see their performance of “The Chain” from their live album The Dance (which I also spent a fair amount of time listening to) below.

Stevie Ray Vaughan and Double Trouble – Texas Flood: In my first year of guitar playing (and therefore still in my pop punk phase), I had a Sunday School teacher who recognized that my fellow Blinkophile friend and I were “big into music.” One day, she held us after class and gave each of us a copy of SRV’s Texas Flood. Because we were guitar players, she knew we would appreciate SRV’s skills. In actuality, we both had a bit of an aversion to the album: it was straight blues and that just wasn’t us. We were “punk rockers” and SRV was definitely not that. Fast-forward a year or so. I was going through my CD collection and pulled out Texas Flood to give it another listen. I was floored: the tone, the bends, the precision. It was beautiful. While other artists (see Zeppelin and Floyd above) opened the door to a bluesier style, it was this album that solidified the blues in my book. It paved the way for my embrace of other blues guitarists like B.B. King, Albert King, Buddy Guy, Joe Bonamassa, Robben Ford, and others. Texas Flood is the reason I earned the name “Blues Man” from a co-worker due to my Pandora picks. You can see SRV & Double Trouble performing my favorite track off the album–“Lenny”–below.

Here are a few honorable mentions with a brief explanation:

Megadeth – Rust in Peace: After Metallica, Megadeth was the next biggest metal band I listened to (Dave Mustaine was a member of Metallica before they kicked him out). Rust in Peace was the first album of theirs I bought and it is still my favorite.

Tool – Lateralus: A friend of mine had a select few bands that he insisted were required listening. One of them was Tool and he made me promise to listen to Lateralus all the way through without stopping. If I loved it, he would burn me the rest of their albums. I did and he did.

Prince – Purple Rain: I probably listened to “The Very Best of…” more, but Purple Rain put Prince’s skills on full display. The talent of the man was almost sickening.

Les Miserables: Original Broadway Cast: My older sister Nicole was a big theatre geek, so Les Miserables became a staple of my growing up. I still listen to it fairly often and I wept like a baby at the end of the 2012 film.

Phantom of the Opera: Original London Cast: Ditto. I dragged my high school girlfriend to the 2004 film. She kept trying to get friendly in the theater and I kept telling her to leave me alone so I could watch the movie. Priorities.


There you have it: my top ten teen albums.

Two Stupids Don’t Make a Smart

Wikipedia: "Symphony of the Stones carved by the Goght River at Garni Gorge in Armenia is an example of an emergent natural structure." Released by WOWARMENIA for Wikimedia under Creative Commons Attribution-Share Alike license

I didn’t get a chance to make that pithy observation in a Facebook exchange this morning because my interlocutor gave me the boot. That’s OK, I may have been blocked from somebody’s Facebook feed for thinking bad thoughts, but I can’t get blocked from my own blog! You can’t stop this signal, baby.

So, just as two wrongs don’t make a right, let’s use this Columbus Day to talk about two stupids that don’t make a smart.

Bad Idea 1: The Noble Savage

There’s a school of thought which holds, essentially, that everything was fine and dandy in the Americas until the Europeans came along and ruined it. The idea, seen in Disney and plenty of other places, is that “native” peoples lived in harmony with the Earth, appreciating the fragile balance of their precious ecosystems and proactively maintaining it. This idea is bunk. The reality is that in almost all cases the only limit on a culture’s exploitation of natural resources is technological. Specifically, humanity has an unambiguous track record of killing everything edible in sight as it spread across the globe, leading to widespread extinctions from Australia to the Americas and upending entire ecosystems. If our ancient ancestors didn’t wipe a species out, the reason was either that it didn’t taste good or that they couldn’t. As Yuval Noah Harari put it in Sapiens:

Don’t believe the tree-huggers who claim that our ancestors lived in harmony with nature. Long before the Industrial Revolution, Homo sapiens held the record among all organisms for driving the most plant and animal species to their extinctions. We have the dubious distinction of being the deadliest species in the annals of biology.

Harari specifically describes how the first humans to discover Australia not only wiped out species after species, but–in so doing–converted the entire continent into (pretty much) the desert it is today:

The settlers of Australia–or, more accurately, its conquerors–didn’t just adapt, they transformed the Australian ecosystem beyond recognition. The first human footprint on a sandy Australian beach was immediately washed away by the waves, yet, when the invaders advanced inland, they left behind a different footprint. One that would never be expunged.

Matt Ridley, in The Origins of Virtue, lists some of the animals that no longer exist thanks to hungry humans:

Soon after the arrival of the first people in Australia, possibly 60,000 years ago, a whole guild of large beasts vanished — marsupial rhinos, giant diprotodons, tree fellers, marsupial lions, five kinds of giant wombat, seven kinds of short-faced kangaroos, eight kinds of giant kangaroo, a two-hundred-kilogram flightless bird. Even the kangaroo species that survived shrank dramatically in size, a classic evolutionary response to heavy predation.

And that pattern was repeated again and again. Harari again:

Mass extinctions akin to the archetypal Australian decimation occurred again and again in the ensuing millennia whenever people settled another part of the outer world.

Have you ever wondered why the Americas don’t have the biodiversity of large animals that Africa does? We’ve got some deer and bison, but nothing like the hippos, giraffes, elephants, and other African megafauna. Why not? Because the first humans to get here killed and ate them all, that’s why not. There’s even a name for what happened: the Pleistocene overkill. Back to Ridley:

Coincident with the first certain arrival of people in North America, 11,500 years ago, 73% of the large mammal genera quickly died out… By 8000 years ago, 80% of the large mammal genera in South America were also extinct — giant sloths, giant armadillos, giant guanacos, giant capybaras, anteaters the size of horses.

In Madagascar, he notes that “at least 17 species of lemurs (all the diurnal ones larger than 10 kg in weight, one as big as a gorilla), and the remarkable elephant birds — the biggest of which weighed 1,000 pounds — were dead within a few centuries of the island’s first colonization by people in about 500 A.D.” In New Zealand, “the first Maoris sat down and ate their way through all 12 species of the giant moa birds. . . Half of all New Zealand’s indigenous land birds are extinct.” The same thing happened in Hawaii, where at least half of the 100 unique Hawaiian birds were extinct shortly after humans arrived. “In all, as the Polynesians colonized the Pacific, they extinguished 20% of all the bird species on earth.”

Ridley’s myth-busting doesn’t end there. He cites four different studies of Amazon Indians “that have directly tested their conservation ethic.” The results? “All four rejected the hypothesis [that the tribes had a conservation ethic].” Moving up to North America, he writes that “There is no evidence that the ‘thank-you-dead-animal’ ritual was a part of Indian folklore before the 20th century,” and quotes Nicanor Gonzalez: “At no time have indigenous groups included the concepts of conservation and ecology in their traditional vocabulary.”

This might all sound a little bit harsh, but it’s important to be realistic. Why? Because these myths–no matter how good the intentions behind them–are corrosive. The idea of the Noble Savage is intrinsically patronizing. It says that “primitive” or “native” cultures are valuable to the extent that they are also virtuous. That’s not how human rights should work. We are valuable–all of us–intrinsically. Not “contingent on passing some test of ecological virtue” (as Ridley puts it).

Let me take a very brief tangent. Ridley’s argument here (as it relates to conservation) is exactly parallel to John McWhorter’s linguistic arguments and Steven Pinker’s psychological arguments. In The Language Hoax, John McWhorter takes down the Sapir-Whorf Hypothesis, the trendy linguistic theory that what you think is determined by the language you think it in. Just like the Noble Savage, this idea was originally invented by Westerners on behalf of, well, everybody else. The idea is that “primitive” people were more in contact with the timeless mysteries of the cosmos because (for example) they spoke in a language that didn’t use tense. Not only did this turn out to be factually incorrect (they just marked tense differently, or implied it from context, as many European languages also do), but it’s an intrinsically bad idea. McWhorter:

In the quest to dissuade the public from cultural myopia, this kind of thinking has veered into exotification. The starting point is, without a doubt, I respect that you are not like me. However, in a socio-cultural context in which that respect is processed as intellectually and morally enlightened, inevitably, to harbor that respect comes to be associated with what it is to do right and to be right as a person. An ideological mission creep thus sets in. Respect will magnify into something more active and passionate. The new watchcry becomes, “I like that you are not like me,” or alternately, “What I like about you is that you are not like me.” That watchcry signifies, “What’s good about you is that you are not like me.” Note however, the object of that encomium, has little reason to feel genuinely praised. His being not like a Westerner is neither what he feels as his personhood or self-worth, nor what we should consider it to be, either explicitly or implicitly.

The cute stories about the languages primitive peoples speak and the ways that enables them to see the world in unique and special ways end up being nothing but a particularly subtle form of cultural imperialism: our values are being used to determine the value of their culture. All we did was change up the values by which we pass judgement on others. Thus: “our characterization of indigenous people in this fashion is more for our own benefit than theirs.”

The underlying premise of Harari, Ridley, and McWhorter is what Steven Pinker’s The Blank Slate tackles directly: the universality of human nature. We can best avoid the bigotry and discrimination that has marred our history not by a counter-bigotry that holds up other cultures as special or superior (either because they’re in magical harmony with nature or possess unique linguistic insights) but by reaffirming that there is a universal, underlying human nature that unites all cultures.

Universal human nature is not a byproduct of political wishful thinking, by the way. Steven Pinker includes as an appendix to The Blank Slate a long list of human universals compiled by Donald E. Brown. It is a long list, organized alphabetically. To give a glimpse of the sorts of behaviors and attributes common to all human cultures, here are the first and last few items from the list:

  • abstraction in speech and thought
  • actions under self-control distinguished from those not under control
  • aesthetics
  • affection expressed and felt
  • age grades
  • age statuses
  • age terms
  • vowel contrasts
  • weaning
  • weapons
  • weather control (attempts to)
  • white (color term)
  • world view

The list also includes lots of stuff about binary gender which is exactly why you haven’t heard of the list and why Steven Pinker is considered a rogue iconoclast. These days, one does not simply claim that gender is binary.


I’ve spent a lot of time on the idea of the Noble Savage as it relates to ecology, but of course it’s a broader concept than that. I was once yelled at quite forcefully by a presenter trying to teach us kids that warfare did not exist among pre-Columbian Native Americans. I was only 11 or 12 at the time, but I knew that was BS and said so.

The point is that the whole notion of a mosaic of Native Americans living in peace and prosperity until the evil Christopher Columbus showed up and ruined everything is a bad idea. It’s stupid number 1.

Bad Idea 2: Christopher Columbus is Just Misunderstood

So, this is the claim that started the discussion that got me blocked by somebody on Facebook today. The argument, such as it was, goes something like this: Columbus looks very bad from our 21st century viewpoint, but that’s an unfair, anachronistic standard. By the standards of his day, he was just fine, and those are the standards by which he should be measured.

The problem with this idea is that, like the first, it’s simply not true. One of the best popular accounts of why comes from The Oatmeal. In this comic, Matthew Inman contrasts Columbus with a contemporary: Bartolomé de las Casas. While Columbus and his ilk were off getting up to various hijinks including (but not limited to) child sex slavery and using dismemberment to motivate slaves to gather more gold, de las Casas was busy arguing that indigenous people deserved rights and that slavery should be abolished. Yes, at the time of Columbus.

The argument that if we judge Columbus by the standards of his day he comes out OK does not hold up. We can find plenty of people at that time–not just de las Casas–who were abolitionists or (if they didn’t go that far) were critical of the excessive cruelty of Columbus and many like him. Keep in mind that slavery had been a thing in Europe for thousands of years until the Catholics finally stamped it out around the 10th century. So it’s not like opposition to slavery is a modern invention. When slavery was restarted in Africa and then the Americas, many in the Catholic clergy opposed it once again, but were unable to stop it. So the idea that–by the standards of his day–Columbus was just fine and dandy doesn’t work. He’s a pretty bad guy in any century.

Two Stupids Don’t Make a Smart

I understand the temptation to respond to Noble Savage-type denunciations of Christopher Columbus by trying to defend the guy. You see somebody making a bad argument, and you want to argue that they’re wrong.

But that isn’t how logic actually works. A broken clock really is right twice a day, and a bad argument can still have a true conclusion. If I tell you that 2+2 = 4 because Mars is in the House of the Platypus my argument is totally wrong, but my conclusion is still true.

The Noble Savage is a bad bit of cultural baggage we really should jettison, but Columbus is still a bad guy no matter how you slice it. Using one stupid idea to combat another stupid idea doesn’t actually enlighten anyone.

The Bell Curve of Extremism

There are basically two kinds of moderates/independents: the ignorant and the wise. It really is a sad twist of fate to stick the two together, but nobody honest ever said life was fair.

To illustrate, let me introduce you to a concept I’ll call the Bell Curve of Extremism:



To flesh this out, I’ll use some examples from voting.

A person on the left doesn’t know who they’re going to vote for because they don’t know much of anything at all. They may not even know who’s running or who’s already in office. This doesn’t mean they’re stupid, necessarily. They could be brilliant, but just pay no attention to politics.

A person in the middle knows exactly who they’re voting for, and it’s never really been in question. What’s more, they can give you a very long list of the reasons they are voting for that person and–on top of that–all the terrible, horrible things about the leading contender that make him or her totally unfit for office and a threat to truth, justice, and the American Way. This is the kind of person who consumes a lot of news, but probably from a narrow range of sources, like DailyKos or RedState. They’re not bad people, but their high motivation tends to lead to an awful lot of research that is heavily skewed by confirmation bias.

A person on the right may also be unsure of how they’re going to cast their vote, but it’s not because they don’t know what’s going on. The problem is they do, and this knowledge has led them (as often as not) to fall right off the traditional left/right axis. I called myself a radical moderate when I was in high school. At the time, it was mostly because I was on the far left but I wanted to sound cool. Later on in life I found myself near the peak of the bell curve, a die-hard conservative with all the answers who was half-convinced that liberals were undermining the country. But then I went to graduate school to study economics (one of the areas where I was staunchly conservative) and lo and behold: things got complicated. I fell off the peak and I’ve been sliding down the slope ever since. And what do you know, but I found out recently that radical moderates are actually a thing. They even include some of my very favorite thinkers, like John McWhorter (cited above) and Jonathan Haidt (cited in a lot of my posts). I’ve come full circle, from know-nothing moderate to know-that-I-know-nothing radical moderate.

It’s kind of lonely and depressing over here, to be honest, and we don’t often find an awful lot to shout about. Which is why the conversation tends to be dominated by peak-extremists who know just enough to be dangerous. About the only banner you’ll see us waving is the banner of epistemic humility. And really, how big of a parade can you expect to line up behind “People probably don’t know as much as they think they do (including us!)”?

But one thing that I can share with some conviction is this post, and the idea that–when it comes to ideas–fighting fire with fire just burns the whole house down. There is validity to the idea that things were better before Christopher Columbus showed up. There was a helpful lack of measles and smallpox, for example. But blaming the transmission of those diseases (except in the rare cases when it was intentional) and the resulting humanitarian catastrophe on Columbus doesn’t make any sense. He did a lot of really evil things, but intentional germ warfare was not among them. Relying on it because the numbers are so big is lazy. There is also validity to the idea that Columbus lived in a different time. Many of the most compassionate Westerners were motivated not by a modern sense of equal rights but by a more feudal-tinged idea of noblesse oblige. De las Casas himself, for example, first suggested making things easier on Caribbean slaves by importing more African slaves before later deciding that all slavery was a bad idea. And if you fast-forward to the 19th-century abolitionist movements, you’ll find plenty of what counts as racism in the 21st century among abolitionists who were motivated (in some cases) by ideas of civilizing the savages. Racial politics are complicated enough in the 21st century alone; of course we can’t bring in perspectives from six centuries ago and expect all the good guys to align neatly on bullet points of focus-group-vetted talking points!

So yes: I see validity to both sides of the fight. If your goal is to win in the short term, then the most useful thing to do is double down on your strongest arguments and cherry-pick the other side’s weakest points. This is the strategy of two stupids making a smart, and it doesn’t work.

If your goal is to win in the long term, then you have to undergo a fundamental transformation of perspective. The short-term model isn’t just short-term. It’s ego-centric. The fundamental conceit of the idea of winning is the idea of being right, as an individual. Your view is the correct one, and the idea is to have your idea colonize other people’s brains. It is unavoidably an ego-trip.

The long-term model isn’t just about the long term. It’s also about seeing the whole that is more than the sum of the parts. In this view, the likeliest scenario is that nobody is right because, on any particular suitably complex question, we are like the world before Newton and the world before Einstein: waiting for a new solution no one has thought of. And, even if somebody does have the right solution to the problem we face now, that will almost certainly not be the right answer to the problem we will face tomorrow. In that case, it’s not about having the right ideas in the heads of the right people, it’s about having a culture and a society that is capable of supporting a robust ecosystem of idea-creation. The focus begins to shift away from the “I” and towards the “we.”

In this model, your job is not to be the one, singular, heroic Messiah who tells everyone the answer to their problem. Your job is to play your part in a larger collective. Maybe that means you should be the lone voice calling from the wilderness, the revolutionary prophet like Newton or Einstein. But more probably it means your job is to simply be one more ant carrying one more grain of sand to build the collective pile of human knowledge and maybe–through conversations with friends and family–shift the center of gravity infinitesimally in a better direction.

I’m not a relativist. I’m a staunch realist in the sense that I believe in an objective, underlying reality that is not dependent on social construction or individual interpretation. But I’m also a realist in the sense of acknowledging that the last living human being to have understood the entire domain of mathematics was Carl Friedrich Gauss, and he died in 1855. No living person today understands all mathematical theory. And that’s just math. What about physics and history and chemistry and psychology? And that’s just human knowledge. What about the things nobody knows or has thought of yet? An individual is tiny, and so is their sphere of knowledge. The idea that the answers to really big questions fall within that itty-bitty radius seems correspondingly remote. In short: the truth is out there, but you probably don’t have it and you probably can’t find it. It may very well be, to keep the metaphor going, that the answers to some of our questions are too complex for any one person to hold in their brain, even if one could be discovered.

I’m not giving up on truth. I am giving up on atomic individualism, on the idea that the end of our consideration with regard to truth is the question of how much of it we can fit into our individual skulls. That seems very small-minded, if you’ll pardon the pun. Instead, I’m much more interested in the ways in which individuals can do their part to contribute to building a society that may understand more than its constituent individuals do or (since that seems a bit speculative, even to me) at a minimum provides ample opportunity for individuals to create, express, and compare ideas in the hope of discovering something new.

Two stupids can’t make a smart. The oversimplification and prejudice necessary to play that strategy is not worth the cost. Winning debates is not the ultimate goal. We can aim for something higher, but we have to be willing to lay down our own egos in the process and contribute to something bigger.

Obedience Out of Love

This post is a talk that I gave in my congregation a couple of weeks ago. A few folks asked me for copies, so I thought putting it online would be the simplest approach.

Love or Fear

I have heard it said that every decision a human makes fundamentally comes down to one of only two motivations: fear or love. That’s it.

Scientists are a little less romantic about it, but they actually have the same basic concept. From biology to computer science, whether you’re talking about an amoeba or an artificial intelligence, the fundamental choice every agent has to make comes down to attraction or avoidance. You are attracted to the good stuff. You avoid the bad stuff. If you’re a bacterium, it means you move towards food and you move away from anything that thinks you’re food. So these are the two motives any creature can have: we either move towards what we want or we move away from what we don’t want.

An Irritant or a Quest

President Benson said

When obedience ceases to be an irritant and becomes our quest, in that moment God will endow us with power. 

What I want to talk about is how we make that transition. How do we change our attitude towards obedience? How do we move beyond the place where obedience feels like a burden and get to the place where obedience feels like a challenge? How do we turn obedience from an irritant into a quest?

I believe that it comes down to fear and love. We have to wean ourselves away from fear-based obedience and towards love-based obedience. We have to fear less and love more. It’s like Paul told Timothy:

For God hath not given us the spirit of fear; but of power, and of love, and of a sound mind.

Fearful Obedience

On my mission I set goals all the time. My companion and I would sit down, we’d pray, and we’d set “realistic” goals. We’d set goals that—sitting in the apartment, feeling charged to go out and do the Lord’s work—seemed easily attainable.

I don’t think we hit 50% of our weekly goals a single time. Not even once. And yet somehow, we never learned. Every week we felt really horrible about how badly we were missing our goals, and every week we rallied and did the exact same thing.

The Lord can work with all kinds of tools, but I’m pretty sure that even he appreciates the value of a sharp instrument over a blunt one. As a missionary I was definitely not the sharpest tool in the shed. It never occurred to me, not even once, that the only realistic way to set goals would be to start with what we had actually accomplished the previous week and then build from there.

It’s not actually that I was too dumb. The truth is that I was too proud to admit how far from perfect I was. I wanted to think of myself as a good missionary. I’d wanted to serve my whole life, I was following the rules, and I truly wanted to be there. So I just assumed—naturally—given all my good intentions I had to be pretty good, right?

Well, no. First of all, that’s not realistic. That’s just wishful thinking. I try to be a lot more realistic now than I was then. (My wife doesn’t think I’m very good at that, yet.) More importantly, however, I was operating out of fear. It was fear-based obedience. I was afraid of failure. And, I have come to learn, there is a mile of difference between trying to avoid failure and pursuing success. They may look similar from the outside, but from the inside they could not be more opposite.

Trying to be obedient out of fear means that you’re in constant stress. You’re unwilling to take risks—and risks are necessary for growth. Over time, this can lead to shriveling and atrophy. You remember the parable of the talents? The rich man gives his servants 5 talents, 2 talents, and 1 talent. The story has always bugged me, because it’s the poor guy who only gets 1 talent that messes it up. I’d like the story more if it was the guy who got 5 talents who was lazy. But that’s not the point. The point is that the first two invested. They risked. They doubled their talents. But the last guy? He was so afraid of losing his talent he just buried it. That’s fear right there.

Still Better than Disobedience

Let me pause for a second and make a very important disclaimer. Fear-based obedience is not as good as love-based obedience, but it’s still a whole lot better than disobedience. I don’t want anybody misunderstanding me on that point, OK?

And there’s a reason for that.

The laws of nature, the laws of God, the laws of life, are one and the same and are always in full force.

No matter why you are obedient, you’re still going to enjoy at least some of the blessings of that obedience because the laws of God are always in full force. The laws of physics don’t care why you buckle your seat-belt, right? If you get into an accident, your motivation does not enter into the equation. If you have the seat-belt on, you’re going to be safe. If you have it off, you’re going to be in a lot more danger.

O my beloved young friends, even selfishly it is smart to keep the commandments God has given.

So, step 1 is be obedient. What we’re talking about now is step 2, which is how to be obedient.

Look, if your option is to either be obedient because you feel like you’re supposed to or be disobedient, then go ahead: be obedient out of obligation. When I was growing up I avoided a lot of pitfalls because I was afraid. That’s the honest truth. I didn’t smoke, I didn’t drink, I didn’t watch pornography. It’s not because I’m such an awesome, righteous guy. It’s because I was risk averse. I saw that a lot of bad stuff came along with drinking and smoking, and in general I never wanted to lose control. I was obedient because I was afraid. That’s better than disobedient. And there are plenty of days when I just don’t feel all excited about following the commandments, and I just go through the motions because it’s what I promised to do. Going through the motions is better than not doing it at all.

So, fearful obedience is better than disobedience, but it’s not that great. We want love-based obedience. This is tricky, in a way, because we’re changing horses in mid-stream. We need to find a way out of fear-based obedience and into love-based obedience. Let’s start with letting go of fear.

A Remission of Sins

I have always been struck by the phrase “remission of sins.” We pretty much always hear about it around baptism.

John did baptize in the wilderness, and preach the baptism of repentance for the remission of sins.

And the only other time we ever use the word “remission” is when we’re talking about cancer. I’m not a scholar, but I know that the word remission has two definitions. It can mean “the cancellation of a debt” or it can mean “a temporary recovery.” And I’m not certain which one fits best with our understanding of baptism. But as I understand it, the idea of cancer going into remission and the idea of sins going into remission is basically the same. The one difference is this: when your cancer goes into remission you can’t control if or when it will come back. But when your sins go into remission you are in control. As long as you abide by the covenant you made when you were baptized, they are in remission.

What this means to me is: you don’t have to be afraid. You don’t have to live in fear. When you have faith in Christ, you get to live in hope.

That’s the first key to transitioning away from fear-based obedience.


The second key is being humble. That was my biggest problem as a goal-setting missionary. I was too proud to admit how weak I was. That got a lot easier as I got older. My life, in many ways, has been a string of disastrous failures ever since I got home from my mission. I have failed at so many things and in so many ways and with such utter gracelessness that I have been blessed with the inability to take myself very seriously anymore. I am like the poor Zoramites who were not allowed into the synagogues that they had built.

because ye are compelled to be humble blessed are ye

This humiliation has been a great blessing. It has taught me that fear of your own sins is a kind of arrogance. It is like saying that your evil is greater than Christ’s good. It is like saying that you have the ability to dig a hole too deep for your Savior to lift you out of. And if there’s only one thing you remember from my talk, let it be this: as long as you want to be saved there are no holes that deep.

When you are humble, failure loses the power to intimidate you. That’s why the devil hates humble people: they are practically impossible to push around. When you have faith and are humble, you are ready to let go of fear.

Godly Ambition

I’m going to share one of my favorite quotes. It’s from a man named Ira Glass, and—on the surface, at least—it won’t sound very religious. But it is. You’ll see. Here is his quote:

Nobody tells this to people who are beginners, I wish someone told me. All of us who do creative work, we get into it because we have good taste. But there is this gap. For the first couple years you make stuff, it’s just not that good. It’s trying to be good, it has potential, but it’s not. But your taste, the thing that got you into the game, is still killer. And your taste is why your work disappoints you. A lot of people never get past this phase, they quit. Most people I know who do interesting, creative work went through years of this. We know our work doesn’t have this special thing that we want it to have. We all go through this. And if you are just starting out or you are still in this phase, you gotta know it’s normal and the most important thing you can do is do a lot of work. Put yourself on a deadline so that every week you will finish one story. It is only by going through a volume of work that you will close that gap, and your work will be as good as your ambitions. And I took longer to figure out how to do this than anyone I’ve ever met. It’s gonna take a while. It’s normal to take a while. You’ve just gotta fight your way through.

This quote is about love and fear. All artists have to confront the gap between what they want to do (make good art) and what they are doing (making terrible art) and then they have to make a decision: love or fear. If they choose fear, they will quit, because the pain of failure is too much. If they choose love, they will refuse to give up. And that means they will keep on failing. They will write bad stories. They will write terrible poetry. They will take horrible photographs. And they will do it again and again and again until they get it right.

This applies to all of us.

Every thing which inviteth to do good, and to persuade to believe in Christ, is sent forth by the power and gift of Christ

We all see light and truth and kindness and beauty in our lives at some point. We love those things. And we all see our own actions and we see darkness and deception, selfishness and ugliness. And then we have to choose: love or fear.

If we choose love, then—just like the artists—we go right on ahead and keep failing. We fail at being perfectly kind. We fail at being perfectly wise. We fail at being perfectly honest. And every morning we get up and we do it again and again until one day we get it right. And that is what it means to be a saint.

A saint is not necessarily a person who is perfect, but he is a person who strives for perfection—one who tries to overcome those faults and failings which take him away from God. A true saint will seek to change his manner of living to conform more closely to the ways of the Lord.

Artists learn to be better artists by first being bad artists. They practice. And people learn to be better people by first being bad people. They practice. And the name for that practice is: obedience. When you “seek to change [your] manner of living to conform more closely to the ways of the Lord” because you love the things which “inviteth to do good” then—for you—obedience has become a quest. Keeping the commandments will always be hard, but it will no longer feel like a burden because you will understand that keeping the commandments is the path to becoming the kind of person you would be proud to be.

If you’d like a printable copy of the talk, here you go.

“I know who you are now, and I name you my enemy.”

For those who haven’t read The Screwtape Letters by C.S. Lewis, Wormwood is the demon the letters are addressed to. It’s Wormwood’s job to weaken faith and encourage sin in the human he’s assigned to, and the letters are from his uncle, a demon named Screwtape, who gives Wormwood advice on how to do this.

Yesterday while meandering through Spotify, I came across the song “Dear Wormwood” by the Oh Hellos.  From what I can tell, the song is about a demon who weakened the singer’s faith since childhood and how the (now adult) singer is recognizing and trying to overcome the demon’s influence.

I’m a secularist, and by that I mean I don’t practice a religion and don’t have faith in anything supernatural. But I’m a reluctant secularist, and by that I mean I had good experiences with the religion of my childhood, I miss it and wish it were true, but I don’t actually believe it is. From that context, the song kind of hits a nerve.

You can listen to it here:

Here are the lyrics, though I recommend listening to it first or concurrently rather than reading them on their own:

When I was a child, I didn’t hear a single word you said
The things I was afraid of, they were all confined beneath my bed
But the years have been long, and you have taught me well to hide away
The things that I believed in, you’ve taught me to call them all escapes

I know who you are now

There before the threshold, I saw a brighter world beyond myself
And in my hour of weakness, you were there to see my courage fail
For the years have been long, and you have taught me well to sit and wait
Planning without acting, steadily becoming what I hate

I know who you are now

I have always known you, you have always been there in my mind
But now I understand you, and I will not be part of your designs

I know who I am now
And all that you’ve made of me
I know who you are now
And I name you my enemy

I know who I am now
I know who I want to be
I want to be more than this devil inside of me