“Christians get divorced at a rate equal to non-Christians.”
Ever hear that one? If you’ve gone to church for any length of time, especially a more hip and “modern” church, you’ve probably heard it said again and again and again. If you have Christian friends, you’ve probably heard it repeated ad nauseam on Facebook and Twitter. Strange thing: I’ve never heard any non-Christian talk about it…only Christians (well, I take that back…one gay man brought it up in a discussion on Facebook. Link below.).
It is common knowledge amongst believers nowadays. The thing is, the research it’s based on is kinda shoddy. I’ve blogged about it before, and the other day I ran into another person who has analyzed that often-quoted stat. Brad Wright is a sociology professor at the University of Connecticut, and in a whole series on his blog he evaluates the research that birthed that stat. It is all well worth a read.
The main flaw he points out is that Barna, the research group that published the stat (it was soon thereafter picked up by Ron Sider in his book The Scandal of the Evangelical Conscience, which Wright also reviews in the series), effectively compared Christians to Christians in their analysis. That is, they counted only evangelicals as Christians, while collapsing all other groups into one category: all Catholics, Mainline Protestants, etc., were grouped with atheists and agnostics. Perhaps many of the Catholics et al. were not Christians, but no doubt some were. If they had wanted to compare evangelicals to, say, Mainline Protestants and/or Catholics, that would have been interesting, but to lump them into one category with those of other religions and those of no religion was sloppy…especially so given that Barna labeled the latter group “non-Christians.”
Wright also analyzes other data outside of the Barna Group and suggests that we should question the conventional wisdom. He also shows how frequent church attendance correlates strongly with a low divorce rate. For example, among Evangelicals alone, frequent attendance makes an almost 20% difference in the data. While he is careful to note that there might be causal explanations other than Christianity lowering divorce rates (after all, correlation does not mean causation), at the very least his analysis is a strong reminder to be discerning in what we accept.
Should we now pat ourselves on the back and comfort ourselves that “it’s not as bad as we think”? Well, no. Exhortations to greater holiness are always a pressing concern for the faithful. What should we take away from this, then? A few things:
For one, no matter the righteousness of one’s cause, we must remember a greater duty: a duty to the truth. It might be mighty persuasive to our fellow Christ-followers if we toss out an alarming stat. Such is the way of sensational news. However, we are first and foremost people of the truth. Our Lord was the Way, the Truth, and the Life. We need to reflect that, and this includes what we use to convince others to do better in their devotion. Simple as that.
Relatedly, even if the stat/data/evidence/persuasive support is useful, we should pause and do the necessary work to discern the details behind the scenes. This applies to Wright as well, by the way. Perhaps he’s missing something somewhere. Actually, there are a few things about his analysis that smell off to me. For one, the data he cites marks only professed belief, not actual belief. The former isn’t exactly 100% accurate. Second, the frequent church attendance marker says nothing about the theological content of said church. While attendees are separated by Evangelical/Mainline/Catholic markers, this is far from sufficient. There are churches out there that call themselves Evangelical (or at least are categorized by others as Evangelical) that have the theological consistency of a frightened bat. Jesus wouldn’t recognize those folks even if they wore t-shirts picturing His face with the word “hope” under it.
Perhaps the markers above are the best sociologists have. If so, they can’t be faulted, I reckon, but that doesn’t mean these aren’t shortcomings of the research.
You might be tempted to dismiss what I said above about the need not to cite useful stats that aren’t exactly truthful. Maybe your experience convinces you. Perhaps you think that there’s no harm in it, and that much good could actually come from shocking our fellow Christians into obedience. “That’s simply what it takes these days. If we didn’t do that, no one would listen,” you reason.
Call this the “by any means necessary” approach. This leads to a third lesson: when we take that approach, we lose credibility. To paraphrase C.S. Lewis, when we shoot for truth, we get both persuasiveness and the truth. When we shoot only for persuasiveness and disregard the pure truth, we get neither. It’s one of those “boy who cried wolf” things. Speaking personally, when I’m interacting with someone who has not been discerning in the past, I tend not to trust them, even if their cause is just. I’ve always got a thought lingering in the back of my mind: what are they not telling me? I now have this attitude toward anyone who has used the common stat Wright takes to task. While there will always be quite a few folks in the pews and in our circle of friends who will buy what we’re selling no matter what, I’m willing to bet there are many, especially non-Christians, who aren’t as easily convinced. If you give them a reason to withdraw trust, they will.
Wright puts it well:
Obviously we would like both usefulness and accuracy, but if we had to let one go, which would it be?
I can understand why people want to emphasize the useful. Why not use statistics, as well as anything else we can find, to advance the Kingdom?
And yet… if we’re not 100% accurate in our creation or use of research, then that starts to eat away at the credibility of our work.
Here’s an example of how this might play out. Suppose an author is concerned about Christians having some moral problem. S/he then finds all the statistics consistent with this “problem” hypothesis, ignoring ones that might contradict it. The end result: A skewed presentation of how the world works, but a presentation designed to get Christians to do the right thing.
I suppose this issue revolves around questions of the ends justifying the means. I would even say that some of the egregious misuses of statistics about Christianity are done with the best of intentions. Me, I want to go wherever the data lead me, though I realize that I have my own biases and limitations that can get in the way. Ultimately, if it is truth we’re after, cutting corners on our means of getting there isn’t going to help.
Perhaps your attitude is that of one commenter on Wright’s blog: Christians should always assume the worst and apologize, so we should embrace the conventional wisdom, even if it’s not 100% true.
I don’t know about that. It might lead to someone being walked on more, but it doesn’t necessarily lead to more respect. I learned that from one of my past relationships. I was frequently apologizing to her, even when the accusations were trumped up. She finally got tired of it and told me to shut up. She soon broke up with me. Might the same dynamic apply in Christians’ relationship to the world?
I have found that Christians often latch onto bad news about the church and run with it. I even do this from time to time. It’s almost like the self-flagellation featured in The Da Vinci Code. We think it brings purification. We are so apt to do this that we seldom pause to question the news that we receive. In the rush, we tend to trust well-known Christian sources (such as Barna) whose information might be sensational but isn’t subject to the normal academic peer review process (which has its own shortcomings, I admit). Sometimes we get egg on our faces from this habit.
Think: can you really see Jesus endorsing the “by any means necessary” approach? Do you think He EVER fudged just a little bit to get more followers?