
In April, a Bible Society report based on YouGov polling claimed that ‘the Church is in a period of rapid growth’. But it subsequently came to light that other YouGov results show the opposite, and now YouGov has told Humanists UK that ‘survey design and weighting… is the likely cause of the discrepancy’. In other words, it is not possible to say which poll reflects the true picture, and so the Bible Society’s conclusion is not reliable. In the interests of truth and informed public debate, Humanists UK is calling on the Bible Society to admit to the problems in its research and retract what it said.
The report in question is called The Quiet Revival. It is based on a YouGov opinion poll that found that church attendance had risen 50% in the last six years, from 8% of the adult population to 12%. At the time, Humanists UK and others questioned whether this was plausible, given that recorded church attendance figures and the British Social Attitudes Survey both showed substantial declines over the same period. The Bible Society pushed back and defended its data as correct.
It subsequently emerged that the British Election Study – another, even larger YouGov opinion poll – asked the same question around the same time and recorded about half the church attendance that the Bible Society poll did, completely contradicting the Bible Society’s narrative. But since the report came out, there has been widespread media coverage in the national papers about how the trend of religious decline has halted, and in fact reversed. This has had a serious impact on public discourse, which makes it all the more important to get the facts right. And the evidence does not show that there is a religious revival in the UK.
Inconsistent polling data
The Bible Society report was based on a 2018 survey of 19,000 English and Welsh adults, compared with a 2024 survey of 13,000. Both surveys were weighted by ethnicity, calculated from the 2011 Census in the case of the former and the 2021 Census in the case of the latter. The Bible Society asserts, ‘Both samples therefore give a 1% margin of error at a 99% confidence level, meaning they are highly reliable.’ The 2018 survey found that 8% of the population were Christians who said they attend church at least monthly, increasing to 12% in 2024 – a 50% rise. The Bible Society said this was driven by Gen Z being a ‘spiritual generation’.
Last month David Voas, Emeritus Professor of Social Science at UCL and one of the country’s leading quantitative researchers of religion, questioned the conclusions. In a piece for The Conversation, he pointed to the fact that a few months before the Bible Society’s own 2024 survey, the British Election Study commissioned YouGov to ask the same question of an even larger sample of 30,000 people. The results are strikingly different – with 6.6% of the population being Christians who attend church at least monthly, down from 8.0% in 2015. The British Election Study surveys are panel surveys, meaning they ask the same people over time (adjusting for population change). They are weighted on the basis of how politically engaged respondents are.
The wording of the questions differs slightly between the two polls. We will explore that in more detail later, but for now it suffices to say that this does not explain the different results.
Problems with data of this sort
So how could one very large YouGov survey show monthly church attendance rising from 8% to 12%, while another even larger one shows it going down to 6.6%? David Voas provides the following answer:
‘Gold standard social surveys are based on random (probability) samples of the population: everyone has a chance to be included. The British Social Attitudes survey is one such example – and found that churchgoing fell by nearly a quarter from 2018-23 [from 12.2% to 9.3%].
By contrast, people opt in to YouGov’s survey panel and are rewarded after completing a certain number of surveys. The risk of low-quality or even bogus responses is considerable.
YouGov creates a quota sample from its large self-selected panel. The sample will match the population on a number of key characteristics, such as age and sex, but that does not make it representative in all respects. As quota samples do not give each person in the population a known chance of being selected, statistical inference is not possible and findings cannot be reliably generalised.
To write (as in the Bible Society report) that because thousands of people participated in the two surveys, they “give a 1% margin of error at a 99% confidence level” is misleading.’
David goes on to argue that, because young adults are harder to contact, these problems are likely more pronounced among that age group – suggesting that the data showing it becoming more religious is likely to be less reliable still. He concludes, ‘It would be fascinating to probe all of these issues further, but regrettably, the Bible Society has not published the dataset. Open access to all data is now a basic expectation in scientific work.’
YouGov: ‘discrepancy likely caused by survey design and weighting’
We put these issues to YouGov. Its answers suggest that Professor Voas is correct. YouGov told us, ‘survey design and weighting will have an impact on outcomes, and that is the likely cause of the discrepancy.’ It went on to say:
Question wording
The questions in the two surveys were not the same, so you are not comparing like with like. They both gave sliding scales, but not the same ones. The BES offers options of weekly, fortnightly, monthly etc, while the Bible Society survey, which aimed to capture more granular data on this specific topic, goes daily, a few times a week, once a week, fortnightly etc. The BES also offers an option of ‘it varies from week to week’ [sic – actually varies too much to say], which may have reduced the number who say ‘weekly’.
Sample weighting
The BES survey, being a political survey, includes YouGov’s political weighting. One part of this is a weighting for what we call ‘political interest’. This is how politically engaged a respondent is and looking at the data people who are more politically engaged also appear more likely to be regular church-goers. In practice, the addition of weighting on political interest serves to reduce the proportion of the sample who are highly engaged, so probably also impacts the percentage who go to church. The survey for the Bible Society was not a political questionnaire and used a weighting scheme appropriate for the subject matter including an ethnic minority interlocked with age. Given church-going is higher among young people from ethnic minorities this is likely to be part of the explanation. The BES survey is a tracker poll, e.g. we have been running the same survey and weighting scheme for a number of years without changing it, so as to make the time series comparable over the period and it only weights by ethnicity in London.
It’s worth going through that answer carefully. First, YouGov notes the different question wording – though, as we shall see, that shouldn’t matter here. For transparency, here is the full question and the results for each:
Bible Society:
‘Apart from weddings, baptisms/christenings, and funerals how often, if at all, did you go to a church service in the last year?‘
Daily/almost daily 1%; A few times a week 3%; About once a week 7%; About once a fortnight 1%; About once a month 2%; A few times a year 6%; About once a year 6%; Hardly ever 12%; Never 59%; Not answered 3%.
Across all religions, that gives 14% of the population as going to church services monthly or more – and a staggering 11% weekly or more. The Bible Society hasn’t published the breakdown of these results for Christians specifically but has said that Christians who go to church monthly or more constitute 12% of the population. (Implying that one in 50 people are not Christian but go to church monthly or more.)
British Election Study:
‘Apart from such special occasions such as weddings, funerals and baptisms, how often do you attend services or meetings connected with your religion?‘
Never or practically never 14%; Less often than once a year 5%; Less often but at least once a year 3%; Less often but at least twice a year 4%; Less often but at least once a month 2%; Less often but at least once in two weeks 2%; Once a week or more 6%; Varies too much to say 2%; I am not religious 60%; Don’t know 3%.
So this gives 6% attending once a week or more, and 10% attending monthly or more. 4% of the population are found to be Christians attending church weekly or more, and 6.6% are attending monthly or more. 1.4% of the population are Christians ticking ‘varies too much to say’.
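The headline figures for each poll follow directly from summing the published response options. A quick arithmetic check, using the whole-population percentages as given above:

```python
# Bible Society poll: share of all adults picking each attendance option (%)
bible_society = {
    "daily": 1, "few_times_a_week": 3, "weekly": 7,
    "fortnightly": 1, "monthly": 2,
}
# Weekly or more: daily + a few times a week + about once a week
bs_weekly = (bible_society["daily"] + bible_society["few_times_a_week"]
             + bible_society["weekly"])
# Monthly or more additionally includes fortnightly and monthly attenders
bs_monthly = bs_weekly + bible_society["fortnightly"] + bible_society["monthly"]

# British Election Study: share of all adults picking each option (%)
bes = {"weekly_or_more": 6, "at_least_fortnightly": 2, "at_least_monthly": 2}
bes_monthly = sum(bes.values())

print(bs_weekly, bs_monthly)  # 11% weekly or more, 14% monthly or more
print(bes_monthly)            # 10% monthly or more
```

These are the across-all-religions figures; the Christians-only figures (12% and 6.6% monthly or more respectively) are subsets of these totals.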
There are obviously some differences in question wording. The most obvious is the ‘Varies too much to say’ option (which YouGov misstated as ‘it varies from week to week’), ticked by 1.4% of the population as Christians. But it is hard to see why someone attending monthly or more would not feel comfortable picking one of the frequency options instead. David Voas told us that ‘the general view is that “it varies too much to say” suggests occasional but infrequent attendance, but of course it’s hard to be sure.’ What we do know, however, is that 1.4% is a very small share of the population. Even if they are all attending monthly or more, that only increases monthly attendance to 8%. With a similar adjustment in 2015, the British Election Study would still show a decline.
But enough on specifics – let’s get to the bigger point. Yes, the wording and options of the questions are different. But both surveys have a set of options that aim to capture the number of people who go to church weekly or more, and monthly or more. If such opinion polls were completely reliable at doing this, shouldn’t they end up with the same results?
In pointing out that differences in question wording may affect results, YouGov’s answer seems to be that the polls are not in fact accurately measuring the number of people who are going to church weekly or monthly. Not, anyway, to the degree of precision needed to distinguish these two survey results from each other.
Sample weighting and the imprecision of polls
This brings us nicely onto the second thing that YouGov referred to in its answer. The two surveys were weighted differently from each other, one based on ethnicity (which might help to capture minority ethnic people – good) and the other based on political engagement (which might help to capture politically less engaged people – good). It’s not obvious that one is preferable or more accurate than the other. But they appear to have contributed to, or caused, the different results here.
YouGov is attempting to capture, objectively, the number of people who go to church weekly or more; and monthly or more. If different approaches to weighting result in such different outcomes, then that appears to mean we don’t know which (if either) accurately reflects genuine church attendance.
The Bible Society says, ‘Both samples therefore give a 1% margin of error at a 99% confidence level, meaning they are highly reliable.’ It says this because, if you have a truly random sample from a population, then given these sample and population sizes, the margin of error would be +/-1%. But the problem is that people who sign up with online pollsters are not representative of the population as a whole. This is why pollsters weight their panels. The fact that weighting two surveys on different bases can lead to such different results suggests that the true margin of error – as in, how far the result is from actual, real attendance – might be quite a bit larger. Large enough for one survey to show declining church attendance while another shows it growing; large enough for one to show almost double the church attendance of the other.
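To illustrate where the ‘1% margin of error’ figure comes from: for a truly random sample, the classic margin of error is z·√(p(1−p)/n), which for samples of this size does indeed come out at roughly ±1 percentage point – but only under the random-sampling assumption that does not hold for an opt-in panel. A sketch:

```python
from math import sqrt
from statistics import NormalDist

def margin_of_error(n, p=0.5, confidence=0.99):
    """Classic margin of error for a simple random sample of size n.

    Uses the worst case p = 0.5. Valid only if the sample is truly
    random - which is exactly the assumption criticised above.
    """
    z = NormalDist().inv_cdf((1 + confidence) / 2)  # ~2.576 at 99%
    return z * sqrt(p * (1 - p) / n)

print(f"n=13,000: +/-{margin_of_error(13_000):.2%}")  # ~ +/-1.13%
print(f"n=19,000: +/-{margin_of_error(19_000):.2%}")  # ~ +/-0.93%
```

So the Bible Society’s arithmetic is not wrong in itself; the objection is that the formula’s premise – a random sample – is not met, so the quoted ±1% understates the real uncertainty.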
(We also asked YouGov whether it thinks the Bible Society sentence about the 1% margin of error is correct. It did not answer.)
This kind of weighting problem does not plague all findings from YouGov polls to the same degree. When its polls find 73% of people support the Assisted Dying Bill, versus 16% opposed, or 70% support legal recognition of humanist marriages, with only 15% opposed, these are big majorities. The precise figures may be affected by weighting decisions, but they would not come close to being overturned by different weightings. The smaller sample sizes of these assisted dying and marriage polls do increase the margin of error in the classic sense, but the large majorities mean we can still have a high degree of confidence in the results. That cannot be said for a trend going from 8% to 12% – a rise of only four percentage points – which is quite susceptible to change due to weighting decisions.
There are other reasons worth mentioning to suspect that the British Election Study survey may be more accurate than the Bible Society’s. The first is that the British Election Study uses the same panel over time, changing only as the population changes, so the trend is somewhat more likely to be accurate.
Another possible problem with the Bible Society poll is that it appears to include only those who completed the full survey. But the full survey consisted of a very large number of questions, mainly about the Bible. It is easy to imagine that less religiously engaged respondents got bored midway through, dropped out, and so did not have their responses logged. This may have distorted the results to look more religious than the population as a whole. We asked YouGov about this problem too, but it has not answered us.
Better sources of data
The Bible Society data therefore cannot be relied upon, and we should look at other sources instead. David already cites one – the British Social Attitudes Survey, which, because of the way it constructs its sample, is considered the gold standard of polling. It found that churchgoing fell by nearly a quarter from 2018 to 2023, from 12.2% to 9.3%. This is still a lot higher than recorded attendance data – which may suggest some social desirability bias among respondents.
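The ‘nearly a quarter’ figure is simple relative change – the same calculation behind the Bible Society’s headline 50% rise. A quick check:

```python
def relative_change(old, new):
    """Fractional change from old to new."""
    return (new - old) / old

# British Social Attitudes: churchgoing 12.2% (2018) -> 9.3% (2023)
print(f"{relative_change(12.2, 9.3):+.1%}")  # -23.8%, i.e. nearly a quarter

# Bible Society headline: 8% (2018) -> 12% (2024)
print(f"{relative_change(8, 12):+.1%}")      # +50.0%
```

Note how a four percentage point movement in a low base rate becomes a dramatic-sounding 50% relative change – one reason the headline figure travelled so far.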
Which brings us to the last source of data: actual recorded church attendance. Humanists UK noted in its previous piece that, among adults, the Bible Society found 56% monthly church attendance growth overall between 2018 and 2024, 30% growth for the Church of England, and an astonishing 112% growth for the Catholic Church. But recorded weekly attendance for the Church of England over the same period is down 24%, and down 27% for the Catholic Church for 2018-23 (with 2024 figures not out yet). While both churches report annual increases since the post-pandemic low, both remain below the pre-pandemic trend.
No source of data is perfect, but these other sources do not carry the same methodological concerns as the Bible Society’s. They should be preferred.
The Bible Society ignored the other data
One thing that is particularly concerning is that the Bible Society report considered none of this other data. There is a passing reference in the background section to ‘survey data’ and ‘published attendance and membership figures’ having hitherto pointed to ‘decline’ but that is it.
When we saw the Bible Society survey, our first thought was to compare it with recorded religious attendance data. Why doesn’t The Quiet Revival do that itself? An essential part of any serious study is to consider all available data – not only the new data the study introduces, but also how it fits into the context of existing data. We hope that the Bible Society didn’t ignore this data because it was inconvenient to the narrative it wished to present.
The Bible Society should retract its claims
To sum up, the Bible Society’s finding of Church growth is completely undermined by a larger YouGov poll suggesting continued church shrinkage. The contradictory nature of these two YouGov polls says something about the limits of this sort of poll. And other sources of data – the British Social Attitudes Survey, church attendance records – also point to shrinkage.
The Bible Society could have published its poll results alongside this other data and tried to explain the difference. But it didn’t do that. Instead it ignored all this other data in its report. It suggested a very high degree of confidence in its findings, said that ‘the Church is in a period of rapid growth, driven by young adults and in particular young men’; said ‘its reality can no longer be denied’; and branded Gen Z as the ‘spiritual generation’. From this it generated lots of headlines saying the Church is growing again.
The evidence does not support that conclusion and the Bible Society should retract its claims that it does. Public opinion on these vital questions has been distorted by widespread coverage of this misinformation.
Notes
For further comment or information, media should contact Humanists UK Director of Public Affairs and Policy Richy Thompson at press@humanists.uk or phone 0203 675 0959.
Read the Bible Society’s report.
Read David Voas’s piece for The Conversation.
Humanists UK is the national charity working on behalf of non-religious people. Powered by over 130,000 members and supporters, we advance free thinking and promote humanism to create a tolerant society where rational thinking and kindness prevail. We provide ceremonies, pastoral care, education, and support services benefitting over a million people every year and our campaigns advance humanist thinking on ethical issues, human rights, and equal treatment for all.