I moved the posts that follow from a thread in the History forum, as they were a digression following this comment from LightSpectra:
It's almost entirely demonstrable that urbanization often results in political liberalisation, which in turn produces an altered worldview that is incompatible with Christianity, even though mainline Protestantism (e.g. classical Lutheranism) has tried to reconcile the two.
To say that political liberalisation is incompatible with Christianity is clearly false, although it is certainly incompatible with some forms of Christianity. Perhaps you think Christians shouldn't be political liberals, but it's evidently the case that a great many are - perhaps the majority, at least in the Western world outside the US, and perhaps even within the US until recent years, as I mentioned above.

There are really two questions to ask about American religion: first, why is religion so important in the US in a way that it isn't in other countries? And second, why has religion in the US become so closely associated with right-wing politics? Americans sometimes ask the first but tend to overlook the second, because they've become so used to religion being linked to right-wing politics. The association is so pervasive that even non-Americans (non-religious ones) often assume the same thing; I've known people who were astonished to learn that the Vatican opposed the Gulf War, for example, since they assumed the church is simply right-wing straight down the line. But as I said, the association is a recent one and by no means monolithic even in the US today.
I suspect that the answers to both questions are linked, but the mechanics are very hard to identify.