Lest anyone any longer opine that America is a Christian nation.
De-Christianization, you say? To which I reply: yes. When scripture no longer guides even the moral and ethical choices of ‘Christians’, the fact of de-Christianization is made plain. If America were a Christian nation, its citizens would be guided by the teachings of the Bible. It isn’t. They aren’t.