America Was Never a Christian Nation, Folks
“We need to take back America for Jesus and make this once again a Christian nation.” How many times have you heard this battle cry from Christians? It goes something like this:
Our founding fathers built this country on Christian principles, and ever since then we’ve strayed. Now, because of our bad decisions, we find ourselves in a place of immense depravity. And if Christians don’t lead the way in bringing us back to God, there’s going to be some hell to pay from a wrathful God.