Do we really want the US to be a "Christian Nation"? Should even Christians want this?
I bring this up because it seems there are many Christians today who actually think they want this.
But would a “Christian Government” really fix this country, even from a Christian’s perspective? I would say not, and here is why...
Well... for one thing, if this were a Christian Nation, freedom of religion would be the first thing thrown out the window, and the separation of church and state would obviously be null and void.
And then which of the thousands of Christian denominations would come into power? Which of them would be dictating what the "true" Christian faith is to the entire nation? Would it be the Jehovah's Witnesses, or perhaps the Catholic Church with its Pope and its potential for world domination, or the Baptists? I would think the Catholics are the most likely to take power, since they have the most experience with this sort of thing.
Do we really want to combine religion with the power of life and death over people again, or with the might of the most powerful military on earth? Is this really what people want?
The very existence of each of the different denominations of Christianity is dependent on freedom of religion and the separation of church and state. Without those protections, one denomination would probably eventually achieve national and perhaps even worldwide dominance. And it would not matter in the least whether it were the “True” one (yours, of course). It would be either the most popular or the most ruthless. Why not Joel Osteen’s fantasy worldview?
The different denominations will forever fight over political power in this country and everywhere else. And the more centuries that go by, the more the Christian world will fragment into pieces, like shattered glass. It happens every day as one church breaks away from another to maintain a more “Pure Truth.”
One thing I know for sure: we would certainly not have peace and harmony in our time.
What do you think?
On a side note…
Do you think that the Republicans in the US are more "Christian" in their platform or do you think that the Democrats are?