In Reference to the Newsweek Article; when was this ever a "Christian America"?

This is what the Bible says in Ephesians 6:12:

"For we wrestle not against flesh and blood, but against principalities, against powers, against the rulers of the darkness of this world, against spiritual wickedness in high places"

Therefore, if this were ever a "Christian America", there would have been no need for this to be written. This scripture doesn't apply only to places such as Iraq or Afghanistan, and it most certainly doesn't exclude America. This has never been a Christian country; there have always been "rulers of darkness and spiritual wickedness in high places".

So whatever reason this person had for publishing this article, it was a waste of time, because true Christians have always known that we were never living in a "Christian America", and the things that we are going through now are also written about and have been prophesied.

Asked by Anonymous at 5:06 PM on Apr. 8, 2009 in Religion & Beliefs

Answers (9)
  • Well "Christian America" is just referring to Christian Americans or Americans that are Christian. It doesn't actually mean it is a whole country of Christians.

    Answer by hannahwill at 5:18 PM on Apr. 8, 2009

  • You can't really categorize or blanket an entire population like that. I believe the only Christian nation is Heaven.

    Answer by ReneeK3 at 5:21 PM on Apr. 8, 2009

  • I am just wondering: haven't you ever heard this as a way to describe a group of Americans? Like "Black America, White America, Jewish America, Atheist America" — no, like "Black America", "White America", "Jewish America", "Atheist America". The list could go on and on; it's just a way of describing a group of the population.


    Answer by hannahwill at 5:23 PM on Apr. 8, 2009

  • Heaven isn't only Christian...IMO of course.

    Answer by hannahwill at 5:24 PM on Apr. 8, 2009

  • Yes, I realize that this may be referring to a certain group of people, but some people also think that this nation was "built" on Christian principles and beliefs by Christian people, which would make this a Christian country, and IMO this country has never really been a "Christian Country". Only Heaven is purely believers. And no, they may not all be called Christians in Heaven, if any of them will be for that matter, but they will all be believers in the Father, Son, and Holy Spirit.

    Answer by Anonymous at 5:34 PM on Apr. 8, 2009

  • I can agree to an extent that this country was not built on Christian principles, but Christians did have a lot of influence and certainly have had a huge impact on the US, and (like it or not) they still have a huge impact today.

     I have to disagree that all will be believers. Our life on earth has little to nothing to do with Heaven and God.


    Answer by hannahwill at 5:39 PM on Apr. 8, 2009

  • In H.S. I had a Jewish friend who took offense at the celebration of "Christ"mas in public schools, i.e., the decorations and everyone saying Merry Christmas to her. This was back in the '70s. She spoke up about it, and wow, was she given a hard time. I think some of the "christian" kids would have burned her at the stake. I think we definitely live in a "christian" America, and it's time for CHANGE.

    Answer by writeon at 5:47 PM on Apr. 8, 2009

  • "Therefore, if this was ever a 'Christian America', there would have been no need for this to be written."

    Ummm... you do realize that this scripture was written well before America was "discovered", right?

    Answer by Koukla12905 at 1:52 AM on Apr. 9, 2009

  • Excellent question! A lot of people have been told that our country was founded on Christian principles, but that's incorrect. Our founding fathers were not all Christians, but theists. They believed in God, but they saw the horrors that religion in government had caused all throughout Europe and were trying to avoid the same thing. If anything, America was founded on the principle that everyone should be able to worship how they wanted without persecution, and that religion should be kept out of government.
    A fact of life is that the majority of Americans are Christian, so it has seeped into all facets of our lives, sometimes excluding other beliefs. Wrong in my book, but whatever.

    Answer by krisr169 at 10:36 AM on Apr. 9, 2009
