
Is this a Christian Nation?

Do you believe the United States is a Christian nation? If so, what do you believe that means for those of another faith or no belief? If not, do you believe we should establish Christianity as the official religion?

Asked by Anonymous at 9:39 AM on Mar. 20, 2009 in Politics & Current Events

Answers (112)
  • No it's not a Christian nation....and sadly it never will be.

    Answer by ReneeK3 at 9:41 AM on Mar. 20, 2009

  • O M G
    "establish Christianity as the official religion" is unconstitutional, illegal, and treasonous. You should be ashamed for suggesting it. This is NOT a Christian nation; not even all the founding fathers were Christian. The expression "Christian nation" is nothing but an indication of a lack of knowledge about the foundation of the country, the constitution, and history.

    Answer by NotPanicking at 9:42 AM on Mar. 20, 2009

  • Wow. NotPanicking, take Prozac or something! Good grief! It was just a simple question. Get a grip, already!

    Answer by Anonymous at 9:43 AM on Mar. 20, 2009

  • NO.

    The USA is a free country where everybody can practice the religion of her/his choice, or not practice at all.


    Answer by Anonymous at 9:46 AM on Mar. 20, 2009

  • Oh, anon. Please. Take your drama mama butt back to bed.

    Answer by Momtocccd at 9:48 AM on Mar. 20, 2009

  • They wish.

    Answer by IhartU at 9:51 AM on Mar. 20, 2009

  • We used to have some real Christian principles, but since we turned away from them completely our nation has gone downhill quickly. I am a Christian, but it would be unconstitutional to establish an "official" religion. That's one of the reasons our nation was founded: to give everyone the right to worship as they please. It is also unconstitutional to restrict Christians from speaking about their faith and praying where they please.

    Answer by abbynzachsmommy at 10:02 AM on Mar. 20, 2009

  • Hell no it isn't and it never will be. There will never be an "official religion" because there are dozens of other religions that have their place here. We don't even have a national language and you want a national religion? I thought you repubs. are for LESS government? Oh, only when it suits your needs?

    Answer by Anonymous at 10:04 AM on Mar. 20, 2009

  • I think if anything the United States is a faith based nation. Allowing freedom to believe in God. Not just one religious affiliation, but rather a freedom for ALL religions.


    Answer by grlygrlz2 at 10:06 AM on Mar. 20, 2009

  • I believe we are primarily a Christian nation; I think the US was brought up by mostly Christian beliefs. But no, I don't think we should have an official religion...

    Answer by Anonymous at 10:08 AM on Mar. 20, 2009
