In all honesty, I'm looking for real reasons here. No jokes, please. Why do so many people believe this country is a Christian nation?
Because the majority claim Christianity, and they feel the need to show their dominance and their perceived superiority.
Answer by sweet-a-kins at 9:44 PM on Aug. 29, 2010
One of the first things you learn in school is that this country was founded to escape religious persecution. The founders intended America to have freedom of religion, so there is no reason to call it a Christian nation. When you do that, you discount everyone else in America who practices a different religion, or none at all, and still loves this country. You cannot claim this country as your own; it's everyone's country.
Answer by mommom2000 at 10:29 PM on Aug. 29, 2010
According to Wikipedia, 76% of Americans identified themselves as Christian in 2008; it shows data from many other years as well.
We are a mostly WHITE nation too...Does that make us a WHITE COUNTRY?
There are more WOMEN than men; does that mean we are a FEMALE country?
We are supposed to be a FREE COUNTRY....PERIOD
Answer by sweet-a-kins at 8:46 AM on Aug. 30, 2010