I was reading a book about the first Thanksgiving that my son got from school, and it said that the Pilgrims came to America in search of a better place and discovered a land with brown-skinned people. Then blah blah blah, the Pilgrims showed them how to make whatever. I think it's quite one-sided that they never tell what really happened. The Pilgrims were starving and didn't know how to do anything for themselves, and the Native Americans helped them. As soon as the Pilgrims were well off and had learned all the ways to live off the land, they pretty much backstabbed the Native Americans and gave them diseases that would never have been introduced if the Native Americans hadn't helped them in the first place. The Pilgrims also turned on them and wanted to wipe them out by giving the women and children blankets infested with smallpox. Well, I don't know about you, but I think it would be more realistic and true to form if they told everyone the truth and stopped putting out the image of a cookie-cutter peaceful bond between the Native Americans and the Pilgrims.
P.S. I'm not saying to kill white people or whatever, and I'm not against any other race. I just wanted your opinion on the subject.
Asked by Anonymous at 1:20 PM on Feb. 1, 2011 in Relationships
What- and make ourselves look bad??? *insert sarcasm here*
Answer by skittles1108 at 1:25 PM on Feb. 1, 2011
I was failed in school for pointing out this FACT. Sorry, people were already here; the "discovery" was already made.
Answer by MrsHouston47302 at 1:34 PM on Feb. 1, 2011