I'm a Canadian, let's make this clear. Often I think those of you to the south forget that we're up here. That we're your biggest trading partners and by far the largest foreign consumers of American media.
I'm also confused, lost and a little dismayed.
I've always felt I understood 'where Americans were coming from,' so to speak. Through the Bush years we Canadians largely blamed some of the unsettling things we saw going on south of our border first on 9/11, then on Bush, and lastly on the economy. We thought (or perhaps hoped) that with Obama and the message of change it would be okay, or at least better... But even in a climate of hope, of change, of choosing a new path, I see so much seething hatred coming out of the States these days. I really, honestly fail to understand.
So American Moms... can you explain to me why?
The things that are 'hot button issues' down there are 100% off the table up here. Up here, if a politician goes against equality, abortion, sexual or personal freedom, or anything like that, their career is over.
Up here these are non-issues. Why are they so divisive down south?
Why is it such a terrible thing to grant equal rights to gay people?
Why is a woman's domain over her body in question?
Why do people feel so unsafe in their own neighborhoods that they feel the need to carry guns?
Why is there High Fructose Corn Syrup in EVERYTHING even though all research points out that it's terribly toxic for your children?
Why is socialism a dirty word?
Help a Canadian Mom get some insight into the minds of those only a few miles away?