If you don't like America...just leave (S/O of a post)
One thing that always gets me fired up is when someone says that if you don't like America in its current state, then you should leave.
And I just have to shake my head, because that is absolutely one of THE most un-American things a person can say.
I've said it in multiple replies, and I will say it again here: Americans have not only the right, but the RESPONSIBILITY, to stand up against our government when it is not acting in the best interest of the people.
Not in the best interest of itself, not in the best interest of corporations, not in the best interest of the upper class.
Of the PEOPLE...as in EVERYONE in this nation. And while "the people" was originally written to exclude certain members of society, today we're beyond that (or at least I hope so). Everyone has the responsibility to do their part not only to speak out against, but to change, what they see as wrong in our nation and our government.
IF we leave...if we give up on America, then how are we fulfilling our duty as American citizens to change what we see as wrong?
Too long have we let our government walk all over us. And now that the American people are finally starting to wake up and realize all that is wrong, our government, and the citizens who are happy with the status quo, are starting to freak out. They are beginning to realize that the American people cannot be pushed around, cannot be quieted.
So the ONLY people I can think of who would say "if you don't like how America is, then leave" are the ones who don't want change, because change would mean a fair chance for everyone in our nation to be (at the very least) secure and to have the life they want.
And if someone would willingly deny that to ANY of our citizens...then while they may be American on paper, they are not American at heart.