Is America the only country where women don't care how they look? ETA!!!
When did it become acceptable to wear pajamas outside of the house when you aren't sick or pregnant, or your kids aren't sick? There was once a time when a woman took pride in her appearance and strove to make a good first impression. When did that change?
I know a lot of women just don't care anymore and say things like "It's my life, I'll dress how I want to dress", "I'm not trying to impress anyone", "Who cares what I look like?"
I've learned that if you take pride in your appearance outside of your home, people are more likely to take you seriously. Honestly, if you were at the grocery store and two people each made a recommendation on a food item, who would you take more seriously? The mom in sweats who isn't pregnant or sick, or the polished woman?
Is this common only in America, or have women in other countries given up on how they look as well?
Well, it seems I struck a nerve with some of you.
I never said everyone has to wear a '50s-style housewife dress, an evening gown, or a skirt-and-top set to go grocery shopping. Nor did I say I dressed like that. I didn't even say "Nana nana boo boo, I'm better than you!" but some of you seemed to take it that way. All I asked was when it became acceptable to wear pajamas outside the house, and when some women stopped caring about how they look.
I should have clarified that it's important for anyone, not just women, to care about how they look. My mom was dirt poor growing up, but she said her mother would have been so embarrassed if her children had looked slovenly, and that there was never an excuse for being dirty and unkempt.
Hell, even a pair of jeans, a T-shirt, sneakers, and brushed hair is better than a pair of pajamas that have been slept in.
But of course, this is all my opinion.