This is going to sound rude, but I need to get it out.
I am seeing more and more posts about women in here and other weight loss groups saying that they want/need to lose weight, but they refuse to eat veggies. Don't get me wrong, I don't like most vegetables. But the thought of gaining more weight scares me more than the thought of having to eat some broccoli.
Yes, you can lose weight without forcing yourself to eat vegetables, but sometimes I think that people don't really try to like healthier options.
I'm not trying to offend people, I just couldn't hold it in anymore. Also, eating healthier will encourage your children to eat healthier, and maybe they won't end up with the same aversion to veggies.