Were you ladies taught by your parent(s) to stand on your own two feet? Oftentimes some of the posts on here sadden me because I see so many women who have gotten divorced or broken up with a boyfriend and have no idea how to cope now that he isn't around. Growing up, I was always taught how to take care of myself. I've had break-ups, but I have never wondered "what am I going to do?" or anything along those lines once the relationship was over. I think it's okay to rely on your partner to an extent, but being completely dependent on him is not the right thing to do, in my opinion. I'm not a woman with the "I don't need a man" mentality (my boyfriend is great and I do need him sometimes) BUT...depending on him for everything is something I will not do. If things ended between us today, I'd be okay with handling the bills, my car, my kids, etc., because I've been on my own since I was 21.
I think all women need some sense of independence, because relationships and marriages aren't guaranteed to last.