Does anyone else feel this way?
And before anyone says anything, I know there are female doctors who are way better than others and who have made medical breakthroughs, but even knowing that, I still feel that only men should be doctors.
So I'm getting a lot of heat for this post!!!
I think it should be this way because I've never had a caring female doctor, or a caring male nurse. When I've had a male doctor, I've felt like he was actually listening to my concerns, and all the female nurses I've ever been seen by seem to care more about my health than the male nurses I've been seen by.
I am not sexist; I think women should be allowed to be whatever they want. But that doesn't mean I'm not entitled to my opinion.