I just read an interesting article that I thought you guys would enjoy. I had never really given it much thought, but I always found it odd that so many people would say things like "Seeing that dude's butt scarred me for life." As a teenager, I recall my friends expressing concern over ever having to see a man naked, saying male genitals were gross; meanwhile, I just felt curious and not at all disgusted.
I don't think naked dudes are gross; I actually enjoy them. But apparently lots of people find male nudity upsetting. I've heard the same thing about naked men on television: audiences can handle a naked woman, but not a naked man, and showing penises, especially erect ones, is considered too "hostile."
So, why is there so much body hatred for men?