
Do you agree or disagree with the idea that reality TV has a negative effect on how women are viewed?

Why or why not?


Asked by MaterialGirl198 at 12:21 AM on Mar. 27, 2011 in Entertainment

This question is closed.
Answers (4)
  • I think it doesn't help. I live in another country and they watch shows like Jersey Shore and Rock of Love and that's how they think us American girls are. All that Spring Break MTV shit doesn't help either.

    Answer by Nanixh at 12:38 AM on Mar. 27, 2011

  • I don't think reality TV set that standard, I think the girls on the shows are what makes us look bad. Two words, JERSEY SHORE.

    Answer by Renee3K at 12:23 AM on Mar. 27, 2011

  • No, I think people learn pretty early how they are gonna feel about women. My husband learned his views from how his mother acted and sadly carries that with him today. I do think reality tv can perpetuate stereotypes of women!

    Answer by ILoveCade at 12:35 AM on Mar. 27, 2011

  • I think reality tv negatively portrays the entire human race....well Americans at least.


    Answer by lilysmom2607 at 8:57 AM on Mar. 27, 2011
