Women and nudity?
How do you feel about the fact that so many movies seem to require a female character to appear nude in some meaningless scene?
There are different sides to this. How do men feel about it, and how do women feel about it?
As a man, on one hand I can appreciate how the woman looks; who doesn't enjoy seeing an attractive woman nude?
On the other hand, as a man with a wife and daughter, I think about how that is someone's daughter or someone's wife baring it all, basically at the whim of the director, because nudity is never really "needed" and no film is "better" because it's in there.
Seriously, how often have you seen a show where we can assume all the characters bathe, but inevitably the camera follows the female character into the shower and shows it?
Women, what do you think of it? Yes, they are adults who accept it and get paid.
But what do you think about the fact that being a woman in Hollywood almost always means you'll be asked to appear nude, and that refusing to do so may hinder or hurt your career, all because men want to see you nude? Do you care? Does it anger or annoy you? Have you ever seen a film and thought, "I would have liked it better if it had nudity"?