In the early days of film, men were typically the main characters. Did changes in society make it possible for women to play aggressive lead roles?
More and more films include rape scenes. Do these scenes serve to shape the character into a much stronger person after being assaulted?
Masculinity in film: why are women often portrayed as a man's possession?
Since same-sex relationships are more widely accepted in today's world, is there less controversy surrounding homosexuality in films?
Women in film: today, more and more films show that men and women are equal and have the same abilities. It hasn't always been this way, though; in film's early days, men were seen as more prestigious. Has society moved past this way of thinking?
Are films in which African Americans portray real-life situations better received than films in which African Americans are portrayed as lazy, irresponsible, and dishonest?