
Is Hollywood "disgustingly sexist"?

Following Kristen Stewart's recent comments on sexism in Hollywood, is there any relevance to her claim today?

  • Hollywood mirrors society, and mainstream society reflects sexism, classism, and racism. What could also be addressed here is how these 'isms' affect the current generation of Hollywood actors such as Kristen Stewart. If they rebel, will they be replaced, since Hollywood is most concerned with generating revenue and collecting accolades? – Venus Echos 9 years ago
  • There are lots of directions this could take, from how actresses are paid less than their male counterparts (even within the same films, e.g. the leaked email stating that the female stars of American Hustle were intentionally paid less than the males), to the state of female filmmakers, particularly directors, as female-directed films are often only made in indie circles despite regular critical acclaim for the directors themselves. This could be an enormous topic, so finding one part to concentrate on will probably be the best way to approach it. – Hannah Spencer 9 years ago
