Contributing writer for The Artifice.
How have film and television advanced society's views on race and sexuality?
Over the last ten years, films and TV shows have started addressing race and sexuality in a more progressive way. Gay and lesbian couples appear in TV shows, women of color are lead characters, and these themes are discussed openly and without bias. Has this representation in media helped transform society into a place where race and sexuality can be discussed openly?