
Are Hollywood and the Oscars as relevant as they used to be?

We saw history in the making last weekend when, for the first time ever, a foreign-language film won the Best Picture Oscar. Has Hollywood finally learned that there are countries outside the US where brilliant, talented people tell fascinating stories? Does this mark a new era in which more international talent will be celebrated, or is it a last attempt by a dying branch of the film industry to stay relevant and "woke" in an increasingly globalised and diverse world?

  • You could also look at 2016, which seemed to be the start of a new, diverse era in Hollywood, with more people of colour and LGBTQ stories, and see why that failed to make an impact (or, at least, as much of an impact as everyone believed it would). – OkaNaimo0819 4 years ago
  • I would still argue that many of the "problems" critics have raised about Oscar voting and the landscape of voters still ring true for many watching the event. I would suggest looking at the Oscar campaign surrounding Greta Gerwig's "Little Women". It was a film that resonated with viewers of all different ages and received critical acclaim, yet it was snubbed in a key category like Best Director. While Parasite's win was an exciting surprise, it feels like the Oscars have not changed as much as they have been touted to change. – Sean Gadus 4 years ago
