Contributing writer for The Artifice.
Are Hollywood and the Oscars as relevant as they used to be?
We saw history in the making last weekend when, for the first time ever, a foreign-language film won the Best Picture Oscar. Has Hollywood finally learned that there are countries outside the US where brilliant, talented people tell fascinating stories? Does this mark a new era for Hollywood, one in which we'll see more international talent celebrated, or is it a last attempt by a dying branch of the film industry to stay relevant and "woke" in an increasingly globalised and diverse world?