Contributing writer for The Artifice.
Why have apocalypse-style films and TV shows become so mainstream in recent years?
The idea of an apocalypse has existed for hundreds of years, but why has it become so mainstream in recent times? Whether it's zombies, nukes, or anything in between, these stories have taken deep root in our modern culture. Is it because we feel detached from our primal, survivalist selves? Take, for example, The Walking Dead. The show was a massive success, second only to Game of Thrones during its run. Apart from the strong writing and impeccable acting, there is a certain allure to the idea of a group of at-first strangers growing into a family through trials, tribulations, and lots of zombie guts. It is also interesting to see how these stories are received in different cultures around the world. In many parts of Asia, for example, there is a massive love for all things zombie. Why do you think this is?