In recent years, there has been a marked shift away from political shows of the late '90s and early '00s like The West Wing, which acted as idealized representations of the political world: the bad guys always got what was coming to them, our team always won in the end, the couples always got together, and so on. Now, with the introduction and success of shows such as Scandal and House of Cards, viewers are seeing the darker, more jaded, even terrifying underbelly of the political world, where nothing is tied up neatly and you are often expected to side with the villains. Why have audiences shifted to this kind of entertainment? What is it about our current moment that makes this shift believable? And how does a lighter show like Veep figure into the equation?
Really good idea for a topic. There could be so many reasons for this shift... perhaps people no longer trust those running our countries and assume all politicians are corrupt. – samcel, 6 years ago