Thanks to the COVID-19 pandemic, a new spotlight has been shone on doctors, nurses, and other healthcare workers. To show support for those in the medical field, it's worth evaluating how doctors and nurses are portrayed on TV: recurring tropes, harmful stereotypes, and progress in how women, LGBTQ, and BIPOC characters are written and portrayed. What are some examples of groundbreaking works in the genre? What are some terrible or offensive examples? Shows to consider include Grey's Anatomy and Scrubs. Comparisons could also be drawn with non-American TV shows and how they approach the subject matter.
I think The Good Doctor should definitely be added to the discussion. The majority of its doctors are Black or Brown (although the main protagonist is white, which raises other issues). There's also plenty to say about how female physicians are treated and portrayed, especially with the addition of Dr. Jordan. Check out the episode where Jordan treats a large Black female patient and has to deal with the racial and weight-related implications of her treatment, as well as how her fellow doctors handle it. – Stephanie M. (4 years ago)