So, I've been re-reading my notes from my course, and it struck me how often I commented on the poor portrayal of women in media. I've recently started reading up on the male gaze, which explains a lot of this: the default setting for film and television producers seems to be to cater to the male gaze.
Now, the term "male gaze" was coined in 1975 by Laura Mulvey, who argued that films typically featured male protagonists and assumed a heterosexual male audience as the default.
36 years on, do you feel that films still conform to the male gaze? Are there any films, or even TV shows, which you feel stray from the male gaze and assume a more androgynous one?