My family watches a lot of American television on Netflix, Prime, Disney, etc.
I have noticed that a lot of the drama and comedy doesn't quite reflect true American society (especially the teen shows). To my early-teen daughter, the US seems like some sort of cool utopia full of beautiful people with liberal attitudes. In reality, the US is a relatively religious, conservative society with views that, I think, contrast with a lot of UK views, e.g. gun laws, abortion, etc.
I really do think teens in their formative years should watch the news to get a sense of the real political pulse of the US, rather than the slick PR presented in so much of its huge media industry's content.