Can someone explain to me please? I always thought of America as 'the land of the free', so I don't get why religious groups keep trying to force their beliefs on everyone.
Growing up, I didn't realise America was so religious. I get that it was founded on religious freedom, but when did their beliefs get to rule the highest court?
Has it always been a religious country? I thought that as places became richer and people had more choice, religion went into decline.
Rainy day debate and curious.
USA politics
17 replies
MegCleary · 25/09/2020 13:39