I'll take a stab at it, although I'm looking at it from an American perspective.
Societies mostly seem to divide themselves into the two categories of "conservative" and "liberal." These days, conservatives are on the "right" and liberals are on the "left." (I say "these days" because apparently that hasn't always been the case.) "Left" and "right" are sometimes used as insults, but they aren't inherently insulting. I lean left, for instance.
Conservatives have traditionally tended to side with business: they tend to be the more money-oriented party, to support the military, to be more cautious, more fixated on stability, and more likely to look back to some golden past. Liberals, by contrast, tend to be more progressive and forward-thinking, more inclined to advocate for individual rights, and so more interested in things like education and health care.
In America, the right does tend to be racist. That's partly because it has no interest in helping disadvantaged people, and partly because the right is exclusionary: they want the rights and the money for themselves. They don't much like the idea of women's rights, either. Liberals, on the other hand, want the benefits to be distributed more broadly. "Woke" is a silly term that I wish people would give up, and it's bizarre that it's used as an insult. As I understand it, it originally referred to a kind of enlightenment, an awareness of social injustice, and it's typical of the right to turn it into mockery. In America, at least, the right is far, far more extreme than the left.
Hope that helps a little.