I've been wondering - if women ruled the world, would football quietly go away?
I'm not concerned with whether it's played by men or women, but if women held the majority of power, as men do now, would football players be billionaire heroes? Would the game make the news headlines? If not, where might the seemingly endless money involved in it be channelled instead?
Whadya reckon?