I haven't been an active Wikipedia editor for quite a while, but if you know the back end of the site, it's very obvious how the mechanisms that Jimmy and Larry put into place to avoid bias have been turned into ways to enforce bias.
I was involved in a small way in exposing Johann Hari's activities, and that was an eye-opener for me in terms of how Hari was able to game the system - for instance, by using multiple sockpuppet accounts while the rules strictly forbade accusing anyone of sockpuppetry. Hari's exposure had to happen offsite, and under WP rules it couldn't even be discussed until a "reliable" third-party source had reported on it. It became a huge issue for a shortish while, until the WP honchos shut it down and assured us that they had new processes that would prevent anything similar happening again... and I notice Hari's page has since been cleaned up to obscure what the whole scandal was about.
That was quite a long time ago, and gaming of the site rules has got much worse since.
I will say that the right-wing or far-right mirror sites (Conservapedia, Infogalactic) have been absolute failures, partly because they don't have the base of volunteer editors (Wiki has many fewer active editors than people think, but few is more than none) and partly because they're not attempting to do anything more than replace Wiki's biases with their own.
Grokipedia seems like something different to me, and it's obvious the AI building it is extremely powerful. From looking at some fairly niche topics that I have knowledge of, it's not perfect, but it's pretty good; the articles are often much more comprehensive than WP's, and the AI is producing quite balanced "some people say x, others say y" sections, which I find more useful than WP's standard "here's what Paul Krugman and Owen Jones think".
So I think it's an interesting experiment in what AI can do, which makes it worth following for me. If it were just a Wikipedia clone with the existing WP bias swapped for Musk's eccentric opinions, as several PPs assume, I wouldn't say that.