I've found in at least two instances Grokipedia had something Wikipedia didn't.
One was looking up who "Ray Peat" was after encountering the name on Twitter. Grok was obviously a bit more fawning over this right-aligned figure, but Wikipedia had long since deleted his page entirely, so I didn't have much of a choice. It seems bizarre to simply not have a page on a subject discussed every day on Twitter.
The other is far more impactful, IMO. Every politician's or political figure's page on Wikipedia just goes "Bob is a politician. In 2025 <list of every controversial thing imaginable>". You have no idea what he's about or what he represents; you don't even know whether anyone cared, since all of this was added in the moment in 2025 and never updated since. Grokipedia does not do this at all. If you want to know about someone's actual political career, Grokipedia weights recent controversies equally with past ones and isolates them all in a section specifically for controversies.
ryandrake 1 hour ago [-]
The concept of Grokipedia reminds me of the old (now defunct? won't load) "Conservapedia" project that basically only had detailed pages for topics where observable fact was incompatible with political ideology--so for these topics, the site showed the Alternative Facts that conformed to that ideology. If you looked up something non-political like "Traffic Light" or "Birthday Cake" there would be no article at all. Because being a complete repository of information was not an actual goal of the site.
ilamont 19 minutes ago [-]
Another defunct site is Deletionpedia, which compiled articles that had been removed from Wikipedia for not meeting various criteria (usually relating to notability, IIRC). The site is dead, but the HN discussion lives on:

"Deletionpedia: Rescuing articles from Wikipedia's deletionism": https://news.ycombinator.com/item?id=31297057

(https://www.stupidedia.org, a German-only satirical wiki)
Besides the political slant of Grokipedia, it's true that a lot of work that once had to be crowdsourced can now be packaged as work for LLMs. We all know the disadvantages of using LLMs, so let me mention some of the advantages: much higher speed; much greater imperviousness to groupthink, cliques, and organised campaigns; truly ego-less editing and debating between "editors". Grokipedia is not viable because of Musk's derangement, but other projects, more open and publicly auditable, might come along.
Avshalom 55 minutes ago [-]
"higher speed" isn't an advantage for an encyclopedia.
The fact that Musk's derangement is clear from reading Grokipedia articles shows that LLMs are not so impervious to ego after all. Combine easily ego-driven writing with "higher speed" and all you get is even worse debates.
delecti 27 minutes ago [-]
It's not an advantage for an encyclopedia that cares foremost about truth. Missing pages are a disadvantage, though.
b00ty4breakfast 52 minutes ago [-]
LLMs are only impervious to "groupthink" and "organized campaigns" and other biases if the people implementing them are also impervious to them, or at least doing their best to address them. This includes all the data being used and the methods they use to process it.
You rightfully point out that the Grok folks are not engaged in that effort to avoid bias, but we should hold every one of these projects to a similar standard and not just assume that due diligence was done.
dghlsakjg 53 minutes ago [-]
> much more impervious to groupthink
Citation very much needed. LLMs are arguably concentrated groupthink (albeit a different type than wiki editors - although I'm sure they are trained on that), and are incredibly prone to sycophancy.
Establishing fact is hard enough with humans in the loop. Frankly, my counterargument is that we should be incredibly careful about how we use AI in sources of truth. We don't want articles written faster, we want them written better. I'm not sure AI is up to that task.
greggoB 42 minutes ago [-]
> impervious to groupthink, cliques, and organised campaigns
Yeeeeah, no. LLMs are only as good as the datasets they are trained on (ie the internet, with all its "personality"). We also know the output is highly influenced by the prompting, which is a human-determined parameter, and this seems unlikely to change any time soon.
This idea that the potential of AI/LLMs is somehow not fairly represented by how they're currently used is ludicrous to me. There is no utopia in which their behaviour is somehow magically separated from the source of their datasets. While society continues to elevate and amplify the likes of Musk, the AI will simply reflect this, and no version of LLM-pedia will be a truly viable alternative to Wikipedia.
mschuster91 3 minutes ago [-]
The core problem is that an AI training process can't, by itself, know during training that part of its training dataset is bad.
Basically, a normal human with some basic media literacy knows that tabloids, the "yellow press" rags, Infowars or Grokipedia aren't good authoritative sources and automatically downranks their content or refuses to read it entirely.
An AI training program, however? It can't skip over B.S.; it relies on the humans compiling the dataset. Otherwise it will just ingest everything and rank it 1:1 with authoritative, legitimate sources.
lich_king 11 minutes ago [-]
Right, but the reason that Conservapedia fizzled out is that you can't really build a critical mass of human editors if the only reason your site exists is that you have a very specific view on dinosaurs and homosexuality (even among hardline conservatives, most will not share your views).
What's different with Grokipedia is that you now have an army of robots who can put a Young Earth spin on a million articles overnight.
I do think that as it is, Grokipedia is a threat to Wikipedia because the complaints about accuracy don't matter to most people. And if you're in the camp that the cure to the subtle left-wing bias of Wikipedia is robotically injecting more egregious right-wing bias, the project might even be up your alley. The best hope is that everyone else gets the same idea and we end up with 50 politically-motivated forks at each other's throats.
atonse 37 minutes ago [-]
Have you tried Grokipedia yet?
Cuz you’ve mainly addressed the concept. But have you read a bunch of articles? Found inaccuracies? Seen the edit process?
Cuz, regardless of ideology, this edit process couldn't have been done before, because AI like this didn't exist until now.
weregiraffe 18 minutes ago [-]
Who decided what is an observable fact?
not2b 1 hour ago [-]
Conservapedia had to have a person create each article and didn't have the labor or interest. Grok can spew out any number of pages on any subject, and those topics that aren't ideologically important to Musk will just be the usual LLM verbiage that might be right or might not.
graemep 8 minutes ago [-]
Some other articles are fine, but it's horribly unreliable.
I tried the subject of the first Wikipedia article in my browser history search: the Malleus Maleficarum. The first part of the text was correct and a decent summary; the rest suddenly switched to an article about an album called Hammer of the Witch by a band called Ringworm.
Images are a weak point. There's no image of the Sri Lankan flag, and a weird one of the flag of the UK, whose caption reads "The national flag of the United Kingdom, known as the Union Jack" - bad! Wikipedia has a better image, and an entire article on the Union Flag.
The article on marriage vows (another one I had looked at recently) seems more extensive than Wikipedia's, but only because it conflates vows with wedding ceremonies. Wikipedia's interpretation is narrower but covers the subject matter much better. Grokipedia would not have told me what I wanted to know, while Wikipedia does.
I do not see the point. If I want AI-written answers, a chat interface is better. That might be a real threat to Wikipedia, but an AI-written equivalent is not.
robin_reala 44 minutes ago [-]
Side note, but Kagi has a great feature where you can remove worthless sites like Grokipedia from your results so that you can safely forget they exist. Recommended.
mzajc 32 minutes ago [-]
For users of other search engines, the uBlacklist extension[0] is a godsend. It'll also apply the same blacklist to every search engine you use.
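For anyone setting it up: uBlacklist rules go one per line, using Chrome-extension-style match patterns (regex rules written as /pattern/ also work). A sketch of what hiding Grokipedia might look like, assuming grokipedia.com is the domain you want to block:

    *://grokipedia.com/*
    *://*.grokipedia.com/*

The second pattern catches any subdomains as well.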
They also have a report form for slop sites, but none of mine have been reviewed yet (I have 5 reports since November, and the help page still says "We will start processing reports officially in January.")

[0]: https://github.com/iorate/ublacklist
beloch 36 minutes ago [-]
Grokipedia is currently:
1) Less accurate than Wikipedia.
2) More verbose, harder to read, and less well organized than Wikipedia.
Pick a non-political topic and compare the Wikipedia page to the Grokipedia page. It's not even close.
If Grokipedia ever closes the #2 gap, then we might start to see a non-negligible number of users ignoring #1. At present, only the most easily offended political snowflakes would willingly inflict Grokipedia on themselves.
tbrownaw 50 minutes ago [-]
Grokipedia is a tool for converting money into improvements in AI (by iterating on it). Any outward resemblance to an encyclopedia is incidental, despite apparently being the intended purpose.
lich_king 26 minutes ago [-]
The thing is, this doesn't come down to merit. Search engines are what made Wikipedia popular: for two decades, they sent every other query there. And in the past there was no alternative, except for spammy mirrors and paywalled sites.
I don't like the idea or the execution of Grokipedia, but it's already showing up in my search results. If it continues to rise, it will become a real threat to Wikipedia even if it's full of errors and LLM-speak. Most people won't know or care. They're already being served compute-constrained AI summaries in search and most are loving it.
bdcravens 33 minutes ago [-]
In terms of total size, it absolutely has a long way to go. How it ends up remains to be seen.
Much of the conversation around it has been disingenuous, focusing on growth percentages as opposed to actual size. Once upon a time, the Parler and Truth Social apps were also at the top of the charts based on growth.
grumbel 20 minutes ago [-]
Grokipedia is at 6,092,140 articles, English Wikipedia has 7,141,148. So it's pretty close already after just four months.
lysace 15 minutes ago [-]
IMO, these two things are both true:
a) Wales' co-founder Larry Sanger is largely correct about the bias of Wikipedia
b) Grokipedia is a joke
HardwareLust 45 minutes ago [-]
That is exactly what it is.
ggoo 1 hours ago [-]
Accurate.
Havoc 32 minutes ago [-]
Now when mechahitler needs a source to back that musky is indeed fitter than usain bolt he has a reference
ghywertelling 54 minutes ago [-]
[flagged]
whiteclawlegal 2 hours ago [-]
[flagged]