poodaddy
Member
We all need details and you know this.
At least I'm not that one guy who donated to Anita Sarkeesian on here. You know the guy.
This is being made by AI. It took thousands of people decades to make the current Wikipedia. Fighting, fact-checking, researching, fighting some more. It has flaws, but it is at least researched. An AI is going to make a massive mess - imagine this -
Why not. Competition is good.
There should be at least 4 or 5 more encyclopedias made by different people. Wikipedia being the only one for this long showed its limits ages ago.
That pretty much sums up humanity in a nutshell anyhow.
Ah yes, Elon Musk's horny and racist AI. That'll be perfect for cataloging the history of humanity.
The 'doing anything' requires bringing back fact checkers, historians, and moderation (so that lies do not spread like wildfire). But too many people on the internet today see any form of moderation as trampling on free speech. So we're stuck where we are until people finally agree that a standardized level of rules and regulations should be a common thing across 99.9% of sites that don't end in "chan".
I have to agree. More people are just finding shit where they want to see it.
Instead of looking at the core issue, it's just woke this, left that, right that. Lots of complaining on the internet and turning it into memes instead of doing anything, or trying to do anything, about the issues.
It gets things wrong on simple tasks - like I have searched for if a game has been released - and it comes back with 'yes *game* is available to play right now, it released on *gives date two weeks in the future*.'
If the end product is good, I'm all for it. The problem is AI recaps can be bad. Google's is often wrong, especially if you search pro athletes, where it can get messed up scraping pages because the articles it links to talk about his pro athlete father or brother, so the AI gets mixed up about who is who.
Or you see stupid trolling edits. Check out MMA fights and you'll often get someone editing the page to say something stupid like XXX got knocked out, or to make a joke, while the fight is still going on. Then a minute later, someone will edit it back. It's actually funny shit you've got to catch in that moment. lol
It's not like Wiki is top notch. It's just a great site that has tons of topics and is always among the top handful of search links, but the depth of each page can be dogshit and short. It comes down to whether that page has enough public interest for some editors to make a good page about that person or topic. If not, then it's junk.
Just go with what makes sense.
I don't know anything about this, but that sounds like the right move any normal person should make?
I hope not.
The AI Bubble pop will be something to behold.
I hope not.
Made about $100k in AI and quantum stocks in the last 4 weeks.
It gets things wrong on simple tasks - like I have searched for if a game has been released - and it comes back with 'yes *game* is available to play right now, it released on *gives date two weeks in the future*.'
Yeah what the fuck is your problem NaziGaf?!? So what if everything that can be politicized has an enforced leftist take?!? Why can't you just accept the rest of Wikipedia uncritically?!
Leftist bias? Wokepedia?
I mean, if you're only checking articles about Trump, Brexit, Soros, LGBT, Antifa and Palestine, I guess it might be,
but 90% of Wikipedia is about normal neutral stuff, like animals, plants, illnesses, programming, movies etc.
Or is that somehow woke, too?
It was in the Charlie Kirk thread, okay. I'm guessing he probably wasn't the only one on the site who did it, even discounting the brokies who moved on to purpler pastures full time. But if he were the one and only, it'd be kinda funny is all.
We all need details and you know this.
Ideally the same way community notes works, where things get approved when enough people with different viewpoints agree that a note is fair and factually accurate. That said, I'll believe it when I see it, as entire articles are going to be much harder to keep free from bias than a brief note that adds context to a social media post.
I view certain Wikipedia editors exactly like the petty, power-tripping nutjob mods on Reddit.
I can't see Grokipedia being much of an improvement. How is an AI supposed to determine the trustworthiness of the data when it's learning from this kind of garbage in the first place?
By making the moderation a lot more lax?
Wouldn't trust Elon Musk as far as I could throw him.
His freedom of speech is essentially him viewing himself as the main character of the story with all of us as NPC's.
He wants an echo chamber to his views.
It's amazing that people are still blind to this, simply because they perceive certain facts as having a "political agenda".
Wouldn't trust Elon Musk as far as I could throw him.
His freedom of speech is essentially him viewing himself as the main character of the story with all of us as NPC's.
He wants an echo chamber to his views.
The AI is not actually Elon Musk lol.
Wouldn't trust Elon Musk as far as I could throw him.
His freedom of speech is essentially him viewing himself as the main character of the story with all of us as NPC's.
He wants an echo chamber to his views.
There are 7 million entries on Wikipedia just for the English language - with most having *made up number* hundreds of tweets worth of information - lots of it very specialized. It could work in the long term, but as Wikipedia already exists, I don't see people putting in the effort beyond Jan 6th, immigration, Islam, transgender, Epstein, or other popular political topics. If the entry for SN2 reactions is wrong, who is going to put in the effort to fix it when the Wikipedia version already exists?
Ideally the same way community notes works, where things get approved when enough people with different viewpoints agree that a note is fair and factually accurate. That said, I'll believe it when I see it, as entire articles are going to be much harder to keep free from bias than a brief note that adds context to a social media post.
Regardless of what you think of Grok or Musk, the real discussion topic is the accuracy of Wikipedia on more biased topics, or the edit battles you see.
I stopped donating to them long ago. People treat it like Delphi, but while its value is underpinned by human knowledge, its worth is eroded by human management.
I don't think the value proposition here is Grok supplanting Wiki, but the outcome of AI being free to create its own Delphi and be measured (going on to other things).
We are at the earliest stages of an arms race. Eventually they will devour/destroy one another.
Isn't the point to make one AI to rule them all? Why make multiple versions?
I still trust it more than I trust Wikipedia mods.
AI is still too stupid and limited to be trusted with curating human knowledge, and that's before having to trust the lying, egotistical billionaire behind it.
What's so woke about Wikipedia? Can anyone at least list a few examples instead of just spouting shit.
By making the moderation a lot more lax?