poodaddy
Member
We all need details and you know this.
At least I'm not that one guy who donated to Anita Sarkeesian on here. You know the guy.
This is being made by AI. It took thousands of people decades to make the current Wikipedia. Fighting, fact checking, researching, fighting some more. It has flaws, but it is at least researched. An AI is going to make a massive mess - imagine this -
Why not. Competition is good.
There should be at least 4 or 5 more encyclopedias made by different people. Wikipedia being the only one for that long showed its limits a long time ago.
They pretty much sum up humanity in a nutshell anyhow.
Ah yes, Elon Musk's horny and racist AI. That'll be perfect for cataloging the history of humanity.
The 'doing anything' requires bringing back fact checkers, historians, and moderation (so that lies do not spread like wildfire). But too many people on the internet today see any form of moderation as trampling on free speech. So we're stuck where we are until people finally agree that a standardized level of rules and regulations should be a common thing across 99.9% of sites that don't end in "chan".
I have to agree. More people are just finding shit where they want to see it.
Instead of looking at the core issue, it's just woke this, left that, right that. Lots of complaining on the internet, turning it into memes instead of doing anything, or trying to do anything about the issues.
It gets things wrong on simple tasks - like when I've searched whether a game has been released - and it comes back with 'yes *game* is available to play right now, it released on *gives date two weeks in the future*.'
If the end product is good, I'm all for it. Problem is AI recaps can be bad. Google's is often wrong, especially if you search pro athletes, where it can get messed up scraping pages because the articles it links to talk about the athlete's pro athlete father or brother, so the AI gets mixed up about who is who.
Or you see stupid trolling edits. Check out MMA fights and you'll often see someone editing the page to say something stupid like XXX got knocked out, or to make a joke, while the fight is still going on. Then a minute later, someone will edit it back. It's actually funny shit you have to catch at that moment in time. lol
It's not like Wiki is top notch. It's just a great site that has tons of topics and is always among the top handful of search links, but the depth of each page can be dogshit and short. It comes down to whether that page has enough public interest for some editors to make a good page about that person or topic. If not, then it's junk.
Just go with what makes sense.
I don't know anything about this, but that sounds like the right move any normal person should make?
I hope not.
The AI Bubble pop will be something to behold.
I hope not.
Made about $100k in AI and quantum stocks in the last 4 weeks.
It gets things wrong on simple tasks - like when I've searched whether a game has been released - and it comes back with 'yes *game* is available to play right now, it released on *gives date two weeks in the future*.'
Yeah what the fuck is your problem NaziGaf?!? So what if everything that can be politicized has an enforced leftist take?!? Why can't you just accept the rest of Wikipedia uncritically?!
Leftist bias? Wokepedia?
I mean, if you're only checking articles about Trump, Brexit, Soros, LGBT, Antifa and Palestine, I guess it might be,
but 90% of Wikipedia is about normal neutral stuff, like animals, plants, illnesses, programming, movies etc.
Or is that somehow woke, too?
It was in the Charlie Kirk thread, okay? I'm guessing he probably wasn't the only one on the site who did it, even discounting the brokies who moved on to purpler pastures full time. But if he were the one and only, it'd be kinda funny is all.
We all need details and you know this.
Ideally the same way community notes works, where things get approved when enough people with different viewpoints agree that a note is fair and factually accurate. That said, I'll believe it when I see it, as entire articles are going to be much harder to keep free from bias than a brief note that adds context to a social media post.
I view certain Wikipedia editors exactly like the petty, power-tripping nutjob mods on Reddit.
I can't see Grokipedia being much of an improvement. How is an AI supposed to determine the trustworthiness of the data when it's learning from this kind of garbage in the first place?
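For anyone curious, the approval idea described in the quoted post above ("approved when enough people with different viewpoints agree") can be sketched in a few lines. This is purely an illustration of that idea, not X's actual Community Notes algorithm; the cluster labels, thresholds, and function name here are all made up.

    from collections import defaultdict

    def note_approved(ratings, min_raters=5, min_helpful_share=0.8):
        # ratings: list of (viewpoint_cluster, rated_helpful) pairs.
        # Hypothetical thresholds, not the real system's values.
        if len(ratings) < min_raters:
            return False  # too little data to decide
        votes_by_cluster = defaultdict(list)
        for cluster, helpful in ratings:
            votes_by_cluster[cluster].append(helpful)
        # A cluster "agrees" if most of its raters found the note helpful.
        agreeing = [c for c, votes in votes_by_cluster.items()
                    if sum(votes) / len(votes) >= min_helpful_share]
        # Require agreement from at least two distinct viewpoint clusters.
        return len(agreeing) >= 2

    # Approved: raters from two different clusters both found it helpful.
    print(note_approved([("left", True), ("left", True),
                         ("right", True), ("right", True), ("right", True)]))  # True
    # Not approved: only one cluster rated it helpful.
    print(note_approved([("left", True), ("left", True), ("left", True),
                         ("right", False), ("right", False)]))                 # False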