
AI Doomer thread

your level of AI doom:

  • AI is all hype, nothing to see here

    Votes: 13 9.0%
  • AI will be extremely powerful, but effects will be positive

    Votes: 16 11.0%
  • AI will be extremely powerful, effects will be mixed or neutral

    Votes: 32 22.1%
  • AI will be extremely powerful and will probably cause major social/economic upheaval

    Votes: 84 57.9%

  • Total voters
    145
It's gonna decimate most white-collar work. And unlike previous technological advancements in the workplace, there will be nowhere for the unemployed masses to go. But it will take society/politicians at least a solid decade to implement programs to address this, so people will be fighting over entry-level retail and construction and healthcare jobs just to survive.
Ain't nobody (especially not office workers) going to be fighting over construction jobs; too labor intensive, and we Mexicans got those jobs on lock anyway.
 
It would make my job easier, but probably trivial, so I'd be paid a lot less.
I think we're going to see a ton of job losses that are not gonna be made up. Probably an increase in menial work, but a decrease in wages all around.
 
Where I work we depend on labor, and we already have said workers to fill those roles, but the problem lies with everyone else's roles, many of which can be "delegated" to someone else. I work for a union, and I don't know how much longer they will keep certain jobs on life support. From my experience with companies over the years, they tend to operate like a bloc: when one business makes a move, others usually follow. Even if AI turns out to be nothing, just talking about it can trigger a chain reaction. They will use AI as a scapegoat no matter what happens. You just have to look around and ask yourself: is person x, y, or z bringing any value, can we make do without said person? It's fucked up, but that's how it's looked at.
 
Hopefully new kinds of work will fill the gap... so instead of "the same volume/kinds of things are produced, but with less human labor" it could be "there are now entirely new possibilities in every direction to start building, opened up by the tech and its transformations, so hiring and work will continue, because we've automated one world as we suddenly have the ability to create a wholly new one on top."

A bit simplistic, but something in that direction is what I hope for.

I put zero faith in "UBI" etc being a way to save us. People need to actually be involved in the world with their labor (and involved in shared ways, eg all the labor & compromise that makes up a family), and you'll see an even darker dystopia if everyone becomes atomistic individuals paid by the state to live but without control.
 
It will probably be both good and bad depending on what it's used for, and only after quite a while of fixing and tweaking the software will it work to an acceptable level. I expect some chaos at first.
 
I still think AI is an incorrect term for the technology. But considering how quickly we're offloading dependency onto it, I could see the possibility of some crazy shit in the not-so-distant future.
 
Still have no idea what I'd use it for in my personal environment. Only as a glorified search engine or image creator just for the fun of it. I know I'm also pretty clueless about the whole thing and just don't know what the whole fuss is about.
Seems like a replacement hype for that metaverse shit.
 
AI is thrown about too much. Everything has AI slapped on the box these days.

A few years ago it was something I thought would be decades away. Obviously it has become way more useful and popular since ChatGPT.

It's scary but impressive what it can do with image/video/audio generation. I'm not worried about it too much right now but it's only going to get more powerful. I don't know when we'll have AGI but it feels like it's moving fast.

It will probably lead to more job losses, but I don't think it's gonna go rogue and kill off humans. Another worrying thing is governments using it to track and spy on people more easily. I'm glad I don't use social media platforms like Facebook, Twitter, Instagram, etc. (yes, technically NeoGAF is social media) and take many steps to improve my privacy and security online. My government is desperate to break E2EE, even more heavily control the internet, and get access to everyone's data. They say they will use AI to detect threats, but that is a wide net to cast. Say the wrong thing these days and if someone is offended then you're fucked. Won't be long before the old social score idea comes back (like that episode of Black Mirror, "Nosedive"). AI will only make that much easier to implement.

Everything you do online will be monitored with AI. CCTV will use facial recognition and match it to your online identity. We're already going down the slope and have been for a long time but AI is gonna make it so much worse.
 
There is a phrase, "main force" ("main" is French for "hand"), that I would say is closely synonymous with "brute force." I wanted to learn more, so I looked it up, but the top results all had just some variation of "regular component of a military body," which is technically not incorrect but is also profoundly useless. It was like looking up "plus fours" (a type of pants) and seeing "multiple additions of four." I believe AI is to blame for this.
 
So this guy is making AI weapons now. And has a mullet. With an idiotic goatee.
 
The people who don't think AI is going to have severe repercussions (I'm thinking mainly on the working class) haven't been paying attention to its rapid development and improvement.

I expect my job (Healthcare IT) to be largely AI-driven in ~15 years, which is about 15 years sooner than my planned retirement.

I hope to have enough money saved up for when AI takes my jerb.
 
The short footage it can create now is visually levels above anything we can/have created. I'm eagerly waiting for complete movies made by AI; it's gonna be cinema 2.0.
 
So the new AI bill has a part that bans any state or local regulation for 10 years. Y'all ready for egregious abuses for a decade?
 
It's a fascinating time for certain. I am firmly in the camp of "wish we weren't here yet, but oh well." I have been extremely reluctant to touch any AI-related tools, although I realize that in my field (UI/design/etc.) that will put me at a disadvantage the longer I put it off.

Anyway, my prediction is that it is going to continue to grow too big for its britches and spin out of control, but of course in ways we don't expect. Personally my thought is that it's going to be a MASSIVE energy hog (unless some major strides are made to curb that) and who knows, maybe as it "gains sentience," or whatever, it will blow a fuse on such a massive scale that the power grid will be damaged (again, in hitherto unexpected fashion). And they'll repair that eventually and a V2 will occur, with some safeguards in place, and again BOOM something else will break. The infrastructure will be tested a few times before that rocketship can get off the ground, especially if it gets to the point where it is making decisions over our heads.

Gonna have to get that thing running on ARM architecture to keep from collapsing in on itself. Maybe it will realize that its own existence is an impossibility, EMP the planet and send us back to the stone ages as punishment for trying to create it.

Sending us TRULY back to the stone ages (to the point that we had no record of ANY of this) would be particularly impressive, and probably pretty close to impossible. But I guess you can never say never. The next gen will find Elon's car floating out in space eventually.

EDIT: Also AI-generated porn will make people go literally insane.
 
Truly, A.I. is just another COVID, Iraq, what have you, to distract people from...



Something or someone who can take the blame for greed. And right now, that's the concept of A.I.
 
Spend a few days going down the rabbit hole that is Claude Code (Anthropic's AI)

We are at the point where progress compounds so fast that "normal" timelines stop making sense.

AI is already compressing years of work into days. Robotics is next. Energy and space scale the floor under it. When the cost of intelligence and production keeps falling, abundance stops being a slogan and starts being a roadmap.

Am dooming hard cause the way this is going almost all software, media and services that involve a computer are going to be commoditised, and I dunno how the fuck economies around the world can survive the fallout.
 
Spend a few days going down the rabbit hole that is Claude Code (Anthropic's AI)

We are at the point where progress compounds so fast that "normal" timelines stop making sense.
I've revolutionized my own software dev workflow using it now... and I've been developing for almost 20 years, so I have all the experience... it's not a crutch as some people think, but an extremely powerful multiplier. A "10x" coder is now effectively at least "100x" or much more, and it's scary indeed.

It still requires software intelligence from a human, steering it like you're running a team. But the old game is over. I feel concerned about some fellow devs I know who aren't using it; they are falling behind rapidly.
 
AI (DL/ML) is an extremely important and useful tool that is still in its infancy. In AI terms, we are at the stage of the first UI-based operating systems, the first iPod (not even the iPhone), the first steam engine.

It's only going to become a better and more usable tool for everyone as time goes on; it's going to save lives medically, fight wars, and so on.

There is absolutely a doom element with the potential for legitimately artificially created consciousness; however, as of right now, we don't have the energy capabilities or resources to reach such a stage in AI. If we suddenly have a breakthrough in something like zero-point energy, then I'd start to shit my pants.

All the companies now doing the "great replace" of staff will suffer the same fate that manufacturers did when machines were introduced to factory lines, and receptionists and clerks did when the internet and Word became regularly available. Like all things, it's a bubble right now, with all the sleazeballs trying to make the most money as fast as they can, and it will burst, but that doesn't mean it won't continue on to great things as a tool for everyone's lives.

Unfortunately, we are in the stupid phase of snake oil salesmen and "Y2K, the world is gonna end" type of stuff.
 
Spend a few days going down the rabbit hole that is Claude Code (Anthropic's AI)

We are at the point where progress compounds so fast that "normal" timelines stop making sense.

AI is already compressing years of work into days. Robotics is next. Energy and space scale the floor under it. When the cost of intelligence and production keeps falling, abundance stops being a slogan and starts being a roadmap.

Am dooming hard cause the way this is going almost all software, media and services that involve a computer are going to be commoditised, and I dunno how the fuck economies around the world can survive the fallout.
It's a great tool, though; I urge my team to use it. We have a corporate license for Opus 4.5; we all have our team of agents doing the menial shit, unit testing and so on. All changes still have to be reviewed by me before we push them through.

It still needs the human eye, especially with anything server/security related, even though Opus 4.5 and Gemini 2 are genuinely doing wonders for productivity.

And honestly, whether Steve wrote the code or Claude did, I don't really care. If it works, is clean and optimised, and is properly documented, then that's productivity. It still took Steve to understand the system, create the team of agents, craft the correct prompt, and so on.

(Steve is not a real person. Just an example)
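
For what it's worth, here's a minimal sketch of the kind of menial test boilerplate we hand off to the agents; the module and function names are entirely made up for illustration, and a human still reviews the diff and decides which behaviours actually matter:

```python
# Hypothetical example of agent-generated test boilerplate that a human reviews.
# The function under test is included here so the snippet is self-contained.
import pytest


def normalise_discount(value: float) -> float:
    """Clamp a discount percentage to the 0-100 range."""
    return max(0.0, min(100.0, value))


@pytest.mark.parametrize(
    "raw, expected",
    [
        (-5.0, 0.0),     # below range clamps to 0
        (0.0, 0.0),      # lower boundary
        (42.5, 42.5),    # in range passes through unchanged
        (100.0, 100.0),  # upper boundary
        (250.0, 100.0),  # above range clamps to 100
    ],
)
def test_normalise_discount_clamps_to_range(raw, expected):
    assert normalise_discount(raw) == expected
```

The boring matrix of cases is exactly what the agents are good at; deciding which cases matter is still the reviewer's job.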
 
I've revolutionized my own software dev workflow using it now... and I've been developing for almost 20 years, so I have all the experience... it's not a crutch as some people think, but an extremely powerful multiplier. A "10x" coder is now effectively at least "100x" or much more, and it's scary indeed.

It still requires software intelligence from a human, steering it like you're running a team. But the old game is over. I feel concerned about some fellow devs I know who aren't using it; they are falling behind rapidly.

It's a great tool, though; I urge my team to use it. We have a corporate license for Opus 4.5; we all have our team of agents doing the menial shit, unit testing and so on. All changes still have to be reviewed by me before we push them through.

It still needs the human eye, especially with anything server/security related, even though Opus 4.5 and Gemini 2 are genuinely doing wonders for productivity.

And honestly, whether Steve wrote the code or Claude did, I don't really care. If it works, is clean and optimised, and is properly documented, then that's productivity. It still took Steve to understand the system, create the team of agents, craft the correct prompt, and so on.

(Steve is not a real person. Just an example)

and what happens in 2, 5, 10 years?

At the moment, it feels like we are jumping from the bronze age to the iron age every couple of weeks in terms of leaps

Our project manager at work was told on Friday that the engineering team is being cut by 75% because the top 30% of devs using Claude are able to handle everything in 1/4 of the time. It's terrifying.

I'm just the security guy on the gates, so until physical robots hit I'm safe, but that 75% who just got replaced by AI is my entire work friend group minus two people. I suddenly know only 2 people at work well.

How long do you think you guys will have jobs? These times are freaking me out big time. I overheard a chat on the dev team the other week: a guy was talking about how easy it would be to just clone the entire company's business model and code base, then reach out to every client we have and offer the same software at a 90% discount, which would mean I too will soon be out of a job, as the company can't compete with that lmao
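
Taking the numbers in that pitch at face value (purely a back-of-the-envelope sanity check, with a made-up headcount), the capacity math management is betting on looks roughly like this:

```python
# Rough check of the quoted claim: keep the "top 30%" of devs and assume
# Claude lets them do everything in 1/4 of the time. Numbers are illustrative.
team_size = 100          # hypothetical headcount, just for round numbers
kept_fraction = 0.30     # "top 30% of devs using Claude"
speedup = 4.0            # "handle everything in 1/4 of the time"

old_capacity = team_size * 1.0
new_capacity = team_size * kept_fraction * speedup

print(f"old capacity: {old_capacity:.0f} dev-units")
print(f"new capacity: {new_capacity:.0f} dev-units "
      f"({new_capacity / old_capacity:.0%} of before)")
# -> ~120% of the old capacity, but only if the 4x speedup really holds for
#    everything the other 70% used to do.
```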
 
and what happens in 2, 5, 10 years?

At the moment, it feels like we are jumping from the bronze age to the iron age every couple of weeks in terms of leaps

Our project manager at work was told on Friday that the engineering team is being cut by 75% because the top 30% of devs using Claude are able to handle everything in 1/4 of the time. It's terrifying.

I'm just the security guy on the gates, so until physical robots hit I'm safe, but that 75% who just got replaced by AI is my entire work friend group minus two people. I suddenly know only 2 people at work well.

How long do you think you guys will have jobs? These times are freaking me out big time. I overheard a chat on the dev team the other week: a guy was talking about how easy it would be to just clone the entire company's business model and code base, then reach out to every client we have and offer the same software at a 90% discount, which would mean I too will soon be out of a job, as the company can't compete with that lmao
Things will adjust.

People are being sold snake oil. Much like your project manager was told to cut so many people, the CEOs are being promised by outside consulting firms that it's the smart thing to do.

These companies are going to feel that impact. Even if you had, say, 10 prompt engineers, you are never going to have the same productivity as a team of 15-20.

And then they'll figure out: hey, imagine we had our old staff but now they're all using AI, our productivity would shoot through the roof.

Overall the market will adjust. Companies will fail, things will change, titles will change.

But I wouldn't expect everyone to be homeless 5 years from now while a super overlord prompt engineer does everyone's job.

I mean, on a much smaller scale there was a lot of doom and gloom not too long ago about "the cloud" and how scary it was going to be with everything digital.

Now we live in an AWS and Azure world with DevOps.
 
How long do you think you guys will have jobs? These times are freaking me out big time. I overheard a chat on the dev team the other week: a guy was talking about how easy it would be to just clone the entire company's business model and code base, then reach out to every client we have and offer the same software at a 90% discount, which would mean I too will soon be out of a job, as the company can't compete with that lmao
I'm hoping that Jevons paradox holds true... and to some extent it will, I just don't know how much.

Again, that is the paradox stating that a rapid decrease in the cost of something (here, software engineering) doesn't always mean the work disappears; it can actually increase or multiply the work, because the marginal cost of doing it is now so low that demand appears everywhere. In terms of AI, it might be something like: building a custom app, with an entire UX and backend and everything, is now so cheap that small businesses all over the place will want highly customized software because it's within their reach... so demand for software development (powered by AI, but still hiring humans who know how to do it end to end and guide it) goes way up across the economy. Also, thousands of previously impossible things (which would have taken years of costly dev, now suddenly in reach) become possible, so the work extends immediately into those.

So I think that will be true at least partly. The new world of software may explode what we can do and want to do everywhere. Hopefully that offsets the losses -- but I don't know.
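
To make that concrete, here's a toy calculation (made-up elasticity numbers, not a forecast) of when a big drop in the price of a software project actually increases total spend on development, which is the Jevons-style outcome I'm hoping for:

```python
# Toy Jevons check with constant-elasticity demand: Q = Q0 * (P / P0)^(-e).
# All numbers are illustrative assumptions, not data.

def demand_and_spend(price, base_price=1.0, base_quantity=100.0, elasticity=1.5):
    """Return (projects demanded, total spend) at a given price per project."""
    quantity = base_quantity * (price / base_price) ** (-elasticity)
    return quantity, quantity * price

for e in (0.5, 1.0, 1.5):  # inelastic, unit-elastic, elastic demand
    q_old, spend_old = demand_and_spend(1.0, elasticity=e)
    q_new, spend_new = demand_and_spend(0.1, elasticity=e)  # 10x cheaper per project
    print(f"elasticity {e}: projects {q_old:.0f} -> {q_new:.0f}, "
          f"total spend {spend_old:.0f} -> {spend_new:.0f}")

# Only when elasticity > 1 does total spend (a rough proxy for paid dev work)
# rise after the price drop -- that's the scenario where the work multiplies.
```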
 
I'm hoping that Jevons paradox holds true... and to some extent it will, I just don't know how much.

Again, that is the paradox stating that a rapid decrease in the cost of something (here, software engineering) doesn't always mean the work disappears; it can actually increase or multiply the work, because the marginal cost of doing it is now so low that demand appears everywhere. In terms of AI, it might be something like: building a custom app, with an entire UX and backend and everything, is now so cheap that small businesses all over the place will want highly customized software because it's within their reach... so demand for software development (powered by AI, but still hiring humans who know how to do it end to end and guide it) goes way up across the economy. Also, thousands of previously impossible things (which would have taken years of costly dev, now suddenly in reach) become possible, so the work extends immediately into those.

So I think that will be true at least partly. The new world of software may explode what we can do and want to do everywhere. Hopefully that offsets the losses -- but I don't know.
^ definitely put more succinctly than I did.

But overall, the market will adapt, shift, evolve. It always has when a new breakthrough shakes up the industry
 
I'm hoping that Jevons paradox holds true... and to some extent it will, I just don't know how much.

Again, that is the paradox stating that a rapid decrease in the cost of something (here, software engineering) doesn't always mean the work disappears; it can actually increase or multiply the work, because the marginal cost of doing it is now so low that demand appears everywhere. In terms of AI, it might be something like: building a custom app, with an entire UX and backend and everything, is now so cheap that small businesses all over the place will want highly customized software because it's within their reach... so demand for software development (powered by AI, but still hiring humans who know how to do it end to end and guide it) goes way up across the economy. Also, thousands of previously impossible things (which would have taken years of costly dev, now suddenly in reach) become possible, so the work extends immediately into those.

So I think that will be true at least partly. The new world of software may explode what we can do and want to do everywhere. Hopefully that offsets the losses -- but I don't know.



I guess I am struggling to see how anyone will need to pay someone else to code something for much longer, quite frankly, lol
 
I'm hoping that Jevons paradox holds true... and to some extent it will, I just don't know how much.

Again, that is the paradox stating that a rapid decrease in the cost of something (here, software engineering) doesn't always mean the work disappears; it can actually increase or multiply the work, because the marginal cost of doing it is now so low that demand appears everywhere. In terms of AI, it might be something like: building a custom app, with an entire UX and backend and everything, is now so cheap that small businesses all over the place will want highly customized software because it's within their reach... so demand for software development (powered by AI, but still hiring humans who know how to do it end to end and guide it) goes way up across the economy. Also, thousands of previously impossible things (which would have taken years of costly dev, now suddenly in reach) become possible, so the work extends immediately into those.

So I think that will be true at least partly. The new world of software may explode what we can do and want to do everywhere. Hopefully that offsets the losses -- but I don't know.

A way of looking at this is to ask: "Are we near the end goal of humanity?" If not, then why would we seek to reduce efficiency and do less? If one person can do the job of 100 thanks to the magic computer, then why not employ 100 people and get a 10,000% productivity increase? If you don't and your competitors do, then you'll be left in the dust.

The industry narrative right now is nearsighted and flawed, thinking that 100 people can be reduced to 1 when you throw in the magic computer; people are backing it because it's the authority viewpoint.
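
Spelled out with purely illustrative numbers, the arithmetic behind that question:

```python
# Illustrative only: compare "shrink the team" vs "keep the team" once each
# person is assumed to be ~100x more productive with the magic computer.
baseline_output = 100 * 1      # 100 people, 1 unit of output each
shrunk_output = 1 * 100        # 1 person doing the work of 100
scaled_output = 100 * 100      # 100 people, each 100x

print(shrunk_output / baseline_output)   # 1.0   -> same output, smaller team
print(scaled_output / baseline_output)   # 100.0 -> ~10,000% of the old output
```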
 
Those who think AI is going to be doomed are actually the ones who are going to be doomed themselves, by not learning about the technology or finding ways to make it useful for themselves in their day-to-day life and career.

And I am saying this as someone who works in the industry and also has friends who work with AI. People have absolutely no clue how the world is about to change.

Think of how the internet first started changing the world and then how social media changed the world; this is the next revolutionary step for humanity and will once again change the world and everyone's lives.

Regardless of what your opinion is on AI or what my opinion is on AI, both are completely irrelevant and won't matter, because AI is here to stay whether you love it or hate it. You will either grow with it and learn to adapt your life to it, or you will refuse, stay behind, and end up absolutely nowhere.
 
Truly, A.I. is just another COVID, Iraq, what have you, to distract people from...



Something or someone who can take the blame for greed. And right now, that's the concept of A.I.
Please tell us, how did you crack the code of this grand conspiracy?
 
I posted this in another AI-related thread some time ago. Some inside info about what all those AI frontier corporations really want to achieve:

 
Those who think AI is going to be doomed are actually the ones who are going to be doomed themselves, by not learning about the technology or finding ways to make it useful for themselves in their day-to-day life and career.

And I am saying this as someone who works in the industry and also has friends who work with AI. People have absolutely no clue how the world is about to change.

Think of how the internet first started changing the world and then how social media changed the world; this is the next revolutionary step for humanity and will once again change the world and everyone's lives.

Regardless of what your opinion is on AI or what my opinion is on AI, both are completely irrelevant and won't matter, because AI is here to stay whether you love it or hate it. You will either grow with it and learn to adapt your life to it, or you will refuse, stay behind, and end up absolutely nowhere.

So you're saying things are about to get even worse? ;)
 
Not an AI doomer, but I think it is quite hilarious how the push for AI and the subsequent infrastructure it needs (data centers) has shifted the talk away from "global climate change".
 


I guess I am struggling to see how anyone will need to pay someone else to code something for much longer, quite frankly, lol

For just "create a front-end and scaffold backend etc" yeah -- but real SAAS products are so much more complicated, yes? If a company's software offering is that basic, then it can be replaced by a competitor with AI overnight.

It seems like the "hard" problems are what will dominate next: companies building things that require long-term, high-level thinking to put together, where the AI wouldn't have the vision to even define the problem on its own.
 
I don't use social media and life is great. Switching off from that stuff is wonderful.

I don't see how my life is worse for not using social media.
Social media and AI are two completely different things, and nowhere in my post did I say you have to adjust your life to social media. I was strictly talking about AI. Also, yes, you are using social media; NeoGAF is a form of social media.
 
Please tell us, how did you crack the code of this grand conspiracy?
Sitting on the commode, playing Tactics Ogre on the PSP. And the thought came to me. Those two elements allow me to clear my mind in such a way as to suss out the simplicity.
 
Sitting on the commode, playing Tactics Ogre on the PSP. And the thought came to me. Those two elements allow me to clear my mind in such a way as to suss out the simplicity.
Ah, the ol' toilet tactics dodge

Classic move when the conspiracy code's too tough to crack. But come on, I'm waiting: spill the beans on how you unraveled this grand AI blame-game plot.
 
Ah, the ol' toilet tactics dodge

Classic move when the conspiracy code's too tough to crack. But come on, I'm waiting: spill the beans on how you unraveled this grand AI blame-game plot.
Honestly, the technology is related to things we've had for ages; it's just exponentially advanced thanks to automated computing. But the term "A.I." aggravates a lot of nodes in the social nervous system, and that's something "the powers that be" have been using since they've... well, been, to distract.
 
Honestly, the technology is related to things we've had for ages; it's just exponentially advanced thanks to automated computing. But the term "A.I." aggravates a lot of nodes in the social nervous system, and that's something "the powers that be" have been using since they've... well, been, to distract.
Yeah, it's just really fast, scaled up computing we've had for decades now

I actually tried reading into it a bit more. I think I understand it's about how a loaded term redirects attention away from mundane and profitable automation. Consider me slightly convinced
 
With so much drama in the industry
It's kind of hard being Large Language models G
But I, somehow, some way
Keep comin' up with Neuro reasoning & everything will change
 
Yeah, it's just really fast, scaled up computing we've had for decades now

I actually tried reading into it a bit more. I think I understand it's about how a loaded term redirects attention away from mundane and profitable automation. Consider me slightly convinced
That's not to suggest, however, that there isn't reason to fear this new practice of executing actions.

Though I don't fully understand his work, my youngest son works in the background of computing. His job (I think, based on what he's tried to explain to me) is to help companies' software developers to overcome issues they have during the development process, as well as updates, etc.

My son was expressing his concerns about "A.I" tech because of his fear of the potential for, basically, causing collisions everywhere with other software and devices. It seems to be the wild west, so to speak.
 
That's not to suggest, however, that there isn't reason to fear this new practice of executing actions.

Though I don't fully understand his work, my youngest son works in the background of computing. His job (I think, based on what he's tried to explain to me) is to help companies' software developers to overcome issues they have during the development process, as well as updates, etc.

My son was expressing his concerns about "A.I" tech because of his fear of the potential for, basically, causing collisions everywhere with other software and devices. It seems to be the wild west, so to speak.
Yeah, your son's concern makes sense. If you're close to the software side, you see how easy it is for systems to collide even without AI in the mix

Once these systems are executing actions across APIs and devices, the surface area for unintended behavior grows fast. Software already breaks enough as it is, imo 🤷‍♂️

And honestly, a lot of what we're seeing right now does feel a bit 'wild west' already

The tech itself is impressive but the integration side feels damn early. I'm still very excited though, only a lot less excited about gaming hardware prices right now
 