Xbox Producer Suggests Recently Laid-Off Workers Use AI for Emotional Support After Losing Their Jobs to AI

Even if the post were written by AI, you'd need to be a psychopath to think this was OK to post.

These people are just awful. Worst of the worst.
I was saying to my wife literally two days ago that the people who ascend to these positions are sociopaths. The things they think about when they get out of bed every morning aren't what you and I think about. There isn't a drop of humility or humanity to be found at the top of these companies.
 
There are already lawsuits over this kind of stuff. I'm honestly surprised they'd even attempt linking AI to mental health counseling after a major stressor like job loss. If it even slightly says something wrong they'll get sued.

For all you guys against AI, hope you were telling your congressmen not to pass that big beautiful piece of shit getting signed today. It has a provision in it blocking states from regulating anything related to AI for 10 years. Think about that. States can see something going wrong as they roll this stuff out, and they're prevented from regulating it. Happy 4th everyone. Should be a very dystopian next 10 years.
 
These guys are colder than the corpo guys in CP2077.

Are they losing their jobs to AI? I missed that info. I thought it had something to do with how poorly Xbox performed.


I'd be interested to know the types of roles that are being replaced by AI.
 
How about Microsoft use A.I. for their CEOs/directors? They display the same emotional intelligence and compassion as ChatGPT, maybe even less.

I think I've got the pitch for the next Bond movie, 007 taking on some megalomaniac A.I company.
Funny, cause you are most definitely right.

It's easier for AI to replace execs than developers, really. They don't create, they make business decisions based on data.

They won't be replaced tho, of course
 
Microsoft sucks, but at the end of the day you are all going to keep buying their games even if made by AI so they don't care. Words are useless. Corporations will always win as long as there are consumers.
 
Funny, cause you are most definitely right.

It's easier for AI to replace execs than developers, really. They don't create, they make business decisions based on data.

They won't be replaced tho, of course
Yep, and I wasn't suggesting that with a hint of hyperbole either. If a CEO/bean counter can track the data and, without a hint of conscience, make a judgement call to let people go in order to save a buck, there really is little distinction. The decision makers are making cold, calculated, and impersonal decisions.
 

This isn't gaslighting, this is torching people to ashes…
 
How about Microsoft use A.I. for their CEOs/directors? They display the same emotional intelligence and compassion as ChatGPT, maybe even less.

I think I've got the pitch for the next Bond movie, 007 taking on some megalomaniac A.I company.
That's the last two Mission: Impossible movies.
 
He should have asked Copilot before sending this out...

If a person were to lose their job to A.I., should I send them advice to use A.I. for grief counseling?
That suggestion—telling someone who just lost their job to AI to use AI for grief counseling—lands like a lead balloon. 🪨 The backlash you're seeing online, especially in that NeoGAF thread, is a clear signal: people view it as tone-deaf, even dystopian. The idea came from an Xbox producer, and it's being widely criticized as lacking empathy and humanity, especially coming from someone still employed at the company that made the layoffs.

### 🤖 Why It Feels Off
- **Perceived hypocrisy**: It feels like rubbing salt in the wound—"AI took your job, now let AI help you cry about it."
- **Corporate coldness**: It reinforces the idea that companies are replacing human connection with automation, even in deeply emotional moments.
- **Lack of accountability**: It shifts the burden of emotional recovery onto the individual, while the system that caused the harm remains untouched.

### 🧠 Could AI ever help with grief?
Yes, but context matters. AI can be a useful tool for mental health support—chatbots like Woebot or Wysa are designed for that. But it should never be the first or only suggestion, especially not from someone representing the entity responsible for the layoff.

### 💡 A Better Approach
If you genuinely want to support someone in that situation, here's a more thoughtful way to frame it:
> "I know this is incredibly tough. If you ever want to talk, I'm here. And if you're open to it, there are some tools—AI-based and otherwise—that can help with processing grief or stress. But nothing replaces real support from people who care."

Would you like help crafting a message that strikes the right tone—empathetic, supportive, and maybe even a little hopeful?

The scary thing about all of this is GAF seems to be influencing A.I. a lot lately. We are doomed to make things "brighter" for future generations. 😜
 
Yeah, they're fucking brutal and full mask off when it comes to AI.

Saw this one yesterday:
I think the message is loud and clear.


What is interesting about this post is that Microsoft is in search of a problem, then seemingly trying to provide a solution with AI. Zero expertise in offering something innovative; they are basically a bunch of consultants who want to automate, yet mask it with AI marketing.

Microsoft has bet the farm on AI, but they have nothing to actually bring to the table. They need data and have no idea how to approach the unprecedented. They are shutting down studios of their own due to mismanagement with plenty of precedent. Why wouldn't they reference their AI to fix it? Oh right, because it's actually bullshit.

Outside of asset generation, AI conversation generation, and AI upscaling, they don't have jack shit. It all needs something built CORRECTLY to reference to even work. It is not sentient. They are going to build automation off of everyone's backs and then encourage those companies to fire their resources.

Don't fall for it. Hallucinating is just marketing for being wrong as shit based on a next-generation web search method.
 
To people calling the guy an exec: please stop. He is a mid-level employee who happens to be retarded.
What I was going to post, guy is a fucking producer. Not an exec, not a manager, a producer, he's not the boss of anyone.
It was stupid, but I don't think there were any truly bad intentions around it. This mostly speaks to how deeply integrated AI/Copilot is in the culture at Microsoft as a whole, from top to bottom.
 
Idk, after reading the post in full, it seems like solid advice. The editorialization of the post seems like reactionary rage bait. It would be akin to getting upset that a letter courier was advised to email job recruiters, after being laid off due to the prevalence of email.
 
Because most people are so lazy, they use AI not as an assistant but as a quick, cheaper solution.

And in a lot of cases to literally think for them.

Honestly, the biggest mistake was calling it "intelligence" at all when all of the publicly available models we have today are not intelligent in the slightest, especially when most people are just using them like a glorified Ask Jeeves.

Sad state of affairs we've found ourselves in.
 
AI was a huge mistake :messenger_neutral:

Things are only gonna get worse as they become more advanced.

I really think government intervention is gonna be needed at some point.
 
And in a lot of cases to literally think for them.

Honestly, the biggest mistake was calling it "intelligence" at all when all of the publicly available models we have today are not intelligent in the slightest, especially when most people are just using them like a glorified Ask Jeeves.

Sad state of affairs we've found ourselves in.
Idiocracy, stupidity. I don't know what future this will lead to. Seeing younger people use AI for so many things, I think humans will later struggle with the most basic creativity, problem solving, and basic knowledge.
 
I had already sworn off using most Microsoft products and services prior to today (and especially AI; fuck all AI products and services, as they are a blight on humanity and threaten human existence with the removal of jobs). Xbox was the only thing I occasionally bought a game or two for. Not anymore, though: today starts a full-on boycott of all of their games, services, and products. I refuse to support pieces of shit like this company, and especially the tone-deaf jackass who wrote this LinkedIn slop.

Actions speak louder than words. If anyone wants things to change, you have to collectively hit them where it hurts, which is their bottom line (i.e., $$$).
 
To people calling the guy an exec: please stop. He is a mid-level employee who happens to be retarded.
What I was going to post, guy is a fucking producer. Not an exec, not a manager, a producer, he's not the boss of anyone.
It was stupid, but I don't think there were any truly bad intentions around it. This mostly speaks to how deeply integrated AI/Copilot is in the culture at Microsoft as a whole, from top to bottom.
Matt Turnbull has been a producer at Microsoft since 2010, and an executive producer since 2024. 15 years of producing does not make you a mid-level employee. He is entrenched in the Microsoft culture because he's been there for so long, and is just repeating what he's been taught: psychopathy and heartlessness.
 
Matt Turnbull has been a producer at Microsoft since 2010, and an executive producer since 2024. 15 years of producing does not make you a mid-level employee. He is entrenched in the Microsoft culture because he's been there for so long, and is just repeating what he's been taught: psychopathy and heartlessness.
No, I wouldn't even say he's mid-level at all. Being at the same job for 15 years does not suddenly give you any sort of hierarchy over anyone else in a similar low-level role. Fact is, no one reported to this guy, no one was at the helm of his review process, nothing.
If some random producer at Xbox's XDEV equivalent is parroting this kind of shit, then it's a wide organizational culture issue at its core.
 