Intel wants to tone down the launch of their Arc Desktop Graphics Cards

I gave Raja the benefit of the doubt before, now I am just pissed off at his incompetence. If he doesn't turn things around for the next launch he has got to go.

I would say get rid of him right now if they had a better replacement lined up.
 
I gave Raja the benefit of the doubt before, now I am just pissed off at his incompetence. If he doesn't turn things around for the next launch he has got to go.

I would say get rid of him right now if they had a better replacement lined up.
I didn't. I knew he was incompetent. I just love the Intel fanboys who took his departure as a horrible thing for AMD. They have THRIVED in his absence.

Fuck that loser.
 
What is wrong with this Raja guy? To start a new graphics generation out of nothing can't be easy.
 
What is wrong with this Raja guy? To start a new graphics generation out of nothing can't be easy.
It's not, but apparently everyone on the internet now is an armchair hardware design and firmware expert, with enough experience to pass judgement.

I hope Intel stays in the dGPU game for the sake of competition and consumer options.
 
What is wrong with this Raja guy? To start a new graphics generation out of nothing can't be easy.

He's not a terrible engineer. But he got a bad reputation from an event a few years ago.
Basically, he led the development of the original Vega, and before any card was out, he was saying things like "Poor Volta".
Then the cards launched and Vega was a bit of a disappointment.
Volta was a big success, especially in the AI field, as it was the first card from Nvidia with Tensor Cores.
 
This is going to be rough for Intel: they will launch a card that comes close to the 3070, and then the 4000 series will launch. I want more competition in the GPU space, so I wanted them to be as awesome as possible, but they were supposed to launch at the start of the year... so we will see how they are received.
 
I didn't. I knew he was incompetent. I just love the Intel fanboys who took his departure as a horrible thing for AMD. They have THRIVED in his absence.

Fuck that loser.
Yeah, I think you were the one who told me how braggadocious and deceitful he was about the Vega architecture. I mean, Vega 56 was still a good product, but they had to price it as such; they clearly meant to compete with the 1080/Ti.

I just hoped Intel knew what they were doing, but this was pre-Pat takeover 😕
 
JohnnyFootball well, I did remember why I was enthusiastic about Raja.

Vega was a compute beast, and I thought if that didn't translate well into video cards, it could still be a great result if focused on the server side of things. I was thinking they hired him for server processor design.

But Sapphire Rapids is being delayed again and again, so...
 
I gave Raja the benefit of the doubt before, now I am just pissed off at his incompetence. If he doesn't turn things around for the next launch he has got to go.

I would say get rid of him right now if they had a better replacement lined up.
Why? He very clearly left AMD on bad terms and there are reports he's really a bit of an incompetent cunt*.

Though I am quite glad Intel hired him and he did his thing. They deserve it. And it's no surprise they were willing to hire him; I bet he fit right into their management.

*not from a technical standpoint, but from a managerial one.
 
He's not a terrible engineer. But he got a bad reputation from an event a few years ago.
Basically, he led the development of the original Vega, and before any card was out, he was saying things like "Poor Volta".
Then the cards launched and Vega was a bit of a disappointment.
Volta was a big success, especially in the AI field, as it was the first card from Nvidia with Tensor Cores.

I don't get too involved with PC GPUs since I'm a console gamer, and the guy might have said some stupid things, but still: the engineers who make it to the top are usually incredibly smart people. Who knows what constraints, or decisions that didn't seem wrong at the time, made the designs not work out.
 
I gave Raja the benefit of the doubt before, now I am just pissed off at his incompetence. If he doesn't turn things around for the next launch he has got to go.
I'd say it's hard to blame all this on the guy.

Judging from their newer integrated graphics chips (Intel Xe), they don't seem to have increased the floating point performance of their processors considerably since Broadwell in 2014 (comparing equal numbers of shading units and execution units). It's crazy.

I don't know if Arc has the same efficiency, but Intel Xe was supposed to be a step towards "it". It's been quite evident that the improvement in performance per clock has been quite shit, and they are building these GPUs like they built Larrabee all those years ago: basically making huge clusters of something they already have and hoping for the best.

This "Pentium D x100" approach seems ingrained at Intel at this point. Perhaps they wanted to ship as fast as possible and decided on this, then it backfired for quite a few years in a row. I also doubt GPU focus was ever that high at Intel, seeing they didn't even want to allocate proper production from their foundries to them. Thus the GPU division was making omelettes without eggs for the longest time, and perhaps still is.

Intel is obviously able to develop a good GPU if they want to and invest enough; if they fail, I doubt they really tried. Apple started out by stealing PowerVR tech and employees after buying GPUs from them for years (then had to settle and license stuff from them), and look at them now.
Vega was a compute beast, and I thought if that didn't translate well into video cards, it could still be a great result if focused on the server side of things. I was thinking they hired him for server processor design.
Yes, but he clearly didn't take that tech/architecture with him.

Working at Intel, his team is basing their design on the existing Intel graphics processing pipeline, which was never "good". Intel GPUs in fact remind me a lot of the Raspberry Pi's VideoCore GPUs.

Floating point is a plus when their focus, generation on generation, is on decoding and encoding video "better" (and by better, I mean supporting more formats through fixed function, translating into less workload for the CPU). The video block on Intel integrated chips has improved steadily gen-on-gen, unlike the graphics core's performance per clock.
But sapphire rapids is being delayed again and again, so..
I doubt it's all on him.

At this point he's like a guy working at Burger King who's in charge of making a vegan burrito.

Burger King sells zero burritos; they're just musing about whether they could sell them. But he has to use Whopper patties and also burger bits for it. And fry them elsewhere.

He might even deserve it, but he's been given a tall order.
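For what it's worth, the per-clock comparison above can be sanity-checked with the usual back-of-the-envelope formula: peak FP32 throughput ≈ shading units × 2 FLOPs (one FMA) × clock. The unit counts and clocks below are rough illustrative figures I'm assuming (Broadwell-era Iris Pro vs Tiger Lake Xe-LP), not official specs:

```python
# Rough peak-FP32 estimate: each shader retires one FMA (2 FLOPs) per clock.
# Unit counts and clocks are approximate, for illustration only.
def peak_gflops(shading_units: int, clock_ghz: float) -> float:
    return shading_units * 2 * clock_ghz

# Broadwell-era Iris Pro: ~48 EUs x 8 ALUs = 384 shaders at ~1.1 GHz
iris_pro = peak_gflops(384, 1.1)    # ~845 GFLOPS
# Tiger Lake Xe-LP: ~96 EUs x 8 ALUs = 768 shaders at ~1.35 GHz
xe_lp = peak_gflops(768, 1.35)      # ~2074 GFLOPS

print(f"Iris Pro ~{iris_pro:.0f} GFLOPS, Xe-LP ~{xe_lp:.0f} GFLOPS")
```

Per shader per clock, both come out to the same 2 FLOPs; the absolute gains come from piling on more units and raising clocks, which is exactly the "huge clusters of something they already have" pattern described above.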
 
This is what AMD's VP said recently about Raja.

VentureBeat: I remember when Raja Koduri shifted over to Intel in 2017. I know that one person can't make that huge a difference, but is there anything you would trace to pre-Raja and post-Raja in terms of how AMD looks at graphics? Is there anything you gravitated more or less toward?

Naffziger:
Raja is a visionary. He paints a great and compelling picture of the gaming future and features that are required to drive the gaming experience to the next level. He's great at that. As far as hands-on silicon execution, his background is in software. He definitely helped AMD to improve our software game and feature sets. I worked closely with Raja, but I didn't join the graphics group until after he had left. He had a sabbatical there and went to Intel. So as far as the performance-per-watt, that was not really Raja's footprint. But some of the software dimensions and such.
 
The launch is not going to happen, IMO. These puppies are delayed indefinitely...or the launch is scaled down tremendously. To the point of...why bother?
 
Why? He very clearly left AMD on bad terms and there are reports he's really a bit of an incompetent cunt*.

Though I am quite glad Intel hired him and he did his thing. They deserve it. And it's no surprise they were willing to hire him; I bet he fit right into their management.

*not from a technical standpoint, but from a managerial one.
Why am I pissed off, or why do I want him replaced if he can't deliver soon?

I'm pissed off because, #1, I'm a shareholder, and #2, because it's very important to get the best engineers to help with server design.

In terms of me wanting him gone if this is the best he's got: it should be obvious, but even if I didn't have money in this, I'd want all 3 of the big companies to be at their best for competition's sake, as an enthusiast.

Why on earth would you want Intel to fail now? It's like wanting AMD to fail when Intel was dominant, but worse, because Intel is important for Western chip production as well.
 
I'd say it's hard to blame all this on the guy.

Judging from their newer integrated graphics chips (Intel Xe), they don't seem to have increased the floating point performance of their processors considerably since Broadwell in 2014 (comparing equal numbers of shading units and execution units). It's crazy.

I don't know if Arc has the same efficiency, but Intel Xe was supposed to be a step towards "it". It's been quite evident that the improvement in performance per clock has been quite shit, and they are building these GPUs like they built Larrabee all those years ago: basically making huge clusters of something they already have and hoping for the best.

This "Pentium D x100" approach seems ingrained at Intel at this point. Perhaps they wanted to ship as fast as possible and decided on this, then it backfired for quite a few years in a row. I also doubt GPU focus was ever that high at Intel, seeing they didn't even want to allocate proper production from their foundries to them. Thus the GPU division was making omelettes without eggs for the longest time, and perhaps still is.

Intel is obviously able to develop a good GPU if they want to and invest enough; if they fail, I doubt they really tried. Apple started out by stealing PowerVR tech and employees after buying GPUs from them for years (then had to settle and license stuff from them), and look at them now.

Yes, but he clearly didn't take that tech/architecture with him.

Working at Intel, his team is basing their design on the existing Intel graphics processing pipeline, which was never "good". Intel GPUs in fact remind me a lot of the Raspberry Pi's VideoCore GPUs.

Floating point is a plus when their focus, generation on generation, is on decoding and encoding video "better" (and by better, I mean supporting more formats through fixed function, translating into less workload for the CPU). The video block on Intel integrated chips has improved steadily gen-on-gen, unlike the graphics core's performance per clock.

I doubt it's all on him.

At this point he's like a guy working at Burger King who's in charge of making a vegan burrito.

Burger King sells zero burritos; they're just musing about whether they could sell them. But he has to use Whopper patties and also burger bits for it. And fry them elsewhere.

He might even deserve it, but he's been given a tall order.
I mean sure he can't take Vega design with him but surely he can take that design philosophy and improve upon it?!
 
I'd say it's hard to blame all this on the guy.

Judging from their newer integrated graphics chips (Intel Xe), they don't seem to have increased the floating point performance of their processors considerably since Broadwell in 2014 (comparing equal numbers of shading units and execution units). It's crazy.

I don't know if Arc has the same efficiency, but Intel Xe was supposed to be a step towards "it". It's been quite evident that the improvement in performance per clock has been quite shit, and they are building these GPUs like they built Larrabee all those years ago: basically making huge clusters of something they already have and hoping for the best.

This "Pentium D x100" approach seems ingrained at Intel at this point. Perhaps they wanted to ship as fast as possible and decided on this, then it backfired for quite a few years in a row. I also doubt GPU focus was ever that high at Intel, seeing they didn't even want to allocate proper production from their foundries to them. Thus the GPU division was making omelettes without eggs for the longest time, and perhaps still is.

Intel is obviously able to develop a good GPU if they want to and invest enough; if they fail, I doubt they really tried. Apple started out by stealing PowerVR tech and employees after buying GPUs from them for years (then had to settle and license stuff from them), and look at them now.

Yes, but he clearly didn't take that tech/architecture with him.

Working at Intel, his team is basing their design on the existing Intel graphics processing pipeline, which was never "good". Intel GPUs in fact remind me a lot of the Raspberry Pi's VideoCore GPUs.

Floating point is a plus when their focus, generation on generation, is on decoding and encoding video "better" (and by better, I mean supporting more formats through fixed function, translating into less workload for the CPU). The video block on Intel integrated chips has improved steadily gen-on-gen, unlike the graphics core's performance per clock.

I doubt it's all on him.

At this point he's like a guy working at Burger King who's in charge of making a vegan burrito.

Burger King sells zero burritos; they're just musing about whether they could sell them. But he has to use Whopper patties and also burger bits for it. And fry them elsewhere.

He might even deserve it, but he's been given a tall order.
It's not all on him. It's just mostly on him and Intel for not doing their homework and looking at his actual track record.
 
Why am I pissed off, or why do I want him replaced if he can't deliver soon?

I'm pissed off because, #1, I'm a shareholder, and #2, because it's very important to get the best engineers to help with server design.

In terms of me wanting him gone if this is the best he's got: it should be obvious, but even if I didn't have money in this, I'd want all 3 of the big companies to be at their best for competition's sake, as an enthusiast.

Why on earth would you want Intel to fail now? It's like wanting AMD to fail when Intel was dominant, but worse, because Intel is important for Western chip production as well.
Because Intel are going to be fine and they still haven't been adequately punished for what they did to AMD.

There's also Nvidia in the GPU market.
 
Because Intel are going to be fine and they still haven't been adequately punished for what they did to AMD.

There's also Nvidia in the GPU market.
Punished? That's a really weird thing to say, and it's against your best interest, unless you own AMD/Nvidia shares.

This is just business.
 
I mean sure he can't take Vega design with him but surely he can take that design philosophy and improve upon it?!
But, as winjer pointed out, he's a software guy. Meaning, as much as he understands what he wants, he's not actually designing the chip and making the decisions to make it possible. For that, Intel probably thought they had what they needed. Judging by how the design looks to me, I wouldn't be surprised if they strapped the Larrabee/Knights Ferry/Xeon Phi guys into this trainwreck.

He might even have made diagrams, sorted out how he wanted the chip architecture to be (big and wide!) and posed with it... but that doesn't make it performant enough, as it can only perform as well as the sum of its parts, after all.

It seems like Intel only put up the money to poach such engineers from 2021 onwards:

Intel Poaches Top AMD GPU Architect to Lead Xe Development

October 12, 2021

Last month the company poached Vineet Goel, a graphics technology veteran from AMD.

At Intel, Vineet Goel will serve as Vice President and General Manager of Xe GPU Architecture and IP Engineering at Intel and will be responsible for development of the company's Xe architectures. An Intel spokesperson confirmed (...) that Goel will head a group of architects and engineers that will be 'architecting, designing and verifying Intel's Xe IP road map.' Since the first two or three Xe architecture generations have already been defined (or at least named) Goel will have an impact on Intel's Xe GPUs that are years down the road.

Goel started his career at Real3D in the late 1990s and continued it at ATI Technologies' Orlando division after Real3D was sold to Intel in 1999. He worked at ATI and then at AMD till 2011, when he joined his former colleague Eric Demers at Qualcomm and worked on ultra-low-power Adreno GPUs for Snapdragon system-on-chips. He spent about five years at Qualcomm and then re-joined AMD in mid-2016.

He served as Corporate Vice President of GPU Architecture at AMD and before that (from August 2016 to June 2019) was responsible for GPU architecture development at the company. Given the timeline, Vineet Goel played a key role in development of AMD's latest RDNA and RDNA 2 as well as CDNA and CDNA 2 architectures. Furthermore, he also contributed strongly to development of upcoming RDNA 3 and CDNA 3 architectures as well as their successors.
Source: https://www.tomshardware.com/news/intel-poaches-amd-veteran-vineet-goel

Intel poaches another lead GPU designer from AMD

February 20, 2022

Rohit Verma, an AMD veteran SoC architect, has jumped ship back to Intel. In his new role, he'll be doing the same thing he did at AMD -- leading the design process of discrete GPUs. Before his 8-year stint at AMD, Verma spent 15 years with Intel. His LinkedIn page says he was a lead SoC architect during that time.
Source: https://www.techspot.com/news/93476-intel-poaches-another-lead-gpu-designer-amd.html

Perhaps after they first taped out the current designs and saw they sucked/couldn't compete, they realized they couldn't just use whoever they had on payroll.
It's not all on him. It's just mostly on him and Intel for not doing their homework and looking at his actual track record.
He wasn't necessarily a bad hire considering without him it would be harder to poach some engineers.

It was a start like any other, ambitious on paper, but he's no miracle worker; one has to wonder why that was basically the only GPU-design veteran hire they made for a while.
 
Because Intel are going to be fine

Honestly, I'm not so sure about this. Short-term? Yes. Long-term? Jury is still out.

Intel has a people problem, and despite Pat's best efforts, it's still present. Yes, they have a government windfall coming their way, but much like GM, who were deemed "too big to fail" by Nancy Pelosi back in 2008 (and who are flirting with bankruptcy right now), there's a culture problem.

Culture problems take years--sometimes decades--to fix.

I am 100% rooting for Intel to succeed, even if they abandon chip design altogether and move to fabrication.
 
I am 100% rooting for Intel to succeed, even if they abandon chip design altogether and move to fabrication.
I root for competition, but can't really root for Intel, Nvidia and Qualcomm.

All of them are pricks, behave like pricks and should eat humble pie for 10 years whenever possible, as long as that doesn't prejudice the competitiveness of the market. The issue is that it would.

If that were assured, though, they could just die a horrible death as far as I'm concerned.


Intel's behaviour with limited CPU support on motherboards, artificial feature-set "tiers" per chipset, and barring features that the CPU has but they don't want their customers to access is enough for me to think that they're a horrible company with horrible anti-consumer behaviour that shouldn't be allowed in an era where we want to reduce e-waste. Their chips are good, but god, they're a shit company.

I could go on with shitty practices they have, but it's all consistent with what I already said, they just apply it to every part of their business model.
 
Why am I pissed off, or why do I want him replaced if he can't deliver soon?

I'm pissed off because I'm a shareholder, #1, and #2 I'm pissed off because it's very important to get the best engineer to help with server design.

In terms of me wanting him gone if this is the best he's got, it should be obvious but even if I didn't have money in this, I want all 3 of the big companies to be at their best for competition sake, as an enthusiast.

Why on earth would you want Intel to fail now? It's like wanting amd to fail when Intel was dominant, but worse because Intel is important for western chip production as well.
I'd think, as a shareholder, that the GPU side of the business is the least of your worries, given the current downturn reported by Intel due to the slowdown in the global economy.

We'll see how AMD fares tomorrow to gauge that.
 
I remember Raja hyping products a lot and underdelivering each time. AMD seems much better at their GPU game these days.

He was entertaining though I must say.
 
It's amazing how people think that CPU/GPU architecture is something that can pivot in a year depending on who comes and goes, when in reality a new architecture is like a five-year endeavour of design and integration with fabs. Raja Koduri was obviously involved in the building blocks of RDNA and beyond before he left AMD in late 2017, because those parts shipped in 2019. Likewise, he inherited much of the late-GCN-era stagnation when he got back to AMD in 2013.
 