
AMD is Becoming a Software Company. Here's the Plan

Bernoulli

M2 slut
AMD is making changes in a big way to how they are approaching technology, shifting their focus from hardware development to emphasizing software, APIs, and AI experiences. Software is no longer just a complement to hardware; it's the core of modern technological ecosystems, and AMD is finally aligning its strategy accordingly.

The major difference between AMD and NVIDIA is that AMD is a hardware company that makes software on the side to support its hardware; while NVIDIA is a software company that designs hardware on the side to accelerate its software. This is about to change, as AMD is making a pivot toward software. They believe that they now have the full stack of computing hardware—all the way from CPUs, to AI accelerators, to GPUs, to FPGAs, to data-processing and even server architecture. The only frontier left for AMD is software.

For example, Radeon GPUs had tessellation capabilities at least two generations ahead of NVIDIA, but developers only exploited them after Microsoft standardized tessellation in the DirectX 11 API; the same happened with Mantle and DirectX 12. In both cases, the X-factor NVIDIA enjoys is a software-first approach, the way it engages with developers, and, more importantly, its install base (over 75% of the discrete GPU market share). There have been several such examples of AMD silicon packing exotic accelerators across its hardware stack that haven't been properly exploited by the software community. The reason is usually the same: AMD has been a hardware-first company.

 

winjer

Gold Member
Curious how software companies are investing in making their own hardware, while at the same time companies that make hardware are investing in making their own software.
Trying to control everything, from top to bottom.
 
I expect AMD to fall sooner or later.

Odd to treat Nvidia as a software company. They have made decent software for their hardware, but they are hardly a software company. They have the biggest market share there and are thus capable of driving the hardware decisions and aligning the software with them, but they are not really a software company.

The biggest question right now is where Nvidia is going to invest their current riches.
 

West Texas CEO

GAF's Nicest Lunch Thief and Nosiest Dildo Archeologist
[GIF: Chair Fail]
 

willothedog

Member
I expect AMD to fall sooner or later.

Odd to treat Nvidia as a software company. They have made decent software for their hardware, but they are hardly a software company. They have the biggest market share there and are thus capable of driving the hardware decisions and aligning the software with them, but they are not really a software company.

The biggest question right now is where Nvidia is going to invest their current riches.

The investment seems to be going into yearly product releases of their hugely profitable enterprise products.

"the more you buy......"

[Image: NVIDIA chip rollout roadmap through 2027]
 

UltimaKilo

Gold Member
Curious how software companies are investing in making their own hardware, while at the same time companies that make hardware are investing in making their own software.
Trying to control everything, from top to bottom.

They learned after COVID, and from Apple, that it's better to control most of your own pipeline than to be dependent on others. It's very expensive, yet still worth it for most. Qualcomm, for instance, has been heavily investing in software over the last 3-4 years.
 

Buggy Loop

Member
They let, what, a 15-year head start for Nvidia's CUDA and other software suites, letting them seep into every university and professional business?

I also thought there were open solutions coming to completely bypass Nvidia? What's the point of competing not only against Nvidia, but against the open-source community too?

Anyway

[GIF: Morgan Freeman Good Luck]
 
This seems like a necessary choice for them. A lot of professional software struggles to run on AMD; a compatibility layer that lets AMD cards run apps built for CUDA would be a big step, as one example.
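For a sense of what such a layer works with, here's a minimal sketch of a kernel in HIP, AMD's CUDA-like API from the ROCm stack (assuming hipcc/ROCm is installed; the kernel and sizes are purely illustrative, not anything AMD ships). The point is that the code is nearly line-for-line CUDA, which is what makes automated porting plausible at all:

```cpp
// Minimal vector-add sketch in HIP, AMD's CUDA-like API (assumes ROCm/hipcc).
// Error checking omitted for brevity; everything here is illustrative.
#include <hip/hip_runtime.h>
#include <vector>
#include <cstdio>

__global__ void vector_add(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;   // same thread-indexing model as CUDA
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    std::vector<float> ha(n, 1.0f), hb(n, 2.0f), hc(n);
    float *da = nullptr, *db = nullptr, *dc = nullptr;

    hipMalloc((void**)&da, n * sizeof(float));       // mirrors cudaMalloc
    hipMalloc((void**)&db, n * sizeof(float));
    hipMalloc((void**)&dc, n * sizeof(float));
    hipMemcpy(da, ha.data(), n * sizeof(float), hipMemcpyHostToDevice);
    hipMemcpy(db, hb.data(), n * sizeof(float), hipMemcpyHostToDevice);

    // hipcc also accepts CUDA's <<<grid, block>>> launch syntax.
    vector_add<<<(n + 255) / 256, 256>>>(da, db, dc, n);

    hipMemcpy(hc.data(), dc, n * sizeof(float), hipMemcpyDeviceToHost);
    printf("c[0] = %f\n", hc[0]);                    // expect 3.0

    hipFree(da); hipFree(db); hipFree(dc);
    return 0;
}
```

Swap the hip* calls for their cuda* equivalents and essentially the same file builds as CUDA, which is roughly the idea behind AMD's HIPIFY tools that translate CUDA sources in the other direction.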
 

M1chl

Currently Gif and Meme Champion
They are like 12 years too late, and AMD is playing that "open source" card while their software output is subpar, even when it comes to BIOSes (how long did it take to fix the USB disconnects on Ryzen 3XXX, etc.?), and they generally require the open-source community to help them with their shit.
 

winjer

Gold Member
They learned after COVID, and from Apple, that it's better to control most of your own pipeline than to be dependent on others. It's very expensive, yet still worth it for most. Qualcomm, for instance, has been heavily investing in software over the last 3-4 years.

And on the other side, we have software companies like AWS and Google making their own hardware.
 
The investment seems to be going into yearly product releases of their hugely profitable enterprise products.

"the more you buy......"

[Image: NVIDIA chip rollout roadmap through 2027]
But that's the road to nowhere as other companies - who are their biggest customers - are trying to build their own silicon. I would expect Nvidia to spread their current cash into more businesses and stuff, to diversify more. They have a lot of cash now.
 

diffusionx

Gold Member
There's not much die shrinking left to do, and it's hard to get a chip made even when there is. At least on the graphics side, new GPUs will obviously still come out, but in terms of hardware it's a very stagnant market. The development and advancement are coming from software and algorithms.
 

Buggy Loop

Member
But that's the road to nowhere as other companies - who are their biggest customers - are trying to build their own silicon.

So why is AMD even bothering with software if customers are moving to internal solutions?

Oh, we're missing the scale of AI in this discussion I guess.

So far their in-house chips are used for secondary, lower-tier AI work, while they keep Nvidia's top end for top-of-the-line, non-public R&D. When they say they want to reduce their dependence on Nvidia, it means there aren't enough Nvidia chips to go around, not that they want to stop using Nvidia. Google will be offering Blackwell to clients even after their sixth-gen Trillium, for example. AWS is aiming to have the most powerful AI supercomputer, Project Ceiba, powered by Nvidia. Microsoft is buying everything available: their own chip solution, AMD, massive orders at Nvidia. Anything with AI slapped on it, Microsoft buys.

There's no "blow" to Nvidia like sensationalist headlines would have you believe; everyone is lining up to buy everything Nvidia puts out. The good news is that everything AMD can manufacture gets bought too.

Let's not even get into the whole rack and interconnect solutions Nvidia offers versus the competition. There are a lot more components involved than just the GPU, and this is where Nvidia kills the competition: even if the competition's GPU were free, the system would still be too expensive. Total cost of ownership is completely skewed towards Nvidia as of now.

There's such high demand for AI that both AMD and Nvidia will probably score their first >1 million GPU deals soon, for a 2028-30 release. The kind of projects that need their own nuclear power stations. Microsoft's Stargate comes to mind, but many others will race to compete. And that's just the publicly known customers; I can't imagine how much demand there is from military and intelligence. The first country to have sovereign AI is assured to dominate, and they won't care about ethics. If you thought the race for the first nuclear bomb was interesting, the thing with sovereign AI is that once you get it, your rate of advance becomes exponential; it's not something you can catch up to or reverse engineer like a bomb.
 
So why is AMD even bothering with software if customers are moving to internal solutions?
Well, they have to find vectors of growth. Nvidia has embedded itself quite heavily in a lot of workloads (mainly thanks to CUDA). AMD was left behind and CUDA has essentially become a standard. AMD has to offer some decent alternative to Nvidia's stack.

There's no "blow" to Nvidia like sensationalist headlines would have you believe; everyone is lining up to buy everything Nvidia puts out. The good news is that everything AMD can manufacture gets bought too.
Nobody is going to rely on Nvidia only. For now, Nvidia is the only option, but it won't be the case within the next 5-10 years. Various companies will find or create alternatives one way or another.
 

Dorfdad

Gold Member
So why is AMD even bothering with software if customers are moving to internal solutions?

Oh, we're missing the scale of AI in this discussion I guess.

So far their in-house chips are used for secondary, lower-tier AI work, while they keep Nvidia's top end for top-of-the-line, non-public R&D. When they say they want to reduce their dependence on Nvidia, it means there aren't enough Nvidia chips to go around, not that they want to stop using Nvidia. Google will be offering Blackwell to clients even after their sixth-gen Trillium, for example. AWS is aiming to have the most powerful AI supercomputer, Project Ceiba, powered by Nvidia. Microsoft is buying everything available: their own chip solution, AMD, massive orders at Nvidia. Anything with AI slapped on it, Microsoft buys.

There's no "blow" to Nvidia like sensationalist headlines would have you believe; everyone is lining up to buy everything Nvidia puts out. The good news is that everything AMD can manufacture gets bought too.

Let's not even get into the whole rack and interconnect solutions Nvidia offers versus the competition. There are a lot more components involved than just the GPU, and this is where Nvidia kills the competition: even if the competition's GPU were free, the system would still be too expensive. Total cost of ownership is completely skewed towards Nvidia as of now.

There's such high demand for AI that both AMD and Nvidia will probably score their first >1 million GPU deals soon, for a 2028-30 release. The kind of projects that need their own nuclear power stations. Microsoft's Stargate comes to mind, but many others will race to compete. And that's just the publicly known customers; I can't imagine how much demand there is from military and intelligence. The first country to have sovereign AI is assured to dominate, and they won't care about ethics. If you thought the race for the first nuclear bomb was interesting, the thing with sovereign AI is that once you get it, your rate of advance becomes exponential; it's not something you can catch up to or reverse engineer like a bomb.
This is why China wants to corner the semiconductor market and take Taiwan. If they are allowed to control 90% of the world's chip production, they can control the future.
 

Buggy Loop

Member
This is why China wants to corner the semiconductor market and take Taiwan. If they are allowed to control 90% of the world's chip production, they can control the future.

Absolutely

Control of chips is now a matter of national defense, the #1 asset even, since most of US GDP is now linked directly to chip production. China ain't happy to have been cornered on semiconductors.

It'll be the chip war, the fight for the world's most critical technology. It's inevitable. China won't allow itself to be excluded from the tech much longer.

It's also why the USA is investing so much in its own domestic foundries now. They won't allow China to take control of the world's chip production. All new fabs are now planned for North America and Europe.

The US foundry roadmap alone is total insanity.

[Image: US foundry roadmap]
 

Dorfdad

Gold Member
Absolutely

Control of chips is now a matter of national defense, the #1 asset even, since most of US GDP is now linked directly to chip production. China ain't happy to have been cornered on semiconductors.

It'll be the chip war, the fight for the world's most critical technology. It's inevitable. China won't allow itself to be excluded from the tech much longer.

It's also why the USA is investing so much in its own domestic foundries now. They won't allow China to take control of the world's chip production. All new fabs are now planned for North America and Europe.

The US foundry roadmap alone is total insanity.

[Image: US foundry roadmap]
Yeah, that looks like a mess waiting to happen. The bigger issue is components: sure, we can build the plants, but the cost to manufacture those chips at our current wages and taxes will kill any real momentum. That $300 MSRP will be $600-700.

Not to mention China holds like 60% or more of the natural resources needed to make chips.
 

winjer

Gold Member
Explain to me in terms of revenue how the FOOK Nvda classifies as a software company.

Because it's the software that makes the difference, especially CUDA. And then there is a ton of other software that Nvidia is making.
Watch their presentations focused on the enterprise market and you will see that the hardware is the foundation, but the software (CUDA) is what keeps most companies tied to Nvidia.
Like Jim Keller said:

[Image: Jim Keller quote]
 

Buggy Loop

Member
Because it's the software that makes the difference, especially CUDA. And then there is a ton of other software that Nvidia is making.
Watch their presentations focused on the enterprise market and you will see that the hardware is the foundation, but the software (CUDA) is what keeps most companies tied to Nvidia.
Like Jim Keller said:

[Image: Jim Keller quote]

The hell is Jim Keller on? Oh right, probably on the team that convinced AMD's upper management not to pursue a CUDA alternative all those years at AMD. How's that going?

I want to see the play where you tell customers who want massively parallel GPU compute to just own it, program their own language, and go back to using pixel and vertex shaders to emulate general-purpose compute on their own. Writing a generalized compiler that can cope with a complex memory model and highly parallel compute sounds easy! 🤡

CUDA is not fast? Then what is faster? Nvidia will generally put together a new package for each focus area, while end customers would spend years building a software stack in the hope of maybe catching up.
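To make "a new package for each focus area" concrete, here's a rough sketch of what leaning on one of those packages looks like (assuming the CUDA toolkit with cuBLAS is installed; the tiny matrices and names are just for illustration). One call into NVIDIA's tuned GEMM replaces a kernel a customer would otherwise spend months writing and optimizing:

```cpp
// Sketch: multiplying two matrices via cuBLAS instead of a hand-written kernel.
// Build with: nvcc sgemm_sketch.cu -lcublas   (error checking omitted for brevity)
#include <cublas_v2.h>
#include <cuda_runtime.h>
#include <vector>
#include <cstdio>

int main() {
    const int n = 4;                                  // 4x4 matrices, column-major
    std::vector<float> ha(n * n, 1.0f), hb(n * n, 2.0f), hc(n * n, 0.0f);
    float *da, *db, *dc;
    cudaMalloc((void**)&da, n * n * sizeof(float));
    cudaMalloc((void**)&db, n * n * sizeof(float));
    cudaMalloc((void**)&dc, n * n * sizeof(float));
    cudaMemcpy(da, ha.data(), n * n * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb.data(), n * n * sizeof(float), cudaMemcpyHostToDevice);

    cublasHandle_t handle;
    cublasCreate(&handle);
    const float alpha = 1.0f, beta = 0.0f;
    // C = alpha * A * B + beta * C, one call into NVIDIA's tuned library.
    cublasSgemm(handle, CUBLAS_OP_N, CUBLAS_OP_N, n, n, n,
                &alpha, da, n, db, n, &beta, dc, n);

    cudaMemcpy(hc.data(), dc, n * n * sizeof(float), cudaMemcpyDeviceToHost);
    printf("c[0] = %f\n", hc[0]);                     // 4 * (1 * 2) = 8.0 for these inputs
    cublasDestroy(handle);
    cudaFree(da); cudaFree(db); cudaFree(dc);
    return 0;
}
```

And that's just one library; a competing stack has to replace this layer by layer (cuDNN, TensorRT, NCCL, and so on), not just match the kernel language.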
 

kikkis

Member
Absolutely

Control of chips is now a matter of national defense, the #1 asset even, since most of US GDP is now linked directly to chip production. China ain't happy to have been cornered on semiconductors.

It'll be the chip war, the fight for the world's most critical technology. It's inevitable. China won't allow itself to be excluded from the tech much longer.

It's also why the USA is investing so much in its own domestic foundries now. They won't allow China to take control of the world's chip production. All new fabs are now planned for North America and Europe.

The US foundry roadmap alone is total insanity.

[Image: US foundry roadmap]
Taiwan has about 22% of global chip manufacturing, not 90%. And no way in hell is most of US GDP tied to it. And you can't just topple an advanced factory and press a button to start making chips, even if the factory weren't kill-switched or destroyed.
 

winjer

Gold Member
The hell is Jim Keller on? Oh right, probably on the team that convinced AMD's upper management not to pursue a CUDA alternative all those years at AMD. How's that going?

I want to see the play where you tell customers who want massively parallel GPU compute to just own it, program their own language, and go back to using pixel and vertex shaders to emulate general-purpose compute on their own. Writing a generalized compiler that can cope with a complex memory model and highly parallel compute sounds easy! 🤡

CUDA is not fast? Then what is faster? Nvidia will generally put together a new package for each focus area, while end customers would spend years building a software stack in the hope of maybe catching up.

Keller is probably the last person to be a fanboy of any company. Dude isn't even a fanboy of his own work.
And he knows more about tech than 99.999% of people on this planet.
Just because you don't like what he says doesn't mean he is wrong.
 

Buggy Loop

Member
Keller is probably the last person to be a fanboy of any company. Dude isn't even a fanboy of his own work.
And he knows more about tech than 99.999% of people on this planet.
Just because you don't like what he says doesn't mean he is wrong.

So what’s his solution?

Bitching about something is the easiest thing in the world.
 

Griffon

Member
I think it's a trap they're getting themselves into.

Ryzen and their SoCs are good specifically because they are hardware people first.

They should find a way to leverage their strengths rather than try to be a pale imitation of Nvidia, which is a very different company.
 

winjer

Gold Member
So what’s his solution?

Bitching about something is the easiest thing in the world.

He is just describing what CUDA has done to the market.
And like the other dev stated, a swamp is a great moat.
That is what CUDA has done for Nvidia. It's not just a tool used to develop a lot of modern software, especially on the enterprise side; it's also a gatekeeper that prevents the competition from entering.
And by competition, that's not just AMD. It's also Intel, AWS, Google, Microsoft, Apple, Tenstorrent, Qualcomm, etc.
This forum has too many people with a limited view of tech who somehow think this is just a feud between Nvidia and AMD, but the reality is much more complex.
The competition is between a few dozen very big players.
 

StereoVsn

Member
But that's the road to nowhere as other companies - who are their biggest customers - are trying to build their own silicon. I would expect Nvidia to spread their current cash into more businesses and stuff, to diversify more. They have a lot of cash now.
They are trying. They bought Mellanox, and they are investing in everything from automotive to endpoints to dev tools. It's just hard to fault them right now for taking the tremendous AI profit.

The profit will get cut as all the major hyperscalers, Intel, Qualcomm, a bunch of Chinese companies, and a metric ton of startups rush for more efficient and cheaper AI hardware. So Nvidia is taking its money now.
 

Bashtee

Member
They should find a way to leverage their strengths rather than try to be a pale imitation of Nvidia, which is a very different company.
But this is exactly where the leverage is: software. Setting up ML or an LLM with NVIDIA card support is orders of magnitude easier than on AMD with ROCm. Intel is trying to push software with oneAPI, too. Your hardware is worth nothing if nobody can utilize it. There are way too many barriers to using AMD cards for compute that simply don't exist with NVIDIA.
 

Buggy Loop

Member
He is just describing what CUDA has done to the market.
And like the other dev stated, a swamp is a great moat.
That is what CUDA has done for Nvidia. It's not just a tool used to develop a lot of modern software, especially on the enterprise side; it's also a gatekeeper that prevents the competition from entering.
And by competition, that's not just AMD. It's also Intel, AWS, Google, Microsoft, Apple, Tenstorrent, Qualcomm, etc.
This forum has too many people with a limited view of tech who somehow think this is just a feud between Nvidia and AMD, but the reality is much more complex.
The competition is between a few dozen very big players.

A gatekeeper that prevents the competition from entering?

15 years, FIFTEEN YEARS, they let CUDA be CUDA with almost no competition. They created their own gate.
 

winjer

Gold Member
A gatekeeper that prevents the competition from entering?

15 years, FIFTEEN YEARS, they let CUDA be CUDA with almost no competition. They created their own gate.

Exactly. Nvidia created their own gate.
They saw how the GPU was advancing towards greater programmability and took the initiative. Credit is due to Nvidia.
And now there are plenty of companies trying to find a way around it.

Like Keller said, it's similar to what Intel did with x86. They are both swamps.
 

Buggy Loop

Member
Exactly. Nvidia created their own gate.
They saw how the GPU was advancing towards greater programmability and took the initiative. Credit is due to Nvidia.
And now there are plenty of companies trying to find a way around it.

Like Keller said, it's similar to what Intel did with x86. They are both swamps.

No, the total lack of competition for all those years is what created the gate. The door was wide open for ATI/AMD to counter it. I guess everyone decided it was not worth the money, right? Now they all need to catch up.
 

Dorfdad

Gold Member
Add to that the fact that Apple is entering the AI world as well, so not only do they have to go up against Nvidia, Microsoft, and Apple, but now there are others piling on too…

AMD can absolutely make it a closer race, but that's going to cost them a ton of money and require buying technology to catch up quicker. I'd honestly put $50k in their stock for a few years and see how it goes.
 

winjer

Gold Member
No, the total lack of competition for all those years is what created the gate. The door was wide open for ATI/AMD to counter it. I guess everyone decided it was not worth the money, right? Now they all need to catch up.

ATI did try to counter it with OpenCL.
What you forget is that AMD went through a very rough time for close to a decade.
Not only did they pay way too much for ATI, but then they had the failures with the 32nm process node and Bulldozer.
So there was no money or resources to counter anything from Nvidia or Intel.
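For anyone who never touched it, part of why OpenCL didn't stick the way CUDA did is visible in the code itself. Here's a rough sketch of the host-side boilerplate a single vector-add kernel needs under OpenCL 1.x (assuming an OpenCL runtime and headers are installed; names are illustrative and error checking is omitted), versus the single-source style CUDA and HIP use:

```cpp
// Sketch of OpenCL 1.x host boilerplate for one vector-add kernel.
// The kernel is a string compiled at runtime, a step CUDA hides behind nvcc.
#define CL_TARGET_OPENCL_VERSION 120
#include <CL/cl.h>
#include <vector>
#include <cstdio>

static const char* kSource = R"(
__kernel void vector_add(__global const float* a,
                         __global const float* b,
                         __global float* c) {
    int i = get_global_id(0);
    c[i] = a[i] + b[i];
})";

int main() {
    const size_t n = 1024;
    std::vector<float> ha(n, 1.0f), hb(n, 2.0f), hc(n);

    cl_platform_id platform; cl_device_id device;
    clGetPlatformIDs(1, &platform, nullptr);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, nullptr);

    cl_context ctx = clCreateContext(nullptr, 1, &device, nullptr, nullptr, nullptr);
    cl_command_queue queue = clCreateCommandQueue(ctx, device, 0, nullptr);

    cl_program prog = clCreateProgramWithSource(ctx, 1, &kSource, nullptr, nullptr);
    clBuildProgram(prog, 1, &device, nullptr, nullptr, nullptr);
    cl_kernel kernel = clCreateKernel(prog, "vector_add", nullptr);

    cl_mem da = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                               n * sizeof(float), ha.data(), nullptr);
    cl_mem db = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                               n * sizeof(float), hb.data(), nullptr);
    cl_mem dc = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, n * sizeof(float), nullptr, nullptr);

    clSetKernelArg(kernel, 0, sizeof(cl_mem), &da);
    clSetKernelArg(kernel, 1, sizeof(cl_mem), &db);
    clSetKernelArg(kernel, 2, sizeof(cl_mem), &dc);

    clEnqueueNDRangeKernel(queue, kernel, 1, nullptr, &n, nullptr, 0, nullptr, nullptr);
    clEnqueueReadBuffer(queue, dc, CL_TRUE, 0, n * sizeof(float), hc.data(), 0, nullptr, nullptr);
    printf("c[0] = %f\n", hc[0]);                    // expect 3.0

    clReleaseMemObject(da); clReleaseMemObject(db); clReleaseMemObject(dc);
    clReleaseKernel(kernel); clReleaseProgram(prog);
    clReleaseCommandQueue(queue); clReleaseContext(ctx);
    return 0;
}
```

None of that is impossible to write, but combined with weaker tooling at the time, it's easy to see why so much of the ecosystem settled on CUDA instead.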
 

SolidQ

Gold Member
So there was no money or resources to counter anything from Nvidia or Intel
That's what a lot of people miss. It's only been about 7 years since they started making money again, and they also needed to spend on different divisions and on buying Xilinx.
Even for Intel, with money, it's not easy to jump to the GPU side.
As we know, they're preparing a massive lineup with RDNA5.
 
The major difference between AMD and NVIDIA is that AMD is a hardware company that makes software on the side to support its hardware; while NVIDIA is a software company that designs hardware on the side to accelerate its software.
The major difference is that Nvidia basically always has better hardware and software.
 

TheKratos

Member
Because it's the software that makes the difference, especially CUDA. And then there is a ton of other software that Nvidia is making.
Watch their presentations focused on the enterprise market and you will see that the hardware is the foundation, but the software (CUDA) is what keeps most companies tied to Nvidia.
Like Jim Keller said:

[Image: Jim Keller quote]
You did not explain it in terms of revenue. 90%+ of it comes from GPU sales. NVDA is a hardware company that happens to create software to make its hardware superior. CUDA makes their GPUs superior, but the product being sold is the GPU.
 
AMD is about 30 years behind Nvidia in figuring out that you need to be software-first to maximize the utility of your hardware.

Good luck, I guess.
 

winjer

Gold Member
You did not explain in terms of revenue. 90%+ comes from GPU sales. NVDA is a hardware company that happens to create software to make their hardware superior. CUDAS makes their GPU superior, but the product being sold is the GPU.

Nvidia doesn't sell the software as a separate part. It's a package that includes hardware and software.
And although the hardware is very good, it's the software that makes the difference.
 