
AMD Announces Acquisition of Nod.ai, Plans to Rapidly Improve AI Resources

JohnnyFootball

GerAlt-Right. Ciriously.
I am surprised I didn't see a thread for this, as this could be a big deal: AMD acquiring an AI company.

Why is this a big deal? It's no secret that Nvidia is kicking AMD's ass in AI and integrating AI tech into their GPUs. FSR is mediocre at best, and AMD's frame generation is in its infancy. Right now DLSS, ray tracing, and now frame generation are indisputably better features. I'd even argue that XeSS is better than FSR, not to mention Intel GPUs take far less of a performance hit with ray tracing enabled. AMD is being attacked from all directions. They have closed the gap in rasterized performance, but need to step up their plans for features.

It obviously remains to be seen what AMD's true plans are, but the potential is there.

Share your thoughts.
 

T4keD0wN

Member
I can't believe my eyes; is AMD really planning to start taking the GPU sector seriously and maybe even start competing eventually?
 

SmokedMeat

Gamer™
AMD needs to do something, so I’m glad to see it.

At the same time, all of these upscalers don't mean much if developers continue dropping the ball on optimization.
 

SolidQ

Member
I can't believe my eyes; is AMD really planning to start taking the GPU sector seriously and maybe even start competing eventually?
They didn't have a lot of money after 2015 for heavy competition with Intel/NV; they need time for this.
 

Spyxos

Member
How long do you think it's going to take for this purchase to make itself visible? Are we talking about a year, or more than a few years?
 

mystech

Member
Smart move! They NEED to compete better with Nvidia, and while DLSS is much better than AMD's solution, this could help them catch up fast enough for the next generation.

As I said in another thread, I think the PS5 / SX will be the last consoles to rely on brute force power to produce good graphics at high resolution. The tech behind the Switch 2 will be the way the industry heads. GPUs will work smarter not harder. I think that’s how we get a true generation leap over what we have now. The PS6 will for sure have enough brute force power to run circles around the PS5 (and even the future pro version) but it can use that power in combination with A.I. (which will have advanced much further by then).

It’s like moving from a naturally aspirated V6 to a much more powerful V8 and then adding a turbocharger to that V8. PS4 to PS5 wasn’t a big leap but I think PS5 to PS6 will be a lot more noticeable.
 

winjer

Gold Member
Nod.ai is a high-performance machine learning systems company that provides key enabling open source technologies for future AI systems using advanced compiler based approaches. They created the SHARK Machine Learning Distribution that is built on LLVM, MLIR, OpenXLA's IREE, and Nod.ai's tuning. They leverage machine learning based methods (reinforcement learning in particular) for codegen and auto-scheduling of compute and communication over a heterogeneous set of resources (such as CPUs, GPUs, Accelerators, etc.).
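To make "auto-scheduling" a bit more concrete, here's a toy Python sketch (my own illustration, not Nod.ai's actual code): it just benchmarks a blocked matrix multiply with a few candidate tile sizes and keeps whichever one runs fastest on the machine at hand. Compilers like SHARK/IREE search vastly richer schedule spaces, often guided by reinforcement learning rather than brute-force measurement.

```python
# Toy auto-tuner: pick the fastest "schedule" (here, just a tile size)
# for a blocked matmul by measuring each candidate. Illustrative only;
# not how Nod.ai/SHARK/IREE actually implement it.
import time
import numpy as np

def blocked_matmul(a, b, tile):
    """Multiply a @ b one (tile x tile) block at a time."""
    n = a.shape[0]
    out = np.zeros((n, n), dtype=a.dtype)
    for i in range(0, n, tile):
        for j in range(0, n, tile):
            for k in range(0, n, tile):
                out[i:i+tile, j:j+tile] += a[i:i+tile, k:k+tile] @ b[k:k+tile, j:j+tile]
    return out

def autotune(n=512, candidate_tiles=(32, 64, 128, 256)):
    """Time every candidate tile size and return the fastest."""
    rng = np.random.default_rng(0)
    a, b = rng.random((n, n)), rng.random((n, n))
    best_tile, best_time = None, float("inf")
    for tile in candidate_tiles:
        start = time.perf_counter()
        blocked_matmul(a, b, tile)
        elapsed = time.perf_counter() - start
        if elapsed < best_time:
            best_tile, best_time = tile, elapsed
    return best_tile, best_time

if __name__ == "__main__":
    tile, secs = autotune()
    print(f"best tile size: {tile} ({secs:.3f}s)")
```

The point is just that the "best" schedule depends on the hardware it runs on, which is why this kind of tuning has to be automated across CPUs, GPUs, and accelerators.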

Nod.ai's mission is to make AI accessible to everyone by improving the performance and efficiency of machine learning training and inference. They believe that AI should be used for good, and they are committed to developing technologies that are sustainable and ethical.

Nod.ai was founded in 2021 by a team of experienced AI researchers and engineers. The company is headquartered in Santa Clara, California.

Here are some of the benefits of using Nod.ai:
  • Improved performance: Nod.ai's compiler-based approach to machine learning can significantly improve the performance of training and inference, especially on large and complex models.
  • Increased efficiency: Nod.ai's auto-scheduling capabilities can help to reduce the time and effort required to deploy machine learning models on a variety of hardware platforms.
  • Reduced costs: Nod.ai's technologies can help to reduce the costs of machine learning training and inference, especially for large-scale deployments.
  • Open source: Nod.ai's core technologies are open source, which makes them accessible to a wider community of users and contributors.

This is not to make games. It's to make compilers and tools for AI.
Of course it will benefit other areas of AI inside AMD. So hopefully, it will also trickle down to gaming GPUs.
 

killatopak

Member
Honestly, it's about time, though I can't blame them.

They probably put their resources into their CPU department, which is now much better than the competition.

I just don't know how it'll pan out since they're battling on two fronts.
 
I doubt this is for gaming.

Nvidia is making stupid amounts of money selling AI accelerators to large cloud services and farms; ChatGPT is powered entirely by Nvidia's AI chips. I think the H100 alone retails for around $10,000+.

Seems like AMD is now following suit as they usually do.
 

StereoVsn

Gold Member
This is a good move for AMD. However, the overall laser focus on AI from Nvidia and AMD takes away from gaming, of course.

As an example, Nvidia reportedly pulled a lot of devs from the gaming driver team to work on AI/enterprise hardware drivers. Hence the slowdown in releases.
 
Hopefully this works out for them. Their whole Zen CPU architecture has been a huge win. I've replaced every server and PC at work with AMD CPUs since they're really the best bang for the buck and have such low power consumption. I started four years ago and haven't had nearly as many issues as I had previously with Intel. I really do wish they would take the GPU side more seriously. Nvidia is charging whatever the hell they want for AI and ML cards. On the other hand, I think Nvidia is a way stronger company than the Intel of five years ago, so getting competitive with them is going to take some serious time and investment.
 

I think AMD has wanted to compete in GPUs since RDNA 1, but they've been struggling. I remember watching an interview with Intel engineers, and they were talking about how difficult it's been to develop GPUs; there's so much problem-solving, engineering, and resources required.

Nvidia has been at the game for so long (pun intended), almost 30 years I think, and they've had the fortune of strong leadership under Jensen, which has allowed them to stay ahead of everyone in the GPU space.

AMD got close with RDNA 2, but then fell behind again with RDNA 3, and now for RDNA 4 they're abandoning the high end because of the complex design a high-end chip would require, shifting those resources to RDNA 5's high end instead (just a rumor).
 
You mean this Nod?

[attached image]
 