
Intel B580 has serious issues with driver overhead

He clearly stated that 10th gen having HT was a new feature that distinguished it from previous gens, making it not a rebrand.
I don't know if English isn't your first language, but you are deeply misunderstanding what "once again" means.
If Apple presents a new iPhone and says "we once again support a headphone jack!", do you think that means this is the first ever iPhone to have one? Of course not.
He never said it was the first to have HT or that it was new tech, and you are wrong to accuse him of lying about it; if you still don't understand this specific point, then this has run its course and I won't reply to you again.
 

winjer

Gold Member
I don't know if English isn't your first language, but you are deeply misunderstanding what "once again" means.
If Apple presents a new iPhone and says "we once again support a headphone jack!", do you think that means this is the first ever iPhone to have one? Of course not.
He never said it was the first to have HT or that it was new tech, and you are wrong to accuse him of lying about it; if you still don't understand this specific point, then this has run its course and I won't reply to you again.

I'm starting to think you didn't read our conversation. But here is a recap.
I pointed out that there was no reason for support for Arc to start at 10th Gen, because this was a rebrand of previous generations of CPUs.
He countered, stating that there were differences in 10th Gen that made it not a rebrand. One of the points he cited was HT.
The point of contention is whether 10th gen is a rebrand or not. And for it not to be a rebrand, its architecture has to have new features.
But none of the features he presented are new to the Core line. HT was used on all Core generations.
ReBAR is not linked to any generation of CPUs; it's just a feature of the PCIe standard, and it was enabled on several Core generations with a BIOS update.
And PCIe Gen 4 is not a part of the CPU. It's a part of the Southbridge, or what Intel now calls the uncore, meaning it's not in the CPU Core. So it's also not a change to the CPU architecture.
The architecture that was really new was 11th gen. And after that, 12th. Then a minor change with 13th gen, a rebrand with 14th gen, and now a new architecture with the Core 200 series.
 

SScorpio

Member
I'm starting to think you didn't read our conversation. But here is a recap.
I pointed out that there was no reason for support for Arc to start at 10th Gen, because this was a rebrand of previous generations of CPUs.
He countered, stating that there were differences in 10th Gen that made it not a rebrand. One of the points he cited was HT.
The point of contention is whether 10th gen is a rebrand or not. And for it not to be a rebrand, its architecture has to have new features.
But none of the features he presented are new to the Core line. HT was used on all Core generations.
ReBAR is not linked to any generation of CPUs; it's just a feature of the PCIe standard, and it was enabled on several Core generations with a BIOS update.
And PCIe Gen 4 is not a part of the CPU. It's a part of the Southbridge, or what Intel now calls the uncore, meaning it's not in the CPU Core. So it's also not a change to the CPU architecture.
The architecture that was really new was 11th gen. And after that, 12th. Then a minor change with 13th gen, a rebrand with 14th gen, and now a new architecture with the Core 200 series.
You're right that the 9th and 10th gen Intel series are very close to one another. It's very odd that it says 10th gen is supported when Intel PCIe Gen 4 support first appeared on the 11th series.

But you are incorrect about PCIe only being part of the southbridge. Modern CPUs have dedicated PCIe lanes directly to the CPU, and then there is an interconnect from the southbridge to the CPU.

So both the CPU and the motherboard need to support whatever generation of PCIe you want to run. Some AM4 B450 motherboards met the spec for Gen 4, so installing a Zen 2 CPU allowed the GPU and the first NVMe slot to work at Gen 4 speeds, but the interconnect to the southbridge was still only Gen 3. And modern budget boards offer Gen 5 for the GPU with x16 lanes, and x4 lanes for the first NVMe slot, but Gen 4 interconnects for everything else.
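To put rough numbers on it, here's a quick back-of-the-envelope sketch. The per-lane rates are the commonly quoted approximations (after encoding overhead), and the little helper function is just for illustration: the link runs at the lowest generation and lane count that the CPU, the board slot, and the card all support.

```python
# Rough PCIe link math: the link trains to the lowest generation and lane count
# that the CPU, the board slot, and the card all support.
# Per-lane rates are approximate GB/s after encoding overhead.
PER_LANE_GBPS = {3: 0.985, 4: 1.969, 5: 3.938}

def effective_link(cpu_gen, slot_gen, card_gen, slot_lanes, card_lanes):
    gen = min(cpu_gen, slot_gen, card_gen)
    lanes = min(slot_lanes, card_lanes)
    return gen, lanes, round(PER_LANE_GBPS[gen] * lanes, 1)

# Zen 2 (Gen 4 capable) in a Gen 4-capable B450 board with a Gen 4 x16 card:
print(effective_link(4, 4, 4, 16, 16))   # (4, 16, 31.5) GB/s
# The same board's chipset uplink stays Gen 3 x4:
print(round(PER_LANE_GBPS[3] * 4, 1))    # ~3.9 GB/s
```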
 

winjer

Gold Member
You're right that the 9th and 10th gen Intel series are very close to one another. It's very odd that it says 10th gen is supported when Intel PCIe Gen 4 support first appeared on the 11th series.

But you are incorrect about PCIe only being part of the southbridge. Modern CPUs have dedicated PCIe lanes directly to the CPU, and then there is an interconnect from the southbridge to the CPU.

So both the CPU and the motherboard need to support whatever generation of PCIe you want to run. Some AM4 B450 motherboards met the spec for Gen 4, so installing a Zen 2 CPU allowed the GPU and the first NVMe slot to work at Gen 4 speeds, but the interconnect to the southbridge was still only Gen 3. And modern budget boards offer Gen 5 for the GPU with x16 lanes, and x4 lanes for the first NVMe slot, but Gen 4 interconnects for everything else.

But that is the thing: the parts that changed with 10th gen are in the uncore, the part that is not the CPU core.
 

Silver Wattle

Gold Member
Do we know if PCIe bandwidth plays a role on such old CPUs? The B580 is only x8 4.0, and a 9600K and its motherboard can only support 3.0, AFAIK.

The 6500 XT was x4 4.0, and when dropped to 3.0 its performance tanked, and that was a significantly slower card.

According to TechPowerUp, the B580 is 224% the performance of the 6500 XT but has only 2x the PCIe lanes.
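Rough back-of-the-envelope numbers for the cards in question, using the same approximate per-lane rates (illustration only, not measured figures):

```python
# Rough link bandwidth for the cards being compared, at Gen 4 vs. on a Gen 3
# board like the 9600K's (approximate per-lane GB/s after encoding overhead).
PER_LANE_GBPS = {3: 0.985, 4: 1.969}

cards = {"B580 (x8)": 8, "RTX 4060 (x8)": 8, "RX 6500 XT (x4)": 4}

for name, lanes in cards.items():
    print(f"{name}: {PER_LANE_GBPS[4] * lanes:.1f} GB/s at Gen 4 "
          f"-> {PER_LANE_GBPS[3] * lanes:.1f} GB/s at Gen 3")
# B580 / RTX 4060: ~15.8 -> ~7.9 GB/s
# RX 6500 XT:       ~7.9 -> ~3.9 GB/s
```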
 

winjer

Gold Member
Do we know if PCIe bandwidth plays a role on such old CPUs? The B580 is only x8 4.0, and a 9600K and its motherboard can only support 3.0, AFAIK.

The 6500 XT was x4 4.0, and when dropped to 3.0 its performance tanked, and that was a significantly slower card.

The 4060 they use in these tests also has 8 lanes, but performance doesn't drop anywhere near as badly as with the B580.

Hardware Canucks did check CPU utilization and it looked off with the B580.
 
Isn't this why the first-gen Arc Alchemist required you to have support for Resizable BAR or something?

Anyways, Intel has come a long way from their Extreme Graphics days, but they still have a long way to go before they are in the same ballpark as AMD, much less the same galaxy as Nvidia.
 

Three

Member
The 4060 they use in these tests also has 8 lanes, but performance doesn't drop anywhere near as badly as with the B580.
I suspect it's because the B580 has more VRAM and better memory bandwidth, and the game is using it through ReBAR in some inefficient way that favours memory performance when, on a low-end CPU, it shouldn't.
Hardware Canucks did check CPU utilization and it looked off with the B580.
That is normal though, isn't it? The 4060 is a lower-performance card, so as you approach the CPU-bound/IO-bound limit its GPU utilisation stays higher than on higher-end cards.
 

winjer

Gold Member
I suspect it's because the B580 has more VRAM and better memory bandwidth, and the game is using it through ReBAR in some inefficient way that favours memory performance when, on a low-end CPU, it shouldn't.

That is normal though, isn't it? The 4060 is a lower-performance card, so as you approach the CPU-bound/IO-bound limit its GPU utilisation stays higher than on higher-end cards.

You should watch the Canucks video.
 

winjer

Gold Member
I watched it. What did I miss in it?

The 4060 is in the ballpark of the B580.
But it's not the only card that retains performance better than the B580.
But the thing that indicates it's probably a driver issue is the CPU utilization on the B580 in some games.
 

Three

Member
The 4060 is in the ballpark of the B580.
The B580 normally performs a bit better than a 4060.
But it's not the only card that retains performance better than the B580.
Of course, but my point is, and I could be completely wrong here, that the higher memory bandwidth of the B580 with ReBAR is saturating the 8-lane PCIe 3.0 port with the 9600K. The 7600 and 4060 have about half the memory bandwidth of the B580, so CPU access to that memory likely doesn't pose as much of a problem for the port.
The 6700XT is 16 lanes.

But the thing that indicates it's probably a driver issue is the CPU utilization on the B580 in some games.
I didn’t see CPU utilisation. I saw a GPU utilisation graph. This is probably what I've missed. Can you timestamp it?
 

winjer

Gold Member
The B580 normally performs a bit better than a 4060.

Of course, but my point is, and I could be completely wrong here, that the higher memory bandwidth of the B580 with ReBAR is saturating the 8-lane PCIe 3.0 port with the 9600K. The 7600 and 4060 have about half the memory bandwidth of the B580, so CPU access to that memory likely doesn't pose as much of a problem for the port.
The 6700XT is 16 lanes.


I didn’t see CPU utilisation. I saw a GPU utilisation graph. This is probably what I've missed. Can you timestamp it?

The B580 should only lose performance by a similar percentage to the 4060, not drop to the performance level of a GTX 1060.

The memory bandwidth of the VRAM is the connection between the GPU die and the VRAM. The bandwidth that affects ReBAR is the PCIe lanes and generation. So I don't think that is the problem.
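For some perspective, here are rough numbers based on the published memory specs (192-bit, 19 Gbps GDDR6 on the B580; 128-bit on the 4060 and 7600), treated as approximations:

```python
# VRAM bandwidth (GPU die <-> GDDR6) versus the PCIe link the CPU uses to reach the card.
def vram_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps   # GB/s

print("B580:    ", vram_gbs(192, 19))   # ~456 GB/s
print("RTX 4060:", vram_gbs(128, 17))   # ~272 GB/s
print("RX 7600: ", vram_gbs(128, 18))   # ~288 GB/s

# The PCIe side, which is what ReBAR traffic actually crosses, is far smaller:
print("PCIe 4.0 x8:", round(1.969 * 8, 1))   # ~15.8 GB/s
print("PCIe 3.0 x8:", round(0.985 * 8, 1))   # ~7.9 GB/s
```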

I misspoke. I meant GPU utilization charts.
Usually, a GPU will be in the high 90s of percent utilization. When it drops too much, it can mean a CPU bottleneck, game engine problems, or driver problems. Since this only happens with the B580, it's probably drivers.
And if it's a driver issue, then Intel can probably fix it with an update.
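In other words, the rule of thumb looks something like this; the thresholds and sample numbers below are made up for illustration, not taken from the video:

```python
# Crude bottleneck heuristic: a GPU that is busy ~95%+ of the time is the limiting
# factor; if utilization sags well below that while the frame rate is also low,
# the GPU is waiting on something else (CPU, game engine, or driver overhead).
def classify(gpu_util_pct, fps, target_fps):
    if gpu_util_pct >= 95:
        return "GPU-bound"
    if gpu_util_pct < 80 and fps < target_fps:
        return "CPU/engine/driver-bound (GPU is starved for work)"
    return "inconclusive"

print(classify(gpu_util_pct=98, fps=140, target_fps=150))  # GPU-bound
print(classify(gpu_util_pct=60, fps=75,  target_fps=150))  # CPU/engine/driver-bound
```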
 

scydrex

Member
Told you so.

I said there is safety in numbers. Nvidia has numbers. Not Intel, not AMD. Why would anyone recommend a GPU from anyone but the company with the largest userbase? The software we play will work with Nvidia, and it's guaranteed to work well. Why roll the dice with an unknown?

Also, when everyone in gaming is telling you what to buy, why be contrary? Enjoy your mid-range-at-best technical issues, I guess.
So you don't want competition or other makers to rise? You prefer Nvidia to continue to control prices and hold a monopoly?
 

Three

Member
I misspoke. I meant GPU utilization charts.
Usually, a GPU will be in the high 90s of percent utilization. When it drops too much, it can mean a CPU bottleneck, game engine problems, or driver problems. Since this only happens with the B580, it's probably drivers.
And if it's a driver issue, then Intel can probably fix it with an update.
It's CPU/IO bottleneck behaviour with a particular game engine, that's what I'm suggesting. The B580 gets lower utilisation because it's being bottlenecked by something else, and you would expect lower GPU utilisation anyway, especially as it's a slightly better card that has high bandwidth for ReBAR gains. The games that seem affected could be those that benefit most from ReBAR, and having ReBAR limited by the port basically becoming x8 3.0 with the slow CPU/mobo means bigger hits to performance. Intel may be able to improve this in drivers, but calling the card "broken" is a bit of a stretch IMO.
 

winjer

Gold Member
It's CPU/IO bottleneck behaviour with a particular game engine, that's what I'm suggesting. The B580 gets lower utilisation because it's being bottlenecked by something else, and you would expect lower GPU utilisation anyway, especially as it's a slightly better card that has high bandwidth for ReBAR gains. The games that seem affected could be those that benefit most from ReBAR, and having ReBAR limited by the port basically becoming x8 3.0 with the slow CPU/mobo means bigger hits to performance. Intel may be able to improve this in drivers, but calling the card "broken" is a bit of a stretch IMO.

I don't think the cards are broken, I think it's just an issue with the current drivers.
The 4060 has the same 8 PCIe lane setup and doesn't lose performance.
I would expect the B580 to have fewer issues with the PCIe bus, because it has more VRAM, so it can cache more data if the game needs it.
And that means going fewer times to main memory to fetch data.
 

Three

Member
I don't think the cards are broken, I think it's just an issue with the current drivers.
The 4060 has the same 8 PCIe lane setup and doesn't lose performance.
The 4060 relied less on ReBAR for its performance in those specific games, I believe.
So killing the B580's advantage with the slower PCIe setup drops fps more, because those games see the biggest gains with ReBAR. There are games that don't benefit from ReBAR, like the listed Doom Eternal; I believe that one sees a decrease in performance with ReBAR on an Nvidia card.
I would expect the B580 to have fewer issues with the PCIe bus, because it has more VRAM, so it can cache more data if the game needs it.
And that means going fewer times to main memory to fetch data.
ReBAR is CPU access to GPU memory, isn't it? So depending on the game engine, wouldn't that increase bandwidth requirements on the PCIe bus? I was under the impression that increased CPU access meant a higher bus requirement. I'm not sure how more VRAM would help there, but maybe I'm not understanding it all that well.
 

winjer

Gold Member
The 4060 relied less on ReBAR for its performance in those specific games, I believe.
So killing the B580's advantage with the slower PCIe setup drops fps more, because those games see the biggest gains with ReBAR. There are games that don't benefit from ReBAR, like the listed Doom Eternal; I believe that one sees a decrease in performance with ReBAR on an Nvidia card.

ReBAR is CPU access to GPU memory, isn't it? So depending on the game engine, wouldn't that increase bandwidth requirements on the PCIe bus? I was under the impression that increased CPU access meant a higher bus requirement. I'm not sure how more VRAM would help there, but maybe I'm not understanding it all that well.

Yes, overall, Nvidia cards are less reliant on ReBAR for performance. But even the AMD cards are not seeing the performance loss that the B580 is getting.

ReBAR allows the CPU to address all of the available VRAM. Without it, the CPU is limited to a 256MB window.
But if this is the problem with Battlemage, then it could mean the GPU or CPU is doing unnecessary accesses to VRAM, causing some overhead.
A good way to find out would be if some tech channel used Intel's Graphics Performance Analyzers.
That way we could see if there is something strange going on.
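If anyone wants to check whether ReBAR is actually active on their own system, one rough way on Linux is to look at the GPU's BAR sizes with lspci. This is just a sketch: the exact output format depends on the pciutils version, and you may need root to see the capability list.

```python
# Check the GPU's Resizable BAR state on Linux by parsing verbose lspci output.
import re
import subprocess

out = subprocess.run(["lspci", "-vvv"], capture_output=True, text=True).stdout

for block in out.split("\n\n"):                       # one block per PCI device
    if "VGA compatible controller" not in block and "Display controller" not in block:
        continue
    print(block.splitlines()[0])                      # the device's name line
    for line in block.splitlines():
        if "Resizable BAR" in line or re.search(r"BAR \d+: current size", line):
            print("   ", line.strip())
# With ReBAR active, the reported "current size" for the main BAR should be in the
# GB range rather than the legacy 256MB window.
```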

 

winjer

Gold Member
If it were actually that easy to compete with Nvidia, AMD would have done it by now

It's been over 30 years and AMD hasn't done it

What chance do you really think Intel has?

If the period is 30 years, then AMD has done it several times.
If the period is 10 years, then yes, AMD has been on the back foot.
 
If the period is 30 years, then AMD has done it several times.
If the period is 10 years, then yes, AMD has been on the back foot.
I can count on one hand the number of times AMD actually competed with Nvidia over 30 years, and each time it was only temporary before Nvidia recovered and reclaimed its lead.

AMD has never been able to sustain a competitive scenario with Nvidia; asking Intel to come out of nowhere and do it is pure fantasy.
 

winjer

Gold Member
I can count on one hand the number of times AMD actually competed with Nvidia over 30 years, and each time it was only temporary before Nvidia recovered and reclaimed its lead.

AMD has never been able to sustain a competitive scenario with Nvidia; asking Intel to come out of nowhere and do it is pure fantasy.

If you are talking about market share, yes, ATI only managed to surpass Nvidia once.
If we are talking about performance or power efficiency, then AMD has done it several times.
 

Bojji

Member
it needs a CPU like the 9800X3D to really perform better than the RTX 4060

Fuck me, 7600 is the minimum here for this GPU (or 5800X3D).

Something like this is not new; AMD had a massive overhead problem in many DX11 games (I personally tested this in FC4 using a 290 and a 970 with a 2500K). They fixed it to some extent.

Now in DX12 Nvidia has higher overhead, but compared to this Intel shit, it's super small.

This guy was on this ~1 year ago:

 

Three

Member
But if this is the problem with Battlemage, then it could mean the GPU or CPU is doing unnecessary accesses to VRAM, causing some overhead.
This is my theory. The game and system think they have 12GB of 192-bit, very high bandwidth memory. However, they don't take into account that with the poor mobo/CPU combo they're bottlenecked by the PCIe interface. The AMD cards tested in comparison have 16 lanes or slower memory, so the game/system doesn't tax the interface as much to begin with.
 

winjer

Gold Member
This is my theory. The game and system think they have 12GB of 192-bit, very high bandwidth memory. However, they don't take into account that with the poor mobo/CPU combo they're bottlenecked by the PCIe interface. The AMD cards tested in comparison have 16 lanes or slower memory, so the game/system doesn't tax the interface as much to begin with.

Check out the new video from HU.
From their testing, it really seems it's not about ReBAR.
 

hinch7

Member

That's some horrible driver overhead; even more recent CPUs like Zen 3 see a huge bottleneck in CPU-demanding games like Spider-Man Remastered.

People who are into competitive games are probably better off with an AMD or Nvidia GPU for now :/ Hopefully Intel can sort this out.
 

Bojji

Member
I wonder how modern budget CPUs hold up (12/13 series i5s and 5600x/7600x, etc.)?

Not great:

[benchmark charts]
 

Gaiff

SBI’s Resident Gaslighter
Oof, I know it's an exception, but this is the kind of thing that would make it better for reviewers to test budget GPUs with budget CPUs as well. I know they test with top-end CPUs to eliminate CPU bottlenecks, but even more critical than that is having a good idea of how a budget GPU fares with the CPU it's likely to be paired with.
 
I was already a little worried weeks before, when the testers said this card could end up using a lot more watts compared to rival cards.
 

Crayon

Member
Wow, this sucks. I use a Ryzen 5600 and the B580 is exactly the kind of thing I'd be shopping for. That would be a borderline shit combo that I had every reason to expect to be just fine.
 

Bojji

Member
Wow, this sucks. I use a Ryzen 5600 and the B580 is exactly the kind of thing I'd be shopping for. That would be a borderline shit combo that I had every reason to expect to be just fine.

Most games are not very CPU heavy; what they tested are outliers. BUT, as they said in the video, CPU demands will only go higher in the future.
 

Crayon

Member
Most games are not very CPU heavy; what they tested are outliers. BUT, as they said in the video, CPU demands will only go higher in the future.

That's what I meant by borderline. I could use it for quite a while until I picked the wrong game from my backlog and... oops, my computer sucks. Even if it's only a few games, that's a hard pill to swallow.
 
Yikes. Hopefully they can reduce the overhead with driver updates down the line. They did a lot of positive work post-release on the last series.
 

samuelgregory

Neo Member




TLDR: with low-end or older CPUs, Battlemage GPUs have huge performance drops in CPU-heavy games, much greater than GPUs from AMD or Nvidia.
This drop is so big that it puts the B580 on par with a GTX 1060.
This means that buying a B580 as a quick and cheap upgrade for an older or low-end system will run into performance issues in several games.
Also, ReBAR continues to be very important for Intel GPUs to perform well, so make sure your system has this feature enabled.
Intel has been notified and is working to find out what the problem is with their drivers.

[benchmark charts]

Thinking of getting the Intel B580 but not sporting an eight-core Ryzen 7000/9000 processor or Intel equivalent? You might want to think twice. Hardware Unboxed has tested the best Battlemage GPU with varying tiers of CPUs and found that in most cases it already starts losing performance with the R5 7600 compared to the 9800X3D that everyone and their dog used for the initial review of the B580. Performance continues to fall off rapidly as the CPU gets weaker. They tested at 1080p native in four modern titles where the highest average framerates were all below 100fps, except in Spider-Man Remastered, which got 152fps with the 9800X3D and suffers a 25% drop in performance when using the R5 7600. They have deduced that the culprit is very high driver overhead. The competing graphics card used as a control is the RTX 4060.
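For reference, the quoted 25% drop works out roughly as follows (just the arithmetic from the figures above):

```python
# 25% drop from the 9800X3D result in Spider-Man Remastered
print(152 * (1 - 0.25))   # ~114 fps with the R5 7600
```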
 