
Hogwarts Legacy PC Performance Thread

This game sounds like a mess, judging by the posts in this thread suggesting a memory leak. Did Digital Foundry opt not to review this game to avoid backlash because of the controversy around it?

edit: okay, so there's a whole thread on why they aren't covering it that I hadn't seen yet, never mind.
edit 2: looks like they just tweeted they'll be covering this game and the review should be out in a couple of days.
 
Last edited:
No, I ignore things like that usually. I professionally calibrate my screen per input and treat the PC input the same. The biggest thing is to make sure you override Nvidia's default behavior of running RGB 8-bit dithered for HDR, and instead force it to use YCbCr and 10-bit color for true HDR. The only other tweak I do is have Windows run SDR content in its HDR container at 85% white brightness.
Should you use YCbCr 10-bit even with HDR off, instead of RGB?
 

JeloSWE

Member
Should you use YCbCr 10-bit even with HDR off, instead of RGB?
No, on PC always use RGB if possible; don't bother with YCbCr. 8-bit in SDR is usually enough in Windows, but it doesn't hurt anything to use 10-bit; e.g. in Win 11 there is a slight translucent blur to the window panels, and 10-bit produces smoother gradients without banding.
 

Patrick S.

Banned
This fixed the stuttering for me:

  1. Navigate to "AppData\Local\Hogwarts Legacy\Saved\Config\WindowsNoEditor" and back up "Engine.ini". Add the following to the bottom of the file and save it:

[SystemSettings]
r.bForceCPUAccessToGPUSkinVerts=True
r.GTSyncType=1
r.OneFrameThreadLag=1
r.FinishCurrentFrame=0
r.TextureStreaming=1
r.Streaming.PoolSize=3072
r.Streaming.LimitPoolSizeToVRAM=1

[ConsoleVariables]
AllowAsyncRenderThreadUpdates=1
AllowAsyncRenderThreadUpdatesDuringGamethreadUpdates=1
AllowAsyncRenderThreadUpdatesEditor=1

Source
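If you'd rather not hand-edit the file, here's a quick Python sketch that does the backup and the append in one go. This is my own snippet, not from the source post; it assumes the default config path from step 1, so adjust it if your install keeps the file elsewhere:

import os, shutil

# Default config location for the Windows release (per step 1 above)
cfg_dir = os.path.expandvars(r"%LOCALAPPDATA%\Hogwarts Legacy\Saved\Config\WindowsNoEditor")
engine_ini = os.path.join(cfg_dir, "Engine.ini")

# Back up the original before touching it
shutil.copy2(engine_ini, engine_ini + ".bak")

# Append the tweaks from the post verbatim
tweaks = """
[SystemSettings]
r.bForceCPUAccessToGPUSkinVerts=True
r.GTSyncType=1
r.OneFrameThreadLag=1
r.FinishCurrentFrame=0
r.TextureStreaming=1
r.Streaming.PoolSize=3072
r.Streaming.LimitPoolSizeToVRAM=1

[ConsoleVariables]
AllowAsyncRenderThreadUpdates=1
AllowAsyncRenderThreadUpdatesDuringGamethreadUpdates=1
AllowAsyncRenderThreadUpdatesEditor=1
"""
with open(engine_ini, "a", encoding="utf-8") as f:
    f.write(tweaks)

print("Done; original saved as Engine.ini.bak")

Run it once with the game closed; if anything misbehaves, delete Engine.ini and rename the .bak back.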
PC gaming, gotta love it.

I've been a personal computer gamer since the days of the Amiga, but this shit is reeeeeally getting ridiculous.

I bought the PS5 deluxe edition for my son and was planning to double dip on PC, but since the game's launch I've seen like dozens of 900-post reddit threads of people complaining about poor performance on really high end PCs, along with the typical "go here, do that, edit this, reboot in this and that mode, disable x, enable y" "fixes" being recommended here and there.

Meanwhile, my kid has been happily playing the perfectly fine PS5 version.

I have an i5 12600k, 16GB of super duper fancy gaming RAM, and an RTX 3080, but I have to worry about performance and sub-30 fps dips with RTX if I buy this for PC? Nah, I'm good man. Mission double dip aborted.
 
Last edited:
No, on PC always use RGB if possible; don't bother with YCbCr. 8-bit in SDR is usually enough in Windows, but it doesn't hurt anything to use 10-bit; e.g. in Win 11 there is a slight translucent blur to the window panels, and 10-bit produces smoother gradients without banding.
I'm confused. Aren't you switching to YCbCr on PC as well?

This is what the original poster said

No, I ignore things like that usually. I professionally calibrate my screen per input and treat the PC input the same. The biggest thing is to make sure you override Nvidia's default behavior of running RGB 8-bit dithered for HDR, and instead force it to use YCbCr and 10-bit color for true HDR. The only other tweak I do is have Windows run SDR content in its HDR container at 85% white brightness.
 
Last edited:

Catphish

Gold Member

I appreciate that. Thanks for posting it.

I knew something wasn't right with ray tracing. Even on low, it fucking crushes FPS. I have to have DLSS on Max Performance for it to be playable, but that setting looks like shit, so I turned RTX off altogether, as the video suggests.

I hope they address these issues in a future patch. The game is too good to be hobbled by bad performance.
 

Pagusas

Elden Member

Two issues with that video.

1. Ray tracing is actually pretty damn amazing; the reflections are 100% needed to make the Great Hall look amazing (it looks so flat without them). With the RT Engine.ini tweak, ray tracing looks substantially better.
2. Ultra view distance is 100% a needed setting; the difference between Ultra and High is huge in terms of image quality.
 
Last edited:

Comandr

Member
Haven't gotten into the thick of it yet, but at a glance this patch has done some real work under the hood. I typically use Hogsmeade as my test bed since it seems to be one of the more demanding areas of the game. Testing on Steam Deck right now.

Before patch:
All settings at low, FSR 2 Quality - 24-28 FPS

After patch:
All settings at low, FSR 2 Quality - 38-41 FPS
All settings at ultra, FSR 2 Quality - 21-22 FPS
All settings at ultra, native resolution - 19 FPS
 
I noticed no difference on my gaming laptop with a 3070. Still a stuttery mess in Hogsmeade with all settings on Ultra and no RT. I'll try on my gaming PC with a 3090 later.
 
When I turned Steam on, it didn't patch the game and I see no recent patch download. How do I tell if I am running the latest patch?
 
PC gaming, gotta love it.

I've been a personal computer gamer since the days of the Amiga, but this shit is reeeeeally getting ridiculous.

I bought the PS5 deluxe edition for my son and was planning to double dip on PC, but since the game's launch I've seen like dozens of 900-post reddit threads of people complaining about poor performance on really high end PCs, along with the typical "go here, do that, edit this, reboot in this and that mode, disable x, enable y" "fixes" being recommended here and there.

Meanwhile, my kid has been happily playing the perfectly fine PS5 version.

I have an i5 12600k, 16GB of super duper fancy gaming RAM, and an RTX 3080, but I have to worry about performance and sub-30 fps dips with RTX if I buy this for PC? Nah, I'm good man. Mission double dip aborted.
Can't disagree. Sometimes I think the same as you. Some games are better played on consoles.
But at least you can do something about it and try to fix it. And PC has other features, like mods.
 

JeloSWE

Member
I'm confused. Aren't you switching to YCbCr on PC as well?

This is what the original poster said
I never use anything other than RGB on PC, for both SDR and HDR. YCbCr is a video standard that separates the signal into luma (brightness) and chroma (color), which allows for easier compression of the chroma channels. This is what all video content (MKV/MP4/YouTube/Blu-ray/DVD) normally uses, and the color information is usually halved in width and sometimes in height as well. This can cause problems with very saturated thin lines and details, or with transitions between sharp but colorful areas, such as in cartoons or UI. Text is especially problematic in a Windows environment with ClearType enabled and will show color fringing. That said, the PS5 does indeed output in 4:2:2, which isn't optimal but is rarely noticeable in real life. So while you can use YCbCr without noticing any problems, RGB is preferred if available.
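If it helps to see the luma/chroma split concretely, here's a little Python sketch of the idea (BT.709 coefficients, full-range values for simplicity; my own illustration, not anything the game or Windows literally runs):

def rgb_to_ycbcr(r, g, b):
    # Normalized [0,1] RGB in; Y in [0,1], Cb/Cr in [-0.5, 0.5] out (BT.709)
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b  # luma: perceived brightness
    cb = (b - y) / 1.8556                     # blue-difference chroma
    cr = (r - y) / 1.5748                     # red-difference chroma
    return y, cb, cr

print(rgb_to_ycbcr(1.0, 0.0, 0.0))  # pure red: low luma, Cr pegged at 0.5

# Samples stored per 2x2 pixel block under each subsampling scheme:
# 4:4:4 (RGB, no subsampling): 4 Y + 4 Cb + 4 Cr = 12
# 4:2:2 (what the PS5 outputs): 4 Y + 2 Cb + 2 Cr = 8 (a third less data)
# 4:2:0 (most video content):   4 Y + 1 Cb + 1 Cr = 6 (half the data)

Since the eye is far more sensitive to luma than chroma, throwing away chroma samples is usually invisible, except on exactly the thin saturated edges and ClearType text mentioned above.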

 
Even when playing PC on a TV?
Yes. TVs support RGB just fine. The native color space for a PC and a TV is the same: RGB. There is no reason to convert RGB to YCbCr on the PC, send that to the TV, and then have the TV convert YCbCr back to RGB. Why the fuck would you introduce a color space conversion for no fucking reason?
 

yamaci17

Member
texture streaming pools are changed for texture quality

new high is the old low

old values (MB):
Low = 3000
Med = 3500
High = 4100
Ultra = 5000

new values (MB):
Low = 1200
Med = 1800
High = 3000
Ultra = 5000

this is just a move to satiate the inflated egos of 8-10 GB card owners. i've been going around forums suggesting 8-10 GB people use the low texture streaming setting, and it looked really good, but they wouldn't listen, simply because it was labeled "low".

now they can be happy with the "high" that was labeled "low" before. ignorance is bliss, they say. it is true.

this is pure psychology at this point. if you can accept a 3000 MB budget when it is labeled high as opposed to low, you're simply playing semantics. you just want it to be labeled high; you do not want the actual functionality of what "high" would mean.

people really do not understand that there's no one true way of calling something low or high. i will admit the old low did not deserve the name "low"; it deserved the name "standard", at worst. but now it is labeled high. can it be labeled high? maybe. a 3000 MB pool size manages to load all high quality textures around medium range.

but the in-between settings (3500 / 4100 MB) are now extinct. such a shame.
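if you really miss those in-between tiers, the same Engine.ini override people have been posting should still let you set a custom pool by hand, something like this (assuming the cvar behaves the same after the patch):

[SystemSettings]
r.Streaming.PoolSize=4100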
 
Last edited:

Sanctuary

Member
Nice. The patch took me from a stutterfest in many areas, but still playable, to a crashfest where I can't even get past the first cutscene of the main story quest I had saved at just prior to the patch. I never had a single crash in my previous six hours, and now it's literally every few minutes.

edit: OK, it seems like the previous Engine.ini alterations did something more than simply the tweaks added to it. Either that, or the default file is just trash now. Previously I backed up the original, used the modified one, and had no crashes. After the patch, I renamed the modified one so that the game could create a new one and use that. Letting the game use the Engine.ini it created keeps causing crashes, so after going back to the previously modified one... no crashes. At least not yet, anyway. Previously I couldn't even get past this cutscene after six tries. Replayed that cutscene three more times just to see if it was a fluke, and so far so good. The Engine.ini file the game creates is causing issues. For me.
 
Last edited:

yosean

Neo Member
The patch isn't perfect, but I can at least play the game, with the minimum it drops to being 28 fps in Hogsmeade. 60+ most other places.

1440p DLSS
Ultra settings
16 GB RAM, RTX 3070
 

MMaRsu

Member
The patch isn't perfect, but I can at least play the game, with the minimum it drops to being 28 fps in Hogsmeade. 60+ most other places.

1440p DLSS
Ultra settings
16 GB RAM, RTX 3070
Why use Ultra settings? Are you daft? You're on a 3070, bro, put that shit on High.

I'm on a 3070 too. I'd advise you to use the Windows tweaks Winjet posted itt. 28 in Hogsmeade is an error; it should not run like that.

And check that RT Shadows isn't on by accident.

I get around 60/70 fps in Hogsmeade.
3070 / 32 GB RAM / 5700X CPU
 
Last edited:

yosean

Neo Member
Why use Ultra settings? Are you daft? You're on a 3070, bro, put that shit on High.

I'm on a 3070 too. I'd advise you to use the Windows tweaks Winjet posted itt. 28 in Hogsmeade is an error; it should not run like that.

And check that RT Shadows isn't on by accident.

I get around 60/70 fps in Hogsmeade.
3070 / 32 GB RAM / 5700X CPU
Because I found that putting it on High, Medium or Ultra made a really insignificant change in performance, and Hogsmeade still dropped to around 30, so if it's always going to drop to 30 I might as well enjoy the graphics elsewhere.
 

MMaRsu

Member
Because I found that putting it on High, Medium or Ultra made a really insignificant change in performance, and Hogsmeade still dropped to around 30, so if it's always going to drop to 30 I might as well enjoy the graphics elsewhere.
It's not supposed to drop to 30; do the Windows tweaks posted earlier. I have a 3070 as well, and it went from a stuttering mess to a beautiful 60+.
 

b0uncyfr0

Member
It looks to me like Intel XeSS is performing better than FSR 2. I'm checking some specific scenes and XeSS is mostly utilising the GPU (around 90%).

FSR 2.0, however, drops down to sub-70%... very weird.
 

Nitty_Grimes

Made a crappy phPBB forum once ... once.
PC gaming, gotta love it.

I've been a personal computer gamer since the days of the Amiga, but this shit is reeeeeally getting ridiculous.

I bought the PS5 deluxe edition for my son and was planning to double dip on PC, but since the game's launch I've seen like dozens of 900-post reddit threads of people complaining about poor performance on really high end PCs, along with the typical "go here, do that, edit this, reboot in this and that mode, disable x, enable y" "fixes" being recommended here and there.

Meanwhile, my kid has been happily playing the perfectly fine PS5 version.

I have an i5 12600k, 16GB of super duper fancy gaming RAM, and an RTX 3080, but I have to worry about performance and sub-30 fps dips with RTX if I buy this for PC? Nah, I'm good man. Mission double dip aborted.
It’s like editing the s: startup-sequence on the Amiga 😂👍🏻
 

GymWolf

Gold Member
The game runs pretty much the same after the patch: 90-95% perfect, with sporadic stutter during combat or exploration.
 

yamaci17

Member
It looks to me like Intel XeSS is performing better than FSR 2. I'm checking some specific scenes and XeSS is mostly utilising the GPU (around 90%).

FSR 2.0, however, drops down to sub-70%... very weird.
XeSS is heavier on non-Intel hardware. Are you actually seeing better frames per second, or just higher GPU utilization?
 

b0uncyfr0

Member
XeSS is heavier on non-Intel hardware. Are you actually seeing better frames per second, or just higher GPU utilization?
Hmm, you could be right. I was comparing FSR Quality (960p base) to XeSS Ultra Quality (1108p base).

I'll try XeSS Quality, which is also 960p base.
 

Thebonehead

Gold Member
PC gaming, gotta love it.

I've been a personal computer gamer since the days of the Amiga, but this shit is reeeeeally getting ridiculous.
The Amiga was something special.

PC gaming has always involved config work, back from the DOS days when we had to muck around with MemMaker and config.sys.

We had to make sure we had enough TSRs shifted into upper memory, leaving enough conventional memory free to run some games!
 
texture streaming pools are changed for texture quality

new high is the old low

old values (MB):
Low = 3000
Med = 3500
High = 4100
Ultra = 5000

new values (MB):
Low = 1200
Med = 1800
High = 3000
Ultra = 5000

this is just a move to satiate the inflated egos of 8-10 GB card owners. i've been going around forums suggesting 8-10 GB people use the low texture streaming setting, and it looked really good, but they wouldn't listen, simply because it was labeled "low".

now they can be happy with the "high" that was labeled "low" before. ignorance is bliss, they say. it is true.

this is pure psychology at this point. if you can accept a 3000 MB budget when it is labeled high as opposed to low, you're simply playing semantics. you just want it to be labeled high; you do not want the actual functionality of what "high" would mean.

people really do not understand that there's no one true way of calling something low or high. i will admit the old low did not deserve the name "low"; it deserved the name "standard", at worst. but now it is labeled high. can it be labeled high? maybe. a 3000 MB pool size manages to load all high quality textures around medium range.

but the in-between settings (3500 / 4100 MB) are now extinct. such a shame.

I get what you are saying, but the problem is that the setting is called Texture *Quality*, not Texture Streaming Pool Size, so it is not surprising that people are opting for Ultra: that setting usually controls the resolution of the textures, so the lower you go, the less 4K-like and blurrier they become. I cannot remember the last time I played any game on less than the maximum texture setting, whether that is High, Ultra, Epic or whatever.

Maybe the developers should have labelled the settings to more appropriately reflect what they do, or have separate Texture Quality and Texture Streaming Pool Size options to make it clearer what each does, while having the game default to the one most suited to the amount of VRAM the graphics card has (but still allowing the user to change it if needed)?

Personally, I have been using the Engine.ini tweaks posted by various sites and this pretty much fixed the performance issues I was having on my 10 GB RTX 3080. I think the tweak sets the pool size to 3,072 MB, which keeps VRAM usage around 8 GB (at most 8.5 GB) and also allows me to use RT if I want, as the roughly 1 GB increase in VRAM still keeps it below the 10 GB limit of the card. However, RT is pretty half-baked in this game and has too big a performance hit, so I have just opted for the game's recommended 1440p Ultra settings with RT disabled. The game still has some microstuttering, but the major hitches/stutters are now completely gone and I am playing at a pretty much locked 90 fps, albeit with DLSS Auto (rendered from 1907x971 or something).

I have also played the PS5 version, mostly in the 40 fps Balanced mode with the frame cap set to on for a locked framerate, and it looks great and feels perfectly fine to play. However, that version has similar micro-stuttering issues to the PC version, so it looks like this is an engine issue that the developers still need to address.

PC specs, by the way: Intel i5-13600KF, 32 GB DDR4-3600, 10 GB RTX 3080, Windows 11 Pro 22H2.
 
Last edited:

GymWolf

Gold Member
Just to have a clear picture here: did the people who suffered stuttering on high-end PCs (so 4000/7000 series owners) have THIS type of stuttering? (just watch the first part of the video)

 

SlimySnake

Flashless at the Golden Globes
Just to have a clear picture here: did the people who suffered stuttering on high-end PCs (so 4000/7000 series owners) have THIS type of stuttering? (just watch the first part of the video)


Not as bad on my 3080, but I bet that's what would happen if you ran RT on a 2070 or a 2060.

My framerate would dip to the mid 30s and mid 20s as soon as I entered that courtyard. Ultra with no RT is mostly fine, with some rare stutters in Hogsmeade, but GPU usage is only 40-50% inside Hogwarts and even in the courtyard, as long as you don't enable RT.
 

GymWolf

Gold Member
Not as bad on my 3080, but I bet that's what would happen if you ran RT on a 2070 or a 2060.

My framerate would dip to the mid 30s and mid 20s as soon as I entered that courtyard. Ultra with no RT is mostly fine, with some rare stutters in Hogsmeade, but GPU usage is only 40-50% inside Hogwarts and even in the courtyard, as long as you don't enable RT.
I wasn't speaking to 3000-series peasants, but ok.

:lollipop_blowing_kiss:
 

DanEON

Member
Just to have a clear picture here: did the people who suffered stuttering on high-end PCs (so 4000/7000 series owners) have THIS type of stuttering? (just watch the first part of the video)


Yes, but that would happen only in Hogsmeade for me. The areas shown in the video were fine, just some stutters after cutscenes. Ultra + RT + DLSS + FG. With RT off it was fine, no big dips, even in Hogsmeade. (My PC: 4070 Ti, 32 GB DDR5, Ryzen 7600.)
 
Last edited:

GymWolf

Gold Member
You're not getting those big drops shown in the video?
Nope, the only stutters I have are like microseconds, a bit more severe with RTX on, but nowhere near that level. Still kinda annoying on a 2000-euro machine, but what can you do?

That shit is like bullet time from The Matrix...

But I have a 4080.
 
Last edited:

DanEON

Member
Nope, the only stutters I have are like microseconds, a bit more severe with RTX on, but nowhere near that level. Still kinda annoying on a 2000-euro machine, but what can you do?

That shit is like bullet time from The Matrix...

But I have a 4080.
You never had it, even on day one? Because I didn't test it after the patch. I got a refund on Steam and I am playing on PS5.
 

SatansReverence

Hipster Princess
Just to have a clear picture here: did the people who suffered stuttering on high-end PCs (so 4000/7000 series owners) have THIS type of stuttering? (just watch the first part of the video)


Never saw anything like that with a 7900 XTX and 12700K.

Looks like a running-out-of-memory type of situation.
 

JRW

Member
This game sounds like a mess, judging by the posts in this thread suggesting a memory leak. Did Digital Foundry opt not to review this game to avoid backlash because of the controversy around it?

edit: okay, so there's a whole thread on why they aren't covering it that I hadn't seen yet, never mind.
edit 2: looks like they just tweeted they'll be covering this game and the review should be out in a couple of days.

Did they back out? I don't see any Hogwarts tweets from DF.

EDIT: Oops, ok, found it on John's Twitter page: https://twitter.com/dark1x

I always look forward to their performance / image quality settings tweaks.
 
Last edited: