
Hogwarts Legacy PC Performance Thread

GHG

Member
I have the same I think, but how could a 5700x bottleneck a 3070?

I don't like throwing this word around because people do it too often, but in this case it does indeed seem to be the case that the game is unoptimised and there's something in particular causing a CPU bottleneck.

The fact that it's still present to a certain extent even with RT turned off suggests there's something going on in the background which isn't allowing the GPU to take over the heavy load like it should. Some people are speculating it could be system RAM related, but it's clear that somewhere in that CPU-RAM-GPU round-trip, something is holding things up.

This is where you'd expect Alex from DF to do his job and get to the bottom of it, but alas...
 
Last edited:

nightmare-slain

Gold Member

TrebleShot

Member
Ehhhhh, performance on this game is a bit of a joke.
4090/5800X at 4K Ultra with all settings cranked, and it's horrible in the castle, bouncing anywhere from 70 down to 40 fps.

It needs DLSS3 with frame gen to get 120ish fps. Not good. Frame generation and upscaling shouldn't be the solution when playing on high-end hardware.
 

MMaRsu

Member
I don't like throwing this word around because people do it too often, but in this case it does indeed seem to be the case that the game is unoptimised and there's something in particular causing a CPU bottleneck.

The fact that it's still present to a certain extent even with RT turned off suggests there's something going on in the background which isn't allowing the GPU to take over the heavy load like it should. Some people are speculating it could be system RAM related, but it's clear that somewhere in that CPU-RAM-GPU round-trip, something is holding things up.

This is where you'd expect Alex from DF to do his job and get to the bottom of it, but alas...

Gotta be Denuvo trash fucking up performance. At times the fps is fine, then in the same scene it will randomly start stuttering. When I'm home I'll try to record it.

It's fucked :(
 
  • Like
Reactions: GHG

GHG

Member
So I tried the game for around 4 hours, 4080/32 GB DDR5 6000/13600K all stock, trying to achieve 4K60.

First I tried native 4K with DLAA (basically DLSS anti-aliasing without upscaling, technically the best AA method), everything ultra and only RT reflections on. It was perfect in the introduction but not so much inside the school, where it had some occasional stuttering or frame loss (I forgot to turn on the frame counter so I don't know which of the two it was).
I had frame generation greyed out, so I think it was disabled, not sure why.

Then I tried DLSS Quality with frame gen and it looks a bit worse, but inside the school it seems to reduce the stuttering to a minimum. In Hogsmeade I still get some hiccups, nothing too problematic but not absolutely perfect either. Glad to say that frame gen feels good; if there is any additional input lag, it was almost unnoticeable.

The only two bugs I had: the protagonist's voice, if you choose one of the two lower-pitched options, sounds like an echo or like it's coming from inside a tunnel during conversations. I fixed it by going back to the middle pitch.

And sometimes during cinematics there were rare "half second" accelerations, for lack of a better word, while using DLAA; they disappeared while using DLSS Quality + frame gen.

Is it normal that frame gen was greyed out when I wasn't using DLSS upscaling? It should work to get better framerates at native resolution too, not only when using DLSS upscaling...

(I was still using a form of DLSS for the AA, so it is not clear.)

Overall it runs like Dead Space remake: perfect 90% of the time with the occasional stutter here and there, with DS running a bit worse.

All of this was tested without the new drivers, so I still have an ace up my sleeve for tonight when I get home from work.

The graphics are as uneven as they come: some incredible vistas and some meh ones, some very nice models and some very poor ones, some very detailed textures and some PS3 textures. Overall very good looking, but nothing more than a mid/good PS4 game in 4K tbh. Also the occasional texture loading in a bit late here and there.

P.S. If RT is the problem then I'm just gonna disable it tonight.

Follow this guide for frame gen:



If the option is greyed out for you then make sure you have done step 1:

First and foremost you need HW GPU scheduling ON in Windows,
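If you want to sanity-check that toggle without clicking through the Settings app, here's a minimal Python sketch. It assumes the usual registry location for hardware-accelerated GPU scheduling and that a HwSchMode value of 2 means "on" (my assumption, verify against your own system), and remember Windows needs a reboot after flipping it:

import winreg

# Assumed location/value name for the "Hardware-accelerated GPU scheduling" toggle.
KEY_PATH = r"SYSTEM\CurrentControlSet\Control\GraphicsDrivers"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
    try:
        value, _ = winreg.QueryValueEx(key, "HwSchMode")  # 2 = on, 1 = off (assumed)
    except FileNotFoundError:
        value = None  # value missing usually means the toggle has never been changed

if value == 2:
    print("HW GPU scheduling is ON - frame generation should no longer be greyed out.")
else:
    print(f"HW GPU scheduling looks OFF (HwSchMode={value}); enable it in Windows graphics settings and reboot.")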
 

nightmare-slain

Gold Member
Ehhhhh, performance on this game is a bit of a joke.
4090/5800X at 4K Ultra with all settings cranked, and it's horrible in the castle, bouncing anywhere from 70 down to 40 fps.

It needs DLSS3 with frame gen to get 120ish fps. Not good. Frame generation and upscaling shouldn't be the solution when playing on high-end hardware.
Depends how you look at it. The 4000 cards are a huge upgrade even if you leave out DLSS3. Even before I turned on DLSS3 in any game I was impressed by how powerful the card is; DLSS3 takes it to another level and unlocks more performance. Some people seem to have this idea that using DLSS/frame generation is cheating and shouldn't be used. It's not... it's a feature and it's there to be used! I'm definitely turning it on, and if it lets me play a game at 120-140 fps then fucking brilliant.

We might be at the point where we shouldn't really expect significant raw performance gains from new hardware. We're not going to be able to keep shrinking dies and squeezing out more and more performance. The 4000 cards are on a 5nm process and the 5000 cards will likely be on 3nm; in general, the technology looks to be at 1nm just three years from now. Where do we go beyond that? We shouldn't expect quantum computing any time soon.

AI is the big thing right now, and what we're seeing with DLSS is performance gains thanks to it. With AI we can render at lower resolutions and get quality as good as native, if not better. With DLSS3 we're now seeing frame generation. As AI improves and these features trickle down to lower-tier products, we'll start to count AI performance as the norm.
 

GymWolf

Gold Member
Follow this guide for frame gen:



If the option is greyed out for you then make sure you have done step 1:

I already had everything set to perfection.

Later I'm gonna try the trick another member suggested.
 

Mayar

Member
After struggling for a day on my config (AMD Ryzen 5 3600, 16GB RAM + RTX 2070 12GB), I decided to continue playing on my PS5. In general, it's a more pleasant experience than on my PC...
(Everything is pretty good indoors, but when you go out into the open world, the situation is completely different.)
 

JackSparr0w

Banned
After struggling for a day on my config (AMD Ryzen 5 3600, 16GB RAM + RTX 2070 12GB), I decided to continue playing on my PS5. In general, it's a more pleasant experience than on my PC...
(Everything is pretty good indoors, but when you go out into the open world, the situation is completely different.)
Could have at least waited for the launch day patch, but you spent $120 to buy both versions instead?
 
Last edited:

b0uncyfr0

Member
Here are a few tweaks for Windows that can reduce stuttering:

The most important is disabling Control Flow Guard, since this is a DX12 game.
Open Exploit Protection via the Windows search bar. Click on the Program settings tab, then on "+ Add program to customise". Pick "Choose exact file path", find the game's exe and open it. In its program settings, scroll down to "Control flow guard (CFG)", put a check mark in "Override system settings", turn it Off and Apply. Restart your PC.

Next, open the game, then Task Manager. Look for the game executable and click on Open File Location. This is important, because UE4 creates 2 exe files for each game and we want the right one.
Right-click the game's executable, then Properties, then the Compatibility tab. Tick "Disable fullscreen optimizations".
Then set "Override high DPI scaling behavior" to Application. Apply these settings and restart the game.
Well shit, this actually helped!! I always thought these were hit or miss, but it really did smooth out the game for me.

Now, as long as I stay under 8GB of VRAM (because 6600 XT), most of the major stutters are gone.
CPU/GPU usage could be better, but that could be a me thing.
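For what it's worth, the two Compatibility-tab steps in that guide (fullscreen optimizations off, DPI override set to Application) just end up as a per-user registry entry, so they can be scripted if you're setting up more than one machine. A minimal Python sketch, assuming the usual AppCompatFlags\Layers location and that the two flag names below are what the Compatibility tab writes; the exe path is a placeholder for wherever your copy actually lives:

import winreg

# Placeholder - replace with the exe you found via Task Manager > Open File Location.
GAME_EXE = r"C:\Games\Hogwarts Legacy\HogwartsLegacy.exe"

# Assumption: the Compatibility tab stores its choices as space-separated flags under this key.
# DISABLEDXMAXIMIZEDWINDOWEDMODE -> "Disable fullscreen optimizations"
# HIGHDPIAWARE                   -> "Override high DPI scaling behavior: Application"
LAYERS_KEY = r"Software\Microsoft\Windows NT\CurrentVersion\AppCompatFlags\Layers"
FLAGS = "~ DISABLEDXMAXIMIZEDWINDOWEDMODE HIGHDPIAWARE"

with winreg.CreateKeyEx(winreg.HKEY_CURRENT_USER, LAYERS_KEY, 0, winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, GAME_EXE, 0, winreg.REG_SZ, FLAGS)

print(f"Compatibility flags set for {GAME_EXE}")

The Control Flow Guard override doesn't have an equally simple equivalent I'd trust to a script, so do that one through the Exploit Protection UI as described above.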
 

Mayar

Member
Could have at least waited for the launch day patch, but you spent $120 to buy both versions instead?
I understand that this sounds strange to some. But for me it was an acceptable option: my girl plays on the PC and I'm in my room on the PS5, and everyone is happy =)
 
  • Strength
Reactions: Rea

winjer

Gold Member
Well shit, this actually helped!! I always thought these were hit or miss, but it really did smooth out the game for me.

Now, as long as I stay under 8GB of VRAM (because 6600 XT), most of the major stutters are gone.
CPU/GPU usage could be better, but that could be a me thing.

I know. Most people think these things are just placebo.
A couple of weeks ago I got the same reaction from a friend while he was trying out The Callisto Protocol on his new 4090.

BTW, you can also try out the HZBOcclusion tweak, since you have an AMD card.
It doesn't work on every UE4 game, but on some it can give a ~10% performance boost.

[/script/engine.renderersettings]
r.HZBOcclusion=1
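In case anyone is wondering where those two lines go: they belong in the per-user Engine.ini, not in the game's install folder. A minimal Python sketch that appends them, assuming the usual UE4 location under %LOCALAPPDATA% and guessing the config folder is called "Hogwarts Legacy" (check the exact name on your machine):

import os
from pathlib import Path

# Assumed path - UE4 games normally keep the writable Engine.ini here; the folder name is a guess.
engine_ini = Path(os.environ["LOCALAPPDATA"]) / "Hogwarts Legacy" / "Saved" / "Config" / "WindowsNoEditor" / "Engine.ini"

SNIPPET = "\n[/script/engine.renderersettings]\nr.HZBOcclusion=1\n"

if engine_ini.exists():
    with engine_ini.open("a", encoding="utf-8") as f:  # append so existing settings are kept
        f.write(SNIPPET)
    print(f"Added r.HZBOcclusion=1 to {engine_ini}")
else:
    print(f"Couldn't find {engine_ini}; locate Engine.ini manually and paste the two lines in.")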
 

Mayar

Member
It seems to me that the main problem is that they decided to use Unreal 4. Why that decision was made is not entirely clear, but to put it mildly it is a capricious engine and far from the best. So even if they roll out patches the situation will certainly get better, but I don't expect any radical result. What prevented them from using Unreal 5, or migrating from version 4 to 5, I don't quite understand.
 

nightmare-slain

Gold Member
It seems to me that the main problem is that they decided to use Unreal 4. Why that decision was made is not entirely clear, but to put it mildly it is a capricious engine and far from the best. So even if they roll out patches the situation will certainly get better, but I don't expect any radical result. What prevented them from using Unreal 5, or migrating from version 4 to 5, I don't quite understand.
The game had been in development since long before UE5 came out. It was probably too much work to move it over and update everything.

For a UE4 game it looks damn good. At least I think so.
 

LiquidMetal14

hide your water-based mammals
Here are a few tweaks for Windows that can reduce stuttering:

The most important is disabling Control Flow Guard, since this is a DX12 game.
Open Exploit Protection via the Windows search bar. Click on the Program settings tab, then on "+ Add program to customise". Pick "Choose exact file path", find the game's exe and open it. In its program settings, scroll down to "Control flow guard (CFG)", put a check mark in "Override system settings", turn it Off and Apply. Restart your PC.

Next, open the game, then Task Manager. Look for the game executable and click on Open File Location. This is important, because UE4 creates 2 exe files for each game and we want the right one.
Right-click the game's executable, then Properties, then the Compatibility tab. Tick "Disable fullscreen optimizations".
Then set "Override high DPI scaling behavior" to Application. Apply these settings and restart the game.

I know. Most people think these things are just placebo.
A couple of weeks ago I got the same reaction from a friend while he was trying out The Callisto Protocol on his new 4090.

BTW, you can also try out the HZBOcclusion tweak, since you have an AMD card.
It doesn't work on every UE4 game, but on some it can give a ~10% performance boost.

[/script/engine.renderersettings]
r.HZBOcclusion=1
I'm going to be trying these things out when I get home. Thanks for sharing.
 

Peterthumpa

Member
It seems to me that the main problem is that they decided to use Unreal 4. Why that decision was made is not entirely clear, but to put it mildly it is a capricious engine and far from the best. So even if they roll out patches the situation will certainly get better, but I don't expect any radical result. What prevented them from using Unreal 5, or migrating from version 4 to 5, I don't quite understand.
This is without a doubt ChatGPT copypasta. :messenger_grinning_sweat:
 

winjer

Gold Member

[Charts: VRAM usage, 2560x1440 performance, and 2560x1440 RT performance]
 
Last edited:

DanEON

Member
So it looks like what is causing the stutters is the good old shader compilation:

"Just like in most other recent releases, shader compiling and stuttering is a problem in Hogwarts Legacy. Right on startup, before the main menu, the game presents you a "compiling shaders" screen that lasts for three to five minutes. Unfortunately the shaders that get compiled at this time are only the most essential ones. As you progress through the game you'll encounter serious drops in framerate (to unplayable levels) for about 30 seconds. Just stop and wait for the shader compiler to finish—this breaks the immersion of course and I wonder if this can't be solved more elegantly. I investigated a bit further and the game is compiling shaders in the background even during normal stutter-free gameplay, too, without affecting the game much. We've seen such problems in many games recently and I keep hearing how these will get fixed in the "day one" patch, "soon" after launch, maybe with the next big patch, or with the next DLC. For nearly all games these issues never get fixed, and there's never any noteworthy performance improvements, so /doubt. My recommendation is to just live with it, stop playing for a few seconds when you get hit by the stutter, take a short break, and resume when things are smooth again. Just to clarify, for 98% of my play time in the first few hours there was no stutter and no frame drop issues, everything was super smooth."
 

LiquidMetal14

hide your water-based mammals
So it looks like what is causing the stutters is the good old shader compilation:

"Just like in most other recent releases, shader compiling and stuttering is a problem in Hogwarts Legacy. Right on startup, before the main menu, the game presents you a "compiling shaders" screen that lasts for three to five minutes.
This is the main thing that caught my attention here. I agree that shader compilation doesn't just happen once. I did update the driver yesterday, but it seems like every time I boot the game up it needs to do the shader compilation again.

I have the game on an NVMe drive so that may help, but it's definitely not 3 to 5 minutes. Even the first time it might have been a little over a minute, if that, and subsequently it's been 30 seconds or so every time.

Regarding VRAM usage, it has definitely pegged a little higher for me; as I posted earlier in the thread, it's around 19 GB at 4K maxed.
 
Last edited:

Mister Wolf

Gold Member
So it looks like what is causing the stutters is the good old shader compilation:

"Just like in most other recent releases, shader compiling and stuttering is a problem in Hogwarts Legacy. Right on startup, before the main menu, the game presents you a "compiling shaders" screen that lasts for three to five minutes. Unfortunately the shaders that get compiled at this time are only the most essential ones. As you progress through the game you'll encounter serious drops in framerate (to unplayable levels) for about 30 seconds. Just stop and wait for the shader compiler to finish—this breaks the immersion of course and I wonder if this can't be solved more elegantly. I investigated a bit further and the game is compiling shaders in the background even during normal stutter-free gameplay, too, without affecting the game much. We've seen such problems in many games recently and I keep hearing how these will get fixed in the "day one" patch, "soon" after launch, maybe with the next big patch, or with the next DLC. For nearly all games these issues never get fixed, and there's never any noteworthy performance improvements, so /doubt. My recommendation is to just live with it, stop playing for a few seconds when you get hit by the stutter, take a short break, and resume when things are smooth again. Just to clarify, for 98% of my play time in the first few hours there was no stutter and no frame drop issues, everything was super smooth."

It's insane to me that they just won't give the option to precompile ALL of the shaders beforehand in the settings.
 

yamaci17

Member
Seems like 8 GB is a no-go for 1440p. Sad times. Had to drop to 1080p/medium to avoid VRAM spikes. Still get them here and there. Turns out there's not much difference between medium and ultra textures, but still.

16 GB of RAM is 80% enough, but Hogsmeade pushes it to the limit. I hope they address it with the day 1 patch. This is another game with an artificial VRAM cap, which only lets the GPU use a maximum of about 88% of its VRAM, so 700-800 MB of an 8 GB buffer is always left free.
 
Last edited:

Mister Wolf

Gold Member
Is that even possible given the dynamic TOD and the fact that if you have RT turned on then those calculations for AO, shadows and reflections are not pre-determined?

Metro Exodus compiles shaders, doesn't have issues with shader stutter, and uses raytracing for all of the lighting in the game with a dynamic TOD.
 
Last edited:
  • Like
Reactions: GHG

GHG

Member
Metro Exodus compiles shaders, doesn't have issues with shader stutter, and uses raytracing for all of the lighting in the game with a dynamic TOD.

I guess it is possible then.

I think all the examples of games that don't have these issues use something other than Unreal Engine though.

Even though my hardware is good enough to get around this for the most part, I would really like someone to get to the bottom of it; it's getting silly now.
 

ajanke

Member
Part of my performance issues yesterday (the first day performance was great) was that, after the NVIDIA driver update, RT had somehow been switched on and set to high. Once I turned it down to low my performance was mostly back to normal. The first day was without RT at all.
 

Guwop

Neo Member
It's okay to say fuck on this forum mate.
However it'd be nice if you didn't make claims about being able to max out the game at 1440p on a 7 year old GPU and get solid 60 fps considering there are 2x more powerful cards struggling to achieve the same.
It would be even nicer if you quit acting like you know how the game runs on my 7 year old PC when you have no clue. If you don't believe it then save your tears and don't believe it. Again, now ask me do I give a f*ck what you believe?
 

Aja

Neo Member
One weird thing... I've noticed how I can stop the stutterfest that sometimes occurs. Just now it happened in the Hogwarts main hall, which I walked into while everyone was eating. The frames sloooowed down to a crawl. Cast the light spell and voila, frames all good again and smooth playing. And I mean instantly, it just switches over after that. Tried this three times in different situations and it worked every time. So freaking weird.
 

Brock2621

Member
One weird thing... I've noticed how I can stop the stutterfest that sometimes occurs. Just now it happened in the Hogwarts main hall, which I walked into while everyone was eating. The frames sloooowed down to a crawl. Cast the light spell and voila, frames all good again and smooth playing. And I mean instantly, it just switches over after that. Tried this three times in different situations and it worked every time. So freaking weird.
Interesting. Mine stuttered for a frame or two every time I cast light.

I’m hoping there’s a patch released today
 

GymWolf

Gold Member
It is strange because I don't have prolonged slowdowns at all, just the occasional random stutter.

It is far away from Callisto, and it also looks way worse tbh.
 
Last edited:
Capped the game to 30fps; ofc far from ideal, but constant weird camera stutters feel much worse than a fairly consistent, stable fps.
 

manfestival

Member
Ehhhhh, performance on this game is a bit of a joke.
4090/5800X at 4K Ultra with all settings cranked, and it's horrible in the castle, bouncing anywhere from 70 down to 40 fps.

It needs DLSS3 with frame gen to get 120ish fps. Not good. Frame generation and upscaling shouldn't be the solution when playing on high-end hardware.
This is the post. I watched some videos, like Daniel Owen's, and they basically confirm everything you're saying. Makes me think these other dudes are on one.
 