
Destiny 2 PC performance thread

Well, the workaround for HDR at 1440p right now is to use the 378.92 NV drivers, set the resolution to 1080p, and the res scale to 133%. Game looks alright like that.

UI is still in 1080p though and looks real assy.

Hopefully this is all ironed out for the actual release in October. Kinda shitty as it stands.
 

Wonko_C

Member
Way below specs (Core 2 Quad Q9650/GTX 750) and it manages to pull off 60fps, with maybe 30-40ish during the Cabal defense part. I haven't been able to go further than that because, like the console beta, the game always disconnects right after that.

https://www.youtube.com/watch?v=I4B4ngi0YZ0

https://www.youtube.com/watch?v=amfxsXKT8Fs

https://www.youtube.com/watch?v=gWPPCkbB2dU

I'm hoping I can buy a GTX1050 Ti to replace my 750 soon. Those extra 3GB should keep my PC alive for a couple more years at least.
 
Yoooooo, this PC port is ridiculous

My setup:

ASUS GTX 1060 6GB
AMD FX-6300 with stock cooling
16GB DDR3 RAM
4K HDR Sony TV

I can run the game at 4K 30 on high or 1080p 60 also on high

HDR seems weird though. It doesn't look as good as any game I've played with it on PS4 Pro. Is anyone else noticing this?
 

j-wood

Member
i7 6700K and GTX 1070 here. Factory clocks.

I'm getting between 45 and 75 FPS at 2560x1440 with everything maxed, MSAA turned off.

Any other settings I should turn off/down to get to a smooth 60?



Are you overclocking by chance? I've got similar parts and I'm nowhere near 100fps even with MSAA turned off.

Nope, no overclocking. I have an i7 6700k, 1070, and 16GB of DDR4. Game is installed on an SSD.
 

Tecnniqe

Banned
That's a nice rant, but the screenshots demonstrate very clearly that MSAA isn't implemented correctly in this game at this point.

(And it certainly does some things better than any PPAA, that's why e.g. all the most highly-acclaimed [visually] VR games use it)

Bullshit. In a forward renderer, MSAA is often easily the best tradeoff between temporal stability, spatial aliasing reduction, sharpness and performance.
The law has been spoketh
 

KainXVIII

Member
Input delay, the mouse feels so heavy with it on. I don't know how anyone could play a shooter with VSync on, it feels disgusting IMO. Fast Sync is better if you absolutely need it on.

I play every shooter with vsync on, never noticed any input lag with the mouse.
 

Makoto-Yuki

Gold Member
Any reason why mine doesn't seem to show resolutions above 1080p? Swapping to windowed fullscreen doesn't fix it either.

the game won't show resolutions higher than whatever your monitor is. if you want to downsample then use the resolution render setting. At 1080p 134% is roughly 1440p. 167% is about 1800p. At 1440p 125% is 1800p and 150% is 2160p.
 
I didn't have the fps counter on but my 1080 had no problems with DSR 5K on highest settings (except MSAA). Felt super smooth. I'll try again with a fps counter on tonight.
 

Durante

Member
The law has been spoketh
I should add that it's true that the performance/visual tradeoff for MSAA often doesn't make sense in deferred renderers.
But it's really important to recognize that
  • it's broken in this game right now, and if it was working as intended the visual aliasing-reduction effect would be much better.
  • there absolutely are tradeoffs with PPAA and TAA techniques, and MSAA is still the technique of choice for many use cases (e.g. forward renderers or VR).
 
the game won't show resolutions higher than whatever your monitor is. if you want to downsample then use the resolution render setting. At 1080p 134% is roughly 1440p. 167% is about 1800p. At 1440p 125% is 1800p and 150% is 2160p.

So I assume that means the percentage is applied to both axes?

134% means 34% more on the X and Y axis for example, rather than 34% more total pixels?

Because 1440p is actually more like 177% of 1080p on a total pixel basis.
 
Same here. Really bugging me that I can't figure out why.

Temps are normal though. No warmer than 65 degrees Celsius, which is just fine for a constant 100% load. It does cause weird jitters though.

Also, with G-Sync, vsync doesn't seem to work; the game instead turns on a frame limiter when you enable vsync. And G-Sync gets turned off when you pick a non-native Hz for some reason.
 
Temps are normal though. No warmer than 65 degrees Celsius, which is just fine for a constant 100% load. It does cause weird jitters though

Yuuuuup. My exact situation. It's irritating having a 6600K + 1070 and still having drops down to 30-40 FPS and people with identical rigs maintaining 70+ FPS
 

aett

Member
Been messing with the video settings and I'm getting roughly 45-50 FPS on "High" with the shadows turned low and using FXAA. Anything else I can lower for some more performance without sacrificing much graphically?
 

Zeneric

Member
All graphics settings are on High (except Texture Quality, which is on Extra High; DoF and Motion Blur are turned off, and SMAA is on). I get 90-120 FPS at 1080p.

My PC specs:
6600k oc'd at 4.5GHZ
GTX 970
16GB DDR4
Runs on a SSD
Win10

My GTX 970 is still relevant I see.
 

nOoblet16

Member
So I assume that means the percentage is applied to both axes?

134% means 34% more on the X and Y axis for example, rather than 34% more total pixels?

Because 1440p is actually more like 177% of 1080p on a total pixel basis.
Yes it's to both axes. That's why 200% is 4K.

If you want 1440p downsampled to a 1080p screen, select 134%, which is a tiny bit over 1440p. 133% likewise is a tiny bit under 1440p.
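The arithmetic above can be sanity-checked with a few lines of Python (a sketch of the math being discussed, not anything from the game itself):

```python
# Sketch of how the render-scale slider appears to work: the percentage
# is applied per axis, so total pixel count scales with its square.

def render_resolution(base_w, base_h, scale_percent):
    """Effective render resolution for a given per-axis scale percentage."""
    s = scale_percent / 100
    return round(base_w * s), round(base_h * s)

def pixel_ratio(res_a, res_b):
    """Total-pixel ratio between two resolutions."""
    return (res_a[0] * res_a[1]) / (res_b[0] * res_b[1])

if __name__ == "__main__":
    print(render_resolution(1920, 1080, 134))  # a hair over 2560x1440
    print(render_resolution(1920, 1080, 200))  # 3840x2160, i.e. 4K
    # 1440p is ~178% of 1080p by total pixels, even though it's only
    # ~133% per axis:
    print(round(pixel_ratio((2560, 1440), (1920, 1080)), 2))
```

This also confirms Durante's point: 2560x1440 has 16/9 ≈ 1.78x the pixels of 1920x1080, so "134%" refers to each axis, not the total pixel count.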
 
There's an in-game option to display FPS as well.

One thing I immediately noticed is the Strike mission runs noticeably slower (80-100) than the Story mission (100-130), so benchmarks and impressions will vary a lot based on that.
 

Makoto-Yuki

Gold Member
There's an in-game option to display FPS as well.

One thing I immediately noticed is the Strike mission runs noticeably slower (80-100) than the Story mission (100-130), so benchmarks and impressions will vary a lot based on that.
There is but that's all it does. No frametimes, cpu/gpu/memory usage, temps, or fan speeds. RTSS has it all and shows real time graphs too.
 

ZOONAMI

Junior Member
There's an in-game option to display FPS as well.

One thing I immediately noticed is the Strike mission runs noticeably slower (80-100) than the Story mission (100-130), so benchmarks and impressions will vary a lot based on that.

Where is the game option for fps?
 

Makoto-Yuki

Gold Member
My 6700K at 4.5GHz only has about 20-30% usage and stays at 50-55C. The game is so easy on the CPU it's crazy. My 1070 at 2038/4175 never touches 70C with max 88% usage and about 3GB VRAM usage. RAM is about 8GB usage.

Output is 1080p 60hz with 134% (1440p) rendering resolution. All settings as high as they will go except AA down to SMAA, AO down to HDAO, DoF down to High, motion blur/grain set to off.

Absolutely solid 60fps throughout both strike/crucible. Not replayed the story mission again since using these settings.

I couldn't be happier with performance. I might be able to bump the rendering resolution up a bit more but probably won't bother.
 

Zeneric

Member
The 970 has been able to handle practically everything I've thrown at it quite well at 1080p. The worst performers are usually the really VRAM-hungry games.

Yes, the VRAM-hungry games. Even when you lower the settings to get VRAM usage to 3.5-4GB or lower, the graphics still look beautiful. I reckon the 970 will last a long while (at 1080p), at least till the next-generation consoles come out and push graphics beyond this generation's.
 

nOoblet16

Member
My 6700K at 4.5GHz only has about 20-30% usage and stays at 50-55C. The game is so easy on the CPU it's crazy. My 1070 at 2038/4175 never touches 70C with max 88% usage and about 3GB VRAM usage. RAM is about 8GB usage.

Output is 1080p 60hz with 134% (1440p) rendering resolution. All settings as high as they will go except AA down to SMAA, AO down to HDAO, DoF down to High, motion blur/grain set to off.

Absolutely solid 60fps throughout both strike/crucible. Not replayed the story mission again since using these settings.

I couldn't be happier with performance. I might be able to bump the rendering resolution up a bit more but probably won't bother.
That's interesting, because if there was ever proof that console CPUs are terrible, it's this. The CPU is the entire reason Destiny is 30 FPS on consoles, and modern CPUs are barely being utilised while doing tasks that already top out the console CPUs.
 

MastAndo

Member
Well, the workaround for HDR at 1440p right now is to use the 378.92 NV drivers, set the resolution to 1080p, and the res scale to 133%. Game looks alright like that.

UI is still in 1080p though and looks real assy.

Hopefully this is all ironed out for the actual release in October. Kinda shitty as it stands.
Cool, I'll give the render resolution bump a try. Thanks.
 

Rizific

Member
quick question for you all.....are you guys good with a solid 60 fps or pushing it closer to 100fps and higher?

as someone with a 144hz monitor, i try to get my average around 90-100 if possible since that's about where I stop being able to notice any difference. 60 is the new 30.
 
I'm just about to build a gtx 1080ti ryzen 7 pc, am I good on hitting 4k 21:9 60 fps with that?

You should totally be able to, minus lowering some settings like MSAA and DoF.

Ryzen performance is kinda weird tbh; this video shows how it strictly uses 8 cores and that's it, it's literally not using the SMT threads, which is quite disappointing.

I imagine this could be because they locked their optimization to 8 threads max, or something Ryzen-related. I would love to see the game perform on Threadripper or even X299 to see if more physical cores affect the load distribution.

https://www.youtube.com/watch?v=p4mKq--nQxI

Edit:

This video of a 7900x with a 1080ti SLI follows the same pattern, some threads have 0% utilization lol.

https://youtu.be/r0q-M4mwYxs
 

Easy_D

never left the stone age
Considering the reports that the FX6300 manages 60 FPS I take it my 280X ain't up to par for a locked 60 then?

Medium/High presets both ran the same so I settled with High, was somewhere around 45-60 FPS. Some small dips to 40, but never under. The CPU barely seems to be used, fan didn't even spin up. Jesus, how weak are the consoles even
 

Akronis

Member
Bullshit. In a forward renderer, MSAA is often easily the best tradeoff between temporal stability, spatial aliasing reduction, sharpness and performance.

Forward rendering has its own drawbacks and isn't a great fit for every game.

MSAA is awesome, but it just isn't possible in a lot of games :(

Considering the reports that the FX6300 manages 60 FPS I take it my 280X ain't up to par for a locked 60 then?

Medium/High presets both ran the same so I settled with High, was somewhere around 45-60 FPS. Some small dips to 40, but never under. The CPU barely seems to be used, fan didn't even spin up. Jesus, how weak are the consoles even

I mean, I don't think a CPU from the last 8 years exists that you could put into a PC that would be as weak as the CPU in the consoles.
 
It really should not do that. Even my 6820HK at 3.6GHz is not at 100% at 4K.

My 6700K at 4.5GHz only has about 20-30% usage and stays at 50-55C. The game is so easy on the CPU it's crazy. My 1070 at 2038/4175 never touches 70C with max 88% usage and about 3GB VRAM usage. RAM is about 8GB usage.

Output is 1080p 60hz with 134% (1440p) rendering resolution. All settings as high as they will go except AA down to SMAA, AO down to HDAO, DoF down to High, motion blur/grain set to off.

Absolutely solid 60fps throughout both strike/crucible. Not replayed the story mission again since using these settings.

I couldn't be happier with performance. I might be able to bump the rendering resolution about a bit more but probably wont bother.

I'm really puzzled why my CPU is getting topped out by this game.

This behavior only happens in D2 beta, BF1, and BF4. I ran around in Witcher 3 and Metro 2033 and my CPU usage was consistently under 60-70% and my GPU was maxed out, as it should have been. No clue what's going on. Ran virus scans, checked power profile, checked background processes. CPU usage at rest on my desktop is <5%

When task manager is the window in focus, D2 uses 50% CPU. When D2 comes into focus, it spikes back up to 100%. No other processes using any significant amount of CPU cycles
 