Ahh, ok... conspiracy theories are something the NeoGAF folks really like
Your concerns have been duly noted, but I can't relate to you.
I'm not worried in the slightest. The Xbox One is indeed the weaker machine compared to the PS4 and the recommended PC specs, but that doesn't mean they have to spend a colossal amount of time getting the best out of it; most likely they'll just tone down some effects and that's it.
Why do these threads always have to turn out the same way, when the benchmarks, once they come in, always paint the same picture?
This is a spit in the face to all of us in the Elite Gaming PC Race (EGPCR). 30fps is simply unplayable trash.
lol at some of the people in this thread...
"Hey guys I've got 4 Titan Z's and 4 i7 processors with 128GB of ram, I don't know if I can even get 30 fps on this omg what will i do??!??!"
I'm just saying what those two guys wrote; obviously I wasn't there, so I can't vouch for whether it's authentic - maybe CDPR just served vodka as refreshments.
First article:
We were asked not to change the gfx settings - we could look at them but not change anything. Understandable - such an early build might not be 100% stable. I regret we couldn't see Nvidia HairWorks in action - it was disabled. The hair on the characters looked okay, but it was the weakest element of them. We played at 1920x1080 on High (the only setting above it was Ultra), IIRC with SSAO disabled and some medium AA and AF. There was no fps counter visible, and for most of the gameplay the game was fluid - meaning ~30 fps or slightly more, though not exceeding 35 fps even in indoor locations. There were large fluctuations in framerate - not just something on the order of 10-15 fps; it was possible for the framerate to drop from 40 to 20 at the drop of a hat, usually caused by the location rather than by what was happening on screen. There's a lot of work left to do on optimization. The only thing I'm sure of is that a lot of gamers will not be happy with their hardware's performance.
(quick and dirty translation)
The second article expressed similar impressions, although its author states SSAO was on.
This image perfectly sums up every single pre-release spec thread.
So they had no way of telling the framerate because there was no fps counter, yet they could tell exactly when it ran above 30, or at 35, etc.? Am I reading this right?
"Admittedly, no counter was made available, but the slowdowns were clearly noticeable, especially in heavily forested areas. I'm a gamer and I test a lot of hardware, so I pick up on these things without any trouble."
(translated from the original Polish)
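For what it's worth, you don't need an on-screen counter for that - slowdowns like these map to frame times you can feel (a drop from 40 to 20 fps means frames jumping from 25 ms to 50 ms). Here's a minimal sketch of what such a check looks like in code, purely illustrative; `render_frame` is a hypothetical stand-in for the game's per-frame work:

```python
import time

def run_with_frametime_log(render_frame, seconds=10.0, slow_fps=30.0):
    """Call render_frame() in a loop and flag any frame slower than slow_fps."""
    slow_threshold = 1.0 / slow_fps            # 30 fps -> ~33.3 ms budget per frame
    start = last = time.perf_counter()
    frames = 0
    while time.perf_counter() - start < seconds:
        render_frame()                         # hypothetical: whatever the game does each frame
        now = time.perf_counter()
        dt, last = now - last, now
        frames += 1
        if dt > slow_threshold:                # a "felt" stutter: this frame blew the budget
            print(f"frame {frames}: {dt * 1000:.1f} ms (~{1.0 / dt:.0f} fps)")
    total = time.perf_counter() - start
    print(f"average: {frames / total:.1f} fps over {total:.1f} s")
```

Anything much past the ~33 ms line shows up as a visible hitch, which is exactly what the previewer describes in the forested areas.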
lol at some of the people in this thread...
"Hey guys I've got 4 Titan Z's and 4 i7 processors with 128GB of ram, I don't know if I can even get 30 fps on this omg what will i do??!??!"
This image perfectly sums up every single pre-release spec thread.
On the other hand, once we get actual benchmarks it's usually tumbleweed.gif.
lol at some of the people in this thread...
"Hey guys I've got 4 Titan Z's and 4 i7 processors with 128GB of ram, I don't know if I can even get 30 fps on this omg what will i do??!??!"
You do know that there's a bunch of other stuff that contributes to the final look besides resolution, models and textures, right? It's not 2004 anymore, when turning off bump mapping made everything look like shit.
Hilarious and so true.
I dunno why anyone would expect to play this at 60 fps. I've never seen an open-world game with this level of graphics running at 60 fps before. Normally open-world games are limited to 30 fps due to the sheer size and complexity of the world.
Why is expecting 60fps on a PC that out of the ordinary? That should be the bare minimum. Being open world has nothing to do with it.
Are there a lot of new PC gamers lately? These threads always happen and then when said game gets released the benchmark thread is practically empty.
Of course I realize that, but this is what they said:
"PC, PS4 and Xbox One use the same build, textures and models. Game will look almost identical on all platforms"
If that isn't a massive red flag, I don't know what is. How heavily are they investing in these PC-only features if, as they say, it will look almost identical on all platforms?
Again, I'm not dooming the game or saying it won't look as good as it should. What I'm taking issue with is this statement; if it's true, it's 100% bull crap coming from a PC-first developer like CD Projekt Red.
Why is expecting 60fps on a PC that out of the ordinary? That should be the bare minimum. Being open world has nothing to do with it.
Lol there is no "running" at 30 fps on PC. It's all variable depending on settings vs hardware.
Sure there is. You can frame limit to 30, you can enable half-refresh v-sync (assuming a 60Hz display), or you can run your display at 30Hz (if supported) with normal v-sync.
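To be concrete about the frame-limit option: capping is just a matter of sleeping off whatever is left of the ~33 ms budget each frame. A rough sketch under my own assumptions - `update_and_render` is a hypothetical callback, and in practice you'd lean on the driver's half-refresh v-sync or an external limiter rather than rolling your own:

```python
import time

TARGET_FPS = 30
FRAME_BUDGET = 1.0 / TARGET_FPS          # ~33.3 ms per frame at 30 fps

def capped_loop(update_and_render, seconds=5.0):
    """Run update_and_render() no faster than TARGET_FPS by sleeping off leftover frame time."""
    end = time.perf_counter() + seconds
    while time.perf_counter() < end:
        frame_start = time.perf_counter()
        update_and_render()              # hypothetical: the actual per-frame work
        leftover = FRAME_BUDGET - (time.perf_counter() - frame_start)
        if leftover > 0:
            time.sleep(leftover)         # crude cap; unlike v-sync it doesn't prevent tearing
```

The half-refresh v-sync and 30Hz options mentioned above do the same thing at the display level, with the bonus of keeping frames aligned to the refresh.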
Welp, I'm now targeting "not buying".
The Witcher 2 ran and looked amazing at 60 fps on my old 660 Ti. On my 970 with downsampling it still looks very, very good. I don't see an improvement in TW3 over TW2 significant enough for it to run so much worse, unless it's just unoptimized. They probably had too much work optimizing for the consoles as well.
The devs told us way back last year that a 780 Ti would not be able to do 1080p/60 in this game, so I feel like expectations should have been tempered quite a while back.
If I can get close to 60 fps @ 1080p with my 970s on a blend of high and a few ultra settings, I'll be happy, but I'm not holding my breath.
The devs told us way back last year that a 780 Ti would not be able to do 1080p/60 in this game, so I feel like expectations should have been tempered quite a while back.
If I can get close to 60 fps @ 1080p with my 970s on a blend of high and a few ultra settings, I'll be happy, but I'm not holding my breath.
Their Ultra settings may also include something like 8x MSAA by default. Something I find bold, but also kind of overboard for most people. I would imagine the effects and technique settings can be set to Ultra for a lot of people... but IQ (MSAA) as well as view distance and LOD distance settings will need to be toned down.
I'd be kind of surprised if TW3 actually had MSAA. TW2 didn't, and a lot of games with deferred rendering don't go to the trouble of implementing it at all.
Why is the nvidia counterpart a 770 when a 290 is much better?
Wait, then what was TW2's anti-aliasing done with? And I'm not talking about ubersampling.
I'd be kind of surprised if TW3 actually had MSAA. TW2 didn't, and a lot of games with deferred rendering don't go to the trouble of implementing it at all.
(What I'm saying is that I can't find a concrete source stating this is actually the case. There's a story from September saying that PC ultra settings include 8x MSAA, but that source also has some "facts" about the console versions that we now know are false.)
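For a sense of scale on that 8x MSAA claim: some back-of-the-envelope arithmetic, using my own assumptions (one RGBA8 color target plus one D24S8 depth/stencil target, every sample stored; real renderers vary):

```python
# Rough render-target memory for MSAA at 1080p under the assumptions above.
WIDTH, HEIGHT = 1920, 1080
BYTES_PER_SAMPLE = 4 + 4   # RGBA8 color + D24S8 depth/stencil, per sample

for samples in (1, 2, 4, 8):
    mb = WIDTH * HEIGHT * BYTES_PER_SAMPLE * samples / (1024 ** 2)
    print(f"{samples}x: ~{mb:.0f} MB of render targets")
```

That works out to roughly 127 MB of render targets at 8x before any resolve bandwidth, and in a deferred renderer the whole G-buffer would need to be multisampled as well - which is why, as noted above, many deferred games skip MSAA entirely.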
Posted on behalf of junior FLAguy:
http://www.gry-online.pl/S018.asp?ID=1132 the original interview
http://www.reddit.com/r/gaming/comments/2trb9b/adam_badowski_cd_projekt_red_managing_director_in/ the reddit topic
According to Google Translate they also ask some other downgrade-related questions that the redditor did not translate (e: see edits below), but he says he will if there is enough interest.