"Sometimes I feel I'm missing something."

He's just making a completely unprovoked dig at PC gamers. As usual.
?
Nothing prevents you from playing at 60fps if your system has enough grunt.
I don't get the drama here; recommended specs always aim for 30 fps or close. A 290 can most likely play at high/ultra 30 fps, and close to 60 using console settings.
Well, there you go. Not maxed.
A 780 with ubersampling will get around 40 fps at 1080p
"Recommended specs at 30fps does not paint a good picture for this game being optimized. And then if you ever played TW2 you're already shitting your pants."

We cannot comment on optimization until the game is in our hands. I can't believe the game won't run well on a 290 or a 4 GB 770. If the consoles run at high settings, then you can only conclude a 290 will run it much, much better at the same settings.
"If it can run Dragon Age Inquisition, I don't see why this would be an unreasonable expectation."

DA:I is far more reasonable technically, and was made with PS360 in mind. It carefully sections areas, cuts corners aggressively with clipping and pop-in, has less detailed (more stylized) character models, and just keeps density lower in general. Some may have wondered why Orlais feels so empty during the day, or why Redcliffe is so small.
"I don't know who's trolling, but some responses really annoy me. We need games like these that push graphics even if it means that most people will run the game at 30 fps at this point in time. Having high specs doesn't necessarily mean the game is unoptimized; it might just mean that the game is pushing more graphics."

Gamers will one day in 2017 grab this for $8.99 on Steam, and run this puppy in glorious 1440p/ultra/60 with their Nvidia Volta.
You can always change the settings to get 60fps. I reckon the game will look great even with some settings turned off.
Welp, I guess it's time I upgraded. I have a 670. In the meantime, I'll be getting this on PS4.
"Do you really think a PS4 can run this game better than a 670?"

People like posting stuff like this all the time, aka just irrational, unthinking stuff. At that point I doubt that they would even care.
Well yeah, but I have a 760, so according to them I'm already under 30 fps. No reason to go PC instead of PS4 for me, really.
This is definitely a 'wait for Digital Foundry article' type of buy
The PS4 won't be running the game at settings that cause a 770 or a 290 to run at 30 fps. Come on, man, use your brain.
"Well yeah, but I have a 760, so according to them I'm already under 30 fps. No reason to go PC instead of PS4 for me, really."

I reckon a 760 will run the game as well as the PS4, but only if it's the 4 GB version. I don't see 2 GB cards faring well at all in this game. I think (but it's only speculation) that a mid-range GPU like a 7950/760 will run the game at high settings and 30 fps.
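For a rough sense of why 2 GB cards are a worry, here's some back-of-the-envelope VRAM arithmetic. This is a minimal sketch; the texture counts and sizes are illustrative assumptions, not measured Witcher 3 numbers:

```python
# Back-of-the-envelope VRAM math; all figures are illustrative assumptions,
# not measured Witcher 3 data.

def texture_mib(width, height, bytes_per_pixel=4, mipmapped=True):
    """Approximate size of one texture in MiB (a full mip chain adds ~1/3)."""
    base = width * height * bytes_per_pixel
    total = base * 4 / 3 if mipmapped else base
    return total / (1024 ** 2)

print(f"{texture_mib(2048, 2048):.1f} MiB per uncompressed 2K RGBA8 texture")  # ~21.3

# Assume a scene streams ~80 such textures plus ~512 MiB of render targets,
# geometry, and driver overhead:
scene_mib = 80 * texture_mib(2048, 2048) + 512
print(f"~{scene_mib / 1024:.1f} GiB estimated")  # ~2.2 GiB: over 2 GiB, well under 4 GiB
```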
Why do these threads always have to turn out the same way, when the results once the benchmarks come in always paint the same picture?
Welp, I'm now targeting "not buying".
The Witcher 2 ran and looked amazing at 60 fps on my old 660 Ti. On my 970 with downsampling it still looks great. I don't see an improvement in TW3 over TW2 significant enough that it should run so much worse, unless it's unoptimized. They probably just had too much work optimizing for consoles as well.
"Why do these threads always have to turn out the same way, when the results once the benchmarks come in always paint the same picture?"

I think we can say that PC gaming is officially part of the console war bullshit, despite not being a console. I partially blame the "master race" meme for the war developing to this point.
"Gamers will one day in 2017 grab this for $8.99 on Steam, and run this puppy in glorious 1440p/ultra/60 with their Nvidia Volta."

And it will be glorious.
"As long as I can choose to lower settings and get 60 fps, where is the issue? I'd rather they push potential options to the max and let me decide, since I can do that on PC, than just give me Dark Souls 2 graphics and go 'oh wow, I can put everything on MAX at 4K and still get 60 fps, what a marvel! Never mind how the game actually looks.'"

This is ideal, but some games, especially open-world games, still struggle to hit a consistent framerate even when lowering settings. It all depends on where the bottlenecks are and how taxing certain settings are on which parts of the system, ya know?
For non-Polish readers: I can confirm that both articles state that the preview build of the game was played on an i7-4790 + GTX 980 and had frame drops when played on the High preset (and generally ran at ~30 fps).
"This game must be incredibly demanding. I can only assume this means you'll get 30 fps if you have the recommended system at 1080p/ultra."

I really think most of the time these requirements are just guesstimates from developers, rather than them methodically benchmarking different systems with very specific targets in mind.
Some info from the Metro preview:
"We spent the majority of our time on the Xbox One version and it was immediately obvious that the console versions dont look nearly as good as the PC. Thats completely as youd expect, especially as Polish developer CD Projekt RED used to work almost solely on the PC, but those expecting the game to look like the trailers should brace themselves.
We also played the PlayStation 4 version, which to the naked eye looks identical although apparently it runs at 1080p resolution compared to the Xbox Ones 900p. What was clearly visible though was a lot of screen tearing, particularly in the first hour or so; obvious object pop-in; and some surprisingly low res textures at times. There were a few bugs and glitches too, but nothing that seemed concerning given the game is still four months away (it certainly appeared more stable than Dragon Age: Inquisition did even at launch)."
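As an aside, the pixel-count arithmetic behind that 900p/1080p gap is simple. A quick sketch, assuming the usual 1600×900 framebuffer for "900p":

```python
xbox_one = 1600 * 900  # "900p", assuming a 1600x900 framebuffer
ps4 = 1920 * 1080      # 1080p

print(f"{ps4 / xbox_one:.2f}x")  # 1.44x: the PS4 shades 44% more pixels per frame
```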
"This is ideal, but some games, especially open-world games, still struggle to hit a consistent framerate even when lowering settings. It all depends on where the bottlenecks are and how taxing certain settings are on which parts of the system, ya know?"

Yeah, I guess that was my experience with both Watch Dogs and Unity (though Unity almost held a stable 60 for more than a minute or so at certain moments). WD was mostly stuttering, even though FRAPS was telling me it was a rock-solid 60 fps.
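That FRAPS observation is worth unpacking: an average fps counter can hide frame-time spikes entirely. A minimal sketch, assuming you have per-frame times in milliseconds (FRAPS can log these):

```python
def summarize(frame_times_ms):
    """Average fps plus a crude 99th-percentile frame time."""
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)
    p99_ms = sorted(frame_times_ms)[min(int(0.99 * n), n - 1)]
    return avg_fps, p99_ms

# One second of gameplay: 59 smooth ~16.1 ms frames plus a single 50 ms hitch.
times = [16.1] * 59 + [50.0]
avg, p99 = summarize(times)
print(f"avg {avg:.0f} fps, 99th-percentile frame time {p99:.0f} ms")
# -> "60 fps" on the counter, yet the 50 ms frame is a clearly visible stutter.
```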
Hey look, conflicting information!
How about waiting for the in-depth comparison of the final version for each platform instead of going crazy now?
Really curious to see the benchmarks for this and what the bigger performance hogs are. Is Nvidia's HairWorks as demanding as I expect it to be? I can't recall ever having played a game with that tech, so I don't know for sure.
Oh, one more thing from those Polish articles linked earlier - the build they played had both HairWorks and SSAO disabled.
And still struggled to reach 30 fps on a GTX 980.
The Witcher 2 was/is a monster on PCs.
That fact along with the screenshots being posted all over the internet led me to this conclusion long ago.
I hope my babby 660 Ti can run it at low settings.
Do you believe that someone on another thread used the line "We spent the majority of our time on the Xbox One version..." to demonstrate that the CD Projekt RED team put most effort into the Xbox One version? When I read that, I was like, "Are people really that stupid?"
"Oh, one more thing from those Polish articles linked earlier - the build they played had both HairWorks and SSAO disabled. And still struggled to reach 30 fps on a GTX 980."

Every single other report I've read claimed that the PC version was running much more smoothly than the console versions.
So either
(a) the PC version "struggles to reach 30 FPS" and the console versions run below 20, or
(b) the PC version didn't actually "struggle to reach 30 FPS".
I strongly believe that the latter is the case, but hey, we'll all see what's what when actual benchmarks are out.
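For what it's worth, a raw-throughput sanity check supports (b). The peak FP32 figures below are public spec-sheet numbers, and real games don't scale purely with FLOPS, so treat this as a rough plausibility argument only:

```python
ps4_tflops = 1.84     # PS4 GPU peak FP32 (spec-sheet figure)
gtx_980_tflops = 4.6  # GTX 980 peak FP32 at reference clocks

ratio = gtx_980_tflops / ps4_tflops
print(f"{ratio:.1f}x")  # ~2.5x

# If a 980 truly "struggled to reach 30 fps" at console-like settings, naive
# FLOPS scaling would put the PS4 near 30 / 2.5 = 12 fps, which contradicts
# the consoles' ~30 fps target. Hence option (b) looks far more likely.
print(f"implied console fps: {30 / ratio:.0f}")
```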
From one of the Polish articles linked earlier:
"We were asked not to change the graphics options. We could look at what was set, but without touching anything. I understand that: such an early build may not be one hundred percent stable. I somewhat regret that I couldn't see Nvidia HairWorks in action - that option was disabled. Although I have no complaints about the characters' hair, it doesn't rise above average. Put more simply, it's the element that fits the rest least, standing out the most against the excellently modeled characters. I played at 1920 × 1080 on high settings (above which there is only 'ultra', as far as I remember), with SSAO disabled and modest anti-aliasing and anisotropic filtering. I didn't have a framerate counter. Most of the time a nominal smoothness was maintained, by which I mean around 30 fps or slightly more, but no more than 35 fps, even in enclosed interiors. At times, though, the fluctuations were very large. Maybe they weren't drops to 10-15 fps or even less, but it did happen that the number fell from 40 to 20 within literally a second, and often it wasn't related to the action on screen, more to a specific location. In terms of optimization there is still a lot of work to do. Many people will weep over their hardware, of that I'm sure."
If HairWorks off still produces that kind of hair, I don't need it.
"...since we probably won't get a new series this year, and only one or two new cards (if rumors are to be trusted)."

AMD will have at least 3 SKUs for April-June 2015. No idea about Nvidia.
"Do you believe that someone on another thread used the line 'We spent the majority of our time on the Xbox One version...' to demonstrate that the CD Projekt RED team put most effort into the Xbox One version? When I read that, I was like, 'Are people really that stupid?'"

Why bother being rational when you can simply let yourself fall into the trap of confirmation bias?
"I was that person and I already said I made a mistake of reading too fast. English is not my native language."

Neither is mine.
"But I still think that is what the situation is like at CDPR: more time spent optimizing the X1 version will lead to less time on the PS4 and PC versions."

I don't think so, but some will grasp at any straw to reach that conclusion.
OK, but why do you think they are going to spend more time on optimizing the X1 version?
Until proven otherwise, CDPR have made sure that each SKU is properly taken advantage of. The PC has more bells and whistles, the PS4 is 1080p, the Xbox One is 900p (presumably at the same settings as the PS4).
I see nothing to get upset at.
"OK, but why do you think they are going to spend more time on optimizing the X1 version?"

Because it fits within a conspiracy-theory-level narrative about game development.
"I already said I read too fast and got mistaken. Don't tell me you have never made mistakes. And I am not upset at all, just concerned."

Your concerns have been duly noted, but I can't relate to you.