The chart is on this very page (well, the previous). You even quoted it! On a 960 HBAO+ is even faster than the default AO!

10-15 frames...? Wait what? Can someone confirm? Or is it sarcasm. I really don't know :/
Good call. The previous version only had an AA on/off toggle, and it was only believed to be FXAA. What the newest version actually uses, I don't believe we know.
Damn, some people really can't stand the idea of decreasing some settings.
"playing at console peasant settings?! NEVER!!"
Which graphics comparisons? The ones with versions that didn't even include the real ultra settings?

If the graphics comparisons are accurate, it's easy to understand why when looking at performance benchmarks.
I usually go through Nvidia and set adaptive v-sync to half refresh.

How do you lock it at 30 fps without it being juddery as f***?
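The judder usually comes from naive frame limiters that sleep a fixed amount each frame, letting timing error accumulate. A minimal sketch (not the driver's actual mechanism — just an illustration, with a hypothetical `FrameLimiter` class) of pacing against absolute deadlines instead, which keeps frame delivery even:

```python
import time

class FrameLimiter:
    """Caps a loop at a fixed frame rate by sleeping until absolute
    deadlines rather than sleeping a fixed interval each frame,
    which avoids accumulating drift (a common source of judder)."""

    def __init__(self, fps, now=time.perf_counter, sleep=time.sleep):
        self.interval = 1.0 / fps
        self._now = now      # injectable clock, so the logic is testable
        self._sleep = sleep
        self._next = self._now() + self.interval

    def wait(self):
        """Block until the next frame deadline; return that deadline."""
        remaining = self._next - self._now()
        if remaining > 0:
            self._sleep(remaining)
        deadline = self._next
        # Schedule from the deadline, not from "now": if one frame runs
        # long, the next deadlines stay on the original 33.3 ms grid.
        self._next += self.interval
        return deadline

# Usage: call wait() once per iteration of the render loop.
limiter = FrameLimiter(30)
for _ in range(3):
    limiter.wait()  # paces the loop to ~33.3 ms per frame
```

External limiters like RTSS work along these lines, which is why they tend to produce more even frame pacing than a plain in-game cap.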
The files also explicitly said MSAA is not currently supported; it requires engine-level changes, as they don't currently support any hardware AA solution. Highly doubt they are making new engine features at this point during crunch time.
According to this chart, it's 3 frames.
10-15 frames...? Wait what? Can someone confirm? Or is it sarcasm. I really don't know :/
I feel pretty confident you can hit a nice, relatively stable 60fps.
No need to play with ultra settings. Playing with high should catapult you above 60 easily.
Yeah but HBAO+ usually looks great.
What about SMAA?
Wow. I was staring at the same chart for minutes and still didn't pick it up. I even quoted the damn chart.

According to this chart, it's 3 frames.
I'm tired dammit!

The chart is on this very page (well, the previous). You even quoted it! On a 960 HBAO+ is even faster than the default AO!
Which graphics comparisons? The ones with versions that didn't even include the real ultra settings?
Game doesn't support it out of the box but via SweetFX? That may still be possible.
The Witcher 3 is in the home stretch; CD Projekt Red has supplied us with a quasi-final version. "Quasi" because, although the Day-1 patch is already included in our version, the final adjustments are not — some of which, allegedly, the Game Ready driver announced by Nvidia for The Witcher 3 is supposed to handle. We cannot guarantee 100 percent that the graphics shown below correspond exactly to what users will be playing at 1 o'clock in the morning on Tuesday. We ask for your understanding, because unfortunately we don't know more ourselves.
Wow. I was staring at the same chart for minutes and still didn't pick it up. I even quoted the damn chart.
Does the game support any AA solution? D:
I have a bad feeling AMD owners are going to get the short end of the stick for this one.

290X user here.
Doesn't the game unlock soon for certain regions? Really wanna see what I can expect with an r9 290.
Only post-process. Equivalent to FXAA, plus some in-house temporal AA solution.
Dat chart. Glad I went with a R9 280X over a GTX 960 when I changed my GPU. If I lower some settings I'm sure I'll be able to scrape by at 30 FPS at 1080p :lol.
Alright, I'll check out what they have to offer. At any rate, the 4K DSR will add some AA support too.
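The reason DSR acts as AA: the game renders at 4K, then the image is filtered down to the display resolution, and averaging several samples per output pixel smooths jagged edges. A minimal sketch of that idea (a plain 2x2 box filter on a grayscale grid — DSR's actual filter is a tunable Gaussian, so this is only an illustration):

```python
def downsample_2x(pixels):
    """Box-filter a 2x-supersampled grayscale image to half size:
    each output pixel is the average of a 2x2 block of input samples.
    This averaging is what softens stair-stepped edges when rendering
    at 4K and displaying at 1080p (supersampling as anti-aliasing)."""
    h, w = len(pixels), len(pixels[0])
    out = []
    for y in range(0, h, 2):
        row = []
        for x in range(0, w, 2):
            total = (pixels[y][x] + pixels[y][x + 1]
                     + pixels[y + 1][x] + pixels[y + 1][x + 1])
            row.append(total / 4)
        out.append(row)
    return out

# A hard black/white diagonal edge at the supersampled resolution:
img = [[0, 0, 255, 255],
       [0, 0, 255, 255],
       [0, 255, 255, 255],
       [0, 255, 255, 255]]
# After downsampling, the pixel straddling the edge gets an
# intermediate value (127.5) instead of a harsh 0-or-255 step.
result = downsample_2x(img)
```

Same principle whether the downscale comes from DSR or from setting a custom 4K resolution by hand.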
So I am curious about something. Only followed the downgrade thread and a few others casually. With the performance graphs we are seeing, does it confirm that it won't look identical to PS4 or is the game that horribly unoptimized?
Well, a slightly OC'd 960 (or at least a version with a bit more factory OC than that Zotac) seems perfectly capable of running it at max details/30 fps. In this game the card seems to perform slightly better than the 770, which is roughly on par with a 280X — but who knows, maybe the AMD card fares better here. We'll know for sure when they add it to the bench.
Ok. I may be wrong, but the way I understand it is that HairWorks is DirectCompute-based <snip>
I thought they said they were using some in-house TXAA they developed?
The game was updated with better LOD, more grass, etc. in the 1.02 build. On the older build (info in the downgrade thread) that was the ultra setting; those ultra settings are now the very high setting in the new build, and the new, better LOD/grass used in this benchmark is the new Ultra setting.

So I am curious about something. Only followed the downgrade thread and a few others casually. With the performance graphs we are seeing, does it confirm that it won't look identical to PS4 or is the game that horribly unoptimized?
With the performance graphs we are seeing, does it confirm that it won't look identical to PS4 or is the game that horribly unoptimized?
You are going to use 4k DSR for this game? Do you have a single card or SLI? Because Nvidia doesn't support DSR at 4k for SLI setups. It will not utilize one of your cards.
The game was updated with better LOD, more grass, etc. in the 1.02 build. On the older build (info in the downgrade thread) that was the ultra setting; however, those settings are now the very high setting, and the new LOD/grass (used in this benchmark) is the new Ultra setting.
So the pics etc from the downgrade thread based on this new build would be the very high setting.
PS4 = high setting.
Old build ultra = very high setting.
New build ultra = ultra.
Hmm. I have SLI.
Could have sworn I used DSR and SLI together.
If anything I can make a custom resolution for 4K.