The Witcher 3 runs at 1080p ULTRA ~60 fps on a 980

Is this "I must be able to ultra everything or its bad optimization" a new trend? I remember when ultra was supposed to be extreme in order to allow the game to look better with future hardware

In the past, developers used much more powerful PCs than most people had. Right now, if computer gaming is your hobby, every time you upgrade your PC in a major way you end up pretty close to the environment they are using. It's one of the reasons computer gaming has gotten bigger: parts pricing has gone down as more and more people around the world have computers/laptops.
I don't know, but back in the day a game on PC would look one way on my computer and completely different in a video, due to the limitations of my particular rig. I've had my current computer for 5 years and I'm in the process of investing in a new one. If I build a PC capable of running The Witcher at the highest resolution, I think it will be prepared to face the next 2 years without needing any major upgrades. It's a game I've looked forward to since the original, although personally I would have loved for CD Projekt to work on Cyberpunk 2077 instead.
 
Why are some people so obsessed with 4K?

Wouldn't you rather run the game at Ultra @1080P?

IMO settings trump resolution any day.

What about people who have a 4K display? Doesn't playing at less than your display's native resolution usually create ugly upscaling artifacts?

What Durante is saying is that downsampling is really inefficient compared to proper supersampling.

What's the difference? I thought they were different names for the same thing.
 
I don't know when I should start counting; my PC is not pre-built. It's a machine I've continually upgraded ever since 2010. I couldn't tell you how much I spent because I would need to remember every part I bought and sold.

What I do know, however, is that it's well worth it and I'll keep doing it. You see, cost-effectiveness is not arbitrarily defined. No doubt I've spent more than a PS4 or Xbox One costs, but when I look at the result before my very eyes I really can't complain.

Selling my PS4/XBO was the best decision I could have ever made. ;)

I'm happy for you. I just don't see how spending hundreds or even thousands of dollars since 2010 has any impact on how I or anyone else enjoys a game. I actually asked the question legitimately, to see whether I wanted to go in on a gaming PC as a console player, but I never get a straight answer. Yeah, there are separate settings, lower this or that, but that's not the point. From what I read on GAF and other sites, PC gaming is the end-all, be-all of the gaming experience, and that's why I ask you: how much do you have in your PC right now to enjoy games the way you do at this moment?
 
I'm happy for you. I just don't see how spending hundreds or even thousands of dollars since 2010 has any impact on how I or anyone else enjoys a game. I actually asked the question legitimately, to see whether I wanted to go in on a gaming PC as a console player, but I never get a straight answer. Yeah, there are separate settings, lower this or that, but that's not the point. From what I read on GAF and other sites, PC gaming is the end-all, be-all of the gaming experience, and that's why I ask you: how much do you have in your PC right now to enjoy games the way you do at this moment?

You are never going to get a straight answer because PCs are so scalable to the hardware and the graphics/framerates you desire. Everybody has a price point they are willing to spend to enjoy the games they want at the specs they want. It all comes down to personal preference on the PC.

The problem is, PC gaming talk has devolved into playing games at the absolute highest settings regardless of cost or the performance-to-graphics ratio.
 
I'm happy for you. I just don't see how spending hundreds or even thousands of dollars since 2010 has any impact on how I or anyone else enjoys a game. I actually asked the question legitimately, to see whether I wanted to go in on a gaming PC as a console player, but I never get a straight answer. Yeah, there are separate settings, lower this or that, but that's not the point. From what I read on GAF and other sites, PC gaming is the end-all, be-all of the gaming experience, and that's why I ask you: how much do you have in your PC right now to enjoy games the way you do at this moment?

This isn't a useful question to ask. How much you enjoy whatever experience on PC for x price is completely variable depending on the person. Bad questions get bad answers.
 
only on non-integer multiples.

Has this ever actually been confirmed? I see lots of people assume it, but never a confirmation. Wouldn't it require the upscaling algorithm to actually check that it can scale by a whole-number factor before it performs the upscale? Otherwise, wouldn't all the artifacts still be there? Anyone ever run 1080p on a 4K monitor?
 
Today I learned!

Is that also why DSR settings like 1.50x, 1.78x, or 2.25x look bad?

Yes, that certainly is a factor. The reason integer 1080p-to-4K scaling works well is that one 1080p pixel upscales to exactly four 4K pixels (a 2×2 block), so no interpolation is required.
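To put a rough sketch under that (my own illustration, not from the thread): with an exact 2x-per-axis factor the upscale is pure pixel duplication, so nothing gets blended. It also covers the DSR question above, since as far as I know DSR factors are total-pixel multipliers: 4.00x is a clean 2x per axis, while 1.78x and 2.25x are roughly 1.33x and 1.5x per axis, which forces smoothing.

```python
# Rough illustration (mine, not from the post); numpy used only for the demo.
import numpy as np

frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)  # stand-in for a rendered 1080p frame

# Integer 2x-per-axis upscale: each source pixel becomes an exact 2x2 block of
# identical pixels, so the 4K image needs no interpolation at all.
frame_4k = frame_1080p.repeat(2, axis=0).repeat(2, axis=1)
print(frame_4k.shape)  # (2160, 3840, 3) -- exact 4K

# A non-integer factor (e.g. 1.5x per axis) can't map source pixels onto whole
# destination pixels, so the scaler has to blend neighbours, and that blending
# is the blur people complain about.
```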
 
You are never going to get a straight answer because PC's are so scalable to your desired hardware and the graphic/framerates you desire. Everybody has a price point in which they are willing to spend to enjoy the games they want at the technical specs they have. It all comes down to personal preference on the pc.

The problem is, pc gaming talk has devolved into playing games at the utmost highest settings regardless of cost and performance ratio to graphics ratio.

This isn't a useful question to ask. How much you enjoy whatever experience on PC for x price is completely variable depending on the person. Bad questions get bad answers.

I want to enjoy playing games the same way he does. I have tons of disposable income and I'd like to pretty much copy exactly what he has because I've seen him post about how he likes his settings, etc. Nothing hard about that. I'm legitimately curious. I don't know why anyone skirts around the issue. Post your hardware and prices. Is that so hard?
 
I want to enjoy playing games the same way he does. I have tons of disposable income and I'd like to pretty much copy exactly what he has because I've seen him post about how he likes his settings, etc. Nothing hard about that. I'm legitimately curious. I don't know why anyone skirts around the issue. Post your hardware and prices. Is that so hard?
Dude, you're gettin' a Dell!
 
What's the difference? I thought they were different names for the same thing.

It's easy to visualize as a grid. Without SSAA/downsampling, imagine a normal grid. Every intersection is a pixel. What downsampling does is make the grid denser, so you have more pixels. But the pattern is the same.

But, if you could instead offset the extra samples so they're not aligned with the original grid, the samples spread out and it's harder for things to slip in between them. You get more coverage for the same number of samples. That's the idea behind what people mean when they say "SSAA." You are right that downsampling is a kind of supersampling, though.
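Here's one way to picture the difference in numbers (my own sketch, not from the post; the exact rotated-grid offsets vary by implementation):

```python
# Sample offsets inside one pixel, coordinates in [0, 1).

# 4x ordered grid -- what you effectively get from rendering at 2x2 the
# resolution and downsampling: the samples sit on a finer copy of the same grid.
ogss = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]

# 4x rotated grid -- the samples are offset so no two share a row or a column.
rgss = [(0.375, 0.125), (0.875, 0.375), (0.625, 0.875), (0.125, 0.625)]

# A thin, near-vertical edge can slip between the ordered grid's two x positions;
# the rotated grid covers four distinct x positions with the same four samples.
print(sorted({x for x, _ in ogss}))  # [0.25, 0.75]
print(sorted({x for x, _ in rgss}))  # [0.125, 0.375, 0.625, 0.875]
```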
 
I want to enjoy playing games the same way he does. I have tons of disposable income and I'd like to pretty much copy exactly what he has because I've seen him post about how he likes his settings, etc. Nothing hard about that. I'm legitimately curious. I don't know why anyone skirts around the issue. Post your hardware and prices. Is that so hard?

Well, there's something I used to say: if you're not going to go for $1,500 and up (from scratch, that is), wait until you can.
 
I want to enjoy playing games the same way he does. I have tons of disposable income and I'd like to pretty much copy exactly what he has because I've seen him post about how he likes his settings, etc. Nothing hard about that. I'm legitimately curious. I don't know why anyone skirts around the issue. Post your hardware and prices. Is that so hard?

This is what I have at the moment:
[image: screenshot of my system specs]


Needless to say, peripherals are not included. I have no idea about the price, but the balance must look very good since I resell my old parts.
I'm not using my "normal" monitor; I need to get my ASUS back.
 
Well, those are exactly my system specs. I have everything overclocked ~15% so that should be enough headroom for a constant 1080p 60fps.
 
I'm still on my i5-750; that's from 2009. I upgraded the RAM several years back, switched to an SSD (the HDD was from my old build), then got a larger, second SSD later. The biggest upgrade was dropping a 770 4GB in there. I also replaced the PSU, which was again from my old build. The case is from 2004 or something, so that's from the old build too.

When I upgrade it's just going to be CPU/Mobo/RAM, everything else is still cool.

If you upgrade parts of your rig every now and then you're pretty good.

I run Dying Light well, I'm playing Dark Souls well, and games like the Arkham series and Dishonored are maxed at 1080p/60fps locked.
 
Why are some people so obsessed with 4K?

Wouldn't you rather run the game at Ultra @1080P?

IMO settings trump resolution any day.

Display quality matters a lot too. I was caught up in the 4K hype until I saw LG's 1080P OLED - looks better than any 4K LCD I've seen despite the lower resolution (as long as you're not sitting super close). I'll wait until I can afford a 4K OLED.
 
Why are some people so obsessed with 4K?

Wouldn't you rather run the game at Ultra @1080P?

IMO settings trump resolution any day.

Because 4K literally holds four times the pixels of 1080p, which means much greater sharpness and image quality? Just take a stroll over to the PC Screenshot thread to see for yourself.
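(For the record: 3840 × 2160 = 8,294,400 pixels versus 1920 × 1080 = 2,073,600, exactly a 4:1 ratio.)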

Display quality matters a lot too. I was caught up in the 4K hype until I saw LG's 1080P OLED - looks better than any 4K LCD I've seen despite the lower resolution (as long as you're not sitting super close). I'll wait until I can afford a 4K OLED.

Of course panel quality matters a lot, but the 1080p OLED won't match the sharpness and quality of a comparable 4K panel. I can't imagine a 1080p OLED display actually looking better than a 4K IPS display showing input that's actually native 4K.
 
That's a... tall order. The 970 can run games at 4K, sure, but The Witcher 3 might be too much even one notch below max. And 3.5GB of VRAM will probably not suffice.

Probably. I don't mind having some settings on med as well.

Why are some people so obsessed with 4K?

Wouldn't you rather run the game at Ultra @1080P?

IMO settings trump resolution any day.

Have you actually played on a 4K display? Dropping some settings is easily preferable in order to run at 4K; it looks friggin' ridiculous. Plus, dropping (or lowering) things like AA doesn't matter at 4K because of the insane resolution (though I'm on a TV, so distance is also a factor).
 
In a 4chan thread somebody took the recent Kaer Morhen pic and added a sharpening filter, which actually made it look more similar to the reveal screenshots. That proves it's a big part of the difference between then and now.

Anyway, I took another look at the Xbox press conference footage and I actually think it looks great. And that was nine months of optimization ago.
 
720p looks like a blurry mess on my 1440p display.

1080p looks better than it. (I think this is what you are talking about, right?)

Yeah, I guess that should answer that question. It does appear that the scalers in monitors are absolute garbage, if that's the case. But you would need to compare it to a native 720p panel the same size as your 1440p one to determine which looks worse.
 
If the game looks incredible, it's understandable that the GPU can't do that; but this game looks average, so if a 980 can't run it at 60 FPS it's badly optimized. It's obvious, really.

It becomes wrong when that claim is based on a subjective perception of what's on screen (or simply the settings used) without considering the actual tech running and how it impacts the hardware.
 
Guys, let's say you had to choose between something like 4K 30fps with ~medium overall settings, or 1080p 60fps with ~ultra overall settings. Which would you choose? I know it's hard to say when we don't know what the difference in settings looks like yet, but just in general, what is your preference?
 
Guys, let's say you had to choose between something like 4K 30fps with ~medium overall settings, or 1080p 60fps with ~ultra overall settings. Which would you choose? I know it's hard to say when we don't know what the difference in settings looks like yet, but just in general, what is your preference?

Whatever matches your native screen resolution and viewing distance.
I'm playing from the sofa: 42" 720p plasma, 1080p Ultra settings, 8-foot viewing distance.

If you had the choice, I think 1080p Ultra is going to look better overall than 4K at medium settings, as a lot of detail is just gone at medium. You also notice glaring issues quicker at 4K.

And for me 30fps is not enjoyable.
 
Guys, let's say you had to choose between something like 4K 30fps with ~medium overall settings, or 1080p 60fps with ~ultra overall settings. Which would you choose? I know it's hard to say when we don't know what the difference in settings looks like yet, but just in general, what is your preference?

this

[image]
 
Whatever matches your native screen resolution and viewing distance.
I'm playing from the sofa: 42" 720p plasma, 1080p Ultra settings, 8-foot viewing distance.

If you had the choice, I think 1080p Ultra is going to look better overall than 4K at medium settings, as a lot of detail is just gone at medium. You also notice glaring issues quicker at 4K.

And for me 30fps is not enjoyable.
Yeah, the issue is I have a 4K monitor, but I always see people saying that you should play at your monitor's native resolution. Mine is 4K, but in a game like The Witcher 3 I'd probably end up getting around 30 fps on medium settings. I don't know if that would look better than something like 1080p 60fps Ultra.

Could anyone translate? I'm genuinely interested in what it's about haha.
 
I am happy with playing on 1080p HDTV with max settings at 60 fps. When a single graphics card can run 90% of 4K games at 60-100 fps at max settings, that's when I'll switch to a 4K TV.
 
I somehow missed this thread entirely.


I have a 3770k, 970 FTW Edition, and 16 gigs of ram.


Possible that I will be able to do a mix of Ultra and High at 4K@ 30fps?

That would be fantastic.

How much slower is a 970 FTW than a 980?

Edit: Looking at this 970 FTW review,
http://www.overclockers.com/evga-gtx970-ftw-graphics-card-review/

it looks like the 970 FTW is never more than 10 frames behind the 980, in any game, most of the time less.

I wonder then if I could actually manage all Ultra at half the framerate but 4K... Somehow doubt it, but my hope is now higher for a mix of Ultra/High 4K.
 
Just built this gaming rig specifically to play Witcher 3 at 1080p/60fps on Ultra

CPU: Intel Core i7-4790K 4.0GHz Quad-Core Processor
CPU Cooler: Corsair H100i GTX 70.7 CFM Liquid CPU Cooler
Motherboard: Gigabyte GA-Z97X-UD5H-BK ATX LGA1150 Motherboard
Memory: Kingston Fury Black Series 16GB (2 x 8GB) DDR3-1866 Memory
Storage: Kingston HyperX 3K 240GB 2.5" Solid State Drive
Storage: OCZ ARC 100 480GB 2.5" Solid State Drive
Video Card: EVGA GeForce GTX 980 4GB Superclocked ACX 2.0 Video Card
Case: NZXT H440 (Black/Red) ATX Mid Tower Case
Power Supply: Corsair Professional 750W 80+ Gold Certified Semi-Modular ATX
 
I somehow missed this thread entirely.


I have a 3770k, 970 FTW Edition, and 16 gigs of ram.


Possible that I will be able to do a mix of Ultra and High at 4K@ 30fps?

That would be fantastic.

How much slower is a 970 FTW than a 980?

Edit: Looking at this 970 FTW review,
http://www.overclockers.com/evga-gtx970-ftw-graphics-card-review/

it looks like the 970 FTW is never more than 10 frames behind the 980, in any game, most of the time less.

I wonder then if I could actually manage all Ultra at half the framerate but 4K... Somehow doubt it, but my hope is now higher for a mix of Ultra/High 4K.
I think it depends mostly on how much VRAM TW3 uses. Shading-wise, 30 fps @ 4K seems perhaps reasonable @ High. But it's hard to say, obviously.
 
I think it depends mostly on how much VRAM TW3 uses. Shading-wise, 30 fps @ 4K seems perhaps reasonable @ High. But it's hard to say, obviously.


Fair enough. Must be some crazy optimization, though, because of all the games I own, I feel like TW2 is the one that struggles the most at 4K. So seeing this run at such good frames is really impressive.


If I end up dropping some settings or running it at 1440p (whichever looks best) that's fine. I can always pick up another 970 FTW down the road and SLI this game into the ground.

I'll keep an eye out for deals.
 
I somehow missed this thread entirely.


I have a 3770k, 970 FTW Edition, and 16 gigs of ram.


Possible that I will be able to do a mix of Ultra and High at 4K@ 30fps?

That would be fantastic.

How much slower is a 970 FTW than a 980?

Edit: Looking at this 970 FTW review,
http://www.overclockers.com/evga-gtx970-ftw-graphics-card-review/

it looks like the 970 FTW is never more than 10 frames behind the 980, in any game, most of the time less.

I wonder then if I could actually manage all Ultra at half the framerate but 4K... Somehow doubt it, but my hope is now higher for a mix of Ultra/High 4K.

I doubt it.

When I was running SLI 980s, most games would still fluctuate between 30-60fps @ 4K.
With a single 970 you are really pushing it (but who knows, considering the game isn't out yet).
 
Does anybody regret buying a 970 now that you know you need a 980 to play on Ultra?



You need a 980 to play Ultra at 60fps solid.



A 970 will easily run it on Ultra at 30-40+ FPS.


My 970 FTW edition is within 10 frames of a stock 980 on pretty much every single benchmark.


So no.


Although I am considering buying a second 970 FTW+ just so my 4K options are more robust.
 
Guys, sort of an unrelated question, but if I run into VRAM issues, as in it gets maxed out, I'll end up with stuttering, right? If I get 30 or 40 locked, then that's not a VRAM issue, am I right in assuming so?
 
Titan X should knock it out of the park then. Can't wait.


Titan X, or better yet, dual Titan Xs, will make this game (and any other game, really) its bitch.


I don't even want to know the frames you could achieve at 1080p with dual Titan X's.
 
You need a 980 to play Ultra at 60fps solid.



A 970 will easily run it on Ultra at 30-40+ FPS.


My 970 FTW edition is within 10 frames of a stock 980 on pretty much every single benchmark.


So no.


Although I am considering buying a second 970 FTW+ just so my 4K options are more robust.

Considering you can OC a 970 to stock 980 performance, it should do a lot better than 30-40 FPS.
 