Witcher 3 downgrade arguments in here and nowhere else

Dude, I'm aware of some people downplaying the PC version, but I've never seen anyone saying the PS4 version is a generation above the XB1 version. That's beyond hyperbole. And to be fair, my post had nothing to do with the PC version, so why bring it up?

I'm... not sure if you're joking or just don't understand the use of hyperbole to make a point.

I don't know the precise arguments they'll make, since I can't see into the future, but my point is that the same people downplaying the PC version here will be exaggerating the differences between the PS4 and Xbone. I wouldn't rule out PS2 levels of exaggeration either. Fanboys can surprise you.
 
Yes, he really did a good job, but I wanted to see it for myself again. When the game is out, someone should record the exact same 35 minutes of gameplay on Ultra with Nvidia GameWorks on. Then everybody will see that they reduced a lot and that their PR is becoming disgusting.

That is the plan. Even though a full 1:1 recording is ideal, it's way too much work. I will try to provide 1:1 screen comparisons at the very least. As to what the conclusion would be, we shall see...
 
Umm, isn't the Xbox One version 900p?

Dynamic resolution, so it jumps between 900p and 1080p. I kinda wish this was an option on PC as well, just to see how it looks; if it ain't something you really notice while playing, I'd be cool with having it for a smoother framerate.
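For the curious, the basic idea is simple enough to sketch. This is a hypothetical toy in C++, not how the console build actually decides anything: pickRenderHeight, the 33.3 ms (30 fps) budget and the "frame time scales with pixel count" model are all my own assumptions.

#include <algorithm>
#include <cmath>

// Toy dynamic-resolution pick. At a fixed aspect ratio the pixel count
// grows with the square of the height, so scale the height by the square
// root of the budget ratio.
int pickRenderHeight(double lastGpuFrameMs, double targetMs = 33.3)
{
    double scale = std::sqrt(targetMs / lastGpuFrameMs); // < 1 when over budget
    scale = std::clamp(scale, 900.0 / 1080.0, 1.0);      // stay within 900p..1080p
    return static_cast<int>(1080.0 * scale + 0.5);
}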
 
Dynamic resolution, so it jumps between 900p and 1080p. I kinda wish this was an option on PC as well, just to see how it looks; if it ain't something you really notice while playing, I'd be cool with having it for a smoother framerate.
It would look horrifying.
 
X1 captures look good:

[two Xbox One screenshots]
 
Dynamic resolution, so it jumps between 900p and 1080p. I kinda wish this was an option on PC as well, just to see how it looks; if it ain't something you really notice while playing, I'd be cool with having it for a smoother framerate.

Most games on most gamers' rigs these days are CPU bound, so it wouldn't make a difference performance-wise.
 
Most games on most gamers' rigs these days are CPU bound, so it wouldn't make a difference performance-wise.

Really? Most games? Most recent games I've played can have GPU usage at 99%; there's no way for me to get my CPU usage that high with any of the options - it mostly sits at around 50%. 780 Ti SC / 4770K@4.4
 
There's definitely a downgrade, but the game still manages to look fantastic. Have fun playing it on Tuesday, y'all! (Or, for the ones playing it already: damn.)
 
Really? Most games? Most recent games I've played can have GPU usage at 99%; there's no way for me to get my CPU usage that high with any of the options - it mostly sits at around 50%. 780 Ti SC / 4770K@4.4

Then I have to ask you: are you playing every title at 4K, max details and (most importantly) without v-sync?

Also, I said for most gamers. Most gamers, unlike us higher(ish)-end guys, do not have a hyper-threaded quad core, let alone one at 4.4 GHz. Look around in just the Witcher 3 threads on GAF; you'll see people everywhere with 2500Ks. There was one asking about his i5 750. And we all know what the Steam surveys look like.
Yes, I stand by that point: most games for most gamers are CPU bound. Mind you, I don't mean EVERY game for EVERY gamer.

Also, I can see the appeal of this auto-resolution function, but I don't think it would help a lot on PC.

Edit: Ironically, I have to add that this probably won't apply to Wild Hunt, since it's supposed to be very easy on the CPU.
 
Most games on most gamers' rigs these days are CPU bound, so it wouldn't make a difference performance-wise.

That's just not true at all. That would mean that no one would get any improvements by getting a new GPU, which isn't the case.

A better CPU will of course help with some stuff, but it's not the most limiting factor when it comes to performance.
 
A little over twice would be more than $800, though.

Well, actually, for me specifically, the cost of turning my PC into a gaming one is about the same as a console.

I'm a programmer, so I'm going to need a decent PC for my work anyway. The only thing I really require is a GPU, and even a high-end one is comparable in price to a PS4.

So, I ask myself: Self, do I spend $400 on a closed platform that only plays games at what I consider mediocre settings (30 FPS, medium-high graphics, etc.), a platform that I can never upgrade, so while today it's medium-high, in two years it will be low; where I have to pay to play online; where games are all $60; and where I don't have the option to play some games with a mouse and keyboard like I prefer...

Or do I just spend roughly the same amount on a GPU that smokes a PS4 and that I can always upgrade later on down the line if I so desire?

The answer, for me, was easy.
 
That's just not true at all. That would mean that no one would get any improvements by getting a new GPU, which isn't the case.

A better CPU will of course help with some stuff, but it's not the most limiting factor when it comes to performance.

Please see my post above. I'm experiencing this first-hand: on a 2500K, having only four threads makes THE difference in Watch Dogs, ACU, Dying Light, GTA V and Far Cry 4 (off the top of my head).
Gamers who invested in an i7 or similar are pretty much set, that is true. But even then you'll be bound in games like ACU (because of DX11 limitations).
Go to a demanding scene and set your res to 720p or lower. It won't matter; you'll get the same framerate, and that's what I was getting at. Of course, you need a top-notch card.

Edit: I feel like I didn't get my point across correctly because I forgot to mention an important detail: these days, if you HAVE a good card (290/970) and you are struggling for framerate (I usually play with vsync at 60 Hz), it's almost always because of the CPU or DX11 limitations/CPU overhead.
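If anyone wants to run that resolution-drop test in a reproducible way, here's a minimal timing harness (vsync off, as noted above). renderFrame, the frame counts and the 1 ms threshold are all my own placeholder choices, not anything from a real engine.

#include <chrono>
#include <cstdio>

// Average frame time of the same scene at a given resolution.
// renderFrame is a stand-in for whatever your engine's frame loop does.
double averageFrameMs(void (*renderFrame)(int w, int h), int w, int h, int frames = 200)
{
    auto t0 = std::chrono::steady_clock::now();
    for (int i = 0; i < frames; ++i)
        renderFrame(w, h);
    auto t1 = std::chrono::steady_clock::now();
    return std::chrono::duration<double, std::milli>(t1 - t0).count() / frames;
}

void compareBottleneck(void (*renderFrame)(int, int))
{
    double native = averageFrameMs(renderFrame, 1920, 1080);
    double low    = averageFrameMs(renderFrame, 1280, 720);
    // If cutting the pixel count by more than half barely moves the frame
    // time, the GPU was never the limit: you are CPU bound.
    std::printf("1080p: %.1f ms, 720p: %.1f ms -> %s bound\n",
                native, low, (native - low) < 1.0 ? "CPU" : "GPU");
}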
 
Dynamic resolution, so it jumps between 900 and 1080p. I kinda wish this was an option on PC as well, just to see how it looks, if it aint something you really notice while playing I'd be cool with having it for a smoother framerate

Wait, what?

Holy goodness, no. If your rig was struggling to hold a framerate, why wouldn't you just cap it and keep it steady?
 
Also, I said for most gamers. Most gamers, unlike us higher(ish)-end guys, do not have a hyper-threaded quad core, let alone one at 4.4 GHz. Look around in just the Witcher 3 threads on GAF; you'll see people everywhere with 2500Ks. There was one asking about his i5 750. And we all know what the Steam surveys look like.
Yes, I stand by that point: most games for most gamers are CPU bound. Mind you, I don't mean EVERY game for EVERY gamer.

The reason that so many people use older CPUs is precisely because, for most games, going to a newer/faster CPU doesn't increase performance much (or at least not enough to justify it over getting a better GPU).
 
It's weird to see CDPR go from being hailed around here as the saviour of PC gaming and one of the more honest devs out there, to downgrades and PR bullshit, all in the last week, haha.

Not saying this as a personal opinion or anything; the mood towards them just seems to have shifted a bit around GAF lately. I'm still excited for the game, but it IS a bit worrying to see how similar the PS4 and PC footage looks in comparisons. Time will tell, I guess.

That's what happens when these marketing deals are in place, and they aren't allowed to be transparent. They could have released a video and quelled everyone's fears, but nope.
 
Dynamic resolution, so it jumps between 900p and 1080p. I kinda wish this was an option on PC as well, just to see how it looks; if it ain't something you really notice while playing, I'd be cool with having it for a smoother framerate.

Assuming you play on a monitor, you will notice for sure. But why would such a feature need to be implemented on PC? It's as simple as going into the options menu and changing it, lol.
 
Wait, what?

Holy goodness, no. If your rig was struggling to hold a framerate, why wouldn't you just cap it and keep it steady?

It's probably good for smoothing out the bumps.

For instance, you might not be able to run Ultra settings at 1080p without dips to the low 50s/high 40s, and those dips might only occur when there are a lot of effects going on (like particles) or a particularly far-reaching view.

I think dynamic scaling of a game's resolution is a better solution than locking to a lower resolution than you're capable of 90% of the time, or just accepting the dips.
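To make that concrete, here's a rough sketch of the kind of controller that could do the smoothing. All the names, thresholds and resolution steps are invented for illustration; the point is the moving average plus hysteresis, so the resolution doesn't flicker up and down every frame.

#include <cstddef>
#include <deque>
#include <numeric>
#include <vector>

// Hypothetical dynamic-resolution controller: react to a short moving
// average of frame times and step between a few fixed render heights.
class DynamicResController {
    std::deque<double> window_;                        // last N frame times (ms)
    std::vector<int>   heights_{1080, 1008, 936, 900}; // allowed render heights
    std::size_t        level_ = 0;                     // current index into heights_

public:
    int update(double frameMs, double targetMs = 16.7)
    {
        window_.push_back(frameMs);
        if (window_.size() > 30)
            window_.pop_front();
        double avg = std::accumulate(window_.begin(), window_.end(), 0.0)
                     / window_.size();

        // Step down quickly when over budget, climb back only when
        // comfortably under it (hysteresis avoids ping-ponging).
        if (avg > targetMs * 1.05 && level_ + 1 < heights_.size())
            ++level_;
        else if (avg < targetMs * 0.85 && level_ > 0)
            --level_;
        return heights_[level_];
    }
};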
 
It's probably good for smoothing out the bumps.

For instance, you might not be able to run Ultra settings at 1080p without dips to the low 50s/high 40s, and those dips might only occur when there are a lot of effects going on (like particles) or a particularly far-reaching view.

I think dynamic scaling of a game's resolution is a better solution than locking to a lower resolution than you're capable of 90% of the time, or just accepting the dips.

Ah, that makes sense. If you're only dropping a few frames in heavy areas, maybe it would work well instead of hard-capping at a lower rate.
 
The reason that so many people use older CPUs is precisely because, for most games, going to a newer/faster CPU doesn't increase performance much (or at least enough to justify it over getting a better GPU.)

Just one example, a quote from PCGH on Dying Light 1.2.1:
"The performance of top cards like the R9 290(X) is still thwarted by the CPU."
This was even with a 4790K@4.5.
http://www.pcgameshardware.de/Dying-Light-PC-257307/Specials/Technik-Test-Benchmarks-1149149/

Edit: I feel like this is going way off-topic. Feel free to PM me if you want to discuss this further.
 
Dynamic resolution, so it jumps between 900 and 1080p. I kinda wish this was an option on PC as well, just to see how it looks, if it aint something you really notice while playing I'd be cool with having it for a smoother framerate

How long did it take until people realized that Killzone: Shadow Fall used this technique in multiplayer? Months, I believe. You don't really see a difference as long as the camera is moving.

I wish developers would use more cheats like these to get the most out of the hardware. I'm still waiting for games that use the 60 fps framerate upscaler that was demoed years ago.
 
The reason that so many people use older CPUs is precisely because, for most games, going to a newer/faster CPU doesn't increase performance much (or at least enough to justify it over getting a better GPU.)

On top of that, CPUs don't become useless nearly as quickly. A 5-year-old GPU is much more outdated (power-wise) than a 5-year-old CPU.
 
How long did it take until people realized that Killzone: Shadow Fall used this technique in multiplayer? Months, I believe. You don't really see a difference as long as the camera is moving.

I wish developers would use more cheats like these to get the most out of the hardware. I'm still waiting for games that use the 60 fps framerate upscaler that was demoed years ago.

Didn't a lawsuit result from that?

Lawsuit aside, though: doing something like this on a closed platform connected to a TV, where you know what the target resolution is, is one thing.

For PCs, though? This makes no sense. If you're advocating for this in the name of a smoother gameplay experience, I have a much, much better solution for you: it's called G-Sync.
 
Funny thing. I glanced at GAF a few days back and saw this topic next to the one about "remember RDR?", and thought "hell, I'd take RDR as-is over how W3 makes me feel". The initial W3 promo pictures made me imagine RDR with a Witcher setup: you and your horse, until you see something interesting, put a heel into your mount and speed into some rough shit.

I remain utterly disappointed, and to celebrate the launch I will probably cancel my CE preorder.

Feckin' CDPR.

At least I stumbled into Pillars of Eternity last week.

I'm confused. Did they remove the ability to ride a horse and do sidequests? That's some downgrade, if so.
 
I'd like the option for dynamic resolution, sure.

4K during slow moments to take in the sights; heavy combat takes it down to 1440p.

I'd be alright with that, depending on how nice and smooth it is, and I guess on the type of game. I'd be happy with it in this one, I think.

I liked how it was done in WipEout 2048. Framedrops just aren't acceptable there; the game is already going so fast that you need to rely a bit on muscle memory anyway (and a bit less on what you're seeing).
 
I'd like the option for dynamic resolution, sure.

4K during slow moments to take in the sights; heavy combat takes it down to 1440p.
Yeah, it would be nice to be able to smooth out frametimes, especially if they can divorce the HUD elements' resolution from the gameplay. Maybe you could select the minimum resolution it could drop to; more graphics options are always appreciated.
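For what it's worth, keeping the HUD at native resolution while scaling the 3D scene is straightforward in principle. Here's a rough sketch in plain OpenGL; drawScene and drawHud are assumed stand-ins, the offscreen framebuffer setup is omitted, and this is illustrative rather than any shipping engine's code.

#include <GL/glew.h> // any GL function loader works; GLEW assumed here

void drawScene(); // assumed: renders the 3D world into the current viewport
void drawHud();   // assumed: renders UI sized to the current viewport

void renderFrame(GLuint sceneFbo, int renderW, int renderH, int nativeW, int nativeH)
{
    // 1) Render the scene offscreen at the (possibly reduced) resolution.
    glBindFramebuffer(GL_FRAMEBUFFER, sceneFbo);
    glViewport(0, 0, renderW, renderH);
    drawScene();

    // 2) Stretch the scene to the full window with bilinear filtering.
    glBindFramebuffer(GL_READ_FRAMEBUFFER, sceneFbo);
    glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);
    glBlitFramebuffer(0, 0, renderW, renderH,
                      0, 0, nativeW, nativeH,
                      GL_COLOR_BUFFER_BIT, GL_LINEAR);

    // 3) Draw the HUD on top at native resolution, so text stays sharp.
    glBindFramebuffer(GL_FRAMEBUFFER, 0);
    glViewport(0, 0, nativeW, nativeH);
    drawHud();
}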
 
How long did it take until people realized that Killzone: Shadow Fall used this technique in multiplayer? Months, I believe. You don't really see a difference as long as the camera is moving.
Shadow Fall uses quite a different technique that goes beyond normal upscaling.
Dynamic resolution simply means: at 1080p, no scaling; everything below that is scaled.
 
But how much more manpower, time and money would need to be dedicated so the whole game could be upgraded to, e.g., the VGX 2013 asset level and optimised?

Modders can do all kinds of crazy shit because they make a mod just for, e.g., super-realistic hair modelling, and don't need to care how it affects performance overall; the hair will just look very, very good.

Oh, I can agree with you. I understand the business reasons behind the downgrade. I still wouldn't be surprised to see CDPR revisit the game after release with a huge upgrade patch. I think we can also look forward to some great mod work that will make the game look downright gorgeous. I don't think the visual situation will be permanent either way.
 
My sister got the game for PS4, and I just started watching her play. This game does not look good at all on PS4: the vegetation is really low quality, and distant textures look barren and really cheap. She is only in the early open environment, but it looks real bad so far. After seeing the PS4 version, I doubt the PC version will look anything like it; the PC can't be close to this.
 
Yeah, it would be nice to be able to smooth out frametimes, especially if they can divorce the HUD elements' resolution from the gameplay. Maybe you could select the minimum resolution it could drop to; more graphics options are always appreciated.
Firefall has a feature like this. I don't know exactly which engine the game uses, but it seems to be a heavily tweaked Offset Engine (almost entirely rewritten).
 