The Witcher 3 4K Patch for Xbox One X

Native 4K or checkerboard, plus improved AA I hope, because while it was fine outside cities there were plenty of jaggies in Novigrad and such. Who knows, maybe the patch brings additional visual improvements too. But if it's truly just called a 4K patch, probably not.
 
I am excite. Being one of the 20 people on this planet that hasn't played The Witcher 3, I'll be buying it on release day.
 
So it is already coming out for the PS4 Pro in the next few days? :D
I've been hoping and waiting so long for this, even though they said before there wouldn't be any patch, lol. I kept hoping.
Now please don't fuck up the downsampling part, alright? There will be many people going berserk if the patch is basically useless for Pro owners with 1080p TVs.
 
I am excite. Being one of the 20 people on this planet that hasn't played The Witcher 3, I'll be buying it on release day.

1.) Make sure you get the complete edition
2.) If you're not a fan of the movement, go into settings and turn on alternative movement. I almost sold my game until I found out about that.

Hopefully you enjoy it.
 
So it is already coming out for the PS4 Pro in the next few days? :D
I've been hoping and waiting so long for this, even though they said before there wouldn't be any patch, lol. I kept hoping.
Now please don't fuck up the downsampling part, alright? There will be many people going berserk if the patch is basically useless for Pro owners with 1080p TVs.

Yeah. Currently I'm on a 4K TV but might go back to 1080p in the future. I'd be pissed off if downscaling isn't there.
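For what it's worth, the downsampling people are asking for here is conceptually simple: render at 4K internally, then average blocks of pixels down to the 1080p output. A toy sketch of the idea in Python (a plain box filter over 2x2 blocks; whatever filter the console actually uses is unknown to me, this is just to illustrate the concept):

```python
# Downsample a "4K" image to half resolution by averaging each 2x2 block
# into one output pixel (simple box filter; real hardware may use a
# fancier filter, this only illustrates the idea).
def downsample_2x(img):
    h, w = len(img), len(img[0])
    return [[(img[2*y][2*x] + img[2*y][2*x+1] +
              img[2*y+1][2*x] + img[2*y+1][2*x+1]) / 4
             for x in range(w // 2)]
            for y in range(h // 2)]

hi = [[0, 4], [8, 4]]          # a 2x2 patch of luminance values from the "4K" buffer
print(downsample_2x(hi))       # -> [[4.0]]: four rendered samples averaged into one 1080p pixel
```

Each output pixel carries information from four rendered samples, which is why supersampled 1080p looks noticeably cleaner than native 1080p.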
 
Backed up by CDPR's community lead.


https://twitter.com/Marcin360/status/874137087938789376

 
A RAM disk is a virtual hard disk backed by RAM. You assign, for example, 10 GB of your RAM to it, and Windows will believe you have a new extra 10 GB HDD. The advantage is that RAM is still much faster than SSDs.
If you have a big amount of RAM and the game is small, you can load the entire game onto it. Otherwise, you can put a critical data file on it and use a symlink to make Windows believe it's still on C:\games\whatever.
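The symlink trick described above can be sketched like this (Python standing in for the manual steps; all paths here are made up for illustration, and the "RAM disk" directory would really be a mount created by a tool like ImDisk on Windows or tmpfs on Linux):

```python
import os
import tempfile

# Hypothetical layout for illustration only; ram_dir would really point at
# an actual RAM disk mount, not a normal temp directory.
base = tempfile.mkdtemp()
game_dir = os.path.join(base, "games", "whatever")
ram_dir = os.path.join(base, "ramdisk")
os.makedirs(game_dir)
os.makedirs(ram_dir)

# Pretend this is the critical data file shipped with the game.
data_file = os.path.join(game_dir, "content0.bundle")
with open(data_file, "w") as f:
    f.write("game data")

# Move the file onto the (RAM-backed) disk and leave a symlink behind,
# so the game still finds it at its original path.
ram_copy = os.path.join(ram_dir, "content0.bundle")
os.replace(data_file, ram_copy)
os.symlink(ram_copy, data_file)

with open(data_file) as f:       # reads now go through the symlink
    print(f.read())              # -> game data
```

The game never knows the file moved; every read at the old path is transparently served from the RAM-backed location.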

I remember RAM disks were great in the 8-bit era; I could use half the memory in my Atari as a RAM disk and it was much faster than floppy.

Can't help thinking it would be useless on PC now, as disk accesses are already cached/buffered in memory to speed things up. You'd also need to reinstall the game every time you restart your PC. So it sounds slightly mad.
 
Any chance of a performance mode? Runs like shit on my PS4P currently

The Pro patch should clean up any remaining performance issues, right?
It should now, shouldn't it? But many people will tell you it's locked 30 fps atm, when it's not.

Really looking forward to hearing the details. Native 4K or checkerboard with better AA and perhaps even extras? Speaking about X here, don't have a PS4.
You have an X already?

Will Xbox One X be 60 fps? I believe PS4 Pro doesn't have enough firepower.
If XBONEX has a 60fps mode so will the PS4 PRO, they have the same GPU, one is just downclocked a bit more from a reference RX 480.

This game does not require a beefy CPU, so a mix of high and med settings can give these mid-gen refreshes a nice enough performance mode at 1080p.

According to the DF test, W3 is almost locked 30 fps even on a non-Pro PS4 these days. So if you are having performance issues with a Pro, you should probably figure out what is wrong with your console.
There is nothing wrong with his console, Witcher 3 is not a locked 30fps on the PRO, it still drops the odd frame and cutscenes are definitely not locked. The expansions are not locked either....
 
It runs almost exactly like that on base PS4, that was the verdict of DF on that boost mode test. After all those patches, they managed to squeeze a quite stable 30 FPS out of the PS4 Amateur.
There were still drops on the base PS4, in the swamps for example, and also in the beginning training section when you run down those training course scaffolds (I've seen this myself), which are cleared up when you use boost mode on the Pro. Either way, the frame rate in the fully patched version was definitely not something I'd describe as shit, and if it has drops on the Pro with boost mode, they must be really rare.
 
Any idea if this update will decrease load times?
I doubt it; consoles have horrific loading times this gen, and the fact that it's not limited to one game or platform seems to indicate that they're here to stay. I'm assuming it's anti-piracy measures (HDD encryption, etc.) that are to blame, and they're probably not things that are likely to be changed. I don't think you can really expect any difference over Boost mode on the Pro (which already exploits the Pro's better I/O AFAIK).

It would be nice to be surprised, but I won't hold my breath.
 
I wonder what the chances are that they've implemented CBR into their engine for Cyberpunk and added it to the patch for Witcher.

I doubt it; consoles have horrific loading times this gen, and the fact that it's not limited to one game or platform seems to indicate that they're here to stay. I'm assuming it's anti-piracy measures (HDD encryption, etc.) that are to blame, and they're probably not things that are likely to be changed. I don't think you can really expect any difference over Boost mode on the Pro (which already exploits the Pro's better I/O AFAIK).

Slower decompression due to CPU limits, larger amounts of data to fill into memory, similar HDD speeds to last gen, etc. have all contributed to the load times seen this gen. I'm hoping we at least move over to 7200rpm drives next gen.
 
Slower decompression due to CPU limits, larger amounts of data to fill into memory, similar HDD speeds to last gen, etc. have all contributed to the load times seen this gen. I'm hoping we at least move over to 7200rpm drives next gen.
With equivalent hardware, we see PC versions enjoying huge gains over the consoles when it comes to load times though. It does seem to be something console specific (which is why I guessed it's the various anti-piracy measures, particularly the encrypted HDDs).
 
With equivalent hardware, we see PC versions enjoying huge gains over the consoles when it comes to load times though. It does seem to be something console specific (which is why I guessed it's the various anti-piracy measures, particularly the encrypted HDDs).

There aren't really any PCs out there kicking around with equivalent hardware. Even a Skylake i3 has far better IPC than what we see in consoles.

Much of the enthusiast market also keeps their most played games on SSDs which basically leads to 0 loading times when combined with a mid-high range CPU.
 
It should now, shouldn't it? But many people will tell you it's locked 30 fps atm, when it's not.

You have an X already?


If XBONEX has a 60fps mode so will the PS4 PRO, they have the same GPU, one is just downclocked a bit more from a reference RX 480.

This game does not require a beefy CPU, so a mix of high and med settings can give these mid-gen refreshes a nice enough performance mode at 1080p.


There is nothing wrong with his console, Witcher 3 is not a locked 30fps on the PRO, it still drops the odd frame and cutscenes are definitely not locked. The expansions are not locked either....

Eh, no? Why do you ask? I was merely speculating about what could be the case for Xbox One X with this game.

Also, shouldn't loading times definitely be shorter thanks to the One X anyway? They did say several things, like much better AF being forced and shorter loading times. Wonder how much shorter.

For the love of god, please turn off Points of Interest in the menu as soon as you can.

You can thank me later.

Why is that? On my first playthrough that I never finished I didn't turn it off.
 
The load times are already considerably reduced on the PS4 Pro, even before the boost mode was added.

Yeah, the load time improvements using boost mode in Witcher 3 are some of the best and most noticeable boost mode enhancements in any game on the Pro.

It feels like they are reduced by 50 to even 75%, depending on where you are and what you are loading.
 
With equivalent hardware, we see PC versions enjoying huge gains over the consoles when it comes to load times though. It does seem to be something console specific (which is why I guessed it's the various anti-piracy measures, particularly the encrypted HDDs).

Even a cheap $70 Pentium outperforms the Jaguar chips. Encryption may play a part, but I think the faster CPU, additional memory, and higher bandwidth on the PC likely contribute more to the increase in speed.

Eh, no? Why do you ask? I was merely speculating about what could be the case for Xbox One X with this game.

Also, shouldn't loading times definitely be shorter thanks to the One X anyway? They did say several things, like much better AF being forced and shorter loading times. Wonder how much shorter.

I wouldn't attempt to engage thelastword in a serious discussion. I'm convinced the mods keep him around for comic relief.

If they add higher resolution textures to the 1X version, load times may not be quicker. It also may not have better AF, though we may see an increase in texture filtering from the higher resolution alone. The system-level implementation of 16xAF and shorter load times only applies to unpatched games, so this wouldn't apply to Witcher 3.
 
My original playthrough of this game got interrupted and never continued, and I just got a Pro, so once this patch drops seems like a great time to jump back in (or start over, more likely).
 
No, 2160p CBR has the same amount of pixels as 2160p native.

Sure, the pixels of the final output image, but we all know leaving it *at that* would be dishonest. There is a reason why there is a differentiation between a native 3840x2160 image and a 3840x2160 image drawn by some CBR-like technique. One is more accurate, more detailed, and best reflects what the *actual* pixels in the current frame are.
 
No, 2160p CBR has the same amount of pixels as 2160p native.

For CBR, half of the pixels are rendered per frame to construct a 2160p image. The other half is reused from the previous frame. In my book it's a render path with half the pixels of native 4K. Don't even try to deny that!
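A toy sketch of the half-per-frame idea described above (grossly simplified: real checkerboard implementations reproject the previous frame's samples using motion vectors and, on the Pro, the ID buffer, rather than just holding them in place):

```python
# Toy model of checkerboard rendering on a 4x4 "screen". Not a real
# renderer: it only shows the alternating half-per-frame pattern.
W, H = 4, 4

def cbr_frame(frame_idx, shade, prev):
    """Shade only this frame's half of the checkerboard; take the rest from prev."""
    out, fresh = {}, 0
    for y in range(H):
        for x in range(W):
            if (x + y) % 2 == frame_idx % 2:       # this frame's half of the pattern
                out[(x, y)] = shade(x, y)          # freshly shaded pixel
                fresh += 1
            else:                                  # hole: fill from the previous frame
                out[(x, y)] = prev.get((x, y)) if prev else None
    return out, fresh

shade = lambda x, y: (x + y) % 255                 # stand-in for real shading work
f0, n0 = cbr_frame(0, shade, None)                 # first frame: only half the pixels exist
f1, n1 = cbr_frame(1, shade, f0)                   # second frame: full 4x4 image
print(n0, n1)                                      # 8 8 -> only half shaded per frame
print(all(v is not None for v in f1.values()))     # True -> reconstructed image is complete
```

So every output pixel exists, but only half of them were shaded this frame, which is exactly why the two sides of this argument keep talking past each other.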
 
What? This is so wrong. XBX has a massive GPU advantage, as well as bandwidth. It's the CPU that is similar.
What is so wrong? Both GPUs are based on the RX 480... As a matter of fact, the PRO has some unique features that help it with rendering, like CBR hardware, RPM and the ID buffer... but on a base level they're both based on the Polaris GPU...

It also makes no sense to speak of bandwidth without presenting the whole picture, because the PS4 has a 20 GBPS interconnect between CPU and GPU... CPU to memory on top of the features mentioned above (ID buffer, RPM, CBR hardware)... all of that would maximize efficiency on render time... So real world results? If all of these features are used, it should prove very interesting...

Of course I don't expect this Witcher Patch to make use of FP16, but hey, they could use geometry rendering for their performance mode to maximize fps there, so they have many ways to meet their target on the PRO at least...

Eh, no? Why do you ask? I was merely speculating about what could be the case for Xbox One X with this game.

Also, shouldn't loading times definitely be shorter thanks to the One X anyway? They did say several things, like much better AF being forced and shorter loading times. Wonder how much shorter.
Tbh, these load times should never have been so awful on the consoles... I've seen Witcher tested on PC APUs with weaker GPUs than what was in the vanilla PS4, and load times were excellent...

But yes, load times, DD, AF and general graphics settings should get a huge boost on top of framerate... I did say should, because I really don't know what this dev will bring to the table until the patch is live... So I'm looking forward to seeing if they improve console settings to PC high-ultra standards... at the very least.
 
I don't know what a RAM disk is. Would it help?

My hardware is excellent, but yeah. I always figured it was a CPU load issue.

Perhaps. A RAM disk means running all the game's files from main system RAM, so instead of paging your HDD or SSD, which have seek times, it would be working within RAM, thus massively reducing the intensity and perhaps the occurrence of stutter.

Though I am not sure what the cause is, hardware- or software-wise, why upping those config variables starts a stutter. It does the same on my PC. I never tried a RAM disk though, as the game is fucking huge; you would need 48 GB or 64 GB of RAM.
 
For CBR, half of the pixels are rendered per frame to construct a 2160p image. The other half is reused from the previous frame. In my book it's a render path with half the pixels of native 4K. Don't even try to deny that!

It's not as simple as that either; it's not just an old frame slapped together with the new frame to make a full frame.

Every pixel on the screen is rendered, but some use data from the older frame to help with the rendering.
 
It's not as simple as that either; it's not just an old frame slapped together with the new frame to make a full frame.

Every pixel on the screen is rendered, but some use data from the older frame to help with the rendering.

Lol, half the pixels use the previous frame. It's nothing like rendering at native res, where the entire scene is rendered at that res in one pass in real time. Using the previous frame to fill in missing pixels is not the same. Not saying checkerboarding is bad, it's really good, but it's nothing compared to native res technically.
 
Because you'll just be running from icon to icon rather than exploring the world organically.

Good point for sure, because I indeed kept going after those icons I saw on the minimap. Kinda like "oh hey, another one, let's go check it out."

It's much cooler to actually just explore around and out of the blue a new quest is triggered. I'll make sure to turn them off when I begin my new playthrough come November.

@thelastword

As amazing as this game was, those loading times really ruined it at times, and they made me change the difficulty from Blood and Broken Bones to something easier. I love to explore, and that means I could meet a very powerful enemy when I least expect it. I like a challenge and I don't mind dying and having to retry, but the loading times each time were killing me.
 
It's not as simple as that either; it's not just an old frame slapped together with the new frame to make a full frame.

Every pixel on the screen is rendered, but some use data from the older frame to help with the rendering.

The point of the original statement....

2160p CBR is just 2.9x the pixels, should be doable, or they can go native 1800p and then upscale.

...was to compare PERFORMANCE. And as far as performance is concerned, CBR 4K is not the same as native 4K. Regardless of how many pixels are presented to the display. That is irrelevant.
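For reference, the raw pixel arithmetic behind the quoted figures. The "2.9x" only works out against the Xbox One version's typical 1600x900 output, which is my assumption about the base being used here:

```python
# Pixel counts involved in the discussion.
native_4k = 3840 * 2160        # 8,294,400
cbr_fresh = native_4k // 2     # pixels shaded fresh per frame under CBR: 4,147,200
p1800     = 3200 * 1800        # 5,760,000
p900      = 1600 * 900         # 1,440,000 (assumed 900p Xbox One baseline)

print(round(cbr_fresh / p900, 1))  # -> 2.9, matching the quoted "2.9x" if 900p is the base
print(round(native_4k / p900, 1))  # -> 5.8
print(round(p1800 / p900, 1))      # -> 4.0
```

Even on this counting, shaded-pixel ratios alone don't settle the performance question, since checkerboard reconstruction adds its own per-frame cost on top of the fresh pixels.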
 
Lol, half the pixels use the previous frame. It's nothing like rendering at native res, where the entire scene is rendered at that res in one pass in real time. Using the previous frame to fill in missing pixels is not the same. Not saying checkerboarding is bad, it's really good, but it's nothing compared to native res technically.

That's not what's happening
 
The point of the original statement....



...was to compare PERFORMANCE. And as far as performance is concerned, CBR 4K is not the same as native 4K. Regardless of how many pixels are presented to the display. That is irrelevant.

The performance isn't the same as rendering half the pixels either


[image: frame-time comparison chart]
 
The performance isn't the same as rendering half the pixels either


[image: frame-time comparison chart]

You know that the frame time measured covers not just the particular render step needed to construct the 2160p image (native or checkerboarded) but also, for example, culling, post-processing and AI, right?

Also, nobody pretended that CBR takes half the processing compared to native rendering. It is obvious that additional processing is needed to reconstruct the image into a full frame buffer.

My last response to the matter as I don't want to derail the thread any further!
 