> Is Frame Generation forced on Path Tracing? Is that why there are no tests specifying the setting?

No, frame generation is not on for these. Nvidia really is that much faster than AMD at path tracing.
We ran several rounds of benchmarks for this article, on a wide selection of graphics cards at various settings. Our first test run is rasterization only, at ultra settings, without any ray tracing. Here we're seeing very decent framerates across the board. To reach 60 FPS at 1080p you need a Radeon RX 7600 or RTX 4060. 1440p at 60 FPS is possible on many cards, too; you'll need an RX 7700 XT or RTX 4070 or better. 4K60 is pretty challenging though: only the RTX 4090 can handle it (without any upscaling tech).
Next up is rasterization at ultra, plus ray traced reflections, sun shadows, local shadows, and ray traced lighting at "Medium." Here the hardware requirements go up quite a bit. At Full HD, the RTX 3090 reached 60.7 FPS and the RTX 4060 55.7 FPS. AMD's fastest card, the RX 7900 XTX, is far behind with just 49.7 FPS. At higher resolutions, AMD falls further and further behind: while the RTX 4090 can reach 36 FPS at 4K, the RX 7900 XTX only gets 17 FPS. No wonder NVIDIA is promoting Cyberpunk to show its dominance in ray tracing.
Last but not least, we activated path tracing, which brings even the best GPUs to their knees. The mighty RTX 4090 got 61 FPS at 1080p, while 4K was almost unplayable at 20 FPS. Things look even worse for AMD, with the RX 7900 XTX reaching only 14.5 FPS at 1080p, 8.8 FPS at 1440p and 4.3 FPS at 4K. The good news is that Phantom Liberty supports all three rival upscaling technologies from NVIDIA, AMD and Intel. With DLSS enabled in "Quality" mode, the RTX 4090 gets 47 FPS at 4K, which is much more playable. If you enable DLSS 3 Frame Generation on top of that, the framerate reaches a solid 73 FPS. With Frame Generation alone, without DLSS upscaling, the rate is 38 FPS at 4K, but the latency is too high for a good experience; you always need upscaling. Since the upscalers offer various quality modes, you can easily trade FPS against image resolution, which makes the higher ray tracing quality modes an option even on weaker hardware, but at some point the upscaling pixelation will get more distracting than the benefit from the improved rendering technology.
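To make the FPS-vs-resolution trade-off concrete, here's a minimal sketch of what the quality modes mean for internal render resolution. The scale factors follow NVIDIA's published DLSS presets; I'm assuming FSR and XeSS use comparable values, which may not be exact for this game:

```python
# Approximate internal render resolution per upscaler quality mode.
# Scale factors are NVIDIA's published DLSS per-axis ratios (assumed here).
SCALE = {
    "Quality": 2 / 3,          # ~0.667x per axis
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(width: int, height: int, mode: str) -> tuple[int, int]:
    """Return the approximate resolution the GPU actually renders at."""
    s = SCALE[mode]
    return round(width * s), round(height * s)

# A 4K output in "Quality" mode renders at roughly 1440p internally,
# which is why the RTX 4090's 20 FPS can jump to 47 FPS:
print(internal_resolution(3840, 2160, "Quality"))      # (2560, 1440)
print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080)
```

This is also why dropping one quality mode buys so much performance: each step shrinks the rendered pixel count substantially, at the cost of more visible upscaling artifacts.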
> This is looking dire for AMD. They are vastly behind in what seems to be the next paradigm shift in rendering technology. Hopefully, RDNA 4 makes a major leap in RTRT. Otherwise, they'll be left behind completely.

Seems like they're still pretty competitive CPU-wise, but if you want any kind of ray tracing or decent upscaling/frame generation, goodnight.
> Is Frame Generation forced on Path Tracing? Is that why there are no tests specifying the setting?

No, and it's not on in these benchmarks. It is, however, pretty strongly recommended.
Is it worth using frame generation on a 60 Hz display? Would it lead to a lot of input lag?
Yup, it helps if you've got Reflex on. You'll mostly get lower input lag than native.
I think it will work like shit. Frame gen only works great if it's capped below the refresh rate; on a non-VRR display it will stutter like a motherfucker.
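To put rough numbers on the 60 Hz question, here's a back-of-the-envelope sketch. It assumes frame generation simply doubles the presented rate (capped at the refresh rate on a non-VRR display) and adds about one base-frame interval of latency, since each interpolated frame has to wait for the next real one. Real pipelines differ, so treat this as a toy model:

```python
def with_frame_gen(base_fps: float, refresh_hz: float) -> dict:
    """Toy model of frame generation on a fixed-refresh display:
    presented FPS doubles (capped by refresh), latency grows by
    roughly one base-frame interval."""
    presented = min(base_fps * 2, refresh_hz)
    added_latency_ms = 1000 / base_fps  # wait for the next real frame
    return {
        "presented_fps": presented,
        "added_latency_ms": round(added_latency_ms, 1),
    }

# 45 FPS base on a 60 Hz panel: the presented rate caps at 60,
# but you pay roughly an extra 22 ms of latency.
print(with_frame_gen(45, 60))
```

Under this model you hit the refresh cap almost immediately, which is exactly the "caps below refresh rate" problem mentioned above, and part of why Reflex is recommended alongside it.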
> had no issues with it on a 60 Hz TV with games.

What settings do you use to avoid screen tearing and reduce input lag?
> Every time you'll be in an indirect light situation like this
> Or have characters around
> You'll remember that you're playing in "old gen" mode. You'll end up enabling path tracing all the time.

Dude, I literally tried the game maxed out for like 30 minutes; I already know that FOR ME some lights and shadows are not worth the hit on resolution and framerate.
It's a bad RT situation for AMD, but really, this is one single game. Hardly any devs will make the effort to implement lighting this advanced until next gen at the earliest, when the consoles will be able to handle it somewhat well.
> 1080p benchmarks? What year is it?

Is the monitor you're sitting close to 55 inches?
> Missing the PT results where 8 GB AMD cards crashed.

1 frame, but it's a great one. Totally worth it.
I wonder why AMD 8 GB cards can't run this game when Nvidia 8 GB cards can. Better memory compression on Nvidia, maybe?
Nvidia will make them. Big chance their big selling point on the 5000 series is dedicated hardware that speeds up path tracing massively, and game sponsorships will get devs to implement it.
Not going to lie, path tracing is mighty fine and fixes the oh-so-ugly NPCs in Cyberpunk, but it's dark, like really dark. It doesn't feel natural; it feels like the shadow setting is overtuned to hell. But disabling it makes the entire game feel last gen.
Anyway, AMD needs to start investing far more into AI solutions. They've been behind the curve for a while, and the gap will only grow if they don't start following Nvidia or at least try to compete.
> Is it worth using frame generation on a 60 Hz display? Would it lead to a lot of input lag?

It can be, if you're getting like 45 FPS and trying to get to 60.
> Do we know what time today the 2.0 update is going live? I just got a tiny update in Steam this morning but apparently it wasn't the real update yet.

Apparently toward the end of the livestream happening today; I assume sometime close to 5 PM CEST.
AMD needs to throw their hybrid pipeline in the garbage.
It was designed to save silicon area and complexity in favor of rasterization, yet somehow they barely gained an advantage even there. The hybrid RT design has queuing systems juggle rasterization, RT, and ML (the ML part has never been demonstrated working with the other two so far) via AMD's preferred method, inline ray tracing: you control and know what you're getting, and the scheduler can do the rest.
Path tracing is not like that; it's chaotic. The hybrid pipeline is choking big time.
> Not going to lie, path tracing is mighty fine and fixes the oh-so-ugly NPCs in Cyberpunk, but it's dark, like really dark. It doesn't feel natural; it feels like the shadow setting is overtuned to hell. But disabling it makes the entire game feel last gen.

This is a funny way to put it. Path tracing isn't unnaturally dark, and in many cases it can make dark areas brighter due to bounce lighting. But it is a fundamentally different way of lighting a scene, and if the level designers and artists weren't using/considering path tracing when they created a scene, then the scene isn't going to be lit according to the artist's intention.
> This is looking dire for AMD. They are vastly behind in what seems to be the next paradigm shift in rendering technology. Hopefully, RDNA 4 makes a major leap in RTRT. Otherwise, they'll be left behind completely.

AMD has consoles. But take a moment to think what this means for the PS6 and ray tracing if they can't come up with something good.
Update 2.0 is coming out at 5 PM CEST today!
> AMD has consoles. But take a moment to think what this means for the PS6 and ray tracing if they can't come up with something good.

It's gonna be reeeeeeaaal interesting to see where the next-gen Switch lands in comparison to last-gen and current-gen consoles.
This is the same game that everyone hated, and now the expansion is smoking in sales.
> Intel could maybe slide into consoles with their tile-based APUs.

With the current expectations around BC, and console generational lines getting blurred by pro consoles and cross-gen titles, I'm inclined to believe it's an uphill battle to get someone to switch architectures. I bet Nintendo sticks with Nvidia and Sony/MS stick with AMD.
Microsoft could look into gaining an advantage over Sony, and a different supplier is pretty much the only way to go, since AMD seems to be in too much architectural turmoil for the coming years.
> With the current expectations around BC, and console generational lines getting blurred by pro consoles and cross-gen titles, I'm inclined to believe it's an uphill battle to get someone to switch architectures. I bet Nintendo sticks with Nvidia and Sony/MS stick with AMD.

Intel makes x86 CPUs as well; I'm not sure switching to a different x86 CPU brand would mess up BC on the PS5.