Yeah, but the PS4 Pro, X1X and Series S versions could run the game just fine. I would say it's more of a PS4.5 game than anything. You can also look at the VRAM requirements to get a good idea of what they were aiming for. Cyberpunk in 2020 used around 5 GB on PC; only ray tracing pushed it higher, to around 7 GB. Path tracing now pushes it above 12 GB, but the game's assets in 2020 were designed with last-gen hardware in mind.
The PS4 could've run this game at 720p. CD Projekt just didn't even bother optimizing those versions.
nah, Cyberpunk runs just fine on $400 consoles like the PlayStation 5. It was designed with those consoles in mind; the path tracing mode is just a bonus. We've discussed this a million times in this thread. The game was designed with baked lighting in mind, and it looks great even with RTGI off.
And nothing about Insomniac's ending looks amazing. The Matrix is running on the same hardware and looks amazing, even better than Cyberpunk on $10k PCs. (It's actually more like $3k, but I'll allow the hyperbole since I'm a big fan of hyperbole myself.)
I mean, I can run Cyberpunk with path tracing at 1440p DLSS Balanced (which looks better than native 1080p) and get around 30 fps average (if not for VRAM limitations, it would likely never drop below that) on my aging 3070, which I got for $500 MSRP. Got my whole rig for around $850 after selling my older parts. It's 80% of the way there; it only lacks the crispness of higher resolutions, but it looks good. Look at how FF XVI or Forspoken made a joke out of the PS5, for example. At least DLSS produces good results upscaling to 1440p; can't say the same for FSR or other regular old upscalers.
It really is hyperbole imo. When it comes to pushing graphical fidelity, 30 FPS on consoles is considered acceptable, so why is it not acceptable on something midrange like a 3070 too? I find anything upwards of 30 fps playable with Reflex on PC, so it will always be a win in my book. Just assume you have a console and you're playing in the highest-fidelity mode. So I don't get the "playable framerates" argument; it depends on the user. And at the end of the day, I paid around $850 for my PC, right? So a PS5 running RT shadows at 30 fps at 1200p-1400p resolution averages is impressive, but paying only 1.7x the money and getting playable PATH tracing at around 30 fps with a better upscaler and better image quality is not?
The path tracing and DLSS definitely add more than 2x to the game.
On top of that, here's what my $500 investment in a 3070 has allowed me to experience over the past 3 years:
- DLDSR (makes replaying old games even more worthwhile; a very good anti-aliaser by itself)
- DLSS (peak reconstruction; DLSSTweaks; being able to run at DSR 4K + DLSS Performance and getting massive visual improvements over native 1080p while paying a small price in performance. The DLDSR+DLSS combo is also amazing; I've used that combo in countless games)
- Half-Life ray tracing, Quake ray tracing, and the promise and potential of all those great old games getting the ray tracing treatment. They're games I will play for eternity. It's just something a console won't do (not saying can't do; there's no modding support. If Half-Life were to get an official ray tracing treatment from Valve released on consoles, they'd charge 40 bucks for it)
- Cyberpunk path tracing. I can get 30+ FPS with good image quality. That's all I need: I can go in, drive a bike or walk around, then disable it and play the game. I can do both. The card lets me experience and visit the city in that respect. Sure, it all goes down the drain in combat; it's clear there are limitations.
- Witcher 3 RTGI + RT reflections, the whole suite. It looked so gorgeous, and revisiting that game with those visuals was a boon. I was able to get away with a locked 40 fps experience. Granted, consoles also run it with similar RTGI, but there it can look quite blurry, and that's where DLSS's crucial advantage comes into play.
- Metro Exodus RTGI + PhysX + HairWorks. They all mesh together so well. I was so grateful I'd had other reasons to delay playing Metro Exodus for so long; I lucked out and the Enhanced Edition was my very first experience of it.
- NVIDIA Reflex, which practically makes all games less laggy by default and lets you enjoy low lag at any framerate. The INPUT LAG I get at 35 FPS GPU-bound with Reflex is most likely lower than in a console game running at a locked 60 FPS with vsync (yet nobody complains about the latter while the former gets complaints. Amazing, right?)
I might sound like a shill (I'm really not, and I hate NVIDIA with all my guts over the VRAM situation), but genuinely, I still think it was a worthwhile investment. For me, it's better to pay $500 and get a great experience and something NEW within the first 3-4 years than to pay $500, then pay subscription money for years, and only get something great 6+ years later. I'm not saying consoles are bad; they're a good investment too. In the end, when I total up the benefits I got from my $850 investment, I'm just happy.
After seeing how much good stuff I can get out of a PC and new tech, I will simply wait for a viable, affordable 16 GB NVIDIA card that doesn't ask for $1,200. I would gladly pay $700 for a 16 GB 5070 at this point. PC simply hits different. And at this point, a potential 16 GB 5070 would most likely last upwards of 6 years (which is exactly why NVIDIA is hesitant to give 16 GB to midrange products. But I will bide my time and keep reaping the benefits of what I have)
I can even see why NVIDIA is so hesitant to give these cards a lot of VRAM: you already get insane value out of them, and if they had ample VRAM, we'd be using them until like 2030. It's sad, but it is what it is. I hate their practices, but I also can't deny how much utility the 2000/3000 cards have in general. I wouldn't get or suggest a 4060 or 4070 to anyone though; I will simply wait for the 5070 at this point. If it's packed with only 12 GB too, I will rest easy with my 8 GB (knowing that developers will do enough work to make sure their games run on 12 GB with ray tracing at 1440p, which means I'll still be able to get away with 8 GB at a lower resolution with tweaked settings lol)