Brand new Cyberpunk 2077 screenshots featuring NVIDIA RTX ray tracing

You might also prefer AMD's features, superior performance in your favorite games, bigger and more future-proof memory buffers, the looks of the AMD logo and Lisa Su, etc. There are plenty of valid reasons to choose AMD over Nvidia. But for some reason everybody has to prefer the features you like or you are wasting your money lol.

Now I know you are just a troll hahaha, good one. You got me.
 
Cyberpunk has been my most hyped game since Witcher 3's release. Even with my 1080 Ti, RT still isn't enough to get me to upgrade. They'd need to bring the 2080 Ti down by a solid $300-$400 before I'd even consider upgrading.
 
What the fuck is going on in that swimsuit?

 
AMD should take note and launch a GPU with ray tracing hardware the same day as Cyberpunk, plus make bundles.

Would be a smart way to make up for how far behind they are.

AMD should focus on getting their drivers working optimally for this title. The game by itself is so big that it's going to overshadow all of Nvidia's "look at me" PR attempts; nobody is going to care, I think, just like what happened with GTA V. Knowing CDPR, I doubt they are going to block AMD from early access, so all should be good for them.
 
Let me put it this way.

I don't care about either... for me it makes no difference which one sells more or has the better hardware... the only interesting part is how they build their architectures.

But I know ray tracing is not a waste, and there is no reason except price to buy AMD cards today.
Nvidia is at least two generations ahead of AMD in GPU tech.
Or like me, having a Samsung TV with only HDMI inputs and adaptive sync being extremely important to you.
 
Or like me, having a Samsung TV with only HDMI inputs and adaptive sync being extremely important to you.
I believe that is important... I should not buy a TV today without HDMI 2.1... just like ray tracing.
 
I believe that is important... I should not buy a TV today without HDMI 2.1... just like ray tracing.
Samsung supports HDMI 2.1 features like VRR over HDMI 2.0.

"All Samsung QLED TV series (Q60, Q70, Q80, Q90 and Q900R and the RU8000 Series edge-lit 4K LED-LCD TV Series will support both VRR and AMD's proprietary FreeSync (a system like VRR that maintains perfect synchronization between the display and a PC or console even as a game's frame rate fluctuates) at up at up to 120Hz. FreeSync is supported by Xbox One S and One X consoles. "


2018 TVs also have this:

"Select Samsung QLED TVs to be launched this year are set to support a 120 Hz maximum refresh rate, HDMI 2.1's VRR, as well as AMD's FreeSync technologies, the company announced earlier this year. The technologies do essentially the same thing, but they are not the same method – AMD's Freesync-over-HDMI being a proprietary method – and as such are branded differently. From technological point of view, both methods require hardware and firmware support both on the source (i.e., appropriate display controller) as well as the sink (i.e., display scaler). As it appears, Samsung decided to add support for both methods. "

 
Ah geez

This is going to slaughter my RTX 2070 lmao

I don't care, it won't improve the appearance of the game radically, and even if I buy an Nvidia card to play this game it will run like hot garbage with RT. Prices here are crazy, so it will have to be something like the 1660.



Except the GTX 1660 is not an RTX card and thus has no dedicated RTX hardware.
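
For anyone curious how games tell the difference, here's a minimal sketch (my own illustration, not anything official) of the DXR capability check an engine typically runs at startup; adapters without DXR support report D3D12_RAYTRACING_TIER_NOT_SUPPORTED and the game falls back to its rasterized effects.

```cpp
// Sketch: detect whether the current adapter exposes DXR (DirectX Raytracing).
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

int main()
{
    // Create a device on the default adapter.
    Microsoft::WRL::ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    // OPTIONS5 carries the raytracing tier, among other features.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    bool dxr = SUCCEEDED(device->CheckFeatureSupport(
                   D3D12_FEATURE_D3D12_OPTIONS5, &opts5, sizeof(opts5))) &&
               opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;

    std::printf(dxr ? "DXR supported\n" : "DXR not supported, rasterizing\n");
    return 0;
}
```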

Garbage troll attempt is garbage
 
Best looking game I've ever seen. The art style is so well realised. Take note, game industry. This is the result of hiring people based on merit. This is what you used to be.
 
The game is certainly beautiful, but I honestly don't know if it is all because of ray tracing. I would like some kind of "RTX ON / RTX OFF" comparison.

I think the RTX 2060 might be able to run this with ray tracing on at 1080p 60 FPS and high settings.
 
I've yet to be impressed with any raytracing demo they have shown so far. Maybe it's because I've been working with "real" 3D render engines for years, but none of this stuff looks significantly better with RTX than without, and some games like the recently released RTX version of Quake 2 look downright hideous compared to the original.
Yeah and "real" renderers use ray tracing or path tracing. It's absolutely the future and you're lying to yourself if you don't think that's the case.
 
I needed dual 2080 Tis for this and Microsoft Flight Simulator, didn't I, lads? Not a waste of money in the slightest, right? RIGHT?
 
More like dual 3080 Tis.

Rule no. 1 with CD Projekt Red:

If you want to play the game at max settings, you will need two top-end GPUs.

You needed 2x 580s for Witcher 2, and you needed 2x 980s for Witcher 3, to get them to run at 1080p 60 FPS.

The same probably goes for this game. I expect 30 FPS with ray tracing and everything maxed on a 3080 Ti.
 
Don't know what you are on. I worked with V-Ray (an offline ray tracer for, e.g., 3ds Max) for years, and the stuff I did with lighting was, even years back, ahead of what we have now... the stuff they do now with real-time RT in games is amazing, and a real jump forward for the games industry in realism. Games that are old lack geometry and textures, but something like Quake RTX looks way better in the lighting and shadowing department; art direction is another beast...
I've worked with everything from V-Ray to Corona to Octane as well, and I still think RTX doesn't add much. That's bound to change at some point, but as for right now, I couldn't care less. Devs have gotten so good at faking shadows and lighting that the benefit most games get from RTX at this point is minimal. The solution people seem to have taken so far is to go way overboard and make everything much more shiny and glossy than it would be in real life, just so the effect becomes immediately obvious.

I remember the same thing happening in the late 90s when Voodoo cards became popular. A number of games around that time were basically nothing but glorified tech demos so overloaded with effects that they could give you seizures (Rage Software's Incoming and Expendable come to mind). People at the time thought it looked amazing, when the truth was that it looked like shit. The tech eventually got better and artists managed to show some restraint, but it took a while. The same will probably happen with RTX, but it will take a few more hardware generations before we get there.
 
I think most of us are going to get a shock to the system once we witness how the game looks on the base Xbox One and PS4. The contrast between these screenshots and footage from the base Xbox and PS4 is going to be dramatic.

I'm tempted to wait for the RTX 30 series. By then the ray tracing capabilities will be more optimized and you might be able to run this at 4K on max settings.
 
I think most of us are going to get a shock to the system once we witness how the game looks on the base Xbox One and PS4. The contrast between these screenshots and footage from the base Xbox and PS4 is going to be dramatic.

I'm tempted to wait for the RTX 30 series. By then the ray tracing capabilities will be more optimized and you might be able to run this at 4K on max settings.

I can guarantee it. Console-only folks who care about graphics are going to be very upset by the stark difference. Something we haven't really seen too much this gen.
 
I can guarantee it. Console-only folks who care about graphics are going to be very upset by the stark difference. Something we haven't really seen too much this gen.

At some point they'll have to show Xbox/PS4 footage.

I'm really curious how they'll handle the marketing...
 
After the TW3 downgrade-gate, I would think CDPR would be wary of only releasing pumped-up PC screens.

I think it's more important they don't gimp the PC version to maintain parity with the consoles this time. And everything I've seen and heard says they learned the lesson: the PC version is its own thing in terms of graphics, not hindered by the consoles. That's likely because they want the higher-end build for the next-gen consoles as well.

I fully expect a max of 1920x1080/30 on the Xbox X, with all the other systems being dynamic sub-1080p, with lots of things like crowd sizes being smaller and LOD being more noticeable.
 
I think most of us are going to get a shock to the system once we witness how the game looks on the base Xbox One and PS4. The contrast between these screenshots and footage from the base Xbox and PS4 is going to be dramatic.

I'm tempted to wait for the RTX 30 series. By then the ray tracing capabilities will be more optimized and you might be able to run this at 4K on max settings.

But will this game ever come out on the Switch? :messenger_tears_of_joy:

Never forget.....

 
Woaah, looks stunning.

Looks like we're only gonna be able to play this at launch with the current RTX cards. I have a 2080; I hope I can play it at 1440p with decent settings. I've already started saving/upgrading my PC, waiting for my new 1440p 144Hz G-Sync monitor. I'll be buying a 3950X later this year.

My full spec will be:

3950X
32GB
RTX 2080
1440p 144Hz monitor

I suspect the CPU/RAM will be fine, but running this game at high with RTX is gonna be demanding as shit. I'll be upgrading my 2080 for sure next year if Nvidia brings out their 7nm cards.
 
I think it's more important they don't gimp the PC version to maintain parity with the consoles this time. And everything I've seen and heard says they learned the lesson: the PC version is its own thing in terms of graphics, not hindered by the consoles. That's likely because they want the higher-end build for the next-gen consoles as well.

I fully expect a max of 1920x1080/30 on the Xbox X, with all the other systems being dynamic sub-1080p, with lots of things like crowd sizes being smaller and LOD being more noticeable.

No fucking way, that would be a disaster. When the game comes out, the biggest audience will still be the base consoles. They won't be looking to piss them off with a difference that big, which it would be if the X is at 1080p30.

I'm expecting 1080p sub-30 for the PS4. The main worry is whether the game looks substantially worse without RT enabled.
 
No fucking way, that would be a disaster. When the game comes out, the biggest audience will still be the base consoles. They won't be looking to piss them off with a difference that big, which it would be if the X is at 1080p30.

I'm expecting 1080p sub-30 for the PS4. The main worry is whether the game looks substantially worse without RT enabled.

I really believe you are going to be disappointed if you are expecting 4K/30 out of the Xbox X with Cyberpunk.
 
After the TW3 downgrade-gate, I would think CDPR would be wary of only releasing pumped-up PC screens.
To be fair, this is showing off what is currently a PC-only feature. If console owners come in here expecting the PS4/XB1 versions to look like this, then that's on them.

I think (hope) most console owners aren't this dumb, and maybe they just think that the PS5/XB2 will look like this. If the game is updated for next gen it might get ray tracing, but it will be toned down compared to the full-fat PC experience.
 
I think it's more important they don't gimp the PC version to maintain parity with the consoles this time. And everything I've seen and heard says they learned the lesson: the PC version is its own thing in terms of graphics, not hindered by the consoles. That's likely because they want the higher-end build for the next-gen consoles as well.

I fully expect a max of 1920x1080/30 on the Xbox X, with all the other systems being dynamic sub-1080p, with lots of things like crowd sizes being smaller and LOD being more noticeable.
Agreed. I think what we've seen of the game is indicative of what we can expect from the PS5 and Scarlett ports, especially with the new SSDs.
 